I just finished the time tracking application for Bale. This is a tool
that is key to many other practices, but it is very underutilized. (I
wonder if this is because so many developers are worried about Big
Brother type intrusions into their working habits? If your employer is
in that camp, see below for why Big Brother is such a bad idea.)
Here are eight reasons why you should use a time tracking tool:
- Gather reliable data for future estimates. You can't make accurate estimates without accurate historical data. If you try to create estimates using a simple count of days spent working on a project, you'll be way off because so much time is lost to non-project activities (like supporting the last project you were on). A time tracking tool keeps an accurate record of how many hours you actually spent working on the project.
- Eliminate time wasters. When you start tracking time, you'll be
surprised how little time you spend working on your current
project. You can begin to pay attention to the low value activities
that come up during the day. Awareness is the first step, then you
can start to reduce or eliminate these time wasters so you can put
more time into your bottom line.
- Evaluate new practices. Without good time data you have no idea
if the introduction of a new practice or process step is helping or
hurting. For example, say you move to performing code reviews online
instead of on paper. You may have a good idea of how much money this
will save in terms of ink and paper, but you have no idea whether it
is actually faster. (I'll soon have a post on why defect tracking is
important too -- evaluating new practices is a key benefit there
too.)
- Find areas for improvement. When I first started tracking time I
realized the vast amount of time I spent in unit testing -- even
worse, sometimes in system testing. I made a conscious effort to
slow down when doing design work and it paid off. Hasty designs
end up requiring a lot of rework during test. Your situation may be
different, but you won't know until you start tracking where in the
lifecycle your time is consumed.
- Improve reviews. An effective review finds more defects per
hour than an ineffective review. You can't tell the difference
between effective and ineffective reviews without having the time
component. This is sort of a subcategory of finding areas for
improvement. For example, I know that my personal design review is
about 4x more effective than unit testing, and slightly more
effective than my personal code review. In a team context, knowing
your team's defect removal rate during inspections is an excellent
way to justify the investment in review time to upper management.
This is why a team-wide time tracking tool is important. (A rough
sketch of the defects-per-hour arithmetic appears after this list.)
- Evaluate new tools. Too often, the decision about whether a new
tool is going to work is based on watching the salesman demonstrate it
on a projector in a darkened room, followed by some haggling from
upper management. Sometimes you get a 30-day eval, but as test drives
go, that's really just a spin around the block to see if it "feels
nice". It's much more useful to
know that a given tool actually increased productivity by 8% during
the eval period. This also provides leverage during the
aforementioned haggling when the vendor is claiming 77% productivity
gains...
- Know when to quit. Ever had your manager say, "Don't spend more
than an hour on it."? Ever blown the afternoon after hearing such a
request because you just couldn't put it down? A running time log
makes it obvious when that hour is actually up.
- Improve visibility. You can generate better upwards-facing
status reports with a good time log. Remind your manager that the
reason you missed the Friday deadline is that the VP of
Engineering retasked you for all of Tuesday and Wednesday on his pet
project. She can then show your status report to the VP as a
reminder. (And oh, by the way, she can also show it to the VP of
Marketing and CTO/Founder who're screaming about the missed
deadline. Yes, your organization is not the only dysfunctional group
out there.)
For all the pointy-haired bosses out there, here are a few reasons not
to use time tracking to compare employees' productivity.
- It's easy to game the system. You can enter fake data. This
will, of course, screw up estimates and make it impossible to use
the data for any kind of overall improvements. Developers need every
assurance that the data will not be used for evil or they will see
to it that the data is worthless.
- Productivity varies by task. I've seen examples, on the same
project, by the same person (me), where different tasks have
enormously different productivity. Some problems are simply harder
to solve. Sometimes you have to modify an existing bit of code:
sometimes the existing code is easy to work with, sometimes it is a
horrible mess. Sometimes you don't even know what the problem is,
and you have to do a bit of research just to figure out what needs
to happen to get something to work.
- People make varying contributions. I've known people who appear
to work very slowly, producing very little every week. Sometimes
("The Slacker") they're not actually working, instead day trading or
playing solitaire. Sometimes ("The Craftsman") they are doing very
solid work; you'll never see a bug come out of system test for these
guys. Some developers ("The Enabler") don't do lots of work of their
own, but rather spend much of their time enabling others, or perhaps
interacting with marketing, support, or testers; this is valuable
time spent, since they are keeping your team in the loop or solving
customer problems. You can't tell the difference with a time
tracking tool: the Slacker will simply enter fake data, and the
Enabler's contributions don't show up on paper.
If you're trying to compare using time log data, then when lean times come you may end up firing the Enabler because the Slacker looks good. You'll still have the Craftsman, but because he depended on the Enabler for answering questions, keeping Customer Support off his back, and keeping him in the loop about What Marketing Wants This Week, his productivity will drop dramatically.