Game Career Guide is part of the Informa Tech Division of Informa PLC


Time Tracking In 2020
by Niklas Gray

    As 2019 was wrapping up I found myself wanting to get some kind of high-level picture of what I'd spent the year doing. First, I'm just curious. Second, I want to know how much time I've spent building out different engine systems and whether that time matches my initial estimates. Third, I want to make sure that I've spent my time doing "the right" things.

    I think all good programmers have something of an obsessive nature and it's sometimes easy to get carried away with refactors and optimizations that in the big picture maybe don't matter that much. With the small team we have at The Machinery, working on building a complete engine that outshines the competition, we can't afford to mess around. I'm usually pretty good at staying focused, but I'm always striving to improve.

    I also want to get better at making time estimates. This is a notoriously difficult thing that we programmers get asked to do all the time by our managers. Often we balk and try to squirm away: "That's impossible! I can't possibly know how much time it takes before I've done it!"

    It's easy to scoff at this attitude, but I actually think it is pretty reasonable. Estimating how long it takes to code something is hard because programming is fundamentally different from most other human endeavors. If you are building bridges, doing woodworking or cleaning the house, you typically have a relatively short planning phase where you decide what to do and then a long execution phase where you hunker down and do it. In the planning phase, you can make a reasonable estimate of how long this execution phase is going to take.

    However, programming, at least the way I see it, is pretty much all planning. The bulk of programming is deciding how to solve something: what the APIs should be, how different components should work together, etc. When you have decided exactly what to do, writing out the code is trivial. (For this reason, I also don't really like splitting coding into a planning phase and a coding phase. In my view, coding is planning.) Since planning is the bulk of the work, estimating the size of a coding task means trying to plan how long the planning is going to take. At that level of meta-analysis, things become pretty sketchy. Many programmers don't want to give management numbers they're not sure about, out of (an often valid) fear that management will take those numbers and run with them.

    But since I'm an owner and a founder at Our Machinery, I'm the one asking myself for these estimates. And I'm not willing to let me get away that easily!

    As a planning instrument, time estimates are crucial. We want to know how much time something will take to implement so that we can make an appropriate cost-benefit analysis. We also want to be able to present some kind of road map of upcoming features, both for our own sake and the sake of our customers. So we need to tackle the unpleasant task of making estimates.

    The first step to getting better estimates is getting time tracking right. If we're not even sure how long something did take, how can we ever know how long something is going to take? With time tracking as calibration, we can hopefully learn to improve our estimates, even if it's always going to be a difficult task. For example, I've noticed that I often overestimate the time it takes to complete small tasks and underestimate the time it takes to complete big tasks. The reason, I think, is that it always seems reasonable to say that something will take 1-2 days, even though many small tasks can be completed in hours. Whereas for big tasks, it is just hard to think of everything that goes into them. I end up missing stuff and underestimating.

    One other thing I think is important is that you don't punish or reward employees for their estimates. For example, you shouldn't force people to work overtime because they underestimated or praise them because they overestimated and are "done early". Neither should you shame them for making estimates that are too long: "What, it'll take you three months to do this?" All of this will just push your employees to make worse estimates.

    Enough said about this tricky topic. In the rest of this post, I'll focus on time tracking.

    Looking back at 2019, I found it pretty hard to discern exactly where and how I had spent my time. This is despite the fact that I had three different and pretty detailed records of the work I had been doing:

    1. I have a big todo-list in Workflowy, where I keep track of all the things on my plate. Workflowy sends me a summary email every day of tasks I have completed. So this is a pretty detailed and complete record of everything I've been doing, except it doesn't cover exactly how much time I've spent on each task.

    2. For time tracking, I use Toggl. Whenever I switch to a new task I start a Toggl timer. This way I can see both how much time I've spent on each individual task and how many hours I've worked in total. Each week, Toggl gives me a report which I summarize in a Slack post to let the rest of the team know what I'm working on. So this gives timing information, but less detail, because I typically just make an entry called "Web Site" when I'm working on the web site, without specifying exactly what I did.

    3. The git commit history contains a lot of information too. I subscribe to the "commit early, commit often" philosophy, which means I usually commit changes multiple times a day. So the git log can tell me what systems I've been working on (from the files I touched) and give an idea of how much time I've spent on each (the time since the last commit). And, of course, the commit comments themselves give more detailed information. But it doesn't tell me anything about non-coding tasks.
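As a rough sketch of what mining the git log for this could look like: the script below attributes the time since the previous commit to the files that commit touched. The "one top-level directory = one system" mapping and the four-hour cap on gaps are my own illustrative assumptions, not anything we actually use.

```python
# Sketch: estimate time spent per system from "git log" output, crediting
# the gap since the previous commit to the files the commit touched.
# Assumptions: top-level directory == system; gaps over 4 hours are
# assumed to be sleep/meetings and capped.
import subprocess
from collections import defaultdict
from datetime import datetime, timedelta

def git_log(repo="."):
    # %x01 is a record separator, %aI the ISO-8601 author date;
    # --name-only lists the files touched by each commit.
    return subprocess.run(
        ["git", "log", "--reverse", "--pretty=format:%x01%aI", "--name-only"],
        cwd=repo, capture_output=True, text=True, check=True,
    ).stdout

def time_per_system(log_text, max_gap=timedelta(hours=4)):
    totals = defaultdict(timedelta)
    prev = None
    for record in log_text.split("\x01")[1:]:
        lines = [ln for ln in record.splitlines() if ln.strip()]
        when = datetime.fromisoformat(lines[0])
        files = lines[1:]
        if prev is not None and files:
            gap = min(when - prev, max_gap)
            # Split the elapsed time evenly over the systems touched.
            systems = {f.split("/")[0] for f in files}
            for s in systems:
                totals[s] += gap / len(systems)
        prev = when
    return dict(totals)
```

Calling `time_per_system(git_log("path/to/repo"))` gives a dictionary from top-level directory to an estimated timedelta. It's crude (the first commit of the day gets no credit, multi-system commits are split evenly), but it's the kind of ballpark picture I was after.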

    So the problem is not a lack of data. Rather, the problem is too much data! It's hard to see the forest for the trees! Going through these logs and trying to summarize them into high-level information like "How long did the physics system take to write?" would mean slogging through thousands of entries.

    I suppose I could sit down and sift through all of this, write some script to automatically analyze the logs and the email archives and merge the information from all three sources to get a more complete picture, but that would likely take days. And considering that the whole point of this was to find out how to spend my time more efficiently, that does not seem like a prudent thing to do.
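To be clear, it's not the per-source summaries that would take days -- those are cheap -- it's merging and reconciling three sources. For instance, totaling tracked time per entry from a Toggl CSV export is only a few lines (I'm assuming here that the export has "Description" and "Duration" columns with HH:MM:SS durations; check your actual export for the exact column names):

```python
# Sketch: total tracked time per entry from a Toggl CSV export.
# Assumes "Description" and "Duration" (HH:MM:SS) columns -- an
# assumption about the export format, so verify against a real export.
import csv
import io
from collections import defaultdict
from datetime import timedelta

def time_per_entry(csv_text):
    totals = defaultdict(timedelta)
    for row in csv.DictReader(io.StringIO(csv_text)):
        h, m, s = (int(part) for part in row["Duration"].split(":"))
        totals[row["Description"]] += timedelta(hours=h, minutes=m, seconds=s)
    # Longest entries first.
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)
```

The hard part remains matching a "Web Site" Toggl entry against Workflowy tasks and git commits, which is exactly the high-level labeling the raw data doesn't contain.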

    So for 2020, I've decided that I need to do time tracking differently. If the goal is to have high-level, big-picture information available at a glance, then that is what I should be collecting all the time - not something I try to piece together at the end of the year.

