Information Technology Dark Side

Struggles of a Self-Taught Coder


The demise of the Gantt Chart in Agile Software Projects

July 31st, 2007 · 19 Comments

Tate Stuntz, Contributing Editor: Tate Stuntz is a contributing editor to TechDarkSide.com and manages projects, services and methodologies for fun and profit. He can be reached at tate.stuntz@nimbleconsulting.com.

Gantt charts look cool. The ones I can make using MS-Project show task names, durations, assigned resources and milestones. All in color…in whatever fonts and fill patterns I want to use. In my experience, few things about a project proposal impress people so completely as a really nice-looking Gantt chart. Sad, but true.

One of the most important things a Gantt chart is supposed to show is the task dependencies in a project plan. Some of these task dependencies will be part of the critical path of the project. Good project managers spend a lot of time fretting about the critical path – as well they should. If any task on the critical path falls behind schedule, the whole project falls behind.
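To make the critical-path idea concrete, here is a minimal Python sketch (the task names, durations and dependencies are invented, not taken from any real plan) that computes the length of the longest dependency chain in a tiny task graph. Slipping any task on that chain slips the whole project.

from functools import lru_cache

# Invented example data: durations are in days; depends_on maps each task to
# the tasks that must finish before it can start.
durations = {"requirements": 5, "architecture": 8, "feature_a": 6,
             "feature_b": 4, "testing": 5}
depends_on = {"architecture": ["requirements"],
              "feature_a": ["architecture"],
              "feature_b": ["architecture"],
              "testing": ["feature_a", "feature_b"]}

@lru_cache(maxsize=None)
def earliest_finish(task):
    # Duration-weighted longest path from project start to the end of `task`.
    start = max((earliest_finish(dep) for dep in depends_on.get(task, [])), default=0)
    return start + durations[task]

print(max(earliest_finish(task) for task in durations))
# 24 days; the critical path is requirements -> architecture -> feature_a -> testing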

You don’t see a lot of Gantt charts on Agile projects though. In fact, a recent survey from Scott Ambler indicates that they are considered to be the least valuable project artifacts around. Is this because Agile doesn’t have planning? Not at all. This is because Agile methods have the fascinating effect of radically reducing task dependencies.

How can project dependencies, which seem inherent in a given problem domain, be reduced by the project method? Well, first, you have to come to grips with the fact that many things we think are dependencies in software are really not. For example, some people feel strongly that you should build the physical data model before building the application logic and that you should build the user interface last. This kind of thinking would align software with the stages of building a house wherein the database is like the foundation. The fact is, however, that software does not need to be built that way. In fact some people would argue strenuously against that particular sequence (as a trivia item, not everybody builds houses that way either).

So, perhaps building software in a certain order isn’t really required, but surely it is more efficient to do it a particular way, right? I’m not one to favor the idea that there is always one best way, but it does seem intuitive that it would be more efficient for 5 different developers to be working in parallel on 5 different features if they were all building the features into a clear and complete application architecture. That way they could, for example, take advantage of existing persistence and security mechanisms instead of inventing those wheels themselves in their feature code.

Recognizing that, the good PM will develop a project plan that has the architect building the architecture first and then the developers showing up to build the features next. Of course the architect will need to know the requirements of all the features in order to build the architecture correctly, so the analysts will have to show up first to get the requirements right… Before you know it, you have a project plan with the standard set of phases all lined up into one gigantic critical path – which looks really cool on a Gantt chart.

Let’s go back to the roots of the scenario though. It assumes that the persistence and security mechanisms correctly support the needs of all 5 feature areas being developed. That, in turn, assumes that the requirements of those feature areas were captured accurately AND that they won’t change notably, which assumes that the users can articulate the requirements properly to begin with. Those are all assumptions that research (citations) has shown to be, well, problematic at best.

The same Agile principles that eschew those assumptions are the ones that reduce dependencies during the course of a project. I have lumped some of these principles together wherein each combined lump contributes to reducing dependencies in a particular way. The specific lumps of principles that I have seen reduce dependencies the most are:

Agile Requirements Management, Iteration Planning and Self-Organization

When you have a granular, ever-changing list of software requirements (or features or enhancements or whatever) and the team prioritizes and commits to just a few of those each iteration AND the team then organizes their own way of delivering those requirements in the space of a few weeks or less, it is really, really hard to make a Gantt Chart.

How, for example, can you make a reasonable Gantt Chart of a list of 4 features, 9 defects and 3 tasks – most of which are rather unrelated to each other? The list is prioritized on business value and sometimes also technical risk. It reflects a consumer’s perspective, not what makes sense from a builder’s perspective. The question of “what would be the most efficient way of doing it?” is only asked within the iteration, when the team members are jotting down their rough sub-tasks for completing their particular scope items.
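For what it’s worth, here is a minimal sketch of what that iteration-level planning amounts to; the item names, point sizes and capacity figure are all invented. The backlog is ordered by business value (with technical risk as a tie-breaker) and the team pulls items until its capacity for the iteration is spent. Notice that nothing in it says anything about sequencing or dependencies.

# A sketch of iteration planning as described above. Item names, point values
# and the capacity figure are all invented for illustration.
backlog = [
    {"name": "feature: export to CSV",  "value": 8, "risk": 3, "points": 5},
    {"name": "defect: login timeout",   "value": 9, "risk": 1, "points": 2},
    {"name": "task: upgrade build box", "value": 3, "risk": 2, "points": 3},
    {"name": "feature: audit trail",    "value": 7, "risk": 5, "points": 8},
]

def plan_iteration(backlog, capacity):
    # Order by business value, then technical risk; commit items until the
    # team's capacity for the iteration is used up.
    ordered = sorted(backlog, key=lambda item: (-item["value"], -item["risk"]))
    committed, used = [], 0
    for item in ordered:
        if used + item["points"] <= capacity:
            committed.append(item["name"])
            used += item["points"]
    return committed

print(plan_iteration(backlog, capacity=10))
# ['defect: login timeout', 'feature: export to CSV', 'task: upgrade build box']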

If, as a diligent PM, you scurry around and try to turn their notes and white-board checklists into a Gantt Chart, you’ll find the results to be very disappointing. You’ll probably have several lists of tasks, many of which are performed in parallel and some of which are semi-sequential, at best. If you are hopelessly cranked up on stimulants or suffer from OCD (most real PMs will nod to both), you will still attempt to fashion these lists into Gantt Charts or at least a Work Breakdown Structure (for Pete’s sake, you’ve got to manage something!) so that you can start lining out who is doing which assignments… Well, if your Agile project goes like the ones I’ve seen, the team members will display the irritating behavior of constantly moving around to different tasks as seems most effective to them within the iteration – regardless of what you’ve written down.

If you manage to finish a Gantt Chart or WBS it will likely only be because the iteration is almost over and you are able to record the history. Interestingly enough, nobody will be curious enough or even polite enough to ask to see your work. It simply is not helpful in any practical way for anybody to use on the project going forward.

The list of items being worked within any iteration is short and relatively simple and, for the most part, doesn’t need sophisticated work breakdowns, sequencing, lag buffering, etc. In other words, the iteration work doesn’t require sophisticated planning. It is small enough to be managed quite effectively using simple tools – basic task lists and verbal communication.

Generalist team members and Pair Programming

What about division and separation of labor? Surely it would be more efficient for the expert systems analyst on the team to do all the requirements and modeling work and then hand that off to the developers for programming. And then, surely, it would be more efficient for the same developer to work on the enhancement to the wobblerator feature and the 3 defects in the wobblerator, right?

Well, first of all, let’s talk about efficiency. No, on second thought, let’s not. Do this instead: go to Purdue University or some other institution that knows something about industrial engineering or production planning and take some classes. Or check out some links on queuing theory and sub-optimization, or try stochastic modeling – that crap’s hard! …

Sigh, OK, don’t do that either. I was getting a little bitter. I have to say that I’ve become tired of ‘software’ people talking about issues related to production and efficiency as if they were actually well versed in the subject. Having recently moved closer to the manufacturing world, I’ll give you my best understanding of that stuff as it applies to software teams.

The gist is that, in certain situations, the overall efficiency (for the whole team or whole project) you would gain by having a specialist team member do all the work they specialize in would not make up for the inefficiencies caused by queuing work through that single resource. You would create a bottleneck around that resource. This will typically push you to pipeline your whole project through a series of specialists, which introduces more queuing problems. The fact is that you can do this kind of thing in manufacturing and gain real efficiency, but it is extremely difficult to do it with software development and gain real efficiency. You might, in some situations, gain some level of predictability, but at the cost of lower efficiency. The predictability arrives as the PM finally gets a handle on the buffers needed to keep all the resources fed with a pipeline of work (and if a hundred other things fall into place). The lower efficiency takes the form of Parkinson’s Law, as the resources simply sit, more or less idle, using up their buffer time without doing any more work. Few people ever think, or ever admit, that they are really doing nothing for hours at a time on projects like this, but I have seen it. It’s for real.
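For the skeptical, here is a back-of-the-envelope illustration (mine, not anything from the queuing-theory classes above) using the standard M/M/1 result, where the average time an item spends waiting plus being worked on is W = 1 / (mu - lambda). As the specialist’s utilization approaches 100%, the time each work item spends stuck in their queue explodes – which is the bottleneck effect just described. The rates are invented.

# Average time in system for a simple M/M/1 queue: W = 1 / (mu - lambda).
# Invented numbers: the specialist finishes one work item per day on average.
service_rate = 1.0

for utilization in (0.5, 0.8, 0.9, 0.95, 0.99):
    arrival_rate = utilization * service_rate
    time_in_system = 1.0 / (service_rate - arrival_rate)
    print(f"utilization {utilization:.0%}: {time_in_system:6.1f} days per item on average")

# At 50% utilization each item takes about 2 days; at 90%, about 10; at 99%,
# about 100. The queue, not the work itself, comes to dominate the lead time.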

Whew, I’m glad that’s over.


Now, if you have team members that will lower themselves to wear many hats, you can avoid some of the problems of resource bottlenecks. For example, everybody is qualified enough to ask what the requirements are of a single feature; and if not, they can pair up with the requirements specialist (who happens to still be on the team) for a couple of hours. After they do that a few times, guess what? They start learning how to elicit the requirements themselves. Maybe never as well as the specialist, but well enough to get the work done and reduce the resource dependency. In the meantime, the requirements specialist has turned into quite a capable tester – having paired up with a really good tester on the team – who has been writing some end user documentation.

As developers pair up, I have (as a PM-type) been amazed at the sense of well-being I get from knowing that anybody on the team could get hit by a bus and it wouldn’t be too terrible – for the project. Well, that’s not nice to say, but it’s true. I have seen pairing provide enough cross-pollination of knowledge that not only can key team members leave the project without killing it, but new team members can join the project and almost immediately become productive. No kidding.

That reduces, dramatically, the number of dependencies due to resource constraints.

Refactoring and Regression Testing

Well, this post is so long already it’s just ridiculous, so let’s get right to the point on this one. Clear back toward the top of this, we talked about the example of getting the foundation of the software in place prior to building the features. The magic of automated regression testing (a la [N]Unit and other nifty tools) allows developers to start developing basically anywhere and evolve the solution (features, architecture and all) into place over time. Without massive regression testing, it would be a nightmare to attempt to fundamentally change something about the architecture after numerous features have been developed. With massive regression testing, it is not too big of a deal. No kidding. I’ve seen it work; it’s still amazing to me, but it works.
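To make that concrete, here is a tiny illustration using Python’s unittest as a stand-in for [N]Unit-style tools; the total_price function and its expected values are invented for the example. The test pins down observable behavior, so the internals (or the architecture underneath) can be reworked later and the suite will catch regressions.

import unittest

def total_price(quantity, unit_price, tax_rate=0.07):
    # Hypothetical feature code; its internals (or the architecture behind it)
    # are free to change as long as the tests below keep passing.
    return round(quantity * unit_price * (1 + tax_rate), 2)

class TotalPriceRegressionTest(unittest.TestCase):
    # Pins down observable behaviour so later refactoring can't silently break it.
    def test_basic_order(self):
        self.assertEqual(total_price(3, 9.99), 32.07)

    def test_zero_quantity(self):
        self.assertEqual(total_price(0, 9.99), 0.0)

if __name__ == "__main__":
    unittest.main()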

Allowing the team to reserve some of their capacity for refactoring (or continuous design, as some have called it), and supporting their ability to refactor without terrible consequences, lets you drop even the intuitive and ‘safe-feeling’ macro-level sequencing like ‘gotta get the architecture right first’.

Summary

Now, after all that, keep in mind that while going around your project incinerating dependencies like abandoned buildings might sound like fun, nobody will laugh at you if you start slowly and let your Agile projects prove themselves capable bit by bit. Well, OK, some people will laugh. But some always do.

Would you like to argue in favor of more dependencies on software projects? Post a comment. I dare you.


Tags: Project Management

19 responses so far

  • 1 julieB // Aug 2, 2007 at 10:22 am

    Well, yes. I would like to argue. Not in favor of Gantt charts or specialists and especially not in favor of more MS Project.

    I appreciate the vast majority of agile principles, especially pair programming. I’m quite shocked anyone would think this is not the most efficient way of doing software development. I learned OO that way (Smalltalk in fact) and still always learn something new when pairing. Even if I’m the one leading.

    Refactoring is a fact of the software development life cycle. That is if you have any real size to the application.

    Developing in iterations, good. Self-organization, good. Less project management, good. Fancy Gantt chart that doesn’t reflect anything actual but impresses the powers that be, good.

    What I would like to argue for is some up front planning for all iterations. Drawing up building plans before laying the foundation. Not that these plans cannot be changed, but they need to be thought out in whole before beginning work.

    In a game of chess, you must use both strategy and tactics. Strategy to set out long-term goals for the game. Tactics are immediate maneuvers that take you toward the long-term goal. You have a plan. You’ve thought through how to get there. You are able to think through the tactics that will get you there. Of course, your opponent will throw some curves at you and you’ll need to re-adjust your strategy as you go on. But you always have a long-term plan. Your focus is never just the next move.

    I guess I think of software projects the same way. You have to think it out beginning to end. As you move through iterations, things will adjust, but you have to work on the whole picture. Looking at only the requirements in any single iteration without regard for the end result will not get you where you need to go. Or at least that is the part of agile that I can’t comprehend.

    For instance, I know I need to build a house. How can you start building the first floor without at least first thinking through the layout of the second floor? What if, after building the first floor, we take a look at the second and find out the house needs to be two feet longer? I can refactor the first floor walls but it is going to take a lot of work. If I had just taken some time to think through the big picture, discussed and confirmed it with the powers that be, I wouldn’t have to move walls. With some up-front planning and confirmation, my level of refactoring would be more like having to choose a different color paint or rearranging the furniture. Much easier and much less stressful.

    In short, I argue in favor of knowing what your dependencies are up front. Because they are there. Even if you don’t have a Gantt chart to prove it.

  • 2 David Anderson // Aug 3, 2007 at 4:42 pm

    I think it is a whole lot simpler to determine why Gantt charts are on the way out. They represent a deterministic approach to planning that locks in decisions early (even if MS Project was originally intended as a forecasting and modeling tool, that’s not the way it is used).

    Agile projects embrace Real Option Theory and its Lean/Toyota manifestation that says “make a decision at the last responsible moment.” So for example, resource allocation should be made at the last responsible moment and hence resource leveling in a plan is pointless.

    Much of the task dependency graphing in a Gantt chart is a side-effect of resource allocation, which in an agile project isn’t done until the last responsible moment. Hence, it is meaningless to try and graph dependencies. And hence, Gantt charts are useless.

    David

  • 3 Tim Scott // Aug 7, 2007 at 7:45 pm

    I am a project manager turned developer, and I consider myself an “Agile” practitioner. I quite strongly agree with everything in your post except one thing.

    The post implies that the resulting architecture will be satisfactory — automated testing plus refactoring will take care of it, no need to address it strategically. Well, from what I have seen, “emergent architecture” often ends up as Rube Goldberg architecture. A little time pressure, slight team disharmonies, one or two weak team members, etc., and things can get pretty tangled up under the hood. Yet all tests pass.

    I guess I agree with Julie that some strategic planning is needed. It’s not too hard to quickly grasp enough about an application to make some provisional architecture decisions and get started on some foundation and framing work.

  • 4 David Christiansen // Aug 8, 2007 at 6:32 am

    I like Tim’s comment because it underscores one of the fundamental problems of software development, no matter how you do it (as far as I can tell). It’s hard to create an architecture that will meet all your needs, whether you do it upfront or through evolution.

    One of the hardest things about trying to do upfront architecture well is that you are trying to lay the groundwork for solving a problem at a point in time when you understand the problem least.

    Poppendieck talks about this in Lean Software Development, about having a short architectural sprint that touches enough of the application to help the team get a solid understanding of the problem space. I am in the middle of a sprint like this in my current project, and to tell you the truth it doesn’t feel as productive as my regular approach, but I’m going to stick with it.

    I think some upfront architecture is frequently necessary, but extensive, all-encompassing designs are not helpful. A guiding principle in creating such architectures is to “make decisions at the last responsible moment.”

  • 5 Tim Scott // Aug 8, 2007 at 1:56 pm

    We all seem to agree so far that “big up front architecture” is bad — bad enough to go in the trash. Bye bye BUFA. Tate implies that “test and refactor” is its replacement. David C offers a guiding principle: “make decisions at the last responsible moment.”

    David’s suggestion is good, but let’s face it, applying this principle is strictly a dark art. It requires A-plus judgment to know when “the last responsible moment” is at hand or even when it’s approaching. In the fray of a complex project with team members bringing different perspectives, it’s really, really hard to make those judgments well enough to achieve satisfying results. That’s my experience.

    Any method (or set of principles, or whatever we call Agile) must be evaluated on its ability to produce results repeatably. When it comes to architecture, I believe that Agile is not there yet. I guess I am guilty here of pointing out a problem but not offering a solution. Heck, maybe we’re stuck; we have to accept that good architecture comes from really smart people and magical teams, and that’s it. I hope not, and I plan to keep thinking about it and looking for some effective practices.

  • 6 David Christiansen // Aug 8, 2007 at 3:35 pm

    I totally agree – good architecture is dark art. I’m not convinced there is such a thing as a methodology or process that can produce good architecture. I’m with you Tim – hoping we find a better way but not sure we will.

  • 7 Chris Woodill // Aug 13, 2007 at 11:06 pm

    Tate:

    I thought your article was quite interesting, and as a Director of IT who is trying to implement Agile development and a formal PMO at the same time, we’re definitely struggling with how to pick the best tool set for managing projects in an agile context.

    I have put a response on my blog (see URL) as I think that there are ways to use MS Project successfully in an agile context as long as you a) change the way you use the tooling to make it way more efficient (I have some suggestions on this on my blog); b) use GANTT charts at the appropriate level of detail (e.g. I agree that trying to put a bunch of micro-dependencies in your plan is overkill); c) use it more for macro-planning instead of micro-tracking.

    See my blog post on the subject – would be interested in your experience and feedback.

    All the best,
    Chris Woodill

  • 8 John Rusk // Aug 14, 2007 at 4:22 am

    I have a couple of “anti-Gantt” quotes from ex-Microsoft staff on my site. (Anti in the sense of suggesting that it’s not really ideal for software development projects.)

  • 9 Jason Glover // Jul 8, 2009 at 7:21 am

    David, I agree with you (and many others) that Gantt charts fly in the face of the wisdom of experience when managing projects the Agile (sane) way.

    However, there is no reason to throw the baby out with the bath water. My belief is that planning and tracking are still essential to any project, regardless of the methodology.

    I'm a seasoned Agile developer and PM and so I rarely use MS Project for anything other than as a very specific type of Visio.

    However I still believe in a thing called Gantt …
    http://blog.teameffect.com/post/Team_Effect_equal

    I welcome your thoughts on how this fits with your style of Agile management.

    Regards

    Jason

  • 10 Aleksey Drobnych // Aug 20, 2009 at 3:57 pm

    I support using Gantt charts for an Agile DLC. See my post on this topic: http://www.ganttzilla.com/blog/2009/08/gantt-for-

  • 11 web hosting // Apr 21, 2010 at 2:04 pm

    I always use Gantt charts but I noticed my work mates don't. They prefer other charts, more colorful and with more graphics but I find those hard to understand.

  • 12 education // May 26, 2010 at 9:00 am

    Thanks dude
    it's very useful information!!!

  • 13 Gantt Chart Template Trainer // Sep 3, 2011 at 12:52 pm

    I would like to know in what way Gantt Charts are not appropriate for software development projects? Is this a quirk in the nature of such work?

  • 14 Mark // Oct 6, 2011 at 11:57 am

    A little late in the game on this, as the post has been out several years…

    Being traditionally a ‘Waterfall’ PM — and under such methodology bringing a multitude of application development projects to successful completion (most all within budget/scope/schedule) — makes me wonder the following:

    In using the Agile Methodology, how can the PM ensure that the project will be within budget and on schedule?

    From what I understand, the Agile Methodology is flexible in that it allows virtually on-the-fly scope changes. However, in such a situation it seems almost impossible for a PM to provide any reliable information as to schedule and cost, simply because, with ever-changing scope requirements affecting multiple parallel pieces of work, there seems to be no accurate way to reliably track changes in costs and schedule.

    Do PMs constantly get re-estimates from all teams every time the scope changes? This would have to happen in order for the PM to maintain some semblance of a plan, in order to communicate to senior management and client the revised costs and schedule of the project.

    It would seem to me that the Agile Methodology is mostly useful for development houses that are willing to absorb internally the changes in cost/schedule/scope that deviate from the initial desired goal of the project. There would need to be constant re-negotiation with the client and senior management for costs and schedule if multitudes of scope changes are introduced throughout the life of the project.
    If you need to meet a hard deadline and required scope changes will push the project past said deadline, then the project risks being a bust if more resources won’t be adequate or can’t be found.

    As such, I believe Waterfall Methodology projects, while having some negatives, tend to be more predictable and perhaps better meet the needs of projects with stricter cost, scope and schedule requirements.

    Comments?

  • 15 David Christiansen // Oct 6, 2011 at 12:30 pm

    Hi Mark,
    No disrespect intended, but your comments make it clear that you have only a superficial understanding of agile. Your extrapolations of how agile works are not even close. If you really want to learn how agile projects handle estimating and planning, you should read two books: Lean Software Development and Agile Estimating and Planning.

    Good luck,

    Dave

  • 16 Tate Stuntz // Oct 12, 2011 at 6:22 pm

    I will be somewhat kinder than Dave, I promise…

    Like you, I was (still am) a traditional PM and a successful one at that. No success or validity of any agile principles needs to come at the expense of the validity of traditional methods.

    It is more the case that some scenarios fit better than others for a given method. Outsourced projects (which is my background) are actually quite difficult to set up and execute using Agile due to the commercial constraints of budget and schedule that you find yourself operating within. One downside to living within these constraints is that the sponsor of a project always wants to know how the project is doing based on those two metrics. The sponsor sometimes does not or can not perceive or measure how the project is doing according to the scope and quality (fitness for use) dimensions.

    I have seen this over and over. Any good PM worth their salt (and there are plenty of bad ones out there) can get requirements approved and then deliver on those requirements. The two tricky parts are:

    1. Were the original estimates of effort any good? If so, you can probably get the job done basically on time and on schedule. If you and your team are pretty experienced, you should be able to estimate the work pretty well. Again, there are plenty of people out there who are not very good at estimating.

    2. Did the requirements the users give you actually reflect what they needed? This is the area where I see the most failure. I honestly think the requirements are significantly defective the majority of the time.

    Experienced systems people can literally think in terms of requirements and design patterns as if it were a native language. Most business people or end users who are trying to speak that language are strangers in a strange land. They can’t do it. Therefore, the best a good PM can do in most situations is deliver on what was promised – even though that is not what the users need for their business. Thus the famous old research about vast percentages of waterfall projects coming in over budget, not fit for use (bad requirements alignment) or simply failing.

    It may not be a failure in your mind because you did exactly what you were asked to do, but the business person either has to pay more to get you to tune the system up once they start to better understand what they need or try to struggle through with what they asked for originally. Agile attempts to accommodate this reality by providing better mechanisms for input/feedback on the system you are developing for somebody. But, you are right, in order to do that, you lose the ability to stick to a fixed price.

    Anyway, don’t discard anything you do that’s successful in the name of being ‘Agile.’ Just realize that some situations may allow or even require a few extra tools in your toolbox to help you get your end user what they need. Projects with a lot of novelty are hard to estimate and hard to get good requirements for – those situations require some prototyping or fatter estimates due to risk or perhaps even an Agile project method.

    Dave is right about books. I like the ones he recommends and would also mention Craig Larman and Alistair Cockburn; those guys are great authors. Also check out Scott Ambler’s website and books.

  • 17 Ganttic // Apr 5, 2013 at 12:33 pm

    Great post! Thanks a lot for sharing this information. This blog has actually been a huge help to me. I am learning a lot of important matters here. I hope there will be more of these interesting and informative posts to come.


  • 19 Gaurav Gupta // Aug 21, 2014 at 2:41 am

    Gantt charts are dead, but for every project some level of resource planning and estimation is still needed. We’ve been using Google Calendar + ClipPod for our own resource scheduling and basic project planning and it has worked out quite well so far.
