Tuesday, May 27, 2008

The Iterative Pace

A development iteration should be approximately two weeks long. The size of the team determines how much work should be done in the course of the iteration, not how long it lasts. Why two weeks? Because it is long enough to make significant strides, but not so long that a) details dull in memory or b) feedback becomes difficult to incorporate. These are the principles that drive the pace described below. In each iteration the team should engage in just enough of the design, build, test, document cycle to capture accurate content with a minimum of effort: This is the meaning of efficiency.

The end-of-iteration demo is often seen as the focal point of the iteration: It is. However, you must remain committed to the idea that the demo exists to show the product of the iteration, not that the iteration exists to create the product of the demo. This subtle difference can be the yardstick against which decisions should be made during the iteration.

Expectation management is key to running a successful iteration. We expect and welcome direct feedback from our clients. We must be able to focus their attention on items that are malleable and on requests that are fulfillable within established scope boundaries. On rare occasions, we find a requirement that forces a realignment of scope and project boundaries: These we must face head-on. It is one of the benefits of the iterative approach that requirements are vetted and scope is managed in real time. We expect that expectations are wrong and that requirements are incomplete at the start. Our process adapts to those accepted failings and maximizes the likelihood that our final result will be better suited to the tasks than anyone could have initially imagined. It is collaborative, it is dynamic, it is rewarding.

Here is a rough sketch of each iteration's pace:

  1. Day One:
    1. Review feedback from previous iteration.
      1. Incorporate simple changes.
      2. Plan for larger changes.
      3. Alert PM to any possible scope or timing impacts.
    2. Review portions of previous iteration goals that did not get done.
      1. Plan for finishing.
      2. Alert PM to any possible scope or timing impacts if the feature has been deprioritized.
    3. Review feature scope for the current iteration.
    4. Discuss design approach.
  2. Day Two – Day Seven:
    1. Design
      1. Quick bursts driven by pictures
    2. Build
      1. Collaborate
      2. Add notes where appropriate
      3. Document progress with screenshots and short mnemonic annotations in OneNote
    3. Review with team
      1. Incorporate feedback.
  3. Day Eight:
    1. Take stock of progress.
      1. Prioritize finishing tasks and defer new tasks.
      2. Set expectations of client about feature and finish levels.
    2. Finish the build.
      1. Stub interfaces or advanced features to simulate functionality.
  4. Day Nine:
    1. Unit test.
      1. Confirm technical soundness of established functionality.
      2. Document procedures with screenshots and autotest spreadsheet data.
        1. Document results of unit tests.
        2. Repair simple defects.
        3. Note major defects for next iteration.
      3. Set client expectations for the iteration based on known issues.
    2. Peer review
      1. Check internal documentation level:
        1. Usage and Descriptions
        2. Activity Steps
        3. Flow Comments
      2. Review and revise based on naming standards.
      3. Seek improvements for stability and maintainability.
      4. Check compliance against architectural mandates.
        1. Repair simple violations.
        2. Note major violations for next iteration.
    3. Architecture review (may be performed by Architect or Lead Developer)
      1. Confirm technical soundness of established functionality.
      2. Check internal documentation level.
      3. Review and revise based on naming standards.
      4. Seek improvements for stability and maintainability.
      5. Check compliance against architectural mandates.
      6. Contribute a brief write-up of findings.
      7. Review findings with lead developer and/or team.
  5. Day Ten:
    1. Collate design documentation, unit test results, peer review results, and architect review results with auto-generated documentation.
    2. Set expectations of client about demo content.
    3. Forward documentation to client.
    4. Demo content:
      1. Reset expectations about demo content.
      2. Demo the content.
      3. Reset expectations about demo content (not a typo, do it again).
      4. Collect feedback.
        1. Positively acknowledge understanding of feedback.
        2. Incorporate simple changes and redisplay.
        3. Discuss larger changes in context of current priorities.
          1. Set expectations for effort involved and impact on design imperatives.
          2. Seek reprioritization.
      5. Set expectations for next iteration.
    5. Set the stage for next iteration.
      1. Review iteration progress with team.
      2. Discuss feedback from peers, architect, and client.
      3. Export Product to zip to preserve demo content.
      4. Roll RuleSet versions (usually minor).
      5. Rest.
