Thursday, April 10, 2014

Interesting Data on Agile Development from Rally

I attended a useful presentation from Rally Software last night in downtown San Francisco.

The presentation focused on the results of ongoing research into almost 10,000 agile teams and their characteristics, a project led by Larry Macherone, Director of Research at Rally.

Rally is now reporting from its database of projects on four key dimensions of performance:

  1. Responsiveness: based on Time in Process, i.e. Time to Market.
  2. Quality: based on defect density (defect count divided by person-days).
  3. Productivity: based on throughput divided by team size (user stories and defects completed in a given time period).
  4. Predictability: based on throughput variability (standard deviation of monthly throughput over three months, divided by the average for those three months).
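
The four definitions above can be sketched in a few lines of code. This is only an illustration of the arithmetic, not Rally's implementation; the sample figures and the variable names are invented for the example.

```python
from statistics import mean, pstdev

# Hypothetical sample data for one team over a quarter (invented numbers).
defect_count = 12
person_days = 400                    # total person-days worked in the period
team_size = 7
monthly_throughput = [21, 25, 19]    # work items completed in each of 3 months
cycle_times = [4.5, 6.0, 3.0, 5.5]   # days in process for each completed item

# 1. Responsiveness: average Time in Process (days per work item).
responsiveness = mean(cycle_times)

# 2. Quality: defect density (defects per person-day).
quality = defect_count / person_days

# 3. Productivity: throughput divided by team size.
productivity = sum(monthly_throughput) / team_size

# 4. Predictability: throughput variability, i.e. the standard deviation
#    of monthly throughput divided by its three-month average.
predictability = pstdev(monthly_throughput) / mean(monthly_throughput)
```

Note that predictability is a coefficient of variation, so lower values mean a more predictable team.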

Relationships Emerging


- Productivity is correlated with how fully the team is dedicated to the project. The higher the dedication, the higher the productivity.

- Quality, as measured by defect density, is higher for teams spending less than 50% of their time on the project.

- Team stability tends to be quite low in the sample data, which likely has a major overall effect because stable teams have 60% better productivity, 40% better predictability and 60% better responsiveness.

- Teams that do "full Scrum", i.e. estimating both story points and task hours, have 250% better quality, but teams that do lightweight Scrum, using story points alone, perform better overall, with better productivity, predictability, and responsiveness.

- Teams that aggressively control work in progress cut process time in half and have only 25% as many defects, but have 34% lower productivity.

- Teams with 7 members, plus or minus 2, provide the most balanced performance. Smaller teams of 1-3 have 17% lower quality but 17% more productivity, presumably because less time is spent communicating.

As Larry pointed out in his presentation, metrics are easily abused. Story points are easy to count, but make no allowance for stakeholder satisfaction, value for end user, or other quality metrics.

Nonetheless, the presentation is worth reviewing.