Velocity & User Stories – Too Many Teams Are Looking At It Wrong!

Velocity – What Does It Really Mean?

Ask just about any agile team what metrics they track, and “velocity” comes up. Teams rally around increasing their velocity, focusing on speed and hanging their successes on this metric. Many report velocity increases to leadership and claim agile success! Some even go as far as comparing and benchmarking against other teams. To these teams, I challenge the approach with two questions: What results are actually being delivered? How can you link this to the organization’s strategy? Other teams treat velocity as just one data point, a measurement that, when looked at alongside other information, offers some insight into their processes, trends, and where to focus their continuous improvement.

Reframing Velocity for Agile Teams

Alternatively, I want to provide another lens on what velocity can mean to teams, one that I have found really resonates with executives as organizations measure whether or not agile is really making a difference and delivering more agility to the organization. Predominantly, it’s about using velocity as a means to get closer to value delivered, rather than simply a speed metric. After all, delivering software fast does not equal delivering software that is valuable and meaningful and that advances organizational results.
I’ll start with a baseline definition of Velocity. Scrum, Inc. defines it as “a measure of the amount of work a Team can tackle during a single Sprint and is the key metric in Scrum. Velocity is calculated at the end of the Sprint by totaling the Points for all fully completed User Stories.” The Agile Alliance defines it this way: “at the end of each iteration, the team adds up effort estimates associated with user stories that were completed during that iteration. This total is called velocity.”
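To make that arithmetic concrete, here is a minimal sketch in Python; the stories, point values, and completion flags are purely hypothetical examples, not data from any real team:

```python
# Velocity per these definitions: total the points for all fully completed
# user stories at the end of the sprint. All data below is made up for illustration.

sprint_backlog = [
    {"story": "Shopper can save items to a wishlist", "points": 5, "done": True},
    {"story": "Shopper can filter results by price", "points": 3, "done": True},
    {"story": "Shopper can pay with a gift card", "points": 8, "done": False},  # not finished, does not count
]

# Only fully completed stories count toward velocity.
velocity = sum(item["points"] for item in sprint_backlog if item["done"])
print(velocity)  # 8
```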
 
Like many concepts, when teams start to move in an agile direction the basic definitions take on meanings all their own. Creating a team’s own definition of things can at times be very productive, and at other times be an unconscious step toward an anti-pattern. An anti-pattern is when we do something that seems logical at the time but creates results counter to the bigger-picture goals. This often happens when teams try to become more agile yet adapt their practices because they believe their situation is “unique” and calls for an adaptation of the practice. What the teams do not realize is that the adaptation seems logical but goes against the very agile values that create results. Agile teams should adapt practices to fit their own context, but too many do so while compromising the values an agile approach is built upon. Many do not realize they are compromising their results and instead claim they are “being flexible” and “self-organizing”. These are agile-natured principles too, but are they being applied to the team’s own detriment?

Velocity and User Story Connection

For velocity, many teams miss the user story part of the definition. Some teams call every completed work item a user story and claim improvements in team velocity, yet the way they use user stories may be incorrect, compromising the idea of “team improvement and success” as viewed by others and by executives. Many teams focus on points and the pointing process to help velocity; however, in this blog, I want to address the User Story part of the velocity equation. Using user stories as they are intended brings a different view of velocity.

Think of velocity for a moment as a measurement of how often the team is getting user feedback on real, working software. When we look at the agile values and principles, it is all about “working software”, “continuous delivery”, and “getting feedback from real users”. This does not mean getting feedback from a downstream team that depends on a piece of software. Many teams are creating software continuously, but it is NOT working software that a user could provide feedback on for many iterations/sprints. It may be a piece that goes to another team, and only after many teams touch it (each counting velocity for their own piece) can anyone get real user feedback. Executives and customers deserve accurate reporting on how fast software is being built, so how is this a true measurement of velocity? It’s not! If a user cannot give feedback, then the work is not DONE, and we should not count it toward velocity.

Feedback Velocity

To help with this, I will often call it “feedback velocity”, not to be confused with a “poor definition” of velocity that may already be in practice. By looking at how often the team gets feedback on working software from end users, the team can more accurately reflect how fast it is delivering working software and connect better with executives when reporting the pace of software development. Measuring velocity on a part that depends on other parts before a user can give feedback is not an accurate measurement of true velocity.
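To show the distinction, here is a minimal sketch; the stories, points, and the user_feedback_possible flag are hypothetical, purely to contrast a raw point total with a “feedback velocity” that counts only work a real user could respond to:

```python
# "Feedback velocity" sketch: count only completed stories that an end user
# could actually see and give feedback on. All data is hypothetical.

completed_stories = [
    {"story": "Shopper can save items to a wishlist", "points": 5, "user_feedback_possible": True},
    {"story": "Internal pricing service endpoint", "points": 8, "user_feedback_possible": False},  # component only
    {"story": "Shopper can filter results by price", "points": 3, "user_feedback_possible": True},
]

raw_velocity = sum(s["points"] for s in completed_stories)
feedback_velocity = sum(s["points"] for s in completed_stories if s["user_feedback_possible"])

print(raw_velocity)       # 16 -- what many teams report to leadership
print(feedback_velocity)  # 8  -- working software a real user could respond to
```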
Let’s ask ourselves for a moment: how long are teams really willing to work without knowing they are on track from a user-goal perspective? More often than not, that feedback strongly correlates with the value delivered by the team. It also reduces painful architecture rework and makes room for changing requirements. Working without user feedback increases the risk that the infrastructure, architecture, and data structures created will need to be rethought. It’s detrimental for a team to tell a business leader after the software is complete that what they really needed would require a completely different architecture or infrastructure! The rework is often too expensive, and business teams and customers are forced to “live with it”.

User Story Quality

Now back to user stories. If User Stories are the unit measured for velocity, then we had better get this input right! Many teams struggle to write user stories well enough to make this critical input to velocity a meaningful measurement. Below I’ve outlined some user story anti-patterns that cause velocity measurements to go awry:
  • User Stories written as tasks to be assigned to a team member
  • User Stories that are just a piece of the software, not something that you could show a user and receive feedback on
  • User Stories that are not “releasable” on their own (a team and PO can choose when to release and wait, but each story should be releasable as a unit of work on its own)
What User Stories should be:
  • A user goal/action that we want to implement or improve (the story is about THE USER, not the system or a component of a system, a phase, another team, or a team member)
  • A unit of work, representing a user action/goal that is “demo-able” to end-users for feedback
  • A unit of work that the team works on, not an individual
When these characteristics are met, the team can truly rally success around real, working software that they can receive feedback on. They can release whenever they want and obtain feedback as often as they want on exactly what they are working on. This is what VELOCITY is all about!
In conclusion, if your team or organization is employing user stories incorrectly, then you are also likely not reporting velocity correctly. You run the risk of your executive team questioning your “success”. Do they see value delivered to your customers faster?