
Monday, 10 March 2014

High Level Agile Retrospectives: Sense making with the management team


In my previous post, "Will the real product owner please stand up?", I sketched a scenario in which a development team is pressed to flex and adapt to product roadmaps that tend to be quite aggressive. The roadmap is generally accepted as the template driving the delivery plan, the expectation is set at the highest levels of the business, and so the team must just get on and deal with it.

This scenario is typical of many software product teams: the success of the initial product launch was down to a Herculean effort by the entire team, the product got out on time, and that original pace of delivery then set the scene & the tone. The precedent is set, and people expect the team to pull it off again and again.

This is all well and good, since the team has inspired confidence that they are committed and can indeed deliver. But little of the legwork and thrashing that went on inside the development team is visible to the business stakeholders; the team would still love to commit to the roadmap, yet they are quite edgy about the reality and implications of doing so.

We are faced with the classic problem of scaling: "What got you here won't get you there!" unless something changes. But people are nervous about abrupt changes, because the teams are currently working through their own internal improvement plans, which first need to be seeded for a period of time, reviewed & evaluated, before further changes are committed into the feedback loop.

As an Enterprise Program Manager, it is my job to put a high-level plan together that shows the stages of delivering on the product's roadmap. I am only comfortable doing this planning when I have confidence in the development & execution processes of the impacted delivery teams. Unless I have confidence that the teams understand what's expected, and that they have a reasonable process / template for execution, I will not put my name against a delivery plan that I know is close to impossible to achieve.

So I roll up my sleeves, don my consulting hat, and facilitate conversations with the development & delivery teams. In this post, and the series that will follow, I share my journey with a management team that faces the interesting challenge of balancing the delivery expectations of the business on the one hand, and growing their young agile development team on the other, by applying systems thinking tools of sense-making to better understand the challenges that await.

I work with the mid-to-senior managers only, expecting the respective managers to have held internal team conversations beforehand, so that they bring all of their team's issues to the table with their fellow management peers. Despite the team adopting an Agile/Scrum mindset, it is quite interesting to note that the "one-team approach" is really quite difficult to achieve in practice.

I share the results of the first sense-making workshop, which was guided principally by answering these two questions:
  • Given your current way of working, how much confidence do you have in achieving the roadmap?
  • What ideas can you suggest that we should look into, that if we implemented now, could help us to get close to achieving the roadmap?
Interestingly enough, and to my surprise, the answer to the first question was a resounding NO! The team had little confidence the roadmap could be delivered on - not even a Yes from the product team! The second question generated some interesting ideas that sparked healthy dialogue.

Upon reflection, it turns out that our problems aren't that unique, sharing quite a lot in common with most development stories you read about. More interesting is that the ideas were not new topics for the teams themselves, showing how teams can get stuck in a rut, dealing with everyday pressures that take priority over implementing and driving change. Despite many internal retrospectives, the same topics were brought up again, clearly showing a lack of championing and initiative from within the various teams. Without such initiatives emanating from within the teams, how can management be expected to ignore this and not take a top-down approach to managing?

Technique used in the workshop
I facilitated the retrospective workshop, playing the impartial, neutral party. The people involved were a mix of middle & senior managers spanning product management, software development, systems integration, agile management, business analysis and quality engineering. The idea-generation workshop lasted two hours, followed by an hour-long session the next day for sense-making.
I used the concept of "Rounds", where each person gets a chance to speak & offer comment without interruption. The rules are that everyone must respect each other, no interruptions, and strictly no challenging of views - the place must be a safe one where people can speak freely without being afraid of fellow peers or line managers (tricky, since it was performance review time!). Following the initial ideation workshop, the second session was about applying Systems Thinking tools to unpack the ideas.
Systems thinking here is about gathering the data into themes, looking for common traits, and naming each theme as a variable. For example, topics touching on the subject of "Quality" are grouped under the variable "Level of Quality". Once these variables are identified, we look for patterns of influence: how does each variable impact the others? We look for relationships between the variables using what is called an "Inter-relationship" or "Network" diagram. From that we generate a scorecard that shows us the core driving variables. From the scorecard, we assess and estimate the amount of effort we're spending in each area. Once that is known, the areas receiving the least effort are usually the candidates to improve on immediately.
Index cards, stickies and posters are used throughout the workshop.
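
To make the theming step concrete, here is a minimal sketch in Python; the topic-to-variable mapping below is invented for illustration and does not reflect the team's actual grouping:

from collections import defaultdict

# A minimal sketch, with made-up topic names, of rolling raw retrospective
# topics up into the theme "variables" used later in the network diagram.
topic_to_variable = {
    "Enforce test-driven development with vendors": "Level of Quality",
    "Must achieve a releasable build every sprint": "Level of Quality",
    "Involve teams in roadmap timelines": "Quality of Planning",
    "Allocate members to separate feature teams": "Efficiency of Team Structure",
    "Share technical information earlier with all teams": "Effectiveness of Communications",
}

# Group the topics by variable, so each theme ends up on its own poster.
themes = defaultdict(list)
for topic, variable in topic_to_variable.items():
    themes[variable].append(topic)

for variable, topics in themes.items():
    print(f"{variable}: {topics}")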

Topics generated for consideration
The list below has had only minor edits to respect confidentiality - see how this is the same old story of software development [who says development teams are special and unique, hey? ;-) ]:
  • Allocate team members to separate delivery/feature teams (+4)
  • Other teams need more input into the contents of each build
  • Change how test teams operate
  • Technical/All information to be shared earlier with all the teams
  • Improve Component Quality of deliveries: Enforce Test-Driven development with all software vendors
  • Breakdown of silo mentality (Feeling that we've not addressed the core issues learnt from earlier retros)
  • Reduce scope of work to focus on business imperatives only, prioritised and value-based
  • Must achieve a releasable build every sprint
  • Better control of how features are rolled in, switched on/off; who owns Continuous Integration (CI)?
  • Involve teams in roadmap timelines before freezing deadlines (+3)
  • Software vendors must deliver dependent components in advance (just-in-time is not working)
  • Release manager must have a view/control of what goes into CI
  • Late feedback from Senior Management (even with wireframe approvals) to be improved
  • Better control of code changes across components (Limit amount of code changes to targeted areas) (+3)
These topics were placed on index cards to be grouped into themes for the sense-making workshop:

Output of the Sense Making - Network Diagram
The network diagram is essentially a picture of relationships. How one topic affects others is shown by the arrows emanating outwards (the drivers, shown by arrows leaving the card) and by impacts from other cards (arrows coming in). The ratios show outbound-to-inbound: the higher the ratio, the more that topic influences or drives the others. In this case, the strongest driver is Efficiency of Team Structure, with everything else ultimately impacting, or resulting in, the Level of Quality.
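
Purely as an illustration (the real diagram lived on a poster with index cards and marker-pen arrows), the outbound-to-inbound arithmetic behind the ranking can be sketched in a few lines of Python; the influence links below are assumptions made up for the example:

from collections import Counter

# Hypothetical directed influences, (driver, driven) pairs as they might be
# read off the inter-relationship diagram. These links are illustrative only.
influences = [
    ("Efficiency of Team Structure", "Effectiveness of Communications"),
    ("Efficiency of Team Structure", "Quality of Planning"),
    ("Efficiency of Team Structure", "Level of Control"),
    ("Effectiveness of Communications", "Level of Control"),
    ("Level of Control", "Quality of Planning"),
    ("Quality of Planning", "Time to Market"),
    ("Time to Market", "Level of Quality"),
]

outbound = Counter(src for src, _ in influences)   # arrows leaving each card
inbound = Counter(dst for _, dst in influences)    # arrows coming in
variables = set(outbound) | set(inbound)

# The higher the outbound-to-inbound ratio, the stronger the driver.
def ratio(v):
    return outbound[v] / max(inbound[v], 1)

for v in sorted(variables, key=ratio, reverse=True):
    print(f"{v}: {outbound[v]} out / {inbound[v]} in, ratio {ratio(v):.2f}")

With links like these, the ranking surfaces Efficiency of Team Structure as the strongest driver and Level of Quality as the outcome at the end of the chain, which mirrors what the team found on the poster.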

Thus, the success of the delivery pivots on improving the efficiency of the team structure, which influences the effectiveness of communications, which in turn plays a major part in understanding the levels of control required, thus contributing to the overall quality of the planning, which determines our time to market, and ultimately speaks to the overall level of product quality!

This picture is the result of dialogue and frank conversations within the management team. Where topics were closely linked and the relationships questionable, it was left to a team vote to decide. We did this in just under an hour, which is the point of the exercise: to rapidly make sense of scenarios without dwelling too deeply in the analysis and risking analysis-paralysis, which just leaves the wheels spinning and gets us nowhere.

So for a first pass it looked quite sensible, and to no surprise it fits in with the well-known challenges of software product development.

Output of the Scorecard Assessment
The scorecard builds on the network diagram and is arranged in descending order of the drivers, starting at the bottom with the strongest driver and working upwards. The assessment made it obvious that everyone is quite sad about all the areas, so we then looked at the amount of effort the team is spending in each area.

So it turns out the team, due to the pressures of delivery and keeping business stakeholders happy, spends the bulk of its time focusing on getting the product out, Time to Market (60% focus), at the expense of the other areas. There seems to be some improvement in addressing quality, at 20% focus, while Level of Control and Effectiveness of Communications come out low at 2% and 4% respectively. Some focus has gone into improving Efficiency of Team Structure, but more work clearly needs to happen in that area, with improving Quality of Planning following close behind.
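
A back-of-the-envelope view of that assessment, using the percentages above and treating the two figures the team did not state explicitly as illustrative guesses, might look roughly like this:

# Scorecard sketch: variables ordered strongest driver first (per the network
# diagram), paired with the effort currently spent on each.
scorecard = [
    ("Efficiency of Team Structure", 8),     # illustrative guess
    ("Effectiveness of Communications", 4),
    ("Level of Control", 2),
    ("Quality of Planning", 6),              # illustrative guess
    ("Time to Market", 60),
    ("Level of Quality", 20),
]

# Strong drivers receiving little effort are the obvious candidates to
# improve on first.
for variable, effort in scorecard[:4]:
    if effort < 10:
        print(f"Candidate for immediate improvement: {variable} ({effort}% effort)")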

So that's the result of the first pass of analysis. The idea is to repeat the same network/scorecard assessments for each underlying area - once we have everything on the table, we can then review the areas to tackle, incrementally of course...

Management team must also collaborate and work as a team!
Here's the management team working closely together in sorting out the index cards into themes:

And here's the team working out the inter-relationship / networking diagram, healthy debate happening nicely:


4 comments:

  1. I think this summary you made is very poignant:

    "Thus, the success of the delivery is highly pivoted around improving the efficiencies of the team structure, which influences the efficiency of effectiveness of communications, which in turn plays a major part in understanding the levels of controls required, thus contributing to the overall quality of the planning, which determines our time to market, and ultimately speaks about the overall level of product quality!"

    It's not a conclusion, musing, rant or theory - it's a cold, hard, fact - a result of the sense-making exercise.

    Your teams are all focused on ‘Time to Market’ to meet the roadmap that is generated by market "expectations" of your Product Owner(s); when in fact this is driven by four other topics and is actually the output of the equation - not the input.
    (and how time-to-market drives quality I have no idea... maybe you could expand on that? or is that a negative-driver?)

    From your previous blog posts, notably "Will the real Product Owner please stand up?", the software delivery is very much driven by a top-down-only approach. Your efforts are then directly focussed on a not-directly-controllable output in a desire to appease that top-down-only approach.

    I suggest an exercise for your managers: each should identify topics within their direct scope-of-influence (or locus-of-control). That is, if they personally changed their approach to the management of their direct-reports, what topics could they directly impact?
    I posit that none will be able to positively influence Time to Market, yet they will be able to influence quality, communication, skills/talent and various operational inefficiencies.

    You will then hopefully be able to demonstrate that while Time to Market is an important strategic consideration it cannot be used to drive development/delivery – but rather focus must be placed in eliminating waste/working efficiently and with quality. This would then also help to prove to the C*Os that the roadmap is an idea, and not a project-plan of hard deliverables.

    And – finally – when the question is asked how to better predict time-to-market for more fine-grained roadmaps/project-plans you will have the answer: work/manage efficiently, and with quality, to create a predictable velocity for the team-at-large. Then it’s the simple division of the expected effort by that velocity to calculate time-to-market.
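
    Expressed as a tiny back-of-the-envelope calculation (the numbers below are purely illustrative):

    # Purely illustrative numbers: a stable, demonstrated velocity turns a
    # roadmap item into a forecast rather than a promise.
    expected_effort_points = 240   # estimated size of the roadmap slice
    velocity_per_sprint = 30       # demonstrated team velocity per sprint
    sprints_to_market = expected_effort_points / velocity_per_sprint
    print(f"Forecast time to market: {sprints_to_market:.0f} sprints")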



    (and I call it 'top-down-only' because there is clearly no 'bottom-up' feedback into the Umbrella Corp. that is your Product Owner(s). Use any metaphor you want - inverted-pyramid, cooks & broth, a spinning-top, Zimbabwean Govt. - I doubt you will ever find one that's synonymous with sustainable stability.)

    Replies
    1. 1-line executive summary:
      you CANNOT guarantee a takt-time without first calculating your cycle-time

  2. @Greg - thanks for taking time out to contribute with such detailed and excellent comments! :-)
    On the Quality-TimeToMarket relationship, you're absolutely spot on that it actually comes out as a negative driver: the focus on hitting the deadline adversely affects quality (the classic trade-off of the age-old Time-Cost-Quality triangle). On the other hand, in agile, the reverse can be true, in that the deadline results in increased attention to quality upfront…
    But in the conversations with this particular team, they were leaning towards the classic Time-Quality argument, and in the end voted on the issue, resulting in TimeToMarket being a stronger driver than Level of Quality.

    It is interesting to note the outcome though: it is not entirely unique, and these are the classic problems faced by software teams of yesteryear. The fact that in 2014 we still have software teams tripping over what should be first principles is rather concerning to me….

    When a team decides to take on agile, they need to be clear on understanding the principles of software engineering: design, test driven development, continuous quality & continuous delivery. Agile is not an excuse to bypass the strong foundations of software engineering…

  3. Regarding the strong foundations of software engineering, here's a great talk that you & your readers _must_ see if you haven't already:

    Glenn Vanderburg - Real Software Engineering: http://youtu.be/NP9AIUT9nos
