Tuesday 6 November 2012

Template for Set-Top-Box Release Campaigns

In this post, I'd like to share a concept I've coined "STB SI Release Campaigns: The Climb to Launch", which a STB SI (Systems Integration) Project Manager, Development Manager or Delivery Manager can use to manage the process of stabilizing the product leading up to the final Launch / Deployment campaign.

No matter what flavour of DTV project you run, if the STB (Set-Top-Box) is impacted, whether it is new hardware, new software, or existing hardware with brand new features, the last mile of any DTV launch is STB testing, acceptance and deployment. It is widely accepted practice that the building blocks and fundamental backend or headend systems required to drive the STB end-user experience are in place well in advance of the STB.

This isn't always possible, especially when changes cut across the entire end-to-end system. In such cases, some prefer big-bang integration testing; I much prefer component-based testing using test harnesses and simulation, leaving the rest to integration testing. That is a topic for another post altogether...

So the last mile of the majority of DTV deployments is STB system testing, including user experience & final acceptance testing. The team that is usually the custodian of the STB delivery is the STB System Integration team (STBSI). It is this team that carries the burden of producing a stable build, ironing out all the critical issues, and ultimately producing release candidates to recommend for launch.

Typical Milestones in a STB Delivery Project
I've written previously about clearly defining objectives & goals for your STB project, such that they can be quantified & qualified through a measured process. To recap, a STB project will typically include the following Major Milestones:
  1. Start of Closed-User-Group (CUG) Field Trials
    • Headend is available in advance and operational on the live broadcast
    • STB SI have created a build that is functionally complete to all intents & purposes (FC)
    • We are in a position to release this build to enter formalized acceptance testing (ATP QA)
    • We are ready to start the path to Release Candidates, the climb to Launch - i.e. iterative cycles that will be repeated to reach final Product Launch
  2. Start Wider Field Trials
    • Signifies the build is stable, nearing completion and ready for an external audience
    • STB passes all certifications (HDMI, CA, Macrovision, etc.)
    • STB Hardware passes all hardware testing (is the hardware fit-for-purpose, Consumer Acceptance, Safe, Green, etc.)
    • We are getting closer to producing a final Production Build
  3. Produce Launch Build
    • Field Trials drawing to a close
    • STB SI issues are clearing away, last few hurdles to pass through, but not critically blocking launch
    • Finalised Release Candidate almost in hand (RC3)
  4. Clean Run
    • STB SI Produces Launch Build
    • Final ATP QA expected to complete with no Showstoppers
    • Final build sent out to Field Trials to verify critical issues
    • Soft download done to select group of subscribers
      • No critical issues found from the soft launch
  5. Deploy
    • Final Deployment build created by STB SI and provided to Launch Delivery Team
    • Image is broadcast for software upgrade, or the decoder STB launches to market (goes public in retail stores)
    • Process continues for some time, initially about 2 months
    • Next release is being planned (new features & bug fixes)
For each of the above milestones, the project will have to define entry & exit criteria (see my previous post or the attached presentation). This might seem simple and logical, but STB projects are often complicated by legacy rules, a history of business & project decisions, and frequently involve more than one target STB/Decoder hardware platform. Where more than one STB is involved, one typically chooses a lead STB for launch, with a quick follower - it depends on business objectives of course.
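
As a simple illustration (my own Python sketch, not the format used in the attached presentation), the entry/exit criteria for a milestone can be captured as a measurable checklist, so that meeting the milestone becomes a yes/no question rather than a debate. The criteria names and pass/fail values below are made-up examples:

  # Illustrative sketch only: exit criteria for the CUG milestone as a checklist.
  # Criteria names and values are examples, not the definitive launch criteria.
  cug_exit_criteria = {
      "Headend operational on live broadcast": True,
      "Build functionally complete (FC)": True,
      "Zero open Showstopper defects": False,
      "ATP QA regression pass rate >= 95%": True,
  }

  milestone_met = all(cug_exit_criteria.values())
  print(f"CUG exit criteria met: {milestone_met}")  # -> False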

Introducing the Release Campaign Concept
I use this concept to bring structure to a STB Delivery Project. Having clearly defined milestones, as well as realistic, clear & unambiguous goals/objectives for each milestone, naturally allows for a simple process that can be executed repeatedly until all the objectives are met. The key points:

  • Structured, Repeatable & Easily Measurable
  • Unambiguous & clear focus on defined Launch Criteria
  • Sequence of repeatable activities to execute almost automatically to produce Release Candidate (RC) builds
    • Planned up-front: Release Campaigns can target a specific milestone and aim for a fixed duration. Example: I would like to achieve a CUG FC build within eight (8) weeks of reaching code complete.
    • The release cycle, production of the RC build for each milestone is Incremental
      • Current practice is to use a cycle of 10 working days (2 work-weeks)
    • Involves the buy-in & full participation of the whole team
      • Project & Programme Managers
      • Product Owners
      • STB SI Team
      • Component Vendors (Middleware, Drivers, Application)
      • ATP QA Team
      • SI QA Team
      • Field Trials & Operational Support team
Associated with this is a simple process that can be used by Delivery Project Managers to plan, based on a set of available macro variables:
  • Duration for SI to produce an incremental build (I use 10 days)
  • Amount of time to allow vendors to fix Showstopper defects in pre-candidate builds (can vary between 1 and 10 days depending on service level agreements)
  • Duration of the Release Campaign (How many SI cycles of build increments to allocate for specific milestone?)
Below is an illustration of the core variables & typical timeline milestones, followed by a small sketch of how they combine:
Overview of Time Milestones in a Release Campaign (RC)
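
To make these variables concrete, here is a minimal Python sketch of how they might combine into a single campaign's timeline. The figures are illustrative assumptions of my own (10-working-day cycles, a 5-day vendor fix window added once at the end of the campaign), not values taken from the tool; with four cycles it lands close to the eight-week CUG FC example above:

  # Minimal sketch of one Release Campaign's length from the macro variables.
  # All names and figures are illustrative assumptions, not the tool's values.
  SI_CYCLE_DAYS = 10        # working days for SI to produce an incremental build
  VENDOR_FIX_DAYS = 5       # working days allowed for vendor Showstopper fixes
  CYCLES_IN_CAMPAIGN = 4    # SI cycles allocated to this campaign (e.g. CUG FC)

  def campaign_length_days(cycles, cycle_days, vendor_fix_days):
      # Each cycle is one SI build increment; the vendor fix window is assumed
      # to be added once, before the final candidate build.
      return cycles * cycle_days + vendor_fix_days

  total = campaign_length_days(CYCLES_IN_CAMPAIGN, SI_CYCLE_DAYS, VENDOR_FIX_DAYS)
  print(f"Campaign length: {total} working days (~{total / 5:.0f} weeks)")
  # -> Campaign length: 45 working days (~9 weeks)
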
The process is repeatable, one Release Campaign flowing to another, with feedback. It is a continuous process of defect fixing, incremental builds & continuous verification leading up to the production of the next candidate build for the next release campaign, as illustrated by the pictures below:
Flow of Release Campaigns
Example Flow from One Release Campaign to Another
As the quality of the builds improves (as expected), the length of campaigns should reduce over time. This assumes, of course, that the project is in real delivery mode, that all feature development is complete (or the project implements a very strict stability regime) and that the fundamental foundations of the stack have reached an acceptable level of maturity.

Release Campaign Model - Tool for the Project Manager
What the model allows the Delivery Project Manager to do, then, is play with a few scenarios, especially the length of the release campaigns, adjusting for bug-fixing and stabilization problems. The PM can then use a tool that, by fine-tuning a few variables, comes pretty close to producing realistic plans (a small worked sketch follows the list below). That fine-tuning can also be used to show stakeholders that the project still has a long way to go (which in general is true of almost all STB projects: they are usually too ambitious and poorly planned from inception):

  • #Days STBSI needs to create & sanitize a build. The STB SI team usually integrates a few components from third parties, puts a stack together and runs basic sanity, smoke & reliability testing. This usually takes on the order of 3-5 days depending on the complexity of the stack, as well as the maturity & competence of the SI engineers.
  • #SI cycles required to stabilize core software on final hardware. Generally some time is required to get a software stack stable enough on the target hardware (regardless of whether this is a mature software stack being ported from previous hardware to new hardware, or a brand new stack on first-time hardware).
  • #Cycles to allow vendors to meet FC build CUG Criteria. How much time to allocate for the initial Release Campaign to produce a Functionally Complete build, according to the agreed acceptance criteria. This is generally the first time all components are delivered to STBSI as ready for functional integration, and it is expected to take the most time.
  • #Cycles from Release Campaign X -> Release Campaign Y. Repeat until happy - the number of RC attempts will of course vary according to the nature of your project. I tend to go with three major campaigns, but you can have as many as you like.
  • #Cycles to run & manage Field Trial Testing. As above, typically the field trial testing happens in parallel, and should not be on the critical path. The business might decide to let these test phases run for much longer though, thus ultimately impacting your launch date.
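
Here is the promised sketch of playing with these variables. It is a minimal Python illustration under my own assumptions (the cycle counts per phase are invented for the example, and Field Trials are treated as parallel activities off the critical path); it is not the actual planning tool:

  # Sketch of "what-if" scenarios across the climb to launch.
  # Cycle counts per phase are invented examples; Field Trials run in parallel
  # and are deliberately excluded from the critical path.
  SI_CYCLE_DAYS = 10   # one SI build increment, in working days

  scenarios = {
      "best":      {"stabilize": 2, "fc": 3, "rc_campaigns": [2, 2, 1]},
      "realistic": {"stabilize": 3, "fc": 4, "rc_campaigns": [3, 2, 2]},
      "worst":     {"stabilize": 4, "fc": 6, "rc_campaigns": [4, 3, 3]},
  }

  def climb_to_launch(scenario):
      # Total SI cycles: stabilize on hardware, reach FC, then each RC campaign.
      cycles = scenario["stabilize"] + scenario["fc"] + sum(scenario["rc_campaigns"])
      return cycles * SI_CYCLE_DAYS

  for name, scenario in scenarios.items():
      days = climb_to_launch(scenario)
      print(f"{name:>9}: {days} working days (~{days / 5:.0f} weeks)")
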
If you want more detail, please refer to the accompanying presentation.

You're doing it wrong - with Agile there is no need for a separate Stabilization Phase!
There are a lot of people jumping on the Agile/Scrum bandwagon, advocating its use in every software or systems delivery project. Whilst I'm quite a firm supporter of Agile/Scrum, I have to admit that adopting Agile/Scrum in keeping with its true essence is next to impossible in the real world, especially when typical Digital TV Systems Projects involve players from a multitude of vendors, with the PayTV Operator being the end-customer and the real Product Owner. Bear in mind also that these independent software vendors have their own products to support and multiple customers to deliver to. Implementing a truly agile STB project requires a significant investment on the part of the PayTV operator, and exceptional buy-in from the vendors; so realistically it is practically undoable unless the PayTV operator either owns the project in-house, or has a Project Charter or Contract in place that clearly stipulates that leadership, drive and overall planning are under the ownership of the PayTV operator...more on this debate later...

What happens in practice, though, is that software vendors & systems integrators work in relative isolation, and are free to adopt whatever approach is required. The commonly accepted best practice for STB Software Projects, if managed properly, is a short-lived Integration, Stabilization & Bug-Fixing phase, solely focused on stabilizing the product leading up to launch. This is not to say that the Stabilization phase only appears towards the end of the project because component vendors are unable to release stable components earlier: not at all. In reality, a STB project depends on a number of pieces of the puzzle fitting together, and often the first time this really happens is in the last mile of Component Integration (a.k.a. Systems Integration & Stabilization).

In complex projects that impact the end-to-end DTV value chain, where the Headend & full STB stack is brand new or fundamentally changed, it is really difficult to avoid a separate stabilization phase. Components are expected to be stable & performant as part of their independent component testing & release processes, but the system only really begins to be stressed when all the components come together.

The typical best-practice flow of Systems Integration Management & Delivery is shown below:
Industry Best Practice of STB SI (Courtesy/Permission granted by S3)
The above model follows a structured approach to a STB delivery project: ideally SI (System Integration) kicks off from stable foundations, then becomes a series of cycles executed until the system is fit for further Field Trials / Certification. Essentially these SI cycles are the Integration & Stabilization phase of the project. What also happens in reality is that Certifications/Field Trials kick off at a suitable SI cycle, in parallel rather than sequentially - i.e. not Waterfall.

As much as I'd like to see a STB project implemented from start to finish using Agile/Scrum, I've been involved in enough DTV Systems Projects to know it is extremely difficult and challenging, incurs a huge management & administration overhead, and is costly & requires mammoth coordination - but it is not impossible! I've seen some projects come close to doing end-to-end development & integration incrementally, adopting as many Agile principles as possible, without really wearing the badge of pukka-Agile (see earlier posts).

In the real world, when benchmarking and signing off a system that is going to generate millions of dollars in revenue, one cannot avoid controlling acceptance of the product through a fairly strict process for stabilization, testing & final acceptance.

Conclusion
I have shared a simple model for managing & planning STB software release schedules that a STB Delivery Project Manager can use. This is not a theoretical model based on zero real-world experience or case studies. I have seen this model used in the past, and although it has not been shared publicly, I've seen many good PMs use this technique naturally. I am using this model to manage some of the projects I'm currently running. I believe this is probably the first time somebody has ventured to share the model, and also the first time a free tool has been offered that can assist with planning & modeling release scenarios. I have worked with, and learnt from, brilliant managers who apply this planning model instinctively without depending on tools; they have excellent hunches born of delivering many real-world projects - I'm grateful to have worked with such giants (BK, DD, NT, ST, JC, MK)...

Some hardcore engineers might retort with the classic "How long is a piece of string?" argument. Sorry, but we poor Project Managers need some way of measuring & predicting the output of a project, and therefore must rely not only on analytics (like Defect Trends/Prediction) but also on tools based on insight, intuition, wisdom and lessons learnt from past projects. This tool is born of that experience - I am confident in its use & the value it could bring to your project. The recurring, iterative cycles of the model make it easier to answer the length-of-the-string problem - we have to provide an expectation based on some reasoning, disclosing all the risks & uncertainties upfront of course...

If you found this info useful or would like to learn more, or bounce ideas, or even share your own experiences from your own PM toolbox, do drop me a line or two! :-)

Download the Free Planning Excel Tool!
If you've read this far, that's great! In my next post I'll share a powerful planning tool that is based on this release campaign model. You can use this tool to model Best Case, Most Likely (Realistic), Worst Case and the resulting 3-Point Estimate - to help you not only plan, but also visualize the Climb to Launch, with many options to tweak your planning. Stay tuned...
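
For the curious, a common way to combine the three cases into a single 3-Point Estimate is the PERT-style weighting sketched below; whether the tool uses exactly this weighting is for the next post, so treat this purely as an illustration (the example figures are my own):

  # Common PERT-style 3-point estimate; an illustration only, the tool's
  # exact weighting may differ.
  def three_point_estimate(best, likely, worst):
      return (best + 4 * likely + worst) / 6

  print(three_point_estimate(20, 28, 40))  # e.g. weeks to launch -> ~28.7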
