Wednesday 22 May 2013

System Integration is King - Release Manager, Build Meister-Gatekeeper

I spent a lot of energy last week convincing (successfully, in the end) some senior managers of processes that, in my humble opinion, are tried-and-tested, well-established norms in the management of software: the art of Software Release Management - taking control of component releases, especially with respect to approving or rejecting bug fixes. It has been good practice in establishing myself as a consultant-expert on all things related to Set-Top-Box Software Development & Integration. I spend a lot of my time promoting best practices and winning people over, something I have to do to maintain the focus and direction of the overall programme plan...

The point I want to make is that, in a STB delivery project, the System Integration (SI) team is really the Gatekeeper, with full authority and accountability for producing a stable, functional release. This means SI have every right to manage incoming component releases, especially when it comes to reviewing & approving bug fixes. If SI are not happy with a component release - say the component team fixed more bugs than were approved, fixed far fewer bugs than were requested, or made any other changes that weren't specifically requested by the SI team - then SI can quite reasonably reject that component's release.

I have written in the past on a number of related topics. All of these areas become the focus of a STB project as we get closer and closer to launch mode. Recapping the template that most STB projects follow:
  • Develop / Integrate Features >>> Functionally Complete >>> Closed User Group >>> Wider Field Trials >>> Launch Candidate 1 >>> ... >>> Launch Candidate X >>> Clean Run >>> Deployment Build >>> Launch
As the system matures towards each of these milestones, the number of open defects is expected to drop. Obviously no product launches with zero defects! All a project can do is ensure, as best it can, that almost all high-impact, high-severity defects (a.k.a. Showstoppers or P0s) are dealt with, and be prepared for the inevitable influx of new Showstoppers from in-field, real-world usage of the product - that is the reality, and it is unavoidable. Very rarely do products launch having met all the original quality engineering requirements; after all, this is an entertainment device, not a life-critical one - so we can relax a little on quality objectives! :-)

So how should you control releases leading up to launch? It's simple really - not complicated, and certainly not rocket science...
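To make the gatekeeping rule concrete, here's a minimal sketch in Python of the acceptance check described above. All the names (ComponentRelease, gate_release, the bug IDs) are hypothetical, and a real Release Manager would wire this into the defect tracker and build system; but the rule itself is simple: a component drop is rejected if it delivers unapproved fixes, misses approved fixes, or carries changes outside the agreed scope.

```python
# Hypothetical sketch of an SI release gate - all names are illustrative.
from dataclasses import dataclass, field

@dataclass
class ComponentRelease:
    component: str
    version: str
    fixed_bugs: set[str]          # bug IDs the component team claims to have fixed
    other_changes: list[str] = field(default_factory=list)  # changes not tied to an approved bug

def gate_release(release: ComponentRelease, approved_fixes: set[str]) -> tuple[bool, list[str]]:
    """Return (accepted, reasons). SI rejects scope creep and missed fixes alike."""
    reasons = []
    unapproved = release.fixed_bugs - approved_fixes
    missing = approved_fixes - release.fixed_bugs
    if unapproved:
        reasons.append(f"Unapproved fixes delivered: {sorted(unapproved)}")
    if missing:
        reasons.append(f"Approved fixes missing: {sorted(missing)}")
    if release.other_changes:
        reasons.append(f"Changes outside approved scope: {release.other_changes}")
    return (not reasons, reasons)

# Example: SI approved exactly two fixes for this drop; the team delivered three.
accepted, reasons = gate_release(
    ComponentRelease("epg-ui", "2.4.1", fixed_bugs={"BUG-101", "BUG-205", "BUG-333"}),
    approved_fixes={"BUG-101", "BUG-205"},
)
print("ACCEPT" if accepted else "REJECT", reasons)  # REJECT: BUG-333 was never approved
```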

Sunday 12 May 2013

So you think writing a Set-Top-Box Middleware is easy? Think again!

So you came across this technology called Digital TV, stumbled across the various (open) standards (MPEG, DVB, MHP, etc.) and thought it would be cool to write your own software stack? Or you've been writing Set-Top-Box software for other customers (namely PayTV operators) and decided you have enough knowledge to venture out on your own and sell your own stack? Or you're a PayTV operator, so sick and tired of being locked into expensive and time-consuming Middleware vendors that you've decided it's time to write your own software stack, take control of your destiny and deliver cool products to market faster, cheaper and on time?
Or you figured there's quick and easy money to be made, since just about every country in the world is going to switch off analogue TV and go digital, and you can provide a very cheap and cost-effective software stack?

If you've answered yes to at least one of these questions, then you should be asking yourself: Just how serious are you about Middleware? Just how serious are you about the EPG User Experience?

Just how serious are you about competing in this marketplace? Is there any room for you to compete in this arena? Where do you see your software stack in the marketplace: low-tier, middle-tier or top-tier? Who are your competitors? Are you big enough, flexible enough, savvy enough, innovative enough or unique enough to displace your competitors? Or do you even wish to displace your competitors? What relationships do you have with the big technology players? What compelling value proposition are you offering that your competitors aren't? Do you stick to one large customer that can finance your operation? Do you seek out other customers to monetize your product's offering? Do you keep a common product that can serve multiple customers at relatively low cost? Do you feel the market is big enough to support many divergent offerings? Are you a small fish in a small pond, a big fish in a small pond, or are you swimming with the sharks and just don't know it yet? Do you rely on hope that everything goes well, or do you have mountains of cash, so you don't notice how fast you're burning through it? Have you been sold vaporware, demoware, software that's far from fit-for-purpose for real-world mass-production use? Etc, etc, etc...

I have met developers, architects, software managers and product owners who think that writing a STB Middleware stack is easy, a piece of cake - all you need is four really good developers... to which I smile and think to myself, oh how naive they are! Yes, for sure, the software architecture can make sense on paper; heck, you can even hack together a proof-of-concept in about three months or less that proves a simple Zapper or PVR stack. But that's just it: it's a hack, a Proof-of-Concept (POC) - nothing close to getting a real-world product into the marketplace.
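To illustrate just how deceptively thin that happy path is, here's a conceptual sketch (Python purely for illustration; every call on the tuner, demux and decoder objects is a made-up placeholder, not any real driver API):

```python
# Conceptual happy-path zapper - NOT a real middleware. Every call on the
# tuner/demux/decoder objects below is a made-up placeholder, shown only to
# make the point that the happy path is short.
def zap(service, tuner, demux, decoder):
    tuner.lock(service.frequency)            # assumes the signal is always there
    pat = demux.read_table(pid=0x0000)       # PAT: find this service's PMT PID
    pmt = demux.read_table(pid=pat.pmt_pid(service.id))
    demux.select(video=pmt.video_pid, audio=pmt.audio_pid)
    decoder.start()
    # Everything the POC conveniently ignores is the actual product:
    # signal loss and recovery, CA/DRM, SI table changes, broadcaster
    # stream quirks, A/V sync edge cases, memory budgets, standby,
    # software upgrade in the field, factory reset, and so on.
```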

My rambling in this post covers the following:

Monday 29 April 2013

Pragmatic Set-Top-Box QA Testing - Don't just trust Requirements-to-Test Coverage scripts

This post might be a little edgy. I think it's important to understand alternative perspectives around Set-Top-Box (STB) testing: how it differs from other forms of IT testing, and how its dynamics promote agility and flexibility, whilst at the same time appreciating the risks that come with pragmatic testing.

The Headline:
DON'T BLINDLY TRUST REQUIREMENTS-TO-TEST COVERAGE MAPS

Last year, I wrote quite an in-depth paper on Effective Defect & Quality Management in typical DTV projects. It covered many topics, and touched briefly on Project Metrics reporting. This post expands on the subject of QA Metrics tracking, focusing on how this reporting can help change the direction of the project and instigate changes in focus to the overall QA effort, particularly around Set-Top-Box QA testing. I advocate that, as stability is achieved against the Requirement-to-Test coverage maps, the project should shift its QA focus to include more Exploratory, Risk-based & User-based testing.
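As a rough sketch of the signal I have in mind (the data model is entirely hypothetical), track coverage per test cycle and flag when the Requirement-to-Test map has stabilised - that's the cue to redirect effort into exploratory and risk-based passes:

```python
# Hypothetical sketch: when has Requirement-to-Test coverage stabilised
# enough to justify shifting effort into exploratory/risk-based testing?

def coverage(req_to_tests: dict[str, list[str]], passed: set[str]) -> float:
    """Fraction of requirements with at least one passing mapped test."""
    if not req_to_tests:
        return 0.0
    covered = sum(1 for tests in req_to_tests.values()
                  if any(t in passed for t in tests))
    return covered / len(req_to_tests)

def should_pivot(history: list[float], threshold: float = 0.95, window: int = 3) -> bool:
    """Pivot once coverage has held above `threshold` for `window` cycles."""
    return len(history) >= window and all(c >= threshold for c in history[-window:])

reqs = {"REQ-1": ["TC-1", "TC-2"], "REQ-2": ["TC-3"], "REQ-3": []}
print(coverage(reqs, passed={"TC-1", "TC-3"}))       # 0.666... -> 2 of 3 requirements covered
print(should_pivot([0.78, 0.91, 0.96, 0.97, 0.97]))  # True -> time to pivot
```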

Saturday 27 April 2013

Worlds of QA Testing in Digital TV Systems Projects

I have previously written about the Digital TV ecosystem with its complexities and challenges in defining the Architecture & Systems Integration spaces. In this post I will share some thoughts around the QA (Quality Assurance) / Testing problem space, as well as generally accepted strategies for solving these problems.

What really triggered this post (which is centred around E2E QA) was a recent conversation with a QA Manager, who commented in passing that he's facing a challenge with the "End-to-End QA" (E2E QA) team: it takes up to six weeks to complete a full QA cycle. Now some might think this is expected - E2E QA is the last mile of QA and should take as long as it takes. My response is that it depends...

It depends on which phase your project is in: is it still in early development / integration testing, where components are being put together for the first time to test a feature across the value chain? Or is the project well into its iterations, having already executed much of the E2E testing? It also depends on how much testing the systems and sub-systems have been through before the build reaches E2E Testing. Has a version of the system already been promoted to live? If so, what are the deltas between the live, deployed system and the current system under test?
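One pragmatic way to shorten that six-week cycle is to scope the E2E run by those deltas. A hedged sketch in Python - the suite-to-component mapping below is entirely illustrative - of selecting only the suites that touch components changed since the live baseline:

```python
# Hypothetical sketch: shrink the E2E cycle by running only the suites that
# exercise components changed since the last live (deployed) baseline.

SUITE_COVERS = {  # illustrative mapping of E2E suites to the components they touch
    "live-tv":  {"tuner", "epg-ui", "middleware"},
    "pvr":      {"disk-manager", "middleware", "epg-ui"},
    "vod":      {"headend-cdn", "player", "drm"},
    "purchase": {"headend-billing", "epg-ui", "drm"},
}

def suites_for_delta(changed: set[str]) -> list[str]:
    """Suites whose covered components intersect the changed set."""
    return [name for name, comps in SUITE_COVERS.items() if comps & changed]

# Example: only the DRM client and the player changed since the live baseline.
print(suites_for_delta({"drm", "player"}))  # ['vod', 'purchase']
```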

[--- Side note:
This post is one of many to follow that will encompass some of the following topics:
  • Worlds of testing / QA
  • Should ATP / E2E testing be the longest of the system test cycles - or should it be the shortest?
  • High level block diagrams of architecture / interfaces / system integration
  • Visualisation of the technical skills required across the various areas of the test environment: a heat map of technical depth
    • E2E QA is highly technical and should be considered as part of Integration, not a separate QA function
  • Which came first: the chicken or the egg? Headend first, then STB, or do both together - big bang?
    • How to solve complex systems integration/test in DTV?
  • What is end-to-end testing, what areas should you focus on?
  • What is required from a Systems ATP?
  • What kind of QA team structure is generally accepted practice? How does Agile fit in with this?
  • Can E2E Testing be executed using Agile - iterate on features End-to-End?
Side note --- end ---]

Thursday 21 March 2013

Digital TV Summit Day Three...

The summit ended today with a market analysis of the opportunities for PayTV in Africa, covering most aspects of the end-to-end value chain, from content procurement to delivery and consumption. I found today's session especially interesting and refreshing - a great complement to the book I am currently reading, The Business of Media Distribution: Monetizing Film, TV and Video Content in an Online World - providing an overview of what the business of PayTV actually entails. We learnt from the past experiences of people who have launched PayTV operations in several countries, had an overview of a core component of the system (the Content Management System), touched on Next-Generation platforms, and ended with a demo of a low-cost (sub-$100) yet advanced Set-Top-Box, exploring the opportunities that await the African market.

My notes cover the following:

Tuesday 19 March 2013

Digital TV Summit Day Two...

Here's my write-up from Day 2 of the SA Digital TV Summit held in Bryanston, Johannesburg. The second day's agenda went much deeper than the first day's, although the SABC was a no-show for its session expanding on the topic of Digital Migration. Nevertheless, I learnt quite a bit today, and refreshed some other topics I don't get to focus on in my day-to-day work. We covered the following areas: