Wednesday, 12 June 2013

PayTV Interactive Applications are dying slowly, but surely

Interactive PayTV is dying slowly, or was it dead on arrival? Imagine these scenarios:
  • OK TV, Have any of my favourite shows been recorded lately?
  • OK TV, Find me something nice to watch - I am feeling good
  • OK TV, I've had a crap day at work today, put something on that'll make me unwind and relax
  • OK TV, What are my friends watching now?
  • OK TV, What show is trending on the social scene?
  • OK TV, What's selling hot on Box Office today? Show me the trailer, skip the ads please.
  • OK TV, Let me see what's happening on my hangout?
  • OK TV, Give me the top five recommendations today
  • OK TV, What's the latest news headlines?
  • OK TV, What's happening with Syria today? Connect me to any Live News Stream?
  • OK TV, What's making waves on YouTube recently?
  • OK TV, Tell me the latest stats on the Formula 1 (Cricket, Football) scene
  • OK TV, What's the weather like at my mom's place?
  • OK TV, Any weather warnings I should worry about?
  • OK TV, Are there any live sports events happening in my area? Which is the closest?
  • OK TV, I'm in the mood for Star Trek - make it so!
  • etc
Get the picture? This is what interactivity is supposed to be all about! TV is a passive medium: lean back, relax & watch TV. I don't want to read reams and reams of text on the screen! I've had a long day behind my PC, coding all day or working with spreadsheets; the last thing I want is to navigate through some tiresome and clunky menu tree to find the information I want, and then be forced to get up from my couch because the font is so small that I have to stand within three feet of the screen just to read it! More work, more effort for me - I don't want to be interacting with the remote control to do stuff - I want to interact with my TV, not control my TV!
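As a purely illustrative aside, here is a minimal sketch of how such spoken requests might be mapped to platform intents. The keyword table, intent names and fallback are all hypothetical - just enough to show that the heavy lifting sits in speech recognition and content/metadata services, not in the TV application's menu tree.

```python
# Minimal sketch of mapping spoken "OK TV" requests to coarse intents.
# Purely illustrative: the keywords, intent names and fallback are hypothetical,
# not any real operator's or vendor's API.

INTENT_KEYWORDS = {
    "recorded":   "LIST_RECORDINGS",   # "Have any of my favourite shows been recorded?"
    "recommend":  "RECOMMENDATIONS",   # "Give me the top five recommendations today"
    "trending":   "SOCIAL_TRENDING",   # "What show is trending on the social scene?"
    "weather":    "WEATHER",           # "What's the weather like at my mom's place?"
    "news":       "NEWS_HEADLINES",    # "What's the latest news headlines?"
    "box office": "BOX_OFFICE",        # "What's selling hot on Box Office today?"
}

def resolve_intent(utterance: str) -> str:
    """Map a free-form voice request to a coarse intent the TV platform can act on."""
    text = utterance.lower()
    for keyword, intent in INTENT_KEYWORDS.items():
        if keyword in text:
            return intent
    return "FULL_TEXT_SEARCH"   # fall back to a general content search

if __name__ == "__main__":
    print(resolve_intent("OK TV, have any of my favourite shows been recorded lately?"))
    print(resolve_intent("OK TV, I'm in the mood for Star Trek - make it so!"))
```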

Yet most PayTV operators have continued to support the original concept of Interactive TV: the famous Red Button, then the infamous loading time for the application, then interaction with an eighties-style user interface. Whilst there are ways to work around and improve the classic "Please wait...application is loading..." delay, the original concepts have not changed that much. There's a lot of chatter around Interactive TV 3.0, second screen and so on - the next wave of interactivity, the idea that people would rather interact with a second-screen device (smartphone, tablet, etc.) - but that is not what this post is about.

Mostly, this post is part rant and part explanation of the current predicament PayTV Operators find themselves in. I already have a solution in my head that I think is the way forward, but it is probably disruptive enough to send even the most open, objective PayTV operator running away at lightning speed! Just take a look at this absolutely cool demo shown recently at Google I/O:


This is the future! I think Google is on the right track. Years ago, back in 2005/2006, I was pushing my company to explore Voice & Text-To-Speech integration into Set-Top-Boxes. When we looked at Recommendations back in 2004/2005, we considered voice - but hats off to Google, they've done it with Search, and I'm sure the next wave is enhancing their Interactive TV offering.

My elevator pitch: Times are changing! PayTV Operators must stick to their core business: content provision, distribution, content protection & core information services. Partner with third parties to bring true interactivity to the user experience & platform. Focus more on Open Systems rather than trying to control the whole ecosystem. Create strategic partnerships with Google (or others), building a bridge between the closed PayTV world & the open Social Apps world...

This post is organised as follows:
  • Brief story about Interactive TV and Typical Architecture Model
  • Application Stores Overview - What PayTV Operators need to consider
  • My view of a possible Future Platform / Architecture


Wednesday, 22 May 2013

System Integration is King - Release Manager, Build Meister-Gatekeeper

I spent a lot of energy in the last week trying to convince (and succeeding in convincing) some senior managers about processes that, in my humble opinion, are tried-and-tested, well-established norms in the management of software: the art of Software Release Management - taking control of component releases, especially w.r.t. approving or rejecting bug fixes. It has also been good practice in establishing myself as a consultant and expert on all things related to Set-Top-Box Software Development & Integration. I spend a lot of my time promoting best practices and trying to win & influence people over, something I have to do to maintain the focus and direction of the overall programme plan...

The point I want to make is that, in an STB delivery project, the System Integration (SI) team is really the Gatekeeper, with full authority and accountability for producing a stable, functional release. This means that SI has every right to manage incoming component releases, especially when it comes to reviewing & approving bug fixes. If SI is not happy with a component release - for example, the component team fixed more bugs than were approved, fixed far fewer bugs than were requested, or made other changes that weren't specifically requested by the SI team - then SI can quite easily reject that component release.
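To make the gatekeeping concrete, here is a minimal sketch of the kind of check an SI team could automate: compare the bug fixes claimed in a component drop against the list SI actually approved for that release. The bug IDs and data shapes are invented for illustration; in practice this would be driven from the project's defect tracker and release notes.

```python
# Minimal sketch of an SI release gate: accept a component drop only if the
# bug fixes it contains are exactly the ones SI approved for this release.
# Bug IDs and data structures are illustrative, not from a real tracker.

def review_component_release(approved_fixes: set, delivered_fixes: set):
    unapproved = delivered_fixes - approved_fixes   # fixed more than was asked for
    missing    = approved_fixes - delivered_fixes   # fixed less than was asked for
    if unapproved or missing:
        return False, f"REJECT: unapproved fixes {sorted(unapproved)}, missing fixes {sorted(missing)}"
    return True, "ACCEPT: release matches the approved fix list"

# Example: SI approved two fixes; the component team delivered one of them plus an extra.
ok, verdict = review_component_release({"BUG-101", "BUG-202"}, {"BUG-101", "BUG-999"})
print(verdict)   # REJECT: unapproved fixes ['BUG-999'], missing fixes ['BUG-202']
```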

I have written in the past on the following topics:
All of these areas become the focus of an STB project as we get closer and closer to launch mode. Recapping the template that most STB projects follow:
  • Develop / Integrate Features >>> Functionally Complete >>> Closed User Group >>> Wider Field Trials >>> Launch Candidate 1 >>> ... >>> Launch Candidate X >>> Clean Run >>> Deployment Build >>> Launch
As the system matures towards each of these milestones, the number of defects is expected to get lower and lower. Obviously no product gets launched with zero defects! All a project can do is ensure, as best it can, that almost all high-impact, high-severity defects (a.k.a. Showstoppers or P0s) are dealt with, and be prepared for the inevitable influx of new Showstoppers from in-field, real-world usage of the product - that is the reality, and it is unavoidable. Very rarely do products launch having met all the original quality engineering requirements; after all, this is an entertainment device, not a life-critical device - so we can relax a little on quality objectives! :-)
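As a rough illustration of how that tightening quality bar could be expressed, here is a hedged sketch of a launch-candidate gate: the closer the milestone is to launch, the fewer open showstoppers the build is allowed to carry. The thresholds are invented for this sketch; every project negotiates its own exit criteria.

```python
# Sketch of a simple milestone gate: as the project approaches launch, the
# number of open showstoppers (P0s) tolerated in the build shrinks.
# Thresholds below are invented for illustration only.

MAX_OPEN_P0S = {
    "Functionally Complete": 50,
    "Closed User Group":     20,
    "Wider Field Trials":    10,
    "Launch Candidate":       3,
    "Clean Run":              0,
    "Deployment Build":       0,
}

def gate(milestone: str, open_p0s: int) -> str:
    limit = MAX_OPEN_P0S.get(milestone)
    if limit is None:
        return f"Unknown milestone: {milestone}"
    return "PASS" if open_p0s <= limit else f"HOLD: {open_p0s} open P0s exceeds limit of {limit}"

print(gate("Launch Candidate", 5))   # HOLD: 5 open P0s exceeds limit of 3
```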

So how should you control releases leading up to launch? It's simple really - not complicated, and certainly not rocket science...

Sunday, 12 May 2013

So you think writing a Set-Top-Box Middleware is easy? Think again!


So you came across this technology called Digital TV, stumbled across the various (open) standards (MPEG, DVB, MHP, etc.) and thought it would be cool to write your own software stack? Or you've been writing Set-Top-Box software for other customers (namely PayTV operators) and decided you have enough knowledge to venture out on your own and sell your own software stack? Or you're a PayTV operator, so sick and tired of being locked into expensive and time-consuming Middleware vendors that you decide it is time to write your own software stack, be in control of your own destiny and deliver cool products to market faster, cheaper and on time? Or you figured there is a quick and easy way of making some money, since just about every country in the world is going to switch off analog TV and turn to digital, and you can provide a very cheap and cost-effective software stack?

If you've answered yes to at least one of these questions, then you should be asking yourself: Just how serious are you about Middleware?  Just how serious are you about the EPG User Experience?

Just how serious are you about competing in this marketplace? Is there any room for you to compete in this arena? Where do you see your software stack in the marketplace: low-tier, middle-tier or top-tier? Who are your competitors? Are you big enough, flexible enough, savvy enough, innovative enough or unique enough to displace your competitors? Or do you even wish to displace your competitors? What relationships do you have with the big technology players? What is the compelling value proposition you're offering that your competitors aren't? Do you stick to one large customer that can finance your operation? Do you seek out other customers to monetize your product's offering? Do you keep a common product that can service multiple customers at relatively low cost? Do you feel the market is big enough to support many divergent offerings? Are you a small fish in a small pond, a big fish in a small pond, or are you swimming with the sharks and just don't know it yet? Do you rely on hope that everything goes well, or do you have mountains of cash, so you don't notice how fast you're burning through it? Have you been sold vaporware, demoware, software that's far from fit-for-purpose for real-world mass-production use? Etc, etc, etc...

I have met developers, architects, software managers and product owners who think that writing an STB Middleware stack is easy, a piece of cake - all you need is just four really good developers...to which I smile and think to myself: oh, how naive they are! Yes, for sure, the software architecture can make sense on paper; heck, you can even hack a proof-of-concept together in about three months or less that proves a simple Zapper or PVR stack. But that's just it - it's a hack, a Proof-of-Concept (POC), nothing close to getting a real-world product into the marketplace.
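To show just how deceptive a POC can be, here is a toy "zapper" sketch - hypothetical class and method names, nothing from any real middleware stack. Something this small can demo a channel change on a dev board, which is exactly why it tempts people into underestimating everything a shippable product still needs: conditional access, signal loss and error recovery, standby handling, resource management, regional variations, field diagnostics and more.

```python
# Toy "zapper" proof-of-concept: tune to a service and present it. Class and
# method names are hypothetical; a real middleware must also handle conditional
# access, signal loss, standby, resource management, field diagnostics and much
# more - none of which appears here.

class TunerStub:
    def tune(self, frequency_hz: int, service_id: int) -> None:
        print(f"Tuned to {frequency_hz / 1e6:.1f} MHz, service {service_id}")

class DecoderStub:
    def present(self, service_id: int) -> None:
        print(f"Presenting audio/video for service {service_id}")

def zap(tuner: TunerStub, decoder: DecoderStub, channel: dict) -> None:
    """Bare-bones channel change: tune, then decode. No error paths at all."""
    tuner.tune(channel["frequency_hz"], channel["service_id"])
    decoder.present(channel["service_id"])

zap(TunerStub(), DecoderStub(), {"frequency_hz": 602_000_000, "service_id": 1001})
```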

My rambling in this post covers the following:


Monday, 29 April 2013

Pragmatic Set-Top-Box QA Testing - Don't just trust Requirements-to-Test Coverage scripts


This post might be a little edgy - I think it's important to understand alternative perspectives around Set-Top-Box (STB) testing, how it is uniquely different from other forms of IT testing; to understand the dynamics of STB testing that promote agility and flexibility, whilst simultaneously understanding the risks associated with pragmatic testing.

The Headline:
DON'T BLINDLY TRUST REQUIREMENTS-TO-TEST COVERAGE MAPS

Last year, I wrote quite an in-depth paper on Effective Defect & Quality Management in typical DTV projects. It covered many topics, and touched briefly on the aspect of Project Metrics reporting. This post expands on the subject of QA Metrics tracking, focusing on how this reporting can help change the direction of the project and instigate changes in focus to the overall QA effort, particularly around Set-Top-Box QA testing. I advocate that a project should shift its QA focus, once stability is achieved with Requirement-to-Test coverage maps, to include more Exploratory, Risk-based & User-based testing.
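As a rough illustration of that shift (the thresholds and percentages below are invented for this sketch, not taken from the paper mentioned above), a project could rebalance its test effort once requirement-to-test coverage has stabilised over a few consecutive cycles:

```python
# Illustrative only: once requirement-to-test coverage has stabilised across a
# few consecutive cycles, rebalance effort towards exploratory, risk-based and
# user-based testing. Thresholds and percentages are invented for this sketch.

def plan_test_mix(coverage_history: list) -> dict:
    recent = coverage_history[-3:]
    stable = len(recent) == 3 and min(recent) >= 0.95
    if stable:
        return {"scripted": 0.40, "exploratory": 0.30, "risk_based": 0.20, "user_based": 0.10}
    return {"scripted": 0.80, "exploratory": 0.10, "risk_based": 0.05, "user_based": 0.05}

print(plan_test_mix([0.90, 0.96, 0.97, 0.98]))   # coverage stable -> shift the mix
```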

Saturday, 27 April 2013

Worlds of QA Testing in Digital TV Systems Projects

I have previously written about the Digital TV ecosystem with its complexities and challenges in defining the Architecture & Systems Integration spaces. In this post I will share some thoughts around the QA (Quality Assurance) / Testing problem space, as well as generally accepted strategies for solving these problems.

What really triggered this post (which is centred around E2E QA) was a recent conversation with a QA Manager who in passing commented that he's facing a challenge with the "End-to-End QA" (E2E QA) team, in that it takes up to six weeks to complete a full QA cycle. Now some might think this is expected, as E2E QA is the last mile of QA and should take as long as it takes. My response to this is that it depends...

It depends on which phase your project is in: is it still in early development / integration testing, where components are being put together for the first time to test out a feature across the value chain? Or is the project well ahead in its iterations, having already executed much of the E2E testing? It also depends on the scope and layers of testing applied to the systems and sub-systems before the build reaches E2E Testing. Has a version of the system already been promoted to live? If so, what are the deltas between the already live, deployed system and the current system under test?
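One hedged illustration of that "it depends": where a previous system version is already live, the E2E cycle can be scoped from the delta between the deployed component versions and those under test, rather than re-running the full six-week cycle every time. The component names, versions and suite mappings below are hypothetical.

```python
# Hypothetical sketch: derive an E2E regression scope from the delta between the
# live system and the system under test. Component names, versions and suite
# mappings are invented for illustration.

LIVE       = {"headend": "4.2", "stb_middleware": "7.1", "epg_app": "3.3", "billing": "2.0"}
UNDER_TEST = {"headend": "4.2", "stb_middleware": "7.3", "epg_app": "3.4", "billing": "2.0"}

SUITES = {
    "stb_middleware": ["pvr_record_playback", "channel_change", "cas_entitlements"],
    "epg_app":        ["guide_navigation", "booking_flows"],
    "headend":        ["carousel_generation", "si_tables"],
    "billing":        ["purchase_e2e"],
}

changed = [c for c in UNDER_TEST if UNDER_TEST[c] != LIVE.get(c)]
scope = sorted({suite for c in changed for suite in SUITES.get(c, [])})
print("Changed components:", changed)
print("E2E suites in scope:", scope)
```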

[--- Side note:
This post is one of many to follow that will encompass some of the following topics:
  • Worlds of testing / QA
  • Is ATP / E2E testing a longer cycle than the other system test cycles - or should it be the shortest?
  • High level block diagrams of architecture / interfaces / system integration
  • Visualisation of the technical skills required for various aspects of testing across different areas of the environment: a technical-skills heat map
    • E2E QA is highly technical and should be considered as part of Integration, not a separate QA function
  • Which came first: the chicken or the egg? Headend first, then STB, or do both together - big bang?
    • How to solve complex systems integration/test in DTV?
  • What is end-to-end testing, what areas should you focus on?
  • What is required from a Systems ATP?
  • What kind of QA team structure is generally accepted practice? How does Agile fit in with this?
  • Can E2E Testing be executed using Agile - iterate on features End-to-End?
Side note --- end ---]