Sunday, 2 March 2014

Applying Google's Test Analytics ACC model to Generic Set Top Box Product

Early this year I completed what I felt was the best software book I've read in a really long time: How Google Tests Software by James A. Whittaker, Jason Arbon and Jeff Carollo (2012) (HGTS).

Although a couple of years old now, and even though the main driver, James Whittaker, has left Google and gone back to Microsoft, this book is a jewel that belongs in the annals of great Software Testing books.

The book is not filled with theory and academic nonsense - it speaks directly to practitioners of all types and levels involved in Software Engineering. It gives a straightforward, realistic breakdown of how things go down in software product teams: the challenges of fitting in Test/QA streams (actually there's no such thing as fitting in QA/Test, it should always be there, but the reality is that developers don't always get it), balancing the needs of the business in terms of delivering on time against meeting the needs of the customer (will real customers use the product, is it valuable?), and so on.

Please get your hands on this book - it will open your eyes to real-world, massively scaling software systems & teams. Honestly, it felt quite refreshing to learn Google's mindset is similar to mine, as I'd been promoting similar testing styles for much of my management career, being a firm advocate that developers should do as much testing upfront as possible, with Test/QA supplementing development by providing test tools & frameworks, as well as employing a QA Program Manager (known as a Test Engineer in Google) to oversee and co-ordinate the overall test strategy for a product. I have written about QA/Testing topics in the past, and the ideas there don't stray too far from the core message contained in HGTS.

The book shares a wealth of insider information on the challenges Google faced with its product development & management as the company scaled - explosive growth in both the use of its products and the number of people employed, working across continents in large development centres around the world. It is filled with interviews with influential Googlers that give some insight into the challenges and the solutions adopted. You will also find information on the internal organisational structures that Google implements in its product development teams, along with some useful references on Open Source tools that have been born out of Google's own internal Testing Renaissance, now shared with the rest of the world for free - thanks, Google, for sharing!

One such tool is Google Test Analytics, which implements the ACC (Attribute, Component, Capability) analysis model - and that is the topic I want to focus on. In the Set-Top-Box projects that I run, I look to the QA Manager/Tech Leads to know the product inside-out and to develop an overall Test Strategy that outlines the ways and methods of testing that will achieve our Go-To-Market in a realistic & efficient manner (and not adopt process for process' sake). I generally look toward usability and risk-based test coverage, seeking out a heat map that shows me, in broad strokes, the high-level breakdown of the product's feature set (i.e. what is it about the product that makes it so compelling to sell), what the product is made of (i.e. the building blocks of the product), and what the product does (i.e. the capabilities or core feature properties). We generally do this in a Microsoft Excel spreadsheet, manually importing test data from clunky test repositories. What I look out for, as a classic Program Manager, is the RAG (Red, Amber, Green) status that gives me the fifty-thousand-foot view: what the overall quality of this product is, and how far away we are from having a healthy launch-candidate system.
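To make that RAG roll-up concrete, here is a minimal Python sketch of the kind of calculation hiding behind those Excel heat maps. The thresholds and the sample data are my own illustrative assumptions, not values from the book or from our production tooling:

```python
# Minimal sketch of a RAG (Red/Amber/Green) roll-up over a feature grid.
# Thresholds and sample data are illustrative assumptions only.

def rag_status(passed: int, total: int) -> str:
    """Map a pass rate onto a Red/Amber/Green status."""
    if total == 0:
        return "GREY"          # no tests run yet: no signal
    rate = passed / total
    if rate >= 0.95:
        return "GREEN"
    if rate >= 0.80:
        return "AMBER"
    return "RED"

# (attribute, component) -> (tests passed, tests run), as pulled from
# the test repository into the spreadsheet.
results = {
    ("Advanced PVR", "Media Player"): (48, 50),
    ("Advanced PVR", "Planner"):      (30, 41),
    ("Internet Enabled", "VOD Client"): (12, 25),
}

for (attribute, component), (passed, total) in sorted(results.items()):
    print(f"{attribute:<18} / {component:<12} -> {rag_status(passed, total)}")
```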

It turns out Google does pretty much the same thing, but is way more analytical about it - they call it ACC. According to the book, written in 2011/12, "ACC has passed its early adopter's phase and is being exported to other companies and enjoying the attention of tool developers who automate it under the 'Google Test Analytics' label".

So I've created my own generic Set-Top-Box project based on Google's ACC model, aiming to share this with like-minded people working on STB projects. It is not complete, but it offers the basic building blocks to fully apply ACC in a real project. I think this will be a useful learning exercise for anyone in a QA/Testing role.

My generic-STB-Project is currently hosted on the Google Test Analytics site, along with a complete ACC profile. It took me about four hours to do the full detail (spread over a few days of 10-minute tea breaks, between meetings), and about one hour to get the first draft of the High Level Test Plan with at least one ACC triple filled in. The book challenges you to get to a High Level Test Plan in just 10 minutes, which I actually think is quite doable for STB projects as well!

In about four hours, which is about half a working day, I was able to create a Test Matrix of 254 core high-level test scenarios (note I'm really not a QA/Test expert, so imagine what a full-time QA/Test Engineer could knock up in one day) that looks like this:

Capabilities & Risk Overview of a Generic Set Top Box


Google currently use this model in their high-level test plans for various products, like ChromiumOS, the Chrome Browser & Google+. It was developed by pulling the best processes from the various Google test teams and product owners, with the implementation pioneered by the authors Whittaker, Arbon & Carollo. As the book explains, test planning really stops at the point you know what tests you need to write (it is about the high-level cases, not the detailed test scripting). Knowing what to test kicks off the detailed test case writing, and this is where ACC comes in (quoting a snippet from Chapter 3, The Test Engineer):
ACC accomplishes this by guiding the planner through three views of a product, corresponding to 1) adjectives and adverbs that describe the product's purpose and goals, 2) nouns that identify the various parts and features of the product, and 3) verbs that indicate what the product actually does.
  • A is for "Attribute"
    • Attributes are the adjectives of the system. They are the qualities and characteristics that promote the product and distinguish it from the competition. Attributes are the reasons people would choose to use the product over a competitor.
  • C is for "Component"
    • Components are the building blocks that together constitute the system in question. They are the core components and chunks of code that make the software what it is.
  • C is for "Capability"
    • Capabilities are the actions the system performs at the command of the user. They are the responses to input, answers to queries, and activities accomplished on behalf of the user.
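In code terms, an ACC model is little more than a list of (Attribute, Component, Capability) triples. Here is a minimal Python sketch of that shape - my own illustration, not Google's implementation - with two example triples drawn from the STB model described below:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Capability:
    """One ACC triple: a verb (capability) tying a noun (component)
    to an adjective (attribute)."""
    attribute: str   # adjective: why the product is compelling
    component: str   # noun: the building block involved
    capability: str  # verb: what the system does for the user

# Two illustrative triples from the generic STB model below.
capabilities = [
    Capability("Searchable", "Search",
               "Finds programmes by title across broadcast and VOD metadata"),
    Capability("Advanced PVR", "Media Player",
               "Resumes playback of a recording from the last viewed position"),
]

for c in capabilities:
    print(f"[{c.attribute}] x [{c.component}]: {c.capability}")
```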
Read more to find out what my ACC model for a generic Set-Top-Box looks like (first draft!)...




In my generic Set-Top-Box (STB) model, I describe the product's core features and components from the point of view of the user, not necessarily the detailed software architecture component diagram of the STB, because, as we know, an STB architecture is pretty complicated if you dig into the layers that comprise it, as shown in the figure in the side-bar. If we wanted to be really detailed about the component test plan, then I would create a separate ACC for the EPG application, including the Application adaptors & Middleware separately. Together these components make up the end-product for the STB customer, which is the PayTV operator, but even individually the components themselves are products owned by vendors. So a middleware vendor might create an ACC based purely on its Middleware product components (the testing of course would be slightly different, because it's a software block that's being sold rather than a full user experience).

My Set-Top-Box is your stock-standard STB: a PVR that does VOD (Video-on-Demand) and is Internet-enabled. PVRs are so standard these days that HD is not even a feature anymore, and even the fact that an STB may sport a large disk drive is often taken for granted.

So the Attributes I've given this generic STB, at the highest level, consist of the following selling points:
  • Accessible: Provides accessibility features to enhance usability
  • Internet Enabled: The STB can connect to the internet, which, when connected, exposes advanced features such as Remote Control, Diagnostics, IP downloads, etc.
  • Secure: The STB is secure; it can deal with intrusions by hackers and prevents access to critical information
  • Advanced PVR: The STB offers enhanced PVR features in addition to basic PVR features like Live Pause, Single Recording, etc.
  • Enhanced User Control: The user has more control over the STB, such as Parental Control, Locking, Content Filtering, etc.
  • Highly Personal (Configurable): The UI is highly customisable to suit the needs of multiple users.
  • Searchable: The STB offers advanced search capabilities by indexing metadata from a variety of sources.
  • Intuitive, Slick & Modern UX: The UI is modern, user friendly and very usable in keeping with modern UX from smart devices, tablets, etc.
  • Fast, Quick to Respond: The STB is fast to respond to user controls from the remote, search results, etc.
And the Components that realistically bring this STB together are the following:
  • PVR Middleware Services: This is a big component that handles core services around systems services such as broadcast acquisition, filtering metadata, providing databases, and PVR functions like: recording, live pause, trick modes, advanced trick modes, etc.
  • CA Client / Security Services: Most STBs support conditional access and generally have CA clients that implement the security requirements of CA vendors, configured to enforce the security business rules of the PayTV operator. The CA client handles scrambling/descrambling, viewing access, etc.
  • System Services: Component responsible for exposing various operating system services to the application, usually a wrapper around more detailed middleware or low level services that wouldn't directly be exposed to an application component.
  • Application Manager: The STB allows multiple applications to run at the same time, such as the TV Guide, TV Games, Appstores, etc. This component covers application engine management, such as Java Virtual Machine instances, inter-application communications, etc.
  • Parental Control: Allows the user to set restrictions on viewing and accessing content, based on various parental control rules such as PG rating filters, total blocking blackout periods, Channel blocking, timers, etc.
  • User Preferences: Manages all user configurable settings for personalisation such as TV Guide rows, banner timeouts, software upgrade schedule, favourite channel lists, colours, user profile logins, etc.
  • Search: This component is responsible for providing the search experience. Content/metadata can be sourced from a variety of sources, like SI metadata, TV Anytime data, VOD online data, YouTube data, IMDB, Other third party mediation services, RSS feeds, etc.
  • VOD Client: This component is responsible for accessing the VOD catalogs from a variety of sources, handle download management, request queues, download priorities, etc.
  • Media Player: The Media Player component is responsible for playing back content - the playback engine. This can be Live TV, PVR recordings off disk, VOD content, streamed content, progressive download, etc.
  • Planner: The Planner allows the user to plan his/her viewing, for example: book reminders, recordings, schedule VOD downloads, etc.
  • Guide: The Guide is a component that aggregates program data for the Electronic Program Guide (EPG) from a variety of sources (Service Information from broadcast data, on-demand, IMDB enhanced data, etc.)
And the Capabilities are thus reflected here (all 254 of them):
https://test-analytics.appspot.com/#/12961009/capabilities
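If you want a feel for the shape of that matrix without opening the tool, here is a rough Python sketch of crossing the attributes and components above into a capability grid. The two populated cells are illustrative placeholders of my own; the real project fills in all 254 capabilities:

```python
# Attribute and component lists from the model above; the populated
# cells below are illustrative only - see the Test Analytics project
# for the full 254 capabilities.
attributes = [
    "Accessible", "Internet Enabled", "Secure", "Advanced PVR",
    "Enhanced User Control", "Highly Personal", "Searchable",
    "Intuitive, Slick & Modern UX", "Fast, Quick to Respond",
]
components = [
    "PVR Middleware Services", "CA Client / Security Services",
    "System Services", "Application Manager", "Parental Control",
    "User Preferences", "Search", "VOD Client", "Media Player",
    "Planner", "Guide",
]

# grid[(attribute, component)] is a list of capability verbs; many cells
# stay empty because not every component contributes to every attribute.
grid = {
    ("Searchable", "Search"): [
        "Indexes metadata from SI, TV Anytime and VOD sources",
        "Returns search results across broadcast and online content",
    ],
    ("Secure", "CA Client / Security Services"): [
        "Descrambles only the channels the viewer is entitled to",
    ],
}

total = sum(len(verbs) for verbs in grid.values())
print(f"{len(attributes)} attributes x {len(components)} components, "
      f"{total} capabilities captured so far")
```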

Project Details:
The project is publicly accessible through:
Go to: https://test-analytics.appspot.com/
Select: khanmjk's Generic STB

3 comments:

  1. Hi! Did you try this approach in your production work? If so, how did you deploy it?

  2. Hi, I can't say that we're using the tool per se, but the concepts were well received. We've not deployed this particular tool; teams have their own ALM... and the guys are relying on manual heatmaps in Excel for now.

  3. I read HGTS and I liked the ACC method for test planning. I have the following questions:

    1. We usually have many variants of the same component; how are they handled in ACC? For example, say you have multiple media players such as an MP3 media player, an MPEG media player, an HD media player and so on; each media player should be verified to function well with all the other features in the application. Would you create separate components for each player and repeat all the capabilities for each, or would you have just a single component and list out all the variants inside each capability?

    2. In an application, many features work together, so multiple components are interconnected, and we need to simulate this in testing. For example, say we want to verify "the VOD Client executing via a Media Player, which also searches for uncensored data that should be filtered by Parental Control". Here, four components are involved, and I can really go on adding more, such as the user doing a secure search over the Internet, and so on. How and where do you list out such scenarios in ACC?

    3. New features are added to the application continuously. Every newly added feature has to work with all previously existing features. How does ACC help in providing test coverage for such new features? How do you address the dependency matrix between features?

    Thanks in advance.

    Regards,
    Rahul Diyewar
