There is a lot of talk about whether writing code, or creating software, is an art, a craft, or a highly disciplined engineering activity. This post is not about that debate, although I will state my position: great software is just as much a craft as it is a software / systems engineering discipline. One should not discount the analysis and rigour that should go into the design of any software system, be it a relatively simple 4th/5th-tier, high-level application built on an easy framework (e.g. WebApps, smartphone & tablet frameworks), or a very low-level, core, native software component (e.g. OS kernel, drivers, engine services).
Take the Set Top Box (STB), for example: a consumer device that is usually the focal point of the living room. People expect the thing to just work, to perform as well as any other gadget in the household: predictable, reliable and always delivering a decent user experience.
How then does one go about delivering performance in an STB software stack? Is it the job of the PayTV operator to specify the Non-Functional Requirements (NFRs) in detail? Is it the job of the software vendors (middleware / drivers) to maintain their own internal product performance specifications and benchmark their components against their competitors'? Does the PayTV operator benchmark its vendors against the legacy decoders already in the market, for example: the new product must be faster than all infield decoders by 20%? Or is there a collaboration between the technical owners (i.e. the architects) to reach mutually agreed targets?
Take the problem of Sluggishness. According to good old Google,
sluggish
/ˈslʌɡɪʃ/
adjective
- slow-moving or inactive: "a sluggish stream"
- lacking energy or alertness: "Alex woke late feeling tired and sluggish"
- slow to respond or make progress: "the car had been sluggish all morning"
When we field-test STBs over long periods of time (or sometimes after not-so-long usage), people & customers report:
- The box got very sluggish, needed to reboot
- The STB isn't responsive to remote presses, takes 4-5 seconds to react
- Search takes too long and I can't do anything else but wait for search to complete
- Channel change is slow compared to my other decoder
- When using the program guide or navigating through the menus, the box gets stuck for a few seconds as if it's processing some other background activity - it's very irritating to my experience
This feedback, whilst extremely useful to the product team, is very hard for engineers to characterise without access to system logs with which to trace the chain of events that led up to the slowdown and the resulting degraded user experience.
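As an illustration only, the instrumentation that makes this kind of tracing possible doesn't need to be heavyweight. The sketch below (in C, with component and event names invented for this example rather than taken from any real middleware) simply stamps key UI milestones against a monotonic clock, so that a field log can show which stage of, say, a channel change stretched from a few hundred milliseconds to several seconds:

```c
/* Illustrative sketch only -- component/event names are invented for
 * this example and not taken from any particular STB middleware. */
#include <stdio.h>
#include <time.h>

static void trace_event(const char *component, const char *event)
{
    struct timespec ts;
    clock_gettime(CLOCK_MONOTONIC, &ts);   /* monotonic: unaffected by clock updates */
    fprintf(stderr, "[%5ld.%03ld] %-8s %s\n",
            (long)ts.tv_sec, ts.tv_nsec / 1000000L, component, event);
}

/* Hypothetical channel-change path: each stage is stamped so a field log
 * can show which one stretched from a few hundred ms to several seconds. */
void on_channel_change(void)
{
    trace_event("ui",      "channel_change_requested");
    /* tune_to_service();      -- placeholder for the real work */
    trace_event("tuner",   "lock_acquired");
    /* start_av_decode();      -- placeholder for the real work */
    trace_event("decoder", "first_frame_rendered");
}
```

Using CLOCK_MONOTONIC rather than wall-clock time keeps the deltas meaningful even if the box adjusts its clock from the broadcast stream mid-session.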
In my view, based on my own experience of writing STB applications and managing complex systems projects, unless we quantify in a fair amount of detail what sluggishness actually means, it becomes a cop-out for passing the buck to other parties: either the product owners didn't take the time to spec out the requirements properly, or the application is blamed for relying on API services it has no control over...
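To make that concrete, here is a minimal sketch of what quantifying sluggishness might look like at the lowest level: measure the time from a remote key press to the completion of the resulting UI update, and flag any breach of an agreed budget. The 250 ms figure and the function names are assumptions for illustration, not numbers or APIs from any real specification:

```c
/* Illustrative sketch only -- the 250 ms budget and the function names
 * are assumptions for this post, not taken from any real specification. */
#include <stdio.h>
#include <time.h>

#define UI_RESPONSE_BUDGET_MS 250L   /* hypothetical key-press-to-render target */

static long now_ms(void)
{
    struct timespec ts;
    clock_gettime(CLOCK_MONOTONIC, &ts);
    return ts.tv_sec * 1000L + ts.tv_nsec / 1000000L;
}

void handle_key_press(void)
{
    long start = now_ms();

    /* render_menu_page();   -- placeholder for the real UI work */

    long elapsed = now_ms() - start;
    if (elapsed > UI_RESPONSE_BUDGET_MS) {
        /* A breach is logged with the measured figure, so "the box got
         * sluggish" becomes a concrete number against an agreed budget. */
        fprintf(stderr, "NFR breach: key-to-render %ld ms (budget %ld ms)\n",
                elapsed, UI_RESPONSE_BUDGET_MS);
    }
}
```

Once a figure like this is agreed between the operator, the middleware vendor and the application team, "the box got sluggish" becomes a measured, reproducible number that an engineer can actually investigate.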
In the remainder of this post, I will touch on an example from a previous project of how we handled the problem of sluggishness, through a process of rigorous Non-Functional Requirements focused on Performance Specifications...