Testing and approvals are a vital yet often overlooked area in the lifecycle, and especially the release schedule, of a mobile device.
I’m Gavin Nesbitt and I’ve known John since the mid-nineties; recently he asked me, after I offered, to write a short piece on the field I’ve worked in for over a decade: mobile device test and approvals. I’ve been interested in telecoms since before I met John. In fact, John and I met on CB radio, which I got active on through my interest in electronics from when I was about 8 years old, back when I still lived in Scotland. Since those days I’ve become a radio ham (M1BXF) and done loads of cool work – I still have a real passion for my hobbies of ham radio, electronics and telecoms, which led me to start working for Orange, on the network side, when I was 19. I next worked in a large phone repair centre for 2 years before moving to the Czech Republic in 2003, where I started working for a handset manufacturer executing field trials around Eastern Europe. In 2005 I moved to Cambridge, UK and started working in the development arena, testing a 3G protocol stack (the name given to the modem software), before moving into a field applications role supporting LTE test mobiles used by the network manufacturers. In that time I’ve also worked on WiFi, Bluetooth, GPS, ZigBee, WiMAX and many other wireless technologies. Nowadays I’m the Technology Manager for IOTAS (www.iotas.co.uk), a company which consults in and provides a worldwide testing solution to many different customers. Due to the confidentiality my job requires – remember, I’m testing devices not yet on the market – I’m going to be very careful about which company names I mention; those I do mention are companies who already openly state they do what I describe.
In my conversations with John it became really clear how few people understand the full extent of the testing and approval requirements involved in launching a new mobile phone. So I thought, while testing in the back of a taxi on some back roads near Orangeville, Ontario with a colleague, I’d take some time to write an article about this area using my Asus Transformer (TF101) – got to keep it Android, right? My colleague and I are currently testing Mobile Originated (MO) and Mobile Terminated (MT) back-to-back voice calls and 180-minute MO long calls on a relatively mature device on a Canadian carrier. I call the device mature because it already has GCF certification – more on this further on.
Why both short and long calls? Short calls test the device’s ability to set up and tear down a connection to the network; long calls test its ability to stay connected once the call is established. This is the 4th day in the taxi, about 8 hours per day, and this alone accounts for only about 10% of the testing we are doing in Canada on this mature device; the other 90% has been done in the hotel over the past week. The sheer time and effort which goes into making sure a device works as designed is vast, especially on the modem side, which is the part that interacts with the network (and the area I’m an expert in). The modem is the ‘interface’ between the device and all the different networks – networks whose infrastructure can be made by different manufacturers, and which is then individually configured by each network to its own requirements – resulting in numerous configurations a handset could face just connecting to a network. Furthermore, a huge amount of testing goes into the GUI, MMI, translations and the like. I’ve seen rooms of people testing all aspects of an MMI for weeks: ensuring the SMS counter works, text alignment is correct on the screen, all spellings in the menus and user alerts are OK, animations are smooth – you name it, it’s tested, then usually tested again to be sure. I should clarify the word ‘device’: in the test industry it’s the general name for anything with a phone/modem in it, since what we carry in our pockets are no longer mere phones.
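To give a feel for how short and long calls combine into a day’s work, here is a minimal sketch of a voice-call test plan. The structure, names and repeat counts are my own illustration, not an actual IOTAS or carrier test plan:

```python
# Hypothetical sketch of a voice field-test plan: short back-to-back calls
# exercise setup/teardown, long calls exercise staying connected (handovers
# etc.) once established. All numbers here are illustrative.
from dataclasses import dataclass

@dataclass
class CallTest:
    direction: str       # "MO" (mobile originated) or "MT" (mobile terminated)
    duration_min: float  # how long the call is held once established
    repeats: int         # how many times the call is set up and torn down

def total_airtime_hours(tests):
    """Rough airtime needed to execute a plan, ignoring setup overhead."""
    return sum(t.duration_min * t.repeats for t in tests) / 60

plan = [
    CallTest("MO", 2, 50),   # 50 short MO calls: stress setup/teardown
    CallTest("MT", 2, 50),   # 50 short MT calls: stress paging and answering
    CallTest("MO", 180, 4),  # 4 x 180-minute long calls: stress call stability
]

print(total_airtime_hours(plan))  # roughly 15.3 hours of airtime
```

Even this toy plan shows why a handful of long-call cases dominate the schedule – the four 180-minute calls alone are 12 of those hours, which is why they end up as multi-day drive tests in the back of a taxi.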
Back to the direction of this article, as I’m getting lost in detail – much like the taxi driver, who has no idea where we are right now! As you might gather from my introduction, even a relatively mature device still gets, and requires, lots and lots of testing. We are not only testing this device in Canada but also in several European and Asian countries in parallel, which is logistically quite demanding. Just think about that for a minute. To test you need SIM cards (able to handle large amounts of data and voice calls), test locations (hotels, customers), drive test routes from operators, test tools to confirm that certain device/network interactions have happened (and the skill to identify them), and test plans which take into account the network and device capabilities; then you have to get there by things like planes, trains and taxis, usually in a country where you don’t speak the local lingo. It’s not necessarily a simple task. SIM cards are usually quite simple to source – just go into a shop and buy PAYG (pay as you go) SIMs – however PAYG SIMs don’t always have the features you need to test enabled on them, so it’s likely you will need contract SIMs. A contract SIM requires a minimum contract term and usually an address in the country you are in, and then we need more details about it (more information than a consumer needs) – and this is required for all the networks you test on. You then need good contacts within the networks you are testing on, so you can be told things like SIM details or where they have, say, UMTS900 coverage, and then get a hotel near it; or, for drive testing (the long call testing I’m currently doing, as an example), you need to know the route to use which ensures all the test cases can be covered. The test tools are different for each modem chip supplier, so you need to understand lots of them. It all starts to mount up.
As my expertise is in live network Field Trials (I’ve worked for a handset manufacturer, a protocol software developer and now a 3rd party test company), I’ll jump in and give you an idea of some of the different strategies needed to test a device properly. Testing is always a trade-off between cost and quality. The cost is not only money but also time: you could test a device exhaustively, but it would cost heaps and the device would never make it to market on time. Nobody does this, nobody.
Field testing can be broken down into the following:
· GCF Certification: Most, but not all, networks ask that handsets are GCF certified, and most handset manufacturers GCF certify their handsets as a matter of routine – but what is it? GCF stands for Global Certification Forum (www.globalcertificationforum.org) and is a test forum made up of Networks, Manufacturers and Observers who create the test plans and test scope requirements for GCF certification. GCF members meet bi-monthly and discuss which areas need extra or less testing to ensure the main aspects are covered adequately; they also add new tests when new technology such as LTE appears, so it takes quite a bit of thought to create the most optimized test plan. There are many aspects of GCF; I’ll cover 2 of them, Conformance and Field Test (FT). I should point out GCF only really covers Europe and Asia; the USA has a similar certification body known as the PTCRB. GCF conformance is all based on lab tests using network simulators usually costing in excess of $1m (I’ve worked in labs which have had over $10m of conformance kit), whereas GCF field trials are run on live networks. In conformance you would typically have over 10,000 test cases to run; each test must pass, and each test covers an aspect of the specification the handset has been developed to, most likely 3GPP (www.3gpp.org), such as a particular voice codec or data rate. Field trials cover more end-user type tests such as throughput, voice quality, SMS and Supplementary Services. The GCF defines the scope of these tests – for example, they must be conducted on 5 different networks, although the network number is mainly driven by the infrastructure combinations that must be covered, such as 3 different MSC vendors (voice/SMS), 4 different SGSN vendors (packet data), 3 different SIM cards and so on; you can usually achieve these combinations with a minimum of 5 networks.
The time to certify an HSPA device to GCF level is about 5 days per network, so over 5 networks it would take about 25 man-days: that could be 5 engineers in 5 locations simultaneously, 1 engineer travelling between the 5 locations in turn, or anything in between. The company I work for now has engineers living in various locations around the world, in places where we have coverage of the 5 minimum networks, keeping travel costs to our customers minimal. The reason for the focus on infrastructure combinations is that although each infrastructure vendor (the likes of Alcatel-Lucent, Nokia Siemens Networks, Ericsson and many others) creates its equipment based on the same specifications, there are always subtle differences. I should point out that although some of the names here are familiar, infrastructure teams and handset teams work completely independently of each other, usually on different continents. So it’s up to the GCF, and other organisations, to make sure everyone plays nicely together. It’s entirely possible each bit of the diagram below could be made by a different company, and they should all work together.
· Development: This involves testing a device, or its core software, continually as it is developed, using test plans defined and executed by each manufacturer as they see fit. This is usually done in readiness for GCF field trials, so that the scope of the field trials is covered and confidence is gained that they should pass. It’s likely most of this testing is executed in the lab, or on a limited set of networks, due to cost and manpower, leaving some uncertainty about the actual stability.
· Performance: back-to-back testing to stress the reliability of the device. Things like making hours and hours of voice calls, checking handovers, call setups and other aspects of call control, or downloading gigabytes and gigabytes of data to see if there are any issues with data throughput rates. Performance testing is usually run in parallel with a reference device, to check that any anomalies are not related to the network: if both the test device and the reference show low throughput then it’s likely a network issue – though this is where the engineer’s experience comes into play. Performance testing can be a continuous programme of testing the core software, not necessarily a device ready to be released to the market. I’ve worked on performance test plans lasting from days to months.
· Network Acceptance: Each network may have a suite of tests they expect a device to pass before accepting it for sale. These tests usually cover specific customisations of that network; this is why we have handset variants. When you occasionally hear that a device has been delayed on a particular network, it has likely not met the network acceptance requirements.
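The GCF field-trial scope described above is essentially a coverage problem: pick a set of live networks whose combined infrastructure reaches the required number of distinct vendors. A minimal sketch of that check, with entirely made-up vendor assignments:

```python
# Illustrative sketch of the GCF field-trial scope logic: the chosen live
# networks must between them cover enough distinct infrastructure vendors.
# Network names, vendor names and assignments below are invented.
REQUIREMENTS = {"msc": 3, "sgsn": 4, "sim": 3}  # distinct vendors needed

networks = {
    "NetA": {"msc": "VendorX", "sgsn": "Core1", "sim": "SimCo1"},
    "NetB": {"msc": "VendorY", "sgsn": "Core2", "sim": "SimCo2"},
    "NetC": {"msc": "VendorZ", "sgsn": "Core3", "sim": "SimCo3"},
    "NetD": {"msc": "VendorX", "sgsn": "Core4", "sim": "SimCo1"},
    "NetE": {"msc": "VendorY", "sgsn": "Core1", "sim": "SimCo2"},
}

def scope_covered(chosen):
    """True if the chosen networks reach every distinct-vendor target."""
    for element, needed in REQUIREMENTS.items():
        vendors = {networks[n][element] for n in chosen}
        if len(vendors) < needed:
            return False
    return True

print(scope_covered(["NetA", "NetB", "NetC", "NetD"]))  # True: 3 MSC, 4 SGSN, 3 SIM
print(scope_covered(["NetA", "NetB"]))                  # False: not enough vendors
```

In real planning the inputs are operator network details rather than a hand-written table, but the principle is the same: it’s vendor combinations, not the raw network count, that set the minimum of around 5 networks.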
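The reference-device trick from performance testing can also be sketched in a few lines: a degradation on the device under test is only treated as a device problem if the reference, tested in parallel, did not see it too. Thresholds and sample figures here are my own invention for illustration:

```python
# Sketch of the reference-device comparison used in performance testing:
# if both devices degrade together, suspect the network; if only the
# device under test (DUT) degrades, suspect the device. The 'floor'
# threshold and the numbers below are invented for illustration.
def classify_throughput(dut_mbps, ref_mbps, expected_mbps, floor=0.5):
    """Verdict for one measurement interval.

    floor: fraction of expected throughput below which a sample is 'low'.
    """
    dut_low = dut_mbps < expected_mbps * floor
    ref_low = ref_mbps < expected_mbps * floor
    if dut_low and ref_low:
        return "network issue suspected"  # both devices degraded together
    if dut_low:
        return "device issue suspected"   # only the DUT degraded
    return "ok"

print(classify_throughput(1.2, 1.1, 7.2))  # both low -> network suspected
print(classify_throughput(1.2, 6.8, 7.2))  # only DUT low -> device suspected
print(classify_throughput(6.9, 7.0, 7.2))  # ok
```

In practice an engineer weighs far more context than one threshold – cell changes, signal levels, time of day – which is exactly the experience the article mentions.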
However, even after passing all the above tests, issues may still not be found until thousands of consumers are using the device in ways the test plans never covered; this is where cost vs quality comes in. The cost to rectify an issue in the field is becoming less and less nowadays due to over-the-air updates, but any new software build still costs time and money to verify before release. The GCF requires a full ‘paper trail’ of software release notes and test results for any maintenance software release after GCF certification in order to keep that certification. Depending on the fix, a select few tests from the other areas may be run to ensure quality (and to verify something else wasn’t broken by the changes). In reality the testing conducted finds most major failures before release, though some of those failures may be quite serious and require time to fix, thus delaying the release of a device. Each manufacturer can decide for themselves how serious each failure actually is and what remedy is needed. Now, many of you have probably got to this point thinking ‘why do I still find issues with my device?’ Well, the reality is that during test, for the cost reasons mentioned already, it’s just not possible to cover every aspect of a modern device, especially modern smartphones. I’ve also never been in a situation where, if an issue was missed by a team member, they were made to feel bad about it – it’s just not that kind of game.
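Picking that “select few tests” after a maintenance fix is, at its simplest, a mapping from the areas the fix touched to the test groups that cover them. A hedged sketch, with a hypothetical coverage table of my own making:

```python
# Hypothetical sketch of regression-test selection for a maintenance
# release: map each changed software area to the test groups covering it
# and run their union. The coverage table below is invented.
COVERAGE = {
    "call_control": ["short_calls", "long_calls", "supplementary_services"],
    "data_stack":   ["throughput", "long_data_transfer"],
    "sms":          ["mo_sms", "mt_sms"],
}

def regression_subset(changed_areas):
    """Union of the test groups covering the areas touched by the fix."""
    tests = set()
    for area in changed_areas:
        tests.update(COVERAGE.get(area, []))
    return sorted(tests)

print(regression_subset(["sms", "call_control"]))
```

A real release-note paper trail records which cases were re-run and why, but the underlying idea is the same: re-verify what the fix could plausibly have broken, without re-running the full certification suite.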
Of course there are many other types of testing, approvals and certification a device must go through before going on sale. Some of these are static discharge, EMC (Electro Magnetic Compatibility), battery life (the results of which are subjective), battery safety, material safety and many more, many of which require specialist skills, knowledge and equipment. If the device is to be sold in Europe then you need to gain European Conformity (CE) approval; Russia has GOST, and many other countries have their own regulations.
All this basically comes down to the reasons why some handsets do not ship on time. Due to the immense scope of testing and approvals, the chance of finding an issue which must be fixed before launch is high. If an issue is found then retesting is always needed, which again adds time to the whole process. Testing is also stressful: I’m at the end of a 10-day test session here in Canada, each day has been about 12 hours of testing, and keeping my attention level high for all of that time has been hard going. For this reason I don’t point fingers when issues slip through – the issue might never have been part of the test plan in the first place, and if it wasn’t, I assure you any good tester will add it to future test plans! So it again comes down to cost – you could have a great, stable phone, but it would cost lots more than you are willing to pay and take much longer to reach the shelves. If that is what you want, then make the manufacturers aware. If that’s not your thing, then please give the manufacturers some slack when issues are found or devices are delayed; modern devices are complex, and manufacturers are doing their best to get you the latest tech as quickly as possible and at the best cost.