Test Automation
My second task as a contractor at Ericsson was to work with a recent graduate to develop a suite of automated regression tests for ANSI IS-136 TDMA and AMPS. The group had a tool developed in Sweden called UP-SIM, which was essentially an IS-136 TDMA base station with a command-line test interface.
While the UP-SIM was intended for, among other things, automated regression testing, no one seemed to have any documentation for it, no one in Raleigh could operate it very well, and no one seemed to know what scripts were available for it.
RF test enclosure similar to the ones used when developing the TDMA automation (aka Basic Test) suite.
The decision I had to make was whether to use the UP-SIM or a tool being developed in-house by another department, which was also an IS-136 TDMA cellular base station simulator with a command-line test interface.
Because testing cell selection, reselection, and handoffs requires multiple channels, a single-channel protocol analyzer would be insufficient. So the locally developed WinSSV ANSI IS-136 TDMA cellular base station simulator looked like the best bet.
The cost to our department would be minimal; if I remember correctly, the combined cost of the base station and software was less than $5,000. I already knew the cooling fans created an intolerable amount of noise. For that reason, and to get better control over the test automation, I ordered a shielded RF enclosure for the base station to sit inside. The enclosure was a cube about 36 inches wide. There was a bulkhead plate to which I had our machine shop fit a variety of RF connectors. A pair of RF-shielded low-noise fans mounted to the side moved enough air through the enclosure to keep the base station inside from overheating.
As always, I wrote the scripts as scripts rather than programs. Each script corresponded to one test, was named after that test, and contained all the code necessary to run it. I've always written scripts this way because I found it makes them easier to reuse and easier to train new people on.
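To illustrate the one-test-per-script idea, here is a minimal sketch in Python. The command names, responses, and the `send_command` helper are hypothetical stand-ins; the original scripts drove a base station simulator's command-line interface, which is not documented here.

```python
# test_registration.py -- illustrative sketch: one test per file, named
# after the test, with everything needed to run it contained inline.
# All command strings and responses below are hypothetical.

def send_command(cmd):
    """Stand-in for sending a command to the simulator's CLI and
    returning its response."""
    canned = {
        "SET_CHANNEL 777": "OK",
        "PAGE_MOBILE": "PAGE_RESPONSE",
    }
    return canned.get(cmd, "ERROR")

def test_registration():
    # Setup, execution, and checking all live in this one script, so it
    # can be copied and adapted for a new test without hunting through
    # shared libraries.
    assert send_command("SET_CHANNEL 777") == "OK"
    assert send_command("PAGE_MOBILE") == "PAGE_RESPONSE"
    return "PASS"

if __name__ == "__main__":
    print("test_registration:", test_registration())
```

The trade-off is some duplication between scripts, exchanged for the property that each test can be read, run, and modified in isolation.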
Our first goal was to create a sanity test whereby all major areas of the phone would be exercised to make sure nothing was completely broken. This represented a milestone: for the first time, management could get an indication of a software version's basic health within an hour or two of its release. Individual testers still had to continue running manual regression tests, but we steadily reduced that number until all regression tests had been automated.
60 Project Day Savings
A phone typically had 20 internal software releases before it was released to the market. Manual regression testing took 3 days. The automated suite took four hours. Most builds were released after hours. One of us would remain to start the tests. As soon as the software build was ready, we would flash the phone and fire off the tests. First thing in the morning, we already had regression results before most developers had even read the software release notice. By automating the testing, 60 project days were saved per phone!
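The savings figure follows directly from the numbers above; a quick check of the arithmetic:

```python
# Back-of-the-envelope check of the savings claim, using the figures
# given in the text.
builds_per_phone = 20          # internal software releases per phone
manual_days_per_build = 3      # one manual regression pass

manual_total_days = builds_per_phone * manual_days_per_build
# The automated suite took four hours and ran overnight, so it consumed
# effectively zero project days.
print(f"Project days saved per phone: {manual_total_days}")  # 60
```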