Peter Collingwood is Regional Vice President, Europe, Middle East & Africa for JDSU’s Communications Test division (formerly Acterna). Mr Collingwood has been with the organisation for over 18 years and has served in senior-level corporate positions in Europe and the United States. He holds a Bachelor of Science in Electronics Engineering and a Diploma in Industrial Studies from Loughborough University of Technology.
Fixed and mobile communications are changing - broadband is coming, bringing a wide variety of Internet-dependent services. Even today, in markets where true 3G broadband services have not yet arrived, demand for video-based entertainment and data-based services is growing. To offer these new services successfully, operators must constantly watch the quality of service (QoS) they deliver, because it directly influences the user’s experience and satisfaction. Managed automated service testing is the lowest-cost way to guarantee high quality.
For years, broadcast and cable companies have steadily expanded programming choices for consumers. From humble beginnings of a few channels over VHF and UHF frequencies to hundreds of channels today, the business of delivering video-based entertainment and information is again on the brink of change due to two complementary forces in the marketplace - the Internet and consumer preference for mobile communications. The Internet has unleashed shorter forms of entertainment and information perfectly suited to the comparatively lower bandwidth of today’s mobile handsets. YouTube, music videos, sports and news summaries, short TV shows and other brief forms of entertainment are in wide demand by mobile consumers. The limits of small screens and the slower speeds of mobile devices evidently do not deter consumer interest.

As wireless broadband capabilities expand and new technologies enable even richer video quality on handsets, delivery of broadband entertainment and information to wireless devices promises an avalanche of IP traffic over the air. Of course, excellent content is easily overshadowed by poor-quality service delivery. A video-on-demand service is not useful if the file is corrupted on download to a mobile handset. Consumers lose interest in viewing mobile TV if it isn’t available on a consistent, high-quality basis. As the quality and quantity of content grow and handset capabilities widen, wireless service providers and content providers alike must ensure that the viewing experience is error-free.

Mobility is truly the mark of next-generation content delivery. Any show, any video, any clip, any download, anywhere is what subscribers expect. Yet the mobile user’s experience is determined by much more than just rich content and a variety of services. Rather, creating a positive user experience is tightly bound to quality of service (QoS).
Poor QoS is apparent in dropped connections, slow download times or features that don’t work outside the home network. Even though the availability and performance of services largely determine user experience, it is often difficult to pinpoint where in the home or roaming networks QoS problems exist, especially intermittent problems or instances where service quality is slowly deteriorating. Sometimes the first indication of a problem is a call to customer service. To the subscriber, the distinction between home network and roaming network is virtually meaningless; it all has to work.

QoS therefore takes on a broader scope. Operators and content providers must closely monitor the quality and availability of wireless services - in the home network and wherever subscribers roam on partner networks worldwide. In addition to active service testing, improving QoS demands pre-deployment service testing that ensures seamless rollout of new features and services - again, in the home and roaming networks. Finally, QoS audits and monitoring are required to detect service problems before they turn into subscriber complaints - across all networks.

The formula seems simple. To improve QoS, you have to understand the whole mobile user experience. To know the whole user experience, you have to test and baseline mobile services across home and roaming partner networks. To test services, you need to measure in a way that truly emulates how, when and where subscribers use mobile services. And, of course, you need to do all of this with fewer capital and operating resources than ever before. What is needed is a cost-effective, global means to measure user experience, often referred to as Quality of Experience (QoE).

How to measure user experience

Today the operative word is ‘more’ - there are more mobile services being sold, more rich content available, more subscribers, more competition, more network elements to manage, and more software programmes required in the network.
Not surprisingly, maintaining QoS in today’s complex mobile networks requires more service testing and monitoring than ever before. Given that subscribers measure their experience with mobile services wherever they roam, and recognising that QoS determines user experience, it is incumbent upon operators and content providers alike to constantly monitor and test services on home and roaming partner networks. There are three ways to conduct active and pre-deployment service testing: manual, operator-owned automated, and managed automated.

Manual service testing - good

Advantages
• Sending technicians into the field with phones emulates the subscriber’s perspective on service performance - to a point; and,
• Can be used for pre-deployment testing of new features and services.

Disadvantages
• Highly labour and travel intensive - by far the most expensive means to test and validate services, and especially expensive for roaming service assurance;
• Lacks consistency - limited to testing at specific points in time, it is impractical and too expensive for consistent 24/7 service testing over a period of time; and,
• Not real-time, due to the lag in reporting data to network operations.

Operator-owned automated service testing - better

Advantages
• Directing service test and verification from a centralised location emulates the user experience. Test probes are instructed to access and interact with the network; data is gathered, graded and formatted into reports;
• On-demand, 24/7 testing capability allows collection of more data points and more consistent service testing;
• Can be convenient and cost-effective to deploy test probes in the home network;
• Speeds troubleshooting - test probes can be exercised to diagnose service problems; and,
• Capital and operating costs are much lower than for manual testing methods.

Disadvantages
• Less practical and highly expensive for testing services on partner roaming networks;
• There are costs involved in acquiring and maintaining the expertise to configure and manage a service testing solution; and,
• Deploying fewer probes, or relying on partners to measure QoS in roaming networks, limits visibility into the user experience.

Managed automated service testing - best

Advantages
• Similar in function to operator-owned automated service testing, except delivered as a subscription- or service bureau-based service;
• Strong choice if the service bureau has an extensive, worldwide footprint of test probes;
• Especially strong for testing and troubleshooting outside the home network, where deploying operator-owned probes can be costly;
• Lowest-cost option, because the service bureau secures sites, owns and manages test probes, conducts tests, and makes data/reports available;
• Offers the most visibility of the user experience by making QoS testing affordable for partner roaming networks;
• With a large testing network, offers the best ability for consistent testing over time and locations; and,
• Strong choice for service testing in home and partner networks - enables a unified view of the user experience.

Disadvantages
• Figuring out how to reassign manual service testing staff; and,
• Probes may only support the most recent technologies, making it impossible to measure the experience of users on legacy technologies.

Managed automated service testing

Compared with manual and operator-owned automated systems, managed automated service testing offers the best solution for measuring the user experience. Like the other methods, it captures data points on success rate, delay, feature verification and other measures that impact QoS. Yet unlike manual methods, managed automated service testing is highly repeatable and consistent, offering operators a more extensive, real-time view of QoS for any given metric.
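The probe workflow described above - access the service, gather raw measurements, grade them, and report - can be sketched in a few lines of Python. This is purely illustrative: the function names, thresholds and metric fields are assumptions for the sketch, not part of any vendor’s product.

```python
import time
import urllib.request

def run_probe(url, timeout=10):
    """Emulate one subscriber interaction: attempt a download and
    record success, delay and bytes received (illustrative only)."""
    start = time.monotonic()
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            size = len(resp.read())
        return {"success": True, "latency_s": time.monotonic() - start, "bytes": size}
    except Exception:
        # Any failure (timeout, DNS, HTTP error) counts against success rate.
        return {"success": False, "latency_s": time.monotonic() - start, "bytes": 0}

def grade(results):
    """Aggregate raw probe results into the QoS metrics the article
    mentions: success rate and average delay."""
    successes = [r for r in results if r["success"]]
    return {
        "success_rate": len(successes) / len(results) if results else 0.0,
        "avg_latency_s": (sum(r["latency_s"] for r in successes) / len(successes)
                          if successes else None),
    }
```

A managed service bureau would run many such probes on a 24/7 schedule across home and partner networks, then roll the graded results into the reports made available to the operator.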
And because managed automated service testing offers an installed base of test probes worldwide, operators are able to test and validate the user experience across partner networks. Any method that tests and validates service performance contributes to increased revenue and reduced churn by providing actionable data to help improve the user experience. Yet the managed automated solution - with potentially more probes in more places in the world - offers more visibility across home and roaming networks. With more data points collected, network operations and other business units can do an even better job of managing the user experience - and drive increased revenue and reduced churn.

The strongest argument in support of automated service testing is the saving in labour and travel. Gartner conducted an unbiased, blind third-party assessment of the return on investment (ROI) associated with the deployment of automated service testing systems at two major North American wireless operators. Both had previously used manual testing to monitor and audit the network. The automated service testing solutions required fewer people: the Gartner study showed that each operator averaged annual savings of US$2.7 million in full-time staff. Travel cost reduction was over US$850,000, and the cost per data point collected dropped from US$20-US$30 to US$0.50 with automated and managed automated solutions. Managed automated service testing offers even greater savings than operator-owned automated solutions, because operators have no equipment to buy and maintain.

Consistent, high QoS engenders loyalty, increases revenue and improves the user’s mobile communications experience. If a connection is dropped on the home network, or quality is poor while roaming, the user always blames the operator. The goal is consistent quality no matter where the subscriber roams.
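The cost figures cited from the Gartner study can be combined in a back-of-the-envelope calculation. The annual data-point volume below is a hypothetical assumption, chosen only to show how the drop in cost per data point adds to the staff and travel savings; it is not from the study.

```python
# Figures cited from the Gartner study in the text
staff_savings = 2_700_000   # average annual full-time staff savings (US$)
travel_savings = 850_000    # annual travel cost reduction (US$)
cpdp_manual = 25.0          # midpoint of the US$20-30 manual cost per data point
cpdp_automated = 0.50       # automated/managed cost per data point

# Hypothetical annual testing volume, for illustration only
data_points_per_year = 100_000

collection_savings = (cpdp_manual - cpdp_automated) * data_points_per_year
total_savings = staff_savings + travel_savings + collection_savings
print(f"Data-collection savings: US${collection_savings:,.0f}")
print(f"Total annual savings:    US${total_savings:,.0f}")
```

At this assumed volume, data-collection savings alone rival the staff savings, which is why the cost-per-data-point drop is the headline figure of the study.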
Improving QoS starts with measurements that emulate the user experience; managed automated service testing captures more data points from more roaming networks, on a consistent 24/7 basis - at a fraction of the cost of any other testing method. With easy-to-use network testing, operators and content providers alike can test current services, pre-test new features, and anticipate trouble. Proactive QoS testing keeps the operator in touch with user experiences. This sort of automated performance evaluation lets operators and content providers effectively manage the mobile user’s Quality of Experience.