Dr Wolfgang Knospe heads the Radio Access and Transport division of the Competence Practice Communication Technology. He has acquired broad experience in this field during his years of work for globally and regionally active mobile network operators. His core competencies cover the strategic, technical, and commercial assessment of classical and modern mobile networks, with a focus on dimensioning and network architecture. His current specialty is the creation of a cross-technology tool for the efficient planning of core parameters and roll-out strategies in relation to business case calculations.
Dr Daniel Henkel is a senior consultant with Detecon's ICT Product Innovation Group. His expertise and interests lie in wireless networking, new product creation, and business model generation with a focus on Africa and the Middle East. He has eight years of experience in the telecommunications industry, and as a PMI-certified project manager he has led culturally diverse teams in solving technological as well as business problems. As a researcher, Dr Henkel investigated wireless mesh networks and optimized network and protocol design to implement delay and fault tolerance.
While more and more capacity is needed to satisfy demand, the newest technology makes infrastructure expansion harder to design: capacity and coverage depend on each other, and the distance from the base station affects the client's throughput. If demand is accurately measured in time and space, large coverage cells with low usage will not be over-provisioned, while cells with high usage will be enhanced. To do this, geo-marketing systems that measure usage and behaviour, combined with network performance data, can produce differentiated KPIs (Key Performance Indicators) to guide infrastructure deployment plans and deliver higher user satisfaction.
The provisioning of high-quality mobile telecommunications services has become more difficult and expensive in recent years. A study by Juniper Research suggests that a mobile network operator's costs will exceed revenues by 2015 due to increasing network build-out needs [1]. Data services are placing an increasing strain on the network, and in addition to service accessibility and voice quality, the overall service perception is also determined by the speed with which users can access bandwidth-hungry applications. Here the mobile network's capacity plays an important role.
In telecommunications, engineers regard capacity as a measure of the user data volume that can be transported per time interval. Depending on the interval's length, it can be expressed as volume per hour or as volume per second, also called throughput.
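As a minimal illustration of this definition (the function name and figures are ours, not from a standard), the two units can be converted into one another:

```python
def volume_to_throughput_mbps(volume_mb: float, interval_s: float) -> float:
    """Convert a data volume carried in an interval into average throughput.

    volume_mb : transported volume in megabytes
    interval_s: length of the measurement interval in seconds
    returns   : average throughput in Mbit/s
    """
    return volume_mb * 8 / interval_s

# A cell carrying 450 MB in one busy hour averages 1 Mbit/s:
print(volume_to_throughput_mbps(450, 3600))  # 1.0
```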
The customers' perception of the network's capacity is completely different and depends on the user's particular application: services with sporadic data access, like web browsing, require a high peak throughput, while services with a continuous data flow, like video streaming, require a lower bandwidth but transfer a large volume per session. Thus, the perceived service quality is determined by both capacity features described above – which are obviously key to customer satisfaction.
Many operators claim to have insufficient network capacity. Once customers started to adopt smartphones and began to use mobile internet services, the traffic on 3G networks grew dramatically, driving the deployed capacity close to exhaustion. Additionally, driven by marketing, higher and higher throughputs are advertised as a proxy for superior service quality.
The typical engineering response to this situation is to provide additional infrastructure deployed in a proactive manner: additional carrier resources, new software releases allowing better utilisation of the frequency spectrum, and ultimately the next-generation mobile technology, LTE (Long Term Evolution). These are all planned for deployment, but in each case they are planned for the whole network so as to be prepared for the unknown.
However, there is one more issue: the budget. In contrast to the golden age of telecommunications, budgets for such infrastructure expansions are nowadays sparse or non-existent. So a prioritization of the expansion needs is required, based on the Pareto principle: try to solve most of the problem with a fraction of the budget. In other words, only upgrade that part of the network that experiences the highest demand!
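This Pareto-style prioritisation can be sketched as a simple greedy allocation: rank cells by forecast demand growth and spend the limited budget from the top. The function, the cell names, and the flat per-cell upgrade cost are illustrative assumptions, not the operator's actual method:

```python
def prioritise_upgrades(cell_demand_growth, budget, cost_per_upgrade=1):
    """Greedily pick cells for upgrade, highest forecast demand first,
    until the budget runs out. Flat per-cell cost is an assumption."""
    ranked = sorted(cell_demand_growth.items(),
                    key=lambda kv: kv[1], reverse=True)
    upgrades = []
    for cell, growth in ranked:
        if budget < cost_per_upgrade:
            break
        upgrades.append(cell)
        budget -= cost_per_upgrade
    return upgrades

demand = {"A": 0.9, "B": 0.1, "C": 0.7, "D": 0.3}
print(prioritise_upgrades(demand, budget=2))  # ['A', 'C']
```

With budget for two upgrades, only the two highest-demand cells are touched – the rest of the network is deliberately left alone.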
This sounds easy, but how do we know where the demand is? This simple question often cannot be answered, so network-wide strategies are applied in order to somehow cover the growing demand everywhere. Moreover, how do we know when demand will exceed capacity? Of course, many operators have a multitude of KPIs, tools, and measurement systems. But only through understanding the correlations of these KPIs and measurements can the operator get a true overview of the current resource utilization.
Theoretically this is not ‘rocket science’, but it is obviously difficult enough for operators. Why is that?
Some technical basics
If we recall good old GSM, network planning was quite simple: first, deploy sufficient coverage, then monitor the demand growth and expand when necessary. UMTS, however, introduced a new complexity to this process: now, cell coverage and capacity depend on each other. Cells that cover large areas have a lower capacity compared to hot-spot cells with their restricted coverage. Today, the situation is even worse. In HSPA/LTE, not only do coverage and capacity depend on each other, but the client's throughput also depends on his distance from the base station. What are the implications of these factors for an operator's network build-out?
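The distance effect can be made concrete with a simple textbook model: received power falls off with a log-distance path loss, and the achievable rate follows the Shannon capacity of the resulting signal-to-noise ratio. All parameter values below (bandwidth, transmit power, noise floor, path-loss exponent) are illustrative assumptions, not figures from any specific network:

```python
import math

def throughput_mbps(distance_m: float,
                    bandwidth_hz: float = 5e6,
                    tx_power_dbm: float = 43.0,
                    noise_dbm: float = -100.0,
                    path_loss_exp: float = 3.5,
                    pl_1m_db: float = 40.0) -> float:
    """Illustrative Shannon-capacity estimate for a single user.

    A log-distance path-loss model reduces the received power as the
    user moves away from the base station, which in turn shrinks the
    achievable data rate. Parameter values are assumptions.
    """
    path_loss_db = pl_1m_db + 10 * path_loss_exp * math.log10(distance_m)
    snr_db = tx_power_dbm - path_loss_db - noise_dbm
    snr_linear = 10 ** (snr_db / 10)
    return bandwidth_hz * math.log2(1 + snr_linear) / 1e6

# The achievable rate drops monotonically with distance:
for d in (100, 500, 1000):
    print(d, round(throughput_mbps(d), 1))
```

A user at the cell edge may thus see only a small fraction of the rate available close to the antenna, which is exactly why coverage, capacity, and throughput can no longer be planned independently.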
Rollout planning has to consider the complex correlations of coverage, capacity, and throughput – before the deployment of any infrastructure! This means, on the one hand, that an operator clearly has to define its coverage and capacity requirements as well as its throughput targets. On the other hand, optimal throughput, capacity, and coverage can only be achieved if we know where the demand will be (see Figure 1). But who can predict the customer's future demand for the operator?
Figure 1: From fixed capacity and throughput to variable
About crystal balls and other forecasting instruments
If we talk about demand forecasts, then we have to understand the main drivers for all of the involved parties. Marketing managers are typically revenue-driven, so they are interested in the development of subscription packages, especially their pricing. They base their choices on forecasts from primary and secondary market research and, of course, on an analysis of their current customer base. The result is a more or less accurate demand forecast on a country-wide level, predicting the growth of subscriber numbers.
From a technical point of view, we have to know the spatial and temporal distribution of demand as accurately as possible. For technicians, the data volume to be transported in the busy hour is of more importance than the number of subscribers. What matters is whether a subscriber consumes 5 MB or 500 MB of his monthly 1 GB plan. Both views now have to be combined – and that is typically where the problems start.
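Combining the two views can be sketched in a few lines: take the marketing-style inputs (subscriber count, monthly volume per subscriber) and translate them into the engineering quantity that matters, the busy-hour load. The busy-hour share of daily traffic used here (8%) is an illustrative assumption, not a standard value:

```python
def busy_hour_load_mbps(subscribers: int,
                        monthly_volume_mb_per_sub: float,
                        busy_hour_share: float = 0.08,
                        days_per_month: int = 30) -> float:
    """Translate a marketing forecast into busy-hour throughput demand.

    busy_hour_share: fraction of a day's traffic falling into the busy
    hour (8% here is an illustrative assumption).
    Returns the aggregate busy-hour load in Mbit/s.
    """
    daily_mb = monthly_volume_mb_per_sub / days_per_month
    busy_hour_mb = daily_mb * busy_hour_share * subscribers
    return busy_hour_mb * 8 / 3600  # MB -> Mbit, hour -> seconds

# 1000 subscribers in one cell: 500 MB/month vs. 5 MB/month each
print(round(busy_hour_load_mbps(1000, 500), 2))
print(round(busy_hour_load_mbps(1000, 5), 4))
```

The same subscriber count produces a hundredfold difference in busy-hour load, which is exactly why subscriber numbers alone are useless for dimensioning.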
In addition to this, technical departments have a lot of data that is collected continuously by various measurement systems and that is used for the planning, optimization and operation of the network. Surprisingly, this data is rarely used to learn something about the customers and their demands, in contrast with web players such as Google who try to collect similar data to characterize their users in as much detail as possible.
The secret is to correlate all of the available data sources by means of a systematic data mining approach to derive information about the customers and their spatial and temporal demand distribution. Even simple daily voice traffic profiles per cell permit one to identify the dominant human behaviour in that area. More complex analysis can identify demand distributions with a level of accuracy limited by the underlying geographical (spatial) data.
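As a minimal sketch of the idea that daily traffic profiles reveal dominant behaviour, a cell whose traffic peaks during office hours hints at a business area, while an evening peak hints at a residential one. The classification rule, the hour windows, and the synthetic profiles below are our assumptions for illustration:

```python
def classify_cell(profile_24h):
    """Classify a cell's area from its hourly traffic profile.

    profile_24h: 24 values of relative traffic per hour of day.
    The daytime/evening split and the rule are illustrative.
    """
    day = sum(profile_24h[9:17])       # 09:00-17:00
    evening = sum(profile_24h[18:23])  # 18:00-23:00
    return "business" if day > evening else "residential"

# Synthetic profiles: flat baseline with a bump in the respective hours
office = [1] * 24
for h in range(9, 17):
    office[h] = 5
home = [1] * 24
for h in range(18, 23):
    home[h] = 5

print(classify_cell(office))  # business
print(classify_cell(home))    # residential
```

Real data-mining approaches would of course use many more features and proper clustering, but even this crude rule shows how measurement data already sitting in the network characterizes the customers behind each cell.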
Predicting the future is like reading a crystal ball. However, a good starting point is detailed knowledge of the current customer base and its behaviour.
Figure 2: Sources of information for Data Mining
Demand explosion – bane or boon?
In the past, even the best demand predictions were often significantly exceeded (the iPhone effect), and, according to various press announcements, some operators were caught by surprise amid many customer complaints. How could this happen, when almost all operators claim to have all of the relevant KPIs (Key Performance Indicators) and corresponding measurement systems in place?
In a strictly technical sense, air interface capacity is typically expanded according to quality criteria such as dropped calls, blocked calls, and so on. This is a reactive approach, since a blocked call already means a negative customer perception and a loss of revenue. In contrast, a proactive approach provides for timely expansion according to the utilisation of capacity by the end user. The management of utilisation, and therefore of capacity, then focuses expansions on highly utilised areas, while large coverage cells should be equipped with minimal infrastructure.
A carrier can verify this by comparing its own utilisation distribution with the three curves in Figure 3. The closer the carrier's utilisation distribution is to the yellow line, the more random the capacity management probably is, while the blue and green curves give best-practice management examples for 2G and 3G. If a carrier operates close to the yellow curve, it over-provisions capacity in the blue area but under-provisions capacity in the red area. This means budget is wasted on pure coverage cells, while in areas with high demand the actual network quality is probably sub-optimal. Monitoring and controlling capacity utilisation will help a carrier to manage its network quality within budget constraints and to prepare for the (next) data explosion.
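The comparison behind Figure 3 can be approximated by sorting per-cell utilisation and checking which share of cells sits at the idle and the saturated extremes. The thresholds (20% and 80% utilisation) and the synthetic network below are illustrative assumptions:

```python
def provisioning_shares(cell_utilisation, low=0.2, high=0.8):
    """Fractions of cells that look over- resp. under-provisioned.

    Cells below the 'low' utilisation threshold are candidates for
    over-provisioning, cells above 'high' for under-provisioning.
    Thresholds are illustrative, not industry standards.
    """
    n = len(cell_utilisation)
    over = sum(1 for u in cell_utilisation if u < low) / n
    under = sum(1 for u in cell_utilisation if u > high) / n
    return over, under

# A 'random-management' network: many nearly idle and many saturated cells
cells = [0.05] * 40 + [0.5] * 20 + [0.95] * 40
over, under = provisioning_shares(cells)
print(over, under)  # 0.4 0.4
```

A distribution like this one, with 40% of cells nearly idle and 40% saturated, corresponds to operating close to the yellow curve: budget tied up where it is not needed, quality missing where it is.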
Figure 3: Utilisation and Optimisation Analysis
Manage your network quality – don’t be managed!
A systematic approach to demand-driven, proactive management of capacity, and therefore of network quality, can be established through the introduction of an enhanced capacity management system.
Such a system introduces a paradigm shift to the organization: a descriptive approach ensures that all strategic and operational decisions, as well as guidelines, are based on measurements fed back from the network and are continuously recalibrated. This is easier said than done, since a new mindset has to be adopted. However, once implemented, management has full control of expansion-related decisions, which is typically well received by investors.
Such a system consists of a formal process and an organizational framework that aligns marketing and technology functions. It also uses a geomarketing methodology that provides a spatially differentiated demand forecast and applies a hierarchical KPI framework that captures the relevant correlations. All of these ingredients are already available to a certain extent to operators; however, here they are fine-tuned and combined in a smart way. This enhanced capacity management in turn enhances the perceived network quality and opens the door to further business opportunities. Customer demand will no longer be the unknown factor.
[1] “Mobile Operator Business Models: Challenges, Opportunities & Adaptive Strategies, 2011-2016”, Juniper Research, May 2011