Automated operations in modern, state-of-the-art telco networks, such as the 5G Core, O-RAN and SDN, all follow the same pattern. The main enabler is an entity that streamlines the collection and consumption of network state: the Network Data Analytics Function (NWDAF) in the 5G Core, the RAN Intelligent Controller (RIC) in O-RAN, and the SDN Controller in software-defined networks. Each of these entities applies a three-layered approach to network data monetization. While the functionality of these network state consumers is very similar, their implementations differ. The result is vendor-specific solutions, silos per network domain and, more importantly, less opportunity to monetize the potential of network data.

Identifying the three layers of complexity

Modern telco networks produce a lot of data. Each of the above-mentioned network entities first collects the network data and exposes it to interested network state consumers. These consumers are analytical modules that range from simple threshold-based algorithms that analyze the current network state to complex AI/ML-based algorithms that predict the future network state. Finally, based on the insights from these algorithms, different actions are pushed towards the network, depending on the implemented automation use cases, and thus the loop is closed.
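This collect–analyze–act loop can be sketched in a few lines of Python. Everything here is an illustrative assumption for the sketch: the `CellState` structure, the 0.85 utilization threshold and the scale-out action are invented for the example and are not part of any real NWDAF, RIC or SDN controller API.

```python
from dataclasses import dataclass

@dataclass
class CellState:
    cell_id: str
    utilization: float  # fraction of capacity in use, 0.0..1.0

def collect() -> list[CellState]:
    """Data layer: in a real network this would query live counters/KPIs."""
    return [CellState("cell-1", 0.35), CellState("cell-2", 0.92)]

def analyze(states: list[CellState], threshold: float = 0.85) -> list[CellState]:
    """Analytics layer: a simple threshold-based rule producing insights."""
    return [s for s in states if s.utilization > threshold]

def act(overloaded: list[CellState]) -> list[str]:
    """Automation layer: turn insights into actions pushed towards the network."""
    return [f"scale-out:{s.cell_id}" for s in overloaded]

# Close the loop once:
print(act(analyze(collect())))  # → ['scale-out:cell-2']
```

In a deployed system each function would be a separate component behind an open interface, so a threshold rule can later be swapped for an AI/ML model without touching the data or automation layers.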

Let us now take the example of a European CSP that has many ideas on how to improve its network operations. It has performed many proofs of concept (PoCs), all focusing on the analytics part of the model. The data and automation aspects were neglected in these PoCs, with sub-optimal results. Many use cases disappointed due to data limitations, and when the results were good, the required automation capabilities were not (yet) available, which significantly reduced the business benefits. In one case, an energy saving use case was not deployed because the required policy management function was not available.

IBM advocates disaggregating data and analytics functions and combining them with automation capabilities, as actionable data-driven analytical insights without automation do not deliver their full business value.

Here we identify three layers of complexity, each of which brings specific value: Data Democratization in the Data Layer, an open “plug & play” AI/ML module in the Analytics Layer, and finally the ability to act directly on actionable insights in the Automation Layer.

  1. Data Layer: collecting the data and making it accessible and understandable to all interested consumers.
  2. Analytics Layer: analyzing the data for the various Use Cases to provide actionable insights. 
  3. Automation Layer: acting upon the actionable insights in an automated way.

A lot of the industry hype today is around AI/ML and automation. However, when it comes to telco networks, the complexity of the Data Layer is often overlooked.

Figure 1 Three-Layered Model for Network Data Monetization

Data is the new natural resource; however, this does not mean that it is easy to use or to monetize. To get the maximum benefit from it, the right data must first be efficiently mined from its hiding places; only then can valuable insights be produced from it. Moreover, network data is generated all the time: each CSP has an abundant data source that never stops. Data mining is therefore the business of every CSP today. The question is: how do you ensure that your data mining is efficient and allows you to unlock the power of your data?

Use cases on how to get started

Automated operations have opened a world of possibilities for telco operators. The concept emerged both as a necessity, since the technological evolution towards disaggregation has enormously increased the complexity of modern telco network operations, and as the main enabler for the continued evolution of telco services towards SLAs never seen before, such as ultra-low latency or enhanced mobile broadband. From today’s perspective, more and more telco operations will be automated, as manual operations cannot cope with the challenges. However, it is not yet clear which use cases will bring the best monetary value to operators. We call this the business value hypothesis. Today we can distinguish two categories:

  • The obvious or foundational use cases, where the business value hypothesis is intuitively clear. Examples are energy saving, anomaly detection, automated root cause analysis and failure prediction. The value of these use cases has already been proven by others; the task at hand is to develop and deploy them in one’s own environment, which may not always be easy.
  • The less obvious or aspirational use cases, where the business value hypothesis is yet to be proven. Data from the network and customer domains, call data records and the transactions that traverse the network can be combined to arrive at novel, valuable insights, e.g., unsuspected correlations. We refer here to the ideas and internal gut feelings about combining network data with other data sources that may yield results, but whose exact value, and the monetary impact of the quality improvements achieved, have yet to be proven.
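To illustrate how simple a foundational use case can start out, here is a minimal anomaly check on a KPI time series using a z-score rule. The rule, the 2.0 threshold and the latency samples are illustrative assumptions for the sketch, not a production detector:

```python
import statistics

def detect_anomalies(values: list[float], z_threshold: float = 3.0) -> list[int]:
    """Flag the indices of points that deviate more than z_threshold
    standard deviations from the series mean (simple z-score rule)."""
    mean = statistics.fmean(values)
    stdev = statistics.stdev(values)
    return [i for i, v in enumerate(values)
            if abs(v - mean) > z_threshold * stdev]

# Hourly latency samples (ms) with one obvious spike at index 5.
latency_ms = [12, 11, 13, 12, 11, 95, 12, 13, 11, 12]
print(detect_anomalies(latency_ms, z_threshold=2.0))  # → [5]
```

Deploying even such a trivial rule at scale still depends on the data layer (clean, accessible KPI streams) and the automation layer (something to do once index 5 is flagged), which is exactly the point of the three-layered model.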

This still-existing uncertainty about the monetary value of data-driven analytics use cases in telco underscores the need for CSPs to host their very own analytics playgrounds. In fact, CSPs need an open platform in which they can easily develop and deploy their own analytical modules but also deploy third-party, best-of-breed analytical modules, as CSPs are unlikely to create all analytical models themselves.

This requires easy access to network data in an understandable form, to avoid losing significant time searching for the right network data sets. Having readily available data models can reduce the time and effort to implement use cases by 30 to 50%.

The first step towards monetizing your network data is to create a Value Tree. In this process, you identify the various categories of value and specify these categories in increasing detail. The Network Data Value Tree is not static; it continues to evolve as new ideas emerge and successfully implemented use cases spark new ideas.
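A value tree can be captured as a simple nested structure that is refined over time. The categories and use cases below are illustrative examples, not a prescribed taxonomy:

```python
# Hypothetical value tree: value categories refined into sub-categories
# and, at the leaves, concrete use cases.
value_tree = {
    "Network Data Value": {
        "Cost reduction": {
            "Energy saving": ["Shut down idle cells at night"],
            "Automated RCA": ["Correlate alarms to root cause"],
        },
        "Revenue protection": {
            "Failure prediction": ["Predict link degradation"],
        },
    }
}

def leaves(tree):
    """Walk the tree and yield the concrete use cases (the leaves)."""
    if isinstance(tree, list):
        yield from tree
    else:
        for subtree in tree.values():
            yield from leaves(subtree)

print(sorted(leaves(value_tree)))
```

As new ideas emerge, they are added as new branches or leaves, so the tree doubles as a living backlog of candidate use cases.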

Figure 2 Network Data Value Tree Example

When analyzing which use case to develop, it is important to assess the availability of the right data and what you want to do with the results of the analytics: what automation is required to act upon the newly obtained insights. Insights that cannot be acted upon have far less value than insights that can be acted upon in an automated way.

Want to learn more? Contact us at chris.van.maastricht@nl.ibm.com and Maja.Curic@ibm.com.
