How AI is Transforming the RAN With the Right Data

From hype to real-world deployment

Monica Paolini

Whatever you do, at work or in your personal life, you want AI to help you become more efficient, move faster, or reduce cost and effort. AI can deliver all this, but it can also backfire, creating security, reliability, and accuracy issues that are difficult to resolve.

The radio access network (RAN) is no exception. Operators want to benefit from AI in the RAN, but they have to move fast enough not to fall behind while remaining careful enough not to be overly aggressive.

It is a tough balancing act. In a recent Senza Fili webinar, AI in the RAN: A data-first path to full automation, Karsten Gaenger, Principal Product Line Manager for the RAN at NETSCOUT, drew on his experience working with operators to suggest a four-point strategy for integrating AI into the RAN. The approach starts with traditional AI/ML models and gradually integrates LLMs and AI agents. Its four points:

  • Data reliability: Success depends on high-quality, correlated, AI-normalized datasets
  • Gradual adoption path: Operators can start with 4G or 5G and move forward at a manageable pace that fits their needs
  • Benefit from a diverse ecosystem: APIs, telco-specific LLMs, and MCP give operators end-to-end, vendor-independent visibility across capacity, mobility, and services
  • Beyond KPIs: AI enables scalable root-cause analysis, prediction, and faster repair, with start-to-finish analysis and monitoring of procedures

Data relationships matter

To help service providers, NETSCOUT first had to go through the learning process of deploying AI in its own solutions.

“From the early adoption of ML, we learned that input datasets are crucial to success,” Karsten said. “We need correlated and AI-normalized data where the correct correlation is embedded in the data. If datasets are uncorrelated, even the most complex AI cannot make sense of the information and extract the relevant relationships between data points. With the right AI datasets, we can feed models efficiently and get the best return on our investment.”
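To make the idea of embedding correlation in the data concrete, here is a minimal, hypothetical sketch: instead of handing a model two disjoint tables, each training row joins the per-call RF context with the service outcome for the same call. All field names and values are illustrative, not an actual NETSCOUT schema.

```python
# Hypothetical sketch: field names and values are illustrative only.
rf_events = [
    {"call_id": "c1", "event": "handover", "rsrp_dbm": -95},
    {"call_id": "c2", "event": "setup", "rsrp_dbm": -80},
]
service_kpis = {
    "c1": {"latency_ms": 120, "throughput_mbps": 4.0},
    "c2": {"latency_ms": 35, "throughput_mbps": 48.0},
}

# Embed the correlation in the dataset itself: each row carries both
# the RF context and the service outcome for the same call.
normalized = [{**ev, **service_kpis[ev["call_id"]]} for ev in rf_events]

print(normalized[0])
# {'call_id': 'c1', 'event': 'handover', 'rsrp_dbm': -95,
#  'latency_ms': 120, 'throughput_mbps': 4.0}
```

A model trained on rows like these can relate radio conditions to service quality directly, without having to rediscover the call-level linkage on its own.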

Domain knowledge provides context

To make this approach more concrete, Karsten used the example of single-call analysis. “This requires state machine processes to analyze calls from start to finish and monitor all procedures, including handovers. You must understand what happened beforehand and what happens after a handover to see how it affects service quality. If you isolate data using only performance counters, the correlation is lost, and the outcome will be poor.”
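The state-machine idea can be sketched as follows. This is an illustrative toy, assuming a simplified set of call states and events (the real procedures and transitions in a RAN are far richer); events that do not match an allowed transition are flagged as anomalies, which is what lets the analysis see a handover in the context of what came before and after it.

```python
from dataclasses import dataclass, field

# Illustrative only: states, events, and transitions are a hypothetical
# simplification, not NETSCOUT's actual call model.
TRANSITIONS = {
    ("IDLE", "setup"): "CONNECTED",
    ("CONNECTED", "handover_start"): "HANDOVER",
    ("HANDOVER", "handover_complete"): "CONNECTED",
    ("CONNECTED", "release"): "ENDED",
}

@dataclass
class CallRecord:
    call_id: str
    state: str = "IDLE"
    log: list = field(default_factory=list)

    def apply(self, event: str) -> None:
        nxt = TRANSITIONS.get((self.state, event))
        if nxt is None:
            # An event with no valid transition marks a failed or
            # out-of-order procedure worth flagging for root-cause analysis.
            self.log.append((event, self.state, "anomaly"))
        else:
            self.log.append((event, self.state, nxt))
            self.state = nxt

call = CallRecord("call-42")
for ev in ["setup", "handover_start", "handover_complete", "release"]:
    call.apply(ev)
print(call.state)  # ENDED
```

Because the record keeps the full ordered log, a later handover failure can be traced back to the procedures that preceded it, which is exactly the context that isolated performance counters lose.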

Avoiding garbage-in-garbage-out pitfalls

But how do service providers know that they have the right data? Karsten told us that “this is an area where NETSCOUT concentrated early on. We saw that feeding industry data blindly into models resulted in a garbage-in-garbage-out outcome. To avoid this fate, we have worked for years on defining AI-normalized datasets to feed dedicated modules and agents. This helps RAN, device, performance, and optimization teams do their jobs more efficiently. With AI, ML, and the right datasets, we can address tasks that were previously impossible to do at scale.”

Datasets are focused on the user experience

“At NETSCOUT, we use a comprehensive view of the call from start to end, correlating the procedures and RF messages with service level insights to generate a real, holistic view of the network. From here, models can pinpoint root causes of service issues rather than just giving indications. They can also identify good neighbor relations and handover zones based on quality of service, latency, and throughput.”

“We have created AI-normalized datasets, used by agentic AI algorithms to generate outcomes for capacity, mobility, and services: three areas where user experience is key.”

Watch the recorded event.