From Packets to Insight: How Curated Network Data Powers AI
A service provider’s roadmap for meaningful AI outcomes
In modern telecommunications networks, especially those built on 3rd Generation Partnership Project (3GPP) standards, operators face the challenge of managing immense volumes of data generated by radio-access and core infrastructure. Some networks are already producing as many as 1 million transactions per second, creating a data environment far too large, unstructured, and fast-moving for direct ingestion by generative artificial intelligence (GenAI) systems.
As NETSCOUT’s Rick Fulwiler explained in an interview with Fierce Network TV (FNTV) during MWC 2026, the industry must abandon the assumption that “more data equals more intelligence.” Instead, meaningful AI outcomes depend not on raw packet capture but on carefully curated network data enriched with context, structure, and relationships.
Raw Packets Overwhelm GenAI
Raw packet streams pose several fundamental problems for AI models:
- Ultra-high volume: Telco-scale packet data is massive, continuous, and often too fast to be processed realistically, even by advanced AI systems. Without distillation, the data becomes computationally impractical to handle.
- Lack of structure: Raw packets are inherently low-context. They contain technical information, but not the higher-level insights an AI agent needs to understand events, user journeys, system behaviors, or anomalies.
- Expensive tokenization: Feeding these packets directly into a GenAI model can drastically inflate token counts. For communications service providers (CSPs), the associated costs make such an approach economically unrealistic.
- Hallucination vulnerability: Without proper context or relationship mapping, AI models may infer patterns that don’t exist, leading to inaccurate diagnoses, misleading recommendations, or incorrect security conclusions.
Given these limitations, simply feeding packet streams to an AI model and hoping for operational insight is both ineffective and cost-prohibitive.
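To make the scale concrete, the back-of-envelope sketch below estimates the daily token load of tokenizing raw transaction data directly. The 1 million transactions per second figure comes from this article; the bytes-per-transaction size, characters-per-token ratio, and per-token price are illustrative assumptions only, not measured values.

```python
# Rough estimate of the token load created by ingesting raw packet data
# directly into a GenAI model. Only the transactions/sec figure comes from
# the article; every other constant is an illustrative assumption.

TRANSACTIONS_PER_SEC = 1_000_000      # from the article
BYTES_PER_TRANSACTION = 500           # assumed: serialized headers + summary
CHARS_PER_TOKEN = 4                   # common rough ratio for English-like text
PRICE_PER_MILLION_TOKENS = 1.00       # hypothetical model price, USD

def daily_token_load(tps: int, bytes_per_txn: int) -> int:
    """Tokens per day if every transaction were serialized and tokenized."""
    chars_per_day = tps * bytes_per_txn * 86_400  # seconds in a day
    return chars_per_day // CHARS_PER_TOKEN

tokens = daily_token_load(TRANSACTIONS_PER_SEC, BYTES_PER_TRANSACTION)
cost = tokens / 1_000_000 * PRICE_PER_MILLION_TOKENS
print(f"{tokens:,} tokens/day  ->  ~${cost:,.0f}/day")
```

Even with conservative assumptions, the result lands in the trillions of tokens per day, which is why direct ingestion does not survive a cost calculation.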
A Curated Network Data Approach
Curated network data is not just a filtered subset of packet captures. Instead, it represents a transformed and enriched layer of packet-derived intelligence infused with contextual information.
With a curation process in place, service providers can harvest data that is more suitable for AI reasoning.
True curation enables the end-to-end stitching of information for context so AI can understand a subscriber’s full journey rather than viewing packets in isolation. By linking events into coherent sequences, the AI can interpret where a user is, what activities are underway, how those events relate, and where issues begin to surface.
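The stitching step above can be sketched in a few lines: group packet-derived events by subscriber and order them in time, so a model sees a coherent journey instead of isolated records. The field names (subscriber_id, event, cell) are illustrative, not a NETSCOUT schema.

```python
# Minimal sketch of stitching packet-derived events into per-subscriber
# journeys. Event fields are illustrative assumptions for this example.
from collections import defaultdict

events = [
    {"ts": 3, "subscriber_id": "sub-A", "event": "handover", "cell": "cell-7"},
    {"ts": 1, "subscriber_id": "sub-A", "event": "attach", "cell": "cell-3"},
    {"ts": 2, "subscriber_id": "sub-B", "event": "session_setup", "cell": "cell-9"},
    {"ts": 2, "subscriber_id": "sub-A", "event": "session_setup", "cell": "cell-3"},
]

def stitch_journeys(events):
    """Group events by subscriber, then order each group in time."""
    journeys = defaultdict(list)
    for e in events:
        journeys[e["subscriber_id"]].append(e)
    for seq in journeys.values():
        seq.sort(key=lambda e: e["ts"])
    return dict(journeys)

journeys = stitch_journeys(events)
print([e["event"] for e in journeys["sub-A"]])
# ['attach', 'session_setup', 'handover']
```

A model reading the stitched sequence can now ask where in the journey a failure first appeared, which is impossible when each packet arrives without its neighbors.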
Curation also enriches the data with meaningful metrics, key performance indicators (KPIs), and attributes that define entities such as subscribers, cells, and network nodes, as well as events such as handovers, session setups, or flooding, along with the relationships connecting them. This added intelligence gives AI the domain awareness it needs to reason accurately about operational or security conditions.
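A minimal sketch of that enrichment step follows: a raw event is joined with per-cell KPIs and topology relationships before it reaches a model. All field names and the record shape here are assumptions for illustration, not a standard or vendor schema.

```python
# Illustrative sketch of enriching a packet-derived event with entity
# attributes, KPIs, and relationships. Field names are assumptions.
from dataclasses import dataclass, field

@dataclass
class EnrichedEvent:
    event_type: str                 # e.g. "handover", "session_setup"
    subscriber_id: str
    cell_id: str
    kpis: dict = field(default_factory=dict)              # per-cell KPIs
    related_entities: list = field(default_factory=list)  # linked nodes

def enrich(raw: dict, kpi_store: dict, topology: dict) -> EnrichedEvent:
    """Attach KPIs and topology relationships to a raw event."""
    cell = raw["cell"]
    return EnrichedEvent(
        event_type=raw["event"],
        subscriber_id=raw["subscriber_id"],
        cell_id=cell,
        kpis=kpi_store.get(cell, {}),
        related_entities=topology.get(cell, []),
    )

raw = {"event": "handover", "subscriber_id": "sub-A", "cell": "cell-7"}
kpis = {"cell-7": {"handover_success_rate": 0.98}}
topology = {"cell-7": ["gnb-2", "upf-1"]}
print(enrich(raw, kpis, topology))
```

The point of the structure is that the model no longer has to infer what a cell is or how it relates to other nodes; the relationships arrive pre-attached.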
Finally, domain localization is applied to identify whether an issue originates in the radio access network (RAN), the core, or the transport network, enabling AI to isolate impact, determine causality, and provide more reliable recommendations for next steps.
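Domain localization can be pictured as a mapping from observed symptoms to the most likely domain. The toy rule set below is an assumption made for illustration; real localization logic is far richer, but the shape of the decision is the same.

```python
# Toy rule-based sketch of domain localization: deciding whether a fault
# most likely originates in the RAN, core, or transport network.
# Symptom names and rules are illustrative assumptions only.

def localize_domain(symptoms: set) -> str:
    """Map observed symptoms to the most likely network domain."""
    if symptoms & {"radio_link_failure", "handover_failure", "poor_rsrp"}:
        return "RAN"
    if symptoms & {"session_setup_failure", "auth_failure", "paging_failure"}:
        return "core"
    if symptoms & {"backhaul_packet_loss", "inter_site_latency_spike"}:
        return "transport"
    return "unknown"

print(localize_domain({"poor_rsrp", "handover_failure"}))  # RAN
print(localize_domain({"inter_site_latency_spike"}))       # transport
```

Once every curated record carries a domain label like this, an AI agent can scope its reasoning to the right slice of the network instead of searching everywhere.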
Curated Data Powers Assurance and Security
Once packet-level information is transformed, AI models can support high-value operational and security outcomes.
In service assurance, curated data makes it possible to understand a subscriber’s real-time location within the network; rapidly diagnose service degradation and failures; quickly determine whether issues stem from the RAN, core, or transport layers; and ultimately accelerate triage while improving the overall customer experience.
This level of session-aware visibility significantly reduces the time operators spend isolating root causes.
In network security, curation empowers AI to detect and analyze threats with far greater accuracy by identifying rogue or spoofed base stations, recognizing packet floods or distributed denial-of-service (DDoS)-like events from malicious actors, detecting SIM swap scenarios, and performing precision location analysis to uncover anomalies.
Because curated data embeds relationships and context, AI systems can distinguish between benign network irregularities and true security threats.
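As one concrete illustration of the flood-detection case above, the sketch below flags sources whose per-window packet rate exceeds a threshold, the kind of curated rate metric an AI layer would consume rather than the packets themselves. The threshold and record shape are illustrative assumptions.

```python
# Minimal sketch of flagging packet-flood / DDoS-like sources from
# curated per-source rate metrics. Threshold is an assumed example value.
from collections import Counter

def flag_flood_sources(packets, window_s: float, threshold_pps: float):
    """Return sources whose packet rate over the window exceeds the threshold."""
    counts = Counter(p["src"] for p in packets)
    return sorted(src for src, n in counts.items() if n / window_s > threshold_pps)

packets = [{"src": "10.0.0.9"}] * 5000 + [{"src": "10.0.0.2"}] * 40
print(flag_flood_sources(packets, window_s=1.0, threshold_pps=1000))
# ['10.0.0.9']
```

Fed this kind of pre-aggregated signal instead of raw packets, a model can reason about who is flooding and why, rather than rediscovering the aggregation itself.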
Economic and Risk-Reduction Benefits
Beyond performance, curated network data dramatically improves the cost profile of AI adoption for telecom operators. Raw packets generate enormous token loads when processed via GenAI models, making direct ingestion financially untenable. Curation reduces data volume while increasing informational density, meaning:
- Fewer tokens need to be processed
- Models have less “noise” to sift through
- The risk of hallucination is reduced
- AI recommendations become more consistent and reliable
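The token-reduction claim can be illustrated with a small comparison: a raw per-packet dump of a session versus a curated one-line summary of the same session. The 4-characters-per-token ratio is a rough assumption, not an exact tokenizer, and the record fields are invented for the example.

```python
# Illustrative comparison of token load: raw per-packet records versus a
# curated session summary. Chars-per-token ratio is a rough assumption.
import json

raw_events = [
    {"ts": i, "subscriber_id": "sub-A", "event": "data_packet", "bytes": 1400}
    for i in range(1000)
]
curated = {"subscriber_id": "sub-A", "event": "data_session",
           "packets": 1000, "total_bytes": 1_400_000, "status": "normal"}

def est_tokens(obj) -> int:
    """Very rough token estimate: serialized length / 4 chars per token."""
    return len(json.dumps(obj)) // 4

raw_tokens, curated_tokens = est_tokens(raw_events), est_tokens(curated)
print(raw_tokens, curated_tokens, f"~{raw_tokens // curated_tokens}x reduction")
```

Even in this toy case the curated summary is hundreds of times smaller, while arguably carrying more of what a model actually needs to know about the session.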
As Fulwiler emphasizes, carriers simply cannot afford to tokenize raw packet data at scale. Curation is therefore not just a technical necessity—it is an economic imperative.
Conclusion
In modern telecom networks, actionable AI cannot spring from raw packet streams. Effective AI for assurance and security requires curated, contextual, and enriched network data—the transformation of low-level packet information into high-value, structured intelligence. With data volume scaling rapidly across global networks, curated sources are essential to achieving accurate AI reasoning, manageable costs, and dependable operations at carrier scale.
For more information about how NETSCOUT is helping service providers operationalize AI data curation, watch the complete interview.