The Pros and Cons of Threat Analytics Location

Threat analytics is crucial for network security, but where you conduct it matters. Get the pros and cons of five primary locations.


In today’s dynamic threat landscape, organizations must implement resilient security measures to safeguard their operations. Central to this effort is threat analytics, a pivotal tool that empowers businesses to proactively identify and address potential risks.

Conducting threat analytics at different locations, such as in a security information and event management (SIEM) system, in the cloud, in a data lake, on endpoint agents, or at the source of network packet capture, offers unique advantages and considerations. This blog explores the pros and cons of five primary locations for threat analysis.
Threat Analytics in a SIEM

Pros
  • Centralized visibility. SIEM platforms aggregate and correlate data from multiple sources, providing a centralized view of security events and threats.
  • Correlation and contextualization. SIEM systems correlate logs, events, and alerts from various systems, enabling better threat detection by identifying patterns and relationships.
  • Efficient incident response. SIEMs offer workflow management and automation capabilities, streamlining incident response processes and improving efficiency.

Cons
  • Event-based analysis. SIEM systems rely primarily on event logs and alerts, potentially missing subtle or complex threats that may not generate explicit events.
  • Delayed detection. Analyzing events after they are logged introduces a time delay, which can allow threats to go undetected or cause a delay in response.
  • Data volume challenges. SIEM platforms may struggle to handle and analyze large volumes of data, leading to performance issues and potential data loss.
  • Limited investigation context. SIEM systems rely on alert and log data, making it difficult to understand what occurred before and after any given event. This can be overcome if the SIEM supports an interface into network security visibility solutions.
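To make the correlation idea concrete, here is a minimal sketch (illustrative only, not any specific SIEM product) of a rule that flags a burst of failed logins followed by a success from the same source:

```python
from collections import defaultdict
from datetime import datetime, timedelta

# Hypothetical, simplified event records; real SIEMs ingest normalized logs.
events = [
    {"src": "10.0.0.5", "type": "login_failure", "ts": datetime(2024, 1, 1, 9, 0, 0)},
    {"src": "10.0.0.5", "type": "login_failure", "ts": datetime(2024, 1, 1, 9, 0, 10)},
    {"src": "10.0.0.5", "type": "login_failure", "ts": datetime(2024, 1, 1, 9, 0, 20)},
    {"src": "10.0.0.5", "type": "login_success", "ts": datetime(2024, 1, 1, 9, 0, 30)},
    {"src": "10.0.0.9", "type": "login_failure", "ts": datetime(2024, 1, 1, 9, 1, 0)},
]

def correlate_brute_force(events, threshold=3, window=timedelta(minutes=5)):
    """Flag sources with >= threshold failures followed by a success in the window."""
    alerts = []
    failures = defaultdict(list)
    for e in sorted(events, key=lambda e: e["ts"]):
        if e["type"] == "login_failure":
            failures[e["src"]].append(e["ts"])
        elif e["type"] == "login_success":
            recent = [t for t in failures[e["src"]] if e["ts"] - t <= window]
            if len(recent) >= threshold:
                alerts.append(e["src"])
    return alerts

print(correlate_brute_force(events))  # ['10.0.0.5']
```

Note that this rule fires only after the events are logged and processed, which is exactly the detection delay described above.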

Threat Analytics in the Cloud

Pros
  • Scalability and elasticity. Cloud-based threat analytics can scale dynamically to handle fluctuating data volumes and provide resources on demand.
  • Broad data accessibility. Cloud-based analytics enables access to diverse data sources, within both the cloud infrastructure and external sources, enhancing the depth of analysis.
  • Collaboration and integration. Cloud platforms facilitate seamless integration with other cloud services, allowing for enhanced collaboration, data sharing, and integration with external threat intelligence sources.

Cons
  • Data privacy and compliance. Storing and analyzing sensitive data in the cloud requires careful consideration of privacy regulations and compliance requirements.
  • Network dependencies. Reliance on network connectivity for data transmission to the cloud can introduce potential points of failure and latency issues.
  • Additional costs. Cloud-based threat analytics can lead to additional costs, including subscription fees, data storage charges, and data transmission expenses, which may increase with growing data volumes and infrastructure needs.
  • Vendor reliance. Organizations must trust the cloud provider’s security measures because the effectiveness of those measures directly impacts the reliability and security of the threat analytics infrastructure.
  • One-size-fits-all detection. Cloud solutions apply the same detection logic to data received from all of their customers. However, each customer has unique security challenges, which are difficult to address in a globally applied threat analysis model.

Threat Analytics in a Data Lake

Pros
  • Data flexibility. Data lakes can store vast amounts of structured and unstructured data, allowing for versatile threat analysis across various data types.
  • Advanced analytics capabilities. Data lakes support advanced analytics techniques, such as machine learning and artificial intelligence, for more accurate and proactive threat detection.
  • Long-term data retention. Data lakes offer scalable and cost-effective storage, enabling long-term retention of historical data for trend analysis and investigations.

Cons
  • Data complexity. Managing and organizing data within a data lake can be complex, requiring proper data governance and metadata management practices.
  • Skill and resource requirements. Extracting actionable insights from a data lake necessitates specialized skills and expertise in data analysis and manipulation.
  • Data lake storage costs. Storing large volumes of data in a data lake can lead to significant costs, especially as the amount of stored data increases over time.
  • Data quality and integration. Ensuring data quality and integration across different data sources can be challenging, potentially affecting the accuracy and reliability of threat analytics.
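As a minimal sketch of the kind of analysis that long-term retention enables (illustrative only; production data-lake pipelines typically use distributed frameworks and richer models), a simple z-score over historical daily outbound-traffic volumes can surface an anomalous day worth investigating:

```python
import statistics

# Hypothetical daily outbound-traffic volumes (GB) queried from a data lake.
history = [102, 98, 110, 105, 99, 101, 97, 104, 100, 103]
today = 180  # today's observed volume

mean = statistics.mean(history)
stdev = statistics.stdev(history)
z = (today - mean) / stdev  # distance from baseline, in standard deviations

if z > 3:
    print(f"Anomalous outbound volume: z-score {z:.1f}")
```

The value of the data lake here is the baseline itself: the longer and cleaner the retained history, the more trustworthy the anomaly score, which is why the data-quality and governance cons above matter so much.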

Threat Analytics on Endpoint Agents

Pros
  • Immediate threat detection. Endpoint agents offer real-time visibility into device activities, ensuring swift detection of potential threats and enabling quick response measures.
  • Rapid response. Detecting threats at the endpoint allows organizations to respond promptly, minimizing the window for potential damage and mitigating risks effectively.
  • Context-specific insights. Endpoint agents capture data directly from devices, providing detailed context about endpoint behaviors and enhancing the accuracy of threat analysis.

Cons
  • Limited scope. Endpoint analytics focus on the processes running on a given endpoint. They are not built to provide visibility into the enterprise at large and may miss network-based threats or attacks targeting other components of the IT infrastructure, limiting overall threat visibility.
  • Evasion techniques. Sophisticated attackers can bypass endpoint security measures by using evasion techniques, reducing detection effectiveness.
  • False positives and negatives. Endpoint agents can generate false alerts or miss actual threats, leading to operational inefficiencies and potential security gaps.
  • Not deployable on all endpoints. Most enterprises have components on which endpoint technology cannot be deployed. Examples include Internet of Things (IoT) and operational technology (OT) devices, legacy computer systems, and certain non-Windows operating systems. These components are still part of the attack surface and must be monitored with other forms of security technology.

Threat Analytics at the Source of Network Packet Capture

Pros
  • Real-time analysis. Conducting analytics at the source allows for immediate detection and response to network threats, minimizing potential damage.
  • Granular visibility. By analyzing network packets directly, you gain comprehensive insights into network traffic, enabling accurate threat detection and forensic analysis.
  • Reduced latency. Analyzing data at the source minimizes network latency because it avoids data transmission to external locations.

Cons
  • Resource intensive. Analyzing network packets in real time requires robust hardware and software infrastructure, potentially increasing costs.
  • Limited scalability. Enterprises conducting analytics at the source may face scalability limitations in handling large volumes of data, especially in high-traffic networks.
  • Complex implementation. Setting up and managing packet capture and analysis systems requires specialized knowledge and expertise.
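To make the idea of turning raw packets into searchable metadata concrete, here is a toy sketch using only Python's standard library (not NETSCOUT's ASI technology): it parses the fixed 20-byte IPv4 header of a captured packet into fields an analyst could query:

```python
import struct
from ipaddress import IPv4Address

def parse_ipv4_header(packet: bytes) -> dict:
    """Extract basic metadata from the first 20 bytes of an IPv4 packet."""
    version_ihl, tos, total_len, ident, flags_frag, ttl, proto, checksum, src, dst = \
        struct.unpack("!BBHHHBBH4s4s", packet[:20])
    return {
        "version": version_ihl >> 4,
        "header_len": (version_ihl & 0x0F) * 4,  # IHL is counted in 32-bit words
        "total_len": total_len,
        "ttl": ttl,
        "protocol": proto,  # 6 = TCP, 17 = UDP
        "src": str(IPv4Address(src)),
        "dst": str(IPv4Address(dst)),
    }

# A hand-built IPv4/TCP header: 10.0.0.1 -> 192.168.1.10, TTL 64, zeroed checksum
raw = bytes.fromhex("45000034abcd40004006") + b"\x00\x00" + \
      bytes([10, 0, 0, 1]) + bytes([192, 168, 1, 10])
meta = parse_ipv4_header(raw)
print(meta["src"], "->", meta["dst"], "proto", meta["protocol"])
```

Even this toy version hints at the resource-intensity con: doing this for every packet on a high-traffic network, across layers 2 through 7, demands purpose-built hardware and software.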

Choosing the most suitable location for threat analytics depends on factors such as real-time requirements, data volume, scalability needs, data privacy considerations, and available resources. Each location has its pros and cons, and organizations must assess their specific requirements to determine the best approach or a combination of approaches that align with their security objectives and operational capabilities.


At NETSCOUT, we believe threat analytics at the source of packet capture is the best option and the cons of this approach can be overcome. It’s true that analyzing network packets in real time is hard to accomplish, but with the right technology, you can meet this challenge. NETSCOUT’s Omnis CyberStream network sensors use patented and proven deep packet inspection and NETSCOUT’s Adaptive Service Intelligence (ASI) technology to transform raw packets in real time at unmatched speed and unlimited scale into a rich source of locally stored and actionable layer 2-7 metadata we call Smart Data.

Smart Data uses compression routines to consume less storage capacity while supporting longer-term historical investigation, resulting in lower operational costs and more efficient investigations. Smart Data is a critical piece of NETSCOUT’s Visibility Without Borders platform, which features an infinitely scalable architecture capable of expanding into any environment, any cloud, any enterprise, any application, and any service. The patented technology at the core of our platform is the only way to truly bring together the massive quantities of data that the world's largest and most complex networks demand.

To learn more about threat analytics at the source of packet capture, read our threat analytics blog, where you’ll also learn how NETSCOUT Omnis Cyber Intelligence (OCI) uses threat analytics at the source of packet capture to provide multidimensional threat detection methods such as indicators of compromise (IOCs), compliance policy violations, Suricata-based signatures, unexpected traffic, and behavior analysis to ensure comprehensive network security coverage.
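For readers unfamiliar with Suricata-based signatures, one of the detection methods mentioned above, a rule pairs a protocol and traffic direction with content matchers and metadata. The following is a generic illustrative example, not a rule shipped by NETSCOUT:

```
alert http any any -> any any (msg:"Possible SQL injection attempt"; content:"UNION SELECT"; nocase; sid:1000001; rev:1;)
```

This rule alerts on any HTTP traffic, in either direction, containing the string "UNION SELECT" (case-insensitive), a common marker of SQL injection probing.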
