Knowing Your Environment - A Paradigm Shift is Needed
Monitoring Network Security of Large Volumes of Business Transactions

Many organizations don’t know what machines are allowed on their network, much less what applications are permitted to access the network from any given host. As a result, when a breach results in malicious software being installed inside the network, there are few means of detecting it. One statistic shows that attackers remain undetected on a network for an average of 206 days! Another shows that only 6% of enterprise breaches are self-detected; in the vast majority of cases, organizations find out from someone else that they have been hacked. Knowing your environment in order to better detect malicious exploitation can significantly improve these statistics. Since every action and transaction traverses the network, continuous monitoring of traffic-based data is not only the starting point but the foundation for a comprehensive real-time and historical view of all service components, including physical and virtual networks, n-tier applications, workloads, protocols, servers, databases, users, and devices.
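As a rough illustration of what “knowing your environment” can mean in practice, the sketch below builds a simple inventory of which hosts use which services, starting from network flow records. The record format, field names, and addresses are illustrative assumptions, not the schema of any particular flow collector or product.

from collections import defaultdict

# Hypothetical flow records, e.g. as exported by a flow collector (fields are assumed)
flows = [
    {"src": "10.0.1.5", "dst": "10.0.2.8", "dst_port": 443, "proto": "tcp"},
    {"src": "10.0.1.5", "dst": "10.0.3.9", "dst_port": 1433, "proto": "tcp"},
    {"src": "10.0.4.7", "dst": "10.0.2.8", "dst_port": 443, "proto": "tcp"},
]

def build_inventory(flow_records):
    """Map each observed host to the services (protocol, destination port) it talks to."""
    inventory = defaultdict(set)
    for flow in flow_records:
        inventory[flow["src"]].add((flow["proto"], flow["dst_port"]))
    return inventory

for host, services in build_inventory(flows).items():
    print(host, sorted(services))

Even this toy inventory makes the point: traffic observed on the wire, not asset spreadsheets, is what tells you which machines and applications are actually active in the environment.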
“If you know the enemy and know yourself, you need not fear the results of a hundred battles.” – Sun Tzu, The Art of War
Trying to stay ahead of every threat and to find every vulnerability are intractable goals. The adversary generally has more resources than the typical enterprise. They also have a significant advantage because this battleground is nearly infinite and constantly evolving, and adversaries do a better job of sharing and collaborating. In addition, many organizations are unable to update software rapidly for a variety of reasons. As a result, it is extremely difficult to detect breaches, especially when the adversary wishes to remain undetected. According to Verizon, 90% of known vulnerabilities exploited in an attack had patches available for at least six months before the breach, but the patches had not been applied.
Implementing security information and event management (SIEM), deep packet inspection (DPI), and intrusion detection/prevention systems (IDS/IPS) also helps, but these tools are often reactive, focused on known threats and vulnerabilities, and they require significant analytical resources to implement and deploy properly. On the other hand, knowing your network (both permissible use and acceptable business practice) through traffic-based analytics, while non-trivial, is an achievable goal. This approach puts the enterprise at an advantage because it provides visibility and deep insight into complex application and service delivery environments. Solutions that embrace traffic-based intelligence shrink the undetected battleground and make breach detection highly probable. It is also a proactive approach to defense that lets effort be focused where it is truly needed, significantly reducing false positives when examining anomalies. It also reduces the need to “data mine” voluminous log files (and that assumes we get the right tip-off that a compromise may be taking place, so that we know what to look for).
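To make the “permissible use” idea concrete, here is a minimal sketch of flagging flows that fall outside a per-host policy. The policy table, hosts, ports, and field names are hypothetical; a real deployment would derive the policy from an observed and reviewed baseline rather than hard-coding it.

# Per-host permissible-use policy: which (protocol, destination port) pairs each
# source host is expected to use. Entries here are illustrative assumptions.
ALLOWED = {
    "10.0.1.5": {("tcp", 443), ("tcp", 1433)},   # app server: HTTPS out, SQL to the DB tier
    "10.0.4.7": {("tcp", 443)},                  # workstation: HTTPS only
}

def policy_violations(flow_records, allowed=ALLOWED):
    """Yield flows whose (protocol, destination port) is not permitted for the source host."""
    for flow in flow_records:
        permitted = allowed.get(flow["src"], set())
        if (flow["proto"], flow["dst_port"]) not in permitted:
            yield flow

# Example: a workstation suddenly speaking SQL to the database tier is flagged.
suspect = {"src": "10.0.4.7", "dst": "10.0.3.9", "dst_port": 1433, "proto": "tcp"}
print(list(policy_violations([suspect])))

Because the check is anchored to what each host is supposed to do, the alerts it raises are policy violations rather than raw statistical anomalies, which is where the reduction in false positives comes from.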
A Paradigm Shift is Needed
While many would consider “knowing your environment” too big a challenge, the solutions that enable this knowledge have been evolving and can now be used as a foundation that allows IT teams to play offense rather than defense. Just as importantly, the foundation can be maintained when the inevitable change takes place. Threat intelligence can prevent many potential attacks, and vulnerability management best practices can reduce risk exposure. Despite these two best practices, however, attackers have an unending arsenal of vectors and zero-day vulnerabilities to exploit.
Organizations must continuously monitor, in real time, large volumes of transactions across the entire service delivery infrastructure. Monitoring under this new paradigm benefits from understanding what is permissible and acceptable on a per-host basis so that “untrustworthy” behaviors can be closely investigated; it is difficult to define abnormal when you don’t know what normal is. It is still a good idea to understand the threat (know your enemy) and patch as many vulnerabilities as possible. Combined with these best practices, this approach significantly raises the bar, making successful attacks more difficult while improving our ability to detect breaches as they happen.
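The sketch below illustrates the “define normal first” point: learn each host’s typical peers during a baseline window, then flag deviations as they occur. The flow fields, the simple two-phase learning switch, and the set-membership test are simplifying assumptions; a production system would use richer features, longer baselines, and statistical rather than exact-match comparisons.

from collections import defaultdict

class HostBaseline:
    """Toy illustration: learn each host's 'normal' peer/port pairs during a
    learning window, then flag anything new as untrustworthy for investigation."""

    def __init__(self):
        self.normal = defaultdict(set)
        self.learning = True

    def observe(self, flow):
        key = (flow["dst"], flow["dst_port"])
        if self.learning:
            self.normal[flow["src"]].add(key)
            return None
        if key not in self.normal[flow["src"]]:
            return f"untrustworthy: {flow['src']} -> {flow['dst']}:{flow['dst_port']}"
        return None

baseline = HostBaseline()
baseline.observe({"src": "10.0.1.5", "dst": "10.0.2.8", "dst_port": 443})
baseline.learning = False  # switch from learning 'normal' to detection
print(baseline.observe({"src": "10.0.1.5", "dst": "198.51.100.7", "dst_port": 4444}))

The value is in the ordering: only after “normal” has been captured on a per-host basis does real-time detection of “abnormal” become a tractable, low-noise exercise.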