Michael E. Serrano, Senior Product Marketing Manager at NETSCOUT, gives his views on the challenges of virtualisation.
For the past few years, NFV and SDN have been positioned as a panacea for enterprises and network operators large and small. The power of virtualisation would remove costly, purpose-built hardware and replace it with commodity, commercial off-the-shelf (COTS) storage and compute. The proposition was easy to understand, and the benefits seemed obvious. After all, software is “always” cheaper and more flexible than hardware.
Then reality set in. Or, in Hype Cycle terms, we moved into the Trough of Disillusionment. Virtualisation projects were taking longer to deploy, performance was lacking, and software-defined networking wasn’t where everyone thought it would be. Is this a lesson that ‘all that glitters is not gold’, or is it something else?
As the industry moved to decouple hardware from software, a few other ‘minor’ movements added complexity and appeared to slow things down, at least in the short term. First came the realisation that software could be improved, and made more efficient, by moving to containers and microservices. Of course, this required developers to come up to speed on new software tools and architectures.
Second, there was the move to open source. This would allow the industry to leverage programming talent far beyond its current staff, both in terms of quantity and skill sets. Of course, with open source comes a movement away from standards, and standards have long been the hallmark of communications networks.
This brings us back to the current challenge of realising the benefits of virtualisation and moving forward to software-defined networks (SDN) and self-optimising networks (SON). Will we get there? When? Yes, we will ultimately have SDN and SON, and they will operate on virtual infrastructure. But we need to acknowledge that the problem as originally defined has become more complex. The additional challenges arise from the industry realising early on that old software techniques would not be sufficient for this new environment, and deciding to address them now.
With everything changing and no standards, how is anyone supposed to have confidence that these new networks are operating correctly? This is where visibility becomes imperative: visibility not only into the performance of virtual elements, but also into the interactions between virtual and physical elements, as well as applications. It is ‘seeing the forest for the trees’ and understanding the ecosystem.
For peace of mind and confidence that networks and services are performing correctly, operators and enterprises should work with partners who provide continuous visibility, as periodic or sampled visibility leaves the organisation prone to errors. Executives should feel confident to go boldly into this new world of networking, but that doesn’t mean they should go blindly.
About the Author:
Mike has over 20 years of experience in the communications industry. He is currently responsible for Service Provider Marketing at NETSCOUT. He began his career at PacBell (now part of AT&T), where he designed service plans for the business market and was responsible for demand analysis and modelling. His career continued with Lucent Technologies, where he brought to market the first mobile data service technology. At Alloptic, he was responsible for marketing the industry’s first EPON access solution and bringing to market the first RFoG solution. At O3b Networks, Mike headed up marketing, bringing to market the first MEO-based satellite constellation delivering internet service to the Other 3 Billion on the planet. Mike’s work continued at Cisco, where he helped to define MediaNet (Videoscape) and the network technology transformation for cable operators. Mike holds a B.S. in Information Resource Management from San Jose State University and an MBA from Santa Clara University.