Image Source: 18th Century British shipping routes. Illustration: James Cheshire, Spatial Analysis
Small deviations can turn into big problems over time. That’s true whether you’re delivering business services across an IT infrastructure or delivering high-value cargo across oceans in merchant vessels. In her book Longitude, Dava Sobel describes how finding longitude is, at heart, a problem of timekeeping: a navigator at sea must know the time aboard ship and, at the very same moment, the time at the home port or another place of known longitude. Today, with orbiting satellites and GPS, a ship’s position can be fixed to within a few feet almost instantly. But there was a time when celestial charts were supposed to keep ships on course, and they failed miserably, with countless reports of ships crashing into rocks and of sailors and passengers living in misery as they wandered the seas. Then a mechanical genius, John Harrison, solved the greatest scientific problem of his time in the 1700s by inventing a portable precision timekeeper that worked anywhere in the world. By the early 1800s, more than 5,000 chronometers were in use aboard navy ships, merchant vessels, and yachts, and the marine chronometer was soon taken for granted. The transformation from star charts, octants, and lunar distances to chronometers as the method of choice for finding longitude changed commerce forever. See this video for a visualization of Spanish, Dutch, and British shipping patterns between 1750 and 1850, compressed into a single twelve-month span. It looks like the flow of IP packets between continents!
For businesses to succeed in the digital economy, they need a “service assurance GPS” for today and tomorrow, one that mitigates risk and helps control business outcomes. A Global Technology Industry Leader at PricewaterhouseCoopers explained that companies understand technology disruption is coming, but many don’t recognize that it is arriving faster than they expect. These companies need to manage complexity, design for large-scale traffic, and build for speed and agility. With digital disruption there are clearly winners and losers, which raises the question: which side are you on?
Source: Shodan.io map showing Internet-connected devices
Businesses are at different stages of technology investment. Some are starting to outsource cloud services, while others are in advanced stages of Big Data analytics and Unified Communications. Within the space of a few years, IoT and Industrial IoT networks will become very large and very complex IT infrastructures, supporting mission-critical systems and billions of devices. According to McKinsey & Co., during the next 10 years a “digital thread” will unleash a seamless flow of data across the value chain from product design to use.
While new business models and customer experiences ride on top of these digital transformation technologies and accelerators, the common foundation for everything is the IT infrastructure and the data that runs through it. That makes service performance problems both magnified and costly. According to an IHS survey, large enterprises lose over $60 million per year to information and communication technology downtime. A separate study from Network Computing, the Meta Group, and Contingency Planning Research found that 59% of S&P 500 companies experience 83 hours of downtime per year, at costs as high as $3 million per hour depending on the industry. Organizations depend on IT to deliver real-time, actionable insight into the connections between people, machines, data, and processes, so that they can optimize agility, assure service delivery, mitigate risk, and provide a feedback loop to operations, development, and business functions. But IT teams face a wide range of challenges as they compete in the digital economy and, in essence, try to keep their “ship from crashing into the rocks.” Three challenges tend to rise to the top of the list: (1) inadequate visibility into complex IT infrastructures; (2) inability to gain actionable insights from Big Data; and (3) IT silos and a lack of collaboration.
Meeting these challenges and navigating complex service delivery environments requires that IT teams use a “service assurance GPS” built on traffic data, along with complementary data sources such as synthetic transactions and xFlow, to get real-time actionable intelligence. With it, they can see the relationships and interdependencies among the different service components; quickly identify the root cause of problems and anomalous behavior; understand current capacity utilization and estimate future needs from past trends; and gain shared situational awareness across all business services.
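To make the capacity-planning part of that intelligence concrete, estimating future needs from past trends can be as simple as fitting a line through historical utilization samples and projecting when it crosses capacity. The sketch below is a hypothetical illustration only; the function names and the sample figures are assumptions, not part of any specific monitoring product:

```python
# Minimal sketch: trend-based capacity estimation from past utilization.
# A least-squares line is fit to periodic utilization samples (e.g. monthly
# link utilization in percent), then extrapolated to a capacity ceiling.

def fit_trend(samples):
    """Least-squares line through (index, value) points; returns (slope, intercept)."""
    n = len(samples)
    mean_x = (n - 1) / 2
    mean_y = sum(samples) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in enumerate(samples))
    var = sum((x - mean_x) ** 2 for x in range(n))
    slope = cov / var
    return slope, mean_y - slope * mean_x

def periods_until_full(samples, capacity_pct=100.0):
    """Project how many periods remain until utilization crosses capacity,
    assuming the linear trend in the samples continues to hold."""
    slope, intercept = fit_trend(samples)
    if slope <= 0:
        return None  # flat or declining trend: no projected exhaustion
    crossing = (capacity_pct - intercept) / slope  # index where the line hits capacity
    return crossing - (len(samples) - 1)           # periods beyond the last sample

# Illustrative history: monthly utilization (%) trending upward.
history = [52, 55, 59, 61, 66, 70]
print(round(periods_until_full(history), 1))  # → 8.6 months until projected 100%
```

A straight-line fit is the crudest possible forecast; in practice, seasonality and growth spurts argue for more robust models, but the same idea of turning historical traffic data into a forward-looking capacity signal applies.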
Getting clear insights, and avoiding “sailing blind,” is what it takes to cross the digital transformation chasm. Done right, the business benefits are huge: happy customers and revenue generation.