The virtue in virtualisation
I recently read a memoir of childhood in the 1950s, with a description of being sent into exile: the mother gives the child a tearful farewell hug before the walk across the tarmac to the plane, and the child turns to wave goodbye.
Being allowed to walk freely to the runway for that final farewell is a far cry from today's high-security international flights. Without the added paperwork, delays at checkpoints, security scans and zigzag queuing, international travel was vastly more efficient for the traveller in those days.
Today we do have the technology to recreate such an efficient and open airport environment. With camera surveillance, automated facial recognition, tracking and messaging to cell phones – it would even be possible to implant identity chips and do away with passports – nearly all the walls, barriers and checkpoints could go, leaving passengers and friends to walk freely without any queues, says Eric Hutchinson, CEO at Spirent Communications.
There would still be a need to check hand baggage before boarding, but that would only require a message to the passenger's mobile phone: "You are expected at Scan point C in 5 minutes. It is a one-minute walk from your present position."
It could be a flexible and efficient solution – making air travel a joy for the traveller. But it would demand a seismic shift in thinking: with no clearly defined passenger streams punctuated with physical checkpoints, how could anyone guarantee security, manage throughput, or respond to emergencies? Government would never allow it – and yet something quite similar is already happening to our data networks.
The traditional data network consists of cable routes linking switches and routers that serve as checkpoints for monitoring and directing which signals go where – together with security points to scan, allow or block traffic according to its legitimacy.
But the advent of “software defined networking” (SDN) and “network functions virtualisation” (NFV), while keeping the same cables in place, is transforming that rigid network structure into a dynamic and flexible data environment. To understand how this is possible, go back to the airport model.
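The core idea behind SDN can be made concrete with a minimal sketch. This is purely illustrative, not any vendor's real controller API: the point is that forwarding policy moves out of each individual switch into a central software layer, so reshaping the network becomes a software update rather than a rewiring. All class and rule names here are hypothetical.

```python
# Illustrative sketch of the SDN idea: a central controller holds the
# forwarding policy, while switches only apply the rules pushed to them.
# Hypothetical names throughout; not a real controller API.

class Switch:
    """Physical layer: forwards traffic according to installed rules."""
    def __init__(self, name):
        self.name = name
        self.flow_table = {}          # destination address -> output port

    def forward(self, dest):
        # The switch itself makes no policy decisions.
        return self.flow_table.get(dest, "drop")


class Controller:
    """Control layer: decides, in software, which signals go where."""
    def __init__(self):
        self.switches = {}

    def register(self, switch):
        self.switches[switch.name] = switch

    def install_rule(self, switch_name, dest, port):
        # Changing the network is a rule update, not a cable change.
        self.switches[switch_name].flow_table[dest] = port


ctrl = Controller()
sw = Switch("edge-1")
ctrl.register(sw)
ctrl.install_rule("edge-1", "10.0.0.5", "port-2")

print(sw.forward("10.0.0.5"))   # port-2: rule installed by the controller
print(sw.forward("10.0.0.9"))   # drop: no rule, the switch stays passive
```

The same cables and switches stay in place; only the decision-making has moved into software, which is what makes the structure dynamic.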
Effectively, that futuristic airport consists of two layers: a physical space with real people walking through it, and a virtual model of that space where every person, their details and their position is mapped in real time. This concept is not difficult to grasp: long ago Plato proposed that the physical world we inhabit could be just a shadow cast by a more real, objective world lying outside our senses. The only question now is: which is the real world and which the shadow?
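The two-layer idea can also be sketched in code: a physical space in which people move, and a virtual model that mirrors each person's position as events arrive. This is a toy illustration with hypothetical names; a real system would be fed by cameras and sensors rather than direct calls.

```python
# Illustrative two-layer model: physical sightings feed a virtual map
# that mirrors every person's position in real time.
# Hypothetical names; real systems would consume sensor or camera feeds.

class VirtualModel:
    """The control layer: a live map of everyone in the physical space."""
    def __init__(self):
        self.positions = {}           # person id -> (x, y) coordinates

    def update(self, person_id, x, y):
        # Each physical movement is reflected here as it happens.
        self.positions[person_id] = (x, y)

    def locate(self, person_id):
        # Decisions (routing a passenger to Scan point C, say) are made
        # against this model, not against the physical space directly.
        return self.positions.get(person_id)


model = VirtualModel()
model.update("passenger-42", 12.0, 7.5)   # first camera sighting
model.update("passenger-42", 14.2, 7.9)   # moved towards the scan point

print(model.locate("passenger-42"))       # (14.2, 7.9): latest position
```

Whether the dictionary of positions is the "shadow" of the airport or the other way round is exactly the question the article goes on to pose.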
For the passenger there is little doubt about the reality of the airport and the weight of their luggage, so the control layer is just the shadow cast by the people and structures it maps. But for the airport controller it is not so obvious: the control layer is where all the decisions are being made, and the fate of those people is being determined.
Consider a game of chess: is it a struggle between white and black pieces of plastic on a board? Or is it more truly a struggle between two minds, where the pieces on the board are simply a reflection of that struggle? For the airport (or network) manager, the virtualisation could seem more real than the human pawns being manipulated across the physical space.
This could all be seen as a nice piece of ivory tower speculation, except that the virtualisation dilemma is spreading into everyday life and business. At the beginning of the last century you might say the proprietor was the business: even when a deal was made by phone over thousands of miles, there was a one-to-one relationship between the proprietor’s decision and the outcome.
Compare that with a president today: I describe a national example, though something similar could soon apply to a corporate president. The president is going to make a state of the nation address to a select audience and it will be widely broadcast. The president has decided what needs to be said, but it has already been slightly modified by the PR speechwriters.
The president was looking a bit pale, but the make-up artists have sorted that problem while the video team have also decided to adjust the colour profiles for a more healthy tanned complexion, and the audio engineers are fine tuning the frequency for a deeper, more authoritative voice pitch. Finally, most of the population will only see edited extracts from the speech: one edit for the business news, another for prime time viewers, and so on.
The speech is brilliant, and the voters are swayed. But who won the vote: the real live president, or the virtual image of the president that the voters experienced? If the real president suffered a sudden heart attack, how long could the digital records go on winning votes before the truth got out?
It is still natural for business to see itself in solid material terms. The reality is in the product, the manufacturing, the workforce, the channel partners and so on, while the data in the network is just a shadow world offering communication and keeping records.
It would be too far-fetched to suggest that the physical was less real than the data, but we are getting closer to parity, especially in the service industries. "All that's real is the bottom line!" declares the sceptic, but that bottom line is no longer a heap of gold; it is itself just a flow of data between banks.
A new mind-set is needed, one that better understands the relationship between the real and the virtual without downgrading either. On one hand there are those who would forbid an open airport as too risky, too open to abuse and impossible to control; on the other there are those who see that it would reduce frustration, save time and boost the economy. The truth lies between: it would throw up new problems at the same time as opening doors to potential solutions.
No-one understands the need for new thinking better than the networking industry driving this virtualisation trend. New industry bodies such as the Open Networking Foundation (ONF) and CloudEthernet Forum (CEF) are bringing together users, vendors, service providers and network testers – including many active competitors – to share challenges, develop new approaches and anticipate solutions.
There is a sense of urgency driven by the remarkable growth of cloud computing – such rapid progress always brings the risk of fragmentation or of commitment to technological dead ends. Vital questions include: how to manage a virtual environment, how to monitor its performance and how to make it secure.
Network virtualisation could seem a very abstract topic until one begins to question the reality behind today’s business: how much of an organisation’s value lies in its material goods and how much in pure data? To what extent can we say that the information flow through the corporate network is the “real” business now, while the departments and physical systems have become the pieces on the chessboard? In this perception of the virtual enterprise, what new opportunities and threats are emerging, and what new solutions are also on offer?
These are questions that should be openly and widely discussed. In the meantime an increasing number of large enterprises are contacting test and monitoring organisations with experience in testing virtual systems to find answers and seek advice: to what extent can they ensure security, reliability and service levels in a structure that is shifting, evolving and apparently without boundaries?
The author of this blog is Eric Hutchinson, CEO at Spirent Communications.
Comment on this article below or via Twitter: @VanillaPlus or @jcvplus