
Connect-World: the leading Telecom magazine, ICT magazine, Telecom magazine, ICT and Telecom Industry Press Releases and Blog.

 






Evolution of the Internet

Written by  Larry Morgan
Issue: Asia-Pacific III 2007
Article no.: 10
Topic: Evolution of the Internet
Author: Larry Morgan
Title: Managing Director
Organisation: Macquarie Telecom
PDF size: 328KB

About author

Larry Morgan is the Managing Director of Macquarie Telecom. He has more than 25 years of global telecommunications and international executive management experience. Prior to joining Macquarie Telecom, he served as President/CEO and member of the Board of Directors of Virtela Communications, an award-winning Global Virtual Network Operator. Mr Morgan also held a series of executive positions and board memberships at BT Infonet, where he served as President of EMEA and, earlier, as Vice President and General Manager of the Asia-Pacific region. Before joining Infonet, Mr Morgan held several senior management positions at Sprint and AT&T after building his career in sales with IBM. Mr Morgan is a frequent speaker at conferences and has appeared on CNBC television and in the Research Board, CommunicationsWeek International and Datacom Magazine. He has won many industry awards, including the ‘Best in Class’ Telemark Award, the Datacomm ‘Hot Products’ Award and the Interop Achievement Award. Mr Morgan earned a Bachelor of Science and a Master’s in Administration from Villanova University and has completed the Executive Management Programme at the University of Southern California.

 

Article abstract

Not available

 

Full Article

Nearly four decades ago, a revolutionary communication medium was beginning to take shape inside secretive US Department of Defense (DOD) research laboratories. Its mission: to enable machines to talk to other machines, and to connect a wide array of fixed and mobile military assets in order to provide a strategic battlefield advantage. It seems a little overdramatic now to think of the birth of the Internet in this way, but that is a testament to how quickly the Internet has outgrown science fiction, and how all-encompassing its application has become in today’s world.

The Internet

The evolution leading to the Internet began decades before. We can see its genealogy in the telegraph and in IBM’s Sabre system, the world’s first computer-aided airline reservation system, which began in the 1950s. Although these early networks performed well, they ran on dedicated communication lines in proprietary architectures and were therefore isolated from each other except where human intervention provided the linkage. What was missing from the network equation was the ability to bridge that gap automatically.

Until the early 1970s, bringing these separate networks together didn’t seem worthwhile. There just wasn’t a need for, say, toasters to speak with refrigerators in order for them to do their respective jobs. That changed when the DOD posed the question to its scientists: “How can we connect our assets so they can talk to each other and share information fast enough for the command staff to make educated decisions that will increase efficiency and maximize productivity?”

The dedicated networks of the day dealt with a specific type of information for a specific purpose. Telephone networks transmitted voice; data networks that pre-dated today’s local access networks allowed the transmission of data; radio and television networks were merely avenues for transmitting audio and visual content.
Unless existing infrastructure could be reused, the cost of building a dedicated network infrastructure would have been prohibitive, especially considering the vast distances to be covered. The copper network of the telephone system already covered a great area, and extending it to regions it did not yet reach was easy, but telephone networks carried voice, not images or documents.

Enter TCP/IP, the Transmission Control Protocol/Internet Protocol suite: a multi-layer communication protocol in which the lower layers handle the addressing and transport of information across the network, while the upper layers present that information to applications. Like the electronic equivalent of the Rosetta Stone, IP unlocks dedicated networks from their isolation and provides a common language for information commerce. This ability to bring different types of information together on the same network, such as transporting data over telephone lines, opened up a wealth of military applications, but civilian applications presented themselves almost as quickly and more readily.

The first nodes of the Internet were already live and talking to each other, but IP enabled its rapid growth throughout the 1980s by making interconnections between networks possible. In the civilian arena, the early Internet was used mainly by academia to pool resources and collaborate, creating a vast repository for research. Complex data could be shared across any distance at near-instantaneous speed - or as nearly instantaneous as a 300-baud modem permitted.

Thirty years on, the Internet has become one of the most critical communication tools available. IP creates an environment where different networks can talk to each other, share their silos of information, and support programmes that deliver any number of services, including such real-time applications as enterprise resource planning (ERP), Internet banking and VoIP.
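The point that IP carries any kind of information over the same connection can be illustrated with a small, self-contained Python sketch (purely illustrative, not from the article): a loopback TCP/IP echo service round-trips arbitrary bytes, whether they represent an e-mail message or digitised voice samples. The network does not care what the payload means.

```python
import socket
import threading

def echo_server(sock):
    """Accept one connection and echo back whatever bytes arrive."""
    conn, _ = sock.accept()
    with conn:
        while True:
            data = conn.recv(4096)
            if not data:
                break
            conn.sendall(data)

def round_trip(payload: bytes) -> bytes:
    """Send any payload over a TCP/IP socket and return the echo."""
    server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    server.bind(("127.0.0.1", 0))          # ephemeral loopback port
    server.listen(1)
    port = server.getsockname()[1]
    threading.Thread(target=echo_server, args=(server,), daemon=True).start()

    with socket.create_connection(("127.0.0.1", port)) as client:
        client.sendall(payload)
        client.shutdown(socket.SHUT_WR)    # signal end of data
        received = b""
        while chunk := client.recv(4096):
            received += chunk
    return received

# The same transport carries text and a stand-in for voice samples alike:
text  = round_trip(b"an e-mail message")
audio = round_trip(bytes(range(256)))
```

Real voice and video traffic would of course use purpose-built protocols on top of IP, but the principle is the same: one common carrier for every type of information.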
The convergence trend

Though the convergence of voice, video and data has been discussed for years, the convergent capabilities of IP are an integral part of its DNA. While it was commonly understood in the early days of the Internet that voice and video could be run as data over the copper of a telephone network, this was impractical then for a number of reasons. First, the few users were mainly institutional, an infrastructure for voice was already in place, and there was no suitable video content, so there was no need to use a data network for voice and video. Second, bandwidth limitations and costs would remain stumbling blocks for many more years.

At present, the situation is very different. There are over a billion Internet users today, including government, enterprise and home users. The cost of Internet access has fallen dramatically, the use of fibre and the availability of bandwidth have climbed exponentially, and the lower costs of hardware and software have contributed to a sharp increase in the adoption rate. These factors have accelerated the drive towards converged networks. Home users may not see the point in maintaining separate subscriptions, or using separate networks, for telephone, data and cable television services when options like VoIP and IPTV exist. Government and enterprise users appreciate both the cost savings on customer premises equipment and the convenience of the integrated workflows that convergence brings. It is this integration, or simplification, of processes that has generated much interest in the future of convergence.

The challenges of interconnection

Although the idea of convergence is fairly simple, it poses a number of challenges. While a converged network lowers the cost of running multiple, disparate networks, it places users in a particularly vulnerable position. Information flows between nodes and users through what can be likened to a fat, and very public, pipe.
There are many different types of traffic passing through this public pipe, so efficient and secure management of traffic becomes an issue, especially for enterprises that have no way of classifying or prioritising the information they send. In addition, some applications simply consume whatever bandwidth is available; this optimises the application but creates network stability issues. How, then, does one measure - and manage - the trade-offs between free and paid services, or between business and application objectives? With Class of Service (CoS) technology, traffic can be classified into similar types, with each class treated at its own level of service priority, enabling networks with growing volumes to be differentiated and managed more efficiently. This brings us to yet another step in the evolution of convergence - the Next-Generation Network.

Another major risk is that, with a public pipe in this un-policed wilderness, information can be hijacked, copied, mimicked or otherwise compromised. Attacking the information at its point of origin or termination - a personal computer or mobile device, for example - is considerably easier, and a very real threat to the security of personal information. The open nature of the Internet makes it easy for anyone to publish and access information, but there are few safeguards with regard to the quality of that information. Determined attackers have exploited, and will continue to exploit, the vulnerabilities of the World Wide Web to plant ‘Trojan horses’, ‘worms’ and other malicious programmes that can steal valuable data, or cripple a critical system and cause massive amounts of damage with relative ease.

The lack of redundant network capacity for backup can also pose a serious problem. Natural disasters can disrupt the undersea cables that carry IP traffic, creating information blackouts, as was witnessed during the 2004 Indian Ocean tsunami.
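The Class of Service classification described in this section can be sketched in Python. In IP networks, CoS-style classes are commonly mapped to Differentiated Services code points (DSCP), which occupy the upper six bits of the IP TOS byte so that routers along the path can prioritise accordingly. The class names below are illustrative assumptions; the code points themselves are standard DiffServ values.

```python
import socket

# Illustrative mapping of traffic classes to standard DSCP values:
#   EF   (46) - expedited forwarding, e.g. VoIP
#   AF31 (26) - assured forwarding, e.g. business data
#   BE   (0)  - best effort, e.g. bulk transfers
CLASS_TO_DSCP = {"voice": 46, "business-data": 26, "best-effort": 0}

def dscp_to_tos(dscp: int) -> int:
    """The DSCP occupies the upper six bits of the IP TOS byte."""
    return dscp << 2

def mark_socket(sock: socket.socket, traffic_class: str) -> int:
    """Mark outgoing packets so routers can classify and prioritise them."""
    tos = dscp_to_tos(CLASS_TO_DSCP[traffic_class])
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, tos)
    return tos

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
tos = mark_socket(sock, "voice")   # EF (46) -> TOS byte 0xB8
sock.close()
```

Marking is only half the story: the markings take effect only where routers and switches are configured to honour them, which is why CoS is typically offered as a managed-network service rather than something end users do alone.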
Unusually high traffic on a certain ‘pipe’ or network branch can also bring the transmission of information to a grinding halt. However, these risks can be mitigated and effectively managed. Enterprises need to recognise the detrimental effect these potential threats could have on the business. Resources should be allocated to implement well-thought-out IT policies and procedures, and networks should be carefully managed to reduce the impact of these risks. Security risks, for instance, can be minimised by the calculated deployment of Virtual Private Networks (VPNs) with encryption to protect transmissions. Regular patching, maintenance and the liberal use of virus-scanning software can blunt a hacker’s attack, while educating users about security best practices can help keep personal and corporate systems healthy. Load balancing and traffic prioritisation can also mitigate risks on converged networks that lack physical network redundancy; the open architecture of the Internet allows for dynamic ‘best-path’ routing of traffic in the case of more severe outages.

Overall, the merits of our interconnected world far outweigh the cost of failure. The Internet was conceived as a military advantage; today, though, we cannot be complacent - we need an appropriate defence for the Internet.
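The load-balancing and failover behaviour described above can be sketched with a minimal Python example (an illustration only; link names and the round-robin policy are assumptions, not the article’s design): traffic is spread across redundant links, and when one fails, the balancer routes around it.

```python
import itertools

class LinkBalancer:
    """Round-robin traffic across redundant links, skipping failed ones."""

    def __init__(self, links):
        self.links = list(links)
        self.healthy = set(self.links)
        self._cycle = itertools.cycle(self.links)

    def mark_down(self, link):
        self.healthy.discard(link)

    def mark_up(self, link):
        self.healthy.add(link)

    def next_link(self):
        """Return the next healthy link, or raise if every link has failed."""
        for _ in range(len(self.links)):
            link = next(self._cycle)
            if link in self.healthy:
                return link
        raise RuntimeError("no healthy links available")

lb = LinkBalancer(["undersea-cable", "satellite", "terrestrial-fibre"])
first = lb.next_link()            # normal operation
lb.mark_down("undersea-cable")    # e.g. a cable cut by a natural disaster
second = lb.next_link()           # traffic reroutes to a surviving link
```

Production networks achieve the same effect with routing protocols such as BGP rather than application code, but the principle - detect failure, shift traffic to what remains healthy - is the same.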
