
The frontier of the Internet

To keep the network from collapsing under the weight of ever-growing data traffic, the way it handles information must fundamentally change, argues the director of research at Bell Labs

Internet on mobile phones and tablets. Photo: Shutterstock

By the end of 2013, the number of smartphones, tablets and other gadgets connected to the Internet will exceed the number of people. Perhaps more important, the fast, powerful devices entering the market today produce and consume content in unprecedented quantities. According to a recent report by Cisco, which manufactures a significant share of the equipment that powers the Internet, the amount of mobile data worldwide grew by 70 percent in 2012. Yet the capacity of the global network infrastructure is finite, and many wonder when we will reach the upper limit, and what we will do when we get there.

There are, of course, ways to increase capacity, such as laying more cables, adding optical fibers and offloading traffic onto smaller satellite networks, but such steps only postpone the inevitable. The real solution is to make the overall infrastructure smarter. That requires two components: computers and other devices that can preprocess, and perhaps even filter or organize, content before sending it onto the network; and a network that better understands what to do with that content, instead of treating it indiscriminately as an endless, unsorted stream of bits and bytes.

To find out how these advanced elements might be realized, Scientific American interviewed Marcus Hoffman, director of Bell Research Laboratories in Holmdel, New Jersey, the research and development arm of Alcatel-Lucent that became famous, in its various incarnations, for developing the transistor, the laser, the charge-coupled device (CCD) and a host of other breakthrough technologies of the 20th century. Hoffman, who joined Bell Labs in 1998 after receiving his doctorate from the University of Karlsruhe in Germany, and his staff see "information networks" as the way forward, an approach that would increase the Internet's capacity by raising the network's "intelligence quotient." Here are edited excerpts from the interview.

 

Scientific American: How can we tell that we are approaching the capacity limit of the current communication infrastructure?

Hoffman: The signs are subtle, but they are there. Here is a personal example: when I use Skype to stream my children's hockey games live to my parents in Germany, the video sometimes freezes at the most exciting moments. It does not happen often, but lately it has been happening more and more, a sign that the networks are starting to buckle under the amount of information they are asked to carry.

We know that nature sets limits. There is only so much information that can be transmitted over a given communication channel. That constraint is called the "nonlinear Shannon limit" [after the mathematician Claude Shannon, who worked at Bell Telephone Laboratories], and it tells us how far we can go with the technologies we have. We are already quite close to this limit, within a factor of two or so. In other words, when the volume of traffic on the network doubles, something that could happen within the next four or five years, we will hit Shannon's limit. So we face a fundamental barrier here. We cannot stretch this limit any more than we can increase the speed of light. We must therefore work within these limits and still find ways to sustain the growth that is needed.
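The limit Hoffman describes is rooted in Shannon's channel-capacity theorem. In its classic form for a linear channel with additive white Gaussian noise (a simplification; the "nonlinear" version accounts for distortions in optical fiber):

```latex
% Shannon capacity: C = capacity (bits/s), B = bandwidth (Hz),
% S/N = signal-to-noise ratio of the channel.
C = B \log_2\!\left(1 + \frac{S}{N}\right)
```

In optical fiber, raising the signal power S to gain capacity also amplifies nonlinear distortion (the Kerr effect), so the achievable rate flattens and eventually falls; that is the nonlinear twist in the limit Hoffman refers to.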

How do you prevent the internet from reaching this limit?

The obvious way is to increase the bandwidth by adding fiber-optic cables. For example, instead of a single transatlantic fiber-optic cable, you can lay two cables, or five, or ten. This is the brute-force approach, and it is very expensive: you have to dig and lay the cable, and you need many amplifiers, optical transmitters and receivers, and so on. For this to be economically feasible, we will need not only to combine multiple channels in a single optical cable but also to combine multiple transmitters and receivers using new technologies, such as photonic integration. This approach is known as spatial division multiplexing.
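As a rough illustration of why operators pursue multiplexing rather than simply laying more cables: capacity scales multiplicatively with each axis of parallelism you add. A back-of-the-envelope sketch with purely illustrative numbers (not vendor specifications):

```python
# Illustrative numbers only: rates and channel counts here are assumptions.
per_channel_gbps = 100      # one wavelength carrying 100 Gb/s (a common coherent rate)
wavelengths = 96            # channels in a dense wavelength-division (WDM) grid
spatial_paths = 7           # hypothetical 7-core fiber (spatial division multiplexing)

aggregate_tbps = per_channel_gbps * wavelengths * spatial_paths / 1000
print(f"one fiber: {aggregate_tbps:.1f} Tb/s")   # 67.2 Tb/s
```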

But even so, reinforcing the existing infrastructure will not be enough to meet growing communication demands. What is required is an infrastructure that treats raw data differently: not just as a sequence of bits and bytes, but as pieces of information relevant to a particular person using a computer or smartphone. On a given day, do you want to know the temperature, wind speed and air pressure, or do you just want to know how to dress? This is what we call information networking.

What makes the "information network" different from today's Internet?

Many people see the Internet as a "dumb" network, a term I do not accept. What drove the Internet in the beginning was the sharing of documents and data, not in real time. The main requirement was resilience: the system was supposed to keep operating even if one or more of its nodes [computers, servers and so on] stopped functioning. Moreover, the network was designed to treat data purely as digital traffic, not to interpret what the data means.

Today we use the Internet in ways that demand real-time performance, such as watching live video or making phone calls. At the same time, we are generating far more data. The network must be more aware of the information it carries so that it can prioritize transmissions properly and operate efficiently. For example, if I am videoconferencing in my office and I turn my head to talk to someone who walks in, the conferencing system should know to pause the video stream until my attention returns to the screen, rather than waste bandwidth while I talk with my visitor.
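A minimal sketch of the behavior Hoffman describes, with a simulated attention detector standing in for a real gaze tracker (every name here is a hypothetical stand-in, not an existing API):

```python
import random

def viewer_is_attentive() -> bool:
    """Stand-in for a real gaze tracker; here, attentive ~70% of the time."""
    return random.random() < 0.7

def stream(frames: list[bytes]) -> int:
    """Send video frames only while the viewer is watching; return bytes sent."""
    sent = 0
    for frame in frames:
        if viewer_is_attentive():
            sent += len(frame)    # transmit the video frame
        # else: keep audio only; the video frame never reaches the network
    return sent

frames = [b"\x00" * 1500] * 100   # one hundred 1.5 kB video frames
print(f"sent {stream(frames)} of {len(frames) * 1500} bytes")
```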

How do you make the network more aware of the information it carries?

There are several approaches. If one wants to learn more about the data passing through the network, for example, to direct a user's request for a web page to the nearest server, software must "peek" inside the data packets, a technique known as deep packet inspection. Think of a letter you send by regular mail in an addressed envelope. The postal service does not care what the letter says; it is interested only in the address. That is how the Internet treats data today. With deep packet inspection, the software instructs the network to open the envelope and read at least part of the contents. But there is a limit to how much can be learned about the data this way, and it requires a great deal of processing power. And if the data in the packet is encrypted, deep inspection will not work at all.
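To make the envelope analogy concrete, here is a minimal sketch in Python. The packet is hand-built and the transport-layer header is omitted for brevity; routing needs only the "address on the envelope," while deep packet inspection opens it:

```python
import struct

def handle(packet: bytes) -> None:
    # Address-only view: the IPv4 header is the "envelope".
    ihl = (packet[0] & 0x0F) * 4                      # header length in bytes
    src, dst = struct.unpack_from("!4s4s", packet, 12)
    print("route:", ".".join(map(str, src)), "->", ".".join(map(str, dst)))

    # Deep packet inspection: open the envelope and read the contents.
    payload = packet[ihl:]
    if payload.startswith((b"GET ", b"POST ")):
        print("DPI: HTTP request ->", payload.split(b"\r\n", 1)[0].decode())
    else:
        print("DPI: cannot classify (e.g. encrypted payload)")

payload = b"GET /index.html HTTP/1.1\r\nHost: example.com\r\n\r\n"
header = struct.pack("!BBHHHBBH4s4s", 0x45, 0, 20 + len(payload), 0, 0,
                     64, 6, 0, bytes([192, 168, 0, 5]), bytes([93, 184, 216, 34]))
handle(header + payload)
```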

A better option is to label the data and give the network rules for handling different types of data. There could, for example, be a policy that video traffic takes priority over e-mail, without anyone disclosing exactly what is in the video or the e-mail. The network simply takes these tags into account when making routing decisions.
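A sketch of such a tag-based policy, with hypothetical tag names: the queue never looks at the payload, only at the tag attached to it.

```python
import heapq
from itertools import count

# Hypothetical policy: lower number = higher forwarding priority.
PRIORITY = {"live-video": 0, "voice": 0, "web": 1, "email": 2, "backup": 3}

class TagAwareQueue:
    """Forwards packets in tag-priority order, FIFO within each tag."""
    def __init__(self) -> None:
        self._heap: list = []
        self._seq = count()               # tie-breaker preserves arrival order
    def enqueue(self, tag: str, payload: bytes) -> None:
        heapq.heappush(self._heap, (PRIORITY.get(tag, 2), next(self._seq), payload))
    def dequeue(self) -> bytes:
        return heapq.heappop(self._heap)[2]

q = TagAwareQueue()
q.enqueue("email", b"weekly report")
q.enqueue("live-video", b"frame 1042")
print(q.dequeue())   # b'frame 1042': video jumps ahead without being inspected
```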

Data transmitted over the Internet already carries identifiers. Why can't those be used?

It all depends on the level at which the tags operate. Data packets that use the Internet Protocol, for example, include a header containing the source and destination addresses. You can think of these as tags, but the information they provide is very limited. They do not specify which website the user is requesting, nor whether the data belongs to a live video broadcast or can be processed in batch. I am talking about richer, higher-level tags, metadata that can characterize the data far better than the low-level tags can.
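The low-level tags Hoffman mentions do exist today: the IPv4 header's DSCP field lets a sender mark traffic as, say, latency-sensitive, though it says nothing about which site or application the data belongs to. A minimal sketch (works as written on Linux; other platforms may restrict setting this field):

```python
import socket

EF = 46   # "Expedited Forwarding": standard DSCP code point for low-latency traffic

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# DSCP occupies the top six bits of the old Type-of-Service byte.
sock.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, EF << 2)
sock.sendto(b"voice frame", ("203.0.113.7", 5004))   # 203.0.113.7: documentation address
```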

Wouldn't prioritizing traffic based on the information it contains make the network prefer certain content over other content?

It would be no different from what we already see today: when we are driving and hear an emergency vehicle approaching with its siren wailing, we are expected to pull over and clear the way so that it can pass as quickly as possible and perhaps save a human life. In this example the siren is the tag. As long as we know there is an emergency, we do not need to know who is in the vehicle or what the problem is; we simply act accordingly. Should certain Internet data packets get priority in an emergency? That is a question of transparency and agreed norms, on the roads and online alike.

Even if the improved network can move data more intelligently, the amount of content will keep growing exponentially. How do you reduce the amount of traffic the network must handle?

Our smartphones, computers and other gadgets generate enormous amounts of raw data, which are sent to data centers for processing and storage. In the future it will not be possible to handle all the world's data by shipping it to centralized data centers. Instead, we may move to a model in which decisions are made about data before it is ever put on the network. For example, if we have a security camera at an airport, we would program it, or a small server that controls several cameras, to perform facial recognition on the spot, against a database stored in the camera or the server, before uploading anything to the network.
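A sketch of that edge-processing idea, with a hypothetical signature matcher standing in for a real face-recognition model: the camera-side code uploads only frames that match the local database.

```python
# All names here are hypothetical stand-ins, not a real recognition API.
WATCHLIST = {"a3f1c9", "77e0d2"}                  # face signatures stored at the camera

def face_signature(frame: bytes) -> str:
    """Stand-in for on-device recognition; derives a signature from the frame."""
    return frame[:6].decode(errors="ignore")

def frames_to_upload(frames: list[bytes]) -> list[bytes]:
    """Keep only frames whose signature matches the local watchlist."""
    return [f for f in frames if face_signature(f) in WATCHLIST]

frames = [b"a3f1c9 frame 1", b"000000 frame 2", b"77e0d2 frame 3"]
print(f"uploading {len(frames_to_upload(frames))} of {len(frames)} frames")   # 2 of 3
```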

How does the information network handle the issue of privacy?

Today, privacy is binary: either you keep it, or you give it up almost entirely in exchange for personalized services, such as music recommendations or online coupons. We need an intermediate model that lets users control their own information.

The biggest problem is that such a model must be simple for the user. Think how complicated it is to manage privacy on social networks: you end up finding your photos in the online albums of people you do not know at all. There should be a digital equivalent of a faucet handle that can be turned to set the balance between privacy and personalization. The more we reveal about ourselves, the more closely tailored the services we receive. But we can also turn the handle the other way, providing less detailed information and still receiving personalized offers, just less focused ones.
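One way to picture the faucet handle is a single dial that controls how much of a profile is disclosed. The fields and levels below are illustrative only, not a real API:

```python
PROFILE = {"age": 34, "city": "Tel Aviv", "genres": ["jazz", "indie", "classical"]}

def share(profile: dict, dial: int) -> dict:
    """dial 0 shares nothing; each step up discloses more detail
    in exchange for better-targeted recommendations."""
    if dial <= 0:
        return {}
    if dial == 1:
        return {"genres": profile["genres"][:1]}                   # one coarse hint
    if dial == 2:
        return {"city": profile["city"], "genres": profile["genres"]}
    return dict(profile)                                           # full profile

print(share(PROFILE, 1))   # {'genres': ['jazz']}: more privacy, less focus
```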

Cyber attacks tend to exploit the openness of the Internet, and handling security ultimately falls to the computers and other devices connected to the network. What effect will information networking have on Internet security?

The information-network approach gives the Internet infrastructure as a whole a greater awareness of network traffic, which may help identify and neutralize certain kinds of cyber attacks. Other developments may disrupt such attacks as well. I expect, and hope, that data traffic will be increasingly encrypted, to help achieve real security and real privacy. Of course, once data is encrypted it is hard to extract any information from it. That is a research challenge that will require new encryption methods, ones that preserve confidentiality yet still allow certain mathematical operations on the encrypted information.

Imagine, for example, that the income data of every household in a certain area are encrypted and stored on a cloud server, so that no one but the authorized owner of the information can read the actual numbers. If the numbers are encrypted in a way that lets software running in the cloud calculate the average income in the area, without revealing any individual household's figures, simply by operating on the encrypted numbers, that could be of great benefit.
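The property Hoffman describes exists today in additively homomorphic schemes such as Paillier's. A toy sketch follows (tiny keys for demonstration only; real deployments use 2048-bit primes and a vetted library): the server multiplies ciphertexts, which adds the plaintexts, and only the key holder can decrypt the total.

```python
import math
import secrets

# --- toy Paillier key pair: the 9,999th and 10,000th primes (demo only) ---
p, q = 104_723, 104_729
n, n2 = p * q, (p * q) ** 2
lam = math.lcm(p - 1, q - 1)          # Carmichael's lambda for n = p*q
mu = pow(lam, -1, n)                  # modular inverse of lambda mod n

def encrypt(m: int) -> int:
    """Enc(m) = (1+n)^m * r^n mod n^2; additively homomorphic."""
    r = secrets.randbelow(n - 2) + 2
    while math.gcd(r, n) != 1:
        r = secrets.randbelow(n - 2) + 2
    return pow(1 + n, m, n2) * pow(r, n, n2) % n2

def decrypt(c: int) -> int:
    """Dec(c) = L(c^lam mod n^2) * mu mod n, where L(u) = (u-1)//n."""
    return (pow(c, lam, n2) - 1) // n * mu % n

incomes = [52_000, 61_500, 47_250, 88_000]        # hypothetical household incomes
ciphertexts = [encrypt(m) for m in incomes]
encrypted_sum = math.prod(ciphertexts) % n2       # the cloud adds without decrypting
assert decrypt(encrypted_sum) == sum(incomes)
print("average income:", decrypt(encrypted_sum) / len(incomes))
```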

Another approach might be to develop smart ways of managing encryption keys that allow sharing without compromising security. Done correctly, none of these steps would burden users. That is the key to success, and the challenge ahead. Consider how many people encrypt their e-mail today: almost no one, because it takes time and effort.

_______________________________________________________________________________________________________________________________________________________________

About the author

Larry Greenmeyer is an associate editor at Scientific American.

And more on the subject

Read an article by Marcus Hoffman on the need for "application aware" networks: http://tinyurl.com/cj25voa

In brief

Who

Marcus Hoffman

Occupation

Computer scientist and engineer

Where

Bell Research Laboratories, Holmdel, New Jersey

Research focus

Will smart communication networks help the Internet overcome its growing pains?

The context

The Internet and its infrastructure must adapt to the heavy data traffic generated by mobile devices and multimedia content.

The article was published with the permission of Scientific American Israel

