
Prof. Mark Horowitz of Stanford: "We must design the infrastructure of the Internet of Things so that it is secure and protects privacy from the ground up"

Prof. Horowitz said this at the ChipEx2015 conference. According to him, the IoT will start from adding functionality to the technology we already have, because we can and because it is cheap.

Prof. Mark Horowitz at the ChipEx 2014 conference. Photo: Kobi Kantor

Prof. Mark Horowitz, a senior researcher at Stanford University and one of the founders of the microelectronics company Rambus, opened his lecture at the ChipEx2015 conference by voicing his concern about data security: "What is disturbing is that the connected devices of the 'Internet of Things' will contain a lot of information about us and will control critical elements of our daily lives. We need to do something to prevent a security disaster, and ask the engineers to build secure systems from the ground up."

Today it is clear to everyone in the field that the IoT vision is coming true; this is a field we used to call sensor networks. It is not developing as quickly as we expected, but it is definitely heading there, so we had better be ready.

During his lecture he presented the good news, and some less good news, about the Internet of Things (IoT), and at the end he focused on the security threats it raises.

"The reason the IOT will thrive is related to some fundamental changes in the computing industry. We are moving from a focus on technology - with winning applications (Killer Apps) coming from companies that have access to the most advanced technology to an era where the products come from those who have a good idea to leverage technology better."

"This year we celebrated the 50th birthday of Moore's Law. The reason for its success is that each technological generation could enable the production of the previous chips cheaper, enabling the existence of chips for new applications. So the functionality you had last year just becomes cheaper this year.”

"Each generation enables 8 times more performance for a given power consumption. This gives us a promise that computing will continue to be cheaper and with much lower power consumption. For example when I started my studies, the Cray 1 system which was then the most powerful computer in the world required 100 kilowatts of energy. Today with 45 nanometer technology (which is also already obsolete) we put this processing power and even together with the memory into an area of ​​three square millimeters. It consumes less than one watt and operates 10 times faster.”

"The problem is that we thought it would continue this way - and it didn't. To illustrate this, let's see what happened to the clock speed in processors in the last 30 years, the CMOS generation. We saw that the speed of the clock increased steadily until suddenly in the middle of the previous decade it stabilized. The reason for this is not due to the industry's ability to continue increasing the speed of the clock, but because we reached the energy barrier beyond which it was no longer worthwhile to continue developing."

"The density per unit of power has also increased over the years, as required by Moore's Law, but in recent years this increase has slowed down. Somewhere when we reach milliwatts per cubic millimeter, everything will stop because the size of the DIE is fixed - 120 square millimeters. Thus we will reach chips that consume hundreds of milliwatts. There is a limit to how much you can cool the computer without the noise driving all your office neighbors crazy, or with a cell phone - it will heat up all the time, that's why we limit the energy consumption on a personal computer to 10 watts and on a phone to at least one watt."

"The reason why we are running into the wall of energy consumption is that we haven't really followed Moore's Law. In recent generations, we have not been able to drop below a certain threshold of energy consumption."

There is a physical limit on how far power consumption can be lowered: a transistor controls current by raising and lowering an energy barrier, and thermal physics limits how sharply that barrier can switch the current off. Below that point, leakage means we get higher, not lower, energy consumption per unit of chip area. This limits the supply voltage to about 0.6 volts.
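A rough way to see the voltage floor described here, using standard first-order formulas (a textbook sketch, not figures from the lecture): dynamic switching power falls quadratically with the supply voltage,

\[ P_{\text{dyn}} \approx \alpha\, C\, V_{dd}^{2}\, f, \]

but once the threshold voltage is lowered along with it, subthreshold leakage current grows exponentially,

\[ I_{\text{leak}} \propto e^{-V_{th}\, q / (n k T)}, \]

and at room temperature \(kT/q \approx 26\ \text{mV}\), which limits the subthreshold slope to roughly 60 mV per decade of current. Below a supply of roughly 0.5 to 0.6 volts, the growth in leakage energy outweighs the savings in switching energy, so energy per operation stops improving.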

This constant growth has survived several technology transitions, says Prof. Horowitz: "Computers were once mechanical, then we moved to vacuum tubes, and then to transistors. For the last 30 years the dominant technology has been CMOS. If it is no longer good enough, why not replace it with a new technology? I do not see a technology that will replace CMOS, if only because the investment in such a technology would be high and so would the risk of failure. In Silicon Valley, and in Israel too, if you do not ask for a lot of money there is always someone who will finance you. The problem is that a new production process will cost hundreds of millions or even a billion dollars, and with a billion-dollar investment people will want assurance that they will get their money back."

"This means that we will also rely on CMOS in the future. But there is also good news, even now we are not using all the capabilities of the chips, we can build anything at almost zero cost (of course, a considerable investment is required in the planning, but afterwards it is possible to produce cheaply). There are such great capabilities in computing and communication - a friend of mine built a Bluetooth radio system that sells for sixty cents, and it contains greater computing and memory capabilities than I had on my computer when I was an undergraduate student. The key question is how we can leverage this technology to improve the world and not so much if we can continue to improve the technology."

Cup holders

"An example of this: in the past when people would buy a car they would check the engine capabilities, and the number of doors when cars were expensive. After enough doors have been installed and the engines today are powerful enough, to entice people to buy this car and not another, simple things are put in the car, such as a cup holder as well as a GPS system and an elaborate entertainment system. It doesn't use the most sophisticated chips but solves a real problem. I jokingly say that we are in the age of the micro processor coffee cup holders. Microprocessors don't get more powerful but you buy them for the secondary features that aren't engineering complicated.

"The next generation of "coffee cup holders" will be the IOT things that you can link to the Internet such as photo albums for grandparents that will be constantly updated, a watch that measures body measurements, connecting all the computers in the car to one system that can be read and communicated with. "

"The IoT will start from adding functionality to the technology we already have, because we can and because it is cheap. It is not that we want to develop the Internet of Things; we want to build things, and we want them to be connected. Therefore the Internet of Things must be divided into areas such as industrial automation, the smart home, personal products, and network devices. There will be no single unified market for the Internet of Things."

"The good news - we will see a lot of amazing products connected to the Internet and they will require a lot of chips."

"The bad news is that there is a tower of Babel of standards, each manufacturer chooses its own standards for operating the device, communicating with it and connecting it to the cloud for data analysis. For example if you buy Philips smart bulbs you can talk to the Philips cloud service using the Philips app but can't link them to anything else. Ditto NEST.”

"The reason for this is the complexity of the IOT device - the MGC (eMbedded Communicate Gateway) architecture - the idea is that each system is a complex combination of an embedded system (the M in the word EMBEDDE replaces the E because it is difficult to pronounce the initials). They talk to the GATEWAY using short-range communication, and the GATEWAY is connected to the cloud. Today you have to write the application in the application, in the GATEWAY and in the cloud so that they work together, and this involves many languages ​​and a lot of effort.

The security risk

But that is not the worst part. The biggest danger is the security risk. "Because these systems are difficult to develop, those who develop them do not think about all aspects of security. When someone breaks into a PC and uses it to send spam it is disturbing; when your car or your house is hacked it will be far more painful, and possibly even fatal."

"A year ago HP bought ten IOT devices and tested them. Six of them had no protection, the updates are transmitted to them without encryption via the Internet and attacks of the old and familiar types can be used against them. Building a protection system for such devices is complex because of their distributed nature, the strong computing capacity they contain and the need to work in different languages ​​and different operating systems. It is also possible to attack the devices from different directions - from the cloud, from GateWay and more. "

"What is disturbing is that these devices will contain a lot of information about us and will control critical elements of our daily lives. We need to do something to prevent the security disaster, and ask the engineers to build secure systems from the ground up."

 

The article was first published on the Chiportal website, the Israeli chip industry portal.
