
To give the computer a sense of three-dimensional vision

So said Eyal Nagar, Director of Perceptual Computing at Intel, at the Chiportal Technology Symposium held yesterday in Herzliya and organized by the Chiportal website. He revealed details of the 3D camera that Intel is developing and the specifications of the SDK it will distribute to encourage the industry to create interesting solutions, as it did with 'regular' cameras when these began to be common in smartphones.

Eyal Nagar, director of perceptual computing at Intel, at the Chiportal Technology symposium. Photo: Ayelet Gerdman

Eyal Nagar plays a very interesting role. He tries not only to predict what the future of our computing will look like in the age of the "Internet of Things", but also to create it, as director of perceptual computing at Intel, a global role that he fills from Israel.

This is perhaps the first realization of the vision of perceptual computing that Intel has been promoting for the past three years. We are approaching an era in which the computer recognizes the user as he walks by, and reacts to his movements and moods. "We want to give the computer senses, eyes, ears, movement, touch, and to increase the brain capacity of the computer itself. The world will get there; the only question is when, and how much we will help it get there. The goal is to produce a machine-interface experience that is natural, intuitive and immersive for the user."

To give the computer senses, we must also increase the computer's capabilities, so that technologies of three-dimensional vision, user understanding and natural interaction with the digital world become relevant.

In the years 2015-2020, the interface with wearable computing will be perceptual computing, and perceptual computing will of course also be installed in the car: the vehicle will see where you are looking and alert the driver when necessary. At home, countless sensors will control everything and allow devices to be operated through hand gestures, or nutritional recommendations to be received from the refrigerator (companies have already tried this, but it has not yet matured). This is a basic technology that is relevant everywhere.

But there is a big gap to cross to reach this world. What do we need in order to start the spiral of hardware that brings software, and so on? Intel's answer is 3D. Give the computer 3D vision, and the spiral will begin. Innovation will come not only from Intel but from everywhere in the industry. In the first phase, Intel intends to bring 3D cameras to laptops, together with a software development kit (SDK) so that developers can take advantage of them. At the same time, Intel itself is developing a number of applications that will serve as demonstrations and will allow those who purchase a computer with the camera installed to use it at least in a basic way. "Once this capability is in the market, the sky is the limit, just as when Apple put a camera and GPS in the iPhone and several industries arose because of it."

Creative's 3D camera, based on Intel's SDK. PR photo

3D computer vision as a complement to 3D printing

According to Nagar, 3D vision will allow more data to be captured. The computer will be able to see the user's hands, eyes and head better, and thus better understand the user's intention. It will also be possible to produce holograms, elsewhere in the world, of what the camera sees. Nagar listed a number of possible applications: "For example, scanning a room and creating a 3D model of it; a better user interface, in which the computer also analyzes where the user is looking; navigation and control using hand gestures that, thanks to 3D, will be much more understandable to the computer; interaction with books; entertainment and games, a world that will gain new capabilities from a 3D camera; and, of course, innovation across the industry."

"Commerce: you can photograph things in 3D, so that on eBay, instead of a flat 2D display, you see the item in 3D. You can design an object in 3D, print it, then photograph the printed object and print it again. Video blogging can be improved: once the image is in 3D, the computer can, for example, distinguish between the person and their surroundings and replace the background. In entertainment and games we can extend the concept of the Kinect even in the short term, and eventually all household and office objects will get a natural user interface and will be controlled through it."
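The background-replacement idea Nagar describes follows directly from having a depth map: pixels close to the camera belong to the person, everything else is background. The sketch below is a minimal illustration of that technique using NumPy; it assumes a depth map in millimeters where 0 marks an invalid reading, and the function name and threshold are illustrative, not part of Intel's SDK.

```python
import numpy as np

def replace_background(color, depth, background, max_person_depth_mm=1200):
    """Swap in a new background behind the person closest to the camera.

    color:      (H, W, 3) uint8 color frame
    depth:      (H, W) depth map in millimeters, 0 = no reading (assumed convention)
    background: (H, W, 3) uint8 replacement image
    """
    # Pixels with a valid reading closer than the threshold are
    # assumed to belong to the person in front of the camera.
    person_mask = (depth > 0) & (depth < max_person_depth_mm)
    out = background.copy()
    out[person_mask] = color[person_mask]
    return out
```

A real pipeline would clean the mask (morphology, temporal smoothing) before compositing, but the core segmentation really is this simple once depth is available, which is the point Nagar is making.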

On the way to that vision, Nagar also mentioned a number of challenges facing Intel during development. For example, challenges related to fitting a 3D camera into a laptop: size, heat production, energy consumption, mobility, field of view, range, accuracy, safety and more. There are also challenges in developing the software kit for developers: "Intel is entering the unfamiliar field of writing software libraries that take advantage of the depth data coming from the camera, build the 3D model, and report, for example, where the fingers are, where the face is, and whether the user is happy or sad. These are heavy programs that must support many operating systems and require optimization of the user interface."
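One building block behind "where the fingers are" is that in a typical gesture pose the hand is the closest object to the camera, so the nearest valid depth pixel is a reasonable first guess for it. The sketch below shows that first step only, again assuming millimeter depth with 0 as an invalid reading; the function is a hypothetical illustration, not Intel's library.

```python
import numpy as np

def nearest_point(depth):
    """Return (row, col, depth_mm) of the closest valid depth pixel.

    depth: (H, W) integer depth map in millimeters, 0 = no reading.
    In a gesture pipeline, the region nearest the camera is a
    reasonable seed for locating the user's hand.
    """
    valid = depth > 0
    if not valid.any():
        return None  # no depth data in this frame
    # Push invalid pixels to the maximum so argmin ignores them.
    masked = np.where(valid, depth, np.iinfo(depth.dtype).max)
    row, col = np.unravel_index(np.argmin(masked), depth.shape)
    return int(row), int(col), int(depth[row, col])
```

A full tracker would grow a connected region around this seed and fit a hand model to it; the heavy, multi-OS libraries Nagar mentions are exactly these later stages.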

And finally, we also need to develop applications to get consumers to buy it; the rest we leave to small companies, even startups, that will use the SDK and build on the user experience and the technology. This requires a lot of effort, and small companies will have to partner with the entire ecosystem.

There is still time until the vision of giving all senses to all devices is realized. For this to be possible, the capabilities of the computer should approach those of the human brain. "It's amazing how much the human brain can absorb and how well it handles spatial perception. The most sophisticated computers today have 1.4 billion transistors, compared to 100 billion neurons in the brain."

Addressing the audience of hardware developers, Nagar said that to implement the 'Internet of Things', rather than today's situation in which every 'thing' connects to the Internet separately, the devices need to talk to each other. "It is your job to create these interfaces," where every component facing the network must have the best performance and high processing capacity, yet at the same time be very cheap and consume little energy. There is room for a lot of hardware design, for example in the instruction sets of the devices' processors, sensor optimization and user-experience optimization.

The article was prepared in collaboration with the Chiportal website, the Israeli chip industry portal. See also: Eyal Nagar's dream job.
