Nissan presents its collaboration with NASA at CES: an interview with the director of the laboratory developing an autonomous vehicle based on NASA technology
In recent years, researchers at NASA's Ames Research Center have been working with carmaker Nissan on autonomous driving technologies that could one day serve passenger cars as well as robots on Mars.
Ahead of the CES show in Las Vegas last week, a group of senior executives from the Renault-Nissan Alliance, including Nissan Chairman and CEO Carlos Ghosn and Jose Munoz, Chairman of Nissan North America, visited the Ames Research Center for meetings on the technological partnership between NASA and Nissan North America. The partnership lets researchers develop and test autonomous algorithms, chips and prototypes for a variety of transportation applications, from robots traveling on Mars to cars. The guests also reviewed the LEAF electric car that drove autonomously through the streets of the Ames campus. The car is equipped with cameras, sensors and cellular connectivity, and uses robotics software originally developed at Ames for the robots that have explored Mars, the latest of them so far being Curiosity.
NASA says the results of the research will, in turn, feed back into its own development of autonomous rovers for Mars.
We spoke at the Nissan booth at CES with Liam Pedersen, director of Nissan's Silicon Valley lab, located near NASA's Ames Research Center: "We started working with NASA about two years ago. Several of us, myself included, are former NASA employees; I, for example, managed a robotics laboratory. Nissan opened the lab in Silicon Valley because it was looking for software engineers and robotics experts. It started from personal connections.
"NASA and Nissan are facing the same problem. They are planning autonomous cars but in the foreseeable future, these cars will not be able to operate completely by themselves and will need operators like in NASA. They have a vehicle on Mars, if it fails, you cannot get there and fix it. The robot itself knows how to take care of itself. It sees obstacles and bypasses them, but he can't know what's important and what's not. He can't do the science himself and doesn't know how to deal with unusual situations."
"NASA solves the problem thanks to a team of operators who work in front of the robot with a huge delay - it takes the signals to reach Mars and back about 40 minutes. Therefore they give him instructions to get from one point to another and to carry out the tasks planned for him along the way. The robot has to handle problems by itself and react to sudden events."
"The same problem also exists with autonomous cars. We can already today provide such a vehicle with basic safety, avoid collisions and bypass obstacles, but what is difficult for a machine is to interact with humans. If someone makes a sign with their hand (for example, asking to open the window) it is nothing that a machine could interpret. Especially if the movements weren't really clear. Even work zones pile up difficulties for autonomous cars."
"The idea is to manage fleets of robotic cars, each car maintains its safety, and drives by itself 98-99% of the time, maybe even 99.9999% of the time without involving a human, but there will always be cases when the ability of human intelligence will be needed. Example - a human directs the traffic at an intersection, the traffic light says stop, and the traffic cop says to go on. And now imagine a fleet of a million cars. Even if they are fully autonomous, sooner or later they might to get stuck. That's why we are developing a remote operation system for fleet management together with NASA." says Petersen
"It is clear that artificial intelligence will be used so that the car knows how to manage itself as much as possible, it is also economically viable because it is desirable that one person be able to monitor as many cars as possible and if the cars are not completely autonomous and will need help from time to time it is not worthwhile.
We want to take care of the extreme cases. Anyone who understands machine learning will tell you that it is very easy to develop a machine that can manage on its own 99 percent of the time, but when extreme cases arise it is very difficult because there are thousands of problems and each one requires its own unique expertise.
How soon will we see smart fleets, and what will be the impact of this transition on society, particularly on employment?
"It's a business question. From a technological point of view, we'll get there in three years. But when exactly will they be on the road - that's something Nissan will have to decide with its customers." Pedersen says.
"I live in San Francisco, my office is in Sunnyville. I spend an hour every day in each direction in the car. I would be happy if I could use that time, for example, to sleep, go through emails, and do tasks. Today is time that is being lost. An autonomous car will give me Two extra hours every day. Besides the issue of time, safety will be an important issue especially in private vehicles."
"Many companies are looking into the field of autonomous driving, because there is a lack of drivers, especially in the field of logistics. I assume that in the first phase the autonomous ability will help to fulfill the demand, but when they become more common, I predict a significant change in our society."
One reader response:
All the companies that manufacture radars for semi-autonomous and autonomous vehicles effectively operate like "microwave ovens" on the roads, and in addition develop dedicated processors for themselves, which greatly inflates their development budgets; they are therefore likely to disappear in the near future.
The companies that opted for camera-based solutions are the ones that will lead the market in both the near and the distant future.
Most of the market today works with a single camera (Mobileye and others), one that can identify only very specific targets and hazards on the road. It is not built to detect everything, and it generates many false alerts. Such systems cannot operate effectively in all conditions - heavy fog, total darkness, snow, glare and so on.
The future belongs to companies that have learned to develop products that use two or four different cameras (a combination of visible-light cameras with infrared cameras), built to work in any weather and to identify any object in 3D (as is well known, a single camera cannot create a 3D image).
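As an aside, the reason a second camera yields depth is simple triangulation. Here is a minimal sketch of the standard disparity-to-depth relation, with purely illustrative numbers rather than the parameters of any real product:

    # Why two cameras give depth: standard stereo triangulation.
    # Illustrative values only; not the parameters of any specific product.

    def depth_from_disparity(focal_px, baseline_m, disparity_px):
        # A point's horizontal shift (disparity) between two offset cameras
        # determines its depth: Z = focal_length * baseline / disparity.
        if disparity_px <= 0:
            raise ValueError("the point must appear shifted between the two images")
        return focal_px * baseline_m / disparity_px

    # A point shifted 25 pixels between cameras 0.30 m apart (focal length 800 px)
    # lies 800 * 0.30 / 25 = 9.6 m ahead. With a single camera there is no
    # disparity, hence no direct depth measurement.
    print(depth_from_disparity(focal_px=800, baseline_m=0.30, disparity_px=25))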
Companies such as Foresight, which has exhibited at CES for the past two years and even won an innovation award in the automotive category this year, along with many other good companies, are the ones that will lead the market. Moreover, they integrate existing, off-the-shelf processors into their products in order to process the images received from the cameras into 3D.