The robot that will recognize intentions

A wearable system allows a robotic arm to recognize the user's intent, based mainly on hand movements, and to assist according to the user's needs

Former student Nadav Kahanovich with the robotic arm. Photo: Yonatan Birnbaum.

Robots move and react based on information coming to them from sensors. To perform actions optimally, they must process that information, move independently and avoid colliding with the objects around them. Achieving these capabilities draws on several fields, including mechanical engineering, electronic engineering, computer science, programming and software engineering.

Dr. Avishai Sintov heads the robotics laboratory at the School of Mechanical Engineering at Tel Aviv University. He and his team design and build robots and robotic arms intended to assist the user, based on algorithms, artificial intelligence, cameras and sensors. "Today, the cost of building robots can reach tens of thousands of dollars because of their complex hardware and design. That is why our goal is to build cheap and accessible robots that can help humans according to their needs. For example, robots that help with housework, robots that assist in surgery (like a nurse in an operating room) or in a dental clinic (like an assistant), a robotic arm attached to a wheelchair, an arm for lifting heavy loads, and an arm that performs assembly or dangerous operations in factories. Today's work environment is built and adapted only for humans, and it will take a long time to adapt it to robots that can work independently and replace us. Therefore, at this stage we aim to build robots that will simply be of help to us," says Dr. Sintov.

In their latest research, which won a grant from the Israel Science Foundation, Dr. Sintov and his team created a sleeve worn on the user's arm and linked to a computer that communicates with a robotic arm. The sleeve helps the robotic arm recognize the user's intention, based mainly on hand movements, so that it can assist according to the user's needs. The sensors in the sleeve (force sensors attached to the skin of the forearm) pick up information about the movement of the user's arm, forearm muscles and fingers, and thus about the user's actions.
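As a rough illustration of this sensing step, here is a minimal Python sketch of how a sleeve's force sensors might be sampled over a serial link. It is not the team's actual code: the port name, baud rate and number of sensors are all assumptions, and it presumes a microcontroller that streams one comma-separated frame of readings per line.

```python
# Minimal sketch, not the published system: sample one frame of force
# readings from a hypothetical sensor sleeve over a serial link.
# Requires pyserial (pip install pyserial).
import serial

PORT = "/dev/ttyUSB0"   # hypothetical port name
NUM_SENSORS = 8         # assumed number of force sensors on the sleeve

def read_frame(link: serial.Serial) -> list[float]:
    """Read and parse one comma-separated frame of force readings."""
    line = link.readline().decode("ascii", errors="ignore").strip()
    values = [float(v) for v in line.split(",")]
    if len(values) != NUM_SENSORS:
        raise ValueError(f"expected {NUM_SENSORS} readings, got {len(values)}")
    return values

if __name__ == "__main__":
    with serial.Serial(PORT, baudrate=115200, timeout=1.0) as link:
        frame = read_frame(link)
        print("forearm force readings:", frame)
```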

The information passes from the sensors to the computer, which processes and translates it into the required actions using algorithms based on artificial intelligence and machine learning, which allow the computer to learn from previous examples and experiences and thus perform a variety of computational tasks. For example, the computer is fed a large amount of information about movements and object grips and is taught to translate and decipher them. The processed information is then transferred from the computer to the robotic arm, which in this way recognizes the user's intention and assists according to the user's needs.
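The article does not publish the team's model or features, but the learning step it describes can be illustrated with a hedged sketch: a standard off-the-shelf classifier is trained on labeled sensor frames (synthetic here) so that a new frame can be mapped to the grip it most resembles.

```python
# Hedged illustration of the learning step described above. The data is
# synthetic; in the real system the frames would come from the sleeve's
# force sensors, not from random noise.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
NUM_SENSORS = 8
GRASPS = ["hammer", "screwdriver", "glass"]  # example grip labels

# Synthetic stand-in for recorded sleeve frames: each grip gets its own
# characteristic force pattern plus noise.
X = np.vstack([rng.normal(loc=i, scale=0.5, size=(200, NUM_SENSORS))
               for i in range(len(GRASPS))])
y = np.repeat(GRASPS, 200)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.2%}")

# At run time, a fresh sensor frame is classified the same way:
new_frame = rng.normal(loc=1, scale=0.5, size=(1, NUM_SENSORS))
print("predicted grip:", clf.predict(new_frame)[0])
```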

Dr. Avishai Sintov in the laboratory. Photo: Yonatan Birnbaum.

Dr. Sintov: "Our goal is to allow robots to examine the actions of users through sensing, to understand what they are doing, and to cooperate with them naturally, just as humans help each other. Doctors, brothers and sisters, for example, can work together and help each other just by seeing the hand movements of the members of the medical team, without talking to each other."

Experiments the researchers conducted as part of the study showed that the robotic arm was able to perform a variety of assistive operations. For example, when a user picked up a hammer, the arm took a nail from a nearby box of nails and brought it to him. When he held a screwdriver, the arm handed him screws. And when he picked up a glass, the arm poured water into it from a pitcher.
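The final step, mapping a recognized grip to a matching assistive action, can be as simple as a lookup table. The sketch below is hypothetical; the function names stand in for whatever motion commands the real robotic arm exposes.

```python
# Hypothetical dispatch from a recognized grip to an assistive action.
# Each function is a placeholder for a real robot motion command.
def fetch_nail():
    print("arm: picking a nail from the box")

def fetch_screws():
    print("arm: handing over screws")

def pour_water():
    print("arm: pouring water from the pitcher")

ASSIST_ACTIONS = {
    "hammer": fetch_nail,
    "screwdriver": fetch_screws,
    "glass": pour_water,
}

def assist(predicted_grip: str) -> None:
    """Run the assistive action matched to the classifier's prediction."""
    action = ASSIST_ACTIONS.get(predicted_grip)
    if action is None:
        print(f"arm: no assistive action defined for '{predicted_grip}'")
    else:
        action()

assist("hammer")  # -> arm: picking a nail from the box
```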

Says Dr. Sintov: "We created a wearable, sensor-based and cheap system (the sensors cost about a dollar each on AliExpress) that communicates with the robotic arm and can be adapted to any user. We tried it on several people and it worked beautifully and accurately: the robotic arm's grip matched the objects 97% of the time. In other words, in most cases it recognized the required grip out of the variety of grips we taught it."

Today the researchers are testing this technology in simulations of remote control of robots, so that robots can help in situations that endanger humans (for example, treating a patient in isolation). "Our main goal is to teach the robot to recognize the user's intention in a natural, cheap and simple way, since today this is one of the hardest problems in the field of human-robot cooperation. It is possible that the model we developed, receiving information from the muscles and decoding the movements using sensors and algorithms, will help solve it," concludes Dr. Sintov.

Life itself:

Dr. Avishai Sintov, 40, married with two daughters (aged 9 and a year and a half), lives in Moshav Nachala. In his spare time he enjoys spending time with his daughters.