
Improving the human touch of robots

In the era of robots, when we will no longer receive flesh-and-blood service, will it be possible to make a conversation with a bot feel like a conversation with a person? This is the subject of the research of Dr. Aya Sofer and her team at IBM's research laboratories in Haifa

Artificial intelligence - cognitive computer. Image: PIXABAY.COM


In the era of robots, when we will no longer receive flesh-and-blood service, will it be possible to make a conversation with a bot feel like a conversation with a person? This question is at the heart of the research of Dr. Aya Sofer and her team at IBM's research laboratories in Haifa, which was presented at the "Cognitive Coffee" conference held by IBM at the "Please Touch" center in Jaffa last week.

Dr. Aya Sofer, director of cognitive analytics at IBM's research laboratories, described how the Watson system edited the trailer for the horror movie "Morgan". The system learned from a database of horror films what makes a film a horror film, and what characterizes trailers for such films. Based on this analysis, the most frequent elements were identified, and nine segments were recommended for inclusion in the trailer. The editor selected eight of these recommendations, in a process that took only one day, compared to the weeks usually required for this work.

Can a computer be creative, in a way? asked the moderator, Dror Gloverman, and Dr. Sofer answered: "You can say that building a trailer is a form of creativity. Another example: about a year ago we produced a dress for one of the largest fashion shows in the world that changes the colors of its sparkles according to requests from the audience via Twitter. Another way the computer shows creativity is the expert personal shopper, developed by a startup based on Watson: an interface that helps choose clothes, uses photos taken in the store in real time, and knows how to ask the buyer questions that help it understand his tastes."

And after all this, in response to Gloverman's question of whether the cognitive computer will pass the Turing test, Dr. Sofer answered: "We have not yet passed the Turing test."

Dr. Aya Sofer, director of cognitive analytics at IBM's research laboratories, in an interview with Dror Gloverman as part of "Cognitive Coffee". PR photo

"Watson lives in the cloud. It is a platform on which different applications can be built. The computing power is available as needed, including with the help of video accelerators," said Sofer. So, for example, one of the companies using this system built a fashion shopping recommendation product with it, which helps in choosing clothing items and matching them to each other and to the circumstances of the event.

"The human contact, the balance between the robot and the human, still requires consideration. We are looking at ways to give bots emotional intelligence: an ability similar to that of humans, to see the expression of the person standing in front of them and adjust themselves. It can start with text. By identifying words that express frustration - and by providing response capabilities to the bot, which also knew when to transfer the conversation to human care. For this purpose, Watson already offers a platform that analyzes text and tone of voice."

Looking to the future, Dr. Sofer is optimistic. "I am one of those who see the glass as half full. All this knowledge, together with cognitive abilities, will help us make better decisions, be healthier, educate our children better. The change is in the interface between us and the computer: we were brought up to correspond with the computer and speak its language. We are at the beginning of a revolution in which we talk to the computer in ordinary language. Eventually this will lead to the visual world. We want the computer to see what we see, and its interface with the world to match the format of our vision. This, together with augmented reality and communication, will change the way we live. All information will be accessed in a different way."
