ROBOTS. A whole spectrum of perceptions arises when you hear the word robot, or when robots come up in discussions and debates about their contemporary development. Anxiety, worry, or excitement? At Evolvera, these feelings and prejudiced emotions alternate and change as we continue to report on the news and analyze its potential impact on the future. In 1954, the American psychologist Gordon Allport published the highly influential social psychology book The Nature of Prejudice, where he outlined the concept of prejudice as it continues to be recognized today: “Prejudice is an antipathy based on faulty and inflexible generalization. It may be felt or expressed. It may be directed toward a group or an individual of that group.” But could the concept of prejudice be applied to our understanding and perception of robotics? One of the images that appears in your mind could, perhaps, come in the form of the well-recognized T-800 from the Terminator franchise, or you could picture a gentler kind: the NDR-series robot Andrew in the movie Bicentennial Man, based on the novel The Positronic Man by Isaac Asimov. Or it may be another character, or something completely different. Researchers at Yale, in any case, have come up with something closer to the latter – a system that can make future robots polite.
But before that, “cobots” deserve a mention, to give a sense of what an automated workplace side by side with robotic technology actually looks like. In the United States, collaborative robots are slowly being rolled out to facilitate work in many different fields, for example in IT. Such robots are becoming increasingly common, and they have even spawned startups that act as staffing companies: they hire out collaborative robots by the hour and are responsible for programming, development, and maintenance. But the concept is not new. Robots can already work side by side with human employees and even outperform them in productivity for a fraction of their salary – which is what fuels most of the “worry” in the aforementioned prejudice.
A harsh reality, but it’s true. Amazon relies on 45,000 robots for packaging. Phone calls to Nanyang Technological University in Singapore are answered by Nadine, a human-like robot programmed to show emotions. And San Francisco-based Momentum Machines has developed a robot that can cook 400 burgers in one hour (!). Do you believe it now? According to the industry organization Robotic Industries Association, the US robot market beat all records in 2016. That year, 34,600 robots were ordered in North America, valued at a total of $1.9 billion – ten percent more than in 2015. But as robots increasingly take on physical and cognitive tasks in the workplace, they will also change the role of the IT department forever. There will be increased demands for security cooperation, keeping data safe, and programming complex robot systems – all of which will challenge IT managers and other leaders in the future. But would it help if these robots were, at least… polite? Researchers at Yale University have done precisely that: developed software with which robots can distinguish between the tools they themselves own and those owned by other people or robots.
But first: should we teach politeness to robots? At a time when we talk about collaborative robotics, or cobotics, and are promised a future where robots will be part of our daily personal and professional lives, some experts believe it is necessary to program them with social conventions. This is the path taken by the Yale team, which developed software that lets a robot learn to recognize and respect the property of others. The program combines two machine-learning algorithms: one uses explicit rules, while the other uses Bayesian inference to infer ownership based solely on the qualities of the object being observed.
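The combination described above could be sketched roughly like this. Note that this is a toy illustration of the general idea – the object names, features, and probabilities are invented for the example, and this is not the Yale team’s actual software:

```python
# Toy sketch: a robot decides whether it may use an object by combining
# explicit ownership rules with a naive-Bayes inference over object features.
# All objects, features, and probabilities here are hypothetical.

EXPLICIT_RULES = {
    "red_toolbox": "robot",   # objects whose owner has been declared outright
    "blue_mug": "human",
}

# Learned feature likelihoods: P(feature is present | owner)
LIKELIHOODS = {
    "robot": {"metallic": 0.8, "labeled": 0.9, "worn": 0.2},
    "human": {"metallic": 0.3, "labeled": 0.1, "worn": 0.7},
}
PRIOR = {"robot": 0.5, "human": 0.5}

def infer_owner(object_id, features):
    """Return the most probable owner of an object."""
    # 1. An explicit rule wins whenever the object is already known.
    if object_id in EXPLICIT_RULES:
        return EXPLICIT_RULES[object_id]
    # 2. Otherwise, score each candidate owner with a naive-Bayes posterior
    #    based only on the observed qualities of the object.
    scores = {}
    for owner, prior in PRIOR.items():
        p = prior
        for feature, present in features.items():
            lik = LIKELIHOODS[owner].get(feature, 0.5)
            p *= lik if present else (1.0 - lik)
        scores[owner] = p
    return max(scores, key=scores.get)

def may_use(object_id, features):
    """A polite robot only uses objects it owns."""
    return infer_owner(object_id, features) == "robot"
```

Here the explicit rules capture taught conventions (“that mug belongs to a human”), while the Bayesian half generalizes to objects the robot has never been told about – mirroring how the two learning approaches complement each other in the researchers’ description.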
In their article, published on arXiv, the researchers explain that combining these two approaches allowed them to reproduce the way humans learn to recognize and respect property: through both explicit rules and empirical learning. “Understanding object ownership, permissions, and customs is one of those topics that has not really received much attention, but that will be critical to the way machines work in our homes, schools, and offices,” says Brian Scassellati, one of the researchers behind the project. The learning software has been tested on a Rethink Robotics Baxter robot that performs tasks according to property rules, but its designers say it could work with other robot models as well. And this is where things get interesting. There was other news tying together robots, politeness, and humans, but in reverse: what effect does all this have on us when we start to communicate with robots? Interesting thoughts on this have come from Sogeti, an information technology consulting company based in Paris. According to Menno van Doorn, Sogeti’s expert on machine learning, when we start talking to self-learning robots, we may become nicer to each other in the long run. One main message is that we are facing a huge change as self-learning robots and intelligent machines become increasingly common. He envisions a development where apps increasingly disappear and we instead start talking to robots and machines the same way we talk to people we meet. Menno van Doorn is the director of SogetiLabs’ research institute VINT in the Netherlands and a prominent lecturer on machine learning, the robot society, and the Internet of Things. But will society as a whole become more polite thanks to robots? We will have to come back in a decade or so to see whether the development Menno van Doorn points to comes true.
A fitting end to this story comes in the form of a quote by Paul Valéry: “Politeness is organized indifference.” As automation and robotics continue to lead us into a future that some fear and others welcome, we can take comfort in one thought: whether or not robots bring about our extinction, when the last biologically defined human roams the Earth, the ones taking over will at least be polite about it. How wonderful… Evolvera – evolve in the new era.