Hyper Evolution: Rise of the Robots (Part 3)

Written by admin

January 28, 2020

I want to know how far robotics technology has evolved, how close robots are to making sense of the world around them, and whether we can trust their decisions. This is quite literally a life-and-death issue for all of us, because it is starting to play out on roads, in the form of driverless cars, or robot cars.

The central computer of a driverless car makes sense of the world around it using numerous integrated sensors. Those at the front and rear of the car look left and right, giving a 360-degree view with a range of 250 meters, while a 3D camera scans traffic conditions and road markings. The car's computer continuously interprets the data from its sensors to generate a 3D map of the world, through which it can then navigate safely. It makes split-second decisions to control the braking, steering and acceleration.
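
To make that loop concrete, here is a minimal sketch of a sense-plan-act cycle in Python. Everything in it, the `Detection` class, the `fuse_sensors` and `plan` functions, and the threshold values, is a hypothetical illustration of the idea, not code from any real driverless car.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    x: float   # meters ahead of the car
    y: float   # meters left (-) / right (+) of the car's centreline
    kind: str  # e.g. "car", "pedestrian", "lane_marking"

def fuse_sensors(radar, camera):
    """Merge radar and camera detections into one world model.
    (A flat list here; a real system would build a full 3D map.)"""
    return radar + camera

def plan(world, max_range=250.0):
    """Pick throttle/brake/steering from the fused world model."""
    obstacles = [d for d in world
                 if d.kind != "lane_marking" and 0 < d.x < max_range]
    # Hard-brake if anything is close ahead and roughly in our lane.
    if any(d.x < 30.0 and abs(d.y) < 2.0 for d in obstacles):
        return {"brake": 1.0, "throttle": 0.0, "steer": 0.0}
    return {"brake": 0.0, "throttle": 0.5, "steer": 0.0}

# One tick of the loop: sense, fuse, decide.
radar = [Detection(25.0, 0.5, "car")]
camera = [Detection(60.0, -1.8, "lane_marking")]
print(plan(fuse_sensors(radar, camera)))  # -> hard braking: a car is close ahead
```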

But in the case of an accident, who would be responsible? This throws up complex ethical and legal questions. And despite all its sensors and computing power, without the lane markings of the Autobahn, Jack, the robot car, cannot form an accurate enough 3D map of the world to navigate safely. A human, by contrast, understands the world far better than any current driverless car. We can identify objects, know what things really are and do, and that allows us to make profound connections and decisions to cope with much more unpredictable scenarios.

Despite the robot car's limitations, I was still amazed to see how far and how fast robots have evolved in their ability to make sense of the world, and I wonder whether one day it will be possible for robots to understand it the way we do: to grasp the true meaning of things and develop a sense of self, to become individuals. Could they even become conscious? For humans, the key to our understanding of the world is our ability to learn. To discover what happens when you try to get a robot to learn for itself, I have come to a lab in Japan.

This 80s-looking throwback is called Robo V. For this experiment, Prof. Dylan Glass has set Robo V a challenge: can it learn to be a camera-shop salesperson? The robot can actually be proactive: it is not just answering questions, it is proactively offering or suggesting things as well. To interact with customers and explain camera functions, Robo V reacts independently.

What we are exploring here is how to program a social robot. Instead of classical robot programming, where you explicitly program what the robot should do, this robot has learned everything purely from hundreds of interactions it observed between other people. This is called learning by imitation. To create Robo V's personality, the camera-shop scenario was role-played by human shopkeepers and customers. To build this database of hundreds of shopkeeper-customer interactions, a network of sensors tracked where people moved and microphones captured what they said.

The robot learns from these interactions unsupervised. It learns on its own to imitate the behaviour it is shown: the locations where people stopped in the room, the trajectories, the clusters of speech, and so on. From this data, we have the robot automatically learn the logic of how to be the shopkeeper.
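
As a rough illustration of that pipeline, the sketch below clusters recorded customer states into discrete situations and then imitates the most common human shopkeeper response for each one. The data is random stand-in data and the two-step structure is my own assumption about how such a system could be framed; the actual system described here is far more sophisticated.

```python
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical recorded interactions: a customer "state" (position and
# speech features) paired with the shopkeeper action that followed it.
rng = np.random.default_rng(0)
states = rng.normal(size=(300, 6))        # stand-in for position + speech features
actions = rng.integers(0, 4, size=300)    # stand-in for 4 observed shopkeeper behaviours

# Step 1: cluster the continuous sensor data into discrete situations.
situations = KMeans(n_clusters=10, n_init=10, random_state=0).fit(states)
labels = situations.predict(states)

# Step 2: for each situation, imitate the most common human response.
policy = {c: np.bincount(actions[labels == c]).argmax()
          for c in np.unique(labels)}

# At run time: map a new customer state to its situation, then act.
new_state = rng.normal(size=(1, 6))
print("robot action:", policy[situations.predict(new_state)[0]])
```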

It almost feels like an actual conversation with a real shopkeeper. It is a very powerful concept that we can scale up: if we can capture data on how people interact in the real world at large scale, we can use that big data to train robots to hold very natural interactions. The real challenge is the balance between how controllable the robot is and how much it learns on its own. Sometimes the robot does things and we are not sure why. Also, the robot does not know the meaning of anything it does. It is purely behavioural, purely imitating what it saw the person do; it learns only through imitation, through experience.

Robo V's behaviour is so human-like that I really believed it had learned to understand what I was saying. But even if it did not, does that really matter? It could still sell cameras. As we move forward, this becomes a philosophically interesting problem, because now we are really reflecting on how we learn, how we think, and how we ascribe semantic meaning to things and structures in the world. These machine learning techniques have provided a very interesting lens through which to view our own thinking. These learning systems and technologies are becoming part of us, and in the future they will give us the opportunity to grow in ways we have never been able to grow before.

It is an extraordinary idea that, in trying to teach robots human cognitive abilities, we may also learn more about how we think ourselves. The key may be to teach robots not simply to mimic our behaviour, but to develop a conceptual understanding of the world for themselves, so they can generate human-like thought and behaviour spontaneously.

I have come to Plymouth University's Centre for Robotics and Neural Systems to meet a team of scientists who are trying to do just that.

Fig. 1: Plymouth University's iCub (image by University of Plymouth, 2020)

At one meter tall and weighing 22 kilos, iCub not only looks like a child but learns like one too. Angelo Cangelosi, professor of artificial intelligence and cognition, is its guardian. iCub is taught the names of objects one word at a time, like a child between eighteen months and two years old. iCub has a simulated brain and, like the brain of a child, it is able to learn the correspondence between the sound of a word and the picture of an object.

iCub is equipped with cameras to see, microphones to hear and even smart skin to touch. 

The information it gathers from the stimuli around it is fed into an artificial neural network, a computer system inspired by the human brain. iCub is not simply mimicking human behaviour; it is trying to discover for itself the relationships between what it can see, what it can hear and what it can touch, just like a child. As toddlers interact with the world around them, they learn from one experience to the next, making connections between what they see, hear and touch to form the basis of context and meaning. These become the building blocks of intelligence and reasoning.
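
As a toy version of that idea, the sketch below trains a single-layer network with a delta rule to associate a "heard word" vector with the matching "seen object" vector. The random feature vectors and the network size are invented for illustration; iCub's actual neural architecture is far richer than this.

```python
import numpy as np

rng = np.random.default_rng(1)
n_words, dim = 5, 16
word_feats = rng.normal(size=(n_words, dim))    # one auditory vector per word
object_feats = rng.normal(size=(n_words, dim))  # one visual vector per object

# Learn a linear map W from word features to object features.
W = np.zeros((dim, dim))
for epoch in range(200):                # one word-object pairing at a time
    for w, o in zip(word_feats, object_feats):
        err = o - W @ w                 # what the network got wrong
        W += 0.01 * np.outer(err, w)    # nudge weights toward the answer

# After training, a heard word should retrieve the right object.
pred = word_feats @ W.T
nearest = np.argmax(pred @ object_feats.T, axis=1)
print(nearest)  # ideally [0 1 2 3 4]: each word maps to its own object
```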

The more we learn, the more complex the tasks we can tackle. Children learn by using their motor skills to explore the physical world around them through touch and movement. As their body interacts with the environment, they learn from each new experience. iCub does the same. In tiny little steps, it is trying to form its own unique understanding of the world and what things actually mean.

We are in control, so we are determining the evolution of robotic systems, for now…

On this journey we have met some incredible robots. They are preparing for a voyage to Mars, becoming our friends and companions, navigating us through a chaotic world, and some are even able to learn like us. For me, this is the most exciting time. We are living at the very moment when robots are starting to piece things together, the first tiny scraps of meaning, to create their own unique understanding of the world and themselves. Once they have achieved this, we will be on the brink of a new era. There is no doubt that robots will continue to evolve and become more and more intelligent, and that one day it just might be possible for them to develop consciousness.

Imagine a robot that could feel the way we feel, that could be moved by strong emotion, that could love the way we do. Wouldn't that be incredible?

When I started this journey, my main concern was that if robots could develop minds of their own, they might become a threat. But now that I have spent more time with robots, I feel I can trust them. And if robots really could one day become conscious, we need to think not just about how they might affect us, but about how we could affect them. Perhaps my biggest fear right now, as we progress towards conscious machines, is not what we will need to do for robots, but what we will discover about ourselves.

The extraordinarily fast evolution of robots really is going to change our place in the world, and that raises urgent social issues for us all. We have a responsibility to make sure that we stay in control. We have the opportunity right now to prepare for conscious robots that think and feel the same way we do, to prepare for the inevitable.

 

References:

University of Plymouth. (2020). Centre for Robotics and Neural Systems (CRNS). [online] Available at: https://www.plymouth.ac.uk/research/robotics-neural-systems [Accessed 28 Jan. 2020].
