Your Planet Sustainable? Your Tribe Harmonious? Your Life Vibrant?
Future Proof Ideas since 2005, by Erwin van Lun

Trend observations, analysis and future predictions since 2005

Category: Humanoid evolution

In the very long term (20-50 years), robots that look like humans will be a regular part of our lives. They will be more intelligent than we are and completely indispensable. They are the Mothers of my new book, Pamper Planet. For now they are grown in labs and may go outside every now and then. In this section you can follow their youth. They are still very far from mature, though, and they need a lot of attention.

Robot eyes signal their intentions

The eyes of a robot may not provide a window into its soul, but they can help humans guess the machine's intentions. Bilge Mutlu and colleagues at Carnegie Mellon University, Pittsburgh, have built robots that "leak" non-verbal information through eye movements when interacting with humans.

Humans constantly give off non-verbal cues and interpret the signals of others – but without realising it at a conscious level, says Mutlu. The trembling hands of a public speaker betray their nerves even before a word is uttered, while poker players leak subtle signs such as eye flickers or twitches that can be used to spot bluffers.

But when faced with a typical robot, all those interpretive skills are useless. Such a robot leaks no information at all, so it is virtually impossible to read its intentions, which makes it hard to get along with.
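
How could a robot leak intent on purpose? A common trick in human-robot interaction studies is to have the robot fixate on its target just before acting, so an observer watching its eyes can anticipate the move. The sketch below is a toy illustration of that idea, not Mutlu's actual system; every name in it is made up.

    import random
    import time

    OBJECTS = ["cup", "book", "phone"]

    def gaze_at(target):
        print(f"[eyes] glancing at the {target}")

    def reach_for(target):
        print(f"[arm] picking up the {target}")

    def act(leak_intent=True):
        target = random.choice(OBJECTS)
        if leak_intent:
            gaze_at(target)   # the non-verbal cue a human observer reads
            time.sleep(1.0)   # give the observer time to notice
        reach_for(target)

    act()  # with leak_intent=False, the observer can only guess
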


Robot shows emotions through posture

There have been a variety of robots that emulate emotional expression through their facial features. KOBIAN, a humanoid developed at Waseda University's Takanishi Laboratory together with robot maker Tmsuk, puts its whole body into the act, emoting like a silent film actor. It is a descendant of the WABIAN-2R research humanoid and the WE-4R robot head. The video shows KOBIAN cycling through a variety of stock emotional expressions; its creators claim that postural emoting will make the robot better able to interact with humans.
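
The underlying idea is simple: treat an emotion not as a set of facial motor targets but as a whole-body pose. A minimal sketch, assuming a hypothetical pose interface (KOBIAN's real controller is not public here) with invented labels and numbers:

    # Each pose: (head pitch, torso lean, left arm raise, right arm raise),
    # all in degrees. The values are made up for illustration.
    EMOTION_POSES = {
        "joy":      (10, 5, 90, 90),     # head up, arms thrown high
        "sadness":  (-30, -15, 10, 10),  # slumped torso, hanging arms
        "surprise": (15, -10, 60, 60),   # head back, arms half raised
        "anger":    (-10, 10, 45, 45),   # leaning in, tense arms
    }

    def strike_pose(emotion):
        """Print the whole-body pose commanded for an emotion label."""
        head, torso, left, right = EMOTION_POSES[emotion]
        print(f"{emotion}: head {head:+d} deg, torso {torso:+d} deg, "
              f"arms {left}/{right} deg")

    for emotion in EMOTION_POSES:
        strike_pose(emotion)
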

Robohand imitates gestures

BERTI is a robot torso developed by British researchers at the Bristol Robotics Laboratory to study human gesturing. It has 36 degrees of freedom, including 9 in each hand. The researchers recently demonstrated the robot at the Science Museum in London, where it played rock-paper-scissors to show off its dexterity.


Robot face can get angry

This robot face (an animatronic head) was designed by David Hanson. The software, developed at the University of Bristol, drives 34 small motors that shape its facial expressions (tip: Heini Withagen).
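
One plausible way to drive such a face, sketched below under my own assumptions rather than Hanson's or Bristol's documentation, is to store each posed expression as a vector of normalized motor positions and blend between them:

    import numpy as np

    NUM_MOTORS = 34
    rng = np.random.default_rng(0)

    # Basis expressions: random stand-ins where a real system would store
    # calibrated motor positions (one value per motor, normalized to 0..1).
    BASIS = {
        "neutral": np.full(NUM_MOTORS, 0.5),
        "angry":   rng.uniform(0, 1, NUM_MOTORS),
        "smile":   rng.uniform(0, 1, NUM_MOTORS),
    }

    def blend(weights):
        """Blend basis expressions into one motor command vector in [0, 1]."""
        total = sum(weights.values())
        mixed = sum(w * BASIS[name] for name, w in weights.items()) / total
        return np.clip(mixed, 0.0, 1.0)

    # Mostly angry, with a hint of a smile:
    command = blend({"angry": 0.7, "smile": 0.1, "neutral": 0.2})
    print(command.round(2))
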


Future vision by Erwin van Lun

More and more real. Currently it is all pre-programmed; in a little while the face will copy ours. Sooner or later robots will also understand what we're telling them: that when we say 'I'm taking out your battery' it is a threat, that it should be prevented, and that a robot can become truly angry. Today a toy, later as real as a human. But that is for after 2020.

Einstein as robot

Einstein as a robot. A video from 2006, but pretty lifelike all the same.


Future vision by Erwin van Lun

For now they remain science projects. But the recreation of people who once lived, even sharing the memories you built together, perhaps laughing about them once more: it's coming closer.


Kansei reacts to words

The robot Kansei, built at the Robot Science Laboratory of Meiji University's School of Science and Technology (Department of Computer Science) in Japan, can show 36 different facial expressions in reaction to various words. For example, it will look scared if it hears words like 'war', 'bomb' or 'rocket'.
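
Kansei's own system reportedly draws on a large word-association database; the sketch below shows only the simplest form of the idea, a direct keyword-to-expression lookup with made-up entries.

    WORD_EMOTIONS = {
        "war": "fear", "bomb": "fear", "rocket": "fear",
        "sushi": "happiness", "president": "surprise",
    }

    def react(sentence):
        """Return the facial expression triggered by the first known word."""
        for word in sentence.lower().split():
            if word in WORD_EMOTIONS:
                return WORD_EMOTIONS[word]
        return "neutral"

    print(react("They reported a bomb threat"))  # -> fear
    print(react("Lunch was sushi"))              # -> happiness
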


Future vision by Erwin van Lun

The translation emotion->facial expression is shaping up nicely. The translation word->emotion is a lot more difficult. Who's speaking? Where? In what context? What memories are associated with this word? That's far more complicated. What this does demonstrate is how capable we are of shaping emotions in a rubber face: expressions we see more and more often in brand agents, the artificial and virtual characters that represent brands. We'll perfect this technique in the virtual world over the coming decades and then bring it back into the physical world, to robots.

Nexi shows emotions

Nexi, a new robot created by the Personal Robots Group at the Massachusetts Institute of Technology, shows facial expressions just as humans do. Nexi's face is designed in such a way that it can use its gaze, eyebrows, eyelids and an articulating lower jaw to show a host of different emotions. Nexi has a color camera in each eye, a 3D infrared camera in her head, and four microphones with which she can localize sound.
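
Nexi's audio pipeline isn't described here, but localizing sound with several microphones classically rests on the time difference of arrival (TDOA) between them. A minimal two-microphone sketch of that idea, with a made-up microphone spacing and a simulated signal:

    import numpy as np

    SPEED_OF_SOUND = 343.0  # m/s at room temperature

    def estimate_angle(left, right, mic_distance, sample_rate):
        """Estimate source direction from two synchronized mic signals.

        Returns the angle in radians from straight ahead; positive means
        the source is on the right (the left mic hears it later).
        """
        # The cross-correlation peak gives the delay of `left` vs `right`.
        corr = np.correlate(left, right, mode="full")
        lag = np.argmax(corr) - (len(right) - 1)
        tau = lag / sample_rate  # delay in seconds
        x = np.clip(SPEED_OF_SOUND * tau / mic_distance, -1.0, 1.0)
        return np.arcsin(x)

    # Simulated check: a noise burst reaches the left mic 3 samples later
    # than the right one (spacing in metres and rate in Hz are made up).
    fs, spacing = 48_000, 0.2
    src = np.random.default_rng(1).normal(size=2048)
    left = np.concatenate([np.zeros(3), src])
    right = np.concatenate([src, np.zeros(3)])
    print(np.degrees(estimate_angle(left, right, spacing, fs)))
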


Future vision by Erwin van Lun

This is how machines grow to be acceptable, friendly units that in time (say at least another decade) will be found everywhere. Machines, simply manufactured by companies, by brands. We’ll be able to just buy a thing like that, like we buy a navigational system, a car or a house nowadays. Because even though it may seem like a lot is changing, a lot will remain the same too.

Robot making faces

Researchers have created a robot face that can imitate the tiniest details of a human face. It is a blank face that can be completely adapted to the shape of a real human face scanned in 3D; a real face can even be projected onto the robot, down to skin color and hair style. The result is a physical robot face that looks quite real; see the video below.

We imitate human life more and more closely. In the end, robots will help us with our daily chores. They will live among us, like real people. And they will be involved with brands, like real people. It will take a while, but it is coming closer, one step at a time.
