Helping Robots Feel More Like Humans

A look at how a softer touch unlocks a new generation of intelligent machines

by Toshi Quides
Reading time: 4 mins

The field of robotics is a massive and constantly growing area of research and industry.  Overlapping technologies like automation and artificial intelligence (AI) play a large role in the field’s progress, and by extension, in the places where we find new robotic utility.

For many years, human-robot conflict in popular culture (Terminator, Replicants, etc.) has reflected a Frankenstein-like fear of human creations that share some of our likeness but differ from us in wondrous and/or diabolical ways.  The concern that robots, automation, and AI could displace or come into conflict with humans continues today, but social changes driven by aging demographics and a once-in-a-century pandemic have forced a recognition that robots bring enormous value precisely in their ability to stand in for us.

One of the many ways in which we’ve made robots and computers more interactive and human-like is to give them the same senses we have.  We’ve made them hear us and see us, but some of the most primal forms of human knowledge come from the ability to touch and feel.

That is being made possible today by sensors that give robots feeling and fundamental human sensations, which in turn form the basis of a new layer of machine intelligence.  Looking further into the future, as AI models and additional forms of sensation keep turning the flywheel of continuous learning, it isn’t difficult to imagine robots that naturally move and behave like humans.

The human form is full of organic curves and shapes; it takes talented pattern makers to create clothing that fits us well while remaining functional and comfortable.  Building sensing systems out of intelligent fabric enables tactile and interaction sensing for humanoid bodies in places where other sensing approaches force undesirable tradeoffs, such as making the form less human.  Robots that must operate in a human world with less human forms (claws or paws instead of hands, for example) are at an inherent disadvantage, not just in contexts that require dexterity, but in any setting where robots are asked to behave like and interact with humans.

Japan’s demographics, with a third of its population over 65, present a telling case study in how robots add societal value as humanoids and human counterparts.  Elder and long-term care, for instance, involves a variety of physically demanding activities as well as basic human interaction and everyday tasks.  These can be served by humanlike companions that are not subject to the same risks of biohazards and injury as their human counterparts.  Robots today do a good job of specializing in a few areas, but a general purpose robot will soon be able to carry out a far broader swath of these activities using next-generation sensors and AI.

Today, a robot that can modulate its hold on a person when lifting them out of bed or catching them during a fall might use a different suite of sensors than one that helps pour tea or holds a hand during a chronic pain episode – in all likelihood it is a different robot altogether.  To realize the goal of general purpose robots, roboticists need a general purpose sensing solution – one flexible and dynamic enough to emulate the human sense of feel for both light and heavy interactions, on all parts of the humanoid body.  We like to refer to this as Robotic Skin – a distribution of nerve endings that can be tuned to human-like feel, from regions as sensitive as the fingertip to those, like the soles of our feet, that require intelligence at much larger forces.

All around the world, countries are battling not only aging demographics but also supply chain disruptions brought on by COVID and the Great Resignation.  Service and medical industries have been hit particularly hard, in large part due to direct effects of the pandemic such as the health risks of frequent person-to-person interaction.  In the service and industrial spaces, workers have also sought out positions that offer more flexibility and wage growth, forcing many employers to take the plunge on automated solutions.  In environments that are less structured than a fully automated logistics warehouse, or that require handling a wide variety of tasks, bipedal and humanoid robots have started to find their way into applications like truck unloading, “last mile” delivery, and food service and preparation.

In a world that demands flexibility from its machines, building a series of robots modularized enough in both hardware and software to move between and across functions is a natural progression of the technology.

We’ll need to augment our wide range of hyper-specialized robots with versatile general purpose robotic solutions that can take on a broader range of human activity.  To do that successfully, AI models will need to ingest vast amounts of data on all of the activities we will ask our future robots to perform.  Sight and sound form the backbone of much of the learning that has driven today’s more intelligent robots and machines.  As one of the fundamental bases of knowledge, touch-based data should inform the next step change in machine interaction quality.

Sensor systems designed for the humanoid form then become a new substrate from which a more general humanlike intelligence emerges.
