The primary challenge: replicating our largest organ
In the past few years, we have made significant progress in equipping robots with powerful vision and auditory systems. But providing them with a sense of touch – and giving them a ‘feel’ for any human activity in their vicinity – remains a challenge of a different order.
The integration of basic tactile sensors and force-sensing resistors will only get us so far. The same goes for the delicate humanoid hands and clumsy robotic claws we resort to today: their use is simply too limited. Instead, we should replicate the largest organ we possess, our skin, and outfit robots with it.
A robotic skin might even require extra functionality compared to its human equivalent. Imagine a setting where people and robots coexist and collaborate. In specific scenarios, it might be acceptable for a person to accidentally touch a machine, while the robot itself must never hit that person – not even accidentally, and regardless of how unpredictably the human behaves. In other words: humans and robots should be able to share the same space, or bubble, without any safety hazard.
All of this is easier said than done. It implies that a robotic skin must meet various technical and practical requirements. For one, the underlying sensors must be stretchable and flexible, allowing robots to manipulate delicate and robust objects alike. Another important consideration is that robotic skins should be capable of enduring constant mechanical stress, quite unlike today’s vision and auditory sensors, which are safely stowed away in a robot’s ‘body.’ And to guarantee the safety of the people in its vicinity, a robot’s skin should provide anti-collision sensing that covers a full 360° bubble around the machine.
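To make that bubble requirement concrete, here is a minimal sketch in Python, assuming a ring of range sensors and placeholder stop/move callbacks; all names and the 0.5 m threshold are illustrative assumptions, not an actual imec design:

```python
SAFETY_RADIUS_M = 0.5  # assumed minimum human-robot distance, in meters

def bubble_breached(distances_m):
    """Return True if any range reading falls inside the safety bubble.

    distances_m holds one reading per sensor, spaced evenly around 360 degrees.
    """
    return any(d < SAFETY_RADIUS_M for d in distances_m)

def control_step(read_ranges, stop, move):
    """One control-loop iteration: halt before moving if anyone is too close."""
    if bubble_breached(read_ranges()):
        stop()   # the robot must never hit a person, not even accidentally
    else:
        move()   # safe to continue the planned motion
```

Feeding the sketch readings such as [1.2, 0.3, 2.0] would trigger the stop() callback, since one sensor reports a person at only 0.3 m.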
Equipping robots with such a skin-like organ, and allowing them to anticipate and react to any human activity around them, obviously requires the proper hardware. Yet adding context-awareness and some sort of artificial intuition will be equally important.
AI for AI
When a person wants to pick up a coin that is lying on a table, they will unconsciously interact with that table, too – sliding the coin across its surface or toward its edge to get a better grip on it. That is because people are intuitive and context-aware.
To provide robots with a similar sense of intuition and context-awareness, and to help them achieve unprecedented levels of interaction with their environment, their underlying artificial intelligence (AI) needs to be driven to the (extreme) edge. Working highly efficiently within a limited power budget, extreme edge AI fuses sensor data locally, allowing decisions to be made fast – without overburdening a robot’s central processors. It removes the need to exchange and store tons of (irrelevant) inputs and allows robots to live a battery-friendly life. Moreover, these on-board AI capabilities will be crucial to providing robots with a sense of artificial intuition.
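As a minimal sketch of that principle, consider an event-driven filter that keeps raw samples on the device and only reports meaningful changes upstream; the threshold value and the report() hook are illustrative assumptions:

```python
PRESSURE_EVENT_THRESHOLD = 0.05  # assumed units: normalized pressure change

def edge_filter(samples, report):
    """Forward a sample upstream only when it differs meaningfully from the
    last reported one; everything else is discarded on the device itself."""
    last = None
    for sample in samples:
        if last is None or abs(sample - last) > PRESSURE_EVENT_THRESHOLD:
            report(sample)   # a rare, relevant event leaves the edge device
            last = sample
        # irrelevant samples are dropped locally: less traffic, less energy
```

Called as edge_filter([0.0, 0.01, 0.02, 0.2], print), the sketch forwards only the first and last samples; everything in between never leaves the edge device.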
Inspired by nature, tailored by man
Today, there is still no clear roadmap to developing a robotic skin or equipping robots with artificial intuition. But nature, and millions of years of human evolution, could be a great source of inspiration!
As a matter of fact, the human brain and body are probably the best examples of what a smart and energy-efficient robotic architecture could look like: humans move, operate, and interact with their environment highly efficiently, within a power budget significantly lower than that of today’s robots.
Still, human evolution has also introduced a few shortcomings that nature did not bother to address; imperfections that man’s ingenuity and engineering skills can now offset. The result should be robotic technology that supports us when needed, that enables us when we are disabled, and that shields us from harm.
I do believe that by making robots increasingly autonomous, versatile, and tactile, we can envision a future in which humanity can truly excel; a future in which we take care of (and for) one another, educate, dream, and create. And in which we aspire to an even better and more inclusive world.
How imec contributes to realizing this vision
The development of sensory, tactile robots is no mean feat. It all starts with miniaturizing the underlying sensors, but it equally involves finding an approach to mirror the fine-grained pitch and multi-modal sensing capabilities (pressure, temperature, vibrations, etc.) of our skin’s receptors. All while advancing our understanding of (neural network-based) edge AI.
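As an illustration of what such multi-modal sensing could look like in software, here is a toy Python model of a single ‘taxel’ (tactile pixel); the field names, units, and thresholds are assumptions made for the sake of the example, not imec’s actual design:

```python
from dataclasses import dataclass

@dataclass
class TaxelReading:
    """One multi-modal reading from a single tactile pixel ('taxel')."""
    pressure_kpa: float    # normal force channel
    temperature_c: float   # thermal channel
    vibration_hz: float    # dominant vibration frequency (e.g. for slip detection)

def classify_contact(r: TaxelReading) -> str:
    """Toy fusion of the three modalities into a coarse contact label."""
    if r.temperature_c > 45.0:     # assumed 'pain' threshold
        return "hot surface"       # withdraw before damage occurs
    if r.vibration_hz > 100.0:     # assumed slip signature
        return "slipping object"   # tighten the grip
    if r.pressure_kpa > 0.0:
        return "steady contact"
    return "no contact"
```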
Imec approaches this matter from various angles. Imec researchers at Holst Centre in the Netherlands, for instance, are currently working along two major research axes. The first aims at using advanced robotics technologies to help people recover from critical health conditions (think of smart implants and artificial kidneys). The second is potentially even more challenging: studying the human (central and peripheral) nervous systems to better understand how they work and collaborate.
At the other end of the spectrum, researchers from VUB-IMS (an imec research group at the Vrije Universiteit Brussel in Belgium) are renowned for their expertise in robotic design and signal processing. Add to that imec’s pioneering work in building nanoscale sensors and its leading role in advanced AI research.
According to the imec experts, it is precisely that multidisciplinary approach – which even extends to conducting social experiments with humans and robots – that is crucial to taking robotics research to the next level. Researchers with different scientific backgrounds may speak different languages, but that very mix is the perfect recipe for achieving the most surprising results. And that is exactly what sets imec apart in this domain.
A shorter version of this article previously appeared on the website of Forbes’ Technology Council.
Jo De Boeck joined the company in 1991 after earning his Ph.D. from KU Leuven. He has held various leadership roles, including head of the Microsystems division and CTO. He is also a part-time professor at KU Leuven and was a visiting professor at TU Delft. Jo oversees imec's strategic direction and is a member of the Executive Board.
Published on 20 January 2022