It’s Time for AI to Get Physical



The latest generation of generative artificial intelligence (AI) tools burst onto the scene a few years ago and seemed to change everything overnight. They reshaped how people work, create, and interact with technology in their daily lives. But now that the dust has settled, the initial enthusiasm around these tools is beginning to cool. While they are certainly useful, progress is slowing, and talk of achieving artificial general intelligence in the near future is increasingly met with skepticism.

It seems that we may be close to maxing out the potential of today’s algorithms. That is not to say another big leap forward is not in sight, but the next leap may come from a somewhat different direction. A growing number of researchers and developers believe that embodied AI will be the next big thing. While large language models, image generators, and other predictive algorithms have many important uses, integrating AI into robotic systems so that they can interact directly with the world, in much the same way that we do, has the potential to be truly transformative.

But before this can happen, many problems still need to be solved. For instance, if robots are going to act like us, they will first need to sense and respond to the world as we do. For this reason, a team led by researchers at the Beijing Institute for General Artificial Intelligence has been working on an integrated system that enables a robotic hand to not only feel the world around it, but also to use that sensory information to adapt its actions in real time.

The result of their effort is the F-TAC Hand, a biomimetic robotic hand equipped with a novel tactile sensing system. It closely mimics the structure and function of the human hand, particularly in how it perceives touch. With high-resolution tactile sensors covering 70% of its surface, the F-TAC Hand can detect contact with a spatial precision of 0.1 millimeters.
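To get a feel for how that level of precision is possible, consider how a contact location might be estimated from a single tactile image. The sketch below is purely illustrative and not the team's published method: the sensor geometry, pixel pitch, and function names are all assumptions, chosen only to show how an intensity-weighted centroid can localize contact to a fraction of a pixel.

```python
import numpy as np

# Hypothetical sensor geometry: a 0.05 mm pixel pitch is an assumption,
# not a published F-TAC Hand specification.
PIXEL_PITCH_MM = 0.05

def localize_contact(tactile_image: np.ndarray) -> tuple[float, float]:
    """Estimate the (x, y) contact location in millimeters from a 2D
    intensity image produced by a vision-based tactile sensor.

    An intensity-weighted centroid resolves the contact to a fraction
    of a pixel, which is one common way such sensors achieve
    sub-millimeter spatial precision.
    """
    img = tactile_image.astype(float)
    img -= img.min()              # subtract the no-contact baseline
    total = img.sum()
    if total == 0:
        raise ValueError("no contact detected")
    rows, cols = np.indices(img.shape)
    y_mm = (rows * img).sum() / total * PIXEL_PITCH_MM
    x_mm = (cols * img).sum() / total * PIXEL_PITCH_MM
    return x_mm, y_mm

# Example: a small synthetic contact blob centered at pixel (31, 41)
image = np.zeros((64, 64))
image[30:33, 40:43] = [[1, 2, 1], [2, 4, 2], [1, 2, 1]]
print(localize_contact(image))   # -> (2.05, 1.55), in millimeters
```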

The robotic hand is equipped with 17 vision-based tactile sensors across six optimized regions of the hand. These not only provide contact data but also serve structural purposes, preserving the hand’s full range of motion. A custom generative algorithm further enhances the system by synthesizing human-like grasping strategies, enabling the hand to quickly adapt to changing conditions.
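To make the idea of tactile-informed adaptation concrete, here is a minimal closed-loop sketch in the spirit of what the article describes. It is not the F-TAC Hand's actual control code: the slip threshold, force step, and interfaces are all hypothetical, and the real system couples its tactile data to a generative grasp planner rather than this simple rule.

```python
import numpy as np

# All names and numbers here are invented for illustration; this is not
# the F-TAC Hand's control code.
SLIP_THRESHOLD = 0.3   # assumed shear-signal level that indicates slip
FORCE_STEP_N = 0.2     # assumed per-cycle grip-force increment (newtons)
MAX_FORCE_N = 5.0      # assumed safety cap on fingertip force

def control_step(shear: np.ndarray, grip_forces: np.ndarray) -> np.ndarray:
    """One control cycle: tighten any finger whose tactile patch reports
    shear above the slip threshold; leave the others alone."""
    slipping = shear > SLIP_THRESHOLD
    return np.minimum(grip_forces + slipping * FORCE_STEP_N, MAX_FORCE_N)

# Simulated loop: five fingers, with finger 2 slipping mid-grasp
forces = np.full(5, 1.0)
for cycle in range(3):
    shear = np.array([0.1, 0.1, 0.5, 0.1, 0.1])  # stand-in sensor readout
    forces = control_step(shear, forces)
    print(f"cycle {cycle}: forces = {forces}")
```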

In a battery of 600 real-world experiments, the F-TAC Hand significantly outperformed non-tactile systems in multi-object grasping scenarios, with success rates rising from 53.5% to a perfect 100%. This kind of tactile-informed feedback is essential for applications where reliability and adaptability are non-negotiable, such as surgical robotics, aerospace assembly, or disaster response.

As AI research begins to focus more on the physical world and less on digital abstraction, projects like the F-TAC Hand may be pointing the way forward. By combining intelligent algorithms with advanced sensing hardware, embodied AI has the potential to unlock new levels of machine intelligence.
