AI doesn’t mean the robots are coming.

Pepper, the humanoid robot, was born to great expectations in 2014, even visiting the Financial Times to meet the editor. “This is a selfless robot, powered by love,” said Masayoshi Son, head of its biggest backer, SoftBank. Alibaba and Foxconn invested hundreds of millions of dollars in the effort to make robots part of everyday life. It was not to be. Pepper can still occasionally be spotted in a Japanese public library, unplugged and head bowed, like a four-foot Pinocchio that dreamed of becoming a real boy but never did. Production ceased in 2021 after only 27,000 units were made.

But the humanoid robot, the vision of a machine that can do all the jobs we would rather not, is too attractive an idea to let go of for long. Impressive recent advances in artificial intelligence have sparked a new wave of enthusiasm for robotics. “The next wave of AI is physical AI. AI that understands the laws of physics, AI that can work among us,” said Jensen Huang, chief executive of chip designer Nvidia.

Billions of dollars of venture capital are pouring into robotics startups. They aim to apply the kind of model-training techniques that can predict how a protein will fold, or generate surprisingly realistic text, to the physical world. The goal is, first, to help robots understand what they see around them and, second, to let them interact with it naturally, solving the enormous programming task involved in something as simple as picking up and manipulating an object.

Such is the dream. The latest round of investors and entrepreneurs, however, may end up as disappointed as Pepper’s backers. Not because AI will not be useful, but because the obstacles to an economically viable robot that cooks dinner and cleans the toilet are as much a hardware problem as a software one, and AI cannot solve them on its own.

These physical challenges are many and severe. A human arm or leg, for example, is moved by muscles, whereas a robot limb must be moved by motors, and every axis of motion the limb must move through requires an additional motor. Factory robot arms show that all of this is doable, but high-performance motors, gears and transmissions add mass, cost and power demands, wear out over time and multiply the number of parts.

Beyond producing the desired movement, there is the challenge of sensing and feedback. Pick up a piece of fruit, for example, and the nerves in your hand tell you how soft it feels and how hard you can squeeze it. You can taste and smell whether food is cooked. None of those senses is easy to give to a robot, and where it is possible, they add extra cost. Machine vision and AI can compensate, seeing that the fruit is being squashed or that the food in the pot has turned the right color, but they are imperfect substitutes.

Then there is the issue of power. Any autonomous machine needs its own power source. Factory robot arms are wired into the mains, but that means they cannot move around. A humanoid robot would most likely run on a battery, which brings significant trade-offs in mass, power, durability, flexibility, uptime, useful life and cost. These are only some of the problems. Many clever people are working on them and making progress. The point is that these physical challenges are long-standing and hard, and not even a revolution in AI will make them go away.

So what will AI make possible in the physical world? Rather than asking what new machines the technology will enable, it is more practical to imagine how existing machines will change once AI is applied to them.

The obvious example is the self-driving vehicle. Here the machine barely needs to change at all: the car already moves through the physical world and carries its own power source, while driving is above all a problem of perception, which is precisely what the new AI is good at. In fact, the opposite point applies: self-driving is a vast market and a real-world challenge well suited to AI, something anyone looking to invest in robotics should bear in mind.

It is also reasonable to ask how existing robots, from industrial arms to robotic vacuum cleaners, will evolve. AI-powered machine vision can quietly expand the range of tasks a robot arm can perform and allow it to work alongside humans. Light, single-purpose devices such as robot vacuum cleaners will gradually become more useful; it is already common in Chinese hotels, for example, for a robot to bring items to your room. This kind of limited, controlled autonomy is the most readily achievable.

In this way, AI will gradually bring us closer to the android. But a robot like Pepper will find it far easier to write a bad poem than to clean the toilet, and that is unlikely to change any time soon.

robin.harding@ft.com
