Quantum development and the future of robotics

While advances in how robots move and ‘think’ will accelerate with the introduction of quantum computing, the more immediate opportunity is in vertical applications

Jane Wakefield

Robots already walk among us. Whether they be cute dancing robots at tech conferences or robotic arms building cars or performing surgery, these complex machines are rapidly becoming an integral part of our culture.

While industrial and medical robots are honed to their tasks and perform well, many of the more humanoid bots just aren’t that good. Those viral videos of a Boston Dynamics Atlas robot leaping on tables or running through woods don’t reflect the reality of where the industry is. It is actually incredibly hard to build a robot that can move as a human does. Boston Dynamics chairman Marc Raibert once told me that he pushed his toddler over and watched how she got up, in order to learn how humans do it!

Making a robot that can move, see and even think like a human is going to be a huge effort. And to achieve this goal, researchers will require cutting-edge technology – such as quantum.

The term ‘quantum robot’ was coined in 2006, in a paper published by Cambridge University Press. The researchers pointed specifically to how quantum sensors could help a robot better understand its environment.

We are already surrounded by sensors, both in our phones and in the wider environment. They measure heat, light, movement, and pressure, among other things. Quantum sensors take this a step further by exploiting how particles behave at the atomic scale, allowing them to detect tiny movements with remarkable accuracy.

Quantum navigation

Such sensors are already being used in some sectors, Kristin M Gilkes, a partner at EY and associate fellow at Oxford University, tells BI Foresight.

“Most of [the work] has been in the defence industry but it has also become commercially available in the last six months,” she explains. Examples include quantum navigation systems that can be used as back-ups to GPS in ships and aircraft.

“Recently there was an aircraft which had its GPS hacked and the pilots veered into Iranian airspace, which could have been really dangerous. Quantum sensors sense the Earth’s gravity, which is a unique fingerprint.”

In the oil and gas industry, quantum sensors are being incorporated into robots in order to carry out inspections in places that would be unsafe for humans to go. And in agriculture, such sensors can determine the precise amounts of light needed to grow crops optimally.

Robotic reaction times

Ekaterina Almasque is a general partner at venture capital firm OpenOcean, and she sees huge potential for quantum in the robotics space.


“One of the biggest things in autonomous machines today is the speed of reaction and the accuracy of perception of the environment. And how you translate perception into action,” she explains. “We’ve seen problems with autonomous driving when the vehicles don’t recognise obstacles fast enough or they perceive [them] wrongly. Or in farming, where you want to deploy robots to, say, pick strawberries, but they are not accurate enough to pick them without destroying them.

“Quantum development that is happening today will be able to address many of those questions. Sensors can provide a more accurate perception of information.”

Quantum technology can also help with the growing armies of robots that are used in factories and warehouses, she says.

“Logistics operations today are done quite simplistically where basically you just optimise a certain amount of routes for the robots, even though there are endless variations,” Almasque explains. “Quantum is capable of dealing with much more complex environments.”
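To see why “endless variations” defeats the simple approach Almasque describes, consider a toy example (a minimal illustrative sketch, not from the article; the pick locations are made up): with just a handful of stops, a warehouse robot already faces a factorially growing number of possible visit orders, which is why brute-force route optimisation stops scaling long before real fleet sizes.

```python
# Illustrative sketch: the combinatorial explosion behind robot routing.
# With n stops there are n! possible visit orders; brute force checks them all.
from itertools import permutations
from math import dist, factorial

stops = [(0, 0), (4, 1), (2, 5), (6, 3), (1, 4)]  # hypothetical pick locations

def route_length(order):
    """Total straight-line distance of visiting the stops in the given order."""
    return sum(dist(order[i], order[i + 1]) for i in range(len(order) - 1))

# Exhaustively search every ordering - feasible at 5 stops, hopeless at 50.
best = min(permutations(stops), key=route_length)
print(f"{factorial(len(stops))} candidate routes for just {len(stops)} stops")
print(f"shortest route length: {route_length(best):.2f}")
```

At 5 stops there are 120 orderings; at 20 stops there are more than 10^18, which is the kind of search space quantum optimisation is hoped to handle better than classical heuristics.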

Bringing quantum image analysis to robotics

Quantum technology could also help improve how robots see the world via something called quantum image processing, which would allow robots to understand visual information far more quickly. 

Greyparrot is a UK cleantech start-up that uses AI to identify rubbish. Founder Mikela Druckman tells BI Foresight that computer vision is fundamental to her firm’s waste analytics capability.

“The cameras in our Analyzer units collect critical visual information that our AI uses to identify the content of waste streams and provide insights into a sorting facility’s performance, product purity, and more,” she says. “Analyzer’s AI acts as the platform’s intelligent core, bringing mountains of data sets into a form in which they can be analysed and processed, enabling us to extract the hidden insights that we call ‘waste intelligence’.”

The Analyzer can currently recognise 89 distinct types of waste, with seven layers of information, and quantum could boost this further.

Quantum benefits for unmanned vehicles

Amina Hamoud is a lecturer in systems engineering at the University of the West of England. She co-leads the connected autonomous vehicle research group within the Bristol Robotics Lab, where much of her work is centred around testing unmanned vehicles. 

The Bristol Robotics Lab is looking into using quantum simulations within its robotics research

“I am looking at how we utilise quantum computing to enable us to do better simulations, to go beyond the classical computer’s computing limitations and use quantum-powered simulations to enable us to go further,” she tells BI Foresight. “There are a lot of complexities within the transport system; we have a lot of different sources and lots of data and using quantum allows us to go further with our simulations.

“The time and money you gain, in terms of running and validating a prototype, is where it will really help. Instead of spending days or even months processing what the data from the car is giving you, you will have it within a couple of minutes. And gaining time means that we can do more testing and explore more operating conditions, and more complex environments.”

Out of the lab, Hamoud is looking at the role quantum sensors could play in providing better object detection and navigation for autonomous vehicles.

“For example, in conjunction with GPS, if you are going under a tunnel, a photonic sensor gives you a functionality which enables you to keep knowing where you are.”
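The idea Hamoud describes is a form of dead reckoning: when GPS drops out in a tunnel, the vehicle keeps estimating its position by integrating what its inertial sensors report, and quantum accelerometers promise to do this with far less drift than conventional ones. A minimal one-dimensional sketch (all numbers hypothetical, not from the article):

```python
# Illustrative sketch: bridging a GPS outage by dead reckoning -
# integrating acceleration readings twice over time to update position.
def dead_reckon(position, velocity, accel_samples, dt):
    """Advance a 1-D position estimate from a stream of acceleration samples."""
    for a in accel_samples:
        velocity += a * dt          # integrate acceleration -> velocity
        position += velocity * dt   # integrate velocity -> position
    return position, velocity

# A vehicle enters a tunnel at the 50 m mark doing 15 m/s, braking gently
# at 0.5 m/s^2, with sensor readings arriving every 0.1 s for one second.
pos, vel = dead_reckon(position=50.0, velocity=15.0,
                       accel_samples=[-0.5] * 10, dt=0.1)
print(f"estimated position after 1 s: {pos:.1f} m, speed: {vel:.2f} m/s")
```

In a real vehicle this runs in three dimensions and is fused with GPS whenever a fix is available; the accuracy of the accelerometer determines how long the estimate stays trustworthy between fixes.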

Small-scale sensors

At the Quantum Engineering Technology lab at the University of Bristol, researchers are combining robotics and quantum in a different way, and have developed a robotic arm designed to help in quantum research. The arm is well suited to the often harsh conditions of quantum experiments, which include extremely low temperatures and atomic-scale interactions.

Research fellow Dr Joe Smith explains: “We needed to have some way of overcoming these constraints. I knew that in surgery they had started to use precision robots, so we use the same principle to move things around.”

He has launched a start-up, RobQuant, which is developing the robotic arm for more widespread use in quantum sensing experiments. In terms of how he sees quantum impacting on robotics in the future, he is thinking small. Very small. 

“In micro-robotics, for example, you might have a medical capsule to navigate around the body. And this capsule could be a quantum sensor tracking what is going on in the body.”

Big ambitions

The knotty problem of creating a robot that can walk and talk like a human has had roboticists scratching their heads for decades. Recently Elon Musk declared that Tesla may start selling its all-purpose humanoid robot Optimus as early as next year.

Some critics think that is optimistic – when the bot was shown off in the summer of 2021, it looked amazing until it was revealed to be a man in a robot suit.

And Musk now faces stiff competition from an interesting start-up, Figure, which has backing from OpenAI, Nvidia, and Amazon founder Jeff Bezos, and has been valued at $2.6bn.

Recently a video of its robot, called Figure 01, went viral on social media, amazing the public with its ability to reason and think in very human-like ways. When asked by Figure CEO Brett Adcock for “something to eat”, the robot chose and picked up an apple, explaining that it was the only edible thing on the table.

The Figure 01 robot promises much – but can it deliver? © Forbes

Figure describes itself as a “first of its kind AI robots company bringing a general-purpose humanoid to life”. The firm has big ambitions and aims to provide robots that support humans in warehouses, transportation, and retail around the world. 

But while its thinking may seem humanlike, its movements remain rudimentary. When Forbes recently visited the Figure offices to see the robot in action, it reported that it took five minutes to warm up before it would walk, and mid-warm-up got “a kink in its mechanical hip, which made its right leg gyrate wildly at an odd angle”.

Robots may be walking among us, but currently not very well. Quantum could change that, thinks Gilkes.

“Robotics engineers now are wearing sensors and capturing data from activities such as doing the dishes so that robots can mimic these activities,” she says. “But what some companies are starting to do is use quantum sensors to track the really fine motor movements of a human that cannot be captured by a normal sensor.”

Jane Wakefield / Writer

Jane Wakefield has been a technology journalist for more than 20 years, covering every aspect of technology, from regulation to broadband, smart cities to artificial intelligence. She has travelled the world making radio docs for the BBC, including witnessing a world-first drone flight across Lake Victoria. Her dubious claim to fame is being the first UK journalist to interview a sex robot. She is now a freelance writer, podcaster, media trainer and conference host.