
AI Gets Physical


Fri, 06/29/2018 - 12:00


Robots are to artificial intelligence what bodies are to brains. Assistant Professor Pham Quang Cuong, a roboticist at Nanyang Technological University, Singapore, is helping AI impact the physical world.

Tucked into the western corner of Singapore, two robotic arms are busy dismantling the notion of build-it-yourself furniture. Their first victim: an IKEA Stefan chair, its individual parts laid out on a worktable, awaiting assembly.

Swivelling on its base, one arm grabs the back-support frame of the chair while the other nimbly picks up a pin. Then, as if performing a synchronised dance, they both turn and position themselves such that the pin is directly above the hole into which it must be inserted. The arm holding the pin descends and, finding its mark, gently releases the pin. The process continues, and within nine minutes, the robot has pieced together the chair without any human intervention.

This feat of automation has been made possible by Assistant Professor Pham Quang Cuong and his team at the School of Mechanical and Aerospace Engineering at Nanyang Technological University, Singapore. Dabbling in robotics might seem odd for someone trained as a neuroscientist in France, but Professor Pham has found that the human brain—widely considered the epitome of information processing devices—can help inform the way robots are built to autonomously perform complex tasks.

One method of developing artificial intelligence (AI) is even called an ‘artificial neural network’, as it attempts to mimic the inner workings of biological grey matter. By bringing the fields of neuroscience, AI and robotics together, Professor Pham seeks to bridge smart machines and the physical world.

A new dimension of intelligence

“Much progress has been made in domains where AI does not have to physically interact with the world, for example, with natural language processing, image analysis and perception,” Professor Pham notes. “When it comes to physical interactions, however, there are many additional difficulties [for AI].”

For one, the physical world exists in three dimensions, so any machine built to manipulate physical objects must have spatial awareness. The robotic arms in Professor Pham’s lab thus rely on a camera to obtain three-dimensional (3D) images of the various parts of the IKEA chair. The camera also allows the robot to map out the workspace and locate the workpieces.
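To give a flavour of this localisation step, here is a toy sketch in Python: the camera’s output is treated as a list of (x, y, z) points, anything rising above the table plane is assumed to belong to a part, and the centroid of those points gives a rough grasp target. The names, threshold and data are illustrative assumptions, not details of Professor Pham’s actual system, which works with far richer 3D data.

```python
TABLE_Z = 0.02  # metres; points at or below this height are the tabletop (assumed)

def locate_workpiece(points):
    """Return the centroid of all 3D points above the table plane, or None."""
    part = [p for p in points if p[2] > TABLE_Z]
    if not part:
        return None  # nothing detected above the table
    n = len(part)
    # Average each coordinate axis across the detected points
    return tuple(sum(axis) / n for axis in zip(*part))

# A tiny simulated scene: one tabletop point and three points on a raised part
scene = [(0.10, 0.20, 0.00), (0.30, 0.40, 0.05),
         (0.32, 0.42, 0.07), (0.31, 0.41, 0.06)]
print(locate_workpiece(scene))
```

In a real system the same idea scales up: segment the point cloud, match each cluster against a model of the chair part, and hand the estimated pose to the motion planner.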

Once this is done, the robotic arms “plan the motions to reach and manipulate the workpieces in the simulation environment”. Just as humans may visualise a task in their minds before actually performing it, the robot runs practice sessions in virtual reality, guided by algorithms—a set of rules governing parameters such as the angle of its joints and the range of its swivel, as well as the timing of each motion.

Simulation is an important step in ensuring that the two arms do not obstruct or collide with each other while assembling the chair. By the time the robotic arms actually move to pick up the workpieces, the sequence of actions may have already been rehearsed thousands of times.
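A minimal sketch of the collision check behind such rehearsals might look like this: each arm’s plan is a list of end-effector positions over time, and a rehearsal fails if the arms ever come within a safety clearance of each other at the same timestep. This is purely illustrative; a real planner reasons about the full geometry of both arms, not just their end points.

```python
import math

CLEARANCE = 0.10  # metres; minimum allowed separation (assumed)

def plans_collide(plan_a, plan_b):
    """True if the two arms ever violate the clearance at the same timestep."""
    return any(math.dist(p, q) < CLEARANCE
               for p, q in zip(plan_a, plan_b))

# Two simulated plans: the arms converge on nearby waypoints at the end
arm1 = [(0.0, 0.0, 0.3), (0.1, 0.1, 0.3), (0.20, 0.20, 0.3)]
arm2 = [(0.5, 0.5, 0.3), (0.4, 0.4, 0.3), (0.25, 0.22, 0.3)]
print(plans_collide(arm1, arm2))  # → True: final waypoints are ~5 cm apart
```

When a rehearsal fails like this, the planner discards or re-times that candidate motion and tries another, which is why thousands of virtual runs may precede a single real one.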


The robotic arms assembling the chair


Pressure to perform

But simulations are not enough, as they may not always accurately reflect real-world physics. Professor Pham highlights that while most industrial robots are designed to be precise at positioning, they are generally poor at regulating forces.

In the case of assembling the chair, if the robotic arms grasp too tightly, they may damage the chair parts; but hold on too loosely and the workpieces simply slip through the parallel grippers. Adding yet another layer of complexity are the sometimes-tight insertions of wooden plugs into the chair frame.

“The physics involved are difficult to model accurately,” Professor Pham says, adding that it is also more difficult to obtain large datasets to train AI to manipulate physical objects. In the face of challenges on both the modelling and the data fronts, a hardware solution could serve as a stopgap measure: another ‘sense’—the sense of touch—can be enlisted to better calibrate the robotic arms’ motions.

To imbue robotic arms with a semblance of touch, pressure sensors are embedded within their ‘wrists’, allowing each limb to optimise its grip strength according to the task at hand. At the same time, the sense of touch functions as a feedback channel to the robotic arms, increasing the accuracy with which they manipulate objects. Where vision and simulation fail, the robot can still ‘feel around’ for holes and insert pins in an exact manner.
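The feedback loop described above can be caricatured as a simple proportional controller: the grip command is nudged toward a target force reading, tightening when the sensed force is too low (the part is slipping) and relaxing when it is too high (risk of damaging the wood). The target, gain and sensor readings below are made-up illustrative values, not parameters of the actual system.

```python
TARGET_FORCE = 5.0   # newtons; desired grip reading (assumed)
GAIN = 0.5           # proportional gain (assumed)

def adjust_grip(current_grip, sensed_force):
    """Return an updated grip command from the latest pressure reading."""
    error = TARGET_FORCE - sensed_force
    return current_grip + GAIN * error

grip = 2.0
for reading in [2.0, 3.5, 4.3, 4.6]:  # simulated pressure-sensor readings
    grip = adjust_grip(grip, reading)
print(round(grip, 2))  # → 4.8: the grip converges toward the target force
```

The same feel-and-correct idea carries over to insertion: if a pin meets unexpected resistance, the sensed force tells the arm it has missed the hole, and small corrective motions let it search until the force profile matches a clean fit.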

No holding back the future

Watching the seamless operation of the robotic arms in Professor Pham’s lab, one might conclude that the age of machines is already upon us. However, the roboticist himself thinks that society is still very far from full automation. As a researcher seeking to enable the future of automation, he acknowledges that there are both benefits and risks associated with the technologies he is developing.

“[Automation] could relieve some workers from tedious or hazardous tasks, but at the same time carries the threat of unemployment for many,” he says. 

Even so, Professor Pham emphasises that there is no reason to subscribe to the doom-and-gloom scenario of mass unemployment. The contradictions created by automation could very well lead to major socio-economic upheavals that overturn all our notions of automation and unemployment, and workers need not be at the losing end.

In the meantime, Professor Pham is looking to integrate more AI into the development of the robotic arms. He also hopes to apply the robotic capabilities developed in the IKEA chair project (perception, motion planning and control) to automation problems in precision, aerospace and automotive manufacturing, among other fields. Like it or not, the intelligent machines are on the march, and they may soon go from assembling furniture to building our futures.


Video provided by Professor Pham Quang Cuong, NTU


SGInnovate embraces innovative and disruptive technologies. We believe in the future of Artificial Intelligence and have invested in various AI startups.


