Packaging Digest is part of the Informa Markets Division of Informa PLC


Image credit: NUS
This novel robotic system, developed by National University of Singapore (NUS) researchers, comprises an artificial brain that mimics biological neural networks. It runs on a power-efficient neuromorphic processor, such as Intel’s Loihi chip, and is integrated with artificial skin and vision sensors.

Artificial Brain Gives Robots Unprecedented Sensing Capabilities

Paired with artificial robotic skin, a new neuromorphic processing system can help machines "feel" and manipulate objects similar to humans.

Robots have come a long way in functionality, but they still lack many of the sensing capabilities that humans rely on to interact with their environments.

To solve this issue, researchers at the National University of Singapore (NUS) have created a complex artificial brain system called NeuTouch that mimics human neural networks to provide neuromorphic processing for robotic systems. This should provide them with more sophisticated sensing functionality, including what’s needed to pick up, hold, and manipulate objects in a way that mimics human interactions.

The problem with current robotic systems is that they depend on visual processing rather than the sense of touch humans use to handle and manipulate objects, says Benjamin C.K. Tee, an assistant professor in Materials Science and Engineering at NUS, who co-led the development of NeuTouch with Harold Soh, an assistant professor in Computer Science at NUS.

“Robots need to have a sense of touch to interact better with humans, but robots today still cannot feel objects very well,” he tells Packaging Digest's sister publication Design News. “Touch sensing allows robots to perceive objects based on their physical properties, such as surface texture, weight, and stiffness. Such tactile sensing capability augments the robot’s perception of the physical world with information beyond what standard vision and auditory modalities can provide.”

 

Building a Complete System

The new solution builds on technology Tee and fellow researchers created last year when they developed an artificial nervous system that can give robots and prosthetic devices a sense of touch on par with or even better than human skin.

This system, called Asynchronous Coded Electronic Skin (ACES), can detect touches more than 1,000 times faster than the human sensory nervous system, as well as identify the shape, texture, and hardness of objects 10 times faster than the blink of an eye, Design News reported at the time.

NeuTouch can process sensory data from ACES using neuromorphic technology, which is an area of computing that emulates the neural structure and operation of the human brain. To do this, researchers integrated Intel’s Loihi neuromorphic research chip into the system, Tee says.

By using ACES, NeuTouch can mimic the function of the fast-adapting (FA) mechano-receptors of a human fingertip, which captures dynamic pressure, or dynamic skin deformations, Tee says.

“FA responses are crucial for dexterous manipulation tasks that require rapid detection of object slippage, object hardness, and local curvature,” he tells Design News.
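The fast-adapting behavior described above — responding to changes in pressure rather than to steady pressure — can be illustrated with a simple sketch. This is not the researchers' actual model; the pressure trace, the first-difference approximation of an FA response, and the threshold are all invented for illustration:

```python
# Illustrative sketch: a fast-adapting (FA) sensor responds to *changes*
# in pressure, not to steady pressure. Here we emulate that with a simple
# first difference and flag a sudden pressure drop as a possible slip.
# (Signal values and threshold are invented for illustration.)

def fa_response(pressure):
    """Return the change in pressure between consecutive samples."""
    return [b - a for a, b in zip(pressure, pressure[1:])]

def detect_slip(pressure, threshold=0.3):
    """Return sample indices where pressure drops faster than the threshold."""
    return [i + 1 for i, d in enumerate(fa_response(pressure)) if d < -threshold]

# Steady grip, then a sudden loss of contact pressure at sample 5.
trace = [0.0, 0.8, 0.8, 0.8, 0.8, 0.2, 0.2, 0.2]
print(detect_slip(trace))  # -> [5]
```

A sustained-pressure sensor would report the same value for samples 1 through 4; the FA-style signal is silent there and fires only at the transient, which is exactly the event a slip detector needs.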

 

Testing for Results

To test the system, researchers fitted a robotic hand with ACES and used it to read Braille, passing the tactile data to Loihi via the cloud, which converted the micro-bumps felt by the hand into semantic meaning.

In these experiments, Loihi achieved over 92% accuracy in classifying the Braille letters, while using 20 times less power than a normal microprocessor.

In other tests, researchers demonstrated how they could improve the robot’s perception capabilities by combining both vision and touch data in a spiking neural network. They tasked a robot equipped with both artificial skin and vision sensors to classify various opaque containers containing differing amounts of liquid. They also tested the system’s ability to identify rotational slip, which is important for stable grasping.

In both tests, the spiking neural network that used both vision and touch data was able to classify objects and detect object slippage with 10% more accuracy than a system that used only vision.
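The intuition behind the fusion result — that an ambiguous vision signal can be disambiguated by touch — can be sketched with a toy score-combination example. This stands in for the paper's spiking neural network; the class labels and evidence scores below are invented for illustration:

```python
# Toy sketch of multimodal fusion: sum per-class evidence from a vision
# channel and a touch channel, then pick the class with the highest
# combined score. (A stand-in for the actual spiking network; class
# names and scores are invented for illustration.)

def fuse_and_classify(vision_scores, touch_scores):
    """Sum per-class evidence from both modalities; return the best class."""
    combined = {c: vision_scores[c] + touch_scores[c] for c in vision_scores}
    return max(combined, key=combined.get)

# Vision alone cannot tell "half" from "full" for an opaque container,
# but touch cues (weight, stiffness) break the tie.
vision = {"empty": 0.2, "half": 0.4, "full": 0.4}
touch  = {"empty": 0.1, "half": 0.2, "full": 0.7}
print(fuse_and_classify(vision, touch))  # -> full
```

Vision by itself ties at 0.4 between "half" and "full"; adding the touch evidence makes "full" the clear winner, which is the same reason the combined network outperformed the vision-only system in the experiments.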

Moreover, NeuTouch also could classify the sensory data while it was being accumulated, unlike the conventional approach where data is classified after it has been fully gathered.
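The "classify while accumulating" idea can be sketched as an evidence-accumulation loop: a decision is emitted as soon as one class is confidently ahead, rather than after the full recording. This is an illustrative sketch, not the system's actual decision rule; the event stream and margin are invented:

```python
# Sketch of anytime classification: keep a running count of evidence per
# class as events (spikes) arrive, and return a decision as soon as the
# leading class is `margin` events ahead of the runner-up.
# (Event stream and margin are invented for illustration.)

def classify_online(events, margin=3):
    """events: iterable of class labels, one per incoming spike.
    Returns (predicted_class, events_consumed)."""
    counts = {}
    for n, label in enumerate(events, start=1):
        counts[label] = counts.get(label, 0) + 1
        ranked = sorted(counts.values(), reverse=True)
        runner_up = ranked[1] if len(ranked) > 1 else 0
        if ranked[0] - runner_up >= margin:
            return max(counts, key=counts.get), n
    return max(counts, key=counts.get), len(events)

stream = ["A", "B", "A", "A", "A", "A", "B", "A", "B", "A"]
print(classify_online(stream))  # -> ('A', 5): decided after 5 of 10 events
```

A batch classifier would wait for all ten events; the online version commits halfway through, which is the latency advantage of classifying data as it accumulates.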

The tests also demonstrated the efficiency of neuromorphic technology; Loihi processed the sensory data 21% faster than a top-performing graphics processing unit (GPU) while using more than 45 times less power.

Researchers published a paper on their work online and presented their findings at the Robotics: Science and Systems conference.

 

Applications and Post-COVID-19 Uses

Some applications for NeuTouch include integrating the system into robot grippers to detect slip, which is key to manipulating fragile objects safely and with stability, such as in factory or supply-chain settings, Tee tells Design News.

“Accurate detection of slip will allow the robot controller to re-grasp the object and remedy poor initial grasp locations,” he says. “This feature can be applied to develop more intelligent robots to take over mundane operations such as packing of items in warehouses, which robotic arms can easily adapt to unfamiliar items and apply the appropriate amount of strength to manipulate the items without slippage.”

The system also can be used to create autonomous robots “capable of deft manipulation in (unstructured) physical spaces, since the robots have the ability to feel and better perceive their surroundings,” he adds.  

Moving forward, researchers plan to continue their work to develop the artificial skin for applications in the logistics and food manufacturing industries where there is a high demand for robotic automation, Tee tells Design News.

This type of functionality will especially become more critical in a post-COVID-19 world for creating applications that avoid human contact by letting robots do the work, he says.

 

Elizabeth Montalbano is a freelance writer who has written about technology and culture for more than 20 years. She has lived and worked as a professional journalist in Phoenix, San Francisco, and New York City. In her free time, she enjoys surfing, traveling, music, yoga, and cooking. She currently resides in a village on the southwest coast of Portugal.

