Robots that can ‘feel’ perform precise packaging tasks

By Lisa McTigue Pierce in Robotics on April 11, 2017

Robots are getting more human every day. Yet, have you ever seen a robot shed a tear of joy or gasp in surprise? Probably not…but they’re about to show their “sensitive” side.

New 6-axis force/torque sensors from robotics manufacturer OptoForce are equipping robots with a sense of touch, imparting a dexterity and a flexibility that are critical for performing high-precision operations, such as assembly, positioning and packaging. In some cases, these robots are now able to do tasks that couldn’t be automated before.

Rated IP 66, the 6-axis force/torque sensors come with a one-year warranty, an ISO-standard flange (so no adapter plate is needed), and a standard Ethernet or EtherCAT interface. Easy to install, lightweight and robust, the OptoForce 6-axis sensors outperform related technologies in precision, cost, strength and flexibility, according to the company. They are compatible with the KUKA force/torque package, and have plug-and-play installation on robots from Universal Robots.

As the U.S. prepares for a resurgence of manufacturing, high-tech affordable automation will be in strong demand. On March 27, the Budapest, Hungary-headquartered company announced the opening of an office in Charlotte, N.C., to serve North American manufacturers with an automation solution that helps decrease costs and improve productivity. This expansion is backed by venture capitalists and Enrico Krog Iversen, former CEO of Universal Robots.

Packaging Digest reached out to Ákos Dömötör, CEO of OptoForce, for more details. He explains why we need robots with “touch” and what packaging engineers can accomplish with them.

 

Why do today’s robotics tasks require a new level of sensitivity?
Dömötör: Traditionally, industrial robots were designed to carry out repetitive tasks in a very structured environment. Jigs were positioned precisely—the handled objects had standard dimensions and were mostly made of materials that didn't deform easily. It was sufficient if the robots could move from A to B with high precision. This is no longer the case.

With the advancement of robotic controllers, it became possible to carry out more elaborate tasks. Collaborative robots are now working alongside unpredictable human coworkers. And people expect them to do better in tasks that require constant feedback about the environment. For example, when an object can deform, or when objects of different sizes arrive one after another, you can't blindly rely on preprogrammed positions. You need to know what is happening out there.

 

Robots have used vision sensors for decades to help guide them in their tasks. Why is it important that a robot “feel” its way now? What has changed in production that makes this an improvement?

Dömötör: Vision sensors have mostly captured still images and carried out tasks based on those photographs, which means they couldn't provide "on the fly" information about the current process. And in the rare cases where they are used that way, they are complicated to operate.

Tactile sensing can be evaluated on the fly—our sensors are sampled 1,000 times per second now, giving the robots instant feedback about how they should change course to get the right output. On top of that, there are simply things you could never see, such as how a pin is jammed inside a hole, not to mention the struggle with “seeing” shiny surfaces.
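As a rough illustration of the closed-loop correction Dömötör describes, the following Python sketch shows one step of a proportional force-feedback controller running at the 1 kHz sampling rate he mentions. The sensor read, gain and target values are hypothetical stand-ins for illustration, not OptoForce's actual API:

```python
# Hypothetical sketch of a 1 kHz force-feedback loop: read a 6-axis
# wrench (Fx, Fy, Fz, Tx, Ty, Tz) and nudge the tool to keep contact
# force near a target. All names and values here are illustrative only.

TARGET_FZ = 5.0      # desired contact force along z, in newtons
GAIN = 0.0001        # proportional gain: metres per newton of error
PERIOD = 0.001       # 1 kHz sampling period, in seconds

def read_wrench():
    """Stand-in for a sensor read; returns (Fx, Fy, Fz, Tx, Ty, Tz)."""
    return (0.0, 0.0, 4.2, 0.0, 0.0, 0.0)

def control_step(fz):
    """One proportional step: too little force commands a move down."""
    error = TARGET_FZ - fz
    return GAIN * error   # z-offset (in metres) to command this cycle

offset = control_step(read_wrench()[2])
```

In a real loop this step would repeat every `PERIOD` seconds, with the commanded offset sent to the robot controller each cycle.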

 

What packaging applications make the most sense for robots that can feel and why?

Dömötör: Inserting glossy paper cartons into shipping boxes, where it is important that you don't scratch the surface, would be a good example. Another could be careful palletizing of fragile objects. In these cases, we can prevent package damage caused by the robot, so you can automate more tasks.

You can also use the OptoForce Path Recording function to make programming tasks such as gluing simpler. This way you get a much more flexible production line.

 

Can you give us an example(s) of how a robot with your sensors has decreased costs and improved productivity in a packaging application?

Dömötör: Just recently, one of our partners in the U.K. automated the de-palletizing of boxes at a major wine bottling factory (see photo below). The problem he solved with OptoForce sensors was that the height of the stack would change, so the robot had to feel how low it needed to go to find the boxes. The robot also adapted its orientation to get a more stable grasp.
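The "feel how low to go" behavior can be sketched as a simple guarded move: lower the tool until the measured vertical force crosses a contact threshold. This is an illustrative sketch with made-up names and values, not the partner's actual implementation:

```python
# Illustrative sketch of "feeling" for a variable-height stack: step the
# tool downward until the vertical force indicates contact. Threshold,
# step size and travel limit are hypothetical, not OptoForce parameters.
CONTACT_THRESHOLD = 3.0   # newtons; force indicating box contact
STEP = 0.002              # metres per downward step
MAX_TRAVEL = 0.5          # safety limit in metres

def find_stack_top(read_fz, start_z):
    """Lower from start_z until contact force is felt; return contact z."""
    z = start_z
    while start_z - z < MAX_TRAVEL:
        if read_fz(z) > CONTACT_THRESHOLD:
            return z          # contact: top of the stack found
        z -= STEP             # no contact yet, keep descending
    raise RuntimeError("no contact within travel limit")

# Simulated stack whose top sits at z = 0.30 m:
top = find_stack_top(lambda z: 10.0 if z <= 0.30 else 0.0, start_z=0.40)
```

The same pattern generalizes: replace the simulated force reading with the live sensor value and the search adapts to whatever stack height it encounters.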

 

Are these sensors useful only for collaborative robots or could any style robot gain added functionality?

Dömötör: We focus on lightweight industrial robots—so not just on collaborative ones. We target the payload range of 0 to 20 kg (0 to 44 lb), as this is about the range where most of the manpower-intensive applications are. Tasks carried out in a changing environment can benefit from touch sensing—they don't necessarily need to be collaborative applications.

 

How much faster can a robot equipped with these sensors learn a production task, compared to the same type of robot that doesn’t have these sensors?

Dömötör: When we are talking about teaching a robot with our Path Recording and Hand Guiding software, it could easily be half the time. But the real benefit comes in applications that require force feedback during operations, such as assembly or polishing. Most of these tasks simply couldn't be automated with a standard robot without force sensors.

 

How many of these sensors need to be incorporated into a 6-axis robot and why?

Dömötör: As our sensors are capable of sensing along six axes (three force and three torque axes), it is usually enough to use one sensor per application. The sensors are usually mounted between the tool and the robot flange so we can pretty much detect anything that is happening on the tool.

 

Your press release says your sensors are currently compatible only with robots from Universal Robots and KUKA. Why just those robots?

Dömötör: Actually, our sensors can be used with many more robots already. The question is only how simple it is to set everything up. If you have an Ethernet port on your robot, chances are that you will be able to use the force/torque vectors quite easily.
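For a robot with an Ethernet port, reading the force/torque vectors might amount to unpacking six floats from each packet the sensor streams. The wire format below is an assumption for illustration only, not the documented OptoForce protocol:

```python
# Hedged sketch: if a sensor streamed six 32-bit big-endian floats
# (Fx, Fy, Fz, Tx, Ty, Tz) per Ethernet packet, a reading could be
# decoded like this. The packet layout here is purely hypothetical.
import struct

def unpack_wrench(packet: bytes):
    """Unpack six big-endian floats into force and torque triples."""
    fx, fy, fz, tx, ty, tz = struct.unpack(">6f", packet)
    return (fx, fy, fz), (tx, ty, tz)

# Build a sample packet and decode it:
sample = struct.pack(">6f", 1.0, 0.0, 5.5, 0.0, 0.1, 0.0)
force, torque = unpack_wrench(sample)
```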

As for UR and KUKA, we started creating special functions and software modules that make it even simpler to use the force/torque input from our sensors. For example, we have a few functions for simplifying polishing and material-handling tasks, and we are now rolling out support for assembly as well.

We are constantly introducing these features for more and more robots. We started with UR and KUKA due to our good relationships with these companies and because we received a lot of requests for these robots, but we will soon introduce support for ABB and Yaskawa as well.

 

Where are the sensors made?

Dömötör: The sensors are produced and calibrated in Hungary; some parts come from other countries in the European Union (EU) as well.

 

What do you expect opening an office in the U.S. will do for your business?

Dömötör: We have been selling about a third of our products in the U.S., so it is high time for us to open an office here. Our new North American general manager Gary Eliasson has been tasked with setting up a high-technology distribution channel across the continent.

Our aim is to learn more about the specific requirements of the market so we can tailor our development to the needs of American companies. We also hope to build a bigger and tighter distribution network so we are able to provide better support to our customers.

 

***********************************************************************************************

Learn about the latest developments in robotics for packaging at PackEx Toronto 2017 (May 16-18; Toronto, Ontario, Canada). Register today!
