Automation is often called the enemy of factory workers. Yet as collaborative robots emerge that respond to workers’ touch, designers have found a way to combine machine computing power with human creativity.
Whether in factories, workshops or even our homes, these intelligent devices are endowed with flexible senses and gestures that create a new form of human-machine interaction. They assist workers in difficult tasks and improve their safety and productivity.
Here’s the history of these industrial cobots, and five examples of advanced robot models that aim to redefine the working conditions of human workers.
The History of Cobots
During the 1990s, many General Motors (GM) managers reported worrying issues with the working conditions in their manufacturing plants. While some tasks on the assembly lines were already automated, workers were still continuously lifting heavy loads to hand them over to the robots, causing numerous injuries and productivity losses.
Given the high stakes, the robotics department set out to find a way to relieve workers’ hands while keeping them active in their jobs. Two researchers from Northwestern University, Michael Peshkin and J. Edward Colgate, were asked to help with this mission. They promptly invented what would come to be called “collaborative robots”. What defined their capabilities?
These robots could work alongside workers, complementing their physical strength without posing safety risks. Wired to act only under a worker’s impulse, they could not injure anyone. They were also directly hand-guided, learning movements through tactile and haptic sensors.
Peshkin and Colgate, for example, designed an intelligent winch system that lifted loads by detecting movement intent. Workers only had to push lightly on the load to lift it and move it freely.
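The intent-detection principle behind such a lift assist can be sketched as a simple admittance control loop: the device measures the force the worker applies and converts it into an assisted velocity, so a light push produces smooth, amplified motion. This is purely an illustration of the idea, not the actual controller Peshkin and Colgate built; the function names, masses and gains are all assumptions.

```python
def assisted_velocity(applied_force_n, virtual_mass_kg=5.0, damping=20.0,
                      current_velocity=0.0, dt=0.01):
    """One control step of a virtual mass-damper: F = m*a + b*v."""
    acceleration = (applied_force_n - damping * current_velocity) / virtual_mass_kg
    return current_velocity + acceleration * dt

# A worker pushes with a steady 10 N: instead of having to heave the full
# load, the device accelerates it toward the steady-state speed F/b = 0.5 m/s.
v = 0.0
for _ in range(1000):          # 10 seconds of 100 Hz control steps
    v = assisted_velocity(10.0, current_velocity=v)
```

The key design choice is that the worker's hand never fights the real load: the controller substitutes a small "virtual" mass and damping, which is why a gentle nudge is enough to move something heavy.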
Although skeptical at first, GM factory workers quickly adopted these intelligent assistance systems for their flexibility and usefulness in their work. And these early prototypes heralded the birth of many more collaborative forms of robots.
Cobot Models: Their Main Sensory and Safety Features
After Peshkin and Colgate’s inventions, more companies started creating cobots with even more sensing and safety features and a wider variety of functions.
The first category of cobot features concerned operational safety. Designers added systems that shut the robot down automatically on detecting human contact. They also limited the robots’ force and speed depending on their proximity to humans, and introduced virtual walls that bound the robots’ possible movements. Workers could easily adapt these spatial limits to their task and environment, avoiding errors and injuries.
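Two of these safety behaviors, proximity-based speed limiting and a virtual wall, can be sketched in a few lines. This is a hypothetical illustration; the thresholds, function names and the box-shaped wall are assumptions, not values from any real cobot controller.

```python
def safe_speed(distance_to_human_m, max_speed=1.5):
    """Scale commanded speed down as a human approaches; stop inside 0.2 m."""
    if distance_to_human_m < 0.2:
        return 0.0                                            # protective stop
    if distance_to_human_m < 1.0:
        return max_speed * (distance_to_human_m - 0.2) / 0.8  # linear slowdown
    return max_speed                                          # full speed

def inside_virtual_wall(position_xyz, lower=(0.0, 0.0, 0.0),
                        upper=(0.8, 0.6, 0.5)):
    """A virtual wall modeled as an axis-aligned box the tool must stay in."""
    return all(lo <= p <= hi for p, lo, hi in zip(position_xyz, lower, upper))
```

In practice a controller would check both conditions every control cycle: clamp the motion command with `safe_speed` and refuse any target outside the wall.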
The second category expanded the robots’ sensory capabilities, giving them a better perception of human movements. Hand guidance, for example, is central to the concept of a cobot, allowing intuitive manual assistance. But engineers also built cobots that learn from workers’ repeated demonstrations. This made handling more flexible and pleasant, and gave the robots an ability to predict and anticipate movement that makes them seem more human in the eyes of workers.
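Hand-guided teaching is, at its core, a record-and-replay loop: while the worker moves the arm, the controller samples joint positions; on playback the cobot retraces them. The sketch below is a simplified assumption of how such a loop could look, not any vendor's actual API.

```python
class TaughtPath:
    """Records joint positions during hand guidance and replays them."""

    def __init__(self):
        self.waypoints = []

    def record(self, joint_angles):
        # Called each control cycle while the worker guides the arm.
        self.waypoints.append(tuple(joint_angles))

    def replay(self):
        # Yields the recorded waypoints for the motion controller to track.
        yield from self.waypoints

# Teaching phase: three sampled joint configurations of a guided movement.
path = TaughtPath()
for angles in [(0.0, 0.5, 1.0), (0.1, 0.6, 1.1), (0.2, 0.7, 1.2)]:
    path.record(angles)
```

Real systems add interpolation and smoothing between waypoints, but the worker-facing idea is exactly this simple: show the robot the motion once, by hand.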
All these features contributed to the massive adoption of cobots in factories around the world. And not without benefits for the workers.
Relieved of tiring and dangerous tasks, workers found cobots to be pleasant and useful companions. Cobots make repetitive tasks easier, freeing workers to take on more interesting work such as quality control.
Cobots accelerate human-machine collaboration, and make manual work more meaningful in the face of automation. And this is not about to end with the latest generations of cobots.
5 Examples of Next-Gen Collaborative Robots
Over the past decade, cobots’ sensory technologies have kept expanding, resulting in models that are more aware of their environment and more user-friendly.
For example, Techman Robot’s series of cobots are equipped with a camera that identifies the shapes and edges of objects, along with many other visual patterns. The camera also helps prevent dangerous interactions by learning to recognize at-risk worker limbs.
The Sawyer cobot from Rethink Robotics combines a human-like design with a user-friendly interface that allows direct reprogramming of its movements. Its external vision device quickly repositions the robot according to the production floor configuration. Sawyer is also equipped with a high-resolution force sensor that adjusts its movements with great precision.
Franka Emika’s Panda model is now available for use in manufacturing and research labs. With an accessible programming interface, anyone can quickly teach its 7-degree-of-freedom arm new movements through very simple gestures. It has a gripper as precise and flexible as a human hand, as well as highly responsive collision detection.
Ned, the latest robot from the company Niryo, goes even further with motion and voice control. Users can give the cobot instructions through easy interactions. Ned also has built-in image recognition for more accurate object manipulation.
While we wait for new tactile technologies such as piezoelectric, piezoresistive, capacitive or elastoresistive sensors, these cobots already have sufficient sensory capabilities to mesh with workers’ movements and tasks. They point the way to a workplace where humans thrive in collaboration with co-working robots.