Robohub.org
 

Push, pull, nudge: The control challenge of non-prehensile robotic manipulation


by Brian Douglas
29 September 2015




In this new lecture series, controls expert Brian Douglas walks you through key concepts in control system theory. Focused on making control theory accessible and intuitive, this series is for anyone who wants to relate control concepts to robotic applications in the real world.

Presumably you are a human, and as such you have become really good at manipulating your environment by using your hands. You have these incredible actuators that allow you to gently and nimbly grab an object, hold it tight, and then move it to a new location. So why wouldn’t you use your hands to grasp everything, and why wouldn’t you build a humanoid robot with hands so it could be as capable as you?

Well, this type of manipulation is called prehensile manipulation — or grasping manipulation — and it’s reasonable to expect a robot to be able to interact with the environment in this manner. However, you may not realize that as a human you also perform a huge amount of non-prehensile manipulation every day, both with and without your hands, such as pushing a swinging door open at the store, typing on your keyboard, or walking on two legs.

A robot that is expected to do well in our human-built environment would need to be good at non-prehensile manipulation as well as the prehensile sort, and unfortunately non-prehensile manipulation doesn’t always have a straightforward solution. The difficulty of getting a robot to move like a human was demonstrated at the DARPA Robotics Challenge (DRC), where the robot contestants were expected to perform activities like driving a car, opening a door, and closing a rotating valve. The robots successfully completed a number of very impressive feats, but it was also obvious from the number of robots falling down that we have a long way to go in our efforts to build a human-like robot, not least in building agile bipedal robots.

I was recently at the International Conference on Robotics and Automation (ICRA 2015) and I came across a very unassuming robot performing continuous non-prehensile manipulation of a ball around the edge of a butterfly-shaped link. Using a camera to sense the position of the ball, and a motor for actuation, the robot was able to continuously spin the butterfly shape while keeping the ball perched on top. Maksim Surov and Leonid Paramonov developed the algorithms that made this possible, and not only was their display mesmerizing, but their formalized approach to solving these types of problems could go a long way toward advancing our ability to generate continuous stability for an agile walking bipedal robot.
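To give a flavor of what such a controller has to do, here is a bare-bones sketch of that kind of feedback loop: a camera-based estimate of how far the ball has drifted from its desired spot drives a simple proportional-derivative (PD) correction on top of the nominal spin rate. The function names, gain values, and the PD structure below are illustrative assumptions for this post, not Surov and Paramonov's actual algorithm.

    import time

    KP, KD = 8.0, 1.5      # proportional and derivative gains (illustrative values only)
    DT = 0.01              # 100 Hz control loop

    def read_ball_position_error():
        """Stand-in for the camera: how far the ball is from its desired spot on the frame."""
        return 0.0  # a real system would estimate this from the camera image

    def set_motor_velocity(cmd):
        """Stand-in for the motor driver: command the spin rate of the butterfly frame."""
        pass  # a real system would send cmd to the motor controller

    def control_loop(nominal_spin=1.0, steps=1000):
        prev_error = 0.0
        for _ in range(steps):
            error = read_ball_position_error()
            d_error = (error - prev_error) / DT            # how quickly the error is changing
            correction = KP * error + KD * d_error         # PD feedback term
            set_motor_velocity(nominal_spin + correction)  # keep spinning, plus a stabilizing nudge
            prev_error = error
            time.sleep(DT)

    if __name__ == "__main__":
        control_loop()

Even in this toy form the core difficulty shows through: the controller can only influence the ball indirectly, by moving the surface it rests on, which is exactly what makes non-prehensile manipulation such an interesting control problem.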





Brian Douglas is the Attitude Determination and Controls Lead at Planetary Resources, Inc. He is also the content creator of the Control System Lectures YouTube channel.





