Rapid robot prototyping with ROS

by David Pietrocola
19 October 2012




Open-source software is making it easier to reuse algorithms, allowing engineers and researchers to focus on their problems of interest instead of reinventing the wheel for each project. Not an expert in path planning, or don't have the time (or patience) to implement SLAM? There's a package for that. Manipulator control? There's a package for that too. Additionally, falling component prices and commercial-off-the-shelf (COTS) devices are making robotics hardware more accessible. This tutorial will teach you how to put together a simple remote teleoperation robot using these principles.

There are a few frameworks in the robotics domain, but the one reaching critical mass (in the robotics research community, at least) is Willow Garage's Robot Operating System, or just ROS. Plenty of people have written about its features, and Willow Garage has an excellent set of tutorials if you're interested, so I won't repeat that here. We are going to break the tutorial into two parts: first, we'll get a robot moving using ROS through a webpage user interface; second, we'll build on that to add a Kinect and enable remote teleoperation. Here's what you'll need to get started:

  1.  iRobot Roomba or Create
  2. ROS Electric installed on Ubuntu Linux and configured according to the tutorials
  3. Some way of connecting the robot to your computer (a ready-made serial cable, or a USB-to-serial connector modified to interface with the mini-DIN port). For teleoperation, you can use the connected computer, another computer, a smartphone, a tablet, an iPad, or anything with a browser that supports WebSockets.

Let's get into the specifics. With ROS installed, you'll need a few more packages that aren't included in the base install. Rosbridge is a solid middleware package that will let us talk to ROS from a webpage. It can do a lot more, but that satisfies our goal for now. Let's install it (this command comes from the Brown screencast):

> sudo apt-get install ros-electric-brown-remotelab
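Rosbridge works by exposing a WebSocket server that the webpage will connect to. Once it is running (we start it in the procedure further down), a quick sanity check is to confirm something is actually listening on its port. The short sketch below assumes rosbridge's default port of 9090 and that you run it on the robot computer; adjust the host and port if your setup differs.

# Quick check that the rosbridge server is reachable.
# Assumes the default rosbridge port (9090); adjust host/port if yours differ.
import socket

host, port = "127.0.0.1", 9090  # run this on the robot computer, or put its IP here

try:
    sock = socket.create_connection((host, port), timeout=2.0)
    print("rosbridge appears to be listening on %s:%d" % (host, port))
    sock.close()
except socket.error as err:
    print("could not reach rosbridge on %s:%d: %s" % (host, port, err))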

You'll also need a package to enable ROS to communicate with the robot. At its most basic, the package listens for Twist messages (ROS-speak for motor drive commands) and converts them into the low-level serial commands the Roomba understands; a minimal sketch of this conversion appears after the list below. (Though this tutorial focuses on the Roomba/Create, you can use any robot with an existing ROS package, or write your own!) You have three options, based on your Roomba model:

  1. iRobot Roomba 500 series or up (uses iRobot OI spec [pdf])
  2. iRobot Roomba pre-500 series (uses iRobot SCI spec [pdf])
  3. iRobot Create (uses iRobot OI spec)
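To make the Twist-to-serial idea concrete, here is a minimal sketch of such a driver node in Python. It assumes a 400-series Roomba attached at /dev/ttyUSB0 at the SCI default of 57600 baud, a cmd_vel topic name, and the Drive command layout from the SCI/OI specs; it is only meant to illustrate the conversion, not to replace the packages above.

#!/usr/bin/env python
# Minimal sketch of a Twist-to-Roomba driver node (illustrative only).
# Assumptions: 400-series Roomba on /dev/ttyUSB0 at the SCI default 57600 baud,
# a 'cmd_vel' topic, and the Drive command (opcode 137: velocity mm/s, radius mm).
import struct
import serial
import rospy
from geometry_msgs.msg import Twist

STRAIGHT = 32767  # special radius value meaning "drive straight"

class RoombaDriverSketch(object):
    def __init__(self, port='/dev/ttyUSB0'):
        self.ser = serial.Serial(port, baudrate=57600, timeout=0.1)
        self.ser.write(struct.pack('>BB', 128, 130))  # Start, then Control (Safe mode)
        rospy.Subscriber('cmd_vel', Twist, self.on_twist)

    def on_twist(self, msg):
        vel = int(max(-500, min(500, msg.linear.x * 1000)))  # m/s -> mm/s, clamped
        if abs(msg.angular.z) < 1e-3:
            radius = STRAIGHT                                # no rotation requested
        elif abs(msg.linear.x) < 1e-3:
            radius = 1 if msg.angular.z > 0 else -1          # spin in place (CCW / CW)
            vel = int(min(500, abs(msg.angular.z) * 129))    # rough rad/s -> mm/s scale
        else:
            # arc: turning radius is roughly linear speed over angular speed
            radius = int(max(-2000, min(2000, msg.linear.x / msg.angular.z * 1000)))
        self.ser.write(struct.pack('>Bhh', 137, vel, radius))  # Drive command

if __name__ == '__main__':
    rospy.init_node('roomba_driver_sketch')
    RoombaDriverSketch()
    rospy.spin()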

There are packages for all three of these. I happen to have a Roomba 400-series, so I went ahead and modified the roomba_500_series package for SCI commands. This is perfect for older Roombas you might have gathering dust or picked up cheap on eBay. There's also a Python implementation floating around. Whichever you use, the other parts of the system won't care, because they publish Twist messages and the robot subscribes to Twist messages. If you don't have a folder for third-party ROS packages or didn't follow the ROS tutorials, create one in your home directory and download my roomba_400_series package:
> mkdir ros_workspace
> cd ros_workspace
> mkdir myStacks
> cd myStacks
> git clone https://github.com/dpiet/robotsindc-ros-pkg

Be sure to run rosmake on the package. Then check that the directory (or its parent directory) is included in the ROS package path:

> echo $ROS_PACKAGE_PATH

If it’s not, add it:

> export ROS_PACKAGE_PATH=$ROS_PACKAGE_PATH:/home/david/ros_workspace

There's one more piece we need in order to control the robot through a webpage: the webpage itself. Fortunately, there are plenty of rosbridge examples out there, and a JavaScript novice like me was able to hack together a simple page with keyboard and button inputs. Download the page (you already have it if you cloned the git repo) to your web server directory (if you're running one) or to your home directory; then you only need to modify the IP address so it points to the computer controlling the robot. And that's it!

Here's the procedure for putting this all together at run time:

In a separate terminal for each line, run the following commands:

> roscore
> rosrun roomba_400_series roomba400_light_node (assumes robot is attached to /dev/ttyUSB0)
> rosrun rosbridge rosbridge.py
(Optional) If you don't have a web server, cd to the directory containing the teleop.html file (your home directory, say) and use the magic of Python:
> python -m SimpleHTTPServer

  1. Grab your iPad (or other device), make sure it's on the same local Wi-Fi network as the robot computer, and load up the page. If you're running SimpleHTTPServer, type this in: http://[robot ip address here]:8000/teleop.html
  2. If all goes according to plan, you'll get a rudimentary screen with buttons for directional control and a log showing connection status messages.
  3. Happy Roomba driving! (If nothing happens, try the troubleshooting sketch below.)
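If the buttons don't seem to do anything, it helps to take the webpage out of the loop and publish a Twist message directly. The short Python sketch below drives forward gently for about two seconds and then stops; it assumes the driver node subscribes to cmd_vel, so rename the topic if yours differs.

#!/usr/bin/env python
# Publish a gentle forward Twist for ~2 seconds, then stop (webpage not involved).
# Assumes the driver node subscribes to 'cmd_vel'; change the topic name if needed.
import rospy
from geometry_msgs.msg import Twist

rospy.init_node('twist_test_publisher')
pub = rospy.Publisher('cmd_vel', Twist)
rate = rospy.Rate(10)  # publish at 10 Hz

forward = Twist()
forward.linear.x = 0.1  # creep forward at 0.1 m/s

start = rospy.Time.now()
while not rospy.is_shutdown() and rospy.Time.now() - start < rospy.Duration(2.0):
    pub.publish(forward)
    rate.sleep()

pub.publish(Twist())  # an all-zero Twist stops the robot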

To recap, we connected a Roomba to a laptop running ROS, the open-source Robot Operating System from Willow Garage; Rosbridge, a convenient middleware package from Brown University; and a web server (in this case Python's SimpleHTTPServer). We were then able to load a webpage with some JavaScript on a laptop, an Android smartphone, or an iPad, and each could drive the Roomba with basic forward/reverse/left/right commands. Cool.

By using ROS and a few open-source packages, we got a robot moving via teleoperation with a simple user interface relatively quickly, or at least much more quickly than if we had programmed everything from scratch. So the benefit has been the ability to stand up a teleoperated robot in short order through code reuse. A criticism is that we used a fairly heavy framework, with a lot of pieces working under the hood, which means a lot of upfront work reading system documentation and tutorials and reviewing examples. But as I argued above, the upfront cost is worth it, because the robot is now built on software modules with a vibrant community of contributors, which facilitates code reuse without you having to reinvent the wheel or become a path-planning expert.

To take advantage of the upfront work we put into learning ROS, we can easily extend the first part of the tutorial and add a video feed to the web interface so we can teleoperate remotely. Here's what we need to add:

  • The mjpeg_server ROS package (to install, download the package or check it out with svn, cd into the directory, and run "rosmake --rosdep-install")
  • Microsoft Kinect ROS package using OpenNI drivers (to install, run “sudo apt-get install ros-electric-openni-kinect”)
  • A new HTML file with some additional JavaScript

That's it. The Kinect package takes care of interfacing with the camera and IR streams, publishing the data to ROS topics. The mjpeg_server node subscribes to the camera topic and sets up a server that streams the video efficiently over HTTP. Rosbridge could stream the video directly by subscribing to the camera topic, but it wouldn't be as efficient.
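Before adding the stream to the page, it's worth confirming the Kinect driver is actually publishing. The sketch below subscribes to the RGB image topic and logs the frame size; it assumes the Electric-era openni_launch default of /camera/rgb/image_color, so run rostopic list and adjust the topic name if yours differs.

#!/usr/bin/env python
# Sanity check that the Kinect driver is publishing RGB frames.
# Assumes the topic /camera/rgb/image_color (openni_launch default at the time);
# run 'rostopic list' and adjust if your topic name differs.
import rospy
from sensor_msgs.msg import Image

def on_image(msg):
    rospy.loginfo("got a %dx%d frame, encoding %s", msg.width, msg.height, msg.encoding)

rospy.init_node('kinect_check')
rospy.Subscriber('/camera/rgb/image_color', Image, on_image)
rospy.spin()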

The following video demonstrates the first run after integrating all the components on the robot. Not too bad, but there’s definitely room for improvement. Next steps will be to add a roslaunch file to our repo to simplify startup, accelerometer controls for navigation, and further testing to identify sources of lag.

Here’s the procedure for getting the demo running. Be sure to change the robot’s IP address as appropriate. Please note: this demo is currently compatible with ROS Electric, not the latest version, Fuerte.

In a separate terminal for each line, run the following commands:

> roscore
> rosrun roomba_400_series roomba400_light_node (assumes robot is attached to /dev/ttyUSB0)
> rosrun rosbridge rosbridge.py
> rosrun mjpeg_server mjpeg_server _port:=9091
> roslaunch openni_launch openni.launch
(Optional) If you don't have a web server, cd to the directory containing the teleop_video.html file (your home directory, say) and use the magic of Python:
> python -m SimpleHTTPServer

Where can we go from here? We've essentially built a TurtleBot, replacing the iRobot Create with the Roomba (though we're obviously missing the convenient mounting hardware). So there are plenty of resources for integrating advanced behaviors like mapping, autonomous navigation, or even a manipulator arm. In particular, the Pi Robot project has done some interesting demonstrations with the TurtleBot platform.

Have a Roomba gathering dust and a Kinect sitting next to your Xbox? Then you have all the major components to build a moving robot.





David Pietrocola is a robotics engineer and CEO of Lifebotics LLC. He writes regularly on robotics policy, news, and events.