Rapid robot prototyping with ROS

by David Pietrocola, 19 October 2012




Open-source software is making it easier to reuse algorithms, allowing engineers and researchers to focus on the problems they actually care about instead of reinventing the wheel for each project. Not an expert in path planning, or don’t have the time (or patience) to implement SLAM? There’s a package for that. Manipulator control? Package for that too. Meanwhile, falling component prices and commercial-off-the-shelf (COTS) devices are making robotics hardware more accessible. This tutorial will show you how to put together a simple remote teleoperation robot using these principles.

There are a few frameworks in the robotics domain, but the one reaching critical mass (in the robotics research community, at least) is Willow Garage’s Robot Operating System, or simply ROS. Plenty of people have written about its features, and Willow Garage has an excellent set of tutorials if you’re interested, so I won’t repeat that here. We are going to break the tutorial into two parts: first, we’ll get a robot moving using ROS through a webpage user interface; second, we’ll build on that to add a Kinect and enable remote teleoperation. Here’s what you’ll need to get started:

  1. iRobot Roomba or Create
  2. ROS Electric installed on Ubuntu Linux and configured according to the tutorials
  3. Some way of connecting the robot to your computer (a commercially available serial cable, or a USB-to-serial connector modified to interface with the mini-DIN port). For teleoperation, you can use the connected computer, another computer, a smartphone, a tablet (such as an iPad), or anything with a browser that supports WebSockets.

Let’s get into the specifics. With ROS installed, you’ll need a few more packages that aren’t included in the base install. Rosbridge is a solid middleware package that enables communication with ROS from a webpage. It can do a lot more, but that satisfies our goal for now. Let’s install it (following Brown University’s screencast):

> sudo apt-get install ros-electric-brown-remotelab

You’ll also need a package that lets ROS talk to the robot. At its most basic, the package listens for Twist messages (the standard ROS message for velocity commands) and converts them into the low-level serial commands the Roomba understands; a minimal sketch of what such a node does follows the list below. (Though this tutorial focuses on the Roomba/Create, you can use any robot with an existing ROS package, or write your own!) You have three options, depending on your Roomba model:

  1. iRobot Roomba 500 series or newer (uses the iRobot OI spec)
  2. iRobot Roomba pre-500 series (uses the iRobot SCI spec)
  3. iRobot Create (uses the iRobot OI spec)
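
To make that concrete, here’s a minimal, hypothetical sketch (using rospy and pyserial) of what such a driver node boils down to: it subscribes to Twist and repacks each message as an OI/SCI Drive command over serial. This is not the actual roomba_400_series code; the topic name, port, baud rate, and velocity scaling are assumptions, and the real packages handle much more (sensors, safety, mode changes). Consult the OI/SCI specs for the exact byte framing.

#!/usr/bin/env python
# Hypothetical sketch of a Twist-to-serial bridge -- not the real package code.
# Assumptions: pyserial installed, robot on /dev/ttyUSB0, node listens on
# cmd_vel, baud rate 57600 (SCI default; newer OI models may differ).
import struct
import serial
import rospy
from geometry_msgs.msg import Twist

def clamp(value, low, high):
    return max(low, min(high, value))

class RoombaTwistBridge(object):
    def __init__(self):
        self.ser = serial.Serial('/dev/ttyUSB0', 57600)
        rospy.Subscriber('cmd_vel', Twist, self.on_twist)

    def on_twist(self, msg):
        # The Drive command takes a velocity (mm/s) and a turn radius (mm).
        velocity = int(clamp(msg.linear.x * 1000.0, -500, 500))
        if abs(msg.angular.z) < 1e-3:
            radius = 32767                           # special value: drive straight
        elif abs(msg.linear.x) < 1e-3:
            radius = 1 if msg.angular.z > 0 else -1  # special value: turn in place
            velocity = int(clamp(abs(msg.angular.z) * 129.0, 0, 500))  # rough scale from the wheel base
        else:
            radius = int(clamp(msg.linear.x / msg.angular.z * 1000.0, -2000, 2000))
        # Opcode 137 (Drive) followed by two big-endian signed 16-bit values.
        self.ser.write(struct.pack('>Bhh', 137, velocity, radius))

if __name__ == '__main__':
    rospy.init_node('roomba_twist_bridge')
    RoombaTwistBridge()
    rospy.spin()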

There are packages for all three of these. I happen to have a Roomba 400 series, so I went ahead and modified the roomba_500_series package for SCI commands. This is perfect for older Roombas you might have gathering dust or picked up cheap on eBay. There’s also a Python implementation floating around. Whichever you use, the rest of the system won’t care, because the other components publish Twist messages and the robot driver subscribes to them. If you don’t have a folder for third-party ROS packages, or didn’t follow the ROS tutorials, let’s create one in your home directory and download my roomba_400_series package:
> mkdir ros_workspace
> cd ros_workspace
> mkdir myStacks
> cd myStacks
> git clone https://github.com/dpiet/robotsindc-ros-pkg

Be sure to run rosmake on the package. Then check that the directory (or its parent directory) is included in the ROS package path:

> echo $ROS_PACKAGE_PATH

If it’s not, add it (adjust the path to match your own home directory):

> export ROS_PACKAGE_PATH=$ROS_PACKAGE_PATH:/home/david/ros_workspace

There’s one more piece we need in order to control the robot through a webpage: the webpage itself. Fortunately, there are plenty of rosbridge examples out there, and a JavaScript novice like me was able to hack together a simple page with keyboard and button inputs. Download the page (you already have it if you cloned the git repo) to your web server directory (if you’re running one) or to your home directory; you need only modify the IP address to point to the computer controlling the robot. And that’s it!
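
Before involving the browser, it’s worth sanity-checking that the driver node responds to Twist messages at all. A throwaway rospy script like the one below does the job (the cmd_vel topic name is an assumption; match whatever topic your driver node subscribes to). rostopic pub from the command line works just as well.

#!/usr/bin/env python
# Smoke test: drive forward slowly for two seconds, then stop.
# Assumes the driver node subscribes to cmd_vel -- adjust to match your setup.
import rospy
from geometry_msgs.msg import Twist

rospy.init_node('teleop_smoke_test')
pub = rospy.Publisher('cmd_vel', Twist)
rospy.sleep(1.0)                 # give the subscriber time to connect

forward = Twist()
forward.linear.x = 0.1           # m/s; keep it slow
rate = rospy.Rate(10)
end_time = rospy.Time.now() + rospy.Duration(2.0)
while not rospy.is_shutdown() and rospy.Time.now() < end_time:
    pub.publish(forward)
    rate.sleep()
pub.publish(Twist())             # an all-zero Twist stops the robot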

Here’s the procedure for putting it all together at run time:

In a separate terminal for each line, run the following commands:

> roscore
> rosrun roomba_400_series roomba400_light_node (assumes robot is attached to /dev/ttyUSB0)
> rosrun rosbridge rosbridge.py
(Optional) If you don't have a web server, cd to the directory containing teleop.html and use the magic of Python:
> python -m SimpleHTTPServer

  1. Grab your iPad (or other device), make sure it’s on the same local Wi-Fi network as the robot computer, and load up the page. If you’re running SimpleHTTPServer, type this in: http://[robot ip address here]:8000/teleop.html
  2. If all goes according to plan, you’ll get a rudimentary screen with buttons for directional control and a log showing connection status messages.
  3. Happy Roomba driving!

To recap, we connected a Roomba to a laptop running ROS (the open-source Robot Operating System from Willow Garage), Rosbridge (a convenient middleware package from Brown University), and a web server (in this case, Python’s SimpleHTTPServer). We were then able to load up a webpage with some JavaScript from a laptop, Android smartphone, or iPad, and each could drive the Roomba with basic forward/reverse/left/right commands. Cool.

By using ROS and a few open-source packages, we got a teleoperated robot moving with a simple user interface relatively quickly, or at least much more quickly than if we had programmed everything from scratch. So the benefit has been the ability to stand up a teleoperation robot in short order through code reuse. A fair criticism is that we used a fairly heavy framework, with a lot of pieces working under the hood, which means a lot of upfront work reading documentation and tutorials and reviewing examples. But as I argued above, the upfront cost is worth it: the robot is now built from software modules with a vibrant community of contributors, which facilitates code reuse without you having to reinvent the wheel or become a path-planning expert.

To take advantage of the upfront work we put into learning ROS, we can easily extend the first part of the tutorial and add a video feed to our web interface so we can teleoperate remotely. Here’s what we need to add:

  • The mjpeg_server ROS package (to install, download or svn checkout the package, cd into its directory, and run "rosmake --rosdep-install")
  • The Microsoft Kinect ROS package using the OpenNI drivers (to install, run "sudo apt-get install ros-electric-openni-kinect")
  • A new html file with some additional javascript

That’s it. The Kinect package takes care of interfacing with the camera and IR streams and publishes the data to ROS topics. mjpeg_server subscribes to the camera topic and sets up a server that streams the video efficiently over HTTP. Rosbridge could be used to stream video directly by subscribing to the camera topic, but it wouldn’t be as efficient.
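
If no video shows up in the page, it helps to confirm the OpenNI driver is actually publishing frames before blaming mjpeg_server or the browser. Here’s a quick, hypothetical check; the /camera/rgb/image_color topic is the openni_launch default at the time of writing, so adjust it if your setup differs (rostopic hz on the same topic works too).

#!/usr/bin/env python
# Quick check that Kinect frames are arriving on the expected topic.
import rospy
from sensor_msgs.msg import Image

def on_image(msg):
    rospy.loginfo('got %dx%d frame, encoding %s', msg.width, msg.height, msg.encoding)

rospy.init_node('kinect_check')
rospy.Subscriber('/camera/rgb/image_color', Image, on_image)
rospy.spin()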

The following video demonstrates the first run after integrating all the components on the robot. Not too bad, but there’s definitely room for improvement. Next steps are to add a roslaunch file to the repo to simplify startup, add accelerometer controls for navigation, and do further testing to identify sources of lag.

Here’s the procedure for getting the demo running. Be sure to change the robot’s IP address as appropriate. Please note: this demo is currently compatible with ROS Electric, not the latest version, Fuerte.

In a separate terminal for each line, run the following commands:

> roscore
> rosrun roomba_400_series roomba400_light_node (assumes robot is attached to /dev/ttyUSB0)
> rosrun rosbridge rosbridge.py
> rosrun mjpeg_server mjpeg_server _port:=9091
> roslaunch openni_launch openni.launch
(Optional) If you don't have a web server, cd to the directory containing teleop_video.html and use the magic of Python:
> python -m SimpleHTTPServer

Where can we go from here? We’ve essentially built a TurtleBot, with a Roomba standing in for the iRobot Create (though we’re obviously missing the convenient mounting hardware). So there are plenty of resources for integrating advanced behaviors like mapping, autonomous navigation, or even a manipulator arm. In particular, the Pi Robot project has done some interesting demonstrations with the TurtleBot platform.

Have a roomba gathering dust and a Kinect sitting next to your Xbox? Then you have all the major components to build a moving robot.





David Pietrocola is a robotics engineer and CEO of Lifebotics LLC. He writes regularly on robotics policy, news, and events.