Robohub.org
 

Astronaut on International Space Station successfully controls K10 rover on Earth, supporting use of telerobotics in future deep space missions


by Terry Fong
02 July 2013




On June 17, 2013, astronaut Chris Cassidy successfully drove a K10 rover on Earth via a remote connection from the Surface Telerobotics Workbench aboard the International Space Station, demonstrating that robots deployed to explore Mars or the far side of the Moon could be remotely controlled by astronauts in space during future deep-space missions. Telerobotics, in which human operators remotely control robotic arms, rovers and other devices in space, is one means of reducing risk in dull, dangerous or dirty tasks as humans explore space. 

NASA has a long history of playing for high stakes; think of Curiosity’s “seven minutes of terror” descent to Mars, Spirit and Opportunity, and indeed the entire space race. Yet when human lives and millions of dollars’ worth of technology are on the line, it’s critical to keep risk to a minimum. As part of our series on ‘High-Risk / High-Reward’ robotics, we asked Dr. Terry Fong of the NASA Ames Intelligent Robotics Group to describe how NASA’s telerobotics initiatives help mitigate risk in space missions. – Robohub Editors


Photo credit: NASA. Chris Cassidy studies the Surface Telerobotics Workbench on the International Space Station to remotely operate the K10 rover on Earth at NASA’s Ames Research Center in Moffett Field, Calif., in June 2013.

 


Photo credit: Dominick Hart/NASA. NASA’s K10 rover at the Ames Research Center in Moffett Field, California, performed a surface survey with its cameras and laser system and then deployed a simulated polyimide antenna while being controlled by an astronaut in space during a June 2013 test.

“Surface Telerobotics” is a 2013 NASA test examining how astronauts aboard the International Space Station (ISS) can remotely operate a surface robot across short time delays. The test is being conducted during the summer of 2013 and has three objectives:

  1. To demonstrate interactive crew control of a mobile surface telerobot in the presence of short communications delay,
  2. To characterize a concept of operations for a single astronaut remotely operating a planetary rover with limited support from ground control, and
  3. To characterize telerobot utilization, operator workload and operator situation awareness.

Surface Telerobotics is intended to reduce risk for future human-robot exploration missions, identify technical gaps, and refine key system requirements.
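
To make the first objective concrete, here is a minimal sketch, in Python, of what interactive crew control across a short delay can look like. It assumes a 250 ms one-way latency, invented function names, and a task-level (supervisory) command style; it is an illustration of the pattern, not NASA’s actual control software.

    import time
    from dataclasses import dataclass

    ONE_WAY_DELAY_S = 0.25   # assumed one-way crew-to-robot latency (illustrative)

    @dataclass
    class WaypointTask:
        # A single high-level command, e.g. "drive to (x, y) and survey".
        x_m: float
        y_m: float
        survey: bool = True

    def uplink(task):
        # Simulate sending one task from the crew workstation to the robot.
        time.sleep(ONE_WAY_DELAY_S)

    def execute(task):
        # Stand-in for the autonomy that actually drives and surveys;
        # in a real system this could run on the rover or on the crew vehicle.
        time.sleep(2.0)   # pretend the drive takes 2 seconds
        return f"reached ({task.x_m} m, {task.y_m} m), survey done: {task.survey}"

    def supervisory_round_trip(task):
        start = time.monotonic()
        uplink(task)                     # crew -> robot command delay
        status = execute(task)           # task runs without further crew input
        time.sleep(ONE_WAY_DELAY_S)      # robot -> crew status delay
        print(f"status after {time.monotonic() - start:.2f} s: {status}")

    supervisory_round_trip(WaypointTask(x_m=5.0, y_m=3.0))

Because a single command covers many seconds of autonomous execution, the round-trip delay is paid once per task rather than on every control input, which is what makes interactive control across a short delay practical.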

In planning for future human exploration missions, numerous study teams have proposed having astronauts remotely operate surface robots from an orbiting spacecraft using a low-latency, high-bandwidth communications link. This concept of operations is seen as an effective method for performing surface activities that require real-time human involvement without incurring the risk and cost associated with human sorties. In addition, this configuration would allow high-performance spacecraft computing to be used for high-level robot autonomy (perception, navigation, etc.), thus simplifying the processing and avionics required for the robot. Crew-centric surface telerobotics is considered an option for several possible missions:

  • Lunar Farside: Astronauts orbiting the Moon (or station-keeping at the Earth-Moon “L2” Lagrange point) remotely operate a surface robot exploring the lunar farside. Astronauts would take advantage of low-latency (less than 250 ms) and high-availability communications to maximize robot utilization during a short-duration mission.
  • Near-Earth Object (NEO): Astronauts approaching, orbiting, or departing a NEO (e.g., an asteroid) remotely operate a robot landed on its surface. Astronauts would control the robot from the flight vehicle because the NEO environment (high rotation rate, rapidly varying illumination, etc.) rules out remote operations from Earth.
  • Mars Orbit: Astronauts in aerostationary orbit around Mars (or perhaps landed on Phobos or Deimos) remotely operate a surface robot exploring Mars. Astronauts would control the robot from the flight vehicle when circumstances (time-critical activities, contingency handling, etc.) do not permit remote operation from Earth.
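
For a rough sense of the latencies behind these options, the short calculation below converts approximate one-way path lengths into light-time delays. The distances are illustrative round numbers (roughly 60,000 km from Earth-Moon L2 to the lunar farside, roughly 17,000 km from an areostationary orbit to the surface of Mars, and 55 to 400 million km between Earth and Mars), not mission figures.

    SPEED_OF_LIGHT_KM_S = 299_792.458

    # Approximate one-way path lengths in km (illustrative round numbers).
    scenarios = {
        "Earth-Moon L2 to lunar farside": 60_000,
        "Areostationary orbit to Mars surface": 17_000,
        "Earth to Mars (closest approach)": 55_000_000,
        "Earth to Mars (farthest)": 400_000_000,
    }

    for name, distance_km in scenarios.items():
        print(f"{name}: ~{distance_km / SPEED_OF_LIGHT_KM_S:.2f} s one way")

The roughly 0.2 s result for the L2 case is consistent with the sub-250 ms latency cited above, while the minutes-long Earth-Mars delay illustrates why real-time control of a Mars rover from the ground is not an option.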

If successful, this project will help NASA better understand the key issues, engineering requirements, and costs/benefits associated with crew-centric surface telerobotics. In addition, data collected by the test will inform the design of future ground-based tests, particularly in terms of the key factors that need to be simulated at high levels of fidelity.

Finally, this project will help confirm (or reject) many of the assumptions and hypotheses that have been made by numerous space exploration study teams regarding the technology maturity, technology gaps, and risks (operational and functional) associated with crew-controlled surface telerobotics.

Project Partners
NASA Ames Intelligent Robotics Group
NASA Lunar Science Institute
NASA Technology Demonstration Missions Program
Jet Propulsion Laboratory
Lunar University Network for Astrophysics Research

Mission Location
International Space Station and the NASA Ames Research Center in Moffett Field, California

More Information
http://tinyurl.com/surface-telerobotics
http://www.nasa.gov/telerobotics





Terry Fong is the Director of the Intelligent Robotics Group at the NASA Ames Research Center and is the manager of the NASA Human Exploration Telerobotics project.




