The importance of research reproducibility in robotics


by Fabio Bonsignorio
20 September 2017




As highlighted in a previous post, despite the fact that robotics is increasingly regarded as a ‘science’, as shown by the launch of new journals such as Science Robotics, reproducibility of experiments remains difficult or entirely lacking.

This is quite unfortunate, as the possibility of reproducing experimental results is a cornerstone of the scientific method. The situation pushes serious discussions (What is ‘soft robotics’? Is it needed? What has to be ‘soft’?) and paradigm clashes (Good Old-Fashioned Artificial Intelligence vs. Deep Learning vs. Embodied Cognition) into the realm of literary controversy or, even worse, religious turf wars, with very little experimental evidence supporting the claims of the different parties. Not even wrong, as they say (following Peter Woit’s arguments on String Theory)?

The robotics community has been aware of these issues for a long time, and in recent years more and more researchers have published datasets, code and other valuable information to allow others to reproduce their results. We are heading in the right direction, but we probably need to do more.

I think we should therefore welcome the fact that, for the first time ever, the IEEE Robotics and Automation Magazine will start accepting R-Articles (i.e., papers that report experiments aiming to be fully reproducible) beginning this September. It will also accept short articles reporting on the replication of R-Article results, and author replies are solicited and will be published after peer review. The result will be a two-stage high-quality review process: the first stage is the ordinary rigorous review process of a top-tier publishing venue; the second is the replication of the experiments by the community, which is the core of the scientific method.

This seems like a historic improvement, doesn’t it?

There is more information on this in the column I wrote in the September issue of the IEEE Robotics and Automation Magazine.





Fabio Bonsignorio is a professor in the BioRobotics Institute at the Scuola Superiore Sant'Anna (Pisa, Italy).






 
