
Research & Innovation

interview by   -   June 8, 2018

In this episode, Audrow Nash interviews Jonathan W. Hurst, Associate Professor of Mechanical Engineering at Oregon State University and CTO and co-founder of Agility Robotics, about legged locomotion and a bipedal robot called “Cassie.” Hurst discusses Cassie’s design, the types of research questions Cassie enables, and applications of walking robots, including package delivery.

interview by   -   March 19, 2018



In this episode, Audrow Nash speaks with Maja Matarić, a professor at the University of Southern California and the Chief Science Officer of Embodied, about socially assistive robotics. Socially assistive robotics aims to endow robots with the ability to help people through individual non-contact assistance in convalescence, rehabilitation, training, and education. For example, a robot could help a child on the autism spectrum connect with more neurotypical children, or motivate a stroke survivor to follow their rehabilitation exercise routine (see the videos below). In this interview, Matarić discusses the care gap in health care, how her work leverages research in psychology to make robots engaging, and opportunities in socially assistive robotics for entrepreneurship.

A Bayesian optimization method that integrates the metabolic costs in wearers of this hip-assisting exosuit enabled the individualized fine-tuning of assistive forces. Credit: Ye Ding/Harvard University
By Leah Burrows

When it comes to soft, assistive devices — like the exosuit being designed by the Harvard Biodesign Lab — the wearer and the robot need to be in sync. But every human moves a bit differently and tailoring the robot’s parameters for an individual user is a time-consuming and inefficient process.

Now, researchers from the Wyss Institute for Biologically Inspired Engineering and the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS) have developed an efficient machine learning algorithm that can quickly tailor personalized control strategies for soft, wearable exosuits.
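The human-in-the-loop tuning idea can be sketched as a propose-measure-keep-the-best loop. This is a much-simplified stand-in for the Bayesian optimization the researchers used (no surrogate model or acquisition function), and the function names and the simulated “metabolic cost” curve below are hypothetical, for illustration only.

```python
import random

def measure_metabolic_cost(force_peak):
    """Hypothetical stand-in for measuring a wearer's metabolic cost at a
    given peak assistive force. In a real study this would be a noisy
    treadmill measurement; here it is a smooth curve with a minimum at 30."""
    baseline = 5.0
    return baseline + 0.02 * (force_peak - 30.0) ** 2

def tune_assistance(n_trials=20, lo=0.0, hi=60.0, seed=0):
    """Simplified tuning loop: propose a force setting, measure the cost,
    keep the best setting seen so far. True Bayesian optimization would
    instead fit a surrogate model of cost-vs-setting and propose points
    that balance exploring uncertain settings with exploiting good ones."""
    rng = random.Random(seed)
    best_setting, best_cost = None, float("inf")
    for _ in range(n_trials):
        candidate = rng.uniform(lo, hi)
        cost = measure_metabolic_cost(candidate)
        if cost < best_cost:
            best_setting, best_cost = candidate, cost
    return best_setting, best_cost

setting, cost = tune_assistance()
print(f"best force setting: {setting:.1f}, metabolic cost: {cost:.3f}")
```

The point of replacing this loop with Bayesian optimization is sample efficiency: each “measurement” on a human wearer is expensive, so the optimizer must converge in as few trials as possible.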

interview by   -   March 4, 2018



In this episode, Audrow Nash speaks with Monica Daley about learning from birds about legged locomotion. To do this, Daley analyzes the gaits of guineafowl in various experiments to understand the mechanical principles underlying gaits, such as energetic economy, mechanical limits, and how the birds avoid injury. She then tests her ideas about legged locomotion on legged robots with collaborators, including Jonathan Hurst from Oregon State University. Daley also speaks about her experience with interdisciplinary collaborations. 

by   -   February 2, 2018

MIT Media Lab spinout Ori is developing smart robotic furniture that transforms into a bedroom, working or storage area, or large closet — or slides back against the wall — to optimize space in small apartments.
Courtesy of Ori

By Rob Matheson

Imagine living in a cramped studio apartment in a large city — but being able to summon your bed or closet through a mobile app, call forth your desk using voice command, or have everything retract at the push of a button.

interview by   -   January 6, 2018



In this episode, Audrow Nash interviews Elliott Rouse, Assistant Professor at the University of Michigan, about an open-source prosthetic leg, that is, a robotic knee and ankle. Rouse’s goal is to provide an inexpensive and capable platform for researchers to use so that they can work on prostheses without developing their own hardware, which is both time-consuming and expensive. Rouse discusses the design of the leg, the software interface, and the project’s timeline.

by   -   December 31, 2017

By Ivan Evtimov, Kevin Eykholt, Earlence Fernandes, and Bo Li based on recent research by Ivan Evtimov, Kevin Eykholt, Earlence Fernandes, Tadayoshi Kohno, Bo Li, Atul Prakash, Amir Rahmati, Dawn Song, and Florian Tramèr.

Deep neural networks (DNNs) have enabled great progress in a variety of application areas, including image processing, text analysis, and speech recognition. DNNs are also being incorporated as an important component in many cyber-physical systems. For instance, the vision system of a self-driving car can take advantage of DNNs to better recognize pedestrians, vehicles, and road signs. However, recent research has shown that DNNs are vulnerable to adversarial examples: Adding carefully crafted adversarial perturbations to the inputs can mislead the target DNN into mislabeling them during run time. Such adversarial examples raise security and safety concerns when applying DNNs in the real world. For example, adversarially perturbed inputs could mislead the perceptual systems of an autonomous vehicle into misclassifying road signs, with potentially catastrophic consequences.
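The core mechanism can be illustrated with the fast gradient sign method (FGSM), a standard way to craft such perturbations: shift each input feature slightly in the direction that increases the model’s loss. The tiny linear “classifier” below is a hypothetical toy model, not the vision systems discussed above; it shows how a small, bounded perturbation can flip a prediction.

```python
def predict(weights, x, bias=0.0):
    """Linear classifier score: positive -> class 1, negative -> class 0."""
    return sum(w * xi for w, xi in zip(weights, x)) + bias

def fgsm_perturb(weights, x, true_label, epsilon):
    """Shift each feature by epsilon in the direction that increases the
    loss for the true label. For a linear score s = w.x + b, the gradient
    of s with respect to x is w, so the perturbation is epsilon times the
    sign of each weight (flipped depending on the true label)."""
    direction = 1.0 if true_label == 0 else -1.0
    sign = lambda v: (v > 0) - (v < 0)
    return [xi + epsilon * direction * sign(w)
            for w, xi in zip(weights, x)]

weights = [2.0, -1.0]
x = [0.5, 0.2]  # correctly scored as class 1
x_adv = fgsm_perturb(weights, x, true_label=1, epsilon=0.5)
print(predict(weights, x))      # 0.8  -> class 1
print(predict(weights, x_adv))  # -0.7 -> flipped to class 0
```

Against a deep network the same idea applies, except the gradient is computed by backpropagation, and the perturbation is kept small enough that the input looks unchanged to a human observer.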

Credit: Jerry Wright

Reaching an optimal shared decision in a distributed way is a key aspect of many multi-agent and swarm robotic applications.

by   -   October 28, 2017

Just in time for Halloween, a research team from the MIT Media Lab’s Scalable Cooperation group has introduced Shelley: the world’s first artificial intelligence-human horror story collaboration.

Sony Pictures

The new Blade Runner sequel will return us to a world where sophisticated androids made with organic body parts can match the strength and emotions of their human creators. As someone who builds biologically inspired robots, I’m interested in whether our own technology will ever come close to matching the “replicants” of Blade Runner 2049.

interview by   -   October 14, 2017



In this episode, Audrow Nash interviews Chris Gerdes, Professor of Mechanical Engineering at Stanford University, about designing high-performance autonomous vehicles. The idea is to make vehicles safer; as Gerdes says, he wants to “develop vehicles that could avoid any accident that can be avoided within the laws of physics.”

In this interview, Gerdes discusses developing a model for high-performance control of a vehicle; their autonomous race car, an Audi TTS named ‘Shelley,’ and how its autonomous performance compares to amateur and professional race car drivers; and an autonomous, drifting DeLorean named ‘MARTY.’

Robots help ants with daily chores so they can be accepted into the colony. Image credit – Dr Bertrand Collignon

by Aisling Irwin

Tiny mobile robots are learning to work with insects in the hope the creatures’ sensitive antennae and ability to squeeze into small spaces can be put to use serving humans.

by   -   September 29, 2017


The European Robotics League (ERL) announced the winners of the ERL Emergency Robots 2017 major tournament during the awards ceremony held on Saturday, 23 September, at Giardini Pro Patria in Piombino, Italy.

The ERL Emergency Robots 2017 competition consisted of four scenarios, inspired by the nuclear accident in Fukushima (Japan, 2011) and designed specifically for multi-domain human-robot teams. The first scenario, the Grand Challenge, spans three domains (sea, air, and land), while the other three scenarios each span two domains.

by   -   September 29, 2017


In addition to the Competition Awards, Marta Palau Franco from Bristol Robotics Laboratory and ERL Emergency project manager introduced the referees’ special awards.

by   -   September 28, 2017


Designing and representing control algorithms is challenging in swarm robotics, where the collective swarm performance depends on interactions between robots and with their environment. The currently available modeling languages, such as UML, cannot fully express these interactions. The Behaviour-Data Relations Modeling Language (BDRML) explicitly represents robot behaviours and data that robots utilise, as well as relationships between them. This allows BDRML to express control algorithms where robots cooperate and share information with each other while interacting with the environment. Here’s the work I presented this week at #IROS2017.




