Robohub.org
Podcast ep. 032

Brain-machine interfaces with Charles Higgins and Steve Potter

14 August 2009

In today’s show we’ll be speaking with two experts in the field of brain-machine interfaces. Our first guest, Charles Higgins from the University of Arizona, tells us how he uses insects to control robot motion and how they might be used in the future to develop new biological sensors for artificial systems.
We then speak with Steve Potter from the Georgia Institute of Technology. Instead of taking a fully developed brain and connecting it to a robot, he grows neural circuitry in a Petri dish and interfaces it with robots, with the ambition of discovering how we learn and form memories.

Charles Higgins

Charles Higgins is an associate professor and leader of the Higgins Lab at the University of Arizona. Though he started as an electrical engineer, his fascination with the natural world has led him to study insect vision and visual processing, and to try to meld the worlds of robotics and biology. This fascination, and his interest in sharing it with others, brings him every year to the Neuromorphic Engineering Workshop in Telluride, Colorado, where he met our interviewer Adam and took him dragonfly-hunting!

Higgins first tells us about his experiments with natural systems such as dragonflies, and how he’s learning about how their brains work in the hope of applying concepts from neurobiology to engineering systems. He then talks about his most recent work, in which he uses the amazing visual system of a dragonfly as a sensor to control a robot and, in turn, provides motion stimulus back to the dragonfly in a closed-loop system. He finishes by telling us a bit about a future in which we will design insect-inspired robots, or even have insects built directly into them!

Steve Potter

Steve Potter is the Director of the Potter Group, part of the Laboratory for NeuroEngineering, a collective research unit shared between Emory University and the Georgia Institute of Technology. To understand how the neurocircuitry in the brain can lead to learning and memory, he’s been growing neural circuits in Petri dishes and hooking them up to the sensors and actuators of robots. The embodiment provides the stimulus needed for the brain to develop. Because the neurons are in a dish, they can easily be monitored over time, providing a close-up sneak peek into the brain activity.

Robots that have been hooked up to this system include the Koala and Khepera wheeled robots from K-Team and a robot artist named MEART (Multi-Electrode Array Art). MEART was built in collaboration with the SymbioticA Research Group and went on tour around the world, drawing pictures based on stimulation from its in-vitro brain and feeding camera images of its art back to the neurons. After weeks of stimulation, the brain activity actually calms down, providing insight into possible treatments for epilepsy.



MEART Robotic Arm

Finally, Potter gives us his take on whether these hybrid living robots (Hybrots), or animats, are more life or machine.

Latest News:

For more information on the LEGO Moonbots challenge, the AUVSI conference and the Evolta robot, visit the Robots Forum.
View and post comments on this episode in the forum





Podcast team The ROBOTS Podcast brings you the latest news and views in robotics through its bi-weekly interviews with leaders in the field.





