Robohub.org
Podcast ep. 032

Brain-machine interfaces with Charles Higgins and Steve Potter

14 August 2009

In today’s show we’ll be speaking with two experts in the field of brain-machine interfaces. Our first guest, Charles Higgins from the University of Arizona, tells us how he uses insects to control robot motion and how they might be used in the future to develop new biological sensors for artificial systems.
We then speak with Steve Potter from the Georgia Institute of Technology. Instead of taking a fully developed brain and connecting it to a robot, he grows neural circuitry in a Petri dish and interfaces it with robots, with the ambition of discovering how we learn and form memories.

Charles Higgins

Charles Higgins is an associate professor and leader of the Higgins Lab at the University of Arizona. Though he started as an electrical engineer, his fascination with the natural world has led him to study insect vision and visual processing, and to try to meld the worlds of robotics and biology. This fascination, and his interest in sharing it with others, brings him every year to the Neuromorphic Engineering Workshop in Telluride, Colorado, where he met our interviewer Adam and took him dragonfly-hunting!

Higgins first tells us about his experiments with natural systems such as dragonflies, and how he is learning how their brains work in the hope of applying concepts from neurobiology to engineered systems. He then talks about his most recent work using the remarkable visual system of a dragonfly as a sensor to control a robot, which in turn provides motion stimulus back to the dragonfly in a closed-loop system. He finishes by telling us a bit about a future in which we will design insect-inspired robots, or even build insects directly into them!
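The closed loop Higgins describes can be pictured as a simple control cycle: activity recorded from a motion-sensitive neuron drives the robot's turning, and the robot's motion changes the visual stimulus the insect sees. The toy simulation below is purely illustrative; the neuron model, gains, and update rule are invented for this sketch and are not taken from the Higgins Lab's actual setup.

```python
# Hypothetical sketch of a dragonfly-in-the-loop controller. A toy
# motion-sensitive neuron responds to visual motion; its firing rate is
# decoded into a turn command; turning toward the motion reduces the
# apparent visual velocity, closing the loop.

def spike_rate_from_stimulus(stimulus_velocity):
    """Toy neuron: fires above baseline for positive visual motion."""
    baseline = 10.0  # spikes/s at rest (invented value)
    gain = 5.0       # spikes/s per unit of visual velocity (invented)
    return max(0.0, baseline + gain * stimulus_velocity)

def motor_command(spike_rate, baseline=10.0, k=0.2):
    """Map the deviation from baseline firing to a robot turn rate."""
    return k * (spike_rate - baseline)

def run_closed_loop(steps=50, initial_velocity=1.0):
    """Run the loop; the robot turns to cancel the apparent motion."""
    visual_velocity = initial_velocity
    for _ in range(steps):
        rate = spike_rate_from_stimulus(visual_velocity)
        turn = motor_command(rate)
        # Turning toward the motion halves the apparent velocity here.
        visual_velocity -= 0.5 * turn
    return visual_velocity

residual = run_closed_loop()  # apparent motion left after 50 steps
```

With these made-up gains the loop is stable and the apparent visual motion decays geometrically toward zero, which is the essence of the closed-loop behavior described above.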

Steve Potter

Steve Potter is the Director of the Potter Group, part of the Laboratory for NeuroEngineering, a collective research unit shared between Emory University and the Georgia Institute of Technology. To understand how the neurocircuitry in the brain can lead to learning and memory, he has been growing neural circuits in Petri dishes and hooking them up to the sensors and actuators of robots. The embodiment provides the stimulus needed for the brain to develop. Because the neurons are in a dish, they can easily be monitored over time, providing a close-up sneak peek at the brain activity.

Robots that have been hooked up to this system include the Koala and Khepera wheeled robots from K-Team, and a robot artist named MEART (Multi-Electrode Array Art). MEART was built in collaboration with the SymbioticA Research Group and went on tour around the world, drawing pictures based on stimulation from its in-vitro brain and receiving camera images of its art as feedback. After weeks of stimulation, the neural culture actually calms down, providing insight into possible treatments for epilepsy.
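The record, decode, actuate, and stimulate cycle behind systems like MEART can be sketched as a simple loop. Everything in the sketch below is a stand-in: the `SimulatedCulture` class, its dynamics, and the gradual "calming" rule are invented for illustration and do not reproduce Potter's actual recording or stimulation protocols.

```python
# Illustrative embodiment loop: record activity from a simulated
# culture on a multi-electrode array, decode it into an actuator
# command, and feed the sensor reading back as stimulation.

import random

class SimulatedCulture:
    """Stand-in for a neural culture on a multi-electrode array."""
    def __init__(self, n_electrodes=60, seed=0):
        self.rng = random.Random(seed)
        self.excitability = [1.0] * n_electrodes

    def record(self):
        # Toy spike counts per electrode over one sampling window.
        return [self.rng.random() * e for e in self.excitability]

    def stimulate(self, pattern):
        # Repeated stimulation gently lowers excitability, loosely
        # echoing the "calming" of chronically stimulated cultures.
        for i, p in enumerate(pattern):
            self.excitability[i] = max(0.1, self.excitability[i] - 0.01 * p)

def embodiment_step(culture, sensor_reading):
    """One cycle: record, decode, actuate, stimulate."""
    activity = culture.record()
    # Crude decoding: mean activity becomes the actuator command.
    command = sum(activity) / len(activity)
    # Encode the sensor reading as a uniform stimulation pattern.
    culture.stimulate([sensor_reading] * len(culture.excitability))
    return command

culture = SimulatedCulture()
commands = [embodiment_step(culture, sensor_reading=0.5) for _ in range(100)]
```

Because the culture stays in the dish while the robot supplies the body, every cycle of this loop can be logged, which is what makes the setup attractive for studying learning over time.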



MEART Robotic Arm

Finally, Potter gives us his take on whether these hybrid living robots, known as Hybrots or Animats, are more life than machine.



Latest News:

For more information on the LEGO Moonbots challenge, the AUVSI conference and the Evolta robot, visit the Robots Forum.
View and post comments on this episode in the forum.





Podcast team The ROBOTS Podcast brings you the latest news and views in robotics through its bi-weekly interviews with leaders in the field.













