Robohub.org

Podcast episode 273

Presented work at IROS 2018 (Part 1 of 3), with Alexandros Kogkas, Katie Driggs-Campbell, and Martin Karlsson


by Audrow Nash
12 November 2018



In this episode, Audrow Nash interviews Alexandros Kogkas, Katie Driggs-Campbell, and Martin Karlsson about the work they presented at the 2018 International Conference on Intelligent Robots and Systems (IROS) in Madrid, Spain.

Alexandros Kogkas is a PhD candidate at Imperial College London, and he speaks about an eye-tracking framework for understanding where a person is looking. This framework can be used to infer a person's intentions, for example, to hand a surgeon the correct tool or to help a person who is paraplegic. Kogkas discusses how the framework works, possible applications, and his future plans for it.

Katie Driggs-Campbell is a postdoctoral researcher at Stanford's Intelligent Systems Laboratory and, soon, an Assistant Professor at the University of Illinois Urbana-Champaign (UIUC). She speaks about making inferences about the world from human actions, specifically in the context of autonomous cars. In the work she discusses, a model of a human driver is used to infer what is happening in the world, for example, that a pedestrian is using a crosswalk. Driggs-Campbell also talks about how they evaluate this work.

Martin Karlsson is a PhD student at Lund University in Sweden, and he speaks about a haptic interface for mirrored robotic arms that requires no force sensing. He discusses a feedback law that allows forces to be mirrored between the arms, as well as future work on handling joint friction.



Audrow Nash is a Software Engineer at Open Robotics and the host of the Sense Think Act Podcast
