
Sensing

interview by   -   July 12, 2021

At Xwing, they are converting traditional aircraft into remotely operated ones. They do this by retrofitting planes with multiple sensors, including cameras, radar, and lidar, and by developing highly accurate sensor fusion and perception algorithms that allow the planes to understand the world around them.
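The interview stays at a high level, but to make "sensor fusion" a little more concrete, here is a minimal, hypothetical sketch (not Xwing's actual system) that fuses independent position estimates from a camera, a radar, and a lidar by inverse-variance weighting, the static special case of a Kalman update:

```python
# Hypothetical illustration of multi-sensor fusion: each sensor reports a
# position estimate and a variance; the fused estimate weights each sensor
# by the inverse of its variance, so more confident sensors count for more.
import numpy as np

def fuse_estimates(estimates):
    """estimates: list of (position ndarray, variance) pairs for one target."""
    weights = np.array([1.0 / var for _, var in estimates])
    positions = np.array([pos for pos, _ in estimates])
    fused_var = 1.0 / weights.sum()
    fused_pos = fused_var * (weights[:, None] * positions).sum(axis=0)
    return fused_pos, fused_var

# Illustrative numbers: the camera is noisier in depth than radar or lidar.
camera = (np.array([102.0, 5.0, 30.0]), 9.0)   # position in m, variance in m^2
radar  = (np.array([100.5, 4.8, 29.5]), 1.0)
lidar  = (np.array([100.2, 4.9, 29.7]), 0.25)
print(fuse_estimates([camera, radar, lidar]))
```

A production system would of course track objects over time and handle differing sensor rates and fields of view; this only shows the core weighting idea.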

A team of researchers from the Wyss Institute for Biologically Inspired Engineering at Harvard University and the Massachusetts Institute of Technology has found a way to embed synthetic biology reactions into fabrics, creating wearable biosensors that can be customized to detect pathogens and toxins and alert the wearer.

interview by   -   June 10, 2021
Intel RealSense Facial Scanning
Intel RealSense ID was designed with privacy as a top priority. Purpose-built for user protection, Intel RealSense ID processes all facial images locally and encrypts all user data. (Credit: Intel Corporation)

Intel RealSense is known in the robotics community for its plug-and-play stereo cameras. These cameras make gathering 3D depth data a seamless process, with easy integrations into ROS to simplify the software development for your robots. From the RealSense team, Joel Hagberg talks about how they built this product, which allows roboticists to perform computer vision and machine learning at the edge.
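To show how little code a basic depth capture takes, here is a minimal sketch using the official pyrealsense2 Python wrapper (the 640x480 at 30 fps stream settings are just illustrative); ROS users get the same streams published as topics by Intel's realsense-ros package.

```python
# Minimal sketch: read one depth frame from a RealSense camera and report the
# distance to the centre pixel. Stream settings below are illustrative.
import pyrealsense2 as rs

pipeline = rs.pipeline()
config = rs.config()
config.enable_stream(rs.stream.depth, 640, 480, rs.format.z16, 30)
pipeline.start(config)

try:
    frames = pipeline.wait_for_frames()      # block until a frameset arrives
    depth = frames.get_depth_frame()
    w, h = depth.get_width(), depth.get_height()
    print("distance to centre pixel: %.3f m" % depth.get_distance(w // 2, h // 2))
finally:
    pipeline.stop()
```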

interview by   -   May 27, 2021

Matt Bilsky, founder and CEO of FLX Solutions, discusses the snake-like robot he invented, called the FLX BOT. The FLX BOT consists of modular links, each with a joint that can extend and rotate to get into tight spaces, and each link carries sensors such as an inertial measurement unit and a camera. The robot is used to navigate and work in challenging environments, such as above ceilings and within walls. Matt discusses the key innovations of his product as well as the academic and entrepreneurial journey that led him to the FLX BOT.
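To picture how a chain of extend-and-rotate links reaches into a wall cavity, here is a short, hypothetical planar kinematics sketch (not FLX Solutions' actual model): the tip pose is simply the product of each module's rotate-then-extend transform.

```python
# Hypothetical 2D forward kinematics for a chain of modules that each rotate
# by an angle and then extend along their own axis.
import numpy as np

def link_transform(angle_rad, extension_m):
    """Homogeneous 2D transform for one module: rotate, then extend along x."""
    c, s = np.cos(angle_rad), np.sin(angle_rad)
    return np.array([[c, -s, c * extension_m],
                     [s,  c, s * extension_m],
                     [0.0, 0.0, 1.0]])

def tip_pose(modules):
    """modules: list of (angle_rad, extension_m) pairs, base to tip."""
    pose = np.eye(3)
    for angle, ext in modules:
        pose = pose @ link_transform(angle, ext)
    return pose[0, 2], pose[1, 2], np.arctan2(pose[1, 0], pose[0, 0])

# Three modules snaking around a corner (values purely illustrative).
print(tip_pose([(0.0, 0.3), (np.pi / 4, 0.3), (-np.pi / 2, 0.2)]))  # x, y, heading
```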

 

Over the years, robots have gotten quite good at identifying objects — as long as they’re out in the open. MIT researchers have now designed a sharp-tipped robot finger equipped with tactile sensing to meet the challenge of identifying buried objects.

by   -   March 19, 2021
The new technology pairs wireless sensing with artificial intelligence to determine when a patient is using an insulin pen or inhaler, and it flags potential errors in the patient’s administration method. | Image: courtesy of the researchers

From swallowing pills to injecting insulin, patients frequently administer their own medication. But they don’t always get it right. Improper adherence to doctors’ orders is commonplace, accounting for thousands of deaths and billions of dollars in medical costs annually. MIT researchers have developed a system to reduce those numbers for some types of medications.

interview by   -   March 12, 2021

FieldPrinter by Dusty Robotics

Abate interviews Tessa Lau about her startup Dusty Robotics, which is innovating in the construction industry.

At Dusty Robotics, they developed a robot that automates the layout of floor plans on construction sites. Typically, this is done manually with a tape measure and printed plans, a difficult task that can take a team of two a week to complete. Time-consuming tasks like this are incredibly expensive on a construction site, where multiple other teams are waiting for the layout to be finished, and any errors in the process are even more time-consuming to fix. By using a robot to automatically convert 3D building models into markings on the floor, both the time required and the number of errors are dramatically reduced.
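As a purely hypothetical illustration of the coordinate problem such a robot has to solve (this is not Dusty Robotics' software), the sketch below fits the 2D transform that maps plan geometry from the building model onto the slab, given two control points surveyed in both frames, then transforms a wall line into floor coordinates for marking. Complex numbers stand in for 2D points, so the whole transform is q = a*p + b.

```python
# Hypothetical sketch: register a floor plan to the physical slab using two
# surveyed control points, then map plan segments into site coordinates.

def fit_transform(p1, p2, q1, q2):
    """p1, p2: control points in model coords; q1, q2: the same points on site."""
    a = (q2 - q1) / (p2 - p1)     # rotation plus uniform scale
    b = q1 - a * p1               # translation
    return a, b

def map_segment(segment, a, b):
    start, end = segment
    return (a * start + b, a * end + b)

# Illustrative numbers, in metres: the model origin sits 2.5 m east and
# 1.0 m north of the site origin, with no rotation.
a, b = fit_transform(0 + 0j, 10 + 0j, 2.5 + 1.0j, 12.5 + 1.0j)

wall = (1 + 2j, 1 + 6j)            # a wall line from the plan
print(map_segment(wall, a, b))     # endpoints to mark on the floor
```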


While modern cameras provide machines with a very well-developed sense of vision, robots still lack a comparably comprehensive sense of touch. At ETH Zurich, in the group led by Prof. Raffaello D’Andrea at the Institute for Dynamic Systems and Control, we have developed a tactile sensing principle that allows robots to retrieve rich contact feedback from their interactions with the environment. I recently described our approach in a talk at the latest TEDxZurich. The talk features a tech demo that introduces this novel tactile sensing technology, targeting the next generation of soft robotic skins.

interview by   -   February 16, 2021

Kate speaks with Anni Kern, who has led communication, strategy, and teams at Cybathlon for over four years. She describes the motivation and concept behind the Cybathlon organization: to develop a common platform that removes barriers between people with disabilities, technology developers, and the public. Anni also describes the specifics of Cybathlon competitions and how they are organized and planned.

 

An interesting discussion with Prof. Ali Khademhosseini, CEO of the Terasaki Institute and one of the pioneers of the bioengineering field. Prof. Khademhosseini’s journey from Harvard and UCLA to the Terasaki Institute is truly inspiring. What does the institute do to bring a product into the real world? Learn about the design challenges of biomaterials, organs-on-a-chip, and soft robotics in this episode of the IEEE RAS Soft Robotics Podcast.

An interesting discussion with Hod Lipson, head of the Creative Machines Lab at Columbia University in New York. Can robots be self-aware? Can they design other robots and self-repair? Why should we evolve robots to do tasks that animals do so well? Why don’t we have useful autonomous robots in the real world yet? Find out Hod’s answers to these questions, along with updates on the development of VoxCAD for designing and simulating soft robots, in this episode of the IEEE RAS Soft Robotics Podcast.

Sensor sleeve
Graduate student Moritz Graule demonstrates a fabric arm sleeve with embedded sensors. The sensors detect small changes in Graule’s forearm muscle through the fabric. Such a sleeve could be used in everything from virtual reality simulations and sportswear to clinical diagnostics for neurodegenerative diseases like Parkinson’s disease. Credit: Oluwaseun Araromi/Harvard SEAS

By Leah Burrows / SEAS communications

Newly engineered slinky-like strain sensors for textiles and soft robotic systems survive the washing machine, cars and hammers.

interview by   -   November 15, 2020

 

In this episode, Shihan Lu interviews Jivko Sinapov, Assistant Professor in the Computer Science Department at Tufts University, about his work on behavior-grounded multisensory perception and exploration in robotics. Dr. Sinapov discusses several perspectives on multisensory perception in robotics, including data collection, data fusion, and robot control and planning. He also shares his experience about using robotics for K-12 education.

by   -   October 23, 2020
MorphSensor glasses
An MIT team used MorphSensor to design multiple applications, including a pair of glasses that monitor light absorption to protect eye health. Credits: Photo courtesy of the researchers.

By Rachel Gordon

We’ve come a long way since the first 3D-printed item, an eye wash cup, to now being able to rapidly fabricate things like car parts, musical instruments, and even biological tissues and organoids.

interview by   -   October 19, 2020

In this episode, Abate interviews Josh Lessing, co-founder and CEO of Root AI. At Root AI, they are developing a system that tracks data on the farm and autonomously harvests crops using delicate grippers and computer vision. Lessing talks about the path they took to build a product with good market fit and how they brought a venture-capital-backed startup to market.


