Robohub.org
 

An origami robot for touching virtual reality objects


by NCCR Robotics
18 December 2019




A group of EPFL researchers has developed a foldable device that fits in a pocket and can transmit touch stimuli when used in a human-machine interface.

When browsing an e-commerce site on your smartphone, or a music streaming service on your laptop, you can see pictures and hear sound snippets of what you are going to buy. But sometimes it would be great to touch it too – for example to feel the texture of a garment, or the stiffness of a material. The problem is that there are no miniaturized devices that can render touch sensations the way screens and loudspeakers render sight and sound, and that can easily be coupled to a computer or a mobile device.

Researchers in Professor Jamie Paik’s lab at EPFL have taken a step towards creating just that: a foldable device that fits in a pocket and can transmit touch stimuli when used in a human-machine interface. Called Foldaway, this miniature robot is based on origami robotics technology, which makes it easy to miniaturize and manufacture. Because it starts off as a flat structure, it can be printed with a technique similar to the one used for electronic circuits, and it can be easily stored and transported. At deployment, the flat structure folds along a pre-defined pattern of joints to take on the desired 3D, button-like shape. The device includes three actuators that generate movements, forces and vibrations in various directions; a moving origami mechanism on the tip that transmits sensations to the user’s finger; sensors that track the movements of the finger; and electronics to control the whole system. In this way the device can render different touch sensations that reproduce the physical interaction with objects or forms.
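To make the sense-compute-actuate cycle concrete, here is a minimal Python sketch of how such a haptic button’s control loop could be organized. All names (FingerState, read_sensors, drive_actuators), the spring-like force model, and the 1 kHz update rate are illustrative assumptions, not details from the paper.

```python
# Hypothetical sketch of a haptic render loop for a 3-DoF origami button.
# Names and parameters are illustrative, not from the paper.
import time
from dataclasses import dataclass

@dataclass
class FingerState:
    x: float  # lateral fingertip position (mm)
    y: float  # lateral fingertip position (mm)
    z: float  # vertical displacement (mm), 0 at rest

def render_force(state: FingerState, stiffness: float) -> tuple[float, float, float]:
    """Spring-like model: push back on the finger in proportion to displacement."""
    return (-stiffness * state.x, -stiffness * state.y, -stiffness * state.z)

def control_loop(read_sensors, drive_actuators, stiffness=0.5, rate_hz=1000):
    """Sense-compute-actuate cycle: the sensors track the fingertip,
    and the three actuators render the computed force."""
    period = 1.0 / rate_hz
    while True:
        state = read_sensors()
        drive_actuators(*render_force(state, stiffness))
        time.sleep(period)
```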

The Foldaway device, which is described in an article in the December issue of Nature Machine Intelligence and featured on the journal’s cover, comes in two versions, called Delta and Pushbutton. “The first one is more suitable for applications that require large movements of the user’s finger as input,” says Stefano Mintchev, a member of Jamie Paik’s lab and co-author of the paper. “The second one is smaller, pushing portability even further without sacrificing the force sensations transmitted to the user’s finger.”

Education, virtual reality and drone control
The researchers have tested their devices in three situations. In an educational context, they have shown that a portable interface, measuring less than 10 cm in length and width and 3 cm in height, can be used to navigate an atlas of human anatomy. The Foldaway device gives the user a different sensation for each organ the finger passes over: the contrast between soft lungs and the hard bones of the rib cage; the up-and-down movement of the heartbeat; and sharp variations of stiffness on the trachea.
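One way to implement such an atlas is to map each anatomical region to a haptic profile. The sketch below assumes a stiffness-plus-pulse model; the region names and parameter values are invented for illustration.

```python
# Illustrative mapping from anatomical regions to haptic profiles.
# Region names and parameter values are assumptions, not from the paper.
import math

HAPTIC_PROFILES = {
    "lung":    {"stiffness": 0.1, "pulse_hz": 0.0},  # soft
    "rib":     {"stiffness": 2.0, "pulse_hz": 0.0},  # hard bone
    "heart":   {"stiffness": 0.3, "pulse_hz": 1.2},  # beating
    "trachea": {"stiffness": 1.0, "pulse_hz": 0.0},  # stiff rings
}

def atlas_force(region: str, press_depth: float, t: float) -> float:
    """Resist pressing per the organ's stiffness, plus a pulse for the heartbeat."""
    p = HAPTIC_PROFILES.get(region, {"stiffness": 0.2, "pulse_hz": 0.0})
    pulse = 0.2 * math.sin(2 * math.pi * p["pulse_hz"] * t) if p["pulse_hz"] else 0.0
    return p["stiffness"] * press_depth + pulse
```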

As a virtual reality joystick, the Foldaway can give the user the sensation of grasping virtual objects and perceiving their stiffness, by modulating the force generated when the interface is pressed.
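A minimal sketch of such stiffness rendering, assuming a Hooke-like model with hypothetical units and a force cap (none of which are specified in the paper):

```python
# Hypothetical stiffness rendering for grasping virtual objects.
def grasp_force(press_depth_mm: float, object_stiffness: float,
                max_force_n: float = 1.0) -> float:
    """Hooke-like model: resisting force grows with press depth and
    the virtual object's stiffness, capped at the device's maximum."""
    return min(object_stiffness * press_depth_mm, max_force_n)

# A soft sponge barely resists; a rigid cube pushes back hard at the same depth.
print(grasp_force(2.0, 0.05))  # sponge-like: 0.1
print(grasp_force(2.0, 5.0))   # rigid: clipped at 1.0
```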


As a control interface for drones, the device can help solve the sensory mismatch that arises when users control a drone with their hands but perceive its response only through visual feedback. Two Pushbuttons can be combined, and their rotation can be mapped into commands for altitude, lateral and forward/backward movements. The interface also provides force feedback to the user’s fingertips, increasing the pilot’s awareness of the drone’s behaviour and of the effects of wind or other environmental factors.
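As a sketch of how two such buttons might be mapped to drone commands, consider the assignment below; the axis conventions, units, and gains are assumptions for illustration, not the mapping used by the authors.

```python
# Hypothetical mapping of two Pushbuttons to drone velocity commands.
def buttons_to_command(left_tilt, right_tilt, gain=0.5):
    """left_tilt / right_tilt: (pitch, roll) of each button in radians.
    Returns (vx, vy, vz) velocity setpoints for the drone."""
    vx = gain * right_tilt[0]  # forward/backward from right-button pitch
    vy = gain * right_tilt[1]  # lateral from right-button roll
    vz = gain * left_tilt[0]   # altitude from left-button pitch
    return vx, vy, vz

def wind_feedback(estimated_gust_n: float, scale: float = 0.3) -> float:
    """Force fed back to the fingertips, proportional to the estimated
    disturbance, so the pilot can feel wind acting on the drone."""
    return scale * estimated_gust_n
```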

The Foldaway device was developed by the Reconfigurable Robotics Lab at EPFL and is currently being commercialised by a spin-off, FOLDAWAY Haptics, supported by an NCCR Robotics Spin Fund grant.

“Now that computing devices and robots are more ubiquitous than ever, the quest for human-machine interaction is growing rapidly,” adds Marco Salerno, a member of Jamie Paik’s lab and co-author of the paper. “The miniaturization offered by origami robots can finally allow the integration of rich touch feedback into everyday interfaces, from smartphones to joysticks, or the development of completely new ones, such as interactive morphing surfaces.”

Literature
Mintchev, S., Salerno, M., Cherpillod, A. et al., “A portable three-degrees-of-freedom force feedback origami robot for human–robot interactions,” Nature Machine Intelligence 1, 584–593 (2019). doi:10.1038/s42256-019-0125-1








