Robohub.org
 

A flexible lens controlled by light-activated artificial muscles promises to let soft machines see


30 October 2025




This rubbery disc is an artificial eye that could give soft robots vision. Image credit: Corey Zheng/Georgia Institute of Technology.

By Corey Zheng, Georgia Institute of Technology and Shu Jia, Georgia Institute of Technology

Inspired by the human eye, our biomedical engineering lab at Georgia Tech has designed an adaptive lens made of soft, light-responsive, tissuelike materials.

Adjustable camera systems usually require a set of bulky, moving, solid lenses and a pupil in front of a camera chip to adjust focus and intensity. In contrast, human eyes perform these same functions using soft, flexible tissues in a highly compact form.

Our lens, called the photo-responsive hydrogel soft lens, or PHySL, replaces rigid components with soft polymers that act as artificial muscles. The polymer is a hydrogel, a water-based material. This hydrogel muscle changes the shape of a soft lens to alter its focal length, a mechanism analogous to the ciliary muscles in the human eye.
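The focusing principle can be illustrated with the thin-lens (lensmaker's) relation: when an artificial muscle contracts and steepens the lens surface, the focal length shortens. This is a minimal sketch of that optics, not code from our work; the refractive index and curvature values are hypothetical placeholders, not measurements of the PHySL.

```python
# Illustrative thin-lens (lensmaker's) relation for a variable-focus lens:
# contracting the artificial muscle steepens the surface curvature, which
# shortens the focal length. All numbers below are hypothetical examples.

def focal_length_mm(n: float, r1_mm: float, r2_mm: float = float("inf")) -> float:
    """Thin-lens focal length from 1/f = (n - 1) * (1/R1 - 1/R2).

    n      -- refractive index of the lens material
    r1_mm  -- radius of curvature of the front surface, in mm
    r2_mm  -- radius of curvature of the back surface (infinite = flat)
    """
    inv_f = (n - 1.0) * (1.0 / r1_mm - 1.0 / r2_mm)
    return 1.0 / inv_f

# A plano-convex soft lens with an assumed refractive index of 1.41:
f_relaxed = focal_length_mm(1.41, r1_mm=10.0)    # gentler curvature
f_contracted = focal_length_mm(1.41, r1_mm=6.0)  # steeper curvature after contraction

# Steeper curvature gives a shorter focal length, i.e. a nearer focus.
print(f"relaxed: {f_relaxed:.1f} mm, contracted: {f_contracted:.1f} mm")
```

The hydrogel muscle effectively tunes `r1_mm` continuously, which is why shaping the lens shifts the focus.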

The hydrogel material contracts in response to light, allowing us to control the lens without touching it by projecting light onto its surface. This property also allows us to finely control the shape of the lens by selectively illuminating different parts of the hydrogel. By eliminating rigid optics and structures, our system is flexible and compliant, making it more durable and safer in contact with the body.

Why it matters

Artificial vision using cameras is commonplace in a variety of technological systems, including robots and medical tools. But the optics needed to form a visual system still typically rely on rigid materials and electric power. This limitation presents a challenge for emerging fields, including soft robotics and biomedical tools, that integrate soft materials into flexible, low-power and autonomous systems. Our soft lens is well suited to these applications.

Soft robots are machines made with compliant materials and structures, taking inspiration from animals. This additional flexibility makes them more durable and adaptive. Researchers are using the technology to develop surgical endoscopes, grippers for handling delicate objects and robots for navigating environments that are difficult for rigid robots.

The same principles apply to biomedical tools. Tissuelike materials can soften the interface between body and machine, making biomedical tools safer by allowing them to move with the body. Examples include skinlike wearable sensors and hydrogel-coated implants.

This variable-focus soft lens, shown viewing a Rubik’s Cube, can flex and twist without being damaged. Image credit: Corey Zheng/Georgia Institute of Technology.

What other research is being done in this field

This work merges concepts from tunable optics and soft “smart” materials. While these materials are often used to create soft actuators – parts of machines that move – such as grippers or propulsors, their application in optical systems has faced challenges.

Many existing soft lens designs depend on liquid-filled pouches or actuators requiring electronics. These factors can increase complexity or limit their use in delicate or untethered systems. Our light-activated design offers a simpler, electronics-free alternative.

What’s next

We aim to improve the performance of the system using advances in hydrogel materials. New research has yielded several types of stimuli-responsive hydrogels with faster and more powerful contraction. We plan to incorporate these latest material developments to improve the physical capabilities of the photo-responsive hydrogel soft lens.

We also aim to show its practical use in new types of camera systems. In our current work, we developed a proof-of-concept, electronics-free camera using our soft lens and a custom light-activated microfluidic chip. We plan to incorporate this system into a soft robot to give it electronics-free vision, a significant demonstration of our design's potential to enable new types of soft visual sensing.

The Research Brief is a short take on interesting academic work.

Corey Zheng, PhD Student in Biomedical Engineering, Georgia Institute of Technology and Shu Jia, Assistant Professor of Biomedical Engineering, Georgia Institute of Technology

This article is republished from The Conversation under a Creative Commons license. Read the original article.




The Conversation is an independent source of news and views, sourced from the academic and research community and delivered direct to the public.









 















© 2026 Association for the Understanding of Artificial Intelligence