Robohub.org
 

Paper Computing technology: the first step to paper-based Google Docs


by DigInfo TV
30 October 2012




In addition to a camera and computer, this system uses a laser and UV light, making it possible to manipulate hand-drawn sketches directly from the computer.

For example, the user can erase all but the edges of hand-written characters to create 3D-like text, or draw a figure by hand and have it colored in automatically.

“This is one technology for truly turning ordinary paper into a display. Until now, it’s been possible to project things onto paper and use it as a screen, or to import things drawn on paper to a PC by using a digital pen. But the first method uses light, so the results can only be seen in the dark, and with the second method, even if you can import things, you can’t access them on paper from the computer.”

The sketching pen uses Frixion thermo-sensitive ink, which becomes transparent when heated; sketches drawn with Frixion pens are lit from behind by a laser to erase them. The ink can be erased with high accuracy, at intervals of 0.024 mm.

The paper is coated with a photochromic material, which changes color when it absorbs light, and a DMD-driven UV projector with a resolution of 1024 x 768 pixels prints the image onto the paper.

“The idea is to do computing on paper. But in the future, we’d like to enable several people to create one document, like with Google Docs, actually using real-world paper while far apart. We’d also like to enhance the rendering that’s possible through collaboration between people and computers. For example, by giving more detailed access than you get by hand, and enabling you to draw large areas at once.”




DigInfo TV is a Tokyo-based online video news platform dedicated to producing original coverage of cutting edge technology, research and products from Japan.




