Robohub.org
 

Paper computing technology: the first step to paper-based Google Docs


by DigInfo TV
30 October 2012




In addition to a camera and computer, this system uses a laser and UV light, making it possible for the computer to work directly with hand-drawn sketches on paper.

For example, the system can erase all but the edges of hand-written characters to create 3D-like text, or automatically color in a figure drawn by hand.

“This is one technology for truly turning ordinary paper into a display. Until now, it’s been possible to project things onto paper and use it as a screen, or to import things drawn on paper to a PC by using a digital pen. But the first method uses light, so the results can only be seen in the dark, and with the second method, even if you can import things, you can’t access them on paper from the computer.”

The sketching pen uses Frixion thermo-sensitive ink, which turns transparent when heated; to erase a sketch, a laser heats the ink from behind the paper. Erasure is accurate to intervals of 0.024 mm.

The paper is coated with a photochromic material that changes color when it absorbs light; a DMD-driven UV projector with a resolution of 1024 x 768 pixels prints the image onto the paper.
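The two figures quoted above give a rough sense of scale. The Python sketch below compares them, assuming (purely for illustration; the article does not state the projection area) that the 1024 x 768 projector image covers an A4 sheet:

```python
# Back-of-envelope comparison of the laser's erase pitch with the UV
# projector's dot pitch. The A4 projection area is an assumption made
# here for illustration; only the pitch and resolution come from the article.

A4_WIDTH_MM, A4_HEIGHT_MM = 210.0, 297.0   # assumed projection area (hypothetical)
PROJ_W_PX, PROJ_H_PX = 1024, 768           # DMD projector resolution (from article)
ERASE_PITCH_MM = 0.024                     # laser erase interval (from article)

def dot_pitch_mm(size_mm: float, pixels: int) -> float:
    """Physical size of one projected pixel along one axis."""
    return size_mm / pixels

uv_pitch_w = dot_pitch_mm(A4_WIDTH_MM, PROJ_W_PX)   # ~0.21 mm per UV pixel
uv_pitch_h = dot_pitch_mm(A4_HEIGHT_MM, PROJ_H_PX)  # ~0.39 mm per UV pixel

print(f"UV pitch (width axis) : {uv_pitch_w:.3f} mm")
print(f"UV pitch (height axis): {uv_pitch_h:.3f} mm")
print(f"Laser erase pitch     : {ERASE_PITCH_MM:.3f} mm")
```

Under this assumption, the laser can erase ink roughly an order of magnitude more finely than the UV projector can print it, which fits the division of labor the article describes: coarse printing via UV, precise erasing via laser.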

“The idea is to do computing on paper. But in the future, we’d like to enable several people to create one document, like with Google Docs, actually using real-world paper while far apart. We’d also like to enhance the rendering that’s possible through collaboration between people and computers. For example, by enabling finer detail than is possible by hand, and letting you fill in large areas at once.”



DigInfo TV is a Tokyo-based online video news platform dedicated to producing original coverage of cutting edge technology, research and products from Japan.


©2026.02 - Association for the Understanding of Artificial Intelligence