Calligraphy robot uses a Motion Copy System to reproduce detailed brushwork

by DigInfo TV
11 October 2012




A research group at Keio University, led by Seiichiro Katsura, has developed the Motion Copy System. The system identifies and stores detailed brush strokes based on information about the calligrapher's movements, enabling a robot to faithfully reproduce them.

The system stores calligraphy movements using a brush whose handle and tip are separate. The two parts are connected, with the handle acting as the master system and the tip as the slave system. Characters can be written by handling the device in the same way as an ordinary brush.
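As a rough illustration of the record-and-replay idea, the master handle's position and force could be sampled at each control step, stored, and later played back to drive the brush tip. The sketch below is only a sketch of that data flow; the sensor/actuator interface, sampling rate, and data layout are assumptions, not the actual Keio system.

```python
# Minimal sketch of the "motion copy" record/replay idea.
# read_master() and drive_slave() are hypothetical interfaces to the
# handle's sensors and the tip's motor; the 1 kHz rate is an assumed value.
from dataclasses import dataclass
from typing import List

@dataclass
class Sample:
    t: float         # time stamp [s]
    position: float  # brush position along one axis [m]
    force: float     # force applied by the calligrapher [N]

def record(read_master, duration_s: float, rate_hz: float = 1000.0) -> List[Sample]:
    """Log position and force from the master handle while a person writes."""
    log, dt = [], 1.0 / rate_hz
    for i in range(int(duration_s * rate_hz)):
        pos, force = read_master()           # hypothetical sensor read
        log.append(Sample(i * dt, pos, force))
    return log

def replay(log: List[Sample], drive_slave) -> None:
    """Reproduce the stored motion and force on the brush-tip motor."""
    for s in log:
        drive_slave(s.position, s.force)     # hypothetical actuator command
```

Storing the force trace alongside the position trace is what separates this from ordinary motion capture, which records only movement.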

Unlike conventional motion capture systems, this one can record and reproduce not only movement but also the force applied to the brush and the sensation of touching something. Until now, passing on traditional skills has depended on intuition and experience; it is hoped that this new system will enable such skills to be learned more efficiently.

“What’s new is, there’s a motor attached to the brush, so while the person’s moving, the motion and force are recorded as digital data using the motor. What’s more, with this technology, the recorded motion and force can be reproduced anytime, anywhere using the motor.”

“We’ve succeeded in using the motor to record the movements of a veteran calligrapher, and to actually reproduce them. So, I think we’ve demonstrated that, to record and reproduce human skills, it’s necessary to record not just motions, but also how strongly those motions are made.”

Looking at the graph of position and brush pressure, the position of the master system and that of the slave system coincide, while the pressures show opposite waveforms. This indicates that the law of action and reaction is being artificially implemented between the master and slave systems.
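In bilateral-control terms, the graph reflects two simultaneous goals: the position difference between master and slave is driven to zero, while the sum of the two measured forces is driven to zero, i.e. the forces are kept equal and opposite. The following is a minimal sketch of such a control law, with assumed gains and a simplified acceleration-based form, not the group's actual controller.

```python
# Sketch of a bilateral control law realising "action and reaction" between
# master and slave. Gains and the acceleration-based form are assumptions.
KP, KD = 400.0, 40.0   # position/velocity tracking gains (assumed)
KF = 1.0               # force-reflection gain (assumed)

def bilateral_step(xm, vm, fm, xs, vs, fs):
    """One control step.

    xm, vm, fm: master position, velocity, measured operator force
    xs, vs, fs: slave position, velocity, measured brush-tip reaction force
    Returns acceleration commands (a_master, a_slave).
    """
    # Differential mode: drive the position error (xm - xs) to zero.
    a_diff = -KP * (xm - xs) - KD * (vm - vs)
    # Common mode: drive the force sum (fm + fs) to zero, i.e. fm = -fs.
    a_comm = -KF * (fm + fs)
    # Recombine the two modes into per-device acceleration commands.
    a_master = (a_comm + a_diff) / 2.0
    a_slave = (a_comm - a_diff) / 2.0
    return a_master, a_slave
```

Under this structure, whatever reaction force the tip receives from the paper is reflected back to the handle while the tip tracks the handle's motion, which is consistent with the opposite force waveforms and matching positions described above.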

“Currently, multimedia only uses audiovisual information. But we’d like to bring force, action, and motion into IT, by reproducing this kind of physical force via a network, or storing skills on a hard disk for downloading. So, with this new form of IT, you’d be able to access skill content.”




DigInfo TV is a Tokyo-based online video news platform dedicated to producing original coverage of cutting edge technology, research and products from Japan.





©2021 - ROBOTS Association