Robohub.org
 

Calligraphy robot uses a Motion Copy System to reproduce detailed brushwork


11 October 2012




A research group at Keio University, led by Seiichiro Katsura, has developed the Motion Copy System. The system identifies and stores detailed brush strokes based on movement information captured during calligraphy, enabling a robot to faithfully reproduce them.

This system stores calligraphy movements using a brush whose handle and tip are separate. The two parts are connected, with the handle acting as the master system and the tip as the slave system. Characters can be written by handling the device in the same way as an ordinary brush.

Unlike conventional motion-capture systems, this one can record and reproduce the force applied to the brush as well as the sensation of touching something. Until now, passing on traditional skills has depended on intuition and experience; it is hoped that the new system will enable such skills to be learned more efficiently.

“What’s new is that there’s a motor attached to the brush, so while the person is moving it, the motion and force are recorded as digital data through the motor. What’s more, with this technology, the recorded motion and force can be reproduced anytime, anywhere using the motor.”

“We’ve succeeded in using the motor to record the movements of a veteran calligrapher, and to actually reproduce them. So, I think we’ve demonstrated that, to record and reproduce human skills, it’s necessary to record not just motions, but also how strongly those motions are made.”
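To illustrate the idea in these quotes, here is a minimal, hypothetical sketch of how time-stamped motion and force samples might be logged from the motor during a demonstration and replayed later. The function names, file format, and sampling rate are assumptions for illustration only, not the research group's actual implementation.

```python
import json
import time
from dataclasses import dataclass, asdict

@dataclass
class BrushSample:
    """One time-stamped sample of brush state (hypothetical format)."""
    t: float         # seconds since the start of the demonstration
    position: float  # brush position from the motor encoder
    force: float     # force applied through the brush, estimated at the motor

def record_demonstration(read_position, read_force, duration_s=5.0, rate_hz=1000):
    """Log position and force at a fixed rate while the expert writes."""
    samples, dt, t0 = [], 1.0 / rate_hz, time.time()
    while (now := time.time() - t0) < duration_s:
        samples.append(BrushSample(t=now, position=read_position(), force=read_force()))
        time.sleep(dt)
    return samples

def save_skill(samples, path="calligraphy_skill.json"):
    """Store the recorded skill so it can be reproduced anytime, anywhere."""
    with open(path, "w") as f:
        json.dump([asdict(s) for s in samples], f)

def replay_skill(path, command_position, command_force):
    """Feed the stored motion and force back to the motor controller."""
    with open(path) as f:
        samples = [BrushSample(**d) for d in json.load(f)]
    start = time.time()
    for s in samples:
        while time.time() - start < s.t:   # wait until this sample's timestamp
            time.sleep(0.0005)
        command_position(s.position)       # reproduce the motion...
        command_force(s.force)             # ...and how strongly it was made
```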

Looking at the graph of position and brush pressure, the position of the master system tracks the motion of the slave system, while the pressure traces are mirror-image waveforms. This shows that the law of action and reaction is being artificially implemented between the master and slave systems.
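This behaviour, in which the master and slave positions track each other while the measured forces are equal and opposite, is characteristic of bilateral control. The sketch below shows one common formulation of such a control law, purely for illustration and not necessarily the controller used at Keio: the position channel drives the position error to zero, and the force channel drives the sum of the two forces to zero, which is exactly the action-reaction relation. The gain values are assumptions.

```python
# Minimal sketch of a position/force bilateral control law (illustrative only).
# x_m, x_s : master and slave positions; v_m, v_s : velocities;
# f_m, f_s : measured reaction forces on each side.
# A real implementation runs at kHz rates with proper force estimation.

Kp, Kd, Kf = 400.0, 40.0, 1.0  # assumed position, damping, and force gains

def bilateral_commands(x_m, v_m, f_m, x_s, v_s, f_s):
    # Position channel: push both sides so that x_m - x_s -> 0.
    pos_err, vel_err = x_m - x_s, v_m - v_s
    # Force channel: push both sides so that f_m + f_s -> 0 (action = -reaction).
    force_err = f_m + f_s
    u_m = -Kp * pos_err - Kd * vel_err - Kf * force_err   # master motor command
    u_s = +Kp * pos_err + Kd * vel_err - Kf * force_err   # slave motor command
    return u_m, u_s
```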

“Currently, multimedia only uses audiovisual information. But we’d like to bring force, action, and motion into IT, by reproducing this kind of physical force via a network, or storing skills on a hard disk for downloading. So, with this new form of IT, you’d be able to access skill content.”




DigInfo TV is a Tokyo-based online video news platform dedicated to producing original coverage of cutting-edge technology, research and products from Japan.




