Multi-viewpoint robotic camera system creates real ‘bullet time’ slow motion replays


by DigInfo TV
04 June 2013




This multi-viewpoint robotic camera system, under development by NHK, links the motion of eight sub-cameras to that of a single main camera, so that all the cameras track and film the same moving subject.
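
NHK has not published its control details, but the basic geometry of slaving several cameras to one aim point is straightforward: given an estimate of the subject's 3D position, each camera only needs the pan and tilt angles that point it at that position. The sketch below is a minimal illustration of this idea in Python; the coordinate convention, camera positions and target point are all assumed for the example.

```python
import numpy as np

def pan_tilt_to_target(camera_pos, target_pos):
    """Return (pan, tilt) angles in radians that aim a camera at a 3D target.

    Assumed coordinate convention: x = right, y = forward, z = up.
    Pan rotates about the vertical (z) axis; tilt is elevation above the horizon.
    """
    d = np.asarray(target_pos, dtype=float) - np.asarray(camera_pos, dtype=float)
    pan = np.arctan2(d[0], d[1])
    tilt = np.arctan2(d[2], np.hypot(d[0], d[1]))
    return pan, tilt

# Hypothetical example: eight sub-cameras all slaved to one target point
target = [2.0, 30.0, 1.5]                               # e.g. a player on the pitch (metres)
sub_cameras = [[-10.5 + 3.0 * i, 0.0, 4.0] for i in range(8)]
for pos in sub_cameras:
    pan, tilt = pan_tilt_to_target(pos, target)
    print(f"camera at {pos}: pan {np.degrees(pan):6.1f} deg, tilt {np.degrees(tilt):5.1f} deg")
```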

“Using this system, you can create the effect of stopping time, and moving the viewpoint all around the subject.”

“Previous methods used fixed cameras, so they could only capture subjects moving within a narrow, limited space. But this multi-viewpoint robot camera system can film fast-moving sports, or subjects spread across many locations in a large space.”

Each robot camera has two motors, for pan and tilt. The cameras also share lens data, so they can zoom in unison.
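
Sharing lens data lets every camera keep the subject at roughly the same size in frame even though each camera sits at a different distance. Under a simple pinhole-camera assumption, the required focal length scales with distance to the subject; the short sketch below illustrates that relationship (the focal lengths and distances are made up for the example).

```python
def matched_focal_length(master_focal_mm, master_distance_m, sub_distance_m):
    """Focal length a sub-camera needs so the subject appears the same size
    as in the master camera's frame (pinhole model: image size ~ focal / distance)."""
    return master_focal_mm * (sub_distance_m / master_distance_m)

# Hypothetical numbers: master camera 30 m from the subject, shooting at 100 mm
for distance in (25.0, 30.0, 40.0, 55.0):
    focal = matched_focal_length(100.0, 30.0, distance)
    print(f"sub-camera {distance:4.0f} m away -> zoom to {focal:5.1f} mm")
```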

“Pictures taken with robot cameras inevitably have discrepancies in direction control, so simply switching between them doesn’t give smooth pictures. To solve that problem, we’ve brought in a computer that redoes the direction control virtually: image processing re-orients each camera’s view toward the subject, making it possible to switch smoothly between the cameras.”
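
The report doesn’t detail NHK’s image processing, but one standard way to virtually re-aim a camera after capture is to warp each frame with a pure-rotation homography H = K·R·K⁻¹, where K is the camera’s intrinsic matrix and R a small corrective rotation. The sketch below, using OpenCV with assumed intrinsics, shows the general technique rather than NHK’s actual implementation.

```python
import numpy as np
import cv2

def virtually_rotate(frame, K, yaw_deg, pitch_deg):
    """Warp a frame as if the camera had been rotated by (yaw, pitch) degrees.

    For a pure rotation R, pixels map through the homography H = K @ R @ inv(K),
    so small aiming discrepancies can be corrected without moving the camera.
    """
    yaw, pitch = np.radians([yaw_deg, pitch_deg])
    Ry = np.array([[np.cos(yaw), 0.0, np.sin(yaw)],
                   [0.0, 1.0, 0.0],
                   [-np.sin(yaw), 0.0, np.cos(yaw)]])
    Rx = np.array([[1.0, 0.0, 0.0],
                   [0.0, np.cos(pitch), -np.sin(pitch)],
                   [0.0, np.sin(pitch), np.cos(pitch)]])
    H = K @ Ry @ Rx @ np.linalg.inv(K)
    height, width = frame.shape[:2]
    return cv2.warpPerspective(frame, H, (width, height))

# Assumed intrinsics for a 1080p camera; the frame here is just a placeholder
K = np.array([[1800.0, 0.0, 960.0],
              [0.0, 1800.0, 540.0],
              [0.0, 0.0, 1.0]])
frame = np.zeros((1080, 1920, 3), dtype=np.uint8)
aligned = virtually_rotate(frame, K, yaw_deg=1.5, pitch_deg=-0.5)
```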

“Pictures from this system can be sent out about one minute after filming is finished. First of all, we intend to use this for live sports broadcasting. We’d like to make it easy to understand what’s happening, by providing multi-viewpoint pictures instead of the current slow-motion replay.”

This multi-viewpoint robotic camera system can also serve as an image capture system for integral 3D TV, also under development by NHK. By generating integral 3D video from the multi-viewpoint footage, the system would make 3D video of sports events viewable on integral 3D TV.



DigInfo TV is a Tokyo-based online video news platform dedicated to producing original coverage of cutting-edge technology, research and products from Japan.




