Dear Robohub readers: We need your help. Robohub is growing. Robohub is now a community with over 70 contributors and more than 30,000 unique visitors each month. In order for us to continue covering the latest in robotics research, bring in-depth coverage of conferences worldwide, and showcase interviews with leading roboticists, we need your support.
Your donation will go towards expanding our coverage, paying the salaries of our dedicated staff, and maintaining the website.
Keep Robohub alive for another year by donating to our campaign today. Thank you!
The University of Colorado’s robotic plant-growth system is demonstrated at the Kennedy Space Center. Source: NASA.
Establishing a sustainable human presence in outer space will require securing supplies of air, water, energy, and food within a few thousand cubic feet surrounded by vacuum. What seems at first sight to be a problem of a remote, apocalyptic future reveals itself to be the grand challenges of our civilization in a nutshell. This article argues that space exploration can be one of the main drivers of a revolution in sustainable agriculture on Earth.
Autonomously flying robots — also called small-scale unmanned aerial vehicles (UAVs) — are increasingly used in civil and commercial applications for monitoring, surveillance, and disaster response. For some applications it is beneficial to employ a team of coordinated UAVs rather than a single UAV: multiple UAVs can cover a given area faster, or take photos from different perspectives at the same time. This emerging technology is still at an early stage, and substantial research and development efforts are consequently needed.
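As a toy illustration of why a coordinated team covers an area faster (the function name and the strip decomposition below are assumptions for illustration, not a method from any particular system), splitting a rectangular survey area into equal strips lets n UAVs sweep it in roughly 1/n of the single-UAV time:

```python
# Hypothetical sketch: partition a rectangular survey area into equal
# vertical strips, one per UAV, so the team covers it in parallel.

def strip_assignments(width, height, n_uavs):
    """Assign each UAV a vertical strip as (x_min, x_max, y_min, y_max)."""
    strip_w = width / n_uavs
    return [(i * strip_w, (i + 1) * strip_w, 0.0, height)
            for i in range(n_uavs)]

# Three UAVs splitting a 90 x 60 area: each gets a 30 x 60 strip.
strips = strip_assignments(width=90.0, height=60.0, n_uavs=3)
print(strips)
# → [(0.0, 30.0, 0.0, 60.0), (30.0, 60.0, 0.0, 60.0), (60.0, 90.0, 0.0, 60.0)]
```

Real coverage planners must also handle irregular areas, battery limits, and collision avoidance between neighboring strips, which is where much of the research effort lies.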
NanoSatisfi, a Silicon Valley-based cubesat startup, today received a $300,000 investment from Grishin Robotics, bringing its total seed funding to $1,750,000, not including its successful $106,330 Kickstarter campaign and CEO Peter Platzer’s personal investment. NanoSatisfi aims to provide affordable satellite access to everyone through its autonomous cubesats. The first launch dates have been booked for summer and fall 2013.
Over the past two decades, robotic planetary exploration has generated an incredible wealth of knowledge about our neighbors in the Solar System. We now realize that celestial bodies within our reach can provide resources such as water, minerals, and metals, essential for sustaining and supporting robotic and human exploration of the Solar System. It is only a matter of time before “living off the land” exploration enabled by in-situ resource utilization (ISRU) becomes a reality. The Solar System offers almost unlimited resources; the difficult part is accessing them. If the cost of mining and processing can be reduced, some of the minerals in high demand on Earth could in fact be brought back and sold for commercial gain.
Astrodrone is both a simulation game app for the Parrot AR.Drone and a scientific crowd-sourcing experiment that aims to improve landing, obstacle avoidance and docking capabilities in autonomous space probes.
As researchers at the European Space Agency’s Advanced Concepts Team, we wanted to study how visual cues could be used by robotic spacecraft to help them navigate unknown, extraterrestrial environments. One of our main research goals was to explore how robots can share knowledge about their environments and behaviors to speed up this visual learning process.
It’s cold out there beyond the blue. Full of radiation. Low on breathable air. Vacuous. It’s hard to keep machines and organic creatures functioning and/or alive. Space to-do lists are full of dangerous, fantastically boring, and super-precise stuff.
We technological mammals assess thusly: “Robots. Robots should be doing this.”
ESA is organizing the first robotic competition on a mock-up of the International Space Station (ISS). The competition is open to young people from ESA member states, who compete in three age groups between 11 and 19 years old. The regulations leave plenty of room for innovation and creative freedom; in practice, only safety requirements are imposed.
Application deadline: 15 March
Development phase: 4–12 April
Finalist down-selection phase: beginning of May
Competition event: mid-October
Depending on which time zone you’re in, either yesterday evening or early this morning a rocket-powered sky crane gently lowered the Curiosity rover to the surface of Mars, capping years of planning and months of anxious anticipation. Curiosity even had time to send back a few low-res images before the Mars Reconnaissance Orbiter (MRO) and Odyssey, either of which could relay its signal back to Earth, dropped below the horizon and contact was lost. Considering the complex sequence of steps involved, the narrow window of time within which each had to be performed, and the fact that all were performed autonomously by the descent system or by the rover itself, this successful landing is a major victory for the incorporation of robotic technologies into rocket science. Congratulations to all involved!
In modular space robotics, modules self-assemble in orbit to create larger satellites for specific missions. Modular satellites have the potential to reduce mission costs (small satellites are cheaper to launch), increase reliability, and enable on-orbit repair and refueling. Each module carries its own load of sensors, fuel, and attitude-control actuators (thrusters), so an assembly of modules has redundant sensing and actuation capabilities. By fusing sensor data, a modular satellite can follow its trajectory more precisely, and smart thruster activation can help save fuel.
The challenge is to figure out how to control such a self-assembled robot so as to minimize fuel consumption while balancing fuel distribution across modules and improving trajectory following. To this end, Toglia et al. propose a cooperative controller in which one of the modules, equipped with information about the configuration of all the other modules, is responsible for computing an optimal control scheme. An extended Kalman-Bucy filter is used for sensor fusion.
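The extended Kalman-Bucy filter itself is beyond the scope of this post, but the core idea of fusing redundant module sensors can be sketched with a simplified, discrete-time scalar analogue. Everything below (function names, noise values) is an illustrative assumption, not the authors’ implementation:

```python
# Simplified sketch of redundant-sensor fusion: two modules measure the
# same position with different noise levels; combining them by
# inverse-variance weighting yields an estimate better than either alone,
# which then feeds a scalar Kalman measurement update.

def fuse_measurements(z1, var1, z2, var2):
    """Inverse-variance fusion of two redundant sensor readings."""
    w1 = 1.0 / var1
    w2 = 1.0 / var2
    z_fused = (w1 * z1 + w2 * z2) / (w1 + w2)
    var_fused = 1.0 / (w1 + w2)   # always smaller than var1 and var2
    return z_fused, var_fused

def kalman_update(x_est, p_est, z, r):
    """Scalar Kalman measurement update of state x_est with variance p_est."""
    k = p_est / (p_est + r)           # Kalman gain
    x_new = x_est + k * (z - x_est)   # corrected state estimate
    p_new = (1.0 - k) * p_est         # reduced uncertainty
    return x_new, p_new

# Module A reads 10.2 m (noisy), module B reads 9.8 m (more precise).
z, var = fuse_measurements(10.2, 0.04, 9.8, 0.01)
x, p = kalman_update(x_est=10.0, p_est=0.5, z=z, r=var)
print(round(z, 2), round(x, 2))  # → 9.88 9.88
```

The fused variance (0.008 here) is lower than that of either individual sensor, which is precisely why assembled modules with redundant sensors can track a trajectory more tightly than any single module.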
The cooperative controller was compared to an independent controller in which each module follows its own trajectory while minimizing its own fuel usage and trajectory errors. Results from simulation and hardware experiments show that the cooperative controller can save significant amounts of fuel, up to 43% in one experiment, while making trajectory following more precise.
Hardware experiments were performed with two satellites using the MIT Field and Space Robotics Laboratory’s Free-Flying Space Robot Test Bed, shown below.
Welcome to the second part of our 50th episode special! To celebrate 50 episodes of Robots, we’re doing a review of some of the greatest advances in robotics during the last 50 years, and predictions on what we can hope to see in the next half century. In last week’s episode we covered embodied AI, robot toys, androids, underwater robots, education robots and brain-machine interfaces.
Finally, don’t forget to check out all the new features of our website, including episode browsing by topic, interviewee, and tag, and leaving comments under our blog posts or in the forum.
Jean-Christophe Zufferey is a researcher at the Laboratory of Intelligent Systems at the Swiss Federal Polytechnic in Lausanne, Switzerland, where he works on cutting-edge research in Micro Air Vehicles (MAVs). His latest advances have led him to create the startup SenseFly that specializes in small and safe autonomous flying systems for applications such as environmental monitoring and aerial photography.
Dan Kara is President of Robotics Trends and the Robotics Business Review, web portals and research firms specializing in robotics markets. He’ll be telling us about past products that have left their mark, and about the future developments likely to generate the most revenue.
Kristinn R. Thórisson
Kristinn Thórisson is Associate Professor at the School of Computer Science, Reykjavik University in Iceland. Active in the field of Artificial Intelligence for a couple of decades, Thórisson is pioneering new approaches, such as constructivist AI, that he hopes will lead to more adaptive and complex artificial systems.
In today’s episode we speak with Dr. Alvar Saenz-Otero from MIT, lead scientist of the SPHERES project, which aims to develop autonomous formation-flight and docking control algorithms for nano-satellites. We then dissect a well-known definition of a robot dating back to 1979.
Alvar Saenz-Otero is lead scientist of the SPHERES project at the MIT Space Systems Laboratory in the US. SPHERES (Synchronized Position Hold Engage and Reorient Experimental Satellites) fulfill all the normal requirements of a satellite in a small, basketball-sized package. This small size is what has allowed these robots to be tested in the lab, during parabolic flights, and even on board the International Space Station (ISS).
The research question is how to make these satellites work together by flying in formation and physically connecting, or docking. Such swarms of satellites could be used to create giant telescope mirrors in space with nanometer precision, or to assemble future space stations without the need for human spacewalks.
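The episode doesn’t detail the SPHERES control algorithms, but as a rough, hypothetical sketch of formation flight, a leader-follower scheme might assign each follower an evenly spaced slot on a circle around the leader (all names and the circular geometry below are illustrative assumptions):

```python
import math

# Hypothetical leader-follower formation: place n followers at evenly
# spaced angles on a circle of given radius around the leader's position.

def formation_targets(leader_pos, radius, n_followers):
    """Return (x, y) target positions for each follower."""
    lx, ly = leader_pos
    targets = []
    for i in range(n_followers):
        theta = 2.0 * math.pi * i / n_followers  # evenly spaced slots
        targets.append((lx + radius * math.cos(theta),
                        ly + radius * math.sin(theta)))
    return targets

# Four followers ring a leader at the origin, 1 m away:
# slots land at roughly (1,0), (0,1), (-1,0), (0,-1).
targets = formation_targets(leader_pos=(0.0, 0.0), radius=1.0, n_followers=4)
print([(round(x, 3), round(y, 3)) for x, y in targets])
```

A real docking controller would, of course, also manage relative velocities, thruster limits, and the final approach along a docking axis; this sketch only shows how geometric slots in a formation can be derived from a leader’s state.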
Saenz-Otero also describes, more generally, how you get your robot onto the ISS, his plans to motivate students about science, and his dream of large swarms in space.
What is a Robot?
This week we look at a traditional definition of a robot, coming straight from the Robot Institute of America. According to their 1979 definition, a robot is:
“A reprogrammable, multifunctional manipulator designed to move material, parts, tools, or specialized devices through various programmed motions for the performance of a variety of tasks”
What’s interesting about this definition is how far we’ve come in the last 30 years of robotics development. In 1979 a robot was simply a manipulator used to move parts through pre-programmed motions, which brings to mind industrial robots used in factories. Thirty years later, robots are no longer simply manipulators: they can propel themselves through their environment, understand their surroundings, and act according to their particular situation instead of simply enacting pre-programmed motions. This 30-year-old official definition no longer applies, so let’s try to figure out what robots mean to us today! Keep sending us your answers by email at firstname.lastname@example.org and let’s get closer to an all-encompassing definition of a robot for the 21st century.