The history of machine learning has largely been a story of increasing abstraction. At the dawn of ML, researchers spent considerable effort engineering features. As deep learning gained popularity, researchers shifted towards tuning the update rules and learning rates of their optimizers. Recent research in meta-learning has climbed one level of abstraction higher: many researchers now spend their days manually constructing task distributions from which good optimizers can be learned automatically. What might be the next rung on this ladder? In this post we introduce theory and algorithms for unsupervised meta-learning, where machine learning algorithms propose their own task distributions. Unsupervised meta-learning further reduces the amount of human supervision required to solve tasks, potentially inserting a new rung on this ladder of abstraction.
Looking at the Open Source COVID-19 Medical Supplies production tally of handcrafted masks and face shields, we’re trying to answer that question in our weekly discussions about ‘COVID-19, robots and us’. We talked to Rachel ‘McCrafty’ Sadd, who has been building systems and automation for COVID mask making as founder of the Project Mask Making and #distillmyheart projects in the SF Bay Area, as an artist, and as Executive Director of the Ace Monster Toys makerspace/studio. Rachel has been organizing volunteers and automating workflows to get 1,700 cloth masks hand sewn and distributed to people at risk before the end of April. “Where’s my f*king robot!” was the theme of her short presentation.
Health care workers are not the only unwilling frontline workers in essential services at increased risk of COVID-19. According to the Washington Post on April 12, “At least 41 grocery workers have died of the coronavirus and thousands more have tested positive in recent weeks”. At the same time, grocery stores are seeing a surge in demand and are currently hiring. The food industry is also seeing increasing adoption of robots, both in the back-end supply chain and in the food retail and food service sectors.
Robots could have a role to play in COVID-19, whether it’s automating laboratory research, helping with logistics, disinfecting hospitals, supporting education, or allowing carers, colleagues, or loved ones to connect using telepresence. Yet many of these solutions are still in development or early deployment. The hope is that accelerating their translation into practice could make a difference.
By Tim Sullivan, Spaulding Rehabilitation Network Communications
Many of us aren’t spending much time outside lately, but there are still many obstacles for us to navigate as we walk around: the edge of the coffee table, small children, the family dog. How do our brains adjust to changes in our walking strides? Researchers at the Wyss Institute for Biologically Inspired Engineering at Harvard University and the Motion Analysis Laboratory at Spaulding Rehabilitation Hospital used robots to try to answer that question, and discovered that mechanisms in both the cerebellum and the spinal cord determine how the nervous system responds to robot-induced changes in step length. The new study is published in the latest issue of Scientific Reports, and points the way toward improving robot-based physical rehabilitation programs for patients.
HRI2020 already kicked off with workshops and the Industry Talks Session on April 3; however, the first release of videos has only just gone online, with a welcome from General Chairs Tony Belpaeme (ID Lab, University of Ghent) and James Young (University of Manitoba).
The YouTube Originals series “The Age of A.I.” was released in December 2019. If you haven’t already seen it, now could be a good time to catch up: with much of the world in enforced or voluntary isolation, many of us will be stuck at home with hours to fill. Sit back and marvel at the many incredible, and often heart-warming, applications of AI.
The 15th Annual ACM/IEEE International Conference on Human Robot Interaction – HRI 2020 – was meant to take place in the city of Cambridge, UK. Instead, it launches online today. You can follow the latest happenings on Twitter and YouTube. Check here for a list of all the papers.
Whether it’s a dog chasing after a ball, or a monkey swinging through the trees, animals can effortlessly perform an incredibly rich repertoire of agile locomotion skills. But designing controllers that enable legged robots to replicate these agile behaviors can be a very challenging task. The superior agility seen in animals, as compared to robots, might lead one to wonder: can we create more agile robotic controllers with less effort by directly imitating animals?
A simulation system invented at MIT to train driverless cars creates a photorealistic world with infinite steering possibilities, helping the cars learn to navigate a host of worst-case scenarios before cruising down real streets.
Researchers from the University of Zurich and NCCR Robotics have demonstrated a flying robot that can detect and avoid fast-moving objects – a step towards drones that can fly faster in harsh environments, accomplishing more in less time.