Instead of worrying so much about robots taking away jobs, maybe we should worry more about wages being too low for robots to even get a chance. Seasonal harvesting of agricultural products, particularly fruits and vegetables, depends on a diminishing pool of willing human workers.
How do you stop a robot from hurting people? Many existing robots, such as those assembling cars in factories, shut down immediately when a human comes near. But this quick fix wouldn’t work for something like a self-driving car that might have to move to avoid a collision, or a care robot that might need to catch an old person if they fall. With robots set to become our servants, companions and co-workers, we need to deal with the increasingly complex situations this will create and the ethical and safety questions this will raise.
In this episode, MeiXing Dong conducts interviews at the 2017 Midwest Speech and Language Days workshop in Chicago. She talks with Michael White of Ohio State University about question interpretation in a dialogue system; Dmitriy Dligach of Loyola University Chicago about extracting patient timelines from doctor’s notes; and Denis Newman-Griffiths of Ohio State University about connecting words and phrases to relevant medical topics.
In the 1995 film “Batman Forever,” the Riddler used 3-D television to secretly access viewers’ most personal thoughts in his hunt for Batman’s true identity. By 2011, the metrics company Nielsen had acquired Neurofocus and had created a “consumer neuroscience” division that uses integrated conscious and unconscious data to track customer decision-making habits. What was once a nefarious scheme in a Hollywood blockbuster seems poised to become a reality.
The device named “Spark” flew high above the man on stage, who waved his hands in the direction of the flying object. In a demonstration of DJI’s newest drone, the audience marveled at the Coke-can-sized device’s most compelling feature: gesture controls. Instead of a traditional remote control, this flying selfie machine follows hand movements across the sky. Gestures are the most innate language of mammals, and including robots in our primal movements means we have reached a new milestone of co-existence.
Engineers and researchers are already speculating about the next phase of UI development, especially for robotics control. So far, the leading candidate is gesture-based control—the use of physical gestures to relay commands.
The demands posed by a rapidly ageing global population are leading manufacturers of robots to develop technology for providing care and rehabilitation for elderly and impaired people in their own homes.
Robots are the technology of the future. But the current legal system is incapable of handling them. This generic statement is often the premise for considerations about the possibility of awarding rights (and liabilities) to these machines at some less-than-clearly identified point in time. Discussing the adequacy of existing regulation in accommodating new technologies is certainly necessary, but the ontological approach is incorrect. Andrea Bertolini explains.
In this episode, Audrow Nash interviews Bradley Knox, founder of bots_alive. Knox speaks about an add-on to a Hexbug, a six-legged robotic toy, that makes the bot behave more like a character. They discuss the novel way Knox uses machine learning to create a sense of character. They also discuss the limitations of technology in emulating living creatures, and how the bots_alive robot was built within these limitations.
Automated cars are hurtling towards us at breakneck speed, with all-electric Teslas already running limited autopilot systems on roads worldwide and Google trialling its own autonomous pod cars. However, before we can reply to emails while being driven to work, we need a foolproof way to determine when drivers can safely take control and when it should be left to the car.
For robots to do what we want, they need to understand us. Too often, this means having to meet them halfway: teaching them the intricacies of human language, for example, or giving them explicit commands for very specific tasks. But what if we could develop robots that were a more natural extension of us and that could actually do whatever we are thinking?