As AI surpasses human abilities in Go and poker – two decades after Deep Blue trounced chess grandmaster Garry Kasparov – it is seeping into our lives in ever more profound ways. It shapes how we search the web, how we receive medical advice and whether our banks grant us credit.
Instead of worrying so much about robots taking away jobs, maybe we should worry more about wages being too low for robots to even get a chance. Seasonal harvesting of agricultural products, particularly fruits and vegetables, depends on human labor drawn from a diminishing pool of willing workers.
We are only in the earliest stages of so-called algorithmic regulation – intelligent machines deploying big data, machine learning and artificial intelligence (AI) to regulate human behaviour and enforce laws – but it already has profound implications for the relationship between private citizens and the state.
China has recently announced its long-term goal of becoming the world leader in AI by 2030. It plans to grow its AI industry to over $22 billion by 2020, $59 billion by 2025 and $150 billion by 2030. China followed the same kind of long-term strategic planning for robotics – building a domestic industry and transforming the country from a low-cost labor source into a high-tech manufacturing resource – and it's working.
How do you stop a robot from hurting people? Many existing robots, such as those assembling cars in factories, shut down immediately when a human comes near. But this quick fix wouldn’t work for something like a self-driving car that might have to move to avoid a collision, or a care robot that might need to catch an old person if they fall. With robots set to become our servants, companions and co-workers, we need to deal with the increasingly complex situations this will create and the ethical and safety questions this will raise.
Join us at the 2017 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2017) for a full-day workshop that will bring together international stakeholders in robotics to examine best practices for accelerating robotics innovation through strategic policy frameworks.
I’m examining the perception of autonomous cars using hypothetical scenarios. Each scenario is accompanied by an image – rendered in grey tones with nondescript, human-like figures – to help illustrate the scene, along with the option to hear the question read aloud.
If you live in the UK, you can take this survey and help contribute to my research!
In the 1995 film “Batman Forever,” the Riddler used 3-D television to secretly access viewers’ most personal thoughts in his hunt for Batman’s true identity. By 2011, the metrics company Nielsen had acquired Neurofocus and had created a “consumer neuroscience” division that uses integrated conscious and unconscious data to track customer decision-making habits. What was once a nefarious scheme in a Hollywood blockbuster seems poised to become a reality.
WeRobotics Global has become a premier forum for social good robotics. The feedback featured below was unsolicited. On June 1, 2017, we convened our first annual global event, bringing together 34 organizations in New York City (full list below) to shape the global agenda and the future use of robotics in the social good sector. WeRobotics Global was kindly hosted by the Rockefeller Foundation, the first donor to support our efforts. The Foundation opened the event with welcome remarks before turning it over to Patrick Meier of WeRobotics, who provided an overview of the organization and the big-picture context for social sector robotics.
The world’s brightest minds in Artificial Intelligence (AI) and humanitarian action will meet with industry leaders and academia at the AI for Good Global Summit, 7-9 June 2017, to discuss how AI can assist global efforts to address poverty, hunger, education, healthcare and the protection of our environment. In parallel, the event will explore ways to ensure the safe, ethical development of AI and to protect against unintended consequences of AI advances.
The drone named “Spark” flew high above the man on stage, tracking his waving hands. In a demonstration of DJI’s newest drone, the audience marveled at the Coke-can-sized device’s most compelling feature: gesture control. Instead of a traditional remote control, this flying selfie machine follows hand movements across the sky. Gestures are the most innate language of mammals, and bringing robots into our primal movements marks a new milestone of co-existence.