Talking Machines: History of machine learning, w. Geoffrey Hinton, Yoshua Bengio, Yann LeCun
In episode five of Talking Machines, we hear the first part of our conversation with Geoffrey Hinton (Google and University of Toronto), Yoshua Bengio (University of Montreal) and Yann LeCun (Facebook and NYU). Ryan introduces us to the ideas behind tensor factorization methods for learning latent variable models (which is both a tongue twister and one of the newer tools in ML). To find out more on the topic, take a look at the work of Daniel Hsu, Animashree Anandkumar and Sham M. Kakade. Plus, we take a listener question about just where statistics stops and machine learning begins.
Robohub is an online platform that brings together leading communicators in robotics research, start-ups, business, and education from around the world. Learn more about us here. If you liked this article, you may also be interested in:
- Artificial General Intelligence that plays Atari video games: How did DeepMind do it?
- Inside DeepMind
- Google’s robot and artificial intelligence acquisitions are anything but scary
- Google’s DeepMind acquisition in reinforcement learning
- Why robots will not be smarter than humans by 2029
See all the latest robotics news on Robohub, or sign up for our weekly newsletter.