“None of the AI techniques we have can build representations of the world, whether through structure or through learning, that are anywhere near what we observe in animals and humans,” said Yann LeCun, a computer scientist at NYU and director of Facebook Artificial Intelligence Research.
LeCun helped pioneer deep learning, the AI field that has enabled tech giants to automate popular services such as filtering friends’ faces on Facebook or translating between Chinese and English with Google Translate.
Deep learning algorithms figured out how to perform all those tasks without the AI equivalent of the innate cognitive machinery that humans and animals have.
Still, LeCun believes that AI can make progress toward that kind of general intelligence through unsupervised deep learning, a recent development that removes much of the need for humans to supply the hand-labeled data that machines learn from.
In his view, AI could benefit greatly from a single learning principle, or a collection of such principles, that would operate with or without built-in structure modeled on innate cognitive machinery.
The debate over whether AI learning will ultimately prove to be more nature or more nurture is far from settled.
If unsupervised learning algorithms eventually turn out to require structure akin to innate cognitive representations of objects, sets, places, and so forth, Marcus could claim victory.