Wednesday, March 27, 2019

Turing Award 2018: Nobel Prize of computing given to ‘godfathers of AI’ - The Verge

The 2018 Turing Award, known as the “Nobel Prize of computing,” has been given to a trio of researchers who laid the foundations for the current boom in artificial intelligence.

Yoshua Bengio, Geoffrey Hinton, and Yann LeCun — sometimes called the ‘godfathers of AI’ — have been recognized with the $1 million annual prize for their work developing the AI subfield of deep learning. The techniques the trio developed in the 1990s and 2000s enabled huge breakthroughs in tasks like computer vision and speech recognition. Their work underpins the current proliferation of AI technologies, from self-driving cars to automated medical diagnoses.

In fact, you probably interacted with the descendants of Bengio, Hinton, and LeCun’s algorithms today — whether that was the facial recognition system that unlocked your phone, or the AI language model that suggested what to write in your last email.

All three have since taken up prominent places in the AI research ecosystem, straddling academia and industry. Hinton splits his time between Google and the University of Toronto; Bengio is a professor at the University of Montreal and started an AI company called Element AI; while LeCun is Facebook’s chief AI scientist and a professor at NYU.

“It’s a great honor,” LeCun told The Verge. “As good as it gets in computer science. It’s an even better feeling that it’s shared with my friends Yoshua and Geoff.”

Jeff Dean, Google’s head of AI, praised the trio’s achievements. “Deep neural networks are responsible for some of the greatest advances in modern computer science,” said Dean in a statement. “At the heart of this progress are fundamental techniques developed by this year’s Turing Award winners, Yoshua Bengio, Geoff Hinton, and Yann LeCun.”

The trio’s achievements are particularly notable as they kept the faith in artificial intelligence at a time when the technology’s prospects were dismal.

AI is well-known for its cycles of boom and bust, and the issue of hype is as old as the field itself. When research fails to meet inflated expectations, it creates a freeze in funding and interest known as an “AI winter.” It was at the tail end of one such winter in the late 1980s that Bengio, Hinton, and LeCun began exchanging ideas and working on related problems. These included neural networks — computer programs made from connected digital neurons that have become a key building block for modern AI.

“There was a dark period between the mid-90s and early-to-mid-2000s when it was impossible to publish research on neural nets, because the community had lost interest in it,” says LeCun. “In fact, it had a bad rep. It was a bit taboo.”

The trio decided they needed to rekindle interest, and secured funding from the Canadian government to sponsor a loose hub of interrelated research. “We organized regular meetings, regular workshops, and summer schools for our students,” says LeCun. “That created a small community that [...] around 2012, 2013 really exploded.”

During this period, the three showed that neural nets could achieve strong results on tasks like character recognition. But the rest of the research world did not pay attention until 2012, when a team led by Hinton took on a well-known AI benchmark called ImageNet. Researchers had until then delivered only incremental improvements on this object recognition challenge, but Hinton and his students smashed the next-best algorithm by more than 40 percent with the help of neural networks.

“The difference there was so great that a lot of people, you could see a big switch in their head going ‘clunk,’” says LeCun. “Now they were convinced.”

[Photo: Facebook’s Prineville Data Center]

Cheap processing power from GPUs (originally designed for gaming) and an abundance of digital data (given off by the internet the same way a car gives off fumes) offered fuel for these little cognitive engines. And since 2012, the basic techniques that Bengio, Hinton, and LeCun pioneered, including backpropagation and convolutional neural networks, have become ubiquitous in AI, and, by extension, in technology as a whole.
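For readers curious what “backpropagation” means in practice, the sketch below is a toy illustration, not code from any of the laureates’ systems: a tiny two-layer network (sizes and numbers are made up for the example) learns the XOR function by passing its prediction error backwards through its layers and nudging each weight in the direction that reduces it.

```python
# Toy backpropagation sketch: a two-layer network learning XOR.
# Illustrative only; network size, learning rate, and data are arbitrary choices.
import numpy as np

rng = np.random.default_rng(0)

# Training data: the XOR truth table.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Randomly initialised weights: 2 inputs -> 8 hidden units -> 1 output.
W1 = rng.normal(size=(2, 8))
b1 = np.zeros((1, 8))
W2 = rng.normal(size=(8, 1))
b2 = np.zeros((1, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for step in range(10000):
    # Forward pass: compute the network's prediction.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # Backward pass: propagate the error gradient from the output layer back.
    d_out = (out - y) * out * (1 - out)      # gradient at the output units
    d_h = (d_out @ W2.T) * h * (1 - h)       # gradient at the hidden units

    # Gradient-descent updates for each weight and bias.
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0, keepdims=True)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0, keepdims=True)

print(np.round(out, 2))  # should approach [[0], [1], [1], [0]]
```

The same idea — compute an error, push its gradient back through every layer, adjust the weights — scales up to the image and speech systems the article describes, just with far more layers, data, and compute.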

LeCun says he is optimistic about the prospects of artificial intelligence, but he’s also clear that much more work needs to be done before the field lives up to its promise. Current AI systems need lots of data to understand the world, can be easily tricked, and are only good at specific tasks. “We just don’t have machines with common sense,” says LeCun.

If the field is to continue on its upward trajectory, new methods will need to be discovered that are as foundational as those developed by the godfathers of AI.

“Whether we’ll be able to use new methods to create human-level intelligence, well, there’s probably another 50 mountains to climb, including ones we can’t even see yet,” says LeCun. “We’ve only climbed the first mountain. Maybe the second.”



https://www.theverge.com/2019/3/27/18280665/ai-godfathers-turing-award-2018-yoshua-bengio-geoffrey-hinton-yann-lecun

