Our realistic predictions of what next year will have in store for AI development
Artificial intelligence is a vast field with many unknowns, but it’s not hard to predict a few things that will, or should, happen in 2019 in the part of it that is deep learning.
We’ve gotten sloppy in our language. It’s convenient to use AI as shorthand for deep learning — and it gets good hits in headlines. But these days general AI — machines learning on their own like curious humans browsing in a bookstore — is still more science fiction than science.
What’s spreading like wildfire through the Internet these days are deep neural networks, a special case of AI based on processes typically initiated by people. The ability of deep-learning techniques to recognize patterns in images, speech and other areas — often faster than people can — has opened a door to a whole new direction in computing. Where this goes long term is anyone’s guess.
What’s clear is that over the last year or two lots of people have boarded this train, wherever it’s bound. For what it’s worth, it’s not too hard to see a handful of the next few stops this train will likely make.
1. Accelerators will get traction
As we reported in September, at least four of the new accelerators for training deep-learning neural networks are now sampling. Web giants have been hungry for these chips for some time. As Baidu researcher Greg Diamos told us in late 2016, the job of training machine-learning models “is limited by compute; if we had faster processors, we’d run bigger models.”
So, it’s no big stretch of the imagination to expect in 2019 some of the top data center operators will start buying these chips in volume. It’s not realistic to expect to see the crowded field of startups here winnowed out in the coming year, but we will see some early winners crowned with sockets and real revenues.
2. Valuations will get scrutinized
Some of the startups getting traction in deep-learning accelerators are also getting huge cash infusions. I predict this autumn’s spending spree will cool in 2019 as investors start to sharpen their pencils over exactly how much ROI they will see and when they will see it.
The deep-learning boom already has attracted tens if not hundreds of millions of dollars around an expanding school of about 50 startups. In the last few weeks, there was a new burst of holiday spending.
Habana Labs closed a $75 million round in November, bringing its total raised to $120 million. Wave Computing topped it with an $86 million round and a total of $200 million to date. Its flush principals used some of the money to buy processor veteran MIPS and announced plans to make its cores open source.
But Graphcore, hot off news that Dell designed a system with its chips, raised a whopping $200 million Series D, bringing its total to $312 million to date. There may be more bursts of irrational exuberance in deep learning, but there certainly will be many hard and soft landings as business managers start plugging real revenue numbers into their spreadsheets.
3. Inference will get benchmarked
Speaking of numbers, the initial MLPerf benchmarks for training deep-learning networks will get siblings in 2019. The group aims to release a suite of benchmarks for inference jobs covering both cloud-based and embedded systems.
I can’t claim that this is really a prediction — organizers of MLPerf told me that’s their plan. So, I’ll make the prediction that all the enthusiasm around training will shift to the broader market for inference silicon in 2019.
4. Chip vendors will embrace benchmarks
This is not really a prediction either, it’s more of a prescription. Chip vendors need to embrace the emerging benchmarks for deep learning. Their hungry customers and generous investors should demand it, so this emerging market gets some much-needed critical analysis and guidance — we can’t live on hype forever!
So far, only Google, Intel and Nvidia have published results, on a handful of carefully selected systems, using the early version 0.5 training benchmarks MLPerf released recently. Many more companies need to publish more results on a wider variety of configurations and workloads, so this sector can see where it stands and calibrate where it needs to go.
5. AI software platforms will get overgrown
This may have happened already. In recent weeks I have been getting almost daily pitches for AI software platforms of various sorts. I’m highly skeptical about the value any of these products bring, given the ease of developing an application and the incredible market pressure to call it an AI platform.
For the next several years this jungle will thicken with an increasingly exotic array of sub-species. End users and investors should sharpen their machetes.
6. Deep learning will bump into some limits
Arguably this also is happening already, but no one is connecting the dots. For example, after carefully curating a Bach playlist on Pandora over the holidays, I clicked the button for suggestions of other tracks the app could add. The recommendation engine came back with all the tracks I had just passed over.
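This kind of miss has a simple mechanical explanation: a recommender that ranks purely by similarity to what you just liked will, unless it also consults your explicit skips, surface exactly the tracks you rejected, because those are the most similar items in the catalog. The sketch below is hypothetical (it is not Pandora's actual system); the track names and feature vectors are invented for illustration.

```python
# Hypothetical sketch of the failure mode: a similarity-based recommender
# that ranks by feature similarity but, in naive mode, never consults the
# user's explicit skips. All catalog data here is invented.

def cosine(a, b):
    """Cosine similarity between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = sum(x * x for x in a) ** 0.5
    nb = sum(x * x for x in b) ** 0.5
    return dot / (na * nb)

# Tracks with made-up audio-feature vectors; the two skipped Bach works
# are deliberately the most similar to the liked one.
catalog = {
    "Goldberg Variations": [0.90, 0.80, 0.10],
    "Brandenburg No. 3":   [0.85, 0.75, 0.20],  # user skipped this
    "Mass in B minor":     [0.80, 0.90, 0.15],  # user skipped this
    "Synth-pop single":    [0.10, 0.20, 0.95],
}

liked = ["Goldberg Variations"]
skipped = ["Brandenburg No. 3", "Mass in B minor"]

def recommend(liked, skipped, respect_skips):
    """Rank unseen tracks by similarity to the liked track."""
    profile = catalog[liked[0]]
    candidates = [t for t in catalog if t not in liked]
    if respect_skips:
        candidates = [t for t in candidates if t not in skipped]
    return sorted(candidates,
                  key=lambda t: cosine(profile, catalog[t]),
                  reverse=True)

# Naive engine: the just-skipped tracks come back at the top of the list.
print(recommend(liked, skipped, respect_skips=False))
# Applying negative feedback filters them out.
print(recommend(liked, skipped, respect_skips=True))
```

The fix is trivial in the sketch, which is rather the point: the failure is not a limit of pattern matching itself but of deploying it without wiring in the negative feedback the user already gave.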
Pandora is not the only Web wunderkind with a feature that falls short of amazing. I expect a few storms of consumer backlash in 2019. Let’s hope programmers and marketers exhibit enough self-restraint that they don’t inspire headlines about “artificial stupidity.” There is wonderful core technology here that needs to be applied with human good sense.
7. Interest will emerge in general AI
The enthusiasm and money flowing into deep learning also fuels interest in research on general AI. I am no expert on leaders in this field, but I note Numenta, founded by Palm Pilot designer Jeff Hawkins, reported progress this year developing a general theory of how the neocortex works.
No one really knows how the human brain does the incredible things it does on ~35W and a baloney sandwich. No one can even explain why deep learning gets its great results matching patterns even though it’s a very narrow, human-controlled foreshadowing of AI.
I predict in 2019 more smart people will start to wonder about these bigger questions. I hope it will lead to some interesting discussions, and maybe even a few important advances no one predicted.
— Rick Merritt, Silicon Valley Bureau Chief, EE Times