How to recognize AI snake oil
Getting to the insight behind the hype
|Robert Osazuwa Ness||Nov 23, 2019|
AltDeep is a newsletter focused on microtrend-spotting in data and decision science, machine learning, and AI.
Know an engineer, research scientist, AI product manager, or entrepreneur in the AI space? Buy them a gift subscription. They’ll owe you one.
In this newsletter, I try to focus on trends in AI. Spotting trends in research is the easy part. Spotting trends in industry, in the start-up space, and in the labor market for the engineers who build AI and the executives who run AI companies is trickier.
Clever people can generally develop enough intuition about the technical fundamentals of an engineering domain to form useful mental models that predict the opportunities and threats new technologies bring.
But this is noticeably hard in AI since there is so much hype. What’s worse is that practitioners themselves often believe the hype.
It’s hard to write something that doesn’t merely whine about hype but also offers something useful. This week I found two such gems.
Ear to the Ground
Worrisome trends in AI startups, self-driving cars not as imminent as you think
Filip Piekniewski writes a humorous and cynical periodic review of AI. His most recent installment is well worth a read. Some of the highlights concern self-driving cars:
I keep my prediction that there will be exactly zero fully autonomous Teslas in 2020, and most likely in 2021, 2022 and at least 2023.
Success in self-driving cars depends not on handling normal conditions but on handling unexpected edge cases, which continue to throw these vehicles off.
One startup is trying to solve automated driving with deep reinforcement learning, which trains driving agents in simulation. But a simulation will fail to capture edge cases, because edge cases are by definition unexpected and therefore not built into the simulation.
He also has some sobering news for employees of AI startups (like myself).
There is a trend of AI startups getting acquired rather than going public. This tends to happen because things are not going well enough to IPO, which is where you get the highest valuation. This is bad for employees, because startup employees sacrifice higher-paying jobs at established companies for options to purchase common stock. If the acquisition valuation is too low, there may be nothing left for common stock after the preferred stock gets paid off. And if you have already exercised your options, you lose the amount you spent on the exercise price.
AI update, late 2019 - Wizards of Oz — Piekniewski's blog
One of my favorite AI-trend-flavored posts from Piekniewski is one that frames the AI community’s current bias toward extremely computationally expensive approaches in the context of Silicon Valley’s historical preference for computationally intensive business models.
But the Valley bought it and created the biggest AI summer party ever. They bought it without a blink. And the R&D centers, non-profit labs and startups started swelling with deep learning scientists pumped fresh out of college, often without any industrial experience whatsoever. Startups bloomed, promising all sorts of wonders in the space of robotics, autonomous vehicles, autonomous drones etc. And the solution to all the problems was supposed to be deep learning - simply deeper, trained on more data and bigger GPUs. It was supposed to just magically work, just needed more data and more compute. The party was on.
A brief story of Silicon Valley's affair with AI — Piekniewski's blog
How to recognize AI snake oil
Princeton professor Arvind Narayanan provides an excellent primer on what works well in AI, what is improving, and what is “fundamentally dubious”. It is summarized in this slide:
His placement of prediction tasks in the “fundamentally dubious” category is not mere opinion: he points to empirical studies showing repeated failures, in controlled settings, to perform well on these tasks. Despite this, he shows that companies selling this snake oil are managing to raise funds.
Relatedly, this face-palmy post trended on Reddit last week…