I suspect that less than 10 years from now, all of the DL training/architecture tricks that came from the arXiv firehose over 2015-2019 will have been entirely superseded by automated search techniques. The future: no alchemy, just clean APIs, and quite a bit of compute.

6:44 PM · Jan 7, 2019

High-level APIs, I should add. In the future, doing tensor-level manipulations will feel like writing assembly.
Replying to @fchollet
There is an urgent need for model management tools that can store, query, and compare models and experiments. I believe a GitHub-like repository for pretrained models could greatly advance automated architecture search.
You mean TF Hub?
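The store/query/compare idea above can be sketched as a minimal in-memory registry. This is purely illustrative (it is not TF Hub's API, and all names here are made up); a real tool would persist entries and track artifacts, but the interface shape is the point:

```python
from dataclasses import dataclass, field

@dataclass
class ModelEntry:
    """One stored experiment: a name, its metrics, and its hyperparameters."""
    name: str
    metrics: dict = field(default_factory=dict)
    params: dict = field(default_factory=dict)

class ModelRegistry:
    """Toy registry supporting the three operations from the tweet:
    store, query, and compare."""

    def __init__(self):
        self._entries = {}

    def store(self, entry):
        # Last write wins for a given model name.
        self._entries[entry.name] = entry

    def query(self, **filters):
        """Return entries whose hyperparameters match every filter."""
        return [e for e in self._entries.values()
                if all(e.params.get(k) == v for k, v in filters.items())]

    def compare(self, name_a, name_b, metric):
        """Return the name of the model scoring higher on `metric`."""
        a, b = self._entries[name_a], self._entries[name_b]
        return name_a if a.metrics[metric] >= b.metrics[metric] else name_b
```

For example, `registry.query(depth=50)` would find all stored models trained at that depth, and `registry.compare("a", "b", "val_acc")` picks the better of two runs; an architecture-search loop could use exactly these calls to warm-start from prior results.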
Replying to @fchollet
We will put all the magic into automated search techniques. I can imagine that in the near future there will be a neural automated search, trained on the results of all arXiv papers.
Replying to @fchollet
Searching for the right architecture for a problem is nice, but a neural model that grows its own architecture as part of the learning process is the future as I see it.
That’s actually exactly what I’m working on!
Thanks! Our Auto-Keras project is still at a primitive stage. We are heading in the same direction. 😊
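The "grow the architecture during learning" idea from this exchange can be sketched with a toy NumPy regression net that adds a hidden unit whenever its training loss plateaus. This is one possible interpretation for illustration only, not Auto-Keras's actual method; the growth trigger, sizes, and learning rate are all arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(0)

class GrowingNet:
    """One-hidden-layer tanh regression net that can widen itself."""

    def __init__(self, n_in, n_hidden=2, lr=0.05):
        self.lr = lr
        self.W1 = rng.normal(0, 0.5, (n_in, n_hidden))
        self.W2 = rng.normal(0, 0.5, (n_hidden, 1))

    def forward(self, X):
        self.h = np.tanh(X @ self.W1)
        return self.h @ self.W2

    def step(self, X, y):
        """One gradient-descent step on mean squared error; returns the loss."""
        err = self.forward(X) - y
        gW2 = self.h.T @ err / len(X)
        gh = err @ self.W2.T * (1 - self.h ** 2)   # backprop through tanh
        gW1 = X.T @ gh / len(X)
        self.W1 -= self.lr * gW1
        self.W2 -= self.lr * gW2
        return float(np.mean(err ** 2))

    def grow(self):
        """Widen the hidden layer by one unit with small random weights,
        leaving the existing function almost unchanged."""
        n_in = self.W1.shape[0]
        self.W1 = np.hstack([self.W1, rng.normal(0, 0.1, (n_in, 1))])
        self.W2 = np.vstack([self.W2, rng.normal(0, 0.1, (1, 1))])

def train(net, X, y, epochs=2000, patience=50, tol=1e-4):
    """Train, and grow the architecture whenever the loss stops improving."""
    best, stale = np.inf, 0
    for _ in range(epochs):
        loss = net.step(X, y)
        if loss < best - tol:
            best, stale = loss, 0
        else:
            stale += 1
        if stale >= patience:   # plateau detected: add capacity and continue
            net.grow()
            stale = 0
    return loss
```

Initializing the new unit's weights near zero is the key design choice: growth then barely perturbs the current fit, so the model keeps learning from where it left off instead of restarting the search from scratch.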
Replying to @fchollet
Until then, expertise in mining arXiv is a competitive advantage.