After many hours of retraining my brain to operate in this "priming" approach, I also now have a sick GPT-3 demo: English to LaTeX equations! I'm simultaneously impressed by its coherence and amused by its brittleness -- watch me test the fundamental theorem of calculus. cc @gdb

7:08 AM · Jul 19, 2020

Replying to @sh_reya @gdb
Now do a complete CS paper. :)
Replying to @sh_reya @gdb
How big is the text2latex dataset you finetuned on? Or is this few-shot?
No fine-tuning; few-shot extrapolation
Replying to @sh_reya @ShreyaR
What's really bothering me is how @gdb is not GDB.
Replying to @sh_reya @gdb
What were the biggest failures?
Replying to @sh_reya @gdb
@Wolfram_Alpha already does a decent job of text2latex? Or do they have a different backend? 🤔
Replying to @sh_reya @gdb
Very cool.
Replying to @sh_reya @gdb
Can you share the context string plz?
Replying to @sh_reya @gdb
Can you share details about the priming process? It's a mystery for those of us who have yet to receive invites.
From the paper arxiv.org/abs/2005.14165, I would guess that "priming" describes the few-shot approach. Please correct me if I am wrong.
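The "priming" approach discussed in the thread can be sketched as assembling a prompt from a few worked English-to-LaTeX pairs followed by the new query, so the model continues the pattern. This is a minimal, hypothetical sketch of prompt construction only; the example pairs and formatting are assumptions, not the context string from the demo.

```python
# Illustrative few-shot (priming) prompt for English -> LaTeX.
# The example pairs below are made up for demonstration.
EXAMPLES = [
    ("the integral of x squared from 0 to 1",
     r"\int_0^1 x^2 \, dx"),
    ("the sum of one over n squared from n equals 1 to infinity",
     r"\sum_{n=1}^{\infty} \frac{1}{n^2}"),
]

def build_prompt(query: str) -> str:
    """Assemble a few-shot prompt: worked examples first, then the new query."""
    lines = []
    for english, latex in EXAMPLES:
        lines.append(f"English: {english}")
        lines.append(f"LaTeX: {latex}")
    lines.append(f"English: {query}")
    lines.append("LaTeX:")  # the model completes the translation from here
    return "\n".join(lines)

prompt = build_prompt("the derivative of sine of x")
print(prompt)
```

The completion would then be requested from the model with this prompt; no gradient updates are involved, which is what distinguishes this from fine-tuning.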