Scott Alexander covers
[OpenAI's latest NLP neural net, GPT-3](https://slatestarcodex.com/2020/06/10/the-obligatory-gpt-3-post/).
It's trained with a massive 175 billion parameters, and the language it
generates is sometimes shockingly human-like:
> PROMPT: UNITED