
Good to know what the AI thinks of me, I think.
GPT-3 is the massive neural net from OpenAI that can adopt a persona from a few initial text prompts (examples include a Smug VC, Sam Harris, Yuval Noah Harari, and Dr. Seuss).
With 175 billion parameters in 96 attention layers, it cost an estimated $15 million to produce. It was pre-trained on the Common Crawl dataset, a crawl of the Internet spanning eight years (60%), Reddit-sourced text (22%), a corpus of books (15%) and Wikipedia (3%) — hundreds of billions of words in total. This includes proper names, like mine, as I just found out.
The NYT review said that GPT-3 is “amazing”, “spooky”, “humbling”, and “more than a little terrifying.”
Recent update from OpenAI:
“We currently generate an average of 4.5 billion words per day, and continue to scale production traffic.”
Given any text prompt like a phrase or a sentence, GPT-3 returns a text completion in natural language. Developers can “program” GPT-3 by showing it just a few examples or “prompts.”
The “Smug VC” persona was initiated with a few prompt sentences, starting with “The following is an interview with a smug venture capitalist (VC). He is self-important, self-congratulatory, and enthusiastic about technology and startups. He is a master of the humble-brag.”
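This kind of prompt "programming" is just string assembly: you stack a persona description and a few example exchanges in front of the user's new line, then let the model complete the text. Here is a minimal sketch of how such a persona prompt might be built; the `build_prompt` function and its dialogue labels are illustrative assumptions, not OpenAI's actual tooling, and the resulting string would be sent to the GPT-3 completion API.

```python
# A hypothetical helper that assembles a few-shot persona prompt.
# Nothing here calls GPT-3 itself; it only builds the text you would send.
def build_prompt(persona_description, examples, user_line):
    """Stack a persona description, example Q/A exchanges, and the
    user's new question into one prompt string for a completion model."""
    lines = [persona_description, ""]
    for question, answer in examples:
        lines.append(f"Interviewer: {question}")
        lines.append(f"VC: {answer}")
    lines.append(f"Interviewer: {user_line}")
    lines.append("VC:")  # the model continues from here
    return "\n".join(lines)

prompt = build_prompt(
    "The following is an interview with a smug venture capitalist (VC). "
    "He is self-important, self-congratulatory, and enthusiastic about "
    "technology and startups. He is a master of the humble-brag.",
    [("What do you invest in?", "Only category-defining founders, obviously.")],
    "What's your latest deal?",
)
print(prompt)
```

The trailing "VC:" is the trick: the model, asked to continue the transcript, answers in character. Swapping the description and examples swaps the persona.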
While there is a waiting list to access the GPT-3 API, you can chat with several AI personas or create your own after a quick sign-in at www.typical.me/skins
Remember, a straight line on a semi-log graph like this is an exponential. The left epoch on this graph is in line with Moore’s Law; the light blue epoch on the right, starting with the neural net renaissance in 2012, has been doubling every 3.5 months!
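The difference between those two epochs is easy to quantify: a doubling time translates into an annual growth factor of 2^(12/months). A quick sketch, assuming Moore's Law's commonly cited ~24-month doubling for comparison:

```python
# Annual growth factor implied by a given doubling time in months.
def growth_per_year(doubling_months):
    return 2 ** (12 / doubling_months)

moore = growth_per_year(24)   # Moore's Law: doubles roughly every 2 years
ai = growth_per_year(3.5)     # post-2012 AI-compute trend cited above

print(round(moore, 2))  # ~1.41x per year
print(round(ai, 1))     # ~10.8x per year
```

A 3.5-month doubling time compounds to roughly a 10x increase in compute every year, versus Moore's Law's ~1.4x, which is why the right-hand epoch on the chart is so much steeper.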