@@ -7,15 +7,25 @@ sidebartitle: GPT-3
 meta: GPT-3 is a trained neural network with 175 billion parameters, which allows it to be significantly better at text generation than previous models.
 
 
-[GPT-3](https://arxiv.org/abs/2005.14165) is a neural network that was
-trained by the [OpenAI](https://openai.com/) organization with 175 billion
-parameters, which allows the model to be significantly better at natural
+[GPT-3](https://arxiv.org/abs/2005.14165) is a neural network
+trained by the [OpenAI](https://openai.com/) organization with
+significantly more parameters than previous generation models.
+
+There are several variations of GPT-3, which range from 125 million to
+175 billion parameters. The different variations allow the model to better
+respond to different types of input, such as a question & answer format,
+long-form writing, and human language translation (e.g. English to French).
+The large number of parameters makes GPT-3 significantly better at natural
 language processing and text generation than the prior model,
 [GPT-2](https://openai.com/blog/gpt-2-1-5b-release/), which only had
 1.5 billion parameters.
 
 <img src="/img/logos/openai.jpg" width="100%" alt="OpenAI logo." class="shot rnd">
 
+GPT-3 can currently only be accessed through an
+[API provided by OpenAI](https://openai.com/blog/openai-api/), which is
+in private beta.
+
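The API accepts JSON requests over HTTPS. As a rough sketch of what a completion request could look like (the endpoint path, the `davinci` engine name, and the field names here are assumptions based on the beta announcement, and actually sending the request requires an approved API key):

```python
# Sketch of assembling a text-completion request for the beta OpenAI API.
# Endpoint path, engine name, and body fields are assumptions; an approved
# API key would be needed to actually send this request.
import json


def build_completion_request(prompt, engine="davinci", max_tokens=64):
    """Return the URL and JSON body for a hypothetical completion request."""
    url = f"https://api.openai.com/v1/engines/{engine}/completions"
    body = {
        "prompt": prompt,          # text the model should continue
        "max_tokens": max_tokens,  # upper bound on generated tokens
        "temperature": 0.7,        # sampling randomness (0 = most deterministic)
    }
    return url, json.dumps(body)


url, payload = build_completion_request("Q: What is GPT-3?\nA:")
print(url)
```

The prompt itself is how you steer the model toward one of the input styles mentioned above, such as question & answer or translation.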
 
 ## What's so special about GPT-3?
 The GPT-3 model can generate texts of up to 50,000 characters, with no
@@ -91,4 +101,17 @@ although some other estimates calculated it could take up
 to $12 million depending on how the hardware was provisioned.
 
 
+## GPT-3 resources
+These resources range from the broad philosophy of what GPT-3 means
+for machine learning to specific technical details of how the model
+is trained.
+
+* [OpenAI's GPT-3 Language Model: A Technical Overview](https://lambdalabs.com/blog/demystifying-gpt-3/)
+  and
+  [GPT-3: A Hitchhiker's Guide](https://lambdalabs.com/blog/gpt-3/)
+  are two long-format guides that analyze how GPT-3's technical
+  specifications fit into the larger machine learning ecosystem, collect
+  quotes from researchers on its usage, and list some initial resources
+  for better understanding what this model is capable of.
+
 