{{see also|Organizations}}
==Introduction==
[[OpenAI]] is an [[Artificial Intelligence]] ([[AI]]) research [[company]] founded in 2015. Originally a [[non-profit]], intended to be free from the need to generate financial return, it has since also founded OpenAI LP, a for-profit corporation <ref name="”1”">OpenAI. About. OpenAI. https://openai.com/about/</ref> <ref name="”2”">Floridi, L and Chiriatti, M (2020). GPT-3: Its Nature, Scope, Limits, and Consequences. Minds and Machines 30:681-694.</ref> <ref name="”3”">Olanoff, D (2015). Artificial Intelligence Nonprofit OpenAI Launches With Backing of Elon Musk and Sam Altman. TechCrunch. https://techcrunch.com/2015/12/11/non-profit-openai-launches-with-backing-from-elon-musk-and-sam-altman/</ref>. Its stated mission is to promote and develop friendly AI that benefits all of humanity <ref name="”1”" /> <ref name="”2”" />. It received financial support from its founders Sam Altman, Greg Brockman and Elon Musk, as well as from Jessica Livingston, Peter Thiel, Amazon Web Services, Infosys and YC Research, with a total investment of $1 billion <ref name="”3”" />. In 2020, [[Microsoft]], another investor in OpenAI, announced an exclusive license agreement for [[GPT-3]] <ref name="”2”" />.
OpenAI has granted users "full usage rights to commercialize the images they create with DALL-E, including the right to reprint, sell, and merchandise" <ref name="”13”">Rizo, J (2022). Who Will Own the Art of the Future? Wired. https://www.wired.com/story/openai-dalle-copyright-intellectual-property-art/</ref>.
===GPT-1 and GPT-2===
[[GPT]], or [[Generative Pre-trained Transformer]], was introduced in the [[paper]] [[Improving Language Understanding by Generative Pre-Training]] in June 2018. [[GPT-1]] combined the [[transformer]] [[architecture]] with [[unsupervised learning]] to create a [[model]] with 117 million [[parameters]], trained on 7,000 books. [[GPT-2]], released in February 2019 with the paper [[Language Models are Unsupervised Multitask Learners]], had 1.5 billion parameters and was trained on 40GB of web text drawn from 8 million documents.


===GPT-3===
[[GPT-3]] (Generative Pre-trained Transformer 3) is the third generation of a computational system that generates text, code or other data, starting from a source input, the prompt. The system uses deep learning to produce human-like text <ref name="”2”" /> <ref name="”14”">Wilhelm, A (2021). Okay, the GPT-3 Hype Seems Pretty Reasonable. TechCrunch. https://techcrunch.com/2021/03/17/okay-the-gpt-3-hype-seems-pretty-reasonable/</ref>. According to Zhang & Li (2021), GPT-3 is the "language model with the most parameters, the largest scale, and the strongest capabilities. Using a large amount of Internet text data and thousands of books for model training, GPT-3 can imitate the natural language patterns of humans nearly perfectly. This language model is extremely realistic and is considered the most impressive model as of today <ref name="”15”">Zhang, M and Li, J (2021). A Commentary of GPT-3 in MIT Technology Review 2021. Fundamental Research 1(6):831-833.</ref>."