OpenAI has granted users "full usage rights to commercialize the images they create with DALL-E, including the right to reprint, sell, and merchandise."<ref name="”13”">Rizo, J (2022). Who Will Own the Art of the Future? Wired. https://www.wired.com/story/openai-dalle-copyright-intellectual-property-art/</ref>
===GPT-1 and GPT-2===
[[GPT]], or [[Generative Pre-trained Transformer]], was introduced in the [[paper]] [[Improving Language Understanding by Generative Pre-Training]] in June 2018. [[GPT-1]] combined the [[transformers|transformer]] [[architecture]] with [[unsupervised learning]] to create a [[model]] with 117 million [[parameters]], trained on a corpus of about 7,000 books. [[GPT-2]], released in February 2019 with the paper [[Language Models are Unsupervised Multitask Learners]], had 1.5 billion parameters and was trained on 40 GB of web text drawn from 8 million documents.
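Because the GPT-2 weights were publicly released, the model can be run locally. The sketch below generates a continuation from a short prompt using the Hugging Face <code>transformers</code> library; the library, the <code>"gpt2"</code> checkpoint name, and the sampling settings are illustrative assumptions, not part of OpenAI's own release.

<syntaxhighlight lang="python">
# Minimal sketch: sampling from the openly released GPT-2 weights
# via the Hugging Face "transformers" library (assumed to be installed).
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")   # 124M-parameter checkpoint
model = GPT2LMHeadModel.from_pretrained("gpt2")

# Encode a short prompt and sample a continuation from the model.
inputs = tokenizer("OpenAI was founded in", return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_new_tokens=20,
    do_sample=True,
    pad_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
</syntaxhighlight>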


===GPT-3===
[[GPT-3]] (Generative Pre-trained Transformer 3) is the third generation of a computational system that generates text, code, or other data from a source input called the prompt. The system uses deep learning to produce human-like text.<ref name="”2”" /><ref name="”14”">Wilhelm, A (2021). Okay, the GPT-3 Hype Seems Pretty Reasonable. TechCrunch. https://techcrunch.com/2021/03/17/okay-the-gpt-3-hype-seems-pretty-reasonable/</ref> According to Zhang & Li (2021), GPT-3 is the "language model with the most parameters, the largest scale, and the strongest capabilities. Using a large amount of Internet text data and thousands of books for model training, GPT-3 can imitate the natural language patterns of humans nearly perfectly. This language model is extremely realistic and is considered the most impressive model as of today."<ref name="”15”">Zhang, M and Li, J (2021). A Commentary of GPT-3 in MIT Technology Review 2021. Fundamental Research 1(6):831-833.</ref>
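Unlike GPT-2, GPT-3 is accessed through OpenAI's hosted API rather than released weights. The sketch below shows prompt-based text completion with the pre-1.0 <code>openai</code> Python package; the model name and request parameters are illustrative assumptions rather than details given in this article.

<syntaxhighlight lang="python">
# Minimal sketch: prompting a GPT-3-family model through the OpenAI API
# (pre-1.0 "openai" Python package; model name is an illustrative choice).
import openai

openai.api_key = "YOUR_API_KEY"  # obtained from the OpenAI account dashboard

response = openai.Completion.create(
    model="text-davinci-003",    # a GPT-3-family completion model
    prompt="Write one sentence explaining what OpenAI is.",
    max_tokens=60,
    temperature=0.7,
)
print(response.choices[0].text.strip())
</syntaxhighlight>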