User contributions for Daikon Radish
6 March 2023
- 16:59, 6 March 2023 +27 N File:Coding model diagram1.png File uploaded with MsUpload current
- 16:57, 6 March 2023 +266 Prompt engineering →Prompt Engineering for Code Generation Models
- 16:50, 6 March 2023 +3 Prompt engineering No edit summary
- 16:50, 6 March 2023 +864 Prompt engineering →Context
- 16:47, 6 March 2023 +16 Prompt engineering →Prompt Engineering for Coding
- 16:45, 6 March 2023 +53 Prompt engineering →Examples
- 16:44, 6 March 2023 +553 Prompt engineering →Examples
- 14:09, 6 March 2023 +47 Guides No edit summary
- 14:07, 6 March 2023 +4 Zero shot, one shot and few shot learning No edit summary current
- 14:07, 6 March 2023 −298 Zero shot, one shot and few shot learning No edit summary
- 14:03, 6 March 2023 +530 Zero shot, one shot and few shot learning No edit summary
- 13:20, 6 March 2023 +823 Zero shot, one shot and few shot learning No edit summary
- 13:19, 6 March 2023 +1,438 N Zero shot, one shot and few shot learning Created page with "{{Needs Expansion}} Zero shot learning is when you have no examples in the prompt. One shot learning is when you have 1 example in the prompt. Few shot learning is when you have a few examples in the prompt. all of these techniques allow the machine learning model to learn with limited or no labeled data. ==Zero Shot Learning== Zero-shot learning, one-shot learning, and few-shot learning are all machine learning techniques used to train model..."
- 12:24, 6 March 2023 +746 Prompt engineering →Prompt Engineering for Coding
- 12:12, 6 March 2023 +97 Prompt engineering No edit summary
5 March 2023
- 19:22, 5 March 2023 +132 Prompt engineering No edit summary
- 19:21, 5 March 2023 +9 Artificial intelligence terms No edit summary
- 19:20, 5 March 2023 +105 Artificial intelligence terms No edit summary
23 February 2023
- 16:03, 23 February 2023 +2,343 Improving Language Understanding by Generative Pre-Training (GPT) No edit summary
- 16:01, 23 February 2023 +1,031 N Improving Language Understanding by Generative Pre-Training (GPT) Created page with "===Introduction=== In June 2018, OpenAI introduced GPT-1, a language model that combined unsupervised pre-training with the transformer architecture to achieve significant progress in natural language understanding. The team fine-tuned the model for specific tasks and found that pre-training helped it perform well on various NLP tasks with minimal fine-tuning. GPT-1 used the BooksCorpus dataset and self-attention in the transformer's decoder with 117 million parameters,..."