Prompt engineering

__TOC__
==Introduction==
[[Prompt engineering]], also known as [[In-context prompting]], is an emerging research area within Human-Computer Interaction (HCI) concerned with the systematic search for [[prompts]] that produce desired outcomes from [[AI models]]. It encompasses techniques that steer the behavior of [[Large language models]] ([[LLM]]s) toward specific goals without modifying the [[model]]'s [[weights]]. The process consists of selecting and composing sentences to achieve a particular result, such as a specific visual style in [[text-to-image models]] or a different tone in the response of a [[text-to-text models|text-to-text one]]. Because the same prompting strategy can behave very differently across [[models]], prompt engineering remains an experimental practice built on trial and error and heuristics rather than the settled methods of the hard sciences. <ref name="1">Bouchard, L (2022). Prompting Explained: How to Talk to ChatGPT. Louis Bouchard. https://www.louisbouchard.ai/prompting-explained/</ref> <ref name="2">Oppenlaender, J (2022). A Taxonomy of Prompt Modifiers for Text-To-Image Generation. arXiv:2204.13988v2</ref> <ref name="3">Liu, V and Chilton, LB (2021). Design Guidelines for Prompt Engineering Text-to-Image Generative Models. arXiv:2109.06977v2</ref> Prompt engineers serve as translators between "human language" and "AI language," transforming an idea into words that the AI model can comprehend. <ref name="1"></ref>
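
As an illustration of how wording alone can steer a model's output, the sketch below composes two prompts for the same question that differ only in the requested tone. The <code>generate</code> call is a hypothetical placeholder for whatever text-generation model or API the reader uses; it is not part of any specific library, and the template itself is only an assumed example, not a prescribed format.

<syntaxhighlight lang="python">
# Two prompts for the same underlying task, differing only in the
# instructions that describe the desired tone of the response.
# No model weights are touched; only the input text changes.

def build_prompt(question: str, tone: str) -> str:
    """Compose a prompt that asks for an answer in a given tone."""
    return (
        f"Answer the following question in a {tone} tone.\n\n"
        f"Question: {question}\n"
        f"Answer:"
    )

question = "Why is the sky blue?"

formal_prompt = build_prompt(question, "formal, academic")
casual_prompt = build_prompt(question, "casual, friendly")

# generate() is a hypothetical stand-in for any text-generation call
# (a local model or a hosted API); uncomment once such a function exists.
# formal_answer = generate(formal_prompt)
# casual_answer = generate(casual_prompt)

print(formal_prompt)
print(casual_prompt)
</syntaxhighlight>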


The process of prompt engineering resembles a conversation with the [[generative system]], with practitioners adapting and refining prompts to improve outcomes. <ref name="2"></ref> It has emerged as a new form of interaction with models that have learned complex abstractions by consuming large amounts of data from the internet. Because these models have meta-learning capabilities and can adapt their abstractions on the fly to fit new tasks, they often must be prompted with specific knowledge and abstractions to perform well on a new task. The term "prompt engineering" was coined by [[Gwern]] (writer and technologist), who evaluated GPT-3's capabilities on creative fiction and suggested that a new mode of interaction would be figuring out how to prompt the model to elicit specific knowledge and abstractions. <ref name="3"></ref>
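
This on-the-fly adaptation is often demonstrated with few-shot prompts, in which worked examples are embedded directly in the prompt so the model can infer the task without any fine-tuning. The sketch below builds such a prompt for a translation task; the task, the examples, and the expected continuation are illustrative assumptions, not content from the cited sources.

<syntaxhighlight lang="python">
# A minimal few-shot prompt: worked examples are placed directly in the
# prompt so the model can infer the task pattern "on the fly",
# without any gradient updates or fine-tuning.

examples = [
    ("cheese", "fromage"),
    ("house", "maison"),
    ("book", "livre"),
]

def few_shot_prompt(new_word: str) -> str:
    """Build an English-to-French translation prompt from in-context examples."""
    lines = ["Translate English to French:"]
    for english, french in examples:
        lines.append(f"{english} -> {french}")
    lines.append(f"{new_word} ->")
    return "\n".join(lines)

print(few_shot_prompt("water"))
# The completed prompt is then sent to the model, which is expected to
# continue the pattern (here, ideally with "eau").
</syntaxhighlight>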