==Introduction==
[[File:0. chat-email-copy Blusteak.png|thumb|Figure 1. Example of a prompt on ChatGPT. Source: Blusteak.]]
A [[prompt]], or [[artificial intelligence]] ([[AI]]) prompt, is a set of [[natural language]] instructions, a text, that serves as [[input]] to an [[AI generator]]. <ref name="”1”">Ana, B (2022). Design your AI Art generator prompt using ChatGPT. Towards AI. https://pub.towardsai.net/design-your-ai-art-generator-prompt-using-chatgpt-7a3dfddf6f76</ref> Put simply, it is a phrase or a set of individual keywords used in tools like [[ChatGPT]] (figure 1), a [[text-to-text]] generator, or in [[text-to-image]] generators like [[DALL-E]]. Given this input, the [[AI model]] interprets it and generates a response. <ref name="”2”">Schmid, S (2022). ChatGPT: How to write the perfect prompts. Neuroflash. https://neuroflash.com/chatgpt-how-to-write-the-perfect-prompts/</ref>




==Prompt engineering==
[[Prompt engineering]] or [[Prompt design|prompt design]] is the practice of discovering the prompt that gets the best result from an [[AI system]]. <ref name="”4”"></ref> The development of prompts requires human intuition, and the results can look arbitrary. <ref name="”9”">Pavlichenko, N, Zhdanov, F and Ustalov, D (2022). Best prompts for text-to-image models and how to find them. arXiv:2209.11711v2</ref> Manual prompt engineering is laborious, may be infeasible in some situations, and its results may vary between model versions. <ref name="”3”"></ref> However, there have been developments in automated [[prompt generation]], which rephrases the input to make it more model-friendly. <ref name="”5”"></ref>
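As a loose illustration of what searching over prompt variants can look like in practice, the sketch below appends different combinations of style keywords to a base prompt and keeps the variant that scores best under some quality metric. It is not the procedure of the cited paper; the <code>score_variant</code> function is a hypothetical stand-in for whatever evaluation (human ratings, image–text similarity, and so on) a real system would use.

<syntaxhighlight lang="python">
# Illustrative sketch only: brute-force search over prompt keyword combinations.
# score_variant is a hypothetical placeholder for a real quality metric
# (e.g. human ratings or an image-text similarity score).
from itertools import combinations

BASE_PROMPT = "a castle on a hill at sunset"
KEYWORDS = ["highly detailed", "cinematic lighting", "digital painting",
            "4k", "trending on artstation"]

def score_variant(prompt: str) -> float:
    """Placeholder metric; replace with a real evaluation of the generated output."""
    return float(len(prompt))  # dummy stand-in so the sketch runs end to end

def best_prompt(base: str, keywords: list[str], k: int = 3) -> str:
    """Try every combination of k keyword modifiers and return the highest-scoring prompt."""
    variants = [", ".join([base, *combo]) for combo in combinations(keywords, k)]
    return max(variants, key=score_variant)

print(best_prompt(BASE_PROMPT, KEYWORDS))
</syntaxhighlight>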


A list of prompts for beginners is [https://mpost.io/top-50-text-to-image-prompts-for-ai-art-generators-midjourney-and-dall-e/ available], as is a compilation of the best [https://mpost.io/best-10-ai-prompt-guides-and-tutorials-for-text-to-image-models-midjourney-stable-diffusion-dall-e/ prompt guides and tutorials].
===Text-to-Text===
'''[[Prompt engineering for text generation]]'''
 
===Text-to-Image===
'''[[Prompt engineering for image generation]]'''


==Prompt generators==


Besides the options above, [[ChatGPT]] can also be used to design prompts for AI image generators. This can be achieved by asking it for adjectives that describe a specific scene (figures 10a and 10b) or by directly asking it to write a prompt (e.g. “Write a text prompt for an AI art generation software that would fit the art style of Kilian Eng”). <ref name="”1”"></ref> <ref name="”15”">EdXD (2022). Using GPT-3 to generate text prompts for “AI” generated art. ByteXD. https://bytexd.com/using-gpt-3-to-generate-text-prompts-for-ai-generated-art/</ref>
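The same idea can also be scripted against a chat model's API rather than the chat interface. The sketch below is a minimal example assuming the OpenAI Python client (v1.x) and an API key in the environment; the model name is a placeholder, and the instruction simply mirrors the example above.

<syntaxhighlight lang="python">
# Minimal sketch: asking a chat model to write an image-generation prompt.
# Assumes the OpenAI Python client (v1.x) and an API key in the OPENAI_API_KEY
# environment variable; the model name below is a placeholder.
from openai import OpenAI

client = OpenAI()

response = client.chat.completions.create(
    model="gpt-4",  # placeholder; use whichever chat model is available
    messages=[{
        "role": "user",
        "content": "Write a text prompt for an AI art generation software "
                   "that would fit the art style of Kilian Eng.",
    }],
)

# The generated prompt can then be pasted into a text-to-image generator.
print(response.choices[0].message.content)
</syntaxhighlight>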
==Security Risks==
*[[Prompt injection]]
==Prompting vs. Fine-tuning==
Prompting and [[fine-tuning]] represent two different ways to leverage [[large language models]] (LLMs) like [[GPT-4]].

Fine-tuning involves adapting an LLM's [[parameters]] to a specific [[dataset]], making it a potent tool for complex tasks where accurate, trusted output is vital. However, fine-tuning usually requires a labeled dataset and can be expensive during the [[training]] phase.

Conversely, prompting is the technique of providing specific instructions to an LLM to guide its responses. It does not require retraining the model for each new prompt or data change and thus offers a quicker iterative process. Importantly, it does not require a labeled dataset, making it a viable option when training data is scarce or absent. Prompting can be an excellent starting point for solving tasks, especially simpler ones, as it can be resource-friendly and computationally efficient.

[[File:prompting_vs_finetuning1.png|400px]]

Despite its advantages, prompting may underperform compared to fine-tuning on complex tasks. There is also a clear trade-off in [[inference]] costs: fine-tuned models, by integrating task-specific knowledge into the model's parameters, can generate accurate responses with minimal explicit instructions, making them cheaper in the long run. In contrast, prompted models, which rely on lengthy explicit instructions at every call, can be more resource-intensive and expensive, particularly for large-scale applications. The choice between fine-tuning and prompting therefore depends on the specific use case, data availability, task complexity, and computational resources.
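To make the contrast concrete, the sketch below shows both approaches side by side, assuming the OpenAI Python client (v1.x): prompting supplies the task instructions at inference time with no parameter updates, while fine-tuning first uploads a labeled JSONL dataset and launches a training job whose resulting model can later be called with much shorter prompts. The model names and the training file are placeholders, not a recommended setup.

<syntaxhighlight lang="python">
# Illustrative sketch of the two approaches, assuming the OpenAI Python client (v1.x).
# Model names and the training file are placeholders.
from openai import OpenAI

client = OpenAI()

# --- Prompting: no parameter updates; the task description travels with every request. ---
prompted = client.chat.completions.create(
    model="gpt-4",  # placeholder base model
    messages=[
        {"role": "system",
         "content": "Classify the sentiment of the user's text as positive or negative."},
        {"role": "user",
         "content": "The battery life on this laptop is fantastic."},
    ],
)
print(prompted.choices[0].message.content)

# --- Fine-tuning: adapt the model's parameters to a labeled dataset up front. ---
# sentiment.jsonl would contain labeled chat examples, one JSON object per line.
training_file = client.files.create(file=open("sentiment.jsonl", "rb"), purpose="fine-tune")
job = client.fine_tuning.jobs.create(training_file=training_file.id, model="gpt-3.5-turbo")

# Once the job finishes, the fine-tuned model can be called with minimal instructions,
# trading a one-off training cost for cheaper, shorter prompts at inference time.
</syntaxhighlight>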
==Related Pages==
*[[Prompt engineering]]
*[[Prompt injection]]


==References==