==Token Limits==
The token limit for a request depends on the [[model]] used.
 
===OpenAI API Token Limit===
The OpenAI API allows a maximum of 4097 tokens shared between the prompt and its completion. For example, if a prompt consists of 4000 tokens, the completion can be at most 97 tokens. This limitation is a technical constraint, but there are strategies to work within it, such as shortening prompts or dividing text into smaller sections.
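
The sketch below is an illustration, not part of the API itself: it assumes the <code>tiktoken</code> tokenizer library, and the <code>COMPLETION_BUDGET</code> value and model name are hypothetical examples. It shows how a prompt's token count can be measured and how long text can be divided into sections that stay within the shared 4097-token limit.

<syntaxhighlight lang="python">
import tiktoken

TOKEN_LIMIT = 4097        # shared between prompt and completion (figure above)
COMPLETION_BUDGET = 500   # hypothetical room reserved for the completion


def count_tokens(text: str, model: str = "text-davinci-003") -> int:
    """Count how many tokens the given text uses for the chosen model."""
    enc = tiktoken.encoding_for_model(model)
    return len(enc.encode(text))


def split_into_sections(text: str, model: str = "text-davinci-003") -> list[str]:
    """Divide long text into sections whose token count fits the prompt budget."""
    enc = tiktoken.encoding_for_model(model)
    tokens = enc.encode(text)
    prompt_budget = TOKEN_LIMIT - COMPLETION_BUDGET
    return [enc.decode(tokens[i:i + prompt_budget])
            for i in range(0, len(tokens), prompt_budget)]

# A 4000-token prompt leaves at most 97 tokens for the completion,
# so a request near the limit should either trim the prompt or be split.
</syntaxhighlight>

In practice, the number of tokens reserved for the completion would be tuned to the expected length of the response.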


==Token Pricing==