Tokens
*gpt-3-encoder package for Node.js


==Token Limits==
The token limit for requests depends on the model used, with a maximum of 4097 tokens shared between the prompt and its completion. If a prompt consists of 4000 tokens, the completion can contain at most 97 tokens. This limitation is a technical constraint, but there are strategies for working within it, such as shortening prompts or dividing text into smaller sections.
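The budget arithmetic above can be sketched as follows. This is a minimal illustration assuming the 4097-token shared limit described here; the helper names are hypothetical, and a real application would count tokens with an actual tokenizer (such as the gpt-3-encoder package mentioned earlier) rather than rely on fixed numbers.

<nowiki>```python
MODEL_TOKEN_LIMIT = 4097  # shared between the prompt and its completion

def max_completion_tokens(prompt_tokens: int, limit: int = MODEL_TOKEN_LIMIT) -> int:
    """Return how many tokens remain for the completion, given the prompt size."""
    remaining = limit - prompt_tokens
    if remaining <= 0:
        raise ValueError("prompt alone meets or exceeds the model's token limit")
    return remaining

def split_into_sections(tokens: list, section_size: int) -> list:
    """Divide a token sequence into smaller sections that each fit a budget."""
    return [tokens[i:i + section_size] for i in range(0, len(tokens), section_size)]

# A 4000-token prompt leaves at most 97 tokens for the completion.
print(max_completion_tokens(4000))  # 97

# A 10000-token document split into 4000-token sections yields three chunks.
sections = split_into_sections(list(range(10000)), 4000)
print([len(s) for s in sections])  # [4000, 4000, 2000]
```</nowiki>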

