2 Comments

From André Aparicio (https://www.facebook.com/andre.tiranosaurio/):

On average, each token is about 4 characters long, spaces included. They are going to increase the maximum number of tokens the model can handle in a conversation in order to give better responses. Currently, the free model (ChatGPT 3.5) can handle up to 4,096 tokens. This means that if you input a longer text, only the last 4,096 tokens will be considered when generating the response. The same applies to long conversations: the initial messages will make sense, and the model will use them along with your requests to generate subsequent responses, but once the conversation exceeds 4,096 tokens, the beginning is forgotten.

The paid model (ChatGPT-4) doubles the amount of text it can remember as context when generating responses: currently up to 8,192 tokens. In the near future this will be increased to 32,768 tokens, which is roughly equivalent to 50 pages of text. That's the context window.
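The behavior described above can be sketched in a few lines of Python. This is only an illustration, not how OpenAI's models actually truncate input: it uses the rough ~4 characters-per-token rule of thumb from the comment (a real tokenizer such as OpenAI's `tiktoken` counts tokens exactly), and the function names `estimate_tokens` and `trim_to_window` are made up for this sketch.

```python
def estimate_tokens(text: str) -> int:
    """Approximate token count using the ~4 characters/token rule of thumb."""
    return max(1, round(len(text) / 4))

def trim_to_window(messages: list[str], max_tokens: int = 4096) -> list[str]:
    """Keep only the most recent messages that fit in the context window.

    Older messages that no longer fit are dropped, which is the
    'the beginning will be forgotten' effect described in the comment.
    """
    kept: list[str] = []
    used = 0
    # Walk backwards from the newest message, accumulating token cost.
    for msg in reversed(messages):
        cost = estimate_tokens(msg)
        if used + cost > max_tokens:
            break
        kept.append(msg)
        used += cost
    return list(reversed(kept))
```

For example, five messages of ~1,000 tokens each won't all fit in a 4,096-token window, so only the last four survive; the first message is silently forgotten.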

You can check this at https://platform.openai.com/docs/models/continuous-model-upgrades
