
tiktoken undercounting tokens for OpenAI text-embedding-3-large #366

Open
mattkauffman23 opened this issue Jan 9, 2025 · 0 comments
Starting on January 2, 2025, we began noticing errors in our logs saying we were over the context limit when creating text-embedding-3-large embeddings on OpenAI. I believe there may have been a change on the OpenAI side, since we hadn't made any related changes. In one case I looked into, tiktoken reported 7995 tokens, but we received the following error from OpenAI:

```
{'error': {'message': "This model's maximum context length is 8192 tokens, however you requested 8781 tokens (8781 in your prompt; 0 for the completion). Please reduce your prompt; or completion length.", 'type': 'invalid_request_error', 'param': None, 'code': None}}
```
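For reference, this is roughly how we count tokens before calling the API (a minimal sketch; `text` is a placeholder for the actual input, and I'm assuming `encoding_for_model`, which currently resolves this model to `cl100k_base`):

```python
import tiktoken

# tiktoken currently maps text-embedding-3-large to the cl100k_base encoding.
enc = tiktoken.encoding_for_model("text-embedding-3-large")

text = "..."  # placeholder for the ~8k-token input that triggered the error
num_tokens = len(enc.encode(text))
print(num_tokens)  # tiktoken said 7995; the API counted 8781 for the same input
```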

For the moment we're mitigating by reducing our max-token limit, but wanted to raise the issue.
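Concretely, the mitigation looks something like the sketch below. The margin value is a guess tuned to the gap we observed (8781 − 7995 ≈ 786 tokens), not something derived from the tokenizer:

```python
import tiktoken

MODEL_LIMIT = 8192    # context limit reported in the API error
SAFETY_MARGIN = 1024  # hypothetical headroom; the observed undercount was ~786 tokens

enc = tiktoken.encoding_for_model("text-embedding-3-large")

def fits_within_limit(text: str) -> bool:
    # Treat tiktoken's count as a lower bound, not an exact figure.
    return len(enc.encode(text)) <= MODEL_LIMIT - SAFETY_MARGIN
```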
