
It's not obvious how to override default max_tokens_to_sample #14

Open
rstrahan opened this issue Oct 27, 2023 · 0 comments
Labels: enhancement (New feature or request)

@rstrahan (Contributor)

It is possible to override the default max_tokens_to_sample value (256) by adding a key to the model parameters.
The default model parameters output currently looks like: {"modelId": "anthropic.claude-instant-v1", "temperature": 0}

It is possible to add additional model parameters to this structure, but this may not be obvious to users.

The request is to update the default to include max_tokens_to_sample explicitly, like this:
{"modelId": "anthropic.claude-instant-v1", "temperature": 0, "max_tokens_to_sample": 256}

This would make it more obvious that the value can be modified.
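For context, here is a minimal sketch of how a parameters object like the one above is typically consumed, assuming a Python/boto3 caller that separates modelId from the fields Claude expects in the Bedrock request body. The invoke_claude helper and the exact parameter handling are illustrative assumptions, not the plugin's actual code; only the Bedrock invoke_model call and the Claude body fields (prompt, temperature, max_tokens_to_sample) are standard.

```python
import json
import boto3

bedrock = boto3.client("bedrock-runtime")

# Default parameters, now listing max_tokens_to_sample explicitly so users can
# see that it is overridable.
model_params = {
    "modelId": "anthropic.claude-instant-v1",
    "temperature": 0,
    "max_tokens_to_sample": 256,
}

def invoke_claude(prompt: str, params: dict) -> str:
    """Illustrative helper: split modelId from the Claude request body and call Bedrock."""
    params = dict(params)          # avoid mutating the caller's defaults
    model_id = params.pop("modelId")
    body = {
        "prompt": f"\n\nHuman: {prompt}\n\nAssistant:",
        **params,                  # temperature, max_tokens_to_sample, ...
    }
    response = bedrock.invoke_model(
        modelId=model_id,
        contentType="application/json",
        accept="application/json",
        body=json.dumps(body),
    )
    return json.loads(response["body"].read())["completion"]

# Overriding the default then only requires editing the JSON:
# invoke_claude("Hello", {**model_params, "max_tokens_to_sample": 1024})
```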

@rstrahan rstrahan self-assigned this Oct 27, 2023
@rstrahan rstrahan added the enhancement label Oct 27, 2023