The parameters `return_full_text` and `max_new_tokens` are not recognized by the Bedrock service, which causes an error. I fixed it by adding the following two lines (after `request_body.update(parameters)`):
```python
if provider == "anthropic":
    request_body = {
        "prompt": prompt,
        "max_tokens_to_sample": DEFAULT_MAX_TOKENS
    }
request_body.update(parameters)
# Drop parameters that the Bedrock Anthropic provider rejects
if "return_full_text" in request_body:
    del request_body["return_full_text"]
if "max_new_tokens" in request_body:
    del request_body["max_new_tokens"]
```
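For illustration, the fix above can be sketched as a standalone helper. This is a minimal sketch, not the actual Lambda code: `build_request_body` is a hypothetical function name, and `DEFAULT_MAX_TOKENS` is assumed to be defined as in the snippet above.

```python
DEFAULT_MAX_TOKENS = 256  # assumed default; the real value comes from the Lambda config

# Parameters accepted by HuggingFace-style endpoints but rejected by Bedrock
UNSUPPORTED_PARAMS = ("return_full_text", "max_new_tokens")

def build_request_body(prompt, parameters, provider="anthropic"):
    """Build a Bedrock request body, dropping parameters Bedrock does not accept.

    Hypothetical helper illustrating the fix; not part of the QnABot codebase.
    """
    request_body = {}
    if provider == "anthropic":
        request_body = {
            "prompt": prompt,
            "max_tokens_to_sample": DEFAULT_MAX_TOKENS,
        }
    request_body.update(parameters)
    # The fix: remove parameters the Bedrock service would reject
    for key in UNSUPPORTED_PARAMS:
        request_body.pop(key, None)
    return request_body
```

With this helper, a call such as `build_request_body("Hello", {"max_new_tokens": 100, "temperature": 0.5})` yields a body containing only `prompt`, `max_tokens_to_sample`, and `temperature`, so the unsupported keys never reach Bedrock.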
Stack: QNABOT-BEDROCK-EMBEDDINGS-AND-LLM
Lambda: LLMLambdaFunction