QNABOT-BEDROCK-EMBEDDINGS-AND-LLM - Tags wrapping responses #24
Can confirm this is an issue with model `anthropic.claude-v2`. Until this is fixed, should those using
Thanks both for reporting this! Are either of you willing to submit a pull request with a fix? If it's not easily fixed by manipulating the prompt, another simple, but hopefully effective, method could be to strip the unwanted tags from the response in the Lambda function before returning the response back to QnABot. Fork the repo, modify the code in your fork, build/publish/test locally, then commit/push back to your fork and submit a PR. I can quickly accept and republish the plugins when I have that. Tx!!
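A minimal sketch of that workaround (not the plugin's actual code; the function name, and the assumption that the model echoes a single XML-style wrapper tag pair around the whole response, are mine):

```python
import re

def strip_wrapper_tags(text: str) -> str:
    """Remove one leading/trailing wrapper tag pair (e.g. <answer>...</answer>)
    leaked from the prompt into the model response.

    Hypothetical helper: the actual tag names depend on the prompt in use,
    so this matches any single <name>...</name> pair enclosing the text.
    """
    match = re.match(r"^\s*<(\w+)>(.*)</\1>\s*$", text, flags=re.DOTALL)
    if match:
        return match.group(2).strip()
    # No recognizable wrapper: return the text unchanged (trimmed).
    return text.strip()
```

Called on the response just before it is returned to QnABot, this would also keep the tags out of the SSML sent to Polly; if no wrapper pair is found, the response passes through untouched.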
Any updates on this? Using `anthropic.claude-instant-v1` works, but the results are not good compared with Claude v2.
@michaelin-96 Are you able to send a PR with a fix this week? If not, I'll try to find someone else to look at it. Tx.
@rstrahan Hey Bob, I don't have the bandwidth to look at this currently. Sorry.
Just curious if adding Claude 3 support will also potentially resolve the tag-wrapping issue. Very interested to use this with Haiku.
In version `v0.1.13` of the QNABOT-BEDROCK-EMBEDDINGS-AND-LLM plugin, responses returned from Bedrock come back wrapped in tags. In my case I am using the current suggested prompt, which contains `<question></question>` tags, and the responses come back wrapped in tags. Switching the prompt to a different set of tags results in the responses coming back wrapped in the corresponding tags, so it would appear that the tags from the prompt are leaking through to the response. I am testing with model `anthropic.claude-v2`.

These tags show up in text-based responses, and they also cause an error for any Polly-based responses, since Polly sees them as bad SSML markup. The error you get in that situation is:

```
Invalid Bot Configuration: Encountered an exception when making Amazon Polly service call, Invalid SSML request (Service: AmazonPolly; Status Code: 400; Error Code: InvalidSsmlException; Request ID: xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx; Proxy: null)
```

I was running an earlier version of the plugin, and this issue appeared after upgrading to version `v0.1.13`.