
QNABOT-BEDROCK-EMBEDDINGS-AND-LLM - Tags wrapping responses #24

Open
billsnyder opened this issue Jan 26, 2024 · 6 comments

Comments

@billsnyder

In version v0.1.13 of the QNABOT-BEDROCK-EMBEDDINGS-AND-LLM plugin, responses returned from Bedrock come back wrapped in XML-style tags. I am using the currently suggested prompt, which contains <question></question> tags, and the responses come back wrapped in those tags. Switching the prompt to a different set of tags results in responses wrapped in the corresponding tags, so the tags from the prompt appear to be leaking through into the response. I am testing with model anthropic.claude-v2.

Because these tags are present, they show up in text-based responses, and they also cause an error for any Polly-based responses, since Polly treats them as invalid SSML markup. The error in that situation is:

Invalid Bot Configuration: Encountered an exception when making Amazon Polly service call, Invalid SSML request (Service: AmazonPolly; Status Code: 400; Error Code: InvalidSsmlException; Request ID: xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx; Proxy: null)

I was running an earlier version of the plugin, and this issue appeared after upgrading to v0.1.13.

@michaelin-96
Member

Can confirm this is an issue with model anthropic.claude-v2. Until this is fixed, should those using QNABOT-BEDROCK-EMBEDDINGS-AND-LLM stick with a different model? I haven't seen this issue while using anthropic.claude-instant-v1.

@rstrahan
Contributor

rstrahan commented Mar 2, 2024

Thanks, both, for reporting this! Are either of you willing to submit a pull request with a fix? If it's not easily fixed by adjusting the prompt, another simple but hopefully effective approach would be to strip the unwanted tags from the response in the Lambda function before returning it to QnABot. Fork the repo, modify the code in your fork, build/publish locally and test, then commit and push back to your fork and submit a PR. I can quickly accept and republish the plugins when I have that. Tx!!
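The Lambda-side workaround described above could be sketched as follows in Python. This is only an illustration of the idea, assuming the leaked wrapper is a single matched pair of XML-style tags around the whole response; the function name and tag pattern are assumptions, not the plugin's actual code:

```python
import re

def strip_wrapping_tags(text: str) -> str:
    """Remove one pair of XML-style tags (e.g. <question>...</question>)
    that wraps the entire LLM response, along with surrounding whitespace.

    Hypothetical post-processing helper, not the plugin's actual code.
    Returns the inner text if the whole response is wrapped, otherwise
    returns the response unchanged (minus leading/trailing whitespace).
    """
    stripped = text.strip()
    # Match "<tag> ... </tag>" spanning the full string; \1 ensures the
    # closing tag name matches the opening one. DOTALL lets the inner
    # text span multiple lines.
    match = re.fullmatch(r"<(\w+)>\s*(.*?)\s*</\1>", stripped, flags=re.DOTALL)
    return match.group(2) if match else stripped
```

Applied just before the Lambda returns its response, this would keep the stray tags out of both the text channel and the SSML sent to Polly.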

@chanbi-aws

Any updates on this? Using anthropic.claude-instant-v1 works, but the results are not as good as with Claude v2.

@rstrahan
Contributor

@michaelin-96 Are you able to send a PR with a fix this week? If not, I'll try to find someone else to look at it. Tx.

@michaelin-96
Member

@rstrahan Hey Bob, I don't have the bandwidth to look at this currently. Sorry.

@PercDev23

Just curious whether adding Claude 3 support would also potentially resolve the tag-wrapping issue. Very interested in using this with Haiku.
