Client Bot returns error "Sorry, I had an error handling this conversation." #694

Closed

chanbi-aws opened this issue Feb 20, 2024 · 3 comments

@chanbi-aws

Describe the bug
Client Bot returns error "Sorry, I had an error handling this conversation." after asking a question using the voice recorder. The browser console error generated is:
"converser error: DependencyFailedException: Invalid Bot Configuration: Encountered an exception when making Amazon Polly service call, Invalid SSML request (Service: AmazonPolly; Status Code: 400; Error Code: InvalidSsmlException; Request ID: 0ab791ed-ce73-456e-8446-xxxxxxxx; Proxy: null)".

I can't find any documentation on where this is configured. I followed the steps here: https://docs.aws.amazon.com/solutions/latest/qnabot-on-aws/using-ssml-to-control-speech-synthesis.html

But testing the question "What is Q and A Bot?" using the recorder also returns the same error.

To Reproduce
Launch the client bot and ask a question via the voice recorder.

Expected behavior
Not sure. I am assuming a Polly voice response based on the output of the fulfillment Lambda function, which, checking the CloudWatch logs, returns the correct content for the question asked. It seems to be failing at the point where the response is converted to SSML for processing by Polly. Can we also disable this and return a text response when asked over voice?
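For reference, here is a minimal boto3 sketch (an illustration only, not QnABot's actual code) of the kind of Polly call where an InvalidSsmlException like the one above surfaces when the text handed over isn't valid SSML:

```python
# Illustration only: a direct Polly call with SSML, outside QnABot.
# If `text` contains stray tags that are not valid SSML, Polly raises
# InvalidSsmlException, which shows up in the client as the
# DependencyFailedException quoted above.
import boto3

polly = boto3.client("polly", region_name="us-east-1")

# Valid SSML must be wrapped in a <speak> element.
text = "<speak>What is Q and A Bot?</speak>"

response = polly.synthesize_speech(
    Text=text,
    TextType="ssml",      # tell Polly to parse the text as SSML
    OutputFormat="mp3",
    VoiceId="Joanna",
)

with open("answer.mp3", "wb") as f:
    f.write(response["AudioStream"].read())
```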

Please complete the following information about the solution:

  • Version: [v5.4.5]

To get the version of the solution, you can look at the description of the created CloudFormation stack. For example, "(SO0189) QnABot [...] v0.0.1".

  • Region: [us-east-1]
  • Was the solution modified from the version published on this repository? No
  • If the answer to the previous question was yes, are the changes available on GitHub?
  • Have you checked your service quotas for the services this solution uses? Yes
  • Were there any errors in the CloudWatch Logs? No

Screenshots
If applicable, add screenshots to help explain your problem (please DO NOT include sensitive information).

Additional context
Add any other context about the problem here.

chanbi-aws added the bug label on Feb 20, 2024
@dougtoppin

@chanbi-aws thanks for your report, we will investigate and get back to you

@michaelin-96
Member

Hi @chanbi-aws, I haven't been able to replicate this issue on v5.5.0 (latest). I was able to ask "What is Q and A Bot" and get a response back via Polly as intended. Let me know if upgrading to the latest version fixes this issue!

@michaelin-96
Member

michaelin-96 commented Feb 27, 2024

After discussion with @chanbi-aws & @bobpskier, this seems to be a bug in the QnABot Plugins. It is occurring because the LLM (Bedrock) response comes back wrapped in tags, which breaks any Polly-based response with an 'invalid SSML' error. I can confirm this is happening on the anthropic.claude-v2 model. A ticket has already been created. My recommendation until it's fixed is to use a different model (for example, anthropic.claude-instant-v1) whose responses still work with the voice output.
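For anyone hitting this before the fix lands, here is a rough sketch of the kind of pre-processing (for example, in a Lambda hook) that would avoid the invalid SSML. This is an illustration of the idea only, not the actual QnABot patch, and the `<answer>` wrapper tag is just an example of what the model might emit:

```python
# Hypothetical workaround sketch (not the QnABot fix): strip any XML-style
# tags the LLM wraps around its answer before the text reaches the
# SSML/Polly path, so only plain text is wrapped in <speak>.
import re
from xml.sax.saxutils import escape

def sanitize_for_ssml(llm_answer: str) -> str:
    # Remove wrapper tags such as <answer>...</answer> emitted by the model.
    plain = re.sub(r"</?[^>]+>", "", llm_answer).strip()
    # Escape characters that would otherwise break SSML parsing.
    return f"<speak>{escape(plain)}</speak>"

print(sanitize_for_ssml("<answer>QnABot is a conversational bot & FAQ tool.</answer>"))
# -> <speak>QnABot is a conversational bot &amp; FAQ tool.</speak>
```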
