Do not allow blank empty content to be sent to Anthropic Claude LLM #1406

Open
mike-r-mclaughlin opened this issue Jan 23, 2025 · 1 comment
Labels
bug Something isn't working

Comments

@mike-r-mclaughlin
Contributor

Description

In some cases, a message with empty content is sent to the LLM, but Anthropic's Claude API does not allow that. It returns the error "messages.12: all messages must have non-empty content except for the optional final assistant message".

2025-01-21 15:40:05,457 - ERROR livekit.agents.pipeline - Error in _stream_synthesis_task
Traceback (most recent call last):
  File "/Users/noah/livekit-cartesia-claude-deepgram/venv/lib/python3.13/site-packages/livekit/agents/utils/log.py", line 16, in async_fn_logs
    return await fn(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/noah/livekit-cartesia-claude-deepgram/venv/lib/python3.13/site-packages/livekit/agents/pipeline/agent_output.py", line 273, in _stream_synthesis_task
    async for seg in tts_source:
    ...<9 lines>...
        tts_stream.push_text(seg)
  File "/Users/noah/livekit-cartesia-claude-deepgram/venv/lib/python3.13/site-packages/livekit/agents/utils/aio/itertools.py", line 47, in tee_peer
    item = await iterator.__anext__()
           ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/noah/livekit-cartesia-claude-deepgram/venv/lib/python3.13/site-packages/livekit/agents/pipeline/pipeline_agent.py", line 1055, in _llm_stream_to_str_generator
    async for chunk in stream:
    ...<7 lines>...
        yield content
  File "/Users/noah/livekit-cartesia-claude-deepgram/venv/lib/python3.13/site-packages/livekit/agents/llm/llm.py", line 239, in __anext__
    raise exc from None
  File "/Users/noah/livekit-cartesia-claude-deepgram/venv/lib/python3.13/site-packages/livekit/agents/llm/llm.py", line 149, in _main_task
    return await self._run()
           ^^^^^^^^^^^^^^^^^
  File "/Users/noah/livekit-cartesia-claude-deepgram/venv/lib/python3.13/site-packages/livekit/plugins/anthropic/llm.py", line 234, in _run
    raise APIStatusError(
    ...<4 lines>...
    )
livekit.agents._exceptions.APIStatusError: Error code: 400 - {'type': 'error', 'error': {'type': 'invalid_request_error', 'message': 'messages.12: all messages must have non-empty content except for the optional final assistant message'}} {"pid": 9123, "job_id": "AJ_zFDs8T7P9A59"}
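
Until a fix lands in the pipeline, one workaround is to strip empty messages from the chat history before the request reaches Anthropic. The sketch below is illustrative only: it calls the Anthropic Python SDK directly rather than going through the livekit-agents pipeline, the `sanitize_messages` helper and the sample history are hypothetical, and it only handles plain string content.

```python
# Hypothetical guard: drop messages with empty content before calling Claude.
# `sanitize_messages` is illustrative and not part of livekit-agents.
import anthropic


def sanitize_messages(messages: list[dict]) -> list[dict]:
    """Remove messages whose content is empty or whitespace-only.

    Anthropic rejects requests where any message (other than an optional
    final assistant message) has empty content, so filtering them out
    avoids the 400 invalid_request_error shown in the traceback above.
    Only the simple string-content case is handled here.
    """
    return [
        m for m in messages
        if isinstance(m.get("content"), str) and m["content"].strip()
    ]


client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment

chat_history = [
    {"role": "user", "content": "Hello!"},
    {"role": "assistant", "content": ""},  # e.g. an empty turn produced from background noise
    {"role": "user", "content": "Can you hear me?"},
]

response = client.messages.create(
    model="claude-3-5-sonnet-20241022",
    max_tokens=256,
    messages=sanitize_messages(chat_history),
)
print(response.content[0].text)
```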

How to reproduce

The error seems to be easier to reproduce when there is a lot of background noise. See details in this Slack thread.

mike-r-mclaughlin added the bug label on Jan 23, 2025
@davidzhao
Member

fixed by #1410
