
token limit issues: groq 70b running into rate limit issues even when using your example prompt #1

Open
Treek2345 opened this issue Jun 27, 2024 · 1 comment


@Treek2345

AI Thought Bubble - Next Action:

Thought: I need to read more content from other relevant websites and articles to gather more information about the topic.

Action: Read website content

Action Input: {"website_url": "https://www.forbes.com/sites/roberthart/2024/05/28/elon-musk-is-feuding-with-ai-godfather-yann-lecun-again-heres-why/"}

RateLimitError: Error code: 429 - {'error': {'message': 'Rate limit reached for model llama3-70b-8192 in organization org_01hrx3emwtett8bq1cyh7w230q on tokens per minute (TPM): Limit 6000, Used 0, Requested 6194. Please try again in 1.94s. Visit https://console.groq.com/docs/rate-limits for more information.', 'type': 'tokens', 'code': 'rate_limit_exceeded'}}
Traceback:
File "/usr/local/lib/python3.11/site-packages/streamlit/runtime/scriptrunner/script_runner.py", line 600, in _run_script
    exec(code, module.__dict__)
File "/app/app.py", line 52, in <module>
    result = melody_crew.run()
File "/app/crew.py", line 59, in run
    return crew.kickoff()
File "/usr/local/lib/python3.11/site-packages/crewai/crew.py", line 252, in kickoff
    result = self._run_sequential_process()
File "/usr/local/lib/python3.11/site-packages/crewai/crew.py", line 293, in _run_sequential_process
    output = task.execute(context=task_output)
File "/usr/local/lib/python3.11/site-packages/crewai/task.py", line 173, in execute
    result = self._execute(
File "/usr/local/lib/python3.11/site-packages/crewai/task.py", line 182, in _execute
    result = agent.execute_task(
File "/usr/local/lib/python3.11/site-packages/crewai/agent.py", line 221, in execute_task
    result = self.agent_executor.invoke(
File "/usr/local/lib/python3.11/site-packages/langchain/chains/base.py", line 163, in invoke
    raise e
File "/usr/local/lib/python3.11/site-packages/langchain/chains/base.py", line 153, in invoke
    self._call(inputs, run_manager=run_manager)
File "/usr/local/lib/python3.11/site-packages/crewai/agents/executor.py", line 124, in _call
    next_step_output = self._take_next_step(
File "/usr/local/lib/python3.11/site-packages/langchain/agents/agent.py", line 1138, in _take_next_step
File "/usr/local/lib/python3.11/site-packages/langchain/agents/agent.py", line 1138, in <listcomp>
File "/usr/local/lib/python3.11/site-packages/crewai/agents/executor.py", line 186, in _iter_next_step
    output = self.agent.plan(
File "/usr/local/lib/python3.11/site-packages/langchain/agents/agent.py", line 397, in plan
    for chunk in self.runnable.stream(inputs, config={"callbacks": callbacks}):
File "/usr/local/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 2875, in stream
    yield from self.transform(iter([input]), config, **kwargs)
File "/usr/local/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 2862, in transform
    yield from self._transform_stream_with_config(
File "/usr/local/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 1880, in _transform_stream_with_config
    chunk: Output = context.run(next, iterator)  # type: ignore
File "/usr/local/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 2826, in _transform
    for output in final_pipeline:
File "/usr/local/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 1283, in transform
    for chunk in input:
File "/usr/local/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 4728, in transform
    yield from self.bound.transform(
File "/usr/local/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 1300, in transform
    yield from self.stream(final, config, **kwargs)
File "/usr/local/lib/python3.11/site-packages/langchain_core/language_models/chat_models.py", line 249, in stream
    raise e
File "/usr/local/lib/python3.11/site-packages/langchain_core/language_models/chat_models.py", line 229, in stream
    for chunk in self._stream(messages, stop=stop, **kwargs):
File "/usr/local/lib/python3.11/site-packages/langchain_groq/chat_models.py", line 321, in _stream
    for chunk in self.client.create(messages=message_dicts, **params):
File "/usr/local/lib/python3.11/site-packages/groq/resources/chat/completions.py", line 289, in create
    return self._post(
File "/usr/local/lib/python3.11/site-packages/groq/_base_client.py", line 1225, in post
    return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
File "/usr/local/lib/python3.11/site-packages/groq/_base_client.py", line 920, in request
    return self._request(
File "/usr/local/lib/python3.11/site-packages/groq/_base_client.py", line 1003, in _request
    return self._retry_request(
File "/usr/local/lib/python3.11/site-packages/groq/_base_client.py", line 1051, in _retry_request
    return self._request(
File "/usr/local/lib/python3.11/site-packages/groq/_base_client.py", line 1003, in _request
    return self._retry_request(
File "/usr/local/lib/python3.11/site-packages/groq/_base_client.py", line 1051, in _retry_request
    return self._request(
File "/usr/local/lib/python3.11/site-packages/groq/_base_client.py", line 1018, in _request
    raise self._make_status_error_from_response(err.response) from None
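The 429 response above includes a suggested delay ("Please try again in 1.94s"). One common workaround, independent of any CrewAI setting, is to wrap the LLM call in a retry loop that honors the server's hint and falls back to exponential backoff. This is only a sketch under my own assumptions: the `RateLimitError` class and `call_with_retry` helper below are illustrative names, not part of the Groq SDK.

```python
import time

# Illustrative stand-in for a 429-style error carrying the server's retry hint.
class RateLimitError(Exception):
    def __init__(self, retry_after):
        super().__init__(f"rate limit reached, retry in {retry_after}s")
        self.retry_after = retry_after

def call_with_retry(fn, max_attempts=5):
    """Call fn(); on RateLimitError, sleep for the server-suggested delay
    (or a doubling fallback) and try again, up to max_attempts times."""
    delay = 1.0
    for attempt in range(max_attempts):
        try:
            return fn()
        except RateLimitError as err:
            if attempt == max_attempts - 1:
                raise  # out of attempts, surface the error
            time.sleep(err.retry_after if err.retry_after else delay)
            delay *= 2  # exponential fallback when no hint is given

# Simulate a client that rate-limits twice, then succeeds.
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RateLimitError(0.01)  # tiny delay so the demo runs fast
    return "ok"
```

Note this only spaces out retries of a single request; if the prompt itself exceeds the per-minute token budget (here, 6194 requested against a 6000 TPM limit), the request has to be made smaller, not just retried.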

@MichaelisTrofficus
Contributor

Hey! I've reduced the max_rpm; can you try again now?
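For context on what lowering max_rpm does: it caps how many requests the crew issues per minute, so calls are spread out instead of bursting past the provider's limit. A minimal sketch of that mechanism using only the standard library (the `RPMLimiter` class is my own illustration, not CrewAI's implementation):

```python
import time
from collections import deque

class RPMLimiter:
    """Gate calls so that at most max_rpm start within a sliding window."""

    def __init__(self, max_rpm, window=60.0):
        self.max_rpm = max_rpm
        self.window = window
        self.calls = deque()  # timestamps of recent calls

    def acquire(self):
        now = time.monotonic()
        # Drop timestamps that have aged out of the window.
        while self.calls and now - self.calls[0] >= self.window:
            self.calls.popleft()
        if len(self.calls) >= self.max_rpm:
            # Sleep until the oldest call leaves the window.
            time.sleep(self.window - (now - self.calls[0]))
        self.calls.append(time.monotonic())
```

A caveat: a request-per-minute cap helps, but Groq's limit here is on tokens per minute, so a single oversized prompt can still trip it even at a low RPM.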
