With additional inputs and/or save history, loading another Gradio app throws an error. #10312
Open
Labels: bug (Something isn't working)
Describe the bug
When loading another Gradio app that uses additional inputs and/or save history, the load throws an error. Everything works fine with those options when Gradio 1 is used directly, but we cannot just use Gradio 1 directly; there is a use case that requires getting it to work through loading.
Save history probably has implementation bugs.
I suspect the additional inputs are sent out of order, or that not all of them are being sent.
Have you searched existing issues? 🔎
Reproduction
Gradio 1
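The sketch below is only an illustration of the kind of upstream app described above: a ChatInterface with additional inputs and save_history=True that calls the Hugging Face Inference API. The model id, slider settings, and function names are assumptions, not the reporter's actual code.

```python
# Gradio 1 (upstream app) - minimal sketch, not the reporter's code.
import gradio as gr
from huggingface_hub import InferenceClient

client = InferenceClient("HuggingFaceH4/zephyr-7b-beta")  # placeholder model id


def respond(message, history, system_message, max_tokens, temperature, top_p):
    # With type="messages", history arrives as OpenAI-style role/content dicts.
    messages = [{"role": "system", "content": system_message}]
    for m in history:
        messages.append({"role": m["role"], "content": m["content"]})
    messages.append({"role": "user", "content": message})
    result = client.chat_completion(
        messages,
        max_tokens=max_tokens,
        temperature=temperature,
        top_p=top_p,
    )
    return result.choices[0].message.content


demo = gr.ChatInterface(
    respond,
    type="messages",
    additional_inputs=[
        gr.Textbox(value="You are a helpful assistant.", label="System message"),
        gr.Slider(1, 2048, value=512, step=1, label="Max new tokens"),
        gr.Slider(0.1, 4.0, value=0.7, step=0.1, label="Temperature"),
        gr.Slider(0.1, 1.0, value=0.95, step=0.05, label="Top-p"),
    ],
    save_history=True,
)

if __name__ == "__main__":
    demo.launch()
```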
Gradio 2
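And a sketch of the downstream app, assuming Gradio 1 is published as a Space and mounted with gr.load; the Space name is a placeholder.

```python
# Gradio 2 (downstream app) - minimal sketch; the Space name is a placeholder.
import gradio as gr

# Load the upstream Space (Gradio 1) and re-serve it locally.
demo = gr.load("username/gradio-1-chat", src="spaces")

if __name__ == "__main__":
    demo.launch()
```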
Screenshot
Error 1: with just save_history=True
Error 2: with additional_inputs
Logs
Error 2: with additional_inputs
To create a public link, set share=True in launch().
Traceback (most recent call last):
File "/usr/local/lib/python3.10/site-packages/gradio/queueing.py", line 625, in process_events
response = await route_utils.call_process_api(
File "/usr/local/lib/python3.10/site-packages/gradio/route_utils.py", line 322, in call_process_api
output = await app.get_blocks().process_api(
File "/usr/local/lib/python3.10/site-packages/gradio/blocks.py", line 2045, in process_api
result = await self.call_function(
File "/usr/local/lib/python3.10/site-packages/gradio/blocks.py", line 1592, in call_function
prediction = await anyio.to_thread.run_sync( # type: ignore
File "/usr/local/lib/python3.10/site-packages/anyio/to_thread.py", line 56, in run_sync
return await get_async_backend().run_sync_in_worker_thread(
File "/usr/local/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 2461, in run_sync_in_worker_thread
return await future
File "/usr/local/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 962, in run
result = context.run(func, *args)
File "/usr/local/lib/python3.10/site-packages/gradio/utils.py", line 870, in wrapper
response = f(*args, **kwargs)
File "/usr/local/lib/python3.10/site-packages/gradio_client/client.py", line 1135, in _inner
predictions = _predict(*data)
File "/usr/local/lib/python3.10/site-packages/gradio_client/client.py", line 1252, in _predict
raise AppError(
gradio_client.exceptions.AppError: The upstream Gradio app has raised an exception: 422 Client Error: Unprocessable Entity for url: https://api-inference.huggingface.co/models/teset/model/v1/chat/completions (Request ID: 0Jb3ydd95jXN73jD6d7Kr)
Failed to deserialize the JSON body into the target type: temperature: invalid type: sequence, expected f32 at line 1 column 2443
Severity
Blocking usage of gradio