
With additional inputs and/or save history, loading another Gradio app throws an error. #10312

Open
djaffer opened this issue Jan 8, 2025 · 1 comment · May be fixed by #10324
Labels
bug Something isn't working


djaffer commented Jan 8, 2025

Describe the bug

With additional inputs and save history enabled, loading another Gradio app throws an error; the original app itself works fine with those options. We cannot just use Gradio 1 directly; there is a use case that requires us to load it this way.

Save history probably has implementation bugs.

My guess is that the additional inputs are passed out of order, or that not all of them are being sent.
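One way to check that hypothesis, as a rough sketch (the Space id below is a placeholder, not taken from this report), is to inspect the upstream Space's endpoint signatures with gradio_client and compare them with what the loaded app actually sends:

from gradio_client import Client

# Placeholder Space id for the first app; replace with the real one.
client = Client("username/gradio-1")

# Prints each API endpoint with its named parameters and expected types,
# which makes it easier to spot whether e.g. the chat history is being
# routed into a slot that expects a float such as temperature.
client.view_api(all_endpoints=True)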

Have you searched existing issues? 🔎

  • I have searched and found no existing issues

Reproduction

Gradio 1 (the upstream app / Space)

import gradio as gr

with gr.Blocks() as demo:
    chat = gr.ChatInterface(
        predict,  # chat function, defined elsewhere in the app
        type="messages",
        additional_inputs=[
            gr.Slider(1, 2048, value=512, step=1, label="Max tokens", render=False),
            gr.Slider(0.1, 4.0, value=0.9, step=0.1, label="Temperature", render=False),
            gr.Slider(0.1, 1.0, value=0.95, step=0.05, label="Top p", render=False),
        ],
        editable=True,
        save_history=True,
    )

demo.queue().launch()
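The predict function referenced above is not included in the report; a minimal stand-in (the signature below is an assumption based on the listed additional_inputs, not the author's actual code) would look like:

# Hypothetical chat function: gr.ChatInterface calls it with the new message,
# the chat history, and then the additional_inputs in the order they are listed.
def predict(message, history, max_tokens, temperature, top_p):
    # A real implementation would call a model; this just echoes the settings.
    return f"echo: {message} (max_tokens={max_tokens}, temperature={temperature}, top_p={top_p})"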

Gradio 2 (loads Gradio 1 via gr.load)

demo = gr.load("gradio-1", src="spaces")  # "gradio-1" is a placeholder for the first app's Space id
demo.launch(show_api=False, show_error=True)


Logs

Error 1: with just save_history=True


Loaded as API: https://usernamename-space.hf.space ✔
Traceback (most recent call last):
  File "/home/user/app/app.py", line 7, in <module>
    demo = gr.load(model, src="spaces", token=token)
  File "/usr/local/lib/python3.10/site-packages/gradio/external.py", line 92, in load
    return load_blocks_from_huggingface(
  File "/usr/local/lib/python3.10/site-packages/gradio/external.py", line 168, in load_blocks_from_huggingface
    blocks: gradio.Blocks = factory_methods[src](name, hf_token, alias, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/gradio/external.py", line 507, in from_spaces
    return from_spaces_blocks(space=space_name, hf_token=hf_token)
  File "/usr/local/lib/python3.10/site-packages/gradio/external.py", line 538, in from_spaces_blocks
    return gradio.Blocks.from_config(client.config, predict_fns, client.src)  # type: ignore
  File "/usr/local/lib/python3.10/site-packages/gradio/blocks.py", line 1320, in from_config
    targets = [
  File "/usr/local/lib/python3.10/site-packages/gradio/blocks.py", line 1322, in <listcomp>
    original_mapping[
KeyError: 6

Error 2: with additional inputs

To create a public link, set share=True in launch().
Traceback (most recent call last):
  File "/usr/local/lib/python3.10/site-packages/gradio/queueing.py", line 625, in process_events
    response = await route_utils.call_process_api(
  File "/usr/local/lib/python3.10/site-packages/gradio/route_utils.py", line 322, in call_process_api
    output = await app.get_blocks().process_api(
  File "/usr/local/lib/python3.10/site-packages/gradio/blocks.py", line 2045, in process_api
    result = await self.call_function(
  File "/usr/local/lib/python3.10/site-packages/gradio/blocks.py", line 1592, in call_function
    prediction = await anyio.to_thread.run_sync( # type: ignore
  File "/usr/local/lib/python3.10/site-packages/anyio/to_thread.py", line 56, in run_sync
    return await get_async_backend().run_sync_in_worker_thread(
  File "/usr/local/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 2461, in run_sync_in_worker_thread
    return await future
  File "/usr/local/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 962, in run
    result = context.run(func, *args)
  File "/usr/local/lib/python3.10/site-packages/gradio/utils.py", line 870, in wrapper
    response = f(*args, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/gradio_client/client.py", line 1135, in _inner
    predictions = _predict(*data)
  File "/usr/local/lib/python3.10/site-packages/gradio_client/client.py", line 1252, in _predict
    raise AppError(
gradio_client.exceptions.AppError: The upstream Gradio app has raised an exception: 422 Client Error: Unprocessable Entity for url: https://api-inference.huggingface.co/models/teset/model/v1/chat/completions (Request ID: 0Jb3ydd95jXN73jD6d7Kr)

Failed to deserialize the JSON body into the target type: temperature: invalid type: sequence, expected f32 at line 1 column 2443



System Info

```shell
Gradio 5.10.0
Python 3.10
```

Severity

Blocking usage of gradio

@djaffer djaffer added the bug Something isn't working label Jan 8, 2025
@abidlabs abidlabs added and removed the enhancement (New feature or request) and pending clarification labels Jan 8, 2025

abidlabs commented Jan 8, 2025

Ah okay I see what you mean. Here's a repro:

import gradio as gr

demo = gr.load("abidlabs/load-test-ci", src="spaces")
demo.launch()

@abidlabs abidlabs self-assigned this Jan 8, 2025