feat: add response_format param to OllamaChatGenerator #1282

Open · wants to merge 6 commits into main
Conversation

@Amnah199 (Contributor) commented on Jan 9, 2025

Related Issues

Proposed Changes:

Add a new parameter, response_format, to the OllamaChatGenerator to support structured outputs.

How did you test it?

Updated the tests

Notes for the reviewer

Checklist

github-actions bot added the integration:ollama and type:documentation labels on Jan 9, 2025
Amnah199 marked this pull request as ready for review on January 10, 2025
Amnah199 requested a review from a team as a code owner on January 10, 2025
Amnah199 requested a review from mpangrazzi and removed the review request for the team on January 10, 2025
@anakin87 (Member) commented:

Hey!
Since, after the 2.9.0 release, we will replace (or integrate?) the current implementation of OllamaChatGenerator with this one, could you please check how response_format integrates with the new implementation and with tools?

@anakin87 (Member) left a comment

Nice...

I found some potential improvements.

@@ -26,7 +26,7 @@ classifiers = [
 "Programming Language :: Python :: Implementation :: CPython",
 "Programming Language :: Python :: Implementation :: PyPy",
 ]
-dependencies = ["haystack-ai", "ollama>=0.4.0"]
+dependencies = ["haystack-ai", "ollama>=0.4.0", "pydantic>=2.0.0"]
@anakin87 (Member) commented on Jan 10, 2025:

Ollama python client already depends on pydantic, no? I would not add a direct dependency if not required.

On second thought: no, it might be better to include it. I only think it is not necessary to pin it, because I assume it is already pinned in the ollama python client.

@@ -68,7 +68,7 @@ python = ["3.9", "3.10", "3.11", "3.12"]
 [tool.hatch.envs.lint]
 installer = "uv"
 detached = true
-dependencies = ["pip", "black>=23.1.0", "mypy>=1.0.0", "ruff>=0.0.243"]
+dependencies = ["pip", "black>=23.1.0", "mypy>=1.0.0", "ruff>=0.0.243", "pydantic>=2.0.0"]
@anakin87 (Member) commented:

Why is pydantic needed in the linting environment?

@Amnah199 (Contributor, Author) replied:

Hmm, you are right. It could have been helpful if, in a test, mypy needed to verify that a dictionary structure is valid according to the JsonSchemaValue type definition, but I guess we don't need that.
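(As a side note on the JsonSchemaValue point: below is a minimal illustration, not code from this PR, of why pydantic matters here only for type hints and mypy, assuming pydantic v2's public json_schema module.)

```python
# Illustration only (not part of this PR): in pydantic v2, JsonSchemaValue is a
# plain typing alias, so the direct pydantic dependency serves type hints / mypy,
# not any runtime behavior.
from pydantic.json_schema import JsonSchemaValue  # alias for Dict[str, Any]

schema: JsonSchemaValue = {
    "type": "object",
    "properties": {"answer": {"type": "string"}},
    "required": ["answer"],
}

# At runtime the schema is an ordinary dict; the alias only matters to a type checker.
assert isinstance(schema, dict)
```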

@@ -40,6 +41,7 @@ def __init__(
 timeout: int = 120,
 keep_alive: Optional[Union[float, str]] = None,
 streaming_callback: Optional[Callable[[StreamingChunk], None]] = None,
+response_format: Optional[Union[Literal["", "json"], JsonSchemaValue]] = None,
@anakin87 (Member) commented:
I understand that the type hint comes from the Ollama python client, but it is confusing to me.

I would change it to something more readable, like response_format: Union[None, Literal['json'], JsonSchemaValue] = None

  • default is None (TODO: check if it has the same meaning as an empty string)
  • you can pass "json" if you only want to ensure that the output is valid JSON
  • JsonSchemaValue corresponds to a JSON Schema, passed as a dictionary

I would add a similar but more refined explanation to the response_format docstring above.
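To make these three options concrete, here is a hedged usage sketch of the proposed parameter. It assumes the signature added in this PR and a locally running Ollama server; the model name and the way the reply is printed are illustrative, not prescribed by the PR.

```python
# Sketch of the proposed response_format parameter (assumes this PR's signature
# and a locally running Ollama server; the model name is an assumption).
from haystack.dataclasses import ChatMessage
from haystack_integrations.components.generators.ollama import OllamaChatGenerator

# A JSON Schema passed as a plain dict constrains the shape of the model output.
person_schema = {
    "type": "object",
    "properties": {"name": {"type": "string"}, "age": {"type": "integer"}},
    "required": ["name", "age"],
}

generator = OllamaChatGenerator(
    model="llama3.2",               # assumed model name, for illustration only
    response_format=person_schema,  # or "json" to only require valid JSON output
)

result = generator.run(messages=[ChatMessage.from_user("Tell me about Leo, who is 41.")])
print(result["replies"][0])  # expected: a reply whose content is JSON matching the schema
```

Leaving response_format as None (the default) would keep the output unconstrained, matching the first bullet above once the None vs. empty-string question is settled.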

Labels
integration:ollama, type:documentation
Projects
None yet
Development

Successfully merging this pull request may close these issues.

Add format support for Ollama generators
2 participants