
Request: allowing httpx version 0.28 #356

Closed
jamesbraza opened this issue Dec 2, 2024 · 9 comments

@jamesbraza

https://github.com/encode/httpx/releases/tag/0.28.0 comes with some breaking changes, mainly on httpx.AsyncClient's app argument.

Trying to pull it in, I get blocked by ollama because it downpins httpx: https://github.com/ollama/ollama-python/blob/v0.4.2/pyproject.toml#L13

Can we move httpx = "^0.27.0" to httpx = ">=0.27"?
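For reference, a sketch of what the requested relaxation would look like in ollama-python's pyproject.toml (assuming the Poetry layout the linked file uses):

```toml
[tool.poetry.dependencies]
# Current caret pin: for 0.x versions, "^0.27.0" means ">=0.27.0,<0.28.0",
# which excludes httpx 0.28:
# httpx = "^0.27.0"

# Proposed lower-bound-only constraint, allowing 0.28 and later:
httpx = ">=0.27"
```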

@ParthSareen
Contributor

Hey @jamesbraza, thanks for bringing this up. We pin the version to avoid unintended updates (like the very one you've mentioned). Will check this out a bit more and bump the version!

@simonaubertbd

Hello @jamesbraza. It also causes an incompatibility with JupyterLab.

@jedahan

jedahan commented Dec 18, 2024

Breaks installation of the llm-ollama plugin as well, due to dependency conflicts.

@edspencer

Wouldn't this existing PR address this?: #365

I'm not familiar with Python packaging, but the failing jobs there look like trivial dependency SHA-checking problems that are probably addressable with a quick CLI command. Happy to dig in and try that, but I'm guessing a repo maintainer could get it done in moments. Would love to be able to use the llm-ollama plugin (taketwo/llm-ollama#23).

@aboucaud

Please consider merging the httpx dependency upgrade #365 so users of llm-ollama can continue using Ollama through their favorite CLI.

@atrawog

atrawog commented Jan 19, 2025

Can I kindly ask for an httpx version bump too?

With everyone else starting to require httpx 0.28.x, it's becoming practically impossible to use ollama-python in any moderately sophisticated data-analytics environment.

And I would really hate to nuke my perfectly fine working Ollama setup and move back to Hugging Face Transformers just because of a small version conflict.

@Diegorro98

Now that httpx has been upgraded to 0.28.1, can a release be created so that projects depending on this package can use httpx 0.28.1?

@ParthSareen
Contributor

Yes, cutting a release today! @Diegorro98

@ParthSareen
Contributor

ParthSareen commented Jan 21, 2025

https://github.com/ollama/ollama-python/releases/tag/v0.4.7 🥳

Sorry for the delay everyone! Going to be looking into loosening these deps a bit!


8 participants