Is it possible to use Ollama as a backend instead of ChatGPT? #14

Open
kascesar opened this issue Jan 22, 2025 · 1 comment

Comments

kascesar commented Jan 22, 2025

Hi, I've been using gptel with Ollama. As far as I know, this package is based on gptel, so is it possible to use Ollama as a backend?

I've tried this, but it's not working:

  (use-package magit-gptcommit
    :ensure t
    :demand t
    :after magit
    :bind (:map git-commit-mode-map
                ("C-c C-g" . magit-gptcommit-commit-accept))
    :custom
    (magit-gptcommit-llm-provider
     (make-llm-ollama
      :host "localhost:11434"
      :chat-model "qwen2.5-coder:1.5b"
      :embedding-model "qwen2.5-coder:1.5b"))
    :config
    ;; Enable `magit-gptcommit-mode' to watch staged changes and generate a
    ;; commit message automatically in the magit status buffer. The mode is
    ;; optional; you can also use `magit-gptcommit-generate' to generate a
    ;; commit message manually. Currently `magit-gptcommit-generate' should
    ;; only be run in the magit status buffer.
    ;; (magit-gptcommit-mode 1)

    ;; Add gptcommit transient commands to `magit-commit'.
    ;; Eval (transient-remove-suffix 'magit-commit '(1 -1)) to remove them.
    (magit-gptcommit-status-buffer-setup))
douo (Owner) commented Jan 26, 2025

Of course, it's possible! Could you provide more details about the error you're encountering? If you're using the default configuration, try removing the :host parameter. Additionally, note that :port should be configured separately and cannot be included within :host.
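
For reference, here is a minimal sketch of a provider setup along those lines. It assumes the llm package's Ollama backend, whose defaults (host "localhost", port 11434) already match a local instance, so neither parameter needs to be set explicitly:

  ;; Sketch of a corrected setup: rely on the llm-ollama defaults, or pass
  ;; :port as its own keyword instead of embedding it in :host.
  (require 'llm-ollama)
  (setq magit-gptcommit-llm-provider
        (make-llm-ollama
         :chat-model "qwen2.5-coder:1.5b"
         :embedding-model "qwen2.5-coder:1.5b"))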

I tested the same model locally, and it works fine. However, for better results, you might want to adjust the prompt by modifying the magit-gptcommit-prompt variable.
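
For example, a hedged sketch of overriding the prompt; it assumes the default value of magit-gptcommit-prompt is a format string whose %s receives the staged diff, so check the variable's docstring to confirm before adapting it:

  ;; Assumed format-string contract: %s is replaced with the staged diff.
  (setq magit-gptcommit-prompt
        (concat "Write a concise Conventional Commits message "
                "for the following staged diff. "
                "Respond with the commit message only.\n\n%s"))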
