marcelotournier/ollama-chat-server
ollama-chat-server

A Gradio chat server using Ollama. Good for offline usage!

[Screenshot: chat server]

Instructions

  1. Install Ollama - https://ollama.com/
  2. Install the Python requirements - `pip install -r requirements.txt`
  3. Pull Ollama models locally, chosen according to your available memory and desired performance - `ollama pull <model-name>`
  4. Edit the line defining `MODELS` in `chatserver.py` to list the models you downloaded
  5. Start the server - `python chatserver.py`
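As an illustration of steps 3-5, here is a minimal sketch of what a `chatserver.py` along these lines could look like. This is not the repository's actual code: the `MODELS` values, the use of `gr.ChatInterface`, and the dropdown are assumptions, and it presumes the `gradio` and `ollama` Python packages are installed and an Ollama daemon is running locally.

```python
# Hypothetical sketch of a Gradio chat server backed by Ollama.
# Assumes: `pip install gradio ollama` and a local Ollama daemon.
import gradio as gr
import ollama

# Edit this list to match the models you pulled with `ollama pull`.
# The names below are examples, not the repository's defaults.
MODELS = ["llama3.2", "mistral"]


def build_messages(history, message):
    """Convert Gradio-style (user, assistant) history pairs into
    Ollama's list-of-dicts chat message format."""
    messages = []
    for user_msg, assistant_msg in history:
        messages.append({"role": "user", "content": user_msg})
        messages.append({"role": "assistant", "content": assistant_msg})
    messages.append({"role": "user", "content": message})
    return messages


def chat(message, history, model):
    # Send the full conversation to the selected local model.
    response = ollama.chat(model=model, messages=build_messages(history, message))
    return response["message"]["content"]


demo = gr.ChatInterface(
    chat,
    additional_inputs=[gr.Dropdown(MODELS, value=MODELS[0], label="Model")],
)

if __name__ == "__main__":
    demo.launch()  # serves the chat UI on http://127.0.0.1:7860 by default
```

Run it with `python chatserver.py` and open the printed local URL in a browser; every downloaded model in `MODELS` becomes selectable in the dropdown.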

