A Gradio chat server using Ollama. Good for offline usage!
- Install Ollama: https://ollama.com/
- Install the requirements:

  ```bash
  pip install -r requirements.txt
  ```
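  If you want to sanity-check the environment before going offline, the two key imports should resolve cleanly (this assumes the requirements include the `gradio` and `ollama` packages; check `requirements.txt` for the actual list):

  ```bash
  python -c "import gradio, ollama"
  ```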
- Download Ollama models locally, choosing according to available memory and desired performance:

  ```bash
  ollama pull <model-name>
  ```
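  For example, to pull one small and one mid-sized model (the names here are only illustrations; any models from the Ollama library work) and confirm they are available locally:

  ```bash
  ollama pull llama3.2
  ollama pull mistral
  ollama list   # shows everything downloaded locally
  ```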
- Edit the line that defines `MODELS` in `chatserver.py` so it lists the models you downloaded.
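  The exact format depends on the script; assuming it keeps the model names in a plain Python list, it might look roughly like this (model names are illustrative):

  ```python
  # In chatserver.py -- replace with the models you actually pulled (illustrative)
  MODELS = ["llama3.2", "mistral"]
  ```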
- Start the server:

  ```bash
  python chatserver.py
  ```
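For orientation, here is a minimal sketch of how a Gradio chat UI typically wires into a local Ollama instance. It is not the project's `chatserver.py`; the `MODELS` list, the model names, and the use of the official `ollama` Python client are assumptions for illustration only.

```python
"""Minimal sketch of a Gradio chat UI backed by a local Ollama server (illustrative)."""
import gradio as gr
import ollama

# Models pulled locally with `ollama pull <model-name>`.
# Adjust to match the output of `ollama list`.
MODELS = ["llama3.2", "mistral"]


def respond(message, history, model):
    """Send the running conversation to Ollama and stream back the reply."""
    # Convert Gradio's default [user, assistant] history pairs into
    # Ollama's role/content message format.
    messages = []
    for user_msg, assistant_msg in history:
        messages.append({"role": "user", "content": user_msg})
        if assistant_msg:
            messages.append({"role": "assistant", "content": assistant_msg})
    messages.append({"role": "user", "content": message})

    # Stream tokens so the UI updates while the model generates.
    reply = ""
    for chunk in ollama.chat(model=model, messages=messages, stream=True):
        reply += chunk["message"]["content"]
        yield reply


demo = gr.ChatInterface(
    respond,
    additional_inputs=[gr.Dropdown(choices=MODELS, value=MODELS[0], label="Model")],
    title="Local Ollama chat",
)

if __name__ == "__main__":
    demo.launch()
```

Once the server is up, Gradio prints a local URL (by default http://127.0.0.1:7860) that you can open in a browser.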