R Shiny Interface for Chatting with LLMs Offline via Ollama
Experience seamless, private, and offline AI conversations right on your machine! shiny.ollama
provides a user-friendly R Shiny interface to interact with LLMs locally, powered by Ollama.
Important: shiny.ollama requires Ollama to be installed on your system. Without it, this package will not function. Follow the Installation Guide below to set up Ollama first.
CRAN Version (0.1.1):
- Core functionality for offline LLM interaction
- Basic model selection and chat interface
- Chat history export capabilities

Development Version (0.1.2):
- All features from 0.1.1
- Better UI/UX
- Advanced parameter customization
- Enhanced user control over model behavior

Install the CRAN version with:
install.packages("shiny.ollama")
Or install the development version from GitHub:
# Install devtools if not already installed
install.packages("devtools")
devtools::install_github("ineelhere/shiny.ollama")
Launch the Shiny app in R with:
library(shiny.ollama)
# Start the application
shiny.ollama::run_app()
- Fully Offline: No internet required for complete privacy
- Model Selection: Easily choose from available LLM models
- Message Input: Engage in conversations with AI
- Save & Download Chats: Export your chat history
- User-Friendly Interface: Powered by R Shiny
Customize your LLM interaction with adjustable parameters:
- Temperature control
- Context window size
- Top K sampling
- Top P sampling
- System instructions customization
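Under the hood, these settings correspond to options in Ollama's local HTTP API. The sketch below is illustrative only (not shiny.ollama's internal code): it assumes Ollama is running on its default port 11434, that the `httr` and `jsonlite` packages are installed, and that a model such as `deepseek-r1` has already been pulled.

```r
library(httr)
library(jsonlite)

# Build a request mirroring the adjustable parameters above
body <- list(
  model  = "deepseek-r1",
  prompt = "Explain what an LLM is in one sentence.",
  system = "You are a concise assistant.",  # system instructions
  stream = FALSE,
  options = list(
    temperature = 0.7,   # temperature control
    num_ctx     = 4096,  # context window size
    top_k       = 40,    # top K sampling
    top_p       = 0.9    # top P sampling
  )
)

resp <- POST(
  "http://localhost:11434/api/generate",
  body = toJSON(body, auto_unbox = TRUE),
  content_type_json()
)

# The generated text is returned in the "response" field
content(resp)$response
```

The shiny.ollama interface exposes these same knobs through the UI, so you normally never need to issue requests by hand.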
To use this package, install Ollama first:
- Download Ollama from the official website, ollama.com (Mac, Windows, and Linux supported)
- Install it by following the provided instructions
- Verify your installation:
ollama --version
If successful, the version number will be displayed
- Pull a model (e.g., deepseek-r1) to get started
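The verification and model-pull steps above look like this in a terminal (assuming Ollama is installed and its background service is running):

```shell
# Confirm the installation
ollama --version

# Download a model, then confirm it is available locally
ollama pull deepseek-r1
ollama list
```

Any model pulled this way will appear in the app's model selection dropdown.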
This R package is an independent, passion-driven open-source initiative, released under the Apache License 2.0. It is not affiliated with, owned by, funded by, or influenced by any external organization. The project is dedicated to fostering a community of developers who share a love for coding and collaborative innovation.
Contributions, feedback, and feature requests are always welcome!
Stay tuned for more updates!