
Chat offline with open-source LLMs like deepseek-r1, nemotron, qwen, llama, and more, all through a simple R package powered by Shiny and Ollama. 🚀


shiny.ollama


R Shiny Interface for Chatting with LLMs Offline via Ollama

Experience seamless, private, and offline AI conversations right on your machine! shiny.ollama provides a user-friendly R Shiny interface to interact with LLMs locally, powered by Ollama.


โš ๏ธ Disclaimer

Important: shiny.ollama requires Ollama to be installed on your system; without it, this package will not function. Follow the How to Install Ollama guide below to set it up first.

Version Information

  • CRAN Version (0.1.1):

    • Core functionality for offline LLM interaction
    • Basic model selection and chat interface
    • Chat history export capabilities
  • Development Version (0.1.2):

    • All features from 0.1.1
    • Better UI/UX
    • Advanced parameter customization
    • Enhanced user control over model behavior

Installation

From CRAN (Stable Version - 0.1.1)

install.packages("shiny.ollama")

From GitHub (Latest Development Version - 0.1.2)

# Install devtools if not already installed
if (!requireNamespace("devtools", quietly = TRUE)) install.packages("devtools")

devtools::install_github("ineelhere/shiny.ollama")

Quick Start

Launch the Shiny app in R with:

library(shiny.ollama)

# Start the application
shiny.ollama::run_app()
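
Because the app talks to a locally running Ollama server, a quick way to confirm the server is reachable (and see which models it already has) is to query Ollama's /api/tags endpoint. A minimal sketch, assuming the default port 11434 and the httr2 package (neither is part of shiny.ollama's API):

library(httr2)

# List locally available models via Ollama's /api/tags endpoint
# (assumes Ollama is running on its default port, 11434)
tags <- request("http://localhost:11434/api/tags") |>
  req_perform() |>
  resp_body_json()

# Extract the model names, e.g. "deepseek-r1:latest"
vapply(tags$models, function(m) m$name, character(1))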

Features

Core Features (All Versions)

  • Fully Offline: No internet required – complete privacy
  • Model Selection: Easily choose from available LLM models
  • Message Input: Engage in conversations with AI
  • Save & Download Chats: Export your chat history
  • User-Friendly Interface: Powered by R Shiny

Advanced Features (Development Version 0.1.2)

Customize your LLM interaction with adjustable parameters (a sketch of what these map to in Ollama's API follows this list):

  • Temperature control
  • Context window size
  • Top K sampling
  • Top P sampling
  • System instructions customization
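
For reference, these controls correspond to standard fields in Ollama's local HTTP API. Below is a minimal sketch of a single request, assuming the default port 11434 and the httr2 package; the app exposes the same knobs through its UI, so this is illustration rather than required usage:

library(httr2)

# One-shot completion against Ollama's /api/generate endpoint,
# with the same parameters the app exposes as UI controls
resp <- request("http://localhost:11434/api/generate") |>
  req_body_json(list(
    model  = "deepseek-r1",
    prompt = "Summarise what R's lapply() does.",
    system = "You are a concise R tutor.",  # system instructions
    stream = FALSE,
    options = list(
      temperature = 0.7,   # higher = more random output
      num_ctx     = 4096,  # context window size, in tokens
      top_k       = 40,    # Top K sampling
      top_p       = 0.9    # Top P (nucleus) sampling
    )
  )) |>
  req_perform()

resp_body_json(resp)$response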

How to Install Ollama

To use this package, install Ollama first:

  1. Download Ollama from https://ollama.com (macOS, Windows, and Linux are supported)
  2. Install it by following the provided instructions
  3. Verify your installation:
ollama --version

If successful, the version number will be displayed.

  4. Pull a model (e.g., deepseek-r1) to get started
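
For example, using Ollama's standard pull command:

ollama pull deepseek-r1

Once the download finishes, the model should appear in the app's model selector.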

License and Declaration

This R package is an independent, passion-driven open-source initiative, released under the Apache License 2.0. It is not affiliated with, owned by, funded by, or influenced by any external organization. The project is dedicated to fostering a community of developers who share a love for coding and collaborative innovation.

Contributions, feedback, and feature requests are always welcome!

Stay tuned for more updates. 🚀
