Package: shiny.ollama 0.1.2
shiny.ollama: R 'shiny' Interface for Chatting with Large Language Models Offline on Your Local Machine with 'ollama'
Chat with large language models such as 'deepseek-r1', 'nemotron', 'llama', 'qwen' and many more on your own machine, without an internet connection and with complete privacy, via 'ollama', powered by an R 'shiny' interface. For more information on 'ollama', visit <https://ollama.com>.
Authors: Indraneel Chakraborty
shiny.ollama_0.1.2.tar.gz
shiny.ollama_0.1.2.zip (r-4.5), shiny.ollama_0.1.2.zip (r-4.4), shiny.ollama_0.1.2.zip (r-4.3)
shiny.ollama_0.1.2.tgz (r-4.5-any), shiny.ollama_0.1.2.tgz (r-4.4-any), shiny.ollama_0.1.2.tgz (r-4.3-any)
shiny.ollama_0.1.2.tar.gz (r-4.5-noble), shiny.ollama_0.1.2.tar.gz (r-4.4-noble)
shiny.ollama_0.1.2.tgz (r-4.4-emscripten), shiny.ollama_0.1.2.tgz (r-4.3-emscripten)
shiny.ollama.pdf | shiny.ollama.html
shiny.ollama/json (API)
# Install 'shiny.ollama' in R:
install.packages('shiny.ollama', repos = c('https://ineelhere.r-universe.dev', 'https://cloud.r-project.org'))
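After installation, the chat interface is started with the package's single exported function, run_app() (documented below under "Run shiny Application for Chat Interface"). A minimal launch sketch, assuming run_app() can be called without arguments and that a local 'ollama' server is already running:

# Launch the chat interface (requires a running local 'ollama' server)
library(shiny.ollama)
run_app()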
Bug tracker: https://github.com/ineelhere/shiny.ollama/issues
Pkgdown site: https://www.indraneelchakraborty.com
deepseek-r1, llama3, llm, local-llm, offline-first, offline-llm, ollama, ollama-app, ollama-gui, shiny, shinyapp
Last updated 2 hours ago from: a73e16117a. Checks: 9 OK. Indexed: yes.
Target | Result | Latest binary |
---|---|---|
Doc / Vignettes | OK | Mar 12 2025 |
R-4.5-win | OK | Mar 12 2025 |
R-4.5-mac | OK | Mar 12 2025 |
R-4.5-linux | OK | Mar 12 2025 |
R-4.4-win | OK | Mar 12 2025 |
R-4.4-mac | OK | Mar 12 2025 |
R-4.4-linux | OK | Mar 12 2025 |
R-4.3-win | OK | Mar 12 2025 |
R-4.3-mac | OK | Mar 12 2025 |
Exports: run_app
Dependencies: askpass, base64enc, brio, bslib, cachem, callr, cli, commonmark, crayon, curl, desc, diffobj, digest, evaluate, fastmap, fontawesome, fs, glue, htmltools, httpuv, httr, jquerylib, jsonlite, later, lifecycle, magrittr, markdown, memoise, mime, mockery, openssl, pkgbuild, pkgload, praise, processx, promises, ps, R6, rappdirs, Rcpp, rlang, rprojroot, sass, shiny, sourcetools, sys, testthat, waldo, withr, xfun, xtable
Readme and manuals
Help Manual
Help page | Topics |
---|---|
Check if Ollama is running | check_ollama |
Fetch available models from Ollama API | fetch_models |
Convert chat history to downloadable format | format_chat_history |
Format a message as markdown | format_message_md |
Parse a markdown-formatted message | parse_message |
Run shiny Application for Chat Interface | run_app |
Send message to Ollama API and get response | send_ollama_message |
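The documented helpers above (check_ollama, fetch_models, send_ollama_message) talk to a local 'ollama' server over its REST API. Below is a minimal sketch of how such helpers could be written with httr and jsonlite (both listed as dependencies); the base URL, endpoints, and function names here are assumptions based on the public 'ollama' API, not the package's actual implementation.

# Sketch only: illustrative equivalents of check_ollama(), fetch_models(),
# and send_ollama_message(); the package's own code may differ.
library(httr)
library(jsonlite)

ollama_base_url <- "http://localhost:11434"  # default local 'ollama' address (assumption)

# Is the local 'ollama' server reachable?
check_ollama_sketch <- function() {
  res <- tryCatch(GET(ollama_base_url), error = function(e) NULL)
  !is.null(res) && status_code(res) == 200
}

# List locally available model names via GET /api/tags
fetch_models_sketch <- function() {
  res <- GET(paste0(ollama_base_url, "/api/tags"))
  stop_for_status(res)
  models <- fromJSON(content(res, as = "text", encoding = "UTF-8"))$models
  models$name
}

# Send a single user message to a model via POST /api/chat (non-streaming)
send_message_sketch <- function(model, message) {
  body <- list(
    model = model,
    messages = list(list(role = "user", content = message)),
    stream = FALSE
  )
  res <- POST(
    paste0(ollama_base_url, "/api/chat"),
    body = toJSON(body, auto_unbox = TRUE),
    content_type_json()
  )
  stop_for_status(res)
  fromJSON(content(res, as = "text", encoding = "UTF-8"))$message$content
}

# Example use of the sketch, assuming a model named "llama3" has been pulled:
# fetch_models_sketch()
# send_message_sketch("llama3", "Hello!")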