Package 'shiny.ollama'

Title: R 'shiny' Interface for Chatting with Large Language Models Offline on Your Local Machine with 'ollama'
Description: Chat with large language models such as 'deepseek-r1', 'nemotron', 'llama', 'qwen' and many more on your machine, without an internet connection and with complete privacy, via 'ollama', powered by an R 'shiny' interface. For more information on 'ollama', visit <https://ollama.com>.
Authors: Indraneel Chakraborty [aut, cre]
Maintainer: Indraneel Chakraborty <[email protected]>
License: Apache License (>= 2)
Version: 0.1.2
Built: 2025-03-12 15:20:06 UTC
Source: https://github.com/ineelhere/shiny.ollama

Help Index


Check if Ollama is running

Description

This function checks whether the Ollama server is running at 'http://localhost:11434/'. It sends a request to the URL and determines if the server is reachable.

Usage

check_ollama()

Value

A character string: '"ollama is running"' if the server is accessible, otherwise '"ollama is not running"'.


Fetch available models from Ollama API

Description

Fetch available models from Ollama API

Usage

fetch_models()

Value

Character vector of model names or an error message
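
Examples

A minimal sketch; it assumes the Ollama server is running locally and has at least one model pulled.

# List the models currently available to the local Ollama server
models <- fetch_models()
print(models)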


Convert chat history to downloadable format

Description

Convert chat history to downloadable format

Usage

format_chat_history(messages, format = c("HTML", "CSV"))

Arguments

messages

List of chat messages

format

Character string specifying the output format ("HTML" or "CSV")

Value

Formatted chat history as a character string (HTML) or a data frame (CSV)
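
Examples

A minimal sketch; the structure of 'messages' used below (a list of markdown strings as produced by format_message_md()) is an assumption and may differ from the app's internal representation.

# Build a small chat history and convert it for download
msgs <- list(
  format_message_md("User", "Hello"),
  format_message_md("Assistant", "Hi! How can I help you today?")
)
history_html <- format_chat_history(msgs, format = "HTML")
history_csv  <- format_chat_history(msgs, format = "CSV")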


Format a message as markdown

Description

This helper function formats a message with a specified role (e.g., "User", "Assistant", "System Status") into a markdown-styled string. If the role is "System Status", it includes the status of the Ollama server.

Usage

format_message_md(role, content)

Arguments

role

A character string specifying the role (e.g., "User", "Assistant", "System Status").

content

A character string containing the message content.

Value

A character string formatted as markdown.
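
Examples

A short sketch showing how a single message is turned into a markdown-styled string.

# Format a user message as markdown
md <- format_message_md("User", "What is the capital of France?")
cat(md)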


Parse a markdown-formatted message

Description

This function extracts the role and content from a markdown-formatted message.

Usage

parse_message(message)

Arguments

message

A character string containing the markdown-formatted message.

Value

A list with two elements:

role

A character string representing the role (e.g., "User", "Assistant", "System").

content

A character string containing the extracted message content.
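
Examples

A round-trip sketch, assuming the input was produced by format_message_md().

msg <- format_message_md("Assistant", "Hello!")
parsed <- parse_message(msg)
parsed$role     # e.g., "Assistant"
parsed$content  # e.g., "Hello!"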


Run shiny Application for Chat Interface

Description

Launches a shiny app for interacting with the Ollama API

Usage

run_app()

Value

No return value, called for side effects.
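
Examples

The app is launched only in an interactive session and requires a running local Ollama server.

if (interactive()) {
  run_app()
}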


Send message to Ollama API and get response

Description

Send message to Ollama API and get response

Usage

send_ollama_message(
  message,
  model,
  temperature,
  num_ctx,
  top_k,
  top_p,
  system,
  messages
)

Arguments

message

Character string containing the user message

model

Character string specifying the model name

temperature

Numeric value specifying the temperature

num_ctx

Integer value specifying the context window size (number of context tokens)

top_k

Integer value specifying the top-k sampling parameter

top_p

Numeric value specifying the top-p (nucleus) sampling parameter

system

Character string specifying the system prompt

messages

List of messages

Value

A list with elements 'success' (logical) and either 'response' (character) or 'error' (character)
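
Examples

A minimal sketch; all parameter values below are illustrative assumptions, and the model name must match a model already pulled via Ollama.

result <- send_ollama_message(
  message = "Hello! Who are you?",
  model = "llama3",
  temperature = 0.7,
  num_ctx = 2048,
  top_k = 40,
  top_p = 0.9,
  system = "You are a helpful assistant.",
  messages = list()
)
if (isTRUE(result$success)) {
  cat(result$response)
} else {
  cat(result$error)
}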