llm.api

“Llamar” is Spanish for “to call”

Minimal-dependency LLM chat interface. Part of cornyverse.

Exports

Function                 Purpose
chat(prompt, model)      Chat with any LLM
chat_openai(prompt)      OpenAI GPT models
chat_claude(prompt)      Anthropic Claude models
chat_ollama(prompt)      Local Ollama server
list_ollama_models()     List available Ollama models
llm_base(url)            Set the API endpoint
llm_key(key)             Set the API key
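
The endpoint and key setters can be combined with the chat helpers. A minimal sketch, assuming llm_base() and llm_key() apply to subsequent calls and that the URL and environment variable shown are placeholders:

```r
# Point the client at a self-hosted OpenAI-compatible endpoint
# (URL is a placeholder; llm_base()/llm_key() semantics are assumed)
llm_base("http://localhost:8080/v1")
llm_key(Sys.getenv("MY_API_KEY"))

# Calls now go through the configured endpoint with the configured key
chat("Hello", model = "gpt-4o")
```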

Providers

Supported providers: OpenAI, Anthropic, Ollama (local), and Moonshot/Kimi. The provider is auto-detected from the model name, or set explicitly via the provider argument.

Usage

# Auto-detect provider from model
chat("Hello", model = "gpt-4o")
chat("Hello", model = "claude-3-5-sonnet-latest")
chat("Hello", model = "kimi-k2")

# Use convenience wrappers
chat_ollama("What is R?")
chat_claude("Explain machine learning")

# Explicit Moonshot/Kimi provider
chat("Write a fast parser in R", provider = "moonshot", model = "kimi-k2")

# Conversation history
result <- chat("Hi, I'm Troy")
chat("What's my name?", history = result$history)

# Streaming
chat("Write a story", stream = TRUE)

Set MOONSHOT_API_KEY to use Moonshot/Kimi without overriding your OpenAI credentials.
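
For example, the key can be set for the current session before calling the Moonshot provider. A sketch, assuming the package reads MOONSHOT_API_KEY from the environment as described above (the key value is a placeholder):

```r
# Set the Moonshot key without touching OPENAI_API_KEY
Sys.setenv(MOONSHOT_API_KEY = "sk-...")

chat("Hello", provider = "moonshot", model = "kimi-k2")
```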

Dependencies

Only curl and jsonlite. No tidyverse, no compiled code.
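
As an illustration of how far curl and jsonlite alone can go, a chat-completion request can be issued with nothing else. A minimal sketch, assuming an OpenAI-compatible endpoint (the URL, model name, and response shape are assumptions, not this package's internals):

```r
library(curl)
library(jsonlite)

# Build the JSON request body
body <- toJSON(list(
  model = "gpt-4o",
  messages = list(list(role = "user", content = "Hello"))
), auto_unbox = TRUE)

# Configure the HTTP handle: JSON content type, bearer auth, POST body
h <- new_handle()
handle_setheaders(h,
  "Content-Type"  = "application/json",
  "Authorization" = paste("Bearer", Sys.getenv("OPENAI_API_KEY"))
)
handle_setopt(h, postfields = body)

# Issue the request and parse the reply
res <- curl_fetch_memory("https://api.openai.com/v1/chat/completions", handle = h)
reply <- fromJSON(rawToChar(res$content))
```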
