OPENAI_API_KEY since the API is OpenAI-compatible.
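
A minimal sketch of the configuration this implies; the provider this entry refers to is truncated above, so the call below is a placeholder assumption rather than the documented API:

```r
# Only the standard OpenAI key needs to be set, because the backend
# speaks the OpenAI-compatible API.
Sys.setenv(OPENAI_API_KEY = "sk-...")       # placeholder key
# chat("Hello", provider = "<provider>")    # provider name truncated in the entry above
```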

* Fixed a bug in `agent()` where the final assistant message was not appended to the returned history when the agent loop exited without further tool calls. This affected all providers but was most visible with non-Claude models.
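
  A sketch of the fixed behaviour; `agent()`'s exact signature and the shape of the returned history are assumptions inferred from this entry:

  ```r
  # The agent loop ends without another tool call; the final assistant
  # reply is now the last element of the returned history.
  history <- agent("Summarise this repository")   # hypothetical prompt and signature
  last <- history[[length(history)]]
  stopifnot(identical(last$role, "assistant"))
  ```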

* Removed the `"local"` provider and the `chat_local()` / `list_local_models()` exports. Direct llama.cpp inference via the localLLM package is no longer supported; use `provider = "ollama"` instead.
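
  For migration, a sketch under the assumption that `chat()` accepts `provider` and `model` arguments (inferred from this entry, not a documented signature):

  ```r
  # Before (removed): direct llama.cpp inference via localLLM
  # chat_local("Hello", model = "llama3")

  # After: install and start Ollama (https://ollama.com), pull a model, then:
  chat("Hello", provider = "ollama", model = "llama3")
  ```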