New Release Revolutionizes UI with Extensions, Multi-language Support, and Enhanced Model Management
January 26, 2026
The latest release unveils a comprehensive Extensions system that lets UI and server features be extended or replaced through extension folders. It ships with built-in extensions, supports third-party ones, and streamlines installation via the llms --add command.
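The release notes don't describe how extension folders are discovered and loaded; a minimal, hypothetical sketch of folder-based extension loading (the directory layout and loading mechanism here are assumptions, not the project's actual implementation) might look like:

```python
import importlib.util
import tempfile
from pathlib import Path

# Hypothetical sketch: each subfolder of an extensions directory that
# contains an __init__.py is imported as a module. llms.py's real
# discovery mechanism may differ.
def discover_extensions(root: Path) -> dict:
    loaded = {}
    for entry in sorted(root.iterdir()):
        init = entry / "__init__.py"
        if entry.is_dir() and init.exists():
            spec = importlib.util.spec_from_file_location(entry.name, init)
            module = importlib.util.module_from_spec(spec)
            spec.loader.exec_module(module)
            loaded[entry.name] = module
    return loaded

# Demo: a throwaway extensions folder with one extension in it.
root = Path(tempfile.mkdtemp())
(root / "katex").mkdir()
(root / "katex" / "__init__.py").write_text("NAME = 'katex'")
extensions = discover_extensions(root)
```

Loading from folders rather than installed packages is what makes drop-in third-party extensions possible without touching the core install.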
Core toolset includes desktop automation, memory and file system tools, math and logic utilities, and multi-language code execution (Python, JavaScript, TypeScript, C#), all designed with security in mind and run in sandboxed execution environments.
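The notes don't specify the sandboxing mechanism; one common building block is running untrusted code in a separate, isolated process with a wall-clock timeout. A minimal sketch under that assumption:

```python
import subprocess
import sys

# Hypothetical sketch: run untrusted Python code in a child process
# with isolated mode (-I: ignores environment paths and user site dir)
# and a timeout. A production sandbox would also restrict filesystem,
# network, and memory access.
def run_sandboxed(code: str, timeout: float = 5.0) -> str:
    result = subprocess.run(
        [sys.executable, "-I", "-c", code],
        capture_output=True,
        text=True,
        timeout=timeout,
    )
    if result.returncode != 0:
        raise RuntimeError(result.stderr.strip())
    return result.stdout

output = run_sandboxed("print(2 + 3)")  # "5\n"
```

The same pattern extends to the other languages by swapping the interpreter invocation (e.g. node or a dotnet runner) while keeping the timeout and output capture.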
Image and audio generation are now embedded in both UI and CLI, with models supporting image creation and TTS-based audio, plus asset caching and downloadable URLs for generated content.
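The notes mention asset caching with downloadable URLs but not its design; a common approach is content-addressed storage, where each generated asset is stored once under its hash and served from a stable URL. A sketch under that assumption (class and path names are hypothetical):

```python
import hashlib
import tempfile
from pathlib import Path

# Hypothetical sketch of content-addressed asset caching: generated
# images/audio are stored under their SHA-256 hash, so identical
# content is cached once and always maps to the same downloadable URL.
class AssetCache:
    def __init__(self, root: Path, url_prefix: str = "/assets"):
        self.root = root
        self.url_prefix = url_prefix
        root.mkdir(parents=True, exist_ok=True)

    def put(self, data: bytes, ext: str) -> str:
        digest = hashlib.sha256(data).hexdigest()
        path = self.root / f"{digest}.{ext}"
        if not path.exists():  # duplicate content: no second write
            path.write_bytes(data)
        return f"{self.url_prefix}/{path.name}"

cache = AssetCache(Path(tempfile.mkdtemp()) / "assets")
url = cache.put(b"fake image bytes", "png")
```

Because the URL is derived from the content hash, re-generating an identical image yields the same link, which keeps caches and chat history consistent.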
MCP (Model Context Protocol) support enables connections to MCP servers for external tools and services, including UI and server management and HTML-rendered tool outputs via iframes.
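Rendering HTML tool output inside an iframe isolates it from the host page. A minimal sketch of how such wrapping could work, using the srcdoc attribute with a sandbox (the function name is illustrative, not the project's API):

```python
import html

# Hypothetical sketch: a tool result containing HTML is embedded via
# srcdoc in a sandboxed iframe, so it cannot run scripts or access the
# surrounding page. srcdoc content must be attribute-escaped.
def render_tool_output(tool_html: str) -> str:
    return (
        '<iframe sandbox="" '
        f'srcdoc="{html.escape(tool_html, quote=True)}"></iframe>'
    )

frame = render_tool_output("<b>3 rows updated</b>")
```

An empty sandbox attribute applies the most restrictive policy; individual capabilities (e.g. allow-scripts) can be whitelisted if a tool needs them.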
The models library now includes over 530 models from 24 providers through models.dev integration, with automatic provider updates and configurable inheritance in llms.json to simplify enabling providers.
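The notes don't show the llms.json schema, but configurable inheritance typically means a provider entry overriding fields from a base entry. A hypothetical sketch of such a merge (the keys shown are assumptions, not the file's real schema):

```python
# Hypothetical sketch of provider inheritance: an entry inherits
# defaults from a base and overrides only what it needs, so enabling
# a provider can be a one-line change.
def inherit(base: dict, override: dict) -> dict:
    merged = dict(base)
    for key, value in override.items():
        if isinstance(value, dict) and isinstance(merged.get(key), dict):
            merged[key] = inherit(merged[key], value)  # recurse into nested dicts
        else:
            merged[key] = value
    return merged

base = {"enabled": False, "api": {"timeout": 60}}
provider = inherit(base, {"enabled": True,
                          "api": {"base_url": "https://openrouter.ai/api/v1"}})
# provider keeps the inherited timeout and adds its own base_url
```

The recursive merge matters: a provider that only sets api.base_url should not lose the inherited api.timeout.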
KaTeX extension adds fast inline and block LaTeX math rendering within AI responses, integrated into the markdown pipeline.
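Before KaTeX can render anything, the markdown pipeline has to locate math spans. A sketch of the usual delimiter scan, with $$...$$ as block math and $...$ as inline (the exact delimiters the extension accepts are an assumption):

```python
import re

# Hypothetical sketch of math-span detection: try the block form $$...$$
# first so a pair of block delimiters is never misread as two inline spans.
MATH = re.compile(r"\$\$(.+?)\$\$|\$(.+?)\$", re.DOTALL)

def find_math(text: str) -> list[tuple[str, str]]:
    spans = []
    for m in MATH.finditer(text):
        if m.group(1) is not None:
            spans.append(("block", m.group(1).strip()))
        else:
            spans.append(("inline", m.group(2).strip()))
    return spans

spans = find_math(r"Euler: $e^{i\pi} + 1 = 0$ and $$\int_0^1 x\,dx = \tfrac12$$")
```

Each extracted span would then be handed to KaTeX with the matching displayMode flag, while the surrounding text flows through the normal markdown renderer.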
A redesigned Model Selector UI delivers smart search, advanced filtering, flexible sorting, a favorites system, and enhanced model cards to improve discovery and selection.
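The selector's search, filtering, sorting, and favorites can be pictured as one small pipeline over the model list. A hypothetical sketch (field names and signature are illustrative, not the UI's actual code):

```python
# Hypothetical sketch of model-selector logic: substring search plus an
# optional provider filter, with favorited models pinned to the top and
# the rest sorted alphabetically.
def select_models(models, query="", provider=None, favorites=()):
    hits = [
        m for m in models
        if query.lower() in m["name"].lower()
        and (provider is None or m["provider"] == provider)
    ]
    # sort key: non-favorites after favorites, then by name
    return sorted(hits, key=lambda m: (m["name"] not in favorites, m["name"]))

models = [
    {"name": "gpt-4o", "provider": "openai"},
    {"name": "claude-3-5-sonnet", "provider": "anthropic"},
    {"name": "gemini-2.0-flash", "provider": "google"},
]
picked = select_models(models, favorites={"gemini-2.0-flash"})
```

With 530+ models, this kind of pinning plus incremental filtering is what keeps the frequently used handful reachable without scrolling.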
First-class Python function calling enables LLMs to interact with local environments using function definitions, with a dedicated Tools UI for per-request tool selection.
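Function calling hinges on turning a plain Python function into a definition the model can reason about. A sketch of deriving one from a signature via introspection (the output shape loosely follows common tool-schema conventions and is an assumption, not llms.py's actual format):

```python
import inspect

# Hypothetical sketch: map a function's annotations and docstring into
# the kind of tool definition an LLM uses to decide when and how to
# call it. Real implementations emit full JSON Schema.
TYPE_MAP = {int: "integer", float: "number", str: "string", bool: "boolean"}

def tool_definition(fn) -> dict:
    sig = inspect.signature(fn)
    params = {
        name: {"type": TYPE_MAP.get(p.annotation, "string")}
        for name, p in sig.parameters.items()
    }
    return {
        "name": fn.__name__,
        "description": inspect.getdoc(fn) or "",
        "parameters": {"type": "object", "properties": params},
    }

def get_weather(city: str, celsius: bool) -> str:
    """Return the current weather for a city."""
    return f"Sunny in {city}"

schema = tool_definition(get_weather)
```

A per-request Tools UI then just becomes a checklist over these definitions: only the schemas the user selects are sent along with the prompt.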
Gemini RAG Extension grounds AI chats in user data through file search stores, covering Filestore management, drag-and-drop document uploads, smart categorization, contextual RAG chats, and bidirectional sync for knowledge-grounded conversations.
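At the heart of any RAG chat is a retrieval step that picks the stored chunks most relevant to the question and prepends them to the prompt. A deliberately simplified sketch using keyword overlap (the extension itself uses Gemini's file search stores, not this scoring):

```python
# Hypothetical sketch of RAG retrieval: score document chunks by word
# overlap with the question and keep the top k as grounding context.
# Production systems use embedding or file-search-based relevance.
def retrieve(question: str, chunks: list[str], k: int = 2) -> list[str]:
    q = set(question.lower().split())
    scored = sorted(
        chunks,
        key=lambda c: len(q & set(c.lower().split())),
        reverse=True,
    )
    return scored[:k]

chunks = [
    "Invoices are due within 30 days of receipt.",
    "The office is closed on public holidays.",
    "Refunds are processed within 5 business days.",
]
context = retrieve("When are invoices due?", chunks, k=1)
```

The retrieved context is what makes the chat "knowledge-grounded": the model answers from the user's documents rather than from its training data alone.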
The persistence layer shifts to SQLite for server-side storage and asset caching, replacing IndexedDB with robust image/file caching and metadata management.
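Moving from browser-side IndexedDB to server-side SQLite means asset metadata lives in ordinary tables. A hypothetical sketch of the kind of cache table such a layer might use (table and column names are assumptions, not the release's actual schema):

```python
import sqlite3

# Hypothetical sketch: a SQLite table tracking cached assets by content
# hash, with MIME type and size as queryable metadata.
db = sqlite3.connect(":memory:")
db.execute("""
    CREATE TABLE asset_cache (
        hash        TEXT PRIMARY KEY,
        mime_type   TEXT NOT NULL,
        size_bytes  INTEGER NOT NULL,
        created_at  TEXT DEFAULT CURRENT_TIMESTAMP
    )
""")
# INSERT OR IGNORE makes re-caching identical content a no-op.
db.execute(
    "INSERT OR IGNORE INTO asset_cache (hash, mime_type, size_bytes) "
    "VALUES (?, ?, ?)",
    ("demo-sha256-hash", "image/png", 48213),
)
row = db.execute(
    "SELECT mime_type, size_bytes FROM asset_cache WHERE hash = ?",
    ("demo-sha256-hash",),
).fetchone()
```

Unlike IndexedDB, this store is shared across browsers and sessions, and metadata queries (size totals, eviction candidates) become plain SQL.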
The v3 release notes emphasize extensibility, expanded provider support, and an improved user experience for llms.py, llms, and related extensions.
