Enhanced ChatGPT Clone: Features Agents, MCP, DeepSeek, Anthropic, AWS, OpenAI, Responses API, Azure, Groq, o1, GPT-5, Mistral, OpenRouter, Vertex AI, Gemini, Artifacts, AI model switching, message search, Code Interpreter, langchain, DALL-E-3, OpenAPI Actions, Functions, Secure Multi-User Auth, Presets, open-source for self-hosting. Active. https://librechat.ai/
Danny Avila 546f006e42
💬 feat: Serialize GitNexus Deploys and Post Completion Comments on PR Commands (#12623)
Three related changes that tighten the GitNexus CI/CD loop.

Serialized deploys
- Previous concurrency group was keyed by head ref with cancel-in-progress,
  which let deploys targeting different refs (e.g. main push + PR command)
  run in parallel. That's a data race: the prune-stale-indexes step
  computes active_names up front, so deploy A rsyncing
  /opt/gitnexus/indexes/LibreChat-pr-12580 can collide with deploy B
  pruning the same folder based on a pre-rsync view of the active set.
- Collapse to a single global group gitnexus-deploy with
  cancel-in-progress: false. All deploys queue behind one another.
  A rsync/docker-compose restart is never killed mid-operation.
  The 20-minute job timeout bounds queue depth.
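The queueing behavior described above amounts to a small concurrency change. A minimal sketch, assuming the workflow and job names used elsewhere in this commit (the exact file contents may differ):

```yaml
# .github/workflows/gitnexus-deploy.yml (sketch)
concurrency:
  group: gitnexus-deploy       # single global group: all deploys share one queue
  cancel-in-progress: false    # queued runs wait; an in-flight rsync is never killed

jobs:
  deploy:
    runs-on: ubuntu-latest
    timeout-minutes: 20        # bounds how long the queue can back up
```

With `cancel-in-progress: false`, a new trigger waits for the running deploy instead of cancelling it, which is what prevents the prune/rsync race.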

PR completion feedback
- Add an "index complete" comment step in gitnexus-index.yml that
  fires only when inputs.pr_number is set (i.e. the run came via the
  /gitnexus command). Posts success or failure with a link to the
  run and whether embeddings were generated.
- Add a "deploy complete" comment step in gitnexus-deploy.yml that
  handles both trigger paths: workflow_run from a native PR auto-index
  (PR number recovered from the matrix entry whose runId matches the
  trigger run), and workflow_dispatch from the index workflow's bot-
  fallback path (PR number passed through as a new inputs.pr_number).
- Plumb inputs.pr_number through the bot-fallback dispatch in
  gitnexus-index.yml so the deploy workflow knows where to comment
  for command-triggered runs.
- Only comments on the PR that asked for the index, never broadcasts.
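A completion-comment step of the kind described could be sketched with actions/github-script as follows; the step name, gating condition, and comment wording are illustrative, not the actual workflow contents:

```yaml
# Sketch: post a completion comment only when a PR number was supplied
- name: Post index-complete comment
  if: ${{ always() && inputs.pr_number != '' }}
  uses: actions/github-script@v7
  with:
    script: |
      const ok = '${{ job.status }}' === 'success';
      const runUrl = `${context.serverUrl}/${context.repo.owner}/${context.repo.repo}/actions/runs/${context.runId}`;
      await github.rest.issues.createComment({
        owner: context.repo.owner,
        repo: context.repo.repo,
        issue_number: Number('${{ inputs.pr_number }}'),
        body: ok
          ? `✅ Index complete. Run: ${runUrl}`
          : `❌ Index failed. Run: ${runUrl}`,
      });
```

Posting the comment requires `pull-requests: write` in the job's `permissions` block, which is the permission change noted under the workflow rename below.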

Workflow rename
- Drop the "DigitalOcean" suffix from the deploy workflow's display
  name and filename. The platform is still DigitalOcean (.do/gitnexus/ still
  holds the compose + Caddy config), but the workflow itself is
  platform-agnostic and the suffix was visual noise.
- File renamed gitnexus-deploy-do.yml -> gitnexus-deploy.yml.
- Concurrency group and all cross-references updated in lock-step.
- permissions at deploy job level now includes pull-requests: write
  so the completion comment can post.
2026-04-11 18:15:56 -04:00

LibreChat

English · 中文

Deploy on Railway · Deploy on Zeabur · Deploy on Sealos

Translation Progress

Features

  • 🖥️ UI & Experience inspired by ChatGPT with enhanced design and features

  • 🤖 AI Model Selection:

    • Anthropic (Claude), AWS Bedrock, OpenAI, Azure OpenAI, Google, Vertex AI, OpenAI Responses API (incl. Azure)
    • Custom Endpoints: Use any OpenAI-compatible API with LibreChat, no proxy required
    • Compatible with Local & Remote AI Providers:
      • Ollama, Groq, Cohere, Mistral AI, Apple MLX, koboldcpp, together.ai,
      • OpenRouter, Helicone, Perplexity, ShuttleAI, DeepSeek, Qwen, and more
  • 🔧 Code Interpreter API:

    • Secure, Sandboxed Execution in Python, Node.js (JS/TS), Go, C/C++, Java, PHP, Rust, and Fortran
    • Seamless File Handling: Upload, process, and download files directly
    • No Privacy Concerns: Fully isolated and secure execution
  • 🔦 Agents & Tools Integration:

    • LibreChat Agents:
      • No-Code Custom Assistants: Build specialized, AI-driven helpers
      • Agent Marketplace: Discover and deploy community-built agents
      • Collaborative Sharing: Share agents with specific users and groups
      • Flexible & Extensible: Use MCP Servers, tools, file search, code execution, and more
      • Compatible with Custom Endpoints, OpenAI, Azure, Anthropic, AWS Bedrock, Google, Vertex AI, Responses API, and more
      • Model Context Protocol (MCP) Support for Tools
  • 🔍 Web Search:

    • Search the internet and retrieve relevant information to enhance your AI context
    • Combines search providers, content scrapers, and result rerankers for optimal results
    • Customizable Jina Reranking: Configure custom Jina API URLs for reranking services
    • Learn More →
  • 🪄 Generative UI with Code Artifacts:

    • Code Artifacts let you create React components, HTML pages, and Mermaid diagrams directly in chat
  • 🎨 Image Generation & Editing

  • 💾 Presets & Context Management:

    • Create, Save, & Share Custom Presets
    • Switch between AI Endpoints and Presets mid-chat
    • Edit, Resubmit, and Continue Messages with Conversation branching
    • Create and share prompts with specific users and groups
    • Fork Messages & Conversations for Advanced Context control
  • 💬 Multimodal & File Interactions:

    • Upload and analyze images with Claude 3, GPT-4.5, GPT-4o, o1, Llama-Vision, and Gemini 📸
    • Chat with Files using Custom Endpoints, OpenAI, Azure, Anthropic, AWS Bedrock, & Google 🗃️
  • 🌎 Multilingual UI:

    • English, 中文 (简体), 中文 (繁體), العربية, Deutsch, Español, Français, Italiano
    • Polski, Português (PT), Português (BR), Русский, 日本語, Svenska, 한국어, Tiếng Việt
    • Türkçe, Nederlands, עברית, Català, Čeština, Dansk, Eesti, فارسی
    • Suomi, Magyar, Հայերեն, Bahasa Indonesia, ქართული, Latviešu, ไทย, ئۇيغۇرچە
  • 🧠 Reasoning UI:

    • Dynamic Reasoning UI for Chain-of-Thought/Reasoning AI models like DeepSeek-R1
  • 🎨 Customizable Interface:

    • Customizable Dropdown & Interface that adapts to both power users and newcomers
  • 🌊 Resumable Streams:

    • Never lose a response: AI responses automatically reconnect and resume if your connection drops
    • Multi-Tab & Multi-Device Sync: Open the same chat in multiple tabs or pick up on another device
    • Production-Ready: Works from single-server setups to horizontally scaled deployments with Redis
  • 🗣️ Speech & Audio:

    • Chat hands-free with Speech-to-Text and Text-to-Speech
    • Automatically send and play Audio
    • Supports OpenAI, Azure OpenAI, and ElevenLabs
  • 📥 Import & Export Conversations:

    • Import Conversations from LibreChat, ChatGPT, Chatbot UI
    • Export conversations as screenshots, Markdown, text, or JSON
  • 🔍 Search & Discovery:

    • Search all messages/conversations
  • 👥 Multi-User & Secure Access:

    • Multi-User, Secure Authentication with OAuth2, LDAP, & Email Login Support
    • Built-in Moderation and Token Spend tools
  • ⚙️ Configuration & Deployment:

    • Configure Proxy, Reverse Proxy, Docker, & many Deployment options
    • Use completely local or deploy on the cloud
  • 📖 Open-Source & Community:

    • Completely Open-Source & Built in Public
    • Community-driven development, support, and feedback

For a thorough review of our features, see our docs here 📚
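As one example of the configuration surface, custom OpenAI-compatible endpoints are declared in `librechat.yaml`. A minimal sketch, in which the endpoint name, base URL, and model ID are placeholders (see the docs for the full schema):

```yaml
# librechat.yaml (sketch): one custom OpenAI-compatible endpoint
endpoints:
  custom:
    - name: 'MyProvider'                 # placeholder display name
      apiKey: '${MY_PROVIDER_API_KEY}'   # resolved from the environment
      baseURL: 'https://api.example.com/v1'
      models:
        default: ['example-model']       # placeholder model ID
        fetch: true                      # fetch the live model list if supported
```

Each entry under `endpoints.custom` appears as its own provider in the model selector, so multiple OpenAI-compatible backends can coexist in one deployment.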

🪶 All-In-One AI Conversations with LibreChat

LibreChat is a self-hosted AI chat platform that unifies all major AI providers in a single, privacy-focused interface.

Beyond chat, LibreChat provides AI Agents, Model Context Protocol (MCP) support, Artifacts, Code Interpreter, custom actions, conversation search, and enterprise-ready multi-user authentication.

Open source, actively developed, and built for anyone who values control over their AI infrastructure.


🌐 Resources

GitHub Repo: https://github.com/danny-avila/LibreChat

Other:


📝 Changelog

Keep up with the latest updates by visiting the releases page and notes:

⚠️ Please consult the changelog for breaking changes before updating.


Star History

Star History Chart

danny-avila/LibreChat | Trendshift · ROSS Index - Fastest Growing Open-Source Startups in Q1 2024 | Runa Capital


Contributions

Contributions, suggestions, bug reports and fixes are welcome!

For new features, components, or extensions, please open an issue and discuss before sending a PR.

If you'd like to help translate LibreChat into your language, we'd love your contribution! Improving our translations not only makes LibreChat more accessible to users around the world but also enhances the overall user experience. Please check out our Translation Guide.


💖 This project exists in its current state thanks to all the people who contribute


🎉 Special Thanks

We thank Locize for their translation management tools that support multiple languages in LibreChat.

Locize Logo