Le Chat Enterprise is the enterprise tier of Le Chat, the AI assistant from Paris-based Mistral AI. It was announced on May 7, 2025 and launched as a privacy-first, deployment-flexible assistant aimed at organizations that want to bring search, custom agents, document libraries, and connectors into a single workspace surface. At launch the product was powered by Mistral Medium 3, a frontier-class model unveiled the same day.[1][2]
Mistral positioned Le Chat Enterprise as an answer to four problems it argued were holding back enterprise AI adoption: tool fragmentation, insecure knowledge integration, rigid model choice, and slow time to value. Rather than build a separate vertical product for each pain point, the company packaged search, agents, connectors, and custom models into one assistant that can be deployed in the customer's own cloud, in a private data center, or as a managed service in Mistral's own infrastructure.[1]
The launch sat at the intersection of two of Mistral's main strategic bets: a continued push toward enterprise revenue and a sustained marketing emphasis on European data sovereignty as a differentiator against American and Chinese competitors.[3][4]
Le Chat Enterprise sits at the top of a four-tier ladder. The free consumer tier and the paid Pro tier ($14.99 per month at launch) target individual users; the Team plan targets small businesses; and Enterprise targets large organizations with their own security, compliance, and procurement requirements.
| Tier | Audience | Key extras |
|---|---|---|
| Le Chat Free | Individual users | Standard chat, web access |
| Le Chat Pro | Power users | Higher limits, full model access |
| Le Chat Team | Small businesses | Shared workspaces, collaboration |
| Le Chat Enterprise | Large organizations | Connectors, agents, custom models, hybrid deployment, SSO, audit logs |
The enterprise tier inherits the same core chat surface that Pro and Team users see, then adds organizational features: identity management, fine-grained access control, document libraries, custom agents, and a deployment model that lets the customer keep its data inside its own perimeter.[1][5]
The launch announcement grouped Le Chat Enterprise's capabilities into a handful of feature areas. The table below summarizes each.
| Feature area | What it does |
|---|---|
| Enterprise search | Finds and summarizes information across connected data sources, knowledge bases, and the open web |
| Agent builders | Lets non-developers build, share, and reuse custom agents that automate repeated workflows |
| Custom connectors | Securely connects Le Chat to enterprise apps and SaaS tools, with permissions enforced per user |
| Document libraries | Stores shared and personal document collections that ground answers via retrieval |
| Custom models | Allows organizations to fine-tune or otherwise tailor the model stack to their domain |
| Hybrid deployments | Supports self-hosted, public cloud, private cloud, or Mistral-managed cloud deployments |
Mistral marketed agent building as one of the more practical wins for non-technical users. The drag-and-drop builder ships with templates and lets a single agent chain together multiple tools, connectors, and knowledge sources without code. The same agents can be saved to a workspace library and reused by colleagues.[1][5]
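Under the hood, a chained agent of this kind amounts to an ordered sequence of steps, each bound to a tool, connector, or knowledge source. The sketch below is purely illustrative — the class names, step shapes, and example agent are assumptions, not Mistral's actual schema — but it captures the idea of composing reusable steps without code from the end user's point of view.

```python
# Illustrative model of a no-code chained agent: an ordered list of steps,
# each transforming a running context. All names here are hypothetical.
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class AgentStep:
    name: str
    run: Callable[[str], str]  # each step transforms the running context

@dataclass
class Agent:
    title: str
    steps: list[AgentStep] = field(default_factory=list)

    def execute(self, user_input: str) -> str:
        """Run every step in order, threading the context through."""
        context = user_input
        for step in self.steps:
            context = step.run(context)
        return context

# Example: a two-step "meeting prep" agent chaining a (stubbed) document
# search step and a (stubbed) summarization step.
prep = Agent(
    title="Meeting prep",
    steps=[
        AgentStep("search_drive", lambda q: f"docs for: {q}"),
        AgentStep("summarize", lambda docs: f"summary of ({docs})"),
    ],
)
```

Saving such an agent to a workspace library then simply means persisting the step list so colleagues can rerun it.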
At launch Le Chat Enterprise shipped with a small set of first-party connectors covering the most common knowledge and productivity surfaces in large organizations.
| Connector | Vendor | Use |
|---|---|---|
| Google Drive | Google | File search and grounding |
| SharePoint | Microsoft | File search and grounding |
| OneDrive | Microsoft | Personal and team file storage |
| Google Calendar | Google | Schedule lookup and planning |
| Gmail | Google | Email search and drafting |
All connections enforce the source system's existing access control lists, so a Le Chat user can only see documents, mailboxes, or calendar entries they would already be able to see in the underlying tool. Mistral repeatedly emphasized this point in the launch post: there is no shadow index that bypasses native permissions.[1][2]
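The permission model described above can be sketched as a filter applied before any retrieved document is allowed to ground an answer: each candidate is re-checked against the source system's access control for the requesting user. This is a minimal illustration of the principle, not Mistral's implementation; the function names and the stubbed permission table are assumptions.

```python
# Illustrative sketch of permission-faithful retrieval: every candidate
# document is re-checked against the source system's ACL for the
# requesting user before it can ground an answer. Not Mistral's code.

def acl_filter(candidates, user, can_access):
    """Keep only documents the user could already open in the source tool.

    `can_access(user, doc_id)` stands in for a live permission check
    against the connected system (Drive, SharePoint, etc.).
    """
    return [doc for doc in candidates if can_access(user, doc["id"])]

# Stubbed permission table for the example.
PERMS = {("alice", "doc-1"), ("alice", "doc-3")}
docs = [{"id": "doc-1"}, {"id": "doc-2"}, {"id": "doc-3"}]
visible = acl_filter(docs, "alice", lambda u, d: (u, d) in PERMS)
```

Because the check is made per user at query time rather than baked into a shared index, the "no shadow index" property falls out naturally.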
A second wave of integrations arrived in September 2025, when Mistral added MCP (Model Context Protocol) support. The September 2 announcement introduced a directory of more than 20 secure MCP connectors covering a wider footprint of enterprise tools, plus the option for customers to register their own remote MCP servers for internal systems that are not in the directory.[6][7]
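MCP is built on JSON-RPC 2.0, so a connector call to a remote MCP server ultimately boils down to a request like the one constructed below. The method name `tools/call` and the `params` shape follow the MCP specification; the tool name and arguments are invented for illustration.

```python
# Build an MCP tool-invocation payload. MCP uses JSON-RPC 2.0; methods
# such as "tools/list" and "tools/call" come from the MCP spec. The tool
# name and arguments below are hypothetical examples.
import json

def mcp_tool_call(request_id, tool_name, arguments):
    """Build a JSON-RPC 2.0 payload for an MCP tools/call request."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    })

payload = mcp_tool_call(1, "search_issues", {"query": "open bugs"})
```

Registering a custom remote MCP server, as the directory option allows, means pointing Le Chat at an endpoint that answers requests of exactly this shape.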
| Category | Example connectors added via MCP |
|---|---|
| Data | Pinecone, DeepWiki, Prisma Postgres, Databricks (announced), Snowflake (announced) |
| Productivity | Notion, Box, Asana, Jira, Confluence, Monday.com |
| Development | GitHub, Linear, Sentry, Cloudflare Development Platform |
| Automation | Zapier, Brevo |
| Commerce | Stripe, PayPal, Plaid, Square |
| Custom | Any remote MCP server registered by the organization |
MCP connectors run with on-behalf-of authentication, so each call to a downstream system is made under the requesting user's own credentials. Administrators decide which connectors are exposed to which user groups, and per-function toggles let them split read access from write access; sensitive write operations can also be set to require manual approval before execution.[6][8]
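The admin-side gate described above can be sketched as a small policy lookup: each connector function carries an allowed-group list, a read/write classification, and an optional approval requirement. The policy shape, function names, and example entries below are assumptions for illustration, not Mistral's actual API.

```python
# Hedged sketch of per-function connector gating: deny by default, allow
# reads to permitted groups, and route flagged writes through manual
# approval. Policy shape and entries are illustrative, not Mistral's.

POLICY = {
    # (connector, function): (allowed_groups, is_write, needs_approval)
    ("jira", "search_issues"): ({"eng", "support"}, False, False),
    ("jira", "create_issue"): ({"eng"}, True, True),
}

def gate(connector, function, user_groups):
    """Return 'deny', 'allow', or 'needs_approval' for a connector call."""
    rule = POLICY.get((connector, function))
    if rule is None:
        return "deny"  # functions not in policy are denied by default
    groups, is_write, needs_approval = rule
    if not (user_groups & groups):
        return "deny"  # user is in none of the permitted groups
    if is_write and needs_approval:
        return "needs_approval"  # sensitive write: hold for manual review
    return "allow"
```

Deny-by-default plus explicit write gating is the conservative posture the launch materials describe; the on-behalf-of credential is then attached only to calls that pass this gate.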
Deployment flexibility is one of the central selling points of Le Chat Enterprise. Customers can pick from four deployment models, depending on how much of the stack they want to operate themselves.
| Deployment model | Who runs it | Typical use case |
|---|---|---|
| Self-hosted | Customer, on its own hardware | Regulated industries, air-gapped environments |
| Private cloud | Customer, in its own VPC | Sensitive workloads with cloud elasticity |
| Public cloud | Customer, in Google Cloud, Azure, or AWS | Standard enterprise IT, marketplace billing |
| Mistral managed cloud | Mistral, in dedicated tenancy | Faster onboarding, less operational burden |
The self-hosted and private-cloud options are the most-cited reasons enterprise buyers in Europe pick Le Chat over US-hosted assistants, since they let an organization keep both the model and the conversation logs entirely inside its own jurisdiction. Mistral cites the option for "up to 100% data residency and deployment in any cloud or datacenter" as a core part of the product pitch.[2][9]
Le Chat Enterprise's governance layer is built around four ideas: identity-aware access, faithful enforcement of source-system permissions, full audit visibility, and tenant data isolation.
The combination of EU jurisdiction, ISO 27001-certified European data centers, and an explicit no-training policy is the basis on which Mistral positions Le Chat Enterprise as a GDPR-friendly choice for regulated industries.[9]
At launch, Le Chat Enterprise ran on Mistral Medium 3, the company's then-new mid-tier frontier model. Mistral reported that Medium 3 delivered more than 90% of the benchmark performance of Claude 3.7 Sonnet at roughly one-eighth the API cost, with strong results on coding evaluations such as HumanEval and MultiPL-E.[1][10]
In December 2025 Mistral released the Mistral 3 family: Mistral Large 3, a sparse mixture-of-experts model with 41 billion active parameters out of 675 billion total and a 256,000-token context window, plus the Ministral 3 small models. Large 3 is positioned as the top-of-stack model for agentic workloads and long-document reasoning, the workloads enterprise users most often run through Le Chat. Mistral has continued to roll Large 3 into Le Chat as the default for heavier tasks, while keeping Medium-class models for cost-sensitive ones.[10][11]
At launch Le Chat Enterprise, like the enterprise offerings from OpenAI and Anthropic, supported only its maker's own models. The custom-model option is about adapting Mistral models to a customer's domain, not swapping in a third-party LLM.[3]
Le Chat Enterprise was made generally available on May 7, 2025 through Google Cloud Marketplace, with Mistral committing to follow-on listings on Azure AI Marketplace and AWS Marketplace. Direct procurement through Mistral has been available since launch.[1]
| Channel | Status at launch | Notes |
|---|---|---|
| Google Cloud Marketplace | Available | First marketplace listing |
| Azure AI Marketplace | Coming soon | For Microsoft-aligned buyers |
| AWS Marketplace | Coming soon | Including AWS Bedrock integrations |
| Direct from Mistral | Available | Custom contracts, managed cloud option |
Mistral has not published a public per-seat list price for Le Chat Enterprise; pricing is negotiated through sales for marketplace and direct deals. The consumer Le Chat Pro tier was set at $14.99 per month at launch, and Team pricing sits between Pro and Enterprise.[2][3]
Mistral has not published a dedicated case-study list specifically for Le Chat Enterprise, but several large French and European customers have publicly disclosed broader Mistral deployments that include Le Chat or Le Chat-style assistants.
These deployments are useful context for Le Chat Enterprise even when the assistant is not the headline product, because they show the kind of regulated, data-sensitive customers that Mistral has been able to land in its home market.
Le Chat Enterprise launched into a crowded enterprise AI assistant market. The most direct peers are listed below.
| Product | Vendor | Underlying model | Distinctive angle |
|---|---|---|---|
| Le Chat Enterprise | Mistral AI | Mistral Medium/Large 3 | EU sovereignty, full-stack hybrid deployment, custom models |
| ChatGPT Enterprise | OpenAI | GPT-class models | Largest installed base, deepest model family |
| Claude Enterprise | Anthropic | Claude models | Long context, safety-first messaging |
| Microsoft 365 Copilot | Microsoft | OpenAI + Microsoft models | Tight integration with the Office stack |
| Gemini for Google Workspace | Google | Gemini models | Tight integration with Workspace |
| Glean | Glean | Bring-your-own model | Enterprise search-first design |
VentureBeat described the launch as Mistral coming out swinging for enterprise customers, framing Le Chat Enterprise as the company's most direct shot yet at the enterprise AI assistant market dominated by US vendors. Industry analysts highlighted the EU jurisdiction and self-hosted option as the most concrete differentiators, since they let Mistral compete on something other than raw model benchmarks.[3][9]
The product is also part of a broader business pitch. By the time of the launch, Mistral was reporting a roughly threefold revenue increase over the previous 100 days and was valued at around $6 billion; the company would go on to raise additional capital later in 2025 at higher valuations as it leaned into the European-AI-champion narrative.[4][13]
The weakest part of the launch coverage was the lack of a public benchmark or independent enterprise evaluation comparing Le Chat Enterprise's retrieval, agent, and admin features head-to-head against ChatGPT Enterprise or Microsoft Copilot. Most early reviews relied on Mistral's own positioning, with the EU-sovereignty story doing most of the heavy lifting. That gap is starting to close as the product picks up more public deployments through 2025 and 2026.