Linkup is a Paris-based artificial intelligence company that operates a web search API designed for AI applications, autonomous agents, and retrieval-augmented generation (RAG) systems. Founded in 2024 by Philippe Mizrahi, Denis Charrier, and Boris Toledano, the company builds infrastructure that lets AI systems retrieve current, verified information from the web at the moment they need it, rather than relying on static training data. Linkup positions itself as an alternative to consumer-facing search products like Perplexity and competes in the developer-facing API market alongside Tavily, Exa, Brave Search, and Serper.
The company is headquartered in Paris with additional operations in New York and San Francisco. By early 2026, Linkup had raised $13.2 million in total funding across two rounds and had signed hundreds of customers ranging from AI startups to large enterprises including KPMG, McKinsey, and SNCF.
Linkup was incorporated in 2024. The founding team came together around the time major AI labs were negotiating content-licensing agreements with news publishers for training purposes. CEO Philippe Mizrahi has described the founding insight: AI systems relying solely on training data are effectively operating in permanent airplane mode, capable of processing information but cut off from anything that happened after their training cutoff. Rather than building another consumer search engine, the founders wanted to create purpose-built infrastructure for the wave of AI agents and RAG pipelines that were beginning to proliferate.
The three founders brought complementary backgrounds. Philippe Mizrahi studied at Polytechnique and holds a Master's in Operational Research from Columbia University. Before Linkup, he worked at Lyft as a Group Product Manager, where he led market development for the company's autonomous vehicle division. Denis Charrier brings more than fifteen years of software architecture experience. He founded Niland, one of the earliest neural music search engines in Europe, which Spotify acquired in 2017; Charrier then served as a senior engineer at Spotify before leaving to co-found Linkup. Boris Toledano combines engineering training from ISAE-SUPAERO with a business education from HEC Paris. He previously worked at McKinsey & Company on digital transformation engagements and led AI innovation initiatives at Carrefour, the French retail group.
In October 2024, Linkup was selected for the Microsoft GenAI Studio, a program backed by Microsoft, Nvidia, GitHub, Mistral AI, and Cellenza. The selection provided early validation and expanded the company's network within the European AI ecosystem. The API launched publicly in late 2024, shortly before the first funding announcement.
The company was also accepted into Kima Ventures' portfolio at the pre-seed stage. Kima Ventures, co-founded by Xavier Niel, is one of the most prolific early-stage funds in Europe and has backed hundreds of startups at the idea and pre-product stage. The backing gave Linkup access to a broad network of French technology entrepreneurs.
In November 2024, Linkup announced a €3 million funding round. The round was led by Seedcamp and included Axeleo Capital, Motier Ventures, Kima Ventures, Financière Saint James, and OPRTRS CLUB, alongside approximately one hundred business angels from the technology and media industries. Seedcamp, the early-stage European fund, framed the investment as backing "a new gateway to the Internet of AIs," citing the demand for compliant, publisher-compensated data pipelines as a driver for the company's model.
Axeleo Capital, a Paris-based fund, noted that the timing aligned with a broader industry shift toward specialized AI models that require high-quality, domain-specific data rather than broad, uncurated web crawls.
The €3 million was earmarked for three priorities: developing proprietary AI models for query interpretation and content ranking, strengthening the company's network of publisher and data-provider partnerships, and scaling infrastructure to support a growing customer base.
In February 2026, Linkup announced a $10 million seed round led by Gradient, a venture firm backed by Google. The round drew participation from Elaia, Leblon Capital, Weekend Fund, Seedcamp (returning), Axeleo Capital (returning), Motier Ventures (returning), and OPRTRS CLUB. Notable angel investors included Arthur Mensch, co-founder and CEO of Mistral AI; Olivier Pomel, CEO of Datadog; Alex Bouaziz and Shuo Wang from Deel; and Florian Douetteau from Dataiku.
Gradient's Darian Shirazi said of the investment: "Linkup is building the essential infrastructure layer that ensures AI agents can search and access information in real-time."
Mizrahi framed the company's mission in terms of the shift from human-facing to AI-facing internet: "Google Search revolutionized how humans access information. We're now building that same foundational capability for AI systems."
The new capital is being used to expand indexing technology and grow teams across New York, San Francisco, and Paris. By the time of the announcement, the company had acquired hundreds of customers and was processing production traffic from organizations including AI companies and global enterprises.
Linkup's core product is a web search API accessible via a single HTTP endpoint. Rather than returning a list of links for a human to click through, the API returns structured, sourced content that an AI system can consume directly. The company describes this as building search for AI the way Google built it for people, with the difference that the output is optimized for machine processing rather than human reading.
Linkup uses what it calls "agentic search," meaning the system does not simply match keywords against a static index. When a query arrives, the API interprets the query's intent, decides which sources to retrieve from, extracts relevant content, and returns a structured answer or a set of grounded documents. For more complex queries, the system runs multiple retrieval iterations, using information found in earlier passes to inform subsequent searches.
At the indexing layer, Linkup extracts what it calls "atoms of information": discrete, verifiable facts each tagged with a source URL, a timestamp, and a credibility signal. These atoms are embedded mathematically to allow meaning-based retrieval rather than keyword matching alone. The approach is designed to minimize the gap between what an AI agent asks for and what the API delivers, reducing the need for agents to issue multiple follow-up queries.
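Conceptually, retrieval over such atoms amounts to ranking timestamped, sourced facts by vector similarity to the query. The toy sketch below illustrates that idea with hand-made two-dimensional vectors; it is not Linkup's implementation, and the `Atom` fields are only those the description above names.

```python
import math
from dataclasses import dataclass

@dataclass
class Atom:
    """A discrete, verifiable fact tagged with provenance, as described above."""
    fact: str
    source_url: str
    timestamp: str
    embedding: list  # vector representation enabling meaning-based retrieval

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def rank_atoms(query_vec, atoms):
    """Order atoms by semantic similarity to the query vector, best first."""
    return sorted(atoms, key=lambda a: cosine(query_vec, a.embedding), reverse=True)

atoms = [
    Atom("Fact A", "https://example.com/a", "2026-01-01", [1.0, 0.0]),
    Atom("Fact B", "https://example.com/b", "2026-01-02", [0.0, 1.0]),
]
best = rank_atoms([0.9, 0.1], atoms)[0]
print(best.fact)  # Fact A
```

In a production index the embeddings would come from a learned model and the ranking from an approximate nearest-neighbor structure, but the retrieval contract is the same: a query vector in, provenance-tagged facts out.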
For certain content partners, Linkup integrates directly with publishers' backend content management systems rather than crawling publicly accessible pages. This allows the API to access content that sits behind paywalls or requires licensing, while compensating the publisher based on how frequently their content is retrieved. Mizrahi has cited a 15x speed advantage for this approach over conventional web scraping when dealing with partner sources, because there is no need to render pages or parse HTML.
The API exposes three primary endpoints: /search, /fetch, and /research.
The /search endpoint is the main offering. It accepts a natural-language query and returns either a sourced answer, a list of document chunks, or a structured JSON object, depending on the output_type parameter. Callers also specify a depth parameter that controls how aggressively the system searches before returning a result.
The /fetch endpoint retrieves the content of a specific URL, with an option to enable JavaScript rendering for pages that require it. This is useful when an agent already knows which page it needs but wants Linkup to handle content extraction and formatting.
The /research endpoint runs an asynchronous deep research task. Rather than returning results within seconds, it accepts a complex question and works for up to ten minutes, running multiple iterations of retrieval and synthesis before returning a comprehensive answer with citations.
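As a sketch, the three endpoints can be driven over plain HTTP. The base URL, request shape, and field names below are illustrative assumptions drawn from the parameter names mentioned in this article, not confirmed API details; the official reference should be consulted for the exact contract.

```python
# Sketch of building requests for Linkup's three endpoints.
# NOTE: the base URL and exact field names here are assumptions for illustration.
import json

BASE_URL = "https://api.linkup.so/v1"  # assumed base URL

def build_search_request(query: str, depth: str = "standard",
                         output_type: str = "sourcedAnswer") -> dict:
    """Build a /search request: natural-language query plus depth and output type."""
    return {
        "url": f"{BASE_URL}/search",
        "body": {"q": query, "depth": depth, "outputType": output_type},
    }

def build_fetch_request(url: str, render_js: bool = False) -> dict:
    """Build a /fetch request for a known page, optionally rendering JavaScript."""
    return {
        "url": f"{BASE_URL}/fetch",
        "body": {"url": url, "renderJs": render_js},
    }

def build_research_request(question: str) -> dict:
    """Build an asynchronous /research request for a complex, multi-step question."""
    return {"url": f"{BASE_URL}/research", "body": {"q": question}}

req = build_search_request("Who is the current CEO of Datadog?", depth="fast")
print(json.dumps(req["body"]))
```

Each builder returns a payload that would be POSTed with an API key header; separating payload construction from transport keeps the sketch runnable without network access.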
The /search endpoint offers three depth modes that trade latency for coverage:
Fast (beta): a single-pass, sub-second retrieval with no LLM involvement in query reformulation or result evaluation. Optimized for low latency in conversational settings where the user asks a specific, narrow question such as the current CEO of a company.
Standard: adds agentic query interpretation and can split a question into sub-queries where necessary. Suited for questions that do not require sequential reasoning across multiple pages. Priced the same as Fast.
Deep: runs up to ten iterations of agentic search, where each pass builds on context gathered in the previous one. Can resolve multi-hop questions and instructions that require visiting several sources in sequence. Priced at ten times the standard rate.
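The tradeoffs above can be condensed into a simple selection rule. The function below is an illustrative heuristic based on the three modes as described, not an official recommendation:

```python
def choose_depth(latency_budget_s: float, multi_hop: bool) -> str:
    """Pick a /search depth from the latency/coverage tradeoff described above.

    fast: single-pass, sub-second, no LLM reformulation (beta).
    standard: agentic query interpretation, same price as fast.
    deep: up to ten iterative passes, priced at 10x the standard rate.
    """
    if multi_hop:
        return "deep"      # sequential reasoning across pages needs iteration
    if latency_budget_s < 1.0:
        return "fast"      # a sub-second budget rules out agentic passes
    return "standard"      # agentic interpretation at no extra cost over fast

print(choose_depth(0.5, multi_hop=False))  # fast
```

Because Standard costs the same as Fast, the only reason to prefer Fast in this heuristic is a hard latency budget; Deep is reserved for multi-hop questions, where its 10x price buys iterative context-building.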
Callers can request one of three output formats via the output_type parameter:
A sourced answer: a natural-language answer accompanied by the sources that ground it.
Search results: a list of relevant document chunks, each with its source URL.
Structured output: a JSON object shaped for direct machine consumption.
Beyond depth and output type, the API supports domain filtering, allowing callers to include or exclude up to fifty domains from results. Date-range filtering restricts results to content published within a specified window. A numResults parameter controls how many chunks or search results are returned, and the API can optionally include images in results.
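These options can be assembled into a filter payload as sketched below. The parameter names (`includeDomains`, `fromDate`, and so on) are illustrative assumptions, and the fifty-entry cap is enforced here across both lists combined, which may differ from the API's actual rule:

```python
def build_filters(include_domains=None, exclude_domains=None,
                  from_date=None, to_date=None, num_results=None) -> dict:
    """Assemble search filter parameters, enforcing the documented 50-domain cap.

    Field names are illustrative; consult the API reference for exact fields.
    """
    include_domains = include_domains or []
    exclude_domains = exclude_domains or []
    if len(include_domains) + len(exclude_domains) > 50:
        raise ValueError("at most 50 domains may be included or excluded")
    filters = {}
    if include_domains:
        filters["includeDomains"] = include_domains
    if exclude_domains:
        filters["excludeDomains"] = exclude_domains
    if from_date:
        filters["fromDate"] = from_date   # ISO date string, e.g. "2025-01-01"
    if to_date:
        filters["toDate"] = to_date
    if num_results is not None:
        filters["numResults"] = num_results
    return filters

print(build_filters(include_domains=["insee.fr"], num_results=5))
```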
Linkup offers two enterprise-specific deployment modes. The Private Index option lets organizations contribute proprietary data to a dedicated, access-controlled index alongside the public web index. The Bring Your Own Cloud option runs the full Linkup indexing stack inside the customer's own VPC, meeting compliance requirements for organizations that cannot send queries or data to a third-party endpoint. Both modes retain the same API interface as the standard hosted offering.
On the security side, the company holds SOC 2 Type II certification and applies zero data retention (ZDR) policies, both available on all pricing tiers rather than restricted to enterprise contracts.
Linkup provides native SDKs for Python and JavaScript. It ships integration libraries for LangChain (the LinkupSearchRetriever tool), CrewAI, LlamaIndex, Agno, and the Hugging Face ecosystem. The company also distributes an MCP (Model Context Protocol) server, allowing any MCP-compatible client, including Cursor and Claude Desktop, to use Linkup as a web search tool without writing custom API code. Linkup additionally offers an OpenAI-compatible interface, letting teams already using OpenAI's client library add Linkup search with minimal code changes.
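The integration pattern behind these SDK and framework bindings can be illustrated with a minimal tool wrapper. The class below is hypothetical, not part of any shipped SDK; the transport is injected as a callable so the sketch runs without network access, where a real deployment would inject an HTTP client calling the API.

```python
from typing import Callable

class LinkupSearchTool:
    """Minimal agent-framework-style tool wrapper around a search transport.

    `transport` is any callable taking (query, depth) and returning a dict;
    in production it would POST to the Linkup API, here it can be a stub.
    """
    name = "linkup_web_search"
    description = "Retrieve sourced, current web information for a query."

    def __init__(self, transport: Callable[[str, str], dict], depth: str = "standard"):
        self.transport = transport
        self.depth = depth

    def run(self, query: str) -> str:
        result = self.transport(query, self.depth)
        answer = result.get("answer", "")
        sources = ", ".join(s["url"] for s in result.get("sources", []))
        return f"{answer} (sources: {sources})" if sources else answer

# Stub transport standing in for a real HTTP call:
def stub_transport(query: str, depth: str) -> dict:
    return {"answer": "42", "sources": [{"url": "https://example.com"}]}

tool = LinkupSearchTool(stub_transport)
print(tool.run("test"))  # 42 (sources: https://example.com)
```

Framework adapters (LangChain retrievers, CrewAI tools, MCP servers) are essentially variations on this shape: a name, a description the agent's model can read, and a `run` method that returns grounded text.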
In January 2025, LightOn, a European provider of secure enterprise AI under its Paradigm platform, integrated Linkup as its real-time web access layer. This partnership gave LightOn's customers, who tend to have strict data sovereignty requirements, access to current web information through a compliant pipeline.
A distinguishing element of Linkup's model is its approach to content sourcing. Most search APIs retrieve content by crawling publicly accessible web pages. Linkup supplements this with direct licensing agreements with publishers, giving it access to content that standard crawlers cannot retrieve while compensating publishers based on actual usage.
Known content partners include the news agency AFP, several French media properties, institutional data sources such as the Banque de France and INSEE, and legal publishers including the Cour de Cassation and the Centre Français de la Copie. The company also aggregates data from research providers such as Statista and Xerfi.
The business model is structured so that a significant portion of what customers pay flows through to the content providers whose work is retrieved. Mizrahi has described this as building infrastructure for the emerging "gated data future," anticipating that more content will move behind access controls as publishers recognize the commercial value of their data to AI systems.
This approach addresses a tension that became prominent in 2024 and 2025 as AI companies faced legal challenges over scraping practices. By negotiating licenses upfront and compensating publishers on a usage basis, Linkup frames itself as a compliant alternative for teams that need current information without the legal exposure that comes with scraping.
Linkup also partners with infrastructure providers to facilitate compensation at the technical layer. The company has worked with Cloudflare and Tollbit to integrate compensation mechanisms directly into the retrieval pipeline, allowing content owners to receive payment programmatically rather than through manually negotiated bilateral contracts. This reduces the overhead of signing new partnerships and makes it possible for smaller publishers to participate without the legal and commercial resources required for a traditional licensing deal.
Mizrahi has acknowledged that bilateral licensing at scale is operationally difficult. He noted in a 2024 interview that a single major partnership had taken approximately one year to negotiate. The programmatic compensation approach is intended to lower that friction, though the licensing landscape for AI-generated content access remains legally unsettled in many jurisdictions.
Linkup uses a pay-per-query pricing model denominated in euros. New accounts receive €5 in credits on signup, and accounts are topped up to €5 each month at no charge. Credits beyond that are purchased in advance.
The pricing tiers as of early 2026 are as follows:
| Endpoint | Depth / Mode | Price per call |
|---|---|---|
| /search | Fast (beta) | €0.005 |
| /search | Standard | €0.005 |
| /search | Deep | €0.05 |
| /fetch | Without JavaScript rendering | €0.001 |
| /fetch | With JavaScript rendering | €0.005 |
| /research | Asynchronous deep research | €0.80 |
No credit is deducted when a query returns an error or when no results are found. Accounts that exhaust their credits receive HTTP 429 responses until more credits are purchased. Enterprise pricing is available through direct negotiation.
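Using the price table above, a workload's monthly spend reduces to simple arithmetic. The sketch below hard-codes the published per-call euro prices as of early 2026:

```python
# Per-call prices in EUR, taken from the published price table.
PRICES = {
    ("search", "fast"): 0.005,
    ("search", "standard"): 0.005,
    ("search", "deep"): 0.05,
    ("fetch", "no_js"): 0.001,
    ("fetch", "js"): 0.005,
    ("research", "async"): 0.80,
}

def monthly_cost(calls: dict) -> float:
    """Estimate monthly spend in EUR for a mix of calls keyed like PRICES."""
    return round(sum(PRICES[key] * count for key, count in calls.items()), 2)

# Example: 10,000 standard searches, 500 deep searches, 20 research tasks.
cost = monthly_cost({
    ("search", "standard"): 10_000,   # 50.00
    ("search", "deep"): 500,          # 25.00
    ("research", "async"): 20,        # 16.00
})
print(cost)  # 91.0
```

Since failed or empty queries are not billed and exhausted accounts receive HTTP 429 responses, an estimate like this bounds spend from above for a given call mix.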
Linkup also supports an alternative payment pathway via the x402 protocol, a blockchain-based micropayment system using USDC on the Base network. This option requires no account setup, which is useful for automated agent pipelines that prefer payment-at-point-of-use.
The company offers €500 in additional free credits to qualifying startups through an application process.
Linkup has published comparisons on two publicly available evaluation frameworks: OpenAI's SimpleQA benchmark and SealQA.
SimpleQA tests factual recall by asking 4,326 short questions with a single correct answer. Linkup's Deep Search mode scored 90.10% on this benchmark, placing it at or near the top among search API providers at the time of evaluation. Exa finished at 90.04%, essentially tied. Linkup's Standard Search scored 85%, compared to 77% for Perplexity Sonar and 86% for Perplexity Sonar Pro. Linkup's own documentation claims the standard search endpoint achieves a 92% F-score on SimpleQA Verified.
On SealQA-0, a benchmark designed to evaluate complex multi-step research tasks, the Linkup /research endpoint claimed first place with 61% accuracy.
Linkup also published an internal evaluation using a dataset of 600 queries drawn partly from real anonymized user traffic and partly from synthetic multi-hop questions. The study compared Linkup, Exa, Tavily, and Perplexity on source diversity, faithfulness (claims grounded in cited sources), and entity coverage. The evaluation methodology weighted faithfulness and completeness equally at 45% each, with source quality accounting for the remaining 10%. Linkup reported retrieving up to three times more unique domains per query than competitors and achieving roughly four times lower missing-entity rates. Hallucination rates, measured by decomposing responses into atomic claims and checking each against cited sources, were lowest for Linkup across the dataset.
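The stated weighting (45% faithfulness, 45% completeness, 10% source quality) reduces to a weighted sum. The function below is a reconstruction of that arithmetic for illustration, not the released evaluation code:

```python
def composite_score(faithfulness: float, completeness: float,
                    source_quality: float) -> float:
    """Combine per-dimension scores (each in [0, 1]) with the stated weights:
    45% faithfulness, 45% completeness, 10% source quality."""
    for value in (faithfulness, completeness, source_quality):
        if not 0.0 <= value <= 1.0:
            raise ValueError("scores must be in [0, 1]")
    return round(0.45 * faithfulness + 0.45 * completeness + 0.10 * source_quality, 4)

print(composite_score(0.9, 0.8, 0.7))  # 0.835
```

The low weight on source quality means a provider cannot compensate for ungrounded or incomplete answers with prestigious sources alone, which matches the study's emphasis on faithfulness.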
The company acknowledged limitations in this study: the dataset emphasizes complex multi-entity queries that may favor Linkup's optimization; only the standard API tier was evaluated; and the study was self-conducted, though Linkup released the evaluation code as open source to allow independent replication.
Linkup operates in a growing market for AI-native search APIs. The main competitors can be grouped by approach.
Tavily is the most widely referenced alternative in the AI developer community. It launched earlier and built integrations with LangChain and other frameworks that gave it significant developer awareness. Tavily focuses on providing AI-optimized search results with source attribution and offers both a fast "context" mode and a more thorough research mode, the latter priced at roughly $8 per thousand queries. Linkup's standard search at €0.005 per call translates to approximately $5.50 per thousand calls, making it price-competitive on the mid tier.
Exa (formerly Metaphor) uses neural semantic search to find pages that are conceptually related to a query rather than just keyword-matching. Exa is particularly strong for discovering content by similarity or by finding pages that link to a given concept. Its scores on SimpleQA (90.04%) are nearly identical to Linkup's. Exa charges between $1 and $5 per thousand queries for neural search.
Brave Search offers a search API built on Brave's independent index, one of the few large web indexes not derived from Google or Bing. Brave's API is privacy-oriented and does not log queries or associate them with user identities, making it useful for healthcare, legal, and government applications. Brave's pricing is significantly lower than most AI-native options, with a free tier of two thousand queries per month. However, Brave Search returns traditional search results rather than AI-processed content chunks, placing it closer to a traditional SERP API than an AI-native retrieval layer.
Serper provides fast, low-cost access to Google search results via API, aimed at developers who need structured SERP data. At roughly $0.30 to $1.00 per thousand queries, Serper is substantially cheaper than Linkup or Tavily. The tradeoff is that Serper returns search result metadata rather than extracted content, so the consuming application must handle content retrieval and chunking separately.
Perplexity offers its Sonar API, which adds an LLM layer on top of search to return synthesized answers. Because Perplexity's primary product is its consumer search application, the API is priced at a premium relative to pure search providers, ranging from $8 to $15 per thousand queries for the Pro tier. Developers building applications that compete with Perplexity's consumer product have noted that the company has at times restricted API access or adjusted terms.
| Provider | Approach | Price per 1,000 queries (standard) | Free tier |
|---|---|---|---|
| Linkup | AI-native, agentic search | ~$5.50 (€5) | €5 credit/month (~1,000 standard queries) |
| Tavily | AI-optimized SERP | ~$8.00 | 1,000/month |
| Exa | Neural semantic search | $1.00–$5.00 | 1,000/month |
| Brave Search | Independent index, privacy-first | ~$3.00–$5.00 | 2,000/month |
| Serper | Google SERP data | $0.30–$1.00 | 2,500/month |
| Perplexity Sonar | LLM-synthesized answers | $8.00–$15.00 | $5 credit |
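The per-1,000 dollar figures above can be derived from per-call euro prices; the exchange rate used here (1 EUR ≈ 1.10 USD) is an assumption for illustration and will drift with the market:

```python
EUR_TO_USD = 1.10  # assumed exchange rate

def usd_per_thousand(eur_per_call: float) -> float:
    """Convert a per-call euro price to a per-1,000-query dollar price."""
    return round(eur_per_call * 1000 * EUR_TO_USD, 2)

print(usd_per_thousand(0.005))  # 5.5  (Linkup standard /search)
print(usd_per_thousand(0.05))   # 55.0 (Linkup deep /search)
```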
The key differentiators Linkup emphasizes against the field are: licensed premium content unavailable through crawling; the agentic multi-pass retrieval on deep queries; and its API-first positioning with no competing consumer product that might create incentive to limit API capabilities.
Linkup's customer base spans several application categories. The following use cases appear repeatedly in the company's documentation and customer descriptions.
Legal research is a prominent application. Legora, an AI company that builds software for law firms, uses Linkup to retrieve current case law, regulatory filings, and legal commentary that post-dates any model's training data. Accurate citation of current legal sources is non-negotiable in legal work, making a grounded, sourced retrieval layer more important than in consumer-facing applications.
Financial intelligence and due diligence is another frequent category. Analysts running due diligence on a company need current news, recent filings, and up-to-date financial statistics. Linkup's Deep Search mode can chase multi-hop questions across several sources in a single API call, reducing the number of queries an agent needs to issue.
Prediction markets and real-time intelligence are also represented. Polymarket, the prediction market platform, uses Linkup to give its AI features access to current event information.
Sales intelligence and pipeline enrichment workflows use the API to augment contact and company records with fresh news and recent activity, typically via integrations with CRM systems or tools like Clay.
Cybersecurity applications use Linkup to monitor for breach disclosures, vulnerabilities, and threat intelligence in near real time.
Customer service chatbots that need to answer questions about current policies, prices, or product availability benefit from the API by avoiding the staleness problem inherent in models trained months before deployment.
Enterprise knowledge management is addressed through the Private Index option, which lets organizations index internal documents alongside public web content, making Linkup a RAG backend for internal-facing agents.
Voice AI applications benefit from the Fast search depth because sub-second latency is a hard requirement when an AI assistant needs to answer a spoken question without a perceptible pause. Linkup's documentation lists voice AI as an explicit use case supported by the Fast endpoint.
Corporate reputation monitoring and competitive intelligence are served by the ability to run continuous queries against current news and social media sources, with Linkup abstracting the retrieval and formatting work so that the consuming application receives structured data without managing multiple separate data feeds.
Linkup received coverage from TechCrunch at launch in November 2024, which noted the company's focus on connecting LLMs with premium content sources through licensing rather than scraping. EU Startups and several French-language technology outlets including Maddyness covered the funding rounds.
The composition of angel investors in the $10 million round drew attention in the French startup ecosystem. Having Arthur Mensch of Mistral AI and Florian Douetteau of Dataiku as backers was widely noted, given that both represent flagship French AI companies. Kima Ventures also publicly acknowledged its participation in the €3 million round via LinkedIn.
Within the developer community, Linkup has been compared favorably to Tavily on price and to Exa on factual accuracy. Third-party benchmark comparisons from sites including aiagentslist.com have cited Linkup's SimpleQA scores and its lower per-query cost relative to Perplexity Sonar. Some developers have noted that Linkup's licensed premium content is useful for use cases requiring financial data from sources such as Statista or Xerfi, which are not accessible through standard crawling.
The company's approach to publisher compensation has been discussed in the context of broader debates about AI and intellectual property. By paying content providers for access to their material and structuring deals so that a meaningful share of revenue flows to sources, Linkup is positioned as a model for how AI infrastructure companies can work with publishers rather than against them. This framing has resonated with European media companies and regulators who have been more aggressive than US counterparts in pursuing AI companies over unlicensed content use.
By early 2026, Linkup had acquired hundreds of paying customers. The company has publicly identified several by name or sector.
Legora, an AI legal software company, uses Linkup to ground its case-law and regulatory research features with current citations. Polymarket, the prediction market platform, uses Linkup as a real-time information feed for its AI-powered features. Cohere, the enterprise AI company, is listed as a customer in Linkup's documentation, suggesting use for grounding or augmenting its language model outputs with retrieved content. McKinsey is listed among enterprise customers, and SNCF, the French national rail operator, is also cited, likely for internal knowledge management or customer-facing AI assistant applications.
Artisan, an AI company building autonomous sales agents, is one of the notable AI-native customers referenced in Linkup's seed round announcement. KPMG is cited as a global enterprise customer.
Linkup's more thorough modes are unsuitable when latency requirements fall below one second, since Deep search and the /research endpoint trade speed for coverage. The sub-second Fast mode retrieves only a single pass of results, which may miss relevant content for complex or multi-faceted questions.
The self-published benchmark comparison against Tavily, Exa, and Perplexity used a dataset drawn partly from Linkup's own traffic. This creates a potential selection bias: real user queries from Linkup's customer base may favor query types where Linkup's architecture performs best. The company released the evaluation code to allow independent replication, but as of early 2026 no third-party replication had been published.
The publisher partnership model, while useful for accessing licensed content, means the depth of coverage depends on which publishers Linkup has signed agreements with. For topics not covered by partner publishers, the API falls back to crawled web content like other providers.
Enterprise features including Bring Your Own Cloud deployment and the Private Index require custom agreements and are not self-serve, which may slow adoption for organizations with complex procurement processes.
As a relatively young company with a small team, Linkup has a shorter track record of production reliability compared to more established providers. The 99.9% uptime SLA is stated in documentation but has not been independently audited over a long observation window.