Clément Delangue is a French entrepreneur and the co-founder and CEO of Hugging Face, one of the most prominent open-source artificial intelligence platforms in the world. Under his leadership, Hugging Face has grown from a teenage chatbot startup into a central hub for machine learning research and deployment, often described as the "GitHub of AI." Delangue is widely recognized as a leading advocate for open-source AI and the democratization of artificial intelligence technology.
Delangue grew up in La Bassée, a small town in northern France. He was the third of four children. His mother worked as a nurse and his father ran a garden equipment shop. Despite growing up in a modest, rural setting, Delangue showed an entrepreneurial drive from a young age. When he received his first computer at the age of 12, he quickly began exploring ways to use it for business.
As a teenager, Delangue and his older brother began importing ATVs and motorbikes from China, selling them through their father's shop and on eBay. By the age of 17, he had become one of the most prominent French sellers on the platform. This early experience in e-commerce gave him a practical understanding of global trade and online marketplaces years before he entered the technology industry.
Delangue enrolled at ESCP Business School in Paris, where he studied from 2008 to 2012 and earned a Master in Management degree. During his time at ESCP, he participated in international exchange programs at Universidad Carlos III de Madrid (2009 to 2010) and the Indian Institute of Management Bangalore (2010 to 2011). He also studied at University College Dublin from 2011 to 2012. In addition to his formal coursework, he completed the "Introduction to Computer Science" and "Programming Methodology" courses offered through Stanford Engineering Everywhere between 2011 and 2012, which deepened his interest in software development and machine learning.
After graduating from ESCP in 2012, Delangue turned down a job offer from Google to pursue his own ventures. His first startup attempt was a collaborative note-taking application called UniShared, which did not gain significant traction. He also co-founded VideoNot.es, a tool for syncing notes with online video lectures.
Delangue's formative professional experience came at Moodstocks, a Paris-based startup focused on computer vision and machine learning for image recognition. At Moodstocks, he served as Head of Sales and Marketing. The company developed an application that allowed users to scan physical objects and see relevant product information, reviews, and purchase links. Moodstocks was acquired by Google in 2016. This experience gave Delangue direct exposure to the commercial potential of machine learning and solidified his belief that AI technology should be widely accessible.
Before founding Hugging Face, Delangue also held product and marketing roles at several other startups, a number of which were later acquired. He had previously interned on eBay's EU Enterprise Team during his university years.
In 2016, while living in New York City, Delangue connected with Julien Chaumond, a French software engineer who had previously worked at France's Ministry of Economy and at the Paris-based video startup Stupeflix. The two had been aware of each other's work for years and decided to collaborate. They enrolled together in an online Stanford engineering course and assembled a study group of roughly three dozen people. Among the study group members was Thomas Wolf, a physicist and machine learning researcher who knew Chaumond from engineering school; the two had also played together in a rock band.
When the course ended, Delangue and Chaumond invited Wolf to join them. The three co-founders shared a common interest in natural language processing (NLP) and set out to tackle one of the hardest challenges in the field: building an open-domain conversational AI system.
Hugging Face was officially founded in 2016 in New York City. The company name was inspired by the hugging face emoji, reflecting the playful and approachable tone of their original product. Early funding came after the team secured a spot in the Betaworks chatbot-focused startup accelerator program in New York, along with an initial $200,000 investment from Betaworks. In March 2017, the company raised a $1.2 million angel round from investors including Betaworks, SV Angel, and NBA star Kevin Durant.
The original Hugging Face product was a mobile chatbot application aimed at teenagers. Described by some as an "AI Tamagotchi," the app allowed users to create a digital companion and carry on text conversations. The chatbot attempted to detect user emotions and adapt its responses accordingly. At its peak, the app attracted around 100,000 daily active users and exchanged over a billion messages. However, the team found that improvements to the underlying technology did not translate into proportional gains in user engagement or retention. The chatbot, while moderately popular, was limited by the state of NLP technology at the time.
The pivotal moment in Hugging Face's history came in late 2018, when Google released BERT (Bidirectional Encoder Representations from Transformers), a breakthrough language model that transformed the NLP landscape. Thomas Wolf and the Hugging Face team quickly produced a PyTorch implementation of BERT and released it as open source on GitHub within a week. The release attracted substantial attention from the machine learning community and demonstrated that there was strong demand for accessible, well-implemented open-source NLP tools.
The community's response clarified the company's direction. In 2019, Delangue and his co-founders formally pivoted Hugging Face away from the consumer chatbot product and toward building open-source machine learning infrastructure. The decision to open-source the model behind their chatbot, and then to build developer tools around that ethos, became the foundation for everything the company would do going forward.
The centerpiece of Hugging Face's pivot was the Transformers library, released in 2019. Originally focused on NLP models, the library provided a unified API for working with state-of-the-art transformer architectures such as BERT, GPT-2, and later T5, RoBERTa, and many others. The library made it simple for developers and researchers to download, fine-tune, and deploy pre-trained models with just a few lines of code.
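The design idea behind a "unified API" can be illustrated with a toy registry pattern: many architectures, one string-keyed entry point, so callers never need to know which model class they are loading. The sketch below is illustrative only (all class and function names are hypothetical, not the real Transformers code); in the actual library, the analogous entry points are `pipeline` and the `AutoModel.from_pretrained` family.

```python
# Toy sketch (illustrative, hypothetical names) of the "unified API" idea:
# heterogeneous model architectures dispatched through one entry point.

class BertLike:
    name = "bert"
    def predict(self, text: str) -> str:
        return f"[bert] encoded: {text}"

class GptLike:
    name = "gpt2"
    def predict(self, text: str) -> str:
        return f"[gpt2] continued: {text}"

# Central registry maps a model identifier string to its implementation,
# so adding a new architecture never changes caller code.
_REGISTRY = {cls.name: cls for cls in (BertLike, GptLike)}

def auto_model(model_id: str):
    """Instantiate the right architecture from a string id alone."""
    try:
        return _REGISTRY[model_id]()
    except KeyError:
        raise ValueError(f"unknown model id: {model_id}") from None

model = auto_model("gpt2")
print(model.predict("Hello"))
```

The single-dispatch design is what lets "a few lines of code" stay the same whether the underlying model is BERT, GPT-2, or a later architecture.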
In December 2019, Hugging Face raised $15 million in a Series A round led by Lux Capital to build what it called "the definitive NLP library." Other investors included A.Capital Ventures, Betaworks, and individual angels such as Richard Socher (then chief scientist at Salesforce) and Greg Brockman (co-founder and CTO of OpenAI).
The Transformers library rapidly became one of the most widely used open-source projects in machine learning, expanding beyond NLP to support computer vision, audio processing, and multimodal models. As of 2025, over one million model checkpoints compatible with the library were available on the Hugging Face Hub.
Alongside the Transformers library, Hugging Face developed the Hub, a platform for hosting and sharing machine learning models, datasets, and demo applications. The Hub functions as a version-controlled repository for AI artifacts, similar in concept to GitHub for code. Researchers and developers can upload trained models, discover models shared by others, and run inference directly through the Hub's interface.
By 2025, the Hub hosted over 2 million public models and more than 500,000 public datasets, with 13 million registered users. Major technology companies including Microsoft, Meta, and Google use the platform to distribute their own open-source models.
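The core mechanics of a version-controlled artifact store like the Hub can be sketched in a few lines: content-addressed blob storage plus a per-repository revision history, so identical content is stored once and any past revision stays retrievable. This is a deliberately tiny toy (the real Hub is Git-backed); every name here is hypothetical.

```python
# Toy sketch of a version-controlled artifact store in the spirit of the
# Hub (illustrative only; the real platform is Git-backed). Hypothetical names.
import hashlib

class ToyHub:
    def __init__(self):
        self._blobs = {}      # content hash -> bytes (deduplicated storage)
        self._revisions = {}  # repo name -> ordered list of content hashes

    def push(self, repo: str, data: bytes) -> str:
        """Record one revision of an artifact; identical content is stored once."""
        digest = hashlib.sha256(data).hexdigest()
        self._blobs.setdefault(digest, data)
        self._revisions.setdefault(repo, []).append(digest)
        return digest

    def pull(self, repo: str, revision: int = -1) -> bytes:
        """Fetch an artifact at a given revision index (default: latest)."""
        return self._blobs[self._revisions[repo][revision]]

hub = ToyHub()
hub.push("demo/model", b"weights-v1")
hub.push("demo/model", b"weights-v2")
assert hub.pull("demo/model") == b"weights-v2"     # latest revision
assert hub.pull("demo/model", 0) == b"weights-v1"  # first revision
```

Content addressing is what makes the "GitHub for AI artifacts" analogy work: the same immutable-blob-plus-history model underlies both.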
In 2021, Hugging Face launched Spaces, a feature that allows users to host interactive machine learning demos for free. Spaces integrates with Gradio, an open-source library for building web-based ML interfaces. Also in late 2021, Hugging Face acquired Gradio, bringing the five-person Gradio engineering team in-house. The acquisition strengthened Hugging Face's ability to provide end-to-end tools for building, sharing, and demonstrating machine learning applications.
Hugging Face launched the first version of its serverless Inference API in the summer of 2020, enabling developers to run model inference through hosted endpoints. This became one of the company's early revenue-generating products. Over time, Hugging Face expanded its commercial offerings to include enterprise plans with private model hosting, dedicated support, and enhanced security and compliance features.
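A hosted-inference call of this kind is, at its core, an authenticated HTTP POST carrying the input and returning model output. The sketch below only assembles such a request without sending it; the host URL, header names, and payload schema are illustrative assumptions, not the documented API of any specific service.

```python
# Illustrative sketch: assemble (but do not send) an HTTP POST for a
# hosted-inference call. URL shape and payload schema are assumptions.
import json
from urllib.request import Request

def build_inference_request(model_id: str, text: str, token: str) -> Request:
    url = f"https://example-inference-host/models/{model_id}"  # hypothetical host
    payload = json.dumps({"inputs": text}).encode("utf-8")
    return Request(
        url,
        data=payload,
        headers={
            "Authorization": f"Bearer {token}",  # API-token authentication
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_inference_request("some-model-id", "Hello!", "example-token")
print(req.get_full_url())
```

The appeal of this shape for developers is that inference becomes a plain web request: no GPU provisioning, model download, or framework dependency on the client side.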
The company's business model follows a freemium structure. Individual Pro plans are available at $9 per month, team plans at $20 per user per month, and enterprise contracts are priced based on use case. Enterprise customers receive access to managed private deployment hubs and guidance from Hugging Face's machine learning experts, including co-founder Thomas Wolf. Notable enterprise clients include Intel, Pfizer, Bloomberg, and eBay.
Beyond the Transformers library, Hugging Face has developed and maintained a growing ecosystem of open-source tools:
| Product | Description |
|---|---|
| Transformers | Unified API for state-of-the-art transformer models across NLP, vision, audio, and multimodal tasks |
| Datasets | Library for accessing and processing machine learning datasets |
| Diffusers | Library for generative diffusion models such as Stable Diffusion |
| Tokenizers | High-performance tokenization engine written in Rust |
| Accelerate | Library for distributed training across GPUs and TPUs |
| Optimum | Hardware-specific optimizations for training and inference |
| Safetensors | Secure file format for storing model weights |
| AutoTrain | Automated model selection, training, and deployment |
| HuggingChat | Open-source ChatGPT alternative, launched in April 2023 |
Under Delangue's leadership, Hugging Face has raised significant venture capital across multiple funding rounds.
| Round | Date | Amount | Lead Investor | Post-Money Valuation |
|---|---|---|---|---|
| Angel/Seed | March 2017 | $1.2M | Betaworks | N/A |
| Seed | May 2018 | $4M | N/A | N/A |
| Series A | December 2019 | $15M | Lux Capital | N/A |
| Series B | March 2021 | $40M | Addition | N/A |
| Series C | May 2022 | $100M | Lux Capital | $2 billion |
| Series D | August 2023 | $235M | Salesforce Ventures | $4.5 billion |
The Series D round in August 2023 attracted a roster of major technology companies as investors, including Google, Amazon, NVIDIA, AMD, Intel, Qualcomm Ventures, IBM, and Sound Ventures. The $4.5 billion post-money valuation represented a doubling of the company's value from its Series C just over a year earlier. Total funding raised through 2023 reached approximately $395 million.
Hugging Face's revenue grew substantially alongside the broader AI boom. The company generated approximately $10 million in revenue in 2021 (its first year of monetization), $15 million in 2022, roughly $70 million in 2023, and approximately $130 million in 2024. By early 2025, the company served over 50,000 organizations, with more than 2,000 paying enterprise clients.
As of early 2026, Hugging Face employed approximately 684 people. The company operates on a remote-first basis, with employees distributed across cities worldwide.
Hugging Face has pursued targeted acquisitions to strengthen its platform:
| Acquisition | Year | Description |
|---|---|---|
| Gradio | 2021 | Open-source library for building ML demo interfaces |
| XetHub | 2024 | Data version control for large AI repositories (founded by ex-Apple engineers) |
| Pollen Robotics | 2025 | French humanoid robotics startup, maker of the Reachy robot |
The acquisition of XetHub in August 2024 was described by Delangue as the largest acquisition in Hugging Face's history at that time. XetHub's technology enables Git to scale to terabyte-sized repositories, supporting individual files larger than 1 TB. The Pollen Robotics acquisition in April 2025 signaled Hugging Face's expansion into open-source robotics, with Delangue stating his vision to "make Artificial Intelligence robotics Open Source."
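A common way to make very large files version-friendly, and the general family of techniques XetHub's work belongs to, is chunk-level deduplication: split a file into chunks, store each unique chunk once, and represent a revision as a list of chunk hashes, so a new revision costs only its changed chunks. The toy below illustrates that idea with fixed-size chunks; it is not XetHub's actual design (real systems use content-defined chunking at kilobyte-to-megabyte scale).

```python
# Toy chunk-deduplication sketch (illustrative only, not XetHub's design).
import hashlib

CHUNK = 4  # tiny chunk size for illustration; real systems use KB-MB chunks

def chunk_hashes(data: bytes, store: dict) -> list:
    """Split data into fixed-size chunks, store each unique chunk once,
    and return the digest list that reconstructs the file."""
    digests = []
    for i in range(0, len(data), CHUNK):
        chunk = data[i:i + CHUNK]
        d = hashlib.sha256(chunk).hexdigest()
        store.setdefault(d, chunk)  # dedup: identical chunks stored once
        digests.append(d)
    return digests

def reconstruct(digests: list, store: dict) -> bytes:
    return b"".join(store[d] for d in digests)

store = {}
v1 = chunk_hashes(b"AAAABBBBCCCC", store)  # three unique chunks
v2 = chunk_hashes(b"AAAABBBBDDDD", store)  # only the changed chunk is new
assert reconstruct(v1, store) == b"AAAABBBBCCCC"
assert len(store) == 4  # v2 added just one chunk on top of v1's three
```

Because unchanged chunks are shared across revisions, storage and transfer scale with what changed rather than with total file size, which is what makes terabyte-scale repositories tractable.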
Delangue is one of the most vocal advocates in the technology industry for open-source approaches to AI development. His core argument is that transparency and broad access to models and data are the most effective means to identify, understand, and mitigate the harms that AI systems can cause. Rather than concentrating control of AI in a small number of large companies, Delangue believes that open development distributes both the benefits and the responsibility across a global community of researchers, developers, and organizations.
In interviews, Delangue has drawn a comparison between open-source and closed-source AI and the history of internet search engines, suggesting that the performance advantages of proprietary systems are likely to narrow over time as open models improve. He has argued that model performance will increasingly depend on access to private, domain-specific data rather than on raw compute or architecture alone.
Delangue has stated: "Open science and open-source AI prevent blackbox systems, make companies more accountable, and help solve challenges, like mitigating biases, reducing misinformation, promoting copyright, and rewarding all stakeholders including artists and content creators." He has also said: "Our main goal is not so much to build a big company or to make money... I'm most excited about the potential for change."
Hugging Face's platform reflects this philosophy in practical ways. The company does not employ dedicated community managers; instead, all employees participate in community engagement. The Hub enforces community standards intended to keep harmful AI models from being distributed, and Hugging Face supports gated model releases that allow researchers to share models for specific research purposes while limiting broader misuse.
A major demonstration of Delangue's open-source vision was the BLOOM (BigScience Language Open-science Open-access Multilingual) project. Released in July 2022, BLOOM was a 176-billion-parameter open-source large language model capable of generating text in 46 natural languages and 13 programming languages. The project was the result of a year-long collaborative research effort called BigScience, led by Hugging Face and involving several hundred volunteer researchers from academia and the private sector worldwide. BLOOM was released under the BigScience Responsible AI License, which grants free access and use subject to responsible-use restrictions, making it one of the largest openly available language models at the time of its release.
Delangue has invested in building ethical AI capabilities within Hugging Face. In August 2021, the company hired Dr. Margaret Mitchell as Chief Ethics Scientist. Mitchell, who had previously founded and co-led the Ethical AI Team at Google before being fired over a dispute about her research on AI bias, brought significant expertise in algorithmic fairness, inclusion, and transparency to Hugging Face. Under her leadership, Hugging Face developed protocols for ethical AI research and model documentation, including contributions to model cards and data sheets that have become standard practices in the field.
On June 22, 2023, Delangue testified before the United States House of Representatives Committee on Science, Space and Technology at a hearing titled "Artificial Intelligence: Advancing Innovation Towards the National Interest." In his five-minute testimony, Delangue argued that open-source AI is "extremely aligned" with American interests because it distributes economic gains by enabling hundreds of thousands of small companies and startups to build with AI, fostering innovation and fair competition.
Delangue accompanied his testimony with several specific policy recommendations.
Delangue told Congress that "open systems foster democratic governance and increased access, especially to researchers, and can help to solve critical security concerns by enabling and empowering safety research." He also noted that open-source development creates "a safer path for development of the technology by giving civil society, nonprofits, academia and policymakers the capabilities they need to counterbalance the power of big private companies."
Delangue's position on AI safety centers on the idea that openness and transparency are themselves safety mechanisms. He has argued that the biggest risk in AI is "to have power and understanding concentrated in the hands of a few," particularly when those entities are not focused on the public good. He acknowledges that any AI system, regardless of its level of openness, carries risks and potential for misuse, but proposes that openness enables the broadest possible community to audit, test, and improve AI systems.
At the same time, Delangue has recognized the need for practical safeguards. Hugging Face implements community moderation, gating mechanisms for sensitive model releases, and content policies that restrict the distribution of harmful models on its platform. Delangue has described this as balancing tensions between openness and safety using a combination of policy and technical controls.
He has supported calls from AI safety organizations, including Anthropic, for increased investment in NIST to help develop standards for fighting AI bias and risk. Delangue participated in the OECD AI Policy Forum, where he contributed to discussions about the role of open science in responsible AI governance.
Delangue has received several notable recognitions for his work in artificial intelligence and entrepreneurship.
| Year | Award or Recognition |
|---|---|
| 2017 | Forbes 30 Under 30 Europe (Technology) |
| 2021 | Vanity Fair's Top 100 Most Influential French People |
| 2023 | TIME100 AI: 100 Most Influential People in Artificial Intelligence |
The TIME100 AI list, first published in September 2023, recognized Delangue alongside figures such as Sam Altman, Demis Hassabis, and Yann LeCun. In the TIME profile, written by correspondent Billy Perrigo, Delangue reflected on the choice between controlled and collaborative approaches to AI development: "Everyone has been building very collaboratively. I think it's important for people to remember that."
Delangue has shaped Hugging Face's culture around principles of openness, speed, and asynchronous communication. The company's stated values include acting quickly, communicating asynchronously, and sharing transparently. Hugging Face operates as a remote-first organization, with no central headquarters and employees distributed globally.
Delangue has spoken about how his upbringing in a small French town influenced his worldview. Growing up as the third of four children in a household where travel was both logistically and financially difficult shaped what he describes as his role as a connector and peacekeeper within the family. He has criticized traditional education as being too insular, once stating: "The classroom is a bubble. Whereas it should be a place for openness to the world, to others."
Delangue has expressed interest in eventually taking Hugging Face public. He and his co-founders have joked about wanting Hugging Face to be "the first company to go public with an emoji, rather than a three-letter ticker," referring to the company's signature hugging face emoji. As of early 2026, Hugging Face remains a private company. Lux Capital investor Brandon Reeves has suggested the company could eventually be valued at $50 to $100 billion.
Delangue is based in New York City, where Hugging Face was founded. He is active on social media, particularly on X (formerly Twitter) and LinkedIn, where he frequently shares his views on open-source AI, company updates, and predictions about the AI industry. In a December 2023 LinkedIn post, he shared six predictions for AI in 2024, including that a hyped AI company would go bankrupt or be acquired and that open-source models would continue to close the gap with proprietary alternatives.