
Hugging Face

The AI community platform for sharing models, datasets, and machine learning apps. Hugging Face hosts over one million open-source models and maintains the Transformers library, the de facto standard for NLP and ML development. With model hosting, inference APIs, and collaborative tools, it serves as the GitHub of machine learning and a central pillar of the open-source AI ecosystem.

AI Infrastructure
ML Platform
Clément Delangue · CEO & Co-Founder
Total Raised
$391M
Last Valuation
$4.5B
Founded
2016
Headquarters
New York, NY
Employees
300+

Overview

Hugging Face has become the central hub of the open-source AI ecosystem, hosting over one million models, hundreds of thousands of datasets, and tens of thousands of AI-powered demo applications. Often described as the GitHub of machine learning, the platform enables researchers and developers to share, discover, and collaborate on AI models and datasets with unprecedented ease.

Hugging Face's Transformers library is the most widely used open-source framework for working with large language models, and its ecosystem includes libraries for training (TRL), dataset management, model evaluation, and deployment. The company monetizes through Hugging Face Enterprise, which offers private model hosting, inference endpoints, and collaboration features for organizations building AI products.

Hugging Face has built deep partnerships with every major cloud provider and hardware vendor, making it the default neutral ground where the AI community converges. The platform plays a critical role in AI democratization by making state-of-the-art models accessible to developers worldwide.

Funding History

5 rounds · May 2017 – Sep 2025 · $391M total raised

Round 5 · Valuation: $4.5B · Cumulative: $391M

Open-source AI hub becomes the GitHub of machine learning.

Investors: Felicis Ventures (Lead) · Sequoia Capital · Google Ventures (GV)
Round 4 · Valuation: $2.0B · Cumulative: $156M

Hugging Face reaches a $2B valuation as its platform cements its status as the GitHub of machine learning.

Investors: Lux Capital (Lead)
Round 3 · Valuation: $267M · Cumulative: $56M

Hugging Face raises $40M as its model hub surpasses 30,000 models.

Investors: Addition (Lead)
Round 2 · Valuation: $75M · Cumulative: $16M

Hugging Face raises Series A as its Transformers library becomes the standard for NLP research.

Investors not disclosed
Round 1 · Valuation: $5M · Cumulative: $1M

Hugging Face raises seed round to build NLP tools and open-source model hub.

Investors not disclosed

Frequently Asked Questions

How much has Hugging Face raised in total?
Hugging Face has raised a total of $391M across 5 funding rounds.
Who are Hugging Face's investors?
Hugging Face's investors include Sequoia Capital, Felicis Ventures, Google Ventures (GV), Lux Capital, and Addition; investors in its earliest rounds were not disclosed.
What does Hugging Face do?
Hugging Face operates the central hub of the open-source AI ecosystem, hosting over one million models, hundreds of thousands of datasets, and tens of thousands of AI-powered demo applications. Its Transformers library is the most widely used open-source framework for working with large language models, and the company monetizes through Hugging Face Enterprise, which offers private model hosting, inference endpoints, and collaboration features for organizations building AI products.
When was Hugging Face founded?
Hugging Face was founded in 2016 and is headquartered in New York, NY.
Where is Hugging Face headquartered?
Hugging Face is headquartered in New York, NY.