AI & ML interests

The AI community building the future.

Recent Activity

sayakpaul updated a dataset about 9 hours ago: huggingface/diffusers-metadata
lysandre updated a dataset about 18 hours ago: huggingface/transformers-metadata

Articles

pcuenq in huggingface/HuggingDiscussions about 2 hours ago

[FEEDBACK] Local apps

#31 opened almost 2 years ago by kramp
irenesolaiman published an article 2 days ago

State of Open Source on Hugging Face: Spring 2026

evalstate posted an update about 1 month ago
Hugging Face MCP Server v0.3.2
~~~~~~~~~~~~~~~~~~~~~~~~~~~~

- Replace model_search and dataset_search with a combined hub_repo_search tool.
- Less distracting description for hf_doc_search.
- model_search and dataset_search tool calls still function (planned for removal next release).
  • 4 replies
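The first bullet merges two search tools into one. Below is a minimal sketch of what a combined repo search could look like: the tool name hub_repo_search comes from the post, but the merge logic, the RepoHit type, and all parameters are hypothetical illustrations, not the MCP server's actual implementation.

```python
from dataclasses import dataclass


@dataclass
class RepoHit:
    """One search hit, tagged with its repo type (hypothetical type)."""
    repo_id: str
    repo_type: str  # "model" or "dataset"


def hub_repo_search(query, model_results, dataset_results, limit=10):
    """Merge pre-fetched model and dataset results into one ranked list.

    Illustrative only: ranks hits whose repo id contains the query first,
    as a stand-in for whatever relevance ordering the real tool uses.
    """
    hits = [RepoHit(r, "model") for r in model_results]
    hits += [RepoHit(r, "dataset") for r in dataset_results]
    # Stable sort: ids containing the query string come first.
    hits.sort(key=lambda h: query.lower() not in h.repo_id.lower())
    return hits[:limit]
```

For example, searching "diffusers" over one model hit and one dataset hit returns the matching model repo first, with each hit carrying its repo type so a client no longer needs two separate tool calls.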
AdinaY posted an update about 1 month ago
MiniMax M2.5 is now available on the hub 🚀

MiniMaxAI/MiniMax-M2.5

✨ 229B - Modified MIT license
✨ 37% faster than M2.1
✨ ~$1/hour at 100 TPS
  • 2 replies
AdinaY posted an update about 1 month ago
Game on 🎮🚀

While Seedance 2.0’s videos are all over the timeline, DeepSeek quietly pushed a new model update in its app.

GLM-5 from Z.ai adds more momentum.

Ming-flash-omni from Ant Group, MiniCPM-SALA from OpenBMB, and the upcoming MiniMax M2.5 keep the heat on 🔥

Spring Festival is around the corner,
no one’s sleeping!

✨ More releases coming, stay tuned
https://huggingface.co/collections/zh-ai-community/2026-february-china-open-source-highlights
AdinaY posted an update about 1 month ago
Ming-flash-omni 2.0 🚀 New open omni-MLLM released by Ant Group

inclusionAI/Ming-flash-omni-2.0

✨ MIT license
✨ MoE - 100B/6B active
✨ Zero-shot voice cloning + controllable audio
✨ Fine-grained visual knowledge grounding
  • 2 replies
AdinaY posted an update about 1 month ago
LLaDA 2.1 is out 🔥 A new series of MoE diffusion language models released by Ant Group

inclusionAI/LLaDA2.1-mini
inclusionAI/LLaDA2.1-flash

✨ LLaDA2.1-mini: 16B - Apache 2.0
✨ LLaDA2.1-flash: 100B - Apache 2.0
✨ Both deliver editable generation, RL-trained diffusion reasoning, and fast inference
  • 2 replies