Fast, flexible, and easy-to-use probabilistic modelling in Python
MoBA: Mixture of Block Attention for Long-Context LLMs
Qwen3.5 is a large language model series developed by the Qwen team
Wan2.2: Open and Advanced Large-Scale Video Generative Model
MiMo-V2-Flash: Efficient Reasoning, Coding, and Agentic Foundation
Running a big model on a small laptop
Open-source, high-performance AI model with advanced reasoning
System-Level Intelligent Router for Mixture-of-Models in the Cloud
A Powerful Native Multimodal Model for Image Generation
Qwen3-Coder is the code version of Qwen3
A Next-Generation Training Engine Built for Ultra-Large MoE Models
Towards self-verifiable mathematical reasoning
From nobody to large language model (LLM) hero
Collection of links to free stock photography, video, and illustration
Clean and efficient FP8 GEMM kernels with fine-grained scaling
Ling-V2 is a MoE LLM provided and open-sourced by InclusionAI
Ring is a reasoning MoE LLM provided and open-sourced by InclusionAI
157 models, 30 providers, one command to find what runs on your hardware
Wan2.1: Open and Advanced Large-Scale Video Generative Model
Kimi K2 is the large language model series developed by Moonshot AI
Powerful AI language model (MoE) optimized for efficiency and performance
Open-weight, large-scale hybrid-attention reasoning model
Moonshot's most powerful AI model
The home of the ICU project source code
Fast, Sharp & Reliable Agentic Intelligence