Package and deploy machine learning models using Docker containers
Chat experience in your terminal
Secure local-first microVM sandbox for running untrusted code fast
Fast, flexible LLM inference
Fast, small, and fully autonomous AI assistant infrastructure
Distributed DataFrame for Python designed for the cloud
High-performance Twitch bot in Rust
Generate music based on natural language prompts using LLMs
A high-performance inference engine for AI models
A minimal, secure Python interpreter written in Rust for use by AI
Open-source load balancer and serving platform for hosting LLMs
LLMs done the UNIX-y way
A reactive runtime for building durable AI agents
Rust crate for building chains of prompts for large language models
ModelFox makes it easy to train, deploy, and monitor ML models