Run models such as Kimi-K2.5, GLM-5, DeepSeek, gpt-oss, Gemma, Qwen, and more
Port of Facebook's LLaMA model in C/C++
Low-latency AI inference engine optimized for mobile devices
AI video generator optimized for low VRAM and older GPUs
Pure C inference for the Flux 2 image generation model
Run LLaMA and other large language models offline on iOS and macOS
Leading open-source visualization and observability platform
Real-time event frameworks based on active objects and state machines
IEC 104 RTU server/client simulator source code library for Windows and Linux
IEC 101 server and client simulator source code library for Windows and Linux
DNP3 protocol source code library for Linux on x86, x64, ARM, and PowerPC
IEC 104 source code library for Linux (C/C++, POSIX, ARM)
IEC 101 source code library for embedded Linux on ARM and POSIX x86/x86-64 (C/C++)
IEC 104 source code library for Windows, Linux, QNX, real-time OSes, and ARM
IEC 104 source code library for Windows (C, C++, C#/.NET)
IEC 60870-5-101 source code library stack for Windows (C, C++, C#/.NET)
AI macOS app for real-time coding interview coaching assistance
Locally run an Instruction-Tuned Chat-Style LLM
A collection of practical tips, found at the bottom of this page
A cross-platform, open-source, pure C game engine for mobile games
mujoco-py allows using MuJoCo from Python 3
Object-oriented tools for C
Retro Games in Gym