ONNX Runtime: cross-platform, high-performance ML inferencing
Cross-platform runtime for cloud, mobile, desktop, and IoT apps
Runtime environment based on OpenJDK for running IntelliJ products
NativeScript for Android using V8
WebAssembly Micro Runtime (WAMR)
Tooling for optimized and reproducible GPU-accelerated AI runtime
Extend your preferred base images to be Lambda compatible
Proxy for Lambda’s Runtime and Extensions APIs
A cross-platform and extendable version manager with support for Java
dockerd as a compliant Container Runtime Interface for Kubernetes
Extensible WebAssembly runtime for cloud native applications
Seamlessly extend your preferred base images to be Lambda compatible
Linux Runtime Security and Forensics using eBPF
AWS Lambda Ruby Runtime Interface Client
Fast, small, safe, gradually typed embeddable scripting language
TTS with Kokoro and ONNX Runtime
Incredibly fast JavaScript runtime, bundler, and test runner
AI edge infrastructure for macOS. Run local or cloud models
CLI and validation tools for Kubelet Container Runtime Interface (CRI)
Runtime for worker nodes executing modules
Low-code programming for event-driven applications
Runtime type system for IO decoding/encoding
The Fast, Free, Modern MATLAB / Octave code runtime
A secure runtime for JavaScript and TypeScript
A container runtime written in Rust