ONNX Runtime: cross-platform, high-performance ML inferencing
Cross-platform runtime for cloud, mobile, desktop, and IoT apps
Runtime environment based on OpenJDK for running IntelliJ products
NativeScript for Android using v8
WebAssembly Micro Runtime (WAMR)
Tooling for optimized and reproducible GPU-accelerated AI runtime
Extend your preferred base images to be Lambda compatible
Proxy for Lambda’s Runtime and Extensions APIs
A fast and secure runtime for WebAssembly
A cross-platform, extensible version manager with support for Java
Runtime for worker nodes executing modules
dockerd as a compliant Container Runtime Interface for Kubernetes
Extensible WebAssembly runtime for cloud native applications
Super-fast, easy runtime validation and serialization
Runtime type system for IO decoding/encoding
CLI and validation tools for Kubelet Container Runtime Interface (CRI)
Seamlessly extend your preferred base images to be Lambda compatible
A fast, lightweight, fully featured OCI runtime and C library
AWS Lambda Ruby Runtime Interface Client
A secure runtime for JavaScript and TypeScript
The Fast, Free, Modern MATLAB/Octave code runtime
A retargetable MLIR-based machine learning compiler runtime toolkit
A container runtime written in Rust
eBPF-based Security Observability and Runtime Enforcement
Rust async runtime based on io-uring