2 projects for "python for windows" with 2 filters applied:

  • DeepSeek-V3

    Powerful AI language model (MoE) optimized for efficiency and performance

    DeepSeek-V3 is a robust Mixture-of-Experts (MoE) language model developed by DeepSeek, featuring a total of 671 billion parameters, with 37 billion activated per token. It employs Multi-head Latent Attention (MLA) and the DeepSeekMoE architecture to enhance computational efficiency. The model introduces an auxiliary-loss-free load balancing strategy and a multi-token prediction training objective to boost performance. Trained on 14.8 trillion diverse, high-quality tokens, DeepSeek-V3...
    Downloads: 158 This Week
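
The parameter split described above (37 billion of 671 billion parameters active per token, roughly 5.5%) comes from top-k expert routing: a small gating network selects a few expert sub-networks for each token, so most of the model's weights sit idle on any given forward pass. As a toy illustration of that routing idea, not DeepSeek's actual implementation, here is a minimal sketch (all module names and dimensions are illustrative):

```python
import torch
import torch.nn as nn

class ToyMoELayer(nn.Module):
    """Toy top-k Mixture-of-Experts layer (illustrative only).

    Each token is routed to k of n experts, so only a fraction of the
    layer's parameters is active per token -- the idea behind
    DeepSeek-V3's 37B-active / 671B-total split.
    """

    def __init__(self, d_model=64, n_experts=8, k=2):
        super().__init__()
        self.k = k
        self.gate = nn.Linear(d_model, n_experts)   # router scores
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, 4 * d_model),
                          nn.GELU(),
                          nn.Linear(4 * d_model, d_model))
            for _ in range(n_experts)
        )

    def forward(self, x):                            # x: (tokens, d_model)
        scores = self.gate(x)                        # (tokens, n_experts)
        weights, idx = scores.topk(self.k, dim=-1)   # k experts per token
        weights = weights.softmax(dim=-1)            # normalize top-k scores
        out = torch.zeros_like(x)
        for slot in range(self.k):
            for e in range(len(self.experts)):
                mask = idx[:, slot] == e             # tokens routed to expert e
                if mask.any():
                    out[mask] += weights[mask, slot].unsqueeze(-1) * self.experts[e](x[mask])
        return out

tokens = torch.randn(5, 64)
print(ToyMoELayer()(tokens).shape)                   # torch.Size([5, 64])
```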
  • DeepSeek R1

    Open-source, high-performance AI model with advanced reasoning

    DeepSeek-R1 is an open-source large language model developed by DeepSeek, designed to excel in complex reasoning tasks across domains such as mathematics, coding, and language. DeepSeek R1 offers unrestricted access for both commercial and academic use. The model employs a Mixture of Experts (MoE) architecture, comprising 671 billion total parameters with 37 billion active parameters per token, and supports a context length of up to 128,000 tokens. DeepSeek-R1's training regimen uniquely...
    Downloads: 73 This Week
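
Both models are distributed as open weights for commercial and academic use. A minimal loading sketch with the Hugging Face transformers library, assuming the checkpoint is published under the deepseek-ai/DeepSeek-R1 repo id (an assumption here); note that the full 671B-parameter model far exceeds a single workstation, so this only shows the API shape:

```python
# Sketch of loading an open-weight model via Hugging Face transformers.
# The repo id and required hardware are assumptions, not verified here.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "deepseek-ai/DeepSeek-R1"  # assumed Hugging Face repo id

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    trust_remote_code=True,  # checkpoints may ship custom model code
    device_map="auto",       # shard across available devices (needs accelerate)
    torch_dtype="auto",
)

prompt = "Prove that the sum of two even integers is even."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```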