Showing 115 open source projects for "transformers"

  • 1
    Transformers

    State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX

    Hugging Face Transformers provides APIs and tools to easily download and train state-of-the-art pre-trained models. Using pre-trained models can reduce your compute costs and carbon footprint, and save you the time and resources required to train a model from scratch. These models support common tasks in different modalities; in text, they cover tasks like text classification, information extraction, question answering, summarization, translation, and text generation in over 100 languages.
    Downloads: 23 This Week
    See Project
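    A minimal sketch of the pipeline API; the task and input are illustrative, and the default checkpoint is downloaded on first use:

        # pip install transformers
        from transformers import pipeline

        # The pipeline picks a default pre-trained model for the task
        classifier = pipeline("sentiment-analysis")
        print(classifier("Pre-trained models save compute and training time."))
        # e.g. [{'label': 'POSITIVE', 'score': 0.99}]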
  • 2
    spacy-transformers

    Use pretrained transformers like BERT, XLNet and GPT-2 in spaCy

    ...These techniques can be used to import knowledge from raw text into your pipeline, so that your models are able to generalize better from your annotated examples. You can convert word vectors from popular tools like FastText and Gensim, or you can load in any pre-trained transformer model if you install spacy-transformers. You can also do your own language model pretraining via the spacy pretrain command. You can even share your transformer or another contextual embedding model across multiple components, which can make long pipelines several times more efficient. To use transfer learning, you’ll need at least a few annotated examples for what you’re trying to predict.
    Downloads: 24 This Week
    See Project
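    A minimal sketch, assuming the en_core_web_trf transformer pipeline has already been downloaded (python -m spacy download en_core_web_trf):

        # pip install spacy spacy-transformers
        import spacy

        nlp = spacy.load("en_core_web_trf")  # transformer-backed spaCy pipeline
        doc = nlp("spaCy can use BERT-style models for tagging and parsing.")
        print([(token.text, token.pos_) for token in doc])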
  • 3
    Curated Transformers

    PyTorch library of curated Transformer models and their components

    State-of-the-art transformers, brick by brick. Curated Transformers is a transformer library for PyTorch. It provides state-of-the-art models that are composed of a set of reusable components. Supports state-of-the-art transformer models, including LLMs such as Falcon, Llama, and Dolly v2. Implementing a feature or bugfix benefits all models. For example, all models support 4/8-bit inference through the bitsandbytes library and each model can use the PyTorch meta device to avoid unnecessary allocations and initialization.
    Downloads: 8 This Week
    See Project
  • 4
    x-transformers

    A simple but complete full-attention transformer

    ...Proposes adding learned memory key/values prior to attending. They were able to remove feedforwards altogether and attain similar performance to the original transformers. I have found that keeping the feedforwards and adding the memory key/values leads to even better performance. Also proposes adding learned tokens, akin to CLS tokens, named memory tokens, that are passed through the attention layers alongside the input tokens. You can also use the l2-normalized embeddings proposed as part of fixnorm. ...
    Downloads: 6 This Week
    See Project
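    A minimal decoder sketch following the TransformerWrapper API from the project README; the memory-token count is illustrative:

        # pip install x-transformers
        import torch
        from x_transformers import TransformerWrapper, Decoder

        model = TransformerWrapper(
            num_tokens=20000,
            max_seq_len=1024,
            num_memory_tokens=20,  # the learned memory tokens described above
            attn_layers=Decoder(dim=512, depth=6, heads=8),
        )
        tokens = torch.randint(0, 20000, (1, 1024))
        logits = model(tokens)  # shape: (1, 1024, 20000)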
  • 5
    Intel Extension for Transformers

    Build your chatbot within minutes on your favorite device

    Intel Extension for Transformers is an innovative toolkit designed to accelerate Transformer-based models on Intel platforms, including CPUs and GPUs. It offers state-of-the-art compression techniques for Large Language Models (LLMs) and provides tools to build chatbots within minutes on various devices. The extension aims to optimize the performance of Transformer-based models, making them more efficient and accessible.
    Downloads: 0 This Week
    See Project
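    A hedged sketch of the weight-only quantized loading path shown in the project README; the model ID is illustrative:

        # pip install intel-extension-for-transformers
        from transformers import AutoTokenizer
        from intel_extension_for_transformers.transformers import AutoModelForCausalLM

        model_name = "Intel/neural-chat-7b-v3-1"  # illustrative checkpoint
        tokenizer = AutoTokenizer.from_pretrained(model_name)
        # load_in_4bit enables 4-bit weight-only quantization for CPU inference
        model = AutoModelForCausalLM.from_pretrained(model_name, load_in_4bit=True)
        inputs = tokenizer("Build a chatbot in minutes:", return_tensors="pt").input_ids
        print(tokenizer.decode(model.generate(inputs, max_new_tokens=30)[0]))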
  • 6
    SetFit

    Efficient few-shot learning with Sentence Transformers

    SetFit is an efficient and prompt-free framework for few-shot fine-tuning of Sentence Transformers. It achieves high accuracy with little labeled data - for instance, with only 8 labeled examples per class on the Customer Reviews sentiment dataset, SetFit is competitive with fine-tuning RoBERTa Large on the full training set of 3k examples.
    Downloads: 8 This Week
    See Project
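    A minimal few-shot sketch, assuming SetFit's v1 Trainer API and a toy in-memory dataset:

        # pip install setfit
        from datasets import Dataset
        from setfit import SetFitModel, Trainer

        train_ds = Dataset.from_dict({
            "text": ["great product", "terrible support", "love it", "waste of money"],
            "label": [1, 0, 1, 0],
        })
        model = SetFitModel.from_pretrained("sentence-transformers/paraphrase-mpnet-base-v2")
        trainer = Trainer(model=model, train_dataset=train_ds)
        trainer.train()  # contrastive fine-tuning of the sentence embeddings + head
        print(model.predict(["absolutely fantastic"]))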
  • 7
    SageMaker Hugging Face Inference Toolkit

    Library for serving Transformers models on Amazon SageMaker

    SageMaker Hugging Face Inference Toolkit is an open-source library for serving Transformers models on Amazon SageMaker. This library provides default pre-processing, prediction, and post-processing for certain Transformers models and tasks. It utilizes the SageMaker Inference Toolkit for starting up the model server, which is responsible for handling inference requests. For the Dockerfiles used for building SageMaker Hugging Face Containers, see AWS Deep Learning Containers. ...
    Downloads: 2 This Week
    See Project
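    A hedged deployment sketch using the SageMaker Python SDK; the IAM role, framework versions, and model ID are illustrative:

        # pip install sagemaker
        from sagemaker.huggingface import HuggingFaceModel

        hub = {
            "HF_MODEL_ID": "distilbert-base-uncased-finetuned-sst-2-english",  # illustrative
            "HF_TASK": "text-classification",
        }
        model = HuggingFaceModel(
            env=hub,
            role="arn:aws:iam::123456789012:role/SageMakerRole",  # illustrative role ARN
            transformers_version="4.26",  # illustrative version pins
            pytorch_version="1.13",
            py_version="py39",
        )
        predictor = model.deploy(initial_instance_count=1, instance_type="ml.m5.xlarge")
        print(predictor.predict({"inputs": "SageMaker makes serving simple."}))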
  • 8
    Adapters

    A Unified Library for Parameter-Efficient Learning

    Adapters is an add-on library to Hugging Face Transformers that integrates 10+ adapter methods into 20+ state-of-the-art Transformer models with minimal coding overhead for training and inference. It provides a unified interface for efficient fine-tuning and modular transfer learning, supporting features such as full-precision or quantized training (e.g. Q-LoRA, Q-Bottleneck Adapters, or Q-PrefixTuning), adapter merging via task arithmetic, and the composition of multiple adapters via composition blocks, enabling advanced research in parameter-efficient transfer learning for NLP tasks.
    Downloads: 0 This Week
    See Project
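    A minimal sketch of adding and training one adapter, using the AutoAdapterModel interface from the library's docs; the adapter name is illustrative:

        # pip install adapters
        from adapters import AutoAdapterModel

        model = AutoAdapterModel.from_pretrained("bert-base-uncased")
        model.add_adapter("sst2_adapter")            # new, randomly initialized adapter
        model.add_classification_head("sst2_adapter", num_labels=2)
        model.train_adapter("sst2_adapter")          # freeze the backbone, train only the adapter
        # ...then train with a standard Transformers/Adapters training loop...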
  • 9
    bert4torch

    An elegant PyTorch implementation of transformers

    An elegant PyTorch implementation of transformers.
    Downloads: 0 This Week
    See Project
  • 10
    ktrain

    ktrain is a Python library that makes deep learning and AI more accessible

    ...Inspired by ML framework extensions like fastai and ludwig, ktrain is designed to make deep learning and AI more accessible and easier to apply for both newcomers and experienced practitioners. With only a few lines of code, ktrain allows you to easily and quickly build, train, inspect, and apply models. ktrain purposely pins to a lower version of transformers to include support for older versions of TensorFlow. If you need a newer version of transformers, it is usually safe for you to upgrade transformers, as long as you do it after installing ktrain. As of v0.30.x, TensorFlow installation is optional and only required if training neural networks.
    Downloads: 5 This Week
    See Project
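    A hedged text-classification sketch following the ktrain tutorials; the toy data and keyword arguments are illustrative:

        # pip install ktrain
        import ktrain
        from ktrain import text

        trn, val, preproc = text.texts_from_array(
            x_train=["good", "great", "bad", "awful"], y_train=[1, 1, 0, 0],
            x_test=["fine"], y_test=[1],
            class_names=["neg", "pos"], preprocess_mode="bert", maxlen=64,
        )
        model = text.text_classifier("bert", train_data=trn, preproc=preproc)
        learner = ktrain.get_learner(model, train_data=trn, val_data=val, batch_size=2)
        learner.fit_onecycle(2e-5, 1)  # one-cycle LR schedule for one epoch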
  • 11
    BERTopic

    Leveraging BERT and c-TF-IDF to create easily interpretable topics

    ...Instead, we can visualize the topics that were generated in a way very similar to LDAvis. By default, the main steps for topic modeling with BERTopic are sentence-transformers, UMAP, HDBSCAN, and c-TF-IDF run in sequence.
    Downloads: 4 This Week
    See Project
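    A minimal sketch of the default pipeline (sentence-transformers, UMAP, HDBSCAN, and c-TF-IDF in sequence); the corpus is illustrative:

        # pip install bertopic
        from bertopic import BERTopic
        from sklearn.datasets import fetch_20newsgroups

        docs = fetch_20newsgroups(subset="train", remove=("headers", "footers", "quotes")).data
        topic_model = BERTopic()
        topics, probs = topic_model.fit_transform(docs)
        print(topic_model.get_topic_info().head())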
  • 12
    GeoAI

    GeoAI: Artificial Intelligence for Geospatial Data

    GeoAI is a comprehensive open-source Python package designed to integrate artificial intelligence techniques with geospatial data analysis, enabling users to perform advanced geographic modeling and visualization tasks with ease. It provides a unified framework that combines machine learning libraries such as PyTorch and Transformers with geospatial tools, allowing users to process satellite imagery, aerial photos, and vector datasets in a streamlined workflow. The platform supports a wide range of tasks including image classification, object detection, segmentation, and change detection, making it suitable for applications in environmental monitoring, urban planning, and disaster response. ...
    Downloads: 11 This Week
    See Project
  • 13
    DINOv3

    Reference PyTorch implementation and models for DINOv3

    ...DINOv3 removes the need for complex augmentations or momentum encoders, streamlining the pipeline while maintaining or improving feature quality. The model supports multiple backbone architectures, including Vision Transformers (ViT), and can handle larger image resolutions with improved stability during training. The learned embeddings generalize robustly across tasks like classification, retrieval, and segmentation without fine-tuning, showing state-of-the-art transfer performance among self-supervised models.
    Downloads: 15 This Week
    See Project
  • 14
    GeneralAI

    Large-scale Self-supervised Pre-training Across Tasks, Languages, etc.

    Fundamental research to develop new architectures for foundation models and AI, focusing on modeling generality and capability, as well as training stability and efficiency.
    Downloads: 0 This Week
    See Project
  • 15
    DeiT (Data-efficient Image Transformers)
    DeiT (Data-efficient Image Transformers) shows that Vision Transformers can be trained competitively on ImageNet-1k without external data by using strong training recipes and knowledge distillation. Its key idea is a specialized distillation strategy—including a learnable “distillation token”—that lets a transformer learn effectively from a CNN or transformer teacher on modest-scale datasets.
    Downloads: 0 This Week
    See Project
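    A hedged sketch loading a pre-trained DeiT backbone via torch.hub, following the pattern in the DeiT README; the image path is illustrative:

        import torch
        from PIL import Image
        from torchvision import transforms

        model = torch.hub.load("facebookresearch/deit:main",
                               "deit_base_patch16_224", pretrained=True)
        model.eval()

        preprocess = transforms.Compose([
            transforms.Resize(256), transforms.CenterCrop(224),
            transforms.ToTensor(),
            transforms.Normalize((0.485, 0.456, 0.406), (0.229, 0.224, 0.225)),
        ])
        img = preprocess(Image.open("cat.jpg")).unsqueeze(0)  # illustrative image
        with torch.no_grad():
            print(model(img).argmax(dim=1))  # ImageNet-1k class index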
  • 16
    SwanLab

    An open-source, modern-design AI training tracking and visualization tool

    ...SwanLab supports both cloud and self-hosted deployments, allowing organizations to run the system privately or integrate it into shared development environments. The platform integrates with a wide range of machine learning frameworks including PyTorch, Transformers, Keras, and other widely used training ecosystems.
    Downloads: 7 This Week
    See Project
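    A hedged logging sketch, assuming SwanLab's init/log experiment-tracking API; the project name and metric are illustrative:

        # pip install swanlab
        import swanlab

        swanlab.init(project="demo-run", config={"lr": 1e-3})  # illustrative project name
        for step in range(10):
            swanlab.log({"loss": 1.0 / (step + 1)})  # metrics appear in the dashboard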
  • 17
    DFlash

    Block Diffusion for Ultra-Fast Speculative Decoding

    ...The project includes support for multiple draft models, example integration code, and scripts to benchmark performance, and it is structured to work with popular model serving stacks like SGLang and the Hugging Face Transformers ecosystem.
    Downloads: 4 This Week
    See Project
  • 18
    Torch Pruning

    DepGraph: Towards Any Structural Pruning

    ...It introduces a graph-based algorithm called DepGraph that automatically identifies dependencies between layers, allowing parameters to be pruned safely across complex architectures. This dependency analysis makes it possible to prune large networks such as transformers, convolutional networks, and diffusion models without breaking the computational graph. Torch-Pruning physically removes parameters rather than masking them, which results in smaller and faster models during both training and inference. The toolkit supports a wide variety of architectures used in computer vision and large language models, making it a flexible solution for model compression tasks.
    Downloads: 6 This Week
    See Project
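    A minimal structured-pruning sketch with the high-level pruner API from the README; parameter names follow recent releases, and the 50% ratio is illustrative:

        # pip install torch-pruning
        import torch
        import torch_pruning as tp
        from torchvision.models import resnet18

        model = resnet18(weights=None)
        example_inputs = torch.randn(1, 3, 224, 224)

        imp = tp.importance.MagnitudeImportance(p=2)  # L2-norm channel importance
        pruner = tp.pruner.MagnitudePruner(
            model, example_inputs, importance=imp,
            pruning_ratio=0.5,              # illustrative global channel ratio
            ignored_layers=[model.fc],      # keep the classifier head intact
        )
        pruner.step()  # DepGraph resolves dependencies and physically removes channels
        print(model(example_inputs).shape)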
  • 19
    nanoGPT

    The simplest, fastest repository for training/finetuning medium-sized GPTs

    ...While simple, it can still train non-trivial models on modern GPUs and generate coherent text. The project has become widely used in tutorials, courses, and experiments for people learning how transformers work under the hood.
    Downloads: 4 This Week
    See Project
  • 20
    The SpeechBrain Toolkit

    A PyTorch-based Speech Toolkit

    ...Competitive or state-of-the-art performance is obtained in various domains. SpeechBrain supports state-of-the-art methods for end-to-end speech recognition, including models based on CTC, CTC+attention, transducers, transformers, and neural language models relying on recurrent neural networks and transformers. Speaker recognition is already deployed in a wide variety of realistic applications. SpeechBrain provides different models for speaker recognition, including X-vector, ECAPA-TDNN, PLDA, and contrastive learning. Spectral masking, spectral mapping, and time-domain enhancement are different methods already available within SpeechBrain. ...
    Downloads: 6 This Week
    See Project
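    A hedged transcription sketch with a pre-trained model from the SpeechBrain hub; in SpeechBrain 1.0 the import moved to speechbrain.inference, and the audio path is illustrative:

        # pip install speechbrain
        from speechbrain.pretrained import EncoderDecoderASR

        asr = EncoderDecoderASR.from_hparams(
            source="speechbrain/asr-crdnn-rnnlm-librispeech",
            savedir="pretrained_asr",
        )
        print(asr.transcribe_file("example.wav"))  # illustrative audio file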
  • 21
    Ling

    Ling is a MoE LLM provided and open-sourced by InclusionAI

    ...As more developers and researchers engage with the platform, we can expect rapid advancements and improvements, leading to even more sophisticated applications. The project also provides model inference and API code (e.g. integration with Transformers). This collaborative approach accelerates development and ensures that the models remain at the forefront of technology, addressing emerging challenges in various fields.
    Downloads: 1 This Week
    See Project
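    A hedged loading sketch via Hugging Face Transformers; the checkpoint name is hypothetical and should be replaced with a real Ling release:

        from transformers import AutoModelForCausalLM, AutoTokenizer

        model_id = "inclusionAI/Ling-lite"  # hypothetical checkpoint name
        tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
        model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)
        inputs = tokenizer("Mixture-of-experts models route tokens to", return_tensors="pt")
        print(tokenizer.decode(model.generate(**inputs, max_new_tokens=20)[0]))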
  • 22
    InstantCharacter

    Personalize Any Characters with a Scalable Diffusion Transformer

    ...It works by adapting a base image generation model with a lightweight adapter so that you can produce character-preserving generations in various downstream tasks (e.g. changing pose, clothing, scene) without needing full model fine-tuning. It works with the Hugging Face Transformers and Diffusers ecosystems.
    Downloads: 1 This Week
    See Project
  • 23
    LLaMA-Factory

    Unified Efficient Fine-Tuning of 100+ LLMs & VLMs (ACL 2024)

    LLaMA-Factory is a fine-tuning and training framework for Meta's LLaMA language models. It enables researchers and developers to train and customize LLaMA models efficiently using advanced optimization techniques.
    Downloads: 13 This Week
    See Project
  • 24
    Axolotl

    Go ahead and axolotl questions

    Axolotl is a powerful and flexible framework for fine-tuning large language models on custom datasets. Built for researchers and developers, Axolotl simplifies the process of adapting LLMs for specific tasks, including chat, code generation, and instruction following. It supports a wide variety of model architectures and offers out-of-the-box optimization strategies for efficient training.
    Downloads: 11 This Week
    See Project
  • 25
    LLaMA Efficient Tuning

    Easy-to-use LLM fine-tuning framework (LLaMA-2, BLOOM, Falcon, Baichuan, Qwen, ChatGLM2)

    Easy-to-use LLM fine-tuning framework (LLaMA-2, BLOOM, Falcon, Baichuan, Qwen, ChatGLM2)
    Downloads: 7 This Week
    See Project