OpenDelta is an open-source toolkit for parameter-efficient fine-tuning of large pre-trained models, a family of methods its authors dub delta tuning. Users flexibly assign (or add) a small number of parameters to update while keeping most of the model frozen. With OpenDelta, users can easily implement prefix-tuning, adapters, LoRA, or other delta tuning methods on their preferred pre-trained models (PTMs).
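The core idea can be sketched in plain PyTorch. The snippet below is an illustrative LoRA-style example, not OpenDelta's actual API: the pre-trained weight is frozen and only a small low-rank delta (matrices `A` and `B`, names chosen here for illustration) is trained.

```python
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    """A frozen linear layer plus a small trainable low-rank delta.

    Illustrative sketch of the delta-tuning idea, not OpenDelta's API:
    only the rank-r matrices A and B receive gradient updates.
    """

    def __init__(self, base: nn.Linear, r: int = 8):
        super().__init__()
        self.base = base
        for p in self.base.parameters():
            p.requires_grad = False  # keep the pre-trained weights frozen
        self.lora_A = nn.Parameter(torch.randn(r, base.in_features) * 0.01)
        self.lora_B = nn.Parameter(torch.zeros(base.out_features, r))

    def forward(self, x):
        # frozen pre-trained path + trainable low-rank delta path
        return self.base(x) + x @ self.lora_A.T @ self.lora_B.T

layer = LoRALinear(nn.Linear(768, 768), r=8)
trainable = sum(p.numel() for p in layer.parameters() if p.requires_grad)
total = sum(p.numel() for p in layer.parameters())
print(trainable, total)
```

Because `lora_B` starts at zero, the delta path initially contributes nothing, so the adapted layer reproduces the pre-trained behavior before training begins.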

Features

  • Supports parameter-efficient tuning for transformer models
  • Works with popular models like BERT, GPT, and T5
  • Open-source with flexible customization for NLP tasks
  • Compatible with Hugging Face Transformers and PyTorch
  • Reduces computational cost and memory footprint for fine-tuning
  • Implements multiple tuning strategies including adapter layers
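To make the memory and parameter savings concrete, here is a minimal bottleneck-adapter sketch (a hypothetical class, not one of OpenDelta's own): a down-projection, nonlinearity, up-projection, and residual connection, with the up-projection zero-initialized so the module starts as an identity.

```python
import torch
import torch.nn as nn

class Adapter(nn.Module):
    """Bottleneck adapter sketch (illustrative, not OpenDelta's own class)."""

    def __init__(self, hidden: int, bottleneck: int = 64):
        super().__init__()
        self.down = nn.Linear(hidden, bottleneck)
        self.up = nn.Linear(bottleneck, hidden)
        nn.init.zeros_(self.up.weight)  # start as identity: output == input at init
        nn.init.zeros_(self.up.bias)

    def forward(self, x):
        # residual connection around the bottleneck
        return x + self.up(torch.relu(self.down(x)))

adapter = Adapter(hidden=768, bottleneck=64)
params = sum(p.numel() for p in adapter.parameters())
print(params)  # roughly 0.1M trainable parameters per adapter,
               # versus ~110M for fully fine-tuning BERT-base
```

Inserting a few such modules per transformer layer and training only them is what keeps the computational cost and memory footprint low.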


License

Apache License V2.0


OpenDelta Web Site


Additional Project Details

Operating Systems

Linux, Mac, Windows

Programming Language

Python

Related Categories

Python Natural Language Processing (NLP) Tool

Registered

2025-01-24