---
title: README
emoji: π
colorFrom: yellow
colorTo: indigo
sdk: static
pinned: false
---
# Welcome to the Lots-of-LoRAs Collection!

## Introduction
We are proud to host the world's largest open collection of LoRAs (Low-Rank Adaptation adapters). This initiative is part of a broader effort to advance research and development in LoRA and PEFT (Parameter-Efficient Fine-Tuning).
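For readers new to the technique: LoRA freezes a pretrained weight matrix and learns a small low-rank update on top of it, which is why each adapter in the collection is tiny compared to the base model. The following is a minimal NumPy sketch of that idea; the dimensions and scaling factor are illustrative assumptions, not tied to any specific model in this collection.

```python
import numpy as np

# Hypothetical sizes: a frozen weight W (d_out x d_in) plus two small
# trainable factors B (d_out x r) and A (r x d_in), with r << min(d_out, d_in).
rng = np.random.default_rng(0)
d_out, d_in, r, alpha = 64, 128, 8, 16

W = rng.standard_normal((d_out, d_in))   # frozen pretrained weight
A = rng.standard_normal((r, d_in))       # trainable down-projection
B = np.zeros((d_out, r))                 # trainable up-projection (zero init)

# Adapted weight: W' = W + (alpha / r) * B @ A
delta = (alpha / r) * (B @ A)
W_adapted = W + delta

# Only A and B are stored per adapter, so a LoRA holds
# d_out*r + r*d_in parameters instead of d_out*d_in.
full_params = d_out * d_in       # 8192
lora_params = d_out * r + r * d_in  # 1536
print(full_params, lora_params)
```

Here the adapter needs under a fifth of the full matrix's parameters, and the saving grows with model size, which is what makes hosting hundreds of adapters for shared base models practical.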
## Project Description

Our collection currently includes over 500 LoRAs, making it the most extensive open repository of its kind. This project aims to support and drive forward research by providing a comprehensive dataset of LoRAs for analysis, experimentation, and application development.

By making these resources available to the community, we hope to encourage collaboration and innovation, helping to push the boundaries of what's possible in PEFT.

We also include the exact data that these LoRAs were trained on, with the corresponding training, validation, and test splits.

Details can be found in the paper: https://www.arxiv.org/abs/2407.00066

## How to Contribute

We are always looking for new contributions and collaborators:
- **Contribute Data**: If you have LoRAs that you would like to share, please feel free to submit them to our collection.
## Contact Us

Interested in contributing or have questions? Reach out to us at [email](mailto:[email protected]).

## Thank You!

We appreciate your interest in our project and hope you will join us in this exciting venture.