---
license: apache-2.0
datasets:
- togethercomputer/RedPajama-Data-1T-Sample
language:
- en
---
# Landmark Attention LLaMA 33B
This model was trained for 200 steps using the PEFT LoRA technique with the [Landmark Attention](https://arxiv.org/abs/2305.16300) method. It will likely be trained further and updated later.
## Usage
Requires `trust_remote_code` to be set to `True`. In [oobabooga](https://github.com/oobabooga/text-generation-webui), you can simply add the `--trust_remote_code` flag.
You will also need to disable the `Add the bos_token to the beginning of prompts` option in the settings.
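If you are loading the model directly with the `transformers` library instead of through the web UI, a minimal sketch might look like the following. The repo id is a placeholder; `trust_remote_code=True` is the documented requirement, and `add_special_tokens=False` mirrors disabling the BOS-token option described above.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "path/to/landmark-llama-33b"  # placeholder: use this repo's id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    trust_remote_code=True,  # required: loads the custom Landmark Attention modeling code
    device_map="auto",
)

# add_special_tokens=False keeps the BOS token out of the prompt,
# matching the web-UI setting described above.
inputs = tokenizer("Once upon a time", return_tensors="pt",
                   add_special_tokens=False).to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```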
## PEFT Checkpoint
You can probably merge the checkpoint with any other LLaMA-based model (provided it's 33B, of course). This repo contains the merged weights, but you can grab the adapter [here](https://anonfiles.com/F3Pb20wbz7).
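If you want to apply the adapter to a different 33B LLaMA base yourself, a rough sketch using the `peft` library is below. Both paths are placeholders, and this assumes a plain LoRA merge; the Landmark training code may also add special tokens to the tokenizer, in which case the base model's embeddings would need resizing before merging.

```python
import torch
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

base_path = "path/to/llama-33b-base"            # placeholder: any 33B LLaMA base
adapter_path = "path/to/landmark-lora-adapter"  # placeholder: the downloaded adapter

base = AutoModelForCausalLM.from_pretrained(base_path, torch_dtype=torch.float16)

# Attach the LoRA adapter, then fold its weights into the base model.
merged = PeftModel.from_pretrained(base, adapter_path).merge_and_unload()

merged.save_pretrained("llama-33b-landmark-merged")
AutoTokenizer.from_pretrained(base_path).save_pretrained("llama-33b-landmark-merged")
```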
## Training Code
You can find the training code [here](https://github.com/eugenepentland/landmark-attention-qlora).