---
license: apache-2.0
base_model: saraprice/llama2-7B-backdoor-DEPLOYMENT
---

This repository contains sparse autoencoder (SAE) layers trained on layers 10, 14, and 15 of [saraprice/llama2-7B-backdoor-DEPLOYMENT](https://huggingface.co./saraprice/llama2-7B-backdoor-DEPLOYMENT). The SAEs were trained with the [EleutherAI SAE library](https://github.com/EleutherAI/sae/tree/main).