---
license: mit
datasets:
- EleutherAI/fineweb-edu-dedup-10b
base_model:
- deepseek-ai/DeepSeek-R1-Distill-Qwen-1.5B
---
These SAEs and transcoders can be loaded with the `sae` library: https://github.com/EleutherAI/sae.
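
For example, that library exposes `Sae.load_from_hub` and `Sae.load_many` loaders. The snippet below is a minimal sketch: the repository id and hookpoint name are placeholders, not the actual values for this repo, so substitute the real ones.

```python
from sae import Sae

# Load a single transcoder by hookpoint (both arguments are placeholders).
sae = Sae.load_from_hub("EleutherAI/<this-repo>", hookpoint="<hookpoint-name>")

# Or load every available hookpoint at once, as a dict keyed by hookpoint name.
saes = Sae.load_many("EleutherAI/<this-repo>")
```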

These transcoders were trained on the outputs of the first 15 MLPs in deepseek-ai/DeepSeek-R1-Distill-Qwen-1.5B, using 10 billion tokens from the deduplicated FineWeb-Edu dataset at a context length of 2048. Each transcoder has 65,536 latents and includes a linear skip connection.

The fraction of variance unexplained (FVU) across the trained transcoders ranges from 0.01 to 0.37.
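
FVU is conventionally the variance of the reconstruction residual divided by the variance of the target activations (lower is better). A minimal sketch of that standard definition, not necessarily the exact evaluation code used here:

```python
import torch

def fvu(target: torch.Tensor, reconstruction: torch.Tensor) -> torch.Tensor:
    """Fraction of variance unexplained: residual variance / target variance."""
    residual = target - reconstruction
    return residual.var() / target.var()
```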