---
license: apache-2.0
base_model:
- wzhouad/zephyr-7B-WPO-FP
- HuggingFaceH4/mistral-7b-sft-beta
tags:
- wpo
- mistral
- alignment
datasets:
- HuggingFaceH4/ultrafeedback_binarized
pipeline_tag: text-generation
library_name: transformers
---
This model follows [wzhouad/zephyr-7B-WPO-FP](https://huggingface.co./wzhouad/zephyr-7B-WPO-FP), with the original weights converted from `float32` to `bfloat16`.
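
A minimal sketch of how such a dtype conversion can be done with `transformers`; the output directory name is illustrative, not the actual path used for this repository.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the original float32 checkpoint and cast the weights to bfloat16 on load.
model = AutoModelForCausalLM.from_pretrained(
    "wzhouad/zephyr-7B-WPO-FP",
    torch_dtype=torch.bfloat16,
)
tokenizer = AutoTokenizer.from_pretrained("wzhouad/zephyr-7B-WPO-FP")

# Save the bfloat16 copy locally (directory name is an example).
model.save_pretrained("zephyr-7B-WPO-FP-bf16")
tokenizer.save_pretrained("zephyr-7B-WPO-FP-bf16")
```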