A second model merge by chargoddard. A GGML conversion of the previous merge can be found here.
I have no idea what I'm doing, so if something doesn't work as it should, or doesn't work at all, that's likely on me, not the models themselves.
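For reference, a GGML file from this repo should load with llama-cpp-python (an older, GGML-compatible release is assumed; current versions only accept GGUF). The file name and prompt below are placeholders, not actual file names from this repo:

```python
from llama_cpp import Llama

# Placeholder path: substitute whichever quantised GGML file you downloaded.
llm = Llama(model_path="./llama2-22b-blocktriangular.ggmlv3.q4_0.bin", n_ctx=2048)

# Plain completion; this is a base model, so no instruction format is assumed.
output = llm("The llama is a domesticated South American", max_tokens=64)
print(output["choices"][0]["text"])
```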

The description below is copied from the original repo.

Similar to llama2-22b, but with BLOCK_DIAGONAL=false in the merge and twice the fine-tuning tokens.

Again, not intended for direct use - meant as a base for further tuning and merging.
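To illustrate the terminology only: the toy NumPy sketch below shows the difference between a block-diagonal and a block-triangular combination of two square weight matrices. The function name and the random placeholder for the cross block are assumptions for demonstration; this is not the actual merge script, which derives every block from the source models.

```python
import numpy as np

def combine(a: np.ndarray, b: np.ndarray, block_diagonal: bool) -> np.ndarray:
    """Embed two square weight matrices `a` and `b` into one larger matrix.

    block_diagonal=True  -> both off-diagonal blocks stay zero, so the two
                            feature groups never mix.
    block_diagonal=False -> the lower-left block is also populated, giving a
                            block-triangular matrix in which the second group
                            can read from the first.
    """
    n, m = a.shape[0], b.shape[0]
    out = np.zeros((n + m, n + m), dtype=a.dtype)
    out[:n, :n] = a          # first model's weights
    out[n:, n:] = b          # second model's weights
    if not block_diagonal:
        # Placeholder cross-term for illustration only.
        out[n:, :n] = np.random.default_rng(0).normal(0.0, 0.02, (m, n))
    return out

# Tiny demonstration with 2x2 and 3x3 identity matrices.
print(combine(np.eye(2), np.eye(3), block_diagonal=False))
```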
