---
license: gpl-3.0
datasets:
  - nomic-ai/gpt4all_prompt_generations
language:
  - en
---

# gpt4all-lora-epoch-3

This is an intermediate (epoch 3 of 4) checkpoint from nomic-ai/gpt4all-lora.

An autoregressive transformer trained on data curated using Atlas. This model was trained for three epochs, while the related gpt4all-lora model was trained for four. Replication instructions and data: https://github.com/nomic-ai/gpt4all

## Model Details

### Model Description

- **Developed by:** Nomic AI
- **Model type:** A fine-tuned autoregressive language model based on the transformer architecture
- **Language(s):** English
- **License:** GPL-3.0
- **Finetuned from model:** LLaMA

### Model Sources

- **Repository:** https://github.com/nomic-ai/gpt4all
- **Base model repository:** https://github.com/facebookresearch/llama
- **Technical report:** *GPT4All: Training an Assistant-style Chatbot with Large Scale Data Distillation from GPT-3.5-Turbo*