---
license: apache-2.0
---

# FloatLM 3.9B

The good ol' FP16 LLMs with LLaMA architecture.

```python
import torch
import transformers

model_id = "SpectraSuite/FloatLM_3.9B"

# Please adjust the temperature, repetition penalty, top_k, top_p and other sampling parameters according to your needs.
pipeline = transformers.pipeline(
    "text-generation",
    model=model_id,
    model_kwargs={"torch_dtype": torch.float16},
    device_map="auto",
)

# These are base (pretrained) LLMs that are not instruction or chat tuned. You may need to adjust your prompt accordingly.
pipeline("Once upon a time")
```

* License: Apache 2.0
* We will use our GitHub repo for communication (including HF repo related queries). Feel free to open an issue at https://github.com/NolanoOrg/SpectraSuite
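
As noted in the comment in the snippet above, sampling behaviour is controlled by generation kwargs passed at call time. A minimal sketch with illustrative placeholder values (not tuned recommendations for this model):

```python
# Override default sampling parameters per call; values below are illustrative only.
pipeline(
    "Once upon a time",
    do_sample=True,          # enable sampling instead of greedy decoding
    temperature=0.8,         # softens/sharpens the token distribution
    top_k=50,                # restrict sampling to the 50 most likely tokens
    top_p=0.95,              # nucleus sampling threshold
    repetition_penalty=1.1,  # discourage verbatim repetition
    max_new_tokens=128,      # cap on generated length
)
```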