---
license: apache-2.0
---
# AWS Neuron optimum model cache
This repository contains cached Neuron compilation artifacts for the most popular models on the Hugging Face Hub.
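As a minimal sketch of how such a cache repository is typically consulted, assuming the `optimum-neuron` library and its `CUSTOM_CACHE_REPO` environment variable (the repository id shown is an assumption and should be replaced with this repository's id):

```python
import os

# Assumption: optimum-neuron reads CUSTOM_CACHE_REPO to locate the cache
# repository on the Hugging Face Hub; replace the id with this repository's id.
os.environ["CUSTOM_CACHE_REPO"] = "aws-neuron/optimum-neuron-cache"
```

With this variable set, an export whose configuration matches a cached entry can fetch precompiled artifacts instead of recompiling from scratch.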
## Inference
### LLM models
For a list of the supported models and configurations, please refer to the inference cache configuration files.
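The sketch below illustrates an export that can hit the cache, assuming the `optimum-neuron` `NeuronModelForCausalLM` API; the model id and shape values are illustrative assumptions, not a guaranteed cached configuration:

```python
from optimum.neuron import NeuronModelForCausalLM

# If the (model, batch_size, sequence_length, num_cores, auto_cast_type)
# combination appears in the cache configuration files, the precompiled
# artifacts are reused instead of triggering a full Neuron compilation.
# The values below are assumptions for illustration only.
model = NeuronModelForCausalLM.from_pretrained(
    "meta-llama/Llama-2-7b-chat-hf",
    export=True,
    batch_size=1,
    sequence_length=2048,
    num_cores=2,
    auto_cast_type="fp16",
)
```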