aws-neuron/optimum-neuron-cache
Organization: AWS Inferentia and Trainium
License: apache-2.0
Browsing revision 6e54dde at path:
optimum-neuron-cache/neuronxcc-2.13.66.0+6dfecc895/0_REGISTRY/0.0.23/inference/llama/meta-llama
8 contributors, 30 commits
Latest commit 6b39c02 (verified) by dacorvo (HF staff), 4 months ago:
Delete neuronxcc-2.13.66.0+6dfecc895/0_REGISTRY/0.0.23/inference/llama/meta-llama/Meta-Llama-3-70B/3825d3e7288b5c5f14e2.json
Llama-2-13b-chat-hf/   Synchronizing local compiler cache.   6 months ago
Llama-2-70b-chat-hf/   Synchronizing local compiler cache.   6 months ago
Llama-2-7b-chat-hf/    Synchronizing local compiler cache.   6 months ago
Meta-Llama-3-70B/      Delete neuronxcc-2.13.66.0+6dfecc895/0_REGISTRY/0.0.23/inference/llama/meta-llama/Meta-Llama-3-70B/3825d3e7288b5c5f14e2.json   4 months ago
Meta-Llama-3-8B/       Synchronizing local compiler cache.   5 months ago
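The cache entries above follow a fixed directory layout: compiler version, then the `0_REGISTRY` marker, a registry version, the task, the model architecture, and finally the organization and model name. As a minimal sketch (this `parse_registry_path` helper is illustrative, not part of optimum-neuron), the components can be pulled out of a registry path like the one deleted in the latest commit:

```python
from pathlib import PurePosixPath


def parse_registry_path(path: str) -> dict:
    """Split an 0_REGISTRY cache path into its named components.

    Assumed layout (inferred from the listing above):
    <compiler>/0_REGISTRY/<registry_version>/<task>/<architecture>/<org>/<model>/<entry>.json
    """
    parts = PurePosixPath(path).parts
    return {
        "compiler": parts[0],
        "registry_version": parts[2],
        "task": parts[3],
        "architecture": parts[4],
        "organization": parts[5],
        "model": parts[6],
    }


info = parse_registry_path(
    "neuronxcc-2.13.66.0+6dfecc895/0_REGISTRY/0.0.23/inference/"
    "llama/meta-llama/Meta-Llama-3-70B/3825d3e7288b5c5f14e2.json"
)
print(info["compiler"])  # neuronxcc-2.13.66.0+6dfecc895
print(info["model"])     # Meta-Llama-3-70B
```

Because the compiler version is the first path component, cached artifacts compiled with different neuronx-cc releases never collide.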