Update README.md #87, opened by FelixMildon
The link in the section below is broken: an extra "(" parenthesis in README.md breaks the Markdown link to https://huggingface.co./blog/falcon.
"🔥 Falcon LLMs require PyTorch 2.0 for use with transformers!
For fast inference with Falcon, check out Text Generation Inference! Read more in this blogpost.
You will need at least 85-100GB of memory to swiftly run inference with Falcon-40B.
Model Card for Falcon-40B"
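A minimal sketch of the fix (the exact README wording around the link is an assumption; only the target URL is from the report). A doubled opening parenthesis keeps Markdown from closing the link destination, so the link renders as literal text:

```markdown
<!-- Broken: extra "(" before the URL, link does not render -->
Read more in [this blogpost]((https://huggingface.co./blog/falcon).

<!-- Fixed: single "(" so the link renders correctly -->
Read more in [this blogpost](https://huggingface.co./blog/falcon).
```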