Update README.md

#87
by FelixMildon - opened

The link is not working because an extra "(" parenthesis is breaking the markdown in README.md. The affected link points to https://huggingface.co./blog/falcon and appears in the section below:

"πŸ’₯ Falcon LLMs require PyTorch 2.0 for use with transformers!

For fast inference with Falcon, check-out Text Generation Inference! Read more in this blogpost.

You will need at least 85-100GB of memory to swiftly run inference with Falcon-40B.

Model Card for Falcon-40B"
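For context, a markdown link with a doubled opening parenthesis fails to render as a link. A minimal sketch of the likely fix (the exact broken line in README.md is an assumption, not quoted from the diff):

```markdown
<!-- broken: extra "(" before the URL keeps the link from rendering -->
Read more in [this blogpost]((https://huggingface.co./blog/falcon).

<!-- fixed: one "(" after the closing "]" -->
Read more in [this blogpost](https://huggingface.co./blog/falcon).
```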

