Article: Crazy Challenge: Run Llama 405B on an 8GB VRAM GPU, by lyogavin (Aug 2, 2024)
Article: Run the strongest open-source LLM model: Llama3 70B with just a single 4GB GPU!, by lyogavin (Apr 21, 2024)