model is too busy
#23
by tangtang1995 - opened
Hi, I'm using the huggingface_hub InferenceClient for inference, but today I keep getting this error:
"Model too busy, unable to get response in less than 120 second(s)"
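Not from the thread, but since "model too busy" errors are usually transient, a common workaround is to wrap the call in a retry loop with exponential backoff. Below is a minimal sketch; the helper name `call_with_retries` and the commented `InferenceClient` usage (model name, prompt) are illustrative assumptions, not the poster's actual code.

```python
import time

def call_with_retries(fn, max_retries=3, base_delay=1.0):
    """Call fn(), retrying with exponential backoff on failure.

    "Model too busy" responses are transient server-side load errors,
    so waiting and retrying often succeeds. The helper name and
    defaults here are illustrative, not part of huggingface_hub.
    """
    for attempt in range(max_retries):
        try:
            return fn()
        except Exception:
            # Give up after the last attempt; otherwise back off and retry.
            if attempt == max_retries - 1:
                raise
            time.sleep(base_delay * 2 ** attempt)

# Hypothetical usage (requires network access and a valid HF token):
# from huggingface_hub import InferenceClient
# client = InferenceClient(model="gpt2")
# text = call_with_retries(lambda: client.text_generation("Hello"))
```

If the model stays busy even with retries, the hosted endpoint may simply be overloaded; a dedicated Inference Endpoint or running the model locally avoids the shared queue entirely.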