CPU inference
#1 by KeilahElla - opened
Is it possible to run AuraSR on the CPU? The safetensors checkpoint is less than 3 GB, so it should fit in memory on most laptops. When I try to run the example Python code, I get "Torch not compiled with CUDA enabled" errors.
In the main (and only) module you get after pip install aura-sr, add a device parameter to the constructor, e.g. device: str = "cpu" ("mps" also works on Apple silicon). In addition, the line:
checkpoint = torch.load(hf_model_path / "model.ckpt")
should be changed to:
checkpoint = torch.load(hf_model_path / "model.ckpt", map_location=torch.device("cpu"))
The patched class then looks like this (device is also threaded through from_pretrained so you can pick the target device at load time):

class AuraSR:
    def __init__(self, config: dict[str, Any], device: str = "cpu"):
        self.upsampler = UnetUpsampler(**config).to(device)
        self.input_image_size = config["input_image_size"]

    @classmethod
    def from_pretrained(cls, model_id: str = "fal-ai/AuraSR", device: str = "cpu"):
        import json
        import torch
        from pathlib import Path
        from huggingface_hub import snapshot_download

        hf_model_path = Path(snapshot_download(model_id))
        config = json.loads((hf_model_path / "config.json").read_text())
        model = cls(config, device)
        # map_location remaps tensors saved from a CUDA device onto the CPU,
        # so loading works on machines without CUDA support.
        checkpoint = torch.load(hf_model_path / "model.ckpt", map_location=torch.device("cpu"))
        model.upsampler.load_state_dict(checkpoint, strict=True)
        return model
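To see what the map_location argument does in isolation, here is a minimal standalone sketch (independent of AuraSR; the file name is made up for the demo):

```python
import os
import tempfile

import torch

# Save a small state dict, then load it back forced onto the CPU.
# map_location remaps any storages that were saved from a CUDA device,
# which is what avoids the "Torch not compiled with CUDA enabled" error
# on CPU-only machines.
path = os.path.join(tempfile.mkdtemp(), "tiny.ckpt")
torch.save({"weight": torch.ones(2, 2)}, path)

checkpoint = torch.load(path, map_location=torch.device("cpu"))
print(checkpoint["weight"].device)  # cpu
```

The same pattern works for "mps": pass map_location=torch.device("mps") (or load on CPU and call .to("mps") afterwards).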