I have a GTX 1660 Super with 6 GB of VRAM, and it reports not enough memory even for 384x384 px. What is the minimum VRAM needed for the model and for generation? And is it possible to allocate from system RAM if there is not enough VRAM?
Message:
CUDA out of memory. Tried to allocate 324.00 MiB (GPU 0; 6.00 GiB total capacity; 5.17 GiB already allocated; 0 bytes free; 5.28 GiB reserved in total by PyTorch)
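
For anyone hitting the same error, here is a minimal sketch of how to check available VRAM and the usual PyTorch memory-saving knobs. This is general PyTorch, not this repository's API; `model` and `prompt` are hypothetical placeholders. Note that standard CUDA tensors cannot transparently spill into system RAM, so the usual workarounds are lower precision or explicitly offloading parts of the model to the CPU.

```python
import torch

# Report total vs. already-allocated VRAM on the first GPU.
props = torch.cuda.get_device_properties(0)
total_gb = props.total_memory / 1024**3
allocated_gb = torch.cuda.memory_allocated(0) / 1024**3
print(f"GPU 0: {total_gb:.2f} GiB total, {allocated_gb:.2f} GiB allocated")

# Common mitigations when inference runs out of memory
# (model/prompt are placeholders for whatever this project uses):
# model = model.half().cuda()    # fp16 weights roughly halve VRAM use
# with torch.no_grad():          # skip autograd buffers during inference
#     out = model(prompt)
# torch.cuda.empty_cache()       # release cached blocks back to the driver
```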