They’re talking about VRAM in a GPU, not motherboard RAM.
No, it’s both; offloading to system RAM is normal for regular users with consumer-level hardware.
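To make that concrete, here's a minimal sketch of partial offload using llama-cpp-python (assuming it's installed and you have a local GGUF model; the file path and layer count below are placeholders). Some transformer layers get offloaded to the GPU's VRAM and the rest stay in system RAM, which is the usual setup when a model is bigger than a consumer card's VRAM.

```python
# Minimal sketch: partial GPU offload with llama-cpp-python.
# Assumes a local GGUF model file; path and layer split are placeholders.
from llama_cpp import Llama

llm = Llama(
    model_path="models/example-13b.Q4_K_M.gguf",  # hypothetical local file
    n_gpu_layers=20,   # layers offloaded to VRAM; the remaining layers live in system RAM
    n_ctx=4096,
)

out = llm("Q: Why is VRAM the bottleneck for local LLMs? A:", max_tokens=64)
print(out["choices"][0]["text"])
```

Setting `n_gpu_layers` to -1 offloads everything, which only works if the whole model fits in VRAM.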
Mac has unified RAM. It can use the system RAM as VRAM. The AI line of AMD processors can kind of do that too. Granted, these aren’t as fast as dedicated GPUs, but they’re the most affordable way to get huge amounts of VRAM.
Are there any Apple chips that use unified memory that’s external? I’m like 99% sure chips with unified memory have it integrated on-package with no option for expansion.
I was talking about a classic Mac Pro. It’s got 2x Xeon processors and a standard GPU.
All of them?
The meme differentiates between a shortage of GPUs and a shortage of memory, so I thought it was about motherboard RAM, but I get that the comment I replied to mentioned VRAM.
VRAM in the context of an iGPU like an Apple chip or Strix Halo is the same thing as system RAM; it’s shared memory.
It’s why Strix Halo and Apple M4 chips are popular with users running local AI models: they’ll cost you $2,000-4,000 for 128 GB of RAM, while the closest Nvidia alternative, the RTX 6000 Blackwell with 96 GB of VRAM, costs 2-4x more.
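As a rough illustration of the shared pool (a minimal sketch, assuming PyTorch with MPS support on an Apple silicon Mac): tensors placed on the `"mps"` device are allocated out of the same unified memory the CPU uses, there’s no separate VRAM pool to run out of.

```python
# Minimal sketch: GPU allocations on Apple silicon come from unified memory.
# Assumes PyTorch built with MPS support, running on an Apple silicon Mac.
import torch

if torch.backends.mps.is_available():
    x = torch.zeros(1024, 1024, device="mps")  # lives in the shared unified memory pool
    print("Allocated on:", x.device)
    print("MPS bytes currently allocated:", torch.mps.current_allocated_memory())
else:
    print("MPS backend not available on this machine.")
```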