I saw a random AI video where a guy said that to use this model you need 39 GB of VRAM. Like, wtf are these people running at home?
39 GB is actually on the small side; DeepSeek R1 without quantization at full context size needs almost a full TB of RAM/VRAM.
The large models are absolutely massive and you will still find some crazy homelabber that does it at home.
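For a rough sense of the scale: weight size alone is just parameter count times bytes per parameter. A back-of-envelope sketch (the ~671B parameter count for DeepSeek R1 is taken from its public releases; KV cache and activations come on top of this, which is why "almost a full TB" is plausible):

```python
def weights_gib(n_params: float, bytes_per_param: float) -> float:
    """Rough size of the model weights alone, in GiB."""
    return n_params * bytes_per_param / 2**30

# DeepSeek R1 has ~671B parameters; its native weights are FP8 (1 byte each).
print(round(weights_gib(671e9, 1)))  # ~625 GiB before KV cache/activations
print(round(weights_gib(671e9, 2)))  # ~1250 GiB if upcast to FP16
```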
All that RAM for the idiot AI to tell me what I can find on stackoverflow with one startpage search.
No wonder Nvidia is the world’s most valuable company.
I put 64 GB of regular RAM in my home PC, because that was the max my board would take. My old Mac Pro got 96 GB, because that was the most it could run at max speed (the total could have been 128). Both only for 8 GB graphics cards. Both because I might want to open 400 tabs in a browser or something, maybe casual gaming, lol
They’re talking about VRAM in a GPU, not motherboard RAM.
No, it's both: offloading to system RAM is normal for regular users with consumer-level hardware.
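A toy sketch of what that offload split looks like (all the numbers are made up for illustration; real runtimes like llama.cpp measure actual layer sizes, but the idea of putting as many layers as fit in VRAM and leaving the rest in system RAM is the same):

```python
def split_layers(n_layers: int, layer_gib: float, vram_gib: float,
                 reserve_gib: float = 1.0) -> tuple[int, int]:
    """How many transformer layers fit in VRAM (minus a reserve for
    the framework/context); the rest are offloaded to system RAM."""
    fit = int((vram_gib - reserve_gib) // layer_gib)
    on_gpu = max(0, min(n_layers, fit))
    return on_gpu, n_layers - on_gpu

# e.g. a 32-layer 7B model (~0.45 GiB/layer at 4-bit) on an 8 GiB card:
print(split_layers(32, 0.45, 8.0))  # (15, 17): 15 layers on GPU, 17 in RAM
```

Offloaded layers run at system-RAM bandwidth, which is why this works but is much slower than fitting everything in VRAM.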
Macs have unified RAM, so they can use system RAM as VRAM. The AI line of AMD processors can kind of do that too. Granted, these aren't as fast as dedicated GPUs, but they're the most affordable way to get huge amounts of VRAM.
Are there any Apple chips that use unified memory that's external? I'm like 99% sure chips with unified memory have it integrated on-package with no option for expansion.
I was talking about a classic Mac Pro. It's 2x Xeon processors and a standard GPU.
All of them?
The meme differentiates between a shortage of GPUs and a shortage of memory, so I thought it was about mobo RAM, but I get that the comment I replied to mentioned VRAM.
VRAM in the context of an iGPU like an Apple chip or Strix Halo is the same thing as system RAM. It's shared memory.
It's why Strix Halo and Apple M4 chips are popular with users running local AI models: those will cost you $2,000-4,000 for 128 GB of RAM, while the closest Nvidia alternative, the RTX 6000 Blackwell with 96 GB of VRAM, costs 2-4x as much.
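Put in cost-per-GB terms (the prices below are rough assumptions picked from the ranges in the comment above, not quotes):

```python
def dollars_per_gb(price_usd: float, vram_gb: float) -> float:
    """Price per GB of (V)RAM usable for model weights."""
    return price_usd / vram_gb

# Assumed prices for illustration only:
print(round(dollars_per_gb(3000, 128)))  # Strix Halo / M4-class box: ~$23/GB
print(round(dollars_per_gb(8500, 96)))   # RTX 6000 Blackwell card alone: ~$89/GB
```

The unified-memory boxes also include the whole computer at that price, while the discrete card still needs a host system.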