[–] [email protected] 1 points 5 days ago (1 children)

So 12GB is what you need?

Asking because my 4GB card clearly doesn't cut it 🙍🏼‍♀️

[–] [email protected] 2 points 5 days ago (1 children)

A 4GB card can run smol models; bigger ones require an nvidia and lots of system RAM, and performance gets proportionally worse the more of the model spills out of VRAM into DRAM.
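
To make that VRAM / DRAM balance concrete, here's a minimal sketch with llama-cpp-python (the model path and layer count are placeholders, and the sizing rule of thumb is approximate: a Q4 quant takes roughly half a byte per parameter, so a 7B model is ~4GB of weights before context overhead):

```python
# pip install llama-cpp-python  (built with CUDA/Metal/ROCm support for GPU offload)
from llama_cpp import Llama

# Rough sizing: a Q4-quantized model needs ~0.5 bytes per parameter,
# so 7B is roughly 4GB of weights plus KV-cache/context overhead.
llm = Llama(
    model_path="models/mistral-7b-instruct.Q4_K_M.gguf",  # placeholder path
    n_gpu_layers=20,  # offload only as many layers as fit in VRAM; the rest run from system RAM
    n_ctx=2048,       # context length; longer contexts eat more memory
)

out = llm("Q: How much VRAM do I need to run a 7B model? A:", max_tokens=64)
print(out["choices"][0]["text"])
```

Lowering `n_gpu_layers` until it stops OOMing is basically the whole tuning process on a small card.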

[–] [email protected] 4 points 5 days ago

> require an nvidia

Big models work great on MacBooks, AMD GPUs, or AMD APUs with unified memory.
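
For example, on Apple Silicon the Metal backend of llama.cpp treats the whole unified memory pool as "VRAM", so you can offload every layer (a sketch, assuming a Metal- or ROCm-enabled llama-cpp-python build and a made-up model path):

```python
from llama_cpp import Llama

# With unified memory (Apple Silicon / AMD APUs) there is no separate VRAM pool,
# so every layer can go to the GPU. A 70B Q4 quant is ~40GB and fits on a 64GB machine.
llm = Llama(
    model_path="models/llama-3-70b-instruct.Q4_K_M.gguf",  # placeholder path
    n_gpu_layers=-1,  # -1 = offload all layers
    n_ctx=4096,
)

print(llm("Hello!", max_tokens=32)["choices"][0]["text"])
```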