| GPU | VRAM | Price (€) | Bandwidth (TB/s) | TFLOP16 | €/GB | €/TB/s | €/TFLOP16 |
|---|---|---|---|---|---|---|---|
| NVIDIA H200 NVL | 141 GB | 36284 | 4.89 | 1671 | 257 | 7423 | 21 |
| NVIDIA RTX PRO 6000 Blackwell | 96 GB | 8450 | 1.79 | 126.0 | 88 | 4720 | 67 |
| NVIDIA RTX 5090 | 32 GB | 2299 | 1.79 | 104.8 | 71 | 1284 | 22 |
| AMD RADEON 9070XT | 16 GB | 665 | 0.6446 | 97.32 | 41 | 1031 | 7 |
| AMD RADEON 9070 | 16 GB | 619 | 0.6446 | 72.25 | 38 | 960 | 8.5 |
| AMD RADEON 9060XT | 16 GB | 382 | 0.3223 | 51.28 | 23 | 1186 | 7.45 |
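
The last three columns are just the list price divided by the corresponding raw spec; a minimal sketch of that arithmetic (values copied from two rows of the table, minor rounding differences aside):

```python
# Minimal sketch of how the derived columns are computed: list price divided by
# the raw spec. Values copied from two rows of the table above; small rounding
# differences aside, this reproduces the €/GB, €/TB/s and €/TFLOP16 columns.
cards = {
    # name: (price_eur, vram_gb, bandwidth_tb_s, tflop16)
    "NVIDIA RTX 5090":   (2299, 32, 1.79, 104.8),
    "AMD RADEON 9070XT": (665, 16, 0.6446, 97.32),
}

for name, (price, vram, bw, tflops) in cards.items():
    print(f"{name}: {price/vram:.0f} €/GB, {price/bw:.0f} €/TB/s, {price/tflops:.1f} €/TFLOP16")
```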

This post is part "hear me out" and part asking for advice.

Looking at the table above, AI GPUs are a pure scam, and (at least on paper) it would make much more sense to use gaming GPUs instead, either through a Frankenstein build of PCIe switches or over a high-bandwidth network.

So my question is whether somebody has built a similar setup and what their experience has been: what the expected overhead/performance hit is, and whether it can be made up for by simply having way more raw performance for the same price.
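
For concreteness, this is the kind of multi-GPU split I have in mind; a minimal sketch using vLLM's tensor parallelism (model name and GPU count are placeholders, not something I've actually run):

```python
# Hypothetical sketch (placeholder model and GPU count): splitting one model
# across several gaming GPUs with tensor parallelism in vLLM. The inter-GPU
# traffic is exactly where the PCIe-switch / network overhead would show up.
from vllm import LLM, SamplingParams

llm = LLM(
    model="meta-llama/Llama-3.1-70B-Instruct",  # placeholder; needs more VRAM than any one card has
    tensor_parallel_size=4,                     # e.g. 4x RTX 5090 = 128 GB VRAM total
)

out = llm.generate(["Hello"], SamplingParams(max_tokens=64))
print(out[0].outputs[0].text)
```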

[–] [email protected] 5 points 1 day ago (5 children)

> Looking at the table above, AI GPUs are a pure scam

How much more power are your gaming GPUs going to use? How much more space will they use? How much more heat will you need to dissipate?

[–] [email protected] 3 points 1 day ago (2 children)

Well, a scam for selfhosters; for datacenters it's different, of course.

I'm looking to upgrade to my first dedicated, purpose-built server, coming from only SBCs, so I'm not sure how much of a concern heat will be, but space and power shouldn't be an issue (within reason, of course).

[–] [email protected] 1 points 1 day ago

Efficiency still matters very much when self-hosting. You need to consider power usage (does your electrical service have enough amps to power a single GPU? Probably. What about 10? Probably not) and heat (it's going to make you run more A/C in the summer; do you have enough capacity to power an A/C on top of a massive pile of GPUs? Not likely).
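
To put rough numbers on that (assumed figures, not from this thread: ~450 W per high-end gaming GPU under load, and one typical 230 V / 16 A residential circuit):

```python
# Back-of-the-envelope sketch with assumed numbers: ~450 W per high-end gaming
# GPU under load, and one 230 V / 16 A residential circuit (~3.7 kW).
GPU_POWER_W = 450
CIRCUIT_W = 230 * 16  # ~3680 W available on a single breaker

for n_gpus in (1, 4, 10):
    total_w = n_gpus * GPU_POWER_W
    print(f"{n_gpus:>2} GPUs: {total_w} W -> fits on one circuit: {total_w < CIRCUIT_W}")

# 1 GPU is fine, 4 already eat half the breaker, and 10 (~4.5 kW) blow past it
# before counting CPUs, storage, or the extra A/C load.
```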

Homes are not designed for huge amounts of hardware. I think a lot of self-hosters (including my past self) can forget that in the excitement of the hobby. Personally, I'm fine not running huge models at home. I can get by with models that run on a single GPU, and even if I had more GPUs in my server, I don't think the results (which would still contain plenty of hallucinations) would be worth the power cost, the strain on my A/C, and the risk of an electrical overload.

[–] [email protected] 3 points 1 day ago (2 children)
[–] [email protected] 1 points 1 day ago

While I would still say it's excessive to respond with "😑", I was too quick to wave these issues away.

Another commenter explained that residential power physically can't supply enough for a stack of high-end gaming GPUs, which is why, even for selfhosters, the AI cards could be worth it.
