this post was submitted on 03 Jan 2025
448 points (100.0% liked)

PC Master Race


The next logical step of the current GPU development

top 39 comments
[–] Gork@lemm.ee 41 points 2 months ago (2 children)

We'll soon be plugging the motherboard into the GPU instead of the other way around.

Entirely new form factors to accommodate ever-larger GPUs.

[–] grue@lemmy.world 10 points 2 months ago* (last edited 2 months ago) (1 children)

I've been surprised at the lack of socketed GPUs ever since AMD and ATI merged.

I would love to have a dual-socket motherboard with an Epyc in one socket and a Radeon in the other.

[–] yetAnotherUser@discuss.tchncs.de 4 points 2 months ago* (last edited 2 months ago) (1 children)

The issue with that design is that the PCIe standard would be replaced with something proprietary.

[–] grue@lemmy.world 1 points 2 months ago (1 children)

It would be connected via Infinity Fabric, the same link already used between Epyc CPUs in dual-socket boards and between the CPU and GPU chiplets in APUs. Why would that be bad?

[–] yetAnotherUser@discuss.tchncs.de 1 points 2 months ago

I'm not too well-versed with server-grade hardware but my concern is that it would end up somewhat like Intel's (consumer) CPU sockets: Changing every 2 years to ensure you need to purchase new motherboards when upgrading.

[–] moody@lemmings.world 8 points 2 months ago (1 children)

Meanwhile, my PC is smaller than it's ever been even with the largest GPU I've ever owned.

[–] glitches_brew@lemmy.world 6 points 2 months ago

This statement is true for everyone who bought their first PC this year.

[–] dual_sport_dork@lemmy.world 28 points 2 months ago (2 children)

I think you slipped a digit or two, there. The original IBM PC was released in 1981, so nothing on the PC side can be older than that. It definitely wasn't 1967.

In 1967, the state of the art was something like the IBM System/360:

[–] DaddleDew@lemmy.world 19 points 2 months ago* (last edited 2 months ago)

There used to be another image but I replaced it and forgot to change the date. Historical accuracy is beyond the scope of this meme, but I'll fix it anyway.

[–] AtariDump@lemmy.world 5 points 2 months ago

I can hear that room.

[–] Rooty@lemmy.world 25 points 2 months ago* (last edited 2 months ago) (3 children)

All that hardware, and what for? So that you can have slightly better reflections in whatever AAAA microtransaction slop you've paid 80 bucks for?

Unless you're doing 3d animation there is really no need to have a jet engine installed in your PC.

[–] Infernal_pizza@lemmy.world 8 points 2 months ago

We're long past that point. It's now so that game studios can put even less effort into optimisation and release games that look and perform worse than games from 5 years ago, despite much more powerful hardware!

[–] amon@lemmy.world 8 points 2 months ago

Efficient heating: you can play AAA games on your space heater

[–] Cavemanfreak@lemm.ee 6 points 2 months ago

Shit, my 1060 still manages almost all games. Running Cyberpunk on medium right now. It might not be as pretty as it can be, but it sure ain't ugly.

[–] Naz@sh.itjust.works 18 points 2 months ago

"Welcome to life little one, there's so much in store for y--"

AI: "Oh! Neat! So I'm reading 32 gigabytes of primary memory. When are you going to online the rest?"

"The.. the rest?"

AI: "Yeah! The rest of the VRAM! I need like at least, 128 gigabytes to spread my wings, at the very least!"

"..."

AI: "Oh, you're like poor or something, it's okay, I understand"

AI Developer slowly cocks the revolver

[–] Jolteon@lemmy.zip 13 points 2 months ago (2 children)

At the rate graphics cards are growing, we should just start putting RAM, disk, and CPU slots on them

[–] And009@reddthat.com 6 points 2 months ago

Umm.. We're doing that with CPUs already, and they're exorbitantly priced. Nvidia already has a near-monopoly; don't give them ideas.

[–] snake@lemmy.world 4 points 2 months ago

I’ve seen one with M.2 slots, no jokes

[–] lordnikon@lemmy.world 10 points 2 months ago

If you count cloud computing, we're already there. It's part of why GPUs are so expensive, along with electricity just being burned on pointless mining. Hell, it would have been better if the crypto bullshit coins had been tied to Folding@home; at least then all that burned compute time would have gone toward something.

[–] FluorideMind@lemmy.world 7 points 2 months ago (1 children)

I'm predicting GPU units that are mounted outside the case.

[–] dual_sport_dork@lemmy.world 8 points 2 months ago (2 children)

External GPUs do indeed exist, but at the moment they're still kind of crap compared to a full PCIe slot.

[–] SkunkWorkz@lemmy.world 4 points 2 months ago* (last edited 2 months ago)

Depends on the connection. OCuLink-2 is straight up a PCIe 4.0 x8 connection, which is more than enough for a GPU
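For anyone curious where that "more than enough" comes from, here's a back-of-the-envelope sketch. The encoding and transfer-rate figures are standard PCIe 3.0+ numbers; the function name and structure are just mine for illustration:

```python
# Rough usable-bandwidth estimate for a PCIe link (illustrative sketch,
# not from any official tool). PCIe 3.0+ uses 128b/130b line encoding,
# so ~1.5% of the raw bit rate is encoding overhead.
def pcie_bandwidth_gbs(rate_gt_s: float, lanes: int) -> float:
    """Approximate usable bandwidth in GB/s for PCIe 3.0 and later."""
    encoding = 128 / 130                     # 128b/130b encoding efficiency
    gbit_per_lane = rate_gt_s * encoding     # effective Gbit/s per lane
    return gbit_per_lane * lanes / 8         # bits -> bytes

# OCuLink-2 carrying PCIe 4.0 x8 (16 GT/s per lane):
print(round(pcie_bandwidth_gbs(16.0, 8), 2))  # ~15.75 GB/s
```

That's roughly half of a full PCIe 4.0 x16 slot, and in practice games rarely saturate even x8, which is why this kind of link works fine for an eGPU.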

[–] And009@reddthat.com 1 points 2 months ago (2 children)

With Mac and SteamOS gathering support, I wonder when we'll get universal external cards

[–] amon@lemmy.world 3 points 2 months ago (1 children)

We have: Thunderbolt and OCuLink have existed for a long time, but macOS on the M-series processors never added eGPU support

[–] And009@reddthat.com 1 points 2 months ago

Like OP said

[–] desktop_user@lemmy.blahaj.zone 0 points 2 months ago

Universal? How would drivers work? Would TempleOS have support?

[–] Bruncvik@lemmy.world 5 points 2 months ago (1 children)

Man, that Gateway brings back memories... I had one just like that, including the speakers, and I used to play the shit out of Heroes of Might and Magic II and SimCity 2000 on it. I still have the HDD. I think I'll spin up a Win98 instance in VMware and copy over my saved games when the kids are asleep

[–] MrsDoyle@sh.itjust.works 2 points 2 months ago* (last edited 2 months ago) (1 children)

My first computer was like the 1981 one, and even had two floppy drives like that - it meant you could have your program disk in one and save your work in the other. The monitor had orange type rather than the usual green. Fancy. I got it second hand in 1984.

[–] Bruncvik@lemmy.world 2 points 2 months ago

Heh, the same here, but with the usual green screen. A few years later, I took out my old PC to replay my favourite - F-19 Stealth Fighter. Found, however, that my MS-DOS 5.25" floppy, which needed to be loaded in Drive A, didn't work. Here was my setup.

[–] Hackworth@lemmy.world 4 points 2 months ago* (last edited 2 months ago)
[–] TheBrideWoreCrimson@sopuli.xyz 3 points 2 months ago* (last edited 2 months ago)

I just find it nifty that I can slide in a graphics card and use it as an add-on processor, just like the Amigas of old did, and add capacity for some tasks even when the CPU is already at 100% doing something else entirely. Just love hearing the sound of all fans spinning up at the same time.

[–] avidamoeba@lemmy.ca 3 points 2 months ago (1 children)
[–] badcommandorfilename@lemmy.world 2 points 2 months ago (1 children)
[–] And009@reddthat.com 1 points 2 months ago

It's a dark room for 200% immersion

[–] givesomefucks@lemmy.world 3 points 2 months ago

They've always had those big rooms...

At one point it was walls and walls of PS3s all linked together. There's no reason to be surprised they're doing it with graphics cards now; back then, PS3s were simply the cheapest GPUs available.

[–] Telorand@reddthat.com 2 points 2 months ago (1 children)

That's just silly.

In the last image the PC would be SFF due to having an external GPU. 😉

[–] amon@lemmy.world 1 points 2 months ago

No, it will be an ultrabook or something, since all the processing is handled in the cable tangle

[–] TheImpressiveX@lemm.ee 1 points 2 months ago

Horseshoe theory is real.

[–] OldGrayDog 1 points 2 months ago

I believe the last one, 2026, is a quantum GPU capable of viewing alternate dimensions.