this post was submitted on 21 Apr 2025
234 points (100.0% liked)

PC Gaming

11453 readers

For PC gaming news and discussion. PCGamingWiki

Rules:

  1. Be Respectful.
  2. No Spam or Porn.
  3. No Advertising.
  4. No Memes.
  5. No Tech Support.
  6. No questions about buying/building computers.
  7. No game suggestions, friend requests, surveys, or begging.
  8. No Let's Plays, streams, highlight reels/montages, random videos or shorts.
  9. No off-topic posts/comments, within reason.
  10. Use the original source; no clickbait titles; no duplicates. (Submissions should be from the original source if possible, unless the source is paywalled or non-English. If the title is clickbait or lacks context, you may lightly edit it.)

founded 2 years ago
[–] [email protected] 88 points 2 months ago* (last edited 2 months ago) (4 children)

tl;dw: some of the testing shows 300-500% improvements on the 16GB model. Some games are completely unplayable on 8GB while delivering an excellent experience on 16GB.

It really does seem like Nvidia is intentionally trying to confuse their own customers for some reason.

[–] [email protected] 35 points 2 months ago* (last edited 2 months ago) (2 children)

It really does seem like Nvidia is intentionally trying to confuse their own customers for some reason.

It's more so for OEM system integrators, who can buy up thousands of these 8GB 5060 Tis and sell them in systems as "5060 Tis", and the average Joe who buys prebuilts won't know to check the bottom half of the spec sheet to see whether it's the 8GB or 16GB model.
And also, yes, for directly scamming consumers, because Jensen needs more leather jackets off the AI craze and couldn't give a rat's ass about gamers.

[–] [email protected] 6 points 2 months ago* (last edited 2 months ago) (1 children)

I agree that they don't give half a shit about their actual product, but their biggest competitor has never been more competitive, and Nvidia knows it. Pissing off your customer base when you don't have a monopoly is fucking stupid, and Nvidia and the prebuilt manufacturers know this. It's business 101.

There's gotta be something else. I know businesses aren't known for making long-term plans, because all that ever matters to them is short-term profits. But this is just way too stupid to be only about that.

[–] [email protected] 5 points 2 months ago (1 children)

There’s gotta be something else.

That something else is that they don't need the gamer market. Providing consumer cards is practically an inconvenience for them at this point: they make about $2 billion a quarter from gaming cards but $18 billion from datacenter compute, at an insane ~76% gross margin on those products (which keeps funding R&D).

[–] [email protected] 2 points 2 months ago

Ngl gamers don't deserve respect.

[–] [email protected] 18 points 2 months ago* (last edited 2 months ago) (1 children)

To me it sounds like they're preying on gamers who aren't tech-savvy or who are desperate. Just a continuation of being anti-consumer and anti-gamer.

[–] [email protected] 2 points 2 months ago

Yup. This is basically aimed at people who only know that integrated GPUs are bad and that they need a dedicated card, so system manufacturers can build a prebuilt that technically checks that box for as little money as possible.

[–] [email protected] 7 points 2 months ago* (last edited 2 months ago) (1 children)

It really does seem like Nvidia is intentionally trying to confuse their own customers for some reason.

for money/extreme greed

[–] [email protected] 3 points 2 months ago (1 children)

Okay, that's the low-hanging fruit, but explain the connection to me: how does confusing their customers fuel their greed?

[–] [email protected] 7 points 2 months ago (1 children)

Uninformed buyers will buy the 8GB card, get a poor experience, and be forced to buy a new card sooner rather than later.

[–] [email protected] 4 points 2 months ago* (last edited 2 months ago) (3 children)

So their strategy is making and selling shitty cards at high prices? Don't you think that would just push consumers toward a competing brand next time?

[–] [email protected] 6 points 2 months ago

For most consumers it might not; the amount of Nvidia ~~propaganda~~ advertising in games is huge.

[–] [email protected] 3 points 2 months ago

Yea I don't know why buying a shitty product should convince me to throw more money at the company. They don't have a monopoly, so I would just go to their competitor instead.

[–] [email protected] 3 points 2 months ago (1 children)

They had trouble increasing memory even before this AI nonsense. Now they have a perverse incentive to keep it low on affordable cards, to avoid undercutting their own industrial-grade products.

Which only matters thanks to anticompetitive practices leveraging CUDA's monopoly. Refusing to give up the fat margins on professional equipment is what killed DEC. They successfully miniaturized their PDP mainframes while personal computers became serious business, but they refused to let those machines run existing software. They crippled their own product and the market destroyed them. That can't happen here, because ATI is not allowed to participate in the inflated market of... linear algebra.

The flipside is: why the hell doesn't any game work on eight gigabytes of VRAM? Devs. What are you doing? Does Epic not know how a texture atlas works?

[–] [email protected] 3 points 2 months ago* (last edited 2 months ago) (1 children)

The flipside is: why the hell doesn’t any game work on eight gigabytes of VRAM? Devs. What are you doing? Does Epic not know how a texture atlas works?

It's not that they don't work.

Basically what you'll see is something like a cache miss, except the stall to go "oops, don't have that" and fetch the required bits is very slow. That's how you end up with 8GB cards getting 20 fps while 16GB ones get 40 or 60: the path to fetch the missing textures is fucking slow.

And worse, you'll get big framerate dips, and the game will feel like absolute shit because you keep running into hitches loading textures.

It's made worse in games where you can't reasonably predict which texture you'll need next (e.g. Fortnite and other online games that are, you know, played by a lot of people). But even in games where you can reasonably guess, you still run into the simple fact that textures in a modern game are higher quality, and thus bigger, than the ones from five years ago: 8GB in 2019 and 8GB in 2025 are not equivalent.

It cripples a GPU that might otherwise perform substantially better, all for a relatively small BOM cost saving. These cards are trash, and should all end up in the trash.
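
For a sense of scale on that stall, here's some back-of-the-envelope arithmetic. The bandwidth figures are ballpark assumptions (roughly a 5060 Ti-class card on a PCIe 5.0 x8 link), not measurements from the video:

```python
# Rough cost of touching a texture that didn't fit in VRAM.
# All figures are illustrative assumptions, not benchmarks.

texture_mb = 50                 # one high-res texture
vram_bw_gbs = 448               # local GDDR bandwidth, GB/s
pcie_bw_gbs = 32                # PCIe 5.0 x8 ceiling, GB/s
frame_budget_ms = 1000 / 60     # ~16.7 ms at 60 fps

local_ms = texture_mb / 1000 / vram_bw_gbs * 1000
spill_ms = texture_mb / 1000 / pcie_bw_gbs * 1000

print(f"resident in VRAM:  {local_ms:.2f} ms")                       # ~0.11 ms
print(f"fetched over PCIe: {spill_ms:.2f} ms")                       # ~1.56 ms
print(f"share of a 60 fps frame: {spill_ms / frame_budget_ms:.0%}")  # ~9%
```

Miss even a handful of textures in a single frame and the hitching described above falls straight out of the arithmetic.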

[–] [email protected] 3 points 2 months ago (2 children)

That's what I'm on about. We have the technology to avoid going 'hold up, I gotta get something.' There's supposed to be a shitty version that's always there, in case you have to render it by surprise, and say 'better luck next frame.' The most important part is to put roughly the right colors onscreen and move on.

id Software did this on Xbox 360... loading from a DVD drive. Framerate impact: nil.
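
What's being described is essentially virtual texturing with an always-resident low-res fallback, roughly the approach id Software's megatexture streaming took. Here's a minimal Python sketch of the idea; all the names are hypothetical, and real engines do this on the GPU with residency feedback:

```python
# Mip-fallback texture streaming: never stall a frame on a miss.
# TexturePool and its methods are illustrative, not a real engine API.

class TexturePool:
    def __init__(self):
        self.resident = {}      # (tex_id, mip) -> GPU texture handle
        self.in_flight = set()  # uploads already requested

    def request_async(self, tex_id, mip):
        # Queue a background upload and return immediately; a real engine
        # would enqueue the copy on a DMA/transfer queue here.
        if (tex_id, mip) not in self.in_flight:
            self.in_flight.add((tex_id, mip))

    def sample(self, tex_id, wanted_mip, coarsest_mip=8):
        # Use the best mip already in VRAM, and ask for the one we actually
        # wanted so a later frame can have it: "better luck next frame".
        for mip in range(wanted_mip, coarsest_mip + 1):
            handle = self.resident.get((tex_id, mip))
            if handle is not None:
                if mip != wanted_mip:
                    self.request_async(tex_id, wanted_mip)
                return handle
        # The coarsest mip is assumed pinned at load time, so something
        # always renders: just blurry for a frame or two.
        return self.resident[(tex_id, coarsest_mip)]
```

The point is that the renderer always puts roughly the right colors onscreen this frame; the high-res data arrives whenever the bus gets around to it.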

[–] [email protected] 65 points 2 months ago (2 children)

it is 2019, the 2060 super has 8gb of vram. it is 2020, the 3060 ti has 8gb of vram. it is 2023, the 4060 ti has 8gb of vram. it is 2025, the 5060 ti has 8gb of vram.

[–] [email protected] 17 points 2 months ago (1 children)

My 1080 from 2017 has 8gb of vram. Still works fine.

[–] [email protected] 4 points 2 months ago (6 children)

If you stick to games released in 2022 or earlier, sure.

[–] [email protected] 4 points 2 months ago* (last edited 2 months ago) (1 children)

Regular GTX 1080 here. Running Space Marine 2 at 60 fps at 1080p on mid-ish settings, and that's basically the most graphics-intensive game I've played recently. Games like Total Warhammer 3, Dark Deity 2, Factorio or Heroes of the Storm don't care about the GPU and play great at 1080p.

Speaking of, what is the next best cost-efficient GOAT in the generations that followed the GTX 10 series? I'm gonna be needing a new GOAT at some point in the future - would love to hear recommendations.

[–] [email protected] 2 points 1 month ago

I'm also interested in the answer to this; as a fellow 1080p gamer, some quick research had me hovering around an Intel A770, the only card offering 16GB at my target price of around €300, but I'm open to suggestions.

[–] [email protected] 2 points 2 months ago* (last edited 2 months ago) (1 children)

My 3060 has 12gb of vram...

[–] [email protected] 2 points 1 month ago

You probably have to return the 4gb extra then.

[–] [email protected] 48 points 2 months ago (2 children)

The fact that NVIDIA is not allowing AIBs to send the 8GB card to reviewers is quite telling. They're simply banking on uninformed purchasers and system integrators to move this variant. That's another low for NVIDIA, but it will hardly surprise anyone.

Planned obsolescence.

[–] [email protected] 39 points 2 months ago (1 children)

This is worse than planned obsolescence. This is basically manufactured e-waste.

[–] [email protected] 2 points 2 months ago

They should use the 5060s for disposable vapes.

[–] [email protected] 4 points 2 months ago

I agree, but it's still crazy that there are people out there making $500+ purchases without the smallest bit of research. I really hope this card fails, if only because it deserves to.

[–] [email protected] 22 points 2 months ago (2 children)

why the fuck does a 50 series Ti card have only 8gb of vram

[–] [email protected] 4 points 1 month ago

Maybe it has an SD slot or something?

[–] [email protected] 3 points 2 months ago (2 children)

Because it's the low-end xx60? Also, Nvidia has apparently done some high-tech magic that lets higher-res textures be handled with less VRAM.

But yeah, it's a 5060. You're not buying this to play at 4K Ultra.

[–] [email protected] 13 points 2 months ago* (last edited 2 months ago) (4 children)

This video clearly shows the NVIDIA magic is pooping in your own pants.

And the card even struggles at 1080p with 8GB...

[–] [email protected] 3 points 2 months ago (1 children)

I love to poop in my own pants

[–] [email protected] 3 points 2 months ago (1 children)

I mean that NVIDIA will poop your pants

[–] [email protected] 12 points 2 months ago* (last edited 2 months ago) (1 children)

Ever since the 40 series, you need to downgrade every card by one tier to get the actual product. It's an Nvidia marketing gimmick.

RTX 4060 - RTX 4050

RTX 4060 Ti - RTX 4050 Ti

RTX 4070 - RTX 4060

RTX 4070 Super - higher-clocked RTX 4060

RTX 4070 Ti - RTX 4060 Ti

RTX 4070 Ti Super - RTX 4070

RTX 4080 - RTX 4070 Ti

RTX 4080 Super - RTX 4080

RTX 40?? - RTX 4080 Ti. There definitely should have been a GPU in this bracket judging by transistor counts, but it would eat into the insane 4090 margin, so it wasn't meant to be.

The RTX 4090 is, of course, appropriately named.

[–] [email protected] 17 points 2 months ago* (last edited 2 months ago) (3 children)

On the flip side, every game worth playing uses 2GB of VRAM or less at 1080p.

[–] [email protected] 9 points 2 months ago

That's an absolute lie and you know it.

[–] [email protected] 5 points 2 months ago (3 children)

Even 8GB is fine to be honest, even at high settings. People are a bit dramatic about it.

[–] [email protected] 4 points 2 months ago (1 children)

Video editing and AI require as much VRAM as you can get. Not everyone uses the cards just for gaming.

[–] [email protected] 7 points 2 months ago

Then don't buy the current-gen low-end card for video editing, mate. Get a previous-gen card with more VRAM, or go AMD.

[–] [email protected] 4 points 2 months ago* (last edited 2 months ago)

Depends on the use case. And even at 1080p there are quite a few games that use 8GB or close to it. Ghostrunner and LOTF (2023) come to mind. Although tbf, I played the first one on my RX 580 8GB (I think) almost maxed out and it did fine.

But if you're buying a card now, especially at modern new-card prices, you want at least a bit of future-proofing.

[–] [email protected] 3 points 2 months ago (3 children)

I have a game that eats 11GB of VRAM on low at 1080p (I play it windowed). It suffers from some Unreal Engine shenanigans, and it's also a few years old.

[–] [email protected] 3 points 2 months ago (1 children)

Just like normal RAM: if there's more to use, it'll get used. That doesn't mean the game requires more than 8GB to run well.

I played all of Cyberpunk on high settings with 8gb.

[–] [email protected] 2 points 2 months ago* (last edited 2 months ago)

You're not wrong, but then there are games like this one that need at least 6GB (more on DX12) to run on low without running out of memory and either crashing or failing to launch. This is an actual issue with this particular game.

Edit: Cyberpunk has gotten a lot better, though, and will run on things it has no business running on.

[–] [email protected] 2 points 2 months ago

I don't even have a GPU, but to be honest I don't even game anymore cuz I work more hours than there are in a day.

[–] [email protected] 14 points 2 months ago (4 children)

Can we get a GPU that just has some DDR5 slots on it?

[–] [email protected] 4 points 2 months ago

You would need so many channels for that to be viable.
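
Rough arithmetic (assumed ballpark figures, not spec-sheet numbers) on why the channel count gets out of hand:

```python
import math

# How many DDR5 DIMM channels would it take to match a midrange card's
# GDDR bandwidth? Figures are illustrative assumptions.
gddr_bw_gbs = 448          # e.g. a 5060 Ti-class memory subsystem
ddr5_mts = 6400            # DDR5-6400
channel_width_bytes = 8    # one 64-bit channel
ddr5_channel_gbs = ddr5_mts * channel_width_bytes / 1000  # ~51.2 GB/s

print(math.ceil(gddr_bw_gbs / ddr5_channel_gbs))  # -> 9 channels
```

Nine socketed channels on a graphics card would be a routing and signal-integrity nightmare, before you even get to the latency and clock penalties of slotted DIMMs versus soldered GDDR.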
