Gee, we've had over half a century of computer graphics at this point. But suddenly, when a technology arises that requires obscene amounts of GPUs to generate results, a GPU manufacturer is here to tell us that all computer graphics without that new technology is dead for... reasons. I cannot see any connection between these points.
What do you mean "suddenly"? I was running path tracers back in 1994. It's just that they took minutes to hours to generate a 480p image.
The argument is that we've gotten to the point where new rendering features rely on a lot more path tracing and light simulation that used to not be feasible in real time. Pair that with the fact that displays have gone from 1080p60 vsync to 4K at arbitrarily high framerates and... yeah, I don't think you realize how much additional processing power we're requesting.
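To put rough numbers on that (my own back-of-envelope arithmetic, not from the comment above): moving from 1080p60 to 4K120 alone multiplies pixel throughput eightfold, before any of the lighting work even gets more expensive.

```python
# Back-of-envelope: pixels per second at each display target.
# 4K has 4x the pixels of 1080p; 120 Hz doubles the frame rate again.
baseline = 1920 * 1080 * 60      # 1080p60: ~124 million pixels/s
target   = 3840 * 2160 * 120     # 4K120:   ~995 million pixels/s

ratio = target / baseline
print(ratio)  # 8.0 -- an 8x increase before the rendering itself got any fancier
```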
But the good news is that if you were happy with 1080p60, you can absolutely render modern games like that on a modern GPU without needing any upscaling.
I think you just need to look at the PS5 Pro as proof that more GPU power doesn't translate linearly to better picture quality.
The PS5 Pro has a 67% beefier GPU than the standard PS5 - with a price to match - yet can anyone say the end result is 67% better? Is it even 10% better?
We've been hitting diminishing returns on raw rasterising for years now; a different approach is definitely needed.
Yeah, there's a reason any movie attempting 3D CG with any budget at all has used path tracing for years. It's objectively massively higher quality.
You don't need upscaling or denoising (the "AI" they're talking about) to do raster stuff, but realistic lighting does a hugely better job, regardless of the art style you're talking about. It's not just photorealism, either. Look at all Disney's animated stuff. Stuff like Moana and Elemental aren't photorealistic and aren't trying to be, but they're still massively enhanced visually by improving the realism of the behavior of light, because that's what our eyes understand. It takes a lot of math to handle all those volumetric shots through water and glass in a way that looks good.
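The "lot of math" here is Monte Carlo integration of incoming light. A minimal sketch of my own (nothing like a production renderer): estimate the irradiance at a surface point under a uniform sky by sampling random directions over the hemisphere. The analytic answer for unit sky radiance is π, and the estimate stays noisy until you throw a lot of samples at it, which is exactly why denoisers exist.

```python
import math
import random

def irradiance_estimate(n_samples, seed=0):
    """Monte Carlo estimate of E = integral of L * cos(theta) d_omega
    over the hemisphere, with constant sky radiance L = 1.
    Uniform hemisphere sampling has pdf 1 / (2*pi)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_samples):
        cos_theta = rng.random()          # z of a uniform hemisphere direction
        total += cos_theta * 2 * math.pi  # L * cos(theta) / pdf
    return total / n_samples

# Analytic result is pi (~3.14159).
print(irradiance_estimate(16))       # few samples: noisy
print(irradiance_estimate(200_000))  # many samples: converges toward pi
```

The same noise-vs-samples tradeoff shows up in every bounce of a path tracer, multiplied across millions of pixels per frame.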
Yep. The thing is, even if you're on high end hardware doing offline CGI you're using these techniques for denoising. If you're doing academic research you're probably upscaling with machine learning.
People get stuck on the "AI" nonsense, but ultimately you need upscaling and denoising of some sort to render a certain tier of visuals. You want the highest-quality version of that you can fit in your budgeted frame time. If that uses machine learning, great. If it doesn't, great as well. It's all tensor math anyway; it's about using your GPU compute in the most efficient way you can.
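A toy illustration of the "it's all tensor math" point (my own example, nothing like what DLSS or FSR actually do internally): even the dumbest upscaler, bilinear interpolation, is just a weighted combination of neighboring pixels. Learned upscalers effectively replace these fixed weights with trained ones.

```python
def bilinear_upscale(img, out_h, out_w):
    """Resize a 2D grid of floats with corner-aligned bilinear
    interpolation (out dims must be >= 2). Plain-Python stand-in
    for what is one tensor op on a GPU."""
    in_h, in_w = len(img), len(img[0])
    out = []
    for i in range(out_h):
        y = i * (in_h - 1) / (out_h - 1)   # source row coordinate
        y0 = min(int(y), in_h - 2)
        fy = y - y0
        row = []
        for j in range(out_w):
            x = j * (in_w - 1) / (out_w - 1)
            x0 = min(int(x), in_w - 2)
            fx = x - x0
            # Weighted sum of the 4 surrounding input pixels.
            row.append((1 - fy) * ((1 - fx) * img[y0][x0] + fx * img[y0][x0 + 1])
                       + fy * ((1 - fx) * img[y0 + 1][x0] + fx * img[y0 + 1][x0 + 1]))
        out.append(row)
    return out

print(bilinear_upscale([[0.0, 1.0], [2.0, 3.0]], 3, 3))
# center pixel lands between all four neighbors: 1.5
```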
Devil's advocate: splatting, DLSS, and neural codecs, to name a few things that will change the way we make games.
DLSS doesn’t work that well. I’m not looking forward to AI replacing artist’s work.
I'm not sure I agree with you on the former, DLSS is pretty remarkable in its current iteration
Agreed, things like DLSS are the right kind of application of AI to games, same with frame generation. The wrong kind is trying to figure out how to replace developers, artists of every kind, actors, etc. in the production process with AI. That said, companies like Nvidia absolutely can and will profit off making sure a game cannot run well on anything but the latest hardware they sell, so the whole "you need to buy our stuff to play games because it has the good AI, and now all games require the good AI" is capitalist bullshit.
Ray tracing actually will directly change the way games are made. A lot of time is spent by artists placing light sources and baking light maps to realistically light scenery - with ray tracing, you get that realism "for free".
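The core of that "for free" realism is a visibility query. A deliberately tiny sketch of my own (nowhere near a real engine): with baked lightmaps an artist precomputes and stores this answer per surface offline, while a ray tracer just asks the scene at runtime whether the light is visible from each point, by casting a shadow ray and testing it against occluders.

```python
import math

def sub(a, b): return (a[0] - b[0], a[1] - b[1], a[2] - b[2])
def dot(a, b): return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

def lit(point, light, sphere_center, sphere_radius):
    """Cast a shadow ray from `point` toward `light`; return False if a
    sphere occluder blocks it. This visibility test is what lightmaps
    bake offline and ray tracing answers per frame."""
    to_light = sub(light, point)
    dist = math.sqrt(dot(to_light, to_light))
    d = tuple(c / dist for c in to_light)       # ray direction, normalized
    oc = sub(sphere_center, point)
    b = dot(d, oc)                              # projection of center onto ray
    disc = b * b - dot(oc, oc) + sphere_radius ** 2
    if disc < 0:
        return True                             # ray misses the sphere entirely
    t = b - math.sqrt(disc)                     # nearest intersection distance
    return not (0 < t < dist)                   # blocked only if hit before light

light = (0.0, 5.0, 0.0)
blocker = (0.0, 2.5, 0.0)
print(lit((0.0, 0.0, 0.0), light, blocker, 0.5))  # False: sphere shadows this point
print(lit((3.0, 0.0, 0.0), light, blocker, 0.5))  # True: shadow ray misses
```

Because the query runs at render time, moving the light or the blocker needs no rebake, which is the workflow win the artists describe below.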
DF did a really interesting video on the purely path traced version of Metro: Exodus and as part of that, the artists talked about how much easier and faster it was to build that version.
I think what he means is that AI is needed to keep making substantial improvements in graphic quality, and he phrased it badly. Your interpretation kind of presumes he's not only lying, but that he thinks we're all idiots. Given that he's not running for office as a Republican, I think that's a very flawed assumption.
Sounds like a bad thing tbh.
RIP the future of high end computer graphics. 1972 to 2024. You had a good run.
This feels like it's establishing a precedent for the widespread adoption of AI in consumer devices. Manufactured consent.
"We compute one pixel... we hallucinate, if you will, the other 32."
Between this and things like Sora, we are doomed to drown in illusions of our own creation.
If the visuals are performant and consistent, why do we care? I have always been baffled by the obsession with "real pixels" in some benchmarks and user commentary.
AI upscales are so immediately obvious and look like shit. Frame "generation" too. Not sour grapes: my card supports FSR and fluid motion frames, I just hate them and keep them turned off.
DLSS and FSR are not comparable.
"FSR looks like shit" is not the same thing as "upscaling looks like shit".
> my card supports FSR
Yeah this is "we have DLSS at home". As someone who tested both, DLSS is the actually good one, FSR is a joke of an imitation that's just slightly fancier TAA. Try DLSS Quality at 1440p or DLSS Balanced at 4K and you'll see it's game-changing.
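For scale, here is what those modes actually render internally (using commonly cited per-axis render-scale factors, roughly 2/3 for Quality and 0.58 for Balanced; treat these as approximations, not official spec):

```python
def internal_res(out_w, out_h, scale):
    """Approximate internal render resolution for a given per-axis
    upscaling factor (assumed values, not an official spec)."""
    return round(out_w * scale), round(out_h * scale)

q = internal_res(2560, 1440, 2 / 3)   # "Quality" at 1440p -> ~1707x960
b = internal_res(3840, 2160, 0.58)    # "Balanced" at 4K   -> ~2227x1253
print(q, b)
print((2 / 3) ** 2)  # Quality shades only ~44% of the output pixels
```

That halving-or-better of shaded pixels is where the headroom comes from, and why the quality of the reconstruction matters so much.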
That's fine, but definitely not a widespread stance. Like somebody pointed out above, most players are willing to lose some visual clarity for the sake of performance.
Look, I don't like the look of post-process AA at all. FXAA just seemed like a blur filter to me. But there was a whole generation of games out there where it was that or somehow finding enough performance to supersample a game and then endure the spotty compatibility of having to mess with custom unsupported resolutions and whatnot. It could definitely be done, particularly in older games, but for a mass market use case people would turn on SMAA or FXAA and be happy they didn't have to deal with endless jaggies on their mid-tier hardware.
This is the same thing: it's a remarkably small visual hit for a lot more performance, and particularly on higher-resolution displays a lot of people are going to find it makes a lot of sense. Getting hung up on analyzing just "raw" performance, as opposed to weighing the final results independently of the method used to get there, makes no sense. Well, it makes no sense industry-wide; if you happen to prefer other ways to claw back that performance, you're more than welcome to deal with bilinear upscaling, lower in-game settings, or whatever you think your sweet spot is, at least on PC.
The brute-forcing of AI into everything moves into its next stage.
Good thing I haven't cared much about graphics since I was a teenager.
the premise seems flawed, i think.
i feel what he's saying is: we suck at optimizing gfx performance now because gamers deem ai upscale quality passable
this feels opposite to what the ps poll says: that gamers enable performance mode more because the priority is stable frames over shiny anti aliasing/post processing.
I don't see how that's the case. Most people prefer more fps over image quality, so minor artifacting from DLSS is preferable to the game running much slower with cleaner image quality. That is consistent with the PS data (which wasn't a poll, to my understanding).
I also dispute the other assumption, that "we suck at optimizing performance". The difference between now and the days of the 1080 Ti, when you could just max out games and call it a day, is that we're targeting 4K at 120fps and up, as opposed to every game maxing out at 1080p60. There is no performance target on PC anymore; every game can be cranked higher. We are still using Counter-Strike for performance benchmarks, running at 400-1000fps. There will never be a set performance target again.
If anything, optimization now is sublime. It's insane that you can run most AAA games on both a Steam Deck and a 4090 out of the same set of drivers and executables. That is unheard of. Back in the day the types of games you could run on both a laptop and a gaming PC looked like WoW instead of Crysis. We've gotten so much better at scalability.
Chasing graphics has led in directions nobody could have predicted, and I'm glad I don't play these games.
What's up with the biker jacket?
Techbros
It's a stupid gimmick like Steve Jobs' turtleneck.
His version of a weighted blankie.
Oh, here he is again, talking AI AI AI... Their stock is already priced so high that it would take over 50 years of earnings to match its current value.
We can't do the profitable thing without the more expensive and even more lucrative thing, please, think of our monopoly.
Yes you can. Easily. You're sitting on the infrastructure to do it.
Pre-built.
We're at that point in Brazil where everything behind the screens is devastated and wrecked for profit, and now if you want to see any sign of nature you need a graphics card.
But that's in California, where anything natural is on fire or is sliding across Malibu into the ocean.
So just a preview unless these people are stopped.
Computer graphics has existed for quite a long time. It all seemed possible to do just last year.
Sounds like a skill issue.
I don't know enough, but it sounds like the Unix kernel will need a new way to separately give access to TPUs.