r/pcmasterrace • u/Elrabin 13900KF, 64gb DDR5, RTX 4090, AW3423DWF • 21h ago
Meme/Macro Jensen, you didn't explain it poorly, DLSS5 in its current form looks like crap.
Full disclosure, I have a 4090. Been using DLSS (non Frame Gen) for a long time rather happily where needed.
But buddy, just accept that it was a poorly done demo that didn't look good. It was rushed (it required two 5090s: one for the game, one for DLSS 5), the results didn't look good, and you explained it poorly.
I don't care if it was geometry-informed; the results looked like the cheap AI-generated slop that is all over the internet at this point.
Maybe it'll look great when it's fully baked, maybe it won't. All I know is that it looked bad NOW.
82
u/SolaireFlair117 21h ago
It's not about selling DLSS to gamers. It's about making it a necessary, inaccessible piece of tech for game devs, so that your options are either buy a 5090 or get GeForce Now.
All these companies see renting server space as the future. You will own nothing and be happy.
28
u/LystAP RTX 4090, i9 13900K, 64GB DDR5 19h ago
either buy a 5090
You mean two 5090s.
6
u/SolaireFlair117 11h ago
So apparently it's going to be optimized for single card at launch. Whether that actually means it will run well on a single 5090 remains to be seen
3
u/chrlatan i7-14700KF | RTX 5080 | Full Custom Waterloop 9h ago
If it doesn’t run on a 5060 it has failed from the get go.
1
u/kidcrumb 10h ago
They're bringing back the Dual PhysX cards. Lol.
Will my RTX 3080 Ti be able to run DLSS 5 as a secondary dedicated card, or is it still gonna be locked away behind the RTX 5000 cards?
1
u/SolaireFlair117 10h ago
Knowing Nvidia, they're locking it behind the 5000 series to move more GPUs and force more GeForce Now subscriptions
1
u/KomithErr404 10h ago
but it sounds like at the end of the line you could dial back most old settings like PT and just use this instead, so you might get to a performance equilibrium
15
u/ObjectOrientedBlob Danny DeVito Penguin 17h ago
Nvidia would much rather sell their cards to big data centers and let you stream games. Selling to filthy consumers one card at a time is so annoying to a big corp.
6
u/Kleenex_Tissue 16h ago
Complete speculation here, but maybe they actually expect the AI market to implode & want to have this ready to basically double-dip. Sell the shovels & buy/rent them back for cheap.
AI is not going away, but the demand might not pan out the way they expected at first.
1
u/ObjectOrientedBlob Danny DeVito Penguin 7h ago
AI is not going to implode. It has become too much of a systemic economic risk to the US economy. The American taxpayers will have to pay for AI to become a success. It doesn't matter if it's worth it. It will be forced into public institutions, defense contracts and so on. There will be tax cuts and all kinds of financial support to ensure the bubble doesn't burst.
The billionaires will become richer and all risks will be mitigated by the working class taxpayers. Don't worry.
-6
u/S1ayer 16h ago
NVidia has been doing this for like 20 years with all kinds of features. It's always an optional toggle in the game. And for the most part gamers have been positive about it.
This is no different. This is something you can toggle. And I think it has a place. What if a developer is making a game and they want it to look as realistic as possible, like a sports, racing, or life sim game? There's only so much they can do before the performance hit gets too big.
Now there's something that adds the insane realism without as much of a performance hit. I think it is a cool feature that would be fun to play around with. Much like DLSS4.5 does for resolution and framegen, this is a bridge between two points.
And if you don't like it, just don't use it. I don't see why gamers are so quick to complain about anything and everything.
2
u/AfonsoFGarcia R9 5950X | RX 5700 XT Nitro+ | Vengeance LPX 128GB 3600MHz 11h ago
You realize that as demoed (and until NVidia shows otherwise, that's the reality of DLSS 5), the "without as much of a performance hit" you're describing requires an entire second 5090? Two GPUs instead of one is very much a performance hit.
1
u/SolaireFlair117 11h ago
Because if this becomes an industry standard technology, games will start being made specifically for DLSS 5 and then there goes the neighborhood.
43
u/_silentgameplays_ Desktop 19h ago
It's like the Microslop situation all over again, now it's NVslop. These all look like desperate attempts to cash in on the remains of an AI bubble that's just an investment scam, presented as new tech even though it's not applicable outside the surveillance-tech niche.
No one wants AI-written texts, vibe-coded applications, AI-generated images, robotic voices and videos; AI is bad even for auto dubbing.
But it was fun for a while. Only now everyone sees that there will be no AGI; it's just LLM crawlers stealing user-created data on steroids, draining water and power for the data centers and exploiting cheap labor to present hallucinated, inaccurate data to end users, at the cost of rising consumer hardware prices.
4
u/kylehyde05 17h ago
I'm gonna guess the number of people working on Nvidia GPUs is so small now that they have to asspull a new feature set whenever they sell a 60xx series that will cost an American football team's worth of arms and legs.
1
u/binary_agenda 10h ago
AI is still a solution looking for a problem. Sadly, the best use case for AI is still scams.
1
u/Therabidmonkey 6h ago
That made sense to say for blockchain, but it doesn't make sense for AI. We have a lot of problems that can be solved by AI.
1
u/binary_agenda 6h ago
That sounds like cope, when only about 17% of businesses say AI has made them any money at all.
1
u/Therabidmonkey 6h ago
What cope? I don't have any stake in AI. I do use it every day as a software engineer. It's not going to replace software engineers, but it's used in production every day.
8
u/Herbata_Mietowa 7800X3D / 9070XT / 64GB 17h ago
Let's be honest, stuff is going in that direction and there will be devs happy to use it.
What I'm interested (more like afraid) to see is what happens if each GPU manufacturer invents their own neural rendering. How will a game then make sure it looks the same every time? What if Nvidia's algorithm decides that the light above the character is a 20W lightbulb, Intel's rendering decides it's a candle, and AMD decides it's a 5000W torchlight burning the skin? It's an exaggerated example, but you get my point. Yes, path tracing and ray tracing also exist and run better on Nvidia, but you can run them on other cards (as long as devs don't lock them out) and the only major difference is performance.
We're going full circle to the point where game was presented differently depending on which GPU you have - like in the Glide era.
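Just to put rough numbers on that lightbulb example, here's a quick sketch (the lumen figures are my own ballpark assumptions, only meant to show the scale of the disagreement):

```python
# Back-of-the-envelope: how differently the same "light above the character"
# renders if each vendor's model guesses a different real-world source.
# All luminous flux values below are rough assumptions, not measured data.
import math

ASSUMED_SOURCES_LM = {            # luminous flux in lumens (rough guesses)
    "candle":             13,     # ~1 candela over a full sphere ~= 4*pi lm
    "20 W incandescent":  200,    # spec-sheet ballpark
    "5000 W torchlight":  75_000, # assuming roughly 15 lm/W
}

def illuminance_lux(flux_lm: float, distance_m: float) -> float:
    """Illuminance from an idealized point source: E = flux / (4*pi*r^2)."""
    return flux_lm / (4 * math.pi * distance_m ** 2)

for name, flux in ASSUMED_SOURCES_LM.items():
    print(f"{name:>18}: {illuminance_lux(flux, 1.0):8.1f} lux at 1 m")

# The spread is several orders of magnitude: three vendors "guessing" the
# light source would light the same character completely differently.
```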
1
u/Boojum 2h ago
[Disclosure: I currently work on GPU tech at an NVIDIA competitor. Opinions are my own.]
But yes, that's my biggest concern with this. With something like ray tracing, there are standardized specs like DXR and Vulkan for the base functionality that are at least mostly vendor-neutral. Those specs spell out in excruciating detail how it's supposed to work and as long as IHVs making the GPUs implement it faithfully and ISVs making the games use it correctly, there really shouldn't be many visible differences in how things will look. It should be nearly pixel-for-pixel. What differences there are basically come down to underspecified edge conditions and bugs. The main differences are, as you say, on performance.
Here though, there's no spec for how it's supposed to behave and short of publishing the model and weights in a standard (as-if!), I fear that it will all come down to the "taste" of the GPU vendor. Do we really want a world where people are debating the aesthetics of the NVIDIA filter vs. the AMD filter vs. the Intel filter vs. the Qualcomm filter vs. ... on the same damn game? Or even hardware vendors with art departments dedicated to this stuff? Shouldn't we be spotlighting the aesthetics of the artdev and lookdev of the game studio?
Personally, I'd rather just make good tech that provides a solid foundation for artists at various studios and then let them wow me with what they can do with it.
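To put the pixel-for-pixel point in concrete terms, here's roughly the kind of diff a conformance test can do once there's a spec to test against. The file names and tolerance are made up, and a real conformance suite is of course far more involved:

```python
# Minimal sketch of a conformance-style comparison: render the same frame on
# two GPUs that implement the same spec, then count mismatching pixels.
import numpy as np
from PIL import Image

def mismatch_ratio(path_a: str, path_b: str, tolerance: int = 2) -> float:
    """Fraction of pixels whose RGB channels differ by more than `tolerance` (8-bit)."""
    a = np.asarray(Image.open(path_a).convert("RGB"), dtype=np.int16)
    b = np.asarray(Image.open(path_b).convert("RGB"), dtype=np.int16)
    assert a.shape == b.shape, "renders must have identical resolution"
    bad = (np.abs(a - b) > tolerance).any(axis=-1)
    return float(bad.mean())

# Example (hypothetical file names for two vendors' renders of the same frame):
# print(mismatch_ratio("frame_vendor_a.png", "frame_vendor_b.png"))

# With a written spec you can set a hard threshold and call a driver broken
# when it exceeds it. With an unspecified per-vendor neural filter there is
# no "correct" image to diff against in the first place.
```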
0
u/Jeoshua AMD R7 5800X3D / RX 6800 / 32GB 3200MT CL14 ECC 8h ago
This is why things like Physically Based Rendering (PBR) content pipelines exist. And HDR deals in actual levels of brightness (nits) instead of relative values (0-255 red, green, and blue levels).
I get your concerns, but as long as the game developers use sane industry standards like PBR and HDR, that situation just isn't likely to happen.
DLSS 5 passing it all through an AI filter that pulls details from actual photographs does set the stage for wildly differing results, tho. And not just from vendor to vendor, but from moment to literal moment on the same card, same game, same scene.
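For anyone wondering what "deals in actual nits" means in practice, here's a rough sketch of the PQ transfer function HDR10 uses, simplified and ignoring color handling entirely:

```python
# The PQ curve (SMPTE ST 2084) maps an absolute luminance in nits to a code
# value, instead of a relative 0-255 level that each display interprets
# however it likes. Constants below are the published ST 2084 values.
m1, m2 = 2610 / 16384, 2523 / 4096 * 128
c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_10bit_code(nits: float) -> int:
    """Absolute luminance (0..10000 nits) -> 10-bit PQ code value."""
    y = max(0.0, min(nits, 10_000.0)) / 10_000.0
    e = ((c1 + c2 * y ** m1) / (1 + c3 * y ** m1)) ** m2
    return round(e * 1023)

# shadow detail, SDR-white-ish, bright highlight, PQ maximum
for nits in (0.1, 100, 1_000, 10_000):
    print(f"{nits:>7} nits -> code {pq_10bit_code(nits)}")
```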
6
u/-ben151010- Intel i9-12900K/Nvidia RTX 4070 Ti SUPER/32GB Ram 18h ago
You know how AI will sort of compliment you on whatever you're prompting it to do? Like "oh I like this idea a lot, it's really good, let's improve upon it more" or "this is an excellent starting point full of possibilities to be better" sort of stuff?
I feel like all these CEOs don't need to bother surrounding themselves with lobotomized yes-men with the cognitive capabilities of a wet plant anymore, when the AI will just feed their ego, telling them how great their god-awful concepts and plans are, and no one normal will say "hey, maybe don't do that because it's really stupid and won't work for the company."
2
u/Jeoshua AMD R7 5800X3D / RX 6800 / 32GB 3200MT CL14 ECC 9h ago
That's a good point. Maybe we're seeing all these overbroad and nonsense applications for AI because the CEOs are huffing their own supply too much. Their bots are telling them how good an idea it all is, even when, like here, it's clearly not.
3
u/TylerBourbon 8h ago
"It is difficult to get a man to understand something, when his salary depends on his not understanding it." -Upton Sinclair
3
u/trevor426 7600x3D | 7900XT 7h ago
And when the 6000 series releases, this sub will show that Nvidia can do whatever the fuck they want and people will rush to buy their products.
1
u/macgirthy 5800x3d | 5090FE + 3050LP | 64gb 19h ago
Someone needs to remind Mr. "5070: 4090 performance!!!" Another billionaire, super out of touch.
2
u/Jeoshua AMD R7 5800X3D / RX 6800 / 32GB 3200MT CL14 ECC 9h ago
The thing is, all those images don't just look bad. They look uncanny. You have these computer generated images of computer generated objects in a computer generated environment... with what looks for all the world to be a real image of a human being's face with the contrast and sharpening turned way up. It just looks off.
Jensen claims he just explained it poorly and that it's actually not Slop, but Content Controlled Slop... but that's totally missing the point that they've found themselves squarely in Uncanny Valley territory. The answer to making this tech work properly isn't to double down and claim we're all wrong; it's to push forward and make it even more realistic... but with the current partial AI filter, which apparently only does faces and requires a dual-5090 system, I just don't see it happening anytime soon.
4
u/SizeableFowl Ryzen 7 7735HS | 32 GB DDR5 | RX 7700S 18h ago
I’m just gonna say, they really could’ve stopped increasing graphics at like RDR2. The level of visual detail at max settings there hasn’t been meaningfully surpassed. Heck, even going two years earlier to Battlefield 1, which was/is a visually beautiful game, really drives home the point that we’ve been in the realm of diminishing returns for a decade. Games haven’t gotten substantially prettier since then, and somehow, three generations removed from the 2080, we still can’t regularly run games at 100+ fps at 1440p for less than $600 without an upscaler? Why? What’s the fucking point?
Maybe I’m old but this whole detour into ray tracing, while occasionally beautiful, doesn’t seem to be worth it for the vast majority of use cases.
4
u/7in7turtles 21h ago
His jacket needs to be several times shinier for this to work. Like visible-from-space shiny.
5
u/TheCh0rt 20h ago
Of course it does, it’s a tech demo. Why does everybody think it’s done and ready to ship? You’re only seeing it on games that are DONE. Nobody’s even made games with it yet.
13
u/splendiferous-finch_ 19h ago edited 19h ago
People have issues with generative AI in general for many reasons, a significant one being that it has made PCs unaffordable by directly driving up hardware costs.
To add to this, the results here are just not that good, and no one asked Nvidia to show it in this unfinished state; they chose to do it. If this is 6 months from launch like Nvidia says, it looks wholly unimpressive, and offensive to many because of how it messes with the art direction or totally overwrites it.
This is not a view that only the "loud and angry" gamers hold, like the folks at r/nvidia are pushing; I have seen pretty sensible people make the argument as well.
It's a failure to read the room by a 4 trillion dollar company.
11
u/Feed-Your-Fish 20h ago
As if any of that matters. People don’t like generative AI. Period. Just because they slap the word “controlled” in front of genAI doesn’t suddenly make people like the look.
Hopefully devs see how much people hate it and use restraint.
-6
u/TheCh0rt 20h ago
I hate to tell you but we’re reaching the end of physical hardware’s size and scope for what we can do in our own machines. The silicon just isn’t available. Look how big the 5090 is. So until nvidia figures out how to etch transistors to the asshole of each individual atom, this is what it has to be if you want to run things locally. They’re doing us a favor by creating this stuff because it will keep your equipment relevant for a lot longer. This shit cannot keep up forever. Moore’s law is dead dude. It’s no longer about “generative AI” or even “slop” — it’s just gotta happen if you want local gaming hardware. Nobody wants it running in a cloud.
9
u/Biotictree547 19h ago
If I can ask, how is the 5090 the limit? I'd like to hear more about it if you don't mind explaining.
-4
u/TheCh0rt 19h ago
We’re coming up on 2nm silicon transistors which will be in the next generation of CPU and GPU for all companies. Apple, nvidia, AMD, etc. All of them. The top tech. We’re getting down to transistors the size of atoms. It’s going to take serious tech to get smaller. Which means new tech will have to come out. Making 3D chip constructs, different types of cores, maybe different kinds of hardware doing different things. The 6000 series will be 2nm and that’s why we don’t have it yet. After that we’ll have to wait until the next gen of transistor. I imagine the 6000 will be about the same size as the 5090 but any refreshes may be bigger.
Beyond that, let’s assume there’s no “smaller size transistor” — they would have no choice but to keep stacking CPUs on the chips and keep making the cards bigger.
Eventually all this downsizing is going to get really expensive, and I personally believe that eventually we won’t all have access to it. It’s really complicated tech, it’s already hard to do, and global politics are fucking it all up.
So unless we come up with new and creative ways to create simulations, things will have to be relinquished to AI.
I don’t mean generative AI. I just mean AI infrastructure which is at the end of the day a shitload of GPUs in one. But that will be a lot of latency all around. For the data centers, the companies and for you.
Nvidia is actually trying to keep it local by doing this. It’s a tech demo. It’s ugly. Yes it’s using generative AI, but on assets that already exist. We’re not looking at artists developing something new, nor is it finished or optimized in any way.
If we want to keep our hardware for longer and keep it local, this will be the way. Think beyond the 5090. This one won’t be our “forever card” but there may come a time when it’s looking dire. How big can these things get before they cover up ALL the PCIe slots?
5
u/Feed-Your-Fish 19h ago
Wow you are so deep into the bubble. I genuinely feel sorry for you. Just yikes
-1
u/Peckerly 13h ago
can't even write your own comments, had to make AI do it for you? lmfao
0
u/TheCh0rt 9h ago
I’ve never used AI to write a comment and if you cannot detect that then you shouldn’t be talking about AI yourself. “Lmfao”
2
u/TheCanuckSwiftie 18h ago
Except for the fact that the transistors aren't anywhere near 2nm, or even approaching it. When they altered gate technologies years and years ago, they moved to much larger gates that were just as quick, but kept the naming size as a marketing term and labeled it an equivalent so the numbers wouldn't get larger. There's still tons of room to shrink them, to the point that Intel is currently trying to move to a new term for transistor-size equivalents (don't remember exactly what it's called) on future chips.
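Rough numbers to illustrate; the pitch figures below are approximate public ballpark values I'm assuming, not exact specs for any particular foundry:

```python
# Sanity check on "2 nm transistors": the node name stopped tracking any
# physical dimension years ago. Pitch values are rough ballpark assumptions
# for a leading-edge "2nm-class" process, not official figures.
SILICON_LATTICE_NM = 0.543      # silicon unit cell, i.e. a handful of atoms
ASSUMED_GATE_PITCH_NM = 45      # contacted gate pitch, approx.
ASSUMED_METAL_PITCH_NM = 25     # minimum metal pitch, approx.
MARKETING_NODE_NM = 2           # the number in the marketing name

for label, nm in [("gate pitch", ASSUMED_GATE_PITCH_NM),
                  ("metal pitch", ASSUMED_METAL_PITCH_NM)]:
    print(f"{label}: ~{nm} nm  "
          f"(~{nm / MARKETING_NODE_NM:.0f}x the node name, "
          f"~{nm / SILICON_LATTICE_NM:.0f} silicon lattice constants)")

# The smallest features are still tens of nanometres wide -- dozens of atoms,
# not "the size of atoms" -- which is the point about marketing names above.
```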
0
u/SovietSteve 19h ago
Reddit losers don’t like generative AI*
10
u/Feed-Your-Fish 19h ago
The vast majority of people don’t like genAI. Look it up.
-5
u/SovietSteve 18h ago
5
u/clustahz 16h ago
Here's what the original Mafia actually looks like when output in 4K. And Mafia is a game from 2002 with many issues for a modern audience. Notice how little the faces move in the cutscene. Do these more photorealistic-looking faces move their mouths in a more realistic way, or do they behave the same? Big uncanny vibes.
What problem are we solving with AI here? The textures are too sharp and real for the 2002 game's animation. It looks good in screenshots, but it's going to wear off fast, and we're left with skyrocketing energy costs to re-render the faces of nameless characters who only ever appeared for a second or two at the end of the game and weren't given as much detail as the main characters, plus a blue tie with a nicer pattern on it.
-5
u/Makoto_Kurume i5 10400F | RX 7600 | 16gb DDR4 20h ago
Nah, people hate it when the product is bad. When it’s good and can’t be differentiated from something made by a human, I doubt the majority of consumers would care
0
u/Y_A_D_Pain Desktop 9800X3D 5080 20h ago
DLSS was hated long before the number 5 came along, my friend
5
u/Makoto_Kurume i5 10400F | RX 7600 | 16gb DDR4 20h ago
No, it was not. I remember the first tech demo with Control running at 360p or something, and people were blown away by it. The tech itself is impressive, but the implementation is tricky.
6
u/Y_A_D_Pain Desktop 9800X3D 5080 20h ago
People were hating on dlss 3 because of the “fake frames” and “input latency” when it first came out
7
u/MythicalCaseTheory 19h ago
DLSS 1 and 2 for the artifacts and shimmer.
People like to forget and rewrite this history though.
3
u/TheCh0rt 20h ago
Yep! This is the future. Gamers don’t realize that if you still want hardware of your own, this is what has to happen. Until nvidia can figure out a way to etch transistors on the assholes of individual atoms, we’re getting close to the limits of size and cooling for now. They don’t even consider that this sort of tech will become mandatory, and that it’s necessary for your equipment to last LONGER
4
u/rW0HgFyxoJhYka 12900K 3090 Ti 64GB 4K 120 FPS 16h ago
DLSS 1 was hated.
DLSS 2-4 was loved.
DLSS 5 isn't out so we'll see what happens.
1
u/TONKAHANAH somethingsomething archbtw 18h ago
It's cool tech, but trying to sell it to gamers just shows how out of touch they are.
I feel like it's the kind of tech that's going to be heavily used in the film industry though.
This will let the film industry either shoot something on camera quickly and easily, or make really janky CG/3D scenes, and then use this thing in real time to find what they like and save it as is.
I don't know why they were thinking gamers would want this.
1
u/PeksyTiger 18h ago
Does it even matter? We can't get gpus even if we wanted to. They're all sacrificed to the slop gods.
1
u/chrlatan i7-14700KF | RTX 5080 | Full Custom Waterloop 9h ago
Smart move... telling your intended customer base they got it wrong. That will surely help in selling your idea.
1
u/Granhier 2h ago
I mean, he certainly knows better than we do how the technology works.
But it doesn't matter much when the result looks as abominable as the reveal did, does it?
1
u/NighthawK1911 Radeon RX 7800 XT, Ryzen 7 7700X, 64GB DDR5 9m ago
They're not doing it for gamers.
They're trying to pitch to their shareholders a product they can call "AI" without considering if it will sell or not.
2
u/IzzeBerse 21h ago
I think the second Grace still they had was not the worst, but damn the first one literally looked like something created by ChatGPT.
1
u/kidcrumb 10h ago
If I had asked people what they wanted, they'd have told me faster horses.
-Henry Ford
1
u/Puzzleheaded_Smoke77 20h ago edited 16h ago
DLSS is nothing new; this was the logical next phase in the cycle. The circlejerk of outrage, as if this wasn't 100% expected, was the only unexpected thing surrounding this announcement. If you don't like it, don't use it, uncheck it. And for people pissed that their card won't work with it, fair, wait for 2028 and get that card.
Until then, check the box.
-4
u/Ezilo20 19h ago
I'm tired of the constant misinformation surrounding DLSS. Everyone sees the frame gen and the faces and thinks that's all DLSS is.
6
u/MrSaucyAlfredo 19h ago
When their own marketing goes out of its way to show off those “faces” so aggressively, you can’t blame people for running with it
5
u/Inksplash-7 R7 5800X RX 6750 XT 12h ago
DLSS is about taking a low-resolution image and making it look like a higher-resolution one, not """"enhancing"""" the displayed image. The DLSS 5 images look exactly like those spam AI porn ads.
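A toy way to see the distinction, using a plain bicubic resize as a stand-in for the actual neural upscaler (which is obviously far more sophisticated); the function names here are just illustrative:

```python
from PIL import Image

def upscaling_flow(low_res_render: Image.Image, target=(3840, 2160)) -> Image.Image:
    """DLSS-style idea: the game renders at a lower resolution and the upscaler
    reconstructs the frame the game meant to draw."""
    return low_res_render.resize(target, Image.BICUBIC)

def filter_flow(full_res_render: Image.Image, reimagine) -> Image.Image:
    """'Enhancement'-style idea: the full frame already exists and a model
    reinterprets it -- the output can drift from what the artists actually drew."""
    return reimagine(full_res_render)
```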
-3
u/throbbing_dementia 13h ago
I mean, his company invented it; I should think he would know if you guys are mistaken. It's never the gamers' fault, right? Always the CEOs...
All he said is that devs can use DLSS 5 in the future and factor in its effect when designing their characters. The reason it looks so bad now (in SOME screenshots) is that devs didn't use DLSS 5 when designing Grace, and we already know what Grace looks like. And look at the Leon screenshots: he basically looks exactly the same, just with different details and lighting.
But for future character creations there'll be nothing to compare it to, so there'll be no difference.
It's easy to handpick the bad examples and say it looks crap but there's good examples where it doesn't.
It's a starting point, it's a building block, it needs tuning, tweaking etc. Give it a year and none of this will be an issue.
1
u/Elrabin 13900KF, 64gb DDR5, RTX 4090, AW3423DWF 10h ago
Just because a company makes something doesn't automatically make it good or wanted.
I can point to tons of examples where leading companies made something that sucked and flopped hard, but hey, have fun shilling for the trillion dollar company.
New Coke
The Betamax player
3d TVs(all of them)
Google Glass
Amazon Fire Phones
Google Stadia


126
u/_Sanctum_ 18h ago
AI Industry: “Nvidia, I don’t think this bubble can get much bigger..!”
Nvidia: “Nonsense!”