AMD potentially blocking DLSS on their sponsored games?

#51 NoodleFighter
Member since 2011 • 11805 Posts
@pclover1980 said:
@R4gn4r0k said:

There was some space game on Steam. I can't remember the name as there are so many space games but I remember it being a first person shooter made in UE4.

And the moment they announced they were partnering with AMD, they made a statement on Steam saying they would no longer be able to bring DLSS to their game.

I think that right there is pretty much confirmation that AMD asks developers it partners with to not support DLSS.

It's Into The Radius VR. But the devs' explanation was that DLSS couldn't work properly with VR, or something along those lines. Which is kinda BS, because FSR works much the same way, just without the tensor cores.

I believe he's talking about the game Boundary. It was a first person space shooter I was also following. The game announced a few months ago that they were dropping ray tracing and DLSS support even though both were already in the game. Boundary had some very demanding RT effects, and combined with DLSS they would easily let Nvidia cards outshine AMD cards. AMD can't have the competition outshining them in their own sponsored game, so they had them removed. I believe the devs even admitted this on their Discord.

#52 R4gn4r0k
Member since 2004 • 46498 Posts

@NoodleFighter said:
@pclover1980 said:

It's Into The Radius VR. But the devs' explanation was that DLSS couldn't work properly with VR, or something along those lines. Which is kinda BS, because FSR works much the same way, just without the tensor cores.

I believe he's talking about the game Boundary. It was a first person space shooter I was also following. The game announced a few months ago that they were dropping ray tracing and DLSS support even though both were already in the game. Boundary had some very demanding RT effects, and combined with DLSS they would easily let Nvidia cards outshine AMD cards. AMD can't have the competition outshining them in their own sponsored game, so they had them removed. I believe the devs even admitted this on their Discord.

Yup, thanks for both examples, my dudes. But it was Boundary indeed.

Every game they team up with seems to bring worse RT and worse DLSS, which doesn't help anyone in the long run.

Resident Evil and Halo Infinite have some RT effects that are nothing to write home about.

And unoptimized games like Callisto and Jedi desperately need DLSS 3.

Furthermore, with how Starfield is looking, I was actually looking forward to playing it with DLSS 2.

#53 Pedro
Member since 2002 • 70035 Posts

Nvidia should make DLSS open.😎

#54 osan0
Member since 2004 • 17854 Posts

@Pedro said:

Nvidia should make DLSS open.😎

Now that would be nice. Sadly, that's not really the Nvidia way. They have a history of letting tech rot before letting it work cross-vendor.

Though in fairness to them, they do have Streamline, an open-source framework that lets developers implement this kind of tech once and plug in each vendor's upscaler, rather than integrating each one separately. Sadly AMD seems to be suffering a bad case of NIH (not invented here) around this. Intel is on board though.
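To give a rough idea of what that "implement once" approach looks like, here's a minimal sketch of a vendor-agnostic upscaler layer (hypothetical interface and names of my own, not the actual Streamline API):

```cpp
#include <cstdint>
#include <cstdio>
#include <memory>
#include <vector>

// Per-frame inputs that the temporal upscalers (DLSS, FSR2, XeSS) all have in common.
// Grayscale float buffers keep the sketch tiny; real integrations pass GPU resources,
// plus depth, motion vectors and camera jitter.
struct UpscaleInputs {
    const float* color;
    uint32_t renderW, renderH;   // resolution the game actually rendered at
    uint32_t outputW, outputH;   // resolution presented to the player
};

// The one interface the game codes against; each vendor would supply a backend.
class IUpscaler {
public:
    virtual ~IUpscaler() = default;
    virtual const char* name() const = 0;
    virtual void evaluate(const UpscaleInputs& in, std::vector<float>& out) = 0;
};

// Stand-in backend so the sketch actually runs: plain nearest-neighbour scaling.
// A DLSS / FSR2 / XeSS backend would slot in behind the same interface.
class NearestNeighbourUpscaler : public IUpscaler {
public:
    const char* name() const override { return "nearest-neighbour fallback"; }
    void evaluate(const UpscaleInputs& in, std::vector<float>& out) override {
        out.assign(size_t(in.outputW) * in.outputH, 0.0f);
        for (uint32_t y = 0; y < in.outputH; ++y)
            for (uint32_t x = 0; x < in.outputW; ++x) {
                uint32_t sx = x * in.renderW / in.outputW;
                uint32_t sy = y * in.renderH / in.outputH;
                out[size_t(y) * in.outputW + x] = in.color[size_t(sy) * in.renderW + sx];
            }
    }
};

// In a Streamline-like layer this would probe the installed GPU and hand back DLSS on
// RTX, XeSS on Arc, FSR2 otherwise. Here it always returns the fallback.
std::unique_ptr<IUpscaler> createBestUpscaler() {
    return std::make_unique<NearestNeighbourUpscaler>();
}

int main() {
    std::vector<float> lowRes(4 * 4, 0.5f);   // pretend 4x4 render target
    UpscaleInputs in{lowRes.data(), 4, 4, 8, 8};
    std::vector<float> out;
    auto upscaler = createBestUpscaler();
    upscaler->evaluate(in, out);              // the game makes one call, whatever the vendor
    std::printf("%s produced %zu output pixels\n", upscaler->name(), out.size());
}
```

The point is that the game only ever talks to the one interface; whether the pixels come from DLSS, FSR2 or XeSS is the layer's problem, which is why per-vendor opt-outs look so odd.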

#55 NoodleFighter
Member since 2011 • 11805 Posts

@R4gn4r0k: I expect most of Microsoft's first party games to be AMD sponsored since after all AMD is providing the hardware for the Xbox consoles and AMD appears to be working more closely with MS than Sony when it comes to getting AMD proprietary tech to work with their consoles.

#56 DragonfireXZ95
Member since 2005 • 26649 Posts

@Postosuchus said:

I assume DLSS isn't as easy or cheap to implement as flicking a switch. Besides, Nvidia locked it down not only to their own hardware, but to only certain generations of their own hardware. If a game dev has limited time and budget to implement one of these framerate crutches, why not the one that works on all hardware?

Yes, I'm sure Bethesda, with their 200 million dollar budget for Starfield, have absolutely no way to dedicate any resources to incorporating DLSS. Please.

#57 NoodleFighter
Member since 2011 • 11805 Posts

@DragonfireXZ95 said:
@Postosuchus said:

I assume DLSS isn't as easy or cheap to implement as flicking a switch. Besides, Nvidia locked it down not only to their own hardware, but to only certain generations of their own hardware. If a game dev has limited time and budget to implement one of these framerate crutches, why not the one that works on all hardware?

Yes, I'm sure Bethesda, with their 200 million dollar budget for Starfield, have absolutely no way to dedicate any resources to incorporating DLSS. Please.

Nvidia RTX cards are about 40% of the Steam userbase at this point, and a lot of the cards that aren't RTX don't even meet Starfield's minimum requirements. At this point, not supporting DLSS really only screws over the biggest demographic with rigs that can play the game day one.

#58 DragonfireXZ95
Member since 2005 • 26649 Posts
@hardwenzen said:

I feel ashamed for being an rx6800 owner, but when you look at the other side, they're as shitty if not shittier. You hermits have the worst companies in existence.

Err, uh, you mean kind of how like Playstation only owners can't even play Starfield? I mean, sure, we have some crappy ones, but console companies aren't much better. Lol

We won't even get into how much Nintendo screws their fanbase over constantly.

#59 Pedro
Member since 2002 • 70035 Posts

@NoodleFighter said:

@R4gn4r0k: I expect most of Microsoft's first party games to be AMD sponsored since after all AMD is providing the hardware for the Xbox consoles and AMD appears to be working more closely with MS than Sony when it comes to getting AMD proprietary tech to work with their consoles.

What proprietary tech?

#60 PCLover1980
Member since 2022 • 1273 Posts

@Pedro: DLSS being open source is moot if competing hardware doesn't have the tensor cores it needs.

#61 NoodleFighter
Member since 2011 • 11805 Posts

@Pedro said:
@NoodleFighter said:

@R4gn4r0k: I expect most of Microsoft's first party games to be AMD sponsored since after all AMD is providing the hardware for the Xbox consoles and AMD appears to be working more closely with MS than Sony when it comes to getting AMD proprietary tech to work with their consoles.

What proprietary tech?

AMD's FidelityFX is part of the Xbox Series X|S development kits. It isn't just FSR but also Contrast Adaptive Sharpening, Denoiser, Combined Adaptive Compute Ambient Occlusion and Variable Rate Shading. Of course some of these are also on PS5, but they don't come with the PS5 dev kit. Also, don't people say the Xbox Series consoles are full RDNA2 while the PS5 is a bit more custom? That would explain why AMD promotes the Xbox Series consoles more than the PS5 despite the latter selling better.
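For a feel of what the "contrast adaptive" part of CAS means, here's a rough CPU sketch of the idea (my own simplified version with made-up constants, not AMD's actual FidelityFX CAS shader): the sharpening strength is scaled down wherever local contrast is already high.

```cpp
#include <algorithm>
#include <cmath>
#include <cstdio>
#include <vector>

// Rough illustration of contrast-adaptive sharpening on a grayscale image in [0,1]:
// low-contrast areas get sharpened strongly, pixels already sitting on a strong edge get
// much less, which is the "adaptive" part. Not AMD's actual FidelityFX CAS shader.
std::vector<float> casLikeSharpen(const std::vector<float>& img, int w, int h, float sharpness) {
    auto at = [&](int x, int y) {
        x = std::clamp(x, 0, w - 1);
        y = std::clamp(y, 0, h - 1);
        return img[size_t(y) * w + x];
    };
    std::vector<float> out(img.size());
    for (int y = 0; y < h; ++y)
        for (int x = 0; x < w; ++x) {
            float c = at(x, y);
            float n = at(x, y - 1), s = at(x, y + 1), e = at(x + 1, y), wv = at(x - 1, y);
            float mn = std::min({c, n, s, e, wv});
            float mx = std::max({c, n, s, e, wv});
            // Local-contrast term: close to 1 in flat areas, close to 0 across strong edges.
            float amp = std::sqrt(std::clamp(std::min(mn, 1.0f - mx) / std::max(mx, 1e-5f), 0.0f, 1.0f));
            // Negative weight on the cross taps; clamped so the filter stays stable.
            float wgt = std::max(-0.2f, -amp * sharpness);
            float val = (c + wgt * (n + s + e + wv)) / (1.0f + 4.0f * wgt);
            out[size_t(y) * w + x] = std::clamp(val, 0.0f, 1.0f);
        }
    return out;
}

int main() {
    // 4x4 test image: flat except for one bright pixel, which the filter accentuates.
    std::vector<float> img(16, 0.25f);
    img[5] = 0.75f;
    auto sharpened = casLikeSharpen(img, 4, 4, 0.15f);
    std::printf("center pixel before %.2f after %.2f\n", img[5], sharpened[5]);
}
```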

#62 Pedro
Member since 2002 • 70035 Posts

@NoodleFighter said:

AMD's FidelityFX is part of the Xbox Series X|S development kits. It isn't just FSR but also Contrast Adaptive Sharpening, Denoiser, Combined Adaptive Compute Ambient Occlusion and Variable Rate Shading. Of course some of these are also on PS5, but they don't come with the PS5 dev kit. Also, don't people say the Xbox Series consoles are full RDNA2 while the PS5 is a bit more custom? That would explain why AMD promotes the Xbox Series consoles more than the PS5 despite the latter selling better.

This is under GPUOpen which means it would work on non AMD hardware no?

#63 Pedro
Member since 2002 • 70035 Posts
@pclover1980 said:

@Pedro: DLSS being open source is moot if competing hardware doesn't have the tensor cores it needs.

It is not moot. You can declare it moot if you want.🤷🏽‍♀️

#64  Edited By NoodleFighter
Member since 2011 • 11805 Posts
@Pedro said:
@NoodleFighter said:

AMD's FidelityFX is apart of the Xbox S|X development kits. It isn't just FSR but Contrast Adaptive Sharpening, Denoiser, Combined Adaptive Compute Ambient Occlusion and Variable Rate Shading. Of course some of these are also on PS5 but they don't come with the PS5 dev kit. Also don't people say Xbox Series consoles are full RDNA2 while PS5 is a bit more custom? It would make sense why AMD promotes the Xbox Series consoles more than the PS5 despite the latter selling better.

This is under GPUOpen which means it would work on non AMD hardware no?

Yeah but I have yet to hear any news of it officially being supported by default on PS5 like it is on Xbox Series. AMD only ever talks about Xbox when it comes to this stuff. Of course this stuff would work on non AMD hardware but the main priority is RDNA2 since some of that stuff is useless for hardware that is older than RDNA2 or does not come with some RDNA2 features.

#65  Edited By Pedro
Member since 2002 • 70035 Posts
@NoodleFighter said:

Yeah but I have yet to hear any news of it officially being supported by default on PS5 like it is on Xbox Series. AMD only ever talks about Xbox when it comes to this stuff. Of course this stuff would work on non AMD hardware but the main priority is RDNA2 since some of that stuff is useless for hardware that is older than RDNA2 or does not come with some RDNA2 features.

Yeah, I don't know which features are available on PS5 but even the features on the Xbox Series are not being used by most games.

#66 PCLover1980
Member since 2022 • 1273 Posts

@Pedro said:
@pclover1980 said:

@Pedro: DLSS being open source is moot if competing hardware doesn't have the tensor cores it needs.

It is not moot. You can declare it moot if you want.🤷🏽‍♀️

And you can declare it not moot if you want.

🤷🏽‍♀️

#67 Xtasy26
Member since 2008 • 5582 Posts

@04dcarraher said:

Also I would like to suggest that the recent increase in games hitting 8GB VRAM limits has one common thread... these newer AMD sponsorship titles (Star Wars Jedi: Survivor, Resident Evil 4, The Last of Us, etc.). Is AMD purposely influencing developers to inflate VRAM requirements to make their GPUs look better? Is AMD holding back RT effects in sponsored games to cater to their shortcomings? AMD goaded Nvidia over VRAM ahead of the RTX 4070 launch... All these things seem a bit fishy, and hypocritical in supporting "open source" yet cutting/limiting features.

Nvidia really shouldn't have any excuses for being stingy with certain products at the price they are charging. The RTX 4070 having 12 GB is laughable at the price they are charging. When I played Cyberpunk at 4K with Psycho graphics and everything set to max on my RTX 3090, it easily took up 12-13 GB. That's one thing that's annoying about nVidia: they are trying to protect their margins while handicapping their great graphics cards.

My GTX 1060 6GB could easily play Red Dead Redemption 2 maxed out at 1080p, but it stutters compared to the RX 580 because the 6GB buffer gets full in certain scenes, and it has lower 1% framerates compared to the RX 580, which has 8GB. Granted this didn't affect me as I got an RTX 3090, but it's another example.

@pclover1980 said:
@04dcarraher said:

Also I would like to suggest that the recent increase in games hitting 8GB VRAM limits has one common thread... these newer AMD sponsorship titles (Star Wars Jedi: Survivor, Resident Evil 4, The Last of Us, etc.). Is AMD purposely influencing developers to inflate VRAM requirements to make their GPUs look better? Is AMD holding back RT effects in sponsored games to cater to their shortcomings? AMD goaded Nvidia over VRAM ahead of the RTX 4070 launch... All these things seem a bit fishy, and hypocritical in supporting "open source" yet cutting/limiting features.

I feel like this is a bit too tinfoil hatty for me, but it may be the case too. Or devs are really, really just lazy. Looking at TLOU1 on PC, there's a noticeable difference in VRAM usage between launch and the latest patch. Same with Forspoken.

Forspoken is known for its piss-poor optimization.

Having said that, it would be really dumb not to include DLSS in Starfield, since it's superior to FSR.

#68  Edited By PCLover1980
Member since 2022 • 1273 Posts

@Xtasy26: This is true. Nvidia saving moolah on VRAM is as bad as this, if not worse. Good thing the 4xxx series isn't selling as well as Nvidia had hoped. The pessimist in me says they planned for this so they can give the 5xxx series a better showing when it releases, like how the 3xxx series looked better coming off the 2xxx series getting lambasted at launch.

#69  Edited By Xtasy26
Member since 2008 • 5582 Posts
@osan0 said:
@R4gn4r0k said:
@osan0 said:
@R4gn4r0k said:

There was some space game on Steam. I can't remember the name as there are so many space games but I remember it being a first person shooter made in UE4.

And the moment they announced they were partnering with AMD, they made a statement on Steam saying they would no longer be able to bring DLSS to their game.

I think that right there is pretty much confirmation that AMD asks developers it partners with to not support DLSS.

Yeah I saw that in the Hardware Unboxed Vid.....that's a really bad look for AMD indeed.

It kinda makes sense as AMD doesn't want DLSS to outshine their proprietary tech in their sponsored games.

But I still wish the practice would come to a halt, starting with Starfield.

And I wish their AMD sponsorship would go in a different direction.

It kinda makes sense on the surface but, with a bit more thought, it really doesn't. It's no secret that DLSS is superior to FSR. This is known. It's well covered and established. There has been nothing to indicate that FSR2 has caught up. This is not to say FSR is bad. I use it myself in a couple of games and it's fine...though I am probably using it under the best conditions (FSR Quality @ 4K). But DLSS is better (though by no means perfect).

So what does cutting DLSS out of Starfield actually get AMD except bad press? There is no win here. Developers won't be any more or less encouraged to use FSR in their games anyway. It still makes sense to implement FSR since it runs on everything including consoles, so the tech still has value. It's also designed so that any game with DLSS support can easily implement FSR2, and the reverse is also true.

RTG's marketing department is in a tough spot, in fairness. The reality is that Radeon is still behind on the software and hardware front. They have made good strides on the hardware front since Vega, in fairness. But there are still weaknesses. Then there is a lot of work to do on the software side too, especially in the areas that are not gaming.

But there have to be better ways of using the deal than this, surely. RTG's marketing department really does need an overhaul.

You are right, FSR @ 4K is pretty good. Don't know if it's enough for me to switch from Nvidia to AMD.

AMD really needs to come up with something like Nvidia's use of tensor cores for an AI implementation of upscaling/supersampling. There were rumors that AMD was working on such things for their future GPUs. While FSR is great, it's not quite DLSS, though some may argue the difference isn't noticeable. But having used DLSS at 4K I was very impressed by the quality. It would be hard for me to switch back to AMD.

#70  Edited By osan0
Member since 2004 • 17854 Posts

@Xtasy26 said:
@osan0 said:

It kinda makes sense on the surface but, with a bit more thought, it really doesn't. It's no secret that DLSS is superior to FSR. This is known. It's well covered and established. There has been nothing to indicate that FSR2 has caught up. This is not to say FSR is bad. I use it myself in a couple of games and it's fine...though I am probably using it under the best conditions (FSR Quality @ 4K). But DLSS is better (though by no means perfect).

So what does cutting DLSS out of Starfield actually get AMD except bad press? There is no win here. Developers won't be any more or less encouraged to use FSR in their games anyway. It still makes sense to implement FSR since it runs on everything including consoles, so the tech still has value. It's also designed so that any game with DLSS support can easily implement FSR2, and the reverse is also true.

RTG's marketing department is in a tough spot, in fairness. The reality is that Radeon is still behind on the software and hardware front. They have made good strides on the hardware front since Vega, in fairness. But there are still weaknesses. Then there is a lot of work to do on the software side too, especially in the areas that are not gaming.

But there have to be better ways of using the deal than this, surely. RTG's marketing department really does need an overhaul.

You are right, FSR @ 4K is pretty good. Don't know if it's enough for me to switch from Nvidia to AMD.

AMD really needs to come up with something like Nvidia's use of tensor cores for an AI implementation of upscaling/supersampling. There were rumors that AMD was working on such things for their future GPUs. While FSR is great, it's not quite DLSS, though some may argue the difference isn't noticeable. But having used DLSS at 4K I was very impressed by the quality. It would be hard for me to switch back to AMD.

Out of curiosity: how much weight do you put on DLSS for your purchasing decision? A nice-to-have, or "DLSS support or it's not an option"? Personally I file things like FSR and DLSS under nice-to-have. Something that will swing the decision one way if all other things are equal. I don't consider them to be critical features.

As for AMD: yeah, they probably should. I was thinking they were too far behind, but Intel has done a great job with XeSS, and that's on the first try. XeSS, when running on Arc specifically, is very impressive from what I have seen. So yeah...AMD should probably just pull the trigger. They have been making some strange decisions around their GPUs. The lack of RT performance (and lack of any major improvement in RDNA3) too is...odd.

#71 Pedro
Member since 2002 • 70035 Posts

@osan0 said:

Out of curiosity: how much weight do you put on DLSS for your purchasing decision? A nice to have or "DLSS support or it's not an option?". Personally I file things like FSR and DLSS under a nice to have. Something that will swing the direction one way if all other things are equal. I don't consider them to be critical features.

As for AMD: Yeah they probably should. I was thinking they are too far behind: But intel have done a great job with Xess and that is on the first try. Xess, when running on ARC specifically, is very impressive from what i have seen. So yeah...AMD should probably just pull the trigger. They have been making some strange decisions around their GPUs. The lack of RT performance (and lack of any major improvement in RDNA3) too is...odd.

What would be considered major improvement with RDNA3?

#72 osan0
Member since 2004 • 17854 Posts

@Pedro said:
@osan0 said:

Out of curiosity: how much weight do you put on DLSS for your purchasing decision? A nice to have or "DLSS support or it's not an option?". Personally I file things like FSR and DLSS under a nice to have. Something that will swing the direction one way if all other things are equal. I don't consider them to be critical features.

As for AMD: Yeah they probably should. I was thinking they are too far behind: But intel have done a great job with Xess and that is on the first try. Xess, when running on ARC specifically, is very impressive from what i have seen. So yeah...AMD should probably just pull the trigger. They have been making some strange decisions around their GPUs. The lack of RT performance (and lack of any major improvement in RDNA3) too is...odd.

What would be considered major improvement with RDNA3?

Around 3-4X RT performance over RDNA2. One of the pain points of RDNA2 imho is the RT performance. It's very much Turing levels (which was widely regarded as poor). RDNA2 didn't arrive until 2 years later.

Sure one could argue that it was AMDs first crack at it but one of the benefits of being behind is there is a target to beat. To get ahead, at some point you need to steal a march.

RDNA 3 doesn't seem to have done much with it either which is even more disappointing. They made the implementation a bit more efficient but the extra RT performance over RDNA2 seems to be mostly down to extra accelerators and higher clocks. By now they really should be able to trade blows with Nvidia in that department.

#73 Pedro
Member since 2002 • 70035 Posts

@osan0 said:

Around 3-4X RT performance over RDNA2. One of the pain points of RDNA2 imho is the RT performance. It's very much Turing levels (which was widely regarded as poor). RDNA2 didn't arrive until 2 years later.

Sure one could argue that it was AMDs first crack at it but one of the benefits of being behind is there is a target to beat. To get ahead, at some point you need to steal a march.

RDNA 3 doesn't seem to have done much with it either which is even more disappointing. They made the implementation a bit more efficient but the extra RT performance over RDNA2 seems to be mostly down to extra accelerators and higher clocks. By now they really should be able to trade blows with Nvidia in that department.

Those are unrealistic expectations. When was the last time GPU performance more than doubled over the previous gen? The RT performance jump from the RTX 3000 to the 4000 series is not even 2x, let alone 3-4x. I don't agree with your claim about one of the benefits of being behind.

#74  Edited By BassMan
Member since 2002 • 17851 Posts

@osan0: After using DLSS for many years and now frame generation, I would consider them essential features just like G-Sync/FreeSync/VRR. They are not just nice additions. The ability to get more performance while maintaining good image quality is huge. Also, frame generation's ability to bypass a CPU bottleneck is amazing.
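As a rough illustration of why frame generation helps when the CPU is the limit (a simplified model with made-up numbers, not a benchmark):

```cpp
#include <cstdio>

// Simplified model of frame generation under a CPU bottleneck (illustrative numbers only):
// the CPU limits how many "real" frames get simulated per second, and the GPU inserts one
// generated frame between each pair of real frames for a small extra per-frame cost.
int main() {
    const double cpuLimitedFps = 60.0;   // real frames the CPU can drive per second
    const double genCostMs = 3.0;        // assumed cost to generate one extra frame

    double realFrameMs = 1000.0 / cpuLimitedFps;                 // ~16.7 ms per real frame
    double presentedFrameMs = (realFrameMs + genCostMs) / 2.0;   // two frames shown per real frame
    double presentedFps = 1000.0 / presentedFrameMs;

    std::printf("CPU-limited: %.0f fps -> with frame generation: ~%.0f fps\n",
                cpuLimitedFps, presentedFps);   // roughly 100 fps in this toy model
    return 0;
}
```

The CPU still only simulates 60 "real" frames, which is why latency doesn't improve the same way the displayed framerate does.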

#75 osan0
Member since 2004 • 17854 Posts

@Pedro said:
@osan0 said:

Around 3-4X RT performance over RDNA2. One of the pain points of RDNA2 imho is the RT performance. It's very much Turing levels (which was widely regarded as poor). RDNA2 didn't arrive until 2 years later.

Sure one could argue that it was AMDs first crack at it but one of the benefits of being behind is there is a target to beat. To get ahead, at some point you need to steal a march.

RDNA 3 doesn't seem to have done much with it either which is even more disappointing. They made the implementation a bit more efficient but the extra RT performance over RDNA2 seems to be mostly down to extra accelerators and higher clocks. By now they really should be able to trade blows with Nvidia in that department.

Those are unrealistic expectations. When was the last time GPU performance more than doubled over the previous gen? The RT performance jump from the RTX 3000 to the 4000 series is not even 2x, let alone 3-4x. I don't agree with your claim about one of the benefits of being behind.


You are correct that the leap from the 3000 series to the 4000 series in RT performance is not as big. But the jump from the 2000 series to the 3000 series in raw RT performance (the jump between a 2080 and a 3080) was not far off 2X. That's the kind of leap RDNA 3 should have made over RDNA 2...and then some, to catch up to the 4000 series.

Basically (3X to 4X is probably a high estimate on my part, in fairness...though the leap from the 2000 to the 4000 series is probably pretty close to 3X), by now AMD really should be trading blows with the 4000 series in raw RT performance. It's somewhat OK to write off RT for lower-tier GPUs (though it would be a nice marketing win to have great RT performance at lower tiers too). But when charging 1000 bucks for a GPU, AMD really needs to offer a more rounded package. People are going to be far less willing to compromise at that price point, and rightfully so.

As for the being-behind thing: AMD doesn't operate in a vacuum. They will be compared against other GPU makers. If they don't try to catch up and/or disrupt, they will just keep falling further behind. They have mostly caught up on raster. They took huge leaps in raster performance and efficiency going from Vega to RDNA2. So it can be done.

#76 osan0
Member since 2004 • 17854 Posts

@BassMan said:

@osan0: After using DLSS for many years and now frame generation, I would consider them essential features just like G-Sync/FreeSync/VRR. They are not just nice additions. The ability to get more performance while maintaining good image quality is huge. Also, frame generation's ability to bypass a CPU bottleneck is amazing.

How essential really though? Would you just flat out refuse to play a game that doesn't support them? If you could get a 7900XTX for the same money as a 4070, which would you choose?

#77  Edited By BassMan
Member since 2002 • 17851 Posts
@osan0 said:
@BassMan said:

@osan0: After using DLSS for many years and now frame generation, I would consider them essential features just like G-Sync/FreeSync/VRR. They are not just nice additions. The ability to get more performance while maintaining good image quality is huge. Also, frame generation's ability to bypass a CPU bottleneck is amazing.

How essential really though? Would you just flat out refuse to play a game that doesn't support them? If you could get a 7900XTX for the same money as a 4070, which would you choose?

All demanding games should support these features.

Since you are comparing different classes of cards, I choose a 4090. :)

But yeah, I would obviously go with a 7900 XTX over a 4070 for the same money. However, 7900 XTX is competing with the 4080 and I would choose a 4080 over it. Slightly worse rasterization, but better RT, better upscaling and frame generation.

#78 osan0
Member since 2004 • 17854 Posts

@BassMan said:
@osan0 said:
@BassMan said:

@osan0: After using DLSS for many years and now frame generation, I would consider them essential features just like G-Sync/FreeSync/VRR. They are not just nice additions. The ability to get more performance while maintaining good image quality is huge. Also, frame generation's ability to bypass a CPU bottleneck is amazing.

How essential really though? Would you just flat out refuse to play a game that doesn't support them? If you could get a 7900XTX for the same money as a 4070, which would you choose?

All demanding games should support these features.

Since you are comparing different classes of cards, I choose a 4090. :)

But yeah, I would obviously go with a 7900 XTX over a 4070 for the same money. However, 7900 XTX is competing with the 4080 and I would choose a 4080 over it. Slightly worse rasterization, but better RT, better upscaling and frame generation.

Yes they should of course support all the options. Especially the big production AAA games. I just don't consider them critical. I'll still happily play a game without them. A GPU that doesn't support them will also still be considered for purchase too.

But yeah I am comparing different classes of cards because that scenario has come up. When I bought my 6900XT: It was that or a 3070ti for the same money. GPU pricing be bonkers at times.

I can also see why you would choose the 4080 over the 7900XTX. At that price point people want a more complete package. What's an extra 100 bucks (bloody hell, there is only a 100 bucks difference between them at my local)? The 4080 is a better overall package.

#79 PC_Rocks
Member since 2018 • 8501 Posts

Why is it suddenly news now? I thought that was common knowledge. I remember even myself talking about it in the past.

#80 Robertos
Member since 2023 • 1023 Posts

That sucks, DLSS 3 is really awesome.

#81  Edited By tribesjah
Member since 2007 • 198 Posts

@Xtasy26: As someone who recently switched from Nvidia to AMD, I completely agree with you here. Although both DLSS and FSR Quality are good at 4K, DLSS is just a bit better IQ-wise (less shimmering and sometimes sharper) from what I have seen. The performance (FPS) is identical, but DLSS definitely has the upper hand in IQ and therefore is the superior product. I do hope that in time FSR can catch up with DLSS, given that FSR can be used on any card while DLSS is exclusive to Nvidia. That being said, it is kind of crappy of AMD to lock DLSS away in games that are sponsored by them (granted, Nvidia used to pull this crap all the time in the past as well). I feel like (and this is unscientific) FSR Quality >= DLSS Balanced in perceived image quality in most cases.

The other "big" issue AMD currently has is with their ray-tracing performance. Although they have gotten better, their latest line (7900) is still definitely behind Nvidia when it comes to RT. Granted, one could argue that RT on Nvidia (outside of the 4090) is also meh if you don't use upscaling, as it tanks performance, and relatively few games support it currently, but more and more will get RT over time. Moving from a 3080 Ti to a 7900 XTX I noticed a "massive" improvement in my performance in traditional rasterized games, but a relatively small increase in games with RT.

#82 tribesjah
Member since 2007 • 198 Posts

@osan0: I feel like the price difference very much depends on which AIB card you are looking at, but I would argue that the price difference is still more around the $200-$250 mark (and at that point it's harder to choose between the two).

For example, one of the cheaper 7900 XTX (Sapphire Pulse) can be had for $900 (and recently you could even get an XFX or an Asrock variant, granted I would probably avoid the Asrock one, for 860ish). On the 4080 side the cheapest ones are still in the 1100-1150 range (Zotac and PNY). I know this just cause I had the same debate (4080 vs 7900), decided to go with 7900 due to better raster performance (and, to a lesser extent, to not support Nvidia and their overpricing practices, this is my first AMD card ever tbh). I think with a $100 difference the choice is less clear (raster vs DLSS and RT), but at 200+ I think the 7900 still makes more sense (granted I may be biased).

#83 blaznwiipspman1
Member since 2007 • 16582 Posts

@BassMan: I'd go with the 7900xtx over 4080. More memory is important than the improvement in RT.

#84 osan0
Member since 2004 • 17854 Posts
@tribesjah said:

@osan0: I feel like the price difference very much depends on which AIB card you are looking at, but I would argue that the price difference is still more around the $200-$250 mark (and at that point it's harder to choose between the two).

For example, one of the cheaper 7900 XTX (Sapphire Pulse) can be had for $900 (and recently you could even get an XFX or an Asrock variant, granted I would probably avoid the Asrock one, for 860ish). On the 4080 side the cheapest ones are still in the 1100-1150 range (Zotac and PNY). I know this just cause I had the same debate (4080 vs 7900), decided to go with 7900 due to better raster performance (and, to a lesser extent, to not support Nvidia and their overpricing practices, this is my first AMD card ever tbh). I think with a $100 difference the choice is less clear (raster vs DLSS and RT), but at 200+ I think the 7900 still makes more sense (granted I may be biased).

Depends on where you live, I suppose. I'm in the EU (to give a broad idea) and, where I get my PC bits, it's just 100-ish euros of difference. The cheapest 7900XTX is 1065 and the cheapest 4080 is 1169. GPU pricing is still pretty wild over here though.

Yeah, at a 200-250 bucks difference the waters are muddied. Depends on the priorities and price flexibility. But, for many, when they are prepared to pay the guts of 1000 for a GPU...they tend to also be more price elastic. By and large these are not people that are watching the pennies.

If those two were my options based on EU pricing...the 7900XTX is a very tough sell. I even have some more eccentric requirements that put a bit more weight on the Radeon side when making a purchase, but even then....man the 4080 has a few nice tricks for just 100 bucks more. I'm already paying 1000+ Euros...what's another 100?

#85  Edited By BassMan
Member since 2002 • 17851 Posts
@blaznwiipspman1 said:

@BassMan: I'd go with the 7900xtx over 4080. More memory is important than the improvement in RT.

16GB VRAM will be good enough for a while. Pointless factoring in extra memory that you will not be using. Basically only useful for productivity and I would not use an AMD GPU for productivity as they are much worse.

Also, dat power draw...

https://www.notebookcheck.net/Extensive-test-reveals-AMD-s-Radeon-RX-7900-XTX-draws-150-W-more-on-average-compared-to-the-Nvidia-RTX-4080.733657.0.html

Yikes!
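Quick back-of-the-envelope on what that extra draw actually costs, using assumed numbers (3 hours of gaming a day, €0.30 per kWh, and the reported 150 W gap):

```cpp
#include <cstdio>

// Back-of-the-envelope cost of the extra power draw (assumed numbers, adjust to taste).
int main() {
    const double extraWatts = 150.0;   // reported average gap between the two cards
    const double hoursPerDay = 3.0;    // assumed gaming time
    const double pricePerKwh = 0.30;   // assumed EU-ish electricity price in EUR

    double extraKwhPerYear = extraWatts * hoursPerDay * 365.0 / 1000.0;  // ~164 kWh
    double extraCostPerYear = extraKwhPerYear * pricePerKwh;             // ~49 EUR

    std::printf("Extra energy: %.0f kWh/year -> about %.0f EUR/year\n",
                extraKwhPerYear, extraCostPerYear);
    return 0;
}
```

Not ruinous on its own, but over a few years it eats into whatever price advantage the card had, on top of the extra heat and noise.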

#86  Edited By blaznwiipspman1
Member since 2007 • 16582 Posts

@BassMan: 16gb is weak in 2023. You definitely need 24gb. Draw distance and textures nowadays are massive.

Productivity only matters for professionals, not most gamers. Maybe some hobbyists out there try stuff out, but the radeon is more than good enough.

Electricity draw isn't important either, we got cheap solar and nuclear power, heck yeah.

#87 tribesjah
Member since 2007 • 198 Posts

@osan0: Agreed there, if you are in Europe, given the higher cost of electricity, mixed with the relatively small difference in price, the 4080 would make sense (especially given the superior RT and upscaling performance). But man you guys in the EU get screwed on prices (more expensive cards, especially given that salaries tend to be significantly lower in the EU than in the US, granted I am mostly focusing on the salaries of professionals rather than the "average" person). Yeah you are less price sensitive when you go up to those "high end" cards but I would argue there is still some sensitivity (plus in this case, if one does not care about RT, technically the 7900 outperforms it slightly so more performance in those scenarios for less $$ may be justifiable).

Tbh though, I feel like spending this kind of money on GPUs is ridiculous to begin with. The only reason I had a 3080 Ti was because during COVID, and due to mining, the price of my 3070 (non-LHR) was the same as that of a 3080 Ti, so I sold my 3070 to get that (and prior to that I had a 2070 which I also sold for the same price I paid for my 3070). The only reason I got the 7900 was because I could sell my 3080 Ti for a decent price, otherwise I would never entertain the idea of spending 1k on a card.

#88  Edited By BassMan
Member since 2002 • 17851 Posts
@blaznwiipspman1 said:

@BassMan: 16gb is weak in 2023. You definitely need 24gb. Draw distance and textures nowadays are massive.

Productivity only matters for professionals, not most gamers. Maybe some hobbyists out there try stuff out, but the radeon is more than good enough.

Electricity draw isn't important either, we got cheap solar and nuclear power, heck yeah.

So, only the 7900 XTX and RTX 4090 are good for gaming in 2023 right? They are the only new GPUs with 24GB after all. Apparently, there are so many games that saturate more than 16GB of VRAM that anything less than 24GB just isn't good enough. Is that right? Please name some of those games for me....

I expect a list of exactly 0 games.

Electricity must be super cheap and abundant all over the world. Especially over in Europe where there has not been an energy crisis at all. No way they are concerned about power. Nope, all good over there. Apparently, all circuits can handle an infinite amount of power and you don't have to worry about power draw at all. I am not sure why they invented breakers that can trip. What a silly idea. Go ahead, run that beastly OC gaming rig along with your home theater system, your air conditioner, etc... and go ahead and run that hair dryer and microwave oven on that circuit while you are at it.... No worries! Nothing will happen. Power draw isn't important.... *sigh*

GTFO clown.

#89  Edited By Xtasy26
Member since 2008 • 5582 Posts
@osan0 said:
@Xtasy26 said:
@osan0 said:

I kinda makes sense on the surface but, with a bit more thought, it really doesn't. It's no secret that DLSS is superior to FSR. This is known. It's well covered and established. There has been nothing to indicate that FSR2 has caught up. This is not to say FSR is bad. I use it myself in a couple of games and it's fine...though I am probably using it under the best conditions (FSR quality @4K). But DLSS is better (though by no means perfect).

So what does cutting DLSS out of starfield actually get AMD except bad press? There is no win here. Developers wont be any more or less encouraged to use FSR in their games anyway. It still makes sense to implement FSR since it runs on everything including consoles, so the tech still has value. It's also designed so that any game with DLSS support can easily implement FSR2 and the reverse is also true.

RTGs marketing department are in a tough spot in fairness. The reality is that Radeon is still behind on the software and hardware front. They have made good strides on the hardware front since vega in fairness. But there are still weaknesses. Then there is a lot of work to do on the software side too, especially in the areas that are not gaming.

But there has to be better ways of using the deal than this surely. RTGs marketing department really does need an overhaul.

You are right FSR @4K is pretty good. Don't know if it's enough for me to switch from nVIDIA to AMD.

AMD really needs to come up with something like nVidia using Tensor Cores for AI implementation of Upscaling/Super Sampling. There were rumors that AMD was working on such things on their future GPUs. While FSR is great but not quite DLSS, some may argue that it's not noticeable. But having used DLSS at 4K I was very impressed by the quality. It would hard for me to switch back to AMD.

Out of curiosity: how much weight do you put on DLSS for your purchasing decision? A nice to have or "DLSS support or it's not an option?". Personally I file things like FSR and DLSS under a nice to have. Something that will swing the direction one way if all other things are equal. I don't consider them to be critical features.

As for AMD: Yeah they probably should. I was thinking they are too far behind: But intel have done a great job with Xess and that is on the first try. Xess, when running on ARC specifically, is very impressive from what i have seen. So yeah...AMD should probably just pull the trigger. They have been making some strange decisions around their GPUs. The lack of RT performance (and lack of any major improvement in RDNA3) too is...odd.

That's a good question. Honestly, I didn't put much weight on DLSS. All I knew was that Nvidia had better ray tracing and I wanted to play games with ray tracing (yes, I am a graphics w*ore). I got a 4K monitor and there was no way I was going to get decent frame rates in games like Cyberpunk 2077 at 4K with Psycho max graphics settings, even with my 3090. So, having DLSS was a must.

Fact of the matter is, 4K at max settings with ray tracing is demanding even for a high end GPU like the 3090, so having DLSS is a must. In certain games you can get away without using DLSS; I played games like Forza Horizon 5 at 4K max without issues. But when you start using ray tracing, DLSS is a must. Noticed that with Control too. It was pretty laggy in certain parts without DLSS, having tested it myself on the 3090.

Intel hired the guy who helped develop DLSS at Nvidia when they started building their GPU division. Hence their good quality upscaler, XeSS. AMD should have hired him. My guess is that they thought they didn't need him and that their version of upscaling with FSR was "good enough".

#90 osan0
Member since 2004 • 17854 Posts

@Xtasy26 said:
@osan0 said:

Out of curiosity: how much weight do you put on DLSS for your purchasing decision? A nice to have or "DLSS support or it's not an option?". Personally I file things like FSR and DLSS under a nice to have. Something that will swing the direction one way if all other things are equal. I don't consider them to be critical features.

As for AMD: Yeah they probably should. I was thinking they are too far behind: But intel have done a great job with Xess and that is on the first try. Xess, when running on ARC specifically, is very impressive from what i have seen. So yeah...AMD should probably just pull the trigger. They have been making some strange decisions around their GPUs. The lack of RT performance (and lack of any major improvement in RDNA3) too is...odd.

That's a good question. Honestly, I didn't put much weight on DLSS. All I knew was that Nvidia had better ray tracing and I wanted to play games with ray tracing (yes, I am a graphics w*ore). I got a 4K monitor and there was no way I was going to get decent frame rates in games like Cyberpunk 2077 at 4K with Psycho max graphics settings, even with my 3090. So, having DLSS was a must.

Fact of the matter is, 4K at max settings with ray tracing is demanding even for a high end GPU like the 3090, so having DLSS is a must. In certain games you can get away without using DLSS; I played games like Forza Horizon 5 at 4K max without issues. But when you start using ray tracing, DLSS is a must. Noticed that with Control too. It was pretty laggy in certain parts without DLSS, having tested it myself on the 3090.

Intel hired the guy who helped develop DLSS at Nvidia when they started building their GPU division. Hence their good quality upscaler, XeSS. AMD should have hired him. My guess is that they thought they didn't need him and that their version of upscaling with FSR was "good enough".

Nothing wrong with being a graphics wh*re. I'm also one at times...just..er...on a tighter budget. But I'm the dedicated "I'll play games at 30FPS on a PC to get the best visual settings" type of graphics wh*re.....real commitment to the cause :P.

But does it have to be DLSS specifically? Hypothetically, if CP2077 only supported FSR, would it be a case of "Nope. Not playing it. Not happening. Unacceptable"? Is it essential that the upscaling be DLSS specifically?

And looking at hardware purchasing decisions: you own a 3090, so I'm guessing you are very much the "best of the best at whatever cost" type of PC gamer (which is grand. Always fun to do, finances willing). But say finances were tighter. Say 500 was the top of what you could spend on the GPU. How much of a tax would you be willing to pay for DLSS specifically?

#91 Xplode_games
Member since 2011 • 2540 Posts

@Pedro said:

So they are doing what Nvidia has done. Full circle.🤔

Not quite. No one seems to notice that Nvidia GPUs can use FSR in all of those games. Nvidia allows FSR, a feature that also works on Nvidia GPUs, into its sponsored games, and the comparison shows their own solution is better; what's hidden, or what you have to figure out yourself, is that part of the reason is that DLSS is tailored specifically to Nvidia GPUs. The only thing AMD is blocking, in a game they pay for, is the developer implementing a feature that can only be used on Nvidia GPUs: DLSS.

This would be outrageous if FSR only worked on AMD GPUs the way DLSS only works on Nvidia GPUs.

In the future, when FSR3 is released, if AMD goes the route of making it more comparable to DLSS in quality and it only works on AMD GPUs, I wonder how many of those Nvidia-sponsored games would support it.

#92 R4gn4r0k
Member since 2004 • 46498 Posts

@Xtasy26 said:

That's a good question. Honestly, I didn't make much of weight for DLSS. All I know was that nVidia had better Ray Tracing and I wanted to play games in Ray Tracing (yes I am a graphics w*ore). I got a 4K monitor and there was no way I was going to get descent frame rates in games like Cyberpunk 2077 at 4K with Psycho max graphics settings even with my 3090. So, having DLSS was a must.

Fact of the matter is 4K at max settings with Ray Tracing is demanding even for a high end GPU like the 3090 so having DLSS is a must. In certain games you can get away without using DLSS and play games like Forza Horizon 5 without DLSS which I did at 4K max without issues. But when you start using Ray Tracing DLSS is a must. Noticed that with Control too. Was pretty laggy in certain parts without DLSS having tested it myself on the 3090.

Intel hired the guy which helped develop DLSS at nVidia when they started building their GPU division. Hence their good quality upscaling Xess, AMD should have hired him. My guess is that they thought they didn't need him and their version of upscaling with FSR was "good enough".

For me it was the exact opposite. I don't care at all about ray tracing; I wouldn't pay a cent more whether my card had ray tracing support or not.

But DLSS was a factor in my purchasing decision however.

#93 Xplode_games
Member since 2011 • 2540 Posts

@tribesjah said:

@osan0: I feel like the price difference very much depends on which AIB card you are looking at, but I would argue that the price difference is still more around the $200-$250 mark (which at that point, harder to choose between the two).

For example, one of the cheaper 7900 XTX (Sapphire Pulse) can be had for $900 (and recently you could even get an XFX or an Asrock variant, granted I would probably avoid the Asrock one, for 860ish). On the 4080 side the cheapest ones are still in the 1100-1150 range (Zotac and PNY). I know this just cause I had the same debate (4080 vs 7900), decided to go with 7900 due to better raster performance (and, to a lesser extent, to not support Nvidia and their overpricing practices, this is my first AMD card ever tbh). I think with a $100 difference the choice is less clear (raster vs DLSS and RT), but at 200+ I think the 7900 still makes more sense (granted I may be biased).

I agree with you. If they were both priced the same, then one could argue the differences in VRAM vs RT and DLSS vs FSR, however with massive price differences, they don't directly compare.

I will add that some consumers might want to look at the 7900 XT which still has 20GB VRAM because those are going for around $700 on sale. That's not a hype price, you can find them for around that on a deal. That is a damn good price point if you don't mind losing some performance and VRAM to the XTX.

#94  Edited By Xtasy26
Member since 2008 • 5582 Posts
@osan0 said:
@Xtasy26 said:
@osan0 said:

Out of curiosity: how much weight do you put on DLSS for your purchasing decision? A nice to have or "DLSS support or it's not an option?". Personally I file things like FSR and DLSS under a nice to have. Something that will swing the direction one way if all other things are equal. I don't consider them to be critical features.

As for AMD: Yeah they probably should. I was thinking they are too far behind: But intel have done a great job with Xess and that is on the first try. Xess, when running on ARC specifically, is very impressive from what i have seen. So yeah...AMD should probably just pull the trigger. They have been making some strange decisions around their GPUs. The lack of RT performance (and lack of any major improvement in RDNA3) too is...odd.

That's a good question. Honestly, I didn't make much of weight for DLSS. All I know was that nVidia had better Ray Tracing and I wanted to play games in Ray Tracing (yes I am a graphics w*ore). I got a 4K monitor and there was no way I was going to get descent frame rates in games like Cyberpunk 2077 at 4K with Psycho max graphics settings even with my 3090. So, having DLSS was a must.

Fact of the matter is 4K at max settings with Ray Tracing is demanding even for a high end GPU like the 3090 so having DLSS is a must. In certain games you can get away without using DLSS and play games like Forza Horizon 5 without DLSS which I did at 4K max without issues. But when you start using Ray Tracing DLSS is a must. Noticed that with Control too. Was pretty laggy in certain parts without DLSS having tested it myself on the 3090.

Intel hired the guy which helped develop DLSS at nVidia when they started building their GPU division. Hence their good quality upscaling Xess, AMD should have hired him. My guess is that they thought they didn't need him and their version of upscaling with FSR was "good enough".

Nothing wrong with being a graphics wh(*e. I'm also one at times...just..er...on a tighter budget. But i'm the dedicated "I'll play games at 30FPS on a PC to get the best visual settings" type of graphics wh*re.....real commitment to the cause :P.

But does it have to be DLSS specifically? Hypothetically if CP2077 only supported FSR would it be a case of "Nope. Not playing it. Not happening. Unacceptable". Is it essential that the upscaling be DLSS specifically?

And looking at hardware purchasing decisions: you own a 3090 so i'm guessing you are very much in the "best of the best at whatever cost" type of PC gamer (which is grand. Always fun to do finances willing). But say finances were tighter. Say 500 was the tops you could spend on the GPU. How much tax would you be willing to pay for DLSS specifically?

Actually, I wouldn't consider myself a "get the best GPU at whatever cost" type. I only got it since I had been on 1080p since 2011, starting with an HD 6950 BIOS-flashed to an HD 6970, all the way to a GTX 1060 6GB (10 years at the same resolution, time to upgrade, no?). This was my first really, really high end GPU, costing $1500. I had never bought a GPU over $1000. Hence, I was mostly an AMD user from 2008-2017, basically from the HD 4870 to the R9 390X, before switching to Nvidia with the GTX 1060 6GB. I would go for the most bang for the buck, hence the use of AMD, as AMD would get close to Nvidia's performance, or match it, at a lower price in traditional rasterization.

The issue is that since I went to 4K with ray tracing, every frame counts. Especially considering that I wanted the best "image quality" with the best frame rates, the 3090 was really the only option. Now, I know some people wouldn't mind the "lesser image quality" and would get something like a 6900 XT and play at 4K with FSR (which is fine), but with DLSS it would be hard for me to go back, since the image quality is slightly better at 4K.

Now, I do like AMD (I even got called an AMD fanboy on these forums) from when I was defending AMD when it had the best price/performance. But right now it's clear that if you want to play at 4K with max settings, ray tracing and the "better image quality", Nvidia seems to be the only choice. Having said that, I am not going to discourage others from getting AMD even if they want to play at 4K, as long as they are okay with a slight degradation in image quality when using FSR.

I will definitely be open to AMD in the future if they can get their image quality up to par with Nvidia, if the rumors of AMD using ML/AI for future image upscaling come true.

I feel like we are back to the late '90s/early 2000s, when different vendors had different "image quality" in games with their GPUs, until we got to the point in the later 2000s where image quality was pretty much the same between AMD and Nvidia.

We may get to that point where all the GPU vendors have reached image quality parity in the future, since DLSS/XeSS/FSR are still in their infancy.

As for the tax for the slightly better image quality, probably $200 max. I kind of "cringe" at the thought, since that's how much I used to pay for whole GPUs (my XFX 7600 GT cost $200), so I am paying an extra $200 just for image quality alone.

Which goes back to Nvidia. As much as I hate to say it, they really seem to know their customers quite well, hence they are charging the extra $200 over the 7900 XTX, since they know they have the slightly better feature set with DLSS 3 and frame generation. That's only for the high end, which already has a good amount of VRAM. For the mid-range (less than $500, which I find kind of ridiculous, since $500 would have gotten you a high end GPU 10 years ago), things become muddier. It would be hard for me to justify getting something like an 8GB card from Nvidia in 2023 when I could spend the same money and get something like a 12GB GPU from AMD. In that case I would take the "lesser image quality", since I would want to future proof my GPU with the extra VRAM.

#95 Xtasy26
Member since 2008 • 5582 Posts
@Xplode_games said:
@tribesjah said:

@osan0: I feel like the price difference very much depends on which AIB card you are looking at, but I would argue that the price difference is still more around the $200-$250 mark (which at that point, harder to choose between the two).

For example, one of the cheaper 7900 XTX (Sapphire Pulse) can be had for $900 (and recently you could even get an XFX or an Asrock variant, granted I would probably avoid the Asrock one, for 860ish). On the 4080 side the cheapest ones are still in the 1100-1150 range (Zotac and PNY). I know this just cause I had the same debate (4080 vs 7900), decided to go with 7900 due to better raster performance (and, to a lesser extent, to not support Nvidia and their overpricing practices, this is my first AMD card ever tbh). I think with a $100 difference the choice is less clear (raster vs DLSS and RT), but at 200+ I think the 7900 still makes more sense (granted I may be biased).

I agree with you. If they were both priced the same, then one could argue the differences in VRAM vs RT and DLSS vs FSR, however with massive price differences, they don't directly compare.

I will add that some consumers might want to look at the 7900 XT which still has 20GB VRAM because those are going for around $700 on sale. That's not a hype price, you can find them for around that on a deal. That is a damn good price point if you don't mind losing some performance and VRAM to the XTX.

$700 for a 7900 XT with 20GB of VRAM is a damn good price.