Xtasy26's forum posts


#1 Xtasy26
Member since 2008 • 5582 Posts

@uninspiredcup said:

This is very interesting. Technically Quake 3 already connected Doom/Quake 1/2, but that was more of a throw-shit-at-the-wall affair.

The new Quake II chapter explicitly connects Quake I-II.


That's pretty cool. Do you think they are trying to form a connection to set up Quake 5? There are rumors going around that id is working on another Quake game.


#2  Edited By Xtasy26
Member since 2008 • 5582 Posts

Looks like a great remaster. I have the Quake 1 remaster; I beat the original Quake about 25 years ago, but even with the remaster, Quake 1's graphics looked dated. With Quake 2 it's different: basically anything after 1998, when second-generation GPUs like the Voodoo 2 started to hit the market, is when graphics started to get significantly better, and Quake 2 was one of the first games to showcase second-gen GPUs. I remember all the PC hardware and PC gaming magazines benchmarked Quake 2 to death on all the different GPUs at the time; it was the go-to benchmark for GPUs. I don't know why some people still prefer Quake 1 over Quake 2. I will admit Quake 1 was graphically unique when it came out, with its Lovecraftian theme, but even John Carmack stated that Quake 1 was "brown"; every level seemed to look brown with no diversity. Quake 2 took it to another level, and not just graphically. Ironically, I beat Quake 2 exactly 20 years ago, in the summer of 2003.

Back to the remaster: I like that they kept the original look and feel with better textures, lighting, etc., without deviating too much from the ambiance and look of the original. My type of remaster. I think changing a game too much takes it away from the original art style and ambiance. I started playing the expansion The Reckoning about 7 years ago with a mod to run it at 1080p but stopped at Unit 1. Perfect timing to go back and play the expansions with better graphics. I wonder how much of a difference it really makes for the expansions, since those were the ones I didn't play through. The new expansion made by MachineGames looks good too.

It's a sad state of gaming when I am more excited about a 25-year-old remaster than about newer AAA games.


#3 Xtasy26
Member since 2008 • 5582 Posts
@Xplode_games said:
@tribesjah said:

@osan0: I feel like the price difference very much depends on which AIB card you are looking at, but I would argue that the difference is still more around the $200-$250 mark (at which point it's harder to choose between the two).

For example, one of the cheaper 7900 XTX cards (Sapphire Pulse) can be had for $900 (and recently you could even get an XFX or an ASRock variant for around $860, though I would probably avoid the ASRock one). On the 4080 side, the cheapest ones are still in the $1,100-1,150 range (Zotac and PNY). I know this just because I had the same debate (4080 vs 7900) and decided to go with the 7900 due to better raster performance (and, to a lesser extent, to not support Nvidia and their overpricing practices; this is my first AMD card ever, tbh). I think with a $100 difference the choice is less clear (raster vs DLSS and RT), but at $200+ I think the 7900 still makes more sense (granted, I may be biased).

I agree with you. If they were both priced the same, then one could argue the differences in VRAM vs RT and DLSS vs FSR; however, with massive price differences, they don't directly compare.

I will add that some consumers might want to look at the 7900 XT, which still has 20GB of VRAM, because those are going for around $700 on sale. That's not a hype price; you can find them for around that on a deal. That is a damn good price point if you don't mind losing some performance and VRAM compared to the XTX.

$700 for a 20GB 7900 XT is a damn good price.
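Just to put those gaps in plain numbers, here's a quick sketch using only the street prices quoted in this thread (actual prices obviously move around, so treat these as snapshots):

```python
# Price-gap math using the street prices mentioned in this thread.
prices = {
    "RX 7900 XT (on sale)": 700,
    "RX 7900 XTX (Sapphire Pulse)": 900,
    "RTX 4080 (cheapest AIBs)": 1100,
}

baseline = prices["RX 7900 XTX (Sapphire Pulse)"]
for card, price in prices.items():
    gap = price - baseline
    print(f"{card}: ${price} ({gap:+d} vs. the $900 XTX, {gap / baseline:+.0%})")
```

At a roughly 20% premium for the 4080, the raster-per-dollar case for the 7900 cards is easy to see; the question is just how much you personally value DLSS and RT.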


#4  Edited By Xtasy26
Member since 2008 • 5582 Posts
@osan0 said:
@Xtasy26 said:
@osan0 said:

Out of curiosity: how much weight do you put on DLSS for your purchasing decision? A nice-to-have, or "DLSS support or it's not an option"? Personally I file things like FSR and DLSS under nice-to-have. Something that will swing the decision one way if all other things are equal. I don't consider them to be critical features.

As for AMD: yeah, they probably should. I was thinking they were too far behind, but Intel have done a great job with XeSS, and that is on their first try. XeSS, when running on Arc specifically, is very impressive from what I have seen. So yeah... AMD should probably just pull the trigger. They have been making some strange decisions around their GPUs. The lack of RT performance (and the lack of any major improvement in RDNA3) is... odd, too.

That's a good question. Honestly, I didn't put much weight on DLSS. All I knew was that nVidia had better ray tracing and I wanted to play games with ray tracing (yes, I am a graphics w*ore). I got a 4K monitor and there was no way I was going to get decent frame rates in games like Cyberpunk 2077 at 4K with Psycho max graphics settings, even with my 3090. So having DLSS was a must.

The fact of the matter is that 4K at max settings with ray tracing is demanding even for a high-end GPU like the 3090, so having DLSS is a must. In certain games you can get away without it; I played Forza Horizon 5 at 4K max without DLSS and without issues. But once you start using ray tracing, DLSS is a must. I noticed that with Control too: it was pretty laggy in certain parts without DLSS, having tested it myself on the 3090.

Intel hired the guy who helped develop DLSS at nVidia when they started building their GPU division, hence the good quality of their XeSS upscaling. AMD should have hired him. My guess is that they thought they didn't need him and that their own upscaling with FSR was "good enough".

Nothing wrong with being a graphics wh*re. I'm also one at times... just... er... on a tighter budget. But I'm the dedicated "I'll play games at 30FPS on a PC to get the best visual settings" type of graphics wh*re... real commitment to the cause :P.

But does it have to be DLSS specifically? Hypothetically, if CP2077 only supported FSR, would it be a case of "Nope. Not playing it. Not happening. Unacceptable"? Is it essential that the upscaling be DLSS specifically?

And looking at hardware purchasing decisions: you own a 3090, so I'm guessing you are very much the "best of the best at whatever cost" type of PC gamer (which is grand; always fun to do, finances willing). But say finances were tighter. Say $500 was the most you could spend on the GPU. How much of a tax would you be willing to pay for DLSS specifically?

Actually, I wouldn't consider myself the "get the best GPU at whatever cost" type. I only got it since I had been on 1080p since 2011, starting with an HD 6950 BIOS-flashed to an HD 6970 all the way to a GTX 1060 6GB (10 years at the same resolution; time to upgrade, no?). This was my first really, really high-end GPU, at $1,500; I had never spent over $1,000 on a GPU before. I was mostly an AMD user from 2008-2017, basically from the HD 4870 to the R9 390X, before switching to nVidia with the GTX 1060 6GB. I would go for the most bang for the buck, hence AMD, since AMD would get close to or match nVidia's performance in traditional rasterization at a lower price.

The issue is that since I went to 4K with ray tracing, every frame counts. Considering that I wanted the best image quality with the best frame rates, nothing short of a 3090 was an option. Now, I know some people wouldn't mind the lesser image quality and would get something like a 6900 XT and play at 4K with FSR (which is fine), but with DLSS it would be hard for me to go back, since the image quality is slightly better at 4K.

Now, I do like AMD (I even got called an AMD fanboy on these forums when I was defending AMD while it had the best price/performance). But right now it's clear that if you want to play at 4K with max settings, ray tracing, and the better image quality, nVidia seems to be the only choice. Having said that, I am not going to discourage others from getting AMD, even at 4K, as long as they are okay with the slight degradation in image quality when using FSR.

I will definitely be open to AMD in the future if they can get their image quality up to par with nVidia, especially if the rumors of AMD using ML/AI for future image-upscaling updates come true.

I feel like we are back to the late '90s and early 2000s, when different vendors had different "image quality" with their GPUs in games, before we got to the point in the later 2000s where image quality was pretty much the same between AMD and nVidia.

We may yet get to the point where all the GPU vendors reach image-quality parity, since DLSS/XeSS/FSR are still in their infancy.

As for the tax for the slightly better image quality, probably $200 max. I kind of cringe at the thought, since that's how much I used to pay for entire GPUs (my XFX 7600 GT cost $200), so I am paying an extra $200 just for image quality alone.

Which goes back to nVidia. As much as I hate to say it, they really seem to know their customers quite well, hence they are charging an extra $200 over the 7900 XTX: they know they have the slightly better feature set with DLSS 3 and frame generation. That's only for the high end, which already has a good amount of VRAM. For the mid-range (less than $500, which I find kind of ridiculous, since $500 would have gotten you a high-end GPU 10 years ago), things become muddier. It would be hard for me to justify getting something like an 8GB card in 2023 from nVidia when I could spend the same money and get something like a 12GB GPU from AMD. In that case I would take the lesser image quality, since I would want to future-proof my GPU with the extra VRAM.
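For anyone curious what the upscalers are actually doing when they make 4K with ray tracing playable, here's a rough sketch of the internal render resolutions. The scale factors are the commonly cited ones for DLSS 2's quality modes (FSR 2 uses very similar ones); treat the exact numbers as approximate.

```python
# Rough sketch: internal render resolution for a 4K output at the commonly
# cited DLSS 2 scale factors. The card renders at the lower resolution and
# the upscaler reconstructs the image back up to 4K.
OUTPUT_W, OUTPUT_H = 3840, 2160

SCALE_FACTORS = {
    "Quality": 0.667,
    "Balanced": 0.58,
    "Performance": 0.50,
    "Ultra Performance": 0.333,
}

for mode, scale in SCALE_FACTORS.items():
    w, h = round(OUTPUT_W * scale), round(OUTPUT_H * scale)
    pixel_share = (w * h) / (OUTPUT_W * OUTPUT_H)
    print(f"{mode:>17}: {w}x{h} (~{pixel_share:.0%} of the 4K pixel count)")
```

That huge drop in shaded pixels is why ray tracing becomes playable at 4K in the first place, and it's also why the reconstruction quality (DLSS vs FSR vs XeSS) matters so much.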


#5 Xtasy26
Member since 2008 • 5582 Posts

@last_lap said:

@Xtasy26: Because I'm a "consoler" I don't know about anything lol.

You celebrating a component doesn't surprise me, because Hermits are weird and care more about how many frames and what resolution games run at than about the games themselves. So celebrating a GPU is weird af, but expected.

Yeah, this "component" is more powerful than the Xbox 360 and the PS3. Seems like you don't get it, even though the other PC gamers in here pointed out the significance. It was more powerful than those two consoles' GPUs combined... LMAO. Yes, hermits care about frames and resolution; that's why we shell out the money for it, to get, you know, a better experience.


#6  Edited By Xtasy26
Member since 2008 • 5582 Posts
@osan0 said:
@Xtasy26 said:
@osan0 said:

It kinda makes sense on the surface but, with a bit more thought, it really doesn't. It's no secret that DLSS is superior to FSR. This is known; it's well covered and established. There has been nothing to indicate that FSR 2 has caught up. This is not to say FSR is bad. I use it myself in a couple of games and it's fine... though I am probably using it under the best conditions (FSR Quality at 4K). But DLSS is better (though by no means perfect).

So what does cutting DLSS out of Starfield actually get AMD except bad press? There is no win here. Developers won't be any more or less encouraged to use FSR in their games anyway. It still makes sense to implement FSR since it runs on everything, including consoles, so the tech still has value. It's also designed so that any game with DLSS support can easily implement FSR 2, and the reverse is also true.

RTG's marketing department is in a tough spot, in fairness. The reality is that Radeon is still behind on the software and hardware fronts. They have made good strides on the hardware front since Vega, in fairness. But there are still weaknesses. Then there is a lot of work to do on the software side too, especially in the areas that are not gaming.

But there have to be better ways of using the deal than this, surely. RTG's marketing department really does need an overhaul.

You are right, FSR at 4K is pretty good. I don't know if it's enough for me to switch from nVidia to AMD, though.

AMD really needs to come up with something like nVidia's use of Tensor Cores for an AI implementation of upscaling/supersampling. There were rumors that AMD was working on such things for their future GPUs. FSR is great but not quite DLSS, though some may argue the difference isn't noticeable. Having used DLSS at 4K, I was very impressed by the quality. It would be hard for me to switch back to AMD.

Out of curiosity: how much weight do you put on DLSS for your purchasing decision? A nice-to-have, or "DLSS support or it's not an option"? Personally I file things like FSR and DLSS under nice-to-have. Something that will swing the decision one way if all other things are equal. I don't consider them to be critical features.

As for AMD: yeah, they probably should. I was thinking they were too far behind, but Intel have done a great job with XeSS, and that is on their first try. XeSS, when running on Arc specifically, is very impressive from what I have seen. So yeah... AMD should probably just pull the trigger. They have been making some strange decisions around their GPUs. The lack of RT performance (and the lack of any major improvement in RDNA3) is... odd, too.

That's a good question. Honestly, I didn't put much weight on DLSS. All I knew was that nVidia had better ray tracing and I wanted to play games with ray tracing (yes, I am a graphics w*ore). I got a 4K monitor and there was no way I was going to get decent frame rates in games like Cyberpunk 2077 at 4K with Psycho max graphics settings, even with my 3090. So having DLSS was a must.

The fact of the matter is that 4K at max settings with ray tracing is demanding even for a high-end GPU like the 3090, so having DLSS is a must. In certain games you can get away without it; I played Forza Horizon 5 at 4K max without DLSS and without issues. But once you start using ray tracing, DLSS is a must. I noticed that with Control too: it was pretty laggy in certain parts without DLSS, having tested it myself on the 3090.

Intel hired the guy who helped develop DLSS at nVidia when they started building their GPU division, hence the good quality of their XeSS upscaling. AMD should have hired him. My guess is that they thought they didn't need him and that their own upscaling with FSR was "good enough".


#7  Edited By Xtasy26
Member since 2008 • 5582 Posts
@osan0 said:
@R4gn4r0k said:
@osan0 said:
@R4gn4r0k said:

There was some space game on Steam. I can't remember the name, as there are so many space games, but I remember it being a first-person shooter made in UE4.

And the moment they announced they were partnering with AMD, they made a statement on Steam saying they would no longer be able to bring DLSS to their game.

I think that right there is pretty much confirmation that AMD asks developers it partners with to not support DLSS.

Yeah I saw that in the Hardware Unboxed Vid.....that's a really bad look for AMD indeed.

It kinda makes sense as AMD doesn't want DLSS to outshine their proprietary tech in their sponsored games.

But I still wish the practice would come to a halt, starting with Starfield.

And I wish their AMD sponsorship would go in a different direction.

It kinda makes sense on the surface but, with a bit more thought, it really doesn't. It's no secret that DLSS is superior to FSR. This is known; it's well covered and established. There has been nothing to indicate that FSR 2 has caught up. This is not to say FSR is bad. I use it myself in a couple of games and it's fine... though I am probably using it under the best conditions (FSR Quality at 4K). But DLSS is better (though by no means perfect).

So what does cutting DLSS out of Starfield actually get AMD except bad press? There is no win here. Developers won't be any more or less encouraged to use FSR in their games anyway. It still makes sense to implement FSR since it runs on everything, including consoles, so the tech still has value. It's also designed so that any game with DLSS support can easily implement FSR 2, and the reverse is also true.

RTG's marketing department is in a tough spot, in fairness. The reality is that Radeon is still behind on the software and hardware fronts. They have made good strides on the hardware front since Vega, in fairness. But there are still weaknesses. Then there is a lot of work to do on the software side too, especially in the areas that are not gaming.

But there have to be better ways of using the deal than this, surely. RTG's marketing department really does need an overhaul.

You are right, FSR at 4K is pretty good. I don't know if it's enough for me to switch from nVidia to AMD, though.

AMD really needs to come up with something like nVidia's use of Tensor Cores for an AI implementation of upscaling/supersampling. There were rumors that AMD was working on such things for their future GPUs. FSR is great but not quite DLSS, though some may argue the difference isn't noticeable. Having used DLSS at 4K, I was very impressed by the quality. It would be hard for me to switch back to AMD.


#8 Xtasy26
Member since 2008 • 5582 Posts

@04dcarraher said:

Also, I would like to suggest that the recent increase in games hitting the 8GB VRAM limit has one common thread... these are newer AMD-sponsored titles (Star Wars Jedi: Survivor, Resident Evil 4, The Last of Us, etc.). Is AMD purposely influencing developers to inflate VRAM requirements to make their GPUs look better? Is AMD holding back RT effects in sponsored games to cater to their shortcomings? AMD goaded Nvidia over VRAM ahead of the RTX 4070 launch... All these things seem a bit fishy, and hypocritical: supporting "open source" yet cutting/limiting features.

Nvidia really shouldn't have any excuses for being stingy with certain products at the prices they are charging. The RTX 4070 having 12 GB is laughable at its price. When I played Cyberpunk at 4K with Psycho graphics and everything set to max on my RTX 3090, it easily took up 12-13 GB. That's one thing that's annoying about nVidia: they are trying to protect their margins while handicapping otherwise great graphics cards.

My GTX 1060 6GB could easily have played Red Dead 2 maxed out at 1080p, but it stutters compared to the RX 580 because the 6GB buffer gets full in certain scenes, and it has lower 1% frame rates than the RX 580, which has 8GB. Granted, this didn't affect me since I got an RTX 3090, but it's another example.
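If anyone wants to check this kind of thing on their own card, a quick way (on nVidia GPUs, assuming nvidia-smi is installed with the driver and on the PATH) is to poll VRAM usage while the game runs. Note that it reports the total memory allocated on the GPU, not just the game's share, so treat it as a ballpark.

```python
# Minimal sketch: poll total VRAM usage via nvidia-smi while a game is running.
# Assumes an nVidia GPU with nvidia-smi available (it ships with the driver).
import subprocess
import time

def vram_used_mib():
    """Return (used, total) VRAM in MiB for the first GPU, per nvidia-smi."""
    out = subprocess.check_output(
        [
            "nvidia-smi",
            "--query-gpu=memory.used,memory.total",
            "--format=csv,noheader,nounits",
        ],
        text=True,
    )
    used, total = out.strip().splitlines()[0].split(", ")
    return int(used), int(total)

if __name__ == "__main__":
    peak = 0
    while True:  # sample every 5 seconds; Ctrl+C to stop
        used, total = vram_used_mib()
        peak = max(peak, used)
        print(f"VRAM: {used}/{total} MiB (session peak {peak} MiB)")
        time.sleep(5)
```

Running something like that in the background during the heavy scenes is how you catch the moments where an 8GB or 12GB buffer fills up and the 1% lows fall off a cliff.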

@pclover1980 said:
@04dcarraher said:

Also, I would like to suggest that the recent increase in games hitting the 8GB VRAM limit has one common thread... these are newer AMD-sponsored titles (Star Wars Jedi: Survivor, Resident Evil 4, The Last of Us, etc.). Is AMD purposely influencing developers to inflate VRAM requirements to make their GPUs look better? Is AMD holding back RT effects in sponsored games to cater to their shortcomings? AMD goaded Nvidia over VRAM ahead of the RTX 4070 launch... All these things seem a bit fishy, and hypocritical: supporting "open source" yet cutting/limiting features.

I feel like this is a bit too tinfoil-hatty for me, but it may be the case too. Or devs are really, really just lazy. Looking at TLOU1 on PC, there's a noticeable difference in VRAM usage between launch and the latest patch. Same with Forspoken.

Forspoken is known for its piss-poor optimization.

Having said that, it would be really dumb not to include DLSS in Starfield, since it's superior to FSR.


#9 Xtasy26
Member since 2008 • 5582 Posts

@horgen said:

Competition and reasonable prices. Wasn't the Nvidia GTX 460 also somewhat in the same spot? Just years later…

The GTX 460 was good, but the HD 5850/HD 5870 were the better overall GPUs, since you got good performance with good power consumption and they ran cooler. The GTX 400 series was notorious for its heat and power draw; it became a meme on the web, with people calling it "Thermi", poking fun at the series' codename, "Fermi".


#10 Xtasy26
Member since 2008 • 5582 Posts
@Postosuchus said:

Ahh, the 4870... my first ever high-end GPU, and my cheapest too, at $180 in early 2009. Prior to that I had only had "budget" (but still expensive) Nvidia stinkers like the FX 5200 and the mediocre 6600 GT, but the 4870 opened up a whole new world of maximized graphics, framerates, and 1200p resolution. It completely blew away anything the PS360 could do as well.

The only comparably good time I can think of in terms of price/performance was the start of the Xbone/PS4 era. You could get a 7870 (or Nvidia equivalent) that solidly outperformed "the most powerful console to ever have existed" for half the price.

Let's hope days like these can return one day, whether it be from Intel finally catching up to the other two, AMD pulling their heads out of their asses, or Nvidia taking a financial clobbering from an AI bubble burst.

It was my first high-end GPU too; I got my HD 4870 for $200, and I had also been stuck with mid-range or budget GPUs before that, the last one being the 7600 GT. It destroyed the PS3 and Xbox 360, especially since consolers were hyping the "Cell" processor as if it were the best thing since sliced bread. The HD 4870 could push 1080p in certain games while the Xbox 360 and the PS3 were still stuck at 720p. :P

I have doubts we will get back to those days. AMD seems comfortable selling GPUs at prices that protect their profit margins; they don't seem to care about cutting prices to win market share like they did with the HD 4800 series. They reached up to 40% GPU market share back then; now they are at half that. The only thing I can see helping is Intel pushing them. It's essentially a duopoly now.