Xtasy26's forum posts

#1  Edited By Xtasy26

@pyro1245 said:

They need to improve the machine learning features. Rasterization is great, but there are limits to what devs can reasonably accomplish in a given power budget. I want to see them match Nvidia in this area.

Seems like they could improve their drivers too. I grabbed a used 6800 xt recently because I didn't want to jump into nvidia's 4xxx cards late into the generation with the prices still as stupid as they are. Been pretty happy with it, aside from some system stability issues (used card, sample size of one, so who knows).

The 6800 XT is a great choice! Great bang for the buck in the mid-end.

#2 Xtasy26
@Mozelleple112 said:

2023 is definitely one of the best years ever, but it is not on the same level as:

S tier: 1998, 2001, 2004

It is however on the same level as:

A tier: 2008, 2010, 2011, 2020 and 2013

Which makes it better than:

B Tier: 2022, 2018, 2017, 2007, 2009, 2000 and 2015

A lot better than:

C tier: 2019, 2006, 2005, 2003, 2002, 1999, 1997, 1996, 1995, 1994, 1993

And then we have the trash years:

D tier: 2021, 2014 and 2012. The three years in which not a single game deserved GOTY.

2007 belongs in S tier. Otherwise I mostly agree on the S-tier picks, especially 1998 and 2004, the latter being the greatest year of the past 25.

#3 Xtasy26

@ghosts4ever said:

It's a decent year, not the best.

Didn't care about BG3.

RE4-R is probably the best third-person action game I've played since Max Payne and Mafia 1, and probably the best third-person action horror game.

Robocop is fantastic: an amazing sleeper hit.

Cyberpunk went from failure to one of the best RPGs. An RPG with compelling gameplay mechanics is very rare.

Atomic Heart was also OK.

Agreed, Robocop was a surprise. Cyberpunk and its expansion surprised me too; CD Projekt Red righted a wrong. I didn't expect them to add a third ending. I was somewhat disappointed with the ending of Cyberpunk.

#4  Edited By Xtasy26
@Pedro said:

Only fanboys care about dumb stuff like this.😂 That is why System Wars is a relic.😲

No. We want competition and better prices. Why is it a bad thing to want competition for consumers at both the mid-end and the high-end, and to look at what AMD needs to improve?

#5 Xtasy26
@blaznwiipspman1 said:

@Xtasy26: all good, man. I've never bought Nvidia, for a few reasons, mainly because I didn't see the price-to-performance benefit. Nvidia cards felt like a scam and still do.

Of course, ray tracing is a cool but gimmicky feature, and it's true that last-gen AMD 6000-series GPUs weren't so good at it. The new 7000-series GPUs are decent, and capable enough to be on par with the GeForce 3000 series. I'd even say that FSR is fairly good, good enough that it holds its own against DLSS.

Personally though, my experience with Intel has been interesting. The card stopped working for a few months, and I figured out the reason after I got a bit serious about trying to fix it.

Also, the price I paid was a bit higher than an even better AMD GPU, which kind of sucked, but I stood by my ideals and bought the Intel Arc A770. After all is said and done, it's a pretty decent card. The ray-tracing performance is even better than Nvidia's in some cases, which is mind-boggling. Of course, some games do better than others; Hogwarts, for example, runs amazingly on the Arc A770. I get 30 to 40 fps with ray tracing on high, at 4K resolution. I'm excited to see what the next-gen Intel GPUs are capable of.

It's good to hear that Intel is doing well in ray tracing. The approach Intel took with XeSS was smart: they basically hired the guy who developed DLSS to build it. No wonder it's beating FSR in the image quality department. It's going to be a tough hill to climb with all the driver optimization needed, especially for older games. Apparently they improved performance in Assassin's Creed Unity by something like 100%, but that's a game from about 10 years ago; it goes to show how much work they have ahead of them going back through titles that are 10-12+ years old. I was really looking forward to the A770 when it released, because for the first time in 20+ years, since 3DFX's demise, I felt we might have another viable competitor.

Looking forward to what Intel has with Battlemage. My suggestion is to ship a single die, like AMD did when they launched the HD 4800 series with the HD 4850/4870: release one die, focus heavily on optimization, and double down on the drivers. Do the same for Celestial, then go for the high end with Druid. That would let Intel focus on one chip at a time instead of several.

Can't believe I am rooting for Intel. :P

#6 Xtasy26
@truebond said:

Drivers should be ready at launch, more video memory, send engineers to AAA game studios to help get better performance on AMD GPUs, and undercut Nvidia's prices by at least 20%.

AMD's reluctance to send engineers to AAA developers is something that has bothered me, and it goes back 15-20 years. Nvidia, by contrast, acts like it really cares about its customers, bending over backwards to make sure games run best on its hardware. When I finished No One Lives Forever 2, I was surprised to find 3-4 Nvidia people in the credits versus one from ATI. That was 20 years ago! It means Nvidia provides developers whatever resources it takes to make sure AAA games work best on its GPUs. I even read about Nvidia sending engineers to help Crytek make Crysis look its best. That's what I call commitment! I don't think AMD's pricing is necessarily bad, but working with AAA developers from the beginning, instead of trying to fix things later with driver updates, is another area where Nvidia is one step ahead of AMD.

#7 Xtasy26

@mrbojangles25 said:

I'm kind of curious why they even need to take "the crown" to begin with.

Isn't "good enough" exactly that? Good enough?

If they're profitable and showing growth, then what's the big deal?

Good question. Let me explain. Before, I could get a Radeon HD 4870, my first ATI/AMD GPU, and get close to 70-75% of Nvidia's performance for half the price. With my last AMD GPU, the R9 390X, I got close to GTX 980 performance for less money and the same image quality.

Now, after jumping to 4K, I don't really have an alternative that delivers both the performance AND the image quality. Before, AMD could get away with being "good enough" because image quality wasn't a differentiator; theirs was as good as Nvidia's, if not better.

Now, with ray tracing and maxed-out 4K settings, we really need something like DLSS, and AMD's FSR 3 lags behind DLSS in both performance and image quality. If FSR 3 matched DLSS's picture quality, I would be more than willing to save the $500 and go with a 7900 XTX over an RTX 4090. But that isn't the case.

I agree with you to an extent: as long as they are profitable and growing, it's fine. But Nvidia winning generation after generation, especially with features such as frame generation, DLSS, and ShadowPlay, is hurting AMD's growth, even if AMD is competitive in the mid-end with cards like the RX 7800 XT.

Back in the 4850/4870 days, AMD still grew without the performance crown; market share jumped to 40%. Now AMD has around 17%, less than half of what they had 15 years ago, and it has sat there for 8 years. Consecutive generations of missing the performance crown are hurting their image, I would argue, especially against the feature set Nvidia is offering.

Lastly, "winning the crown" has a psychological affect on consumers. They perceive that it's the "better" brand even if that may not be 100% true which equals to better sales and market share. We saw that with the 9700/9800 generation 20 years ago. ATI started to change their perception of playing second fiddle to NVIDIA. In the next generation after the 9700/9800 with the X800 XT PE vs the GeForce 6800 Ultra which was released in 2004 close to 20 years ago is when we had a lot of PC gamers finally switch to ATI. In fact that's whey AMD crossed the 50% threshold mark and actually took over the market share and became the #1 GPU maker in the world.

Also, shouldn't AMD offer something to gamers like me who made the jump to 4K and want to play every new game maxed out? I know I may be a niche case, having moved up from the mid-end, so it may not matter to AMD. But as someone who supported AMD for nearly 10 years, shouldn't they provide something for someone like me?

AMD needs this now more than ever. Nvidia is running away with almost 80% market share for the reasons above and many more. Look what happened once AMD started taking the performance crown from Intel with Ryzen: a lot of gamers switched, which brought a significant influx of cash to AMD.

#8 Xtasy26

@hardwenzen said:

When the whole gaming industry has been edging for two years because of COVID, of course the releases are strong. So to answer your question, yes, 2023 has been great.

True. There was a backlog of games that had been delayed or were waiting to be released.

#9  Edited By Xtasy26

@osan0 said:

It's going to be very tough to compete at the top end. It's not just about having the most powerful GPU anymore; it's that combined with a bunch of extra features, and Nvidia are just doing that better. AMD need to not only deliver the hardware, but also put developers on contributing to Blender and work with other app devs to make AMD hardware a first-class citizen. When people are willing to shell out 1000+ bucks, paying an extra 200-300 for better upscaling, better video encoding, better AI acceleration, better RT perf, better productivity app support and so on is not a big deal in that market.

They also need to sort their marketing out: it's shockingly poor. E.g. the 7800 XT is a fine GPU for its MSRP (though prices have gone up a bit in some regions and are too close to the RTX 4070). However, it should have been called the 7800. If someone has a 6800 XT and upgrades to a 7800 XT... well... it's not much of an upgrade at all. But its MSRP is more in line with the 6800. Just shooting themselves in the foot again (not that I would recommend upgrading from a 6800 either; the jump is too small). It's just another little blunder in a long line of blunders.

They still need to improve the image quality of FSR. It can be done: Intel has done it with XeSS, which, at 1.2, is actually quite good even on the DP4a path now. Ghosting is its biggest issue, and hopefully that gets fixed. FSR 3's frame gen is actually quite good; most image quality issues are garbage-in, garbage-out problems rather than flaws in the frame gen itself. Using async compute was quite clever. Nvidia users should be knocking on Nvidia's window and asking for the same option for DLSS.

So if they can knock off the rough edges with the frame gen and improve the base output a bit more, I think it will actually be in a pretty good place. They should also work on making it look better on 1440p displays specifically.

In the short-to-mid term they also need to look at the low and mid tiers of the market, I think. At the $100-399 price points, top-class features are less expected; raster perf is still the priority. So, like with the RX 480, if they can come up with a compelling offer at 200, 300 and 400 bucks, that could interest many. They have been focusing a lot on packaging, which doesn't really benefit us directly, but it brings the BOM down: that's the only reason to do it, after all.

A proper console killer GPU at 200-250 could be quite enticing.

It's hard to see where they can steal a march though at the moment. Intel are gunning for market share so they could be looking to undercut AMD and Nvidia while offering a very compelling package. Their drivers are not quite on the money yet but, by all accounts, they are improving rapidly. So it's going to be interesting to see how it plays out over the next couple of years.

Good write-up. :)

Agreed on the first point. They need to work with developers and bring other apps up to par. I have heard so many stories of people who WANT to buy AMD, but some program doesn't work as well due to missing support, and I'm not just talking about gaming. For example, an AI developer recently noted that people wanted to use AMD hardware for AI, but AMD's OWN compute platform, ROCm (their answer to CUDA), doesn't even support some AMD GPUs. Like, WTF? Whereas you can pick up pretty much any Nvidia GPU and get CUDA to run. That's a lost sale from someone who may have wanted to both game and use their GPU for development work.
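To make that gap concrete, here is a minimal sketch (my illustration, not from the original post) of why "any Nvidia GPU" tends to just work: frameworks like PyTorch expose the same `torch.cuda` API on both CUDA and ROCm builds, so the check below passes on virtually any recent GeForce card, while on Radeon it only passes if ROCm actually supports that specific GPU.

```python
# Minimal sketch: probing for a usable GPU backend in PyTorch.
# On ROCm builds of PyTorch the CUDA API is reused, so torch.cuda.*
# covers both vendors; whether it succeeds on AMD depends on ROCm
# supporting the installed card.
import torch

if torch.cuda.is_available():
    # Passes on most Nvidia cards with CUDA installed, but only on
    # the subset of AMD cards that ROCm officially supports.
    device = torch.device("cuda")
    print(f"Using GPU: {torch.cuda.get_device_name(0)}")
else:
    # The failure mode described above: capable AMD hardware
    # falling back to CPU because ROCm doesn't support it.
    device = torch.device("cpu")
    print("No supported GPU backend; falling back to CPU.")

x = torch.randn(1024, 1024, device=device)
y = x @ x  # runs on the GPU if one was found, otherwise on the CPU
```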

And you're definitely right about the high end. People will easily spend an extra $200+ on $1000+ GPUs if they get better image quality with DLSS 3.5 and other features. Speaking of features, I know first-hand that to record gameplay on AMD I had to download third-party bulls*** like Raptr, leave it running in the background, and set it up to record. When I switched to Nvidia with the GTX 1060 6GB, I pressed a couple of buttons and bam, ShadowPlay starts recording. A lot of YouTube streamers who might want to switch to AMD can't, due to the lack of a ShadowPlay-like feature; that's more lost sales. On upscaling, AMD should have hired the engineer who worked on DLSS, the one Intel hired to develop XeSS, which is pretty good. They really need to double down on AI/ML-based image upscaling. When you are spending over $1000, little things like that matter.

And don't get me started on their marketing; that's another joke in itself. ATI's marketing 20 years ago, around the 9700 Pro, was stellar: they really knew how to sell themselves and pivoted into an "enthusiast" brand. It has gone downhill over the last 10 years.

On pricing: from what I hear, AMD will not chase the high end with RDNA 4. It will focus mostly on the $200-$400 segment (similar to RDNA 1). They apparently had a very ambitious architecture to compete with Blackwell, but it was shelved so they could focus on the mid-range. That ambitious architecture is expected in RDNA 5, by which point they hope to reach parity with DLSS's AI/ML image upscaling. That makes sense to me: there's no point releasing an expensive $1000 GPU when FSR 3 still has unresolved image-quality issues and feature gaps. Gamers may still buy Blackwell if AMD hasn't reached parity with DLSS 3.5 or 4 or whatever Nvidia ships with Blackwell.

#10 Xtasy26
@Litchie said:

AMD kinda sucks. They should double down on "not as good as Nvidia, but cheaper" instead of just making worse GPUs for the same price as Nvidia.

I see nothing wrong with that. AMD did it with the HD 4800 series: they captured the $200-$300 segment, which let them grab market share, reaching up to 40%, double what they have now. I mean, why would I buy something that costs twice as much when I can get 70-75% of the performance at half the price?

From 2008-2013, AMD was killing it from the low end to the mid-end to the high end. Only in the last 10 years have they stagnated at the high end, while staying competitive in the mid-end.

Right now, though, I would add that they need more than just lower prices: they need features on par with DLSS 3.5, with image quality that matches or gets "close" to Nvidia's, on top of the price advantage.

I don't understand why they aren't playing the price angle like they did with the HD 4800 through HD 6000 series. It seems like Lisa Su wants to maintain "margins," but at the cost of losing gamers and, in the long run, mind-share. How many of those users will switch back to AMD after using Nvidia for 10+ years?