10 Years since AMD took the graphics crown? What AMD needs to do better going into 2024+?

Xtasy26

Poll 10 Years since AMD took the graphics crown? What AMD needs to do better going into 2024+? (25 votes)

Maintain or improve lead on Rasterization. 0%
Improve Ray Tracing performance. 28%
Improve FSR 3 with better image quality, Frame Generation and use of AI/ML and other features. 36%
Better price from top to bottom. 36%

It was in 2013 that AMD released the R9 290X, which took the graphics crown from NVIDIA. It beat the original Titan at half the price, which was amazing. Over the past 7 years AMD hasn't been competitive at the high end, though it has done well in the mid range: the RX 580 against the GTX 1060 6GB, the RX 5700 XT against the RTX 2070, and the RX 7700 XT and RX 7800 XT doing an excellent job against the RTX 4070 and RTX 4060. But AMD really hasn't beaten NVIDIA since the Hawaii GPU released back in 2013. It was more of a draw with the RX 6900 XT, which beat the RTX 3090 in rasterization but got trashed with Ray Tracing enabled.

Ironically my last AMD GPU was the R9 390X (which was essentially a slightly better R9 290X with 8 GB of memory).

Apparently it can still play some games in 2023 at 1080p depending on the settings, which is insane.

I switched to NVIDIA with the GTX 1060 6GB in 2017 and then got the RTX 3090 in 2021 because I wanted to game at 4K with Ray Tracing. So, I have been away from AMD (other than CPUs) for over half a decade now.

But I would like more competition in high-end gaming. I was hoping that Intel would inject some competition, but the ARC A770 is a JOKE.

The latest discrete GPU market data shows AMD with 17% market share in Q2 2023, Intel with 2%, and NVIDIA with 80%, which is insane. This is basically where they were back in Q2 2015, when they had the mediocre R9 Fury series. In other words, they are the same as they were back in 2015! No progress in 8 years.

So, what does AMD need to do to get back to at least 33% market share, which is where they were for the most part through the 2000s and early 2010s? I think the highest they had was around 38% back in 2014, after the R9 290X and the boom in Bitcoin mining. They haven't reached that share in 10 years now!

In my opinion AMD is killing it in rasterization, but they really need to improve Ray Tracing performance, which goes hand in hand with having features like FSR 3 frame generation with excellent picture quality. They need to use ML/AI to improve image quality like NVIDIA does with DLSS 2/3.5. Yes, they get close in picture quality at 4K with Ray Tracing and FSR 3, but below that NVIDIA has a significant advantage. I am not saying not to get AMD; I will recommend AMD GPUs like the RX 7800 XT, which I think is a better buy, just not necessarily in the high end.

So, what do you guys think?


#1 GhostOfGolden
Member since 2023 • 2588 Posts

This claim is highly debatable and/or wrong. The GTX 1070 beat the Titan Black for less than half the price. I don’t think AMD took the crown from Nvidia at that time…


#2 hardwenzen
Member since 2005 • 39441 Posts

Look at their trash FSR and compare it to DLSS3. This is how much "better" AMD is.


#3 Nirgal
Member since 2019 • 710 Posts

I am waiting for FSR to be competitive with DLSS before buying an AMD graphics card. It's my main requirement.


#4  Edited By Mesome713
Member since 2019 • 7224 Posts

AMD will never be on top, they're copycat clowns.


#5 Juub1990
Member since 2013 • 12620 Posts

It kept the crown all of one month before the 780 Ti came out and reclaimed it. The R9 290X was also infamous for its insanely high temps and loud fans.


#6 PC_Rocks
Member since 2018 • 8501 Posts

They can't. The only way for them to take the crown now is if Nvidia themselves somehow f**k it up and drop the ball.


#7 Mozelleple112
Member since 2011 • 11293 Posts

Only way AMD becomes king is if Nvidia turns into a full on AI card company creating only $3000-$100,000 GPUs for big companies.


#8  Edited By Litchie
Member since 2003 • 34702 Posts

AMD kinda sucks. They should double down on "not as good as Nvidia, but cheaper" instead of just making worse GPUs for the same price as Nvidia.


#9 osan0
Member since 2004 • 17853 Posts

It's going to be very tough to compete at the top end. It's not just about having the most powerful GPU anymore. It's that combined with a bunch of extra features and Nvidia are just doing that better. AMD need to not only be delivering the hardware, but also need to be putting developers on contributing to Blender and working with other app devs to get AMD hardware up to being a first class citizen. When people are willing to shell out 1000+ bucks, paying an extra 200-300 for better upscaling, better video encoding, better AI acceleration, better RT perf, better productivity app support and so on is not a big deal in that market.

They also need to sort their marketing out: it's shockingly poor. E.g. the 7800 XT is a fine GPU for its MSRP (though it has gone up a bit in some parts and is too close to the RTX 4070). However, it should have been called the 7800. If someone has a 6800 XT and upgrades to a 7800 XT... well... it's not much of an upgrade at all. But its MSRP is more in line with the 6800's. Just shooting themselves in the foot again (not that I would recommend upgrading from a 6800 either... the jump is too small). It's just another little blunder in a long line of blunders.

They still need to improve the image quality of FSR. It can be done: Intel has done it with XeSS, which, at 1.2, is actually quite good even on the DP4A path now. Ghosting is the biggest issue with it and hopefully that gets fixed. FSR 3's frame gen is actually quite good. Most image quality issues are garbage-in, garbage-out problems rather than flaws with the frame gen itself. Using async compute was quite clever. Nvidia users should be knocking on Nvidia's window and asking for the same option for DLSS.

So if they can knock off the rough edges with the frame gen and improve the base output a bit more, I think it will actually be in a pretty good place. They should also work on making it look better for 1440p displays specifically.

In the short to mid term they also need to look at the low and mid tier segments of the market, I think. At the $100-399 price point, top-class features are less expected; raster perf is still the priority. So, like the RX 480, if they can come up with a compelling offer at 200, 300 and 400 bucks, that could be of interest to many. They have been focusing a lot on packaging, which doesn't really benefit us directly. But it brings the BOM down: that's the only reason to do it, after all.

A proper console killer GPU at 200-250 could be quite enticing.

It's hard to see where they can steal a march though at the moment. Intel are gunning for market share so they could be looking to undercut AMD and Nvidia while offering a very compelling package. Their drivers are not quite on the money yet but, by all accounts, they are improving rapidly. So it's going to be interesting to see how it plays out over the next couple of years.


#10 rmpumper
Member since 2016 • 2147 Posts

@Mozelleple112 said:

Only way AMD becomes king is if Nvidia turns into a full on AI card company creating only $3000-$100,000 GPUs for big companies.

They might, but that's unlikely. No one wants to pay the extortionate prices for Nvidia's AI GPUs, so a bunch of companies are starting to make their own dedicated hardware.


#11  Edited By blaznwiipspman1
Member since 2007 • 16582 Posts

@Xtasy26: to me the only thing that matters is the price. Nvidia is highly overrated, but I will admit they have fantastic AI technology. That is somewhere AMD can make improvements, on the software end of things too.

We need competition, and if you're a real gamer, you should be doing what I do... support the little guy. I have been using an Intel GPU for the past year, and I hope others do the same.


#12 osan0
Member since 2004 • 17853 Posts

@blaznwiipspman1 said:

@Xtasy26: to me the only thing that matters is the price. Nvidia is highly overrated, but I will admit they have fantastic AI technology. That is somewhere AMD can make improvements, on the software end of things too.

We need competition, and if you're a real gamer, you should be doing what I do... support the little guy. I have been using an Intel GPU for the past year, and I hope others do the same.

While I do agree that competition is good, blindly buying from the little guy is just as bad as blindly buying from the most popular brand.

At the end of the day the paying customer should spend their money on the thing that does what they need it to do best at a given price point.

The "Team Red/Green/Blue" stuff is just complete nonsense. There are no teams. There is no club. There is just the PC. When it comes time to throw down cash, everyone should look at all their options from all vendors and buy the thing that suits them best.

It's up to the different companies to make the case for their product.


#13 Jag85
Member since 2005 • 19644 Posts

A big reason I was turned off AMD a few years ago was the lack of AI Tensor cores. Nvidia's Tensor cores are useful for applications beyond gaming.


#14  Edited By blaznwiipspman1
Member since 2007 • 16582 Posts

@osan0: strongly disagree. You should always buy from the little guy whenever you can. It's nothing like buying from the big guy. Buying from the big guy is the absolute worst thing you can do, and the worst part is you're paying more for it. I see this nonsense out there more and more: people dumb enough to buy something slightly better for double the price, or swayed by gimmicky features like Ray Tracing and DLSS, or whatever BS Apple is peddling nowadays. It all comes down to marketing, just like some moron will buy a shoe with a checkmark on it and spend hundreds of dollars more than on the same shoe without the checkmark. People are dumb mofos, that's all there is to it.

Literally, unless your job depends on Nvidia and there is absolutely NO other viable option from the other manufacturers, that's really the only time you should support Nvidia.

As a true gamer, I look out for my fellow gamers and I've been buying video cards from Intel. I'll continue to do so next gen. If they overtake AMD, I'll buy from AMD. If Nvidia tanks, I'll buy Nvidia cards. This is what it means to be a gamer.


#15  Edited By Xtasy26
Member since 2008 • 5582 Posts
@ghostofgolden said:

This claim is highly debatable and/or wrong. The GTX 1070 beat the Titan Black for less than half the price. I don’t think AMD took the crown from Nvidia at that time…

It did beat it in many cases. And for half the price. I don't see why anyone would get the original Titan unless you are doing professional-level work, etc. And what does the GTX 1070, which came 3 years after the R9 290X, have to do with the Titan Black?


#16 Xtasy26
Member since 2008 • 5582 Posts
@Juub1990 said:

It kept the crown all of one month before the 780 Ti came out and reclaimed it. The R9 290X was also infamous for its insanely high temps and loud fans.

It barely beat it in certain cases, and it was MORE expensive and had less memory. 3GB in 2013 is lame, considering the HD 7970 had 3GB back in 2011.


#17 Xtasy26
Member since 2008 • 5582 Posts
@Mozelleple112 said:

Only way AMD becomes king is if Nvidia turns into a full on AI card company creating only $3000-$100,000 GPUs for big companies.

They already proclaim themselves an AI company. They will absolutely not give up the gaming market, nor should they, when they control 80+% of it and are the market leader.

@blaznwiipspman1 said:

@Xtasy26: to me the only thing that matters is the price. Nvidia is highly overrated, but I will admit they have fantastic AI technology. That is somewhere AMD can make improvements, on the software end of things too.

We need competition, and if you're a real gamer, you should be doing what I do... support the little guy. I have been using an Intel GPU for the past year, and I hope others do the same.

I have been supporting the "little guy" since 2008, when I switched from NVIDIA to AMD. I first got the HD 4870, which was basically $100 cheaper than the GTX 260, and thanks to it using the world's first GDDR5 memory it delivered something like 70-75% of the performance of the GTX 280 at half the price, back when the GTX 280 was costing over $600!
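The value argument in that comparison is simple arithmetic. A tiny sketch, using the rough figures from the post (ballpark numbers, not measured benchmarks):

```python
def perf_per_dollar(relative_perf: float, price_usd: float) -> float:
    """Performance points per dollar spent; higher is better value."""
    return relative_perf / price_usd

# Rough figures from the post: GTX 280 as the 100% baseline at ~$600,
# HD 4870 at ~72% of that performance for roughly half the price.
gtx_280 = perf_per_dollar(100, 600)
hd_4870 = perf_per_dollar(72, 300)

print(round(hd_4870 / gtx_280, 2))  # 1.44 -> ~44% more perf per dollar
```

Getting ~72% of the performance at 50% of the price works out to roughly 1.4x the performance per dollar, which is the whole "little guy" value pitch in one number.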

I think that was the turnaround story for AMD after the mediocre HD 2900 XT and the slightly better HD 3800 series. That is when they gained up to 40% market share in discrete graphics, and even close to 50% in the gaming laptop space, because they had superior price/performance/power, especially with the HD 5800 series, which was faster than Thermi, I mean Fermi... lol. Laptop makers didn't want to stick that nuclear reactor inside gaming laptops and went with the HD 5800 series instead, and that got AMD to almost 50% market share, which is insane when you consider how AMD was struggling for finance and R&D money with their crap CPUs.

Next up I got the HD 6950, BIOS-flashed to an HD 6970, which is why I had my AMD Radeon avatar 10+ years ago... lol. I basically saved $100, and close to $150+ when you consider the HD 6970 got close to the GTX 570. I skipped the HD 7970 and got the updated Hawaii GPU with the XFX R9 390X Double Dissipation, which was a fantastic graphics card, before switching to the GTX 1060 6GB.

I even had a 17" gaming laptop with a Mobility Radeon HD 3650, after my previous GeForce Go 7600 died due to defective die packaging from NVIDIA, for which they got sued and lost. Unfortunately my laptop maker wasn't on the list of makers to be compensated, which I am still mad about.

So, as you can see, I supported the "little guy" for close to a decade in the desktop and laptop space. I only switched when NVIDIA matched or beat AMD's price/power/performance with the GTX 1060 6GB, and I had no option other than the RTX 3090 since I wanted to game at 4K and play Cyberpunk 2077 at 4K with max settings and Ray Tracing on DLSS Quality. That was something AMD didn't offer even mid-last year when I finished Cyberpunk, because the RX 6900 XT does badly with Ray Tracing at 4K. They have improved with FSR 3, but NVIDIA still has an advantage.

So... my choice wasn't because I love NVIDIA. I mean, I admire them as a company, since my first GPU was a GeForce4 with 64MB DDR, but it came down to the options and features that AMD didn't have for my needs.

AMD needs to cater to other people's needs, especially at the high end where I want to game at 4K, along with the many other features they are lacking as mentioned in this thread.

The good news is that AMD is apparently working on ML/AI image upscaling. If AMD can match the image quality, I will jump back in a heartbeat to game at 4K. I used to get called an AMD fanboy when I was defending AMD 7+ years ago; it wasn't that I was a fanboy, it's just that AMD had better price/performance back then for my needs. Right now AMD isn't meeting my needs.


#18 Xtasy26
Member since 2008 • 5582 Posts
@Litchie said:

AMD kinda sucks. They should double down on "not as good as Nvidia, but cheaper" instead of just making worse GPUs for the same price as Nvidia.

I see nothing wrong with that. AMD did that with the HD 4800 series and captured the $200-$300 price segment, and that actually enabled them to grab market share. They reached up to 40%, which is double what they have now. I mean, why would I get something that costs 2X as much when I can get 70-75% of the performance at half the price?

From 2008 to 2013 AMD was killing it from the bottom to the mid range to the high end. Only in the last 10 years have they stagnated in the high end, while staying competitive in the mid range.

Right now, though, I would add that they need more than just being cheaper: they need features that match DLSS 3.5, with image quality that matches or gets "close" to NVIDIA's, on top of the price.

I don't understand why they aren't playing the price angle like they did with the HD 4800 through HD 6000 series. It seems like Lisa Su wants to maintain "margin", but at that cost they are losing gamers and, in the long run, mind-share. How many of those users will switch back to AMD after they have been using NVIDIA for 10+ years?


#19 sakaiXx
Member since 2013 • 15961 Posts

AMD cards nowadays aren't bad; they are very competitive but really behind in tech. Lots of people buy Nvidia because of the promise of DLSS, and Nvidia cards are also capable of running AMD's free upscaler.


#20 NoodleFighter
Member since 2011 • 11805 Posts

@Xtasy26 said:
@Litchie said:

AMD kinda sucks. They should double down on "not as good as Nvidia, but cheaper" instead of just making worse GPUs for the same price as Nvidia.

I see nothing wrong with that. AMD did that with the HD 4800 series and captured the $200-$300 price segment, and that actually enabled them to grab market share. They reached up to 40%, which is double what they have now. I mean, why would I get something that costs 2X as much when I can get 70-75% of the performance at half the price?

From 2008 to 2013 AMD was killing it from the bottom to the mid range to the high end. Only in the last 10 years have they stagnated in the high end, while staying competitive in the mid range.

Right now, though, I would add that they need more than just being cheaper: they need features that match DLSS 3.5, with image quality that matches or gets "close" to NVIDIA's, on top of the price.

I don't understand why they aren't playing the price angle like they did with the HD 4800 through HD 6000 series. It seems like Lisa Su wants to maintain "margin", but at that cost they are losing gamers and, in the long run, mind-share. How many of those users will switch back to AMD after they have been using NVIDIA for 10+ years?

AMD has been stagnating in innovation and in building dedicated hardware for hardware-accelerated tasks. AMD's answer in the past for things like tessellation and PhysX was to delegate them to the general computation parts of the GPU, or to software-based solutions, instead of making dedicated hardware like Nvidia did. This tactic has screwed AMD over big time: instead of building AI acceleration hardware immediately after Nvidia released the RTX 20 series, they just kept doing their thing of making inferior versions of whatever Nvidia has, with no dedicated hardware. What they didn't realize was the big picture Nvidia was going for by putting AI acceleration hardware into their GPUs. There was a lot more potential in it than just upscalers for games. Now Nvidia is raking in more money than ever because of the booming AI market, and AMD is playing catch up after sitting on their thumbs for three generations without dedicated AI hardware, very late to the party.

Also, Intel is showing them up now. Intel XeSS is better than FSR at this point, even on AMD GPUs. Intel uses DP4A instructions on GPUs to provide a lower-level, dedicated machine learning path. AMD users want more games to support Intel XeSS because it delivers the same performance as FSR with better image quality.
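For context on why DP4A matters for ML upscaling: it is a single GPU instruction that dot-products four packed 8-bit integers and accumulates into a 32-bit result, which makes low-precision inference fast even on GPUs without dedicated matrix cores. A minimal CPU-side emulation of the semantics (illustrative only; the real thing is a hardware instruction, not Python):

```python
def dp4a(a: int, b: int, acc: int = 0) -> int:
    """Emulate DP4A: dot product of four signed 8-bit lanes packed
    into two 32-bit words, accumulated into a 32-bit integer."""
    def lanes(word: int) -> list[int]:
        out = []
        for i in range(4):
            v = (word >> (8 * i)) & 0xFF
            out.append(v - 256 if v > 127 else v)  # sign-extend each byte
        return out
    return acc + sum(x * y for x, y in zip(lanes(a), lanes(b)))

# Pack [1, 2, 3, 4] into one 32-bit word (little-endian lanes).
packed = 1 | (2 << 8) | (3 << 16) | (4 << 24)
print(dp4a(packed, packed))  # 1*1 + 2*2 + 3*3 + 4*4 = 30
```

One instruction doing four multiply-adds at int8 precision is roughly why the DP4A path of XeSS can run on hardware from any vendor, while the XMX path needs Intel's matrix engines.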

AMD have burned through all their goodwill and their consumer-friendly image. The biggest AMD fans on YouTube now criticize them harshly and make fun of their inability to match Nvidia in features, and even Intel when it comes to XeSS. AMD's solution to everything being compute shaders or some extra VRAM isn't going to cut it anymore. It reminds me of when AMD tried to compensate against pre-Ryzen Intel by just shoving more cores into their CPUs, and still lost in performance benchmarks to Intel CPUs with fewer cores, even in tasks optimized to use more threads and cores.

AMD has a good thing going with PC handhelds, though, and should capitalize on it as much as they can. Only Intel can really compete with them there, and Intel at the moment needs to work on improving the performance of games on DX9 and older APIs. AMD should make a version of FSR that utilizes DP4A, and make sure FSR 3.0 is supported on the Steam Deck and AFMF is available on all PC handhelds. Otherwise, once Intel irons out the legacy PC game issues and ships their equivalent to DLSS 3, people will start switching to Intel-based PC handhelds.


#21  Edited By Xtasy26
Member since 2008 • 5582 Posts

@osan0 said:

It's going to be very tough to compete at the top end. It's not just about having the most powerful GPU anymore. It's that combined with a bunch of extra features and Nvidia are just doing that better. AMD need to not only be delivering the hardware, but also need to be putting developers on contributing to Blender and working with other app devs to get AMD hardware up to being a first class citizen. When people are willing to shell out 1000+ bucks, paying an extra 200-300 for better upscaling, better video encoding, better AI acceleration, better RT perf, better productivity app support and so on is not a big deal in that market.

They also need to sort their marketing out: it's shockingly poor. E.g. the 7800 XT is a fine GPU for its MSRP (though it has gone up a bit in some parts and is too close to the RTX 4070). However, it should have been called the 7800. If someone has a 6800 XT and upgrades to a 7800 XT... well... it's not much of an upgrade at all. But its MSRP is more in line with the 6800's. Just shooting themselves in the foot again (not that I would recommend upgrading from a 6800 either... the jump is too small). It's just another little blunder in a long line of blunders.

They still need to improve the image quality of FSR. It can be done: Intel has done it with XeSS, which, at 1.2, is actually quite good even on the DP4A path now. Ghosting is the biggest issue with it and hopefully that gets fixed. FSR 3's frame gen is actually quite good. Most image quality issues are garbage-in, garbage-out problems rather than flaws with the frame gen itself. Using async compute was quite clever. Nvidia users should be knocking on Nvidia's window and asking for the same option for DLSS.

So if they can knock off the rough edges with the frame gen and improve the base output a bit more, I think it will actually be in a pretty good place. They should also work on making it look better for 1440p displays specifically.

In the short to mid term they also need to look at the low and mid tier segments of the market, I think. At the $100-399 price point, top-class features are less expected; raster perf is still the priority. So, like the RX 480, if they can come up with a compelling offer at 200, 300 and 400 bucks, that could be of interest to many. They have been focusing a lot on packaging, which doesn't really benefit us directly. But it brings the BOM down: that's the only reason to do it, after all.

A proper console killer GPU at 200-250 could be quite enticing.

It's hard to see where they can steal a march though at the moment. Intel are gunning for market share so they could be looking to undercut AMD and Nvidia while offering a very compelling package. Their drivers are not quite on the money yet but, by all accounts, they are improving rapidly. So it's going to be interesting to see how it plays out over the next couple of years.

Good write up. :)

Agreed on the first point. They need to work with developers and bring other apps up to par. I have heard so many stories where people WANT to get AMD, but X, Y, Z program doesn't work as well due to lack of support. I am not necessarily talking about gaming either. For example, an AI developer recently had people who wanted to use AMD hardware for AI, but apparently AMD's OWN compute platform ROCm, their answer to CUDA, doesn't even support some AMD GPUs. Like WTF? Whereas you can pick up pretty much any NVIDIA GPU and get CUDA to run. That's a lost sale, or the lost sale of someone who may have wanted to game and use their GPU for development work.
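That support gap is exactly why developers do a quick environment check before committing to a card. A minimal sketch of that kind of triage; note this only checks whether each vendor's CLI tool is installed on PATH (`nvidia-smi` ships with NVIDIA's driver, `rocminfo` with the ROCm platform), not whether a specific GPU model is actually supported:

```python
import shutil


def detect_compute_stacks() -> dict[str, bool]:
    """Best-effort check for which GPU compute stack's tooling is
    installed, by looking for each vendor's CLI on PATH."""
    tools = {
        "CUDA": "nvidia-smi",  # installed alongside NVIDIA's driver
        "ROCm": "rocminfo",    # installed as part of AMD's ROCm platform
    }
    return {name: shutil.which(tool) is not None for name, tool in tools.items()}


print(detect_compute_stacks())
```

The point of the post stands: on the NVIDIA side this check almost always comes back positive on any consumer GPU, while on the AMD side a positive ROCm install still doesn't guarantee your particular card is on the supported list.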

And you're definitely right about the high end. People will easily spend an extra $200+ on $1000+ GPUs if they get better image quality with DLSS 3.5 and other features. And speaking of features, I know first hand that to record gameplay I had to download third-party bulls*** software like Raptr, run it in the background and set it up to record. When I switched to NVIDIA with the GTX 1060 6GB, I press a couple of buttons and bam, ShadowPlay starts recording. A lot of YouTube streamers wanted to switch to AMD but couldn't due to the lack of a ShadowPlay-like feature; that's more lost sales. With respect to upscaling, AMD should have hired the guy behind DLSS, whom Intel hired to develop XeSS, which is pretty good. They really need to double down on AI/ML-based image upscaling. When you are spending over $1000, it's little things like that that matter.

And don't get me started on their marketing. That's a joke in itself. ATI's marketing 20 years ago with the 9700 Pro was stellar, and they really knew how to sell themselves. They pivoted the brand into an "enthusiast" brand back then. It has gone downhill over the last 10 years.

On the price point front: from what I hear, AMD will not focus on the high end with RDNA 4. It will mostly focus on the $200-$400 range (similar to RDNA 1). They apparently had a very ambitious architecture to compete with Blackwell, but that got shelved so they could focus on the mid range. The ambitious architecture will be used in RDNA 5, and by then they hope to reach parity with DLSS in AI/ML image upscaling. Which makes sense to me; there is no point in releasing an expensive $1000 GPU when they still haven't fixed many of the image issues with FSR 3 and its features. Gamers may still get Blackwell if AMD hasn't reached parity with DLSS 3.5 or 4, or whatever NVIDIA comes up with for Blackwell.


#22 hardwenzen
Member since 2005 • 39441 Posts

@pc_rocks said:

They can't. The only way for them to take the crown now is if Nvidia themselves somehow f**k it up and drop the ball.

Look at Intel. The chance of Nvidia following is there.


#23 BassMan
Member since 2002 • 17850 Posts

All of the above.


#24 osan0
Member since 2004 • 17853 Posts

@Xtasy26 said:
@osan0 said:

It's going to be very tough to compete at the top end. It's not just about having the most powerful GPU anymore. It's that combined with a bunch of extra features and Nvidia are just doing that better. AMD need to not only be delivering the hardware, but also need to be putting developers on contributing to Blender and working with other app devs to get AMD hardware up to being a first class citizen. When people are willing to shell out 1000+ bucks, paying an extra 200-300 for better upscaling, better video encoding, better AI acceleration, better RT perf, better productivity app support and so on is not a big deal in that market.

They also need to sort their marketing out: it's shockingly poor. E.g. the 7800 XT is a fine GPU for its MSRP (though it has gone up a bit in some parts and is too close to the RTX 4070). However, it should have been called the 7800. If someone has a 6800 XT and upgrades to a 7800 XT... well... it's not much of an upgrade at all. But its MSRP is more in line with the 6800's. Just shooting themselves in the foot again (not that I would recommend upgrading from a 6800 either... the jump is too small). It's just another little blunder in a long line of blunders.

They still need to improve the image quality of FSR. It can be done: Intel has done it with XeSS, which, at 1.2, is actually quite good even on the DP4A path now. Ghosting is the biggest issue with it and hopefully that gets fixed. FSR 3's frame gen is actually quite good. Most image quality issues are garbage-in, garbage-out problems rather than flaws with the frame gen itself. Using async compute was quite clever. Nvidia users should be knocking on Nvidia's window and asking for the same option for DLSS.

So if they can knock off the rough edges with the frame gen and improve the base output a bit more, I think it will actually be in a pretty good place. They should also work on making it look better for 1440p displays specifically.

In the short to mid term they also need to look at the low and mid tier segments of the market, I think. At the $100-399 price point, top-class features are less expected; raster perf is still the priority. So, like the RX 480, if they can come up with a compelling offer at 200, 300 and 400 bucks, that could be of interest to many. They have been focusing a lot on packaging, which doesn't really benefit us directly. But it brings the BOM down: that's the only reason to do it, after all.

A proper console killer GPU at 200-250 could be quite enticing.

It's hard to see where they can steal a march though at the moment. Intel are gunning for market share so they could be looking to undercut AMD and Nvidia while offering a very compelling package. Their drivers are not quite on the money yet but, by all accounts, they are improving rapidly. So it's going to be interesting to see how it plays out over the next couple of years.

Good write up. :)

Agreed on the first point. They need to work with developers and bring other apps up to par. I have heard so many stories where people WANT to get AMD but some program doesn't work as well due to missing support, and not just in gaming. For example, an AI developer recently noted that people who wanted to use AMD hardware for AI found that AMD's OWN CUDA equivalent, ROCm, doesn't even support some AMD GPUs. Like, WTF? Whereas you can pick up pretty much any NVIDIA GPU and get CUDA to run. That's a lost sale from someone who may have wanted to game and also use their GPU for development work.

And you're definitely right about the high end. People will easily spend an extra $200+ on $1000+ GPUs if they get better image quality with DLSS 3.5 and other features. And speaking of features, I know first-hand that when recording gameplay I had to download third-party bulls*** software like Raptr, run it in the background, and set it up to record. When I switched to NVIDIA with the GTX 1060 6GB, I only pressed a couple of buttons and bam, ShadowPlay started recording. A lot of YouTube streamers wanted to switch to AMD but couldn't due to the lack of a ShadowPlay-like feature; that's more lost sales. With respect to upscaling, AMD should have hired the engineer behind DLSS, whom Intel hired to develop XeSS, which is pretty good. They really need to double down on AI/ML-based image upscaling. When you are spending over $1000, it's little things like that that matter.

And don't get me started on their marketing; that's a joke in itself. Look at ATI's marketing 20 years ago with the 9700 Pro: it was stellar, and they really knew how to sell themselves. They managed to pivot into an "enthusiast" brand back then. It has gone downhill over the last 10 years.

On the pricing front: from what I hear, AMD will not focus on the high end with RDNA 4. It will mostly target the $200 - $400 range (similar to RDNA 1). They apparently had a very ambitious architecture to compete with Blackwell, but it was shelved so they could focus on the mid-range. That ambitious architecture will be used in RDNA 5, and by then they hope to reach parity with DLSS-style AI/ML image upscaling. That makes sense to me: there is no point in releasing an expensive $1000 GPU when they still haven't fixed many of the image quality issues with FSR 3 and its features. Gamers may still buy Blackwell if AMD hasn't reached parity with DLSS 3.5 or 4, or whatever NVIDIA ships with Blackwell.

Oh yeah definitely: for areas outside of gaming they really need to do a huge amount of work if they are to make their high end GPUs more attractive.

It's hard to say what is going on internally. It's like there are major politics and eggshell-walking going on internally to make sure that Ryzen, CDNA and RDNA don't step on each other's toes. So ROCm is for CDNA (with some concessions), and AI, as far as AMD is concerned, is primarily a CPU and CDNA affair (which could be interesting in its own right, but is not a thing for us right now). RDNA is for graphics, only graphics, and that's it.

On the video side: I have heard that things have improved for people using H.265 and AV1 (assuming they have an RDNA 3 card), but H.264 remains poor. For whatever reason AMD just never fixed it. There is a release of Vulkan video encode/decode now, so maybe that could be used to fix it, but will AMD put resources behind implementing it? Probably not. I thought the Adrenalin control center also had a pretty good game recording feature now, but I don't run Windows so I can't check. I don't do any video production myself, though, so I know very little about the area.

But yeah: to really grab people's attention they do need to steal a march somewhere. AMD does innovate, but they tend to focus on areas that don't really affect us (silicon packaging), or on areas where the payoff takes years (Mantle, async compute). It's more business- and industry-focused. They need to demonstrate new, innovative, user-facing tech that gets people's attention. The last thing I can think of in that area from AMD is TressFX (which was great... power-hungry but pretty nice). Other than that, Nvidia has been setting the agenda and everyone else is trying to keep pace.

Even if/when they get FSR up to DLSS standards, even with complete parity in performance, image quality and scalability, the reaction will be more "finally" than "OMG AMAZEBALLS!!". Same with frame gen. Credit to them: the frame gen part of FSR actually looks very promising (especially in Avatar). There are some minor rough edges to sort out, but they have done it using the existing resources on the GPU and async compute, which is REALLY clever (as I have said in other threads, Nvidia users should be kicking down Nvidia's door and demanding this as a fallback option for older GPUs). They deserve a round of applause for that. But it's late and, again, it's just catching up. Nvidia got the WOW factor of frame gen. Generally people don't care how it's done... just that it's done.

#25 Pedro
Member since 2002 • 70035 Posts

Only fanboys care about dumb stuff like this.😂 That is why the system wars is a relic.😲

#26 Edited By blaznwiipspman1
Member since 2007 • 16582 Posts

@Xtasy26: all good mayn. I've never bought Nvidia, for a few reasons, mainly because I didn't see the price-to-performance benefit. Nvidia cards felt like a scam and still do.

Of course, ray tracing is a cool but gimmicky feature, and it's true that last-gen Radeon 6000 GPUs weren't so good at it. The new 7000-series GPUs are decent, and capable enough to be on par with the GeForce 3000 series. I'd even say that FSR is fairly good; good enough that it holds its own against DLSS.

Personally though, my experience with Intel has been interesting. The card stopped working for a few months, and I figured out the reason after I got a bit serious about trying to fix it.

Also, the price I paid was a bit higher than an even better AMD GPU, which kind of sucked, but I stood by my ideals and bought the Intel Arc A770. After all is said and done, it's a pretty decent card. The ray tracing performance is even better than Nvidia's in some cases, which is mind-boggling. Of course, some games do better than others; Hogwarts Legacy, for example, runs amazingly on the Arc A770. I get 30 to 40 fps with ray tracing on high at 4K resolution. I'm excited to see what the next-gen Intel GPUs are capable of.

#27 truebond
Member since 2010 • 39 Posts

Drivers should be ready at launch, more video memory, send engineers to AAA game development studios to help get better performance on AMD GPUs, and undercut Nvidia's prices by at least 20%.

#29 mrbojangles25
Member since 2005 • 58436 Posts

I'm kind of curious why they even need to take "the crown" to begin with.

Isn't "good enough" exactly that? Good enough?

If they're profitable and showing growth, then what's the big deal?

#30 Xtasy26
Member since 2008 • 5582 Posts

@mrbojangles25 said:

I'm kind of curious why they even need to take "the crown" to begin with.

Isn't "good enough" exactly that? Good enough?

If they're profitable and showing growth, then what's the big deal?

Good question. Let me explain. Back when I got a Radeon HD 4870, my first ATI/AMD GPU, I could get close to 70-75% of nVIDIA's performance for half the price. With my last AMD GPU, the R9 390X, I got close to GTX 980 performance for less money with the same image quality.
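That value argument is easy to make concrete with a quick performance-per-dollar calculation. The numbers below are illustrative assumptions (a $600 flagship versus a $300 card at ~75% of its performance), not exact launch MSRPs:

```python
# Performance-per-dollar comparison. All prices here are assumed,
# illustrative figures, not historical launch MSRPs.
def perf_per_dollar(relative_perf: float, price: float) -> float:
    """Relative performance points bought per dollar spent."""
    return relative_perf / price

flagship = perf_per_dollar(1.00, 600.0)  # baseline: 100% perf at $600
midrange = perf_per_dollar(0.75, 300.0)  # ~75% perf at half the price

# The cheaper card delivers 1.5x the performance per dollar.
print(f"value advantage: {midrange / flagship:.2f}x")  # -> 1.50x
```

The same arithmetic explains why a card with 75% of the performance at 50% of the price reads as a win to value-minded buyers, even without the crown.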

Now, after jumping to 4K, I don't really have an alternative with respect to getting the performance AND the image quality. Before, AMD could get away with being "good enough" because image quality wasn't a differentiator; theirs was as good as, if not better than, nVIDIA's.

Now, with ray tracing and maxed-out 4K settings, we really need something like DLSS to play games with ray tracing, and AMD's FSR 3 lags behind DLSS in both performance and image quality. If FSR 3 matched the picture quality, I would be more than willing to save the $500 and go with the 7900 XTX over the RTX 4090. But that isn't the case.

I agree with you to a certain extent that as long as they are profitable and showing growth, it's fine. But with nVidia winning generation after generation, especially with features such as frame generation, DLSS and ShadowPlay, it's hurting AMD's growth even though AMD may be competitive in the mid-range, as with the RX 7800 XT.

Back in the HD 4850/4870 days, even without the performance crown, AMD still grew; market share jumped to 40%. Now AMD has around 17%, half of what they had 15 years ago, and unchanged after 8 years. Consecutive generations of missing the performance crown are hurting their image, I would argue, especially against the feature set nVidia is offering.

Lastly, "winning the crown" has a psychological effect on consumers. They perceive the winner as the "better" brand, even if that isn't 100% true, and that translates into better sales and market share. We saw that with the 9700/9800 generation 20 years ago: ATI started to shed the perception of playing second fiddle to NVIDIA. In the following generation, with the X800 XT PE against the GeForce 6800 Ultra released in 2004, a lot of PC gamers finally switched to ATI. In fact, that's when ATI crossed the 50% market share threshold and became the #1 GPU maker in the world.

Also, shouldn't AMD offer something to gamers like me who made the jump to 4K and want to play every new game maxed out at 4K? I know I may be a niche case, having moved up from the mid-range, so it may not matter to AMD. But as someone who supported AMD for nearly 10 years, shouldn't AMD have something for someone like me?

This matters now more than ever for AMD. NVIDIA is running away with almost 80% market share for the reasons above and many more. Look what happened once AMD started taking the performance crown from intel with Ryzen: a lot of gamers switched to AMD, which brought a significant influx of cash.

#31 Xtasy26
Member since 2008 • 5582 Posts
@truebond said:

Drivers should be ready at launch, more video memory, send engineers to AAA game development studios to help get better performance on AMD GPUs, and undercut Nvidia's prices by at least 20%.

Sending engineers to AAA developers is something AMD has been reluctant to do, and that has bothered me going back 15-20 years. nVIDIA really cares about its customers and bends over backwards to make sure games run best on its hardware. I was surprised to find 3-4 Nvidia names in the credits of No One Lives Forever 2 versus 1 from ATI when I finished NOLF 2, and that was 20 years ago! It means Nvidia provides whatever resources developers need to make AAA games work best on their GPUs. I even read about NVIDIA sending engineers to help Crytek make Crysis look its best. That's what I call commitment! I don't think AMD's pricing is necessarily bad, but working with AAA developers from the beginning, instead of trying to fix things later with driver updates, is another area where NVIDIA is one step ahead.

#32 Xtasy26
Member since 2008 • 5582 Posts
@blaznwiipspman1 said:

@Xtasy26: all good mayn. I've never bought Nvidia, for a few reasons, mainly because I didn't see the price-to-performance benefit. Nvidia cards felt like a scam and still do.

Of course, ray tracing is a cool but gimmicky feature, and it's true that last-gen Radeon 6000 GPUs weren't so good at it. The new 7000-series GPUs are decent, and capable enough to be on par with the GeForce 3000 series. I'd even say that FSR is fairly good; good enough that it holds its own against DLSS.

Personally though, my experience with Intel has been interesting. The card stopped working for a few months, and I figured out the reason after I got a bit serious about trying to fix it.

Also, the price I paid was a bit higher than an even better AMD GPU, which kind of sucked, but I stood by my ideals and bought the Intel Arc A770. After all is said and done, it's a pretty decent card. The ray tracing performance is even better than Nvidia's in some cases, which is mind-boggling. Of course, some games do better than others; Hogwarts Legacy, for example, runs amazingly on the Arc A770. I get 30 to 40 fps with ray tracing on high at 4K resolution. I'm excited to see what the next-gen Intel GPUs are capable of.

That's good to hear that intel is doing well in ray tracing. I thought the approach intel took with XeSS was smart: they basically hired the engineer who worked on DLSS, and it shows. No wonder it beats FSR in image quality. It's going to be a tough hill to climb with all the driver optimization needed, especially for older games. Apparently they improved Assassin's Creed Unity performance by something like 100%, but that's a game from nearly 10 years ago, which shows how much work remains for titles from 10-12+ years back. I was really looking forward to the A770 at release because, for the first time in 20+ years, since 3dfx's demise, I felt we might have a viable third competitor.

Looking forward to what intel has for Battlemage. My suggestion: ship one die, like AMD initially did with the HD 4800 series launch (HD 4850/4870), and focus heavily on optimization, doubling down on drivers. Do that for Celestial, and then go for the high end with Druid. That would let intel focus on one chip at a time instead of several.

Can't believe I am rooting for intel. :P

#33 Edited By Xtasy26
Member since 2008 • 5582 Posts
@Pedro said:

Only fanboys care about dumb stuff like this.😂 That is why the system wars is a relic.😲

No. We want competition and better prices. Why is it a bad thing to want competition for consumers in both the mid-range and the high end, and to look at what AMD needs to improve?

#34 Litchie
Member since 2003 • 34702 Posts

@Xtasy26 said:
@Pedro said:

Only fanboys care about dumb stuff like this.😂 That is why the system wars is a relic.😲

No. We want competition and better prices. Why is it a bad thing to want competition for consumers in both the mid-range and the high end, and to look at what AMD needs to improve?

That'd be cool. AMD's plan seems to be making worse GPUs for the same price as Nvidia GPUs though, so that's not gonna happen anytime soon. And Intel? Might be some form of competitor in 10 years or something.

#35 sakaiXx
Member since 2013 • 15961 Posts

@Litchie: Their 7900 XT and XTX are good options compared to nvidia in the same price range though. It would be really good to get cards like that but with dedicated RT cores instead of the current solution. Knowing AMD though, they're always shooting themselves in the foot one way or another.

#36 Litchie
Member since 2003 • 34702 Posts

@sakaixx said:

@Litchie: Their 7900 XT and XTX are good options compared to nvidia in the same price range though. It would be really good to get cards like that but with dedicated RT cores instead of the current solution. Knowing AMD though, they're always shooting themselves in the foot one way or another.

Yeah, kinda. They aren't better performing than the Nvidia cards, and they're missing features. I think they need to push their prices down some more.

#37 HalcyonScarlet
Member since 2011 • 13669 Posts

I just want decent performance and amazing after purchase service (software and drivers).

#38 GhostOfGolden
Member since 2023 • 2588 Posts

I was exclusively an Nvidia customer for over a decade. After watching the 7800 XT coverage and reviews I decided to go ahead and grab one for my PC. I'm sorry to say the switch hasn't been all that smooth. I had random shutdowns, and sometimes I would get no video out to the monitor when I turned the PC on. My immediate thought was that my PSU was tapped out, so I put a meter on it; that wasn't the problem. I did a clean Windows 11 install and reinstalled all the drivers, and the startup issue still happens intermittently. The only workaround is to unplug the riser cable from the motherboard and plug it back in. After a ton of digging on Reddit and PC tech forums, I came to find out this has been A KNOWN DRIVER ISSUE FOR SOME TIME NOW!!! AMD fanboys always tell us the driver problems are exaggerated. Bullshit...

The card runs well and really trades blows with my kid's 4070. But he doesn't have any driver issues... hell, he's had 3 driver updates since my last one. AMD isn't taking the graphics crown from anybody. In fact, they'd better start looking over their shoulder for Intel.

#39 mrbojangles25
Member since 2005 • 58436 Posts

@Xtasy26: I didn't realize they only had 17% market share, that's pretty bad, and bound to get worse with Intel entering the fray.

I do hope Intel makes some good offerings, it'd really shake things up and force both nVidia and AMD to maybe lower prices and/or increase performance.

#40 adrian1480
Member since 2003 • 15033 Posts

I don't know what they can do, as these are not equal combatants.

1.) Some things are a function of being late to a technology and playing catch-up. They were very late to AI and ray tracing. They were late getting FreeSync out, and it's still not quite as good as G-Sync.

2.) They have fewer resources to invest. Nvidia is roughly a $1.3 trillion market cap company; AMD is around $250 billion. In other words, Nvidia has roughly five times AMD's market value, and AMD's investments are spread across graphics and CPUs, among other things. Nvidia is also spread across a variety of product lines and markets, but they still generally revolve around graphics.

They will likely always be at arm's length from Nvidia at this point, and their biggest strength will be competing at the mid-level of performance. That said, they can still compete in critical categories:

1.) Their tech gives them a strong advantage on energy consumption, where their hardware is the clear winner over Nvidia or Intel: a smaller process node and solid chiplet designs. Only Apple is doing energy efficiency better, and they are not a direct competitor in the same way.

2.) They can still put together high-end parts, just not necessarily ones that hit every feature Nvidia, with its deeper pockets, can deliver as well. So they can compete on the value proposition, which they have done fairly well but could do better.

3.) Also-ran status lets them watch Nvidia make the mistakes and learn from them before releasing their own versions of things: G-Sync vs FreeSync, or ray tracing. It took Nvidia a few generations before ray tracing stopped being a joke, and while Nvidia was working to improve it, AMD was quietly starting to add its own. They are behind as a result, but they have undoubtedly had to spend less than Nvidia did to get to this point. They can essentially draft behind the leader and stay competitive.

In the end, they will likely remain in second place, but that's okay. Keep investing and keep innovating, as they have in the CPU space, and occasionally they'll hit an unexpected home run that grants them the title for a generation here and there. But it's hard to make strong progress against companies many times your size without some sort of sea change you can be at the vanguard of. I feel like AMD would need a new product that nobody else has to grow into a giant.