Digital Foundry: Teraflop computation no longer a relevant measurement for next-gen consoles.

#51 deactivated-6092a2d005fba
Member since 2015 • 22663 Posts

These threads are...

#52  Edited By rmpumper
Member since 2016 • 2146 Posts

@tormentos said:

You need more than 12TF to claim 3x the Pro, and 18TF to claim 3x the Xbox One X; the juggling of words to convince people that these machines are a great jump will be incredible.

Not really. MS never claimed that the GPU in Scarlett will be 4x faster than the one in the X; they were talking about overall performance. Add to that the far better Zen CPU, some type of SSD and faster memory, and you might end up with 4x better performance* (*loading times included in the equation).

As for the TFLOPS thing, Sony claimed 12TF for the PS5, which is bullshit when we can see that the $450, 225W 5700 XT only has ~10TF. They might have been comparing to a Vega equivalent, so the PS5 will most likely have a 5700/Vega 56-level GPU at best (same with MS), and might be using second-gen Navi cards in the final product.
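For reference, a minimal sketch of the arithmetic behind these TFLOP figures. The shader count and boost clock for the 5700 XT are public specs; the 56 CU / 1.7 GHz configuration at the end is a purely hypothetical example of what a 12 TF part would roughly require, not a leak or a confirmed spec.

```python
# Peak FP32 TFLOPS = shaders * 2 ops per clock (one FMA) * clock (GHz) / 1000.
def tflops(shaders: int, clock_ghz: float) -> float:
    return shaders * 2 * clock_ghz / 1000.0

# RX 5700 XT: 40 CUs * 64 shaders = 2560 shaders at a ~1.905 GHz boost clock.
print(f"5700 XT: ~{tflops(2560, 1.905):.2f} TF")            # ~9.75 TF

# Hypothetical example only: one way a 12 TF part could be reached.
print(f"56 CUs @ 1.7 GHz: ~{tflops(56 * 64, 1.7):.2f} TF")  # ~12.2 TF
```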

#53 sonny2dap
Member since 2008 • 2085 Posts

@sakaixx said:

So skipping power, we're already comparing exclusives this early? Hope Xbox has something exclusive that is not on PC.

We know it will all be on PC. It looks like MS is going to push not just hardware this next cycle but their whole ecosystem: create as low a barrier to entry as possible without hurting themselves and get their games into the hands of as many consumers as possible. Will it work? That remains to be seen, but in all honesty the surest bet going into the next cycle really does look like Nintendo; their strength of IP, combined with the fact that no one does what Nintendo does as well as Nintendo does it, makes them a safe bet.

#54 scatteh316
Member since 2004 • 10273 Posts

@horgen said:
@boxrekt said:
@04dcarraher said:

Even if AMD got Navi to match Nvidia's Pascal or Turing's efficiency...... A 10 TFLOP NAVI is still not 4k 60 fps material without compromises, let alone having Ray tracing....

That's the part that's yet to be determined because the next gen consoles have custom components which have yet to be revealed.

You have to remember we're not talking bare-bones GPUs that are simply thrown into random rigs. From the SSDs to the CPUs, every piece of these consoles is being designed to work as one unit.

We won't know what their actual capability is until either a developer leaks that information early or Sony/MS actually show off the next-gen games running on these consoles.

There are already reports of PS5 dev kits running Red Dead Redemption 2, Spiderman and The Last of Us 2 (all AAA caliber titles) 4k 60fps so saying they will not do 4k 60 is really premature.

At the very least, it sounds like next gen consoles will be running any current gen titles hitting that bench.

Got any links to back that up?

Why is that so hard to believe?

PS5 will have more than enough grunt to run those games at 4K/60.

#55 PC_Rocks
Member since 2018 • 8495 Posts

@BassMan said:

AMD TFLOPS have not mattered for a while now. Many of their previous GPUs would have a high TFLOPS count but perform worse than Nvidia GPUs with fewer TFLOPS, due to the architecture and driver differences. All that matters is real-world benchmarks.

@fedor said:

Next gen 8TF confirmed.

@goldenelementxl said:

Lol @ AMD tflops

This.

#57 Sushiglutton
Member since 2009 • 9875 Posts

Think we should wait until these machines actually exist and then compare benchmarks for third-party games. Debating specs that apparently don't tell you the whole story, even IF we knew what they were, seems pretty pointless.

#58  Edited By BoxRekt
Member since 2019 • 2425 Posts
@Pedro said:
@Juub1990 said:

@Pedro: Because 80% of people have the base console and they serve as the lowest common denominator?

This is true, but I don't see why comparing it to the most recently released hardware is "bad". In the end it depends on the consumer's perspective. If you have an original Xbox One or an S then I can understand comparing it to those, and if you have a One X, I also understand that comparison. In the end, any of the next-gen consoles would be significant relative to the launch consoles.

@boxrekt said:

Why not use 2080TI?

It's hardware that also came out "this gen" and you said consoles are just PC's anyway, so why not use a 2080TI?

You can if you want. Consoles are just PCs anyway.....Oh, you thought they were something special.

No silly, it's YOUR argument that consoles are just PC's.

So why only use the latest upgraded console hardware as your reference for next generation instead of using the latest PC hardware?

Maybe you aren't bright enough to defend the fault in your logic like a good deflector should.

If you aren't even able to explain why the latest PC hardware shouldn't be used as the reference point for generation leaps when you claim consoles are just PCs, then you should stop trying to speak on the topic of generations until you get a grasp on your own narrative and can defend it thoroughly.

With that said...why should anyone use the X or Pro as a reference point for next gen?

*lol and try to have that make sense while also claiming consoles are just PC's, which currently have better hardware and would be a better reference point for a generation leap than the shitty mid level PC GPUs with jaguar cores.*

#59 Grey_Eyed_Elf
Member since 2011 • 7970 Posts

You conveniently only bullet pointed the 48 CU chip with 44 active CU's.... You missed the most likely 40 CU with 36 CU's active which is more inline with what is possible based on TDP.

The next consoles in terms of raw power will have a hard time competing against a full XT, they will more than likely land between a Pro and XT in performance and depending on how much Ray Tracing cores affect the TDP it could be worse than we all think.

#60 BoxRekt
Member since 2019 • 2425 Posts
@Grey_Eyed_Elf said:

You conveniently only bullet pointed the 48 CU chip with 44 active CU's.... You missed the most likely 40 CU with 36 CU's active which is more inline with what is possible based on TDP.

The next consoles in terms of raw power will have a hard time competing against a full XT, they will more than likely land between a Pro and XT in performance and depending on how much Ray Tracing cores affect the TDP it could be worse than we all think.

I didn't "conveniently" miss anything. I was going off the size of the chip for Scarlet that they counted 365-380mm2 that was estimated to be 48 CU, 44 active.

FYI: The size of PC version of navi GPU comes in at 251mm2 which 40 CUs. Why would I include that when that's NOT what they calculated for the console chip?

Figure what you want about the end result for next gen consoles, but don't project nonsense on me because you have different theories than Digital Foundry has put out.
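For rough orientation, here is what those two CU configurations work out to in peak FP32 TFLOPS. The clock speeds are illustrative assumptions only; console clocks had not been confirmed at this point.

```python
# Sketch: peak FP32 TFLOPS for the 36-active-CU and 44-active-CU Navi scenarios.
# The 1.6 and 1.8 GHz clocks are assumed purely for illustration, not confirmed specs.
def navi_tflops(active_cus: int, clock_ghz: float) -> float:
    shaders = active_cus * 64                 # 64 stream processors per Navi CU
    return shaders * 2 * clock_ghz / 1000.0   # 2 FP32 ops (FMA) per shader per clock

for cus in (36, 44):
    for clock_ghz in (1.6, 1.8):
        print(f"{cus} CUs @ {clock_ghz} GHz: ~{navi_tflops(cus, clock_ghz):.1f} TF")
# 36 CUs lands around 7.4-8.3 TF, 44 CUs around 9.0-10.1 TF over that clock range.
```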

#61 deactivated-642321fb121ca
Member since 2013 • 7142 Posts
@Grey_Eyed_Elf said:

You conveniently only bullet pointed the 48 CU chip with 44 active CU's.... You missed the most likely 40 CU with 36 CU's active which is more inline with what is possible based on TDP.

The next consoles in terms of raw power will have a hard time competing against a full XT, they will more than likely land between a Pro and XT in performance and depending on how much Ray Tracing cores affect the TDP it could be worse than we all think.

Don't believe it will actually be hardware based, we know what MS is like with their wordings. I know they said it is, but they will probably say it is the graphics card which is doing it based on shaders, rather than what we would interpret as RT cores.

#62 BoxRekt
Member since 2019 • 2425 Posts
@Random_Matt said:
@Grey_Eyed_Elf said:

You conveniently only bullet pointed the 48 CU chip with 44 active CU's.... You missed the most likely 40 CU with 36 CU's active which is more inline with what is possible based on TDP.

The next consoles in terms of raw power will have a hard time competing against a full XT, they will more than likely land between a Pro and XT in performance and depending on how much Ray Tracing cores affect the TDP it could be worse than we all think.

Don't believe it will actually be hardware based, we know what MS is like with their wordings. I know they said it is, but they will probably say it is the graphics card which is doing it based on shaders, rather than what we would interpret as RT cores.

RT is looking more and more like a fail as a hardware-based solution. It may just be smarter to use a shader-based software approach until the technology can keep up with the performance demands.

#63 scatteh316
Member since 2004 • 10273 Posts

@Grey_Eyed_Elf said:

You conveniently only bullet pointed the 48 CU chip with 44 active CU's.... You missed the most likely 40 CU with 36 CU's active which is more inline with what is possible based on TDP.

The next consoles in terms of raw power will have a hard time competing against a full XT, they will more than likely land between a Pro and XT in performance and depending on how much Ray Tracing cores affect the TDP it could be worse than we all think.

I dunno on that last point dude...... an HD 7850 can't hang with PS4 these days.....

#64 scatteh316
Member since 2004 • 10273 Posts

@Random_Matt said:
@Grey_Eyed_Elf said:

You conveniently only bullet pointed the 48 CU chip with 44 active CU's.... You missed the most likely 40 CU with 36 CU's active which is more inline with what is possible based on TDP.

The next consoles in terms of raw power will have a hard time competing against a full XT, they will more than likely land between a Pro and XT in performance and depending on how much Ray Tracing cores affect the TDP it could be worse than we all think.

Don't believe it will actually be hardware based, we know what MS is like with their wordings. I know they said it is, but they will probably say it is the graphics card which is doing it based on shaders, rather than what we would interpret as RT cores.

As per the DF video it's confirmed the consoles are using the RDNA variant of Navi...which has hardware support for RT.

#65 deactivated-642321fb121ca
Member since 2013 • 7142 Posts
@boxrekt said:
@Random_Matt said:
@Grey_Eyed_Elf said:

You conveniently only bullet pointed the 48 CU chip with 44 active CU's.... You missed the most likely 40 CU with 36 CU's active which is more inline with what is possible based on TDP.

The next consoles in terms of raw power will have a hard time competing against a full XT, they will more than likely land between a Pro and XT in performance and depending on how much Ray Tracing cores affect the TDP it could be worse than we all think.

Don't believe it will actually be hardware based, we know what MS is like with their wordings. I know they said it is, but they will probably say it is the graphics card which is doing it based on shaders, rather than what we would interpret as RT cores.

RT is looking more and more like a fail as a hardware-based solution. It may just be smarter to use a shader-based software approach until the technology can keep up with the performance demands.

RT is fundamentally better, but it just isn't ready yet. I was merely pointing out MS's record of misleading PR: even if it is shader-based, as I expect it to be, MS can still turn around and say it is hardware-based because it is still using the graphics card, which is a piece of hardware.

#66 Grey_Eyed_Elf
Member since 2011 • 7970 Posts

@boxrekt said:
@Grey_Eyed_Elf said:

You conveniently only bullet pointed the 48 CU chip with 44 active CU's.... You missed the most likely 40 CU with 36 CU's active which is more inline with what is possible based on TDP.

The next consoles in terms of raw power will have a hard time competing against a full XT, they will more than likely land between a Pro and XT in performance and depending on how much Ray Tracing cores affect the TDP it could be worse than we all think.

I didn't "conveniently" miss anything. I was going off the size of the chip for Scarlet that they counted 365-380mm2 that was estimated to be 48 CU, 44 active.

FYI: The size of PC version of navi GPU comes in at 251mm2 which 40 CUs. Why would I include that when that's NOT what they calculated for the console chip?

Figure what you want about the end result for next gen consoles, but don't project nonsense on me because you have different theories than Digital Foundry has put out.

They have been wrong with every single guess they have ever made on hardware predictions. They never use TDP as an indicator, nor price/yields.

Go with what you want. I have been consistent even with my Navi predictions, knowing that since it would be GCN-based the TDP would be a nightmare.

A 40 CU 5700 XT has a TDP of 225W... And these clowns think MS can fit a 48 CU chip with Ryzen and RT cores?... They are smoking some strong s***.

Same way they were wrong about the Switch when they said it would use a Tegra X2, and about the X1X when they said it would use Vega... These guys are retarded.

#67 deactivated-642321fb121ca
Member since 2013 • 7142 Posts
@scatteh316 said:
@Random_Matt said:
@Grey_Eyed_Elf said:

You conveniently only bullet pointed the 48 CU chip with 44 active CU's.... You missed the most likely 40 CU with 36 CU's active which is more inline with what is possible based on TDP.

The next consoles in terms of raw power will have a hard time competing against a full XT, they will more than likely land between a Pro and XT in performance and depending on how much Ray Tracing cores affect the TDP it could be worse than we all think.

Don't believe it will actually be hardware based, we know what MS is like with their wordings. I know they said it is, but they will probably say it is the graphics card which is doing it based on shaders, rather than what we would interpret as RT cores.

As per the DF video it's confirmed the consoles are using the RDNA variant of Navi...which has hardware support for RT.

Until I have full confirmation of RT cores, then I'm still sceptical.

#68  Edited By 04dcarraher
Member since 2004 • 23832 Posts

@Random_Matt said:
@scatteh316 said:
@Random_Matt said:
@Grey_Eyed_Elf said:

You conveniently only bullet pointed the 48 CU chip with 44 active CU's.... You missed the most likely 40 CU with 36 CU's active which is more inline with what is possible based on TDP.

The next consoles in terms of raw power will have a hard time competing against a full XT, they will more than likely land between a Pro and XT in performance and depending on how much Ray Tracing cores affect the TDP it could be worse than we all think.

Don't believe it will actually be hardware based, we know what MS is like with their wordings. I know they said it is, but they will probably say it is the graphics card which is doing it based on shaders, rather than what we would interpret as RT cores.

As per the DF video it's confirmed the consoles are using the RDNA variant of Navi...which has hardware support for RT.

Until I have full confirmation of RT cores, then I'm still sceptical.

Depends on the method, because there are two types of hardware usage: the shader-processor method (which eats into the total performance of the GPU more than having dedicated processors to handle RT), and then separate dedicated processors for RT.

#69 Pedro
Member since 2002 • 70008 Posts

@boxrekt said:

No silly, it's YOUR argument that consoles are just PC's.

So why only use the latest upgraded console hardware as your reference for next generation instead of using the latest PC hardware?

Maybe you aren't bright enough to defend the fault in your logic like a good deflector should.

If you aren't even able to explain why the latest PC hardware shouldn't be used as the reference point for generation leaps when you claim consoles are just PCs, then you should stop trying to speak on the topic of generations until you get a grasp on your own narrative and can defend it thoroughly.

With that said...why should anyone use the X or Pro as a reference point for next gen?

*lol and try to have that make sense while also claiming consoles are just PC's, which currently have better hardware and would be a better reference point for a generation leap than the shitty mid level PC GPUs with jaguar cores.*

Funny, I said that you can compare it with the latest PC hardware if you want, and here you are complaining about the fact that consoles are not PCs. Seems like your ignorance knows no bounds. :)

#70  Edited By scatteh316
Member since 2004 • 10273 Posts

@Random_Matt said:
@scatteh316 said:
@Random_Matt said:
@Grey_Eyed_Elf said:

You conveniently only bullet pointed the 48 CU chip with 44 active CU's.... You missed the most likely 40 CU with 36 CU's active which is more inline with what is possible based on TDP.

The next consoles in terms of raw power will have a hard time competing against a full XT, they will more than likely land between a Pro and XT in performance and depending on how much Ray Tracing cores affect the TDP it could be worse than we all think.

Don't believe it will actually be hardware based, we know what MS is like with their wordings. I know they said it is, but they will probably say it is the graphics card which is doing it based on shaders, rather than what we would interpret as RT cores.

As per the DF video it's confirmed the consoles are using the RDNA variant of Navi...which has hardware support for RT.

Until I have full confirmation of RT cores, then I'm still sceptical.

Well if they're using RDNA then they will have dedicated RT cores as that's what the RDNA architecture is....... Navi with RT cores.

#71 04dcarraher
Member since 2004 • 23832 Posts

@scatteh316 said:
@Random_Matt said:
@scatteh316 said:
@Random_Matt said:
@Grey_Eyed_Elf said:

You conveniently only bullet pointed the 48 CU chip with 44 active CU's.... You missed the most likely 40 CU with 36 CU's active which is more inline with what is possible based on TDP.

The next consoles in terms of raw power will have a hard time competing against a full XT, they will more than likely land between a Pro and XT in performance and depending on how much Ray Tracing cores affect the TDP it could be worse than we all think.

Don't believe it will actually be hardware based, we know what MS is like with their wordings. I know they said it is, but they will probably say it is the graphics card which is doing it based on shaders, rather than what we would interpret as RT cores.

As per the DF video it's confirmed the consoles are using the RDNA variant of Navi...which has hardware support for RT.

Until I have full confirmation of RT cores, then I'm still sceptical.

Well if they're using RDNA then they will have dedicated RT cores as that's what the RDNA architecture is....... Navi with RT cores.

NAVI 10 is also "RDNA" and yet it does not have dedicated RT cores. So its not a guarantee.

#72 scatteh316
Member since 2004 • 10273 Posts

@04dcarraher said:
@scatteh316 said:
@Random_Matt said:
@scatteh316 said:
@Random_Matt said:

Don't believe it will actually be hardware based, we know what MS is like with their wordings. I know they said it is, but they will probably say it is the graphics card which is doing it based on shaders, rather than what we would interpret as RT cores.

As per the DF video it's confirmed the consoles are using the RDNA variant of Navi...which has hardware support for RT.

Until I have full confirmation of RT cores, then I'm still sceptical.

Well if they're using RDNA then they will have dedicated RT cores as that's what the RDNA architecture is....... Navi with RT cores.

NAVI 10 is also "RDNA" and yet it does not have dedicated RT cores. So its not a guarantee.

Do you have a link that says Navi 10 doesn't have RT cores? As everything I've seen says 'to be confirmed'

#73  Edited By deactivated-642321fb121ca
Member since 2013 • 7142 Posts
@scatteh316 said:
@04dcarraher said:
@scatteh316 said:
@Random_Matt said:

Until I have full confirmation of RT cores, then I'm still sceptical.

Well if they're using RDNA then they will have dedicated RT cores as that's what the RDNA architecture is....... Navi with RT cores.

NAVI 10 is also "RDNA" and yet it does not have dedicated RT cores. So its not a guarantee.

Do you have a link that says Navi 10 doesn't have RT cores? As everything I've seen says 'to be confirmed'

"To begin with, we can confirm that the "Navi 10" silicon has no fixed function hardware for ray-tracing such as the RT core or tensor cores found in NVIDIA "Turing" RTX GPUs. For now, AMD's implementation of DXR (DirectX Ray-tracing) for now relies entirely on programmable shaders."

https://www.techpowerup.com/256459/amd-radeon-rx-5700-xt-confirmed-to-feature-64-rops-architecture-brief?cp=2

MS would need next-gen RDNA; AMD have said it will come. But that will cost a fortune in my eyes. Maybe it will be part of their semi-custom design, but I don't see it.

#74  Edited By BoxRekt
Member since 2019 • 2425 Posts
@Pedro said:
@boxrekt said:

No silly, it's YOUR argument that consoles are just PC's.

So why only use the latest upgraded console hardware as your reference for next generation instead of using the latest PC hardware?

Maybe you aren't bright enough to defend the fault in your logic like a good deflector should.

If you aren't even able to explain why the latest PC hardware shouldn't be used as the reference point for generation leaps when you claim consoles are just PCs, then you should stop trying to speak on the topic of generations until you get a grasp on your own narrative and can defend it thoroughly.

With that said...why should anyone use the X or Pro as a reference point for next gen?

*lol and try to have that make sense while also claiming consoles are just PC's, which currently have better hardware and would be a better reference point for a generation leap than the shitty mid level PC GPUs with jaguar cores.*

Funny, I said that you can compare it with the latest PC hardware if you want and here you are complaining about the fact that consoles are not PCs. Seems like your ignorance know no bounds. :)

Not me, I asked YOU why don't you use PC GPU/CPU?

You keep referencing the X and Pro for next gen, NOT me. You said consoles are just PCs, so why do you want to reference mid-gen for next gen?

So I'm asking you why don't you use the latest PC hardware as your reference for next gen instead of older mid-gen consoles?

You don't seem to have an answer.

#75  Edited By 04dcarraher
Member since 2004 • 23832 Posts
@scatteh316 said:
@04dcarraher said:
@scatteh316 said:
@Random_Matt said:

Until I have full confirmation of RT cores, then I'm still sceptical.

Well if they're using RDNA then they will have dedicated RT cores as that's what the RDNA architecture is....... Navi with RT cores.

NAVI 10 is also "RDNA" and yet it does not have dedicated RT cores. So its not a guarantee.

Do you have a link that says Navi 10 doesn't have RT cores? As everything I've seen says 'to be confirmed'

"AMD officially said the path they see includes using the current GCN and RDNA cards to perform ray tracing using the card’s shader cores."

https://www.pcworld.com/article/3401598/amd-releases-two-affordable-navi-based-radeon-rx-cards-and-lays-out-ray-tracing-plans.html

"Also, when it comes to ray tracing, AMD is indeed developing their own suite around it. According to their vision, current GCN and RDNA architecture will be able to perform ray tracing on shaders which will be used through ProRender for creators and Radeon Rays for developers. In next-gen RDNA which is supposed to launch in 2020 on 7nm+ node, AMD will be bringing hardware-enabled ray tracing with select lighting effects for real-time gaming. AMD will also enable full-scene ray tracing which would be leveraged through cloud computing."

https://wccftech.com/amd-radeon-rx-5700-xt-7nm-navi-rdna-gpu-official-launch/

So again, not all Navi-based GPUs will have dedicated RT cores, and it's not a guarantee that one or both consoles will have them.

#76 tormentos
Member since 2003 • 33784 Posts

@rmpumper:

1- They never stated overall performance; that is all you.

2- Please link me to Sony confirming 12TF.

#77 Shewgenja
Member since 2009 • 21456 Posts

I'd just like to point out that dev kits come with a target performance point. People are making wild assumptions about whether or not the alpha/beta dev kits are at all reflective of this from a hardware standpoint.

#78 horgen  Moderator
Member since 2006 • 127517 Posts

@scatteh316 said:

Why is that so hard to believe?

PS5 will have more than enough grunt to run those games at 4K/60.

Hmmm. 4X resolution (or more?) and twice the fps. Well if it is 8 times stronger I guess.

#79 BoxRekt
Member since 2019 • 2425 Posts
@horgen said:
@scatteh316 said:

Why is that so hard to believe?

PS5 will have more than enough grunt to run those games at 4K/60.

Hmmm. 4X resolution (or more?) and twice the fps. Well if it is 8 times stronger I guess.

They may have to do some funny stuff to pull it off, but I do think that will be the standard for next gen.

E.g. they may checkerboard 1800p up to 4K to ensure 60fps, but most will aim for higher frame rates over native 4K res if need be.
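As a rough sanity check on the resolution/framerate numbers in this exchange, the pixel-throughput arithmetic looks like this. It assumes cost scales linearly with shaded pixels per second, which is a simplification that ignores CPU, geometry and memory costs.

```python
# Pixel-throughput comparison: 1080p30 baseline vs native 4K60 vs 1800p60 reconstructed to 4K.
def pixels_per_second(width: int, height: int, fps: int) -> int:
    return width * height * fps

base_1080p30  = pixels_per_second(1920, 1080, 30)
native_4k60   = pixels_per_second(3840, 2160, 60)
recon_1800p60 = pixels_per_second(3200, 1800, 60)   # render 1800p, reconstruct/checkerboard up to 4K

print(f"native 4K60 vs 1080p30:  {native_4k60 / base_1080p30:.1f}x the pixels per second")    # 8.0x
print(f"1800p60 (reconstructed): {recon_1800p60 / base_1080p30:.1f}x the pixels per second")  # ~5.6x
```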

#80  Edited By Pedro
Member since 2002 • 70008 Posts

@boxrekt said:

Not me, I asked YOU why don't you use PC GPU/CPU?

You keep referencing to X and Pro for next gen NOT me. You said consoles are just PCs so why do you want to reference mid gen for next gen?

So I'm asking you why don't you use the latest PC hardware as your reference for next gen instead of older mid-gen consoles?

You don't seem to have an answer.

Because I am comparing current offerings from each company to future offerings from each company. I know, it's mind-blowing. But I think it's because you're annoyed that the next gen is not going to be that much stronger than the mid-gen. Don't be upset with me because your favorite company released a mid-gen console that undermines their next gen. :)

#81 Zaryia
Member since 2016 • 21607 Posts

So does this mean PS5 won't have to play the worst/far-worse version of a majority of titles this gen like PS4 had to?


Nice. Let's hope this is for real this time.

#82 Pedro
Member since 2002 • 70008 Posts

@zaryia said:

So does this mean PS5 won't have to play the worst/far-worse version of a majority of titles this gen like PS4 had to?

Nice. Let's hope this is for real this time.

I believe you forgot about this bad boy.

#83 JoshRMeyer
Member since 2015 • 12577 Posts

@Pedro: She's saying for what's offered. Which is stupid. Because in reality PC would have the worst version, and the best version.

#84  Edited By Pedro
Member since 2002 • 70008 Posts

@joshrmeyer said:

@Pedro: She's saying for what's offered. Which is stupid. Because in reality PC would have the worst version, and the best version.

This is true, but the PS4 is nowhere near the worst version on the home console front.

#85 slimdogmilionar
Member since 2014 • 1343 Posts

So basically about the same performance as a 2070; that's not enough for 4K 60 ultra, let alone 8K.

#86 Grey_Eyed_Elf
Member since 2011 • 7970 Posts

@scatteh316 said:
@Grey_Eyed_Elf said:

You conveniently only bullet pointed the 48 CU chip with 44 active CU's.... You missed the most likely 40 CU with 36 CU's active which is more inline with what is possible based on TDP.

The next consoles in terms of raw power will have a hard time competing against a full XT, they will more than likely land between a Pro and XT in performance and depending on how much Ray Tracing cores affect the TDP it could be worse than we all think.

I dunno on that last point dude...... an HD 7850 can't hang with PS4 these days.....

The PS4 is not a 7850; it's a 7870 with 2 CUs disabled. The 7870 has 20 CUs and the PS4 has 18 CUs, not to mention more RAM. That said, the 7850 can pretty much do 1080p medium at 30FPS, just like the PS4.

And just like the PS4, the PS5 will have more RAM than a 5700 Pro, so its longevity will be far superior. Not to mention that once games are tuned for those consoles, the consoles will have a slight advantage. But as you can see above... not much, and that GPU is weaker than a PS4, with only 2GB of RAM.

#87  Edited By ronvalencia
Member since 2008 • 29612 Posts

@slimdogmilionar said:

So basically about the same performance as a 2070; that's not enough for 4K 60 ultra, let alone 8K.

  • Real-world photo-captured, pre-baked textures lead toward the SSD hype, which is reinforced by Navi's CU design with double-width texture filtering processors.
  • Variable rate shading is another shader-resource conservation trick.

The X1X has a Variable Rate Shading (VRS)-like feature. https://www.eurogamer.net/articles/digitalfoundry-2017-the-scorpio-engine-in-depth

Andrew Goossen tells us that the GPU supports extensions that allow depth and ID buffers to be efficiently rendered at full native resolution, while colour buffers can be rendered at half resolution with full pixel shader efficiency. Based on conversations last year with Mark Cerny, there is some commonality in approach here with some of the aspects of PlayStation 4 Pro's design, but we can expect some variation in customisations - despite both working with AMD, we're reliably informed that neither Sony or Microsoft are at all aware of each other's designs before they are publicly unveiled.

The depth buffer deals with the geometry render buffer. NVIDIA describes VRS on Turing at https://developer.nvidia.com/vrworks/graphics/variablerateshading
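For what it's worth, the saving from a half-resolution colour buffer is easy to put a number on. This reads "half resolution" as half in each dimension, which is an assumption; the real saving depends on how much of the frame actually uses the reduced rate.

```python
# Sketch: depth/ID buffers stay at native 4K while the colour buffer is shaded at
# half resolution in each dimension, so colour-pass pixel-shader work drops to a quarter.
native_pixels   = 3840 * 2160
half_res_pixels = (3840 // 2) * (2160 // 2)

print(f"colour-pass shading work: {half_res_pixels / native_pixels:.2f}x of native")  # 0.25x
```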

--------

The majority of current PC games don't exploit the RTX 2070's new features such as VRS, FP16 rapid packed math (missing in gaming Pascal GPUs) and tensor cores. Tensor cores and rapid packed math on Microsoft platforms are accessed via Direct3D 12's DirectML and meta-commands.

When Turing's RPM and async compute are used, the RTX 2080 leaves the GTX 1080 Ti behind. This is done via the Vulkan API with NVIDIA and AMD API extensions (OpenGL-style vendor-specific API extensions return, LOL).

AMD's Vega GPUs with RPM and hardware async compute were beaten by NVIDIA's Turing hardware features, brought down from the server Volta GPU, which employs its own RPM and async compute hardware.

What can the RTX 2070 do with its new hardware features?

Both the NVIDIA and AMD sides used their vendor-specific extensions.

Source: https://www.guru3d.com/articles-pages/msi-geforce-rtx-2070-armor-8g-review,24.html

#88  Edited By ronvalencia
Member since 2008 • 29612 Posts

@Grey_Eyed_Elf said:

You conveniently only bullet pointed the 48 CU chip with 44 active CU's.... You missed the most likely 40 CU with 36 CU's active which is more inline with what is possible based on TDP.

The next consoles in terms of raw power will have a hard time competing against a full XT, they will more than likely land between a Pro and XT in performance and depending on how much Ray Tracing cores affect the TDP it could be worse than we all think.

AMD over-volt their PC GPUs to increase yields.

For Project Scorpio, MS developed a smart VRM with automatic under-volting per silicon quality and waited for a second-generation version of the process for improved electron-leakage maturity.

The RX 580 can consume 209 watts during PC gaming, which almost mirrors the RX 5700 XT's 225-watt TDP.

The gap between the 5700's release and Scarlett's release is similar to the gap between the RX 480's release and Scorpio's release.

On next year's TSMC 7nm+:

"N7+ has identical yield rates to N7 and will steadily improve, while also offering a 20% increase to transistor density. There's also a 10% performance uplift or 15% power efficiency increase. AMD will take advantage of the former in their fourth-gen Ryzen which they've confirmed to use TSMC's 7nm+" (cite ref 1)

References

1. https://www.techspot.com/news/80237-tsmc-7nm-production-improves-performance-10.html

#89  Edited By ronvalencia
Member since 2008 • 29612 Posts

@Shewgenja said:

I'd just like to point out that dev kits come with a target performance point. People are making wild assumptions about whether or not the alpha/beta dev kits are at all reflective of this from a hardware standpoint.

FALSE. The E3 2005 Gears of War 1 demo on GeForce 6800 Ultra SLI or Radeon X800 CrossFire Power Macs didn't reflect the final Xbox 360's GPU, which is more capable than the E3 2005 Power Mac dev kits.

#90 Pedro
Member since 2002 • 70008 Posts

@ronvalencia said:

FALSE. The E3 2005 Gears of War 1 demo on GeForce 6800 Ultra SLI or Radeon X800 CrossFire Power Macs didn't reflect the final Xbox 360's GPU, which is more capable than the E3 2005 Power Mac dev kits.

Xbox One X devkit is beefier than the Xbox One X. :|

#91  Edited By ronvalencia
Member since 2008 • 29612 Posts

@Pedro said:
@ronvalencia said:

FALSE. The E3 2005 Gears of War 1 demo on GeForce 6800 Ultra SLI or Radeon X800 CrossFire Power Macs didn't reflect the final Xbox 360's GPU, which is more capable than the E3 2005 Power Mac dev kits.

Xbox One X devkit is beefier than the Xbox One X. :|

The Xbox One X's first silicon arrived in December 2016.

The actual Xbox One X dev kit went out in March 2017.

The Xbox One X dev kit APU runs on the same silicon as the retail Xbox One X's APU, but it acts like R9 390X (44 CU) silicon quality instead of R9 390 Pro (40 CU) silicon quality.

#92 Pedro
Member since 2002 • 70008 Posts

@ronvalencia: And, like every other useless response you share, none of that drivel changes anything in my comment. Good job.

#93 ronvalencia
Member since 2008 • 29612 Posts

@Pedro said:

@ronvalencia: And, like every other useless response you share, non of that drivel changes anything in my comment. Good job.

Your argument is useless. Who are you? You're not Microsoft.

#94 Pedro
Member since 2002 • 70008 Posts

@ronvalencia said:

Your argument is useless. Who are you? You're not Microsoft.

There was no argument, just facts. You seem to be allergic to it.

#95 ronvalencia
Member since 2008 • 29612 Posts

@Pedro said:
@ronvalencia said:

Your argument is useless. Who are you? You're not Microsoft.

There was no argument, just facts. You seem to be allergic to it.

You attempted to make an argument. I gave you the context and the timeline for the real Xbox Scorpio silicon.

There was no argument, just facts. You seem to be allergic to it.

#96 Pedro
Member since 2002 • 70008 Posts

@ronvalencia said:

You attempted to make an argument. I gave you the context and the timeline for the real Xbox Scorpio silicon.

There was no argument, just facts. You seem to be allergic to it.

The struggle is real. Now you have become incoherent. Facts seem to do that to people who don't know what they are talking about. BTW, how is the fact that the Xbox One X dev kit is stronger than the retail system working out for you? LOL

#97  Edited By ronvalencia
Member since 2008 • 29612 Posts

@Pedro said:
@ronvalencia said:

You attempted to make an argument. I have you the context for real Xbox Scorpio silicon's time line.

There was no argument, just facts. You seem to be allergic to it.

The struggle is real. Now you have become incoherent. Facts seem to do that to people who don't know what they are talking about. BTW, how is the fact that the Xbox One X dev kit is stronger than the retail system working out for you? LOL

The real struggle comes from you, since you omitted the context and timeline for the real X1X dev kit.

#98 Pedro
Member since 2002 • 70008 Posts

@ronvalencia: You really hate being wrong, don't you? Still waiting for the fact that the Xbox One X devkit is stronger to change.

#99 Tessellation
Member since 2009 • 9297 Posts

LOL, the crying has already started... the old fart thinks he knows more than the Digital Foundry guys... this poor old man is in love with plastic consoles.

#100 ronvalencia
Member since 2008 • 29612 Posts

@Pedro said:

@ronvalencia: You really hate being wrong, don't you? Still waiting for the fact that the Xbox One X devkit is stronger to change.

You're not MS.