If my GTX 970 can run Destiny 2 at 1080p @60fps, then the X1X should be able to as well.

#1 lifelessablaze
Member since 2017 • 1066 Posts

The X1X should run it at at least 1440p @60fps. Bungie is full of crap and there's nothing they can say for anyone to take them seriously. They should just be truthful about it and say it's 100% a parity issue that they locked it to 30 frames. There should be 2 options for X1X users: 1440p 60fps or 4K 30fps. Since leaving MS they've been a massive joke in the gaming industry, and it looks like they're continuing that with their second game.

#2 Jak42
Member since 2016 • 1093 Posts

I'm pretty sure the D2 PC beta is an old build. Like every other beta before it.

#3 Chutebox
Member since 2007 • 50666 Posts

ok

#4 Ryu_Silveira
Member since 2017 • 167 Posts

More and more Xbox One X threads! Keep em coming!

#5  Edited By lifelessablaze
Member since 2017 • 1066 Posts

@ryu_silveira said:

More and more Xbox One X threads! Keep em coming!

You're welcome homie. Be sure to click 'follow.'

#6 Ant_17
Member since 2005 • 13634 Posts

lol, sure, whatever helps you justify the Bone.

#8 Archangel3371
Member since 2004 • 44547 Posts

What are you doing here then? Go to Bungie and set them straight. Quick. Go! Go!

#9 QuadKnight
Member since 2015 • 12916 Posts

The shitty Jaguar CPU is the problem. It's what I've been trying to tell clowns like Wrongvalencia since they announced the official XboneX specs. The XboneX is still bottlenecked by the shitty Jaguar CPU; no amount of optimization will make up for the garbage CPU in the XboneX. If you want 60fps at all times, stick to PC. Destiny 2 won't run at 60fps on X1X even if they drop it down to 1080p; the CPU is that shitty, same with the PS4 Pro.

#10 appariti0n
Member since 2009 • 5014 Posts

@lifelessablaze said:

The X1X should run it at at least 1440p @60fps. Bungie is full of crap and there's nothing they can say for anyone to take them seriously. They should just be truthful about it and say it's 100% a parity issue that they locked it to 30 frames. There should be 2 options for X1X users: 1440p 60fps or 4K 30fps. Since leaving MS they've been a massive joke in the gaming industry, and it looks like they're continuing that with their second game.

what CPU do you have paired with your 970? Makes a big difference.

#11 lifelessablaze
Member since 2017 • 1066 Posts

@appariti0n said:
@lifelessablaze said:

The X1X should run it at at least 1440p @60fps. Bungie is full of crap and there's nothing they can say for anyone to take them seriously. They should just be truthful about it and say it's 100% a parity issue that they locked it to 30 frames. There should be 2 options for X1X users: 1440p 60fps or 4K 30fps. Since leaving MS they've been a massive joke in the gaming industry, and it looks like they're continuing that with their second game.

what CPU do you have paired with your 970? Makes a big difference.

an old ass i5 4750 which was a mid tier CPU even when it was new back in 2014

#12 appariti0n
Member since 2009 • 5014 Posts

@lifelessablaze said:
@appariti0n said:
@lifelessablaze said:

The X1X should run it at at least 1440p @60fps. Bungie is full of crap and there's nothing they can say for anyone to take them seriously. They should just be truthful about it and say it's 100% a parity issue that they locked it to 30 frames. There should be 2 options for X1X users: 1440p 60fps or 4K 30fps. Since leaving MS they've been a massive joke in the gaming industry, and it looks like they're continuing that with their second game.

what CPU do you have paired with your 970? Makes a big difference.

an old ass i5 4750 which was a mid tier CPU even when it was new back in 2014

Yeah..... it's still better than what's in the X1X lol.

#13 lifelessablaze
Member since 2017 • 1066 Posts

@appariti0n said:
@lifelessablaze said:
@appariti0n said:
@lifelessablaze said:

The X1X should run it at at least 1440p @60fps. Bungie is full of crap and there's nothing they can say for anyone to take them seriously. They should just be truthful about it and say it's 100% a parity issue that they locked it to 30 frames. There should be 2 options for X1X users: 1440p 60fps or 4K 30fps. Since leaving MS they've been a massive joke in the gaming industry, and it looks like they're continuing that with their second game.

what CPU do you have paired with your 970? Makes a big difference.

an old ass i5 4750 which was a mid tier CPU even when it was new back in 2014

Yeah..... it's still better than what's in the X1X lol.

lmao yeah right.

#14 Wasdie  Moderator
Member since 2003 • 53622 Posts

Can the GPU in the X1X perform to the same level as a GTX 970?

#15 lifelessablaze
Member since 2017 • 1066 Posts

@Wasdie said:

Can the GPU in the X1X perform to the same level as a GTX 970?

lol the GTX 970 is trash compared to what the X1X is packing.

#16 Wasdie  Moderator
Member since 2003 • 53622 Posts

@lifelessablaze said:
@Wasdie said:

Can the GPU in the X1X perform to the same level as a GTX 970?

lol the GTX 970 is trash compared to what the X1X is packing.

I think you're overestimating a bit. A 100 MHz bump in clock speed over the GTX 970 base (which nobody really owns; everybody has slightly higher-clocked versions) isn't going to be as big of a performance bump as you think. This is assuming the Xbox One X's GPU isn't CPU-limited, which it's sounding like it may be.

It's looking like the X1X is roughly on par with a GTX 980, not much more powerful than a GTX 970.

#18  Edited By ellos
Member since 2015 • 2532 Posts

@Wasdie: Are you kidding me? It can easily. As Bungie have said, the CPU is the issue. I feel that the Pro and especially the X1X can just about do it. The concern is probably the base consoles' CPUs, and their user base is the main focus. They're holding back these premium consoles. We shall see how the CPU benchmarks stack up soon enough.

#19  Edited By Wasdie  Moderator
Member since 2003 • 53622 Posts

Yeah, I just did a quick Google search and it looks like the GPU in the Xbox One X is comparable to that of a Radeon R9 390X, which has about a 100 MHz slower clock speed and 44 compute units compared to the Xbox One X's 40.

The Radeon R9 390X benchmarks a bit lower than the GTX 970, but if you bump the R9 390X's clock speed by 100 MHz it probably makes up the difference.

So I think it's safe to say the Xbox One X's GPU will be roughly similar to the GTX 970.
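
For a rough sense of the arithmetic behind that comparison, here's a sketch using the usual GCN estimate (64 shaders per CU, 2 FLOPs per shader per clock), with assumed clocks of roughly 1050 MHz for the R9 390X and 1172 MHz for the X1X GPU; the clocks are approximations, not quoted from anyone in this thread.

    # Rough theoretical FP32 throughput for a GCN-style GPU:
    # TFLOPS = compute units x 64 shaders/CU x 2 FLOPs per shader per clock x clock (MHz) / 1e6
    def gcn_tflops(compute_units, clock_mhz):
        return compute_units * 64 * 2 * clock_mhz / 1e6

    # Assumed, approximate clocks; not official spec-sheet figures.
    print(f"R9 390X (44 CUs @ ~1050 MHz): {gcn_tflops(44, 1050):.1f} TFLOPS")  # ~5.9
    print(f"X1X GPU (40 CUs @ ~1172 MHz): {gcn_tflops(40, 1172):.1f} TFLOPS")  # ~6.0

Which lines up with the point above: the roughly 100 MHz clock bump more or less cancels out the four missing compute units.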

#20 lifelessablaze
Member since 2017 • 1066 Posts

@Wasdie said:
@lifelessablaze said:
@Wasdie said:

Can the GPU in the X1X perform to the same level as a GTX 970?

lol the GTX 970 is trash compared to what the X1X is packing.

I think you're overestimating a bit. A 100 MHz bump in clock speed over the GTX 970 base (which nobody really owns; everybody has slightly higher-clocked versions) isn't going to be as big of a performance bump as you think. This is assuming the Xbox One X's GPU isn't CPU-limited, which it's sounding like it may be.

It's looking like the X1X is roughly on par with a GTX 980, not much more powerful than a GTX 970.

Go watch the DF analysis of the PC version. They basically say even a toaster can run D2 at at least 30 fps. Also, the X1X is packing something closer to a GTX 1070.

#21  Edited By Wasdie  Moderator
Member since 2003 • 53622 Posts

@lifelessablaze: The specs don't line up with that. It looks like the card is going to perform at around the same level as a GTX 970, not a 1070.

I know Destiny 2 can be run on basically a toaster, but when you start running at 1440p or 4K, things change.

I have no doubt in my mind that Destiny 2 could run at 1080p60 on the Xbox One X. I'm just questioning your assertion that the GPU is more powerful than a single GTX 970. I don't believe that's the case. They are similar and the Xbox One X's GPU may outperform a GTX 970 by a little, but not by much.

I think you really need to revise your expectations for the Xbox One X. You're not going to be buying a $500 console that competes with $500 GPUs. It's just not going to happen.

#22 appariti0n
Member since 2009 • 5014 Posts

@lifelessablaze said:
@appariti0n said:
@lifelessablaze said:
@appariti0n said:
@lifelessablaze said:

The X1X should run it at at least 1440p @60fps. Bungie is full of crap and there's nothing they can say for anyone to take them seriously. They should just be truthful about it and say it's 100% a parity issue that they locked it to 30 frames. There should be 2 options for X1X users: 1440p 60fps or 4K 30fps. Since leaving MS they've been a massive joke in the gaming industry, and it looks like they're continuing that with their second game.

what CPU do you have paired with your 970? Makes a big difference.

an old ass i5 4750 which was a mid tier CPU even when it was new back in 2014

Yeah..... it's still better than what's in the X1X lol.

lmao yeah right.

The CPU in the X1X is only clocked at 2.3 GHz, and has considerably lower instructions per clock than your i5 at 3.2 GHz.
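
As a back-of-the-envelope illustration of the IPC-times-clock point, a rough sketch; the IPC ratio below is an assumed, illustrative figure, not a measurement.

    # Very rough single-thread comparison: performance ~ IPC x clock.
    # The IPC values are illustrative assumptions, not measured numbers.
    def relative_perf(ipc_a, clock_a_ghz, ipc_b, clock_b_ghz):
        return (ipc_a * clock_a_ghz) / (ipc_b * clock_b_ghz)

    # Assume a Haswell i5 retires roughly twice as many instructions per clock as Jaguar.
    print(f"~{relative_perf(2.0, 3.2, 1.0, 2.3):.1f}x")  # roughly 2.8x per core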

#23 MonsieurX
Member since 2008 • 39858 Posts

But your 970 alone was about $330 when it launched; why would a whole system outperform it for $100 more?

#25  Edited By Wasdie  Moderator
Member since 2003 • 53622 Posts

@MonsieurX said:

But your 970 alone was about $330 when it launched; why would a whole system outperform it for $100 more?

The 970s were more than that when they launched. More like $400-450.

#26 MonsieurX
Member since 2008 • 39858 Posts

@Wasdie said:
@MonsieurX said:

But your 970 alone was about $330 when it launched; why would a whole system outperform it for $100 more?

The 970s were more than that when they launched. More like $400-450.

http://www.anandtech.com/show/8526/nvidia-geforce-gtx-980-review/7

Model               | Launch             | Release price (USD)
GeForce GTX 970     | September 18, 2014 | $329
GeForce GTX 980     | September 18, 2014 | $549
GeForce GTX 980 Ti  | June 2, 2015       | $649

#27 Wasdie  Moderator
Member since 2003 • 53622 Posts

@MonsieurX: huh

Damn, Nvidia, what happened with GTX 1070 prices?

#28 Juub1990
Member since 2013 • 12620 Posts

@Wasdie: The X1X runs Rise of the Tomb Raider in 4K/30fps and has settings comparable to the Very High (highest) preset on PC. A 970 can't do that.

#29  Edited By madrocketeer
Member since 2005 • 10591 Posts

Actually, the X1X's GPU is a souped-up Radeon RX 580, with more shaders, more TMUs and a slightly higher clock. It is therefore about on par with or slightly better than the GTX 970.

The issue definitely seems to be the CPU. They're tiny and low-clocked. From what I can gather, the PS4's SoC consists of two quad-core Jaguars, each one about 3.1 mm². A Haswell i5 is 177 mm² (though this admittedly includes the integrated GPU), and clocked at at least 2.7 GHz. Even the X1X's Jaguar is clocked at only 2.3 GHz.

#30 loe12k
Member since 2013 • 3465 Posts

The only reason it will be 30fps is parity. The Xbox One X can do 60fps. Sony does not want their console to have the inferior version.

#31 NFJSupreme
Member since 2005 • 6605 Posts

CPU bottleneck

#32 EndofAugust
Member since 2017 • 812 Posts

@Juub1990 said:

@Wasdie: The X1X runs Rise of the Tomb Raider in 4K/30fps and has settings comparable to the Very High (highest) preset on PC. A 970 can't do that.

The performance of this GPU places it between that of a GTX 980 and a 980 Ti, so very much 390X territory. I don't know why this dude keeps bleating on about the 970...

#33 Shewgenja
Member since 2009 • 21456 Posts

CPU bound game is CPU bound.

#34 JoshRMeyer
Member since 2015 • 12577 Posts

For all the people complaining: complain about the dumb mid-gen upgrades. We all knew they have to satisfy their largest base first, and parity is a must, at least in multiplayer, which this whole game is. (Even the story mode has other players around.)

#35 Nick3306
Member since 2007 • 3429 Posts

@Juub1990 said:

@Wasdie: The X1X runs Rise of the Tomb Raider in 4K/30fps and has settings comparable to the Very High (highest) preset on PC. A 970 can't do that.

The joys of optimization, my friend.

#36 Juub1990
Member since 2013 • 12620 Posts

@Nick3306: Not really. A 1070 can also do that and is reportedly marginally faster than the X1X GPU.

#37 deactivated-642321fb121ca
Member since 2013 • 7142 Posts

The CPU is weaker than an entry-level i3.

It really is a potato box.

Consoles only sell to consumers without tech knowledge, who succumb to buzzwords and think this is a monster piece of tech.

#38 ronvalencia
Member since 2008 • 29612 Posts

@lifelessablaze said:

The X1X should run it at at least 1440p @60fps. Bungie is full of crap and there's nothing they can say for anyone to take them seriously. They should just be truthful about it and say it's 100% a parity issue that they locked it to 30 frames. There should be 2 options for X1X users: 1440p 60fps or 4K 30fps. Since leaving MS they've been a massive joke in the gaming industry, and it looks like they're continuing that with their second game.

X1X's version is geared for Digital Foundry's XBO resolution gate.

For multiplayer without a dedicated server, the problem is that you can't mix machines running 30 Hz and 60 Hz game-simulation update rates, hence the slowest machine dictates the lowest common denominator.
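
A minimal sketch of that lowest-common-denominator rule (the helper name is hypothetical): in a peer-hosted session every machine has to step the shared simulation at the same rate, so the session adopts the highest rate that all peers can sustain.

    # Hypothetical illustration of the lowest-common-denominator rule for
    # peer-hosted play: the shared simulation rate is capped by the slowest peer.
    def negotiate_tick_rate(peer_max_rates_hz):
        return min(peer_max_rates_hz)

    # One base console capped at 30 Hz drags 60 Hz-capable machines down to 30 Hz.
    print(negotiate_tick_rate([60, 60, 30]))  # -> 30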

#39 04dcarraher
Member since 2004 • 23832 Posts

@Wasdie said:

@MonsieurX: huh

Damn, Nvidia, what happened with GTX 1070 prices?

The mining craze is what happened. Once all the AMD GPUs flew off the shelves, they had no choice but to grab GTX 1060s and 1070s to supplement.

#40  Edited By ronvalencia
Member since 2008 • 29612 Posts

@madrocketeer said:

Actually, the X1X's GPU is a souped-up Radeon RX 580, with more shaders, more TMUs and a slightly higher clock. It is therefore about on par with or slightly better than the GTX 970.

The issue definitely seems to be the CPU. They're tiny and low-clocked. From what I can gather, the PS4's SoC consists of two quad-core Jaguars, each one about 3.1 mm². A Haswell i5 is 177 mm² (though this admittedly includes the integrated GPU), and clocked at at least 2.7 GHz. Even the X1X's Jaguar is clocked at only 2.3 GHz.

FALSE,

The following diagram is Polaris 10 for RX-480 which is the same for RX-580

Notice RX-480/RX-580 lacking updates to both Rasterizer and Render Backend(RBE). X1X's graphics pipeline was updated e.g. Rasterizer has conservation occlusion and RBE has 2MB render cache updates. There's 60 graphics pipeline changes for X1X.

Most of Polaris updates are for compute shaders and associated L2 cache e.g. 2MB from 1MB.

From http://www.anandtech.com/show/11740/hot-chips-microsoft-xbox-one-x-scorpio-engine-live-blog-930am-pt-430pm-utc#post0821123606

12:36PM EDT - 8x 256KB render caches

12:37PM EDT - 2MB L2 cache with bypass and index buffer access

12:38PM EDT - out of order rasterization, 1MB parameter cache, delta color compression, depth compression, compressed texture access

X1X's GPU's Render Back-Ends (RBEs) have 256KB of cache each and there are 8 of them, hence 2 MB of render cache.

X1X's GPU has 2 MB L2 cache, 1 MB parameter cache and 2MB render cache. That's 5 MB of cache.

https://www.amd.com/Documents/GCN_Architecture_whitepaper.pdf

The old GCN's render back end (RBE) cache. Page 13 of 18.

Once the pixels fragments in a tile have been shaded, they flow to the Render Back-Ends (RBEs). The RBEs apply depth, stencil and alpha tests to determine whether pixel fragments are visible in the final frame. The visible pixels fragments are then sampled for coverage and color to construct the final output pixels. The RBEs in GCN can access up to 8 color samples (i.e. 8x MSAA) from the 16KB color caches and 16 coverage samples (i.e. for up to 16x EQAA) from the 4KB depth caches per pixel. The color samples are blended using weights determined by the coverage samples to generate a final anti-aliased pixel color. The results are written out to the frame buffer, through the memory controllers

GCN version 1.0's RBE cache size is just 20 KB. 8x RBE = 160 KB render cache (for 7970)

AMD R9-290X/R9-390X's aging RBE/ROPS comparison. https://www.slideshare.net/DevCentralAMD/gs4106-the-amd-gcn-architecture-a-crash-course-by-layla-mah

16 RBEs, with each RBE containing 4 ROPS. Each RBE has 24 bytes of cache.

24 bytes x 16 = 384 bytes.

X1X's RBE/ROPS has 2048 KB or 2 MB render cache.

X1X's RBE has 256 KB. 8x RBE = 2048 KB (or 2 MB) render cache. X1X can hold more rendering data on the chip when compared to the Radeon HD 7970 and R9-290X/R9-390X.

http://www.eurogamer.net/articles/digitalfoundry-2017-project-scorpio-tech-revealed

"We quadrupled the GPU L2 cache size, again for targeting the 4K performance."

X1X GPU's 2MB L2 cache can be used for rendering in addition to X1X's 2 MB render cache.

When X1X's L2 cache (for TMUs) and render cache (for ROPS) are combined, the total cache size is 4MB, which is similar to Vega's shared 4MB L2 cache for RBE/ROPS and TMUs.
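
Tallying the cache figures quoted above in one place (sizes as stated in the linked sources):

    # Quick tally of the cache sizes quoted above (in KB).
    x1x_render_cache_kb  = 8 * 256    # 8 RBEs x 256 KB = 2048 KB (2 MB)
    x1x_l2_cache_kb      = 2 * 1024   # 2 MB GPU L2
    gcn1_render_cache_kb = 8 * 20     # 8 RBEs x (16 KB color + 4 KB depth) = 160 KB (HD 7970)

    print(x1x_render_cache_kb)                    # 2048 KB render cache
    print(x1x_render_cache_kb + x1x_l2_cache_kb)  # 4096 KB = 4 MB combined, Vega-like
    print(gcn1_render_cache_kb)                   # 160 KB on GCN 1.0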

#41 EndofAugust
Member since 2017 • 812 Posts

@ronvalencia said:
@madrocketeer said:

Actually, the X1X's GPU is a souped-up Radeon RX 580, with more shaders, more TMUs and a slightly higher clock. It is therefore about on par with or slightly better than the GTX 970.

The issue definitely seems to be the CPU. They're tiny and low-clocked. From what I can gather, the PS4's SoC consists of two quad-core Jaguars, each one about 3.1 mm². A Haswell i5 is 177 mm² (though this admittedly includes the integrated GPU), and clocked at at least 2.7 GHz. Even the X1X's Jaguar is clocked at only 2.3 GHz.

FALSE,

The following diagram is Polaris 10 for RX-480 which is the same for RX-580

Notice RX-480/RX-580 lacking updates to both Rasterizer and Render Backend(RBE). X1X's graphics pipeline was updated e.g. Rasterizer has conservation occlusion and RBE has 2MB render cache updates. There's 60 graphics pipeline changes for X1X.

Most of Polaris updates are for compute shaders and associated L2 cache e.g. 2MB from 1MB.

From http://www.anandtech.com/show/11740/hot-chips-microsoft-xbox-one-x-scorpio-engine-live-blog-930am-pt-430pm-utc#post0821123606

12:36PM EDT - 8x 256KB render caches

12:37PM EDT - 2MB L2 cache with bypass and index buffer access

12:38PM EDT - out of order rasterization, 1MB parameter cache, delta color compression, depth compression, compressed texture access

X1X's GPU's Render Back-Ends (RBEs) have 256KB of cache each and there are 8 of them, hence 2 MB of render cache.

X1X's GPU has 2 MB L2 cache, 1 MB parameter cache and 2MB render cache. That's 5 MB of cache.

https://www.amd.com/Documents/GCN_Architecture_whitepaper.pdf

The old GCN's render back end (RBE) cache. Page 13 of 18.

Once the pixels fragments in a tile have been shaded, they flow to the Render Back-Ends (RBEs). The RBEs apply depth, stencil and alpha tests to determine whether pixel fragments are visible in the final frame. The visible pixels fragments are then sampled for coverage and color to construct the final output pixels. The RBEs in GCN can access up to 8 color samples (i.e. 8x MSAA) from the 16KB color caches and 16 coverage samples (i.e. for up to 16x EQAA) from the 4KB depth caches per pixel. The color samples are blended using weights determined by the coverage samples to generate a final anti-aliased pixel color. The results are written out to the frame buffer, through the memory controllers

GCN version 1.0's RBE cache size is just 20 KB. 8x RBE = 160 KB render cache (for 7970)

AMD R9-290X/R9-390X's aging RBE/ROPS comparison. https://www.slideshare.net/DevCentralAMD/gs4106-the-amd-gcn-architecture-a-crash-course-by-layla-mah

16 RBEs, with each RBE containing 4 ROPS. Each RBE has 24 bytes of cache.

24 bytes x 16 = 384 bytes.

X1X's RBE/ROPS has 2048 KB or 2 MB render cache.

X1X's RBE has 256 KB. 8x RBE = 2048 KB (or 2 MB) render cache. X1X can hold more rendering data on the chip when compared to the Radeon HD 7970 and R9-290X/R9-390X.

http://www.eurogamer.net/articles/digitalfoundry-2017-project-scorpio-tech-revealed

"We quadrupled the GPU L2 cache size, again for targeting the 4K performance."

X1X GPU's 2MB L2 cache can be used for rendering in addition to X1X's 2 MB render cache.

When X1X's L2 cache (for TMUs) and render cache (for ROPS) are combined, the total cache size is 4MB, which is similar to Vega's shared 4MB L2 cache for RBE/ROPS and TMUs.

This actually made sense as a reply, congrats.

#42  Edited By ronvalencia
Member since 2008 • 29612 Posts

@endofaugust said:

This actually made sense as a reply, congrats.

It's the main reason for ForzaTech's X1X performance boost, which increases GCN's pixel shader and RBE performance, i.e. pixel shader and RBE math operations have access to the 2MB render cache just like the GTX 1070's version. It has some benefits for Unreal Engine 4's deferred renderer.

X1X's graphics pipeline improvements may not have any benefit for games already biased towards the compute shader and TMU path, i.e. only X1X's effective bandwidth increase helps there.

GTX 1070's higher tessellation rate still stands as long as it's not gimped by heavy memory bandwidth workloads. This is not a major issue for X1X since its geometry workload would be budgeted.

A GPU is only as strong as its weakest point.

Polaris's update is half-baked, i.e. missing major graphics pipeline updates.

#43  Edited By madrocketeer
Member since 2005 • 10591 Posts

@endofaugust said:
@ronvalencia said:

(Usual graphs and diagrams overload)

This actually made sense as a reply, congrats.

Yeah, but does he really have to do it in the form of a 4-browser-page graphs-and-diagrams ocular assault? I mean, I did read all of that, and I was able to sum it up as: "You're wrong. The X1X's GPU has an updated graphics pipeline, the rasterizer has conservation occlusion and the render backends have 2 MB of cache, compared to GCN 1.0's 160 KB, among other improvements. Combine that with the GPU's 2 MB L2 cache, and the X1X has 4 MB of total cache, which is more similar to Vega." Simple as that. Throw in a few links as sources and - Bam! - point made.

If he had posted that instead, I would have just said well shave my backside and call me "Served," because I stand corrected. Still, my point remains: these consoles are ridiculously GPU-heavy, and that's where their limitations lie. I was even sort of praising the X1X's GPU in my original post.

But no, I had to sift through all of that mess to see what he was saying. He even repeated a few of his points for... ...some reason? This is why I typically don't bother to reply to ronvalencia's posts any more. Too tired of reading them.

#44  Edited By AzatiS
Member since 2004 • 14969 Posts

@quadknight said:

The shitty Jaguar CPU is the problem. It's what I've been trying to tell clowns like Wrongvalencia since they announced the official XboneX specs. The XboneX is still bottlenecked by the shitty Jaguar CPU; no amount of optimization will make up for the garbage CPU in the XboneX. If you want 60fps at all times, stick to PC. Destiny 2 won't run at 60fps on X1X even if they drop it down to 1080p; the CPU is that shitty, same with the PS4 Pro.

That's for real now?!

You're telling me the X1X, the $500 most super duper mega ultimate super ultra bomb power console in the world... can't play Destiny 2 at 1080p/60fps? Roooofl!

Is this a joke? When did I miss that?

#45  Edited By EndofAugust
Member since 2017 • 812 Posts

@AzatiS said:
@quadknight said:

The shitty Jaguar CPU is the problem. It's what I've been trying to tell clowns like Wrongvalencia since they announced the official XboneX specs. The XboneX is still bottlenecked by the shitty Jaguar CPU; no amount of optimization will make up for the garbage CPU in the XboneX. If you want 60fps at all times, stick to PC. Destiny 2 won't run at 60fps on X1X even if they drop it down to 1080p; the CPU is that shitty, same with the PS4 Pro.

That's for real now?!

You're telling me the X1X, the $500 most super duper mega ultimate super ultra bomb power console in the world... can't play Destiny 2 at 1080p/60fps? Roooofl!

Is this a joke? When did I miss that?

No one in any official capacity anywhere has ever stated that. This seems to be a recurring trend surrounding this console: people literally making things up and peddling it as fact...

#46  Edited By AzatiS
Member since 2004 • 14969 Posts

@endofaugust said:
@AzatiS said:
@quadknight said:

The shitty Jaguar CPU is the problem. It's what I've been trying to tell clowns like Wrongvalencia since they announced the official XboneX specs. The XboneX is still bottlenecked by the shitty Jaguar CPU; no amount of optimization will make up for the garbage CPU in the XboneX. If you want 60fps at all times, stick to PC. Destiny 2 won't run at 60fps on X1X even if they drop it down to 1080p; the CPU is that shitty, same with the PS4 Pro.

That's for real now?!

You're telling me the X1X, the $500 most super duper mega ultimate super ultra bomb power console in the world... can't play Destiny 2 at 1080p/60fps? Roooofl!

Is this a joke? When did I miss that?

No one in any official capacity anywhere has ever stated that. This seems to be a recurring trend surrounding this console: people literally making things up and peddling it as fact...

No, really now, won't the new console be able to run Destiny 2, which is pretty well optimized (I played the PC open beta, the game is light!!), at 60fps at 1080p? I mean... that's the worst news I've heard in a while. So the Pro won't either!!

So what, people will buy Destiny 2, an FPS that will run at 30fps at 1080p on the new consoles? 30fps?! Man, I say no one should even touch this shit if it's true. Go for the PC version or bypass this shit altogether.

#47  Edited By EndofAugust
Member since 2017 • 812 Posts

@AzatiS said:
@endofaugust said:
@AzatiS said:
@quadknight said:

The shitty Jaguar CPU is the problem. It's what I've been trying to tell clowns like Wrongvalencia since they announced the official XboneX specs. The XboneX is still bottlenecked by the shitty Jaguar CPU; no amount of optimization will make up for the garbage CPU in the XboneX. If you want 60fps at all times, stick to PC. Destiny 2 won't run at 60fps on X1X even if they drop it down to 1080p; the CPU is that shitty, same with the PS4 Pro.

That's for real now?!

You're telling me the X1X, the $500 most super duper mega ultimate super ultra bomb power console in the world... can't play Destiny 2 at 1080p/60fps? Roooofl!

Is this a joke? When did I miss that?

No one in any official capacity anywhere has ever stated that. This seems to be a recurring trend surrounding this console: people literally making things up and peddling it as fact...

No, really now, won't the new console be able to run Destiny 2, which is pretty well optimized (I played the PC open beta, the game is light!!), at 60fps at 1080p? I mean... that's the worst news I've heard in a while. So the Pro won't either!!

So what, people will buy Destiny 2, an FPS that will run at 30fps at 1080p on the new consoles? 30fps?! Man, I say no one should even touch this shit if it's true. Go for the PC version or bypass this shit altogether.

The game isn't 1080p on the Pro and Xbox One X; it's checkerboard-rendered on the Pro (1920x2160) and will likely be native 4K on the Xbox One X.
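
For scale, the per-frame pixel counts those targets imply (simple arithmetic, nothing more):

    # Pixels per frame for the resolutions mentioned above.
    full_hd   = 1920 * 1080   # ~2.07M (1080p)
    pro_cb    = 1920 * 2160   # ~4.15M (Pro's checkerboard target, half of native 4K)
    native_4k = 3840 * 2160   # ~8.29M (likely X1X target)

    for name, px in [("1080p", full_hd), ("Pro checkerboard", pro_cb), ("Native 4K", native_4k)]:
        print(f"{name}: {px / 1e6:.2f} Mpixels")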

#48  Edited By ronvalencia
Member since 2008 • 29612 Posts

@madrocketeer said:
@endofaugust said:
@ronvalencia said:

(Usual graphs and diagrams overload)

This actually made sense as a reply, congrats.

Yeah, but does he really have to do it in the form of a 4-browser-page graphs-and-diagrams ocular assault? I mean, I did read all of that, and I was able to sum it up as: "You're wrong. The X1X's GPU has an updated graphics pipeline, the rasterizer has conservation occlusion and the render backends have 2 MB of cache, compared to GCN 1.0's 160 KB, among other improvements. Combine that with the GPU's 2 MB L2 cache, and the X1X has 4 MB of total cache, which is more similar to Vega." Simple as that. Throw in a few links as sources and - Bam! - point made.

If he had posted that instead, I would have just said well shave my backside and call me "Served," because I stand corrected. Still, my point remains: these consoles are ridiculously GPU-heavy, and that's where their limitations lie. I was even sort of praising the X1X's GPU in my original post.

But no, I had to sift through all of that mess to see what he was saying. He even repeated a few of his points for... ...some reason? This is why I typically don't bother to reply to ronvalencia's posts any more. Too tired of reading them.

X1X was mostly designed for Digital Foundry's XBO resolution gate, hence it is GPU-heavy. X1X's CPU quad-core layout follows the Ryzen CCX layout instead of a flat layout.

X1X's CPU has some improvements over stock Jaguar.

https://en.wikipedia.org/wiki/Translation_lookaside_buffer

A Translation lookaside buffer (TLB) is a memory cache that is used to reduce the time taken to access a user memory location

The TLB is sometimes implemented as content-addressable memory (CAM). The CAM search key is the virtual address and the search result is a physical address. If the requested address is present in the TLB, the CAM search yields a match quickly and the retrieved physical address can be used to access memory. This is called a TLB hit. If the requested address is not in the TLB, it is a miss, and the translation proceeds by looking up the page table in a process called a page walk. The page walk is time consuming when compared to the processor speed, as it involves reading the contents of multiple memory locations and using them to compute the physical address. After the physical address is determined by the page walk, the virtual address to physical address mapping is entered into the TLB

Performance implications

The CPU has to access main memory for an instruction cache miss, data cache miss, or TLB miss. The third case (the simplest one) is where the desired information itself actually is in a cache, but the information for virtual-to-physical translation is not in a TLB. These are all slow, due to the need to access a slower level of the memory hierarchy, so a well-functioning TLB is important. Indeed, a TLB miss can be more expensive than an instruction or data cache miss, due to the need for not just a load from main memory, but a page walk, requiring several memory accesses.
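
The quoted text boils down to the standard effective-access-time formula; the latencies below are illustrative assumptions, not measured figures for any of these chips.

    # Effective memory access time with a TLB, as described above.
    # All latencies (ns) are illustrative assumptions.
    def effective_access_ns(tlb_hit_rate, tlb_ns=1.0, mem_ns=100.0, page_walk_ns=300.0):
        hit  = tlb_hit_rate * (tlb_ns + mem_ns)
        miss = (1.0 - tlb_hit_rate) * (tlb_ns + page_walk_ns + mem_ns)
        return hit + miss

    print(f"{effective_access_ns(0.99):.0f} ns at a 99% TLB hit rate")  # ~104 ns
    print(f"{effective_access_ns(0.90):.0f} ns at a 90% TLB hit rate")  # ~131 ns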

#49  Edited By svaubel
Member since 2005 • 4571 Posts

Well, both the PS4 Pro and X1X have shit CPUs that bottleneck the GPU. So there you go.

You would have thought that for console upgrades they would have invested in better processors that don't hold back everything else.

#50  Edited By daredevils2k
Member since 2015 • 5001 Posts

@loe12k said:

The only reason it will be 30fps is parity. The Xbox One X can do 60fps. Sony does not want their console to have the inferior version.

LOL, but not at 4K, and if it was 4K, it would be at the lowest settings. If they did a 1440p version it would most likely be at medium settings. Lemmings and their dreams of being equal to the master race lol.