These threads are...
You need more than 12TF to claim 3x the Pro, and 18TF to claim 3x the Xbox One X. The juggling of words to convince people that these machines are a great jump will be incredible.
Not really. MS never claimed that the GPU in Scarlet will be 4x faster than in the X; they were talking about overall performance. So add to that the miles-better Zen CPU, some type of SSD and faster memory, and you might end up with 4x better performance* (*includes loading times in the equation).
As for the TFlops thing, Sony claimed 12TF for the PS5, which is bullshit when we can see that the $450, 225W 5700 XT only has 10TF. They might have been comparing to a Vega equivalent, so the PS5 will most likely have a 5700/Vega 56 level GPU at best (same with MS), and might be using Navi gen2 cards in the final product.
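For context, theoretical FP32 TFLOPs are just shader count times 2 ops per clock times clock speed; a quick sketch (the shader count and boost clock are AMD's published 5700 XT specs):

```python
# Theoretical FP32 throughput: stream processors x 2 ops per clock
# (fused multiply-add) x boost clock in GHz, divided by 1000 for TF.
def tflops(shaders: int, clock_ghz: float) -> float:
    return shaders * 2 * clock_ghz / 1000.0

# RX 5700 XT: 2560 stream processors, ~1.905 GHz boost clock.
print(f"RX 5700 XT: {tflops(2560, 1.905):.2f} TF")  # ~9.75 TF
```

Which is where the ~10TF figure for the 5700 XT comes from.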
So skipping power, are we already comparing exclusives this early? Hope Xbox has something exclusive that is not on PC.
We know it will all be on PC. It looks like MS are going to push not just hardware this next cycle but their whole ecosystem: create as low a barrier to entry as possible without hurting themselves, and get their games into the hands of as many consumers as possible. Will it work? That remains to be seen. In all honesty, though, the surest bet going into the next cycle really does look like Nintendo; their strength of IP, combined with the fact that no one does what Nintendo does as well as Nintendo does it, makes them a safe bet.
Even if AMD got Navi to match Nvidia's Pascal or Turing efficiency... a 10 TFLOP Navi is still not 4k 60fps material without compromises, let alone with ray tracing...
That's the part that's yet to be determined because the next gen consoles have custom components which have yet to be revealed.
You have to remember we're not talking bare-bones GPUs that are simply thrown into random rigs. From the SSDs to the CPUs, every piece of these consoles is being designed to work as one unit.
We won't know what their actual capability is until either a developer leaks that information early or Sony/MS actually show off the next gen games running on these consoles.
There are already reports of PS5 dev kits running Red Dead Redemption 2, Spider-Man and The Last of Us 2 (all AAA caliber titles) at 4k 60fps, so saying they will not do 4k 60 is really premature.
At the very least, it sounds like next gen consoles will be running any current gen titles hitting that bench.
Got any links to back that up?
Why is that so hard to believe?
PS5 will have more than enough grunt to run those games at 4k/60.
AMD TFlops have not mattered for a while now. Many of their previous GPUs would have high TFlops count, but perform worse than Nvidia GPUs with less TFlops due to the architecture and driver differences. All that matters is real world benchmarks.
Next gen 8TF confirmed.
Lol @ AMD tflops
This.
Think we should wait until these machines actually exist and then compare benchmarks for third party games. Debating specs that apparently don't tell you the whole story, even IF we knew what they were, seems pretty pointless.
@Pedro: Because 80% of people have the base console and they serve as the lowest common denominator?
This is true, but I don't see why comparing it to the most recently released hardware is "bad". In the end it depends on the consumer's perspective. If you have the original Xbox One or the S, then I can understand comparing it to those, and if you have a One X, I also understand that comparison. In the end, any of the next gen consoles would be a significant jump over the launch consoles.
Why not use 2080TI?
It's hardware that also came out "this gen" and you said consoles are just PC's anyway, so why not use a 2080TI?
You can if you want. Consoles are just PCs anyway.....Oh, you thought they were something special.
No silly, it's YOUR argument that consoles are just PC's.
So why only use the latest upgraded console hardware as your reference for next generation instead of using the latest PC hardware?
Maybe you aren't bright enough to defend the fault in your logic like a good deflector should.
If you aren't even able to explain why the latest PC hardware shouldn't be used as the reference point for generation leaps when you claim consoles are just PCs, then you should stop trying to speak on the topic of generations until you get a grasp on your own narrative and can defend it thoroughly.
With that said...why should anyone use the X or Pro as a reference point for next gen?
*lol and try to have that make sense while also claiming consoles are just PC's, which currently have better hardware and would be a better reference point for a generation leap than the shitty mid level PC GPUs with jaguar cores.*
You conveniently only bullet pointed the 48 CU chip with 44 active CUs... You missed the most likely option, a 40 CU chip with 36 CUs active, which is more in line with what is possible based on TDP.
The next consoles in terms of raw power will have a hard time competing against a full XT; they will more than likely land between a Pro and an XT in performance, and depending on how much the ray tracing cores affect the TDP it could be worse than we all think.
I didn't "conveniently" miss anything. I was going off the size of the chip for Scarlet, which they measured at 365-380mm2 and estimated to be 48 CUs, 44 active.
FYI: the PC version of the Navi GPU comes in at 251mm2 with 40 CUs. Why would I include that when that's NOT what they calculated for the console chip?
Figure what you want about the end result for next gen consoles, but don't project nonsense on me because you have different theories than the ones Digital Foundry has put out.
I don't believe it will actually be hardware based; we know what MS is like with their wording. I know they said it is, but they will probably say it's the graphics card doing it based on shaders, rather than what we would interpret as RT cores.
RT is looking like more and more of a failure as a hardware-based solution. It may just be smarter to use shader-based software until the technology can keep up with the performance demands.
I dunno about that last point, dude... an HD 7850 can't hang with a PS4 these days...
As per the DF video it's confirmed the consoles are using the RDNA variant of Navi...which has hardware support for RT.
RT is fundamentally better, it just isn't ready yet. I was merely pointing out MS's record of misleading PR: even if it is shader based, as I expect it to be, MS can still turn around and say it is hardware based, because it is still using the graphics card, which is a piece of hardware.
They have been wrong with every single guess they have ever made on hardware predictions. They never use TDP as an indicator, nor price/yields.
Go with what you want. I have been consistent, even with my Navi predictions, knowing that since it would be GCN based the TDP would be a nightmare.
A 40 CU 5700 XT has a TDP of 225w... And these clowns think MS can fit a 48 CU chip with Ryzen and RT cores?... They are smoking some strong s***.
Same way they were wrong about the Switch when they said it would use a Tegra X2, and the X1X when they said it would use Vega... These guys are clueless.
Until I have full confirmation of RT cores, then I'm still sceptical.
Depends on the method, because there are two types of hardware usage. You have the shader processor method (which eats into the total performance of the GPU more than having dedicated processors to handle RT), and then you have separate dedicated processors for RT.
Funny, I said that you can compare it with the latest PC hardware if you want, and here you are complaining about the fact that consoles are not PCs. Seems like your ignorance knows no bounds. :)
Well if they're using RDNA then they will have dedicated RT cores as that's what the RDNA architecture is....... Navi with RT cores.
Navi 10 is also "RDNA" and yet it does not have dedicated RT cores. So it's not a guarantee.
Do you have a link that says Navi 10 doesn't have RT cores? As everything I've seen says 'to be confirmed'
"To begin with, we can confirm that the "Navi 10" silicon has no fixed function hardware for ray-tracing such as the RT core or tensor cores found in NVIDIA "Turing" RTX GPUs. For now, AMD's implementation of DXR (DirectX Ray-tracing) relies entirely on programmable shaders."
https://www.techpowerup.com/256459/amd-radeon-rx-5700-xt-confirmed-to-feature-64-rops-architecture-brief?cp=2
MS would need next gen RDNA; AMD have said it will come. But that will cost a fortune in my eyes. Maybe it will be part of their semi-custom design, but I don't see it.
Not me, I asked YOU why you don't use a PC GPU/CPU.
You keep referring to the X and Pro for next gen, NOT me. You said consoles are just PCs, so why do you want to reference mid-gen consoles for next gen?
So I'm asking you: why don't you use the latest PC hardware as your reference for next gen instead of older mid-gen consoles?
You don't seem to have an answer.
"AMD officially said the path they see includes using the current GCN and RDNA cards to perform ray tracing using the card’s shader cores."
https://www.pcworld.com/article/3401598/amd-releases-two-affordable-navi-based-radeon-rx-cards-and-lays-out-ray-tracing-plans.html
"Also, when it comes to ray tracing, AMD is indeed developing their own suite around it. According to their vision, current GCN and RDNA architecture will be able to perform ray tracing on shaders which will be used through ProRender for creators and Radeon Rays for developers. In next-gen RDNA which is supposed to launch in 2020 on 7nm+ node, AMD will be bringing hardware-enabled ray tracing with select lighting effects for real-time gaming. AMD will also enable full-scene ray tracing which would be leveraged through cloud computing."
https://wccftech.com/amd-radeon-rx-5700-xt-7nm-navi-rdna-gpu-official-launch/
So again, not all Navi based GPUs will have dedicated RT cores. So it's not a guarantee that one or both consoles will have dedicated RT cores.
Hmmm. 4X resolution (or more?) and twice the fps. Well if it is 8 times stronger I guess.
They may have to do some funny stuff to pull it off but I do think that will be the standard for next gen.
I.e., they may checkerboard 1800p up to 4k to ensure 60fps, but most will aim for higher frame rates over native 4k res if need be.
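The resolution arithmetic being argued here is easy to check; a rough sketch that counts raw pixels only (it ignores checkerboard reconstruction cost and everything else):

```python
# Pixel counts for the render targets mentioned above.
def pixels(width: int, height: int) -> int:
    return width * height

native_4k = pixels(3840, 2160)   # 8,294,400 pixels
p1800 = pixels(3200, 1800)       # 5,760,000 pixels
p1080 = pixels(1920, 1080)       # 2,073,600 pixels

# 1800p shades roughly 69% of native 4K's pixels.
print(p1800 / native_4k)
# 4K is exactly 4x 1080p; at 2x the frame rate that is ~8x the
# pixel throughput -- the "8 times stronger" arithmetic above.
print(native_4k / p1080)
```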
Because I am comparing current offerings from each company to future offerings from each company. I know, it's mind-blowing. But I think it's because you're annoyed that next gen is not going to be that much stronger than the mid-gen. Don't be upset with me because your favorite company released a mid-gen console that undermines their next gen. :)
@Pedro: She's saying for what's offered. Which is stupid. Because in reality PC would have the worst version, and the best version.
So basically about the same performance as a 2070, that’s not enough for 4K 60 ultra let alone 8k.
The PS4 is not a 7850, it's a 7870 with 2 CUs disabled. The 7870 has 20 CUs and the PS4 has 18 CUs, not to mention more RAM. That said, the 7850 can pretty much do 1080p medium at 30FPS, just like the PS4:
And just like the PS4, the PS5 will have more RAM than a 5700 Pro, so its longevity will be far superior. Add that games get tuned for those consoles, so consoles will have a slight advantage, but as you can see above... not much, and that GPU is weaker than a PS4 with only 2GB RAM.
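The CU comparison above maps directly onto theoretical throughput; a quick sketch using the standard GCN layout of 64 stream processors per CU (clocks are the public reference/console clocks):

```python
# GCN packs 64 stream processors per CU; FP32 TFLOPs =
# shaders x 2 ops per clock x clock (GHz) / 1000.
def gcn_tflops(cus: int, clock_ghz: float) -> float:
    return cus * 64 * 2 * clock_ghz / 1000.0

print(f"HD 7850 (16 CU @ 0.86 GHz): {gcn_tflops(16, 0.86):.2f} TF")  # ~1.76
print(f"PS4     (18 CU @ 0.80 GHz): {gcn_tflops(18, 0.80):.2f} TF")  # ~1.84
print(f"HD 7870 (20 CU @ 1.00 GHz): {gcn_tflops(20, 1.00):.2f} TF")  # ~2.56
```

So the PS4 sits between the two cards in raw throughput, which is why the 7850 comparison is close but not exact.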
The X1X has a Variable Rate Shading (VRS)-like feature. https://www.eurogamer.net/articles/digitalfoundry-2017-the-scorpio-engine-in-depth
Andrew Goossen tells us that the GPU supports extensions that allow depth and ID buffers to be efficiently rendered at full native resolution, while colour buffers can be rendered at half resolution with full pixel shader efficiency. Based on conversations last year with Mark Cerny, there is some commonality in approach here with some of the aspects of PlayStation 4 Pro's design, but we can expect some variation in customisations - despite both working with AMD, we're reliably informed that neither Sony or Microsoft are at all aware of each other's designs before they are publicly unveiled.
The depth buffer deals with the geometry render buffer. NVIDIA's VRS on Turing: https://developer.nvidia.com/vrworks/graphics/variablerateshading
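The saving from the half-resolution colour buffer technique Goossen describes is simple to quantify; a minimal sketch, assuming colour is shaded at exactly half resolution on each axis as in the quote:

```python
# Depth/ID buffers render at native resolution; the colour buffer
# can render at half resolution on each axis.
def colour_shaded_pixels(width: int, height: int, half_res: bool) -> int:
    if half_res:
        return (width // 2) * (height // 2)
    return width * height

full = colour_shaded_pixels(3840, 2160, half_res=False)
half = colour_shaded_pixels(3840, 2160, half_res=True)
print(half / full)  # 0.25 -- colour shading work drops to a quarter
```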
--------
The majority of current PC games don't exploit the RTX 2070's new features such as VRS, FP rapid packed math (missing in gaming Pascal GPUs) and tensor cores. Tensor cores and rapid packed math on Microsoft platforms are accessed via Direct3D12's DirectML and meta-commands.
When Turing's RPM and async compute are used, the RTX 2080 leaves the GTX 1080 Ti behind. This is done via the Vulkan API with NVIDIA and AMD API extensions (OpenGL style vendor specific API extensions return, LOL).
AMD Vega GPUs with RPM and hardware async compute were beaten by NVIDIA's Turing hardware features, brought down from the server Volta GPU, which employs its own RPM and async compute hardware features.
What can the RTX 2070 do with its new hardware features?
https://www.guru3d.com/articles-pages/msi-geforce-rtx-2070-armor-8g-review,24.html
Both NVIDIA and AMD sides used their vendor specific extensions.
AMD over-volts their PC GPUs to increase yields.
For Project Scorpio, MS developed a smart VRM with automatic under-volting per silicon quality, and waited for the second generation version of the process for improved electron leakage maturity.
An RX 580 can consume 209 watts during PC gaming, which almost mirrors the RX 5700 XT's 225 watt TDP.
The gap between the 5700's release and Scarlet's release is similar to the gap between the RX 480's release and Scorpio's release.
Next year's TSMC 7nm+ (N7+) has identical yield rates to N7 and will steadily improve, while also offering a 20% increase in transistor density. There's also a 10% performance uplift or a 15% power efficiency increase. AMD will take advantage of the former in their fourth-gen Ryzen, which they've confirmed will use TSMC's 7nm+ (ref 1).
References
1. https://www.techspot.com/news/80237-tsmc-7nm-production-improves-performance-10.html
I'd just like to point out that dev kits come with a target performance point. People are making wild assumptions about whether or not the alpha/beta dev kits are at all reflective of this from a hardware standpoint.
FALSE. The E3 2005 Gears of War 1 demo on GeForce 6800 Ultra SLI or Radeon X800 CrossFire PowerMacs didn't reflect the final Xbox 360's GPU, which is more capable than the E3 2005 PowerMac dev kits.
Xbox One X devkit is beefier than the Xbox One X. :|
The Xbox One X's first silicon arrived in December 2016.
The actual Xbox One X dev kit went out in March 2017.
IMO I would not place any faith in Dev Kit specs this early - chips aren't even close to ready. First Scorpio bring-up was 12/16 and real dev kits went out in March 2017. X1 and PS4 even later.
— Albert Penello (@albertpenello) June 17, 2019
We sent out Mac's for X360 dev kits. Dev kits mean very little at this point TBH.
The Xbox One X dev kit APU runs on the same silicon as the retail Xbox One X's APU, but it acts like R9-390X (44 CU) silicon quality instead of R9-390 Pro (40 CU) silicon quality.
@ronvalencia: And, like every other useless response you share, none of that drivel changes anything in my comment. Good job.
Your argument is useless. Who are you? You're not Microsoft.
There was no argument, just facts. You seem to be allergic to it.
You attempted to make an argument. I gave you the context for the real Xbox Scorpio silicon's timeline.
The struggle is real. Now you have become incoherent. Facts seem to do that to people who don't know what they are talking about. BTW, how is the fact that the Xbox One X dev kit is stronger than the retail system working out for you? LOL
The real struggle comes from you, since you omitted the context and timeline for the real X1X dev kit.
LOL, the crying has already started... the old fart thinks he knows more than the Digital Foundry guys... this poor old man is in love with plastic consoles.