And just like that, 8k became a thing.

#1  Edited By Shewgenja
Member since 2009 • 21456 Posts

https://www.engadget.com/2019/06/05/lg-8k-oled-tv-big-expensive/

I hope this settles the usual System Wars myopia about the next frontier of television panels, and the wild speculation over whether the next-gen consoles will or won't exist in an 8K world.

The fact is, 8K arrives before the next consoles, so it makes sense for them to support the format to some degree. It isn't just platform makers throwing bullet points around as part of some grand conspiracy.

FYI, the Japanese broadcaster NHK already airs content in 8K.

#2  Edited By Son-Goku7523
Member since 2019 • 955 Posts

Not surprising in the least. You can already buy a Samsung QLED 8K TV from Best Buy and other retailers for $4,500.

People who think Sony is announcing 8K support too soon just aren't aware of how quickly TV tech is advancing.

#3 VFighter
Member since 2016 • 11031 Posts

And here I am totally happy with my 1080 display with zero reason to buy anything higher at the moment.

#4 Guy_Brohski
Member since 2013 • 2221 Posts

16K or go home.

#5 uninspiredcup
Member since 2013 • 59181 Posts

0.2% difference for 12X the price.

#6  Edited By deactivated-60113e7859d7d
Member since 2017 • 3808 Posts

Diminishing returns. Diminishing returns. This would be great for high res photo viewing. Incredibly demanding and pointless for moving images that you can barely pay attention to for the 1/24th or 1/60th of a second.

#7  Edited By locus-solus
Member since 2013 • 1560 Posts

MicroLED has all the benefits of OLED (perfect blacks) without the negatives of image burn-in and color degradation, plus the added benefit of much higher brightness (1,000+ nits), which is needed for HDR.

Dynamic HDR

Widespread adoption of FreeSync 2

  • Variable Refresh Rate (VRR) reduces or eliminates lag, stutter and frame tearing for more fluid and better-detailed gameplay.

Technologies like these are much more useful than 8K.

#8 rzxv04
Member since 2018 • 2578 Posts

@son-goku7523 said:

Not surprising in the least. You can already buy a Samsung QLED 8K TV from Best Buy and other retailers for $4,500.

People who think Sony is announcing 8K support too soon just aren't aware of how quickly TV tech is advancing.

That's quite an aggressive price. It will definitely come down to $2,000 during rare clearances in the US.

I expect 8K to become even more common in midgrade TVs.

Others have to remember that 8K will eventually be almost meaningless as a quality metric, just as we're plagued with crappy-to-mediocre 4K TVs today. Look at last year's Black Friday $200 4K 55-inch TVs. You'd be surprised how many people hear "4K" and just assume it's good quality. It has happened to me many times with the low and mid performers: "That's a 4K TV!? Awesome!" Yet their 8+ year old 1080p Samsung plasma ****s on it as far as overall image quality goes, unless we're sitting very close.

Other metrics are peak/sustained brightness, true contrast, reflection handling, sub-pixel arrangement, viewing angles, uniformity, motion, brightness fluctuation, etc.

#9 deactivated-5f3ec00254b0d
Member since 2009 • 6278 Posts

I think no one is doubting that 8k will come, it's just that no one is expecting the next consoles to run games at 8k. But after the Pro Sony knows it can get away with that. But I'm sure MS will pull similar BS with their console.

#10 JohnnyGT1
Member since 2012 • 147 Posts

I didn't notice; I don't really watch TV that much. But I'm guessing that means both NHK E and NHK G, then?

#11  Edited By Howmakewood
Member since 2015 • 7713 Posts

Xbox One S also supports 4K

#12 deactivated-6092a2d005fba
Member since 2015 • 22663 Posts

I just bought a 4K Viera last year; I'm good for the next 5 years, thanks.

#13 stereointegrity
Member since 2007 • 12151 Posts

We're at least 5 years away from 8K becoming a standard. We're in the 4K era now, and that's not gonna change for at least half a decade.

#14 tdkmillsy
Member since 2003 • 6050 Posts

If you could afford an 8k tv in the next couple of years and really wanted to show it off, would you buy a PS5/Next Xbox?

Not sure you would really. You could afford a super high spec PC to play those games at that kind of resolution.

I don't think either console will hit 8k nearly as much as the X has met 4k. But you never know.

#15 PC_Rocks
Member since 2018 • 8495 Posts

Did you just find out that 8K TVs are a thing?

#16 AJStyles
Member since 2018 • 1430 Posts

Gran Turismo 8K exists. It’s coming to PS5.

#17 rzxv04
Member since 2018 • 2578 Posts

@phbz said:

I think no one is doubting that 8k will come, it's just that no one is expecting the next consoles to run games at 8k. But after the Pro Sony knows it can get away with that. But I'm sure MS will pull similar BS with their console.

At best we can probably expect less graphically demanding games to run at 8K native, 8K checkerboard, or 8K via this generation's secret sauce, "next-gen AI upscaling" (think a hypothetical AMD answer to DLSS).

As for heavy AAA graphics showcases like a future Halo 6, God of War 5, etc.? Nah. I don't think those will internally render at 8K.
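
Rough napkin math on what those targets mean in shaded pixels. This assumes checkerboard rendering shades roughly half the target resolution's pixels each frame, and the AMD "DLSS equivalent" is purely hypothetical:

```python
# Rough pixel-count comparison: assumes checkerboard rendering (CB)
# shades about half the pixels of the target resolution per frame.
RES = {"4K": (3840, 2160), "8K": (7680, 4320)}

def megapixels(w, h):
    return w * h / 1e6

mp_4k = megapixels(*RES["4K"])          # ~8.3 MP
mp_8k = megapixels(*RES["8K"])          # ~33.2 MP
mp_8k_cb = mp_8k / 2                    # ~16.6 MP shaded per frame

print(f"4K native:       {mp_4k:.1f} MP")
print(f"8K native:       {mp_8k:.1f} MP  ({mp_8k / mp_4k:.0f}x 4K)")
print(f"8K checkerboard: {mp_8k_cb:.1f} MP  ({mp_8k_cb / mp_4k:.0f}x 4K)")
```

Even the checkerboarded path still shades about twice as many pixels as native 4K, which is why only lighter games would plausibly get there.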

#18 Howmakewood
Member since 2015 • 7713 Posts
@tdkmillsy said:

If you could afford an 8k tv in the next couple of years and really wanted to show it off, would you buy a PS5/Next Xbox?

Not sure you would really. You could afford a super high spec PC to play those games at that kind of resolution.

I don't think either console will hit 8k nearly as much as the X has met 4k. But you never know.

Hitting 8K as often as the One X hits 4K would require a GPU about 4 times as powerful, so give or take 24 TFLOPS, and most likely we're getting half of that, so yeah.
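
The back-of-the-envelope version of that claim, assuming required GPU throughput scales linearly with pixel count (a simplification that ignores bandwidth, geometry cost, etc.):

```python
# Naive scaling estimate: assume required GPU throughput scales
# linearly with pixel count (ignores bandwidth, geometry cost, etc.).
one_x_tflops = 6.0                      # Xbox One X GPU, FP32
pixels_4k = 3840 * 2160
pixels_8k = 7680 * 4320

scale = pixels_8k / pixels_4k           # exactly 4x
required = one_x_tflops * scale         # ~24 TFLOPS to do 8K as often as the One X does 4K
print(f"Pixel scale factor: {scale:.0f}x -> ~{required:.0f} TFLOPS")
```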

#19 tdkmillsy
Member since 2003 • 6050 Posts

@ajstyles said:

Gran Turismo 8K exists. It’s coming to PS5.

Tech demo is far away from an actual game, did they actually confirm the game was coming to PS5 at 8k 120fps??

Racing games have the best chance (Forza at 4k for example) but 8k 120fps, I'm not sure about that.

#20 rzxv04
Member since 2018 • 2578 Posts

@tdkmillsy said:
@ajstyles said:

Gran Turismo 8K exists. It’s coming to PS5.

Tech demo is far away from an actual game, did they actually confirm the game was coming to PS5 at 8k 120fps??

Racing games have the best chance (Forza at 4k for example) but 8k 120fps, I'm not sure about that.

Doubt that. At best I expect 4K checkerboard/AI upscaling at 120Hz, and only with 8th-gen graphics.

#21 Utensilman
Member since 2006 • 1571 Posts

1080p looks noticeably bad these days next to all the 4K content, so I'm all for 8K.

#22 dxmcat
Member since 2007 • 3385 Posts

Fake news :P

Samsung has had their 8K TV around for many months now, but I guess it isn't good enough due to a lack of sticker shock.

#23  Edited By ronvalencia
Member since 2008 • 29612 Posts

@howmakewood said:
@tdkmillsy said:

If you could afford an 8k tv in the next couple of years and really wanted to show it off, would you buy a PS5/Next Xbox?

Not sure you would really. You could afford a super high spec PC to play those games at that kind of resolution.

I don't think either console will hit 8k nearly as much as the X has met 4k. But you never know.

Hitting 8K as often as the One X hits 4K would require a GPU about 4 times as powerful, so give or take 24 TFLOPS, and most likely we're getting half of that, so yeah.

"Variable Rate Shading " would reduce the workload e.g. render geometry at 8K while shading at 4K or less.

https://developer.nvidia.com/vrworks/graphics/variablerateshading

Variable Rate Shading is a new, easy to implement rendering technique enabled by Turing GPUs. It increases rendering performance and quality by applying varying amount of processing power to different areas of the image. VRS works by varying the number of pixels that can be processed by a single pixel shader operation. Single pixel shading operations can now be applied to a block of pixels, allowing applications to effectively vary the shading rate in different areas of the screen.

Variable Rate Shading can be used to render more efficiently in VR by rendering to a surface that closely approximates the lens corrected image that is output to the headset display. This avoids rendering many pixels that would otherwise be discarded before the image is output to the VR headset.

https://www.eurogamer.net/articles/digitalfoundry-2017-the-scorpio-engine-in-depth

However, Andrew Goossen tells us that the GPU supports extensions that allow depth and ID buffers to be efficiently rendered at full native resolution, while colour buffers can be rendered at half resolution with full pixel shader efficiency

Xbox Scorpio's GPU includes hardware features that don't exist in PC GPUs prior to Turing, hence the old GCN TFLOPS-vs-resolution comparison is not valid for the X1X.

Besides the X1X's memory bandwidth advantage, the old RX 580 with 6.1 TFLOPS was being compared to the X1X's GPU, whose variable rate shading-like feature produces an apparent jump in rendering resolution, with native geometry edges, from a 6 TFLOPS GPU.

The X1X's variable rate shading-like feature is different from the PS4 Pro's checkerboard rendering hardware. Games like RDR2 show the X1X was able to break the old GCN TFLOPS-vs-resolution scaling, i.e. resolution scales with TFLOPS from the XBO and PS4 to the PS4 Pro, but the X1X breaks that trend.
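
To put rough numbers on the "geometry at 8K, shading at 4K or less" idea, here's a sketch of how coarse shading rates cut pixel-shader work. The per-rate frame coverage is made up for illustration; it is not a measurement of any console or game:

```python
# Illustrative only: estimate pixel-shader invocations at 8K when parts
# of the frame use coarser shading rates (1x1 = full rate, 2x2 = one
# shader invocation per 2x2 pixel block, as in variable rate shading).
WIDTH, HEIGHT = 7680, 4320
total_pixels = WIDTH * HEIGHT

# Hypothetical split of the frame area by shading rate.
coverage = {"1x1": 0.25, "2x2": 0.50, "4x4": 0.25}
cost_per_pixel = {"1x1": 1.0, "2x2": 1 / 4, "4x4": 1 / 16}

invocations = sum(total_pixels * coverage[r] * cost_per_pixel[r] for r in coverage)
print(f"Full-rate 8K shading:  {total_pixels / 1e6:.1f} M invocations")
print(f"Mixed-rate 8K shading: {invocations / 1e6:.1f} M invocations "
      f"({invocations / total_pixels:.0%} of full rate)")
```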

#24 tormentos
Member since 2003 • 33784 Posts

@phbz said:

I think no one is doubting that 8k will come, it's just that no one is expecting the next consoles to run games at 8k. But after the Pro Sony knows it can get away with that. But I'm sure MS will pull similar BS with their console.

MS was the first to falsely claim true 4K, not Sony.

#25  Edited By tormentos
Member since 2003 • 33784 Posts

@ronvalencia said:

Xbox Scorpio's GPU includes hardware features that don't exist in PC GPUs prior to Turing, hence the old GCN TFLOPS-vs-resolution comparison is not valid for the X1X.

Besides the X1X's memory bandwidth advantage, the old RX 580 with 6.1 TFLOPS was being compared to the X1X's GPU, whose variable rate shading-like feature produces an apparent jump in rendering resolution, with native geometry edges, from a 6 TFLOPS GPU.

The X1X's variable rate shading-like feature is different from the PS4 Pro's checkerboard rendering hardware. Games like RDR2 show the X1X was able to break the old GCN TFLOPS-vs-resolution scaling, i.e. resolution scales with TFLOPS from the XBO and PS4 to the PS4 Pro, but the X1X breaks that trend.

Man, the RX 580's bandwidth is entirely its own; the XBO X's bandwidth is shared and takes a penalty, just like the PS4's does.

Comparing the Xbox One X's peak SHARED bandwidth against the RX 580's GPU-only bandwidth is stupid, because on PC the CPU runs on its own separate DDR4 bandwidth without touching the video card's.

The RX 580 can beat the Xbox One X in several games, and so can the 1060.

#26 2mrw
Member since 2008 • 6205 Posts

Those companies always need to come up with something new to sell, hence all these underutilized, useless new products.

#27  Edited By xantufrog  Moderator
Member since 2013 • 17875 Posts

@vfighter: same. I might go 1440p in a year or two

Edit - P.S., people are flat-out delusional if they think they will be gaming in 8K on the PS5. Supporting the format means it can do things like display a picture or handle 8K movie playback. Do you have any concept of how resource-intensive RENDERING a game at that res will be?

#28  Edited By ronvalencia
Member since 2008 • 29612 Posts

@tormentos said:
@ronvalencia said:

Xbox Scorpio's GPU includes hardware features that don't exist in PC GPUs prior to Turing, hence the old GCN TFLOPS-vs-resolution comparison is not valid for the X1X.

Besides the X1X's memory bandwidth advantage, the old RX 580 with 6.1 TFLOPS was being compared to the X1X's GPU, whose variable rate shading-like feature produces an apparent jump in rendering resolution, with native geometry edges, from a 6 TFLOPS GPU.

The X1X's variable rate shading-like feature is different from the PS4 Pro's checkerboard rendering hardware. Games like RDR2 show the X1X was able to break the old GCN TFLOPS-vs-resolution scaling, i.e. resolution scales with TFLOPS from the XBO and PS4 to the PS4 Pro, but the X1X breaks that trend.

Man, the RX 580's bandwidth is entirely its own; the XBO X's bandwidth is shared and takes a penalty, just like the PS4's does.

Comparing the Xbox One X's peak SHARED bandwidth against the RX 580's GPU-only bandwidth is stupid, because on PC the CPU runs on its own separate DDR4 bandwidth without touching the video card's.

The RX 580 can beat the Xbox One X in several games, and so can the 1060.

1. The X1X CPU's memory bandwidth usage is bound by the lowest common denominator, the PS4, which is about 10 GB/s in one direction from the CPU to the GPU. Both the PS4 Pro and the X1X have similar target frame rates, hence the CPU's geometry control-point workload is similar for both boxes.

2. XBO DirectX resource management overhead is lower than the PS4's and PC DirectX12's (EA DICE claims this). Less CPU load on XBO compared to PC DirectX12.

3. The X1X CPU doesn't handle Direct3D12-API-to-GPU-ISA translation, since that's handled by the GPU's micro-coded Direct3D12 engine. This is different on the PC, since PC GPUs don't directly consume DirectX12 API calls. Less CPU load on XBO compared to PC.

4. The X1X's variable rate shading feature is not automatic, i.e. programmers need to code specifically for it.

5. An AMD-sponsored game like Far Cry 5 shows the X1X's superiority over the RX 580 and GTX 1060. Forza Horizon 4 has downgraded alpha rain effects compared to Forza Motorsport 7's version; MS's demonstration of FM7's wet track was intentional, i.e. proving memory bandwidth superiority over the RX 580 under heavy alpha effects. Your argument that the X1X's shared memory is inferior to the RX 580's is debunked!

6. The X1X GPU can be bound by the CPU even when GPU power is available, e.g. the problematic 1440p/60 fps target versus 4K at 30 fps. The X1X was specifically designed for Digital Foundry's XBO resolution-gate!

7. The X1X GPU's 2 MB render cache reduces trips to external memory; it's missing on the RX 580. Micro-tile cache rendering is usually a manual process on AMD GPUs, i.e. programmers need to code specifically for it.

At 4K, the RX 580 vs the GTX 1070 is only a 4-to-5 fps difference, which is small enough for the X1X GPU to jump over the RX 580.

#29  Edited By Ten_Pints
Member since 2014 • 4072 Posts

Utterly pointless for moving images.

Most people can't tell the difference between 1080p and 2160p at normal viewing distances if you have good anti-aliasing (rough math at the end of this post).

I'd be more interested in colour reproduction and response time.
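
The "normal distances" point is mostly viewing geometry. A quick sketch using the common 1-arcminute-per-pixel (roughly 20/20 acuity) rule of thumb; the 55-inch panel size and ~9-foot distance are just example numbers:

```python
import math

# Angular size of one pixel for a 55-inch 16:9 panel at a given viewing
# distance, compared against the ~1 arcminute limit often quoted for
# 20/20 vision. Example numbers, not a vision-science claim.
DIAGONAL_IN = 55.0
DISTANCE_IN = 108.0                     # ~9 feet
ACUITY_ARCMIN = 1.0

width_in = DIAGONAL_IN * 16 / math.hypot(16, 9)

for name, horizontal_px in [("1080p", 1920), ("2160p", 3840)]:
    pixel_pitch = width_in / horizontal_px
    arcmin = math.degrees(2 * math.atan(pixel_pitch / (2 * DISTANCE_IN))) * 60
    verdict = "resolvable" if arcmin > ACUITY_ARCMIN else "below the acuity limit"
    print(f"{name}: {arcmin:.2f} arcmin per pixel -> {verdict}")
```

By this crude measure, neither panel's individual pixels are resolvable from the couch, so extra resolution buys very little at that distance.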

#30 Blazepanzer24
Member since 2018 • 437 Posts

@vfighter: I feel the exact same way!

#31 Litchie
Member since 2003 • 34698 Posts
@vfighter said:

And here I am totally happy with my 1080 display with zero reason to buy anything higher at the moment.

@uninspiredcup said:

0.2% difference for 12X the price.

@ezekiel43 said:

Diminishing returns. Diminishing returns. This would be great for high res photo viewing. Incredibly demanding and pointless for moving images that you can barely pay attention to for the 1/24th or 1/60th of a second.

Yup.

#32 PC_Rocks
Member since 2018 • 8495 Posts

I don't understand why anyone would buy an 8K TV when there's virtually no content, even apart from the diminishing returns. And higher resolution without higher frame rates results in a jarring experience.

#33  Edited By tdkmillsy
Member since 2003 • 6050 Posts

@tormentos said:
@phbz said:

I think no one is doubting that 8k will come, it's just that no one is expecting the next consoles to run games at 8k. But after the Pro Sony knows it can get away with that. But I'm sure MS will pull similar BS with their console.

MS was the first to falsely claim true 4K, not Sony.

Sony was the first to falsely claim 8K :)

#34 deactivated-5f3ec00254b0d
Member since 2009 • 6278 Posts

@tormentos: The Xbox runs games like AC, RDR2, Far Cry 5, Metro and many others in true 4k, so I'll assume you're just joking when trying to make an equivalence between the Pro and the X.

#35 DaVillain  Moderator
Member since 2014 • 56281 Posts

@xantufrog said:

@vfighter: same. I might go 1440p in a year or two

Edit - P.S., people are flat-out delusional if they think they will be gaming in 8K on the PS5. Supporting the format means it can do things like display a picture or handle 8K movie playback. Do you have any concept of how resource-intensive RENDERING a game at that res will be?

Moving from 1080p to 1440p several years ago was the best thing that happened to me, and I love gaming at 1440p; I'm not all that interested in 4K. So 8K means nothing to me, nor do I care about raw graphics like that. You'll love 1440p, Xantufrog; in my opinion it's the perfect balance between high resolution and better framerates, right in the middle.

PS5 exclusive games are all I'm gonna care about.

#36 Livecommander
Member since 2009 • 1388 Posts

4K still isn't the standard this gen; it's still 1080p.

Last gen, when some games were doing 1080i/p, the norm was still 720p.

Lems are just hyped that their mid-gen upgrade has slightly better resolution power.

YouTube videos barely hit 4K yet.

#37 tormentos
Member since 2003 • 33784 Posts

@tdkmillsy said:
@ajstyles said:

Gran Turismo 8K exists. It’s coming to PS5.

Tech demo is far away from an actual game, did they actually confirm the game was coming to PS5 at 8k 120fps??

Racing games have the best chance (Forza at 4k for example) but 8k 120fps, I'm not sure about that.

I don't think 8K and 120FPS were mixed in the same sentence.

It was 4K 120FPS, and I am 100% sure that's for VR, not normal games.

@ronvalencia said:

1. The X1X CPU's memory bandwidth usage is bound by the lowest common denominator, the PS4, which is about 10 GB/s in one direction from the CPU to the GPU. Both the PS4 Pro and the X1X have similar target frame rates, hence the CPU's geometry control-point workload is similar for both boxes.

NO, this is something pulled from your ass as always, just like when you wanted to claim FP16 for the Xbox One X just because the Pro had it.

I have told you 10 times that the bandwidth comparisons you make to help the Xbox One X's case are bad, because you insist on comparing a PC GPU, which doesn't share its bandwidth with the CPU, against the Xbox One X, which does.

So you claim 300+ GB/s of bandwidth for the Xbox One X and call the RX 580 crippled because it has 256 GB/s, when in reality the Xbox's 300+ GB/s is shared and the RX 580's is not.

If you want to make a more down-to-earth comparison, be fair:

256 GB/s for the RX 580 + whatever you get from the separate DDR4 bandwidth.

So 50 GB/s from DDR4 + 256 GB/s from the RX 580, and you don't get the loss you see on an APU, where the CPU eats bandwidth disproportionately.

So again, stop comparing the RX 580's bandwidth with the Xbox One X's; the XBO X can't use 100% of its bandwidth for the GPU and takes a real penalty from the APU design.
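
For what it's worth, here are the figures being argued over, side by side. The 326 GB/s is the One X's published total (the posts above just say "300+ GB/s"), the ~10 GB/s CPU figure is the ballpark quoted earlier in the thread, and real shared-bus contention is messier than a simple subtraction:

```python
# Side-by-side of the bandwidth figures argued in this thread.
# Figures come from the posts above; real APU contention penalties are
# worse than simple subtraction, so treat the result as a ceiling.
X1X_SHARED = 326.0   # GB/s, single GDDR5 pool shared by CPU + GPU (published One X figure)
CPU_TRAFFIC = 10.0   # GB/s, rough CPU figure quoted earlier in the thread
RX580_GDDR5 = 256.0  # GB/s, dedicated to the GPU
PC_DDR4 = 50.0       # GB/s, separate system memory (the poster's figure)

x1x_gpu_ceiling = X1X_SHARED - CPU_TRAFFIC
print(f"X1X GPU bandwidth ceiling: ~{x1x_gpu_ceiling:.0f} GB/s (shared pool)")
print(f"RX 580 GPU bandwidth:       {RX580_GDDR5:.0f} GB/s (dedicated)"
      f" + {PC_DDR4:.0f} GB/s DDR4 on the CPU side")
```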

#38 djoffer
Member since 2007 • 1856 Posts

Just bought a Samsung Q90R 4K, which in itself is slightly too early, as currently only a few Netflix shows and iTunes offer anything in 4K... I doubt 8K is going to be needed for a very long time.

#39 onesiphorus
Member since 2014 • 5272 Posts

Is this considered visual overkill?

#40  Edited By xantufrog  Moderator
Member since 2013 • 17875 Posts

@davillain-: yeah I think it will be great too. Now that I have a 1440p capable machine, it seems a waste not to have the monitor to do it (although I've been supersampling in some games)

#41  Edited By Howmakewood
Member since 2015 • 7713 Posts
@djoffer said:

Just bought a Samsung Q90R 4K, which in itself is slightly too early, as currently only a few Netflix shows and iTunes offer anything in 4K... I doubt 8K is going to be needed for a very long time.

Netflix 4K isn't really 4K either; compare the quality of a (real) 4K Blu-ray to a 4K Netflix stream...

To add, Prime Video also provides 4K+HDR streaming.

#42 mtron32
Member since 2006 • 4432 Posts

I have a 4K TV and rarely watch 4K content, so this is a non-issue for me. Until the NBA starts offering up games in 4K-8K, I'm still sleeping on it all.

#43 tdkmillsy
Member since 2003 • 6050 Posts

@tormentos said:
@tdkmillsy said:
@ajstyles said:

Gran Turismo 8K exists. It’s coming to PS5.

Tech demo is far away from an actual game, did they actually confirm the game was coming to PS5 at 8k 120fps??

Racing games have the best chance (Forza at 4k for example) but 8k 120fps, I'm not sure about that.

I don't think 8K and 120FPS were mixed in the same sentence.

It was 4K 120FPS, and I am 100% sure that's for VR, not normal games.

@ronvalencia said:

1. The X1X CPU's memory bandwidth usage is bound by the lowest common denominator, the PS4, which is about 10 GB/s in one direction from the CPU to the GPU. Both the PS4 Pro and the X1X have similar target frame rates, hence the CPU's geometry control-point workload is similar for both boxes.

NO, this is something pulled from your ass as always, just like when you wanted to claim FP16 for the Xbox One X just because the Pro had it.

I have told you 10 times that the bandwidth comparisons you make to help the Xbox One X's case are bad, because you insist on comparing a PC GPU, which doesn't share its bandwidth with the CPU, against the Xbox One X, which does.

So you claim 300+ GB/s of bandwidth for the Xbox One X and call the RX 580 crippled because it has 256 GB/s, when in reality the Xbox's 300+ GB/s is shared and the RX 580's is not.

If you want to make a more down-to-earth comparison, be fair:

256 GB/s for the RX 580 + whatever you get from the separate DDR4 bandwidth.

So 50 GB/s from DDR4 + 256 GB/s from the RX 580, and you don't get the loss you see on an APU, where the CPU eats bandwidth disproportionately.

So again, stop comparing the RX 580's bandwidth with the Xbox One X's; the XBO X can't use 100% of its bandwidth for the GPU and takes a real penalty from the APU design.

It was 8K 120fps; what wasn't mentioned was that it was on the PS5. People just jumped to conclusions.

#44  Edited By lundy86_4
Member since 2003 • 61520 Posts

Yeah, 8K TVs have been a thing for a while. Not feasible for us regular folk, but they're gonna start hitting a reasonable price range soon enough.

#45 Grey_Eyed_Elf
Member since 2011 • 7970 Posts

I'm confused... You do know 8K has been around for a while now? And they aren't that much more expensive than a 4K TV; there is just no reason to get one, and there won't be a reason when the "next generation" consoles come out either.

I mean, most 4K movie content even today is still 2K upscaled. The movie industry has yet to fully adopt 4K: even films shot on 6K cameras are mostly mastered at 2K, so the "4K" image you see is still a 2K master upscaled to 4K. Not many actually master at native 4K.

Also, even if 8K does kick off in the next 1-3 years, chances are movies will just upscale 4K-6K master content to 8K, or worse, 2K to 8K.

Gaming though... 8K is not happening on the coming hardware. Even if it's technically possible in some games, developers won't have a reason to do it unless Sony/Microsoft pay them for exclusives, just to tick a box and have a new thing to hype.

#46 KBFloYd
Member since 2009 • 22714 Posts

Is this thread supposed to justify Sony touting 8K gaming?

lol..cows

#47 Pedro
Member since 2002 • 70024 Posts

So, are the Sony loyalists going out on a limb to state that the PS5 will be an 8K machine?

#48  Edited By ronvalencia
Member since 2008 • 29612 Posts

@tormentos said:
@tdkmillsy said:
@ajstyles said:

Gran Turismo 8K exists. It’s coming to PS5.

Tech demo is far away from an actual game, did they actually confirm the game was coming to PS5 at 8k 120fps??

Racing games have the best chance (Forza at 4k for example) but 8k 120fps, I'm not sure about that.

I don't think 8K and 120FPS were mixed in the same sentence.

It was 4K 120FPS, and I am 100% sure that's for VR, not normal games.

@ronvalencia said:

1. The X1X CPU's memory bandwidth usage is bound by the lowest common denominator, the PS4, which is about 10 GB/s in one direction from the CPU to the GPU. Both the PS4 Pro and the X1X have similar target frame rates, hence the CPU's geometry control-point workload is similar for both boxes.

NO, this is something pulled from your ass as always, just like when you wanted to claim FP16 for the Xbox One X just because the Pro had it.

I have told you 10 times that the bandwidth comparisons you make to help the Xbox One X's case are bad, because you insist on comparing a PC GPU, which doesn't share its bandwidth with the CPU, against the Xbox One X, which does.

So you claim 300+ GB/s of bandwidth for the Xbox One X and call the RX 580 crippled because it has 256 GB/s, when in reality the Xbox's 300+ GB/s is shared and the RX 580's is not.

If you want to make a more down-to-earth comparison, be fair:

256 GB/s for the RX 580 + whatever you get from the separate DDR4 bandwidth.

So 50 GB/s from DDR4 + 256 GB/s from the RX 580, and you don't get the loss you see on an APU, where the CPU eats bandwidth disproportionately.

So again, stop comparing the RX 580's bandwidth with the Xbox One X's; the XBO X can't use 100% of its bandwidth for the GPU and takes a real penalty from the APU design.

Try playing a real console-ported game with 8 GB of single-channel DDR3-1600 (~12.8 GB/s) and a GTX 1080 Ti or R9 390X 8GB. You will find that the consoles' 30 fps and 60 fps targets are easy to hit. Game console CPU workloads haven't changed since 2013; my old Intel Core i7-2600-era machine would do the job.

Don't expect 120 to 144 fps, which is exclusive to gaming PCs. Game console CPU workloads are a waste on high-end gaming PCs.

Your benchmark doesn't reflect a real game's memory bandwidth usage and doesn't factor in L2 cache programming boundary optimizations.

Read http://www.redgamingtech.com/ps4-architecture-naughty-dog-sinfo-analysis-technical-breakdown-part-2/, which draws on Sony's Jaguar CPU optimization guide.

https://medium.com/software-design/why-software-developers-should-care-about-cpu-caches-8da04355bb8a is a cache optimization guide for modern x86 CPUs with on-chip caches.

Your argument shows you don't have a commercial x86 programming background.

CELL's SPU local memory storage programming is NOT new. The difference is that x86 CPU cache overflow is more forgiving and doesn't trigger an exception error on overflow!

The PS4 CPU is the lowest common denominator, and CPU geometry control workloads are programmed against those specs. Modern PC-exclusive RTS games may exceed the little Jaguar CPU's cache size, but those aren't console games.

You will see game simulation differences when the PS5 sets an "8-core Zen 2" CPU workload. That's when the PC master race's high-end rigs with 8 fat CPU cores and 16 threads or more come into play, and we get real value from high-end PC rigs instead of just playing the same console-ported game at higher frame rates and/or higher geometry detail (before the GPU's tessellated geometry amplification tricks).

GPU tessellated geometry amplification = reduced CPU load on the geometry control workload.

For example, Digital Foundry tested a 1st-generation Ryzen with 4 cores and 8 threads at 3 GHz:

https://www.youtube.com/watch?v=LjjRdrVAHCQ

"Witcher 3 based on current gen constraints targetting 30 hz on consoles.... in general gameplay on the open world, our Ryzen candidate moves north of one hundred frames per second: a three to four X improvement even before we factor in processor specific optimizations in a fixed box like a console.

Witcher 3's base console target is 900p/1080p at 30 FPS, while the Ryzen drives the same game at 1080p at ~122 fps in the CPU test.

A 1st-gen Ryzen 4C/8T at 3 GHz can pump out 4X the CPU geometry control throughput of the XBO/PS4, on top of carrying the PC driver's Direct3D-API-to-GPU-ISA translation workload.

Try again.
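
A quick frame-budget view of that Witcher 3 comparison, using only the 30 fps console target and the ~122 fps Ryzen figure quoted above:

```python
# Frame-time budgets implied by the figures quoted above: the console
# CPU target (30 fps) versus the Ryzen result (~122 fps) in the same
# CPU-bound test. Pure arithmetic, no new measurements.
console_fps = 30.0
ryzen_fps = 122.0

console_budget_ms = 1000.0 / console_fps     # ~33.3 ms per frame
ryzen_budget_ms = 1000.0 / ryzen_fps         # ~8.2 ms per frame

print(f"Console CPU frame budget: {console_budget_ms:.1f} ms")
print(f"Ryzen 4C/8T frame time:   {ryzen_budget_ms:.1f} ms "
      f"(~{ryzen_fps / console_fps:.1f}x the throughput)")
```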

#49  Edited By Shewgenja
Member since 2009 • 21456 Posts

In b4 the lems realize this thread was a trap for when MS starts talking about 8k support with Scarlet in a few days.

Thank you for your participation.

#50 BassMan
Member since 2002 • 17849 Posts

I like new tech, but I have no desire to go 8K any time in the near future. 4K is sexy enough for me and is already extremely demanding.