Volitions Advocate on 28/7/2016 at 07:11
That, I think, is why the guys who buy these panels are so concerned that they run at 4:4:4 chroma. I don't know the details, but I understand it relates to how much colour information each individual pixel gets. With HDMI 1.4 you only get 4:2:0 at these resolutions, and that wasn't good enough for the photographers and video compositors and the like.
Having nice crisp vectored fonts though... You can't even understand until you see it.
heywood on 28/7/2016 at 15:52
You can get 4K at 4:4:4 over HDMI 1.4, but it's limited to 24 fps, which is OK for still images and for films (usually shot at 24 fps), but not so good for other video shot at 30, 48, or 60 Hz, and especially not for games. Subsampling to 4:2:0 allows 4K at 60 fps within HDMI 1.4 bandwidth. Consumer video is already 4:2:0, so you're not losing anything there, but for pretty much all other content you might be looking at on a computer it's a significant loss.
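If it helps to put numbers on it, here's a back-of-envelope sketch in Python. It counts active pixels only and ignores blanking, and the 8.16 / 14.4 Gbit/s figures are the nominal maximum video data rates for HDMI 1.4 and 2.0, so treat it as rough arithmetic rather than a spec calculation:

Code:
def video_rate_gbps(width, height, fps, bits_per_sample, chroma):
    # Average samples per pixel: luma is always full resolution,
    # chroma samples are shared between pixels when subsampled.
    samples_per_pixel = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}[chroma]
    return width * height * fps * samples_per_pixel * bits_per_sample / 1e9

HDMI_1_4_GBPS = 8.16   # nominal max video data rate, HDMI 1.4
HDMI_2_0_GBPS = 14.4   # nominal max video data rate, HDMI 2.0

print(video_rate_gbps(3840, 2160, 24, 8, "4:4:4"))  # ~4.8 Gbit/s  -> fits HDMI 1.4
print(video_rate_gbps(3840, 2160, 60, 8, "4:4:4"))  # ~11.9 Gbit/s -> needs HDMI 2.0
print(video_rate_gbps(3840, 2160, 60, 8, "4:2:0"))  # ~6.0 Gbit/s  -> within 1.4-class bandwidth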
Chroma subsampling is pretty straightforward to explain without getting into the details of how they came up with the numbering scheme. Basically, images and video contain brightness information and color information, and our eyes resolve much finer detail in brightness than in color, which is why, for example, we can read tiny text and why dithering and anti-aliasing work. Engineers take advantage of this difference to save space/bandwidth in digital video.
4:4:4 - There is a unique sample of brightness and a unique sample of color for every pixel. So a 4:4:4 HD video frame at 1920x1080 has 1920x1080 brightness and 1920x1080 color samples.
4:2:2 - There is a unique sample of brightness for every pixel. There is a unique sample of color only every other pixel in the horizontal direction. So at 1920x1080 you have 1920x1080 brightness and 960x1080 color samples.
4:2:0 - There is a unique sample of brightness for every pixel. There is a unique sample of color only every other pixel in both the horizontal and vertical direction. So at 1920x1080 you have 1920x1080 brightness and 960x540 color samples.
Note that RGB is inherently 4:4:4. That is what we're used to getting on computer displays. Most digital video formats, e.g. Blu-ray, are 4:2:0.
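If you want to see where those sample counts come from, here's a toy numpy sketch. The subsample_chroma function is just something I made up for illustration; real encoders filter the chroma rather than simply dropping samples:

Code:
import numpy as np

def subsample_chroma(y, cb, cr, mode="4:2:0"):
    # Toy subsampling: keep every pixel's luma, drop chroma samples.
    if mode == "4:4:4":
        return y, cb, cr                      # one chroma sample per pixel
    if mode == "4:2:2":
        return y, cb[:, ::2], cr[:, ::2]      # half the chroma horizontally
    if mode == "4:2:0":
        return y, cb[::2, ::2], cr[::2, ::2]  # half the chroma both ways
    raise ValueError(mode)

y = np.zeros((1080, 1920), dtype=np.uint8)   # full-res luma
cb = np.zeros_like(y)                        # full-res chroma to start with
cr = np.zeros_like(y)
y, cb, cr = subsample_chroma(y, cb, cr, "4:2:0")
print(y.shape, cb.shape)   # (1080, 1920) (540, 960) -> 1920x1080 luma, 960x540 chroma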
Perhaps what Malf is referring to is the claim I've seen that downsampling 8-bit 4:2:0 4K/UHD gives you 10-bit 4:4:4 HD.
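The shape/bit-depth arithmetic behind that claim goes roughly like this (just a sketch of the argument, not something I've verified against a real decoder): the chroma planes of 8-bit 4:2:0 UHD are already 1920x1080, so at HD size there's one chroma sample per pixel, and averaging each 2x2 block of 8-bit luma produces values on a roughly 10-bit grid.

Code:
import numpy as np

def uhd_420_to_hd_444(y_uhd, cb_uhd, cr_uhd):
    # y_uhd: 3840x2160 8-bit luma; cb_uhd, cr_uhd: 1920x1080 8-bit chroma (4:2:0).
    # Summing each 2x2 luma block gives values in 0..1020, i.e. ~10 bits of precision.
    y_hd = (y_uhd[0::2, 0::2].astype(np.uint16) + y_uhd[1::2, 0::2]
            + y_uhd[0::2, 1::2] + y_uhd[1::2, 1::2])
    # The UHD 4:2:0 chroma planes are already 1920x1080, so at HD size
    # there is now one chroma sample per pixel, i.e. 4:4:4.
    return y_hd, cb_uhd, cr_uhd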
Volitions Advocate on 28/7/2016 at 23:06
Here are a couple of screenshots from Rise of the Tomb Raider.
I think I cranked most of the graphics settings up, everything but AA (because why, at this resolution?) and film grain.
I didn't turn on the benchmarks, but I'd say it was running at about 15 - 20 fps on these settings. Much smoother at 1080p, but it sure looks crisp at this resolution.
I'll post some more later. DX HR doesn't want to capture for some reason.
http://i.imgur.com/WUx6tLu.jpg
http://i.imgur.com/3PuMZqV.jpg
http://i.imgur.com/iz0dwI3.jpg
scumble on 29/7/2016 at 07:51
The price of G-Sync monitors is putting me off so far, but the huge desktop space is tempting given that the downscaling sounds fine. I do find that the adaptive vsync supported by Nvidia cards helps a great deal, but it doesn't always work perfectly.
But I get the point that one can keep a decent monitor for years, way past the life of other components.
Judith on 29/7/2016 at 09:06
I bought 27" U2713H monitor a couple years ago, not because of the resolution (1440p), but the wider color gamut for photography (AdobeRGB). I still play games on 32" 1080p TV, as my video card struggles with higher resolutions. I can play older games in 1440p, but it's not that much of a difference for me. Using 1440p desktop is, it's so much better for 3d applications and photography. Now I have plenty of space for those side panes with options and properties. I don't think I need 4k now. It would have to be something bigger than 27 inch and placed further away from my desk, so I can see the whole screen. As far as gaming goes, I'd have to see something groundbraking, like difference between Dark Souls upscaled 720p vs actual 1080p resolution to think about switching to 4k. Right now even the awesome GTX 1080 struggles to maintain 60 fps in 4k (in new games), so I guess making that transition will take a while.
heywood on 29/7/2016 at 11:58
Quote Posted by Abysmal
The next frontier is wide color gamut (DCI-P3), with companies like Apple already supporting it. Where does that fit into the bandwidth of HDMI?
The available bandwidth puts an upper limit on the combination of resolution, bit depth, and frame rate, but not on gamut. HDMI 1.4 can support up to 48-bit color but the bandwidth only allows for 24-bit color at 4K resolution. HDMI 2.0 allows for 36-bit color at 4K resolution.
The size of the color gamut is independent of bit depth. Suppose you are using 24-bit YCbCr, where you have 8 bits of brightness info (Y) and 16 bits of color info (Cb, Cr). In that case, a single pixel can have any of 65536 color values and any of 256 brightness levels. The color space you're using determines how these values are mapped to actual colors. Both DCI-P3 and AdobeRGB cover about 40% more of the perceptible colors than sRGB, with the extra coverage being mainly in the green and green-blue range. With the same number of bits available to represent color, those 65536 color values are more spread out in DCI-P3 and AdobeRGB compared to sRGB, with a bigger jump from one color value to the next. There is a tradeoff between the size of the gamut and the resolution of color within the gamut. So a higher bit depth is more useful when working in a wider gamut.
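To put rough numbers on that tradeoff, taking the ~40% figure at face value (illustrative arithmetic only, not proper colorimetry):

Code:
# Same number of codes spread over a bigger gamut = coarser steps between adjacent colors.
srgb_gamut = 1.0         # normalize sRGB's gamut to 1
wide_gamut = 1.4         # DCI-P3 / AdobeRGB: roughly 40% more colors covered

codes_8bit  = 2 ** 16    # possible (Cb, Cr) values at 8 bits each
codes_10bit = 2 ** 20    # possible (Cb, Cr) values at 10 bits each

print((wide_gamut / codes_8bit) / (srgb_gamut / codes_8bit))    # 1.4   -> steps ~40% coarser at 8-bit
print((wide_gamut / codes_10bit) / (srgb_gamut / codes_8bit))   # ~0.09 -> much finer steps at 10-bit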
Thirith on 30/7/2016 at 09:53
Quick 'un to say that while it's only been a couple of days, I very much like both the higher resolution and the higher Hz and G-Sync combo that come with the Acer XB271HU I've ended up with. I expect it'll become the new normal very quickly, but my 60Hz/1080p screen at the office will feel like a dinosaur...
Volitions Advocate on 4/8/2016 at 03:17
Well... I just went and did a damn foolish thing.
I reserve a bit of money each year from my student loans for computer stuff I might need (it's built into the application) ... I didn't think I'd be eligible for loans anymore, but apparently I am and I was granted my loan to finish my last year.
So I used up my computer money already...
MSI GTX 1080 OC.
That'll give my 4K gaming a boost, and probably give my computer another 4 years of life or so (one can hope) before I have to start replacing core components like my mobo and cpu.
I think it's finally time I overclocked my i7, which means I'll have to get one of those closed-loop water coolers to keep it nice and cool.
So uh... yeah. Expect more screens and maybe even some gameplay goodness in a couple weeks.
Malf on 4/8/2016 at 07:55
Quote Posted by Thirith
Quick 'un to say that while it's only been a couple of days, I very much like both the higher resolution and the higher Hz and G-Sync combo that come with the Acer XB271HU I've ended up with. I expect it'll become the new normal very quickly, but my 60Hz/1080p screen at the office will feel like a dinosaur...
Snap!
Hope you're enjoying it. A couple of games that really help demonstrate what G-Sync helps with (if you have them):
Shadow of Mordor is smooth as butter on a G-Sync monitor, and was the game that sold me on the tech. Where tearing and frame-drop were noticeable on my old 60Hz 1920x1200 monitor, G-Sync allowed smoother play at higher resolution thanks to the adaptive refresh.
New Doom is astonishingly swish, and almost perverse in how liquid it feels, which helps with the constant percussion of the combat.
Although I think the main takeaway is that, because it's doing its job so well, most new games will be more than playable, and you won't notice performance drops.
I highly doubt that any modern game would play smoothly on my set-up at 2560x1440 if it weren't for G-Sync.