Originally Posted by wbchen99
Won't brightness set to 4 for HDR be too dark, making the HDR effect less obvious?
What is the value for your Black level?
Does dynamic contrast here refer to Adv. contrast enhancer?
Black level ~ 48 to 50
Yes, you’re right, my bad. :o I meant Adv. contrast enhancer. Medium or High for most games and for action, sci-fi, and horror movies (films with a darker or more colorful palette); Low or Off for comedies, dramas, and sitcoms. It does tend to blow out brighter areas of a frame like overhead lights or neon signs, but that’s more of a stylistic difference than a problem for me. The real treat is what it does to darker scenes. Details come into view that previously couldn’t be seen: hair, logos, folds in clothes, etc. I don’t *think* it’s actually bringing the black level down, but it sure seems like it is. It also seems to minimize clouding, and of course colors pop more as well. Watching the new 4K disc of Ghost in the Shell with Adv. contrast enhancer on Medium versus Off is an entirely different experience.
Sometimes I’ll move the gamma +/- 1. Black adjust seems to hide details in dark areas...so sort of the reverse of Adv. contrast enhancer.
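To picture what those two controls are doing, here’s a toy sketch (my own illustration, assuming a plain power-law transfer curve; the set’s actual gamma and Black adjust processing is vendor-specific):

```python
# Rough illustration of why a +/- 1 gamma step changes shadow detail.
# Assumes output = input ** gamma on a 0..1 scale (a simplification;
# the TV's real gamma control remaps in its own steps).

for signal in (0.05, 0.10, 0.20, 0.50):    # dark-to-mid signal levels
    for gamma in (2.0, 2.2, 2.4):          # roughly -1 step, default, +1 step
        light = signal ** gamma            # relative light output
        print(f"signal {signal:.2f}  gamma {gamma}:  light {light:.4f}")
    print()
```

In this toy model the darkest test level puts out over 3x more light at gamma 2.0 than at 2.4, which fits how a single step can reveal or hide hair and clothing folds: lower gamma lifts the dark end, while higher gamma (or Black adjust) pushes it back down.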
Color I go from 50 all the way up to 60 for some Xbox games or colorful movies. For stuff like Friends, the sitcom from the ’90s, I’ll usually go with 52 to keep skin tones in check. The reason to go higher is definitely to pop the colors, provided faces look normal. With a game like Forza 7 there are no skin tones, so I’ll crank it.
Hue stays @ 0, color temp is on Expert 1. Neutral might make things less yellow, and Warm is a good compromise between the two. I used to go Neutral for games and Warm/Neutral for films. But after comparing back and forth with Expert, I just left it there for a while, and now going back, even to Warm, is like going from incandescent to fluorescent lighting: the viewable color range just feels narrower. The big reason to go Expert is more accurate colors... for me.
Adv. color temp is where the calibration gets done. I’m no expert at this, so I just used rtings’ settings; even though individual panels differ, their numbers look better to me than the defaults.
Live color blooms details and bleeds color into neighboring objects even on Low, so I leave it off all the time.
Sharpness stays at 25 or 50. I don’t notice much difference unless it’s cranked to 100, and even then it doesn’t change much. Same for the rest of the processing settings; I just don’t see anything, haha! So I leave it all off.
True Cinema for everything but games, AirPlay, and Twitch, which I leave off. For sports I’ll use Smooth, but commercials look a little jarring, so I typically just look down and mute.
High for everything except games, AirPlay, Twitch, and sports.
The reason I leave HDR brightness @ 4 is that it’s an edge-lit LED panel. I’ll go into more detail below.
Originally Posted by snorge
Yes, I don't understand leaving the brightness low for HDR content. It is meant to be maxed out so the TV is able to reach its peak values.
Maxing out the light output on a C7 makes sense because each pixel can go from 900 nits all the way down to 0. Maxing out an 850E simply raises the whole screen to 400~450 nits, depending on how much of the screen is lit. It’s not a contrast change, and that’s why it won’t show up in pictures.
HDR-encoded content on an edge-lit panel will look too dark at a backlight of 0 and definitely benefits from going up to 16 (for me), but anything past 4~8 kills the black level. It literally goes from black to gray if I go from 4 to 50. At max brightness I may as well be looking @ a 1st gen TN panel.
Here’s the thermal image from rtings...
You can see from that picture that there’s a single row of LEDs @ the bottom lighting the entire panel. Changing the brightness setting adjusts that backlight from 0 to 50 uniformly. This breaks the HDR encode from 4K discs, because a big draw of an OLED or a local dimming set is the ability to literally change the brightness of the image in different areas of the screen. Different is the key word. So for example...
You have an image of a sun over a patio with 2 people on a bench. On an OLED with the best hardware capability for HDR, the sun could be 900 nits, the people could be 300, and the ground below the patio’s deck could actually be 0. So you have per-pixel levels of brightness, just like in real life, where bright sunlight makes the shadows look that much deeper.
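To put numbers on that, here’s a toy model (my own sketch, not measurements: I’m assuming a ~450 nit peak and ~1000:1 native contrast for the edge-lit set, which is the right ballpark for an edge-lit IPS-type panel):

```python
# Toy model: the sun/patio frame on a per-pixel display vs. an edge-lit LCD.
# Assumed figures (mine, for illustration): ~450 nit peak, ~1000:1 native
# contrast, so the black floor = backlight / 1000.

target = {"sun": 900.0, "people": 300.0, "shade": 0.0}  # nits the grade asks for

# Per-pixel emissive display (e.g. OLED): each region hits its target,
# capped only by the panel's peak brightness.
OLED_PEAK = 900.0
oled = {k: min(v, OLED_PEAK) for k, v in target.items()}

# Edge-lit LCD: one global backlight level for the whole frame; pixels can
# only attenuate it, never below the native contrast floor.
def edge_lit(backlight_nits, contrast=1000.0):
    floor = backlight_nits / contrast
    return {k: round(max(min(v, backlight_nits), floor), 3)
            for k, v in target.items()}

print("per-pixel    :", oled)
print("low backlight:", edge_lit(80.0))    # shade stays near black, sun dim
print("max backlight:", edge_lit(450.0))   # brighter sun, shade now 0.45 nits
```

Cranking the single backlight buys highlight brightness only by lifting the black floor everywhere at once, which is exactly the black-to-gray effect of going from 4 to 50.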
An edge-lit panel’s hardware isn’t capable of HDR in the way filmmakers intend. Sure, images may appear brighter, but if you go back and forth between HDR on and off, you’ll see that what’s actually happening is a flattening of detail from black to white.
Logan Noir is a perfect example of this. In the scene towards the end with Logan lying in bed chatting with X-23, you can only see half of his face with HDR on. With it off, his entire face is visible. It doesn’t “pop” like it does with HDR on, and therefore looks less impressive at first glance, but you’re losing a lot of detail.
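Here’s a rough sketch of that flattening (again my own toy model, reusing the assumed panel limits from the sketch above; real sets apply vendor tone mapping rather than a hard clip, but the squeeze is the same idea):

```python
# Toy model of the flattening: an HDR grade uses levels a ~450-nit panel
# can't separate. Clipped highlights and a raised black floor both collapse
# distinct grade values into one displayed value.
grade = [0.05, 0.2, 0.8, 300.0, 420.0, 500.0, 700.0, 1000.0]  # nits in the master
PANEL_PEAK, BLACK_FLOOR = 450.0, 0.45   # assumed edge-lit limits at max backlight

for nits in grade:
    shown = min(max(nits, BLACK_FLOOR), PANEL_PEAK)
    print(f"grade {nits:7.2f} nits -> displayed {shown:7.2f} nits")
```

500, 700, and 1000 nits all land on the same 450, and 0.05 and 0.2 both sit on the 0.45 floor, so separate steps in the master come out as one flat tone on both ends: that’s the half-of-the-face effect.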
I’m not necessarily saying HDR looks bad on the set; in fact, I do prefer it from time to time just because the contrast boost gives a more natural, lifelike presentation. But this can also be a problem. In the 2007 Transformers 4K HDR film, when Devastator is chasing Prime, his entire lower leg is crushed to black. With HDR off there’s so much more detail. So it’s very much a per-title call for me.