VU Premium Android 4K TVs aren't really HDR! Please go through this thread before buying one.

This is the email I wrote to the technical team of VU Technologies Pvt. Ltd. India about the 8-bit "HDR" (which doesn't, and shouldn't, exist) that their TV has given me. I was too lazy to edit the email before posting, so here is the original thing:

"Dear Team VU,

I am utterly disappointed after realizing that your so-called VU Premium Android TVs are not HDR by any means!

I presently own a twice-replaced 55-OA, and fortunately it is working well so far, but I have many other issues apart from this seemingly deceptive strategy by your marketing/branding department. The proofs that the TV is not HDR are attached below and explained in the following points:

1 - Under 'Stats for Nerds' in both the official (stock) YouTube app and the Smart YouTube app, the colour space/gamut shows as "bt709/bt709" (i.e. Rec. 709), which is the HDTV-standard colour gamut and not the Rec. 2020 wide gamut used for 4K HDR. This Wikipedia article explains what an end user should know before buying a true HDR TV: https://en.m.wikipedia.org/wiki/Rec._2020?wprov=sfla1.

2 - The peak brightness is just 450 nits, and it is not even stated whether that is a small-window peak, a full-screen peak, or an average figure. It is careless of a company not to publish detailed brightness information.

3 - I can't find any Dolby Vision engine/DV system app under All Apps that would confirm this TV supports Dolby Vision.

4 - I tried to play 4K60p and 4K30p content, both HDR and SDR, from different sources such as an external hard drive and Plex, but literally all of the HDR content displayed with washed-out (decolourized) colours, and some of the HDR videos even forced a hard reboot of the TV! Crazy, right? If the TV were true HDR10 and DV, the content would be displayed in proper HDR colours.

5 - The colour depth of this Hisense panel shows as 8-bit in the AIDA64 app, and as you know the minimum bit depth required for HDR is 10-bit (with 12-bit at the top end). Another solid proof that this TV is not HDR.

Now coming to 60p (i.e. 60 fps) playback: no video plays smoothly without frame drops, and watching 4K60p content for any prolonged time is a strain on the eyes. Significant jittering/stuttering is visible, especially near the edges of the screen, while 60 fps content plays. Even my parents noticed this immediately.

I would request you to please forward this message to your higher authorities so that they too know the truth about the false advertising on every box claiming HDR10 and Dolby Vision, not to mention the use of the Dolby Vision trademark.

Regards,
Navdeep Singh Thind"

I hope this helps some of you. Please don't take HDR claims from lower-tier brands at face value these days!
Visit the following link for proof that this TV cannot display a true 10-bit HDR colour gamut: https://photos.app.goo.gl/GtFNx658JQcZSPKN6. Also watch this video for further clarification: https://photos.app.goo.gl/L3KcNfzaXpZZZohL9
 
Navdeep
I can't comment on your specific TV, but it can often be a bit tricky to get HDR right, even on flagship TVs.
I actually had HDR disabled on a Samsung 9 series because it used to look washed out, and I left it like that for a long time until I realized the issue was at my end.
Once I fixed the system chain, HDR playback shone through!

I am not saying you certainly have an issue in your connection chain, but it may be worthwhile to share your source and playback mechanism for HDR so that we can help debug.
 
I don't understand, how do you change the settings for HDR? Isn't it a default thing in TVs?

Or does one have to go into settings and enable it?
 
How I wish it were that straightforward (as it should have been).
Unfortunately, three separate competing standards (Dolby Vision, HDR10, HLG) and the fact that different OTT providers (e.g. Netflix/Prime) support some devices and not others make it a lot more complex.

Just as an example: if you own, say, a Samsung Q90 (flagship) and choose to stream an HDR title from Netflix via an Amazon Fire TV Stick 4K expecting a bright and vivid picture,
what you will get instead is a completely washed-out image that looks a lot worse than it does on an old Panasonic LCD.

The reason is that Samsung does not support Dolby Vision (but it does support HDR10), while Netflix supports Dolby Vision on the Fire TV Stick but not HDR10.
Thus the TV gets a narrow-colour-gamut signal while it thinks it is receiving a wide colour gamut, and the resultant picture is flat.

In order to get good HDR, while ensuring non-HDR isn't flattened into lifelessness, you need to do three things:

a) Enable your streaming setup to auto-switch to HDR only when available (most often it is set to always-on HDR). At the very least this ensures that, if there is a mismatch, you get good SDR, which is far better than bad HDR as in the example above.

b) Ensure your TV, playback device and source match up on a common format.

c) Ensure you are using high speed HDMI cables.

The most HDR-compatible streaming device at the moment is the Apple TV 4K, which from an India perspective supports both Prime and Netflix for DV as well as HDR10 (and Atmos too).
The Fire TV Stick falls short on this measure, and so do the native streaming apps on the majority of Chinese sets like Mi or iFFALCON. (A rough sketch of this "lowest common denominator" negotiation follows below.)
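To make that concrete, here is a small Python sketch of how the negotiation effectively plays out for the Q90 + Fire TV Stick + Netflix example. The capability sets are illustrative assumptions based on the claims in this post, not official specs; the function just picks the best format common to every link in the chain.

```python
# Rough sketch of how an HDR "handshake" effectively resolves across a chain.
# The capability sets below are illustrative assumptions (the Netflix-on-Fire-Stick
# entry follows the claim in this post, not an official spec).

PREFERENCE = ["DolbyVision", "HDR10+", "HDR10", "HLG", "SDR"]  # best first

chain = {
    "TV (Samsung Q90, illustrative)":         {"HDR10", "HDR10+", "HLG", "SDR"},      # no Dolby Vision
    "Stick (Fire TV Stick 4K, illustrative)": {"DolbyVision", "HDR10", "HLG", "SDR"},
    "App (Netflix on the stick, as claimed)": {"DolbyVision", "SDR"},                  # no HDR10 here
}

def negotiated_format(devices):
    """Return the best format every link in the chain supports, else SDR."""
    common = set.intersection(*devices.values())
    for fmt in PREFERENCE:
        if fmt in common:
            return fmt
    return "SDR"

print(negotiated_format(chain))
# -> SDR: with no HDR format common to all three links, a well-behaved setup
#    should drop to SDR -- which is exactly what option (a) above tries to ensure.
```

If the chain is instead forced to always-on HDR, one link keeps sending a format the TV cannot interpret correctly, and that is where the washed-out picture comes from.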
 
Very well explained. That cleared up a lot of misconceptions about HDR and Dolby Vision on 4K TVs and streaming devices. Incidentally, I use a Mi Box S.
 

Perfectly put. The Apple TV 4K is the only option at the moment for Dolby Vision streaming. And sadly that may not change any time in the future thanks to Android TVs sticking to the low end of the scale only.
 
Thanks for that, but I honestly didn't understand much. I have an LG C8 OLED and an Nvidia Shield. What settings am I supposed to change, and where? Normally when I play an HDR movie, the TV automatically switches to HDR (a small pop-up appears on the TV screen).

I also use good-quality high speed HDMI cables.
 
Phew!! That flew right over.
The LG 55C8 has native apps for Netflix/Prime that play DV or HDR content.
There is a marked difference when the content is 4K (DV or HDR).
Dunno about other brands' handling of such content.

Cheers,
Raghu
 
Perfectly put. The Apple TV 4K is the only option at the moment for Dolby Vision streaming. And sadly that may not change any time in the future thanks to Android TVs sticking to the low end of the scale only.
The Amazon Fire Stick has Dolby Vision.
TCL is launching Dolby Vision Android TVs in Europe this July, and they have also acquired YouTube certification for 4K HDR support.
Panasonic, too, is coming out with Dolby Vision-supporting GX series TVs, which are awesomely awesome, though not Android.
 
I played 4K/8K HDR content via:
1. The native YT and Smart YT apps.
2. This playback chain: my 8K60-capable PC -> BlueRigger high-quality HDMI 2.0 cable -> the TV's HDMI 2.0 non-ARC port on the back, with HDMI Enhanced turned on and HDMI Dynamic Range set to Full. I played four or five 4K60/4K30/8K60/8K30 HDR10 videos in both MPC-BE and MPC Classic (x64) with both renderers (Enhanced Video Renderer Custom Presenter and madVR), and a Dolby Vision video too, and yet the colours are washed out, although playback is perfectly smooth with no frame drops (thanks to my PC; the TV doesn't have much to process). Now, if this panel were true 10-bit, why would the picture be decolourized?

3. The Plex TV app: I played HDR content via my PC's Plex Media Server over my uncongested 50 Mbit/s network, and the result is the same: washed-out colours.

Now I don't know, if this can't output 10-bit HDR, then what would. Please shed some deeper insight into this if you can really help me.
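A quick way to sanity-check that the source files themselves really carry HDR10 signalling (BT.2020 primaries plus the SMPTE 2084/PQ transfer), assuming ffprobe from ffmpeg is available on the PC, would be something like the sketch below; the file name is only a placeholder.

```python
# Minimal sketch: read a video file's colour metadata with ffprobe
# (requires ffmpeg/ffprobe on the PATH; the file name is just a placeholder).
import json
import subprocess

def colour_info(path):
    out = subprocess.run(
        ["ffprobe", "-v", "error", "-select_streams", "v:0",
         "-show_entries", "stream=pix_fmt,color_primaries,color_transfer,color_space",
         "-of", "json", path],
        capture_output=True, text=True, check=True,
    ).stdout
    return json.loads(out)["streams"][0]

print(colour_info("demo_hdr_clip.mkv"))  # placeholder file name
# A genuine HDR10 file should report roughly:
#   pix_fmt=yuv420p10le, color_primaries=bt2020, color_transfer=smpte2084
# If it reports bt709, the file itself is SDR and no TV will show it as HDR.
```

If the files check out as real HDR10, the problem lies in the renderer or the HDMI signalling rather than in the content.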
 
My Windows PC detects an 8-bit HDR signal, yet the TV pop-up still shows 3840x2160 HDR10. Nvidia Control Panel also shows 8 bpc colour depth, and under Windows HD Color it says 8-bit HDR. Two more solid indications that, as far as my PC is concerned, this is only an 8-bit Rec. 709 display.

I think getting 10-bit HDR out of this Hisense panel (if it is really there) is way trickier than either of us thinks.
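One hedged aside on the 8 bpc reading: over HDMI 2.0, 4K60 RGB/4:4:4 does not have the bandwidth for 10-bit, so GPU drivers commonly fall back to 8 bpc with dithering, or switch to 4:2:2/4:2:0 chroma subsampling for 10-bit output. An 8 bpc readout in Nvidia Control Panel therefore doesn't by itself prove the panel is 8-bit. A rough back-of-the-envelope check:

```python
# Back-of-the-envelope: can HDMI 2.0 carry 4K60 RGB (4:4:4) at 10 bits per channel?
# HDMI 2.0 TMDS: 3 data lanes x 6 Gbit/s, 8b/10b coding -> 80% usable for video data.
HDMI20_PAYLOAD_GBPS = 3 * 6.0 * 0.8            # = 14.4 Gbit/s

# Standard 4K60 timing including blanking is 4400 x 2250 at 60 Hz -> 594 MHz pixel clock.
PIXEL_CLOCK_HZ = 4400 * 2250 * 60

for bits_per_channel in (8, 10, 12):
    gbps = PIXEL_CLOCK_HZ * 3 * bits_per_channel / 1e9   # RGB -> 3 channels per pixel
    verdict = "fits" if gbps <= HDMI20_PAYLOAD_GBPS else "does NOT fit"
    print(f"4K60 RGB {bits_per_channel:>2}-bit: {gbps:5.2f} Gbit/s -> {verdict}")
# 8-bit : 14.26 Gbit/s -> fits
# 10-bit: 17.82 Gbit/s -> does NOT fit (needs 4:2:2/4:2:0 or HDMI 2.1)
```

So a 10-bit HDR10 signal at 4K60 over HDMI 2.0 generally has to go out as YCbCr 4:2:2/4:2:0, or as 8 bpc RGB with dithering; which of those the GPU driver picks is a settings question, not evidence about the panel itself.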
 
Cheaper HDR panels have a different problem: their backlight is not good enough to render the higher contrast effectively, especially in a bright room.
However, colours won't appear faded.
The decolourized picture is an almost certain indication that your GPU is doing a WCG handshake with your TV but is then sending the regular colour gamut.

If you want to troubleshoot, the best method is to play back on Netflix and enable the A/V stats HUD.
Take note of the input video range and the display video range it shows.

A couple of samples below show:
- a Dolby Vision video being rendered as HDR10 (note both video range details)
- the same video in SDR rendered as SDR

Neither will show a washed-out colour effect unless your renderer tells your TV to expect HDR (which makes the TV change its colour gamut width) but then sends SDR instead.

IMG_5730 (1).jpg (DV -> HDR10)

IMG_5726.jpg (SDR -> SDR)
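For anyone curious why a transfer-function mismatch wrecks the picture, here is a small illustrative sketch (not a measurement of any particular TV) comparing what a few signal levels are meant to represent under SDR with what a display would produce if it decoded the very same code values with the HDR10 PQ curve:

```python
# Sketch: the same code values decoded as SDR gamma vs. as HDR10 PQ (SMPTE ST 2084).
# SDR is approximated here as gamma 2.4 with a 100-nit reference white; real TVs add
# their own tone mapping on top, so treat the numbers as illustrative only.

# SMPTE ST 2084 (PQ) EOTF constants
m1 = 2610 / 16384
m2 = 2523 / 4096 * 128
c1 = 3424 / 4096
c2 = 2413 / 4096 * 32
c3 = 2392 / 4096 * 32

def pq_eotf_nits(e):
    """Luminance in nits for a normalised PQ code value e in [0, 1]."""
    p = e ** (1 / m2)
    return 10000 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)

def sdr_gamma_nits(e, white_nits=100.0):
    """Approximate SDR (BT.1886-like) luminance for the same code value."""
    return white_nits * e ** 2.4

for e in (0.10, 0.25, 0.50, 0.75, 1.00):
    print(f"code {e:4.2f}: meant as SDR {sdr_gamma_nits(e):7.1f} nit | "
          f"decoded as PQ {pq_eotf_nits(e):8.1f} nit")
# The display applies a tone curve the content was never encoded with, so brightness
# and contrast land in the wrong places -- and the BT.709-vs-BT.2020 gamut mismatch
# skews saturation on top of that.
```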
 
And that almost confirms my suspicion that you have a renderer issue.
TV pop-up says HDR = the TV has switched to expect WCG but is not getting that in the input.
 
If the GPU is doing the wide colour gamut (WCG) handshake, then what is the problem with finally outputting it? I don't understand. Someone above said a Chromecast Ultra is needed for native HDR YT support, but I suppose that means a physical Chromecast stick costing thousands of rupees?! Anyway, what should I do now to play HDR content from either of the two YT apps, or via Plex? Any idea? Plex should work, in my opinion, as it is a totally different implementation from YT, and all the processing work is done by my PC anyway, so I don't get why the HDR10 content is still washed out.
 
Reset your TV and check everything again. Call customer support if it is not resolved even after resetting.
There are people writing reviews on Amazon and Flipkart saying that HDR10 videos and games play perfectly on this TV, yet you are clearly not satisfied.
There has to be some other issue.
 
I wish I could help.
PC + GPU setups can have a very large number of variations spanning the OS, playback app settings and driver mismatches, and it is really difficult, if not impossible, to troubleshoot remotely.

Possibly also the reason why HTPCs, which were all the rage a decade ago, are hardly used by anyone now.
 

To add some info to your problem: the basic spec for a TV to support HDR10 is a 10-bit panel, not 8-bit; secondly, the minimum brightness requirement for HDR standards is at least 1000 nits; and for Dolby Vision the panel must be 12-bit, not 10-bit, with 3000+ nits. Anything less than these specs is not true HDR or Dolby Vision. Just because all these budget-segment TVs can accept or read HDR and Dolby Vision files, they cannot be called HDR and Dolby Vision TVs; it is just a marketing gimmick for business.
 

From where do you get such great knowledge?
Please show me the path to that Alexandria.
 
He isn't exaggerating all that much, but he surely isn't well researched either. The specs he mentions aren't present in 99.8% of TVs. Obviously you need at least 700+ nits of brightness or so to display good contrast on a 55" panel, and the WCG requirement is broadly true, but DV does not require a 12-bit panel as a minimum. DV is backwards compatible with 10-bit and 8-bit panels like mine.
 