bitrate on 4k televisions

mastakilla87
Posted 6 months, 3 weeks ago
I've been looking at new TVs for my PlayStation Pro, and what I cannot get my head around is the bit rate. How do you know what bitrate a television is?
All Responses (5)
#1
You don't. Bitrate has nothing to do with the display; it's to do with the video quality of the source media.
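A rough sketch in Python, with made-up numbers, of how you'd work a stream's bitrate out from the file rather than from anything on the TV's spec sheet:

# Rough sketch (illustrative numbers): bitrate describes the source stream,
# so it comes from the file's size and running time, not from the TV.
def average_bitrate_mbps(file_size_bytes: int, duration_seconds: float) -> float:
    """Average bitrate of a video file in megabits per second."""
    return file_size_bytes * 8 / (duration_seconds * 1_000_000)

# e.g. a 40 GB UHD Blu-ray film running 2 hours averages about 44 Mbps
print(round(average_bitrate_mbps(40 * 10**9, 2 * 3600), 1))  # 44.4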
#2
razord
You don't. Bitrate has nothing to do with the display; it's to do with the video quality of the source media.


The source media will be a PlayStation Pro, and to get the best out of it apparently it's 10-bit.
#3
mastakilla87
razord
You don't. Bitrate has nothing to do with the display; it's to do with the video quality of the source media.
The source media will be a PlayStation Pro, and to get the best out of it apparently it's 10-bit.

That's for HDR.
#4
True Colour = 24-bit, i.e. 8 bits per primary colour: 16.7 million colours.
Deep Colour = 30-bit upwards, i.e. 10 bits per primary colour: 1.073 billion colours.
The higher the colour bit depth, the more colours the display can show.
https://en.wikipedia.org/wiki/Color_depth
https://www.avforums.com/article/what-is-hdr.11039
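A quick Python sketch of that arithmetic, if it helps:

# colours a panel can show = 2 ** (bits per primary colour * 3 primaries)
def total_colours(bits_per_primary: int) -> int:
    return 2 ** (bits_per_primary * 3)

print(f"{total_colours(8):,}")   # 16,777,216    -> True Colour (24-bit)
print(f"{total_colours(10):,}")  # 1,073,741,824 -> Deep Colour (30-bit)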
Move over Colour space and welcome to Colour volume oO

I'm old skool so to me TV video is analogue with separate chrominance and luminance :(
Edited By: kester76 on Dec 27, 2016 15:28
#5
mastakilla87
razord
You don't. Bitrate has nothing to do with the display; it's to do with the video quality of the source media.
The source media will be a PlayStation Pro, and to get the best out of it apparently it's 10-bit.

That's the colour bit depth; bit rate is the quality of the video compression, measured in Mbps.
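To put rough numbers on the difference (a sketch only; it assumes full RGB with no chroma subsampling, so treat it as a worst-case figure):

# Rough sketch: bit depth feeds into the size of the *raw* signal, while the
# quoted 'bitrate' is the compressed stream the source actually sends.
def uncompressed_mbps(width: int, height: int, bits_per_primary: int, fps: int) -> float:
    return width * height * 3 * bits_per_primary * fps / 1_000_000

print(uncompressed_mbps(3840, 2160, 10, 60))  # ~14,930 Mbps of raw 4K/10-bit/60Hz
# Compare: a 4K HDR stream is compressed to roughly 15-25 Mbps (streaming) or
# up to ~100 Mbps (UHD Blu-ray) - hundreds of times smaller than the raw feed.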

Any TV that advertises any sort of HDR support will accept a 10-bit signal. As this includes every TV with even partial HDR display ability, it's not something you have to worry about.

If you want to get the best out of it then it's not the bit depth you need to be looking at but how much HDR support the TV has.

HDR effectively consists of three components:

1. A wider colour space. Normal TVs use the Rec. 709 colour range; HDR TVs are typically measured as a percentage of DCI-P3 or Rec. 2020. The more the merrier, and this tends to be the easiest part of HDR to add. It has nothing to do with bit depth, which is the spacing between shades rather than how far the colour range extends (there's a sketch of this at the end of the post).

2. Higher brightness. A typical TV is around 300 cd/m²; the HDR standards allow for 4,000-10,000 cd/m², with the best HDR TVs at the moment hitting 1,500-2,000 cd/m².

3. Local backlight dimming, so that you can have a very bright section next to a very dark one. It's the most expensive part of the technology and only found on top-of-the-range TVs at the moment.

The TVs available on a 'normal' budget (£600-800 at 50") have the first and a little of the second (400-500 cd/m²), and at the higher end the occasional rudimentary stab at the third.

What's available depends on how much you're willing to spend and what size you want, but that's generally the current state of HDR support.
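As for the sketch promised above, here's roughly why bit depth matters more as the brightness range grows. This assumes naive linear coding for simplicity; real video uses a non-linear (perceptual) transfer curve, so the numbers are only illustrative:

# Illustrative only: bit depth sets the spacing between shades; the brightness
# range sets how far those shades have to stretch. Linear steps assumed here.
def step_nits(peak_nits: float, bits: int) -> float:
    """Brightness increment per code value under naive linear coding."""
    return peak_nits / (2 ** bits - 1)

print(round(step_nits(300, 8), 2))    # 1.18 - SDR-ish range on 8-bit
print(round(step_nits(1000, 8), 2))   # 3.92 - HDR range on 8-bit: visible banding
print(round(step_nits(1000, 10), 2))  # 0.98 - HDR range on 10-bit: much finer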
