PANASONIC VIERA TX-49DX650B Smart 4k Ultra HD 49" LED TV - £479 at Currys
45° · Expired


Found 5th Apr 2017
Currys are having an Easter Mega Sale with hundreds of deals at Black Friday prices, apparently. £200 off the PANASONIC VIERA TX-49DX650B Smart 4K Ultra HD 49" LED TV. Seems like quite a decent price, as a previous thread had it at £585 and got 85°. I'm not too savvy on TVs, so I'm sure I'll get a good schooling in the process. If anyone knows a code to go with this, that would be helpful too.
The link to many more 4K Ultra HD TVs is here: More 4K TVs
24 Comments
Thank you for posting! I looked through some of the ones on offer in the promotion and found what I believe to be the cheapest 10-bit HDR set - a Samsung 55" KS7000 for £899 - does anyone know of a cheaper 10-bit? To answer my own question, the Samsung UE43KS7500 @ £799 is the cheapest I have now found - thank you OP. I am using website display specifications to check the panel bit depth, for anyone who wants to know... and PRC sell it for £768 - getting better and better!


Edited by: "JumpMan1980" 5th Apr 2017
I want a 10-bit plus 3D. Is LG any good?
Original Poster
martywarty

I want a 10-bit plus 3D. Is LG any good?


??? huh?
I want a 4K 10-bit panel with 3D. I've seen LG have one - are they any good?
Original Poster
martywarty

I want a 4K 10-bit panel with 3D. I've seen LG have one - are they any good?


Huh???
Why the huh
Original Poster
roscoeward

Huh???


So... You have seen a 4K, 10bit panel, that incorporates 3D? You've seen one, you want it. And you're asking if it's good quality?
Original Poster
roscoeward

So... You have seen a 4K, 10-bit panel, that incorporates 3D? You've seen one, you want it. And you're asking if it's good quality?


Which one is it?
Original Poster
JumpMan1980

Thank you for posting! I looked through some of the ones on offer in the promotion and found what I believe to be the cheapest 10-bit HDR set - a Samsung 55" KS7000 for £899 - does anyone know of a cheaper 10-bit? To answer my own question, the Samsung UE43KS7500 @ £799 is the cheapest I have now found - thank you OP. I am using website display specifications to check the panel bit depth, for anyone who wants to know... and PRC sell it for £768 - getting better and better!


Can you post the link please
Original Poster
roscoeward

Can you post the link please


£749 with 5 year warranty reliantdirect.co.uk/HOM…IWw
roscoeward

£749 with 5 year warranty http://www.reliantdirect.co.uk/HOME/Product-Detail/Samsung-UE43KS7500-43-Series-7-Curved-SUHD-with-Quantum-Dot-Display_UE43KS7500.htm?LGWCODE=16135;58699;3681&gclid=CKywh_fJjdMCFRVmGwodVMIIWw



Nice one - even cheaper again!!
JumpMan1980

Nice one - even cheaper again!!



It's not cheaper than the OP's listing, as it's a different size screen - the original was 55", your listing is 43".
Original Poster
martywarty

It's not cheaper than the OP's listing, as it's a different size screen - the original was 55", your listing is 43".


The OP's listing was 49" but still not the same...
JumpMan1980

Thank you for posting! I looked through some of the ones on offer in the promotion and found what I believe to be the cheapest 10-bit HDR - a Samsung 55" KS7000 for £899 - does anyone know of a cheaper 10-bit?



'10-bit HDR' is a pretty meaningless phrase.

The parts where 10-bit support matters are the input (so you can receive HDR content) and the processing electronics (so you don't get banding when converting to the TV's colour space). As far as I know, both are standard on any 2016 or 2017 TV that claims any sort of HDR support.

Having a 10-bit LCD panel makes very little difference. Maybe you can see the difference between 8-bit with dithering and 10-bit when you sit 5 feet away from an 80"+ screen, but you certainly won't see it on normal sizes and viewing distances.

The elements that make the biggest difference between HDR implementations are the backlight hardware, particularly its local dimming abilities, and how the electronics handle brightness and colour spaces beyond what the TV can display.

The KS7000 is a good HDR TV for the money, but mainly because it's the cheapest TV that has a local dimming system that actually improves the picture - although it is rather rudimentary and I'd expect things to improve quite a bit over the next few years.


EDIT: As for the TV in the deal, I found the DX600/DX650 to have fairly poor picture quality compared to others at the same price, so cold from me.
Edited by: "EndlessWaves" 5th Apr 2017
EndlessWaves

'10-bit HDR' is a pretty meaningless phrase. The important bits to have 10-bit support are the input (so you can receive HDR content) and the electronics (so you don't get banding when converting to the TV's colour space)…



It's not meaningless to me - it is a matter of perspective. I want a TV to 'show off' my PS4 Pro's ability, and I have been led to believe 10-bit HDR is something I want to ensure the TV I choose has. Is that not correct, as far as you're concerned?
EndlessWaves

'10-bit HDR' is a pretty meaningless phrase. The important bits to have 10-bit support are the input (so you can receive HDR content) and the electronics (so you don't get banding when converting to the TV's colour space)…



Thank you for the info though - this is a learning curve and all info is welcome!
Here's a review of the Samsung TV posted earlier in this thread - I am still thinking it is the one for me at my current budget, with the intention of seeing what a PS4 Pro 'can do' - avforums.com/rev…590
JumpMan1980

It's not meaningless to me - it is a matter of perspective. I want a TV to 'show off' my PS4 Pro's ability, and I have been led to believe 10-bit HDR is something I want to ensure the TV I choose has. Is that not correct, as far as you're concerned?



I suppose a 10-bit panel works as a crude filter for good HDR hardware because it's an expensive technology with minor benefit so it's only found in high end TVs that can afford such things. And high end TVs also have good HDR hardware.

The core of HDR is the high dynamic range of brightness. For LCDs this is the ability of the backlight to brighten and dim in sections, independent of the LCD panel. That gives you the ability to have the normal picture over most of the screen, but also have things like light sources, glints or explosions that are brighter than a white surface (which is normally the maximum brightness).

The finer the control that the backlight provides, the better. A high end TV at the moment has around 500 dimming zones (the KS7000 has ten).

HDR content is encoded in 10-bit colour depth because that gives 1024 shades of brightness instead of the normal 256. This doesn't mean four times the brightness range, though - these are the number of divisions between black and white. How those divisions map to actual brightness is defined by what is called the gamma curve, and HDR introduces a new one, called Perceptual Quantization, which is what gives it the higher brightness range.
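For anyone curious, the Perceptual Quantization curve mentioned above is defined in SMPTE ST 2084, and it's simple enough to sketch in a few lines of Python. The constants come from that spec; this is just an illustration of how code values map to brightness, not production colour management:

```python
# SMPTE ST 2084 (PQ) EOTF sketch: maps a 10-bit code value to
# absolute luminance in nits, up to the spec's 10,000-nit peak.
m1 = 2610 / 16384        # PQ constants from ST 2084
m2 = 2523 / 4096 * 128
c1 = 3424 / 4096
c2 = 2413 / 4096 * 32
c3 = 2392 / 4096 * 32

def pq_to_nits(code, bits=10):
    """Convert an integer code value at the given bit depth to nits."""
    e = code / (2 ** bits - 1)            # normalise to 0..1
    ep = e ** (1 / m2)
    y = max(ep - c1, 0) / (c2 - c3 * ep)
    return 10000 * y ** (1 / m1)

# 10-bit gives 1024 steps between black and peak; 8-bit only 256.
print(round(pq_to_nits(1023)))   # peak code value -> 10000 nits
print(round(pq_to_nits(512)))    # mid code value -> roughly 90 nits
```

Note how non-linear the curve is: half the code range only reaches about 1% of peak brightness, which is why PQ can cover such a huge range with only 1024 steps.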

LCDs work in a similar way. A 10-bit panel has an identical black and white level to an 8-bit one; it just splits the range up more finely. It's great if you're working with subtle gradations of colour - the greyscale monitors that doctors examine x-rays on have been 10-bit for years.

Having more shades in the existing brightness range is of limited benefit to HDR, though. I think a lot of its benefit in that area comes from the ability to get a more exact shade during this transition period between colour gamuts, where the standards, the content and the TV's colour gamut are all likely to be mismatched.

8-bit panels typically use dithering: adjacent pixels take similar colours that combine to produce the target colour when viewed from far enough away. And 'far enough away' is a very short distance given the tiny size of the pixels on a 4K screen.
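The dithering idea is easy to demonstrate. A minimal sketch in plain Python (the `dither_row` helper is hypothetical, just for illustration) that quantises 10-bit values to 8-bit with simple error diffusion, so neighbouring pixels average out to the in-between shade:

```python
# Spatial dithering sketch: an 8-bit panel approximates an in-between
# 10-bit shade by alternating the two nearest 8-bit levels, so the
# average over neighbouring pixels matches the requested value.
def dither_row(values_10bit):
    """Quantise a row of 10-bit values (0-1023) to 8-bit with error diffusion."""
    out, err = [], 0.0
    for v in values_10bit:
        target = (v + err) / 4             # ideal 8-bit value, may be fractional
        q = max(0, min(255, round(target)))
        err = (target - q) * 4             # carry the rounding error to the next pixel
        out.append(q)
    return out

# A flat field at 10-bit level 514 sits between 8-bit levels 128 and 129:
row = dither_row([514] * 8)
print(row)    # alternates 128 and 129; the average works out to 128.5
```

Viewed from a normal distance, the alternating 128/129 pixels blend into the single shade the 10-bit source asked for - which is why an 8-bit panel with good dithering is hard to tell apart from a true 10-bit one.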

The extra brightness range on LCDs is created by backlight control, not the LCD panel. You want a backlight that can deliver that brightness to the spots requested and electronics that are good at both dividing the work between LCD and backlight and good at approximating content that the TV can't display. HDR specifies wider brightness and colour ranges than any TV on the market can currently display.

If you want a single number that specifies HDR quality, then the number of dimming zones is often a pretty good approximation right now.
EndlessWaves

I suppose a 10-bit panel works as a crude filter for good HDR hardware because it's an expensive technology with minor benefit, so it's only found in high end TVs that can afford such things…


Expert on Dimness!
Can anyone help? Has anyone bought a TV from, or heard of, this website? It's so cheap (pixeltelevisions.co.uk).
EndlessWaves

I suppose a 10-bit panel works as a crude filter for good HDR hardware because it's an expensive technology with minor benefit, so it's only found in high end TVs that can afford such things…



This was brilliant, thank you. Very knowledgeable. Much appreciated.