4K resolution HD Televisions???

Has anyone had any experience with the new 4K resolution HD TVs? Are standard HDMI cables suitable for them, or do they require a specialist cable?

Thanks
 
4K TVs are out now. But there's no readily available content for the average user. Sony have what amounts to a hard drive media player. But the service relies on downloading via the web, so users had better have an unlimited streaming package or else face some fairly hefty usage charges.

Sony and Panasonic have joined forces to develop the next generation of optical disc. It uses blue-violet laser technology to pack a 4K film onto the same 12cm size disc as Blu-ray.

Current TVs are shipping with HDMI 1.4 spec connectors. This is the spec that has been in place since May 2009. This spec will handle 4K resolution. The limit is the refresh rate. 4K films will be fine since they are 24 frames per second (fps) and the standard can handle 30fps. What the standard won't handle is anything refreshing faster than that. So if you had a 4K game with a frame refresh rate of 50 or 60 fps then you'd be out of luck.
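To see why the refresh rate rather than the resolution is the sticking point, you can do the sums yourself. A rough sketch below, with some assumptions flagged in the comments: the payload figures are the commonly quoted effective maximums for each HDMI version, and the calculation ignores blanking overhead, so real requirements are a bit higher.

```python
# Rough data-rate check for 4K video over HDMI.
# Assumes 3840x2160 pixels at 24 bits per pixel (8-bit RGB) and
# ignores blanking intervals, so real links need a little more.

def data_rate_gbps(width, height, bits_per_pixel, fps):
    """Raw video payload in gigabits per second."""
    return width * height * bits_per_pixel * fps / 1e9

# Commonly quoted effective payload limits (raw rate less 8b/10b coding).
HDMI_1_4_MAX = 8.16   # Gbps (10.2 Gbps raw)
HDMI_2_0_MAX = 14.4   # Gbps (18 Gbps raw)

for fps in (24, 30, 60):
    rate = data_rate_gbps(3840, 2160, 24, fps)
    print(f"4K @ {fps} fps needs ~{rate:.1f} Gbps "
          f"-> fits HDMI 1.4: {rate <= HDMI_1_4_MAX}")
```

4K film at 24fps comes out around 4.8 Gbps and 30fps around 6 Gbps, both inside the HDMI 1.4 budget; 60fps needs roughly 12 Gbps, which is why it had to wait for HDMI 2.0.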

HDMI 2.0 was announced in Sept this year. It can refresh 4K at 60fps.

Early indications were that HDMI 2.0 would require new hardware in the sockets to handle the extra refresh rate. The latest indications are that it will be achieved with just a firmware update.

The cables you're buying today (High Speed or High Speed with Ethernet) should handle the increased refresh rate once HDMI 2.0 sources start to hit the streets as long as those cables are well made and meet the standard. However, a good contingency plan is to install cables in conduit to make replacement easy. Or to lay in a couple of runs of Cat6 to be used with balun boxes.

Anyone buying a 4K TV today is at what we refer to as the bleeding edge of the technology envelope. They're paying the highest price and taking the biggest risk that something in the standard will change. That's always a risk with 1st gen products.
 
The camera club projector has a 1400x1050 display, and even sitting at the front with a large screen I can't see any sign of pixels. We all have to reduce our picture sizes to match the projector.

At 4K, unless it's projected, one is unlikely to see any difference in quality; only on a cinema-size screen would you see it. As a large computer monitor, where you are within 2 foot of the screen, maybe. But in my room, sitting 12 foot from a 32 inch TV, I can hardly see the difference between standard and HD. OK, if I sit on top of the TV I can see the difference, but my TV does not dominate my room.

I see the point of future-proofing, but over the years I have done so much with that in mind, only for it now to be redundant.

I wired twin telephone lines to the fax machine, one in and one out, for the old 9.6k standard. Now at 14k it's not required, even if you still have a fax.

I wired a LAN using 95 ohm coax; that was soon outdated.

The list goes on, and now I don't really bother with future-proofing. Experience has shown me it all changes too fast, so I only do it now as required.
 

Great answer, thanks
 
I always find it ironic that most people can only afford the really good stuff once their eyesight and hearing have dulled a bit, probably negating the extra pixels / greater clarity of the sound :D
 
Hear, hear (pardon?)
It's certainly true for me. One ear is down to 5kHz and any appreciable volume just causes distortion. Everything sounds terrible. :(
 
The camera club projector has a 1400x1050 display, and even sitting at the front with a large screen I can't see any sign of pixels. We all have to reduce our picture sizes to match the projector.
Many years ago I went to a product pitch given by Kodak (IIRC) for some new commercial film processing/printing equipment they'd introduced, and there was a 35mm slide show.

I had never seen such stunning quality in my life. Chatting to them afterwards, it turned out that the originals had been shot on 10x8, then optically reduced onto the finest stock Kodak had, which was then trimmed and mounted into frames.


mind only now to be redundant.
Only in the GD forum, surely?
:LOL:
 
Chester had two award-winning buildings: one the police station, with a special filing system so files could be accessed from all floors, and the other the MANWEB building, where the lights were the only heating required. Both were cutting edge at the time they were built, and both are now demolished.

Less than 40 years from cutting edge to redundant.

I was so proud of my house with its maximum of 5 telephones, with sockets in every room so we could easily move phones as required. Today a single base with cordless handsets does a better job.

I had Cat 5 around the house; now it's only used locally around the main TV. Laptops don't need it, and I don't want extra leads when one sits on my knee. Same with my wife and her iPad and iPhone: the LAN still exists but is not used.

OK, I got some use out of it before it was outdated, but installing just in case is just not worth it. Technology moves too fast for it to be worth installing any system for later. Some conduit where one can easily pull in cables, maybe, but there's no point pulling in the cables themselves.

I bought 8-track and V2000; both were the best at the time, and both died mainly due to cost. No one wanted to pay the extra for the better system; cassette tapes and VHS were cheaper, and so, although better, 8-track and V2000 went.

3D TV now seems to be having problems, with broadcasts being reduced, and DAB radio is not taking off as hoped, so I would not bother until 4K has been proven and there is far more commercial take-up than at the moment.
 
I disagree. A Cat5e network at Gigabit speed has a number of years left before it starts to fall behind; install Cat 6 or 7 and be safe for a good while longer. Besides, a wireless network does not have anywhere near the same bandwidth as a wired LAN (it's not all about speed!); even the current N standard and the next few iterations will not provide the bandwidth that even a Gigabit network will.

The more wireless devices you have, the less bandwidth is available for each; a wired LAN gives full bandwidth to each device.
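A quick back-of-envelope sketch of that sharing effect. The throughput figures are illustrative assumptions, not measurements: real-world 802.11n throughput is taken as roughly 70 Mbps (well below the headline link rate), against 1000 Mbps per port on a Gigabit switch.

```python
# Illustrative comparison: shared wireless medium vs switched wired LAN.
# Both figures are assumptions for the sake of the sketch.

WIFI_N_REAL_MBPS = 70       # assumed usable 802.11n throughput
GIGABIT_PORT_MBPS = 1000    # Gigabit Ethernet, per switch port

def per_device_mbps(total_mbps, devices, shared):
    """Bandwidth each active device can count on.

    Wi-Fi is a shared medium, so the airtime is split between active
    devices; a switched wired LAN gives every port its full rate.
    """
    return total_mbps / devices if shared else total_mbps

for n in (1, 4, 8):
    wifi = per_device_mbps(WIFI_N_REAL_MBPS, n, shared=True)
    wired = per_device_mbps(GIGABIT_PORT_MBPS, n, shared=False)
    print(f"{n} active devices: ~{wifi:.0f} Mbps each on Wi-Fi, "
          f"{wired:.0f} Mbps each wired")
```

With eight devices competing, each gets under 10 Mbps of Wi-Fi on those assumptions, which is exactly why a busy household notices the difference.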

There is an argument for having both in a 'connected' house. For a bit of web surfing wireless isn't a problem, but try streaming FLAC audio over Wi-Fi while downloading a large file from a site that can serve faster than your broadband connection: yep, the audio will stutter, and it gets annoying. Then add in wireless interference and the handshaking that different wireless routers/APs do with each other, and to be frank it's crap.

Hard-wire your streaming devices (TVs, Blu-rays, server etc.) and reserve the Wi-Fi for the tablets, phones and laptops: have the best of both worlds :)
 
I was going to post along the very same lines as ironsidebob. :LOL:

There is a paradox: the more wireless-enabled devices we have, the greater the need for a wired network. I'm installing more wired networking for clients than ever before. The problem they're finding is that wireless doesn't cut it when there are lots of devices all competing for a share of the bandwidth.

The job I'm doing at the moment is a prime example. There are five people in the house, including three teens. There are at least two laptops, three tablets, a couple of game consoles, a clutch of mobile phones and two smart TVs, all trying to share the same g+n wireless network. Two of the kids are big into online gaming, so at weekends there's usually a console on for several hours at a time. One of the issues the client asked me to tackle was the poor connection speed. Hard-wiring as much as possible has helped enormously.
 
