Advantages of IP CCTV cameras vs standard non-IP

IP can have significantly higher resolution, enabling facial identification over a wide area by one camera.
 
Both analogue and digital cameras come with a range of resolutions. It's the number of lines resolved that is important. Anything less than 1080 is not going to be much use, and that includes most of the cheap stuff sold in the big chain stores.
 
Think you may be getting your digital and analogue cameras confused Sally2000.
Half-decent analogue cameras can be 800TVL plus (television lines).
High-definition cameras come in the 1080p variety, which relates to pixel count, i.e.
1920 x 1080-pixel resolution, or about 2.1 megapixels.
 
'High definition' at 1080p is still pretty poor resolution for CCTV purposes. 5-8 megapixel cameras are common, and higher resolutions are available for specialised applications.
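To put those resolutions side by side, here's a quick sketch in Python. The width x height figures are assumed typical values for each class of camera, not specifications from any particular model:

```python
# Compare total pixel counts for common CCTV resolutions.
# The width x height figures are assumed typical values.

def megapixels(width, height):
    """Total pixel count, in megapixels."""
    return width * height / 1_000_000

resolutions = {
    "D1 analogue":   (704, 576),
    "1080p":         (1920, 1080),
    "5 MP":          (2592, 1944),
    "8 MP (4K UHD)": (3840, 2160),
}

for name, (w, h) in resolutions.items():
    print(f"{name}: {megapixels(w, h):.1f} MP")
```

The jump is stark: an 8 MP camera captures roughly four times the pixels of 1080p, and twenty times a D1 analogue feed, which is why faces become identifiable at a distance.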
 
IP can have significantly higher resolution, enabling facial identification over a wide area by one camera.


Is that factually correct though? I was told by a few installers that I should go for a TVI model for high resolution. A few have mentioned Hikvision, or Serage - see http://www.gtecsecurity.co.uk/ or even those offered by cctv42.co.uk
 
TVI is just a method for connecting digital cameras, as opposed to analogue, and allows them to be connected in a conventional manner, i.e. over coax. HD cameras are exceptionally good compared to the old analogue cameras. I really fail to see how 1080p cameras can be described as "poor" - compared to old standard-definition cameras the picture quality is a huge leap.

You could always sell your house and buy one of these:
http://avigilon.com/products/video-.../hd-pro-cameras/29-mp-jpeg2000-hd-pro-camera/
 
Think you may be getting your digital and analogue cameras confused Sally2000.
Half decent analogue cameras can be 800TVL plus (television lines)


No confusion at all.
Resolution is the same measurement whether analogue or digital. 800 lines is better than a lot of analogue cameras, but still won't identify faces at a distance.
My cameras are all 1080p (that's 1080 horizontal lines, progressive scan). I consider that's the minimum standard and I may well replace them with something better.
And the recorder must of course be capable of saving that full resolution - some can't.
 
A few things come to mind putting IP ahead.
Being network connected, they can be put anywhere you have a network connection, and it only needs one network connection to support multiple cameras if you can put a switch somewhere to "split" it. Analogue cameras need a separate coax run per camera, and the cable can quickly add up - been there, done that. Add in any extra features (such as remote pan/tilt/zoom) and you need yet more cable for that.
In the same way, when you get to your recorder end, the data is already digital so that saves on the need for analogue-digital conversion.
Of course, the cameras these days are all digital to start with - so a bit like hooking up an LCD monitor to a computer via VGA (analogue) it does seem daft to convert an "almost" digital signal to analogue so you can change it back again at the other end of the coax.
And then, an analogue recorder (do they still exist?) would have an analogue output, and you'd have similar issues with piping the video and control to somewhere else; but a digital recorder will most likely also have network access for viewing the recordings - so the recorder can be put anywhere (making it easy for physical security) without any restrictions on where it can be accessed from.

There are some downsides though.
With an analogue connection it's fairly easy to do a long run with decent low loss coax. Twisted pair networks are (nominally) limited to 100m between endpoints including all patch cables etc. You can go longer but you're eating into the margins. But if you need long runs then you really need to be putting fibre optic links in which puts the cost up.
And while analogue is pretty much sewn up in standards (meaning you can be pretty certain that any analogue camera will work with any analogue system - at least at standard def), there are a number of standards for digital video, and further scope for manufacturers to produce "not quite standard" implementations, causing compatibility issues.
 
SimonH2,
So theoretically speaking (not that I am going to do it this way), are you saying that if I had an RJ45 ethernet socket in the wall of one of my rooms, I could simply drill through the outer wall, bring a cable in from a new IP camera installed outside (the cable being Cat 6), plug it into the ethernet socket, and then capture that camera's video in software of some sort?

I presume this RJ45 ethernet socket would need to be PoE at source, from wherever the cable behind the socket starts?
If this RJ45 ethernet socket is not PoE, could I add a switch after the socket and then run several camera cables off that switch, which is then plugged into that RJ45 ethernet socket? As I say, I am not proposing to do it this way, but just trying to understand the technical capabilities.

What about the downside that the home network bandwidth could get clogged up with 8 IP cameras?
 
In principle, yes - anywhere you already have a "network" socket, you could put a camera. If you need more connections than already exist, then you can plug a switch in - that's how modern ethernet networks work.
Power can be a bit tricky ...
Many cameras are PoE, but you do need to understand the specs. "Standard" "active" PoE (proper name 802.3af) can supply up to 15.4W at the source, which works out at about 12.95W at the device; 802.3at (PoE+) can supply up to 25.5W at the device. There isn't a lot of 802.3at equipment about yet, and it carries a price premium over the lower-power standard.
More about PoE at https://en.wikipedia.org/wiki/Power_over_Ethernet

There is also passive PoE, where the power supply is just stuffed onto spare wires in the cable and picked off at the load end. There is a small risk with these of plugging "the wrong thing" into one or other end and causing damage - the power supply does not check what it's supplying, it just stuffs the power in. Many use 24V, but some use 48V, and some use 12V or even 5V - I think you can see the potential for mismatch if care isn't taken. Passive PoE generally doesn't have the same power limitations - but the flip side is that the cables and connectors have limits on how much power can be put down them without damage. Part of the 802.3 PoE design process was working out what power level was practical - wear and tear on connectors when unmated while carrying power, heat generated in cables (in commercial environments, the cables may be in very large bundles), and so on.

There are some "interesting" pitfalls to look out for with active PoE. For example, someone at work was helping a customer set up a PoE IP camera and found it stopped working every night - only to come back to life in the morning :?: Yes, the camera was PoE - but only if you didn't use the built-in IR illumination, which took it over the power budget (the fix was to use a local power supply). So yes, you need to read things carefully - such details are often not overtly stated. You may need to use local power, either for the whole unit, or perhaps just for the IR illuminator if it has one.
IR illumination is used for "night vision" - when daylight goes, IR (infrared) LEDs (or, on older setups or ones needing lots of power, a big light bulb) illuminate the scene with IR so that the camera can still show you a black and white image.

Just for fun, it's possible to buy network switches which are themselves powered by an upstream PoE supply and which can supply PoE to downstream devices. Obviously, they can't supply more to downstream devices than they have left over after taking their own power requirements from the upstream supply - there might be enough budget for an 802.3at powered device to power one 802.3af camera and maybe have enough left over for one IP phone (they generally take a lot less power).
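That budgeting exercise can be sketched as simple arithmetic. This is only an illustration: the per-device maximums come from the 802.3af/at standards, but the 5W switch draw and 6W camera draws are assumed example figures, not specs from any real product:

```python
# Sketch of a PoE power-budget check for a PoE-powered switch.
# 802.3af delivers up to ~12.95 W at the powered device;
# 802.3at (PoE+) up to 25.5 W. The 5 W switch draw and 6 W
# camera draws below are assumed example figures.

POE_AF_DEVICE_W = 12.95
POE_AT_DEVICE_W = 25.5

def remaining_budget_w(upstream_w, switch_draw_w, device_draws_w):
    """Watts left over after the switch and its attached devices
    take their share; negative means the budget is blown."""
    return upstream_w - switch_draw_w - sum(device_draws_w)

# A PoE+-fed switch drawing 5 W itself, feeding two 6 W cameras:
left = remaining_budget_w(POE_AT_DEVICE_W, 5.0, [6.0, 6.0])
print(f"{left:.1f} W spare")
```

With those assumed numbers there's 8.5W left over - enough for a low-power IP phone, but not for another camera that wants a full 802.3af allocation.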

Caveat, I'm in IT - not a CCTV or Security expert.
 
What about the downside that the home network bandwidth could get clogged up with 8 IP cameras?

Again it will depend upon the resolution and therefore the bandwidth of the cameras. A typical high-res camera would run up to about 4 Mb/s, so 8 of them might need 30+ Mb/s. That is a fair chunk of a 100Mb/s home network, but many networks are now 1Gb/s so that wouldn't be a problem. However you'd need a seriously fast NVR to save all that data, and a very large hard disk array to store it. That's quite an investment.
 
Indeed, forgot about addressing the bandwidth :rolleyes: On a gigabit network it's not going to be noticeable anyway, and gigabit switches are so cheap now that it's not really worth the saving to buy 100M stuff.
 
What about the downside that the home network bandwidth could get clogged up with 8 IP cameras?
Again it will depend upon the resolution and therefore the bandwidth of the cameras. A typical high-res camera would run up to about 4 Mb/s, so 8 of them might need 30+ Mb/s. That is a fair chunk of a 100Mb/s home network, but many networks are now 1Gb/s so that wouldn't be a problem. However you'd need a seriously fast NVR to save all that data, and a very large hard disk array to store it. That's quite an investment.
You could use cameras with on-board storage at high resolution, and just pass a low res feed over the network for viewing.
 
