Most people know something about coaxial cables, but few know the difference between the two common impedances: 50 Ohm and 75 Ohm. Even many cable manufacturers fail to adequately explain the difference between the two types of coax and why you might need one type over the other. Therefore, today’s article will define some of the most critical concepts in coaxial cable technology and shed light on the difference between these two coaxial types.
An Ohm is the unit of resistance, a measure of opposition to the flow of electrical current through a circuit. In the simplest applications, dealing with DC (Direct Current) electricity, we measure that resistance directly in Ohms. This might suggest that the lower the Ohm value, the better the performance, so a 50 Ohm cable should always beat a 75 Ohm cable. However, a cable’s characteristic impedance is not a simple DC resistance: it describes how the cable behaves with high-frequency AC signals. And in practice you will find 75 Ohm cables everywhere inside your home, from the back of the TV to cable and satellite TV boxes and internet routers. Why are 75 Ohm coaxial cables so popular in the home network if their impedance is supposedly “worse”? In fact, there really is no good or bad impedance, just the right impedance for your application. For detailed information about these two types of coaxial cable, please read on.
Shedding Light on Coaxial Cable
Coaxial cable is comprised of three main components—center conductor, dielectric and shield. The center conductor, as the name implies, sits in the middle of the coaxial cable and can be made of either solid or stranded wire. Surrounding the center conductor is the dielectric, which acts as a buffer of sorts to keep the center conductor isolated and centered. Finally, on the outside of the dielectric is the coaxial cable’s shield, which is usually a combination of copper and/or aluminum foil and wire braid. The shield is then coated with a jacket, typically PVC, to insulate it from the environment. As noted before, coaxial cable can be divided into two types according to impedance: 50 Ohm and 75 Ohm.
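The impedance of a coaxial cable follows directly from the geometry just described: the ratio of the shield’s inner diameter to the center conductor’s diameter, together with the dielectric constant of the insulator between them. The following is a minimal sketch of the standard formula; the cable dimensions used are approximate nominal values for RG-59-like (75 Ohm) and RG-58-like (50 Ohm) cables, assumed here for illustration:

```python
import math

def characteristic_impedance(shield_inner_d_mm, conductor_d_mm, dielectric_constant):
    """Approximate characteristic impedance (Ohms) of a coaxial cable.

    Z0 = (138 / sqrt(er)) * log10(D / d), where D is the inner diameter of
    the shield, d is the diameter of the center conductor, and er is the
    relative dielectric constant of the insulator between them.
    """
    return (138.0 / math.sqrt(dielectric_constant)) * math.log10(
        shield_inner_d_mm / conductor_d_mm
    )

# Approximate nominal dimensions (illustrative assumptions), with a
# solid-polyethylene dielectric (er ~ 2.25):
print(round(characteristic_impedance(3.7, 0.58, 2.25)))   # RG-59-like: ~74 Ohm
print(round(characteristic_impedance(2.95, 0.9, 2.25)))   # RG-58-like: ~47 Ohm
```

Note that the impedance depends only on the diameter ratio and the dielectric, not on the cable’s absolute size, which is why thin and thick cables of the same impedance both exist.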
50 Ohm Coaxial Cables: The Forgotten Impedance
First, let’s look at 50 Ohm coaxial cables. Experimentation in the early 20th century determined that the best power handling capability could be achieved with 30 Ohm coaxial cable, whereas the lowest signal attenuation (loss) could be achieved with 77 Ohm coaxial cable. However, few dielectric materials are suitable for use in a coaxial cable at 30 Ohm impedance. Thus, 50 Ohm coaxial cable was selected as the ideal compromise, offering both high power handling and low attenuation characteristics.
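The 50 Ohm figure is often explained as a numerical compromise between those two optima: it sits near both the arithmetic mean and the geometric mean of 30 and 77. A quick sketch of that back-of-the-envelope reasoning:

```python
import math

power_optimum = 30   # Ohms: best power handling
loss_optimum = 77    # Ohms: lowest attenuation

arithmetic_mean = (power_optimum + loss_optimum) / 2
geometric_mean = math.sqrt(power_optimum * loss_optimum)

print(arithmetic_mean)           # 53.5
print(round(geometric_mean, 1))  # 48.1
# 50 Ohms is a convenient round value between the two compromises.
```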
With 50 Ohm coaxial cable being the best compromise solution, practically any application that demands high power handling capacity, i.e. 100 watts or more, will use 50 Ohm coaxial cable. A good rule of thumb is that any device that functions as a transmitter or transceiver tends to use 50 Ohm coaxial cable. This includes devices such as CB/Ham radios, broadcast radio/TV transmitters, Wi-Fi and cellular phone repeaters, and 2-way radios. And since 50 Ohm cables aren’t as ubiquitous as 75 Ohm cables in the home network, running cable can be more difficult if your building is not pre-wired for it. The cable itself is also noticeably thicker than 75 Ohm cable.
75 Ohm Coaxial Cable is the Way to Go
However, not every case warrants high power handling, so 50 Ohm coaxial cable is not appropriate for every application. When the objective is to get the signal through the cable as efficiently as possible, losing very little signal strength in the process, 75 Ohm coaxial cable is the way to go. This includes devices such as satellite and cable TV receiver boxes, high definition televisions, AM/FM radio receivers, and police scanners.
Thanks to its low attenuation and low capacitance, 75 Ohm coaxial cable is the cable of choice for practically all types of digital audio, digital video and data signals. This is also the reason why every cable TV company uses 75 Ohm coax for distributing its digital video channels as well as its broadband internet data signals. Additionally, direct broadcast satellite dishes and over-the-air HDTV antennas require 75 Ohm coaxial cable to ensure that all of the digital channels travel down the cable with the lowest loss and distortion possible.
75 Ohm cable is the standard coax cable: it’s commonly used and often pre-wired in many homes and businesses. In all, 75 Ohm is primarily used for video and audio, hence its rapid adoption.
To sum up, 75 Ohm cable is primarily used to transmit video signals, while with 50 Ohm cable it is mostly a data signal that is being transmitted. To put it simply, 75 Ohm is for pictures and 50 Ohm is for information.