The diagrams of the spectrum bands are wild for me (coming from the RF world) - in that world, a 2GHz channel that I'd used in some systems was considered ridiculously huge, but here in fibre the 'small' channels are 50GHz!
People really don't get the enormity of the difference - when there were policy debates in my country about rolling out new fixed-line infrastructure, there were literally people saying "but won't all homes and businesses just be able to use wireless in the future?"
My armchair guess is that because traffic hitting the cable is already serialized in some way, larger channels make sense? Of course, those large channels could also be multiplexed in some way, and most long-range lines run DWDM/OTN, so I'm just as likely to be talking out of my ass.
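To put numbers on that gap - a minimal back-of-envelope sketch, assuming the roughly 4 THz of usable C-band mentioned downthread and the 50 GHz "small" channels above:

```python
# Rough channel count for a 50 GHz DWDM grid in the C-band.
# Assumed figures: ~4 THz usable C-band, 50 GHz channel spacing.
c_band_hz = 4e12        # usable C-band bandwidth, ~4 THz
channel_hz = 50e9       # one "small" DWDM channel, 50 GHz

channels = c_band_hz / channel_hz
print(f"{channels:.0f} channels of 50 GHz each")      # -> 80 channels

# Compare with the RF world: one "small" fibre channel is
# 25x the "ridiculously huge" 2 GHz RF channel mentioned above.
print(f"{channel_hz / 2e9:.0f}x a 2 GHz RF channel")  # -> 25x
```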
Cool slides. Note that MAREA is owned by Microsoft and Meta (not Google as slides state) [1]
1 - https://en.wikipedia.org/wiki/MAREA
Semi-surprised at the landing point of (multiple?) cables in Halifax, NS.
Given it's a larger market, I would have thought there would be more direct runs landing on the US coast instead of an 'intermediary' point in Canada.
Video of the talk: https://www.youtube.com/watch?v=JYblPwg70Ns
I am surprised there are repeaters involved. Is this because of imperfections in the surface of the fibreglass tubes that cause the precision of the reflection to decay over long distances (a visual noise)?
Even the best optical fiber transceivers and glass are limited (practically) to about 100 km; repeaters are typically placed every 60-70 km. The technology for delivering power to the repeaters is fascinating. They inject 5,000-10,000 VDC at one end and each repeater shunts off a tiny amount of current to power the amplifier. All of this is embedded in the cable itself before being loaded onto the cable ship.
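A minimal sketch of that series power feed; the route length, line current, and per-repeater drop are assumed ballpark figures for illustration, not from the talk:

```python
# Series DC power feed for a repeatered submarine cable.
# Assumed ballparks: a 6,000 km route, repeaters every 70 km,
# ~1 A constant line current, ~50 V dropped across each repeater.
route_km = 6_000
spacing_km = 70
line_current_a = 1.0
drop_per_repeater_v = 50.0

repeaters = route_km // spacing_km                  # ~85 repeaters
repeater_drop_v = repeaters * drop_per_repeater_v   # total drop across repeaters

print(f"{repeaters} repeaters, {repeater_drop_v:.0f} V dropped in repeaters")
print(f"Power per repeater: {line_current_a * drop_per_repeater_v:.0f} W")
# The feed voltage at the ends must cover this plus the resistive drop
# in the conductor, which is why feeds run at thousands of volts.
```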
The history behind TAT-1, the first transatlantic telephone cable, and the repeaters used, is fascinating. Bell Labs designed the repeaters. The repeaters used vacuum tubes for amplification and were designed for extreme reliability. The flexible repeaters were integrated into the cable like modern cables.
The tubes were tested to an extremely high standard. Bell Labs designed a test regime over 18 years to detect minute flaws in manufactured tubes, and only a small fraction of the tubes were selected after testing.
The cable and its 306 tubes operated for 22 years with no failures.
Note that this is in stark contrast to the first transatlantic tele_graph_ cable, which did not really have a ground line and consisted of seven copper wires covered with three coats of gutta-percha (natural latex rubber) and then hemp and tar. Many breaks and failures later, the first messages were sent in August 1858. The bandwidth was such that Queen Victoria's 98-word message to the US president, James Buchanan, took 16 hours to send. The cable ultimately died amid a famous dispute between William Thomson – later Lord Kelvin – yes, _that_ Kelvin – and the project's main engineer, which ended in disaster (the engineer put 2 kVDC on the cable against Thomson's advice, destroying the insulation) and in a famous court case that, for the first time, basically treated the role of "the scientist" (the physicist!) as that of a competent professional.
It's all fascinating history. By the time of Bell Labs, an awful _lot_ had already been learned from previous failures.
Yeah, trying to build a thousands of km long undersea cable without a good theory of transmission lines is gonna be a painful experience (a lot of this theory was developed to fix these problems!)
To add more detail to the other replies you received: the primary factors are Rayleigh scattering and impurities absorbing light energy. At 1550 nm (where this loss is least pronounced), the number that usually gets thrown around is 0.2 dB/km of attenuation. That adds up to needing those repeaters at the intervals we have them.
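A quick sketch of how that 0.2 dB/km adds up over a span (span length assumed, matching the repeater spacing mentioned above):

```python
# How much optical power survives a repeater span at 0.2 dB/km?
attenuation_db_per_km = 0.2
span_km = 70                                # typical repeater spacing

loss_db = attenuation_db_per_km * span_km   # 14 dB per span
surviving_fraction = 10 ** (-loss_db / 10)  # ~4% of the power remains

print(f"{loss_db:.0f} dB loss per span -> {surviving_fraction:.1%} of power left")

# Without repeaters, a full 6,000 km crossing would lose 1,200 dB:
# a factor of 10**120, which no receiver could ever recover.
print(f"Unrepeatered 6,000 km: {attenuation_db_per_km * 6000:.0f} dB")
```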
I think it's mostly that the fiber isn't a perfectly transparent medium; over tens of kilometers, attenuation adds up. As said in https://news.ycombinator.com/item?id=45159639, these are just there to boost power - they don't re-form the signal.
I was curious to learn more about what the repeater systems look like.
Optical repeaters are 1R repeaters, i.e. they regenerate power only. Inside the repeater "boxes" (they are actually cylinders) there is an optical amplifier, typically an erbium-doped fibre amplifier (EDFA) - in other words, a piece of fibre doped with erbium (a rare earth). The amplifiers are pumped with laser diodes (typically 1-4 per EDFA) at 980 nm and 1480 nm. By pumping the doped fibre at these wavelengths you provide high gain to the telecom channels, which usually sit in the optical C-band (~1525-1565 nm). This way you can reamplify signals over a large bandwidth (~4 THz) without having to do detection and retransmission (which would be unscalable). Repeaters are typically spaced 60-80 km apart in submarine systems, with a "transparent" design (the gain compensates for the transmission loss of the fibre span).
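For the curious, the ~4 THz figure falls out of the C-band wavelength window; a quick sketch converting a wavelength range to a frequency bandwidth:

```python
# Convert the C-band wavelength window (~1525-1565 nm) into a
# frequency bandwidth: delta_nu = c/lambda_lo - c/lambda_hi.
c = 3e8                      # speed of light, m/s
lo, hi = 1525e-9, 1565e-9    # C-band edges, metres

bandwidth_hz = c / lo - c / hi
print(f"C-band width: {bandwidth_hz / 1e12:.1f} THz")  # ~5 THz raw window;
# the usable amplifier bandwidth quoted above (~4 THz) is somewhat less,
# since EDFA gain is not flat from edge to edge.
```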
Power delivery to the laser diodes is done through the metal jacket of the cable. The whole submarine cable is essentially a very long DC transmission line, which is a fascinating topic in itself - e.g. what is "ground" in such a line, when it can differ by thousands of volts between continents?
> This way you can reamplify signals over a large bandwidth (~4 THz) without having to do detection and retransmission (which would be unscalable).
This trick also means the cable doesn't care about the rest of the technology. If it were a retransmitter, then we'd need to replace the entire cable when we change from 100 Gbps over Protocol #39 to 200 Gbps over Protocol #40, because every retransmitter would need to be equipped for the new protocol. But the optical amplifier doesn't care why these photons turned up or what they mean - when provided with power, it just ensures proportionately more photons like them come out of the amplifier.
Because they're not actually the same photons, weird quantum tricks that would work at bench scale - where it was literally the same photon at the receiver as when you transmitted - will not work, but any conventional signalling within quite broad limits is OK. Researchers at the university where I studied as an undergraduate developed the EDFA.
Fun fact: Pirelli, the tire company, used to be big in submarine cable repeaters and related products. The ones I saw at telecom shows were painted Pirelli yellow. That part of Pirelli was sold to private equity.
> what is "ground" in such a line, when it can differ by thousands of volts between continents?
Does that translate to free energy for the repeaters?
It varies too much to be useful for powering repeaters, and it can't sustain enough current anyway, since the resistance across the entire system is huge.
It’s not free at all. Most of the voltage drop along the cable is caused by the conversion of electrical energy into photons within the erbium-doped fiber amplifiers. A relatively small fraction of the voltage drop is caused by losses in the copper conductor that carries the current along the route. The high supply voltage allows a relatively small current to deliver the needed power to amplifiers thousands of kilometres away without causing much loss in the copper.
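A toy illustration of why the high voltage helps - this ignores the constant-current, series-fed detail of real systems and just shows the I²R scaling, with assumed round numbers:

```python
# Why the high supply voltage? For a fixed power delivered to the
# repeaters, copper loss scales with the square of the current.
# Assumed ballpark figures, for illustration only.
p_repeaters_w = 5_000        # total power the repeaters need (assumed)
r_copper_ohm = 4_000         # total conductor resistance (assumed, ~0.7 ohm/km)

for v_feed in (1_000, 5_000, 10_000):
    i = p_repeaters_w / v_feed           # current needed at this voltage
    p_copper = i ** 2 * r_copper_ohm     # I^2 * R loss in the conductor
    print(f"{v_feed:>6} V feed: {i:.1f} A, copper loss {p_copper:,.0f} W")
# At 1 kV the copper would dissipate more than the repeaters consume;
# at 10 kV the copper loss is a small fraction of the delivered power.
```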
I took that as referring to how over large distances the results of driving a metal rod into the dirt don't always match, so if you do things like tie both ends of a shielded cable's shielding to separate ground rods you can get odd problems sometimes.
Although I hadn't thought the differences were usually anywhere close to that large.
You’re referring to creating an intentional ground loop, I believe [https://en.m.wikipedia.org/wiki/Ground_loop_(electricity)].
The challenge, as I understand it, is that yes, you will get ‘free’ power (not actually free, as you had to create the low-resistance electrical path for it to exist), but you have no control over the properties or values of what you get - and it will vary unpredictably.
It’s also unlikely you’ll consistently get much actual net power out of it, as you’re competing against an entire planet’s worth of reasonably conductive (in bulk) parallel paths.
It’s almost always a problem because of that.
That explains why the latency is still decent even after repeated amplification.
I wonder why DC though. Is AC lossy when surrounded by salt water?
There's a couple of factors at play here. One is that AC suffers from capacitive losses over long distances (high power multi-megawatt underground/undersea cables are often HVDC for this and other reasons).
The other, more interesting one is that the repeaters in this kind of fibre optic cable are usually powered from both ends, from completely separate electrical grids (so one side sends -5000V and the other sends +5000V, for example). This allows for some level of redundancy as well as thinner insulation. With AC, keeping the phases on both sides aligned would be impractical, on top of the inherent inefficiencies of AC transmission.
AC is only popular because it works with transformers to step up/down the voltage, and it would be more expensive to step up/down a DC signal using electronics (which usually involves converting to AC internally anyway).
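A rough lumped-element sketch of the capacitive problem mentioned above - the per-km capacitance is an assumed ballpark, and a real cable would need proper transmission-line treatment, but the order of magnitude makes the point:

```python
# Why AC doesn't work over a long submarine cable: the cable is a
# huge capacitor, and AC must charge and discharge it every cycle.
# Charging current I = 2*pi*f*C*V.
import math

f_hz = 50                    # mains frequency
c_per_km = 0.2e-6            # cable capacitance, farads/km (assumed ballpark)
route_km = 6_000
v = 10_000                   # feed voltage

c_total = c_per_km * route_km
i_charging = 2 * math.pi * f_hz * c_total * v
print(f"Charging current: {i_charging:,.0f} A")   # thousands of amps,
# versus the ~1 A the repeaters actually need. DC charges the cable
# once at power-up and then delivers all of its current to the load.
```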
AC voltage is specified in RMS volts, which is based on the average power the AC delivers. The peak voltage (the top of the sine wave) is 1.414x the RMS voltage. The insulation only cares about the peak before it breaks down, so because DC doesn't waste time at lower voltages, it can transmit more power for the same insulation.
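A one-liner to make that concrete, with an assumed peak voltage and line current:

```python
# Same insulation, more power: insulation must withstand the PEAK
# voltage, but AC power is set by the RMS voltage (peak / sqrt(2)).
import math

v_peak = 10_000              # what the insulation must survive (assumed)
i = 1.0                      # line current, amps (assumed)

p_dc = v_peak * i                        # DC runs at the peak the whole time
p_ac = (v_peak / math.sqrt(2)) * i       # AC averages out to RMS
print(f"DC: {p_dc / 1e3:.1f} kW, AC: {p_ac / 1e3:.1f} kW "
      f"-> DC carries {p_dc / p_ac:.3f}x more")    # sqrt(2) ~ 1.414x
```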
These are coax cables, just by the nature of the external physical shielding required (steel cable sheath). So the EM field should be contained inside and not affected by the salt water. But I'm not an expert there and could be missing something.
In addition to the other replies, I also recall hearing some time ago that the AC EM field interacted with wildlife in surprising ways (causing sharks to attack the cable, IIRC). It could be an urban legend at this point though.
Unrelated, but the default power supply for Central Office telecom equipment was always -48 VDC (and 23" vs 19" racks).