I really like the page about Coax Loss Calculation, which explains the topic nicely...but only the amplitude of the loss.
What about signal phase?
How does that depend on losses in a coax cable?
I do have a paper (an actual printed one) that claims:
"Because the attenuation (due to eddy currents in the conductors) in a coax cable is proportional to the square root of frequency, the dispersion at a particular frequency (measured in radians) is numerically equal to the attenuation (in nepers)."
I have run into dispersion in coax at low frequency. It has to do with skin-effect attenuation, but my math skills are so rusty that I gave up on solving for it. I started to write a page on it here:
Just remember, at low frequency, lines get electrically longer!
It is interesting that in Microwave Office this effect is NOT modeled in the coax circuit element. But it IS modeled in the stripline element! So you can fake a coax design with the stripline model if you don't go too far outside the limits of the relative thickness of the strip versus the ground-plane spacing.
Once the center conductor diameter is large compared to the skin depth, the problem dies down quickly. Virtually every design above 1 GHz can ignore coax dispersion. The real killer at higher frequencies is when the coax starts propagating an unwanted mode.
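To get a feel for where that crossover sits, here is a minimal Python sketch. The copper resistivity and the RG-58-like 0.9 mm center conductor diameter are my own illustrative assumptions, not numbers from this thread:

import math

RHO_CU = 1.68e-8          # copper resistivity, ohm*m (assumed)
MU0 = 4e-7 * math.pi      # permeability of free space, H/m

def skin_depth(f_hz: float) -> float:
    """Skin depth in a good conductor: delta = sqrt(rho / (pi * f * mu))."""
    return math.sqrt(RHO_CU / (math.pi * f_hz * MU0))

d_center = 0.9e-3  # RG-58-like center conductor diameter, m (illustrative)
for f in (1e3, 1e6, 1e9):
    delta = skin_depth(f)
    print(f"{f:>8.0e} Hz: delta = {delta*1e6:9.2f} um, "
          f"diameter/delta = {d_center/delta:8.1f}")

At 1 kHz the skin depth (about 2 mm) exceeds the conductor diameter, so the dispersion is real; by 1 GHz the diameter is hundreds of skin depths, which matches the rule of thumb above.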
Thank you for your answer and for pointing out the page on dispersion.
The linked paper actually answers my question.
Not sure why I didn't find it earlier.
Actually, I am looking into the deformation of pulses travelling down coax cables.
The frequencies I am interested in range from DC (really 0 Hz, not the "DC" starting in the MHz range as some microwave people use the term) to the GHz range.
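Since pulse deformation is the goal, here is a minimal FFT-based sketch of that distortion. All the numbers (cable length, loss constant, pulse width) are illustrative assumptions on my part; it simply applies the rule from the quoted paper, attenuation k*sqrt(f) in Np/m with an equal excess phase in rad/m:

import numpy as np

# Illustrative numbers, not measured data: 100 m of cable with k chosen
# so the loss is ~1 dB at 1 GHz (alpha = k*sqrt(f), in Np/m).
L = 100.0
k = (1.0 / 8.686) / (L * np.sqrt(1e9))   # Np/(m*sqrt(Hz))

n = 2**16
dt = 25e-12                              # 25 ps step -> 20 GHz Nyquist
t = np.arange(n) * dt
x = ((t > 1e-9) & (t < 11e-9)).astype(float)   # 10 ns rectangular pulse

f = np.fft.rfftfreq(n, dt)
# Skin-effect transfer function: attenuation k*sqrt(f) Np/m plus an
# equal excess phase k*sqrt(f) rad/m (the "dispersion = attenuation"
# rule). The bulk propagation delay is dropped for clarity.
H = np.exp(-(1 + 1j) * k * np.sqrt(f) * L)
y = np.fft.irfft(np.fft.rfft(x) * H, n)

print("input  peak:", x.max())
print("output peak:", y.max())

The output edges come out rounded and the pulse grows a long tail, the classic skin-effect step response, which is exactly the kind of deformation you should see down to DC.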