I've come across something that I cannot find an explanation for, although I have a theory. I am currently working on a coax design with a solid PTFE dielectric, but using unplated NbTi wire for the center conductor. By all of my calculations I should be seeing 49.8 ohms; it seems very basic (wire OD, dielectric ID, etc.). However, when I measure the coax on a real-time TDR, it reads 55 ohms. I've looked at it every way I can think of and I'm clueless as to why this is happening. The PTFE is pure and properly cured. Could the resistance of the NbTi wire be interfering with the TDR and artificially showing me an incorrect impedance? All other standard coax tested on the same equipment reads the way it should. Thanks for any insight!
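(For reference, a minimal sketch of the kind of first-order calculation the 49.8 ohm figure presumably comes from. The dimensions below are placeholders chosen to reproduce that number, not the actual cable's; substitute the real conductor OD and dielectric ID.)

```python
import math

# Standard lossless-coax impedance from geometry:
# Z0 = (59.952 / sqrt(er)) * ln(D / d)
# er ~2.1 for solid PTFE; d = conductor OD, D = dielectric ID.
def coax_z0(D, d, er=2.1):
    return (59.952 / math.sqrt(er)) * math.log(D / d)

# Placeholder dimensions in mm (only the ratio D/d matters):
print(coax_z0(D=3.03, d=0.91))   # ~49.8 ohm
```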
Yes, though Z0 = sqrt((R + jωL)/(G + jωC)), so a high R genuinely boosts the impedance; it's not an artifact. You may have to increase C to compensate. I would also expect R to be strongly frequency dependent. Can you sweep it on a VNA and see what happens at very low frequency, such as 300 kHz? You may see a spike in Z0 there, which will be exacerbated by the R: as ω goes to zero the denominator G + jωC shrinks toward zero while R stays finite, so the ratio blows up.
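(A quick sketch of that low-frequency behavior, using made-up per-metre RLGC constants for a nominally 50-ohm PTFE line; the R value in particular is a guess at the NbTi series resistance, so substitute measured constants.)

```python
import numpy as np

# Hypothetical per-metre line constants -- not measured values.
L = 250e-9    # H/m   inductance per unit length (sqrt(L/C) = 50 ohm)
C = 100e-12   # F/m   capacitance per unit length
G = 1e-12     # S/m   PTFE leakage, essentially zero
R = 5.0       # ohm/m guessed NbTi series resistance

for f in [300e3, 1e6, 10e6, 100e6, 1e9]:
    w = 2 * np.pi * f
    z0 = np.sqrt((R + 1j * w * L) / (G + 1j * w * C))
    print(f"{f/1e6:>8.1f} MHz  |Z0| = {abs(z0):7.2f} ohm, "
          f"angle = {np.angle(z0, deg=True):6.1f} deg")
```

With these numbers |Z0| is roughly 163 ohms at 300 kHz and settles back near 50 ohms by 100 MHz, which is exactly the low-frequency spike described above.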
Please provide dimensional information. Is the relative permeability (mu_r) of NbTi equal to one? A non-unity value would increase the impedance, since Z0 scales with sqrt(mu_r). Here mu_r = 1.22 would give the result you are seeing.
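(A one-line sanity check of that scaling:)

```python
import math

# Z0 scales with sqrt(mu_r), so mu_r = 1.22 lifts 49.8 ohm to ~55 ohm.
print(49.8 * math.sqrt(1.22))   # ~55.0
```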