Reasons for attenuation increase during temperature cycling test?
We made some central-tube fiber cables with 12 fibers in one tube. The tube outer diameter is 2.8 mm and the inner diameter is 1.8 mm. When we conducted the temperature cycling test, the loss increase was more than 0.2 dB at -30 °C. We want to bring the loss increase down to less than 0.1 dB. Any suggestions?
I think what renjith mentioned is a valid point for solving the issue. It looks like a post-shrinkage problem of the tube material. If you can control the post-shrinkage, you can control the excess fiber length (EFL). Excess fiber length in tubes, especially in central-tube cables, causes attenuation to increase. If post-shrinkage is not controlled during production, the tube will shrink after tubing and leave more excess fiber length than intended.
If the EFL is too high, attenuation rises at sub-zero temperatures when the cable shrinks. The reason is simple: the plastic tube contracts far more than the glass fiber, so the shrinkage presses the fibers and forces them to redistribute inside the tube. Because of the excess length, the fibers already follow a wave-like path; when the cable shrinks further, the bends tighten beyond what the fibers can tolerate, and macrobending loss appears.
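To get a feel for the numbers, here is a minimal sketch of the wave-path argument. It models the fiber as a sinusoid y = A·sin(kx) inside the tube; in the small-slope approximation the excess length fraction is EFL ≈ (A·k)²/4, and the tightest bend radius is 1/(A·k²) = A/(4·EFL). The 0.7 mm amplitude is an illustrative assumption (inner radius 0.9 mm minus some allowance for the 12-fiber bundle), not a measured value.

```python
def min_bend_radius_mm(efl, amplitude_mm):
    """Smallest bend radius of a fiber following a sinusoidal path
    y = A*sin(k*x) whose arc length exceeds the tube length by the
    fraction `efl`. Small-slope approximation: efl ~ (A*k)**2 / 4,
    and max curvature of the sinusoid is A*k**2, so R_min = A/(4*efl)."""
    return amplitude_mm / (4.0 * efl)

# Assumed radial play for the fiber bundle inside a 1.8 mm bore (hypothetical).
A = 0.7  # mm

print(min_bend_radius_mm(0.002, A))  # 0.2 % EFL -> 87.5 mm bend radius
print(min_bend_radius_mm(0.005, A))  # 0.5 % EFL -> 35.0 mm, much tighter
```

The point of the model: the bend radius falls in inverse proportion to the effective EFL, so even a few tenths of a percent of extra shrinkage can push the fibers from a harmless bend radius toward the macrobending regime.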
How to solve it? Control the post-shrinkage. Measure the post-shrinkage of your tube and adjust the excess fiber length accordingly. A typical EFL setting for central-tube cables is around 0.2%.
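The adjustment above can be sketched as a simple budget: the EFL the fibers actually see in the cold is the EFL set at tubing plus whatever length the tube loses to post-shrinkage and to thermal contraction (the glass fiber contracts roughly a hundred times less, so its change is ignored here). All the numbers below are hypothetical placeholders, to be replaced by your own measurements.

```python
def cold_efl_percent(set_efl_pct, post_shrink_pct, thermal_contraction_pct):
    """Effective excess fiber length (in %) at low temperature:
    EFL set during tubing, plus tube post-shrinkage, plus thermal
    contraction of the cable core. Fiber contraction is neglected."""
    return set_efl_pct + post_shrink_pct + thermal_contraction_pct

# Hypothetical example: 0.2 % set EFL, 0.05 % post-shrinkage,
# 0.1 % core contraction at -30 degrees C.
print(cold_efl_percent(0.2, 0.05, 0.1))  # effective EFL rises to ~0.35 %
```

Working the budget backwards tells you how much to lower the set EFL (or how much post-shrinkage you must eliminate) to keep the cold-temperature EFL near the 0.2% target.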
If this is for CATV, you don't need to worry about loss up to a certain limit. CATV customers can accept higher attenuation in fiber optic cables; in fact, they use attenuators to increase attenuation even when you offer low-attenuation cables. What is the maximum attenuation in dB/km?