I. Introduction
Interconnect technology has progressed at a very fast pace over the past decade, with signaling rates steadily increasing from 100Mb/s to 25Gb/s. With the release of Thunderbolt technology, consumer electronics are entering a new era with a 20Gb/s line rate (40Gb/s of throughput per connector interface). This is driven by the bandwidth requirements of 4K video, which is quickly becoming mainstream; with 8K video already on the horizon, a significant jump in interconnect throughput will be required down the road. On the data center and cloud side, mobile data traffic already contributes more than 50% of total traffic, and that share is still growing, pushing data center interconnect rates to 100Gb/s, 400Gb/s and beyond.
Electrical I/O is increasingly limited by the copper channel, whose loss grows with both frequency and distance. To overcome this limitation, extra circuitry is added to compensate for the channel loss; these circuits burn more power, add complexity, and take up extra space. Optical fiber, by contrast, has been widely used over longer distances at higher data rates, owing to its significantly lower attenuation and better immunity to electromagnetic interference (EMI). In addition, as system form factors keep shrinking, there is little room left for the full set of connectors such as Ethernet, eSATA, DVI, USB, HDMI and DisplayPort.
Thunderbolt interconnect technology, introduced to the market in late 2011, is a high-speed, low-power, small-form-factor cable technology intended to become the single universal I/O for future computers and portable devices.
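To put the video bandwidth requirement in perspective, a rough back-of-the-envelope estimate (the resolution, color depth and frame rate below are assumptions for illustration, not figures from this paper) for uncompressed 4K video at 60 frames/s with 24-bit color is
$$3840 \times 2160\ \text{pixels/frame} \times 24\ \text{bits/pixel} \times 60\ \text{frames/s} \approx 11.9\ \text{Gb/s},$$
which already exceeds a single 10Gb/s lane and helps explain the push toward 20Gb/s-class interfaces.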
Thunderbolt cables: copper cable on the left, optical cable on the right.