Receiver Sensitivity Testing
Optical transceiver manufacturers and qualification engineers test modules exhaustively to ensure standards compliance and peak performance in actual field deployments. Transmitter eye-mask and receiver sensitivity are the most critical tests to validate transceiver performance.
Receiver sensitivity is a key parameter that affects the performance of an optical transceiver. It indicates how well a module copes with a weak or degraded incoming signal and helps network operators determine the maximum reach or link margin available in the system.
Receiver Sensitivity Measurement
Sensitivity is defined by how weak an input signal can become before the bit-error rate (BER) exceeds a threshold defined by the relevant MSA standard. Signals above this threshold BER are considered degraded and unfit for data communication.
For optical transceivers, two types of receiver sensitivity are important:
- Unstressed receiver sensitivity, which is expressed in two ways:
  - Average Power (dBm)
  - OMA (Wpp)
- Stressed Receiver Sensitivity (SRS), expressed as OMA (dBm)
Unstressed receiver sensitivity testing is performed by connecting the transmitter to the receiver through a variable optical attenuator. BER is recorded at a series of received power levels, and the results are plotted against each other. To pass, the power required to reach the target BER must be better (lower) than the threshold defined by the MSA standard for unstressed receiver sensitivity.
The stressed receiver sensitivity test is performed by sending a deliberately degraded signal over the fiber: the signal is given a poor extinction ratio, and various types of jitter and inter-symbol interference (ISI) are added. The module passes the test if the minimum received power measured at the specified BER remains within the accepted limit.
- Sensitivity is the minimum average optical power, in dBm, required to achieve a desired bit-error rate (BER).
- Always compare the back-to-back case (transmitter connected directly to the receiver) with the maximum fiber length.
- The pre-FEC BER threshold is typically <2.4e-4 for PAM4 and <5e-5 for NRZ, and <1e-12 post-FEC (see the applicable MSA standard).
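A sensitivity sweep of this kind reduces to a small calculation. As a minimal sketch (the function name and sweep data below are illustrative, not from any standard), the script interpolates the received power at which the BER waterfall crosses a threshold, interpolating linearly in log10(BER):

```python
import math

def sensitivity_dbm(powers_dbm, bers, threshold_ber):
    """Received power (dBm) at which the BER crosses threshold_ber.

    powers_dbm must be sorted from highest to lowest power (attenuation
    increasing), with bers rising accordingly. Interpolation is linear in
    log10(BER), a common approximation for the waterfall curve.
    """
    log_t = math.log10(threshold_ber)
    points = list(zip(powers_dbm, bers))
    for (p1, b1), (p2, b2) in zip(points, points[1:]):
        l1, l2 = math.log10(b1), math.log10(b2)
        if l1 <= log_t <= l2:
            # Linear interpolation between the two bracketing sweep points
            return p1 + (p2 - p1) * (log_t - l1) / (l2 - l1)
    raise ValueError("threshold BER not bracketed by the sweep")

# Hypothetical sweep: received power (dBm) vs. measured BER
powers = [-18.0, -19.0, -20.0, -21.0]
bers = [1e-12, 1e-9, 1e-6, 1e-4]
print(sensitivity_dbm(powers, bers, 5e-5))  # power at the NRZ pre-FEC threshold
```

In practice the automation drives the attenuator and BER tester; the interpolation step shown here is how the final sensitivity figure is read off the recorded curve.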
OMA can be calculated from Average Power (Pavg) and Extinction Ratio (re). Average power is easily measured with an optical power meter, and extinction ratio is measured with an oscilloscope. OMA is then given by:

OMA = P1 − P2 = 2 × Pavg × (re − 1) / (re + 1)

Where Pavg is the Average Power, re = P1/P2 is the extinction ratio, P1 is the optical power when the light source is ON, and P2 is the optical power when the light source is OFF.
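As a quick sketch of the conversion (function names are ours; inputs are the average power in dBm and the extinction ratio in dB, as they come off the power meter and oscilloscope):

```python
import math

def oma_mw(pavg_dbm, er_db):
    """OMA in mW from average power (dBm) and extinction ratio (dB).

    Uses OMA = P1 - P2 = 2 * Pavg * (re - 1) / (re + 1), with re = P1/P2.
    """
    pavg_mw = 10 ** (pavg_dbm / 10)  # dBm -> mW
    re = 10 ** (er_db / 10)          # dB  -> linear ratio
    return 2 * pavg_mw * (re - 1) / (re + 1)

def oma_dbm(pavg_dbm, er_db):
    """OMA expressed in dBm."""
    return 10 * math.log10(oma_mw(pavg_dbm, er_db))

# Example: Pavg = -5 dBm with a 6 dB extinction ratio
print(round(oma_dbm(-5.0, 6.0), 2))
```

Note that a poor (low) extinction ratio shrinks OMA even when the average power is unchanged, which is exactly why SRS testing degrades the extinction ratio deliberately.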
Typical test configuration for measuring Stressed Receiver Sensitivity
The standard requires that the OMA of a signal be measured with a square wave, not a PRBS (Pseudo Random Binary Sequence), before applying ISI (inter-symbol interference), SI (sinusoidal amplitude interference), and SJ (sinusoidal jitter). In the absence of a square wave, the OMA of a PRBS signal has to be approximated from its eye diagram.
To generate the stressed (SRS) test signal:
- Use a poor extinction ratio, add inter-symbol interference (ISI), and sweep the sinusoidal jitter (horizontal and vertical) at frequencies up to >40 MHz.
- Use a low-pass filter (a 4th-order Bessel-Thomson roll-off filter) to attenuate the high-frequency components and set the vertical eye-closure penalty (VECP).
- Measure the receiver sensitivity (OMA) at the required BER of 1e-12 or 5e-5, according to MSA requirements.
The time a test engineer must wait to see one error at a target BER follows directly from the data rate. For example, for 100G LR4, to accumulate one expected error at a BER of 1.0E-12, a person has to wait about 39 seconds (~0.01 hours) per lane.
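The wait time is simply the reciprocal of (line rate × BER). A short sketch, using the 25.78125 Gb/s per-lane signaling rate of 100G LR4:

```python
def seconds_per_error(line_rate_bps, ber):
    """Expected time (s) to observe one bit error at a given BER."""
    return 1.0 / (line_rate_bps * ber)

# One 100G LR4 lane at 25.78125 Gb/s, target BER 1e-12: roughly 39 s per error
t = seconds_per_error(25.78125e9, 1e-12)
print(round(t, 1))
```

Confirming a BER statistically requires observing many errors, so real qualification runs are correspondingly longer than this single-error estimate.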
It is recommended to perform thorough sensitivity testing before qualifying a transceiver. SRS testing is beneficial to screen transceiver vendors and ensure optimal equipment performance in worst-case conditions.
Vitex engineers have the expertise to answer your questions about transceiver testing. Contact us with any questions or concerns.