Abstract
In this paper, a machine learning-based tunable optical-digital signal processor is demonstrated for a short-reach optical communication system. The effect of fiber chromatic dispersion after square-law detection is mitigated using a hybrid structure that shares the complexity between the optical and the digital domains. The optical part mitigates the chromatic dispersion by slicing the signal into small sub-bands, delaying each of them accordingly, and then recombining the signal. The optimal delays are calculated for each scenario to minimize the bit error rate. The digital part is a nonlinear equalizer based on a neural network. The results are analyzed in terms of the signal-to-noise ratio penalty at the KP4 forward error correction threshold, calculated with respect to a back-to-back transmission without equalization. For 32 GBd transmission and a 0 dB penalty, the proposed hybrid solution mitigates chromatic dispersion of up to 200 ps/nm (equivalent to 12 km of standard single-mode fiber) with stage 1 of the hybrid module, and roughly double that with the second stage. A simplified version of the optical module is demonstrated with a penalty of approximately 1.5 dB compared to the complete two-stage hybrid module. The chromatic dispersion tolerance of a fixed optical structure and of a simpler configuration of the nonlinear equalizer is also investigated.
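As a rough illustration of the slice-delay-recombine operation summarized above, the following minimal Python sketch emulates the per-sub-band delays digitally in the frequency domain. It is not the paper's implementation, which performs this step optically and tunes the delays to minimize the bit error rate; the number of sub-bands, the sampling rate, and the per-band delay values are illustrative assumptions.

```python
import numpy as np

def subband_delay(signal, fs, delays_ps):
    """Toy emulation of the optical module's principle: split the spectrum
    into len(delays_ps) equal-width sub-bands, delay each sub-band by its
    entry in delays_ps (picoseconds), and recombine. Parameters are
    illustrative; the paper realizes this operation in the optical domain."""
    n = len(signal)
    freqs = np.fft.fftfreq(n, d=1.0 / fs)                  # Hz, in [-fs/2, fs/2)
    spectrum = np.fft.fft(signal)
    edges = np.linspace(-fs / 2, fs / 2, len(delays_ps) + 1)
    band_idx = np.clip(np.digitize(freqs, edges) - 1, 0, len(delays_ps) - 1)
    tau = np.asarray(delays_ps)[band_idx] * 1e-12          # delay per bin, seconds
    # A delay tau corresponds to a linear phase exp(-j*2*pi*f*tau) in frequency.
    return np.fft.ifft(spectrum * np.exp(-2j * np.pi * freqs * tau))
```

In the hybrid scheme, the residual distortion left after this coarse optical compensation would then be handled by the neural-network-based nonlinear equalizer in the digital domain.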