Abstract
In this paper, we compare the theoretical and simulated bit error rate (BER) performance of an all-optical active recirculating fibre loop buffer (RFLB) employed at the transmitter node of an optical time division multiplexing (OTDM) based system, for several buffer storage times (delays). To the best of our knowledge, this is the first time a theoretical model relating BER performance to the storage delay has been presented in the literature. Moreover, the theoretical model is extended to predict the power penalties incurred when the delay approaches the millisecond time scale. Simulation results are then used to compare the power penalty when the RFLB is employed at a receiver node within an optical network scenario. In this work, soliton-profiled OTDM packets transmitted at a maximum aggregate data rate of 20 Gbit/s are used as test signals. The buffering architecture consists of a dispersion-shifted fibre (DSF) loop and a switching mechanism. The analysis presented here is intended to promote understanding of the performance-hindering mechanisms within an RFLB architecture and may prove invaluable to future optical buffer designers.
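For orientation only: analyses of this kind commonly relate the BER to a Q-factor obtained from the detected signal and noise statistics, with the accumulated amplified spontaneous emission noise growing with the number of loop recirculations (and hence with the storage delay). The relation below is the standard Gaussian-noise approximation, quoted as background rather than as the specific model developed in the paper, and the symbols are generic labels, not the paper's notation.

$$
\mathrm{BER} \approx \tfrac{1}{2}\,\operatorname{erfc}\!\left(\frac{Q}{\sqrt{2}}\right),
\qquad
Q = \frac{\mu_1 - \mu_0}{\sigma_1 + \sigma_0}
$$

where $\mu_1$, $\mu_0$ denote the mean detected levels for marks and spaces and $\sigma_1$, $\sigma_0$ the corresponding noise standard deviations.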
Original language | English |
---|---|
Pages (from-to) | 281-290 |
Journal | Optics Communications |
Volume | 238 |
Issue number | 4-6 |
DOIs | |
Publication status | Published - 15 Aug 2004 |
Keywords
- amplified spontaneous emission noise
- intensity jitter
- optical buffer
- soliton
- timing jitter