Posted on 2023-01-11, 23:05. Authored by A. V. Reznichenko, A. I. Chernykh, E. V. Sedov, I. S. Terekhov.
We consider the information channel described by the Schr\"{o}dinger equation with additive Gaussian noise. We introduce a model of the input signal and a model of the output signal receiver. For this channel, using perturbation theory in the small nonlinearity parameter, we calculate the first three terms of the expansion of the conditional probability density function in the nonlinearity parameter. At a large signal-to-noise power ratio we calculate the conditional entropy, the output signal entropy, and the mutual information in the leading and next-to-leading order in the nonlinearity parameter and in the leading order in the parameter $1/\mathrm{SNR}$. Using the mutual information, we find the optimal input signal distribution and the channel capacity in the leading and next-to-leading order in the nonlinearity parameter. Finally, we present a method for constructing an input signal with the optimal statistics for a given signal shape.
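For orientation, the following is a minimal sketch of the standard information-theoretic definitions the abstract relies on; it is not taken from the record itself, and the notation ($X$ for the input signal, $Y$ for the output signal, $P(Y|X)$ for the conditional probability density function) is an assumption made here for illustration.

% Standard definitions of mutual information and channel capacity,
% assuming X is the input signal, Y the output signal, and P(Y|X)
% the conditional probability density function mentioned in the abstract.
\begin{align}
  I[P(X)] &= H[Y] - H[Y|X]
           = \int \mathcal{D}X \, \mathcal{D}Y \; P(X)\, P(Y|X)\,
             \log \frac{P(Y|X)}{P_{\mathrm{out}}(Y)}, \\
  P_{\mathrm{out}}(Y) &= \int \mathcal{D}X \; P(X)\, P(Y|X), \qquad
  C = \max_{P(X)} I[P(X)],
\end{align}
where the maximization over the input distribution $P(X)$ yields the optimal input signal distribution and the channel capacity discussed above.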