The nonlinear Fourier transform (NFT) based transmission technique relies on the integrability of the nonlinear Schrödinger equation (NLSE). However, the lossless NLSE does not directly describe light evolution in fibre links with lumped amplification, such as those using Erbium-doped fibre amplifiers (EDFAs), because of the nonuniform gain and loss evolution along the link. In this case, the path-averaged model is usually applied as an approximation of the true NLSE model that includes the fibre loss. The inaccuracy of the lossless path-averaged model, though small, can nevertheless cause a notable performance degradation in NFT-based transmission systems. In this paper, we extend a theoretical approach, first proposed for solitons in EDFA systems, to NFT-based systems in order to constructively diminish this performance penalty. Based on a quantitative analysis of the distortions arising from the use of the path-averaged model, we optimise the signal launch and detection points to minimise the mismatch between the two models. Without loss of generality, we demonstrate how the approach works for NFT systems that modulate the continuous NFT spectrum (vanishing signals) and for those that modulate the NFT main spectrum (periodic signals). Through numerical modelling, we quantify the corresponding improvements in system performance.
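The model mismatch discussed above can be illustrated with a minimal split-step sketch: one amplification span is propagated twice, once with the true lossy NLSE followed by lumped EDFA gain, and once with the lossless path-averaged NLSE whose nonlinear coefficient is rescaled by the effective span length. All fibre parameters below are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

# Illustrative (assumed) fibre parameters for a single 80 km span
beta2 = -21.7e-27                    # s^2/m, anomalous group-velocity dispersion
gamma = 1.3e-3                       # 1/(W*m), Kerr nonlinearity
alpha = 0.2e-3 * np.log(10) / 10    # 0.2 dB/km power loss, converted to 1/m
L = 80e3                             # span length, m
nz = 2000                            # split-step segments per span
dz = L / nz

nt = 1024
T = 2e-9                             # time window, s
t = (np.arange(nt) - nt // 2) * (T / nt)
w = 2 * np.pi * np.fft.fftfreq(nt, T / nt)

# Input: ~5 mW Gaussian pulse (illustrative launch condition)
u0 = np.sqrt(5e-3) * np.exp(-(t / 100e-12) ** 2)

def propagate(u, gval, aval):
    """Symmetric split-step Fourier solver for the NLSE with loss aval."""
    half = np.exp(0.25j * beta2 * w ** 2 * dz)   # half-step dispersion operator
    for _ in range(nz):
        u = np.fft.ifft(half * np.fft.fft(u))
        u = u * np.exp(1j * gval * np.abs(u) ** 2 * dz - 0.5 * aval * dz)
        u = np.fft.ifft(half * np.fft.fft(u))
    return u

# True model: distributed loss along the span, lumped gain at the EDFA
u_true = propagate(u0, gamma, alpha) * np.exp(0.5 * alpha * L)

# Path-averaged lossless model: nonlinearity rescaled by L_eff / L
gamma_eff = gamma * (1 - np.exp(-alpha * L)) / (alpha * L)
u_avg = propagate(u0, gamma_eff, 0.0)

# Residual mismatch between the two models at the span end
nmse = np.sum(np.abs(u_true - u_avg) ** 2) / np.sum(np.abs(u_true) ** 2)
print(f"path-average mismatch NMSE = {nmse:.2e}")
```

The nonzero NMSE printed at the end is the small but systematic distortion the abstract refers to; shifting the launch and detection points along the span changes where the lossless trajectory is matched to the lossy one and can reduce this residual.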