Abstract
Machine learning is applied to predict the generation of solitons, a nonlinear phenomenon in three-wave Raman backscattering for laser amplification, across complicated multi-dimensional parameter spaces. The generation of solitons in the resonant three-wave system is simulated with one-dimensional fluid equations. The solitons are generated in the early phase of the three-wave interaction, and their slow propagation speeds play an important role. Using a pattern-matching method that compares the simulation data with the analytic solution, the generation of solitons is detected automatically. After enough data sets are collected by autonomous parameter scanning in the numerical simulations, nonlinear regression and k-nearest-neighbor algorithms are utilized to predict the existence of solitons.
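For context, a commonly used envelope form of the resonant three-wave interaction equations is sketched below; the one-dimensional fluid model referenced in the abstract presumably reduces to something of this shape, though the paper's exact normalization and damping terms may differ:

```latex
\begin{aligned}
\partial_t a_1 + v_1\,\partial_x a_1 &= -K\, a_2\, a_3,\\
\partial_t a_2 + v_2\,\partial_x a_2 &= K\, a_1\, a_3^{*},\\
\partial_t a_3 + v_3\,\partial_x a_3 &= K\, a_1\, a_2^{*},
\end{aligned}
```

where \(a_1\), \(a_2\), and \(a_3\) are the envelopes of the pump, the backscattered seed, and the Langmuir wave, \(v_i\) are their group velocities, and \(K\) is the coupling constant.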
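The following is a minimal sketch of the workflow described in the abstract: label each simulation run by matching its wave envelope against an analytic soliton profile, then train a k-nearest-neighbor classifier on the scanned parameters. The sech-shaped template, the correlation threshold, and the random placeholder scan data are all assumptions for illustration, not the paper's actual solution, detector, or data.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def soliton_template(x, width=1.0):
    """Assumed analytic soliton envelope (sech profile); the paper's
    actual analytic solution may differ."""
    return 1.0 / np.cosh(x / width)

def detect_soliton(envelope, x, threshold=0.9):
    """Label an envelope as containing a soliton if its normalized
    cross-correlation with the template, centered on the envelope peak,
    exceeds a threshold (a stand-in for the paper's pattern matching)."""
    template = soliton_template(x - x[np.argmax(envelope)])
    corr = np.dot(envelope, template) / (
        np.linalg.norm(envelope) * np.linalg.norm(template))
    return int(corr > threshold)

# Toy "simulation output": a displaced, wider soliton-like pulse.
x = np.linspace(-10.0, 10.0, 512)
envelope = soliton_template(x - 2.0, width=1.5)
print(detect_soliton(envelope, x))  # -> 1 (soliton detected)

# Hypothetical parameter scan: rows are points in the multi-dimensional
# parameter space; labels would come from detect_soliton applied to each
# run (random placeholders are used here).
rng = np.random.default_rng(0)
params = rng.uniform(size=(200, 3))
labels = rng.integers(0, 2, size=200)

knn = KNeighborsClassifier(n_neighbors=5)
knn.fit(params, labels)
print(knn.predict(params[:5]))  # predicted soliton existence per point
```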
| Original language | English |
| --- | --- |
| Pages (from-to) | 909-916 |
| Number of pages | 8 |
| Journal | Journal of the Korean Physical Society |
| Volume | 75 |
| Issue number | 11 |
| DOIs | |
| State | Published - 1 Dec 2019 |
Bibliographical note
Funding Information: This work was supported by the National R&D Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Education, Science and Technology (Grant No. NRF-2019R1A2C1088518).
Publisher Copyright:
© 2019, The Korean Physical Society.
Keywords
- Machine learning
- Raman backscattering
- Solitons