Artificial neural networks as an approximation procedure in wireless device design problems
DOI: https://doi.org/10.20535/RADAP.2012.51.41-49
Keywords: artificial neural network, frequency-selective microwave devices, approximation characteristics, optimal configuration of neural network
Abstract
Purpose. This work addresses the selection of the optimal configuration and training method for a neural network (NN) that serves as an element of the modified NN ensemble adopted by the authors for implementing design algorithms for frequency-selective microwave devices. Determining the optimal NN configuration. The optimal configuration was determined by analyzing the results of training test networks with different numbers of layers and different numbers of neurons per layer; the main criteria were approximation quality and total training time. Choosing the optimal training method. Seven popular training methods were compared: Levenberg-Marquardt backpropagation, BFGS quasi-Newton backpropagation, Bayesian regularization backpropagation, conjugate gradient backpropagation with Powell-Beale restarts, gradient descent backpropagation, gradient descent with momentum backpropagation, and resilient backpropagation. Conclusions. Neural networks make it possible to approximate complex characteristics of microwave devices, such as the frequency dependences of S-parameters. The approximation accuracy depends on the NN configuration and training method. Increasing the number of layers improves the approximation; according to our results, four layers with 10 to 20 neurons per layer are most effective. For complex characteristics the best training method is Bayesian regularization backpropagation, while Levenberg-Marquardt backpropagation can be used to reduce training time.
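As a rough illustration of the recommended configuration (four hidden layers, 10 to 20 neurons per layer) the sketch below fits a small multilayer perceptron to a synthetic band-pass-shaped |S21|(f) curve. This is a minimal Python example under assumed data, not the authors' implementation: the training methods named in the abstract correspond to MATLAB Neural Network Toolbox functions (trainlm, trainbr, trainrp, etc.), and here scikit-learn's L-BFGS solver is used as a readily available stand-in for the quasi-Newton family; the frequency response and all numerical values are illustrative.

```python
# Minimal sketch (assumptions: synthetic |S21| data, L-BFGS instead of the
# MATLAB training functions compared in the paper).
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for a frequency-selective characteristic, e.g. |S21|(f)
# of a band-pass filter; real data would come from EM simulation or measurement.
f = np.linspace(1.0, 5.0, 400)                        # frequency, GHz
s21 = 1.0 / (1.0 + ((f - 3.0) / 0.4) ** 4)            # smooth band-pass shape

X = StandardScaler().fit_transform(f.reshape(-1, 1))  # normalized input
y = s21

# Four hidden layers with 15 neurons each, i.e. within the 10-20 range
# recommended in the conclusions.
net = MLPRegressor(hidden_layer_sizes=(15, 15, 15, 15),
                   activation="tanh",
                   solver="lbfgs",                    # quasi-Newton training
                   max_iter=2000,
                   random_state=0)
net.fit(X, y)

rmse = np.sqrt(np.mean((net.predict(X) - y) ** 2))
print("approximation RMSE:", rmse)
```

In this toy setting the network reproduces the smooth frequency dependence closely; reproducing the paper's comparison would require the Levenberg-Marquardt and Bayesian regularization training algorithms, which are not part of scikit-learn.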