Article Information
Abstract
English
In this paper, we propose a new algorithm (N_BP) that overcomes the limitations of the traditional backpropagation (O_BP). The N_BP is based on the method of conjugate gradients and computes its learning parameters through a line search characterized by order statistics and the golden section. Experimental results showed that the N_BP was clearly superior to the O_BP, with or without a stochastic term, in terms of both accuracy and rate of convergence, and that it could surmount the problem of local minima. Furthermore, the results confirmed that the stagnation of learning in the O_BP stems from the limitations of the algorithm itself, and that superficial remedies will never cure this phenomenon.
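The golden-section line search mentioned in the abstract can be sketched as follows. This is a generic illustration of the technique, not the paper's exact procedure; the function name, interval, and tolerance are assumptions made for the example.

```python
import math

def golden_section_search(f, a, b, tol=1e-6):
    """Find a minimizer of a unimodal function f on the interval [a, b].

    Each iteration shrinks the bracket by the factor 1/phi ~ 0.618,
    reusing one interior evaluation point per step.
    """
    invphi = (math.sqrt(5) - 1) / 2  # 1/phi, the golden ratio reciprocal
    c = b - invphi * (b - a)         # lower interior point
    d = a + invphi * (b - a)         # upper interior point
    while abs(b - a) > tol:
        if f(c) < f(d):
            # Minimum lies in [a, d]; old c becomes the new upper point.
            b, d = d, c
            c = b - invphi * (b - a)
        else:
            # Minimum lies in [c, b]; old d becomes the new lower point.
            a, c = c, d
            d = a + invphi * (b - a)
    return (a + b) / 2

# Example: minimize (x - 2)^2 on [0, 5]; the minimizer is x = 2.
step = golden_section_search(lambda x: (x - 2) ** 2, 0.0, 5.0)
```

In a conjugate-gradient training loop, such a routine would choose the step size along each search direction instead of using a fixed learning rate.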
Table of Contents
II. Overview of O_BP
III. Mathematical Background of N_BP
1. Algorithm
2. Determination of Learning Coefficients
IV. Numerical Examples
V. Experimental Results and Conclusion
References