earticle


FAST-ADAM in Semi-Supervised Generative Adversarial Networks

Article Information

Abstract (English)

Unsupervised neural networks had not attracted much attention until the Generative Adversarial Network (GAN) was proposed. By pitting a generator network against a discriminator network, a GAN can extract the main characteristics of the original dataset and produce new data with similar latent statistics. However, researchers fully understand that training a GAN is difficult because of its instability: the discriminator often performs too well while helping the generator learn the statistics of the training dataset, so the generated data is not compelling. Much research has focused on improving the stability and classification accuracy of GANs, but few studies address training efficiency and training time. In this paper, we propose a novel optimizer, named FAST-ADAM, which integrates Lookahead with the ADAM optimizer to train the generator of a semi-supervised generative adversarial network (SSGAN). We conduct experiments to assess the feasibility and performance of our optimizer on the Canadian Institute For Advanced Research - 10 (CIFAR-10) benchmark dataset. The experimental results show that FAST-ADAM helps the generator reach convergence faster than the original ADAM while maintaining comparable training accuracy.
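The abstract describes wrapping the ADAM optimizer with Lookahead: ADAM takes k "fast" steps, after which a set of "slow" weights is moved toward the fast weights and the fast weights are reset. The following is a minimal NumPy sketch of that mechanism on a toy quadratic objective; the hyperparameters (k, alpha, learning rate) and function names are illustrative assumptions, not the values or code used in the paper.

```python
import numpy as np

def adam_step(theta, grad, m, v, t, lr=0.1, b1=0.9, b2=0.999, eps=1e-8):
    # One standard Adam update with bias-corrected moment estimates.
    m = b1 * m + (1 - b1) * grad
    v = b2 * v + (1 - b2) * grad ** 2
    m_hat = m / (1 - b1 ** t)
    v_hat = v / (1 - b2 ** t)
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v

def lookahead_adam(grad_fn, theta0, steps=200, k=5, alpha=0.5, lr=0.1):
    # Slow weights phi are pulled toward the fast weights theta
    # every k Adam steps; theta is then reset to phi.
    phi = theta0.copy()
    theta = theta0.copy()
    m = np.zeros_like(theta0)
    v = np.zeros_like(theta0)
    for t in range(1, steps + 1):
        theta, m, v = adam_step(theta, grad_fn(theta), m, v, t, lr=lr)
        if t % k == 0:
            phi = phi + alpha * (theta - phi)  # slow-weight interpolation
            theta = phi.copy()                 # reset fast weights
    return phi

# Toy objective f(x) = ||x||^2 with gradient 2x; the optimizer
# should drive the parameters toward the origin.
x = lookahead_adam(lambda th: 2 * th, np.array([3.0, -2.0]))
```

In the paper this wrapping is applied to the SSGAN generator's parameters rather than a toy quadratic; the interpolation step is what smooths the fast optimizer's trajectory and, per the abstract, speeds convergence.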

Table of Contents

Abstract
1. INTRODUCTION
2. PREVIOUS RESEARCH
3. IMPROVED SSGAN
3.1 Semi-supervised learning in GAN
3.2 Lookahead
3.3 Algorithm
4. EXPERIMENT
5. CONCLUSION
ACKNOWLEDGEMENT
REFERENCES

Author Information

  • Li Kun Department of Computer Engineering, Dongseo University, Busan, Korea
  • Dae-Ki Kang Department of Computer Engineering, Dongseo University, Busan, Korea

