
Communication

Semi-supervised learning using Otherside view consistency regularization

Abstract (English)

Semi-supervised learning (SSL) has shown great promise in utilizing both labeled and unlabeled data, especially through consistency regularization. To enhance the model's learning efficiency, we propose a dual-branch co-training approach. After dividing the training data into two subsets, each branch utilizes a teacher-student model pair, where the teacher's weights are updated via an exponential moving average (EMA). The framework combines a supervised loss and an unsupervised loss to optimize the model. Upon label expansion (e.g., pseudo-labeling), an additional otherside view consistency loss is introduced, promoting agreement between the branches' predictions on shared data. This loss mitigates errors arising from incorrect pseudo-labels and enhances the overall robustness of the training process. By dynamically adjusting pseudo-label inclusion based on confidence thresholds, our method reduces the impact of noisy data and prevents overfitting. As a result, we demonstrate the effectiveness of the proposed method in leveraging unlabeled data while maintaining high performance.
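The abstract describes a complete training loop: two branches with EMA teacher-student pairs, a supervised loss on each labeled subset, confidence-thresholded pseudo-labels on unlabeled data, and an otherside view term that pushes the branches toward agreement on shared data. The following is a minimal PyTorch sketch of that loop, not the authors' implementation: the tiny linear network, the symmetric-KL form of the agreement term, the threshold tau = 0.95, and all function names (ema_update, branch_loss, otherside_loss) are illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def ema_update(teacher, student, decay=0.99):
    # Teacher weights track the student via an exponential moving average (EMA).
    with torch.no_grad():
        for t, s in zip(teacher.parameters(), student.parameters()):
            t.mul_(decay).add_(s, alpha=1.0 - decay)

def branch_loss(student, teacher, x_lab, y_lab, x_unlab, tau=0.95):
    # Supervised loss on this branch's labeled subset.
    sup = F.cross_entropy(student(x_lab), y_lab)
    # Unsupervised loss: the student matches the EMA teacher's pseudo-labels,
    # keeping only predictions above the confidence threshold tau (assumed value).
    with torch.no_grad():
        probs = F.softmax(teacher(x_unlab), dim=1)
        conf, pseudo = probs.max(dim=1)
        mask = conf.ge(tau).float()
    unsup = (F.cross_entropy(student(x_unlab), pseudo, reduction="none") * mask).mean()
    return sup + unsup

def otherside_loss(logits_a, logits_b):
    # Otherside view term: encourage the two branches to agree on shared data.
    # A symmetric KL between their predictive distributions is assumed here.
    pa, pb = F.log_softmax(logits_a, dim=1), F.log_softmax(logits_b, dim=1)
    return 0.5 * (F.kl_div(pa, pb.exp(), reduction="batchmean")
                  + F.kl_div(pb, pa.exp(), reduction="batchmean"))

def make_net():
    # Tiny stand-in classifier; the paper's backbone for ultrasound images
    # would presumably be a CNN, which this sketch does not attempt to model.
    return torch.nn.Linear(784, 10)

# Two teacher-student pairs, one per data subset (branch).
student_a, student_b = make_net(), make_net()
teacher_a, teacher_b = make_net(), make_net()
teacher_a.load_state_dict(student_a.state_dict())
teacher_b.load_state_dict(student_b.state_dict())
opt = torch.optim.SGD(list(student_a.parameters()) + list(student_b.parameters()), lr=0.01)

# One training step on dummy data (flattened 28x28 inputs, 10 classes).
x_lab_a, y_lab_a = torch.randn(8, 784), torch.randint(0, 10, (8,))
x_lab_b, y_lab_b = torch.randn(8, 784), torch.randint(0, 10, (8,))
x_shared = torch.randn(16, 784)  # unlabeled data seen by both branches

loss_a = branch_loss(student_a, teacher_a, x_lab_a, y_lab_a, x_shared)
loss_b = branch_loss(student_b, teacher_b, x_lab_b, y_lab_b, x_shared)
loss = loss_a + loss_b + otherside_loss(student_a(x_shared), student_b(x_shared))

opt.zero_grad(); loss.backward(); opt.step()
ema_update(teacher_a, student_a); ema_update(teacher_b, student_b)
```

Gradients flow only through the students; the teachers are updated exclusively by EMA, and the confidence mask drops low-confidence pseudo-labels so that noisy targets contribute nothing to the unsupervised term.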

Table of Contents

Abstract
1. Introduction
2. Related Research
2.1 Consistency Regularization
3. Breast Ultrasound Images
4. Otherside View Co-training
5. Experiments and Results
6. Conclusion
Acknowledgement
References

Author Information

  • Sangmin Lee, Graduate Student, Major of Data Science, Korea National University of Transportation, Korea
  • Seokmin Han, Professor, Major of Data Science, Korea National University of Transportation, Korea
