Session I : AI Deep Learning

Sensitivity Analysis Based on Similarity between Feature and Weight Vectors

Abstract

In a classifier constructed with a fully connected layer connected to an output layer, the weight vectors W create the decision boundaries between the classes. During training, the model learns to extract features unique to each class, which moves the distribution of the initial feature vectors toward the decision boundary of the corresponding class. When the similarity between the initial feature vector and the weight vector is high, the loss value is small, so the effect on the model can be expected to be low. Conversely, when the similarity is low, the gap between the feature vector and the weight vector is large and the loss value is high, so the effect on the model during training can be expected to be greater than when the similarity is high. In this paper, we verify how much the similarity between the initial feature vector and the weight vector before training affects model learning. To confirm the effect of similarity, the weight vectors W of the fully connected layer are assigned arbitrary fixed values that produce different similarities, and the model is then trained. Both VGG16 and VGG19 models are used, and the Recall and Precision values of each class are compared as the learning result of each model.
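The setup described in the abstract can be illustrated with a short sketch. The fragment below, written against torchvision's VGG16, measures the similarity between a class's initial (pre-training) feature vectors and the corresponding weight vector of the final fully connected layer, and then assigns that layer arbitrary fixed weights that are excluded from learning. Cosine similarity as the similarity measure, the `penultimate_features` helper, and the random fixed weights are assumptions made for illustration only; the paper does not prescribe this exact implementation.

```python
import torch
import torch.nn.functional as F
from torchvision import models

# Untrained VGG16, so features and weights correspond to the state "before learning".
model = models.vgg16()
fc = model.classifier[6]  # final fully connected layer, weight W: [num_classes, 4096]

def penultimate_features(x):
    """Return the 4096-d feature vectors fed into the final FC layer (helper assumed here)."""
    h = model.features(x)
    h = model.avgpool(h)
    h = torch.flatten(h, 1)
    for layer in model.classifier[:6]:  # every classifier layer except the last Linear
        h = layer(h)
    return h

def similarity_to_weight(images, class_idx):
    """Mean cosine similarity between a class's initial feature vectors and its weight vector W."""
    with torch.no_grad():
        feats = penultimate_features(images)       # [N, 4096]
        w = fc.weight[class_idx].unsqueeze(0)      # [1, 4096]
        return F.cosine_similarity(feats, w).mean()

# Assign an arbitrary fixed value to the weight vectors W of the fully connected
# layer and exclude them from learning, so that different similarities can be set up.
with torch.no_grad():
    fc.weight.copy_(torch.randn_like(fc.weight))   # arbitrary fixed weights (assumption)
fc.weight.requires_grad_(False)
```

In this sketch, training the rest of the network with the frozen W and comparing per-class Recall and Precision across different fixed weight assignments would reproduce the kind of comparison the abstract describes for VGG16 and VGG19.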

Table of Contents

Abstract
I. INTRODUCTION
II. HYPOTHESIS
III. EXPERIMENTS
A. Dataset
B. Model
C. Evaluation metrics
D. Experiments method
E. Experiments results
IV. CONCLUSION
ACKNOWLEDGMENT
REFERENCES

Author Information

  • Sung-Hwan Park, Pattern Recognition and Machine Learning Lab, Gachon University
  • Sang-Woong Lee, Pattern Recognition and Machine Learning Lab, Gachon University
