Source Information
Security Engineering Research Support Center (IJDTA)
International Journal of Database Theory and Application
Vol.9 No.7
2016.07
pp.31-38
Citations: 0 (Source: Naver Academic Information)
Abstract
English
Representing words as continuous vectors enables the quantification of semantic relationships between words by vector operations, and has therefore attracted much attention recently. This paper proposes an approach that combines continuous word representation and topic modeling by encoding words according to their topic distributions in the hierarchical softmax, thereby introducing prior semantic-relevance information into the neural network. The word vectors generated by our model are evaluated with respect to word relevance and document relevance. Experimental results show that our approach is promising for further improving the quality of word vectors.
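The abstract describes encoding words by their topic distributions so that the hierarchical softmax tree reflects prior semantic relevance. The paper's exact encoding scheme is not given here, so the following is only a minimal illustrative sketch, assuming a simple strategy: group words under their dominant topic so that words sharing a topic share a code prefix (and hence upper tree nodes) in a hierarchical-softmax-style binary tree. The word list and topic distributions below are invented toy data.

```python
# Hedged sketch (not the paper's implementation): assign each word a binary
# code whose prefix identifies its dominant topic, so semantically related
# words share internal nodes of a hierarchical-softmax-style tree.

from collections import defaultdict

def dominant_topic(dist):
    """Index of the highest-probability topic for a word."""
    return max(range(len(dist)), key=lambda k: dist[k])

def topic_prefix_codes(word_topics):
    """Map each word to a binary code string: a topic prefix followed by
    bits distinguishing the word within its topic group."""
    groups = defaultdict(list)
    for word, dist in word_topics.items():
        groups[dominant_topic(dist)].append(word)

    n_topics = max(groups) + 1
    t_bits = max(1, (n_topics - 1).bit_length())
    codes = {}
    for t, words in sorted(groups.items()):
        w_bits = max(1, (len(words) - 1).bit_length())
        for i, w in enumerate(sorted(words)):
            codes[w] = format(t, f"0{t_bits}b") + format(i, f"0{w_bits}b")
    return codes

# Toy example: 2 topics, 4 words (distributions are made up for illustration).
word_topics = {
    "stock":  [0.9, 0.1],
    "market": [0.8, 0.2],
    "gene":   [0.2, 0.8],
    "dna":    [0.1, 0.9],
}
codes = topic_prefix_codes(word_topics)
# "stock" and "market" share the topic-0 prefix bit;
# "gene" and "dna" share the topic-1 prefix bit.
```

In actual hierarchical softmax (as in word2vec), a word's code determines which internal-node classifiers are trained for it; grouping by topic as above would make words with similar topic distributions update shared parameters, which is the intuition the abstract points to.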
Table of Contents
Abstract
1. Introduction
2. Related Work
3. Approach
3.1. Word Encoding via Topic Distributions
3.2. Topic Modeling
4. Experiments
4.1. Experimental Setting
4.2. Evaluation Tasks
4.3. Results and Discussion
5. Conclusions
Acknowledgments
References
Keywords
Author Information
References