Article Information
Citations: 0 (Source: Naver Academic)
Table of Contents
1. Backgrounds
  1. Nature of the Structure of Language (Marcolli et al., 2025)
  2. Existing LLMs of the structure of language
2. Vector-based Model: Transformer Model (Vaswani et al., 2017)
  1. Transformer Model architecture (Vaswani et al., 2017: 4)
  2. Scaled Dot-Product Attention
  3. Positional Encoding and Structural Invariance
  4. Self-Attention
  5. Open Question
3. Syntax-based Model: AMR Model (Banarescu et al., 2013)
  1. Syntax Model architecture
4. Hybrid Model: VerbNet-ConceptNet Model (Fodor, De Deyne, and Suzuki, 2024)
  1. Fodor, De Deyne, and Suzuki (2024: 148)
  2. Evaluation of three models
  3. Results & Analysis (Fodor, De Deyne, and Suzuki, 2024: 169)
5. Further Studies
  1. The mathematical structure of Merge (Marcolli et al., 2025)
6. References
