Research on Aircraft Maintenance Knowledge Graph Construction Technology Based on Multi-Head Attention and Full-Token Masking

LIU Guoliang1, GAO Yuexian1, SHANG Jianhang1, XU Sufan1, ZHANG Erhu2, HUANG Zhun2, YANG Chaodong2, SONG Xilin2
1. Shandong University, Jinan 250061, China;
2. China Flight Test Establishment, Xi'an 710089, China
Abstract: To address the challenges posed by professional terminology, short texts, large data volumes, and mixed Chinese–English content in aircraft maintenance manuals, this paper proposes a knowledge extraction and knowledge graph construction method based on multi-head attention and full-token masking. First, we design the CoBiTex-FTM (Contextual Bidirectional Text Encoder with Full-Token Masking) model for named entity recognition, which enhances contextual modeling through multi-head attention and ensures label consistency via a whole-word constraint algorithm tailored to mixed-language scenarios. Second, we construct the BiHAM-FTM (Bidirectional LSTM and Multi-Head Attention with Full-Token Masking) model to extract "entity–relation–entity" triples. Finally, an aircraft maintenance knowledge graph system is implemented using Neo4j for structured storage and visual representation of maintenance knowledge. To validate the approach, we build a domain-specific dataset and conduct comparative and ablation experiments. Experimental results show that CoBiTex-FTM achieves an F1 score of 95.16% and BiHAM-FTM reaches 90.74%, demonstrating superior performance on complex, multilingual, short-text data.
CLC number: V267+.4