A Quantum-Like Tensor Compression Sentence Representation Based on Constraint Functions for Semantics Analysis

International Journal of Computational Intelligence Systems (2024)

Abstract
To emphasize the semantic impact of local semantic and grammatical information shared among adjacent words in the input text, we establish a constraint-function-based quantum-like tensor compression sentence representation model by extending the pure-state density matrix of quantum mechanics to a mixed-state projection operator. The proposed model highlights the semantic significance of mixed word associations in the input text while reducing the reliance on information derived solely from dictionary statistics. We combine correlation coefficients with an attention mechanism to quantify the association between words, and extend the quantum-like sentence representation from a pure-state density matrix to a mixed-state projection operator. Drawing on maximization in convex optimization, we build a constraint-function-based pruning model for the quantum-like text representation that reduces the redundant information caused by the dimensional expansion of tensor operations. Experimental results on SICK-2014, STS-benchmark, and STS-companion show that the proposed model is more effective than mainstream models at mining semantic information, and is especially sensitive to the negative semantics of sentences.
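The core construction described above can be sketched as follows: each word embedding is treated as a pure state |w⟩⟨w|, and the pure states are mixed with attention-derived weights to form a mixed-state density matrix for the sentence. The NumPy snippet below is a minimal, illustrative sketch of that general density-matrix idea only; the attention_weights and trace_similarity helpers, the query vector, and the toy embeddings are assumptions for demonstration and do not reproduce the paper's constraint-function model or its convex-optimization pruning step.

```python
# Minimal sketch of a quantum-like mixed-state sentence representation.
# Assumption: word embeddings are available as dense vectors; the attention
# query and similarity measure here are illustrative, not the paper's model.
import numpy as np

def pure_state_density(word_vec):
    """Outer product |w><w| of a unit-norm word embedding (a pure state)."""
    v = word_vec / np.linalg.norm(word_vec)
    return np.outer(v, v)

def attention_weights(word_vecs, query):
    """Softmax over cosine-correlation scores against a context/query vector."""
    scores = np.array([
        np.dot(w, query) / (np.linalg.norm(w) * np.linalg.norm(query))
        for w in word_vecs
    ])
    exp = np.exp(scores - scores.max())
    return exp / exp.sum()

def mixed_state_density(word_vecs, query):
    """Mixed-state density matrix rho = sum_i p_i |w_i><w_i|."""
    p = attention_weights(word_vecs, query)
    return sum(p_i * pure_state_density(w) for p_i, w in zip(p, word_vecs))

def trace_similarity(rho_a, rho_b):
    """Trace inner product tr(rho_a rho_b) as a simple semantic similarity."""
    return float(np.trace(rho_a @ rho_b))

# Toy example: two "sentences" of three random 8-dimensional word embeddings.
rng = np.random.default_rng(0)
sent_a = [rng.normal(size=8) for _ in range(3)]
sent_b = [rng.normal(size=8) for _ in range(3)]
query = rng.normal(size=8)  # stand-in for a sentence-level context vector
rho_a = mixed_state_density(sent_a, query)
rho_b = mixed_state_density(sent_b, query)
print(trace_similarity(rho_a, rho_b))
```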
Keywords
Quantum-like sentence representation, Semantic similarity, Attention mechanism, Dimensionality reduction, Optimization