Compressive neighborhood embedding for classification

FSKD (2014)

Abstract
Recently, spectral manifold learning algorithms have found wide application in pattern recognition and machine learning. These algorithms, e.g., Locally Linear Embedding (LLE), commonly rely on neighborhood relationships constructed by the k-nearest-neighbor (knn) or ϵ-ball criterion. This paper presents a simple technique for constructing the nearest neighborhood by combining the ℓ2 and ℓ1 norms. The proposed criterion, called Compressive Neighborhood Embedding (CNE), gives rise to a modified spectral manifold learning technique. Since the discriminating power of sparse representation has been demonstrated in [1], we additionally formulate a semi-supervised variant of CNE, SCNE for short, based on the proposed criterion to exploit both labeled and unlabeled data for inference on a graph. Extensive experiments on semi-supervised classification demonstrate the superiority of the proposed algorithm.
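The abstract only states that the neighborhood is built by combining the ℓ2 and ℓ1 norms; one natural reading is to pre-select candidate neighbors by Euclidean (ℓ2) knn and then keep only those candidates that receive nonzero weights in an ℓ1-sparse reconstruction of each point. The sketch below follows that reading using scikit-learn; it is not the paper's actual formulation, and the function name and the parameters k and alpha are assumptions.

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors
from sklearn.linear_model import Lasso

def compressive_neighborhood(X, k=10, alpha=0.05):
    """Sketch of an l2+l1 neighborhood graph (assumed reading of CNE):
    pre-select k candidates per point by l2 distance, then keep only the
    candidates with nonzero l1-sparse reconstruction weights."""
    n = X.shape[0]
    # k+1 neighbors because each point is its own nearest neighbor
    nn = NearestNeighbors(n_neighbors=k + 1).fit(X)
    _, idx = nn.kneighbors(X)

    W = np.zeros((n, n))
    for i in range(n):
        cand = idx[i, 1:]                      # drop the point itself
        # l1-regularized reconstruction of x_i from its l2 candidates
        lasso = Lasso(alpha=alpha, fit_intercept=False, max_iter=5000)
        lasso.fit(X[cand].T, X[i])             # columns = candidate vectors
        W[i, cand] = lasso.coef_               # sparse weights define the graph
    return W
```

Under this reading, the resulting sparse weight matrix W would play the role of the LLE reconstruction weights, and a standard spectral embedding step could then be applied to it; for the semi-supervised SCNE variant, label propagation over the same graph is one plausible instantiation of "inference on a graph".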
Keywords
knn criterion,CNE criterion,spectral manifold learning algorithms,graph inference,locally linear embedding,image representation,image coding,pattern recognition,compressive neighborhood embedding,inference mechanisms,compressive sensing,learning (artificial intelligence),LLE,ℓ1 norm,ℓ2 norm,unlabeled data,semi-supervised learning,neighborhood relationships,semisupervised classification,compressed sensing,image classification,nearest neighborhood construction,ϵ criterion,labeled data,machine learning orientation,semisupervised learning CNE,graph theory,SCNE,manifold learning,sparse representation