An Improved Weighted Base Classification For Optimum Weighted Nearest Neighbor Classifiers

Muhammad Abbas, Kamran Memon, Noor Ain, Ekang Ajebesone, Muhammad Usaid, Zulfiqar Bhutto

EAI Endorsed Transactions on Scalable Information Systems (2020)

Abstract
Existing classification studies use two non-parametric classifiers, k-nearest neighbours (kNN) and decision trees, and one parametric classifier, logistic regression, generating high accuracies. Previous research has compared the results of these classifiers with training patterns of different sizes to study alcohol tests. In this paper, an Improved Version of the kNN (IVkNN) algorithm is presented which overcomes the limitation of the conventional kNN algorithm in classifying wine quality. The proposed method typically identifies the same number of nearest neighbours for each test example. Results indicate a higher Overall Accuracy (OA) that oscillates between 67% and 76%. Among the three classifiers, kNN was the least sensitive to training sample size and produced the highest OA, followed by decision trees and logistic regression. Based on the sample size, the proposed IVkNN model achieved 80% accuracy and a root mean square error (RMSE) of 0.375.
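For reference, a conventional kNN baseline of the kind the abstract compares against can be sketched in Python with scikit-learn. This is not the authors' IVkNN (its details are in the full paper); the bundled `load_wine` dataset, the 70/30 split, and k=5 are illustrative assumptions, since the paper's exact data and preprocessing are not given here. It does, however, report the same two metrics the abstract uses, overall accuracy and RMSE.

```python
# Conventional kNN baseline sketch (NOT the paper's IVkNN).
# Assumptions: scikit-learn's bundled wine dataset, 70/30 split, k=5.
from sklearn.datasets import load_wine
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.preprocessing import StandardScaler
from sklearn.metrics import accuracy_score, mean_squared_error

X, y = load_wine(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)

# Scale features: kNN distances are sensitive to feature magnitude.
scaler = StandardScaler().fit(X_train)
X_train, X_test = scaler.transform(X_train), scaler.transform(X_test)

# Conventional kNN: the same fixed k neighbours for every test example.
knn = KNeighborsClassifier(n_neighbors=5).fit(X_train, y_train)
pred = knn.predict(X_test)

oa = accuracy_score(y_test, pred)                  # Overall Accuracy
rmse = mean_squared_error(y_test, pred) ** 0.5     # RMSE on class labels
print(f"Overall accuracy: {oa:.3f}, RMSE: {rmse:.3f}")
```

Varying `n_neighbors` (e.g. over odd k from 1 to 15 with cross-validation) is the usual way to tune the baseline before comparing against an improved variant.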
Keywords
Classification, k-Nearest Neighbor (kNN), Logistic Regression, Decision Trees, Cross-Validation, Machine Learning (ML), SVM, Random Forest, Improved Version of k-Nearest Neighbor (IVkNN), Python