Optimizing the multiclass perceptron through parameter tuning and GPU utilization

Shayne McIntosh, Gary Zoppetti, Stephanie Elzer Schwartz, Millersville

Semantic Scholar (2016)

Abstract
Artificial neural networks are biologically inspired models that serve as a common tool for data analysis, producing state-of-the-art results in machine learning fields such as computer vision and speech recognition. Because they are complex probabilistic models, training them is computationally expensive, so optimization and trade-off management are paramount. Optimization methods can generally be arranged into two groups: those providing an algorithmic advantage and those providing a computing-power or hardware advantage. This paper presents data associated with simple optimization methods from both groups applied to a multinomial logistic regression network: GPU utilization provides the hardware advantage, and parameter tuning provides the algorithmic advantage. Training is conducted on the MNIST data set. The assessment is meant to provide intuition for novice network optimization and training.
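The abstract's core model, a multinomial logistic regression (softmax) classifier trained by gradient descent, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the feature and class dimensions default to MNIST-like values (784 inputs, 10 classes), and the learning rate `lr` stands in for the kind of hyperparameter whose tuning the paper studies.

```python
import numpy as np

def softmax(z):
    """Row-wise softmax with the usual max-subtraction for numerical stability."""
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def train(X, y, num_classes=10, lr=0.1, epochs=100):
    """Batch gradient descent on the cross-entropy loss of a softmax classifier.

    X: (n, d) feature matrix; y: (n,) integer class labels.
    Returns the weight matrix W (d, num_classes) and bias vector b.
    """
    n, d = X.shape
    W = np.zeros((d, num_classes))
    b = np.zeros(num_classes)
    Y = np.eye(num_classes)[y]            # one-hot targets
    for _ in range(epochs):
        P = softmax(X @ W + b)            # predicted class probabilities
        grad_W = X.T @ (P - Y) / n        # gradient of mean cross-entropy
        grad_b = (P - Y).mean(axis=0)
        W -= lr * grad_W                  # lr is the tunable hyperparameter
        b -= lr * grad_b
    return W, b

def predict(X, W, b):
    """Most probable class for each row of X."""
    return softmax(X @ W + b).argmax(axis=1)
```

The dense matrix products `X @ W` dominate the cost, which is why moving them to a GPU (the paper's hardware-advantage group) pays off; the NumPy version above runs on the CPU only.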