Automated neuron model optimization techniques: a review

Biological Cybernetics (2008)

Abstract
The increase in complexity of computational neuron models makes the hand tuning of model parameters more difficult than ever. Fortunately, the parallel increase in computer power allows scientists to automate this tuning. Optimization algorithms need two essential components. The first is a function that measures the difference between the output of the model, given a set of parameters, and the data. This error function or fitness function makes it possible to rank different parameter sets. The second component is a search algorithm that explores the parameter space to find the best parameter set in a minimal amount of time. In this review we distinguish three types of error functions: feature-based functions, point-by-point comparison of voltage traces, and multi-objective functions. We then detail several popular search algorithms, including brute-force methods, simulated annealing, genetic algorithms, evolution strategies, differential evolution, and particle-swarm optimization. Finally, we briefly describe Neurofitter, a free software package that combines a phase-plane trajectory density fitness function with several search algorithms.
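As a rough illustration of the two components described in the abstract, the sketch below (Python, not taken from the paper) pairs a point-by-point voltage-trace error function with the simplest search strategy mentioned, a brute-force grid scan. The function `simulate_neuron` is a hypothetical stand-in for whatever neuron model simulator is being tuned, and the parameter names in the docstring are illustrative only.

```python
import itertools

import numpy as np


def point_by_point_error(v_model, v_data):
    """Sum of squared differences between model and recorded voltage traces."""
    return float(np.sum((np.asarray(v_model) - np.asarray(v_data)) ** 2))


def brute_force_search(v_data, simulate_neuron, param_grid):
    """Evaluate every parameter combination in the grid and keep the best one.

    param_grid maps parameter names to lists of candidate values, e.g.
    {"g_na": [100, 120, 140], "g_k": [20, 36, 50]}.
    """
    names = list(param_grid)
    best_params, best_error = None, np.inf
    for values in itertools.product(*(param_grid[name] for name in names)):
        params = dict(zip(names, values))
        v_model = simulate_neuron(params)  # hypothetical simulator call
        error = point_by_point_error(v_model, v_data)
        if error < best_error:
            best_params, best_error = params, error
    return best_params, best_error
```

In practice the more sophisticated search algorithms the review covers (simulated annealing, evolution strategies, differential evolution, particle-swarm optimization) replace the exhaustive loop, and feature-based or multi-objective error functions can replace the raw trace comparison.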
Keywords
Optimization, Neuron, Model, Parameters, Automated tuning, Error function, Fitness function