Efficient optimization of ODE neuron models using gradient descent
arXiv (2024)
Abstract
Neuroscientists fit morphologically and biophysically detailed neuron
simulations to physiological data, often using evolutionary algorithms.
However, such gradient-free approaches are computationally expensive, making
convergence slow when neuron models have many parameters. Here we introduce a
gradient-based algorithm using differentiable ODE solvers that scales well to
high-dimensional problems. GPUs make parallel simulations fast and gradient
calculations make optimization efficient. We verify the utility of our approach
by optimizing neuron models with active dendrites and heterogeneously
distributed ion channel densities. We find that individually stimulating and recording all
dendritic compartments makes such model parameters identifiable. Identification
degrades gracefully as fewer stimulation and recording sites are available.
Differentiable neuron models, which should be added to popular neuron
simulation packages, promise a new era of optimizable neuron models with many
free parameters, a key feature of real neurons.
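The core idea described above can be illustrated with a minimal sketch (not the paper's actual code, and a much simpler model than the morphologically detailed neurons it targets): simulate a single-compartment leaky integrator with a differentiable explicit-Euler solver in JAX, then use the gradient of a trace-matching loss to recover an unknown leak conductance. All names, constants, and the solver choice here are illustrative assumptions.

```python
# Hypothetical sketch: gradient descent through a differentiable ODE solve
# to fit one parameter of a toy neuron model (not the paper's implementation).
import jax
import jax.numpy as jnp

def simulate(g_leak, dt=0.1, n_steps=200):
    """Euler-integrate a single-compartment leaky integrator:
    dV/dt = -g_leak * (V - E_rest) + I_ext."""
    E_rest, I_ext = -65.0, 10.0
    def step(v, _):
        v_next = v + dt * (-g_leak * (v - E_rest) + I_ext)
        return v_next, v_next
    _, trace = jax.lax.scan(step, E_rest, None, length=n_steps)
    return trace

# Synthetic "recording" generated from a known ground-truth conductance.
target = simulate(0.3)

def loss(g):
    # Mean squared error between simulated and recorded voltage traces.
    return jnp.mean((simulate(g) - target) ** 2)

grad_loss = jax.jit(jax.grad(loss))

g_fit = 0.25  # initial guess for the leak conductance
for _ in range(500):
    g_fit = g_fit - 3e-5 * grad_loss(g_fit)  # plain gradient descent
```

Because the entire simulation is written in JAX, the same code vectorizes over many parameter sets with `jax.vmap` and runs in parallel on a GPU, which is the scaling property the abstract emphasizes.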