Correction to TINS 1828: Contributions by metaplasticity to solving the Catastrophic Forgetting Problem (Trends in Neurosciences, 45(9): 656–666, 2022)

Trends in Neurosciences (2023)

Abstract
In the print and online PDF versions of the article, references 15–49 were unfortunately missing due to a production error during late stages of article preparation. The references have been reinstated in the online version of the article, and are listed below. The Publisher apologizes to all for the inconvenience.

15. Atkinson, C. et al. Pseudo-rehearsal: achieving deep reinforcement learning without catastrophic forgetting. Neurocomputing 2021; 428: 291–307
16. Karhunen, J. et al. Unsupervised deep learning: a short review. In: Bingham, E. (ed.) Advances in Independent Component Analysis and Learning Machines. Academic Press, 2015: 125–142
17. Madaan, D. et al. Representational continuity for unsupervised continual learning. arXiv 2022; published online October 13, 2021. http://dx.doi.org/10.48550/arXiv.2110.06976
18. Hopfield, J.J. Neural networks and physical systems with emergent collective computational abilities. Proc. Natl. Acad. Sci. U. S. A. 1982; 79: 2554–2558
19. Ratcliff, R. Connectionist models of recognition memory: constraints imposed by learning and forgetting functions. Psychol. Rev. 1990; 97: 285–308
20. Robins, A. Consolidation in neural networks and in the sleeping brain. Connect. Sci. 1996; 8: 259–276
21. Shin, H. et al. Continual learning with deep generative replay. arXiv 2017; published online May 24, 2017. http://dx.doi.org/10.48550/arXiv.1705.08690
22. Ji, D. and Wilson, M.A. Coordinated memory replay in the visual cortex and hippocampus during sleep. Nat. Neurosci. 2007; 10: 100–107
23. Kamra, N. et al. Deep generative dual memory network for continual learning. arXiv 2017; published online October 28, 2017. http://dx.doi.org/10.48550/arXiv.1710.10368
24. van de Ven, G.M. et al. Brain-inspired replay for continual learning with artificial neural networks. Nat. Commun. 2020; 11: 4069
25. Hayes, T.L. et al. Replay in deep learning: current approaches and missing biological elements. Neural Comput. 2021; 33: 2908–2950
26. McClelland, J.L. et al. Why there are complementary learning systems in the hippocampus and neocortex: insights from the successes and failures of connectionist models of learning and memory. Psychol. Rev. 1995; 102: 419–457
27. O'Reilly, R.C. et al. Complementary learning systems. Cogn. Sci. 2014; 38: 1229–1248
28. Kumaran, D. et al. What learning systems do intelligent agents need? Complementary learning systems theory updated. Trends Cogn. Sci. 2016; 20: 512–534
29. Hattori, M. A biologically inspired dual-network memory model for reduction of catastrophic forgetting. Neurocomputing 2014; 134: 262–268
30. McClelland, J.L. et al. Integration of new information in memory: new insights from a complementary learning systems perspective. Philos. Trans. R. Soc. B Biol. Sci. 2020; 375: 20190637
31. Parisi, G.I. et al. On the role of neurogenesis in overcoming catastrophic forgetting. arXiv 2018; published online November 6, 2018. http://dx.doi.org/10.48550/arXiv.1811.02113
32. Rolls, E.T. and Treves, A. The relative advantages of sparse versus distributed encoding for associative neuronal networks in the brain. Netw. Comput. Neural Syst. 1990; 1: 407–421
33. Ahmad, S. and Scheinkman, L. How can we be so dense? The benefits of using highly sparse representations. arXiv 2019; published online March 27, 2019. http://dx.doi.org/10.48550/arXiv.1903.11257
34. Manneschi, L. et al. SpaRCe: improved learning of reservoir computing systems through sparse representations. IEEE Trans. Neural Netw. Learn. Syst. 2021; published online August 16, 2021. https://doi.org/10.1109/tnnls.2021.3102378
35. Ellefsen, K.O. et al. Neural modularity helps organisms evolve to learn new skills without forgetting old skills. PLoS Comput. Biol. 2015; 11: e1004128
36. Spanne, A. and Jörntell, H. Questioning the role of sparse coding in the brain. Trends Neurosci. 2015; 38: 417–427
37. Feng, Y. and Brunel, N. Storage capacity of networks with discrete synapses and sparsely encoded memories. arXiv 2021; published online December 13, 2021. http://dx.doi.org/10.48550/arXiv.2112.06711
38. Grewal, K. et al. Going beyond the point neuron: active dendrites and sparse representations for continual learning. bioRxiv 2021; published online October 26, 2021. https://doi.org/10.1101/2021.10.25.465651
39. Iyer, A. et al. Avoiding catastrophe: active dendrites enable multi-task learning in dynamic environments. arXiv 2021; published online December 31, 2021. http://dx.doi.org/10.48550/arXiv.2201.00042
40. Hainmueller, T. and Bartos, M. Parallel emergence of stable and dynamic memory engrams in the hippocampus. Nature 2018; 558: 292–296
41. Leutgeb, J.K. et al. Pattern separation in the dentate gyrus and CA3 of the hippocampus. Science 2007; 315: 961–966
42. Wiskott, L. et al. A functional hypothesis for adult hippocampal neurogenesis: avoidance of catastrophic interference in the dentate gyrus. Hippocampus 2006; 16: 329–343
43. Fahlman, S. and Lebiere, C. The cascade-correlation learning architecture. Adv. Neural Inf. Process. Syst. 1989; 2: 524–532
44. Carpenter, G.A. et al. Invariant recognition of cluttered scenes by a self-organizing ART architecture: CORT-X boundary segmentation. Neural Netw. 1989; 2: 169–181
45. Tsuda, B. et al. A modeling framework for adaptive lifelong learning with transfer and savings through gating in the prefrontal cortex. Proc. Natl. Acad. Sci. U. S. A. 2020; 117: 29872–29882
46. Franco, L. Constructive Neural Networks (Studies in Computational Intelligence, vol. 258). Springer, 2009
47. Zemouri, R.A. et al. A new growing pruning deep learning neural network algorithm (GP-DLNN). Neural Comput. Appl. 2019; 32: 18143–18159
48. Rusu, A.R. et al. Progressive neural networks. arXiv 2016; published online June 15, 2016. http://dx.doi.org/10.48550/arXiv.1606.04671
49. Liu, C. et al. Progressive neural architecture search. In: Proceedings of the European Conference on Computer Vision (ECCV). 2018: 19–34
Contributions by metaplasticity to solving the Catastrophic Forgetting Problem
Jedlicka et al., Trends in Neurosciences, July 4, 2022

In Brief: Catastrophic forgetting (CF) refers to the sudden and severe loss of prior information in learning systems when acquiring new information. CF has been an Achilles heel of standard artificial neural networks (ANNs) when learning multiple tasks sequentially. The brain, by contrast, has solved this problem during evolution. Modellers now use a variety of strategies to overcome CF, many of which have parallels to cellular and circuit functions in the brain. One common strategy, based on metaplasticity phenomena, controls the future rate of change at key connections to help retain previously learned information.
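The metaplasticity strategy summarized above, slowing the future rate of change at connections that mattered for earlier learning, can be illustrated with a minimal sketch. The toy model below is not the model of Jedlicka et al.; it is a generic gated-gradient-descent illustration in which every name and constant (make_task, train, decay, gated, the exponential plasticity update) is invented for this example. Each weight carries a plasticity value that scales its learning rate and decays as the weight accumulates change, so weights consolidated by an earlier task resist being overwritten by a later one.

```python
import numpy as np

rng = np.random.default_rng(0)
n_weights = 20

def make_task(n_patterns, active):
    """Random linear task whose targets depend only on the 'active' inputs."""
    active = np.asarray(active)
    X = np.zeros((n_patterns, n_weights))
    X[:, active] = rng.normal(size=(n_patterns, active.size))
    w_true = np.zeros(n_weights)
    w_true[active] = rng.normal(size=active.size)
    return X, X @ w_true

def train(w, plasticity, X, y, lr=0.1, decay=2.0, epochs=2000, gated=True):
    """Full-batch gradient descent on squared error. When gated, each
    weight's step is scaled by its plasticity, and plasticity decays as
    the weight accumulates change: a toy metaplasticity rule."""
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / len(y)
        step = lr * (plasticity * grad if gated else grad)
        w = w - step
        if gated:
            # Weights that just moved a lot become harder to move later.
            plasticity = plasticity * np.exp(-decay * np.abs(step))
    return w, plasticity

def mse(w, X, y):
    return float(np.mean((X @ w - y) ** 2))

# Task A depends on inputs 0-11 only; task B touches all 20 inputs but has
# few patterns, so it can be fitted mostly with the weights task A left free.
X_a, y_a = make_task(40, range(12))
X_b, y_b = make_task(10, range(20))

for gated in (False, True):
    w, p = np.zeros(n_weights), np.ones(n_weights)
    w, p = train(w, p, X_a, y_a, gated=gated)   # learn task A, consolidating
    w, p = train(w, p, X_b, y_b, gated=gated)   # then learn task B
    print(f"gated={gated}: task-A MSE after task B = {mse(w, X_a, y_a):.4f}, "
          f"task-B MSE = {mse(w, X_b, y_b):.4f}")
```

Under these assumptions, the gated run typically fits task B while keeping a much lower task-A error than plain gradient descent, at the cost of slower updates wherever weights are consolidated; this is the stability–plasticity trade-off that metaplasticity-based models navigate.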
Keywords
catastrophic forgetting problem, metaplasticity