Basic Information
Views: 147
Career Trajectory
Biography
The brain has a rich ability and flexibility to perform various types of information processing. Information distributed in the brain is processed in parallel over a large number of mutually connected neurons. There must be mathematical principles of parallel and distributed information processing, and the brain can be regarded as a biological realization of these principles, arrived at through evolution. Recently, many researchers have been interested in modeling the information processing of learning systems. Some models are simplified reproductions of biological neural systems, while others are far removed from the real brain.
Even if models differ from the actual brain, they can still be engineering realizations of the above-mentioned information principles, which the brain also uses in its own style. From this point of view, it is important to elucidate the mathematical principles of learning systems; such principles will be useful in various fields of engineering science, such as control theory, pattern recognition, energy management optimization, and material design.
A necessary prerequisite for constructing machines that can solve application tasks is a solid theory of learning. So far, we have used two very powerful and successful techniques for the analysis of learning machines: stochastic modeling and the geometry of probability distributions.
When the data include noise, the input-output relation is described stochastically in terms of a conditional probability. Some learning machines are stochastic by their own nature, and hence their behavior is also described by probability distributions and stochastic dynamics. Even when a machine is deterministic, it is effective to train it as if it were a stochastic machine, although it behaves deterministically in execution mode. Adopting this point of view, we can treat learning machines within the framework of statistics and probability theory, which gives a great deal of advantage in analyzing their properties.
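The "train stochastically, execute deterministically" idea can be sketched with a toy logistic-regression classifier. Everything below (the synthetic data, learning rate, and iteration count) is an illustrative assumption, not material from the profile: the model is trained by maximizing the log-likelihood of the conditional probability P(y=1|x) = sigmoid(w·x + b), and then used as a deterministic classifier by thresholding that probability.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: label is 1 exactly when x0 + x1 > 0.
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

w = np.zeros(2)
b = 0.0
lr = 0.5
for _ in range(500):
    # Stochastic view during training: model P(y=1|x) = sigmoid(w.x + b)
    # and ascend the log-likelihood (equivalently, descend the
    # cross-entropy loss, whose gradient is (p - y) times the input).
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
    grad_w = X.T @ (p - y) / len(y)
    grad_b = np.mean(p - y)
    w -= lr * grad_w
    b -= lr * grad_b

# Execution mode: a deterministic decision by thresholding at 1/2.
pred = (1.0 / (1.0 + np.exp(-(X @ w + b))) > 0.5).astype(float)
accuracy = np.mean(pred == y)
```

The probabilistic training objective gives smooth gradients even though the executed decision rule is a hard threshold; this is the practical advantage of treating a deterministic machine as if it were stochastic during learning.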
Another important theory originates from geometry. The behavior of individual learning machines has been analyzed by many researchers using different mathematical methods and computer simulations. However, it has turned out to be very fruitful to study not a specific machine but a whole family of machines, in order to clarify the capabilities and limitations of a fixed architecture. For a learning machine with an n-dimensional modifiable parameter, the set of all possible machines realized by varying the parameter forms an n-dimensional manifold, on which the parameter plays the role of a coordinate system. The geometry of this manifold is useful for understanding the total capability of a class of learning machines. This program, which connects stochastic and geometric ideas, began in the 1980s and is called information geometry. It originates from the information structure of a manifold of probability distributions, has developed into a new mathematical subject with new differential-geometric notions, and has been successfully applied to analyzing the properties of learning machines.
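A minimal concrete instance of such a statistical manifold is the two-parameter Gaussian family N(μ, σ²), whose points are coordinatized by θ = (μ, σ); in information geometry, the Fisher information matrix G(θ) = E[s(x)s(x)ᵀ], with s the score ∇_θ log p(x; θ), supplies the Riemannian metric on this manifold. The sketch below (an illustrative assumption, not code from the author) estimates G numerically and can be checked against the known closed form diag(1/σ², 2/σ²):

```python
import numpy as np

def fisher_information_gaussian(mu, sigma, num=20001, width=12.0):
    """Numerically estimate the 2x2 Fisher information matrix of the
    Gaussian family N(mu, sigma^2) in the coordinates theta = (mu, sigma),
    via G = E[s(x) s(x)^T] where s(x) = grad_theta log p(x; theta)."""
    x = np.linspace(mu - width * sigma, mu + width * sigma, num)
    dx = x[1] - x[0]
    p = np.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sigma * np.sqrt(2 * np.pi))
    # Analytic score components of the Gaussian log-density.
    s_mu = (x - mu) / sigma ** 2
    s_sigma = (x - mu) ** 2 / sigma ** 3 - 1.0 / sigma
    scores = (s_mu, s_sigma)
    G = np.empty((2, 2))
    for i, si in enumerate(scores):
        for j, sj in enumerate(scores):
            # Riemann-sum approximation of the expectation integral.
            G[i, j] = np.sum(si * sj * p) * dx
    return G

G = fisher_information_gaussian(0.0, 2.0)
```

For σ = 2, the closed form predicts G ≈ diag(0.25, 0.5) with vanishing off-diagonal terms; the diagonal structure reflects the orthogonality of the mean and scale directions on this manifold.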
Research Interests
Publications: 237 in total
2024 IEEE 21st Biennial Conference on Electromagnetic Field Computation (CEFC), pp. 01-02 (2024)
Koji Nakao, Katsunari Yoshioka, Takayuki Sasaki, Rui Tanabe, Xuping Huang, Takeshi Takahashi, Akira Fujita, Jun'ichi Takeuchi, Noboru Murata, Junji Shikata, Kazuki Iwamoto, Kazuki Takada
IEICE Transactions on Information and Systems, no. 9 (2023): 1302-1315
FLAIRS (2023)
Information Geometry, no. 2 (2023): 435-462
bioRxiv (Cold Spring Harbor Laboratory) (2023)
Information Geometry, no. 1 (2022): 39-77
Author Statistics
#Papers: 239
#Citation: 11755
H-Index: 32
G-Index: 106
Sociability: 6
Diversity: 1
Activity: 1
Co-authors
Collaborating Institutions
D-Core
- Collaborators
- Students
- Advisors
Data Disclaimer
The data on this page come from publicly available Internet sources, partner publishers, and automated AI-based analysis. We make no promise or guarantee as to the validity, accuracy, correctness, reliability, completeness, or timeliness of the data. For questions, contact us by email: report@aminer.cn