Informational And Entropic Criteria Of Self-Similarity Of Fractals And Chaotic Signals

INTERNATIONAL JOURNAL OF MATHEMATICS AND PHYSICS (2018)

Abstract
Information entropy and the fractal dimension of a set of physical values are commonly used as quantitative characteristics of chaos. Normalization of entropy is a well-known problem, and this work develops a method to address it. We propose criteria of self-similarity based on information and informational entropy. We define normalized values of information (I₁ = 0.567) and informational entropy (I₂ = 0.806) as fixed points of the probability density functions of information and informational entropy, respectively. The meaning of these values is interpreted as criteria of self-similarity for fractals and chaotic signals of different dimensions. We show that self-similarity occurs when the normalized informational entropy S falls in one of the ranges [0, I₁), [I₁, I₂), [I₂, 1), which correspond to topological dimensions 1 to 3 of quasi-periodic, chaotic, and stochastic objects, respectively. The validity of these findings is confirmed by calculating the entropy of hierarchical sets of well-known fractals and nonlinear maps. The criteria can be applied to a wide range of problems in which entropy is used.
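As an illustration only (not the authors' procedure, which is given in the full paper), the following minimal Python sketch shows how a normalized Shannon entropy of an empirical distribution could be compared against the reported thresholds I₁ = 0.567 and I₂ = 0.806 to label a signal as quasi-periodic, chaotic, or stochastic. The function names, binning choice, and test signal are assumptions introduced here for demonstration.

```python
import numpy as np

# Thresholds reported in the abstract.
I1, I2 = 0.567, 0.806

def normalized_entropy(probabilities):
    """Shannon entropy of a discrete distribution, normalized to [0, 1]
    by the maximum entropy log(n) of a uniform distribution over n states."""
    p = np.asarray(probabilities, dtype=float)
    n = p.size                      # number of possible states (histogram bins)
    p = p[p > 0]                    # drop zero-probability states
    H = -np.sum(p * np.log(p))      # Shannon entropy in nats
    return H / np.log(n)            # normalized informational entropy S

def classify(S):
    """Map normalized entropy S onto the three ranges proposed in the abstract."""
    if S < I1:
        return "quasi-periodic (topological dimension 1)"
    elif S < I2:
        return "chaotic (topological dimension 2)"
    return "stochastic (topological dimension 3)"

# Example: empirical distribution of a noisy sinusoidal signal (hypothetical test case).
signal = np.sin(np.linspace(0, 40 * np.pi, 10_000)) + 0.1 * np.random.randn(10_000)
hist, _ = np.histogram(signal, bins=64)
S = normalized_entropy(hist / hist.sum())
print(f"S = {S:.3f} -> {classify(S)}")
```

The normalization by log(n) is one common convention for mapping entropy onto [0, 1]; the paper's own normalization, based on fixed points of the probability density of information, may differ in detail.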
Key words
information, informational entropy, fractal, chaos, self-similarity