Approximating Markov chains.

ABSTRACT

A common framework of finite-state approximating Markov chains is developed for discrete-time deterministic and stochastic processes. Two types of approximating chains are introduced: (i) chains based on stationary conditional probabilities (time averaging) and (ii) transient chains, based on the fraction of the Lebesgue measure of a cell's image that intersects any given cell. For general dynamical systems, the stationary measures of both approximating chains converge weakly to stationary measures of the true process as the partition width tends to 0. From the governing equations, the transient chains, and hence approximations of all n-time-unit probabilities, can be computed analytically, even though the stationary measure of the true process is typically singular (it has no density function). The transition probabilities between cells account explicitly for the correlation between successive time increments. For dynamical systems defined by uniformly convergent maps on a compact set (e.g., the logistic and Hénon maps), there is also weak continuity in a control parameter. Thus all moments vary continuously with parameter changes, across bifurcations and chaotic regimes. Approximate entropy is identified as the information-theoretic entropy rate of the approximating Markov chains and is suggested as a parameter for turbulence; a discontinuity in the Kolmogorov-Sinai entropy implies that, in the physical world, some degree of coarse graining in a mixing parameter is required.
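One plausible reading of the type-(ii) transient chain, for a map T and partition cells C_i with Lebesgue measure m, is P_ij = m(T(C_i) ∩ C_j) / m(T(C_i)), with the chain's entropy rate H = -Σ_i π_i Σ_j P_ij log P_ij, where π is the chain's stationary distribution; this is a hedged reconstruction from the abstract, not the paper's exact definitions. The Python sketch below illustrates the idea for the logistic map on [0,1]: it estimates each cell-to-cell probability by the fraction of a cell's image landing in the target cell (a Monte Carlo stand-in for the Lebesgue-measure ratio), power-iterates to a stationary distribution, and evaluates the entropy rate. All names, the sampling scheme, and parameter choices are illustrative assumptions.

# Illustrative sketch only: an Ulam-type transient approximating chain for the
# logistic map x -> a*x*(1-x) on [0,1]. Not the paper's exact construction.
import numpy as np

def logistic(x, a=4.0):
    return a * x * (1.0 - x)

def transient_chain(n_cells=100, a=4.0, samples_per_cell=1000, seed=0):
    # P[i, j] ~ fraction of cell i whose image lands in cell j
    # (Monte Carlo stand-in for the Lebesgue-measure ratio).
    rng = np.random.default_rng(seed)
    edges = np.linspace(0.0, 1.0, n_cells + 1)
    P = np.zeros((n_cells, n_cells))
    for i in range(n_cells):
        x = rng.uniform(edges[i], edges[i + 1], samples_per_cell)
        y = logistic(x, a)
        j = np.clip(np.searchsorted(edges, y, side="right") - 1, 0, n_cells - 1)
        P[i] = np.bincount(j, minlength=n_cells) / samples_per_cell
    return P

def stationary_distribution(P, n_iter=1000):
    # Power-iterate a uniform row vector toward the chain's stationary measure.
    pi = np.full(P.shape[0], 1.0 / P.shape[0])
    for _ in range(n_iter):
        pi = pi @ P
        pi /= pi.sum()
    return pi

def entropy_rate(P, pi):
    # Information-theoretic entropy rate H = -sum_i pi_i sum_j P_ij log P_ij.
    with np.errstate(divide="ignore", invalid="ignore"):
        logP = np.where(P > 0, np.log(P), 0.0)
    return float(-(pi[:, None] * P * logP).sum())

P = transient_chain()
pi = stationary_distribution(P)
print("entropy rate of approximating chain:", entropy_rate(P, pi))

In the setting described by the abstract, the Lebesgue-measure ratios can be computed analytically from the governing equations; the Monte Carlo estimate above is only a convenient stand-in for illustration.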
