
False Nearest Neighbors

In the previous section, we estimated the timelag and the embedding dimension de using information-theoretic aspects of the potential energy time series, and there were questions about the convergence of the procedure at high-dimensional embeddings. In this section, we consider a statistic based purely on geometrical considerations to estimate the timelag and de again. With two different approaches, the estimates should be more reliable.

The method of false nearest neighbors [kennel1] examines the fraction of false nearest neighbors as a function of the embedding dimension to determine the global dimension de necessary to unfold the attractor. The minimum embedding dimension is found when most of the nearest neighbors do not move apart significantly in the next higher dimensional embedding. Fig. [*] shows the fraction of false nearest neighbors as a function of the embedding dimension de and the timelag, calculated using the code of Kennel and Abarbanel, which improves upon their original method by accounting for oversampling, autocorrelation at small time delays, and sparse populations over regions of the attractor. A timelag of about 18 and an embedding dimension of 4 or 5 are indicated. Thus both methods indicate a consistent timelag of between 14 and 18 and an embedding dimension of 4 or 5.
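For concreteness, a minimal sketch of the false-nearest-neighbor test is given below (Python with numpy and scipy); this is not the Kennel-Abarbanel code used for the figure. The thresholds r_tol and a_tol and the use of the series standard deviation as the attractor size are conventional illustrative choices, not values taken from the text.

import numpy as np
from scipy.spatial import cKDTree

def fnn_fraction(x, tau, d, r_tol=15.0, a_tol=2.0):
    """Fraction of false nearest neighbors for timelag tau and embedding dimension d."""
    x = np.asarray(x, dtype=float)
    n = len(x) - d * tau  # number of delay vectors that still have a (d+1)-th coordinate
    if n <= 1:
        raise ValueError("time series too short for this (tau, d)")
    # d-dimensional delay vectors y_i = (x_i, x_{i+tau}, ..., x_{i+(d-1)tau})
    Y = np.column_stack([x[k * tau: k * tau + n] for k in range(d)])
    dist, idx = cKDTree(Y).query(Y, k=2)      # k=2: the first hit is the point itself
    r_d, nbr = dist[:, 1], idx[:, 1]          # distance to, and index of, the nearest neighbor
    # growth of the separation when the (d+1)-th coordinate is added
    extra = np.abs(x[d * tau: d * tau + n] - x[d * tau + nbr])
    r_a = x.std()                             # nominal attractor size for the second criterion
    ok = r_d > 0
    false_nbr = (extra[ok] / r_d[ok] > r_tol) | \
                (np.sqrt(r_d[ok]**2 + extra[ok]**2) / r_a > a_tol)
    return false_nbr.mean()

# Scan embedding dimensions at a fixed timelag; the minimum embedding
# dimension is the smallest d at which the fraction drops to near zero.
# for d in range(1, 8):
#     print(d, fnn_fraction(x, tau=18, d=d))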


  
Figure 3: Total and Marginal Redundancy and Autocorrelation

