Exploring Finite Markov Chains by the Systematic Computation of Descriptors
Tuesday, 27 April 2004, 14:30 (Room P4.35, Mathematics Building)
Marcel F. Neuts, Department of Systems and Industrial Engineering, The University of Arizona
We try to gain insight into the deeper physical behavior of a finite Markov chain by systematically computing quantities related to the visits to a string of nested sets of states. The choice of the successive states added to the nested sets is called an exploratory strategy. The strategy is constructed by focusing on the physical property to be explored. Quantities that serve as criteria in one strategy are reported as descriptors for the other strategies. This is a promising tool for the exploration of finite discrete-time Markov chains. Similar methods can be developed for continuous-time chains and Markov renewal processes, but the required computational methods are substantially different. We believe that this methodology may find applications in, among other areas, genetics and linguistics. The existing Markov chain analysis should be complemented by data-analytic procedures applied to real or simulated databases. The exploration in parallel of the Markov chains and suitable data sets can serve to develop the skills needed to gain reliable insights from the models and from the data sets.
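The abstract does not specify which quantities are computed, but one natural descriptor of visits to a nested sequence of state sets is the expected first-passage time into each set. The sketch below, in Python with NumPy, is purely illustrative: the function name, the lazy random walk used as the example chain, and the particular nested sets are all assumptions, not taken from the talk.

```python
import numpy as np

def expected_hitting_times(P, target):
    """Expected number of steps to first enter `target` from each state.

    Solves (I - Q) h = 1 on the complement of `target`, where Q is the
    transition matrix P restricted to the non-target states.
    """
    n = P.shape[0]
    outside = [s for s in range(n) if s not in target]
    Q = P[np.ix_(outside, outside)]
    h = np.linalg.solve(np.eye(len(outside)) - Q, np.ones(len(outside)))
    times = np.zeros(n)  # states inside `target` have hitting time 0
    for i, s in enumerate(outside):
        times[s] = h[i]
    return times

# Illustrative chain: a lazy random walk on a ring of 5 states.
n = 5
P = np.zeros((n, n))
for s in range(n):
    P[s, s] = 0.5
    P[s, (s - 1) % n] = 0.25
    P[s, (s + 1) % n] = 0.25

# One hypothetical exploratory strategy: grow the nested sets one
# state at a time and report the descriptor for each set.
nested = [{0}, {0, 1}, {0, 1, 2}]
for A in nested:
    t = expected_hitting_times(P, A)
    print(sorted(A), np.round(t, 3))
```

As the nested sets grow, the expected time to reach them can only decrease, so comparing these profiles across sets (and across different orders of adding states) is one simple way such a strategy could expose structure in the chain.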