
Probability and statistics by example. Vol. 2, Markov chains : a primer in random processes and their applications / Yuri Suhov, Michael Kelbert.

By: Suhov, Yuri | Kelbert, Michael
Material type: Text
Publication details: Cambridge : Cambridge University Press, c2008.
Description: ix, 487 p. : ill. ; 25 cm
ISBN:
  • 9780521612340 (pbk.)
  • 9780521847674 (cased)
  • 0521847672 (cased)
Other title:
  • Markov chains : a primer in random processes and their applications
DDC classification:
  • 519.2 22
Contents:
Preface (p. vii)
Discrete-time Markov chains (p. 1)
  The Markov property and its immediate consequences (p. 1)
  Class division (p. 17)
  Hitting times and probabilities (p. 26)
  Strong Markov property (p. 35)
  Recurrence and transience: definitions and basic facts (p. 39)
  Recurrence and transience: random walks on lattices (p. 45)
  Equilibrium distributions: definitions and basic facts (p. 52)
  Positive and null recurrence (p. 58)
  Convergence to equilibrium. Long-run proportions (p. 70)
  Detailed balance and reversibility (p. 80)
  Controlled and partially observed Markov chains (p. 89)
  Geometric algebra of Markov chains, I (p. 99)
  Geometric algebra of Markov chains, II (p. 116)
  Geometric algebra of Markov chains, III (p. 130)
  Large deviations for discrete-time Markov chains (p. 138)
  Examination questions on discrete-time Markov chains (p. 155)
Continuous-time Markov chains (p. 185)
  Q-matrices and transition matrices (p. 185)
  Continuous-time Markov chains: definitions and basic constructions (p. 196)
  The Poisson process (p. 210)
  Inhomogeneous Poisson process (p. 231)
  Birth-and-death process. Explosion (p. 240)
  Continuous-time Markov chains with countably many states (p. 250)
  Hitting times and probabilities. Recurrence and transience (p. 266)
  Convergence to an equilibrium distribution. Reversibility (p. 283)
  Applications to queueing theory. Markovian queues (p. 291)
  Examination questions on continuous-time Markov chains (p. 308)
Statistics of discrete-time Markov chains (p. 349)
  Introduction (p. 349)
  Likelihood functions, 1. Maximum likelihood estimators (p. 357)
  Consistency of estimators. Various forms of convergence (p. 366)
  Likelihood functions, 2. Whittle's formula (p. 390)
  Bayesian analysis of Markov chains: prior and posterior distributions (p. 401)
  Elements of control and information theory (p. 415)
  Hidden Markov models, 1. State estimation for Markov chains (p. 434)
  Hidden Markov models, 2. The Baum-Welch learning algorithm (p. 451)
  Generalisations of the Baum-Welch algorithm (p. 461)
Epilogue: Andrei Markov and his Time (p. 479)
Bibliography (p. 483)
Index (p. 485)
Summary: Probability and Statistics are as much about intuition and problem solving as they are about theorem proving. Because of this, students can find it very difficult to make a successful transition from lectures to examinations to practice, since the problems involved can vary so much in nature. Because the subject is critical in many modern applications such as mathematical finance, quantitative management, telecommunications, signal processing and bioinformatics, as well as in traditional ones such as insurance, social science and engineering, the authors have addressed the shortcomings of traditional lecture-based methods by collecting a wealth of exercises with complete solutions, adapted to the needs and skills of students. Following on from the success of Probability and Statistics by Example: Basic Probability and Statistics, the authors here concentrate on random processes, particularly Markov processes, emphasizing models rather than general constructions. Basic mathematical facts are supplied as and when they are needed, and historical information is sprinkled throughout.
Holdings
Item type: Books
Current library: Main library (General Stacks)
Call number: 519.2 / SH.P 2008
Copy number: 1
Status: Available
Barcode: 006339

Includes bibliographical references and index.

