Probability and information : an integrated approach / David Applebaum.
Material type: Text
Publication details: Cambridge : Cambridge University Press, c2008.
Edition: 2nd ed.
Description: xvi, 273 p. : ill. ; 26 cm
ISBN: 9780521727884 (pbk.); 9780521899048 (hbk.)
DDC classification: 519 22
| Item type | Current library | Call number | Copy number | Status | Date due | Barcode |
|---|---|---|---|---|---|---|
| Books | Main library General Stacks | 519 / AP.P 2008 | 1 | Available | | 006338 |
Contents:
- Preface to the second edition (p. xi)
- Preface to the first edition (p. xiii)
- Introduction (p. 1)
  - Chance and information (p. 1)
  - Mathematical models of chance phenomena (p. 2)
  - Mathematical structure and mathematical proof (p. 5)
  - Plan of this book (p. 7)
- Combinatorics (p. 10)
  - Counting (p. 10)
  - Arrangements (p. 11)
  - Combinations (p. 13)
  - Multinomial coefficients (p. 16)
  - The gamma function (p. 18)
  - Exercises (p. 19)
  - Further reading (p. 21)
- Sets and measures (p. 22)
  - The concept of a set (p. 22)
  - Set operations (p. 25)
  - Boolean algebras (p. 29)
  - Measures on Boolean algebras (p. 32)
  - Exercises (p. 37)
  - Further reading (p. 40)
- Probability (p. 41)
  - The concept of probability (p. 41)
  - Probability in practice (p. 43)
  - Conditional probability (p. 48)
  - Independence (p. 55)
  - The interpretation of probability (p. 57)
  - The historical roots of probability (p. 62)
  - Exercises (p. 64)
  - Further reading (p. 68)
- Discrete random variables (p. 70)
  - The concept of a random variable (p. 70)
  - Properties of random variables (p. 72)
  - Expectation and variance (p. 78)
  - Covariance and correlation (p. 83)
  - Independent random variables (p. 86)
  - I.I.D. random variables (p. 89)
  - Binomial and Poisson random variables (p. 91)
  - Geometric, negative binomial and hypergeometric random variables (p. 95)
  - Exercises (p. 99)
  - Further reading (p. 104)
- Information and entropy (p. 105)
  - What is information? (p. 105)
  - Entropy (p. 108)
  - Joint and conditional entropies; mutual information (p. 111)
  - The maximum entropy principle (p. 115)
  - Entropy, physics and life (p. 117)
  - The uniqueness of entropy (p. 119)
  - Exercises (p. 123)
  - Further reading (p. 125)
- Communication (p. 127)
  - Transmission of information (p. 127)
  - The channel capacity (p. 130)
  - Codes (p. 132)
  - Noiseless coding (p. 137)
  - Coding and transmission with noise: Shannon's theorem (p. 143)
  - Brief remarks about the history of information theory (p. 150)
  - Exercises (p. 151)
  - Further reading (p. 153)
- Random variables with probability density functions (p. 155)
  - Random variables with continuous ranges (p. 155)
  - Probability density functions (p. 157)
  - Discretisation and integration (p. 161)
  - Laws of large numbers (p. 164)
  - Normal random variables (p. 167)
  - The central limit theorem (p. 172)
  - Entropy in the continuous case (p. 179)
  - Exercises (p. 182)
  - Further reading (p. 186)
- Random vectors (p. 188)
  - Cartesian products (p. 188)
  - Boolean algebras and measures on products (p. 191)
  - Distributions of random vectors (p. 193)
  - Marginal distributions (p. 199)
  - Independence revisited (p. 201)
  - Conditional densities and conditional entropy (p. 204)
  - Mutual information and channel capacity (p. 208)
  - Exercises (p. 212)
  - Further reading (p. 216)
- Markov chains and their entropy (p. 217)
  - Stochastic processes (p. 217)
  - Markov chains (p. 219)
  - The Chapman-Kolmogorov equations (p. 224)
  - Stationary processes (p. 227)
  - Invariant distributions and stationary Markov chains (p. 229)
  - Entropy rates for Markov chains (p. 235)
  - Exercises (p. 240)
  - Further reading (p. 243)
- Exploring further (p. 245)
- Proof by mathematical induction (p. 247)
- Lagrange multipliers (p. 249)
- Integration of exp(-x²/2) (p. 252)
- Table of probabilities associated with the standard normal distribution (p. 254)
- A rapid review of matrix algebra (p. 256)
- Selected solutions (p. 260)
- Index (p. 268)
This updated textbook is an excellent way to introduce probability and information theory to new students of mathematics, computer science, engineering, statistics, economics, or business studies. Requiring only a knowledge of basic calculus, it begins by building a clear and systematic foundation for the subject: the concept of probability is given particular attention via a simplified discussion of measures on Boolean algebras. The theoretical ideas are then applied to practical areas such as statistical inference, random walks, statistical mechanics and communications modelling. Topics covered include discrete and continuous random variables, entropy and mutual information, maximum entropy methods, the central limit theorem, and the coding and transmission of information; material on Markov chains and their entropy is new to this edition. Numerous examples and exercises illustrate how to use the theory in a wide range of applications, and detailed solutions to most exercises are available online for instructors.