Information theory, inference, and learning algorithms (Record no. 6899)

MARC details
008 - FIXED-LENGTH DATA ELEMENTS--GENERAL INFORMATION
fixed length control field 080304s2007 enka b 001 0 eng
020 ## - INTERNATIONAL STANDARD BOOK NUMBER
International Standard Book Number 0521642981
035 ## - SYSTEM CONTROL NUMBER
System control number (Sirsi) u8
040 ## - CATALOGING SOURCE
Original cataloging agency EG-CaNU
Transcribing agency EG-CaNU
Modifying agency EG-CaNU
042 ## - AUTHENTICATION CODE
Authentication code ncode
082 ## - DEWEY DECIMAL CLASSIFICATION NUMBER
Classification number 003.54
Edition number 22
100 1# - MAIN ENTRY--PERSONAL NAME
Personal name MacKay, David J. C.
9 (RLIN) 14450
245 10 - TITLE STATEMENT
Title Information theory, inference, and learning algorithms /
Statement of responsibility, etc. David J.C. MacKay.
260 ## - PUBLICATION, DISTRIBUTION, ETC.
Place of publication, distribution, etc. Cambridge, UK ;
-- New York :
Name of publisher, distributor, etc. Cambridge University Press,
Date of publication, distribution, etc. 2007.
300 ## - PHYSICAL DESCRIPTION
Extent xii, 628 p. :
Other physical details ill. ;
Dimensions 26 cm.
504 ## - BIBLIOGRAPHY, ETC. NOTE
Bibliography, etc. note Includes bibliographical references (p. 613-619) and index.
505 0# - FORMATTED CONTENTS NOTE
Formatted contents note 1. Introduction to information theory -- 2. Probability, entropy and inference -- 3. More about inference -- Part I. Data Compression: 4. The source coding theorem -- 5. Symbol codes -- 6. Stream codes -- 7. Codes for integers -- Part II. Noisy-Channel Coding: 8. Dependent random variables -- 9. Communication over a noisy channel -- 10. The noisy-channel coding theorem -- 11. Error-correcting codes and real channels -- Part III. Further Topics in Information Theory: 12. Hash codes -- 13. Binary codes -- 14. Very good linear codes exist -- 15. Further exercises on information theory -- 16. Message passing -- 17. Constrained noiseless channels -- 18. Crosswords and codebreaking -- 19. Why have sex? Information acquisition and evolution -- Part IV. Probabilities and Inference: 20. An example inference task: clustering -- 21. Exact inference by complete enumeration -- 22. Maximum likelihood and clustering -- 23. Useful probability distributions -- 24. Exact marginalization -- 25. Exact marginalization in trellises -- 26. Exact marginalization in graphs -- 27. Laplace's method -- 28. Model comparison and Occam's razor -- 29. Monte Carlo methods -- 30. Efficient Monte Carlo methods -- 31. Ising models -- 32. Exact Monte Carlo sampling -- 33. Variational methods -- 34. Independent component analysis -- 35. Random inference topics -- 36. Decision theory -- 37. Bayesian inference and sampling theory -- Part V. Neural Networks: 38. Introduction to neural networks -- 39. The single neuron as a classifier -- 40. Capacity of a single neuron -- 41. Learning as inference -- 42. Hopfield networks -- 43. Boltzmann machines -- 44. Supervised learning in multilayer networks -- 45. Gaussian processes -- 46. Deconvolution -- Part VI. Sparse Graph Codes: 47. Low-density parity-check codes -- 48. Convolutional codes and turbo codes -- 49. Repeat-accumulate codes -- 50. Digital fountain codes.
520 ## - SUMMARY, ETC.
Summary, etc. Information theory and inference, taught together in this exciting textbook, lie at the heart of many important areas of modern technology - communication, signal processing, data mining, machine learning, pattern recognition, computational neuroscience, bioinformatics and cryptography. The book introduces theory in tandem with applications. Information theory is taught alongside practical communication systems such as arithmetic coding for data compression and sparse-graph codes for error-correction. Inference techniques, including message-passing algorithms, Monte Carlo methods and variational approximations, are developed alongside applications to clustering, convolutional codes, independent component analysis, and neural networks. Uniquely, the book covers state-of-the-art error-correcting codes, including low-density parity-check codes, turbo codes, and digital fountain codes - the twenty-first-century standards for satellite communications, disk drives, and data broadcast. Richly illustrated, filled with worked examples and over 400 exercises, some with detailed solutions, the book is ideal for self-learning and for undergraduate or graduate courses. It also provides an unparalleled entry point for professionals in areas as diverse as computational biology, financial engineering and machine learning.
596 ## -
-- 1
650 #0 - SUBJECT ADDED ENTRY--TOPICAL TERM
Topical term or geographic name entry element Information theory.
9 (RLIN) 14451
Holdings
Source of classification or shelving scheme Dewey Decimal Classification
Home library Main library
Current library Main library
Shelving location General Stacks
Date acquired 01/26/2020
Source of acquisition PURCHASE
Full call number 003.54 / MA.I 2007
Barcode 000233
Date last seen 11/24/2019
Copy number 1
Price effective from 11/24/2019
Koha item type Books