An Introduction to Information Theory
- List Price: $22.95
- Binding: Paperback
- Publisher: Dover Pubns
- Publish date: 10/01/1994
Description:
PREFACE
CHAPTER 1 Introduction
1-1. Communication Processes
1-2. A Model for a Communication System
1-3. A Quantitative Measure of Information
1-4. A Binary Unit of Information
1-5. Sketch of the Plan
1-6. Main Contributors to Information Theory
1-7. An Outline of Information Theory
Part 1: Discrete Schemes without Memory
CHAPTER 2 Basic Concepts of Probability
2-1. Intuitive Background
2-2. Sets
2-3. Operations on Sets
2-4. Algebra of Sets
2-5. Functions
2-6. Sample Space
2-7. Probability Measure
2-8. Frequency of Events
2-9. Theorem of Addition
2-10. Conditional Probability
2-11. Theorem of Multiplication
2-12. Bayes's Theorem
2-13. Combinatorial Problems in Probability
2-14. Trees and State Diagrams
2-15. Random Variables
2-16. Discrete Probability Functions and Distribution
2-17. Bivariate Discrete Distributions
2-18. Binomial Distribution
2-19. Poisson's Distribution
2-20. Expected Value of a Random Variable
CHAPTER 3 Basic Concepts of Information Theory: Memoryless Finite Schemes
3-1. A Measure of Uncertainty
3-2. An Intuitive Justification
3-3. Formal Requirements for the Average Uncertainty
3-4. H Function as a Measure of Uncertainty
3-5. An Alternative Proof That the Entropy Function Possesses a Maximum
3-6. Sources and Binary Sources
3-7. Measure of Information for Two-dimensional Discrete Finite Probability Schemes
3-8. Conditional Entropies
3-9. A Sketch of a Communication Network
3-10. Derivation of the Noise Characteristics of a Channel
3-11. Some Basic Relationships among Different Entropies
3-12. A Measure of Mutual Information
3-13. Set-theory Interpretation of Shannon's Fundamental Inequalities
3-14. Redundancy, Efficiency, and Channel Capacity
3-15. Capacity of Channels with Symmetric Noise Structure
3-16. BSC and BEC
3-17. Capacity of Binary Channels
3-18. Binary Pulse Width Communication Channel
3-19. Uniqueness of the Entropy Function
CHAPTER 4 Elements of Encoding
4-1. The Purpose of Encoding
4-2. Separable Binary Codes
4-3. Shannon-Fano Encoding
4-4. Necessary and Sufficient Conditions for Noiseless Coding
4-5. A Theorem on Decodability
4-6. Average Length of Encoded Messages
4-7. Shannon's Binary Encoding
4-8. Fundamental Theorem of Discrete Noiseless Coding
4-9. Huffman's Minimum-redundancy Code
4-10. Gilbert-Moore Encoding
4-11. Fundamental Theorem of Discrete Encoding in Presence of Noise
4-12. Error-detecting and Error-correcting Codes
4-13. Geometry of the Binary Code Space
4-14. Hamming's Single-error Correcting Code
4-15. Elias's Iteration Technique
4-16. A Mathematical Proof of the Fundamental Theorem of Information Theory for Discrete BSC
4-17. Encoding the English Alphabet
Part 2: Continuum without Memory
CHAPTER 5 Continuous Probability Distribution and Density
5-1. Continuous Sample Space
5-2. Probability Distribution Functions
5-3. Probability Density Function
5-4. Normal Distribution
5-5. Cauchy's Distribution
5-6. Exponential Distribution
5-7. Multidimensional Random Variables
5-8. Joint Distribution of Two Variables: Marginal Distribution
5-9. Conditional Probability Distribution and Density
5-10. Bivariate Normal Distribution
5-11. Functions of Random Variables
5-12. Transformation from Cartesian to Polar Coordinate System
CHAPTER 6 Statistical Averages
6-1. Expected Values; Discrete Case
6-2. Expectation of Sums and Products of a Finite Number of Independent Discrete Random Variables
6-3. Moments of a Univariate Random Variable
6-4. Two Inequalities
6-5. Moments of Bivariate Random Variables
6-6. Correlation Coefficient
6-7. Linear Combination of Random Variables
6-8. Moments of Some Common Distribution Functions
6-9. Characteristic Function of a Random Variable
6-10. Characteristic Function and Moment-generating Function of Random Variables
6-11. Density Functions of the Sum of Two Random Variables
CHAPTER 7 Normal Distributions and Limit Theorems
7-1. Bivariate Normal Considered as an Extension of One-dimensional Normal Distribution
7-2. Multinormal Distribution
7-3. Linear Combination of Normally Distributed Independent Random Variables
7-4. Central-limit Theorem
7-5. A Simple Random-walk Problem
7-6. Approximation of the Binomial Distribution by the Normal Distribution
7-7. Approximation of Poisson Distribution by a Normal Distribution
7-8. The Laws of Large Numbers
CHAPTER 8 Continuous Channel without Memory
8-1. Definition of Different Entropies
8-2. The Nature of Mathematical Difficulties Involved
8-3. Infiniteness of Continuous Entropy
8-4. The Variability of the Entropy in the Continuous Case with Coordinate Systems
8-5. A Measure of Information in the Continuous Case
8-6. Maximization of the Entropy of a Continuous Random Variable
8-7. Entropy Maximization Problems
8-8. Gaussian Noisy Channels
8-9. Transmission of Information in the Presence of Additive Noise
8-10. Channel Capacity in Presence of Gaussian Additive Noise and Specified Transmitter and Noise Average Power
8-11. Relation Between the Entropies of Two Related Random Variables
8-12. Note on the Definition of Mutual Information
CHAPTER 9 Transmission of Band-limited Signals
9-1. Introduction
9-2. Entropies of Continuous Multivariate Distributions
9-3. Mutual Information of Two Gaussian Random Vectors
9-4. A Channel-capacity Theorem for Additive Gaussian Noise
9-5. Digression
9-6. Sampling Theorem
9-7. A Physical Interpretation of the Sampling Theorem
9-8. The Concept of a Vector Space
9-9. Fourier-series Signal Space
9-10. Band-limited Signal Space
9-11. Band-limited Ensembles
9-12. Entropies of Band-limited Ensemble in Signal Space
9-13. A Mathematical Model for Communication of Continuous Signals
9-14. Optimal Decoding
9-15. A Lower Bound for the Probability of Error
9-16. An Upper Bound for the Probability of Error
9-17. Fundamental Theorem of Continuous Memoryless Channels in Presence of Additive Noise
9-18. Thomasian's Estimate
Part 3: Schemes with Memory
CHAPTER 10 Stochastic Processes
10-1. Stochastic Theory
10-2. Examples of a Stochastic Process
10-3. Moments and Expectations
10-4. Stationary Processes
10-5. Ergodic Processes
10-6. Correlation Coefficients and Correlation Functions
10-7. Example of a Normal Stochastic Process
10-8. Examples of Computation of Correlation Functions
10-9. Some Elementary Properties of Correlation Functions of Stationary Processes
10-10. Power Spectra and Correlation Functions
10-11. Response of Linear Lumped Systems to Ergodic Excitation
10-12. Stochastic Limits and Convergence
10-13. Stochastic Differentiation and Integration
10-14. Gaussian-process Example of a Stationary Process
10-15. The Over-all Mathematical Structure of the Stochastic Processes
10-16. A Relation between Positive Definite Functions and Theory of Probability
CHAPTER 11 Communication under Stochastic Regimes
11-1. Stochastic Nature of Communication
11-2. Finite Markov Chains
11-3. A Basic Theorem on Regular Markov Chains
11-4. Entropy of a Simple Markov Chain
11-5. Entropy of a Discrete Stationary Source
11-6. Discrete Channels with Finite Memory
11-7. Connection of the Source and the Discrete Channel with Memory
11-8. Connection of a Stationary Source to a Stationary Channel
Part 4: Some Recent Developments
CHAPTER 12 The Fundamental Theorem of Information Theory
PRELIMINARIES
12-1. A Decision Scheme
12-2. The Probability of Error in a Decision Scheme
12-3. A Relation between Error Probability and Equivocation
12-4. The Extension of Discrete Memoryless Noisy Channels
FEINSTEIN'S PROOF
12-5. On Certain Random Variables Associated with a Communication System
12-6. Feinstein's Lemma
12-7. Completion of the Proof
SHANNON'S PROOF
12-8. Ensemble Codes
12-9. A Relation between Transinformation and Error Probability
12-10. An Exponential Bound for Error Probability
WOLFOWITZ'S PROOF
12-11. The Code Book
12-12. A Lemma and Its Application
12-13. Estimation of Bounds
12-14. Completion of Wolfowitz's Proof
CHAPTER 13 Group Codes
13-1. Introduction
13-2. The Concept of a Group
13-3. Fields and Rings
13-4. Algebra for Binary n-Digit Words
13-5. Hamming's Codes
13-6. Group Codes
13-7. A Detection Scheme for Group Codes
13-8. Slepian's Technique for Single-error Correcting Group Codes
13-9. Further Notes on Group Codes
13-10. Some Bounds on the Number of Words in a Systematic Code
APPENDIX Additional Notes and Tables
N-1. The Gambler with a Private Wire
N-2. Some Remarks on the Sampling Theorem
N-3. Analytic Signals and the Uncertainty Relation
N-4. Elias's Proof of the Fundamental Theorem for BSC
N-5. Further Remarks on Coding Theory
N-6. Partial Ordering of Channels
N-7. Information Theory and Radar Problems
T-1. Normal Probability Integral
T-2. Normal Distributions
T-3. A Summary of Some Common Probability Functions
T-4. Probability of No Error for Best Group Code
T-5. Parity-check Rules for Best Group Alphabets
T-6. Logarithms to the Base 2
T-7. Entropy of a Discrete Binary Source
BIBLIOGRAPHY
NAME INDEX
SUBJECT INDEX
Product notice
Returnable at the third-party seller's discretion; may come without consumable supplements such as access codes, CDs, or workbooks.
Seller | Condition | Comments | Price |
BookReadingInc | Acceptable | | $21.88 |
BookReadingInc | Good | | $22.24 |
Firefly Bookstore | Good | | $14.61 |
Ergodebooks | Good | | $16.25 |
Websew.com Inc | New | | $17.68 |
readmybooks | Acceptable | | $21.71 |
readmybooks | Good | | $21.91 |
Bingo Used Books | Like New | | $22.22 |
Best and Fastest Books | Very Good | | $24.46 |
GridFreed | New | | $77.55 |