Decorative items shown are not included.
Language:
English
14.85 €*
Free shipping via Post / DHL
Delivery time: 2-3 weeks
Categories:
Description
First comprehensive introduction to information theory explores the work of Shannon, McMillan, Feinstein, and Khinchin. Topics include the entropy concept in probability theory, fundamental theorems, and other subjects. 1957 edition.
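As a small taste of the book's opening topic, the entropy of a "finite scheme" (a finite probability distribution) is the quantity Khinchin axiomatizes in his first chapter. A minimal sketch in Python (the function name and examples are illustrative, not from the book):

```python
import math

def entropy(probs):
    """Shannon entropy, in bits, of a finite scheme:
    a finite probability distribution p1, ..., pn summing to 1."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly 1 bit of uncertainty,
# the maximum for a scheme with two outcomes.
print(entropy([0.5, 0.5]))   # 1.0
# A biased coin carries less.
print(entropy([0.9, 0.1]))
```

This maximum-at-uniform behavior is one of the properties the book's uniqueness theorem shows characterizes entropy.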
Table of Contents
The Entropy Concept In Probability Theory
1. Entropy of Finite Schemes
2. The Uniqueness Theorem
3. Entropy of Markov chains
4. Fundamental Theorems
5. Application to Coding Theory
On the Fundamental Theorems of Information Theory
INTRODUCTION
CHAPTER I. Elementary Inequalities
1. Two generalizations of Shannon's inequality
2. Three inequalities of Feinstein
CHAPTER II. Ergodic Sources
3. Concept of a source. Stationarity. Entropy
4. Ergodic Sources
5. The E property. McMillan's theorem.
6. The martingale concept. Doob's theorem.
7. Auxiliary propositions
8. Proof of McMillan's theorem.
CHAPTER III. Channels and the sources driving them
9. Concept of channel. Noise. Stationarity. Anticipation and memory
10. Connection of the channel to the source
11. The ergodic case
CHAPTER IV. Feinstein's Fundamental Lemma
12. Formulation of the problem
13. Proof of the lemma
CHAPTER V. Shannon's Theorems
14. Coding
15. The first Shannon theorem
16. The second Shannon theorem
CONCLUSION
REFERENCES
Details
Genre: | Imports, Technology (general) |
---|---|
Category: | Science & Technology |
Medium: | Paperback |
Format: | Paperback / softcover |
ISBN-13: | 9780486604343 |
ISBN-10: | 0486604349 |
UPC: | 800759604340 |
EAN: | 0800759604340 |
Language: | English |
Binding: | Paperback / softcover |
Author: | Khinchin, A. Ya. |
Translation: | Silverman; Friedman |
Publisher: | Dover Publications |
Responsible person for the EU: | Libri GmbH, Europaallee 1, D-36244 Bad Hersfeld, gpsr@libri.de |
Dimensions: | 207 x 139 x 7 mm |
By: | A. Ya. Khinchin |
Weight: | 0.145 kg |
Safety Notice