Book Introduction

Information Theory and Network Coding | PDF / EPUB / MOBI / Kindle ebook, Baidu cloud download

Information Theory and Network Coding
  • Author: Raymond W. Yeung
  • Publisher: Springer US
  • ISBN: 0387792333
  • Publication year: 2008
  • Listed page count: 582
  • File size: 58 MB
  • File page count: 605

PDF Download


Click here for the online PDF download of this book [recommended: cloud extraction, quick and convenient]. Direct PDF download, works on both mobile and PC.

Torrent download [faster via BT]. Note: please use the BT client FDM (see the software download page). Direct-link download [convenient but slow]. [Read the book online] [Get the extraction password online]

Download Notes

Information Theory and Network Coding, PDF ebook download

The downloaded file is a RAR archive. Use extraction software to unpack it and obtain the book in PDF format.

We recommend downloading with the BT client Free Download Manager (FDM), which is free, ad-free, and cross-platform. All resources on this site are packaged as BT torrents, so a dedicated BT client is required, such as BitComet, qBittorrent, or uTorrent. Thunder (Xunlei) is currently not recommended, since this is not a popular resource on its network; if the resource becomes popular later, Thunder will work as well.

(The file page count should be greater than the listed page count, except for multi-volume ebooks.)

Note: all archives on this site share an extraction password. Click here to download the extraction tool.

Table of Contents

1 The Science of Information

Part I Components of Information Theory

2 Information Measures
2.1 Independence and Markov Chains
2.2 Shannon's Information Measures
2.3 Continuity of Shannon's Information Measures for Fixed Finite Alphabets
2.4 Chain Rules
2.5 Informational Divergence
2.6 The Basic Inequalities
2.7 Some Useful Information Inequalities
2.8 Fano's Inequality
2.9 Maximum Entropy Distributions
2.10 Entropy Rate of a Stationary Source
Appendix 2.A: Approximation of Random Variables with Countably Infinite Alphabets by Truncation
Chapter Summary
Problems
Historical Notes

3 The I-Measure
3.1 Preliminaries
3.2 The I-Measure for Two Random Variables
3.3 Construction of the I-Measure μ*
3.4 μ* Can Be Negative
3.5 Information Diagrams
3.6 Examples of Applications
Appendix 3.A: A Variation of the Inclusion-Exclusion Formula
Chapter Summary
Problems
Historical Notes

4 Zero-Error Data Compression
4.1 The Entropy Bound
4.2 Prefix Codes
4.2.1 Definition and Existence
4.2.2 Huffman Codes
4.3 Redundancy of Prefix Codes
Chapter Summary
Problems
Historical Notes

5 Weak Typicality
5.1 The Weak AEP
5.2 The Source Coding Theorem
5.3 Efficient Source Coding
5.4 The Shannon-McMillan-Breiman Theorem
Chapter Summary
Problems
Historical Notes

6 Strong Typicality
6.1 Strong AEP
6.2 Strong Typicality Versus Weak Typicality
6.3 Joint Typicality
6.4 An Interpretation of the Basic Inequalities
Chapter Summary
Problems
Historical Notes

7 Discrete Memoryless Channels
7.1 Definition and Capacity
7.2 The Channel Coding Theorem
7.3 The Converse
7.4 Achievability
7.5 A Discussion
7.6 Feedback Capacity
7.7 Separation of Source and Channel Coding
Chapter Summary
Problems
Historical Notes

8 Rate-Distortion Theory
8.1 Single-Letter Distortion Measures
8.2 The Rate-Distortion Function R(D)
8.3 The Rate-Distortion Theorem
8.4 The Converse
8.5 Achievability of R_I(D)
Chapter Summary
Problems
Historical Notes

9 The Blahut-Arimoto Algorithms
9.1 Alternating Optimization
9.2 The Algorithms
9.2.1 Channel Capacity
9.2.2 The Rate-Distortion Function
9.3 Convergence
9.3.1 A Sufficient Condition
9.3.2 Convergence to the Channel Capacity
Chapter Summary
Problems
Historical Notes

10 Differential Entropy
10.1 Preliminaries
10.2 Definition
10.3 Joint Differential Entropy, Conditional (Differential) Entropy, and Mutual Information
10.4 The AEP for Continuous Random Variables
10.5 Informational Divergence
10.6 Maximum Differential Entropy Distributions
Chapter Summary
Problems
Historical Notes

11 Continuous-Valued Channels
11.1 Discrete-Time Channels
11.2 The Channel Coding Theorem
11.3 Proof of the Channel Coding Theorem
11.3.1 The Converse
11.3.2 Achievability
11.4 Memoryless Gaussian Channels
11.5 Parallel Gaussian Channels
11.6 Correlated Gaussian Channels
11.7 The Bandlimited White Gaussian Channel
11.8 The Bandlimited Colored Gaussian Channel
11.9 Zero-Mean Gaussian Noise Is the Worst Additive Noise
Chapter Summary
Problems
Historical Notes

12 Markov Structures
12.1 Conditional Mutual Independence
12.2 Full Conditional Mutual Independence
12.3 Markov Random Field
12.4 Markov Chain
Chapter Summary
Problems
Historical Notes

13 Information Inequalities
13.1 The Region Γ*n
13.2 Information Expressions in Canonical Form
13.3 A Geometrical Framework
13.3.1 Unconstrained Inequalities
13.3.2 Constrained Inequalities
13.3.3 Constrained Identities
13.4 Equivalence of Constrained Inequalities
13.5 The Implication Problem of Conditional Independence
Chapter Summary
Problems
Historical Notes

14 Shannon-Type Inequalities
14.1 The Elemental Inequalities
14.2 A Linear Programming Approach
14.2.1 Unconstrained Inequalities
14.2.2 Constrained Inequalities and Identities
14.3 A Duality
14.4 Machine Proving - ITIP
14.5 Tackling the Implication Problem
14.6 Minimality of the Elemental Inequalities
Appendix 14.A: The Basic Inequalities and the Polymatroidal Axioms
Chapter Summary
Problems
Historical Notes

15 Beyond Shannon-Type Inequalities
15.1 Characterizations of Γ*2, Γ*3, and Γ*n
15.2 A Non-Shannon-Type Unconstrained Inequality
15.3 A Non-Shannon-Type Constrained Inequality
15.4 Applications
Chapter Summary
Problems
Historical Notes

16 Entropy and Groups
16.1 Group Preliminaries
16.2 Group-Characterizable Entropy Functions
16.3 A Group Characterization of Γ*n
16.4 Information Inequalities and Group Inequalities
Chapter Summary
Problems
Historical Notes

Part II Fundamentals of Network Coding

17 Introduction
17.1 The Butterfly Network
17.2 Wireless and Satellite Communications
17.3 Source Separation
Chapter Summary
Problems
Historical Notes

18 The Max-Flow Bound
18.1 Point-to-Point Communication Networks
18.2 Examples Achieving the Max-Flow Bound
18.3 A Class of Network Codes
18.4 Proof of the Max-Flow Bound
Chapter Summary
Problems
Historical Notes

19 Single-Source Linear Network Coding: Acyclic Networks
19.1 Acyclic Networks
19.2 Linear Network Codes
19.3 Desirable Properties of a Linear Network Code
19.3.1 Transformation of a Linear Network Code
19.3.2 Implementation of a Linear Network Code
19.4 Existence and Construction
19.5 Generic Network Codes
19.6 Static Network Codes
19.7 Random Network Coding: A Case Study
19.7.1 How the System Works
19.7.2 Model and Analysis
Chapter Summary
Problems
Historical Notes

20 Single-Source Linear Network Coding: Cyclic Networks
20.1 Delay-Free Cyclic Networks
20.2 Convolutional Network Codes
20.3 Decoding of Convolutional Network Codes
Chapter Summary
Problems
Historical Notes

21 Multi-source Network Coding
21.1 The Max-Flow Bounds
21.2 Examples of Application
21.2.1 Multilevel Diversity Coding
21.2.2 Satellite Communication Network
21.3 A Network Code for Acyclic Networks
21.4 The Achievable Information Rate Region
21.5 Explicit Inner and Outer Bounds
21.6 The Converse
21.7 Achievability
21.7.1 Random Code Construction
21.7.2 Performance Analysis
Chapter Summary
Problems
Historical Notes

Bibliography

Index
