Information Theory and Coding-CH 6
Communication Engineering Specialty Courses

Professional core courses: Information Theory and Coding, Communication Principles, Television Principles, Electromagnetic Fields and Waves, Antennas and Radio Wave Propagation.

Radio and television transmission track: Digital Television Technology, Radio and Television Transmission Technology, Digital Broadcasting Technology.

Mobile communication track: Mobile Communication, Modern Switching Technology, Mobile Television Technology.

Information Theory and Coding: This course focuses on the basic theory of information, covering the types and characteristics of information sources, source entropy, channel capacity, and the rate-distortion function, together with the basic concepts and principal methods of source coding and channel coding. These theories and methods apply not only to communications in the usual sense, such as digital audio/video processing and multimedia communication, but also to specialized fields of computer information processing and management such as information security.

Communication Principles: Against the background of widely deployed communication systems and the technologies that represent current development trends, this course systematically introduces the fundamentals of digital communication, providing students with the theoretical foundation and practical knowledge for future work in the field.
2.1 How many times the information of a binary pulse is carried by a quaternary pulse? By an octal pulse?

Solution: A quaternary pulse can represent 4 distinct messages, e.g. {0, 1, 2, 3}; an octal pulse can represent 8, e.g. {0, 1, ..., 7}; a binary pulse can represent 2, e.g. {0, 1}. Assuming every message is equiprobable:

Quaternary: H(X1) = log2 n = log2 4 = 2 bit/symbol
Octal: H(X2) = log2 n = log2 8 = 3 bit/symbol
Binary: H(X0) = log2 n = log2 2 = 1 bit/symbol

Therefore quaternary and octal pulses carry 2 and 3 times the information of a binary pulse, respectively.
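The equiprobable-source entropies above can be checked numerically; this is a minimal sketch using only the standard library (the function name is mine, not from the text):

```python
import math

# For a source of n equiprobable messages, H = log2(n) bit/symbol
def pulse_entropy(n: int) -> float:
    return math.log2(n)

print(pulse_entropy(4))  # 2.0 bit/symbol (quaternary)
print(pulse_entropy(8))  # 3.0 bit/symbol (octal)
print(pulse_entropy(2))  # 1.0 bit/symbol (binary)
```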
2.2 In a certain region, 25% of the girls are college students; among female college students, 75% are taller than 160 cm; and half of all the girls are taller than 160 cm. How much information do we gain from the message "a certain girl taller than 160 cm is a college student"?

Solution: Let the random variable X denote education: x1 (college student) with P = 0.25, x2 (not a college student) with P = 0.75. Let Y denote height: y1 (taller than 160 cm) with P = 0.5, y2 (not taller than 160 cm) with P = 0.5. We are given that 75% of female college students are taller than 160 cm, i.e. p(y1|x1) = 0.75. The required information is

I(x1|y1) = -log2 p(x1|y1) = -log2 [p(x1) p(y1|x1) / p(y1)] = -log2 (0.25 × 0.75 / 0.5) = 1.415 bit

2.3 A deck of 52 cards is thoroughly shuffled. (1) How much information is given by any one particular ordering of the deck? (2) If 13 cards are drawn and all their ranks are different, how much information is given?

Solution: (1) There are 52! possible orderings; assuming each is equiprobable, the information is

I(xi) = -log2 p(xi) = log2 52! = 225.581 bit

(2) The deck has 4 suits and 13 ranks, so the probability of drawing 13 cards with all different ranks is p(xi) = 4^13 / C(52,13), and

I(xi) = -log2 p(xi) = -log2 [4^13 / C(52,13)] = 13.208 bit

2.4 A discrete memoryless source has alphabet X = (x1=0, x2=1, x3=2, x4=3) with probabilities P(X) = (3/8, 1/4, 1/4, 1/8). It emits the message (202120130213001203210110321010021032011223210). (1) What is the self-information of this message? (2) On average, how much information does each symbol of the message carry?

Solution: (1) The message contains 14 zeros, 13 ones, 12 twos, and 6 threes, so its probability is

p = (3/8)^14 × (1/4)^25 × (1/8)^6

and its self-information is I = -log2 p = 87.811 bit.

(2) The message has 45 symbols, so the average information per symbol is I/n = 87.811/45 = 1.951 bit.

2.5 Extensive statistics show that 7% of men and 0.5% of women are red-green color-blind. If you ask a man "Are you color-blind?", his answer may be "yes" or "no". How much information does each answer carry, and how much does an answer carry on average? If you ask a woman, what is the average self-information of her answer?

Solution: For a man:

p(xY) = 7%, so I(xY) = -log2 p(xY) = -log2 0.07 = 3.837 bit
p(xN) = 93%, so I(xN) = -log2 p(xN) = -log2 0.93 = 0.105 bit
H(X) = -Σ p(xi) log2 p(xi) = -(0.07 log2 0.07 + 0.93 log2 0.93) = 0.366 bit/symbol

For a woman:

H(X) = -Σ p(xi) log2 p(xi) = -(0.005 log2 0.005 + 0.995 log2 0.995) = 0.045 bit/symbol

2.6 A source has X = (x1, x2, x3, x4, x5, x6) with P(X) = (0.2, 0.19, 0.18, 0.17, 0.16, 0.17). Find the entropy of this source, and explain why H(X) > log 6 does not violate the extremal property of source entropy. (Note: H(X) = 2.657 bit > log2 6 = 2.585 bit, but the given probabilities sum to 1.07 > 1, so P(X) is not a valid probability distribution; the extremal property H(X) ≤ log n assumes the probabilities sum to 1.)
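The numerical answers in problems 2.2 through 2.5 can be verified with a few lines of Python (a sketch; the variable names are my own):

```python
from math import comb, factorial, log2

# 2.2: I(x1|y1) = -log2[p(x1) p(y1|x1) / p(y1)]
I_22 = -log2(0.25 * 0.75 / 0.5)

# 2.3(1): one particular ordering of 52 cards
I_23a = log2(factorial(52))
# 2.3(2): 13 cards drawn, all ranks different
I_23b = -log2(4**13 / comb(52, 13))

# 2.4: message with 14 zeros, 13 ones, 12 twos, 6 threes (45 symbols)
I_24 = -log2((3/8)**14 * (1/4)**25 * (1/8)**6)

# 2.5: entropy of a yes/no answer about color blindness
H_man = -(0.07 * log2(0.07) + 0.93 * log2(0.93))
H_woman = -(0.005 * log2(0.005) + 0.995 * log2(0.995))

print(round(I_22, 3), round(I_23a, 3), round(I_23b, 3))  # 1.415 225.581 13.208
print(round(I_24, 3), round(I_24 / 45, 3))               # 87.811 1.951
print(round(H_man, 3), round(H_woman, 3))                # 0.366 0.045
```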
"Information Theory and Coding" Course Syllabus

I. Basic Course Information
Course code: 16052603
Course name: 信息论与编码
English name: Information Theory and Coding
Course category: professional course
Class hours: 48
Credits: 3
Intended students: Information and Computing Science
Assessment: examination
Prerequisites: Mathematical Analysis, Advanced Algebra, Probability Theory

II. Course Description
Information Theory and Coding is a compulsory professional theory course for undergraduates in information science.
Through this course, students will understand and master the basic concepts of information measurement and channel capacity, the characteristics of sources and channels, and coding theory, laying a foundation for later in-depth study of information and communication courses and for future practical work in information processing. The main contents of the course include: the measurement of information, sources and source entropy, channels and channel capacity, lossless source coding, and coding for noisy channels.
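As a concrete illustration of two quantities the course builds on (a sketch, not part of the syllabus itself): the entropy H(X) = -Σ p_i log2 p_i of a source, and the capacity of a binary symmetric channel with crossover probability p, C = 1 - H(p, 1-p).

```python
import math

# Shannon entropy in bits: H(X) = -sum p_i * log2(p_i)
def entropy(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Capacity of a binary symmetric channel with crossover probability p:
# C = 1 - H(p, 1-p), in bits per channel use
def bsc_capacity(p):
    return 1 - entropy([p, 1 - p])

print(entropy([0.5, 0.5]))            # 1.0 bit: a fair binary source
print(round(entropy([0.25] * 4), 3))  # 2.0 bits: four equiprobable symbols
print(bsc_capacity(0.5))              # 0.0: a totally noisy channel carries nothing
```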
The Theory of Information and Coding: An English Overview

1. Introduction

1.1 Overview
This article examines information and coding theory, the discipline that studies how to transmit and process information efficiently in communication systems. Its central goal is to improve the efficiency of data transmission through effective representation and compression of information, while ensuring that data remain reliable and accurate in transit.
1.2 Structure of the Article
The article is organized into five main parts. First, we review the background and scope of information and coding theory. Next, we introduce the concepts of source entropy and information content and how to compute them. We then discuss common coding methods and the metrics used to measure coding efficiency. Finally, we present the principles and classification of error-correcting codes, methods for evaluating error-correction capability, and their applications and performance-control strategies in communication systems.
1.3 Purpose
The article aims to give a comprehensive and clear introduction to the concepts and methods of information and coding theory, helping readers understand the field and apply it in practical engineering projects. Through a detailed treatment of each topic, we hope readers will gain a thorough understanding of information and coding theory and be able to use this knowledge in the design and optimization of communication systems.
2. Information and Coding Theory

2.1 Overview of Information Theory
Information theory, introduced by Claude Shannon in 1948, is a mathematical theory of the amount of information in signal transmission and storage, of source compression, and of the reliability of communication systems. It is chiefly concerned with transmitting and representing data efficiently without loss of information, and it provides a basis both for quantifying information and for choosing effective coding methods.
2.2 Overview of Coding Theory
Coding theory studies how to transform input data into codes with a particular format or structure so as to improve data efficiency, reduce transmission errors, or save storage space during transmission and storage. It encompasses many coding methods, such as Huffman coding, Hamming codes, and cyclic redundancy checks (CRC), and is widely applied in communications, image processing, audio processing, and other fields.
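As an illustration of one of the methods just named, here is a minimal Huffman-code construction (a sketch; the function and variable names are my own, not from the text):

```python
import heapq
from collections import Counter

def huffman_codes(freqs):
    """Build a prefix-free binary code {symbol: bitstring} from a frequency map."""
    # Heap entries: [subtree weight, unique tie-breaker, {symbol: partial code}]
    heap = [[w, i, {sym: ""}] for i, (sym, w) in enumerate(freqs.items())]
    heapq.heapify(heap)
    tie = len(heap)
    while len(heap) > 1:
        w1, _, left = heapq.heappop(heap)   # merge the two least-frequent subtrees
        w2, _, right = heapq.heappop(heap)
        merged = {s: "0" + code for s, code in left.items()}
        merged.update({s: "1" + code for s, code in right.items()})
        heapq.heappush(heap, [w1 + w2, tie, merged])
        tie += 1
    return heap[0][2]

codes = huffman_codes(Counter("abracadabra"))
# The most frequent symbol ('a', 5 occurrences) gets the shortest codeword
print(min(codes, key=lambda s: len(codes[s])))  # a
```

Because more frequent symbols end up closer to the root of the merge tree, they receive shorter codewords, which is exactly how Huffman coding shortens the average code length.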
2.3 The Relationship Between Information and Coding
Information and coding are closely related and mutually dependent. A suitable coding method can reduce the bandwidth or equipment cost required for data transmission, while correct decoding of the received code recovers the original information sent by the transmitter. When designing a communication system, information theory and coding theory must therefore be combined to optimize transmission efficiency and guarantee data reliability.
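The encode/transmit/decode round trip described above can be made concrete with a toy run-length codec (illustrative only; real systems use far stronger schemes such as the Huffman coding sketched earlier):

```python
from itertools import groupby

# Encoder: collapse runs of identical symbols into (symbol, count) pairs
def rle_encode(message: str):
    return [(ch, len(list(run))) for ch, run in groupby(message)]

# Decoder: expand the pairs back into the original message
def rle_decode(pairs) -> str:
    return "".join(ch * count for ch, count in pairs)

msg = "aaabbbbcc"
packed = rle_encode(msg)
print(packed)                     # [('a', 3), ('b', 4), ('c', 2)]
print(rle_decode(packed) == msg)  # True: the receiver recovers the original
```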
Information Theory and Coding: Learning Coaching and Detailed Exercise Explanations

Information theory and coding are two closely related scientific disciplines underpinning signal processing, data communication, and computer networks. The two are often used together in applications, which makes learning their concepts important. This essay walks through the basic knowledge of the two fields, offers coaching on studying them, and works through exercise explanations.

I. Information Theory

A. Definition
Information theory is a branch of applied mathematics and electrical engineering dealing with the quantification, storage, and communication of information. It was developed in 1948 by Claude Shannon. The fundamental problem of communication is to convey a message from a sender to a receiver without error. Information theory studies how information is transmitted over a communication medium and how that medium affects the transmission process.

B. Basic Concepts
Some basic concepts of information theory are entropy, noise, channel capacity, coding, and error control. Entropy is a measure of uncertainty and helps determine the amount of information contained in a signal. Noise is any disturbance that affects the transmission of information. Channel capacity is the maximum amount of data that can be transmitted through a communication channel. Coding is the process of translating the message from the source into a form that the destination can understand. Error control is the process of detecting, identifying, and correcting errors that occur during transmission.

II. Coding

A. Definition
Coding is a branch of mathematics and computer science dealing with the efficient representation of data. It was developed in the late 1950s and early 1960s.
Coding is used in a variety of applications, including data storage, image processing, digital signal processing, and communications. Coding techniques can greatly reduce the amount of data that must be stored and transmitted.

B. Basic Concepts
The main concepts of coding are coding, signaling, modulation, coding rate, coding efficiency, and entropy. Coding is the process of transforming the message from the source into a form that the destination can understand. Signaling is the process of conveying information through a medium over a communication link. Modulation is the process of varying some aspect of a signal in order to transmit information. The coding rate is the ratio of information bits to total transmitted bits. The coding efficiency is the ratio of the source entropy to the average number of bits actually used per symbol. Entropy is a measure of the amount of information contained in a signal.

III. Learning Coaching

A. Fundamentals
The best way to learn the fundamentals of information theory and coding is to start by becoming familiar with the core concepts such as entropy, noise, channel capacity, coding, and error control. Taking a college course in information theory and coding is also beneficial; alternatively, reading textbooks and studying reference material is a viable option.

B. Practice
Practicing the concepts of information theory and coding is essential to mastering them. It is important to understand the material rather than memorize it, and doing practice problems is the best way to build that understanding.

IV. Exercise Explanation

A. Information Theory
For the information-theory part of this exercise, the main goal is to determine the maximum rate at which data can be transmitted through a given communication channel. One first calculates the entropy of the signal to determine the amount of information it contains.
Then the channel capacity is calculated, taking the signal's entropy, the channel's noise, and the coding rate into consideration.

B. Coding
For the coding part of this exercise, the main goal is to encode a message into a format that the destination can understand. One first selects an appropriate coding technique, then encodes the information using that technique, and finally transmits the encoded message over a communication link.

In conclusion, information theory and coding are two important scientific fields for signal processing, data communication, and computer networks. This essay has covered the basics of the two fields, offered coaching on studying them, and worked through exercise explanations. Understanding their fundamentals and practicing the concepts is essential to gaining mastery of these fields.
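The channel-capacity calculation sketched in section IV.A is usually done with the Shannon-Hartley theorem; here is a minimal example (the bandwidth and SNR figures are illustrative, not from the text):

```python
import math

# Shannon-Hartley theorem: C = B * log2(1 + S/N), in bits per second,
# where B is the bandwidth in Hz and S/N is the linear signal-to-noise ratio.
def shannon_capacity(bandwidth_hz: float, snr_linear: float) -> float:
    return bandwidth_hz * math.log2(1 + snr_linear)

# Example: a 3 kHz telephone channel at 30 dB SNR (linear SNR = 10**(30/10) = 1000)
print(round(shannon_capacity(3000, 1000)))  # 29902 bit/s
```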