


EEE315 Information Theory and Coding
Assignment 1

Channel Capacity and Mutual Information

ID: 08116649    Name: Chaoyun Song


1. Introduction

Shannon's information content should have some intuitive properties: (1) the information contained in an event ought to be defined in terms of some measure of the uncertainty of that event; (2) less certain events ought to contain more information than more certain events; (3) the information of unrelated events taken as a single event should equal the sum of the information of the unrelated events.

In information theory, entropy is a measure of the uncertainty associated with a random variable. The term usually refers to the Shannon entropy, which quantifies the expected value of the information contained in a message, usually in units such as bits; here a 'message' means a specific realization of the random variable. Entropy is defined as:

H(X) = -Σ p(x) log2 p(x)

It can be viewed as: a measure of the minimum cost needed to send some form of information; the "surprise factor" of the information, measured in bits; or how much energy it is worth spending to carry the information, which translates to the minimum number of bits needed to code it.
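As a quick illustration of the definition (the two coin distributions below are an added example, not part of the assignment), a fair coin carries 1 bit of entropy, while a biased and therefore more certain coin carries less:

>> p_fair = [0.5 0.5];
>> H_fair = -sum(p_fair.*log2(p_fair))        % 1 bit
>> p_biased = [0.9 0.1];
>> H_biased = -sum(p_biased.*log2(p_biased))  % about 0.47 bits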

In probability theory and information theory, the mutual information (sometimes known by the archaic term transinformation) of two random variables is a quantity that measures the mutual dependence of the two random variables. The most common unit of measurement of mutual information is the bit, when logarithms to base 2 are used. The mutual information can be defined as:

I(X;Y) = Σ Σ p(x,y) log2[ p(x,y) / (p1(x) p2(y)) ]

where p(x,y) is the joint probability distribution function of X and Y, and p1(x) and p2(y) are the marginal probability distribution functions of X and Y respectively.
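A short Matlab sketch of this definition, assuming the joint distribution is supplied as a matrix Pxy with rows indexed by X and columns by Y (the example matrix and the variable names are illustrative):

>> Pxy = [0.4 0.1; 0.1 0.4];          % example joint distribution p(x,y), all entries nonzero
>> Px = sum(Pxy,2);                   % marginal p1(x) as a column vector
>> Py = sum(Pxy,1);                   % marginal p2(y) as a row vector
>> T = Pxy .* log2(Pxy ./ (Px*Py));   % p(x,y)*log2( p(x,y)/(p1(x)p2(y)) ) term by term
>> I = sum(T(:))                      % mutual information in bits (about 0.28 here)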


2. Results with Matlab scripts and functions

(1). Write a Matlab function to calculate the entropy of a source given a discrete distribution. Calculate the entropy for each of the following distributions and plot the entropy diagram for each:

A = {1/2, 1/4, 1/8, 1/8}
B = {1/4, 1/4, 1/4, 1/4}
C = {0.1, 0.31, 0.001, 0.009, 0.2, 0.15, 0.23}

Solution: Matlab code

>> A=[1/2 1/4 1/8 1/8];
>> H1=-sum(A.*log2(A))

H1 =

    1.7500

>> B=[1/4 1/4 1/4 1/4];
>> H2=-sum(B.*log2(B))

H2 =

     2

>> C=[0.1, 0.31, 0.001, 0.009, 0.2, 0.15, 0.23];
>> H3=-sum(C.*log2(C))


H3 =

2.2897
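Part (1) asks for a function rather than separate command-window steps, so the three calculations above can be wrapped into one. A minimal sketch, assuming the distribution is passed as a probability vector (the name discrete_entropy and the zero-probability guard are illustrative additions, not from the original solution):

function H = discrete_entropy(p)
% DISCRETE_ENTROPY  Entropy in bits of a discrete distribution.
%   p is a vector of probabilities summing to 1.
p = p(p > 0);              % treat 0*log2(0) as 0 by dropping zero entries
H = -sum(p .* log2(p));
end

Example call:

>> discrete_entropy([1/2 1/4 1/8 1/8])

ans =

    1.7500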

(2). Write a MATLAB[1] script to plot the capacity of a binary symmetric channel with crossover probability p as a function of p, where 0 ≤ p ≤ 1. For what value of p is the capacity minimized, and what is the minimum value?

For a binary symmetric channel (BSC) we know that P(0|1) = P(1|0) = p and P(0|0) = P(1|1) = 1 - p, where p is the crossover probability and 0 ≤ p ≤ 1. The capacity is reached with equiprobable inputs, so that P(Y0) = P(Y1) = 0.5 and the mutual information is maximized. The capacity of this channel is:

C = P(X0)P(0|0) log2[P(0|0)/0.5] + P(X0)P(1|0) log2[P(1|0)/0.5] + P(X1)P(0|1) log2[P(0|1)/0.5] + P(X1)P(1|1) log2[P(1|1)/0.5]
  = 1 + p log2(p) + (1-p) log2(1-p)

Using Matlab we can plot C as a function of p.


Solution: Matlab code

>> p=0 : 0.01 : 1;
>> C=p.*log2(p)+(1-p).*log2(1-p)+1;   % note: at the endpoints 0*log2(0) gives NaN in Matlab, so C(1) and C(end) are not plotted
>> plot(p,C)

From the diagram we can see how C changes with the value of p. When p = 0 or p = 1 the channel capacity is 1 bit/symbol. When p = 0.5 no information gets through and the mutual information is 0. For 0.5 ≤ p ≤ 1 the curve is a mirror image of the left-hand side.

So the capacity is minimized at p = 0.5, and the minimum value of C is 0.
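As a small added check on this conclusion, the minimum can be read off the vector computed above (min in Matlab ignores the NaN endpoints):

>> [Cmin, k] = min(C);
>> Cmin                % 0
>> p(k)                % 0.5000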

(3). A binary non-symmetric channel is characterized by the probabilities P(0|1) = 0.1 and P(1|0) = 0.2.
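For this channel the mutual information can be evaluated numerically over the input distribution and maximized to obtain the capacity. A minimal sketch, assuming the task is to find the capacity; the sweep over q = P(X0) and the variable names (a and b for the two transition probabilities) are illustrative additions:

>> a = 0.2; b = 0.1;                        % P(1|0) = 0.2, P(0|1) = 0.1
>> q = 0.001 : 0.001 : 0.999;               % candidate input probabilities q = P(X0)
>> py1 = q*a + (1-q)*(1-b);                 % P(Y1) for each q
>> py0 = 1 - py1;                           % P(Y0)
>> Hy  = -py0.*log2(py0) - py1.*log2(py1);  % output entropy H(Y)
>> Hyx = q*(-a*log2(a) - (1-a)*log2(1-a)) + (1-q)*(-b*log2(b) - (1-b)*log2(1-b));  % H(Y|X)
>> I = Hy - Hyx;                            % mutual information I(X;Y) = H(Y) - H(Y|X)
>> [Cap, k] = max(I);                       % capacity = maximum over input distributions
>> Cap, q(k)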
