第4章-信道容量教学课件.ppt (Chapter 4: Channel Capacity, lecture slides, 63 pages)
Review of the properties of the mutual information function:

Property 1 (relationship between average mutual information and the channel input probability distribution): I(X;Y) is an upper convex (concave) function of the channel input probability distribution p(x). [Figure: I(X;Y) plotted against p(x).]

Property 2 (relationship between average mutual information and the channel transition probability distribution): I(X;Y) is a lower convex (convex) function of the channel transition probability distribution p(y|x). [Figure: I(X;Y) plotted against p(y|x).]

What is a channel? The channel is the carrier that transmits messages, the passage through which the signal passes. Information is abstract, but the channel is concrete. For instance: when two people converse, the air is the channel; when they call each other, the telephone line is the channel; when we watch television or listen to the radio, the space between the transmitter and the receiver is the channel.

4.1 The model and classification of the channel

This part mainly introduces two topics: channel models and channel classifications.

4.1.1 Channel models

A channel can be treated as a converter that transforms events; the channel model can be drawn as a block diagram mapping inputs to outputs. The Binary Symmetric Channel (BSC) is the simplest channel model. [Figure: BSC transition diagram.] We assume that the channel and the modulation are memoryless. The inputs and outputs can then be related by a set of conditional probabilities

    P(Y = y_j | X = x_i) = p(y_j | x_i),  i = 0, 1, ..., Q−1,  j = 0, 1, ..., q−1.

A channel of this kind is known as a Discrete Memoryless Channel (DMC). [Figure: DMC transition diagram.]

4.1.2 Channel classifications

Channels can be classified into several types:
1) By number of users: two-user (point-to-point) channels and multi-user (network) channels.
2) By parameter behavior: constant-parameter channels and changeable-parameter channels.
3) By transmission medium:
   - solid media: open wire, symmetrical balanced cable, coaxial cable (fine coaxial cable, standard coaxial cable);
   - air media: long wave (AM), shortwave (FM), mobile, horizon relay, microwave, tropospheric (对流层) scattering, ionospheric (电离层), satellite, light wave;
   - mixed media: waveguide, cable.
4) By signal and interference type: discrete, continuous, semi-discrete or semi-continuous signals; memoryless or with memory; no interference, or with interference (linearly superposed interference such as thermal noise and impulse noise; multiplicative interference such as intermodulation, fading and inter-symbol interference).

4.2 Channel doubt degree and average mutual information

This section covers: 4.2.1 channel doubt degree; 4.2.2 average mutual information; 4.2.3 properties of the mutual information function; 4.2.4 the relationship between entropy, channel doubt degree and average mutual information.

4.2.1 Channel doubt degree

Let the random variable X denote the input set of the channel and the random variable Y denote its output set. The channel doubt degree (equivocation) is

    H(X|Y) = E_j[H(X|b_j)] = −Σ_{i,j} p(a_i b_j) log p(a_i | b_j),

where, for a particular received symbol b_j,

    H(X|b_j) = Σ_{i=1}^{n} p(a_i | b_j) I(a_i | b_j) = −Σ_{i=1}^{n} p(a_i | b_j) log p(a_i | b_j).

The meaning of the channel doubt degree H(X|Y) is the average uncertainty about the source X that still remains after the receiving terminal has got the message Y. This residual uncertainty comes from the noise in the channel. Since

    H(X|Y) ≤ H(X),

the receiver, on getting the output message Y, obtains information that eliminates more or less of the uncertainty of the source X. So we have the following concept of average mutual information.

4.2.2 Average mutual information

The average mutual information is the entropy of the source X minus the channel doubt degree:

    I(X;Y) ≜ H(X) − H(X|Y).

It is the average information about X that the receiver gains from every symbol it receives.

4.2.3 Properties of the mutual information function

Property 1: Relationship between mutual information and the channel input probability distribution. I(X;Y) is an upper convex (concave) function of the channel input probability distribution p(x). This is shown in Fig. 4.5 (I(X;Y) as a concave function of p(x)) and Fig. 4.6 (a message passing through the channel).

E.g. 4.1. Consider a binary channel whose input distribution is

    P_X = (ω, 1−ω),

and whose channel matrix is

    P = [ p_11  p_12 ] = [ 1−p    p  ]
        [ p_21  p_22 ]   [  p    1−p ],

where p is the probability of a transmission error. The mutual information is

    I(X;Y) = H(Y) − H(Y|X)
           = H(Y) − Σ_x Σ_y p(x) p(y|x) log (1/p(y|x))
           = H(Y) − (p log (1/p) + (1−p) log (1/(1−p)))
           = H(Y) − H(p).

Writing p̄ = 1−p and ω̄ = 1−ω, the output probabilities are

    P(Y=0) = ω p̄ + ω̄ p,   P(Y=1) = ω p + ω̄ p̄,

so

    H(Y) = −(ω p̄ + ω̄ p) log(ω p̄ + ω̄ p) − (ω p + ω̄ p̄) log(ω p + ω̄ p̄).

The resulting average mutual information is plotted in Fig. 4.7 (mutual information of the binary symmetric channel). From the diagram we can see that when the input symbols are equally probable (ω = 1/2), the average mutual information I(X;Y) reaches its maximum value, and only at this point does the receiver get the largest information from every symbol it receives.

Property 2: Relationship between mutual information and the channel transition probability distribution. I(X;Y) is a lower convex (convex) function of the channel transition probability distribution p(y|x). This is shown in Fig. 4.8.

E.g. 4.2 (follow-up of E.g. 4.1). For the same binary channel with the source distribution fixed, the average mutual information is

    I(X;Y) = H(ω p̄ + ω̄ p) − H(p),

a convex function of the channel parameter p, as the following diagram shows (mutual information of the fixed binary source).
From the diagram we can see that once the binary source is fixed, changing the channel parameter p changes the mutual information I(X;Y). When p = 1/2, I(X;Y) = 0: the receiver gets the least information from this channel, all the information is lost on the way, and the channel noise is at its worst.

Property 3: If the channel input is discrete and memoryless, i.e.

    p(x_1 x_2 ... x_N) = Π_{i=1}^{N} p(x_i),

then

    I(X;Y) ≥ Σ_{i=1}^{N} I(X_i;Y_i),

where X = (X_1, ..., X_N) and Y = (Y_1, ..., Y_N).

Property 4: If the channel is discrete and memoryless, i.e.

    p(y_1 ... y_N | x_1 ... x_N) = Π_{i=1}^{N} p(y_i | x_i),

then

    I(X;Y) ≤ Σ_{i=1}^{N} I(X_i;Y_i).

(Remember these results.)

4.2.4 Relationship between entropy, channel doubt degree and average mutual information

    I(X;Y) = H(X) − H(X|Y)
    I(X;Y) = H(X) + H(Y) − H(X,Y)
    I(X;Y) = I(Y;X) ≥ 0
    H(X,Y) = H(X) + H(Y|X) = H(Y) + H(X|Y)

E.g. 4.3. A source

    X = {x_1, x_2},  P(x_1) = 0.6,  P(x_2) = 0.4,

sends its messages through a channel with noise; the symbols received by the other end of the channel are Y = {y_1, y_2}. The channel transfer matrix is

    P = [ 5/6  1/6 ]
        [ 1/4  3/4 ].

Calculate:
(1) the self-information of the events x_i (i = 1, 2);
(2) the information about x_i that the receiver gets when it observes y_j (j = 1, 2);
(3) the entropy of the source X and of the received Y;
(4) the channel doubt degree H(X|Y) and the noise entropy H(Y|X);
(5) the average mutual information obtained by the receiver when it receives Y.

Solution:
(1) I(x_1) = −log2 p(x_1) = −log2 0.6 = 0.737 bit;
    I(x_2) = −log2 p(x_2) = −log2 0.4 = 1.322 bit.
(2) The output probabilities are
    p(y_1) = p(x_1)p(y_1|x_1) + p(x_2)p(y_1|x_2) = 0.6 × 5/6 + 0.4 × 1/4 = 0.6,
    p(y_2) = p(x_1)p(y_2|x_1) + p(x_2)p(y_2|x_2) = 0.6 × 1/6 + 0.4 × 3/4 = 0.4,
    so
    I(x_1;y_1) = log2 [p(y_1|x_1)/p(y_1)] = log2 [(5/6)/0.6] = 0.474 bit,
    I(x_1;y_2) = log2 [(1/6)/0.4] = −1.263 bit,
    I(x_2;y_1) = log2 [(1/4)/0.6] = −1.263 bit,
    I(x_2;y_2) = log2 [(3/4)/0.4] = 0.907 bit.
(3) H(X) = −Σ_i p(x_i) log2 p(x_i) = −(0.6 log2 0.6 + 0.4 log2 0.4) = 0.971 bit/symbol; since p(y_1) = 0.6 and p(y_2) = 0.4, likewise H(Y) = 0.971 bit/symbol.
(4) The noise entropy is
    H(Y|X) = −Σ_{i,j} p(x_i) p(y_j|x_i) log2 p(y_j|x_i)
           = −(0.6 × (5/6 log2 5/6 + 1/6 log2 1/6) + 0.4 × (1/4 log2 1/4 + 3/4 log2 3/4))
           = 0.715 bit/symbol,
    and from H(X) + H(Y|X) = H(Y) + H(X|Y),
    H(X|Y) = H(X) + H(Y|X) − H(Y) = 0.971 + 0.715 − 0.971 = 0.715 bit/symbol.
(5) I(X;Y) = H(X) − H(X|Y) = 0.971 − 0.715 = 0.256 bit/symbol.
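The numbers in E.g. 4.3 can be reproduced with a generic DMC computation. This is a short sketch in Python; the variable names are my own, not from the slides:

```python
import math

log2 = math.log2

# E.g. 4.3: source distribution and channel transfer matrix
px = [0.6, 0.4]
P = [[5/6, 1/6],
     [1/4, 3/4]]          # P[i][j] = p(y_j | x_i)

# Output distribution: p(y_j) = sum_i p(x_i) p(y_j|x_i)
py = [sum(px[i] * P[i][j] for i in range(2)) for j in range(2)]

H_X = -sum(p * log2(p) for p in px)
H_Y = -sum(p * log2(p) for p in py)

# Noise entropy H(Y|X) = -sum_{i,j} p(x_i) p(y_j|x_i) log2 p(y_j|x_i)
H_Y_given_X = -sum(px[i] * P[i][j] * log2(P[i][j])
                   for i in range(2) for j in range(2))

# Channel doubt degree via H(X) + H(Y|X) = H(Y) + H(X|Y)
H_X_given_Y = H_X + H_Y_given_X - H_Y

I_XY = H_X - H_X_given_Y

print(round(H_X, 3), round(H_Y, 3))   # 0.971 0.971
print(round(H_Y_given_X, 3))          # 0.715
print(round(H_X_given_Y, 3))          # 0.715
print(round(I_XY, 3))                 # 0.256
```

Because p(y_1) = p(x_1) and p(y_2) = p(x_2) for this particular channel and source, H(X) = H(Y) and therefore H(X|Y) = H(Y|X), which is why parts (3) and (4) each give a matching pair of values.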