Chapter 4: Channel Capacity. Teaching courseware (第4章-信道容量教学课件.ppt, 63 pages)

Review of the properties of the mutual information function

Property 1: Relationship between the average mutual information and the channel input probability distribution. I(X;Y) is an upper convex (∩-shaped) function of the channel input probability distribution p(x).

Property 2: Relationship between the average mutual information and the channel transition probability distribution. I(X;Y) is a lower convex (∪-shaped) function of the channel transition probability distribution p(y|x).

What is a channel? The channel is the carrier that transmits messages, the passage through which the signal passes. Information is abstract, but the channel is concrete. For instance: if two people converse, the air is the channel; if they call each other, the telephone line is the channel; if we watch television or listen to the radio, the space between the transmitter and the receiver is the channel.

4.1 The model and classification of the channel
In this part we mainly introduce two topics: channel models and channel classifications.

4.1.1 Channel Models
We can treat the channel as a converter that transforms events; the channel model can be indicated as in the figure. The Binary Symmetric Channel (BSC) is the simplest channel model. We assume that the channel and the modulation are memoryless. The inputs and outputs can then be related by a set of conditional probabilities
P(Y = y_j | X = x_i) = P(y_j | x_i),  i = 0, 1, ..., Q-1;  j = 0, 1, ..., q-1.
Such a channel is known as a Discrete Memoryless Channel (DMC).
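A DMC is fully described by this set of conditional probabilities arranged as a transition matrix. The following is a minimal sketch (Python with numpy; the crossover probability 0.1 and the uniform input are illustrative assumptions, not values from the slides) of computing the output distribution q(y_j) = Σ_i p(x_i) P(y_j|x_i) and simulating one channel use of a BSC.

import numpy as np

# Transition matrix P[i, j] = P(y_j | x_i) of a BSC with illustrative crossover probability 0.1.
eps = 0.1
P = np.array([[1 - eps, eps],
              [eps, 1 - eps]])

p_x = np.array([0.5, 0.5])   # input distribution p(x), illustrative
q_y = p_x @ P                # output distribution q(y_j) = sum_i p(x_i) P(y_j | x_i)
print(q_y)                   # -> [0.5 0.5]

# Simulate one use of the channel: draw an input symbol, then an output symbol.
rng = np.random.default_rng(0)
x = rng.choice(2, p=p_x)
y = rng.choice(2, p=P[x])
print(x, y)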

4.1.2 Channel classifications
Channels can be classified into several types:
1) Parameter type: constant-parameter channels and changeable (time-varying) parameter channels.
2) Users: two-user channels (point to point) and multi-user channels (networks).
3) Type of media: solid media (open wire, symmetrical balanced cable, coaxial cable, fine coaxial cable); air media (long wave, AM, shortwave, FM, mobile, horizon relay, microwave, tropospheric scattering, ionospheric scattering, satellite, light); mixed media (waveguide, cable).
4) Signal/interference: the signal may be discrete, continuous, semi-discrete or semi-continuous, and the channel may be memoryless or with memory; the channel may be free of interference or subject to interference (thermal noise, impulse noise, linear-superposition interference, intermodulation, multiplicative interference, fading, inter-symbol interference).

4.2 Channel doubt degree and average mutual information
4.2.1 Channel doubt degree
4.2.2 Average mutual information
4.2.3 Properties of the mutual information function
4.2.4 Relationship between entropy, channel doubt degree and average mutual information

4.2.1 Channel doubt degree
Assume the random variable X denotes the input set of the channel and the random variable Y denotes the output set. The channel doubt degree (equivocation) is
H(X|Y) = E[H(X|b_j)] = -Σ_{i,j} p(a_i b_j) log p(a_i|b_j),
where
H(X|b_j) = Σ_{i=1}^{n} p(a_i|b_j) I(a_i|b_j) = -Σ_{i=1}^{n} p(a_i|b_j) log p(a_i|b_j).
The meaning of the "channel doubt degree H(X|Y)" is the average uncertainty about the source X that still remains after the receiving terminal has obtained the message Y. In fact, this uncertainty comes from the noise in the channel.
This means that if the average uncertainty of the source X is H(X), we obtain more or less information that eliminates uncertainty about X when we receive the output message Y, since H(X|Y) ≤ H(X). So we have the following concept of average mutual information.

4.2.2 Average mutual information
The average mutual information is the entropy of the source X minus the channel doubt degree:
I(X;Y) ≜ H(X) - H(X|Y).
It is the average information about X that the receiver gets from every symbol it receives once it has the message Y.

4.2.3 Properties of the mutual information function
Property 1: Relationship between mutual information and the channel input probability distribution. I(X;Y) is an upper convex function of the channel input probability distribution P(X).
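These two definitions can be checked numerically. A minimal sketch, assuming a small illustrative joint distribution p(a_i, b_j) (the numbers are not from the slides), computes H(X|Y) as H(X,Y) - H(Y) and then I(X;Y) = H(X) - H(X|Y):

import numpy as np

def entropy(p):
    """Entropy in bits, ignoring zero-probability entries."""
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

# Illustrative joint distribution p(a_i, b_j); rows index x, columns index y.
p_xy = np.array([[0.45, 0.15],
                 [0.05, 0.35]])

p_x = p_xy.sum(axis=1)
p_y = p_xy.sum(axis=0)

H_X_given_Y = entropy(p_xy.ravel()) - entropy(p_y)   # channel doubt degree H(X|Y) = H(X,Y) - H(Y)
I_XY = entropy(p_x) - H_X_given_Y                    # I(X;Y) = H(X) - H(X|Y)
print(H_X_given_Y, I_XY)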

This can be shown in Fig. 4.5 and Fig. 4.6.
Fig. 4.5. I(X;Y) is an upper convex function of P(X).
Fig. 4.6. Message passing through the channel.

E.g. 4.1 Consider a binary channel whose input probability distribution is P(X) = (ω, 1-ω) and whose channel matrix is
P = [p_11 p_12; p_21 p_22] = [p̄ p; p p̄],
where p is the probability of transmission error, p̄ = 1 - p and ω̄ = 1 - ω. Then the mutual information is
I(X;Y) = H(Y) - H(Y|X)
       = H(Y) - Σ_x p(x) Σ_y p(y|x) log(1/p(y|x))
       = H(Y) - (p log(1/p) + p̄ log(1/p̄))
       = H(Y) - H(p).
We can also get
P(Y=0) = P(X=0)P(Y=0|X=0) + P(X=1)P(Y=0|X=1) = ω p̄ + ω̄ p,
P(Y=1) = P(X=0)P(Y=1|X=0) + P(X=1)P(Y=1|X=1) = ω p + ω̄ p̄,
so that
H(Y) = (ω p̄ + ω̄ p) log[1/(ω p̄ + ω̄ p)] + (ω p + ω̄ p̄) log[1/(ω p + ω̄ p̄)].
The average mutual information is plotted in Fig. 4.7.
Fig. 4.7. Mutual information of the binary symmetric channel.
From the diagram we can see that when the input symbols are equiprobable, the average mutual information I(X;Y) reaches its maximum value, and only then does the receiver get the largest amount of information from every symbol it receives.
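A minimal numeric check of E.g. 4.1 (the crossover probability p = 0.1 is an illustrative choice, not a value from the slides): evaluating I(X;Y) = H(ω p̄ + ω̄ p) - H(p) over a grid of input probabilities ω shows the maximum at ω = 1/2.

import numpy as np

def h2(q):
    """Binary entropy in bits."""
    q = np.clip(q, 1e-12, 1 - 1e-12)
    return -q * np.log2(q) - (1 - q) * np.log2(1 - q)

p = 0.1                                    # crossover probability, illustrative
w = np.linspace(0.01, 0.99, 99)            # input probability P(X=0) = w
I = h2(w * (1 - p) + (1 - w) * p) - h2(p)  # I(X;Y) for the binary symmetric channel
print(w[np.argmax(I)], I.max())            # -> maximum at w = 0.5, where I = 1 - H(p) ≈ 0.531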

Property 2: Relationship between mutual information and the channel transition probability distribution. I(X;Y) is a lower convex function of the channel transition probability distribution p(Y|X).
Fig. 4.8. I(X;Y) is a lower convex function of p(Y|X).

E.g. 4.2 (a follow-up of E.g. 4.1) Consider the same binary channel. When the source distribution is (ω, ω̄), we know that the average mutual information is
I(X;Y) = H(ω p̄ + ω̄ p) - H(p).
As a function of the channel parameter p, I(X;Y) is lower convex, as can be seen from the diagram of the mutual information of a fixed binary source.
From that diagram we can see that once the binary source is fixed, changing the channel parameter p changes the mutual information I(X;Y). When p = 1/2 we have I(X;Y) = 0: the receiver gets the least information from this channel, all the information is lost during transmission, and this channel has the heaviest noise.
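A minimal numeric check of E.g. 4.2, assuming an illustrative fixed source with ω = 0.5: sweeping the crossover probability p shows I(X;Y) dropping to 0 at p = 1/2 and rising back towards 1 bit at p = 0 and p = 1.

import numpy as np

def h2(q):
    """Binary entropy in bits."""
    q = np.clip(q, 1e-12, 1 - 1e-12)
    return -q * np.log2(q) - (1 - q) * np.log2(1 - q)

w = 0.5                                    # fixed source P(X=0), illustrative
p = np.linspace(0.0, 1.0, 101)             # channel crossover probability
I = h2(w * (1 - p) + (1 - w) * p) - h2(p)  # I(X;Y) = H(w(1-p) + (1-w)p) - H(p)
print(I[p == 0.5])                         # -> [0.]   all information lost at p = 1/2
print(I[p == 0.0], I[p == 1.0])            # -> about 1 bit at p = 0 and at p = 1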

Property 3: If the channel input (the source) is discrete and without memory, so that
p(x) = p(x_1, x_2, ..., x_N) = Π_{i=1}^{N} p(x_i),
then
I(X;Y) ≥ Σ_{i=1}^{N} I(X_i;Y_i).
Property 4: If the channel is discrete and without memory, so that
p(y|x) = p(y_1, y_2, ..., y_N | x_1, x_2, ..., x_N) = Π_{i=1}^{N} p(y_i|x_i),
then
I(X;Y) ≤ Σ_{i=1}^{N} I(X_i;Y_i).
(Remember these results.)
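A minimal numeric check of Property 4 (the correlated source and the 0.1-crossover BSC below are illustrative assumptions): the block mutual information I(X_1 X_2; Y_1 Y_2) over a memoryless channel never exceeds I(X_1;Y_1) + I(X_2;Y_2).

import itertools
import numpy as np

def mutual_info(p_xy):
    """I(X;Y) in bits from a joint probability table p_xy[x, y]."""
    p_x = p_xy.sum(axis=1, keepdims=True)
    p_y = p_xy.sum(axis=0, keepdims=True)
    mask = p_xy > 0
    return float((p_xy[mask] * np.log2(p_xy[mask] / (p_x @ p_y)[mask])).sum())

eps = 0.1
W = np.array([[1 - eps, eps], [eps, 1 - eps]])   # BSC transition matrix p(y|x)
p_x1x2 = np.array([[0.4, 0.1], [0.1, 0.4]])      # correlated source p(x1, x2), illustrative

# Joint p(x1, x2, y1, y2) for a memoryless channel: p(x1, x2) * p(y1|x1) * p(y2|x2).
p_block = np.zeros((2, 2, 2, 2))
for x1, x2, y1, y2 in itertools.product(range(2), repeat=4):
    p_block[x1, x2, y1, y2] = p_x1x2[x1, x2] * W[x1, y1] * W[x2, y2]

I_block = mutual_info(p_block.reshape(4, 4))     # I(X1 X2; Y1 Y2)
I_sum = mutual_info(p_block.sum(axis=(1, 3))) + mutual_info(p_block.sum(axis=(0, 2)))
print(I_block, I_sum, I_block <= I_sum + 1e-12)  # Property 4: block value <= per-letter sum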

4.2.4 Relationship between entropy, channel doubt degree and average mutual information
I(X;Y) = H(X) - H(X|Y)
I(X;Y) = H(X) + H(Y) - H(X,Y)
I(X;Y) = I(Y;X) ≥ 0
H(X,Y) ≤ H(X) + H(Y)
H(X,Y) = H(X|Y) + H(Y) = H(Y|X) + H(X)

E.g. 4.3 There is a source X = {x_1, x_2} with P(X) = (0.6, 0.4). Its messages pass through a channel with noise, and the symbols received at the other end of the channel are Y = {y_1, y_2}. The channel's transfer matrix is
P = [5/6 1/6; 1/4 3/4].
Please calculate:
(1) The self-information contained in the events x_i (i = 1, 2).
(2) The information about x_i (i = 1, 2) that the receiver gets when it observes the message y_j (j = 1, 2).
(3) The entropy of the source X and of the received Y.
(4) The channel doubt degree H(X|Y) and the noise entropy H(Y|X).
(5) The average mutual information obtained by the receiver when it receives Y.

Solution:
(1) I(x_1) = -log2 p(x_1) = -log2 0.6 = 0.737 bit; I(x_2) = -log2 p(x_2) = -log2 0.4 = 1.322 bit.
(2) p(y_1) = p(x_1)p(y_1|x_1) + p(x_2)p(y_1|x_2) = 0.6 × 5/6 + 0.4 × 1/4 = 0.6;
p(y_2) = p(x_1)p(y_2|x_1) + p(x_2)p(y_2|x_2) = 0.6 × 1/6 + 0.4 × 3/4 = 0.4.
I(x_1;y_1) = log2[p(y_1|x_1)/p(y_1)] = log2[(5/6)/0.6] = 0.474 bit;
I(x_1;y_2) = log2[p(y_2|x_1)/p(y_2)] = log2[(1/6)/0.4] = -1.263 bit;
I(x_2;y_1) = log2[p(y_1|x_2)/p(y_1)] = log2[(1/4)/0.6] = -1.263 bit;
I(x_2;y_2) = log2[p(y_2|x_2)/p(y_2)] = log2[(3/4)/0.4] = 0.907 bit.
(3) H(X) = -Σ_i p(x_i) log p(x_i) = -(0.6 log 0.6 + 0.4 log 0.4) = 0.971 bit/symbol;
H(Y) = -Σ_j p(y_j) log p(y_j) = -(0.6 log 0.6 + 0.4 log 0.4) = 0.971 bit/symbol.
(4) H(Y|X) = -Σ_{i,j} p(x_i) p(y_j|x_i) log p(y_j|x_i)
= -(0.6 × 5/6 log(5/6) + 0.6 × 1/6 log(1/6) + 0.4 × 1/4 log(1/4) + 0.4 × 3/4 log(3/4)) = 0.715 bit/symbol;
H(X|Y) = H(X) + H(Y|X) - H(Y) = 0.971 + 0.715 - 0.971 = 0.715 bit/symbol.
(5) I(X;Y) = H(X) - H(X|Y) = 0.971 - 0.715 = 0.256 bit/symbol.
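A minimal sketch that reproduces the numbers of E.g. 4.3 directly from the definitions (numpy only; no assumptions beyond the data of the example):

import numpy as np

p_x = np.array([0.6, 0.4])                 # source distribution
W = np.array([[5/6, 1/6], [1/4, 3/4]])     # channel transfer matrix p(y|x)

H = lambda p: float(-(p[p > 0] * np.log2(p[p > 0])).sum())

p_xy = p_x[:, None] * W                    # joint distribution p(x, y)
p_y = p_xy.sum(axis=0)

print(-np.log2(p_x))                       # (1) self-information: [0.737, 1.322] bit
print(p_y, np.log2(W / p_y))               # (2) p(y) = [0.6, 0.4] and the I(x_i; y_j) table
H_X, H_Y, H_XY = H(p_x), H(p_y), H(p_xy.ravel())
print(H_X, H_Y)                            # (3) 0.971, 0.971 bit/symbol
print(H_XY - H_X, H_XY - H_Y)              # (4) H(Y|X) = 0.715, H(X|Y) = 0.715 bit/symbol
print(H_X + H_Y - H_XY)                    # (5) I(X;Y) = 0.256 bit/symbol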

4.3 Discrete channel without memory
Three groups of variables describe the channel:
(1) the channel input probability space {X^K, p(x)};
(2) the channel output probability space {Y^K, q(y)};
(3) the channel transfer probability P(Y|X).
So the channel can be represented by {X^K, P(Y|X), Y^K}. This can be indicated by the following illustration: an input vector x_i (i = 1, 2, ..., n^K), drawn according to p(x) = (p_1, ..., p_{n^K}), enters the channel, and an output vector y_j (j = 1, 2, ..., m^K), with distribution q(y) = (q_1, ..., q_{m^K}), exits it. The channel transfer matrix is the n^K × m^K matrix
P = [P(y_1|x_1) ... P(y_{m^K}|x_1); ... ; P(y_1|x_{n^K}) ... P(y_{m^K}|x_{n^K})].
When K = 1 the model degenerates to the single-message channel, and when in addition n = m = 2 it degenerates to the binary single-message channel. If that channel also satisfies symmetry, it constitutes the most commonly used BSC.
Fig. 4.11. Binary single-message symmetrical channel: input X ∈ {0, 1} with p(x) = (p_0, p_1), transfer probabilities P(y|x), output Y ∈ {0, 1} with q(y) = (q_0, q_1).

4.4 Channel capacity
4.4.1 The concept of channel capacity
4.4.2 Discrete channel without memory and its channel capacity
4.4.3 Continuous channel and its channel capacity
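For a memoryless channel the K-fold extension's transfer matrix is the K-fold Kronecker product of the single-letter matrix, since P(y_1...y_K | x_1...x_K) = Π_k P(y_k|x_k). A minimal sketch with an illustrative BSC and K = 2 (the 0.1 crossover is an assumption, not a value from the slides):

import numpy as np
from functools import reduce

eps = 0.1
W = np.array([[1 - eps, eps], [eps, 1 - eps]])   # single-letter transfer matrix (n = m = 2)

K = 2
W_K = reduce(np.kron, [W] * K)   # n^K x m^K transfer matrix of the K-fold extension
print(W_K.shape)                 # -> (4, 4)
print(W_K.sum(axis=1))           # every row still sums to 1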

4.4.1 The concept of channel capacity
The capacity of a channel is defined as the maximum value of the average mutual information:
C ≜ max_{p(x)} I(X;Y).
The unit of the channel capacity C is bit/symbol or nat/symbol. From the property mentioned before, we know that I(X;Y) is an upper convex function of the probability distribution p(x) of the input variable X. For a specific channel there always exists a source that maximizes the information carried by every message transmitted through the channel; that means the maximum of I(X;Y) exists. The corresponding probability distribution p(x) is called the optimum input distribution.

4.4.2 Discrete channel without memory and its channel capacity
Classification of the discrete message-sequence channel: a discrete channel without memory satisfies the relationship
P(Y|X) = Π_{k=1}^{K} P(y_k|x_k)   (no memory);
if the channel is also stationary, every factor is the same single-letter transfer probability P(y|x).
According to Property 4 of the mutual information I(X;Y) of the message sequence, for the discrete channel without memory we have
I(X;Y) ≤ Σ_{k=1}^{K} I(X_k;Y_k).
Note: only when the source is without memory can equality hold in this formula. Hence
C_K = max_{p(x)} I(X;Y) ≤ Σ_{k=1}^{K} max_{p(x_k)} I(X_k;Y_k) = Σ_{k=1}^{K} C_k = K·C   (stationary),
and the bound is attained by a memoryless source that uses the optimum single-letter distribution in every position. So we can get the following deduction, which gives the formula of the channel capacity C.
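The definition C = max_{p(x)} I(X;Y) can be evaluated directly for a binary-input channel by a one-dimensional search over the input distribution. A minimal sketch, assuming an illustrative (non-symmetric) transfer matrix that is not taken from the slides:

import numpy as np

W = np.array([[0.9, 0.1],    # illustrative binary-input transfer matrix p(y|x)
              [0.3, 0.7]])

def I(w):
    """I(X;Y) in bits for the input distribution (w, 1 - w)."""
    p_x = np.array([w, 1 - w])
    p_xy = p_x[:, None] * W
    p_y = p_xy.sum(axis=0)
    mask = p_xy > 0
    return float((p_xy[mask] * np.log2(p_xy[mask] / (p_x[:, None] * p_y)[mask])).sum())

grid = np.linspace(0.0, 1.0, 10001)
vals = [I(w) for w in grid]
k = int(np.argmax(vals))
print(grid[k], vals[k])      # optimum input probability P(x_1) and channel capacity C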

Theorem of the discrete channel without memory. Assume that the transmission probability matrix of the discrete channel without memory is Q. The sufficient conditions under which the input letter probability distribution p* makes the mutual information I(p;Q) achieve its maximum value are
I(x = a_k; Y)|_{p = p*} = C, when p(a_k) > 0,
I(x = a_k; Y)|_{p = p*} ≤ C, when p(a_k) = 0,
where
I(x = a_k; Y) = Σ_{j=1}^{J} q(b_j|a_k) log [q(b_j|a_k) / p(b_j)]
is the average mutual information when the source letter a_k is sent, and C is the channel capacity of this channel.

Understanding this theorem:
Firstly, under this distribution each letter whose probability is above zero provides mutual information C, and each letter whose probability is zero provides mutual information lower than or equal to C.
Secondly, only under this kind of distribution can I(p;Q) attain the maximum value C.
Thirdly, I(X;Y) is the average of I(x = a_k; Y); that is, it satisfies the equation
I(X;Y) = Σ_k p(a_k) I(x = a_k; Y).
(1) If we want to increase I(X;Y), increasing p(a_k) may be a good idea.
(2) However, once p(a_k) is increased, I(x = a_k; Y) may be reduced.
(3) So we adjust p(a_k) repeatedly until all the I(x = a_k; Y) are equal to C.
(4) At that point I(X;Y) = C.
The theorem only provides a sufficient condition for p(x) to make I(X;Y) = C; it does not give the concrete distribution or the value of C. It may, however, help us obtain the value of C for several kinds of channels in simple situations.
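The "adjust p(a_k) repeatedly" idea is made precise by the Blahut-Arimoto algorithm (the algorithm is not named in the slides; it is a standard technique added here only for illustration). A minimal sketch with an illustrative transfer matrix: at convergence every used letter contributes the same I(x = a_k; Y), which equals C.

import numpy as np

def blahut_arimoto(Q, iters=200):
    """Approximate C = max_p I(p;Q) for a DMC with transfer matrix Q[k, j] = q(b_j | a_k)."""
    n = Q.shape[0]
    p = np.full(n, 1.0 / n)                  # start from the uniform input distribution
    for _ in range(iters):
        q_y = p @ Q                          # output distribution p(b_j) under the current p
        # D[k] = I(x = a_k; Y) in nats under the current input distribution
        D = (Q * np.log(np.where(Q > 0, Q / q_y, 1.0))).sum(axis=1)
        p = p * np.exp(D)                    # reward letters whose I(x = a_k; Y) is large
        p /= p.sum()
    q_y = p @ Q
    D = (Q * np.log(np.where(Q > 0, Q / q_y, 1.0))).sum(axis=1) / np.log(2)  # now in bits
    return p, float((p * D).sum()), D

Q = np.array([[0.9, 0.1], [0.3, 0.7]])       # illustrative transfer matrix
p_star, C, D_bits = blahut_arimoto(Q)
print(p_star, C)                             # optimum input distribution and capacity C
print(D_bits)                                # each used letter's I(x = a_k; Y) is close to C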

E.g. 4.4 Assume the transmission matrix of a binary discrete symmetrical channel is
P = [2/3 1/3; 1/3 2/3].
(1) If P(0) = 3/4 and P(1) = 1/4, please calculate H(X), H(X|Y), H(Y|X) and I(X;Y).
(2) Please calculate the capacity of the channel and the input probability distribution that reaches it.

Solution:
(1) H(X) = -(3/4 log 3/4 + 1/4 log 1/4) = 0.811 bit/symbol.
H(Y|X) = -Σ_{i,j} p(x_i) p(y_j|x_i) log p(y_j|x_i)
= -(3/4 × 2/3 lg(2/3) + 3/4 × 1/3 lg(1/3) + 1/4 × 1/3 lg(1/3) + 1/4 × 2/3 lg(2/3)) × log2 10
= 0.918 bit/symbol.
p(y_1) = p(x_1)p(y_1|x_1) + p(x_2)p(y_1|x_2) = 3/4 × 2/3 + 1/4 × 1/3 = 0.5833;
p(y_2) = p(x_1)p(y_2|x_1) + p(x_2)p(y_2|x_2) = 3/4 × 1/3 + 1/4 × 2/3 = 0.4167.
H(Y) = -(0.5833 log 0.5833 + 0.4167 log 0.4167) = 0.980 bit/symbol.
H(X|Y) = H(X) + H(Y|X) - H(Y) = 0.811 + 0.918 - 0.980 = 0.749 bit/symbol.
I(X;Y) = H(X) - H(X|Y) = 0.811 - 0.749 = 0.062 bit/symbol.
(2) For this symmetric channel,
C = max I(X;Y) = log2 m - H_mi = log2 2 + (2/3 lg(2/3) + 1/3 lg(1/3)) × log2 10 = 1 - 0.918 = 0.082 bit/symbol,
reached by the equiprobable input distribution P(x_i) = 1/2.
Here m is the number of symbols in the output set and H_mi is the entropy of a row vector of the channel matrix. Indeed, for a symmetric channel
H(Y|X) = -Σ_i p(x_i) Σ_j p(y_j|x_i) log p(y_j|x_i) = Σ_i p(x_i) H_mi = H_mi,
because -Σ_j p(y_j|x_i) log p(y_j|x_i) = H_mi is the same for every row i.
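A minimal sketch that reproduces E.g. 4.4 numerically, including the symmetric-channel shortcut C = log2(m) - H(row):

import numpy as np

W = np.array([[2/3, 1/3], [1/3, 2/3]])   # symmetric transfer matrix
p_x = np.array([3/4, 1/4])

H = lambda p: float(-(p[p > 0] * np.log2(p[p > 0])).sum())

p_xy = p_x[:, None] * W
p_y = p_xy.sum(axis=0)
H_X, H_Y, H_XY = H(p_x), H(p_y), H(p_xy.ravel())
print(H_X)                               # (1) H(X)   = 0.811 bit/symbol
print(H_XY - H_X)                        #     H(Y|X) = 0.918 bit/symbol
print(H_XY - H_Y)                        #     H(X|Y) = 0.749 bit/symbol
print(H_X + H_Y - H_XY)                  #     I(X;Y) = 0.062 bit/symbol

# (2) Symmetric channel: capacity = log2(m) minus the entropy of one row,
#     reached by the equiprobable input distribution.
print(np.log2(W.shape[1]) - H(W[0]))     # C = 0.082 bit/symbol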

4.4.3 Continuous channel and its channel capacity
Topics: characteristics of the continuous channel; the analog channel; basic knowledge of the additive channel; the Shannon formula; usage of the Shannon formula.

Characteristics of the continuous channel.
Characteristic 1: time is discrete, while the value range is continuous.
Characteristic 2: at each moment there is a single random variable whose value is continuous.

Basic knowledge of the additive channel (Fig. 4.22):
Y = X + N,
where X is the channel input, N is the channel noise and Y is the channel output. If two of X, Y, N have Gaussian distributions, then the third is also Gaussian. The differential entropy of a Gaussian random variable depends only on its variance σ² and has nothing to do with its mean.

Shannon formula: when a generally stationary random-process source X(t, w), limited in frequency (bandwidth F) and in time (duration T), passes through a white Gaussian channel with limited noise power P_N, the channel capacity is
C_T = FT log2(1 + P_s/P_N) = FT log2(1 + S/N).
This is the famous Shannon formula for the continuous channel. When T = 1 the capacity is
C = F log2(1 + S/N).

Derivation. Assume Y = X + N, where X and N are independent random variables and N ~ N(0, σ²); then p(y|x) = p_n(n)|_{n = y - x} = p_n(y - x), so
I(X;Y) = H(Y) - H(Y|X) = H(Y) - H(N).
By the maximum-entropy theorem for limited average power, H(Y) is largest when Y is Gaussian, so
C = max_{p(x)} I(X;Y) = max_{p(x)} H(Y) - H(N)
= (1/2) log 2πe(P_s + P_N) - (1/2) log 2πe P_N
= (1/2) log[(P_s + P_N)/P_N] = (1/2) log(1 + P_s/P_N)   per sample.
Because the frequency of X(t, w) is limited to F, by the Nyquist sampling theorem the continuous signal is equivalent to 2F discrete samples per second. That is,
C_s = 2F × (1/2) log2(1 + P_s/P_N) = F log2(1 + P_s/P_N)   bit/s.
Considering a time duration T,
C_T = FT log2(1 + P_s/P_N)   bit.
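A minimal numeric sketch of the Shannon formula, using illustrative telephone-channel numbers (F = 3 kHz, S/N = 30 dB, neither taken from the slides) and checking that 2F samples per second at (1/2) log2(1 + S/N) bit per sample give the same rate:

import numpy as np

F = 3000.0                      # bandwidth in Hz, illustrative
snr_db = 30.0
snr = 10 ** (snr_db / 10)       # S/N as a power ratio

C_per_sample = 0.5 * np.log2(1 + snr)        # bit per sample
C_per_second = F * np.log2(1 + snr)          # Shannon formula, bit/s
print(C_per_sample)                          # ≈ 4.98 bit per sample
print(2 * F * C_per_sample, C_per_second)    # both ≈ 2.99e4 bit/s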

Fig. 4.23. Shannon formula.
Usage of the Shannon formula.
In analog communications, frequency modulation outperforms amplitude modulation: the wider the frequency band, the stronger the resistance to disturbance. In digital communications, the pseudo-noise (PN) code directly spreads the signal: the wider the bandwidth becomes, the stronger the resistance to disturbance.
Another form of the Shannon formula is
C = FT log2(1 + S/N) = FT log2(1 + S/(N_0 F)) = FT log2(1 + E_b/N_0),
where N_0 is the noise power density in unit bandwidth (so N = N_0 F), E_b = S·T_b is the bit energy, and E_b/N_0 is the normalized SNR. When E_b/N_0 ≪ 1,
C/(FT) = log2(1 + E_b/N_0) ≈ E_b/N_0 (nat) = (1/ln 2)(E_b/N_0) (bit);
that is, when the SNR is very low, the channel capacity is approximately determined by the signal-to-noise ratio.
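A quick numeric check of this low-SNR approximation, with an illustrative value E_b/N_0 = 0.01:

import numpy as np

x = 0.01                        # E_b/N_0, well below 1 (illustrative)
exact = np.log2(1 + x)          # C/(FT) in bit
approx = x / np.log(2)          # low-SNR approximation (1/ln 2) * (E_b/N_0)
print(exact, approx)            # -> 0.01435..., 0.01442...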

E.g. 4.6 In photo transmission every frame has about 2.25 × 10^6 pixels. In order to reproduce a good image we need about 16 brightness levels. Assuming an equal probability distribution of the brightness levels, please calculate the transmission channel bandwidth required to send 30 images per second, given a signal-to-noise ratio of 30 dB, where (S/N)_dB = 10 log10(S/N).

Solution: The channel capacity of the additive white Gaussian noise (AWGN) channel per unit time is
C_t = lim_{T→∞} C/T = W log2(1 + S/N)   (bit/s).
The required information transmission rate is
C_t = 2.25 × 10^6 × log2 16 × 30 = 2.7 × 10^8 bit/s.
From (S/N)_dB = 10 lg(S/N) = 30 dB we get S/N = 10^3, so the required bandwidth is
W = C_t / log2(1 + S/N) = 2.7 × 10^8 / log2(1 + 10^3) ≈ 2.7 × 10^7 (Hz).

End of Chapter 4.

