
52 Things: Number 9: What are Shannon's definitions of entropy and information?


This is the latest in a series of blog posts to address the list of '52 Things Every PhD Student Should Know To Do Cryptography': a set of questions compiled to give PhD candidates a sense of what they should know by the end of their first year. This blog post introduces some fundamental concepts in Information Theory: what are Shannon's definitions of entropy and information?
Information Theory was founded by Claude E. Shannon in 1948. It was originally developed to study signal processing, but over the decades its applications have broadened to many other disciplines. This blog post briefly introduces two fundamental concepts: entropy and information. If you are interested in the details, I personally suggest you look at [1].
Entropy

Entropy is a measure of the uncertainty [3] of one or more variables.
Assume we are investigating the first web page people visit when they open a web browser. We use samples from two groups of people: four cryptographers from the Bristol Cryptogroup and four passengers stopped at Bristol coach station. Let's make the rather bold assumption that the cryptographers always visit http://bristolcrypto.blogspot.co.uk/ first when they open a web browser, while the passengers each visit a different portal.

Now let's evaluate the uncertainty of their answers: apparently, the answers from the cryptographers are quite certain (low uncertainty), whilst an answer from a passenger can hardly be guessed (high uncertainty). In other words, we say the answers in the cryptographers' group have low entropy, while those in the passengers' group have high entropy.
So one of the most remarkable of Shannon's inventions is his definition of entropy (Shannon entropy):
<span id="MathJax-Span-2" class="mrow"><span id="MathJax-Span-3" class="mi">H<span id="MathJax-Span-4" class="mo">=<span id="MathJax-Span-5" class="mo">&minus;<span id="MathJax-Span-6" class="msubsup"><span id="MathJax-Span-7" class="texatom"><span id="MathJax-Span-8" class="mrow"><span id="MathJax-Span-9" class="mo">&sum;<span id="MathJax-Span-10" class="texatom"><span id="MathJax-Span-11" class="mrow"><span id="MathJax-Span-12" class="mi">i<span id="MathJax-Span-13" class="texatom"><span id="MathJax-Span-14" class="mrow"><span id="MathJax-Span-15" class="msubsup"><span id="MathJax-Span-16" class="mi">p<span id="MathJax-Span-17" class="mi">i<span id="MathJax-Span-18" class="msubsup"><span id="MathJax-Span-19" class="mi">log<span id="MathJax-Span-20" class="mi">b<span id="MathJax-Span-21" class="mo"><span id="MathJax-Span-22" class="texatom"><span id="MathJax-Span-23" class="mrow"><span id="MathJax-Span-24" class="msubsup"><span id="MathJax-Span-25" class="mi">p<span id="MathJax-Span-26" class="mi">i

where $p_i$ is the probability that a message (an answer, in the previous example) appears. In computer science we usually use $b = 2$ (bits).
If we compute the entropy in our example, we will have:

<span id="MathJax-Span-38" class="mrow"><span id="MathJax-Span-39" class="mtable"><span id="MathJax-Span-40" class="mtd"><span id="MathJax-Span-41" class="mrow"><span id="MathJax-Span-42" class="msubsup"><span id="MathJax-Span-43" class="mi">H<span id="MathJax-Span-44" class="texatom"><span id="MathJax-Span-45" class="mrow"><span id="MathJax-Span-46" class="mi">c<span id="MathJax-Span-47" class="mi">r<span id="MathJax-Span-48" class="mi">y<span id="MathJax-Span-49" class="mi">p<span id="MathJax-Span-50" class="mi">t<span id="MathJax-Span-51" class="mi">o<span id="MathJax-Span-52" class="mi">g<span id="MathJax-Span-53" class="mi">r<span id="MathJax-Span-54" class="mi">a<span id="MathJax-Span-55" class="mi">p<span id="MathJax-Span-56" class="mi">h<span id="MathJax-Span-57" class="mi">e<span id="MathJax-Span-58" class="mi">r<span id="MathJax-Span-59" class="mo">=<span id="MathJax-Span-60" class="mo">&minus;<span id="MathJax-Span-61" class="msubsup"><span id="MathJax-Span-62" class="texatom"><span id="MathJax-Span-63" class="mrow"><span id="MathJax-Span-64" class="mo">&sum;<span id="MathJax-Span-65" class="texatom"><span id="MathJax-Span-66" class="mrow"><span id="MathJax-Span-67" class="mn">4<span id="MathJax-Span-68" class="texatom"><span id="MathJax-Span-69" class="mrow"><span id="MathJax-Span-70" class="mn">1<span id="MathJax-Span-71" class="texatom"><span id="MathJax-Span-72" class="mrow"><span id="MathJax-Span-73" class="mn">1<span id="MathJax-Span-74" class="msubsup"><span id="MathJax-Span-75" class="mi">log<span id="MathJax-Span-76" class="mn">2<span id="MathJax-Span-77" class="mo"><span id="MathJax-Span-78" class="texatom"><span id="MathJax-Span-79" class="mrow"><span id="MathJax-Span-80" class="mn">1<span id="MathJax-Span-81" class="mo">=<span id="MathJax-Span-82" class="mn">0<span id="MathJax-Span-83" class="mtd"><span id="MathJax-Span-84" class="mrow"><span id="MathJax-Span-85" class="msubsup"><span id="MathJax-Span-86" class="mi">H<span id="MathJax-Span-87" class="texatom"><span id="MathJax-Span-88" class="mrow"><span id="MathJax-Span-89" class="mi">p<span id="MathJax-Span-90" class="mi">a<span id="MathJax-Span-91" class="mi">s<span id="MathJax-Span-92" class="mi">s<span id="MathJax-Span-93" class="mi">e<span id="MathJax-Span-94" class="mi">n<span id="MathJax-Span-95" class="mi">g<span id="MathJax-Span-96" class="mi">e<span id="MathJax-Span-97" class="mi">r<span id="MathJax-Span-98" class="mo">=<span id="MathJax-Span-99" class="mo">&minus;<span id="MathJax-Span-100" class="msubsup"><span id="MathJax-Span-101" class="texatom"><span id="MathJax-Span-102" class="mrow"><span id="MathJax-Span-103" class="mo">&sum;<span id="MathJax-Span-104" class="texatom"><span id="MathJax-Span-105" class="mrow"><span id="MathJax-Span-106" class="mn">4<span id="MathJax-Span-107" class="texatom"><span id="MathJax-Span-108" class="mrow"><span id="MathJax-Span-109" class="mn">1<span id="MathJax-Span-110" class="texatom"><span id="MathJax-Span-111" class="mrow"><span id="MathJax-Span-112" class="texatom"><span id="MathJax-Span-113" class="mrow"><span id="MathJax-Span-114" class="mfrac"><span id="MathJax-Span-115" class="mn">1<span id="MathJax-Span-116" class="mn">4<span id="MathJax-Span-117" class="msubsup"><span id="MathJax-Span-118" class="mi">log<span id="MathJax-Span-119" class="mn">2<span id="MathJax-Span-120" class="mo"><span id="MathJax-Span-121" class="texatom"><span id="MathJax-Span-122" class="mrow"><span id="MathJax-Span-123" class="mfrac"><span id="MathJax-Span-124" class="mn">1<span 
id="MathJax-Span-125" class="mn">4<span id="MathJax-Span-126" class="mo">=<span id="MathJax-Span-127" class="mn">2
So the passengers' answers do have a higher entropy than the cryptographers'!
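To make this concrete, here is a minimal Python sketch of the computation above (the portal names are made up purely for illustration):

```python
from collections import Counter
from math import log2

def shannon_entropy(samples):
    """H = -sum_i p_i * log2(p_i), estimated from an observed sample list.
    Written as sum_i p_i * log2(1/p_i) to avoid a spurious -0.0."""
    counts = Counter(samples)
    n = len(samples)
    return sum((c / n) * log2(n / c) for c in counts.values())

# All four cryptographers give the same answer ...
cryptographers = ["http://bristolcrypto.blogspot.co.uk/"] * 4
# ... while each passenger names a different (hypothetical) portal.
passengers = ["google", "yahoo", "bing", "baidu"]

print(shannon_entropy(cryptographers))  # 0.0
print(shannon_entropy(passengers))      # 2.0
```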
Information

Formally, the definition of Shannon's information is given in [2] as:

"Information is a measure of one's freedom of choice when one selects a message."
To explain this, let's make a minor modification to our previous example: let's stop another four passengers at Bristol train station and assume their answers are also random portals, just like those of the passengers at the coach station.

Here is the question: given an answer $y$, can you tell which group the answer is from?
We can instantly tell that the answer is from our cryptographer group if $y$ is http://bristolcrypto.blogspot.co.uk/. However, we will struggle if $y$ is a portal; therefore, we could say the message http://bristolcrypto.blogspot.co.uk/ contains more information (less freedom) than a portal (more freedom).

So how does this relate to entropy?
Extending the definition of entropy, we define the Conditional Entropy as:

$$H(Y|X) = \sum_{x \in X} p(x)\, H(Y|X=x)$$

which describes the entropy of $Y$ given the condition $X = x$. To put it more explicitly: since entropy is the uncertainty of a variable, the definition above is in fact the uncertainty of $Y$ given the "clue" (condition) $X$.

Observation: consider two variables $X$ and $Y$. If $X$ contains only minimal information about $Y$, then being given an exact value of $X$ should not help us much in deducing the value of $Y$; that is, it does not noticeably reduce the uncertainty of $Y$. On the other hand, if $X$ contains essential information about $Y$, then the entropy of $Y$ is expected to be low when $X$ is given. Therefore, the conditional entropy can be viewed as a rational measure of the information $X$ holds about $Y$!
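As a quick sanity check of this definition, here is a sketch that builds on the previous one; the (group, answer) pairs are the same hypothetical survey data as before:

```python
from collections import Counter, defaultdict
from math import log2

def entropy(samples):
    counts = Counter(samples)
    n = len(samples)
    return sum((c / n) * log2(n / c) for c in counts.values())

def conditional_entropy(pairs):
    """H(Y|X) = sum_x p(x) * H(Y | X = x), from a list of (x, y) observations."""
    by_x = defaultdict(list)
    for x, y in pairs:
        by_x[x].append(y)
    n = len(pairs)
    return sum((len(ys) / n) * entropy(ys) for ys in by_x.values())

# X = the group a person belongs to, Y = their answer (hypothetical data).
pairs = [("cryptographer", "http://bristolcrypto.blogspot.co.uk/")] * 4 \
      + [("passenger", p) for p in ("google", "yahoo", "bing", "baidu")]

# Knowing the group, the only residual uncertainty is the passengers' 2 bits,
# weighted by p(passenger) = 1/2:
print(conditional_entropy(pairs))  # 0.5 * 0 + 0.5 * 2 = 1.0
```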
Another important measurement is called Mutual Information. It is a measure of the mutual dependence between two variables. One way to define it is as the loss of entropy (uncertainty) when a condition is given:

$$I(X;Y) = H(X) - H(X|Y) = H(Y) - H(Y|X)$$
Cryptographic Example

The concepts of information theory are widely used in cryptography. A classic example is to view a cryptographic process as a channel, with the plaintext and ciphertext being the input and output respectively. Research in side-channel analysis also benefits from information theory.
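As a tiny illustration of the channel view (my own toy example, not from the original post): for a one-bit one-time pad with a uniformly random key, the ciphertext carries zero information about the plaintext, which is exactly Shannon's notion of perfect secrecy. Reusing mutual_information from the sketch above:

```python
from itertools import product

# Enumerate all equally likely (plaintext, key) combinations of a one-bit
# one-time pad; the ciphertext is c = p XOR k.
otp_pairs = [(p, p ^ k) for p, k in product((0, 1), repeat=2)]

print(mutual_information(otp_pairs))  # I(P;C) = 0.0
```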
