4. Theory of Secure Communication - part 1
💬 Overview
- condition for perfect secrecy
🪜 Get in
Birthday paradox
about a 50% chance that two people share a birthday when there are just 23 people!
P(n) = probability that at least two people share a birthday among n people
P(n) = 1 - P'(n)
(P'(n) = probability that no two people share a birthday)
P'(1) = 1
P'(2) = 1 * (365-1)/365
P'(3) = 1 * (365-1)/365 * (365-2)/365
Digital Signature Susceptibility
(sensitivity: the same paradox is why hash-collision "birthday attacks" on signatures succeed sooner than expected)
🍦 Model of contents
✨Shannon theory
"Communication Theory of Secrecy Systems" (Shannon, 1949)
similar to the "noisy channel problem"
e.g.) a user wishing to send a message over a noisy channel must add redundancy to allow detection/correction of errors.
✨PERFECT SECRECY - Prior & Posterior Probability
✔️ in words..
the probability of each key and each message before encryption, and,
after encryption (a posteriori), the probabilities an attacker can compute for the possible keys and messages from the ciphertext
(e.g., if the data is 16 bits, there are 2^16 possible messages.)
=> an attacker who sees C can compute the posterior probabilities of Mj and Ki.
=> these posterior probabilities represent the attacker's knowledge.
✔️ in calc..
✔️ Condition for PERFECT SECRECY
🤔 IF P(M|C) = P(M) for all C & M
the attacker gains NO info by intercepting C
= even if C is intercepted, the probability of each (randomly chosen) M is unchanged
= intercepting C yields no information about M
🤭 So Perfect Secrecy is
1. P(M|C) = P(M) for all C & all M
2. P(C|M) = P(C) for all C & all M
🤭 So a Perfect System has
= at least as many C's as M's
= at least one K transforming any M into each of these C's.
this makes guessing impossible XXD
these are necessary and sufficient conditions!
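As a sanity check, a toy 2-bit one-time pad (my own illustrative setup, not from the notes) satisfies P(M|C) = P(M) even when the message prior is non-uniform:

```python
from collections import defaultdict
from fractions import Fraction

# Toy cipher: 2-bit one-time pad, C = M XOR K.
# Messages have a non-uniform prior; keys are uniform.
prior = {0b00: Fraction(1, 2), 0b01: Fraction(1, 4),
         0b10: Fraction(1, 8), 0b11: Fraction(1, 8)}
keys = [0b00, 0b01, 0b10, 0b11]        # each key has probability 1/4

joint = defaultdict(Fraction)          # joint[(m, c)] = P(M=m, C=c)
for m, pm in prior.items():
    for k in keys:
        joint[(m, m ^ k)] += pm * Fraction(1, 4)

# The posterior P(M=m | C=c) equals the prior P(M=m) for every c:
for c in keys:
    pc = sum(joint[(m, c)] for m in prior)
    for m, pm in prior.items():
        assert joint[(m, c)] / pc == pm
print("perfect secrecy: P(M|C) = P(M) for all M and C")
```

Note there are exactly as many C's as M's, and for every (M, C) pair exactly one key maps M to C, matching the two conditions above.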
✨ Shannon Entropy
✔️ definition
a measure of the unpredictability of a random variable
△ the higher it is, the harder it is to predict!
✔️ Examples
1) Political Poll
e.g.) the outcome of the poll is uncertain, so learning the result gives some new info!
e.g.) the outcome of the first-round poll is already known, so the second poll can be predicted (little new info).
✔️ Calc..
Message Entropy
= the amount of info produced when a message is chosen
Key Entropy
= the amount of info produced when a key is chosen
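A minimal sketch of the definition H(X) = -Σ p·log2(p) (the helper name is my own):

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over outcomes."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))     # fair coin: 1.0 bit of uncertainty
print(entropy([0.99, 0.01]))   # near-certain outcome: close to 0 bits
print(entropy([1/16] * 16))    # 16 equally likely messages: 4 bits
```

The uniform case gives the maximum entropy, which is why a totally random message source hides the most information.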
✔️ Observation
the amount of uncertainty introduced into the system cannot be greater than the key uncertainty.
the K uncertainty must be at least the M uncertainty
(key entropy >= message entropy) = M's info is completely hidden
the maximum occurs when all M's are equally probable (i.e., totally random)
✨ Equivocation (fuzziness)
✔️ Definition
conditional entropies of K and M
(i.e., the unpredictability of K and M given an intercepted C)
✔️ Properties
N = length of the intercepted portion of C
H(K | first N characters of C) is a non-increasing function of N
i.e., the more ciphertext is intercepted, the easier it becomes (in theory) to determine the key.
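This non-increasing behavior can be simulated with a deliberately weak toy cipher (my own construction: a single key bit reused across all plaintext bits, with biased plaintext):

```python
import math
from itertools import product

P0 = 0.7  # assumed bias: each plaintext bit is 0 with probability 0.7

def h2(p):
    """Binary entropy of a probability p, in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def key_equivocation(n):
    """H(K | first n ciphertext bits); one key bit k XORed into every bit."""
    total = 0.0
    for c in product((0, 1), repeat=n):
        # likelihood of this ciphertext under each of the two keys
        like = []
        for k in (0, 1):
            p = 0.5  # uniform key prior
            for bit in c:
                p *= P0 if (bit ^ k) == 0 else 1 - P0
            like.append(p)
        pc = sum(like)
        total += pc * h2(like[0] / pc)   # P(c) * H(K | C = c)
    return total

for n in (1, 2, 4, 8):
    print(n, round(key_equivocation(n), 4))  # shrinks as n grows
```

With more intercepted bits the plaintext bias increasingly betrays the key, so the key equivocation drops toward 0.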
✔️ For perfect secrecy!
Mutual Info
- I(M;C) = the amount of information about M gained from C
- I(K;C) = the amount of information about K gained from C
I(M;C) = H(M) - H(M|C)
"new info revealed by knowing C"
= "uncertainty" - "uncertainty remaining after knowing C"
How to achieve perfect secrecy?
I(M;C) = H(M) - H(M|C) = 0
H(M) = H(M|C)
= C and M are statistically independent
also
H(M) <= H(K)
= the key entropy must be at least the message entropy!
= num of M bits <= num of K bits
since..
otherwise (i.e., if H(M) > H(K))
the cryptanalyst can obtain a lot of info about the plaintext.
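Both conditions can be checked numerically on toy XOR ciphers (illustrative code; `mutual_info` is my own helper): a key space as large as the message space leaks I(M;C) = 0, while a smaller key space leaks information.

```python
import math
from collections import defaultdict

def H(dist):
    """Shannon entropy (bits) of a dict {outcome: probability}."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

def mutual_info(prior, keys, enc):
    """I(M;C) = H(M) - H(M|C) for C = enc(M, K) with a uniform key."""
    joint = defaultdict(float)             # joint[(m, c)] = P(M=m, C=c)
    for m, pm in prior.items():
        for k in keys:
            joint[(m, enc(m, k))] += pm / len(keys)
    pc = defaultdict(float)                # marginal P(C=c)
    for (m, c), p in joint.items():
        pc[c] += p
    h_m_given_c = 0.0                      # H(M|C) = sum_c P(c) * H(M|C=c)
    for c, pcv in pc.items():
        cond = {m: joint[(m, c)] / pcv for m in prior if joint[(m, c)] > 0}
        h_m_given_c += pcv * H(cond)
    return H(prior) - h_m_given_c

prior = {0: 0.25, 1: 0.25, 2: 0.25, 3: 0.25}   # uniform 2-bit messages
xor = lambda m, k: m ^ k
print(mutual_info(prior, [0, 1, 2, 3], xor))   # H(K) = H(M): 0 bits leaked
print(mutual_info(prior, [0, 1], xor))         # H(K) < H(M): 1 bit leaked
```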
Reference
https://velog.io/@yesterdaykite/4.-Theory-of-Secure-Communication-part-1