4. Theory of Secure Communication - part 1


💬 Overview

  • get-in : birthday paradox, digital signature susceptibility
  • model of secure system
    - condition for perfect secrecy
  • Shannon entropy
  • equivocation
  • perfect secrecy (Priori & Posterior Probability)
  • redundancy
  • unicity distance
  • complexity theory
  • 🪜 Get in


    Birthday paradox

    about a 50% chance of two people sharing a birthday when there are only 23 people!
    P(n) = prob. that at least two of n people share a birthday
    P(n) = 1 - P'(n)
    (P'(n) = prob. that no two of the n people share a birthday)
    P'(1) = 1
    P'(2) = 1 * (365-1)/365
    P'(3) = 1 * (365-1)/365 * (365-2)/365
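The product above can be computed directly; a minimal Python sketch, assuming 365 equally likely birthdays (no leap years):

```python
# Birthday paradox: P(n) = 1 - P'(n), where P'(n) is the probability
# that all n birthdays are distinct (365 equally likely days assumed).

def p_all_distinct(n: int) -> float:
    """P'(n): probability that no two of n people share a birthday."""
    p = 1.0
    for i in range(n):
        p *= (365 - i) / 365
    return p

def p_shared(n: int) -> float:
    """P(n) = 1 - P'(n): probability of at least one shared birthday."""
    return 1.0 - p_all_distinct(n)

print(f"P(23) = {p_shared(23):.4f}")  # just over 50% at 23 people
```

Note that 22 people is not quite enough: P(22) is still below one half, and 23 is the first n that crosses 50%.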

    Digital Signature Susceptibility


    sensitivity

    🍦 Model of contents



    ✨Shannon theory


    communication theory of secrecy system

    similar to the "noisy channel problem"
    i.e., a user wishing to send a message over a noisy channel must add redundancy to allow detection/correction of errors.
  • noisy channel: M is transmitted; noise turns M into C; by detecting and correcting the errors, C is restored to M
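As a toy illustration of adding redundancy for error correction (a 3x repetition code with majority vote — my example, not part of the notes):

```python
# Redundancy on a noisy channel via a 3x repetition code: each bit of M
# is sent three times; the receiver takes a majority vote, so any single
# flipped bit per triple is detected and corrected.

def encode(bits):
    return [b for b in bits for _ in range(3)]

def decode(coded):
    out = []
    for i in range(0, len(coded), 3):
        triple = coded[i:i + 3]
        out.append(1 if sum(triple) >= 2 else 0)
    return out

msg = [1, 0, 1, 1]
sent = encode(msg)
sent[4] ^= 1                  # channel noise flips one bit
assert decode(sent) == msg    # the error is corrected: M is restored
```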

    ✨PERFECT SECRECY - Priori & Posterior Probability


    ✔️ in words..


    before encryption (prior): the probability of each key and each message;
    after encryption (posterior): the probabilities the attacker can compute for the possible keys and messages via the ciphertext
  • Pi : key probability : a key Ki is chosen from the set K of all keys with prior probability Pi
  • Qj : message probability : a message Mj is chosen from the set M of all messages with prior probability Qj
    (i.e., if the data is 16 bits, there are 2^16 possible messages.)
  • C = E(Mj, Ki)
    => an attacker who intercepts C can compute the posterior probabilities of Mj and Ki
    => the posterior probability represents the attacker's knowledge
  • ✔️ in calc..


  • P(M|C) = prob. that an attacker who intercepts C correctly deciphers M = prob. of M given C intercepted
  • P(C|M) = prob. that M produces C = prob. of C generated by M
  • P(M) = prob. of M being selected
  • P(C) = prob. of obtaining C
  • ✔️ Condition for PERFECT SECRECY


    🤔 IF P(M|C) = P(M) for all C & M
    the attacker gains NO info by intercepting C
    = even if C is intercepted, the probability of each (randomly chosen) M stays the same
    = intercepting C yields no information about M
    🤭 So Perfect Secrecy is
    1. P(M|C) = P(M) for all C & all M
    2. P(C|M) = P(C) for all C & all M
    🤭 So a Perfect System has
    = at least as many C's as M's
    = at least one K transforming any M into each of these C's.
    this makes guessing impossible XXD
    these are necessary and sufficient conditions!
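A minimal sketch of the condition, using a one-bit one-time pad C = M XOR K with a uniform key; the skewed message prior below is an arbitrary assumption, chosen to show that P(M|C) = P(M) holds regardless of the message distribution:

```python
from fractions import Fraction

# One-time pad on a single bit: C = M XOR K, with K uniform.
# Verifies Shannon's condition P(M|C) = P(M) for every (M, C) pair,
# even for a (deliberately) nonuniform message prior.

P_M = {0: Fraction(3, 4), 1: Fraction(1, 4)}   # message prior (assumed)
P_K = {0: Fraction(1, 2), 1: Fraction(1, 2)}   # uniform key

# Joint distribution over (M, C): P(M, C) = P(M) * P(K = M xor C)
joint = {(m, m ^ k): P_M[m] * P_K[k] for m in P_M for k in P_K}
P_C = {c: sum(p for (m2, c2), p in joint.items() if c2 == c) for c in (0, 1)}

for (m, c), p in joint.items():
    posterior = p / P_C[c]        # P(M | C)
    assert posterior == P_M[m]    # intercepting C tells the attacker nothing
print("P(M|C) = P(M) for all M, C -> perfect secrecy")
```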

    ✨ Shannon Entropy


    ✔️ definition


    measure of unpredictability of a random variable
    △ the higher the entropy, the harder it is to predict!

    ✔️ Examples


    1) Political Poll
  • Entropy (H) is large : round 1 - the outcome of the poll is not already known: 'unpredictable'
    i.e.) learning the outcome of the poll/gives/some new info!
  • Entropy (H) is small : round 2 - the same poll performed again after the first round
    i.e.) the outcome of the first-round poll is/already known/and the second poll can be predicted.
  • 2) Coin Flipping (fair coin case)
  • Entropy (H) = 1 = completely unpredictable
  • P(head) = P(tail) = 1/2 -> no way to predict!
  • ✔️ Calc..


    Message Entropy
    = amount of info produced/when a message is chosen

    Key Entropy
    = amount of info produced/when a key is chosen
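These quantities follow the usual formula H = -Σ p·log2(p); a small sketch (the example distributions are my assumptions, matching the poll, coin, and 16-bit-message examples above):

```python
import math

def entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Fair coin: completely unpredictable -> H = 1 bit
print(entropy([0.5, 0.5]))              # 1.0

# Second-round poll: outcome nearly known -> H is small
print(entropy([0.99, 0.01]))            # ~0.08 bits

# 16-bit message chosen uniformly from 2**16 possibilities -> H = 16 bits
print(entropy([1 / 2**16] * 2**16))     # 16.0
```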

    ✔️ Observation


  • the amount of uncertainty introduced into the system/cannot be greater than the key uncertainty.

  • K uncertainty is at least M uncertainty
    (K entropy >= M entropy) = M info is completely hidden
    occurs when all M's are equally probable (i.e. totally random)
  • Equivocation


    ✔️ Definition


    Conditional entropies of K and M
    (i.e., unpredictability of K and M given an intercepted C.)

    ✔️ Properties


    N = length of the intercepted portion of C
    the key equivocation H(K | C_1..C_N) is a non-increasing function of N

    i.e., it's theoretically easier to determine the key as more C is intercepted: the more of C we capture, the easier K is to pin down.
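A toy check of this property, assuming (purely for illustration) a shift cipher over a 4-letter alphabet with a uniform key and skewed i.i.d. letter frequencies:

```python
import math
from itertools import product

# Key equivocation H(K | C_1..C_N) for a toy shift cipher:
# C_i = (M_i + K) mod 4, uniform key, nonuniform i.i.d. letters.
# The sequence should start at H(K) = 2 bits and never increase.

A = 4
P_K = [1 / A] * A
P_L = [0.5, 0.25, 0.15, 0.10]     # skewed letter frequencies (assumption)

def key_equivocation(n):
    """H(K | C_1..C_n) = sum over c of P(c) * H(K | c), in bits."""
    h = 0.0
    for c in product(range(A), repeat=n):
        # P(c, k) = P(k) * prod_i P(M_i = (c_i - k) mod A)
        joint = [P_K[k] * math.prod(P_L[(ci - k) % A] for ci in c)
                 for k in range(A)]
        pc = sum(joint)
        h += sum(-p * math.log2(p / pc) for p in joint if p > 0)
    return h

eq = [key_equivocation(n) for n in range(4)]
print(eq)   # non-increasing: more intercepted C narrows down K
assert all(a >= b for a, b in zip(eq, eq[1:]))
```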

    ✔️ For perfect secrecy!


    Mutual Info
    - I(M;C) = amount of info about M gained given C
    - I(K;C) = amount of info about K gained given C
    I(M;C) = H(M) - H(M|C)
    "new info revealed by knowing C"
    = "uncertainty" - "uncertainty remaining after knowing C"
    How to achieve perfect secrecy?

    I(M;C) = H(M) - H(M|C) = 0
    H(M) = H(M|C)
    = C and M are statistically independent
    also
    H(M) <= H(K)
    = the K entropy must be at least the M entropy!
    = num of M bits <= num of K bits
    since..

    otherwise,
    i.e., if H(M) > H(K),
    the cryptanalyst can obtain a lot of info about the plaintext.
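A toy illustration of that leak: a 2-bit uniform message (H(M) = 2 bits) encrypted with only a 1-bit key (H(K) = 1 bit). The cipher C = M XOR (K * 0b11) is an assumption chosen just to make the numbers computable:

```python
import math

# With H(M) > H(K) the ciphertext must leak: here I(M;C) = H(M) - H(M|C)
# comes out strictly positive, quantifying the cryptanalyst's gain.

joint = {}                              # P(M, C); message and key uniform
for m in range(4):
    for k in (0, 1):
        c = m ^ (0b11 * k)              # toy cipher (assumption)
        joint[(m, c)] = joint.get((m, c), 0.0) + 1 / 8

P_C = {}
for (m, c), p in joint.items():
    P_C[c] = P_C.get(c, 0.0) + p

# H(M|C) = sum_c P(c) * H(M | C = c)
H_M_given_C = 0.0
for c, pc in P_C.items():
    for (m, c2), p in joint.items():
        if c2 == c:
            post = p / pc               # P(M = m | C = c)
            H_M_given_C += pc * (-post * math.log2(post))

leak = 2.0 - H_M_given_C                # I(M;C) = H(M) - H(M|C)
print(f"I(M;C) = {leak:.1f} bit")       # a full bit of plaintext info leaks
```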