For symmetric cipher systems one can prove unconditional security
against an opponent with unlimited computational power. The proof is
based on the notion of entropy. However, this measure does not provide a
satisfactory framework for the analysis of public key cryptosystems,
whose security always rests on cryptographic assumptions. The problem is that Shannon's
definition of information is a purely statistical notion, which ignores
the computational difficulty of extracting the information.
The public key and the ciphertext together contain all the Shannon
information concerning the plaintext, but the information is
computationally inaccessible. So, we face this intriguing question: what
is accessible information? In this talk we present a first attempt to
answer this question based on algorithmic entropy, also known as
Kolmogorov complexity.
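The gap between statistical and algorithmic information can be illustrated with a minimal sketch (not from the talk; the seed and string length are arbitrary choices). A pseudorandom string is statistically indistinguishable from maximal-entropy data, yet it is fully determined by a short seed plus a short program, so its Kolmogorov complexity is tiny:

```python
import math
import random
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Empirical Shannon entropy of the byte distribution, in bits per byte."""
    counts = Counter(data)
    n = len(data)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

# Fully determined by a fixed seed plus this short program, so the
# algorithmic (Kolmogorov) description of `pseudo` is a few hundred bits.
rng = random.Random(42)
pseudo = bytes(rng.randrange(256) for _ in range(100_000))

# Statistically, though, it looks like maximal-entropy data:
h = shannon_entropy(pseudo)
print(f"empirical entropy: {h:.3f} bits/byte (maximum is 8.000)")
```

Shannon entropy, being purely distributional, assigns this string nearly the maximum 8 bits per byte; an algorithmic measure charges it only the length of its shortest description.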
