
- Flávio du Pin Calmon, Mayank Varia, Muriel Médard, Mark M. Christiansen, Ken R. Duffy, Stefano Tessaro
- 2013 51st Annual Allerton Conference on…
- 2013

Lower bounds for the average probability of error of estimating a hidden variable X given an observation of a correlated random variable Y, and Fano's inequality in particular, play a central role in information theory. In this paper, we present a lower bound for the average estimation error based on the marginal distribution of X and the principal inertias…
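The paper's bound generalizes Fano's inequality, whose standard form is H(X|Y) ≤ h_b(P_e) + P_e log₂(|X|−1). A minimal sketch of how that classical inequality is turned into a lower bound on the error probability; the helper names and the grid-search approach are illustrative, not from the paper:

```python
import math

def binary_entropy(p):
    """h_b(p) in bits; zero at the endpoints."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def fano_lower_bound(h_cond, m, grid=100000):
    """Smallest P_e with h_b(P_e) + P_e*log2(m-1) >= h_cond, by grid search.

    By Fano's inequality, every estimator of an m-valued X with
    H(X|Y) = h_cond bits errs with probability at least this value.
    """
    log_m1 = math.log2(m - 1)
    for i in range(grid + 1):
        pe = i / grid
        if binary_entropy(pe) + pe * log_m1 >= h_cond:
            return pe
    return 1.0

# With H(X|Y) = 1.5 bits and |X| = 4, any estimator errs fairly often.
print(fano_lower_bound(1.5, 4))
```

The bound grows with the conditional entropy: the less Y reveals about X, the larger the unavoidable error.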

- Mark M. Christiansen, Ken R. Duffy
- IEEE Transactions on Information Theory
- 2013

How hard is it to guess a password? Massey showed that a simple function of the Shannon entropy of the distribution from which the password is selected is a lower bound on the expected number of guesses, but one which is not tight in general. In a series of subsequent papers under ever less restrictive stochastic assumptions, an asymptotic relationship as…
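Massey's bound can be checked numerically. Assuming the usual setup, where an attacker queries words in decreasing probability order, and taking the bound in its common form E[G] ≥ 2^H/4 + 1 (valid for H ≥ 2 bits), a short sketch:

```python
import math

def expected_guesswork(probs):
    # Optimal attacker guesses in decreasing-probability order;
    # E[G] = sum over ranks i = 1, 2, ... of i * p_(i).
    ordered = sorted(probs, reverse=True)
    return sum(i * p for i, p in enumerate(ordered, start=1))

def shannon_entropy(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

def massey_bound(probs):
    # Massey's lower bound E[G] >= 2**H / 4 + 1, valid when H >= 2 bits.
    return 2 ** shannon_entropy(probs) / 4 + 1

uniform = [1 / 8] * 8               # H = 3 bits
print(expected_guesswork(uniform))  # (8 + 1) / 2 = 4.5
print(massey_bound(uniform))        # 3.0
```

Even in this best case for the bound (a uniform distribution), 3.0 < 4.5: the bound holds but is not tight, which is the gap the guesswork literature addresses.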

- Flávio du Pin Calmon, Muriel Médard, Linda M. Zeger, João Barros, Mark M. Christiansen, Ken R. Duffy
- 2012 50th Annual Allerton Conference on…
- 2012

We present a new information-theoretic definition and associated results, based on list decoding in a source coding setting. We begin by presenting list-source codes, which naturally map a key length (entropy) to list size. We then show that such codes can be analyzed in the context of a novel information-theoretic metric, ϵ-symbol secrecy, that…

A string is sent over a noisy channel that erases some of its characters. Knowing the statistical properties of the string's source and which characters were erased, a listener equipped with the ability to test the veracity of a string, one string at a time, wishes to fill in the missing pieces. Here we characterize the influence of the stochastic…

- Mark M. Christiansen, Ken R. Duffy, Flávio du Pin Calmon, Muriel Médard
- 2013 IEEE International Symposium on Information…
- 2013

Consider the situation where a word is chosen probabilistically from a finite list. If an attacker knows the list and can inquire about each word in turn, then selecting the word via the uniform distribution maximizes the attacker's difficulty, its Guesswork, in identifying the chosen word. It is tempting to use this property in cryptanalysis of…
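The maximizing property of the uniform distribution is easy to verify numerically. A minimal sketch (the skewed distribution is illustrative):

```python
def expected_guesswork(probs):
    # Attacker queries the n words in decreasing-probability order;
    # E[G] = sum over ranks i = 1, 2, ... of i * p_(i).
    ordered = sorted(probs, reverse=True)
    return sum(i * p for i, p in enumerate(ordered, start=1))

n = 8
uniform = [1.0 / n] * n                      # E[G] = (n + 1) / 2 = 4.5
skewed = [0.5] + [0.5 / (n - 1)] * (n - 1)   # one likely word: E[G] = 3.0

assert expected_guesswork(uniform) > expected_guesswork(skewed)
```

Over any fixed list of n words the uniform choice attains the maximum E[G] = (n + 1)/2; any skew toward particular words lets the attacker finish sooner on average.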

We present information-theoretic definitions and results for analyzing symmetric-key encryption schemes beyond the perfect secrecy regime, i.e., when perfect secrecy is not attained. We adopt two lines of analysis, one based on lossless source coding, and another akin to rate-distortion theory. We start by presenting a new information-theoretic metric for…

- Mark M. Christiansen, Ken R. Duffy, Flávio du Pin Calmon, Muriel Médard
- IEEE Transactions on Information Theory
- 2015

The guesswork problem was originally motivated by a desire to quantify computational security for single user systems. Leveraging recent results from its analysis, we extend the remit and utility of the framework to the quantification of the computational security of multi-user systems. In particular, assume that V users independently select strings…

- Ahmad Beirami, A. Robert Calderbank, Mark M. Christiansen, Ken R. Duffy, Ali Makhdoumi, Muriel Médard
- 2015 53rd Annual Allerton Conference on…
- 2015

Guesswork is the position at which a random string drawn from a given probability distribution appears in the list of strings ordered from the most likely to the least likely. We define the tilt operation on probability distributions and show that it parametrizes an exponential family of distributions, which we refer to as the tilted family of the source. …
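Assuming the standard exponential-tilting definition, p_β(x) ∝ p(x)^β, the tilt operation can be sketched as:

```python
def tilt(probs, beta):
    # Tilted member of the exponential family: p_beta(x) ∝ p(x)**beta,
    # renormalized to sum to one.
    powered = [p ** beta for p in probs]
    z = sum(powered)
    return [q / z for q in powered]

p = [0.5, 0.3, 0.2]
print(tilt(p, 1.0))   # beta = 1 recovers the source distribution
print(tilt(p, 0.0))   # beta = 0 gives uniform over the support
print(tilt(p, 50.0))  # large beta concentrates on the most likely string
```

Sweeping β traces out the one-parameter family the abstract refers to: small β flattens the source toward uniform, while large β concentrates it on its mode.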
