Pseudorandom Functions (PRFs) are among the most fundamental cryptographic primitives, with applications as basic as private-key encryption, identification, and authentication, and as sophisticated as indistinguishability obfuscation (through the notion of puncturable PRFs). PRFs were introduced in the eighties by Oded Goldreich, Shafi Goldwasser, and Silvio Micali in their seminal paper “How to Construct Random Functions.” A large fraction of papers in cryptography rely on the GGM paper, either directly or indirectly. In particular, each one of the papers in my PhD dissertation owes its existence to GGM. So where’s the story in a paper that is universally admired and a notion that is clearly fundamental? While preparing my talk for OdedFest, I got to reread the paper and was amazed to see how much effort the authors had to put into motivating (or even justifying) PRFs. What we now take for granted was not always as obvious (and indeed, the paper was rejected from the first conference it was submitted to).

Let me first loosely define PRFs. Consider, for example, a family of Boolean functions $F^n$ on $n$-bit strings that is efficient: it is easy (polynomial time in $n$) to sample a key $s$ that describes a random function $f_s$ from the family, and given $s$ and $x$ it is easy to evaluate $f_s(x)$ (in fact, the definition only makes sense if we have an ensemble of such families, one for every value of $n$). Such a family is pseudorandom if it is hard to distinguish $f_s$ from a completely random (uniformly chosen) Boolean function on $n$ bits. In other words, a distinguisher that gets black-box (oracle) access to a function $g$ cannot tell whether it is $f_s$ or a completely random function.
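To make the interface concrete, here is a minimal sketch of such a keyed Boolean function family in Python. It uses HMAC-SHA256 (truncated to one bit) as a standard heuristic stand-in for a PRF; this is purely illustrative and is not the GGM construction, and the function names are mine:

```python
import hashlib
import hmac
import secrets


def sample_key(nbytes: int = 32) -> bytes:
    """Sample a uniformly random key s -- the short description of f_s."""
    return secrets.token_bytes(nbytes)


def f(s: bytes, x: bytes) -> int:
    """Evaluate the Boolean function f_s(x): a single pseudorandom bit.

    HMAC-SHA256 serves here as a heuristic PRF; in the formal definition,
    no efficient distinguisher with oracle access should be able to tell
    f_s from a uniformly random Boolean function.
    """
    return hmac.new(s, x, hashlib.sha256).digest()[0] & 1


s = sample_key()
bit = f(s, b"some n-bit input")
assert bit in (0, 1)
# Evaluation is deterministic given s: same key and input, same bit.
assert bit == f(s, b"some n-bit input")
```

The point of the interface is that the short key $s$ determines the entire (exponentially large) truth table of $f_s$, and evaluation at any input is cheap.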

PRFs allow parties to effectively share an exponential number of pseudorandom bits (by sharing $s$ the parties get access to the exponentially-many outputs of $f_s$). Very powerful. Yet, GGM had to explain why Kolmogorov-Complexity based definitions are different, why PRFs are different from one-way functions, and why PRFs are different from pseudorandom-bit generators. I can’t imagine any paper using PRFs today that would have to explain such basic distinctions. But even more interestingly, GGM goes to great lengths to explain why PRFs are different from the following stateful simulation of a black-box random function: Assume that every past query and its answer is recorded. For every new query $q_i$, check if the query was previously asked and if so answer consistently; otherwise answer with a random bit and record the new pair. While I imagine that such a simulation could be useful in some cases, the possibility of such a simulation takes nothing away from the importance of PRFs. Obvious, right? Well, it clearly wasn’t obvious to an anonymous reviewer when the paper was first submitted (and rejected).
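The stateful simulation described above (often called lazy sampling) is only a few lines of code; this sketch and its class name are mine:

```python
import secrets


class LazyRandomFunction:
    """Stateful simulation of a random Boolean function via lazy sampling.

    Every (query, answer) pair is recorded: repeated queries are answered
    consistently, and each new query gets a fresh uniform bit.
    """

    def __init__(self) -> None:
        self.table: dict[bytes, int] = {}

    def query(self, q: bytes) -> int:
        if q not in self.table:
            self.table[q] = secrets.randbits(1)  # fresh uniform bit
        return self.table[q]


g = LazyRandomFunction()
b = g.query(b"q1")
assert g.query(b"q1") == b  # consistent on repeated queries
```

The contrast with a PRF is exactly the point: this simulator needs state that grows with the number of queries and cannot be compactly shared between parties, whereas a PRF fixes the whole function with one short key $s$.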

The moral of the story is not that conference committees can be stupid, but rather that, as in any other area of life, the work of the pioneers is hard. They need to justify and convince by explaining things that in retrospect seem obvious. And perhaps that is OK. After all, for every GGM there are plenty of papers offering ridiculous new notions. Nevertheless, where would we be without those innovators who keep on breaking new ground for the rest of us to follow?

#### 5 Comments on Struggles of the Pioneers

1. Grigory Yaroslavtsev // June 6, 2017 at 7:57 am

Very true!


2. Lorain // June 6, 2017 at 10:03 am

Interesting. What is g? And what is a “key” s?


• Omer Reingold // June 6, 2017 at 10:11 am

The key $s$ is simply a description of a function from the family $F^n$, and $g$ is the function the distinguisher interacts with (which can be either uniformly chosen or from the family $F^n$).


3. Hung Le // June 9, 2017 at 12:13 pm

Should “Kolmagorov-Complexity” be Kolmogorov-Complexity?
