Randomness merger


Short description: Function in extractor theory

In extractor theory, a randomness merger is a function which extracts randomness from a set of random variables, provided that at least one of them is uniformly random. Its name stems from the fact that it can be seen as a procedure which "merges" all the variables into one while preserving at least some of the entropy contained in the uniformly random variable. Mergers are used in the explicit construction of randomness extractors.

Intuition and definition

Consider a set of [math]\displaystyle{ k }[/math] random variables, [math]\displaystyle{ X_1,\ldots,X_k }[/math], each distributed over [math]\displaystyle{ \{0,1\}^n }[/math], at least one of which is uniformly random; it is not known which one. Furthermore, the variables may be arbitrarily correlated: they may be functions of one another, they may be constant, and so on. However, since at least one of them is uniform, the set as a whole contains at least [math]\displaystyle{ n }[/math] bits of entropy.

The job of the merger is to output a new random variable, also distributed over [math]\displaystyle{ \{0,1\}^n }[/math], that retains as much of that entropy as possible. If it were known which of the variables is uniform, that variable could simply be used as the output; since this information is not available, the merger must do almost as well without it. The idea behind mergers is that with a small additional random seed it is possible to obtain a good result even without knowing which variable is uniform.

Definition (merger):

A function [math]\displaystyle{ M : (\{0,1\}^n)^k \times \{0,1\}^d \rightarrow \{0,1\}^n }[/math] is called an [math]\displaystyle{ (m,\varepsilon) }[/math]-merger if for every set of random variables [math]\displaystyle{ (X_1,\ldots,X_k) }[/math], each distributed over [math]\displaystyle{ \{0,1\}^n }[/math], at least one of which is uniform, the distribution of [math]\displaystyle{ Z = M(X_1,\ldots,X_k, U_d) }[/math] has smooth min-entropy [math]\displaystyle{ H_\infty^\varepsilon(Z) \geq m }[/math]. The variable [math]\displaystyle{ U_d }[/math] denotes the uniform distribution over [math]\displaystyle{ d }[/math] bits, independent of the [math]\displaystyle{ X_i }[/math], and represents a truly random seed.

In other words, using a short uniform seed of length [math]\displaystyle{ d }[/math], the merger returns a string which is [math]\displaystyle{ \varepsilon }[/math]-close to having min-entropy at least [math]\displaystyle{ m }[/math]; that is, its statistical distance from some distribution with min-entropy at least [math]\displaystyle{ m }[/math] is no larger than [math]\displaystyle{ \varepsilon }[/math].

Reminder: There are several ways of measuring the randomness of a distribution; the min-entropy of a random variable [math]\displaystyle{ Z }[/math] is the largest number [math]\displaystyle{ b }[/math] such that every value of [math]\displaystyle{ Z }[/math] occurs with probability at most [math]\displaystyle{ 2^{-b} }[/math], i.e. [math]\displaystyle{ H_\infty(Z) = -\log_2 \max_z \Pr[Z = z] }[/math]. The min-entropy of a distribution is an upper bound on the amount of randomness that can be extracted from it.[1]
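
To make the quantities in the definition concrete, the following is a minimal sketch in Python (the function names min_entropy and statistical_distance are illustrative choices, not taken from the references) that computes the min-entropy of an explicitly given finite distribution and the statistical distance between two such distributions.

    import math

    def min_entropy(dist):
        """Min-entropy: -log2 of the probability of the most likely value."""
        return -math.log2(max(dist.values()))

    def statistical_distance(p, q):
        """Half the L1 distance between two distributions on the same finite set."""
        support = set(p) | set(q)
        return 0.5 * sum(abs(p.get(z, 0.0) - q.get(z, 0.0)) for z in support)

    # Example: a distribution over 2-bit strings whose heaviest outcome has
    # probability 1/2, so its min-entropy is exactly 1 bit.
    dist = {"00": 0.5, "01": 0.25, "10": 0.125, "11": 0.125}
    uniform = {z: 0.25 for z in dist}
    print(min_entropy(dist))                     # 1.0
    print(statistical_distance(dist, uniform))   # 0.25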

Parameters

There are three parameters to optimize when building mergers:

  1. The output's min-entropy [math]\displaystyle{ m }[/math] should be as high as possible, for then more bits can be extracted from it.
  2. [math]\displaystyle{ \varepsilon }[/math] should be as small as possible, for then after applying an extractor to the merger's output, the result will be closer to uniform.
  3. The seed length [math]\displaystyle{ d }[/math] should be as small as possible, for then the merger requires fewer initial truly random bits to work.

Explicit constructions of mergers with relatively good parameters are known. For example, the construction of Dvir and Wigderson gives the following:[2] for every [math]\displaystyle{ \alpha \gt 0 }[/math] and integer [math]\displaystyle{ n }[/math], if [math]\displaystyle{ k \leq 2^{o(n)} }[/math], there exists an explicit [math]\displaystyle{ (m,\varepsilon) }[/math]-merger [math]\displaystyle{ M : (\{0,1\}^n)^k \times \{0,1\}^d \rightarrow \{0,1\}^n }[/math] such that:

  1. [math]\displaystyle{ m = (1-\alpha)n, }[/math]
  2. [math]\displaystyle{ d = O(\log(n) + \log(k)), }[/math]
  3. [math]\displaystyle{ \varepsilon = O\left(\frac{1}{n \cdot k}\right). }[/math]

The proof is constructive and yields a merger that can be computed in time polynomial in the given parameters.
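
The construction in [2] is based on a "curve merger": each input block is viewed as a vector over a finite field, the unique degree-[math]\displaystyle{ (k-1) }[/math] curve passing through the [math]\displaystyle{ k }[/math] blocks is formed, and the output is that curve evaluated at a point selected by the seed. The sketch below is only a toy illustration of this idea over a small prime field (the field size 257 and the vector length are arbitrary choices for the example); the parameter choices and the entropy analysis of the actual construction are in the paper.

    import random

    Q = 257  # a small prime field; the real construction chooses the field size carefully

    def lagrange_coeffs(t, k):
        """Return L_0(t), ..., L_{k-1}(t), where L_i is the Lagrange basis
        polynomial over F_Q through the interpolation points 0, 1, ..., k-1."""
        coeffs = []
        for i in range(k):
            c = 1
            for j in range(k):
                if j != i:
                    c = c * (t - j) % Q               # numerator factor (t - j)
                    c = c * pow(i - j, Q - 2, Q) % Q  # divide by (i - j) via Fermat inverse
            coeffs.append(c)
        return coeffs

    def curve_merger(blocks, t):
        """Evaluate, at the seed point t, the degree-(k-1) curve through the k
        blocks (each a vector over F_Q); t = i returns blocks[i] exactly."""
        k, n = len(blocks), len(blocks[0])
        L = lagrange_coeffs(t, k)
        return [sum(L[i] * blocks[i][pos] for i in range(k)) % Q for pos in range(n)]

    # Two correlated blocks in F_Q^4: the first plays the role of the uniform
    # one, the second is a fixed function of it.  The seed is a uniformly
    # random point of F_Q (roughly log2(Q) seed bits).
    x1 = [random.randrange(Q) for _ in range(4)]
    x2 = [(7 * v + 3) % Q for v in x1]
    seed = random.randrange(Q)
    print(curve_merger([x1, x2], seed))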

Usage

Mergers can be used to construct randomness extractors with good parameters. Recall that an extractor is a function which takes a random variable with high min-entropy (together with a short uniform seed) and returns a shorter random variable that is close to uniform. An extractor for sources of arbitrary min-entropy can be obtained using the following merger-based scheme:[2][3]

  • Given a source of high min-entropy, partition it into blocks. By a known result,[4] at least one of these blocks will also have high min-entropy as a block-source.
  • Apply a block extractor to each block separately. This is a weaker kind of extractor, for which good constructions are known.[2] Since at least one of the blocks has high min-entropy, at least one of the outputs is very close to uniform.
  • Use the merger to combine all the previous outputs into one string. If a good merger is used, the resulting string will have very high min-entropy relative to its length.
  • Use a known extractor that works only for sources of very high min-entropy to extract the final randomness.

The essence of the scheme above is to use the merger to transform a string with arbitrary min-entropy into a shorter string without losing much min-entropy in the process. The new string has very high min-entropy relative to its length, so it is then possible to apply previously known extractors, which only work for such strings.
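
Schematically, the scheme composes four components. The sketch below fixes only that composition; the four callables (a partition routine, a block extractor, a merger such as the one sketched earlier, and a high-min-entropy extractor) are hypothetical placeholders that would have to be instantiated with concrete constructions from the references.

    def merger_based_extractor(source_bits, seeds, k,
                               partition, block_extractor, merger, final_extractor):
        """Compose the four steps of the scheme.  The four callables are
        hypothetical placeholders for concrete constructions; this sketch only
        fixes how they are wired together."""
        # 1. Split the weak source into k blocks; at least one block retains
        #    high min-entropy as a block-source.
        blocks = partition(source_bits, k)
        # 2. Run the block extractor on each block; at least one output is
        #    close to uniform, but which one is unknown.
        candidates = [block_extractor(b, seeds["block"]) for b in blocks]
        # 3. Merge the candidates; a good merger keeps most of the entropy of
        #    the (unknown) good candidate.
        merged = merger(candidates, seeds["merge"])
        # 4. Finish with an extractor that only handles sources of very high
        #    min-entropy, which suffices after merging.
        return final_extractor(merged, seeds["final"])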


References

  1. De, Anindya; Portmann, Christopher; Vidick, Thomas; Renner, Renato (2009). "Trevisan's extractor in the presence of quantum side information". SIAM Journal on Computing 41 (4): 915–940. doi:10.1137/100813683. Section 2.2.
  2. Dvir, Zeev; Wigderson, Avi. "Kakeya sets, new mergers and old extractors". http://www.cs.princeton.edu/~zdvir/papers/DvirWigderson08.pdf.
  3. Nisan, Noam; Ta-Shma, Amnon. "Extracting Randomness: A Survey and New Constructions". http://www.cs.tau.ac.il/~amnon/Papers/NT.jcss99.pdf. Section 4.3.
  4. Ta-Shma, Amnon. "Refining Randomness". PhD thesis. http://www.cs.tau.ac.il/~amnon/Papers/T.thesis96.ps.



