Article

True randomness from realistic quantum devices

Authors: Daniela Frauchiger, Renato Renner, Matthias Troyer

Abstract

Even if the output of a Random Number Generator (RNG) is perfectly uniformly distributed, it may be correlated to pre-existing information and therefore be predictable. Statistical tests are thus not sufficient to guarantee that an RNG is usable for applications, e.g., in cryptography or gambling, where unpredictability is important. To enable such applications a stronger notion of randomness, termed "true randomness", is required, which includes independence from prior information. Quantum systems are particularly suitable for true randomness generation, as their unpredictability can be proved based on physical principles. Practical implementations of Quantum RNGs (QRNGs) are however always subject to noise, i.e., influences which are not fully controlled. This reduces the quality of the raw randomness generated by the device, making it necessary to post-process it. Here we provide a framework to analyse realistic QRNGs and to determine the post-processing that is necessary to turn their raw output into true randomness.


... two-universal hashing, for which some further random bits are needed. For details, see [1] and references therein. ...
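For orientation, the amount of output that two-universal hashing can safely produce is governed by the leftover hash lemma; up to formulation-dependent constants, a raw string $X$ with conditional min-entropy $H_{\min}(X|E)$ can be hashed to

\[
\ell \;\approx\; H_{\min}(X\mid E) \;-\; 2\log_2\frac{1}{\varepsilon}
\]

bits that are $\varepsilon$-close to uniform and independent of the side information $E$.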
... where $P_{\mathrm{id}}$ is the distribution of the ideal measurement, $\eta \in [0, 1]$ is a quality parameter, $n_o$ is the number of outcomes, and $p_a$ is the input distribution. In order to characterize the detector, we make use of the tomographically complete qubit state set $\{|+\rangle, |0\rangle, |1\rangle, |{+i}\rangle\}$, corresponding to the $\pm 1$-eigenstates of Pauli $\sigma_z$ and the $+1$-eigenstates of $\sigma_x$, $\sigma_y$. ...
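As an illustration only (the quoted equation itself is not reproduced above), a common quality model consistent with the listed quantities mixes the ideal statistics with detector noise,

\[
P(b \mid a) \;=\; \eta\, P_{\mathrm{id}}(b \mid a) \;+\; (1-\eta)\,\frac{1}{n_o},
\]

where the uniform term stands in for the noise contribution and may, in the original equation, additionally depend on the input distribution $p_a$.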
... Consider the case of the observed distribution (22), where $P_{\mathrm{id}}$ corresponds to the statistics of a $\sigma_x$-measurement. For any $\alpha \in [0, 1]$, the two sent states are drawn from the set $\{|\phi_\alpha\rangle, |\psi_\alpha\rangle\}$ with ...
Article
Measurements of quantum systems can be used to generate classical data that is truly unpredictable for every observer. However, this true randomness needs to be discriminated from randomness due to ignorance or lack of control of the devices. We analyze the randomness gain of a measurement-device-independent setup, consisting of a well-characterized source of quantum states and a completely uncharacterized and untrusted detector. Our framework generalizes previous schemes as arbitrary input states and arbitrary measurements can be analyzed. Our method is used to suggest simple and realistic implementations that yield high randomness generation rates of more than one random bit per qubit for detectors of sufficient quality.
... Consequently, the first step for analysing our experiment is to carefully calibrate and model the realistic photodiodes, which output noisy voltage measurements rather than exact photon numbers. More formally, following the approach of [35], we model the POVM describing our noisy, characterised measurements as a projective measurement on a larger system. For the case of our detectors (see Fig. 5 in Appendix B for a cohesive summary), the measured voltages are modelled as follows. ...
... That is, whether the output of the protocol can then be used as an input to other cryptographic protocols without compromising the security. For example, it can be input to a randomness extractor along with a seed to achieve certified randomness expansion using well known techniques [35,42]. It is still unknown whether fully device-independent protocols are composably secure without extra assumptions, e.g. ...
... Whilst the length of the seed must be chosen proportional to m, it only has to be generated once and can be safely reused to hash arbitrarily many blocks, meaning that the initial random seed can be used to generate an unbounded amount of randomness. This also means that the seed can be hard-coded into the hashing device (for a further discussion and an explicit implementation, see [35]). Other quantum-secure methods, such as the Trevisan extractor, are more efficient in the length of the required seed. ...
Preprint
Full-text available
A remarkable aspect of quantum theory is that certain measurement outcomes are entirely unpredictable to all possible observers. Such quantum events can be harnessed to generate numbers whose randomness is asserted based upon the underlying physical processes. We formally introduce and experimentally demonstrate an ultrafast optical quantum randomness generator that uses a totally untrusted photonic source. While considering completely general quantum attacks, we certify randomness at a rate of $1.1\,\mathrm{Gbps}$ with a rigorous security parameter of $10^{-20}$. Our security proof is entirely composable, thereby allowing the generated randomness to be utilised for arbitrary applications in cryptography and beyond.
... Using the methods proposed by Frauchiger et al. [4] to evaluate the amount of quantum randomness of our device, we evaluate the min-entropy of the output distribution conditioned on the distribution of all the predictable side information. In our case, the side information is modeled as classical and determined by the random variable E. The conditional min-entropy that we need to compute is thus the following: ...
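For classical side information $E$, the standard (guessing-probability) form of this quantity is

\[
H_{\min}(X \mid E) \;=\; -\log_2 \sum_{e} P_E(e)\, \max_{x} P_{X\mid E=e}(x),
\]

i.e. minus the logarithm of the probability of guessing $X$ correctly given $E$; the length of the extractable output is then bounded by this quantity.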
... Davide Rusca thanks the EU's H2020 program under the Marie Skłodowska-Curie project QCALL (GA 675662) for financial support. [Comparison table of QRNG implementations omitted.] The model used in our paper follows the method introduced in Frauchiger et al. [4]. We specify first the density operator ρ corresponding to our QRNG. ...
... where the vector $s = (s_1, s_2, \ldots, s_m)$ collects all the variables for dark counts for each detector, $H(\cdot)$ is the Hamming weight function and $m$ is the number of detectors considered. This model corresponds to a generalization of the two models previously proposed for two detectors [4]. ...
Preprint
The security of electronic devices has become a key requisite for the rapidly-expanding pervasive and hyper-connected world. Robust security protocols ensuring secure communication, devices' resilience to attacks, authentication control and users' privacy need to be implemented. Random Number Generators (RNGs) are the fundamental primitive in most secure protocols but, often, also the weakest one. Establishing security in billions of devices requires high-quality random data generated at a sufficiently high throughput. On the other hand, the RNG should exhibit a high integration level with on-chip extraction to remove, in real time, potential imperfections. We present the first integrated Quantum RNG (QRNG) in a standard CMOS technology node. The QRNG is based on a parallel array of independent Single-Photon Avalanche Diodes (SPADs), homogeneously illuminated by a DC-biased LED, and co-integrated logic circuits for post-processing. We describe the randomness generation process and we prove the quantum origin of the entropy. We show that co-integration of combinational logic, even of high complexity, does not affect the quality of randomness. Our CMOS QRNG can reach up to 400 Mbit/s throughput with low power consumption. Thanks to the use of standard CMOS technology and a modular architecture, our QRNG is suitable for a highly scalable solution.
... We quantify the amount of quantum randomness relative to the amount of classical noise using a quantum-to-classical-noise ratio (QCNR). When the QCNR is low, both the quality and the security of the generated random sequence may be compromised [24,26,27]. ...
... They constitute a strong extractor, which implies that the seed can be reused without sacrificing too much randomness. In recent developments of QRNGs [20,22,26,27,43], they have been used to construct hashing functions such as the Toeplitz-hashing matrix. These constructions require a long (but reusable) seed [44]. ...
... For instance, one can apply entropy smoothing [39,53] to the worst-case min-entropy to tighten the analysis. Our framework can also be generalized to encapsulate potential quantum side information by considering the analysis described in Ref. [27]. A detailed cryptanalysis of our framework can also increase the final throughput of the QRNG [47]. ...
Article
Full-text available
The generation of random numbers via quantum processes is an efficient and reliable method to obtain true indeterministic random numbers that are of vital importance to cryptographic communication and large-scale computer modelling. However, in realistic scenarios, the raw output of a quantum random number generator is inevitably tainted with classical technical noise. The integrity of the device could be compromised if this noise is tampered with, or even controlled by some malicious party. To safeguard against this, we propose and experimentally demonstrate an approach that produces side-information-independent randomness that is quantified by min-entropy conditioned on this classical noise. We present a method for maximising the conditional min-entropy of the number sequence generated from a given quantum-to-classical noise ratio (QCNR). The detected photo-current in our experiment is shown to have a real-time random number generation rate of 14 Mbps/MHz. Integrating this figure across the spectral response of the detection system shows the potential to deliver more than 70 Gbps of random numbers in our experimental setup.
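As a sketch of the kind of calculation this entails (the Gaussian model, the grid of noise offsets and the ADC parameters below are illustrative assumptions, not the authors' exact procedure), one can compute the QCNR and a worst-case min-entropy of the digitized outcome conditioned on the classical noise value:

```python
import numpy as np
from scipy.stats import norm

def qcnr_db(sigma_q, sigma_c):
    """Quantum-to-classical-noise ratio in dB (assumed variance-ratio definition)."""
    return 10 * np.log10(sigma_q**2 / sigma_c**2)

def conditional_min_entropy(sigma_q, sigma_c, n_bits=8, v_range=5.0):
    """Worst-case min-entropy of an n_bits ADC reading of the quantum signal,
    conditioned on the classical-noise offset e (assumed known to an adversary).
    Given e, the reading is modelled as Gaussian with mean e and std sigma_q,
    binned over [-v_range, v_range]; illustrative model only."""
    edges = np.linspace(-v_range, v_range, 2**n_bits + 1)
    worst_pmax = 0.0
    for e in np.linspace(-5 * sigma_c, 5 * sigma_c, 201):   # scan noise offsets
        probs = np.diff(norm.cdf(edges, loc=e, scale=sigma_q))
        probs[0] += norm.cdf(edges[0], loc=e, scale=sigma_q)   # ADC saturation:
        probs[-1] += norm.sf(edges[-1], loc=e, scale=sigma_q)  # tails into end bins
        worst_pmax = max(worst_pmax, probs.max())
    return -np.log2(worst_pmax)

print(qcnr_db(1.0, 0.3))                  # about 10.5 dB
print(conditional_min_entropy(1.0, 0.3))  # extractable bits per sample (worst case)
```

Taking the worst case over the classical-noise offset corresponds to granting the adversary full knowledge of that noise, which is the conservative stance described in the abstract above.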
... Consequently, the first step for analysing our experiment is to carefully calibrate and model the realistic photodiodes, which output noisy voltage measurements rather than exact photon numbers. More formally, following the approach of [40], we model the POVM describing our noisy, characterised measurements as a projective measurement on a larger system. For the case of our detectors (see Fig. 6 in Appendix B for a cohesive summary), the measured voltages are modelled as follows. ...
... That is, whether the output of the protocol can then be used as an input to other cryptographic protocols without compromising the security. For example, it can be input to a randomness extractor along with a seed to achieve certified randomness expansion using well known techniques [40,42]. Very few implementations enjoy such composable security proofs in either the device-dependent [9,35,36] or partially device-independent case [13]. ...
... Whilst the length of the seed must be chosen proportional to m, it only has to be generated once and can be safely reused to hash arbitrarily many blocks, meaning that the initial random seed can be used to generate an unbounded amount of randomness. This also means that the seed can be hard-coded into the hashing device (for a further discussion and an explicit implementation, see [40]). Other quantum-secure methods, such as the Trevisan extractor, are more efficient in the length of the required seed. ...
Article
Full-text available
A remarkable aspect of quantum theory is that certain measurement outcomes are entirely unpredictable to all possible observers. Such quantum events can be harnessed to generate numbers whose randomness is asserted based upon the underlying physical processes. We formally introduce, design, and experimentally demonstrate an ultrafast optical quantum random number generator that uses a totally untrusted photonic source. While considering completely general quantum attacks, we certify and generate in real time random numbers at a rate of 8.05 Gb/s with a composable security parameter of $10^{-10}$. Composable security is the most stringent and useful security paradigm because any given protocol remains secure even if arbitrarily combined with other instances of the same, or other, protocols, thereby allowing the generated randomness to be utilized for arbitrary applications in cryptography and beyond. This work achieves the fastest generation of composably secure quantum random numbers ever reported.
... In this work we are concerned with the converse, i.e., the device-dependent, approach [19]. In contrast to the above, device-dependent RNGs are more practical, smaller, faster, and cheaper [3]. ...
... To assess the quality of the randomness generated by these APDs, one would in principle need a microscopic model describing their workings. Within such a model, one may then attempt to prove that their output is unpredictable even if the quantum state of the APDs was fully known (i.e., pure) at the time when the randomness generation process is initiated, that is, when the device received the trigger signal requesting it to generate randomness [19]. However, lacking such a microscopic model, one may also resort to physically reasonable assumptions. ...
Article
Full-text available
We reverse-engineer, test and analyse hardware and firmware of the commercial quantum-optical random number generator Quantis from ID Quantique. We show that >99% of its output data originates in physically random processes: random timing of photon absorption in a semiconductor material, and random growth of avalanche owing to impact ionisation. Under a strong assumption that these processes correspond to a measurement of an initially pure state of the components, our analysis implies the unpredictability of the generated randomness. We have also found minor non-random contributions from imperfections in detector electronics and an internal processing algorithm, specific to this particular device. Our work shows that the design quality of a commercial quantum-optical randomness source can be verified without cooperation of the manufacturer and without access to the engineering documentation.
... The production of application-ready random numbers from the QES requires a randomness extraction stage [30]. In real QRNG devices, untrusted noise degrades (corrupts) the purity of the randomness associated with quantum processes. ...
... In real QRNG devices, untrusted noise degrades (corrupts) the purity of the randomness associated with quantum processes. The application of proper randomness extractors makes it possible to eliminate corruption of the quantum signal [30]. Randomness extraction requires (i) an estimate of the amount of available min-entropy from the QES, taking into account electronic noise, memory effects and digitization noise [31], and (ii) an appropriate hashing of the data after digitization. ...
Article
Random number generators are essential to ensure performance in information technologies, including cryptography, stochastic simulations and massive data processing. The quality of random numbers ultimately determines the security and privacy that can be achieved, while the speed at which they can be generated poses limits to the utilisation of the available resources. In this work we propose and demonstrate a quantum entropy source for random number generation on an indium phosphide photonic integrated circuit made possible by a new design using two-laser interference and heterodyne detection. The resulting device offers high-speed operation with unprecedented security guarantees and reduced form factor. It is also compatible with complementary metal-oxide semiconductor technology, opening the path to its integration in computation and communication electronic cards, which is particularly relevant for the intensive migration of information processing and storage tasks from local premises to cloud data centres.
... Frauchiger et al. report device-specific biases in the raw output of QRNGs [17]. Our research expands on this by demonstrating the magnitude and extent of these biases. ...
... The difference between the biases in each Quantis module sample reinforces Frauchiger et al.'s report of a device-specific component in the raw output [17]. It has recently been asserted that post-processing is required to provide adequate QRNG output [18], and this work agrees with that assessment. ...
Conference Paper
Full-text available
Quantum phenomena offer a very attractive entropy source for random number generation, harnessing inherently chaotic observable events. In this work, we present the analysis of a popular commercial QRNG range, ID Quantique's Quantis 16M, 4M and USB modules. Previous analyses are extended significantly by including novel analyses using Ent, Alphabits, and Rabbit (the latter two from TestU01). Ent reveals significant biases in the raw output of Quantis devices. The Alphabits and Rabbit batteries also report significant issues with the raw output of these devices. Analysis of these original results, together with correspondence with ID Quantique regarding their post-processing requirements, consolidates these findings and highlights that raw data from the discussed devices are unsuited to cryptographic use.
... (Note we require $\epsilon_{PA} > 2\epsilon$, where $\epsilon$ is whatever smoothing parameter is used.) Interestingly, while the choice of the random hash function used for privacy amplification must be random, it was proven in [24] that once chosen it can be fixed, and so we do not need to use additional randomness to choose a hash function (it could be chosen randomly once and then hard-coded into A's device; see [24] for more details). If the adversary prepares N qubit states, unentangled with any quantum memory, then we may immediately use our Theorem 2 to compute it. ...
Article
Full-text available
In this paper, we show an interesting connection between a quantum sampling technique and quantum uncertainty. Namely, we use the quantum sampling technique, introduced by Bouman and Fehr, to derive a novel entropic uncertainty relation based on smooth min-entropy, the binary Shannon entropy of an observed outcome, and the probability of failure of a classical sampling strategy. We then show two applications of our new relation. First, we use it to develop a simple proof of a version of the Maassen and Uffink uncertainty relation. Second, we show how it may be applied to quantum random number generation.
... Some implementations, like hashing with Toeplitz random binary matrices (Krawczyk, 1994; Mansour et al., 1990), are particularly efficient. We can define such an extractor in which the seed is used as a rectangular matrix that is multiplied with n-bit vectors from the source to output almost-independent bits (Frauchiger et al., 2013). This approach is used in some commercial devices, where the extraction function is a precomputed random matrix that acts as the seed and is distributed hard-coded into the device (Troyer and Renner, 2012). ...
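A minimal sketch of this seeded matrix extractor, with block and output sizes chosen purely for illustration (hardware implementations in the cited devices differ in detail):

```python
import numpy as np

def toeplitz_extract(raw_bits, seed_bits, m):
    """Hash an n-bit raw block down to m nearly uniform bits using a seeded
    binary m x n Toeplitz matrix (a standard two-universal family). The seed
    supplies the n + m - 1 distinct diagonals and can be reused across blocks;
    illustrative sketch only."""
    raw = np.asarray(raw_bits, dtype=np.int64)
    seed = np.asarray(seed_bits, dtype=np.int64)
    n = raw.size
    assert seed.size == n + m - 1, "Toeplitz seed needs n + m - 1 bits"
    rows = np.arange(m)[:, None]
    cols = np.arange(n)[None, :]
    T = seed[rows - cols + n - 1]      # T[i, j] = seed[i - j + n - 1]
    return (T @ raw) % 2               # matrix-vector product over GF(2)

rng = np.random.default_rng(0)
raw = rng.integers(0, 2, size=1024)               # raw, possibly biased block
seed = rng.integers(0, 2, size=1024 + 256 - 1)    # reusable seed
out = toeplitz_extract(raw, seed, m=256)          # 256 extracted bits
```

The ratio m/n would in practice be set from the estimated min-entropy of the raw block, as discussed in the surrounding excerpts.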
Article
Full-text available
Random numbers are a fundamental resource in science and engineering with important applications in simulation and cryptography. The inherent randomness at the core of quantum mechanics makes quantum systems a perfect source of entropy. Quantum random number generation is one of the most mature quantum technologies with many alternative generation methods. We discuss the different technologies in quantum random number generation from the early devices based on radioactive decay to the multiple ways to use the quantum states of light to gather entropy from a quantum origin. We also discuss randomness extraction and amplification and the notable possibility of generating trusted random numbers even with untrusted hardware using device independent generation protocols.
... However, a central issue for these QRNGs is how to certify and quantify the entropy of the genuine randomness, i.e., the randomness that originates from the intrinsic unpredictability of quantum-mechanical measurements. Entropy estimates for specific setups were recently proposed using sophisticated theoretical models [8][9][10]. Nevertheless, these techniques require complicated device characterization that may be difficult to accurately assess in practice. ...
Article
Full-text available
A quantum random number generator (QRNG) generates genuine randomness from the intrinsic probabilistic nature of quantum mechanics. The central problems for most QRNGs are estimating the entropy of the genuine randomness and producing such randomness at high rates. Here we propose and demonstrate a proof-of-concept QRNG that operates at a high rate of 24 Mbit/s by means of a high-dimensional entanglement system, in which the user monitors the entropy in real time via the observation of a nonlocal quantum interference, without a detailed characterization of the devices. Our work provides an important approach to a robust QRNG with trusted but error-prone devices.
... In practice, however, perfect randomness cannot be expected even from measurement-based quantum random number generators. What one can reasonably guarantee is only a relatively high entropy of the QRNG outcomes, which then require post-processing [83,35]. Moreover, it has recently been shown that even state-of-the-art QRNGs do not pass certain standard statistical tests for randomness [43]. ...
Article
Full-text available
Randomness is an invaluable resource in today's life, with a broad range of uses reaching from numerical simulations through randomized algorithms to cryptography. However, on the classical level no true randomness is available, and even the use of simple quantum devices in a prepare-and-measure setting suffers from a lack of stability and controllability. This gave rise to a group of quantum protocols that provide randomness certified by classical statistical tests -- Device Independent Quantum Random Number Generators. In this paper we review the most relevant results in this field, which allow the production of almost perfect randomness with the help of quantum devices, supplemented with an arbitrarily weak source of additional randomness. This is in fact the best one could hope to achieve, since with no starting randomness (corresponding, in a different framing, to the absence of free will) even a quantum world would have a fully deterministic description.
... The achieved randomness merits (bias and correlation) of strings S and T are not good enough for general applications; consequently, some form of randomness extraction is needed. One could, in principle, estimate the min-entropy of the strings following the approach in [25], after which an efficient universal-hashing extractor would be applied to the raw bits. In this work, however, we use a chained XOR extractor, depicted in Fig. 4, which is much less efficient in terms of the number of extracted bits versus the number of input bits, but also much simpler to realize in hardware. ...
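The exact wiring of the chained XOR extractor in the cited Fig. 4 is not reproduced here; the sketch below only illustrates the underlying bias-reduction mechanism when two independent raw streams such as S and T are combined by XOR (piling-up lemma):

```python
import numpy as np

def xor_combine(s_bits, t_bits):
    """Bitwise XOR of two independent raw bit streams. If their biases
    (deviations of P(1) from 1/2) are e1 and e2, the combined bias is
    2*e1*e2 (piling-up lemma), so the streams whiten each other."""
    return np.bitwise_xor(np.asarray(s_bits, dtype=np.uint8),
                          np.asarray(t_bits, dtype=np.uint8))

rng = np.random.default_rng(1)
s = (rng.random(100_000) < 0.55).astype(np.uint8)   # source S with P(1) = 0.55
t = (rng.random(100_000) < 0.52).astype(np.uint8)   # source T with P(1) = 0.52
out = xor_combine(s, t)
print(s.mean(), t.mean(), out.mean())               # out.mean() is close to 0.5
```

Unlike universal hashing, this kind of XOR chaining gives no information-theoretic guarantee against side information, which matches the trade-off the authors describe: simpler hardware at the cost of extraction efficiency.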
Article
Full-text available
We present a random number generator based on quantum effects in photonic emission and detection. It is unique in its simultaneous use of both spatial and temporal quantum information contained in the system, which makes it resilient to hardware failure and signal injection attacks. We show that its deviation from randomness can be estimated based on simple measurements. Generated numbers pass the NIST Statistical Test Suite without post-processing.
... There is a common belief, expressed for example by Frauchiger et al. [18], that quantum physics is needed for random-number generators to be "really random". The root of this belief seems to be a notion that quantum randomness is "inherent" or can be "proven", whereas classical physics is deterministic. ...
Article
Full-text available
Noise-based logic is a practically deterministic logic scheme inspired by the randomness of neural spikes and uses a system of uncorrelated stochastic processes and their superposition to represent the logic state. We briefly discuss various questions such as (i) What does practical determinism mean? (ii) Is noise-based logic a Turing machine? (iii) Is there hope to beat (the dreams of) quantum computation by a classical physical noise-based processor, and what are the minimum hardware requirements for that? Finally, (iv) we address the problem of random number generators and show that the common belief that quantum number generators are superior to classical (thermal) noise-based generators is nothing but a myth.
... The problem of discriminating between classical and quantum randomness sources has already been addressed in previous work, and there exist randomness extraction algorithms specifically designed to distil the output from quantum RNGs to obtain uniformly distributed random numbers. As an example, we used the double-hash function algorithm suggested by Frauchiger et al. [33], which is computationally efficient and could be implemented in hardware. The resulting data successfully passes the 15 tests in the NIST randomness test suite [34] with a significance level of 0.05 (5%). ...
Article
Full-text available
Random number generation is crucial in many aspects of everyday life, as online security and privacy depend ultimately on the quality of random numbers. Many current implementations are based on pseudo-random number generators, but information security requires true random numbers for sensitive applications like key generation in banking, defence or even social media. True random number generators are systems whose outputs cannot be determined, even if their internal structure and response history are known. Sources of quantum noise are thus ideal for this application due to their intrinsic uncertainty. In this work, we propose using resonant tunnelling diodes as practical true random number generators based on a quantum mechanical effect. The output of the proposed devices can be directly used as a random stream of bits or can be further distilled using randomness extraction algorithms, depending on the application.
... Such strategies are advocated in the literature in order to increase the secret key rate by minimizing the cost of sifting [30]. See, for instance, [31] for an analysis of realistic quantum random number generators explaining how randomness originating from quantum processes can be turned into ideal seeds. ...
Article
Full-text available
In this work we present a security analysis for quantum key distribution, establishing a rigorous tradeoff between various protocol and security parameters for a class of entanglement-based and prepare-and-measure protocols. The goal of this paper is twofold: 1) to review and clarify the state-of-the-art security analysis based on entropic uncertainty relations, and 2) to provide an accessible resource for researchers interested in a security analysis of quantum cryptographic protocols that takes into account finite resource effects. For this purpose we collect and clarify several arguments spread in the literature on the subject with the goal of making this treatment largely self-contained. More precisely, we focus on a class of prepare-and-measure protocols based on the Bennett-Brassard (BB84) protocol as well as a class of entanglement-based protocols similar to the Bennett-Brassard-Mermin (BBM92) protocol. We carefully formalize the different steps in these protocols, including randomization, measurement, parameter estimation, error correction and privacy amplification, allowing us to be mathematically precise throughout the security analysis. We start from an operational definition of what it means for a quantum key distribution protocol to be secure and derive simple conditions that serve as sufficient conditions for secrecy and correctness. We then derive and eventually discuss tradeoff relations between the block length of the classical computation, the noise tolerance, the secret key length and the security parameters for our protocols. Our results significantly improve upon previously reported tradeoffs.
... The achieved randomness merits (bias and correlation) of strings S and T are not good enough for general applications; consequently, some form of randomness extraction is needed. One could, in principle, estimate the min-entropy of the strings following the approach in [25], after which an efficient universal-hashing extractor would be applied to the raw bits. In this work, however, we use a chained XOR extractor, depicted in Fig. 4, which is much less efficient in terms of the number of extracted bits versus the number of input bits, but also much simpler to realize in hardware. ...
Article
Full-text available
We present the first random number generator (RNG) which simultaneously uses independent spatial and temporal quantum randomness contained in an optical system. Availability of the two independent sources of entropy makes the RNG resilient to hardware failure and signal injection attacks. We show that the deviation from randomness of the generated numbers can be estimated quickly from simple measurements, thus eliminating the need for the usual time-consuming statistical testing of the output data. As confirmation, it is demonstrated that the generated numbers pass the NIST Statistical Test Suite.
... An important issue here is to estimate the entropy of the randomness source, namely the raw random bits generated, from which truly random bits can be extracted [202]. Sophisticated techniques have been developed to estimate entropy in specific cases [203,204]. However, these methods are somewhat difficult to implement and do not easily lend themselves to generalization or to easy real-time monitoring. ...
Article
Uniquely among the sciences, quantum cryptography has driven both foundational research as well as practical real-life applications. We review the progress of quantum cryptography in the last decade, covering quantum key distribution and other applications. Quanta 2017; 6: 1–47.
... The problem of discriminating between classical and quantum randomness sources has already been addressed in previous work, and there exist randomness extraction algorithms specifically designed to distil the output from quantum RNGs to obtain uniformly distributed random numbers. As an example, we used the double-hash function algorithm suggested by Frauchiger et al. [33], which is computationally efficient and could be implemented in hardware. The resulting data successfully passes the 15 tests in the NIST randomness test suite [34] with a significance level of 0.05 (5%). ...
Conference Paper
In this work, we show how the hysteretic behaviour of resonant tunnelling diodes (RTDs) can be exploited for new functionalities. In particular, the RTDs exhibit a stochastic 2-state switching mechanism that could be useful for random number generation and cryptographic applications. This behaviour can be scaled to N-bit switching by connecting various RTDs in series. The InGaAs/AlAs RTDs used in our experiments display very sharp negative differential resistance (NDR) peaks at room temperature which show hysteresis cycles that, rather than having a fixed switching threshold, show a probability distribution about a central value. We propose to use this intrinsic uncertainty emerging from the quantum nature of the RTDs as a source of randomness. We show that a combination of two RTDs in series results in devices with three-state outputs and discuss the possibility of scaling to N-state devices by subsequent series connections of RTDs, which we demonstrate for up to the 4-state case. In this work, we suggest that the intrinsic uncertainty in the conduction paths of resonant tunnelling diodes can serve as a source of randomness that can be integrated into current electronics to produce on-chip true random number generators. The N-shaped I-V characteristic of RTDs results in a two-level random voltage output when driven with current pulse trains. Electrical characterisation and randomness testing of the devices were conducted in order to determine the validity of the true randomness assumption. Based on the results obtained for the single RTD case, we suggest the possibility of using multi-well devices to generate N-state random switching devices for their use in random number generation or multi-valued logic devices.
... In fact, the conditional min-entropy $H_{\min}(X_\delta|E)$ is not estimated from the data, but it is bounded by considering the structure of the POVM and the optimal strategy for the attacker, making it independent from the number of rounds of the protocol. Finally, a Toeplitz randomness extractor [33] is calibrated using $H_{\min}(X_\delta|E)$ and extracts the certified numbers from the raw data. As a final check, we applied a series of statistical tests from the DieHarder and NIST suites: all of them are successfully passed (see Supplementary Note 4). ...
Article
Full-text available
Random numbers are commonly used in many different fields, ranging from simulations in fundamental science to security applications. In some critical cases, such as Bell tests and cryptography, the random numbers are required both to be private and to be provided at an ultra-fast rate. In practice, however, generators are usually considered trusted, and their security can be compromised in the case of imperfections or malicious external actions. In this work we introduce an efficient protocol which guarantees both security and speed of the generation. We propose a source-device-independent protocol based on generic Positive Operator Valued Measurements and then we specialize the result to heterodyne measurements. Furthermore, we experimentally implemented the protocol, reaching a secure generation rate of 17.42 Gbit/s, without the need for an initial source of randomness. The security of the protocol has been proven for general attacks in the finite-key scenario.
... Having these two characteristics (probability of ones equal to 0.5 and absence of correlation among successive bits), a pool of generated bits has no possibility other than to be random. Namely, according to min-entropy theory, laid out in Ref. [31], a sufficient condition for an RNG to generate truly random bits is that it generates any $n$-bit string with an a priori probability of $1/2^n$. Now, for $n = 1$ this is simply the condition that the probability of ones equals 1/2, which is probably the most intuitive characteristic of a random bit string. ...
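In min-entropy terms, the quoted sufficient condition is simply the statement that the output attains the maximal min-entropy:

\[
H_{\min}(X) \;=\; -\log_2 \max_{x \in \{0,1\}^n} P_X(x) \;=\; n
\quad\Longleftrightarrow\quad
P_X(x) = 2^{-n}\ \text{for every } n\text{-bit string } x.
\]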
Chapter
Full-text available
Random numbers are essential for our modern information-based society. Unlike frequently used pseudo-random generators, physical random number generators do not depend on deterministic algorithms but rather on a physical process to provide true randomness. In this work we present a conceptually simple optical quantum random number generator that features special characteristics necessary for application in a loophole-free Bell inequality test, namely: (1) very short latency between the request for a random bit and the time when the bit is generated; (2) all physical processes relevant to the bit production happen after the bit request signal; and (3) high efficiency of producing a bit upon a request (100% by design). This generator has further desirable characteristics: the ability to sustain a high bit generation rate, the possibility of using a low-detection-efficiency photon detector, a high ratio of bits per detected photon (≈2) and simplicity of the bit-generating process. Generated sequences of random bits pass the NIST STS tests without further post-processing.
... The reason is that any experimental implementation is prone to technical imperfections that introduce unavoidable noise. A rigorous characterization of the devices is therefore required in order to separate technical noise from true quantum randomness, which is often cumbersome and challenging in practice [7][8][9][10]. ...
Preprint
Quantum theory allows for randomness generation in a device-independent setting, where no detailed description of the experimental device is required. Here we derive a general upper bound on the amount of randomness that can be generated in such a setting. Our bound applies to any black-box scenario, thus covering a wide range of scenarios from partially characterised to completely uncharacterised devices. Specifically, we prove that the number of random bits that can be generated is limited by the number of different input states that enter the measurement device. We show explicitly that our bound is tight in the simplest case. More generally, our work indicates that the prospects of generating a large amount of randomness by using high-dimensional (or even continuous variable) systems will be extremely challenging in practice.
Article
Full-text available
Security proofs of quantum key distribution (QKD) systems usually assume that the users have access to a source of perfect randomness. State-of-the-art QKD systems run at frequencies in the GHz range, requiring a sustained GHz rate of generation and acquisition of quantum random numbers. In this paper we demonstrate such a high-speed random number generator. The entropy source is based on amplified spontaneous emission from an erbium-doped fibre, which is directly acquired using a standard small form-factor pluggable (SFP) module. The module connects to the Field Programmable Gate Array (FPGA) of a QKD system. A real-time randomness extractor is implemented in the FPGA and achieves a sustained rate of 1.25 Gbps of provably random bits.
Article
We demonstrate robust, high-speed random number generation using interference of the steady-state emission of guaranteed random phases, obtained through gain-switching a semiconductor laser diode. Steady-state emission tolerates large temporal pulse misalignments and therefore significantly improves the interference quality. Using an 8-bit digitizer followed by a finite-impulse response unbiasing algorithm, we achieve random number generation rates of 8 and 20 Gb/s, for laser repetition rates of 1 and 2.5 GHz, respectively, with a +/-20% tolerance in the interferometer differential delay. We also report a generation rate of 80 Gb/s using partially phase-correlated short pulses. In relation to the field of quantum key distribution, our results confirm the gain-switched laser diode as a suitable light source, capable of providing phase-randomized coherent pulses at a clock rate of up to 2.5 GHz.
Book
This book brings together research on numerical methods adapted for Graphics Processing Units (GPUs). It explains recent efforts to adapt classic numerical methods, including solution of linear equations and FFT, for massively parallel GPU architectures. This volume consolidates recent research and adaptations, covering widely used methods that are at the core of many scientific and engineering computations. Each chapter is written by authors working on a specific group of methods; these leading experts provide mathematical background, parallel algorithms and implementation details leading to reusable, adaptable and scalable code fragments. This book also serves as a GPU implementation manual for many numerical algorithms, sharing tips on GPUs that can increase application efficiency. The valuable insights into parallelization strategies for GPUs are supplemented by ready-to-use code fragments. Numerical Computations with GPUs targets professionals and researchers working in high performance computing and GPU programming. Advanced-level students focused on computer science and mathematics will also find this book useful as secondary text book or reference.
Article
Randomness is one of the most important resources in modern information science, since encryption is founded upon trust in random numbers. Since it is impossible to prove whether an existing random bit string is truly random, it is important that such strings be generated by a trustworthy process. This requires specialized hardware for random numbers, for example a die or a tossed coin. But when all input parameters are known, their outcome might still be predicted. A quantum mechanical superposition allows for provably true random bit generation. In the past decade many quantum random number generators (QRNGs) have been realized. A photonic implementation is often described as a photon impinging on a beam splitter, but such a protocol is rarely realized with non-classical light or anti-bunched single photons. Instead, laser sources or light-emitting diodes are used. Here we analyze the difference between generating a true random bit string with a laser and with anti-bunched light. We show that a single photon source provides more randomness than even a brighter laser. This gain of usable entropy proves the advantages of true single photons versus coherent input states of light in an experimental implementation. The underlying advantage can be adapted to microscopy and sensing.
Article
Full-text available
Quantum random number generators promise perfectly unpredictable random numbers. A popular approach to quantum random number generation is homodyne measurements of the vacuum state, the ground state of the electro-magnetic field. Here we experimentally implement such a quantum random number generator, and derive a security proof that considers quantum side-information instead of classical side-information only. Based on the assumptions of Gaussianity and stationarity of noise processes, our security analysis furthermore includes correlations between consecutive measurement outcomes due to finite detection bandwidth, as well as analog-to-digital converter imperfections. We characterize our experimental realization by bounding measured parameters of the stochastic model determining the min-entropy of the system’s measurement outcomes, and we demonstrate a real-time generation rate of 2.9 Gbit/s. Our generator follows a trusted, device-dependent, approach. By treating side-information quantum mechanically an important restriction on adversaries is removed, which previously was reserved to semi-device-independent and device-independent schemes.
Article
Full-text available
We describe a methodology and standard of proof for experimental claims of quantum random number generation (QRNG), analogous to well-established methods from precision measurement. For appropriately constructed physical implementations, lower bounds on the quantum contribution to the average min-entropy can be derived from measurements on the QRNG output. Given these bounds, randomness extractors allow generation of nearly perfect "ε-random" bit streams. An analysis of experimental uncertainties then gives experimentally derived confidence levels on the ε-randomness of these sequences. We demonstrate the methodology by application to phase-diffusion QRNG, driven by spontaneous emission as a trusted randomness source. All other factors, including classical phase noise, amplitude fluctuations, digitization errors and correlations due to finite detection bandwidth, are treated with paranoid caution, i.e., assuming the worst possible behaviors consistent with observations. A data-constrained numerical optimization of the distribution of untrusted parameters is used to lower bound the average min-entropy. Under this paranoid analysis, the QRNG remains efficient, generating at least 2.3 quantum random bits per symbol with 8-bit digitization and at least 0.83 quantum random bits per symbol with binary digitization, at a confidence level of 0.99993. The result demonstrates ultrafast QRNG with strong experimental guarantees.
Article
Full-text available
We analyze the information an attacker can obtain on the numbers generated by a user by measurements on a subsystem of a system consisting of two entangled two-level systems. The attacker and the user make measurements on their respective subsystems only. Already the knowledge of the density matrix of the user's subsystem completely determines the upper bound on the information accessible to the attacker. We compare and contrast this information to the appropriate bounds provided by quantum state discrimination.
Article
Full-text available
Random numbers are essential for our modern information-based society, e.g., in cryptography. Unlike frequently used pseudo-random generators, physical random number generators do not depend on complex algorithms but rather on a physical process to provide true randomness. Quantum random number generators (QRNGs) rely on a process which, even in principle, can only be described by a probabilistic theory. Here we present a conceptually simple implementation, which offers 100% efficiency of producing a random bit upon request and simultaneously exhibits an ultra-low latency. A careful technical and statistical analysis demonstrates its robustness against imperfections of the actually implemented technology and enables quick estimation of the randomness of very long sequences. Generated random numbers pass standard statistical tests without any post-processing. The setup described, as well as the theory presented here, demonstrates the maturity and overall understanding of the technology.
Preprint
The prototype of a quantum random number generator is a single photon which impinges onto a beam splitter and is then detected by single photon detectors at one of the two output paths. Prior to detection, the photon is in a quantum mechanical superposition state of the two possible outcomes with - ideally - equal amplitudes until its position is determined by measurement. When the two output modes are observed by a single photon detector, the generated clicks can be interpreted as ones and zeros - and a raw random bit stream is obtained. Here we implement such a random bit generator based on single photons from a defect center in diamond. We investigate the single photon emission of the defect center by an anti-bunching measurement. This certifies the "quantumness" of the supplied photonic input state, while the random "decision" is still based on the vacuum fluctuations at the open port of the beam-splitter. Technical limitations, such as intensity fluctuations, mechanical drift, and bias, are discussed. A number of ways to suppress such unwanted effects, and an a priori entropy estimation, are presented. The single photon nature allows for a characterization of the non-classicality of the source and allows the background fraction to be determined. Due to the NV-center's superior stability and optical properties, we can operate the generator under ambient conditions around the clock. We present true 24/7 operation of the implemented random bit generator.
Article
Full-text available
We propose a method to determine single hyperspace vectors (product strings of noise-bits) by classical means with the same effectiveness as the results using time shifted noise-based logic. A system of binary linear equations based on the amplitudes of the hyperspace vector and the reference noise-bits is set up and solved after enough independent information is collected. The resulting error probability (the chance of getting no answer) has approximately an exponential decay with the time of measurement. © 2012 World Scientific Publishing Company.
Article
Full-text available
Quantum random number generators (QRNGs) can provide genuine randomness by appealing to the fundamental principles of quantum mechanics. In general, a physical generator contains two parts --- a randomness source and its readout. The source is essential to the quality of the resulting random numbers; hence it needs to be carefully calibrated and modeled to achieve information-theoretically provable randomness. However, in practice, the source is a complicated physical system, such as a light source or an atomic ensemble, and any deviations in the real-life implementation from the theoretical model may affect the randomness of the output. To close this gap, we propose a source-independent scheme for QRNGs in which output randomness can be certified, even when the source is uncharacterized and untrusted. In our randomness analysis, we make no assumptions about the dimension of the source. For instance, multi-photon emissions are allowed in optical implementations. Our analysis takes account of the finite-key effect with the composable security definition. In the limit of large data size, the length of the input random seed is exponentially small compared to that of the output random bit string. Our scheme can be implemented by simply converting some current realizations of quantum random number generation.
Article
Within quantum theory, the Born rule restricts our ability to predict measurement outcomes. However, could it be that this restriction is not fundamental, but instead due to the quantum wavefunction being insufficient to generate the most precise predictions? In other words, could there be an extension of quantum theory giving more informative predictions? Here we review a recent line of work arguing that this question has a negative answer.
Article
One of the most fundamental quantum random number generators is implemented with light impinging onto a beam splitter and two single photon detectors at its output. Often, this generator is described as "a photon which takes one or the other path towards a detector". The input state of light, in conjunction with the detector response, determines the amount, the pattern, and the correlation of the generated clicks. Only a fraction of all generator outcomes, quantified by the min-entropy, can be used as a resource for true randomness. This paper addresses the difference between the common description with incoming single photons and the often-implemented scheme with a weak coherent light source, such as an attenuated laser. For this very fundamental and widely used configuration the amount of usable entropy is compared. If single photons from an anti-bunched light source are supplied, the amount of entropy is higher than in the case of a supplied coherent state - although the latter can be arbitrarily bright, unlike the single photon source.
Chapter
The chapter describes a Monte Carlo method's implementation for analyzing the dynamics of open quantum systems—the so-called quantum trajectories method. The discussed implementation is realized with the use of the CUDA technology. It should be pointed out that using GPUs in this approach allows the performance of the quantum trajectories simulation to be increased.
Chapter
Intrinsic uncertainty is a distinctive feature of quantum physics, which can be used to harness high-quality randomness. However, in realistic scenarios, the raw output of a quantum random-number generator (QRNG) is inevitably tainted by classical technical noise. The integrity of such a device can be compromised if this noise is tampered with, or even controlled by some malicious parties. In this chapter, we first briefly discuss how the quantum randomness can be characterised via information theoretic approaches, namely by quantifying the Shannon entropy and min-entropy. We then consider several ways where classical side-information can be taken into account via these quantities in a continuous-variable QRNG. Next, we focus on side-information independent randomness that is quantified by min-entropy conditioned on the classical noise. To this end, we present a method for maximizing the conditional min-entropy from a given quantum-to-classical-noise ratio. We demonstrate our approach on a vacuum state CV-QRNG. Lastly, we highlight several recent developments in the quest of developing secure CV-QRNG.
Article
Full-text available
We discuss a simple idealistic quantum-entanglement-based protocol for quantum random number generation allowing a trusted third party to publicly perform arbitrarily complex tests of randomness without any violation of the secrecy of the generated bit sequences. The protocol also diminishes the average time of randomness testing (thus enabling arbitrary shortening of this time with an increasing number of entangled qubits).
Article
Quantum random number generators with a continuous variable are considered, based on the primary randomness of the outcomes of homodyne measurements of a coherent state. A deterministic method is considered for extracting truly random 0s and 1s from the primary sequence of measurements of the field quadrature in homodyne detection. In the case of independent successive measurement outcomes, the method allows, in the asymptotic limit of long sequences, extraction with polynomial complexity of all the true randomness contained in the primary sequence. The method does not require knowledge of the probability distribution function of the primary random sequence, and also does not require additional randomness for the extraction of random 0s and 1s. The approach with deterministic randomness extractors, unlike other methods, involves fewer assumptions and conditions that need to be satisfied in the experimental implementation of such generators, and is significantly more effective and simpler to implement experimentally. The fundamental limitations dictated by nature on achieving statistical independence of successive measurement outcomes are also considered. Statistical independence of the measurement outcomes is equivalent to true randomness in the sense that, if the outcomes are independent, it is provably possible to extract a 'truly random sequence of 0s and 1s' with a deterministic extractor. It is shown that in the asymptotic limit it is possible to extract all the true randomness contained in the outcomes of physical measurements.
Article
Full-text available
Quantum random number generators (QRNGs) can significantly improve the security of cryptographic protocols by ensuring that generated keys cannot be predicted. However, the cost, size, and power requirements of current QRNGs have prevented them from becoming widespread. In the meantime, the quality of the cameras integrated in mobile telephones has improved significantly, so that they are now sensitive to light at the few-photon level. We demonstrate how these can be used to generate random numbers of a quantum origin.
Article
Full-text available
We performed a sanity check of public keys collected on the web and found that the vast majority works as intended. Our main goal was to test the validity of the assumption that different random choices are made each time keys are generated. We found that this is not always the case, resulting in public keys that offer no security. Our conclusion is that generating secure public keys in the real world is challenging. We did not study usage of public keys.
Article
Full-text available
Random numbers are a valuable component in diverse applications that range from simulations over gambling to cryptography. The quest for true randomness in these applications has engendered a large variety of different proposals for producing random numbers based on the foundational unpredictability of quantum mechanics. However, most approaches do not consider that a potential adversary could have knowledge about the generated numbers, so the numbers are not verifiably random and unique. Here we present a simple experimental setup based on homodyne measurements that uses the purity of a continuous-variable quantum vacuum state to generate unique random numbers. We use the intrinsic randomness in measuring the quadratures of a mode in the lowest energy vacuum state, which cannot be correlated to any other state. The simplicity of our source, combined with its verifiably unique randomness, is an important attribute for achieving high-reliability, high-speed and low-cost quantum random number generators.
Conference Paper
Full-text available
We show that the existence of one-way functions is necessary and sufficient for the existence of pseudo-random generators in the following sense. Let ƒ be an easily computable function such that when x is chosen randomly: (1) from ƒ(x) it is hard to recover an x1 with ƒ(x1) = ƒ(x) by a small circuit, or; (2) ƒ has small degeneracy and from ƒ(x) it is hard to recover x by a fast algorithm. From one-way functions of type (1) or (2) we show how to construct pseudo-random generators secure against small circuits or fast algorithms, respectively, and vice versa. Previous results show how to construct pseudo-random generators from one-way functions that have special properties ([Blum, Micali 82], [Yao 82], [Levin 85], [Goldreich, Krawczyk, Luby 88]). We use the results of [Goldreich, Levin 89] in an essential way.
Article
Full-text available
According to quantum theory, measurements generate random outcomes, in stark contrast with classical mechanics. This raises the question of whether there could exist an extension of the theory that removes this indeterminism, as suspected by Einstein, Podolsky and Rosen. Although this has been shown to be impossible, existing results do not imply that the current theory is maximally informative. Here we ask the more general question of whether any improved predictions can be achieved by any extension of quantum theory. Under the assumption that measurements can be chosen freely, we answer this question in the negative: no extension of quantum theory can give more information about the outcomes of future measurements than quantum theory itself. Our result has significance for the foundations of quantum mechanics, as well as applications to tasks that exploit the inherent randomness in quantum theory, such as quantum cryptography.
Article
Full-text available
Random number generators (RNGs) are an important resource in many areas: cryptography (both quantum and classical), probabilistic computation (Monte Carlo methods), numerical simulations, industrial testing and labeling, hazard games, scientific research, etc. Because today's computers are deterministic, they cannot create random numbers unless complemented with an RNG. The randomness of an RNG can be precisely, scientifically characterized and measured. Especially valuable are information-theoretically provable RNGs (True RNGs, TRNGs), which, at the current state of the art, seem to be possible only by exploiting the physical randomness inherent to certain (simple) quantum systems. On the other hand, the current industry standard dictates the use of RNGs based on free-running oscillators (FROs), whose randomness is derived from electronic noise present in logic circuits and cannot be strictly proven. This approach is currently used in 3rd- and 4th-generation FPGA and ASIC hardware, which is unsuitable for the realization of a quantum TRNG. We compare weak and strong aspects of the two approaches and discuss the possibility of building a quantum TRNG in the recently appeared Mixed Signal FPGA technology. Finally, we discuss several examples where the use of a TRNG is critical and show how it can significantly improve the security of cryptographic systems.
Article
Full-text available
Randomness extraction involves the processing of purely classical information and is therefore usually studied in the framework of classical probability theory. However, such a classical treatment is generally too restrictive for applications where side information about the values taken by classical random variables may be represented by the state of a quantum system. This is particularly relevant in the context of cryptography, where an adversary may make use of quantum devices. Here, we show that the well-known construction paradigm for extractors proposed by Trevisan is sound in the presence of quantum side information. We exploit the modularity of this paradigm to give several concrete extractor constructions, which, e.g., extract all the conditional (smooth) min-entropy of the source using a seed of length poly-logarithmic in the input, or only require the seed to be weakly random.
Article
Full-text available
Randomness is a fundamental feature of nature and a valuable resource for applications ranging from cryptography and gambling to numerical simulation of physical and biological systems. Random numbers, however, are difficult to characterize mathematically, and their generation must rely on an unpredictable physical process. Inaccuracies in the theoretical modelling of such processes or failures of the devices, possibly due to adversarial attacks, limit the reliability of random number generators in ways that are difficult to control and detect. Here, inspired by earlier work on non-locality-based and device-independent quantum information processing, we show that the non-local correlations of entangled quantum particles can be used to certify the presence of genuine randomness. It is thereby possible to design a cryptographically secure random number generator that does not require any assumption about the internal working of the device. Such a strong form of randomness generation is impossible classically and possible in quantum systems only if certified by a Bell inequality violation. We carry out a proof-of-concept demonstration of this proposal in a system of two entangled atoms separated by approximately one metre. The observed Bell inequality violation, featuring near perfect detection efficiency, guarantees that 42 new random numbers are generated with 99 per cent confidence. Our results lay the groundwork for future device-independent quantum information experiments and for addressing fundamental issues raised by the intrinsic randomness of quantum theory.
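For orientation (standard textbook material, not taken from the abstract above), the certification is usually phrased via the CHSH form of the Bell inequality: every local hidden-variable model obeys $S = |E(a,b) + E(a,b') + E(a',b) - E(a',b')| \le 2$, while quantum mechanics allows values up to $2\sqrt{2}$. Any observed $S > 2$ therefore implies that the outcomes cannot be a deterministic function of pre-existing information, which is what makes a quantitative bound on the generated randomness possible.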
Article
Full-text available
An extractor is a function that is used to extract randomness. Given an imperfect random source X and a uniform seed Y, the output E(X,Y) is close to uniform. We study properties of such functions in the presence of prior quantum information about X, with a particular focus on cryptographic applications. We prove that certain extractors are suitable for key expansion in the bounded-storage model where the adversary has a limited amount of quantum memory. For extractors with one-bit output we show that the extracted bit is essentially equally secure as in the case where the adversary has classical resources. We prove the security of certain constructions that output multiple bits in the bounded-storage model.
Article
Full-text available
This paper provides a general treatment of privacy amplification by public discussion, a concept introduced by Bennett, Brassard, and Robert for a special scenario. Privacy amplification is a process that allows two parties to distil a secret key from a common random variable about which an eavesdropper has partial information. The two parties generally know nothing about the eavesdropper's information except that it satisfies a certain constraint. The results have applications to unconditionally secure secret-key agreement protocols and quantum cryptography, and they yield results on wiretap and broadcast channels for a considerably strengthened definition of secrecy capacity.
Article
Full-text available
Similarly to quantum states, quantum measurements can also be "mixed", corresponding to a random choice within an ensemble of measuring apparatuses. Such mixing is equivalent to a sort of hidden variable, which produces noise of a purely classical nature. It is then natural to ask which apparatuses are "indecomposable", i.e., do not correspond to any random choice of apparatuses. This problem is interesting not only for foundations, but also for applications, since most optimization strategies give optimal apparatuses that are indecomposable. Mathematically the problem is posed by describing each measuring apparatus by a positive operator-valued measure (POVM), which gives the statistics of the outcomes for any input state. The POVMs form a convex set, and in this language the indecomposable apparatuses are represented by extremal points, the analogue of "pure states" in the convex set of states. Differently from the case of states, however, indecomposable POVMs are not necessarily rank-one, e.g., von Neumann measurements. In this paper we give a complete classification of indecomposable apparatuses (for discrete spectrum), by providing different necessary and sufficient conditions for extremality of POVMs, along with a simple general algorithm for the decomposition of a POVM into extremals. As an interesting application, "informationally complete" measurements are analyzed in this respect. The convex set of POVMs is fully characterized by determining its border in terms of simple algebraic properties of the corresponding POVMs.
Article
This work is intended as an introduction to cryptographic security and a motivation for the widely used Quantum Key Distribution (QKD) security definition. We review the notion of security necessary for a protocol to be usable in a larger cryptographic context, i.e., for it to remain secure when composed with other secure protocols. We then derive the corresponding security criterion for QKD. We provide several examples of QKD composed in sequence and parallel with different cryptographic schemes to illustrate how the error of a composed protocol is the sum of the errors of the individual protocols. We also discuss the operational interpretations of the distance metric used to quantify these errors.
Article
We prove continuity of quantum conditional information $S(\rho^{12}| \rho^2)$ with respect to the uniform convergence of states and obtain a bound which is independent of the dimension of the second party. This can, e.g., be used to prove the continuity of squashed entanglement.
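For orientation, the bound is of the Alicki-Fannes form (stated here from memory, so the constants should be checked against the paper): with $\varepsilon = \|\rho^{12} - \sigma^{12}\|_1$ and $d_1$ the dimension of the first system, $|S(\rho^{12}|\rho^2) - S(\sigma^{12}|\sigma^2)| \le 4\varepsilon \log d_1 + 2h(\varepsilon)$, where $h$ is the binary entropy function; the key point is that the right-hand side is independent of the dimension of the second system.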
Article
We argue that the concepts of "freedom of choice" and of "causal order" are intrinsically linked: a choice is considered "free" if it is correlated only to variables in its causal future. We discuss the implications of this for Bell-type scenarios, where two separate measurements are carried out, neither of which lies in the causal future of the other, and where one typically assumes that the measurement settings are chosen freely. Furthermore, we refute a recent criticism made by Ghirardi and Romano in [arXiv:1301.5040] and [arXiv:1302.1635] that we used an unphysical freedom of choice assumption in our previous works, [Nat. Commun. 2, 411 (2011)] and [Phys. Rev. Lett. 108, 150402 (2012)].
Article
The most efficient way of obtaining information about the state of a quantum system is not always a direct measurement. It is sometimes preferable to extend the original Hilbert space of states into a larger space, and then to perform a quantum measurement in the enlarged space. Such an extension is always possible, by virtue of Neumark's theorem. The physical interpretation usually given to that theorem is the introduction of an auxiliary quantum system, prepared in a standard state, and the execution of a quantum measurement on both systems together. However, this widespread interpretation is unacceptable, because the statistical properties of the supposedly standard auxiliary system are inseparably entangled with those of the original, unknown system. A different method of preparing the auxiliary system is proposed, and shown to be physically acceptable.
Conference Paper
This paper describes an architecture for truly random number generator circuits. A simple circuit to generate truly random numbers, based on the thermal noise of a resistor, is presented, together with some simulation results. The circuit can be fabricated using a standard CMOS process.
Article
In this paper, we show that the conditional min-entropy $H_{\min}(A|B)$ of a bipartite state $\rho_{AB}$ is directly related to the maximum achievable overlap with a maximally entangled state if only local actions on the $B$-part of $\rho_{AB}$ are allowed. In the special case where $A$ is classical, this overlap corresponds to the probability of guessing $A$ given $B$. In a similar vein, we connect the conditional max-entropy $H_{\max}(A|B)$ to the maximum fidelity of $\rho_{AB}$ with a product state that is completely mixed on $A$. In the case where $A$ is classical, this corresponds to the security of $A$ when used as a secret key in the presence of an adversary holding $B$. Because min- and max-entropies are known to characterize information-processing tasks such as randomness extraction and state merging, our results establish a direct connection between these tasks and basic operational problems. For example, they imply that the negative logarithm of the probability of guessing $A$ given $B$ is a lower bound on the number of uniform secret bits that can be extracted from $A$ relative to an adversary holding $B$.
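Written out (standard definitions, added here for convenience), the classical-$A$ special case mentioned above is $p_{\mathrm{guess}}(A|B) = \max_{\{E_x\}} \sum_x P_A(x)\,\mathrm{tr}(E_x \rho_B^x) = 2^{-H_{\min}(A|B)}$, where the maximization runs over POVMs $\{E_x\}$ on $B$ and $\rho_B^x$ denotes the state of $B$ conditioned on $A = x$.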
Article
The Leftover Hash Lemma states that the output of a two-universal hash function applied to an input with sufficiently high entropy is almost uniformly random. In its standard formulation, the lemma refers to a notion of randomness that is (usually implicitly) defined with respect to classical side information. Here, a strictly more general version of the Leftover Hash Lemma that is valid even if side information is represented by the state of a quantum system is shown. Our result applies to almost two-universal families of hash functions. The generalized Leftover Hash Lemma has applications in cryptography, e.g., for key agreement in the presence of an adversary who is not restricted to classical information processing.
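A convenient way to state the quantitative content of the lemma (standard form, paraphrased rather than quoted): if $F$ is drawn uniformly from a two-universal family of functions from $\{0,1\}^n$ to $\{0,1\}^\ell$, the output $F(X)$ is $\varepsilon$-close to uniform and independent of the side information $E$ (and of $F$) whenever $\ell \lesssim H_{\min}(X|E) - 2\log\frac{1}{\varepsilon}$, up to small additive constants.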
Article
In this paper we exhibit several new classes of hash functions with certain desirable properties, and introduce two novel applications for hashing which make use of these functions. One class contains a small number of functions, yet is almost universal$_2$. If the functions hash n-bit long names into m-bit indices, then specifying a member of the class requires only $O((m + \log_2\log_2 n) \cdot \log_2 n)$ bits, as compared to $O(n)$ bits for earlier techniques. For long names, this is about a factor of m larger than the lower bound of $m + \log_2 n - \log_2 m$ bits. An application of this class is a provably secure authentication technique for sending messages over insecure lines. A second class of functions satisfies a much stronger property than universal$_2$. We present the application of testing sets for equality. The authentication technique allows the receiver to be certain that a message is genuine. An "enemy", even one with infinite computer resources, cannot forge or modify a message without detection. The set equality technique allows operations including "add member to set," "delete member from set" and "test two sets for equality" to be performed in expected constant time and with less than a specified probability of error.
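As a concrete point of reference, the sketch below implements the classic modular-arithmetic universal$_2$ family $h_{a,b}(x) = ((a x + b) \bmod p) \bmod 2^m$ in Python; this is a standard textbook construction rather than the specific classes introduced in the paper, and the prime and parameters are illustrative choices.

# Standard universal_2 hash family (textbook construction, for illustration;
# not the specific classes proposed in the paper):
#   h_{a,b}(x) = ((a*x + b) mod p) mod 2**m,  a in {1,...,p-1}, b in {0,...,p-1},
# with p a prime larger than any input x. For fixed x != y, a randomly chosen
# (a, b) produces a collision with probability roughly 2**-m.
import secrets

P = (1 << 61) - 1   # a Mersenne prime; large enough for 61-bit inputs (illustrative choice)

def random_hash(m):
    """Pick a random member h_{a,b} of the family, mapping inputs to m-bit indices."""
    a = 1 + secrets.randbelow(P - 1)   # a in {1, ..., P-1}
    b = secrets.randbelow(P)           # b in {0, ..., P-1}
    return lambda x: ((a * x + b) % P) & ((1 << m) - 1)

h = random_hash(16)
print(h(42), h(43))  # two 16-bit indices; distinct inputs rarely collide

Note that specifying a member of this family costs about $2\log_2 p$ bits, i.e., on the order of the input length, which is exactly the kind of description length the classes in the paper are designed to reduce.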
Article
This paper gives an input independent average linear time algorithm for storage and retrieval on keys. The algorithm makes a random choice of hash function from a suitable class of hash functions. Given any sequence of inputs the expected time (averaging over all functions in the class) to store and retrieve elements is linear in the length of the sequence. The number of references to the data base required by the algorithm for any input is extremely close to the theoretical minimum for any possible hash function with randomly distributed inputs. We present three suitable classes of hash functions which also can be evaluated rapidly. The ability to analyze the cost of storage and retrieval without worrying about the distribution of the input allows as corollaries improvements on the bounds of several algorithms.
Article
In this paper, we investigate how the use of a channel with perfect authenticity but no privacy can be used to repair the defects of a channel with imperfect privacy but no authenticity. More precisely, let us assume that Alice and Bob wish to agree on a secret random bit string, and have at their disposal an imperfect private channel and a perfect public channel. The private channel is imperfect in various ways: transmission errors can occur, and partial information can leak to an eavesdropper, Eve, who also has the power to suppress, inject, and modify transmissions arbitrarily. On the other hand, the public channel transmits information accurately, and these transmissions cannot be modified or suppressed by Eve, but their entire contents becomes known to her. We consider the situation in which a random bit string x has already been transmitted from Alice to Bob over the private channel, and we describe interactive public channel protocols that allow them, with high probability: (1) to assess the extent to which the private channel transmission has been corrupted by tampering and channel noise; and (2) if this corruption is not too severe, to repair Bob's partial ignorance of the transmitted string and Eve's partial knowledge of it by distilling from the transmitted and received versions of the string another string, in general shorter than x, upon which Alice and Bob have perfect information, while Eve has nearly no information (or in some cases exactly none), except for its length. These protocols remain secure against unlimited computing power.
Article
Extractors are functions which are able to "extract" random bits from arbitrary distributions which "contain" sufficient randomness. Explicit constructions of extractors have many applications in complexity theory and combinatorics. This manuscript is a survey of recent developments in extractors and focuses on explicit constructions of extractors following Trevisan's breakthrough result [L. Trevisan, Construction of extractors using pseudorandom generators. In Proc. 31st ACM Symposium on Theory of Computing (1999)].
Book
Part I. Fundamental Concepts: 1. Introduction and overview; 2. Introduction to quantum mechanics; 3. Introduction to computer science; Part II. Quantum Computation: 4. Quantum circuits; 5. The quantum Fourier transform and its application; 6. Quantum search algorithms; 7. Quantum computers: physical realization; Part III. Quantum Information: 8. Quantum noise and quantum operations; 9. Distance measures for quantum information; 10. Quantum error-correction; 11. Entropy and information; 12. Quantum information theory; Appendices; References; Index.
Article
Are there fundamentally random processes in nature? Theoretical predictions, confirmed experimentally, such as the violation of Bell inequalities, point to an affirmative answer. However, these results are based on the assumption that measurement settings can be chosen freely at random, so assume the existence of perfectly free random processes from the outset. Here we consider a scenario in which this assumption is weakened and show that partially free random bits can be amplified to make arbitrarily free ones. More precisely, given a source of random bits whose correlation with other variables is below a certain threshold, we propose a procedure for generating fresh random bits that are virtually uncorrelated with all other variables. We also conjecture that such procedures exist for any non-trivial threshold. Our result is based solely on the no-signalling principle, which is necessary for the existence of free randomness.
Article
The data processing inequality (DPI) is a fundamental feature of information theory. Informally it states that you cannot increase the information content of a quantum system by acting on it with a local physical operation. When the smooth min-entropy is used as the relevant information measure, then the DPI follows immediately from the definition of the entropy. The DPI for the von Neumann entropy is then obtained by specializing the DPI for the smooth min-entropy by using the quantum asymptotic equipartition property (QAEP). We provide a new, simplified proof of the QAEP and therefore obtain a self-contained proof of the DPI for the von Neumann entropy.
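In symbols (standard formulation, added for orientation): if the side information $B$ is processed by a quantum channel $\mathcal{E}$, then $H_{\min}^{\varepsilon}(A|\mathcal{E}(B)) \ge H_{\min}^{\varepsilon}(A|B)$, and correspondingly $S(A|\mathcal{E}(B)) \ge S(A|B)$ for the von Neumann entropy; local processing of the side information can only increase the uncertainty about $A$.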
Article
We construct a strong extractor against quantum storage that works for every min-entropy $k$, has logarithmic seed length, and outputs $\Omega(k)$ bits, provided that the quantum adversary has at most $\beta k$ qubits of memory, for any $\beta < 1/2$. The construction works by first condensing the source (with minimal entropy loss) and then applying an extractor that works well against quantum adversaries when the source is close to uniform. We also obtain an improved construction of a strong quantum-proof extractor in the high min-entropy regime. Specifically, we construct an extractor that uses a logarithmic seed length and extracts $\Omega(n)$ bits from any source over $\{0,1\}^n$, provided that the min-entropy of the source conditioned on the quantum adversary's state is at least $(1-\beta) n$, for any $\beta < 1/2$.
Article
After a general introduction, the thesis is divided into four parts. In the first, we discuss the task of coin tossing, principally in order to highlight the effect different physical theories have on security in a straightforward manner, but also to introduce a new protocol for non-relativistic strong coin tossing. This protocol matches the security of the best protocol known to date while using a conceptually different approach to achieve the task. In the second part, variable-bias coin tossing is introduced. This is a variant of coin tossing in which one party secretly chooses one of two biased coins to toss. It is shown that this can be achieved with unconditional security for a specified range of biases, and with cheat-evident security for any bias. We also discuss two further protocols which are conjectured to be unconditionally secure for any bias. The third section looks at other two-party secure computations for which, prior to our work, neither protocols nor no-go theorems were known. We introduce a general model for such computations, and show that, within this model, a wide range of functions are impossible to compute securely. We give explicit cheating attacks for such functions. In the final chapter we discuss the task of expanding a private random string, while dropping the usual assumption that the protocol's user trusts her devices. Instead we assume that all quantum devices are supplied by an arbitrarily malicious adversary. We give two protocols that we conjecture securely perform this task. The first allows a private random string to be expanded by a finite amount, while the second generates an arbitrarily large expansion of such a string.
Conference Paper
It is shown that modified versions of the linear congruential generator and the shift-register generator are provably good for amplifying the correctness of a probabilistic algorithm. More precisely, if r random bits are needed for a BPP algorithm to be correct with probability at least 2/3, then $O(r + k^2)$ bits are needed to improve this probability to $1 - 2^{-k}$. A different pseudorandom generator that is optimal, up to a constant factor, in this regard is also presented. It uses only $O(r + k)$ bits to improve the probability to $1 - 2^{-k}$. This generator is based on random walks on expanders. The results do not depend on any unproven assumptions. It is shown that the modified versions of the shift-register and linear congruential generators can be used to sample from distributions using, in the limit, the information-theoretic lower bound on random bits.
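For orientation only, here is a plain (unmodified) linear congruential generator in Python; the point of the paper is that suitably modified versions of this construction and of the shift-register generator already suffice for the amplification task, so this sketch illustrates the raw building block rather than the provably good construction, and the constants (Knuth's MMIX parameters) are merely illustrative.

# Plain linear congruential generator (LCG), for orientation only; the paper
# analyses *modified* versions of this construction, which this sketch is not.
#   x_{i+1} = (a * x_i + c) mod 2**64, with illustrative (MMIX) constants.
def lcg(seed, a=6364136223846793005, c=1442695040888963407, m=2**64):
    x = seed
    while True:
        x = (a * x + c) % m
        yield x

gen = lcg(seed=12345)
print([next(gen) >> 32 for _ in range(4)])  # keep the high bits; low LCG bits are weak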
Article
We introduce a new approach to constructing extractors. Extractors are algorithms that transform a “weakly random” distribution into an almost uniform distribution. Explicit constructions of extractors have a variety of important applications, and tend to be very difficult to obtain. We demonstrate an unsuspected connection between extractors and pseudorandom generators. In fact, we show that every pseudorandom generator of a certain kind is an extractor. A pseudorandom generator construction due to Impagliazzo and Wigderson, once reinterpreted via our connection, is already an extractor that beats most known constructions and solves an important open question. We also show that, using the simpler Nisan-Wigderson generator and standard error-correcting codes, one can build even better extractors with the additional advantage that both the construction and the analysis are simple and admit a short self-contained description.
Article
We show that any randomized algorithm that runs in space S and time T and uses poly(S) random bits can be simulated using only O(S) random bits in space S and time T·poly(S). A deterministic simulation in space S follows. Of independent interest is our main technical tool: a procedure which extracts randomness from a defective random source using a small additional number of truly random bits.
Article
This paper is an expository treatment of the leftover hash lemma and some of its applications in cryptography and complexity. The technique of universal hashing, introduced in 1979 by Carter and Wegman [6], has become an essential tool in many areas of computer science, including derandomization, pseudorandom number generation and privacy amplification, to mention three specific applications. It has been observed that universal hash families are very closely related to combinatorial structures such as orthogonal arrays ([11]) and error-correcting codes ([15]), and we will frequently make use of these connections (for a survey, see Stinson [21]). Several random number generators related to strongly universal hash families have been shown to have desirable quasirandomness properties; see, for example, [17]. (Quasirandomness provides a measure of how closely a given probability distribution approximates the uniform distribution.) We will give a self-contained, elementary treatment ...
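To connect the lemma to QRNG post-processing in practice, a widely used two-universal family is the set of binary Toeplitz matrices; the Python sketch below (my own illustration, with arbitrary parameters) applies a seeded Toeplitz matrix to a block of raw bits. In an actual extractor the output length would be chosen from the estimated min-entropy of the raw data via the leftover hash lemma.

# Sketch of Toeplitz hashing, a standard two-universal extractor used for QRNG
# post-processing (illustrative parameters; the output length should be set
# from the estimated min-entropy via the leftover hash lemma).
import numpy as np

def toeplitz_extract(raw_bits, seed_bits, out_len):
    """Multiply raw_bits (length n) by an out_len x n binary Toeplitz matrix over GF(2).

    The matrix is determined by n + out_len - 1 seed bits via T[i, j] = seed_bits[i - j + n - 1].
    """
    n = len(raw_bits)
    assert len(seed_bits) == n + out_len - 1
    i = np.arange(out_len)[:, None]
    j = np.arange(n)[None, :]
    T = seed_bits[i - j + n - 1]          # constant along each diagonal: a Toeplitz matrix
    return T.dot(raw_bits) % 2            # matrix-vector product over GF(2)

rng = np.random.default_rng(1)                        # stand-in for raw data and seed
raw = rng.integers(0, 2, size=1024)                   # raw (imperfect) bits
seed = rng.integers(0, 2, size=1024 + 512 - 1)        # uniform seed bits
print(toeplitz_extract(raw, seed, out_len=512)[:16])  # keeping 512 of 1024 bits is an arbitrary choice here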
Article
In the classical privacy amplification problem Alice and Bob share information that is only partially secret towards an eavesdropper Charlie. Their goal is to distill this information to a shorter string that is completely secret. The classical privacy amplification problem can be solved almost optimally using extractors. An interesting variant of the problem, where the eavesdropper Charlie is allowed to keep quantum information rather than just classical information, was introduced by Konig, Maurer and Renner. In this setting, the eavesdropper Charlie may entangle himself with the input (without changing it) and the only limitation Charlie has is that it may keep at most b qubits of storage. A natural question is whether there are classical extractors that are good even against quantum storage. Recent work has shown that some classical extractors miserably fail against quantum storage. At the same time, it was shown that some other classical extractors work well even against quantum storage, but all these extractors had a large seed length that was either as large as the extractor output, or as large as the quantum storage available to the eavesdropper. In this paper we show that a modified version of Trevisan's extractor is good even against quantum storage, thereby giving the first such construction with logarithmic seed length. The technique we use is a combination of Trevisan's approach of constructing an extractor from a black-box pseudorandom generator, together with locally list-decodable codes and previous work done on quantum random access codes.
Conference Paper
Privacy amplification is the art of shrinking a partially secret string Z to a highly secret key S. We show that, even if an adversary holds quantum information about the initial string Z, the key S obtained by two-universal hashing is secure, according to a universally composable security definition. Additionally, we give an asymptotically optimal lower bound on the length of the extractable key S in terms of the adversary's (quantum) knowledge about Z. Our result has applications in quantum cryptography. In particular, it implies that many of the known quantum key distribution protocols are universally composable.
Article
Simple optical instruments are linear optical networks where the incident light modes are turned into equal numbers of outgoing modes by linear transformations. For example, such instruments are beam splitters, multiports, interferometers, fibre couplers, polarizers, gravitational lenses, parametric amplifiers, phase-conjugating mirrors and also black holes. The article develops the quantum theory of simple optical instruments and applies the theory to a few characteristic situations, to the splitting and interference of photons and to the manifestation of Einstein-Podolsky-Rosen correlations in parametric downconversion. How to model irreversible devices such as absorbers and amplifiers is also shown. Finally, the article develops the theory of Hawking radiation for a simple optical black hole. The paper is intended as a primer, as a nearly self-consistent tutorial. The reader should be familiar with basic quantum mechanics and statistics, and perhaps with optics and some elementary field theory. The quantum theory of light in dielectrics serves as the starting point and, in the concluding section, as a guide to understand quantum black holes.
Privacy amplification by public discussion
  • C H Bennett
  • G Brassard
  • J.-M Robert
C. H. Bennett, G. Brassard, and J.-M. Robert, "Privacy amplification by public discussion," SIAM Journal on Computing 17, 210 (1988).
Postprocessing for quantum random number generators: entropy evaluation and randomness extraction
  • Xiongfeng Ma
  • Feihu Xu
  • He Xu
  • Xiaoqing Tan
  • Bing Qi
  • Hoi-Kwong Lo
Xiongfeng Ma, Feihu Xu, He Xu, Xiaoqing Tan, Bing Qi, and Hoi-Kwong Lo, "Postprocessing for quantum random number generators: entropy evaluation and randomness extraction," arXiv:1207.1473 (2012).
A randomness extractor for the Quantis device
  • Matthias Troyer
  • Renato Renner
Matthias Troyer and Renato Renner, "A randomness extractor for the Quantis device," http://www.idquantique.com/images/stories/PDF/quantis-random-generator/quantisrndextract-techpaper.pdf (2012).