ABSTRACT
Quantum computers are expected to break modern public-key cryptography owing to Shor's algorithm. As a result, these cryptosystems need to be replaced by quantum-resistant algorithms, also known as post-quantum cryptography (PQC) algorithms. The PQC research field has flourished over the past two decades, leading to the creation of a large variety of algorithms that are expected to be resistant to quantum attacks. These PQC algorithms are being selected and standardized by several standardization bodies. However, even with the guidance from these important efforts, the danger is not gone: there are billions of old and new devices that need to transition to the PQC suite of algorithms, leading to a multidecade transition process that has to account for aspects such as security, algorithm performance, ease of secure implementation, compliance, and more. Here we present an organizational perspective on the PQC transition. We discuss transition timelines, leading strategies to protect systems against quantum attacks, and approaches for combining pre-quantum cryptography with PQC to minimize transition risks. We suggest standards to start experimenting with now and provide a series of other recommendations to allow organizations to achieve a smooth and timely PQC transition.
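As a concrete illustration of the hybrid approach mentioned above, the following is a minimal sketch (in Python, using the widely available `cryptography` package) of one common pattern for combining pre-quantum cryptography with PQC: a session key is derived from both a classical X25519 exchange and a post-quantum KEM shared secret, so it remains protected as long as either primitive holds. The PQC shared secret below is a random placeholder standing in for a real KEM such as ML-KEM, and the `info` label is an arbitrary example value; this is a sketch of the pattern, not any standard's prescribed construction.

```python
# Minimal hybrid key-establishment sketch (illustrative, not a real protocol).
# The session key depends on BOTH a classical X25519 exchange and a
# post-quantum KEM shared secret, so it stays safe if either primitive holds.
import os

from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# Classical part: ordinary X25519 Diffie-Hellman between two parties.
alice_priv = X25519PrivateKey.generate()
bob_priv = X25519PrivateKey.generate()
classical_shared = alice_priv.exchange(bob_priv.public_key())

# Post-quantum part: random placeholder standing in for a KEM shared secret
# (a real deployment would obtain this from a standardized PQC KEM library).
pqc_shared = os.urandom(32)

# Combine both secrets through a KDF: recovering the session key requires
# breaking both the classical and the post-quantum component.
session_key = HKDF(
    algorithm=hashes.SHA256(),
    length=32,
    salt=None,
    info=b"hybrid-pqc-transition-demo",  # arbitrary context label for this sketch
).derive(classical_shared + pqc_shared)

print(session_key.hex())
```

The design point is simply that the KDF input concatenates both shared secrets, which is the general shape of hybrid schemes organizations can experiment with while PQC standards and implementations mature.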
ABSTRACT
Weinberg's seminal prediction of the cosmological constant relied on a provisional method for regulating eternal inflation, which has since been put aside. We show that a modern regulator, the causal patch, improves agreement with observation, removes many limiting assumptions, and yields additional powerful results. Without assuming necessary conditions for observers such as galaxies or entropy production, the causal patch measure predicts the coincidence of vacuum energy and present matter density. Their common scale, and thus the enormous size of the visible Universe, originates in the number of metastable vacua in the landscape.
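As a rough illustration of the final claim (a back-of-the-envelope sketch, not the paper's derivation): if the landscape contains N metastable vacua whose vacuum energies are spread over a roughly Planckian range, the smallest typical positive value scales as 1/N; this sets the time of vacuum domination, and the predicted coincidence with the matter density then ties the observed epoch, and hence the size of the visible Universe, to N:

\[
\Lambda_{\min} \sim \frac{1}{N}, \qquad
t_{\Lambda} \sim \Lambda_{\min}^{-1/2} \sim \sqrt{N}, \qquad
\rho_{\Lambda} \sim \rho_{\mathrm{matter}}(t_{\mathrm{obs}}) \;\Longrightarrow\; t_{\mathrm{obs}} \sim t_{\Lambda} \sim \sqrt{N}
\]

(in Planck units; the uniform spread of vacuum energies is an assumption made here only to make the scaling explicit).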
ABSTRACT
We derive a generalization of the second law of thermodynamics that uses Bayesian updates to explicitly incorporate the effects of a measurement of a system at some point in its evolution. By allowing an experimenter's knowledge to be updated by the measurement process, this formulation resolves a tension between the fact that the entropy of a statistical system can sometimes fluctuate downward and the information-theoretic idea that knowledge of a stochastically evolving system degrades over time. The Bayesian second law can be written as ΔH(ρ_{m},ρ) + ⟨Q⟩_{F|m} ≥ 0, where ΔH(ρ_{m},ρ) is the change in the cross entropy between the original phase-space probability distribution ρ and the measurement-updated distribution ρ_{m}, and ⟨Q⟩_{F|m} is the expectation value of a generalized heat flow out of the system. We also derive refined versions of the second law that bound the entropy increase from below by a non-negative number, as well as Bayesian versions of integral fluctuation theorems. We demonstrate the formalism using simple analytical and numerical examples.
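Written out with the standard definition of cross entropy (the explicit time arguments below are an assumption about the intended notation, added only for clarity), the inequality stated above reads

\[
H(\rho_{m},\rho) \equiv -\int dx\, \rho_{m}(x)\,\log \rho(x), \qquad
H(\rho_{m},\rho)\big|_{t_{f}} - H(\rho_{m},\rho)\big|_{t_{i}} + \langle Q \rangle_{F|m} \;\geq\; 0,
\]

so the cross entropy between the measurement-updated distribution ρ_{m} and the original distribution ρ can decrease only if, on average over the forward protocol conditioned on the outcome m, a compensating amount of heat flows out of the system.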