ABSTRACT
We consider how to describe Hamiltonian mechanics in generalized probabilistic theories, with states represented as quasiprobability distributions. We give general operational definitions of energy-related concepts. We define generalized energy eigenstates as the purest stationary states. Planck's constant plays two distinct roles in the framework: it sets the phase-space volume occupied by a pure state and acts as a dynamical factor. The Hamiltonian is a linear combination of generalized energy eigenstates. This allows for a generalized Liouville time-evolution equation that applies to quantum and classical Hamiltonian mechanics and more. The approach enables a unification of quantum and classical energy concepts and a route to discussing energy in a wider set of theories.
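As context, the two limiting cases that any such generalized Liouville equation must reduce to are standard (this is textbook material, not the paper's generalized equation itself):

```latex
% Classical Hamiltonian mechanics: Liouville's equation for a
% phase-space density \rho(q,p,t), with the Poisson bracket
% \{A,B\} = \partial_q A \,\partial_p B - \partial_p A \,\partial_q B:
\frac{\partial \rho}{\partial t} = \{H, \rho\}
% Quantum mechanics in phase space: the Wigner function W evolves under
% the Moyal bracket, which reduces to the Poisson bracket as \hbar \to 0:
\frac{\partial W}{\partial t} = \{\!\{H, W\}\!\} = \{H, W\} + O(\hbar^2)
```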
ABSTRACT
We experimentally probe the interplay of the quantum switch with the laws of thermodynamics. The quantum switch places two channels in a superposition of orders and may be applied to thermalizing channels. Quantum switching of thermal channels has been shown to give apparent violations of the second law. Central to these apparent violations is the fact that quantum-switching channels can increase the capacity to communicate information. We experimentally show this increase and how it is consistent with the laws of thermodynamics, demonstrating how thermodynamic resources are consumed. We use a nuclear magnetic resonance approach with coherently controlled interactions of nuclear spin qubits. We verify an analytical upper bound on the increase in capacity for channels that preserve energy and thermal states, and demonstrate that the bound can be exceeded for an energy-altering channel. We show that the switch can be used to take a thermal state to a state that is not thermal, while consuming free energy associated with the coherence of a control system. The results show how the switch can be incorporated into quantum thermodynamics experiments as an additional resource.
ABSTRACT
We address a new setting in which the second law is under question: thermalizations in a quantum superposition of causal orders, enacted by the so-called quantum switch. This superposition has been shown to be associated with an increase in the communication capacity of the channels, yielding an apparent violation of the data-processing inequality and the possibility of separating hot from cold. We analyze the thermodynamics of this information-capacity-increasing process. We show how the information capacity increase is compatible with thermodynamics. We show that there may indeed be an information capacity increase for consecutive thermalizations obeying the first and second laws of thermodynamics if these are placed in an indefinite order, and moreover that only a tightly bounded increase is possible. The increase comes at the cost of consuming a thermodynamic resource: the free energy of coherence associated with the switch.
ABSTRACT
We consider how the energy cost of bit reset scales with the time duration of the protocol. Bit reset necessarily takes place in finite time, where there is an extra penalty on top of the quasistatic work cost derived by Landauer. This extra energy is dissipated as heat in the computer, imposing a fundamental limit on the speed of irreversible computers. We formulate a hardware-independent expression for this limit in the framework of stochastic processes. We derive a closed-form lower bound on the work penalty as a function of the time taken for the protocol and the bit reset error. It holds for discrete as well as continuous systems, assuming only that the master equation respects detailed balance.
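As a minimal numerical reference point for the quasistatic cost that the finite-time penalty sits on top of (a sketch; `landauer_cost` is an illustrative helper, not from the paper):

```python
import math

def landauer_cost(T, k_B=1.380649e-23):
    """Quasistatic Landauer work cost (J) of resetting one bit at temperature T (K)."""
    return k_B * T * math.log(2)

# At room temperature the quasistatic cost is about 2.87e-21 J;
# any finite-time protocol must dissipate strictly more than this.
w = landauer_cost(300.0)
print(f"{w:.3e} J")  # -> 2.871e-21 J
```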
ABSTRACT
The heat generated by computations is not only an obstacle to circuit miniaturization but also a fundamental aspect of the relationship between information theory and thermodynamics. In principle, reversible operations may be performed at no energy cost; given that irreversible computations can always be decomposed into reversible operations followed by the erasure of data, the problem of calculating their energy cost is reduced to the study of erasure. Landauer's principle states that the erasure of data stored in a system has an inherent work cost and therefore dissipates heat. However, this consideration assumes that the information about the system to be erased is classical, and does not extend to the general case where an observer may have quantum information about the system to be erased, for instance by means of a quantum memory entangled with the system. Here we show that the standard formulation and implications of Landauer's principle are no longer valid in the presence of quantum information. Our main result is that the work cost of erasure is determined by the entropy of the system, conditioned on the quantum information an observer has about it. In other words, the more an observer knows about the system, the less it costs to erase it. This result gives a direct thermodynamic significance to conditional entropies, originally introduced in information theory. Furthermore, it provides new bounds on the heat generation of computations: because conditional entropies can become negative in the quantum case, an observer who is strongly correlated with a system may gain work while erasing it, thereby cooling the environment.
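The main result can be stated schematically as follows (a sketch of the reported form, with symbols as defined in the text; the precise operational statement and error terms are in the paper):

```latex
% Schematic statement of the main result: the work cost of erasing a
% system S given (possibly quantum) side information M is
W(S|M) = H(S|M) \, k_B T \ln 2
% with H(S|M) = H(SM) - H(M) the conditional von Neumann entropy.
% If S and M are maximally entangled, H(S|M) < 0, so W(S|M) < 0:
% the observer can extract work while erasing S.
```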
ABSTRACT
We report an experimental realization of Maxwell's demon in a photonic setup. We show that a measurement at the few-photon level followed by a feed-forward operation allows the extraction of work from intense thermal light into an electric circuit. The interpretation of the experiment motivates the derivation of an equality relating work extraction to the information acquired by measurement. We derive a bound using this relation and show that it is in agreement with the experimental results. Our work puts forward photonic systems as a platform for experiments on the role of information in thermodynamics.
ABSTRACT
Landauer's principle states that it costs at least k_B T ln 2 of work to reset one bit in the presence of a heat bath at temperature T. The bound of k_B T ln 2 is achieved in the unphysical infinite-time limit. Here we ask what is possible if one is restricted to finite-time protocols. We prove analytically that it is possible to reset a bit with a work cost close to k_B T ln 2 in a finite time. We construct an explicit protocol that achieves this, which involves thermalizing and changing the system's Hamiltonian so as to avoid quantum coherences. Using concepts and techniques pertaining to single-shot statistical mechanics, we furthermore prove that the heat dissipated is exponentially close to the minimal amount possible not just on average, but guaranteed with high confidence in every run. Moreover, we exploit the protocol to design a quantum heat engine that works near the Carnot efficiency in finite time.
ABSTRACT
We review recent results on textile triboelectric nanogenerators (T-TENGs), which function both as harvesters of mechanical energy and as self-powered motion sensors. T-TENGs can be flexible, breathable, and lightweight. With a combination of traditional and novel manufacturing methods, including nanofibers, T-TENGs can deliver promising power output. We review the evolution of T-TENG device structures based on various textile material configurations and fabrication methods, along with demonstrations of self-powered systems. We also provide a detailed analysis of different textile materials and of approaches used to enhance output. Additionally, we discuss integration with supercapacitors and potential applications in fields such as health monitoring, human activity monitoring, and human-machine interaction. This review concludes by addressing the challenges and key research questions that remain for developing viable T-TENG technology.
ABSTRACT
Polyhydroxyalkanoates (PHAs) could be used to make sustainable, biodegradable plastics. However, precise and accurate mechanistic modeling of PHA biosynthesis, especially of medium-chain-length PHA (mcl-PHA), for yield improvement remains a challenge in biology. PHA biosynthesis is typically triggered by nitrogen limitation and tends to peak at an optimal carbon-to-nitrogen (C/N) ratio. In particular, simulation of the dynamic regulation mechanisms underlying the PHA bioprocess is a bottleneck, owing to excessive model complexity and the limitations of current modeling philosophies for handling uncertainty. To address this issue, we propose a quantum-like decision-making model that encodes gene expression and regulation events as hidden layers via the general transformation of a density matrix, using the interference of probability amplitudes to provide an empirical-level description of PHA biosynthesis. We applied the framework to model the biosynthesis of mcl-PHA in Pseudomonas putida as a function of the external C/N ratio, finding maximum PHA production of 13.81% cell dry mass (CDM) at a C/N ratio of 40:1. The results also suggest, in the quantum formalism, the degree of P. putida's preference for channeling carbon towards PHA production as part of the bacterium's adaptive response to nutrient stress. We discuss the generic parameters (k_D, k_N, and θ) obtained from the quantum formulation, which represent P. putida's PHA biosynthesis as a function of the external C/N ratio. This work offers a new perspective on the use of quantum theory for PHA production, demonstrating its potential applicability to other bioprocesses.
Subjects
Nitrogen, Polyhydroxyalkanoates, Pseudomonas putida, Pseudomonas putida/metabolism, Pseudomonas putida/genetics, Polyhydroxyalkanoates/biosynthesis, Polyhydroxyalkanoates/metabolism, Nitrogen/metabolism, Carbon/metabolism, Quantum Theory, Nutrients/metabolism, Biological Models
ABSTRACT
A Comment on the Letter by S. W. Kim et al., Phys. Rev. Lett. 106, 070401 (2011). The authors of the Letter offer a Reply.
ABSTRACT
Annealing has proven highly successful in finding minima in a cost landscape. Yet, depending on the landscape, systems often converge towards local minima rather than global ones. In this Letter, we analyze the conditions under which annealing is approximately successful in finite time. We connect annealing to stochastic thermodynamics to derive a general bound on the distance between the system state at the end of the annealing and the ground state of the landscape. This distance depends on the number of state updates of the system and the accumulated nonequilibrium energy, two protocol- and energy-landscape-dependent quantities which we show are in a trade-off relation. We describe how to bound the two quantities both analytically and physically. This offers a general approach to assessing the performance of annealing from accessible parameters, for both simulated and physical implementations.
ABSTRACT
Annealing is the process of gradually lowering the temperature of a system to guide it towards its lowest energy states. In an accompanying paper [Y. Luo et al., Phys. Rev. E 108, L052105 (2023)], we derived a general bound on annealing performance by connecting annealing with stochastic thermodynamics tools, including a speed limit on state transformation from entropy production. We here describe the derivation of the general bound in detail. In addition, we analyze the case of simulated annealing with Glauber dynamics in depth. We show how to bound the two case-specific quantities appearing in the bound, namely the activity, a measure of the number of microstate jumps, and the change in relative entropy between the state and the instantaneous thermal state, which is due to temperature variation. We exemplify the arguments by numerical simulations on the Sherrington-Kirkpatrick (SK) model of spin glasses.
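A minimal sketch of simulated annealing with Glauber dynamics on a small SK instance, of the kind such numerical simulations use (all function names, system sizes, and the schedule are illustrative assumptions, not the paper's actual code):

```python
import numpy as np

rng = np.random.default_rng(0)

def sk_couplings(n):
    """Symmetric Gaussian couplings J_ij ~ N(0, 1/n) for the SK model."""
    J = rng.normal(0.0, 1.0 / np.sqrt(n), size=(n, n))
    J = (J + J.T) / 2.0
    np.fill_diagonal(J, 0.0)
    return J

def energy(J, s):
    """SK energy E = -1/2 * sum_ij J_ij s_i s_j."""
    return -0.5 * s @ J @ s

def glauber_anneal(J, betas, sweeps_per_beta=10):
    """Single-spin Glauber updates along an increasing inverse-temperature schedule."""
    n = J.shape[0]
    s = rng.choice([-1.0, 1.0], size=n)
    for beta in betas:
        for _ in range(sweeps_per_beta * n):
            i = rng.integers(n)
            dE = 2.0 * s[i] * (J[i] @ s)  # energy change if spin i flips
            # Glauber acceptance probability 1 / (1 + exp(beta * dE)):
            if rng.random() < 1.0 / (1.0 + np.exp(beta * dE)):
                s[i] = -s[i]
    return s

J = sk_couplings(32)
schedule = np.linspace(0.1, 5.0, 50)  # slow cooling: beta increases
s = glauber_anneal(J, schedule)
print(energy(J, s))
```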
ABSTRACT
A bit reset is a basic operation in irreversible computing. It costs work and dissipates energy in the computer, creating a limit on the speed and energy efficiency of future irreversible computers. It was recently shown by Zhen et al. [Phys. Rev. Lett. 127, 190602 (2021)] that for a finite-time reset protocol, the additional work on top of the quasistatic cost can always be minimized by considering a two-level system, and then lower bounded through a thermodynamic speed limit. An important question is to understand for which protocol parameters, including the bit reset error and the maximum energy shift, this penalty decreases exponentially rather than inverse-linearly in the protocol time. Here we provide several analytical results to address this question, as well as numerical simulations of specific example protocols.
ABSTRACT
This study investigates flexible and transparent thermoplastic polyurethane (TPU) as a novel material for triboelectric nanogenerator (TENG) devices paired with a polyethylene terephthalate layer. Devices were tested with TPU either as a flat film or as electrospun micrometer-scale fibers with varying TPU concentrations. The best-performing device provided an open-circuit voltage of 21.4 V and a short-circuit current of 23 µA under a small applied force of 0.33 N, indicating high device efficiency. Devices with flat films, obtained using the doctor-blade (DB) technique, combine high transparency (80%) with high TENG output. The topography of the TPU layer, characterized by atomic force microscopy, reveals nanoscale roughness of the film surface. Finally, we demonstrate that gentle hand tapping on the TENG device can power up to 11 light-emitting diodes (LEDs). The high transparency, light weight, simple fabrication, flexibility, and robustness of such devices make them attractive for various optoelectronic applications.
ABSTRACT
A remarkable feature of quantum theory is nonlocality (Bell inequality violations). However, quantum correlations are not maximally nonlocal, and it is natural to ask whether there are compelling reasons for rejecting theories in which stronger violations are possible. To shed light on this question, we consider post-quantum theories in which maximally nonlocal states (nonlocal boxes) occur. We show that reversible transformations in such theories are trivial: they consist solely of local operations and permutations of systems. In particular, no correlations can be created; nonlocal boxes cannot be prepared from product states and classical computers can efficiently simulate all such processes.
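The nonlocal box referred to here is the standard Popescu-Rohrlich construction, which wins the CHSH game maximally; a small sketch for concreteness (helper names are illustrative, and this is textbook background rather than the paper's own formalism):

```python
import itertools

def pr_box(x, y):
    """PR-box conditional distribution p(a,b|x,y): uniform over a XOR b = x AND y."""
    return {(a, b): (0.5 if (a ^ b) == (x & y) else 0.0)
            for a, b in itertools.product([0, 1], repeat=2)}

def chsh(box):
    """CHSH value S = sum_{x,y} (-1)^{xy} E(x,y), with correlator E = <(-1)^{a+b}>."""
    S = 0.0
    for x, y in itertools.product([0, 1], repeat=2):
        E = sum(p * (-1) ** (a ^ b) for (a, b), p in box(x, y).items())
        S += (-1) ** (x * y) * E
    return S

# The PR box reaches the algebraic maximum S = 4, beyond the classical
# limit of 2 and the quantum (Tsirelson) limit of 2*sqrt(2).
print(chsh(pr_box))
```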
ABSTRACT
We probe the potential for intelligent intervention to enhance the power output of energy harvesters. We investigate general principles and a case study: a bi-resonant piezoelectric harvester. We consider intelligent interventions via pre-programmed reversible energy-conserving operations, including voltage bias flips and voltage phase shifts. These can be used to rectify voltages and to remove destructive interference. We choose the intervention type based on past data, using machine learning techniques. We find that in important parameter regimes the resulting interventions can outperform diode-based intervention, which in contrast is subject to a fundamental minimum power dissipation bound.
ABSTRACT
Thermodynamics describes large-scale, slowly evolving systems. Two modern approaches generalize thermodynamics: fluctuation theorems, which concern finite-time nonequilibrium processes, and one-shot statistical mechanics, which concerns small scales and finite numbers of trials. Combining these approaches, we calculate a one-shot analog of the average dissipated work defined in fluctuation contexts: the cost of performing a protocol in finite time instead of quasistatically. The average dissipated work has been shown to be proportional to a relative entropy between phase-space densities, to a relative entropy between quantum states, and to a relative entropy between probability distributions over possible values of work. We derive one-shot analogs of all three equations, demonstrating that the order-infinity Rényi divergence is proportional to the maximum possible dissipated work in each case. These one-shot analogs of fluctuation-theorem results contribute to the unification of these two toolkits for small-scale, nonequilibrium statistical physics.
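For concreteness, the order-infinity Rényi divergence for probability distributions takes the standard form below; the proportionality is a schematic restatement of the abstract's claim, with the exact prefactors in the paper:

```latex
% Order-infinity Renyi divergence (max-divergence) between
% probability distributions P and Q:
D_\infty(P \,\|\, Q) = \log \max_x \frac{P(x)}{Q(x)}
% Schematic restatement of the result: the maximum possible dissipated
% work is proportional to D_\infty, paralleling the average-case relation
% \langle W_{\mathrm{diss}} \rangle \propto k_B T \, D(P \,\|\, Q):
W_{\mathrm{diss}}^{\max} \propto k_B T \, D_\infty
```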
ABSTRACT
The patterns of fringes produced by an interferometer have long been important testbeds for our best contemporary theories of physics. Historically, interference has been used to contrast quantum mechanics with classical physics, but recently experiments have been performed that test quantum theory against even more exotic alternatives. A physically motivated family of theories is that in which the state space of a two-level system is given by a sphere of arbitrary dimension. This includes classical bits, as well as real, complex and quaternionic quantum theory. In this paper, we consider relativity of simultaneity (i.e. that observers may disagree about the order of events at different locations) as applied to a two-armed interferometer, and show that this forbids most interference phenomena more complicated than those of complex quantum theory. If interference must depend on some relational property of the setting (such as path difference), then relativity of simultaneity limits state spaces to standard complex quantum theory, or a subspace thereof. If this relational assumption is relaxed, we find one additional theory compatible with relativity of simultaneity: quaternionic quantum theory. Our results have consequences for current laboratory interference experiments: they have to be designed carefully to avoid rendering beyond-quantum effects invisible by relativity of simultaneity.
ABSTRACT
We demonstrate experimentally how molecules provide a natural test bed for probing fundamental quantum thermodynamics. Single-molecule spectroscopy has undergone transformative change in the past decade with the advent of techniques permitting individual molecules to be distinguished and probed. We demonstrate that the quantum Jarzynski equality for heat is satisfied in this setup by treating the time-resolved emission spectrum of organic molecules as arising from quantum jumps between states. This equality relates the heat dissipated into the environment to the free energy difference between the initial and final states. We also demonstrate how the quantum Jarzynski equality allows for the detection of energy shifts within a molecule, beyond the relative shift.
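The Jarzynski-type relation behind this statement can be sketched as follows (a schematic form; the paper's heat-based version for quantum jumps may differ in detail):

```latex
% Schematic Jarzynski-type fluctuation relation; the paper applies a
% heat-based version to quantum jumps inferred from time-resolved emission:
\left\langle e^{-\beta Q} \right\rangle = e^{-\beta \Delta F},
\qquad \beta = \frac{1}{k_B T}
% Q: heat dissipated into the environment along a single trajectory;
% \Delta F: free energy difference between initial and final states.
```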
ABSTRACT
Maxwell's demon is a popular personification of a principle connecting information gain and extractable work in thermodynamics. A Szilard engine is a particular hypothetical realization of Maxwell's demon, which is able to extract work from a single thermal reservoir by measuring the position of the particle(s) within the system. Here we investigate the role of particle statistics in this process; namely, how the extractable work changes if fermions or bosons, rather than classical particles, are used as the working medium. We give a unifying argument for the optimal work in the different cases: the extractable work is determined solely by the information gain of the initial measurement, as measured by the mutual information, regardless of the number and type of particles which constitute the working substance.
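The unifying claim, optimal extractable work proportional to the mutual information of the measurement, can be illustrated numerically (a sketch in units of k_B T; the helper functions are hypothetical, and the perfect-measurement case recovers the classical Szilard result k_B T ln 2):

```python
import math

def mutual_information(p_joint):
    """I(X;M) in nats for a joint distribution given as p_joint[x][m]."""
    px = [sum(row) for row in p_joint]
    pm = [sum(col) for col in zip(*p_joint)]
    I = 0.0
    for x, row in enumerate(p_joint):
        for m, p in enumerate(row):
            if p > 0:
                I += p * math.log(p / (px[x] * pm[m]))
    return I

def extractable_work(p_joint, kT=1.0):
    """Optimal extractable work W = kT * I(X;M), in units of kT by default."""
    return kT * mutual_information(p_joint)

# Perfect measurement of which half of the box a single particle occupies:
# I = ln 2 nats, so W = kT ln 2, the classical Szilard result.
perfect = [[0.5, 0.0], [0.0, 0.5]]
print(extractable_work(perfect))  # -> ln 2 ≈ 0.693
```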