ABSTRACT
Quantum physics is intrinsically probabilistic: the Born rule yields the probabilities associated with a state that evolves deterministically. The entropy of a quantum state quantifies the amount of randomness (or information loss) of that state. A quantum state carries position and spin degrees of freedom. We focus on the spin degree of freedom and elucidate the spin-entropy. We then present some of its properties and show how entanglement increases spin-entropy. A dynamic model for the time evolution of spin-entropy concludes the paper.
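As a minimal illustration of the quantity at stake (a single measurement axis, not the paper's full phase-space construction of the spin-entropy): for a spin-1/2 pure state $|\psi\rangle = \alpha|{\uparrow}\rangle + \beta|{\downarrow}\rangle$, the Born rule assigns $p_\uparrow = |\alpha|^2$ and $p_\downarrow = |\beta|^2$ to a z-axis measurement, and the Shannon entropy of that distribution,
$$ S_z = -|\alpha|^2 \ln |\alpha|^2 - |\beta|^2 \ln |\beta|^2, $$
quantifies the randomness of the spin observable along that axis.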
ABSTRACT
Quantum physics, viewed through the lens of Bayesian statistics, treats probability as a subjective degree of belief. A Bayesian derivation of the probability density function in phase space is presented. A Kullback-Leibler divergence in phase space is then introduced to define interference and entanglement, and each of these two quantities is compared with the entropy. A brief application of phase-space entanglement to the spin degree of freedom and an extension to mixed states complete the work.
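For orientation, the divergence in question has the standard form below; the choice of the reference density $\sigma$ (for instance, the product of the marginals of $\rho$, so that the divergence isolates correlations) is our assumption here, while the precise phase-space construction is the paper's:
$$ D_{\mathrm{KL}}(\rho \,\|\, \sigma) = \int \rho(x,p)\,\ln\frac{\rho(x,p)}{\sigma(x,p)}\,dx\,dp. $$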
ABSTRACT
Two types of randomness are associated with a mixed quantum state: the uncertainty in the probability coefficients of the constituent pure states, and the uncertainty in the value of each observable captured by the Born rule probabilities. Entropy is a quantification of randomness. We propose a spin-entropy for the observables of spin pure states, based on the phase space of a spin as described by the geometric quantization method, and we also extend it to mixed quantum states. The proposed entropy overcomes the limitations of previously proposed entropies, such as the von Neumann entropy, which quantifies only the randomness of specifying the quantum state. As an example of such a limitation, previously proposed entropies are higher for Bell entangled spin states than for disentangled spin states, even though the spin observables are less constrained for a disentangled pair of spins than for an entangled pair. The proposed spin-entropy accurately quantifies the randomness of a quantum state; it never vanishes, and it is lower for entangled states than for disentangled states.
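A toy, z-axis-only computation consistent with the direction of this claim (the paper's spin-entropy is defined over the full spin phase space and never vanishes, which this simplification does not capture): for the Bell state $(|{\uparrow\downarrow}\rangle - |{\downarrow\uparrow}\rangle)/\sqrt{2}$, the joint z-measurement outcomes have Born probabilities $(0, \tfrac{1}{2}, \tfrac{1}{2}, 0)$, with Shannon entropy $\ln 2$, whereas for the disentangled pair $|{\rightarrow}\rangle \otimes |{\rightarrow}\rangle$ (both spins along $+x$) all four outcomes are equiprobable, giving $\ln 4$. The entangled state's observables are more constrained, hence the lower entropy.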
ABSTRACT
Quantum physics, despite its intrinsically probabilistic nature, lacks a definition of entropy fully accounting for the randomness of a quantum state. For example, von Neumann entropy quantifies only the incomplete specification of a quantum state and does not quantify the probabilistic distribution of its observables; it trivially vanishes for pure quantum states. We propose a quantum entropy that quantifies the randomness of a pure quantum state via a conjugate pair of observables/operators forming the quantum phase space. The entropy is dimensionless, it is a relativistic scalar, it is invariant under canonical transformations and under CPT transformations, and its minimum has been established by the entropic uncertainty principle. We expand the entropy to also include mixed states. We show that the entropy is monotonically increasing during a time evolution of coherent states under a Dirac Hamiltonian. However, in a mathematical scenario, when two fermions come closer to each other, each evolving as a coherent state, the total system's entropy oscillates due to the increasing spatial entanglement. We hypothesize an entropy law governing physical systems whereby the entropy of a closed system never decreases, implying a time arrow for particle physics. We then explore the possibility that as the oscillations of the entropy must by the law be barred in quantum physics, potential entropy oscillations trigger annihilation and creation of particles.
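For a position-momentum conjugate pair (our choice of pair for illustration), the entropy referred to here has the structure
$$ S = S_x + S_p, \qquad S_x = -\int |\psi(x)|^2 \ln |\psi(x)|^2\,dx, \qquad S_p = -\int |\tilde{\psi}(p)|^2 \ln |\tilde{\psi}(p)|^2\,dp, $$
and the entropic uncertainty relation of Bialynicki-Birula and Mycielski bounds it from below, $S_x + S_p \ge 1 + \ln\pi$ in units where $\hbar = 1$ (measuring $x$ and $p$ in conjugate natural units is one way, assumed here, to make the sum dimensionless).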
ABSTRACT
Image classification for real-world applications often involves complicated data distributions, such as fine-grained categories and long-tailed class frequencies. To address these two challenging issues simultaneously, we propose a new regularization technique that yields an adversarial loss to strengthen model learning. Specifically, for each training batch, we construct an adaptive batch prediction (ABP) matrix and establish its corresponding adaptive batch confusion norm (ABC-Norm). The ABP matrix is composed of two parts: an adaptive component that encodes the imbalanced class distribution, and a component that assesses the batch-wise softmax predictions. The ABC-Norm leads to a norm-based regularization loss, which can be shown theoretically to upper-bound an objective function closely related to rank minimization. Coupled with the conventional cross-entropy loss, the ABC-Norm regularization introduces adaptive classification confusion and thus triggers adversarial learning that improves the effectiveness of model learning. Unlike most state-of-the-art techniques, which solve either the fine-grained or the long-tailed problem, our method is characterized by its simple and efficient design and, most distinctively, provides a unified solution. In the experiments, we compare ABC-Norm with relevant techniques and demonstrate its efficacy on several benchmark datasets, including (CUB-LT, iNaturalist2018); (CUB, CAR, AIR); and (ImageNet-LT), which respectively correspond to the real-world, fine-grained, and long-tailed scenarios.
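A speculative PyTorch sketch of how such a norm-based regularizer might be wired into training. The function name, the inverse-frequency weighting, and the use of the nuclear norm (a standard convex surrogate for rank minimization, which the paper's bound is related to) are our assumptions for illustration; the exact ABP/ABC-Norm construction is the paper's.

```python
import torch
import torch.nn.functional as F

def abc_norm_loss(logits, labels, class_counts, lam=0.1):
    """Hypothetical sketch of an ABC-Norm-style regularized loss.

    logits:       (B, C) raw model outputs for a training batch
    labels:       (B,) ground-truth class indices
    class_counts: (C,) training-set frequency of each class
    lam:          regularization strength (assumed hyperparameter)
    """
    # Batch-wise softmax predictions, one row per sample.
    probs = F.softmax(logits, dim=1)                      # (B, C)

    # Adaptive class weights: rarer classes get larger weights
    # (an assumed choice to encode the imbalanced distribution).
    counts = class_counts.float().clamp(min=1.0)
    weights = counts.sum() / counts
    weights = weights / weights.mean()                    # (C,)

    # Adaptive batch prediction (ABP) matrix: predictions scaled
    # by the class-wise adaptive weights.
    abp = probs * weights.unsqueeze(0)                    # (B, C)

    # Norm-based regularizer: nuclear norm of the ABP matrix.
    abc_norm = torch.linalg.matrix_norm(abp, ord='nuc')

    # Couple with the conventional cross-entropy loss.
    return F.cross_entropy(logits, labels) + lam * abc_norm
```

In this sketch, penalizing the nuclear norm discourages the batch prediction matrix from collapsing onto a few dominant classes, which is one plausible reading of "adaptive classification confusion".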
ABSTRACT
Any complete theory of human stereopsis must model not only how the correspondences between locations in the two views are determined and how depths are recovered from their disparity, but also how the ambiguity arising from such factors as noise, periodicity, and large regions of constant intensity is resolved and how missing data are interpolated. In investigating this process of recovering surface structure from sparse disparity information, using stereo pairs with sparse identifiable features, we made an observation that contradicts all extant models and suggests the inadequacy of retinotopic representations in modeling surface perception at this stage. We also suggest a possible alternative theory, based on minimizing the modulus of the Gaussian curvature.
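In standard notation (our rendering, assuming the perceived surface is modeled as a graph $z = f(x, y)$ interpolating the sparse disparity data), the proposed criterion takes the form
$$ \min_f \int |K|\, dA, \qquad K = \frac{f_{xx} f_{yy} - f_{xy}^2}{\left(1 + f_x^2 + f_y^2\right)^2}, $$
where $K$ is the Gaussian curvature of the graph and $dA$ its area element.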
Subjects
Depth Perception/physiology, Optical Illusions/physiology, Form Perception/physiology, Humans, Models, Neurological, Models, Psychological, Models, Statistical, Psychophysics, Vision Disparity/physiology, Vision, Binocular/physiology
ABSTRACT
Illusory contours occur in a wide variety of circumstances in nature. A striking man-made example is the Kanizsa triangle. A common factor in all such figures is the perception of a surface occluding part of a background, i.e., illusory contours are always accompanied by illusory surfaces. The detection of occlusion cues suggests various local surface configurations, leading to a large combinatorial set of global surface configurations, each constituting an image organization. We address the problems of why and how the image organizations that yield illusory contours arise. Our approach is to: (i) detect occlusions; (ii) assign surface states at these locations that reflect the presence of a particular surface configuration; (iii) apply a Bayesian model to diffuse this local surface information; (iv) define an entropy measure for each image organization and select the best organization(s) as the one(s) giving the lowest entropy values. We note that: (a) the illusory contours arise from the surface boundaries, and hence we do not propagate or extend intensity edges directly; (b) the overlapping surfaces provide an explanation for amodal completions. The model reproduces various qualitative and quantitative aspects of illusory contour perception and has been supported by a series of experiments.
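Schematically (our notation, not the paper's exact formulation): if the diffusion step assigns each location $x$ a distribution $p_O(s \mid x)$ over surface states $s$ under image organization $O$, an entropy of the form
$$ H(O) = -\sum_{x} \sum_{s} p_O(s \mid x) \ln p_O(s \mid x), \qquad O^{\star} = \arg\min_{O} H(O), $$
selects the organization whose diffused surface-state field is most certain.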
ABSTRACT
In this paper we report a database and a series of techniques related to the problem of tracking cells, and detecting their divisions, in time-lapse movies of mammalian embryos. Our contributions are (1) a method for counting embryos in a well, and cropping each individual embryo across frames, to create individual movies for cell tracking; (2) a semi-automated method for cell tracking that works up to the 8-cell stage, along with a software implementation available to the public (this software was used to build the reported database); (3) an algorithm for automatic tracking up to the 4-cell stage, based on histograms of mirror symmetry coefficients captured using wavelets; (4) a cell-tracking database containing 100 annotated examples of mammalian embryos up to the 8-cell stage; and (5) statistical analysis of various timing distributions obtained from those examples.
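A toy sketch of a mirror-symmetry coefficient of the kind contribution (3) alludes to. The Haar wavelet, the per-location coefficient, and the histogram binning are our assumptions for illustration, not the paper's exact construction; PyWavelets (pywt) provides the 2-D transform.

```python
import numpy as np
import pywt

def mirror_symmetry_histogram(image: np.ndarray, bins: int = 16) -> np.ndarray:
    """Toy histogram of per-location mirror-symmetry coefficients.

    image: 2-D float grayscale array (e.g., one cropped embryo frame).
    Returns a normalized histogram over symmetry scores in [0, 1].
    """
    # Left-right mirror of the frame.
    flipped = np.ascontiguousarray(image[:, ::-1])

    # Single-level 2-D Haar transform; keep approximation bands.
    ca, _ = pywt.dwt2(image, 'haar')
    fa, _ = pywt.dwt2(flipped, 'haar')

    # Per-location symmetry coefficient in [0, 1]: 1 where the
    # image and its mirror agree, 0 where they fully disagree.
    sym = 1.0 - np.abs(ca - fa) / (np.abs(ca) + np.abs(fa) + 1e-9)

    hist, _ = np.histogram(sym, bins=bins, range=(0.0, 1.0), density=True)
    return hist
```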
Subjects
Blastomeres/cytology, Cell Division/physiology, Cell Tracking/methods, Embryo, Mammalian/cytology, Embryonic Development/physiology, Image Processing, Computer-Assisted/methods, Animals, Blastomeres/metabolism, Cell Tracking/instrumentation, Embryo, Mammalian/metabolism, Image Processing, Computer-Assisted/instrumentation, Mice
ABSTRACT
We present DevStaR, an automated computer vision and machine learning system that provides rapid, accurate, and quantitative measurements of C. elegans embryonic viability in high-throughput (HTP) applications. A leading genetic model organism for the study of animal development and behavior, C. elegans is particularly amenable to HTP functional genomic analysis due to its small size and ease of cultivation, but the lack of efficient and quantitative methods to score phenotypes has become a major bottleneck. DevStaR addresses this challenge using a novel hierarchical object recognition machine that rapidly segments, classifies, and counts animals at each developmental stage in images of mixed-stage populations of C. elegans. Here, we describe the algorithmic design of the DevStaR system and demonstrate its performance in scoring image data acquired in HTP screens.
Subjects
Caenorhabditis elegans/anatomy & histology, Caenorhabditis elegans/growth & development, Image Processing, Computer-Assisted/methods, Life Cycle Stages/physiology, Phenotype, Algorithms, Animals, Microscopy
ABSTRACT
We present a hierarchical principle for object recognition and its application to automatically classifying the developmental stages of C. elegans animals in a population of mixed stages. The object recognition machine consists of four hierarchical layers, each composed of units on which evaluation functions output a label score, followed by a grouping mechanism that resolves ambiguities in the score by imposing local consistency constraints. Each layer then outputs groups of units, from which the units of the next layer are derived. Using this hierarchical principle, the machine builds up successively more sophisticated representations of the objects to be classified. The algorithm segments large and small objects, decomposes objects into parts, extracts features from these parts, and classifies them with an SVM. We use this system to analyze phenotypic data from C. elegans high-throughput genetic screens, and it overcomes a previous bottleneck in image analysis by achieving near real-time scoring of image data. The system is in current use in a functioning C. elegans laboratory and has processed over two hundred thousand images for lab users.
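A schematic Python skeleton of the evaluate-score-group layering described above; all names and types are illustrative, and the concrete evaluation functions, grouping rules, and the final SVM are the paper's.

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Unit:
    """A unit at some layer: a segment, an object part, or a feature bundle."""
    data: object
    score: float = 0.0
    label: int = -1

@dataclass
class Layer:
    """One hierarchical layer: score every unit, then group consistently."""
    evaluate: Callable[[Unit], float]          # label-score function
    group: Callable[[List[Unit]], List[Unit]]  # local-consistency grouping

    def run(self, units: List[Unit]) -> List[Unit]:
        for u in units:
            u.score = self.evaluate(u)
        # Grouping resolves score ambiguities and emits the groups
        # that become the units of the next layer.
        return self.group(units)

def classify(units: List[Unit], layers: List[Layer]) -> List[Unit]:
    """Push units through successive layers; in the full system the
    last layer's grouping would wrap a classifier such as an SVM."""
    for layer in layers:
        units = layer.run(units)
    return units
```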