1.
Respir Med ; 197: 106853, 2022 06.
Article in English | MEDLINE | ID: mdl-35512457

ABSTRACT

PURPOSE: To validate the role of the Macklin effect on chest CT imaging in predicting the subsequent occurrence of pneumomediastinum/pneumothorax (PMD/PNX) in COVID-19 patients.
MATERIALS AND METHODS: This is an observational, case-control study. Consecutive COVID-19 patients who underwent a chest CT scan at hospital admission during the study period (October 1st, 2020 to April 31st, 2021) were identified. The accuracy of the Macklin effect for prediction of spontaneous barotrauma was measured in terms of sensitivity, specificity, and positive (PPV) and negative (NPV) predictive values.
RESULTS: Overall, 981 COVID-19 patients underwent a chest CT scan at hospital arrival during the study period; 698 patients had radiological signs of interstitial pneumonia and were considered for further evaluation. Among these, the Macklin effect was found in 33 (4.7%), including all 32 patients who suffered barotrauma later during their hospital stay (true positive rate: 96.9%); only 1/33 with the Macklin effect did not develop barotrauma (false positive rate: 3.1%). No barotrauma event was recorded in patients without the Macklin effect on baseline chest CT. The Macklin effect yielded a sensitivity of 100% (95% CI: 89.1-100), a specificity of 99.85% (95% CI: 99.2-100), a PPV of 96.7% (95% CI: 80.8-99.5), an NPV of 100%, and an accuracy of 99.8% (95% CI: 99.2-100) in predicting PMD/PNX, with a mean lead time of 3.2 ± 2.5 days. Moreover, all Macklin-positive patients developed ARDS requiring ICU admission and, in 90.1% of cases, invasive mechanical ventilation.
CONCLUSIONS: The Macklin effect has high accuracy in predicting PMD/PNX in COVID-19 patients; it is also an excellent predictor of disease severity.
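The reported metrics follow from a standard 2x2 confusion matrix. Reconstructing the counts from the abstract (32 true positives, 1 false positive, no false negatives, and 665 true negatives among the 698 evaluated patients), a minimal sketch of the computation:

```python
# Confusion-matrix metrics, with counts reconstructed from the abstract:
# 698 patients evaluated, 33 Macklin-positive (32 developed barotrauma,
# 1 did not), and no barotrauma among the Macklin-negative patients.
tp, fp, fn = 32, 1, 0
tn = 698 - tp - fp - fn                      # 665

sensitivity = tp / (tp + fn)                 # 32/32 = 100%
specificity = tn / (tn + fp)                 # 665/666 ~ 99.85%
ppv = tp / (tp + fp)                         # 32/33 ~ 97%
npv = tn / (tn + fn)                         # 665/665 = 100%
accuracy = (tp + tn) / (tp + tn + fp + fn)   # 697/698 ~ 99.8%

print(f"sensitivity={sensitivity:.1%} specificity={specificity:.2%} "
      f"PPV={ppv:.1%} NPV={npv:.1%} accuracy={accuracy:.2%}")
```

The reconstructed PPV of 32/33 rounds to the abstract's figures; all other values match the reported ones exactly.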


Subject(s)
Barotrauma , COVID-19 , Mediastinal Emphysema , Pneumothorax , Barotrauma/complications , Barotrauma/diagnostic imaging , COVID-19/complications , COVID-19/diagnostic imaging , Case-Control Studies , Humans , Mediastinal Emphysema/diagnostic imaging , Mediastinal Emphysema/epidemiology , Mediastinal Emphysema/etiology , Pneumothorax/epidemiology , Tomography, X-Ray Computed
2.
Phys Rev E ; 102(3-1): 032119, 2020 Sep.
Article in English | MEDLINE | ID: mdl-33075947

ABSTRACT

The traditional approach of statistical physics to supervised learning routinely assumes unrealistic generative models for the data: usually inputs are independent random variables, uncorrelated with their labels. Only recently have statistical physicists started to explore more complex forms of data, such as equally labeled points lying on (possibly low-dimensional) object manifolds. Here we provide a bridge between this recently established research area and the framework of statistical learning theory, a branch of mathematics devoted to inference in machine learning. The overarching motivation is the inadequacy of the classic rigorous results in explaining the remarkable generalization properties of deep learning. We propose a way to integrate physical models of data into statistical learning theory and address, with both combinatorial and statistical mechanics methods, the computation of the Vapnik-Chervonenkis entropy, which counts the number of different binary classifications compatible with the loss class. As a proof of concept, we focus on kernel machines and on two simple realizations of data structure introduced in recent physics literature: k-dimensional simplexes with prescribed geometric relations, and spherical manifolds (equivalent to margin classification). Contrary to what happens for unstructured data, the entropy is nonmonotonic in the sample size, in contrast with the rigorous bounds. Moreover, data structure induces a transition beyond the storage capacity, which we advocate as a proxy of the nonmonotonicity and ultimately a cue of low generalization error. The identification of a synaptic volume vanishing at the transition allows a quantification of the impact of data structure within replica theory, applicable in cases where combinatorial methods are not available, as we demonstrate for margin learning.
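For intuition on what the Vapnik-Chervonenkis entropy counts, the classic unstructured benchmark is Cover's function-counting theorem for a linear classifier: the number of dichotomies of n points in general position in R^d that are separable by a hyperplane through the origin. A sketch of this textbook result (not the paper's structured-data computation):

```python
from math import comb

def cover_count(n: int, d: int) -> int:
    """Number of homogeneously linearly separable dichotomies of n points
    in general position in R^d (Cover, 1965):
        C(n, d) = 2 * sum_{k=0}^{d-1} binom(n-1, k).
    Affine separators in R^d correspond to d+1 here."""
    return 2 * sum(comb(n - 1, k) for k in range(d))

# Three points in the plane (affine separators -> d=3): all 2^3 = 8
# dichotomies are realizable, so the annealed entropy log C is maximal.
print(cover_count(3, 3))   # 8
# Four points in the plane: 14 of 16 (the two XOR-type labelings are lost).
print(cover_count(4, 3))   # 14
```

The annealed VC entropy is log C(n, d); for unstructured data it grows monotonically with n, which is precisely the behavior the abstract contrasts with the structured case.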

3.
Phys Rev Lett ; 125(12): 120601, 2020 Sep 18.
Article in English | MEDLINE | ID: mdl-33016711

ABSTRACT

Data structure has a dramatic impact on the properties of neural networks, yet its significance in the established theoretical frameworks is poorly understood. Here we compute the Vapnik-Chervonenkis entropy of a kernel machine operating on data grouped into equally labeled subsets. At variance with the unstructured scenario, entropy is nonmonotonic in the size of the training set, and displays an additional critical point besides the storage capacity. Remarkably, the same behavior occurs in margin classifiers even with randomly labeled data, as is elucidated by identifying the synaptic volume encoding the transition. These findings reveal aspects of expressivity lying beyond the condensed description provided by the storage capacity, and they indicate the path towards more realistic bounds for the generalization error of neural networks.

4.
Phys Rev Lett ; 125(7): 070604, 2020 Aug 14.
Article in English | MEDLINE | ID: mdl-32857555

ABSTRACT

Dicke-like models can describe a variety of physical systems, such as atoms in a cavity or vibrating ion chains. In equilibrium these systems often feature a radical change in their behavior when switching from weak to strong spin-boson interaction. This usually manifests in a transition from a "dark" to a "superradiant" phase. However, understanding the out-of-equilibrium physics of these models is extremely challenging, and even more so for strong spin-boson coupling. Here we show that the nonequilibrium strongly interacting multimode Dicke model can mimic some fundamental properties of an associative memory: a system which permits the recognition of patterns, such as letters of an alphabet. Patterns are encoded in the couplings between spins and bosons, and we discuss the dynamics of the spins from the perspective of pattern retrieval in associative memory models. We identify two phases, a "paramagnetic" and a "ferromagnetic" one, and a crossover behavior between these regimes. The "ferromagnetic" phase is reminiscent of pattern retrieval. We highlight similarities and differences with the thermal dynamics of a Hopfield associative memory and show that elements of "machine learning behavior" indeed emerge in the strongly coupled multimode Dicke model.
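The Hopfield associative memory used as the point of comparison stores patterns in Hebbian couplings J_ij = (1/N) sum_mu xi_i^mu xi_j^mu and retrieves them under zero-temperature dynamics. A minimal sketch of that reference model (illustrative parameters, not the paper's Dicke simulation):

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 200, 5                            # spins, stored patterns (alpha = P/N small)
xi = rng.choice([-1, 1], size=(P, N))    # random binary patterns

# Hebbian couplings with zero diagonal: J_ij = (1/N) sum_mu xi_i^mu xi_j^mu
J = (xi.T @ xi) / N
np.fill_diagonal(J, 0.0)

def retrieve(s, sweeps=10):
    """Zero-temperature asynchronous dynamics: align each spin with its local field."""
    s = s.copy()
    for _ in range(sweeps):
        for i in rng.permutation(N):
            s[i] = 1 if J[i] @ s >= 0 else -1
    return s

# Corrupt pattern 0 by flipping 20% of its spins, then let the dynamics retrieve it.
probe = xi[0].copy()
probe[rng.choice(N, size=N // 5, replace=False)] *= -1
overlap = retrieve(probe) @ xi[0] / N    # overlap m = 1 means perfect retrieval
print(f"overlap with stored pattern: {overlap:.2f}")
```

Well below the storage capacity (alpha_c ~ 0.138 for the Hopfield model) the corrupted pattern is recovered with overlap close to 1, which is the "ferromagnetic", retrieval-like behavior the abstract refers to.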

5.
Phys Rev E ; 102(1-1): 012306, 2020 Jul.
Article in English | MEDLINE | ID: mdl-32794907

ABSTRACT

Many machine learning algorithms used for dimensionality reduction and manifold learning leverage the computation of the nearest neighbors of each point in a data set to perform their tasks. These proximity relations define a so-called geometric graph, in which two nodes are linked if they are sufficiently close to each other. Random geometric graphs, where the positions of the nodes are randomly generated in a subset of R^{d}, offer a null model for studying typical properties of data sets and of machine learning algorithms. Up to now, most of the literature has focused on the characterization of low-dimensional random geometric graphs, whereas typical data sets of interest in machine learning live in high-dimensional spaces (d≫10^{2}). In this work, we consider the infinite-dimensional limit of hard and soft random geometric graphs and show how to compute the average number of subgraphs of a given finite size k, e.g., the average number of k-cliques. This analysis highlights that local observables display different behaviors depending on the chosen ensemble: soft random geometric graphs with continuous activation functions converge to the naive infinite-dimensional limit provided by Erdös-Rényi graphs, whereas hard random geometric graphs can show systematic deviations from it. We present numerical evidence that our analytical results, exact in infinite dimensions, provide a good approximation also for dimension d≳10.
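The hard ensemble links two nodes whenever their Euclidean distance falls below a threshold; comparing a local observable (here the number of 3-cliques, i.e., triangles) against the Erdös-Rényi prediction at matched edge density can be sketched numerically (illustrative parameters, not the paper's analytical calculation):

```python
import numpy as np

rng = np.random.default_rng(1)
n, d = 150, 25                      # nodes, (moderately high) dimension
x = rng.random((n, d))              # positions uniform in [0, 1]^d

# Hard threshold at the median pairwise distance -> edge density p close to 1/2
dist = np.linalg.norm(x[:, None, :] - x[None, :, :], axis=-1)
r = np.median(dist[np.triu_indices(n, 1)])
A = ((dist < r) & ~np.eye(n, dtype=bool)).astype(int)

p = A[np.triu_indices(n, 1)].mean()
triangles = int(np.trace(A @ A @ A)) // 6        # closed 3-walks / 6
er_expect = p ** 3 * n * (n - 1) * (n - 2) / 6   # Erdos-Renyi graph, same p
print(f"p={p:.2f}  triangles={triangles}  ER expectation={er_expect:.0f}")
```

The ratio of the measured triangle count to the Erdös-Rényi expectation is exactly the kind of local observable whose dimensional dependence the abstract analyzes.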

6.
Sci Rep ; 9(1): 17133, 2019 Nov 20.
Article in English | MEDLINE | ID: mdl-31748557

ABSTRACT

Identifying the minimal number of parameters needed to describe a dataset is a challenging problem known in the literature as intrinsic dimension estimation. Existing intrinsic dimension estimators are unreliable whenever the dataset is locally undersampled, and this is at the core of the so-called curse of dimensionality. Here we introduce a new intrinsic dimension estimator that leverages simple properties of the tangent space of a manifold and extends the usual correlation-integral estimator to alleviate the extreme-undersampling problem. Based on this insight, we explore a multiscale generalization of the algorithm that is capable of (i) identifying multiple dimensionalities in a dataset, and (ii) providing accurate estimates of the intrinsic dimension of extremely curved manifolds. We test the method on manifolds generated from global transformations of high-contrast images, relevant for invariant object recognition and considered a challenge for state-of-the-art intrinsic dimension estimators.
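A nearest-neighbor-ratio estimator in the spirit of the correlation-integral family gives a feel for the task. The sketch below is a standard TwoNN-style maximum-likelihood estimate, not the tangent-space method introduced in the paper:

```python
import numpy as np

def two_nn_dimension(X):
    """Maximum-likelihood intrinsic dimension from the ratio of the second to
    the first nearest-neighbor distance: d ~ N / sum_i log(r2_i / r1_i)."""
    sq = (X ** 2).sum(axis=1)
    D2 = np.clip(sq[:, None] + sq[None, :] - 2 * X @ X.T, 0.0, None)
    D = np.sqrt(D2)
    np.fill_diagonal(D, np.inf)
    r = np.sort(D, axis=1)[:, :2]       # first and second neighbor distances
    mu = r[:, 1] / r[:, 0]
    return len(X) / np.log(mu).sum()

rng = np.random.default_rng(2)
# A flat 2D patch embedded in R^10: the estimate should be close to 2,
# not to the embedding dimension 10.
X = np.zeros((1000, 10))
X[:, :2] = rng.random((1000, 2))
est = two_nn_dimension(X)
print(f"estimated intrinsic dimension: {est:.2f}")
```

Estimators of this family only use the closest neighbors, which is why severe local undersampling (too few points per local neighborhood) degrades them, the failure mode the paper targets.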

7.
Phys Rev E ; 96(5-1): 052141, 2017 Nov.
Article in English | MEDLINE | ID: mdl-29347707

ABSTRACT

Driven lattice gases are widely regarded as the paradigm of collective phenomena out of equilibrium. While such models are usually studied with nearest-neighbor interactions, many empirical driven systems are dominated by slowly decaying interactions such as dipole-dipole and van der Waals forces. Motivated by this gap, we study the nonequilibrium stationary state of a driven lattice gas with slowly decaying repulsive interactions at zero temperature. By numerical and analytical calculations of the particle current as a function of the density and of the driving field, we identify (i) an abrupt breakdown transition between insulating and conducting states, (ii) current quantization into discrete phases where a finite current flows with infinite differential resistivity, and (iii) a fractal hierarchy of excitations, related to the Farey sequences of number theory. We argue that the origin of these effects is the competition between scales, which also causes the counterintuitive phenomenon that crystalline states can melt by increasing the density.
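The simplest driven lattice gas, a fully biased exclusion process with on-site exclusion only (no long-range interactions, unlike the model studied here), already shows how a stationary particle current is measured in simulation; a hedged sketch:

```python
import numpy as np

rng = np.random.default_rng(3)
L, density, sweeps = 100, 0.3, 2000
occ = np.zeros(L, dtype=bool)
occ[rng.choice(L, size=int(density * L), replace=False)] = True

hops = 0
measure_from = sweeps // 2                 # discard the transient
for t in range(sweeps):
    for _ in range(L):                     # one random-sequential Monte Carlo sweep
        i = rng.integers(L)
        j = (i + 1) % L                    # fully biased drive on a ring
        if occ[i] and not occ[j]:          # hard-core exclusion
            occ[i], occ[j] = False, True
            if t >= measure_from:
                hops += 1

current = hops / (L * (sweeps - measure_from))
print(f"measured current J = {current:.3f}  (mean-field rho(1-rho) = {0.3 * 0.7:.3f})")
```

For this short-range model the stationary current is the smooth function rho(1 - rho) of the density; the long-range repulsive model in the abstract instead produces the quantized current plateaux and the breakdown transition described above.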

8.
Phys Rev Lett ; 116(25): 256803, 2016 Jun 24.
Article in English | MEDLINE | ID: mdl-27391740

ABSTRACT

After more than three decades, the fractional quantum Hall effect still poses challenges to contemporary physics. Recent experiments point toward a fractal scenario for the Hall resistivity as a function of the magnetic field. Here, we consider the so-called thin-torus limit of the Hamiltonian describing interacting electrons in a strong magnetic field, restricted to the lowest Landau level, and we show that it can be mapped onto a one-dimensional lattice gas with repulsive interactions, with the magnetic field playing the role of the chemical potential. The statistical mechanics of such models leads us to interpret the sequence of Hall plateaux as a fractal phase diagram whose landscape shows a qualitative agreement with experiments.
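The thin-torus mapping lands on a classical 1D lattice gas with long-range repulsion, whose ground-state density locks into plateaux as the chemical potential (playing the role of the magnetic field) is varied. This locking can be checked by brute force on a small ring (a toy sketch with an assumed 1/r^2 convex repulsion, not the paper's lowest-Landau-level Hamiltonian):

```python
from itertools import combinations

L = 12                                    # ring sites (toy size, exact enumeration)

def energy(cfg):
    """Convex repulsion V(r) = 1/r^2 between occupied sites on the ring."""
    e = 0.0
    for a, b in combinations(cfg, 2):
        r = min(abs(a - b), L - abs(a - b))
        e += 1.0 / r ** 2
    return e

def ground_density(mu):
    """Minimize E(config) - mu * n over all occupations; return filling n/L."""
    best_val, best_n = float("inf"), 0
    for n in range(L + 1):
        e_min = min((energy(c) for c in combinations(range(L), n)), default=0.0)
        if e_min - mu * n < best_val:
            best_val, best_n = e_min - mu * n, n
    return best_n / L

# Sweeping mu, the density increases through locked commensurate plateaux.
densities = [ground_density(0.5 + 0.25 * k) for k in range(12)]
print(densities)
```

Even at this toy size the density is a nondecreasing staircase of simple fractions rather than a smooth curve; in the thermodynamic limit with convex long-range repulsion this staircase becomes the fractal (devil's-staircase) structure that the abstract connects to the sequence of Hall plateaux.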

9.
Phys Rev Lett ; 114(14): 143601, 2015 Apr 10.
Article in English | MEDLINE | ID: mdl-25910121

ABSTRACT

Using an approach inspired by spin glasses, we show that the multimode disordered Dicke model is equivalent to a quantum Hopfield network. We propose variational ground states for the system at zero temperature, which we conjecture to be exact in the thermodynamic limit. These ground states contain the information on the disordered qubit-photon couplings. These results lead to two intriguing physical implications. First, once the qubit-photon couplings can be engineered, it should be possible to build scalable pattern-storing systems whose dynamics is governed by quantum laws. Second, we illustrate with an example how such Dicke quantum simulators might be used as solvers of "hard" combinatorial optimization problems.
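Using such a simulator as a combinatorial solver typically means encoding the problem in an Ising Hamiltonian whose ground state the device would find; for example, MAX-CUT maps to minimizing H = sum over edges (i,j) of s_i s_j. A brute-force toy instance (hypothetical graph, for illustration only; the simulator would replace the exhaustive search):

```python
from itertools import product

# Toy MAX-CUT instance: a 5-cycle plus one chord (the chord closes a triangle).
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0), (0, 2)]

def ising_energy(spins):
    """H = sum_{(i,j) in edges} s_i s_j; minimizing H maximizes the cut."""
    return sum(spins[i] * spins[j] for i, j in edges)

best = min(product([-1, 1], repeat=5), key=ising_energy)
cut_size = sum(1 for i, j in edges if best[i] != best[j])
print(f"ground state {best}, cut size {cut_size} of {len(edges)} edges")
```

Since the chord creates the triangle 0-1-2, no partition cuts all 6 edges; the Ising ground state cuts 5, the optimum. The exponential cost of this exhaustive search is exactly what a physical annealer or quantum simulator would aim to bypass.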


Subject(s)
Models, Theoretical , Neural Networks, Computer , Quantum Theory , Glass/chemistry , Thermodynamics