Results 1 - 5 of 5
1.
Radiol Artif Intell ; 6(3): e230079, 2024 05.
Article in English | MEDLINE | ID: mdl-38477661

ABSTRACT

Purpose: To evaluate the impact of an artificial intelligence (AI) assistant for lung cancer screening on multinational clinical workflows.
Materials and Methods: An AI assistant for lung cancer screening was evaluated in two retrospective randomized multireader multicase studies in which 627 low-dose chest CT cases (141 cancer positive) were each read twice (with and without AI assistance) by experienced thoracic radiologists (six U.S.-based or six Japan-based radiologists), for a total of 7524 interpretations. Positive cases were defined as those within 2 years before a pathology-confirmed lung cancer diagnosis. Negative cases were defined as those without any subsequent cancer diagnosis for at least 2 years and were enriched for a spectrum of diverse nodules. The studies measured the readers' level of suspicion (on a 0-100 scale), country-specific screening system scoring categories, and management recommendations. Evaluation metrics included the area under the receiver operating characteristic curve (AUC) for level of suspicion and the sensitivity and specificity of recall recommendations.
Results: With AI assistance, the radiologists' AUC increased by 0.023 (0.70 to 0.72; P = .02) in the U.S. study and by 0.023 (0.93 to 0.96; P = .18) in the Japan study. Scoring system specificity for actionable findings increased 5.5% (57% to 63%; P < .001) in the U.S. study and 6.7% (23% to 30%; P < .001) in the Japan study. There was no evidence of a difference in corresponding sensitivity between unassisted and AI-assisted reads in the U.S. (67.3% to 67.5%; P = .88) and Japan (98% to 100%; P > .99) studies. Corresponding stand-alone AI system AUC was 0.75 (95% CI: 0.70, 0.81) and 0.88 (95% CI: 0.78, 0.97) on the U.S.- and Japan-based datasets, respectively.
Conclusion: The concurrent AI interface improved lung cancer screening specificity in both the U.S.- and Japan-based reader studies, meriting further study in additional international screening environments.
Keywords: Assistive Artificial Intelligence, Lung Cancer Screening, CT
Supplemental material is available for this article. Published under a CC BY 4.0 license.
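As a rough illustration of the evaluation metrics named above (AUC on the 0-100 level-of-suspicion score, plus sensitivity and specificity of recall recommendations), the following is a minimal Python sketch for a single reader. The column layout and toy numbers are hypothetical, and the published study used formal multireader multicase analysis rather than this per-reader simplification.

```python
# Sketch: per-reader AUC and recall sensitivity/specificity, unassisted vs. AI-assisted.
# Hypothetical data layout; not the study's actual statistical analysis.
import numpy as np
from sklearn.metrics import roc_auc_score

def reader_metrics(cancer_label, suspicion_score, recall_flag):
    """cancer_label: 1 if pathology-confirmed cancer within 2 years, else 0.
    suspicion_score: reader level of suspicion on a 0-100 scale.
    recall_flag: 1 if the reader recommended recall / actionable follow-up."""
    cancer_label = np.asarray(cancer_label)
    recall_flag = np.asarray(recall_flag)
    auc = roc_auc_score(cancer_label, suspicion_score)
    sensitivity = recall_flag[cancer_label == 1].mean()
    specificity = 1.0 - recall_flag[cancer_label == 0].mean()
    return auc, sensitivity, specificity

# Toy example with 3 positive and 3 negative cases.
labels = [1, 1, 1, 0, 0, 0]
unassisted = reader_metrics(labels, [80, 40, 65, 30, 55, 20], [1, 0, 1, 0, 1, 0])
assisted = reader_metrics(labels, [85, 50, 70, 25, 45, 15], [1, 1, 1, 0, 0, 0])
print("unassisted AUC/sens/spec:", unassisted)
print("AI-assisted AUC/sens/spec:", assisted)
```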


Subject(s)
Artificial Intelligence , Early Detection of Cancer , Lung Neoplasms , Tomography, X-Ray Computed , Humans , Lung Neoplasms/diagnosis , Lung Neoplasms/epidemiology , Japan , United States/epidemiology , Retrospective Studies , Early Detection of Cancer/methods , Female , Male , Middle Aged , Aged , Sensitivity and Specificity , Radiographic Image Interpretation, Computer-Assisted/methods
2.
Ann Intern Med ; 157(3): 170-9, 2012 Aug 07.
Article in English | MEDLINE | ID: mdl-22868834

ABSTRACT

BACKGROUND: Sleep plays a critical role in maintaining health and well-being; however, patients who are hospitalized are frequently exposed to noise that can disrupt sleep. Efforts to attenuate hospital noise have been limited by incomplete information on the interaction between sounds and sleep physiology.
OBJECTIVE: To determine profiles of acoustic disruption of sleep by examining cortical (encephalographic) arousal responses during sleep to typical hospital noises by sound level, sound type, and sleep stage.
DESIGN: 3-day polysomnographic study.
SETTING: Sound-attenuated sleep laboratory.
PARTICIPANTS: Volunteer sample of 12 healthy participants.
INTERVENTION: Baseline (sham) night followed by 2 intervention nights with controlled presentation of 14 sounds that are common in hospitals (for example, voice, intravenous alarm, phone, ice machine, outside traffic, and helicopter). The sounds were administered at calibrated, increasing decibel levels (40 to 70 dBA [decibels, adjusted for the range of normal hearing]) during specific sleep stages.
MEASUREMENTS: Encephalographic arousals, by using established criteria, during rapid eye movement (REM) sleep and non-REM (NREM) sleep stages 2 and 3.
RESULTS: Sound presentations yielded arousal response curves that varied with sound level, sound type, and sleep stage. Electronic sounds were more arousing than other sounds, including human voices, and there were large differences in responses by sound type. As expected, sounds in NREM stage 3 were less likely to cause arousals than sounds in NREM stage 2; unexpectedly, the probability of arousal to sounds presented in REM sleep varied less by sound type than when presented in NREM sleep and caused a greater and more sustained elevation of instantaneous heart rate.
LIMITATIONS: The study included only 12 participants. Results for these healthy persons may underestimate the effects of noise on sleep in patients who are hospitalized.
CONCLUSION: Sounds during sleep influence both cortical brain activity and cardiovascular function. This study systematically quantifies the disruptive capacity of a range of hospital sounds on sleep, providing evidence that is essential to improving the acoustic environments of new and existing health care facilities to enable the highest quality of care.
PRIMARY FUNDING SOURCE: Academy of Architecture for Health, Facilities Guidelines Institute, and The Center for Health Design.
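A minimal sketch of how arousal response curves of the kind described above could be tabulated: for each sound type, sleep stage, and presentation level, compute the fraction of presentations that were followed by an EEG arousal. The trial table and its values are hypothetical, not the study's data.

```python
# Sketch: arousal response curves by sound type, sleep stage, and presentation level.
# Hypothetical trial table; the study scored EEG arousals with established criteria.
import pandas as pd

trials = pd.DataFrame({
    "sound": ["iv_alarm", "iv_alarm", "voice", "voice", "phone", "phone"],
    "stage": ["N2", "N3", "N2", "REM", "N2", "N3"],
    "level_dBA": [40, 50, 50, 60, 60, 70],
    "aroused": [0, 0, 1, 1, 1, 0],   # 1 = an EEG arousal followed the presentation
})

# Fraction of presentations that produced an arousal, per condition.
curves = (trials
          .groupby(["sound", "stage", "level_dBA"])["aroused"]
          .mean()
          .rename("arousal_probability"))
print(curves)
```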


Subject(s)
Hospitalization , Noise/adverse effects , Sleep Stages/physiology , Acoustic Stimulation , Electroencephalography , Female , Heart Rate/physiology , Humans , Male , Polysomnography , Prospective Studies , Wakefulness/physiology , Young Adult
3.
IEEE Trans Biomed Eng ; 59(2): 483-93, 2012 Feb.
Article in English | MEDLINE | ID: mdl-22084041

ABSTRACT

Although the spontaneous brain rhythms of sleep have commanded much recent interest, their detection and analysis remain suboptimal. In this paper, we develop a data-driven Bayesian algorithm for sleep spindle detection on the electroencephalogram (EEG). The algorithm exploits the Karhunen-Loève transform and Bayesian hypothesis testing to produce the instantaneous probability of a spindle's presence with maximal resolution. In addition to possessing flexibility, transparency, and scalability, this algorithm could perform at levels superior to standard methods for EEG event detection.
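The abstract describes a detector that outputs an instantaneous probability of spindle presence. The sketch below is only a crude stand-in for that idea: it assumes a band-pass plus Hilbert envelope for the sigma band and a two-hypothesis Gaussian likelihood ratio in place of the paper's Karhunen-Loève/Bayesian machinery, and every threshold, prior, and parameter is illustrative.

```python
# Sketch: instantaneous spindle probability from EEG, a simplified stand-in for the paper's
# Karhunen-Loeve + Bayesian hypothesis-testing detector. All parameters are illustrative.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def spindle_probability(eeg, fs, band=(11.0, 15.0), prior_spindle=0.05):
    # Band-pass the sigma band and take the analytic envelope as instantaneous power.
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    envelope = np.abs(hilbert(filtfilt(b, a, eeg)))
    log_p = np.log(envelope ** 2 + 1e-12)

    # Two-hypothesis Gaussian likelihoods on log-power: background vs. spindle.
    mu_bg, sd = np.median(log_p), np.std(log_p)
    mu_sp = mu_bg + 2.0 * sd          # assume spindles raise sigma power by ~2 SD (illustrative)
    like_bg = np.exp(-0.5 * ((log_p - mu_bg) / sd) ** 2)
    like_sp = np.exp(-0.5 * ((log_p - mu_sp) / sd) ** 2)

    # Posterior probability of "spindle present" at each sample (Bayes' rule).
    return prior_spindle * like_sp / (prior_spindle * like_sp + (1 - prior_spindle) * like_bg)

# Example: 30 s of fake EEG sampled at 100 Hz.
fs = 100
eeg = np.random.randn(30 * fs)
prob = spindle_probability(eeg, fs)
print(prob.shape, prob.max())
```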


Subject(s)
Algorithms , Bayes Theorem , Electroencephalography/methods , Signal Processing, Computer-Assisted , Sleep Stages/physiology , Adult , Brain/physiology , Female , Humans , Male , Middle Aged
4.
PLoS One ; 6(3): e17351, 2011 Mar 03.
Article in English | MEDLINE | ID: mdl-21408616

ABSTRACT

The neural correlates of the wake-sleep continuum remain incompletely understood, limiting the development of adaptive drug delivery systems for promoting sleep maintenance. The most useful measure for resolving early positions along this continuum is the alpha oscillation, an 8-13 Hz electroencephalographic rhythm prominent over posterior scalp locations. The brain activation signature of wakefulness, alpha expression discloses immediate levels of alertness and dissipates in concert with fading awareness as sleep begins. This brain activity pattern, however, is largely ignored once sleep begins. Here we show that the intensity of spectral power in the alpha band actually continues to disclose instantaneous responsiveness to noise (a measure of sleep depth) throughout a night of sleep. By systematically challenging sleep with realistic and varied acoustic disruption, we found that sleepers exhibited markedly greater sensitivity to sounds during moments of elevated alpha expression. This result demonstrates that alpha power is not a binary marker of the transition between sleep and wakefulness, but carries rich information about immediate sleep stability. Further, it shows that an empirical and ecologically relevant form of sleep depth is revealed in real-time by EEG spectral content in the alpha band, a measure that affords prediction on the order of minutes. This signal, which transcends the boundaries of classical sleep stages, could potentially be used for real-time feedback to novel, adaptive drug delivery systems for inducing sleep.
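A minimal sketch of the underlying measurement, time-resolved alpha-band (8-13 Hz) spectral power from a posterior EEG channel, computed here with a short-time spectrogram. The window length, channel choice, and sampling rate are assumptions for illustration, not the paper's exact pipeline.

```python
# Sketch: time-resolved alpha-band (8-13 Hz) power as a continuous index of arousability.
# Window length and sampling rate are illustrative.
import numpy as np
from scipy.signal import spectrogram

def alpha_power_timecourse(eeg, fs, band=(8.0, 13.0), window_s=30.0):
    nperseg = int(window_s * fs)
    f, t, Sxx = spectrogram(eeg, fs=fs, nperseg=nperseg, noverlap=nperseg // 2)
    alpha = (f >= band[0]) & (f <= band[1])
    return t, Sxx[alpha].mean(axis=0)   # mean alpha-band power per window

# Example: 10 minutes of fake posterior-channel EEG sampled at 100 Hz.
fs = 100
eeg = np.random.randn(10 * 60 * fs)
t, alpha_power = alpha_power_timecourse(eeg, fs)
print(t.shape, alpha_power.shape)
```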


Subject(s)
Brain/physiology , Sleep Stages/physiology , Wakefulness/physiology , Acoustic Stimulation , Adult , Alpha Rhythm/physiology , Darkness , Female , Humans , Male
5.
Curr Biol ; 20(15): R626-7, 2010 Aug 10.
Article in English | MEDLINE | ID: mdl-20692606

ABSTRACT

Quality sleep is an essential part of health and well-being. Yet fractured sleep is disturbingly prevalent in our society, partly due to insults from a variety of noises [1]. Common experience suggests that this fragility of sleep is highly variable between people, but it is unclear what mechanisms drive these differences. Here we show that it is possible to predict an individual's ability to maintain sleep in the face of sound using spontaneous brain rhythms from electroencephalography (EEG). The sleep spindle is a thalamocortical rhythm manifested on the EEG as a brief 11-15 Hz oscillation and is thought to be capable of modulating the influence of external stimuli [2]. Its rate of occurrence, while variable across people, is stable across nights [3]. We found that individuals who generated more sleep spindles during a quiet night of sleep went on to exhibit higher tolerance for noise during a subsequent, noisy night of sleep. This result shows that the sleeping brain's spontaneous activity heralds individual resilience to disruptive stimuli. Our finding sets the stage for future studies that attempt to augment spindle production to enhance sleep continuity when confronted with noise.
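A minimal sketch of the kind of across-subject analysis described above: relate each sleeper's spindle rate on a quiet baseline night to that sleeper's tolerance for noise on a subsequent noisy night. The per-subject numbers and the tolerance measure used here (highest presented sound level slept through) are made up for illustration.

```python
# Sketch: quiet-night spindle rate vs. noisy-night noise tolerance across sleepers.
# All values are hypothetical.
import numpy as np
from scipy.stats import spearmanr

spindles_per_min_quiet = np.array([2.1, 3.4, 1.2, 4.0, 2.8, 3.1])   # quiet baseline night
tolerated_level_dBA = np.array([55, 65, 45, 70, 60, 60])            # noisy night

rho, p_value = spearmanr(spindles_per_min_quiet, tolerated_level_dBA)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")
```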


Subject(s)
Brain/physiology , Noise/adverse effects , Sleep/physiology , Adult , Electroencephalography , Humans , Young Adult