1.
Crit Care Explor ; 6(4): e1073, 2024 Apr.
Article in English | MEDLINE | ID: mdl-38545607

ABSTRACT

OBJECTIVES: Early signs of bleeding are often masked by physiologic compensatory responses, delaying its identification. We sought to describe early physiologic signatures of bleeding during the blood donation process. SETTING: Waveform-level vital sign data, including electrocardiography, photoplethysmography (PPG), continuous noninvasive arterial pressure, and respiratory waveforms, were collected before, during, and after bleeding. SUBJECTS: Fifty-five healthy volunteers who visited a blood donation center to donate whole blood. INTERVENTION: After informed consent was obtained, each subject rested for 3 minutes, followed by 3 minutes of orthostasis and another 3 minutes of rest before blood donation. After donation was complete, subjects rested for another 3 minutes, followed by a second 3-minute orthostasis period. MEASUREMENTS AND MAIN RESULTS: From the 55 subjects, waveform signals, numerical vital signs (heart rate [HR], respiratory rate, blood pressure), and clinical characteristics were collected; data from 51 subjects were analyzable. Adverse events (AEs; dizziness, lightheadedness, nausea) were documented. Statistical and physiologic features, including HR variability (HRV) metrics and other waveform morphologic parameters, were modeled, and feature trends for all participants across the study protocol were analyzed. No significant changes in HR, blood pressure, or estimated cardiac output were seen during bleeding. Both orthostatic challenges and bleeding significantly decreased time-domain and high-frequency-domain HRV and PPG amplitude, while increasing PPG amplitude variation. During bleeding, time-domain HRV feature trends were most sensitive to the first 100 mL of blood loss, and statistically significant incremental changes were observed in other HRV parameters (from 300 mL of blood loss) and in a PPG morphologic feature (from 400 mL of blood loss). The AE group (n = 6) showed decreased sample entropy compared with the non-AE group during the postbleed orthostatic challenge (p = 0.003). No other significant trend differences were observed during bleeding between the AE and non-AE groups. CONCLUSIONS: Various HRV-related features changed during rapid bleeding, with changes detectable within the first minute. Subjects with AEs during postbleeding orthostasis showed decreased sample entropy. These findings could be leveraged toward earlier identification of donors at risk for AEs and, more broadly, toward building a data-driven hemorrhage model for the early treatment of critical bleeding.
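
The abstract reports time-domain HRV metrics and sample entropy without specifying the implementations used; the sketch below shows common versions of both, computed from a beat-to-beat R-R interval series. The function names and the synthetic R-R series are illustrative assumptions, not the study's code.

```python
import numpy as np

def time_domain_hrv(rr_ms):
    """Common time-domain HRV metrics from R-R intervals in milliseconds."""
    rr = np.asarray(rr_ms, dtype=float)
    diffs = np.diff(rr)
    return {
        "sdnn": rr.std(ddof=1),                      # overall R-R variability
        "rmssd": np.sqrt(np.mean(diffs ** 2)),       # beat-to-beat variability
        "pnn50": np.mean(np.abs(diffs) > 50) * 100,  # % of successive diffs > 50 ms
    }

def sample_entropy(x, m=2, r_factor=0.2):
    """Sample entropy of a 1-D series (lower values = more regular signal)."""
    x = np.asarray(x, dtype=float)
    r = r_factor * x.std(ddof=1)
    n = len(x)

    def matches(length):
        # count template pairs of the given length within tolerance r (self-matches excluded)
        templates = np.array([x[i:i + length] for i in range(n - length)])
        dists = np.max(np.abs(templates[:, None] - templates[None, :]), axis=2)
        return np.sum(dists <= r) - len(templates)

    b, a = matches(m), matches(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else float("inf")

rr_intervals = 800 + 40 * np.random.default_rng(0).standard_normal(300)  # synthetic R-R series, ms
print(time_domain_hrv(rr_intervals))
print(sample_entropy(rr_intervals))
```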

2.
AMIA Annu Symp Proc ; 2022: 405-414, 2022.
Article in English | MEDLINE | ID: mdl-37128388

ABSTRACT

A significant proportion of clinical physiologic monitoring alarms are false. This often leads to alarm fatigue in clinical personnel, inevitably compromising patient safety. To combat this issue, researchers have attempted to build Machine Learning (ML) models capable of accurately adjudicating Vital Sign (VS) alerts raised at the bedside of hemodynamically monitored patients as real or artifact. Previous studies have utilized supervised ML techniques that require substantial amounts of hand-labeled data. However, manually harvesting such data can be costly, time-consuming, and mundane, and is a key factor limiting the widespread adoption of ML in healthcare (HC). Instead, we explore the use of multiple, individually imperfect heuristics to automatically assign probabilistic labels to unlabeled training data using weak supervision. Our weakly supervised models perform competitively with traditional supervised techniques and require less involvement from domain experts, demonstrating their use as efficient and practical alternatives to supervised learning in HC applications of ML.
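
The abstract describes combining multiple, individually imperfect heuristics into probabilistic labels but does not give the aggregation scheme. Below is a minimal sketch assuming hand-written labeling heuristics and a simple vote-fraction aggregation; production weak-supervision frameworks (e.g., Snorkel) instead learn per-heuristic accuracies with a label model. The heuristic names and alert features are hypothetical.

```python
import numpy as np

# Hypothetical heuristics: each votes real (1), artifact (0), or abstains (None)
# for a vital-sign alert represented as a dict of simple features.
def hf_flatline(alert):          # flat signal segments often indicate sensor artifact
    return 0 if alert["signal_std"] < 0.01 else None

def hf_impossible_value(alert):  # physiologically implausible values suggest artifact
    return 0 if alert["spo2"] < 50 or alert["hr"] > 300 else None

def hf_concordant_vitals(alert): # concordant deterioration across vitals suggests a real alert
    return 1 if alert["hr_rising"] and alert["spo2_falling"] else None

HEURISTICS = [hf_flatline, hf_impossible_value, hf_concordant_vitals]

def probabilistic_label(alert):
    """Combine heuristic votes into P(real); abstentions carry no weight."""
    votes = [h(alert) for h in HEURISTICS if h(alert) is not None]
    if not votes:
        return 0.5                    # no evidence either way
    return float(np.mean(votes))      # fraction of non-abstaining votes for "real"

alert = {"signal_std": 0.3, "spo2": 82, "hr": 135, "hr_rising": True, "spo2_falling": True}
print(probabilistic_label(alert))     # 1.0 -> treated as likely real during training
```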


Subject(s)
Artifacts , Monitoring, Physiologic , Supervised Machine Learning , Vital Signs , Humans , Monitoring, Physiologic/methods , Monitoring, Physiologic/standards , Heuristics , Automation
3.
Crit Care Med ; 49(1): 79-90, 2021 01 01.
Article in English | MEDLINE | ID: mdl-33165027

ABSTRACT

OBJECTIVES: To compare 5% albumin with 0.9% saline for large-volume resuscitation (> 60 mL/kg within 24 hr) with respect to mortality and development of acute kidney injury. DESIGN: Retrospective cohort study. SETTING: Patients admitted to ICUs in 13 hospitals across Western Pennsylvania. We analyzed two independent cohorts from the High-Density Intensive Care databases: High-Density Intensive Care-08 (July 2000 to October 2008, H08) and High-Density Intensive Care-15 (October 2008 to December 2014, H15). PATIENTS: A total of 18,629 critically ill patients requiring large-volume resuscitation. INTERVENTIONS: 5% albumin in addition to saline versus 0.9% saline alone. MEASUREMENTS AND MAIN RESULTS: After excluding patients with acute kidney injury prior to large-volume resuscitation, 673 of 2,428 patients (27.7%) and 1,814 of 16,201 patients (11.2%) received 5% albumin in H08 and H15, respectively. Use of 5% albumin was associated with decreased 30-day mortality by multivariate regression in H08 (odds ratio 0.65; 95% CI 0.49-0.85; p = 0.002) and in H15 (odds ratio 0.52; 95% CI 0.44-0.62; p < 0.0001) but was associated with increased acute kidney injury in H08 (odds ratio 1.98; 95% CI 1.56-2.51; p < 0.001) and in H15 (odds ratio 1.75; 95% CI 1.58-1.95; p < 0.001). However, 5% albumin was not associated with persistent acute kidney injury and resulted in fewer major adverse kidney events at 30, 90, and 365 days. Propensity-matched analysis confirmed similar associations with mortality and acute kidney injury. CONCLUSIONS: During large-volume resuscitation, 5% albumin was associated with reduced mortality and fewer major adverse kidney events at 30, 90, and 365 days. However, a higher rate of acute kidney injury of any stage was observed that did not translate into persistent renal dysfunction.


Subject(s)
Albumins/therapeutic use , Critical Illness/therapy , Resuscitation/methods , Saline Solution/therapeutic use , Albumins/administration & dosage , Critical Illness/mortality , Hospital Mortality , Humans , Proportional Hazards Models , Resuscitation/mortality , Retrospective Studies , Saline Solution/administration & dosage , Survival Analysis
4.
J Trauma Acute Care Surg ; 88(5): 654-660, 2020 05.
Article in English | MEDLINE | ID: mdl-32032282

ABSTRACT

BACKGROUND: Modeling approaches offer a novel way to detect and predict coagulopathy in trauma patients. A dynamic model, built and tested on thromboelastogram (TEG) data, was used to generate a virtual library of over 160,000 simulated RapidTEGs. The patient-specific parameters are the initial platelet count, platelet activation rate, thrombus growth rate, and lysis rate (P(0), k1, k2, and k3, respectively). METHODS: Patient data from the STAAMP (n = 182 patients) and PAMPer (n = 111 patients) clinical trials were collected, and a total of 873 RapidTEGs were analyzed. One hundred sixteen TEGs indicated maximum amplitude (MA) below normal, and 466 TEGs indicated lysis percent above normal. Each patient's TEG response was compared against the virtual library to determine the library trajectories with the least sum-of-squared error versus the patient TEG up to each specified evaluation time (3, 4, 5, 7.5, 10, 15, or 20 minutes). Using the dynamic-model parameters of the 10 nearest-neighbor trajectories, logistic regression was performed to predict whether the patient TEG indicated MA below normal (< 50 mm), lysis percent 30 minutes after MA (LY30) greater than 3%, and/or a need for blood transfusion. RESULTS: The algorithm predicts abnormal MA values using the initial 3 minutes of RapidTEG data with a median area under the curve of 0.95, improving to 0.98 with 10 minutes of data. Predictions of future platelet and packed red blood cell transfusion based on parameters at 4 and 5 minutes, respectively, were equivalent to those from traditional TEG parameters but available in significantly less time. Dynamic model parameters could not predict abnormal LY30 or future fresh-frozen plasma transfusion. CONCLUSION: This analysis could be incorporated into TEG software and workflow to quickly estimate, within the initial minutes of a TEG, whether the MA will fall below or above the threshold value, along with an estimate of what blood products to have on hand. LEVEL OF EVIDENCE: Therapeutic/Care Management, Level IV.
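
The abstract outlines the matching pipeline (least sum-of-squared error against the simulated library up to an evaluation time, 10 nearest neighbors, then logistic regression on the dynamic-model parameters) without code; below is a minimal sketch of that idea. The array shapes, sampling rate, and synthetic data are assumptions for illustration only, not the study's implementation.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Hypothetical stand-ins for the virtual library: each simulated RapidTEG trace
# is paired with its dynamic-model parameters (P0, k1, k2, k3).
library_tegs = rng.normal(size=(5000, 240))     # 5,000 traces x 240 time samples
library_params = rng.normal(size=(5000, 4))     # (P0, k1, k2, k3) per trace

def nearest_library_params(patient_teg, t_eval, k=10):
    """SSE match of the partial patient trace (first t_eval samples) against the
    library; return the mean parameter vector of the k nearest trajectories."""
    sse = np.sum((library_tegs[:, :t_eval] - patient_teg[:t_eval]) ** 2, axis=1)
    nearest = np.argsort(sse)[:k]
    return library_params[nearest].mean(axis=0)

# Synthetic patient traces and outcome labels (e.g., MA < 50 mm), for illustration only.
patient_tegs = rng.normal(size=(200, 240))
ma_low = rng.integers(0, 2, size=200)

X = np.vstack([nearest_library_params(p, t_eval=36) for p in patient_tegs])  # ~3 min, assuming 0.2 Hz
clf = LogisticRegression().fit(X, ma_low)
print(clf.predict_proba(X[:3]))
```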


Subject(s)
Blood Coagulation Disorders/diagnosis , Blood Component Transfusion/statistics & numerical data , Models, Cardiovascular , Thrombelastography/statistics & numerical data , Wounds and Injuries/complications , Adult , Algorithms , Blood Coagulation Disorders/blood , Blood Coagulation Disorders/therapy , Clinical Trials as Topic , Female , Humans , Male , Middle Aged , Platelet Activation , Platelet Count , Point-of-Care Systems/statistics & numerical data , Prognosis , Thrombelastography/instrumentation , Time Factors , Wounds and Injuries/blood , Wounds and Injuries/therapy , Young Adult
5.
Crit Care Explor ; 1(10): e0058, 2019 Oct.
Article in English | MEDLINE | ID: mdl-32166238

ABSTRACT

We hypothesize that knowledge of a stable personalized baseline state and increased data sampling frequency would markedly improve the ability to detect progressive hypovolemia during hemorrhage earlier and with a lower false positive rate than when using less granular data. DESIGN: Prospective temporal challenge. SETTING: Large animal research laboratory, University Medical Center. SUBJECTS: Fifty-one anesthetized Yorkshire pigs. INTERVENTIONS: Pigs were instrumented with arterial, pulmonary arterial, and central venous catheters, allowed to stabilize for 30 minutes, and then bled at a constant rate of either 5 mL·min⁻¹ (n = 13) or 20 mL·min⁻¹ (n = 38) until mean arterial pressure decreased to 40 or 30 mm Hg, respectively. MEASUREMENTS AND MAIN RESULTS: Data from the stabilization period served as baseline. Hemodynamic variables collected at 250 Hz were used to create predictive models of "bleeding" using featurized beat-to-beat and waveform data and were compared with models using unfeaturized hemodynamic variables averaged over 1 minute (simple hemodynamic metrics); random forest classifiers were trained to identify bleeding with or without baseline data. Robustness of the prediction was evaluated with leave-one-pig-out cross-validation. Predictive performance of the models was compared by their activity monitoring operating characteristic and receiver operating characteristic profiles. Primary hemodynamic threshold data poorly identified bleed onset unless very stable initial baseline reference data were available. When referenced to baseline, at a false positive rate of 10⁻², the time to detect bleeding in 80% of pigs was similar for the simple hemodynamic metric, beat-to-beat, and waveform models, at about 3-4 minutes. When universally baselined, in contrast, increasing sampling frequency reduced the latency of bleed detection from 10 to 8 to 6 minutes for the simple hemodynamic metric, beat-to-beat, and waveform models, respectively. Some informative features differed among the simple hemodynamic metric, beat-to-beat, and waveform models. CONCLUSIONS: Knowledge of personal stable baseline data allows early detection of new-onset bleeding; when no personal baseline exists, increasing the sampling frequency of hemodynamic monitoring data enables earlier detection of bleeding with a lower false positive rate.
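
The abstract names the modeling setup (random forest classifiers on featurized data, evaluated with leave-one-pig-out cross-validation) but not the code; below is a minimal, self-contained sketch of that evaluation loop using scikit-learn. The feature matrix, window labels, and pig IDs are synthetic placeholders, not the study's data.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import LeaveOneGroupOut
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)

# Hypothetical feature matrix: one row per analysis window of featurized
# hemodynamic data, with a bleeding/not-bleeding label and the source pig's ID.
X = rng.normal(size=(2000, 12))
y = rng.integers(0, 2, size=2000)
pigs = rng.integers(0, 51, size=2000)

aucs = []
for train_idx, test_idx in LeaveOneGroupOut().split(X, y, groups=pigs):
    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    clf.fit(X[train_idx], y[train_idx])
    if len(np.unique(y[test_idx])) < 2:   # AUC undefined if the held-out pig has one class
        continue
    scores = clf.predict_proba(X[test_idx])[:, 1]
    aucs.append(roc_auc_score(y[test_idx], scores))

print(f"leave-one-pig-out AUC: {np.mean(aucs):.2f}")
```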

6.
Chest ; 152(5): 972-979, 2017 11.
Article in English | MEDLINE | ID: mdl-28527880

ABSTRACT

BACKGROUND: Urine output (UO) is a vital sign for critically ill patients, but standards for monitoring and reporting vary widely between ICUs. Careful monitoring of UO could lead to earlier recognition of acute kidney injury (AKI) and better fluid management. We sought to determine if the intensity of UO monitoring is associated with outcomes in patients with and those without AKI. METHODS: This was a retrospective cohort study including 15,724 adults admitted to ICUs from 2000 to 2008. Intensive UO monitoring was defined as hourly recordings and no gaps > 3 hours for the first 48 hours after ICU admission. RESULTS: Intensive monitoring for UO was conducted in 4,049 patients (26%), and we found significantly higher rates of AKI (OR, 1.22; P < .001) in these patients. After adjustment for age and severity of illness, intensive UO monitoring was associated with improved survival but only among patients experiencing AKI. With or without AKI, patients with intensive monitoring also had less cumulative fluid volume (2.98 L vs 3.78 L; P < .001) and less fluid overload (2.49% vs 5.68%; P < .001) over the first 72 hours of ICU stay. CONCLUSIONS: In this large ICU population, intensive monitoring of UO was associated with improved detection of AKI and reduced 30-day mortality in patients experiencing AKI, as well as less fluid overload for all patients. Our results should help inform clinical decisions and ICU policy about frequency of monitoring of UO, especially for patients at high risk of AKI or fluid overload, or both.
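
The study's definition of intensive UO monitoring (hourly recordings with no gaps greater than 3 hours over the first 48 hours after ICU admission) can be operationalized on charting timestamps; a minimal sketch of the gap criterion is below. How the study combined the hourly-recording requirement with the 3-hour gap allowance is not stated in the abstract, so this is an assumption.

```python
from datetime import datetime, timedelta

def is_intensive_uo_monitoring(record_times, icu_admit):
    """Gap-based check: within the first 48 h after ICU admission, no interval
    between consecutive urine-output recordings (or the window edges) exceeds 3 h."""
    window_end = icu_admit + timedelta(hours=48)
    times = sorted(t for t in record_times if icu_admit <= t <= window_end)
    if not times:
        return False
    checkpoints = [icu_admit] + times + [window_end]
    gaps = [b - a for a, b in zip(checkpoints, checkpoints[1:])]
    return max(gaps) <= timedelta(hours=3)

admit = datetime(2024, 1, 1, 8, 0)
hourly = [admit + timedelta(hours=h) for h in range(1, 49)]   # charted every hour
print(is_intensive_uo_monitoring(hourly, admit))              # True
```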


Subject(s)
Acute Kidney Injury/diagnosis , Intensive Care Units , Monitoring, Physiologic/methods , Urination/physiology , Acute Kidney Injury/physiopathology , Adult , Aged , Disease Progression , Female , Follow-Up Studies , Humans , Male , Middle Aged , Predictive Value of Tests , Retrospective Studies , Time Factors
7.
Crit Care Med ; 45(2): e146-e153, 2017 Feb.
Article in English | MEDLINE | ID: mdl-27635770

ABSTRACT

OBJECTIVE: We sought to investigate whether the chloride content of fluids used in resuscitation was associated with short- and long-term outcomes. DESIGN: We identified patients who received large-volume fluid resuscitation, defined as greater than 60 mL/kg over a 24-hour period. Chloride load was determined for each patient by multiplying the chloride ion concentration of the fluids received during large-volume fluid resuscitation by the volumes of those fluids. We compared the development of hyperchloremic acidosis, acute kidney injury, and survival among those with higher and lower chloride loads. SETTING: University Medical Center. PATIENTS: Patients admitted to ICUs from 2000 to 2008. INTERVENTIONS: None. MEASUREMENTS AND MAIN RESULTS: Among 4,710 patients receiving large-volume fluid resuscitation, hyperchloremic acidosis was documented in 523 (11%). Crude rates of hyperchloremic acidosis, acute kidney injury, and hospital mortality all increased significantly as chloride load increased (p < 0.001). However, chloride load was no longer associated with hyperchloremic acidosis or acute kidney injury after controlling for total fluids, age, and baseline severity. Conversely, each 100 mEq increase in chloride load was associated with a 5.5% increase in the hazard of death over 1 year, even after controlling for total fluid volume, age, and severity (p = 0.0015). CONCLUSIONS: Chloride load is associated with significant adverse effects on survival out to 1 year, even after controlling for total fluid load, age, and baseline severity of illness. However, the relationship between chloride load and the development of hyperchloremic acidosis or acute kidney injury is less clear, and further research is needed to elucidate the mechanisms underlying the adverse effects of chloride load on survival.
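
As a concrete illustration of the chloride-load calculation (chloride concentration × volume, summed over the fluids given), here is a small sketch. The chloride concentrations listed are typical published values, not figures from this study, and the fluid names and helper are assumptions for illustration.

```python
# Approximate chloride concentrations (mEq/L) of common resuscitation fluids;
# these are typical published figures and are assumptions for this sketch.
CHLORIDE_MEQ_PER_L = {
    "0.9% saline": 154,
    "lactated ringers": 109,
    "plasmalyte": 98,
    "5% albumin": 145,   # varies by manufacturer
}

def chloride_load_meq(fluids):
    """Total chloride load: sum of chloride concentration x volume over all fluids
    given during large-volume resuscitation. `fluids` maps fluid name to litres."""
    return sum(CHLORIDE_MEQ_PER_L[name] * litres for name, litres in fluids.items())

# Example: 4 L of 0.9% saline plus 1 L of lactated Ringer's
load = chloride_load_meq({"0.9% saline": 4.0, "lactated ringers": 1.0})
print(f"chloride load: {load:.0f} mEq")
# The reported association corresponds to a hazard ratio of about 1.055 per 100 mEq.
```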


Subject(s)
Chlorides/analysis , Fluid Therapy/methods , Rehydration Solutions/chemistry , Resuscitation/methods , Acidosis/etiology , Acute Kidney Injury/etiology , Adolescent , Adult , Aged , Chlorides/adverse effects , Female , Fluid Therapy/mortality , Humans , Male , Middle Aged , Rehydration Solutions/adverse effects , Rehydration Solutions/therapeutic use , Resuscitation/mortality , Young Adult
8.
Crit Care Med ; 44(7): e456-63, 2016 Jul.
Article in English | MEDLINE | ID: mdl-26992068

ABSTRACT

OBJECTIVE: To use machine-learning algorithms to classify alerts as real or artifact in online noninvasive vital sign data streams, with the goal of reducing alarm fatigue and missed true instability. DESIGN: Observational cohort study. SETTING: Twenty-four-bed trauma step-down unit. PATIENTS: Two thousand one hundred fifty-three patients. INTERVENTION: Noninvasive vital sign monitoring data (heart rate, respiratory rate, peripheral oximetry) were recorded on all admissions at 1/20 Hz, with noninvasive blood pressure recorded less frequently, and the data were partitioned into training/validation (294 admissions; 22,980 monitoring hours) and test (2,057 admissions; 156,177 monitoring hours) sets. Alerts were vital sign deviations beyond stability thresholds. A four-member expert committee annotated a subset of alerts selected by active learning (576 in the training/validation set, 397 in the test set) as real or artifact, and machine-learning algorithms were trained on these annotations. The best model was evaluated on test set alerts to enact online alert classification over time. MEASUREMENTS AND MAIN RESULTS: The Random Forest model discriminated between real and artifact alerts as they evolved online in the test set, with area under the curve performance of 0.79 (95% CI, 0.67-0.93) for peripheral oximetry at the instant the vital sign first crossed threshold, increasing to 0.87 (95% CI, 0.71-0.95) at 3 minutes into the alerting period. Blood pressure area under the curve started at 0.77 (95% CI, 0.64-0.95) and increased to 0.87 (95% CI, 0.71-0.98), whereas respiratory rate area under the curve started at 0.85 (95% CI, 0.77-0.95) and increased to 0.97 (95% CI, 0.94-1.00). Heart rate alerts were too few for model development. CONCLUSIONS: Machine-learning models can discern clinically relevant peripheral oximetry, blood pressure, and respiratory rate alerts from artifacts in an online monitoring dataset (area under the curve > 0.87).
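
The headline result is how discrimination improves as an alert evolves (AUC at threshold crossing versus 3 minutes in); below is a minimal sketch of that time-resolved evaluation with a random forest, assuming per-minute feature snapshots for each annotated alert. All data here are synthetic placeholders; the study's features and model configuration are not specified in the abstract.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(2)

# Hypothetical setup: for each annotated alert, a feature vector summarizing the
# signal from threshold crossing up to m minutes into the alerting period.
n_train, n_test, n_feat = 576, 397, 10
minutes = [0, 1, 2, 3]
train_feats = {m: rng.normal(size=(n_train, n_feat)) for m in minutes}
test_feats = {m: rng.normal(size=(n_test, n_feat)) for m in minutes}
y_train = rng.integers(0, 2, size=n_train)   # 1 = real alert, 0 = artifact
y_test = rng.integers(0, 2, size=n_test)

# Evaluate how discrimination changes as the alert evolves online.
for m in minutes:
    clf = RandomForestClassifier(n_estimators=300, random_state=0)
    clf.fit(train_feats[m], y_train)
    auc = roc_auc_score(y_test, clf.predict_proba(test_feats[m])[:, 1])
    print(f"AUC at {m} min into the alert: {auc:.2f}")
```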


Subject(s)
Artifacts , Clinical Alarms/classification , Monitoring, Physiologic/methods , Supervised Machine Learning , Vital Signs , Blood Pressure Determination , Cohort Studies , Heart Rate , Humans , Oximetry , Respiratory Rate
9.
J Pediatr ; 164(4): 749-755.e3, 2014 Apr.
Article in English | MEDLINE | ID: mdl-24388320

ABSTRACT

OBJECTIVE: To determine the incidence and risk factors for readmission to the intensive care unit (ICU) among preterm infants who required mechanical ventilation at birth. STUDY DESIGN: We studied preterm newborns (birth weight 500-1250 g) who required mechanical ventilation at birth and were enrolled in a multicenter trial of inhaled nitric oxide therapy. Patients were assessed up to 4.5 years of age via annual in-person evaluations and structured telephone interviews. Univariate and multivariable analyses of baseline and birth hospitalization predictors of ICU readmission were performed. RESULTS: Of 512 subjects providing follow-up data, 58% were readmitted to the hospital (51% of these had multiple readmissions, averaging 3.9 readmissions per subject), 19% were readmitted to an ICU, and 12% required additional mechanical ventilation support. In univariate analyses, ICU readmission was more common among male subjects (OR 2.01; 95% CI 1.27-3.18) and infants with grade 3-4 intracranial hemorrhage (OR 2.13; 95% CI 1.23-3.69), and was associated with increasing duration of birth hospitalization (OR 1.01 per day; 95% CI 1.00-1.02) and prolonged oxygen therapy (OR 1.01 per day; 95% CI 1.00-1.01). In the first year after birth hospitalization, children readmitted to an ICU incurred greater health care costs (median $69,700 vs $30,200 for subjects admitted to the ward and $9,600 for subjects never admitted). CONCLUSIONS: Small preterm infants who were mechanically ventilated at birth have substantial risk for readmission to an ICU and late mechanical ventilation, require extensive health care resources, and incur high treatment costs.


Subject(s)
Intensive Care Units/statistics & numerical data , Patient Readmission/statistics & numerical data , Respiration, Artificial , Respiratory Insufficiency/therapy , Child, Preschool , Female , Follow-Up Studies , Humans , Infant, Newborn , Infant, Premature , Male , Risk Factors