ABSTRACT
Background: Cardiac function of critically ill patients with COVID-19 has generally been reported from clinically obtained data. Echocardiographic deformation imaging can identify ventricular dysfunction missed by traditional echocardiographic assessment. Research Question: What is the prevalence of ventricular dysfunction, and what are its implications for the natural history of critical COVID-19? Study Design and Methods: This is a multicenter prospective cohort of critically ill patients with COVID-19. We performed serial echocardiography and lower extremity vascular ultrasound on hospitalization days 1, 3, and 8. We defined left ventricular (LV) dysfunction as an absolute value of longitudinal strain of < 17% or a left ventricular ejection fraction (LVEF) of < 50%. The primary clinical outcome was inpatient survival. Results: We enrolled 110 patients. Thirty-nine (35.5%) died before hospital discharge. LV dysfunction was present at admission in 38 patients (34.5%) and on day 8 in 21 patients (36.2%) (P = .59). Median baseline LVEF was 62% (interquartile range [IQR], 52%-69%), whereas the median absolute value of baseline LV strain was 16% (IQR, 14%-19%). Survivors and nonsurvivors did not differ significantly in day 1 LV strain (17.9% vs 14.4%; P = .12) or day 1 LVEF (60.5% vs 65%; P = .06). Nonsurvivors showed worse day 1 right ventricular (RV) strain than survivors (16.3% vs 21.2%; P = .04). Interpretation: Among patients with critical COVID-19, LV and RV dysfunction is common and frequently identified only through deformation imaging, and early (day 1) RV dysfunction may be associated with clinical outcome.
ABSTRACT
Importance: Platelet activation is a potential therapeutic target in patients with COVID-19. Objective: To evaluate the effect of P2Y12 inhibition among critically ill patients hospitalized for COVID-19. Design, Setting, and Participants: This international, open-label, adaptive platform, 1:1 randomized clinical trial included critically ill (requiring intensive care-level support) patients hospitalized with COVID-19. Patients were enrolled from February 26, 2021, through June 22, 2022. Enrollment was discontinued on June 22, 2022, by the trial leadership in coordination with the study sponsor given a marked slowing of the enrollment rate of critically ill patients. Intervention: Participants were randomly assigned to receive a P2Y12 inhibitor or no P2Y12 inhibitor (usual care) for 14 days or until hospital discharge, whichever was sooner. Ticagrelor was the preferred P2Y12 inhibitor. Main Outcomes and Measures: The primary outcome was organ support-free days, evaluated on an ordinal scale that combined in-hospital death and, for participants who survived to hospital discharge, the number of days free of cardiovascular or respiratory organ support up to day 21 of the index hospitalization. The primary safety outcome was major bleeding, as defined by the International Society on Thrombosis and Haemostasis. Results: At the time of trial termination, 949 participants (median [IQR] age, 56 [46-65] years; 603 male [63.5%]) had been randomly assigned, 479 to the P2Y12 inhibitor group and 470 to usual care. In the P2Y12 inhibitor group, ticagrelor was used in 372 participants (78.8%) and clopidogrel in 100 participants (21.2%). The estimated adjusted odds ratio (AOR) for the effect of the P2Y12 inhibitor on organ support-free days was 1.07 (95% credible interval, 0.85-1.33). The posterior probability of superiority (defined as an OR > 1.0) was 72.9%.
Overall, 354 participants (74.5%) in the P2Y12 inhibitor group and 339 participants (72.4%) in the usual care group survived to hospital discharge (median AOR, 1.15; 95% credible interval, 0.84-1.55; posterior probability of superiority, 80.8%). Major bleeding occurred in 13 participants (2.7%) in the P2Y12 inhibitor group and 13 (2.8%) in the usual care group. The estimated mortality rate at 90 days for the P2Y12 inhibitor group was 25.5% and for the usual care group was 27.0% (adjusted hazard ratio, 0.96; 95% CI, 0.76-1.23; P = .77). Conclusions and Relevance: In this randomized clinical trial of critically ill participants hospitalized for COVID-19, treatment with a P2Y12 inhibitor did not improve the number of days alive and free of cardiovascular or respiratory organ support. The use of the P2Y12 inhibitor did not increase major bleeding compared with usual care. These data do not support routine use of a P2Y12 inhibitor in critically ill patients hospitalized for COVID-19. Trial Registration: ClinicalTrials.gov Identifier: NCT04505774.
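The "posterior probability of superiority" reported above is, in a Bayesian platform design, simply the share of posterior draws of the treatment-effect odds ratio that exceed 1.0. A minimal sketch, assuming (hypothetically; this is not the trial's actual model) a normal posterior on log(OR) parameterized to roughly match the reported OR of 1.07 and credible interval of 0.85-1.33:

```python
import random

# Hypothetical sketch of a "posterior probability of superiority": the share
# of posterior draws of the odds ratio that exceed 1.0. The normal posterior
# on log(OR) below is an assumption chosen to roughly match the reported
# OR of 1.07 (95% CrI, 0.85-1.33); it is not the trial's actual model.
random.seed(0)
log_or_draws = [random.gauss(0.068, 0.114) for _ in range(100_000)]

# P(OR > 1.0) is P(log OR > 0) over the posterior draws
p_superiority = sum(x > 0 for x in log_or_draws) / len(log_or_draws)
print(round(p_superiority, 3))
```

With these assumed parameters the result lands near the reported 72.9%, which is only a consistency check, not a reconstruction of the trial's analysis.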
Subject(s)
COVID-19 , Purinergic P2Y Receptor Antagonists , Humans , Male , Middle Aged , Critical Illness/therapy , Hemorrhage , Hospital Mortality , Ticagrelor/therapeutic use , Purinergic P2Y Receptor Antagonists/therapeutic use
ABSTRACT
The efficient and accurate diagnosis of dengue, a major mosquito-borne disease, is of primary importance for clinical care, surveillance, and outbreak control. The identification of specific dengue virus serotypes 1 to 4 (DENV-1 to DENV-4) can help in understanding the transmission dynamics and spread of dengue disease. We describe four rapid, low-resource, serotype-specific dengue tests that use a simple sample preparation reagent followed by reverse transcription-isothermal recombinase polymerase amplification (RT-RPA) combined with lateral flow detection (LFD) technology. Results are obtained directly from clinical sample matrices in 35 min, requiring only a heating block and pipettes for liquid handling. In addition, we demonstrate that the rapid sample preparation step inactivates DENV, improving laboratory safety. Human plasma and serum were spiked with DENV, and DENV was detected with analytical sensitivities of 333 to 22,500 median tissue culture infectious doses (TCID50)/mL. The analytical sensitivities in blood were 94,000 to 333,000 TCID50/mL. Analytical specificity testing confirmed that each test could detect multiple serotype-specific strains but did not respond to strains of other serotypes, closely related flaviviruses, or chikungunya virus. Clinical testing on 80 human serum samples demonstrated test specificities of between 94 and 100%, with a DENV-2 test sensitivity of 100%, detecting down to 0.004 PFU/µL, similar to the sensitivity of the PCR test; the other DENV tests detected down to 0.03 to 10.9 PFU/µL. Collectively, our data suggest that some of our rapid dengue serotyping tests provide a potential alternative to conventional labor-intensive RT-quantitative PCR (RT-qPCR) detection, which requires expensive thermal cycling instrumentation, technical expertise, and prolonged testing times.
Our tests provide performance and speed without compromising specificity in human plasma and serum and could become promising tools for the detection of high DENV loads in resource-limited settings.
IMPORTANCE: The efficient and accurate diagnosis of dengue, a major mosquito-borne disease, is of primary importance for clinical care, surveillance, and outbreak control. This study describes the evaluation of four rapid low-resource serotype-specific dengue tests for the detection of specific DENV serotypes in clinical sample matrices. The tests use a simple sample preparation reagent followed by reverse transcription-isothermal recombinase polymerase amplification (RT-RPA) combined with lateral flow detection (LFD) technology. These tests have several advantages compared to RT-qPCR detection, such as a simple workflow, rapid sample processing and turnaround times (35 min from sample preparation to detection), minimal equipment needs, and improved laboratory safety through the inactivation of the virus during the sample preparation step. The low-resource formats of these rapid dengue serotyping tests have the potential to support effective dengue disease surveillance and enhance the diagnostic testing capacity in resource-limited countries with both endemic dengue and intense coronavirus disease 2019 (COVID-19) transmission.
Subject(s)
Dengue Virus , Dengue , Humans , Dengue/diagnosis , Dengue Virus/classification , Dengue Virus/isolation & purification , Rapid Diagnostic Tests , Recombinases , Sensitivity and Specificity , Serogroup
ABSTRACT
AIMS: Patient-performed lung ultrasound (LUS) in a heart failure (HF) telemedicine model may be used to monitor worsening pulmonary oedema and to titrate therapy, potentially reducing HF admissions. The aim of the study was to assess the feasibility of training HF patients to perform a LUS self-exam in a telemedicine model. METHODS AND RESULTS: A pilot study was conducted at a public hospital involving subjects with a history of HF. After a 15 min training session involving a tutorial video, subjects performed a four-zone LUS using a handheld ultrasound device. Exams were saved on a remote server and independently reviewed by two LUS experts. Studies were deemed interpretable according to a strict definition: the presence of an intercostal space and the presence of A-lines, B-lines, or both. Subjects also answered a questionnaire to gather feedback and assess self-efficacy. The median age of the 44 subjects was 53 years (range, 36-64). Thirty (68%) were male. The last educational level attained was high school or below for 31 subjects (70%), and one-third used Spanish as their preferred language. One hundred fifty of 175 lung zones (85%) were interpretable, with expert agreement of 87% and a kappa of 0.49. Ninety-eight percent of subjects reported that they could perform this LUS self-exam at home. CONCLUSIONS: This pilot study reports that training HF patients to perform a LUS self-exam is feasible, with reported high self-efficacy. This supports further investigation into a telemedicine model using LUS to reduce emergency department visits and hospitalizations associated with HF.
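The interobserver agreement above (87% raw agreement, kappa of 0.49) is chance-corrected agreement between the two expert reviewers. A minimal sketch of Cohen's kappa for two raters labeling lung zones as interpretable (1) or not (0), using made-up labels rather than the study's data:

```python
# Minimal sketch of Cohen's kappa for two binary raters; the label vectors
# below are hypothetical, not the study's data.

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two binary raters."""
    n = len(rater_a)
    observed = sum(x == y for x, y in zip(rater_a, rater_b)) / n
    p_a = sum(rater_a) / n  # rater A's marginal probability of rating "1"
    p_b = sum(rater_b) / n  # rater B's marginal probability of rating "1"
    # Chance agreement: both rate 1, or both rate 0
    expected = p_a * p_b + (1 - p_a) * (1 - p_b)
    return (observed - expected) / (1 - expected)

# 1 = interpretable lung zone, 0 = not interpretable (hypothetical labels)
a = [1, 1, 1, 0, 1, 0, 1, 1]
b = [1, 1, 0, 0, 1, 1, 1, 1]
print(round(cohens_kappa(a, b), 2))  # → 0.33
```

As the example shows, raw agreement (here 75%) can substantially overstate chance-corrected agreement, which is why the abstract reports both figures.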
Subject(s)
Heart Failure , Telemedicine , Adult , Feasibility Studies , Heart Failure/diagnosis , Heart Failure/therapy , Humans , Lung/diagnostic imaging , Male , Middle Aged , Pilot Projects
ABSTRACT
OBJECTIVES: As point-of-care ultrasound (POCUS) has become more integrated into emergency and critical care medicine, there has been increased interest in using ultrasound to assess volume status. However, recent studies of carotid POCUS for volume status and fluid responsiveness fail to account for the effect of insonation angle on their results. To address this, we studied the effect of insonation angle on the peak systolic velocity (PSV) change associated with respiratory variation (RV) and passive leg raise (PLR). METHODS: Doppler measurements were obtained from 51 subjects presenting to the emergency department. Minimal and maximal PSV were obtained using insonation angles of 46°, 60°, and 90°. ∆PSV was calculated using PLR and RV as trial methods. Results were categorized into two groups: those with a ∆PSV > 10% and those with a ∆PSV ≤ 10%. The mean and standard error of ∆PSV, as well as measures of agreement, were calculated. RESULTS: Mean ∆PSV associated with the PLR test was 9% in the 46° and 60° groups and 18% in the 90° group, with standard errors of 6%, 7%, and 14%, respectively. Using 46° as our relative gold standard, kappa was 0.23 at 60° and 0.11 at 90° with RV as the trial method, and 0.23 at 60° and 0.01 at 90° with PLR as the trial method. CONCLUSIONS: Variation in PSV is heavily dependent on insonation angle. There was only slight to fair agreement in ∆PSV among the various insonation angles. Further investigation of the optimal insonation angle for assessing ∆PSV should be undertaken.
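As a rough illustration of the quantities involved: ∆PSV is a percent change in peak systolic velocity, and the Doppler velocity estimate itself carries a cos θ term that degrades near 90°, one plausible reason angle choice matters. The sketch below assumes the convention ∆PSV = (PSVmax − PSVmin)/PSVmin × 100 (the abstract does not state its exact formula), and the function names and numbers are hypothetical:

```python
import math

# Hypothetical sketch: percent change in carotid peak systolic velocity,
# assuming the convention ∆PSV = (PSVmax − PSVmin) / PSVmin × 100.
def delta_psv(psv_min, psv_max):
    return (psv_max - psv_min) / psv_min * 100.0

# Doppler equation: v = Δf · c / (2 · f0 · cos θ). The cos θ term shows why
# insonation angle matters: as θ approaches 90°, cos θ → 0 and the velocity
# estimate becomes unstable.
def doppler_velocity(freq_shift_hz, f0_hz, angle_deg, c=1540.0):
    """c = assumed speed of sound in soft tissue, in m/s."""
    return freq_shift_hz * c / (2.0 * f0_hz * math.cos(math.radians(angle_deg)))

print(round(delta_psv(80.0, 92.0), 1))  # → 15.0
```

Because the angle correction divides by cos θ, small angle errors near 90° produce large velocity errors, consistent with the wider standard errors the study observed at 90°.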
Subject(s)
Blood Flow Velocity/physiology , Carotid Arteries/diagnostic imaging , Ultrasonography, Doppler, Duplex/methods , Adult , Aged , Aged, 80 and over , Carotid Arteries/physiology , Carotid Stenosis/diagnosis , Carotid Stenosis/diagnostic imaging , Female , Humans , Male , Middle Aged , Point-of-Care Systems
Subject(s)
Angiotensins/metabolism , Extracorporeal Membrane Oxygenation/trends , Shock/physiopathology , Adult , Angiotensin II/metabolism , Angiotensin II/pharmacology , Angiotensins/pharmacology , Extracorporeal Membrane Oxygenation/methods , Female , Humans , Male , Middle Aged , Shock/metabolism , Shock, Cardiogenic/metabolism , Shock, Cardiogenic/physiopathology , Shock, Septic/metabolism , Shock, Septic/physiopathology , Vasoconstrictor Agents/pharmacology , Vasoconstrictor Agents/therapeutic use
ABSTRACT
OBJECTIVES: New paradigms in trauma resuscitation suggest that early reconstitution of whole-blood ratios with massive transfusion protocols (MTP) may be associated with improved survival. We performed a preliminary study of the efficacy of MTP at an urban, Level 1 trauma center and its impact on resuscitation goals. METHODS: A case-control study was performed on consecutive critically ill trauma patients over the course of 1 year. The trauma captain designated patients as either MTP activation (cases) or routine care without MTP (controls) in a matched, non-randomized fashion. Primary outcomes were: time to initial transfusion; number of total units of packed red blood cells (pRBC) and fresh frozen plasma (FFP) transfused; and the ratio of pRBC to FFP (pRBC:FFP). Secondary outcomes were in-hospital mortality and length of stay. RESULTS: Of 226 patients screened, we analyzed 58 patients meeting study criteria (32 MTP, 26 non-MTP). Study characteristics of the MTP and non-MTP groups were similar except for age (34.0 vs. 45.85 years, p=0.015). MTP patients received blood products more expeditiously (41.7 vs. 62.1 minutes, p=0.10), received more pRBC (5.19 vs. 3.08 units, p=0.05) and more FFP (0.19 vs. 0.08 units, p<0.01), and had larger pRBC:FFP ratios (1.90 vs. 0.52, p<0.01). Secondary outcomes did not differ significantly, although the MTP group showed a trend toward decreased hospital length of stay (p=0.08). CONCLUSIONS: MTP resulted in clinically significant improvements in transfusion times and volumes. Larger, randomized studies are warranted to validate these findings and optimize MTP protocols.
Subject(s)
Blood Transfusion/methods , Hemorrhage/therapy , Adult , Blood Transfusion/statistics & numerical data , Case-Control Studies , Clinical Protocols , Critical Illness , Emergency Treatment/methods , Emergency Treatment/statistics & numerical data , Erythrocyte Transfusion/statistics & numerical data , Female , Hemorrhage/mortality , Hospital Mortality , Hospitals, Urban , Humans , Length of Stay/statistics & numerical data , Male , Middle Aged , New York City/epidemiology , Plasma , Resuscitation/methods , Resuscitation/statistics & numerical data , Retrospective Studies , Trauma Centers , Wounds and Injuries/mortality , Wounds and Injuries/therapy
Subject(s)
Equipment Failure , Gastric Balloon , Nausea/etiology , Adult , Color , Female , Humans , Urine
ABSTRACT
BACKGROUND: Traumatic injury is the leading cause of mortality in the United States for patients 1 year to 44 years of age. Studies suggest that early identification of major injury leads to better outcomes for patients. Imaging, such as computed tomography (CT), is routinely used to help determine the presence of major underlying injuries. We reviewed the literature to determine whether whole-body CT (WBCT), a protocol including a noncontrast scan of the brain and neck and a contrast-enhanced scan of the chest, abdomen, and pelvis, detects more clinically significant injuries than selective scanning, as determined by mortality rates. METHODS: Scientific publications from 1980 to 2013 studying the difference between pan scan and selective scan after trauma were identified. The Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) methodology was used. Publications were categorized by level of evidence. Injury Severity Score (ISS) and pooled odds of mortality for patients who received a WBCT scan versus those who received selective scans were compared. RESULTS: Of the 465 publications identified, 7 were included, comprising 25,782 trauma patients who received CT scans following trauma. Of these patients, 52% (n = 13,477) received pan scan and 48% (n = 12,305) received selective scanning. Overall ISS was significantly higher for patients receiving WBCT than for those receiving selective scans (29.7 vs. 26.4, p < 0.001). Overall mortality rate was significantly lower for WBCT than for selective scanning (16.9%; 95% confidence interval [CI], 16.3-17.6 vs. 20.3%; 95% CI, 19.6-21.1; p < 0.0002). The pooled odds ratio for mortality was 0.75 (95% CI, 0.7-0.79), favoring WBCT. CONCLUSION: Despite having a significantly higher baseline ISS than the group who received selective scanning, the WBCT group had a lower overall mortality rate and a more favorable pooled odds ratio.
This suggests that in terms of overall mortality, WBCT scan is preferable to selective scanning in trauma patients. LEVEL OF EVIDENCE: Systematic review and meta-analysis, level III.
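The pooled odds ratio above (0.75; 95% CI, 0.7-0.79) is the usual 2×2-table quantity: the odds of death with WBCT relative to selective scanning, with a log-scale (Woolf) confidence interval. A minimal sketch with hypothetical counts, not the meta-analysis data:

```python
import math

# Hypothetical 2x2 sketch of an odds ratio for mortality with a Woolf
# (log-scale) 95% CI; the counts below are made up for illustration.
def odds_ratio_ci(deaths_a, survivors_a, deaths_b, survivors_b, z=1.96):
    """Odds ratio of group A vs. group B with a Woolf 95% CI."""
    or_ = (deaths_a * survivors_b) / (survivors_a * deaths_b)
    # Standard error of log(OR): sqrt of summed reciprocal cell counts
    se = math.sqrt(1/deaths_a + 1/survivors_a + 1/deaths_b + 1/survivors_b)
    lower = math.exp(math.log(or_) - z * se)
    upper = math.exp(math.log(or_) + z * se)
    return or_, lower, upper

# Group A = WBCT, group B = selective scan (hypothetical counts)
or_, lower, upper = odds_ratio_ci(100, 500, 130, 490)
print(round(or_, 2), round(lower, 2), round(upper, 2))
```

The interval is built on the log scale and exponentiated back, which is why odds-ratio confidence intervals are asymmetric around the point estimate.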
Subject(s)
Tomography, X-Ray Computed/methods , Wounds and Injuries/diagnostic imaging , Wounds and Injuries/mortality , Humans , Injury Severity Score
ABSTRACT
BACKGROUND: As trauma care evolves, there has been increased reliance on imaging. The purpose of this study was to examine changes in trauma imaging and radiation exposure over time. Our hypothesis was that imaging use in the management of trauma patients has increased without measurable improvements in outcomes. METHODS: A consecutive series of injured patients admitted to a Level I trauma center during a 2-month period in 2002 was compared with the same period in 2007. All computed tomography (CT) scans and plain radiographs performed for each patient were tabulated. Effective radiation dose estimates for each patient were then calculated. The outcome measures were length of stay, mortality, and missed injuries. RESULTS: The 495 patients in 2007 and 497 patients in 2002 demonstrated no significant differences in demographics, clinical data, or outcomes between groups. However, from 2002 to 2007, for blunt trauma, the mean number of CT scans per patient increased significantly (2.1 ± 1.6 vs. 3.2 ± 2.0, p < 0.001), as did plain radiographs (8.8 ± 12.9 vs. 14.9 ± 17.0, p < 0.001). For penetrating trauma, plain radiograph usage increased significantly (4.2 ± 5.3 vs. 9.1 ± 14.4, p = 0.01), with a trend toward increased CT use (0.7 ± 1.1 vs. 1.0 ± 1.6, p = 0.11). Total radiation dose estimates demonstrated significantly increased radiation exposure in 2007, for both blunt (11.5 ± 11.3 mSv vs. 20.7 ± 14.9 mSv, p < 0.05) and penetrating (2.9 ± 4.9 mSv vs. 5.4 ± 7.9 mSv, p < 0.05) trauma. CONCLUSION: From 2002 to 2007, there was a significant increase in the use of CT and plain radiographs in the management of trauma patients, leading to significantly higher radiation exposure with no demonstrable improvement in the diagnosis of missed injuries, mortality, or length of stay.