ABSTRACT
BACKGROUND: Microbiome-derived trimethylamine N-oxide (TMAO) is suspected to enhance platelet responsiveness and may accordingly be thrombophilic. The purpose of this prospective observational study was to evaluate TMAO in patients with subarachnoid hemorrhage (SAH) and compare it with a control group. A secondary aim was to investigate TMAO in the cerebrospinal fluid (CSF) of SAH patients. This should provide a better understanding of the role of TMAO in the pathogenesis of SAH and its thrombotic complications. METHODS: The study included patients with diagnosed spontaneous SAH, recruited after initial treatment on admission, and patients with nerve, nerve root, or plexus disorders serving as controls. Blood samples were collected from all patients at recruitment. Additionally, sampling of SAH patients in the intensive care unit continued daily for 14 days. CSF was collected from existing external ventricular drains whenever possible. RESULTS: Thirty-four patients diagnosed with SAH and 108 control patients participated in this study. Plasma TMAO levels at baseline were significantly lower in the SAH group (1.7 µmol/L) than in the control group (2.9 µmol/L). TMAO was detectable in the CSF (0.4 µmol/L) and was significantly lower than in plasma samples of the SAH group at baseline. Plasma and CSF TMAO levels correlated positively. TMAO levels did not differ significantly over the 15-day observation period. CONCLUSIONS: Although we assumed a priori that patients with higher TMAO levels were at higher risk for SAH, plasma TMAO levels were lower in patients with SAH than in control subjects with nerve, nerve root, or plexus disorders on admission to the hospital. A characteristic pattern of plasma TMAO levels in patients with SAH was not found.
Subject(s)
Subarachnoid Hemorrhage, Humans, Subarachnoid Hemorrhage/complications, Subarachnoid Hemorrhage/therapy, Methylamines, Prospective Studies
ABSTRACT
PURPOSE: This executive summary of a national living guideline aims to provide rapid evidence-based recommendations on the role of drug interventions in the treatment of hospitalized patients with COVID-19. METHODS: The guideline uses a systematic assessment and decision process based on an evidence-to-decision framework (GRADE), the standard recommended by the WHO (2021). Recommendations are agreed on by an interdisciplinary panel. Evidence analysis and interpretation are supported by the CEOsys project, which provides extensive literature searches and living (meta-)analyses. For this executive summary, selected key recommendations on drug therapy are presented, including the quality of the evidence and the rationale for the level of recommendation. RESULTS: The guideline contains 11 key recommendations for COVID-19 drug therapy, eight of which are based on systematic review and/or meta-analysis, while three represent consensus expert opinion. Based on current evidence, the panel makes strong recommendations for corticosteroids (WHO scale 5-9) and prophylactic anticoagulation (all hospitalized patients with COVID-19) as standard of care. Intensified anticoagulation may be considered for patients with additional risk factors for venous thromboembolism (VTE) and a low bleeding risk. The IL-6 antagonist tocilizumab may be added in case of high supplemental oxygen requirement and progressive disease (WHO scale 5-6). Treatment with neutralizing monoclonal antibodies (nMABs) may be considered for selected inpatients with an early SARS-CoV-2 infection who are not hospitalized for COVID-19. Convalescent plasma, azithromycin, ivermectin, and vitamin D3 should not be used in routine COVID-19 care. CONCLUSION: For COVID-19 drug therapy, several options are sufficiently supported by evidence. The living guideline will be updated as new evidence emerges.
Subject(s)
COVID-19, COVID-19/therapy, Hospitalization, Humans, Immunization, Passive, Practice Guidelines as Topic, SARS-CoV-2, COVID-19 Serotherapy
ABSTRACT
BACKGROUND: Acute respiratory distress syndrome (ARDS) represents the most severe course of COVID-19 (caused by the SARS-CoV-2 virus), usually resulting in a prolonged stay in an intensive care unit (ICU) and high mortality rates. Although most affected individuals need invasive mechanical ventilation (IMV), evidence on specific ventilation strategies for ARDS caused by COVID-19 is scarce. Spontaneous breathing during IMV is part of a therapeutic concept comprising light levels of sedation and the avoidance of neuromuscular blocking agents (NMBA). This approach is potentially associated with both advantages (e.g. preserved diaphragmatic motility and an optimised ventilation-perfusion ratio of the ventilated lung) and risks (e.g. a higher rate of ventilator-induced lung injury or a worsening of pulmonary oedema due to increases in transpulmonary pressure). As a consequence, spontaneous breathing in people with COVID-19-related ARDS who are receiving IMV is the subject of an ongoing debate amongst intensivists. OBJECTIVES: To assess the benefits and harms of early spontaneous breathing activity in invasively ventilated people with COVID-19-related ARDS compared to ventilation strategies that avoid spontaneous breathing. SEARCH METHODS: We searched the Cochrane COVID-19 Study Register (which includes CENTRAL, PubMed, Embase, ClinicalTrials.gov, WHO ICTRP, and medRxiv) and the WHO COVID-19 Global literature on coronavirus disease to identify completed and ongoing studies from their inception to 2 March 2022. SELECTION CRITERIA: Eligible study designs comprised randomised controlled trials (RCTs) that evaluated spontaneous breathing in participants with COVID-19-related ARDS compared to ventilation strategies that avoided spontaneous breathing (e.g. using NMBA or deep sedation levels). Additionally, we considered controlled before-after studies, interrupted time series with a comparison group, prospective cohort studies, and retrospective cohort studies.
For these non-RCT designs, we required a minimum total of 50 compared participants for inclusion. Prioritised outcomes were all-cause mortality, clinical improvement or worsening, quality of life, rate of (serious) adverse events, and rate of pneumothorax. Additional outcomes were need for tracheostomy, ICU length of stay, and duration of hospitalisation. DATA COLLECTION AND ANALYSIS: We followed the methods outlined in the Cochrane Handbook for Systematic Reviews of Interventions. Two review authors independently screened all studies at the title/abstract and full-text stages. We also planned to conduct data extraction and risk of bias assessment in duplicate. We planned to conduct a meta-analysis for each prioritised outcome, as well as subgroup analyses of mortality by severity of oxygenation impairment and duration of ARDS. In addition, we planned to perform sensitivity analyses for studies at high risk of bias, for studies using NMBA in addition to deep sedation to avoid spontaneous breathing, and for a comparison of preprints versus peer-reviewed articles. We planned to assess the certainty of evidence using the GRADE approach. MAIN RESULTS: We identified no eligible studies for this review. AUTHORS' CONCLUSIONS: We found no direct evidence on whether early spontaneous breathing in SARS-CoV-2-induced ARDS is beneficial or detrimental to this particular group of patients. RCTs comparing early spontaneous breathing with ventilatory strategies that do not allow spontaneous breathing in SARS-CoV-2-induced ARDS are necessary to determine its value within the treatment of severely ill people with COVID-19. Additionally, studies should aim to clarify whether treatment effects differ between people with SARS-CoV-2-induced ARDS and people with non-SARS-CoV-2-induced ARDS.
Subject(s)
COVID-19, Respiratory Distress Syndrome, COVID-19/complications, Humans, Neuromuscular Blocking Agents, Respiration, Artificial, Respiratory Distress Syndrome/virology, SARS-CoV-2, Systematic Reviews as Topic
ABSTRACT
BACKGROUND: With potential antiviral and anti-inflammatory properties, Janus kinase (JAK) inhibitors represent a potential treatment for symptomatic severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) infection. They may modulate the exuberant immune response to SARS-CoV-2 infection. Furthermore, a direct antiviral effect has been described. An understanding of the current evidence regarding the efficacy and safety of JAK inhibitors as a treatment for coronavirus disease 2019 (COVID-19) is required. OBJECTIVES: To assess the effects of systemic JAK inhibitors plus standard of care compared to standard of care alone (plus/minus placebo) on clinical outcomes in individuals (outpatient or in-hospital) with any severity of COVID-19, and to maintain the currency of the evidence using a living systematic review approach. SEARCH METHODS: We searched the Cochrane COVID-19 Study Register (comprising MEDLINE, Embase, ClinicalTrials.gov, World Health Organization (WHO) International Clinical Trials Registry Platform, medRxiv, and Cochrane Central Register of Controlled Trials), Web of Science, WHO COVID-19 Global literature on coronavirus disease, and the US Department of Veterans Affairs Evidence Synthesis Program (VA ESP) Covid-19 Evidence Reviews to identify studies up to February 2022. We monitor newly published randomised controlled trials (RCTs) weekly using the Cochrane COVID-19 Study Register, and have incorporated all new trials from this source until the first week of April 2022. SELECTION CRITERIA: We included RCTs that compared systemic JAK inhibitors plus standard of care to standard of care alone (plus/minus placebo) for the treatment of individuals with COVID-19. We used the WHO definitions of illness severity for COVID-19. DATA COLLECTION AND ANALYSIS: We assessed risk of bias of primary outcomes using Cochrane's Risk of Bias 2 (RoB 2) tool. 
We used GRADE to rate the certainty of evidence for the following primary outcomes: all-cause mortality (up to day 28), all-cause mortality (up to day 60), improvement in clinical status: alive and without need for in-hospital medical care (up to day 28), worsening of clinical status: new need for invasive mechanical ventilation or death (up to day 28), adverse events (any grade), serious adverse events, and secondary infections. MAIN RESULTS: We included six RCTs with 11,145 participants investigating systemic JAK inhibitors plus standard of care compared to standard of care alone (plus/minus placebo). Standard of care followed local protocols and included the application of glucocorticoids (five studies reported their use in a range of 70% to 95% of their participants; one study restricted glucocorticoid use to non-COVID-19-specific indications), antibiotic agents, anticoagulants, and antiviral agents, as well as non-pharmaceutical procedures. At study entry, about 65% of participants required low-flow oxygen, about 23% required high-flow oxygen or non-invasive ventilation, about 8% did not need any respiratory support, and only about 4% were intubated. We also identified 13 ongoing studies and nine completed or terminated studies whose classification is pending. Individuals with moderate to severe disease: Four studies investigated the single agent baricitinib (10,815 participants), one tofacitinib (289 participants), and one ruxolitinib (41 participants).
Systemic JAK inhibitors probably decrease all-cause mortality at up to day 28 (95 of 1000 participants in the intervention group versus 131 of 1000 participants in the control group; risk ratio (RR) 0.72, 95% confidence interval (CI) 0.57 to 0.91; 6 studies, 11,145 participants; moderate-certainty evidence), and decrease all-cause mortality at up to day 60 (125 of 1000 participants in the intervention group versus 181 of 1000 participants in the control group; RR 0.69, 95% CI 0.56 to 0.86; 2 studies, 1626 participants; high-certainty evidence). Systemic JAK inhibitors probably make little or no difference in improvement in clinical status (discharged alive or hospitalised, but no longer requiring ongoing medical care) (801 of 1000 participants in the intervention group versus 778 of 1000 participants in the control group; RR 1.03, 95% CI 1.00 to 1.06; 4 studies, 10,802 participants; moderate-certainty evidence). They probably decrease the risk of worsening of clinical status (new need for invasive mechanical ventilation or death at day 28) (154 of 1000 participants in the intervention group versus 172 of 1000 participants in the control group; RR 0.90, 95% CI 0.82 to 0.98; 2 studies, 9417 participants; moderate-certainty evidence). Systemic JAK inhibitors probably make little or no difference in the rate of adverse events (any grade) (427 of 1000 participants in the intervention group versus 441 of 1000 participants in the control group; RR 0.97, 95% CI 0.88 to 1.08; 3 studies, 1885 participants; moderate-certainty evidence), and probably decrease the occurrence of serious adverse events (160 of 1000 participants in the intervention group versus 202 of 1000 participants in the control group; RR 0.79, 95% CI 0.68 to 0.92; 4 studies, 2901 participants; moderate-certainty evidence). 
JAK inhibitors may make little or no difference to the rate of secondary infection (111 of 1000 participants in the intervention group versus 113 of 1000 participants in the control group; RR 0.98, 95% CI 0.89 to 1.09; 4 studies, 10,041 participants; low-certainty evidence). Subgroup analysis by severity of COVID-19 disease or type of JAK inhibitor did not identify specific subgroups which benefit more or less from systemic JAK inhibitors. Individuals with asymptomatic or mild disease: We did not identify any trial for this population. AUTHORS' CONCLUSIONS: In hospitalised individuals with moderate to severe COVID-19, moderate-certainty evidence shows that systemic JAK inhibitors probably decrease all-cause mortality. Baricitinib was the most often evaluated JAK inhibitor. Moderate-certainty evidence suggests that they probably make little or no difference in improvement in clinical status. Moderate-certainty evidence indicates that systemic JAK inhibitors probably decrease the risk of worsening of clinical status and make little or no difference in the rate of adverse events of any grade, whilst they probably decrease the occurrence of serious adverse events. Based on low-certainty evidence, JAK inhibitors may make little or no difference in the rate of secondary infection. Subgroup analysis by severity of COVID-19 or type of agent failed to identify specific subgroups which benefit more or less from systemic JAK inhibitors. Currently, there is no evidence on the efficacy and safety of systemic JAK inhibitors for individuals with asymptomatic or mild disease (non-hospitalised individuals).
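As an aside for readers unfamiliar with the statistics quoted above, a risk ratio and its Wald-type confidence interval can be computed directly from 2x2-table counts. The sketch below is illustrative only: the function name is our own, and the review's pooled estimates (e.g. RR 0.72, 95% CI 0.57 to 0.91 for 28-day mortality) come from a meta-analysis, not from this naive single-table calculation, although the per-1000 absolute risks reported above yield similar numbers.

```python
from math import exp, log, sqrt

def risk_ratio(events_a: int, n_a: int, events_b: int, n_b: int, z: float = 1.96):
    """Risk ratio of group A vs. group B with a Wald-type 95% CI:
    exp(log(RR) +/- z * SE), with SE from the standard 2x2-table formula.
    Assumes non-zero event counts in both groups."""
    rr = (events_a / n_a) / (events_b / n_b)
    se = sqrt(1 / events_a - 1 / n_a + 1 / events_b - 1 / n_b)
    return rr, exp(log(rr) - z * se), exp(log(rr) + z * se)

# Illustrative per-1000 absolute risks for 28-day mortality
# (JAK inhibitors vs. control) from the abstract above.
rr, lo, hi = risk_ratio(95, 1000, 131, 1000)
print(round(rr, 2), round(lo, 2), round(hi, 2))  # → 0.73 0.57 0.93
```

The naive interval (0.57 to 0.93) lands close to the pooled meta-analytic CI (0.57 to 0.91), as expected when one study dominates the weight.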
Subject(s)
COVID-19 Drug Treatment, Coinfection, Janus Kinase Inhibitors, Antiviral Agents/therapeutic use, Humans, Janus Kinase Inhibitors/therapeutic use, Oxygen, Randomized Controlled Trials as Topic, SARS-CoV-2, United States
ABSTRACT
For patients with acute respiratory insufficiency, mechanical ("invasive") ventilation is a fundamental therapeutic measure to ensure sufficient gas exchange. Despite decades of strong research efforts, central questions on mechanical ventilation therapy remain incompletely answered. As a result, many different ventilation modes and settings have been used in daily clinical practice without a scientifically sound basis. At the same time, implementation of the few evidence-based therapeutic concepts (e.g., "lung protective ventilation") into clinical practice is still insufficient. The aim of our guideline project "Mechanical ventilation and extracorporeal gas exchange in acute respiratory insufficiency" was to develop an evidence-based decision aid for treating patients with mechanical ventilation. It covers the whole pathway of invasively ventilated patients (including indications for mechanical ventilation, ventilator settings, additional and rescue therapies, and liberation from mechanical ventilation). To assess the quality of scientific evidence and subsequently derive recommendations, we applied the Grading of Recommendations, Assessment, Development and Evaluation method. For the first time, using this globally accepted methodological standard, our guideline contains recommendations on mechanical ventilation therapy not only for acute respiratory distress syndrome patients but for all types of acute respiratory insufficiency. This review presents the two main chapters of the guideline, on choosing the mode of mechanical ventilation and setting its parameters. The guideline group's intention is that, through thorough implementation of the recommendations, critical care teams can further improve the quality of care for patients suffering from acute respiratory insufficiency. By identifying relevant gaps in scientific evidence, the guideline group also intends to support the development of important research projects.
Subject(s)
Extracorporeal Membrane Oxygenation, Respiration, Artificial/methods, Respiratory Distress Syndrome/therapy, Respiratory Insufficiency/therapy, Humans
ABSTRACT
INTRODUCTION: Hemoglobin-based oxygen carriers (HBOC) have been developed as an alternative to blood transfusions. Owing to their nitric-oxide-scavenging properties, HBOCs also induce vasoconstriction. In acute lung injury, an excess of nitric oxide results in general vasodilation, reducing oxygenation by impairing hypoxic pulmonary vasoconstriction. Inhaled nitric oxide (iNO) is used to correct the resulting ventilation-perfusion mismatch. We hypothesized that the additional use of an HBOC might increase this effect. In a rodent model of acute respiratory distress syndrome (ARDS), we evaluated the combined effect of an HBOC and iNO on vascular tone and gas exchange. METHODS: ARDS was induced in anaesthetized Wistar rats by saline lavage and aggressive ventilation. Two groups received either hydroxyethyl starch 10% (HES; n = 10) or the HBOC hemoglobin glutamer-200 (HBOC-200; n = 10) via a central venous infusion. Additionally, both groups received iNO. Right ventricular pressure (RVP) and mean arterial pressure (MAP) were monitored with microtip transducers. Arterial oxygenation was measured via arterial blood gas analyses. RESULTS: Application of HBOC-200 led to a significant increase in MAP and RVP compared to baseline and to the HES group. This effect was reversed by iNO. The application of HBOC and iNO had no effect on arterial oxygenation over time, and no difference in arterial oxygenation was found between the groups. CONCLUSION: Application of the HBOC led to an increase in systemic and pulmonary vascular resistance in this animal model of ARDS. The increase in RVP was reversed by iNO. Pulmonary vasoconstriction by hemoglobin glutamer-200 in combination with iNO did not improve arterial oxygenation in ARDS.
Subject(s)
Hemoglobins/administration & dosage, Nitric Oxide/administration & dosage, Oxygen/metabolism, Respiratory Distress Syndrome/therapy, Administration, Inhalation, Animals, Arterial Pressure/physiology, Blood Substitutes/administration & dosage, Disease Models, Animal, Hydroxyethyl Starch Derivatives/administration & dosage, Male, Pulmonary Gas Exchange/physiology, Rats, Rats, Wistar, Respiratory Distress Syndrome/physiopathology, Vasoconstriction/physiology, Ventricular Pressure/physiology
ABSTRACT
Recommendations concerning the management of hemoglobin levels and hematocrit in patients on extracorporeal membrane oxygenation (ECMO) still advise maintenance of a normal hematocrit. In contrast, current transfusion guidelines for critically ill patients support restrictive transfusion practice. We report on a series of patients receiving venovenous ECMO (vvECMO) for acute respiratory distress syndrome (ARDS) treated according to the restrictive transfusion regimen recommended for critically ill patients. We retrospectively analyzed 18 patients receiving vvECMO due to severe ARDS. Hemoglobin concentrations were kept between 7 and 9 g/dL, with a transfusion trigger at 7 g/dL or when physiological transfusion triggers were apparent. We assessed baseline data, hospital mortality, time on ECMO, hemoglobin levels, hematocrit, quantities of packed red blood cells (PRBCs) received, and lactate concentrations, and compared survivors and nonsurvivors. The overall mortality of all patients on vvECMO was 38.9%. Mean hemoglobin concentration over all patients and ECMO days was 8.30 ± 0.51 g/dL, and hematocrit was 0.25 ± 0.01, with no difference between survivors and nonsurvivors. The mean number of PRBC units given showed a trend towards higher quantities in the group of nonsurvivors, but the difference was not significant (1.97 ± 1.47 vs. 0.96 ± 0.76 units; P = 0.07). Mean lactate clearance from the first to the third day was 45.4 ± 28.3%, with no significant difference between survivors and nonsurvivors (P = 0.19). In our cohort of patients treated with ECMO due to severe ARDS, the application of a restrictive transfusion protocol did not result in increased mortality. The safety and feasibility of a restrictive transfusion protocol in patients on ECMO must be further evaluated in randomized controlled trials.
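As a side note, the lactate clearance figure cited above follows the standard percentage formula, (initial - later) / initial x 100. A minimal sketch (the function name and the example values of 4.0 and 2.2 mmol/L are hypothetical, chosen only to land near the reported cohort mean of about 45%):

```python
def lactate_clearance(initial_mmol_l: float, later_mmol_l: float) -> float:
    """Percentage lactate clearance between two time points:
    (initial - later) / initial * 100; negative values indicate a rise."""
    return (initial_mmol_l - later_mmol_l) / initial_mmol_l * 100.0

# Hypothetical example: a fall from 4.0 to 2.2 mmol/L over two days.
print(round(lactate_clearance(4.0, 2.2), 1))  # → 45.0
```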
Subject(s)
Erythrocyte Transfusion, Extracorporeal Membrane Oxygenation, Respiratory Distress Syndrome/surgery, Severe Acute Respiratory Syndrome/surgery, Adolescent, Adult, Biomarkers/blood, Critical Illness, Erythrocyte Transfusion/adverse effects, Erythrocyte Transfusion/mortality, Extracorporeal Membrane Oxygenation/adverse effects, Extracorporeal Membrane Oxygenation/mortality, Female, Hematocrit, Hemoglobins/metabolism, Hospital Mortality, Humans, Lactic Acid/blood, Male, Middle Aged, Predictive Value of Tests, Respiratory Distress Syndrome/blood, Respiratory Distress Syndrome/diagnosis, Respiratory Distress Syndrome/mortality, Retrospective Studies, Risk Factors, Severe Acute Respiratory Syndrome/blood, Severe Acute Respiratory Syndrome/diagnosis, Severe Acute Respiratory Syndrome/mortality, Time Factors, Treatment Outcome, Young Adult
ABSTRACT
BACKGROUND: Many missions in the prehospital emergency services appear to be triggered by false indications as defined by the Federal State Rescue Act. These emergency calls are often a result of, or associated with, social issues. Emergency rescue personnel are confronted with social problems and often feel left alone in dealing with them. AIM: This article aims to improve emergency service personnel's understanding of the associations between social problems and health. Solution strategies for frequent social emergencies are described. MATERIAL AND METHODS: This article demonstrates the associations between socioeconomic status, health, and disease. Typical indications for missions in which social aspects play an important role are presented, and solution strategies are suggested. A discussion is presented on how to deal with cases of child abuse and domestic violence. Three classical psychiatric problem areas with common social components are explained: psychomotor states of agitation, suicide, and alcohol-associated incidents; special attention is paid to danger to third parties and aggressive patients. In addition to the treatment of medical conditions, social problems play an important role, particularly for elderly and chronically ill patients. RESULTS AND CONCLUSION: Emergency personnel have only limited options for dealing with such problems; however, it is important to be aware of regional structures and non-medical organizations that might help in such situations. These include social services, youth welfare services, crisis intervention teams, and social psychiatric services.
Subject(s)
Emergency Medical Services, Rescue Work, Social Behavior, Adult, Aged, Child, Child Abuse, Crisis Intervention, Domestic Violence, Health Personnel, Health Status, Humans, Mental Disorders/complications, Mental Disorders/psychology, Middle Aged, Social Class, Social Problems
ABSTRACT
The aim of the present study was to characterize a murine model of acute respiratory distress syndrome (ARDS) abiding by the Berlin definition of human ARDS and the guidelines for animal models of ARDS. To this end, C57BL/6NCrl mice were challenged with lipopolysaccharide (LPS; 15 mg/kg, i.p.) followed 18 h later by injection of oleic acid (OA; 0.12 mL/kg, i.v.). Controls received saline injections at both time points. Haemodynamics were monitored continuously. Arterial blood gas analyses were performed just before and every 30 min after OA challenge. Ninety minutes after OA challenge, the chests of the mice were scanned using micro-computed tomography (CT). Cytokine concentrations were measured in plasma samples. Lungs were harvested 90 min after OA challenge for histology, immunohistochemistry, lung weight measurements, and tissue cytokine detection. A histological lung injury score was determined. Eighteen hours after LPS challenge, mice exhibited a severe systemic inflammatory response syndrome. Oxygenation declined significantly after OA injection (PaO2/FiO2 283 ± 73 and 256 ± 71 mmHg at 60 and 90 min, respectively; P < 0.001). Bilateral patchy infiltrates were present on the micro-CT scans. Histology revealed parenchymal damage with accumulation of polymorphonuclear neutrophils, intra-alveolar proteinaceous debris, and few hyaline membranes. The lung wet:dry ratio indicated damage to the alveolar-capillary membrane. Cytokine patterns evidenced a severe local and systemic inflammatory state in plasma and lung tissue. In conclusion, the described two-hit model of ARDS shows a pathological picture closely mimicking human ARDS according to the Berlin definition and may facilitate interpretation of prospective experimental results.
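For readers unfamiliar with the oxygenation criterion, the PaO2/FiO2 values above map onto the Berlin definition's severity strata. A minimal sketch (function names are our own; the Berlin definition strictly also requires PEEP or CPAP of at least 5 cmH2O plus timing and imaging criteria, which this one-dimensional classifier ignores):

```python
def pf_ratio(pao2_mmhg: float, fio2: float) -> float:
    """PaO2/FiO2 (Horowitz) ratio; FiO2 given as a fraction (0.21 to 1.0)."""
    return pao2_mmhg / fio2

def berlin_severity(pf: float) -> str:
    """ARDS severity per the Berlin definition's oxygenation strata:
    severe <= 100 mmHg, moderate 100-200 mmHg, mild 200-300 mmHg."""
    if pf <= 100:
        return "severe"
    if pf <= 200:
        return "moderate"
    if pf <= 300:
        return "mild"
    return "no ARDS by oxygenation criterion"

# The murine model reached PaO2/FiO2 of about 283 mmHg at 60 min after
# oleic acid, i.e. the 'mild' Berlin oxygenation range.
print(berlin_severity(283))  # → mild
```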
Subject(s)
Lung/pathology, Respiratory Distress Syndrome/pathology, Animals, Cytokines/blood, Disease Models, Animal, Hemodynamics/drug effects, Lipopolysaccharides/pharmacology, Lung/drug effects, Male, Mice, Mice, Inbred C57BL, Neutrophils/drug effects, Oleic Acid/pharmacology, Pulmonary Edema/blood, Pulmonary Edema/pathology, Respiratory Distress Syndrome/blood, Systemic Inflammatory Response Syndrome/blood, Systemic Inflammatory Response Syndrome/pathology, X-Ray Microtomography/methods
ABSTRACT
Background: Sleep deprivation and disturbances in circadian rhythms may hinder surgical performance and decision-making capabilities. Solid organ transplantations, which are technically demanding and often begin at uncertain times, frequently during nighttime hours, are particularly susceptible to these effects. This study aimed to assess how transplant operations conducted during the daytime versus the nighttime influence patient and graft outcomes and function. Methods: Simultaneous pancreas-kidney transplants (SPKTs) conducted at the University Hospital of Leipzig from 1998 to 2018 were reviewed retrospectively. The transplants were categorized based on whether they began during daytime hours (8 a.m. to 6 p.m.) or nighttime hours (6 p.m. to 8 a.m.). We analyzed the demographics of both donors and recipients, as well as primary outcomes, which included surgical complications, patient survival, and graft longevity. Results: Of the 105 patients included, 43 SPKTs (41%) took place during the daytime, while 62 transplants (59%) occurred at night. The characteristics of donors and recipients were similar across the two groups. The rates of (surgical) pancreas graft-related complications and reoperations (daytime 39.5% versus nighttime 33.9%; p = 0.552) did not differ significantly between the groups. The five-year patient survival rate was comparable for daytime and nighttime surgeries, with 85.2% for daytime and 86% for nighttime procedures (p = 0.816). Similarly, survival rates for pancreas grafts were 75% for daytime and 77% for nighttime operations (p = 0.912), and for kidney grafts, 76% during the day compared to 80% at night (p = 0.740), indicating no significant difference between the two time periods.
In a multivariable model, recipient BMI > 30 kg/m2, donor age, donor BMI, and cold ischemia time > 15 h were independent predictors of an increased risk of (surgical) pancreas graft-related complications, whereas the time point of SPKT (daytime versus nighttime) had no impact. Conclusions: The findings of our retrospective analysis at a large single German transplant center indicate that SPKT is a reliable procedure regardless of the start time. Additionally, our data revealed that patients undergoing nighttime transplants have no greater risk of surgical complications and no inferior results concerning long-term patient and graft survival. However, due to the small number of cases evaluated, further studies are required to confirm these results.
ABSTRACT
OBJECTIVES: We present the 'COVID-19 evidence ecosystem' (CEOsys) as a German network to inform pandemic management and to support clinical and public health decision-making. We discuss the challenges faced when organizing the ecosystem and derive lessons learned for similar networks acting during pandemics or health-related crises. STUDY DESIGN AND SETTING: CEOsys brought together 18 university hospitals and additional institutions; its key activities included research prioritization, conducting living systematic reviews (LSRs), supporting evidence-based (living) guidelines, knowledge translation (KT), detecting research gaps, and deriving recommendations, backed by technical infrastructure and capacity building. RESULTS: CEOsys rapidly produced 31 high-quality evidence syntheses and supported three living guidelines on COVID-19-related topics, while also developing methodological procedures. Challenges included CEOsys' late initiation relative to the pandemic outbreak, the delayed prioritization of research questions, the continuously evolving COVID-19-related evidence, and establishing a technical infrastructure. Methodological-clinical tandems, cooperation with national guideline groups, and international collaborations were key to efficiency. CONCLUSION: CEOsys provided a proof of concept for a functioning evidence ecosystem at the national level. Lessons learned include that similar networks should, among other things, involve key methodological and clinical stakeholders early on, aim for (inter)national collaborations, and systematically evaluate their value. We particularly call for a sustainable network.
Subject(s)
COVID-19, Pandemics, Humans, COVID-19/epidemiology, Germany, Evidence-Based Medicine, SARS-CoV-2, Practice Guidelines as Topic
ABSTRACT
PURPOSE: [(11)C]DASB is currently the most frequently used highly selective radiotracer for visualization and quantification of central SERT. Its use, however, is hampered by the short half-life of (11)C, the moderate cortical test-retest reliability, and the inability to quantify endogenous serotonin. Labelling with (18)F in principle allows longer acquisition times for kinetic analysis in brain tissue and may provide higher sensitivity. The aim of our study was to use the new highly SERT-selective (18)F-labelled fluoromethyl analogue of (+)-McN5652 ((+)-[(18)F]FMe-McN5652) in humans for the first time and to evaluate its potential for SERT quantification. METHODS: PET data from five healthy volunteers (three men, two women, age 39 ± 10 years), coregistered with individual MRI scans, were semiquantitatively assessed by volume-of-interest analysis using the software package PMOD. Rate constants and total distribution volumes (V(T)) were calculated using a two-tissue compartment model, with arterial input function measurements corrected using metabolite/plasma data. Standardized uptake region-to-cerebellum ratios as a measure of specific radiotracer accumulation were compared with those of a [(11)C]DASB PET dataset from 21 healthy subjects (10 men, 11 women, age 38 ± 8 years). RESULTS: The two-tissue compartment model provided adequate fits to the data. Estimates of V(T) demonstrated good identifiability based on the coefficients of variation (COV) for the volumes of interest in SERT-rich and cortical areas (COV of V(T) < 10%). Compared with [(11)C]DASB PET, there was a tendency towards lower mean uptake values with (+)-[(18)F]FMe-McN5652 PET; however, the standard deviation was also somewhat lower. Altogether, cerebral (+)-[(18)F]FMe-McN5652 uptake corresponded well with the known SERT distribution in humans. CONCLUSION: The results showed that (+)-[(18)F]FMe-McN5652 is also suitable for in vivo quantification of SERT with PET.
Because of the long half-life of (18)F, the widespread use within a satellite concept seems feasible.
Asunto(s)
Encéfalo/diagnóstico por imagen , Encéfalo/metabolismo , Radioisótopos de Flúor , Isoquinolinas/química , Tomografía de Emisión de Positrones/métodos , Proteínas de Transporte de Serotonina en la Membrana Plasmática/metabolismo , Adulto , Sitios de Unión , Transporte Biológico , Proteínas Sanguíneas/metabolismo , Femenino , Humanos , Isoquinolinas/efectos adversos , Isoquinolinas/metabolismo , Masculino , Tomografía de Emisión de Positrones/efectos adversos , Tomografía de Emisión de Positrones/normas , Estándares de Referencia , SeguridadRESUMEN
Perfluorocarbons are oxygen-carrying, dense liquids initially intended for use in partial or total liquid ventilation of patients with severe acute respiratory distress syndrome, but they did not show beneficial effects in clinical studies. However, perfluorocarbons may be used for lung lavage in severe alveolar proteinosis. In acute respiratory distress syndrome, oxygenation may be so severely compromised that the use of nonoxygenated perfluorocarbons may not be possible. We report a case of severe, nonresolving acute respiratory distress syndrome treated with extracorporeal membrane oxygenation to secure oxygenation, using perfluorocarbon in a single instillation to aid the clearance of debris and proteinaceous edema.
Asunto(s)
Oxigenación por Membrana Extracorpórea , Fluorocarburos , Ventilación Líquida , Síndrome de Dificultad Respiratoria , Fluorocarburos/uso terapéutico , Humanos , Respiración Artificial , Síndrome de Dificultad Respiratoria/terapiaRESUMEN
Objective: Due to the high prevalence and incidence of cardio- and cerebrovascular diseases among dialysis-dependent patients with end-stage renal disease (ESRD) scheduled for kidney transplantation (KT), the use of antiplatelet therapy (APT) and/or anticoagulant drugs in this patient population is common. However, these patients carry a high risk of complications, due either to thromboembolic or to bleeding events, which makes adequate peri- and post-transplant anticoagulation management challenging. Predictive clinical models, such as the HAS-BLED score developed for predicting major bleeding events in patients under anticoagulation therapy, could be helpful tools for optimizing antithrombotic management and could reduce peri- and postoperative morbidity and mortality. Methods: Data from 204 patients undergoing kidney transplantation (KT) between 2011 and 2018 at the University Hospital Leipzig were retrospectively analyzed. Patients were stratified postoperatively into a prophylaxis group (group A: patients without pretransplant anticoagulation/antiplatelet therapy who received postoperative heparin in prophylactic doses) and a (sub)therapeutic group (group B: patients whose pretransplant antithrombotic medication was continued (sub)therapeutically after transplantation). The primary outcome was the incidence of postoperative bleeding events, which was evaluated for a possible association with the use of antithrombotic therapy. Secondary analyses were conducted for the associations of other potential risk factors, specifically the HAS-BLED score, with allograft outcome. Univariate and multivariate logistic regression as well as a Cox proportional hazards model were used to identify risk factors for long-term allograft function, outcome and survival. The calibration and prognostic accuracy of the risk models were evaluated using the Hosmer-Lemeshow test (HLT) and the area under the receiver operating characteristic curve (AUC).
Results: In total, 94 of 204 (47%) patients received (sub)therapeutic antithrombotic therapy after transplantation and 108 (53%) patients received prophylactic antithrombotic therapy. A total of 61 (29%) patients showed signs of postoperative bleeding. The incidence (p < 0.01) and timepoint of bleeding (p < 0.01) varied significantly between the antithrombotic treatment groups. In multivariate analyses, pre-existing cardiovascular disease (CVD) (OR 2.89 (95% CI: 1.02−8.21); p = 0.04), procedure-specific complications (blood loss (OR 1.03 (95% CI: 1.0−1.05); p = 0.014), Clavien−Dindo classification > grade II (OR 1.03 (95% CI: 1.0−1.05); p = 0.018)), HAS-BLED score (OR 1.49 (95% CI: 1.08−2.07); p = 0.018), vitamin K antagonists (VKA) (OR 5.89 (95% CI: 1.10−31.28); p = 0.037), the combination of APT and therapeutic heparin (OR 5.44 (95% CI: 1.33−22.31); p = 0.018) and postoperative therapeutic heparin (OR 3.37 (95% CI: 1.37−8.26); p < 0.01) were independently associated with an increased risk of bleeding. The intraoperative use of heparin, prior antiplatelet therapy and APT combined with prophylactic heparin were not associated with an increased bleeding risk. A higher recipient body mass index (BMI) (OR 0.32 per 10 kg/m2 increase in BMI (95% CI: 0.12−0.91); p = 0.023) and living-donor KT (OR 0.43 (95% CI: 0.18−0.94); p = 0.036) were associated with a decreased risk of bleeding. Regarding bleeding events and graft failure, the HAS-BLED risk model demonstrated good calibration (bleeding and graft failure: HLT chi-square: 4.572, p = 0.802, versus chi-square: 6.52, p = 0.18, respectively) and moderate predictive performance (bleeding AUC: 0.72 (0.63−0.79); graft failure AUC: 0.7 (0.6−0.78)). Conclusions: In the current study, the HAS-BLED risk score proved to be a helpful tool with acceptable predictive accuracy regarding bleeding events and graft failure following KT.
Intensified monitoring and precise stratification/assessment of bleeding risk factors may help to identify patients at a higher risk of bleeding, improve individualized anticoagulation decisions, and guide the choice of antithrombotic therapy in order to optimize outcomes after kidney transplantation.
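The HAS-BLED score used in this study is a simple additive point score. As an illustration, a minimal Python sketch is given below; the criteria follow the published score, but the argument names are illustrative and are not taken from the study's dataset.

```python
# Minimal sketch of HAS-BLED scoring: one point for each risk factor present.
# Published criteria: Hypertension, Abnormal renal function, Abnormal liver
# function, Stroke history, Bleeding history or predisposition, Labile INR,
# Elderly (age > 65), Drugs (e.g. antiplatelets/NSAIDs), Alcohol excess.
# Argument names are illustrative, not taken from the study's data fields.

def has_bled(hypertension, abnormal_renal, abnormal_liver, stroke,
             bleeding_history, labile_inr, age_over_65, drugs, alcohol):
    """Return the HAS-BLED score (0-9); a score >= 3 is conventionally
    regarded as indicating high bleeding risk."""
    return sum(map(bool, [hypertension, abnormal_renal, abnormal_liver,
                          stroke, bleeding_history, labile_inr,
                          age_over_65, drugs, alcohol]))
```

For example, a hypertensive patient over 65 with renal impairment and a prior bleed would score 4.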
RESUMEN
Objectives: Adequate organ perfusion, as well as appropriate blood pressure levels at the time of unclamping, is crucial for early and long-term graft function and outcome in simultaneous pancreas−kidney transplantation (SPKT). However, the optimal intraoperative mean arterial pressure (MAP) level has not been well defined. Methods: From a prospectively collected database, the medical data of 105 patients undergoing SPKT at our center were retrospectively analyzed. A receiver operating characteristic (ROC) analysis was performed to determine the optimal cut-off value of MAP at reperfusion for predicting early pancreatic graft function. Based on these results, we divided the patients according to their MAP values at reperfusion into a <91 mmHg group (n = 47 patients) and a >91 mmHg group (n = 58 patients). Clinicopathological characteristics and outcomes, as well as early graft function and long-term survival, were retrospectively analyzed. Results: Donor and recipient characteristics were comparable between both groups. Rates of postoperative complications were significantly higher in the <91 mmHg group than in the >91 mmHg group (vascular thrombosis of the pancreas: 7 (14%) versus 2 (3%); p = 0.03; pancreatitis/intraabdominal abscess: 10 (21%) versus 4 (7%); p = 0.03; renal delayed graft function (DGF): 11 (23%) versus 5 (9%); p = 0.03; postreperfusion urine output: 106 ± 50 mL versus 195 ± 45 mL; p = 0.04). There were no significant differences in intraoperative volume repletion, central venous pressure (CVP), use of vasoactive inotropic agents, or metabolic outcome. Five-year pancreas graft survival was significantly higher in the >91 mmHg group (>91 mmHg: 82% versus <91 mmHg: 61%; p < 0.01). No significant differences were observed in patient and kidney graft survival at 5 years between the groups.
Multivariate Cox regression analysis confirmed MAP < 91 mmHg as an independent prognostic predictor of renal DGF (HR 3.49, 95% CI: 1.1−10.8, p = 0.03) and pancreas allograft failure (HR 2.26, 95% CI: 1.0−4.8, p = 0.01). Conclusions: A MAP > 91 mmHg at the time of reperfusion was associated with a reduced rate of postoperative complications and with better long-term graft function and outcome, and thus with increased long-term survival in SPKT recipients.
RESUMEN
BACKGROUND: Despite recent advances in surgical procedures and immunosuppressive regimens, early pancreatic graft dysfunction, mainly specified as ischemia-reperfusion injury (IRI), remains a common cause of pancreas graft failure with potentially worse outcomes in simultaneous pancreas-kidney transplantation (SPKT). Anesthetic conditioning is a widely described strategy to attenuate IRI and facilitate graft protection. Here, we investigate the effects of different volatile anesthetics (VAs) on early IRI-associated posttransplant clinical outcomes as well as graft function and outcome in SPKT recipients. METHODS: Medical data of 105 patients undergoing SPKT between 1998 and 2018 were retrospectively analyzed and stratified according to the VA used. The primary study endpoint was the association and effect of VAs on pancreas allograft failure following SPKT; secondary endpoint analyses included IRI-associated posttransplant clinical outcomes as well as long-term graft function and outcome. Additionally, peak serum levels of C-reactive protein (CRP) and lipase during the first 72 h after SPKT were determined and used as further markers of pancreatic IRI and graft injury. Typical clinicopathological characteristics and postoperative outcomes such as early graft outcome and long-term function were analyzed. RESULTS: Of the 105 patients included in this study, three VAs were used: isoflurane (n = 58 patients; 55%), sevoflurane (n = 22 patients; 21%), and desflurane (n = 25 patients, 24%). Donor and recipient characteristics were comparable among the three groups. Early graft loss within 3 months (24% versus 5% versus 8%, p = 0.04) as well as IRI-associated postoperative clinical complications (pancreatitis: 21% versus 5% versus 5%, p = 0.04; vascular thrombosis: 13% versus 0% versus 5%; p = 0.09) occurred more frequently in the isoflurane group than in the sevoflurane and desflurane groups.
Anesthesia with sevoflurane resulted in the lowest serum peak levels of lipase and CRP during the first 3 days after transplantation, followed by desflurane and isoflurane (p = 0.039 and p = 0.001, respectively). There was no difference in 10-year pancreas graft survival or endocrine/metabolic function among the three VA groups. Multivariate analysis revealed the choice of VA as an independent prognostic factor for graft failure three months after SPKT (HR 0.38, 95% CI: 0.17-0.84; p = 0.029). CONCLUSIONS: In our study, sevoflurane and desflurane were associated with significantly better early graft survival and fewer IRI-associated post-transplant complications compared with isoflurane, and they should be the focus of future clinical studies evaluating the effects of different VA agents in patients receiving SPKT.
RESUMEN
Background: Despite recent advances and refinements in the perioperative management of simultaneous pancreas−kidney transplantation (SPKT), early pancreatic graft dysfunction (ePGD) remains a critical problem with serious impairment of early and long-term graft function and outcome. Hence, we evaluated a panel of classical blood serum markers for their value in predicting early graft dysfunction in patients undergoing SPKT. Methods: From a prospectively collected database, the medical data of 105 patients undergoing SPKT between 1998 and 2018 at our center were retrospectively analyzed. The primary study outcome was the occurrence of early pancreatic graft dysfunction (ePGD); the secondary study outcomes were early renal graft dysfunction (eRGD) and all other outcome parameters associated with graft function. In this context, ePGD was defined as pancreas graft-related complications including graft pancreatitis, pancreatic abscess/peritonitis, delayed graft function, graft thrombosis, bleeding, rejection and the consecutive need for re-laparotomy due to graft-related complications within 3 months. For the analysis of ePGD, serum levels of white blood cell count (WBC), C-reactive protein (CRP), procalcitonin (PCT) and pancreatic lipase, as well as the neutrophil−lymphocyte ratio (NLR) and platelet−lymphocyte ratio (PLR), were measured preoperatively and on postoperative days (POD) 1, 2, 3 and 5. Further, peak serum levels of CRP and lipase during the first 72 h were evaluated. Receiver operating characteristic (ROC) curve analyses were performed to assess their predictive value for ePGD and eRGD. Cut-off levels were calculated with the Youden index. Significant diagnostic biochemical cut-offs as well as other prognostic clinical factors were tested in a multivariate logistic regression model. Results: Of the 105 patients included, 43 patients (41%) and 28 patients (27%) developed ePGD and eRGD following SPKT, respectively.
The mean WBC, PCT, NLR, PLR, CRP and lipase levels were significantly higher on most PODs in the ePGD group than in the non-ePGD group. ROC analysis indicated that peak lipase (AUC: 0.82) and peak CRP levels (AUC: 0.89) were highly predictive of ePGD after SPKT. The combination of both achieved the highest AUC (0.92; p < 0.01) in predicting ePGD. Concerning eRGD, the predictive accuracy of all analyzed serological markers was moderate (all AUC < 0.8). Additionally, multivariable analysis identified previous dialysis/no preemptive transplantation (OR 2.4 (95% CI: 1.41−4.01), p = 0.021), donor age (OR 1.07 (95% CI: 1.03−1.14), p < 0.010), donor body mass index (OR 1.32 (95% CI: 1.01−1.072), p = 0.04), donor cerebrovascular cause of death (OR 7.8 (95% CI: 2.21−26.9), p < 0.010), donor length of ICU stay (OR 1.27 (95% CI: 1.08−1.49), p < 0.010), and pancreas cold ischemia time (CIT) (OR 1.07 (95% CI: 1.03−1.14), p < 0.010) as clinically relevant prognostic predictors of ePGD. Further, peak lipase (OR 1.04 (95% CI: 1.02−1.07), p < 0.010), peak CRP levels (OR 1.12 (95% CI: 1.02−1.23), p < 0.010), pancreatic serum lipase concentration on POD 2 > 150 IU/L (OR 2.9 (95% CI: 1.2−7.13), p = 0.021), CRP levels ≥ 180 ng/mL on POD 2 (OR 3.6 (95% CI: 1.54−8.34), p < 0.01) and CRP levels > 150 ng/mL on POD 3 (OR 4.5 (95% CI: 1.7−11.4), p < 0.01) were identified as independent biochemical predictors of ePGD after transplantation. Conclusions: In the current study, the combination of peak lipase and CRP levels was highly effective in predicting early pancreatic graft dysfunction following SPKT. In contrast, the predictive value of these parameters for early renal graft dysfunction was lower. Intensified monitoring of these parameters may be helpful in identifying patients at a higher risk of pancreatic ischemia-reperfusion injury and IRI-associated postoperative complications leading to ePGD and thus a deteriorated outcome.
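The Youden index used here for cut-off selection is J = sensitivity + specificity − 1, maximized over candidate thresholds along the ROC curve. The following is a minimal pure-Python sketch of that procedure; the data in the test are toy values, not the study's measurements.

```python
# Sketch of cut-off selection via the Youden index: for each candidate
# threshold, J = sensitivity + specificity - 1; the optimal cut-off is the
# one that maximizes J. Values >= cutoff are classified as positive.

def youden_cutoff(values, labels):
    """values: biomarker measurements; labels: 1 = event (e.g. ePGD), 0 = none.
    Returns (best_cutoff, best_J)."""
    positives = sum(labels)
    negatives = len(labels) - positives
    best_cut, best_j = None, -1.0
    for cut in sorted(set(values)):
        tp = sum(1 for v, y in zip(values, labels) if v >= cut and y == 1)
        tn = sum(1 for v, y in zip(values, labels) if v < cut and y == 0)
        j = tp / positives + tn / negatives - 1  # sensitivity + specificity - 1
        if j > best_j:
            best_cut, best_j = cut, j
    return best_cut, best_j
```

With perfectly separated toy data, the returned cut-off sits at the lowest value among the event cases and J reaches its maximum of 1.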
RESUMEN
Background: Despite recent advances and refinements in the perioperative management of kidney transplantation (KT), early renal graft injury (eRGI) remains a critical problem with serious impairment of graft function as well as short- and long-term outcome. Serial monitoring of peripheral blood innate immune cells might be a useful tool for predicting post-transplant eRGI and graft outcome after KT. Methods: In this prospective study, medical data of 50 consecutive patients undergoing KT at the University Hospital of Leipzig were analyzed from the day of KT until day 10 after transplantation. The main outcome parameter was the occurrence of eRGI, alongside other outcome parameters associated with graft function/outcome. eRGI was defined as graft-related complications and clinical signs of renal ischemia-reperfusion injury (IRI), such as acute tubular necrosis (ATN), delayed graft function (DGF), initial nonfunction (INF) and graft rejection within 3 months following KT. Typical innate immune cells including neutrophils, natural killer (NK) cells, monocytes, basophils and dendritic cells (myeloid, plasmacytoid) were measured in peripheral blood in all patients on days 0, 1, 3, 7 and 10 after transplantation. Receiver operating characteristic (ROC) curve analyses were performed to assess their predictive value for eRGI. Cutoff levels were calculated with the Youden index. Significant diagnostic immunological cutoffs and other prognostic clinical factors were tested in a multivariate logistic regression model. Results: Of the 50 included patients, 23 developed eRGI. Mean levels of neutrophils and monocytes were significantly higher on most days in the eRGI group than in the non-eRGI group after transplantation, whereas NK cell, basophil and dendritic cell (DC) counts decreased significantly from baseline over the postoperative course.
ROC analysis indicated that monocyte levels on POD 7 (AUC: 0.91) and NK cell levels on POD 7 (AUC: 0.92) were highly predictive of eRGI after KT. Multivariable analysis identified recipient age (OR 1.53 (95% CI: 1.003−2.350), p = 0.040), recipient body mass index > 25 kg/m2 (OR 5.6 (95% CI: 1.36−23.9), p = 0.015), recipient cardiovascular disease (OR 8.17 (95% CI: 1.28−52.16), p = 0.026), donor age (OR 1.068 (95% CI: 1.011−1.128), p = 0.027), deceased-donor transplantation (OR 2.18 (95% CI: 1.091−4.112), p = 0.027) and cold ischemia time (CIT) of the renal graft (OR 1.005 (95% CI: 1.001−1.01), p = 0.019) as clinically relevant prognostic factors associated with increased eRGI following KT. Further, neutrophils > 9.4 × 10³/µL on POD 7 (OR 16.1 (95% CI: 1.31−195.6), p = 0.031), monocytes > 1150 cells/µL on POD 7 (OR 7.81 (95% CI: 1.97−63.18), p = 0.048), NK cells < 125 cells/µL on POD 3 (OR 6.97 (95% CI: 3.81−12.7), p < 0.01), basophils < 18.1 cells/µL on POD 10 (OR 3.45 (95% CI: 1.37−12.3), p = 0.02) and mDC < 4.7 cells/µL on POD 7 (OR 11.68 (95% CI: 1.85−73.4), p < 0.01) were identified as independent predictive variables for eRGI after KT. Conclusions: We show that the combined measurement of innate immune variables (NK cells and monocytes on POD 7) together with specific clinical factors such as prolonged CIT, increased donor and recipient age and morbidity, and deceased-donor transplantation was a significant and specific predictor of eRGI following KT. We suggest that intensified monitoring of these parameters might be a helpful clinical tool for identifying patients at a higher risk of postoperative complications after KT and may therefore help to detect and, by diligent clinical management, even prevent a deteriorated outcome due to IRI and eRGI after KT.
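The AUC values reported in these abstracts can be computed without explicit curve construction via the rank-sum (Mann-Whitney) identity: the AUC equals the probability that a randomly chosen event case has a higher marker value than a randomly chosen non-event case. A minimal sketch, with illustrative data only:

```python
# AUC via the Mann-Whitney identity: count the fraction of (event, non-event)
# pairs in which the event case has the higher marker value; ties count 0.5.

def roc_auc(values, labels):
    """values: marker measurements; labels: 1 = event, 0 = no event."""
    pos = [v for v, y in zip(values, labels) if y == 1]
    neg = [v for v, y in zip(values, labels) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```

An AUC of 1.0 corresponds to perfect separation, 0.5 to chance-level discrimination.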
RESUMEN
OBJECTIVE: On 16 December 2006, most Eurotransplant countries changed from a waiting-time-oriented liver allocation policy to the urgency-oriented Model for End-stage Liver Disease (MELD) system. There are limited data on the effects of this policy change within the Eurotransplant community. PATIENTS AND METHODS: A total of 154 patients who had undergone deceased-donor liver transplantation (LT) were retrospectively analyzed across three time periods: period A (1 year pre-MELD, n = 42) versus period B (1 year post-MELD, n = 52) versus period C (2 years after MELD implementation, n = 60). RESULTS: The median MELD score at the time of LT increased from 16.3 points in period A to 22.4 and 20.4 in periods B and C, respectively (p = 0.007). Waitlist mortality decreased from 18.4% in period A to 10.4% and 9.4% in periods B and C, respectively (p = 0.015). Three-month mortality did not change significantly (10% each for periods A, B and C). One-year survival was 84% for the MELD 6-19 group compared with 81% in the MELD 20-29 group and 74% in the MELD ≥30 group (p = 0.823). When the MELD score and previously described prognostic scores [i.e. the survival after liver transplantation (SALT) score and the donor-MELD (D-MELD) score] were analyzed with regard to 1-year survival, only a high-risk SALT score was predictive (p = 0.038). In our center, 2 years after implementation of the MELD system, waitlist mortality had decreased, while 90-day mortality did not change significantly. CONCLUSION: To date, only the SALT score has proved to be of prognostic relevance post-transplant.
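For reference, the MELD score discussed in this abstract is, in its classic (pre-2016 UNOS) form, a weighted log-linear combination of serum bilirubin, INR and creatinine. The sketch below follows the commonly cited UNOS convention (lab values below 1.0 floored at 1.0, creatinine capped at 4.0 mg/dL, score rounded and capped at 40); the input values in the test are illustrative, not from the study.

```python
import math

# Sketch of the classic (pre-2016 UNOS) MELD calculation:
# MELD = 3.78*ln(bilirubin) + 11.2*ln(INR) + 9.57*ln(creatinine) + 6.43,
# with lab values < 1.0 floored at 1.0, creatinine capped at 4.0 mg/dL,
# and the result rounded and capped at 40 per the usual convention.

def meld(bilirubin_mg_dl, inr, creatinine_mg_dl):
    bili = max(bilirubin_mg_dl, 1.0)
    cr = min(max(creatinine_mg_dl, 1.0), 4.0)
    i = max(inr, 1.0)
    score = (3.78 * math.log(bili) + 11.2 * math.log(i)
             + 9.57 * math.log(cr) + 6.43)
    return min(round(score), 40)
```

With all labs at or below 1.0 the formula bottoms out at about 6, the conventional minimum MELD.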