RESUMO
Hantaviruses, members of the Bunyaviridae family, can cause two patterns of disease in humans: hemorrhagic fever with renal syndrome (HFRS) and hantavirus cardiopulmonary syndrome (HCPS), the latter predominating in the Americas. Andes virus is one of the strains that can cause HCPS and is endemic in Chile. Transmission occurs through direct or indirect contact with the urine, saliva, or feces of infected rodents, and through inhalation of aerosol particles containing the virus. HCPS rapidly evolves into acute but reversible multiorgan dysfunction. The hemodynamic pattern of HCPS is not identical to that of cardiogenic or septic shock, being characterized by hypovolemia, systolic dysfunction, and pulmonary edema secondary to increased permeability. Given the lack of specific effective therapies for this viral infection, the focus of treatment lies in the timely provision of intensive care, specifically hemodynamic and respiratory support, which often requires veno-arterial extracorporeal membrane oxygenation (VA-ECMO). This narrative review aims to provide insights into the specific ICU management of HCPS based on the available evidence and the experience gathered in Chile and South America, including perspectives on pathophysiology, organ dysfunction kinetics, timely life support provision, safe patient transportation, and key challenges for the future.
Assuntos
Cuidados Críticos , Síndrome Pulmonar por Hantavirus , Humanos , Cuidados Críticos/métodos , Síndrome Pulmonar por Hantavirus/terapia , Síndrome Pulmonar por Hantavirus/diagnóstico , Síndrome Pulmonar por Hantavirus/fisiopatologia , Síndrome Pulmonar por Hantavirus/epidemiologia , Oxigenação por Membrana Extracorpórea/métodos , Chile/epidemiologia , Orthohantavírus/fisiologia
RESUMO
BACKGROUND: Andes virus (ANDV) is a zoonotic Orthohantavirus that causes hantavirus cardiopulmonary syndrome. Although most transmissions occur through environmental exposure to rodent faeces and urine, rare person-to-person transmission has been documented, mainly among close contacts. This study investigates the presence and infectivity of ANDV in body fluids from confirmed cases and the duration of viraemia. METHODS: In this prospective study, 131 participants with confirmed ANDV infection were enrolled in Chile between 2008 and 2022. Clinical samples (buffy coat, plasma, gingival crevicular fluid [GCF], saliva, nasopharyngeal swabs [NPS], and urine) were collected weekly for 3 weeks together with clinical and epidemiological data. Samples were categorised as acute or convalescent (up to or after 16 days following onset of symptoms). Infectivity of positive fluids was assessed by culturing samples on Vero E6 cells and using flow cytometry assays to determine the production of ANDV nucleoprotein. FINDINGS: ANDV RNA was detected in 100% of buffy coats during the acute phase, declining to 95% by day 17 and to 93% between days 23-29. ANDV RNA in GCF and saliva decreased from 30% and 12%, respectively, during the acute phase, to 12% and 11% during the convalescent phase. Successful infectivity assays of RT-qPCR-positive fluids, including GCF, saliva, NPS, and urine, were observed in 18 (42%) of 43 samples obtained during the acute phase of infection. After re-culture, the capacity to infect Vero E6 cells was maintained in 16 (89%) of 18 samples. Severity was associated with the presence of ANDV RNA in one or more fluids besides blood (odds ratio 2·58 [95% CI 1·42-5·18]). INTERPRETATION: ANDV infection is a systemic and viraemic infection that affects various organs. 
The presence of infectious particles in body fluids contributes to our understanding of potential mechanisms for person-to-person transmission, supporting the development of preventive strategies. Detection of ANDV RNA in additional fluids at hospital admission is a predictor of disease severity. FUNDING: National Institutes of Health and Agencia de Investigación y Desarrollo. TRANSLATION: For the Spanish translation of the abstract see Supplementary Materials section.
Assuntos
Infecções por Hantavirus , Orthohantavírus , Viremia , Eliminação de Partículas Virais , Humanos , Estudos Prospectivos , Masculino , Adulto , Infecções por Hantavirus/transmissão , Infecções por Hantavirus/epidemiologia , Infecções por Hantavirus/virologia , Feminino , Orthohantavírus/isolamento & purificação , Chile/epidemiologia , Pessoa de Meia-Idade , Adulto Jovem , Adolescente , RNA Viral , Animais , Criança , Chlorocebus aethiops , Idoso , Células Vero
RESUMO
BACKGROUND: Several studies have validated capillary refill time (CRT) as a marker of tissue hypoperfusion, and recent guidelines recommend CRT monitoring during septic shock resuscitation. It is therefore relevant to further explore its kinetics of response to short-term hemodynamic interventions with fluids or vasopressors. A couple of previous studies explored the impact of a fluid bolus on CRT, but little is known about the impact of norepinephrine on CRT when aiming at a higher mean arterial pressure (MAP) target in septic shock. We designed this observational study to further evaluate the effect of a fluid challenge (FC) and a vasopressor test (VPT) on CRT in septic shock patients with abnormal CRT after initial resuscitation. Our purpose was to determine the effects on the direction and magnitude of the CRT response of a FC in fluid-responsive patients, and of a VPT aimed at a higher MAP target in chronically hypertensive fluid-unresponsive patients. METHODS: Thirty-four septic shock patients were included. Fluid responsiveness was assessed at baseline, and a FC (500 ml/30 min) was administered to 9 fluid-responsive patients. A VPT was performed in 25 patients by increasing the norepinephrine dose to reach a MAP of 80-85 mmHg for 30 min. Patients shared a multimodal perfusion and hemodynamic monitoring protocol with assessments at a minimum of two time-points (baseline and at the end of the interventions). RESULTS: CRT decreased significantly with both tests (from 5 [3.5-7.6] to 4 [2.4-5.1] sec, p = 0.008 after the FC; and from 4.0 [3.3-5.6] to 3 [2.6-5] sec, p = 0.03 after the VPT). A CRT response was observed in 7/9 patients after the FC and in 14/25 patients after the VPT, but CRT deteriorated in 4 patients in this latter group, all of them receiving concomitant low-dose vasopressin. CONCLUSIONS: Our findings support that fluid boluses may improve CRT or produce neutral effects in fluid-responsive septic shock patients with persistent hypoperfusion. 
Conversely, raising norepinephrine doses to target a higher MAP in previously hypertensive patients elicits a more heterogeneous response, improving CRT in the majority but deteriorating skin perfusion in some patients, a finding that deserves further research.
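The paired before/after CRT comparisons reported above are typically tested with a non-parametric paired test. The sketch below uses invented CRT values (not the study's data) and SciPy's Wilcoxon signed-rank test to illustrate the comparison:

```python
import numpy as np
from scipy.stats import wilcoxon

# Hypothetical paired CRT measurements (seconds) before and after a fluid
# challenge -- illustrative values only, not the study's data.
crt_before = np.array([5.0, 3.5, 7.6, 6.0, 4.8, 5.5, 4.2, 6.8, 5.1])
crt_after = np.array([4.0, 2.4, 5.1, 4.5, 4.9, 3.8, 3.0, 5.0, 3.9])

# Paired non-parametric comparison (Wilcoxon signed-rank test).
stat, p_value = wilcoxon(crt_before, crt_after)
print(f"p = {p_value:.4f}")
```

With 8 of 9 paired differences favoring a shorter CRT after the bolus, the test flags a significant change, mirroring the direction of effect reported in the abstract.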
RESUMO
Different techniques have been proposed to measure antibiotic levels within the lung parenchyma; however, their use is limited because they are invasive and associated with adverse effects. We explored whether beta-lactam antibiotics could be measured in exhaled breath condensate collected from heat and moisture exchange filters (HMEFs) and correlated with the concentration of antibiotics measured in bronchoalveolar lavage (BAL). We designed an observational study in patients undergoing mechanical ventilation who required a BAL to confirm or rule out the diagnosis of pneumonia. We measured and correlated the concentration of beta-lactam antibiotics in plasma, epithelial lining fluid (ELF), and exhaled breath condensate collected from HMEFs. We studied 12 patients and detected the presence of antibiotics in plasma, ELF, and HMEFs from every patient studied. The concentrations of antibiotics were very heterogeneous across the population studied. The mean antibiotic concentration was 293.5 (715) ng/mL in plasma, 12.3 (31) ng/mL in ELF, and 0.5 (0.9) ng/mL in HMEF. We found no significant correlation between the concentration of antibiotics in plasma and ELF (R2 = 0.02, p = 0.64), between plasma and HMEF (R2 = 0.02, p = 0.63), or between ELF and HMEF (R2 = 0.02, p = 0.66). We conclude that beta-lactam antibiotics can be detected and measured in the exhaled breath condensate accumulated in the HMEF of mechanically ventilated patients. However, no correlations were observed between antibiotic concentrations in HMEF and those in either plasma or ELF.
RESUMO
Introduction: SARS-CoV-2 infection is associated with many neurological manifestations, including acute cerebrovascular disease. The most commonly reported cerebrovascular manifestation is ischemic stroke secondary to large-vessel occlusion. Cerebral autosomal dominant arteriopathy with subcortical infarcts and leukoencephalopathy (CADASIL) is the most frequent genetic disease associated with white matter lesions and multiple lacunar infarcts. How SARS-CoV-2 affects patients with CADASIL is unknown. Our aim is to report the neurologic presentation of COVID-19 in a patient with CADASIL. Methods: Clinical and laboratory data were retrieved from the clinical records. Additional information was obtained from the neuropsychological assessment report and from interviews with the patient and his family. Results: A patient in his fifties presented to the emergency department with confusion, disorientation, language difficulty, and weakness of the right arm and leg. Brain magnetic resonance imaging showed multiple acute subcortical border-zone lesions, together with chronic white matter lesions in the external capsule and the poles of the temporal lobes, both typical of CADASIL. Genetic testing confirmed a missense mutation in the NOTCH3 gene. The patient was followed for 10 months; although his neurologic condition improved, he remained with cognitive deficits that affected his instrumental activities of daily living. Conclusion: Patients with CADASIL infected with SARS-CoV-2 may present with border-zone infarcts and encephalopathy. COVID-19 could accelerate the cognitive decline described in patients with CADASIL.
RESUMO
Bitter pit (BP) is one of the most relevant post-harvest disorders for the apple industry worldwide and is often related to calcium (Ca) deficiency at the calyx end of the fruit. Its occurrence takes place along with an imbalance with other minerals, such as potassium (K). Although the K/Ca ratio is considered a valuable indicator of BP, the levels of these elements vary widely within the fruit, between fruits of the same plant, and between plants and orchards. Prediction systems based on the content of elements in fruit show high variability because the elements are determined in samples composed of various fruits. With X-ray fluorescence (XRF) spectrometry, it is possible to non-destructively characterize the signal intensity for several mineral elements at a given position in individual fruit, and thus the complete signal of the mineral composition can be used to build a predictive model of bitter pit incidence. Therefore, it was hypothesized that, using a multivariate modeling approach, elements beyond K and Ca could be found that improve the current BP prediction capability. Two studies were carried out: in the first, an experiment was conducted to determine the K/Ca ratio and the whole XRF spectrum of a balanced sample of affected and non-affected 'Granny Smith' apples. In the second, apples of three cultivars ('Granny Smith', 'Brookfield' and 'Fuji') were harvested from two commercial orchards to evaluate the use of XRF to predict BP. With data from the first study, a multivariate classification system was trained (balanced database of healthy and BP fruit, consisting of 176 fruit from each group), and the model was then applied in the second study to fruit from two orchards with a history of BP. 
Results show that when dimensionality reduction was performed on the XRF spectra (1.5-8 keV) of 'Granny Smith' apples, comparing fruit with and without BP, four other elements (i.e., Cl, Si, P, and S) were found to be deterministic along with K and Ca. However, the PCA revealed that classification between samples (BP vs. non-BP fruit) was not possible by univariate analysis (individual elements or the K/Ca ratio). Therefore, a multivariate classification approach was applied, and the classification measures (sensitivity, specificity, and balanced accuracy) of the PLS-DA models for all cultivars evaluated ('Granny Smith', 'Fuji' and 'Brookfield'), on the full training samples and with both validation procedures (Venetian blinds and Monte Carlo), ranged from 0.76 to 0.92. The results of this work indicate that using this technology at the individual fruit level is essential to understand the factors that determine this disorder and can improve BP prediction in intact fruit.
RESUMO
Due to climate change and the expected food shortages of the coming decades, it will be necessary not only to develop cultivars with greater tolerance to environmental stress but also to reduce breeding cycle time. In addition to yield evaluation, plant breeders resort to many sensory assessments and others of intermediate complexity. However, to develop cultivars better adapted to current and future constraints, it is necessary to incorporate a new set of traits, such as morphophysiological and physicochemical attributes, information relevant to the successful selection of genotypes or parents. Unfortunately, because of the large number of genotypes to be screened, measurements with conventional equipment are unfeasible, especially under field conditions. High-throughput plant phenotyping (HTPP) facilitates collecting a significant amount of data quickly; however, it is necessary to transform all this information (e.g., plant reflectance) into descriptors helpful to the breeder. To the extent that a holistic characterization of the plant (phenomics) is performed in challenging environments, it will be possible not only to select the best genotypes objectively (forward phenomics) but also to understand why a given individual differs from the rest (reverse phenomics). Unfortunately, several elements have prevented phenomics from developing as desired. Consequently, a new set of prediction/validation methodologies, seasonal ambient information, and the fusion of data matrices (e.g., genotypic and phenotypic information) need to be incorporated into the modeling. In this sense, for the massive implementation of phenomics in plant breeding, it will be essential to count on an interdisciplinary team that responds to the urgent need to release material with a greater capacity to tolerate environmental stress. 
Therefore, breeding programs should (i) be more efficient (e.g., early discarding of unsuitable material), (ii) have shorter breeding cycles (fewer crosses to achieve the desired cultivar), and (iii) be more productive, increasing the probability of success at the end of the breeding process (percentage of cultivars released to the number of initial crosses).
Assuntos
Fenômica , Melhoramento Vegetal , Genótipo , Fenótipo , Plantas/genética
RESUMO
PURPOSE: Assessing competency in surgical procedures is key for instructors to distinguish whether a resident is qualified to perform them on patients. Current assessment techniques do not always provide feedback about the order in which activities need to be performed. In this research, using a Process Mining approach, process-oriented metrics are proposed to assess the training of residents in a Percutaneous Dilatational Tracheostomy (PDT) simulator, identifying the critical points in the execution of the surgical process. MATERIALS AND METHODS: A reference process model of the procedure was defined, and video recordings of student training sessions in the PDT simulator were collected and tagged to generate event logs. Three process-oriented metrics were proposed to assess the performance of residents in training. RESULTS: Although the students were proficient according to classic metrics, they did not reach the optimum in process-oriented metrics: the optimum was achieved in only 25% of the stages in the last session. Within these stages, the four most challenging activities were also identified, which account for 32% of the process-oriented metric errors. CONCLUSIONS: Process-oriented metrics offer a new, more granular perspective on the performance of surgical procedures, enabling more specific and actionable feedback for both students and instructors.
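As a toy illustration of one kind of process-oriented metric, the sketch below scores a tagged session trace against a reference activity sequence by the share of reference activities executed in the correct relative order. The activity names are invented placeholders, not the study's actual PDT model, and the greedy common-subsequence match from `difflib` stands in for a proper process-mining alignment:

```python
from difflib import SequenceMatcher

# Reference order of activities for a hypothetical, simplified PDT procedure.
reference = ["position", "ultrasound", "puncture", "guidewire",
             "dilate", "insert_cannula", "verify"]

# One trainee's tagged session (an event-log trace): one step skipped,
# two steps swapped relative to the reference model.
trace = ["position", "puncture", "ultrasound", "guidewire",
         "insert_cannula", "verify"]

def order_adherence(reference, trace):
    """Share of reference activities matched in order: length of the
    greedy common subsequence (via difflib) over the reference length."""
    matched = sum(b.size for b in
                  SequenceMatcher(None, reference, trace).get_matching_blocks())
    return matched / len(reference)

print(round(order_adherence(reference, trace), 2))
```

A score of 1.0 would mean every reference activity was performed and in order; skips and swaps pull the metric down, pointing to the specific stages that need feedback.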
Assuntos
Competência Clínica , Traqueostomia , Humanos , Dilatação , Retroalimentação , Estudantes , Traqueostomia/educação , Traqueostomia/métodos
RESUMO
Rationale: The role of and needs for extracorporeal membrane oxygenation (ECMO) at a population level during the coronavirus disease (COVID-19) pandemic have not been completely established. Objectives: To identify the cumulative incidence of ECMO use in the first pandemic wave and to describe the nationwide Chilean cohort of ECMO-supported patients with COVID-19. Methods: We conducted a population-based study from March 3 to August 31, 2020, using linked data from national agencies. The cumulative incidence of ECMO use and mortality risk of ECMO-supported patients were calculated and age standardized. In addition, a retrospective cohort analysis was performed. Outcomes were 90-day mortality after ECMO initiation, ECMO-associated complications, and hospital length of stay. Cox regression models were used to explore risk factors for mortality in a time-to-event analysis. Measurements and Main Results: Ninety-four patients with COVID-19 were supported with ECMO (0.42 per 100,000 population, 14.89 per 100,000 positive cases, and 1.2% of intubated patients with COVID-19); 85 were included in the cohort analysis, and the median age was 48 (interquartile range [IQR], 41-55) years, 83.5% were men, and 42.4% had obesity. The median number of pre-ECMO intubation days was 4 (IQR, 2-7), the median PaO2/FiO2 ratio was 86.8 (IQR, 64-99) mm Hg, 91.8% of patients were prone positioned, and 14 patients had refractory respiratory acidosis. Main complications were infections (70.6%), bleeding (38.8%), and thromboembolism (22.4%); 52 patients were discharged home, and 33 died. The hospital length of stay was a median of 50 (IQR, 24-69) days. Lower respiratory system compliance and higher driving pressure before ECMO initiation were associated with increased mortality. A duration of pre-ECMO intubation ≥10 days was not associated with mortality. Conclusions: Documenting nationwide ECMO needs may help in planning ECMO provision for future COVID-19 pandemic waves. 
The 90-day mortality of the Chilean cohort of ECMO-supported patients with COVID-19 (38.8%) is comparable to that of previous reports.
Assuntos
COVID-19/terapia , Oxigenação por Membrana Extracorpórea/estatística & dados numéricos , Síndrome do Desconforto Respiratório/terapia , Adulto , Idoso , COVID-19/complicações , COVID-19/diagnóstico , COVID-19/epidemiologia , Chile/epidemiologia , Feminino , Humanos , Incidência , Masculino , Pessoa de Meia-Idade , Avaliação das Necessidades , Síndrome do Desconforto Respiratório/diagnóstico , Síndrome do Desconforto Respiratório/epidemiologia , Síndrome do Desconforto Respiratório/virologia , Estudos Retrospectivos , Índice de Gravidade de Doença , Resultado do Tratamento
RESUMO
Percutaneous dilatational tracheostomy has become the technique of choice in multiple intensive care units. Among innovations to improve procedural safety and success, bronchoscopic guidance of percutaneous dilatational tracheostomy has been advocated and successfully implemented by multiple groups. Most published literature focuses on the percutaneous dilatational tracheostomy operator, with scarce descriptions of the bronchoscopic particularities of the procedure. In this article, we provide 10 suggestions to enhance specific procedural aspects of bronchoscopic guidance of percutaneous dilatational tracheostomy, and strategies to optimize its teaching and learning, in order to promote learners' competence acquisition and increase patient safety.
RESUMO
Patients with severe coronavirus disease (COVID-19) may develop COVID-19-associated invasive mold infection (CAIMI). We report 16 cases of CAIMI among 146 nonimmunocompromised patients with severe COVID-19 at an academic hospital in Santiago, Chile. These cases correspond to a CAIMI incidence of 11%; the mortality rate for these patients was 31.2%.
Assuntos
COVID-19 , Estado Terminal , Micoses , Chile/epidemiologia , Humanos , Micoses/complicações , SARS-CoV-2
RESUMO
BACKGROUND: Deconstructing a complex procedure improves skills learning, but no model has covered all relevant procedural aspects of Percutaneous Dilatational Tracheostomy (PDT). Moreover, the heterogeneity of the techniques described may hinder trainees' competency acquisition. Our objective was to develop a PDT model for procedural training that includes a comprehensive step-by-step design. METHODS: Procedural descriptions were retrieved after a structured search in medical databases. Activities were extracted, and adherence to McKinley's dimensions of procedural competence was analyzed. We developed a comprehensive PDT model, which was further validated through a Delphi-based consensus of Spanish-speaking international experts. RESULTS: The 14 descriptions retrieved for analysis presented a median [interquartile range] of 18 [11-22] steps, covering 3 [2-4] of McKinley's dimensions. The first model submitted to the Delphi panel included all of McKinley's dimensions and was answered by 25 experts from nine countries, with the process ending in the second round. The final model included 59 activities divided into six stages (51 from the initial model and eight proposed by experts) and performed by two operators (bronchoscopy and tracheostomy). CONCLUSIONS: We have presented a PDT model that includes the competence dimensions necessary to be considered complete. The model was validated by an expert consensus, allowing procedural training to be improved and promoting safer patient care.
Assuntos
Broncoscopia , Traqueostomia , Consenso , Técnica Delphi , Dilatação , Humanos
RESUMO
Hypopituitarism after moderate or severe traumatic brain injury (TBI) is usually underdiagnosed and therefore undertreated. Its course can be divided into an acute phase during the first 14 days after TBI, with a 50 to 80% risk of hypopituitarism, and a chronic phase, beginning three months after the event, with a prevalence of hypopituitarism that ranges from 2 to 70%. Its pathophysiology has been addressed in several studies, suggesting that vascular injury to the pituitary tissue is the most important mechanism during the acute phase, and an autoimmune one during chronic stages. In the acute phase, pituitary axis tests are difficult to interpret correctly. Hence, we propose a simple and cost-effective algorithm to detect and treat a potential hypothalamic-pituitary-adrenal axis impairment and alterations of sodium homeostasis, both of which can be life-threatening. In the chronic phase, post-concussion syndrome is the most important differential diagnosis. Given the high prevalence of hypopituitarism, we suggest that all pituitary axes be assessed in all patients with moderate to severe TBI between 3 and 6 months after the event, with assessment repeated at 12 months after trauma by a team specialized in pituitary disease.
Assuntos
Humanos , Doenças da Hipófise , Lesões Encefálicas Traumáticas , Hipopituitarismo , Sistema Hipófise-Suprarrenal , Lesões Encefálicas Traumáticas/complicações , Hipopituitarismo/diagnóstico , Hipopituitarismo/etiologia , Sistema Hipotálamo-Hipofisário
RESUMO
BACKGROUND: Persistent hyperlactatemia has been considered a signal of tissue hypoperfusion in septic shock patients, but multiple non-hypoperfusion-related pathogenic mechanisms could be involved. Therefore, pursuing lactate normalization may carry a risk of fluid overload. Peripheral perfusion, assessed by the capillary refill time (CRT), could be an effective alternative resuscitation target, as recently demonstrated by the ANDROMEDA-SHOCK trial. We designed the present randomized controlled trial to address the impact of a CRT-targeted (CRT-T) vs. a lactate-targeted (LAC-T) fluid resuscitation strategy on fluid balances within 24 h of septic shock diagnosis. In addition, we compared the effects of both strategies on organ dysfunction, regional and microcirculatory flow, and tissue hypoxia surrogates. RESULTS: Forty-two fluid-responsive septic shock patients were randomized into CRT-T or LAC-T groups. Fluids were administered until target achievement during the 6 h intervention period, or until safety criteria were met. CRT-T was aimed at CRT normalization (≤ 3 s), whereas in LAC-T the goal was lactate normalization (≤ 2 mmol/L) or a 20% decrease every 2 h. Multimodal perfusion monitoring included sublingual microcirculatory assessment; plasma disappearance rate of indocyanine green; muscle oxygen saturation; central venous-arterial pCO2 gradient to arterial-venous O2 content difference ratio; and lactate/pyruvate ratio. There was no difference between CRT-T and LAC-T in 6 h fluid boluses (875 [375-2625] vs. 1500 [1000-2000] ml, p = 0.3) or fluid balances (982 [249-2833] vs. 1580 [740-6587] ml, p = 0.2). CRT-T was associated with a higher achievement of the predefined perfusion target (62% vs. 24%, p = 0.03). No significant differences in perfusion-related variables or hypoxia surrogates were observed. CONCLUSIONS: CRT-targeted fluid resuscitation was not superior to a lactate-targeted one in terms of fluid administration or balances. 
However, it was associated with comparable effects on regional and microcirculatory flow parameters and hypoxia surrogates, and with faster achievement of the predefined resuscitation target. Our data suggest that stopping fluids in patients with CRT ≤ 3 s appears safe in terms of tissue perfusion. Clinical Trials: ClinicalTrials.gov Identifier: NCT03762005 (retrospectively registered on December 3rd, 2018).
RESUMO
BACKGROUND: Assessment of tissue hypoxia at the bedside has yet to be translated into daily clinical practice in septic shock patients. Perfusion markers are surrogates of deeper physiological phenomena. The lactate-to-pyruvate ratio (LPR) and the ratio between the veno-arterial PCO2 difference and Ca-vO2 (ΔPCO2/Ca-vO2) have been proposed as markers of tissue hypoxia, but they have not been compared in the clinical scenario. We studied acute septic shock patients under resuscitation, aiming to evaluate the relationship of these hypoxia markers with clinical and biochemical markers of hypoperfusion during septic shock resuscitation. METHODS: Secondary analysis of a randomized controlled trial. Septic shock patients were randomized to fluid resuscitation directed at normalization of capillary refill time (CRT) versus normalization or significant lowering of lactate. Multimodal assessment of perfusion was performed at 0, 2, 6 and 24 hours and included macrohemodynamic and metabolic perfusion variables, CRT, regional flow and hypoxia markers. Patients who attained their pre-specified endpoint at 2 hours were compared to those who did not. RESULTS: Forty-two patients were recruited; the median APACHE-II score was 23 [15-31] and 28-day mortality was 23%. LPR and the ΔPCO2/Ca-vO2 ratio did not correlate during early resuscitation (0-2 h) or over the whole study period (24 hours). ΔPCO2/Ca-vO2 ratio derangements were more prevalent than LPR derangements, both in the whole cohort (52% vs. 23%) and in association with other perfusion abnormalities. In patients who reached their resuscitation endpoints, the proportion of patients with an altered ΔPCO2/Ca-vO2 ratio decreased significantly (66% to 33%, P=0.045), while the proportion with an altered LPR did not (14% vs. 25%, P=0.34). CONCLUSIONS: The hypoxia markers did not correlate during resuscitation in septic shock patients. They probably interrogate different pathophysiological processes and mechanisms of dysoxia during early septic shock. 
Future studies should better elucidate the interaction and clinical role of hypoxia markers during septic shock resuscitation.
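For reference, the ΔPCO2/Ca-vO2 ratio discussed above is computed from routine blood-gas values using the standard oxygen-content formula. The sketch below uses hypothetical numbers; the >1.4 mmHg/mL cutoff mentioned in the comment is the commonly cited threshold for anaerobic metabolism, an assumption here rather than a value from this abstract:

```python
# Hypothetical blood-gas values illustrating the deltaPCO2/Ca-vO2 ratio.
def o2_content(hb_g_dl, sat, po2_mmhg):
    """Oxygen content (mL O2/dL): hemoglobin-bound plus dissolved O2."""
    return 1.34 * hb_g_dl * sat + 0.003 * po2_mmhg

hb = 10.0                          # hemoglobin, g/dL (illustrative)
ca_o2 = o2_content(hb, 0.98, 90)   # arterial O2 content
cv_o2 = o2_content(hb, 0.70, 40)   # central venous O2 content
delta_pco2 = 46 - 38               # venous minus arterial PCO2, mmHg

# Ratio > ~1.4 mmHg/mL is the commonly cited marker of anaerobic metabolism.
ratio = delta_pco2 / (ca_o2 - cv_o2)
print(round(ratio, 2))
```

With these illustrative values the ratio exceeds the cutoff, the pattern the study treats as a ΔPCO2/Ca-vO2 derangement.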
RESUMO
Our country is suffering the effects of the ongoing coronavirus disease (COVID-19) pandemic. Because of the vulnerability of healthcare systems, intensive care areas in particular can rapidly become overloaded. This challenges ICUs simultaneously on multiple fronts, making it urgent to increase the number of beds without lowering the standards of care. The purpose of this article is to discuss some aspects of the national situation and to provide recommendations on the organizational management of intensive care units, such as isolation protocols, surging ICU bed capacity, ensuring adequate supplies, and protecting and training healthcare workers while maintaining quality clinical management.
Assuntos
Humanos , Infecções por Coronavirus/epidemiologia , Pandemias , Unidades de Terapia Intensiva/organização & administração , Unidades de Terapia Intensiva/provisão & distribuição , Capacidade de Resposta ante Emergências
RESUMO
Wheat plants growing under Mediterranean rain-fed conditions are exposed to water deficit, particularly during the grain filling period, and this can lead to a strong reduction in grain yield (GY). This study examines the effects of water deficit after during the grain filling period on photosynthetic and water-use efficiencies at the leaf and whole-plant level for 14 bread wheat genotypes grown in pots under glasshouse conditions. Two glasshouse experiments were conducted, one in a conventional glasshouse at the Universidad de Talca, Chile (Experiment 1), and another at the National Plant Phenomics Centre (NPPC), Aberystwyth, UK (Experiment 2), in 2015. Plants were grown under well-watered (WW) and water-limited (WL) conditions during grain filling. The reductions in leaf water potential (Ψ), net CO2 assimilation (An) and stomatal conductance (gs) due to water deficit were 79, 35 and 55%, respectively, during grain filling but no significant differences were found among genotypes. However, chlorophyll fluorescence parameters (as determined on dark-adapted and illuminated leaves) and chlorophyll content (Chl) were significantly different among genotypes, but not between water conditions. Under both water conditions, An presented a positive and linear relationship with the effective photochemical quantum yield of Photosystem II (Y(II)) and the maximum rate of electron transport (ETRmax), and negative with the quantum yield of non-photochemical energy conversion in Photosystem II (Y(NPQ)). The relationship between An and Chl was positive and linear for both water conditions, but under WL conditions An tended to be lower at any Chl value. Both, instantaneous (An/E) and intrinsic (An/gs) water-use efficiencies at the leaf level exhibited a positive and linear relationship with plant water-use efficiency (WUEp = plant dry weight/water use). 
Carbon isotope discrimination (Δ13C) in kernels showed a negative relationship with WUEp under both WW and WL conditions, and a positive relationship with GY. Our results indicate that during grain filling wheat plants face limitations to the assimilation process due to natural senescence and water stress. The reduction in An and gs after anthesis under both water conditions was mainly due to a decline in chlorophyll content (a non-stomatal limitation), whereas the observed differences between water conditions were mainly due to a stomatal limitation.
Assuntos
Variação Genética , Genótipo , Folhas de Planta/genética , Folhas de Planta/metabolismo , Triticum/genética , Triticum/metabolismo , Água/metabolismo , Pão , Clorofila/metabolismo , Folhas de Planta/crescimento & desenvolvimento , Solo/química , Triticum/crescimento & desenvolvimento , Água/análiseRESUMO
Our country is suffering the effects of the ongoing coronavirus disease (COVID-19) pandemic. Because of the vulnerability of healthcare systems, intensive care areas in particular can rapidly become overloaded. This challenges ICUs simultaneously on multiple fronts, making it urgent to increase the number of beds without lowering the standards of care. The purpose of this article is to discuss some aspects of the national situation and to provide recommendations on the organizational management of intensive care units, such as isolation protocols, surging ICU bed capacity, ensuring adequate supplies, and protecting and training healthcare workers while maintaining quality clinical management.
Assuntos
COVID-19/epidemiologia , Unidades de Terapia Intensiva/organização & administração , Unidades de Terapia Intensiva/provisão & distribuição , Pandemias , Humanos , Capacidade de Resposta ante EmergênciasRESUMO
Hypopituitarism after moderate or severe traumatic brain injury (TBI) is usually underdiagnosed and therefore undertreated. Its course can be divided into an acute phase, during the first 14 days after TBI, with a 50 to 80% risk of hypopituitarism, and a chronic phase, beginning three months after the event, with a prevalence of hypopituitarism ranging from 2 to 70%. Its pathophysiology has been addressed in several studies, which suggest that vascular injury to the pituitary tissue is the most important mechanism during the acute phase, and an autoimmune mechanism during chronic stages. In the acute phase, pituitary axis testing is difficult to interpret correctly. Hence, we propose a simple and cost-effective algorithm to detect and treat potential hypothalamic-pituitary-adrenal axis impairment and alterations of sodium homeostasis, both of which can be life-threatening. In the chronic phase, post-concussion syndrome is the most important differential diagnosis. Given the high prevalence of hypopituitarism, we suggest that all pituitary axes be assessed in all patients with moderate to severe TBI 3 to 6 months after the event, and reassessed 12 months after trauma by a team specialized in pituitary disease.
Assuntos
Lesões Encefálicas Traumáticas , Hipopituitarismo , Doenças da Hipófise , Lesões Encefálicas Traumáticas/complicações , Humanos , Hipopituitarismo/diagnóstico , Hipopituitarismo/etiologia , Sistema Hipotálamo-Hipofisário , Sistema Hipófise-SuprarrenalRESUMO
INTRODUCTION: Bronchoscopy-guided percutaneous dilatational tracheostomy (BG-PDT) is an invasive procedure regularly performed in the intensive care unit. The risk of serious complications has been estimated at up to 5%, concentrated during the learning phase. We have not found any published formal training protocols, and commercial simulators are costly and not widely available in some countries. The objective of this study was to present the design and performance of a low-cost BG-PDT simulator. METHODS: A simulator was designed with materials available in a hardware store, synthetic skin pads, ex vivo bovine tracheas, and a pipe inspection camera. The simulator was tested on 8 experts and 9 novices. Sessions were video recorded, and participants were equipped with the Imperial College Surgical Device, a hand motion-tracking device. Performance was evaluated with a multimodal approach, including first-attempt success rate, global success rate, total procedural time, Imperial College Surgical Device-derived proficiency parameters, and a global rating scale applied blindly by 2 expert observers. A satisfaction survey was administered after the procedure. RESULTS: A simulator was successfully constructed, allowing multiple iterations per assembly, with a fixed cost of US$30 and a cost of US$4 per use. Experts had greater global and first-attempt success rates and performed the procedure faster and with greater proficiency. The simulator showed high user satisfaction and fidelity. CONCLUSIONS: A low-cost BG-PDT simulator was successfully constructed, with high fidelity and the ability to discriminate between experts and novices. Considering its ease of construction and cost, it can be replicated in almost any intensive care unit.