1.
J Crit Care ; 83: 154834, 2024 May 22.
Article En | MEDLINE | ID: mdl-38781812

INTRODUCTION: Persistent critical illness (PCI) is a syndrome in which the acute presenting problem has been stabilized, but the patient's clinical state does not allow ICU discharge. The burden associated with PCI is substantial. The most obvious marker of PCI is prolonged ICU length of stay (LOS), usually greater than 10 days. The urea-to-creatinine ratio (UCr) has been suggested as an early marker of PCI development. METHODS: A single-center retrospective study. Data on patients admitted to a general mixed medical-surgical ICU between January 1, 2018 and December 31, 2022 were extracted, including demographic data, baseline characteristics, daily urea and creatinine results, renal replacement therapy (RRT) provided, and outcome measures (length of stay and ICU and 90-day mortality). Patients were defined as PCI patients if their LOS was >10 days. We used Fisher's exact test or the chi-square test to compare PCI and non-PCI patients. The association of UCr with PCI development was assessed by a repeated-measures linear model. Multivariate Cox regression was used to assess 1-year mortality. RESULTS: 2098 patients were included in the analysis. Patients who suffered from PCI were older, with higher admission prognostic scores. Their 90-day mortality was significantly higher than that of non-PCI patients (34.58% vs 12.18%, p < 0.0001). A significant difference in UCr was found only on the first admission day among all patients. This was not found when surgical, trauma, or transplantation patients were examined separately. We did not find a difference in UCr across KDIGO (Kidney Disease: Improving Global Outcomes) stages. Elevated UCr and PCI were found to be significantly associated with 1-year mortality. CONCLUSION: In this single-center retrospective cohort study, UCr was not found to be associated with PCI development.
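The two quantities the study tracks can be sketched in a few lines; this is a minimal illustration with hypothetical column names, not the study's actual data pipeline:

```python
# Illustrative sketch: daily urea-to-creatinine ratio (UCr) and the
# paper's PCI criterion (ICU LOS > 10 days). Column names are hypothetical.
import pandas as pd

def add_ucr_and_pci(df: pd.DataFrame) -> pd.DataFrame:
    """df holds one row per patient-day with 'urea' and 'creatinine'
    in the same units (e.g. mg/dL) and a per-patient 'los_days' column."""
    out = df.copy()
    out["ucr"] = out["urea"] / out["creatinine"]
    out["pci"] = out["los_days"] > 10  # PCI defined as ICU LOS > 10 days
    return out

demo = pd.DataFrame({
    "patient": [1, 1, 2],
    "urea": [40.0, 60.0, 30.0],
    "creatinine": [1.0, 1.5, 1.0],
    "los_days": [14, 14, 5],
})
print(add_ucr_and_pci(demo)[["ucr", "pci"]])
```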

2.
PLoS One ; 19(1): e0296386, 2024.
Article En | MEDLINE | ID: mdl-38166095

INTRODUCTION: The decision to intubate and ventilate a patient is mainly clinical. Both delaying intubation (when needed) and unnecessarily ventilating invasively (when it can be avoided) are harmful. We recently developed an algorithm predicting respiratory failure and invasive mechanical ventilation in COVID-19 patients. This is an internal validation study of this model, which also suggests a categorized "time-weighted" model. METHODS: We used a dataset of COVID-19 patients who were admitted to Rabin Medical Center after the algorithm was developed. We evaluated model performance in predicting ventilation against the actual endpoint of each patient. We further categorized each patient into one of four categories, based on the strength of the prediction of ventilation over time. We evaluated this categorized model's performance against the actual endpoint of each patient. RESULTS: 881 patients were included in the study; 96 of them were ventilated. The AUC of the original algorithm was 0.87-0.94. The AUC of the categorized model was 0.95. CONCLUSIONS: A minor degradation in the algorithm's accuracy was noted in the internal validation; however, its accuracy remained high. The categorized model allows accurate prediction over time, with a very high negative predictive value.


COVID-19 , Respiratory Insufficiency , Humans , COVID-19/therapy , Respiration, Artificial , Predictive Value of Tests , Respiratory Insufficiency/therapy , Respiration
3.
Reg Anesth Pain Med ; 2024 Jan 29.
Article En | MEDLINE | ID: mdl-38286738

BACKGROUND: Adequate pain control following lung transplantation (LTx) surgery is paramount. Thoracic epidural analgesia (TEA) is the gold standard; however, the potential use of extracorporeal membrane oxygenation (ECMO) and consequent anticoagulation therapy raises safety concerns, prompting clinicians to seek safer alternatives. The utility of thoracic wall blocks in general thoracic surgery is well established; however, their role in the context of LTx has been poorly investigated. METHODS: In this retrospective exploratory study, we assessed the effect of adding a superficial parasternal intercostal plane (sPIP) block and serratus anterior plane (SAP) block to standard anesthetic and analgesic care on tracheal extubation rates, pain scores, and opioid consumption until 72 hours postoperatively in LTx. RESULTS: Sixty patients were included in the analysis; 35 received the standard anesthetic and analgesic care (control group), and 25 received sPIP and SAP blocks in addition to the standard anesthetic and analgesic care (intervention group). We observed higher tracheal extubation rates in the intervention group at 8 hours postoperatively (16.0% vs 0.0%, p=0.03). This was also shown after adjusting for known prognostic factors (OR 1.18; 95% CI 1.04 to 1.33, p=0.02). Furthermore, we noted a lower opioid consumption measured by morphine milligram equivalents at 24 hours in the intervention group (median 405 (IQR 300-490) vs 266 (IQR 168-366), p=0.02). This was also found after adjusting for known prognostic factors (β -118; 95% CI -221 to 14, p=0.03). CONCLUSION: sPIP and SAP blocks are safe regional analgesic techniques in LTx involving ECMO and clamshell incision. They are associated with faster tracheal extubation and lower opioid consumption. These techniques should be considered when TEA is not appropriate. Further high-quality studies are warranted to confirm these findings.

4.
Artif Organs ; 48(4): 392-401, 2024 Apr.
Article En | MEDLINE | ID: mdl-38112077

BACKGROUND: The leading causes of maternal mortality include respiratory failure, cardiovascular events, infections, and hemorrhages. The use of extracorporeal membrane oxygenation (ECMO) as rescue therapy in the peripartum period for cardiopulmonary failure is expanding in critical care medicine. METHODS: This retrospective observational study was conducted on a nationwide cohort in Israel. During the 3-year period between September 1, 2019, and August 31, 2022, all women in the peripartum period who had been supported by ECMO for respiratory or circulatory failure at 10 large Israeli hospitals were identified. Indications for ECMO, maternal and neonatal outcomes, details of ECMO support, and complications were collected. RESULTS: During the 3-year study period, there were 540,234 live births in Israel, and 28 obstetric patients were supported by ECMO, an incidence of 5.2 cases per 100,000, or 1 case per 19,000 births (2.5 cases per 100,000 births when excluding patients with COVID-19). Of these, 25 were during the postpartum period, of whom 16 (64%) were cannulated on postpartum day 1 (PPD1), and 3 were during pregnancy. Eighteen patients (64.3%) were supported by V-V ECMO, 9 (32.1%) by V-A ECMO, and one (3.6%) by a VV-A configuration. Hypoxic respiratory failure (ARDS) was the most common indication for ECMO, observed in 21 patients (75%). COVID-19 was the cause of ARDS in 15 (53.7%) patients. The indications for the V-A configuration were cardiomyopathy (3 patients), amniotic fluid embolism (2 patients), sepsis, and pulmonary hypertension. The maternal and fetal survival rates were 89.3% (n = 25) and 100% (n = 28), respectively. The average ECMO duration was 17.6 ± 18.6 days and the average ICU stay was 29.8 ± 23.8 days. Major bleeding complications requiring surgical intervention were observed in one patient. CONCLUSIONS: The incidence of ECMO use in the peripartum period is low. The maternal and neonatal survival rates in patients treated with ECMO are high.
These results show that ECMO remains an important treatment option for obstetric patients with respiratory and/or cardiopulmonary failure.
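The incidence arithmetic above can be checked directly from the abstract's figures (variable names are illustrative):

```python
# Quick check of the reported incidence: 28 ECMO cases among 540,234 births.
births = 540_234
ecmo_cases = 28

per_100k = ecmo_cases / births * 100_000
one_in_n = births / ecmo_cases

print(round(per_100k, 1))   # ~5.2 cases per 100,000 births
print(round(one_in_n, -3))  # ~1 case per 19,000 births
```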


COVID-19 , Extracorporeal Membrane Oxygenation , Respiratory Distress Syndrome , Respiratory Insufficiency , Pregnancy , Infant, Newborn , Humans , Female , Extracorporeal Membrane Oxygenation/methods , Israel/epidemiology , Retrospective Studies , Respiratory Distress Syndrome/epidemiology , Respiratory Distress Syndrome/etiology , Respiratory Distress Syndrome/therapy , Respiratory Insufficiency/epidemiology , Respiratory Insufficiency/therapy , Respiratory Insufficiency/etiology
6.
Clin Nutr ; 42(9): 1602-1609, 2023 09.
Article En | MEDLINE | ID: mdl-37480797

PURPOSE: Nutritional therapy is essential to ICU care. Successful early enteral feeding is hindered by a lack of protocols, gastrointestinal intolerance, and feeding interruptions, leading to impaired nutritional intake. smART+ was developed as a nutrition management feeding platform controlling tube positioning, reflux, gastric pressure, and malnutrition. This study evaluated the potential of this new ICU care platform to deliver targeted nutrition and improve ICU outcomes. METHODS: Critically ill patients ≥18 years old, mechanically ventilated and enterally fed, were randomized to receive ESPEN-guideline-based nutrition or smART+-guided nutrition for 2-14 days. The primary endpoint was the average deviation from daily targeted nutrition, with energy targets determined by calorimetry. Secondary endpoints included gastric residual volumes, length of stay (LOS), and length of ventilation (LOV). RESULTS: smART+ achieved a mean deviation from daily targeted nutrition of 10.5% (n = 48) versus 34.3% for control (n = 50), p < 0.0001. LOS and LOV were decreased in the smART+ group versus control (mean LOS: 10.4 days versus 13.7; reduction of 3.3 days, adjusted HR 1.71, 95% CI 1.13-2.60, p = 0.012; mean LOV: 9.5 days versus 12.8 days; reduction of 3.3 days, adjusted HR 1.64, 95% CI 1.08-2.51, p = 0.021). Feeding goals were met (within ±10%) on 75.7% of days for smART+ versus 23.3% for control (p < 0.001). No treatment-related adverse events occurred in either group. The study was stopped for success at a planned interim analysis of the first 100 patients. CONCLUSION: The smART+ Platform improved adherence to feeding goals and reduced LOS and LOV versus standard of care in critically ill patients. TRIAL REGISTRATION: NCT04098224; registered September 23, 2019.


Critical Illness , Enteral Nutrition , Humans , Adolescent , Critical Illness/therapy , Nutritional Status , Calorimetry , Critical Care
7.
Nutrients ; 15(12)2023 Jun 10.
Article En | MEDLINE | ID: mdl-37375609

BACKGROUND: The association between gastrointestinal intolerance during early enteral nutrition (EN) and adverse clinical outcomes in critically ill patients is controversial. We aimed to assess the prognostic value of enteral feeding intolerance (EFI) markers during the early ICU stay and to predict early EN failure using a machine learning (ML) approach. METHODS: We performed a retrospective analysis of data from adult patients admitted to the Beilinson Hospital ICU between January 2011 and December 2018 for more than 48 h who received EN. Clinical data collected during the first 72 h after admission, including demographics, severity scores, EFI markers, and medications, were analyzed by ML algorithms. Prediction performance was assessed by the area under the receiver operating characteristic curve (AUCROC) of a ten-fold cross-validation set. RESULTS: The dataset comprised 1584 patients. The mean cross-validation AUCROCs for 90-day mortality and early EN failure were 0.73 (95% CI 0.71-0.75) and 0.71 (95% CI 0.67-0.74), respectively. A gastric residual volume above 250 mL on the second day was an important component of both prediction models. CONCLUSIONS: ML highlighted the EFI markers that predict poor 90-day outcomes and early EN failure, supporting early recognition of at-risk patients. These results need to be confirmed in prospective and external validation studies.
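The evaluation approach described above (ten-fold cross-validated AUROC) can be sketched as follows; the classifier, features, and labels here are synthetic stand-ins, since the abstract does not specify the algorithms used:

```python
# Sketch of ten-fold cross-validated AUROC estimation on synthetic data.
# RandomForest is a stand-in; the study's actual models are not specified.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 5))  # stand-ins for severity scores, GRV, meds
y = (X[:, 0] + rng.normal(size=300) > 0).astype(int)  # stand-in outcome

aucs = cross_val_score(RandomForestClassifier(random_state=0), X, y,
                       cv=10, scoring="roc_auc")
print(f"mean AUROC over 10 folds: {aucs.mean():.2f}")
```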


Critical Illness , Enteral Nutrition , Adult , Humans , Infant, Newborn , Enteral Nutrition/adverse effects , Enteral Nutrition/methods , Prognosis , Retrospective Studies , Hospitalization
8.
J Crit Care ; 78: 154351, 2023 12.
Article En | MEDLINE | ID: mdl-37348187

INTRODUCTION: Communication with ventilated patients in the intensive care unit (ICU) is challenging. This may lead to anxiety and frustration, potentially contributing to the development of delirium. Various technologies, such as eye-tracking devices, have been employed to facilitate communication, with varying degrees of success. The EyeControl-Med device is a novel technology that delivers audio content and allows patients to interact via eye movements, potentially enabling better communication in this setting. The aim of this exploratory concept study was to assess communication capabilities and delirium incidence using the EyeControl-Med device in critically ill patients unable to generate speech. MATERIAL AND METHODS: A single-arm pilot study of patients in a mixed ICU. Patients were approached for consent if they were invasively ventilated and/or tracheotomized, hence unable to generate speech, but had no severe cognitive or sensory impairment that could prevent proper usage. Patients underwent at least 3 sessions with the EyeControl-Med device administered by a speech-language pathologist. Communication and consciousness were assessed using the Loewenstein Communication Scale (LCS) tool during the first and last sessions. Delirium was assessed using a computerized CAM-ICU questionnaire. RESULTS: 15 patients were included, 40% of whom were diagnosed with COVID-19. All patients completed three to seven usage sessions. The mean LCS score improved by 19.3 points (p < 0.0001), with each of its five components showing significant improvements as well. The mean number of errors on the CAM-ICU questionnaire decreased from 6.5 to 2.5 (p = 0.0006), indicating a lower incidence of delirium. No adverse effects were observed. CONCLUSION: The EyeControl-Med device may facilitate communication and reduce the manifestations and duration of delirium in ventilated critically ill patients. Controlled studies are required to establish this effect.


COVID-19 , Delirium , Humans , Delirium/diagnosis , Pilot Projects , Critical Illness/therapy , Critical Illness/psychology , Intensive Care Units , Communication , Respiration, Artificial/adverse effects
9.
Transpl Infect Dis ; 25(2): e14036, 2023 Apr.
Article En | MEDLINE | ID: mdl-36880576

BACKGROUND: Management of infections due to carbapenem-resistant Enterobacterales (CRE) in solid organ transplant (SOT) recipients remains a difficult challenge. The INCREMENT-SOT-CPE score was developed specifically in SOT recipients to stratify mortality risk, but an external validation is lacking. METHODS: Multicenter retrospective cohort study of liver transplant (LT) recipients colonized with CRE who developed infection after transplant over a 7-year period. The primary endpoint was all-cause 30-day mortality from infection onset. A comparison between INCREMENT-SOT-CPE and other selected scores was performed. A two-level mixed-effects logistic regression model with random effects for the center was fitted. Performance characteristics at the optimal cut-point were calculated. Multivariable Cox regression analysis of risk factors for all-cause 30-day mortality was carried out. RESULTS: Overall, 250 CRE carriers developed infection after LT and were analyzed. The median age was 55 years (interquartile range [IQR]: 46-62) and 157 were males (62.8%). All-cause 30-day mortality was 35.6%. A sequential organ failure assessment (SOFA) score ≥ 11 showed a sensitivity, specificity, positive predictive value (PPV), negative predictive value (NPV), and accuracy of 69.7%, 76.4%, 62.0%, 82.0%, and 74.0%, respectively. An INCREMENT-SOT-CPE ≥ 11 showed a sensitivity, specificity, PPV, NPV, and accuracy of 73.0%, 62.1%, 51.6%, 80.6%, and 66.0%, respectively. In multivariable analysis, acute renal failure, prolonged mechanical ventilation, INCREMENT-SOT-CPE score ≥ 11, and SOFA score ≥ 11 were independently associated with all-cause 30-day mortality, while a tigecycline-based targeted regimen was found to be protective. CONCLUSIONS: Both INCREMENT-SOT-CPE ≥ 11 and SOFA ≥ 11 were identified as strong predictors of all-cause 30-day mortality in a large cohort of CRE carriers developing infection after LT.
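The cut-point performance characteristics reported above (sensitivity, specificity, PPV, NPV, accuracy at a score threshold of ≥11) follow directly from a 2×2 confusion matrix; here is a minimal sketch with synthetic scores and outcomes, not the study's data:

```python
# Metrics at a score cutoff, computed from a confusion matrix.
# Scores and outcomes below are synthetic, for illustration only.
def cutpoint_metrics(scores, outcomes, cutoff=11):
    tp = sum(1 for s, y in zip(scores, outcomes) if s >= cutoff and y)
    fp = sum(1 for s, y in zip(scores, outcomes) if s >= cutoff and not y)
    fn = sum(1 for s, y in zip(scores, outcomes) if s < cutoff and y)
    tn = sum(1 for s, y in zip(scores, outcomes) if s < cutoff and not y)
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
        "accuracy": (tp + tn) / len(scores),
    }

scores = [12, 8, 15, 5, 11, 9]
died_30d = [True, False, True, False, False, True]
print(cutpoint_metrics(scores, died_30d))
```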


Liver Transplantation , Organ Transplantation , Male , Humans , Middle Aged , Female , Organ Transplantation/adverse effects , Liver Transplantation/adverse effects , Carbapenems , Retrospective Studies , Risk Factors , Transplant Recipients
10.
J Crit Care ; 74: 154211, 2023 04.
Article En | MEDLINE | ID: mdl-36630859

PURPOSE: Vasopressin has become an important vasopressor for maintaining adequate mean arterial pressure in critically ill patients. Diabetes insipidus (DI) is a rare syndrome characterized by the excretion of a large volume of diluted urine, inappropriate for water homeostasis. We noticed that several COVID-19 patients developed excessive polyuria suggestive of DI, with a concomitant plasma sodium level increase and/or low urine osmolality, and a temporal relationship between vasopressin treatment cessation and polyuria periods. We reviewed those cases to better describe this phenomenon. METHODS: We retrospectively collected data on COVID-19 ECMO patients (July 6, 2020 to November 30, 2021) from the electronic medical records. DI was diagnosed by examining urine output, urine osmolality (if available), plasma sodium level, and plasma osmolality. We described the clinical course of DI episodes and compared baseline characteristics between patients who developed DI and those who did not. RESULTS: Of 37 patients, 12 had 18 episodes of DI. These patients were 7 years younger and had lower severity scores (APACHE-II and SOFA). No mortality difference was seen between groups. 17 episodes occurred after vasopressin discontinuation; 14 episodes were treated with vasopressin reinstitution. DI lasted for a median of 21 h, with a median sodium increase of 14 mEq/L. CONCLUSIONS: The prevalence of temporary DI after vasopressin discontinuation in COVID-19 ECMO patients might be higher than previously described for vasopressin-treated patients.


COVID-19 , Diabetes Insipidus , Vasopressins , Humans , COVID-19/complications , Critical Illness , Diabetes Insipidus/complications , Diabetes Insipidus/diagnosis , Diabetes Insipidus/drug therapy , Polyuria/complications , Polyuria/diagnosis , Polyuria/drug therapy , Retrospective Studies , Sodium/urine , Vasopressins/therapeutic use
11.
Sci Rep ; 12(1): 10573, 2022 06 22.
Article En | MEDLINE | ID: mdl-35732690

In hypoxemic patients at risk for developing respiratory failure, the decision to initiate invasive mechanical ventilation (IMV) may be extremely difficult, even more so among patients suffering from COVID-19. Delayed recognition of respiratory failure may translate into poor outcomes, emphasizing the need for stronger predictive models for IMV necessity. We developed a two-step model; the first step was to train a machine learning predictive model on a large dataset of non-COVID-19 critically ill hypoxemic patients from the United States (MIMIC-III). The second step was to apply transfer learning and adapt the model to a smaller COVID-19 cohort. An XGBoost algorithm was trained on data from the MIMIC-III database to predict if a patient would require IMV within the next 6, 12, 18 or 24 h. Patients' datasets were used to construct the model as time series of dynamic measurements and laboratory results obtained during the previous 6 h with additional static variables, applying a sliding time-window once every hour. We validated the adaptation algorithm on a cohort of 1061 COVID-19 patients from a single center in Israel, of whom 160 later deteriorated and required IMV. The new XGBoost model for the prediction of the IMV onset was trained and tested on MIMIC-III data and proved to be predictive, with an AUC of 0.83 on a shortened set of features, excluding the clinician's settings, and an AUC of 0.91 when the clinician settings were included. Applying these models "as is" (no adaptation applied) on the dataset of COVID-19 patients degraded the prediction results to AUCs of 0.78 and 0.80, without and with the clinician's settings, respectively. Applying the adaptation on the COVID-19 dataset increased the prediction power to AUCs of 0.94 and 0.97, respectively. Although the AUCs were good, the overall precision was low; however, the precision of the prediction increased as the prediction probability rose.
Our model was successfully trained on a specific dataset, and after adaptation it showed promise in predicting outcome on a completely different dataset. This two-step model successfully predicted the need for invasive mechanical ventilation 6, 12, 18 or 24 h in advance in both general ICU population and COVID-19 patients. Using the prediction probability as an indicator of the precision carries the potential to aid the decision-making process in patients with hypoxemic respiratory failure despite the low overall precision.
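The sliding time-window featurization described above (summarizing the previous 6 h of dynamic measurements once every hour) can be sketched as follows; the signal and field names are synthetic illustrations, not MIMIC-III's schema:

```python
# Hedged sketch of hourly sliding-window featurization: for each hour,
# summarize the prior 6 h of a dynamic measurement (synthetic heart rate).
import numpy as np
import pandas as pd

hours = pd.date_range("2020-01-01", periods=24, freq="h")
vitals = pd.DataFrame({"time": hours,
                       "heart_rate": np.linspace(80, 120, 24)})

def window_features(df: pd.DataFrame, lookback_h: int = 6) -> pd.DataFrame:
    """One feature row per hour, summarizing the prior `lookback_h` hours."""
    s = df.set_index("time")["heart_rate"]
    return pd.DataFrame({
        "hr_mean": s.rolling(f"{lookback_h}h").mean(),
        "hr_max": s.rolling(f"{lookback_h}h").max(),
        "hr_last": s,  # most recent value in the window
    })

print(window_features(vitals).tail())
```

In the study's setup, rows like these (plus static variables) would feed the XGBoost classifier at each hourly step.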


COVID-19 , Respiratory Insufficiency , COVID-19/therapy , Critical Illness/therapy , Humans , Machine Learning , Respiration, Artificial , Respiratory Insufficiency/therapy
12.
Nutrients ; 14(7)2022 Mar 23.
Article En | MEDLINE | ID: mdl-35405945

INTRODUCTION: Hypophosphatemia may prolong ventilation and induce weaning failure. Some studies have associated hypophosphatemia with increased mortality. Starting or restarting nutrition in a critically ill patient may be associated with refeeding syndrome and hypophosphatemia. The correlation between nutrition, mechanical ventilation, and hypophosphatemia has not yet been fully elucidated. METHODS: A retrospective cohort study of 825 admissions during two consecutive years was conducted. Demographic and clinical data were obtained from the electronic medical chart. Hypophosphatemia was defined as a phosphate level below 2.5 mg/dL (0.81 mmol/L) in the first 72 h of ICU admission. Comparisons of baseline characteristics and outcomes and multivariate analysis were performed. RESULTS: A total of 324 (39.27%) patients had hypophosphatemia during the first 72 h of ICU admission. Patients with hypophosphatemia tended to be younger, with lower APACHE-II, SOFA24, and ΔSOFA scores. They had a longer length of stay and length of ventilation, a higher prevalence of prolonged ventilation, and decreased mortality. Their energy deficit was lower. Hypophosphatemia severity had no effect on these results. In multivariate analysis, hypophosphatemia was not statistically significant with respect to either mortality or survivors' length of ventilation, but lower average daily energy deficit and SOFA24 were statistically significant with respect to survivors' length of ventilation. CONCLUSION: Hypophosphatemia had no effect on mortality or length of ventilation. A lower average daily energy deficit is associated with a longer length of ventilation among survivors.
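The threshold's unit conversion (2.5 mg/dL ≈ 0.81 mmol/L) follows from the molar mass of phosphorus (about 30.97 g/mol); a quick check:

```python
# Phosphate unit conversion: mg/dL -> mmol/L via the molar mass of P.
P_MOLAR_MASS = 30.97  # g/mol (phosphorus)

def phosphate_mgdl_to_mmoll(mg_dl: float) -> float:
    # mg/dL -> mg/L (x10), then mg/L -> mmol/L (divide by molar mass)
    return mg_dl * 10 / P_MOLAR_MASS

print(round(phosphate_mgdl_to_mmoll(2.5), 2))  # -> 0.81
```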


Hypophosphatemia , Intensive Care Units , Critical Illness , Humans , Length of Stay , Respiration, Artificial , Retrospective Studies
13.
Clin Infect Dis ; 73(4): e955-e966, 2021 08 16.
Article En | MEDLINE | ID: mdl-33564840

BACKGROUND: Patients colonized with carbapenem-resistant Enterobacteriaceae (CRE) are at higher risk of developing CRE infection after liver transplantation (LT), with associated high morbidity and mortality. A prediction model for CRE infection after LT among carriers could be useful for targeting preventive strategies. METHODS: Multinational multicenter cohort study of consecutive adult patients who underwent LT and were colonized with CRE before or after LT, from January 2010 to December 2017. Risk factors for CRE infection were analyzed by univariate analysis and by a Fine-Gray subdistribution hazard model, with death as a competing event. A nomogram to predict 30- and 60-day CRE infection risk was created. RESULTS: A total of 840 LT recipients found to be colonized with CRE before (n = 203) or after (n = 637) LT were enrolled. CRE infection was diagnosed in 250 (29.7%) patients within 19 (interquartile range [IQR], 9-42) days after LT. Pre- and post-LT colonization, multisite post-LT colonization, prolonged mechanical ventilation, acute renal injury, and surgical reintervention were retained in the prediction model. Median 30- and 60-day predicted risk was 15% (IQR, 11-24) and 21% (IQR, 15-33), respectively. Discrimination and prediction accuracy for CRE infection was acceptable on the derivation (area under the curve [AUC], 74.6; Brier index, 16.3) and bootstrapped validation datasets (AUC, 73.9; Brier index, 16.6). Decision-curve analysis suggested a net benefit of model-directed intervention over default strategies (treat all, treat none) when CRE infection probability exceeded 10%. The risk prediction model is freely available as a mobile application at https://idbologna.shinyapps.io/CREPostOLTPredictionModel/. CONCLUSIONS: Our clinical prediction tool could enable better targeting of interventions for CRE infection after transplant.


Carbapenem-Resistant Enterobacteriaceae , Enterobacteriaceae Infections , Liver Transplantation , Adult , Anti-Bacterial Agents/therapeutic use , Carbapenems/pharmacology , Cohort Studies , Enterobacteriaceae Infections/drug therapy , Enterobacteriaceae Infections/epidemiology , Humans , Risk Factors
14.
Neurotoxicology ; 78: 99-105, 2020 05.
Article En | MEDLINE | ID: mdl-32084435

Organophosphates (OPs) are widely used as pesticides and have been employed as warfare agents. OPs inhibit acetylcholinesterase, leading to over-stimulation of cholinergic synapses, and can cause status epilepticus (SE). OP poisoning can result in irreversible brain damage and death. Despite termination of SE, recurrent seizures and abnormal brain activity remain common sequelae, often associated with long-term neural damage and cognitive dysfunction. Therefore, early treatment for prevention of seizures is of high interest. Using a rat model of paraoxon poisoning, we tested the efficacy of different neuroprotective and anti-epileptic drugs (AEDs) in suppressing early seizures and preventing brain damage. Electrocorticographic recordings were performed before, during, and after injection of 4.5 LD50 paraoxon, followed by injections of atropine and toxogonin (obidoxime) to prevent death. Thirty minutes later, rats were injected with midazolam alone or in combination with different AEDs (lorazepam, valproic acid, phenytoin) or neuroprotective drugs (losartan, isoflurane). Outcome measures included SE duration, early seizure frequency, and epileptiform activity duration in the first 24 h after poisoning. To assess delayed brain damage, we performed T2-weighted magnetic resonance imaging one month after poisoning. SE duration and the number of recurrent seizures were not affected by the addition of any of the drugs tested. Delayed brain injury was most prominent in the septum, striatum, amygdala, and piriform network. Only isoflurane anesthesia significantly reduced brain damage. We show that acute treatment with isoflurane, but not AEDs, reduces brain damage following SE. This may offer a new therapeutic approach for exposed individuals.


Anticonvulsants/administration & dosage , Brain/drug effects , Isoflurane/administration & dosage , Midazolam/administration & dosage , Paraoxon/toxicity , Status Epilepticus/prevention & control , Animals , Brain/pathology , Brain/physiopathology , Disease Models, Animal , Male , Rats, Sprague-Dawley , Status Epilepticus/chemically induced , Status Epilepticus/pathology
15.
Eur J Gastroenterol Hepatol ; 31(9): 1135-1140, 2019 Sep.
Article En | MEDLINE | ID: mdl-30896551

BACKGROUND: Early infections are common during the first month after liver transplantation (LT), whereas no consensus exists on the optimal prophylactic antimicrobial therapy. We aimed to evaluate the effectiveness of cefazolin perioperative prophylaxis in LT. PATIENTS AND METHODS: We documented our experience with single-dose cefazolin as prophylaxis for LT. Infections occurring within 30 days following LT during 2006-2015 were documented retrospectively. Univariate and multivariate analyses of risk factors for infection were carried out. RESULTS: Among 113 LT recipients receiving cefazolin as prophylaxis, infections occurred in 50 (44%) patients, including surgical site infections (n=24, 21%) and bacteremia (n=14, 12%). Bacteria resistant to cefazolin were documented in 59/72 (82%) isolates. Enterococcal infections were documented in 6% (7/113). Almost half of the infections (44%) occurred in the first week following LT and the vast majority within 2 weeks. The 30-day mortality rate (7%, 8/113) was significantly higher among infected patients (7/50, 14% vs. 1/63, 1.6%, P=0.011). Model for End-stage Liver Disease score, age, and requirement for at least 5 U of packed red cells during transplantation were predictive for postoperative infections. CONCLUSION: In our center, cefazolin was insufficient as perioperative prophylaxis in LT. We suggest that all LT recipients should receive antibiotic prophylaxis targeting microorganisms on the basis of local bacterial ecology and patterns of resistance irrespective of preoperative or intraoperative risk assessment.


Anti-Bacterial Agents/administration & dosage , Antibiotic Prophylaxis , Bacteremia/prevention & control , Cefazolin/administration & dosage , Liver Transplantation/adverse effects , Surgical Wound Infection/prevention & control , Adult , Bacteremia/epidemiology , Bacteremia/microbiology , Female , Humans , Liver Failure/etiology , Liver Failure/pathology , Liver Failure/surgery , Male , Middle Aged , Retrospective Studies , Surgical Wound Infection/epidemiology , Surgical Wound Infection/microbiology
16.
Mil Med ; 180(6): 702-7, 2015 Jun.
Article En | MEDLINE | ID: mdl-26032387

OBJECTIVE: Specialized training of medical teams for chemical warfare agent (CWA) events is important to save lives. We aimed to evaluate the retention of knowledge (ROK) and self-perceived competency (SPC) of military medical personnel in delivering treatment during CWA events. METHODS: A questionnaire and a multiple-choice examination were sent to military physicians and paramedics, evaluating their ROK and SPC for CWA events (study group [SG]). Their assessment was compared to that of medical personnel immediately post training (reference group [RG]). The SG was subdivided into two groups: G1 (≤1 year past training) and G2 (>1 year past training). RESULTS: Overall, 135 participants responded (35 in the RG; 65% physicians). Self-reported ROK and SPC were significantly higher in the RG compared to the SG and in G1 compared to G2. Test scores were higher in the RG compared to the SG, but similar in the G1 and G2 groups. SPC was lower than ROK in the entire cohort and in subgroups. A moderate correlation was found between the self- and test-assessed scores (Pearson correlation coefficient 0.45, p < 0.001). Physicians received significantly (p = 0.01) higher test scores than paramedics in the RG. CONCLUSIONS: ROK and SPC among military medical personnel for treatment of CWA casualties deteriorate significantly as early as 1 year post training, with SPC declining more than ROK. Thus, we recommend CWA refresher training at least every year.


Allied Health Personnel/psychology , Chemical Warfare , Clinical Competence , Military Personnel , Physicians/psychology , Self Efficacy , Adult , Chemical Warfare Agents/toxicity , Female , Health Knowledge, Attitudes, Practice , Humans , Male , Military Medicine/education , Poisoning/therapy , Retention, Psychology , Surveys and Questionnaires , Time Factors , United States , Young Adult
17.
Toxicology ; 323: 19-25, 2014 Sep 02.
Article En | MEDLINE | ID: mdl-24881594

Poisoning with organophosphates (OPs) may induce status epilepticus (SE), leading to severe brain damage. Our objectives were to investigate whether OP-induced SE leads to the emergence of spontaneous recurrent seizures (SRSs), the hallmark of chronic epilepsy, and if so, to assess the efficacy of benzodiazepine therapy following SE onset in preventing epileptogenesis. We also explored early changes in hippocampal pyramidal cell excitability in this model. Adult rats were poisoned with paraoxon (450 µg/kg) and immediately treated with atropine (3 mg/kg) and obidoxime (20 mg/kg) to reduce acute mortality due to peripheral acetylcholinesterase inhibition. Electrical brain activity was assessed for two weeks during weeks 4-6 after poisoning using telemetric electrocorticographic intracranial recordings. All OP-poisoned animals developed SE, which could be suppressed by midazolam. Most (88%) rats not treated with midazolam developed SRSs, indicating that they had become chronically epileptic. Application of midazolam 1 min following SE onset had a significant antiepileptogenic effect (only 11% of the rats became epileptic; p = 0.001 compared to non-midazolam-treated rats). Applying midazolam 30 min after SE onset did not significantly prevent chronic epilepsy. The electrophysiological properties of CA1 pyramidal cells, assessed in hippocampal slices, were not altered by OP-induced SE. Thus, we show for the first time that a single episode of OP-induced SE in rats leads to the acquisition of chronic epilepsy, and that this epileptogenic outcome can be largely prevented by immediate, but not delayed, administration of midazolam. Extrapolating these results to humans would suggest that midazolam should be provided together with atropine and an oxime in the immediate pharmacological treatment of OP poisoning.


Antidotes/therapeutic use , Cholinesterase Inhibitors/toxicity , Epilepsy/prevention & control , Midazolam/therapeutic use , Paraoxon/toxicity , Status Epilepticus/chemically induced , Animals , Atropine/therapeutic use , Cholinesterase Reactivators/therapeutic use , Chronic Disease , Epilepsy/chemically induced , Muscarinic Agonists , Obidoxime Chloride/therapeutic use , Pesticides/toxicity , Pilocarpine , Rats , Rats, Sprague-Dawley , Status Epilepticus/physiopathology
18.
Harefuah ; 153(3-4): 199-205, 237, 2014.
Article He | MEDLINE | ID: mdl-24791566

Sulfur mustard (SM) is an alkylating chemical warfare agent of high military significance owing to its high toxicity, persistence and availability. SM has been widely used in military conflicts, most recently the Iran-Iraq war, in which more than 100,000 Iranians were exposed, one-third of whom still suffer from late effects. The severity of the delayed complications correlates with the extent, area and route of exposure. The clinical manifestations most commonly involve respiratory, ocular and dermal effects. Respiratory complications include dyspnea, cough, expectoration and various obstructive and restrictive lung diseases. Dermal complications include itching, a burning sensation, blisters, dry skin, dermatitis and pigmentary changes. Ocular complications include photophobia, red eye, tearing, corneal ulcers and blindness. Although the picture remains incomplete, the major mechanisms responsible for the clinical and pathological effects of SM are DNA alkylation and cross-linking, protein modification and membrane damage, together with induction of inflammatory mediators in the target tissues, causing extensive necrosis, apoptosis and loss of tissue structure. The current report reviews the long-term complications of SM exposure, focusing on new treatments tested in clinical trials conducted in humans. Such treatments include N-acetyl cysteine, bronchodilators, corticosteroids, interferon-gamma, furosemide and morphine for the respiratory complications. Treatment of ocular complications may entail invasive procedures addressing corneal damage, limbal ischemia and stem cell deficiency. Treatments for dermatological complications include antidepressants, pimecrolimus, Unna's boot, capsaicin, phenol and menthol, Aloe vera and olive oil, curcumin and interferon-gamma.


Chemical Warfare Agents/toxicity , Chemical Warfare , Mustard Gas/toxicity , Clinical Trials as Topic , Eye Diseases/chemically induced , Eye Diseases/physiopathology , Eye Diseases/therapy , Humans , Iran , Iraq , Respiratory Tract Diseases/chemically induced , Respiratory Tract Diseases/physiopathology , Respiratory Tract Diseases/therapy , Skin Diseases/chemically induced , Skin Diseases/pathology , Time Factors , Warfare