ABSTRACT
Severe respiratory infections can result in acute respiratory distress syndrome (ARDS)1. There are no effective pharmacological therapies that have been shown to improve outcomes for patients with ARDS. Although the host inflammatory response limits spread of the pathogen and eventually clears it, immunopathology is a major contributor to tissue damage and ARDS1,2. Here we demonstrate that respiratory viral infection induces distinct fibroblast activation states, which we term extracellular matrix (ECM)-synthesizing, damage-responsive and interferon-responsive states. We provide evidence that excess activity of damage-responsive lung fibroblasts drives lethal immunopathology during severe influenza virus infection. By producing ECM-remodelling enzymes, in particular the ECM protease ADAMTS4, and inflammatory cytokines, damage-responsive fibroblasts modify the lung microenvironment to promote robust immune cell infiltration at the expense of lung function. In three cohorts of human participants, the levels of ADAMTS4 in the lower respiratory tract were associated with the severity of infection with seasonal or avian influenza virus. A therapeutic agent that targets the ECM protease activity of damage-responsive lung fibroblasts could provide a promising approach to preserving lung function and improving clinical outcomes following severe respiratory infections.
Subject(s)
ADAMTS4 Protein/metabolism , Fibroblasts/enzymology , Fibroblasts/pathology , Influenza A virus/pathogenicity , Lung/pathology , Lung/physiopathology , ADAMTS4 Protein/antagonists & inhibitors , Animals , Birds/virology , Extracellular Matrix/enzymology , Gene Expression Profiling , Humans , Influenza in Birds/virology , Influenza, Human/pathology , Influenza, Human/therapy , Influenza, Human/virology , Interferons/immunology , Interferons/metabolism , Leukocyte Common Antigens/metabolism , Lung/enzymology , Lung/virology , Mice , Respiratory Distress Syndrome/enzymology , Respiratory Distress Syndrome/physiopathology , Respiratory Distress Syndrome/therapy , Respiratory Distress Syndrome/virology , Seasons , Single-Cell Analysis , Stromal Cells/metabolism
ABSTRACT
An amendment to this paper has been published and can be accessed via a link at the top of the paper.
ABSTRACT
Human-to-swine transmission of influenza A virus (IAV) occurs repeatedly, leading to sustained transmission and increased diversity in swine; human seasonal H3N2 introductions occurred in the 1990s and 2010s and were maintained in North American swine. Swine H3N2 strains were subsequently associated with zoonotic infections, highlighting the need to understand the risk of endemic swine IAV to humans. We quantified antigenic distances between swine H3N2 and human seasonal vaccine strains from 1973 to 2014 using a panel of monovalent antisera raised in pigs in hemagglutination inhibition (HI) assays. Swine H3N2 lineages retained the closest antigenic similarity to human vaccine strains from the decade of incursion. Swine lineages from the 1990s were antigenically more similar to human vaccine strains of the mid-1990s but had substantial distance from recent human vaccine strains. In contrast, lineages from the 2010s were closer to human vaccine strains from 2011 and 2014 and most antigenically distant from human vaccine strains prior to 2007. HI assays using ferret antisera demonstrated that swine lineages from the 1990s and 2010s had significant fold reductions compared to the homologous HI titer of the nearest pandemic preparedness candidate vaccine virus (CVV) or seasonal vaccine strain. The assessment of postinfection and postvaccination human serum cohorts demonstrated limited cross-reactivity to swine H3N2 from the 1990s, especially in older adults born before the 1970s. We identified swine strains against which human populations likely lack immunity, and which are not covered by a current human seasonal vaccine or CVV, to prioritize in future human CVV strain selection. IMPORTANCE Human H3N2 influenza A viruses spread to pigs in North America in the 1990s and more recently in the 2010s. These cross-species events led to sustained circulation and increased H3N2 diversity in pig populations. 
The evolution of H3N2 in swine led to a reduced similarity to human seasonal H3N2 and the vaccine strains used to protect human populations. We quantified the antigenic phenotypes and found that North American swine H3N2 lineages retained more antigenic similarity to historical human vaccine strains from the decade of incursion but had substantial differences compared to recent human vaccine strains. Additionally, pandemic preparedness vaccine strains demonstrated a loss of similarity to contemporary swine strains. Finally, human sera revealed that although these adults had antibodies against human H3N2 strains, many had limited immunity to swine H3N2, especially older adults born before 1970. Antigenic assessment of swine H3N2 provides critical information for pandemic preparedness and candidate vaccine development.
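The antigenic comparisons above are derived from HI titers; a conventional way to express such a distance is the log2 fold-reduction of a cross-reactive titer relative to the homologous titer. A minimal sketch (the titer values are hypothetical, for illustration only, not data from the study):

```python
import math

def fold_reduction_log2(homologous_titer, cross_titer):
    """Log2 fold-reduction of a cross-reactive HI titer relative to the
    homologous titer; a value of 3 corresponds to an 8-fold drop."""
    return math.log2(homologous_titer / cross_titer)

# Hypothetical HI titers, for illustration only
print(fold_reduction_log2(1280, 160))  # an 8-fold reduction -> 3.0
```

Each log2 unit corresponds to one two-fold dilution step in the HI assay, which is why fold reductions are usually reported on this scale.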
Subject(s)
Influenza A Virus, H3N2 Subtype/genetics , Orthomyxoviridae Infections/virology , Viral Zoonoses/virology , Animals , Antigenic Drift and Shift , Antigenic Variation , Hemagglutinin Glycoproteins, Influenza Virus/genetics , Humans , Immune Sera/immunology , Influenza A Virus, H3N2 Subtype/classification , Influenza A Virus, H3N2 Subtype/immunology , Influenza Vaccines/genetics , Influenza, Human/transmission , Influenza, Human/virology , Orthomyxoviridae Infections/transmission , Phylogeny , Risk Assessment , Swine , Viral Zoonoses/transmission
ABSTRACT
The utility of PBPK models in support of drug development has been well documented. During the discovery stage, PBPK has increasingly been applied for early risk assessment, prediction of human dose, toxicokinetic dose projection and early formulation assessment. Previous review articles have proposed model building and application strategies for PBPK-based first-in-human predictions, with comprehensive descriptions of the individual components of PBPK models, including decision trees, based on comprehensive literature reviews, to guide the application of PBPK in the discovery setting. The goal of this mini review is to provide additional guidance on the real-world application of PBPK in support of the discovery stage of drug development, outlining the typical steps involved in developing and applying a PBPK model to assist in decision making. We have illustrated our recommended approach through case examples in which PBPK has been successfully applied to aid human PK projection, candidate selection and prediction of drug interaction liability for parent and metabolite. Through these case studies, we have highlighted fundamental issues, including pre-verification in preclinical species, the application of empirical scalars in the prediction of in vivo clearance from in vitro systems, in silico prediction of permeability and the exploration of aqueous and biorelevant solubility data to predict dissolution. In addition, current knowledge gaps have been highlighted and future directions proposed. Significance Statement Through description of three case studies, we have highlighted the fundamental principles of PBPK application during drug discovery. 
These include pre-verification of the model in preclinical species, application of empirical scalars where necessary in the prediction of clearance, in silico prediction of permeability, and the exploration of aqueous and biorelevant solubility data to predict dissolution. In addition, current knowledge gaps have been highlighted and future directions proposed.
ABSTRACT
The continual emergence of novel influenza A strains from non-human hosts requires constant vigilance and ongoing research to identify strains that may pose a human public health risk. Since 1999, canine H3 influenza A viruses (CIVs) have caused many thousands, and possibly millions, of respiratory infections in dogs in the United States. While no human infections with CIVs have been reported to date, these viruses could pose a zoonotic risk. In these studies, the National Institute of Allergy and Infectious Diseases (NIAID) Centers of Excellence for Influenza Research and Surveillance (CEIRS) network collaboratively demonstrated that CIVs replicated in some primary human cells and transmitted effectively in mammalian models. While people born after 1970 had little or no pre-existing humoral immunity against CIVs, the viruses were sensitive to existing antivirals, and we identified a panel of H3 cross-reactive human monoclonal antibodies (hmAbs) that could have prophylactic and/or therapeutic value. Our data suggest that these CIVs pose a low risk to humans. Importantly, we showed that the CEIRS network could work together to provide basic research information important for characterizing emerging influenza viruses, although there were valuable lessons learned.
Subject(s)
Communicable Diseases, Emerging/veterinary , Dog Diseases/virology , Influenza A Virus, H3N2 Subtype/isolation & purification , Influenza A Virus, H3N8 Subtype/isolation & purification , Influenza A virus/isolation & purification , Zoonoses/virology , Animals , Communicable Diseases, Emerging/transmission , Communicable Diseases, Emerging/virology , Dog Diseases/transmission , Dogs , Ferrets , Guinea Pigs , Humans , Influenza A Virus, H3N2 Subtype/classification , Influenza A Virus, H3N2 Subtype/genetics , Influenza A Virus, H3N8 Subtype/classification , Influenza A Virus, H3N8 Subtype/genetics , Influenza A virus/classification , Influenza A virus/genetics , Influenza, Human/transmission , Influenza, Human/virology , Mice, Inbred BALB C , Mice, Inbred C57BL , Mice, Inbred DBA , United States , Zoonoses/transmission
ABSTRACT
Drugs that modulate cytokine levels are often used for the treatment of cancer as well as inflammatory or immunologic disorders. Pharmacokinetic drug-biologic interactions (DBIs) may arise from suppression or elevation of cytochrome P450 (P450) enzymes caused by the increase or decrease in cytokine levels after administration of these therapies. In vitro and in vivo evidence demonstrates a clear link between raised interleukin (IL)-6 levels and P450 suppression, in particular of CYP3A4. Nevertheless, changes in IL-6 levels in vivo rarely lead to significant drug interactions (area under the curve and Cmax ratios < 2-fold). The clinical significance of such interactions therefore remains questionable and depends on the therapeutic index of the small molecule therapy. Physiologically based pharmacokinetic (PBPK) modeling has been used successfully to predict the impact of raised IL-6 on P450 activities. Beyond IL-6, published data show little evidence that IL-8, IL-10, and IL-17 suppress P450 enzymes. In vitro data suggest that IL-1β, IL-2, tumor necrosis factor (TNF)-α, and interferon (IFN)-γ can cause suppression of P450 enzymes. However, although there is an in vivo link between IL-6 levels and P450 suppression, the evidence to support a direct effect of IL-2, IL-8, IL-10, IL-17, IFN-γ, TNF-α, or vascular endothelial growth factor on P450 activity is inconclusive. This commentary will discuss the relevance of such drug-biologic interactions and whether current PBPK models considering only IL-6 are sufficient. SIGNIFICANCE STATEMENT: This commentary summarizes the current in vitro and in vivo literature regarding cytokine-mediated cytochrome P450 suppression and compares the relative suppressive potential of different cytokines in reference to interleukin (IL)-6. 
It also discusses the relevance of drug-biologic interactions to therapeutic use of small molecule drugs and whether current physiologically based pharmacokinetic models considering only IL-6 are sufficient to predict the extent of drug-biologic interactions.
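To illustrate why AUC and Cmax ratios below 2-fold are usually considered modest, interaction magnitude is often binned by the victim drug's AUC ratio. The 1.25-, 2- and 5-fold cut-offs in this sketch are commonly cited regulatory thresholds, assumed here for illustration rather than taken from the text above:

```python
def ddi_strength(victim_auc_ratio):
    """Classify a drug interaction by the victim drug's AUC ratio
    (exposure with / without the perpetrator). The 1.25-, 2- and 5-fold
    cut-offs are commonly cited regulatory thresholds, assumed here for
    illustration; they are not stated in the commentary itself."""
    if victim_auc_ratio >= 5:
        return "strong"
    if victim_auc_ratio >= 2:
        return "moderate"
    if victim_auc_ratio >= 1.25:
        return "weak"
    return "below the usual threshold of concern"

# Cytokine-mediated P450 suppression typically keeps AUC ratios under 2-fold
print(ddi_strength(1.8))  # -> weak
```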
Subject(s)
Biological Products , Interleukin-6 , Cytochrome P-450 Enzyme System/metabolism , Cytokines , Drug Interactions , Interleukin-10 , Interleukin-17 , Interleukin-2 , Interleukin-6/metabolism , Interleukin-8 , Pharmaceutical Preparations/metabolism , Tumor Necrosis Factor-alpha , Vascular Endothelial Growth Factor A
ABSTRACT
BACKGROUND: Peri-intubation cardiac arrest is an uncommon but serious complication of endotracheal intubation in the emergency department. Although several risk factors have been previously identified, this study aimed to comprehensively identify risk factors associated with peri-intubation cardiac arrest. METHODS: This retrospective, nested case-control study conducted from January 1, 2016 to December 31, 2020 analyzed variables including demographic characteristics, triage and pre-intubation vital signs, medications, and laboratory data. Univariate analysis and multivariable logistic regression models were used to compare clinical factors between patients with peri-intubation cardiac arrest and patients without cardiac arrest. RESULTS: Of the 6983 patients intubated during the study period, 5130 met the inclusion criteria; 92 (1.8%) met the criteria for peri-intubation cardiac arrest, and 276 age- and sex-matched patients formed the control group. Before intubation, systolic and diastolic blood pressures were lower (104 vs. 136.5 mmHg, p < 0.01, and 59.5 vs. 78 mmHg, p < 0.01, respectively) and the shock index was higher in patients with peri-intubation cardiac arrest than in the control group (0.97 vs. 0.83, p < 0.0001). Cardiogenic pulmonary edema as an indication for intubation (adjusted odds ratio [aOR]: 5.921, 95% confidence interval [CI]: 1.044-33.57, p = 0.04), systolic blood pressure < 90 mmHg before intubation (aOR: 5.217, 95% CI: 1.484-18.34, p = 0.01), and elevated lactate levels (aOR: 1.012, 95% CI: 1.002-1.022, p = 0.01) were independent risk factors for peri-intubation cardiac arrest. CONCLUSIONS: Patients with hypotension before intubation have a higher risk of peri-intubation cardiac arrest in the emergency department. Future studies are needed to evaluate the influence of resuscitation before intubation and to establish airway management strategies that avoid serious complications.
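The shock index cited above is simply heart rate divided by systolic blood pressure. A minimal sketch combining it with the study's SBP < 90 mmHg risk factor; the shock-index and lactate cut-offs are illustrative assumptions, not thresholds from the study:

```python
def shock_index(heart_rate, systolic_bp):
    """Shock index = heart rate (beats/min) divided by systolic BP (mmHg)."""
    return heart_rate / systolic_bp

def high_risk_before_intubation(heart_rate, systolic_bp, lactate_mmol_per_l):
    """Hypothetical screening rule built from the risk factors reported above.
    The SBP < 90 mmHg cut-off comes from the study; the shock-index (0.9)
    and lactate (2.0 mmol/L) thresholds are illustrative assumptions."""
    return (systolic_bp < 90
            or shock_index(heart_rate, systolic_bp) >= 0.9
            or lactate_mmol_per_l > 2.0)

# A hypothetical patient with HR 101 and SBP 104 reproduces the arrest-group
# shock index of about 0.97
print(round(shock_index(101, 104), 2))  # -> 0.97
```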
Subject(s)
Heart Arrest , Case-Control Studies , Emergency Service, Hospital , Heart Arrest/epidemiology , Heart Arrest/etiology , Heart Arrest/therapy , Humans , Intubation, Intratracheal/adverse effects , Retrospective Studies , Risk Factors
ABSTRACT
OBJECTIVES: Early adequate resuscitation of patients with trauma is crucial in preventing shock and early mortality. Thus, we aimed to determine the performance of the inferior vena cava (IVC) volume and other risk factors and scores in predicting massive transfusion and mortality. METHODS: We included all patients with trauma who underwent computed tomography (CT) of the torso, including the abdominal area, in our emergency department (ED) from January 2014 to January 2017. We calculated the 3-dimensional IVC volume from the left renal vein to the IVC bifurcation. The primary outcome was the performance of IVC volume in predicting massive transfusion, and the secondary outcome was its performance in predicting 24-hour and 30-day in-hospital mortality. RESULTS: Among the 236 patients with trauma, 7.6% received massive transfusions. The IVC volume and revised trauma score (RTS) were independent predictors of massive transfusion (adjusted odds ratio [OR]: 0.79 vs 1.86; 95% confidence interval [CI], 0.71-0.89 vs 1.4-2.47, respectively). Both parameters showed a good area under the curve (AUC) for the prediction of massive transfusion (adjusted AUC: 0.83 and 0.82; 95% CI, 0.74-0.92 vs 0.72-0.93, respectively). Patients with a large IVC volume (fourth quartile) were less likely to receive massive transfusion than those with a small IVC volume (first quartile; ≥28.29 mL: 0% vs <15.08 mL: 20.3%; OR: 0.13, 95% CI, 0.03-0.66). CONCLUSIONS: The IVC volume measured on CT and the RTS are independent predictors of massive transfusion in patients with trauma in the ED.
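The AUCs reported above have a useful probabilistic reading: the AUC equals the probability that a randomly chosen patient who received massive transfusion is ranked higher by the predictor than a randomly chosen patient who did not (the Mann-Whitney interpretation). A small sketch with hypothetical scores:

```python
def roc_auc(scores_positive, scores_negative):
    """AUC computed as the Mann-Whitney probability: the chance that a
    randomly chosen positive case scores higher than a randomly chosen
    negative case (ties count half)."""
    wins = ties = 0
    for p in scores_positive:
        for n in scores_negative:
            if p > n:
                wins += 1
            elif p == n:
                ties += 1
    return (wins + 0.5 * ties) / (len(scores_positive) * len(scores_negative))

# Hypothetical predictor scores, for illustration only
print(round(roc_auc([0.9, 0.8, 0.7], [0.6, 0.8, 0.2]), 4))  # -> 0.8333
```

For a predictor like IVC volume, where smaller values indicate higher risk, the scores can be negated before computing the AUC.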
Subject(s)
Blood Transfusion , Blood Volume , Shock , Vena Cava, Inferior , Humans , Mortality , Predictive Value of Tests , Resuscitation , Retrospective Studies , Vena Cava, Inferior/diagnostic imaging
ABSTRACT
BACKGROUND: Extraintestinal pathogenic E. coli (ExPEC) is a common gram-negative organism causing various infections, including urinary tract infections (UTIs), bacteremia, and neonatal meningitis. The cjrABC-senB gene cluster of E. coli contributes to ExPEC virulence in the mouse model of UTIs. Consistently, the distribution of cjrABC-senB is epidemiologically associated with human UTIs caused by E. coli. cjrABC-senB, which has previously been proposed to encode an iron uptake system, may facilitate ExPEC survival in the iron availability-restricted urinary tract. Given that the bloodstream is also an iron limited environment to invading bacteria, the pathogenic role of cjrABC-senB in ExPEC bacteremia, however, remains to be investigated. METHODS: The ability of ExPEC RS218 strains with and without cjrABC-senB to survive in the mouse bloodstream and human serum was evaluated. Subsequently, the role of this gene cluster in the ExPEC interaction with the complement system was evaluated. Finally, the distribution of cjrABC-senB in human clinical E. coli isolates was determined by PCR. The frequency of cjrABC-senB in bacteremia isolates that were not associated with UTIs (non-UTI bacteremia isolates) was compared with that in UTI-associated isolates and fecal isolates. RESULTS: Expression of cjrABC-senB attenuated the survival of RS218 in the mouse bloodstream and human serum. The cjrABC-senB-harboring strains triggered enhanced classical- and alternative-complement pathway activation and became more vulnerable to complement-mediated killing in serum. cjrA was identified as the major gene responsible for the attenuated serum survival. Expressing cjrABC-senB and cjrA increased bacterial susceptibility to detergent and induced periplasmic protein leakage, suggesting that the expression of these genes compromises the integrity of the outer membrane of ExPEC. 
In addition, the frequency of cjrABC-senB in non-UTI bacteremia isolates was significantly lower than that in UTI-associated isolates, while the frequencies in non-UTI bacteremia isolates and fecal isolates showed no significant difference. Consistent with the experimental findings, this epidemiological pattern suggests that cjrABC-senB does not contribute to E. coli bacteremia in humans. CONCLUSION: The contribution of cjrABC-senB to the pathogenesis of ExPEC is niche dependent and seemingly contradictory: the genes facilitate ExPEC UTIs but hinder survival in the bloodstream. This niche-dependent characteristic may benefit the development of novel strategies against E. coli-caused infections.
Subject(s)
Bacteremia/microbiology , Complement Activation , Escherichia coli Infections/microbiology , Escherichia coli Proteins/metabolism , Extraintestinal Pathogenic Escherichia coli/physiology , Genes, Bacterial , Multigene Family , Animals , Extraintestinal Pathogenic Escherichia coli/genetics , Mice , Mice, Inbred BALB C
ABSTRACT
Sepsis is a major cause of morbidity and mortality worldwide. With the advance of medical care, the mortality of sepsis has decreased in the past decades, yet many treatments and diagnostic tools still lack supporting evidence. We conducted a retrospective population-based cohort study with propensity score matched subcohorts based on a prospectively collected national longitudinal health insurance database in Taiwan. Severe sepsis-associated hospital admissions from 2000 to 2011 were identified based on International Classification of Diseases, Ninth Revision, Clinical Modification codes of infections and acute organ dysfunction. To compare the effectiveness of treatments and diagnostic tools, propensity scores were generated to match comparable control groups. During the 12-year period, 33 375 patients and 50 465 hospitalizations for severe sepsis were identified. The age-standardized 28-day in-hospital mortality decreased significantly from 21% in 2008 to 15% in 2011 as these treatments and diagnostic tools were increasingly implemented. After propensity score matching, procalcitonin testing (odds ratio [OR]: 0.70, 95% confidence interval [CI]: 0.61-0.81), lactate testing (OR: 0.90, 95% CI: 0.84-0.97), transfusion of packed red blood cells (OR: 0.60, 95% CI: 0.52-0.69), albumin (OR: 0.72, 95% CI: 0.55-0.93), balanced crystalloid (OR: 0.29, 95% CI: 0.20-0.41), and use of dopamine (OR: 0.44, 95% CI: 0.39-0.49) were significantly associated with a lower mortality rate. However, inconsistent findings need to be further validated.
Subject(s)
Sepsis , Cohort Studies , Hospital Mortality , Humans , Retrospective Studies , Sepsis/mortality , Sepsis/therapy , Taiwan/epidemiology
ABSTRACT
Roux-en-Y gastric bypass surgery (RYGBS) is an effective surgical intervention to reduce mortality in morbidly obese patients. Following RYGBS, the disposition of drugs may be affected by anatomical alterations and changes in intestinal and hepatic drug metabolizing enzyme activity. The aim of this study was to better understand the drug-drug interaction (DDI) potential of CYP3A and P-gp inhibitors. The impacts of RYGBS on the absorption and metabolism of midazolam, acetaminophen, digoxin, and their major metabolites were simulated using physiologically-based pharmacokinetic (PBPK) modeling. PBPK models for verapamil and posaconazole were built to evaluate CYP3A- and P-gp-mediated DDIs pre- and post-RYGBS. The simulations suggest that for highly soluble drugs, such as verapamil, the predicted bioavailability was comparable pre- and post-RYGBS. For verapamil inhibition, RYGBS did not affect the fold-change of the predicted inhibited-to-control plasma AUC ratio or predicted inhibited-to-control peak plasma concentration ratio for either midazolam or digoxin. In contrast, the predicted bioavailability of posaconazole, a poorly soluble drug, decreased from 12% pre-RYGBS to 5% post-RYGBS. Compared to control, the predicted posaconazole-inhibited midazolam plasma AUC increased by 2.0-fold pre-RYGBS, but only increased by 1.6-fold post-RYGBS. A similar trend was predicted for pre- and post-RYGBS inhibited-to-control midazolam peak plasma concentration ratios (2.0- and 1.6-fold, respectively) following posaconazole inhibition. Absorption of highly soluble drugs was more rapid post-RYGBS, resulting in higher predicted midazolam peak plasma concentrations, which was further increased following inhibition by verapamil or posaconazole. 
To reduce the risk of a drug-drug interaction in patients post-RYGBS, the dose or frequency of object drugs may need to be decreased when administered with highly soluble inhibitor drugs, especially if toxicities are associated with plasma peak concentrations.
Subject(s)
Cytochrome P-450 CYP3A Inhibitors/pharmacokinetics , Cytochrome P-450 CYP3A/metabolism , Gastric Bypass/adverse effects , Models, Biological , ATP Binding Cassette Transporter, Subfamily B/metabolism , Acetaminophen/administration & dosage , Acetaminophen/pharmacokinetics , Administration, Oral , Area Under Curve , Biological Availability , Cytochrome P-450 CYP3A Inhibitors/administration & dosage , Digoxin/administration & dosage , Digoxin/pharmacokinetics , Dose-Response Relationship, Drug , Drug Administration Schedule , Drug Interactions , Gastrointestinal Absorption , Hepatobiliary Elimination , Humans , Intestinal Elimination , Metabolic Clearance Rate , Midazolam/administration & dosage , Midazolam/pharmacokinetics , Obesity, Morbid/surgery , Postoperative Period , Triazoles/administration & dosage , Triazoles/pharmacokinetics , Verapamil/administration & dosage , Verapamil/pharmacokinetics
ABSTRACT
BACKGROUND: The aim of this study was to evaluate the benefits of cholecystectomy in mitigating recurrent biliary complications following endoscopic treatment of common bile duct stones. METHODS: We used data from the Taiwan National Health Insurance Research Database to conduct a population-based cohort study. Among 925 patients who received endoscopic treatment for choledocholithiasis at the first admission from 2005 to 2012, 422 received subsequent cholecystectomy and 503 had the gallbladder (GB) left in situ. After propensity score matching with a 1:1 ratio, the cumulative incidence of recurrent biliary complications and overall survival were analyzed with Cox's proportional hazards model. The primary endpoint of this study was recurrent biliary complications requiring intervention. RESULTS: After matching, 378 pairs of patients were identified, with a median follow-up time of 53 (1-108) months. The recurrence rate of biliary complications was 8.20% in the cholecystectomy group and 24.87% in the GB in situ group (p < 0.001). In the multivariate Cox regression analysis, the only independent risk factor for recurrent biliary complications was GB left in situ (hazard ratio [HR] 3.55, 95% CI 2.36-5.33). CONCLUSIONS: Cholecystectomy after endoscopic treatment of common bile duct stones reduced the incidence of recurrent biliary complications.
Subject(s)
Cholangiopancreatography, Endoscopic Retrograde , Cholecystectomy , Choledocholithiasis/surgery , Aged , Aged, 80 and over , Choledocholithiasis/pathology , Female , Humans , Male , Middle Aged , Recurrence , Retrospective Studies , Risk Factors , Treatment Outcome
ABSTRACT
BACKGROUND: We aimed to derive and validate a parsimonious and pragmatic clinical prediction rule using the concepts of Predisposition, Infection, Response, and Organ Dysfunction to predict in-hospital mortality, and to compare it with other prediction rules, as well as with conventional biomarkers, for evaluating the mortality risk of patients with suspected sepsis in the emergency department (ED). METHODS: We conducted a pragmatic cohort study of consecutive ED patients aged 18 or older with documented diagnostic codes of infection and two sets of blood cultures ordered by physicians between 2010 and 2012 in a tertiary teaching hospital. RESULTS: 7,011 and 12,110 patients were included in the derivation and validation cohorts, respectively, for the final analysis. There were 479 deaths (7%) in the derivation cohort and 1,145 deaths (9%) in the validation cohort. Independent predictors of death were absence of Chills (odds ratio: 2.28, 95% confidence interval: 1.75-2.97), Hypothermia (2.12, 1.57-2.85), Anemia (2.45, 1.97-3.04), wide Red cell Distribution Width (RDW) (3.27, 2.63-4.05) and history of Malignancy (2.00, 1.63-2.46). This novel clinical prediction rule (CHARM) performed well for stratifying patients into mortality risk groups (sensitivity: 99.4%, negative predictive value: 99.7%, area under the receiver operating characteristic curve: 0.77). The CHARM score also outperformed other scores and biomarkers such as PIRO, SIRS, MEDS, CURB-65, C-reactive protein, procalcitonin and lactate (all p < .05). CONCLUSIONS: In patients with suspected sepsis, this parsimonious and pragmatic model could be used to stratify the mortality risk of patients in the early stage of sepsis.
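The CHARM rule combines five binary predictors. The abstract does not give the scoring weights or cut-offs, so the sketch below assumes a simple one-point-per-criterion tally, purely for illustration:

```python
def charm_score(chills_absent, hypothermia, anemia, wide_rdw, malignancy):
    """Hypothetical one-point-per-predictor tally of the five CHARM
    components (absence of Chills, Hypothermia, Anemia, wide RDW,
    Malignancy). The published rule's exact weighting is not given in the
    abstract, so equal weights are an assumption."""
    return sum([chills_absent, hypothermia, anemia, wide_rdw, malignancy])

# A patient with no chills, anemia and a wide RDW, but normothermic and
# without a history of malignancy
print(charm_score(True, False, True, True, False))  # -> 3
```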
Subject(s)
Hospital Mortality , Sepsis/mortality , Aged , Aged, 80 and over , Anemia/epidemiology , Biomarkers/blood , C-Reactive Protein/metabolism , Calcitonin/blood , Chills/epidemiology , Cohort Studies , Comorbidity , Decision Support Techniques , Emergency Service, Hospital , Erythrocyte Indices , Female , Humans , Hypothermia/epidemiology , Lactic Acid/blood , Male , Middle Aged , Neoplasms/epidemiology , Odds Ratio , Prognosis , ROC Curve , Retrospective Studies , Sepsis/blood , Sepsis/epidemiology , Tertiary Care Centers
ABSTRACT
Sepsis is one of the major causes of death worldwide; it is a dysregulated host response to infection that leads to organ dysfunction. Insufficient tissue perfusion and oxygen delivery have been implicated in the pathogenesis of sepsis-related organ dysfunction, making transfusion of packed red blood cells (pRBCs) a reasonable treatment modality. However, clinical trials have generated controversial results. Even the notion that transfused pRBCs increase the oxygen-carrying capacity of blood has been challenged. Meanwhile, during sepsis, the ability of our tissues to utilize oxygen may also be reduced, and increased blood concentrations of lactate may be the result of strong inflammation and excessive catecholamine release, rather than impaired cell respiration. Leukodepleted pRBCs more consistently demonstrated improvement in microcirculation, and the increase in blood viscosity brought about by pRBC transfusion helps maintain functional capillary density. A restrictive strategy of pRBC transfusion is recommended in treating septic patients.
Subject(s)
Erythrocyte Transfusion/adverse effects , Sepsis/therapy , Clinical Trials as Topic , Erythrocyte Transfusion/methods , Erythrocytes/metabolism , Humans , Oxygen/metabolism , Sepsis/metabolism
ABSTRACT
OBJECTIVE: With respect to endoscopic retrograde cholangiography (ERC), Roux-en-Y reconstructions can be divided into those with an intact papilla of Vater and those with a bilioenteric anastomosis (BEA). Double-balloon enteroscopy-assisted ERC (DBE-ERC) may produce different results in these two populations, but comparative studies are lacking. MATERIAL AND METHODS: Forty-seven patients with Roux-en-Y anastomosis undergoing 73 DBE-ERC procedures were enrolled between July 2007 and August 2013. There were 14 patients with an intact papilla of Vater (group A) and 33 patients with a BEA (group B). The effectiveness of DBE-ERC, including data on reaching the blind end, performance of ERC, results of endoscopic therapies, and follow-up, was retrospectively analyzed and compared between the two groups. RESULTS: For reaching the blind end, the success rate was not different between the groups (85.7% vs. 81.8%, p = 0.7), but the mean procedure time was significantly shorter for group A (28 min vs. 52 min, p = 0.01). For ERC, the success rate was not different between the groups (91.7% vs. 96.3%, p = 0.53), but the mean procedure time was significantly longer for group A (28.4 min vs. 4 min, p < 0.001). All endoscopic therapies could be successfully performed in both groups. No group A patients and five (23.8%) group B patients developed recurrent biliary stricture/stones requiring interventions during a mean follow-up period of 26.1 months. CONCLUSIONS: DBE-ERC was effective in both populations with biliary disorders. In terms of procedure time rather than success rates, reaching the blind end was more difficult but ERC was easier in patients with a BEA.
Subject(s)
Anastomosis, Roux-en-Y , Biliary Tract Surgical Procedures/methods , Cholangiopancreatography, Endoscopic Retrograde , Double-Balloon Enteroscopy , Duodenum/diagnostic imaging , Pancreatic Ducts/diagnostic imaging , Postoperative Complications , Adult , Aged , Aged, 80 and over , Duodenum/surgery , Female , Follow-Up Studies , Gallstones/diagnostic imaging , Humans , Intestinal Perforation/diagnosis , Male , Middle Aged , Operative Time , Pancreatic Ducts/surgery , Retrospective Studies
ABSTRACT
OBJECTIVE: Delirium is common in mechanically ventilated patients in the ICU and associated with short- and long-term morbidity and mortality. The use of systemic corticosteroids is also common in the ICU. Outside the ICU setting, corticosteroids are a recognized risk factor for delirium, but their relationship with delirium in critically ill patients has not been fully evaluated. We hypothesized that systemic corticosteroid administration would be associated with a transition to delirium in mechanically ventilated patients with acute lung injury. DESIGN: Prospective cohort study. SETTING: Thirteen ICUs in four hospitals in Baltimore, MD. PATIENTS: Five hundred twenty mechanically ventilated adult patients with acute lung injury. INTERVENTIONS: None. MEASUREMENTS AND MAIN RESULTS: Delirium evaluation was performed by trained research staff using the validated Confusion Assessment Method for the ICU screening tool. A total of 330 of the 520 patients (64%) had at least two consecutive ICU days of observation in which delirium was assessable (e.g., patient was noncomatose), with a total of 2,286 days of observation and a median (interquartile range) of 15 (9, 28) observation days per patient. These 330 patients had 99 transitions into delirium from a prior nondelirious, noncomatose state. The probability of transitioning into delirium on any given day was 14%. Using multivariable Markov models with robust variance estimates, the following factors (adjusted odds ratio; 95% CI) were independently associated with transition to delirium: older age (compared to < 40 years old, 40-60 yr [1.81; 1.26-2.62], and ≥ 60 yr [2.52; 1.65-3.87]) and administration of any systemic corticosteroid in the prior 24 hours (1.52; 1.05-2.21). CONCLUSIONS: After adjusting for other risk factors, systemic corticosteroid administration is significantly associated with transitioning to delirium from a nondelirious state. 
The risk of delirium should be considered when deciding about the use of systemic corticosteroids in critically ill patients with acute lung injury.
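As a back-of-the-envelope illustration of the reported estimates, the baseline daily probability of transitioning into delirium (14%) and the adjusted odds ratio for corticosteroid exposure in the prior 24 hours (1.52) can be combined on the odds scale. This sketch ignores the other covariates in the multivariable Markov model and is for intuition only, not a reproduction of the study's analysis:

```python
# Illustrative only: shift the reported baseline daily transition
# probability (14%) by the adjusted odds ratio for prior corticosteroid
# exposure (1.52). Real adjusted probabilities depend on all covariates
# in the multivariable Markov model, which are not modeled here.

def apply_odds_ratio(p_baseline: float, odds_ratio: float) -> float:
    """Shift a probability by an odds ratio: p -> OR*odds / (1 + OR*odds)."""
    odds = p_baseline / (1.0 - p_baseline)
    adjusted_odds = odds * odds_ratio
    return adjusted_odds / (1.0 + adjusted_odds)

p = apply_odds_ratio(0.14, 1.52)
print(f"Approximate adjusted daily transition probability: {p:.1%}")
```

With these two numbers alone, a 14% daily transition probability rises to roughly 20% under an odds ratio of 1.52, which conveys the magnitude of the association without implying causal effect size.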
Subject(s)
Acute Lung Injury/drug therapy , Adrenal Cortex Hormones/adverse effects , Delirium/etiology , Respiration, Artificial , APACHE , Acute Lung Injury/complications , Adrenal Cortex Hormones/administration & dosage , Adult , Age Factors , Critical Illness/therapy , Delirium/diagnosis , Female , Humans , Intensive Care Units , Male , Markov Chains , Middle Aged , Organ Dysfunction Scores , Prospective Studies , Risk Factors , Severity of Illness Index , Treatment Outcome
ABSTRACT
UNLABELLED: Generalized linear models were used to assess the relationship between religious attendance and lifetime smoking status among middle-aged adults (n = 666) sampled from waves three (1993 to 1996) and four (2004 to 2005) of the Baltimore Epidemiologic Catchment Area (ECA) study. Religious attendance once per week or greater as compared to never was inversely associated with smoking status. Future research should explore potential mediating factors of the association between religious attendance and smoking among middle-aged adults in order to gain a greater understanding of the mechanisms underlying this relationship. FUNDING: NIMH grant DA026652; NIDA grant T32DA007292.
Subject(s)
Religion , Smoking Cessation/statistics & numerical data , Smoking/epidemiology , Adult , Cross-Sectional Studies , Female , Humans , Male , Middle Aged , Prevalence
ABSTRACT
This study aimed to examine the influence of sport skill level on behavioural and neuroelectric performance in visuospatial attention and memory. Visuospatial tasks were administered to 54 participants, including 18 elite and 18 amateur table tennis players and 18 nonathletes, while event-related potentials were recorded. In all the visuospatial attention and memory conditions, table tennis players displayed faster reaction times than nonathletes, regardless of skill level, although there was no difference in accuracy between groups. In addition, regardless of task condition, both player groups had a greater P3 amplitude than nonathletes, and elite players exhibited a greater P3 amplitude than amateur players. The results of this study indicate that table tennis players, irrespective of their skill level, exhibit enhanced visuospatial capabilities. Notably, athletes at the elite level appear to benefit from an augmented allocation of attentional resources when engaging in visuospatial tasks.
Subject(s)
Attention , Cognition , Evoked Potentials , Reaction Time , Humans , Male , Young Adult , Attention/physiology , Cognition/physiology , Evoked Potentials/physiology , Reaction Time/physiology , Female , Tennis/physiology , Tennis/psychology , Adult , Space Perception/physiology , Athletes/psychology , Athletic Performance/physiology , Visual Perception/physiology , Electroencephalography , Adolescent
ABSTRACT
BACKGROUND AND OBJECTIVE: Machine learning models are vital for enhancing healthcare services. However, integrating them into health information systems (HISs) introduces challenges beyond clinical decision making, such as interoperability and diverse electronic health record (EHR) formats. We propose Model Cabinet Architecture (MoCab), a framework designed to leverage Fast Healthcare Interoperability Resources (FHIR) as the standard for data storage and retrieval when deploying machine learning models across various HISs, addressing the challenges highlighted by platforms such as EPOCH®, ePRISM®, KETOS, and others. METHODS: The MoCab architecture streamlines predictive modeling in healthcare through a structured framework of several specialized components. The Data Service Center manages patient data retrieval from FHIR servers. These data are then processed by the Knowledge Model Center, where they are formatted and fed into predictive models. The Model Retraining Center continuously updates these models to maintain accuracy in dynamic clinical environments. The framework further incorporates Clinical Decision Support (CDS) Hooks for issuing clinical alerts and uses Substitutable Medical Apps Reusable Technologies (SMART) on FHIR to develop applications for displaying alerts, prediction results, and patient records. RESULTS: The MoCab framework was demonstrated using three types of predictive models: a scoring model (qCSI), a machine learning model (NSTI), and a deep learning model (SPC), applied to synthetic data that mimic a major EHR system. The implementations showed how MoCab integrates predictive models with health data for clinical decision support, utilizing CDS Hooks and SMART on FHIR for seamless HIS integration. The demonstration confirmed the practical utility of MoCab in supporting clinical decision making, validated by its application in various healthcare settings. 
CONCLUSIONS: We demonstrate MoCab's potential in promoting the interoperability of machine learning models and enhancing their utility across various EHRs. Despite challenges such as incomplete FHIR adoption, MoCab addresses key obstacles to adapting machine learning models within healthcare settings, paving the way for further enhancements and broader adoption.
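The Data Service Center pattern described above can be sketched in a few lines: pull vital-sign Observations out of a FHIR searchset Bundle and feed them to a scoring model. This is a hypothetical illustration, not MoCab's implementation; the scoring rule is a toy threshold score (NOT the published qCSI), and the LOINC codes shown are the commonly used ones for these vitals and should be verified against a given server's profiles:

```python
import json

# Hypothetical sketch of the "Data Service Center" idea: extract the latest
# vital-sign Observations from a FHIR searchset Bundle and feed them to a
# toy severity score. The scoring rule is illustrative only, NOT the
# published qCSI; verify LOINC codes against your FHIR server's profiles.

LOINC_RESP_RATE = "9279-1"   # respiratory rate
LOINC_SPO2 = "59408-5"       # oxygen saturation by pulse oximetry

def latest_vitals(bundle: dict) -> dict:
    """Map LOINC code -> most recent valueQuantity.value in the bundle."""
    vitals = {}
    for entry in bundle.get("entry", []):
        obs = entry.get("resource", {})
        if obs.get("resourceType") != "Observation":
            continue
        for coding in obs.get("code", {}).get("coding", []):
            code = coding.get("code")
            value = obs.get("valueQuantity", {}).get("value")
            if code and value is not None:
                vitals[code] = value  # bundle assumed sorted oldest-first
    return vitals

def toy_severity_score(vitals: dict) -> int:
    """Illustrative threshold score (not a validated clinical instrument)."""
    score = 0
    if vitals.get(LOINC_RESP_RATE, 0) > 22:
        score += 1
    if vitals.get(LOINC_SPO2, 100) < 92:
        score += 2
    return score

bundle = json.loads("""{
  "resourceType": "Bundle", "type": "searchset",
  "entry": [
    {"resource": {"resourceType": "Observation",
      "code": {"coding": [{"system": "http://loinc.org", "code": "9279-1"}]},
      "valueQuantity": {"value": 24, "unit": "/min"}}},
    {"resource": {"resourceType": "Observation",
      "code": {"coding": [{"system": "http://loinc.org", "code": "59408-5"}]},
      "valueQuantity": {"value": 90, "unit": "%"}}}
  ]
}""")

print(toy_severity_score(latest_vitals(bundle)))  # -> 3
```

In a deployed system the Bundle would come from a FHIR search request rather than an inline string, and the score's output would be surfaced to clinicians via a CDS Hooks card or a SMART on FHIR app, per the architecture described in the abstract.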
Subject(s)
Decision Support Systems, Clinical , Electronic Health Records , Machine Learning , Humans , Health Information Systems , Health Information Interoperability , Information Storage and Retrieval/methods
ABSTRACT
Sparsentan is a dual endothelin/angiotensin II receptor antagonist indicated to reduce proteinuria in patients with primary IgA nephropathy at high risk of disease progression. In vitro data indicate that sparsentan is likely to inhibit or induce various CYP enzymes at therapeutic concentrations. Sparsentan as a victim and perpetrator of CYP3A4-mediated drug-drug interactions (DDIs) has been assessed clinically. A mechanistic, bottom-up, physiologically based pharmacokinetic (PK) model for sparsentan was developed from in vitro data on drug solubility, formulation dissolution and particle size, drug permeability, inhibition and induction of metabolic enzymes, and P-glycoprotein (P-gp)-driven efflux. The model was verified using clinical PK data from healthy adult volunteers administered single and multiple doses in the fasted and fed states across a wide range of sparsentan doses. The model was also verified by simulation of clinically observed DDIs. The verified model was then used to simulate DDIs with sparsentan as a perpetrator and victim of CYP3A4 using an expanded set of inducers and inhibitors of varying potency. Additional perpetrator and victim DDI simulations were performed using probes for CYP2C9 and CYP2C19. Simulations were also conducted to predict the effect of complete inhibition of P-gp on sparsentan absorption and clearance. The predictive simulations indicated that sparsentan exposure could increase more than two-fold if co-administered with a strong CYP3A4 inhibitor, such as itraconazole. All other potential DDI interactions, as victim or perpetrator, were within two-fold of control. The effect of complete P-gp inhibition on sparsentan PK was negligible.
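For intuition about how a strong CYP3A4 inhibitor can more than double victim-drug exposure, the basic static (Rowland-Matin) model is a simple counterpart to the full PBPK simulations described above. The parameter values below are hypothetical, not sparsentan's, and the static model omits the absorption, induction, and transporter effects the PBPK model captures:

```python
# Illustrative basic static model for a reversible-inhibition DDI.
# Parameter values are hypothetical, not sparsentan's; the abstract's
# >2-fold prediction comes from a mechanistic PBPK model, not this formula.

def auc_ratio(fm: float, inhibitor_conc: float, ki: float) -> float:
    """AUC(inhibited)/AUC(control) for a victim drug with fraction `fm` of
    clearance via the inhibited enzyme (basic static model)."""
    inhibition = 1.0 + inhibitor_conc / ki
    return 1.0 / (fm / inhibition + (1.0 - fm))

# Hypothetical victim with 60% of clearance via CYP3A4 facing a strong
# inhibitor ([I]/Ki = 100): exposure rises ~2.5-fold.
print(round(auc_ratio(fm=0.6, inhibitor_conc=10.0, ki=0.1), 2))
```

The formula makes the ceiling explicit: even with complete inhibition, the AUC ratio cannot exceed 1/(1 - fm), so the magnitude of a CYP3A4 DDI is bounded by the fraction of clearance that pathway carries.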