Results 1 - 20 of 94
1.
Ann Pharmacother ; 58(9): 877-885, 2024 Sep.
Article in English | MEDLINE | ID: mdl-38247044

ABSTRACT

BACKGROUND: Phenobarbital may offer advantages over benzodiazepines for severe alcohol withdrawal syndrome (SAWS), but its impact on clinical outcomes has not been fully elucidated. OBJECTIVE: The purpose of this study was to determine the clinical impact of phenobarbital versus benzodiazepines for SAWS. METHODS: This retrospective cohort study compared phenobarbital to benzodiazepines for the management of SAWS for patients admitted to progressive or intensive care units (ICUs) between July 2018 and July 2022. Patients included had a history of delirium tremens (DT) or seizures, Clinical Institute Withdrawal Assessment of Alcohol-Revised (CIWA-Ar) >15, or Prediction of Alcohol Withdrawal Severity Scale (PAWSS) score ≥4. The primary outcome was hospital length of stay (LOS). Secondary outcomes included progressive or ICU LOS, incidence of adjunctive pharmacotherapy, and incidence/duration of mechanical ventilation. RESULTS: The final analysis included 126 phenobarbital and 98 benzodiazepine encounters. Patients treated with phenobarbital had shorter median hospital LOS versus those treated with benzodiazepines (2.8 vs 4.7 days; P < 0.0001); a finding corroborated by multivariable analysis. The phenobarbital group also had shorter median progressive/ICU LOS (0.7 vs 1.3 days; P < 0.0001), and lower incidence of dexmedetomidine (P < 0.0001) and antipsychotic initiation (P < 0.0001). Fewer patients in the phenobarbital group compared to the benzodiazepine group received new mechanical ventilation (P = 0.045), but median duration was similar (1.2 vs 1.6 days; P = 1.00). CONCLUSION AND RELEVANCE: Scheduled phenobarbital was associated with decreased hospital LOS compared to benzodiazepines for SAWS. This was the first study to compare outcomes of fixed-dose, nonoverlapping phenobarbital to benzodiazepines in patients with clearly defined SAWS and details a readily implementable protocol.


Subject(s)
Benzodiazepines , Length of Stay , Phenobarbital , Phenobarbital/therapeutic use , Phenobarbital/administration & dosage , Humans , Benzodiazepines/therapeutic use , Benzodiazepines/administration & dosage , Male , Retrospective Studies , Female , Middle Aged , Length of Stay/statistics & numerical data , Adult , Alcohol Withdrawal Delirium/drug therapy , Substance Withdrawal Syndrome/drug therapy , Aged , Intensive Care Units/statistics & numerical data , Hypnotics and Sedatives/therapeutic use , Hypnotics and Sedatives/administration & dosage , Severity of Illness Index , Cohort Studies
2.
J Arthroplasty ; 39(8S1): S256-S262, 2024 Aug.
Article in English | MEDLINE | ID: mdl-38604279

ABSTRACT

BACKGROUND: Tibial bone defects are commonly encountered in revision total knee arthroplasty (rTKA) and can be managed with metaphyseal cones or sleeves. Few studies have directly compared tibial cones and sleeves in rTKA, and none have limited this comparison to the most severe tibial defects. The purpose of this study was to evaluate and compare the outcomes of metaphyseal cones and sleeves for tibial reconstruction in rTKA regarding implant fixation and clinical outcomes. METHODS: A retrospective review was conducted on patients undergoing rTKA in which metaphyseal cones or sleeves were utilized for addressing metaphyseal bone loss (34 cones and 18 sleeves). Tibial bone loss was classified according to the Anderson Orthopaedic Research Institute bone defect classification, with types 2B and 3 being included. Patient-reported outcomes and postoperative complications were collected, and a radiographic evaluation of osseointegration or loosening was performed. RESULTS: There were 52 knees included (34 cones, 18 sleeves), with a median follow-up of 41.0 months. All-cause implant survival was 100% at 2 years and 96% (95% confidence interval: 76 to 99%) at 4 years, with 98% of tibial components demonstrating osseointegration at the final follow-up. During follow-up, there were a total 11 revisions, of which 1 sleeve was revised secondary to implant loosening. Tibial sleeves had a higher risk of revision compared to tibial cones (P < .01), and sleeves fixed with a hybrid technique were more likely to need revision than cones fixed by the same method (P = .01). CONCLUSIONS: Porous metaphyseal tibial cones and tibial metaphyseal sleeves both performed well at a 41-month median follow-up with no difference in aseptic survivorship between the 2 constructs. Both demonstrate high rates of osseointegration, low rates of aseptic failure, and significant improvement in Knee Society Scores in patients with severe tibial defects in rTKA.


Subject(s)
Arthroplasty, Replacement, Knee , Knee Prosthesis , Reoperation , Tibia , Humans , Retrospective Studies , Female , Male , Arthroplasty, Replacement, Knee/instrumentation , Arthroplasty, Replacement, Knee/methods , Aged , Tibia/surgery , Tibia/diagnostic imaging , Middle Aged , Osseointegration , Treatment Outcome , Prosthesis Failure , Knee Joint/surgery , Knee Joint/diagnostic imaging , Knee Joint/physiopathology , Aged, 80 and over , Follow-Up Studies
3.
J Arthroplasty ; 38(6S): S326-S330, 2023 06.
Article in English | MEDLINE | ID: mdl-36813212

ABSTRACT

BACKGROUND: Periprosthetic joint infection (PJI) is a devastating complication of knee and hip arthroplasty. Past literature has shown that gram-positive bacteria are commonly responsible for these infections, although limited research exists studying the changes in the microbial profile of PJIs over time. This study sought to analyze the incidence and trends of pathogens responsible for PJI over three decades. METHODS: This is a multi-institutional retrospective review of patients who had a knee or hip PJI from 1990 to 2020. Patients with a known causative organism were included and those with insufficient culture sensitivity data were excluded. A total of 731 eligible joint infections from 715 patients were identified. Organisms were divided into multiple categories based on genus/species, and 5-year increments were used to analyze the study period. The Cochran-Armitage trend test was used to evaluate linear trends in microbial profile over time, and a P-value <.05 was considered statistically significant. RESULTS: There was a statistically significant positive linear trend in the incidence of methicillin-resistant Staphylococcus aureus over time (P = .0088) as well as a statistically significant negative linear trend in the incidence of coagulase-negative staphylococci over time (P = .0018). There was no statistically significant association between organism and affected joint (knee/hip). CONCLUSION: The incidence of methicillin-resistant Staphylococcus aureus PJI is increasing over time, whereas coagulase-negative staphylococci PJI is decreasing, paralleling the global trend of antibiotic resistance. Identifying these trends may help with the prevention and treatment of PJI through methods such as remodeling perioperative protocols, modifying prophylactic/empiric antimicrobial approaches, or transitioning to alternative therapeutic strategies.


Subject(s)
Arthroplasty, Replacement, Hip , Arthroplasty, Replacement, Knee , Methicillin-Resistant Staphylococcus aureus , Prosthesis-Related Infections , Staphylococcal Infections , Humans , Arthroplasty, Replacement, Hip/adverse effects , Incidence , Coagulase/therapeutic use , Arthroplasty, Replacement, Knee/adverse effects , Retrospective Studies , Prosthesis-Related Infections/epidemiology , Prosthesis-Related Infections/etiology , Prosthesis-Related Infections/drug therapy , Anti-Bacterial Agents/therapeutic use , Staphylococcal Infections/epidemiology , Staphylococcal Infections/etiology , Staphylococcal Infections/drug therapy
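The Cochran-Armitage test for linear trend named in the study above can be computed directly from a 2 x k table of ordered counts. The sketch below is a minimal Python implementation; the MRSA counts per 5-year period are hypothetical, not the study's data.

```python
import math

def cochran_armitage(cases, totals, scores=None):
    """Two-sided Cochran-Armitage test for linear trend in a 2 x k table.

    cases[i]  -- positive outcomes in ordered group i
    totals[i] -- all observations in group i
    scores    -- ordinal group scores (defaults to 0, 1, 2, ...)
    Returns (z_statistic, two_sided_p_value).
    """
    if scores is None:
        scores = list(range(len(cases)))
    n = sum(totals)
    p = sum(cases) / n                                  # overall positive rate
    t = sum(s * c for s, c in zip(scores, cases))       # observed trend statistic
    sm = sum(s * m for s, m in zip(scores, totals))
    s2m = sum(s * s * m for s, m in zip(scores, totals))
    var = p * (1 - p) * (s2m - sm * sm / n)
    z = (t - p * sm) / math.sqrt(var) if var > 0 else 0.0
    return z, math.erfc(abs(z) / math.sqrt(2))          # normal two-sided p

# Hypothetical MRSA counts per 5-year period out of all PJIs in that period
# (illustrative only -- not the study's data)
z, p_val = cochran_armitage([4, 7, 11, 15, 20, 26], [100, 110, 120, 125, 130, 146])
```

With a monotonically rising infection rate, the statistic is positive and the p-value small, matching the direction of trend the abstract reports for MRSA.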
4.
J Cardiovasc Pharmacol ; 80(3): 471-475, 2022 09 01.
Article in English | MEDLINE | ID: mdl-35881901

ABSTRACT

ABSTRACT: Initial warfarin dosing and time in therapeutic range (TTR) are poorly characterized for early post-operative left ventricular assist device (LVAD) patients. This study evaluated TTR after LVAD implantation compared between patients receiving low-dose (<3 mg) and high-dose (≥3 mg) warfarin. This single-center, retrospective analysis included 234 LVAD patients who received warfarin within 5 days of implantation. The primary outcome was TTR during the 5 days following first international normalized ratio (INR) ≥2, compared between low-dose and high-dose groups. Secondary outcomes were hospital and intensive care unit length of stay, time to first INR ≥2, TTR after first INR ≥2, and reinitiation of parenteral anticoagulation. No difference in TTR was detected between warfarin groups (57.2% vs. 62.7%, P = 0.13). Multivariable analysis did not detect any factors predictive of TTR during the primary outcome timeframe, but age and body mass index were associated with the warfarin dose. The low-dose group received a mean warfarin dose of 1.9 mg (±0.64 mg), and the high-dose group received 4.34 mg (±1.38 mg). Whole-cohort TTR was 60.5% during the primary outcome timeframe and 56.5% over the full hospitalization. The low-dose group had longer intensive care unit length of stay, shorter time to therapeutic INR, and more frequently reinitiated parenteral anticoagulation. Patients with recent LVAD implantation are complex and have diverse warfarin sensitivity factors, which did not allow for detection of an optimal warfarin dose, although half of all patients received doses between 2.04 mg and 4.33 mg. Individualized dosing should be used, adjusting for patient-specific factors such as age, body mass index, and drug interactions.


Subject(s)
Heart-Assist Devices , Warfarin , Anticoagulants , Heart-Assist Devices/adverse effects , Humans , International Normalized Ratio , Retrospective Studies
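The abstract above does not state how TTR was calculated; the standard approach is the Rosendaal linear-interpolation method, sketched below. The 2.0-3.0 therapeutic window and the 5-day INR series are illustrative assumptions, not values from the study.

```python
def rosendaal_ttr(days, inrs, low=2.0, high=3.0):
    """Percent time in therapeutic range by Rosendaal linear interpolation.

    days -- measurement days in ascending order; inrs -- INR at each day.
    INR is assumed to change linearly between measurements; the fraction of
    each interval spent inside [low, high] is accumulated.
    """
    in_range = total = 0.0
    for (d0, i0), (d1, i1) in zip(zip(days, inrs), zip(days[1:], inrs[1:])):
        span = d1 - d0
        total += span
        if i0 == i1:                       # flat segment: in or out entirely
            in_range += span if low <= i0 <= high else 0.0
            continue
        lo, hi = sorted((i0, i1))
        # overlap of the interpolated INR segment with the therapeutic window
        overlap = max(0.0, min(hi, high) - max(lo, low))
        in_range += span * overlap / (hi - lo)
    return 100.0 * in_range / total

# Hypothetical post-implant INR series over days 0-5
ttr = rosendaal_ttr([0, 1, 2, 3, 4, 5], [1.5, 2.0, 2.6, 3.2, 2.8, 2.4])
```

Because the method interpolates between draws, a patient who overshoots the window between two in-range measurements is still partially credited, which is why reported TTRs depend on sampling frequency.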
5.
Ann Pharmacother ; 56(4): 377-386, 2022 Apr.
Article in English | MEDLINE | ID: mdl-34282636

ABSTRACT

BACKGROUND: The gut microbiome plays a critical role in modulating the therapeutic effect of immune checkpoint inhibitors (ICIs). Proton pump inhibitors (PPIs) are commonly used in cancer patients and may affect the gut microbiome by altering gut pH. OBJECTIVE: To evaluate if concurrent use of PPI is associated with overall survival (OS) and progression-free survival (PFS) in patients with stage IV non-small-cell lung cancer (NSCLC), melanoma, renal cell carcinoma, transitional cell carcinoma, or head and neck squamous cell carcinoma. METHODS: This was a single-center retrospective cohort study of advanced cancer adult patients who received nivolumab or pembrolizumab between September 1, 2014, and August 31, 2019. Concomitant PPI exposure was defined as PPI use 0 to 30 days before or after initiation of ICIs. Treatment outcome was OS and PFS. RESULTS: A total of 233 patients were included in our study. Concomitant PPI use was not significantly associated with OS (hazard ratio [HR] = 1.22; 95% CI = 0.80-1.86) or PFS (HR = 1.05; 95% CI = 0.76-1.45) in patients with ICI use. The effect estimates were robust after adjusting for covariates in multivariate analysis and in patients with NSCLC. CONCLUSION AND RELEVANCE: Concomitant PPI use was not associated with the effectiveness of nivolumab or pembrolizumab. Certain predictors of survival outcomes related to PPI use in patients receiving immunotherapy, such as the time window and indication of PPI exposure and autoimmune disorders, should be explored in the future to better carve out the impact of PPI on the effectiveness of ICI use.


Subject(s)
Carcinoma, Non-Small-Cell Lung , Lung Neoplasms , Carcinoma, Non-Small-Cell Lung/drug therapy , Humans , Immune Checkpoint Inhibitors/therapeutic use , Lung Neoplasms/drug therapy , Proton Pump Inhibitors/adverse effects , Retrospective Studies
6.
J Arthroplasty ; 37(6S): S327-S332, 2022 06.
Article in English | MEDLINE | ID: mdl-35074448

ABSTRACT

BACKGROUND: Long-term reinfection and mortality rates and clinical outcomes with sufficient subject numbers remain limited for patients undergoing two-stage exchange arthroplasty for chronic periprosthetic joint infection (PJI) of the knee. The purpose of this study was to determine long-term reinfection, complication, and mortality rates following reimplantation in two-stage exchange knee arthroplasty. METHODS: Retrospective review of 178 patients who underwent two-stage exchange knee arthroplasty for chronic PJI at three large tertiary referral institutions from 1990 to 2015, with an average follow-up of 6.63 years from reimplantation. Rates of reinfection, mortality, and all-cause revision were calculated along with the cumulative incidence of reinfection with death as a competing factor. Risk factors for reinfection were determined using Cox multivariate regression analysis. RESULTS: The overall rate of infection eradication was 85.41%, with a mortality rate of 30.33%. Patients with minimum 5-year follow-up (n = 118, average 8.32 years) had an infection eradication rate of 88.98%, with a mortality rate of 33.05%. CONCLUSION: This is a large series with long-term follow-up evaluating outcomes of two-stage exchange knee arthroplasty, demonstrating adequate infection eradication and high mortality. Results were maintained at longer follow-up. This technique should be considered in patients with chronic PJI; however, realistic expectations regarding long-term outcomes must be discussed with patients.


Subject(s)
Arthroplasty, Replacement, Knee , Prosthesis-Related Infections , Anti-Bacterial Agents/therapeutic use , Arthroplasty, Replacement, Knee/adverse effects , Arthroplasty, Replacement, Knee/methods , Humans , Knee Joint/surgery , Prosthesis-Related Infections/epidemiology , Prosthesis-Related Infections/etiology , Prosthesis-Related Infections/surgery , Reinfection , Reoperation/adverse effects , Retrospective Studies , Treatment Outcome
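Cumulative incidence with death as a competing event, as used in the study above, follows the standard competing-risks (Aalen-Johansen) construction: at each event time, the all-cause survival just before that time is multiplied by the cause-specific hazard. A minimal sketch with hypothetical follow-up data:

```python
def cumulative_incidence(times, events, cause=1):
    """Cumulative incidence function with competing risks (Aalen-Johansen).

    times  -- follow-up time for each subject
    events -- 0 = censored, 1 = event of interest (e.g. reinfection),
              2 = competing event (e.g. death)
    Returns (time, cumulative incidence) steps for the event of interest.
    """
    data = sorted(zip(times, events))
    at_risk = len(data)
    surv = 1.0                    # all-cause Kaplan-Meier survival so far
    cif = 0.0
    steps = []
    i = 0
    while i < len(data):
        t = data[i][0]
        d_cause = d_any = leaving = 0
        while i < len(data) and data[i][0] == t:
            d_cause += data[i][1] == cause
            d_any += data[i][1] != 0
            leaving += 1
            i += 1
        cif += surv * d_cause / at_risk   # cause-specific hazard x survival
        surv *= 1.0 - d_any / at_risk
        if d_cause:
            steps.append((t, cif))
        at_risk -= leaving                # events and censored leave the risk set
    return steps

# Hypothetical follow-up times (years): 1 = reinfection, 2 = death, 0 = censored
curve = cumulative_incidence([1, 2, 3, 4, 5, 6], [1, 2, 0, 1, 2, 0])
```

Unlike a naive Kaplan-Meier estimate that censors deaths, this construction never overstates reinfection risk, which matters in a cohort with 30% mortality.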
7.
J Arthroplasty ; 37(7S): S674-S677, 2022 07.
Article in English | MEDLINE | ID: mdl-35283230

ABSTRACT

BACKGROUND: Two-stage reimplantation is an effective treatment for periprosthetic joint infection (PJI). Many factors are involved in the variable success of this procedure. The purpose of this study is to examine the relationship between patient risk factors, comorbidities, and the pathogen on reinfection rates following two-stage reimplantation. METHODS: We evaluated 158 patients treated for PJI from 2008-2019. Only patients who had completed a two-stage exchange were included. Patient demographics, comorbidities, laboratory values, time-to-reimplantation, pathogen, antibiotic sensitivities, host status, and reinfection rates were assessed. Multivariate analysis was performed to identify correlation between risk factors and reinfection. A P-value < .05 was considered statistically significant. RESULTS: 31 patients experienced a reinfection (19.6%). There was a statistically significant association between infection with Methicillin Sensitive Staphylococcus Aureus (MSSA) and reinfection (P = .046). Patients with a reinfection also had a significantly greater median serum C-reactive protein (CRP) level (12.65 g/dL) at the time of diagnosis compared to patients without a reinfection (5.0 g/dL) (P = .010). Median Erythrocyte Sedimentation Rate (ESR) (56 in no re-infection and 69 in re-infection) and time-to-reimplantation (101 days in no reinfection and 141 days in reinfection) demonstrated a trend toward an association with re-infection but were not statistically significant (P = .055 and P = .054 respectively). CONCLUSION: As the number of arthroplasties continue to rise, PJIs are increasing proportionately and represent a significant revision burden. Elevated C-reactive protein (CRP) levels and Methicillin Sensitive Staphylococcus aureus (MSSA) infection were strongly associated with failure of a two-stage reimplantation. 
While not statistically significant with our numbers, there were strong trends toward an association between elevated Erythrocyte Sedimentation Rate (ESR), longer time-to-reimplantation, and reinfection.


Subject(s)
Arthritis, Infectious , Arthroplasty, Replacement, Hip , Arthroplasty, Replacement, Knee , Prosthesis-Related Infections , Reinfection , Replantation , Staphylococcal Infections , Anti-Bacterial Agents/pharmacology , Anti-Bacterial Agents/therapeutic use , Arthritis, Infectious/etiology , Arthroplasty, Replacement, Hip/adverse effects , Arthroplasty, Replacement, Knee/adverse effects , C-Reactive Protein/analysis , Humans , Methicillin/pharmacology , Methicillin/therapeutic use , Prosthesis-Related Infections/diagnosis , Prosthesis-Related Infections/etiology , Reoperation , Retrospective Studies , Staphylococcal Infections/diagnosis , Staphylococcal Infections/drug therapy , Staphylococcal Infections/etiology
8.
Pediatr Emerg Care ; 38(1): e170-e172, 2022 Jan 01.
Article in English | MEDLINE | ID: mdl-32675710

ABSTRACT

OBJECTIVES: Abusive head trauma (AHT) is the leading cause of death from trauma in children less than 2 years of age. A delay in presentation for care has been reported as a risk factor for abuse; however, there has been limited research on this topic. We compare children diagnosed with AHT to children diagnosed with accidental head trauma to determine if there is a delay in presentation. METHODS: We retrospectively studied children less than 6 years old who had acute head injury and were admitted to the pediatric intensive care unit at a pediatric hospital from 2013 to 2017. Cases were reviewed to determine the duration from symptom onset to presentation to care and the nature of the head injury (abusive vs accidental). RESULTS: A total of 59 children met inclusion criteria. Patients who had AHT were significantly more likely to present to care more than 30 minutes after symptom onset (P = 0.0015). Children who had AHT were more likely to be younger (median, 4 vs 31 months; P < 0.0001) and receive Medicaid (P < 0.0001) than those who had accidental head trauma. Patients who had AHT were more likely to have a longer length of stay (median, 11 vs 3 days; P < 0.0001) and were less likely to be discharged home than patients who had accidental head trauma (38% vs 84%; P = 0.0005). CONCLUSIONS: Children who had AHT were more likely to have a delayed presentation for care as compared with children whose head trauma was accidental. A delay in care should prompt clinicians to strongly consider a workup for abusive injury.


Subject(s)
Child Abuse , Craniocerebral Trauma , Child , Child Abuse/diagnosis , Craniocerebral Trauma/diagnosis , Craniocerebral Trauma/epidemiology , Craniocerebral Trauma/etiology , Humans , Intensive Care Units, Pediatric , Retrospective Studies , Risk Factors , United States
9.
Int J Mol Sci ; 23(12)2022 Jun 16.
Article in English | MEDLINE | ID: mdl-35743155

ABSTRACT

B-cell chronic lymphocytic leukemia (CLL) results from intrinsic genetic defects and complex microenvironment stimuli that fuel CLL cell growth through an array of survival signaling pathways. Novel small-molecule agents targeting the B-cell receptor pathway and anti-apoptotic proteins alone or in combination have revolutionized the management of CLL, yet combination therapy carries significant toxicity and CLL remains incurable due to residual disease and relapse. Single-molecule inhibitors that can target multiple disease-driving factors are thus an attractive approach to combat both drug resistance and combination-therapy-related toxicities. We demonstrate that SRX3305, a novel small-molecule BTK/PI3K/BRD4 inhibitor that targets three distinctive facets of CLL biology, attenuates CLL cell proliferation and promotes apoptosis in a dose-dependent fashion. SRX3305 also inhibits the activation-induced proliferation of primary CLL cells in vitro and effectively blocks microenvironment-mediated survival signals, including stromal cell contact. Furthermore, SRX3305 blocks CLL cell migration toward CXCL-12 and CXCL-13, which are major chemokines involved in CLL cell homing and retention in microenvironment niches. Importantly, SRX3305 maintains its anti-tumor effects in ibrutinib-resistant CLL cells. Collectively, this study establishes the preclinical efficacy of SRX3305 in CLL, providing significant rationale for its development as a therapeutic agent for CLL and related disorders.


Subject(s)
Leukemia, Lymphocytic, Chronic, B-Cell , Cell Cycle Proteins/pharmacology , Humans , Leukemia, Lymphocytic, Chronic, B-Cell/pathology , Nuclear Proteins , Phosphatidylinositol 3-Kinases , Protein Kinase Inhibitors/pharmacology , Protein Kinase Inhibitors/therapeutic use , Receptors, Antigen, B-Cell/metabolism , Transcription Factors , Tumor Microenvironment
10.
Pediatr Res ; 89(4): 968-973, 2021 03.
Article in English | MEDLINE | ID: mdl-32492694

ABSTRACT

BACKGROUND: Very low birth weight (VLBW) infants may be at risk for late-onset circulatory collapse (LCC), in which otherwise stable infants develop hypotension resistant to vasoactive agents. The risk factors for LCC development are poorly defined, and it has been theorized that it may be due in part to withdrawal from exogenous prenatal steroids. The goal of this study was to define the clinical characteristics of LCC and investigate its association with antenatal steroid administration. METHODS: This is a retrospective cohort study of infants born ≤1500 g. LCC was retrospectively diagnosed in infants requiring glucocorticoids for circulatory instability at >1 week of life. Demographic and clinical characteristics were compared between groups using the Mann-Whitney test. RESULTS: Three hundred and ten infants were included; 19 (6.1%) developed LCC. Infants with LCC were born at a median 4.6 weeks' lower gestation and a 509 g lower birth weight than those without LCC. There was no difference in antenatal steroid delivery between the groups. CONCLUSIONS: LCC occurs in a distinct subset of VLBW infants, suggesting the need for monitoring in this high-risk population. Antenatal steroids did not significantly increase the risk of LCC development in this study. IMPACT: Late-onset circulatory collapse (LCC) is a life-threatening clinical entity occurring in around 6% of VLBW infants and is likely underdiagnosed in the United States. Targeting specific demographic characteristics such as birth weight (<1000 g) and gestational age at birth (<26 weeks) may allow for early identification of high-risk infants, allowing close monitoring and prompt treatment of LCC. No significant association was found between antenatal steroid administration and LCC development, suggesting that the theoretical risks of antenatal steroids to the fetal HPA axis do not outweigh their benefits for fetal lung maturity. To date, no studies characterizing LCC have originated outside of Asia. 
Therefore, providing a description of LCC in a U.S.-based cohort will provide insight into both its prevalence and presentation to inform clinicians about this potentially devastating disorder and foster early diagnosis and treatment. This study validates LCC characteristics and prevalence previously outlined by Asian studies in a single-center U.S.-based cohort while also identifying potential risk factors for LCC development. This manuscript will provide education for U.S. physicians about the risk factors and clinical presentation of LCC to facilitate early diagnosis and treatment, potentially decreasing neonatal mortality. With prompt recognition and treatment of LCC, infants may have decreased exposure to vasoactive medications that have significant systemic effects.


Subject(s)
Shock/diagnosis , Shock/epidemiology , Female , Gestational Age , Glucocorticoids/metabolism , Humans , Hypothalamo-Hypophyseal System , Infant, Newborn , Infant, Premature , Infant, Premature, Diseases/epidemiology , Infant, Very Low Birth Weight , Male , Pituitary-Adrenal System , Retrospective Studies , Risk Factors , Steroids/metabolism
11.
Pediatr Res ; 90(2): 436-443, 2021 08.
Article in English | MEDLINE | ID: mdl-33293682

ABSTRACT

BACKGROUND: Perinatal inflammation adversely affects health. Therefore, the aims of this IRB-approved study were to: (1) compare inflammatory compounds within and between maternal and umbilical cord blood samples at the time of delivery, (2) assess relationships between inflammatory compounds in maternal and cord blood with birth characteristics/outcomes, and (3) assess relationships between blood and placental fat-soluble nutrients with blood levels of individual inflammatory compounds. METHODS: Mother-infant dyads were enrolled (n = 152) for collection of birth data and biological samples of maternal blood, umbilical cord blood, and placental tissue. Nutrient levels included: lutein + zeaxanthin; lycopene; α-, β-carotene; β-cryptoxanthin; retinol; α-, γ-, δ-tocopherol. Inflammatory compounds included: tumor necrosis factor-α, superoxide dismutase, interleukins (IL) 1β, 2, 6, 8, 10. RESULTS: Median inflammatory compound levels were 1.2-2.3 times higher in cord vs. maternal blood, except IL2 (1.3 times lower). Multiple significant correlations existed between maternal and infant inflammatory compounds (r = 0.22-0.48). While relationships existed with blood nutrient levels, the most significant were identified in the placenta, where all nutrients (except δ-tocopherol) exhibited relationships with inflammatory compounds. Relationships between anti-inflammatory nutrients and proinflammatory compounds were primarily inverse. CONCLUSION: Inflammation is strongly correlated between mother-infant dyads. Fat-soluble nutrients have relationships with inflammatory compounds, suggesting nutrition is a modifiable factor. IMPACT: Mother and newborn inflammation status are strongly interrelated. Levels of fat-soluble nutrients in blood, but especially placenta, are associated with blood levels of proinflammatory and anti-inflammatory compounds in both mother and newborn infant. 
As fat-soluble nutrient levels are associated with blood inflammatory compounds, nutrition is a modifiable factor to modulate inflammation and improve perinatal outcomes.


Subject(s)
Fetal Blood/chemistry , Inflammation Mediators/blood , Nutrients/blood , Parturition/blood , Placenta/chemistry , Biomarkers/blood , Cross-Sectional Studies , Female , Humans , Infant Nutritional Physiological Phenomena , Infant, Newborn , Lipids/chemistry , Male , Maternal Nutritional Physiological Phenomena , Nutritional Status , Pregnancy , Solubility
12.
Am J Med Genet A ; 182(10): 2243-2252, 2020 10.
Article in English | MEDLINE | ID: mdl-32677343

ABSTRACT

Fetal alcohol spectrum disorders (FASD) describe a range of physical, behavioral, and neurologic deficits in individuals exposed to alcohol prenatally. Reduced palpebral fissure length is one of the cardinal facial features of FASD. However, other ocular measurements have not been studied extensively in FASD. Using the Fetal Alcohol Syndrome Epidemiologic Research (FASER) database, we investigated how inner canthal distance (ICD), interpupillary distance (IPD), and outer canthal distance (OCD) centiles differed between FASD and non-FASD individuals. We compared ocular measurement centiles in children with FASD to non-FASD individuals and observed reductions in all three centiles for ICD, IPD, and OCD. However, when our non-FASD children who had various forms of growth deficiency (microcephaly, short-stature, or underweight) were compared to controls, we did not observe a similar reduction in ocular measurements. This suggests that reductions in ocular measurements are a direct effect of alcohol on ocular development independent of its effect on growth parameters, which is consistent with animal models showing a negative effect of alcohol on developing neural crest cells. Interpupillary distance centile appeared to be the most significantly reduced ocular measure we evaluated, suggesting it may be a useful measure to be considered in the diagnosis of FASD.


Subject(s)
Alcohol Drinking/genetics , Fetal Alcohol Spectrum Disorders/genetics , Microcephaly/genetics , Neural Crest/growth & development , Alcohol Drinking/adverse effects , Alcohol Drinking/epidemiology , Animals , Child , Eye/metabolism , Eye/pathology , Face/pathology , Female , Fetal Alcohol Spectrum Disorders/epidemiology , Fetal Alcohol Spectrum Disorders/etiology , Fetal Alcohol Spectrum Disorders/pathology , Humans , Male , Maternal-Fetal Exchange/genetics , Microcephaly/chemically induced , Microcephaly/epidemiology , Neural Crest/pathology , Pregnancy
13.
Cancer ; 125(15): 2602-2609, 2019 08 01.
Article in English | MEDLINE | ID: mdl-31067356

ABSTRACT

BACKGROUND: The purpose of this study was to evaluate risk and response-based multi-agent therapy for patients with rhabdomyosarcoma (RMS) at first relapse. METHODS: Patients with RMS and measurable disease at first relapse with unfavorable-risk (UR) features were randomized to a 6-week phase 2 window with 1 of 2 treatment schedules of irinotecan with vincristine (VI) (previously reported). Those with at least a partial response to VI continued to receive 44 weeks of multi-agent chemotherapy including the assigned VI regimen. UR patients who did not have measurable disease at study entry, did not have a radiographic response after the VI window, or declined VI window therapy received 31 weeks of multi-agent chemotherapy including tirapazamine (TPZ) at weeks 1, 4, 10, 19, and 28. Favorable-risk (FR) patients received 31 weeks of the same multi-agent chemotherapy without VI and TPZ. RESULTS: One hundred thirty-six eligible patients were enrolled. For 61 patients not responding to VI, the 3-year failure-free survival (FFS) and overall survival (OS) rates were 17% (95% confidence interval [CI], 8%-29%) and 24% (13%-37%), respectively. For 30 UR patients not treated with VI, the 3-year FFS and OS rates were 21% (8%-37%) and 39% (20%-57%), respectively. FR patients had 3-year FFS and OS rates of 79% (47%-93%) and 84% (50%-96%), respectively. There were no unexpected toxicities. CONCLUSIONS: Patients with UR RMS at first relapse or disease progression have a poor prognosis when they are treated with this multi-agent therapy, whereas FR patients have a higher chance of being cured with second-line therapy.


Subject(s)
Rhabdomyosarcoma/drug therapy , Child , Disease Progression , Female , Humans , Male , Recurrence , Rhabdomyosarcoma/mortality , Risk Factors , Survival Analysis
14.
J Clin Gastroenterol ; 53(5): e202-e207, 2019.
Article in English | MEDLINE | ID: mdl-29688916

ABSTRACT

BACKGROUND AND GOALS: Gastrointestinal bleeding (GIB) is a significant complication following left ventricular assist device (LVAD) implantation. We evaluated the incidence, predictors, endoscopic findings, and outcomes of GIB in LVAD recipients. STUDY: Retrospective review of 205 adult patients undergoing HeartMate II LVAD implantation from January 2012 to June 2016. Patients were reviewed and separated into GIB (n=57; 28%) and non-GIB (n=148; 72%) groups. RESULTS: Median time to GIB was 55 (range, 3 to 730) days. The GIB group patients were older (61±12 vs. 56±13, P=0.0042), more often underwent concomitant tricuspid valve (TV) repair (16% vs. 4%, P=0.007), and a higher percentage were assigned for destination therapy (75% vs. 55%, P=0.01). Angioectasia (33%) was the most common identified cause of GIB. Median time to endoscopic intervention was 1 day. The total number of hospital readmissions after LVAD was higher in the GIB group (median of 5 vs. 3, P=0.001), as was the total number of blood products transfused after LVAD (29 vs. 13, P≤0.0001). GIB was associated with an increased risk of death (hazard ratio, 1.94; 95% confidence interval, 1.16-3.25; P=0.01) and the mortality rate during hospitalization for GIB was 11% (P=0.0004). Receiving a heart transplant was associated with a decreased hazard of death (hazard ratio, 0.40; 95% confidence interval, 0.19-0.85; P=0.016). CONCLUSIONS: Older age and destination therapy as implant strategy were found to be associated with an increased risk of GIB, consistent with previous studies. A unique finding in our study is the association of TV repair with a higher incidence of GIB. Further studies are needed to investigate possible mechanisms by which TV repair increases the incidence of GIB.


Subject(s)
Gastrointestinal Hemorrhage/epidemiology , Heart Ventricles , Heart-Assist Devices , Age Factors , Anticoagulants/adverse effects , Female , Gastrointestinal Hemorrhage/etiology , Humans , Incidence , Male , Middle Aged , Nebraska/epidemiology , Postoperative Complications/epidemiology , Postoperative Complications/etiology , Retrospective Studies , Risk Factors
15.
Future Oncol ; 15(17): 1989-1995, 2019 Jun.
Article in English | MEDLINE | ID: mdl-31170814

ABSTRACT

Aim: This study evaluated the overall survival (OS) of older patients (≥60 years) with acute myeloid leukemia based on the intensity of treatment. Methods: This single-center, retrospective study included 211 patients diagnosed between 2000 and 2016 who received 10-day decitabine, low-intensity therapy, or high-intensity therapy. Cox regression examined the impact of therapy on OS. Results: Younger patients were more likely to receive high-intensity therapy. Patients who received low-intensity therapy had worse OS than those who received high-intensity therapy (median OS: 1.2 vs 8.5 months; p < 0.01). OS with 10-day decitabine (median OS: 6.3 months) was similar to that with either low-intensity or high-intensity therapy. Conclusion: Ten-day decitabine is an effective alternative in older patients with newly diagnosed acute myeloid leukemia.
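Median OS figures like those above come from the Kaplan-Meier estimator: the survival curve drops multiplicatively at each event time, and the median is the first time the curve falls to 0.5 or below. A self-contained sketch with made-up (time, event) data, not the study's data:

```python
def km_median(observations):
    """Kaplan-Meier median survival time.
    `observations` is a list of (time, event) pairs: event=1 for death,
    event=0 for censoring. Returns the first time the estimated survival
    function S(t) drops to <= 0.5, or None if it never does."""
    obs = sorted(observations)
    n_at_risk = len(obs)
    s = 1.0
    i = 0
    while i < len(obs):
        t = obs[i][0]
        deaths = sum(1 for tt, e in obs[i:] if tt == t and e == 1)
        total = sum(1 for tt, e in obs[i:] if tt == t)
        if deaths:
            s *= (n_at_risk - deaths) / n_at_risk
            if s <= 0.5:
                return t
        n_at_risk -= total
        i += total
    return None

# Ten subjects, all followed to death at months 1..10: the median is
# month 5, where S(t) first reaches 5/10 = 0.5.
median_os = km_median([(m, 1) for m in range(1, 11)])
```

With censoring in the data, censored subjects simply leave the risk set without lowering the curve, which is why a Kaplan-Meier median can exceed the naive median of observed times.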


Subject(s)
Antimetabolites, Antineoplastic/administration & dosage , Decitabine/administration & dosage , Leukemia, Myeloid, Acute/drug therapy , Aged , Dose-Response Relationship, Drug , Female , Humans , Kaplan-Meier Estimate , Leukemia, Myeloid, Acute/mortality , Male , Middle Aged , Remission Induction/methods , Retrospective Studies , Treatment Outcome
16.
Clin Orthop Relat Res ; 476(2): 345-352, 2018 02.
Article in English | MEDLINE | ID: mdl-29529667

ABSTRACT

BACKGROUND: Two-stage reimplantation has consistently yielded high rates of success for patients with chronic prosthetic joint infection, although results more than 5 years after reimplantation are not commonly reported. Numerous factors may contribute to the risk of reinfection, although these factors, as well as the at-risk period after reimplantation, are not well characterized. QUESTIONS/PURPOSES: (1) What is the risk of reinfection after reimplantation for prosthetic joint infection at a minimum of 5 years? (2) Is the bacteriology of the index infection associated with late reinfection? (3) Is the presence of bacteria at the time of reimplantation associated with late reinfection? METHODS: Between 1995 and 2010, we performed 97 two-stage revisions in 93 patients for prosthetic joint infection of the hip or knee, and all are included in this retrospective study. During that time, the indications for this procedure generally were (1) infections occurring more than 3 months after the index arthroplasty; and (2) more acute infections associated with prosthetic loosening or resistant organisms. One patient (1%) was lost to followup; all others had a minimum of 5 years of followup (mean, 11 years; range, 5-20 years), and all living patients had been seen within the last 2 years. Patients were considered free from infection if they did not have pain at rest or constitutional symptoms such as fever, chills, or malaise. The bacteriology and resistance patterns of the infecting organisms were examined with respect to recurrence of infection. Odds ratios were calculated and Fisher's exact test was used to analyze the data. The incidence of reinfection was determined using cumulative incidence methods that treated death as a competing event. RESULTS: Reinfection occurred in 12 of the 97 joints, resulting in implant revision.
The estimated 10-year cumulative incidence of infection was 14% (95% confidence interval [CI], 7%-23%), and the incidence of infection from the same organism was 5% (95% CI, 1%-11%). Five reinfections occurred early (within 2 years), and three of these five (60%) involved resistant pathogens (methicillin-resistant Staphylococcus aureus, methicillin-resistant Staphylococcus epidermidis, or vancomycin-resistant Enterococcus). Seven late hematogenous infections occurred, all more than 4 years after reimplantation; each was caused by a different organism, or combination of organisms, than was isolated in the original infection, and none involved resistant organisms. With the numbers available, we found no difference in the risk of reinfection at 5 years between patients in whom bacteria were detected at the time of reimplantation (18 of 97 [18.6%]) and those whose cultures were negative (79 of 97 [81.4%]) (odds ratio 1.56 [95% CI, 0.38-6.44]; p = 0.54); however, with only 93 patients, we may have been underpowered for this analysis. CONCLUSIONS: In our study, resistant organisms were more often associated with early reinfection, whereas late failures were more commonly associated with new pathogens. We believe the most important finding of our study is that a substantial risk of late infection remains even among patients who appeared free of infection 2 years after reimplantation for prosthetic joint infection of the hip or knee. This highlights the importance of educating patients about the ongoing risk of prosthetic joint infection. LEVEL OF EVIDENCE: Level III, therapeutic study.
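The odds ratio and Fisher's exact test used above operate on a 2x2 table with fixed margins: the two-sided p-value sums the hypergeometric probabilities of all tables at least as extreme as (no more likely than) the observed one. A stdlib-only Python sketch; the example table is illustrative, not the study's counts:

```python
from math import comb

def fisher_exact_2x2(a, b, c, d):
    """Two-sided Fisher's exact test for the table [[a, b], [c, d]].
    Returns (odds_ratio, p_value); the odds ratio is a*d / (b*c),
    or float('inf') when b*c == 0."""
    n, row1, col1 = a + b + c + d, a + b, a + c

    def prob(k):  # hypergeometric probability that the a-cell equals k
        return comb(col1, k) * comb(n - col1, row1 - k) / comb(n, row1)

    p_obs = prob(a)
    lo, hi = max(0, row1 + col1 - n), min(row1, col1)
    # Two-sided: sum tables no more likely than the observed one
    # (small tolerance guards against float ties).
    p = sum(prob(k) for k in range(lo, hi + 1)
            if prob(k) <= p_obs * (1 + 1e-9))
    odds = (a * d) / (b * c) if b * c else float("inf")
    return odds, p

orr, p = fisher_exact_2x2(3, 1, 1, 3)  # illustrative table [[3,1],[1,3]]
```

For this table the exact two-sided p-value is 34/70, illustrating why small cohorts like the one above can show a large odds ratio without statistical significance.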


Subject(s)
Arthroplasty, Replacement, Hip/adverse effects , Arthroplasty, Replacement, Knee/adverse effects , Hip Prosthesis/adverse effects , Knee Prosthesis/adverse effects , Prosthesis-Related Infections/microbiology , Adult , Aged , Aged, 80 and over , Drug Resistance, Bacterial , Female , Humans , Incidence , Male , Middle Aged , Nebraska/epidemiology , Prosthesis-Related Infections/diagnosis , Prosthesis-Related Infections/epidemiology , Prosthesis-Related Infections/surgery , Recurrence , Reoperation , Retrospective Studies , Risk Assessment , Risk Factors , Time Factors , Treatment Outcome
17.
Clin Transplant ; 31(8)2017 08.
Article in English | MEDLINE | ID: mdl-28509373

ABSTRACT

BACKGROUND: While screening for asymptomatic BK viremia (BKV) has been well studied in isolated kidney transplant recipients, there is a paucity of published outcomes in simultaneous pancreas-kidney (SPK) transplant recipients who underwent BKV screening followed by pre-emptive reduction in immunosuppression. METHODS: This is a single-center, retrospective review of 31 consecutive SPK recipients who were transplanted over a 5-year period following the initiation of a serum BKV screening protocol. RESULTS: BK viremia developed in 11 (35.5%) patients, and all patients achieved complete viral clearance following reduction in immunosuppression. Two patients (6.5%) developed BK virus nephropathy, but both had preserved allograft function. One patient developed mild rejection of the kidney allograft following clearance of BKV, and two patients developed mild rejection of the pancreas allograft after reduction in immunosuppression, but there were no kidney or pancreas allograft losses due to rejection. The development of BK viremia did not impact overall patient survival or kidney and pancreas allograft survival. CONCLUSION: Screening asymptomatic SPK recipients for BKV followed by reduction in maintenance immunosuppression appears to be an effective strategy to prevent kidney allograft dysfunction and graft loss due to BK virus nephropathy, without compromising pancreas allograft outcomes.
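Small-cohort proportions like the 11/31 (35.5%) viremia rate above carry wide sampling uncertainty; a Wilson score interval gives a quick sense of it. A sketch in Python — the interval is our illustration, as the abstract reports only the point estimate:

```python
import math

def wilson_ci(successes, n, z=1.96):
    """Wilson score confidence interval for a binomial proportion
    (z=1.96 gives the usual 95% interval)."""
    p = successes / n
    denom = 1 + z * z / n
    center = (p + z * z / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n))
    return center - half, center + half

# BK viremia: 11 of 31 SPK recipients (point estimate 35.5%)
lo, hi = wilson_ci(11, 31)
```

The Wilson interval is preferred over the naive normal approximation for small n because it never extends below 0 or above 1.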


Subject(s)
BK Virus/isolation & purification , Kidney Transplantation , Pancreas Transplantation , Polyomavirus Infections/diagnosis , Postoperative Complications/diagnosis , Tumor Virus Infections/diagnosis , Viremia/diagnosis , Adult , Aged , Drug Administration Schedule , Female , Follow-Up Studies , Graft Rejection/prevention & control , Graft Survival , Humans , Immunocompromised Host , Immunosuppressive Agents/therapeutic use , Incidence , Kaplan-Meier Estimate , Kidney Transplantation/methods , Male , Middle Aged , Pancreas Transplantation/methods , Polyomavirus Infections/epidemiology , Polyomavirus Infections/immunology , Polyomavirus Infections/therapy , Postoperative Complications/epidemiology , Postoperative Complications/immunology , Postoperative Complications/therapy , Real-Time Polymerase Chain Reaction , Retrospective Studies , Treatment Outcome , Tumor Virus Infections/epidemiology , Tumor Virus Infections/immunology , Tumor Virus Infections/therapy , Viremia/epidemiology , Viremia/immunology , Viremia/therapy
18.
Clin Transplant ; 31(8)2017 08.
Article in English | MEDLINE | ID: mdl-28477381

ABSTRACT

BACKGROUND: Sinus tachycardia (ST) is common after heart transplantation (HTx). The aim of the study was to evaluate the effect of diltiazem treatment during the first year after HTx on heart rate (HR), cardiac allograft function, and exercise capacity. METHODS: From the total cohort, 25 HTx recipients started diltiazem treatment 4±2 weeks after HTx and continued it for at least 1 year (diltiazem group). Each study case was matched to a control. All patients underwent hemodynamic assessment and cardiopulmonary exercise testing (CPET) at 1 year after HTx. RESULTS: HR decreased in the diltiazem group from 99±11 bpm to 94±7 bpm (P=.03) and did not change in the controls (98±11 bpm vs 100±13 bpm, P=.14). The difference between the groups at 1 year after HTx was significant (P=.04). In the diltiazem group, left ventricular (LV) stroke volume and ejection fraction increased (48±16 vs 55±17 mL, P=.02, and 60%±10% vs 62%±12%, P=.03, respectively) but did not differ from controls. E/E' decreased (10.7±2.7 vs 7.3±1.9, P=.003), while cardiac index was higher (3.5±0.8 vs 3.1±0.5; P=.05) in the diltiazem group at 1-year follow-up. Absolute peak VO2 (21±4 vs 18±6 mL/kg/min; P=.05) and normalized peak VO2 (73%±17% vs 58%±14%; P=.004) were significantly higher in the diltiazem group. CONCLUSIONS: Diltiazem treatment reduces ST and may improve cardiac allograft function and exercise tolerance during the first year after HTx.


Subject(s)
Cardiovascular Agents/pharmacology , Diltiazem/pharmacology , Exercise Tolerance/drug effects , Heart Transplantation , Postoperative Complications/drug therapy , Tachycardia, Sinus/drug therapy , Adult , Aged , Cardiovascular Agents/therapeutic use , Diltiazem/therapeutic use , Drug Administration Schedule , Exercise Test , Female , Follow-Up Studies , Heart Rate/drug effects , Humans , Male , Middle Aged , Quality of Life , Retrospective Studies , Stroke Volume/drug effects , Tachycardia, Sinus/etiology , Treatment Outcome
19.
Clin Transplant ; 31(5)2017 05.
Article in English | MEDLINE | ID: mdl-28251691

ABSTRACT

BACKGROUND: Sinus tachycardia often presents in heart transplantation (HTx) recipients, but data on its effect on exercise performance are limited. METHODS: Based on mean heart rate (HR) value 3 months after HTx, 181 patients transplanted from 2006 to 2015 at University of Nebraska Medical Center were divided into two groups: (i) HR<95 beats/min (bpm, n=93); and (ii) HR≥95 bpm (n=88). Cardiopulmonary exercise testing (CPET) was performed 1 year after HTx. RESULTS: Mean HR at 3 months post-HTx was 94±11 bpm and did not change significantly at 1 year post-HTx (96±11 bpm, P=.13). HR≥95 bpm at 3 months was associated with younger donor age (OR 1.1; CI 1.0-1.1, P=.02), female donors (OR 2.4; CI 1.16-5.24, P=.02), and lack of donor heavy alcohol use (OR 0.43; CI 0.17-0.61; P=.04). HR≥95 bpm at 3 months post-HTx was independently associated with decreased exercise capacity in metabolic equivalents (P=.008), reduced peak VO2 (P=.006), and reduced percent of predicted peak VO2 (P=.002) during CPET. CONCLUSIONS: HR≥95 bpm at 3 months following HTx is associated with reduced exercise tolerance in stable HTx recipients. Medical HR reduction could improve exercise performance after HTx and merits further investigation.


Subject(s)
Exercise Tolerance/physiology , Heart Transplantation/adverse effects , Tachycardia, Sinus/etiology , Adult , Female , Follow-Up Studies , Heart Rate , Humans , Male , Middle Aged , Oxygen Consumption , Prognosis , Time Factors
20.
J Cardiothorac Vasc Anesth ; 30(1): 107-14, 2016 Jan.
Article in English | MEDLINE | ID: mdl-26847749

ABSTRACT

OBJECTIVE: The primary aim of the study was to describe the most common intraoperative transesophageal echocardiography (TEE) findings during the 3 separate phases of orthotopic liver transplantation (OLT). The secondary aim was to determine whether abnormal TEE findings were associated with major postoperative adverse cardiac events (MACE) and thus may be amenable to future management strategies. DESIGN: Data were collected retrospectively from the electronic medical record and institutional echocardiography database. SETTING: Single university hospital. PARTICIPANTS: A total of 100 patients undergoing OLT via total cavaplasty technique. INTERVENTIONS: Intraoperative TEE was performed in all 3 phases of OLT. MEASUREMENTS AND MAIN RESULTS: TEE findings during the dissection, anhepatic, and reperfusion phases of transplantation were recorded after blinded review for all 100 patients, then analyzed to determine whether they were predictive of postoperative MACE. Intraoperative TEE findings varied among the different phases of OLT. Common TEE findings at reperfusion were microemboli (n = 40, 40%), isolated right ventricular dysfunction (n = 22, 22%), and intracardiac thromboemboli (n = 20, 20%). CONCLUSIONS: Intraoperative echocardiography findings during liver transplantation varied during each phase of transplantation. The presence of intracardiac thromboemboli or biventricular dysfunction on intraoperative echocardiography was predictive of short- and long-term major postoperative adverse cardiac events.


Subject(s)
Cardiovascular Diseases/diagnostic imaging , Echocardiography, Transesophageal/methods , Liver Transplantation/adverse effects , Monitoring, Intraoperative/methods , Postoperative Complications/diagnostic imaging , Adolescent , Adult , Cardiovascular Diseases/etiology , Cohort Studies , Female , Humans , Male , Middle Aged , Postoperative Complications/etiology , Retrospective Studies , Young Adult