Results 1 - 20 of 67
1.
Hum Mol Genet ; 32(16): 2646-2655, 2023 08 07.
Article in English | MEDLINE | ID: mdl-37369012

ABSTRACT

Animal studies implicate one-carbon metabolism and DNA methylation genes in hepatocellular carcinoma (HCC) development in the setting of metabolic perturbations. Using human samples, we investigated the associations between common and rare variants in these closely related biochemical pathways and risk for metabolic HCC development in a multicenter international study. We performed targeted exome sequencing of 64 genes among 556 metabolic HCC cases and 643 cancer-free controls with metabolic conditions. Multivariable logistic regression was used to calculate odds ratios (ORs) and 95% confidence intervals (CIs), adjusting for multiple comparisons. Gene-burden tests were used for rare variant associations. Analyses were performed in the overall sample and among non-Hispanic whites. The results show that among non-Hispanic whites, presence of rare functional variants in ABCC2 was associated with 7-fold higher risk of metabolic HCC (OR = 6.92, 95% CI: 2.38-20.15, P = 0.0004), and this association remained significant when analyses were restricted to functional rare variants observed in ≥2 participants (cases 3.2% versus controls 0.0%, P = 1.02 × 10⁻⁵). In the overall multiethnic sample, presence of rare functional variants in ABCC2 was nominally associated with metabolic HCC (OR = 3.60, 95% CI: 1.52-8.58, P = 0.004), with similar nominal association when analyses were restricted to functional rare variants observed in ≥2 participants (cases 2.9% versus controls 0.2%, P = 0.006). A common variant in PNPLA3 (rs738409[G]) was associated with higher HCC risk in the overall sample (P = 6.36 × 10⁻⁶) and in non-Hispanic whites (P = 0.0002). Our findings indicate that rare functional variants in ABCC2 are associated with susceptibility to metabolic HCC in non-Hispanic whites. PNPLA3-rs738409 is also associated with metabolic HCC risk.
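The gene-burden tests described above collapse a gene's rare variants into a single per-person carrier indicator before testing for association. As an illustrative sketch only (the carrier counts below are hypothetical, not the study's data), a carrier-status odds ratio with a Woolf-type log-scale 95% CI can be computed as:

```python
import math

def gene_burden_or(carrier_cases, n_cases, carrier_controls, n_controls):
    """Collapse rare variants to a per-person carrier indicator and compute
    an odds ratio with a Woolf (log-scale) 95% CI, applying a
    Haldane-Anscombe 0.5 correction if any cell of the 2x2 table is zero."""
    a = float(carrier_cases)                  # carriers among cases
    b = float(n_cases - carrier_cases)        # non-carriers among cases
    c = float(carrier_controls)               # carriers among controls
    d = float(n_controls - carrier_controls)  # non-carriers among controls
    if 0.0 in (a, b, c, d):
        a, b, c, d = a + 0.5, b + 0.5, c + 0.5, d + 0.5
    odds_ratio = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lower = math.exp(math.log(odds_ratio) - 1.96 * se)
    upper = math.exp(math.log(odds_ratio) + 1.96 * se)
    return odds_ratio, lower, upper

# Hypothetical counts for illustration only (not the study's data):
# 16 of 556 cases vs. 3 of 643 controls carry >= 1 rare functional variant.
or_, lo, hi = gene_burden_or(16, 556, 3, 643)
```

In practice, burden tests such as SKAT or CMC add variant weighting and permutation-based inference, but the carrier-collapsing step is the same.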


Subject(s)
Carcinoma, Hepatocellular , Liver Neoplasms , Humans , Liver Neoplasms/genetics , Liver Neoplasms/pathology , Carcinoma, Hepatocellular/genetics , Carcinoma, Hepatocellular/pathology , DNA Methylation/genetics , Genetic Predisposition to Disease , Case-Control Studies , Germ Cells/pathology , Carbon , Polymorphism, Single Nucleotide/genetics
2.
Ann Neurol ; 93(4): 830-843, 2023 04.
Article in English | MEDLINE | ID: mdl-36546684

ABSTRACT

OBJECTIVE: Recent evidence supports a link between increased TDP-43 burden and the presence of an APOE4 gene allele in Alzheimer's disease (AD); however, it is difficult to conclude the direct effect of APOE on TDP-43 pathology due to the presence of mixed AD pathologies. The goal of this study is to address how APOE isoforms impact TDP-43 pathology and related neurodegeneration in the absence of typical AD pathologies. METHODS: We overexpressed human TDP-43 via viral transduction in humanized APOE2, APOE3, APOE4 mice, and murine Apoe-knockout (Apoe-KO) mice. Behavior tests were performed across ages. Animals were harvested at 11 months of age and TDP-43 overexpression-related neurodegeneration and gliosis were assessed. To further address the human relevance, we analyzed the association of APOE with TDP-43 pathology in 160 postmortem brains from autopsy-confirmed amyotrophic lateral sclerosis (ALS) and frontotemporal lobar degeneration with motor neuron disease (FTLD-MND) in the Mayo Clinic Brain Bank. RESULTS: We found that TDP-43 overexpression induced motor function deficits, neuronal loss, and gliosis in the motor cortex, especially in APOE2 mice, with much milder or absent effects in APOE3, APOE4, or Apoe-KO mice. In the motor cortex of the ALS and FTLD-MND postmortem human brains, we found that the APOE2 allele was associated with more severe TDP-43-positive dystrophic neurites. INTERPRETATION: Our data suggest a genotype-specific effect of APOE on TDP-43 proteinopathy and neurodegeneration in the absence of AD pathology, with the strongest association seen with APOE2. ANN NEUROL 2023;93:830-843.


Subject(s)
Alzheimer Disease , Amyotrophic Lateral Sclerosis , Frontotemporal Dementia , Frontotemporal Lobar Degeneration , Motor Neuron Disease , Humans , Animals , Mice , Amyotrophic Lateral Sclerosis/genetics , Apolipoprotein E2/genetics , Alzheimer Disease/genetics , Alzheimer Disease/pathology , Apolipoprotein E4/genetics , Apolipoprotein E3 , Gliosis/genetics , DNA-Binding Proteins/genetics , Apolipoproteins E/genetics , Frontotemporal Lobar Degeneration/pathology
3.
Acta Neuropathol ; 147(1): 54, 2024 03 12.
Article in English | MEDLINE | ID: mdl-38472443

ABSTRACT

Rare and common GBA variants are risk factors for both Parkinson's disease (PD) and dementia with Lewy bodies (DLB). However, the degree to which GBA variants are associated with neuropathological features in Lewy body disease (LBD) is unknown. Herein, we assessed 943 LBD cases and examined associations of 15 different neuropathological outcomes with common and rare GBA variants. Neuropathological outcomes included LBD subtype, presence of a high likelihood of clinical DLB (per consensus guidelines), LB counts in five cortical regions, tyrosine hydroxylase immunoreactivity in the dorsolateral and ventromedial putamen, ventrolateral substantia nigra neuronal loss, Braak neurofibrillary tangle (NFT) stage, Thal amyloid phase, phospho-ubiquitin (pS65-Ub) level, TDP-43 pathology, and vascular disease. Sequencing of GBA exons revealed a total of 42 different variants (4 common [MAF > 0.5%], 38 rare [MAF < 0.5%]) in our series, and 165 cases (17.5%) had a copy of the minor allele for ≥ 1 variant. In analysis of common variants, p.L483P was associated with a lower Braak NFT stage (OR = 0.10, P < 0.001). In gene-burden analysis, presence of the minor allele for any GBA variant was associated with increased odds of a high likelihood of DLB (OR = 2.00, P < 0.001), a lower Braak NFT stage (OR = 0.48, P < 0.001), a lower Thal amyloid phase (OR = 0.55, P < 0.001), and a lower pS65-Ub level (β = -0.37, P < 0.001). Subgroup analysis revealed that GBA variants were most common in LBD cases with a combination of transitional/diffuse LBD and Braak NFT stage 0-II or Thal amyloid phase 0-1, and correspondingly that the aforementioned associations of GBA gene-burden with a decreased Braak NFT stage and Thal amyloid phase were observed only in transitional or diffuse LBD cases. Our results indicate that in LBD, GBA variants occur most frequently in cases with greater LB pathology and low AD pathology, further informing disease-risk associations of GBA in PD, PD dementia, and DLB.


Subject(s)
Alzheimer Disease , Lewy Body Disease , Parkinson Disease , Humans , Lewy Body Disease/pathology , Parkinson Disease/pathology , Alzheimer Disease/pathology , Substantia Nigra/pathology , Neurofibrillary Tangles/pathology
4.
Clin Transplant ; 38(4): e15311, 2024 04.
Article in English | MEDLINE | ID: mdl-38616569

ABSTRACT

BACKGROUND: Simultaneous liver kidney (SLK) transplant protects against acute cellular rejection. In 2017, UNOS implemented a "safety net" policy to allow patients with renal recovery to avoid renal transplantation. Whether kidney after liver transplantation (KALT) increases the risk of rejection is unknown. METHODS: We performed a retrospective analysis of the Organ Procurement and Transplantation Network (OPTN) database of adult patients who received a liver transplant alone (LTA), SLK, or KALT between 2010 and 2020. We examined rejection of the liver within 6 months and 1 year of the liver transplant, rejection of the kidney within 6 months and 1 year of receiving the kidney, and patient and graft loss. RESULTS: A total of 66 079 patients were transplanted: 60 168 with LTA, 5627 with SLK, and 284 with KALT. Acute or chronic liver rejection rates within 6 or 12 months were statistically higher in the KALT group (10.0% and 10.9%) compared to the SLK group (6.1% and 7.5%), but comparable to the LTA group (9.3% and 11.1%). Kidney rejection and graft survival rates were not different. Liver graft survival was worse in KALT than SLK or LTA (Kaplan-Meier estimates 0.61 vs. 0.89 and 0.90), but these patients were more ill at the time of transplantation. Kidney Donor Profile Index (KDPI) and Liver Donor Risk Index (LDRI) scores were notably lower in the SLK than the KALT group. Patient survival was not clinically different between the groups. CONCLUSION: KALT does not increase the risk of acute or chronic kidney rejection. SLK has a lower risk of early liver rejection, but this effect diminishes by one year to being not clinically different compared to KALT. Given that KALT is immunologically safe and potentially avoids unnecessary renal graft use, it should be preferred over SLK.
BRIEF SUMMARY: Patients undergoing sequential kidney after liver transplant do not have an increased risk of liver or kidney rejection when compared to liver transplant alone or simultaneous liver and kidney transplant.


Subject(s)
Kidney Transplantation , Liver Transplantation , Adult , Humans , Liver Transplantation/adverse effects , Retrospective Studies , Liver , Kidney , Kidney Transplantation/adverse effects
5.
BMC Ophthalmol ; 24(1): 255, 2024 Jun 13.
Article in English | MEDLINE | ID: mdl-38872120

ABSTRACT

BACKGROUND: Vitreoretinal lymphoma (VRL) is a rare intraocular malignancy that poses a diagnostic challenge due to the non-specific clinical presentation that resembles uveitis. The use of spectral domain optical coherence tomography (SD-OCT) has emerged as a valuable imaging tool to characterize VRL. Therefore, we sought to determine the specific OCT features in VRL compared to the uveitides. METHODS: Retrospective chart review of patients who were seen at Mayo Clinic from January 1, 2010 through December 31, 2022. The medical records and SD-OCT images at time of initial presentation were reviewed in patients with biopsy-proven VRL, intermediate uveitis, or biopsy-confirmed sarcoid posterior uveitis. Patients with VRL or similar uveitides including intermediate uveitis or sarcoid posterior uveitis were included. RESULTS: There were 95 eyes of 56 patients in the VRL group and 86 eyes of 45 patients in the uveitis group, of whom 15 (33.3%) were diagnosed with intermediate uveitis and 30 (66.7%) with sarcoid chorioretinitis. The SD-OCT features more commonly seen at initial presentation in VRL patients (vs. uveitis) included preretinal deposits (31.6% vs. 9.3%, p = 0.002), intraretinal infiltrates (34% vs. 3.5%, p < 0.001), inner retinal hyperreflective spots (15.8% vs. 0%, p < 0.001), outer retinal atrophy (22.1% vs. 2.3%, p < 0.001), subretinal focal deposits (21.1% vs. 4.7%, p = 0.001), retinal pigmented epithelium (RPE) changes (49.5% vs. 3.5%, p < 0.001), and sub-RPE deposits (34.7% vs. 0%, p < 0.001). Features more frequently seen in uveitis included epiretinal membrane (ERM) (82.6% vs. 44.2%, p < 0.001), central macular thickening (95.3% vs. 51.6%, p < 0.001), cystoid macular edema (36% vs. 11.7%, p < 0.001), subretinal fluid (16.3% vs 6.4%, p = 0.04), and subfoveal fluid (16.3% vs. 3.2%, p = 0.003). 
Multivariate regression analysis controlling for age and sex showed that absence of ERM (OR 0.14 [0.04, 0.41], p < 0.001) and absence of central macular thickening (OR 0.03 [0, 0.15], p = 0.02) were associated with VRL as opposed to uveitis. CONCLUSION: The OCT features most predictive of VRL (vs. uveitis) were absence of ERM and absence of central macular thickening.


Subject(s)
Retinal Neoplasms , Tomography, Optical Coherence , Uveitis , Vitreous Body , Humans , Tomography, Optical Coherence/methods , Retrospective Studies , Male , Female , Middle Aged , Retinal Neoplasms/diagnosis , Retinal Neoplasms/diagnostic imaging , Aged , Vitreous Body/pathology , Vitreous Body/diagnostic imaging , Uveitis/diagnosis , Adult , Intraocular Lymphoma/diagnosis , Visual Acuity , Diagnosis, Differential , Aged, 80 and over
6.
J Clin Rheumatol ; 2024 Mar 05.
Article in English | MEDLINE | ID: mdl-38446195

ABSTRACT

OBJECTIVE: The aims of this study were to assess whether a relationship between anti-SSA-52 and interstitial lung disease (ILD) can be further defined, and to enhance screening, detection, and potentially guide treatment. METHODS: A historical cohort study of 201 patients was conducted at a single tertiary care center between January 1, 2016 and December 31, 2020. All included patients were anti-SSA-52 antibody positive. Chart review was performed for laboratory values, symptoms, pulmonary function tests, treatment, and imaging. Chest computed tomography (CT) scans were reviewed by chest radiologists. RESULTS: Among anti-SSA-52 antibody-positive patients, ILD was found in 125 (62.2%) compared with 76 (37.8%) with no ILD (p = 0.001). For those with ILD, 78 (62.4%) were diagnosed with connective tissue disease (CTD)-associated ILD, 28 (22.4%) were diagnosed with ILD only, and 19 (15.2%) met the criteria for interstitial pneumonia with autoimmune features. In patients with CTD-ILD, 18 (23.0%) had their ILD diagnosis made over 6 months before a CTD diagnosis, and an additional 43 (55.1%) had their ILD and CTD diagnosed within 6 months of each other (p < 0.001). Common CT patterns were nonspecific interstitial pneumonia/organizing pneumonia overlap in 44 (35.2%), nonspecific interstitial pneumonia in 25 (20.0%), and usual interstitial pneumonia in 15 (12.0%). Of the 78 patients with CTD-ILD, 28 (35.9%) had antisynthetase syndrome, followed by 16 (20.5%) with dermatomyositis, 10 (12.8%) with CTD overlap, and 6 (7.7%) with systemic scleroderma. CONCLUSIONS: There was a significant association between anti-SSA-52 antibodies and ILD across a wide spectrum of rheumatological diagnoses. A significant portion of patients were diagnosed with ILD either at the same time as or before their CTD diagnosis. Further study will be needed to assess effective treatment and response.

7.
Liver Transpl ; 2023 Nov 29.
Article in English | MEDLINE | ID: mdl-38015446

ABSTRACT

The number of kidney after liver transplants (KALT) increased after the implementation of the United Network for Organ Sharing (UNOS) safety net policy, but the effects of the policy on KALT outcomes remain unknown. Using the UNOS database, we identified KALT between 60 and 365 days from liver transplant from January 1, 2010, to December 31, 2020. The main outcome was 1- and 3-year patient, liver, and kidney graft survival. Secondary outcomes included 6-month and 1-year acute rejection (AR) of liver and kidney, and 1-year kidney allograft function. Of the 256 KALT, 90 were pre-policy and 166 post-policy. Compared to pre-policy, post-policy 1- and 3-year liver graft survival was higher (54% and 54% vs. 86% and 81%, respectively, p < 0.001), while 1- and 3-year kidney graft survival (99% and 75% vs. 92% and 79%, respectively, p = 0.19) and 1- and 3-year patient survival (99% and 99% vs. 95% and 89%, respectively, p = 0.11) were not significantly different. Subgroup analysis revealed similar trends in patients with and without renal failure at liver transplant. Liver AR at 6 months was lower post-policy (6.3% vs. 18.3%, p = 0.006) but was similar (10.5% vs. 13%, p = 0.63) at 1 year. Kidney AR was unchanged post-policy at 6 months and 1 year. Creatinine at 1 year did not differ post-policy versus pre-policy (1.4 vs. 1.3 mg/dL, p = 0.07) despite a higher proportion of deceased donors, higher Kidney Donor Profile Index, and longer kidney cold ischemia time post-policy (p < 0.05 for all). This 3-year follow-up after the 2017 UNOS policy revision demonstrated that the safety net implementation has resulted in improved liver outcomes for patients who underwent KALT, with no increased AR of the liver or the kidney allografts.

8.
J Comput Assist Tomogr ; 47(3): 382-389, 2023.
Article in English | MEDLINE | ID: mdl-37185000

ABSTRACT

OBJECTIVE: We sought to determine the prevalence of, and possible features associated with symptoms in, adult patients diagnosed with an aberrant right subclavian artery (ARSA). METHODS: In this single-center retrospective study, 386 adult patients were diagnosed with ARSA on chest CT scans performed between June 2016 and April 2021. Patients were grouped by the presence of symptoms, which included dysphagia, shortness of breath, cough, and upper airway wheezing. Four cardiothoracic radiologists reviewed the chest CT scans to assess features of ARSA. Agreement and multivariable logistic regression analyses were performed to determine interobserver variability and features associated with the presence of symptoms, respectively. RESULTS: The prevalence of ARSA was 1.02%, and 81.3% of patients were asymptomatic. Shortness of breath (74.6%) was the most common symptom. Interobserver agreement was acceptable, with most variables having an intraclass correlation coefficient or κ > 0.80. A patient's height > 158 cm (OR: 2.50, P = 0.03), cross-sectional area > 60 mm² of ARSA at the level of the esophagus (OR: 2.39, P = 0.046), and angle > 108 degrees formed with the aortic arch (OR: 1.99, P = 0.03) were associated with the presence of symptoms on multivariable logistic regression. A distance increase per 1 mm between ARSA and trachea (OR: 0.85, P = 0.02) was associated with decreased odds of symptoms. CONCLUSIONS: Aberrant right subclavian artery is an incidental finding in most adult patients. The cross-sectional area at the level of the esophagus, angle formed with the medial wall of the aortic arch, distance between the ARSA and the trachea, and a patient's height were features associated with the presence of symptoms.


Subject(s)
Subclavian Artery , Tomography, X-Ray Computed , Humans , Adult , Retrospective Studies , Subclavian Artery/diagnostic imaging , Dyspnea
9.
Endocr Pract ; 29(3): 155-161, 2023 Mar.
Article in English | MEDLINE | ID: mdl-36566985

ABSTRACT

OBJECTIVE: Patients hospitalized with COVID-19 and hyperglycemia require frequent glucose monitoring, usually performed with glucometers. Continuous glucose monitors (CGMs) are common in the outpatient setting but not yet approved for hospital use. We evaluated CGM accuracy, safety for insulin dosing, and CGM clinical reliability in 20 adult patients hospitalized with COVID-19 and hyperglycemia. METHODS: Study patients were fitted with a remotely monitored CGM. CGM values were evaluated against glucometer readings. The CGM sensor calibration was performed if necessary. CGM values were used to dose insulin, without glucometer confirmation. RESULTS: CGM accuracy against glucometer, expressed as mean absolute relative difference (MARD), was calculated using 812 paired glucometer-CGM values. The aggregate MARD was 10.4%. For time in range and grades 1 and 2 hyperglycemia, MARD was 11.4%, 9.4%, and 9.1%, respectively, with a small variation between medical floors and intensive care units. There was no MARD correlation with mean arterial blood pressure levels, oxygen saturation, daily hemoglobin levels, and glomerular filtration rates. CGM clinical reliability was high, with 99.7% of the CGM values falling within the "safe" zones of Clarke error grid. After CGM placement, the frequency of glucometer measurements decreased from 5 to 3 and then 2 per day, reducing nurse presence in patient rooms and limiting viral exposure. CONCLUSION: With twice daily, on-demand calibration, the inpatient CGM use was safe for insulin dosing, decreasing the frequency of glucometer fingersticks. For glucose levels >70 mg/dL, CGMs showed adequate accuracy, without interference from vital and laboratory values.
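The accuracy metric used in this study, mean absolute relative difference (MARD), is simply the average of |CGM − reference| / reference over paired readings, expressed as a percentage. A minimal sketch with made-up paired readings (not the study's data):

```python
def mard(reference, sensor):
    """Mean absolute relative difference (%) between paired reference
    (glucometer) and sensor (CGM) glucose readings."""
    if not reference or len(reference) != len(sensor):
        raise ValueError("need equal-length, non-empty paired readings")
    rel = [abs(s - r) / r for r, s in zip(reference, sensor)]
    return 100.0 * sum(rel) / len(rel)

# Hypothetical paired readings in mg/dL, for illustration only.
meter = [100, 150, 200, 250]
cgm = [110, 140, 210, 240]
result = mard(meter, cgm)  # lower is better; ~10% is typical for modern CGMs
```

Note that MARD treats the glucometer as ground truth and gives no information about the direction of error, which is why studies like this one pair it with a Clarke error grid analysis of clinical risk.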


Subject(s)
COVID-19 , Diabetes Mellitus, Type 1 , Hyperglycemia , Adult , Humans , Blood Glucose , Blood Glucose Self-Monitoring , Reproducibility of Results , Tertiary Care Centers , Insulin , Insulin, Regular, Human
10.
Breast Cancer Res ; 24(1): 45, 2022 07 11.
Article in English | MEDLINE | ID: mdl-35821041

ABSTRACT

BACKGROUND: Breast terminal duct lobular units (TDLUs), the source of most breast cancer (BC) precursors, are shaped by age-related involution, a gradual process, and postpartum involution (PPI), a dramatic inflammatory process that restores baseline microanatomy after weaning. Dysregulated PPI is implicated in the pathogenesis of postpartum BCs. We propose that assessment of TDLUs in the postpartum period may have value in risk estimation, but characteristics of these tissues in relation to epidemiological factors are incompletely described. METHODS: Using validated artificial intelligence (AI) and morphometric methods, we analyzed digitized images of tissue sections of normal breast tissues stained with hematoxylin and eosin from donors ≤ 45 years from the Komen Tissue Bank (180 parous and 545 nulliparous). Metrics assessed by AI included: TDLU count; adipose tissue fraction; mean acini count/TDLU; mean dilated acini; mean average acini area; mean "capillary" area; mean epithelial area; mean ratio of epithelial area versus intralobular stroma; mean mononuclear cell count (surrogate of immune cells); mean fat area proximate to TDLUs; and TDLU area. We compared epidemiologic characteristics collected via questionnaire by parity status and race, using a Wilcoxon rank sum test or Fisher's exact test. Histologic features were compared between nulliparous and parous women (overall and by time between last birth and donation [recent birth: ≤ 5 years versus remote birth: > 5 years]) using multivariable regression models. RESULTS: Normal breast tissues of parous women contained significantly higher TDLU counts and acini counts, more frequent dilated acini, higher mononuclear cell counts in TDLUs, and smaller acini area per TDLU than those of nulliparous women (all multivariable analyses p < 0.001). Differences in TDLU counts and average acini size persisted for > 5 years postpartum, whereas increases in immune cells were most marked within 5 years of a birth.
Relationships were suggestively modified by several other factors, including demographic and reproductive characteristics, ethanol consumption and breastfeeding duration. CONCLUSIONS: Our study identified sustained expansion of TDLU numbers and reduced average acini area among parous versus nulliparous women and notable increases in immune responses within five years following childbirth. Further, we show that quantitative characteristics of normal breast samples vary with demographic features and BC risk factors.


Subject(s)
Breast Neoplasms , Mammary Glands, Human , Artificial Intelligence , Breast/pathology , Breast Neoplasms/pathology , Female , Humans , Mammary Glands, Human/pathology , Parity , Pregnancy
11.
Breast Cancer Res Treat ; 194(1): 149-158, 2022 Jul.
Article in English | MEDLINE | ID: mdl-35503494

ABSTRACT

PURPOSE: Breast terminal duct lobular units (TDLUs) are the main source of breast cancer (BC) precursors. Higher serum concentrations of hormones and growth factors have been linked to increased TDLU numbers and to elevated BC risk, with variable effects by menopausal status. We assessed associations of circulating factors with breast histology among premenopausal women using artificial intelligence (AI) and preliminarily tested whether parity modifies associations. METHODS: Pathology AI analysis was performed on 316 digital images of H&E-stained sections of normal breast tissues from Komen Tissue Bank donors ages ≤ 45 years to assess 11 quantitative metrics. Associations of circulating factors with AI metrics were assessed using regression analyses, with inclusion of interaction terms to assess effect modification. RESULTS: Higher prolactin levels were related to larger TDLU area (p < 0.001) and increased presence of adipose tissue proximate to TDLUs (p < 0.001), with less significant positive associations for acini counts (p = 0.012), dilated acini (p = 0.043), capillary area (p = 0.014), epithelial area (p = 0.007), and mononuclear cell counts (p = 0.017). Testosterone levels were associated with increased TDLU counts (p < 0.001), irrespective of parity, but associations differed by adipose tissue content. AI data for TDLU counts generally agreed with prior visual assessments. CONCLUSION: Among premenopausal women, serum hormone levels linked to BC risk were also associated with quantitative features of normal breast tissue. These relationships were suggestively modified by parity status and tissue composition. We conclude that the microanatomic features of normal breast tissue may represent a marker of BC risk.


Subject(s)
Breast Neoplasms , Artificial Intelligence , Breast/pathology , Breast Neoplasms/pathology , Female , Hormones/metabolism , Humans , Middle Aged , Risk Factors
12.
World J Surg ; 46(10): 2468-2475, 2022 10.
Article in English | MEDLINE | ID: mdl-35854013

ABSTRACT

BACKGROUND: Abdominal arterial calcification (AAC) is common among candidates for kidney transplant. The aim of this study was to correlate the AAC score with post-kidney transplant outcomes. METHODS: We modified the coronary calcium score by changing the input data points and used it to quantitate AAC. We conducted a retrospective clinical study of all adult patients who were transplanted at our center between 2010 and 2013 and had an abdominal computed tomography scan before transplantation. Outcomes included mortality, pulse pressure (PP) measured by a 24-hour ambulatory blood pressure monitoring system, and kidney allograft function measured by iothalamate clearance. RESULTS: Each increase of 1000 in AAC score was associated with a 1.05-fold increase in the risk of death (95% CI 1.02, 1.08; p < 0.001). The overall median AAC score for all patients was 1784; Kaplan-Meier curves showed reduced survival from all-cause mortality for patients with an AAC score above the median, as well as reduced survival among patients with cardiac-related mortality. Iothalamate clearance was lower among patients with a total AAC score above the median. Patients with abnormal PP (< 40 or > 60 mmHg) had an elevated median AAC score of 4319.3 (IQR 1210.4, 11097.1) compared to patients with normal PP, whose median AAC score was 595.9 (IQR 9.9, 2959.9) (p < 0.001). CONCLUSION: We showed an association of AAC with patient survival and kidney allograft function after kidney transplant. The AAC score could be used for risk stratification when patients are considered for kidney transplant.


Subject(s)
Aortic Diseases , Kidney Transplantation , Vascular Calcification , Adult , Allografts , Aorta, Abdominal , Blood Pressure Monitoring, Ambulatory/adverse effects , Humans , Iothalamic Acid , Kidney , Kidney Transplantation/adverse effects , Retrospective Studies , Risk Factors , Vascular Calcification/complications , Vascular Calcification/diagnostic imaging
13.
Am J Emerg Med ; 57: 98-102, 2022 07.
Article in English | MEDLINE | ID: mdl-35533574

ABSTRACT

OBJECTIVE: An artificial intelligence (AI) algorithm has been developed to detect the electrocardiographic signature of atrial fibrillation (AF) present on an electrocardiogram (ECG) obtained during normal sinus rhythm. We evaluated the ability of this algorithm to predict incident AF in an emergency department (ED) cohort of patients presenting with palpitations without concurrent AF. METHODS: This retrospective study included patients 18 years and older who presented with palpitations to one of 15 ED sites and had a 12-lead ECG performed. Patients with prior AF or newly diagnosed AF during the ED visit were excluded. Of the remaining patients, those with a follow-up ECG or Holter monitor in the subsequent year were included. We evaluated the performance of the AI-ECG output to predict incident AF within one year of the index ECG by estimating an area under the receiver operating characteristic curve (AUC). Sensitivity, specificity, and positive and negative predictive values were determined at the optimum threshold (maximizing sensitivity and specificity) and at thresholds by output decile for the sample. RESULTS: A total of 1403 patients were included. Forty-three (3.1%) patients were diagnosed with new AF during the following year. The AI-ECG algorithm predicted AF with an AUC of 0.74 (95% CI 0.68-0.80), and an optimum threshold with sensitivity of 79.1% (95% confidence interval [CI] 66.9%-91.2%) and specificity of 66.1% (95% CI 63.6%-68.6%). CONCLUSIONS: The AI-ECG algorithm was a statistically significant predictor of incident AF, although its clinical utility for screening purposes is limited in this ED population with a low incidence of AF.
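An "optimum threshold (maximizing sensitivity and specificity)," as used in this abstract, is conventionally chosen by maximizing Youden's J statistic (sensitivity + specificity − 1) across candidate cutoffs. A small sketch on toy scores and labels (not the study's data):

```python
def youden_optimum(scores, labels):
    """Scan candidate thresholds and return (J, threshold, sensitivity,
    specificity) for the cutoff maximizing Youden's J = sens + spec - 1.
    A score >= threshold is treated as a positive prediction."""
    best = None
    for t in sorted(set(scores)):
        tp = sum(1 for s, y in zip(scores, labels) if s >= t and y)
        fn = sum(1 for s, y in zip(scores, labels) if s < t and y)
        tn = sum(1 for s, y in zip(scores, labels) if s < t and not y)
        fp = sum(1 for s, y in zip(scores, labels) if s >= t and not y)
        sens = tp / (tp + fn) if tp + fn else 0.0
        spec = tn / (tn + fp) if tn + fp else 0.0
        j = sens + spec - 1
        if best is None or j > best[0]:
            best = (j, t, sens, spec)
    return best

# Toy classifier outputs and outcomes, for illustration only.
scores = [0.1, 0.2, 0.35, 0.4, 0.6, 0.8, 0.9]
labels = [0, 0, 0, 1, 0, 1, 1]
j, t, sens, spec = youden_optimum(scores, labels)
```

Because Youden's J weights sensitivity and specificity equally, it ignores prevalence; with only 3.1% incident AF, even a J-optimal threshold yields a low positive predictive value, which is the limitation the conclusion describes.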


Subject(s)
Atrial Fibrillation , Artificial Intelligence , Atrial Fibrillation/diagnosis , Electrocardiography , Emergency Service, Hospital , Humans , Retrospective Studies
14.
Liver Transpl ; 27(9): 1291-1301, 2021 09.
Article in English | MEDLINE | ID: mdl-33687745

ABSTRACT

Pre-liver transplantation (LT) renal dysfunction is associated with poor post-LT survival. We studied whether early allograft dysfunction (EAD) modifies this association. Data on 2,856 primary LT recipients who received a transplant between 1998 and 2018 were retrospectively reviewed. Patients who died within the first post-LT week, multiorgan transplant recipients, and previous LT recipients were excluded. EAD was defined as (1) total bilirubin ≥ 10 mg/dL on postoperative day (POD) 7, (2) international normalized ratio ≥ 1.6 on POD 7, and/or (3) alanine aminotransferase or aspartate aminotransferase ≥ 2000 IU/L in the first postoperative week. Pre-LT renal dysfunction was defined as serum creatinine > 1.5 mg/dL or being on renal replacement therapy at LT. Patients were divided into 4 groups according to pre-LT renal dysfunction and post-LT EAD development. Recipients who had both pre-LT renal dysfunction and post-LT EAD had the worst unadjusted 1-year, 3-year, and 5-year post-LT patient and graft survival, whereas patients who had neither renal dysfunction nor EAD had the best survival (P < 0.001). After adjusting for multiple factors, the risk of death was significantly higher only in those with both pre-LT renal dysfunction and post-LT EAD (adjusted hazard ratio [aHR], 2.19; 95% confidence interval [CI], 1.58-3.03; P < 0.001), whereas those with renal dysfunction and no EAD had a comparable risk of death to those with normal kidney function at LT (aHR, 1.12; 95% CI, 0.86-1.45; P = 0.41). Results remained unchanged when pre-LT renal dysfunction was redefined using different glomerular filtration rate cutoffs. Pre-LT renal dysfunction negatively impacts post-LT survival only in patients who develop EAD. Livers at higher risk of post-LT EAD should be used with caution in recipients with pre-LT renal dysfunction.
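The three-part EAD definition quoted in this abstract maps directly onto a simple classification rule. A minimal sketch using the abstract's thresholds (ALT/AST expressed in IU/L, the conventional unit; the function name and argument layout are illustrative, not from the study):

```python
def early_allograft_dysfunction(bilirubin_pod7, inr_pod7, peak_alt, peak_ast):
    """Flag early allograft dysfunction (EAD) per the abstract's criteria:
    POD-7 total bilirubin >= 10 mg/dL, POD-7 INR >= 1.6, or peak ALT/AST
    >= 2000 IU/L within the first postoperative week. Any one criterion
    suffices."""
    return (bilirubin_pod7 >= 10.0
            or inr_pod7 >= 1.6
            or max(peak_alt, peak_ast) >= 2000.0)

# Illustrative only: POD-7 bilirubin 11 mg/dL alone triggers the flag.
flagged = early_allograft_dysfunction(11.0, 1.2, 500, 600)
```

Because the criteria are joined by "and/or," a recipient is classified as having EAD as soon as any single threshold is met, which is how the four pre-LT renal dysfunction × EAD groups in the analysis would be constructed.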


Subject(s)
Kidney Diseases , Liver Transplantation , Allografts , Graft Survival , Humans , Kidney , Liver , Liver Transplantation/adverse effects , Retrospective Studies , Risk Factors
15.
Cardiology ; 146(1): 106-115, 2021.
Article in English | MEDLINE | ID: mdl-32810847

ABSTRACT

INTRODUCTION: Percutaneous left atrial appendage closure is an established alternative to anticoagulation therapy for stroke prophylaxis among patients with nonvalvular atrial fibrillation. There are currently no guidelines on the choice of antithrombotic therapy following placement of the Watchman® device, the optimal time to discontinue anticoagulation, or the duration of follow-up imaging after device deployment. Our main objective was to evaluate clinical outcomes among these patients. METHODS: We conducted a retrospective review of patients who received a Watchman® device at Mayo Clinic sites between January 2010 and December 2018. We constructed Cox proportional hazards models to evaluate the effect of specific variables on clinical outcomes. RESULTS: 231 patients were identified (33% female); median age was 77 years, median CHA₂DS₂-VASc score was 5, and median HAS-BLED score was 4. We found no difference in clinically significant bleeding based on initial antithrombotic choice. However, patients with prior gastrointestinal bleeding were more likely to have a bleeding event in the first 6 weeks following Watchman® implantation (HR 9.40, 95% CI 2.15-41.09). Device sizes of 24-27 mm were significantly associated with a decreased risk of thromboembolic events (HR 0.15, 95% CI 0.04-0.55) compared to 21-mm devices. Peridevice leak (PDL) sizes appeared to either remain the same or increase on follow-up imaging. DISCUSSION/CONCLUSIONS: This observational study showed no statistically significant difference in bleeding risk related to initial antithrombotic choice. Smaller device sizes were associated with thromboembolic events, and longitudinal PDL assessment using transesophageal echocardiography showed that leaks frequently do not decrease in size. Larger studies are needed.


Subject(s)
Atrial Appendage , Stroke , Aged , Anticoagulants/therapeutic use , Atrial Appendage/diagnostic imaging , Atrial Appendage/surgery , Female , Humans , Male , Retrospective Studies , Stroke/etiology , Stroke/prevention & control , Treatment Outcome
16.
J Thromb Thrombolysis ; 51(4): 1059-1066, 2021 May.
Article in English | MEDLINE | ID: mdl-33538988

ABSTRACT

Distal (calf) deep vein thrombosis (DVT) is said to have low rates of propagation, embolization, and recurrence. The objective of this study was to determine outcomes among cancer patients with calf DVT compared to those without cancer. Consecutive patients with ultrasound-confirmed acute calf DVT (3/1/2013-8/10/2019) were assessed for venous thromboembolism (VTE) recurrence and bleeding outcomes, compared by cancer status. There were 830 patients with isolated calf DVT: 243 with cancer and 587 without cancer. Cancer patients were older (65.9 ± 11.4 vs. 62.0 ± 15.9 years; p = 0.006), with less frequent recent hospitalization (31.7% vs. 48.0%; p < 0.001), surgery (30.0% vs. 38.0%; p = 0.03), or trauma (3.7% vs. 19.9%; p < 0.001). The four most common cancers were hematologic malignancies (20.6%), lung (11.5%), gastrointestinal (10.3%), and ovarian/GYN (9.1%). Nearly half of cancer patients had metastatic disease (43.8%), and 57% were receiving chemotherapy. VTE recurrence rates were similar for patients with (7.1%) and without cancer (4.0%; p = 0.105). Major bleeding rates were greater for cancer patients (6.3% vs. 2.3%; p = 0.007), while clinically relevant nonmajor bleeding rates did not differ (7.1% vs. 4.6%; p = 0.159). In this retrospective analysis, cancer patients with calf DVT had similar rates of VTE recurrence but higher rates of major bleeding compared to patients without cancer.


Subject(s)
Mesenteric Ischemia , Venous Thromboembolism , Venous Thrombosis , Anticoagulants , Hemorrhage/etiology , Humans , Neoplasm Recurrence, Local , Recurrence , Retrospective Studies , Risk Factors
17.
Ann Hepatol ; 24: 100317, 2021.
Article in English | MEDLINE | ID: mdl-33545403

ABSTRACT

INTRODUCTION AND OBJECTIVES: Renal dysfunction before liver transplantation (LT) is associated with higher post-LT mortality. We aimed to study whether this association persisted in the contemporary transplant era. MATERIALS AND METHODS: We retrospectively reviewed data on 2871 primary LT performed at our center from 1998 to 2018. All patients were listed for LT alone and were not considered to be simultaneous liver-kidney (SLK) transplant candidates. SLK recipients and those with previous LT were excluded. Patients were grouped into 4 eras: era-1 (1998-2002, n = 488), era-2 (2003-2007, n = 889), era-3 (2008-2012, n = 703) and era-4 (2013-2018, n = 791). Pre-LT renal dysfunction was defined as creatinine (Cr) >1.5 mg/dl or on dialysis at LT. The effect of pre-LT renal dysfunction on post-LT patient survival in each era was examined using Kaplan-Meier estimates and univariate and multivariate Cox proportional hazards analyses. RESULTS: Pre-LT renal dysfunction was present in 594 (20%) recipients. Compared to patients in era-1, patients in era-4 had higher Cr, lower eGFR and were more likely to be on dialysis at LT (P < 0.001). Pre-LT renal dysfunction was associated with worse 1-, 3- and 5-year survival in era-1 and era-2 (P < 0.005) but not in era-3 or era-4 (P = 0.13 and P = 0.08, respectively). Multivariate analysis demonstrated that pre-LT renal dysfunction had no independent effect on post-LT mortality in era-3 and era-4. A separate analysis using eGFR <60 mL/min/1.73 m2 at LT to define renal dysfunction showed similar results. CONCLUSIONS: Pre-LT renal dysfunction had less impact on post-LT survival in the contemporary transplant era.
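The Kaplan-Meier estimates used in this study can be illustrated with a minimal sketch of the product-limit estimator. This is not the authors' analysis code, and the toy follow-up times below are invented for illustration only:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier product-limit estimator.

    times  : follow-up time for each subject
    events : 1 if the event (e.g., death) occurred, 0 if censored
    Returns a list of (time, survival_probability) pairs, one per event time.
    """
    # Sort subjects by follow-up time so we can walk forward through the risk set.
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    survival = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = 0   # events observed exactly at time t
        leaving = 0  # all subjects (events + censored) leaving the risk set at t
        while i < len(data) and data[i][0] == t:
            deaths += data[i][1]
            leaving += 1
            i += 1
        if deaths:
            # Multiply in the conditional probability of surviving past t.
            survival *= (n_at_risk - deaths) / n_at_risk
            curve.append((t, survival))
        n_at_risk -= leaving
    return curve


# Hypothetical example: 5 subjects, censored at times 3 and 5.
curve = kaplan_meier([1, 2, 3, 4, 5], [1, 1, 0, 1, 0])
```

Censored subjects (here at times 3 and 5) contribute to the risk set until they drop out but never trigger a step in the curve, which is exactly what lets survival be compared across eras with unequal follow-up.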


Subject(s)
Liver Diseases/complications , Liver Diseases/mortality , Liver Transplantation , Renal Insufficiency/complications , Aged , Creatinine/blood , Female , Glomerular Filtration Rate , Graft Survival , Humans , Kaplan-Meier Estimate , Liver Diseases/surgery , Male , Middle Aged , Proportional Hazards Models , Renal Dialysis , Renal Insufficiency/diagnosis , Renal Insufficiency/mortality , Retrospective Studies , Risk Factors , Survival Rate , Time Factors
18.
South Med J ; 114(7): 432-437, 2021 07.
Article in English | MEDLINE | ID: mdl-34215897

ABSTRACT

OBJECTIVE: To assess the clinical characteristics and clinical outcomes of bradycardic patients with coronavirus disease 2019 (COVID-19) pneumonia. METHODS: The electronic medical records of 221 consecutive patients hospitalized for COVID-19 pneumonia between June and September 2020 were retrospectively reviewed. Patient characteristics, electrocardiographic data, and clinical and laboratory information were retrospectively collected. Patients not treated with drugs that blunt chronotropic response (nodal agents) were analyzed separately. RESULTS: Only patients whose heart rate was <60 beats per minute (bpm) (136/221, 61.5%) were included. Serial electrocardiography revealed that most patients (130/137, 97.7%) remained in sinus rhythm. The heart rate was between 50 and 59 bpm in 75% of the patients, while 18.4% were in the 40 to 49 bpm range, and 6.6% were <40 bpm. The median time from swab polymerase chain reaction positivity to development of bradycardia was 41 hours, and the median duration of bradycardia was 5 days. Bradycardia resolved in 81 patients (59.6%). There were no statistically significant differences in outcomes according to degree of bradycardia (<50 vs 50-59 bpm, all P ≥ 0.073). No significant differences were noted for the overall cohort when comparing COVID-19 treatments according to resolution of bradycardia; however, when considering only the patients who were not receiving a nodal agent or antiarrhythmic, treatment with lenzilumab was more common in patients with resolution of bradycardia than in patients without resolution of bradycardia (12.2% vs 0.0%, P = 0.030). CONCLUSIONS: Sinus bradycardia occurs frequently in patients with severe COVID-19, but the degree of bradycardia does not correlate with clinical outcomes. Lenzilumab may be associated with the resolution of bradycardia.


Subject(s)
Bradycardia/complications , COVID-19/complications , Adult , Aged , Aged, 80 and over , Antibodies, Monoclonal, Humanized/therapeutic use , Bradycardia/drug therapy , Electrocardiography , Female , Hospitalization , Humans , Male , Middle Aged , Retrospective Studies , Young Adult
19.
J Clin Rheumatol ; 27(5): 187-193, 2021 Aug 01.
Article in English | MEDLINE | ID: mdl-32040055

ABSTRACT

BACKGROUND/OBJECTIVE: The aim of this cross-sectional study was to determine the prevalence of opioid use in a large sample of fibromyalgia (FM) patients and to examine the factors associated with opioid prescription/use despite multiple clinical guidelines that do not recommend opioid use in this population. METHODS: Data were collected from a convenience sample of 698 patients admitted from August 2017 to May 2019 into an intensive 2-day Fibromyalgia Treatment Program at a tertiary medical center in the United States after FM diagnosis. Patients were administered the Fibromyalgia Impact Questionnaire-Revised, the Center for Epidemiologic Study of Depression Scale, and the Pain Catastrophizing Scale upon admission to the program. Demographic information and opioid use were self-reported. Logistic regression analysis was used to determine associations between patient-related variables and opioid use. RESULTS: Of 698 patients, 27.1% (n = 189) were taking opioids at intake. Extended duration of symptoms (>3 years), increased age, higher degree of functional impairment, and increased pain catastrophizing were significantly associated with opioid use. CONCLUSIONS: Opioids are not recommended for the treatment of FM under current guidelines. Greater burden of illness appeared to be associated with the prescription and use of opioids in this population. These findings suggest that some providers may not be aware of current guideline recommendations that have been found to be effective in the management of FM. Alternative approaches to the management of FM that do not involve opioids are reviewed in an effort to improve care.


Subject(s)
Analgesics, Opioid , Fibromyalgia , Cross-Sectional Studies , Fibromyalgia/diagnosis , Fibromyalgia/drug therapy , Fibromyalgia/epidemiology , Humans , Prospective Studies , Surveys and Questionnaires , United States/epidemiology
20.
Psychosomatics ; 61(2): 145-153, 2020.
Article in English | MEDLINE | ID: mdl-31864662

ABSTRACT

BACKGROUND: Psychiatric disorders are common in cancer patients and impact outcomes. Their impact on cancer care costs needs study to develop a business case for psychosocial interventions. OBJECTIVE: To evaluate the impact of preexisting psychiatric comorbidities on total cost of care during the 6 months after cancer diagnosis. METHODS: This retrospective cohort study examined patients diagnosed with cancer between January 1, 2009, and December 31, 2014, at one National Cancer Institute-designated cancer center. Patients who received all cancer treatment at the study site (6598 of 11,035 patients) were included. Patients were divided into 2 groups, with or without psychiatric comorbidity, based on International Classification of Diseases, Ninth Revision (ICD-9) diagnosis codes. Total costs of care during the first 6 months of treatment were based on standardized costs adjusted to 2014 dollars, determined by assigning Medicare reimbursement rates to professional billed services and applying appropriate cost-to-charge ratios. Quantile regression models with covariate adjustments were developed to assess the effect of psychiatric comorbidity across the distribution of costs. RESULTS: Six hundred ninety-eight (10.6%) of 6598 eligible patients had at least one psychiatric comorbidity. These patients had more nonpsychiatric Elixhauser comorbidities (mean 4 vs. 3). Unadjusted total cancer care costs were higher for patients with psychiatric comorbidity (mean [standard deviation]: $51,798 [$74,549] vs. $32,186 [$45,240]; median [quartiles]: $23,871 [$10,705-$57,338] vs. $19,073 [$8120-$38,230]). Quantile regression models demonstrated that psychiatric comorbidity had significant incremental effects at higher levels of cost: 75th percentile $8629 (95% confidence interval: $3617-$13,642) and 90th percentile $42,586 (95% confidence interval: $25,843-$59,330).
CONCLUSIONS: Psychiatric comorbidities are associated with increased total cancer costs, especially in patients with very high cancer care costs, representing an opportunity to develop mitigation strategies.
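Quantile regression, as used in this study, estimates effects at a chosen percentile of the outcome (here, the 75th and 90th percentiles of cost) rather than at the mean, by minimizing the asymmetric "pinball" (check) loss. The sketch below is a hedged illustration of that loss for a single variable, not the authors' covariate-adjusted models, and the data are invented:

```python
def pinball_loss(tau, y, q):
    """Mean pinball (check) loss of candidate value q at quantile level tau.

    Observations above q are penalized with weight tau, those below with
    weight (1 - tau); minimizing this loss recovers the tau-quantile.
    """
    return sum(tau * (v - q) if v >= q else (1 - tau) * (q - v) for v in y) / len(y)


def fit_quantile(tau, y):
    """Brute-force minimizer of the pinball loss.

    For a constant-only model, the minimum is always attained at one of the
    observed values, so scanning the data points suffices.
    """
    return min(y, key=lambda q: pinball_loss(tau, y, q))


# Hypothetical skewed "cost" sample: tau = 0.75 targets the upper tail,
# which is where the study found psychiatric comorbidity mattered most.
costs = list(range(1, 101))
q75 = fit_quantile(0.75, costs)
```

A full quantile regression replaces the constant q with a linear predictor of covariates and minimizes the same loss; weighting over- and under-predictions asymmetrically is what shifts the fit from the center of the cost distribution toward its tail.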


Subject(s)
Health Care Costs/statistics & numerical data , Mental Disorders/economics , Neoplasms/economics , Psychosocial Intervention/economics , Cancer Care Facilities/economics , Cohort Studies , Comorbidity , Humans , Mental Disorders/complications , Mental Disorders/therapy , Neoplasms/complications , Neoplasms/therapy , Retrospective Studies