1.
Circulation ; 147(10): 782-794, 2023 03 07.
Article in English | MEDLINE | ID: mdl-36762560

ABSTRACT

BACKGROUND: The benefit-risk profile of direct oral anticoagulants (DOACs) compared with warfarin, and between DOACs, in patients with atrial fibrillation (AF) and chronic liver disease is unclear. METHODS: We conducted a new-user, retrospective cohort study of patients with AF and chronic liver disease who were enrolled in a large, US-based administrative database between January 1, 2011, and December 31, 2017. We assessed the effectiveness and safety of DOACs (as a class and individually) compared with warfarin, and between DOACs, in patients with AF and chronic liver disease. The primary outcomes were hospitalization for ischemic stroke/systemic embolism and hospitalization for major bleeding. Inverse probability treatment weights were used to balance the treatment groups on measured confounders. RESULTS: Overall, 10 209 participants were included, with 4421 (43.2%) on warfarin, 2721 (26.7%) on apixaban, 2211 (21.7%) on rivaroxaban, and 851 (8.3%) on dabigatran. The incidence rates per 100 person-years for ischemic stroke/systemic embolism were 2.2, 1.4, 2.6, and 4.4 for DOACs as a class, apixaban, rivaroxaban, and warfarin, respectively. The incidence rates per 100 person-years for major bleeding were 7.9, 6.5, 9.1, and 15.0 for DOACs as a class, apixaban, rivaroxaban, and warfarin, respectively. After inverse probability treatment weighting, the risk of hospitalization for ischemic stroke/systemic embolism was significantly lower with DOACs as a class (hazard ratio [HR], 0.64 [95% CI, 0.46-0.90]) or apixaban (HR, 0.40 [95% CI, 0.19-0.82]) than with warfarin, but not significantly different for rivaroxaban versus warfarin (HR, 0.76 [95% CI, 0.47-1.21]) or rivaroxaban versus apixaban (HR, 1.73 [95% CI, 0.91-3.29]). Compared with warfarin, the risk of hospitalization for major bleeding was lower with DOACs as a class (HR, 0.69 [95% CI, 0.58-0.82]), apixaban (HR, 0.60 [95% CI, 0.46-0.78]), and rivaroxaban (HR, 0.79 [95% CI, 0.62-1.0]). 
However, the risk of hospitalization for major bleeding was higher for rivaroxaban versus apixaban (HR, 1.59 [95% CI, 1.18-2.14]). CONCLUSIONS: Among patients with AF and chronic liver disease, DOACs as a class were associated with lower risks of hospitalization for ischemic stroke/systemic embolism and major bleeding versus warfarin. However, the incidence of clinical outcomes among patients with AF and chronic liver disease varied between individual DOACs and warfarin, and in head-to-head DOAC comparisons.


Subject(s)
Atrial Fibrillation , Embolism , Ischemic Stroke , Liver Diseases , Stroke , Humans , Warfarin/adverse effects , Atrial Fibrillation/diagnosis , Atrial Fibrillation/drug therapy , Atrial Fibrillation/epidemiology , Rivaroxaban/adverse effects , Anticoagulants/adverse effects , Cohort Studies , Retrospective Studies , Stroke/epidemiology , Stroke/prevention & control , Stroke/drug therapy , Hemorrhage/chemically induced , Hemorrhage/epidemiology , Hemorrhage/drug therapy , Dabigatran/adverse effects , Liver Diseases/diagnosis , Liver Diseases/epidemiology , Embolism/epidemiology , Embolism/prevention & control , Embolism/complications , Administration, Oral
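The inverse probability treatment weighting step described above can be sketched as follows. This is a minimal illustration on simulated data, not the study's cohort: propensity scores come from a logistic model of treatment on measured confounders, and ATE-style weights are 1/ps for treated and 1/(1 - ps) for untreated subjects.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 2000
x = rng.normal(size=(n, 3))                                   # measured confounders
p_treat = 1 / (1 + np.exp(-(x @ np.array([0.8, -0.5, 0.3]))))
treated = rng.binomial(1, p_treat)                            # hypothetical treatment flag

# Propensity score: P(treated = 1 | confounders), then ATE-style weights.
ps = LogisticRegression().fit(x, treated).predict_proba(x)[:, 1]
weights = np.where(treated == 1, 1 / ps, 1 / (1 - ps))

# After weighting, confounder means should be close across the two arms.
diff_raw = x[treated == 1].mean(axis=0) - x[treated == 0].mean(axis=0)
mean_t = np.average(x[treated == 1], axis=0, weights=weights[treated == 1])
mean_c = np.average(x[treated == 0], axis=0, weights=weights[treated == 0])
diff_weighted = mean_t - mean_c
print(np.round(diff_raw, 2), np.round(diff_weighted, 2))
```

In practice, balance after weighting is checked with standardized mean differences, as the comparison of `diff_raw` and `diff_weighted` suggests here.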
2.
Am J Epidemiol ; 193(2): 308-322, 2024 Feb 05.
Article in English | MEDLINE | ID: mdl-37671942

ABSTRACT

This study explores natural direct and joint natural indirect effects (JNIE) of prenatal opioid exposure on neurodevelopmental disorders (NDDs) in children mediated through pregnancy complications, major and minor congenital malformations, and adverse neonatal outcomes, using Medicaid claims linked to vital statistics in Rhode Island, United States, 2008-2018. A Bayesian mediation analysis with elastic net shrinkage prior was developed to estimate mean time to NDD diagnosis ratio using posterior mean and 95% credible intervals (CrIs) from Markov chain Monte Carlo algorithms. Simulation studies showed desirable model performance. Of 11,176 eligible pregnancies, 332 had ≥2 dispensations of prescription opioids anytime during pregnancy, including 200 (1.8%) having ≥1 dispensation in the first trimester (T1), 169 (1.5%) in the second (T2), and 153 (1.4%) in the third (T3). A significant JNIE of opioid exposure was observed in each trimester (T1, JNIE = 0.97, 95% CrI: 0.95, 0.99; T2, JNIE = 0.97, 95% CrI: 0.95, 0.99; T3, JNIE = 0.96, 95% CrI: 0.94, 0.99). The proportion of JNIE in each trimester was 17.9% (T1), 22.4% (T2), and 56.3% (T3). In conclusion, adverse pregnancy and birth outcomes jointly mediated the association between prenatal opioid exposure and accelerated time to NDD diagnosis. The proportion of JNIE increased as the timing of opioid exposure approached delivery.


Subject(s)
Neurodevelopmental Disorders , Prenatal Exposure Delayed Effects , Pregnancy , Female , Infant, Newborn , Child , Humans , United States/epidemiology , Analgesics, Opioid/adverse effects , Mediation Analysis , Prenatal Exposure Delayed Effects/chemically induced , Prenatal Exposure Delayed Effects/epidemiology , Bayes Theorem , Neurodevelopmental Disorders/chemically induced , Neurodevelopmental Disorders/epidemiology , Neurodevelopmental Disorders/drug therapy
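For ratio-scale effects like the time-to-diagnosis ratios above, a proportion mediated is conventionally computed on the log scale, since the total effect factors as NDE × JNIE. A sketch with made-up effect sizes (not the study's estimates, and the paper's exact definition may differ):

```python
import math

# Illustrative ratio-scale effects (hypothetical values):
nde = 0.92     # natural direct effect
jnie = 0.97    # joint natural indirect effect
te = nde * jnie                                   # total effect factors multiplicatively

# Proportion mediated on the log scale.
prop_mediated = math.log(jnie) / math.log(te)
print(round(prop_mediated, 3))
```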
3.
Am J Epidemiol ; 191(2): 331-340, 2022 01 24.
Article in English | MEDLINE | ID: mdl-34613378

ABSTRACT

To examine methodologies that address imbalanced treatment switching and censoring, 6 different analytical approaches were evaluated under a comparative effectiveness framework: intention-to-treat, as-treated, intention-to-treat with censor-weighting, as-treated with censor-weighting, time-varying exposure, and time-varying exposure with censor-weighting. Marginal structural models were employed to address time-varying exposure, confounding, and possibly informative censoring in an administrative data set of adult patients who were hospitalized with acute coronary syndrome and treated with either clopidogrel or ticagrelor. The effectiveness endpoint included first occurrence of death, myocardial infarction, or stroke. These methodologies were then applied across simulated data sets with varying frequencies of treatment switching and censoring to compare the effect estimate of each analysis. The findings suggest that implementing different analytical approaches has an impact on the point estimate and interpretation of analyses, especially when censoring is highly unbalanced.


Subject(s)
Acute Coronary Syndrome/drug therapy , Hospitalization/statistics & numerical data , Platelet Aggregation Inhibitors/therapeutic use , Selection Bias , Treatment Switching , Acute Coronary Syndrome/complications , Acute Coronary Syndrome/mortality , Adult , Aged , Clopidogrel/therapeutic use , Comparative Effectiveness Research , Computer Simulation , Female , Humans , Intention to Treat Analysis , Latent Class Analysis , Male , Middle Aged , Myocardial Infarction/etiology , Myocardial Infarction/mortality , Stroke/etiology , Stroke/mortality , Ticagrelor/therapeutic use , Treatment Outcome
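The censor-weighting in the approaches above rests on inverse probability of censoring weights. A minimal single-time-point sketch on simulated data (the paper's marginal structural models use time-varying versions of these weights; all numbers here are illustrative):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 5000
x = rng.normal(size=(n, 2))                                   # baseline covariates
p_cens = 1 / (1 + np.exp(-(-1.5 + x @ np.array([1.0, -0.7]))))
censored = rng.binomial(1, p_cens)                            # informative censoring

# Denominator: subject-specific probability of remaining uncensored given covariates.
p_unc_given_x = LogisticRegression().fit(x, censored).predict_proba(x)[:, 0]
# Numerator (stabilizer): marginal probability of remaining uncensored.
p_unc = 1 - censored.mean()

sw = p_unc / p_unc_given_x                                    # stabilized censoring weights
mean_sw_uncensored = float(sw[censored == 0].mean())
print(round(mean_sw_uncensored, 2))                           # should be close to 1
```

Stabilized weights averaging near 1 among the uncensored is a standard diagnostic for this kind of weighting.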
4.
Pharmacoepidemiol Drug Saf ; 31(9): 932-943, 2022 09.
Article in English | MEDLINE | ID: mdl-35729705

ABSTRACT

PURPOSE: Supplementing investigator-specified variables with large numbers of empirically identified features that collectively serve as 'proxies' for unspecified or unmeasured factors can often improve confounding control in studies utilizing administrative healthcare databases. Consequently, there has been a recent focus on the development of data-driven methods for high-dimensional proxy confounder adjustment in pharmacoepidemiologic research. In this paper, we survey current approaches and recent advancements for high-dimensional proxy confounder adjustment in healthcare database studies. METHODS: We discuss considerations underpinning three areas for high-dimensional proxy confounder adjustment: (1) feature generation-transforming raw data into covariates (or features) to be used for proxy adjustment; (2) covariate prioritization, selection, and adjustment; and (3) diagnostic assessment. We discuss challenges and avenues of future development within each area. RESULTS: There is a large literature on methods for high-dimensional confounder prioritization/selection, but relatively little has been written on best practices for feature generation and diagnostic assessment. Consequently, these areas have particular limitations and challenges. CONCLUSIONS: There is a growing body of evidence showing that machine-learning algorithms for high-dimensional proxy-confounder adjustment can supplement investigator-specified variables to improve confounding control compared to adjustment based on investigator-specified variables alone. However, more research is needed on best practices for feature generation and diagnostic assessment when applying methods for high-dimensional proxy confounder adjustment in pharmacoepidemiologic studies.


Subject(s)
Machine Learning , Pharmacoepidemiology , Confounding Factors, Epidemiologic , Databases, Factual , Delivery of Health Care , Humans
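One widely used covariate-prioritization strategy in this literature ranks candidate proxy features by the Bross bias-multiplier formula, as in the high-dimensional propensity score algorithm. A simplified sketch on simulated binary claims codes (names and data are illustrative, not from any real database):

```python
import numpy as np

rng = np.random.default_rng(2)
n, k = 4000, 50
codes = rng.binomial(1, 0.2, size=(n, k))        # empirically generated proxy features
exposure = rng.binomial(1, 0.5, size=n)
outcome = rng.binomial(1, 0.1, size=n)

def bross_bias(c, e, y):
    """Absolute log Bross bias multiplier for one binary covariate."""
    p1 = c[e == 1].mean()                        # covariate prevalence among exposed
    p0 = c[e == 0].mean()                        # covariate prevalence among unexposed
    pry1 = max(y[c == 1].mean(), 1e-6)           # outcome risk with covariate
    pry0 = max(y[c == 0].mean(), 1e-6)           # outcome risk without covariate
    rr = pry1 / pry0
    return abs(np.log((p1 * (rr - 1) + 1) / (p0 * (rr - 1) + 1)))

scores = np.array([bross_bias(codes[:, j], exposure, outcome) for j in range(k)])
top = np.argsort(scores)[::-1][:10]              # 10 highest-priority proxy covariates
print(sorted(top.tolist()))
```

The selected features would then enter the propensity model alongside investigator-specified variables.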
5.
Pharm Stat ; 21(6): 1199-1218, 2022 11.
Article in English | MEDLINE | ID: mdl-35535938

ABSTRACT

Health administrative data are often of limited use in epidemiological studies of drug safety in pregnancy because they lack information on gestational age at birth (GAB). Although several studies have proposed algorithms to estimate GAB from claims databases, failing to incorporate the unique distributional shape of GAB can introduce bias in estimates and subsequent modeling. Hence, we developed a Bayesian latent class model to predict GAB. The model employs a mixture of Gaussian distributions with linear covariates within each class. This approach models heterogeneity in the population by identifying latent subgroups and estimating class-specific regression coefficients. We fit this model in a Bayesian framework, conducting posterior computation with Markov chain Monte Carlo methods. The method is illustrated with a dataset of 10,043 Rhode Island Medicaid mother-child pairs. We found that the three-class and six-class mixture specifications maximized prediction accuracy. Based on our results, Medicaid women were partitioned into three classes, characterized by extreme preterm or preterm birth, preterm or "early" term birth, and "late" term birth. Obstetrical complications appeared to exert a significant influence on class membership. Altogether, compared with traditional linear models, our approach shows an advantage in predictive accuracy because of its superior flexibility in modeling a skewed response and population heterogeneity.


Subject(s)
Models, Statistical , Premature Birth , Humans , Infant, Newborn , Pregnancy , Female , Gestational Age , Latent Class Analysis , Bayes Theorem , Premature Birth/epidemiology
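A minimal sketch of the modeling idea above: fit Gaussian mixtures of increasing size to a skewed, simulated gestational-age distribution and select the number of latent classes by BIC. (The paper fits a richer Bayesian model with class-specific covariate effects; this is only the mixture-selection skeleton on made-up data.)

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(3)
# Simulated gestational ages (weeks): a small preterm class plus a dominant term class.
gab = np.concatenate([rng.normal(33, 2, 300), rng.normal(39, 1, 2700)]).reshape(-1, 1)

fits = {k: GaussianMixture(n_components=k, random_state=0).fit(gab) for k in range(1, 7)}
best_k = min(fits, key=lambda k: fits[k].bic(gab))
print(best_k)                                    # BIC should favor more than one class
```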
6.
Semin Dial ; 34(2): 163-169, 2021 03.
Article in English | MEDLINE | ID: mdl-33280176

ABSTRACT

Circulating endothelial cells (CEC) are thought to be markers of endothelial injury. We hypothesized that the number of CEC may provide a novel means for predicting long-term survival and cardiovascular events in hemodialysis patients. Fifty-four hemodialysis patients underwent enumeration of their CEC number. We retrospectively analyzed their survival and incidence of adverse cardiovascular events. Twenty-two deaths (41%) were noted over the median follow-up period of 3.56 years (IQR 1.43-12); 6 (11%) were attributed to cardiovascular causes, of which 1 (4%) was in the low CEC (CEC <20 cells/ml) and 5 (19%) in the high CEC (CEC ≥20 cells/ml) group. High CEC was associated with worse cardiovascular survival (p = 0.05) and adverse cardiac events (p = 0.01). In multivariate analysis, CEC >20 cells/ml was associated with a 4-fold increased risk of adverse cardiac events (OR, 4.16 [95% CI, 1.38-12.54], p = 0.01), while all-cause mortality and cardiovascular mortality were not statistically different. In this hemodialysis population, a single measurement of CEC was a strong predictor of long-term future adverse cardiovascular events. We propose that CEC may be a novel biomarker for assessing cardiovascular risk in dialysis patients.


Subject(s)
Cardiovascular System , Endothelial Cells , Biomarkers , Humans , Renal Dialysis/adverse effects , Retrospective Studies
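Odds ratios with confidence intervals like the one above are commonly computed from a 2x2 table with a Woolf (log) interval; the study's multivariate OR comes from a regression model, but the crude version illustrates the quantity. A generic sketch with made-up counts, not the study's data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """OR and Woolf 95% CI for a 2x2 table [[a, b], [c, d]]."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of log OR
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Illustrative counts: events/non-events in high-CEC vs low-CEC groups.
o, lo, hi = odds_ratio_ci(10, 16, 3, 25)
print(round(o, 2), round(lo, 2), round(hi, 2))
```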
7.
Chem Res Toxicol ; 33(7): 1780-1790, 2020 07 20.
Article in English | MEDLINE | ID: mdl-32338883

ABSTRACT

Drug-induced organ injury is a major reason for drug candidate attrition in preclinical and clinical drug development. The liver, kidneys, and heart have been recognized as the most common organ systems affected in safety-related attrition or the subject of black box warnings and postmarket drug withdrawals. In silico physicochemical property calculations and in vitro assays have been utilized separately in the early stages of the drug discovery and development process to predict drug safety. In this study, we combined physicochemical properties and in vitro cytotoxicity assays, including mitochondrial dysfunction, to build organ-specific univariate and multivariable logistic regression models that yield odds ratios for the prediction of clinical hepatotoxicity, nephrotoxicity, and cardiotoxicity using 215 marketed drugs. The multivariable hepatotoxicity predictive model showed an odds ratio of 6.2 (95% confidence interval (CI) 1.7-22.8) or 7.5 (95% CI 3.2-17.8) for mitochondrial inhibition or drug plasma Cmax >1 µM, respectively, for drugs associated with liver injury. The multivariable nephrotoxicity predictive model showed an odds ratio of 5.8 (95% CI 2.0-16.9), 6.4 (95% CI 1.1-39.3), or 15.9 (95% CI 2.8-89.0) for drug plasma Cmax >1 µM, mitochondrial inhibition, or >7 hydrogen-bond-acceptor atoms, respectively, for drugs associated with kidney injury. Conversely, drugs with a total polar surface area ≥75 Å² were 79% (odds ratio 0.21, 95% CI 0.061-0.74) less likely to be associated with kidney injury. Drugs belonging to the extended clearance classification system (ECCS) class 4, where renal secretion is the primary clearance mechanism (low-permeability drugs that are bases/neutrals), were 4 times (95% CI 1.8-9.5) more likely to be associated with kidney injury in this data set. 
Alternatively, ECCS class 2 drugs, where hepatic metabolism is the primary clearance mechanism (high-permeability drugs that are bases/neutrals), were 77% less likely (odds ratio 0.23, 95% CI 0.095-0.54) to be associated with kidney injury. A cardiotoxicity model was poorly defined using any of these drug physicochemical attributes. Combining in silico physicochemical property descriptors with in vitro toxicity assays can be used to build predictive toxicity models to select small-molecule therapeutics with less potential to cause liver and kidney organ toxicity.


Subject(s)
Biological Assay , Chemical and Drug Induced Liver Injury , Drug Discovery , Kidney Diseases/chemically induced , Models, Biological , Pharmaceutical Preparations/chemistry , Heart/drug effects , Humans , Kidney/drug effects , Liver/drug effects , Logistic Models , Mitochondria/drug effects
8.
Transpl Int ; 33(8): 865-877, 2020 08.
Article in English | MEDLINE | ID: mdl-31989680

ABSTRACT

The outcomes of lymphocyte-depleting antibody induction therapy (LDAIT) [thymoglobulin (ATG) or alemtuzumab (ALM)] versus interleukin-2 receptor antagonist (IL-2RA) induction in non-broadly sensitized [pretransplant calculated panel reactive antibody (cPRA) <80%] adult deceased donor kidney transplant recipients (adult-DDKTRs) are understudied. In this registry study of 55,593 adult-DDKTRs, outcomes of LDAIT [ATG (N = 32,985) and ALM (N = 9,429)] and IL-2RA (N = 13,179) were analyzed in the <10% and 10-79% cPRA groups. The adjusted odds ratio (aOR) of one-year biopsy-proven acute rejection (BPAR) was lower, while the aOR of the 1-year composite of re-hospitalization, graft loss, or death was higher, with LDAIT than with IL-2RA in both cPRA groups. The aOR of delayed graft function was higher with LDAIT than with IL-2RA in the <10% cPRA group. The adjusted hazard ratio (aHR) of 5-year death-censored graft loss (DCGL) in both cPRA groups seemed higher with ALM than with the other inductions [<10% cPRA: ALM versus IL-2RA, aHR = 1.11, 95% CI = 1.00-1.23 and ATG versus ALM, aHR = 0.84, 95% CI = 0.77-0.91; 10-79% cPRA: ALM versus IL-2RA, aHR = 1.29, 95% CI = 1.02-1.64 and ATG versus ALM, aHR = 0.83, 95% CI = 0.70-0.98]. The five-year aHR of death did not differ among induction therapies in either cPRA group. In non-broadly sensitized adult-DDKTRs, LDAIT is more protective against 1-year BPAR (though not 5-year mortality) than IL-2RA; the trend toward a higher 5-year DCGL risk with ALM than with ATG or IL-2RA needs further investigation.


Subject(s)
Kidney Transplantation , Adult , Antilymphocyte Serum/therapeutic use , Graft Rejection , Graft Survival , Humans , Immunosuppressive Agents/therapeutic use , Registries , Retrospective Studies
9.
Pharmacoepidemiol Drug Saf ; 29(4): 493-503, 2020 04.
Article in English | MEDLINE | ID: mdl-32102109

ABSTRACT

BACKGROUND: Most women are prescribed an opioid after hysterectomy. The goal of this study was to determine the association between initial opioid prescribing characteristics and chronic opioid use after hysterectomy. METHODS: This study included women enrolled in a commercial health plan who had a hysterectomy between 1 July 2010 and 31 March 2015. We used trajectory models to define chronic opioid users as those patients with the highest probability of having an opioid prescription filled during the 6 months post-surgery. A multivariable logistic regression was applied to examine the association between initial opioid dispensing (amount prescribed and duration of treatment) and chronic opioid use after adjusting for potential confounders. RESULTS: A total of 693 of 50 127 (1.38%) opioid-naïve women met the criteria for chronic opioid use following hysterectomy. The baseline variables and initial opioid prescription characteristics predicted the pattern of long-term opioid use with moderate discrimination (c statistic = 0.70). Significant predictors of chronic opioid use included initial opioid daily dose (≥60 MME vs <40 MME, aOR: 1.43, 95% CI: 1.14-1.79) and days' supply (4-7 days vs 1-3 days, aOR: 1.28, 95% CI: 1.06-1.54; ≥8 days vs 1-3 days, aOR: 1.41, 95% CI: 1.05-1.89). Other significant baseline predictors included older age, abdominal or laparoscopic/robotic hysterectomy, tobacco use, psychiatric medication use, back pain, and headache. CONCLUSION: Initial opioid prescribing characteristics are associated with the risk of chronic opioid use after hysterectomy. Prescribing lower daily doses and shorter days' supplies of opioids to women after hysterectomy may result in a lower risk of chronic opioid use.


Subject(s)
Analgesics, Opioid/administration & dosage , Drug Prescriptions , Hysterectomy/trends , Pain, Postoperative/epidemiology , Pain, Postoperative/prevention & control , Adult , Aged , Analgesics, Opioid/adverse effects , Cohort Studies , Drug Administration Schedule , Female , Humans , Hysterectomy/adverse effects , Middle Aged , Opioid-Related Disorders/epidemiology , Opioid-Related Disorders/prevention & control , Predictive Value of Tests , Young Adult
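The reported discrimination (c statistic = 0.70) is the area under the ROC curve of the fitted model's risk scores. A quick sketch of computing it on simulated scores (purely illustrative, not the study's model or data):

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(4)
y = rng.binomial(1, 0.1, size=3000)              # hypothetical chronic-use indicator
score = 0.8 * y + rng.normal(size=3000)          # hypothetical risk score with some signal
c_stat = roc_auc_score(y, score)                 # the c statistic
print(round(c_stat, 2))
```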
10.
Nephrol Dial Transplant ; 34(1): 83-89, 2019 01 01.
Article in English | MEDLINE | ID: mdl-29548021

ABSTRACT

Background: Monitoring of mycophenolic acid (MPA) levels may be useful for effective mycophenolate mofetil (MMF) dosing. However, whether commonly obtained trough levels are an acceptable method of surveillance remains debatable. We hypothesized that trough levels of MPA would be a poor predictor of the area under the curve (AUC) for MPA. Methods: A total of 51 patients with lupus nephritis who were on MMF 1500 mg twice a day and had a 4-hour AUC performed were included in this study. MPA levels were measured immediately before (C0) and at 1 (C1), 2 (C2), and 4 (C4) hours following a 1500 mg dose of MMF. The MPA AUC values were calculated using the linear trapezoidal rule. Regression analysis was used to examine the relationship between the MPA trough and AUC. Differences in the MPA trough and AUC between different clinical and demographic categories were compared using t-tests. Results: When grouped by tertiles, there was significant overlap in MPA AUC0-4 and MPA trough values across tertiles. Although there was a statistically significant correlation between MPA trough levels and AUC, this association was weak and accounted for only 30% of the variability in MPA trough levels. This relationship might be even more unreliable in men than in women. The use of angiotensin-converting enzyme inhibitors or angiotensin receptor blockers was associated with increased MPA trough levels and AUC at 0-4 h (AUC0-4). Conclusion: Trough levels of MPA do not show a strong correlation with AUC. In clinical situations where MPA levels are essential to guide therapy, an AUC0-4 would be a better indicator of the adequacy of treatment.


Subject(s)
Antibiotics, Antineoplastic/blood , Drug Monitoring/statistics & numerical data , Lupus Nephritis/blood , Lupus Nephritis/drug therapy , Mycophenolic Acid/blood , Adolescent , Adult , Antibiotics, Antineoplastic/administration & dosage , Area Under Curve , Disease Management , Drug Monitoring/methods , Female , Humans , Male , Middle Aged , Mycophenolic Acid/administration & dosage , Prognosis , Young Adult
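The linear trapezoidal rule used above to compute AUC0-4 from the four sampling times is straightforward; a sketch with made-up concentrations (not the study's measurements):

```python
import numpy as np

times = np.array([0.0, 1.0, 2.0, 4.0])           # C0, C1, C2, C4 sampling times (h)
conc = np.array([2.1, 10.5, 6.3, 3.2])           # illustrative MPA levels (mg/L)

# Linear trapezoidal rule: sum of trapezoid areas between adjacent time points.
auc_0_4 = float(np.sum((conc[1:] + conc[:-1]) / 2 * np.diff(times)))
print(round(auc_0_4, 2))                         # 24.2 for these values
```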
11.
Clin Transplant ; 33(1): e13440, 2019 01.
Article in English | MEDLINE | ID: mdl-30387534

ABSTRACT

BACKGROUND: With the advent of combined antiretroviral therapy (cART), growing evidence has shown that human immunodeficiency virus (HIV) infection may no longer be an absolute contraindication for solid organ transplantation. This study compares outcomes of heart transplantation between HIV-positive and HIV-negative recipients using SRTR transplant registry data. METHODS: Patient survival, overall graft survival, and death-censored graft survival were compared between HIV-positive and HIV-negative recipients. Multivariate Cox regression and Cox regression with a disease risk score (DRS) methodology were used to estimate the adjusted hazard ratios among heart transplant recipients (HTRs). RESULTS: In total, 35 HTRs with HIV+ status were identified. No significant differences were found in patient survival (88% vs 77%; P = 0.1493), overall graft survival (85% vs 76%; P = 0.2758), or death-censored graft survival (91% vs 91%; P = 0.9871) between HIV-positive and HIV-negative HTRs at 5-year follow-up. No significant differences were found after adjusting for confounders. CONCLUSIONS: This study supports the use of heart transplantation in selected HIV-positive patients and suggests that HIV-positive status is not a contraindication for life-saving heart transplantation, as there were no differences in graft or patient survival.


Subject(s)
HIV Infections/complications , HIV/isolation & purification , Heart Diseases/mortality , Heart Transplantation/mortality , Adult , Female , Follow-Up Studies , Graft Survival , HIV Infections/virology , Heart Diseases/epidemiology , Heart Diseases/surgery , Heart Diseases/virology , Humans , Incidence , Male , Middle Aged , Retrospective Studies , Risk Factors , Survival Rate , Treatment Outcome , United States/epidemiology
12.
Nephrol Dial Transplant ; 33(1): 177-184, 2018 01 01.
Article in English | MEDLINE | ID: mdl-29045704

ABSTRACT

Background: This study aimed to analyze adult kidney transplant recipients (KTRs) for the risk of new-onset diabetes after transplantation (NODAT) associated with viral serologies and immunosuppression regimens [tacrolimus (Tac) + mycophenolate (MPA), cyclosporine (CSA) + MPA, sirolimus (SRL) + MPA, SRL + CSA or SRL + Tac]. Methods: Cox regression models were used to examine the risk of NODAT in the first posttransplant year associated with: (i) CSA + MPA, SRL + MPA, SRL + CSA or SRL + Tac versus the reference, Tac + MPA; (ii) pretransplant viral serology [+ or -; hepatitis B core (HBc), hepatitis C (HCV), cytomegalovirus (CMV) or Epstein-Barr virus (EBV)]; and (iii) interactions between immunosuppression regimens and the viral serologies found significant in the main analysis. Results: Adult KTRs (n = 97 644) from January 1995 through September 2015 were studied. HCV+ [hazard ratio (HR) 1.50, 95% confidence interval (CI) 1.31-1.68] or CMV+ (HR 1.12, 95% CI 1.06-1.19) serology was a risk factor for NODAT, whereas HBc+ (HR 1.04, 95% CI 0.95-1.15) or EBV+ (HR 1.06, 95% CI 0.97-1.15) serology was not. Regardless of associated HCV or CMV serology, the risk of NODAT relative to the reference regimen (Tac + MPA) was lower with CSA + MPA [HCV-: HR 0.74, 95% CI 0.65-0.85; HCV+: HR 0.47, 95% CI 0.28-0.78; CMV-: HR 0.68, 95% CI 0.54-0.86; CMV+: HR 0.73, 95% CI 0.63-0.85] and similar with SRL + CSA or SRL + MPA. In KTRs with HCV- or CMV+ serology, SRL + Tac was associated with a higher risk of NODAT relative to the reference [HCV-: HR 1.43, 95% CI 1.17-1.74; CMV+: HR 1.44, 95% CI 1.14-1.81]. The risk of NODAT-free graft loss was lower with Tac + MPA than with the other regimens. Conclusions: Tailoring the immunosuppression regimen based on HCV or CMV serology may modify the risk of developing NODAT in KTRs.


Subject(s)
Diabetes Mellitus/diagnosis , Diabetes Mellitus/drug therapy , Immunosuppressive Agents/therapeutic use , Kidney Transplantation/adverse effects , Virus Diseases/blood , Viruses/isolation & purification , Adolescent , Adult , Age of Onset , Diabetes Mellitus/blood , Diabetes Mellitus/etiology , Female , Humans , Immunosuppression Therapy , Male , Middle Aged , Virus Diseases/virology , Young Adult
13.
Arterioscler Thromb Vasc Biol ; 36(2): 266-73, 2016 Feb.
Article in English | MEDLINE | ID: mdl-26634654

ABSTRACT

OBJECTIVE: Patients with systemic lupus erythematosus are at risk for premature atherosclerosis, and half of patients with systemic lupus erythematosus have elevated type I interferon (IFN-I) levels. We hypothesized that IFN-I would induce premature atherosclerosis by increasing the number of smooth muscle progenitor cells (SMPC) in the bloodstream and promoting atherosclerotic lesions within the vasculature. APPROACH AND RESULTS: SMPC isolated from wild-type and IFN receptor knockout animals were cultured in medium ± IFN-I. In vivo, we used electroporation to generate stable IFN-I expression for as long as 4 months. The number of SMPC was determined in mice that expressed IFN-I and in control mice, and sections from the bifurcation of the abdominal aorta were analyzed 3 months after electroporation of an IFN-I expression plasmid or a control plasmid. Adding IFN-I to the media increased the number of cultured wild-type SMPC and increased mRNA for SM22, but had no effect on SMPC isolated from IFN receptor knockout mice. Our in vivo results demonstrated a positive relationship between preatherosclerotic-like lesions and endothelial damage. Although there were no significant differences in smooth muscle cell density or thickness of the medial layer between groups, the IFN-I-expressing mice had a significant increase in preatherosclerotic-like lesions and in immature smooth muscle cells, cells that expressed CD34 and smooth muscle α-actin but lacked smooth muscle myosin heavy chain. CONCLUSIONS: IFN-I seems to enhance SMPC number in vitro. In vivo, IFN-I expression may maintain SMPC in an immature state. These immature smooth muscle cells could give rise to macrophages and eventually foam cells.


Subject(s)
Aortic Diseases/metabolism , Atherosclerosis/metabolism , Cell Differentiation , Interferon Type I/metabolism , Muscle, Smooth, Vascular/metabolism , Myocytes, Smooth Muscle/metabolism , Stem Cells/metabolism , Animals , Antigens, CD34/genetics , Antigens, CD34/metabolism , Aorta, Abdominal/metabolism , Aorta, Abdominal/pathology , Aortic Diseases/genetics , Aortic Diseases/pathology , Atherosclerosis/genetics , Atherosclerosis/pathology , Cells, Cultured , Endothelial Cells/metabolism , Endothelial Cells/pathology , Female , Genotype , Interferon Type I/deficiency , Interferon Type I/genetics , Mice, Inbred C57BL , Mice, Knockout , Microfilament Proteins/genetics , Microfilament Proteins/metabolism , Muscle Proteins/genetics , Muscle Proteins/metabolism , Muscle, Smooth, Vascular/pathology , Myocytes, Smooth Muscle/pathology , Myosin Heavy Chains/metabolism , Phenotype , Stem Cells/pathology , Time Factors , Transfection
14.
BMC Pregnancy Childbirth ; 17(1): 10, 2017 01 06.
Article in English | MEDLINE | ID: mdl-28061833

ABSTRACT

BACKGROUND: Applications of latent variable models in medical research are becoming increasingly popular. A latent trait model is developed to combine rare birth defect outcomes in an index of infant morbidity. METHODS: This study employed four statewide, retrospective 10-year data sources (1999 to 2009). The study cohort consisted of all female Florida Medicaid enrollees who delivered a live singleton infant during the study period. Drug exposure was defined as any exposure to antiepileptic drugs (AEDs) during pregnancy. Mothers with no AED exposure served as the AED-unexposed group for comparison. Four adverse outcomes, birth defect (BD), abnormal condition of the newborn (ACNB), low birth weight (LBW), and pregnancy and obstetrical complication (PCOC), were examined and combined using a latent trait model to generate an overall severity index. Unidimensionality, local independence, internal homogeneity, and construct validity were evaluated for the combined outcome. RESULTS: The study cohort consisted of 3183 mother-infant pairs in the total AED group, 226 in the valproate-only subgroup, and 43,956 in the AED-unexposed group. Compared with the AED-unexposed group, the rate of BD was higher in both the total AED group (12.8% vs. 10.5%, P < .0001) and the valproate-only subgroup (19.6% vs. 10.5%, P < .0001). The combined outcome was significantly correlated with the length of hospital stay during delivery in both the total AED group (Rho = 0.24, P < .0001) and the valproate-only subgroup (Rho = 0.16, P = .01). The mean score for the combined outcome in the total AED group was significantly higher than in the AED-unexposed group (2.04 ± 0.02 vs. 1.88 ± 0.01, P < .0001), whereas that in the valproate-only subgroup was not. CONCLUSIONS: Latent trait modeling can be an effective tool for combining adverse pregnancy and perinatal outcomes to assess prenatal exposure to AEDs, but evaluation of the selected components is essential to ensure the validity of the combined outcome.


Subject(s)
Anticonvulsants/adverse effects , Epilepsy/drug therapy , Pregnancy Complications/drug therapy , Prenatal Exposure Delayed Effects/chemically induced , Valproic Acid/adverse effects , Abnormalities, Drug-Induced/epidemiology , Abnormalities, Drug-Induced/etiology , Adult , Female , Florida , Humans , Infant, Low Birth Weight , Patient Outcome Assessment , Pregnancy , Pregnancy Outcome , Prenatal Exposure Delayed Effects/epidemiology , Retrospective Studies
15.
Catheter Cardiovasc Interv ; 88(4): 501-505, 2016 Oct.
Article in English | MEDLINE | ID: mdl-26524970

ABSTRACT

OBJECTIVES: To perform an updated meta-analysis to determine whether complete revascularization of significant coronary lesions at the time of primary percutaneous coronary intervention (PCI) would be associated with better outcomes compared with culprit-only revascularization. BACKGROUND: Individual trials have demonstrated conflicting evidence regarding the optimum revascularization strategy at the time of primary PCI. METHODS: Clinical trials that randomized ST elevation myocardial infarction (STEMI) patients with multi-vessel disease to a complete versus culprit-only revascularization strategy were included. Random effects summary risk ratios (RR) were constructed using a DerSimonian-Laird model. The primary outcome of interest was mortality or myocardial infarction (MI). RESULTS: A total of seven trials with 1,939 patients were included in the analysis. Compared with culprit-only revascularization, complete revascularization was associated with a non-significant reduction in the risk of mortality or MI (RR 0.69, 95% confidence interval (CI) 0.42-1.12, P = 0.14). Complete revascularization was associated with a reduced risk of major adverse cardiac events (MACE) (RR 0.61, 95% CI 0.45-0.81, P < 0.001), due to a significant reduction in urgent revascularization (RR 0.46, 95% CI 0.29-0.70, P < 0.001). The risk of major bleeding and contrast-induced nephropathy was similar with both approaches (RR 0.83, 95% CI 0.41-1.71, P = 0.62, and RR 0.94, 95% CI 0.42-2.12, P = 0.82). CONCLUSIONS: Complete revascularization of all significant coronary lesions at the time of primary PCI was associated with a reduction in the risk of MACE due to reduction in the risk of urgent revascularization. This approach appears to be safe, with no excess major bleeding, or contrast-induced nephropathy. © 2015 Wiley Periodicals, Inc.


Subject(s)
Coronary Artery Disease/therapy , Percutaneous Coronary Intervention/methods , ST Elevation Myocardial Infarction/therapy , Aged , Contrast Media/adverse effects , Coronary Angiography/adverse effects , Coronary Artery Disease/diagnostic imaging , Coronary Artery Disease/mortality , Female , Hemorrhage/etiology , Humans , Kidney Diseases/chemically induced , Male , Middle Aged , Odds Ratio , Percutaneous Coronary Intervention/adverse effects , Percutaneous Coronary Intervention/mortality , Randomized Controlled Trials as Topic , Recurrence , Risk Assessment , Risk Factors , ST Elevation Myocardial Infarction/diagnostic imaging , ST Elevation Myocardial Infarction/mortality , Time Factors , Treatment Outcome
16.
Catheter Cardiovasc Interv ; 88(5): 765-774, 2016 Nov.
Article in English | MEDLINE | ID: mdl-27515910

ABSTRACT

OBJECTIVES: To perform an updated systematic review comparing a routine invasive strategy with a selective invasive strategy for patients with non-ST-elevation acute coronary syndromes (NSTE-ACS) in the era of stents and antiplatelet therapy. BACKGROUND: Recent meta-analyses comparing both strategies have shown conflicting results. METHODS: Electronic databases were searched for randomized trials that compared a routine invasive strategy (i.e., routine coronary angiography +/- revascularization) versus a selective invasive strategy (i.e., medical stabilization and coronary angiography +/- revascularization if objective evidence of ischemia or refractory ischemia) in patients with NSTE-ACS. Summary odds ratios (OR) were primarily constructed using Peto's model. RESULTS: Twelve trials with 9,650 patients were included. Compared with a selective invasive strategy, a routine invasive strategy was associated with a reduction in the composite of all-cause mortality or myocardial infarction (MI) [OR: 0.86, 95% confidence interval (CI) 0.77-0.96] at a mean follow-up of 39 months, primarily due to a reduction in the risk of MI (OR: 0.78, 95% CI: 0.68-0.88). The risk of all-cause mortality was non-significantly reduced with a routine invasive strategy (OR: 0.88, 95% CI: 0.77-1.01). The risk of recurrent angina was reduced with a routine invasive strategy (OR: 0.55, 95% CI: 0.49-0.62), as well as the risk of future revascularization procedures (OR: 0.35, 95% CI: 0.30-0.39). CONCLUSION: In patients with NSTE-ACS, a routine invasive strategy reduced the risk of ischemic events, including the risk of mortality or MI. Routine invasive therapy reduced the risk of recurrent angina and future revascularization procedures. © 2016 Wiley Periodicals, Inc.
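The Peto one-step method named in METHODS pools (observed minus expected) event counts against a hypergeometric variance rather than averaging per-trial odds ratios. A minimal sketch, using hypothetical per-trial 2x2 counts rather than data from the twelve included trials:

```python
import math

def peto_or(trials):
    """Peto one-step pooled odds ratio with a 95% CI.
    trials: list of (events_routine, n_routine, events_selective, n_selective)."""
    num, den = 0.0, 0.0
    for e1, n1, e0, n0 in trials:
        N = n1 + n0
        et = e1 + e0
        expected = n1 * et / N                              # expected events under the null
        var = et * (N - et) * n1 * n0 / (N**2 * (N - 1))    # hypergeometric variance
        num += e1 - expected
        den += var
    log_or = num / den
    se = math.sqrt(1 / den)
    return math.exp(log_or), (math.exp(log_or - 1.96 * se), math.exp(log_or + 1.96 * se))

# Hypothetical data: the routine-invasive arm has fewer events in both trials
trials = [(50, 500, 70, 500), (30, 400, 45, 410)]
```

Because the routine arm has fewer events in each hypothetical trial, the pooled OR comes out below 1, in the same direction as the reported MI result.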


Subject(s)
Acute Coronary Syndrome/surgery , Non-ST Elevated Myocardial Infarction/etiology , Randomized Controlled Trials as Topic , Acute Coronary Syndrome/complications , Acute Coronary Syndrome/diagnosis , Coronary Angiography , Humans , Myocardial Revascularization , Non-ST Elevated Myocardial Infarction/diagnosis , Non-ST Elevated Myocardial Infarction/surgery
17.
Am Heart J ; 169(3): 412-8, 2015 Mar.
Article in English | MEDLINE | ID: mdl-25728732

ABSTRACT

BACKGROUND: Chronic kidney disease (CKD) is associated with accelerated atherosclerosis and adverse cardiovascular outcomes, but the mechanisms are unclear. We hypothesized that mild CKD independently predicts adverse outcomes in women with symptoms and signs of ischemia. METHODS: We categorized 876 women from the Women's Ischemia Syndrome Evaluation cohort according to estimated glomerular filtration rate (eGFR) (eGFR ≥90 mL/min per 1.73 m(2) [normal], 60-89 mL/min per 1.73 m(2) [mild CKD], ≤59 mL/min per 1.73 m(2) [severe CKD]). Time to death from all causes and from cardiovascular causes, as well as major adverse outcomes, was assessed by multivariate regression adjusted for baseline covariates. RESULTS: Obstructive coronary artery disease (CAD) was present in only a minority of patients (39%). Even after adjusting for CAD severity, renal function remained a strong independent predictor of all-cause and cardiac mortality (P < .001). Every 10-unit decrease in eGFR was associated with a 14% increased risk of all-cause mortality (adjusted hazard ratio [AHR] 1.14 [1.08-1.20], P < .0001), a 16% increased risk of cardiovascular mortality (AHR 1.16 [1.09-1.23], P < .0001), and a 9% increased risk of adverse cardiovascular events (AHR 1.09 [1.03-1.15], P = .002). CONCLUSIONS: Even mild CKD is a strong independent predictor of all-cause and cardiac mortality in women with symptoms/signs of ischemia, regardless of underlying obstructive CAD severity, underscoring the need to better understand the interactions between ischemic heart disease and CKD.
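The eGFR strata and the per-10-unit hazard ratios in this abstract can be made concrete with a short sketch. The category cut-points follow the abstract; the extrapolation helper is an assumption of ours, treating the reported AHR as log-linear in eGFR under proportional hazards (so a 30-unit decrease corresponds to 1.14 cubed):

```python
def egfr_category(egfr):
    """eGFR strata used in the abstract (mL/min per 1.73 m^2)."""
    if egfr >= 90:
        return "normal"
    if egfr >= 60:
        return "mild CKD"
    return "severe CKD"

def hr_for_decrease(hr_per_10, decrease):
    """Extrapolate a hazard ratio reported per 10-unit eGFR decrease,
    assuming a log-linear effect (an assumption, not stated in the abstract)."""
    return hr_per_10 ** (decrease / 10)
```

For example, `hr_for_decrease(1.14, 30)` gives roughly 1.48 for a drop from an eGFR of 90 to 60.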


Subject(s)
Chest Pain/mortality , Chest Pain/physiopathology , Kidney/physiopathology , Myocardial Ischemia/mortality , Myocardial Ischemia/physiopathology , Renal Insufficiency/physiopathology , Aged , Coronary Artery Disease/mortality , Coronary Artery Disease/physiopathology , Female , Glomerular Filtration Rate , Humans , Kaplan-Meier Estimate , Middle Aged , Risk Assessment , Severity of Illness Index , Women's Health
18.
Am J Physiol Regul Integr Comp Physiol ; 308(11): R945-56, 2015 Jun 01.
Article in English | MEDLINE | ID: mdl-25810384

ABSTRACT

Oxidative stress and inflammation are risk factors for hypertension in pregnancy. Here, we examined the 24-h mean arterial pressure (MAP) via telemetry and the nitric oxide (NO) and redox systems in the kidney cortex, medulla, and aorta of virgin and pregnant rats treated with a high-fat/prooxidant Western diet (HFD), ANG II, and TNF-α. Female Sprague-Dawley rats were given a normal diet (ND) or a HFD for 8 wk before mating. On day 6 of pregnancy, pregnant rats and age-matched virgins were implanted with minipumps infusing saline or ANG II (150 ng·kg(-1)·min(-1)) + TNF-α (75 ng/day) for 14 days. Groups consisted of Virgin + ND + Saline (V+ND) (n = 7), Virgin + HFD + ANG II and TNF-α (V+HFD) (n = 7), Pregnant + ND + Saline (P+ND) (n = 6), and Pregnant + HFD + ANG II and TNF-α (P+HFD) (n = 8). After day 6 of minipump implantation, V+HFD rats displayed an increase in MAP on days 7, 8, and 10-15 vs. V+ND rats. After day 6 of minipump implantation, P+HFD rats showed an increase in MAP only on day 7 vs. P+ND rats. P+HFD rats had a normal fall in 24-h MAP, hematocrit, plasma protein concentration, and osmolality in late pregnancy. There was no change in kidney cortex, medulla, or aortic oxidative stress in P+HFD rats. P+HFD rats displayed a decrease in nNOSß abundance, but no change in kidney cortex NOx content, vs. P+ND rats. Pregnant rats subjected to a chronic HFD and prooxidant and proinflammatory insults have a blunted increase in 24-h MAP and renal oxidative stress. Our data suggest that renal NO bioavailability is not altered in pregnant rats treated with a HFD, ANG II, and TNF-α.


Subject(s)
Angiotensin II , Arterial Pressure , Diet, High-Fat , Diet, Western , Hypertension/prevention & control , Kidney Cortex/metabolism , Oxidative Stress , Tumor Necrosis Factor-alpha , Animals , Antioxidants/metabolism , Aorta/metabolism , Aorta/physiopathology , Birth Weight , Disease Models, Animal , Female , Hypertension/etiology , Hypertension/metabolism , Hypertension/physiopathology , Litter Size , Nitric Oxide/metabolism , Pregnancy , Rats, Sprague-Dawley , Telemetry , Time Factors
19.
Transpl Int ; 28(4): 401-9, 2015 Apr.
Article in English | MEDLINE | ID: mdl-25440520

ABSTRACT

The OPTN/UNOS Kidney Paired Donation (KPD) Pilot Program allocates priority to zero-HLA mismatches. However, in unrelated living donor kidney transplants (LDKT), the same donor source as in KPD, no study has shown whether zero-HLA mismatches provide any advantage over >0 HLA mismatches. We hypothesized that zero-HLA mismatches among unrelated LDKT do not benefit graft survival. This retrospective SRTR database study analyzed LDKT recipients from 1987 to 2012. Among unrelated LDKT, subjects with zero-HLA mismatches were compared to a 1:1-5 matched (by donor age ±1 year and year of transplantation) control cohort with >0 HLA mismatches. The primary endpoint was death-censored graft survival. Among 32,654 unrelated LDKT recipients, 83 had zero-HLA mismatches and were matched to 407 controls with >0 HLA mismatches. Kaplan-Meier analyses for death-censored graft and patient survival showed no difference between the study and control cohorts. In multivariate marginal Cox models, zero-HLA mismatches showed no benefit in death-censored graft survival (HR = 1.46, 95% CI 0.78-2.73) or patient survival (HR = 1.43, 95% CI 0.68-3.01). Our data suggest that in unrelated LDKT, zero-HLA mismatches may not offer any survival advantage. Therefore, further study of zero-HLA mismatching is needed to validate its place in the OPTN/UNOS KPD Pilot Program allocation algorithm.
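The 1:1-5 matching described here (donor age ±1 year, same year of transplantation) can be sketched as a greedy selection over the control pool. This is an illustrative reconstruction under assumed field names (`id`, `year`, `donor_age`), not the authors' SRTR analysis code:

```python
def match_controls(cases, controls, max_ratio=5):
    """Greedy 1-to-(1..5) matching: exact transplant year, donor age within 1 year.
    cases/controls: dicts with hypothetical keys 'id', 'year', 'donor_age'."""
    used = set()
    matches = {}
    for case in cases:
        eligible = [c for c in controls
                    if c["id"] not in used
                    and c["year"] == case["year"]
                    and abs(c["donor_age"] - case["donor_age"]) <= 1]
        chosen = eligible[:max_ratio]          # take up to max_ratio controls per case
        used.update(c["id"] for c in chosen)
        matches[case["id"]] = [c["id"] for c in chosen]
    return matches
```

With 83 cases matched to 407 controls, the study's realized ratio averages just under 1:5, consistent with a scheme like this running out of eligible controls for some cases.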


Subject(s)
HLA-DR Antigens , Histocompatibility Antigens Class I , Kidney Transplantation , Transplantation Immunology , Adult , Cohort Studies , Female , Graft Survival , Humans , Living Donors , Male , Middle Aged , Retrospective Studies
20.
JAMA Neurol ; 81(8): 866-874, 2024 Aug 01.
Article in English | MEDLINE | ID: mdl-38884986

ABSTRACT

Importance: Animal and human studies have suggested that the use of angiotensin receptor blockers (ARBs) may be associated with a lower risk of incident epilepsy compared with other antihypertensive medications. However, observational data from the US are lacking. Objective: To evaluate the association between ARB use and epilepsy incidence in subgroups of US patients with hypertension. Design, Setting, and Participants: This retrospective cohort study used data from a national health administrative database from January 2010 to December 2017 with propensity score (PS) matching. The eligible cohort included privately insured individuals aged 18 years or older with a diagnosis of primary hypertension who were dispensed at least 1 ARB, angiotensin-converting enzyme inhibitor (ACEI), ß-blocker, or calcium channel blocker (CCB) from 2010 to 2017. Patients with a diagnosis of epilepsy at or before the index date, or who were dispensed an antiseizure medication 12 months before or 90 days after initiating the study medications, were excluded. The data analysis for this project was conducted from April 2022 to April 2024. Exposures: Propensity scores were generated based on baseline covariates and used to match patients who received ARBs with those who received either ACEIs, ß-blockers, CCBs, or a combination of these antihypertensive medications. Main Outcomes and Measures: Cox regression analyses were used to evaluate epilepsy incidence during follow-up, comparing the ARB cohort with the other antihypertensive classes. Subgroup and sensitivity analyses were conducted to examine the association between ARB use and epilepsy incidence in various subgroups. Results: Of 2 261 964 patients (mean [SD] age, 61.7 [13.9] years; 1 120 630 [49.5%] female) included, 309 978 received ARBs, 807 510 received ACEIs, 695 887 received ß-blockers, and 448 589 received CCBs. Demographic and clinical characteristics differed across the 4 comparison groups prior to PS matching.
Compared with ARB users, patients receiving ACEIs were predominantly male and had diabetes, CCB users were generally older (eg, >65 years), and ß-blocker users had more comorbidities and concurrent medications. The 1:1 PS-matched subgroups included 619 858 patients for ARB vs ACEI, 619 828 patients for ARB vs ß-blocker, and 601 002 patients for ARB vs CCB. Baseline characteristics were equally distributed between comparison groups after matching with propensity scores. Use of ARBs was associated with a decreased incidence of epilepsy compared with ACEIs (adjusted hazard ratio [aHR], 0.75; 95% CI, 0.58-0.96), ß-blockers (aHR, 0.70; 95% CI, 0.54-0.90), and a combination of other antihypertensive classes (aHR, 0.72; 95% CI, 0.56-0.95). Subgroup analyses revealed a significant association between ARB use (primarily losartan) and epilepsy incidence in patients with no preexisting history of stroke or cardiovascular disease. Conclusions and Relevance: This cohort study found that ARBs, mainly losartan, were associated with a lower incidence of epilepsy compared with other antihypertensive agents in hypertensive patients with no preexisting stroke or cardiovascular disease. Further studies, such as randomized clinical trials, are warranted to confirm the comparative antiepileptogenic properties of antihypertensive medications.
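The 1:1 propensity-score matching described under Exposures can be illustrated with a greedy nearest-neighbour pass over precomputed scores. The caliper value and the score inputs below are assumptions for the sketch; beyond 1:1 matching, the abstract does not specify the study's PS model or matching algorithm:

```python
def ps_match(treated, control, caliper=0.05):
    """Greedy 1:1 nearest-neighbour matching on precomputed propensity scores.
    treated/control: lists of (unit_id, propensity_score) tuples."""
    available = list(control)
    pairs = []
    for tid, ps in sorted(treated, key=lambda x: x[1]):
        if not available:
            break
        best = min(available, key=lambda c: abs(c[1] - ps))
        if abs(best[1] - ps) <= caliper:       # only match within the caliper
            pairs.append((tid, best[0]))
            available.remove(best)             # match without replacement
    return pairs
```

Treated units whose nearest control lies outside the caliper are simply dropped, which is why 1:1 PS-matched cohorts (as here, 619 858 of a larger pool for ARB vs ACEI) are smaller than the eligible population.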


Subject(s)
Angiotensin Receptor Antagonists , Epilepsy , Hypertension , Humans , Female , Hypertension/epidemiology , Hypertension/drug therapy , Male , Angiotensin Receptor Antagonists/therapeutic use , Angiotensin Receptor Antagonists/adverse effects , Middle Aged , Epilepsy/drug therapy , Epilepsy/epidemiology , Retrospective Studies , Aged , Adult , Antihypertensive Agents/therapeutic use , Incidence , Cohort Studies , Angiotensin-Converting Enzyme Inhibitors/therapeutic use , Angiotensin-Converting Enzyme Inhibitors/adverse effects , Propensity Score , Calcium Channel Blockers/therapeutic use , Calcium Channel Blockers/adverse effects , United States/epidemiology