2.
Curr Hypertens Rep ; 26(5): 183-199, 2024 May.
Article in English | MEDLINE | ID: mdl-38363454

ABSTRACT

PURPOSE OF REVIEW: To define resistant hypertension (RHT), review its pathophysiology and disease burden, identify barriers to effective hypertension management, and highlight emerging treatment options. RECENT FINDINGS: RHT is defined as uncontrolled blood pressure (BP) ≥ 130/80 mm Hg despite concurrent prescription of ≥ 3 or ≥ 4 antihypertensive drugs in different classes, or controlled BP despite prescription of ≥ 4 drugs, at maximally tolerated doses, including a diuretic. BP is regulated by a complex interplay between the renin-angiotensin-aldosterone system, the sympathetic nervous system, the endothelin system, natriuretic peptides, the arterial vasculature, and the immune system; disruption of any of these can increase BP. RHT is disproportionately manifest in African Americans, older patients, and those with diabetes and/or chronic kidney disease (CKD). Amongst drug-treated hypertensives, only one-quarter have been treated intensively enough (prescribed > 2 drugs) to be considered for this diagnosis. New treatment strategies aimed at novel therapeutic targets include inhibition of sodium-glucose cotransporter 2, aminopeptidase A, aldosterone synthesis, phosphodiesterase 5, xanthine oxidase, and dopamine beta-hydroxylase, as well as soluble guanylate cyclase stimulation, nonsteroidal mineralocorticoid receptor antagonism, and dual endothelin receptor antagonism. The burden of RHT remains high. Better use of currently approved therapies and integration of emerging therapies are welcome additions to the therapeutic armamentarium for addressing needs in high-risk patients with apparent treatment-resistant hypertension (aTRH).


Subject(s)
Antihypertensive Agents , Hypertension , Humans , Antihypertensive Agents/therapeutic use , Hypertension/drug therapy , Hypertension/physiopathology , Drug Resistance , Blood Pressure/drug effects , Cost of Illness
4.
Int J Audiol ; 62(2): 151-158, 2023 02.
Article in English | MEDLINE | ID: mdl-35015962

ABSTRACT

OBJECTIVE: To elucidate D-methionine's (D-met) dose and time rescue parameters from steady-state or impulse noise-induced permanent threshold shift (PTS) and determine D-met rescue's influence on serum and cochlear antioxidant levels. DESIGN: D-met doses of 0, 50, 100, or 200 mg/kg/dose were administered starting at 1, 24, or 36 hours post steady-state or impulse noise exposure. Auditory brainstem responses at baseline and 21 days post-noise measured PTS. Serum (superoxide dismutase [SOD], catalase [CAT], glutathione reductase [GR], and glutathione peroxidase [GPx]) and cochlear (glutathione [GSH] and glutathione disulphide [GSSG]) antioxidant levels measured physiological impact. STUDY SAMPLE: Chinchillas (10/study group; 6-8/confirmatory groups). RESULTS: D-met significantly reduced PTS for impulse noise (100 mg/kg [2, 6, 14 and 20 kHz]; 200 mg/kg [2, 14 and 20 kHz]) and steady-state noise (all dosing groups, time parameters and tested frequencies). PTS reduction did not significantly vary by rescue time. D-met significantly increased serum SOD (100 and 200 mg/kg for the 24 hour rescue) and GPx (50 mg/kg at the 24 hour rescue) at 21 days post-noise. Cochlear GSH and GSSG levels were unaffected relative to control. CONCLUSION: D-met rescues from steady-state and impulse noise-induced PTS even when administered up to 36 hours post-noise and dose-dependently influences serum antioxidant levels even 21 days post-noise. D-met's broad and effective dose/time window renders it a promising antioxidant rescue agent.


Subject(s)
Hearing Loss, Noise-Induced , Methionine , Humans , Antioxidants/pharmacology , Hearing Loss, Noise-Induced/etiology , Hearing Loss, Noise-Induced/prevention & control , Glutathione Disulfide/pharmacology , Racemethionine/pharmacology , Superoxide Dismutase/pharmacology , Auditory Threshold , Evoked Potentials, Auditory, Brain Stem/physiology
5.
J Stroke Cerebrovasc Dis ; 31(8): 106550, 2022 Aug.
Article in English | MEDLINE | ID: mdl-35576858

ABSTRACT

OBJECTIVES: Large middle cerebral artery (MCA) strokes remain a major cause of mortality and morbidity worldwide, so early identification of the patients at highest risk for malignant cerebral edema is crucial for early intervention. The neutrophil-to-lymphocyte ratio (NLR) and peripheral total white blood cell (WBC) count are inflammatory markers obtained routinely for all patients, and this study evaluated the use of NLR and elevated WBC count within the first 24 h of MCA ischemic stroke onset, in the absence of significant hemorrhagic transformation, to predict malignant cerebral edema. MATERIALS AND METHODS: A total of 156 patients with large MCA strokes were included. We collected demographic, clinical, and radiological data, plus NLR and WBC count within the first 24 h from admission. We excluded patients who had any underlying infection diagnosed 7 days before or within 72 h after admission; a body temperature of 38 °C or higher, an abnormal chest X-ray, or an abnormal urine analysis within the first 72 h was used to exclude patients with possible infection. We also excluded immunocompromised patients and patients on steroid therapy. We compared the NLR and WBC count in patients who developed malignant cerebral edema versus those who did not, using an NLR cut-off of 3.5 (> 3.5 vs. < 3.5) for comparison. We then fit multivariable logistic regression models to explore the relationship between cerebral edema, WBC count, and NLR simultaneously. RESULTS: NLR, WBC count, radiological involvement of more than 50% of the MCA territory on presentation, a hyperdense MCA sign, and NIH stroke scale score were all significantly higher in patients with malignant cerebral edema within the first 24 h. In univariate logistic regression, NLR performed better than WBC count for predicting the occurrence of malignant cerebral edema (AUC = 0.74 vs. 0.62). However, NIH stroke scale score and radiological involvement of more than 50% of the MCA territory on CT within the first 24 h of presentation both showed better discriminative performance for malignant cerebral edema than NLR (AUC = 0.84 and 0.76, respectively). When combined, NLR > 3.5 paired with the NIH stroke scale score had the best predictive performance (AUC = 0.87). CONCLUSION: NLR > 3.5 can be used for early prognostication in patients with large-vessel MCA ischemic stroke without significant hemorrhagic transformation within the first 24 h, regardless of whether they received reperfusion therapy. Combining NLR > 3.5 with a high NIHSS score provided the best predictive model in our study. Further studies are needed to develop the best predictive model in diverse populations.
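As a rough illustration of the analysis described above (single-predictor logistic regression compared by AUC, then a combined model), the following Python sketch uses scikit-learn on a hypothetical data file with hypothetical column names (nlr, wbc, nihss, malignant_edema); it is not the authors' code.

```python
# Minimal sketch: compare the discrimination (AUC) of single predictors of
# malignant cerebral edema, then a combined dichotomized-NLR + NIHSS model.
# File name and column names are hypothetical.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

df = pd.read_csv("mca_stroke_cohort.csv")      # hypothetical dataset
y = df["malignant_edema"]                      # 1 = developed malignant edema

for predictor in ["nlr", "wbc", "nihss"]:
    model = LogisticRegression().fit(df[[predictor]], y)
    auc = roc_auc_score(y, model.predict_proba(df[[predictor]])[:, 1])
    print(f"{predictor}: AUC = {auc:.2f}")

# Combined model: NLR dichotomized at 3.5 plus NIHSS score
df["nlr_gt_3_5"] = (df["nlr"] > 3.5).astype(int)
X = df[["nlr_gt_3_5", "nihss"]]
combined = LogisticRegression().fit(X, y)
print("NLR>3.5 + NIHSS: AUC =",
      round(roc_auc_score(y, combined.predict_proba(X)[:, 1]), 2))
```

These are apparent (in-sample) AUCs, in line with the descriptive comparison reported in the abstract; a validated model would require a separate test set or resampling.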


Subject(s)
Brain Edema , Stroke , Biomarkers , Brain Edema/diagnostic imaging , Brain Edema/etiology , Humans , Infarction, Middle Cerebral Artery/diagnostic imaging , Infarction, Middle Cerebral Artery/therapy , Leukocyte Count , Predictive Value of Tests , Retrospective Studies , Stroke/diagnostic imaging , Stroke/etiology
7.
Int J Audiol ; 61(8): 621-631, 2022 08.
Article in English | MEDLINE | ID: mdl-34622731

ABSTRACT

Objective: This exploratory Phase 2 clinical trial is the first to determine the safety and efficacy of oral D-methionine (D-met) in reducing cisplatin-induced ototoxicity. Design: Randomised, parallel, double-blind, placebo-controlled exploratory Phase 2 study. Study sample: Fifty adult cancer patients received oral D-met or placebo before each cisplatin dose. Physical examination, blood collection and audiometry occurred at baseline and subsequent visits, plus post-treatment audiometry. After attrition, the final analysis included 27 patients. Results: Significant treatment group by ear and time (baseline vs. post-treatment) interactions occurred at 10 kHz and 11.2 kHz. Placebo and D-met groups differed in threshold shift for the left ear at 11.2 kHz (mean difference = 22.97 dB [9.59, 36.35]). Averaging across ears, the placebo group showed significant threshold shifts from baseline to post-treatment at 10 kHz (mean shift = -13.65 dB [-21.32, -5.98]), 11.2 kHz (-16.15 dB [-25.19, -7.12]), and 12.5 kHz (-11.46 dB [-19.18, -3.74]) but not 8 kHz (-8.65 dB [-17.86, 0.55]). The D-met group showed no significant threshold shifts (8 kHz: -1.25 dB [-7.75, 5.25]; 10 kHz: -3.93 dB [-8.89, 1.03]; 11.2 kHz: -4.82 dB [-11.21, 1.57]; 12.5 kHz: -3.68 dB [-11.57, 4.21]). Side effects did not significantly differ between groups. Conclusion: Oral D-met reduces cisplatin-induced ototoxicity in humans.


Subject(s)
Hearing Loss , Methionine , Adult , Auditory Threshold , Cisplatin/toxicity , Hearing Loss/chemically induced , Hearing Loss/diagnosis , Hearing Loss/prevention & control , Humans , India , Methionine/therapeutic use , Ototoxicity/prevention & control
8.
Int J Audiol ; 61(9): 769-777, 2022 Sep.
Article in English | MEDLINE | ID: mdl-34369249

ABSTRACT

OBJECTIVE: Determine whether D-methionine (D-met) rescue prevents temporary threshold shift (TTS) from steady-state or impulse noise and determine D-met's impact on serum and cochlear antioxidant levels. DESIGN: D-met at 50, 100, or 200 mg/kg/dose was administered 0, 6 and 18 hours post-noise. Auditory brainstem responses (ABRs) at baseline and 24 hours post-noise measured TTS. Serum (SOD, CAT, GR, GPx) and cochlear (GSH, GSSG) antioxidant levels measured physiological influence. Three saline-injected control groups were included: impulse noise, steady-state noise, and no noise. STUDY SAMPLE: Ten chinchillas/group. RESULTS: D-met rescue did not significantly reduce TTS or affect serum CAT, SOD, GPx or GR levels vs. noise-exposed control groups, but TTS was greater in all groups relative to no-noise controls. D-met significantly elevated CAT at 50 mg/kg vs. steady-state controls and SOD at 200 mg/kg vs. impulse noise controls. D-met significantly reduced cochlear GSH/GSSG ratios in the 100 mg/kg D-met group vs. impulse noise controls. CONCLUSIONS: While D-met rescue has reduced permanent threshold shift (PTS) in previous studies, it did not reduce TTS in this study. However, D-met rescue did alter selected serum and cochlear oxidative-state measures 24 hours post-noise relative to controls. These results demonstrate that TTS studies do not always predict PTS protection in otoprotectant experimental designs.


Subject(s)
Antioxidants , Hearing Loss, Noise-Induced , Animals , Auditory Threshold/physiology , Chinchilla , Evoked Potentials, Auditory, Brain Stem/physiology , Glutathione Disulfide , Hearing Loss, Noise-Induced/etiology , Hearing Loss, Noise-Induced/prevention & control , Methionine , Superoxide Dismutase
9.
PLoS One ; 16(12): e0261049, 2021.
Article in English | MEDLINE | ID: mdl-34879107

ABSTRACT

OBJECTIVE: Determine effective preloading timepoints for D-methionine (D-met) otoprotection from steady-state or impulse noise and the impact on cochlear and serum antioxidant measures. DESIGN: D-met started 2.0, 2.5, 3.0, or 3.5 days before steady-state or impulse noise exposure, with saline controls. Auditory brainstem responses (ABRs) were measured from 2 to 20 kHz at baseline and 21 days post-noise. Samples were then collected for serum (SOD, CAT, GR, GPx) and cochlear (GSH, GSSG) antioxidant levels. STUDY SAMPLE: Ten chinchillas per group. RESULTS: Preloading D-met significantly reduced ABR threshold shifts for both impulse and steady-state noise exposures, but with different optimal starting time points and with differences in antioxidant measures. For impulse noise exposure, the 2.0, 2.5, and 3.0 day preloading starts provided significant threshold shift protection at all frequencies. Compared with the saline controls, serum GR for the 3.0 and 3.5 day preloading groups was significantly increased at 21 days, with no significant increase in SOD, CAT or GPx for any impulse preloading time point. Cochlear GSH, GSSG, and the GSH/GSSG ratio were not significantly different from saline controls at 21 days post-noise exposure. For steady-state noise exposure, significant threshold shift protection occurred at all frequencies for the 3.5, 3.0 and 2.5 day preloading start times, but protection occurred at only 3 of the 6 test frequencies for the 2.0 day preloading start point. Compared with the saline controls, preloaded D-met steady-state noise groups demonstrated significantly higher serum SOD for the 2.5-3.5 day starting time points and GPx for the 2.5 day starting time, but no significant increase in GR or CAT for any preloading time point. Compared with saline controls, D-met significantly increased cochlear GSH concentrations in the 2.0 and 2.5 day steady-state noise-exposed groups, but no significant differences in GSSG or the GSH/GSSG ratio were noted for any steady-state noise-exposed group. CONCLUSIONS: The optimal D-met preloading starting time window is earlier for steady-state noise (3.5-2.5 days) than for impulse noise (3.0-2.0 days). At 21 days post impulse noise, D-met increased serum GR for 2 preloading time points but not SOD, CAT, or GPx, and not cochlear GSH, GSSG or the GSH/GSSG ratio. At 21 days post steady-state noise, D-met increased serum SOD and GPx at select preloading time points but not CAT or GR. However, D-met did increase cochlear GSH at select preloading time points but not GSSG or the GSH/GSSG ratio.


Subject(s)
Antioxidants/pharmacology , Auditory Threshold , Cochlea/drug effects , Hearing Loss, Noise-Induced/prevention & control , Methionine/pharmacology , Protective Agents/pharmacology , Animals , Chinchilla , Cochlea/pathology , Hearing Loss, Noise-Induced/etiology , Hearing Loss, Noise-Induced/pathology , Male
10.
Dermatol Surg ; 47(12): 1562-1565, 2021 12 01.
Article in English | MEDLINE | ID: mdl-34417389

ABSTRACT

BACKGROUND: There are limited published data regarding the incidence and risk factors for infection after minor dermatologic procedures, such as skin biopsy, shave, and curettage. Prior studies of infection risk after dermatologic procedures have often not specified the method of preparation of local anesthetic. OBJECTIVE: To assess the incidence and risk factors for infection after minor procedures performed in a general dermatology clinic using buffered lidocaine prepared in office. MATERIALS AND METHODS: In this retrospective case-control study, the medical record was searched for cases of infection after skin biopsies, shaves, conventional excisions, and destructions performed in a general dermatology clinic over a 4-year period. Patient and procedure characteristics were compared with uninfected controls. RESULTS: Of 9,031 procedures performed during the study period, there were 34 infections (0.4%). The odds of infection for procedures on the arm and leg were 5.29 and 9.28 times higher, respectively, than those on the head/neck. There was no significant effect of age, sex, smoking, immunosuppression, diabetes, or anticoagulation. CONCLUSION: The incidence of infection is low after minor dermatologic procedures performed with local anesthesia using buffered lidocaine prepared in office. There is a higher risk of infection on the arm and leg compared with the head and neck.
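For readers less familiar with odds ratios such as the 5.29 and 9.28 reported above, the short Python sketch below computes an odds ratio with a Wald 95% confidence interval from a 2x2 table; the counts are invented for illustration and are not the study's data.

```python
# Odds ratio with Wald 95% CI from a 2x2 table (hypothetical counts, not the
# study's data): infection vs. no infection, by procedure site.
import numpy as np

infected_leg, uninfected_leg = 12, 1500
infected_head, uninfected_head = 5, 5800

odds_ratio = (infected_leg / uninfected_leg) / (infected_head / uninfected_head)
se_log_or = np.sqrt(1 / infected_leg + 1 / uninfected_leg +
                    1 / infected_head + 1 / uninfected_head)
ci_low, ci_high = np.exp(np.log(odds_ratio) + np.array([-1.96, 1.96]) * se_log_or)
print(f"OR = {odds_ratio:.2f}, 95% CI ({ci_low:.2f}, {ci_high:.2f})")
```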


Subject(s)
Dermatologic Surgical Procedures , Surgical Wound Infection/epidemiology , Aged , Case-Control Studies , Female , Humans , Incidence , Male , Middle Aged , Minor Surgical Procedures , Retrospective Studies , Risk Factors
11.
Hypertension ; 77(1): 72-81, 2021 01.
Article in English | MEDLINE | ID: mdl-33161774

ABSTRACT

Refractory hypertension (RfH) is a severe phenotype of antihypertensive treatment failure. Treatment-resistant hypertension (TRH), a less severe form of difficult-to-treat hypertension, has been associated with significantly worse health outcomes. However, no studies currently show how health outcomes may worsen upon progression to RfH. RfH and TRH were studied in 3147 hypertensive participants in the CRIC (Chronic Renal Insufficiency Cohort) study. The hypertensive phenotype (ie, no TRH or RfH, TRH, or RfH) was identified at the baseline visit, and health outcomes were monitored at subsequent visits. Outcome risk was compared using Cox proportional hazards models with time-varying covariates. A total of 136 (4.3%) individuals were identified with RfH at baseline. After adjusting for participant characteristics, individuals with RfH had increased risk for the composite renal outcome across all study years (50% decline in estimated glomerular filtration rate or end-stage renal disease; hazard ratio for study years 0-10 = 1.73 [95% CI, 1.42-2.11]) and for the composite cardiovascular disease outcome during later study years (stroke, myocardial infarction, or congestive heart failure; hazard ratio for study years 0-3 = 1.25 [0.91-1.73], for study years 3-6 = 1.50 [0.97-2.32], and for study years 6-10 = 2.72 [1.47-5.01]) when compared with individuals with TRH. There was no significant difference in all-cause mortality between those with RfH and those with TRH. We provide the first evidence that RfH is associated with worse long-term health outcomes compared with TRH.
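The outcome comparison above relies on Cox proportional hazards models with time-varying covariates. The following Python sketch, using the lifelines package on a hypothetical long-format table (one row per participant per time interval), shows the general shape of such a model; it is not the CRIC analysis code.

```python
# Minimal sketch of a Cox model with time-varying covariates (lifelines).
# File name and column names are hypothetical.
import pandas as pd
from lifelines import CoxTimeVaryingFitter

long_df = pd.read_csv("cric_long_format.csv")
# expected columns: id, start, stop, event (1 = composite outcome in interval),
# rfh, trh (time-varying phenotype indicators), plus adjustment covariates

ctv = CoxTimeVaryingFitter()
ctv.fit(long_df, id_col="id", event_col="event",
        start_col="start", stop_col="stop")
ctv.print_summary()   # hazard ratios with 95% CIs, e.g. RfH relative to TRH
```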


Subject(s)
Antihypertensive Agents/therapeutic use , Hypertension/drug therapy , Renal Insufficiency, Chronic/complications , Adult , Aged , Cohort Studies , Female , Humans , Hypertension/complications , Hypertension/epidemiology , Male , Middle Aged , Patient Outcome Assessment , Proportional Hazards Models
12.
Catheter Cardiovasc Interv ; 97(4): E569-E579, 2021 03.
Article in English | MEDLINE | ID: mdl-32969155

ABSTRACT

BACKGROUND: Transcatheter mitral valve repair (TMVR) is a treatment option for patients with 3+ or greater mitral regurgitation who cannot undergo mitral valve surgery. Outcomes in patients with chronic kidney disease (CKD) and end-stage renal disease (ESRD) are unclear. We sought to evaluate in-hospital outcomes and readmission rates after TMVR and its impact on kidney function. METHODS: Data from the 2016 National Readmission Database were used to identify all patients who underwent TMVR. Patients were classified by their CKD status: no CKD, CKD, or ESRD. The primary outcomes were in-hospital mortality, 30- and 90-day readmission rates, and change in CKD status on readmission. Multivariable logistic regression analysis was used to assess in-hospital outcomes, readmission outcomes, and kidney function stage. RESULTS: A total of 4,645 patients were assessed (mean age 78.5 ± 10.3 years). In-hospital mortality was higher in patients with CKD (4.0%; odds ratio [OR]: 2.01 [95% confidence interval, CI: 1.27-3.18]) and ESRD (6.6%; OR: 6.38 [95% CI: 1.49-27.36]) compared with non-CKD patients (2.4%). The 30-day readmission rate was higher in ESRD versus non-CKD patients (17.8% vs. 10.4%; OR: 2.24 [95% CI: 1.30-3.87]), as was the 90-day readmission rate (41.2% vs. 21%; OR: 2.51 [95% CI: 1.70-3.72]). Kidney function improved in 25% of patients with CKD stage 3 and in 50% with CKD stage 4-5 at 30- and 90-day readmission. The incidence of acute kidney injury, major bleeding, and respiratory failure was higher in the CKD group. CONCLUSIONS: Patients with CKD and ESRD have worse outcomes and higher readmission rates after TMVR. In patients who were readmitted after TMVR, renal function improved in some, suggesting that TMVR could potentially improve CKD stage.


Subject(s)
Heart Valve Prosthesis Implantation , Mitral Valve Insufficiency , Renal Insufficiency, Chronic , Aged , Cardiac Catheterization/adverse effects , Heart Valve Prosthesis Implantation/adverse effects , Hospitals , Humans , Mitral Valve/diagnostic imaging , Mitral Valve/surgery , Mitral Valve Insufficiency/surgery , Patient Readmission , Renal Insufficiency, Chronic/diagnosis , Treatment Outcome
13.
Ecol Evol ; 10(14): 7221-7232, 2020 Jul.
Article in English | MEDLINE | ID: mdl-32760523

ABSTRACT

Obtaining accurate estimates of disease prevalence is crucial for the monitoring and management of wildlife populations but can be difficult if different diagnostic tests yield conflicting results and if the accuracy of each diagnostic test is unknown. Bayesian latent class analysis (BLCA) modeling offers a potential solution, providing estimates of prevalence levels and diagnostic test accuracy under the realistic assumption that no diagnostic test is perfect. In typical applications of this approach, the specificity of one test is fixed at or close to 100%, allowing the model to simultaneously estimate the sensitivity and specificity of all other tests, in addition to infection prevalence. In wildlife systems, a test with near-perfect specificity is not always available, so we simulated data to investigate how decreasing this fixed specificity value affects the accuracy of model estimates. We used simulations to explore how the trade-off between diagnostic test specificity and sensitivity impacts prevalence estimates and found that directional biases depend on pathogen prevalence. Both the precision and accuracy of results depend on the sample size, the diagnostic tests used, and the true infection prevalence, so these factors should be considered when applying BLCA to estimate disease prevalence and diagnostic test accuracy in wildlife systems. A wildlife disease case study, focusing on leptospirosis in California sea lions, demonstrated the potential for Bayesian latent class methods to provide reliable estimates under real-world conditions. We delineate conditions under which BLCA improves upon the results from a single diagnostic across a range of prevalence levels and sample sizes, demonstrating when this method is preferable for disease ecologists working in a wide variety of pathogen systems.
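As a much simpler stand-in for the simulations described above (not the authors' BLCA model), the Python sketch below shows why the assumed specificity matters: an apparent prevalence from an imperfect test is corrected with the Rogan-Gladen estimator, and assuming 100% specificity when the true value is lower biases the estimate at low prevalence. All numbers are illustrative.

```python
# Illustrative simulation: bias in prevalence estimates when a test's
# specificity is assumed to be perfect but is not. Not the authors' model.
import numpy as np

rng = np.random.default_rng(0)
n, true_prev = 5000, 0.10
true_se, true_sp = 0.85, 0.95                     # true test characteristics

infected = rng.random(n) < true_prev
positive = np.where(infected,
                    rng.random(n) < true_se,       # true positives
                    rng.random(n) < 1 - true_sp)   # false positives
apparent_prev = positive.mean()

def rogan_gladen(apparent, se, sp):
    """Correct an apparent prevalence for assumed sensitivity/specificity."""
    return (apparent + sp - 1) / (se + sp - 1)

print("apparent prevalence:        ", round(apparent_prev, 3))
print("corrected (sp assumed 0.95):", round(rogan_gladen(apparent_prev, 0.85, 0.95), 3))
print("corrected (sp assumed 1.00):", round(rogan_gladen(apparent_prev, 0.85, 1.00), 3))
```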

14.
PLoS Negl Trop Dis ; 14(6): e0008407, 2020 06.
Article in English | MEDLINE | ID: mdl-32598393

ABSTRACT

Confronted with the challenge of understanding population-level processes, disease ecologists and epidemiologists often simplify quantitative data into distinct physiological states (e.g. susceptible, exposed, infected, recovered). However, data defining these states often fall along a spectrum rather than into clear categories. Hence, the host-pathogen relationship is more accurately defined using quantitative data, often integrating multiple diagnostic measures, just as clinicians do to assess their patients. We use quantitative data on a major neglected tropical disease (Leptospira interrogans) in California sea lions (Zalophus californianus) to improve individual-level and population-level understanding of this Leptospira reservoir system. We create a "host-pathogen space" by mapping multiple biomarkers of infection (e.g. serum antibodies, pathogen DNA) and disease state (e.g. serum chemistry values) from 13 longitudinally sampled, severely ill individuals to characterize changes in these values through time. Data from these individuals describe a clear, unidirectional trajectory of disease and recovery within this host-pathogen space. Remarkably, this trajectory also captures the broad patterns in larger cross-sectional datasets of 1456 wild sea lions in all states of health but sampled only once. Our framework enables us to determine an individual's location in their time-course since initial infection, and to visualize the full range of clinical states and antibody responses induced by pathogen exposure. We identify predictive relationships between biomarkers and outcomes such as survival and pathogen shedding, and use these to impute values for missing data, thus increasing the size of the useable dataset. Mapping the host-pathogen space using quantitative biomarker data enables more nuanced understanding of an individual's time course of infection, duration of immunity, and probability of being infectious. Such maps also make efficient use of limited data for rare or poorly understood diseases, by providing a means to rapidly assess the range and extent of potential clinical and immunological profiles. These approaches yield benefits for clinicians needing to triage patients, prevent transmission, and assess immunity, and for disease ecologists or epidemiologists working to develop appropriate risk management strategies to reduce transmission risk on a population scale (e.g. model parameterization using more accurate estimates of duration of immunity and infectiousness) and to assess health impacts on a population scale.


Subject(s)
Biomarkers/blood , Host-Pathogen Interactions/physiology , Leptospira/pathogenicity , Leptospirosis/diagnosis , Leptospirosis/veterinary , Sea Lions/microbiology , Animal Diseases/diagnosis , Animal Diseases/immunology , Animal Diseases/microbiology , Animals , Antibodies, Bacterial/blood , Bacterial Shedding , California , Cross-Sectional Studies , Host-Pathogen Interactions/immunology , Immunity , Kinetics , Leptospira interrogans , Leptospirosis/immunology , Survival Rate
15.
Am J Hypertens ; 33(6): 528-533, 2020 05 21.
Article in English | MEDLINE | ID: mdl-31930338

ABSTRACT

BACKGROUND: Intensively treated participants in the SPRINT study experienced fewer primary cardiovascular composite study endpoints (CVD events) and lower mortality, although 38% of participants experienced a serious adverse event (SAE). The relationship of SAEs with CVD events is unknown. METHODS: CVD events were defined as myocardial infarction, acute coronary syndrome, decompensated heart failure, stroke, or death from cardiovascular causes. Cox models were used to relate the occurrence of SAEs to CVD events according to baseline atherosclerotic cardiovascular disease (ASCVD) risk. RESULTS: SAEs occurred in 96% of those experiencing a CVD event but in only 34% (P < 0.001) of those not experiencing a CVD event. The occurrence of SAEs increased monotonically across the range of baseline ASCVD risk, being approximately twice as great in the highest compared with the lowest risk category. SAE occurrence was strongly associated with ASCVD risk but was similar within risk groups across treatment arms. In adjusted Cox models, experiencing a CVD event was the strongest predictor of SAEs in all risk groups. By the end of year 1, the hazard ratios for the low, middle, and high ASCVD risk tertiles and the baseline clinical CVD group were 2.56 (95% CI = 1.39-4.71), 2.52 (1.63-3.89), 3.61 (2.79-4.68), and 1.86 (1.37-2.54), respectively, a trend observed in subsequent years until study end. Intensive treatment independently predicted SAEs only in the second ASCVD risk tertile. CONCLUSIONS: The occurrence of SAEs is multifactorial and mostly related to prerandomization patient characteristics, most prominently ASCVD risk, which, in turn, relates to in-study CVD events.


Subject(s)
Antihypertensive Agents/therapeutic use , Blood Pressure/drug effects , Cardiovascular Diseases/prevention & control , Hypertension/drug therapy , Aged , Antihypertensive Agents/adverse effects , Cardiovascular Diseases/mortality , Cardiovascular Diseases/physiopathology , Cluster Analysis , Female , Heart Disease Risk Factors , Humans , Hypertension/mortality , Hypertension/physiopathology , Male , Middle Aged , Randomized Controlled Trials as Topic , Risk Assessment , Time Factors , Treatment Outcome
16.
J Hypertens ; 37(9): 1797-1804, 2019 09.
Article in English | MEDLINE | ID: mdl-31058798

ABSTRACT

OBJECTIVES: Refractory hypertension has been defined as uncontrolled blood pressure (at or above 140/90 mmHg) on five or more classes of antihypertensive medication, inclusive of a diuretic. Because unbiased estimates of the prevalence of refractory hypertension in the United States are lacking, we aimed to provide such estimates using data from the National Health and Nutrition Examination Surveys (NHANES). METHODS: Refractory hypertension was assessed across multiple NHANES cycles using the aforementioned definition. Eight cycles of NHANES surveys (1999-2014), representing 41 552 patients, are the subject of this study. The prevalence of refractory hypertension across these surveys was estimated in the drug-treated hypertensive population after adjusting for the complex survey design and standardizing for age. RESULTS: Across all surveys, refractory hypertension prevalence was 0.6% [95% confidence interval (CI) (0.5, 0.7)] amongst drug-treated hypertensive adults; 6.2% [95% CI (5.1, 7.6)] of individuals with treatment-resistant hypertension actually had refractory hypertension. Although the prevalence of refractory hypertension ranged from 0.3% [95% CI (0.1, 1.0)] to 0.9% [95% CI (0.6, 1.2)] over the eight cycles considered, there was no significant trend in prevalence over time. Refractory hypertension prevalence amongst those prescribed five or more drugs was 34.5% [95% CI (27.9, 41.9)]. Refractory hypertension was associated with advancing age, lower household income, and black race, as well as chronic kidney disease, albuminuria, diabetes, prior stroke, and coronary heart disease. CONCLUSIONS: We provided the first nationally representative estimate of refractory hypertension prevalence in US adults.
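The prevalence figures above come from survey-weighted estimation. As a hedged sketch with hypothetical variable names, omitting the strata/PSU design variables needed for correct variance estimation and the age-standardization step, a weighted point estimate in Python might look like this:

```python
# Survey-weighted prevalence point estimate (sketch only).
# Variable names are hypothetical; a proper NHANES analysis also needs the
# design variables (strata, PSU) for variance estimation and age standardization.
import numpy as np
import pandas as pd

df = pd.read_csv("nhanes_drug_treated_htn.csv")   # hypothetical extract
# refractory: 1 if BP >= 140/90 mmHg on >= 5 antihypertensive classes incl. a diuretic
# exam_weight: NHANES examination sample weight

weighted_prev = np.average(df["refractory"], weights=df["exam_weight"])
print(f"Weighted refractory hypertension prevalence: {weighted_prev:.1%}")
```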


Subject(s)
Antihypertensive Agents/therapeutic use , Diuretics/therapeutic use , Hypertension/epidemiology , Aged , Aged, 80 and over , Albuminuria/etiology , Antihypertensive Agents/pharmacology , Blood Pressure/drug effects , Blood Pressure Determination , Female , Humans , Hypertension/complications , Hypertension/drug therapy , Male , Middle Aged , Nutrition Surveys , Prevalence , Renal Insufficiency, Chronic/etiology , Stroke/etiology , United States/epidemiology
17.
J Stroke Cerebrovasc Dis ; 28(6): 1440-1447, 2019 Jun.
Article in English | MEDLINE | ID: mdl-30952531

ABSTRACT

BACKGROUND AND PURPOSE: Fifteen percent of cerebral venous thrombosis (CVT) patients have poor outcomes despite anticoagulation. Uncontrolled studies suggest that endovascular approaches may benefit such patients. In this study, we analyzed Nationwide Inpatient Sample (NIS) data to evaluate the safety and efficacy of endovascular therapy (ET) versus medical management in CVT. We also examined yearly trends in ET utilization in the United States. METHODS: International Classification of Diseases, Ninth Revision, Clinical Modification codes were used to identify CVT patients who received ET. To make the data nationally representative, weights were applied per NIS recommendations. Because ET was not randomly assigned to patients and was likely to be influenced by disease severity, propensity score weighting methods were used to correct for this treatment selection bias. Outcome variables included in-hospital mortality and discharge disposition. To determine whether our primary outcomes were associated with ET, we used weighted multivariable logistic regression analyses. RESULTS: Of the 49,952 estimated CVT cases, 48,704 (97%) received medical management and 1248 (3%) received ET (mechanical thrombectomy [MT] alone, N = 269 [21%]; MT ± thrombolysis, N = 297 [24%]; and thrombolysis alone, N = 682 [55%]). Patients who received ET were older and had more CVT-associated complications, including venous infarct, intracranial hemorrhage, coma, seizure, and cerebral edema. There was a significant yearly rise in the use of ET, with a trend favoring MT over thrombolysis alone. ET was independently associated with an increased risk of death (odds ratio 1.96, 95% confidence interval 1.15-3.32). CONCLUSIONS: Patients receiving ET experienced higher mortality after adjusting for age and CVT-associated complications. Large, well-designed prospective randomized trials are warranted to further evaluate the safety and efficacy of ET.
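The propensity score weighting step described above can be sketched as inverse-probability-of-treatment weighting followed by a weighted outcome model. The Python below is a generic illustration with hypothetical column names, not the NIS analysis code.

```python
# Generic IPTW sketch: propensity model for receiving endovascular therapy,
# then a weighted logistic outcome model. Column names are hypothetical.
import pandas as pd
import statsmodels.api as sm
from sklearn.linear_model import LogisticRegression

df = pd.read_csv("nis_cvt.csv")                 # hypothetical extract
confounders = ["age", "venous_infarct", "ich", "coma", "seizure", "cerebral_edema"]

# 1. Propensity score: P(ET | confounders)
ps_model = LogisticRegression(max_iter=1000).fit(df[confounders], df["et"])
p = ps_model.predict_proba(df[confounders])[:, 1]
df["iptw"] = df["et"] / p + (1 - df["et"]) / (1 - p)

# 2. Weighted outcome model: in-hospital mortality ~ ET
X = sm.add_constant(df[["et"]])
fit = sm.GLM(df["died"], X, family=sm.families.Binomial(),
             freq_weights=df["iptw"]).fit()
print(fit.summary())      # exp(coef on et) approximates the reported odds ratio
```

A full analysis would also incorporate the NIS survey weights and use robust standard errors; the sketch only shows the skeleton of the weighting approach.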


Subject(s)
Endovascular Procedures/trends , Inpatients , Intracranial Thrombosis/therapy , Practice Patterns, Physicians'/trends , Thrombectomy/trends , Thrombolytic Therapy/trends , Venous Thrombosis/therapy , Adult , Aged , Cardiovascular Agents/therapeutic use , Databases, Factual , Endovascular Procedures/adverse effects , Endovascular Procedures/mortality , Female , Humans , Intracranial Thrombosis/diagnostic imaging , Intracranial Thrombosis/mortality , Male , Middle Aged , Risk Assessment , Risk Factors , Thrombectomy/adverse effects , Thrombectomy/mortality , Thrombolytic Therapy/adverse effects , Thrombolytic Therapy/mortality , Time Factors , Treatment Outcome , United States , Venous Thrombosis/diagnostic imaging , Venous Thrombosis/mortality
18.
Int J Pharm Pract ; 27(4): 380-385, 2019 Aug.
Article in English | MEDLINE | ID: mdl-30847977

ABSTRACT

OBJECTIVE: To assess whether hypoglycaemia incidence during management of adult diabetic ketoacidosis (DKA) differed following transition from a fixed-rate insulin protocol to a protocol using an empiric insulin rate reduction after normoglycaemia. METHODS: We retrospectively reviewed charts from adult patients managed with a DKA order set before and after order set revision. In cohort 1 (n = 77), insulin rate was 0.1 unit/kg/h with no adjustments and dextrose was infused at 12.5 g/h after glucose reached 250 mg/dl. In cohort 2 (n = 78), insulin was reduced to 0.05 unit/kg/h concurrent with dextrose initiation at 12.5 g/h after glucose reached 200 mg/dl. The primary outcome was hypoglycaemia (glucose < 70 mg/dl) within 24 h of the first order for insulin. KEY FINDINGS: The 24-h incidence of hypoglycaemia was 19.2% in cohort 2 versus 32.5% in cohort 1; the adjusted odds ratio was 0.46 (95% confidence interval (CI) [0.21, 0.98]; P = 0.047). The 24-h use of dextrose 50% in water (D50W) was also reduced in cohort 2. No differences were seen in anion gap or bicarbonate normalization, rebound hyperglycaemia or ICU length of stay. In most patients who became hypoglycaemic, the preceding glucose value was below 100 mg/dl. CONCLUSIONS: The insulin rate-reduction protocol was associated with less hypoglycaemia and no obvious disadvantage. Robust intervention for low-normal glucose values could plausibly achieve low hypoglycaemia rates with either approach.


Subject(s)
Blood Glucose/analysis , Diabetic Ketoacidosis/drug therapy , Hypoglycemia/epidemiology , Hypoglycemic Agents/adverse effects , Insulin/adverse effects , Adult , Diabetic Ketoacidosis/blood , Dose-Response Relationship, Drug , Female , Humans , Hypoglycemia/blood , Hypoglycemia/chemically induced , Hypoglycemic Agents/administration & dosage , Incidence , Insulin/administration & dosage , Length of Stay , Male , Middle Aged , Retrospective Studies , Young Adult
20.
J Am Soc Hypertens ; 12(11): 809-817, 2018 11.
Article in English | MEDLINE | ID: mdl-30392848

ABSTRACT

Apparent treatment-resistant hypertension (aTRH) is associated with a higher prevalence of secondary hypertension and a greater risk for adverse pressure-related clinical outcomes, and it influences diagnostic and therapeutic decision-making. We previously showed that cross-sectional prevalence estimates of aTRH are lower than its true prevalence because patients with uncontrolled hypertension undergoing intensification/optimization of therapy will, over time, increasingly satisfy diagnostic criteria for aTRH. aTRH was assessed in an urban referral hypertension clinic using a 140/90 mm Hg goal blood pressure target in 745 patients with uncontrolled blood pressure, who were predominantly African-American (86%) and female (65%). Analyses were stratified according to existing prescription of a diuretic at the initial visit. Risk for aTRH was estimated using logistic regression with patient characteristics at the index visit as predictors. Among those prescribed diuretics, 84/363 developed aTRH; the risk score discriminated well (area under the receiver operating curve = 0.77, bootstrapped 95% CI [0.71, 0.81]). In patients not prescribed a diuretic, 44/382 developed aTRH, and the risk score showed significantly better discriminative ability (area under the receiver operating curve = 0.82 [0.76, 0.87]; P < .001). In the diuretic and nondiuretic cohorts, 145/363 and 290/382 patients, respectively, had estimated risks for development of aTRH below 15%. Of these low-risk patients, 139/145 and 278/290 did not develop aTRH (negative predictive value: diuretics, 0.94 [0.91, 0.98]; no diuretics, 0.95 [0.93, 0.97]). We created a novel clinical score that discriminates well between those who will and will not develop aTRH, especially among those without existing diuretic prescriptions. Irrespective of baseline diuretic treatment status, a low-risk score had a very high negative predictive value.
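Two of the metrics reported above, a bootstrapped 95% CI for the risk score's AUC and the negative predictive value at a 15% predicted-risk cut-off, can be computed as in the Python sketch below; the arrays atrh (observed outcome) and risk (predicted risk) are hypothetical stand-ins, not the study's data.

```python
# Bootstrapped AUC confidence interval and NPV below a 15% risk threshold
# for a binary risk score. Input arrays are hypothetical numpy arrays.
import numpy as np
from sklearn.metrics import roc_auc_score

def bootstrap_auc_ci(y, risk, n_boot=2000, seed=0):
    rng = np.random.default_rng(seed)
    aucs = []
    for _ in range(n_boot):
        idx = rng.integers(0, len(y), len(y))
        if y[idx].min() == y[idx].max():        # skip resamples with one class
            continue
        aucs.append(roc_auc_score(y[idx], risk[idx]))
    return np.percentile(aucs, [2.5, 97.5])

def npv_below_threshold(y, risk, threshold=0.15):
    low_risk = risk < threshold
    return np.mean(y[low_risk] == 0)            # low-risk patients who stay aTRH-free

# usage with hypothetical data:
# lo, hi = bootstrap_auc_ci(atrh, risk)
# print(npv_below_threshold(atrh, risk))
```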
