1.
Sci Total Environ ; 943: 173833, 2024 Sep 15.
Article in English | MEDLINE | ID: mdl-38866159

ABSTRACT

BACKGROUND: Cohort studies linking greenspace exposure to a lower risk of obesity-related cancer (ORC) are scarce. Existing evidence on site-specific cancers has predominantly relied on non-specific greenspace measures, such as vegetation indices. We examined the associations of total greenspace, private residential gardens, and other greenspace types with the risk of being diagnosed with overall and site-specific ORC. METHODS: We used data from participants in the UK Biobank recruited between 2006 and 2010 and censored until December 31, 2016. We defined greenspace variables using Ordnance Survey MasterMap™ greenspace categories. The incidence of ORC was ascertained through data linkage to cancer registries. Hazard ratios (HRs) and 95% confidence intervals (CIs) were estimated using Cox proportional hazards models adjusted for covariates. We conducted mediation and modification analyses by physical activity, serum 25-hydroxyvitamin D [25(OH)D], particulate matter air pollution with an aerodynamic diameter ≤ 2.5 µm (PM2.5), and nitrogen dioxide (NO2), as well as subgroup analyses by covariates. RESULTS: Among 279,326 participants, 9550 developed ORC over a median follow-up of 7.82 years. An increase in private residential gardens within a 100 m buffer was associated with a decreased risk of overall ORC (HR: 0.92; 95% CI: 0.88, 0.96), breast cancer (HR: 0.91; 95% CI: 0.84, 0.98), and uterine cancer (HR: 0.80; 95% CI: 0.67, 0.96). There was no association between other greenspace types and ORC, except for uterine cancer. The association for ORC was partly mediated by NO2 and modified by physical activity levels, 25(OH)D, PM2.5, NO2, and sociodemographic factors, including sex and neighbourhood socioeconomic status. CONCLUSION: Increased exposure to private residential gardens may lower the risk of being diagnosed with obesity-related cancer, particularly breast and uterine cancer. Future studies might move beyond greenspace quantity to explore which functional types of greenspace exposure should be prioritized for targeted health interventions and cancer prevention.
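A minimal sketch of the kind of covariate-adjusted Cox model the abstract describes, run on synthetic data; column names such as garden_pct_100m are hypothetical placeholders, not UK Biobank fields:

```python
# Hedged sketch: Cox proportional hazards with HRs and 95% CIs (lifelines).
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "followup_years":  rng.uniform(0.5, 10.0, n),   # time to event or censoring
    "orc_event":       rng.binomial(1, 0.05, n),    # obesity-related cancer flag
    "garden_pct_100m": rng.uniform(0, 60, n),       # % garden within 100 m buffer
    "age":             rng.uniform(40, 70, n),
    "male":            rng.binomial(1, 0.5, n),
})

cph = CoxPHFitter()
cph.fit(df, duration_col="followup_years", event_col="orc_event")
# Hazard ratios with 95% CIs, the quantities reported in the abstract
print(cph.summary[["exp(coef)", "exp(coef) lower 95%", "exp(coef) upper 95%"]])
```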


Subject(s)
Gardens , Neoplasms , Obesity , Humans , Neoplasms/epidemiology , United Kingdom/epidemiology , Female , Obesity/epidemiology , Male , Middle Aged , Cohort Studies , Aged , Environmental Exposure/statistics & numerical data , Biological Specimen Banks , Risk Factors , Adult , UK Biobank
3.
Am J Ind Med ; 67(6): 556-561, 2024 Jun.
Article in English | MEDLINE | ID: mdl-38698682

ABSTRACT

BACKGROUND: Occupational heat stress, exacerbated by factors such as climate change and insufficient cooling solutions, endangers the health and productivity of workers, especially in low-resource workplaces. OBJECTIVE: To evaluate the effectiveness of two cooling strategies in reducing physiological strain and preserving productivity of piece-rate workers over a 9-h work shift in a southern Thailand sawmill. METHODS: In a randomized crossover design, 12 medically screened sawmill workers (33 ± 7 y; 1.58 ± 0.05 m; 51 ± 9 kg; n = 5 females) completed three conditions, an established phase change material vest (VEST), an on-site combination cooling oasis (OASIS; hydration, cold towels, fans, water dousing), and no cooling (CON), in randomized order across 3 consecutive workdays. Physiological strain was measured via core temperature telemetry and heart rate monitoring. Productivity was determined by counting the number of pallets of wood sorted, stacked, and stowed each day. RESULTS: Relative to CON, OASIS lowered core temperature by 0.25°C [95% confidence interval = 0.24, 0.25] and heart rate by 7 [6, 9] bpm, compared with reductions of 0.17°C [0.17, 0.18] and 10 [9, 12] bpm with VEST. It was inconclusive whether productivity was statistically lower in OASIS than in CON (mean difference [MD] = 2.5 [-0.2, 5.2]), and productivity was not statistically different between VEST and CON (MD = 1.4 [-1.3, 4.1]). CONCLUSIONS: Both OASIS and VEST were effective in reducing physiological strain compared with no cooling. Their effect on productivity requires further investigation, as even small differences between interventions could lead to meaningful disparities in piece-rate worker earnings over time.
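A sketch of the paired mean-difference-with-CI comparison reported for productivity, assuming made-up pallet counts (n = 12 as in the study); the numbers are illustrative only:

```python
# Hedged sketch: paired mean difference and 95% CI for a crossover contrast.
import numpy as np
from scipy import stats

con   = np.array([21, 23, 22, 25, 21, 24, 23, 23, 21, 24, 22, 24])  # pallets/day
oasis = np.array([18, 21, 20, 23, 19, 22, 20, 21, 19, 22, 20, 21])

diff = con - oasis                        # paired differences (crossover design)
md = diff.mean()
se = diff.std(ddof=1) / np.sqrt(len(diff))
t = stats.t.ppf(0.975, df=len(diff) - 1)
print(f"MD = {md:.1f} [{md - t * se:.1f}, {md + t * se:.1f}]")
# A CI spanning 0, as for OASIS vs CON in the abstract, reads as inconclusive.
```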


Subject(s)
Cross-Over Studies , Heat Stress Disorders , Humans , Thailand , Female , Adult , Male , Heat Stress Disorders/prevention & control , Heart Rate/physiology , Occupational Diseases/prevention & control , Occupational Diseases/etiology , Protective Clothing , Efficiency , Hot Temperature/adverse effects , Occupational Exposure/prevention & control , Occupational Exposure/adverse effects , Young Adult
4.
J Vasc Surg ; 2024 May 07.
Article in English | MEDLINE | ID: mdl-38723913

ABSTRACT

OBJECTIVE: The Society for Vascular Surgery (SVS) Wound, Ischemia, and foot Infection (WIfI) classification system aims to risk stratify patients with chronic limb-threatening ischemia (CLTI), predicting both amputation rates and the need for revascularization. However, real-world use of the system, and whether it accurately predicts outcomes after open revascularization and peripheral interventions, is unclear. We therefore sought to determine the adoption of the WIfI classification system within a contemporary statewide collaborative, as well as the impact of patient factors and WIfI risk assessment on short- and long-term outcomes. METHODS: Using data from a large statewide collaborative, we identified patients with CLTI undergoing open surgical revascularization or peripheral vascular intervention (PVI) between 2016 and 2022. The primary exposure was preoperative clinical WIfI stage. Patients were categorized according to the SVS Lower Extremity Threatened Limb Classification System into clinical WIfI stages 1, 2, 3, or 4. The primary outcomes were 30-day and 1-year amputation and mortality rates. Multivariable logistic regression was performed to estimate the association of WIfI stage with post-revascularization outcomes. RESULTS: In the cohort of 17,417 patients, 83.4% (n = 14,529) had a WIfI stage documented. PVIs were performed in 57.6% of patients, and 42.4% underwent open surgical revascularization. Of the patients, 49.5% were classified as stage 1, 19.3% as stage 2, 12.8% as stage 3, and 18.3% as stage 4. Stage 3 and 4 patients had higher rates of diabetes, congestive heart failure, and renal failure, and were less likely to be current or former smokers. One-half of stage 3 patients underwent open surgical revascularization, whereas stage 1 patients were most likely to have received a PVI (64%). As WIfI stage increased from 1 to 4, 1-year mortality increased from 12% to 21% (P < .001), 30-day amputation rates increased from 5% to 38% (P < .001), and 1-year amputation rates increased from 15% to 55% (P < .001). Finally, patients without a documented WIfI stage had significantly higher 30-day and 1-year mortality rates, as well as higher 30-day and 1-year amputation rates. CONCLUSIONS: The SVS WIfI clinical stage is significantly associated with 1-year amputation rates in patients with CLTI after lower extremity revascularization. Because nearly 55% of stage 4 patients require a major amputation within 1 year of intervention, this finding supports use of the WIfI classification system in clinical decision-making for patients with CLTI.
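A minimal sketch of a multivariable logistic model of 1-year amputation on WIfI stage, on synthetic data; the covariate names are placeholders, not the registry schema:

```python
# Hedged sketch: odds of amputation by WIfI stage, stage 1 as reference.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 2000
df = pd.DataFrame({
    "amputation_1yr": rng.binomial(1, 0.2, n),
    "wifi_stage":     rng.integers(1, 5, n),   # clinical WIfI stage 1-4
    "age":            rng.normal(68, 10, n),
    "diabetes":       rng.binomial(1, 0.5, n),
})

m = smf.logit("amputation_1yr ~ C(wifi_stage) + age + diabetes", data=df).fit(disp=0)
print(np.exp(m.params))       # odds ratios
print(np.exp(m.conf_int()))   # 95% CIs
```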

6.
Article in English | MEDLINE | ID: mdl-38711670

ABSTRACT

Obtaining a career development award from the National Institutes of Health (K award) is often an important step in establishing a career as a vascular surgeon-scientist. The application and review process is competitive, involves many steps, and may be confusing to the prospective applicant. Further, there are requirements involving mentors and the applicant's institution. This article, authored entirely by vascular surgeons with active K awards, is intended for potential applicants and personnel at their institutions, and it reviews relevant information, including strategies for a successful application.

7.
Int J Biometeorol ; 2024 May 06.
Article in English | MEDLINE | ID: mdl-38709342

ABSTRACT

Extreme heat alerts are the most common form of weather forecasting service used in Australia, yet few studies have documented their effectiveness in improving health outcomes. This study aimed to examine temporal changes in temperature-related mortality in relation to the activation of the heat-health alert and response system (HARS) in the State of Victoria, Australia. We examined the relationship between temperature and mortality using quasi-Poisson regression and the distributed lag non-linear model (DLNM) and compared the temperature-mortality association between two periods: period 1 (pre-HARS, 1992-2009) and period 2 (post-HARS, 2010-2019). Since the HARS heavily weights heatwave effects, we also compared the main effects of heatwave events between the two periods. Heatwaves were defined at three intensity levels: 3 consecutive days above the 97th, 98th, or 99th percentile. We controlled for the potential confounding effect of seasonality by including a natural cubic B-spline of the day of the year with equally spaced knots and 8 degrees of freedom per year. The exposure-response curves show that temperature-related mortality was lower in period 2 than in period 1. The relative risk ratios (RRR) of period 2 over period 1 were all less than one, gradually decreasing from 0.86 (95% CI, 0.72-1.03) to 0.64 (95% CI, 0.33-1.22), while the differences in attributable risk percent increased from 13.2% to 25.3%. The risk of heatwave-related death decreased by 3.4% (RRp1 1.068, 95% CI, 1.024-1.112 versus RRp2 1.034, 95% CI, 0.986-1.082) and by 10% (RRp1 1.16, 95% CI, 1.10-1.22 versus RRp2 1.06, 95% CI, 1.002-1.119) across all groups. The study indicated a decrease in heat-related mortality following the operation of the HARS in Victoria under extreme heat and high-intensity heatwave conditions. Further studies could investigate the extent of changes in mortality among populations of differing socioeconomic groups during the operation of the heat-health alert system.
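A sketch of the period-2-over-period-1 relative risk ratio (RRR) contrast, combining the two RRs on the log scale; this is a standard approach, shown here with the heatwave RRs quoted in the abstract:

```python
# Hedged sketch: RRR with 95% CI from two RRs and their CIs.
import numpy as np

def rrr(rr1, lo1, hi1, rr2, lo2, hi2, z=1.96):
    se1 = (np.log(hi1) - np.log(lo1)) / (2 * z)   # SE of log RR, period 1
    se2 = (np.log(hi2) - np.log(lo2)) / (2 * z)   # SE of log RR, period 2
    log_rrr = np.log(rr2) - np.log(rr1)
    se = np.hypot(se1, se2)
    return np.exp([log_rrr, log_rrr - z * se, log_rrr + z * se])

est, lo, hi = rrr(1.16, 1.10, 1.22, 1.06, 1.002, 1.119)
print(f"RRR = {est:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```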

8.
Osteoporos Int ; 2024 May 28.
Article in English | MEDLINE | ID: mdl-38806788

ABSTRACT

The effect of deprivation on total bone health status has not been well defined. We examined the relationship between socioeconomic deprivation and poor bone health and falls, and found a significant association. The finding could inform current public health strategies to minimise disparities in bone health. PURPOSE: Socioeconomic deprivation is associated with many illnesses, including increased fracture incidence in older people. However, the effect of deprivation on total bone health status has not been well defined. To examine the relationship between socioeconomic deprivation and poor bone health and falls, we conducted a cross-sectional study using baseline measures from the United Kingdom (UK) Biobank cohort comprising 502,682 participants aged 40-69 years at recruitment during 2006-2010. METHODS: We examined four outcomes: 1) low bone mineral density/osteopenia, 2) fall in the last year, 3) fracture in the last five years, and 4) fracture from a simple fall in the last five years. To measure socioeconomic deprivation, we used the Townsend index of the participant's residential postcode. RESULTS: At baseline, 29% of participants had low bone density (heel T-score < -1 standard deviation), 20% reported a fall in the previous year, and 10% reported a fracture in the previous five years. Among participants experiencing a fracture, 60% reported the cause as a simple fall. In the multivariable logistic regression model, after controlling for other covariates, the odds of a fall, fracture in the last five years, fracture from a simple fall, and osteopenia were respectively 1.46 times (95% confidence interval [CI] 1.42-1.49), 1.26 times (95% CI 1.22-1.30), 1.31 times (95% CI 1.26-1.36), and 1.16 times (95% CI 1.13-1.19) higher for the most deprived compared with the least deprived quantile. CONCLUSION: Socioeconomic deprivation was significantly associated with poor bone health and falls. This research could help minimise social disparities in bone health.
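A sketch of an unadjusted odds ratio for falls in the most- versus least-deprived group from a 2x2 table; the counts are invented for illustration, not the UK Biobank data:

```python
# Hedged sketch: 2x2-table odds ratio with a Woolf (log-scale) 95% CI.
import numpy as np
from scipy.stats import norm

a, b = 3200, 9800    # most deprived: fell / did not fall
c, d = 2200, 10800   # least deprived: fell / did not fall

odds_ratio = (a * d) / (b * c)
se_log_or = np.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of log(OR)
z = norm.ppf(0.975)
lo, hi = np.exp(np.log(odds_ratio) + np.array([-z, z]) * se_log_or)
print(f"OR = {odds_ratio:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```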

9.
J Vasc Surg ; 2024 Apr 30.
Article in English | MEDLINE | ID: mdl-38697233

ABSTRACT

OBJECTIVE: Cumulative, probability-based metrics are regularly used to measure quality in professional sports, but these methods have not been applied to health care delivery. These techniques have the potential to be particularly useful in describing surgical quality, where case volume is variable and outcomes tend to be dominated by statistical "noise." The established statistical technique used to adjust for differences in case volume is reliability adjustment, which emphasizes statistical "signal" but has several limitations. We sought to validate a novel measure of surgical quality based on earned outcomes methods (deaths above average [DAA]) against reliability-adjusted mortality rates, using abdominal aortic aneurysm (AAA) repair outcomes to illustrate the measure's performance. METHODS: Earned outcomes methods were used to calculate the outcome of interest for each patient: DAA. Hospital-level DAA was calculated for non-ruptured open AAA repair and endovascular aortic repair (EVAR) in the Vascular Quality Initiative database from 2016 to 2019. DAA for each center is the sum of observed minus predicted risk of death for each patient; predicted risk of death was calculated using established multivariable logistic regression modeling. Correlations of DAA with reliability-adjusted mortality rates and procedure volume were determined. Because an accurate quality metric should correlate with future results, outcomes from 2016 to 2017 were used to categorize hospital quality based on: (1) risk-adjusted mortality; (2) risk- and reliability-adjusted mortality; and (3) DAA. The best performing quality metric was determined by comparing the ability of these categories to predict 2018 to 2019 risk-adjusted outcomes. RESULTS: During the study period, 3734 patients underwent open repair (106 hospitals), and 20,680 patients underwent EVAR (183 hospitals). DAA was closely correlated with reliability-adjusted mortality rates for open repair (r = 0.94; P < .001) and EVAR (r = 0.99; P < .001). DAA also correlated with hospital case volume for open repair (r = -0.54; P < .001), but not EVAR (r = 0.07; P = .3). In 2016 to 2017, most hospitals had 0% mortality (55% open repair, 57% EVAR), making it impossible to evaluate these hospitals using traditional risk-adjusted mortality rates alone. Further, zero-mortality hospitals in 2016 to 2017 did not demonstrate improved outcomes in 2018 to 2019 for open repair (3.8% vs 4.6%; P = .5) or EVAR (0.8% vs 1.0%; P = .2) compared with all other hospitals. In contrast to traditional risk adjustment, 2016 to 2017 DAA evenly divided centers into quality quartiles that predicted 2018 to 2019 performance, with mortality rising with each decrement in quality quartile (Q1, 3.2%; Q2, 4.0%; Q3, 5.1%; Q4, 6.0%). There was a significantly higher risk of mortality at worst-quartile open repair hospitals compared with best-quartile hospitals (odds ratio, 2.01; 95% confidence interval, 1.07-3.76; P = .03). Using 2016 to 2019 DAA to define quality, highest-quality-quartile open repair hospitals had lower median DAA than lowest-quality-quartile hospitals (-1.18 vs +1.32 DAA; P < .001), corresponding to lower median reliability-adjusted mortality rates (3.6% vs 5.1%; P < .001). CONCLUSIONS: Adjustment for differences in hospital volume is essential when measuring hospital-level outcomes. Earned outcomes accurately categorize hospital quality and correlate with reliability-adjusted rates but are easier to calculate and interpret. From 2016 to 2019, the highest quality open AAA repair hospitals prevented >40 perioperative deaths compared with the average hospital, and >80 perioperative deaths compared with the lowest quality hospitals.
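A minimal sketch of the DAA computation as the abstract defines it: per hospital, the sum of observed deaths minus model-predicted risk. The data are synthetic; in the study the predicted risks come from a multivariable logistic model:

```python
# Hedged sketch: hospital-level deaths above average (DAA).
import numpy as np
import pandas as pd

rng = np.random.default_rng(2)
n = 5000
df = pd.DataFrame({
    "hospital":       rng.integers(0, 20, n),
    "died":           rng.binomial(1, 0.04, n),
    "predicted_risk": rng.uniform(0.01, 0.10, n),   # stand-in for model output
})

daa = (df.assign(excess=df["died"] - df["predicted_risk"])
         .groupby("hospital")["excess"].sum()
         .sort_values())
print(daa)   # negative DAA = fewer deaths than predicted (better than average)
```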

10.
Biochem Pharmacol ; : 116241, 2024 May 01.
Article in English | MEDLINE | ID: mdl-38697309

ABSTRACT

Fatty acid omega hydroxylase P450s comprise enzymes that hydroxylate saturated and unsaturated fatty acids (FAs) of various chain lengths as well as bioactive eicosanoid lipids. The human cytochrome P450 family 4 (CYP4) consists of 12 members that are associated with several human diseases. However, their role in the progression of metabolic dysfunction-associated steatotic liver disease (MASLD) remains largely unknown. It has long been thought that the induction of CYP4 family P450s during fasting and starvation prevents FA-related lipotoxicity through FA metabolism to dicarboxylic acids that are chain-shortened in peroxisomes and then transported to the mitochondria for complete oxidation. Several studies have revealed that peroxisomal succinate transported to the mitochondria is used for gluconeogenesis during fasting and starvation, and recent evidence suggests that peroxisomal acetate can be utilized for lipogenesis and lipid droplet formation as well as epigenetic modification of gene transcription. In addition, omega hydroxylation of the bioactive eicosanoid arachidonic acid to 20-hydroxyeicosatetraenoic acid (20-HETE) is essential for activating the GPR75 receptor, leading to vasoconstriction and cell proliferation. Several mouse models of diet-induced MASLD have revealed the induction of selective CYP4A members and the suppression of CYP4F members during steatosis and steatohepatitis, suggesting a critical metabolic role in the progression of fatty liver disease. Thus, to further investigate the functional roles of CYP4 genes, we analyzed the differential expression of the 12 CYP4 gene family members in Gene Expression Omnibus (GEO) datasets from patients with steatosis, steatohepatitis, fibrosis, cirrhosis, and hepatocellular carcinoma. We observed differential expression of various CYP4 genes across the progression of MASLD, indicating that different CYP4 members may have unique functional roles in the metabolism of specific FAs and eicosanoids at various stages of fatty liver disease. These results suggest that targeting selective members of the CYP4A family may be a viable therapeutic approach for treating and managing MASLD.
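A toy sketch of a per-gene differential expression test between two GEO groups via log2 fold change and Welch's t-test; the values are invented, and a real pipeline would normalize arrays and control the false discovery rate:

```python
# Hedged sketch: one-gene differential expression between two patient groups.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
steatosis = rng.normal(8.0, 0.5, 25)   # log2 expression of one CYP4 gene
control   = rng.normal(7.5, 0.5, 20)

log2fc = steatosis.mean() - control.mean()
t, p = stats.ttest_ind(steatosis, control, equal_var=False)
print(f"log2FC = {log2fc:.2f}, Welch t = {t:.2f}, p = {p:.3g}")
```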

11.
Clin Toxicol (Phila) ; 62(3): 152-163, 2024 Mar.
Article in English | MEDLINE | ID: mdl-38683031

ABSTRACT

INTRODUCTION: Patients with sedative overdose may have residual cognitive impairment at the time they are deemed medically cleared for discharge. Impairment could affect the performance of high-risk activities, including driving. The Trail Making Test is an alpha-numeric assessment that can be performed at the bedside to assess cognitive function. We examined whether there were differences in cognitive function at medical clearance between patients who overdosed on sedative and non-sedative drugs. METHODS: A prospective, observational study assessed cognitive function using the Trail Making Test between 2018 and 2021. Patients (aged 16 years and over) completed testing upon medical clearance if they spoke English and had no previous neurological injury. Continuous covariates were compared using t-tests or Mann-Whitney U tests and multiple linear regression; binary variables were modelled using logistic regression. RESULTS: Of 171 patients enrolled, 111 (65 per cent) had sedative overdose; they were older (median 32.1 versus 22.2 years) and more likely to be male (58.6 per cent versus 36.7 per cent). Benzodiazepines and paracetamol were the most common drug overdoses. Patients with sedative overdose performed worse on Trail Making Test part A (37.0 versus 33.1 seconds, P = 0.017) and Trail Making Test part B (112.4 versus 81.5 seconds, P = 0.004). Multiple linear regression analysis indicated that patient age (P < 0.001, 1.7 seconds slower per year, 95 per cent confidence interval: 0.9-2.6 seconds) and perception of recovery (P = 0.006, 36.4 seconds slower if perceived not recovered, 95 per cent confidence interval: 10.8-62.0 seconds) were also associated with Trail Making Test part B times. Patients with sedative overdose were more likely to be admitted to the intensive care unit (odds ratio: 4.9, 95 per cent confidence interval: 1.1-22.0; P = 0.04). DISCUSSION: Our results are broadly in keeping with previously published work but include a wider range of drug overdose scenarios (polypharmacy and recreational drugs). While patients demonstrated some perception of their cognitive impairment, our model could not reliably be used to provide individual discharge advice. The study design did not allow us to prove causation of cognitive impairment, or to relate the magnitude of an overdose to Trail Making Test times. CONCLUSIONS: Trail Making Test results suggested that patients who had sedative drug overdoses may have significant cognitive deficits even when medically cleared. Risk of harm may be minimised with advice to avoid high-risk activities such as driving. More profound impacts on Trail Making Test part B than part A may indicate that higher-order thinking is more affected than simple cognitive function.
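A sketch of the nonparametric group comparison described, a Mann-Whitney U test on Trail Making Test part B times; the times are synthetic, with group sizes matching the study (111 sedative, 60 non-sedative):

```python
# Hedged sketch: Mann-Whitney U comparison of TMT-B completion times.
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(4)
tmt_b_sedative = rng.lognormal(mean=np.log(110), sigma=0.4, size=111)  # seconds
tmt_b_other    = rng.lognormal(mean=np.log(80),  sigma=0.4, size=60)

u, p = mannwhitneyu(tmt_b_sedative, tmt_b_other, alternative="two-sided")
print(f"U = {u:.0f}, p = {p:.4f}")
```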


Subject(s)
Cognitive Dysfunction , Drug Overdose , Hypnotics and Sedatives , Humans , Male , Hypnotics and Sedatives/poisoning , Female , Cognitive Dysfunction/chemically induced , Prospective Studies , Adult , Young Adult , Middle Aged , Adolescent , Trail Making Test , Cognition/drug effects , Benzodiazepines/poisoning
12.
J Vasc Surg Venous Lymphat Disord ; 12(4): 101884, 2024 Jul.
Article in English | MEDLINE | ID: mdl-38552954

ABSTRACT

BACKGROUND: Insurance companies have adopted variable and inconsistent approval criteria for chronic venous disease (CVD) treatment. Although vein ablation (VA) is accepted as the standard of care for venous ulcers, the treatment criteria for patients with milder forms of CVD remain controversial. This study aims to identify factors associated with a lack of clinical improvement (LCI) in patients with less severe CVD without ulceration undergoing VA, to improve patient selection for treatment. METHODS: We performed a retrospective analysis of patients undergoing VA for CEAP C2 to C4 disease in the Vascular Quality Initiative varicose veins database from 2014 to 2023. Patients who required intervention in multiple veins, had undergone prior interventions, or presented with CEAP C5 to C6 disease were excluded. The difference (Δ) in venous clinical severity score (VCSS; VCSS before minus after the procedure) was used to categorize the patients. Patients with a ΔVCSS of ≤0 were defined as having LCI after VA, and patients with a ≥1-point decrease in the VCSS after VA (ΔVCSS ≥1) as having some benefit from the procedure and, therefore, "clinical improvement." The characteristics of both groups were compared, and multivariable regression analysis was performed to identify factors independently associated with LCI. A second analysis was performed based on the VVSymQ instrument, which measures patient-reported outcomes using five specific symptoms (ie, heaviness, achiness, swelling, throbbing pain, and itching). Patients with LCI showed no improvement in any of the five symptoms, and those with clinical improvement had a decrease in severity of at least one symptom. RESULTS: A total of 3544 patients underwent initial treatment of CVD with a single VA. Of these, 2607 had VCSSs available before and after VA, and 420 (16.1%) had LCI based on the ΔVCSS. Patients with LCI were significantly older and more likely to be African American and to have CEAP C2 disease compared with patients with clinical improvement. Patients with clinical improvement were more likely to have reported using compression stockings before treatment. Vein diameters did not differ between the two groups. The incidence of complications was low overall, with minor differences between the two groups. However, patients with LCI were significantly more likely to have symptoms after intervention than those with improvement, and more likely to have technical failure, defined as vein recanalization. On multivariable regression, age (odds ratio [OR], 1.01; 95% confidence interval [CI], 1.00-1.02) and obesity (OR, 1.47; 95% CI, 1.09-2.00) were independently associated with LCI, as was treatment of less severe disease (CEAP C2; OR, 1.82; 95% CI, 1.30-2.56) compared with more advanced disease (C4). Lack of compression therapy before intervention was also associated with LCI (OR, 6.05; 95% CI, 4.30-8.56). The analysis based on the VVSymQ showed similar results. CONCLUSIONS: LCI after VA is associated with treatment of patients with a lower CEAP class (C2 vs C4) and with a lack of compression therapy before intervention. Importantly, no significant association between vein size and clinical improvement was observed.
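A minimal sketch of the outcome definition used above: LCI when the pre-minus-post VCSS difference is ≤0. The column names are placeholders, not the VQI schema:

```python
# Hedged sketch: deriving the LCI flag from pre/post VCSS.
import pandas as pd

df = pd.DataFrame({"vcss_pre": [8, 6, 7, 5], "vcss_post": [4, 6, 8, 3]})
df["delta_vcss"] = df["vcss_pre"] - df["vcss_post"]
df["lci"] = (df["delta_vcss"] <= 0).astype(int)  # 1 = no >=1-point improvement
print(df)
```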


Subject(s)
Ablation Techniques , Humans , Male , Female , Retrospective Studies , Middle Aged , Aged , Treatment Outcome , Risk Factors , Ablation Techniques/adverse effects , Varicose Veins/surgery , Varicose Veins/diagnostic imaging , Varicose Veins/physiopathology , Databases, Factual , Severity of Illness Index , Chronic Disease , Adult , Patient Selection , Time Factors , Risk Assessment
14.
Nephrology (Carlton) ; 29(4): 177-187, 2024 Apr.
Article in English | MEDLINE | ID: mdl-38122827

ABSTRACT

During the last two decades, an epidemic of a severe form of chronic kidney disease (CKD) unrelated to traditional risk factors (diabetes and hypertension) has been recognized in low- to middle-income countries. CKD of unknown aetiology (CKDu) mainly affects young working-age adults and has become an important and devastating public health issue. CKDu is a multifactorial disease with associated genetic and environmental risk factors. This review summarizes the current epidemiological evidence on the burden of CKDu and the probable environmental risk factors contributing to CKD in Africa. PubMed/Medline and the African Journals Online databases were searched to identify relevant population-based studies published in the last two decades. In the general population, the burden of CKD attributable to CKDu varied from 19.4% to 79%. Epidemiologic studies have established that genetic and environmental factors, including infectious agents, rural residence, low socioeconomic status, malnutrition, agricultural practices and exposure to agrochemicals, heavy metals, use of traditional herbs, and contaminated water sources or food, contribute to the burden of CKD in the region. There is a great need for epidemiological studies exploring the true burden of CKDu, its unique geographical distribution, and the role of environmental factors in the development of CKD/CKDu.


Subject(s)
Metals, Heavy , Renal Insufficiency, Chronic , Adult , Humans , Chronic Kidney Diseases of Uncertain Etiology , Renal Insufficiency, Chronic/diagnosis , Renal Insufficiency, Chronic/epidemiology , Renal Insufficiency, Chronic/etiology , Risk Factors , Metals, Heavy/analysis , Africa/epidemiology , Sri Lanka/epidemiology
15.
J Vasc Surg ; 78(4): 1012-1020.e2, 2023 10.
Article in English | MEDLINE | ID: mdl-37318428

ABSTRACT

OBJECTIVE: Anticipated perioperative morbidity is an important factor when choosing a revascularization method for chronic limb-threatening ischemia (CLTI). Our goal was to assess systemic perioperative complications of patients treated with surgical and endovascular revascularization in the Best Endovascular vs Best Surgical Therapy in Patients with CLTI (BEST-CLI) trial. METHODS: BEST-CLI was a prospective randomized trial comparing open (OPEN) and endovascular (ENDO) revascularization strategies for patients with CLTI. Two parallel cohorts were studied: Cohort 1 included patients with adequate single-segment great saphenous vein (SSGSV), whereas Cohort 2 included those without SSGSV. Data were queried for major adverse cardiovascular events (MACE; a composite of myocardial infarction, stroke, and death) and for non-serious (non-SAEs) and serious adverse events (SAEs; criteria: death, a life-threatening event, hospitalization or prolongation of hospitalization, significant disability or incapacitation, or an event affecting subject safety in the trial) 30 days after the procedure. Per-protocol analysis was used (intervention received without crossover), and risk-adjusted analysis was performed. RESULTS: There were 1367 patients (662 OPEN, 705 ENDO) in Cohort 1 and 379 patients (188 OPEN, 191 ENDO) in Cohort 2. Thirty-day mortality was 1.5% in Cohort 1 (OPEN 1.8%; ENDO 1.3%) and 1.3% in Cohort 2 (OPEN 2.7%; ENDO 0%). MACE in Cohort 1 was 4.7% for OPEN vs 3.13% for ENDO (P = .14), and in Cohort 2 was 4.28% for OPEN and 1.05% for ENDO (P = .15). On risk-adjusted analysis, there was no difference in 30-day MACE between OPEN and ENDO for Cohort 1 (hazard ratio [HR], 1.5; 95% confidence interval [CI], 0.85-2.64; P = .16) or Cohort 2 (HR, 2.17; 95% CI, 0.48-9.88; P = .31). The incidence of acute renal failure was similar across interventions: in Cohort 1, 3.6% for OPEN vs 2.1% for ENDO (HR, 1.6; 95% CI, 0.85-3.12; P = .14), and in Cohort 2, 4.2% for OPEN vs 1.6% for ENDO (HR, 2.86; 95% CI, 0.75-10.8; P = .12). The occurrence of venous thromboembolism was low overall and similar between groups in Cohort 1 (OPEN 0.9%; ENDO 0.4%) and Cohort 2 (OPEN 0.5%; ENDO 0%). Rates of any non-SAEs in Cohort 1 were 23.4% for OPEN and 17.9% for ENDO (P = .013); in Cohort 2, they were 21.8% for OPEN and 19.9% for ENDO (P = .7). Rates of any SAEs in Cohort 1 were 35.3% for OPEN and 31.6% for ENDO (P = .15); in Cohort 2, they were 25.5% for OPEN and 23.6% for ENDO (P = .72). The most common types of non-SAEs and SAEs were infection, procedural complications, and cardiovascular events. CONCLUSIONS: In BEST-CLI, patients with CLTI who were deemed suitable candidates for open lower extremity bypass surgery had similar peri-procedural complications after either OPEN or ENDO revascularization. In such patients, concern about the risk of peri-procedural complications should not be a deterrent when deciding on a revascularization strategy; other factors, including effectiveness in restoring perfusion and patient preference, are more relevant.


Subject(s)
Endovascular Procedures , Peripheral Arterial Disease , Humans , Chronic Limb-Threatening Ischemia , Prospective Studies , Risk Factors , Peripheral Arterial Disease/diagnostic imaging , Peripheral Arterial Disease/surgery , Limb Salvage , Ischemia/diagnostic imaging , Ischemia/etiology , Ischemia/surgery , Lower Extremity/blood supply , Treatment Outcome , Retrospective Studies
16.
Sci Total Environ ; 887: 164046, 2023 Aug 20.
Article in English | MEDLINE | ID: mdl-37187389

ABSTRACT

The prevalence of antimicrobial resistance genes (ARGs) in aquaculture has raised serious public concern for food safety and human health, but its relationship to the use of antimicrobials in aquaculture ponds, and even to their residues in the wider aquatic environment, remains unclear. In this study, a broad panel of 323 target ARGs and 40 mobile genetic elements (MGEs) was analyzed in sediment using a smart-chip-based high-throughput quantitative PCR approach (HT-qPCR) in 20 randomly selected ponds of a tilapia farming base in southern China, whose antimicrobial residues were reported previously. In total, 159 ARGs and 29 MGEs were quantified in 58 surface sediment samples across the ponds. Absolute abundance of ARGs ranged from 0.2 to 13.5 × 10⁶ copies g⁻¹, dominated by the multidrug and sulfonamide categories. Quantified ARG abundances correlated significantly with residues of individual antimicrobial compounds rather than with antimicrobial categories, mainly compounds among the fluoroquinolones and sulfonamides and trimethoprim (TMP). Antimicrobial residues alone explained 30.6% of the variation in ARGs quantified in sediment across the ponds, indicating a clear link between antimicrobials and the proliferation of ARGs in aquaculture. Co-proliferation of ARGs with unrelated antimicrobial compounds quantified in sediment was also observed, especially for aminoglycoside ARGs, which were highly associated with integrons (intI 1), consistent with their being carried on intI 1 gene cassette arrays. Physicochemical properties of the sediment (pH, electric conductivity, and total sulfur content) contributed substantially to the variation in quantified ARG abundance (21%) across all sediment samples, comparable to the MGEs (20%), suggesting co-selection for ARG proliferation in the aquaculture environment. This study provides insights into the interactions between residual antimicrobials and ARGs, which should improve understanding of the use and management of antimicrobials in aquaculture worldwide and help strategize mitigation of antimicrobial resistance in aquaculture.
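A sketch of a rank correlation between sediment antimicrobial residues and ARG absolute abundance across ponds; the vectors are toy values, one per pond, not the study's measurements:

```python
# Hedged sketch: Spearman correlation of residues vs ARG abundance.
import numpy as np
from scipy.stats import spearmanr

residue_ng_g = np.array([5, 12, 30, 8, 50, 22, 15, 40, 3, 25])  # residue per pond
arg_copies_g = np.array([0.4, 1.1, 5.2, 0.9, 11.0, 3.5, 2.0, 8.8, 0.2, 4.1]) * 1e6

rho, p = spearmanr(residue_ng_g, arg_copies_g)
print(f"Spearman rho = {rho:.2f}, p = {p:.3g}")
```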


Subject(s)
Anti-Bacterial Agents , Tilapia , Animals , Humans , Anti-Bacterial Agents/pharmacology , Genes, Bacterial , Drug Resistance, Bacterial , Ponds , Agriculture , Cell Proliferation
17.
Ann Surg ; 278(5): e1128-e1134, 2023 11 01.
Article in English | MEDLINE | ID: mdl-37051921

ABSTRACT

OBJECTIVE: To evaluate the potential pathway through which race and socioeconomic status, as measured by the social deprivation index (SDI), affect outcomes after lower extremity bypass via presentation with chronic limb-threatening ischemia (CLTI), a marker of delayed presentation. BACKGROUND: Racial and socioeconomic disparities persist in outcomes after lower extremity bypass; however, limited studies have evaluated the role of disease severity as a mediator that could explain these outcomes using clinical registry data. METHODS: We captured patients who underwent lower extremity bypass using a statewide quality registry from 2015 to 2021. We used mediation analysis to assess the direct effects of race and high SDI (fifth quintile) on our outcome measures: 30-day major adverse cardiac events, defined by new myocardial infarction, transient ischemic attack/stroke, or death, and 30-day and 1-year surgical site infection (SSI), amputation, and bypass graft occlusion. RESULTS: A total of 7077 patients underwent a lower extremity bypass procedure. Black patients had a higher prevalence of CLTI (80.63% vs 66.37%, P < 0.001). In mediation analysis, there were significant indirect effects whereby Black patients were more likely to present with CLTI and thus had increased odds of 30-day amputation (odds ratio [OR]: 1.11, 95% CI: 1.068-1.153), 1-year amputation (OR: 1.083, 95% CI: 1.045-1.123), and SSI (OR: 1.052, 95% CI: 1.016-1.089). There were likewise significant indirect effects whereby patients in the fifth SDI quintile were more likely to present with CLTI and thus had increased odds of 30-day amputation (OR: 1.065, 95% CI: 1.034-1.098) and SSI (OR: 1.026, 95% CI: 1.006-1.046), and of 1-year amputation (OR: 1.068, 95% CI: 1.036-1.101) and SSI (OR: 1.026, 95% CI: 1.006-1.046). CONCLUSIONS: Black patients and socioeconomically disadvantaged patients tended to present with more advanced disease (CLTI), which in mediation analysis was associated with increased odds of amputation and other complications after lower extremity bypass compared with White patients and those who were not socioeconomically disadvantaged.
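A sketch of the mediation logic described (exposure, then CLTI presentation, then amputation) via a product-of-coefficients approximation on synthetic data; for binary outcomes this is only approximate, and counterfactual or bootstrap methods are the rigorous route to indirect-effect CIs:

```python
# Hedged sketch: indirect effect as the product of mediator and outcome paths.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)
n = 5000
df = pd.DataFrame({"exposed": rng.binomial(1, 0.3, n)})
df["clti"] = rng.binomial(1, 0.6 + 0.15 * df["exposed"])   # mediator model
df["amp"]  = rng.binomial(1, 0.05 + 0.08 * df["clti"])     # outcome model

a = smf.logit("clti ~ exposed", df).fit(disp=0).params["exposed"]
b = smf.logit("amp ~ clti + exposed", df).fit(disp=0).params["clti"]
print(f"indirect effect (log-odds scale) ~ a*b = {a * b:.3f}")
```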


Subject(s)
Peripheral Arterial Disease , Humans , Risk Factors , Peripheral Arterial Disease/surgery , Treatment Outcome , Limb Salvage , Ischemia/surgery , Lower Extremity/surgery , Socioeconomic Factors , Retrospective Studies
18.
J Vasc Surg Venous Lymphat Disord ; 11(5): 986-994.e3, 2023 09.
Article in English | MEDLINE | ID: mdl-37120040

ABSTRACT

OBJECTIVE: Venous thromboembolism (VTE) after major surgery remains an important contributor to morbidity and mortality. Despite significant quality improvement efforts in prevention and prophylaxis strategies, the degree of hospital and regional variation in the United States remains unknown. METHODS: Medicare beneficiaries undergoing 13 different major surgeries at U.S. hospitals between 2016 and 2018 were included in this retrospective cohort study. We calculated the rates of 90-day VTE. We adjusted for a variety of patient and hospital covariates and used a multilevel logistic regression model to calculate the rates of VTE and coefficients of variation across hospitals and hospital referral regions (HRRs). RESULTS: A total of 4,115,837 patients from 4116 hospitals were included, of whom 116,450 (2.8%) experienced VTE within 90 days. The 90-day VTE rates varied substantially by procedure, from 2.5% for abdominal aortic aneurysm repair to 8.4% for pancreatectomy. Across the hospitals, there was a 6.6-fold variation in index hospitalization VTE and a 5.3-fold variation in the rate of postdischarge VTE. Across the HRRs, there was a 2.6-fold variation in 90-day VTE, with a 12.1-fold variation in the coefficient of variation. A subset of HRRs was identified with both higher VTE rates and higher variance across hospitals. CONCLUSIONS: Substantial variation exists in the rate of postoperative VTE across U.S. hospitals. Characterizing HRRs with high overall rates of VTE and those with significant variation across the hospitals will allow for targeted quality improvement efforts.
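A minimal sketch of hospital-level 90-day VTE rates and their coefficient of variation, on synthetic rows; the study itself used risk-adjusted multilevel models rather than the raw rates shown here:

```python
# Hedged sketch: per-hospital VTE rates and cross-hospital variation.
import numpy as np
import pandas as pd

rng = np.random.default_rng(6)
n = 50_000
df = pd.DataFrame({
    "hospital": rng.integers(0, 200, n),
    "vte_90d":  rng.binomial(1, 0.028, n),
})

rates = df.groupby("hospital")["vte_90d"].mean()
cv = rates.std() / rates.mean()
print(f"median rate = {rates.median():.3f}, CV across hospitals = {cv:.2f}")
```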


Subject(s)
Pulmonary Embolism , Venous Thromboembolism , Humans , Aged , United States/epidemiology , Venous Thromboembolism/diagnosis , Venous Thromboembolism/epidemiology , Venous Thromboembolism/etiology , Retrospective Studies , Aftercare , Patient Discharge , Medicare , Risk Factors
19.
Pediatr Allergy Immunol ; 34(3): e13941, 2023 03.
Article in English | MEDLINE | ID: mdl-36974652

ABSTRACT

BACKGROUND: Evidence has suggested a bidirectional association between asthma and anxiety, in both their effects and their onset. The direction of this association in children and adolescents is less clear. This study evaluates whether anxiety in children is associated with the development of later asthma or, by contrast, whether asthma in children precedes anxiety. METHODS: Parental reports for 9369 children at two age points (4-5 and 14-15 years old) from the baby (B; recruited at birth in 2004) and kindergarten (K; recruited at 4-5 years of age in 2004) cohorts of the Longitudinal Study of Australian Children (LSAC) were analyzed. Asthma cases were defined as reports of doctor-diagnosed asthma plus use of asthma medication and/or wheezing. Anxiety was defined using Strengths and Difficulties Questionnaire (SDQ) scores. RESULTS: We found a unidirectional association between asthma at 4-5 years of age and future anxiety development in weighted generalized linear adjusted models (B cohort OR (95% CI) = 1.54 (1.14-2.08); K cohort OR (95% CI) = 1.87 (1.40-2.49)). Children with asthma (and no anxiety at 4 years) had a higher prevalence of anxiety in adolescence than nonasthmatic children (B cohort = 26.8% vs 17.6%; K cohort = 27.7% vs 14.3%). Anxiety in childhood was not associated with the development of asthma from 6 years old to adolescence. CONCLUSION: Australian children with asthma have a greater risk of developing anxiety from 6 to 15 years old. This suggests that early multidisciplinary intervention may be required to support children with asthma, to prevent the increased risk of anxiety and/or promote optimal anxiety management.
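A sketch of a weighted logistic model of adolescent anxiety on early-childhood asthma, on synthetic data; freq_weights here is a rough stand-in for the LSAC survey weighting, not the study's exact estimator:

```python
# Hedged sketch: weighted logistic regression, asthma at 4-5 y -> anxiety at 14-15 y.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(7)
n = 4000
df = pd.DataFrame({"asthma_4y": rng.binomial(1, 0.15, n),
                   "weight":    rng.uniform(0.5, 2.0, n)})
df["anxiety_14y"] = rng.binomial(1, 0.15 + 0.10 * df["asthma_4y"])

X = sm.add_constant(df[["asthma_4y"]])
m = sm.GLM(df["anxiety_14y"], X, family=sm.families.Binomial(),
           freq_weights=df["weight"]).fit()
print(f"OR for asthma at 4-5 y = {np.exp(m.params['asthma_4y']):.2f}")
```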


Subject(s)
Asthma , Infant , Infant, Newborn , Female , Child , Humans , Adolescent , Child, Preschool , Longitudinal Studies , Australia/epidemiology , Asthma/diagnosis , Anxiety/epidemiology , Anxiety Disorders , Respiratory Sounds/etiology , Risk Factors
20.
Ann Vasc Surg ; 93: 79-91, 2023 Jul.
Article in English | MEDLINE | ID: mdl-36863491

ABSTRACT

BACKGROUND: Contrast-associated acute kidney injury (CA-AKI) after endovascular abdominal aortic aneurysm repair (EVAR) is associated with mortality and morbidity. Risk stratification remains a vital component of preoperative evaluation. We sought to generate and validate a preprocedure CA-AKI risk stratification tool for elective EVAR patients. METHODS: We queried the Blue Cross Blue Shield of Michigan Cardiovascular Consortium database for elective EVAR patients, excluding those on dialysis, those with a history of renal transplant, those who died during the procedure, and those without creatinine measures. Association with CA-AKI (rise in creatinine > 0.5 mg/dL) was tested using mixed-effects logistic regression. Variables associated with CA-AKI were used to generate a predictive model via a single classification tree. The variables selected by the classification tree were then validated by fitting a mixed-effects logistic regression model to the Vascular Quality Initiative dataset. RESULTS: Our derivation cohort included 7,043 patients, 3.5% of whom developed CA-AKI. After multivariate analysis, age (odds ratio [OR] 1.021, 95% confidence interval [CI] 1.004-1.040), female sex (OR 1.393, CI 1.012-1.916), glomerular filtration rate (GFR) < 30 mL/min (OR 5.068, CI 3.255-7.891), current smoking (OR 1.942, CI 1.067-3.535), chronic obstructive pulmonary disease (OR 1.402, CI 1.066-1.843), maximum abdominal aortic aneurysm (AAA) diameter (OR 1.018, CI 1.006-1.029), and presence of iliac artery aneurysm (OR 1.352, CI 1.007-1.816) were associated with increased odds of CA-AKI. Our risk prediction calculator demonstrated that patients with a GFR < 30 mL/min, female patients, and patients with a maximum AAA diameter > 6.9 cm are at higher risk of CA-AKI after EVAR. Using the Vascular Quality Initiative dataset (N = 62,986), we found that GFR < 30 mL/min (OR 4.668, CI 4.007-5.85), female sex (OR 1.352, CI 1.213-1.507), and maximum AAA diameter > 6.9 cm (OR 1.824, CI 1.212-1.506) were associated with an increased risk of CA-AKI after EVAR. CONCLUSIONS: We present a simple and novel risk assessment tool that can be used preoperatively to identify patients at risk of CA-AKI after EVAR: those with a GFR < 30 mL/min or a maximum AAA diameter > 6.9 cm, and female patients. Prospective studies are needed to determine the efficacy of our model.
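A sketch of a single classification tree like the risk tool described, fitted to synthetic data; the predictors mirror those named in the abstract but are not the consortium's fields:

```python
# Hedged sketch: one shallow decision tree as a preprocedure risk stratifier.
import numpy as np
import pandas as pd
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(8)
n = 7000
X = pd.DataFrame({"gfr_lt_30":  rng.binomial(1, 0.05, n),
                  "female":     rng.binomial(1, 0.2, n),
                  "max_aaa_cm": rng.normal(5.8, 0.8, n)})
y = rng.binomial(1, 0.02 + 0.10 * X["gfr_lt_30"] + 0.02 * X["female"])

tree = DecisionTreeClassifier(max_depth=3, min_samples_leaf=200).fit(X, y)
print(export_text(tree, feature_names=list(X.columns)))  # readable split rules
```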


Subject(s)
Acute Kidney Injury , Aortic Aneurysm, Abdominal , Blood Vessel Prosthesis Implantation , Endovascular Procedures , Female , Humans , Endovascular Procedures/adverse effects , Creatinine , Risk Factors , Treatment Outcome , Acute Kidney Injury/chemically induced , Acute Kidney Injury/diagnosis , Risk Assessment , Aortic Aneurysm, Abdominal/diagnostic imaging , Aortic Aneurysm, Abdominal/surgery , Aortic Aneurysm, Abdominal/complications , Blood Vessel Prosthesis Implantation/adverse effects , Retrospective Studies