ABSTRACT
RATIONALE: Early natural menopause (early-M; <45 years of age) increases the risk of lung morbidity and mortality in smokers. However, it is largely unknown whether early-M due to surgery has similar effects and whether menopausal hormone therapy (MHT) is protective against lung diseases. OBJECTIVES: To assess the associations of early-M and MHT with lung morbidity and mortality using the prospective Prostate, Lung, Colorectal and Ovarian (PLCO) trial. METHODS: We estimated the risk among 69 706 postmenopausal women in the PLCO trial, stratified by menopause type and smoking status. RESULTS: Early-M was associated with an increased risk of most lung disease and mortality outcomes in ever smokers, with the highest risk seen for respiratory mortality (HR 1.98, 95% CI 1.34 to 2.92) in those with bilateral oophorectomy (BO). In never smokers, early-M was positively associated with chronic bronchitis and with all-cause, non-cancer, and respiratory mortality among women with natural menopause or BO, the highest risk being respiratory mortality in the BO group (HR 1.91, 95% CI 1.16 to 3.12). Ever MHT use was associated with reduced all-cause, non-cancer, and cardiovascular mortality across menopause types regardless of smoking status, and was additionally associated with reduced risk of non-ovarian cancer, lung cancer (LC), and respiratory mortality in ever smokers. Among smokers, ever MHT use was associated with a duration-dependent reduction in the HR for all-cause, non-cancer, and cardiovascular mortality. CONCLUSIONS: Smokers with early-M should be targeted for smoking cessation and LC screening regardless of menopause type. Among ever smokers, MHT users had a lower likelihood of dying from LC and respiratory diseases.
Subjects
Lung Diseases, Humans, Female, Middle Aged, Prospective Studies, Premature Menopause, Menopause/physiology, Smoking/adverse effects, Smoking/epidemiology, Aged, Estrogen Replacement Therapy, Risk Factors, Hormone Replacement Therapy/adverse effects
ABSTRACT
BACKGROUND: Post-transplant health-related quality of life (HRQOL) is associated with health outcomes for kidney transplant (KT) recipients. However, pretransplant predictors of improvements in post-transplant HRQOL remain incompletely understood. Notably, important pretransplant cultural factors, such as experience of discrimination, perceived racism in healthcare, and mistrust of the healthcare system, have not been examined as potential HRQOL predictors. Few studies have examined predictors of decline in HRQOL post-transplant. METHODS: Using data from a prospective cohort study, we examined HRQOL change from pre- to post-transplant and novel cultural predictors of that change. We measured physical, mental, and kidney-specific HRQOL as outcomes and used cultural factors as predictors, controlling for demographic, clinical, psychosocial, and transplant knowledge covariates. RESULTS: Among 166 KT recipients (57% male; mean age 50.6 years; 61.4% > high school graduates; 80% non-Hispanic White), mental and physical, but not kidney-specific, HRQOL significantly improved post-transplant. No cultural factor other than medical mistrust significantly predicted change in any HRQOL outcome. Instead, demographic, knowledge, and clinical factors significantly predicted decline in each HRQOL domain: for physical HRQOL, older age, more post-KT complications, and higher pre-KT physical HRQOL; for mental HRQOL, having less information pre-KT and greater pre-KT mental HRQOL; and for kidney-specific HRQOL, poorer kidney functioning post-KT, lower expectations for physical condition to improve, and higher pre-KT kidney-specific HRQOL. CONCLUSIONS: Rather than cultural factors, predictors of HRQOL decline were demographic, knowledge, and clinical factors. These findings are useful for identifying patient groups at greater risk of poorer post-transplant outcomes, in order to target individualized support to patients.
Subjects
Kidney Transplantation, Humans, Male, Middle Aged, Female, Kidney Transplantation/psychology, Quality of Life/psychology, Prospective Studies, Trust, Kidney
ABSTRACT
OBJECTIVE: To better understand the risk of unplanned hysterectomy (UH) in pregnant women in relation to maternal sociodemographic characteristics, cardiovascular disease (CVD) risk factors, and current pregnancy complications. DESIGN: Using Florida birth data from 2005 to 2014, we investigated possible interactions among known risk factors for UH, including maternal sociodemographic characteristics, maternal medical history, and other pregnancy complications. Logistic regression models were constructed, and adjusted odds ratios with 95% confidence intervals were reported. RESULTS: Several interactions significantly affected the odds of UH. Compared with non-Hispanic White women, Hispanic women were more likely to have a UH. The overall risk of UH for women who delivered preterm (<37 weeks) and concurrently had premature rupture of membranes (PRoM), uterine rupture, or a previous cesarean delivery was significantly higher than for women who delivered at term and had no pregnancy complications. Women who delivered via cesarean and also had preeclampsia, PRoM, or uterine rupture had an overall increased risk of UH. Significantly decreased risk of UH was seen for Black women less than 20 years old, women of other minority races with either less than a high school degree or a college degree or greater, women of other minority races with PRoM, and women with preterm birth and diabetes, compared with the respective reference groups. CONCLUSIONS: Maternal race, ethnicity, CVD risk factors, and current pregnancy complications affect the risk of UH in pregnant women through complex interactions that would not be seen in unadjusted models of risk analysis.
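A minimal sketch of the kind of interaction model described, using simulated data (the variable names, effect sizes, and data are illustrative assumptions, not the Florida birth records):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical illustration of a logistic regression with interaction terms
# for unplanned hysterectomy (UH); all variables and effects are simulated.
rng = np.random.default_rng(1)
n = 5000
df = pd.DataFrame({
    "preterm": rng.integers(0, 2, n),    # delivered <37 weeks
    "prom": rng.integers(0, 2, n),       # premature rupture of membranes
    "cesarean": rng.integers(0, 2, n),   # cesarean delivery
})
# Simulate UH with main effects plus a preterm x PRoM interaction
logit_p = (-4 + 0.6 * df.preterm + 0.5 * df.prom
           + 0.7 * df.cesarean + 0.8 * df.preterm * df.prom)
df["uh"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

model = smf.logit("uh ~ preterm * prom + cesarean", data=df).fit(disp=False)
print(np.exp(model.params))      # adjusted odds ratios
print(np.exp(model.conf_int()))  # 95% confidence intervals
```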
Subjects
Cardiovascular Diseases, Pregnancy Complications, Premature Birth, Uterine Rupture, Pregnancy, Female, Newborn Infant, Humans, Young Adult, Adult, Ethnicity, Premature Birth/epidemiology, Sociodemographic Factors, Cardiovascular Diseases/epidemiology, Pregnancy Complications/epidemiology, Risk Factors, Hysterectomy, Retrospective Studies
ABSTRACT
BACKGROUND: Impairment in cerebral autoregulation has been proposed as a potentially targetable factor in patients with aneurysmal subarachnoid hemorrhage (aSAH); however, there are different continuous measures that can be used to calculate the state of autoregulation. In addition, it has previously been proposed that impaired autoregulation may be associated with the occurrence of spreading depolarization (SD) events. METHODS: Study participants with aSAH and invasive multimodal monitoring were enrolled in an observational study. Autoregulation indices were prospectively calculated from this database as a 10-s moving correlation coefficient between various cerebral blood flow (CBF) surrogates and mean arterial pressure (MAP). In study participants with subdural electrocorticography (ECoG) monitoring, SD was also scored. Associations between clinical outcomes, measured using the modified Rankin Scale, and the occurrence of either isolated or clustered SD were assessed. RESULTS: A total of 320 study participants were included, 47 of whom also had ECoG SD monitoring. As expected, baseline severity factors, such as modified Fisher scale score and World Federation of Neurosurgical Societies scale grade, were strongly associated with clinical outcome. SD probability was related to blood pressure in a triphasic pattern, with a linear increase in probability below a MAP of ~100 mm Hg. Multiple autoregulation indices were available for review, each a moving correlation between MAP and a CBF surrogate: the pressure reactivity index (PRx), using two different sources for intracranial pressure (ICP); the oxygen reactivity index (ORx), using the partial pressure of brain tissue oxygen (PbtO2) from the Licox probe; the cerebral blood flow reactivity index (CBFRx), using perfusion measurements from the Bowman perfusion probe; and the cerebral oxygen saturation reactivity index (OSRx), using regional cerebral oxygen saturation measured by near-infrared spectroscopy from the INVOS sensors. Only worse ORx and OSRx were associated with worse clinical outcomes. Both ORx and OSRx also increased in the hour prior to SD, for both sporadic and clustered SD. CONCLUSIONS: Impaired autoregulation in aSAH, as measured by ORx and OSRx, is associated with worse clinical outcomes and the occurrence of SD. Impaired autoregulation precedes SD occurrence. Targeting the optimal MAP or cerebral perfusion pressure in patients with aSAH should use ORx and/or OSRx as the input function rather than intracranial pressure.
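As a concrete illustration of such an index, here is a minimal sketch of a moving-correlation reactivity calculation (sampling rate, window length, and signal choices are assumptions for illustration, not the study's monitoring configuration):

```python
import numpy as np
import pandas as pd

def reactivity_index(map_mmhg: pd.Series, cbf_surrogate: pd.Series,
                     window_s: int = 10, fs_hz: float = 1.0) -> pd.Series:
    """Moving Pearson correlation between MAP and a CBF surrogate.

    Values near +1 suggest pressure-passive (impaired) autoregulation;
    values near 0 or negative suggest intact autoregulation.
    """
    window = int(window_s * fs_hz)  # samples per correlation window
    return map_mmhg.rolling(window).corr(cbf_surrogate)

# Synthetic example: 5 minutes of 1-Hz data with a pressure-passive surrogate
rng = np.random.default_rng(0)
map_sig = pd.Series(90 + 10 * rng.standard_normal(300))
icp_sig = pd.Series(12 + 0.3 * (map_sig - 90) + rng.standard_normal(300))

prx_like = reactivity_index(map_sig, icp_sig)
print(prx_like.describe())
```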
ABSTRACT
PURPOSE: To evaluate the effectiveness of microfragmented adipose tissue (MFAT) for pain relief and improved joint functionality in osteoarthritis (OA) of the knee in a randomized controlled clinical trial with 1-year follow-up. METHODS: Seventy-five patients were stratified by baseline pain level and randomized to 1 of 3 treatment groups: MFAT, corticosteroid (CS), or saline control (C) injection. Patients 18 years of age or older, diagnosed with symptomatic OA of the knee, with radiographic evidence of OA of the knee and a visual analog scale (VAS) pain score of 3 of 10 or greater were included. Patients were excluded if they had any previous intra-articular knee injection, current knee ligamentous instability, or an allergy to lidocaine/corticosteroid. The VAS pain scale, Western Ontario and McMaster Universities Osteoarthritis Index (WOMAC), and Knee Injury and Osteoarthritis Outcome Score (KOOS) were recorded preprocedure and at 2 weeks, 6 weeks, 3 and 6 months, and 1-year follow-up. RESULTS: MFAT demonstrated consistent and statistically significant improvements across all primary outcome measures for joint pain and functionality compared with C. For MFAT, there was a significant improvement over baseline at each follow-up, with median (95% confidence interval) KOOS Pain score changes of 18.1 (11.1-26.4) at week 2 to 27.8 (19.4-37.5) at 1 year. For CS, the median KOOS Pain score change reached a maximum of 22.2 (15.3-30.6) at week 2, then fell to 13.9 (-2.8 to 29.2) at 1 year, a level not statistically different from baseline. The median changes for C hovered around 6 to 11 points, with statistically significant improvements over baseline indicating a placebo effect. Similar trends were seen for the WOMAC Pain score and VAS pain score. CONCLUSIONS: In this study, MFAT demonstrated a clinically significant improvement in primary outcome scores compared with the C group, whereas the CS group only showed statistically significant improvement compared with the C group at 2 and 6 weeks. This finding indicates that MFAT may be a viable alternative treatment for patients with OA of the knee who fall into the orthopaedic treatment gap. LEVEL OF EVIDENCE: Level II, partially blinded, randomized controlled clinical trial.
ABSTRACT
Non-attendance at kidney transplant evaluation (KTE) appointments is a barrier to optimal care for those with kidney failure. We examined the medical and socio-cultural factors that predict KTE non-attendance to identify opportunities for integrated medical teams to intervene. Patients scheduled for KTE between May 2015 and June 2018 completed an interview before their initial KTE appointment. The interview assessed various social determinants of health, including demographic (e.g., income), medical (e.g., comorbidities), transplant knowledge, cultural (e.g., medical mistrust), and psychosocial (e.g., social support) factors. We used multiple logistic regression analysis to determine the strongest predictors of KTE non-attendance. Our sample (N = 1119) was 37% female and 76% non-Hispanic White, with a median age of 59.4 years (IQR 49.2-67.5). Of note, 142 (13%) never attended an initial KTE clinic appointment. Being on dialysis predicted higher odds of KTE non-attendance (OR 1.76; p = .02; 64% of KTE attendees on dialysis vs. 77% of non-attendees). Transplant and nephrology teams should consider working collaboratively with dialysis units to better coordinate care (e.g., resources to attend appointments or outreach emphasizing the importance of transplant), adjusting the KTE referral and evaluation process to address access issues (e.g., using telehealth), and encouraging partnership with clinical psychologists to promote quality of life for those on dialysis.
Subjects
Kidney Transplantation, Quality of Life, Humans, Female, Middle Aged, Male, Trust, Renal Dialysis, Comorbidity
ABSTRACT
BACKGROUND: Village doctors, as gatekeepers of the health system for rural residents in China, often face adversity in providing basic public healthcare services. OBJECTIVE: We sought to summarize the training content, methods, locations, and costs most preferred by village doctors in China, to provide evidence and support for the government to deliver better training in the future. METHODS: Eight databases were searched for studies reporting on the training needs of village doctors in China. We undertook a systematic review and a narrative synthesis of the data. RESULTS: A total of 38 cross-sectional studies including 35,545 participants were included. Village doctors in China have extensive training needs. "Clinical knowledge and skills" and "diagnosis and treatment of common diseases" were the most preferred training content; continuing medical education was the most preferred delivery method; county-level and higher hospitals were the most desirable training locations; and training costs were expected to be low or even free. CONCLUSION: Village doctors in various regions of China have similar preferences for training. Future training should therefore focus more on the training needs and preferences of village doctors.
ABSTRACT
INTRODUCTION: Health workers in rural and remote areas shoulder heavy responsibilities for rural residents. This systematic review aims to assess the effectiveness of continuing education programs for health workers in rural and remote areas. METHODS: Eight electronic databases were searched on 28 November 2021. Randomized controlled trials (RCTs) and quasi-experimental studies evaluating the effectiveness of continuing education for health workers in rural and remote areas were included. The quality of the studies was assessed using the risk-of-bias tool provided by Effective Practice and Organization of Care. A meta-analysis was performed for eligible trials; the remaining findings were presented as a narrative review because of inconsistent study types and outcomes. RESULTS: A total of 17 studies were included, four of which were RCTs. The meta-analysis showed that, compared with no intervention, continuing education programs significantly improved participants' rate of knowledge awareness (odds ratio = 4.09, 95% confidence interval 2.51-6.67, p < 0.05). Qualitative analysis showed that 12 studies reported on participants' level of knowledge, all showing positive changes. Eight studies measured the performance of health workers in rural and remote areas, with 87.5% (n = 7) finding improved performance. Two studies reported on the impact of continuing education programs on patient health, with only one showing a positive change. One study from India measured the health of communities and showed a positive change. CONCLUSION: Continuing education programs are an effective way to address the lack of knowledge and skills among health workers in rural and remote areas. Few studies have examined whether such programs improve patient health outcomes, so it is not yet known whether delivering continuing education to health workers in rural areas has a positive impact on patient and community health. Future work should continue to attend to these outcomes.
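For context, a pooled odds ratio of this kind is commonly produced by inverse-variance pooling of study-level log odds ratios; a minimal sketch follows (the study ORs and CIs below are made-up placeholders, not the review's data, and a fixed-effect pool is shown for simplicity):

```python
import numpy as np

# Made-up study-level odds ratios with 95% CIs (illustrative only)
ors = np.array([3.2, 4.8, 5.1])
ci_low = np.array([1.9, 2.6, 2.4])
ci_high = np.array([5.4, 8.9, 10.8])

log_or = np.log(ors)
se = (np.log(ci_high) - np.log(ci_low)) / (2 * 1.96)  # SE from CI width
w = 1 / se**2                                         # inverse-variance weights

pooled = np.sum(w * log_or) / np.sum(w)
pooled_se = np.sqrt(1 / np.sum(w))
lo, hi = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se
print(f"pooled OR {np.exp(pooled):.2f} (95% CI {np.exp(lo):.2f}-{np.exp(hi):.2f})")
```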
Subjects
Continuing Education, Health Personnel, Humans, Health Personnel/education, Educational Status, Public Health/education, India
ABSTRACT
BACKGROUND: The role of wood smoke (WS) exposure in the etiology of chronic obstructive pulmonary disease (COPD), lung cancer (LC), and mortality remains elusive in adults from countries with low ambient levels of combustion-emitted particulate matter. This study aims to delineate the impact of WS exposure on lung health and mortality in ever-smoking adults aged 40 years and older. METHODS: We assessed the health impact of self-reported "ever WS exposure for over a year" in the Lovelace Smokers Cohort using both objective measures (i.e., lung function decline, LC incidence, and deaths) and two health-related quality-of-life questionnaires (i.e., the lung disease-specific St. George's Respiratory Questionnaire [SGRQ] and the generic 36-item Short-Form Health Survey). RESULTS: Compared with subjects without WS exposure, subjects with WS exposure had a more rapid decline of FEV1 (-4.3 mL/year, P = 0.025) and of the FEV1/FVC ratio (-0.093%/year, P = 0.015), but not of FVC (-2.4 mL/year, P = 0.30). Age modified the impact of WS exposure on lung function decline. WS exposure impaired all health domains, with the increase in SGRQ scores exceeding the minimal clinically important difference. WS exposure increased the hazard of LC incidence and of all-cause, cardiopulmonary, and cancer mortality by more than 50% and shortened the lifespan by 3.5 years. We found no evidence of differential misclassification or of confounding by socioeconomic status for the health effects of WS exposure. CONCLUSIONS: We identified epidemiological evidence supporting WS exposure as an independent etiological factor in the development of COPD through accelerated lung function decline in an obstructive pattern. Time-to-event analyses of LC incidence and cancer-specific mortality provide human evidence supporting the carcinogenicity of WS exposure.
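The time-to-event analyses mentioned are typically Cox proportional hazards models; a hedged sketch on simulated data follows (variable names, effect sizes, and the lifelines library choice are assumptions; the assumed WS coefficient of 0.4 corresponds to a hazard ratio of about 1.5, mirroring the ">50%" increase only as an illustration):

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(5)
n = 2000
ws = rng.integers(0, 2, n)                 # ever wood smoke exposure
age = rng.uniform(40, 75, n)
packyears = rng.gamma(2.0, 15.0, n)

# Simulate event times with a WS log-hazard of 0.4 (HR ~ 1.5)
hazard = 0.01 * np.exp(0.4 * ws + 0.02 * (age - 55) + 0.005 * packyears)
time = rng.exponential(1 / hazard)
df = pd.DataFrame({"T": np.minimum(time, 20),        # censor at 20 years
                   "E": (time < 20).astype(int),     # 1 = event observed
                   "ws": ws, "age": age, "packyears": packyears})

cph = CoxPHFitter().fit(df, duration_col="T", event_col="E")
print(cph.summary[["coef", "exp(coef)", "p"]])       # exp(coef) = hazard ratio
```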
Subjects
Chronic Obstructive Pulmonary Disease, Quality of Life, Adult, Aging, Humans, Lung, Chronic Obstructive Pulmonary Disease/diagnosis, Chronic Obstructive Pulmonary Disease/epidemiology, Chronic Obstructive Pulmonary Disease/etiology, Smoke/adverse effects, Smokers, Wood/adverse effects
ABSTRACT
BACKGROUND/OBJECTIVES: Hirschsprung disease (HD) is associated with significant morbidity, including long-term bowel dysfunction. The aim of this study was to update national and regional trends in inpatient care utilization and the epidemiology of HD in the United States between 2009 and 2014 using the National Inpatient Sample (NIS) database. METHODS: We identified all pediatric admissions with a diagnosis of HD within the NIS from 2009 through 2014. We analyzed HD discharges with respect to various demographic and clinical factors, specifically trends and group differences in inflation-adjusted cost of hospitalization, procedures, comorbidities, hospital mortality, and length of stay (LOS). A modified Cochran-Armitage trend test was used to analyze trends for dichotomous outcome variables, and regression analyses were conducted for continuous and binary variables. RESULTS: National estimates of HD discharges showed no significant trend between 2009 and 2014 (P = 0.27), with estimated relative incidence ranging from 46 to 70 per 100,000 pediatric discharges. Inflation-adjusted cost of hospitalization increased by $1137 (SE $326) per year (P = 0.0005). Pull-through procedures in the neonatal age group increased from 33.0% in 2009 to 36.5% in 2014 (P = 0.003). Hospital mortality remained stable between 0.4% and 1.0% (P = 0.598). LOS decreased by 0.23 days per year (P = 0.036). CONCLUSION: Increasing cost of HD-related hospitalization despite decreasing LOS was observed in this cohort. A stable rate of hospitalizations with an increasing proportion of pull-through procedures among neonates was noted. Future studies and the development of protocols to standardize patient care could improve outcomes and healthcare spending.
Subjects
Hirschsprung Disease, Inpatients, Child, Factual Databases, Hirschsprung Disease/epidemiology, Hirschsprung Disease/therapy, Hospitalization, Humans, Newborn Infant, Length of Stay, United States/epidemiology
ABSTRACT
Kidney transplant (KT) recipients face post-transplant health issues. Immunosuppressive agents can cause hyperlipidemia, hypertension, post-transplant diabetes, and glomerulopathy. Post-transplant weight gain and decreased activity are associated with poor quality of life, sleep, and cardiometabolic outcomes. This study will test the feasibility and acceptability of a culturally tailored diet and exercise intervention for KT patients delivered immediately post-transplant using novel technology. A registered dietitian nutritionist (RDN) and a physical rehabilitation therapist will examine participants' cultural background, preferences, and health-related obstacles (in consultation with the transplant team) to create an individualized exercise and meal plan. The RDN will provide medical nutrition therapy via the nutrition care process throughout the course of the intervention. The Twistle Patient Engagement Platform will be used to deliver and collect survey data, communicate with participants, and promote retention. Outcomes to be assessed include intervention feasibility and acceptability, as well as efficacy with respect to participants' adherence and medical, quality-of-life, and occupational outcomes.
Subjects
Quality of Life, Transplant Recipients, Delivery of Health Care, Humans, Technology, Weight Gain
ABSTRACT
BACKGROUND: Chronic subdural hematoma (cSDH) is a common neurosurgical condition responsible for excess morbidity, particularly in the geriatric population. Recovery after evacuation is complicated by fluctuating neurological deficits in a high proportion of patients. We previously demonstrated that spreading depolarizations (SDs) may be responsible for some of these events. In this study, we aimed to determine candidate risk factors for probable SD and to assess the influence of probable SD on outcome. METHODS: We studied two cohorts of patients who underwent surgery for cSDH. The first cohort (n = 40) had electrocorticographic monitoring to detect SD. In the second cohort (n = 345), we retrospectively identified subjects with suspected SD based on the presence of transient neurological symptoms not explained by structural etiology or by ictal activity on electroencephalography. We extracted standard demographic and outcome variables for comparisons and modeling. RESULTS: Of 345 subjects, 80 (23%) were identified in the retrospective cohort as having probable SD. Potential risk factors included a history of hypertension, worse clinical presentation on the Glasgow Coma Scale, and lower Hounsfield unit density and volume of the preoperative subdural hematoma. Probable SD was associated with multiple worse-outcome measures, including length of stay and clinical outcomes, but not with increased mortality. On multivariable analysis, probable SD was independently associated with worse outcome as determined by the Glasgow Outcome Scale score at the first clinic follow-up (odds ratio 1.793, 95% confidence interval 1.022-3.146) and with longer hospital length of stay (odds ratio 7.952, 95% confidence interval 4.062-15.563). CONCLUSIONS: Unexplained neurological deficits after surgery for cSDH occur in nearly a quarter of patients and may be explained by SD. We identified several candidate risk factors. Patients with probable SD have worse outcomes, independent of other baseline risk factors. Further data with gold-standard monitoring are needed to identify predictors of SD so that therapies can be targeted to a high-risk population.
Subjects
Chronic Subdural Hematoma, Aged, Glasgow Coma Scale, Chronic Subdural Hematoma/surgery, Humans, Retrospective Studies, Risk Factors, Treatment Outcome
ABSTRACT
BACKGROUND: A small number of high-need patients account for a disproportionate amount of Medicaid spending, yet they typically engage little in outpatient care and have poor outcomes. OBJECTIVE: To address this issue, we developed ECHO (Extension for Community Health Outcomes) Care™, a complex care intervention in which outpatient intensivist teams (OITs) provided care to high-need high-cost (HNHC) Medicaid patients. Teams were supported using the ECHO model™, a continuing medical education approach that connects specialists with primary care providers for case-based mentoring to treat complex diseases. DESIGN: Using an interrupted time series analysis of Medicaid claims data, we measured healthcare utilization and expenditures before and after ECHO Care. PARTICIPANTS: ECHO Care served 770 patients in New Mexico between September 2013 and June 2016. Nearly all had a chronic mental illness, and over three-quarters had a chronic substance use disorder. INTERVENTION: ECHO Care patients received care from an OIT, which typically included a nurse practitioner or physician assistant, a registered nurse, a licensed mental health provider, and at least one community health worker. Teams focused on addressing patients' physical, behavioral, and social issues. MAIN MEASURES: We assessed the effect of ECHO Care on Medicaid costs and utilization (inpatient admissions, emergency department (ED) visits, other outpatient visits, and dispensed prescriptions). KEY RESULTS: ECHO Care was associated with significant changes in patients' use of the healthcare system. At 12 months post-enrollment, the odds of a patient having an inpatient admission and an ED visit were each reduced by approximately 50%, while outpatient visits and prescriptions increased by 23% and 8%, respectively. We found no significant change in overall Medicaid costs associated with ECHO Care. CONCLUSIONS: ECHO Care shifts healthcare utilization from inpatient to outpatient settings, which suggests decreased patient suffering and greater access to care, including more effective prevention and early intervention for chronic conditions.
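A minimal sketch of the segmented-regression form of an interrupted time series analysis (toy monthly data and variable names are assumptions; the actual study modeled Medicaid claims):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
months = np.arange(-12, 13)                  # 12 months pre- and post-enrollment
post = (months >= 0).astype(int)             # level-change indicator
ts = np.where(months >= 0, months, 0)        # slope change after enrollment

# Simulate a utilization series with a drop in level and slope at enrollment
y = 10 + 0.05 * months - 3.0 * post - 0.2 * ts + rng.normal(0, 1, months.size)
df = pd.DataFrame({"y": y, "t": months, "post": post, "ts": ts})

its = smf.ols("y ~ t + post + ts", data=df).fit()
print(its.params)  # 'post' = immediate level change; 'ts' = change in trend
```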
Subjects
Hospitalization, Medicaid, Hospital Emergency Service, Health Expenditures, Humans, Patient Acceptance of Health Care, United States
ABSTRACT
We aimed to investigate whether short-term exposure to reduced particulate matter (PM) air pollution would affect respiratory function in healthy adults. We followed a cohort of 42 healthy participants from a community afflicted with severe PM air pollution to a substantially less polluted area for nine days. We measured daily airborne PM concentrations [for PM with an aerodynamic diameter of less than 2.5 µm (PM2.5) and less than 10 µm (PM10)] and PM2.5 carbon component concentrations. Five repeated respiratory function measurements and fractional exhaled nitric oxide tests were made for each participant. Associations between respiratory health and PM exposure were assessed using linear mixed models. Each 10 µg/m3 decrease in same-day PM2.5 was associated with small but consistent increases in forced expiratory volume in 1 s (FEV1) (9.00 mL) and forced vital capacity (14.35 mL). Our observations indicate that respiratory health benefits can be achieved even after a short-term reduction in PM exposure. Our results provide strong evidence for more rigorous air pollution controls for the health benefit of populations.
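A hedged sketch of the linear mixed model described, with a random intercept per participant (synthetic data; the assumed slope of -0.9 mL per µg/m3 mirrors the reported 9.00 mL change per 10 µg/m3 only as an illustration):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n_subj, n_visits = 42, 5
subj = np.repeat(np.arange(n_subj), n_visits)
pm25 = rng.uniform(10, 120, n_subj * n_visits)        # daily PM2.5, ug/m3

# FEV1 (mL) = subject-specific baseline - 0.9 * PM2.5 + noise
baseline = 3200 + rng.normal(0, 150, n_subj)
fev1 = baseline[subj] - 0.9 * pm25 + rng.normal(0, 30, n_subj * n_visits)
df = pd.DataFrame({"subject": subj, "pm25": pm25, "fev1_ml": fev1})

mm = smf.mixedlm("fev1_ml ~ pm25", data=df, groups=df["subject"]).fit()
print(10 * mm.params["pm25"])  # estimated FEV1 change per 10 ug/m3 PM2.5
```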
Subjects
Air Pollutants/analysis, Air Pollutants/toxicity, Air Pollution/prevention & control, Environmental Exposure, Particulate Matter/analysis, Respiratory Physiological Phenomena/drug effects, China, Female, Humans, Linear Models, Male, Middle Aged, Particle Size, Prospective Studies, Respiratory Function Tests
ABSTRACT
In 2014, the National Research Council (NRC) published Review of EPA's Integrated Risk Information System (IRIS) Process, which considers the methods EPA uses to develop toxicity criteria for non-carcinogens. These criteria are the Reference Dose (RfD) for oral exposure and the Reference Concentration (RfC) for inhalation exposure. The NRC Review suggested using Bayesian methods for the application of uncertainty factors (UFs) to adjust the point of departure (POD) dose or concentration to a level considered to be without adverse effects for the human population. The NRC foresaw that Bayesian methods would be potentially useful for combining toxicity data from disparate sources: high-throughput assays, animal testing, and observational epidemiology. UFs represent five distinct areas for which both adjustment and consideration of uncertainty may be needed. The NRC suggested that UFs could be represented as Bayesian prior distributions, illustrated the use of a log-normal distribution to represent the composite UF, and combined this distribution with a log-normal distribution representing uncertainty in the POD to reflect the overall uncertainty. Here, we explore these suggestions and present a refinement of the methodology suggested by the NRC that treats each individual UF as a distribution. From an examination of 24 evaluations from EPA's IRIS program, when individual UFs were represented using this approach, the geometric mean fold change in the value of the RfD or RfC increased from 3 to over 30, depending on the number of individual UFs used and the sophistication of the assessment. We present example calculations and recommendations for implementing the refined NRC methodology.
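A Monte Carlo sketch of the refined approach, with each UF as its own log-normal distribution combined with a log-normal POD (all medians and geometric standard deviations below are illustrative assumptions, not values from the 24 IRIS evaluations):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 100_000

def lognorm(median, gsd, size):
    """Log-normal samples parameterized by median and geometric SD."""
    return rng.lognormal(np.log(median), np.log(gsd), size)

pod = lognorm(10.0, 2.0, n)              # point of departure, mg/kg-day
uf_animal_to_human = lognorm(3.0, 2.0, n)
uf_human_variability = lognorm(10.0, 2.0, n)
uf_database = lognorm(3.0, 1.5, n)

# Candidate RfD distribution: POD divided by the product of the UFs
rfd = pod / (uf_animal_to_human * uf_human_variability * uf_database)
print(np.percentile(rfd, [5, 50, 95]))   # e.g., report a lower percentile as the RfD
```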
Subjects
Bayes Theorem, Hazardous Substances/toxicity, Statistical Models, Toxicity Tests/methods, Uncertainty, Oral Administration, Animals, Computer Simulation, Dose-Response Relationship (Drug), Epidemiologic Methods, Hazardous Substances/pharmacokinetics, High-Throughput Screening Assays, Humans, Inhalation Exposure, Monte Carlo Method, Reference Values, Risk Assessment, Toxicity Tests/standards
ABSTRACT
Bivariate correlated (clustered) data, often encountered in epidemiological and clinical research, are routinely analyzed under a linear mixed-effects (LME) model with normality assumptions for the random effects and within-subject errors. However, such analyses might not provide robust inference when the normality assumptions are questionable, particularly if the data exhibit skewness and heavy tails. In this article, we develop a Bayesian approach to bivariate linear mixed-effects (BLME) models, replacing the Gaussian assumptions for the random terms with skew-normal/independent (SNI) distributions. The SNI family is an attractive class of asymmetric, heavy-tailed parametric structures that includes the skew-normal, skew-t, skew-slash, and skew-contaminated normal distributions as special cases. We assume that the random effects and the within-subject (random) errors follow multivariate SNI and normal/independent (NI) distributions, respectively, which provide an appealing robust alternative to the symmetric normal distribution in the BLME model framework. The method is exemplified through an application to an AIDS clinical data set, comparing potential models with different distributional specifications, and clinically important findings are reported.
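For orientation, a standard construction of the SNI family (stated generically; the paper's exact parameterization may differ) writes a random vector as a scale mixture of a skew-normal vector:

\[
\mathbf{Y} = \boldsymbol{\mu} + U^{-1/2}\,\mathbf{Z},
\qquad
\mathbf{Z} \sim \mathrm{SN}_p(\mathbf{0}, \boldsymbol{\Sigma}, \boldsymbol{\lambda}),
\qquad
U \sim H(u \mid \boldsymbol{\nu}),
\]

with U independent of Z. Taking H degenerate at U = 1 recovers the skew-normal; U ~ Gamma(ν/2, ν/2) yields the skew-t; U ~ Beta(ν, 1) yields the skew-slash; and a two-point mixing distribution yields the skew-contaminated normal. Setting the skewness parameter λ = 0 gives the symmetric normal/independent (NI) family assumed for the within-subject errors.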
Subjects
Bayes Theorem, Clinical Trials as Topic/statistics & numerical data, HIV Infections/drug therapy, HIV Protease Inhibitors/therapeutic use, Statistical Models, Highly Active Antiretroviral Therapy, Humans, Multivariate Analysis, Normal Distribution, Treatment Outcome
ABSTRACT
INTRODUCTION: The effectiveness of the Stress Management and Resilience Training (SMART) program with U.S. military personnel has not been reported in the literature. The purpose of this study was to examine the effectiveness of SMART in increasing resilience in Air Force healthcare personnel. MATERIALS AND METHODS: We conducted a pilot, randomized preventive trial with active-component Air Force healthcare personnel. SMART was offered either as a 2-h training session delivered in person or by synchronous video teleconference, or as self-paced, computer-based training. A baseline survey included demographic questions and the 10-item Connor-Davidson Resilience Scale (CD-10), Perceived Stress Scale (PSS), Generalized Anxiety Disorder scale (GAD-7), and an overall quality of life (QOL) measure. Follow-up surveys with the CD-10, PSS, GAD-7, and QOL measure were sent to participants at 12, 18, and 24 weeks after completing SMART. RESULTS: Fifty-six service members completed the baseline assessment and were randomized to either the in-person modality (video teleconference or face-to-face training) or the computer-based training modality, and 49 participants completed SMART. Significant increases in median CD-10 scores were observed among all participants, with 4-point (14%), 6-point (21%), and 5-point (17%) increases at weeks 12, 18, and 24, respectively, from baseline. A significant overall decrease in median PSS scores from baseline was also observed, with decreases of 5.5 points (22%), 7.81 points (32%), and 8.5 points (35%) at 12, 18, and 24 weeks post-SMART, respectively. CONCLUSIONS: In this pilot study, SMART demonstrated significant and meaningful improvements in self-reported CD-10 and PSS scores at 12, 18, and 24 weeks post-training. A future replication of the study is necessary to evaluate the effectiveness of SMART on a larger scale.
Subjects
Military Personnel, Psychological Tests, Psychological Resilience, Self Report, Humans, Quality of Life, Psychological Stress, Pilot Projects, Delivery of Health Care
ABSTRACT
Background: The assessment of heavy metals' effects on human health is frequently limited to investigating one metal or a group of related metals. The effect of heavy metal mixtures on heart attack is unknown. Methods: This study applied the Bayesian kernel machine regression (BKMR) model to the 2011-2016 National Health and Nutrition Examination Survey (NHANES) data to investigate the association between heavy metal mixture exposure and heart attack. A total of 2972 participants over the age of 20 were included in the study. Results: Heart attack patients had higher levels of cadmium and lead in blood and of cadmium, cobalt, and tin in urine, while having lower levels of mercury, manganese, and selenium in blood and of manganese, barium, tungsten, and strontium in urine. The estimated risk of heart attack was 0.0030 units lower when all the metals were at their 25th percentile compared with their 50th percentile, and 0.0285 units higher when all the metals were at their 75th percentile compared with their 50th percentile. The results suggest that heavy metal exposure, especially to cadmium and lead, may increase the risk of heart attack. Conclusions: This study suggests a possible association between heavy metal mixture exposure and heart attack and, additionally, demonstrates how the BKMR model can be used to investigate new combinations of exposures in future studies.
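For reference, the core BKMR model (stated in its standard Gaussian-outcome form; a probit variant is typically used for a binary outcome such as heart attack) is

\[
y_i = h(z_{i1}, \ldots, z_{iM}) + \mathbf{x}_i^{\top}\boldsymbol{\beta} + \varepsilon_i,
\qquad \varepsilon_i \sim N(0, \sigma^2),
\]

where \(z_{i1}, \ldots, z_{iM}\) are the metal concentrations, \(\mathbf{x}_i\) are covariates, and the exposure-response surface \(h(\cdot)\) is given a Gaussian-process prior with kernel

\[
K(\mathbf{z}, \mathbf{z}') = \exp\!\Big(-\sum_{m=1}^{M} r_m \,(z_m - z'_m)^2\Big),
\]

so the fitted surface can capture nonlinearity and interactions among the metals; the percentile contrasts reported above correspond to evaluating \(h\) with all metals set jointly to their 25th, 50th, or 75th percentiles.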