Results 1 - 20 of 21
1.
Trauma Surg Acute Care Open ; 9(1): e001297, 2024.
Article in English | MEDLINE | ID: mdl-38666014

ABSTRACT

Objective: Venous thromboembolism (VTE) risk reduction strategies include early initiation of chemoprophylaxis, reducing missed doses, weight-based dosing and dose adjustment using anti-Xa levels. We hypothesized that time to initiation of chemoprophylaxis would be the strongest modifiable risk factor for VTE, even after adjusting for competing risk factors. Methods: A prospectively maintained trauma registry was queried for patients admitted July 2017-October 2021 who were 18 years and older and received emergency release blood products. Patients with deep vein thrombosis or pulmonary embolism (VTE) were compared with those without (no VTE). Door-to-prophylaxis was defined as time from hospital arrival to first dose of VTE chemoprophylaxis (hours). Univariate and multivariate analyses were then performed between the two groups. Results: 2047 patients met inclusion criteria (106 VTE, 1941 no VTE). There were no differences in baseline or demographic data. VTE patients had a higher injury severity score (29 vs 24), more evidence of shock by arrival lactate (4.6 vs 3.9) and received more post-ED transfusions (8 vs 2 units); all p<0.05. While there was no difference in need for enoxaparin dose adjustment or missed doses, door-to-prophylaxis time was longer in the VTE group (35 vs 25 hours; p=0.009). On multivariate logistic regression analysis, every hour of delay from time of arrival increased the likelihood of VTE by 1.5% (OR 1.015, 95% CI 1.004 to 1.023, p=0.004). Conclusion: This retrospective study of severely injured trauma patients who required emergency release blood products found that increased door-to-prophylaxis time was significantly associated with an increased likelihood of VTE. Chemoprophylaxis initiation is one of the few modifiable risk factors available to combat VTE; therefore, early initiation is paramount.
Similar to door-to-balloon time in treating myocardial infarction and door-to-tPA time in stroke, "door-to-prophylaxis time" should be considered as a hospital metric for prevention of VTE in trauma. Level of evidence: Level III, retrospective study with up to two negative criteria.
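The adjusted effect reported here (OR 1.015 per hour of delay) compounds multiplicatively over the whole door-to-prophylaxis interval. A minimal sketch, assuming for illustration that the per-hour odds ratio applies uniformly (the abstract does not state this explicitly), gives the implied odds ratio for a longer delay:

```python
def cumulative_odds_ratio(per_hour_or: float, delay_hours: float) -> float:
    """Odds ratio implied by a constant per-hour OR applied over a delay."""
    return per_hour_or ** delay_hours

# The study's point estimate: OR 1.015 per hour of delay.
# A 10-hour delay then implies roughly a 16% increase in the odds of VTE.
print(round(cumulative_odds_ratio(1.015, 10), 3))
```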

2.
J Am Geriatr Soc ; 71(6): 1891-1901, 2023 06.
Article in English | MEDLINE | ID: mdl-36912153

ABSTRACT

BACKGROUND: Although 50 years represents middle age among uninfected individuals, studies have shown that persons living with HIV (PWH) begin to demonstrate elevated risk for serious falls and fragility fractures in the sixth decade; the proportions of these outcomes attributable to modifiable factors are unknown. METHODS: We analyzed 21,041 older PWH on antiretroviral therapy (ART) from the Veterans Aging Cohort Study from 01/01/2010 through 09/30/2015. Serious falls were identified by Ecodes and a machine-learning algorithm applied to radiology reports. Fragility fractures (hip, vertebral, and upper arm) were identified using ICD9 codes. Predictors for both models included a serious fall within the past 12 months, body mass index, physiologic frailty (VACS Index 2.0), illicit substance and alcohol use disorders, and measures of multimorbidity and polypharmacy. We separately fit multivariable logistic models to each outcome using generalized estimating equations. From these models, the longitudinal extensions of average attributable fraction (LE-AAF) for modifiable risk factors were estimated. RESULTS: Key risk factors for both outcomes included physiologic frailty (VACS Index 2.0) (serious falls [15%; 95% CI 14%-15%]; fractures [13%; 95% CI 12%-14%]), a serious fall in the past year (serious falls [7%; 95% CI 7%-7%]; fractures [5%; 95% CI 4%-5%]), polypharmacy (serious falls [5%; 95% CI 4%-5%]; fractures [5%; 95% CI 4%-5%]), an opioid prescription in the past month (serious falls [7%; 95% CI 6%-7%]; fractures [9%; 95% CI 8%-9%]), and diagnosis of alcohol use disorder (serious falls [4%; 95% CI 4%-5%]; fractures [8%; 95% CI 7%-8%]). CONCLUSIONS: This study confirms the contributions of risk factors important in the general population to both serious falls and fragility fractures among older PWH. Successful prevention programs for these outcomes should build on existing prevention efforts while including risk factors specific to PWH.
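The study estimates longitudinal extensions of average attributable fractions (LE-AAF), a model-based method. The underlying intuition can be sketched with the much simpler cross-sectional Levin formula; this is a simplification for illustration, not the authors' estimator:

```python
def attributable_fraction(prevalence: float, relative_risk: float) -> float:
    """Levin's population attributable fraction: the share of cases that
    would be removed if the exposure were eliminated from the population."""
    excess = prevalence * (relative_risk - 1.0)
    return excess / (1.0 + excess)

# e.g. a hypothetical exposure present in 30% of the cohort that doubles risk
print(round(attributable_fraction(0.30, 2.0), 3))
```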


Subject(s)
Alcoholism , Fractures, Bone , Frailty , HIV Infections , Humans , Aged , Aged, 80 and over , Cohort Studies , Frailty/epidemiology , Frailty/complications , Fractures, Bone/epidemiology , Fractures, Bone/etiology , Risk Factors , HIV Infections/complications , HIV Infections/drug therapy , HIV Infections/epidemiology
3.
J Acquir Immune Defic Syndr ; 91(2): 168-174, 2022 10 01.
Article in English | MEDLINE | ID: mdl-36094483

ABSTRACT

BACKGROUND: Older (older than 50 years) persons living with HIV (PWH) are at elevated risk for falls. We explored how well our algorithm for predicting falls in a general population of middle-aged Veterans (age 45-65 years) worked among older PWH who use antiretroviral therapy (ART) and whether model fit improved with inclusion of specific ART classes. METHODS: This analysis included 304,951 six-month person-intervals over a 15-year period (2001-2015) contributed by 26,373 older PWH from the Veterans Aging Cohort Study who were taking ART. Serious falls (those falls warranting a visit to a health care provider) were identified by external cause of injury codes and a machine-learning algorithm applied to radiology reports. Potential predictors included a fall within the past 12 months, demographics, body mass index, Veterans Aging Cohort Study Index 2.0 score, substance use, and measures of multimorbidity and polypharmacy. We assessed discrimination and calibration from application of the original coefficients (model derived from middle-aged Veterans) to older PWH and then reassessed by refitting the model using multivariable logistic regression with generalized estimating equations. We also explored whether model performance improved with indicators of ART classes. RESULTS: With application of the original coefficients, discrimination was good (C-statistic 0.725; 95% CI: 0.719 to 0.730) but calibration was poor. After refitting the model, both discrimination (C-statistic 0.732; 95% CI: 0.727 to 0.734) and calibration were good. Including ART classes did not improve model performance. CONCLUSIONS: After refitting the coefficients, the same variables predicted risk of serious falls among older PWH nearly as well as they had among middle-aged Veterans.
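Discrimination here is summarized by the C-statistic, which for a binary outcome reduces to the probability that a randomly chosen case is scored higher than a randomly chosen non-case. A minimal pure-Python sketch of that pairwise computation:

```python
def c_statistic(scores, labels):
    """Concordance: probability that a random case (label 1) outscores a
    random non-case (label 0); tied scores count as half-concordant."""
    cases = [s for s, y in zip(scores, labels) if y == 1]
    controls = [s for s, y in zip(scores, labels) if y == 0]
    pairs = concordant = 0.0
    for c in cases:
        for k in controls:
            pairs += 1
            if c > k:
                concordant += 1
            elif c == k:
                concordant += 0.5
    return concordant / pairs

print(c_statistic([0.9, 0.8, 0.3, 0.2], [1, 0, 1, 0]))  # → 0.75
```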


Subject(s)
HIV Infections , Accidental Falls , Aged , Aged, 80 and over , Aging , Cohort Studies , HIV Infections/complications , HIV Infections/drug therapy , Humans , Middle Aged , Polypharmacy
4.
J Acquir Immune Defic Syndr ; 88(2): 192-196, 2021 10 01.
Article in English | MEDLINE | ID: mdl-34506360

ABSTRACT

BACKGROUND: The extensive research on falls and fragility fractures among persons living with HIV (PWH) has not explored the association between serious falls and subsequent fragility fracture. We explored this association. SETTING: Veterans Aging Cohort Study. METHODS: This analysis included 304,951 6-month person-intervals over a 15-year period (2001-2015) contributed by 26,373 PWH who were 50+ years of age (mean age 55 years) and taking antiretroviral therapy (ART). Serious falls (those falls significant enough to result in a visit to a health care provider) were identified by external cause of injury codes and a machine learning algorithm applied to radiology reports. Fragility fractures (hip, vertebral, and upper arm fractures) were identified using ICD9 codes and were modeled with multivariable logistic regression with generalized estimating equations. RESULTS: After adjustment, serious falls in the previous year were associated with increased risk of fragility fracture [odds ratio (OR) 2.10; 95% confidence interval (CI): 1.83 to 2.41]. The use of integrase inhibitors was the only ART risk factor (OR 1.17; 95% CI: 1.03 to 1.33). Other risk factors included a diagnosis of alcohol use disorder (OR 1.49; 95% CI: 1.31 to 1.70) and having a prescription for an opioid in the previous 6 months (OR 1.40; 95% CI: 1.27 to 1.53). CONCLUSIONS: Serious falls within the past year are strongly associated with fragility fractures among PWH on ART (largely a middle-aged population), much as they are among older adults in the general population.


Subject(s)
Accidental Falls/statistics & numerical data , Antiretroviral Therapy, Highly Active , Fractures, Bone/virology , HIV Infections/drug therapy , HIV Infections/pathology , Veterans/statistics & numerical data , Aged , Cohort Studies , Female , Fractures, Bone/epidemiology , Fractures, Bone/etiology , HIV Infections/epidemiology , Humans , Male , Middle Aged , Osteoporotic Fractures/epidemiology , Osteoporotic Fractures/etiology , Risk Factors , United States/epidemiology
5.
J Am Geriatr Soc ; 68(12): 2847-2854, 2020 12.
Article in English | MEDLINE | ID: mdl-32860222

ABSTRACT

BACKGROUND/OBJECTIVES: Due to high rates of multimorbidity, polypharmacy, and hazardous alcohol and opioid use, middle-aged Veterans are at risk for serious falls (those prompting a visit with a healthcare provider), posing significant risk to their forthcoming geriatric health and quality of life. We developed and validated a predictive model of the 6-month risk of serious falls among middle-aged Veterans. DESIGN: Cohort study. SETTING: Veterans Health Administration (VA). PARTICIPANTS: Veterans, aged 45 to 65 years, who presented for care within the VA between 2012 and 2015 (N = 275,940). EXPOSURES: The exposures of primary interest were substance use (including alcohol and prescription opioid use), multimorbidity, and polypharmacy. Hazardous alcohol use was defined as an Alcohol Use Disorders Identification Test - Consumption (AUDIT-C) score of 3 or greater for women and 4 or greater for men. We used International Classification of Diseases, Ninth Revision (ICD-9), codes to identify alcohol and illicit substance use disorders and identified prescription opioid use from pharmacy fill-refill data. We included counts of chronic medications and of physical and mental health comorbidities. MEASUREMENTS: We identified serious falls using external cause of injury codes and a machine-learning algorithm that identified serious falls in radiology reports. We used multivariable logistic regression with generalized estimating equations to calculate risk. We used an integrated predictiveness curve to identify intervention thresholds. RESULTS: Most of our sample (54%) was aged 60 years or younger. Duration of follow-up was up to 4 years. Veterans who fell were more likely to be female (11% vs 7%) and White (72% vs 68%). The cohort experienced 43,641 serious falls during follow-up. We identified 16 key predictors of serious falls and five interaction terms.
Model performance was enhanced by addition of opioid use, as evidenced by overall category-free net reclassification improvement of 0.32 (P < .001). Discrimination (C-statistic = 0.76) and calibration were excellent for both development and validation data sets. CONCLUSION: We developed and internally validated a model to predict 6-month risk of serious falls among middle-aged Veterans with excellent discrimination and calibration.
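The hazardous-alcohol exposure used above is a simple sex-specific threshold on the AUDIT-C score. A small sketch (the function name and string encoding of sex are hypothetical, chosen for illustration):

```python
def hazardous_audit_c(score: int, sex: str) -> bool:
    """Hazardous alcohol use per the study's cutoffs:
    AUDIT-C >= 3 for women, >= 4 for men."""
    threshold = 3 if sex == "female" else 4
    return score >= threshold

# A score of 3 screens positive for a woman but not for a man.
print(hazardous_audit_c(3, "female"), hazardous_audit_c(3, "male"))
```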


Subject(s)
Accidental Falls/statistics & numerical data , Algorithms , Comorbidity/trends , Polypharmacy , Substance-Related Disorders/epidemiology , Veterans/statistics & numerical data , Body Mass Index , Cohort Studies , Female , Humans , Male , Middle Aged , Quality of Life , Reproducibility of Results , Risk Assessment , Sex Factors , United States , United States Department of Veterans Affairs
6.
Proc Natl Acad Sci U S A ; 116(36): 17867-17873, 2019 09 03.
Article in English | MEDLINE | ID: mdl-31427510

ABSTRACT

Global change drivers (GCDs) are expected to alter community structure and consequently, the services that ecosystems provide. Yet, few experimental investigations have examined effects of GCDs on plant community structure across multiple ecosystem types, and those that do exist present conflicting patterns. In an unprecedented global synthesis of over 100 experiments that manipulated factors linked to GCDs, we show that herbaceous plant community responses depend on experimental manipulation length and number of factors manipulated. We found that plant communities are fairly resistant to experimentally manipulated GCDs in the short term (<10 y). In contrast, long-term (≥10 y) experiments show increasing community divergence of treatments from control conditions. Surprisingly, these community responses occurred with similar frequency across the GCD types manipulated in our database. However, community responses were more common when 3 or more GCDs were simultaneously manipulated, suggesting the emergence of additive or synergistic effects of multiple drivers, particularly over long time periods. In half of the cases, GCD manipulations caused a difference in community composition without a corresponding species richness difference, indicating that species reordering or replacement is an important mechanism of community responses to GCDs and should be given greater consideration when examining consequences of GCDs for the biodiversity-ecosystem function relationship. Human activities are currently driving unparalleled global changes worldwide. Our analyses provide the most comprehensive evidence to date that these human activities may have widespread impacts on plant community composition globally, which will increase in frequency over time and be greater in areas where communities face multiple GCDs simultaneously.


Subject(s)
Biodiversity , Ecosystem , Plants , Bayes Theorem , Climate Change , Human Activities , Humans
7.
J Acquir Immune Defic Syndr ; 82(3): 305-313, 2019 11 01.
Article in English | MEDLINE | ID: mdl-31339866

ABSTRACT

BACKGROUND: Medication classes, polypharmacy, and hazardous alcohol and illicit substance abuse may exhibit stronger associations with serious falls among persons living with HIV (PLWH) than with uninfected comparators. We investigated whether these associations differed by HIV status. SETTING: Veterans Aging Cohort Study. METHODS: We used a nested case-control design. Cases (N = 13,530) were those who fell. Falls were identified by external cause of injury codes and a machine-learning algorithm applied to radiology reports. Cases were matched to controls (N = 67,060) by age, race, sex, HIV status, duration of observation, and baseline date. Risk factors included medication classes, count of unique non-antiretroviral therapy (non-ART) medications, and hazardous alcohol and illicit substance use. We used unconditional logistic regression to evaluate associations. RESULTS: Among PLWH, benzodiazepines [odds ratio (OR) 1.24; 95% confidence interval (CI) 1.08 to 1.40] and muscle relaxants (OR 1.29; 95% CI: 1.08 to 1.46) were associated with serious falls but not among uninfected (P > 0.05). In both groups, key risk factors included non-ART medications (per 5 medications) (OR 1.20, 95% CI: 1.17 to 1.23), illicit substance use/abuse (OR 1.44; 95% CI: 1.34 to 1.55), hazardous alcohol use (OR 1.30; 95% CI: 1.23 to 1.37), and an opioid prescription (OR 1.35; 95% CI: 1.29 to 1.41). CONCLUSION: Benzodiazepines and muscle relaxants were associated with serious falls among PLWH. Non-ART medication count, hazardous alcohol and illicit substance use, and opioid prescriptions were associated with serious falls in both groups. Prevention of serious falls should focus on reducing specific classes and absolute number of medications and both alcohol and illicit substance use.


Subject(s)
HIV Infections/drug therapy , Polypharmacy , Substance-Related Disorders/complications , Analgesics, Opioid , Benzodiazepines/therapeutic use , Case-Control Studies , Cohort Studies , Drug Prescriptions , Female , Humans , Logistic Models , Male , Odds Ratio , Risk Factors
8.
Med Care ; 57 Suppl 6 Suppl 2: S157-S163, 2019 06.
Article in English | MEDLINE | ID: mdl-31095055

ABSTRACT

BACKGROUND: Electronic health records (EHRs) are a rich source of health information; however, social determinants of health such as incarceration, and how they impact health and health care disparities, can be hard to extract. OBJECTIVE: The main objective of this study was to compare the sensitivity and specificity of patient self-report with various methods of identifying incarceration exposure using the EHR. RESEARCH DESIGN: Validation study using multiple data sources and types. SUBJECTS: Participants of the Veterans Aging Cohort Study (VACS), a national observational cohort based on data from the Veterans Health Administration (VHA) EHR that includes all human immunodeficiency virus-infected patients in care (47,805) and uninfected patients (99,060) matched on region, age, race/ethnicity, and sex. MEASURES AND DATA SOURCES: Self-reported incarceration history was compared with: (1) VHA EHR data linked to administrative data from a state Department of Correction (DOC), (2) VHA EHR data linked to administrative data on incarceration from the Centers for Medicare and Medicaid Services (CMS), (3) VHA EHR-specific identifier codes indicative of receipt of VHA incarceration reentry services, and (4) natural language processing (NLP) applied to unstructured text in the VHA EHR. RESULTS: Linking the EHR to DOC data: sensitivity 2.5%, specificity 100%; linking the EHR to CMS data: sensitivity 7.9%, specificity 99.3%; VHA EHR-specific identifier for receipt of reentry services: sensitivity 7.3%, specificity 98.9%; NLP: sensitivity 63.5%, specificity 95.9%. CONCLUSIONS: NLP tools hold promise as a feasible and valid method to identify individuals with exposure to incarceration in the EHR. Future work should expand this approach using a larger body of documents and refine the methods, which may further improve the operating characteristics of this method.
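The operating characteristics reported above come from a 2x2 confusion table against the self-report reference. A minimal sketch; the counts are illustrative only (chosen to reproduce the NLP row's 63.5%/95.9%), not the study's actual data:

```python
def sensitivity_specificity(tp, fp, tn, fn):
    """Operating characteristics of a detection method against a
    reference standard (here, patient self-report of incarceration)."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return sensitivity, specificity

# Hypothetical counts for illustration:
sens, spec = sensitivity_specificity(tp=127, fp=41, tn=959, fn=73)
print(round(sens, 3), round(spec, 3))
```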


Subject(s)
Administrative Claims, Healthcare/statistics & numerical data , Electronic Health Records/statistics & numerical data , Natural Language Processing , Prisoners/statistics & numerical data , Self Report , Veterans/statistics & numerical data , Adult , Cohort Studies , Ethnicity , Female , Humans , Information Storage and Retrieval , Male , Medicare/statistics & numerical data , Middle Aged , Sensitivity and Specificity , United States , United States Department of Veterans Affairs
9.
Article in English | MEDLINE | ID: mdl-30532796

ABSTRACT

BACKGROUND: Physical fitness has been recognized not only as an integrated predictor of the body's functional status, but also as an important marker of health outcomes. The aim of this study was to examine the factors associated with physical fitness among 3-6-year-old children within the Tujia-Nationality settlement in the years 2005, 2010, and 2014. METHODS: Demographic questionnaires and fitness assessments were performed to identify the risk factors for poor physical fitness (PPF) among 3- to 6-year-old children in the years 2005, 2010, and 2014 in the area of southwest Hubei, China. RESULTS: Of the 2128 children, 495 were classified as PPF (23.3%). The percentage of PPF children was 21.7% in 2005, rose to 29.1% in 2010, and decreased to 18.8% in 2014. Furthermore, urban children had a significantly higher risk of PPF than rural children (OR=1.299, P=0.031). Three-year-old children had a 2.15-fold risk of PPF compared with 6-year-old children. Children with less than 0.5 hours of activity time per day had a 1.95-fold risk of PPF compared with those with 1-2 hours of activity time per day. Underweight and overweight/obese children had 2.74-fold and 1.67-fold risks of PPF, respectively, compared with normal-weight children. Children had a 1.97-fold risk of PPF when their father's schooling ceased after middle school and a 1.51-fold risk when their father's schooling ceased after high school. CONCLUSIONS: These results demonstrate that the incidence of PPF among children rose from 2005 to 2010 and then fell from 2010 to 2014 within the Tujia settlement. For children in this area, the risk factors associated with PPF included urban location, younger age, less than 1 hour of activity time per day in kindergarten, underweight or overweight/obesity, low paternal education level, and maternal childbearing age of less than 20 years.

10.
Pharmacoepidemiol Drug Saf ; 27(8): 848-856, 2018 08.
Article in English | MEDLINE | ID: mdl-29896873

ABSTRACT

PURPOSE: To estimate medical device utilization needed to detect safety differences among implantable cardioverter defibrillators (ICDs) generator models and compare these estimates to utilization in practice. METHODS: We conducted repeated sample size estimates to calculate the medical device utilization needed, systematically varying device-specific safety event rate ratios and significance levels while maintaining 80% power, testing 3 average adverse event rates (3.9, 6.1, and 12.6 events per 100 person-years) estimated from the American College of Cardiology's 2006 to 2010 National Cardiovascular Data Registry of ICDs. We then compared with actual medical device utilization. RESULTS: At significance level 0.05 and 80% power, 34% or fewer ICD models accrued sufficient utilization in practice to detect safety differences for rate ratios <1.15 and an average event rate of 12.6 events per 100 person-years. For average event rates of 3.9 and 12.6 events per 100 person-years, 30% and 50% of ICD models, respectively, accrued sufficient utilization for a rate ratio of 1.25, whereas 52% and 67% for a rate ratio of 1.50. Because actual ICD utilization was not uniformly distributed across ICD models, the proportion of individuals receiving any ICD that accrued sufficient utilization in practice was 0% to 21%, 32% to 70%, and 67% to 84% for rate ratios of 1.05, 1.15, and 1.25, respectively, for the range of 3 average adverse event rates. CONCLUSIONS: Small safety differences among ICD generator models are unlikely to be detected through routine surveillance given current ICD utilization in practice, but large safety differences can be detected for most patients at anticipated average adverse event rates.
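The repeated sample size estimates above can be approximated with the standard two-sample Poisson-rate formula for person-time needed per group; this is one conventional approximation for illustration, and the authors' exact method may differ:

```python
def person_years_per_group(base_rate, rate_ratio, z_alpha=1.96, z_beta=0.84):
    """Approximate person-years per device group needed to detect a given
    rate ratio between two Poisson event rates (defaults: two-sided
    alpha = 0.05, 80% power)."""
    r1 = base_rate
    r2 = base_rate * rate_ratio
    return (z_alpha + z_beta) ** 2 * (r1 + r2) / (r1 - r2) ** 2

# 12.6 events per 100 person-years (0.126/person-year), rate ratio 1.15:
print(round(person_years_per_group(0.126, 1.15)))
```

Note how quickly the requirement grows as the rate ratio shrinks toward 1, which is why small safety differences are hard to detect at current utilization.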


Subject(s)
Databases, Factual/statistics & numerical data , Defibrillators, Implantable/statistics & numerical data , Product Surveillance, Postmarketing/statistics & numerical data , Prosthesis Failure , Registries/statistics & numerical data , Cardiac Surgical Procedures/instrumentation , Cardiac Surgical Procedures/statistics & numerical data , Data Interpretation, Statistical , Death, Sudden, Cardiac , Defibrillators, Implantable/adverse effects , Heart Failure/surgery , Humans , Primary Prevention , Product Surveillance, Postmarketing/methods , Prosthesis Implantation/instrumentation , Prosthesis Implantation/statistics & numerical data , Sample Size , United States
11.
BMC Med Res Methodol ; 18(1): 24, 2018 02 26.
Article in English | MEDLINE | ID: mdl-29482517

ABSTRACT

BACKGROUND: Medical practitioners use survival models to explore and understand the relationships between patients' covariates (e.g. clinical and genetic features) and the effectiveness of various treatment options. Standard survival models like the linear Cox proportional hazards model require extensive feature engineering or prior medical knowledge to model treatment interaction at an individual level. While nonlinear survival methods, such as neural networks and survival forests, can inherently model these high-level interaction terms, they have yet to be shown as effective treatment recommender systems. METHODS: We introduce DeepSurv, a Cox proportional hazards deep neural network and state-of-the-art survival method for modeling interactions between a patient's covariates and treatment effectiveness in order to provide personalized treatment recommendations. RESULTS: We perform a number of experiments training DeepSurv on simulated and real survival data. We demonstrate that DeepSurv performs as well as or better than other state-of-the-art survival models and validate that DeepSurv successfully models increasingly complex relationships between a patient's covariates and their risk of failure. We then show how DeepSurv models the relationship between a patient's features and effectiveness of different treatment options to show how DeepSurv can be used to provide individual treatment recommendations. Finally, we train DeepSurv on real clinical studies to demonstrate how its personalized treatment recommendations would increase the survival time of a set of patients. CONCLUSIONS: The predictive and modeling capabilities of DeepSurv will enable medical researchers to use deep neural networks as a tool in their exploration, understanding, and prediction of the effects of a patient's characteristics on their risk of failure.
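A Cox proportional hazards network like DeepSurv is trained by minimizing the negative log Cox partial likelihood over the network's risk scores. A minimal sketch of that loss (Breslow-style risk sets, ignoring tie handling and regularization, so a simplification of the published method):

```python
import math

def neg_log_partial_likelihood(risk_scores, times, events):
    """Negative log Cox partial likelihood.
    risk_scores: model outputs h(x) per subject; times: observed times;
    events: 1 if the failure was observed, 0 if censored."""
    order = sorted(range(len(times)), key=lambda i: times[i])
    loss = 0.0
    for idx, i in enumerate(order):
        if events[i] == 1:
            # Risk set: subjects still under observation at times[i].
            risk_set = order[idx:]
            log_sum = math.log(sum(math.exp(risk_scores[j]) for j in risk_set))
            loss -= risk_scores[i] - log_sum
    return loss

# One observed event among three subjects with equal scores: loss = ln(3).
print(neg_log_partial_likelihood([0.0, 0.0, 0.0], [1, 2, 3], [1, 0, 0]))
```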


Subject(s)
Algorithms , Neural Networks, Computer , Outcome Assessment, Health Care/methods , Proportional Hazards Models , Humans , Kaplan-Meier Estimate , Outcome Assessment, Health Care/statistics & numerical data , Precision Medicine/methods
12.
Med Decis Making ; 38(1): 34-43, 2018 01.
Article in English | MEDLINE | ID: mdl-28853340

ABSTRACT

OBJECTIVE: To determine 1) whether patients have distinct affective reaction patterns to medication information, and 2) whether there is an association between affective reaction patterns and willingness to take medication. METHODS: We measured affect in real time as subjects listened to a description of benefits and side effects for a hypothetical new medication. Subjects moved a dial on a handheld response system to indicate how they were feeling, from "Very Good" to "Very Bad". Patterns of reactions were identified using a cluster-analytic statistical approach for multiple time series. Subjects subsequently rated their willingness to take the medication on a 7-point Likert scale. Associations between subjects' willingness ratings and affect patterns were analyzed. Additional analyses were performed to explore the role of race/ethnicity regarding these associations. RESULTS: Clusters of affective reactions emerged that could be classified into 4 patterns: "Moderate" positive reactions to benefits and negative reactions to side effects (n = 186), "Pronounced" positive reactions to benefits and negative reactions to side effects (n = 110), feeling consistently "Good" (n = 58), and feeling consistently close to "Neutral" (n = 33). Mean (standard error) willingness to take the medication was greater among subjects feeling consistently Good, 4.72 (0.20), compared with those in the Moderate, 3.76 (0.11), Pronounced, 3.68 (0.14), and Neutral, 3.62 (0.26), groups. Black subjects with a Pronounced pattern were less willing to take the medication compared with both Hispanic (P = 0.0270) and White subjects (P = 0.0001) with a Pronounced pattern. CONCLUSION: Patients' affective reactions to information were clustered into specific patterns. Reactions varied by race/ethnicity and were associated with treatment willingness. Ultimately, a better understanding of how patients react to information may help providers develop improved methods of communication.


Subject(s)
Affect , Patient Acceptance of Health Care/psychology , Prescription Drugs/administration & dosage , Adult , Aged , Female , Humans , Male , Middle Aged , Prescription Drugs/adverse effects , Racial Groups , Risk Assessment , Socioeconomic Factors
13.
Data Brief ; 14: 515-523, 2017 Oct.
Article in English | MEDLINE | ID: mdl-28856182

ABSTRACT

Conifer control in sagebrush steppe of the western United States causes various levels of site disturbance, influencing vegetation recovery and resource availability. The data set presented in this article includes growing-season availability of soil micronutrients and levels of total soil carbon, organic matter, and nitrogen spanning a six-year period following reduction of western juniper (Juniperus occidentalis spp. occidentalis) woodlands by mechanical cutting and prescribed fire in southeast Oregon. These data can be useful to further evaluate the impacts of conifer woodland reduction on soil resources in sagebrush steppe plant communities.

14.
Med Devices (Auckl) ; 10: 165-188, 2017.
Article in English | MEDLINE | ID: mdl-28860874

ABSTRACT

BACKGROUND: Machine learning methods may complement traditional analytic methods for medical device surveillance. METHODS AND RESULTS: Using data from the National Cardiovascular Data Registry for implantable cardioverter-defibrillators (ICDs) linked to Medicare administrative claims for longitudinal follow-up, we applied three statistical approaches to safety-signal detection for commonly used dual-chamber ICDs, drawing on two propensity score (PS) models: one specified by subject-matter experts (PS-SME) and the other by machine learning-based selection (PS-ML). The first approach used PS-SME and cumulative incidence (time-to-event), the second used PS-SME and cumulative risk (Data Extraction and Longitudinal Trend Analysis [DELTA]), and the third used PS-ML and cumulative risk (embedded feature selection). Safety-signal surveillance was conducted for eleven dual-chamber ICD models implanted at least 2,000 times over 3 years. Between 2006 and 2010, there were 71,948 Medicare fee-for-service beneficiaries who received dual-chamber ICDs. Cumulative device-specific unadjusted 3-year event rates varied for three surveyed safety signals: death from any cause, 12.8%-20.9%; nonfatal ICD-related adverse events, 19.3%-26.3%; and death from any cause or nonfatal ICD-related adverse event, 27.1%-37.6%. Agreement on safety signals detected/not detected was 90.9% between the time-to-event and DELTA approaches (360 of 396, κ=0.068), 91.7% between the time-to-event and embedded feature-selection approaches (363 of 396, κ=-0.028), and 88.1% between the DELTA and embedded feature-selection approaches (349 of 396, κ=-0.042). CONCLUSION: Three statistical approaches, including one machine learning method, identified important safety signals, but without exact agreement. Ensemble methods may be needed to detect all safety signals for further evaluation during medical device surveillance.
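The kappa values quoted alongside raw agreement can be recovered from a 2x2 detected/not-detected table for each pair of approaches. The counts below are hypothetical, chosen only to illustrate how roughly 91% raw agreement can still yield a low kappa when detections are rare; they are not the study's data:

```python
def cohens_kappa(a, b, c, d):
    """Cohen's kappa for two methods making a binary detection call.
    a: both detect, b: method 1 only, c: method 2 only, d: neither."""
    n = a + b + c + d
    observed = (a + d) / n                               # raw agreement
    expected = ((a + b) * (a + c) + (c + d) * (b + d)) / n ** 2
    return (observed - expected) / (1 - expected)

# Hypothetical counts over 396 model-signal comparisons:
# 360/396 = 90.9% raw agreement, yet kappa is only ~0.13.
print(round(cohens_kappa(4, 18, 18, 356), 3))
```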

15.
Epilepsy Behav ; 73: 31-35, 2017 08.
Article in English | MEDLINE | ID: mdl-28605631

ABSTRACT

OBJECTIVE: The study sought to quantify coordination of epilepsy care, over time, between neurologists and other health care providers using social network analysis (SNA). METHODS: The Veterans Health Administration (VA) instituted an Epilepsy Center of Excellence (ECOE) model in 2008 to enhance care coordination between neurologists and other health care providers. Provider networks in the 16 VA ECOE facilities (hub sites) were compared to a subset of 33 formally affiliated VA facilities (consortium sites) and 14 unaffiliated VA facilities. The number of connections between neurologists and each provider (node degree) was measured by shared epilepsy patients and tallied to generate facility-level estimates, separately within and across facilities. Mixed models were used to compare change in facility-level node degree over time across the three facility types, adjusted for number of providers per facility. RESULTS: Over the period 2000-2013, epilepsy care coordination both within and across facilities significantly increased. These increases were seen to a relatively equal degree in all three facility types (hub, consortium, and unaffiliated). The increase in connectivity was more dramatic across facilities than within them. CONCLUSION: Establishment of the ECOE hub-and-spoke model contributed to an increase in epilepsy care coordination both within and across facilities from 2000 to 2013, but there was substantial variation across facilities. SNA is a tool that may help measure coordination of specialty care.
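Node degree as defined here (providers connected to a neurologist through shared epilepsy patients) is a projection of a bipartite patient-provider network. A small sketch with hypothetical visit tuples; the data layout is assumed for illustration, not taken from the paper:

```python
from collections import defaultdict

def neurologist_degrees(visits):
    """visits: iterable of (patient, provider, is_neurologist) tuples.
    A neurologist is linked to every other provider who shares at least
    one epilepsy patient; returns node degree per neurologist."""
    by_patient = defaultdict(set)
    neurologists = set()
    for patient, provider, is_neuro in visits:
        by_patient[patient].add(provider)
        if is_neuro:
            neurologists.add(provider)
    links = defaultdict(set)
    for providers in by_patient.values():
        for n in providers & neurologists:
            links[n] |= providers - {n}
    return {n: len(links[n]) for n in neurologists}

visits = [
    ("p1", "neuroA", True), ("p1", "pcp1", False),
    ("p2", "neuroA", True), ("p2", "pcp2", False),
    ("p3", "neuroA", True), ("p3", "pcp1", False),
]
print(neurologist_degrees(visits))  # → {'neuroA': 2}
```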


Subject(s)
Epilepsy/therapy , Health Personnel/statistics & numerical data , Health Services/statistics & numerical data , Neurologists/statistics & numerical data , Social Networking , Humans , United States , United States Department of Veterans Affairs
16.
J Am Med Inform Assoc ; 23(e1): e113-7, 2016 Apr.
Article in English | MEDLINE | ID: mdl-26567329

ABSTRACT

OBJECTIVE: To identify patients in a human immunodeficiency virus (HIV) study cohort who have fallen by applying supervised machine learning methods to radiology reports of the cohort. METHODS: We used the Veterans Aging Cohort Study Virtual Cohort (VACS-VC), an electronic health record-based cohort of 146 530 veterans for whom radiology reports were available (N=2 977 739). We created a reference standard of radiology reports, represented each report by a feature set of words and Unified Medical Language System concepts, and then developed several support vector machine (SVM) classifiers for falls. We compared mutual information (MI) ranking and embedded feature selection approaches. The SVM classifier with MI feature selection was chosen to classify all radiology reports in VACS-VC. RESULTS: Our SVM classifier with MI feature selection achieved an area under the curve score of 97.04 on the test set. When applied to all the radiology reports in VACS-VC, 80 416 of these reports were classified as positive for a fall. Of these, 11 484 were associated with a fall-related external cause of injury code (E-code) and 68 932 were not, corresponding to 29 280 patients with potential fall-related injuries who could not have been found using E-codes. DISCUSSION: Feature selection was crucial to improving the classifier's performance. Feature selection with MI allowed us to select the number of discriminative features to use for classification, in contrast to the embedded feature selection method, in which the number of features is chosen automatically. CONCLUSION: Machine learning is an effective method of identifying patients who have suffered a fall. The development of this classifier supplements the clinical researcher's toolkit and reduces dependence on under-coded structured electronic health record data.
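Mutual-information feature ranking, used above to pick discriminative words for the fall classifier, can be sketched directly from the definition: score each word by the MI between its presence/absence and the binary label, then keep the top-ranked features. The toy reports and words below are invented; this is a sketch of MI ranking, not the VACS-VC pipeline.

```python
import math

def mi_rank(docs, labels):
    """Rank word features by mutual information with a binary label,
    as in MI-based feature selection for a text classifier.
    `docs` is a list of word sets, `labels` a parallel list of 0/1.
    Toy sketch only; real pipelines use UMLS concepts as well."""
    n = len(docs)
    scores = {}
    for w in {w for d in docs for w in d}:
        mi = 0.0
        for present in (0, 1):          # word absent / present
            for y in (0, 1):            # label negative / positive
                joint = sum((w in d) == present and l == y
                            for d, l in zip(docs, labels)) / n
                px = sum((w in d) == present for d in docs) / n
                py = sum(l == y for l in labels) / n
                if joint > 0:
                    mi += joint * math.log2(joint / (px * py))
        scores[w] = mi
    return sorted(scores, key=scores.get, reverse=True)

docs = [{"fell", "hip"}, {"fell", "fracture"},
        {"chest", "xray"}, {"chest", "clear"}]
labels = [1, 1, 0, 0]
top = mi_rank(docs, labels)
print(sorted(top[:2]))  # → ['chest', 'fell'] (both perfectly informative)
```

As the abstract notes, ranking makes the number of retained features an explicit tuning knob, unlike embedded feature selection where the model fixes it automatically.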


Subject(s)
Accidental Falls , Radiology Information Systems/classification , Support Vector Machine , Area Under Curve , Cohort Studies , Electronic Health Records , HIV Infections , Humans , Unified Medical Language System , United States , United States Department of Veterans Affairs , Veterans
17.
PLoS One ; 10(3): e0122439, 2015.
Article in English | MEDLINE | ID: mdl-25775124

ABSTRACT

BACKGROUND: The stromal cell derived factor (SDF)-1/chemokine receptor (CXCR)-4 signaling pathway plays a key role in lung cancer metastasis and is a molecular target for therapy. In the present study we investigated whether interleukin (IL)-24 can inhibit the SDF-1/CXCR4 axis and suppress lung cancer cell migration and invasion in vitro, and we assessed the efficacy of IL-24 in combination with CXCR4 antagonists. METHODS: Human H1299, A549, H460 and HCC827 lung cancer cell lines were used. The H1299 cell line, stably transfected with a doxycycline-inducible plasmid expression vector carrying the human IL-24 cDNA, was used to determine the inhibitory effects of IL-24 on the SDF-1/CXCR4 axis. H1299 and A549 cell lines were used in transient transfection studies. The inhibitory effects of IL-24 on SDF-1/CXCR4 and its downstream targets were analyzed by quantitative RT-PCR, western blot, luciferase reporter assay, flow cytometry and immunocytochemistry. Functional studies included cell migration and invasion assays. PRINCIPAL FINDINGS: Endogenous CXCR4 protein expression levels varied among the four human lung cancer cell lines. Doxycycline-induced IL-24 expression in the H1299-IL24 cell line reduced CXCR4 mRNA and protein expression. IL-24 post-transcriptionally regulated CXCR4 mRNA expression by decreasing the half-life of CXCR4 mRNA (>40%). Functional studies showed that IL-24 inhibited tumor cell migration and invasion concomitant with reduction in CXCR4 and its downstream targets (pAKTS473, pmTORS2448, pPRAS40T246 and HIF-1α). Additionally, IL-24 inhibited tumor cell migration both in the presence and absence of the CXCR4 agonist SDF-1. Finally, IL-24 combined with CXCR4 inhibitors (AMD3100, SJA5) or with CXCR4 siRNA showed enhanced inhibitory activity on tumor cell migration.
CONCLUSIONS: IL-24 disrupts the SDF-1/CXCR4 signaling pathway and inhibits lung tumor cell migration and invasion. Additionally, IL-24 combined with CXCR4 inhibitors exhibited enhanced anti-metastatic activity, making this combination an attractive therapeutic strategy against lung metastasis.


Subject(s)
Cell Movement/drug effects , Chemokine CXCL12/metabolism , Interleukins/pharmacology , Receptors, CXCR4/metabolism , Signal Transduction/drug effects , Benzylamines , Carcinoma, Non-Small-Cell Lung/genetics , Carcinoma, Non-Small-Cell Lung/metabolism , Carcinoma, Non-Small-Cell Lung/pathology , Cell Line, Tumor , Cell Movement/genetics , Cyclams , Drug Synergism , Gene Expression , Gene Expression Regulation, Neoplastic , Heterocyclic Compounds/pharmacology , Humans , Interleukins/genetics , Lung Neoplasms/genetics , Lung Neoplasms/metabolism , Lung Neoplasms/pathology , Proto-Oncogene Proteins c-akt/genetics , Proto-Oncogene Proteins c-akt/metabolism , RNA, Small Interfering , Receptors, CXCR4/antagonists & inhibitors , Receptors, CXCR4/genetics , TOR Serine-Threonine Kinases/metabolism
18.
Environ Manage ; 52(3): 553-66, 2013 Sep.
Article in English | MEDLINE | ID: mdl-23811771

ABSTRACT

The expansion of piñon-juniper woodlands over the past 100 years in the western United States has prompted large-scale efforts to kill trees and recover sagebrush steppe rangelands. Evaluating vegetation recovery following woodland control is important for developing best management practices. In this study, we compared two fuel reduction treatments and a cut-and-leave (CUT) treatment used to control western juniper (Juniperus occidentalis ssp. occidentalis Hook.) of the northwestern United States. Treatments were: CUT, cut-and-broadcast burn (BURN), and cut-pile-and-burn the pile (PILE). A randomized complete block design was used with five replicates of each treatment located in a curl leaf mahogany (Cercocarpus ledifolius Nutt. ex Torr. & A. Gray)/mountain big sagebrush (Artemisia tridentata Nutt. ssp. vaseyana (Rydb.) Beetle)/Idaho fescue (Festuca idahoensis Elmer) association. In 2010, 4 years after tree control, cover of perennial grasses [Sandberg's bluegrass (Poa secunda J. Pres) and large bunchgrasses] was about 4 and 5 % lower, respectively, in the BURN treatment (7.1 ± 0.6 %) than in the PILE (11.4 ± 2.3 %) and CUT (12.4 ± 1.7 %) treatments (P < 0.0015). In 2010, cover of invasive cheatgrass (Bromus tectorum L.) was greatest in the BURN treatment (6.3 ± 1.0 %), exceeding the PILE and CUT treatments by 50 and 100 %, respectively. However, the increase in perennial bunchgrass density and cover, despite cheatgrass in the BURN treatment, makes it unlikely that cheatgrass will persist as a major understory component. In the CUT treatment, mahogany cover increased 12.5 % and density increased from 172 ± 25 to 404 ± 123 trees/ha. Burning killed most or all of the adult mahogany, and mahogany recovery consisted of 100 and 67 % seedlings in the PILE and BURN treatments, respectively. After treatment, juniper presence from untreated small trees (<1 m tall; PILE and CUT treatments) and seedling emergence (all treatments) represented 25-33 % of pre-treatment tree density. To maintain recovery of herbaceous, shrub, and mahogany species, additional control of reestablished juniper will be necessary.


Subject(s)
Agriculture , Ecosystem , Fires , Juniperus/growth & development , Oregon , Random Allocation , Wood/growth & development
19.
Environ Manage ; 47(3): 468-81, 2011 Mar.
Article in English | MEDLINE | ID: mdl-21344252

ABSTRACT

Pinus-Juniperus L. (piñon-juniper) woodlands of the western United States have expanded in area nearly 10-fold since the late 1800s. Juniperus occidentalis ssp. occidentalis Hook. (western juniper) dominance in sagebrush steppe has several negative consequences, including reductions in herbaceous production and diversity, decreased wildlife habitat, and higher erosion and runoff potentials. Prescribed fire and mechanical tree removal are the main methods used to control J. occidentalis and restore sagebrush steppe. However, mature woodlands are difficult to burn by prescription because they lack understory fuels. We evaluated partial cutting of the woodlands (cutting 25-50% of the trees) to increase surface fuels, followed by prescribed fire, in late-successional J. occidentalis woodlands of southwest Idaho to assess understory recovery. The study was conducted in two different plant associations and evaluated what percentage of the woodland required preparatory cutting to eliminate remaining J. occidentalis by prescribed fire, determined the impacts of fire on understory species, and examined early post-fire successional dynamics. The study demonstrated that late-successional J. occidentalis woodlands can be burned after pre-cutting only a portion of the trees. Early succession in the cut-and-burn treatments was dominated by native annual and perennial forbs, in part because of high mortality of perennial bunchgrasses. By the third year after fire, the number of establishing perennial grass seedlings indicated that both associations would achieve full herbaceous recovery. Cutting-prescribed fire combinations are an effective means of controlling encroaching late-successional J. occidentalis and restoring herbaceous plant communities. However, land managers should recognize the potential problems associated with cutting-prescribed fire applications when invasive weeds are present.


Subject(s)
Conservation of Natural Resources/methods , Fires , Forestry/methods , Juniperus/growth & development , Environment , Plant Weeds/growth & development
20.
Environ Manage ; 44(1): 84-92, 2009 Jul.
Article in English | MEDLINE | ID: mdl-19159967

ABSTRACT

Mowing is commonly applied to Artemisia tridentata ssp. wyomingensis (Beetle & A. Young) S.L. Welsh (Wyoming big sagebrush) plant communities to improve wildlife habitat, increase forage production for livestock, and create fuel breaks for fire suppression. However, information detailing the influence of mowing on winter habitat for wildlife is lacking. This information is crucial because many wildlife species depend on A. tridentata ssp. wyomingensis plant communities for winter habitat and consume significant quantities of Artemisia during this time. Furthermore, information describing the recovery of A. tridentata ssp. wyomingensis after mowing and the impacts of mowing on stand structure is generally limited. Stand characteristics and Artemisia leaf tissue crude protein (CP), acid detergent fiber (ADF), and neutral detergent fiber (NDF) concentrations were measured in midwinter on 0-, 2-, 4-, and 6-year-old fall-applied mechanical (mowed at 20 cm height) treatments and compared to adjacent untreated (control) areas. Mowing decreased Artemisia cover, density, canopy volume, canopy elliptical area, and height relative to the control (P < 0.05), but all characteristics were recovering (P < 0.05). Mowing A. tridentata ssp. wyomingensis plant communities slightly increases the nutritional quality of Artemisia leaves (P < 0.05), but it simultaneously reduces Artemisia structural characteristics for up to 20 years. Because of this large, potentially 20-year reduction in A. tridentata ssp. wyomingensis following mowing, mowing should not be applied in winter habitat of Artemisia facultative and obligate wildlife. Considering the decline in A. tridentata ssp. wyomingensis-dominated landscapes, we caution against mowing these communities.


Subject(s)
Artemisia/growth & development , Ecosystem , Animals , Artemisia/chemistry , Cellulose/analysis , Cold Climate , Food , Food Chain , Lignin/analysis , Plant Leaves/chemistry , Plant Leaves/growth & development , Plant Proteins/analysis , Seasons