Results 1 - 20 of 90
1.
PLoS Biol ; 21(3): e3001879, 2023 03.
Article in English | MEDLINE | ID: mdl-36947547

ABSTRACT

Bacteria that live inside the cells of insect hosts (endosymbionts) can alter the reproduction of their hosts, including killing male offspring (male killing, MK). MK has been described in only a few insects, but this may reflect the challenges of detecting MK rather than its rarity. Here, we identify MK Wolbachia at low frequency (around 4%) in natural populations of Drosophila pseudotakahashii. MK Wolbachia had a stable density and maternal transmission during laboratory culture, but the MK phenotype, which manifested mainly at the larval stage, was lost rapidly. MK Wolbachia occurred alongside a second Wolbachia strain expressing a different reproductive manipulation, cytoplasmic incompatibility (CI). A genomic analysis highlighted Wolbachia regions that diverged between the two strains, involving 17 genes, and homologs of the wmk and cif genes implicated in MK and CI were identified in the Wolbachia assembly. Doubly infected males induced CI with uninfected females but not with females singly infected with CI-causing Wolbachia. A rapidly spreading dominant nuclear suppressor genetic element affecting MK was identified through backcrossing and subsequent analysis with ddRAD SNPs of the D. pseudotakahashii genome. These findings highlight the complexity of the nuclear and microbial components affecting MK endosymbiont detection and dynamics in populations, and the challenges of connecting endosymbionts to the host phenotypes they affect.


Subject(s)
Wolbachia, Animals, Male, Wolbachia/genetics, Reproduction, Drosophila/genetics, Phenotype, Insecta, Symbiosis
2.
Blood Cells Mol Dis ; 98: 102699, 2023 01.
Article in English | MEDLINE | ID: mdl-36027791

ABSTRACT

Elevated levels of circulating cell-free hemoglobin (CFH) are an integral feature of several clinical conditions including sickle cell anemia, sepsis, hemodialysis and cardiopulmonary bypass. Oxidized (Fe3+, ferric) hemoglobin contributes to the pathophysiology of these disease states and is therefore widely studied in experimental models, many of which use commercially sourced CFH. In this study, we treated human endothelial cells with commercially sourced ferric hemoglobin and observed the appearance of dense cytoplasmic aggregates (CAgg) over time. These CAgg were intensely autofluorescent, altered intracellular structures (such as mitochondria), formed in multiple cell types and with different media composition, and formed regardless of the presence or absence of cells. An in-depth chemical analysis of these CAgg revealed that they contain inorganic components and are not pure hemoglobin. To oxidize freshly isolated hemoglobin without addition of an oxidizing agent, we developed a novel method to convert ferrous CFH to ferric CFH using ultraviolet light without the need for additional redox agents. Unlike commercial ferric hemoglobin, treatment of cells with the fresh ferric hemoglobin did not lead to CAgg formation. These studies suggest that commercially sourced CFH may contain stabilizers and additives which contribute to CAgg formation.


Subject(s)
Endothelial Cells, Ultraviolet Rays, Humans, Endothelial Cells/metabolism, Hemoglobins/metabolism, Oxidation-Reduction, Iron/metabolism
3.
J Emerg Med ; 65(3): e209-e220, 2023 09.
Article in English | MEDLINE | ID: mdl-37635036

ABSTRACT

BACKGROUND: Cardiac arrest occurs in approximately 350,000 patients outside the hospital and approximately 30,000 patients in the emergency department (ED) annually in the United States. When return of spontaneous circulation (ROSC) is achieved, hypotension is a common complication. However, optimal dosing of vasopressors is not clear. OBJECTIVE: The objective of this study was to determine if initial vasopressor dosing was associated with cardiac re-arrest in patients after ROSC. METHODS: This was a retrospective, single-center analysis of adult patients experiencing cardiac arrest prior to arrival or within the ED. Patients were assigned to one of four groups based on starting dose of vasopressor: low dose (LD; < 0.25 µg/kg/min), medium dose (MD; 0.25-0.49 µg/kg/min), high dose (HD; 0.5-0.99 µg/kg/min), and very high dose (VHD; ≥ 1 µg/kg/min). Data collection was performed primarily via manual chart review of medical records. The primary outcome was incidence of cardiac re-arrest within 1 h of vasopressor initiation. Multivariate logistic regression analysis was conducted to identify any covariates strongly associated with the primary outcome. RESULTS: No difference in cardiac re-arrest incidence was noted between groups. The VHD group was significantly more likely to require a second vasopressor (p = 0.003). The HD group had lower survival rates to hospital discharge compared with the LD and MD groups (p = 0.0033 and p = 0.0147). In the multivariate regression, longer duration of pre-vasopressor re-arrests and hyperkalemic cardiac arrest etiology were significant predictors of cardiac re-arrest after vasopressor initiation. CONCLUSIONS: Initial vasopressor dosing was not found to be associated with risk of cardiac re-arrest or, conversely, risk of adverse events.
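The four dosing strata reduce to a simple categorization on the starting infusion rate; a minimal sketch in Python (the function name is ours, and we assume rates are already expressed in the study's µg/kg/min units):

```python
def dose_group(rate_ug_kg_min: float) -> str:
    """Map an initial vasopressor rate (µg/kg/min) to the study's strata:
    LD < 0.25, MD 0.25-0.49, HD 0.5-0.99, VHD >= 1."""
    if rate_ug_kg_min < 0.25:
        return "LD"
    elif rate_ug_kg_min < 0.5:
        return "MD"
    elif rate_ug_kg_min < 1.0:
        return "HD"
    return "VHD"
```

For example, a patient started at 0.3 µg/kg/min falls into the medium-dose group.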


Subject(s)
Heart Arrest, Return of Spontaneous Circulation, Adult, Humans, Retrospective Studies, Heart, Heart Arrest/drug therapy, Emergency Service, Hospital, Vasoconstrictor Agents/pharmacology, Vasoconstrictor Agents/therapeutic use
4.
Clin Infect Dis ; 72(Suppl 1): S68-S73, 2021 01 29.
Article in English | MEDLINE | ID: mdl-33512521

ABSTRACT

BACKGROUND: Patients with methicillin-resistant Staphylococcus aureus bloodstream infections (MRSA BSI) usually receive initial treatment with vancomycin but may be switched to daptomycin for definitive therapy, especially if treatment failure is suspected. Our objective was to evaluate the effectiveness of switching from vancomycin to daptomycin compared with remaining on vancomycin among patients with MRSA BSI. METHODS: Patients admitted to 124 Veterans Affairs Hospitals who experienced MRSA BSI and were treated with vancomycin during 2007-2014 were included. The association between switching to daptomycin and 30-day mortality was assessed using Cox regression models. Separate models were created for switching to daptomycin any time during the first hospitalization and for switching within 3 days of receiving vancomycin. RESULTS: In total, 7411 patients received vancomycin for MRSA BSI. Of these, 606 (8.2%) switched from vancomycin to daptomycin during the first hospitalization, and 108 (1.5%) switched within 3 days of starting vancomycin. In the multivariable analysis, switching to daptomycin within 3 days was significantly associated with lower 30-day mortality (hazard ratio [HR] = 0.48; 95% confidence interval [CI]: .25, .92). However, switching to daptomycin at any time during the first hospitalization was not significantly associated with 30-day mortality (HR: 0.87; 95% CI: .69, 1.09). CONCLUSIONS: Switching to daptomycin within 3 days of initial receipt of vancomycin is associated with lower 30-day mortality among patients with MRSA BSI. This benefit was not seen when the switch occurred later. Future studies should prospectively assess the benefit of early switching from vancomycin to other anti-MRSA antibiotics.


Subject(s)
Bacteremia, Daptomycin, Methicillin-Resistant Staphylococcus aureus, Staphylococcal Infections, Anti-Bacterial Agents/therapeutic use, Bacteremia/drug therapy, Daptomycin/therapeutic use, Humans, Microbial Sensitivity Tests, Retrospective Studies, Staphylococcal Infections/drug therapy, Treatment Outcome, Vancomycin/therapeutic use
5.
Mol Phylogenet Evol ; 158: 107061, 2021 05.
Article in English | MEDLINE | ID: mdl-33387647

ABSTRACT

The Drosophila montium species group is a clade of 94 named species, closely related to the model species D. melanogaster. The montium species group is distributed over a broad geographic range throughout Asia, Africa, and Australasia. Species of this group possess a wide range of morphologies, mating behaviors, and endosymbiont associations, making this clade useful for comparative analyses. We use genomic data from 42 available species to estimate the phylogeny and relative divergence times within the montium species group, and its relative divergence time from D. melanogaster. To assess the robustness of our phylogenetic inferences, we use 3 non-overlapping sets of 20 single-copy coding sequences and analyze all 60 genes with both Bayesian and maximum likelihood methods. Our analyses support monophyly of the group. Apart from the uncertain placement of a single species, D. baimaii, our analyses also support the monophyly of all seven subgroups proposed within the montium group. Our phylograms and relative chronograms provide a highly resolved species tree, with discordance restricted to estimates of relatively short branches deep in the tree. In contrast, age estimates for the montium crown group, relative to its divergence from D. melanogaster, depend critically on prior assumptions concerning variation in rates of molecular evolution across branches, and hence have not been reliably determined. We discuss methodological issues that limit phylogenetic resolution - even when complete genome sequences are available - as well as the utility of the current phylogeny for understanding the evolutionary and biogeographic history of this clade.


Subject(s)
Drosophila/classification, Animals, Bayes Theorem, DNA/chemistry, DNA/isolation & purification, DNA/metabolism, Drosophila/genetics, Drosophila Proteins/classification, Drosophila Proteins/genetics, Drosophila melanogaster/classification, Drosophila melanogaster/genetics, Evolution, Molecular, Phylogeny, Sequence Analysis, DNA
6.
Med Care ; 59(8): 727-735, 2021 08 01.
Article in English | MEDLINE | ID: mdl-33900271

ABSTRACT

BACKGROUND: With human immunodeficiency virus (HIV) now managed as a chronic disease, health care has had to change and expand to include management of other critical comorbidities. We sought to understand how variation in the organization, structure, and processes of HIV and comorbidity care, based on patient-centered medical home (PCMH) principles, was related to care quality for Veterans with HIV. RESEARCH DESIGN: Qualitative site visits were conducted at a purposive sample of 8 Department of Veterans Affairs Medical Centers varying in care quality and outcomes for HIV and common comorbidities. Site visits entailed patient interviews (n=60); HIV care team interviews (n=60); direct observation of clinic processes and team interactions (n=22); and direct observation of patient-provider clinical encounters (n=45). Data were analyzed using a priori and emergent codes, construction of site syntheses, and comparison of sites with varying levels of quality. RESULTS: Sites highest and lowest in both HIV and comorbidity care quality demonstrated clear differences in provision of PCMH-principled care. The highest site provided greater team-based, comprehensive, patient-centered, and data-driven care and engaged in continuous improvement. Sites with higher HIV care quality attended more to psychosocial needs. Sites that had consistent processes for comorbidity care, whether in HIV or primary care clinics, had higher quality of comorbidity care. CONCLUSIONS: Provision of high-quality HIV care and provision of high-quality comorbidity care require different care structures and processes. Providing both requires a focus on care aligned with PCMH principles, integration of psychosocial needs into care, and explicit, consistent approaches to comorbidity management.


Subject(s)
Comorbidity, HIV Infections/therapy, Patient-Centered Care/organization & administration, Quality of Health Care/organization & administration, Ambulatory Care Facilities/standards, Humans, Patient Care Team, Patient Satisfaction, Patient-Centered Care/methods, Qualitative Research, Quality of Health Care/statistics & numerical data, United States, United States Department of Veterans Affairs, Veterans
7.
Chem Rev ; 119(2): 1456-1518, 2019 01 23.
Article in English | MEDLINE | ID: mdl-30511833

ABSTRACT

Infectious diseases claim millions of lives each year. Robust and accurate diagnostics are essential tools for identifying those who are at risk and in need of treatment in low-resource settings. Inorganic complexes and metal-based nanomaterials continue to drive the development of diagnostic platforms and strategies that enable infectious disease detection in low-resource settings. In this review, we highlight works from the past 20 years in which inorganic chemistry and nanotechnology were implemented in each of the core components that make up a diagnostic test. First, we present how inorganic biomarkers and their properties are leveraged for infectious disease detection. In the following section, we detail metal-based technologies that have been employed for sample preparation and biomarker isolation from sample matrices. We then describe how inorganic- and nanomaterial-based probes have been utilized in point-of-care diagnostics for signal generation. The following section discusses instrumentation for signal readout in resource-limited settings. Next, we highlight the detection of nucleic acids at the point of care as an emerging application of inorganic chemistry. Lastly, we consider the challenges that remain for translation of the aforementioned diagnostic platforms to low-resource settings.


Subject(s)
Communicable Diseases/diagnosis, Coordination Complexes/chemistry, Metals/chemistry, Nanostructures/chemistry, Biomarkers/analysis, Humans, Luminescent Measurements/methods, Magnetics, Point-of-Care Systems
8.
BMC Health Serv Res ; 20(1): 110, 2020 02 12.
Article in English | MEDLINE | ID: mdl-32050947

ABSTRACT

BACKGROUND: Inter-facility transfer is an important strategy for improving access to specialized health services, but transfers are complicated by over-triage, under-triage, travel burdens, and costs. The purpose of this study is to describe ED-based inter-facility transfer practices within the Veterans Health Administration (VHA) and to estimate the proportion of potentially avoidable transfers. METHODS: This observational cohort study included all patients treated in VHA EDs between 2012 and 2014 who were transferred to another VHA hospital. Potentially avoidable transfers were defined as patients who were either discharged from the receiving ED or admitted to the receiving hospital for ≤1 day without having an invasive procedure performed. We conducted facility- and diagnosis-level analyses to identify subgroups of patients for whom potentially avoidable transfers had increased prevalence. RESULTS: Of 6,173,189 ED visits during the 3-year study period, 18,852 (0.3%) were transferred from one VHA ED to another VHA facility. Rural residents were transferred three times as often as urban residents (0.6% vs. 0.2%, p < 0.001), and 22.8% of all VHA-to-VHA transfers were potentially avoidable transfers. The 3 disease categories most commonly associated with inter-facility transfer were mental health (34%), cardiac (12%), and digestive diagnoses (9%). CONCLUSIONS: VHA inter-facility transfer is commonly performed for mental health and cardiac evaluation, particularly for patients in rural settings. The proportion that are potentially avoidable is small. Future work should focus on improving capabilities to provide specialty evaluation locally for these conditions, possibly using telehealth solutions.


Subject(s)
Emergency Service, Hospital, Patient Transfer/statistics & numerical data, United States Department of Veterans Affairs, Adult, Aged, Cohort Studies, Female, Health Services Research, Humans, Male, Middle Aged, United States
9.
Heredity (Edinb) ; 122(4): 428-440, 2019 04.
Article in English | MEDLINE | ID: mdl-30139962

ABSTRACT

Wolbachia bacteria are common insect endosymbionts transmitted maternally and capable of spreading through insect populations by cytoplasmic incompatibility (CI) when infected males cause embryo death after mating with uninfected females. Selection in the Wolbachia endosymbiont occurs on female hosts and is expected to favour strong maternal transmission to female offspring, even at the cost of reduced CI. With maternal leakage, nuclear genes are expected to be selected to suppress cytoplasmic incompatibility caused by males while also reducing any deleterious effects associated with the infection. Here we describe a new type of Wolbachia strain from Drosophila pseudotakahashii likely to have arisen from evolutionary processes on host and/or Wolbachia genomes. This strain is often absent from adult male offspring, but always transmitted to females. It leads to males with low or non-detectable Wolbachia that nevertheless show CI. When detected in adult males, the infection has a low density relative to that in females, a phenomenon not previously seen in Wolbachia infections of Drosophila. This Wolbachia strain is common in natural populations, and shows reduced CI when older (infected) males are crossed. These patterns highlight that endosymbionts can have strong sex-specific effects and that high frequency Wolbachia strains persist through effects on female reproduction. Female-limited Wolbachia infections may be of applied interest if the low level of Wolbachia in males reduces deleterious fitness effects on the host.


Subject(s)
Cytoplasm/microbiology, Drosophila/genetics, Drosophila/microbiology, Wolbachia/physiology, Animals, Biological Evolution, Female, Fertility/genetics, Male, Phylogeny, Reproduction, Symbiosis/genetics, Wolbachia/classification, Wolbachia/genetics
10.
Pain Med ; 19(4): 788-792, 2018 04 01.
Article in English | MEDLINE | ID: mdl-28340259

ABSTRACT

Background: Concurrent use of sedatives, especially anxiolytics, and opioids is associated with increased risk of medication-related harms. To the extent that multiple prescribers are involved, approaches to influence patterns of coprescribing will differ from those to influence prescribing within a single drug class. Objectives: Describe the proportion of new opioid recipients with concurrent sedative medications at opioid initiation and determine whether these medications were prescribed by the same prescriber. Methods: We used national Department of Veterans Affairs (VA) outpatient pharmacy administration data to identify veterans who received a new opioid prescription between October 20, 2010, and September 1, 2011 (FY 2011), preceded by a 365-day opioid-free period. Concurrent sedative use was defined as a skeletal muscle relaxant, benzodiazepine, atypical antipsychotic, or hypnotic filled on the opioid start date or before and after the opioid start date with a gap of less than twice the day supply of the prior fill. Results: Concurrent sedative use at opioid initiation was 21.4% (112,408/526,499) in FY 2011. The proportion of concurrent recipients who received at least one concurrent sedative prescribed by a provider other than the opioid prescriber was 61.4% (69,002/112,408). The proportion of recipients who received a sedative concurrent with opioid initiation from the same prescriber varied across sedative class. Benzodiazepines and opioids were prescribed by the same provider in 41.1% (15,520/37,750) of concurrent users. Conclusion: One in five patients newly prescribed opioids also had a sedative prescription. Less than half of patients with concurrent opioid and benzodiazepine prescriptions received these from the same provider. Efforts to reduce concurrent opioid and sedative prescribing will require addressing care coordination.
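The concurrency rule turns on the gap between a prior sedative fill and the opioid start date relative to that fill's day supply. A sketch of one reading of the definition (the function name and the simplification to a single prior fill are ours; the paper's definition also covers fills bracketing the start date):

```python
from datetime import date

def is_concurrent(opioid_start: date, sedative_fill: date, day_supply: int) -> bool:
    """One reading of the paper's rule: a sedative counts as concurrent if
    filled on the opioid start date, or filled earlier with a gap of less
    than twice the day supply of that prior fill."""
    if sedative_fill == opioid_start:
        return True
    gap = (opioid_start - sedative_fill).days
    return 0 < gap < 2 * day_supply
```

So a 7-day benzodiazepine fill 10 days before opioid initiation would count as concurrent (gap 10 < 14), while the same fill 26 days before would not.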


Subject(s)
Analgesics, Opioid/therapeutic use, Drug Prescriptions/statistics & numerical data, Hypnotics and Sedatives/therapeutic use, Practice Patterns, Physicians'/statistics & numerical data, Adult, Aged, Female, Humans, Male, Middle Aged, United States, United States Department of Veterans Affairs, Veterans
11.
Emerg Infect Dis ; 23(11): 1815-1825, 2017 11.
Article in English | MEDLINE | ID: mdl-29047423

ABSTRACT

Bacteremia caused by gram-negative bacteria is associated with serious illness and death, and emergence of antimicrobial drug resistance in these bacteria is a major concern. Using national microbiology and patient data for 2003-2013 from the US Veterans Health Administration, we characterized nonsusceptibility trends of community-acquired, community-onset; healthcare-associated, community-onset; and hospital-onset bacteremia for selected gram-negative bacteria (Escherichia coli, Klebsiella spp., Pseudomonas aeruginosa, and Acinetobacter spp.). For 47,746 episodes of bacteremia, the incidence rate was 6.37 episodes/10,000 person-years for community-onset bacteremia and 4.53 episodes/10,000 patient-days for hospital-onset bacteremia. For Klebsiella spp., P. aeruginosa, and Acinetobacter spp., we observed a decreasing proportion of nonsusceptibility across nearly all antimicrobial drug classes for patients with healthcare exposure; trends for community-acquired, community-onset isolates were stable or increasing. The role of infection control and antimicrobial stewardship efforts in inpatient settings in the decrease in drug resistance rates for hospital-onset isolates needs to be determined.


Subject(s)
Anti-Bacterial Agents/pharmacology, Drug Resistance, Bacterial, Gram-Negative Bacteria/drug effects, Gram-Negative Bacterial Infections/microbiology, Veterans, Acinetobacter/drug effects, Aged, Bacteremia/microbiology, Cohort Studies, Escherichia coli/drug effects, Female, Gram-Negative Bacteria/isolation & purification, Gram-Negative Bacterial Infections/epidemiology, Humans, Klebsiella/drug effects, Male, Pseudomonas aeruginosa/drug effects, Retrospective Studies, United States/epidemiology, United States Department of Veterans Affairs
12.
Analyst ; 142(9): 1569-1580, 2017 May 02.
Article in English | MEDLINE | ID: mdl-28386613

ABSTRACT

Diagnosis of asymptomatic malaria poses a great challenge to global disease elimination efforts. Healthcare infrastructure in rural settings cannot support existing state-of-the-art tools necessary to diagnose asymptomatic malaria infections. Instead, lateral flow immunoassays (LFAs) are widely used as a diagnostic tool in malaria endemic areas. While LFAs are simple and easy to use, they are unable to detect low levels of parasite infection. We have developed a field deployable Magnetically-enabled Biomarker Extraction And Delivery System (mBEADS) that significantly improves limits of detection for several commercially available LFAs. Integration of mBEADS with leading commercial Plasmodium falciparum malaria LFAs improves detection limits to encompass an estimated 95% of the disease reservoir. This user-centered mBEADS platform makes significant improvements to a previously cumbersome malaria biomarker enrichment strategy by improving reagent stability, decreasing the processing time 10-fold, and reducing the assay cost 10-fold. The resulting mBEADS process adds just three minutes and less than $0.25 to the total cost of a single LFA, thus balancing sensitivity and practicality to align with the World Health Organization's ASSURED criteria for point-of-care (POC) testing.


Subject(s)
Biomarkers/analysis, Immunoassay, Malaria, Falciparum/diagnosis, Ferrosoferric Oxide, Humans, Limit of Detection, Microspheres, Plasmodium falciparum
13.
AIDS Res Ther ; 14(1): 7, 2017 Feb 13.
Article in English | MEDLINE | ID: mdl-28193244

ABSTRACT

OBJECTIVES: Antigen-induced activation and proliferation of HIV-1-infected cells is hypothesized to be a mechanism of HIV persistence during antiretroviral therapy. The objective of this study was to determine if proliferation of H1N1-specific HIV-infected cells could be detected following H1N1 vaccination. METHODS: This study utilized cryopreserved PBMC from a previously conducted trial of H1N1 vaccination in HIV-infected pregnant women. HIV-1 DNA concentrations and 437 HIV-1 C2V5 env DNA sequences were analyzed from ten pregnant women on effective antiretroviral therapy, before and 21 days after H1N1 influenza vaccination. RESULTS: HIV-1 DNA concentration did not change after vaccination (median pre- vs. post-vaccination: 95.77 vs. 41.28 copies/million PBMC, p = .37). Analyses of sequences did not detect evidence of HIV replication or proliferation of infected cells. CONCLUSIONS: Antigenic stimulation during effective ART did not have a detectable effect on the genetic makeup of the HIV-1 DNA reservoir. Longitudinal comparison of the amount and integration sites of HIV-1 in antigen-specific cells to chronic infections (such as herpesviruses) may be needed to definitively evaluate whether antigenic stimulation induces proliferation of HIV-1 infected cells.


Subject(s)
HIV Infections/immunology, HIV-1/isolation & purification, Influenza A Virus, H1N1 Subtype/immunology, Influenza Vaccines/immunology, Influenza, Human/prevention & control, Pregnancy Complications, Infectious/immunology, Anti-HIV Agents/therapeutic use, Antigens, Viral, Antiretroviral Therapy, Highly Active/methods, Base Sequence, CD4 Lymphocyte Count, Female, HIV Infections/blood, HIV Infections/drug therapy, HIV-1/genetics, HIV-1/growth & development, Humans, Influenza Vaccines/administration & dosage, Influenza, Human/drug therapy, Influenza, Human/immunology, Leukocytes, Mononuclear, Pregnancy, Pregnancy Complications, Infectious/blood, Pregnancy Complications, Infectious/drug therapy, Pregnancy Complications, Infectious/virology, Proviruses/isolation & purification, Sequence Analysis, Treatment Outcome, Virus Replication
14.
Lang Speech ; 60(1): 27-47, 2017 03.
Article in English | MEDLINE | ID: mdl-28326988

ABSTRACT

Typically-developing children, 4 to 6 years of age, and adults participated in discrimination and identification speech perception tasks using a synthetic consonant-vowel continuum ranging from /da/ to /ga/. The seven-step synthetic /da/-/ga/ continuum was created by adjusting the first 40 ms of the third formant frequency transition. For the discrimination task, listeners participated in a Change/No-Change paradigm with four different stimuli compared to the endpoint-1 /da/ token. For the identification task, listeners labeled each token along the /da/-/ga/ continuum as either "DA" or "GA." Results of the discrimination experiment showed that sensitivity to the third-formant transition cue improved for the adult listeners as the stimulus contrast increased, whereas the performance of the children remained poor across all stimulus comparisons. Results of the identification experiment support previous hypotheses of age-related differences in phonetic categorization. Results have implications for normative data on identification and discrimination tasks. These norms provide a metric against which children with auditory-based speech sound disorders can be compared. Furthermore, the results provide some insight into the developmental nature of categorical and non-categorical speech perception.


Subject(s)
Child Development, Cues, Phonetics, Pitch Discrimination, Speech Acoustics, Speech Perception, Voice Quality, Acoustic Stimulation, Adult, Age Factors, Child, Preschool, Female, Humans, Male, Time Factors, Young Adult
15.
Clin Infect Dis ; 73(6): 1129-1130, 2021 09 15.
Article in English | MEDLINE | ID: mdl-33738493
16.
Clin Infect Dis ; 63(5): 642-650, 2016 Sep 01.
Article in English | MEDLINE | ID: mdl-27358355

ABSTRACT

BACKGROUND: The Veterans Health Administration (VHA) introduced the Methicillin-Resistant Staphylococcus aureus (MRSA) Prevention Initiative in March 2007. Although the initiative has been perceived as a vertical intervention focusing on MRSA, it also expanded infection prevention and control programs and resources. We aimed to assess the horizontal effect of the initiative on hospital-onset (HO) gram-negative rod (GNR) bacteremia. METHODS: This retrospective cohort included all patients who had HO bacteremia due to Escherichia coli, Klebsiella species, or Pseudomonas aeruginosa at 130 VHA facilities from January 2003 to December 2013. The effects were assessed using segmented linear regression with autoregressive error models, incorporating autocorrelation, immediate effect, and time before and after the initiative. Community-acquired (CA) bacteremia with the same species was also analyzed as a nonequivalent dependent control. RESULTS: A total of 11,196 patients experienced HO-GNR bacteremia during the study period. There was a significant change in the slope of HO-GNR bacteremia incidence rates from before the initiative (+0.3%/month) to after (-0.4%/month) (P < .01), while CA-GNR incidence rates did not significantly change (P = .08). The cumulative effect of the intervention on HO-GNR bacteremia incidence rates at the end of the study period was estimated to be -43.2% (95% confidence interval, -51.6% to -32.4%). Similar effects were observed in subgroup analyses of each species and antimicrobial susceptibility profile. CONCLUSIONS: Within 130 VHA facilities, there was a sustained decline in HO-GNR bacteremia incidence rates after the implementation of the MRSA Prevention Initiative. As these organisms were not specifically targeted, it is likely that horizontal components of the initiative contributed to this decline.
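The segmented-regression design (a level change and a slope change at the intervention month) can be illustrated with plain least squares; this sketch deliberately omits the autoregressive error structure the authors used, and the function name is ours:

```python
import numpy as np

def segmented_fit(y: np.ndarray, t0: int) -> tuple[float, float]:
    """OLS fit of y ~ b0 + b1*t + b2*1[t >= t0] + b3*max(t - t0, 0).
    Returns (level change b2, slope change b3) at intervention time t0.
    Illustrative only: the paper's models also handled autocorrelation."""
    t = np.arange(len(y), dtype=float)
    step = (t >= t0).astype(float)   # immediate (level) effect
    ramp = np.maximum(t - t0, 0.0)   # change in trend after t0
    X = np.column_stack([np.ones_like(t), t, step, ramp])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return float(beta[2]), float(beta[3])
```

On the study's scale, a pre-initiative slope of +0.3%/month and post-initiative slope of -0.4%/month corresponds to a fitted slope-change term of -0.7%/month.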


Subject(s)
Bacteremia, Cross Infection, Gram-Negative Bacterial Infections, Veterans/statistics & numerical data, Aged, Bacteremia/epidemiology, Bacteremia/prevention & control, Cross Infection/epidemiology, Cross Infection/prevention & control, Female, Gram-Negative Bacterial Infections/epidemiology, Gram-Negative Bacterial Infections/prevention & control, Humans, Infection Control/methods, Infection Control/statistics & numerical data, Male, Methicillin-Resistant Staphylococcus aureus, Middle Aged, Retrospective Studies, Staphylococcal Infections/epidemiology, Staphylococcal Infections/prevention & control, United States, United States Department of Veterans Affairs
17.
Pain Med ; 17(7): 1282-1291, 2016 Jul 01.
Article in English | MEDLINE | ID: mdl-27048346

ABSTRACT

BACKGROUND: Understanding opioid prescribing trends requires differentiating clinically distinct short- and long-term receipt patterns. OBJECTIVES: Describe the one-year course of opioid receipt among new opioid recipients and determine the proportion with subsequent long-term opioid therapy. Discern variation in the proportion with long-term therapy initiation by geographic region and across Veterans Health Administration (VHA) medical centers. METHODS: Longitudinal course of opioid receipt was analyzed using a cabinet supply approach. Short-term receipt was defined as an index treatment episode lasting no longer than 30 days; long-term therapy as a treatment episode of >90 days that began within the first 30 days following the opioid index date. PATIENTS: All VHA pharmacy users in 2004 and 2011 who received a new prescription for an opioid (incident opioid recipients) preceded by 365 days with no opioid prescribed. RESULTS: The proportion of all incident recipients who met the definition for long-term therapy within the first year decreased from 20.4% (N = 76,280) in 2004 to 18.3% (N = 96,166) in 2011. The proportion of incident recipients with chronic pain was unchanged between 2004 and 2011. Hydrocodone and tramadol increased as a proportion of initial opioids prescribed. Median days initially supplied decreased from 30 to 20 days. A greater percentage of new opioid prescriptions were for 7 days or fewer (20.9% in 2004; 27.9% in 2011). The proportion of new recipients who initiated long-term opioid therapy varied widely by medical center. Medical centers with higher proportions of new long-term recipients in 2004 saw greater decreases in this metric by 2011. CONCLUSION: The proportion of new opioid recipients who initiated long-term opioid therapy declined between 2004 and 2011.
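Under the cabinet-supply approach, the two receipt patterns reduce to rules on when an episode starts (relative to the index date, day 0) and how long it lasts; a sketch, with the function name and the catch-all "other" bucket being our simplification of the paper's fuller rules:

```python
def classify_episode(episode_start_day: int, episode_length_days: int) -> str:
    """Classify an opioid treatment episode per the study's definitions:
    'short-term' = index episode (starting day 0) lasting <= 30 days;
    'long-term' = episode of > 90 days beginning within the first 30 days.
    Anything else falls into 'other' in this simplified sketch."""
    if episode_start_day <= 30 and episode_length_days > 90:
        return "long-term"
    if episode_start_day == 0 and episode_length_days <= 30:
        return "short-term"
    return "other"
```

For instance, a 120-day episode beginning on day 10 meets the long-term definition, whereas the same episode beginning on day 40 does not.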

18.
Prev Chronic Dis ; 13: E51, 2016 04 14.
Article in English | MEDLINE | ID: mdl-27079649

ABSTRACT

INTRODUCTION: In the United States, prostate cancer mortality rates have declined in recent decades. Cigarette smoking, a risk factor for prostate cancer death, has also declined. It is unknown whether declines in smoking prevalence produced detectable declines in prostate cancer mortality. We examined state prostate cancer mortality rates in relation to changes in cigarette smoking. METHODS: We studied men aged 35 years or older from California, Kentucky, Maryland, and Utah. Data on state smoking prevalence were obtained from the Behavioral Risk Factor Surveillance System. Mortality rates for prostate cancer and external causes (control condition) were obtained from the Centers for Disease Control and Prevention's Wide-Ranging Online Data for Epidemiologic Research. The average annual percentage change from 1999 through 2010 was estimated using joinpoint analysis. RESULTS: From 1999 through 2010, smoking in California declined by 3.5% per year (-4.4% to -2.5%), and prostate cancer mortality rates declined by 2.5% per year (-2.9% to -2.2%). In Kentucky, smoking declined by 3.0% per year (-4.0% to -1.9%) and prostate cancer mortality rates declined by 3.5% per year (-4.3% to -2.7%). In Maryland, smoking declined by 3.0% per year (-7.0% to 1.2%), and prostate cancer mortality rates declined by 3.5% per year (-4.1% to -3.0%). In Utah, smoking declined by 3.5% per year (-5.6% to -1.3%) and prostate cancer mortality rates declined by 2.1% per year (-3.8% to -0.4%). No corresponding patterns were observed for external causes of death. CONCLUSION: Declines in prostate cancer mortality rates appear to parallel declines in smoking prevalence at the population level. This study suggests that declines in prostate cancer mortality rates may be a beneficial effect of reduced smoking in the population.


Subject(s)
Prostatic Neoplasms/mortality , Smoking/epidemiology , Smoking/trends , Adult , Aged , Aged, 80 and over , Behavioral Risk Factor Surveillance System , California/epidemiology , Centers for Disease Control and Prevention, U.S. , Humans , Kentucky/epidemiology , Male , Maryland/epidemiology , Middle Aged , United States , Utah/epidemiology
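The annual percentage change (APC) figures in the abstract above come from joinpoint analysis, which fits segmented log-linear models to the rates. In the zero-joinpoint case this reduces to an ordinary log-linear regression, where APC = 100 * (exp(slope) - 1). The sketch below illustrates that reduced case only; it is not the Joinpoint software's segmented procedure, and the function name is an assumption.

```python
import math

def annual_percent_change(years, rates):
    """APC from a log-linear fit ln(rate) = a + b * year by ordinary
    least squares: APC = 100 * (exp(b) - 1). Zero-joinpoint case of
    the segmented models fitted by joinpoint analysis."""
    n = len(years)
    log_rates = [math.log(r) for r in rates]
    mean_x = sum(years) / n
    mean_y = sum(log_rates) / n
    slope = (
        sum((x - mean_x) * (y - mean_y) for x, y in zip(years, log_rates))
        / sum((x - mean_x) ** 2 for x in years)
    )
    return 100.0 * (math.exp(slope) - 1.0)
```

A series that falls by exactly 3% each year from 1999 through 2010 recovers an APC of -3.0.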
19.
J Natl Med Assoc ; 108(4): 201-210.e3, 2016.
Article in English | MEDLINE | ID: mdl-27979005

ABSTRACT

BACKGROUND: Prior studies have described racial disparities in the quality of care for persons with HIV infection, but it is unknown if these disparities extend to common comorbid conditions. To inform implementation of interventions to reduce disparities in HIV care, we examined racial variation in a set of quality measures for common comorbid conditions among Veterans in care for HIV in the United States. METHODS: The cohort included 23,974 Veterans in care for HIV in 2013 (53.4% black; 46.6% white). Measures extracted from electronic health record and administrative data were receipt of combination antiretroviral therapy (cART), HIV viral control (serum RNA < 200 copies/ml among those on cART), hypertension control (blood pressure < 140/90 mm Hg among those with hypertension), diabetes control (hemoglobin A1C < 9% among those with diabetes), lipid monitoring, guideline-concordant antidepressant prescribing, and initiation and engagement in substance use disorder (SUD) treatment. Black persons were less likely than their white counterparts to receive cART (90.2% vs. 93.2%, p<.001) and to experience viral control (84.6% vs. 91.3%, p<.001), hypertension control (61.9% vs. 68.3%, p<.001), diabetes control (85.5% vs. 89.5%, p<.001), and lipid monitoring (81.5% vs. 85.2%, p<.001). Initiation and engagement in SUD treatment were similar among blacks and whites. Differences remained after adjusting for age, comorbidity, retention in HIV care, and a measure of neighborhood social disadvantage created from census data. SIGNIFICANCE: Implementation of interventions to reduce racial disparities in HIV care should comprehensively address and monitor processes and outcomes of care for key comorbidities.


Subject(s)
Ethnicity/statistics & numerical data , HIV Infections/therapy , Healthcare Disparities/ethnology , Black or African American , Anti-HIV Agents/therapeutic use , Comorbidity , Diabetes Mellitus , HIV Infections/epidemiology , Humans , Racial Groups , United States , White People
20.
Clin Infect Dis ; 59(8): 1160-7, 2014 Oct 15.
Article in English | MEDLINE | ID: mdl-25034427

ABSTRACT

BACKGROUND: There will be increasing pressure to publicly report and rank the performance of healthcare systems on human immunodeficiency virus (HIV) quality measures. To inform discussion of public reporting, we evaluated the influence of case-mix adjustment when ranking individual care systems on the viral control quality measure. METHODS: We used data from the Veterans Health Administration (VHA) HIV Clinical Case Registry and administrative databases to estimate case-mix adjusted viral control for 91 local systems caring for 12,368 patients. We compared results using 2 adjustment methods, the observed-to-expected estimator and the risk-standardized ratio. RESULTS: Overall, 10,913 patients (88.2%) achieved viral control (viral load ≤400 copies/mL). Prior to case-mix adjustment, system-level viral control ranged from 51% to 100%. Seventeen (19%) systems were labeled as low outliers (performance significantly below the overall mean) and 11 (12%) as high outliers. Adjustment for case mix (patient demographics, comorbidity, CD4 nadir, time on therapy, and income from VHA administrative databases) reduced the number of low outliers by approximately one-third, but results differed by method. The adjustment model had moderate discrimination (c statistic = 0.66), suggesting potential for unadjusted risk when using administrative data to measure case mix. CONCLUSIONS: Case-mix adjustment affects rankings of care systems on the viral control quality measure. Given the sensitivity of rankings to the selection of case-mix adjustment methods, and the potential for unadjusted risk when using variables limited to current administrative databases, the HIV care community should explore optimal methods for case-mix adjustment before moving forward with public reporting.


Subject(s)
Delivery of Health Care/statistics & numerical data , HIV Infections/diagnosis , HIV Infections/drug therapy , Health Services Research , Risk Adjustment , Adult , Aged , Female , Humans , Male , Middle Aged , Treatment Outcome
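The observed-to-expected estimator compared in the abstract above can be sketched as follows. This is an illustrative sketch, not the study's implementation: it assumes each patient already has a predicted probability of viral control from a separately fitted case-mix model (e.g. logistic regression on demographics, comorbidity, and CD4 nadir), and the function and variable names are hypothetical.

```python
def oe_adjusted_rate(outcomes, expected_probs, overall_mean):
    """Observed-to-expected case-mix adjustment for one care system.

    outcomes: 0/1 viral-control indicators for the system's patients.
    expected_probs: each patient's predicted probability of viral
        control from a patient-level case-mix model fitted elsewhere.
    overall_mean: viral-control rate across all systems combined.

    Returns (o_to_e, adjusted_rate): the observed/expected ratio, and
    the ratio rescaled by the overall mean so it reads as a rate.
    """
    observed = sum(outcomes)          # observed successes
    expected = sum(expected_probs)    # expected successes given case mix
    o_to_e = observed / expected
    return o_to_e, o_to_e * overall_mean
```

A system whose observed successes exactly match its case-mix expectation gets an O/E ratio of 1 and an adjusted rate equal to the overall mean (88.2% in the cohort above); ratios below 1 flag candidate low outliers before significance testing.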