ABSTRACT
Biodiversity-mediated ecosystem services (ES) support human well-being, but their values are typically estimated individually. Although ES are part of complex socioecological systems, we know surprisingly little about how multiple ES interact ecologically and economically. Interactions could be positive (synergy), negative (trade-offs), or absent (additive effects), with strong implications for management and valuation. Here, we evaluate the interactions of two ES, pollination and pest control, via a factorial field experiment in 30 Costa Rican coffee farms. We found synergistic interactions between these two ES critical to crop production. The combined positive effects of birds and bees on fruit set, fruit weight, and fruit weight uniformity were greater than their individual effects. This represents experimental evidence, at realistic farm scales, of positive interactions among ES in agricultural systems. These synergies suggest that assessments of individual ES may underestimate the benefits biodiversity provides to agriculture and human well-being. Using our experimental results, we demonstrate that bird pest control and bee pollination services translate directly into monetary benefits for coffee farmers. Excluding both birds and bees resulted in an average yield reduction of 24.7% (equivalent to losing US$1,066.00/ha). These findings highlight that habitat enhancements supporting native biodiversity can have multiple benefits for coffee, a valuable crop that supports rural livelihoods worldwide. Accounting for potential interactions among ES is essential to quantifying their combined ecological and economic value.
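As a quick illustration of the arithmetic behind these figures, the reported 24.7% yield reduction and US$1,066/ha loss together imply a baseline revenue of roughly US$4,316/ha; the minimal Python sketch below makes that implied relationship explicit. The underlying yield and price inputs are not given in the abstract, so this is a consistency check, not the authors' valuation method.

```python
# Back-of-the-envelope check of the reported figures (illustrative only;
# the study's actual yield and price inputs are not stated in the abstract).
yield_reduction = 0.247      # 24.7% average yield loss with birds and bees excluded
loss_per_ha_usd = 1066.00    # reported monetary loss per hectare

# Revenue per hectare implied by these two numbers alone
implied_baseline_revenue = loss_per_ha_usd / yield_reduction
print(f"Implied baseline revenue: US${implied_baseline_revenue:,.0f}/ha")  # ~US$4,316/ha
```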
Subject(s)
Coffee, Crop Production, Pest Control, Pollination, Biodiversity
ABSTRACT
BACKGROUND: There are no systematic measures of central line-associated bloodstream infections (CLABSIs) in patients maintaining central venous catheters (CVCs) outside acute care hospitals. To clarify the burden of CLABSIs in these patients, we characterized patients with CLABSI present on hospital admission (POA). METHODS: Retrospective cross-sectional analysis of patients with CLABSI-POA in 3 health systems covering 11 hospitals across Maryland, Washington DC, and Missouri from November 2020 to October 2021. CLABSI-POA was defined using an adaptation of the acute care CLABSI definition. Patient demographics, clinical characteristics, and outcomes were collected via record review. Cox proportional hazards analysis was used to assess factors associated with all-cause mortality within 30 days. RESULTS: A total of 461 patients were identified as having CLABSI-POA. CVCs were most commonly maintained in home infusion therapy (32.8%) or oncology clinics (31.2%). Enterobacterales were the most common etiologic agents (29.2%). Recurrent CLABSIs occurred in a quarter of patients (25%). Eleven percent of patients died during the hospital admission. Among patients with CLABSI-POA, mortality risk increased with age (hazard ratio vs age <20 years by age group: 20-44 years, 11.2 [95% confidence interval, 1.46-86.22]; 45-64 years, 20.88 [2.84-153.58]; ≥65 years, 22.50 [2.98-169.93]) and lack of insurance (2.46 [1.08-5.59]), and it decreased with CVC removal (0.57 [.39-.84]). CONCLUSIONS: CLABSI-POA is associated with significant in-hospital mortality risk. Surveillance is required to understand the burden of CLABSI in the community and to identify targets for CLABSI prevention initiatives outside acute care settings.
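For readers who want to see what the survival analysis might look like in practice, the sketch below fits a Cox proportional hazards model of 30-day mortality with the lifelines library; the file and column names are hypothetical stand-ins, not the study's actual data dictionary.

```python
# Sketch of a Cox proportional hazards model like the one described
# (hypothetical file and column names; not the study's actual dataset).
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("clabsi_poa_cohort.csv")  # assumed: one row per patient
# Assumed columns: days_to_death_or_censor, died_within_30d (0/1),
# age_group (categorical), uninsured (0/1), cvc_removed (0/1)
df = pd.get_dummies(df, columns=["age_group"], drop_first=True)  # <20 years as reference

cph = CoxPHFitter()
cph.fit(df, duration_col="days_to_death_or_censor", event_col="died_within_30d")
cph.print_summary()  # hazard ratios with 95% CIs, the quantities reported above
```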
Subject(s)
Catheter-Related Infections, Humans, Male, Catheter-Related Infections/epidemiology, Catheter-Related Infections/microbiology, Female, Middle Aged, Retrospective Studies, Cross-Sectional Studies, Aged, Adult, Central Venous Catheters/adverse effects, Central Venous Catheters/microbiology, Hospitalization/statistics & numerical data, Central Venous Catheterization/adverse effects, Risk Factors, Bacteremia/epidemiology, Maryland/epidemiology, Young Adult
ABSTRACT
The COVID-19 pandemic has been reported to disrupt access to care for people living with HIV (PWH). The impact of the pandemic on the longitudinal HIV care continuum, however, has not been properly evaluated. We performed a mixed-methods study using data from the Mexican System of Distribution, Logistics, and ART Surveillance on PWH receiving care in the state of Oaxaca. We evaluated the number of HIV diagnoses made in the state before and during the pandemic with an interrupted time series analysis. We used the longitudinal HIV care continuum framework to describe the stages of HIV care before and during the pandemic. Finally, we performed a qualitative analysis to identify the challenges that staff and users faced regarding HIV care during the pandemic. New HIV diagnoses were lower during the first year of the pandemic than during the year immediately before. Among 2,682 PWH with enough information to determine their status of care, 728 started receiving care during the COVID-19 pandemic and 1,954 before the pandemic. PWH engaged before the pandemic spent 42,825 months (58.2% of follow-up) in optimal HIV control, compared with 3,061 months (56.1% of follow-up) for those engaged in care during the pandemic. Staff and users reported decreases in the frequency of appointments, prioritisation of users in poor health, larger disbursements of ART medication, and novel communication strategies with PWH. Despite challenges due to government cutbacks, changes implemented by staff helped maintain HIV care through greater flexibility in ART delivery and individualised attention.
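A common way to implement the interrupted time series described here is segmented regression on monthly counts; the sketch below, with hypothetical variable names and change point, shows the general form rather than the study's exact specification.

```python
# Minimal segmented-regression sketch of an interrupted time series
# (hypothetical file, columns, and change point; not the study's data).
import pandas as pd
import statsmodels.formula.api as smf

ts = pd.read_csv("monthly_diagnoses.csv")  # assumed columns: month_index, diagnoses
pandemic_start = 36                        # assumed index of the first pandemic month

ts["post"] = (ts["month_index"] >= pandemic_start).astype(int)           # level change
ts["months_since"] = (ts["month_index"] - pandemic_start).clip(lower=0)  # slope change

model = smf.ols("diagnoses ~ month_index + post + months_since", data=ts).fit()
print(model.summary())  # the 'post' coefficient captures the immediate drop in diagnoses
```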
Subject(s)
COVID-19, HIV Infections, Humans, COVID-19/epidemiology, Mexico/epidemiology, Pandemics, HIV Infections/drug therapy, HIV Infections/epidemiology, Continuity of Patient Care
ABSTRACT
Functional traits offer a rich quantitative framework for developing and testing theories in evolutionary biology, ecology and ecosystem science. However, the potential of functional traits to drive theoretical advances and refine models of global change can only be fully realised when species-level information is complete. Here we present the AVONET dataset containing comprehensive functional trait data for all birds, including six ecological variables, 11 continuous morphological traits, and information on range size and location. Raw morphological measurements are presented from 90,020 individuals of 11,009 extant bird species sampled from 181 countries. These data are also summarised as species averages in three taxonomic formats, allowing integration with a global phylogeny, geographical range maps, IUCN Red List data and the eBird citizen science database. The AVONET dataset provides the most detailed picture of continuous trait variation for any major radiation of organisms, offering a global template for testing hypotheses and exploring the evolutionary origins, structure and functioning of biodiversity.
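To illustrate how individual-level measurements like these are typically collapsed into the species averages the dataset distributes, here is a short pandas sketch; the file and column names are assumptions for illustration, since AVONET defines its own schema.

```python
# Sketch of collapsing individual measurements into species-level averages
# (file and column names are assumed for illustration; AVONET defines its own schema).
import pandas as pd

raw = pd.read_csv("avonet_individuals.csv")  # hypothetical extract: one row per bird
trait_cols = ["beak_length_mm", "wing_length_mm", "tarsus_length_mm", "mass_g"]

species_means = raw.groupby("species")[trait_cols].mean().round(2)
print(species_means.head())  # per-species trait averages, one of AVONET's summary formats
```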
Subject(s)
Birds, Ecosystem, Animals, Biodiversity, Biological Evolution, Humans, Phylogeny
ABSTRACT
Degradation, fragmentation, and loss of tropical forests have increased exponentially in recent decades, leading to unprecedented rates of species extinction and loss of ecosystem functions and services. Forest restoration is key to recovering ecosystem health and achieving the UN Sustainable Development Goals. However, restoring forests at the landscape scale presents many challenges, since it requires balancing conservation goals and economic development. In this study, we used a spatial planning tool (Marxan) to identify priority areas for restoration satisfying multiple objectives across a biological corridor in Costa Rica. Biological corridors are critical conservation instruments that promote forest connectivity while acknowledging human presence. Increasing forest connectivity requires restoration initiatives that will likely conflict with other land uses, some of high national economic importance. Our restoration plan sought to maximize the provision of forest-related services (i.e., seed dispersal, tourism, and carbon storage) while minimizing the impact on current land uses and thus avoiding potential conflicts. We quantified seed dispersal and tourism services (birdwatching potential) using species distribution models. We used the carbon sequestration model of InVEST to quantify carbon storage potential. We tested restoration scenarios that differed in whether land opportunity costs of current uses were considered when identifying potential restoration areas, and in how these costs were estimated. We showed how a landscape-scale forest restoration plan accounting only for forest connectivity and ecosystem service provision capacity can differ greatly from a plan that considers the potential impacts on local livelihoods. Spatial planning tools can assist in designing cost-effective landscape-scale forest restoration plans, identifying priority areas where forest restoration can maximize ecosystem service provision and increase forest connectivity. Special care must be taken to use adequate estimates of opportunity costs, to avoid potential conflicts between restoration goals and other legitimate land uses.
Subject(s)
Ecosystem, Sustainable Development, Biodiversity, Carbon Sequestration, Conservation of Natural Resources, Costa Rica, Forests, Humans
ABSTRACT
The idea that noncrop habitat enhances pest control and represents a win-win opportunity to conserve biodiversity and bolster yields has emerged as an agroecological paradigm. However, while noncrop habitat in landscapes surrounding farms sometimes benefits pest predators, natural enemy responses remain heterogeneous across studies and effects on pests are inconclusive. The observed heterogeneity in species responses to noncrop habitat may be biological in origin or could result from variation in how habitat and biocontrol are measured. Here, we use a pest-control database encompassing 132 studies and 6,759 sites worldwide to model natural enemy and pest abundances, predation rates, and crop damage as a function of landscape composition. Our results showed that although landscape composition explained significant variation within studies, pest and enemy abundances, predation rates, crop damage, and yields each exhibited different responses across studies, sometimes increasing and sometimes decreasing in landscapes with more noncrop habitat but overall showing no consistent trend. Thus, models that used landscape-composition variables to predict pest-control dynamics demonstrated little potential to explain variation across studies, though prediction did improve when comparing studies with similar crop and landscape features. Overall, our work shows that surrounding noncrop habitat does not consistently improve pest management, meaning habitat conservation may bolster production in some systems and depress yields in others. Future efforts to develop tools that inform farmers when habitat conservation truly represents a win-win would benefit from increased understanding of how landscape effects are modulated by local farm management and the biology of pests and their enemies.
Subject(s)
Agricultural Crops, Ecosystem, Biological Models, Biological Pest Control, Animals, Agricultural Crops/growth & development, Agricultural Crops/parasitology
ABSTRACT
Guidance regarding indications for initial or follow-up blood cultures is limited. We conducted a scoping review of articles published between January 2004 and June 2019 that reported the yield of blood cultures and/or their impact on the clinical management of fever and common infectious syndromes in nonneutropenic adult inpatients. A total of 2,893 articles were screened; 50 were included. Based on the reported incidence of bacteremia, syndromes were categorized into low, moderate, and high pretest probability of bacteremia. Routine blood cultures are recommended in syndromes with a high likelihood of bacteremia (eg, endovascular infections) and in those with moderate likelihood when cultures from the primary source of infection are unavailable or when prompt initiation of antibiotics is needed before primary source cultures can be obtained. In low-yield syndromes, blood cultures can be considered for patients at risk of adverse events if a bacteremia is missed (eg, a patient with a pacemaker and severe purulent cellulitis). If a patient has adequate source control and there are no risk factors or concern for endovascular infection, most streptococcal or Enterobacterales bacteremias do not require routine follow-up blood cultures.
Subject(s)
Bacteremia, Blood Culture, Adult, Bacteremia/diagnosis, Cellulitis, Fever, Humans, Inpatients, Retrospective Studies
ABSTRACT
Interventions to optimize blood culture (BCx) practices in adult inpatients are limited. We conducted a before-after study evaluating the impact of a diagnostic stewardship program that aimed to optimize BCx use in a medical intensive care unit (MICU) and five medicine units at a large academic center. The program included implementation of an evidence-based algorithm detailing indications for BCx use, as well as education and feedback to providers about BCx rates and the appropriateness of indications. Neutropenic patients were excluded. BCx rates from contemporary control units were obtained for comparison. The primary outcome was the change in BCxs ordered with the intervention. Secondary outcomes included the proportions of inappropriate BCxs, solitary BCxs, and positive BCxs. Balancing metrics included compliance with the Centers for Medicare and Medicaid Services (CMS) SEP-1 BCx component, 30-day readmission, and all-cause in-hospital and 30-day mortality. After the intervention, BCx rates decreased from 27.7 to 22.8 BCx/100 patient-days (PD) in the MICU (P = 0.001) and from 10.9 to 7.7 BCx/100 PD for the 5 medicine units combined (P < 0.001). BCx rates in the control units did not decrease significantly (surgical intensive care unit, P = 0.06; surgical units, P = 0.15). The proportion of inappropriate BCxs did not change significantly with the intervention (30% in the MICU and 50% in the medicine units). BCx positivity increased in the MICU (from 8% to 11%, P < 0.001). Solitary BCxs decreased by 21% in the medicine units (P < 0.001). Balancing metrics were similar before and after the intervention. BCx use can be optimized with clinician education and practice guidance without affecting sepsis quality metrics or mortality.
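The utilization metric in this study is straightforward to compute; the sketch below shows the BCx-per-100-patient-days calculation, with illustrative counts chosen only to reproduce the MICU rates quoted above.

```python
# The utilization metric used above: blood cultures per 100 patient-days.
# Counts below are illustrative, chosen only to reproduce the quoted MICU rates.
def bcx_rate_per_100_pd(n_cultures: int, patient_days: int) -> float:
    """Blood-culture utilization rate per 100 patient-days."""
    return 100 * n_cultures / patient_days

print(bcx_rate_per_100_pd(2770, 10_000))  # 27.7 before the intervention
print(bcx_rate_per_100_pd(2280, 10_000))  # 22.8 after the intervention
```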
Subject(s)
Blood Culture, Sepsis, Adult, Aged, Humans, Inpatients, Intensive Care Units, Medicare, United States
ABSTRACT
BACKGROUND: Self-efficacy refers to people's expectations about the personal resources available for goal achievement. Higher self-efficacy expectations are correlated with higher academic performance. AIM: To analyze the psychometric properties of the Academic Behavior Self-Efficacy Scale (ABSES) and to describe the self-efficacy expectations of students in health-related careers. MATERIAL AND METHODS: A non-probabilistic sample of 479 first- and second-year students from Nursing, Physiotherapy, Medicine, Nutrition, and Medical Technology at a public university in Chile answered the ABSES. Results were analyzed by exploratory factor analysis, and reliability was evaluated using Cronbach's alpha. Descriptive and non-parametric relational analyses were also performed. RESULTS: Two factors were identified: Attention and Participation. Attention obtained significantly higher scores than Participation (p < 0.001). Compared with their second-year counterparts, first-year students had higher scores in Attention (p < 0.001) and Participation (p < 0.01). Medicine students had higher scores in Participation than students from other careers. CONCLUSIONS: A two-factor solution was identified for the ABSES. Surveyed students had predominantly passive self-efficacy, focused on attention. A reduction in self-efficacy was also noted among second-year students.
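For reference, Cronbach's alpha, the reliability statistic used here, is simple to compute directly; the sketch below implements the standard formula on fabricated Likert-style responses (not study data), so the printed value only demonstrates the calculation.

```python
# Cronbach's alpha, the reliability statistic used for the ABSES
# (standard formula; the demo matrix is fabricated, not study data).
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: respondents-by-items matrix of scores."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_variances / total_variance)

rng = np.random.default_rng(0)
demo = rng.integers(1, 6, size=(479, 10))  # 479 fake respondents, 10 Likert items
print(round(cronbach_alpha(demo), 3))      # near 0 here: random items don't correlate
```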
Subject(s)
Self Efficacy, Health Occupations Students, Surveys and Questionnaires, Chile, Cross-Sectional Studies, Female, Humans, Male, Psychometrics, Universities, Young Adult
ABSTRACT
Objective: To (1) understand the role of antibiotic-associated adverse events (ABX-AEs) on antibiotic decision-making, (2) understand clinician preferences for ABX-AE feedback, and (3) identify ABX-AEs of greatest clinical concern. Design: Focus groups. Setting: Academic medical center. Participants: Medical and surgical house staff, attending physicians, and advanced practice practitioners. Methods: Focus groups were conducted from May 2022 to December 2022. Participants discussed the role of ABX-AEs in antibiotic decision-making and feedback preferences and evaluated the prespecified categorization of ABX-AEs based on degree of clinical concern. Thematic analysis was conducted using inductive coding. Results: Four focus groups were conducted (n = 15). Six themes were identified. (1) ABX-AE risks during initial prescribing influence the antibiotic prescribed rather than the decision of whether to prescribe. (2) The occurrence of an ABX-AE leads to reassessment of the clinical indication for antibiotic therapy. (3) The impact of an ABX-AE on other management decisions is as important as the direct harm of the ABX-AE. (4) ABX-AEs may be overlooked because of limited feedback regarding the occurrence of ABX-AEs. (5) Clinicians are receptive to feedback regarding ABX-AEs but are concerned about it being punitive. (6) Feedback must be curated to prevent clinicians from being overwhelmed with data. Clinicians generally agreed with the prespecified categorizations of ABX-AEs by degree of clinical concern. Conclusions: The themes identified and assessment of ABX-AEs of greatest clinical concern may inform antibiotic stewardship initiatives that incorporate reporting of ABX-AEs as a strategy to reduce unnecessary antibiotic use.
ABSTRACT
Patients managing central venous catheters (CVCs) outside of hospitals need training in CVC care. Using 3 focus groups, the study identified themes in how health care personnel (HCP) prepare patients and their caregivers for CVC care at home. Four major themes and 25 nested subthemes were identified: (1) providing the right amount of education at the right time, (2) tailoring education to patient needs, (3) developing patient education tools, and (4) managing differences in recommendations to patients. HCP in the study ensured that patients and caregivers learned what they needed to know when they needed to know it, using appropriate patient education tools. Patients and caregivers are largely responsible for CVC care and central line-associated bloodstream infection prevention outside of acute care hospitals and long-term care settings, and HCP take seriously their obligation to provide them with appropriate education and tools to best enhance their ability to keep themselves safe.
Subject(s)
Central Venous Catheterization, Focus Groups, Health Personnel, Patient Education as Topic, Humans, Patient Education as Topic/organization & administration, Female, Male, Catheter-Related Infections/prevention & control, Home Care Services/organization & administration, Central Venous Catheters, Middle Aged, Adult, Caregivers
ABSTRACT
Infection prevention and surveillance training approaches for home infusion therapy have not been well defined. We interviewed home infusion staff who perform surveillance activities about barriers to and facilitators for central line-associated bloodstream infection (CLABSI) surveillance and identified barriers to training in CLABSI surveillance. Our findings show a lack of formal surveillance training for staff. This gap can be addressed by adapting existing training resources to the home infusion setting.
Subject(s)
Catheter-Related Infections, Central Venous Catheterization, Cross Infection, Home Infusion Therapy, Humans, Catheter-Related Infections/prevention & control, Cross Infection/prevention & control
ABSTRACT
Objectives: Access to patient information may affect how home-infusion surveillance staff identify central-line-associated bloodstream infections (CLABSIs). We characterized information hazards in home-infusion CLABSI surveillance and identified possible strategies to mitigate information hazards. Design: Qualitative study using semistructured interviews. Setting and participants: The study included 21 clinical staff members involved in CLABSI surveillance at 5 large home-infusion agencies covering 13 states and the District of Columbia. Methods: Interviews were conducted by 1 researcher. Transcripts were coded by 2 researchers; consensus was reached by discussion. Results: Data revealed the following barriers: information overload, information underload, information scatter, information conflict, and erroneous information. Respondents identified 5 strategies to mitigate information chaos: (1) engage information technology in developing reports; (2) develop streamlined processes for acquiring and sharing data among staff; (3) enable staff access to hospital electronic health records; (4) use a single, validated, home-infusion CLABSI surveillance definition; and (5) develop relationships between home-infusion surveillance staff and inpatient healthcare workers. Conclusions: Information chaos occurs in home-infusion CLABSI surveillance and may affect the development of accurate CLABSI rates in home-infusion therapy. Implementing strategies to minimize information chaos will enhance intra- and interteam collaborations in addition to improving patient-related outcomes.
ABSTRACT
Exposure investigations are labor intensive and vulnerable to recall bias. We developed an algorithm to identify healthcare personnel (HCP) interactions from the electronic health record (EHR), and we evaluated its accuracy against conventional exposure investigations. The EHR algorithm identified every known transmission and used ranking to produce a manageable contact list.
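The abstract does not spell out the algorithm, but a minimal version of the idea (mining EHR access logs for an index patient's contacts and ranking them) might look like the sketch below; the log schema and ranking criterion are assumptions, not the published method.

```python
# Minimal sketch of the idea described: rank HCP contacts of an index patient
# from EHR access logs (schema and ranking criterion are assumptions, not the
# published algorithm).
import pandas as pd

logs = pd.read_csv("ehr_access_log.csv")  # assumed columns: hcp_id, patient_id, timestamp

def ranked_contacts(index_patient: str, top_n: int = 20) -> pd.Series:
    """Rank HCP who accessed the index patient's chart by number of accesses."""
    contacts = logs[logs["patient_id"] == index_patient]
    return contacts["hcp_id"].value_counts().head(top_n)

print(ranked_contacts("patient_0042"))  # a manageable, ranked contact list
```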
Subject(s)
Electronic Health Records, Health Personnel, Humans, Attitude of Health Personnel
ABSTRACT
OBJECTIVE: Central-line-associated bloodstream infection (CLABSI) surveillance in home infusion therapy is necessary to track efforts to reduce infections, but a standardized, validated, and feasible definition is lacking. We tested the validity of a home-infusion CLABSI surveillance definition and the feasibility and acceptability of its implementation. DESIGN: Mixed-methods study including validation of CLABSI cases and semistructured interviews with staff applying these approaches. SETTING: This study was conducted in 5 large home-infusion agencies in a CLABSI prevention collaborative across 14 states and the District of Columbia. PARTICIPANTS: Staff performing home-infusion CLABSI surveillance. METHODS: From May 2021 to May 2022, agencies implemented a home-infusion CLABSI surveillance definition using 3 approaches to secondary bloodstream infections (BSIs): National Healthcare Safety Network (NHSN) criteria, modified NHSN criteria (applying only the 4 most common NHSN-defined secondary BSIs), and all home-infusion-onset bacteremia (HiOB). Data on all positive blood cultures were sent to an infection preventionist for validation. Surveillance staff underwent semistructured interviews focused on their perceptions of the definition at 1 and 3-4 months after implementation. RESULTS: Interrater reliability scores ranged from κ = 0.65 for the modified NHSN criteria to κ = 0.68 for the NHSN criteria to κ = 0.72 for the HiOB criteria. For the NHSN criteria, the agency-determined rate was 0.21 per 1,000 central-line (CL) days, and the validator-determined rate was 0.20 per 1,000 CL days. Overall, implementing a standardized definition was considered a positive change that would be generalizable and feasible, though time-consuming and labor-intensive. CONCLUSIONS: The home-infusion CLABSI surveillance definition was valid and feasible to implement.
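The two quantitative pieces of this validation, interrater agreement and the CLABSI rate, can be reproduced in a few lines; the sketch below applies scikit-learn's Cohen's kappa to fabricated agency-versus-validator labels and shows the per-1,000-central-line-days rate calculation.

```python
# Sketch of the validation arithmetic: Cohen's kappa for agency-vs-validator
# agreement, plus the CLABSI rate metric (all numbers fabricated for illustration).
from sklearn.metrics import cohen_kappa_score

agency    = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1]  # 1 = CLABSI per agency surveillance staff
validator = [1, 0, 1, 0, 0, 0, 1, 0, 1, 1]  # 1 = CLABSI per infection preventionist
print(round(cohen_kappa_score(agency, validator), 2))

# The rate metric: CLABSIs per 1,000 central-line days
clabsis, cl_days = 4, 20_000
print(1000 * clabsis / cl_days)  # 0.2 per 1,000 CL days, the scale reported above
```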
Subject(s)
Bacteremia, Catheter-Related Infections, Central Venous Catheterization, Cross Infection, Sepsis, Humans, Cross Infection/epidemiology, Catheter-Related Infections/diagnosis, Catheter-Related Infections/epidemiology, Catheter-Related Infections/prevention & control, Reproducibility of Results, Sepsis/epidemiology, Bacteremia/diagnosis, Bacteremia/epidemiology, Bacteremia/prevention & control, Central Venous Catheterization/adverse effects
ABSTRACT
Background: The burden of vancomycin-associated acute kidney injury (V-AKI) is unclear because it is not systematically monitored. The objective of this study was to develop and validate an electronic algorithm to identify cases of V-AKI and to determine its incidence. Methods: Adults and children admitted to 1 of 5 health system hospitals from January 2018 to December 2019 who received at least 1 dose of intravenous (IV) vancomycin were included. A subset of charts was reviewed using a V-AKI assessment framework to classify cases as unlikely, possible, or probable events. Based on this review, an electronic algorithm was developed and then validated using another subset of charts. Percentage agreement and kappa coefficients were calculated. Sensitivity and specificity were determined at various cutoffs, using chart review as the reference standard. For courses of ≥48 hours, the incidence of possible or probable V-AKI events was assessed. Results: The algorithm was developed using 494 cases and validated using 200 cases. The percentage agreement between the electronic algorithm and chart review was 92.5%, and the weighted kappa was 0.95. The electronic algorithm was 89.7% sensitive and 98.2% specific in detecting possible or probable V-AKI events. For the 11,073 courses of ≥48 hours of vancomycin among 8,963 patients, the incidence of possible or probable V-AKI events was 14.0%; the V-AKI incidence rate was 22.8 per 1,000 days of IV vancomycin therapy. Conclusions: An electronic algorithm demonstrated substantial agreement with chart review and had excellent sensitivity and specificity in detecting possible or probable V-AKI events. The electronic algorithm may be useful for informing future interventions to reduce V-AKI.
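The reported validation metrics are standard and easy to reproduce; the sketch below computes sensitivity and specificity from a confusion matrix and a linearly weighted kappa for the ordinal unlikely/possible/probable classification, using fabricated counts chosen to match the abstract's percentages.

```python
# Sketch of the reported validation metrics (counts fabricated to match the
# abstract's percentages; not the study's validation data).
from sklearn.metrics import cohen_kappa_score

tp, fn, tn, fp = 87, 10, 108, 2  # hypothetical algorithm-vs-chart-review counts
sensitivity = tp / (tp + fn)     # 87/97  ~= 0.897, i.e., 89.7%
specificity = tn / (tn + fp)     # 108/110 ~= 0.982, i.e., 98.2%
print(round(sensitivity, 3), round(specificity, 3))

# Weighted kappa for the ordinal unlikely/possible/probable classification
codes = {"unlikely": 0, "possible": 1, "probable": 2}
review    = [codes[x] for x in ["unlikely", "possible", "probable", "possible", "unlikely"]]
algorithm = [codes[x] for x in ["unlikely", "possible", "possible", "possible", "unlikely"]]
print(round(cohen_kappa_score(review, algorithm, weights="linear"), 2))
```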
ABSTRACT
BACKGROUND: Chagas disease, caused by the parasite Trypanosoma cruzi, is a neglected infectious disease that exerts the highest public health burden in the Americas. Two anti-parasitic drugs are approved for its treatment, benznidazole and nifurtimox, but the absence of biomarkers for early assessment of treatment efficacy hinders patients' follow-up. METHODOLOGY/PRINCIPAL FINDINGS: We conducted a longitudinal, observational study in a cohort of 106 chronically T. cruzi-infected patients in Cochabamba (Bolivia) who completed the recommended treatment with benznidazole. Participants were followed up for five years, during which we collected clinical and serological data, including yearly electrocardiograms and optical density readouts from two ELISAs (total and recombinant antigens). Descriptive and statistical analyses were performed to understand trends in the data, as well as the relationship between clinical symptoms and serological evolution after treatment. Our results showed that both ELISAs documented average declines up to year three and slight inclines over the following two years. The recorded clinical parameters indicated that most patients did not have any significant changes in their cardiac or digestive symptoms after treatment, at least in the timeframe under investigation, while a small percentage demonstrated either a regression or a progression of symptoms. Only one participant met the "cure criterion" of a negative serological readout on both ELISAs by the final year. CONCLUSIONS/SIGNIFICANCE: The study confirms that follow-up of benznidazole-treated T. cruzi-infected patients should extend beyond five years to determine, with current tools, whether they are cured. In terms of serological evolution, the single use of a total antigen ELISA might be a more reliable measure and suffice to assess infection status, at least in the region of Bolivia where the study was done. Additional work is needed to develop a test of cure for early assessment of drug efficacy, with the aim of improving case management protocols.
Subject(s)
Chagas Disease, Nitroimidazoles, Trypanocidal Agents, Trypanosoma cruzi, Humans, Bolivia, Chagas Disease/parasitology, Nitroimidazoles/therapeutic use, Trypanocidal Agents/therapeutic use, Chronic Disease
ABSTRACT
BACKGROUND: Patients discharged to the home on home-based outpatient parenteral antimicrobial therapy (OPAT) perform their own infusions and catheter care; thus, they require high-quality training to improve safety and the likelihood of treatment success. This article describes the study team's experience piloting an educational toolkit for patients on home-based OPAT. METHODS: An OPAT toolkit was developed to address barriers such as unclear communication channels, rushed instruction, safe bathing with an intravenous (IV) catheter, and lack of standardized instructions. The research team evaluated the toolkit through interviews with home infusion nurses implementing the intervention, surveys of 20 patients who received the intervention, and five observations of the home infusion nurses delivering the intervention to patients and caregivers. RESULTS: Of surveyed patients, 90.0% were comfortable infusing medications at the time of discharge, and 80.0% were comfortable bathing with the IV catheter. While all practiced on equipment, 75.0% used the videos and the paper checklists. Almost all (95.0%) were satisfied with their training, and all were satisfied with managing their IV catheters at home. The videos were considered very helpful, particularly as a reference. Overall, nurses adjusted training to patient characteristics and modified the toolkit over time. Shorter instruction forms were more helpful than longer ones. CONCLUSION: Developing a toolkit to improve the education of patients on home-based OPAT has the potential to improve the safety of and experience with home-based OPAT.
Subject(s)
Anti-Infective Agents, Outpatients, Ambulatory Care, Anti-Bacterial Agents, Humans, Parenteral Infusions, Patient Discharge
ABSTRACT
BACKGROUND: Barriers to central line-associated bloodstream infection (CLABSI) surveillance in home infusion therapy have not been elucidated, and understanding them is needed to identify how to support home infusion CLABSI surveillance. We aimed to (1) perform a goal-directed task analysis of home infusion CLABSI surveillance and (2) describe barriers to, facilitators for, and suggested strategies for successful home infusion CLABSI surveillance. METHODS: We conducted semi-structured interviews with team members involved in CLABSI surveillance at 5 large home infusion agencies to explore the work systems used by agency members for home infusion CLABSI surveillance. We analyzed the transcribed interviews qualitatively for themes. RESULTS: Twenty-one interviews revealed 8 steps for performing CLABSI surveillance in home infusion therapy. Major barriers identified included the need for training of surveillance staff, lack of a standardized definition, inadequate information technology support, struggles communicating with hospitals, inadequate time, and insufficient clinician engagement and leadership support. DISCUSSION: Staff performing home infusion CLABSI surveillance need health system resources, particularly leadership and front-line engagement, access to data, information technology support, training, dedicated time, and reports, to perform their tasks. CONCLUSIONS: Building home infusion CLABSI surveillance programs will require support from home infusion leadership.
Subject(s)
Catheter-Related Infections, Central Venous Catheterization, Cross Infection, Home Infusion Therapy, Sepsis, Catheter-Related Infections/epidemiology, Catheter-Related Infections/prevention & control, Humans, Leadership
ABSTRACT
BACKGROUND: SARS-CoV-2 circulating variants coupled with waning immunity pose a significant threat to the long-term care (LTC) population. Our objective was to measure salivary IgG antibodies in residents and staff of an LTC facility to (1) evaluate the IgG response in saliva after natural infection and vaccination and (2) assess its feasibility for describing seroprevalence over time. METHODS: We performed salivary IgG sampling of all residents and staff who agreed to test in a 150-bed skilled nursing facility during three seroprevalence surveys between October 2020 and February 2021. The facility had SARS-CoV-2 outbreaks in May 2020 and November 2020, when 45 of 138 and 37 of 125 residents were infected, respectively; it hosted two federal vaccine clinics in January 2021. We evaluated quantitative IgG in saliva to the nucleocapsid (N), spike (S), and receptor-binding domain (RBD) antigens of SARS-CoV-2 over time post-infection and post-vaccination. RESULTS: One hundred twenty-four residents and 28 staff underwent saliva serologic testing on one or more survey visits. Over the three surveys, the SARS-CoV-2 seroprevalence at the facility was 49%, 64%, and 81%, respectively. IgG to the S, RBD, and N antigens all increased post-infection. Post-vaccination, the infection-naïve group did not have a detectable N IgG level, and N IgG levels for the previously infected did not increase post-vaccination (p < 0.001). Fully vaccinated subjects with prior COVID-19 infection had significantly higher RBD and S IgG responses than those who were infection-naïve prior to vaccination (p < 0.001 for both). CONCLUSIONS: Positive SARS-CoV-2 IgG in saliva was concordant with prior infection (anti-N, -S, -RBD) and vaccination (anti-S, -RBD) and remained above the positivity threshold for up to 9 months after infection. Salivary sampling is a non-invasive method of tracking immunity and differentiating between prior infection and vaccination to inform the need for boosters in LTC residents and staff.