ABSTRACT
Specialized pro-resolving mediators (SPMs) are endogenous small molecules produced mainly from dietary omega-3 polyunsaturated fatty acids by both structural cells and cells of the adaptive and innate immune systems. SPMs have been shown both to limit acute inflammation and to promote resolution and the return to homeostasis following infection or injury. There is growing evidence that chronic immune disorders are characterized by deficiencies in resolution, and SPMs have significant potential as novel therapeutics to prevent and treat chronic inflammation and immune system disorders. This review focuses on important breakthroughs in understanding how SPMs are produced by, and act on, cells of the adaptive immune system, specifically macrophages, B cells and T cells. We also highlight recent evidence demonstrating the potential of SPMs as novel therapeutic agents in areas including immunization, autoimmune disease and transplantation.
Subject(s)
Docosahexaenoic Acids, Omega-3 Fatty Acids, Humans, Docosahexaenoic Acids/therapeutic use, Omega-3 Fatty Acids/therapeutic use, Inflammation/drug therapy, Inflammation Mediators/therapeutic use, Immunity
ABSTRACT
BACKGROUND: Before the emergence of the B.1.617.2 (delta) variant of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2), vaccination reduced transmission of SARS-CoV-2 from vaccinated persons who became infected, potentially by reducing viral loads. Although vaccination still lowers the risk of infection, similar viral loads in vaccinated and unvaccinated persons who are infected with the delta variant call into question the degree to which vaccination prevents transmission. METHODS: We used contact-testing data from England to perform a retrospective observational cohort study involving adult contacts of SARS-CoV-2-infected adult index patients. We used multivariable Poisson regression to investigate associations between transmission and the vaccination status of index patients and contacts and to determine how these associations varied with the B.1.1.7 (alpha) and delta variants and time since the second vaccination. RESULTS: Among 146,243 tested contacts of 108,498 index patients, 54,667 (37%) had positive SARS-CoV-2 polymerase-chain-reaction (PCR) tests. In index patients who became infected with the alpha variant, two vaccinations with either BNT162b2 or ChAdOx1 nCoV-19 (also known as AZD1222), as compared with no vaccination, were independently associated with reduced PCR positivity in contacts (adjusted rate ratio with BNT162b2, 0.32; 95% confidence interval [CI], 0.21 to 0.48; and with ChAdOx1 nCoV-19, 0.48; 95% CI, 0.30 to 0.78). Vaccine-associated reductions in transmission of the delta variant were smaller than those with the alpha variant, and reductions in transmission of the delta variant after two BNT162b2 vaccinations were greater (adjusted rate ratio for the comparison with no vaccination, 0.50; 95% CI, 0.39 to 0.65) than after two ChAdOx1 nCoV-19 vaccinations (adjusted rate ratio, 0.76; 95% CI, 0.70 to 0.82). Variation in cycle-threshold (Ct) values (indicative of viral load) in index patients explained 7 to 23% of vaccine-associated reductions in transmission of the two variants. The reductions in transmission of the delta variant declined over time after the second vaccination, reaching levels that were similar to those in unvaccinated persons by 12 weeks in index patients who had received ChAdOx1 nCoV-19 and attenuating substantially in those who had received BNT162b2. Protection in contacts also declined in the 3-month period after the second vaccination. CONCLUSIONS: Vaccination was associated with a smaller reduction in transmission of the delta variant than of the alpha variant, and the effects of vaccination decreased over time. PCR Ct values at diagnosis of the index patient only partially explained decreased transmission. (Funded by the U.K. Government Department of Health and Social Care and others.).
Subject(s)
BNT162 Vaccine, COVID-19/transmission, ChAdOx1 nCoV-19, Infectious Disease Transmission/prevention & control, SARS-CoV-2, Adult, Aged, Aged 80 and over, COVID-19/diagnosis, COVID-19/virology, COVID-19 Nucleic Acid Testing, England, Female, Humans, Male, Middle Aged, Retrospective Studies, Viral Load
ABSTRACT
BACKGROUND: The relationship between the presence of antibodies to severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) and the risk of subsequent reinfection remains unclear. METHODS: We investigated the incidence of SARS-CoV-2 infection confirmed by polymerase chain reaction (PCR) in seropositive and seronegative health care workers attending testing of asymptomatic and symptomatic staff at Oxford University Hospitals in the United Kingdom. Baseline antibody status was determined by anti-spike (primary analysis) and anti-nucleocapsid IgG assays, and staff members were followed for up to 31 weeks. We estimated the relative incidence of PCR-positive test results and new symptomatic infection according to antibody status, adjusting for age, participant-reported gender, and changes in incidence over time. RESULTS: A total of 12,541 health care workers participated and had anti-spike IgG measured; 11,364 were followed up after negative antibody results and 1265 after positive results, including 88 in whom seroconversion occurred during follow-up. A total of 223 anti-spike-seronegative health care workers had a positive PCR test (1.09 per 10,000 days at risk), 100 during screening while they were asymptomatic and 123 while symptomatic, whereas 2 anti-spike-seropositive health care workers had a positive PCR test (0.13 per 10,000 days at risk), and both workers were asymptomatic when tested (adjusted incidence rate ratio, 0.11; 95% confidence interval, 0.03 to 0.44; P = 0.002). There were no symptomatic infections in workers with anti-spike antibodies. Rate ratios were similar when the anti-nucleocapsid IgG assay was used alone or in combination with the anti-spike IgG assay to determine baseline status. CONCLUSIONS: The presence of anti-spike or anti-nucleocapsid IgG antibodies was associated with a substantially reduced risk of SARS-CoV-2 reinfection in the ensuing 6 months. (Funded by the U.K. Government Department of Health and Social Care and others.).
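The primary comparison above is an incidence rate ratio from a model of counts over person-time. A minimal sketch of that calculation in Python follows, using Poisson regression with a log person-time offset; the person-day denominators are rough back-calculations from the reported rates rather than the study data, and the column names are invented for illustration, so this reproduces only the unadjusted ratio, not the study's adjusted analysis.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# one row per baseline serostatus group: PCR-positive count and (approximate,
# back-calculated) person-days at risk; these are illustrative, not study data
df = pd.DataFrame({
    "seropositive": [0, 1],
    "n_pos": [223, 2],
    "days_at_risk": [2_046_000, 154_000],
})

model = smf.glm(
    "n_pos ~ seropositive",
    data=df,
    family=sm.families.Poisson(),
    offset=np.log(df["days_at_risk"]),  # log person-time turns counts into rates
).fit()

irr = np.exp(model.params["seropositive"])
print(f"unadjusted IRR, seropositive vs seronegative: {irr:.2f}")  # ~0.12
```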
Subject(s)
Viral Antibodies/blood, COVID-19/immunology, Coronavirus Nucleocapsid Proteins/immunology, Health Personnel, Immunoglobulin G/immunology, SARS-CoV-2/immunology, Coronavirus Spike Glycoprotein/immunology, Adolescent, Adult, Aged, Aged 80 and over, COVID-19/diagnosis, COVID-19/epidemiology, COVID-19 Nucleic Acid Testing, COVID-19 Serological Testing, Female, Humans, Immunoglobulin G/blood, Incidence, Longitudinal Studies, Male, Middle Aged, Polymerase Chain Reaction, Recurrence, SARS-CoV-2/isolation & purification, Seroconversion, United Kingdom, Young Adult
ABSTRACT
Enhancement of net primary production (NPP) in forests as atmospheric [CO2] increases is likely limited by the availability of other growth resources. The Duke Free Air CO2 Enrichment (FACE) experiment was located on a moderate-fertility site in the southeastern US, in a loblolly pine (Pinus taeda L.) plantation with broadleaved species growing mostly in the mid-canopy and understory. Duke FACE ran from 1994 to 2010 and combined elevated [CO2] (eCO2) with nitrogen (N) additions. We assessed the spatial and temporal variation of the NPP response using a dataset that includes previously unpublished data from 6 years of the replicated CO2 × N experiment and extends to 2 years beyond the termination of enrichment. Averaged over time (1997-2010), NPP of pine and broadleaved species was 38% and 52% higher, respectively, under eCO2 compared to ambient conditions. Furthermore, there was no evidence of a decline in enhancement over time in any plot, regardless of its native site quality. A relation between spatial variation in the response and native site quality was suggested but remained inconclusive. Nitrogen amendments under eCO2, in turn, resulted in an additional 11% increase in pine NPP. For pine, the eCO2-induced increase in NPP was similar above- and belowground and was driven by both increased leaf area index (L) and production efficiency (PE = NPP/L). For broadleaved species, coarse-root biomass production was more than 200% higher under eCO2 and accounted for the entire production response, driven by increased PE. Notably, the fraction of annual NPP retained in total living biomass was higher under eCO2, reflecting a slight shift in allocation toward woody mass and a lower mortality rate. Our findings also imply that tree growth may not have been only N-limited, but perhaps constrained by the availability of other nutrients. The observed sustained NPP enhancement, even without N additions, demonstrates no progressive N limitation.
Subject(s)
Carbon Dioxide, Pinus, Nitrogen, Pinus/physiology, Forests, Trees, Pinus taeda, Plant Leaves/physiology
ABSTRACT
In clinical settings with no commonly accepted standard-of-care, multiple treatment regimens are potentially useful, but some treatments may not be appropriate for some patients. A personalized randomized controlled trial (PRACTical) design has been proposed for this setting. For a network of treatments, each patient is randomized only among treatments which are appropriate for them. The aim is to produce treatment rankings that can inform clinical decisions about treatment choices for individual patients. Here we propose methods for determining sample size in a PRACTical design, since standard power-based methods are not applicable. We derive a sample size by evaluating information gained from trials of varying sizes. For a binary outcome, we quantify how many adverse outcomes would be prevented by choosing the top-ranked treatment for each patient based on trial results rather than choosing a random treatment from the appropriate personalized randomization list. In simulations, we evaluate three performance measures: mean reduction in adverse outcomes using sample information, proportion of simulated patients for whom the top-ranked treatment performed as well or almost as well as the best appropriate treatment, and proportion of simulated trials in which the top-ranked treatment performed better than a randomly chosen treatment. We apply the methods to a trial evaluating eight different combination antibiotic regimens for neonatal sepsis (NeoSep1), in which a PRACTical design addresses varying patterns of antibiotic choice based on disease characteristics and resistance. Our proposed approach produces results that are more relevant to complex decision making by clinicians and policy makers.
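The sample size logic described above lends itself to a small simulation. The sketch below is one possible implementation under assumed true risks and eligibility rules (neither taken from NeoSep1): it simulates trials of a given size, ranks treatments by observed event rate, and estimates the mean reduction in adverse-outcome risk from giving a future patient their top-ranked eligible treatment rather than a random one from their personalized list.

```python
import numpy as np

rng = np.random.default_rng(1)
p_true = np.array([0.30, 0.26, 0.24, 0.22, 0.20])  # assumed adverse-outcome risks
K = len(p_true)

def simulate_trial(n):
    """Run one PRACTical-style trial; return observed event rate per treatment."""
    events, counts = np.zeros(K), np.zeros(K)
    for _ in range(n):
        eligible = rng.choice(K, size=rng.integers(2, K + 1), replace=False)
        arm = rng.choice(eligible)          # randomize within the personal list
        counts[arm] += 1
        events[arm] += rng.random() < p_true[arm]
    return events / np.maximum(counts, 1)

def mean_risk_reduction(n, n_sims=500):
    gains = []
    for _ in range(n_sims):
        rank = np.empty(K)
        rank[np.argsort(simulate_trial(n))] = np.arange(K)  # 0 = best observed
        # a future patient with a random eligible list:
        eligible = rng.choice(K, size=rng.integers(2, K + 1), replace=False)
        best = eligible[np.argmin(rank[eligible])]  # top-ranked eligible treatment
        gains.append(p_true[eligible].mean() - p_true[best])
    return float(np.mean(gains))            # vs a random eligible choice

for n in (100, 300, 1000):
    print(n, round(mean_risk_reduction(n), 4))  # gain grows with trial size
```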
Subject(s)
Precision Medicine, Randomized Controlled Trials as Topic, Humans, Randomized Controlled Trials as Topic/methods, Sample Size, Precision Medicine/methods, Computer Simulation, Newborn Infant, Sepsis/drug therapy, Statistical Models
ABSTRACT
BACKGROUND: A 2×2 factorial design evaluates two interventions (A versus control and B versus control) by randomising to control, A-only, B-only, or both A and B together. Extended factorial designs are also possible (e.g. 3×3 or 2×2×2). Factorial designs often require fewer resources and participants than alternative randomised controlled trials, but they are not widely used. We identified several issues that investigators considering this design need to address before using it in a late-phase setting. METHODS: We surveyed journal articles published in 2000-2022 relating to the design of factorial randomised controlled trials. We identified issues to consider based on these and our personal experiences. RESULTS: We identified clinical, practical, statistical and external issues that make factorial randomised controlled trials more desirable. Clinical issues are: (1) interventions can be easily co-administered; (2) the risk of safety issues from co-administration, above the individual risks of the separate interventions, is low; (3) safety or efficacy data are wanted on the combination intervention; (4) the potential for interaction (e.g. the effect of A differing when B is administered) is low; (5) it is important to compare interventions with other interventions balanced, rather than allowing randomised interventions to affect the choice of other interventions; (6) eligibility criteria for the different interventions are similar. Practical issues are: (7) recruitment is not harmed by testing many interventions; (8) each intervention and its associated toxicities are unlikely to reduce either adherence to the other intervention or overall follow-up; (9) blinding is easy to implement or not required. Statistical issues are: (10) a suitable scale of analysis can be identified; (11) adjustment for multiplicity is not required; (12) early stopping for efficacy or lack of benefit can be done effectively. External issues are: (13) adequate funding is available; and (14) the trial is not intended for licensing purposes. An overarching issue (15) is that the factorial design should give a lower sample size requirement than alternative designs. Across designs with varying non-adherence, retention, intervention effects and interaction effects, 2×2 factorial designs require a lower sample size than a three-arm alternative when one intervention's effect is reduced by no more than 24%-48% in the presence of the other intervention, compared with its absence. CONCLUSIONS: Factorial designs are not widely used and should be considered more often, guided by the issues we identify. A low potential for interaction (at most small to modest) is key, for example where the interventions have different mechanisms of action or target different aspects of the disease being studied.
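Issue (15) can be made concrete with a back-of-envelope calculation. The sketch below compares total sample size for a 2×2 factorial against a three-arm (control, A-only, B-only) alternative for a binary outcome, using the standard normal-approximation formula; it assumes no interaction and no multiplicity adjustment, and the event risks are illustrative.

```python
from scipy.stats import norm

def n_per_group(p1, p2, alpha=0.05, power=0.9):
    """Per-group size for a two-proportion comparison (normal approximation)."""
    z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
    return z**2 * (p1 * (1 - p1) + p2 * (1 - p2)) / (p1 - p2) ** 2

n = n_per_group(0.30, 0.20)   # assumed control vs intervention event risks
total_factorial = 2 * n       # each factor's comparison reuses the whole trial
total_three_arm = 3 * n       # separate control, A-only and B-only arms
print(round(n), round(total_factorial), round(total_three_arm))  # 389 778 1166
# The factorial needs ~2/3 of the three-arm total, provided the effects of A
# and B do not attenuate each other appreciably (cf. the 24%-48% threshold).
```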
Subject(s)
Research Design, Humans, Sample Size, Randomized Controlled Trials as Topic
ABSTRACT
We aimed to test whether bilateral injection of bupivacaine 0.25% in the transversalis fascia plane reduced the 24 h opioid dose after singleton caesarean section under spinal anaesthesia with intrathecal morphine, compared with saline 0.9% injectate. We randomly allocated 52 women to bilateral injection of 20 ml saline 0.9% on arrival in the post-anaesthesia care unit and 54 women to bilateral injection of 20 ml bupivacaine 0.25% (with adrenaline 2.5 µg.ml-1). Mean (SD) cumulative morphine-equivalent opioid dose 24 h after saline injection was 32.3 (28.3) mg and 18.7 (20.2) mg after bupivacaine injection, a mean (95%CI) difference of 13.7 (4.1-23.2) mg (p = 0.006). Median (IQR [range]) time to first postoperative opioid dose was 3.0 (1.5-10.3 [0.0-57.4]) h after saline 0.9% and 8.2 (2.7-29.6 [0.2-55.4]) h after bupivacaine 0.25% (p = 0.054). Transversalis fascia plane injection of bupivacaine 0.25% with adrenaline reduced postoperative pain at rest during 48 h (0-10-point scale) by a mean (95%CI) of 0.9 (0.2-1.6) points (p = 0.013) and on movement by 1.2 (0.4-2.1) points (p = 0.004). We conclude that transversalis fascia plane injection of bupivacaine 0.25% with adrenaline reduces pain and opioid dose after caesarean section compared with saline 0.9%.
Subject(s)
Spinal Anesthesia, Morphine, Female, Pregnancy, Humans, Opioid Analgesics, Cesarean Section, Bupivacaine, Postoperative Pain/drug therapy, Postoperative Pain/prevention & control, Epinephrine, Double-Blind Method, Local Anesthetics
ABSTRACT
BACKGROUND: Antimicrobial resistance is a global patient safety priority, and inappropriate antimicrobial use is a key contributing factor. Evidence has shown that delayed (back-up) antibiotic prescriptions (DP) are an effective and safe strategy for reducing unnecessary antibiotic consumption, but their use remains controversial. METHODS: We conducted a realist review asking why, how, and in what contexts general practitioners (GPs) use DP. We searched five electronic databases for relevant articles and included DP-related data from interviews with healthcare professionals in a related study. Data were analysed using a realist, theory-driven approach: theorising which contexts trigger which mechanisms to produce which outcomes (context-mechanism-outcome configurations: CMOCs). RESULTS: Data from 76 articles and 41 interviews were included to develop a program theory comprising nine key and 56 related CMOCs. These explain GPs' tolerance of risk in the face of different uncertainties and how these may interact with GPs' work environment, self-efficacy and perceived patient concordance to make using DP as a safety net or social tool more or less likely at a given time point. For example, when a GP uses clinical scores or diagnostic tests, a clearly high or low score/test result may mitigate scientific uncertainty and lead to an immediate-antibiotic or no-antibiotic decision, whereas an intermediate result may provoke hermeneutic (interpretation-related) uncertainty and lead to DP being preferred and used as a safety net. Our program theory explains how DP can be used to mitigate some uncertainties but also provoke or exacerbate others. CONCLUSION: This review explains how, why and in what contexts GPs are more or less likely to use DP, as well as the various uncertainties GPs face which DP may mitigate or provoke. We recommend that efforts to plan and implement interventions to optimise antibiotic prescribing in primary care consider these uncertainties and the contexts in which DP may be preferred, or dispreferred, over other interventions to reduce antibiotic prescribing. We also make the following recommendations, with example activities for each: (i) reducing demand for immediate antibiotics; (ii) framing DP as an 'active' prescribing option; (iii) documenting the decision-making process around DP; and (iv) facilitating social and system support.
Subject(s)
Anti-Bacterial Agents, Physicians' Practice Patterns, Primary Health Care, Humans, Anti-Bacterial Agents/therapeutic use, Uncertainty, Physicians' Practice Patterns/statistics & numerical data, General Practitioners/psychology, Drug Prescriptions/statistics & numerical data, Inappropriate Prescribing/prevention & control
ABSTRACT
BACKGROUND: Evidence on the long-term employment consequences of SARS-CoV-2 infection is lacking. We used data from a large, community-based sample in the UK to estimate associations between Long Covid and employment outcomes. METHODS: This was an observational, longitudinal study using a pre-post design. We included survey participants from 3 February 2021 to 30 September 2022 when they were aged 16-64 years and not in education. Using conditional logit modelling, we explored the time-varying relationship between Long Covid status ≥12 weeks after a first test-confirmed SARS-CoV-2 infection (reference: pre-infection) and labour market inactivity (neither working nor looking for work) or workplace absence lasting ≥4 weeks. RESULTS: Of 206 299 participants (mean age 45 years, 54% female, 92% white), 15% were ever labour market inactive and 10% were ever long-term absent during follow-up. Compared with pre-infection, inactivity was higher in participants reporting Long Covid 30 to <40 weeks [adjusted odds ratio (aOR): 1.45; 95% CI: 1.17-1.81] or 40 to <52 weeks (aOR: 1.34; 95% CI: 1.05-1.72) post-infection. Combining with official statistics on Long Covid prevalence, and assuming a correct statistical model, our estimates translate to 27 000 (95% CI: 6000-47 000) working-age adults in the UK being inactive because of Long Covid in July 2022. CONCLUSIONS: Long Covid is likely to have contributed to reduced participation in the UK labour market, though it is unlikely to be the sole driver. Further research is required to quantify the contribution of other factors, such as indirect health effects of the pandemic.
Subject(s)
COVID-19, Employment, SARS-CoV-2, Humans, COVID-19/epidemiology, Female, Male, Middle Aged, Adult, Employment/statistics & numerical data, Longitudinal Studies, United Kingdom/epidemiology, Adolescent, Young Adult, Cohort Studies
ABSTRACT
Estimating real-world vaccine effectiveness is vital to assessing the coronavirus disease 2019 (COVID-19) vaccination program and informing the ongoing policy response. However, estimating vaccine effectiveness using observational data is inherently challenging because of the nonrandomized design and potential for unmeasured confounding. We used a regression discontinuity design to estimate vaccine effectiveness against COVID-19 mortality in England using the fact that people aged 80 years or older were prioritized for the vaccine rollout. The prioritization led to a large discrepancy in vaccination rates among people aged 80-84 years compared with those aged 75-79 at the beginning of the vaccination campaign. We found a corresponding difference in COVID-19 mortality but not in non-COVID-19 mortality, suggesting that our approach appropriately addressed the issue of unmeasured confounding factors. Our results suggest that the first vaccine dose reduced the risk of COVID-19 death by 52.6% (95% confidence limits: 15.7, 73.4) in those aged 80 years, supporting existing evidence that a first dose of a COVID-19 vaccine had a strong protective effect against COVID-19 mortality in older adults. The regression discontinuity model's estimate of vaccine effectiveness is only slightly lower than those of previously published studies using different methods, suggesting that these estimates are unlikely to be substantially affected by unmeasured confounding factors.
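A minimal sketch of the regression discontinuity idea follows, fitted to simulated data: a local linear model with separate slopes on each side of the age-80 cutoff, where the coefficient on the treatment indicator estimates the jump in mortality at the threshold. The bandwidth, functional form and effect sizes are assumptions for illustration, not the paper's specification.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
age = rng.uniform(75, 85, 200_000)
treated = (age >= 80).astype(int)          # prioritized for vaccination at 80+
# simulated COVID-19 mortality risk: rises smoothly with age, drops at cutoff
risk = 0.004 + 0.0006 * (age - 80) - 0.002 * treated
died = (rng.random(age.size) < risk).astype(int)

df = pd.DataFrame({"died": died, "agec": age - 80, "treated": treated})
local = df[df["agec"].abs() <= 2.5]        # 2.5-year bandwidth (assumed)

# local linear probability model; the interaction allows different slopes on
# each side, and the 'treated' coefficient is the discontinuity at age 80
fit = smf.ols("died ~ agec * treated", data=local).fit()
print(fit.params["treated"], fit.conf_int().loc["treated"].values)
```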
Subject(s)
COVID-19 Vaccines, COVID-19, Humans, Aged, COVID-19/prevention & control, England/epidemiology, Immunization Programs, Policies, Vaccination
ABSTRACT
BACKGROUND: To determine the extent and nature of changes in healthcare utilisation associated with COVID-19 infection, this study observed healthcare contact 1 to 4 and 5 to 24 weeks after a COVID-19 diagnosis compared to propensity-matched controls. METHODS: A total of 249,390 Welsh individuals with a positive reverse transcription-polymerase chain reaction (RT-PCR) test were identified from national PCR test results. After applying exclusion criteria, 98,600 test-positive individuals were matched to test-negative and never-tested controls using propensity matching. Cohorts were split by test location: tests could be taken in either hospital or the community, and controls were those who had tested negative in the respective setting. Survival analysis was used for first clinical outcomes, which were grouped into primary and secondary outcomes. Primary outcomes were post-viral illness and fatigue, as indicators of long COVID. Secondary outcomes were clinical terminology concepts for embolism, respiratory conditions, mental health conditions, fit notes, or hospital attendance. Increased instantaneous risk for positive individuals was quantified using hazard ratios (HRs) from Cox regression, while absolute and relative risks were quantified using life table analysis. RESULTS: Analysis was conducted using all individuals and stratified by test location, with cases compared to controls from the same test location. Fatigue (HR: 1.77, 95% CI: 1.34-2.25, p < 0.001) and embolism (HR: 1.50, 95% CI: 1.15-1.97, p = 0.003) were more likely in all positive individuals in the first 4 weeks, whereas anxiety and depression (HR: 0.83, 95% CI: 0.73-0.95, p = 0.007) were less likely. Positive individuals continued to be at greater risk of fatigue (HR: 1.47, 95% CI: 1.24-1.75, p < 0.001) and embolism (HR: 1.51, 95% CI: 1.13-2.02, p = 0.005) after 4 weeks. All positive individuals were also at greater risk of post-viral illness (HR: 4.57, 95% CI: 1.77-11.80, p = 0.002). Despite statistical associations between testing positive and several conditions, life table analysis showed that only a small minority of the study population were affected. CONCLUSIONS: Community COVID-19 disease is associated with increased risks of post-viral illness, fatigue, embolism, and respiratory conditions. Despite elevated risks, the absolute healthcare burden is low: either very small proportions of people experience adverse outcomes following COVID-19, or they are not presenting to healthcare.
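A minimal sketch of the kind of Cox model used for the first-event analyses above, via the lifelines package; the simulated data, 24-week administrative censoring and variable names are assumptions for illustration, not the study's matched design.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 2000
positive = rng.integers(0, 2, n)           # COVID-19 test status
age = rng.integers(18, 90, n)
# exponential event times with a higher fatigue hazard for test-positive people
hazard = 0.01 * np.exp(0.4 * positive + 0.01 * (age - 50))
time = rng.exponential(1 / hazard)
event = (time <= 24).astype(int)           # administrative censoring at 24 weeks

df = pd.DataFrame({"weeks": np.minimum(time, 24.0), "fatigue": event,
                   "positive": positive, "age": age})
cph = CoxPHFitter()
cph.fit(df, duration_col="weeks", event_col="fatigue")
cph.print_summary()  # exp(coef) for 'positive' ~ exp(0.4) = 1.5, the true HR
```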
Subject(s)
COVID-19, Virus Diseases, Humans, COVID-19/diagnosis, COVID-19/epidemiology, COVID-19/complications, COVID-19 Testing, SARS-CoV-2, Post-Acute COVID-19 Syndrome, Cohort Studies, Wales/epidemiology, Electronic Health Records, Health Care Delivery, Fatigue
ABSTRACT
In some clinical scenarios, for example severe sepsis caused by extensively drug-resistant bacteria, there is uncertainty between many common treatments, but a conventional multiarm randomized trial is not possible because individual participants may not be eligible to receive certain treatments. The personalized randomized controlled trial (PRACTical) design allows each participant to be randomized between a "personalized randomization list" of treatments that are suitable for them. The primary aim is to produce treatment rankings that can guide choice of treatment, rather than focusing on estimates of relative treatment effects. Here we use simulation to assess several novel analysis approaches for this innovative trial design. One approach is analogous to a network meta-analysis: participants with the same personalized randomization list are treated like a trial, and both direct and indirect evidence are used. We evaluate this proposed analysis and compare it with analyses making less use of indirect evidence. We also propose new performance measures, including the expected improvement in outcome if the trial's rankings are used to inform future treatment rather than a random choice. We conclude that analysis of a personalized randomized controlled trial can be performed by pooling data from different types of participants and, based on the parameters of our simulation, is robust to moderate subgroup-by-intervention interactions. The proposed approach performs well with respect to estimation bias and coverage, provides an overall treatment ranking list with reasonable precision, and is likely to improve outcomes on average if used to determine intervention policies and guide individual clinical decisions.
Subject(s)
Randomized Controlled Trials as Topic, Research Design, Humans, Precision Medicine, Patient Participation
ABSTRACT
Bayesian analysis of a non-inferiority trial is advantageous in allowing direct probability statements to be made about the relative treatment difference rather than relying on an arbitrary and often poorly justified non-inferiority margin. When the primary analysis will be Bayesian, a Bayesian approach to sample size determination will often be appropriate for consistency with the analysis. We demonstrate three Bayesian approaches to choosing sample size for non-inferiority trials with binary outcomes and review their advantages and disadvantages. First, we present a predictive power approach for determining sample size using the probability that the trial will produce a convincing result in the final analysis. Next, we determine sample size by considering the expected posterior probability of non-inferiority in the trial. Finally, we demonstrate a precision-based approach. We apply these methods to a non-inferiority trial in antiretroviral therapy for treatment of HIV-infected children. A predictive power approach would be most accessible in practical settings, because it is analogous to the standard frequentist approach. Sample sizes are larger than with frequentist calculations unless an informative analysis prior is specified, because appropriate allowance is made for uncertainty in the assumed design parameters, ignored in frequentist calculations. An expected posterior probability approach will lead to a smaller sample size and is appropriate when the focus is on estimating posterior probability rather than on testing. A precision-based approach would be useful when sample size is restricted by limits on recruitment or costs, but it would be difficult to decide on sample size using this approach alone.
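The predictive power (assurance) approach can be sketched with conjugate Beta-Binomial updating and Monte Carlo, as below. The design prior, vague analysis priors, 10-percentage-point margin and 0.9 success threshold are all assumptions for illustration, not the HIV trial's inputs; a margin reappears here only to define a concrete success criterion for the simulated final analysis.

```python
import numpy as np

rng = np.random.default_rng(42)
MARGIN, THRESHOLD = 0.10, 0.90  # success: P(pE - pC > -MARGIN | data) >= THRESHOLD

def predictive_power(n_per_arm, n_trials=2000, n_post=4000):
    successes = 0
    for _ in range(n_trials):
        # design prior: uncertainty about the true response rates (~80%)
        pC = rng.beta(80, 20)
        pE = rng.beta(80, 20)
        yC = rng.binomial(n_per_arm, pC)
        yE = rng.binomial(n_per_arm, pE)
        # vague Beta(1,1) analysis priors give Beta posteriors; sample both arms
        postC = rng.beta(1 + yC, 1 + n_per_arm - yC, n_post)
        postE = rng.beta(1 + yE, 1 + n_per_arm - yE, n_post)
        if np.mean(postE - postC > -MARGIN) >= THRESHOLD:
            successes += 1
    return successes / n_trials

for n in (100, 200, 400):
    # assurance rises with n but plateaus below 1, because the design prior
    # admits scenarios where the experimental arm truly is inferior
    print(n, predictive_power(n))
```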
Subject(s)
Research Design, Child, Humans, Bayes Theorem, Probability, Sample Size, Uncertainty, Equivalence Trials as Topic
ABSTRACT
BACKGROUND: How severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) infectivity varies with viral load is incompletely understood. Whether rapid point-of-care antigen lateral flow devices (LFDs) detect most potential transmission sources despite imperfect clinical sensitivity is unknown. METHODS: We combined SARS-CoV-2 testing and contact tracing data from England between 1 September 2020 and 28 February 2021. We used multivariable logistic regression to investigate relationships between polymerase chain reaction (PCR)-confirmed infection in contacts of community-diagnosed cases and index case viral load, S gene target failure (a proxy for B.1.1.7 infection), demographics, SARS-CoV-2 incidence, social deprivation, and contact event type. We used LFD performance to simulate the proportion of cases with a PCR-positive contact expected to be detected using 1 of 4 LFDs. RESULTS: In total, 231,498/2,474,066 (9%) contacts of 1,064,004 index cases tested PCR-positive. PCR-positive results in contacts independently increased with higher case viral loads (lower cycle threshold [Ct] values), for example, 11.7% (95% confidence interval [CI] 11.5-12.0%) at Ct = 15 and 4.5% (95% CI 4.4-4.6%) at Ct = 30. B.1.1.7 infection increased PCR-positive results by ~50% (e.g., 1.55-fold, 95% CI 1.49-1.61, at Ct = 20). PCR-positive results were most common in household contacts (at Ct = 20.1, 8.7% [95% CI 8.6-8.9%]), followed by household visitors (7.1% [95% CI 6.8-7.3%]), contacts at events/activities (5.2% [95% CI 4.9-5.4%]), and work/education (4.6% [95% CI 4.4-4.8%]), and were least common after outdoor contact (2.9% [95% CI 2.3-3.8%]). Contacts of children were the least likely to test positive, particularly following contact outdoors or at work/education. The most and least sensitive LFDs would detect 89.5% (95% CI 89.4-89.6%) and 83.0% (95% CI 82.8-83.1%) of cases with PCR-positive contacts, respectively. CONCLUSIONS: SARS-CoV-2 infectivity varies by case viral load, contact event type, and age. Those with high viral loads are the most infectious. B.1.1.7 increased transmission by ~50%. The best-performing LFDs detect most infectious cases.
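The final analysis step above, simulating the share of transmission sources an LFD would catch, can be sketched as a weighted average of a sensitivity curve over the case Ct distribution. All three curves below (Ct distribution, transmission risk, LFD sensitivity) are assumed shapes loosely anchored to the figures quoted above, not the fitted models.

```python
import numpy as np

rng = np.random.default_rng(7)
ct = rng.normal(22, 5, 100_000).clip(10, 35)  # assumed index-case Ct distribution

def p_transmit(ct):
    # transmission risk falls as Ct rises; loosely anchored to ~12% at Ct 15
    return 0.12 * np.exp(-0.06 * (ct - 15))

def lfd_sensitivity(ct):
    # LFDs detect high-viral-load (low-Ct) cases best; assumed logistic shape
    return 1.0 / (1.0 + np.exp(0.5 * (ct - 28)))

weights = p_transmit(ct)                      # weight cases by infectivity
share = np.average(lfd_sensitivity(ct), weights=weights)
print(f"share of cases with a PCR-positive contact detected: {share:.1%}")
```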
Subject(s)
COVID-19, SARS-CoV-2, COVID-19 Testing, Child, Family Characteristics, Humans, Viral Load
ABSTRACT
BACKGROUND: Natural and vaccine-induced immunity will play a key role in controlling the severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) pandemic. SARS-CoV-2 variants have the potential to evade natural and vaccine-induced immunity. METHODS: In a longitudinal cohort study of healthcare workers (HCWs) in Oxfordshire, United Kingdom, we investigated the protection from symptomatic and asymptomatic polymerase chain reaction (PCR)-confirmed SARS-CoV-2 infection conferred by vaccination (Pfizer-BioNTech BNT162b2, Oxford-AstraZeneca ChAdOx1 nCoV-19) and prior infection (determined using anti-spike antibody status), using Poisson regression adjusted for age, sex, temporal changes in incidence, and role. We estimated protection conferred after 1 versus 2 vaccinations and from infections with the B.1.1.7 variant identified using whole-genome sequencing. RESULTS: In total, 13,109 HCWs participated; 8285 received the Pfizer-BioNTech vaccine (1407 two doses) and 2738 the Oxford-AstraZeneca vaccine (49 two doses). Compared to unvaccinated seronegative HCWs, natural immunity and 2 vaccination doses provided similar protection against symptomatic infection: no HCW vaccinated twice had symptomatic infection, and incidence was 98% lower in seropositive HCWs (adjusted incidence rate ratio 0.02 [95% confidence interval {CI} <.01-.18]). Two vaccine doses or seropositivity reduced the incidence of any PCR-positive result with or without symptoms by 90% (0.10 [95% CI .02-.38]) and 85% (0.15 [95% CI .08-.26]), respectively. Single-dose vaccination reduced the incidence of symptomatic infection by 67% (0.33 [95% CI .21-.52]) and of any PCR-positive result by 64% (0.36 [95% CI .26-.50]). There was no evidence of differences in immunity induced by natural infection and vaccination for infections with S-gene target failure and B.1.1.7. CONCLUSIONS: Natural infection resulting in detectable anti-spike antibodies and 2 vaccine doses both provide robust protection against SARS-CoV-2 infection, including against the B.1.1.7 variant.
Subject(s)
COVID-19, SARS-CoV-2, BNT162 Vaccine, COVID-19/epidemiology, COVID-19/prevention & control, COVID-19 Vaccines, ChAdOx1 nCoV-19, Cohort Studies, Health Personnel, Humans, Immunoglobulins, Incidence, Longitudinal Studies, Vaccination
Subject(s)
Anti-Bacterial Agents, Azithromycin, Bacterial Infections, Mass Drug Administration, Humans, Anti-Bacterial Agents/administration & dosage, Anti-Bacterial Agents/adverse effects, Azithromycin/administration & dosage, Azithromycin/adverse effects, Randomized Controlled Trials as Topic, Infant, Preschool Child, Niger/epidemiology, Infant Mortality, Child Mortality, Mass Drug Administration/adverse effects, Mass Drug Administration/methods, Mass Drug Administration/statistics & numerical data, Chemoprevention/adverse effects, Chemoprevention/methods, Chemoprevention/statistics & numerical data, Bacterial Drug Resistance, Bacterial Infections/mortality, Bacterial Infections/prevention & control
ABSTRACT
BACKGROUND: Severe anemia (hemoglobin level, <6 g per deciliter) is a leading cause of hospital admission and death in children in sub-Saharan Africa. The World Health Organization recommends transfusion of 20 ml of whole-blood equivalent per kilogram of body weight for anemia, regardless of hemoglobin level. METHODS: In this factorial, open-label trial, we randomly assigned Ugandan and Malawian children 2 months to 12 years of age with a hemoglobin level of less than 6 g per deciliter and severity features (e.g., respiratory distress or reduced consciousness) to receive immediate blood transfusion with 20 ml per kilogram or 30 ml per kilogram. Three other randomized analyses investigated immediate as compared with no immediate transfusion, the administration of postdischarge micronutrients, and postdischarge prophylaxis with trimethoprim-sulfamethoxazole. The primary outcome was 28-day mortality. RESULTS: A total of 3196 eligible children (median age, 37 months; 2050 [64.1%] with malaria) were assigned to receive a transfusion of 30 ml per kilogram (1598 children) or 20 ml per kilogram (1598 children) and were followed for 180 days. A total of 1592 children (99.6%) in the higher-volume group and 1596 (99.9%) in the lower-volume group started transfusion (median, 1.2 hours after randomization). The mean (±SD) volume of total blood transfused per child was 475±385 ml and 353±348 ml, respectively; 197 children (12.3%) and 300 children (18.8%) in the respective groups received additional transfusions. Overall, 55 children (3.4%) in the higher-volume group and 72 (4.5%) in the lower-volume group died before 28 days (hazard ratio, 0.76; 95% confidence interval [CI], 0.54 to 1.08; P = 0.12 by log-rank test). This finding masked significant heterogeneity in 28-day mortality according to the presence or absence of fever (>37.5°C) at screening (P=0.001 after Sidak correction). Among the 1943 children (60.8%) without fever, mortality was lower with a transfusion volume of 30 ml per kilogram than with a volume of 20 ml per kilogram (hazard ratio, 0.43; 95% CI, 0.27 to 0.69). Among the 1253 children (39.2%) with fever, mortality was higher with 30 ml per kilogram than with 20 ml per kilogram (hazard ratio, 1.91; 95% CI, 1.04 to 3.49). There was no evidence of differences between the randomized groups in readmissions, serious adverse events, or hemoglobin recovery at 180 days. CONCLUSIONS: Overall mortality did not differ between the two transfusion strategies. (Funded by the Medical Research Council and Department for International Development, United Kingdom; TRACT Current Controlled Trials number, ISRCTN84086586.).
Subject(s)
Anemia/therapy, Blood Transfusion, Hemoglobins/analysis, Anemia/complications, Anemia/mortality, Blood Transfusion/economics, Child, Preschool Child, Cost-Benefit Analysis, Female, Fever/complications, Follow-Up Studies, Health Care Costs, Humans, Infant, Length of Stay/economics, Malaria/complications, Malawi/epidemiology, Male, Patient Readmission/statistics & numerical data, Transfusion Reaction/epidemiology, Uganda/epidemiology
ABSTRACT
BACKGROUND: The World Health Organization recommends not performing transfusions in African children hospitalized for uncomplicated severe anemia (hemoglobin level of 4 to 6 g per deciliter and no signs of clinical severity). However, high mortality and readmission rates suggest that less restrictive transfusion strategies might improve outcomes. METHODS: In this factorial, open-label, randomized, controlled trial, we assigned Ugandan and Malawian children 2 months to 12 years of age with uncomplicated severe anemia to immediate transfusion with 20 ml or 30 ml of whole-blood equivalent per kilogram of body weight, as determined in a second simultaneous randomization, or no immediate transfusion (control group), in which transfusion with 20 ml of whole-blood equivalent per kilogram was triggered by new signs of clinical severity or a drop in hemoglobin to below 4 g per deciliter. The primary outcome was 28-day mortality. Three other randomizations investigated transfusion volume, postdischarge supplementation with micronutrients, and postdischarge prophylaxis with trimethoprim-sulfamethoxazole. RESULTS: A total of 1565 children (median age, 26 months) underwent randomization, with 778 assigned to the immediate-transfusion group and 787 to the control group; 984 children (62.9%) had malaria. The children were followed for 180 days, and 71 (4.5%) were lost to follow-up. During the primary hospitalization, transfusion was performed in all the children in the immediate-transfusion group and in 386 (49.0%) in the control group (median time to transfusion, 1.3 hours vs. 24.9 hours after randomization). The mean (±SD) total blood volume transfused per child was 314±228 ml in the immediate-transfusion group and 142±224 ml in the control group. Death had occurred by 28 days in 7 children (0.9%) in the immediate-transfusion group and in 13 (1.7%) in the control group (hazard ratio, 0.54; 95% confidence interval [CI], 0.22 to 1.36; P = 0.19) and by 180 days in 35 (4.5%) and 47 (6.0%), respectively (hazard ratio, 0.75; 95% CI, 0.48 to 1.15), without evidence of interaction with other randomizations (P>0.20) or evidence of between-group differences in readmissions, serious adverse events, or hemoglobin recovery at 180 days. The mean length of hospital stay was 0.9 days longer in the control group. CONCLUSIONS: There was no evidence of differences in clinical outcomes over 6 months between the children who received immediate transfusion and those who did not. The triggered-transfusion strategy in the control group resulted in lower blood use; however, the length of hospital stay was longer, and this strategy required clinical and hemoglobin monitoring. (Funded by the Medical Research Council and Department for International Development; TRACT Current Controlled Trials number, ISRCTN84086586.).
Subject(s)
Anemia/therapy, Blood Transfusion, Hemoglobins/analysis, Time to Treatment, Anemia/complications, Anemia/mortality, Blood Transfusion/economics, Child, Preschool Child, Cost-Benefit Analysis, Female, Follow-Up Studies, Health Care Costs, Humans, Infant, Length of Stay/economics, Malaria/complications, Malawi/epidemiology, Male, Patient Readmission/statistics & numerical data, Transfusion Reaction/epidemiology, Uganda/epidemiology
ABSTRACT
BACKGROUND: The management of complex orthopedic infections usually includes a prolonged course of intravenous antibiotic agents. We investigated whether oral antibiotic therapy is noninferior to intravenous antibiotic therapy for this indication. METHODS: We enrolled adults who were being treated for bone or joint infection at 26 U.K. centers. Within 7 days after surgery (or, if the infection was being managed without surgery, within 7 days after the start of antibiotic treatment), participants were randomly assigned to receive either intravenous or oral antibiotics to complete the first 6 weeks of therapy. Follow-on oral antibiotics were permitted in both groups. The primary end point was definitive treatment failure within 1 year after randomization. In the analysis of the risk of the primary end point, the noninferiority margin was 7.5 percentage points. RESULTS: Among the 1054 participants (527 in each group), end-point data were available for 1015 (96.3%). Treatment failure occurred in 74 of 506 participants (14.6%) in the intravenous group and 67 of 509 participants (13.2%) in the oral group. Missing end-point data (39 participants, 3.7%) were imputed. The intention-to-treat analysis showed a difference in the risk of definitive treatment failure (oral group vs. intravenous group) of -1.4 percentage points (90% confidence interval [CI], -4.9 to 2.2; 95% CI, -5.6 to 2.9), indicating noninferiority. Complete-case, per-protocol, and sensitivity analyses supported this result. The between-group difference in the incidence of serious adverse events was not significant (146 of 527 participants [27.7%] in the intravenous group and 138 of 527 [26.2%] in the oral group; P=0.58). Catheter complications, analyzed as a secondary end point, were more common in the intravenous group (9.4% vs. 1.0%). CONCLUSIONS: Oral antibiotic therapy was noninferior to intravenous antibiotic therapy when used during the first 6 weeks for complex orthopedic infection, as assessed by treatment failure at 1 year. (Funded by the National Institute for Health Research; OVIVA Current Controlled Trials number, ISRCTN91566927 .).
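The headline non-inferiority comparison can be checked with a simple Wald interval for the risk difference (oral minus intravenous) against the -7.5 percentage-point margin, as sketched below; the trial's intention-to-treat analysis additionally imputed missing end points, so the published intervals differ slightly.

```python
from math import sqrt
from scipy.stats import norm

fail_iv, n_iv = 74, 506   # treatment failures, intravenous group
fail_po, n_po = 67, 509   # treatment failures, oral group

p_iv, p_po = fail_iv / n_iv, fail_po / n_po
diff = p_po - p_iv        # negative favours oral therapy (-1.46 points here)
se = sqrt(p_iv * (1 - p_iv) / n_iv + p_po * (1 - p_po) / n_po)

for level in (0.90, 0.95):
    z = norm.ppf(1 - (1 - level) / 2)
    lo, hi = diff - z * se, diff + z * se
    verdict = "non-inferior" if lo > -0.075 else "inconclusive"
    print(f"{level:.0%} CI: {lo:+.3f} to {hi:+.3f} -> {verdict}")
```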
Subject(s)
Oral Administration, Anti-Bacterial Agents/administration & dosage, Infectious Bone Diseases/drug therapy, Joint Diseases/drug therapy, Intravenous Administration, Adolescent, Adult, Aged, Aged 80 and over, Anti-Bacterial Agents/adverse effects, Anti-Bacterial Agents/pharmacokinetics, Female, Humans, Intention-to-Treat Analysis, Male, Medication Adherence, Middle Aged, Treatment Outcome, Young Adult
ABSTRACT
BACKGROUND: Reported bacteraemia outcomes following inactive empirical antibiotics (based on in vitro testing) are conflicting, potentially reflecting heterogeneity in causative species, MIC breakpoints defining resistance/susceptibility, and times to rescue therapy. METHODS: We investigated adult inpatients with Escherichia coli bacteraemia at Oxford University Hospitals, UK, from 4 February 2014 to 30 June 2021 who were receiving empirical amoxicillin/clavulanate with/without other antibiotics. We used Cox regression to analyse 30-day all-cause mortality by in vitro amoxicillin/clavulanate susceptibility (activity) using the EUCAST resistance breakpoint (>8/2 mg/L), categorical MIC, and a higher resistance breakpoint (>32/2 mg/L), adjusting for other antibiotic activity and confounders including comorbidities, vital signs and blood tests. RESULTS: A total of 1720 E. coli bacteraemias (1626 patients) were treated with empirical amoxicillin/clavulanate. Thirty-day mortality was 193/1400 (14%) for any active baseline therapy and 52/320 (16%) for inactive baseline therapy (P = 0.17). With EUCAST breakpoints, there was no evidence that mortality differed for inactive versus active amoxicillin/clavulanate [adjusted HR (aHR) = 1.27 (95% CI 0.83-1.93); P = 0.28], nor of an association with active aminoglycoside (P = 0.93) or other active antibiotics (P = 0.18). Considering categorical amoxicillin/clavulanate MIC, MICs > 32/2 mg/L were associated with mortality [aHR = 1.85 versus MIC = 2/2 mg/L (95% CI 0.99-3.73); P = 0.054]. A higher resistance breakpoint (>32/2 mg/L) was independently associated with higher mortality [aHR = 1.82 (95% CI 1.07-3.10); P = 0.027], as were MICs > 32/2 mg/L with active empirical aminoglycosides [aHR = 2.34 (95% CI 1.40-3.89); P = 0.001], but not MICs > 32/2 mg/L with active non-aminoglycoside antibiotic(s) [aHR = 0.87 (95% CI 0.40-1.89); P = 0.72]. CONCLUSIONS: We found no evidence that EUCAST-defined amoxicillin/clavulanate resistance was associated with increased mortality, but a higher resistance breakpoint (MIC > 32/2 mg/L) was. Additional active baseline non-aminoglycoside antibiotics attenuated amoxicillin/clavulanate resistance-associated mortality, but aminoglycosides did not. Granular phenotyping and comparison with clinical outcomes may improve AMR breakpoints.