ABSTRACT
BACKGROUND: An interatrial shunt may provide an autoregulatory mechanism to decrease left atrial pressure and improve heart failure (HF) symptoms and prognosis. METHODS: Patients with symptomatic HF with any left ventricular ejection fraction (LVEF) were randomized 1:1 to transcatheter shunt implantation versus a placebo procedure, stratified by reduced (≤40%) versus preserved (>40%) LVEF. The primary safety outcome was a composite of device-related or procedure-related major adverse cardiovascular or neurological events at 30 days compared with a prespecified performance goal of 11%. The primary effectiveness outcome was the hierarchical composite ranking of all-cause death, cardiac transplantation or left ventricular assist device implantation, HF hospitalization, outpatient worsening HF events, and change in quality of life from baseline measured by the Kansas City Cardiomyopathy Questionnaire overall summary score through a maximum of 2 years of follow-up, assessed when the last enrolled patient reached 1-year follow-up, expressed as the win ratio. Prespecified hypothesis-generating analyses were performed on patients with reduced and preserved LVEF. RESULTS: Between October 24, 2018, and October 19, 2022, 508 patients were randomized at 94 sites in 11 countries to interatrial shunt treatment (n=250) or a placebo procedure (n=258). Median (25th and 75th percentiles) age was 73.0 years (66.0, 79.0), and 189 patients (37.2%) were women. LVEF was reduced (≤40%) in 206 patients (40.6%) and preserved (>40%) in 302 patients (59.4%). No primary safety events occurred after shunt implantation (upper 97.5% confidence limit, 1.5%; P<0.0001). There was no difference in the 2-year primary effectiveness outcome between the shunt and placebo procedure groups (win ratio, 0.86 [95% CI, 0.61-1.22]; P=0.20). However, patients with reduced LVEF had fewer adverse cardiovascular events with shunt treatment versus placebo (annualized rate 49.0% versus 88.6%; relative risk, 0.55 [95% CI, 0.42-0.73]; P<0.0001), whereas patients with preserved LVEF had more cardiovascular events with shunt treatment (annualized rate 60.2% versus 35.9%; relative risk, 1.68 [95% CI, 1.29-2.19]; P=0.0001; Pinteraction<0.0001). There were no between-group differences in change in Kansas City Cardiomyopathy Questionnaire overall summary score during follow-up in all patients or in those with reduced or preserved LVEF. CONCLUSIONS: Transcatheter interatrial shunt implantation was safe but did not improve outcomes in patients with HF. However, the results from a prespecified exploratory analysis in stratified randomized groups suggest that shunt implantation is beneficial in patients with reduced LVEF and harmful in patients with preserved LVEF. REGISTRATION: URL: https://www.clinicaltrials.gov; Unique identifier: NCT03499236.
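As an illustration of the win ratio used for the primary effectiveness outcome, the sketch below scores every treated-versus-placebo patient pair on a simplified hierarchy (death, then HF hospitalizations, then KCCQ change) with made-up data; it is not the trial's prespecified ranking algorithm.

```python
from itertools import product

def compare(treated, control):
    """Compare one treated/control pair on a simplified hierarchical composite.

    Each patient is a tuple (died, hf_hospitalizations, kccq_change); lower is
    better for the first two, higher is better for KCCQ change. Returns 'win',
    'loss', or 'tie' from the treated patient's perspective. Hypothetical
    ranking rules, not the trial's prespecified hierarchy.
    """
    if treated[0] != control[0]:
        return "win" if treated[0] < control[0] else "loss"
    if treated[1] != control[1]:
        return "win" if treated[1] < control[1] else "loss"
    if treated[2] != control[2]:
        return "win" if treated[2] > control[2] else "loss"
    return "tie"

def win_ratio(treated_group, control_group):
    results = [compare(t, c) for t, c in product(treated_group, control_group)]
    wins, losses = results.count("win"), results.count("loss")
    return wins / losses  # >1 favors treatment, <1 favors control

# Toy data: (death 0/1, HF hospitalizations, KCCQ change from baseline)
shunt = [(0, 1, 10.0), (1, 2, -5.0), (0, 0, 3.0)]
placebo = [(0, 2, 2.0), (0, 1, 8.0), (1, 0, 0.0)]
print(round(win_ratio(shunt, placebo), 2))
```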
ABSTRACT
BACKGROUND: Cholera surveillance relies on clinical diagnosis of acute watery diarrhea. Suspected cholera case definitions have high sensitivity but low specificity, challenging our ability to characterize cholera burden and epidemiology. Our objective was to estimate the proportion of clinically suspected cholera cases that are true Vibrio cholerae infections and identify factors that explain variation in positivity. METHODS AND FINDINGS: We conducted a systematic review of studies that tested ≥10 suspected cholera cases for V. cholerae O1/O139 using culture, PCR, and/or a rapid diagnostic test. We searched PubMed, Embase, Scopus, and Google Scholar for studies that sampled at least one suspected case between January 1, 2000 and April 19, 2023, to reflect contemporary patterns in V. cholerae positivity. We estimated diagnostic test sensitivity and specificity using a latent class meta-analysis. We estimated V. cholerae positivity using a random-effects meta-analysis, adjusting for test performance. We included 119 studies from 30 countries. V. cholerae positivity was lower in studies with representative sampling and in studies that set minimum ages in suspected case definitions. After adjusting for test performance, on average, 52% (95% credible interval (CrI): 24%, 80%) of suspected cases represented true V. cholerae infections. After adjusting for test performance and study methodology, the odds of a suspected case having a true infection were 5.71 times higher (95% CrI: 1.53, 15.43) when surveillance was initiated in response to an outbreak than in non-outbreak settings. Variation across studies was high, and a limitation of our approach was that we were unable to explain all the heterogeneity with study-level attributes, including diagnostic test used, setting, and case definitions. CONCLUSIONS: In this study, we found that burden estimates based on suspected cases alone may overestimate the incidence of medically attended cholera by 2-fold. However, accounting for cases missed by traditional clinical surveillance is key to unbiased cholera burden estimates. Given the substantial variability in positivity between settings, extrapolations from suspected to confirmed cases, which are necessary to estimate cholera incidence rates without exhaustive testing, should be based on local data.
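The adjustment of observed positivity for imperfect test performance can be illustrated with the standard Rogan-Gladen correction below; the sensitivity and specificity values are placeholders, and the study itself used a Bayesian latent class meta-analysis across culture, PCR, and RDT results.

```python
def rogan_gladen(observed_positivity, sensitivity, specificity):
    """Adjust apparent positivity for imperfect test performance.

    true_prevalence = (p_obs + spec - 1) / (sens + spec - 1), clipped to
    [0, 1]. Illustrative only; not the latent class model used in the study.
    """
    est = (observed_positivity + specificity - 1) / (sensitivity + specificity - 1)
    return min(max(est, 0.0), 1.0)

# Hypothetical numbers: 65% of suspected cases test positive by culture,
# assuming culture sensitivity of 0.71 and specificity of 0.98.
print(round(rogan_gladen(0.65, 0.71, 0.98), 2))
```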
Subjects
Cholera, Vibrio cholerae, Humans, Cholera/diagnosis, Cholera/epidemiology, Vibrio cholerae/genetics, Disease Outbreaks, Diarrhea/epidemiology, Polymerase Chain Reaction
ABSTRACT
Improvements in water and sanitation should reduce cholera risk, though the associations between cholera and specific water and sanitation access measures remain unclear. We estimated the association between eight water and sanitation access measures and annual cholera incidence across sub-Saharan Africa (2010-2016) for data aggregated at the country and district levels. We fit random forest regression and classification models to understand how well these measures combined might be able to predict cholera incidence rates and identify high cholera incidence areas. Across spatial scales, piped or "other improved" water access was inversely associated with cholera incidence. Access to piped water, septic or sewer sanitation, and septic, sewer, or "other improved" sanitation were associated with decreased district-level cholera incidence. The classification model had moderate performance in identifying high cholera incidence areas (cross-validated AUC 0.81, 95% CI 0.78-0.83) with high negative predictive values (93-100%), indicating the utility of water and sanitation measures for screening out areas that are unlikely to be at high cholera risk. While comprehensive cholera risk assessments must incorporate other data sources (e.g., historical incidence), our results suggest that water and sanitation measures alone could be useful in narrowing the geographic focus for detailed risk assessments.
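A minimal sketch of the classification step is shown below, assuming a district-level table with eight access measures and a binary high-incidence label; the data and the synthetic labeling rule are placeholders, not the study's inputs.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Placeholder design matrix: one row per district, eight water/sanitation
# access proportions; label 1 = high cholera incidence (>1 case/1,000/year).
X = rng.uniform(0, 1, size=(500, 8))
y = (X[:, 0] + X[:, 1] < 0.8).astype(int)  # synthetic rule for illustration

clf = RandomForestClassifier(n_estimators=500, random_state=0)
auc = cross_val_score(clf, X, y, cv=5, scoring="roc_auc")
print(f"cross-validated AUC: {auc.mean():.2f}")
```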
Subjects
Cholera, Water, Humans, Sanitation, Cholera/epidemiology, Cholera/prevention & control, Water Supply, Africa South of the Sahara/epidemiology
ABSTRACT
BACKGROUND: A surveillance system that is sensitive to detecting high burden areas is critical for achieving widespread disease control. In 2014, Bangladesh established a nationwide, facility-based cholera surveillance system for Vibrio cholerae infection. We sought to measure the sensitivity of this surveillance system to detect cases to assess whether cholera elimination targets outlined by the Bangladesh national control plan can be adequately measured. METHODS: We overlaid maps of nationally representative annual V cholerae seroincidence onto maps of the catchment areas of facilities where confirmatory laboratory testing for cholera was conducted, and we identified the spatial complement of these catchment areas as surveillance greyspots, areas where cases likely occur but go undetected. We assessed surveillance system sensitivity and changes to sensitivity given alternate surveillance site selection strategies. RESULTS: We estimated that 69% of Bangladeshis (111.7 million individuals) live in surveillance greyspots and that 23% (25.5 million) of these individuals live in areas with the highest V cholerae infection rates. CONCLUSIONS: The cholera surveillance system in Bangladesh can monitor progress towards cholera elimination goals among 31% of the country's population, which may be insufficient for accurately measuring progress. Increasing surveillance coverage, particularly in the highest risk areas, should be considered.
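The greyspot calculation amounts to overlaying a population surface, an infection-rate surface, and a facility-catchment mask; the sketch below uses synthetic arrays rather than the study's actual rasters.

```python
import numpy as np

rng = np.random.default_rng(1)

population = rng.integers(0, 5000, size=(100, 100))    # people per grid cell
seroincidence = rng.uniform(0, 0.3, size=(100, 100))   # annual infection rate
in_catchment = rng.random((100, 100)) < 0.3            # covered by a testing facility

greyspot = ~in_catchment
pct_pop_in_greyspots = population[greyspot].sum() / population.sum()

high_risk = seroincidence > np.quantile(seroincidence, 0.75)
pct_high_risk_unmonitored = (
    population[greyspot & high_risk].sum() / population[high_risk].sum()
)
print(f"{pct_pop_in_greyspots:.0%} of people live in greyspots; "
      f"{pct_high_risk_unmonitored:.0%} of the highest-risk population is unmonitored")
```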
Subjects
Cholera/prevention & control, Public Health Surveillance/methods, Vibrio cholerae, Bangladesh/epidemiology, Cholera/epidemiology, Communicable Disease Control, Humans
ABSTRACT
The coronavirus disease pandemic has highlighted the key role epidemiologic models play in supporting public health decision-making. In particular, these models provide estimates of outbreak potential when data are scarce and decision-making is critical and urgent. We document the integrated modeling response used in the US state of Utah early in the coronavirus disease pandemic, which brought together a diverse set of technical experts and public health and healthcare officials and led to an evidence-based response to the pandemic. We describe how we adapted a standard epidemiologic model; harmonized the outputs across modeling groups; and maintained a constant dialogue with policymakers at multiple levels of government to produce timely, evidence-based, and coordinated public health recommendations and interventions during the first wave of the pandemic. This framework continues to support the state's response to ongoing outbreaks and can be applied in other settings to address unique public health challenges.
Subjects
COVID-19, Disease Outbreaks, Humans, Pandemics, SARS-CoV-2, Utah/epidemiology
ABSTRACT
BACKGROUND: Test-trace-isolate programs are an essential part of coronavirus disease 2019 (COVID-19) control that offer a more targeted approach than many other nonpharmaceutical interventions. Effective use of such programs requires methods to estimate their current and anticipated impact. METHODS AND FINDINGS: We present a mathematical modeling framework to evaluate the expected reductions in the reproductive number, R, from test-trace-isolate programs. This framework is implemented in a publicly available R package and an online application. We evaluated the effects of completeness in case detection and contact tracing and speed of isolation and quarantine using parameters consistent with COVID-19 transmission (R0: 2.5, generation time: 6.5 days). We show that R is most sensitive to changes in the proportion of cases detected in almost all scenarios, and other metrics have a reduced impact when case detection levels are low (<30%). Although test-trace-isolate programs can contribute substantially to reducing R, exceptional performance across all metrics is needed to bring R below one through test-trace-isolate alone, highlighting the need for comprehensive control strategies. Results from this model also indicate that metrics used to evaluate performance of test-trace-isolate, such as the proportion of identified infections among traced contacts, may be misleading. While estimates of the impact of test-trace-isolate are sensitive to assumptions about COVID-19 natural history and adherence to isolation and quarantine, our qualitative findings are robust across numerous sensitivity analyses. CONCLUSIONS: Effective test-trace-isolate programs first need to be strong in the "test" component, as case detection underlies all other program activities. Even moderately effective test-trace-isolate programs are an important tool for controlling the COVID-19 pandemic and can alleviate the need for more restrictive social distancing measures.
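A rough sketch of how a framework like this turns program performance metrics into a reduction in R is given below, under the simplifying assumption that isolating a case before a given fraction of its infectiousness has elapsed blocks that fraction of onward transmission; this is not the published R package's interface.

```python
def r_effective(r0, p_detect, frac_blocked_by_isolation,
                p_trace, p_quarantine, frac_blocked_by_quarantine):
    """Approximate R under a test-trace-isolate program.

    Simplified bookkeeping: detected cases lose a fraction of their onward
    transmission through isolation; contacts of detected cases who are traced
    and quarantined lose a (typically larger) fraction. Assumes independence
    between steps, which real frameworks relax.
    """
    isolation_effect = p_detect * frac_blocked_by_isolation
    quarantine_effect = p_detect * p_trace * p_quarantine * frac_blocked_by_quarantine
    # The two reductions act on different generations; apply them multiplicatively.
    return r0 * (1 - isolation_effect) * (1 - quarantine_effect)

# Hypothetical program: 50% of cases detected and isolated halfway through
# infectiousness; 70% of their contacts traced, 80% of those quarantined
# early enough to block 75% of transmission.
print(round(r_effective(2.5, 0.5, 0.5, 0.7, 0.8, 0.75), 2))
```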
Subjects
COVID-19/prevention & control, Contact Tracing, Disease Outbreaks/prevention & control, Models, Theoretical, COVID-19/diagnosis, Contact Tracing/methods, Humans, Quarantine, SARS-CoV-2/pathogenicity
ABSTRACT
After a period of rapidly declining U.S. COVID-19 incidence during January-March 2021, increases occurred in several jurisdictions (1,2) despite the rapid rollout of a large-scale vaccination program. This increase coincided with the spread of more transmissible variants of SARS-CoV-2, the virus that causes COVID-19, including B.1.1.7 (1,3), and with the relaxation of COVID-19 prevention strategies such as those for businesses, large-scale gatherings, and educational activities. To provide long-term projections of potential trends in COVID-19 cases, hospitalizations, and deaths, COVID-19 Scenario Modeling Hub teams used a multiple-model approach comprising six models to assess the potential course of COVID-19 in the United States across four scenarios with different vaccination coverage rates, vaccine effectiveness estimates, and strength and implementation of nonpharmaceutical interventions (NPIs) (public health policies, such as physical distancing and masking) over a 6-month period (April-September 2021) using data available through March 27, 2021 (4). Among the four scenarios, an accelerated decline in NPI adherence (which encapsulates NPI mandates and population behavior) was shown to undermine vaccination-related gains over the subsequent 2-3 months and, in combination with increased transmissibility of new variants, could lead to surges in cases, hospitalizations, and deaths. A sharp decline in cases was projected by July 2021, with a faster decline in the high-vaccination scenarios. High vaccination rates and compliance with public health prevention measures are essential to control the COVID-19 pandemic and to prevent surges in hospitalizations and deaths in the coming months.
Subjects
COVID-19 Vaccines/administration & dosage, COVID-19/epidemiology, COVID-19/therapy, Hospitalization/statistics & numerical data, Models, Statistical, Public Policy, Vaccination/statistics & numerical data, COVID-19/mortality, COVID-19/prevention & control, Forecasting, Humans, Masks, Physical Distancing, United States/epidemiology
ABSTRACT
Autism spectrum disorder (ASD) describes a group of neurodevelopmental disorders with core deficits in social communication and manifestation of restricted, repetitive, and stereotyped behaviors. Despite the core symptomatology, ASD is extremely heterogeneous with respect to the severity of symptoms and behaviors. This heterogeneity presents an inherent challenge to all large-scale genome-wide omics analyses. In the present study, we address this heterogeneity by stratifying ASD probands from simplex families according to the severity of behavioral scores on the Autism Diagnostic Interview-Revised diagnostic instrument, followed by re-analysis of existing DNA methylation data from individuals in three ASD subphenotypes in comparison to that of their respective unaffected siblings. We demonstrate that subphenotyping of cases enables the identification of over 1.6 times the number of statistically significant differentially methylated regions (DMR) and DMR-associated genes (DAGs) between cases and controls, compared to that identified when all cases are combined. Our analyses also reveal ASD-related neurological functions and comorbidities that are enriched among DAGs in each phenotypic subgroup but not in the combined case group. Moreover, relational gene networks constructed with the DAGs reveal signaling pathways associated with specific functions and comorbidities. In addition, a network comprised of DAGs shared among all ASD subgroups and the combined case group is enriched in genes involved in inflammatory responses, suggesting that neuroinflammation may be a common theme underlying core features of ASD. These findings demonstrate the value of phenotype definition in methylomic analyses of ASD and may aid in the development of subtype-directed diagnostics and therapeutics.
Subjects
Autism Spectrum Disorder, DNA Methylation/genetics, Gene Regulatory Networks, Phenotype, Siblings, Signal Transduction/genetics, Autism Spectrum Disorder/genetics, Autism Spectrum Disorder/metabolism, Female, Humans, Male
ABSTRACT
BACKGROUND: Cholera causes an estimated 100,000 deaths annually worldwide, with the majority of burden reported in sub-Saharan Africa. In May 2018, the World Health Assembly committed to reducing worldwide cholera deaths by 90% by 2030. Oral cholera vaccine (OCV) plays a key role in reducing the near-term risk of cholera, although global supplies are limited. Characterizing the potential impact and cost-effectiveness of mass OCV deployment strategies is critical for setting expectations and developing cholera control plans that maximize the chances of success. METHODS AND FINDINGS: We compared the projected impacts of vaccination campaigns across sub-Saharan Africa from 2018 through 2030 when targeting geographically according to historical cholera burden and risk factors. We assessed the number of averted cases, deaths, and disability-adjusted life years and the cost-effectiveness of these campaigns with models that accounted for direct and indirect vaccine effects and population projections over time. Under current vaccine supply projections, an approach optimized to targeting by historical burden is projected to avert 828,971 (95% CI 803,370-859,980) cases (equivalent to 34.0% of projected cases; 95% CI 33.2%-34.8%). An approach that balances logistical feasibility with targeting historical burden is projected to avert 617,424 (95% CI 599,150-643,891) cases. In contrast, approaches optimized for targeting locations with limited access to water and to sanitation, respectively, are projected to avert 273,939 (95% CI 270,319-277,002) and 109,817 (95% CI 103,735-114,110) cases. We find that the most logistically feasible targeting strategy costs US$1,843 (95% CI 1,328-14,312) per DALY averted during this period and that effective geographic targeting of OCV campaigns can have a greater impact on cost-effectiveness than improvements to vaccine efficacy and moderate increases in coverage. Although our modeling approach does not project annual changes in baseline cholera risk or directly incorporate immunity from natural cholera infection, our estimates of the relative performance of different vaccination strategies should be robust to these factors. CONCLUSIONS: Our study suggests that geographic targeting substantially improves the cost-effectiveness and impact of oral cholera vaccination campaigns. Districts with the poorest access to improved water and sanitation are not the same as districts with the greatest historical cholera incidence. While OCV campaigns can improve cholera control in the near term, without rapid progress in developing water and sanitation services or dramatic increases in OCV supply, our results suggest that vaccine use alone is unlikely to allow us to achieve the 2030 goal.
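The headline cost-effectiveness figure is, at its core, a ratio of net campaign cost to DALYs averted; the illustrative calculation below uses made-up inputs, not the study's estimates.

```python
def cost_per_daly_averted(doses, cost_per_dose_delivered,
                          cases_averted, cost_per_case_treated,
                          dalys_averted):
    """Net cost per DALY averted for an OCV campaign.

    Net cost = vaccination cost minus treatment costs saved from averted
    cases. All inputs here are placeholders, not the study's estimates.
    """
    vaccination_cost = doses * cost_per_dose_delivered
    treatment_savings = cases_averted * cost_per_case_treated
    return (vaccination_cost - treatment_savings) / dalys_averted

# Hypothetical campaign: 10 million two-dose courses delivered at $4/dose,
# averting 200,000 cases ($50 treatment each) and 40,000 DALYs.
print(round(cost_per_daly_averted(20_000_000, 4.0, 200_000, 50.0, 40_000)))
```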
Subjects
Cholera/epidemiology, Mass Vaccination/economics, Vaccination/economics, Administration, Oral, Adult, Africa South of the Sahara, Cholera/prevention & control, Cost-Benefit Analysis, Female, Humans, Incidence, Mass Vaccination/methods, Risk Factors
ABSTRACT
BACKGROUND: Cholera remains a persistent health problem in sub-Saharan Africa and worldwide. Cholera can be controlled through appropriate water and sanitation, or by oral cholera vaccination, which provides transient (~3 years) protection, although vaccine supplies remain scarce. We aimed to map cholera burden in sub-Saharan Africa and assess how geographical targeting could lead to more efficient interventions. METHODS: We combined information on cholera incidence in sub-Saharan Africa (excluding Djibouti and Eritrea) from 2010 to 2016 from datasets from WHO, Médecins Sans Frontières, ProMED, ReliefWeb, ministries of health, and the scientific literature. We divided the study region into 20 km × 20 km grid cells and modelled annual cholera incidence in each grid cell assuming a Poisson process adjusted for covariates and spatially correlated random effects. We combined these findings with data on population distribution to estimate the number of people living in areas of high cholera incidence (>1 case per 1000 people per year). We further estimated the reduction in cholera incidence that could be achieved by targeting cholera prevention and control interventions at areas of high cholera incidence. FINDINGS: We included 279 datasets covering 2283 locations in our analyses. In sub-Saharan Africa (excluding Djibouti and Eritrea), a mean of 141 918 cholera cases (95% credible interval [CrI] 141 538-146 505) were reported per year. 4·0% (95% CrI 1·7-16·8) of districts, home to 87·2 million people (95% CrI 60·3 million to 118·9 million), have high cholera incidence. By focusing on the highest incidence districts first, effective targeted interventions could eliminate 50% of the region's cholera by covering 35·3 million people (95% CrI 26·3 million to 62·0 million), which is less than 4% of the total population. INTERPRETATION: Although cholera occurs throughout sub-Saharan Africa, its highest incidence is concentrated in a small proportion of the continent. Prioritising high-risk areas could substantially increase the efficiency of cholera control programmes. FUNDING: The Bill & Melinda Gates Foundation.
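A bare-bones version of the grid-cell incidence model, omitting the spatially correlated random effects and using synthetic covariates, can be written as a Poisson regression with population as an offset:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n_cells = 1000

population = rng.integers(1_000, 200_000, size=n_cells)
improved_water = rng.uniform(0, 1, size=n_cells)     # placeholder covariate
distance_to_lake = rng.uniform(0, 500, size=n_cells)  # placeholder covariate

# Synthetic truth: incidence declines with water access.
true_rate = np.exp(-7.5 - 1.2 * improved_water + 0.001 * distance_to_lake)
cases = rng.poisson(true_rate * population)

X = sm.add_constant(np.column_stack([improved_water, distance_to_lake]))
model = sm.GLM(cases, X, family=sm.families.Poisson(),
               offset=np.log(population)).fit()
print(model.params)  # intercept and covariate effects on the log incidence rate
```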
Subjects
Cholera/epidemiology, Cholera/prevention & control, Vaccination/methods, Africa South of the Sahara/epidemiology, Demography, Humans, Incidence, Markov Chains, Mass Vaccination, Population Density, Sanitation
ABSTRACT
The surveillance of influenza activity is critical to early detection of epidemics and pandemics and the design of disease control strategies. Case reporting through a voluntary network of sentinel physicians is a commonly used method of passive surveillance for monitoring rates of influenza-like illness (ILI) worldwide. Despite its ubiquity, little attention has been given to the processes underlying the observation, collection, and spatial aggregation of sentinel surveillance data, and its subsequent effects on epidemiological understanding. We harnessed the high specificity of diagnosis codes in medical claims from a database that represented 2.5 billion visits from upwards of 120,000 United States healthcare providers each year. Across influenza seasons from 2002 to 2009 and the 2009 pandemic, we simulated limitations of sentinel surveillance systems such as low coverage and coarse spatial resolution, and performed Bayesian inference to probe the robustness of ecological inference and spatial prediction of disease burden. Our models suggest that a number of socio-environmental factors, in addition to local population interactions, state-specific health policies, and sampling effort, may be responsible for the spatial patterns in U.S. sentinel ILI surveillance. In addition, we find that biases related to spatial aggregation were accentuated among areas with more heterogeneous disease risk, and that sentinel systems designed with fixed reporting locations across seasons provided robust inference and prediction. With the growing availability of health-associated big data worldwide, our results suggest mechanisms for optimizing digital data streams to complement traditional surveillance in developed settings and enhance surveillance opportunities in developing countries.
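One way to mimic the low-coverage limitation is to subsample providers from a fuller claims-based picture and compare the resulting ILI rates; the toy simulation below uses synthetic data, not the medical claims database.

```python
import numpy as np

rng = np.random.default_rng(3)

n_providers, n_weeks = 2000, 30
visits = rng.integers(50, 500, size=(n_providers, n_weeks))
true_ili_rate = 0.02 + 0.03 * np.sin(np.linspace(0, np.pi, n_weeks))
ili = rng.binomial(visits, true_ili_rate)  # ILI visits per provider-week

def observed_rate(provider_idx):
    """Weekly ILI rate observed by a given set of reporting providers."""
    return ili[provider_idx].sum(axis=0) / visits[provider_idx].sum(axis=0)

full = observed_rate(np.arange(n_providers))
# Sentinel-like system: only 5% of providers report.
sentinel = observed_rate(rng.choice(n_providers, size=n_providers // 20, replace=False))
print(f"max absolute error from low coverage: {np.abs(full - sentinel).max():.4f}")
```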
Subjects
Influenza, Human/epidemiology, Population Surveillance/methods, Bayes Theorem, Computer Simulation, Databases, Factual, Humans, Influenza A Virus, H1N1 Subtype/pathogenicity, Medical Records, Models, Theoretical, Online Systems, Pandemics, Selection Bias, Sentinel Surveillance, United States
ABSTRACT
The factors that drive spatial heterogeneity and diffusion of pandemic influenza remain debated. We characterized the spatiotemporal mortality patterns of the 1918 influenza pandemic in British India and studied the role of demographic factors, environmental variables, and mobility processes on the observed patterns of spread. Fever-related and all-cause excess mortality data across 206 districts in India from January 1916 to December 1920 were analyzed while controlling for variation in seasonality particular to India. Aspects of the 1918 autumn wave in India matched signature features of influenza pandemics, with high disease burden among young adults, (moderate) spatial heterogeneity in burden, and highly synchronized outbreaks across the country deviating from annual seasonality. Importantly, we found that population density and rainfall explained the spatial variation in excess mortality, and that long-distance travel via railroad was predictive of the observed spatial diffusion of disease. In this study, a spatiotemporal analysis of mortality patterns during the 1918 influenza pandemic in India was integrated with data on underlying factors and processes to reveal transmission mechanisms in a large, intensely connected setting with significant climatic variability. The characterization of such heterogeneity during historical pandemics is crucial to prepare for future pandemics.
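Excess mortality here means observed deaths minus an expected seasonal baseline; the sketch below illustrates that calculation with synthetic monthly counts rather than the British India district data.

```python
import numpy as np

rng = np.random.default_rng(4)

months = np.arange(60)                       # Jan 1916 - Dec 1920, monthly
seasonal = 1000 + 300 * np.sin(2 * np.pi * months / 12)
deaths = rng.poisson(seasonal)
deaths[32:36] += rng.poisson(4000, size=4)   # inject an autumn-1918-like wave

# Baseline: average count for each calendar month over non-pandemic years.
baseline_years = np.r_[0:24, 48:60]          # 1916-17 and 1920 as baseline
month_of_year = months % 12
baseline = np.array([
    deaths[baseline_years][month_of_year[baseline_years] == m].mean()
    for m in range(12)
])
excess = deaths - baseline[month_of_year]
print(f"estimated excess deaths in the pandemic wave: {excess[32:36].sum():.0f}")
```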
Subjects
Influenza Pandemic, 1918-1919/history, Influenza, Human/epidemiology, Influenza, Human/history, Age Distribution, Cause of Death, Fever/mortality, History, 20th Century, Humans, India/epidemiology, Influenza Pandemic, 1918-1919/mortality, Influenza, Human/mortality, Rainforest, Respiratory Tract Diseases/mortality, Seasons, Socioeconomic Factors, Spatio-Temporal Analysis, Travel
ABSTRACT
Background: The seasonality of influenza is thought to vary according to environmental factors and human behavior. During winter holidays, potential disease-causing contact and travel deviate from typical patterns. We aim to understand the effects of these changes on age-specific and spatial influenza transmission. Methods: We characterized the changes to transmission and epidemic trajectories among children and adults in a spatial context before, during, and after the winter holidays using aggregated physician medical claims in the United States from 2001 to 2009 and synthetic data simulated from a deterministic, age-specific spatial metapopulation model. Results: Winter holidays reduced influenza transmission and delayed the trajectory of influenza season epidemics. The holiday period was marked by a shift in the relative risk of disease from children toward adults. Model results indicated that holidays delayed epidemic peaks and synchronized incidence across locations, and that contact reductions from school closures, rather than age-specific mixing and travel, produced these observed holiday influenza dynamics. Conclusions: Winter holidays delay seasonal influenza epidemic peaks and shift disease risk toward adults because of changes in contact patterns. These findings may inform targeted influenza information and vaccination campaigns during holiday periods.
Subjects
Disease Outbreaks, Influenza, Human/diagnosis, Influenza, Human/epidemiology, Influenza, Human/transmission, Adolescent, Adult, Aged, Child, Child, Preschool, Holidays, Humans, Incidence, Middle Aged, Models, Theoretical, Risk Factors, Schools, Seasons, Travel, United States/epidemiology, Young Adult
ABSTRACT
Mathematical models of cholera and waterborne disease vary widely in their structures, in terms of transmission pathways, loss of immunity, and a range of other features. These differences can affect model dynamics, with different models potentially yielding different predictions and parameter estimates from the same data. Given the increasing use of mathematical models to inform public health decision-making, it is important to assess model distinguishability (whether models can be distinguished based on fit to data) and inference robustness (whether inferences from the model are robust to realistic variations in model structure). In this paper, we examined the effects of uncertainty in model structure in the context of epidemic cholera, testing a range of models with differences in transmission and loss of immunity structure, based on known features of cholera epidemiology. We fit these models to simulated epidemic and long-term data, as well as data from the 2006 Angola epidemic. We evaluated model distinguishability based on fit to data, and whether the parameter values, model behavior, and forecasting ability can accurately be inferred from incidence data. In general, all models were able to successfully fit to all data sets, both real and simulated, regardless of whether the model generating the simulated data matched the fitted model. However, in the long-term data, the best model fits were achieved when the loss of immunity structures matched those of the model that simulated the data. Two parameters, one representing person-to-person transmission and the other representing the reporting rate, were accurately estimated across all models, while the remaining parameters showed broad variation across the different models and data sets. The basic reproduction number (R0) was often poorly estimated even using the correct model, due to practical unidentifiability issues in the waterborne transmission pathway which were consistent across all models. Forecasting efforts using noisy data were not successful early in the outbreaks, but once the epidemic peak had been achieved, most models were able to capture the downward incidence trajectory with similar accuracy. Forecasting from noise-free data was generally successful for all outbreak stages using any model. Our results suggest that we are unlikely to be able to infer mechanistic details from epidemic case data alone, underscoring the need for broader data collection, such as immunity/serology status, pathogen dose response curves, and environmental pathogen data. Nonetheless, with sufficient data, conclusions from forecasting and some parameter estimates were robust to variations in the model structure, and comparative modeling can help to determine how realistic variations in model structure may affect the conclusions drawn from models and data.
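For concreteness, one member of the model family considered here is the SIWR model, which adds an environmental water compartment to SIR; the minimal simulation below uses arbitrary parameter values, not estimates from the Angola epidemic.

```python
import numpy as np
from scipy.integrate import solve_ivp

def siwr(t, y, beta_i, beta_w, gamma, xi, delta):
    """SIWR cholera model with direct (person-to-person) and waterborne transmission.

    S, I, R are proportions of the population; W is a scaled environmental
    pathogen concentration. Shedding (xi) and pathogen decay (delta) govern W.
    """
    S, I, W, R = y
    infection = beta_i * S * I + beta_w * S * W
    dS = -infection
    dI = infection - gamma * I
    dW = xi * I - delta * W
    dR = gamma * I
    return [dS, dI, dW, dR]

params = (0.25, 0.5, 0.2, 0.1, 0.1)  # arbitrary illustrative values
sol = solve_ivp(siwr, (0, 300), [0.999, 0.001, 0.0, 0.0], args=params)
peak_day = sol.t[np.argmax(sol.y[1])]
print(f"epidemic peak around day {peak_day:.0f}")
```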
Subjects
Cholera/epidemiology, Models, Theoretical, Uncertainty, Angola, Basic Reproduction Number, Cholera/immunology, Cholera/transmission, Computer Simulation, Epidemics, Humans
ABSTRACT
Nonhuman proteins have valuable therapeutic properties, but their efficacy is limited by neutralizing antibodies. Recombinant immunotoxins (RITs) are potent anticancer agents that have produced many complete remissions in leukemia, but immunogenicity limits the number of doses that can be given to patients with normal immune systems. Using human cells, we identified eight helper T-cell epitopes in PE38, the portion of the bacterial protein Pseudomonas exotoxin A that constitutes the toxin moiety of the RIT, and used this information to make LMB-T18, in which three epitopes were deleted and five others diminished by point mutations in key residues. LMB-T18 has high cytotoxic and antitumor activity and is very resistant to thermal denaturation. The new immunotoxin has a 93% decrease in T-cell epitopes and should have improved efficacy in patients because more treatment cycles can be given. Furthermore, the deimmunized toxin can be used to make RITs targeting other antigens, and the approach we describe can be used to deimmunize other therapeutically useful nonhuman proteins.
Subjects
Epitopes, T-Lymphocyte/immunology, Immunotoxins/immunology, Neoplasms/immunology, Recombinant Fusion Proteins/immunology, ADP Ribose Transferases/genetics, ADP Ribose Transferases/immunology, Amino Acids/genetics, Amino Acids/immunology, Animals, Antibody Formation/immunology, Bacterial Toxins/genetics, Bacterial Toxins/immunology, Cell Line, Tumor, Cell Survival/drug effects, Cell Survival/immunology, Electrophoresis, Polyacrylamide Gel, Epitope Mapping, Exotoxins/genetics, Exotoxins/immunology, Female, Humans, Immunotherapy/methods, Immunotoxins/genetics, Immunotoxins/therapeutic use, Lymphocyte Activation/drug effects, Lymphocyte Activation/immunology, Mice, Mice, SCID, Models, Molecular, Neoplasms/pathology, Neoplasms/therapy, Peptides/genetics, Peptides/immunology, Point Mutation, Protein Structure, Tertiary, Recombinant Fusion Proteins/chemistry, Recombinant Fusion Proteins/therapeutic use, T-Lymphocytes/drug effects, T-Lymphocytes/immunology, T-Lymphocytes/metabolism, Virulence Factors/genetics, Virulence Factors/immunology, Xenograft Model Antitumor Assays, Pseudomonas aeruginosa Exotoxin A
ABSTRACT
Spatial big data have the velocity, volume, and variety of big data sources and contain additional geographic information. Digital data sources, such as medical claims, mobile phone call data records, and geographically tagged tweets, have entered infectious disease epidemiology as novel sources of data to complement traditional infectious disease surveillance. In this work, we provide examples of how spatial big data have been used thus far in epidemiological analyses and describe opportunities for these sources to improve disease-mitigation strategies and public health coordination. In addition, we consider the technical, practical, and ethical challenges of using spatial big data in infectious disease surveillance and inference. Finally, we discuss the implications of the rising use of spatial big data in epidemiology for health risk communication, public health policy recommendations, and coordination across scales.
Subjects
Communicable Diseases/epidemiology, Epidemiological Monitoring, Spatial Analysis, Health Policy, Humans, Public Health Administration/ethics, Topography, Medical
ABSTRACT
The endosomal innate receptor CD158d (killer cell Ig-like receptor 2DL4) induces cellular senescence in human NK cells in response to soluble ligand (HLA-G or agonist Ab). These senescent NK cells display a senescence-associated secretory phenotype, and their secretome promotes vascular remodeling and angiogenesis. To understand how CD158d initiates signaling for a senescence response, we mapped the region in its cytoplasmic tail that controls signaling. We identified a conserved TNFR-associated factor 6 (TRAF6) binding motif, which was required for CD158d-induced NF-κB activation and IL-8 secretion, TRAF6 association with CD158d, and TRAF6 recruitment to CD158d(+) endosomes in transfected cells. The adaptor TRAF6 is known to couple proximal signals from receptors such as endosomal TLRs and CD40 through the kinase TGF-β-activated kinase 1 (TAK1) for NF-κB-dependent proinflammatory responses. Small interfering RNA-mediated silencing of TRAF6 and TAK1, and inhibition of TAK1, blocked CD158d-dependent IL-8 secretion. Stimulation of primary, resting NK cells with soluble Ab to CD158d induced TRAF6 association with CD158d and TAK1 phosphorylation, and inhibition of TAK1 blocked the CD158d-dependent reprogramming of NK cells that produces the senescence-associated secretory phenotype signature. Our results reveal that a prototypic TLR and TNFR signaling pathway is used by a killer cell Ig-like receptor that promotes secretion of proinflammatory and proangiogenic mediators as part of a unique senescence phenotype in NK cells.
Subjects
Cellular Senescence/genetics, Endosomes/metabolism, Killer Cells, Natural/metabolism, MAP Kinase Kinase Kinases/genetics, MAP Kinase Kinase Kinases/metabolism, TNF Receptor-Associated Factor 6/genetics, TNF Receptor-Associated Factor 6/metabolism, Amino Acid Motifs/genetics, Cell Line, Cytoplasm/genetics, Cytoplasm/metabolism, Endosomes/genetics, HEK293 Cells, Humans, Interleukin-8/genetics, Interleukin-8/metabolism, NF-kappa B/genetics, NF-kappa B/metabolism, Phosphorylation/genetics, Protein Binding/genetics, Receptors, KIR2DL4/genetics, Receptors, KIR2DL4/metabolism, Signal Transduction/genetics, Toll-Like Receptors/genetics, Toll-Like Receptors/metabolism
ABSTRACT
BACKGROUND: Measures of population-level influenza severity are important for public health planning, but estimates are often based on case-fatality and case-hospitalization risks, which require multiple data sources, are prone to surveillance biases, and are typically unavailable in the early stages of an outbreak. To address the limitations of traditional indicators, we propose a novel severity index based on influenza age dynamics estimated from routine physician diagnosis data that can be used retrospectively and for early warning. METHODS: We developed a quantitative 'ground truth' severity benchmark that synthesizes multiple traditional severity indicators from publicly available influenza surveillance data in the United States. Observing that the age distribution of cases may signal severity early in an epidemic, we constructed novel retrospective and early warning severity indexes based on the relative risk of influenza-like illness (ILI) among working-age adults to that among school-aged children using weekly outpatient medical claims. We compared our relative risk-based indexes to the composite benchmark and estimated seasonal severity for flu seasons from 2001-02 to 2008-09 at the national and state levels. RESULTS: The severity classifications made by the benchmark were not uniquely captured by any single contributing metric, including pneumonia and influenza mortality; the influenza epidemics of 2003-04 and 2007-08 were correctly identified as the most severe of the study period. The retrospective index was well correlated with the severity benchmark and correctly identified the two most severe seasons. The early warning index performance varied, but it projected 2007-08 as relatively severe 10 weeks prior to the epidemic peak. Influenza severity varied significantly among states within seasons, and four states were identified as possible early warning sentinels for national severity. CONCLUSIONS: Differences in age patterns of ILI may be used to characterize seasonal influenza severity in the United States in real-time and in a spatially resolved way. Future research on antigenic changes among circulating viruses, pre-existing immunity, and changing contact patterns may better elucidate the mechanisms underlying these indexes. Researchers and practitioners should consider the use of composite or ILI-based severity metrics in addition to traditional severity measures to inform epidemiological understanding and situational awareness in future seasonal outbreaks.
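The core of the proposed index is the relative risk of ILI in working-age adults versus school-aged children; the stripped-down illustration below uses fabricated weekly counts and skips the benchmark standardization described above.

```python
import numpy as np

def rr_severity_index(adult_ili, adult_visits, child_ili, child_visits):
    """Season-level relative risk of ILI: working-age adults vs. school-aged children.

    A higher adult-to-child relative risk is taken as a marker of a more
    severe season. Inputs are weekly counts aggregated over the epidemic
    period; this omits the benchmark standardization used in the study.
    """
    adult_rate = adult_ili.sum() / adult_visits.sum()
    child_rate = child_ili.sum() / child_visits.sum()
    return adult_rate / child_rate

# Fabricated counts for two seasons (weekly ILI and total visits by age group).
mild = rr_severity_index(np.array([120, 150, 90]), np.array([9000, 9500, 8800]),
                         np.array([300, 340, 260]), np.array([7000, 7200, 6900]))
severe = rr_severity_index(np.array([260, 310, 240]), np.array([9100, 9400, 8700]),
                           np.array([310, 330, 250]), np.array([7100, 7300, 6800]))
print(f"mild season RR: {mild:.2f}, severe season RR: {severe:.2f}")
```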
Subjects
Influenza, Human/epidemiology, Influenza, Human/etiology, Adolescent, Adult, Age Distribution, Child, Child, Preschool, Disease Outbreaks, Hospitalization/statistics & numerical data, Humans, Influenza, Human/mortality, Insurance Claim Review, Middle Aged, Retrospective Studies, Seasons, Severity of Illness Index, United States/epidemiology, Young Adult
ABSTRACT
The COVID-19 pandemic led to an unprecedented demand for projections of disease burden and healthcare utilization under scenarios ranging from unmitigated spread to strict social distancing policies. In response, members of the Johns Hopkins Infectious Disease Dynamics Group developed flepiMoP (formerly called the COVID Scenario Modeling Pipeline), a comprehensive open-source software pipeline designed for creating and simulating compartmental models of infectious disease transmission and inferring parameters through these models. The framework has been used extensively to produce short-term forecasts and longer-term scenario projections of COVID-19 at the state and county level in the US, for COVID-19 in other countries at various geographic scales, and more recently for seasonal influenza. In this paper, we highlight how flepiMoP has evolved throughout the COVID-19 pandemic to address changing epidemiological dynamics, new interventions, and shifts in policy-relevant model outputs. As the framework has reached a mature state, we provide a detailed overview of flepiMoP's key features and remaining limitations, thereby distributing flepiMoP and its documentation as a flexible and powerful tool for researchers and public health professionals to rapidly build and deploy large-scale complex infectious disease models for any pathogen and demographic setting.
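The basic idea of such a pipeline is to specify a compartmental model as data (compartments and transitions) and simulate it generically; the toy, dictionary-driven version below is not flepiMoP's actual configuration schema.

```python
# Toy SEIR specification: each transition is (source, destination, rate function),
# where the rate is a per-capita hazard given the current state and parameters.
spec = {
    "compartments": ["S", "E", "I", "R"],
    "transitions": [
        ("S", "E", lambda x, p: p["beta"] * x["I"] / sum(x.values())),
        ("E", "I", lambda x, p: p["sigma"]),
        ("I", "R", lambda x, p: p["gamma"]),
    ],
}

def simulate(spec, init, params, days, dt=0.1):
    """Deterministic Euler simulation of a config-specified compartmental model."""
    x = dict(init)
    traj = [dict(x)]
    for _ in range(int(days / dt)):
        flows = {(src, dst): rate(x, params) * x[src] * dt
                 for src, dst, rate in spec["transitions"]}
        for (src, dst), f in flows.items():
            x[src] -= f
            x[dst] += f
        traj.append(dict(x))
    return traj

traj = simulate(spec, {"S": 9990, "E": 0, "I": 10, "R": 0},
                {"beta": 0.4, "sigma": 1 / 3, "gamma": 1 / 5}, days=200)
peak_infectious = max(state["I"] for state in traj)
print(f"peak infectious: {peak_infectious:.0f}")
```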
Subjects
COVID-19, SARS-CoV-2, Software, Humans, COVID-19/epidemiology, COVID-19/transmission, COVID-19/prevention & control, Pandemics/prevention & control, Epidemiologic Models
ABSTRACT
Rodent malaria models serve as important preclinical antimalarial and vaccine testing tools. Evaluating treatment outcomes in these models often requires manually counting parasite-infected red blood cells (iRBCs), a time-consuming process that can be inconsistent between individuals and laboratories. We have developed an easy-to-use machine learning (ML)-based software, Malaria Screener R, to expedite and standardize such studies by automating the counting of Plasmodium iRBCs in rodents. This software can process Giemsa-stained blood smear images captured by any camera-equipped microscope. It features an intuitive graphical user interface that facilitates image processing and visualization of the results. The software has been developed as a desktop application that processes images on standard Windows and MacOS computers. A previous ML model created by the authors to count Plasmodium falciparum-infected human RBCs did not perform well when counting Plasmodium-infected mouse RBCs. We leveraged that model by loading the pretrained weights and training the algorithm with newly collected data to target Plasmodium yoelii- and Plasmodium berghei-infected mouse RBCs. This new model reliably measured both P. yoelii and P. berghei parasitemia (R² = 0.9916). Additional rounds of training with data that incorporate variation due to the length of Giemsa staining, the type of microscope, and other factors have produced a generalizable model, meeting WHO competency level 1 for the subcategory of parasite counting using independent microscopes. Reliable, automated analyses of blood-stage parasitemia will facilitate rapid and consistent evaluation of novel vaccines and antimalarials across laboratories in an easily accessible in vivo malaria model.
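Downstream of the cell-level classifier, parasitemia is the fraction of RBCs called infected, and agreement with manual counts can be summarized with R²; the sketch below uses hypothetical counts, not the published validation data.

```python
import numpy as np

def parasitemia(infected_rbcs, total_rbcs):
    """Percent parasitemia from per-smear cell counts."""
    return 100.0 * infected_rbcs / total_rbcs

def r_squared(manual, automated):
    """Coefficient of determination of automated readings against manual counts."""
    manual, automated = np.asarray(manual), np.asarray(automated)
    ss_res = np.sum((automated - manual) ** 2)
    ss_tot = np.sum((manual - manual.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

# Hypothetical paired readings (percent parasitemia) on the same Giemsa smears.
manual_pct = [parasitemia(i, n) for i, n in [(52, 1000), (131, 990), (305, 1010), (18, 950)]]
auto_pct = [5.0, 13.5, 30.7, 1.7]

print(f"R^2 = {r_squared(manual_pct, auto_pct):.4f}")
```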