ABSTRACT
OBJECTIVE: Scalable strategies to reduce the time burden and increase contact tracing efficiency are crucial during early waves and peaks of infectious transmission. DESIGN: We enrolled a cohort of SARS-CoV-2-positive seed cases into a peer recruitment study testing social network methodology and a novel electronic platform to increase contact tracing efficiency. SETTING: Index cases were recruited from an academic medical center and asked to recruit their local social contacts for enrollment and SARS-CoV-2 testing. PARTICIPANTS: A total of 509 adult participants enrolled over 19 months (384 seed cases and 125 social peers). INTERVENTION: Participants completed a survey and were then eligible to recruit their social contacts with unique "coupons" for enrollment. Peer participants were eligible for SARS-CoV-2 and respiratory pathogen screening. MAIN OUTCOME MEASURES: The main outcome measures were the percentage of tests administered through the study that identified new SARS-CoV-2 cases; the feasibility and perceived acceptability of the platform and the peer recruitment strategy; and the scalability of both during pandemic peaks. RESULTS: After development and deployment, few human resources were needed to maintain the platform and enroll participants, regardless of pandemic peaks. Platform acceptability was high. Percent positivity tracked with that of other testing programs in the area. CONCLUSIONS: An electronic platform may be a suitable tool to augment public health contact tracing by allowing participants to complete contact tracing through an online platform rather than sitting for an interview.
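As an illustration of how coupon-based peer recruitment might be tracked on such a platform, the sketch below issues unique coupon codes to each seed case and links each peer enrollment back to the recruiting seed. The data structures, code format, and helper functions are hypothetical assumptions for illustration, not the study platform's implementation.

```python
import secrets
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Participant:
    """Hypothetical record for a seed or peer participant."""
    participant_id: str
    role: str                                   # "seed" or "peer"
    recruited_by: Optional[str] = None          # seed ID recovered from the redeemed coupon
    coupons: list = field(default_factory=list)

registry = {}        # participant_id -> Participant
coupon_owner = {}    # coupon code -> seed participant_id

def enroll_seed(participant_id: str, n_coupons: int = 3) -> Participant:
    """Enroll a seed case and issue unique, single-use coupon codes."""
    seed = Participant(participant_id, role="seed")
    for _ in range(n_coupons):
        code = secrets.token_hex(4)             # e.g. "9f3a1c2b"
        seed.coupons.append(code)
        coupon_owner[code] = participant_id
    registry[participant_id] = seed
    return seed

def enroll_peer(participant_id: str, coupon: str) -> Participant:
    """Enroll a peer, linking them to the seed whose coupon they redeemed."""
    seed_id = coupon_owner.pop(coupon)          # single use: KeyError if invalid or already used
    peer = Participant(participant_id, role="peer", recruited_by=seed_id)
    registry[participant_id] = peer
    return peer

seed = enroll_seed("S001")
peer = enroll_peer("P001", seed.coupons[0])
print(peer.recruited_by)                        # "S001"
```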
Subjects
COVID-19, Adult, Humans, COVID-19/epidemiology, COVID-19/prevention & control, Public Health, COVID-19 Testing, SARS-CoV-2, Contact Tracing/methods
ABSTRACT
OBJECTIVE: To design and establish a prospective biospecimen repository that integrates multi-omics assays with clinical data to study mechanisms of controlled injury and healing. BACKGROUND: Elective surgery is an opportunity to understand both the systemic and focal responses accompanying controlled and well-characterized injury to the human body. The overarching goal of this ongoing project is to define stereotypical responses to surgical injury, with the translational purpose of identifying targetable pathways involved in healing and resilience, and variations indicative of aberrant peri-operative outcomes. METHODS: Clinical data from the electronic medical record, combined with large-scale biological data sets derived from blood, urine, fecal matter, and tissue samples, are collected prospectively through the peri-operative period on patients undergoing 14 surgical procedures chosen to represent a range of injury locations and intensities. Specimens are subjected to genomic, transcriptomic, proteomic, and metabolomic assays to describe their genetic, metabolic, immunologic, and microbiome profiles, providing a multidimensional landscape of the human response to injury. RESULTS: The highly multiplexed data generated include changes in over 28,000 mRNA transcripts, 100 plasma metabolites, 200 urine metabolites, and 400 proteins over the longitudinal course of surgery and recovery. In our initial pilot dataset, we demonstrate the feasibility of collecting high-quality multi-omic data at pre- and postoperative time points and are already seeing evidence of physiologic perturbation between timepoints. CONCLUSIONS: This repository allows for longitudinal, state-of-the-art genomic, transcriptomic, proteomic, metabolomic, immunologic, and clinical data collection and provides a rich and stable infrastructure to fuel further biomedical discovery.
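To make the longitudinal, multi-assay design concrete, the sketch below stores measurements in a long-format table keyed by patient, peri-operative timepoint, assay, and analyte, then computes a simple pre- versus post-operative change per analyte. The column names and toy values are illustrative assumptions, not the repository's actual schema.

```python
import pandas as pd

# Hypothetical long-format records: one row per patient x timepoint x analyte.
records = [
    {"patient": "P01", "timepoint": "pre_op",  "assay": "proteomics",   "analyte": "CRP",     "value": 1.2},
    {"patient": "P01", "timepoint": "post_op", "assay": "proteomics",   "analyte": "CRP",     "value": 9.8},
    {"patient": "P01", "timepoint": "pre_op",  "assay": "metabolomics", "analyte": "lactate", "value": 0.9},
    {"patient": "P01", "timepoint": "post_op", "assay": "metabolomics", "analyte": "lactate", "value": 1.6},
]
df = pd.DataFrame(records)

# Pivot to paired pre/post columns and compute the perioperative change per analyte.
wide = df.pivot_table(index=["patient", "assay", "analyte"],
                      columns="timepoint", values="value").reset_index()
wide["delta"] = wide["post_op"] - wide["pre_op"]
print(wide[["patient", "assay", "analyte", "pre_op", "post_op", "delta"]])
```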
Subjects
Computational Biology, Proteomics, Genomics, Humans, Metabolomics, Prospective Studies, Proteomics/methods
ABSTRACT
BACKGROUND: Reduced surgical site infection (SSI) rates have been reported with use of closed incision negative pressure therapy (ciNPT) in high-risk patients. METHODS: A deep learning-based, risk-based prediction model was developed from a large national database of 72,435 patients who received infrainguinal vascular surgeries involving upper thigh/groin incisions. Patient demographics, histories, laboratory values, and other variables were inputs to the multilayered, adaptive model. The model was then retrospectively applied to a prospectively tracked single hospital data set of 370 similar patients undergoing vascular surgery, with ciNPT or control dressings applied over the closed incision at the surgeon's discretion. Objective predictive risk scores were generated for each patient and used to categorize patients as "high" or "low" predicted risk for SSI. RESULTS: Actual institutional cohort SSI rates were 10/148 (6.8%) and 28/134 (20.9%) for high-risk ciNPT versus control, respectively (P < 0.001), and 3/31 (9.7%) and 5/57 (8.8%) for low-risk ciNPT versus control, respectively (P = 0.99). Application of the model to the institutional cohort suggested that 205/370 (55.4%) patients were matched with their appropriate intervention over closed surgical incision (high risk with ciNPT or low risk with control), and 165/370 (44.6%) were inappropriately matched. With the model applied to the cohort, the predicted SSI rate with perfect utilization would be 27/370 (7.3%), versus 12.4% actual rate, with estimated cost savings of $231-$458 per patient. CONCLUSIONS: Compared with a subjective practice strategy, an objective risk-based strategy using prediction software may be associated with superior results in optimizing SSI rates and costs after vascular surgery.
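The "perfect utilization" figure can be reconstructed from the reported subgroup rates by assuming every high-risk patient receives ciNPT and every low-risk patient receives a control dressing. The sketch below is only a back-of-the-envelope check of that arithmetic, not the authors' prediction software.

```python
# Reported subgroup outcomes (SSI events, n) from the institutional cohort.
high_cinpt   = (10, 148)   # high-risk patients who received ciNPT
high_control = (28, 134)   # high-risk patients who received control dressings
low_cinpt    = (3, 31)     # low-risk patients who received ciNPT
low_control  = (5, 57)     # low-risk patients who received control dressings

groups = (high_cinpt, high_control, low_cinpt, low_control)
n_total    = sum(n for _, n in groups)            # 370
actual_ssi = sum(e for e, _ in groups)            # 46

# "Perfect utilization": every high-risk patient gets ciNPT (the 6.8% observed rate),
# every low-risk patient gets a control dressing (the 8.8% observed rate).
n_high = high_cinpt[1] + high_control[1]          # 282
n_low  = low_cinpt[1] + low_control[1]            # 88
predicted_ssi = round(high_cinpt[0] / high_cinpt[1] * n_high
                      + low_control[0] / low_control[1] * n_low)    # ~27

print(f"actual SSI rate:    {actual_ssi}/{n_total} = {actual_ssi / n_total:.1%}")       # 46/370 = 12.4%
print(f"predicted SSI rate: {predicted_ssi}/{n_total} = {predicted_ssi / n_total:.1%}")  # 27/370 = 7.3%
```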
Subjects
Decision Support Techniques, Deep Learning, Negative-Pressure Wound Therapy/statistics & numerical data, Vascular Surgical Procedures/rehabilitation, Aged, Female, Groin, Humans, Male, Middle Aged, Negative-Pressure Wound Therapy/economics, Retrospective Studies, Risk Assessment/methods
ABSTRACT
Healthcare disparities are a persistent societal problem. One of the contributing factors to this status quo is the lack of diversity and representativeness of research efforts, which results in nongeneralizable evidence and, in turn, suboptimal means of enabling the best possible outcomes at the individual level. There are several strategies that research teams can adopt to improve the diversity, equity, and inclusion (DEI) of their efforts; these strategies span the totality of the research path, from initial design to the shepherding of clinical data through a potential regulatory process. They include more intentionality and DEI-based goal-setting, more diverse research and leadership teams, better community engagement to set study goals and approaches, better tailored outreach interventions, decentralization of study procedures and incorporation of innovative technology for more flexible data collection, and self-surveillance to identify and prevent biases. Within their remit of overseeing research efforts, regulatory authorities, as stakeholders, also have the potential to positively influence the DEI of emerging clinical evidence. All of these are implementable tools and mechanisms that can make study participation more accessible to diverse communities and ultimately generate evidence that is more generalizable and a conduit for better outcomes. The research community has an imperative to make DEI principles key foundational aspects of study conduct in order to pursue better personalized medicine for diverse patient populations.
Subjects
Diversity, Equity, Inclusion, Precision Medicine, Humans, Data Collection, Leadership
ABSTRACT
The promise of artificial intelligence (AI) and machine learning in healthcare can be realized only when they are smoothly integrated into existing clinical workflows. Doing so requires optimizing the user experience of AI and the data on which these systems are built, enabling clinicians to deliver focused patient care.
Subjects
Artificial Intelligence, Machine Learning, Delivery of Health Care, Health Facilities, Humans, Workflow
ABSTRACT
This commentary discusses recent trends in telehealth usage and its changing popularity, as well as the latest efforts to redefine telehealth value and usability. Six strategies to improve the patient experience and increase telehealth acceptance by overcoming concurrent barriers are presented: (1) creating a new healthcare paradigm using telehealth, (2) scheduling the telehealth visit, (3) preparing for the telehealth visit, (4) conducting the telehealth visit, (5) using data and biomarkers, and (6) providing digital equity. With the application of these strategies, we believe that the recent decline in the popularity of telehealth can be reversed.
ABSTRACT
As COVID-19 hounds the world, the common cause of finding a swift solution to manage the pandemic has brought together researchers, institutions, governments, and society at large. The Internet of Things (IoT), artificial intelligence (AI), including machine learning (ML) and Big Data analytics, as well as robotics and blockchain, are the four decisive areas of technological innovation that have been ingeniously harnessed to fight this pandemic and future ones. While these highly interrelated smart and connected health technologies cannot resolve the pandemic overnight and may not be the only answer to the crisis, they can provide greater insight into the disease and support frontline efforts to prevent and control the pandemic. This article provides a blend of discussions on the contribution of these digital technologies, proposes several complementary and multidisciplinary techniques to combat COVID-19, offers opportunities for more holistic studies, and aims to accelerate knowledge acquisition and scientific discoveries in pandemic research. First, four areas where IoT can contribute are discussed, namely: 1) tracking and tracing; 2) remote patient monitoring (RPM) by wearable IoT (WIoT); 3) personal digital twins (PDTs); and 4) a real-life use case: an ICT/IoT solution in South Korea. Second, the role and novel applications of AI are explained, namely: 1) diagnosis and prognosis; 2) risk prediction; 3) vaccine and drug development; 4) research data sets; 5) early warnings and alerts; 6) social control and fake news detection; and 7) communication and chatbots. Third, the main uses of robotics and drone technology are analyzed, including: 1) crowd surveillance; 2) public announcements; 3) screening and diagnosis; and 4) delivery of essential supplies. Finally, we discuss how distributed ledger technologies (DLTs), of which blockchain is a common example, can be combined with other technologies to tackle COVID-19.
ABSTRACT
[This corrects the article DOI: 10.1038/s41746-020-0235-5.].
ABSTRACT
Storing very large amounts of data and delivering them to researchers in an efficient, verifiable, and compliant manner is one of the major challenges faced by health care providers and researchers in the life sciences. The electronic health record (EHR) at a hospital or clinic currently functions as a silo: although EHRs contain rich and abundant information that could be used to understand, improve, and learn from care as part of a learning health system, access to these data is difficult, and the technical, legal, ethical, and social barriers are significant. If we create a microservice ecosystem where data can be accessed through APIs, these challenges become easier to overcome: a service-driven design decouples data from clients. This decoupling provides flexibility: different users can write in their preferred language and use different clients depending on their needs. APIs can be written for iOS apps, web apps, or an R library, and this flexibility highlights the potential ecosystem-building power of APIs. In this article, we use two case studies to illustrate what it means to participate in and contribute to the interconnected ecosystems that APIs power in healthcare systems.
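As a sketch of the decoupling described, the snippet below shows a client retrieving a resource from a hypothetical REST endpoint; an iOS app, a web app, or an R library could consume the same API without any knowledge of the underlying EHR storage. The URL, token handling, and response shape are illustrative assumptions, not a specific hospital's API.

```python
import requests

BASE_URL = "https://api.example-health-system.org/v1"   # hypothetical endpoint
TOKEN = "REPLACE_WITH_OAUTH_TOKEN"                       # auth is handled by the service layer

def get_lab_results(patient_id: str) -> list:
    """Fetch lab results for one patient through the service API,
    with no knowledge of how or where the data are stored."""
    resp = requests.get(
        f"{BASE_URL}/patients/{patient_id}/labs",
        headers={"Authorization": f"Bearer {TOKEN}"},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    for lab in get_lab_results("12345"):
        print(lab.get("test"), lab.get("value"), lab.get("unit"))
```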
ABSTRACT
The Project Baseline Health Study (PBHS) was launched to map human health through a comprehensive understanding of both the health of an individual and how it relates to the broader population. The study will contribute to the creation of a biomedical information system that accounts for the highly complex interplay of biological, behavioral, environmental, and social systems. The PBHS is a prospective, multicenter, longitudinal cohort study that aims to enroll thousands of participants with diverse backgrounds who are representative of the entire health spectrum. Enrolled participants will be evaluated serially using clinical, molecular, imaging, sensor, self-reported, behavioral, psychological, environmental, and other health-related measurements. An initial deeply phenotyped cohort will inform the development of a large, expanded virtual cohort. The PBHS will contribute to precision health and medicine by integrating state-of-the-art testing, longitudinal monitoring, and participant engagement, and by supporting the development of an improved platform for data sharing and analysis.
ABSTRACT
The term big data has been popularized over the past decade and is often used to refer to data sets that are too large or complex to be analyzed by traditional means. Although the term has been utilized for some time in business and engineering, the concept of big data is relatively new to medicine. The reception from the medical community has been mixed; however, the widespread utilization of electronic health records in the United States, the creation of large clinical data sets and national registries that capture information on numerous vectors affecting healthcare delivery and patient outcomes, and the sequencing of the human genome are all opportunities to leverage big data. This review was inspired by a lively panel discussion on big data that took place at the 75th Central Surgical Association Annual Meeting. The authors' aim was to describe big data, the methodologies used to analyze big data, and their practical clinical application.
Subjects
Big Data, Datasets as Topic, Humans, Machine Learning, Neural Networks, Computer, Support Vector Machine
ABSTRACT
Background: Acute respiratory infections (ARIs) are the leading indication for antibacterial prescriptions despite a viral etiology in the majority of cases. The lack of available diagnostics to discriminate viral and bacterial etiologies contributes to this discordance. Recent efforts have focused on the host response as a source for novel diagnostic targets, although none have explored the ability of host-derived microRNAs (miRNAs) to discriminate between these etiologies. Methods: In this study, we compared host-derived miRNAs and mRNAs from human H3N2 influenza challenge subjects to those from patients with Streptococcus pneumoniae pneumonia. Sparse logistic regression models were used to generate miRNA signatures diagnostic of ARI etiologies. Generalized linear modeling of mRNAs to identify differentially expressed (DE) genes allowed analysis of potential miRNA:mRNA relationships. High-likelihood miRNA:mRNA interactions were examined using binding target prediction and negative correlation to further explore potential changes in pathway regulation in response to infection. Results: The resultant miRNA signatures were highly accurate in discriminating ARI etiologies. Mean accuracy was 100% (95% CI: 88.8-100) in discriminating the healthy state from S. pneumoniae pneumonia and 91.3% (95% CI: 72.0-98.9) in discriminating S. pneumoniae pneumonia from influenza infection. Subsequent differential mRNA gene expression analysis revealed alterations in regulatory networks consistent with known biology, including immune cell activation and the host response to viral infection. Negative correlation network analysis of miRNA:mRNA interactions revealed connections to pathways with known immunobiology, such as interferon regulation and MAP kinase signaling. Conclusion: We have developed novel human host-response miRNA signatures for bacterial and viral ARI etiologies. miRNA host-response signatures reveal accurate discrimination between S. pneumoniae pneumonia and influenza etiologies of ARI, and integrated analyses of the host-pathogen interface are consistent with expected biology. These results highlight the differential miRNA host response to bacterial and viral etiologies of ARI, offering new opportunities to distinguish these entities.
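A minimal sketch of the classification step described (sparse logistic regression yielding a compact miRNA signature) is shown below using an L1-penalized model. The expression matrix is simulated stand-in data, not the challenge-study or pneumonia cohort measurements.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Simulated stand-in data: 60 subjects x 200 miRNAs, with a handful of
# informative features separating two etiologies (e.g., viral vs. bacterial).
X = rng.normal(size=(60, 200))
y = rng.integers(0, 2, size=60)
X[y == 1, :5] += 1.5          # shift 5 "signature" miRNAs in one class

# The L1 penalty drives most coefficients to zero, yielding a sparse signature.
model = LogisticRegression(penalty="l1", solver="liblinear", C=0.3)
print("CV accuracy:", cross_val_score(model, X, y, cv=5).mean())

model.fit(X, y)
signature = np.flatnonzero(model.coef_[0])
print("miRNAs retained in the signature:", signature)
```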
ABSTRACT
We previously identified 34 genes of interest (GOI) in 2006 to aid oncologists in determining whether post-mastectomy radiotherapy (PMRT) is indicated for certain patients with breast cancer. For the present analysis, an independent cohort of 135 patients with DNA microarray data available from primary tumor tissue samples was chosen. Inclusion criteria were 1) mastectomy as the first treatment, 2) pathology stages I-III, 3) any locoregional recurrence (LRR) status, and 4) no PMRT. After inter-platform data integration of Affymetrix U95 and U133 Plus 2.0 arrays and quantile normalization, we used 18 of the 34 GOI to divide the mastectomy patients into high- and low-risk groups. The 5-year rate of freedom from LRR was 30% in the high-risk group, compared with 99% in the low-risk group (p < 0.0001). Multivariate analysis revealed that the 18-gene classifier independently predicts rates of LRR regardless of nodal status or cancer subtype.
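Quantile normalization, used here to put the two Affymetrix platforms on a common scale before applying the 18-gene classifier, can be sketched in a few lines: rank the values within each array, then replace each value with the mean of the values at that rank across arrays. The toy matrix below is illustrative only; the genes are not those of the published classifier.

```python
import numpy as np

def quantile_normalize(X: np.ndarray) -> np.ndarray:
    """Quantile-normalize a genes x samples matrix so every sample
    (column) shares the same empirical distribution."""
    ranks = X.argsort(axis=0).argsort(axis=0)     # rank of each value within its column
    sorted_cols = np.sort(X, axis=0)
    reference = sorted_cols.mean(axis=1)          # mean expression at each rank
    return reference[ranks]

# Toy matrix: 5 genes measured on 3 arrays with different overall scales.
X = np.array([[ 2.0,  4.0, 20.0],
              [ 5.0,  8.0, 50.0],
              [ 3.0,  6.0, 30.0],
              [ 9.0, 16.0, 90.0],
              [ 7.0, 12.0, 70.0]])
Xn = quantile_normalize(X)
print(Xn)          # columns now share an identical distribution
```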
Subjects
Breast Neoplasms/genetics, Neoplasm Proteins/genetics, Neoplasm Recurrence, Local/genetics, Prognosis, Adult, Aged, Aged, 80 and over, Breast Neoplasms/pathology, Breast Neoplasms/surgery, Female, Gene Expression Regulation, Neoplastic, Humans, Lymphatic Metastasis, Mastectomy, Middle Aged, Neoplasm Proteins/biosynthesis, Neoplasm Recurrence, Local/pathology, Neoplasm Staging, Oligonucleotide Array Sequence Analysis, Transcriptome
ABSTRACT
BACKGROUND: Despite an aggressive therapeutic approach, the prognosis for most patients with glioblastoma (GBM) remains poor. The aim of this study was to determine the significance of preoperative MRI variables, both quantitative and qualitative, with regard to overall and progression-free survival in GBM. METHODS: We retrospectively identified 94 untreated GBM patients from The Cancer Imaging Archive who had pretreatment MRI and corresponding patient outcomes and clinical information in The Cancer Genome Atlas. Qualitative imaging assessments were based on the Visually Accessible Rembrandt Images feature-set criteria. Volumetric parameters were obtained for the specific tumor components: contrast enhancement, necrosis, and edema/invasion. Cox regression was used to assess the prognostic and survival significance of each imaging variable. RESULTS: Univariable Cox regression analysis demonstrated 10 imaging features and 2 clinical variables to be significantly associated with overall survival. Multivariable Cox regression analysis showed that tumor-enhancing volume (P = .03) and eloquent brain involvement (P < .001) were independent prognostic indicators of overall survival. In the multivariable Cox analysis of the volumetric features, an edema/invasion volume of more than 85,000 mm³ and the proportion of enhancing tumor were significantly correlated with higher mortality (P = .004 and P = .003, respectively). CONCLUSIONS: Preoperative MRI parameters have a significant prognostic role in predicting survival in patients with GBM, making them useful for patient stratification and as endpoint biomarkers in clinical trials.
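A minimal sketch of the survival modeling described (Cox proportional hazards on imaging and clinical covariates) is shown below with the lifelines package; the covariates and toy values are placeholders, not the TCGA/TCIA cohort data.

```python
import pandas as pd
from lifelines import CoxPHFitter

# Toy cohort: survival time (months), death indicator, and two covariates
# standing in for imaging/clinical variables (enhancing volume, eloquent brain).
df = pd.DataFrame({
    "os_months":        [6, 14, 9, 22, 11, 30, 7, 18],
    "death":            [1,  1, 1,  0,  1,  0, 1,  1],
    "enhancing_volume": [45, 12, 38, 8, 29, 5, 51, 15],   # cm^3, hypothetical
    "eloquent_brain":   [1,  0,  1, 0,  1,  0, 1,  0],    # involvement yes/no
})

# A light ridge penalty keeps the fit stable on this tiny toy dataset.
cph = CoxPHFitter(penalizer=0.1)
cph.fit(df, duration_col="os_months", event_col="death")
cph.print_summary()      # hazard ratios and p-values per covariate
```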
Subjects
Brain Neoplasms/pathology, Glioblastoma/pathology, Magnetic Resonance Imaging, Neuroimaging/methods, Adolescent, Adult, Aged, Aged, 80 and over, Area Under Curve, Brain Neoplasms/mortality, Cohort Studies, Databases, Factual, Disease-Free Survival, Female, Glioblastoma/mortality, Humans, Image Interpretation, Computer-Assisted, Kaplan-Meier Estimate, Male, Middle Aged, Prognosis, Proportional Hazards Models, ROC Curve, Retrospective Studies, Sensitivity and Specificity, Young Adult
ABSTRACT
Progress in biomedical research requires effective scientific communication to one's peers and to the public. Current research routinely encompasses large datasets and complex analytic processes, and the constraints of traditional journal formats limit useful transmission of these elements. We are constructing a framework through which authors can provide not only the narrative of what was done, but also the primary and derivative data, the source code, the compute environment, and web-accessible virtual machines. This infrastructure allows authors to "hand their machine," prepopulated with libraries, data, and code, to those interested in reviewing or building on their work. This project, "clearScience," seeks to provide an integrated system that accommodates the ad hoc nature of discovery in the data-intensive sciences and seamless transitions from working to reporting. We demonstrate that rather than merely describing the science being reported, one can deliver the science itself.
ABSTRACT
Genomic data, particularly genome-scale measures of gene expression derived from DNA microarray studies, have the potential to add enormous information to the analysis of biological phenotypes. Perhaps the most successful application of these data has been in the characterization of human cancers, including the ability to predict clinical outcomes. Nevertheless, most analyses have used gene expression profiles to define broad group distinctions, similar to the use of traditional clinical risk factors. As a result, considerable heterogeneity remains within the broadly defined groups, and predictions therefore fall short of being accurate for individual patients. One strategy to resolve this heterogeneity is to make use of multiple gene expression patterns that are more powerful in defining individual characteristics and predicting outcomes than any single gene expression pattern. Statistical tree-based classification systems provide a framework for assessing multiple patterns, which we term metagenes, and for selecting those that are most capable of resolving the biological heterogeneity. Moreover, this framework provides a mechanism to combine multiple forms of data, both genomic and clinical, to most effectively characterize individual patients and achieve the goal of personalized predictions of clinical outcomes.
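A simplified reconstruction of the metagene idea, offered as a sketch under stated assumptions rather than as the authors' exact pipeline, is shown below: cluster genes with correlated expression, summarize each cluster by its first principal component (one metagene per cluster), and feed the metagenes to a tree-based classifier that could equally accept clinical covariates.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 1000))           # 100 patients x 1000 genes (simulated)
y = rng.integers(0, 2, size=100)           # binary clinical outcome (simulated)
X[y == 1, :50] += 1.0                      # one gene cluster carries the signal

# 1) Group genes into clusters of correlated expression.
gene_clusters = KMeans(n_clusters=20, n_init=10, random_state=0).fit_predict(X.T)

# 2) Summarize each cluster by its first principal component: one "metagene" per cluster.
metagenes = np.column_stack([
    PCA(n_components=1).fit_transform(X[:, gene_clusters == k]).ravel()
    for k in range(20)
])

# 3) Tree-based classification on the metagenes.
tree = DecisionTreeClassifier(max_depth=3, random_state=0)
print("CV accuracy:", cross_val_score(tree, metagenes, y, cv=5).mean())
```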