Results 1 - 13 of 13
2.
J Am Med Inform Assoc; 28(1): 62-70, 2021 Jan 15.
Article in English | MEDLINE | ID: mdl-33164100

ABSTRACT

OBJECTIVE: Clinical trials ensure that pharmaceutical treatments are safe, efficacious, and effective for public consumption, but are extremely complex, taking up to 10 years and $2.6 billion to complete. One main source of complexity arises from the collaboration between actors, and network science methodologies can be leveraged to explore that complexity. We aim to characterize collaborations between actors in the clinical trials context and investigate trends of successful actors. MATERIALS AND METHODS: We constructed a temporal network of clinical trial collaborations between large and small-size pharmaceutical companies, academic institutions, nonprofit organizations, hospital systems, and government agencies from public and proprietary data and introduced metrics to quantify actors' collaboration network structure, organizational behavior, and partnership characteristics. A multivariable regression analysis was conducted to determine the metrics' relationship with success. RESULTS: We found a positive correlation between the number of successful approved trials and interdisciplinary collaborations measured by a collaboration diversity metric (P < .01). Our results also showed a negative effect of the local clustering coefficient (P < .01) on the success of clinical trials. Large pharmaceutical companies have the lowest local clustering coefficient and more diversity in partnerships across biomedical specializations. CONCLUSIONS: Large pharmaceutical companies are more likely to collaborate with a wider range of actors from other specialties, especially smaller industry actors who are newcomers in clinical research, resulting in exclusive access to smaller actors. Future investigations are needed to show how concentrations of influence and resources might result in diminished gains in treatment development.
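The two network metrics highlighted above can be sketched in plain Python. The abstract does not define its collaboration diversity metric, so Shannon entropy over partner sectors is used here as an illustrative stand-in; the actors, sectors, and edges are all hypothetical:

```python
import math

# Toy collaboration network (hypothetical actors and sectors, for illustration).
sector = {"PharmaA": "industry", "UnivB": "academic",
          "HospC": "hospital", "NonprofitD": "nonprofit"}
adj = {"PharmaA": {"UnivB", "HospC", "NonprofitD"},
       "UnivB": {"PharmaA", "HospC"},
       "HospC": {"PharmaA", "UnivB"},
       "NonprofitD": {"PharmaA"}}

def local_clustering(adj, v):
    """Fraction of a node's neighbor pairs that are themselves connected."""
    nbrs = list(adj[v])
    k = len(nbrs)
    if k < 2:
        return 0.0
    links = sum(1 for i in range(k) for j in range(i + 1, k)
                if nbrs[j] in adj[nbrs[i]])
    return 2.0 * links / (k * (k - 1))

def collaboration_diversity(adj, v):
    """Shannon entropy of the sector mix among a node's partners
    (one plausible diversity definition; not necessarily the paper's)."""
    kinds = [sector[u] for u in adj[v]]
    probs = [kinds.count(s) / len(kinds) for s in set(kinds)]
    return -sum(p * math.log(p) for p in probs)
```

In this toy graph, "PharmaA" has low clustering (its partners rarely collaborate with each other) but high diversity (partners span three sectors), the combination the abstract associates with success.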


Subjects
Clinical Trials as Topic/organization & administration , Drug Approval/organization & administration , Drug Industry/organization & administration , Pharmaceutical Preparations , Cooperative Behavior , Humans , Multivariate Analysis , Regression Analysis
3.
BMC Infect Dis; 20(1): 649, 2020 Sep 03.
Article in English | MEDLINE | ID: mdl-32883213

ABSTRACT

BACKGROUND: More than 80,000 dengue cases, including 215 deaths, were reported nationally in less than 7 months between 2016 and 2017, a fourfold increase in the number of reported cases compared with the average over 2010-2016. The region of Negombo, located in the Western province, experienced the greatest number of dengue cases in the country and is the focus area of our study, where we aim to capture the spatial-temporal dynamics of dengue transmission. METHODS: We present a statistical modeling framework to evaluate the spatial-temporal dynamics of the 2016-2017 dengue outbreak in the Negombo region of Sri Lanka as a function of human mobility, land-use, and climate patterns. The analysis was conducted at a 1 km × 1 km spatial resolution and a weekly temporal resolution. RESULTS: Our results indicate that human mobility is a stronger indicator of local outbreak clusters than land-use or climate variables. The minimum daily temperature was identified as the most influential climate variable on dengue cases in the region, while among the land-use patterns considered, urban areas were found to be most prone to dengue outbreaks, followed by areas with stagnant water and then coastal areas. The results are shown to be robust across spatial resolutions. CONCLUSIONS: Our study highlights the potential value of using travel data to target vector control within a region. In addition to illustrating the relative relationship between various potential risk factors for dengue outbreaks, the results of our study can be used to inform where and when new cases of dengue are likely to occur within a region, and thus help plan disease surveillance and vector control more effectively and innovatively.
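The core of such a framework is a count regression over grid-cell weeks. A minimal sketch, assuming a log-link Poisson model fit by iteratively reweighted least squares; the design matrix, the mobility and temperature covariates, and the coefficient values are all synthetic assumptions (the paper's actual model specification is not given in the abstract):

```python
import numpy as np

# Synthetic grid-cell-week design: intercept, a mobility index, and a
# minimum-temperature index (hypothetical; the paper's covariates are richer).
X = np.array([[1.0, 0.0, 0.0],
              [1.0, 1.0, 0.0],
              [1.0, 0.0, 1.0],
              [1.0, 1.0, 1.0],
              [1.0, 2.0, 1.0]])
true_beta = np.array([0.5, 0.8, 0.3])
y = np.exp(X @ true_beta)   # noise-free expected weekly case counts

def poisson_irls(X, y, iters=50):
    """Fit a log-link Poisson regression by iteratively reweighted least squares."""
    # Warm start from a damped log-linear least-squares fit, then Fisher scoring.
    beta = 0.5 * np.linalg.lstsq(X, np.log(y), rcond=None)[0]
    for _ in range(iters):
        mu = np.exp(X @ beta)
        # Scoring step: (X^T W X)^{-1} X^T (y - mu), with W = diag(mu)
        beta = beta + np.linalg.solve(X.T @ (mu[:, None] * X), X.T @ (y - mu))
    return beta

beta_hat = poisson_irls(X, y)
```

With noise-free counts the fitted coefficients recover the generating values exactly, which makes the sketch easy to check end to end.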


Subjects
Dengue/epidemiology , Climate , Disease Outbreaks , Humans , Models, Statistical , Risk Factors , Sri Lanka/epidemiology , Temperature , Travel
4.
Adv Radiat Oncol; 5(2): 221-230, 2020.
Article in English | MEDLINE | ID: mdl-32280822

ABSTRACT

PURPOSE: Radiation-induced xerostomia is one of the most prevalent symptoms during and after head and neck cancer radiation therapy (RT). We aimed to discover the spatial radiation dose-based (voxel dose) importance pattern in the major salivary glands in relation to the recovery of xerostomia 18 months after RT, and to compare the recovery voxel dose importance pattern to the acute incidence (injury) pattern. METHODS AND MATERIALS: This study included all patients within our database with xerostomia outcomes after completion of curative intensity modulated RT. Common Terminology Criteria for Adverse Events xerostomia grade was used to define recovered versus nonrecovered groups at baseline, between end of treatment and 18 months post-RT, and beyond 18 months, respectively. Ridge logistic regression was performed to predict the probability of xerostomia recovery. Voxel doses within geometrically defined parotid glands (PG) and submandibular glands (SMG), demographic characteristics, and clinical factors were included in the algorithm. We plotted the normalized learned weights on the 3-dimensional PG and SMG structures to visualize the voxel dose importance for predicting xerostomia recovery. RESULTS: A total of 146 head and neck cancer patients treated from 2008 to 2016 were identified. The superior regions of the ipsilateral and contralateral PGs were the most influential for xerostomia recovery. The area under the receiver operating characteristic curve, evaluated using 10-fold cross-validation for ridge logistic regression, was 0.68 ± 0.07. Compared with injury, the recovery voxel dose importance pattern was more symmetrical and was influenced by lower dose voxels. CONCLUSIONS: The superior portions of the 2 PGs (a low-dose region) are the most influential on xerostomia recovery and appear to contribute equally. The dissimilarity of the influence patterns between injury and recovery suggests different underlying mechanisms.
The importance pattern identified by spatial radiation dose and machine learning methods can improve our understanding of normal tissue toxicities in RT. Further external validation is warranted.
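The ridge logistic regression and normalized-weight steps described above can be sketched with scikit-learn. The voxel doses, the "first five voxels drive recovery" signal, and all parameter values below are synthetic assumptions for illustration, not the study's data or model:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_patients, n_voxels = 200, 30
X = rng.uniform(0.0, 70.0, size=(n_patients, n_voxels))  # Gy per voxel (simulated)
# Hypothetical signal: low mean dose to the first five "superior PG" voxels
# favors recovery. Real voxel doses and outcomes would come from the clinic.
recovered = (X[:, :5].mean(axis=1) < 35.0).astype(int)

# Ridge (L2-penalized) logistic regression, as named in the abstract.
model = LogisticRegression(penalty="l2", C=1.0, max_iter=5000)
model.fit(X, recovered)

weights = np.abs(model.coef_).ravel()
importance = weights / weights.max()   # normalized voxel-importance map
```

Plotting `importance` back onto the gland geometry is what produces the spatial importance pattern the abstract describes.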

5.
Drug Discov Today; 25(2): 414-421, 2020 Feb.
Article in English | MEDLINE | ID: mdl-31926317

ABSTRACT

A significant number of drugs fail during the clinical testing stage. To understand the attrition of drugs through the regulatory process, here we review and advance machine learning (ML) and natural language processing algorithms to investigate the importance of factors in clinical trials that are linked with failure in Phases II and III. We find that clinical trial phase transitions can be predicted with an average accuracy of 80%. Identifying these trials provides information to sponsors facing difficult decisions about whether these higher risk trials should be modified or halted. We also find common protocol characteristics across therapeutic areas that are linked to phase success, including the number of endpoints and the complexity of the eligibility criteria.
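Protocol characteristics such as endpoint count and eligibility-criteria complexity can be turned into classifier inputs with simple text features. A hypothetical sketch — the feature definitions below (word and clause counts) are assumptions, not the paper's actual feature set:

```python
import re

def protocol_features(n_endpoints, eligibility_text):
    """Toy protocol-complexity features of the kind linked to phase success."""
    words = eligibility_text.split()
    # Treat semicolons/newlines as clause separators (an assumed convention).
    clauses = [c for c in re.split(r"[;\n]", eligibility_text) if c.strip()]
    return {
        "n_endpoints": n_endpoints,
        "criteria_words": len(words),
        "criteria_clauses": len(clauses),
    }

feats = protocol_features(
    3,
    "Age 18-75; ECOG 0-1; no prior systemic therapy;\n adequate organ function",
)
```

Feature dictionaries like this would then be vectorized and fed to any standard classifier to predict phase transition.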


Subjects
Clinical Trials as Topic , Machine Learning , Drug Development , Humans
6.
JMIR Med Inform; 7(4): e14756, 2019 Sep 16.
Article in English | MEDLINE | ID: mdl-31579025

ABSTRACT

BACKGROUND: Patients hospitalized with heart failure suffer the highest rates of 30-day readmission among clinically defined patient populations in the United States. Investigation into the predictability of 30-day readmissions can lead to clinical decision support tools and targeted interventions that help care providers improve individual patient care and reduce readmission risk. OBJECTIVE: This study aimed to develop a dynamic readmission risk prediction model that yields daily predictions for patients hospitalized with heart failure, toward identifying risk trajectories over time and the clinical predictors associated with different patterns in readmission risk trajectories. METHODS: A two-stage predictive modeling approach combining logistic and beta regression was applied to electronic health record data accumulated daily to predict 30-day readmission for 534 hospital encounters of patients with heart failure over 2750 patient days. Unsupervised clustering was performed on the predictions to uncover time-dependent trends in readmission risk over the patient's hospital stay. We used data collected between September 1, 2013, and August 31, 2015, from a community hospital in Maryland (United States) for patients with a primary diagnosis of heart failure. Patients who died during the hospital stay or were transferred to other acute care hospitals or hospice care were excluded. RESULTS: Readmission occurred in 107 (107/534, 20.0%) encounters. The out-of-sample area under the curve for the two-stage predictive model was 0.73 (SD 0.08). Dynamic clinical predictors capturing laboratory results and vital signs had the highest predictive value compared with the demographic, administrative, medical, and procedural data included. Unsupervised clustering identified four risk trajectory groups: decreasing risk (131/534, 24.5% of encounters), high risk (113/534, 21.2%), moderate risk (177/534, 33.1%), and low risk (113/534, 21.2%).
The decreasing risk group demonstrated a change in average probability of readmission from admission (0.69) to discharge (0.30), whereas the high risk (0.75), moderate risk (0.61), and low risk (0.39) groups maintained consistent risk levels over the hospital course. A higher level of hemoglobin, a larger decrease in potassium and diastolic blood pressure from admission to discharge, and a smaller number of past hospitalizations were associated with decreasing readmission risk (P<.001). CONCLUSIONS: Dynamically predicting readmission and quantifying trends over patients' hospital stay illuminated differing risk trajectory groups. Identifying risk trajectory patterns and distinguishing predictors may shed new light on indicators of readmission and the isolated effects of the index hospitalization.
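The trajectory-clustering step can be sketched with plain k-means over daily risk curves. The trajectories below are synthetic stand-ins for model output (not real EHR-derived risks), and the fixed initialization is chosen purely for reproducibility:

```python
import numpy as np

# Synthetic daily readmission-risk trajectories over an 8-day stay
# (five encounters per pattern; illustrative values only).
decreasing = np.tile(np.linspace(0.70, 0.30, 8), (5, 1))
high = np.tile(np.full(8, 0.75), (5, 1))
low = np.tile(np.full(8, 0.35), (5, 1))
risk = np.vstack([decreasing, high, low])

def kmeans(X, init_idx, iters=20):
    """Plain Lloyd's algorithm with fixed initial centers for reproducibility."""
    centers = X[init_idx].astype(float).copy()
    for _ in range(iters):
        # Assign each trajectory to its nearest center (squared Euclidean).
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        labels = d2.argmin(axis=1)
        # Recompute centers as cluster means.
        centers = np.vstack([X[labels == k].mean(axis=0)
                             for k in range(len(init_idx))])
    return labels

labels = kmeans(risk, init_idx=[0, 5, 10])   # one seed per expected pattern
```

On these cleanly separated curves the three patterns fall into three distinct clusters, mirroring how the study's clustering separates decreasing, high, and low risk groups.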

7.
HERD; 12(2): 147-161, 2019 Apr.
Article in English | MEDLINE | ID: mdl-30991849

ABSTRACT

OBJECTIVES: The objective of this study is to determine the optimal allocation of budgets across pairs of alterations that reduce pathogenic bacterial transmission. Three alterations of the built environment are examined: handwashing stations (HW), relative humidity control (RH), and negatively pressured treatment rooms (NP). These interventions were evaluated to minimize the total cost of healthcare-associated infections (HAIs), including medical and litigation costs. BACKGROUND: HAIs are largely preventable but are difficult to control because of their multiple mechanisms of transmission. Moreover, the costs of HAIs and the resulting mortality are increasing, with the latest estimates at US$9.8 billion annually. METHOD: Using 6 years of longitudinal multidrug-resistant infection data, we simulated the transmission of pathogenic bacteria and the infection control effects of the three alterations using Chamchod and Ruan's model. We determined the optimal budget allocations among the alterations by applying Karush-Kuhn-Tucker (KKT) conditions to this nonlinear optimization problem. RESULTS: We examined 24 scenarios using three virulence levels across three facility sizes with varying budget levels. We found that, in general, most of the budget is allocated to the NP or RH alteration in each pairing. At lower budgets, however, it was necessary to use the lower cost alterations, HW or RH. CONCLUSIONS: Mathematical optimization offers healthcare enterprise executives and engineers a tool to assist with the design of safer healthcare facilities within a fiscally constrained environment. Herein, models were developed for the optimal allocation of funds among HW, RH, and NP to best reduce HAIs. Specific strategies vary by facility size and virulence.
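Under a stylized diminishing-returns cost model, the KKT conditions reduce to equalizing marginal returns across alterations. A sketch assuming hypothetical residual-cost curves a_i·exp(-b·x_i) with a common rate b (the study's actual cost functions and parameter values are not given in the abstract; these numbers are chosen so the interior solution is feasible):

```python
import math

# Hypothetical diminishing-returns curves: residual HAI cost a_i * exp(-b * x_i)
# ($M) as a function of spend x_i on each alteration; a_i and b are assumptions.
a = {"HW": 2.0, "RH": 5.0, "NP": 9.0}
b = 0.4            # common effectiveness rate (simplifying assumption)
budget = 9.0       # total budget, $M

def kkt_allocation(a, b, budget):
    """Interior KKT solution: equalize marginal returns a_i*b*exp(-b*x_i)."""
    n = len(a)
    # Stationarity gives x_i = (ln(a_i*b) - ln(lambda)) / b; the budget
    # constraint sum(x_i) = budget then pins down the multiplier lambda.
    log_lam = (sum(math.log(ai * b) for ai in a.values()) - b * budget) / n
    return {k: (math.log(ai * b) - log_lam) / b for k, ai in a.items()}

x = kkt_allocation(a, b, budget)
```

With these toy numbers NP receives the largest share, consistent with the qualitative finding that most of the budget goes to NP or RH.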


Subjects
Bacterial Infections/prevention & control , Cost-Benefit Analysis/statistics & numerical data , Cross Infection/prevention & control , Hospital Design and Construction/economics , Hospital Design and Construction/statistics & numerical data , Hospital Design and Construction/standards , Infection Control/methods , Bacterial Infections/transmission , Hand Disinfection , Humans , Humidity , United States
8.
Adv Radiat Oncol; 4(2): 401-412, 2019.
Article in English | MEDLINE | ID: mdl-31011686

ABSTRACT

PURPOSE: Patients with head-and-neck cancer (HNC) may experience xerostomia after radiation therapy (RT), which leads to compromised quality of life. The purpose of this study is to explore how the spatial pattern of radiation dose (radiomorphology) in the major salivary glands influences xerostomia in patients with HNC. METHODS AND MATERIALS: A data-driven approach using spatially explicit dosimetric predictors, voxel doses (ie, actual radiation doses in voxels in the parotid glands [PG] and submandibular glands [SMG]), was used to predict whether patients would develop xerostomia 3 months after RT. Using planned radiation dose data and other nondose covariates, including baseline xerostomia grade, of 427 patients with HNC in our database, machine learning methods were used to investigate the influence of dose patterns across subvolumes in the PG and SMG on xerostomia. RESULTS: Of the 3 supervised learning methods studied, ridge logistic regression yielded the best predictive performance. Ridge logistic regression was also preferred for evaluating the influence pattern of highly correlated doses on xerostomia, and it showed a discriminative pattern of influence of doses in the PG and SMG on xerostomia. Moreover, the superior-anterior portion of the contralateral PG and the medial portion of the ipsilateral PG were determined to be the most influential regions regarding the dose effect on xerostomia. The area under the receiver operating characteristic curve from 10-fold cross-validation was 0.70 ± 0.04. CONCLUSIONS: Radiomorphology, combined with machine learning methods, is able to suggest the patterns of dose in the PG and SMG that are most influential on xerostomia. The influence pattern identified by this data-driven approach and machine learning methods may help improve RT treatment planning and reduce xerostomia after treatment.

9.
Sci Rep; 9(1): 3616, 2019 Mar 05.
Article in English | MEDLINE | ID: mdl-30837617

ABSTRACT

Xerostomia is a common consequence of radiotherapy in head and neck cancer. Our objective was to compare the regional radiation dose distribution in patients who developed xerostomia within 6 months of radiotherapy and those who recovered from xerostomia within 18 months post-radiotherapy. We developed a feature generation pipeline to extract dose volume histogram features from geometrically defined ipsilateral/contralateral parotid glands, submandibular glands, and oral cavity surrogates for each patient. Permutation tests with multiple comparisons were performed to assess the dose difference between injury vs. non-injury and recovery vs. non-recovery. Ridge logistic regression models were applied to predict injury and recovery using clinical features along with dose features (D10-D90) of the subvolumes extracted from the oral cavity and salivary gland contours plus a 3 mm peripheral shell. Model performances were assessed by the area under the receiver operating characteristic curve (AUC) using nested cross-validation. We found that different regional dose/volume metric patterns exist for injury vs. recovery. Compared with injury, recovery assigns greater importance to the subvolumes receiving lower doses. Within the subvolumes, the importance for injury tends to shift from D90 toward D10. This suggests different thresholds for xerostomia injury and recovery: injury is induced by the subvolumes receiving higher doses, and the ability to recover can be preserved by further reducing the dose to subvolumes already receiving lower doses.
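The D10-D90 features referenced above come from the dose-volume histogram: D_x is the minimum dose received by the hottest x% of a structure's volume. A minimal sketch, assuming equal voxel volumes and using percentile interpolation over a toy dose array:

```python
import numpy as np

def dvh_dx(voxel_doses_gy, x):
    """D_x: minimum dose received by the hottest x% of the structure volume
    (equivalently, x% of voxels receive at least this dose).
    Assumes all voxels have equal volume."""
    return float(np.percentile(voxel_doses_gy, 100 - x))

dose = np.arange(1.0, 101.0)   # toy structure: 100 voxels, doses 1..100 Gy
d10 = dvh_dx(dose, 10)         # near-maximum dose (hottest 10% of volume)
d50 = dvh_dx(dose, 50)         # median dose
d90 = dvh_dx(dose, 90)         # near-minimum dose (90% coverage)
```

By construction D10 ≥ D50 ≥ D90, which matches the abstract's contrast between high-dose (D10) and low-dose (D90) ends of the histogram.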


Subjects
Head and Neck Neoplasms/radiotherapy , Organs at Risk/radiation effects , Radiotherapy/adverse effects , Recovery of Function , Salivary Glands/pathology , Submandibular Gland/pathology , Xerostomia/pathology , Aged , Female , Head and Neck Neoplasms/pathology , Humans , Male , Organ Sparing Treatments/methods , Prospective Studies , Radiotherapy Dosage , Salivary Glands/radiation effects , Submandibular Gland/radiation effects , Xerostomia/etiology
10.
Med Phys; 46(2): 704-713, 2019 Feb.
Article in English | MEDLINE | ID: mdl-30506737

ABSTRACT

PURPOSE: In radiotherapy, it is necessary to characterize dose across the patient anatomy, in target areas and organs at risk. Current tools provide methods to describe dose in terms of percentage of volume and magnitude of dose, but are limited by assumptions of anatomical homogeneity within a region of interest (ROI) and provide a non-spatially aware description of dose. A practice termed radio-morphology is proposed as a method to apply anatomical knowledge to parametrically derive new shapes and substructures from a normalized set of anatomy, ensuring consistently identifiable, spatially aware features of the dose across a patient set. METHODS: Radio-morphologic (RM) features are derived from a three-step procedure: anatomy normalization, shape transformation, and dose calculation. Predefined ROIs are mapped to a common anatomy, a series of geometric transformations are applied to create new structures, and dose is overlaid on the new structures to extract dosimetric features; this feature computation pipeline characterizes patient treatment with greater anatomic specificity than current methods. RESULTS: Examples of structures derived with this framework include concentric shells based on expansions and contractions of the parotid glands, separation of the esophagus into slices along the z-axis, and radial sectors that approximate the neurovascular bundles surrounding the prostate. Compared to organ-level dose-volume histograms (DVHs), using derived RM structures permits a greater level of control over the shapes and anatomical regions that are studied and ensures that all new structures are consistently identified. Using machine learning methods, these derived dose features can help uncover dose dependencies of inter- and intra-organ regions.
Voxel-based and shape-based analysis of the parotid and submandibular glands identified regions that were predictive of the development of high-grade xerostomia (CTCAE grade 2 or greater) at 3-6 months post treatment. CONCLUSIONS: Radio-morphology is a valuable data mining tool that approaches radiotherapy data in a new way, improving the study of radiotherapy to potentially improve prognostic and predictive accuracy. Further applications of this methodology include the use of parametrically derived sub-volumes to drive radiotherapy treatment planning.
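Concentric shells of the kind described can be derived from a distance field around a structure. A toy sketch on a synthetic grid — the spherical ROI, the shell widths, and the exponential dose falloff are all illustrative assumptions, not the paper's pipeline:

```python
import numpy as np

# Distance field from a gland-like ROI center on a toy 3-D grid.
shape = (40, 40, 40)
zz, yy, xx = np.indices(shape)
dist = np.sqrt((zz - 20) ** 2 + (yy - 20) ** 2 + (xx - 20) ** 2)
roi = dist <= 8                                 # surrogate "parotid" sphere

# Concentric 2-voxel shells spanning from inside the ROI to outside its surface.
edges = [4, 6, 8, 10, 12]
shells = [(dist > lo) & (dist <= hi) for lo, hi in zip(edges[:-1], edges[1:])]

dose = 70.0 * np.exp(-dist / 15.0)              # toy dose falling off with distance
mean_shell_dose = [dose[s].mean() for s in shells]
```

Each shell then contributes its own dose statistics (here the mean) as a spatially aware feature, instead of one whole-organ DVH.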


Subjects
Radiotherapy, Image-Guided/methods , Radiotherapy Dosage , Radiotherapy Planning, Computer-Assisted
11.
J Am Med Inform Assoc; 23(e1): e2-e10, 2016 Apr.
Article in English | MEDLINE | ID: mdl-26253131

ABSTRACT

OBJECTIVE: Hospitals are challenged to provide timely patient care while maintaining high resource utilization. This has prompted hospital initiatives to increase patient flow and minimize nonvalue added care time. Real-time demand capacity management (RTDC) is one such initiative whereby clinicians convene each morning to predict which patients are able to leave the same day and prioritize their remaining tasks for early discharge. Our objective is to automate and improve these discharge predictions by applying supervised machine learning methods to readily available health information. MATERIALS AND METHODS: The authors use supervised machine learning methods to predict patients' likelihood of discharge by 2 p.m. and by midnight each day for an inpatient medical unit. Using data collected over 8000 patient stays and 20,000 patient days, the predictive performance of the model is compared to that of clinicians using sensitivity, specificity, Youden's Index (i.e., sensitivity + specificity - 1), and aggregate accuracy measures. RESULTS: Compared with clinician predictions, the model demonstrated significantly higher sensitivity (P < .01), lower specificity (P < .01), and a comparable Youden's Index (P > .10). Early discharges were less predictable than midnight discharges. The model was more accurate than clinicians in predicting the total number of daily discharges and capable of ranking patients closest to future discharge. CONCLUSIONS: There is potential to use readily available health information to predict daily patient discharges with accuracies comparable to clinician predictions. This approach may be used to automate and support daily RTDC predictions aimed at improving patient flow.
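Youden's Index as used above follows directly from sensitivity and specificity. A small sketch with toy binary discharge predictions (the counts are invented for illustration):

```python
def youden(pred, actual):
    """Return (sensitivity, specificity, Youden's Index) for binary predictions."""
    tp = sum(p and a for p, a in zip(pred, actual))
    tn = sum(not p and not a for p, a in zip(pred, actual))
    fp = sum(p and not a for p, a in zip(pred, actual))
    fn = sum(not p and a for p, a in zip(pred, actual))
    sens = tp / (tp + fn)          # fraction of true discharges caught
    spec = tn / (tn + fp)          # fraction of non-discharges correctly left out
    return sens, spec, sens + spec - 1

# Toy day: 1 = discharged by 2 p.m.
model_pred = [1, 1, 1, 0, 1, 0, 1, 0]
actual =     [1, 1, 0, 0, 1, 1, 0, 0]
sens, spec, j = youden(model_pred, actual)
```

A model that trades specificity for sensitivity, as reported above, can still match clinicians on the combined index J = sensitivity + specificity - 1.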


Subjects
Hospital Administration , Length of Stay , Machine Learning , Patient Discharge , Academic Medical Centers , Adult , Aged , Female , Humans , Male , Maryland , Middle Aged , Workflow , Workload
12.
J Am Med Inform Assoc; 23(e1): e49-57, 2016 Apr.
Article in English | MEDLINE | ID: mdl-26342217

ABSTRACT

OBJECTIVE: To develop and prospectively evaluate a web-based tool that forecasts the daily bed need for admissions from the cardiac catheterization laboratory using routinely available clinical data within electronic medical records (EMRs). METHODS: The forecast model was derived using a 13-month retrospective cohort of 6384 catheterization patients. Predictor variables such as demographics, scheduled procedures, and clinical indicators mined from free-text notes were input to a multivariable logistic regression model that predicted the probability of inpatient admission. The model was embedded into a web-based application connected to the local EMR system and used to support bed management decisions. After implementation, the tool was prospectively evaluated for accuracy on a 13-month test cohort of 7029 catheterization patients. RESULTS: The forecast model predicted admission with an area under the receiver operating characteristic curve of 0.722. Daily aggregate forecasts were accurate to within one bed for 70.3% of days and within three beds for 97.5% of days during the prospective evaluation period. The web-based application housing the forecast model was used by cardiology providers in practice to estimate daily admissions from the catheterization laboratory. DISCUSSION: The forecast model identified older age, male gender, invasive procedures, coronary artery bypass grafts, and a history of congestive heart failure as qualities indicating a patient was at increased risk for admission. Diagnostic procedures and less acute clinical indicators decreased patients' risk of admission. Despite the site-specific limitations of the model, these findings were supported by the literature. CONCLUSION: Data-driven predictive analytics may be used to accurately forecast daily demand for inpatient beds for cardiac catheterization patients. Connecting these analytics to EMR data sources has the potential to provide advanced operational decision support.
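The daily aggregate forecast can be obtained by summing per-patient admission probabilities from a logistic model. A sketch with hypothetical coefficients chosen only to mirror the reported direction of effects (older age, CABG, and CHF history raise risk; diagnostic-only procedures lower it) — the actual fitted model and feature set are not given in the abstract:

```python
import math

def admit_probability(beta, features):
    """Logistic model: binary patient features -> probability of admission."""
    z = beta["intercept"] + sum(beta[k] * v for k, v in features.items())
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical coefficients (signs follow the abstract; magnitudes invented).
beta = {"intercept": -1.2, "age_over_65": 0.6, "cabg": 1.5,
        "chf_history": 0.8, "diagnostic_only": -1.0}

# Toy catheterization schedule for one day.
schedule = [
    {"age_over_65": 1, "cabg": 1, "chf_history": 0, "diagnostic_only": 0},
    {"age_over_65": 0, "cabg": 0, "chf_history": 1, "diagnostic_only": 0},
    {"age_over_65": 1, "cabg": 0, "chf_history": 0, "diagnostic_only": 1},
]
probs = [admit_probability(beta, pt) for pt in schedule]
expected_beds = sum(probs)   # daily forecast = sum of per-patient probabilities
```

Rounding `expected_beds` (or reporting it with an interval) gives the bed-management number the web tool would surface each morning.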


Subjects
Algorithms , Cardiac Catheterization , Electronic Health Records , Patient Admission , Adult , Age Factors , Aged , Aged, 80 and over , Female , Forecasting , Heart Failure , Hospital Administration , Humans , Internet , Logistic Models , Male , Middle Aged , Natural Language Processing , Prospective Studies , ROC Curve , Sex Factors
13.
Philos Trans A Math Phys Eng Sci; 369(1938): 976-1009, 2011 Mar 13.
Article in English | MEDLINE | ID: mdl-21282157

ABSTRACT

Recently, widespread valley-bottom damming for water power was identified as a primary control on valley sedimentation in the mid-Atlantic US during the late seventeenth to early twentieth century. The timing of damming coincided with that of accelerated upland erosion during post-European settlement land-use change. In this paper, we examine the impact of local drops in base level on incision into historic reservoir sediment as thousands of ageing dams breach. Analysis of lidar and field data indicates that historic milldam building led to local base-level rises of 2-5 m (typical milldam height) and reduced valley slopes by half. Subsequent base-level fall with dam breaching led to an approximate doubling in slope, a significant base-level forcing. Case studies in forested, rural as well as agricultural and urban areas demonstrate that a breached dam can lead to stream incision, bank erosion and increased loads of suspended sediment, even with no change in land use. After dam breaching, key predictors of stream bank erosion include number of years since dam breach, proximity to a dam and dam height. One implication of this work is that conceptual models linking channel condition and sediment yield exclusively with modern upland land use are incomplete for valleys impacted by milldams. With no equivalent in the Holocene or late Pleistocene sedimentary record, modern incised stream-channel forms in the mid-Atlantic region represent a transient response to both base-level forcing and major changes in land use beginning centuries ago. Similar channel forms might also exist in other locales where historic milling was prevalent.
