1.
INFORMS J Comput ; 36(2): 434-455, 2024.
Article in English | MEDLINE | ID: mdl-38883557

ABSTRACT

Chemotherapy drug administration is a complex problem that often requires expensive clinical trials to evaluate potential regimens; one way to alleviate this burden and better inform future trials is to build reliable models for drug administration. This paper presents a mixed-integer program for combination chemotherapy (utilization of multiple drugs) optimization that incorporates various important operational constraints and, besides dose and concentration limits, controls treatment toxicity based on its effect on the count of white blood cells. To address the uncertainty of tumor heterogeneity, we also propose chance constraints that guarantee reaching an operable tumor size with a high probability in a neoadjuvant setting. We present analytical results pertinent to the accuracy of the model in representing biological processes of chemotherapy and establish its potential for clinical applications through a numerical study of breast cancer.
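The chance-constraint idea above has a standard deterministic reformulation when the uncertain treatment response is modeled as Gaussian. The sketch below is illustrative only: the per-cycle log-kill distribution, the required tumor reduction, and the 95% reliability level are assumed values, not the paper's calibrated model.

```python
from statistics import NormalDist
from math import sqrt

# Hypothetical per-cycle log10 cell-kill ~ Normal(mu, sigma); values illustrative.
mu, sigma = 0.8, 0.3
r_req = 3.0    # required total log10 tumor reduction for operability
alpha = 0.05   # allowed probability of missing the target
z = NormalDist().inv_cdf(1 - alpha)   # standard normal 95% quantile

def meets_chance_constraint(n):
    # P(sum of n i.i.d. log-kills >= r_req) >= 1 - alpha is equivalent to the
    # deterministic constraint n*mu - z*sqrt(n)*sigma >= r_req
    return n * mu - z * sqrt(n) * sigma >= r_req

n = 1
while not meets_chance_constraint(n):
    n += 1
print(n)   # minimal number of cycles meeting the chance constraint
```

In a mixed-integer program such as the paper's, the deterministic equivalent enters as an ordinary linear or second-order-cone constraint on the dose variables.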

2.
Article in English | MEDLINE | ID: mdl-38766899

ABSTRACT

The intrinsic stochasticity of patients' response to treatment is a major consideration for clinical decision-making in radiation therapy. Markov models are powerful tools to capture this stochasticity and render effective treatment decisions. This paper provides an overview of the Markov models for clinical decision analysis in radiation oncology. A comprehensive literature search was conducted within MEDLINE using PubMed, following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines. Only studies published from 2000 to 2023 were considered. Selected publications were summarized in two categories: (i) studies that compare two (or more) fixed treatment policies using Monte Carlo simulation and (ii) studies that seek an optimal treatment policy through Markov Decision Processes (MDPs). Relevant to the scope of this study, 61 publications were selected for detailed review. The majority of these publications (n = 56) focused on comparative analysis of two or more fixed treatment policies using Monte Carlo simulation. Classifications based on cancer site, utility measures and the type of sensitivity analysis are presented. Five publications considered MDPs with the aim of computing an optimal treatment policy; a detailed statement of the analysis and results is provided for each work. As an extension of Markov model-based simulation analysis, MDP offers a flexible framework to identify an optimal treatment policy among a possibly large set of treatment policies. However, the applications of MDPs to oncological decision-making have been understudied, and the full capacity of this framework to render complex optimal treatment decisions warrants further consideration.
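An MDP of the kind surveyed here can be solved by value iteration. The toy model below is a hypothetical two-state treatment problem — states, transition probabilities, and rewards are invented for illustration, not taken from any reviewed study.

```python
# Toy infinite-horizon discounted MDP solved by value iteration.
states = ["stable", "progressing"]
actions = ["wait", "treat"]
gamma = 0.95   # discount factor

# P[s][a] = {next_state: prob}; R[s][a] = immediate reward (e.g. quality-adjusted time)
P = {
    "stable":      {"wait":  {"stable": 0.90, "progressing": 0.10},
                    "treat": {"stable": 0.95, "progressing": 0.05}},
    "progressing": {"wait":  {"stable": 0.00, "progressing": 1.00},
                    "treat": {"stable": 0.60, "progressing": 0.40}},
}
R = {"stable":      {"wait": 1.0, "treat": 0.8},
     "progressing": {"wait": 0.2, "treat": 0.1}}

V = {s: 0.0 for s in states}
for _ in range(1000):   # Bellman backups until (near) convergence
    V = {s: max(R[s][a] + gamma * sum(p * V[s2] for s2, p in P[s][a].items())
                for a in actions)
         for s in states}

# greedy policy with respect to the converged values
policy = {s: max(actions,
                 key=lambda a: R[s][a] + gamma * sum(p * V[s2] for s2, p in P[s][a].items()))
          for s in states}
print(policy)
```

With these numbers the optimal policy is to wait while stable and treat on progression, illustrating how an MDP yields a state-dependent rule rather than a fixed policy.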

3.
Oper Res Lett ; 54, 2024 May.
Article in English | MEDLINE | ID: mdl-38560724

ABSTRACT

We consider the problem of optimally designing a system for repeated use under uncertainty. We develop a modeling framework that integrates the design and operational phases, which are represented by a mixed-integer program and discounted-cost infinite-horizon Markov decision processes, respectively. We seek to simultaneously minimize the design costs and the subsequent expected operational costs. This problem setting arises naturally in several application areas, as we illustrate through examples. We derive a bilevel mixed-integer linear programming formulation for the problem and perform a computational study to demonstrate that realistic instances can be solved numerically.
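When the set of candidate designs is small, the integrated problem can be sketched by enumeration: price each design, attach the discounted infinite-horizon operating cost of the Markov decision process it induces, and minimize the total. All costs and probabilities below are invented for illustration, and the operating phase is reduced to a single-policy (degenerate) MDP.

```python
# Integrated design-then-operate sketch with illustrative numbers.
gamma = 0.9   # discount factor

def operating_cost(p_fail, c_repair, iters=500):
    # Two operational states: "up" and "down" (repair, then back up).
    # Value iteration on the expected discounted cost, starting from zero.
    V = {"up": 0.0, "down": 0.0}
    for _ in range(iters):
        V = {"up": gamma * (p_fail * V["down"] + (1 - p_fail) * V["up"]),
             "down": c_repair + gamma * V["up"]}
    return V["up"]

designs = {   # name -> (build cost, per-period failure prob, repair cost)
    "cheap":  (100.0, 0.20, 50.0),
    "robust": (300.0, 0.05, 50.0),
}

total = {name: build + operating_cost(p, c)
         for name, (build, p, c) in designs.items()}
best = min(total, key=total.get)
print(best, round(total[best], 2))
```

The paper's bilevel mixed-integer formulation replaces this brute-force enumeration so that much larger design spaces can be searched.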

4.
Article in English | MEDLINE | ID: mdl-38462018

ABSTRACT

PURPOSE: Given the limitations of extant models for normal tissue complication probability estimation for osteoradionecrosis (ORN) of the mandible, the purpose of this study was to enrich statistical inference by exploiting structural properties of data and provide a clinically reliable model for ORN risk evaluation through an unsupervised-learning analysis that incorporates the whole radiation dose distribution on the mandible. METHODS AND MATERIALS: The analysis was conducted on retrospective data of 1259 patients with head and neck cancer treated at The University of Texas MD Anderson Cancer Center between 2005 and 2015. During a minimum 12-month posttherapy follow-up period, 173 patients in this cohort (13.7%) developed ORN (grades I to IV). The (structural) clusters of mandibular dose-volume histograms (DVHs) for these patients were identified using the K-means clustering method. A soft-margin support vector machine was used to determine the cluster borders and partition the dose-volume space. The risk of ORN for each dose-volume region was calculated based on incidence rates and other clinical risk factors. RESULTS: The K-means clustering method identified 6 clusters among the DVHs. Based on the first 5 clusters, the dose-volume space was partitioned by the soft-margin support vector machine into distinct regions with different risk indices. The sixth cluster entirely overlapped with the others; the region of this cluster was determined by its envelopes. For each region, the ORN incidence rate per preradiation dental extraction status (a statistically significant, nondose related risk factor for ORN) was reported as the corresponding risk index. CONCLUSIONS: This study presents an unsupervised-learning analysis of a large-scale data set to evaluate the risk of mandibular ORN among patients with head and neck cancer. 
The results provide a visual risk-assessment tool for ORN (based on the whole DVH and preradiation dental extraction status) as well as a range of constraints for dose optimization under different risk levels.
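The clustering step can be illustrated with a bare-bones Lloyd's algorithm. Real DVHs are high-dimensional dose-volume curves, the study found six clusters, and region borders were then drawn with a soft-margin SVM; the one-dimensional data, deterministic seeding, and k = 3 below are simplifications for illustration only.

```python
# One-dimensional toy version of K-means (Lloyd's algorithm).
data = [0.1, 0.15, 0.2, 0.8, 0.85, 0.9, 2.0, 2.1, 2.2]
k = 3
centers = [data[0], data[len(data) // 2], data[-1]]   # spread-out seeds

for _ in range(100):
    # assignment step: each point joins its nearest center
    clusters = [[] for _ in range(k)]
    for x in data:
        nearest = min(range(k), key=lambda j: (x - centers[j]) ** 2)
        clusters[nearest].append(x)
    # update step: move each center to its cluster mean
    centers = [sum(c) / len(c) if c else centers[i]
               for i, c in enumerate(clusters)]

print(sorted(round(c, 3) for c in centers))
```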

5.
medRxiv ; 2023 Mar 29.
Article in English | MEDLINE | ID: mdl-37034700

ABSTRACT

Purpose: Given the limitations of extant models for normal tissue complication probability estimation for osteoradionecrosis (ORN) of the mandible, the purpose of this study was to enrich statistical inference by exploiting structural properties of data and provide a clinically reliable model for ORN risk evaluation through an unsupervised-learning analysis. Materials and Methods: The analysis was conducted on retrospective data of 1,259 head and neck cancer (HNC) patients treated at The University of Texas MD Anderson Cancer Center between 2005 and 2015. The (structural) clusters of mandibular dose-volume histograms (DVHs) were identified through the K-means clustering method. A soft-margin support vector machine (SVM) was used to determine the cluster borders and partition the dose-volume space. The risk of ORN for each dose-volume region was calculated based on the clinical risk factors and incidence rates. Results: The K-means clustering method identified six clusters among the DVHs. Based on the first five clusters, the dose-volume space was partitioned almost perfectly by the soft-margin SVM into distinct regions with different risk indices. The sixth cluster overlapped the others entirely; the region of this cluster was determined by its envelopes. These regions and the associated risk indices provide a range of constraints for dose optimization under different risk levels. Conclusion: This study presents an unsupervised-learning analysis of a large-scale data set to evaluate the risk of mandibular ORN among HNC patients. The results provide a visual risk-assessment tool (based on the whole DVH) and a spectrum of dose constraints for radiation planning.

6.
Cancer Med ; 12(4): 5088-5098, 2023 02.
Article in English | MEDLINE | ID: mdl-36229990

ABSTRACT

BACKGROUND: A primary goal in transoral robotic surgery (TORS) for oropharyngeal squamous cell cancer (OPSCC) survivors is to optimize swallowing function. However, the uncertainty in the outcomes of TORS, including postoperative residual positive margins (PM) and extranodal extension (ENE), may necessitate adjuvant therapy, which can cause significant swallowing toxicity. METHODS: A secondary analysis was performed on prospective registry data of patients with low- to intermediate-risk human papillomavirus-related OPSCC possibly resectable by TORS. Decision trees were developed to model the uncertainties in TORS compared with definitive radiation therapy (RT) and chemoradiation therapy (CRT). Swallowing toxicities were measured with the Dynamic Imaging Grade of Swallowing Toxicity (DIGEST), MD Anderson Dysphagia Inventory (MDADI), and MD Anderson Symptom Inventory-Head and Neck (MDASI-HN) instruments. The likelihoods of PM/ENE were varied to determine the thresholds within which each therapy remains optimal. RESULTS: Compared with RT, TORS resulted in inferior swallowing function for moderate likelihoods of PM/ENE (>60% in the short term for all instruments, >75% in the long term for DIGEST and MDASI), leaving RT as the optimal treatment. Compared with CRT, TORS remained the optimal therapy based on MDADI and MDASI but showed inferior swallowing outcomes based on DIGEST for moderate-to-high likelihoods of PM/ENE (>75% for short-term and >40% for long-term outcomes). CONCLUSION: In the absence of a reliable preoperative estimate of the likelihood of PM/ENE, the overall toxicity in OPSCC patients undergoing TORS with adjuvant therapy may be more severe than in patients receiving nonsurgical treatment, favoring definitive (C)RT protocols.
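The decision-tree comparison above reduces to an expected-value calculation in the probability q of postoperative PM/ENE, and the reported thresholds are where the two branches' expected toxicities cross. The toxicity scores below are hypothetical placeholders, not DIGEST/MDADI/MDASI values from the study.

```python
# Expected swallowing toxicity of the TORS branch vs. definitive RT,
# as a function of the PM/ENE probability q. Scores are hypothetical.
tox_tors_alone = 2.0      # TORS sufficient, no adjuvant therapy needed
tox_tors_adjuvant = 6.0   # PM/ENE present, adjuvant (C)RT added
tox_rt = 4.5              # definitive RT arm

def expected_tox_tors(q):
    # expected toxicity of the TORS branch of the decision tree
    return (1 - q) * tox_tors_alone + q * tox_tors_adjuvant

# Threshold where TORS stops being preferable to RT:
# (1 - q)*a + q*b = c  =>  q* = (c - a) / (b - a)
q_star = (tox_rt - tox_tors_alone) / (tox_tors_adjuvant - tox_tors_alone)
print(q_star)
```

Below q*, TORS has lower expected toxicity; above it, definitive RT does — the same one-way sensitivity analysis the abstract performs on the PM/ENE likelihoods.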


Subject(s)
Head and Neck Neoplasms , Oropharyngeal Neoplasms , Robotic Surgical Procedures , Humans , Robotic Surgical Procedures/adverse effects , Robotic Surgical Procedures/methods , Deglutition , Oropharyngeal Neoplasms/therapy , Oropharyngeal Neoplasms/etiology , Squamous Cell Carcinoma of Head and Neck , Head and Neck Neoplasms/etiology
7.
Cancer ; 126(4): 749-756, 2020 02 15.
Article in English | MEDLINE | ID: mdl-31725906

ABSTRACT

BACKGROUND: A possible surveillance model for patients with head and neck cancer (HNC) who received definitive radiotherapy was created using a partially observed Markov decision process. The goal of this model is to guide surveillance imaging policies after definitive radiotherapy. METHODS: The partially observed Markov decision process model was formulated to determine the optimal times to scan patients. Transition probabilities were computed using a data set of 1508 patients with HNC who received definitive radiotherapy between the years 2000 and 2010. Kernel density estimation was used to smooth the sample distributions. The reward function was derived using cost estimates from the literature. Additional model parameters were estimated using either data from the literature or clinical expertise. RESULTS: When considering all forms of relapse, the model showed that the optimal time between scans was longer than the time intervals used in the institutional guidelines. The optimal policy dictates that there should be less time between surveillance scans immediately after treatment compared with years after treatment. Comparable results also held when only locoregional relapses were considered as relapse events in the model. Simulation results for the inclusive relapse cases showed that <15% of patients experienced a relapse over a simulated 36-month surveillance program. CONCLUSIONS: This model suggests that less frequent surveillance scan policies can maintain adequate information on relapse status for patients with HNC treated with radiotherapy. This model could potentially translate into a more cost-effective surveillance program for this group of patients.
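The qualitative finding — shorter gaps between scans right after treatment, longer gaps later — falls out of any model with a decaying relapse hazard. The sketch below schedules a scan whenever the relapse probability accrued since the last scan crosses a tolerance; the hazard function and tolerance are invented for illustration, not the fitted transition probabilities of the POMDP.

```python
from math import exp

# Schedule surveillance scans so the relapse probability accrued since the
# last scan never exceeds a tolerance. Hazard and tolerance are illustrative.
def monthly_relapse_prob(t):
    return 0.02 * exp(-t / 12.0)   # hazard decays with time since treatment

tolerance = 0.05
scans, accrued = [], 0.0
for month in range(1, 37):          # 36-month surveillance program
    accrued += monthly_relapse_prob(month)
    if accrued >= tolerance:
        scans.append(month)
        accrued = 0.0
print(scans)   # scan intervals widen as the hazard decays
```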


Subject(s)
Carcinoma, Squamous Cell/radiotherapy , Head and Neck Neoplasms/radiotherapy , Markov Chains , Monitoring, Physiologic/methods , Algorithms , Carcinoma, Squamous Cell/diagnostic imaging , Cohort Studies , Female , Head and Neck Neoplasms/diagnostic imaging , Humans , Magnetic Resonance Imaging/methods , Male , Middle Aged , Models, Biological , Neoplasm Recurrence, Local , Tomography, X-Ray Computed/methods
8.
Cancer ; 125(11): 1823-1829, 2019 06 01.
Article in English | MEDLINE | ID: mdl-30748005

ABSTRACT

BACKGROUND: The current study was performed to assess the efficacy of surveillance imaging in patients with head and neck cancer (HNC) who are treated definitively with radiotherapy. METHODS: Eligible patients included those with a demonstrable disease-free interval (≥1 follow-up imaging procedure without evidence of disease and a subsequent visit/imaging procedure) who underwent treatment of HNC from 2000 through 2010. RESULTS: A total of 1508 patients were included. The median overall survival was 99 months, with a median imaging follow-up period of 59 months. Of the 1508 patients, 190 (12.6%) experienced disease recurrence (107 locoregional and 83 distant). A total of 119 patients (62.6%) in the group with disease recurrence were symptomatic and/or had an adverse clinical finding associated with the recurrence. Approximately 80% of locoregional recurrences presented with a clinical finding, whereas 60% of distant recurrences were detected by imaging in asymptomatic patients. Despite the earlier detection of disease recurrence via imaging, patients with clinically detected recurrence were significantly more likely to undergo salvage therapy than those whose recurrence was detected on imaging (odds ratio, 0.35). There was no difference in overall survival between patients whose recurrences were detected clinically and those detected with imaging alone. Approximately 70% of disease recurrences occurred within the first 2 years. In patients who developed disease recurrence after 2 years, the median time to recurrence was 51 months. After 2 years, the average number of imaging procedures needed to detect one salvageable recurrence in the imaging-detected group was 1539.
CONCLUSIONS: Surveillance imaging in asymptomatic patients with HNC who are treated definitively with radiotherapy without clinically suspicious findings beyond 2 years has a low yield and a high cost. Physicians ordering these studies must use judicious consideration and discretion.


Subject(s)
Head and Neck Neoplasms/diagnostic imaging , Head and Neck Neoplasms/epidemiology , Neoplasm Recurrence, Local/diagnostic imaging , Neoplasm Recurrence, Local/epidemiology , Population Surveillance/methods , Adolescent , Adult , Aged , Aged, 80 and over , Early Detection of Cancer , Female , Head and Neck Neoplasms/radiotherapy , Humans , Male , Middle Aged , Neoplasm Recurrence, Local/radiotherapy , Retrospective Studies , Salvage Therapy , Survival Analysis , Time-to-Treatment , Young Adult
9.
Breastfeed Med ; 12(10): 645-658, 2017 12.
Article in English | MEDLINE | ID: mdl-28906133

ABSTRACT

OBJECTIVE: We sought to determine the impact of changes in breastfeeding rates on population health. MATERIALS AND METHODS: We used a Monte Carlo simulation model to estimate the population-level changes in disease burden associated with marginal changes in rates of any breastfeeding at each month from birth to 12 months of life, and in rates of exclusive breastfeeding from birth to 6 months of life. We used these marginal estimates to construct an interactive online calculator (available at www.usbreastfeeding.org/saving-calc). The Institutional Review Board of the Cambridge Health Alliance exempted the study. RESULTS: Using our interactive online calculator, we found that a 5-percentage-point increase in breastfeeding rates was associated with statistically significant differences in child infectious morbidity for the U.S. population, including otitis media (101,952 cases, 95% confidence interval [CI] 77,929-131,894 cases) and gastrointestinal infection (236,073 cases, 95% CI 190,643-290,278 cases). Associated medical cost differences were $31,784,763 (95% CI $24,295,235-$41,119,548) for otitis media and $12,588,848 ($10,166,203-$15,479,352) for gastrointestinal infection. The state-level impact of attaining Healthy People 2020 goals varied by population size and current breastfeeding rates. CONCLUSION: Modest increases in breastfeeding rates substantially impact healthcare costs in the first year of life.
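The marginal-impact calculation behind such a calculator can be sketched as a Monte Carlo simulation: sample how many infants switch to breastfeeding under the higher rate, then compare their disease counts with and without the protective effect. The cohort size, baseline risk, and risk ratio below are illustrative assumptions, not the study's fitted inputs.

```python
import random

# Monte Carlo sketch: excess cases avoided when any-breastfeeding rises by
# 5 percentage points. All parameters are illustrative.
random.seed(42)
cohort = 100_000          # simulated birth cohort
p_disease = 0.30          # baseline risk without breastfeeding
risk_ratio = 0.85         # relative risk with breastfeeding
delta_bf = 0.05           # 5-percentage-point increase in breastfeeding

def cases_avoided():
    # infants who switch to breastfeeding under the higher rate
    switched = sum(random.random() < delta_bf for _ in range(cohort))
    base = sum(random.random() < p_disease for _ in range(switched))
    breastfed = sum(random.random() < p_disease * risk_ratio
                    for _ in range(switched))
    return base - breastfed

draws = sorted(cases_avoided() for _ in range(100))
median, ci = draws[50], (draws[2], draws[97])   # empirical 95% interval
print(median, ci)
```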


Subject(s)
Breast Feeding/economics , Breast Feeding/statistics & numerical data , Health Care Costs/statistics & numerical data , Internet , Population Health/statistics & numerical data , Female , Health Knowledge, Attitudes, Practice , Humans , Infant , Infant, Newborn , Male , Monte Carlo Method , Software , United States
10.
Matern Child Nutr ; 13(1)2017 01.
Article in English | MEDLINE | ID: mdl-27647492

ABSTRACT

The aim of this study was to quantify the excess cases of pediatric and maternal disease, death, and costs attributable to suboptimal breastfeeding rates in the United States. Using the current literature on the associations between breastfeeding and health outcomes for nine pediatric and five maternal diseases, we created Monte Carlo simulations modeling a hypothetical cohort of U.S. women followed from age 15 to age 70 years and their children from birth to age 20 years. We examined disease outcomes using (a) 2012 breastfeeding rates and (b) assuming that 90% of infants were breastfed according to medical recommendations. We measured annual excess cases, deaths, and associated costs, in 2014 dollars, using a 2% discount rate. Annual excess deaths attributable to suboptimal breastfeeding total 3,340 (95% confidence interval [1,886 to 4,785]), 78% of which are maternal due to myocardial infarction (n = 986), breast cancer (n = 838), and diabetes (n = 473). Excess pediatric deaths total 721, mostly due to Sudden Infant Death Syndrome (n = 492) and necrotizing enterocolitis (n = 190). Medical costs total $3.0 billion, 79% of which are maternal. Costs of premature death total $14.2 billion. The number of women needed to breastfeed as medically recommended to prevent an infant gastrointestinal infection is 0.8; acute otitis media, 3; hospitalization for lower respiratory tract infection, 95; maternal hypertension, 55; diabetes, 162; and myocardial infarction, 235. For every 597 women who optimally breastfeed, one maternal or child death is prevented. Policies to increase optimal breastfeeding could result in substantial public health gains. Breastfeeding has a larger impact on women's health than previously appreciated.


Subject(s)
Breast Feeding/economics , Breast Feeding/statistics & numerical data , Child Health/economics , Maternal Health/economics , Adolescent , Adult , Aged , Child , Child, Preschool , Cohort Studies , Female , Health Care Costs , Health Status , Humans , Infant , Middle Aged , Treatment Outcome , United States , Young Adult
11.
J Pediatr ; 175: 100-105.e2, 2016 08.
Article in English | MEDLINE | ID: mdl-27131403

ABSTRACT

OBJECTIVE: To estimate risk of necrotizing enterocolitis (NEC) for extremely low birth weight (ELBW) infants as a function of preterm formula (PF) and maternal milk intake and calculate the impact of suboptimal feeding on the incidence and costs of NEC. STUDY DESIGN: We used aORs derived from the Glutamine Trial to perform Monte Carlo simulation of a cohort of ELBW infants under current suboptimal feeding practices, compared with a theoretical cohort in which 90% of infants received at least 98% human milk. RESULTS: NEC incidence among infants receiving ≥98% human milk was 1.3%; 11.1% among infants fed only PF; and 8.2% among infants fed a mixed diet (P = .002). In adjusted models, compared with infants fed predominantly human milk, we found an increased risk of NEC associated with exclusive PF (aOR = 12.1, 95% CI 1.5-94.2) or a mixed diet (aOR = 8.7, 95% CI 1.2-65.2). In Monte Carlo simulation, current feeding of ELBW infants was associated with 928 excess NEC cases and 121 excess deaths annually, compared with a model in which 90% of infants received ≥98% human milk. These models estimated an annual cost of suboptimal feeding of ELBW infants of $27.1 million (CI $24 million, $30.4 million) in direct medical costs, $563,655 (CI $476,191, $599,069) in indirect nonmedical costs, and $1.5 billion (CI $1.3 billion, $1.6 billion) in cost attributable to premature death. CONCLUSIONS: Among ELBW infants, not being fed predominantly human milk is associated with an increased risk of NEC. Efforts to support milk production by mothers of ELBW infants may prevent infant deaths and reduce costs.


Subject(s)
Breast Feeding/economics , Enterocolitis, Necrotizing/economics , Health Care Costs/statistics & numerical data , Infant Formula/economics , Infant, Extremely Low Birth Weight , Infant, Premature, Diseases/economics , Enterocolitis, Necrotizing/epidemiology , Enterocolitis, Necrotizing/prevention & control , Humans , Infant, Newborn , Infant, Premature , Infant, Premature, Diseases/epidemiology , Infant, Premature, Diseases/prevention & control , Milk, Human , Models, Economic , Monte Carlo Method , United States/epidemiology
12.
Ann Intern Med ; 161(3): 170-80, 2014 Aug 05.
Article in English | MEDLINE | ID: mdl-25089861

ABSTRACT

BACKGROUND: Chronic hepatitis C virus (HCV) infection causes a substantial health and economic burden in the United States. With the availability of direct-acting antiviral agents, recently approved therapies and those under development, and 1-time birth-cohort screening, the burden of this disease is expected to decrease. OBJECTIVE: To predict the effect of new therapies and screening on chronic HCV infection and associated disease outcomes. DESIGN: Individual-level state-transition model. SETTING: Existing and anticipated therapies and screening for HCV infection in the United States. PATIENTS: Total HCV-infected population in the United States. MEASUREMENTS: The number of cases of chronic HCV infection and outcomes of advanced-stage HCV infection. RESULTS: The number of cases of chronic HCV infection decreased from 3.2 million in 2001 to 2.3 million in 2013. One-time birth-cohort screening beginning in 2013 is expected to identify 487,000 cases of HCV infection in the next 10 years. In contrast, 1-time universal screening could identify 933,700 cases. With the availability of highly effective therapies, HCV infection could become a rare disease in the next 22 years. Recently approved therapies for HCV infection and 1-time birth-cohort screening could prevent approximately 124,200 cases of decompensated cirrhosis, 78,800 cases of hepatocellular carcinoma, 126,500 liver-related deaths, and 9900 liver transplantations by 2050. Increasing the treatment capacity would further reduce the burden of HCV disease. LIMITATION: Institutionalized patients with HCV infection were excluded, and empirical data on the effectiveness of future therapies and on the future annual incidence and treatment capacity of HCV infection are lacking. CONCLUSION: New therapies for HCV infection and widespread implementation of screening and treatment will play an important role in reducing the burden of HCV disease. More aggressive screening recommendations are needed to identify the large pool of infected patients. PRIMARY FUNDING SOURCE: National Institutes of Health.
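A state-transition model of this kind projects a cohort forward through disease states by repeated matrix-vector multiplication. The minimal sketch below uses hypothetical annual transition probabilities, not the model's calibrated values, and omits the treatment-cure transitions the paper evaluates.

```python
# Minimal cohort state-transition (Markov) sketch of chronic-disease
# progression. Annual transition probabilities are illustrative only.
states = ["chronic", "cirrhosis", "decompensated", "dead"]
P = {
    "chronic":       {"chronic": 0.96, "cirrhosis": 0.03, "decompensated": 0.00, "dead": 0.01},
    "cirrhosis":     {"chronic": 0.00, "cirrhosis": 0.93, "decompensated": 0.05, "dead": 0.02},
    "decompensated": {"chronic": 0.00, "cirrhosis": 0.00, "decompensated": 0.85, "dead": 0.15},
    "dead":          {"chronic": 0.00, "cirrhosis": 0.00, "decompensated": 0.00, "dead": 1.00},
}

def project(P, years, start="chronic"):
    # push the whole cohort distribution forward one year at a time
    dist = {s: (1.0 if s == start else 0.0) for s in states}
    for _ in range(years):
        dist = {s2: sum(dist[s] * P[s][s2] for s in states) for s2 in states}
    return dist

d = project(P, 20)
print({s: round(p, 3) for s, p in d.items()})
```

An intervention (screening plus cure) would be modeled as a second transition matrix, and the two projections differenced to obtain cases averted.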


Subject(s)
Antiviral Agents/therapeutic use , Hepacivirus , Hepatitis C, Chronic/epidemiology , Models, Biological , Genotype , Hepacivirus/genetics , Hepatitis C, Chronic/diagnosis , Hepatitis C, Chronic/drug therapy , Humans , Mass Screening , Treatment Outcome , United States/epidemiology
13.
PLoS One ; 9(6): e98354, 2014.
Article in English | MEDLINE | ID: mdl-24963883

ABSTRACT

BACKGROUND: Many analyses of HIV treatment decisions assume a fixed formulary of HIV drugs. However, new drugs are approved nearly twice a year, and the rate of availability of new drugs may affect treatment decisions, particularly when to initiate antiretroviral therapy (ART). OBJECTIVES: To determine the impact of considering the availability of new drugs on the optimal initiation criteria for ART and outcomes in patients with HIV/AIDS. METHODS: We enhanced a previously described simulation model of the optimal time to initiate ART to incorporate the rate of availability of new antiviral drugs. We assumed that the future rate of availability of new drugs would be similar to the past rate, which we estimated by fitting a statistical model to actual HIV drug approval data from 1982-2010. We then tested whether or not the future availability of new drugs affected the model-predicted optimal time to initiate ART based on clinical outcomes, considering treatment initiation thresholds of 200, 350, and 500 cells/mm3. We also quantified the impact of the future availability of new drugs on life expectancy (LE) and quality-adjusted life expectancy (QALE). RESULTS: In base case analysis, considering the availability of new drugs raised the optimal starting CD4 threshold for most patients to 500 cells/mm3. The predicted gains in outcomes due to availability of pipeline drugs were generally small (less than 1%), but for young patients with a high viral load could add as much as a 4.9% (1.73 years) increase in LE and an 8% (2.43 QALY) increase in QALE, because these patients were particularly likely to exhaust currently available ART regimens before they died. In sensitivity analysis, increasing the rate of availability of new drugs did not substantially alter the results. Lowering the toxicity of future ART drugs had greater potential to increase benefit for many patient groups, increasing QALE by as much as 10%. CONCLUSIONS: The future availability of new ART drugs, even without lower toxicity, raises the optimal treatment-initiation threshold for most patients and improves clinical outcomes, especially for younger patients with higher viral loads. Reductions in the toxicity of future ART drugs could further shift optimal treatment initiation and improve clinical outcomes for all HIV patients.


Subject(s)
Anti-Retroviral Agents/therapeutic use , Drug Discovery/trends , HIV Infections/drug therapy , Adult , Age Factors , Decision Making , Drug Resistance, Viral , Humans , Life Expectancy , Quality-Adjusted Life Years , Time Factors , Treatment Outcome , Viral Load
14.
PLoS One ; 6(1): e16170, 2011 Jan 25.
Article in English | MEDLINE | ID: mdl-21283569

ABSTRACT

BACKGROUND: Several guidelines to reduce cardiovascular risk in diabetes patients exist in North America, Europe, and Australia. Their ability to achieve this goal efficiently is unclear. METHODS AND FINDINGS: Decision analysis was used to compare the efficiency and effectiveness of international contemporary guidelines for the management of hypertension and hyperlipidemia for patients aged 40-80 with type 2 diabetes. Measures of comparative effectiveness included the expected probability of a coronary or stroke event, incremental medication costs per event, and number-needed-to-treat (NNT) to prevent an event. All guidelines are equally effective, but they differ significantly in their medication costs. The range of NNT to prevent an event was small across guidelines (6.5-7.6 for males and 6.5-7.5 for females); a larger range was observed for expected cost per event avoided ($117,269-$157,186 for males and $115,999-$163,775 for females). Australian and U.S. guidelines result in the highest and lowest expected costs, respectively. CONCLUSIONS: International guidelines based on the same evidence and seeking the same goal are similar in their effectiveness; however, there are large differences in expected medication costs.


Subject(s)
Diabetes Complications/economics , Diabetes Mellitus, Type 2/drug therapy , Hyperlipidemias/drug therapy , Hypertension/drug therapy , Practice Guidelines as Topic/standards , Australia , Cost-Benefit Analysis , Decision Support Techniques , Diabetes Complications/drug therapy , Diabetes Mellitus, Type 2/complications , Disease Management , Drug Costs , Europe , Female , Humans , Hyperlipidemias/complications , Hyperlipidemias/economics , Hypertension/complications , Hypertension/economics , Male , North America
15.
Med Decis Making ; 30(4): 474-83, 2010.
Article in English | MEDLINE | ID: mdl-20044582

ABSTRACT

We provide a tutorial on the construction and evaluation of Markov decision processes (MDPs), which are powerful analytical tools used for sequential decision making under uncertainty that have been widely used in many industrial and manufacturing applications but are underutilized in medical decision making (MDM). We demonstrate the use of an MDP to solve a sequential clinical treatment problem under uncertainty. Markov decision processes generalize standard Markov models in that a decision process is embedded in the model and multiple decisions are made over time. Furthermore, they have significant advantages over standard decision analysis. We compare MDPs to standard Markov-based simulation models by solving the problem of the optimal timing of living-donor liver transplantation using both methods. Both models result in the same optimal transplantation policy and the same total life expectancies for the same patient and living donor. The computation time for solving the MDP model is significantly smaller than that for solving the Markov model. We briefly describe the growing literature of MDPs applied to medical decisions.


Subject(s)
Decision Making , Markov Chains , Uncertainty , Humans , Liver Transplantation
16.
PLoS One ; 3(6): e2468, 2008 Jun 25.
Article in English | MEDLINE | ID: mdl-18575623

ABSTRACT

INTRODUCTION: The ability to preserve organs prior to transplant is essential to the organ allocation process. OBJECTIVE: The purpose of this study is to describe the functional relationship between cold-ischemia time (CIT) and primary nonfunction (PNF), patient and graft survival in liver transplant. METHODS: To identify relevant articles, Medline, EMBASE and the Cochrane database, including the non-English literature identified in these databases, were searched from 1966 to April 2008. Two independent reviewers screened and extracted the data. CIT was analyzed both as a continuous variable and stratified by clinically relevant intervals. Nondichotomous variables were weighted by sample size. Percent variables were weighted by the inverse of the binomial variance. RESULTS: Twenty-six studies met criteria. Functionally, PNF% = -6.678281 + 0.9134701*CIT_mean + 0.1250879*(CIT_mean - 9.89535)^2 - 0.0067663*(CIT_mean - 9.89535)^3 (r^2 = 0.625, p < .0001). Mean patient survival: 93% (1 month), 88% (3 months), 83% (6 months) and 83% (12 months). Mean graft survival: 85.9% (1 month), 80.5% (3 months), 78.1% (6 months) and 76.8% (12 months). Maximum patient and graft survival occurred with CITs between 7.5 and 12.5 hours at each survival interval. PNF was also significantly correlated with ICU time, % first-time grafts and % immunologic mismatches. CONCLUSION: The results of this work imply that CIT may be the most important pre-transplant information needed in the decision to accept an organ.
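The fitted polynomial from the abstract can be evaluated directly once its exponents (lost in extraction) are restored. The coefficients below are taken verbatim from the abstract; CIT is the study-level mean cold-ischemia time in hours, and the sample evaluation points are arbitrary.

```python
# PNF% as a cubic in mean cold-ischemia time, per the abstract's fit:
# PNF% = -6.678281 + 0.9134701*CIT + 0.1250879*(CIT - 9.89535)^2
#        - 0.0067663*(CIT - 9.89535)^3
def pnf_percent(cit_mean):
    d = cit_mean - 9.89535
    return (-6.678281 + 0.9134701 * cit_mean
            + 0.1250879 * d ** 2 - 0.0067663 * d ** 3)

for cit in (8, 10, 14, 18):   # hours; illustrative evaluation points
    print(cit, round(pnf_percent(cit), 2))
```

Predicted PNF rises steeply beyond roughly the 7.5-12.5 hour window the abstract identifies as optimal for survival.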


Subject(s)
Cold Temperature , Graft Survival , Ischemia , Liver Transplantation , Adult , Case-Control Studies , Cohort Studies , Humans , Randomized Controlled Trials as Topic
17.
Crit Care ; 11(3): R65, 2007.
Article in English | MEDLINE | ID: mdl-17570835

ABSTRACT

INTRODUCTION: Sepsis is the leading cause of death in critically ill patients and often affects individuals with community-acquired pneumonia. To overcome the limitations of earlier mathematical models used to describe sepsis and predict outcomes, we designed an empirically based Monte Carlo model that simulates the progression of sepsis in hospitalized patients over a 30-day period. METHODS: The model simulates changing health over time, as represented by the Sepsis-related Organ Failure Assessment (SOFA) score, as a function of a patient's previous health state and length of hospital stay. We used data from patients enrolled in the GenIMS (Genetic and Inflammatory Markers of Sepsis) study to calibrate the model, and tested the model's ability to predict deaths, discharges, and daily SOFA scores over time using different algorithms to estimate the natural history of sepsis. We evaluated the stability of the methods using bootstrap sampling techniques. RESULTS: Of the 1,888 patients originally enrolled, most were elderly (mean age 67.77 years) and white (80.72%). About half (47.98%) were female. Most were relatively ill, with a mean Acute Physiology and Chronic Health Evaluation III score of 56 and Pneumonia Severity Index score of 73.5. The model's estimates of the daily pattern of deaths, discharges, and SOFA scores over time were not statistically different from the actual pattern when information about how long patients had been ill was included in the model (P = 0.91 to 0.98 for discharges; P = 0.26 to 0.68 for deaths). However, model estimates of these patterns were different from the actual pattern when the model did not include data on the duration of illness (P < 0.001 for discharges; P = 0.001 to 0.040 for deaths). Model results were stable to bootstrap validation. 
CONCLUSION: An empiric simulation model of sepsis can predict complex longitudinal patterns in the progression of sepsis, most accurately when the model includes data on both the organ-system level of illness and its duration. This work supports incorporating into mathematical models of disease the clinical intuition that an individual's history of disease matters, and it represents an advance over several prior simulation models that assume a constant rate of disease progression.


Subject(s)
Monte Carlo Method , Pneumonia, Bacterial/epidemiology , Sepsis/diagnosis , Sepsis/epidemiology , Aged , Comorbidity , Disease Progression , Female , Hospitalization/statistics & numerical data , Humans , Male , Predictive Value of Tests , Severity of Illness Index , United States/epidemiology
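The day-by-day mechanics of a model like the one in entry 17 (next health state sampled as a function of current SOFA score and time since admission) can be sketched in a few lines. All transition logic, bounds on trajectory events, and thresholds below are illustrative assumptions, not the GenIMS-calibrated estimates:

```python
import random

# Hypothetical empirical transition step: next SOFA score as a function of the
# current score and the day in hospital. In the calibrated model this kernel is
# estimated from patient data; the drift and step distribution here are toy
# placeholders that merely encode "duration of illness matters."
def next_sofa(current, day, rng):
    drift = -1 if day > 5 else 0                    # later days trend toward recovery
    step = rng.choice([-1, 0, 1])                   # sampled daily change
    return max(0, min(24, current + step + drift))  # SOFA is bounded 0-24

def simulate_patient(initial_sofa, horizon=30, seed=0):
    rng = random.Random(seed)
    trajectory = [initial_sofa]
    for day in range(1, horizon + 1):
        s = next_sofa(trajectory[-1], day, rng)
        trajectory.append(s)
        if s == 0:    # crude discharge proxy (illustrative)
            break
        if s >= 20:   # crude death proxy (illustrative)
            break
    return trajectory

traj = simulate_patient(8)
```

Running many such trajectories and tallying daily deaths, discharges, and score distributions is what gets compared against the observed cohort patterns.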
18.
Med Decis Making ; 26(5): 550-3, 2006.
Article in English | MEDLINE | ID: mdl-16997930

ABSTRACT

The authors discuss techniques for Monte Carlo (MC) cohort simulations that reduce the number of simulation replications required to achieve a given degree of precision for various output measures. Known as variance reduction techniques, they are often used in industrial engineering and operations research models, but they are seldom used in medical models. However, most MC cohort simulations are well suited to the implementation of these techniques. The authors discuss the cost of implementation versus the benefit of reduced replications.


Subject(s)
Computer Simulation/statistics & numerical data , Models, Statistical , Monte Carlo Method , Decision Making , Humans
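One variance reduction technique of the kind entry 18 discusses is antithetic variates: pairing each uniform draw u with 1 - u so the two runs are negatively correlated, which lowers the variance of the mean estimate for the same number of model evaluations. The outcome function below is a toy monotone model, not from the cited article:

```python
import random
import statistics

def outcome(u):
    # Hypothetical monotone outcome model mapping a uniform draw to a utility.
    return 1.0 - u ** 2

def plain_mc(n, seed=1):
    # Standard Monte Carlo estimate of the mean outcome.
    rng = random.Random(seed)
    return statistics.mean(outcome(rng.random()) for _ in range(n))

def antithetic_mc(n, seed=1):
    # Antithetic variates: each draw u is paired with its mirror 1 - u, and the
    # pair average replaces two independent draws.
    rng = random.Random(seed)
    vals = []
    for _ in range(n // 2):
        u = rng.random()
        vals.append(0.5 * (outcome(u) + outcome(1.0 - u)))
    return statistics.mean(vals)
```

For monotone outcome functions like this one, the antithetic estimate typically reaches a given precision with far fewer replications than plain sampling, which is exactly the implementation-cost-versus-replication tradeoff the authors weigh.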
19.
Med Decis Making ; 25(2): 199-209, 2005.
Article in English | MEDLINE | ID: mdl-15800304

ABSTRACT

BACKGROUND: The optimal allocation of scarce donor livers is a contentious health care issue requiring careful analysis. The objective of this article was to design a biologically based discrete-event simulation to test proposed changes in allocation policies. METHODS: The authors used data from multiple sources to simulate end-stage liver disease and the complex allocation system. To validate the model, they compared simulation output with historical data. RESULTS: Simulation outcomes were within 1% to 2% of actual results for measures such as new candidates, donated livers, and transplants by year. The model overestimated the yearly size of the waiting list by 5% in the last year of the simulation and the total number of pretransplant deaths by 10%. CONCLUSION: The authors created a discrete-event simulation model that represents the biology of end-stage liver disease and the health care organization of transplantation in the United States.


Subject(s)
Computer Simulation , Decision Support Techniques , Liver Failure, Acute/surgery , Liver Transplantation/statistics & numerical data , Patient Selection , Tissue and Organ Procurement/methods , Adolescent , Adult , Algorithms , Graft Survival , Humans , Liver Failure, Acute/mortality , Liver Transplantation/mortality , Quality-Adjusted Life Years , Registries , Resource Allocation/methods , Waiting Lists
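The core loop of a discrete-event simulation like the one in entry 19 processes time-stamped events from a priority queue. The sketch below uses illustrative event times and a simple longest-waiting-first matching rule, not the actual allocation policy or the validated model's inputs:

```python
import heapq

def simulate(events):
    # events: list of (time, kind) tuples, kind in {"candidate", "liver"}.
    # Events are processed in time order off a heap; each arriving liver is
    # allocated to the longest-waiting listed candidate (toy rule).
    heap = list(events)
    heapq.heapify(heap)
    waiting, transplants = [], 0
    while heap:
        t, kind = heapq.heappop(heap)
        if kind == "candidate":
            waiting.append(t)       # record listing time
        elif kind == "liver" and waiting:
            waiting.pop(0)          # allocate to longest-waiting candidate
            transplants += 1
        # a liver arriving with no one waiting goes unused in this sketch
    return transplants, len(waiting)

tx, still_waiting = simulate([(1, "candidate"), (2, "candidate"),
                              (3, "liver"), (5, "liver"), (6, "liver")])
```

Counting transplants, waiting-list size, and pretransplant deaths over simulated years is what gets validated against the historical data, as in the 1% to 10% discrepancies the authors report.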
20.
Med Decis Making ; 25(1): 35-46, 2005.
Article in English | MEDLINE | ID: mdl-15673580

ABSTRACT

BACKGROUND: The United States is currently divided into 11 transplant regions, which vary in area and in number of organ procurement organizations (OPOs). Region size affects organ travel time and organ viability at transplant. PURPOSE: To develop a methodologic framework for determining optimal configurations of regions that maximize transplant allocation efficiency and geographic parity. METHODS: An integer program was designed to maximize a weighted combination of 2 objectives: 1) intraregional transplants; 2) geographic parity, that is, maximizing the lowest intraregional transplant rate across all OPOs. Two classes of functions relating liver travel time to liver viability were also examined as part of the sensitivity analyses. RESULTS: Preliminary results indicate that reorganizing the regions while constraining their number to 11 yielded up to 17 additional transplants/year, depending on the travel-viability function; without the constraint on the number of regions, the gain was up to 18 transplants/year. CONCLUSION: Our analysis indicates that liver transplantation may benefit from region reorganization. The analytic method developed here should be applicable to other organs and sets of organs.


Subject(s)
Kidney Failure, Chronic/epidemiology , Liver Transplantation , Regional Health Planning/methods , Humans , Kidney Failure, Chronic/mortality , Kidney Failure, Chronic/surgery , Tissue and Organ Procurement , United States/epidemiology
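The weighted two-objective structure in entry 20 (total intraregional transplants versus the lowest OPO-level rate) can be illustrated at toy scale by brute-force enumeration rather than an integer-programming solver. All OPO names and pairwise gains below are hypothetical placeholders, not the published model's data:

```python
from itertools import product

OPOS = ["A", "B", "C", "D"]
# Hypothetical transplants gained per year when two OPOs share a region.
PAIR_GAIN = {("A", "B"): 5, ("A", "C"): 1, ("A", "D"): 1,
             ("B", "C"): 2, ("B", "D"): 1, ("C", "D"): 4}

def score(assign, weight=0.5):
    # Weighted combination of (1) total intraregional transplants and
    # (2) the lowest per-OPO intraregional count (the parity objective).
    total = 0
    per_opo = {o: 0 for o in OPOS}
    for (i, j), g in PAIR_GAIN.items():
        if assign[i] == assign[j]:      # pair is intraregional
            total += g
            per_opo[i] += g
            per_opo[j] += g
    return weight * total + (1 - weight) * min(per_opo.values())

# Enumerate every assignment of the 4 OPOs to one of 2 regions.
best = max(
    (dict(zip(OPOS, regions)) for regions in product([0, 1], repeat=len(OPOS))),
    key=score,
)
```

At realistic scale the assignment space is far too large to enumerate, which is why the authors formulate it as an integer program; the sketch only shows how the two objectives trade off through the weight.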