ABSTRACT
OBJECTIVES: A majority of patients who experience acute coronary syndrome (ACS) initially receive care in the emergency department (ED). Guidelines for care of patients experiencing ACS, specifically ST-segment elevation myocardial infarction (STEMI), are well defined. We examine hospital resource utilization among patients with non-ST-segment elevation myocardial infarction (NSTEMI) as compared with STEMI and unstable angina (UA). We then make the case that, because NSTEMI patients constitute the majority of ACS cases, there is a great opportunity to risk-stratify these patients in the emergency department. MATERIALS AND METHODS: We examined hospital resource utilization measures among those with STEMI, NSTEMI, and UA. These included hospital length of stay (LOS), any intensive care unit (ICU) care time, and in-hospital mortality. RESULTS AND CONCLUSIONS: The sample included 284,945 adult ED patients, of whom 1195 experienced ACS. Among the latter, 978 (70%) were diagnosed with NSTEMI, 225 (16%) with STEMI, and 194 (14%) with UA. We observed ICU care in 79.1% of STEMI patients, 14.4% of NSTEMI patients, and 9.3% of UA patients. NSTEMI patients' mean hospital LOS was 3.7 days, shorter than that of non-ACS patients (4.75 days) but longer than that of UA patients (2.99 days). In-hospital mortality was 1.6% for NSTEMI, compared with 4.4% for STEMI and 0% for UA. There are recommendations for risk stratification among NSTEMI patients to evaluate risk for major adverse cardiac events (MACE) that can be used in the ED to guide admission decisions and use of ICU care, thus optimizing care for a majority of ACS patients.
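As a hedged illustration of the utilization comparison described in this abstract, the sketch below computes group-level mean LOS, ICU-use rate, and mortality with pandas; the column names and rows are invented for the example, not study data.

```python
# Hedged sketch: group-level utilization summary (names and data are assumptions).
import pandas as pd

df = pd.DataFrame({
    "group": ["STEMI", "NSTEMI", "NSTEMI", "UA", "non-ACS", "non-ACS"],
    "los_days": [5.1, 3.2, 4.4, 2.8, 4.9, 4.6],
    "icu": [1, 0, 1, 0, 0, 0],    # 1 = any ICU care during the stay
    "died": [0, 0, 0, 0, 0, 0],   # 1 = in-hospital death
})
summary = df.groupby("group").agg(
    mean_los=("los_days", "mean"),
    icu_rate=("icu", "mean"),
    mortality=("died", "mean"),
)
print(summary)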
Subject(s)
Acute Coronary Syndrome , Non-ST Elevated Myocardial Infarction , ST Elevation Myocardial Infarction , Adult , Humans , Acute Coronary Syndrome/diagnosis , Acute Coronary Syndrome/therapy , Non-ST Elevated Myocardial Infarction/diagnosis , Non-ST Elevated Myocardial Infarction/therapy , ST Elevation Myocardial Infarction/therapy , ST Elevation Myocardial Infarction/diagnosis , Risk Assessment , Emergency Service, Hospital , Hospitals
ABSTRACT
BACKGROUND: Chest pain (CP) is the hallmark symptom of acute coronary syndrome (ACS) but is not reported by 20-30% of patients presenting to the emergency department (ED) with an ST-segment elevation myocardial infarction (STEMI), especially women, elderly patients, and non-white patients. METHODS: We used a retrospective 5-year adult ED sample of 279,132 patients to explore using CP alone to predict ACS; we then incrementally added other ACS chief complaints, age, and sex in a series of multivariable logistic regression models. We evaluated each model's identification of ACS and STEMI. RESULTS: Using CP alone would recommend electrocardiograms (ECGs) for 8% of patients (sensitivity, 61%; specificity, 92%) but missed 28.4% of STEMIs. The model with all variables identified ECGs for 22% of patients (sensitivity, 82%; specificity, 78%) but missed 14.7% of STEMIs. The model with CP and other ACS chief complaints had the highest sensitivity (93%) but the lowest specificity (55%), identified 45.1% of patients for ECG, and missed only 4.4% of STEMIs. CONCLUSION: CP alone had the highest specificity but lacked sensitivity. Adding other ACS chief complaints increased sensitivity but identified 2.2-fold more patients for ECGs. Achieving an ECG within 10 min for patients with ACS to identify all STEMIs will be challenging without introducing more complex risk calculation into clinical care.
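The incremental modeling described above translates naturally into code. Below is a minimal sketch using scikit-learn, assuming a dataframe df with hypothetical columns chest_pain, other_acs_cc, age, and sex and a binary acs outcome; all names are invented for illustration, and the probability threshold would in practice be tuned to the desired ECG-ordering rate.

```python
# Minimal sketch of the incremental screening models (names are assumptions).
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import confusion_matrix

def evaluate(df, predictors, outcome="acs", threshold=0.5):
    """Fit a logistic model; report ECG-ordering rate, sensitivity, specificity."""
    X, y = df[predictors], df[outcome]
    model = LogisticRegression(max_iter=1000).fit(X, y)
    pred = (model.predict_proba(X)[:, 1] >= threshold).astype(int)
    tn, fp, fn, tp = confusion_matrix(y, pred).ravel()
    return {"flagged_for_ecg": pred.mean(),
            "sensitivity": tp / (tp + fn),
            "specificity": tn / (tn + fp)}

# The series of models from the abstract, from simplest to fullest:
# evaluate(df, ["chest_pain"])
# evaluate(df, ["chest_pain", "other_acs_cc"])
# evaluate(df, ["chest_pain", "other_acs_cc", "age", "sex"])
```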
Subject(s)
Acute Coronary Syndrome , ST Elevation Myocardial Infarction , Adult , Humans , Female , Aged , ST Elevation Myocardial Infarction/diagnosis , Retrospective Studies , Electrocardiography , Chest Pain/diagnosis , Chest Pain/etiology , Acute Coronary Syndrome/complications , Acute Coronary Syndrome/diagnosis , Emergency Service, Hospital
ABSTRACT
Chronic myeloid leukemia (CML) is effectively treated with long-term tyrosine kinase inhibitor (TKI) therapy, yet little is known about the risks of prolonged TKI exposure in young patients, and monitoring for long-term effects is not standardized. We surveyed North American pediatric oncologists (n = 119) to evaluate perceived risk of and surveillance practices for potential toxicities associated with prolonged TKI exposure in children and adolescents/young adults (AYAs) with CML. Survey domains included general and specific risk perceptions and surveillance practices for asymptomatic patients on chronic TKI therapy. We analyzed data descriptively and explored relationships between risk perceptions and surveillance. Risk perceptions varied among oncologists but were similar across six categories (thyroid, cardiac, vascular, metabolic, fertility, psychologic), with fewer than one-third rating each risk as moderate or high in pediatric and AYA patients. More oncologists perceived moderate or high risk of growth abnormalities in children (62% pediatric, 14% AYA) and financial toxicity in all patients (60% pediatric, 64% AYA). A greater proportion of oncologists with moderate or high perceived risk of thyroid abnormalities reported testing thyroid function compared with those with lower perceived risk; patterns for metabolic risk/lipid tests and cardiac risk/tests were similar. In summary, we found that pediatric oncologists had variable risk perceptions and surveillance practices for potential toxicities associated with prolonged TKI exposure. Standardizing surveillance would help quantify risks and refine recommendations. Supplemental data for this article are available online at https://doi.org/10.1080/08880018.2021.2017085.
Subject(s)
Graft vs Host Disease , Leukemia, Myelogenous, Chronic, BCR-ABL Positive , Physicians , Adolescent , Child , Humans , Leukemia, Myelogenous, Chronic, BCR-ABL Positive/drug therapy , Protein Kinase Inhibitors/adverse effects , Young Adult
ABSTRACT
INTRODUCTION: Risk stratification has been proposed as a strategy to improve participation in colorectal cancer (CRC) screening, but evidence is lacking. We performed a randomized controlled trial of risk stratification using the National Cancer Institute's Colorectal Cancer Risk Assessment Tool (CCRAT) on screening intent and completion. METHODS: A total of 230 primary care patients eligible for first-time CRC screening were randomized to risk assessment via CCRAT or education control. Follow-up of screening intent and completion was performed by record review and phone at 6 and 12 months. We analyzed change in intent after intervention, time to screening, overall screening completion rates, and screening completion by CCRAT risk score tertile. RESULTS: Of the patients, 61.7% were aged <60 years, 58.7% were female, and 94.3% had college or higher education. Time to screening did not differ between arms (hazard ratio 0.78 [95% confidence interval (CI) 0.52-1.18], P = 0.24). At 12 months, screening completion was 38.6% with CCRAT vs 44.0% with education (odds ratio [OR] 0.80 [95% CI 0.47-1.37], P = 0.41). Changes in screening intent did not differ between the risk assessment and education arms (precontemplation to contemplation: OR 1.52 [95% CI 0.81-2.86], P = 0.19; contemplation to precontemplation: OR 1.93 [95% CI 0.45-8.34], P = 0.38). There were higher screening completion rates at 12 months in the top CCRAT risk tertile (52.6%) vs the bottom (32.4%) and middle (31.6%) tertiles (P = 0.10). DISCUSSION: CCRAT risk assessment did not increase screening participation or intent. Risk stratification might motivate persons classified as higher CRC risk to complete screening, but unintentionally discourage screening among persons not identified as higher risk.
Subject(s)
Colorectal Neoplasms/diagnosis , Early Detection of Cancer/statistics & numerical data , Health Belief Model , Patient Participation/statistics & numerical data , Aged , Female , Humans , Intention , Male , Middle Aged , Patient Education as Topic , Risk Assessment , Risk Factors , Time Factors , United States
ABSTRACT
BACKGROUND: Improving adherence to direct oral anticoagulants (DOACs) is challenging, and simple text-messaging reminders have not been effective. METHODS: SmartADHERE was a randomized trial that tested a personalized digital and human direct oral anticoagulant adherence intervention compared with usual care. Eligibility required age ≥ 18 years, newly prescribed (≤90 days) rivaroxaban for atrial fibrillation (AF), 1 of 4 at-risk criteria for nonadherence, and a smartphone. The intervention consisted of a combination of a medication management smartphone app, daily app-based reminders, adaptive text messaging, and phone-based counseling for severe nonadherence. The primary outcome was the proportion of days covered by rivaroxaban (PDC) at 6 months. There were 25 U.S. sites, all cardiology and electrophysiology outpatient practices, activated for a target sample size of 378, but the study was terminated by the sponsor prior to reaching target enrollment. RESULTS: There were 139 participants (age 65±9.6 years, 30% female, median CHA2DS2-VASc score 3 with IQR 2 to 4, mean total medication burden 7.7±4.4). DOAC adherence was high in both arms, with no difference in the primary outcome (PDC 0.86±0.25 intervention vs 0.88±0.25 control, p=0.62) or in secondary outcomes including PDC ≥ 0.80 and medication persistence. Per-protocol analyses had similar results. Because of the high overall PDC, the statistical power to answer the primary hypothesis would have been only 51% even if target enrollment had been achieved. There were no study-related adverse events. CONCLUSIONS: The use of a centralized digital and human adherence intervention was feasible across multiple sites. Overall adherence was much higher than expected despite prescreening for at-risk individuals. SmartADHERE illustrates the challenges of trials of behavioral and technology interventions, where enrollment itself may lead to selection bias or treatment effects. Pragmatic study designs, such as cluster randomization or stepped-wedge implementation, should be considered to improve enrollment and generalizability.
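Because the primary outcome is the proportion of days covered (PDC), a worked sketch of one common way to compute it may help: count the unique days covered by overlapping fills inside the observation window. The function and field names here are ours, not the trial's.

```python
# Hedged sketch: PDC from pharmacy fill records (field names are assumptions).
from datetime import date, timedelta

def pdc(fills, start, end):
    """fills: list of (fill_date, days_supply) tuples; start/end: window bounds."""
    covered = set()
    for fill_date, days_supply in fills:
        for i in range(days_supply):
            day = fill_date + timedelta(days=i)
            if start <= day <= end:
                covered.add(day)  # overlapping fills count each day once
    return len(covered) / ((end - start).days + 1)

fills = [(date(2024, 1, 1), 30), (date(2024, 2, 5), 30)]
print(round(pdc(fills, date(2024, 1, 1), date(2024, 3, 31)), 2))  # 0.66
```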
Subject(s)
Atrial Fibrillation/drug therapy , Electronics , Rivaroxaban/administration & dosage , Smartphone , Stroke/prevention & control , Administration, Oral , Aged , Atrial Fibrillation/complications , Dose-Response Relationship, Drug , Drug Administration Schedule , Factor Xa Inhibitors/administration & dosage , Female , Follow-Up Studies , Humans , Male , Medication Adherence , Middle Aged , Retrospective Studies , Stroke/etiology
ABSTRACT
BACKGROUND: Clinical trials, conducted efficiently and with the utmost integrity, are a key component in identifying effective vaccines, therapies, and other interventions urgently needed to solve the COVID-19 crisis. Yet launching and implementing trials with the rigor necessary to produce convincing results is a complicated and time-consuming process. Balancing rigor and efficiency involves relying on designs that employ flexible features to respond to a fast-changing landscape, measuring valid endpoints that result in translational actions, and disseminating findings in a timely manner. We describe the challenges involved in creating infrastructure with potential utility for shared learning. METHODS: We have established a shared infrastructure that borrows strength across multiple trials. The infrastructure includes an endpoint registry to aid in selecting appropriate endpoints, a registry to facilitate establishing a Data & Safety Monitoring Board, common data collection instruments, a COVID-19 dedicated design and analysis team, and a pragmatic platform protocol, among other elements. RESULTS: The authors have relied on the shared infrastructure for six clinical trials for which they serve as the Data Coordinating Center and have a design and analysis team comprising 15 members who are dedicated to COVID-19. The authors established a pragmatic platform to simultaneously investigate multiple treatments for outpatients, with adaptive features to add or drop treatment arms. CONCLUSION: The shared infrastructure provides appealing opportunities to evaluate disease in a more robust manner with fewer resources and is especially valued during a pandemic, where efficiency in time and resources is crucial. The most important element of the shared infrastructure is the pragmatic platform. While it may be the most challenging of the elements to establish, it may provide the greatest benefit to both patients and researchers.
Subject(s)
COVID-19/therapy , Clinical Trials as Topic/methods , Pandemics , Clinical Trial Protocols as Topic , Clinical Trials Data Monitoring Committees , Endpoint Determination , Humans , SARS-CoV-2
ABSTRACT
Disease relapse is the leading cause of death for patients with acute leukemia (AL) and myelodysplastic syndrome (MDS) who undergo allogeneic hematopoietic cell transplantation (HCT). Relapse post-HCT is associated with poor prognosis; however, inpatient healthcare utilization of this population is unknown. Here we describe survival, intensity of healthcare utilization, and characteristics associated with high resource use at the end of life (EOL). Adult patients with AL/MDS who underwent HCT at a large regional referral center with subsequent relapse between 2005 and 2015 were included in this retrospective study. We compared the distribution of demographic and clinical characteristics of patients as well as healthcare utilization over 2 years postrelapse and at EOL by postrelapse disease-directed therapeutic interventions. We created a composite score for EOL healthcare utilization intensity by summing the presence of each of the following criteria: death in the hospital, use of chemotherapy, emergency department, hospitalization, intensive care unit, intubation, cardiopulmonary resuscitation, or hemodialysis in the last month of life. Higher scores indicate more intense healthcare use at EOL. Multivariable linear regression analysis was used to determine variables (demographic characteristics, postrelapse treatment group, advance directives documentation, palliative care referral, time to relapse) associated with EOL healthcare utilization intensity. One hundred fifty-four patients were included; median age at relapse was 56 years (interquartile range [IQR], 39 to 63), 55% were men, 79% had AL, and median time from HCT to relapse was 6 months (IQR, 3 to 10). After relapse, 28% received supportive care only, 50% received chemotherapy only, and 22% received chemotherapy plus cell therapy (either donor lymphocyte infusion, second HCT, or donor lymphocyte infusion plus second HCT). With the exception of time until relapse and Karnofsky Performance Status, baseline characteristics (gender, age, race, graft-versus-host disease, year of treatment) did not significantly differ by postrelapse treatment group. One hundred thirty-six patients (88%) died within 2 years of relapse; survival differed significantly by postrelapse treatment group, with those receiving disease-directed treatment showing lower risk of death. Healthcare use in AL/MDS patients after post-HCT relapse was high overall, with 44% visiting the emergency department at least once (22% at least 2 times), 93% hospitalized (55% at least 2 times, 16% at least 5 times), and 38% using the intensive care unit (median length of stay, 5 days; IQR, 3 to 10). Use was high even among those receiving only supportive care. For those patients who died, the mean intensity score for EOL healthcare use was 1.8 (standard deviation, 1.8). Most patients (70%) had a marker of high-intensity healthcare utilization at the EOL or died in hospital. In multivariable analysis, increasing age (estimate, -.03; 95% CI, -.06 to -.003) and having AL versus MDS were significantly associated with a decreased EOL healthcare intensity score; no other variables were associated with intensity of EOL healthcare use. Healthcare utilization after post-HCT relapse is associated with receipt of disease-directed therapy but remains high across all groups despite known poor prognosis. Interventions are needed to minimize nonbeneficial treatments and promote goal-concordant EOL care in this seriously ill patient population.
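The composite end-of-life intensity score described above maps directly to code: one point per criterion present in the last month of life. A minimal sketch follows, assuming each criterion is stored as a boolean flag per patient (key names are invented).

```python
# Hedged sketch of the EOL intensity composite (key names are assumptions).
EOL_CRITERIA = [
    "died_in_hospital", "chemotherapy", "ed_visit", "hospitalization",
    "icu", "intubation", "cpr", "hemodialysis",
]

def eol_intensity_score(patient: dict) -> int:
    """One point per criterion met in the last month of life; higher = more intense."""
    return sum(bool(patient.get(criterion, False)) for criterion in EOL_CRITERIA)

print(eol_intensity_score({"hospitalization": True, "icu": True}))  # 2
```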
Subject(s)
Delivery of Health Care , Hematopoietic Stem Cell Transplantation , Leukemia , Myelodysplastic Syndromes , Acute Disease , Adult , Allografts , Disease-Free Survival , Female , Humans , Leukemia/mortality , Leukemia/therapy , Male , Middle Aged , Myelodysplastic Syndromes/mortality , Myelodysplastic Syndromes/therapy , Retrospective Studies , Survival Rate
ABSTRACT
Purpose To develop and validate a predictive model for postembolization syndrome (PES) following transarterial hepatic chemoembolization (TACE) for hepatocellular carcinoma. Materials and Methods In this single-center, retrospective study, 370 patients underwent 513 TACE procedures between October 2014 and September 2016. Seventy percent of the patients were randomly assigned to a training data set and the remaining 30% were assigned to a testing data set. Variables included demographic, laboratory, clinical, and procedural details. PES was defined as pain and/or nausea beyond 6 hours after TACE that required intravenous medication for symptom control. The predictive model was developed by using conditional inference trees and Lasso regression. Results Demographics, laboratory data, performance, tumor characteristics, and procedural details were statistically similar for the training and testing data sets. Overall, 83 of 370 patients (22.4%) after 107 of 513 TACE procedures (20.8%) met the predefined criteria for PES. Factors identified at univariable analysis included large tumor burden (P = .004), drug-eluting embolic TACE (P = .03), doxorubicin dose (P = .003), and history of PES (P < .001) and chronic pain (P < .001), of which history of PES, tumor burden, and drug-eluting embolic TACE were identified as the strongest predictors by the multivariable analysis and were used to develop the predictive model. When applied to the testing data set, the model demonstrated an area under the curve of 0.62, sensitivity of 79% (22 of 28), specificity of 44.2% (53 of 120), and a negative predictive value of 90% (53 of 59). Conclusion The model identified history of postembolization syndrome, tumor burden, and drug-eluting embolic chemoembolization as predictors of protracted recovery due to postembolization syndrome. © RSNA, 2018.
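As a rough illustration of the model-development step, the sketch below fits an L1-penalized (Lasso) logistic regression with a 70/30 train/test split and reports an AUC; it stands in for the paper's conditional-inference-tree-plus-Lasso pipeline, and all variable names and data are synthetic.

```python
# Hedged sketch: Lasso-penalized logistic model on synthetic PES-style data.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 500  # synthetic stand-in for the 513 procedures
df = pd.DataFrame({
    "prior_pes": rng.integers(0, 2, n),
    "tumor_burden": rng.integers(0, 2, n),
    "drug_eluting_tace": rng.integers(0, 2, n),
})
logit = -2 + 1.2 * df["prior_pes"] + 0.8 * df["tumor_burden"] + 0.5 * df["drug_eluting_tace"]
df["pes"] = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    df[["prior_pes", "tumor_burden", "drug_eluting_tace"]], df["pes"],
    test_size=0.30, random_state=0)               # 70/30 split, as in the study
lasso = LogisticRegression(penalty="l1", solver="liblinear")  # L1 = Lasso
lasso.fit(X_train, y_train)
print(roc_auc_score(y_test, lasso.predict_proba(X_test)[:, 1]))
```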
Subject(s)
Carcinoma, Hepatocellular/therapy , Chemoembolization, Therapeutic/adverse effects , Liver Neoplasms/therapy , Adult , Aged , Aged, 80 and over , Chemoembolization, Therapeutic/methods , Female , Humans , Male , Middle Aged , Models, Statistical , Retrospective Studies , Syndrome
ABSTRACT
Allogeneic hematopoietic cell transplantation (HCT) is associated with significant morbidity and mortality, making advance care planning (ACP) and management especially important in this patient population. A paucity of data exists on the utilization of ACP among allogeneic HCT recipients and the relationship between ACP and intensity of healthcare utilization in these patients. We performed a retrospective review of patients receiving allogeneic HCT at our institution from 2008 to 2015 who had subsequently died after HCT. Documentation and timing of advance directive (AD) completion were abstracted from the electronic medical record. Outcomes of interest included use of intensive care unit (ICU) level of care at any time point after HCT, within 30 days of death, and within 14 days of death; use of mechanical ventilation at any time after HCT; and location of death. Univariate logistic regression was performed to explore associations between AD completion and each outcome. Of the 1031 patients who received allogeneic HCT during the study period, 422 decedents (41%) were included in the analysis. Forty-four percent had AD documentation prior to death. Most patients (69%) indicated that if terminally ill, they did not wish to be subjected to life-prolonging treatment attempts. Race/ethnicity was significantly associated with AD documentation, with non-Hispanic white patients documenting ADs more frequently (51%) compared with Hispanic (22%) or Asian patients (35%; P = .0007). Patients with ADs were less likely to use the ICU during the transplant course (41% for patients with ADs versus 52% of patients without ADs; P = .03) and also were less likely to receive mechanical ventilation at any point after transplantation (21% versus 37%, P < .001). AD documentation was also associated with decreased ICU use at the end of life; relative to patients without ADs, patients with ADs were more likely to die at home or in hospital as opposed to in the ICU (odds ratio, .44; 95% confidence interval, .27 to .72). ACP remains underused in allogeneic HCT. Adoption of a systematic practice to standardize AD documentation as part of allogeneic HCT planning has the potential to significantly reduce ICU use and mechanical ventilation while improving quality of care at end of life in HCT recipients.
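For the univariable logistic regressions above, a minimal statsmodels sketch follows. The counts are synthetic stand-ins shaped to mirror the 41% vs 52% ICU-use contrast, not the study data.

```python
# Hedged sketch: univariable logistic regression for AD documentation vs ICU use.
import numpy as np
import statsmodels.api as sm

# Exposure: 1 = had advance directive; outcome: 1 = used ICU (synthetic counts)
ad  = np.array([1] * 100 + [0] * 100)
icu = np.array([1] * 41 + [0] * 59 + [1] * 52 + [0] * 48)  # 41% vs 52%

model = sm.Logit(icu, sm.add_constant(ad)).fit(disp=0)
or_ = np.exp(model.params[1])          # exponentiated coefficient = odds ratio
ci = np.exp(model.conf_int()[1])       # 95% CI on the odds-ratio scale
print(f"OR = {or_:.2f}, 95% CI {ci[0]:.2f}-{ci[1]:.2f}")
```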
Subject(s)
Advance Directives , Hematopoietic Stem Cell Transplantation/methods , Terminal Care/standards , Adult , Advance Directives/ethnology , Aged , Female , Humans , Intensive Care Units/supply & distribution , Male , Middle Aged , Respiration, Artificial , Retrospective Studies , Transplantation, Homologous
ABSTRACT
BACKGROUND & AIMS: Families with a history of Lynch syndrome often do not adhere to guidelines for genetic testing and screening. We investigated practice patterns related to Lynch syndrome worldwide, to ascertain potential targets for research and public policy efforts. METHODS: We collected data from the International Mismatch Repair Consortium (IMRC), which comprises major research and clinical groups engaged in the care of families with Lynch syndrome worldwide. IMRC institutions were invited to complete a questionnaire to characterize diagnoses of Lynch syndrome and management practice patterns. RESULTS: Fifty-five providers, representing 63 of 128 member institutions (49%) in 21 countries, completed the questionnaire. For case finding, 55% of respondents reported participating in routine widespread population tumor testing among persons with newly diagnosed Lynch syndrome-associated cancers, whereas 27% reported relying on clinical criteria with selective tumor and/or germline analyses. Most respondents (64%) reported using multigene panels for germline analysis, and only 28% reported testing tumors for biallelic mutations for cases in which suspected pathogenic mutations were not confirmed by germline analysis. Respondents reported relying on passive dissemination of information to at-risk family members, and there was variation in follow-through of genetic testing recommendations. Reported risk management practices varied; nearly all programs (98%) recommended colonoscopy every 1 to 2 years, but only 35% recommended chemoprevention with aspirin. CONCLUSIONS: There is widespread heterogeneity in management practices for Lynch syndrome worldwide among IMRC member institutions. This may reflect the rapid pace of emerging technology, regional differences in resources, and the lack of definitive data for many clinical questions. Future efforts should focus on the large numbers of high-risk patients without access to state-of-the-art Lynch syndrome management.
Subject(s)
Colorectal Neoplasms, Hereditary Nonpolyposis/diagnosis , Colorectal Neoplasms, Hereditary Nonpolyposis/therapy , Disease Management , Practice Patterns, Physicians'/statistics & numerical data , Aged , Aged, 80 and over , Cross-Sectional Studies , Female , Humans , Male , Middle Aged
ABSTRACT
The Hematopoietic Cell Transplantation (HCT)-Specific Comorbidity Index (HCT-CI) has been extensively studied in myeloablative and reduced-intensity conditioning regimens, with less data available regarding the validity of the HCT-CI in nonmyeloablative (NMA) allogeneic transplantation. We conducted a retrospective analysis to evaluate the association between HCT-CI and nonrelapse mortality (NRM) and all-cause mortality (ACM) in patients receiving the total lymphoid irradiation and antithymocyte globulin (TLI/ATG) NMA transplantation preparative regimen. We abstracted demographic and clinical data for consecutive patients who received allogeneic HCT with the TLI/ATG regimen between January 2008 and September 2014 from the Stanford blood and marrow transplantation database. We fit univariable and multivariable Cox proportional hazards regression models to evaluate the association between HCT-CI and NRM and ACM. In all, 287 patients were included for analysis. The median age of the patients was 61 (range, 22 to 77) years. The median overall survival was 844 (range, 374 to 1484) days. Most patients had a Karnofsky performance score of 90 or above (85%). Fifty-two (18%) patients relapsed within 3 months and 108 (38%) patients relapsed within 1 year, with a median time to relapse of 163 (range, 83 to 366) days. Among the comorbidities in the HCT-CI identified at the time of HCT, reduced pulmonary function was the most common (n = 89), followed by prior history of malignancy (n = 39), psychiatric condition (n = 38), and diabetes (n = 31). Patients with higher HCT-CI scores had higher mortality risks for ACM (hazard ratio [HR], 1.95; 95% confidence interval [CI], 1.22 to 3.14 for HCT-CI score 1 or 2, and HR, 1.85; 95% CI, 1.11 to 3.08 for HCT-CI score ≥ 3, each compared with score 0). Among individual HCT-CI variables, diabetes (HR, 2.31; 95% CI, 1.79 to 2.89; P = .003) and prior solid tumors (HR, 1.75; 95% CI, 1.02 to 3.00; P = .043) were associated with a higher risk of ACM. Higher HCT-CI scores were significantly associated with higher risk of death. The HCT-CI is a valid tool for predicting ACM in NMA TLI/ATG allogeneic HCT.
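A hedged sketch of this type of Cox proportional hazards model follows, using the lifelines package on synthetic data; the indicator coding (HCT-CI 1-2 and ≥3, each versus 0) mirrors the comparison above, but every column name and value is invented.

```python
# Hedged sketch: Cox PH model with HCT-CI score indicators (synthetic data).
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 287
score = rng.choice(["0", "1-2", ">=3"], size=n)   # HCT-CI category per patient
df = pd.DataFrame({
    "hctci_1_2": (score == "1-2").astype(int),    # indicator: score 1-2 vs 0
    "hctci_ge3": (score == ">=3").astype(int),    # indicator: score >=3 vs 0
    "time_days": rng.exponential(800, n),         # follow-up time
    "died": rng.integers(0, 2, n),                # event: all-cause mortality
})
cph = CoxPHFitter().fit(df, duration_col="time_days", event_col="died")
cph.print_summary()  # hazard ratios = exp(coef) for each indicator
```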
Subject(s)
Comorbidity , Hematopoietic Stem Cell Transplantation/methods , Transplantation Conditioning/methods , Adult , Aged , Hematopoietic Stem Cell Transplantation/mortality , Humans , Middle Aged , Prognosis , Regression Analysis , Retrospective Studies , Risk Assessment , Survival Rate , Transplantation, Homologous , Young Adult
ABSTRACT
BACKGROUND & AIMS: The incidence of colorectal cancer (CRC) is increasing in the United States among adults younger than the age of 50 years. Studies of young-onset CRC have focused on outcomes and treatment patterns. We examined patient presentation, provider evaluation, and time to diagnosis, which can affect stage and prognosis. METHODS: In a retrospective study, we collected data from patients with a diagnosis of colorectal adenocarcinoma, confirmed by pathologists, seen at the Stanford Cancer Institute from January 1, 2008, through December 31, 2014. We compared symptoms, clinical features, time to diagnosis, and cancer stage between patients with young-onset CRC (diagnosed at an age younger than 50 years; n = 253) and patients diagnosed with CRC at an age of 50 years or older (n = 232). RESULTS: A higher proportion of patients with young-onset CRC were diagnosed with advanced-stage tumors (72%) compared with older patients (63%) (P = .03). Larger proportions of patients with young-onset CRC also had a family history of CRC (25% vs 17% in older patients; P = .03), confirmed or probable hereditary cancer syndromes (7% vs 1% in older patients; P < .01), and left-sided disease (distal colon cancer in 41% vs 34% in older patients; P = .01; and rectal cancer in 40% vs 35% in older patients; P = .29). Patients with young-onset CRC had a significantly longer median time to diagnosis (128 vs 79 days for older patients; P < .05), symptom duration (60 vs 30 days for older patients; P < .01), and time of evaluation (31 vs 22 days; P < .05). In multivariable analyses, time to diagnosis was 1.4-fold longer for younger than for older patients (P < .01). Among younger patients, those with stage III or IV CRC had shorter durations of symptoms and evaluations than those with stage I or II CRC. CONCLUSIONS: In a retrospective analysis of patients with CRC, we found that greater proportions of patients younger than 50 years were diagnosed with advanced-stage tumors than older patients; this difference could not be explained simply by delays from symptom onset to diagnosis. Although tumor biology may be an important determinant of stage at diagnosis, clinicians should be aware of CRC alarm symptoms, family history, and genetic syndromes, to speed evaluation and diagnosis of younger patients and potentially improve outcomes. It remains to be determined whether subgroups of persons at risk for young-onset CRC who benefit from early screening can be identified.
Subject(s)
Adenocarcinoma/epidemiology , Adenocarcinoma/pathology , Colorectal Neoplasms/epidemiology , Colorectal Neoplasms/pathology , Adenocarcinoma/diagnosis , Adult , Aged , Colorectal Neoplasms/diagnosis , Female , Humans , Incidence , Male , Middle Aged , Neoplasm Staging , Retrospective Studies , Time Factors , United States/epidemiology
ABSTRACT
OBJECTIVE: To examine the safety and efficacy of cangrelor in patients with single-vessel disease (SVD) and multi-vessel disease (MVD). BACKGROUND: Cangrelor, an intravenous, rapidly acting P2Y12 inhibitor, is superior to clopidogrel in reducing ischemic events among patients receiving percutaneous coronary intervention (PCI). METHODS: We studied a modified intention-to-treat population of patients with SVD and MVD from the CHAMPION PHOENIX trial. The primary efficacy outcome was the composite of death, myocardial infarction (MI), ischemia-driven revascularization (IDR), and stent thrombosis (ST) at 48 hours. The key safety outcome was non-coronary artery bypass grafting GUSTO severe bleeding at 48 hours. RESULTS: Among 10,921 patients, 5,220 (48%) had SVD and 5,701 (52%) had MVD. MVD patients were older and more often had diabetes, hyperlipidemia, hypertension, prior stroke, and prior MI. After adjustment, MVD patients had similar rates of 48-hour death/MI/IDR/ST (6.3% vs 4.2%; adjusted odds ratio [OR] 1.6, 95% CI 0.42-6.06) and GUSTO severe bleeding (0.1% vs 0.2%, P=.67) compared with SVD patients. Consistent with overall trial findings, cangrelor use reduced ischemic complications in patients with both SVD (3.9% vs 4.5%; OR 0.86, 95% CI 0.65-1.12) and MVD (5.5% vs 7.2%; OR 0.74, 95% CI 0.60-0.92; P-interaction=.43). GUSTO severe bleeding outcomes were not significantly increased with cangrelor or clopidogrel in either SVD or MVD patients. CONCLUSION: In the CHAMPION PHOENIX trial, MVD and SVD patients had similar ischemic outcomes at 48 hours and 30 days. Cangrelor consistently reduced ischemic complications in both SVD and MVD patients without a significant increase in GUSTO severe bleeding.
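The consistency claim rests on a treatment-by-subgroup interaction term (P-interaction = .43). A minimal sketch of such a test on synthetic data follows; variable names and effect sizes are invented and do not reproduce the trial.

```python
# Hedged sketch: logistic model with a treatment-by-subgroup interaction term.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 10000
df = pd.DataFrame({
    "cangrelor": rng.integers(0, 2, n),   # 1 = cangrelor, 0 = clopidogrel
    "mvd": rng.integers(0, 2, n),         # 1 = multi-vessel disease
})
p = 0.045 - 0.008 * df["cangrelor"] + 0.02 * df["mvd"]  # invented event risks
df["event"] = (rng.random(n) < p).astype(int)

m = smf.logit("event ~ cangrelor * mvd", data=df).fit(disp=0)
print(m.summary())  # the 'cangrelor:mvd' row gives the interaction P value
```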
Subject(s)
Adenosine Monophosphate/analogs & derivatives , Coronary Artery Disease/therapy , Myocardial Infarction/prevention & control , Percutaneous Coronary Intervention , Postoperative Complications/prevention & control , Adenosine Monophosphate/administration & dosage , Administration, Oral , Aged , Cause of Death/trends , Clopidogrel , Coronary Angiography , Coronary Artery Disease/diagnosis , Dose-Response Relationship, Drug , Double-Blind Method , Female , Follow-Up Studies , Global Health , Humans , Incidence , Infusions, Intravenous , Male , Middle Aged , Myocardial Infarction/epidemiology , Postoperative Complications/epidemiology , Purinergic P2Y Receptor Antagonists/administration & dosage , Survival Rate/trends , Ticlopidine/administration & dosage , Ticlopidine/analogs & derivatives , Time Factors
ABSTRACT
BACKGROUND: Tailoring screening to colorectal cancer (CRC) risk could improve screening effectiveness. Most CRCs arise from advanced neoplasia (AN) that dwells for years. To date, no available colorectal neoplasia risk score has been validated externally in a diverse population. The authors explored whether the National Cancer Institute (NCI) CRC risk-assessment tool, which was developed to predict future CRC risk, could predict current AN prevalence in a diverse population, thereby allowing its use in risk stratification for screening. METHODS: This was a prospective examination of the relation between predicted 10-year CRC risk and the prevalence of AN, defined as advanced or multiple (≥3 adenomatous, ≥5 serrated) adenomatous or sessile serrated polyps, in individuals undergoing screening colonoscopy. RESULTS: Among 509 screenees (50% women; median age, 58 years; 61% white, 5% black, 10% Hispanic, and 24% Asian), 58 (11%) had AN. The prevalence of AN increased progressively from 6% in the lowest risk-score quintile to 17% in the highest risk-score quintile (P = .002). Risk-score distributions in individuals with versus without AN differed significantly (median, 1.38 [0.90-1.87] vs 1.02 [0.62-1.57], respectively; P = .003), with substantial overlap. The discriminatory accuracy of the tool was modest, with areas under the curve of 0.61 (95% confidence interval [CI], 0.54-0.69) overall, 0.59 (95% CI, 0.49-0.70) for women, and 0.63 (95% CI, 0.53-0.73) for men. The results did not change substantively when the analysis was restricted to adenomatous lesions or to screening procedures without any additional incidental indication. CONCLUSIONS: The NCI CRC risk-assessment tool displays modest discriminatory accuracy in predicting AN at screening colonoscopy in a diverse population. This tool may aid shared decision-making in clinical practice. Cancer 2016;122:2663-2670. © 2016 American Cancer Society.
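To make the evaluation concrete, the sketch below computes an area under the ROC curve and AN prevalence by risk-score quintile; the data are simulated and only loosely shaped to resemble the numbers above.

```python
# Hedged sketch: AUC and prevalence-by-quintile for a continuous risk score.
import numpy as np
import pandas as pd
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(3)
n = 509
risk = rng.lognormal(mean=0.1, sigma=0.5, size=n)        # predicted 10-yr risk
an = rng.random(n) < np.clip(0.04 + 0.05 * risk, 0, 1)   # weakly related outcome

print("AUC:", round(roc_auc_score(an, risk), 2))
quintile = pd.qcut(risk, 5, labels=False)                # 0 = lowest risk fifth
print(pd.Series(an).groupby(quintile).mean())            # AN prevalence by quintile
```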
Subject(s)
Adenoma/diagnosis , Colonic Polyps/diagnosis , Colonoscopy/methods , Colorectal Neoplasms/diagnosis , Decision Support Techniques , Early Detection of Cancer/methods , Risk Assessment/methods , Adenoma/epidemiology , Age Factors , Aged , California , Colonic Polyps/epidemiology , Colorectal Neoplasms/epidemiology , Female , Follow-Up Studies , Humans , Male , Middle Aged , Neoplasm Staging , Prevalence , Prognosis , Prospective Studies , Risk Factors , Sex Factors
ABSTRACT
OBJECTIVE: To examine diabetes-related health care utilization and costs for a population-based sample of children with presumed type 1 diabetes (T1D) enrolled in the California Children's Services program. STUDY DESIGN: Our data source was the California Children's Services claims data for the period July 1, 2009, to June 30, 2012. We studied a sample of 652 children aged 0-21 years who were continuously enrolled for at least 365 days, had an outpatient visit for T1D, and were taking insulin. RESULTS: Compared with the younger age groups, individuals in the 19-21 year age group had the highest rates of hospitalization, T1D-specific bed-days, and emergency department visits. The overall median cost for this population was $7654. The overall median costs per year (and proportion of total costs) were $5603 (59%) for hospitalizations, $58 (0.4%) for emergency department visits, $144 (1.3%) for outpatient utilization, $2930 (23%) for insulin, and $1579 (13%) for blood glucose monitoring supplies. For those who used them, the median cost of pumps was an additional $2162. CONCLUSION: Further studies are needed to provide more insight into patterns of care and adverse health outcomes for children with T1D as they transition into young adulthood. The costs of insulin, glucose monitoring supplies, and pump therapy for children with T1D are substantial and may factor into future policy considerations regarding coverage and cost-sharing with families.
Subject(s)
Diabetes Mellitus, Type 1/economics , Diabetes Mellitus, Type 1/therapy , Ethnicity/statistics & numerical data , Health Care Costs/statistics & numerical data , Medical Assistance , Patient Acceptance of Health Care/statistics & numerical data , White People/statistics & numerical data , Adolescent , Age Factors , California/epidemiology , Child , Child, Preschool , Diabetes Mellitus, Type 1/ethnology , Female , Humans , Infant , Infant, Newborn , Insulin Infusion Systems/economics , Male , Patient Acceptance of Health Care/ethnology , Young Adult
ABSTRACT
PURPOSE: To compare the visual outcomes and monocular defocus curve of the new monofocal Tecnis Eyhance intraocular lens (IOL; Tecnis ICB00) with the single-piece Tecnis 1 (ZCB00). METHODS: Eighty patients diagnosed with cataract were divided into two groups: Tecnis ICB00 (n = 40) and ZCB00 (n = 40). The visual outcome was evaluated using the following parameters: uncorrected distance visual acuity (UDVA), uncorrected intermediate visual acuity (UIVA), uncorrected near visual acuity (UNVA), corrected distance visual acuity (CDVA), distance-corrected intermediate visual acuity (DCIVA), corrected near visual acuity (CNVA), uncorrected visual acuity contrast sensitivity (UVACS), best-corrected visual acuity contrast sensitivity (BCVACS), manifest refraction, and defocus curve, and was compared at 6 weeks and 3 months after surgery. RESULTS: The UIVA and UNVA were significantly (P < 0.05) better with ICB00 as compared with ZCB00 at 6 weeks and 3 months postoperatively. The DCIVA was significantly better with ICB00 as compared with ZCB00 at 3 months postoperatively (-0.015 ± 0.04 vs. 0.01 ± 0.020; P = 0.01). Regarding contrast sensitivity, UVACS and BCVACS were significantly better with ICB00 as compared with ZCB00 at 6 weeks and 3 months postoperatively (P < 0.05). The defocus curves showed that the mean visual acuity of the ICB00 group was significantly better than that of the ZCB00 group between -0.50 D and -2.50 D of defocus. CONCLUSION: In patients undergoing cataract surgery, the Eyhance ICB00 provided better intermediate vision as compared with the ZCB00.
Subject(s)
Cataract , Lenses, Intraocular , Phacoemulsification , Humans , Refraction, Ocular , Lens Implantation, Intraocular , Prosthesis Design , Cataract/complications , Patient Satisfaction
ABSTRACT
BACKGROUND: The health benefits of organic foods are unclear. PURPOSE: To review evidence comparing the health effects of organic and conventional foods. DATA SOURCES: MEDLINE (January 1966 to May 2011), EMBASE, CAB Direct, Agricola, TOXNET, Cochrane Library (January 1966 to May 2009), and bibliographies of retrieved articles. STUDY SELECTION: English-language reports of comparisons of organically and conventionally grown food or of populations consuming these foods. DATA EXTRACTION: 2 independent investigators extracted data on methods, health outcomes, and nutrient and contaminant levels. DATA SYNTHESIS: 17 studies in humans and 223 studies of nutrient and contaminant levels in foods met inclusion criteria. Only 3 of the human studies examined clinical outcomes, finding no significant differences between populations by food type for allergic outcomes (eczema, wheeze, atopic sensitization) or symptomatic Campylobacter infection. Two studies reported significantly lower urinary pesticide levels among children consuming organic versus conventional diets, but studies of biomarker and nutrient levels in serum, urine, breast milk, and semen in adults did not identify clinically meaningful differences. All estimates of differences in nutrient and contaminant levels in foods were highly heterogeneous except for the estimate for phosphorus; phosphorus levels were significantly higher in organic than in conventional produce, although this difference is not clinically significant. The risk for contamination with detectable pesticide residues was lower among organic than conventional produce (risk difference, -30% [CI, -37% to -23%]), but differences in risk for exceeding maximum allowed limits were small. Escherichia coli contamination risk did not differ between organic and conventional produce. Bacterial contamination of retail chicken and pork was common but unrelated to farming method. However, the risk for isolating bacteria resistant to 3 or more antibiotics was higher in conventional than in organic chicken and pork (risk difference, 33% [CI, 21% to 45%]). LIMITATION: Studies were heterogeneous and limited in number, and publication bias may be present. CONCLUSION: The published literature lacks strong evidence that organic foods are significantly more nutritious than conventional foods. Consumption of organic foods may reduce exposure to pesticide residues and antibiotic-resistant bacteria. PRIMARY FUNDING SOURCE: None.
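The risk differences with 95% CIs reported above can be computed with a standard two-proportion Wald interval; here is a minimal sketch with made-up counts (not the review's data).

```python
# Hedged sketch: risk difference between two proportions with a 95% Wald CI.
import math

def risk_difference(x1, n1, x2, n2, z=1.96):
    """x/n: events and totals in each group; returns (rd, (lower, upper))."""
    p1, p2 = x1 / n1, x2 / n2
    rd = p1 - p2
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return rd, (rd - z * se, rd + z * se)

# e.g., detectable residues: organic vs conventional produce (invented counts)
rd, ci = risk_difference(7, 100, 38, 100)
print(f"risk difference = {rd:.0%}, 95% CI ({ci[0]:.0%}, {ci[1]:.0%})")
```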
Subject(s)
Food Contamination , Food, Organic , Nutritive Value , Drug Resistance, Bacterial , Food Contamination/analysis , Food Microbiology , Food, Organic/analysis , Food, Organic/microbiology , Humans , Pesticide Residues/analysis , Vitamins/analysis
ABSTRACT
Background and Aims: The 2012 and 2020 US Multi-Society Task Force postpolypectomy guidelines have recommended progressively longer surveillance intervals for patients with low-risk adenomas (LRAs). These guidelines require data from past colonoscopies. We examined the impact of the 2012 guidelines for second surveillance on clinical practice, including the availability of prior colonoscopy data, with the aim of informing the implementation of the 2020 guidelines. Methods: We identified surveillance colonoscopies at Stanford Health Care and the Palo Alto Veterans Affairs Health Care System in 3 periods: preguideline (March-August 2012), postguideline (January-June 2013), and delayed postguideline (July-September 2017). We collected data on the most recent previous colonoscopy, findings at the study entry surveillance colonoscopy, and recommendations for subsequent surveillance. Results: Among 977 patients, the most recent prior colonoscopy data were available in 78% of preguideline, 78% of postguideline, and 61% of delayed postguideline cases (P < .001). The fraction of surveillance colonoscopy reports that deferred recommendations awaiting pathology increased from 6% in preguideline and 11% in postguideline cases to 59% in delayed postguideline cases (P < .001). Overall adherence to guidelines for subsequent surveillance was similar in all 3 periods (54%-67%; P = .089). In the postguideline and delayed postguideline periods combined, a 10-year subsequent surveillance interval was recommended in 0 of 29 cases with LRA followed by a normal surveillance colonoscopy. Conclusion: In patients undergoing surveillance, prior colonoscopy data were not always available and recommendations were often deferred awaiting pathology. Adherence to subsequent surveillance guidelines was suboptimal, especially for LRA followed by normal colonoscopy. Strategies addressing these gaps are needed to optimize implementation of the updated 2020 postpolypectomy guidelines.
ABSTRACT
BACKGROUND: Frontline providers frequently make time-sensitive antibiotic choices, but many feel poorly equipped to handle antibiotic allergies. OBJECTIVE: We hypothesized that a digital decision support tool could improve antibiotic selection and confidence when managing β-lactam allergies. METHODS: A digital decision support tool was designed to guide non-allergist providers in managing patients with β-lactam allergy labels. Non-allergists were asked to make decisions in clinical test cases without the tool, and then with it. These decisions were compared using paired t tests. Users also completed surveys assessing their confidence in managing antibiotic allergies. RESULTS: The tool's algorithm was validated by confirming that its recommendations aligned with those of five allergists. Non-allergist providers (n = 102) made antibiotic management decisions in test cases, both with and without the tool. Use of the tool increased the proportion of correct decisions from 0.41 to 0.67, a difference of 0.26 (95% CI, 0.22-0.30; P < .001). Users were more likely to give full-dose antibiotics in low-risk situations, give challenge doses in medium-risk situations, and avoid the antibiotic and/or consult allergy departments in high-risk situations. A total of 98 users (96%) said the tool would increase their confidence when choosing antibiotics for patients with allergies. CONCLUSIONS: A point-of-care clinical decision tool provides allergist-designed guidance for non-allergists and is a scalable system for addressing antibiotic allergies, irrespective of allergist availability. This tool encouraged appropriate antibiotic use in low- and medium-risk situations and increased caution in high-risk situations. A digital support tool should be considered in quality improvement and antibiotic stewardship efforts.
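The with/without-tool comparison above is a paired design, so each provider serves as their own control. A minimal sketch of the paired t test on synthetic per-provider accuracy follows; all numbers are invented.

```python
# Hedged sketch: paired t test on per-provider accuracy (synthetic data).
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
n = 102  # providers
without_tool = np.clip(rng.normal(0.41, 0.15, n), 0, 1)           # proportion correct
with_tool = np.clip(without_tool + rng.normal(0.26, 0.10, n), 0, 1)

t, p = stats.ttest_rel(with_tool, without_tool)  # paired: same providers twice
diff = (with_tool - without_tool).mean()
print(f"mean difference = {diff:.2f}, t = {t:.1f}, p = {p:.2g}")
```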