ABSTRACT
The World Endoscopy Organization (WEO) standardized the reporting of post-colonoscopy colorectal cancers (PCCRCs), which account for 7% to 10% of colorectal cancers (CRCs).1 PCCRCs are diagnosed 6 to 36 months after a false-negative colonoscopy. Detected CRCs (dCRCs) are diagnosed ≤6 months after an index true-positive colonoscopy.2 PCCRC prognosis is unclear, with outcomes reported as comparable,3 superior,4 or inferior5,6 to those of dCRC. Because WEO terminology defines cases relative to the index colonoscopy, conventional survival analyses of PCCRC are susceptible to lead-time and immortal-time biases. We evaluated the influence of these biases on mortality in a population-based retrospective cohort of 10,938 dCRCs (93.8%) and 717 PCCRCs (6.2%). This study was set within Kaiser Permanente Northern California (KPNC), a large integrated health system whose members are similar in demographic and socioeconomic characteristics to those of the Northern California region.7
ABSTRACT
BACKGROUND AND AIMS: Prior antibiotic use may be a factor in the rising incidence of colorectal cancer seen in those under 50 years of age (early-onset colorectal cancer [EOCRC]); however, the few studies to examine this link have reported conflicting results. Therefore, we evaluated the association between oral antibiotic use in adulthood and EOCRC in a large integrated healthcare system in the United States. METHODS: A population-based nested case-control study was conducted among Kaiser Permanente Northern California patients 18-49 years of age diagnosed with EOCRC (adenocarcinoma of the colon or rectum) in 1998-2020 who had ≥2 years of continuous pharmacy benefit prior to diagnosis. Cases were matched 4:1 to healthy controls on birth year, sex, race and ethnicity, medical facility, and duration of pharmacy benefit. Antibiotic exposure >1 year before the diagnosis/index date was assessed using prescribing records. Conditional logistic regression was used to estimate odds ratios and 95% confidence intervals. A sensitivity analysis was performed among those with ≥10 years of continuous prescribing records. RESULTS: A total of 1359 EOCRC cases were matched to 4711 healthy controls. Antibiotic use in adulthood was not significantly associated with EOCRC in unadjusted or adjusted analyses (adjusted odds ratio, 1.04; 95% confidence interval, 0.94-1.26). No associations were seen for cumulative number of oral antibiotic dispensations or for any prior period of antibiotic exposure. CONCLUSIONS: In a large U.S. healthcare setting, there was no conclusive evidence of an association between oral antibiotic use in adulthood and risk of EOCRC.
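As an illustration of the kind of unadjusted estimate behind such a case-control analysis, the sketch below computes an odds ratio with a Wald confidence interval from a 2x2 table. The counts in the example are hypothetical, and the study itself used conditional logistic regression to respect the 4:1 matching, so this is only an approximation of the unmatched calculation.

```python
import math

def odds_ratio_ci(exposed_cases, unexposed_cases, exposed_controls, unexposed_controls):
    """Unconditional odds ratio with a Wald 95% CI on the log scale.

    Note: the study used conditional logistic regression for matched data;
    this unmatched calculation is an illustrative approximation only.
    """
    or_ = (exposed_cases * unexposed_controls) / (unexposed_cases * exposed_controls)
    # Standard error of log(OR): sqrt of the sum of reciprocal cell counts
    se = math.sqrt(1 / exposed_cases + 1 / unexposed_cases
                   + 1 / exposed_controls + 1 / unexposed_controls)
    z = 1.96  # normal quantile for a two-sided 95% interval
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical exposure counts, not taken from the paper:
print(odds_ratio_ci(800, 559, 2700, 2011))
```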
ABSTRACT
BACKGROUND AND AIMS: Guidelines now recommend patients with low-risk adenomas receive colonoscopy surveillance in 7-10 years and those with the previously recommended 5-year interval be re-evaluated. We tested 3 outreach approaches for transitioning patients to the 10-year interval recommendation. METHODS: This was a 3-arm pragmatic randomized trial comparing telephone, secure messaging, and mailed letter outreach. The setting was Kaiser Permanente Northern California, a large integrated healthcare system. Participants were patients 54-70 years of age with 1-2 small (<10 mm) tubular adenomas at baseline colonoscopy, due for 5-year surveillance in 2022, without high-risk conditions, and with access to all 3 outreach modalities. Patients were randomly assigned to the outreach arm (telephone [n = 200], secure message [n = 203], and mailed letter [n = 201]) stratified by age, sex, and race/ethnicity. Outreach in each arm was performed by trained medical assistants (unblinded) communicating in English with 1 reminder attempt at 2-4 weeks. Participants could change their assigned interval to 10 years or continue their planned 5-year interval. RESULTS: Sixty-day response rates were higher for telephone (64.5%) and secure messaging outreach (51.7%) vs mailed letter (31.3%). Also, more patients adopted the 10-year surveillance interval in the telephone (37.0%) and secure messaging arms (32.0%) compared with mailed letter (18.9%) and rate differences were significant for telephone (18.1%; 97.5% confidence interval: 8.3%-27.9%) and secure message outreach (13.1%; 97.5% confidence interval: 3.5%-22.7%) vs mailed letter outreach. CONCLUSIONS: Telephone and secure messaging were more effective than mailed letter outreach for de-implementing outdated colonoscopy surveillance recommendations among individuals with a history of low-risk adenomas in an integrated healthcare setting. (ClinicalTrials.gov, Number: NCT05389397).
Subjects
Colonoscopy, Aged, Female, Humans, Male, Middle Aged, Adenoma/diagnosis, California, Colonoscopy/methods, Colonoscopy/statistics & numerical data, Colorectal Neoplasms/diagnosis, Colorectal Neoplasms/prevention & control, Early Detection of Cancer/methods, Telephone
ABSTRACT
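The trial's reported rate differences can be reproduced from arm-level counts back-calculated from the percentages above (74/200 telephone vs 38/201 mailed-letter participants adopting the 10-year interval), using a Wald interval with z = 2.2414 for the two-sided 97.5% confidence level. This is a sketch of one plausible calculation, not the trial's actual analysis code.

```python
import math

def rate_difference_ci(x1, n1, x2, n2, z=2.2414):
    """Wald confidence interval for a difference in proportions.

    z = 2.2414 corresponds to a two-sided 97.5% interval (consistent with a
    Bonferroni-style adjustment for two primary pairwise comparisons).
    """
    p1, p2 = x1 / n1, x2 / n2
    diff = p1 - p2
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return diff, diff - z * se, diff + z * se

# Telephone (74/200) vs mailed letter (38/201), back-calculated counts:
diff, lo, hi = rate_difference_ci(74, 200, 38, 201)
print(f"{diff:.1%} (97.5% CI: {lo:.1%}-{hi:.1%})")  # 18.1% (97.5% CI: 8.3%-27.9%)
```

The interval matches the 18.1% (8.3%-27.9%) difference reported for telephone vs mailed-letter outreach.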
INTRODUCTION: Colonoscopy surveillance guidelines categorize individuals as high or low risk for future colorectal cancer (CRC) based primarily on their prior polyp characteristics, but this approach is imprecise, and consideration of other risk factors may improve postpolypectomy risk stratification. METHODS: Among patients who underwent a baseline colonoscopy with removal of a conventional adenoma in 2004-2016, we compared the performance for postpolypectomy CRC risk prediction (through 2020) of a comprehensive model featuring patient age, diabetes diagnosis, and baseline colonoscopy indication and prior polyp findings (i.e., adenoma with advanced histology, polyp size ≥10 mm, and sessile serrated adenoma or traditional serrated adenoma) with a polyp model featuring only polyp findings. Models were developed using Cox regression. Performance was assessed using the area under the receiver operating characteristic curve (AUC), and calibration was assessed with the Hosmer-Lemeshow goodness-of-fit test. RESULTS: Among 95,001 patients randomly divided 70:30 into model development (n = 66,500) and internal validation cohorts (n = 28,501), 495 CRCs were subsequently diagnosed: 354 in the development cohort and 141 in the validation cohort. Models demonstrated adequate calibration, and the comprehensive model demonstrated superior predictive performance to the polyp model in both the development cohort (AUC 0.71, 95% confidence interval [CI] 0.68-0.74 vs AUC 0.61, 95% CI 0.58-0.64) and the validation cohort (AUC 0.70, 95% CI 0.65-0.75 vs AUC 0.62, 95% CI 0.57-0.67). DISCUSSION: A comprehensive CRC risk prediction model featuring patient age, diabetes diagnosis, and baseline colonoscopy indication and polyp findings was more accurate at predicting postpolypectomy CRC diagnosis than a model based on polyp findings alone.
Subjects
Adenoma, Colonic Polyps, Colonoscopy, Colorectal Neoplasms, Humans, Colorectal Neoplasms/pathology, Colorectal Neoplasms/surgery, Colorectal Neoplasms/diagnosis, Male, Female, Colonoscopy/methods, Middle Aged, Adenoma/surgery, Adenoma/pathology, Adenoma/diagnosis, Risk Assessment, Aged, Colonic Polyps/surgery, Colonic Polyps/pathology, Colonic Polyps/diagnosis, Risk Factors, ROC Curve, Proportional Hazards Models, Retrospective Studies
ABSTRACT
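The AUC used to compare the two models equals the probability that a randomly chosen case receives a higher predicted risk than a randomly chosen non-case (ties counting half). A minimal pure-Python sketch of that rank-based definition, for illustration only (O(n·m), so not suited to cohorts of this study's size):

```python
def auc(scores_pos, scores_neg):
    """AUC as the Mann-Whitney probability that a randomly chosen case
    outranks a randomly chosen non-case; ties contribute 0.5."""
    wins = sum((p > n) + 0.5 * (p == n)
               for p in scores_pos
               for n in scores_neg)
    return wins / (len(scores_pos) * len(scores_neg))

# Toy predicted risks for 3 cases and 3 non-cases:
print(auc([0.9, 0.8, 0.4], [0.7, 0.3, 0.2]))  # 8 of 9 pairs correctly ordered
```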
OBJECTIVE: Since its inception in the early 2000s, hybrid arch repair (HAR) has evolved from a novel approach to a well-established treatment modality for aortic arch pathology in appropriately selected patients. Despite this nearly 20-year history of use, the long-term results of HAR remain to be determined. As such, the objectives of this study are to detail the long-term outcomes of HAR within an expanded classification scheme. METHODS: From August 2005 to August 2022, 163 consecutive patients underwent HAR at a single referral institution. The operative approach was selected according to an institutional algorithm and included zone 0/1 HAR in 25% (n = 40), type I HAR in 34% (n = 56), and type II/III HAR in 41% (n = 67). The specific zone 0/1 technique was zone 1 HAR in 31 (78%), zone 0 with innominate snorkel (zone 0S HAR) in 7 (18%), and zone 0 with single side-branch endograft (zone 0B HAR) in 2 (5%). The 30-day and long-term outcomes, including overall and aorta-specific survival, as well as freedom from reintervention, were assessed. RESULTS: The mean age was 63 ± 13 years, and almost one-half of patients (47% [n = 77]) had a prior sternotomy. Presenting pathology included degenerative aneurysm in 44% (n = 71), residual dissection after prior type A repair in 38% (n = 62), chronic type B dissection in 12% (n = 20), and other indications in 6% (n = 10). Operative outcomes included 9% mortality (n = 14) at 30 days, 5% mortality (n = 8) in hospital, 4% stroke (n = 7), 2% new dialysis (n = 3), and 2% permanent paraparesis/paraplegia (n = 3). The median follow-up was 44 months (interquartile range, 12-84 months). Overall survival was 59% and 47% at 5 and 10 years, respectively, whereas aorta-specific survival was 86% and 84% at the same time points. At 5 and 10 years, freedom from major reintervention was 92% and 91%, respectively.
Institutional experience had a significant impact on both early and late outcomes: comparing the first (2005-2012) and second (2013-2022) halves of the series, 30-day mortality decreased from 14% to 1% (P = .01) and stroke from 6% to 3% (P = .62). Improved operative outcomes were accompanied by improved late survival, with 78% of patients in the later era vs 45% in the earlier era surviving to 5 years. CONCLUSIONS: HAR is associated with excellent operative outcomes, as well as sustained protection from adverse aortic events as evidenced by high long-term aorta-specific survival and freedom from reintervention. However, surgeon and institutional experience appear to play a major role in achieving these superior outcomes, with a five-fold decrease in operative mortality and a two-fold decrease in stroke rate in the latter half of the series. These long-term results expand on prior midterm data and continue to support use of HAR for properly selected patients with arch disease.
Assuntos
Aneurisma da Aorta Torácica , Implante de Prótese Vascular , Procedimentos Endovasculares , Acidente Vascular Cerebral , Humanos , Pessoa de Meia-Idade , Idoso , Aorta Torácica/diagnóstico por imagem , Aorta Torácica/cirurgia , Aneurisma da Aorta Torácica/diagnóstico por imagem , Aneurisma da Aorta Torácica/cirurgia , Aneurisma da Aorta Torácica/etiologia , Resultado do Tratamento , Fatores de Risco , Estudos Retrospectivos , Estimativa de Kaplan-Meier , Complicações Pós-Operatórias , Acidente Vascular Cerebral/etiologiaRESUMO
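The overall and aorta-specific survival figures above are Kaplan-Meier estimates. A minimal sketch of the product-limit calculation with hypothetical follow-up data (right-censoring encoded as event = 0); the grouping by event time mirrors the standard estimator:

```python
from collections import Counter

def kaplan_meier(times, events):
    """Product-limit survival estimate.

    times: follow-up durations (e.g., months); events: 1 = death, 0 = censored.
    Returns a list of (time, S(time)) pairs at each distinct event time.
    """
    event_counts = Counter(t for t, e in zip(times, events) if e)
    n = len(times)          # number still at risk
    surv, curve = 1.0, []
    for t in sorted(set(times)):
        d = event_counts.get(t, 0)
        if d:
            surv *= 1 - d / n           # multiply by conditional survival at t
            curve.append((t, surv))
        n -= sum(1 for x in times if x == t)  # remove deaths and censorings at t
    return curve

# Hypothetical data: deaths at months 1 and 3, censorings at months 2 and 4
print(kaplan_meier([1, 2, 3, 4], [1, 0, 1, 0]))
```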
BACKGROUND/OBJECTIVE: Multilevel barriers to colonoscopy after a positive fecal blood test for colorectal cancer (CRC) are well-documented. A less-explored barrier to appropriate follow-up is repeat fecal testing after a positive test. We investigated this phenomenon using mixed methods. DESIGN: This sequential mixed methods study included quantitative data from a large cohort of patients aged 50-89 years from four healthcare systems with a positive fecal test in 2010-2018 and qualitative data from interviews with physicians and patients. MAIN MEASURES: Logistic regression was used to evaluate whether repeat testing was associated with failure to complete subsequent colonoscopy and to identify factors associated with repeat testing. Interviews were coded and analyzed to explore reasons for repeat testing. KEY RESULTS: A total of 316,443 patients had a positive fecal test. Within 1 year, 76.3% received a colonoscopy without repeat fecal testing, 3% repeated testing and then received a colonoscopy, 4.4% repeated testing without colonoscopy, and 16.3% did neither. Among repeat testers (7.4% of the total cohort, N = 23,312), 59% did not receive a colonoscopy within 1 year. In adjusted models, those with an initial positive test followed by a negative second test were significantly less likely to receive colonoscopy than those with two successive positive tests (OR 0.37, 95% CI 0.35-0.40). Older age (65-75 vs. 50-64 years: OR 1.37, 95% CI 1.33-1.41) and higher comorbidity score (≥ 4 vs. 0: OR 1.75, 95% CI 1.67-1.83) were significantly associated with repeat testing compared to those who received colonoscopy without repeat tests. Qualitative interview data revealed reasons underlying repeat testing, including colonoscopy avoidance, bargaining, and disbelief of positive results. CONCLUSIONS: Among patients in this cohort, 7.4% repeated fecal testing after an initial positive test. Of those, over half did not go on to receive a colonoscopy within 1 year.
Efforts to improve CRC screening must address repeat fecal testing after a positive test as a barrier to completing colonoscopy.
ABSTRACT
BACKGROUND: Minimally invasive distal pancreatectomy (MIDP) has established advantages over the open approach. The costs associated with robotic DP (RDP) versus laparoscopic DP (LDP) make the robotic approach controversial. We sought to compare outcomes and cost of LDP and RDP using propensity matching analysis at our institution. METHODS: Patients undergoing LDP or RDP between 2000 and 2021 were retrospectively identified. Patients were optimally matched using age, gender, American Society of Anesthesiologists status, body mass index, and tumor size. Between-group differences were analyzed using the Wilcoxon signed-rank test for continuous data and McNemar's test for categorical data. Outcomes included operative duration, conversion to open surgery, postoperative length of stay, pancreatic fistula rate, pseudocyst requiring intervention, and costs. RESULTS: A total of 298 patients underwent MIDP; 180 (60%) were laparoscopic and 118 (40%) were robotic. All RDPs were matched 1:1 to a laparoscopic case with absolute standardized mean differences for all matching covariates below 0.10, except for tumor type (0.16). RDP had longer operative times (268 vs 178 min, p < 0.01), shorter length of stay (2 vs 4 days, p < 0.01), fewer biochemical pancreatic leaks (11.9% vs 34.7%, p < 0.01), and fewer interventional radiologic drainage procedures (0% vs 5.9%, p = 0.01). The rates of pancreatic fistula (11.9% vs 5.1%, p = 0.12), collections requiring antibiotics or intervention (11.9% vs 5.1%, p = 0.12), and conversion to open surgery (3.4% vs 5.1%, p = 0.72) were comparable between the two groups. The total direct index admission costs for RDP were 1.01 times higher than for LDP for FY16-19 (p = 0.372) and 1.33 times higher for FY20-22 (p = 0.031). CONCLUSIONS: Although RDP required longer operative times than LDP, postoperative stays were shorter. The procedure cost of RDP was modestly higher than that of LDP, though this was partially offset by the reduced hospital stay and reintervention rate.
Subjects
Laparoscopy, Pancreatic Neoplasms, Robotic Surgical Procedures, Humans, Robotic Surgical Procedures/methods, Pancreatectomy/methods, Retrospective Studies, Pancreatic Neoplasms/surgery, Treatment Outcome, Pancreatic Fistula/epidemiology, Pancreatic Fistula/etiology, Pancreatic Fistula/surgery, Length of Stay, Laparoscopy/methods, Operative Time
ABSTRACT
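The matching quality reported above (absolute standardized mean differences below 0.10 for the matching covariates) can be checked with a simple calculation. This sketch uses the pooled-standard-deviation definition for a continuous covariate, which is one common convention and not necessarily the authors' exact formula:

```python
import math

def smd(x_treated, x_control):
    """Absolute standardized mean difference for a continuous covariate,
    using the pooled standard deviation of the two groups."""
    def mean(v):
        return sum(v) / len(v)

    def var(v):  # sample variance (n - 1 denominator)
        m = mean(v)
        return sum((x - m) ** 2 for x in v) / (len(v) - 1)

    pooled_sd = math.sqrt((var(x_treated) + var(x_control)) / 2)
    return abs(mean(x_treated) - mean(x_control)) / pooled_sd

# Hypothetical ages in matched robotic vs laparoscopic groups:
print(smd([60, 62, 64], [58, 60, 62]))
```

An SMD below 0.10 is the conventional threshold for acceptable covariate balance after matching.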
OBJECTIVE: To determine the threshold annualized esophagectomy volume that is associated with improved survival, oncologic resection, and postoperative outcomes. BACKGROUND: Esophagectomy at high-volume centers is associated with improved outcomes; however, the definition of high-volume remains debated. METHODS: The 2004 to 2016 National Cancer Database was queried for patients with clinical stage I to III esophageal cancer undergoing esophagectomy. Center esophagectomy volume was modeled as a continuous variable using restricted cubic splines. Maximally selected ranks were used to identify an inflection point of center volume and survival. Survival was compared using multivariable Cox proportional hazards methods. Multivariable logistic regression was used to examine secondary outcomes. RESULTS: Overall, 13,493 patients met study criteria. Median center esophagectomy volume was 8.2 (interquartile range: 3.2-17.2) cases per year. On restricted cubic splines, inflection points were identified at 9 and 30 cases per year. A multivariable Cox model was constructed modeling annualized center surgical volume as a continuous variable using 3 linear splines and inflection points at 9 and 30 cases per year. On multivariable analysis, increasing center volume up to 9 cases per year was associated with a substantial survival benefit (hazard ratio: 0.97, 95% confidence interval, 0.95-0.98, P ≤0.001). On multivariable logistic regression, factors associated with undergoing surgery at a high-volume center (>9 cases per year) included private insurance, care at an academic center, completion of high school education, and greater travel distance. CONCLUSIONS: This National Cancer Database study utilizing multivariable analysis and restricted cubic splines suggests the threshold definition of a high-volume esophagectomy center as one that performs at least 10 operations a year.
Subjects
Esophageal Neoplasms, Esophagectomy, Humans, Esophagectomy/methods, Proportional Hazards Models, Esophageal Neoplasms/surgery, Logistic Models, Factual Databases, Retrospective Studies, Treatment Outcome
ABSTRACT
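A Cox model with three linear splines and inflection points at 9 and 30 cases per year corresponds to a piecewise-linear expansion of center volume: a base slope plus slope changes at each knot. A sketch of that design-matrix expansion (the knot placement follows the abstract; the function name and coding are ours):

```python
def linear_spline_terms(volume, knots=(9, 30)):
    """Expand annualized center volume into piecewise-linear spline terms:
    the raw volume plus hinge terms max(volume - knot, 0) at each knot.
    Each hinge coefficient is the change in slope past that knot."""
    return [volume] + [max(volume - k, 0.0) for k in knots]

# Below the first knot, only the base slope applies;
# above both knots, all three terms are active:
print(linear_spline_terms(5))   # [5, 0.0, 0.0]
print(linear_spline_terms(35))  # [35, 26.0, 5.0]
```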
BACKGROUND & AIMS: Recent research has demonstrated biologic plausibility for iatrogenic tumor seeding via colonoscopy as a cause of metachronous colorectal cancers (CRC). This study evaluated the association between biopsy of non-tumor sites after CRC biopsy and the risk of metachronous CRC in a large community-based health care organization. METHODS: This was a retrospective case-control study of adults with an initial CRC diagnosed by colonoscopy between January 2006 and June 2018 who underwent curative resection. Cases developed a second primary (metachronous) CRC diagnosed 6 months to 4 years after the initial CRC and were matched by age, sex, diagnosis of inflammatory bowel disease, race, and ethnicity with up to 5 controls without a second CRC diagnosis. The exposure was biopsy in the colonic segment of the metachronous CRC (or the corresponding segment in controls) after tumor biopsy, ascertained with blinding to case status. Associations were evaluated using conditional logistic regression and adjusted for potential confounders. RESULTS: Among 14,119 patients diagnosed with an initial CRC during colonoscopy, 107 received a second CRC diagnosis. After exclusions for recurrent or synchronous CRC, 45 cases and 212 controls were included. There was no significant association between biopsy of non-tumor sites after initial CRC biopsy and risk of metachronous CRC in the segment of the additional biopsy site (adjusted odds ratio, 2.29; 95% confidence interval, 0.77-6.81). CONCLUSIONS: Metachronous cancers are not significantly associated with biopsy of non-tumor sites after biopsy of the primary cancer. Although the sample size does not allow definite exclusion of any association, these findings do not support iatrogenic tumor seeding as a common risk factor for metachronous CRC.
Subjects
Colorectal Neoplasms, Second Primary Neoplasms, Adult, Humans, Case-Control Studies, Retrospective Studies, Second Primary Neoplasms/diagnosis, Colorectal Neoplasms/pathology, Risk Factors, Colonoscopy/adverse effects, Biopsy/adverse effects, Iatrogenic Disease
ABSTRACT
BACKGROUND & AIMS: The COVID-19 pandemic has affected clinical services globally, including colorectal cancer (CRC) screening and diagnostic testing. We investigated the pandemic's impact on fecal immunochemical test (FIT) screening, colonoscopy utilization, and colorectal neoplasia detection across 21 medical centers in a large integrated health care organization. METHODS: We performed a retrospective cohort study in Kaiser Permanente Northern California patients ages 18 to 89 years in 2019 and 2020 and measured changes in the numbers of mailed, completed, and positive FITs; colonoscopies; and cases of colorectal neoplasia detected by colonoscopy in 2020 vs 2019. RESULTS: FIT kit mailings ceased in mid-March through April 2020 but then rebounded and there was an 8.7% increase in kits mailed compared with 2019. With the later mailing of FIT kits, there were 9.0% fewer FITs completed and 10.1% fewer positive tests in 2020 vs 2019. Colonoscopy volumes declined 79.4% in April 2020 compared with April 2019 but recovered to near pre-pandemic volumes in September through December, resulting in a 26.9% decline in total colonoscopies performed in 2020. The number of patients diagnosed by colonoscopy with CRC and advanced adenoma declined by 8.7% and 26.9%, respectively, in 2020 vs 2019. CONCLUSIONS: The pandemic led to fewer FIT screenings and colonoscopies in 2020 vs 2019; however, after the lifting of shelter-in-place orders, FIT screenings exceeded, and colonoscopy volumes nearly reached numbers from those same months in 2019. Overall, there was an 8.7% reduction in CRC cases diagnosed by colonoscopy in 2020. These data may help inform the development of strategies for CRC screening and diagnostic testing during future national emergencies.
Subjects
COVID-19, Colorectal Neoplasms, Adolescent, Adult, Aged, Aged 80 and Over, COVID-19/diagnosis, COVID-19/epidemiology, Colonoscopy/methods, Colorectal Neoplasms/diagnosis, Colorectal Neoplasms/epidemiology, Community Health Services, Early Detection of Cancer/methods, Feces, Humans, Mass Screening/methods, Middle Aged, Occult Blood, Pandemics, Retrospective Studies, United States/epidemiology, Young Adult
ABSTRACT
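The year-over-year changes reported above are simple percent changes between 2020 and 2019 volumes. A one-line helper, shown with hypothetical counts chosen only so the example reproduces the reported 8.7% decline in CRC cases diagnosed by colonoscopy:

```python
def pct_change(n_2020, n_2019):
    """Year-over-year percent change; negative values indicate a decline
    in 2020 relative to 2019."""
    return 100.0 * (n_2020 - n_2019) / n_2019

# Hypothetical counts (not from the paper), illustrating an 8.7% decline:
print(pct_change(913, 1000))
```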
PURPOSE: To evaluate the sufficiency of the adjustment for dietary risk factors in the recent study by Li et al. published in Cancer Causes & Control. The main research question is: are the dietary adjustments in Li et al. sufficient to control for specific dietary food groups? METHODS: An evaluation of three methodological problems in Li et al. was performed: (1) the adjustment for total fruit intake, and how it relates to citrus fruit intake; (2) the adjustment for meat intake, and its relation to red and processed meat intake; (3) the broad categorization of fish intake, and how it may limit interpretation. RESULTS: Adjusting for total fruit intake and meat intake may not be enough to control for the effects of specific dietary components that may affect melanoma risk, such as citrus fruit and red and processed meat intake, increasing the risk of residual confounding. Moreover, because the dietary survey did not distinguish between fresh and canned tuna, significant limitations may be present. CONCLUSION: The dietary adjustments conducted in the study by Li et al. may not capture the intake of citrus fruit or red and processed meat, relevant to the risk of melanoma, and may induce residual confounding.
Subjects
Fruit, Melanoma, Animals, Humans, Diet, Risk Factors, Meat/adverse effects
ABSTRACT
Time at home is a critically important outcome to adults with acute myeloid leukemia (AML) when selecting treatment; however, no study to date has adequately described the amount of time older adults spend at home following initiation of chemotherapy. We queried records from a multi-institution health system to identify adults aged ≥60 years newly diagnosed with AML who were treated with azacitidine or venetoclax and evaluated the proportion of days at home (PDH) following diagnosis. Days were considered "at home" if patients were not admitted or seen in the emergency department or oncology/infusion clinic. Assessed covariates included demographics and disease risk. Associations between PDH and baseline characteristics were evaluated via linear regression, adjusted for log length of follow-up. From 2015 to 2020, 113 older adults were identified. Most received azacitidine plus venetoclax (51.3%), followed by azacitidine monotherapy (38.9%). The mean PDH for all patients was 0.58 (95% confidence interval: 0.54-0.63; median 0.63). PDH increased among survivors over time. PDH did not differ between therapy groups (adjusted mean, azacitidine plus venetoclax: 0.68; azacitidine monotherapy: 0.66; P=0.64) or between disease risk categories (P=0.34). Compared to patients receiving azacitidine monotherapy, patients receiving azacitidine plus venetoclax had longer clinic visits (median minutes: 127.9 vs. 112.9, P<0.001) and infusion visits (median minutes: 194.3 vs. 132.5, P<0.001). The burden of care for older adults with AML treated with "less intense" chemotherapy is high. The addition of venetoclax to azacitidine did not translate into increased time at home. Future prospective studies should evaluate patient-centered outcomes, including time at home, to inform shared decision-making and drug development.
Subjects
Azacitidine, Acute Myeloid Leukemia, Humans, Aged, Prospective Studies, Bridged Bicyclic Heterocyclic Compounds, Acute Myeloid Leukemia/diagnosis, Acute Myeloid Leukemia/drug therapy, Acute Myeloid Leukemia/etiology, Antineoplastic Combined Chemotherapy Protocols/adverse effects
ABSTRACT
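The proportion of days at home can be computed directly from encounter dates. A sketch with hypothetical dates, counting any calendar day with a hospital, emergency department, or oncology/infusion-clinic contact as a day away from home (duplicate same-day encounters count once):

```python
from datetime import date

def proportion_days_at_home(start, end, encounter_dates):
    """Fraction of follow-up days (inclusive) with no hospital, ED, or
    oncology/infusion-clinic contact. Multiple encounters on the same
    calendar day are counted as a single day away from home."""
    total = (end - start).days + 1
    away = len({d for d in encounter_dates if start <= d <= end})
    return (total - away) / total

# Hypothetical 10-day follow-up with encounters on 3 records across 2 days:
pdh = proportion_days_at_home(
    date(2020, 1, 1), date(2020, 1, 10),
    [date(2020, 1, 2), date(2020, 1, 2), date(2020, 1, 5)],
)
print(pdh)  # 0.8
```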
BACKGROUND AND AIMS: Endoscopist adenoma detection rates (ADRs) vary widely and are associated with patients' risk of postcolonoscopy colorectal cancers (PCCRCs). However, few scalable physician-directed interventions demonstrably both improve ADR and reduce PCCRC risk. METHODS: Among patients undergoing colonoscopy, we evaluated the influence of a scalable online training on individual-level ADRs and PCCRC risk. The intervention was a 30-minute, interactive, online training, developed using behavior change theory, to address factors that potentially impede detection of adenomas. Analyses included interrupted time series analyses for pretraining versus posttraining individual-physician ADR changes (adjusted for temporal trends) and Cox regression for associations between ADR changes and patients' PCCRC risk. RESULTS: Across 21 endoscopy centers and all 86 eligible endoscopists, ADRs increased immediately by an absolute 3.13% (95% confidence interval [CI], 1.31-4.94) in the 3-month quarter after training, compared with 0.58% per quarter (95% CI, 0.40-0.77) and 0.33% per quarter (95% CI, 0.16-0.49) in the 3-year pretraining and posttraining periods, respectively. Posttraining ADR increases were higher among endoscopists with pretraining ADRs below the median. Among 146,786 posttraining colonoscopies (all indications), each 1% absolute increase in screening ADR posttraining was associated with a 4% decrease in their patients' PCCRC risk (hazard ratio, 0.96; 95% CI, 0.93-0.99). An ADR increase of ≥10% versus <1% was associated with a 55% reduced risk of PCCRC (hazard ratio, 0.45; 95% CI, 0.24-0.82). CONCLUSIONS: A scalable, online behavior change training intervention focused on modifiable factors was associated with significant and sustained improvements in ADR, particularly among endoscopists with lower ADRs. These ADR changes were associated with substantial reductions in their patients' risk of PCCRC.
Subjects
Colorectal Neoplasms, Physicians, Plastic Surgery Procedures, Humans, Colonoscopy, Colorectal Neoplasms/diagnosis
ABSTRACT
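An interrupted time series analysis like the one described above typically fits a segmented regression with a pre-training trend, an immediate level change at the intervention, and a post-training change in trend. A sketch of the design-matrix row for one quarter (this coding is one standard convention, not necessarily the authors' exact model specification):

```python
def its_design_row(quarter, training_quarter):
    """Design-matrix row for a simple interrupted-time-series model:
    [intercept, time (pre-trend), post indicator (immediate level change),
     time since training (post-trend change)]."""
    post = 1 if quarter >= training_quarter else 0
    return [1, quarter, post, post * (quarter - training_quarter)]

# Before training (quarter 5 of 8): only intercept and pre-trend are active.
print(its_design_row(5, 8))   # [1, 5, 0, 0]
# Two quarters after training: level-change and post-trend terms activate.
print(its_design_row(10, 8))  # [1, 10, 1, 2]
```

Regressing quarterly ADR on these columns yields the immediate jump (third coefficient) and the slope change (fourth coefficient) reported in such analyses.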
BACKGROUND: This post-hoc analysis from three phase 3 treatment trials of rimegepant 75 mg - an oral small molecule calcitonin gene-related peptide receptor antagonist for acute and preventive treatment of migraine - assessed efficacy in adults with migraine based on triptan treatment experience. METHODS: Participants were assigned to one of four groups based on triptan treatment experience: insufficient response (e.g., lack of efficacy and/or poor tolerability) to 1 triptan, insufficient response to ≥2 triptans, current triptan users, and triptan-naïve participants. The co-primary efficacy endpoints were pain freedom and most bothersome symptom freedom at two hours postdose. RESULTS: In the three trials (N = 3507; rimegepant n = 1749, placebo n = 1758), 1235 (35.2%) participants had a history of insufficient response to 1 triptan (n = 910 [25.9%]) or ≥2 triptans (n = 325 [9.3%]), and 2272 (64.8%) had no history of insufficient response to triptans (current use = 595 [17.0%], naïve = 1677 [47.8%]). Rimegepant was effective on the co-primary endpoints in all subgroups (p ≤ 0.013), except for freedom from the most bothersome symptom in the triptan-naïve group (p = 0.06). No differences on co-primary endpoints were found in pairwise comparisons of rimegepant-treated participants. CONCLUSIONS: Rimegepant was effective for the acute treatment of migraine in adults with a history of insufficient response to 1 or ≥2 triptans and in current triptan users. Efficacy on co-primary endpoints did not differ based on the number of insufficient triptan responses. Trial registration: ClinicalTrials.gov: NCT03235479, NCT03237845, NCT03461757.
Subjects
Migraine Disorders, Tryptamines, Adult, Humans, Migraine Disorders/drug therapy, Piperidines/therapeutic use, Randomized Controlled Trials as Topic, Serotonin 5-HT1 Receptor Agonists/therapeutic use, Tryptamines/therapeutic use, Clinical Trials Phase III as Topic
ABSTRACT
OBJECTIVE: To characterize treatment decision-making processes and formalize consensus regarding key factors headache specialists consider in treatment decisions for patients with migraine, considering novel therapies. BACKGROUND: Migraine therapies have long been subject to binary classification, acute versus preventive, due to limitations of available drugs. The emergence of novel therapies that can be used more flexibly creates an opportunity to rethink this binary classification. To determine the role of these novel therapies in treatment, it is critical to understand whether existing guidelines reflect clinical practice and to establish consensus around factors driving management. METHODS: A three-round modified Delphi process was conducted with migraine clinical experts. Round 1 consisted of an online questionnaire; Round 2 involved an online discussion of aggregated Round 1 results; and Round 3 allowed participants to revise Round 1 responses, incorporating Round 2 insights. Questions elicited likelihood ratings (0 = highly unlikely to 100 = highly likely), rankings, and estimates on treatment decision-making. RESULTS: Nineteen experts completed three Delphi rounds. Experts strongly agreed on definitions for "acute" (median = 100, inter-quartile range [IQR] = 5) and "preventive" treatment (median = 90, IQR = 15), but noted a need for treatment customization for patients (median = 100, IQR = 6). Experts noted certain aspects of guidelines may no longer apply based on established tolerability and efficacy of newer acute and preventive agents (median = 91, IQR = 17). Further, experts agreed on a treatment category referred to as "situational prevention" (or "short-term prevention") for patients with reliable and predictable migraine triggers (median = 100, IQR = 10) or time-limited periods when headache avoidance is important (median = 100, IQR = 12). 
CONCLUSIONS: Using the modified Delphi method, a panel of migraine experts identified the importance of customizing treatment for people with migraine and the utility of "situational prevention," given the ability of new treatment options to meet this need and the potential to clinically identify patients and time periods when this approach would add value.
Subjects
Migraine Disorders, Humans, Consensus, Migraine Disorders/drug therapy, Delphi Technique, Surveys and Questionnaires, Headache
ABSTRACT
AIMS: Previously, no relationship between milk consumption and the risk of type 2 diabetes has been found in prospective cohorts. However, Mendelian randomization allows researchers to largely bypass residual confounding, providing a less confounded effect estimate. This systematic review investigates the risk of type 2 diabetes and levels of HbA1c by assessing all Mendelian randomization studies on this subject. DATA SYNTHESIS: PubMed and EMBASE were searched from October 2021 through February 2023. Inclusion and exclusion criteria were formulated to filter out irrelevant studies. Studies were qualitatively assessed with STROBE-MR together with a list of five MR criteria. Six studies were identified, together containing several thousand participants. All studies used the SNP rs4988235 as the main exposure and type 2 diabetes and/or HbA1c as the main outcome. Five studies were graded as "good" with STROBE-MR, with one graded as "fair". On the MR criteria, five studies were graded "good" on four criteria, while two studies were graded "good" on two criteria. Overall, genetically predicted milk consumption did not appear to be associated with an increased risk of type 2 diabetes. CONCLUSIONS: This systematic review found that genetically predicted milk consumption did not appear to increase the risk of type 2 diabetes. Future Mendelian randomization studies on this topic should consider two-sample designs in order to derive more valid effect estimates.
Subjects
Diabetes Mellitus, Type 2; Milk; Humans; Animals; Milk/adverse effects; Diabetes Mellitus, Type 2/diagnosis; Diabetes Mellitus, Type 2/epidemiology; Diabetes Mellitus, Type 2/genetics; Glycated Hemoglobin; Mendelian Randomization Analysis; Prospective Studies; Polymorphism, Single Nucleotide; Genome-Wide Association Study
ABSTRACT
Asthma is related to triggers within the home. Although it is recognised that triggers likely occur due to characteristics of housing, these characteristics have not been comprehensively reviewed, and there is a paucity of housing-focused interventions to reduce asthma and asthma symptoms. Following five steps identified by Arksey and O'Malley, we conducted a scoping review of published evidence on the associations between asthma and housing characteristics. We searched three electronic databases (PubMed, Scopus, Web of Science), identifying 33 studies that met our inclusion criteria. Through an iterative approach, we identified nine housing characteristics relevant to asthma onset or exacerbation, categorised as relating to the surrounding environment (location), the house itself (dwelling), or to conditions inside the home (occupancy). We conceptualise these three levels through a housing typologies framework. This facilitates the mapping of housing characteristics, and visualises how they can cluster and overlap to exacerbate asthma or asthma symptoms. Of the three levels in our framework, associations between asthma and locational features were evidenced most clearly in the literature reviewed. Within this category, environmental pollutants (and particularly air pollutants) were identified as a potentially important risk factor for asthma. Studies concerning associations between dwelling features and occupancy features and asthma reported inconsistent results, highlighting the need for greater research in these areas. Interpreting housing-related asthma triggers through this framework paves the way for the identification and targeting of typologies of housing that might adversely affect asthma, thus addressing multiple characteristics in tandem rather than as isolated elements.
Subjects
Air Pollutants; Arsenic; Asthma; Humans; Housing; Asthma/epidemiology; Asthma/etiology; Databases, Factual
ABSTRACT
INTRODUCTION: A well-known complication of veno-arterial extracorporeal membrane oxygenation (VA ECMO) is differential hypoxia, in which poorly oxygenated blood ejected from the left ventricle mixes with and displaces well-oxygenated blood from the circuit, thereby causing cerebral hypoxia and ischemia. We sought to characterize the impact of patient size and anatomy on cerebral perfusion under a range of VA ECMO flow conditions. METHODS: We used one-dimensional (1D) flow simulations to investigate mixing zone location and cerebral perfusion across 10 levels of VA ECMO support in eight semi-idealized patient geometries, for a total of 80 scenarios. Measured outcomes included mixing zone location and cerebral blood flow (CBF). RESULTS: Depending on patient anatomy, VA ECMO support ranging from 67% to 97% of a patient's ideal cardiac output was needed to perfuse the brain. In some cases, VA ECMO flows exceeding 90% of the patient's ideal cardiac output were needed for adequate cerebral perfusion. CONCLUSIONS: Individual patient anatomy markedly affects mixing zone location and cerebral perfusion in VA ECMO. Future fluid simulations of VA ECMO physiology should incorporate varied patient sizes and geometries to best provide insights toward reducing neurologic injury and improving outcomes in this patient population.
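The intuition behind the 67-97% figure can be illustrated with a toy mass balance (not the authors' 1D solver): in femoral VA ECMO, the retrograde circuit stream must first satisfy all vascular beds distal to the aortic arch before well-oxygenated blood reaches the arch vessels that perfuse the brain. A hedged sketch, with hypothetical flow values:

```python
def mixing_zone_reaches_arch(ecmo_flow, total_demand, distal_fraction):
    """Toy flow balance for femoral VA ECMO (illustrative only).

    ecmo_flow:       retrograde circuit flow (L/min)
    total_demand:    patient's ideal cardiac output (L/min)
    distal_fraction: fraction of total demand consumed by beds
                     distal to the arch branches (hypothetical)

    Returns True if circuit flow exceeds distal demand, i.e. the
    mixing zone has moved proximal to the arch vessels so the brain
    receives well-oxygenated circuit blood.
    """
    distal_demand = distal_fraction * total_demand
    return ecmo_flow > distal_demand

# With ~70% of output going to beds distal to the arch, a patient
# with a 5 L/min ideal cardiac output needs >3.5 L/min of circuit
# flow before circuit blood reaches the cerebral vessels.
print(mixing_zone_reaches_arch(ecmo_flow=4.0, total_demand=5.0, distal_fraction=0.7))  # → True
print(mixing_zone_reaches_arch(ecmo_flow=3.0, total_demand=5.0, distal_fraction=0.7))  # → False
```

In this simplified view, anatomy enters through `distal_fraction`: geometries that route more flow to distal beds push the required ECMO support toward the upper end of the reported 67-97% range.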
ABSTRACT
Interconnected magnetic nanowire (NW) networks offer a promising platform for three-dimensional (3D) information storage and integrated neuromorphic computing. Here we report discrete propagation of magnetic states in interconnected Co nanowire networks driven by magnetic field and current, manifested in distinct magnetoresistance (MR) features. In these networks, when only a few interconnected NWs were measured, multiple MR kinks and local minima were observed, including a significant minimum at a positive field during the descending field sweep. Micromagnetic simulations showed that this unusual feature was due to domain wall (DW) pinning at the NW intersections, which was confirmed by off-axis electron holography imaging. In a complex network with many intersections, sequential switching of nanowire sections separated by interconnects was observed, along with stochastic characteristics. The pinning/depinning of the DWs can be further controlled by the driving current density. These results illustrate the promise of such interconnected networks as integrated multistate memristors.
ABSTRACT
BACKGROUND: The plasma cell disorders (PCDs) multiple myeloma (MM) and light-chain amyloidosis (AL) are disproportionately diseases of older adults, whose care may be complicated by frailty associated with advancing age. We sought to evaluate the prevalence of functional deficits and symptoms in a cohort of persons with PCDs, and the associations of demographic, disease-related, functional, and psychosocial measures with quality of life (QoL). PATIENTS AND METHODS: Adults with PCDs were recruited into an observational registry in 2018-2020. Patients completed a functional assessment and the European Organization for Research and Treatment of Cancer QoL questionnaire (QLQ-C30). Associations of covariates of interest with QoL were evaluated via univariate linear regression. RESULTS: Among 121 adults, the mean age was 68.6 years. Diagnoses were 74% MM, 14% AL, 7% both MM and AL, and 5% other PCDs. The median time from diagnosis was 34.9 months. The median number of lines of therapy was 2, with 11% having received ≥4th-line therapy. Patients with functional deficits had lower mean QoL scores: dependence in IADLs (66.3 vs. 79.9, P = .001) and recent falls (56.7 vs. 76.8, P = .001). Patients ≤6 months from diagnosis had lower QoL (66.7) than those ≥2 years from diagnosis (77.3, P = .03). However, patients on later lines of therapy (≥4th line) had lower QoL (62.2) than those on 1st-line treatment (76.0, P = .04). CONCLUSIONS: Patients with physical impairments and more advanced PCDs had lower QoL than those without deficits or earlier in their disease course. Early identification of physical impairments may facilitate interventions that mitigate these deficits and thereby improve QoL for patients with PCDs.
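When the covariate in a univariate linear regression is a binary indicator (e.g. IADL dependence), the fitted slope is exactly the difference in group means, which is why the analysis above can report group-mean comparisons. A minimal sketch with purely hypothetical QoL scores (not the registry data):

```python
import statistics

def univariate_ols(y, x):
    """Univariate ordinary least squares fit of y = a + b*x.

    With a binary x, the intercept a is the mean of the x = 0 group
    and the slope b is the difference in group means.
    """
    mx, my = statistics.fmean(x), statistics.fmean(y)
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    a = my - b * mx
    return a, b

# Hypothetical scores: x = 1 if dependent in IADLs, else 0.
x = [0, 0, 0, 1, 1, 1]
y = [80.0, 78.0, 82.0, 66.0, 64.0, 68.0]
a, b = univariate_ols(y, x)
print(a, b)  # → 80.0 -14.0  (intercept = x=0 group mean; slope = mean difference)
```

The p-values reported in the abstract would come from the usual t-test on the slope, which for a binary covariate coincides with a two-sample t-test on the group means.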