Results 1 - 20 of 37
1.
Int J Cancer ; 154(3): 516-529, 2024 Feb 01.
Article in English | MEDLINE | ID: mdl-37795630

ABSTRACT

Individuals with a family history of colorectal cancer (CRC) may benefit from early screening with colonoscopy or immunologic fecal occult blood testing (iFOBT). We systematically evaluated the benefit-harm trade-offs of various screening strategies differing by screening test (colonoscopy or iFOBT), interval (iFOBT: annual/biennial; colonoscopy: 10-yearly) and age at start (30, 35, 40, 45, 50 and 55 years) and end of screening (65, 70 and 75 years) offered to individuals identified with familial CRC risk in Germany. A Markov state-transition model was developed and used to estimate health benefits (CRC-related deaths avoided, life-years gained [LYG]), potential harms (eg, associated with additional colonoscopies) and incremental harm-benefit ratios (IHBR) for each strategy. Both benefits and harms increased with earlier start and shorter intervals of screening. When screening started before age 50, 32-36 CRC-related deaths per 1000 persons were avoided with colonoscopy and 29-34 with iFOBT screening, compared to 29-31 (colonoscopy) and 28-30 (iFOBT) CRC-related deaths per 1000 persons when starting at age 50 or older. For iFOBT screening, the IHBRs expressed as additional colonoscopies per LYG were one (biennial, age 45-65 vs no screening), four (biennial, age 35-65), six (biennial, age 30-70) and 34 (annual, age 30-54; biennial, age 55-75). Corresponding IHBRs for 10-yearly colonoscopy were four (age 55-65), 10 (age 45-65), 15 (age 35-65) and 29 (age 30-70). Offering screening with colonoscopy or iFOBT to individuals with familial CRC risk before age 50 is expected to be beneficial. Depending on the accepted IHBR threshold, 10-yearly colonoscopy or, alternatively, biennial iFOBT from age 30 to 70 should be recommended for this target group.


Subject(s)
Colorectal Neoplasms, Early Detection of Cancer, Humans, Middle Aged, Aged, Adult, Colorectal Neoplasms/diagnosis, Colorectal Neoplasms/epidemiology, Colorectal Neoplasms/genetics, Colonoscopy, Mass Screening, Occult Blood, Cost-Benefit Analysis
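The incremental harm-benefit ratio (IHBR) logic used in this abstract (additional colonoscopies per life-year gained when stepping up to a more intensive strategy) can be sketched in a few lines. The strategy names and per-1000-person figures below are illustrative assumptions, not outputs of the study's model.

```python
def incremental_harm_benefit_ratios(strategies):
    """Order strategies by benefit and compute, for each step up the
    frontier, the extra harm incurred per extra life-year gained."""
    ordered = sorted(strategies, key=lambda s: s["benefit_lyg"])
    ratios = {}
    for prev, nxt in zip(ordered, ordered[1:]):
        d_benefit = nxt["benefit_lyg"] - prev["benefit_lyg"]
        d_harm = nxt["harm_colonoscopies"] - prev["harm_colonoscopies"]
        ratios[nxt["name"]] = d_harm / d_benefit  # colonoscopies per LYG
    return ratios

# Hypothetical per-1000-persons model outputs (illustrative only).
strategies = [
    {"name": "no screening",         "benefit_lyg": 0,   "harm_colonoscopies": 0},
    {"name": "biennial iFOBT 45-65", "benefit_lyg": 100, "harm_colonoscopies": 100},
    {"name": "biennial iFOBT 35-65", "benefit_lyg": 150, "harm_colonoscopies": 300},
]
print(incremental_harm_benefit_ratios(strategies))
```

With these made-up numbers, extending screening from age 45 to 35 costs four extra colonoscopies per additional life-year, mirroring how the IHBRs in the abstract rise as screening intensifies.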
2.
Gesundheitswesen ; 85(4): 234-241, 2023 Apr.
Article in German | MEDLINE | ID: mdl-34872119

ABSTRACT

BACKGROUND: Testicular cancer occurs mainly in young men between 25 and 45 years of age and is the most common cancer in this age group. Possible early detection measures for testicular cancer, clinical palpation and scrotal ultrasound (CUS) or testicular self-examination (TSE) in asymptomatic men aged 16 years and older, could potentially prevent deaths and aggressive late-stage therapies. We therefore investigated whether these measures offer an additional benefit compared with the current situation. Ethical, legal, social and organisational aspects were considered as well. METHODS: The methodology of this review follows IQWiG's "Allgemeine[n] Methoden Version 5.0". In addition, to estimate the theoretically possible benefits and potential harms of screening, a supplementary presentation based on published data from tumour registries and on predictive values from diagnostic studies was used for the benefit assessment. RESULTS: No intervention studies were identified; evidence-based statements on the additional benefit or harm of the studied interventions could therefore not be made. The epidemiological data showed that per 100,000 men participating in screening annually, a maximum of 1.2 advanced tumours and 0.4 deaths would have been preventable. Harm calculations suggest that with CUS of 100,000 men, 1 to 22 unnecessary testicular explorations or removals might be expected; with TSE, 2 such cases. However, these data on the possible harm of screening are subject to great uncertainty. CONCLUSIONS: There are no intervention studies demonstrating that the benefit of testicular cancer screening in men aged 16 years and older outweighs the harm. The maximum possible additional benefit is low, and chances of detection and cure are good even without screening. At present, testicular cancer screening cannot be recommended.


Subject(s)
Testicular Neoplasms, Male, Humans, Testicular Neoplasms/diagnosis, Early Detection of Cancer, Technology Assessment, Biomedical, Germany
3.
BMC Gastroenterol ; 19(1): 209, 2019 Dec 05.
Article in English | MEDLINE | ID: mdl-31805871

ABSTRACT

BACKGROUND: Clear evidence on the benefit-harm balance and cost effectiveness of population-based screening for colorectal cancer (CRC) is missing. We aim to systematically evaluate the long-term effectiveness, harms and cost effectiveness of different organized CRC screening strategies in Austria. METHODS: A decision-analytic cohort simulation model for colorectal adenoma and cancer with a lifelong time horizon was developed, calibrated to the Austrian epidemiological setting and validated against observed data. We compared four strategies: 1) No Screening, 2) FIT: annual immunochemical fecal occult blood test age 40-75 years, 3) gFOBT: annual guaiac-based fecal occult blood test age 40-75 years, and 4) COL: 10-yearly colonoscopy age 50-70 years. Predicted outcomes included: benefits expressed as life-years gained [LYG], CRC-related deaths avoided and CRC cases avoided; harms as additional complications due to colonoscopy (physical harm) and positive test results (psychological harm); and lifetime costs. Tradeoffs were expressed as incremental harm-benefit ratios (IHBR, incremental positive test results per LYG) and incremental cost-effectiveness ratios [ICER]. The perspective of the Austrian public health care system was adopted. Comprehensive sensitivity analyses were performed to assess uncertainty. RESULTS: The most effective strategies were FIT and COL. gFOBT was less effective and more costly than FIT. Moving from COL to FIT results in an incremental unintended psychological harm of 16 additional positive test results to gain one life-year. COL was cost saving compared to No Screening. Moving from COL to FIT has an ICER of 15,000 EUR per LYG. CONCLUSIONS: Organized CRC-screening with annual FIT or 10-yearly colonoscopy is most effective. The choice between these two options depends on the individual preferences and benefit-harm tradeoffs of screening candidates.


Subject(s)
Colonic Neoplasms/diagnosis, Rectal Neoplasms/diagnosis, Adult, Aged, Austria, Colonic Neoplasms/prevention & control, Colonic Neoplasms/psychology, Colonoscopy/adverse effects, Cost-Benefit Analysis, Guaiac, Humans, Indicators and Reagents, Markov Chains, Mass Screening/economics, Middle Aged, Occult Blood, Quality-Adjusted Life Years, Rectal Neoplasms/prevention & control, Rectal Neoplasms/psychology, Sensitivity and Specificity
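The strategy comparison in this abstract (gFOBT dominated by FIT; COL cost saving versus No Screening; an ICER of 15,000 EUR per LYG moving from COL to FIT) rests on standard dominance and incremental cost-effectiveness calculations. The sketch below uses invented cost and life-year figures, chosen only so the qualitative pattern matches the reported results.

```python
def remove_dominated(strategies):
    """A strategy is strongly dominated if some other strategy is at least
    as cheap and strictly more effective."""
    return [s for s in strategies
            if not any(o["cost"] <= s["cost"] and o["lyg"] > s["lyg"]
                       for o in strategies)]

def icers(strategies):
    """ICERs between successive non-dominated strategies, ordered by effect."""
    frontier = sorted(remove_dominated(strategies), key=lambda s: s["lyg"])
    return {nxt["name"]: (nxt["cost"] - prev["cost"]) / (nxt["lyg"] - prev["lyg"])
            for prev, nxt in zip(frontier, frontier[1:])}

# Invented per-person lifetime costs (EUR) and life-years, illustration only.
strategies = [
    {"name": "No Screening", "cost": 1000, "lyg": 10.00},
    {"name": "gFOBT",        "cost": 1400, "lyg": 10.02},  # dominated by FIT and COL
    {"name": "COL",          "cost":  900, "lyg": 10.05},  # cost saving vs No Screening
    {"name": "FIT",          "cost": 1200, "lyg": 10.07},
]
print(icers(strategies))
```

Here gFOBT and No Screening drop off the frontier (something cheaper and more effective exists), leaving the single ICER for FIT versus COL.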
4.
BMC Public Health ; 17(1): 596, 2017 06 26.
Article in English | MEDLINE | ID: mdl-28651567

ABSTRACT

BACKGROUND: A recent recalibration of the ONCOTYROL Prostate Cancer Outcome and Policy (PCOP) Model, assuming that latent prostate cancer (PCa) detectable at autopsy might be detectable by screening as well, resulted in considerable worsening of the benefit-harm balance of screening. In this study, we used the recalibrated model to assess the effects of familial risk, quality of life (QoL) preferences, age, and active surveillance. METHODS: Men with average and elevated familial PCa risk were simulated in separate models, differing in familial risk parameters. Familial risk was assumed to affect PCa onset and progression simultaneously in the base-case, and separately in scenario analyses. Evaluated screening strategies included one-time screening at different ages, and screening at different intervals and age ranges. Optimal screening strategies were identified depending on age and individual QoL preferences. Strategies were additionally evaluated with active surveillance by biennial re-biopsy, delaying treatment of localized cancer until grade progression to Gleason score ≥ 7. RESULTS: Screening men with average PCa risk reduced quality-adjusted life expectancy (QALE) even under favorable assumptions. Men with elevated familial risk, depending on age and disutilities, gained QALE. While annual screening to age 69 was the optimal strategy over most disutility ranges for men with familial risk aged 55 and 60 years, no screening was the preferred option for 65-year-old men with average or higher disutilities. Active surveillance greatly reduced overtreatment, but QALE gains from averted adverse events were offset by losses due to delayed treatment and additional biopsies. The effect of active surveillance on the benefit-harm balance of screening differed between populations, as the net QALE losses and gains predicted for screening without active surveillance in men with average and familial PCa risk, respectively, were both reduced. CONCLUSIONS: Assumptions about PCa risk and screen-detectable prevalence significantly affect the benefit-harm balance of screening. Based on the assumptions of our model, PCa screening should focus on candidates with familial predisposition, with consideration of individual QoL preferences and age. Active surveillance may require treatment initiation before Gleason score progression to 7. Alternative active surveillance strategies should be evaluated in further modeling studies.


Subject(s)
Early Detection of Cancer/methods, Mass Screening/methods, Prostatic Neoplasms/diagnosis, Quality of Life, Quality-Adjusted Life Years, Age Factors, Aged, Biopsy, Disease Progression, Disease Susceptibility, Family, Humans, Life Expectancy, Male, Middle Aged, Models, Biological, Neoplasm Grading, Policy, Prostatic Neoplasms/epidemiology, Risk Assessment
5.
Urol Int ; 94(4): 419-27, 2015.
Article in English | MEDLINE | ID: mdl-25662301

ABSTRACT

INTRODUCTION: Urinary and erectile functions were assessed using self-administered validated questionnaires in patients undergoing radical prostatectomy. MATERIALS AND METHODS: In a prospective observational study, a total of 253 consecutive patients diagnosed with clinically localised prostate cancer between 2008 and 2009 at the European Prostate Centre Innsbruck were included. Patient-reported outcomes were assessed before radical prostatectomy and 12 months postoperatively using the validated International Consultation on Incontinence Questionnaire (ICIQ) and the International Index of Erectile Function (IIEF). The Wilcoxon signed-rank test and chi-square statistics were used for analysis. RESULTS: Before radical prostatectomy, urinary incontinence of various severity grades was reported by 18.8% of patients, compared with 63.0% postoperatively (p < 0.001); erectile dysfunction of various degrees was reported by 39.6% at baseline compared with 80.1% 12 months postoperatively (p < 0.001). CONCLUSIONS: This study suggests that radical prostatectomy is associated with a significantly increased risk of urinary incontinence and erectile dysfunction 12 months postoperatively.


Subject(s)
Erectile Dysfunction/etiology, Prostatectomy/adverse effects, Prostatic Neoplasms/surgery, Self Report, Urinary Incontinence/etiology, Adult, Aged, Austria, Chi-Square Distribution, Erectile Dysfunction/diagnosis, Hospitals, University, Humans, Male, Middle Aged, Prospective Studies, Prostatectomy/methods, Prostatic Neoplasms/complications, Prostatic Neoplasms/pathology, Risk Assessment, Risk Factors, Severity of Illness Index, Time Factors, Treatment Outcome, Urinary Incontinence/diagnosis
6.
Gastroenterology ; 143(4): 974-85.e14, 2012 Oct.
Article in English | MEDLINE | ID: mdl-22863764

ABSTRACT

BACKGROUND & AIMS: The dynamics of hepatitis C virus (HCV) infection, as well as screening practices and access to therapy, vary among European countries. It is important to determine the magnitude of the effects of such differences on incidence and mortality of infection. We compared the dynamics of infection and screening and treatment practices among Belgium, France, Germany, Italy, Spain, and the United Kingdom. We also assessed the effects of treatment with pegylated interferon and additional effects of triple therapy with protease inhibitors. METHODS: We created a country-specific Markov model of HCV progression based on published epidemiologic data (on HCV prevalence, screening, genotype, alcohol consumption among patients, and treatments) and reports of competitive and hepatocellular carcinoma mortality for the 6 countries. The model was used to predict the incidence of HCV-related cirrhosis and its mortality until 2021 for each country. RESULTS: From 2002 to 2011, antiviral therapy reduced the cumulative incidence of cirrhosis by 7.1% and deaths by 3.4% overall. Reductions in incidence and mortality values ranged from 4.0% and 1.9%, respectively, in Italy to 16.3% and 9.0%, respectively, in France. From 2012 to 2021, antiviral treatment of patients with HCV genotype 1 infection that includes protease inhibitor-based triple therapy will reduce the cumulative incidence of cirrhosis by 17.7% and mortality by 9.7% overall. The smallest reduction is predicted for Italy (incidence reduced by 10.1% and mortality by 5.4%) and the highest is for France (reductions of 34.3% and 20.7%, respectively). CONCLUSIONS: Although HCV infection is treated with the same therapies in different countries, the effects of the therapies on morbidity and mortality vary significantly. In addition to common guidelines that are based on virologic response-guided therapy, there is a need for public health policies based on population-guided therapy.


Subject(s)
Antiviral Agents/therapeutic use, Carcinoma, Hepatocellular/epidemiology, Hepatitis C, Chronic/diagnosis, Hepatitis C, Chronic/drug therapy, Liver Cirrhosis/epidemiology, Liver Neoplasms/epidemiology, Protease Inhibitors/therapeutic use, Aged, Carcinoma, Hepatocellular/mortality, Carcinoma, Hepatocellular/virology, Disease Progression, Drug Therapy, Combination, Europe/epidemiology, Female, Hepacivirus/genetics, Hepatitis C, Chronic/complications, Humans, Incidence, Interferon-alpha/therapeutic use, Liver Cirrhosis/mortality, Liver Cirrhosis/virology, Liver Neoplasms/mortality, Liver Neoplasms/virology, Male, Markov Chains, Mass Screening, Middle Aged, Polyethylene Glycols/therapeutic use, Ribavirin/therapeutic use
7.
Dtsch Arztebl Int ; 120(44): 747-753, 2023 Nov 03.
Article in English | MEDLINE | ID: mdl-37656479

ABSTRACT

BACKGROUND: In this systematic review, we address the question of whether children and adolescents with developmental visual disorders benefit from computer-assisted visual training. METHODS: Systematic literature searches were carried out in three bibliographic databases (initial search in October 2021) and trial registries. Included were randomized controlled trials that evaluated the efficacy of computer-assisted visual training in children and adolescents with developmental visual disorders in comparison to no training, sham training, or conservative treatment. RESULTS: The inclusion criteria were met by 17 trials (with a total of 1323 children and adolescents) focusing on binocular or monocular computer-assisted visual training for the treatment of amblyopia. In these trials, visual training was carried out for 2 to 24 weeks, either as stand-alone therapy or in addition to occlusion therapy. Six trials showed a statistically significant difference in favor of the visual training for the outcome "best corrected visual acuity of the amblyopic eye." However, this difference was small and mostly below the threshold of clinical relevance of -0.05 logMAR (equivalent to an improvement of 0.5 lines on the eye chart, or 2.5 letters per line). Only limited data were available for the outcomes "binocular vision" and "adverse events"; the differences between the groups were similarly small. CONCLUSION: The currently available data do not permit any firm conclusions regarding the efficacy of visual training in children and adolescents with amblyopia. Moreover, treatment adherence was often insufficient, and the treatment durations in the trials were relatively short. No results from randomized trials have yet been published with respect to other developmental visual disorders (refractive errors, strabismus).


Subject(s)
Amblyopia, Refractive Errors, Child, Humans, Adolescent, Amblyopia/therapy, Visual Acuity, Vision Disorders/diagnosis, Vision Disorders/therapy, Computers, Randomized Controlled Trials as Topic
8.
Dtsch Arztebl Int ; 120(46): 786-792, 2023 11 17.
Article in English | MEDLINE | ID: mdl-37855423

ABSTRACT

BACKGROUND: Persons with a positive family history of colorectal cancer (CRC) are more likely than others to develop CRC and are also younger at disease onset. Nonetheless, the German Federal Joint Committee (G-BA, Gemeinsamer Bundesausschuss) recommends screening all persons aged 50 and above regardless of their family history. FARKOR was a project supported by the Innovation Fund of the G-BA to study the feasibility, efficacy, and safety of a risk-adapted early detection program for CRC among persons aged 25 to 50 without any specific past medical history. METHODS: Physicians in private practice in Bavaria documented their activities relating to FARKOR online. The FARKOR process comprised a declaration of consent, a simplified family history for CRC, an optional, more comprehensive family history, a counseling session for participatory decision-making on further measures, and various modalities of screening (an immunological fecal occult blood test [iFOBT], colonoscopy, or no screening). Related physician activities outside the FARKOR process were assessed by record linkage between study data and data of the patients' health insurance carriers. RESULTS: The simplified family history was documented in 25,847 persons and was positive for CRC in 5769 (22.3%). 3232 persons had a more comprehensive family history, among whom 2054 (63.6%) participated in screening measures. 1595 underwent colonoscopy; 278 persons who had already undergone colonoscopy in the preceding five years were excluded from the analysis. Colonoscopy revealed adenoma in 232 persons (17.6%), advanced adenoma in 78 (5.9%), and carcinoma in 4 (0.3%). There were no serious complications. CONCLUSION: The detection rates in this study corresponded to those of persons aged 55 to 59 in the current early detection program. Despite numerous problems in the performance of the study (inconsistencies in documentation, external performance of screening measures on program participants), the results support the feasibility of a risk-adapted early detection program in the young target population with a family history of CRC.


Subject(s)
Adenoma, Colorectal Neoplasms, Humans, Early Detection of Cancer/methods, Colorectal Neoplasms/diagnosis, Colorectal Neoplasms/genetics, Colonoscopy, Occult Blood, Adenoma/diagnosis, Mass Screening/methods
9.
Ger Med Sci ; 21: Doc06, 2023.
Article in English | MEDLINE | ID: mdl-37426885

ABSTRACT

Background: Stool DNA testing for early detection of colorectal cancer (CRC) is a non-invasive technology with the potential to supplement established CRC screening tests. The aim of this health technology assessment was to evaluate the effectiveness and safety of currently CE-marked stool DNA tests, compared to other CRC tests in CRC screening strategies in an asymptomatic screening population. Methods: The assessment was carried out following the guidelines of the European Network for Health Technology Assessment (EUnetHTA). This included a systematic literature search in MEDLINE, Cochrane and EMBASE in 2018. Manufacturers were asked to provide additional data. Five patient interviews helped assess potential ethical or social aspects and patients' experiences and preferences. We assessed the risk of bias using QUADAS-2, and the quality of the body of evidence using GRADE. Results: We identified three test accuracy studies, two of which investigated a multitarget stool DNA test (Cologuard®, compared to the fecal immunochemical test [FIT]) and one a combined DNA stool assay (ColoAlert®, compared to the guaiac-based fecal occult blood test [gFOBT], Pyruvate Kinase Isoenzyme Type M2 [M2-PK], and combined gFOBT/M2-PK). We found five published surveys on patient satisfaction. No primary study investigating screening effects on CRC incidence or on overall mortality was found. In direct comparison, both stool DNA tests showed higher sensitivity for the detection of CRC and (advanced) adenoma than FIT or gFOBT, respectively, but lower specificity. However, these comparative results may depend on the exact type of FIT used. The reported test failure rates were higher for stool DNA testing than for FIT. The certainty of evidence was moderate to high for the Cologuard® studies, and low to very low for the ColoAlert® study, which refers to a former version of the product and yielded no direct evidence on test accuracy for advanced versus non-advanced adenoma. Conclusions: ColoAlert® is the only stool DNA test currently sold in Europe and is available at a lower price than Cologuard®, but reliable evidence is lacking. A screening study including the current product version of ColoAlert® and suitable comparators would therefore help evaluate the effectiveness of this screening option in a European context.


Subject(s)
Adenoma, Colorectal Neoplasms, Humans, Adenoma/diagnosis, Colorectal Neoplasms/diagnosis, Colorectal Neoplasms/genetics, DNA, Neoplasm, Early Detection of Cancer/methods, Guaiac, Mass Screening/methods, Occult Blood, Technology Assessment, Biomedical
10.
Malar J ; 11: 212, 2012 Jun 22.
Article in English | MEDLINE | ID: mdl-22720832

ABSTRACT

BACKGROUND: Malaria continues to be amongst the most frequent infectious diseases imported to Europe. Whilst European treatment guidelines are based on data from studies carried out in endemic areas, there is a paucity of original prospective treatment data. The objective was to summarize data on treatments to harmonize and optimize treatment for uncomplicated malaria in Europe. METHODS: A prospective observational multicentre study was conducted, assessing tolerance and efficacy of treatment regimens for imported uncomplicated falciparum malaria in adults amongst European centres of tropical and travel medicine. RESULTS: Between December 2003 and 2009, 504 patients were included in 16 centres from five European countries. Eighteen treatment regimens were reported, the top three being atovaquone-proguanil, mefloquine, and artemether-lumefantrine. Treatments significantly differed with respect to the occurrence of treatment changes (p = 0.005) and adverse events (p = 0.001), parasite and fever clearance times (p < 0.001), and hospitalization rates (p = 0.0066) and durations (p = 0.001). Four recrudescences and two progressions to severe disease were observed. Compared to other regimens, quinine alone was associated with more frequent switches to second line treatment, more adverse events and longer inpatient stays. Parasite and fever clearance times were shortest with artemether-mefloquine combination treatment. Vomiting was the most frequent cause of treatment change, occurring in 5.5% of all patients but 9% of the atovaquone-proguanil group. CONCLUSIONS: This study highlights the heterogeneity of standards of care within Europe. A consensus discussion at European level is desirable to foster a standardized management of imported falciparum malaria.


Subject(s)
Antimalarials/therapeutic use, Malaria/drug therapy, Adolescent, Adult, Aged, Artemether, Lumefantrine Drug Combination, Artemisinins/therapeutic use, Atovaquone/therapeutic use, Drug Combinations, Drug Therapy/methods, Drug Therapy/standards, Ethanolamines/therapeutic use, Europe, Female, Fluorenes/therapeutic use, Humans, Male, Mefloquine/therapeutic use, Middle Aged, Proguanil/therapeutic use, Prospective Studies, Young Adult
11.
PLoS One ; 17(5): e0265957, 2022.
Article in English | MEDLINE | ID: mdl-35499997

ABSTRACT

BACKGROUND AND OBJECTIVE: The distribution of the newly developed vaccines presents a great challenge in the ongoing SARS-CoV-2 pandemic. Policy makers must decide which subgroups should be vaccinated first to minimize the negative consequences of the pandemic. These decisions must be made upfront and under uncertainty regarding the number of vaccine doses available at a given time. The objective of the present work was to develop an iterative optimization algorithm that provides a prioritization order of predefined subgroups. The results of this algorithm should be optimal but also robust with respect to potentially limited vaccine supply. METHODS: We present an optimization meta-heuristic which can be used in a classic simulation-optimization setting with a simulation model in a feedback loop. The meta-heuristic can be applied in combination with any epidemiological simulation model capable of depicting the effects of vaccine distribution to the modeled population; it accepts a vaccine prioritization plan in a certain notation as input and generates decision-relevant variables such as COVID-19-caused deaths or hospitalizations as output. We finally demonstrate the mechanics of the algorithm by presenting the results of a case study performed with an epidemiological agent-based model. RESULTS: We show that the developed method generates a highly robust vaccination prioritization plan which is proven to fulfill an elegant supremacy criterion: the plan is equally optimal for any quantity of vaccine doses available. The algorithm was tested in a case study in the Austrian context, where it generated a vaccination prioritization plan favoring individuals aged 65+, followed by vulnerable groups, to minimize COVID-19-related burden. DISCUSSION: The results of the case study coincide with the international policy recommendations, which strengthens the applicability of the approach. We conclude that the path-dependent optimum provided by the algorithm is well suited for real-world applications, in which decision makers need to develop strategies upfront under high levels of uncertainty.


Subject(s)
COVID-19, Influenza Vaccines, Influenza, Human, Aged, Algorithms, COVID-19/epidemiology, COVID-19/prevention & control, COVID-19 Vaccines, Humans, Influenza, Human/epidemiology, SARS-CoV-2, Vaccination
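A prioritization plan that remains optimal for any vaccine quantity can be built greedily: extend the order one subgroup at a time, always adding the group whose vaccination most reduces the simulated burden. The toy sketch below uses a made-up burden function as a stand-in for the epidemiological simulation model; it illustrates the feedback-loop idea only and is not the authors' actual meta-heuristic.

```python
def toy_burden(vaccinated,
               risk={"65+": 10.0, "vulnerable": 5.0, "general": 1.0}):
    """Toy stand-in for the simulation model: burden is the summed risk
    of the still-unvaccinated groups (invented numbers)."""
    return sum(r for g, r in risk.items() if g not in vaccinated)

def greedy_priority_order(groups, simulate):
    """Build the priority order step by step; each prefix of the result
    is itself a burden-minimizing allocation, which is what makes the
    plan robust to an unknown total dose supply."""
    order, chosen = [], set()
    while len(order) < len(groups):
        best = min((g for g in groups if g not in chosen),
                   key=lambda g: simulate(chosen | {g}))
        chosen.add(best)
        order.append(best)
    return order

print(greedy_priority_order(["general", "65+", "vulnerable"], toy_burden))
```

With these invented risks the order comes out as 65+ first, then vulnerable groups, then the general population, matching the pattern reported in the case study.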
12.
J Clin Epidemiol ; 152: 269-280, 2022 Dec.
Article in English | MEDLINE | ID: mdl-36252741

ABSTRACT

BACKGROUND AND OBJECTIVES: Drawing causal conclusions from real-world data (RWD) poses methodological challenges and risk of bias. We aimed to systematically assess the type and impact of potential biases that may occur when analyzing RWD using the case of progressive ovarian cancer. METHODS: We retrospectively compared overall survival with and without second-line chemotherapy (LOT2) using electronic medical records. Potential biases were determined using directed acyclic graphs. We followed a stepwise analytic approach ranging from crude analysis and multivariable-adjusted Cox model up to a full causal analysis using a marginal structural Cox model with replicates emulating a reference randomized controlled trial (RCT). To assess biases, we compared effect estimates (hazard ratios [HRs]) of each approach to the HR of the reference trial. RESULTS: The reference trial showed an HR for second line vs. delayed therapy of 1.01 (95% confidence interval [95% CI]: 0.82-1.25). The corresponding HRs from the RWD analysis ranged from 0.51 for simple baseline adjustments to 1.41 (95% CI: 1.22-1.64) accounting for immortal time bias with time-varying covariates. Causal trial emulation yielded an HR of 1.12 (95% CI: 0.96-1.28). CONCLUSION: Our study, using ovarian cancer as an example, shows the importance of a thorough causal design and analysis if one is expecting RWD to emulate clinical trial results.


Subject(s)
Ovarian Neoplasms, Humans, Female, Bias, Treatment Outcome, Ovarian Neoplasms/drug therapy
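The immortal time bias discussed in this abstract is easy to reproduce in a toy simulation: if patients count as "treated" from baseline whenever they eventually start second-line therapy, the treated group necessarily contains only patients who survived long enough to start it, so even a null treatment looks protective. All numbers below are simulated; this is an illustration of the bias, not of the study's data.

```python
import random

random.seed(1)

# Simulate survival with NO treatment effect at all.
patients = []
for _ in range(10_000):
    survival = random.expovariate(1 / 12)   # survival in months, mean 12
    planned_start = random.uniform(0, 12)   # when 2nd-line therapy would begin
    treated = survival > planned_start      # must be alive to ever start it
    patients.append((survival, treated))

def mean_survival(rows, treated):
    vals = [s for s, t in rows if t is treated]
    return sum(vals) / len(vals)

naive_treated = mean_survival(patients, True)
naive_untreated = mean_survival(patients, False)
print(f"naive 'treated'   mean survival: {naive_treated:.1f} months")
print(f"naive 'untreated' mean survival: {naive_untreated:.1f} months")
```

The "treated" group appears to survive far longer purely by construction; classifying exposure time-dependently (person-time before therapy start counts as unexposed), as in the marginal structural model above, removes this artifact.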
13.
Eur J Health Econ ; 22(8): 1311-1344, 2021 Nov.
Article in English | MEDLINE | ID: mdl-34342797

ABSTRACT

OBJECTIVES: Benefit and cost effectiveness of breast cancer screening are still matters of controversy. Risk-adapted strategies are proposed to improve its benefit-harm and cost-benefit relations. Our objective was to perform a systematic review on economic breast cancer models evaluating primary and secondary prevention strategies in the European health care setting, with specific focus on model results, model characteristics, and risk-adapted strategies. METHODS: Literature databases were systematically searched for economic breast cancer models evaluating the cost effectiveness of breast cancer screening and prevention strategies in the European health care context. Characteristics, methodological details and results of the identified studies are reported in evidence tables. Economic model outputs are standardized to achieve comparable cost-effectiveness ratios. RESULTS: Thirty-two economic evaluations of breast cancer screening and seven evaluations of primary breast cancer prevention were included. Five screening studies and none of the prevention studies considered risk-adapted strategies. Studies differed in methodologic features. Only about half of the screening studies modeled overdiagnosis-related harms, most often indirectly and without reporting their magnitude. All models predict gains in life expectancy and/or quality-adjusted life expectancy at acceptable costs. However, risk-adapted screening was shown to be more effective and efficient than conventional screening. CONCLUSIONS: Economic models suggest that breast cancer screening and prevention are cost effective in the European setting. All screening models predict gains in life expectancy, which has not yet been confirmed by trials. European models evaluating risk-adapted screening strategies are rare, but suggest that risk-adapted screening is more effective and efficient than conventional screening.


Subject(s)
Breast Neoplasms, Breast Neoplasms/diagnosis, Breast Neoplasms/prevention & control, Cost-Benefit Analysis, Early Detection of Cancer, Female, Humans, Models, Economic, Quality-Adjusted Life Years
14.
Thyroid ; 31(3): 494-508, 2021 03.
Article in English | MEDLINE | ID: mdl-32847437

ABSTRACT

Background: Iodine deficiency is one of the most prevalent causes of intellectual disability and can lead to impaired thyroid function and other iodine deficiency disorders (IDDs). Despite progress made on eradicating iodine deficiency in the last decades in Europe, IDDs are still prevalent. Currently, evidence-based information on the benefit/harm balance of IDD prevention in Europe is lacking. We developed a decision-analytic model and conducted a public health decision analysis for the long-term net benefit of a mandatory IDD prevention program for the German population with moderate iodine deficiency, as a case example for a European country. Methods: We developed a decision-analytic Markov model simulating the incidence and consequences of IDDs in the absence or presence of a mandatory IDD prevention program (iodine fortification of salt) in an open population with current demographic characteristics in Germany and with moderate ID. We collected data on the prevalence, incidence, mortality, and quality of life from European studies for all health states of the model. Our primary net-benefit outcome was quality-adjusted life years (QALYs) predicted over a period of 120 years. In addition, we calculated incremental life years and disease events over time. We performed a systematic and comprehensive uncertainty assessment using multiple deterministic one-way sensitivity analyses. Results: In the base-case analysis, the IDD prevention program is more beneficial than no prevention, both in terms of QALYs and life years. Health gains predicted for the open cohort over a time horizon of 120 years for the German population (82.2 million inhabitants) were 33 million QALYs and 5 million life years. Nevertheless, prevention is not beneficial for all individuals since it causes additional hyperthyroidism (2.7 million additional cases). Results for QALY gains were stable in sensitivity analyses. 
Conclusions: IDD prevention via mandatory iodine fortification of salt increases quality-adjusted life expectancy in a European population with moderate ID, and is therefore beneficial on a population level. However, further ethical aspects should be considered before implementing a mandatory IDD prevention program. Costs for IDD prevention and treatment should be determined to evaluate the cost effectiveness of IDD prevention.
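The abstract describes a Markov model that accumulates QALYs over 120 one-year cycles. A minimal sketch of that mechanism, with entirely hypothetical states, transition probabilities, and utility weights (the published model's structure and inputs are not reproduced here):

```python
import numpy as np

# Hypothetical three-state Markov cohort: Healthy -> Goiter -> Dead.
# Transition probabilities and utilities are illustrative placeholders,
# not values from the published model.
P = np.array([
    [0.97, 0.02, 0.01],  # from Healthy
    [0.00, 0.95, 0.05],  # from Goiter
    [0.00, 0.00, 1.00],  # from Dead (absorbing)
])
utilities = np.array([1.0, 0.85, 0.0])  # QALY weight per state per cycle

cohort = np.array([1.0, 0.0, 0.0])  # entire cohort starts healthy
qalys = 0.0
for _ in range(120):                 # 120 one-year cycles
    qalys += cohort @ utilities      # accumulate utility-weighted person-time
    cohort = cohort @ P              # advance the cohort one cycle

print(round(qalys, 2))
```

Comparing two such runs, one with and one without the prevention program's transition probabilities, yields the incremental QALYs and life years the abstract reports.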


Subject(s)
Decision Support Techniques , Deficiency Diseases/prevention & control , Iodine/administration & dosage , Sodium Chloride, Dietary/administration & dosage , Deficiency Diseases/diagnosis , Deficiency Diseases/epidemiology , Germany/epidemiology , Humans , Incidence , Iodine/adverse effects , Iodine/deficiency , Life Expectancy , Markov Chains , Predictive Value of Tests , Prevalence , Quality-Adjusted Life Years , Risk Assessment , Risk Factors , Sodium Chloride, Dietary/adverse effects , Time Factors , Treatment Outcome
15.
Endocr Connect ; 10(1): 1-12, 2021 Jan.
Article in English | MEDLINE | ID: mdl-33263563

ABSTRACT

OBJECTIVE: More than 30% of the German population suffers from mild to moderate iodine deficiency, causing goiter and other iodine deficiency disorders (IDDs). The economic burden of iodine deficiency remains unclear. We aimed to assess the costs of prevention, monitoring and treatment of IDDs in Germany. DESIGN: We performed a comprehensive cost analysis. METHODS: We assessed direct medical and direct non-medical costs for inpatient and outpatient care of IDDs, as well as costs of productivity loss due to absence from work, in 2018. Additionally, we calculated total costs for an IDD prevention program comprising universal salt iodization (USI). We performed threshold analyses projecting how many cases of IDDs or related treatments would need to be avoided for USI to be cost-saving. RESULTS: Annual average costs per case in the year of diagnosis were € 211 for goiter/thyroid nodules, € 308 for hyperthyroidism, and € 274 for hypothyroidism. Average one-time costs were € 4184 for thyroidectomy and € 3118 for radioiodine therapy. Average costs for one case of spontaneous abortion were € 916. Annual costs of intellectual disability were € 14,202. In the German population, the total annual cost of USI would amount to 8 million Euro. To be cost-saving, USI would need to prevent, for example, 37,900 cases of goiter/thyroid nodules. CONCLUSION: USI is potentially cost-saving if a minimum number of IDD cases per year is avoided. Before the implementation of USI can be recommended, a full health-economic evaluation including a comprehensive benefit-harm assessment is needed.
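The threshold analysis in this abstract is a simple division: the annual program cost spread over the per-case cost of the condition avoided. Using the figures quoted above (€ 8 million per year, € 211 per goiter/thyroid-nodule case), the exact break-even count is 37,915 cases; the abstract evidently rounds this to 37,900:

```python
from math import ceil

# Threshold sketch: how many goiter/thyroid-nodule cases must USI prevent
# per year to offset its annual programme cost? Both inputs are taken
# directly from the abstract.
annual_usi_cost_eur = 8_000_000   # total annual cost of USI in Germany
cost_per_case_eur = 211           # annual cost per goiter/thyroid-nodule case

threshold_cases = ceil(annual_usi_cost_eur / cost_per_case_eur)
print(threshold_cases)  # 37915, i.e. ~37,900 cases per year
```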

16.
Vaccines (Basel) ; 9(5)2021 Apr 27.
Article in English | MEDLINE | ID: mdl-33925650

ABSTRACT

(1) Background: The Austrian supply of COVID-19 vaccine is currently limited. We aim to provide evidence-based guidance to the authorities in order to minimize COVID-19-related hospitalizations and deaths in Austria. (2) Methods: We used a dynamic agent-based population model to compare different vaccination strategies targeted to the elderly (≥65 years), middle-aged (45-64 years), younger (15-44 years), vulnerable (at risk of severe disease due to comorbidities), and healthcare workers (HCW). First, outcomes were optimized for an initially available vaccine batch for 200,000 individuals. Second, stepwise optimization was performed to derive a prioritization sequence for 2.45 million individuals, maximizing the reduction in total hospitalizations and deaths compared to no vaccination. We considered both sterilizing and non-sterilizing immunity, assuming 70% effectiveness. (3) Results: The maximum reduction in hospitalizations and deaths was achieved by starting vaccination with the elderly and vulnerable, followed by middle-aged individuals, HCW, and younger individuals. Optimization for vaccinating 2.45 million individuals yielded the same prioritization and avoided approximately one third of deaths and hospitalizations. Starting vaccination with HCW leads to slightly smaller reductions but maximizes occupational safety. (4) Conclusion: To minimize COVID-19-related hospitalizations and deaths, our study shows that elderly and vulnerable persons should be prioritized for vaccination until further vaccines are available.
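The "stepwise optimization" described here amounts to a greedy allocation: each batch goes to the group whose vaccination yields the largest remaining reduction in hospitalizations plus deaths. The actual study uses a dynamic agent-based model to compute those reductions; the sketch below only illustrates the greedy sequencing step, with purely hypothetical per-group benefit numbers:

```python
# Stepwise prioritization sketch. Group names follow the abstract; the
# benefit values are hypothetical placeholders, not model outputs. In the
# real analysis each value would come from an agent-based simulation run.
benefit_per_group = {
    "elderly (>=65)": 900,
    "vulnerable": 700,
    "middle-aged (45-64)": 400,
    "healthcare workers": 300,
    "younger (15-44)": 150,
}

sequence = []
remaining = dict(benefit_per_group)
while remaining:
    best = max(remaining, key=remaining.get)  # greedy: largest remaining benefit
    sequence.append(best)
    del remaining[best]

print(sequence)
```

With dynamic transmission effects, the benefit of vaccinating one group depends on who was vaccinated before it, which is why the study re-optimizes at each step rather than ranking groups once.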

17.
BMC Public Health ; 9: 34, 2009 Jan 22.
Article in English | MEDLINE | ID: mdl-19161623

ABSTRACT

BACKGROUND: Hepatitis C virus (HCV) is a leading cause of chronic liver disease, end-stage cirrhosis, and liver cancer, but little is known about the burden of disease caused by the virus. We summarised the burden of disease data presently available for Europe, compared the data to current expert estimates, and identified areas in which better data are needed. METHODS: Literature and international health databases were systematically searched for HCV-specific burden of disease data, including incidence, prevalence, mortality, disability-adjusted life-years (DALYs), and liver transplantation. Data were collected for the WHO European region, with emphasis on 22 countries. If HCV-specific data were unavailable, they were calculated via HCV-attributable fractions. RESULTS: HCV-specific burden of disease data for Europe are scarce. Incidence data provided by national surveillance are not fully comparable and need to be standardised. HCV prevalence data are often inconclusive. According to available data, an estimated 7.3-8.8 million people (1.1-1.3%) are infected in our 22 focus countries. HCV-specific mortality, DALY, and transplantation data are unavailable. Estimates via HCV-attributable fractions indicate that HCV caused more than 86,000 deaths and 1.2 million DALYs in the WHO European region in 2002. Most of the DALYs (95%) were accumulated by patients in preventable disease stages. About one quarter of the liver transplants performed in 25 European countries in 2004 were attributable to HCV. CONCLUSION: Our results indicate that hepatitis C is a major health problem and highlight the importance of timely antiviral treatment. However, data on the burden of disease of hepatitis C in Europe are scarce, outdated or inconclusive, which indicates that hepatitis C is still a neglected disease in many countries. What is needed are public awareness, co-ordinated action plans, and better data. European physicians should be aware that many infections are still undetected, and should provide timely testing and antiviral treatment and avoid iatrogenic transmission.
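The abstract estimates HCV-specific burden "via HCV-attributable fractions". One common formulation (an assumption here; the paper's exact method is not given) is the population attributable fraction PAF = p(RR - 1) / (p(RR - 1) + 1), multiplied by the total disease burden. All numbers below are hypothetical illustrations, not figures from the study:

```python
# Attributable-fraction sketch. p is the exposure (HCV) prevalence and RR
# the relative risk of the outcome among the exposed; both values below
# are hypothetical, chosen only to illustrate the calculation.
def paf(prevalence: float, relative_risk: float) -> float:
    excess = prevalence * (relative_risk - 1.0)
    return excess / (excess + 1.0)

total_cirrhosis_deaths = 100_000   # hypothetical total deaths from the outcome
frac = paf(prevalence=0.012, relative_risk=20.0)
hcv_attributable = total_cirrhosis_deaths * frac
print(round(frac, 3), round(hcv_attributable))  # 0.186 18567
```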


Subject(s)
Hepacivirus/isolation & purification , Hepatitis C/epidemiology , Hepatitis C/therapy , Liver Cirrhosis/epidemiology , Liver Neoplasms/epidemiology , Antiviral Agents/administration & dosage , Combined Modality Therapy , Disease Progression , Europe/epidemiology , Female , Hepatitis C/diagnosis , Hepatitis C, Chronic/diagnosis , Hepatitis C, Chronic/epidemiology , Hepatitis C, Chronic/therapy , Humans , Incidence , Liver Cirrhosis/therapy , Liver Cirrhosis/virology , Liver Neoplasms/therapy , Liver Neoplasms/virology , Liver Transplantation , Male , Morbidity/trends , Prevalence , Prognosis , Risk Assessment , Severity of Illness Index , Survival Analysis
18.
Eur J Public Health ; 19(3): 245-53, 2009 Jun.
Article in English | MEDLINE | ID: mdl-19196737

ABSTRACT

BACKGROUND: Hepatitis C virus (HCV) infection is an emerging public health problem. In most countries, the majority of HCV-infected people are as yet undiagnosed. Early detection and treatment may result in better health outcomes and save costs by preventing future advanced liver disease. We systematically reviewed the evidence for the long-term effectiveness and cost-effectiveness of HCV screening. METHODS: We performed a systematic literature search on the long-term health-economic effects of HCV screening and included Health Technology Assessment (HTA) reports, systematic reviews, long-term clinical trials, and full health-economic and decision-analytic modelling studies with a sufficiently long time horizon and patient-relevant long-term outcomes such as life-years gained (LYG) or quality-adjusted life years (QALYs) gained. Economic results were converted to 2005 Euros. RESULTS: Seven studies were included. Target population, HCV prevalence, study perspective, discount rate, and screening and antiviral treatment mode varied. The incremental effectiveness of HCV screening and early treatment compared with no screening and standard care varied from 0.0004 to 0.066 LYG, and from 0.0001 to 0.072 QALYs. Incremental cost-effectiveness and cost-utility ratios of HCV screening vs. no screening were 3900-243,700 euro/LYG and 18,300-1,151,000 euro/QALY. HCV screening appears to be cost-effective in populations with high HCV prevalence, but not in low-prevalence populations. CONCLUSIONS: HCV screening and early treatment have the potential to improve average life expectancy, but should focus on populations with elevated HCV prevalence to be cost-effective. Further research is needed on the long-term health-economic impact of HCV screening combined with appropriate monitoring strategies in different European health care systems.
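The cost-effectiveness ratios quoted above follow the standard incremental formula ICER = (C_screen - C_none) / (E_screen - E_none). A minimal sketch, with hypothetical cost and effect values chosen only to land inside the 3900-243,700 euro/LYG range reported in the review:

```python
# ICER sketch: incremental cost per incremental unit of effect (here, LYG).
# All inputs are hypothetical, not figures from any of the seven studies.
def icer(cost_new: float, cost_old: float,
         effect_new: float, effect_old: float) -> float:
    return (cost_new - cost_old) / (effect_new - effect_old)

euro_per_lyg = icer(cost_new=1200.0, cost_old=1000.0,
                    effect_new=0.02, effect_old=0.0)
print(euro_per_lyg)  # 10000.0 euro per life-year gained
```

Note how small incremental effects (here 0.02 LYG, well within the review's 0.0004-0.066 range) amplify modest cost differences into large ratios, which is why the ratios vary so widely across prevalence settings.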


Subject(s)
Hepatitis C/diagnosis , Hepatitis C/prevention & control , Mass Screening/economics , Communicable Diseases, Emerging/diagnosis , Communicable Diseases, Emerging/economics , Communicable Diseases, Emerging/epidemiology , Communicable Diseases, Emerging/prevention & control , Cost-Benefit Analysis , Europe/epidemiology , Hepatitis C/economics , Hepatitis C/epidemiology , Humans , Mass Screening/standards , Quality-Adjusted Life Years
19.
Malar J ; 6: 114, 2007 Aug 23.
Article in English | MEDLINE | ID: mdl-17716367

ABSTRACT

A comparison was made between local malaria transmission and malaria imported by travellers to assess the utility of the national and regional annual parasite index (API) in predicting malaria risk and its value in generating recommendations on malaria prophylaxis for travellers. Regional malaria transmission data were correlated with malaria acquired in Latin America and imported into the USA and nine European countries. Between 2000 and 2004, most countries reported declining malaria transmission. The highest APIs in 2003/4 were in Surinam (287.4), Guyana (209.2) and French Guiana (147.4). The major sources of travel-associated malaria were Honduras, French Guiana, Guatemala, Mexico and Ecuador. During 2004 there were 6.3 million visits from the ten study countries, and in 2005 there were 209 cases of malaria, of which 22 (11%) were Plasmodium falciparum. The risk of adverse events is high and the benefit of avoiding benign vivax malaria is very low under current policy, which may be causing more harm than benefit.
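The annual parasite index is conventionally defined as confirmed malaria cases per 1,000 population at risk per year. A small sketch; the case and population figures below are invented solely to reproduce the Surinam API of 287.4 quoted in the abstract, and are not actual surveillance data:

```python
# API sketch: confirmed cases per 1,000 population at risk per year.
def annual_parasite_index(confirmed_cases: int, population_at_risk: int) -> float:
    return 1000.0 * confirmed_cases / population_at_risk

# Hypothetical inputs chosen to match the quoted Surinam value.
api = annual_parasite_index(confirmed_cases=1437, population_at_risk=5000)
print(round(api, 1))  # 287.4
```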


Subject(s)
Malaria/prevention & control , Travel , Central America/epidemiology , Chemoprevention , Europe/epidemiology , Humans , Malaria/epidemiology , Malaria/transmission , Pan American Health Organization , Risk Factors , South America/epidemiology , United States/epidemiology
20.
Eur Urol ; 72(6): 899-907, 2017 12.
Article in English | MEDLINE | ID: mdl-28844371

ABSTRACT

BACKGROUND: An increasing proportion of prostate cancer is being managed conservatively. However, there are no randomized trials or consensus regarding the optimal follow-up strategy. OBJECTIVE: To compare life expectancy and quality of life between watchful waiting (WW) and different strategies of active surveillance (AS). DESIGN, SETTING, AND PARTICIPANTS: A Markov model was created for US men, starting at age 50, diagnosed with localized prostate cancer who chose conservative management by WW or AS using different testing protocols (prostate-specific antigen every 3-6 mo, biopsy every 1-5 yr, or magnetic resonance imaging based). Transition probabilities and utilities were obtained from the literature. OUTCOME MEASUREMENTS AND STATISTICAL ANALYSIS: Primary outcomes were life years and quality-adjusted life years (QALYs). Secondary outcomes included radical treatment, metastasis, and prostate cancer death. RESULTS AND LIMITATIONS: All AS strategies yielded more life years than WW. Lifetime risks of prostate cancer death and metastasis were, respectively, 5.42% and 6.40% with AS versus 8.72% and 10.30% with WW. AS yielded more QALYs than WW except in cohorts aged >65 yr at diagnosis, or when treatment-related complications were long term. The preferred follow-up strategy was also sensitive to whether people value short-term over long-term benefits (time preference). Depending on the AS protocol, 30-41% of men underwent radical treatment within 10 yr. Extending the surveillance biopsy interval from 1 to 5 yr reduced life years slightly, with a 0.26 difference in QALYs. CONCLUSIONS: AS extends life more than WW, particularly for men with higher-risk features, but this is partly offset by the decrement in quality of life, since many men eventually receive treatment. PATIENT SUMMARY: More intensive active surveillance protocols extend life more than watchful waiting, but this is partly offset by decrements in quality of life from subsequent treatment.
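The sensitivity to "time preference" noted above is usually operationalized by discounting future utility at a rate r, so near-term QALYs count for more than distant ones. A minimal sketch; the utilities, horizon, and 3% rate are illustrative assumptions, not the model's inputs:

```python
# Discounted-QALY sketch: each year's utility is divided by (1 + r)^t, so
# benefits arriving later contribute less. Raising r shifts preference
# toward strategies whose gains arrive early.
def discounted_qalys(utilities, rate=0.03):
    return sum(u / (1.0 + rate) ** t for t, u in enumerate(utilities))

# Ten years at utility 0.9: undiscounted this would be 9.0 QALYs;
# discounting at 3% yields a smaller total.
q = discounted_qalys([0.9] * 10, rate=0.03)
print(round(q, 2))
```

This is why strategies with identical undiscounted QALYs can rank differently once time preference is applied: AS defers treatment-related utility losses, while its survival gains accrue late.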


Subject(s)
Life Expectancy , Prostatic Neoplasms/pathology , Prostatic Neoplasms/therapy , Quality of Life , Watchful Waiting/methods , Adult , Aged , Aged, 80 and over , Biopsy , Conservative Treatment , Humans , Magnetic Resonance Imaging , Male , Markov Chains , Middle Aged , Neoplasm Metastasis , Prostate/diagnostic imaging , Prostate/pathology , Prostate-Specific Antigen/blood , Prostatic Neoplasms/diagnostic imaging , Quality-Adjusted Life Years