A comparison of four quasi-experimental methods: an analysis of the introduction of activity-based funding in Ireland.
Valentelyte, Gintare; Keegan, Conor; Sorensen, Jan.
Affiliation
  • Valentelyte G; Structured Population and Health services Research Education (SPHeRE) Programme, School of Population Health, RCSI University of Medicine and Health Sciences, Mercer Street Lower, Dublin, Ireland. gintarevalentelyte@rcsi.com.
  • Keegan C; Healthcare Outcome Research Centre (HORC), School of Population Health, RCSI University of Medicine and Health Sciences, Dublin, Ireland.
  • Sorensen J; Economic and Social Research Institute (ESRI), Whitaker Square, Dublin, Ireland.
BMC Health Serv Res ; 22(1): 1311, 2022 Nov 03.
Article in En | MEDLINE | ID: mdl-36329423
ABSTRACT

BACKGROUND:

Health services research often relies on quasi-experimental study designs in the estimation of treatment effects of a policy change or an intervention. The aim of this study is to compare some of the commonly used non-experimental methods in estimating intervention effects, and to highlight their relative strengths and weaknesses. We estimate the effects of Activity-Based Funding, a hospital financing reform of Irish public hospitals, introduced in 2016.

METHODS:

We estimate and compare four analytical methods: Interrupted time series analysis, Difference-in-Differences, Propensity Score Matching Difference-in-Differences and the Synthetic Control method. Specifically, we focus on the comparison between the control-treatment methods and the non-control-treatment approach, interrupted time series analysis. Our empirical example evaluated the impact on length of stay following hip replacement surgery after the introduction of Activity-Based Funding in Ireland. We also contribute to the very limited research reporting the impacts of Activity-Based Funding within the Irish context. A brief illustration of the two core approaches follows below.
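For readers unfamiliar with how these approaches differ in practice, the sketch below illustrates a segmented-regression interrupted time series model (no control group) and a basic two-group Difference-in-Differences model. It is a minimal illustration under stated assumptions, not the authors' specification: the data file, column names (los, month, post, treated, hospital_id) and the clustered standard errors are hypothetical.

# Minimal sketch of two of the compared estimators, using hypothetical
# patient-level data with columns:
#   los         - length of stay after hip replacement (outcome)
#   month       - study month, centred on the reform date
#   post        - 1 if discharged after ABF introduction (2016), else 0
#   treated     - 1 for ABF-funded public hospitals, 0 for comparison hospitals
#   hospital_id - identifier used for clustered standard errors
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("hip_replacement_discharges.csv")  # hypothetical file

# 1) Interrupted time series (treated hospitals only, no control group):
#    'post' captures the level change and 'month:post' the slope change
#    at the reform date.
its = smf.ols("los ~ month + post + month:post",
              data=df.query("treated == 1")).fit()
print(its.summary())

# 2) Difference-in-Differences (adds non-ABF hospitals as controls):
#    the 'treated:post' interaction is the estimated intervention effect.
did = smf.ols("los ~ treated * post", data=df).fit(
    cov_type="cluster", cov_kwds={"groups": df["hospital_id"]})
print(did.summary())

The same outcome and reform indicator would feed the Propensity Score Matching Difference-in-Differences and Synthetic Control estimators; those differ mainly in how the comparison group is constructed rather than in the regression itself.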

RESULTS:

Interrupted time series analysis produced statistically significant results that differed in interpretation, while the Difference-in-Differences, Propensity Score Matching Difference-in-Differences and Synthetic Control methods, which incorporate control groups, suggested no statistically significant intervention effect on patient length of stay.

CONCLUSION:

Our analysis confirms that different analytical methods can provide different assessments of the same intervention effect. It is crucial that researchers employ appropriate designs that incorporate a counterfactual framework. Such methods tend to be more robust and provide a stronger basis for evidence-based policy-making.

Full text: 1 Collection: 01-internacional Database: MEDLINE Main subject: Research Design / Health Services Research Limits: Humans Country/Region as subject: Europe Language: En Journal: BMC Health Serv Res Journal subject: Health Services Research Year: 2022 Document type: Article Affiliation country: Ireland