ABSTRACT
INTRODUCTION: The lingering burden of the COVID-19 pandemic on primary care clinicians and practices poses a public health emergency for the United States. This study uses clinician-reported data to examine changes in primary care demand and capacity. METHODS: From March 2020 to March 2022, 36 electronic surveys were fielded among primary care clinicians responding to survey invitations posted on listservs and identified through social media and crowdsourcing. Quantitative and qualitative analyses were performed on both closed- and open-ended survey questions. RESULTS: An average of 937 respondents per survey represented family medicine, pediatrics, internal medicine, geriatrics, and other specialties. Responses reported increases in patient health burden, including worsening chronic care management and increasing volume and complexity. A higher frequency of dental- and eyesight-related issues was noted by respondents, as was a substantial increase in mental or emotional health needs. Respondents also noted increased demand, "record high" wait times, and struggles to keep up with patient needs and the higher volume of patient questions. Frequent qualitative statements highlighted the mismatch of patient needs with practice capacity. Staffing shortages and the inability to fill open clinical positions impaired clinicians' ability to meet patient needs, and a substantial proportion of respondents indicated an intention to leave the profession or knew someone who had. CONCLUSION: These data signal an urgent need to take action to support the ability of primary care to meet ongoing patient and population health care needs.
Subject(s)
COVID-19 , Primary Health Care , SARS-CoV-2 , Humans , COVID-19/epidemiology , United States , Surveys and Questionnaires , Public Health , Pandemics , Health Services Needs and Demand
ABSTRACT
BACKGROUND: Acute kidney injury (AKI), including contrast-induced AKI (CI-AKI), is an important complication of percutaneous coronary intervention (PCI), resulting in short- and long-term adverse clinical outcomes. While prior research has reported an increased cost burden to hospitals from CI-AKI, the incremental cost to payers remains unknown. Understanding this incremental cost may inform decisions and even policy in the future. The objective of this study was to estimate the short- and long-term cost to Medicare of AKI overall, and specifically CI-AKI, in PCI. METHODS: Patients undergoing inpatient PCI between January 2017 and June 2020 were selected from Medicare 100% fee-for-service data. Baseline clinical characteristics, PCI lesion/procedural characteristics, and AKI/CI-AKI during the PCI admission were identified from diagnosis and procedure codes. Poisson regression, generalized linear modelling, and longitudinal mixed effects modelling, in full and propensity-matched cohorts, were used to compare PCI admission length of stay (LOS) and cost (Medicare paid amount inflated to 2022 US$), as well as total costs during the 1 year following PCI, between AKI and non-AKI patients. RESULTS: The study cohort included 509,039 patients, of whom 104,033 (20.4%) were diagnosed with AKI and 9,691 (1.9%) with CI-AKI. In the full cohort, AKI was associated with +4.12 (95% confidence interval = 4.10, 4.15) days index PCI admission LOS, +$11,313 ($11,093, $11,534) index admission costs, and +$14,800 ($14,359, $15,241) total 1-year costs. CI-AKI was associated with +3.03 (2.97, 3.08) days LOS, +$6,566 ($6,148, $6,984) index admission costs, and +$13,381 ($12,118, $14,644) cumulative 1-year costs (all results are adjusted for baseline characteristics). Results from the propensity-matched analyses were similar. CONCLUSIONS: AKI, and specifically CI-AKI, during PCI is associated with significantly longer PCI admission LOS and higher PCI admission and long-term costs.
Subject(s)
Acute Kidney Injury , Percutaneous Coronary Intervention , Humans , Aged , United States/epidemiology , Percutaneous Coronary Intervention/methods , Risk Factors , Medicare , Forecasting , Acute Kidney Injury/chemically induced , Acute Kidney Injury/epidemiology , Contrast Media/adverse effects
ABSTRACT
PURPOSE: During the COVID-19 pandemic, telemedicine emerged as an important tool in primary care. Technology and policy-related challenges, however, revealed barriers to adoption and implementation. This report describes the findings from weekly and monthly surveys of primary care clinicians regarding telemedicine during the first 2 years of the pandemic. METHODS: From March 2020 to March 2022, we conducted electronic surveys using convenience samples obtained through social networking and crowdsourcing. Unique tokens were used to confidentially track respondents over time. A multidisciplinary team conducted quantitative and qualitative analyses to identify key concepts and trends. RESULTS: A total of 36 surveys resulted in an average of 937 respondents per survey, representing clinicians from all 50 states and multiple specialties. Initial responses indicated general difficulties in implementing telemedicine due to poor infrastructure and reimbursement mechanisms. Over time, attitudes toward telemedicine improved and respondents considered video and telephone-based care important tools for their practice, though not a replacement for in-person care. CONCLUSIONS: The implementation of telemedicine during COVID-19 identified barriers and opportunities for technology adoption and highlighted steps that could support primary care clinics' ability to learn, adapt, and implement technology.
Subject(s)
COVID-19 , Telemedicine , Humans , Pandemics , COVID-19/epidemiology , Electronics , Primary Health Care
ABSTRACT
BACKGROUND: Breast cancer-related lymphedema (BCRL) imposes a significant economic burden on patients, providers, and society. There is no curative therapy for BCRL, but management through self-care can reduce symptoms and lower the risk of adverse events. MAIN BODY: The economic burden of BCRL stems from related adverse events, reductions in productivity and employment, and the burden placed on non-medical caregivers. Self-care regimens often include manual lymphatic drainage, compression garments, and meticulous skin care, and may incorporate pneumatic compression devices. These regimens can be effective in managing BCRL, but patients cite inconvenience and interference with daily activities as potential barriers to self-care adherence. As a result, adherence is generally poor and often worsens with time. Because self-care is on-going, poor adherence reduces the effectiveness of regimens and leads to costly treatment of BCRL complications. CONCLUSION: Novel self-care solutions that are more convenient and that interfere less with daily activities could increase self-care adherence and ultimately reduce complication-related costs of BCRL.
ABSTRACT
BACKGROUND: Non-adherence to psychotropic medications is common in schizophrenia and bipolar disorders (BDs), leading to adverse outcomes. We examined patterns of antipsychotic use in schizophrenia and BD and their impact on subsequent acute care utilization. METHODS: We used electronic health record (EHR) data of 577 individuals with schizophrenia, 795 with BD, and 618 using antipsychotics without a diagnosis of either illness at two large health systems. We structured three antipsychotic exposure variables: the proportion of days covered (PDC) to measure adherence; medication switch as a new antipsychotic prescription that was different than the initial antipsychotic; and medication stoppage as the lack of an antipsychotic order or fill data in the EHR after the date when the previous supply would have been depleted. Outcome measures included the frequency of inpatient and emergency department (ED) visits up to 12 months after treatment initiation. RESULTS: Approximately half of the study population were adherent to their antipsychotic medication (a PDC ≥ 0.80): 53.6% of those with schizophrenia, 52.4% of those with BD, and 50.3% of those without either diagnosis. Among schizophrenia patients, 22.5% switched medications and 15.1% stopped therapy. Switching and stopping occurred in 15.8% and 15.1% of BD patients and 7.4% and 20.1% of those without either diagnosis, respectively. Across the three cohorts, non-adherence, switching, and stopping therapy were all associated with increased acute care utilization, even after adjusting for baseline demographics, health insurance, past acute care utilization, and comorbidity. CONCLUSION: Non-continuous antipsychotic use is common and associated with high acute care utilization.
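The PDC adherence measure described above is mechanical to compute from fill dates and days supplied. The sketch below is a minimal illustration, not the study's actual claims-processing logic: it assumes a list of (fill_date, days_supply) pairs, carries overlapping supply forward so coverage is not double-counted, and truncates coverage to the observation window.

```python
from datetime import date, timedelta

def proportion_of_days_covered(fills, start, end):
    """Fraction of days in [start, end] covered by medication supply.

    `fills` is a list of (fill_date, days_supply) tuples. Supply from an
    early refill is pushed forward past already-covered days, so each day
    counts at most once; fills before `start` begin counting at `start`.
    """
    covered = set()
    for fill_date, days_supply in sorted(fills):
        day = max(fill_date, start)
        remaining = days_supply
        while remaining > 0 and day <= end:
            if day not in covered:
                covered.add(day)
                remaining -= 1  # consume supply only on newly covered days
            day += timedelta(days=1)
    total_days = (end - start).days + 1
    return len(covered) / total_days
```

Adherence in the study is then defined as PDC ≥ 0.80; a patient with a single 15-day fill in a 31-day window, for example, falls well below that cutoff.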
Subject(s)
Antipsychotic Agents , Bipolar Disorder , Schizophrenia , Humans , Antipsychotic Agents/therapeutic use , Retrospective Studies , Medication Adherence , Schizophrenia/diagnosis , Bipolar Disorder/drug therapy
ABSTRACT
PURPOSE: Care continuity is foundational to the clinician/patient relationship; however, little has been done to operationalize continuity of care (CoC) as a clinical quality measure. The American Board of Family Medicine developed the Primary Care CoC clinical quality measure as part of the Measures That Matter to Primary Care initiative. METHODS: Using 12-month Optum Clinformatics Data Mart claims data, we calculated the Bice-Boxerman Continuity of Care Index for each patient, which we rolled up to create an aggregate, physician-level CoC score. The physician quality score is the percent of patients with a Bice-Boxerman Index ≥0.7 (70%). We tested validity in 2 ways. First, we explored the validity of using 0.7 as a threshold for patient CoC within the Optum claims database to validate its use for reflecting patient-level continuity. Second, we explored the validity of the physician CoC measure by examining its association with patient outcomes. We assessed reliability using signal-to-noise methodology. RESULTS: Mean performance on the measure was 27.6%; performance ranged from 0% to 100% (n = 555,213 primary care physicians). Higher levels of CoC were associated with lower levels of care utilization. The measure indicated acceptable levels of validity and reliability. CONCLUSIONS: Continuity is associated with desirable health and cost outcomes as well as patient preference. The CoC clinical quality measure meets validity and reliability requirements for implementation in primary care payment and accountability. Care continuity is important and complementary to access to care, and prioritizing this measure could help shift physician and health system behavior to support continuity.
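The Bice-Boxerman index has a simple closed form: with N total visits, of which n_j are to provider j, COC = (Σ_j n_j² − N) / (N(N − 1)), giving 1 when all visits go to one provider and 0 when every visit goes to a different one. A minimal Python sketch of the patient-level index and the roll-up to a physician-level score follows; function names and the roll-up details are illustrative, not taken from the measure specification.

```python
from collections import Counter

def bice_boxerman(visit_providers):
    """Bice-Boxerman Continuity of Care Index for one patient.

    `visit_providers` is a list of provider IDs, one entry per visit.
    Returns a value in [0, 1]; undefined for fewer than 2 visits.
    """
    n = len(visit_providers)
    if n < 2:
        raise ValueError("index is undefined for fewer than 2 visits")
    counts = Counter(visit_providers)
    return (sum(c * c for c in counts.values()) - n) / (n * (n - 1))

def physician_coc_score(patients, threshold=0.7):
    """Percent of a physician's attributed patients whose patient-level
    index meets or exceeds the threshold (0.7 in the measure above)."""
    indices = [bice_boxerman(visits) for visits in patients]
    return 100 * sum(i >= threshold for i in indices) / len(indices)
```

For example, a patient seen three times by one physician scores 1.0, while three visits to three different clinicians score 0.0, so a panel with one of each yields a physician-level score of 50%.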
Subject(s)
Physicians , Quality Indicators, Health Care , Humans , Reproducibility of Results , Quality of Health Care , Continuity of Patient Care
ABSTRACT
RATIONALE & OBJECTIVE: Prior studies suggesting that medical therapy is inferior to percutaneous (percutaneous coronary intervention [PCI]) or surgical (coronary artery bypass grafting [CABG]) coronary revascularization in chronic kidney disease (CKD) have not adequately considered medication optimization or baseline cardiovascular risk and have infrequently evaluated progression to kidney failure. We compared, separately, the risks for kidney failure and death after treatment with PCI, CABG, or optimized medical therapy for coronary disease among patients with CKD stratified by cardiovascular disease risk. STUDY DESIGN: Retrospective cohort study. SETTING & PARTICIPANTS: 34,385 individuals with CKD identified from a national 20% Medicare sample who underwent angiography or diagnostic stress testing without (low risk) or with (medium risk) prior cardiovascular disease or who presented with acute coronary syndrome (high risk). EXPOSURES: PCI, CABG, or optimized medical therapy (defined by the addition of cardiovascular medications in the absence of coronary revascularization). OUTCOMES: Death, kidney failure, composite outcome of death or kidney failure. ANALYTICAL APPROACH: Adjusted relative rates of death, kidney failure, and the composite of death or kidney failure estimated from Cox proportional hazards models. RESULTS: Among low-risk patients, 960 underwent PCI, 391 underwent CABG, and 6,426 received medical therapy alone; among medium-risk patients, 1,812 underwent PCI, 512 underwent CABG, and 9,984 received medical therapy alone; and among high-risk patients, 4,608 underwent PCI, 1,330 underwent CABG, and 8,362 received medical therapy alone. 
Among low- and medium-risk patients, neither CABG (HRs of 1.22 [95% CI, 0.96-1.53] and 1.08 [95% CI, 0.91-1.29] for low- and medium-risk patients, respectively) nor PCI (HRs of 1.14 [95% CI, 0.98-1.33] and 1.02 [95% CI, 0.93-1.12], respectively) was associated with reduced mortality compared with medical therapy, but in low-risk patients, CABG was associated with a higher rate of the composite, death or kidney failure (HR, 1.25; 95% CI, 1.02-1.53). In high-risk patients, CABG and PCI were associated with lower mortality (HRs of 0.57 [95% CI, 0.51-0.63] and 0.70 [95% CI, 0.66-0.74], respectively). Also, in high-risk patients, CABG was associated with a higher rate of kidney failure (HR, 1.40; 95% CI, 1.16-1.69). LIMITATIONS: Possible residual confounding; lack of data for coronary angiography or left ventricular ejection fraction; possible differences in decreased kidney function severity between therapy groups. CONCLUSIONS: Outcomes associated with cardiovascular therapies among patients with CKD differed by baseline cardiovascular risk. Coronary revascularization was not associated with improved survival in low-risk patients, but was associated with improved survival in high-risk patients despite a greater observed rate of kidney failure. These findings may inform clinical decision making in the care of patients with both CKD and cardiovascular disease.
Subject(s)
Cardiovascular Diseases/therapy , Medicare/trends , Percutaneous Coronary Intervention/methods , Percutaneous Coronary Intervention/trends , Renal Insufficiency, Chronic/therapy , Aged , Aged, 80 and over , Cardiovascular Diseases/economics , Cardiovascular Diseases/epidemiology , Cohort Studies , Female , Humans , Male , Middle Aged , Percutaneous Coronary Intervention/economics , Renal Insufficiency, Chronic/economics , Renal Insufficiency, Chronic/epidemiology , Retrospective Studies , Risk Factors , Treatment Outcome , United States/epidemiology
ABSTRACT
Although management of multiple myeloma has changed substantially in the last decade, it is unknown whether the burden of ESRD due to multiple myeloma has changed, or whether survival of patients with multiple myeloma on RRT has improved. Regarding ESRD due to multiple myeloma necessitating RRT in the United States, we evaluated temporal trends between 2001 and 2010 for demography-adjusted incidence ratios, relative to rates in 2001-2002, and mortality hazards from RRT initiation, relative to hazards in 2001-2002. In this retrospective cohort study, we used the US Renal Data System database (n=1,069,343), 2001-2010, to identify patients with ESRD due to multiple myeloma treated with RRT (n=12,703). Demography-adjusted incidence ratios of ESRD from multiple myeloma decreased between 2001-2002 and 2009-2010 in the overall population (demography-adjusted incidence ratio 0.82; 95% confidence interval, 0.79 to 0.86) and in most demographic subgroups examined. Mortality rates were 86.7, 41.4, and 34.4 per 100 person-years in the first 3 years of RRT, respectively, compared with 32.3, 20.6, and 21.3 in matched controls without multiple myeloma. Unadjusted mortality hazard ratios declined monotonically after 2004 to 0.72 (95% confidence interval, 0.67 to 0.77) in 2009-2010, and declines between 2001-2002 and 2008-2009 were observed (P<0.05) in most demographic subgroups examined. Findings were similar when adjustment was made for demographic characteristics, comorbidity markers, and laboratory test values. These data suggest the incidence of RRT from multiple myeloma in the United States has decreased in the last decade, and clinically meaningful increases in survival have occurred for these patients.
Subject(s)
Kidney Failure, Chronic/epidemiology , Kidney Failure, Chronic/etiology , Multiple Myeloma/complications , Adult , Aged , Female , Humans , Incidence , Male , Middle Aged , Retrospective Studies , Time Factors , United States/epidemiology
ABSTRACT
BACKGROUND: The 2011 expanded Prospective Payment System (PPS) and contemporaneous Food and Drug Administration label revision for erythropoiesis-stimulating agents (ESAs) were associated with changes in ESA use and mean hemoglobin levels among patients receiving maintenance dialysis. We aimed to investigate whether these changes coincided with increased red blood cell transfusions or changes to Medicare-incurred costs or sites of anemia management care in the period immediately before and after the introduction of the PPS, 2009-2011. METHODS: From US Medicare end-stage renal disease (ESRD) data (Parts A and B claims), maintenance hemodialysis patients from facilities that initially enrolled 100% into the ESRD PPS were identified. Dialysis and anemia-related costs per-patient-per-month (PPPM) were calculated at the facility level, and transfusion rates were calculated overall and by site of care (outpatient, inpatient, emergency department, observation stay). RESULTS: More than 4100 facilities were included. Transfusions in both the inpatient and outpatient environments increased. In the inpatient environment, PPPM use increased by 11-17% per facility in each quarter of 2011 compared with 2009; in the outpatient environment, PPPM use increased overall by 5.0%. Site of care for transfusions appeared to have shifted. Transfusions occurring in emergency departments or during observation stays increased 13.9% and 26.4%, respectively, over 2 years. CONCLUSIONS: Inpatient- and emergency-department-administered transfusions increased, providing some evidence for a partial shift in the cost and site of care for anemia management from dialysis facilities to hospitals. Further exploration into the economic implications of this increase is necessary.
Subject(s)
Anemia/economics , Anemia/therapy , Erythrocyte Transfusion/statistics & numerical data , Kidney Failure, Chronic/therapy , Prospective Payment System/economics , Renal Dialysis/economics , Administration, Intravenous , Aged , Ambulatory Care Facilities/economics , Ambulatory Care Facilities/trends , Anemia/etiology , Emergency Service, Hospital/economics , Emergency Service, Hospital/trends , Erythrocyte Transfusion/economics , Erythrocyte Transfusion/trends , Female , Hematinics/economics , Hematinics/therapeutic use , Hospitalization/economics , Hospitalization/trends , Humans , Iron/administration & dosage , Kidney Failure, Chronic/complications , Kidney Failure, Chronic/economics , Male , Medicare , Middle Aged , United States
ABSTRACT
BACKGROUND/AIMS: Few published data describe survival rates for pediatric end-stage renal disease (ESRD) patients. We aimed to describe one-year mortality rates for US pediatric ESRD patients over a 15-year period. METHODS: In this retrospective cohort study, we used the US Renal Data System database to identify period-prevalent cohorts of patients aged younger than 19 for each year during the period 1995-2010. Yearly cohorts averaged approximately 1,200 maintenance dialysis patients (60% hemodialysis, 40% peritoneal dialysis) and 1,100 transplant recipients. Patients were followed for up to 1 year and censored at change in modality, loss to follow-up, or death. We calculated the unadjusted model-based mortality rates per time at risk, within each cohort year, by treatment modality (hemodialysis, peritoneal dialysis, transplant) and patient characteristics; percentage of deaths by cause; and overall adjusted odds of mortality by characteristics and modality. RESULTS: Approximately 50% of patients were in the age group 15-18, 55% were male, and 45% were female. The most common causes of ESRD were congenital/reflux/obstructive causes (55%) and glomerulonephritis (30%). One-year mortality rates showed evidence of a decrease for peritoneal dialysis patients (6.03 per 100 patient-years, 1995; 2.43, 2010; p = 0.0263). Mortality rates for transplant recipients (average 0.68 per 100 patient-years) were consistently lower than the rates for all dialysis patients (average 4.36 per 100 patient-years). CONCLUSIONS: One-year mortality rates differ by treatment modality in pediatric ESRD patients.
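The unadjusted rates quoted above are deaths per person-time at risk, scaled to 100 patient-years. As a minimal sketch of that arithmetic (the study's model-based rates involve more than this), censored follow-up converts to a rate as follows:

```python
def mortality_rate_per_100_py(follow_up_days, died):
    """Unadjusted mortality rate per 100 patient-years.

    follow_up_days[i] is each patient's time at risk in days (censored at
    modality change, loss to follow-up, or 1 year); died[i] is 1 if the
    patient died during follow-up, else 0.
    """
    person_years = sum(follow_up_days) / 365.25
    return 100 * sum(died) / person_years
```

For instance, 100 patients each followed a full year with 6 deaths among them gives 6.0 deaths per 100 patient-years, comparable to the dialysis-cohort rates reported above.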
Subject(s)
Kidney Failure, Chronic/mortality , Renal Dialysis/statistics & numerical data , Adolescent , Black or African American/statistics & numerical data , Age Factors , Cause of Death , Child , Child, Preschool , Female , Humans , Infant , Infant, Newborn , Kidney Failure, Chronic/etiology , Kidney Failure, Chronic/therapy , Kidney Transplantation/statistics & numerical data , Male , Peritoneal Dialysis/statistics & numerical data , Retrospective Studies , Survival Rate/trends , United States/epidemiology , White People/statistics & numerical data
ABSTRACT
BACKGROUND: Few published data describe long-term survival of dialysis patients undergoing surgical versus percutaneous coronary revascularization in the era of drug-eluting stents (DES). METHODS AND RESULTS: Using United States Renal Data System data, we identified 23 033 dialysis patients who underwent coronary revascularization (6178 coronary artery bypass grafting, 5011 bare metal stents, 11 844 DES) from 2004 to 2009. Revascularization procedures decreased from 4347 in 2004 to 3344 in 2009. DES use decreased by 41% and bare metal stent use increased by 85% from 2006 to 2007. Long-term survival was estimated by the Kaplan-Meier method, and independent predictors of mortality were examined in a comorbidity-adjusted Cox model. In-hospital mortality for coronary artery bypass grafting patients was 8.2%; all-cause survival at 1, 2, and 5 years was 70%, 57%, and 28%, respectively. In-hospital mortality for DES patients was 2.7%; 1-, 2-, and 5-year survival was 71%, 53%, and 24%, respectively. Independent predictors of mortality were similar in both cohorts: age >65 years, white race, dialysis duration, peritoneal dialysis, and congestive heart failure, but not diabetes mellitus. Survival was significantly higher for coronary artery bypass grafting patients who received internal mammary grafts (hazard ratio, 0.83; P<0.0001). The probability of repeat revascularization accounting for the competing risk of death was 18% with bare metal stents, 19% with DES, and 6% with coronary artery bypass grafting at 1 year. CONCLUSIONS: Among dialysis patients undergoing coronary revascularization, in-hospital mortality was higher after coronary artery bypass grafting, but long-term survival was superior with internal mammary grafts. In-hospital mortality was lower for DES patients, but the probability of repeat revascularization was higher and comparable to that in patients receiving a bare metal stent. Revascularization decisions for dialysis patients should be individualized.
Subject(s)
Drug-Eluting Stents , Percutaneous Coronary Intervention/mortality , Percutaneous Coronary Intervention/trends , Renal Dialysis/mortality , Renal Dialysis/trends , Aged , Aged, 80 and over , Female , Humans , Male , Middle Aged , Percutaneous Coronary Intervention/instrumentation , Retrospective Studies , Stents , Survival Rate/trends , Survivors , Treatment Outcome , United States/epidemiology
ABSTRACT
Clinical experience suggests a heightened risk associated with the transition to maintenance dialysis, but few national studies have systematically examined early mortality trajectories. Here we calculated weekly mortality rates in the first year of treatment for 498,566 adults initiating maintenance dialysis in the United States (2005-2009). Mortality rates were initially unexpectedly low, peaked at 37.0 per 100 person-years in week 6, and declined steadily to 14.8 by week 51. In both early (weeks 7-12) and later (weeks 13-51) time frames, multivariate mortality associations included older age, female sex, Caucasian race, non-Hispanic ethnicity, end-stage renal disease (ESRD) from hypertension and acute tubular necrosis, ischemic heart disease, estimated glomerular filtration rate of 15 ml/min per 1.73 m² or more, shorter duration of nephrologist care, and hemodialysis, especially with a catheter. For early mortality risk, adjusted hazard ratios of 2 or more were seen with age over 65 (5.80 vs. under 40 years), hemodialysis with a catheter (2.73 vs. fistula), and age 40-64 (2.33). For later mortality risk, adjusted hazard ratios of 2 or more were seen with age over 65 (4.32 vs. under 40 years), hemodialysis with a catheter (2.10 vs. fistula), and age 40-64 (2.00). Thus, the low initial mortality rates call into question the accuracy of the data collected and are consistent with deaths occurring in the early weeks after starting dialysis not being registered with the United States Renal Data System.
Subject(s)
Kidney Failure, Chronic/mortality , Kidney Failure, Chronic/therapy , Renal Dialysis/mortality , Adult , Aged , Female , Humans , Male , Middle Aged , Registries , Retrospective Studies , Risk Factors , Time Factors , United States/epidemiology
ABSTRACT
BACKGROUND: Autosomal dominant polycystic kidney disease (ADPKD) is amenable to early detection and specialty care. Thus, while important to patients with the condition, end-stage renal disease (ESRD) from ADPKD also may be an indicator of the overall state of nephrology care. STUDY DESIGN: Retrospective cohort study of temporal trends in ESRD from ADPKD and pre-renal replacement therapy (RRT) nephrologist care, 2001-2010 (n = 23,772). SETTING & PARTICIPANTS: US patients who initiated maintenance RRT from 2001 through 2010 (n = 1,069,343) from US Renal Data System data. PREDICTOR: ESRD from ADPKD versus from other causes for baseline characteristics and clinical outcomes; interval 2001-2005 versus 2006-2010 for comparisons of the cohort of patients with ESRD from ADPKD. OUTCOMES: Death, wait-listing for kidney transplant, kidney transplantation. MEASUREMENTS: US census data were used as population denominators. Poisson distribution was used to compute incidence rates (IRs). Incidence ratios were standardized to rates in 2001-2002 for age, sex, and race/ethnicity. Patients with and without ADPKD were matched to compare clinical outcomes. Poisson regression was used to calculate IRs and adjusted HRs for clinical events after inception of RRT. RESULTS: General population incidence ratios in 2009-2010 were unchanged from 2001-2002 (incidence ratio, 1.02). Of patients with ADPKD, 48.1% received more than 12 months of nephrology care before RRT; preemptive transplantation was the initial RRT in 14.3% and fistula was the initial hemodialysis access in 35.8%. During 4.9 years of follow-up, patients with ADPKD were more likely to be listed for transplantation (IR, 11.7 [95% CI, 11.5-12.0] vs 8.4 [95% CI, 8.2-8.7] per 100 person-years) and to undergo transplantation (IR, 9.8 [95% CI, 9.5-10.0] vs 4.8 [95% CI, 4.7-5.0] per 100 person-years) and less likely to die (IR, 5.6 [95% CI, 5.4-5.7] vs 15.5 [95% CI, 15.3-15.8] per 100 person-years) than matched controls without ADPKD.
LIMITATIONS: Retrospective nonexperimental registry-based study of associations; cause-and-effect relationships cannot be determined. CONCLUSIONS: Although outcomes on dialysis therapy are better for patients with ADPKD than for those without ADPKD, access to predialysis nephrology care and nondeclining ESRD rates may be a cause for concern.
Subject(s)
Kidney Failure, Chronic , Polycystic Kidney, Autosomal Dominant , Renal Replacement Therapy , Adult , Aged , Early Diagnosis , Ethnicity , Female , Humans , Incidence , Kidney Failure, Chronic/epidemiology , Kidney Failure, Chronic/etiology , Kidney Failure, Chronic/therapy , Male , Middle Aged , Patient Care Management/organization & administration , Patient Care Management/statistics & numerical data , Polycystic Kidney, Autosomal Dominant/complications , Polycystic Kidney, Autosomal Dominant/diagnosis , Polycystic Kidney, Autosomal Dominant/epidemiology , Polycystic Kidney, Autosomal Dominant/therapy , Quality Assurance, Health Care , Registries , Renal Replacement Therapy/methods , Renal Replacement Therapy/statistics & numerical data , Retrospective Studies , United States/epidemiology
ABSTRACT
BACKGROUND: Calciphylaxis, a rare disease seen in chronic dialysis patients, is associated with significant morbidity and mortality. As is the case with other rare diseases, the precise epidemiology of calciphylaxis remains unknown. Absence of a unique International Classification of Diseases (ICD) code impedes its identification in large administrative databases such as the United States Renal Data System (USRDS) and hinders patient-oriented research. This study was designed to develop an algorithm to accurately identify cases of calciphylaxis and to examine its incidence and mortality. DESIGN, PARTICIPANTS, AND MAIN MEASURES: Along with many other diagnoses, calciphylaxis is included in ICD-9 code 275.49, Other Disorders of Calcium Metabolism. Since calciphylaxis is the only disorder listed under this code that requires a skin biopsy for diagnosis, we theorized that simultaneous application of code 275.49 and skin biopsy procedure codes would accurately identify calciphylaxis cases. This novel algorithm was developed using the Partners Research Patient Data Registry (RPDR) (n = 11,451 chronic hemodialysis patients over study period January 2002 to December 2011) using natural language processing and review of medical and pathology records (the gold-standard strategy). We then applied this algorithm to the USRDS to investigate calciphylaxis incidence and mortality. KEY RESULTS: Comparison of our novel research strategy against the gold standard yielded: sensitivity 89.2%, specificity 99.9%, positive likelihood ratio 3,382.3, negative likelihood ratio 0.11, and area under the curve 0.96. Application of the algorithm to the USRDS identified 649 incident calciphylaxis cases over the study period. Although calciphylaxis is rare, its incidence has been increasing, with a major inflection point during 2006-2007, which corresponded with specific addition of calciphylaxis under code 275.49 in October 2006. 
Calciphylaxis incidence continued to rise even after limiting the study period to 2007 onwards (from 3.7 to 5.7 per 10,000 chronic hemodialysis patients; r = 0.91, p = 0.02). Mortality rates among calciphylaxis patients were noted to be 2.5-3 times higher than average mortality rates for chronic hemodialysis patients. CONCLUSIONS: By developing and successfully applying a novel algorithm, we observed a significant increase in calciphylaxis incidence. Because calciphylaxis is associated with extremely high mortality, our study provides valuable information for future patient-oriented calciphylaxis research, and also serves as a template for investigating other rare diseases.
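The validation statistics reported above follow mechanically from the 2×2 table of algorithm flags against gold-standard chart review. A small sketch of the arithmetic (the counts used in the example are hypothetical, not the study's):

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, and likelihood ratios from a 2x2
    validation table (algorithm flag vs. gold-standard diagnosis)."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return {
        "sensitivity": sensitivity,
        "specificity": specificity,
        # LR+ = sens / (1 - spec): odds multiplier after a positive flag
        "LR+": sensitivity / (1 - specificity),
        # LR- = (1 - sens) / spec: odds multiplier after a negative flag
        "LR-": (1 - sensitivity) / specificity,
    }
```

Note that with specificity near 99.9%, LR+ is extremely sensitive to rounding in the specificity, which is why the reported 3,382.3 cannot be reproduced from the two rounded percentages alone.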
Subject(s)
Algorithms , Calciphylaxis/epidemiology , Databases, Factual , Natural Language Processing , Rare Diseases/epidemiology , Calciphylaxis/pathology , Female , Humans , Incidence , International Classification of Diseases , Kidney Failure, Chronic/epidemiology , Kidney Failure, Chronic/therapy , Male , Middle Aged , Predictive Value of Tests , ROC Curve , Rare Diseases/pathology , Renal Dialysis/adverse effects , Renal Dialysis/statistics & numerical data , United States/epidemiology
ABSTRACT
This work proposes a frailty model that accounts for non-random treatment assignment in survival analysis. Using Monte Carlo simulation, we found that estimated treatment parameters from our proposed endogenous selection survival model (esSurv) closely parallel the consistent two-stage residual inclusion (2SRI) results, while offering computational and interpretive advantages. The esSurv method greatly enhances computational speed relative to 2SRI by eliminating the need for bootstrapped standard errors and generally results in smaller standard errors than those estimated by 2SRI. In addition, esSurv explicitly estimates the correlation of unobservable factors contributing to both treatment assignment and the outcome of interest, providing an interpretive advantage over the residual parameter estimate in the 2SRI method. Comparisons with commonly used propensity score methods and with a model that does not account for non-random treatment assignment show clear bias in these methods, which is not mitigated by increased sample size. We illustrate using actual dialysis patient data comparing mortality of patients with mature arteriovenous grafts for venous access to mortality of patients with grafts placed but not yet ready for use at the initiation of dialysis. We find strong evidence of endogeneity (with estimate of correlation in unobserved factors ρ̂ = 0.55) and estimate a mature-graft hazard ratio of 0.197 in our proposed method, with a similar 0.173 hazard ratio using 2SRI. The 0.630 hazard ratio from a frailty model without a correction for the non-random nature of treatment assignment illustrates the importance of accounting for endogeneity.
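Two-stage residual inclusion, the benchmark against which esSurv is compared, is easy to sketch: stage one regresses treatment on an instrument and keeps the residual as a proxy for the unobserved factor; stage two includes that residual alongside treatment in the outcome model. The numpy-only simulation below is a toy stand-in (an exponential-hazard model with fully observed event times and a made-up data-generating process, not the paper's frailty model); it shows the residual term absorbing selection bias that a naive fit leaves in the treatment coefficient.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20000

# Toy data: u is an unobserved factor driving both treatment choice and
# the hazard, so a naive fit of the treatment effect is confounded.
z = rng.normal(size=n)                                # instrument
u = rng.normal(size=n)                                # unobserved confounder
treat = (0.8 * z + u + rng.normal(size=n) > 0).astype(float)
time = rng.exponential(1.0 / np.exp(0.5 - 1.0 * treat + 0.7 * u))

def fit_exponential(X, t, iters=25):
    """Newton-Raphson MLE for a constant-hazard (exponential) survival
    model with log hazard = X @ beta and all events observed."""
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        lam = np.exp(X @ beta)
        score = X.T @ (1.0 - lam * t)                 # gradient of loglik
        hess = -(X * (lam * t)[:, None]).T @ X        # Hessian of loglik
        beta -= np.linalg.solve(hess, score)
    return beta

# Stage 1: linear probability model of treatment on the instrument;
# its residual proxies the unobserved factor u.
X1 = np.column_stack([np.ones(n), z])
resid = treat - X1 @ np.linalg.lstsq(X1, treat, rcond=None)[0]

# Stage 2: outcome model with vs. without the stage-1 residual.
naive = fit_exponential(np.column_stack([np.ones(n), treat]), time)
tsri = fit_exponential(np.column_stack([np.ones(n), treat, resid]), time)
# naive[1] is pulled toward zero by confounding through u; including the
# residual moves the treatment coefficient back toward the true value of -1.
```

The same two-stage logic carries over to richer outcome models (e.g., a frailty or Cox model in stage two), which is the setting the paper evaluates.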
Subject(s)
Epidemiologic Research Design , Kidney Failure, Chronic/mortality , Renal Dialysis/mortality , Selection Bias , Survival Analysis , Vascular Access Devices/statistics & numerical data , Aged , Comorbidity , Computer Simulation , Databases, Factual , Female , Humans , Kidney Failure, Chronic/economics , Kidney Failure, Chronic/therapy , Male , Middle Aged , Models, Econometric , Monte Carlo Method , Propensity Score , Proportional Hazards Models , Renal Dialysis/economics , Renal Dialysis/methods , United States/epidemiology
ABSTRACT
PURPOSE: Febrile neutropenia (FN) is a common and serious complication of myelosuppressive chemotherapy. Guidelines recommend primary granulocyte colony-stimulating factor (G-CSF) prophylaxis (PPG) in patients at high risk (HR, >20%) of developing FN. We performed a retrospective analysis using a subset of the Medicare 5% database to assess patterns of G-CSF use and FN occurrence among elderly cancer patients receiving myelosuppressive chemotherapy. METHODS: Chemotherapy courses for patients aged 65+ years were identified; only the first course was used for this analysis. Using clinical guidelines, chemotherapy regimens were classified as HR or intermediate risk (IR) for FN. The first administration of G-CSF was classified as either PPG (within the first 5 days of the first cycle), secondary prophylaxis, or reactive use. RESULTS: Twelve thousand seven hundred seven courses across five tumor types were classified as having an HR or IR regimen. G-CSF was used in 24.5-73.8% of patients receiving an HR FN regimen, with the highest use in breast cancer and non-Hodgkin lymphoma (NHL). Except in breast cancer (where PPG was used in 52.1% of patients), PPG was given to less than half of patients receiving an HR regimen. Depending on the tumor type, 4.8-22.6% of patients with an HR regimen had a neutropenia-related hospitalization. CONCLUSIONS: Guidelines recommend PPG for HR FN regimens, and older age (>65 years) is an important risk factor for developing severe neutropenic complications. However, our results show that in this elderly population, PPG was not routinely used (range 4.8-52.1%) in patients receiving HR FN regimens. Careful attention to FN risk factors, including chemotherapy regimen and patient age, is needed when planning treatment strategies.
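The three-way classification of the first G-CSF administration can be sketched as a simple rule. This is an illustrative reconstruction only: the function name, the precedence of the rules, and the use of FN-event dates to flag reactive use are assumptions, not the study's actual claims algorithm; the "within the first 5 days of the first cycle" window is taken from the abstract.

```python
from datetime import date, timedelta

def classify_first_gcsf(first_gcsf, cycle1_start, fn_events):
    """Classify the first G-CSF administration in a chemotherapy course.

    first_gcsf   -- date of first G-CSF dose (None if never given)
    cycle1_start -- start date of the first chemotherapy cycle
    fn_events    -- dates of febrile-neutropenia events, if any
    """
    if first_gcsf is None:
        return "none"
    # primary prophylaxis: within the first 5 days of cycle 1 (days 1-5)
    if cycle1_start <= first_gcsf <= cycle1_start + timedelta(days=4):
        return "primary prophylaxis"
    # reactive: given on or after an FN event (assumed definition)
    if any(ev <= first_gcsf for ev in fn_events):
        return "reactive"
    # otherwise: prophylactic use starting in a later cycle
    return "secondary prophylaxis"
```

For example, a first dose on day 3 of cycle 1 is counted as PPG, while a first dose given only after a febrile-neutropenia hospitalization is counted as reactive.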
Subject(s)
Antineoplastic Combined Chemotherapy Protocols/adverse effects , Chemotherapy-Induced Febrile Neutropenia/prevention & control , Granulocyte Colony-Stimulating Factor/administration & dosage , Neoplasms/drug therapy , Aged , Aged, 80 and over , Antineoplastic Combined Chemotherapy Protocols/therapeutic use , Female , Granulocyte Colony-Stimulating Factor/adverse effects , Humans , Male , Neoplasms/blood , Retrospective Studies , Risk Factors
ABSTRACT
BACKGROUND: The choice of vascular access type is an important aspect of care for incident hemodialysis patients. However, data from the Centers for Medicare & Medicaid Services (CMS) Medical Evidence Report (form CMS-2728) identifying the first access for incident patients have not previously been validated. Medicare began requiring that vascular access type be reported on claims in July 2010. We aimed to determine the agreement between the reported vascular access at initiation from form CMS-2728 and from Medicare claims. METHODS: This retrospective study used a cohort of 9777 patients who initiated dialysis in the latter half of 2010 and were eligible for Medicare at the start of renal replacement therapy to compare the vascular access type reported on form CMS-2728 with the type reported on Medicare outpatient dialysis claims for the same patients. For each patient, the reported access from each data source was compiled; the percent agreement represented the percent of patients for whom the access was the same. Multivariate logistic analysis was performed to identify characteristics associated with the agreement of reported access. RESULTS: The two data sources agreed for 94% of patients, with a Kappa statistic of 0.83, indicating an excellent level of agreement. Further, we found no evidence to suggest that agreement was associated with the patient characteristics of age, sex, race, or primary cause of renal failure. CONCLUSION: These results suggest that vascular access data as reported on form CMS-2728 are valid and reliable for use in research studies.
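The two agreement statistics reported here (94% percent agreement, Kappa of 0.83) follow standard definitions and are easy to reproduce. This is a minimal sketch with illustrative inputs, not the study's data: percent agreement is the share of patients with matching labels, and Cohen's kappa corrects that for the agreement expected by chance from each source's label frequencies.

```python
from collections import Counter

def percent_agreement(a, b):
    """Share of patients for whom both data sources report the same access."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

def cohens_kappa(a, b):
    """Cohen's kappa: observed agreement corrected for chance agreement."""
    n = len(a)
    p_obs = percent_agreement(a, b)
    ca, cb = Counter(a), Counter(b)
    # chance agreement from the marginal label frequencies of each source
    p_chance = sum(ca[k] * cb.get(k, 0) for k in ca) / n ** 2
    return (p_obs - p_chance) / (1 - p_chance)
```

On a toy pair of label lists where 7 of 8 patients match, percent agreement is 0.875 while kappa is only 0.75, illustrating why kappa is the stricter benchmark.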
Subject(s)
Arteriovenous Shunt, Surgical/statistics & numerical data , Catheterization, Central Venous/statistics & numerical data , Mandatory Reporting , Medical Errors/statistics & numerical data , Medicare/statistics & numerical data , Renal Dialysis/classification , Renal Dialysis/statistics & numerical data , Aged , Aged, 80 and over , Evidence-Based Medicine , Female , Humans , Male , Reproducibility of Results , Sensitivity and Specificity , United States
ABSTRACT
The incidence of stroke is substantially higher among hemodialysis patients than among patients with earlier stages of CKD, but to what extent the initiation of dialysis accelerates the risk for stroke is not well understood. In this cohort study, we analyzed data from incident hemodialysis and peritoneal dialysis patients in 2009 who were at least 67 years old and had Medicare as primary payer. We noted whether each of the 20,979 hemodialysis patients initiated dialysis as an outpatient (47%) or inpatient (53%). One year before initiation, the baseline stroke rate was 0.15%-0.20% of patients per month (ppm) for both outpatient and inpatient initiators. Among outpatient initiators, stroke rates began rising approximately 90 days before initiation, reached 0.5% ppm during the 30 days before initiation, and peaked at 0.7% ppm (8.4% per patient-year) during the 30 days after initiation. The pattern was similar among inpatient initiators, but the stroke rate peaked at 1.5% ppm (18% per patient-year). For both hemodialysis groups, stroke rates rapidly declined by 1-2 months after initiation, fluctuated, and stabilized at approximately twice the baseline rate by 1 year. Among the 620 peritoneal dialysis patients, stroke rates were slightly lower and variable, but approximately doubled after initiation. In conclusion, these data suggest that the process of initiating dialysis may cause strokes. Further studies should evaluate methods to mitigate the risk for stroke during this high-risk period.
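The annualized figures quoted alongside the monthly peaks (0.7% ppm as 8.4% per patient-year; 1.5% ppm as 18% per patient-year) follow from simple multiplicative scaling of the monthly rate, which can be sketched in one line (the function name is illustrative):

```python
def ppm_to_annual(ppm_pct):
    """Convert an event rate in % of patients per month (ppm) to
    % per patient-year by simple multiplicative scaling (rate x 12)."""
    return ppm_pct * 12

# peak post-initiation stroke rates from the abstract
outpatient_annual = ppm_to_annual(0.7)  # ~8.4% per patient-year
inpatient_annual = ppm_to_annual(1.5)   # 18% per patient-year
```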
Subject(s)
Renal Dialysis/adverse effects , Renal Insufficiency, Chronic/therapy , Stroke/epidemiology , Aged , Aged, 80 and over , Female , Humans , Incidence , Male , Medicare , Renal Insufficiency, Chronic/complications , Risk Factors , Stroke/etiology , Time Factors , United States/epidemiology
ABSTRACT
OBJECTIVE: Contrast-sparing strategies have been developed for percutaneous coronary intervention (PCI) patients at increased risk of contrast-induced acute kidney injury (CI-AKI), and numerous CI-AKI risk prediction models have been created. However, the potential clinical and economic consequences of using predicted CI-AKI risk thresholds for assigning patients to contrast-sparing regimens have not been evaluated. We estimated the clinical and economic consequences of alternative CI-AKI risk thresholds for assigning Medicare PCI patients to contrast-sparing strategies. METHODS: Medicare data were used to identify inpatient PCI from January 2017 to June 2021. A prediction model was developed to assign each patient a predicted probability of CI-AKI. Multivariable modeling was used to assign each patient two marginal predicted values for each of several clinical and economic outcomes based on (1) their underlying clinical and procedural characteristics plus their true CI-AKI status in the data and (2) their characteristics plus their counterfactual CI-AKI status. Specifically, CI-AKI patients above the predicted risk threshold for contrast-sparing were reassigned their no CI-AKI (counterfactual) outcomes. Expected event rates, resource use, and costs were estimated before and after those CI-AKI patients were reassigned their counterfactual outcomes, with estimates obtained by bootstrap sampling of the full cohort. RESULTS: Of the 542,813 patients in the study cohort, 5,802 (1.1%) had CI-AKI. The area under the receiver operating characteristic curve for the prediction model was 0.81. At a predicted risk threshold for CI-AKI of >2%, approximately 18.0% of PCI patients were assigned to contrast-sparing strategies, resulting in (per 100,000 PCI patients) 121 fewer deaths, 58 fewer myocardial infarction readmissions, 4,303 fewer PCI hospital days, $11.3 million PCI cost savings, and $25.8 million total one-year cost savings, versus no contrast-sparing strategies.
LIMITATIONS: Claims data may not fully capture disease burden and are subject to inherent limitations such as coding inaccuracies. Further, the dataset used reflects only individuals with fee-for-service Medicare, and the results may not be generalizable to Medicare Advantage or other patient populations. CONCLUSIONS: Assignment to contrast-sparing regimens at a predicted risk threshold close to the underlying incidence of CI-AKI is projected to result in significant clinical and economic benefits.
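The counterfactual-reassignment step can be sketched on simulated data. Every number below is a hypothetical stand-in (the risk distribution, costs, and the assumed $11,000 AKI cost increment are invented for illustration); only the >2% threshold logic mirrors the abstract: CI-AKI patients whose predicted risk exceeds the threshold are swapped to their no-CI-AKI outcomes, and the cost difference is the projected saving.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
risk = rng.beta(1, 60, size=n)        # hypothetical predicted CI-AKI risk
aki = rng.random(n) < risk            # observed CI-AKI status

# hypothetical costs: AKI assumed to add $11,000 to the admission
cost_no_aki = rng.normal(20_000, 3_000, size=n)
cost_aki = cost_no_aki + 11_000

threshold = 0.02                      # >2% threshold, as in the abstract
spared = risk > threshold             # assigned to contrast-sparing

# AKI patients above the threshold get their counterfactual (no-AKI) cost
observed = np.where(aki, cost_aki, cost_no_aki)
reassigned = np.where(aki & spared, cost_no_aki, observed)
savings = observed.sum() - reassigned.sum()
```

Because the prediction model concentrates AKI cases above the threshold, most of the incremental AKI cost is recovered even though only a minority of patients are assigned to contrast-sparing.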
Subject(s)
Acute Kidney Injury , Contrast Media , Medicare , Percutaneous Coronary Intervention , Humans , Percutaneous Coronary Intervention/methods , Percutaneous Coronary Intervention/economics , Percutaneous Coronary Intervention/adverse effects , Acute Kidney Injury/chemically induced , Contrast Media/adverse effects , United States , Male , Female , Aged , Risk Assessment , Aged, 80 and over , Risk Factors
ABSTRACT
BACKGROUND/AIMS: African-Americans with end-stage renal disease receiving dialysis have more severe secondary hyperparathyroidism than Whites. We aimed to assess racial differences in clinical use of cinacalcet. METHODS: This retrospective cohort study used data from DaVita, Inc., for 45,589 prevalent hemodialysis patients in August 2004, linked to Centers for Medicare & Medicaid Services data, with follow-up through July 2007. Patients with Medicare as primary payer, intravenous vitamin D use, and a weighted mean parathyroid hormone (PTH) level >150 pg/ml at baseline (August 1-October 31, 2004) were included. Cox proportional hazards modeling was used to evaluate race and other demographic and clinical characteristics as predictors of cinacalcet initiation, titration, and discontinuation. RESULTS: Of 16,897 included patients, 7,674 (45.4%) were African-American and 9,223 (54.6%) were White; 53.2% of cinacalcet users were African-American. Cinacalcet was prescribed for 47.7% of African-Americans and 34.5% of Whites, and for a greater percentage of African-Americans at higher doses in each PTH stratum. After covariate adjustment, African-Americans were more likely than Whites to receive cinacalcet prescriptions (hazard ratio 1.17, p < 0.001). The direction and magnitude of this effect appeared to vary by age, baseline PTH and calcium levels, and elemental calcium use. African-Americans were less likely than Whites to have prescriptions discontinued and slightly more likely to undergo uptitration (hazard ratio 1.09, 95% confidence interval 0.995-1.188), though the uptitration association did not reach statistical significance. CONCLUSION: Cinacalcet is prescribed more commonly and at higher initial doses for African-Americans than for Whites to manage secondary hyperparathyroidism.