Results 1 - 13 of 13
1.
Pharmacoeconomics ; 42(4): 447-461, 2024 Apr.
Article in English | MEDLINE | ID: mdl-38267806

ABSTRACT

OBJECTIVE: Cabotegravir long-acting (CAB-LA) administered every 2 months was approved in the USA as pre-exposure prophylaxis (PrEP) for individuals at risk of acquiring human immunodeficiency virus (HIV)-1 infection based on the HIV Prevention Trials Network (HPTN) 083 and HPTN 084 clinical trials, which demonstrated superior reduction in HIV-1 acquisition compared with daily oral emtricitabine/tenofovir disoproxil fumarate (FTC/TDF) in men who have sex with men (MSM), transgender women (TGW), and cisgender women. A decision-analytic model was developed to assess the lifetime cost-effectiveness of initiating CAB-LA versus generic oral FTC/TDF for HIV PrEP in the USA from a healthcare sector perspective. METHODS: PrEP-eligible adults entered the Markov model receiving CAB-LA or FTC/TDF and could continue initial PrEP, transition to a second PrEP option, or discontinue PrEP over time. Efficacy was taken from the HPTN 083 and HPTN 084 clinical trials. Individuals who acquired HIV-1 infection incurred lifetime HIV-related costs, could transmit HIV onwards, and could develop PrEP-related resistance mutations. Input parameter values were obtained from public and published sources. Model outcomes were discounted at 3%. RESULTS: The model estimated that the CAB-LA pathway prevented 4.5 more primary and secondary HIV-1 infections per 100 PrEP users than the oral PrEP pathway, which yielded 0.2 fewer quality-adjusted life-years (QALYs) lost per person. Additional per-person lifetime costs were $9476 (2022 US dollars), resulting in an incremental cost-effectiveness ratio of $46,843 per QALY gained. Results remained consistent in sensitivity and scenario analyses, including in underserved populations with low oral PrEP usage. CONCLUSIONS: Our analysis suggests that initiating CAB-LA for PrEP is cost-effective versus generic daily oral FTC/TDF for individuals at risk of acquiring HIV-1 infection.
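The headline cost-effectiveness figure is just the ratio of incremental cost to incremental QALYs. A minimal sketch of that arithmetic, using the rounded per-person values quoted above (the published $46,843/QALY was computed from unrounded model outputs, so the rounded inputs land slightly higher):

```python
def icer(delta_cost, delta_qalys):
    """Incremental cost-effectiveness ratio: extra cost per QALY gained."""
    return delta_cost / delta_qalys

# Rounded per-person figures from the abstract: $9,476 extra lifetime
# cost and ~0.2 QALYs gained for CAB-LA versus oral FTC/TDF.
cab_la_icer = icer(9476, 0.2)  # -> 47380.0
```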


Subjects
Anti-HIV Agents , Diketopiperazines , HIV Infections , Pre-Exposure Prophylaxis , Pyridones , Sexual and Gender Minorities , Male , Adult , Humans , Female , United States , Anti-HIV Agents/therapeutic use , Homosexuality, Male , Cost-Benefit Analysis , HIV Infections/prevention & control
2.
Temperature (Austin) ; 10(3): 379-393, 2023.
Article in English | MEDLINE | ID: mdl-37554387

ABSTRACT

We have previously identified predator odor as a potent stimulus activating thermogenesis in skeletal muscle in rats. As this may prove relevant for energy balance and weight loss, the current study investigated whether skeletal muscle thermogenesis was altered with negative energy balance, obesity propensity seen in association with low intrinsic aerobic fitness, and monogenic obesity. First, weight loss subsequent to 3 wk of 50% calorie restriction suppressed the muscle thermogenic response to predator odor. Next, we compared rats bred based on artificial selection for intrinsic aerobic fitness - high- and low-capacity runners (HCR, LCR) - that display robust leanness and obesity propensity, respectively. Aerobically fit HCR showed enhanced predator odor-induced muscle thermogenesis relative to the less-fit LCR. This contrasted with the profound monogenic obesity displayed by rats homozygous for a loss-of-function mutation in the melanocortin 4 receptor (Mc4r K314X/K314X rats), which showed no discernible deficit in thermogenesis. Taken together, these data imply that body size or obesity per se are not associated with deficient muscle thermogenesis. Rather, the physiological phenotype associated with polygenic obesity propensity may encompass pleiotropic mechanisms in the thermogenic pathway. Adaptive thermogenesis associated with weight loss also likely alters muscle thermogenic mechanisms.

3.
Vaccine ; 41(3): 684-693, 2023 01 16.
Article in English | MEDLINE | ID: mdl-36526505

ABSTRACT

INTRODUCTION: Nonpharmaceutical interventions (NPI) and ring vaccination (i.e., vaccination that primarily targets contacts and contacts of contacts of Ebola cases) are currently used to reduce the spread of Ebola during outbreaks. Because these measures are typically initiated after an outbreak is declared, they are limited by real-time implementation challenges. Preventive vaccination may provide a complementary option to help protect communities against unpredictable outbreaks. This study aimed to assess the impact of preventive vaccination strategies when implemented in conjunction with NPI and ring vaccination. METHODS: A spatially explicit, individual-based model (IBM) that accounts for heterogeneity of human contact, human movement, and timing of interventions was built to represent Ebola transmission in the Democratic Republic of the Congo. Simulated preventive vaccination strategies targeted healthcare workers (HCW), frontline workers (FW), and the general population (GP) with varying levels of coverage (lower coverage: 30% of HCW/FW, 5% of GP; higher coverage: 60% of HCW/FW, 10% of GP) and efficacy (lower efficacy: 60%; higher efficacy: 90%). RESULTS: The IBM estimated that the addition of preventive vaccination for HCW reduced cases, hospitalizations, and deaths by ~11% to ~25% compared with NPI + ring vaccination alone. Including HCW and FW in the preventive vaccination campaign yielded ~14% to ~38% improvements in epidemic outcomes. Further including the GP yielded the greatest improvements, with ~21% to ~52% reductions in epidemic outcomes compared with NPI + ring vaccination alone. In a scenario without ring vaccination, preventive vaccination reduced cases, hospitalizations, and deaths by ~28% to ~59% compared with NPI alone. In all scenarios, preventive vaccination reduced Ebola transmission particularly during the initial phases of the epidemic, resulting in flatter epidemic curves.
CONCLUSIONS: The IBM showed that preventive vaccination may reduce Ebola cases, hospitalizations, and deaths, thus safeguarding the healthcare system and providing more time to implement additional interventions during an outbreak.
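The flatter-curve result can be rationalized with the textbook homogeneous-mixing approximation, in which vaccination scales down the effective reproduction number. This is only a back-of-envelope analogue of the study's spatially explicit individual-based model, and the R0 value here is hypothetical:

```python
def r_effective(r0, coverage, efficacy):
    """Effective reproduction number when a fraction of the population
    is vaccinated with an imperfect vaccine (homogeneous mixing)."""
    return r0 * (1.0 - coverage * efficacy)

R0 = 1.8  # hypothetical basic reproduction number, for illustration only
low_scenario = r_effective(R0, coverage=0.05, efficacy=0.60)   # GP, lower bounds
high_scenario = r_effective(R0, coverage=0.10, efficacy=0.90)  # GP, higher bounds
```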


Subjects
Ebolavirus , Epidemics , Hemorrhagic Fever, Ebola , Humans , Hemorrhagic Fever, Ebola/epidemiology , Hemorrhagic Fever, Ebola/prevention & control , Disease Outbreaks/prevention & control , Vaccination/methods , Immunization Programs , Democratic Republic of the Congo/epidemiology
4.
Exp Physiol ; 106(8): 1731-1742, 2021 08.
Article in English | MEDLINE | ID: mdl-34086376

ABSTRACT

NEW FINDINGS: What is the central question of this study? How does intrinsic aerobic capacity impact weight loss with 50% daily caloric restriction and alternate-day fasting? What is the main finding and its importance? Intermittent fasting is effective for weight loss in rats with low fitness, which highlights the importance of how intermittent fasting interacts with aerobic fitness. ABSTRACT: Recent interest has focused on the benefits of time-restricted feeding strategies, including intermittent fasting, for weight loss. It is not yet known whether intermittent fasting is more effective than daily caloric restriction at stimulating weight loss and how each is subject to individual differences. Here, rat models of leanness and obesity, artificially selected for intrinsically high (HCR) and low (LCR) aerobic capacity, were subjected to intermittent fasting and 50% calorie restrictive diets in two separate experiments using male rats. The lean, high-fitness HCR and obesity-prone, low-fitness LCR rats underwent 50% caloric restriction while body weight and composition were monitored. The low-fitness LCR rats were better able to retain lean mass than the high-fitness HCR rats, without significantly different proportional loss of weight or fat. In a separate experiment using intermittent fasting in male HCR and LCR rats, alternate-day fasting induced significantly greater loss of weight and fat mass in LCR compared with HCR rats, although the HCR rats had a more marked reduction in ad libitum daily food intake. Altogether, this suggests that intermittent fasting is an effective weight-loss strategy for those with low intrinsic aerobic fitness; however, direct comparison of caloric restriction and intermittent fasting is warranted to determine any differential effects on energy expenditure in lean and obesity-prone phenotypes.


Subjects
Caloric Restriction , Fasting , Animals , Male , Obesity , Phenotype , Rats , Weight Loss
5.
Pharmacoeconomics ; 39(4): 421-432, 2021 04.
Article in English | MEDLINE | ID: mdl-33532919

ABSTRACT

BACKGROUND: Ibalizumab-uiyk (ibalizumab) is a first-in-class, long-acting, postattachment HIV-1 inhibitor for adults with multidrug-resistant (MDR) HIV-1 infection. This analysis examines the cost-effectiveness and budget impact of ibalizumab treatment for this difficult-to-treat population in the United States. METHODS: A Markov model followed cohorts of adults with MDR HIV-1 infection through two final lines of antiretroviral therapy: ibalizumab + optimized background therapy (OBT) or OBT alone followed by nonsuppressive therapy. Model inputs were based on ibalizumab clinical trial data, market uptake projections, and published literature, with costs in 2019 dollars. The cost-effectiveness analysis assessed costs and health outcomes from a health care sector perspective for individuals receiving ibalizumab + OBT versus OBT alone over a lifetime time horizon. The budget-impact analysis estimated the impact on payer budgets of the introduction of ibalizumab over 3 years for a hypothetical commercial health plan. RESULTS: Compared with individuals receiving OBT alone, individuals receiving ibalizumab + OBT incurred higher costs but lived longer, healthier lives, with an incremental cost of $133,040 per QALY gained. For a hypothetical commercial health plan with 1 million members, the introduction of ibalizumab + OBT was estimated to increase budgets by $217,260, $385,245, and $560,310 ($0.018, $0.032, and $0.047 per member per month) in years 1, 2, and 3, respectively. These results were found to be robust in sensitivity and scenario analyses. CONCLUSIONS: Ibalizumab may represent a cost-effective and affordable option to improve health outcomes for individuals with MDR HIV-1 infection.
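The per-member-per-month figures follow directly from the annual budget impacts and plan size. A sketch of that conversion, using the numbers reported above:

```python
def per_member_per_month(annual_impact, members):
    """Spread an annual budget impact over all plan members and 12 months."""
    return annual_impact / (members * 12)

annual_impacts = [217_260, 385_245, 560_310]  # years 1-3, from the abstract
pmpm = [round(per_member_per_month(x, 1_000_000), 3) for x in annual_impacts]
# pmpm -> [0.018, 0.032, 0.047], matching the reported values
```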


Subjects
HIV Infections , HIV-1 , Adult , Antibodies, Monoclonal , Cost-Benefit Analysis , HIV Infections/drug therapy , Humans , United States
6.
PLoS One ; 13(8): e0203293, 2018.
Article in English | MEDLINE | ID: mdl-30161205

ABSTRACT

METHODS: Ninety-six-week costs for antiretroviral drugs, adverse event management, and HIV care for individuals initiating RAL, ATV/r, or DRV/r as first-line therapy for HIV-1 infection were estimated using an economic model. Efficacy and safety data (mean CD4 cell count changes, discontinuation rates, grade 3/4 adverse event incidence) for each regimen through 96 weeks of treatment were taken from the ACTG 5257 clinical trial. Antiretroviral drug costs for each initial regimen and for each substitution regimen, as used by individuals who discontinued their initial regimen, were based on wholesale acquisition costs. Adverse event management costs and HIV care costs, stratified by CD4 cell count range, were taken from published sources and inflated to 2016 dollars. Scenario and sensitivity analyses were conducted to assess the robustness of the results. Cost outcomes were discounted at an annual rate of 3.0%. RESULTS: Total 96-week costs were $81,231 for RAL, $88,064 for ATV/r, and $87,680 for DRV/r, where differences were primarily due to lower antiretroviral drug costs for RAL than for ATV/r or DRV/r. These results were found to be robust in scenario and sensitivity analyses. CONCLUSIONS: Relative to the DRV/r and ATV/r regimens, the RAL regimen had the lowest cost for treatment-naive adults with HIV-1 infection in the United States.
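The regimen comparison in the results is ultimately a cost ranking; restated with the reported 96-week totals:

```python
# Total 96-week per-patient costs from the abstract (2016 US$).
costs = {"RAL": 81_231, "ATV/r": 88_064, "DRV/r": 87_680}

cheapest = min(costs, key=costs.get)  # -> "RAL"
savings = {k: v - costs[cheapest] for k, v in costs.items() if k != cheapest}
# savings -> {"ATV/r": 6833, "DRV/r": 6449} saved per patient by choosing RAL
```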


Subjects
Anti-HIV Agents/economics , Anti-HIV Agents/therapeutic use , HIV Infections/drug therapy , HIV Infections/economics , HIV-1 , Adult , Anti-HIV Agents/adverse effects , Atazanavir Sulfate/adverse effects , Atazanavir Sulfate/economics , Atazanavir Sulfate/therapeutic use , Cost-Benefit Analysis , Darunavir/adverse effects , Darunavir/economics , Darunavir/therapeutic use , Drug Therapy, Combination , Female , Health Care Costs , Humans , Male , Models, Economic , Raltegravir Potassium/adverse effects , Raltegravir Potassium/economics , Raltegravir Potassium/therapeutic use , Ritonavir/adverse effects , Ritonavir/economics , Ritonavir/therapeutic use , United States
7.
HIV Clin Trials ; 18(5-6): 214-222, 2017.
Article in English | MEDLINE | ID: mdl-29210626

ABSTRACT

INTRODUCTION: The AIDS Clinical Trial Group (ACTG) 5257 clinical trial showed that raltegravir (RAL) was superior to atazanavir/ritonavir (ATV/r) and darunavir/ritonavir (DRV/r), when used in combination with emtricitabine/tenofovir DF (FTC/TDF), in a 96-week composite endpoint combining virologic efficacy and tolerability for treatment-naive adults with HIV-1 infection. This study aimed to estimate the efficiency associated with these three regimens in Spain. METHODS: An economic model was developed to estimate costs for antiretroviral drugs, adverse event management, and HIV care for individuals initiating first-line therapy. Antiretroviral drug costs were based on hospital costs with mandatory discounts applied. Adverse event management costs and HIV care costs were obtained from published sources and inflated to 2015 euros. Head-to-head efficacy and safety data (discontinuation rates, mean CD4 cell-count changes, adverse event incidence) up to 96 weeks for each regimen were obtained from the clinical trial. The efficiency of each regimen, as measured by the cost per successfully treated patient (i.e. on first-line therapy for 96 weeks), was estimated and examined in sensitivity analyses. All cost outcomes were discounted at 3.0% annually. RESULTS: Total costs per successfully treated patient were €22,377 for RAL, €26,629 for ATV/r, and €23,928 for DRV/r. These results were found to be robust in sensitivity analyses. DISCUSSION: RAL has the lowest cost per successfully treated patient when compared with DRV/r and ATV/r, each used in combination with FTC/TDF, for treatment-naive adults with HIV-1 infection in Spain. This economic evidence complements the clinical benefits of RAL reported in the ACTG 5257 clinical trial.
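The efficiency metric here, cost per successfully treated patient, divides total cohort cost by the number still on first-line therapy at week 96. A sketch with purely hypothetical inputs (the paper's €22,377 for RAL derives from its own cost and discontinuation data, not these numbers):

```python
def cost_per_success(total_cohort_cost, n_patients, retention_rate):
    """Cost per patient still on first-line therapy at week 96."""
    return total_cohort_cost / (n_patients * retention_rate)

# Hypothetical illustration: 100 patients, EUR 1.8M total cost, 80% retained.
example = cost_per_success(1_800_000, 100, 0.80)  # -> 22500.0
```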


Subjects
Anti-HIV Agents/administration & dosage , Antiretroviral Therapy, Highly Active/methods , Costs and Cost Analysis , HIV Infections/drug therapy , Adult , Anti-HIV Agents/adverse effects , Anti-HIV Agents/economics , Antiretroviral Therapy, Highly Active/adverse effects , Antiretroviral Therapy, Highly Active/economics , Cohort Studies , Drug-Related Side Effects and Adverse Reactions/epidemiology , Female , Humans , Male , Spain , Treatment Outcome
8.
Hum Vaccin Immunother ; 13(3): 533-542, 2017 03 04.
Article in English | MEDLINE | ID: mdl-27780425

ABSTRACT

Trivalent inactivated influenza vaccines (IIV3s) protect against 2 A strains and one B lineage; quadrivalent versions (IIV4s) protect against an additional B lineage. The objective was to assess projected health and economic outcomes associated with IIV4 versus IIV3 for preventing seasonal influenza in the US. A cost-effectiveness model was developed to interact with a dynamic transmission model. The transmission model tracked vaccination, influenza cases, infection-spreading interactions, and recovery over 10 y (2012-2022). The cost-effectiveness model estimated influenza-related complications, direct and indirect costs (2013-2014 US$), health outcomes, and cost-effectiveness. Inputs were taken from published/public sources or estimated using regression or calibration. Outcomes were discounted at 3% per year. Scenario analyses tested the reliability of the results. Seasonal vaccination with IIV4 versus IIV3 is predicted to reduce annual influenza cases by 1,973,849 (discounted; 2,325,644 undiscounted), resulting in 12-13% fewer cases and influenza-related complications and deaths. These reductions are predicted to translate into 18,485 more quality-adjusted life years (QALYs) accrued annually for IIV4 versus IIV3. Increased vaccine-related costs ($599 million; 5.7%) are predicted to be more than offset by reduced influenza treatment costs ($699 million; 12.2%), resulting in direct medical cost saving annually ($100 million; 0.6%). Including indirect costs, savings with IIV4 are predicted to be $7.1 billion (5.6%). Scenario analyses predict IIV4 to be cost-saving in all scenarios tested apart from low infectivity, where IIV4 is predicted to be cost-effective. In summary, seasonal influenza vaccination in the US with IIV4 versus IIV3 is predicted to improve health outcomes and reduce costs.
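The direct-cost conclusion rests on a simple offset between added vaccine spending and avoided treatment costs; restated with the abstract's own figures:

```python
# Annual direct costs, in millions of 2013-2014 US$ (from the abstract).
added_vaccine_cost = 599   # extra spending on IIV4 versus IIV3 (+5.7%)
avoided_treatment = 699    # reduced influenza treatment costs (-12.2%)
net_direct_saving = avoided_treatment - added_vaccine_cost
# net_direct_saving -> 100, the $100 million annual direct saving reported
```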


Subjects
Cost-Benefit Analysis , Influenza Vaccines/administration & dosage , Influenza Vaccines/economics , Influenza, Human/prevention & control , Vaccination/economics , Vaccination/statistics & numerical data , Adolescent , Adult , Aged , Aged, 80 and over , Child , Child, Preschool , Female , Humans , Infant , Infant, Newborn , Influenza Vaccines/immunology , Male , Middle Aged , United States , Young Adult
9.
Med Decis Making ; 35(6): 797-807, 2015 08.
Article in English | MEDLINE | ID: mdl-25385750

ABSTRACT

The national demand for kidney transplantation far outweighs the supply of kidney organs. Currently, a patient's ability to receive a kidney transplant varies depending on where he or she seeks transplantation. This reality is in direct conflict with a federal mandate from the Department of Health and Human Services. We analyze current kidney allocation and develop an alternative kidney sharing strategy using a multiperiod linear optimization model, KSHARE. KSHARE aims to improve geographic equity in kidney transplantation while also respecting transplant system constraints and priorities. KSHARE is tested against actual 2000-2009 kidney allocation using Organ Procurement and Transplant Network data. Geographic equity is represented by minimizing the range in kidney transplant rates around local areas of the country. In 2009, less than 25% of standard criteria donor kidneys were allocated beyond the local area of procurement, and Donor Service Area kidney transplantation rates varied from 3.0% to 30.0%, for an overall range of 27.0%. Given optimal sharing of kidneys within 600 miles of procurement for 2000-2009, kidney transplant rates vary from 5.0% to 12.5% around the country for an overall kidney transplant range of 7.5%. Nationally sharing kidneys optimally between local areas only further decreases the transplant rate range by 1.7%. Enhancing the practice of sharing kidneys by the KSHARE model may increase geographic equity in kidney transplantation.
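The equity metric KSHARE minimizes is the spread of transplant rates across donor service areas; a sketch of that metric applied to the endpoints reported above:

```python
def rate_range(rates):
    """Geographic-equity metric from the abstract: the spread (max - min)
    of DSA kidney transplant rates, in percentage points."""
    return max(rates) - min(rates)

actual_2009 = rate_range([3.0, 30.0])  # -> 27.0, actual 2009 allocation
optimized = rate_range([5.0, 12.5])    # -> 7.5, optimal sharing within 600 miles
```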


Subjects
Computer Simulation , Health Services Accessibility/statistics & numerical data , Health Services Needs and Demand/statistics & numerical data , Kidney Transplantation/statistics & numerical data , Resource Allocation/statistics & numerical data , Tissue and Organ Procurement/statistics & numerical data , Health Equity , Humans , United States
10.
Transplantation ; 98(9): 931-6, 2014 Nov 15.
Article in English | MEDLINE | ID: mdl-25286057

ABSTRACT

BACKGROUND: The national organ allocation system for deceased-donor kidney transplant will endure increased burden as the waitlist expands and organ shortage persists. The Department of Health and Human Services issued the "Final Rule" in 1998 that states "Organs and tissues ought to be distributed on the basis of objective priority criteria and not on the basis of accidents of geography." However, it has not been addressed whether the rule was effective in encouraging regions to share the additional burden equitably. OBJECTIVE: To assess the significance of changes of geographic disparities for four metrics since the rule's adoption: waiting times, transplant rates, pretransplant mortality, and organ quality. METHODS: Using Organ Procurement and Transplant Network data from 1988 through 2009, annual ranges of the metrics were calculated for all donor service areas and United Network for Organ Sharing regions. Time series analyses were used to compare the metrics before and after the enactment of the Final Rule. RESULTS: A total of 412,127 kidney transplant candidates and 178,163 deceased-donor recipients were analyzed. Demographics varied significantly by region. The ranges of the four metrics have worsened by approximately 30% or more after the Final Rule at both the regional and donor service area levels. CONCLUSION: Increasing geographic disparity in allocation procedures may yield diverging outcomes and experiences in different locations for otherwise similar candidates. Consensus for measuring allocation discrepancies and policy interventions are required to mitigate the inequities.


Subjects
Healthcare Disparities , Kidney Transplantation , Tissue and Organ Procurement , Geography , Health Policy , Health Services Accessibility , Humans , Kidney/pathology , Kidney Transplantation/standards , Organ Transplantation , Outcome Assessment, Health Care , Time Factors , Tissue Donors , Tissue and Organ Procurement/standards , United States , Waiting Lists
11.
Am J Surg ; 208(4): 605-18, 2014 Oct.
Article in English | MEDLINE | ID: mdl-25118164

ABSTRACT

BACKGROUND: Liver transplantation is a complex surgery associated with high rates of postoperative complications. While national outcomes data are available, national rates of most complications are unknown. DATA SOURCES: A systematic review of the literature reporting rates of postoperative complications between 2002 and 2012 was performed. A cohort of 29,227 deceased donor liver transplant recipients from 74 studies was used to calculate pooled incidences for 17 major postoperative complications. CONCLUSIONS: This is the first comprehensive review of postoperative complications after liver transplantation and can serve as a guide for transplant and nontransplant clinicians. Efforts to collect national data on complications, such as through the National Surgical Quality Improvement Program, would improve the ability to provide patients with informed consent, serve as a tool for individual center performance monitoring, and provide a central source against which to measure interventions aimed at improving patient care.
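A pooled incidence of the kind described can be computed by totaling events over patients across studies. This fixed-effect sketch uses hypothetical study data; the review's actual pooling method may have weighted studies differently:

```python
def pooled_incidence(studies):
    """Simple pooled incidence: total events over total patients.
    `studies` is a list of (events, sample_size) pairs, one per study."""
    total_events = sum(e for e, _ in studies)
    total_n = sum(n for _, n in studies)
    return total_events / total_n

# Hypothetical three-study example for one postoperative complication:
rate = pooled_incidence([(12, 200), (30, 500), (8, 100)])  # -> 0.0625
```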


Subjects
Liver Transplantation , Postoperative Complications , Tissue Donors , Graft Survival , Humans
12.
Clin J Am Soc Nephrol ; 9(8): 1449-60, 2014 Aug 07.
Article in English | MEDLINE | ID: mdl-24970871

ABSTRACT

BACKGROUND AND OBJECTIVES: The Statewide Sharing variance to the national kidney allocation policy allocates kidneys not used within the procuring donor service area (DSA), first within the state, before the kidneys are offered regionally and nationally. Tennessee and Florida implemented this variance. Known geographic differences exist between the 58 DSAs, in direct violation of the Final Rule stipulated by the US Department of Health and Human Services. This study examined the effect of Statewide Sharing on geographic allocation disparity over time between DSAs within Tennessee and Florida and compared them with geographic disparity between the DSAs within a state for all states with more than one DSA (California, New York, North Carolina, Ohio, Pennsylvania, Texas, and Wisconsin). DESIGN, SETTING, PARTICIPANTS, & MEASUREMENTS: A retrospective analysis from 1987 to 2009 was conducted using Organ Procurement and Transplant Network data. Five previously used indicators for geographic allocation disparity were applied: deceased-donor kidney transplant rates, waiting time to transplantation, cumulative dialysis time at transplantation, 5-year graft survival, and cold ischemic time. RESULTS: Transplant rates, waiting time, dialysis time, and graft survival varied greatly between deceased-donor kidney recipients in DSAs in all states in 1987. After implementation of Statewide Sharing in 1992, disparity indicators decreased by 41%, 36%, 31%, and 9%, respectively, in Tennessee and by 28%, 62%, 34%, and 19%, respectively in Florida, such that the geographic allocation disparity in Tennessee and Florida almost completely disappeared. Statewide kidney allocations incurred 7.5 and 5 fewer hours of cold ischemic time in Tennessee and Florida, respectively. Geographic disparity between DSAs in all the other states worsened or improved to a lesser degree. 
CONCLUSIONS: As sweeping changes to the kidney allocation system are being discussed to alleviate geographic disparity--changes that are untested run the risk of unintended consequences--more limited changes, such as Statewide Sharing, should be further studied and considered.


Subjects
Health Policy/trends , Health Services Accessibility/trends , Healthcare Disparities/trends , Kidney Failure, Chronic/therapy , Kidney Transplantation/trends , Residence Characteristics , State Health Plans/trends , Tissue and Organ Procurement/trends , Adolescent , Adult , Aged , Child , Child, Preschool , Cold Ischemia/trends , Female , Graft Survival , Humans , Infant , Infant, Newborn , Kidney Failure, Chronic/diagnosis , Kidney Failure, Chronic/epidemiology , Kidney Transplantation/adverse effects , Male , Middle Aged , Policy Making , Regional Health Planning/trends , Renal Dialysis/trends , Retrospective Studies , State Government , Time Factors , Tissue Donors/supply & distribution , Tissue and Organ Procurement/organization & administration , Treatment Outcome , United States/epidemiology , Waiting Lists , Young Adult
13.
Transplantation ; 97(10): 1049-57, 2014 May 27.
Article in English | MEDLINE | ID: mdl-24374790

ABSTRACT

BACKGROUND: Waiting time to deceased donor kidney transplant varies greatly across the United States. This variation violates the final rule, a federal mandate, which demands geographic equity in organ allocation for transplantation. METHODS: Retrospective analysis of the United States Renal Data System and United Network for Organ Sharing database from 2000 to 2009. Median waiting time was calculated for each of the 58 donor service areas (DSA) in the United States. Multivariate regression was performed to identify DSA predictors for long waiting times to kidney transplantation. RESULTS: The median waiting time varied between the 58 DSAs from 0.61 to 4.57 years, ranging from 0.59 to 5.17 years for standard criteria donor kidneys and 0.41 to 4.69 years for expanded criteria donor kidneys. The disparity in waiting time between the DSAs grew from 3.26 years (range, 0.41-3.67) in 2000 to 4.72 years (range, 0.50-5.22) in 2009. In DSAs with longer waiting times, there were significantly more patients suffering from end-stage renal disease and more patients listed for kidney transplant, lower kidney procurement rates, and higher transplant center competition. Patients were more likely black, sensitized, with lower educational attainment and less likely to waitlist outside of their DSA of residence. Donor organs used in DSAs with long waiting times were more likely hepatitis C positive and had a higher kidney donor profile index. Graft and patient survival at 5 years was worse for deceased donor kidney transplant, but rates for living donor kidney transplant were higher. CONCLUSION: Our analysis demonstrates significant and worsening geographic disparity in waiting time for kidney transplant across the DSAs. Increase in living donor kidney transplant and use of marginal organs has not mitigated the disparity. Changes to the kidney allocation system might be required to resolve this extensive geographic disparity in kidney allocation.
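The disparity measure tracked over the decade is the range of DSA median waiting times; restated with the endpoints reported above:

```python
def waiting_time_spread(median_waits):
    """Disparity metric: range of DSA median waiting times (years),
    rounded to two decimals as reported in the abstract."""
    return round(max(median_waits) - min(median_waits), 2)

spread_2000 = waiting_time_spread([0.41, 3.67])  # -> 3.26
spread_2009 = waiting_time_spread([0.50, 5.22])  # -> 4.72
```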


Subjects
Kidney Failure, Chronic/surgery , Kidney Transplantation , Registries , Tissue Donors/supply & distribution , Tissue and Organ Procurement/organization & administration , Waiting Lists , Adolescent , Adult , Female , Follow-Up Studies , Humans , Male , Middle Aged , Retrospective Studies , Time Factors , United States