Results 1 - 17 of 17
1.
J Clin Oncol ; 42(10): 1193-1201, 2024 Apr 01.
Article in English | MEDLINE | ID: mdl-38381994

ABSTRACT

PURPOSE: The US Food and Drug Administration (FDA) approved elacestrant for the treatment of postmenopausal women or adult men with estrogen receptor-positive (ER+), human epidermal growth factor receptor 2-negative (HER2-), estrogen receptor 1 (ESR1)-mutated advanced or metastatic breast cancer with disease progression after at least one line of endocrine therapy (ET). PATIENTS AND METHODS: Approval was based on EMERALD (Study RAD1901-308), a randomized, open-label, active-controlled, multicenter trial in 478 patients with ER+, HER2- advanced or metastatic breast cancer, including 228 patients with ESR1 mutations. Patients were randomly assigned (1:1) to receive either elacestrant 345 mg orally once daily (n = 239) or investigator's choice of ET (n = 239). RESULTS: In the ESR1-mut subgroup, EMERALD demonstrated a statistically significant improvement in progression-free survival (PFS) by blinded independent central review assessment (n = 228; hazard ratio [HR], 0.55 [95% CI, 0.39 to 0.77]; P value = .0005). Although the overall survival (OS) end point was not met, there was no trend toward a potential OS detriment (HR, 0.90 [95% CI, 0.63 to 1.30]) in the ESR1-mut subgroup. PFS also reached statistical significance in the intention-to-treat population (ITT, N = 478; HR, 0.70 [95% CI, 0.55 to 0.88]; P value = .0018). However, improvement in PFS in the ITT population was primarily attributed to results from patients in the ESR1-mut subgroup. More patients who received elacestrant experienced nausea, vomiting, and dyslipidemia. CONCLUSION: The approval of elacestrant in ER+, HER2- advanced or metastatic breast cancer was restricted to patients with ESR1 mutations. Benefit-risk assessment in the ESR1-mut subgroup was favorable on the basis of a statistically significant improvement in PFS in the context of an acceptable safety profile including no evidence of a potential detriment in OS. 
By contrast, the benefit-risk assessment in patients without ESR1 mutations was not favorable. Elacestrant is the first oral estrogen receptor antagonist to receive FDA approval for patients with ESR1 mutations.
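As a quick sanity check, a P value consistent in magnitude with those reported can be recovered from a hazard ratio and its 95% CI under a normal approximation on the log-HR scale. This is a reader-side back-calculation, not the trial's prespecified analysis:

```python
import math

def p_from_hr_ci(hr, lo, hi, z_level=1.96):
    """Two-sided p-value recovered from a hazard ratio and its 95% CI,
    assuming normality of log(HR)."""
    se = (math.log(hi) - math.log(lo)) / (2.0 * z_level)
    z = math.log(hr) / se
    return math.erfc(abs(z) / math.sqrt(2.0))

# ESR1-mut subgroup PFS from the abstract: HR 0.55 (95% CI, 0.39 to 0.77)
print(round(p_from_hr_ci(0.55, 0.39, 0.77), 4))
```

The back-calculated value lands near the reported P = .0005, as expected when the published CI and P value come from the same normal approximation.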


Subjects
Breast Neoplasms , Tetrahydronaphthalenes , Adult , United States , Humans , Female , Breast Neoplasms/drug therapy , Breast Neoplasms/genetics , Breast Neoplasms/pathology , Estrogen Receptor alpha/genetics , United States Food and Drug Administration , Receptor, ErbB-2/metabolism , Antineoplastic Combined Chemotherapy Protocols/therapeutic use
2.
Clin Cancer Res ; 28(11): 2221-2228, 2022 06 01.
Article in English | MEDLINE | ID: mdl-35101885

ABSTRACT

FDA's approval of cemiplimab-rwlc on February 22, 2021, follows prior approvals of pembrolizumab and atezolizumab for similar indications as first-line treatment for patients with programmed death ligand-1 (PD-L1)-high advanced non-small cell lung cancer (NSCLC). Approvals of these anti-PD-(L)1 agents were supported by statistically significant and clinically meaningful improvements in overall survival (OS) in international, multicenter, active-controlled randomized trials. In KEYNOTE-024, the OS HR was 0.60 [95% confidence interval (CI), 0.41-0.89; P = 0.005] favoring pembrolizumab over platinum-doublet chemotherapy. In IMpower110, the OS HR was 0.59 (95% CI, 0.40-0.89; P = 0.0106) favoring atezolizumab over platinum-doublet chemotherapy. In Study 1624, the OS HR was 0.68 (95% CI, 0.53-0.87; P = 0.0022) favoring cemiplimab-rwlc over platinum-doublet chemotherapy. The progression-free survival (PFS) effect sizes for these anti-PD-(L)1 antibodies were also comparable across their respective registrational trials, and their safety profiles were consistent with the anti-PD-(L)1 class adverse event profile. The consistent survival benefits and manageable toxicity profiles of these single-agent anti-PD-(L)1 antibodies have established them as important treatment options in the PD-L1-high NSCLC treatment landscape. FDA approvals of these anti-PD-(L)1 antibodies, based on their favorable benefit-risk profiles, present effective chemotherapy-free therapeutic options for patients with advanced PD-L1-high NSCLC in the United States.


Subjects
Carcinoma, Non-Small-Cell Lung , Lung Neoplasms , Antibodies, Monoclonal, Humanized , Antineoplastic Combined Chemotherapy Protocols/adverse effects , B7-H1 Antigen , Carcinoma, Non-Small-Cell Lung/pathology , Humans , Lung Neoplasms/pathology , Platinum/therapeutic use , United States
3.
Drug Saf ; 45(2): 169-180, 2022 02.
Article in English | MEDLINE | ID: mdl-35113347

ABSTRACT

INTRODUCTION: New safety issues concerning US FDA-approved drugs are commonly communicated through safety-related labeling changes. Therefore, to optimize and refine postmarket safety surveillance strategies, it is important to comprehensively characterize the sources of data giving rise to safety-related labeling changes. OBJECTIVES: Our objective was to characterize the sources of data triggering and supporting the identification of new safety risks of FDA-approved drugs communicated through safety-related labeling changes. METHODS: We conducted a retrospective study with a 10-year observation period using FDA's internal electronic data repositories for all prescription new molecular entities (NMEs) approved in 2008. We collected and analyzed information on new safety issues, the section of the full prescribing information updated, initiators (FDA, drug manufacturer), and triggering and supporting sources of evidence. RESULTS: Among 22 NMEs approved in 2008, 189 new safety issues for 18 NMEs were identified. Compared with drug manufacturers, FDA initiated safety-related labeling changes in nine of the ten changes to the Boxed Warnings, 28 of the 52 changes to the Warnings and Precautions, and 43 of the 134 changes to the Adverse Reactions sections of the full prescribing information. The most frequent triggering sources of evidence included the drug manufacturer safety database (32.3%) and FDA Adverse Event Reporting System (FAERS) safety reports (15.3%) for all relevant sections of the full prescribing information, and class-labeling changes (17.5%) for the Boxed Warnings and Warnings and Precautions sections. The most frequent triggering source of evidence was FAERS safety reports (69%) in the first year after drug approval and the drug manufacturer safety database in subsequent years.
CONCLUSIONS: Our findings emphasize the continued importance of safety reports from FAERS and drug manufacturer safety databases and a comprehensive drug safety surveillance program throughout a drug's lifecycle.


Subjects
Drug-Related Side Effects and Adverse Reactions , Product Surveillance, Postmarketing , Drug Approval , Drug Labeling , Drug-Related Side Effects and Adverse Reactions/epidemiology , Humans , Retrospective Studies , United States , United States Food and Drug Administration
4.
Clin Pharmacol Ther ; 110(6): 1512-1525, 2021 12.
Article in English | MEDLINE | ID: mdl-34057195

ABSTRACT

We characterized the size of the premarket safety population for 278 small-molecule new molecular entities (NMEs) and 61 new therapeutic biologics (NTBs) approved by the US Food and Drug Administration (FDA) between October 1, 2002, and December 31, 2014, evaluating the relationship of premarket safety population size to regulatory characteristics and postmarket safety outcomes. The median size of the safety population was 1,044, and was lower for NTBs than NMEs (median: 920 vs. 1,138, P = 0.04), orphan products than nonorphan products (393 vs. 1,606, P < 0.001), and for products with fast-track designation (617 vs. 1,455, P < 0.001), priority review (630 vs. 1,735, P < 0.001), and accelerated approval (475 vs. 1,164, P < 0.001), than products without that designation. The median number of postmarket safety label updates and issues added to the label were higher with larger premarket exposure among nonorphan products, but not among orphan products. Products with accelerated approval using a surrogate end point had a higher median number of safety issues added to the label than those with full approval, but this did not vary with the size of the safety population; fast-track and priority review were not associated with the number of safety issues added to the label. A smaller safety population size was associated with a longer time to first safety outcome for nonorphan products but not orphan products. For orphan and nonorphan products combined, smaller premarket safety population size is not associated with the number or timing of postmarket safety outcomes, regardless of expedited program participation.


Subjects
Biological Products/administration & dosage , Drug Approval/methods , Drug Development/methods , Product Surveillance, Postmarketing/methods , United States Food and Drug Administration , Biological Products/standards , Cohort Studies , Drug Development/standards , Humans , Product Surveillance, Postmarketing/standards , Retrospective Studies , Treatment Outcome , United States/epidemiology , United States Food and Drug Administration/standards
5.
Clin Pharmacol Ther ; 108(6): 1243-1253, 2020 12.
Article in English | MEDLINE | ID: mdl-32557564

ABSTRACT

We examined the relationship of regulatory and review characteristics to postmarketing safety-related regulatory actions for 61 new therapeutic biologics (NTBs) approved between October 1, 2002 and December 31, 2014. We also compared NTBs with small-molecule new molecular entities (NMEs) on these measures. Postmarketing safety-related regulatory actions were defined as a safety-related withdrawal or a safety-related update to a safety section of the label through June 30, 2018. Four NTBs were withdrawn, two for safety reasons. At least one safety-related update was added to the labels of 54 (88.5%) NTBs. Label updates occurred throughout the follow-up period. Time to the first safety-related regulatory action was shorter for NTBs approved under accelerated approval. Safety events were more likely to occur with NTBs than with NMEs. This may be explained in part by the higher proportion of NTBs in anatomical therapeutic chemical classification categories with a higher frequency of safety-related updates. NTBs also had a shorter time to safety events than NMEs. These findings underscore the importance of continued development of the life cycle safety surveillance system for both drugs and biologics, with consideration for product type and its characteristics, including pharmacologic action.


Subjects
Biological Products/therapeutic use , Biosimilar Pharmaceuticals/therapeutic use , Product Surveillance, Postmarketing , Biological Products/adverse effects , Biosimilar Pharmaceuticals/adverse effects , Drug Approval , Drug Labeling , Humans , Patient Safety , Risk Assessment , Time Factors , United States , United States Food and Drug Administration
6.
Sleep ; 42(4)2019 04 01.
Article in English | MEDLINE | ID: mdl-30649500

ABSTRACT

STUDY OBJECTIVES: To examine the impact of untreated insomnia on health care utilization (HCU) among a nationally representative sample of Medicare beneficiaries. METHODS: Our data source was a random 5% sample of Medicare administrative data for years 2006-2013. Insomnia was operationalized as the presence of at least one claim containing an insomnia-related diagnosis in any given year based on International Classification of Diseases, Ninth Revision, Clinical Modification codes or at least one prescription fill for an insomnia-related medication in Part D prescription drug files in each year. We compared HCU in the year prior to insomnia diagnosis with HCU among non-sleep-disordered controls during the same period. RESULTS: A total of 151 668 beneficiaries were found to have insomnia. Compared to controls (n = 333 038), beneficiaries with insomnia had higher rates of HCU across all points of service. Rates of HCU were highest for inpatient care (rate ratio [RR] 1.61; 95% confidence interval [CI] 1.59, 1.64) and lowest for prescription fills (RR 1.17; 95% CI 1.16, 1.17). Similarly, compared to controls, beneficiaries with insomnia demonstrated $63,607 (95% CI $60,532, $66,685) higher all-cause costs, which were driven primarily by inpatient care ($60,900; 95% CI $56,609, $65,191). Emergency department ($1,492; 95% CI $1,387, $1,596) and prescription costs ($486; 95% CI $454, $518) were also elevated among cases relative to controls. CONCLUSIONS: In this randomly selected and nationally representative sample of older Medicare beneficiaries, individuals with untreated insomnia demonstrated increased HCU and costs across all points of service compared with non-sleep-disordered controls.
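The utilization comparisons above are expressed as rate ratios with 95% CIs. A minimal sketch of how such a ratio and a Wald-type CI can be computed under a simple Poisson assumption; the event counts and person-time below are hypothetical, chosen only to mirror the inpatient RR of 1.61:

```python
import math

def rate_ratio_ci(events1, time1, events0, time0, z=1.96):
    """Rate ratio with a Wald 95% CI on the log scale (Poisson counts)."""
    rr = (events1 / time1) / (events0 / time0)
    se_log = math.sqrt(1.0 / events1 + 1.0 / events0)  # SE of log(RR)
    lo = math.exp(math.log(rr) - z * se_log)
    hi = math.exp(math.log(rr) + z * se_log)
    return rr, lo, hi

# Hypothetical: 161 admissions per 1,000 person-years in cases
# versus 100 per 1,000 person-years in controls.
rr, lo, hi = rate_ratio_ci(161, 1000.0, 100, 1000.0)
print(round(rr, 2), round(lo, 2), round(hi, 2))
```

The paper's much tighter CI (1.59, 1.64) reflects far larger counts than this toy example.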


Subjects
Health Care Costs/statistics & numerical data , Patient Acceptance of Health Care/statistics & numerical data , Sleep Initiation and Maintenance Disorders/economics , Sleep Initiation and Maintenance Disorders/physiopathology , Aged , Costs and Cost Analysis/economics , Female , Hospitalization/economics , Humans , Male , Medicare/statistics & numerical data , Middle Aged , United States
7.
J Vasc Surg ; 66(3): 743-750, 2017 09.
Article in English | MEDLINE | ID: mdl-28259573

ABSTRACT

OBJECTIVE: Endovascular aneurysm repair (EVAR) is considered a lower risk option for treating abdominal aortic aneurysms and is of particular utility in patients with poor functional status who may be poor candidates for open repair. However, the specific contribution of preoperative functional status to EVAR outcomes remains poorly defined. We hypothesized that impaired functional status, based simply on the ability of patients to perform activities of daily living, is associated with worse outcomes after EVAR. METHODS: Patients undergoing nonemergent EVAR for abdominal aortic aneurysm between 2010 and 2014 were identified in the National Surgical Quality Improvement Program (NSQIP) database. The primary outcomes were 30-day mortality and major operative and systemic complications. Secondary outcomes were inpatient length of stay, need for reoperation, and discharge disposition. Using the NSQIP-defined preoperative functional status, patients were stratified as independent or dependent (either partial or totally dependent) and compared by univariate and multivariable analyses. RESULTS: Of 13,432 patients undergoing EVAR between 2010 and 2014, 13,043 were independent (97%) and 389 were dependent (3%) before surgery. Dependent patients were older and more frequently minorities; had higher rates of chronic pulmonary, heart, and kidney disease; and were more likely to have an American Society of Anesthesiologists score of 4 or 5. On multivariable analysis, preoperative dependent status was an independent risk factor for operative complications (odds ratio [OR], 3.1; 95% confidence interval [CI], 2.5-3.9), systemic complications (OR, 2.8; 95% CI, 2.0-3.9), and 30-day mortality (OR, 3.4; 95% CI, 2.1-5.6). Secondary outcomes were worse among dependent patients. 
CONCLUSIONS: Although EVAR is a minimally invasive procedure with substantially less physiologic stress than open aortic repair, preoperative functional status is a critical determinant of adverse outcomes after EVAR. Functional status, as measured by performance of activities of daily living, can be used as a valuable marker of increased perioperative risk and may identify patients who would benefit from preoperative conditioning and specialized perioperative care.


Subjects
Activities of Daily Living , Aortic Aneurysm, Abdominal/surgery , Blood Vessel Prosthesis Implantation/adverse effects , Endovascular Procedures/adverse effects , Health Status , Postoperative Complications/etiology , Aged , Aged, 80 and over , Aortic Aneurysm, Abdominal/diagnosis , Aortic Aneurysm, Abdominal/mortality , Blood Vessel Prosthesis Implantation/mortality , Chi-Square Distribution , Databases, Factual , Endovascular Procedures/mortality , Female , Humans , Length of Stay , Logistic Models , Male , Multivariate Analysis , Odds Ratio , Patient Discharge , Postoperative Complications/mortality , Postoperative Complications/therapy , Predictive Value of Tests , Retrospective Studies , Risk Assessment , Risk Factors , Time Factors , Treatment Outcome , United States
8.
AJR Am J Roentgenol ; 205(5): 976-84, 2015 Nov.
Article in English | MEDLINE | ID: mdl-26496544

ABSTRACT

OBJECTIVE: The purpose of this study was to define the cholangiographic patterns of ischemic cholangiopathy and clinically silent nonanastomotic biliary strictures in donation-after-cardiac-death (DCD) liver grafts in a large single-institution series. We also examined the correlation of the radiologic findings with laboratory data and clinical outcomes. MATERIALS AND METHODS: Data were collected for all DCD liver transplants at one institution from December 1998 to December 2011. Posttransplant cholangiograms were obtained during postoperative weeks 1 and 3 and when clinically indicated. Intrahepatic biliary strictures were classified by anatomic distribution and chronologic development. Radiologic findings were correlated with laboratory data and with 1-, 3-, and 5-year graft and patient survival rates. RESULTS: A total of 231 patients received DCD grafts. Cholangiograms were available for 184 of these patients. Postoperative cholangiographic findings were correlated with clinical data and divided into the following three groups: A, normal cholangiographic findings with normal laboratory values; B, radiologic abnormalities and cholangiopathy according to laboratory values; and C, radiologic abnormalities without laboratory abnormalities. Group B had four distinct abnormal cholangiographic patterns that were predictive of graft survival. Group C had mild nonprogressive multifocal stenoses and decreased graft and patient survival rates, although cholangiopathy was not detected in these patients according to laboratory data. CONCLUSION: Patterns and severity of nonanastomotic biliary abnormalities in DCD liver transplants can be defined radiologically and correlate with clinical outcomes. Postoperative cholangiography can depict the mild biliary abnormalities that occur in a subclinical manner yet cause a marked decrease in graft and patient survival rates in DCD liver transplants.


Subjects
Bile Duct Diseases/diagnostic imaging , Cholangiography , Liver Transplantation , Postoperative Complications/diagnostic imaging , Aged , Contrast Media , Death , Female , Graft Survival , Humans , Iohexol , Liver Function Tests , Male , Middle Aged , Retrospective Studies , Survival Rate
9.
Liver Transpl ; 20(6): 728-35, 2014 Jun.
Article in English | MEDLINE | ID: mdl-24648186

ABSTRACT

Limited data are available for outcomes of simultaneous liver-kidney (SLK) transplantation using donation after cardiac death (DCD) donors. The outcomes of 12 DCD-SLK transplants and 54 SLK transplants using donation after brain death (DBD) donors were retrospectively compared. The baseline demographics were similar for the DCD-SLK and DBD-SLK groups except for the higher liver donor risk index for the DCD-SLK group (1.8 ± 0.4 versus 1.3 ± 0.4, P = 0.001). The rates of surgical complications and graft rejections within 1 year were comparable for the DCD-SLK and DBD-SLK groups. Delayed renal graft function was twice as common in the DCD-SLK group. At 1 year, the serum creatinine levels and the iothalamate glomerular filtration rates were similar for the groups. The patient, liver graft, and kidney graft survival rates at 1 year were comparable for the groups (83.3%, 75.0%, and 82.5% for the DCD-SLK group and 92.4%, 92.4%, and 92.6% for the DBD-SLK group, P = 0.3 for all). The DCD-SLK group had worse patient, liver graft, and kidney graft survival at 3 years (62.5%, 62.5%, and 58.9% versus 90.5%, 90.5%, and 90.6%, P = 0.03 for all) and at 5 years (62.5%, 62.5%, and 58.9% versus 87.4%, 87.4%, and 87.7%, P < 0.05 for all). An analysis of the Organ Procurement and Transplantation Network database showed inferior 1- and 5-year patient and graft survival rates for DCD-SLK patients versus DBD-SLK patients. In conclusion, despite comparable rates of surgical and medical complications and comparable kidney function at 1 year, DCD-SLK transplantation was associated with inferior long-term survival in comparison with DBD-SLK transplantation.


Subjects
Brain Death , Heart Diseases/mortality , Kidney Transplantation , Liver Transplantation , Tissue Donors , Tissue and Organ Procurement , Adult , Aged , Databases, Factual , Delayed Graft Function/etiology , Female , Florida , Graft Rejection/etiology , Graft Survival , Humans , Kidney Transplantation/adverse effects , Kidney Transplantation/mortality , Liver Transplantation/adverse effects , Liver Transplantation/mortality , Male , Middle Aged , Retrospective Studies , Risk Factors , Time Factors , Treatment Outcome
11.
Ann Hepatol ; 11(5): 679-85, 2012.
Article in English | MEDLINE | ID: mdl-22947529

ABSTRACT

Patients with end-stage liver disease may become critically ill prior to liver transplantation (LT), requiring admission to the intensive care unit (ICU). These high-acuity patients may be thought too ill to transplant; however, LT is often the only therapeutic option. Choosing the correct liver allograft for these patients is difficult, and it is imperative that the allograft function immediately. Donation after cardiac death (DCD) donors provide an important source of livers; however, DCD graft allocation remains controversial in critically ill patients. Between January 2003 and December 2008, 1215 LTs were performed; 85 patients were in the ICU at the time of LT. Twelve patients received DCD grafts and 73 received donation after brain death (DBD) grafts. After retransplant cases and multiorgan transplants were excluded, 8 recipients of DCD grafts and 42 recipients of DBD grafts were included in this study. Post-transplant outcomes of DCD and DBD liver grafts were compared. While there were differences in graft and patient survival between the DCD and DBD groups at the 4-month and 1-year time points, the differences did not reach statistical significance. The graft and patient survival rates were similar among the groups at the 3-year time point. There is a need for other large liver transplant programs to report their outcomes using liver grafts from DCD and DBD donors. We believe that the experience of the surgical, medical, and critical care team is important for successfully using DCD grafts for critically ill patients.


Subjects
Brain Death , Donor Selection , End Stage Liver Disease/surgery , Liver Transplantation , Tissue Donors/supply & distribution , Adolescent , Adult , Chi-Square Distribution , Child , Critical Illness , Female , Graft Survival , Humans , Kaplan-Meier Estimate , Liver Transplantation/adverse effects , Liver Transplantation/mortality , Male , Middle Aged , Risk Assessment , Risk Factors , Time Factors , Treatment Outcome , Young Adult
12.
Transpl Int ; 25(8): 838-46, 2012 Aug.
Article in English | MEDLINE | ID: mdl-22703372

ABSTRACT

This study sought to determine the procurement factors that lead to the development of intrahepatic bile duct strictures (ITBS) and overall biliary complications in recipients of donation after cardiac death (DCD) liver grafts. Detailed information for different time points during procurement (withdrawal of support; SBP < 50 mmHg; oxygen saturation < 30%; mandatory wait period; asystole; incision; aortic cross-clamp) and their association with the development of ITBS and overall biliary complications were examined using logistic regression. Two hundred and fifteen liver transplants using DCD donors were performed between 1998 and 2010 at Mayo Clinic Florida. Of all the time periods during procurement, only the asystole-to-cross-clamp period was significantly different between patients with ITBS and those without (P = 0.048) and between patients with overall biliary complications and those without (P = 0.047). On multivariate analysis, only the asystole-to-cross-clamp period was a significant predictor of the development of ITBS (P = 0.015) and of overall biliary complications (P = 0.029). Hemodynamic changes in the agonal period did not emerge as risk factors. The results of the study raise the possibility of using the asystole-to-cross-clamp period in place of, or in conjunction with, donor warm ischemia time in determining the viability or quality of liver grafts.
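The analysis above fits a logistic regression of a binary complication outcome on procurement intervals. A minimal sketch of a univariate logistic fit by Newton-Raphson, on invented data where ITBS risk rises with the asystole-to-cross-clamp interval (all coefficients and counts are hypothetical, not the paper's):

```python
import math
import random

def fit_logistic(x, y, iters=25):
    """Univariate logistic regression (intercept + slope) fit by Newton-Raphson."""
    b0 = b1 = 0.0
    for _ in range(iters):
        g0 = g1 = h00 = h01 = h11 = 0.0
        for xi, yi in zip(x, y):
            z = max(-30.0, min(30.0, b0 + b1 * xi))  # clamp to avoid overflow
            p = 1.0 / (1.0 + math.exp(-z))
            w = p * (1.0 - p)
            g0 += yi - p                 # gradient of the log-likelihood
            g1 += (yi - p) * xi
            h00 += w                     # observed information (2x2)
            h01 += w * xi
            h11 += w * xi * xi
        det = h00 * h11 - h01 * h01
        b0 += (h11 * g0 - h01 * g1) / det   # Newton step: b += I^{-1} g
        b1 += (h00 * g1 - h01 * g0) / det
    return b0, b1

# Hypothetical data: complication risk grows with the asystole-to-cross-clamp
# interval in minutes; true log-odds = -3.0 + 0.15 * minutes (invented).
random.seed(0)
x = [random.uniform(5.0, 40.0) for _ in range(2000)]
y = [1 if random.random() < 1.0 / (1.0 + math.exp(3.0 - 0.15 * xi)) else 0 for xi in x]
b0, b1 = fit_logistic(x, y)
print(round(math.exp(b1), 2))  # estimated odds ratio per additional minute
```

The fitted exp(b1) is the per-minute odds ratio, the same quantity a study like this would report for the interval.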


Subjects
Bile Duct Diseases/etiology , Heart Arrest , Liver Transplantation/methods , Tissue and Organ Procurement/methods , Adolescent , Adult , Aged , Aged, 80 and over , Bile Ducts, Intrahepatic/pathology , Child , Constriction, Pathologic/etiology , Death , Female , Graft Survival , Heart Arrest/etiology , Humans , Liver Transplantation/adverse effects , Male , Middle Aged , Postoperative Complications/etiology , Retrospective Studies
13.
Transplantation ; 93(10): 1006-12, 2012 May 27.
Article in English | MEDLINE | ID: mdl-22357174

ABSTRACT

BACKGROUND: The role of sirolimus (SRL) conversion in the preservation of kidney function in liver transplant (LT) recipients with calcineurin inhibitor (CNI) nephrotoxicity is unclear. METHODS: Data on 102 LT recipients with deteriorating kidney function after CNI exposure who were later converted to SRL were retrospectively reviewed. Kidney function was assessed using serum creatinine and estimated glomerular filtration rate (eGFR) at the time of conversion and serially thereafter. The primary endpoint was stabilization or improvement of kidney function as assessed by eGFR at last recorded follow-up compared with eGFR at the time of conversion. RESULTS: After a median (interquartile range) of 3.1 (1.6-4.5) years of follow-up, serum creatinine decreased from 1.9 ± 0.8 to 1.8 ± 0.7 mg/dL (P=0.25) and eGFR increased from 40.8 ± 16.7 to 44.3 ± 20.0 mL/min (P=0.03). During the same time period, 24-hr urinary protein excretion increased from a median (interquartile range) of 72 (0-155) to 382 (169-999) mg/day (P=0.0001). Sixty-five (64%) patients achieved the primary endpoint and 37 (36%) experienced deterioration in kidney function. Independent predictors of deterioration of kidney function after SRL conversion were development of proteinuria ≥ 1000 mg/day (odds ratio [OR]: 3.3, confidence interval [CI]: 1.1-9.5, P=0.03), post-LT diabetes (OR: 4.2, CI: 1.6-11.1, P=0.004), and higher eGFR at the time of conversion (OR: 1.6, CI: 1.2-2.2, P=0.003). CONCLUSION: Improvement or stabilization of kidney function occurred in the majority of LT recipients converted to SRL for CNI nephrotoxicity. Proteinuria ≥ 1000 mg/day, post-LT diabetes, and higher baseline eGFR were independent predictors of kidney function loss after SRL conversion.
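The abstract does not state which eGFR equation was used; purely as an illustration, here is the 4-variable MDRD study equation, a common choice for adult transplant cohorts of that era (serum creatinine in mg/dL, eGFR in mL/min/1.73 m²):

```python
def egfr_mdrd4(scr_mg_dl, age_years, female=False, black=False):
    """4-variable MDRD study equation (illustrative choice only; the
    paper's actual eGFR method is not specified in the abstract)."""
    egfr = 175.0 * scr_mg_dl ** -1.154 * age_years ** -0.203
    if female:
        egfr *= 0.742   # sex coefficient
    if black:
        egfr *= 1.212   # race coefficient in the original MDRD equation
    return egfr

# A creatinine of 1.8 mg/dL in a 60-year-old woman lands in the same range
# as the cohort's mean eGFR of ~40 mL/min.
print(round(egfr_mdrd4(1.8, 60, female=True), 1))
```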


Subjects
Calcineurin Inhibitors , Immunosuppressive Agents/adverse effects , Kidney/drug effects , Liver Transplantation/adverse effects , Proteinuria/chemically induced , Sirolimus/adverse effects , Aged , Female , Glomerular Filtration Rate/drug effects , Humans , Male , Middle Aged , Retrospective Studies
14.
Liver Transpl ; 18(3): 361-9, 2012 Mar.
Article in English | MEDLINE | ID: mdl-22140001

ABSTRACT

The continuation of hemodynamic, respiratory, and metabolic support for a variable period after liver transplantation (LT) in the intensive care unit (ICU) is considered routine by many transplant programs. However, some LT recipients may be liberated from mechanical ventilation shortly after the discontinuation of anesthesia. These patients might be appropriately discharged from the postanesthesia care unit (PACU) to the surgical ward and bypass the ICU entirely. In 2002, our program started a fast-tracking program: select LT recipients are transferred from the operating room to the PACU for recovery and tracheal extubation with a subsequent transfer to the ward, and the ICU stay is completely eliminated. Between January 1, 2003 and December 31, 2007, 1045 patients underwent LT at our transplant program; 175 patients were excluded from the study. Five hundred twenty-three of the remaining 870 patients (60.10%) were fast-tracked to the surgical ward, and 347 (39.90%) were admitted to the ICU after LT. The failure rate after fast-tracking to the surgical ward was 1.90%. The groups were significantly different with respect to the recipient age, the raw Model for End-Stage Liver Disease (MELD) score at the time of LT, the recipient body mass index (BMI), the retransplantation status, the operative time, the warm ischemia time, and the intraoperative transfusion requirements. A multivariate logistic regression analysis revealed that the raw MELD score at the time of LT, the operative time, the intraoperative transfusion requirements, the recipient age, the recipient BMI, and the absence of hepatocellular cancer/cholangiocarcinoma were significant predictors of ICU admission. In conclusion, we are reporting the largest single-center experience demonstrating the feasibility of bypassing an ICU stay after LT.


Subjects
Intensive Care Units , Liver Transplantation , Adult , Aged , Feasibility Studies , Female , Graft Survival , Humans , Liver Transplantation/mortality , Logistic Models , Male , Middle Aged , Retrospective Studies , Severity of Illness Index
15.
Liver Transpl ; 18(1): 100-11, 2012 Jan.
Article in English | MEDLINE | ID: mdl-21837741

ABSTRACT

The use of donation after cardiac death (DCD) liver grafts is controversial because of the overall increased rates of graft loss and morbidity, which are mostly related to the consequences of ischemic cholangiopathy (IC). In this study, we sought to determine the factors leading to graft loss and the development of IC and to compare patient and graft survival rates for recipients of DCD liver grafts and recipients of donation after brain death (DBD) liver grafts in a large series at a single transplant center. Two hundred liver transplants with DCD donors were performed between 1998 and 2010 at Mayo Clinic Florida. Logistic regression models were used in the univariate and multivariate analyses of predictors for the development of IC. Additional analyses using Cox regression models were performed to identify predictors of graft survival and to compare outcomes for DCD and DBD graft recipients. In our series, the patient survival rates for the DCD and DBD groups at 1, 3, and 5 years were 92.6%, 85%, and 80.9% and 89.8%, 83.0%, and 76.6%, respectively (P = not significant). The graft survival rates for the DCD and DBD groups at 1, 3, and 5 years were 80.9%, 72.7%, and 68.9% and 83.3%, 75.1%, and 68.6%, respectively (P = not significant). In the DCD group, 5 patients (2.5%) had primary nonfunction, 7 patients (3.5%) had hepatic artery thrombosis, and 3 patients (1.5%) experienced hepatic necrosis. IC was diagnosed in 24 patients (12%), and 11 of these patients (5.5%) required retransplantation. In the multivariate analysis, the asystole-to-cross clamp duration [odds ratio = 1.161, 95% confidence interval (CI) = 1.021-1.321] and African American recipient race (odds ratio = 5.374, 95% CI = 1.368-21.103) were identified as significant factors for predicting the development of IC (P < 0.05). This study has established a link between the development of IC and the asystole-to-cross clamp duration.
Procurement techniques that prolong the nonperfusion period increase the risk for the development of IC in DCD liver grafts.
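The multivariate odds ratios above are exponentiated logistic-regression coefficients, so a per-minute odds ratio compounds multiplicatively on the log-odds scale. A minimal sketch (the helper name and the 10-minute interval are illustrative assumptions; only the 1.161 per-minute OR comes from the abstract):

```python
import math

def or_over_interval(or_per_unit: float, units: float) -> float:
    """Compound a per-unit odds ratio over `units` of the covariate.

    An odds ratio is exp(beta) for a one-unit increase, so over `units`
    the interval OR is exp(beta * units) = or_per_unit ** units.
    """
    return math.exp(math.log(or_per_unit) * units)

# Reported OR for asystole-to-cross-clamp duration: 1.161 per minute.
# Over a hypothetical 10-minute-longer nonperfusion period, the odds of
# ischemic cholangiopathy scale by roughly:
or_10min = or_over_interval(1.161, 10)
print(round(or_10min, 2))  # ≈ 4.45
```

This compounding is why even a modest-looking per-minute odds ratio supports the conclusion that procurement techniques prolonging the nonperfusion period raise IC risk.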


Subjects
Biliary Tract Diseases/epidemiology, Brain Death, Death, Liver Diseases/surgery, Liver Transplantation, Tissue Donors, Tissue and Organ Procurement, Adolescent, Adult, Aged, Aged 80 and over, Female, Graft Survival, Humans, Liver/pathology, Liver Transplantation/mortality, Male, Middle Aged, Multivariate Analysis, Necrosis/epidemiology, Prevalence, Retrospective Studies, Risk Factors, Survival Rate, Young Adult
16.
Ann Hepatol ; 10(4): 482-5, 2011.
Article in English | MEDLINE | ID: mdl-21911889

ABSTRACT

INTRODUCTION: Donation after cardiac death (DCD) donors provide an important source of livers that has been used to expand the donor pool. As a consequence of increased numbers of OLT, allograft failure due to early and late complications and disease recurrence are more commonly encountered. The only life saving treatment for patients with liver allograft failure is liver re-transplantation (LR). The use of DCD liver grafts for LR is controversial. MATERIAL AND METHODS: Between February 1998 and June 2008, 10 patients underwent LR with DCD allografts. Five (50%) patients had no post operative complications. The 30 day, 1 year, and 3 year patient survival are 80, 60, and 60%, respectively. When DCD grafts are used for sick patients with high MELD scores for LR, the patient and graft survivals are prohibitively low. CONCLUSION: We do not recommend utilization of DCD liver grafts for LR if a candidate recipient has moderate to high MELD score.


Subjects
Liver Transplantation, Tissue Donors/supply & distribution, Aged, Chi-Square Distribution, Female, Florida, Humans, Kaplan-Meier Estimate, Liver Transplantation/adverse effects, Liver Transplantation/mortality, Male, Middle Aged, Patient Selection, Reoperation, Retrospective Studies, Risk Assessment, Risk Factors, Time Factors, Treatment Failure, Treatment Outcome
17.
Liver Transpl ; 17(6): 641-9, 2011 Jun.
Article in English | MEDLINE | ID: mdl-21618684

ABSTRACT

Hepatitis C virus (HCV) infection is the most common indication for orthotopic liver transplantation in the United States. Although studies have addressed the use of expanded criteria donor organs in HCV(+) patients, to date the use of liver grafts from donation after cardiac death (DCD) donors in HCV(+) patients has been addressed by only a limited number of studies. This retrospective analysis was undertaken to study the outcomes of DCD liver grafts used in HCV(+) recipients. Seventy-seven HCV(+) patients who received DCD liver grafts were compared to 77 matched HCV(+) patients who received donation after brain death (DBD) liver grafts and 77 unmatched non-HCV patients who received DCD liver grafts. There were no differences in 1-, 3-, and 5-year patient or graft survival among the groups. Multivariate analysis showed that the Model for End-Stage Liver Disease score [hazard ratio (HR) = 1.037, 95% confidence interval (CI) = 1.006-1.069, P = 0.018] and posttransplant cytomegalovirus infection (HR = 3.367, 95% CI = 1.493-7.593, P = 0.003) were significant factors for graft loss. A comparison of the HCV(+) groups for fibrosis progression based on protocol biopsy samples up to 5 years post-transplant did not show any difference; in multivariate analysis, HCV genotype 1 was the only factor that affected progression to stage 2 fibrosis (genotype 1 versus non-1 genotypes: HR = 2.739, 95% CI = 1.047-7.143, P = 0.040). In conclusion, this match-controlled, retrospective analysis demonstrates that DCD liver graft utilization does not cause untoward effects on disease progression or patient and graft survival in comparison with DBD liver grafts in HCV(+) patients.
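The hazard ratios above come from a Cox proportional-hazards model, h(t|x) = h0(t) · exp(beta · x), so the relative hazard between two patient profiles is exp(beta · (x_a − x_b)) and the baseline hazard cancels. A sketch using the reported coefficients (the two patient profiles are hypothetical; only the HRs 1.037 per MELD point and 3.367 for CMV infection come from the abstract):

```python
import math

# Cox coefficients recovered from the reported hazard ratios: beta = ln(HR).
# MELD is per-point; CMV is a 0/1 indicator for post-transplant infection.
beta = {"meld": math.log(1.037), "cmv": math.log(3.367)}

def relative_hazard(profile_a, profile_b):
    """Hazard of profile_a relative to profile_b under proportional hazards.

    exp(beta . (a - b)); the unspecified baseline hazard h0(t) cancels out.
    """
    lp = sum(beta[k] * (profile_a[k] - profile_b[k]) for k in beta)
    return math.exp(lp)

# Hypothetical comparison: MELD 30 with CMV infection vs MELD 20 without.
a = {"meld": 30, "cmv": 1}
b = {"meld": 20, "cmv": 0}
print(round(relative_hazard(a, b), 2))  # ≈ 4.84
```

A per-point HR of 1.037 looks small, but compounded over a 10-point MELD difference and combined with CMV infection it implies a several-fold higher hazard of graft loss, consistent with the factors the multivariate analysis flags.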


Subjects
Death, Hepacivirus/isolation & purification, Hepatitis C/surgery, Liver Transplantation/statistics & numerical data, Liver/virology, Tissue Donors, Tissue and Organ Procurement/methods, Adolescent, Adult, Aged, Biopsy, Brain Death, Female, Graft Rejection/epidemiology, Hepatitis C/mortality, Humans, Liver/pathology, Liver/surgery, Liver Cirrhosis/epidemiology, Liver Diseases/mortality, Liver Diseases/surgery, Male, Middle Aged, Prevalence, Recurrence, Retrospective Studies, Survival Rate, Treatment Outcome, Young Adult