Results 1-12 of 12
1.
Curr Opin Organ Transplant; 25(2): 115-121, 2020 Apr.
Article in English | MEDLINE | ID: mdl-32073498

ABSTRACT

PURPOSE OF REVIEW: The Scientific Registry of Transplant Recipients (SRTR) supports the Organ Procurement and Transplantation Network (OPTN) efforts to better align liver allocation with the Final Rule. Here, we review recent literature related to removing place of residence or listing from organ allocation policy and describe how SRTR may help advance the OPTN policy development process. RECENT FINDINGS: In December 2018, the OPTN Board of Directors endorsed the recommendation from OPTN's ad hoc Committee on Geography to develop organ-allocation policies that do not rely on geographic boundaries, called 'continuous distribution.' Many objections to wider organ distribution stem from efforts to address inequities in allocation for populations within geographic regions rather than for individual patients. A continuous distribution system could equitably address the needs of individual patients, merging ethical-medical urgency with geographic feasibility. SUMMARY: The effort to remove geographic boundaries from organ distribution and allocation has been controversial. An integrated continuous distribution system may help focus the debate on priorities that matter most to patients.


Subjects
Tissue and Organ Procurement/organization & administration, Humans, Registries, Tissue Donors, Waiting Lists
2.
Am J Transplant; 18(11): 2635-2640, 2018 Nov.
Article in English | MEDLINE | ID: mdl-30203912

ABSTRACT

The Final Rule mandates that organ allocation not be based on the transplant candidate's place of residence or listing, except as required by sound medical judgment and best use of donated organs, to avoid wasting organs and futile transplants, and to promote access and efficiency. Current Organ Procurement and Transplantation Network (OPTN) policies use donation service areas and OPTN regions to distribute and allocate organs for transplant. These policies have recently been called into question as not meeting the requirements of the Final Rule. Therefore, we propose using borderless allocation scores that combine medical priority scores with geographic feasibility scores. Medical priority scores are currently used in OPTN allocation policy, for example, the Model for End-Stage Liver Disease and the lung allocation score. Geographic feasibility scores can be developed to account for the effects of ischemia due to travel times, donor characteristics that modify the feasibility of travel because of their effect on organ outcomes, and the costs of shipping organs over long distances. A borderless distribution and allocation system could address the goals of equity and utility while fulfilling the mandates of the Final Rule and providing optimal use of a scarce resource.
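
The scoring idea described above lends itself to a short illustration. The sketch below blends a normalized medical priority score with a travel-time-based feasibility score; the weights, the 12-hour ischemia limit, and the linear penalty are hypothetical assumptions for illustration, not the formula proposed by the authors.

    # Illustrative sketch of a "borderless" allocation score that blends a
    # medical priority score with a geographic feasibility score.
    # The weighting, the 12-hour limit, and the linear penalty are assumptions.

    def geographic_feasibility(travel_hours, ischemia_limit_hours=12.0):
        """Scale from 1 (no travel) toward 0 as travel time nears an assumed limit."""
        return max(0.0, 1.0 - travel_hours / ischemia_limit_hours)

    def borderless_score(meld, travel_hours, medical_weight=0.8):
        """Blend normalized MELD (6-40) with geographic feasibility."""
        medical = (meld - 6) / (40 - 6)
        return medical_weight * medical + (1 - medical_weight) * geographic_feasibility(travel_hours)

    # Two candidates for the same donor liver: nearby with MELD 28, distant with MELD 31.
    print(round(borderless_score(28, travel_hours=1.0), 3))
    print(round(borderless_score(31, travel_hours=8.0), 3))

In a real policy the weights would be set through the OPTN process; the point is only that a single continuous score can trade off medical urgency against transport feasibility without any district boundary.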


Subjects
Organ Transplantation/legislation & jurisprudence, Resource Allocation/legislation & jurisprudence, Resource Allocation/standards, Tissue Donors/legislation & jurisprudence, Tissue and Organ Procurement/legislation & jurisprudence, Waiting Lists, Geography, Humans
3.
Liver Transpl; 24(4): 478-487, 2018 Apr.
Article in English | MEDLINE | ID: mdl-29316203

ABSTRACT

Offer acceptance practices may cause geographic variability in allocation Model for End-Stage Liver Disease (aMELD) score at transplant and could magnify the effect of donor supply and demand on aMELD variability. To evaluate these issues, offer acceptance practices of liver transplant programs and donation service areas (DSAs) were estimated using offers of livers from donors recovered between January 1, 2016, and December 31, 2016. Offer acceptance practices were compared with liver yield, local placement of transplanted livers, donor supply and demand, and aMELD at transplant. Offer acceptance was associated with liver yield (odds ratio, 1.32; P < 0.001), local placement of transplanted livers (odds ratio, 1.34; P < 0.001), and aMELD at transplant (average aMELD difference, -1.62; P < 0.001). However, the ratio of donated livers to listed candidates in a DSA (i.e., donor-to-candidate ratio) was associated with median aMELD at transplant (r = -0.45; P < 0.001), but not with offer acceptance (r = 0.09; P = 0.50). Additionally, the association between DSA-level donor-to-candidate ratios and aMELD at transplant did not change after adjustment for offer acceptance. The average squared difference in median aMELD at transplant across DSAs was 24.6; removing the effect of donor-to-candidate ratios reduced the average squared difference more than removing the effect of program-level offer acceptance (33% and 15% reductions, respectively). Offer acceptance practices and donor-to-candidate ratios independently contributed to geographic variability in aMELD at transplant. Thus, neither offer acceptance nor donor-to-candidate ratios can explain all of the geographic variability in aMELD at transplant.
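
The variability metric quoted above (an average squared difference in median aMELD of 24.6 across DSAs) can be read as an average of squared pairwise differences. A minimal sketch of that calculation, using invented DSA medians rather than the study's data:

    # Average squared difference in median aMELD across DSAs, interpreted as
    # the mean of squared pairwise differences. The medians are invented.
    from itertools import combinations
    from statistics import mean

    def avg_squared_difference(medians):
        return mean((a - b) ** 2 for a, b in combinations(medians, 2))

    baseline = avg_squared_difference([24, 29, 33, 27])
    adjusted = avg_squared_difference([26, 28, 30, 28])  # after removing some effect
    print(round(baseline, 1), f"{100 * (baseline - adjusted) / baseline:.0f}% reduction")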


Subjects
End Stage Liver Disease/surgery, Liver Transplantation/statistics & numerical data, Patient Acceptance of Health Care/statistics & numerical data, Registries/statistics & numerical data, Tissue and Organ Procurement/statistics & numerical data, Adolescent, Adult, Age Factors, Aged, End Stage Liver Disease/pathology, Female, Humans, Male, Middle Aged, Odds Ratio, Severity of Illness Index, Time Factors, Tissue and Organ Procurement/organization & administration, United States, Waiting Lists, Young Adult
4.
J Clin Monit Comput; 31(3): 561-569, 2017 Jun.
Article in English | MEDLINE | ID: mdl-27142098

ABSTRACT

Technology advances make it possible to consider continuous acoustic respiratory rate monitoring as an integral component of physiologic surveillance systems. This study explores technical and logistical aspects of augmenting pulse oximetry-based patient surveillance systems with continuous respiratory rate monitoring and offers some insight into the resulting impact on patient deterioration detection. Acoustic respiratory rate sensors were introduced to a general care pulse oximetry-based surveillance system with respiratory rate alarms deactivated. Simulation was used after 4324 patient days to determine appropriate alarm thresholds for respiratory rate, which were then activated. Data were collected for an additional 4382 patient days. Physiologic parameters, alarm data, sensor utilization, and patient/staff feedback were collected throughout the study and analyzed. No notable technical or workflow issues were observed. Sensor utilization was 57%, with patient refusal the leading reason for nonuse (22.7%). With respiratory rate alarm thresholds set to 6 and 40 breaths/min, the majority of nurse pager clinical notifications were triggered by low oxygen saturation values (43%), followed by low respiratory rate values (21%) and low pulse rate values (13%). Mean respiratory rate was 16.6 ± 3.8 breaths/min. The vast majority (82%) of low oxygen saturation states coincided with normal respiration rates of 12-20 breaths/min. Continuous respiratory rate monitoring can be successfully added to a pulse oximetry-based surveillance system without significant technical, logistical, or workflow issues and is moderately well tolerated by patients. Respiratory rate sensor alarms did not significantly impact overall system alarm burden. The respiratory rate and oxygen saturation distributions suggest that adding continuous respiratory rate monitoring to a pulse oximetry-based surveillance system may not significantly improve patient deterioration detection.
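
The alarm logic is simple to sketch. The respiratory rate limits (6 and 40 breaths/min) come from the study; the oxygen saturation and pulse rate limits below are assumptions added for illustration.

    # Classify one vital-sign sample against alarm limits.
    # Respiratory-rate limits are from the study; the others are assumed.
    ALARM_LIMITS = {
        "resp_rate": (6, 40),   # breaths/min (study thresholds)
        "spo2": (80, 100),      # % (assumed lower limit)
        "pulse": (50, 140),     # beats/min (assumed limits)
    }

    def notifications(sample):
        alerts = []
        for signal, (low, high) in ALARM_LIMITS.items():
            value = sample.get(signal)
            if value is None:
                continue
            if value < low:
                alerts.append(f"low {signal}")
            elif value > high:
                alerts.append(f"high {signal}")
        return alerts

    print(notifications({"resp_rate": 5, "spo2": 78, "pulse": 72}))  # ['low resp_rate', 'low spo2']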


Subjects
Auscultation/methods, Computer-Assisted Diagnosis/statistics & numerical data, Oximetry/statistics & numerical data, Respiratory Insufficiency/diagnosis, Respiratory Insufficiency/epidemiology, Respiratory Sounds, Sound Spectrography/statistics & numerical data, Female, Humans, Longitudinal Studies, Male, Physiologic Monitoring/statistics & numerical data, New Hampshire/epidemiology, Patient Acceptance of Health Care/statistics & numerical data, Physical Examination/statistics & numerical data, Prevalence, Reproducibility of Results, Respiratory Insufficiency/prevention & control, Respiratory Rate, Retrospective Studies, Sensitivity and Specificity, Utilization Review
5.
Liver Transpl; 21(8): 1031-9, 2015 Aug.
Article in English | MEDLINE | ID: mdl-25990089

ABSTRACT

Concerns have been raised that optimized redistricting of liver allocation areas might have the unintended result of shifting livers from better-performing to poorer-performing organ procurement organizations (OPOs). We used liver simulated allocation modeling to simulate a 5-year period of liver sharing within either 4 or 8 optimized districts. We investigated whether each OPO's net liver import under redistricting would be correlated with 2 OPO performance metrics (observed to expected liver yield and liver donor conversion ratio), along with 2 other potential correlates (eligible deaths and incident listings above a Model for End-Stage Liver Disease score of 15). We found no evidence that livers would flow from better-performing OPOs to poorer-performing OPOs in either redistricting scenario. Instead, under these optimized redistricting plans, our simulations suggest that livers would flow from OPOs with more-than-expected eligible deaths toward those with fewer-than-expected eligible deaths and that livers would flow from OPOs with fewer-than-expected incident listings to those with more-than-expected incident listings; the latter is a pattern that is already established in the current allocation system. Redistricting liver distribution to reduce geographic inequity is expected to align liver allocation across the country with the distribution of supply and demand rather than transferring livers from better-performing OPOs to poorer-performing OPOs.
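
The correlation check described above is straightforward to reproduce in outline. The sketch below correlates a hypothetical net liver import for each OPO with a hypothetical observed-to-expected liver yield; the values are invented and imply nothing about the study's result.

    # Correlate simulated net liver import with an OPO performance metric.
    # Values are invented; requires Python 3.10+ for statistics.correlation.
    from statistics import correlation

    net_import = [12, -5, 3, -9, 7, -2]   # livers gained (+) or lost (-) per OPO
    observed_to_expected_yield = [0.95, 1.10, 1.02, 1.08, 0.97, 1.01]

    print(round(correlation(net_import, observed_to_expected_yield), 2))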


Subjects
Catchment Area (Health), Health Care Rationing, Health Services Needs and Demand, Healthcare Disparities, Liver Transplantation/methods, Health Care Process Assessment, Tissue Donors/supply & distribution, Tissue and Organ Procurement/methods, Computer Simulation, Health Care Delivery, Humans, Liver Transplantation/adverse effects, Liver Transplantation/mortality, Theoretical Models, Needs Assessment, Time Factors, Treatment Outcome, Waiting Lists
6.
Anesth Analg; 118(2): 326-331, 2014 Feb.
Article in English | MEDLINE | ID: mdl-24361847

ABSTRACT

BACKGROUND: The manual collection and charting of traditional vital signs data in inpatient populations have been shown to be inaccurate when compared with true physiologic values. This issue has not been examined with respect to oxygen saturation data despite the increased use of this measurement in systems designed to assess the risk of patient deterioration. Of particular note is the lack of available data examining the accuracy of oxygen saturation charting in a particularly vulnerable group of patients who have prolonged oxygen desaturations (mean SpO2 <90% over at least 15 minutes). In addition, no data are currently available that investigate the often suspected "wake up" effect, resulting from a nurse entering a patient's room to obtain vital signs. METHODS: In this study, we compared oxygen saturation data recorded manually with data collected by an automated continuous monitoring system in 16 inpatients considered to be at high risk for deterioration (average SpO2 values <90% collected by the automated system in a 15-minute interval before a manual charting event). Data were sampled from the automatic collection system from 2 periods: over a 15-minute period that ended 5 minutes before the time of the manual data collection and charting, and over a 5-minute range before and after the time of the manual data collection and charting. Average saturations from prolonged baseline desaturations (15-minute period) were compared with both the manual and automated data sampled at the time of the nurse's visit to analyze for systematic change and to investigate the presence of an arousal effect. RESULTS: The manually charted data were higher than those recorded by the automated system. Manually recorded data were on average 6.5% (confidence interval, 4.0%-9.0%) higher in oxygen saturation. No significant arousal effect resulting from the nurse's visit to the patient's room was detected. CONCLUSIONS: In a cohort of patients with prolonged desaturations, manual recordings of SpO2 did not reflect physiologic patient state when compared with continuous automated sampling. Currently, early warning scores depend on manual vital sign recordings in many settings; the study data suggest that SpO2 ought to be added to the list of vital sign values that have been shown to be recorded inaccurately.
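
The comparison described above is a paired one: each manually charted SpO2 value against the automated average from the preceding 15-minute window. A minimal sketch with invented paired values (the 6.5% figure and its confidence interval are the study's, not reproduced here):

    # Paired comparison of manually charted SpO2 against the automated
    # 15-minute average; the values below are invented for illustration.
    from math import sqrt
    from statistics import mean, stdev

    manual = [94, 93, 95, 92, 96, 93]     # charted SpO2 (%)
    automated = [87, 88, 89, 86, 90, 88]  # automated 15-min average (%)

    diffs = [m - a for m, a in zip(manual, automated)]
    d_bar = mean(diffs)
    half_width = 1.96 * stdev(diffs) / sqrt(len(diffs))  # rough normal-approximation interval
    print(f"mean difference {d_bar:.1f}% (approx. {d_bar - half_width:.1f}% to {d_bar + half_width:.1f}%)")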


Subjects
Computerized Medical Record Systems/standards, Physiologic Monitoring/methods, Oximetry/methods, Oxygen/metabolism, Automation, Cohort Studies, Data Collection, Hospital Information Systems, Humans, Inpatients, Medical Records, Reproducibility of Results, Risk, User-Computer Interface, Vital Signs
7.
Jt Comm J Qual Patient Saf; 38(9): 428-31, 385, 2012 Sep.
Article in English | MEDLINE | ID: mdl-23002497

ABSTRACT

A physiologic database infrastructure, composed of a patient monitoring system and a data processing and storage system, enables the detection of deterioration in noncritical patients, thereby helping to prevent failure-to-rescue events.


Subjects
Factual Databases, Physiologic Monitoring/methods, Vital Signs/physiology, Humans, Physiologic Monitoring/instrumentation, Risk
8.
Anesthesiology; 115(2): 421-31, 2011 Aug.
Article in English | MEDLINE | ID: mdl-21587063

ABSTRACT

Failure to rescue, defined as hospital deaths after adverse events, is an established measure of patient safety and hospital quality. Until recently, approaches to failure to rescue have focused primarily on improving the response to a recognized patient crisis, with limited success in terms of patient outcomes. Less attention has been paid to improving detection of the crisis. A wealth of retrospective data exists to support the observation that adverse events in general ward patients are preceded by a significant period (on the order of hours) of physiologic deterioration. Thus, the lack of early recognition of physiologic decline plays a major role in the failure-to-rescue problem.


Subjects
Hospital Mortality, Physiologic Monitoring, Aged, Critical Care, Hospital Emergency Service, Female, Humans, Intensive Care Units, Male, Retrospective Studies, Risk
9.
Anesthesiology; 112(2): 282-7, 2010 Feb.
Article in English | MEDLINE | ID: mdl-20098128

ABSTRACT

BACKGROUND: Some preventable deaths in hospitalized patients are due to unrecognized deterioration. No published studies have instituted routine postoperative patient monitoring and analyzed its impact on patient outcomes. METHODS: The authors implemented a patient surveillance system based on pulse oximetry with nursing notification of violation of alarm limits via wireless pager. Data were collected for 11 months before and 10 months after implementation of the system. Concurrently, matching outcome data were collected on two other postoperative units. The primary outcomes were rescue events and transfers to the intensive care unit, compared before and after the monitoring change. RESULTS: Rescue events decreased from 3.4 (1.89-4.85) to 1.2 (0.53-1.88) per 1,000 patient discharges, and intensive care unit transfers decreased from 5.6 (3.7-7.4) to 2.9 (1.4-4.3) per 1,000 patient days, whereas the comparison units had no change. CONCLUSIONS: Patient surveillance monitoring results in a reduced need for rescues and intensive care unit transfers.
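
The rates above are simple incidence densities. A worked sketch follows; the counts and denominators are hypothetical, and only the formula is the point.

    # Events per 1,000 units of exposure (discharges or patient days).
    # Counts and denominators are hypothetical.
    def rate_per_1000(events, denominator):
        return 1000 * events / denominator

    print(rate_per_1000(17, 5000))  # 3.4 rescue events per 1,000 discharges
    print(rate_per_1000(6, 5000))   # 1.2 rescue events per 1,000 discharges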


Subjects
Emergency Medical Services/methods, Intensive Care Units, Physiologic Monitoring/methods, Oximetry/methods, Transportation of Patients, Aged, Anesthesia, Clinical Alarms, Cohort Studies, Female, Heart Arrest/diagnosis, Hemodynamics/physiology, Humans, Male, Middle Aged, Orthopedic Procedures, Postoperative Period, Treatment Outcome
10.
Transplantation; 102(5): 769-774, 2018 May.
Article in English | MEDLINE | ID: mdl-29309379

ABSTRACT

BACKGROUND: The liver simulated allocation model (LSAM) can be used to study likely effects of liver transplant allocation policy changes on organ offers, acceptance, waitlist survival, and posttransplant survival. Implementation of Share 35 in June 2013 allowed for testing how well LSAM predicted actual changes. METHODS: LSAM projections for 1 year of liver transplants before and after the Share 35 policy change were compared with observed data during the same period. Numbers of organs recovered, organ sharing, transplant rates, and waitlist mortality rates (per 100 waitlist years) were evaluated by LSAM and compared with observed data. RESULTS: Candidate, recipient, and donor characteristics in the LSAM cohorts were similar to those in the observed population before and after Share 35. LSAM correctly predicted more accepted organs and fewer discarded organs with Share 35. LSAM also predicted increased regional and national sharing, consistent with observed data, although the magnitude was overestimated. Transplant rates were correctly projected to increase and waitlist death rates to decrease. CONCLUSIONS: Although the absolute number of transplants was underestimated and waitlist deaths overestimated, the direction of change was consistent with observed data. LSAM correctly predicted change in discarded organs, regional and national sharing, waitlist mortality, and transplants after Share 35 implementation.


Subjects
Computer Simulation, Decision Support Techniques, Liver Transplantation/methods, Health Care Process Assessment/methods, Tissue Donors/supply & distribution, Tissue and Organ Procurement/methods, Waiting Lists, Female, Humans, Liver Transplantation/adverse effects, Liver Transplantation/legislation & jurisprudence, Liver Transplantation/mortality, Male, Middle Aged, Policy Making, Postoperative Complications/etiology, Health Care Process Assessment/legislation & jurisprudence, Risk Factors, Time Factors, Tissue Donors/legislation & jurisprudence, Tissue and Organ Procurement/legislation & jurisprudence, Treatment Outcome, United States, Waiting Lists/mortality
11.
Clin J Am Soc Nephrol; 11(3): 505-11, 2016 Mar 07.
Article in English | MEDLINE | ID: mdl-26839235

ABSTRACT

BACKGROUND AND OBJECTIVES: In December of 2014, the Organ Procurement and Transplantation Network implemented a new Kidney Allocation System (KAS) for deceased donor transplant, with increased priority for highly sensitized candidates (calculated panel-reactive antibody [cPRA] >99%). We used a modified version of the new KAS to address issues of access and equity for these candidates. DESIGN, SETTING, PARTICIPANTS, & MEASUREMENTS: In a simulation, 10,988 deceased donor kidneys transplanted into waitlisted recipients in 2010 were instead allocated to candidates with cPRA ≥80% (n=18,004). Each candidate's unacceptable donor HLA antigens had been entered into the allocation system by the transplant center. In simulated match runs, kidneys were allocated sequentially to adult ABO-identical or ABO-permissible candidates with cPRA 100%, 99%, 98%, and so on down to 80%. Allocations were restricted to donor/recipient pairs with negative virtual crossmatches. RESULTS: The simulation indicated that 2111 of 10,988 kidneys (19.2%) would have been allocated to patients with cPRA 100% versus 74 of 10,988 (0.7%) that were actually transplanted. Of cPRA 100% candidates, 74% were predicted to be compatible with an average of six deceased donors; the remaining 26% appeared to be incompatible with every deceased donor organ that entered the system. Of kidneys actually allocated to cPRA 100% candidates in 2010, 66% (49 of 74) were six-antigen HLA matched/zero-antigen mismatched (HLA-A, -B, and -DR) with their recipients versus only 11% (237 of 2111) in the simulation. The simulation predicted that 10,356 of 14,433 (72%) candidates with cPRA 90%-100% could be allocated an organ, compared with 7.3% who actually underwent transplant. CONCLUSIONS: Data in this simulation are consistent with early results of the new KAS; specifically, nearly 20% of deceased donor kidneys were (virtually) compatible with cPRA 100% candidates. Although most of these candidates were predicted to be compatible with multiple donors, approximately one-quarter are unlikely to receive a single offer.
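
The simulated match run described above can be sketched as a filter plus a sort: offer each kidney to ABO-compatible candidates in descending cPRA order, keeping only those with a negative virtual crossmatch (none of the candidate's unacceptable antigens present in the donor). The sketch below is a simplification with invented data, not the actual KAS point system.

    # Simplified simulated match run: ABO-identical candidates with cPRA >= 80,
    # negative virtual crossmatch, offered in descending cPRA order.
    # Donor and candidate data are invented; ABO-permissible rules are omitted.
    donor = {"abo": "O", "hla": {"A1", "A2", "B8", "B44", "DR4", "DR15"}}

    candidates = [
        {"id": "C1", "abo": "O", "cpra": 100, "unacceptable": {"B8", "DR4"}},
        {"id": "C2", "abo": "O", "cpra": 99,  "unacceptable": {"A24"}},
        {"id": "C3", "abo": "A", "cpra": 100, "unacceptable": set()},
        {"id": "C4", "abo": "O", "cpra": 85,  "unacceptable": {"B7"}},
    ]

    def virtual_crossmatch_negative(donor, candidate):
        return not (donor["hla"] & candidate["unacceptable"])

    def match_run(donor, candidates, min_cpra=80):
        eligible = [c for c in candidates
                    if c["abo"] == donor["abo"]
                    and c["cpra"] >= min_cpra
                    and virtual_crossmatch_negative(donor, c)]
        return sorted(eligible, key=lambda c: -c["cpra"])

    print([c["id"] for c in match_run(donor, candidates)])  # ['C2', 'C4']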


Subjects
Donor Selection, HLA Antigens/immunology, Histocompatibility, Isoantibodies/blood, Kidney Transplantation/methods, Tissue Donors/supply & distribution, Biomarkers/blood, Computer Simulation, Health Services Accessibility, Histocompatibility Testing, Humans, Predictive Value of Tests, Registries, Risk Assessment, Risk Factors, Tissue and Organ Procurement, United States, Waiting Lists
12.
Transplantation; 100(10): 2153-9, 2016 Oct.
Article in English | MEDLINE | ID: mdl-27490411

ABSTRACT

BACKGROUND: The probability of liver transplant and death on the waiting list in the United States varies greatly by donation service area (DSA) due to geographic differences in availability of organs and allocation of priority points, making it difficult for providers to predict likely outcomes after listing. We aimed to develop an online calculator to report outcomes by region and patient characteristics. METHODS: Using the Scientific Registry of Transplant Recipients database, we included all prevalent US adults aged 18 years or older waitlisted for liver transplant, examined on 24 days at least 30 days apart over a 2-year period. Outcomes were determined at intervals of 30 to 365 days. Outcomes are reported by transplant program, DSA, region, and the nation for comparison, and can be shown by allocation or by laboratory Model for End-Stage Liver Disease (MELD) score (6-14, 15-24, 25-29, 30-34, 35-40), age, and blood type. RESULTS: Outcomes varied greatly by DSA; for candidates with allocation MELD 25-29, the 25th and 75th percentiles of liver transplant probability were 30% and 67%, respectively, at 90 days. Corresponding percentiles for death or becoming too sick to undergo transplant were 5% and 9%. Outcomes also varied greatly for candidates with and without MELD exception points. CONCLUSIONS: The waitlist outcome calculator highlights ongoing disparities in access to liver transplant and may assist providers in understanding and counseling their patients about likely outcomes on the waiting list.
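
The geographic spread reported above (25th and 75th percentiles of 90-day transplant probability across DSAs within a MELD band) can be summarized with standard quartiles. The probabilities below are invented, not the calculator's outputs.

    # Quartiles of 90-day transplant probability across DSAs for one
    # allocation-MELD band; the probabilities are invented.
    from statistics import quantiles

    prob_by_dsa = [0.22, 0.30, 0.34, 0.41, 0.48, 0.55, 0.62, 0.67, 0.71]
    q1, median, q3 = quantiles(prob_by_dsa, n=4)
    print(f"25th percentile {q1:.0%}, median {median:.0%}, 75th percentile {q3:.0%}")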


Subjects
End Stage Liver Disease/surgery, Liver Transplantation, Tissue and Organ Procurement, Waiting Lists, Adult, End Stage Liver Disease/mortality, Humans, Liver Transplantation/mortality, Severity of Illness Index, Tissue Donors, United States