ABSTRACT
BACKGROUND: Bloodborne pathogens pose a major safety risk in transfusion medicine. To mitigate the risk of bacterial contamination in platelet units, the FDA issues updated guidance on bacterial risk control strategies (BRCS). This analysis presents results of a budget impact model updated to include 5- and 7-day pathogen-reduced (PR) and large volume delayed sampling (LVDS) BRCS. STUDY DESIGN AND METHODS: Model base-case parameter inputs were based on the scientific literature, a survey distributed to 27 US hospitals, and transfusion experts' opinions. The outputs include hospital budget and shelf-life impacts for 5- and 7-day LVDS and 5- and 7-day PR units under three scenarios: (1) 100% LVDS, (2) 100% PR, and (3) a mix of 50% LVDS and 50% PR. RESULTS: Total annual costs from the hospital perspective were highest for 100% LVDS platelets (US$2.325M) and lowest for 100% PR-7 units (US$2.170M). The net budget impact, after offsetting annual costs by outpatient reimbursements, was 5.5% lower for 5-day PR platelets than for 5-day LVDS (US$1.663M vs. US$1.760M). A mix of 7-day LVDS and 5-day PR platelets had net annual costs that were 1.3% lower than for 100% 7-day LVDS, but 1.3% higher than for 100% 5-day PR. 7-day PR platelets had the longest shelf life (4.63 days), while 5-day LVDS had the shortest (2.00 days). DISCUSSION: The model identifies opportunities to minimize transfusion center costs for 5- and 7-day platelets. Budget impact models such as this are important for understanding the financial implications of evolving FDA guidance and new platelet technologies.
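The net-budget-impact arithmetic behind the figures above can be summarized in a minimal sketch. All numbers below are taken from the abstract; the variable names are illustrative only and are not the model's actual parameters.

```python
# Minimal sketch of the net-budget-impact arithmetic described in the abstract above.
# Figures come from the abstract; variable names are illustrative, not the model's own.

total_annual_cost = {        # total annual hospital cost, US$ millions
    "100% LVDS": 2.325,      # highest-cost scenario reported
    "100% PR-7": 2.170,      # lowest-cost scenario reported
}

# Net budget impact = total annual cost - outpatient reimbursement offset.
# For 5-day units the abstract reports the resulting net figures directly:
net_annual_cost = {"100% PR-5": 1.663, "100% LVDS-5": 1.760}   # US$ millions

saving = 1 - net_annual_cost["100% PR-5"] / net_annual_cost["100% LVDS-5"]
print(f"Total cost range across scenarios: {min(total_annual_cost.values())}-{max(total_annual_cost.values())} US$M")
print(f"PR-5 vs. LVDS-5 net saving: {saving:.1%}")   # ~5.5%, as reported
```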
Subjects
Blood Platelets, Platelet Transfusion, Blood Platelets/microbiology, Blood Transfusion, Costs and Cost Analysis, Humans, Platelet Transfusion/methods, Specimen Handling
ABSTRACT
BACKGROUND: Technologies used in the processing of whole blood and blood component products, including pathogen reduction, are continuously being adopted into blood transfusion workflows to improve process efficiencies. However, the economic implications of these technologies are not well understood. With the advent of these new technologies and regulatory guidance on bacterial risk-control strategies, an updated systematic literature review on this topic was warranted. OBJECTIVE: The objective of this systematic literature review was to summarize the current literature on economic analyses of pathogen-reduction technologies (PRTs). METHODS: A systematic literature review was conducted using the PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) guidelines to identify newly published articles in PubMed, MEDLINE Complete, and EconLit from 1 January 2000 to 17 July 2019 related to economic evaluations of PRTs. Only full-text studies in humans published in English were included in the review. Both budget-impact and cost-effectiveness studies were included; common outcomes included cost, quality-adjusted life-years (QALYs), and incremental cost-effectiveness ratios (ICERs). RESULTS: The initial searches identified 433 original abstracts, of which 16 articles were included in the final data extraction and reporting. Seven articles presented cost-effectiveness analyses and nine assessed budget impact. The introduction of PRT increased overall costs, and ICER values ranged widely across cost-effectiveness studies, from below $US150,000/QALY to upwards of $US20,000,000/QALY. This wide range of results was due to a multitude of factors, including comparator selection, target patient population, and the scenario analyses included. CONCLUSIONS: Overall, the results of economic evaluations of bacterial risk-control strategies, regardless of mechanism, were highly dependent on the screening protocols already in place. The optimization of blood transfusion safety may not result in decisions made at the willingness-to-pay thresholds commonly seen in pharmaceutical evaluations. Given the critical public health role of blood products, and the potential safety benefits introduced by these advancements, it is important to continue building this body of evidence with greater transparency and data source heterogeneity. This updated literature review provides global context for making local decisions about the coverage of new and emerging bacterial risk-control strategies.
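The review compares studies by cost, QALYs, and ICERs. As a reminder of how an ICER is formed, here is a minimal sketch; the input numbers are invented placeholders, not values from any reviewed study.

```python
# Illustrative incremental cost-effectiveness ratio (ICER) calculation.
# The example inputs are placeholders, not figures from the reviewed literature.

def icer(cost_new: float, cost_old: float, qaly_new: float, qaly_old: float) -> float:
    """ICER = incremental cost / incremental QALYs gained."""
    return (cost_new - cost_old) / (qaly_new - qaly_old)

# e.g., a strategy adding $1.2M in costs while gaining 8 QALYs over the comparator:
print(icer(cost_new=1_200_000, cost_old=0, qaly_new=8, qaly_old=0))  # 150000.0 $/QALY
```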
Subjects
Blood Transfusion, Cost-Benefit Analysis, Humans, Quality-Adjusted Life Years
ABSTRACT
Platelet transfusions carry greater risks of infection, sepsis, and death than any other blood product, owing primarily to bacterial contamination. Many patients may be at particular risk, including critically ill patients in the intensive care unit. This narrative review provides an overview of the problem and an update on strategies for the prevention, detection, and reduction/inactivation of bacterial contaminants in platelets. Bacterial contamination and septic transfusion reactions are major sources of morbidity and mortality. An estimated 1 in 1000 to 1 in 2500 platelet units are bacterially contaminated. The skin bacterial microflora is a primary source of contamination; enteric contaminants are rare but may be clinically devastating, and platelet storage conditions can support bacterial growth. Donor selection, blood diversion, and hemovigilance are effective but have limitations. Biofilm-producing species can adhere to biological and non-biological surfaces and evade detection. Primary bacterial culture testing of apheresis platelets is in routine use in the US. Pathogen reduction/inactivation technologies compatible with platelets use ultraviolet light-based mechanisms to target nucleic acids of contaminating bacteria and other pathogens. These methods have demonstrated safety and efficacy and represent a proactive approach for inactivating contaminants before transfusion to prevent transfusion-transmitted infections. One system, which combines ultraviolet A and amotosalen for broad-spectrum pathogen inactivation, is approved in both the US and Europe. Current US Food and Drug Administration recommendations advocate enhanced bacterial testing or pathogen reduction/inactivation strategies (or both) to further improve platelet safety. Risks of bacterial contamination of platelets and transfusion-transmitted infections have been significantly mitigated, but not eliminated, by improvements in prevention and detection strategies. Regulatory-approved technologies for pathogen reduction/inactivation have further enhanced the safety of platelet transfusions. Ongoing development of these technologies holds great promise.
Subjects
Drug Contamination/prevention & control, Platelet Transfusion/standards, Bacterial Load/methods, Furocoumarins/therapeutic use, Humans, Photosensitizing Agents/therapeutic use, Platelet Transfusion/adverse effects, Platelet Transfusion/methods, Transfusion Reaction/prevention & control, Ultraviolet Rays
ABSTRACT
The inherent tradeoff between sensitivity and specificity in the detection of unexplained antibodies has been the subject of many studies, editorials, and journal articles. Many publications note that no method is capable of detecting all clinically significant antibodies while avoiding all clinically insignificant ones. This study describes the frequency of nonspecific reactivity and unexplained reactivity in solid-phase testing, along with the subsequent development of specific antibodies (Abs). In this study, nonspecific reactivity (NS) is defined as method-specific panreactivity detected by solid-phase testing only, with no reactivity in other methods. Unexplained reactivity (UR) is defined as reactivity present and detectable in all test methods after all clinically significant antibodies were ruled out following a standard antibody identification algorithm using selected cell panels. This retrospective study evaluated antibody detection tests of patients at a single center over 2 years using two automated solid-phase instruments that used the same three-cell antibody detection test. Antibody identification was performed with solid-phase panels supplemented with a polyethylene glycol tube method as needed. Of the 1934 samples (5% of those tested) with a positive antibody detection test, 29 had unavailable work-up data, leaving 1905 (98.5%) samples eligible for inclusion in the study. The data revealed the following: Ab only, 999 (52.4%); UR only, 429 (22.5%); Ab and UR, 227 (11.9%); NS only, 206 (10.8%); Ab and NS, 24 (1.3%); UR and NS, 14 (0.7%); and Ab, UR, and NS, 6 (0.3%). Among patients with a positive follow-up antibody detection test, UR and NS were replaced by a specific Ab in 23 of 656 UR (3%) and 8 of 230 NS (3%) cases, respectively. Additionally, six patients with UR developed a specific Ab along with persistent UR, and no patients with persistent NS developed a specific Ab. The study concluded that both UR and NS can be encountered in solid-phase testing, and both can persist in follow-up testing. A specific Ab was observed to replace UR in a few patients.
Subjects
Antibodies/analysis, Laboratory Automation, Humans, Immunologic Tests, Polyethylene Glycols, Retrospective Studies
ABSTRACT
BACKGROUND: US FDA draft guidance includes pathogen reduction (PR) or secondary rapid bacterial testing (RT) in its recommendations for mitigating the risk of platelet component (PC) bacterial contamination. An interactive budget impact model was created for hospitals to use when considering these technologies. METHODS: A Microsoft Excel model was built and populated with base-case costs and probabilities identified through a literature search and a survey of US hospital transfusion service directors. Annual costs of PC acquisition, testing, wastage, dispensing/transfusion, sepsis, shelf life, and reimbursement for a mid-sized hospital that purchases all of its PCs were compared for four scenarios: 100% conventional PCs (C-PC), 100% RT-PC, 100% PR-PC, and 50% RT-PC/50% PR-PC. RESULTS: Annual total costs were US$3.64, US$3.67, and US$3.96 million when all platelets were C-PC, RT-PC, or PR-PC, respectively, and US$3.81 million in the 50% RT-PC/50% PR-PC scenario. The annual net cost of PR-PC, obtained by subtracting annual reimbursements from annual total costs, is 6.18% above that of RT-PC. Maximum usable shelf lives for C-PC, RT-PC, and PR-PC are 3.0, 5.0, and 3.6 days, respectively; hospitals obtain PR-PC components earliest, at 1.37 days. CONCLUSION: The model predicts a minimal cost increase for PR-PC versus RT-PC once cost offsets such as the elimination of bacterial detection and irradiation, and reimbursement, are included. The additional safety provided by PR, including mitigation of the risk of transfusion transmission of a broad spectrum of viruses, parasites, and emerging pathogens, may justify this increase. Effective PC shelf life may increase with RT, but platelets can be available sooner with PR because bacterial detection is eliminated, depending on blood center logistics.
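The four-scenario comparison above follows a simple structure: net annual cost equals total annual cost minus annual reimbursements. The sketch below lays that out; the total-cost figures are from the abstract, while the reimbursement value is a placeholder, since the abstract reports only the resulting 6.18% net-cost difference.

```python
# Sketch of the four-scenario cost comparison described above. Total annual costs are
# from the abstract; the reimbursement offset below is a placeholder, because the
# abstract reports only the resulting 6.18% net-cost difference between PR-PC and RT-PC.

total_annual_cost = {            # US$ millions per year
    "C-PC": 3.64,                # 100% conventional platelet components
    "RT-PC": 3.67,               # 100% secondary rapid bacterial testing
    "PR-PC": 3.96,               # 100% pathogen reduction
    "50% RT / 50% PR": 3.81,
}

def net_annual_cost(total_cost_musd: float, reimbursement_musd: float) -> float:
    """Net annual cost = total annual cost minus annual reimbursements."""
    return total_cost_musd - reimbursement_musd

# Hypothetical reimbursement offset applied to two scenarios, for illustration only:
for scenario in ("RT-PC", "PR-PC"):
    print(scenario, net_annual_cost(total_annual_cost[scenario], reimbursement_musd=0.50))
```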
Subjects
Blood Platelets/microbiology, Blood Specimen Collection/economics, Hospital Costs/statistics & numerical data, Platelet Transfusion/economics, Bacterial Infections/diagnosis, Bacterial Infections/economics, Bacterial Infections/prevention & control, Bacterial Infections/transmission, Blood Component Removal/adverse effects, Blood Component Removal/economics, Blood Component Removal/statistics & numerical data, Blood Specimen Collection/methods, Budgets, Humans, Econometric Models, Platelet Transfusion/adverse effects, Platelet Transfusion/statistics & numerical data, United States
ABSTRACT
BACKGROUND: Estimated average glucose (AG) is generally reported along with hemoglobin A1c measurements according to a standard calculation. Given a normal red blood cell lifetime of 120 days, serial A1c measurements at intervals <120 days are not completely independent. For short-interval measurements, a change in AG (ΔAG) necessarily underestimates the change in average glucose operative during the interval (ΔG). We use a model for the kinetics of HbA1c to evaluate the theoretical relationship between ΔAG and ΔG for HbA1c measurements made at intervals between 0 and 120 days. METHODS: From any given starting point for A1c, step changes in G were simulated using model calculations to determine the extent to which A1c could change as a function of the interval of exposure. Values for ΔAG were compared to the operative ΔG as a function of the interval between A1c measurements. RESULTS: The model simulations yield a single graph relating ΔAG to ΔG as a function of the interval between A1c measurements. ΔAG at 15-, 30-, 45-, 60-, 76-, and 90-day intervals underestimated the operative ΔG by 73%, 51%, 34%, 21%, 11%, and 5%, respectively. CONCLUSIONS: Model calculations predict the relationship of changes in estimated average glucose to changes in operative glucose for serial A1c measurements made at intervals <120 days. Given that serial measurements of A1c made at short intervals are not uncommon in practice, physicians may find this information useful.
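The reported underestimation percentages can be used to back out the operative ΔG from a short-interval ΔAG. The sketch below simply interpolates the points published in the abstract; it does not re-implement the authors' kinetic mass-balance model.

```python
# Correct a short-interval change in estimated average glucose (dAG) back to the
# operative change in average glucose (dG), by interpolating the underestimation
# fractions reported in the abstract. This is a convenience interpolation of the
# published points, not the authors' kinetic model.

import numpy as np

interval_days = np.array([15, 30, 45, 60, 76, 90])               # days between A1c measurements
underestimate = np.array([0.73, 0.51, 0.34, 0.21, 0.11, 0.05])   # fraction by which dAG underestimates dG

def operative_delta_g(delta_ag: float, days_between_measurements: float) -> float:
    frac = float(np.interp(days_between_measurements, interval_days, underestimate))
    return delta_ag / (1.0 - frac)   # dAG = (1 - frac) * dG  =>  dG = dAG / (1 - frac)

# e.g., an apparent 20 mg/dL change in AG over a 30-day interval reflects a larger operative change:
print(round(operative_delta_g(20.0, 30), 1))   # ~40.8 mg/dL
```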
Subjects
Blood Glucose/metabolism, Glycated Hemoglobin/metabolism, Biological Models, Humans
ABSTRACT
BACKGROUND: Hospitals review allogeneic red blood cell (RBC) transfusions for appropriateness. Audit criteria have been published that apply to 5 common procedures. We expanded on this work to study the management decision of selecting which cases involving transfusion of at least 1 RBC unit to audit (review), considering all surgical procedures, including those previously studied. METHODS: This retrospective, observational study included 400,000 cases among 1891 different procedures over an 11-year period. There were 12,616 cases with RBC transfusion. We studied the proportions of cases that would be audited based on the criteria of a nadir hemoglobin (Hb) greater than the hospital's selected transfusion threshold, or an absent Hb or missing estimated blood loss (EBL) among procedures with median EBL <500 mL. This threshold EBL was selected because it is approximately the volume removed during the donation of a single unit of whole blood at a blood bank. Missing EBL is important to the audit decision for cases in which the procedure's median EBL is <500 mL because, without an indication of the extent of bleeding, there are insufficient data to assume that there was sufficient blood loss to justify the transfusion. RESULTS: Most cases (>50%) that would be audited and most cases (>50%) with transfusion were among procedures with median EBL <500 mL (P < .0001). Among cases with transfusion and nadir Hb >9 g/dL, the procedure's median EBL was <500 mL for 3.0 times more cases than for procedures having a median EBL ≥500 mL. A greater percentage of cases would be recommended for audit based on missing values for Hb and/or EBL than based on exceeding the Hb threshold among cases of procedures with median EBL ≥500 mL (P < .0001). There were 3.7 times as many cases with transfusion that had missing values for Hb and/or EBL as had a nadir Hb >9 g/dL and a median EBL for the procedure ≥500 mL. CONCLUSIONS: An automated process to select cases for audit of intraoperative RBC transfusion needs to consider the median EBL of the procedure, whether the nadir Hb is below the hospital's Hb transfusion threshold for surgical cases, and the absence of either a Hb value or an entry of the EBL for the case. This conclusion applies to all surgical cases and procedures.
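The audit-selection rule studied above can be summarized as a small decision function. The sketch below is one illustrative reading of the criteria stated in the abstract, with hypothetical field names; it is not the hospital's actual automated process.

```python
# Illustrative audit-selection rule for intraoperative RBC transfusion cases, following
# the criteria described in the abstract. Field names and defaults are hypothetical.

from typing import Optional

def should_audit(nadir_hb: Optional[float],           # g/dL; None if no Hb was recorded for the case
                 ebl_ml: Optional[float],              # estimated blood loss; None if not entered
                 procedure_median_ebl_ml: float,       # historical median EBL for the procedure
                 hb_threshold: float = 9.0) -> bool:   # hospital's selected transfusion threshold
    # Audit if the nadir hemoglobin exceeds the hospital's transfusion threshold.
    if nadir_hb is not None and nadir_hb > hb_threshold:
        return True
    # For procedures whose median EBL is < 500 mL, also audit when the Hb or the case's
    # EBL is missing, since there is no evidence of bleeding sufficient to justify transfusion.
    if procedure_median_ebl_ml < 500 and (nadir_hb is None or ebl_ml is None):
        return True
    return False

print(should_audit(nadir_hb=None, ebl_ml=300, procedure_median_ebl_ml=200))  # True
```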
Subjects
Clinical Audit/standards, Erythrocyte Transfusion/standards, Intraoperative Care/standards, Intraoperative Complications/therapy, Clinical Audit/methods, Erythrocyte Transfusion/methods, Hemoglobins/analysis, Hemoglobins/metabolism, Humans, Intraoperative Care/methods, Intraoperative Complications/diagnosis, Retrospective Studies
ABSTRACT
BACKGROUND: A model for hemoglobin A1c (HbA1c) formation was used to predict the relationship between average glucose (AG) and %HbA1c under conditions of altered red blood cell lifetime (RCL). METHODS: Using a kinetic mass balance model for formation of HbA1c in red blood cells as a function of age (time in circulation), whole blood %HbA1c vs. glucose was calculated based on the nonlinear distribution of red blood cells as a function of age across RCL. RESULTS: Model calculations provided a close fit to the standard relationship of estimated average glucose to %HbA1c for normal RCL (r > 0.999). Results for altered RCL were calculated assuming simple time-scale compression or expansion of the distribution of red blood cells as a function of RCL. For a given %HbA1c, the operative average glucose needed to have achieved a given %HbA1c was predicted to be altered by RCL according to average glucose × RCL = constant. CONCLUSIONS: Model calculations estimate the extent to which standard reporting of AG vs. HbA1c underestimates or overestimates AG under conditions of altered RCL. Conditions of altered RCL may often be operative in patients with certain hemoglobin variants.
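The relation "average glucose × RCL = constant" implies a simple correction: for a fixed %HbA1c, a shorter red cell lifetime corresponds to a proportionally higher operative average glucose. A minimal sketch follows; it assumes the widely used linear estimated-average-glucose equation (AG in mg/dL ≈ 28.7 × %HbA1c − 46.7) as a stand-in for the "standard relationship" the abstract refers to, which is an assumption of this sketch rather than the authors' model.

```python
# Sketch of the RCL correction implied by "average glucose x RCL = constant" for a fixed
# %HbA1c. The estimated-average-glucose equation used here is the commonly cited linear
# relationship and is an assumption of this sketch, not the paper's kinetic model.

NORMAL_RCL_DAYS = 120.0

def estimated_ag(hba1c_percent: float) -> float:
    # Commonly used linear AG estimate, mg/dL
    return 28.7 * hba1c_percent - 46.7

def rcl_adjusted_ag(hba1c_percent: float, rcl_days: float) -> float:
    # Shorter red cell lifetime -> the same %HbA1c corresponds to a higher operative glucose.
    return estimated_ag(hba1c_percent) * (NORMAL_RCL_DAYS / rcl_days)

print(round(estimated_ag(7.0)))            # ~154 mg/dL at a normal 120-day RCL
print(round(rcl_adjusted_ag(7.0, 90.0)))   # ~206 mg/dL if RCL is shortened to 90 days
```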
Subjects
Blood Glucose/metabolism, Erythrocytes/metabolism, Glycated Hemoglobin/metabolism, Statistical Models, Glycosylation, Humans, Kinetics
ABSTRACT
BACKGROUND: Prosthetic hip-associated cobalt toxicity (PHACT) is an uncommon, but potentially devastating, complication for patients with metal-on-metal hip implants (MoMs). Clinical management of PHACT is poorly defined, with the primary intervention being MoM explant followed by chelation therapy. Therapeutic plasma exchange (TPE) in cobalt toxicity has not been previously described. Given that cobalt is predominantly albumin bound, it should theoretically be removed by TPE. Here we report a case of PHACT and our experience using TPE to lower plasma cobalt levels. CASE REPORT: A 61-year-old woman developed deafness, blindness, ambulatory dysfunction, and endocrinopathies after MoM implantation. Cobalt levels on admission were greater than 1500 µg/L. In an attempt to rapidly lower cobalt levels before MoM explant, hemodialysis and TPE were performed. Hemodialysis removed negligible amounts of cobalt. One session of TPE temporarily removed approximately two-thirds of measurable cobalt, but levels rebounded to pre-TPE values after 8 hours. It was only after MoM removal that cobalt levels plateaued below 300 µg/L and clinical symptoms improved. DISCUSSION: TPE removed cobalt from a PHACT patient, but a durable decrease in cobalt was achieved only after MoM explant. These findings are comparable to reports in which chelation was employed in PHACT patients before MoM explant. The observed rebound phenomenon is likely due to rapid equilibration between the large extravascular tissue source (the MoM) and the intravascular compartment. CONCLUSION: TPE may serve as adjunctive therapy for PHACT patients whose cobalt levels remain high after explant, especially in patients with renal failure, in whom chelation is contraindicated.
Subjects
Cobalt/toxicity, Hip Prosthesis/adverse effects, Plasma Exchange/methods, Hip Arthroplasty/adverse effects, Chelation Therapy, Female, Humans, Middle Aged
ABSTRACT
BACKGROUND: Plasma is used to treat acquired coagulopathy or thrombotic thrombocytopenic purpura, or to reverse warfarin effect. Scant data are available, however, about its costs. OBJECTIVE: To estimate the total costs of plasma from production through administration, from the perspective of a US hospital blood donor center (BDC). STUDY DESIGN AND METHODS: Six sequential decision analytic models were constructed and informed by primary and secondary data on time, tasks, personnel, and supplies for donation, processing, and administration. Expected values of the models were summed to yield the BDC's total cost of producing, preparing, and transfusing plasma. Costs ($US 2015) are reported for a typical patient receiving three units of plasma. The models assume plasma was obtained from whole blood donation and transfused in an inpatient setting. Univariate sensitivity analyses were performed to test the impact of changing inputs for personnel costs and adverse event (AE) rates and costs. RESULTS: The BDC production cost of plasma was $91.24/patient ($30.41/unit), a $30.16/patient savings versus purchased plasma. Administration and monitoring costs totaled $194.64/patient. Sensitivity analyses indicated that modifying BDC personnel costs during donation and processing has little impact on total plasma costs. However, the probability and cost of transfusion-associated circulatory overload (TACO) have a significant impact on costs. CONCLUSION: Plasma produced by our BDC may be less costly than purchased plasma. Although plasma production involves multiple tasks requiring staff time, these are not the largest cost driver. Major plasma-related AEs are uncommon, but they are the biggest driver of total plasma costs.
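The per-patient arithmetic reported above is straightforward and is summarized in the sketch below, using only the figures from the abstract (US$ 2015); small rounding differences come from the rounded per-unit cost.

```python
# Sketch of the per-patient plasma cost arithmetic reported above (US$ 2015),
# for a typical patient receiving three whole-blood-derived plasma units.
# Figures are from the abstract; the per-unit cost is rounded, so the product
# differs from the reported $91.24 by a cent.

UNITS_PER_PATIENT = 3
production_cost_per_unit = 30.41            # blood donor center (BDC) production cost
admin_and_monitoring_per_patient = 194.64   # administration and monitoring costs
savings_vs_purchased_per_patient = 30.16    # reported savings versus purchased plasma

production_per_patient = UNITS_PER_PATIENT * production_cost_per_unit    # ~$91.24 reported
total_per_patient = production_per_patient + admin_and_monitoring_per_patient

print(round(production_per_patient, 2), round(total_per_patient, 2))
```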
Subjects
Blood Donors, Blood Transfusion/economics, Health Care Costs/statistics & numerical data, Plasma, Blood Donors/statistics & numerical data, Blood Transfusion/statistics & numerical data, Humans, Theoretical Models, United States
ABSTRACT
BACKGROUND: RhIG has had great success in protecting fetuses from potential harm; however, little work has been done to demonstrate how long RhIG reactivity remains detectable in the mother after administration when using common red blood cell antibody detection methods. STUDY DESIGN AND METHODS: A retrospective investigation was performed examining antibody identification panels that were positive due to RhIG. These panels were run using solid-phase (SP) testing. The time to a positive result, the length of detection, and the positive strength of reactivity (PSR) were evaluated. Additionally, a comparative study was performed evaluating how sensitive SP, gel (GT), and tube testing (TT) were at detecting RhIG, using serially diluted plasma samples spiked with different RhIG formulations. RESULTS: Retrospectively, most antibody identification panels by SP were still positive 3.5 months after RhIG administration and demonstrated a strong PSR. The longest recorded positive panel was at 4.5 months. RhIG administered intramuscularly could not be detected until several hours after injection. The comparative study showed that SP was the most sensitive method, while GT and TT were comparable to one another in detecting RhIG. SP also recorded a strong PSR at very low concentrations of RhIG, whereas GT and TT recorded weak PSRs even at higher concentrations of RhIG. CONCLUSION: SP is the most sensitive testing method and can detect RhIG 4 to 5 months after administration. TT and GT can detect RhIG up to 3 to 4 months after administration. Different RhIG formulations may show slightly different lengths of detection.
Subjects
Erythrocytes/immunology, Histocompatibility Testing/methods, Rh Isoimmunization/diagnosis, Rho(D) Immune Globulin/analysis, Adolescent, Adult, Erythrocytes/cytology, Female, Humans, Immunologic Techniques/methods, Intramuscular Injections, Isoantibodies/blood, Pregnancy, Retrospective Studies, Rh Isoimmunization/blood, Rh Isoimmunization/immunology, Rho(D) Immune Globulin/administration & dosage, Rho(D) Immune Globulin/blood, Young Adult
ABSTRACT
Late allograft failure (LAF) is a common cause of end-stage renal disease. These patients face interrelated challenges regarding immunosuppression management, the risk of graft intolerance syndrome (GIS), and sensitization. This retrospective study analyzes sensitization, pathology, imaging, and transfusion requirements in 33 patients with LAF presenting either with GIS (n = 22) or with grafts remaining quiescent (n = 11). All patients underwent immunosuppression weaning to discontinuation at LAF. Profound increases in sensitization were noted for all groups and, in the GIS group, occurred prior to transplant nephrectomy (TxN). Patients with GIS experienced a major upswing in sensitization at, or before, the time of their symptomatic presentation. For both GIS and quiescent grafts, sensitization appeared to be closely linked to immunosuppression withdrawal. Most transfusion-naïve patients became highly sensitized. Fourteen patients in the GIS group underwent TxN, which revealed grade II acute cellular rejection or worse, with grade 3 chronic active T-cell-mediated rejection. Blinded comparisons of computed tomography scans of the GIS group revealed swollen allografts with fluid collections, compared with the quiescent allografts (QAs), which were shrunken and atrophic. Renal volume on imaging closely matched explant weight. Future studies should focus on interventions to avoid sensitization and GIS.
Subjects
Graft Rejection/diagnosis, Chronic Kidney Failure/surgery, Kidney Transplantation, Postoperative Complications, Diagnostic Imaging, Therapeutic Embolization, Female, Follow-Up Studies, Graft Rejection/etiology, Graft Rejection/therapy, Humans, Immunosuppression Therapy, Immunosuppressive Agents/therapeutic use, Male, Middle Aged, Prognosis, Retrospective Studies, Risk Factors, X-Ray Computed Tomography
ABSTRACT
BACKGROUND: Postpartum hemorrhage (PPH) remains one of the leading causes of maternal morbidity and mortality worldwide, although the lack of a precise definition precludes accurate estimates of the absolute prevalence of PPH. STUDY DESIGN AND METHODS: An international expert panel in obstetrics, gynecology, hematology, transfusion, and anesthesiology undertook a comprehensive review of the literature. At a meeting in November 2011, the panel agreed on a definition of severe PPH that would identify those women at high risk of adverse clinical outcomes. RESULTS: The panel agreed on the following definition for severe persistent (ongoing) PPH: "Active bleeding >1000 mL within the 24 hours following birth that continues despite the use of initial measures including first-line uterotonic agents and uterine massage." A treatment algorithm for severe persistent PPH was subsequently developed. Initial evaluations include measurement of blood loss and clinical assessment of PPH severity. Coagulation screens should be performed as soon as persistent (ongoing) PPH is diagnosed, to guide subsequent therapy. If initial measures fail to stop the bleeding and uterine atony persists, second-line and, if required, third-line interventions should be instituted. These include mechanical or surgical maneuvers, i.e., intrauterine balloon tamponade or hemostatic brace sutures, with hysterectomy as the final surgical option for uncontrollable PPH. Pharmacologic options include hemostatic agents (tranexamic acid), with timely transfusion of blood and plasma products playing an important role in persistent and severe PPH. CONCLUSION: Early, aggressive, and coordinated intervention by health care professionals is critical in minimizing blood loss and ensuring optimal clinical outcomes in the management of women with severe, persistent PPH.
Subjects
Postpartum Hemorrhage/diagnosis, Postpartum Hemorrhage/therapy, Professional Practice, Inherited Blood Coagulation Disorders/complications, Inherited Blood Coagulation Disorders/diagnosis, Inherited Blood Coagulation Disorders/therapy, Blood Component Transfusion/statistics & numerical data, Expert Testimony, Female, Hemostatics/therapeutic use, Humans, Obstetric Labor, Postpartum Hemorrhage/etiology, Practice Guidelines as Topic, Pregnancy, Professional Practice/standards, Professional Practice/statistics & numerical data, Risk Factors
ABSTRACT
BACKGROUND: Regulations governing pretransfusion testing allow specimen expiration to be extended beyond 3 days before transfusion if a patient has not been transfused or pregnant in the preceding 3 months. Our hospital allows extension of the expiration of a presurgical specimen to 28 days if 1) the patient has neither been transfused nor pregnant in the past 3 months, 2) the patient does not have an antibody history, and 3) the current antibody screen (ABSC) is negative. Patients not meeting Criteria 2 and 3 are required to have specimens redrawn on the day of surgery (DOS). We evaluated the necessity of this policy. STUDY DESIGN AND METHODS: From October 2009 to September 2010, there were 132 patients who did not meet the above criteria for specimen extension. Equivalent tests were performed on the preadmission testing (PAT) and DOS specimens, and the results were compared. RESULTS: The majority (113, 86%) of the samples redrawn on the DOS showed no change in antibody serology upon reinvestigation. In the remaining patients, DOS specimens did not identify any new antibodies or change blood product selection. CONCLUSION: Of the PAT specimens rejected for an antibody history or a positive ABSC, none had new significant serologic findings on the DOS. Based on these results, requiring a repeat specimen on the DOS may not be clinically necessary. Our facility changed the PAT policy to extend specimen acceptability to patients with a red blood cell antibody history or a positive ABSC at the time of PAT. A 6-month follow-up period showed that this practice is safe.
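The original 28-day extension policy amounts to a four-part eligibility check. The sketch below is an illustrative reading of the criteria as stated in the abstract, with hypothetical field names; it is not the hospital's actual laboratory information system logic.

```python
# Illustrative eligibility check for extending a presurgical specimen's expiration to
# 28 days, following the policy criteria described in the abstract. Field names are
# hypothetical; this sketches the stated rule, not the hospital's actual system.

def eligible_for_28_day_extension(transfused_last_3_months: bool,
                                  pregnant_last_3_months: bool,
                                  has_antibody_history: bool,
                                  current_antibody_screen_positive: bool) -> bool:
    return (not transfused_last_3_months
            and not pregnant_last_3_months
            and not has_antibody_history
            and not current_antibody_screen_positive)

# Under the original policy, a patient with a positive antibody screen at preadmission
# testing would need a new specimen drawn on the day of surgery:
print(eligible_for_28_day_extension(False, False, False, True))  # False
```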
Subjects
Blood Specimen Collection/standards, Organizational Policy, Patient Admission, Patient Safety/legislation & jurisprudence, Patient Safety/standards, Blood Grouping and Crossmatching/standards, Blood Preservation/standards, Blood Transfusion/legislation & jurisprudence, Blood Transfusion/standards, Female, Follow-Up Studies, Hospitals/standards, Humans, Isoantibodies/analysis, Isoantibodies/blood, Patient Admission/legislation & jurisprudence, Patient Admission/standards, Patient Safety/statistics & numerical data, Postoperative Hemorrhage/epidemiology, Postoperative Hemorrhage/therapy, Pregnancy, Serologic Tests, Time Factors, Transfusion Reaction, United States/epidemiology, United States Food and Drug Administration/legislation & jurisprudence
ABSTRACT
OBJECTIVES: To examine whether a liver transplant patient, who was not taking an angiotensin-converting enzyme inhibitor and developed two episodes of hypotension with systolic pressure in the 50s within minutes of starting an RBC transfusion, may have had a disturbance in the production and metabolism of bradykinin and des-Arg(9)-BK. METHODS: All patient information was obtained by reviewing the electronic medical record, the transfusion service database, and transfusion reaction investigation records. RESULTS: The blood pressure returned to normal once the transfusions were discontinued. In an effort to mitigate the acute hypotension, the blood products were washed. Subsequently, the patient received three additional packed RBC transfusions without further incidents of hypotension. CONCLUSIONS: Our experience suggests that washing the products was an acceptable and effective preventative measure to avoid further acute hypotensive transfusion reactions in patients unable to metabolize these vasodilators present in the donor units.
Subjects
Erythrocyte Transfusion/adverse effects, Erythrocyte Transfusion/methods, Hypotension/etiology, Hypotension/prevention & control, Liver Transplantation, Acute Disease, Aged, Bradykinin/analogs & derivatives, Bradykinin/metabolism, Female, Humans, Hypotension/metabolism
ABSTRACT
BACKGROUND: Our blood bank is part of a large academic institution with an active sickle cell anemia program. We provide sickle patients with blood phenotypically matched for C/c, E/e, and K antigens. Since licensed reagents are available for phenotyping C/c, E/e, and K on an automated blood analyzer, we decided to evaluate whether establishing our own inventory of blood negative for those antigens would result in cost savings and decreased turnaround time (TAT). STUDY DESIGN AND METHODS: Antigen typing of blood units for C/c, E/e, and K was validated. From March 1, 2012, to August 31, 2012, a total of 1033 units from our own donor center and from our suppliers were phenotyped. We compared direct cost savings and TAT for blood availability with historical data before we began phenotyping. RESULTS: Thirty-eight percent of typed antigen-negative (AG-) units were transfused to sickle patients. An additional 35% were transfused to nonsickle patients needing AG- blood. Twenty-one percent were used by patients without antibodies to prevent outdating. The remaining 6% had not yet been transfused by the end of the study period. From March 1, 2011, to August 31, 2011, we spent almost $200,000 on obtaining AG- blood. In the 6 months since we started antigen typing, we have saved approximately $110,000, the majority of which resulted from AG- blood provided to sickle patients. In addition, TAT for AG- units from our inventory significantly improved to 1 to 2 hours versus approximately 6 hours when obtained from our suppliers. CONCLUSION: Establishing an AG- inventory in a hospital-based blood bank is cost-effective and time-efficient.
Subjects
Sickle Cell Anemia/therapy, Blood Storage/methods, Blood Banks/economics, Blood Group Antigens/immunology, Erythrocytes/immunology, Academic Medical Centers/economics, Blood Grouping and Crossmatching/economics, Blood Grouping and Crossmatching/methods, Blood Transfusion, Cost Savings, Humans
ABSTRACT
BACKGROUND AND PURPOSE: Red blood cell transfusion (RBCT) may increase the risk of thrombotic events (TE) in patients with subarachnoid hemorrhage (SAH) through storage-induced changes coupled with SAH-related hypercoagulability. We sought to investigate the association between RBCT and the risk of TE in patients with SAH. METHODS: 205 consecutive patients with acute, aneurysmal SAH admitted to the neurovascular intensive care unit of a tertiary care, academic medical center between 3/2008 and 7/2009 were enrolled in a retrospective, observational cohort study. TE were defined as the composite of venous thromboembolism (VTE), myocardial infarction (MI), and cerebral infarction noted on brain CT scan. Secondary endpoints included the risk of VTE, poor outcome (modified Rankin score 3-6 at discharge), and in-hospital mortality. RESULTS: 86/205 (42%) received RBCT. Eighty-eight (43%) had a thrombotic complication. Forty (34%) of 119 non-transfused and 48/86 (56%) transfused patients had a TE (p = 0.002). In multivariate analysis, RBCT was associated with TE [OR 2.4; 95% CI (1.2, 4.6); p = 0.01], VTE [OR 2.3; 95% CI (1.0, 5.2); p = 0.04], and poor outcome [OR 5.0; 95% CI (1.9, 12.8); p < 0.01]. The risk of TE increased by 55% per unit transfused when controlling for univariate variables. Neither the mean nor the maximum age of blood was significantly associated with thrombotic risk. CONCLUSIONS: RBCT is associated with an increased risk of TE and VTE in SAH patients. A dose-dependent relationship exists between the number of units transfused and thrombosis. Age of blood does not appear to play a role.
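To illustrate the dose-dependent relationship reported above, the sketch below compounds the 55% per-unit increase in thrombotic risk multiplicatively across units, as would be implied by a per-unit odds ratio in a logistic model; treating the estimate this way is an assumption of the illustration, not a re-analysis of the study data.

```python
# Arithmetic illustration of the reported dose-dependent association: a 55% increase in
# the odds of a thrombotic event per RBC unit transfused, compounded multiplicatively.
# Assumes the per-unit estimate behaves like an odds ratio from a logistic model.

PER_UNIT_OR = 1.55   # per-unit estimate reported in the multivariate analysis

def cumulative_or(units_transfused: int) -> float:
    return PER_UNIT_OR ** units_transfused

for units in (1, 2, 4):
    print(units, round(cumulative_or(units), 2))   # 1 -> 1.55, 2 -> 2.4, 4 -> 5.77
```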