ABSTRACT
BACKGROUND: Transfusion-related acute lung injury (TRALI) is a leading cause of transfusion-related mortality. A concern with passive surveillance to detect transfusion reactions is underreporting. Our aim was to obtain evidence-based estimates of TRALI incidence using meta-analysis of active surveillance studies and to compare these estimates with passive surveillance. STUDY DESIGN AND METHODS: We performed a systematic review and meta-analysis of studies reporting TRALI rates. A search of Medline and Embase by a research librarian identified studies published between January 1, 1991 and January 20, 2023. Prospective and retrospective observational studies reporting TRALI by blood component (red blood cells [RBCs], platelets, or plasma) were identified, and all inpatient and outpatient settings were eligible. Adult and pediatric, as well as general and specific clinical populations, were included. Studies of platelets and plasma must have used at least one modern TRALI donor risk mitigation strategy. A random effects model estimated TRALI incidence by blood component for active and passive surveillance studies, and heterogeneity was examined using meta-regression. RESULTS: Eighty studies were included, with approximately 176 million blood components transfused. RBCs had the highest number of studies included (n = 66), followed by platelets (n = 35) and plasma (n = 34). Pooled TRALI estimates for active surveillance studies were 0.17/10,000 (95% confidence interval [CI]: 0.03-0.43; I² = 79%) for RBCs, 0.31/10,000 (95% CI: 0.22-0.42; I² < 1%) for platelets, and 3.19/10,000 (95% CI: 0.09-10.66; I² = 86%) for plasma. Studies using passive surveillance ranged from 0.02 to 0.10/10,000 among the various blood components. DISCUSSION: These estimates may improve quantitative understanding of TRALI risk, which is important for clinical decision-making weighing the risks and benefits of transfusion.
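The random-effects pooling described above can be sketched in code. This is a minimal DerSimonian-Laird implementation run on hypothetical per-10,000 rates and within-study variances, not the study's actual data or software:

```python
import math

def dl_pool(rates, variances):
    """DerSimonian-Laird random-effects pooling of study-level rates.

    rates: study point estimates (e.g., TRALI cases per 10,000 components)
    variances: within-study variances of those estimates
    Returns the pooled rate, a 95% CI, and the I^2 heterogeneity statistic.
    """
    w = [1.0 / v for v in variances]
    fixed = sum(wi * r for wi, r in zip(w, rates)) / sum(w)
    # Cochran's Q drives both tau^2 (between-study variance) and I^2
    q = sum(wi * (r - fixed) ** 2 for wi, r in zip(w, rates))
    df = len(rates) - 1
    c = sum(w) - sum(wi * wi for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)
    w_star = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * r for wi, r in zip(w_star, rates)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    i2 = 100.0 * max(0.0, (q - df) / q) if q > 0 else 0.0
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se), i2

# Three hypothetical active-surveillance RBC studies (rates per 10,000)
pooled, ci, i2 = dl_pool([0.10, 0.25, 0.40], [0.002, 0.004, 0.008])
```

With inputs this heterogeneous, the pooled estimate sits between the study rates and I² is large, mirroring the high I² reported for RBCs and plasma.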
Subject(s)
Transfusion-Related Acute Lung Injury, Humans, Transfusion-Related Acute Lung Injury/epidemiology, Transfusion-Related Acute Lung Injury/etiology, Incidence, Transfusion Reaction/epidemiology, Platelet Transfusion/adverse effects, Acute Lung Injury/etiology, Acute Lung Injury/epidemiology
ABSTRACT
BACKGROUND: The purpose of this scoping review was to identify available sources of evidence on the epidemiology of transfusion-related acute lung injury (TRALI) and to determine whether meta-analysis of TRALI incidence is feasible. TRALI is a serious complication and the second leading cause of death related to blood transfusion. Estimates of the incidence of TRALI would provide a useful benchmark for research to reduce TRALI. STUDY DESIGN AND METHODS: We searched the Medline, EMBASE, and PubMed databases for publications related to the incidence of TRALI and hemovigilance. We included all studies irrespective of language or country. Both full-text articles and conference abstracts were included. All study participants must have received a blood transfusion. Reviews and case studies were excluded. RESULTS: We identified 427 articles or abstracts to include for review. More than half were abstracts, and the majority were published after 2010. Reported TRALI definitions varied, and only 27.2% of studies reported any definition for TRALI. TRALI rates were reported using different denominators, such as per blood unit (54.1%), per patient (34.4%), and per transfusion episode (14.8%). Study populations and contexts were mostly general (75.6% and 80.3%, respectively). Study design also varied: most studies were observational (90.6%), and only 13.1% used modern donor restriction policies. DISCUSSION: There was substantial variation in reporting in studies on TRALI incidence. Meta-analysis of TRALI rates may be feasible in specific circumstances where reporting is clear. Future studies should clearly report key items, such as a TRALI definition.
Subject(s)
Transfusion-Related Acute Lung Injury, Humans, Blood Transfusion, Transfusion-Related Acute Lung Injury/epidemiology, Meta-Analysis as Topic
ABSTRACT
BACKGROUND: The relative safety of bacterial risk control strategies for platelets that include culture with or without rapid testing has been compared using simulation analysis. A wide range of bacterial lag and doubling times were included. However, published data on growth rates are available, and these data have not been synthesized. We conducted a systematic review and meta-analysis to estimate growth rates and used these estimates to refine a comparative safety analysis of bacterial risk control strategies in the FDA guidance. STUDY DESIGN AND METHODS: Data were extracted from published studies on bacterial growth rates in platelet components during storage. These data were used to estimate the practical range of growth rates. This refined the inputs for a simulation model comparing the safety of the testing strategies. RESULTS: In total, 108 growth curves for 11 different aerobic organisms were obtained. Doubling times ranged from 0.8 to 12 h, but the lower 90% range was approximately 1-5 h. The revised comparative safety simulation using the narrower 1-5-h range showed similar rankings to the prior simulation, with 48-h large-volume delayed sampling with 7-day expiration (48C-7) demonstrating the lowest-ranking relative performance at the 10³ and 10⁵ colony-forming unit (CFU)/mL exposure thresholds. DISCUSSION: This was a two-step study. First, meta-analysis of published data on aerobic bacterial growth rates in stored platelets showed that the vast majority of doubling times were 1-5 h. Next, an updated comparative safety simulation yielded similar results to the prior study, with 48C-7 showing the least favorable relative safety performance.
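The lag-and-doubling-time growth model used in such simulations can be written compactly. This sketch, with caller-chosen hypothetical parameters, shows why the 1-5 h doubling-time range matters: time to any harm threshold scales linearly with doubling time.

```python
import math

def cfu_per_ml(t_hours, inoculum, lag_h, doubling_h):
    """Bacterial concentration: flat during the lag phase, then doubling
    every doubling_h hours."""
    if t_hours <= lag_h:
        return inoculum
    return inoculum * 2 ** ((t_hours - lag_h) / doubling_h)

def time_to_threshold(threshold, inoculum, lag_h, doubling_h):
    """Hours of storage until contamination reaches a given CFU/mL threshold."""
    return lag_h + doubling_h * math.log2(threshold / inoculum)
```

For a 1 CFU/mL inoculum, reaching 10⁵ CFU/mL takes about lag plus 16.6 doubling times, so a 1-h versus 5-h doubling time shifts the danger window by days.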
Subject(s)
Blood Platelets, Health Behavior, Humans, Computer Simulation
ABSTRACT
BACKGROUND: Non-pathogen reduction platelet bacterial risk control strategies in the US FDA guidance include at least one culture. Almost all of these strategies have a culture hold time of ≥12 h. Studies have reported time to detection (TTD) of bacterial cultures inoculated with bacteria from contaminated platelets, but these data and estimates of risk associated with detection failures have not been synthesized. METHODS: We performed a literature search to identify studies reporting TTD for samples obtained from spiked platelet components. Using extracted data, regression analysis was used to estimate TTD for culture bottles at different inoculum sizes. Detection failures were defined as events in which contaminated components are transfused to a patient. We then used published data on time of transfusion (ToT) to estimate the risk of detection failures in practice. RESULTS: The search identified 1427 studies, of which 16 were included for analysis. TTD data were available for 16 different organisms, including 14 in aerobic cultures and 11 in anaerobic cultures. For inocula of 1 colony forming unit (CFU), the average TTD for aerobic organisms was 19.2 h while it was 24.9 h in anaerobic organisms, but there was substantial overall variation. A hold time of 12 versus 24 h had minimal effect for most organisms. CONCLUSION: TTD variation occurs between bacterial species and within a particular species. Under typical inventory management, the relative contribution of culture detection failures is much smaller than the residual risk from sampling failures. Increasing the hold period beyond 12 h has limited value.
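Under constant exponential growth to a fixed detectable load, each doubling of the inoculum removes one doubling time from TTD, so TTD is linear in the log of inoculum size. A sketch of that relationship, using the reported 19.2-h aerobic average at 1 CFU together with an assumed (hypothetical) doubling time:

```python
import math

def predicted_ttd(inoculum_cfu, ttd_at_1_cfu_h, doubling_h):
    """TTD falls by one doubling time per doubling of the inoculum:
    TTD(I) = TTD(1 CFU) - doubling_h * log2(I)."""
    return ttd_at_1_cfu_h - doubling_h * math.log2(inoculum_cfu)

# With the reported 19.2-h aerobic average at 1 CFU and an assumed 2-h
# doubling time, a 100-CFU inoculum would flag roughly 13 h sooner.
```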
Subject(s)
Bacteria, Blood Platelets, Humans, Blood Platelets/microbiology, Time Factors, Platelet Transfusion
ABSTRACT
OBJECTIVES: There is continuing pressure to improve the cost-effectiveness of quality control (QC) for clinical laboratory testing. Risk-based approaches are promising, but recent research has uncovered problems in some common methods. There is a need for improvements in risk-based methods for quality control. METHODS: We provide an overview of a dynamic model for assay behavior. We demonstrate the practical application of the model using simulation and compare the performance of simple Shewhart QC monitoring against Westgard rules. We also demonstrate the utility of trade-off curves for analysis of QC performance. RESULTS: Westgard rules outperform simple Shewhart control over a narrow range of the trade-off curve of false-positive and false-negative risk. The risk trade-off can be visualized in terms of risk, risk vs. cost, or cost alone. Risk trade-off curves can be "smoothed" by log transformation. CONCLUSIONS: Dynamic risk models may provide advantages relative to static models for risk-based QC analysis.
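The Shewhart-versus-Westgard comparison can be illustrated with a small Monte Carlo simulation. This is a simplified sketch (a 1:3s Shewhart rule against a reduced Westgard multirule of 1:3s/2:2s/4:1s, applied to standard-normal QC z-scores), not the paper's model:

```python
import random

def shewhart_reject(z_window):
    """Simple Shewhart control: reject if the latest z-score is beyond 3 SD."""
    return abs(z_window[-1]) > 3.0

def westgard_reject(z_window):
    """Reduced Westgard multirule: 1:3s, plus 2:2s (two consecutive points
    beyond the same 2 SD limit) and 4:1s (four beyond the same 1 SD limit)."""
    z = z_window
    if abs(z[-1]) > 3.0:
        return True
    if len(z) >= 2 and (all(v > 2.0 for v in z[-2:]) or all(v < -2.0 for v in z[-2:])):
        return True
    if len(z) >= 4 and (all(v > 1.0 for v in z[-4:]) or all(v < -1.0 for v in z[-4:])):
        return True
    return False

def false_rejection_rate(rule, runs=20000, window_len=4, seed=7):
    """Monte Carlo false-rejection (false-positive) risk for an in-control assay."""
    rng = random.Random(seed)
    rejected = sum(
        rule([rng.gauss(0.0, 1.0) for _ in range(window_len)]) for _ in range(runs)
    )
    return rejected / runs

fr_shewhart = false_rejection_rate(shewhart_reject)
fr_westgard = false_rejection_rate(westgard_reject)
```

Because the multirule adds rejection conditions on top of 1:3s, its false-rejection risk is at least as high; the corresponding gain is in error-detection power, which is the other axis of the trade-off curve.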
Subject(s)
Clinical Laboratory Techniques, Humans, Quality Control, Computer Simulation, Risk Assessment
ABSTRACT
BACKGROUND: In hematologic and transfusion medicine research, measurement of red blood cell (RBC) in vivo kinetics must be safe and accurate. Recent reports indicate that use of biotin-labeled RBCs (BioRBC) to determine red cell survival (RCS) offers substantial advantages over ⁵¹Cr and other labeling methods. Occasional induction of BioRBC antibodies has been reported. STUDY DESIGN AND METHODS: To investigate the causes and consequences of BioRBC immunization, we re-exposed three previously immunized adults to BioRBC and evaluated the safety, antibody emergence, and RCS of BioRBC. RESULTS: BioRBC re-exposure caused an anamnestic increase of plasma BioRBC antibodies at 5-7 days; all were subclass IgG1 and neutralized by biotinylated albumin, indicating structural specificity for the biotin epitope. Concurrently, specific antibody binding to BioRBC was observed in each subject. As biotin label density increased, the proportion of BioRBC that bound antibody also increased; this was associated with proportionally accelerated removal of BioRBC labeled at a density of 6 µg/mL. In contrast, only one of three subjects exhibited accelerated removal of BioRBC labeled at a density of 2 µg/mL. No adverse clinical or laboratory events were observed. Among three control subjects who did not develop BioRBC antibodies after initial BioRBC exposure, re-exposure induced neither antibody emergence nor accelerated BioRBC removal. DISCUSSION: We conclude that re-exposure of immunized subjects to BioRBC can induce an anamnestic antibody response that can cause underestimation of RCS. To minimize the chances of antibody induction and underestimation of RCS, we recommend an initial BioRBC exposure volume of ≤10 mL and label densities of ≤18 µg/mL.
Subject(s)
Biotin, Erythrocytes, Adult, Antibodies/metabolism, Biotin/chemistry, Cell Survival, Erythrocyte Count, Erythrocytes/metabolism, Humans
ABSTRACT
BACKGROUND: The US Food and Drug Administration (FDA) issued a guidance for bacterial risk control strategies for platelet components in September 2019 that includes strategies using secondary bacterial culture (SBC). While an SBC likely increases safety, the optimal timing of the SBC is unknown. Our aim was to develop a model to provide insight into the best time for SBC sampling. STUDY DESIGN AND METHODS: We developed a mathematical model based on the conditional probability of a bacterial contamination event. The model evaluates the impact of secondary culture sampling time over a range of bacterial contamination scenarios (lag and doubling times), with the primary outcome being the optimal secondary sampling time and the associated risk. RESULTS: Residual risk of exposure decreased with increasing inoculum size, later sampling times for primary culture, and higher thresholds of exposure (in colony-forming units per milliliter). Given a level of exposure, the optimal sampling time for secondary culture depended on the timing of primary culture and on the expiration time. In general, the optimal sampling time for secondary culture was approximately halfway between the time of primary culture and the expiration time. CONCLUSION: Our model supports the FDA guidance and suggests that sampling earlier in the specified secondary culture windows may be optimal for safety.
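The conditional-risk idea can be illustrated with a toy grid search over secondary sampling times. All parameters here (growth scenarios, detection limit, 12-h result delay, uniform transfusion times) are illustrative assumptions, not the paper's model inputs:

```python
def exposure_risk(ts, tp, expiry, scenarios, threshold=1e5,
                  detect_limit=1.0, result_delay_h=12.0, steps=200):
    """Fraction of (scenario, transfusion-time) pairs in which a unit is
    transfused at or above `threshold` CFU/mL despite a secondary culture
    drawn at hour `ts`. Transfusion time is uniform on [tp, expiry]; the
    culture interdicts the unit `result_delay_h` hours after sampling,
    provided the concentration at `ts` is at least `detect_limit`."""
    risky = 0
    for lag, doubling, inoculum in scenarios:
        def conc(t):
            if t <= lag:
                return inoculum
            return inoculum * 2 ** ((t - lag) / doubling)
        detected = conc(ts) >= detect_limit
        for i in range(steps):
            t = tp + (expiry - tp) * (i + 0.5) / steps
            interdicted = detected and t >= ts + result_delay_h
            if conc(t) >= threshold and not interdicted:
                risky += 1
    return risky / (len(scenarios) * steps)

# Hypothetical scenarios: lag 6-24 h, doubling 1-5 h, 0.01 CFU/mL inoculum
scenarios = [(lag, dbl, 0.01) for lag in (6, 12, 24) for dbl in (1, 2, 5)]
risks = {ts: exposure_risk(ts, tp=24, expiry=168, scenarios=scenarios)
         for ts in range(24, 168, 12)}
best_ts = min(risks, key=risks.get)
```

In this toy model the minimum-risk sampling time falls strictly between primary culture and expiry: sampling too early misses slow-growing contaminants still below the detection limit, while sampling too late leaves a long pre-interdiction window, consistent with the intermediate optimum described above.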
Subject(s)
Bacteria/isolation & purification, Bacterial Infections/transmission, Blood Platelets/microbiology, Blood Safety/methods, Blood Safety/standards, Platelet Transfusion/adverse effects, Transfusion Reaction/microbiology, Bacteria/growth & development, Bacterial Infections/blood, Bacterial Infections/etiology, Humans, Models, Theoretical, Policy, Risk Factors, United States, United States Food and Drug Administration
ABSTRACT
BACKGROUND AND OBJECTIVES: Septic transfusion reactions are a principal cause of transfusion-related mortality. The frequency of detectable bacterial contamination is greater in platelets compared to other blood components because platelets are stored at room temperature. Most strategies outlined in the September 2019 FDA guidance require both aerobic culture (AC) and anaerobic culture (AnC) testing. We performed a systematic review and meta-analysis in an effort to provide the best available estimate of the effectiveness of AnC. MATERIALS AND METHODS: Our analysis was performed according to published guidelines. Broad and context-specific meta-analyses of bacterial detection rates in platelets by AnC were performed to assess the practical effectiveness of AnC as a risk control measure. RESULTS: Seven studies with a total of 1,767,014 tested platelet components were included for analysis. With exclusion of positives due to Cutibacterium/Propionibacterium species and redundancy due to AC results, AnC detected 0.06 contamination events per thousand (EPT) components tested, twofold lower than the AC (0.12 EPT). CONCLUSION: Excluding Cutibacterium/Propionibacterium species, AnC detects occasional bacterial contamination events that are not detected by AC (~1 in 17,000 platelet components).
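The events-per-thousand arithmetic generalizes to any component count. A sketch using Byar's approximation to the exact Poisson interval, with a hypothetical event count chosen only to land near the reported 0.06 EPT rate:

```python
import math

def events_per_thousand(events, components, z=1.96):
    """Detection rate per 1000 components with an approximate 95% CI
    (Byar's approximation to the exact Poisson interval; needs events >= 1)."""
    rate = 1000.0 * events / components
    lo = events * (1 - 1 / (9 * events) - z / (3 * math.sqrt(events))) ** 3
    hi = (events + 1) * (1 - 1 / (9 * (events + 1)) + z / (3 * math.sqrt(events + 1))) ** 3
    return rate, 1000.0 * lo / components, 1000.0 * hi / components

# 106 hypothetical non-redundant anaerobic-only positives among the
# 1,767,014 tested components would give roughly the reported 0.06 EPT
rate, lo, hi = events_per_thousand(106, 1767014)
```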
Subject(s)
Bacteria/metabolism, Bacteriological Techniques/methods, Blood Platelets/microbiology, Drug Contamination/prevention & control, Platelet Transfusion/methods, Transfusion Reaction/microbiology, Anaerobiosis, Blood Safety, Humans, Platelet Transfusion/adverse effects, Transfusion Reaction/prevention & control
ABSTRACT
OBJECTIVE. The purpose of this study was to assess the utility of radiography in diagnosing osteonecrosis of the femoral head, with pathologic examination as the reference standard. MATERIALS AND METHODS. Radiography and pathology reports of 253 consecutive femoral head resections were reviewed. A subset of 128 cases in which the diagnosis of osteonecrosis was made or suggested radiographically or pathologically was reviewed to evaluate factors that might influence correlation. A total of 23 patients in this subset had also undergone MRI, and those reports and images were reviewed. RESULTS. There was 93.9% agreement between radiography and pathologic examination overall (κ = 0.67). When grade 3 osteoarthritis was present, 95.0% agreement was found, but because of the large number of patients with severe osteoarthritis, the kappa value decreased to 0.51. In the subset of cases in which osteonecrosis was diagnosed or suspected, radiologic-pathologic correlation decreased as osteoarthritis grade increased, and the diagnostic uncertainty for both evaluation methods increased. One patient without osteoarthritis had osteonecrosis diagnosed in both hips at radiography and MRI, but osteonecrosis was absent at pathologic examination. CONCLUSION. Radiography depicts osteonecrosis in most patients who have osteonecrosis and subsequently undergo femoral head resection. False-positive and false-negative radiographic findings occur, however. Diagnosis is most difficult in patients with advanced osteoarthritis or subchondral fractures. The number of patients who underwent MRI was not sufficient for evaluation of the accuracy of MRI.
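The kappa behavior noted above, high percent agreement yet a lower kappa when severe osteoarthritis dominates, follows directly from Cohen's formula, as this small sketch with hypothetical rating tables shows:

```python
def cohens_kappa(table):
    """Cohen's kappa for a square agreement table
    (rows: radiography grade, columns: pathology grade)."""
    n = sum(sum(row) for row in table)
    k = len(table)
    p_observed = sum(table[i][i] for i in range(k)) / n
    row_totals = [sum(row) for row in table]
    col_totals = [sum(table[i][j] for i in range(k)) for j in range(k)]
    p_expected = sum(row_totals[i] * col_totals[i] for i in range(k)) / (n * n)
    return (p_observed - p_expected) / (1 - p_expected)

balanced = cohens_kappa([[45, 5], [5, 45]])  # 90% agreement, balanced classes
skewed = cohens_kappa([[90, 5], [5, 0]])     # 90% agreement, one dominant class
```

Both hypothetical tables show 90% observed agreement, but kappa collapses in the skewed table because chance agreement is already high, the same effect that dropped κ from 0.67 to 0.51 here.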
Subject(s)
Femur Head Necrosis/diagnosis, Aged, Female, Femur Head/diagnostic imaging, Femur Head/pathology, Femur Head Necrosis/diagnostic imaging, Femur Head Necrosis/pathology, Humans, Male, Middle Aged, Radiography
ABSTRACT
BACKGROUND: Primary culture alone was a bacterial risk control strategy intended to facilitate interdiction of contaminated platelets (PLTs). A September 2019 FDA guidance includes secondary testing options to enhance safety. Our objective was to use meta-analysis to determine residual contamination risk after primary culture using secondary culture and rapid testing. STUDY DESIGN AND METHODS: A December 2019 literature search identified articles on PLT bacterial detection rates using primary culture and a secondary testing method. We used meta-analysis to estimate secondary testing detection rates after a negative primary culture. We evaluated collection method, sample volume, sample time, and study date as potential sources of heterogeneity. RESULTS: The search identified 6102 articles; 16 were included for meta-analysis. Of these, 12 used culture and five used rapid testing as a secondary testing method. Meta-analysis was based on a total of 103,968 components tested by secondary culture and 114,697 by rapid testing. The residual detection rate using secondary culture (DRSC) was 0.93 (95% CI, 0.24-0.6) per 1000 components, while the residual detection rate using rapid testing (DRRT) was 0.09 (95% CI, 0.01-0.25) per 1000 components. Primary culture detection rate was the only statistically significant source of heterogeneity. CONCLUSION: We evaluated bacterial detection rates after primary culture using rapid testing and secondary culture. These results provide a lower and upper bound on real-world residual clinical risk because these methods are designed to detect high-level exposures or any level of exposure, respectively. Rapid testing may miss some harmful exposures, and secondary culture may identify some clinically insignificant exposures.
Subject(s)
Bacteria/growth & development, Bacteriological Techniques, Blood Culture, Blood Platelets/microbiology, Platelet Transfusion/adverse effects, Sepsis, Transfusion Reaction, Bacteria/classification, Female, Humans, Male, Sepsis/etiology, Sepsis/microbiology, Transfusion Reaction/etiology, Transfusion Reaction/microbiology
ABSTRACT
BACKGROUND: Platelets have the highest bacterial contamination risk of all blood components, and septic transfusion reactions remain a problem. A good estimate of contamination rates could provide information about residual risk and inform optimal testing strategies. We performed a systematic review and meta-analysis of platelet contamination rates by primary culture. STUDY DESIGN AND METHODS: A literature search in December 2019 identified articles on platelet contamination rates using primary culture. We used meta-analysis to estimate the overall rate of contamination and meta-regression to identify heterogeneity. We studied the following sources of heterogeneity: collection method, sample volume, positivity criteria, and study date. Contamination rate estimates were obtained for apheresis (AP), platelet-rich plasma (PRP), and buffy coat (BC) collection methods. RESULTS: The search identified 6102 studies, and 22 were included for meta-analysis. Among these 22 studies, there were 21 AP cohorts (4,072,022 components), 4 PRP cohorts (138,869 components), and 15 BC cohorts (1,474,679 components). The overall mean contamination rate per 1000 components was 0.51 (95% CI: 0.38-0.67), including AP (0.23, 95% CI: 0.18-0.28), PRP (0.38, 95% CI: 0.15-0.70), and BC (1.12, 95% CI: 0.51-1.96). There was considerable variability within each collection method. Sample volume, positivity criteria, and publication year were significant sources of heterogeneity. CONCLUSION: The bacterial contamination rate of platelets by primary culture is 1 in 1961. AP and PRP components showed a lower contamination rate than BC components. There is clinically significant between-study variability for each method. Larger sample volumes increased sensitivity, and bacterial contamination rates have decreased over time.
Subject(s)
Bacterial Infections/blood, Blood Component Removal/statistics & numerical data, Blood Platelets/microbiology, Drug Contamination/statistics & numerical data, Platelet Transfusion/statistics & numerical data, Primary Cell Culture/statistics & numerical data, Bacterial Infections/epidemiology, Bacterial Infections/transmission, Bacteriological Techniques, Blood Component Removal/adverse effects, Blood Component Transfusion/adverse effects, Blood Component Transfusion/statistics & numerical data, Blood Platelets/cytology, Cells, Cultured, Humans, Platelet Transfusion/adverse effects, Platelet-Rich Plasma/microbiology, Primary Cell Culture/methods, Primary Cell Culture/standards, Transfusion Reaction/epidemiology, Transfusion Reaction/microbiology
ABSTRACT
BACKGROUND: Bacterial contamination of platelets is a problem that can lead to harmful septic transfusion reactions. The US Food and Drug Administration published a guidance in September 2019 detailing several permissible risk control strategies. Our objective was to compare the safety of each bacterial testing strategy for apheresis platelets. STUDY DESIGN AND METHODS: We used simulation to compare the safety of the nine risk control strategies involving apheresis platelet testing. The primary outcome was the risk of exposure. An exposure event occurred if a patient received platelets exceeding a specific contamination threshold (>0, 10³, or 10⁵ colony-forming units [CFU]/mL). We generated a range of bacterial contamination scenarios (inoculum size, doubling time, lag time) and compared the risk of exposure for each policy in each contamination scenario. We then computed the average risk difference over all scenarios. RESULTS: At the 0 CFU/mL exposure threshold, two-step policies that used secondary culture ranked best (all top three), while single-step 24-hour culture with 3-day expiration ranked last (ninth). This latter policy performed well (median rank of 1) at both the 10³ and 10⁵ CFU/mL thresholds, but 48-hour culture with 7-day expiration performed relatively poorly. At these higher thresholds, median ranks of two-step policies that used secondary culture were again top three. Two-step policies that used rapid testing improved at the higher (10⁵ CFU/mL) harm threshold, with median rankings between 1 and 5. CONCLUSION: Two-step policies that used secondary culture were generally safer than single-step policies. Performance of two-step policies that used rapid testing depended on the CFU per milliliter threshold of exposure used.
Subject(s)
Bacterial Infections, Blood Platelets/microbiology, Blood Safety, Models, Biological, Platelet Transfusion, Plateletpheresis, Bacterial Infections/blood, Bacterial Infections/etiology, Health Policy, Humans, Risk Factors
ABSTRACT
BACKGROUND: Invasive fungal infections (IFIs) are life-threatening opportunistic infections that occur in immunocompromised or critically ill people. Early detection and treatment of IFIs is essential to reduce morbidity and mortality in these populations. (1→3)-β-D-glucan (BDG) is a component of the fungal cell wall that can be detected in the serum of infected individuals. The serum BDG test is a way to quickly detect these infections and initiate treatment before they become life-threatening. Five different versions of the BDG test are commercially available: Fungitell, Glucatell, Wako, Fungitec-G, and Dynamiker Fungus. OBJECTIVES: To compare the diagnostic accuracy of commercially available tests for serum BDG to detect selected invasive fungal infections (IFIs) among immunocompromised or critically ill people. SEARCH METHODS: We searched MEDLINE (via Ovid) and Embase (via Ovid) up to 26 June 2019. We used SCOPUS to perform a forward and backward citation search of relevant articles. We placed no restriction on language or study design. SELECTION CRITERIA: We included all references published on or after 1995, which is when the first commercial BDG assays became available. We considered published, peer-reviewed studies on the diagnostic test accuracy of BDG for diagnosis of fungal infections in immunocompromised people or people in intensive care that used the European Organization for Research and Treatment of Cancer (EORTC) criteria or equivalent as a reference standard. We considered all study designs (case-control, prospective consecutive cohort, and retrospective cohort studies). We excluded case studies and studies with fewer than ten participants. We also excluded animal and laboratory studies. We excluded meeting abstracts because they provided insufficient information. DATA COLLECTION AND ANALYSIS: We followed the standard procedures outlined in the Cochrane Handbook for Diagnostic Test Accuracy Reviews.
Two review authors independently screened studies, extracted data, and performed a quality assessment for each study. For each study, we created a 2 × 2 matrix and calculated sensitivity and specificity, as well as a 95% confidence interval (CI). We evaluated the quality of included studies using the Quality Assessment of Diagnostic Accuracy Studies-2 (QUADAS-2) tool. We were unable to perform a meta-analysis due to considerable variation between studies, with the exception of Candida, so we have provided descriptive statistics such as receiver operating characteristics (ROCs) and forest plots by test brand to show variation in study results. MAIN RESULTS: We included in the review 49 studies with a total of 6244 participants. About half of these studies (24/49; 49%) were conducted with people who had cancer or hematologic malignancies. Most studies (36/49; 73%) focused on the Fungitell BDG test. This was followed by Glucatell (5 studies; 10%), Wako (3 studies; 6%), Fungitec-G (3 studies; 6%), and Dynamiker (2 studies; 4%). About three-quarters of studies (79%) utilized either a prospective or a retrospective consecutive study design; the remainder used a case-control design. Based on the manufacturer's recommended cut-off levels for the Fungitell test, sensitivity ranged from 27% to 100%, and specificity from 0% to 100%. For the Glucatell assay, sensitivity ranged from 50% to 92%, and specificity ranged from 41% to 94%. Limited studies have used the Dynamiker, Wako, and Fungitec-G assays, but individual sensitivities and specificities ranged from 50% to 88%, and from 60% to 100%, respectively. Results show considerable differences between studies, even by manufacturer, which prevented a formal meta-analysis. Most studies (32/49; 65%) had no reported high risk of bias in any of the QUADAS-2 domains. The QUADAS-2 domains with higher risk of bias included participant selection, and flow and timing.
AUTHORS' CONCLUSIONS: We noted considerable heterogeneity between studies, and these differences precluded a formal meta-analysis. Because of wide variation in the results, it is not possible to estimate the diagnostic accuracy of the BDG test in specific settings. Future studies estimating the accuracy of BDG tests should be linked to the way the test is used in clinical practice and should clearly describe the sampling protocol and the relationship of time of testing to time of diagnosis.
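Per-study sensitivity and specificity in such reviews come from a 2 × 2 table against the reference standard. A sketch with hypothetical counts, using Wilson score intervals:

```python
import math

def sens_spec(tp, fn, tn, fp, z=1.96):
    """Sensitivity and specificity (point estimate, 95% Wilson CI) from a
    2 x 2 table: tp/fn among reference-standard positives, tn/fp among
    reference-standard negatives."""
    def wilson(successes, total):
        p = successes / total
        denom = 1 + z * z / total
        centre = (p + z * z / (2 * total)) / denom
        half = z * math.sqrt(p * (1 - p) / total + z * z / (4 * total * total)) / denom
        return p, centre - half, centre + half
    return {"sensitivity": wilson(tp, tp + fn), "specificity": wilson(tn, tn + fp)}

# Hypothetical BDG study: 80/100 IFI cases test positive, 90/100 controls negative
result = sens_spec(tp=80, fn=20, tn=90, fp=10)
```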
Subject(s)
Critical Illness, Immunocompromised Host, Invasive Fungal Infections/diagnosis, beta-Glucans/blood, Aspergillosis/diagnosis, Biomarkers/blood, Invasive Candidiasis/diagnosis, Case-Control Studies, Humans, Pneumocystis Infections/diagnosis, Pneumocystis carinii, Prospective Studies, ROC Curve, Retrospective Studies, Sensitivity and Specificity
ABSTRACT
BACKGROUND: Transgender women are female individuals who were recorded as male at birth based on natal sex. Supporting a person's gender identity improves their psychological health, and gender-affirming hormones reduce gender dysphoria and benefit mental health. For transgender women, estrogen administration has clinically significant benefits. Previous reviews have reported conflicting literature on the thrombotic risk of estrogen therapy in transgender women and have highlighted the need for more high-quality research. CONTENT: To help address the gap in understanding thrombotic risk in transgender women receiving estrogen therapy, we performed a systematic literature review and meta-analysis. Two evaluators independently assessed quality using the Ottawa Scale for Cohort Studies. The Poisson normal model was used to estimate the study-specific incidence rates and the pooled incidence rate. Heterogeneity was measured using the Higgins I² statistic. The overall estimate of the incidence rate was 2.3 per 1000 person-years (95% CI, 0.8-6.9). The heterogeneity was significant (I² = 74%; p = 0.0039). SUMMARY: Our study estimated the incidence rate of venous thromboembolism in transgender women prescribed estrogen to be 2.3 per 1000 person-years, but because of heterogeneity this estimate cannot be reliably applied to transgender women as a group. A major limitation is that there are insufficient data in the literature to partition by subgroup, prohibiting analyses that control for tobacco use, age, and obesity. Additional studies of current estrogen formulations, modes of administration, and combination therapies, as well as studies in the aging transgender population, are needed to confirm thrombotic risk and clarify optimal therapy regimens.
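A per-study incidence rate in person-years, as pooled above, is simply events divided by follow-up. A sketch with hypothetical counts chosen to land near the reported 2.3 per 1000 person-years, using a log-normal CI:

```python
import math

def incidence_per_1000_py(events, person_years, z=1.96):
    """Incidence rate per 1000 person-years with an approximate 95% CI on
    the log scale (requires at least one observed event)."""
    rate = events / person_years
    se_log = 1.0 / math.sqrt(events)
    return (1000.0 * rate,
            1000.0 * rate * math.exp(-z * se_log),
            1000.0 * rate * math.exp(z * se_log))

# e.g., 9 hypothetical VTE events over 3900 person-years of follow-up
rate, lo, hi = incidence_per_1000_py(9, 3900)
```

With few events the interval is wide and asymmetric, which is one reason single-study estimates in this literature are unstable.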
Subject(s)
Estrogen Replacement Therapy/adverse effects, Transgender Persons, Venous Thromboembolism/chemically induced, Female, Humans, Male, Risk Factors
ABSTRACT
BACKGROUND: Risk-adjusted benchmarking could be useful to compare blood utilization between hospitals or individual groups, such as physicians, while accounting for differences in patient complexity. The aim of this study was to analyze the association of red blood cell (RBC) use and diagnosis-related group (DRG) weights across all inpatient hospital stays to determine the suitability of using DRGs for between-hospital risk-adjusted benchmarking. Specific hierarchical organizational units (surgical vs. nonsurgical patients, departments, and physicians) were also evaluated. STUDY DESIGN AND METHODS: We studied blood use among all adult inpatients, and within organizational units, over 4 years (May 2014 to March 2018) at an academic center. Number of RBCs transfused, all patient refined (APR)-DRGs, and other variables were captured over entire hospital stays. We used multilevel generalized linear modeling (zero-inflated negative binomial) to study the relationship between RBC utilization and APR-DRG. RESULTS: A total of 97,955 hospital stays were evaluated and the median APR-DRG weight was 1.2. The association of RBCs transfused and APR-DRG weight was statistically significant at all hierarchical levels (incidence rate ratio = 1.22; p < 0.001). The impact of APR-DRG on blood use, measured by the incidence rate ratio, demonstrated an association at the all-patient and surgical levels, at several department and physician levels, but not at the medical patient level. The relationship between RBCs transfused and APR-DRG varied across organizational units. CONCLUSION: Number of RBCs transfused was associated with APR-DRG weight at multiple hierarchical levels and could be used for risk-adjusted benchmarking in those contexts. The relationship between RBC use and APR-DRG varied across organizational units.
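The reported incidence rate ratio has a direct log-linear reading: each one-unit increase in APR-DRG weight multiplies expected RBC use by 1.22. A sketch of that mean function (the intercept here is hypothetical; only the IRR comes from the abstract):

```python
import math

def expected_rbc_units(drg_weight, beta0=-1.5, irr=1.22):
    """Log-linear (negative binomial) mean: E[units] = exp(beta0 + log(irr) * w).
    beta0 is an illustrative intercept; irr = 1.22 is the reported
    incidence rate ratio per unit of APR-DRG weight."""
    return math.exp(beta0 + math.log(irr) * drg_weight)
```

The ratio of expected use between stays one DRG weight apart is exactly the IRR, regardless of the intercept.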
Subject(s)
Benchmarking, Blood Transfusion, Diagnosis-Related Groups, Inpatients, Length of Stay, Adult, Female, Humans, Male, Middle Aged, Risk Assessment
ABSTRACT
OBJECTIVE. The purpose of this study was to establish the correlation of radiography findings with findings of gross and microscopic histopathologic analysis to assess the usefulness of radiography in preoperative assessment for hip arthroplasty. MATERIALS AND METHODS. Radiology and pathology reports from 953 consecutive femoral head resections were reviewed to establish the correlation of radiography and pathology findings as used in routine clinical practice. In 83 cases MR images were also available for review. Both radiologists and pathologists prospectively used a four-grade scale of absent, mild, moderate, or severe osteoarthritis. The grades established by radiologists and pathologists were compared by means of both the four-grade system and a simplified two-grade system of none-to-mild versus moderate-to-severe osteoarthritis. RESULTS. The mean patient age was 60 years (range, 18-94 years). Resection was performed for osteoarthritis in 941 cases and for infection, inflammatory arthritis, avascular necrosis, fracture, or tumor in the others. Radiographs showed severe osteoarthritis in 62.3% of patients and no or mild osteoarthritis in 17.7%. Observed agreement between radiology and pathology findings was 90% for both the four-grade and two-grade osteoarthritis scales. Findings on standing radiographs were more concordant with pathology results than findings on supine radiographs (odds ratio, 1.4). Observed agreement between radiography and MRI was 78%. There were significant discrepancies between radiography grade and pathology grade in 2.2% of cases. Observed agreement of MRI and pathologic analysis was 76% (κ = 0.64). CONCLUSION. Radiography findings are a reliable indicator of severity of osteoarthritis. This is important because previous studies have shown that patients with no or mild osteoarthritis are less likely to benefit from arthroplasty. 
If evidence of moderate or severe osteoarthritis is not present on radiographs, further investigation is warranted before proceeding to arthroplasty.
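The agreement statistics above compare grades on the four-grade scale and again after collapsing to two grades. A self-contained sketch of that computation with invented grades (not the study's data):

```python
# Hypothetical sketch of the agreement computation described above: two
# readers grade each case 0-3 (absent/mild/moderate/severe), and observed
# agreement is computed on the four-grade scale and after collapsing to a
# two-grade scale (none-to-mild vs moderate-to-severe).
radiology = [0, 1, 2, 3, 3, 2, 1, 0, 2, 3]  # invented grades, not study data
pathology = [0, 1, 2, 3, 2, 2, 0, 0, 2, 3]

def observed_agreement(a, b):
    """Fraction of cases on which both readers assign the same grade."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

def collapse(grades):
    """Map absent/mild (0-1) to 0 and moderate/severe (2-3) to 1."""
    return [1 if g >= 2 else 0 for g in grades]

four_grade = observed_agreement(radiology, pathology)
two_grade = observed_agreement(collapse(radiology), collapse(pathology))
```

Collapsing the scale can only keep or raise observed agreement, since disagreements within the same collapsed category disappear; the study's identical 90% on both scales indicates its discrepancies crossed the none-to-mild versus moderate-to-severe boundary.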
Subject(s)
Magnetic Resonance Imaging , Osteoarthritis, Hip/diagnostic imaging , Osteoarthritis, Hip/pathology , Adolescent , Adult , Aged , Aged, 80 and over , Arthroplasty, Replacement, Hip , Humans , Middle Aged , Osteoarthritis, Hip/surgery , Retrospective Studies , Severity of Illness Index

ABSTRACT
BACKGROUND: Invasive fungal infection (IFI) is a life-threatening complication of allogeneic hematopoietic stem cell transplantation (HSCT) that is also associated with excess healthcare costs. Current approaches include universal antifungal prophylaxis, preemptive therapy based on biomarker surveillance, and empiric treatment initiated in response to clinical signs/symptoms. However, no study has directly compared the cost-effectiveness of these treatment strategies for an allogeneic HSCT patient population. METHODS: We developed a state transition model to study the impact of treatment strategies on outcomes associated with IFIs in the first 100 days following myeloablative allogeneic HSCT. We compared three treatment strategies: empiric voriconazole, preemptive voriconazole (200 mg), or prophylactic posaconazole (300 mg) for the management of IFIs. Preemptive treatment was guided by scheduled laboratory surveillance with galactomannan (GM) testing. Endpoints were cost and survival at 100 days post-HSCT. RESULTS: Empiric treatment was the least costly ($147,482) and was as effective (85.2% survival at 100 days) as the preemptive treatment strategies. Preemptive treatments were slightly more costly than empiric treatment (GM cutoff ≥ 1.0: $147,910; GM cutoff ≥ 0.5: $148,108). Preemptive therapy with a GM cutoff ≥ 1.0 reduced anti-mold therapy by 5% when compared to empiric therapy. Posaconazole prophylaxis was the most effective (86.6% survival at 100 days) and most costly ($152,240) treatment strategy, with a cost of $352,415 per life saved when compared to empiric therapy. CONCLUSIONS: One preemptive treatment strategy reduced overall anti-mold drug exposure but did not reduce overall costs. Prevention of IFI using posaconazole prophylaxis was the most effective treatment strategy and may be cost-effective, depending upon the willingness to pay per life saved.
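The cost-per-life-saved figure is an incremental cost-effectiveness ratio: incremental cost divided by incremental survival. A back-of-envelope check from the rounded figures in the abstract (the published $352,415 comes from the model's unrounded outputs, so this rough recomputation lands somewhat lower):

```python
# Incremental cost-effectiveness ratio (ICER) sketch using the rounded
# abstract figures; not the model's exact inputs.
cost_empiric, surv_empiric = 147_482, 0.852           # empiric voriconazole
cost_prophylaxis, surv_prophylaxis = 152_240, 0.866   # posaconazole prophylaxis

incremental_cost = cost_prophylaxis - cost_empiric        # $4,758
incremental_survival = surv_prophylaxis - surv_empiric    # 1.4 percentage points

# Dollars per additional life saved at 100 days post-HSCT.
cost_per_life_saved = incremental_cost / incremental_survival
```

The small survival denominator is why a modest $4,758 cost difference translates into a six-figure cost per life saved, and why the willingness-to-pay threshold dominates the conclusion.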
Subject(s)
Antifungal Agents/administration & dosage , Antifungal Agents/economics , Cost-Benefit Analysis , Hematopoietic Stem Cell Transplantation/adverse effects , Invasive Fungal Infections/prevention & control , Transplantation Conditioning , Humans , Invasive Fungal Infections/economics , Models, Biological , Transplantation, Homologous/adverse effects , Treatment Outcome

ABSTRACT
BACKGROUND: Current therapy requires separation of non-small cell carcinomas into adenocarcinomas (AC) and squamous cell carcinomas (SCC). A meta-analysis has shown a pooled diagnostic sensitivity of 63% and specificity of 95% for the diagnosis of AC. While a number of cytomorphological features have been proposed for separation of AC from SCC, we are unaware of a statistically based analysis of cytomorphological features useful for separation of these two carcinomas. We performed logistic regression analysis of cytological features useful in classifying SCC and AC. DESIGN: Sixty-one Papanicolaou-stained fine needle aspiration specimens (29 AC/32 SCC) were reviewed by two board-certified cytopathologists for nine features (eccentric nuclei, vesicular chromatin, prominent nucleoli, vacuolated cytoplasm, three-dimensional cell balls, dark non-transparent chromatin, central nucleoli, single malignant cells and spindle-shaped cells). All cytological specimens had surgical biopsy results. Inter-rater agreement was assessed by Cohen's κ. Association between features and AC was determined using a hierarchical logistic regression model in which feature scores were nested within reviewers. A model to classify cases as SCC or AC was developed and verified by k-fold cross-validation (k = 5). Classification performance was assessed using the area under the receiver operating characteristic curve. RESULTS: Observed rater agreement for scored features ranged from 49% to 82%. Kappa scores clustered in three groups. Raters demonstrated good agreement for prominent nucleoli, vesicular chromatin and eccentric nuclei. Fair agreement was seen for three-dimensional cell balls, dark non-transparent chromatin, and presence of spindle-shaped cells. Four features showed statistically significant associations (P < 0.001) with adenocarcinoma: prominent nucleoli, vesicular chromatin, eccentric nuclei and three-dimensional cell balls. Spindle-shaped cells and dark non-transparent chromatin were negatively associated with adenocarcinoma. CONCLUSIONS: Logistic regression analysis demonstrated six features helpful in separation of AC from SCC. Prominent nucleoli, vesicular chromatin, three-dimensional cell balls and eccentric nuclei were positively associated with AC at P values of 0.001 or less. The presence of dark, non-transparent chromatin and spindle-shaped cells favoured the diagnosis of SCC.
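Inter-rater agreement of the kind reported above is quantified with Cohen's κ, which corrects observed agreement for the agreement expected by chance from each rater's marginal frequencies. A self-contained sketch with invented present/absent feature scores (not the study's data):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters scoring the same items."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement expected from each rater's marginal frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    expected = sum(freq_a[k] * freq_b[k] for k in freq_a) / n**2
    return (observed - expected) / (1 - expected)

# Invented present (1) / absent (0) scores for one feature in 10 specimens.
a = [1, 1, 0, 0, 1, 0, 1, 1, 0, 1]
b = [1, 1, 0, 1, 1, 0, 1, 0, 0, 1]
kappa = cohens_kappa(a, b)  # 80% raw agreement shrinks once chance is removed
```

This illustrates why κ can look modest even when raw agreement is high: with unbalanced marginals, much of the raw agreement is attributable to chance.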
Subject(s)
Adenocarcinoma of Lung/pathology , Carcinoma, Squamous Cell/pathology , Cytodiagnosis , Diagnosis, Differential , Adenocarcinoma of Lung/classification , Adenocarcinoma of Lung/diagnosis , Biopsy, Fine-Needle , Carcinoma, Squamous Cell/classification , Carcinoma, Squamous Cell/diagnosis , Cell Nucleolus , Cell Nucleus , Female , Humans , Male , Precision Medicine

ABSTRACT
BACKGROUND: Critically ill preterm very-low-birthweight (VLBW) neonates (birthweight ≤ 1.5 kg) frequently develop anemia that is treated with red blood cell (RBC) transfusions. Although RBCs transfused to adults demonstrate progressive decreases in posttransfusion 24-hour RBC recovery (PTR24) during storage, falling to a mean of approximately 85% at the Food and Drug Administration-allowed 42-day storage limit, limited data in infants indicate no decrease in PTR24 with storage. STUDY DESIGN AND METHODS: We hypothesized that PTR24 of allogeneic RBCs transfused to anemic VLBW newborns: 1) will be greater than PTR24 of autologous RBCs transfused into healthy adults and 2) will not decrease with increasing storage duration. RBCs were stored at 4°C for not more than 42 days in AS-3 or AS-5. PTR24 was determined in 46 VLBW neonates using biotin-labeled RBCs and in 76 healthy adults using 51Cr-labeled RBCs. Linear mixed-model analysis was used to estimate slopes and intercepts of PTR24 versus duration of RBC storage. RESULTS: For VLBW newborns, the estimated slope of PTR24 versus storage did not decrease with the duration of storage (p = 0.18), while for adults it did (p < 0.0001). These estimated slopes differed significantly in adults compared to newborns (p = 0.04). At the allowed 42-day storage limit, projected mean neonatal PTR24 was 95.9%; for adults, it was 83.8% (p = 0.0002). CONCLUSIONS: These data provide evidence that storage duration of allogeneic RBCs intended for neonates can be increased without affecting PTR24. This conclusion supports the practice of transfusing RBCs stored up to 42 days for small-volume neonatal transfusions to limit donor exposure.
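The core comparison is between the slope of PTR24 versus storage day in each group. The study fit linear mixed models; as a simplified stand-in, the sketch below fits plain ordinary least-squares slopes to invented group-level values (chosen to echo the abstract's 95.9% and 83.8% projections at day 42, but not the study's measurements):

```python
# Simplified sketch: OLS slope of PTR24 (%) versus storage day per group.
# The study used linear mixed models; this uses plain least squares on
# invented illustrative data.
def ols_slope(x, y):
    """Ordinary least-squares slope of y on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    return sxy / sxx

storage_days = [7, 14, 21, 28, 35, 42]
adult_ptr24 = [95.0, 93.1, 90.8, 88.9, 86.2, 83.8]    # declines with storage
neonate_ptr24 = [95.5, 96.0, 95.7, 96.1, 95.8, 95.9]  # roughly flat

adult_slope = ols_slope(storage_days, adult_ptr24)      # clearly negative
neonate_slope = ols_slope(storage_days, neonate_ptr24)  # near zero
```

A mixed model adds per-subject random intercepts and slopes on top of this fixed-effect structure, which is what allows the group slopes to be compared while accounting for repeated measures within subjects.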
Subject(s)
Blood Preservation , Blood Transfusion, Autologous , Erythrocyte Transfusion , Erythrocytes , Infant, Low Birth Weight , Infant, Premature , Adult , Female , Humans , Infant, Newborn , Male , Time Factors

ABSTRACT
The current reference method in the United States for measuring in vivo population red blood cell (RBC) kinetics utilizes chromium-51 (51Cr) RBC labeling for determining RBC volume, 24-hour posttransfusion RBC recovery, and long-term RBC survival. Here we provide evidence supporting adoption of a method for RBC kinetics that uses biotin-labeled RBCs (BioRBCs) as a superior, versatile method for both regulatory and investigational purposes. RBC kinetic analysis using BioRBCs has important methodologic, analytical, and safety advantages over 51Cr-labeled RBCs. We critically review recent advances in labeling human RBCs at multiple and progressively lower biotin label densities for concurrent, accurate, and sensitive determination of both autologous and allogeneic RBC population kinetics. BioRBC methods valid for RBC kinetic studies, including successful variations used by the authors, are presented along with pharmacokinetic modeling approaches for the accurate determination of RBC pharmacokinetic variables in health and disease. The advantages and limitations of the BioRBC method, including its capability of determining multiple BioRBC densities simultaneously in the same individual throughout the entire RBC life span, are presented and compared with the 51Cr method. Finally, potential applications and limitations of kinetic BioRBC determinations are discussed.