ABSTRACT
BACKGROUND: Xenografts from genetically modified pigs have become one of the most promising solutions to the dearth of human organs available for transplantation. The challenge in this model has been hyperacute rejection. To avoid this, pigs have been bred with a knockout of the alpha-1,3-galactosyltransferase gene and with subcapsular autologous thymic tissue. METHODS: We transplanted kidneys from these genetically modified pigs into two brain-dead human recipients whose circulatory and respiratory activity was maintained on ventilators for the duration of the study. We performed serial biopsies and monitored the urine output and kinetic estimated glomerular filtration rate (eGFR) to assess renal function and xenograft rejection. RESULTS: The xenograft in both recipients began to make urine within moments after reperfusion. Over the 54-hour study, the kinetic eGFR increased from 23 ml per minute per 1.73 m2 of body-surface area before transplantation to 62 ml per minute per 1.73 m2 after transplantation in Recipient 1 and from 55 to 109 ml per minute per 1.73 m2 in Recipient 2. In both recipients, the creatinine level, which had been at a steady state, decreased after implantation of the xenograft, from 1.97 to 0.82 mg per deciliter in Recipient 1 and from 1.10 to 0.57 mg per deciliter in Recipient 2. The transplanted kidneys remained pink and well-perfused, continuing to make urine throughout the study. Biopsies that were performed at 6, 24, 48, and 54 hours revealed no signs of hyperacute or antibody-mediated rejection. Hourly urine output with the xenograft was more than double the output with the native kidneys. CONCLUSIONS: Genetically modified kidney xenografts from pigs remained viable and functioning in brain-dead human recipients for 54 hours, without signs of hyperacute rejection. (Funded by Lung Biotechnology.).
Subject(s)
Graft Rejection, Kidney Transplantation, Transplantation, Heterologous, Animals, Animals, Genetically Modified/surgery, Brain Death, Graft Rejection/etiology, Graft Rejection/pathology, Graft Rejection/prevention & control, Heterografts/transplantation, Humans, Kidney/pathology, Kidney/physiology, Kidney Transplantation/adverse effects, Kidney Transplantation/methods, Swine/surgery, Transplantation, Heterologous/adverse effects, Transplantation, Heterologous/methods
ABSTRACT
PURPOSE OF REVIEW: The greatest challenge facing end-stage kidney disease (ESKD) patients is the scarcity of transplantable organs. Advances in genetic engineering that mitigate xenogeneic immune responses have made transplantation across species a potentially viable solution to this unmet need. Preclinical studies and recent reports of pig-to-human decedent renal xenotransplantation signify that clinical trials are on the horizon. Here, we review the physiologic differences between porcine and human kidneys that could impede xenograft survival. Topics addressed include porcine renin and sodium handling, xenograft water handling, calcium, phosphate and acid-base balance, responses to porcine erythropoietin, and xenograft growth. RECENT FINDINGS: Studies in nonhuman primates (NHPs) have demonstrated that genetically modified pig kidneys can survive for an extended period when transplanted into baboons. In recent studies conducted by our group and others, hyperacute rejection did not occur in pig kidneys lacking the α1,3Gal epitope transplanted into brain-dead human recipients. Because of the brief duration of observation this model entails, these experimental trials did not study potential clinical abnormalities arising from idiosyncratic xenograft responses to human physiologic stimuli. SUMMARY: Progress in biotechnology is heralding an era of xenotransplantation. We highlight the physiologic considerations that must be addressed for xenogeneic grafts to succeed.
Subject(s)
Kidney Transplantation, Kidney, Animals, Humans, Swine, Animals, Genetically Modified, Transplantation, Heterologous, Kidney/physiology, Graft Rejection
ABSTRACT
BACKGROUND: Understanding immunogenicity and alloimmune risk following severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) vaccination in kidney transplant recipients is imperative to understanding the correlates of protection and to inform clinical guidelines. METHODS: We studied 50 kidney transplant recipients following SARS-CoV-2 vaccination and quantified their anti-spike protein antibody, donor-derived cell-free DNA (dd-cfDNA), gene expression profiling (GEP), and alloantibody formation. RESULTS: Participants were stratified using nucleocapsid testing as either SARS-CoV-2-naïve or experienced prior to vaccination. One of 34 (3%) SARS-CoV-2 naïve participants developed anti-spike protein antibodies. In contrast, the odds ratio for the association of a prior history of SARS-CoV-2 infection with vaccine response was 18.3 (95% confidence interval 3.2, 105.0, p < 0.01). Pre- and post-vaccination levels did not change for median dd-cfDNA (0.23% vs. 0.21% respectively, p = 0.13), GEP scores (9.85 vs. 10.4 respectively, p = 0.45), calculated panel reactive antibody, de-novo donor specific antibody status, or estimated glomerular filtration rate. CONCLUSIONS: SARS-CoV-2 vaccines do not appear to trigger alloimmunity in kidney transplant recipients. The degree of vaccine immunogenicity was associated most strongly with a prior history of SARS-CoV-2 infection.
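The vaccine-response odds ratio reported above comes from a standard 2×2 table. The sketch below shows the method only: the abstract reports the 1/34 responders among naive participants and the resulting OR of 18.3, but not the full table, so the counts for the previously infected group are hypothetical and the output will not match the published estimate exactly.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio for a 2x2 table with a Woolf (log-normal) 95% CI.
    a/b = exposed responders/non-responders, c/d = unexposed."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)  # standard error of log(OR)
    lo, hi = (math.exp(math.log(or_) + s * z * se) for s in (-1, 1))
    return or_, lo, hi

# Hypothetical split: 6/16 previously infected recipients responded;
# the 1/34 naive figure is taken from the abstract.
or_, lo, hi = odds_ratio_ci(a=6, b=10, c=1, d=33)
print(round(or_, 1), round(lo, 1), round(hi, 1))
```

With the true cell counts from the study, the same function would reproduce the reported OR of 18.3 (95% CI 3.2, 105.0).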
Subject(s)
COVID-19, Cell-Free Nucleic Acids, Kidney Transplantation, Humans, Antibodies, Viral, COVID-19/prevention & control, COVID-19 Vaccines/adverse effects, Immunity, SARS-CoV-2, Transplant Recipients, Vaccination
ABSTRACT
The current kidney allocation system (KAS) preferentially allocates kidneys from blood type A2 or A2B (A2/A2B) donors to blood type B candidates. We used national data to evaluate center-level performance of A2/A2B to B transplants, and organ procurement organization (OPO) reporting of type A or AB donor subtyping, in 5-year time periods prior to (2009-2014) and following (2015-2019) KAS implementation. The number of centers performing A2/A2B to B transplants increased from 17 pre-KAS to 76 post-KAS, though this still represents only a minority of centers (7.3% pre-KAS and 32.6% post-KAS). For high-performing centers, the median net increase in A2/A2B to B transplants was 19 cases (range, -2 to 72) per center in the 5 years post-KAS. The median net increase in total B recipient transplants was 21 cases (range, -17 to 119) per center. Despite requirements for performance of subtyping, in 2019 subtyping was reported for only 56.4% of A/AB donors. This translates into potential missed opportunities for B recipients: even post-KAS, up to 2322 A2/A2B donor kidneys may have been allocated for transplantation as A/AB. Further progress must be made at both the center and OPO levels to broaden implementation of A2/A2B to B transplants for the benefit of underserved recipients.
Subject(s)
Kidney Transplantation, Tissue and Organ Procurement, Humans, Kidney, Tissue Donors
ABSTRACT
PURPOSE OF REVIEW: Traditionally, nephrolithiasis was considered a relative contraindication to kidney donation because of a risk of recurrent stones in donors and adverse stone-related outcomes in recipients. However, the scarcity of organs has driven the transplant community to re-examine and broaden selection criteria for living donors with stones. In this review, we summarize and contrast the guidelines published by various prominent national and international societies on this topic. RECENT FINDINGS: Although recent iterations of living donor guidelines are less stringent with respect to nephrolithiasis than those published in the 1990s, there is little consensus among national and international transplant society guidelines regarding selection criteria for potential kidney donors with nephrolithiasis. SUMMARY: The lack of evidence-based guidelines deters transplant centers from implementing selection criteria to accept donors with nephrolithiasis and discourages studies of outcomes in donors with nephrolithiasis and their recipients. In addition to drawing attention to the disparities in prevailing guidelines, we put forth several questions that must be answered before generalizable criteria for the selection of donors with nephrolithiasis can be developed.
Subject(s)
Donor Selection/standards, Kidney Calculi, Kidney Transplantation, Practice Guidelines as Topic, Tissue and Organ Procurement/standards, Humans, Kidney Calculi/complications, Living Donors
ABSTRACT
OBJECTIVES: The presence of a donor-specific positive crossmatch has been considered a contraindication to kidney transplantation because of the risk of hyperacute rejection. Desensitization is the process of removing hazardous preformed donor-specific antibody (DSA) in order to safely proceed with transplant. Traditionally, this involves plasmapheresis and intravenous immune globulin treatments that occur over days to weeks, and it has been feasible when there is a living donor and the date of the transplant is known, allowing time for pre-emptive treatments. For sensitized patients without a living donor, transplantation has been historically difficult. SUMMARY OF BACKGROUND DATA: IdeS (imlifidase) is an endopeptidase derived from Streptococcus pyogenes that has specificity for human IgG and, when infused intravenously, results in rapid cleavage of IgG. METHODS: Here we present our single center's experience with 7 highly sensitized (cPRA 98-100%) kidney transplant candidates who had DSA resulting in positive crossmatches with their donors (5 deceased, 2 living) and who received IdeS within 24 hours prior to transplant. RESULTS: All pre-IdeS crossmatches were positive and would have been prohibitive for transplantation. All crossmatches became negative post-IdeS, and the patients underwent successful transplantation. Three patients had DSA rebound and antibody-mediated rejection, which responded to standard-of-care therapies. Three patients had delayed graft function, which ultimately resolved. No serious adverse events were associated with IdeS. All patients have functioning renal allografts at a median follow-up of 235 days. CONCLUSION: IdeS may represent a groundbreaking new method of desensitization for patients who otherwise might have no hope of receiving a lifesaving transplant.
Subject(s)
Bacterial Proteins/immunology, Desensitization, Immunologic/methods, Endopeptidases/immunology, Graft Rejection/immunology, Graft Rejection/prevention & control, Immunoglobulin G/immunology, Isoantibodies/blood, Kidney Transplantation, Adolescent, Adult, Aged, Female, Histocompatibility/immunology, Histocompatibility Testing, Humans, Infusions, Intravenous, Intraoperative Care, Male, Middle Aged, Streptococcus pyogenes, Treatment Outcome
ABSTRACT
Genetically modified xenografts are one of the most promising solutions to the discrepancy between the numbers of available human organs for transplantation and potential recipients. To date, a porcine heart has been implanted into only one human recipient. Here, using 10-gene-edited pigs, we transplanted porcine hearts into two brain-dead human recipients and monitored xenograft function, hemodynamics and systemic responses over the course of 66 hours. Although both xenografts demonstrated excellent cardiac function immediately after transplantation and continued to function for the duration of the study, cardiac function declined postoperatively in one case, attributed to a size mismatch between the donor pig and the recipient. For both hearts, we confirmed transgene expression and found no evidence of cellular or antibody-mediated rejection, as assessed using histology, flow cytometry and a cytotoxic crossmatch assay. Moreover, we found no evidence of zoonotic transmission from the donor pigs to the human recipients. While substantial additional work will be needed to advance this technology to human trials, these results indicate that pig-to-human heart xenotransplantation can be performed successfully without hyperacute rejection or zoonosis.
Subject(s)
Antibodies, Graft Rejection, Animals, Humans, Swine, Transplantation, Heterologous/methods, Heterografts, Heart, Animals, Genetically Modified
ABSTRACT
BACKGROUND: Clinical decisions are mainly driven by the ability of physicians to apply risk stratification to patients. However, this task is difficult, as it requires complex integration of numerous parameters and is impacted by patient heterogeneity. We sought to evaluate the ability of transplant physicians to predict the risk of long-term allograft failure and to compare their predictions with those of a validated artificial intelligence (AI) prediction algorithm. METHODS: We randomly selected 400 kidney transplant recipients from a qualified dataset of 4000 patients. For each patient, 44 features routinely collected during the first year post-transplant were compiled in an electronic health record (EHR). We enrolled 9 transplant physicians at various career stages. At 1 year post-transplant, they blindly predicted long-term graft survival, with probabilities for each patient. Their predictions were compared with those of a validated prediction system (iBox). We assessed the determinants of each physician's predictions using a random survival forest model. RESULTS: Among the 400 patients included, 84 graft failures occurred at 7 years post-evaluation. The iBox system demonstrated the best predictive performance, with a discrimination of 0.79 and a median calibration error of 5.79%, while physicians tended to overestimate the risk of graft failure. Physicians' risk predictions showed wide heterogeneity, with a moderate intraclass correlation of 0.58. The determinants of physicians' predictions were disparate, with poor agreement regardless of their clinical experience. CONCLUSIONS: This study shows the overall limited performance and consistency of physicians in predicting the risk of long-term graft failure, as demonstrated by the superior performance of the iBox. This study supports the use of a companion tool to help physicians in their prognostic judgement and decision-making in clinical care.
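Discrimination figures like the 0.79 reported for the iBox are concordance statistics. As an illustration of the idea only (the iBox itself is a censored survival model, which needs a time-aware C-index), here is a minimal concordance computation for a binary outcome using made-up risks and outcomes:

```python
def c_statistic(risks, events):
    """Pairwise concordance: the fraction of (event, non-event) pairs in
    which the patient who failed was assigned the higher predicted risk.
    Ties in predicted risk count as half-concordant."""
    pairs = concordant = tied = 0
    for ri, ei in zip(risks, events):
        for rj, ej in zip(risks, events):
            if ei == 1 and ej == 0:
                pairs += 1
                if ri > rj:
                    concordant += 1
                elif ri == rj:
                    tied += 1
    return (concordant + 0.5 * tied) / pairs

risks = [0.9, 0.7, 0.4, 0.3, 0.2, 0.1]  # hypothetical predicted risks
events = [1, 0, 1, 0, 0, 0]             # hypothetical graft failures
print(c_statistic(risks, events))       # 7 of 8 pairs concordant -> 0.875
```

A value of 0.5 means predictions are no better than chance; 1.0 means every failing graft was ranked riskier than every surviving one.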
The ability to predict the risk of a particular event is key to clinical decision-making, for example when predicting the risk of a poor outcome to help decide which patients should receive an organ transplant. Computer-based systems may help to improve risk prediction, particularly with the increasing volume and complexity of patient data available to clinicians. Here, we compare predictions of the risk of long-term kidney transplant failure made by clinicians with those made by our computer-based system (the iBox system). We observe that clinicians' overall performance in predicting individual long-term outcomes is limited compared to the iBox system, and demonstrate wide variability in clinicians' predictions, regardless of level of experience. Our findings support the use of the iBox system in the clinic to help clinicians predict outcomes and make decisions surrounding kidney transplants.
ABSTRACT
BACKGROUND Solid-phase assays to investigate the complement-activating capacity of HLA antibodies have been utilized to optimize organ allocation and improve transplant outcomes. The clinical utility of the C1q/C3d-binding characteristics of de novo donor-specific anti-HLA antibodies (dnDSA) associated with C4d-positive antibody-mediated rejection (C4d+ AMR) in kidney transplants (KTx) has not been defined. MATERIAL AND METHODS Sera from 120 KTx recipients that had dnDSA concurrent with a protocol/cause biopsy (median 3.8 years after transplantation) were screened for C1q- and C3d-binding dnDSA. The difference in the incidence of C4d+ AMR between recipients with and without C1q/C3d-binding dnDSA was assessed. RESULTS Over 86% of dnDSAs were class II antibodies. The immunodominant dnDSAs, characterized by the highest median fluorescence intensity (MFI), were HLA-DQ antibodies in most recipients (67%). Most recipients (62%, n=74) had either C1q+ (56%), C3d+ (48%), or both C1q+C3d+ (41.2%) dnDSA, while the remaining 38% were negative for both C1q and C3d. Of those with C1q+/C3d+ dnDSA, 87% had high-MFI IgG (MFI=14144±5363 and 13932±5278, respectively), while 65% of those with C1q-C3d- dnDSA had low-MFI IgG (MFI=5970±3347). The incidence of C4d+ AMR was significantly higher in recipients with C1q+ (66%), C3d+ (74%), and C1q+C3d+ (72%) dnDSA than in recipients with C1q-C3d- dnDSA (30%). Recipients with C3d+/C1q+ dnDSA had higher C4d scores on biopsy. CONCLUSIONS C1q+/C3d+ dnDSA were associated with C4d+ AMR and high IgG MFI. Our data call into question the predictive utility of C1q/C3d-binding assays in identifying KTx recipients at risk of allograft failure. In conclusion, IgG MFI is sufficient for clinical management, and the C1q/C3d assays, with their added cost, do not provide additional information.
Subject(s)
Complement C1q, Kidney Transplantation, Graft Rejection, HLA Antigens, Humans, Isoantibodies, Retrospective Studies, Transplant Recipients
ABSTRACT
Desensitization using plasma exchange can remove harmful antibodies prior to transplantation and mitigate the risks of hyperacute and severe early acute antibody-mediated rejection. Traditionally, the use of plasma exchange requires a living donor so that the timing of treatments relative to transplant can be planned. Non-HLA antibody is increasingly recognized as capable of causing antibody-mediated renal allograft rejection and has been associated with decreased graft longevity. Our patient had high-strength non-HLA antibody deemed prohibitive to transplantation without desensitization, but no living donors. As the patient was eligible to receive an A2 ABO blood group organ and was willing to accept a hepatitis C-positive donor kidney, this afforded a high probability of receiving an offer within a short enough time frame to attempt empiric desensitization in anticipation of a deceased donor transplant. Fifteen plasma exchange treatments were performed before the patient received an organ offer, and the patient was successfully transplanted. Hepatitis C infection was treated posttransplant. No episodes of rejection were observed. At one year posttransplant, the patient maintains good graft function. In this case, willingness to consider nontraditional donor organs enabled us to mimic living donor desensitization using a deceased donor.
ABSTRACT
The complement system is integral to innate immunity and is an essential deterrent against infections. The complement apparatus comprises >30 fluid-phase and surface-bound elements that also engage with the adaptive immune system, clear harmful immune complexes, and orchestrate several salutary physiological processes. An imbalance in the complement system's tightly regulated machinery, and the consequent unrestrained complement activation, underpins the pathogenesis of a wide array of inflammatory, autoimmune, neoplastic and degenerative disorders. Antibody-mediated rejection is a leading cause of graft failure in kidney transplantation, and complement-induced inflammation and endothelial injury have emerged as the primary mechanisms in the pathogenesis of this form of rejection. Researchers in the field of transplantation are now trying to define the role and efficacy of complement-targeting agents in the prevention and treatment of rejection and of other complement-related conditions that lead to graft injury. Here, we detail the current clinical indications for complement therapeutics and the scope of existing and emerging therapies that target the complement system, focusing on kidney transplantation.
Subject(s)
Complement Activation/immunology, Complement System Proteins/immunology, Immunomodulation, Kidney Transplantation, Adaptive Immunity, Animals, Clinical Decision-Making, Disease Management, Drug Discovery, Graft Rejection/immunology, Graft Survival/immunology, HLA Antigens/genetics, HLA Antigens/immunology, Heterografts, Humans, Immunity, Innate, Kidney/immunology, Kidney/metabolism, Kidney/pathology, Kidney Transplantation/adverse effects, Kidney Transplantation/methods, Transplantation Immunology
ABSTRACT
INTRODUCTION: Concerns have been raised regarding a high prevalence of chronic kidney disease (CKD) in Uddanam, a fertile subtropical low-altitude territory in the southern Indian state of Andhra Pradesh. The present study was undertaken to ascertain the prevalence of CKD, disease characteristics, and the risk factor profile in this area. METHODS: We selected 2210 subjects (age >18 years) using multistage sampling. After obtaining demographic and anthropometric data, the urinary protein-creatinine ratio, serum creatinine, and blood glucose were measured in all subjects. Glomerular filtration rate was estimated (eGFR) using the Modification of Diet in Renal Disease equation. RESULTS: The mean age of the subjects was 43.2 ± 14.2 years (range: 18-98); 44.3% were men and 55.7% were women. The mean eGFR was 94.3 ± 33.4 ml/min per 1.73 m2. Low eGFR (<60 ml/min per 1.73 m2) was seen in 307 (13.98%) subjects, with a mean eGFR of 34.8 ± 16.6 ml/min per 1.73 m2. The prevalence of subjects with low eGFR and/or proteinuria (CKD) was 18.23%. Major risk factors, such as diabetes, long-standing hypertension, and significant proteinuria, were absent in 73% of patients with CKD, implying that a significant proportion of the population is afflicted with the entity "CKD of unknown etiology" (CKDu). CONCLUSION: The prevalence of CKD and CKDu in Uddanam is much higher than in earlier studies of either rural or urban communities in India. We suggest that there is a dire need to review health policies and allocate resources for the prevention and treatment of CKD in the Uddanam region.
ABSTRACT
The human major histocompatibility complex is a family of genes that encodes HLAs, which have a crucial role in defence against foreign pathogens and immune surveillance of tumours. In the context of transplantation, HLA molecules are polymorphic antigens that comprise an immunodominant alloreactive trigger for the immune response, resulting in rejection. Remarkable advances in knowledge and technology in the field of immunogenetics have considerably enhanced the safety of transplantation. However, access to transplantation among individuals who have become sensitized as a result of previous exposure to alloantigens is reduced proportional to the breadth of their sensitization. New approaches for crossing the HLA barrier in transplantation using plasmapheresis, intravenous immunoglobulin and kidney paired donation have been made possible by the relative ease with which even low levels of anti-HLA antibodies can now be detected and tracked. The development of novel protocols for the induction of tolerance and new approaches to immunomodulation was also facilitated by advances in HLA technology. Here, we review the progress made in understanding HLAs that has enabled organ transplantation to become a life-saving endeavour that is accessible even for sensitized patients. We also discuss novel approaches to desensitization, immunomodulation and tolerance induction that have the potential to further improve transplantation access and outcomes.
Subject(s)
Desensitization, Immunologic/methods, Graft Rejection/immunology, Graft Rejection/prevention & control, HLA Antigens/immunology, Immunosuppression Therapy/methods, Kidney Transplantation, Adaptive Immunity, Histocompatibility Testing, Humans, Immune Tolerance
ABSTRACT
BACKGROUND: Kidney transplantation is the first-line therapy for patients with end-stage renal disease, since it offers greater long-term survival and improved quality of life when compared to dialysis. The advent of calcineurin inhibitor (CNI)-based maintenance immunosuppression has led to a clinically significant decline in the rate of acute rejection and better short-term graft survival rates. However, these gains have not translated into improvement in long-term graft survival; CNI-related nephrotoxicity and metabolic side effects are thought to be partly responsible. CASE PRESENTATION: Here, we report the conversion of a highly sensitized renal transplant recipient with pretransplant donor-specific antibodies from tacrolimus to belatacept within 1 week of transplantation. This substitution was necessitated by the diagnosis of CNI-induced de novo post-transplant hemolytic uremic syndrome. CONCLUSION: Belatacept is a novel costimulation blocker that is devoid of the nephrotoxic properties of CNIs and has been shown to positively impact long-term graft survival and preserve renal allograft function in low-immunologic-risk kidney transplant recipients. Data regarding its use in patients who are broadly sensitized to human leukocyte antigens are scarce, and the increased risk of rejection associated with belatacept has been a deterrent to more widespread use of this immunosuppressive agent. This case serves as an example of a highly sensitized patient who was successfully converted to a belatacept-based CNI-free regimen.
ABSTRACT
PURPOSE OF REVIEW: Over the past two decades, significant strides made in our understanding of the etiology of antibody-mediated rejection (AMR) in transplantation have put the complement system in the spotlight. Here, we review recent progress in the field of pharmacologic complement inhibition in clinical transplantation and aim to understand the impact of this therapeutic approach on outcomes in transplant recipients. RECENT FINDINGS: Encouraged by the success of agents targeting the complement cascade in disorders of unrestrained complement activation, such as paroxysmal nocturnal hemoglobinuria (PNH) and atypical hemolytic uremic syndrome (aHUS), investigators are testing the safety and efficacy of pharmacologic complement blockade in mitigating allograft injury in conditions ranging from AMR to recurrent post-transplant aHUS, C3 glomerulopathies and antiphospholipid antibody syndrome (APS). A recent prospective study demonstrated the efficacy of terminal complement inhibition with eculizumab in the prevention of acute AMR in human leukocyte antigen (HLA)-incompatible living donor renal transplant recipients. C1 esterase inhibitor (C1-INH) was well tolerated in two recent studies in the treatment of AMR and was associated with improved renal allograft function. SUMMARY: Pharmacologic complement inhibition is emerging as a valuable therapeutic tool, especially in the management of highly sensitized renal transplant recipients. Novel and promising agents that target various elements in the complement cascade are in development.
ABSTRACT
It has been suggested that quantitative ultrasound (QUS) could be used as a selective population pre-screen to maximise the cost-effectiveness of referral for dual-energy X-ray absorptiometry (DXA) assessment of bone mineral density (BMD). We set out to examine how such an approach might perform in the assessment of women referred by general practitioners for DXA via the open-access service in Cardiff. In 115 women aged 40-80 (mean 69) years, we used DXA to measure BMD at the lumbar spine and hip, and QUS to measure broadband ultrasound attenuation (BUA) at the heel. A bottom-up approach was used to estimate the costs of DXA and QUS. We examined the cost-effectiveness of using QUS as a pre-screen, referring subjects for the more expensive DXA assessment only if BUA was less than a pre-determined threshold. The unit costs of pencil-beam DXA and QUS were approximately 44 UK pounds and 16 UK pounds, respectively. We identified a BUA threshold of 60 dB/MHz as the most cost-effective, and calculated a sensitivity of 81% and a specificity of 89% in identifying those subjects whom DXA assessment subsequently identified as having osteoporosis. At the BUA threshold of 60 dB/MHz, pre-screening saved 969 UK pounds at the expense of missing ten women with osteoporosis as diagnosed by DXA. Therefore, the cost per additional woman with osteoporosis identified using DXA alone was only 97 UK pounds. QUS assessment does not appear to offer a significant cost benefit as a pre-screen for DXA in the studied population. A QUS pre-screen would be cost-effective only if this investigation could be performed at a substantially lower cost.
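The abstract's cost figures can be checked with a few lines of arithmetic. The number of women actually referred on for DXA is not reported, so the value backed out below is only an approximation implied by the rounded unit costs:

```python
# Figures stated in the abstract
cost_dxa, cost_qus, n = 44, 16, 115      # approximate unit costs (GBP), cohort size
savings, missed = 969, 10                # reported saving and missed diagnoses

dxa_all = n * cost_dxa                   # cost of giving every woman DXA: 5060
two_stage = dxa_all - savings            # implied cost of the QUS pre-screen pathway

# Back out the implied number of DXA referrals under the pre-screen
# (approximate, because the unit costs are rounded): roughly 51 of 115
n_referred = (two_stage - n * cost_qus) / cost_dxa

# Incremental cost per additional osteoporosis case found by DXA-for-all
cost_per_extra_case = savings / missed   # 969 / 10 = 96.9, i.e. ~97 GBP
print(dxa_all, round(n_referred), round(cost_per_extra_case))
```

This makes the paper's conclusion concrete: the pre-screen saves under 1000 GBP on a 115-woman cohort while missing ten cases, so abandoning it costs only about 97 GBP per extra case found.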
Subject(s)
Absorptiometry, Photon/statistics & numerical data, Heel/diagnostic imaging, Hip/diagnostic imaging, Lumbosacral Region/diagnostic imaging, Mass Screening/methods, Osteoporosis/diagnostic imaging, State Medicine/economics, Absorptiometry, Photon/economics, Adult, Aged, Aged, 80 and over, Bone Density/physiology, Cost-Benefit Analysis, Female, Health Services Accessibility, Humans, Mass Screening/economics, Middle Aged, ROC Curve, Referral and Consultation, Technology Assessment, Biomedical, Ultrasonography, Wales
ABSTRACT
Two men with severe ulcerative colitis developed ulcerative tracheobronchitis 4 and 8 years after total colectomy. Intense plasma cell infiltration of tracheal mucosa and submucosa and destruction of mucous glands occurred, with partial relief of symptoms with corticosteroids. We compare them with the only other case reported, also years after colectomy.
Subject(s)
Bronchitis/etiology, Colectomy, Colitis, Ulcerative/complications, Tracheitis/etiology, Beclomethasone/therapeutic use, Bronchitis/drug therapy, Bronchitis/pathology, Colitis, Ulcerative/surgery, Humans, Male, Middle Aged, Prednisolone/therapeutic use, Time Factors, Tracheitis/drug therapy, Tracheitis/pathology
ABSTRACT
The increased prevalence of atrial fibrillation (AF) in older people contributes to an increased risk of stroke. Although clear guidelines exist, there is considerable variation in physicians' approaches to the selection of patients appropriate for warfarin treatment as stroke prevention. We compared attitudes to the anticoagulation of elderly patients with AF in a postal study of geriatricians and specialist physicians (general physicians with specialist interests in Cardiology, Gastroenterology, Diabetes and Endocrinology, Nephrology and Neurology). A structured questionnaire was mailed to all 108 consultant physicians and geriatricians in South East Wales. This explored their attitudes to their patients' age and comorbidity when considering the benefits and risks of warfarin prophylaxis for AF. In all, 25/30 geriatricians (83%) and 43/78 specialist physicians (55%) responded, an overall response rate of 63%. Of the respondents, 94% agreed that patients aged over 75 with atrial fibrillation were at a greater risk of stroke than younger patients, and 68% considered warfarin-related bleeds more likely in this age group, despite which most thought that the benefits of warfarin outweighed the risks. In people aged above 75, only 13/25 (52%) geriatricians and 17/43 (40%) specialist physicians viewed lone AF (AF with no underlying risk factor) as an indication for anticoagulation. When considering the use of warfarin, geriatricians appeared more likely to be influenced by coexisting problems such as disability, falls, cerebrovascular disease and limited life expectancy. Only a history of falls (96% geriatricians vs. 86% specialist physicians) and cerebrovascular disease (79% geriatricians vs. 51% specialist physicians) had a significant influence on prescribing practice (P<0.05, chi-squared test).
There appears to be widespread uncertainty about the indications for warfarin as stroke prophylaxis, and ageist attitudes or a lack of conviction of benefit appear to be disadvantaging older people. Patients aged below 65 with lone AF, who are at the lowest risk of embolic events, are often considered for treatment, whilst the use of warfarin in those aged over 75 with lone AF, who are at a moderately high risk of embolic events, remains disappointingly low.
ABSTRACT
This is a randomized prospective study of 74 patients with squamous cell carcinoma of the esophagus, divided into two groups. The first group was treated with radical radiation alone and the other with radical radiation plus a combination of bleomycin and 5-FU. Analysis at the end of two years showed that the patients treated with the addition of chemotherapy had an overall survival of 23 percent, as compared to nine percent in the radiation-alone group (P < 0.05).
Subject(s)
Antineoplastic Combined Chemotherapy Protocols/therapeutic use, Carcinoma, Squamous Cell/radiotherapy, Esophageal Neoplasms/radiotherapy, Adult, Aged, Bleomycin/administration & dosage, Carcinoma, Squamous Cell/drug therapy, Combined Modality Therapy, Esophageal Neoplasms/drug therapy, Female, Fluorouracil/administration & dosage, Humans, Male, Middle Aged, Randomized Controlled Trials as Topic
ABSTRACT
Central venous stenosis is a well-described sequela of the placement of hemodialysis catheters in the central venous system. The presence of an ipsilateral arteriovenous fistula or graft often leads to severe venous dilatation, arm edema and recurrent infections. Vascular access thrombosis, compromised blood flow and inadequate dialysis delivery are dreaded complications that eventually render the access unusable. We report the case of a 58-year-old male hemodialysis patient who developed symptomatic central venous stenosis to illustrate the problem and review the pertinent literature. This patient developed severe enlargement of the upper extremity veins due to central venous stenosis. The symptoms were refractory to multiple endovascular interventions and eventually necessitated ligation of his arteriovenous fistula. Central venous stenosis remains a pervasive problem despite advances in our understanding of its etiology and recognition of the enormity of its consequences. Given the lack of effective therapeutic options, prevention is better than cure.