2.
Nephrol Dial Transplant ; 28(4): 826-32; discussion 832, 2013 Apr.
Article in English | MEDLINE | ID: mdl-23543723

ABSTRACT

All progress in dialysis methods was made in research presented in case reports, case-control studies, and other observational studies. In contrast, randomized controlled trials (RCTs) did not bring any valuable results. A comparison of the value of peritoneal dialysis and hemodialysis (HD) in RCTs was never completed because of recruitment problems. Four RCTs in HD did not provide any useful data. The worst example was the National Cooperative Dialysis Study, which committed a Type II statistical error by rejecting the time of dialysis as an important factor determining the quality of dialysis. This study also provided the basis for the establishment of the Kt/V index as a measure of dialysis adequacy. This index, having been established in a sacrosanct RCT, was accepted by the HD community and led to short dialysis and possibly higher mortality in the USA. The second trial (the HEMO study) committed a Type III statistical error by asking the wrong question and did not bring any valuable results, but at least it did not lead to deterioration of dialysis outcomes in the USA. The third, by the Frequent Hemodialysis Network Trial Group, did not bring forth any valuable results, but at least confirmed what was already known. The fourth, the Frequent Hemodialysis Network Nocturnal Trial, committed a Type II statistical error because of tremendous recruitment problems leading to an inadequate number of subjects. Moreover, its methodology was absolutely unreliable.


Subject(s)
Kidney Failure, Chronic/therapy, Randomized Controlled Trials as Topic, Renal Dialysis, Humans
3.
Nephrol Nurs J ; 37(6): 641-6; quiz 647, 2010.
Article in English | MEDLINE | ID: mdl-21290918

ABSTRACT

This study compares patient and technique survival on continuous ambulatory peritoneal dialysis (CAPD) and other peritoneal dialysis (PD) modalities in relation to body size indicators, race, sex, and peritoneal transport characteristics. Data were abstracted from a PD adequacy database, and 354 patients were included in the analysis. Transfers between PD modalities were almost exclusively from CAPD to various offshoots of PD, mostly because of inadequate dialysis or inadequate ultrafiltration. Survival analysis showed better technique survival for other PD modalities than for CAPD when body mass index was less than 25 kg/m2, body surface area (BSA) was less than 1.9 m2, total body water was less than 39 L, and the dialysate-to-plasma ratio of creatinine at four hours was less than 0.65 on the peritoneal equilibration test (PET). No differences were found in relation to gender, race, or the PET ratio of dialysate glucose at four hours to dialysate glucose at time zero. In other PD modalities, no differences in technique and patient survival were found for the same parameters, with the exception of better technique survival in males with a BSA over 1.9 m2. In conclusion, CAPD technique survival is better in the small patient with below-average peritoneal transport characteristics. In other PD modalities, survival is not related to anthropometric indices or peritoneal transport characteristics.
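
For reference, the two PET ratios used above are simple quotients of measured concentrations; a minimal Python sketch follows, with hypothetical values that are not taken from this study.

    # Minimal sketch of the 4-hour peritoneal equilibration test (PET) ratios;
    # all sample concentrations below are hypothetical.

    def pet_ratios(dialysate_cr_4h, plasma_cr, dialysate_glc_4h, dialysate_glc_0h):
        """Return the 4-hour D/P creatinine ratio and D/D0 glucose ratio."""
        d_p_cr = dialysate_cr_4h / plasma_cr            # creatinine transport
        d_d0_glc = dialysate_glc_4h / dialysate_glc_0h  # glucose absorption
        return d_p_cr, d_d0_glc

    d_p, d_d0 = pet_ratios(5.2, 8.0, 0.9, 2.5)  # e.g., mg/dL and g/dL
    print(f"D/P Cr = {d_p:.2f}, D/D0 glucose = {d_d0:.2f}")
    # D/P Cr = 0.65 sits exactly at the cutoff below which CAPD technique
    # survival was better in this report.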


Subject(s)
Body Size, Peritoneal Dialysis, Continuous Ambulatory, Education, Continuing, Female, Humans, Male, Survival Analysis
4.
Adv Perit Dial ; 25: 155-64, 2009.
Article in English | MEDLINE | ID: mdl-19886338

ABSTRACT

Technique survival in continuous ambulatory peritoneal dialysis (CAPD) depends mostly on clearances in relation to body size and residual renal function (RRF). Our clinical impression has been that when RRF fails, larger patients leave CAPD sooner than smaller patients do. Peritoneal equilibration tests (PETs) and 24-hour adequacy evaluations performed in 277 patients in a single center from 1986 through 2009 were abstracted from the existing peritoneal dialysis adequacy database. A PET (using 2 L of 2.5% dextrose dialysis solution) was performed in 272 patients during the first 4 months of dialysis. Every 3 months, the patients brought their 24-hour urine and dialysate collections for adequacy evaluations and had height and weight recorded. Body surface area (BSA), body mass index (BMI), and total body water (TBW) were calculated. There were 1372 adequacy evaluations abstracted. The number of patients gradually declined over time because of death (28%) or transfer to other peritoneal regimens (25%) or to hemodialysis (23%). A small number of patients received a kidney graft (6%) or left CAPD for other reasons (12%); only 6% of patients remained on CAPD after 80 months of treatment. The mean (± standard deviation) PET 4-hour values were 0.652 ± 0.128 for the dialysate-to-plasma (D/P) ratio of creatinine (Cr), 0.403 ± 0.0969 for the 4-hour dialysate-to-initial dialysate (D/D0) glucose concentration ratio, and 2336 ± 211 mL for the drain volume. There was no correlation between PET D/P Cr and BSA (r = 0.0051, p = 0.934), PET D/D0 glucose and BSA (r = 0.0042, p = 0.945), or PET drain volume and TBW. The correlations with other size indicators were very poor. None of the large patients (BSA > 1.9 m2, weight > 75 kg, BMI > 25 kg/m2) remained on CAPD for more than 80 months once they lost RRF. These results confirm our impression that, with declining RRF, larger patients do not continue CAPD as long as smaller patients do.
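
The abstract does not state which equations were used for BSA and TBW; assuming the commonly used Du Bois (BSA) and Watson (TBW) formulas, a minimal Python sketch of the three indices might look as follows.

    # Anthropometric indices as commonly computed; the Du Bois and Watson
    # formulas are assumptions here, since the abstract does not name them.

    def bmi(weight_kg, height_m):
        return weight_kg / height_m ** 2  # kg/m2

    def bsa_du_bois(weight_kg, height_cm):
        return 0.007184 * weight_kg ** 0.425 * height_cm ** 0.725  # m2

    def tbw_watson(weight_kg, height_cm, age_yr, male=True):
        if male:
            return 2.447 - 0.09156 * age_yr + 0.1074 * height_cm + 0.3362 * weight_kg
        return -2.097 + 0.1069 * height_cm + 0.2466 * weight_kg  # liters

    # Example: a hypothetical 55-year-old man, 180 cm, 85 kg, exceeds all three
    # "large patient" cutoffs used above (BMI > 25, BSA > 1.9 m2, weight > 75 kg).
    print(bmi(85, 1.80), bsa_du_bois(85, 180), tbw_watson(85, 180, 55))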


Subject(s)
Body Size, Kidney/physiopathology, Peritoneal Dialysis, Continuous Ambulatory, Peritoneum/metabolism, Biological Transport, Body Surface Area, Body Water, Body Weight, Creatinine/metabolism, Female, Glucose/metabolism, Humans, Male, Middle Aged
7.
Hemodial Int ; 12(4): 412-25, 2008 Oct.
Article in English | MEDLINE | ID: mdl-19090863

ABSTRACT

Sodium balance is precisely regulated by intake and output. The kidneys are responsible for adjusting sodium excretion to maintain balance at varying intakes. Our distant ancestors were herbivores. Their diet contained little sodium, so they developed powerful mechanisms for conserving sodium and achieving low urinary excretion. About 10,000 years ago, early humans became villagers and discovered that food could be preserved in brine. This led to increased consumption of salt. High salt intake increases extracellular volume (ECV), blood volume, and cardiac output, resulting in elevation of blood pressure. High ECV induces release of a digitalis-like immunoreactive substance and other inhibitors of Na(+)-K(+)-ATPase. As a consequence, intracellular sodium and calcium concentrations increase in vascular smooth muscles, predisposing them to contraction. Moreover, high ECV increases synthesis and decreases clearance of asymmetrical dimethyl-l-arginine, leading to inhibition of nitric oxide (NO) synthase. High concentrations of sodium and calcium in vascular smooth muscles and decreased synthesis of NO lead to an increase in total peripheral resistance. Restoration of normal ECV and blood pressure is attained by increased glomerular filtration and decreased sodium reabsorption. In some individuals, the kidneys have difficulty excreting sodium, so equilibrium is achieved at the expense of elevated blood pressure. There is some lag time between reduction of ECV and normalization of blood pressure because normal levels of Na(+)-K(+)-ATPase inhibitors and asymmetrical dimethyl-l-arginine are restored slowly. In dialysis patients, all mechanisms intended to increase renal sodium removal are futile, but they still operate and elevate blood pressure. Sodium balance must be achieved via dialysis and ultrafiltration. Blood pressure is normalized a few weeks after ECV is returned to normal, i.e., when the patient reaches dry body weight. This is called the "lag phenomenon."


Subject(s)
Hypertension, Renal/metabolism, Kidney Failure, Chronic/metabolism, Kidney/metabolism, Renal Dialysis, Sodium Chloride, Dietary/metabolism, Animals, Homeostasis/physiology, Humans, Kidney Failure, Chronic/therapy
8.
Hemodial Int ; 12(2): 173-210, 2008 Apr.
Article in English | MEDLINE | ID: mdl-18394051

ABSTRACT

Accumulation of the knowledge requisite for the development of hemodialysis started in antiquity and continued through the Middle Ages into the 20th century. Firstly, it had to be determined that the kidneys produce urine containing toxic substances that accumulate in the body if the kidneys fail to function properly; secondly, it was necessary to discover the processes of diffusion and dialysis; thirdly, it was necessary to develop a safe method to prevent clotting in the extracorporeal circulation; and fourthly, it was necessary to develop biocompatible dialyzing membranes. Most of the essential knowledge was acquired by the end of the 19th century. Hemodialysis as a practical means of replacing kidney function started and developed in the 20th century. The original hemodialyzers, using celloidin as a dialyzing membrane and hirudin as an anticoagulant, were used in animal experiments at the beginning of the 20th century, followed by a few attempts in humans in the 1920s. Rapid progress started with the application of cellophane membranes and heparin as an anticoagulant in the late 1930s and 1940s. The explosion of new dialyzer designs continued in the 1950s and 1960s and ended with the development of capillary dialyzers. Cellophane was replaced by other dialyzing membranes in the 1960s, 1970s, and 1980s. Dialysis solution was originally prepared in a tank from water, electrolytes, and glucose. This solution was recirculated through the dialyzer and back to the tank. In the 1960s, a single-pass dialysis solution preparation and delivery system was designed. A large quantity of dialysis solution was used for a single dialysis. Sorbent systems, using a small volume of regenerated dialysis solution, were developed in the mid 1960s and continue to be used for home hemodialysis and acute renal failure. At the end of the 20th century, a new closed system, which prepared and delivered ultrapure dialysis solution, was developed. This system also reused lines and dialyzers automatically and prepared the machine for the next dialysis. It was specifically designed for quotidian home hemodialysis. Another system for frequent home hemodialysis or acute renal failure was developed at the turn of the 21st century. This system uses premanufactured dialysis solution, delivered to the home or dialysis unit, as is done for peritoneal dialysis.


Subject(s)
Kidneys, Artificial, Renal Dialysis/instrumentation, Equipment Design/history, Hemodialysis Solutions/history, History, 19th Century, History, 20th Century, History, 21st Century, History, Ancient, History, Medieval, Humans, Kidneys, Artificial/history, Renal Dialysis/history, Renal Insufficiency/history, Renal Insufficiency/physiopathology, Renal Insufficiency/therapy
9.
Hemodial Int ; 22(S2): S29-S64, 2018 Oct.
Article in English | MEDLINE | ID: mdl-30457224

ABSTRACT

Hemodialysis for chronic renal failure was introduced and developed in Seattle, WA, in the 1960s. Using Kiil dialyzers, weekly dialysis time and frequency were established at about 30 hours of dialysis given three times weekly. This dialysis time and frequency was associated with a 10% yearly mortality in the United States in the 1970s. Later in the 1970s, newer and more efficient dialyzers were developed, and it was felt that dialysis time could be shortened. Additional incentives to shorten dialysis were lower cost and greater convenience. Further support for shortening dialysis time was provided by a randomized prospective trial, the National Cooperative Dialysis Study (NCDS). This study committed a Type II statistical error by rejecting the time of dialysis as an important factor in determining the quality of dialysis. This study also provided the basis for the establishment of the Kt/Vurea index as a measure of dialysis adequacy. This index, having been established in a sacrosanct randomized controlled trial (RCT), was readily accepted by the HD community and led to shorter dialysis and higher mortality in the United States. Kt/Vurea is a poor measure of dialysis quality because it combines three unrelated variables into a single formula. These variables influence the clinical status of the patient independently of each other. It is impossible to compensate for short dialysis duration (t) with increased urea clearance (K), because the tolerance of ultrafiltration depends on the plasma-refilling rate, which has nothing in common with urea clearance. Later, another RCT (the HEMO study) committed a Type III statistical error by asking the wrong research question, thus not yielding any valuable results. Fortunately, it did not lead to deterioration of dialysis outcomes in the United States. The third RCT in this field ("in-center hemodialysis 6 times per week versus 3 times per week") did not bring forth any valuable results, but at least confirmed what was already known. The fourth such trial ("The effects of frequent nocturnal home hemodialysis") likewise did not show any positive results, primarily because significant subject recruitment issues led to inappropriate selection of patients. Comparison of the value of peritoneal dialysis and HD in RCTs could not be completed because of recruitment problems. Randomized controlled trials have therefore failed to yield any meaningful information in the area of dose and/or frequency of hemodialysis.
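
As a concrete illustration of the index criticized above: Kt/Vurea multiplies urea clearance (K) by session length (t) and divides by the urea distribution volume (V). The minimal Python sketch below, with invented values, shows how the same Kt/V can be reached by trading time for clearance, which is exactly the substitution the abstract argues is not clinically valid.

    # A minimal sketch of the Kt/V_urea computation; values are illustrative,
    # not taken from any of the trials discussed above.

    def kt_v(clearance_ml_min, time_min, urea_volume_l):
        """Urea clearance x session length / urea distribution volume."""
        return clearance_ml_min * time_min / (urea_volume_l * 1000.0)

    # A long session at modest clearance and a short session at high clearance
    # yield the same index, although ultrafiltration tolerance (limited by the
    # plasma-refilling rate) differs greatly between the two.
    print(kt_v(200, 240, 40))  # 1.2 (4 h at K = 200 mL/min, V = 40 L)
    print(kt_v(320, 150, 40))  # 1.2 (2.5 h at K = 320 mL/min, V = 40 L)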


Subject(s)
Renal Dialysis/methods, Sodium/isolation & purification, Urea/metabolism, Blood Pressure, Hemodialysis, Home, Humans, Kidney Failure, Chronic/therapy, Prospective Studies, Randomized Controlled Trials as Topic, Regional Blood Flow, Renal Dialysis/mortality, Renal Dialysis/standards, Time Factors, Urea/toxicity
10.
Kidney Int ; 82(1): 114-5; author reply 115, 2012 Jul.
Article in English | MEDLINE | ID: mdl-22699382
12.
Int J Artif Organs ; 29(1): 2-40, 2006 Jan.
Article in English | MEDLINE | ID: mdl-16485237

ABSTRACT

The first peritoneal accesses were devices that had been used in other fields (general surgery, urology, or gynecology): trocars, rubber catheters, and sump drains. In the period after World War II, numerous papers were published describing various modifications of peritoneal dialysis. The majority of cases were treated with the continuous flow technique; rubber catheters for inflow and sump drains for outflow were commonly used. At the end of the 1940s, intermittent peritoneal dialysis started to be used more frequently. Severe complications of peritoneal accesses created an incentive to design accesses specifically for peritoneal dialysis. The initial three, in the late 1940s, were modified sump drains; however, Ferris and Odel were the first to design a soft, polyvinyl intraperitoneal tube, with metal weights to keep the catheter tip in the pelvic gutter, where the conditions for drainage are best. In the 1950s, intermittent peritoneal dialysis was established as the preferred technique; polyethylene and nylon catheters became commercially available, and peritoneal dialysis was established as a valuable method for treatment of acute renal failure. The major breakthrough came in the 1960s. First, it was discovered that silicone rubber was less irritating to the peritoneal membrane than other plastics. Then, it was found that polyester velour allowed excellent tissue ingrowth, creating a firm bond with the tissue. When a polyester cuff was glued to the catheter, it restricted catheter movement and created a closed tunnel between the integument and the peritoneal cavity. In 1968, Tenckhoff and Schechter combined these two features and designed a silicone rubber catheter with one polyester cuff for treatment of acute renal failure and two cuffs for treatment of chronic renal failure. This was the most important development in peritoneal access. Technological evolution never ends. Multiple attempts have been made to eliminate the remaining complications of the Tenckhoff catheter, such as exit/tunnel infection, external cuff extrusion, migration leading to obstruction, dialysate leaks, recurrent peritonitis, and infusion or pressure pain. New designs combined the best features of the previous ones or incorporated new elements. Not all attempts have been successful, but many have. To prevent catheter migration, Di Paolo and his colleagues applied the old idea of weighting the catheter tip to Tenckhoff catheters. In another modification, Twardowski and his collaborators created a permanent bend in the intra-tunnel portion of the silicone catheter to eliminate cuff extrusions. The Tenckhoff catheter continues to be widely used for chronic peritoneal dialysis, although its use is decreasing in favor of swan-neck catheters. Soft silicone rubber instead of rigid tubing virtually eliminated such early complications as bowel perforation and massive bleeding. Other complications, such as obstruction, pericatheter leaks, and superficial cuff extrusions, have been markedly reduced in recent years, particularly with the use of swan-neck catheters and insertion through the rectus muscle instead of the midline. However, these complications still occur, so new designs are being tried.


Subject(s)
Catheterization/history, Peritoneal Dialysis/history, Animals, Catheterization/adverse effects, History, 20th Century, Humans, Peritoneal Lavage/history
13.
Adv Perit Dial ; 22: 147-52, 2006.
Article in English | MEDLINE | ID: mdl-16983959

ABSTRACT

The Tenckhoff catheter was developed in 1968 and has been widely used since for chronic peritoneal dialysis (PD) patients. Variations of the Tenckhoff catheter have been designed over the years in a search for the ideal PD catheter: an access that can provide reliable dialysate flow rates with few complications. Currently, data derived from randomized, controlled, multicenter trials dedicated to testing how catheter design and placement technique influence long-term catheter survival and function are scarce. As a result, no firm guidelines exist at the national or international levels on optimal PD catheter type or implantation technique. Also, no current statistics on the use of PD catheters are available. The last survey was carried out using an audience response system at the Annual Peritoneal Dialysis Conference in Orlando, Florida, in January 1994. The present analysis is based on a new survey done at the 2005 Annual Dialysis Conference in Tampa, Florida. It is a snapshot of preferences in catheter design and implantation technique in 2004 from an international sample of 65 respondent chronic PD centers. The Tenckhoff catheter remains the most widely used catheter, followed closely by the swan-neck catheter in both adult and pediatric respondent centers. Double-cuff catheters continue to be preferred over single-cuff catheters, and coiled intraperitoneal segments are generally preferred over straight intraperitoneal segments. Surgical implantation remains the prevailing placement method in both pediatric and adult respondent centers.


Subject(s)
Catheters, Indwelling/statistics & numerical data, Peritoneal Dialysis/instrumentation, Adult, Child, Humans
14.
Hemodial Int ; 10(4): 394-8, 2006 Oct.
Article in English | MEDLINE | ID: mdl-17014518

ABSTRACT

Brain natriuretic peptide or B-type natriuretic peptide (BNP) is a sensitive marker of heart disease. Plasma levels of BNP increase in left ventricular failure, and determination of plasma BNP has become a useful tool in the diagnosis of heart failure. Hemodialysis (HD) patients may have elevated plasma levels of BNP, particularly predialysis, that correlate with echocardiographic signs of left ventricular dysfunction. High BNP levels are also a strong predictor of mortality in both nonrenal and HD patients. We studied plasma BNP levels in patients who changed from conventional thrice-weekly dialysis to daily dialysis 6 times a week while maintaining a total weekly time on dialysis of 12 hr. Twelve HD patients, mean age 55 years, had 4 hr of conventional thrice-weekly treatment for 4 weeks. Predialysis and postdialysis blood samples were obtained at the last dialysis. Patients were then dialyzed for 2 hr, 6 times weekly, for 4 weeks (daily dialysis). Again, predialysis and postdialysis blood samples were collected at the last HD. Brain natriuretic peptide plasma concentrations were determined by immunoradiometric assay. Predialysis BNP levels decreased from 194 ± 51 ng/L (68 ± 19 pmol/L; mean ± SE) during thrice-weekly HD to 113 ± 45 ng/L (41 ± 18 pmol/L; p = 0.001) after 4 weeks on daily dialysis. With thrice-weekly HD, predialysis BNP levels were higher than postdialysis levels of 120 ± 26 ng/L (39 ± 8 pmol/L; p = 0.059). With daily dialysis, predialysis BNP levels did not differ significantly from postdialysis levels. Elevated predialysis plasma levels of BNP, considered sensitive and early markers of left ventricular dysfunction, decreased when patients were changed from conventional thrice-weekly HD to daily dialysis while total hours of dialysis per week were kept constant. Given the accumulated evidence that BNP is a biomarker of left ventricular dysfunction and can be used for risk stratification and guidance in the pharmacotherapy of heart failure, daily dialysis appears to lead to less cardiac distress.


Subject(s)
Natriuretic Peptide, Brain/blood, Renal Dialysis/methods, Ventricular Dysfunction, Left/blood, Ventricular Dysfunction, Left/prevention & control, Adult, Aged, Aged, 80 and over, Biomarkers/blood, Female, Humans, Kidney Failure, Chronic/blood, Kidney Failure, Chronic/complications, Kidney Failure, Chronic/therapy, Male, Middle Aged, Renal Dialysis/adverse effects, Time Factors, Ventricular Dysfunction, Left/etiology
15.
Adv Perit Dial ; 21: 72-5, 2005.
Article in English | MEDLINE | ID: mdl-16686289

ABSTRACT

Peritoneal dialysis (PD)-associated peritonitis contributes significantly to morbidity and modality failure. The number of patients on PD is declining in Western countries, and peritonitis is a potential deterrent to the therapy. Here, we present a clinically significant decline in the rate of peritonitis at a single center over a 28-year period, with current rates significantly lower than the national average, and we review several factors that have contributed to those outcomes. Peritonitis and duration of follow-up have been recorded for all patients followed in our program since 1977. Introduction of important technological changes into the program was also recorded. All peritonitis rates are expressed as episodes per patient-year or as 1 episode per n patient-months. Data are summarized for each calendar year since 1977. We followed 682 patients for a total follow-up duration of 15,435 patient-months. Glass bottles were changed to plastic bags in 1978. Straight connecting tubes were replaced by Y-sets in 1988. The presternal dialysis catheter was introduced in 1991 and has been the primary PD access since 1995. The peritonitis rate in 1977 was 5.8 episodes/patient-year, and that rate has progressively declined over the past 27 years to 0.35 episodes/patient-year in 2004. Technical improvements that contributed to the decline in overall peritonitis rates have been adopted nationwide. The largest improvement occurred with the switch from glass bottles to plastic bags and to the closed-system Y-set that incorporated the flush-before-fill principle. Advances in catheter technology have also played a key role. Quality improvement in the program and long years of experience in the overall care of PD patients are significant factors that cannot be measured quantitatively. Improvements have been made to exit-site care protocols, to exit-site evaluation and diagnosis, and to treatment strategies. Patient education and training in catheter care remain important factors in a PD program. Many factors have contributed to the reduction of PD-associated peritonitis rates at our center. Improved connectology, catheter care, and patient education play key roles in the reduction of peritonitis.
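
The rate convention used above converts episode counts and follow-up time into episodes per patient-year; the short Python sketch below shows the arithmetic with invented counts, since the abstract reports only the resulting rates.

    # Peritonitis rate arithmetic; the episode count here is hypothetical,
    # chosen only so the result matches the 2004 rate quoted above.

    def episodes_per_patient_year(episodes, patient_months):
        return episodes / (patient_months / 12.0)

    print(round(episodes_per_patient_year(45, 1543), 2))  # ~0.35 episodes/patient-year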


Subject(s)
Peritoneal Dialysis, Continuous Ambulatory/adverse effects, Peritonitis/epidemiology, Follow-Up Studies, Humans, Missouri/epidemiology, Peritoneal Dialysis, Continuous Ambulatory/instrumentation, Peritonitis/etiology
16.
J Vasc Access ; 16 Suppl 9: S54-60, 2015.
Article in English | MEDLINE | ID: mdl-25751552

ABSTRACT

There are two methods of fistula cannulation for hemodialysis. The first, the different-site or rope-ladder method, established by the originators of the arteriovenous fistula as a blood access for hemodialysis in 1966, relies on changing the puncture sites for each dialysis. The second, the constant-site or buttonhole method, developed several years later, uses the same puncture sites for consecutive dialyses. The first method prevails at present, but the second is becoming more and more popular. The major advantages of the buttonhole method are lower cannulation pain and fewer fistula complications, with the exception of fistula infection, which is more common in some studies. The method is more difficult and requires an experienced single cannulator to establish good puncture sites. Home hemodialysis patients using a single cannulator, the patient or a helper, have better results with this method. Busy dialysis centers with a high rotation of cannulators do not have such good results and prefer the rope-ladder method.


Subject(s)
Arteriovenous Shunt, Surgical, Catheterization/methods, Renal Dialysis, Catheterization/adverse effects, Catheterization/instrumentation, Equipment Design, Humans, Needles, Punctures, Time Factors, Treatment Outcome
17.
Am J Kidney Dis ; 43(1): 90-102, 2004 Jan.
Article in English | MEDLINE | ID: mdl-14712432

ABSTRACT

BACKGROUND: Advantages associated with an increased frequency of hemodialysis have been reported previously. However, previous studies were either small or not controlled and did not detail early clinical, biochemical, quality-of-life, urea kinetic, and dynamic changes when patients switched from a conventional (3 times/wk) dialysis regimen to "daily" (6 times/wk) dialysis therapy while total weekly dialysis time was unchanged. METHODS: A prospective sequential study with 21 patients as their own controls was performed. A 4-week period of conventional thrice-weekly dialysis (N = 240 treatments) was followed immediately by a 4-week period of daily (ie, 6 times/wk) dialysis (N = 480 treatments), in which each treatment was half the length of a conventional dialysis treatment session. Clinical parameters and symptoms during and between dialysis treatments were graded, and urea-related parameters, blood chemistry results, and nutritional data were determined. RESULTS: Within 4 weeks of switching to this daily dialysis regimen, there were improvements in blood pressure, dialysis "unphysiology," intradialytic and interdialytic symptoms, and urea kinetics and dynamics. There were fewer machine alarms and less need for nursing interventions during dialysis. Nutrition and quality of life began to improve. There was no increase in blood access complications and no significant changes in blood chemistry results, hematologic parameters, or use of medications. CONCLUSION: In this short-term study, daily dialysis appears to be a safe, better, and more physiological method of delivering dialysis care to patients with end-stage renal disease.


Subject(s)
Kidney Failure, Chronic/therapy, Renal Dialysis, Adult, Aged, Aged, 80 and over, Blood Chemical Analysis, Blood Urea Nitrogen, Female, Hematologic Tests, Humans, Kidney Failure, Chronic/blood, Male, Middle Aged, Prospective Studies, Quality of Life, Time Factors
20.
ASAIO J ; 49(6): 645-9, 2003.
Article in English | MEDLINE | ID: mdl-14655728

ABSTRACT

The present authors studied longevity and complications with fistulae, grafts, and central venous catheters in 23 patients on daily hemodialysis for a total of 409 patient-months (mean 18 ± 10 months) and 9,209 dialyses. Fourteen patients had fistulae, five had grafts, and four had catheters. These required one, one, and two replacements, respectively, during total observation times of 254, 105, and 50 patient-months. Fistulae required 0.05 replacements per year vs. 0.11 replacements per year for grafts and 0.48 replacements per year for catheters (p = 0.042, fistulae vs. other accesses). Cumulative survival at 15 months was 100% for fistulae, 80% for grafts, and 20% for catheters; at 3 years it was 80% for fistulae and grafts. No catheter survived beyond 15 months (p = 0.041). Twenty-seven events required hospitalization or an outpatient intervention. Fistulae had 0.52 events per patient-year, grafts 1.37 events per patient-year, and catheters 1.44 events per patient-year (p = 0.080, fistulae vs. other accesses). Patients with fistulae reported more problems between dialyses, occurring on 3.2% of the observation days compared with 0.2% for grafts and 0.4% for catheters (p < 0.0001). Eighty-five percent of these problems were pain and redness at the access site. During dialysis, there were more problems with catheters (9.1% vs. 2.7% for fistulae and 0.9% for grafts, p < 0.0001). Complications and survival data were similar to those reported with daily hemodialysis by others and better than data from reports of access problems with conventional three times weekly hemodialysis. In conclusion, daily hemodialysis does not adversely affect the usual types of blood access. Survival was best and the need for intervention least with fistulae, and worst with catheters. Grafts, when functioning, had fewer problems both between and during dialyses.
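
The replacement rates above follow directly from the replacement counts and follow-up durations reported; a short Python sketch reproducing that arithmetic:

    # Reproducing the access replacement rates from the counts and follow-up
    # durations given in the abstract (1, 1, and 2 replacements over 254, 105,
    # and 50 patient-months, respectively).

    def replacements_per_year(replacements, patient_months):
        return replacements / (patient_months / 12.0)

    print(round(replacements_per_year(1, 254), 2))  # fistulae:  0.05 per year
    print(round(replacements_per_year(1, 105), 2))  # grafts:    0.11 per year
    print(round(replacements_per_year(2, 50), 2))   # catheters: 0.48 per year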


Subject(s)
Arteriovenous Shunt, Surgical/adverse effects, Catheterization, Central Venous/adverse effects, Graft Occlusion, Vascular/epidemiology, Hemodialysis, Home, Kidney Failure, Chronic/epidemiology, Kidney Failure, Chronic/therapy, Adult, Aged, Aged, 80 and over, Female, Humans, Male, Middle Aged, Proportional Hazards Models, Surveys and Questionnaires, Treatment Outcome