Results 1 - 20 of 38
1.
Br J Surg ; 103(5): 607-15, 2016 Apr.
Article in English | MEDLINE | ID: mdl-26865013

ABSTRACT

BACKGROUND: Evaluation of new surgical procedures is a complex process challenged by evolution of technique, operator learning curves, the possibility of variable procedural quality, and strong treatment preferences among patients and clinicians. Preliminary studies that address these issues are needed to prepare for a successful randomized trial. The IDEAL (Idea, Development, Exploration, Assessment and Long-term follow-up) Framework and Recommendations provide an integrated step-by-step evaluation pathway that can help investigators achieve this. METHODS: A practical guide was developed for investigators evaluating new surgical interventions in the earlier phases before a randomized trial (corresponding to stages 1, 2a and 2b of the IDEAL Framework). The examples and practical tips included were chosen and agreed upon by consensus among authors with experience either in designing and conducting IDEAL format studies, or in helping others to design such studies. They address the most common challenges encountered by authors attempting to follow the IDEAL Recommendations. RESULTS: A decision aid has been created to help identify the IDEAL stage of an innovation from literature reports, with advice on how to design and report the IDEAL study formats discussed, along with the ethical and scientific rationale for specific recommendations. CONCLUSION: The guide helps readers and researchers to understand and implement the IDEAL Framework and Recommendations to improve the quality of evidence supporting surgical innovation.


Subject(s)
Evidence-Based Medicine/methods , Randomized Controlled Trials as Topic/methods , Research Design , Surgical Procedures, Operative , Humans
2.
Aliment Pharmacol Ther ; 38(9): 1045-53, 2013 Nov.
Article in English | MEDLINE | ID: mdl-24024705

ABSTRACT

BACKGROUND: The preferred initial investigation with either magnetic resonance (MRCP) or endoscopic retrograde cholangiopancreatography (ERCP) in patients with suspected biliary obstruction remains controversial in many clinical settings. AIM: To assess the effectiveness of an initial MRCP vs. ERCP in the work-up of patients at moderate likelihood of a suspected biliary obstruction. METHODS: Patients with an unconfirmed benign biliary obstruction, based on laboratory and ultrasound findings, were randomised to an ERCP-first or MRCP-first strategy, stratified by level of obstruction. The primary outcome was the occurrence of a disease- or procedure-related bilio-pancreatic adverse event within the following 12 months. Secondary outcomes were the number of subsequent bilio-pancreatic procedures, duration of hospitalisation, days away from activities of daily living (ADL), quality of life (SF-36) and mortality. RESULTS: We randomised 126 patients to ERCP-first and 131 to MRCP-first (age 54 ± 18 years, 62% female, 39% post-cholecystectomy). In follow-up, 18/126 (14.3%) ERCP-first and 25/131 (19.1%) MRCP-first patients experienced a procedure- or disease-related complication (P = 0.30) (disease-related in 13 and 18 patients, and procedure-related in 5 and 7 patients respectively). A cause of biliary obstruction was found in 39.7% vs. 49.6% of patients (P = 0.11). Sixty-six (50%) patients in the MRCP-first group avoided an ERCP in follow-up. ERCP-first and MRCP-first patients were away from usual activities for 3.4 ± 7.7 days and 2.0 ± 4.8 days respectively (P < 0.001). CONCLUSION: A strategy of MRCP-first decreased the need for subsequent ERCPs, but not complications. Further study is required to define factors influencing the eventual use of MRCP vs. ERCP in appropriately selected patients (ClinicalTrials.gov: NCT01424657).
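As a companion to the primary-outcome figures above, here is a minimal sketch of how the two complication proportions could be compared. The abstract does not name the test used, so an uncorrected chi-square test on the 2x2 table is assumed; the counts are those reported in the Results.

```python
# A minimal sketch, assuming an uncorrected chi-square test; the trial's
# actual analysis may differ. Counts come from the abstract's Results.
from scipy.stats import chi2_contingency

ercp_events, ercp_n = 18, 126   # ERCP-first: 18/126 (14.3%)
mrcp_events, mrcp_n = 25, 131   # MRCP-first: 25/131 (19.1%)

table = [
    [ercp_events, ercp_n - ercp_events],
    [mrcp_events, mrcp_n - mrcp_events],
]
chi2, p, dof, expected = chi2_contingency(table, correction=False)
print(f"ERCP-first: {ercp_events/ercp_n:.1%}, MRCP-first: {mrcp_events/mrcp_n:.1%}")
print(f"P = {p:.2f}")  # ~0.30, consistent with the reported P = 0.30
```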


Subject(s)
Cholangiopancreatography, Endoscopic Retrograde/methods , Cholangiopancreatography, Magnetic Resonance/methods , Cholestasis/diagnosis , Gallstones/complications , Activities of Daily Living , Adult , Aged , Biliary Tract Diseases/diagnosis , Biliary Tract Diseases/etiology , Biliary Tract Diseases/pathology , Cholecystectomy/methods , Cholestasis/etiology , Cholestasis/pathology , Female , Follow-Up Studies , Humans , Length of Stay , Male , Middle Aged , Patient Selection , Quality of Life
3.
HPB (Oxford) ; 8(1): 67-8, 2006.
Article in English | MEDLINE | ID: mdl-18333243

ABSTRACT

BACKGROUND: A 29-year-old woman who presented with fatigue and jaundice was found to have an obstructing mass at the bifurcation of the bile duct. The patient underwent a successful left hepatectomy with resection of the bile duct bifurcation and a reconstruction with a right hepaticojejunostomy. Pathology revealed an atypical carcinoid tumour of the left extrahepatic bile duct, with perineural and lymphatic invasion. The patient subsequently developed multiple metastases in the remaining liver. METHODS: In the absence of extrahepatic disease, the patient underwent a successful liver transplant. RESULTS: Two years later she remains disease-free. DISCUSSION: To our knowledge this is the first report of a biliary carcinoid treated with hepatectomy and finally with liver transplantation, with excellent results. The biological behaviour of these rare tumours mandates aggressive surgical management.

4.
Transplant Proc ; 36(6): 1747-52, 2004.
Article in English | MEDLINE | ID: mdl-15350468

ABSTRACT

BACKGROUND: Renal dysfunction remains the Achilles' heel of calcineurin inhibitor (CI) use. The purpose of this study was to assess our institutional, renal-sparing strategy using thymoglobulin (TMG) in recipients of orthotopic liver transplants. METHODS: We performed a retrospective analysis of data from 298 adult recipients who were transplanted between 1991 and 2002. The patients were divided into two groups: those induced with TMG (group 1) and those not treated with this agent (group 2). A subgroup analysis was performed of patients with baseline serum creatinine values above 1.5 mg/dL (group 1A received TMG; group 2A did not). All patients received tacrolimus or cyclosporine (CyA) maintenance immunosuppression. RESULTS: Indications and demographics were similar between the two groups. Although there was no difference in patient and graft survivals, there was a statistically significant benefit in rejection-free graft survival at 1 year for group 1 (51% vs 39%; P =.02). Furthermore, serum creatinine at 6 months was lower for group 1, despite a similar baseline creatinine. Subgroup analysis of patients with abnormal baseline serum creatinine showed that group 1A displayed improved rejection-free graft survival at 1 month but not at 1 year. CONCLUSIONS: Thymoglobulin induction therapy may allow a delay in the initiation of CI therapy without compromising patient and graft survival, while preventing early rejection, even among patients with baseline renal dysfunction.


Subject(s)
Antilymphocyte Serum/therapeutic use , Calcineurin Inhibitors , Liver Transplantation/physiology , Adult , Creatinine/blood , Female , Follow-Up Studies , Graft Survival , Humans , Kidney Function Tests , Liver Transplantation/mortality , Male , Middle Aged , Retrospective Studies , Survival Analysis , Time Factors
5.
Transplant Proc ; 36(6): 1760-2, 2004.
Article in English | MEDLINE | ID: mdl-15350471

ABSTRACT

BACKGROUND: Little is known about the effect of blood transfusions and leukoreduction on acute rejection in liver transplantation. The purpose of this study was to assess the impact of leukoreduction on the occurrence of early rejection episodes in liver transplantation. METHODS: In 1999, mandatory leukoreduction was implemented in our program. Data from 339 consecutive liver transplant recipients were analyzed with attention to the time period as a proxy for leukoreduction, the number of transfusions, the wait list status, the hepatitis B or C status, the recipient age, and the type of immunosuppression. RESULTS: Using an early (6-month) rejection-free graft survival model, we observed that the introduction of leukoreduction was independently associated with fewer rejection episodes (P =.001). Even after accounting for the lower rejection rate associated with a tacrolimus and antithymocyte globulin regimen, the effect of implementing leukoreduction remained significant (P =.021). CONCLUSION: The use of leukoreduction is associated with fewer early rejections, irrespective of the type of immunosuppression. These data support an exploration of the immunomodulatory effect of leukoreduction.


Subject(s)
Graft Rejection/prevention & control , Graft Survival/immunology , Leukocyte Reduction Procedures , Liver Transplantation/immunology , Disease-Free Survival , Graft Rejection/epidemiology , Graft Rejection/immunology , Humans , Retrospective Studies
6.
Transplant Proc ; 35(7): 2435-7, 2003 Nov.
Article in English | MEDLINE | ID: mdl-14611980

ABSTRACT

AIM: Most technical complications after orthotopic liver transplantation (OLT) are related to the biliary tree. This report reviews the role of routine intraoperative placement of stents in reducing biliary complications. METHODS: We retrospectively analyzed 396 consecutive OLTs. We reviewed rates of biliary complications after hepaticojejunostomy (HJA) and after choledochocholedochostomy (CCA) in three groups: an "experimental" group (routine intraoperative biliary stenting, last 10 months), a "recent" control group (nonstented, previous 10 months), and a "historical" control group (prior to that period). RESULTS: All groups were matched for donor/recipient characteristics and for graft cold/warm ischemia times. The overall prevalence of biliary complications was 30.7% after CCA versus 35% after HJA. The 21 patients in the experimental group had a 4.8% biliary complication rate, compared with rates of 30% and 32.6% in the recent control and historical groups, respectively (P <.05). CONCLUSIONS: The intraoperative use of biliary stents is feasible and appears to decrease the rate of biliary complications. These results support the need for a prospective randomized trial.


Subject(s)
Gallbladder Diseases/prevention & control , Gallbladder/surgery , Liver Transplantation/methods , Choledochostomy , Follow-Up Studies , Humans , Jejunum/surgery , Liver/surgery , Postoperative Complications/prevention & control , Retrospective Studies , Time Factors
7.
Transplantation ; 72(9): 1519-22, 2001 Nov 15.
Article in English | MEDLINE | ID: mdl-11707739

ABSTRACT

BACKGROUND: CA (cancer antigen) 125 is a serologic marker used in the monitoring of ovarian cancer. Elevated levels are also reported in cirrhosis. We evaluated the range of serum CA 125 levels seen before and after liver transplantation, and examined possible factors associated with CA 125 elevation. METHODS: We prospectively examined 57 consecutive patients with cirrhosis who underwent liver transplantation. CA 125 levels were also measured in two patients with polycystic liver disease. RESULTS: The mean serum CA 125 level before transplantation was 352 +/- 549 U/ml, compared with 46 +/- 49 U/ml after transplantation (P < 0.001). Multivariate analysis identified the degree of ascites as the only significant predictor of the preoperative CA 125 level. In five patients who underwent abdominal paracentesis, the mean ascites CA 125 level (951 +/- 322 U/ml) was higher than that of the serum (619 +/- 290 U/ml) (P < 0.003). In 16 hepatectomy specimens, the grade of mesothelial staining for CA 125 was 0.8 +/- 1.4 in patients with a normal serum CA 125 level, compared with 1.5 +/- 1.1 in patients with elevated serum levels (P = 0.37). Two patients with severe abdominal distension due to polycystic liver disease but without ascites had elevated serum CA 125 levels. DISCUSSION: The CA 125 concentration is elevated in the majority of patients with cirrhosis and normalizes after liver transplantation; it reflects the abdominal distension seen in these patients. Therefore, an elevated CA 125 should not be considered a contraindication to liver transplantation in the absence of evidence of malignancy.


Subject(s)
Ascites/blood , CA-125 Antigen/blood , Liver Cirrhosis/surgery , Liver Transplantation/physiology , Biomarkers/blood , Female , Humans , Liver Cirrhosis/blood , Liver Neoplasms/blood , Liver Neoplasms/diagnosis , Male , Prospective Studies , Reference Values , Severity of Illness Index
8.
Surgery ; 130(4): 686-93; discussion 693-5, 2001 Oct.
Article in English | MEDLINE | ID: mdl-11602900

ABSTRACT

BACKGROUND: Surgical success has traditionally been judged from the surgeon's perspective. A more complete evaluation of outcome incorporates the patient's, surgeon's, and payor's perspectives. Because gastroesophageal reflux disease (GERD) is primarily a quality-of-life (QOL) problem, the evaluation of laparoscopic fundoplication (LF) is a useful model for evaluating outcomes from these 3 perspectives. METHODS: Between 1995 and 2000, 74 patients underwent primary LF for GERD. In addition to undergoing physiologic testing, 63 patients (85%) were evaluated with use of a disease-specific health-related QOL scale (GERD-HRQL), scored from 0 (no symptoms) to 45 (incapacitating symptoms). Thirty-three patients also completed a generic QOL questionnaire (SF-12), in which patient satisfaction was scored from 1 (very satisfied) to 5 (very dissatisfied). Preoperative and postoperative data were compared with use of the Wilcoxon signed rank test or the paired t test. RESULTS: The median GERD-HRQL score improved from 18 preoperatively to 0 at 2 years postoperatively (P <.01). The median satisfaction score improved from 5 to 1 (P <.01). The SF-12 summary scores also improved by 6 weeks postoperatively (P <.05). The mean +/- SD lower esophageal sphincter pressure rose from 7.3 +/- 4 mm Hg preoperatively to 17.5 +/- 6 mm Hg postoperatively (P <.01), and the mean percentage of time that the esophagus was exposed to a pH of less than 4 declined from 14.7% +/- 12% to 1.1% +/- 2% (P <.01). The median operative time was 110 minutes, which declined with experience with the procedure (P <.01). Median postoperative stay was 2 days. CONCLUSIONS: In evaluating the outcomes of a new procedure, 3 overlapping points of view were addressed: the patient's (QOL, satisfaction), the surgeon's (physiologic changes), and the payor's (operating room time, hospital stay). With use of this framework, we found that LF for GERD improves QOL, corrects the physiologic abnormalities, and is associated with short hospitalization and an operating time that declines with experience.
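To make the paired analysis concrete, the sketch below applies the Wilcoxon signed rank test, one of the two tests the abstract names, to hypothetical paired GERD-HRQL scores; the data are illustrative, not the study's.

```python
# A minimal sketch with illustrative (not the study's) paired scores on the
# GERD-HRQL scale: 0 = no symptoms, 45 = incapacitating symptoms.
from scipy.stats import wilcoxon

pre_op  = [18, 22, 15, 30, 12, 25, 19, 27]  # hypothetical preoperative scores
post_op = [0, 2, 1, 5, 0, 3, 0, 4]          # hypothetical 2-year scores

stat, p = wilcoxon(pre_op, post_op)
print(f"Wilcoxon statistic = {stat}, P = {p:.4f}")
```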


Subject(s)
Fundoplication/methods , Gastroesophageal Reflux/surgery , Adolescent , Adult , Aged , Female , Follow-Up Studies , Fundoplication/adverse effects , Gastroesophageal Reflux/physiopathology , Gastroesophageal Reflux/psychology , Humans , Laparoscopy , Length of Stay , Male , Middle Aged , Postoperative Complications , Quality of Life , Time Factors
9.
Liver Transpl ; 7(7): 652-5, 2001 Jul.
Article in English | MEDLINE | ID: mdl-11460236

ABSTRACT

Because of the increasing gap between the number of patients awaiting organ transplantation and the supply of organ donors, reevaluation of donor criteria is an important issue in clinical transplantation. It has become necessary to make maximal use of the currently available donor pool. We describe a case of successful orthotopic liver transplantation in a 57-year-old man with Laënnec's cirrhosis using a liver containing an 8-cm focal nodular hyperplasia (FNH) lesion involving segments II and III and the caudate lobe. The donor liver was procured from a 46-year-old woman declared brain dead after a subarachnoid hemorrhage. Definitive pathological diagnosis was made at laparotomy by obtaining a Tru-cut (Allegiance Health Care Inc, Toronto, Ontario, Canada) core biopsy specimen. The recipient operation was performed uneventfully except for bleeding from the biopsy site. The patient did well postoperatively and was discharged on tacrolimus, mycophenolate mofetil, and prednisone therapy. He continues to thrive 2½ years posttransplantation with no change in the size of the lesion. In well-selected donors, FNH should not be a contraindication for use in transplantation. However, FNH must be differentiated from hepatocellular adenoma. Although FNH has a benign course with little propensity for bleeding and almost no malignant potential, hepatic adenoma is reported to have a 15% to 33% chance of bleeding and rupture with a well-documented potential for neoplastic degeneration, making the liver unsuitable for donation.


Subject(s)
Focal Nodular Hyperplasia , Liver Cirrhosis/surgery , Liver Transplantation , Patient Selection , Tissue Donors , Focal Nodular Hyperplasia/diagnostic imaging , Humans , Male , Middle Aged , Tomography, X-Ray Computed
12.
Liver Transpl ; 7(2): 82-9, 2001 Feb.
Article in English | MEDLINE | ID: mdl-11172389

ABSTRACT

The pathophysiological state of rejection in liver xenotransplantation is poorly understood. Data from clinical pig liver perfusion suggest that pig livers might be rejected less vigorously than pig hearts or kidneys. Pig livers used in clinical xenoperfusions were exposed to blood from patients with liver failure. We have shown in an animal model that transplant recipients with liver failure are less capable of initiating hyperacute rejection of a xenografted liver than healthy transplant recipients. The goal of this report is to examine the pathological characteristics of pig livers used in 2 clinical pig liver perfusions and to combine this information with in vitro studies of pig-to-human liver xenotransplantation, to determine whether the findings in the perfused pig livers could be explained in part by the diminished capacity of patients with liver failure to respond to xenogeneic tissue. Pathological analysis of the perfused pig livers showed immunoglobulin M deposition in the sinusoids with little evidence of complement activation. Our in vitro studies showed that serum from patients with liver failure caused less injury to pig liver endothelium than serum from healthy subjects. Serum from patients with liver failure had levels of xenoreactive antibodies similar to those of serum from healthy humans. Incubation of serum from patients with liver failure with pig hepatic endothelial cells generated less iC3b, Bb fragment, and C5b-9 than serum from healthy subjects. We conclude that the attenuated injury in the perfused pig livers can be attributed to the relative complement deficiency that accompanies liver failure.


Subject(s)
Graft Rejection , Liver Failure/surgery , Liver Transplantation , Transplantation, Heterologous , Animals , Antibodies, Heterophile/analysis , Complement C3 Convertase, Alternative Pathway , Complement C3b/analysis , Complement Membrane Attack Complex/analysis , Complement System Proteins/analysis , Female , Fibrinogen/physiology , Humans , Liver/immunology , Liver/pathology , Liver Failure/blood , Middle Aged , Peptide Fragments/analysis , Perfusion , Swine
15.
J Gastrointest Surg ; 3(6): 575-82, 1999.
Article in English | MEDLINE | ID: mdl-10554363

ABSTRACT

Over a 28-month period, 123 patients with a unilateral inguinal hernia were recruited into a randomized controlled trial comparing open herniorrhaphy (OH) to laparoscopic inguinal herniorrhaphy (LH). The primary end point was duration of convalescence. Sixty-five patients underwent OH and 58 underwent LH. Both groups were well matched for all baseline parameters, although LH patients anticipated a shorter convalescence than OH patients (14.3 +/- 9.4 days vs. 18.5 +/- 10.8 days; P = 0.021). The median duration of hospital stay was one day in both groups. No difference was observed in the duration of convalescence (LH 9.8 +/- 7.4 days; OH 11.6 +/- 7.7 days) across groups. However, when the data were analyzed after removing patients receiving disability ("worker's") compensation (21 patients), patients undergoing LH recovered on average 3 days faster (LH 7.8 +/- 5.6 days; OH 10.9 +/- 7.5 days; P = 0.02). Patients not receiving worker's compensation appear to have a shorter convalescence after LH compared to OH. Disability compensation is a major confounding variable in determining convalescence and needs to be controlled for in any future trial design.
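The subgroup analysis above compares mean convalescence after excluding compensated patients. The sketch below illustrates such a comparison with Welch's t-test on hypothetical data; the abstract reports means +/- SD only and does not name the exact test behind P = 0.02.

```python
# A minimal sketch with hypothetical convalescence data (days), not the
# trial's raw data; Welch's t-test is an assumed choice of test.
from scipy.stats import ttest_ind

lh_days = [7, 5, 10, 8, 6, 9, 7, 11, 4, 12]      # hypothetical LH patients
oh_days = [12, 9, 14, 10, 8, 13, 11, 15, 7, 16]  # hypothetical OH patients

t, p = ttest_ind(lh_days, oh_days, equal_var=False)  # Welch's t-test
print(f"mean LH = {sum(lh_days)/len(lh_days):.1f} d, "
      f"mean OH = {sum(oh_days)/len(oh_days):.1f} d, P = {p:.3f}")
```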


Subject(s)
Convalescence , Hernia, Inguinal/rehabilitation , Hernia, Inguinal/surgery , Workers' Compensation/statistics & numerical data , Confounding Factors, Epidemiologic , Humans , Laparoscopy/statistics & numerical data , Middle Aged , Outcome Assessment, Health Care , Pain, Postoperative/epidemiology , Prospective Studies , Quality of Life , Quebec , Time Factors
16.
Transplantation ; 68(12): 1839-42, 1999 Dec 27.
Article in English | MEDLINE | ID: mdl-10628760

ABSTRACT

BACKGROUND: Based on the excellent correlation between cyclosporine A 2-hr postdose blood levels (C2) and the area under the concentration versus time curve, we evaluated the clinical benefit of Neoral dose monitoring with C2 compared with trough levels (C0) in stable heart transplant patients. METHODS: We studied 114 stable adult patients followed at the heart transplant clinic, who were >1 year after surgery. In May 1996 (period 1, follow-up 10 +/- 4 months), Neoral dose monitoring was based on C2 (300-600 ng/ml); in May 1997 (period 2, follow-up 10 +/- 2 months), it was based on C0 (100-200 ng/ml). Cyclosporine A levels were measured by an enzyme-multiplied immunologic technique. Clinical benefit was defined by the absence of acute rejection, no mortality, no fall in left ventricular ejection fraction >10%, and no increase in serum creatinine >10% (compared with baseline). RESULTS: During period 1, the Neoral dose, cyclosporine A C0 and C2, and serum creatinine decreased by 26%, 56%, 45%, and 2.3%, respectively. At the end of period 2, the same variables increased by 24%, 56%, 38%, and 10%, respectively (P < 0.0001). The incidence of acute rejection was similar (period 1: 0.87%, period 2: 0.96%). The left ventricular ejection fraction (initial/final) remained stable (period 1: 57 +/- 9%/58 +/- 13%; period 2: 59 +/- 11%/58 +/- 10%). Mortality did not differ (period 1: 7.9%, period 2: 9.6%). A clinical benefit was observed in 69.3% of the patients during period 1 vs. 43.3% of the patients during period 2 (P < 0.00001). CONCLUSIONS: In stable heart transplant patients, a greater clinical benefit was observed when Neoral dose monitoring was performed according to C2, compared with C0.
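The composite "clinical benefit" endpoint is defined precisely enough to express as a small function. The sketch below is an illustration only: the FollowUp record and clinical_benefit helper are hypothetical names, and the ">10% fall in LVEF" is interpreted as an absolute drop in percentage points, a reading the abstract leaves open.

```python
# A minimal sketch of the composite endpoint; names and the LVEF
# interpretation are assumptions, not the study's actual code.
from dataclasses import dataclass

@dataclass
class FollowUp:
    acute_rejection: bool
    died: bool
    lvef_baseline: float   # ejection fraction, %
    lvef_final: float      # ejection fraction, %
    creat_baseline: float  # serum creatinine, umol/L
    creat_final: float     # serum creatinine, umol/L

def clinical_benefit(f: FollowUp) -> bool:
    """True when all four components of the composite are satisfied."""
    lvef_fall = f.lvef_baseline - f.lvef_final
    creat_rise_pct = 100.0 * (f.creat_final - f.creat_baseline) / f.creat_baseline
    return (not f.acute_rejection and not f.died
            and lvef_fall <= 10.0 and creat_rise_pct <= 10.0)

# Example: stable LVEF, 5% creatinine rise, no rejection or death -> benefit.
print(clinical_benefit(FollowUp(False, False, 58.0, 57.0, 120.0, 126.0)))  # True
```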


Subject(s)
Cyclosporine/administration & dosage , Cyclosporine/blood , Heart Transplantation , Immunosuppressive Agents/administration & dosage , Immunosuppressive Agents/blood , Acute Disease , Aged , Cyclosporine/therapeutic use , Dose-Response Relationship, Drug , Female , Follow-Up Studies , Graft Rejection/epidemiology , Humans , Immunosuppressive Agents/therapeutic use , Incidence , Longitudinal Studies , Male , Middle Aged , Postoperative Complications/mortality , Stroke Volume , Treatment Outcome
17.
J Gastrointest Surg ; 2(4): 385-90, 1998.
Article in English | MEDLINE | ID: mdl-9841997

ABSTRACT

The objective of this study was to describe recent trends in the management of mild-to-moderate gallstone pancreatitis and assess patient outcomes. Acute gallstone pancreatitis has traditionally been managed with open cholecystectomy and intraoperative cholangiography during the initial hospitalization. The popularization of endoscopic retrograde cholangiopancreatography (ERCP) and laparoscopic cholecystectomy has made a reassessment necessary. Two consecutive time periods were retrospectively analyzed: prior to laparoscopic cholecystectomy (prelaparoscopic era [PLE]) and after the diffusion of laparoscopic cholecystectomy (laparoscopic cholecystectomy era [LCE]). There were 35 patients in the PLE group and 58 in the LCE group. LCE patients waited 37.1 +/- 63 days from admission until cholecystectomy, compared with 9.8 +/- 14.8 days in the PLE group (P = 0.04). Biliary-pancreatic complications occurred in 24% of LCE patients and only 6% of PLE patients (P = 0.05), nearly always while they were awaiting cholecystectomy (P = 0.009). Patients in either time period who underwent cholecystectomy with intraoperative cholangiography developed fewer pancreatic-biliary complications than those who underwent ERCP prior to cholecystectomy, with or without sphincterotomy. Delaying the interval from pancreatitis to laparoscopic cholecystectomy beyond historical values is associated with a greater risk of recurrent biliary-pancreatic complications, which are not prevented by the use of ERCP. Early cholecystectomy with intraoperative ductal evaluation remains the approach of choice.


Subject(s)
Cholelithiasis/surgery , Pancreatitis/therapy , Acute Disease , Cholangiography/adverse effects , Cholangiopancreatography, Endoscopic Retrograde/adverse effects , Cholecystectomy/adverse effects , Cholecystectomy, Laparoscopic/adverse effects , Cholelithiasis/complications , Female , Humans , Intraoperative Care , Length of Stay , Male , Middle Aged , Pancreatitis/etiology , Postoperative Complications , Recurrence , Retrospective Studies , Risk Factors , Sphincterotomy, Endoscopic/adverse effects , Treatment Outcome
18.
Endoscopy ; 30(5): 457-63, 1998 Jun.
Article in English | MEDLINE | ID: mdl-9693893

ABSTRACT

BACKGROUND AND STUDY AIMS: Determinants of complications after endoscopic retrograde cholangiopancreatography (ERCP) have not yet been completely characterized. PATIENTS AND METHODS: Data were collected from an endoscopic database. Univariate analysis and multivariate logistic regression analysis were used to generate the best model of independent predictors of post-ERCP pancreatitis. RESULTS: The database included 1239 ERCP examinations carried out to investigate suspected choledocholithiasis over a five-year period. From these, 45 patients who developed post-ERCP complications were compared to a random sample of 486 patients who had undergone an uncomplicated ERCP for suspected choledocholithiasis. Univariate analysis demonstrated significant differences between the two patient groups for the following factors: age, using a cut-off point of 59 years (27% vs. 51%, P = 0.002), pancreatic channel opacification (73% vs. 58%, P = 0.05), and absence of common bile duct stones (41% vs. 24%, P = 0.03). Using multivariate logistic regression, the best model for predicting post-ERCP pancreatitis in patients undergoing sphincterotomy included age under 59 years (P = 0.04), and absence of a common bile duct stone (P = 0.004). The model yielded probabilities of developing post-sphincterotomy pancreatitis that ranged from 2.8% if no predictor was present, to 27% when both predictors were present. Among patients in whom a sphincterotomy was not performed, the only significant independent predictor found was pancreatic channel opacification (P = 0.05). CONCLUSION: Age under 59 years, pancreatic channel opacification, and an absence of common bile duct stones at ERCP are all independent predictors of post-ERCP pancreatitis.
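To show how the reported probabilities (2.8% with neither predictor, 27% with both) arise from a fitted logistic model, here is a sketch with hypothetical coefficients chosen only to reproduce those two endpoints; the study's actual intercept and coefficients are not given in the abstract.

```python
# A minimal sketch; the intercept is back-calculated from the 2.8% baseline
# and the coefficient split (1.3 / 1.25) is a hypothetical choice made so
# that both predictors together yield ~27%, matching the abstract.
import math

def predicted_risk(age_under_59: bool, no_cbd_stone: bool) -> float:
    intercept = math.log(0.028 / 0.972)   # logit of the 2.8% baseline risk
    b_age, b_stone = 1.3, 1.25            # hypothetical coefficients
    logit = intercept + b_age * age_under_59 + b_stone * no_cbd_stone
    return 1.0 / (1.0 + math.exp(-logit))

print(f"{predicted_risk(False, False):.1%}")  # 2.8% - no predictor present
print(f"{predicted_risk(True, True):.1%}")    # ~27% - both predictors present
```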


Subject(s)
Cholangiopancreatography, Endoscopic Retrograde , Gallstones/diagnosis , Pancreatitis/etiology , Adolescent , Adult , Aged , Aged, 80 and over , Female , Gallstones/complications , Gallstones/surgery , Humans , Male , Middle Aged , Postoperative Complications/etiology , Retrospective Studies , Risk Factors , Sphincterotomy, Endoscopic , Treatment Outcome
19.
Clin Transplant ; 12(3): 243-9, 1998 Jun.
Article in English | MEDLINE | ID: mdl-9642517

ABSTRACT

To assess the safety profile of Neoral dose adjustment using cyclosporine (CsA) trough levels (C0) compared with levels obtained 2 h after the morning dose (C2), 30 stable adult heart transplant patients 1 yr or more after surgery were converted from Sandimmune to Neoral. After a baseline visit (before conversion), initial follow-up included two visits (2 and 4-6 wk after conversion). After the first visit, patients were randomized to Group I (C0: 100-200 ng/ml) or Group II (C2: 200-400 ng/ml). Abbreviated pharmacokinetics were obtained for the estimation of the AUC0-4 h. Renal function was assessed by serum creatinine and the cimetidine-modified creatinine clearance. C2 correlated better than C0 with the AUC0-4 h (r = 0.91 vs. 0.63). Initial Neoral dose (mg/kg/d) was similar in both groups (2.8 +/- 0.5 and 2.8 +/- 0.8), and was lower in Group II at the second visit (2.0 +/- 0.7 vs. 3.0 +/- 0.6, p = 0.0001). C2 levels decreased in Group II from 912 +/- 438 to 555 +/- 271 ng/ml (p = 0.01), without evidence of acute rejection on endomyocardial biopsies. After the second visit, both groups were monitored with C2, and the range was increased to 300-600 ng/ml. At the last visit (additional follow-up of 5 +/- 1 months), the Neoral dose (mg/kg/d) was reduced to 2.0 +/- 0.3 in Group I (p < 0.001) and 1.8 +/- 0.4 in Group II. Serum creatinine was lower in Group II at the second visit (138 +/- 59 vs. 168 +/- 37 µmol/L, p = 0.01) and was similar in both groups at the last visit. Neoral dose reduction based on C2 levels was not associated with acute rejection. The better correlation with the AUC0-4 h suggests that C2 may be more reliable than C0 for Neoral dose adjustment.
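The abbreviated-pharmacokinetics step estimates AUC0-4h from a few timed levels. The sketch below assumes a trapezoidal-rule estimate (a common choice; the abstract does not state the method) on hypothetical concentration profiles, then correlates C2 and C0 with the AUC as the study does.

```python
# A minimal sketch with hypothetical CsA profiles (ng/ml); the trapezoidal
# rule is an assumed AUC method and the data are illustrative only.
import numpy as np

times = np.array([0.0, 1.0, 2.0, 3.0, 4.0])  # hours post-dose
profiles = np.array([                         # one row per hypothetical patient
    [180.0,  900.0, 1100.0, 800.0, 500.0],
    [150.0,  700.0,  850.0, 650.0, 420.0],
    [220.0, 1100.0, 1300.0, 950.0, 600.0],
    [120.0,  500.0,  650.0, 480.0, 300.0],
])

# Trapezoidal AUC0-4h: average of adjacent levels times the interval width.
auc_0_4 = ((profiles[:, 1:] + profiles[:, :-1]) / 2 * np.diff(times)).sum(axis=1)
c0, c2 = profiles[:, 0], profiles[:, 2]       # trough and 2-h post-dose levels

print("r(C2, AUC0-4h) =", round(float(np.corrcoef(c2, auc_0_4)[0, 1]), 2))
print("r(C0, AUC0-4h) =", round(float(np.corrcoef(c0, auc_0_4)[0, 1]), 2))
```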


Subject(s)
Cyclosporine/pharmacokinetics , Cyclosporine/therapeutic use , Drug Monitoring , Heart Transplantation/immunology , Immunosuppressive Agents/pharmacokinetics , Immunosuppressive Agents/therapeutic use , Adult , Area Under Curve , Chi-Square Distribution , Creatinine/blood , Cyclosporine/administration & dosage , Humans , Immunosuppressive Agents/administration & dosage , Linear Models , Prospective Studies , Random Allocation , Time Factors
20.
Am J Surg ; 175(6): 482-7, 1998 Jun.
Article in English | MEDLINE | ID: mdl-9645777

ABSTRACT

BACKGROUND: Interest in the training and evaluation of laparoscopic skills is extending beyond the realm of the operating room to the use of laparoscopic simulators. The purpose of this study was to develop a series of structured tasks to objectively measure laparoscopic skills. This model was then used to test for the effects of level of training and practice on performance. METHODS: Forty-two subjects (6 each of surgical residents PGY1 to PGY5, 6 surgeons who practice laparoscopy and 6 who do not) were evaluated. Each subject viewed a 20-minute introductory video, then was tested performing 7 laparoscopic tasks (peg transfers, pattern cutting, clip and divide, endolooping, mesh placement and fixation, suturing with intracorporeal or extracorporeal knots). Performance was measured using a scoring system rewarding precision and speed. Each candidate repeated all 7 tasks and was rescored. Data were analyzed by linear regression to assess the relationship of performance with level of residency training for each task, and by ANOVA with repeated measures to test for effects of level of training, of repetition, and of the interaction between level of training and repetition on overall performance. Student's t test was used to evaluate differences between laparoscopic and nonlaparoscopic surgeons and between each of these groups and the PGY 5 level of surgical residents. RESULTS: Significant predictors of overall performance were (a) level of training (P = 0.002), (b) repetition (P < 0.0001), and (c) interaction between level of training and practice (P = 0.001). There was also a significant interaction between level of training and the specific task on performance scores (P = 0.006). When each task was evaluated individually for the 30 residents, 4 of the 7 tasks (tasks 1, 2, 6, 7) showed significant correlation between PGY level and score. A significant difference in performance scores between laparoscopic and nonlaparoscopic surgeons was seen for tasks 1, 2, and 6. CONCLUSIONS: A model was developed to evaluate laparoscopic skills. Construct validity was demonstrated by measuring significant improvement in performance with increasing residency training, and with practice. Further validation will require correlation of performance in the model with skill in vivo.
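The abstract describes a scoring system rewarding precision and speed without giving its formula. The sketch below shows one plausible form (time remaining under a task cutoff minus error penalties); the 300 s cutoff and 10 s-per-error penalty are hypothetical, not the study's actual parameters.

```python
# A minimal sketch of a precision-and-speed score under assumed parameters;
# the study's real scoring rule may differ.
def task_score(completion_time_s: float, cutoff_s: float, errors: int,
               penalty_per_error_s: float = 10.0) -> float:
    """Time remaining under the cutoff minus precision penalties, floored at 0."""
    return max(0.0, cutoff_s - completion_time_s - errors * penalty_per_error_s)

# Example: a peg-transfer attempt finished in 95 s with two dropped pegs.
print(task_score(95.0, 300.0, errors=2))  # 185.0
```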


Subject(s)
Education, Medical, Continuing , General Surgery/education , Internship and Residency , Laparoscopy , Humans , Models, Structural , Teaching Materials , Videotape Recording