Results 1 - 16 of 16
1.
Int J Surg ; 110(5): 2818-2831, 2024 May 01.
Article in English | MEDLINE | ID: mdl-38241354

ABSTRACT

BACKGROUND: Liver transplantation (LT) is a well-established treatment for hepatocellular carcinoma (HCC), but there are ongoing debates regarding outcomes and selection. This study examines the experience of LT for HCC at a high-volume centre. METHODS: A prospectively maintained database was used to identify HCC patients undergoing LT from 2000 to 2020 with at least 3 years of follow-up. Data were obtained from the centre database and electronic medical records. The Metroticket 2.0 HCC-specific 5-year survival scale was calculated for each patient. Kaplan-Meier and Cox-regression analyses were employed to assess survival between groups based on Metroticket score and individual donor and recipient risk factors. RESULTS: Five hundred sixty-nine patients met criteria. Median follow-up was 96.2 months (8.12 years; interquartile range 59.9-147.8). Three-year recurrence-free survival (RFS) and overall survival (OS) were 88.6% (n = 504) and 86.6% (n = 493). Five-year RFS and OS were 78.9% (n = 449) and 79.1% (n = 450). Median Metroticket 2.0 score was 0.9 (interquartile range 0.9-0.95). Tumour size greater than 3 cm (P = 0.012) and increasing tumour number on imaging (P = 0.001) and on explant pathology (P < 0.001) were associated with recurrence. Transplantation within Milan (P < 0.001) or UCSF criteria (P < 0.001) was associated with lower recurrence rates. Increasing alpha-fetoprotein (AFP) values were associated with more HCC recurrence (P < 0.001) and reduced OS (P = 0.008). Chemoembolization was predictive of recurrence in the overall population (P = 0.043) and in those outside Milan criteria (P = 0.038). A receiver-operator curve using Metroticket 2.0 identified an optimal cut-off of projected survival greater than or equal to 87.5% for predicting recurrence. This cut-off predicted RFS (P < 0.001) in the total cohort and both RFS (P = 0.007) and OS (P = 0.016) outside Milan. Recipients of donation after brain death (DBD) grafts (55/478, 13%) or living-donor grafts (3/22, 13.6%) experienced better survival compared with recipients of donation after cardiac death (DCD) grafts (15/58, 25.6%; P = 0.009). Donor age was associated with higher HCC recurrence (P = 0.006). Both total ischaemia time (TIT) greater than 6 hours (P = 0.016) and increasing TIT (P = 0.027) correlated with higher HCC recurrence. The use of DCD grafts for outside-Milan candidates was associated with increased recurrence (P = 0.039) and reduced survival (P = 0.033). CONCLUSION: This large two-centre analysis confirms favourable outcomes after LT for HCC. Tumour size and number, pre-transplant AFP, and Milan criteria remain important recipient HCC-risk factors. Higher donor risk (i.e. donor age, DCD grafts, ischaemia time) was associated with poorer outcomes.
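For readers interested in how a cut-off like the one above can be derived, the sketch below shows one plausible way to pick an optimal threshold on a continuous risk score (here the Metroticket 2.0 projected survival) from a receiver-operator curve. The abstract does not state which criterion was used to select the 87.5% cut-off, so Youden's J statistic is assumed; the file and column names are hypothetical.

```python
# Illustrative sketch only: ROC-based cut-off selection for a continuous score
# predicting HCC recurrence. Youden's J is an assumption; the abstract does not
# specify the criterion. Column names ("metroticket_score", "recurrence") are hypothetical.
import numpy as np
import pandas as pd
from sklearn.metrics import roc_curve, roc_auc_score

df = pd.read_csv("hcc_lt_cohort.csv")            # hypothetical file
score = df["metroticket_score"].to_numpy()        # projected 5-year survival (0-1)
recurred = df["recurrence"].to_numpy()            # 1 = HCC recurrence, 0 = none

# Higher projected survival should mean lower recurrence risk, so the negated
# score is used as the predictor of the "positive" (recurrence) class.
fpr, tpr, thresholds = roc_curve(recurred, -score)
youden_j = tpr - fpr
best = np.argmax(youden_j)
optimal_cutoff = -thresholds[best]                # back on the original survival scale

print(f"AUC: {roc_auc_score(recurred, -score):.3f}")
print(f"Optimal projected-survival cut-off: {optimal_cutoff:.3f}")
```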


Subject(s)
Carcinoma, Hepatocellular , Liver Neoplasms , Liver Transplantation , Humans , Carcinoma, Hepatocellular/surgery , Carcinoma, Hepatocellular/mortality , Carcinoma, Hepatocellular/pathology , Liver Transplantation/mortality , Liver Neoplasms/surgery , Liver Neoplasms/mortality , Liver Neoplasms/pathology , Male , Female , Middle Aged , Risk Assessment , Follow-Up Studies , Aged , Retrospective Studies , Adult , Risk Factors , Neoplasm Recurrence, Local , Kaplan-Meier Estimate
3.
Am J Surg ; 226(1): 77-82, 2023 07.
Article in English | MEDLINE | ID: mdl-36858866

ABSTRACT

BACKGROUND: There is currently no consensus on the surgical management of splenic flexure adenocarcinoma (SFA). METHODS: Patients undergoing surgical resection for SFA between 1993 and 2015 were identified. Postoperative outcomes were compared between patients who underwent segmental resection (SR) vs. anatomical resection (AR). RESULTS: One hundred thirteen patients underwent SR and 89 underwent AR. More patients in the SR group had open resections, but there were otherwise no differences in demographics or surgical characteristics between the two groups. There were no differences in overall (p = 0.29) or recurrence-free (p = 0.37) survival. On multivariable analysis, increased age (HR 1.04, 1.01-1.07, p = 0.005), higher American Society of Anesthesiologists classification (HR 3.1, 1.7-5.71, p < 0.001), and higher tumor stage (HR 8.84, 3.76-20.82, p < 0.001) were predictive of mortality. CONCLUSIONS: Short- and long-term outcomes after SR and AR for SFA are not different, making SR a viable option for the surgical management of SFA.


Subject(s)
Adenocarcinoma , Colon, Transverse , Colonic Neoplasms , Laparoscopy , Humans , Colon, Transverse/surgery , Treatment Outcome , Retrospective Studies , Colectomy , Adenocarcinoma/surgery , Adenocarcinoma/pathology
4.
Am Surg ; 89(12): 5520-5526, 2023 Dec.
Article in English | MEDLINE | ID: mdl-36827614

ABSTRACT

BACKGROUND: The hernia defects that develop in liver transplant recipients tend to be complex. Unfortunately, there is a paucity of data to guide post-transplant hernia management. Our goal was to evaluate the outcomes following laparoscopic ventral hernia repair (LVHR) in liver transplant recipients. METHODS: A retrospective review of a prospectively kept database of liver transplant patients at a single tertiary healthcare facility was completed. All patients between 2007 and 2020 who underwent LVHR for a hernia at their transplant incision site were included. The primary outcome studied was hernia recurrence. Secondary outcomes included time to hernia repair, complications, and length of stay (LOS). RESULTS: There were 89 patients who met inclusion criteria. Overall, 82% were male, mean age was 60 years, and mean body mass index was 30.2 kg/m². Tacrolimus was used by 94.4% and mycophenolate mofetil by 36%. Median time to hernia repair was 16 months, with a mean mesh size of 743 cm² and a length of stay of 3.7 days. None required conversion to an open operation. Postoperative complications included ileus (20.2%), acute kidney injury (11.2%), pneumonia (6.7%), and bleeding requiring re-operation (1.1%). Hernia-related complications included chronic suture site pain (1.1%), seroma requiring intervention (3.3%), surgical site infection (3.3%), nonoperative mesh infection (1.1%), and mesh infection requiring explantation (1.1%). Median follow-up was 23 months. Hernia recurrence occurred in 4.5% of patients, and no predictive variables for recurrence were identified. CONCLUSIONS: Although the hernia defects that develop in liver transplant recipients are complex and their comorbidities significant, LVHR can safely and effectively repair these defects with low rates of recurrence and complications.


Subject(s)
Hernia, Ventral , Laparoscopy , Liver Transplantation , Humans , Male , Middle Aged , Female , Herniorrhaphy , Hernia, Ventral/surgery , Liver Transplantation/adverse effects , Surgical Wound Infection/surgery
5.
Transplant Direct ; 8(11): e1392, 2022 Nov.
Article in English | MEDLINE | ID: mdl-36246002

ABSTRACT

With donation after circulatory death (DCD) liver transplantation (LT), the goal of the recipient implantation procedure is to minimize surgical complexity to avoid a tenuous environment for an already marginal graft. The presence of portal vein thrombosis (PVT) at the time of LT adds surgical complexity, yet, to date, no studies have investigated the utilization of DCD liver grafts for patients with PVT. Methods: All DCD LT performed at Mayo Clinic-Florida, Mayo Clinic-Arizona, and Mayo Clinic-Rochester from 2006 to 2020 were reviewed (N = 771). Patients with PVT at the time of transplant were graded using the Yerdel classification. A 1:3 propensity match between patients with PVT and those without PVT was performed. Results: A total of 91 (11.8%) patients with PVT undergoing DCD LT were identified. Grade I PVT was present in 62.6% of patients, grade II in 27.5%, grade III in 8.8%, and grade IV in 1.1%. At the time of LT, thromboendovenectomy was performed in 89 cases (97.8%). There was no difference in the rates of early allograft dysfunction (43.2% versus 52.4%; P = 0.13) or primary nonfunction (1.1% versus 1.1%; P = 0.41) between the DCD PVT and DCD without PVT groups, respectively. The rate of ischemic cholangiopathy was not significantly different between the DCD PVT (11.0%) and DCD without PVT groups (10.6%; P = 0.92). Graft (P = 0.58) and patient survival (P = 0.08) were similar between the 2 groups. Graft survival at 1, 3, and 5 years was 89.9%, 84.5%, and 79.3% in the DCD PVT group. Conclusions: In appropriately selected recipients with grade I-II PVT, DCD liver grafts can be utilized safely with excellent outcomes.
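The 1:3 propensity match reported in the Methods is not described in detail (covariates, caliper, matching algorithm), so the sketch below is only an illustrative assumption: a logistic-regression propensity score followed by greedy nearest-neighbour matching without replacement. All file and column names are hypothetical.

```python
# Illustrative sketch of a 1:3 propensity-score match (not the study's actual code).
# Covariates and the matching algorithm are assumptions; column names are hypothetical.
import pandas as pd
from sklearn.linear_model import LogisticRegression

df = pd.read_csv("dcd_lt_cohort.csv")                       # hypothetical file
covariates = ["age", "meld", "cold_ischemia_time"]          # hypothetical covariates

model = LogisticRegression(max_iter=1000).fit(df[covariates], df["pvt"])
df["pscore"] = model.predict_proba(df[covariates])[:, 1]

treated = df[df["pvt"] == 1]
controls = df[df["pvt"] == 0].copy()

matched_control_idx = []
for _, row in treated.iterrows():
    # take the 3 unmatched controls closest in propensity score (greedy, no caliper)
    nearest = (controls["pscore"] - row["pscore"]).abs().nsmallest(3).index
    matched_control_idx.extend(nearest)
    controls = controls.drop(nearest)                       # match without replacement

matched = pd.concat([treated, df.loc[matched_control_idx]])
print(matched.groupby("pvt")[covariates].mean())            # crude covariate-balance check
```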

6.
Clin Transplant ; 36(6): e14618, 2022 06.
Article in English | MEDLINE | ID: mdl-35182437

ABSTRACT

BACKGROUND: Centers discard high kidney donor profile index (KDPI) allografts, potentially because of concerns about delayed graft function and prolonged hospital use by kidney transplant recipients (KTRs). We sought to determine whether high KDPI KTRs have excess health care utilization. METHODS: We conducted a retrospective cohort study from a high-volume center analyzing KTRs from January 3, 2011 to April 12, 2015 (n = 652). We measured differences in hospital use, emergency visits, and outpatient visits within the first 90 days between low (≤85%) versus high (>85%) KDPI KTRs, as well as long-term graft function and patient survival. RESULTS: High (n = 107) and low (n = 545) KDPI KTRs had similar length of stay (median = 3 days, P = .66) and readmission rates at 7, 30, and 90 days after surgery (all, P > .05). High KDPI kidneys were not associated with excess utilization of the hospital, emergency services, outpatient transplant clinics, or ambulatory infusion visits on univariate or multivariate analysis (all, P > .05). Low KDPI KTRs had significantly better eGFR at 2 years (low vs. high KDPI: 60.35 vs. 41.54 mL/min, P < .001), but similar 3-year patient and graft survival (both, P > .09). CONCLUSIONS: High and low KDPI KTRs demonstrated similar 90-day risk-adjusted health care utilization, which should encourage use of high KDPI kidneys.


Subject(s)
Kidney Transplantation , Tissue and Organ Procurement , Follow-Up Studies , Graft Survival , Humans , Patient Acceptance of Health Care , Retrospective Studies , Tissue Donors
7.
Ann Surg ; 275(2): e511-e519, 2022 02 01.
Article in English | MEDLINE | ID: mdl-32516231

ABSTRACT

OBJECTIVE: To understand whether reduced lengths of stay after kidney transplantation were associated with excess health care utilization in the first 90 days or with long-term graft and patient survival outcomes. BACKGROUND: Reducing length of stay after kidney transplant has an unknown effect on post-transplant health care utilization. We studied this association in a cohort of 1001 consecutive kidney transplants. METHODS: We retrospectively reviewed 2011-2015 data from a prospectively maintained kidney transplant database from a single center. RESULTS: A total of 1001 patients underwent kidney transplant and were dismissed from the hospital in 3 groups: Early [≤2 days] (19.8%), Normal [3-7 days] (79.4%), and Late [>7 days] (3.8%). Living donor transplants accounted for 34.8% of cases (Early 51%, Normal 31.4%, Late 18.4%, P < 0.001). Early patients had lower delayed graft function rates (Early 19.2%, Normal 32%, Late 73.7%, P = 0.001). By hospital dismissal group, there were no differences in readmissions or emergency room visits at 30 or 90 days. Glomerular filtration rate at 12 months and rates of biopsy-proven acute rejection were also similar between groups. The timing of hospital dismissal was not associated with the risk-adjusted likelihood of readmission. Early and Normal patients had similar graft and patient survival. Late dismissal patients, who had higher rates of cardiovascular complications, had significantly higher late mortality versus Normal dismissal patients in unadjusted and risk-adjusted models. CONCLUSION: Dismissing patients from the hospital 2 days after kidney transplant is safe, feasible, and improves value. It is not associated with excess health care utilization or worse short- or long-term transplant outcomes.


Subject(s)
Kidney Transplantation , Length of Stay/statistics & numerical data , Patient Acceptance of Health Care/statistics & numerical data , Patient Discharge , Adult , Aged , Female , Humans , Male , Middle Aged , Retrospective Studies , Time Factors , Treatment Outcome
8.
Exp Clin Transplant ; 20(6): 616-620, 2022 06.
Article in English | MEDLINE | ID: mdl-32778014

ABSTRACT

In this report, we present a case of successful long-term salvage of a patient with transfusion-related acute lung injury associated with acute respiratory distress syndrome immediately after a liver transplant. The patient was a 29-year-old man with end-stage liver disease due to sclerosing cholangitis who underwent liver transplant. After organ reperfusion, there was evidence of liver congestion, acidosis, coagulopathy, and acute kidney injury. He received 61 units of blood products. Continuous renal replacement therapy was initiated intraoperatively. On arrival to the intensive care unit, the patient was on high-dose pressors; he developed respiratory failure and was immediately placed on veno-arterial extracorporeal membrane oxygenation via open femoral exposure. The patient presented with severe coagulopathy and early allograft dysfunction; therefore, no systemic heparin was administered, and no thrombotic events occurred. He required extracorporeal membrane oxygenation support until posttransplant day 4, when resolution of the respiratory and cardiac dysfunction was noted. At 2 years after liver transplant, the patient has normal liver function, normal cognitive function, and stage V chronic kidney disease. We conclude that extracorporeal membrane oxygenation is a valuable therapeutic approach in patients with cardiorespiratory failure after liver transplant.


Subject(s)
Extracorporeal Membrane Oxygenation , Liver Transplantation , Respiratory Distress Syndrome , Respiratory Insufficiency , Adult , Humans , Liver Transplantation/adverse effects , Male , Respiratory Distress Syndrome/diagnosis , Respiratory Distress Syndrome/etiology , Respiratory Distress Syndrome/therapy , Treatment Outcome
9.
Article in English | MEDLINE | ID: mdl-34360387

ABSTRACT

Over time, the role of information and communication technologies (ICT) has become increasingly important in most areas of our lives, including education. So much so that, during the COVID-19 pandemic, the use of these tools has been essential to continuing the teaching process. One of the great challenges facing teachers today is the need to adapt to this new educational scenario by acquiring the necessary digital skills. The aim of this study was to determine the level of digital competence of teachers in the pre-university key stages of education. To this end, a questionnaire was distributed among education centres and teachers in the Autonomous Community of Extremadura, obtaining 109 valid responses. The analysis methodology was the formation of clusters using the K-means model. The results confirmed that the teachers perceived a medium-high level of knowledge and use of ICT, and that this digital competence is conditioned by factors such as age, experience, gender, and level of education. In conclusion, public administrations are encouraged to facilitate teachers' knowledge and application of ICT according to the profiles identified.
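As a rough illustration of the clustering step mentioned above, the sketch below applies K-means to standardized questionnaire item scores. The abstract does not report the number of clusters or the exact features used, so three clusters and Likert-style item columns are assumed; the file and column names are hypothetical.

```python
# Minimal K-means clustering sketch under stated assumptions; not the study's actual pipeline.
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

responses = pd.read_csv("teacher_ict_survey.csv")       # hypothetical: one row per teacher
items = [c for c in responses.columns if c.startswith("item_")]  # hypothetical item columns

X = StandardScaler().fit_transform(responses[items])    # standardize before clustering
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
responses["cluster"] = kmeans.labels_

# Profile each cluster against demographic factors mentioned in the study
print(responses.groupby("cluster")[["age", "years_experience"]].mean())
```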


Subject(s)
COVID-19 , Universities , Educational Status , Humans , Pandemics , SARS-CoV-2
10.
Clin Transplant ; 35(10): e14439, 2021 10.
Article in English | MEDLINE | ID: mdl-34297440

ABSTRACT

BACKGROUND: Opioids are associated with negative transplant outcomes. We sought to identify patient and center effects on over-prescribing of opioids (>200 oral morphine equivalents [OME]). STUDY DESIGN: Clinical and opioid prescription data (2014-2017) were collected from three academic transplant centers for kidney (KT), liver (LT), and simultaneous liver-kidney transplant (SLK) patients. Multivariable models were used to identify predictors of opioid over-prescribing at discharge and the occurrence of refill prescriptions at 90 days. RESULTS: Three thousand seven hundred two patients underwent transplant in the cohort (KT: n = 2358, LT: n = 1221, SLK: n = 123). More than 80% of recipients were over-prescribed opioids at discharge (median OME [mOME] = 300; IQR 225-375). LT and SLK had the largest prescription sizes (LT mOME 338 [IQR 300-450]; SLK mOME 338 [IQR 225-450]) and refill rates (LT: 64%, SLK: 59%) (all, P < .001). Multivariable analysis indicated that transplant center was a significant predictor of opioid over-prescription after KT and LT (all, P < .001); older age (in KT) and length of stay (LOS) (in LT) were protective factors (both, P < .05). Refill occurrence was associated with initial prescription size and was reduced by older age and initial LOS (all, P < .05). CONCLUSIONS: The wide variation in opioid prescribing patterns has implications for transplant practice innovation, guideline development, and further study.


Subject(s)
Analgesics, Opioid , Pain, Postoperative , Aged , Analgesics, Opioid/therapeutic use , Humans , Length of Stay , Pain, Postoperative/drug therapy , Pain, Postoperative/etiology , Patient Discharge , Practice Patterns, Physicians' , Retrospective Studies
11.
Surgery ; 169(1): 58-62, 2021 01.
Article in English | MEDLINE | ID: mdl-32814633

ABSTRACT

BACKGROUND: Thyroid nodules discovered incidentally during transplant screening may prolong time to transplantation. Although data suggest that the incidence of thyroid cancer increases after solid organ transplantation, the impact on prognosis in differentiated thyroid cancer is not well characterized. METHODS: We performed a retrospective review of patients with a history of thyroid cancer and solid organ transplantation at our institution. RESULTS: A total of 13,037 patients underwent solid organ transplantation, of whom 94 (0.7%) had differentiated thyroid cancer. Of these, 50 patients (53%) had cancer pre-solid organ transplantation, whereas 44 patients (47%) developed cancer post-solid organ transplantation. Papillary histology was most common (88%), followed by follicular (3%), Hürthle cell (3%), and medullary (2%) carcinomas. One patient in the post-transplant cohort died from metastatic thyroid cancer 11.8 years after transplantation. There were 5 patients in the pre-transplant group and 4 patients in the post-transplant group who had recurrent thyroid disease. No patient treated for differentiated thyroid cancer pre-solid organ transplantation experienced disease recurrence after transplantation. Disease-free survival at 5 and 10 years was 95.8% and 92.1% (confidence intervals 84.9-99.2% and 80.0-97.4%) in the pre-solid organ transplantation group vs 89.7% and 84.4% (confidence intervals 80.0-96.3% and 79.0-93.1%) in the post-transplantation group (P = .363), respectively. CONCLUSION: Survival outcomes and recurrence rates in patients with thyroid cancer are not significantly affected by solid organ transplantation. A history of thyroid cancer or discovery of thyroid nodules during transplant screening should not be a contraindication to transplant listing.


Subject(s)
Neoplasm Recurrence, Local/epidemiology , Organ Transplantation/adverse effects , Thyroid Neoplasms/mortality , Adult , Age Factors , Disease-Free Survival , Female , Follow-Up Studies , Humans , Incidence , Male , Middle Aged , Neoplasm Recurrence, Local/etiology , Postoperative Period , Preoperative Period , Prognosis , Retrospective Studies , Risk Factors , Sex Factors , Thyroid Gland/pathology , Thyroid Gland/surgery , Thyroid Neoplasms/diagnosis , Thyroid Neoplasms/etiology , Thyroid Neoplasms/therapy , Thyroidectomy
12.
Pancreas ; 49(4): 568-573, 2020 04.
Article in English | MEDLINE | ID: mdl-32282771

ABSTRACT

OBJECTIVES: We compared risk-adjusted short- and long-term outcomes between standard pancreaticoduodenectomy (SPD) and pylorus-preserving pancreaticoduodenectomy (PPPD). METHODS: The National Cancer Database was queried for the years 2004 to 2014 to identify patients with adenocarcinoma of the pancreatic head undergoing SPD and PPPD. Margin status, lymph node yield, length of stay (LOS), 30- and 90-day mortality, and overall survival were compared. RESULTS: A total of 11,172 patients were identified, of whom 9332 (83.5%) underwent SPD and 1840 (16.5%) PPPD. There was no difference in patient age, sex, stage, tumor grade, radiation treatment, or chemotherapy treatment between the 2 groups. The total number of regional lymph nodes examined, surgical margin status, and overall survival were also comparable. However, patients undergoing PPPD had a shorter LOS (11.3 vs 12.3 days, P < 0.001), lower 30-day mortality (2.5% vs 3.7%, P = 0.02), and lower 90-day mortality (5.5% vs 6.9%, P = 0.03). On multivariate analyses, patients undergoing SPD were at higher risk for 30-day mortality compared with PPPD (odds ratio, 1.51; 95% confidence interval, 1.07-2.13). CONCLUSIONS: Standard pancreaticoduodenectomy and PPPD are oncologically equivalent, yet PPPD is associated with a reduction in postoperative mortality and a shorter LOS.


Subject(s)
Carcinoma, Pancreatic Ductal/surgery , Organ Sparing Treatments/methods , Pancreatic Neoplasms/surgery , Pancreaticoduodenectomy/methods , Pylorus , Adult , Aged , Aged, 80 and over , Carcinoma, Pancreatic Ductal/mortality , Carcinoma, Pancreatic Ductal/therapy , Combined Modality Therapy , Confounding Factors, Epidemiologic , Female , Follow-Up Studies , Humans , Kaplan-Meier Estimate , Length of Stay/statistics & numerical data , Lymph Node Excision , Lymphatic Metastasis , Male , Middle Aged , Pancreatic Neoplasms/mortality , Pancreatic Neoplasms/therapy , Retrospective Studies , Treatment Outcome
13.
Article in English | MEDLINE | ID: mdl-32023942

ABSTRACT

This study focuses on the transparency of financial reporting on emission allowances (EA) and greenhouse gas (GHG) emissions within the European Union Emissions Trading Scheme (EU ETS). In particular, the different accounting treatments adopted by standard setters and professionals were analyzed to evaluate the influence of regulation on the transparency of financial reporting on EA and GHG emissions. Based on a sample of 85 companies registered with the Portuguese, Spanish, and French National Plans of Allocation (NPAs), data collected from the annual reports were analyzed for the 2008-2014 period. The results were obtained using descriptive statistics, logistic regression, and panel data techniques, and they show that the level of transparency of financial reporting on EA and GHG emissions is conditioned by a variety of accounting policies, which compromises the comparability of the financial information. The adoption of the International Accounting Standards Board (IASB) standard set led to greater dispersion in the choice of accounting approach, a higher probability of not disclosing any information, and the adoption of off-balance-sheet policies. Therefore, the regulatory factor is a determinant of the level of transparency of financial reporting on EA and GHG emissions, contributing to the reduction of omission strategies.


Subject(s)
Greenhouse Effect/economics , Greenhouse Gases/economics , European Union , Greenhouse Effect/statistics & numerical data , Humans , Trust
14.
Am J Surg ; 218(6): 1229-1233, 2019 12.
Article in English | MEDLINE | ID: mdl-31421894

ABSTRACT

BACKGROUND: The Choosing Wisely Organization and the American College of Surgeons have issued recommendations for patients >70 years of age with breast cancer regarding screening and the use of radiation therapy (RT) and sentinel lymph node biopsy (SLNB) in early-stage tumors. This study evaluated compliance with and implementation of these recommendations. METHODS: A database of patients undergoing breast cancer surgery from 2002 to 2017 was retrospectively queried. Patients were divided into cohorts before and after the year of each guideline publication. RESULTS: The rate of presentation on mammography was not different before 2009 (65%) vs. after 2009 (66%). RT was given to 57% of patients with T1, ER+/HER2- tumors prior to 2013 vs. 27% after (p < 0.001). SLNB was performed in 91% of patients with T1, grade 1/2, ER+/HER2- tumors prior to 2016 vs. 56% after (p < 0.001). CONCLUSION: Rates of mammography-detected breast cancer have not decreased, but adjuvant RT and SLNB are less frequently performed for low-risk breast cancer in the elderly.


Subject(s)
Breast Neoplasms/diagnosis , Breast Neoplasms/therapy , Practice Guidelines as Topic , Aged , Aged, 80 and over , Female , Humans , Life Expectancy , Mammography , Neoplasm Staging , Retrospective Studies , Sentinel Lymph Node Biopsy , United States
15.
Dis Colon Rectum ; 62(10): 1167-1176, 2019 10.
Article in English | MEDLINE | ID: mdl-30489325

ABSTRACT

BACKGROUND: Primary colorectal lymphoma is rare, representing 0.2% to 0.6% of all colorectal cancers. Because of its low incidence and histologic variety, no treatment guidelines exist. OBJECTIVE: The purpose was to report the experience of primary colorectal lymphoma in an institutional and a national cohort. DESIGN: This was a retrospective cohort study. SETTINGS: The study was conducted with institutional data composed of 3 tertiary referral centers and national data. PATIENTS: Patients with primary colorectal lymphoma were identified within the Mayo Clinic (1990-2016) and the Surveillance, Epidemiology, and End Results database (1990-2014). MAIN OUTCOME MEASURES: Primary outcomes were overall and 5-year survival. RESULTS: For the institutional cohort (N = 82), 5-year survival was 79.9%. Five-year survival was higher for rectal (88.4%) than for colon tumors (77.2%; p = 0.004). On multivariable analysis, age <50 years was associated with higher overall survival (p = 0.04). Left-sided colon masses and aggressive histological subtypes were associated with worse survival (p = 0.04 and p = 0.03). No effect of treatment modality on survival was noted. For the national cohort (N = 2942), 5-year survival was 58.4%. Five-year survival was 61.0% for rectal tumors and 57.8% for colon tumors. On multivariable analysis, factors associated with improved survival were age <70 years (p < 0.0001), female sex (p = 0.005), right-sided masses (p = 0.02), and diagnosis after 2000 compared with 1990-1999 (p < 0.0001). Aggressive pathology (p < 0.0001) and stage III or stage IV presentation compared with stage I (p = 0.02 and p < 0.0001) were associated with worse survival. LIMITATIONS: The institutional cohort was limited by sample size in describing the effect of treatment on survival. A major limitation of the national cohort was the limited ability to describe treatment modalities other than surgery, including chemotherapy and/or no additional treatment. CONCLUSIONS: Poorer survival was noted in elderly patients and in those with aggressive pathology. An overall survival advantage was seen in women in the national cohort. Currently, optimal strategies should follow a patient-centered multidisciplinary approach. See Video Abstract at http://links.lww.com/DCR/A807.


Subject(s)
Colorectal Neoplasms/epidemiology , Lymphoma/epidemiology , Neoplasm Staging , SEER Program , Adolescent , Adult , Aged , Aged, 80 and over , Child , Child, Preschool , Colorectal Neoplasms/diagnosis , Colorectal Neoplasms/therapy , Combined Modality Therapy , Female , Follow-Up Studies , Humans , Incidence , Lymphoma/diagnosis , Male , Middle Aged , Prognosis , Retrospective Studies , Survival Rate/trends , Time Factors , Tomography, X-Ray Computed , United States/epidemiology , Young Adult
16.
Evolution ; 57(11): 2636-43, 2003 Nov.
Article in English | MEDLINE | ID: mdl-14686538

ABSTRACT

Evolution of the red-green visual subsystem in trichromatic primates has been linked to foraging advantages, namely the detection of either ripe fruits or young leaves amid mature foliage. We tested competing hypotheses globally for eight primate taxa: five with routine trichromatic vision, three without. Routinely trichromatic species ingested leaves that were "red shifted" compared to background foliage more frequently than species lacking this trait. Observed choices were not the reddest possible, suggesting a preference for optimal nutritive gain. There were no similar differences for fruits, although red-greenness may sometimes be important in close-range fruit selection. These results suggest that routine trichromacy evolved in a context in which leaf consumption was critical.


Subject(s)
Biological Evolution , Color Perception/physiology , Feeding Behavior , Primates/physiology , Animals , Fruit , Plant Leaves