Results 1 - 20 of 164
1.
J Surg Res ; 270: 555-563, 2022 02.
Article in English | MEDLINE | ID: mdl-34826691

ABSTRACT

BACKGROUND: All-terrain vehicle (ATV) use is widespread; however, little is known about injury patterns and outcomes in geriatric patients. We hypothesized that geriatric patients would have distinct and more severe injuries than non-geriatric adults after ATV trauma. METHODS: A retrospective cohort study was performed using the National Trauma Databank comparing non-geriatric (18-64) and geriatric adults (≥65) presenting after ATV trauma at Level 1 and 2 trauma centers from 2011 to 2015. Demographic, admission, and outcomes data were collected, including injury severity score (ISS), abbreviated injury scale (AIS) score, discharge disposition, and mortality. We performed univariate statistical tests between cohorts and multiple logistic regression models to assess for risk factors associated with severe injury (ISS>15) and mortality. RESULTS: 23,568 ATV trauma patients were identified, of whom 1,954 (8.3%) were geriatric. Geriatric patients had higher rates of severe injury (29.2 vs 22.5%, p<0.0001), thoracic injury (55.2 vs 37.8%, p<0.0001), and spine injury (31.5 vs 26.0%, p<0.0001), but lower rates of abdominal injury (14.6 vs 17.9%, p<0.001) as compared to non-geriatric adults. Geriatric patients had overall lower head injury rates (39.2 vs 42.1%, p=0.01) but more severe head injuries (AIS>3) (36.2 vs 30.2%, p<0.001). Helmet use was significantly lower in geriatric patients (12.0 vs 22.8%, p<0.0001). On multivariate analysis, age increased the odds for both severe injury (OR 1.50, 95% CI 1.31-1.72, p<0.0001) and mortality (OR 5.07, 95% CI 3.42-7.50, p<0.0001). CONCLUSIONS: While severe injury and mortality after ATV trauma occurred in all adults, geriatric adults suffered distinct injury patterns and were at greater risk for severe injury and mortality.
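The severe-injury and mortality odds ratios above come from multiple logistic regression models; a minimal sketch of that kind of model is shown below, with a hypothetical file and column names (severe_injury, geriatric, male, helmet) standing in for the actual databank fields, which are not reproduced here.

```python
# Minimal sketch of a multiple logistic regression for severe injury (ISS > 15).
# The data file and column names are hypothetical; this is not the study's model.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("atv_trauma.csv")  # hypothetical registry extract

# Binary outcome (severe injury) regressed on age group and example covariates.
model = smf.logit("severe_injury ~ geriatric + male + helmet", data=df).fit()
print(model.summary())
```

The same approach with mortality as the outcome would yield the second set of odds ratios.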


Subject(s)
Off-Road Motor Vehicles , Wounds and Injuries , Adult , Aged , Head Protective Devices , Humans , Injury Severity Score , Retrospective Studies , Trauma Centers , Wounds and Injuries/epidemiology , Wounds and Injuries/etiology
2.
J Surg Res ; 262: 85-92, 2021 06.
Article in English | MEDLINE | ID: mdl-33549849

ABSTRACT

BACKGROUND: Snowmobiling is a popular activity that leads to geriatric trauma admissions; however, this unique trauma population is not well characterized. We aimed to compare the injury burden and outcomes for geriatric versus nongeriatric adults injured riding snowmobiles. MATERIALS AND METHODS: A retrospective cohort study was performed using the National Trauma Databank comparing nongeriatric (18-64) and geriatric adults (≥65) presenting after snowmobile-related trauma at level 1 and 2 trauma centers from 2011 to 2015. Demographic, admission, injury, and outcome data were collected and compared. A multivariate logistic regression model assessed for risk factors associated with severe injury (Injury Severity Score >15). Analysis was also performed using chi-square, analysis of variance, and Kruskal-Wallis testing. RESULTS: A total of 2471 adult patients with snowmobile trauma were identified; 122 (4.9%) were geriatric. Rates of severe injury (Injury Severity Score >15) were similar between groups: 27.5% in geriatric patients and 22.5% in nongeriatric adults (P = 0.2). Geriatric patients experienced higher rates of lower extremity injury (50.4 versus 40.3%, P = 0.03), neck injury (4.1 versus 1.4%, P = 0.02), and severe spine injury (20.6 versus 7.0%, P = 0.004). Geriatric patients had longer hospitalizations (5 versus 3 d, P < 0.0001), higher rates of discharge to a facility (36.8% versus 12%, P < 0.0001), and higher mortality (4.1 versus 0.6%, P < 0.0001). Geriatric age did not independently increase the risk for severe injury. CONCLUSIONS: Geriatric age was not a significant predictor of severe injury after snowmobile trauma; however, geriatric patients suffered unique injuries, had longer hospitalizations, had higher rates of discharge to a facility, and had higher mortality. Tailored geriatric care may improve outcomes in this unique sport-related trauma population.
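The univariate comparisons named above (chi-square for proportions, Kruskal-Wallis for skewed measures such as length of stay) can be run as in this rough sketch; the facility-discharge counts are back-calculated approximately from the reported percentages and the stay lengths are made up.

```python
# Sketch of the univariate tests named in the abstract.
from scipy.stats import chi2_contingency, kruskal

# Discharge to a facility (yes/no) by age group; counts approximated from the
# reported 36.8% of 122 geriatric and 12% of 2349 nongeriatric patients.
table = [[45, 77],
         [282, 2067]]
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi-square p = {p:.3g}")

# Hospital length of stay in days (made-up values for illustration).
geriatric_los = [5, 7, 4, 9, 6, 8]
nongeriatric_los = [3, 2, 4, 3, 5, 2]
stat, p = kruskal(geriatric_los, nongeriatric_los)
print(f"Kruskal-Wallis p = {p:.3g}")
```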


Subject(s)
Off-Road Motor Vehicles , Wounds and Injuries/epidemiology , Adolescent , Adult , Aged , Aged, 80 and over , Female , Humans , Injury Severity Score , Leg Injuries/epidemiology , Length of Stay , Logistic Models , Male , Middle Aged , Neck Injuries/epidemiology , Retrospective Studies , Wounds and Injuries/mortality , Young Adult
3.
J Surg Res ; 259: 121-129, 2021 03.
Article in English | MEDLINE | ID: mdl-33279837

ABSTRACT

BACKGROUND: Downhill skiing accounts for a large portion of geriatric sport-related trauma. We assessed the national burden of geriatric versus nongeriatric ski trauma. MATERIALS AND METHODS: Adults presenting to level 1/2 trauma centers after ski-associated injuries from 2011 to 2015 were identified from the National Trauma Data Bank by ICD-9 code. We compared demographics, injury patterns, and outcomes between geriatric (age ≥65 y) and nongeriatric adult skiers (age 18-64 y). A multiple regression analysis assessed for risk factors associated with severe injury (Injury Severity Score >15). RESULTS: We identified 3255 adult ski trauma patients, of whom 16.7% (543) were geriatric. Mean ages for nongeriatric versus geriatric skiers were 40.8 and 72.1 y, respectively. Geriatric skiers more often suffered head injuries (36.7 versus 24.3%, P < 0.0001), severe head injuries (abbreviated injury scale score >3, 49.0 versus 31.5%, P < 0.0001), and thorax injuries (22.2 versus 18.1%, P = 0.03) as compared with nongeriatric skiers. Geriatric skiers were also more often admitted to the ICU (26.5 versus 14.9%, P < 0.0001), more often discharged to a facility (26.7 versus 11.6%, P < 0.0001), and suffered higher mortality rates (1.3 versus 0.4%, P = 0.004). Independent risk factors for severe injury included being male (OR: 1.68, CI: 1.22-2.31), helmeted (OR: 1.41, CI: 1.07-1.85), and having comorbidities (OR: 1.37, CI: 1.05-1.80). Geriatric age was not independently associated with severe injury. CONCLUSIONS: At level 1/2 trauma centers, geriatric age in ski trauma victims was associated with unique injury patterns, higher acuity, increased rates of facility care at discharge, and higher mortality as compared with nongeriatric skiers. Our findings indicate the need for specialized care after high-impact geriatric ski trauma.
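Odds ratios and confidence intervals like those reported for male sex, helmet use, and comorbidities typically follow from the regression coefficients by the usual Wald arithmetic, OR = e^β with a 95% interval of e^(β ± 1.96·SE); the tiny sketch below applies that arithmetic to a hypothetical coefficient rather than the authors' actual output.

```python
# Wald-style odds ratio and 95% CI from a logistic regression coefficient.
# beta and se are hypothetical, not taken from the study's fitted model.
import math

beta, se = 0.52, 0.16
odds_ratio = math.exp(beta)
ci_low, ci_high = math.exp(beta - 1.96 * se), math.exp(beta + 1.96 * se)
print(f"OR {odds_ratio:.2f} (95% CI {ci_low:.2f}-{ci_high:.2f})")
```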


Subject(s)
Cost of Illness , Craniocerebral Trauma/epidemiology , Skiing/injuries , Thoracic Injuries/epidemiology , Trauma Centers/statistics & numerical data , Adolescent , Adult , Age Factors , Aged , Comorbidity , Craniocerebral Trauma/diagnosis , Craniocerebral Trauma/etiology , Craniocerebral Trauma/prevention & control , Databases, Factual , Female , Head Protective Devices/statistics & numerical data , Hospital Mortality , Humans , Injury Severity Score , Intensive Care Units/statistics & numerical data , Length of Stay/statistics & numerical data , Male , Middle Aged , Patient Admission/statistics & numerical data , Retrospective Studies , Risk Factors , Skiing/statistics & numerical data , Thoracic Injuries/diagnosis , Thoracic Injuries/etiology , United States/epidemiology , Young Adult
4.
Am J Transplant ; 15(1): 55-63, 2015 Jan.
Article in English | MEDLINE | ID: mdl-25534656

ABSTRACT

Since the latest revision in US heart allocation policy (2006), the landscape and volume of transplant waitlists have changed considerably. Advances in mechanical circulatory support (MCS) prolong survival, but Status 1A mortality remains high. Several patient subgroups may be disadvantaged by current listing criteria, and geographic disparity remains in waitlist time. This forum on US heart allocation policy was organized to discuss these issues and highlight concepts for consideration in the policy development process. A 25-question survey on heart allocation policy was conducted. Among the attendees/respondents were 84 participants with clinical or published experience in heart transplantation, representing 51 US transplant centers, as well as OPTN/UNOS and SRTR representatives. The survey results and forum discussions demonstrated very strong interest in a change to a further-tiered system that accounts for disadvantaged subgroups and lowers the use of exceptions. However, a heart allocation score is not yet considered viable, owing to uncertainty about the long-term validity of the variables such a score would use in an ever-developing field. There is strong interest in more refined prioritization of patients with MCS complications, highly sensitized patients, and those with severe arrhythmias or restrictive physiology. There is also strong interest in distribution by geographic boundaries modified according to population. Differences of opinion exist between small and large centers.


Subject(s)
Health Policy/trends , Heart Failure/surgery , Heart Transplantation/legislation & jurisprudence , Resource Allocation/legislation & jurisprudence , Tissue and Organ Procurement/legislation & jurisprudence , Humans , Research Report , United States
5.
Am J Transplant ; 12(9): 2487-97, 2012 Sep.
Article in English | MEDLINE | ID: mdl-22776430

ABSTRACT

This randomized, comparative, multinational phase 3b/4 study of patients 1-8 years post cardiac transplantation (mean 3.9 years) evaluated the effect of conversion from a calcineurin inhibitor (CNI) to sirolimus on renal function in patients with renal insufficiency. In total, 116 patients on CNI therapy with GFR 40-90 mL/min/1.73 m² were randomized (1:1) to sirolimus (n = 57) or CNI (n = 59). Intent-to-treat analysis showed the 1-year adjusted mean change from baseline in creatinine clearance (Cockcroft-Gault) was significantly higher with sirolimus versus CNI treatment (+3.0 vs. -1.4 mL/min/1.73 m², respectively; p = 0.004). By on-therapy analysis, values were +4.7 and -2.1, respectively (p < 0.001). Acute rejection (AR) rates were numerically higher in the sirolimus group; 1 AR with hemodynamic compromise occurred in each group. A significantly higher treatment discontinuation rate due to adverse events (AEs; 33.3% vs. 0%; p < 0.001) occurred in the sirolimus group. The most common treatment-emergent AEs significantly more frequent in the sirolimus group were diarrhea (28.1%), rash (28.1%), and infection (47.4%). Conversion to sirolimus from CNI therapy improved renal function in cardiac transplant recipients with renal impairment, but was associated with an attendant AR risk and a higher discontinuation rate attributable to AEs.
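Eligibility and the renal endpoint are framed in terms of Cockcroft-Gault creatinine clearance normalized to 1.73 m²; the sketch below shows one way that calculation is commonly done, assuming the DuBois body-surface-area formula for normalization, which the abstract does not specify.

```python
# Cockcroft-Gault creatinine clearance, expressed per 1.73 m^2 of body surface area.
# The DuBois BSA formula is an assumption; the trial's exact normalization is not stated.

def cockcroft_gault(age_years, weight_kg, creatinine_mg_dl, female):
    """Creatinine clearance in mL/min (Cockcroft-Gault)."""
    crcl = (140 - age_years) * weight_kg / (72 * creatinine_mg_dl)
    return crcl * 0.85 if female else crcl

def bsa_dubois(weight_kg, height_cm):
    """Body surface area in m^2 (DuBois & DuBois)."""
    return 0.007184 * weight_kg ** 0.425 * height_cm ** 0.725

def crcl_per_1_73(age, weight, height, creatinine, female):
    return cockcroft_gault(age, weight, creatinine, female) * 1.73 / bsa_dubois(weight, height)

# Example: a 62-year-old, 80 kg, 175 cm male with serum creatinine 1.6 mg/dL
# falls inside the trial's 40-90 mL/min/1.73 m^2 window (~48).
print(round(crcl_per_1_73(62, 80, 175, 1.6, False), 1))
```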


Subject(s)
Heart Transplantation , Immunosuppressive Agents/therapeutic use , Kidney/physiopathology , Sirolimus/therapeutic use , Aged , Female , Glomerular Filtration Rate , Humans , Kidney Function Tests , Male , Middle Aged
6.
Phys Rev Lett ; 107(14): 144801, 2011 Sep 30.
Article in English | MEDLINE | ID: mdl-22107200

ABSTRACT

Measurements of the spatial and temporal coherence of single, femtosecond x-ray pulses generated by the first hard x-ray free-electron laser, the Linac Coherent Light Source, are presented. Single-shot measurements were performed at 780 eV x-ray photon energy using apertures containing double pinholes in "diffract-and-destroy" mode. We determined a coherence length of 17 µm in the vertical direction, which is approximately the size of the focused Linac Coherent Light Source beam in the same direction. The analysis of the diffraction patterns produced by the pinholes with the largest separation yields an estimate of the temporal coherence time of 0.55 fs. We find that the total degree of transverse coherence is 56% and that the x-ray pulses are adequately described by two transverse coherent modes in each direction. This leads us to the conclusion that 78% of the total power is contained in the dominant mode.
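For readers unfamiliar with the coherent-mode language, the global degree of transverse coherence is obtained from the mode weights β_j as ζ = Σβ_j² / (Σβ_j)², and for a separable beam the two directions multiply; the snippet below evaluates this for an assumed two-mode split per direction, purely as an illustration and not the paper's measured mode spectrum.

```python
# Global degree of transverse coherence from coherent-mode weights:
#   zeta = sum(beta_j**2) / (sum(beta_j))**2  per direction,
# with the 2D value taken as the product of the two directions for a separable beam.
# The weights below are assumed for illustration, not the measured spectrum.
import numpy as np

def degree_of_coherence(weights):
    w = np.asarray(weights, dtype=float)
    return float(np.sum(w**2) / np.sum(w)**2)

beta_x = [0.78, 0.22]   # assumed dominant/secondary mode split, horizontal
beta_y = [0.85, 0.15]   # assumed dominant/secondary mode split, vertical
print(degree_of_coherence(beta_x), degree_of_coherence(beta_y),
      degree_of_coherence(beta_x) * degree_of_coherence(beta_y))
```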

7.
J Trauma ; 69(1): 119-21, 2010 Jul.
Article in English | MEDLINE | ID: mdl-20622586

ABSTRACT

BACKGROUND: Computed tomography (CT) of the thoracic and lumbar (T/L) spine with reformats has become the imaging modality of choice for the identification of T/L spine fractures. The objective of this study was to directly compare chest/abdomen/pelvis CT (CAP CT) with CT with T/L reformats (T/L CT) for the identification of T/L spine fractures. METHODS: Patients who had both a CAP CT scan (5-mm image spacing) and T/L CT reconstruction (2.5-mm image spacing with sagittal and coronal reformats) were selected. A "fracture" group (N = 35) and a "no fracture" group (N = 57) were identified. The type and level of each fracture were recorded. RESULTS: The CAP CT correctly identified all 35 patients with a thoracolumbar fracture (100% sensitivity; 95% confidence interval: 88-100%). A total of 80 separate fracture sites were present in the 35 patients. The CAP CT accurately identified 78 of those fractures (97.5% sensitivity; 95% confidence interval: 90.4-99.6%). The two fractures not identified on the CAP CT were both transverse process fractures in patients with multiple fractures at different levels. CONCLUSION: Patients who have a CAP CT do not require reformats for clearance of the T/L spine.
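The sensitivities above are exact binomial estimates with 95% confidence intervals; a sketch of that calculation with a Clopper-Pearson interval is shown below, with the caveat that the paper does not state which interval method it used, so the bounds may differ slightly from those reported.

```python
# Sensitivity with an exact (Clopper-Pearson) 95% binomial confidence interval.
# The paper's interval method is not stated, so these bounds are illustrative.
from statsmodels.stats.proportion import proportion_confint

for label, detected, total in [("patient-level", 35, 35), ("fracture-site", 78, 80)]:
    low, high = proportion_confint(detected, total, alpha=0.05, method="beta")
    print(f"{label}: sensitivity {detected/total:.1%} (95% CI {low:.1%}-{high:.1%})")
```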


Subject(s)
Lumbar Vertebrae/injuries , Spinal Fractures/diagnostic imaging , Thoracic Vertebrae/injuries , Tomography, X-Ray Computed/methods , Adult , Female , Humans , Lumbar Vertebrae/diagnostic imaging , Male , Middle Aged , Retrospective Studies , Thoracic Vertebrae/diagnostic imaging , Wounds, Nonpenetrating/diagnostic imaging
8.
Injury ; 51(9): 2040-2045, 2020 Sep.
Article in English | MEDLINE | ID: mdl-32631617

ABSTRACT

INTRODUCTION: As the population ages, trauma centers are seeing a significant volume of injured geriatric patients. However, there are limited data on geriatric off-roading incidents. We investigated the injury patterns, severity, and outcomes of geriatric versus younger adult all-terrain vehicle (ATV) and snowmobile-related trauma with the hypothesis that geriatric patients would have higher mortality and worse outcomes. METHODS: The trauma registry at a New England Level 1 trauma center was queried by ICD 9/10 code for adult ATV and/or snowmobile-related trauma from 2011-2019. Data reviewed included demographic, admission, injury, and outcome variables, including injury severity score (ISS), abbreviated injury scale (AIS) score, hospital disposition, and mortality. Patients were stratified by age into younger adults (18-64 years old) versus geriatric (65 years and older). Univariate analysis was performed to compare groups. RESULTS: Over the study period, we identified 390 adult ATV or snowmobile-related trauma patients, of whom 38 were geriatric. The mean ages for the younger adult vs. geriatric cohorts were 41 (SD 13) and 73 (SD 5), respectively. The majority of patients were male (77%). Compared to younger adults, geriatric patients were more often unhelmeted (66 v 38%, p=0.004) and more likely to present after ATV as opposed to snowmobile trauma (71 v 51%, p=0.028). Geriatric patients more often sustained both any chest trauma (68 v 41%, p=0.003) and severe chest trauma (AIS≥3, 55 v 31%, p=0.022), and more often required tube thoracostomy (26 v 12%, p=0.042). Geriatric patients were also more often discharged to a facility (39 v 14%, p<0.001) compared to younger patients. There were no differences between age cohorts regarding arrival Glasgow coma scale scores, ISS>15, length of stay, ventilator days, complications, or mortality. CONCLUSIONS: Following ATV or snowmobile-related trauma, geriatric patients were more likely to sustain severe chest trauma and to require additional care upon hospital discharge as compared to younger adults. Primary prevention should focus on encouraging helmet and chest-protective clothing use in this geriatric population.
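The abstract does not name the specific univariate test; for a small 2×2 comparison such as helmet use, Fisher's exact test is one common choice, sketched below with counts back-calculated approximately from the reported percentages.

```python
# Fisher's exact test on a 2x2 table of helmet use by age group.
# The paper's specific test is not named; counts are approximate back-calculations
# from the reported 66% (of 38) and 38% (of 352) unhelmeted rates.
from scipy.stats import fisher_exact

#            unhelmeted, helmeted
table = [[25, 13],     # geriatric
         [134, 218]]   # younger adults
odds_ratio, p_value = fisher_exact(table)
print(f"OR = {odds_ratio:.2f}, p = {p_value:.4f}")
```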


Subject(s)
Off-Road Motor Vehicles , Wounds and Injuries , Adolescent , Adult , Aged , Head Protective Devices , Humans , Injury Severity Score , Male , Middle Aged , Retrospective Studies , Trauma Centers , Wounds and Injuries/epidemiology , Wounds and Injuries/therapy , Young Adult
9.
Mil Med ; 183(7-8): e257-e260, 2018 07 01.
Article in English | MEDLINE | ID: mdl-29741715

ABSTRACT

Introduction: Little is known regarding the confidence of military surgeons prior to combat zone deployment. Military surgeons are frequently deployed without peers experienced in combat surgery. We hypothesized that forward surgical team experience (FSTE) increases surgeon confidence with critical skill sets. Methods: We conducted a national survey of military-affiliated personnel. We used a novel survey instrument that was piloted and validated by experienced military surgeons to collect demographics, education, practice patterns, and confidence parameters for trauma and surgical critical care skills. Skills were defined as crucial operative techniques for hemorrhage control and resuscitation. Surveyors were blinded to participants, and surveys were returned electronically via a REDCap database. Data were analyzed with SPSS using appropriate models. Significance was set at p < 0.05. Results: Of 174 distributed surveys, 86 were completed. Nine individuals failed to characterize their FSTE, leaving a sample size of 77. At the time of first deployment, 78.4% were alone or with less experienced surgeons and 53.2% had less than 2 yr of post-residency practice. The respondents' confidence in damage control techniques and seven other trauma skills increased with FSTE. After adjusting for years of practice, number of trauma resuscitations performed per month, and pre-deployment training, there remained a significant positive association between FSTE and confidence in damage control, thoracic surgery, extremity/junctional hemorrhage control, trauma systems administration, adult critical care, and airway management. Conclusions: Training programs and years of general surgery practice do not replace FSTE among military surgeons. Pre-deployment training that mimics FST skill sets should be developed to improve military surgeon confidence and outcomes. Level of Evidence: Prognostic and Epidemiologic, Level IV.


Subject(s)
Patient Care Team/standards , Self Efficacy , Surgical Procedures, Operative/psychology , Wounds and Injuries/surgery , Adult , Female , Humans , Male , Middle Aged , Pennsylvania , Surgical Procedures, Operative/standards , Surveys and Questionnaires , Warfare/psychology
10.
J Clin Invest ; 98(5): 1150-7, 1996 Sep 01.
Article in English | MEDLINE | ID: mdl-8787678

ABSTRACT

To determine whether indirect allorecognition is involved in heart allograft rejection, T cells obtained from peripheral blood and graft biopsy tissues were expanded in the presence of IL-2 and tested in limiting dilution analysis (LDA) for reactivity to synthetic peptides corresponding to the hypervariable regions of the mismatched HLA-DR antigen(s) of the donor. Serial studies of 32 patients showed that T cell reactivity to donor allopeptides was strongly associated with episodes of acute rejection. The frequency of allopeptide-reactive T cells was 10-50-fold higher in the graft than in the periphery, indicating that T cells activated via the indirect allorecognition pathway participate actively in acute allograft rejection. In recipients carrying a graft differing by two HLA-DR alleles, the response appeared to target only one of the mismatched antigens of the donor. Indirect allorecognition was restricted by a single HLA-DR antigen of the host and directed against one immunodominant peptide of the donor HLA-DR protein. However, intermolecular spreading was demonstrated in patients with multiple rejection episodes by showing that they develop allopeptide reactivity against the second HLA-DR antigen. These data imply that early treatment to suppress T cell responses through the indirect pathway of allorecognition, such as tolerance induction to the dominant donor determinant, may be required to prevent amplification and perpetuation of the rejection process.
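Limiting dilution analysis rests on a single-hit Poisson model in which the fraction of non-responding cultures falls off as exp(-f × cell dose), so the precursor frequency f is the negative slope of ln(fraction negative) against dose; the sketch below fits that model to made-up well data, since the actual culture readouts are not shown here.

```python
# Single-hit Poisson model for limiting dilution analysis (LDA):
#   fraction_negative = exp(-f * cells_per_well)
# so a zero-intercept fit of ln(fraction_negative) against cell dose gives -f.
# The doses and negative-well fractions below are made up for illustration.
import numpy as np

cells_per_well = np.array([1e4, 3e4, 1e5, 3e5])
fraction_negative = np.array([0.95, 0.86, 0.60, 0.22])

# Least-squares slope through the origin: slope = sum(x*y) / sum(x*x).
x = cells_per_well
y = np.log(fraction_negative)
slope = np.sum(x * y) / np.sum(x * x)
frequency = -slope
print(f"estimated precursor frequency ~ 1 per {1/frequency:,.0f} cells")
```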


Subject(s)
Graft Rejection/immunology , HLA-DR Antigens/immunology , Heart Transplantation/immunology , Peptides/immunology , T-Lymphocytes/immunology , Cells, Cultured , Female , Histocompatibility Testing , Humans , Immune Tolerance , Immunodominant Epitopes , Lymphocyte Activation , Male , Time Factors
11.
Transpl Immunol ; 18(1): 13-21, 2007 Jul.
Article in English | MEDLINE | ID: mdl-17584597

ABSTRACT

The interleukin-2 receptor alpha chain (IL-2Ra, CD25) plays a major part in shaping the dynamics of T cell populations following immune activation, due to its role in T cell proliferation and survival. Strategies to blunt the effector responses in transplantation have been developed by devising pharmaceutical agents to block the IL-2 pathways. However, such strategies could adversely affect the CD25+FOXP3+ T regulatory (T reg) populations, which also rely on interleukin-2 signaling for survival. The present study shows that a cohort of heart allograft recipients treated with Daclizumab (a humanized anti-CD25 antibody) displays FOXP3 expression patterns consistent with functional T regulatory cell populations. High levels of FOXP3 were observed to correlate with a lower incidence of and recovery from acute rejection, as well as lower levels of anti-donor HLA antibody production. Therefore, T reg populations appear fully functional in patients treated with Daclizumab, even when 5 doses were administered. By comparison, patients treated with fewer doses or no Daclizumab had a higher incidence of acute rejection, antibody production, and graft failure. Therefore, our data indicate that Daclizumab treatment does not interfere with the generation of regulatory T cells and has a beneficial effect on heart allograft survival.


Subject(s)
Forkhead Transcription Factors/analysis , Heart Transplantation/immunology , Interleukin-2 Receptor alpha Subunit/antagonists & inhibitors , T-Lymphocytes, Regulatory/immunology , Adolescent , Adult , Aged , Female , HLA Antigens/immunology , Humans , Male , Middle Aged
12.
Oncogene ; 18(28): 4108-19, 1999 Jul 15.
Article in English | MEDLINE | ID: mdl-10435592

ABSTRACT

Tumour suppressor genes and growth regulatory genes are frequent targets for methylation defects that can result in aberrant expression and mutagenesis. We have established a methylation map of the promoter region of the neurofibromatosis (NF1) gene and demonstrated functional sensitivity for methylation at specific sites for the SP1 and CRE binding (CREB) proteins in the NF1 regulatory region. We evaluated the methylation status of CpG dinucleotides within five promoter subregions in the human and mouse homologues of the neurofibromatosis (NF1) genes. Three 5' subregions were found to be consistently methylated in all the tissues analysed. In contrast, DNA methylation was absent in the vicinity of the transcription start site bounded by SP1 recognition sequences. Gelshift assays showed that methylation specifically inhibits the CREB transcription factor from binding to its recognition site at the NF1 transcription start site. Furthermore, SP1 elements within the NF1 promoter are methylation sensitive, particularly when methylation is present on the antisense strand. We propose that for NF1 as with several other tumour suppressor genes, CpG methylation occurs in a complex, site-specific manner with the maintenance of a methylation-free promoter region bounded by SP1 binding sites that allow an accessible promoter to be retained. When these SP1 boundaries are breached, methylation can sweep in, rendering the promoter inaccessible for specific methylation-sensitive transcription factors and leading to a loss of functional integrity of the methylation-free CpG island.


Subject(s)
CpG Islands , Cyclic AMP Response Element-Binding Protein/metabolism , DNA Methylation , DNA/metabolism , Genes, Neurofibromatosis 1 , Promoter Regions, Genetic , Sp1 Transcription Factor/metabolism , Animals , Binding Sites , Binding, Competitive , Gene Expression Regulation , Humans , Mice , Organ Specificity , Polymerase Chain Reaction , Protein Binding
13.
Oncogene ; 12(12): 2623-9, 1996 Jun 20.
Article in English | MEDLINE | ID: mdl-8700521

ABSTRACT

Heterogeneous mutations in the BRCA1 tumour suppressor gene are responsible for a large percentage of inherited breast cancers as well as breast/ovarian cancers in families with a high incidence of both cancer types. Over a hundred BRCA1 mutations have been reported, but little is known of the mechanism(s) responsible for BRCA1 mutagenesis. To determine the significance of specific nucleotide sequences at mutational sites within the BRCA1 gene, we assessed how frequently independent BRCA1 mutations occur at the site of short direct repeats, single nucleotide repeats (homonucleotides) and at CpG and CpNpG motifs. We found that homonucleotide and short direct repeats are commonly associated with small deletions and insertions. Substitution mutations are frequently associated with homonucleotide repeats and with methylatable CpG dinucleotides and CpNpG trinucleotides. Our methylation and sequencing experiments show that CpG and certain CpNpG motifs are methylated, supporting the hypothesis that DNA methylation specificity at these sites may be an important contributor to BRCA1 mutagenesis. We suggest that BRCA1 mutations are acquired by replication errors and are retained by cells through an intricate balancing of replication and repair mechanisms. Such mutations may provide a proliferative advantage for a cell, leading to the tumour phenotype.
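As a toy illustration of the motif tallying described (CpG dinucleotides, CpNpG trinucleotides, and homonucleotide runs), a short regex-based scan over a made-up sequence is sketched below; it is not the authors' analysis pipeline.

```python
# Toy tally of methylatable motifs and homonucleotide runs in a DNA sequence.
# The sequence is made up; this is not the authors' analysis pipeline.
import re

seq = "ATCGGGGACGTCAGCTGCCCGAAAAATCGACGCGT"

cpg_count = len(re.findall(r"(?=CG)", seq))            # overlapping CpG sites
cpnpg_count = len(re.findall(r"(?=C[ACGT]G)", seq))    # CpNpG (includes CpG-containing CGG, etc.)
homonucleotide_runs = re.findall(r"(A{3,}|C{3,}|G{3,}|T{3,})", seq)

print(cpg_count, cpnpg_count, homonucleotide_runs)
```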


Subject(s)
Breast Neoplasms/genetics , Mutation , Neoplasm Proteins/genetics , Transcription Factors/genetics , BRCA1 Protein , Base Composition , Base Sequence , CpG Islands , DNA Transposable Elements , Female , Genetic Predisposition to Disease , Humans , Methylation , Microsatellite Repeats , Molecular Sequence Data , Mutagenesis , Neoplasm Proteins/metabolism , Oligodeoxyribonucleotides , Point Mutation , Sequence Deletion , Transcription Factors/metabolism , Trinucleotide Repeats
14.
Oncogene ; 16(9): 1161-9, 1998 Mar 05.
Article in English | MEDLINE | ID: mdl-9528858

ABSTRACT

Breast cancer is a genetic disease arising from a series of germ-line and/or somatic DNA changes in a variety of genes, including BRCA1 and BRCA2. DNA modifications have been shown to occur by a number of mechanisms that include DNA methylation. In some cases, the aberrant methylation of CpGs within 5' regulatory regions has led to suppression of gene activity. In this report we describe a variation in the pattern of DNA methylation within the regulatory region of the BRCA1 gene. We found no evidence of methylation at CpGs within the BRCA1 promoter in a variety of normal human tissues. However, screening of a series of randomly sampled breast carcinomas revealed the presence of CpG methylation adjacent to the BRCA1 transcription start site. One such methylated CpG occurs at a putative CREB (cAMP-responsive element binding) transcription factor binding site in the BRCA1 promoter. Gel-shift assays with methylated and unmethylated BRCA1/CREB binding site oligonucleotides demonstrate that this site is sensitive to site-specific CpG methylation. These data suggest that aberrant DNA methylation at regulatory sequences in the BRCA1 locus may play a role in the transcriptional inactivation of the BRCA1 gene within subclones of breast tumors. This study represents the first evidence suggesting a role for DNA methylation in the transcriptional inactivation of BRCA1 in human breast cancer.


Subject(s)
Breast Neoplasms/genetics , Cyclic AMP Response Element-Binding Protein/metabolism , DNA Methylation , Dinucleoside Phosphates/metabolism , Genes, BRCA1 , Promoter Regions, Genetic , Regulatory Sequences, Nucleic Acid , Base Sequence , Binding Sites , Exons , Female , Humans , Models, Genetic , Oligodeoxyribonucleotides , Ovarian Neoplasms/genetics , Reference Values , Transcription, Genetic
15.
Circulation ; 104(12 Suppl 1): I177-83, 2001 Sep 18.
Article in English | MEDLINE | ID: mdl-11568052

ABSTRACT

BACKGROUND: The influence of sex on alloreactivity and graft outcome after heart transplantation was evaluated. METHODS AND RESULTS: A retrospective review of 520 consecutive recipients of a primary cardiac allograft between 1992 and 2000 at a single center was performed. The influence of sex on alloreactivity, acute rejection, transplant-related coronary artery disease, and survival was determined. Statistical methods included logistic regression analysis and Kaplan-Meier actuarial survival analysis. Female recipients had an increased prevalence before transplant of idiopathic cardiomyopathy, antinuclear antibodies, and HLA-B8, DR3 haplotypes. After transplant, female sex predicted shorter duration to a first rejection, higher cumulative rejection frequency, and earlier posttransplant production of anti-HLA antibodies. Female recipients had higher early mortality rates (<6 months) that were due to infection. Fatal infections correlated with 2-fold higher cyclosporine levels in female recipients. However, the incidence of transplant-related coronary artery disease developing beyond 1 year after transplant was lower in female than in male recipients. CONCLUSIONS: Females undergoing cardiac transplantation are more likely to manifest features of an underlying autoimmune state. This may predispose to a higher posttransplant risk of allograft rejection and requirement for increased immunosuppression. Earlier diagnosis and management of alloreactivity in female recipients before development of acute rejection and the use of more focused and less globally immunosuppressive agents during established rejections may have a significant effect on the clinical outcome of female cardiac allograft recipients.


Subject(s)
Autoimmune Diseases/epidemiology , Coronary Disease/epidemiology , Graft Rejection/epidemiology , Heart Transplantation , Autoantibodies/blood , Autoimmune Diseases/diagnosis , Autoimmune Diseases/immunology , Cause of Death , Cohort Studies , Comorbidity , Coronary Disease/diagnosis , Coronary Disease/immunology , Demography , Disease-Free Survival , Female , Graft Rejection/immunology , HLA Antigens/immunology , Heart Transplantation/immunology , Heart Transplantation/mortality , Humans , Incidence , Logistic Models , Male , Middle Aged , Multivariate Analysis , Opportunistic Infections/epidemiology , Opportunistic Infections/mortality , Prevalence , Retrospective Studies , Risk Assessment , Risk Factors , Sex Factors , Survival Analysis , Survival Rate
16.
Circulation ; 99(8): 990-2, 1999 Mar 02.
Article in English | MEDLINE | ID: mdl-10051289

ABSTRACT

BACKGROUND: Incomplete suppression of the renin-angiotensin system during long-term ACE inhibition may contribute to symptomatic deterioration in patients with severe congestive heart failure (CHF). Combined angiotensin II type 1 (AT1) receptor blockade and ACE inhibition more completely suppresses the activated renin-angiotensin system than either intervention alone in sodium-depleted normal individuals. Whether AT1 receptor blockade with losartan improves exercise capacity in patients with severe CHF already treated with ACE inhibitors is unknown. METHODS AND RESULTS: Thirty-three patients with severe CHF despite treatment with maximally recommended or tolerated doses of ACE inhibitors were randomized 1:1 to receive 50 mg/d losartan or placebo for 6 months in addition to standard therapy in a multicenter, double-blind trial. Peak aerobic capacity (VO2) during symptom-limited treadmill exercise and NYHA functional class were determined at baseline and after 3 and 6 months of double-blind therapy. Peak VO2 at baseline and after 3 and 6 months was 13.5 ± 0.6, 15.1 ± 1.0, and 15.7 ± 1.1 mL·kg⁻¹·min⁻¹, respectively, in patients receiving losartan and 14.1 ± 0.6, 14.3 ± 0.9, and 13.6 ± 1.1 mL·kg⁻¹·min⁻¹, respectively, in patients receiving placebo (P<0.02 for the treatment group-by-time interaction). Functional class improved by at least one NYHA class in 9 of 16 patients receiving losartan and 1 of 17 patients receiving placebo. CONCLUSIONS: Losartan enhances peak exercise capacity and alleviates symptoms in patients with CHF who are severely symptomatic despite treatment with maximally recommended or tolerated doses of ACE inhibitors.
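The central inference is the treatment group-by-time interaction for peak VO2 across the baseline, 3-month, and 6-month visits; the abstract does not specify the model used, but a repeated-measures approach such as the linear mixed model sketched below (hypothetical file and column names) is one common way such an interaction is tested.

```python
# One way to test a treatment-by-time interaction for peak VO2 with repeated
# measures: a linear mixed model with a random intercept per patient.
# File and column names are hypothetical; the trial's actual analysis is not specified.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("losartan_vo2.csv")   # hypothetical long-format data with columns
                                       # patient_id, group, month (0/3/6), peak_vo2

model = smf.mixedlm("peak_vo2 ~ C(group) * C(month)", data=df,
                    groups=df["patient_id"]).fit()
print(model.summary())   # the interaction terms carry the group-by-time test
```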


Subject(s)
Angiotensin Receptor Antagonists , Angiotensin-Converting Enzyme Inhibitors/administration & dosage , Exercise , Heart Failure/drug therapy , Losartan/administration & dosage , Adult , Aged , Double-Blind Method , Drug Therapy, Combination , Female , Heart Failure/physiopathology , Humans , Male , Middle Aged
17.
J Am Coll Cardiol ; 29(3): 590-6, 1997 Mar 01.
Article in English | MEDLINE | ID: mdl-9060898

ABSTRACT

OBJECTIVES: This study investigated whether maximal exercise performance in patients with heart failure can be improved by acutely decreasing the work of breathing. BACKGROUND: Exertional dyspnea is a frequent limiting symptom in patients with heart failure. It may result from increased work of breathing. METHODS: Fifteen patients with heart failure and nine age-matched normal subjects underwent two maximal exercise tests. Subjects exercised twice, in randomized, single-blind fashion, using room air (RA) and a 79% helium/21% oxygen mixture (He). Respiratory gas analysis, Borg scale recordings of perceived dyspnea, and near-infrared spectroscopy of an accessory respiratory muscle were obtained during exercise. RESULTS: In normal subjects there was no significant difference in peak oxygen uptake (VO2) ([mean ± SD] RA 38 ± 8 vs. He 35 ± 7 ml/kg per min), exercise duration (RA 724 ± 163 vs. He 762 ± 123 s), or peak minute ventilation (RA 97 ± 27 vs. He 97 ± 28 liters/min; all p = NS). Only three of nine control subjects thought that exercise with the He mixture was subjectively easier. In contrast, patients with heart failure exercised an average of 146 s longer with the He mixture (RA 868 ± 293 vs. He 1,014 ± 338 s, p < 0.01). Peak VO2 (RA 19 ± 4 vs. He 18 ± 5 ml/kg per min) and peak minute ventilation (RA 53 ± 12 vs. He 53 ± 15 liters/min) were unchanged (both p = NS). The respiratory quotient at peak exercise was lower with the He mixture (RA 1.05 ± 0.08 vs. He 0.98 ± 0.06, p < 0.05). Thirteen of the 15 patients thought that exercise with the He mixture was subjectively easier (p < 0.02 vs. control group). CONCLUSIONS: In patients with heart failure, pulmonary factors, including respiratory muscle work and airflow turbulence, contribute to limiting exercise performance. Therapeutic interventions aimed at attenuating the work of breathing may be beneficial.


Subject(s)
Cardiac Output, Low/physiopathology , Exercise/physiology , Work of Breathing , Adult , Exercise Test , Female , Heart Failure/physiopathology , Humans , Male , Middle Aged , Respiratory Function Tests , Single-Blind Method
18.
J Am Coll Cardiol ; 22(4 Suppl A): 93A-98A, 1993 Oct.
Article in English | MEDLINE | ID: mdl-8376701

ABSTRACT

Exertional intolerance is a major clinical problem in ambulatory patients with chronic heart failure and is associated with both muscle fatigue and dyspnea. The increased muscle fatigability is most likely caused by a combination of muscle underperfusion and muscle deconditioning; patients frequently exhibit skeletal muscle atrophy, altered muscle metabolism and reduced mitochondrial-based enzyme levels, consistent with deconditioning. The muscle underperfusion is largely due to impaired arteriolar vasodilation within exercising muscle. Exertional dyspnea appears to be due to increased respiratory muscle work mediated by excessive ventilation and decreased lung compliance. Both excessive carbon dioxide production, secondary to increased muscle lactate release, and increased lung dead space contribute to the excessive ventilation. Decreased lung compliance is caused by chronic pulmonary congestion and fibrosis. Optimal management of exercise intolerance in patients with heart failure requires an understanding of the role of these multiple potential contributors to exertional fatigue and dyspnea.


Subject(s)
Exercise Tolerance/physiology , Heart Failure/physiopathology , Biopsy , Dyspnea/etiology , Dyspnea/physiopathology , Fatigue/etiology , Fatigue/physiopathology , Heart Failure/etiology , Humans , Muscles/physiopathology , Respiration/physiology
19.
J Am Coll Cardiol ; 33(5): 1189-95, 1999 Apr.
Article in English | MEDLINE | ID: mdl-10193715

ABSTRACT

OBJECTIVES: The study aimed to determine the risk of death or urgent transplant for patients who survived an initial 6 months on the outpatient heart transplant waiting list when criteria emphasizing reduced peak oxygen consumption are used for transplant candidate selection. BACKGROUND: Waiting time is a key criterion for heart donor allocation. A recent single-center investigation described decreasing survival benefit from transplant for patients who survived an initial 6 months on the outpatient waiting list. METHODS: Kaplan-Meier survival analyses were performed for 80 patients from the Hospital of the University of Pennsylvania (HUP) listed from July 1986 to January 1991, and 132 patients from Columbia-Presbyterian Medical Center (CPMC) listed from September 1993 to September 1995. Survival from the time of outpatient listing for the entire group (ALL) was compared to subsequent survival from 6 months onward for those patients who survived the initial 6 months after placement on the outpatient list (6M). Both urgent transplant and left ventricular assist device implantation were considered equivalent to death; elective transplant was censored. RESULTS: Survival for 6M was not significantly better than ALL at HUP (subsequent 12 months: 60 ± 7 vs. 60 ± 6% [mean ± SD]; p = 0.89) nor at CPMC (subsequent 12 months: 60 ± 6 vs. 48 ± 5%; p = 0.35). Survival for 6M at both centers was substantially lower than survival following transplant from the outpatient list in the United States in 1995. CONCLUSIONS: When high-risk patients are selected for nonurgent transplant listing, mortality remains high, even among those who survive the initial six months after listing. Time accrued on the waiting list remains an appropriate criterion for donor allocation.
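The event definitions above (death, urgent transplant, or LVAD implantation counted as events; elective transplant censored) map directly onto a Kaplan-Meier analysis; the sketch below uses the lifelines package with a hypothetical file and column names, since the waiting-list records themselves are not shown.

```python
# Kaplan-Meier sketch matching the abstract's event definitions:
# death, urgent transplant, or LVAD implantation = event; elective transplant = censored.
# The data file and column names are hypothetical.
import pandas as pd
from lifelines import KaplanMeierFitter

df = pd.read_csv("waitlist.csv")   # hypothetical columns: days_on_list, outcome

event_outcomes = {"death", "urgent_transplant", "lvad"}
df["event"] = df["outcome"].isin(event_outcomes).astype(int)  # elective transplant -> 0 (censored)

kmf = KaplanMeierFitter()
kmf.fit(durations=df["days_on_list"], event_observed=df["event"], label="from listing")
print(kmf.survival_function_.tail())
```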


Subject(s)
Heart Failure/mortality , Heart Transplantation , Outpatients , Waiting Lists , Aged , Female , Follow-Up Studies , Heart Failure/metabolism , Heart Failure/surgery , Humans , Male , Middle Aged , Myocardium/metabolism , Outpatients/statistics & numerical data , Oxygen Consumption , Prospective Studies , Registries/statistics & numerical data , Risk Factors , Survival Rate , Time Factors , Tissue Donors , United States/epidemiology
20.
J Am Coll Cardiol ; 12(6): 1464-9, 1988 Dec.
Article in English | MEDLINE | ID: mdl-3192843

ABSTRACT

The reduced maximal exercise capacity of patients with heart failure has been attributed to skeletal muscle underperfusion with resultant intramuscular lactic acidosis and muscular fatigue. To investigate this hypothesis, the effect of dichloroacetate, a drug that decreases lactate formation by increasing pyruvate oxidation, on the maximal exercise performance of 18 patients with heart failure and reduced ejection fraction (25 ± 9%) was examined. Exercise tests after parenteral dextrose (control) and dichloroacetate were performed 1 week apart. The sequence of interventions was randomized in a double-blind manner. Dichloroacetate decreased blood lactate at rest (control 8.0 ± 2.5 versus dichloroacetate 5.6 ± 2.9 mg/dl), throughout exercise, and at peak exercise (control 26.0 ± 14.3 versus dichloroacetate 19.4 ± 10.8) (all p < 0.05). In contrast, dichloroacetate had no effect on exercise time (control 15.2 ± 6.0 versus dichloroacetate 15.9 ± 6.2 min) or peak exercise oxygen consumption (control 1,280 ± 498 ml/min versus dichloroacetate 1,312 ± 530 ml/min) (both p = NS). In six subjects, dichloroacetate also had no effect at peak exercise on leg blood flow (control 2.8 ± 1.1 versus dichloroacetate 3.0 ± 0.6 liters/min) or femoral vein oxygen saturation (control 12.7 ± 4.1% versus dichloroacetate 12.5 ± 5.7%). These data suggest that intramuscular lactate accumulation is not responsible for muscular fatigue during exercise in patients with heart failure.
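Because each patient exercised after both dextrose and dichloroacetate, the within-patient comparisons reported here are paired; the abstract does not name the test used, but a paired t-test, sketched below with made-up values, is the conventional choice for such a crossover comparison.

```python
# Paired comparison of peak-exercise blood lactate under the two interventions.
# Values are made up; the abstract does not name the specific test used.
from scipy.stats import ttest_rel

lactate_control = [26.1, 31.5, 18.2, 40.0, 22.4, 28.7]   # mg/dl, dextrose arm
lactate_dca     = [19.0, 24.8, 14.1, 30.5, 17.9, 21.2]   # mg/dl, dichloroacetate arm
t_stat, p_value = ttest_rel(lactate_control, lactate_dca)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```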


Subject(s)
Acetates/pharmacology , Dichloroacetic Acid/pharmacology , Exercise , Heart Failure/physiopathology , Acidosis, Lactic/complications , Adult , Aged , Fatigue/etiology , Humans , Lactates/blood , Lactic Acid , Leg/blood supply , Male , Middle Aged , Muscles/metabolism , Oxygen Consumption , Regional Blood Flow/drug effects