Results 1 - 14 of 14
1.
J Prev Alzheimers Dis ; 6(2): 90-99, 2019.
Article in English | MEDLINE | ID: mdl-30756115

ABSTRACT

BACKGROUND: Randomized placebo-controlled trials in the development of disease-modifying treatments for Alzheimer's disease are typically of short duration (12-18 months), and health economic modeling requires extrapolation of treatment effects beyond the trial period. OBJECTIVES: To investigate whether observational data can be used to extrapolate data from open-label trials, we compared outcomes (cognition, function, behavior) over 36 months for patients with mild Alzheimer's disease dementia in the GERAS observational study (proxy for placebo control) with those of the mild Alzheimer's disease population on active treatment (solanezumab) in two 18-month randomized placebo-controlled trials (EXPEDITION and EXPEDITION2) and the additional 18-month open-label extension study (EXPEDITION-EXT). DESIGN AND SETTING: Analysis of longitudinal data from patients with mild Alzheimer's disease dementia in the GERAS observational study (conducted in France, Germany and the United Kingdom) and the EXPEDITION program (conducted in Europe, North America, South America, Asia and Australia). PARTICIPANTS: European and North American community-living patients, aged ≥55 years, with probable Alzheimer's disease dementia and their caregivers. Mild Alzheimer's disease dementia was defined as a Mini-Mental State Examination score of 20-26 in EXPEDITION and 21-26 in GERAS. INTERVENTION: Active treatment in both randomized placebo-controlled trials and the open-label extension study was intravenous solanezumab 400 mg every 4 weeks. Patients in GERAS were receiving treatment as part of standard care. MEASUREMENTS: Between-group differences for changes from baseline over 36 months in cognitive function, ability to perform activities of daily living, and behavioral and psychological symptoms of dementia were assessed using models stratified by propensity score. RESULTS: At baseline, patients and caregivers participating in GERAS were significantly older than those in the EXPEDITION studies, and the GERAS patient cohort had fewer years of education and a shorter time since diagnosis of Alzheimer's disease. The baseline mean Mini-Mental State Examination score of the GERAS cohort was significantly higher (indicating better cognition) than that of patients receiving placebo or active treatment in the pooled EXPEDITION studies. Baseline functional ability scores were significantly lower for the GERAS cohort, indicating poorer functioning. Propensity score stratification achieved a good balance in the baseline variables between GERAS and the two EXPEDITION arms. Over 18 months, least squares mean changes from baseline in outcome measures were similar in the GERAS cohort and the pooled placebo groups from the randomized controlled trials. Also, the 18-month results for the comparison between the GERAS cohort and the pooled active treatment groups from the randomized controlled trials were generally similar to those reported for the comparison with the control group in the randomized trials. Comparison of active treatment (EXPEDITION-EXT) and observational study (GERAS, as proxy control) results over 36 months of the open-label trial showed a significantly smaller decline in activities of daily living (instrumental and basic) in the active treatment group, reflecting better functioning, but no between-group differences at 36 months for cognitive function or behavioral and psychological symptoms of dementia.
CONCLUSIONS: Comparing results from clinical trials and observational studies (real-world data) may be a useful methodological approach for informing long-term outcomes in Alzheimer's disease drug development and could be used to inform health economic modeling. Further research using this methodological approach is needed.
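
The comparison described above relies on propensity score stratification to balance the observational and trial cohorts. As a rough illustration of how such a stratified comparison can be set up, the sketch below uses pandas and scikit-learn; the column names (in_trial, mmse_baseline, change_36m, etc.) are invented placeholders, not fields from the GERAS or EXPEDITION datasets.

```python
# Minimal sketch of a propensity-score-stratified comparison of an observational
# cohort (proxy control) with a trial arm; all column names are hypothetical.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

def stratified_difference(df: pd.DataFrame) -> float:
    """Assumed columns: 'in_trial' (1 = trial arm, 0 = observational cohort),
    'age', 'mmse_baseline', 'years_education', 'change_36m' (outcome)."""
    covariates = ["age", "mmse_baseline", "years_education"]
    df = df.dropna(subset=covariates + ["in_trial", "change_36m"]).copy()
    # 1. Estimate each patient's propensity to be in the trial arm.
    model = LogisticRegression(max_iter=1000).fit(df[covariates], df["in_trial"])
    df["ps"] = model.predict_proba(df[covariates])[:, 1]
    # 2. Stratify patients into propensity-score quintiles.
    df["stratum"] = pd.qcut(df["ps"], q=5, labels=False, duplicates="drop")
    # 3. Average the within-stratum between-group differences in outcome.
    diffs = []
    for _, stratum in df.groupby("stratum"):
        means = stratum.groupby("in_trial")["change_36m"].mean()
        if {0, 1} <= set(means.index):
            diffs.append(means[1] - means[0])
    return float(np.mean(diffs))
```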


Subject(s)
Alzheimer Disease/drug therapy; Antibodies, Monoclonal, Humanized/therapeutic use; Control Groups; Observational Studies as Topic; Randomized Controlled Trials as Topic; Activities of Daily Living; Aged; Aged, 80 and over; Alzheimer Disease/nursing; Alzheimer Disease/physiopathology; Alzheimer Disease/psychology; Caregivers; Cognition; Female; Humans; Longitudinal Studies; Male; Middle Aged; Quality of Life; Treatment Outcome
2.
J Prev Alzheimers Dis ; 4(2): 72-80, 2017.
Article in English | MEDLINE | ID: mdl-29186278

ABSTRACT

BACKGROUND: While functional loss forms part of the current diagnostic criteria used to identify dementia due to Alzheimer's disease, the gradual and progressive nature of the disease makes it difficult to recognize clinically relevant signposts that could be helpful in making treatment and management decisions. Having previously observed a significant relationship between stages of functional dependence (the level of assistance patients require consequent to Alzheimer's disease deficits, derived from the Alzheimer's Disease Cooperative Study - Activities of Daily Living Scale) and cognitive severity, we investigated whether measures of functional dependence could be utilized to identify clinical milestones of Alzheimer's disease progression. OBJECTIVES: To describe the patterns of change in dependence over the course of 18 months in groups stratified according to cognitive Alzheimer's disease dementia severity (determined using the Mini-Mental State Examination score) and to identify characteristics associated with patients showing worsening dependence (progressors) versus those showing no change or improvement (non-progressors). DESIGN: Analysis of longitudinal data from the GERAS study. SETTING: GERAS is an 18-month prospective, multicenter, naturalistic, observational cohort study reflecting the routine care of patients with Alzheimer's disease in France, Germany, and the United Kingdom. PARTICIPANTS: 1495 community-living patients, aged ≥55 years, diagnosed with probable Alzheimer's disease dementia, and their caregivers. MEASUREMENTS: Dependence levels, cognitive function, behavioral symptoms, caregiver burden, and cost were assessed at baseline and at 18 months. RESULTS: Of 971 patients having both baseline and 18-month data, 408 (42%) were progressors and 563 (58%) were non-progressors. This general pattern held for all three levels of baseline Alzheimer's disease dementia severity - mild (Mini-Mental State Examination score 21-26), moderate (15-20) or moderately severe/severe (<15) - with 40-45% of each group identified as progressors and 55-60% as non-progressors. No baseline differences were seen between progressors and non-progressors in cognitive scores or behavioral symptoms, although progressors had significantly shorter times since diagnosis and showed milder functional impairment. Baseline factors predictive of increasing dependence over 18 months included more severe cognitive impairment, living with others, and having multiple caregivers. A higher level of initial dependence was associated with less risk of dependence progression. Total societal costs of care also increased with greater dependence. CONCLUSIONS: In this large cohort, 42% of Alzheimer's disease dementia patients at all levels of cognitive severity became more dependent within 18 months of observation while 58% did not progress. Dependence levels may be considered meaningful interim clinical milestones that reflect Alzheimer's disease-related functional deficits, although a time frame that extends beyond 18 months may be necessary to observe changes if used in clinical trials or other longitudinal studies. Recognition of predictors of greater dependence offers opportunities for intervention.
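
For illustration only, the snippet below shows one way the progressor/non-progressor split and its breakdown by baseline MMSE severity band could be tabulated; the column names are assumptions, and the published analysis derived progression from ADCS-ADL-based dependence levels rather than this simplified flag.

```python
# Hypothetical sketch: flag "progressors" (worsening dependence over 18 months)
# and tabulate their share within each baseline MMSE severity band.
import pandas as pd

def progressor_summary(df: pd.DataFrame) -> pd.Series:
    """Assumed columns: 'dependence_0m', 'dependence_18m' (higher = more
    dependent) and 'mmse_baseline'."""
    df = df.dropna(subset=["dependence_0m", "dependence_18m", "mmse_baseline"]).copy()
    df["progressor"] = df["dependence_18m"] > df["dependence_0m"]
    severity = pd.cut(
        df["mmse_baseline"],
        bins=[-1, 14, 20, 26],  # <15, 15-20, 21-26 as in the abstract
        labels=["moderately severe/severe", "moderate", "mild"],
    )
    # Proportion of progressors within each severity band.
    return df.groupby(severity, observed=True)["progressor"].mean()
```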


Subject(s)
Alzheimer Disease/diagnosis; Disease Progression; Activities of Daily Living; Aged; Alzheimer Disease/economics; Alzheimer Disease/psychology; Alzheimer Disease/therapy; Caregivers; Cognition; Cost of Illness; Female; France; Germany; Humans; Longitudinal Studies; Male; Mental Status and Dementia Tests; Prospective Studies; Severity of Illness Index; United Kingdom
3.
J Prev Alzheimers Dis ; 2(2): 115-120, 2015.
Article in English | MEDLINE | ID: mdl-28775969

ABSTRACT

BACKGROUND: Because Alzheimer's disease (AD) is characterized by a gradual decline, it can be difficult to identify distinct clinical milestones that signal disease advancement. Adapting a functional scale may be a useful way of staging disease progression that is more informative for healthcare systems. OBJECTIVES: To adapt functional scale scores into discrete levels of dependence as a way of staging disease progression that is more informative to care providers and stakeholders who rely on the functional impact of diseases to determine access to supportive services and interventions. DESIGN: Analysis of data from the GERAS study. SETTING: GERAS is an 18-month prospective, multicenter, naturalistic, observational cohort study reflecting the routine care of patients with AD in France, Germany, and the United Kingdom. PARTICIPANTS: Data were from baseline results of 1497 community-living patients, aged ≥55 years, diagnosed with probable AD and their caregivers. MEASUREMENTS: We used data from the Alzheimer's Disease Cooperative Study Activities of Daily Living Inventory (ADCS-ADL) and mapped items onto established categories of functional dependence, validated using clinical and economic measures. Cognitive function, behavioral symptoms, caregiver burden, and cost were assessed. Based on stages of functional dependence described by the Dependence Scale, individual ADCS-ADL items were used to approximate 6 dependence levels. RESULTS: There was a significant relationship between assigned level of dependence derived from the ADCS-ADL score and cognitive severity category. As the assigned level of dependence increased, the associated clinical and economic indicators demonstrated a pattern of greater disease severity. CONCLUSIONS: This mapping provides initial support for dependence levels as appropriate interim clinical milestones that characterize the functional deficits associated with AD.
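
As a purely hypothetical sketch of the kind of item-to-level mapping described, the function below assigns a coarse dependence level from ADCS-ADL-style item responses; the item groupings and thresholds are invented for illustration and do not reproduce the published GERAS mapping.

```python
# Illustrative only: map a set of ADCS-ADL item responses onto coarse dependence
# levels. Item names, groupings and cut-offs here are invented, not the study's.
from typing import Dict

def assign_dependence_level(items: Dict[str, int]) -> int:
    """items maps ADCS-ADL item names to scores (0 = unable / fully dependent).
    Returns a dependence level from 0 (independent) to 5 (fully dependent)."""
    basic = ["eating", "walking", "toileting", "bathing", "grooming", "dressing"]
    instrumental = ["using_phone", "shopping", "preparing_meals", "handling_money"]
    basic_impaired = sum(items.get(i, 0) == 0 for i in basic)
    instr_impaired = sum(items.get(i, 0) == 0 for i in instrumental)
    if basic_impaired >= 4:
        return 5
    if basic_impaired >= 2:
        return 4
    if basic_impaired >= 1:
        return 3
    if instr_impaired >= 2:
        return 2
    if instr_impaired >= 1:
        return 1
    return 0
```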

4.
J Nutr Health Aging ; 18(7): 677-84, 2014 Jul.
Article in English | MEDLINE | ID: mdl-25226106

ABSTRACT

OBJECTIVES: This study aimed to describe the baseline characteristics of informal carers of community-living Alzheimer's disease (AD) patients by AD severity group and to identify factors associated with two measures of caregiver burden. DESIGN AND SETTING: GERAS is a prospective observational study in France, Germany, and the UK, designed to assess costs and resource use associated with AD, for patients and their caregivers, stratified by disease severity. PARTICIPANTS: 1497 community-dwelling AD patients and their primary caregivers. MEASUREMENTS: Subjective caregiver burden, assessed using the Zarit Burden Interview (ZBI), and time spent supervising patients during the month before the baseline visit (an objective measure of burden, captured using the Resource Utilization in Dementia instrument) were recorded. Separate multiple linear regression analyses using ZBI total score and caregiver supervision time as dependent variables were performed to identify patient and caregiver factors independently associated with caregiver burden. RESULTS: Increasing AD severity was associated with both subjective caregiver burden (ZBI total score) and overall caregiver time, which includes supervision time (both p<0.001, ANOVA). Better patient functioning (on instrumental activities of daily living) was independently associated with both a lower ZBI total score and less supervision time, whereas higher levels of caregiver distress due to patient behavior were associated with greater caregiver burden. Other factors independently associated with an increased ZBI total score included younger caregiver age, caregiver self-reported depression, caring for a male patient, and longer time since AD diagnosis. Caregivers living with the patient, being a male caregiver, patient living in a rural location, higher patient behavioral problem subdomain scores for apathy and psychosis, more patient emergency room visits, not receiving food delivery, and receiving financial support for caregiving were all associated with greater caregiver supervision time. CONCLUSION: Our results show that subjective caregiver burden and caregiver time are influenced by different factors, reinforcing the need to consider both aspects of caregiving when trying to minimize the burden of AD. However, interventions that minimize caregiver distress and improve patient functioning may affect both subjective and objective burden.
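
A minimal sketch of one of the two regressions described (ZBI total score as the dependent variable) might look like the following with statsmodels; the predictor names are placeholders rather than the study's exact covariate set.

```python
# Sketch of a multiple linear regression of subjective caregiver burden;
# column names are hypothetical, not the GERAS dataset fields.
import pandas as pd
import statsmodels.formula.api as smf

def fit_zbi_model(df: pd.DataFrame):
    """Assumed columns: zbi_total, caregiver_age, caregiver_depressed (0/1),
    patient_iadl, caregiver_distress, patient_male (0/1), years_since_diagnosis."""
    formula = (
        "zbi_total ~ caregiver_age + caregiver_depressed + patient_iadl"
        " + caregiver_distress + patient_male + years_since_diagnosis"
    )
    model = smf.ols(formula, data=df).fit()
    return model.summary()  # coefficients show independent associations
```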


Subject(s)
Alzheimer Disease/economics; Caregivers/psychology; Cost of Illness; Self Report; Activities of Daily Living; Aged; Alzheimer Disease/diagnosis; Cross-Sectional Studies; Depression/epidemiology; Female; Follow-Up Studies; France; Germany; Humans; Linear Models; Longitudinal Studies; Male; Middle Aged; Multivariate Analysis; Prospective Studies; Residence Characteristics; United Kingdom
6.
Int J Clin Pract ; 61(11): 1850-62, 2007 Nov.
Article in English | MEDLINE | ID: mdl-17850306

ABSTRACT

AIMS: This report describes patterns of treatment changes with the phosphodiesterase type 5 (PDE5) inhibitors tadalafil, sildenafil and vardenafil, and variables associated with those treatment changes, during the 6-month, prospective, pan-European Erectile Dysfunction Observational Study (EDOS). METHODS: EDOS observed 8047 men ≥18 years old with erectile dysfunction (ED), who began or changed ED therapy as part of their routine healthcare. Patients could change ED treatment at any time during EDOS. Data were collected at baseline and at 3 (±1) and 6 (±1) months. Analyses included ED treatment-naïve patients with complete follow-up who were prescribed a PDE5 inhibitor at baseline (n = 4026). RESULTS: Most patients, regardless of what PDE5 inhibitor they were prescribed at baseline, continued on that same PDE5 inhibitor throughout the study. Continuation rates were approximately 89% in the tadalafil cohort, vs. 63-64% in the sildenafil and vardenafil cohorts. The variables most strongly associated with increased risk of switching were prescription of sildenafil or vardenafil, vs. tadalafil, at baseline (odds ratios 4.43 and 4.14 respectively; p < 0.0001). Of patients who switched from tadalafil to another treatment, nearly 25% had switched back to tadalafil by study end. In contrast, of patients who switched from sildenafil or vardenafil, <10% from each cohort had switched back to their original treatment by study end. CONCLUSION: The data suggest that tadalafil treatment in treatment-naïve ED patients may increase their likelihood of treatment continuation. These findings should be interpreted conservatively due to the observational nature of the study.
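
The reported odds ratios for switching could, in principle, come from a logistic model of the kind sketched below; the variable names and covariates are assumptions, not the EDOS analysis specification.

```python
# Hedged sketch: odds ratios for switching treatment from a logistic regression,
# with tadalafil as the reference baseline drug; all column names are assumed.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

def switching_odds_ratios(df: pd.DataFrame) -> pd.Series:
    """Assumed columns: 'switched' (0/1), 'baseline_drug' (categorical:
    'tadalafil', 'sildenafil', 'vardenafil'), 'age', 'ed_severity'."""
    model = smf.logit(
        "switched ~ C(baseline_drug, Treatment('tadalafil')) + age + ed_severity",
        data=df,
    ).fit(disp=False)
    return np.exp(model.params)  # exponentiated coefficients = odds ratios
```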


Subject(s)
Carbolines/therapeutic use; Erectile Dysfunction/drug therapy; Imidazoles/therapeutic use; Patient Satisfaction; Phosphodiesterase Inhibitors/therapeutic use; Piperazines/therapeutic use; Sulfones/therapeutic use; Adolescent; Adult; Aged; Cohort Studies; Follow-Up Studies; Humans; Male; Middle Aged; Patient Compliance; Prospective Studies; Purines/therapeutic use; Regression Analysis; Severity of Illness Index; Sildenafil Citrate; Surveys and Questionnaires; Tadalafil; Time Factors; Treatment Outcome; Triazines/therapeutic use; Vardenafil Dihydrochloride
7.
Int J Clin Pract ; 60(11): 1386-93, 2006 Nov.
Article in English | MEDLINE | ID: mdl-17073836

ABSTRACT

A multicentre, non-randomised, open-label study assessed whether personal distress caused by erectile dysfunction (ED) affected psychosocial outcomes of tadalafil treatment. Eligible Swedish men at least 18 years old reporting a ≥3-month history of ED were stratified into two groups (manifest or mild/no distress) based upon a distress question administered at enrollment. Tadalafil 20 mg was taken as needed for 8 weeks. The primary outcome was the difference between the two distress groups in change from baseline in the Psychological and Interpersonal Relationship Scales (PAIRS) spontaneity domain. Secondary outcome measures were PAIRS sexual self-confidence and time concerns domains, Life Satisfaction (LiSat-11) checklist and a Global Assessment of Treatment Response. The study also assessed tolerability. Of 662 men enrolled, 88% had manifest distress and 12% had mild/no distress. Baseline-to-endpoint changes for PAIRS domains were not significantly different between groups. Baseline-to-endpoint changes in LiSat-11 items were not significantly different between groups except for satisfaction with sexual life. Satisfaction with partner relationship and family life, which was below normal at baseline compared with men without ED, normalised by endpoint. Over 90% of men reported improved erection and ability to engage in sexual activity. The most common treatment-emergent adverse events were headache, myalgia, dyspepsia, flushing and back pain. One man discontinued because of myalgia; 630 (95%) completed the study. In conclusion, erectile distress levels vary among patients with ED and distress can affect intra-familial aspects of life, which may have implications for clinical practice. However, distress does not appear to hinder improvement in either mechanical or psychosocial outcomes of tadalafil treatment.
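
As a simple illustration of the between-group comparison of change from baseline described above, the sketch below runs an unadjusted Welch's t-test on a PAIRS domain; the column names are hypothetical, and the published analysis may well have adjusted for baseline covariates rather than using this simple test.

```python
# Minimal sketch: compare change from baseline in a PAIRS domain between the
# two distress groups; column names are assumptions, not the study's dataset.
import pandas as pd
from scipy import stats

def compare_change(df: pd.DataFrame):
    """Assumed columns: 'distress_group' ('manifest' or 'mild_none'),
    'pairs_spontaneity_0w', 'pairs_spontaneity_8w'."""
    change = df["pairs_spontaneity_8w"] - df["pairs_spontaneity_0w"]
    manifest = change[df["distress_group"] == "manifest"].dropna()
    mild_none = change[df["distress_group"] == "mild_none"].dropna()
    # Welch's t-test (unequal variances) on the change scores.
    return stats.ttest_ind(manifest, mild_none, equal_var=False)
```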


Subject(s)
Carbolines/therapeutic use; Erectile Dysfunction; Phosphodiesterase Inhibitors/therapeutic use; Quality of Life; Adult; Aged; Aged, 80 and over; Erectile Dysfunction/drug therapy; Erectile Dysfunction/psychology; Humans; Interpersonal Relations; Male; Middle Aged; Patient Satisfaction; Surveys and Questionnaires; Sweden; Tadalafil; Treatment Outcome
9.
Clin Transpl ; : 105-13, 2000.
Article in English | MEDLINE | ID: mdl-11512304

ABSTRACT

RENAL TRANSPLANT OUTCOME: Analysis of 5-year transplant survival in the UK showed a number of significant factors influencing outcome of adult cadaveric renal transplantation. Data from 5,963 first grafts and 1,078 regrafts carried out between 1990 and 1997 showed year of graft, recipient age and diabetes, donor age, kidney exchange between centres and HLA matching to influence 5-year outcome. The most important prognostic factor was donor age: the risk of transplant failure within 5 years for grafts using kidneys from donors aged 60 years and over was double that of grafts using donors aged 18-34 years. Unlike the effect of donor age, the influence of HLA matching would appear to be diminishing with time. In contrast to transplants in the 1980s, the difference in 5-year transplant survival between 000 mismatched and favourably matched (100, 010 or 110 mismatched) transplants is no longer significant. An analysis of posttransplant survival for first grafts in different epochs (0-3 months, 3 months to 3 years and beyond 3 years) showed that one factor affected short-term outcome (exchange of kidneys between centres), whereas others affected outcome throughout the epochs (most notably donor age, recipient age and recipient diabetes). RECIPIENT AND DONOR AGE MATCHING: The mean recipient age in the UK and Republic of Ireland increased by 5 years between 1981 and 1990 but has remained at approximately 45 years since then. The mean donor age increased by 7 years to 42.5 years (s.e. 0.5) between 1981 and 1991 and since then has increased at a slower rate to 43.4 years (s.e. 0.5) in 1998. The mean donor-recipient age difference for more than 15,000 transplants carried out between 1990 and 1998 has decreased, primarily due to increasing donor age over this time. However, the introduction of a new Kidney Allocation Scheme in the UK in July 1998, part of which is aimed at minimising age differences, has increased the likelihood that recipients aged over 60 years will be allocated grafts from donors closer to their own age than previously. The new UK Kidney Allocation Scheme also gave children increased access to well-matched adult organs, leading to an increased mean age difference for this group between July and December 1998. DONOR AND RECIPIENT HLA MATCHING: Modifications to the Kidney Allocation Scheme introduced in January 1997 with the aim of increasing the number of well-matched transplants have led to a rise in 000 mismatched grafts from 5% to 7% and favourably matched (100/010/110 mismatches) from 29% to 36% between 1990-1992 and 1996-1998. Over this same time the proportion of 2 DR-mismatched grafts has decreased from 10% to 4%. The revised Kidney Allocation Scheme implemented in July 1998 gave a further increase in priority to 000 mismatches, increasing the proportion of these transplants to 12% for the last half of 1998, a level which has been maintained since then.
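
For illustration, donor-age effects on 5-year graft survival of the kind reported here could be summarised with a Kaplan-Meier estimator, as sketched below using the lifelines package; the column names and age bands are assumptions, not the UK registry fields.

```python
# Illustrative sketch: 5-year graft survival by donor age band, estimated with
# a Kaplan-Meier fitter. All column names are hypothetical.
import pandas as pd
from lifelines import KaplanMeierFitter

def survival_by_donor_age(df: pd.DataFrame) -> pd.DataFrame:
    """Assumed columns: 'years_to_failure' (time to graft failure or censoring),
    'failed' (1 = graft failed, 0 = censored), 'donor_age'."""
    bands = pd.cut(df["donor_age"], bins=[17, 34, 59, 120],
                   labels=["18-34", "35-59", "60+"])
    estimates = {}
    for band, group in df.groupby(bands, observed=True):
        kmf = KaplanMeierFitter()
        kmf.fit(group["years_to_failure"], event_observed=group["failed"],
                label=str(band))
        estimates[str(band)] = kmf.predict(5.0)  # survival estimate at 5 years
    return pd.Series(estimates).to_frame("graft_survival_5y")
```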


Subject(s)
Kidney Transplantation; Adolescent; Adult; Age Factors; Child; Child, Preschool; Databases, Factual; Graft Survival; Histocompatibility Testing; Humans; Infant; Infant, Newborn; Ireland/epidemiology; Kidney Transplantation/immunology; Kidney Transplantation/mortality; Kidney Transplantation/statistics & numerical data; Middle Aged; Survival Rate; Tissue Donors/statistics & numerical data; United Kingdom/epidemiology
10.
Lancet ; 354(9185): 1147-52, 1999 Oct 02.
Article in English | MEDLINE | ID: mdl-10513707

ABSTRACT

BACKGROUND: In the UK, kidneys are exchanged between centres on the basis of matching for HLA. We analysed various factors that might affect graft outcome to establish whether exchange of kidneys on this basis remains valid. METHODS: 6363 primary cadaveric renal transplants carried out in 23 centres in the UK between 1986 and 1993 were used in the analysis. 6338 (99.6%) patients who underwent transplantation were followed up at 1 year. 5-year follow-up data were available for 2907 (97.8%) of the 2972 patients who survived to 5 years. We made random checks to validate the data. A multifactorial analysis with Cox's proportional hazards models was used to analyse factors that had a possible effect on graft outcome. To ensure that the analysis of matching was constant during the 8-year study, our analysis was based on the HLA antigens used for organ exchange (11 A locus antigens, 27 B locus antigens, and 12 DR locus antigens). We assessed overall outcome at 5 years and during three periods after transplantation: 0-3 months, 3-36 months, and after 36 months. FINDINGS: The following factors were significantly associated with graft outcome in the multifactorial analysis: year of graft, age of donor, age of recipient, whether the recipient had diabetes, cause of donor's death, cold ischaemic time, transport of kidneys, transplant centre, and matching for HLA. The best outcome was achieved with kidneys that had no mismatches at HLA-A, HLA-B, and HLA-DR loci (000 mismatches). The next most favourable outcome was achieved with one mismatch at either the A or B locus, or one mismatch at both the A and B loci, but no mismatch at the DR locus (100, 010, or 110 mismatches). Age of the donor and recipient had a significant effect on transplant outcome: older age was associated with increased risk of graft failure. INTERPRETATION: Various factors affect the outcome of primary cadaveric renal transplantation, particularly the age of the donor and the recipient. However, the effect of matching for HLA remains a strong one and fully justifies the continuing policy in the UK of exchanging kidneys on the basis of HLA matching, especially to recipients when there is a 000 mismatch for HLA between donor and recipient. On the basis of this analysis, a new allocation scheme for kidneys was introduced in the UK in 1998. During the first 9 months of the scheme, there has been a doubling of the number of HLA-000 mismatched kidneys transplanted.
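
The abstract names Cox's proportional hazards models as the analysis method. A minimal sketch of such a multifactorial model, using the lifelines package and placeholder covariate names, might look like this:

```python
# Sketch of a multifactorial Cox proportional hazards analysis of graft outcome;
# covariate names are placeholders, not the actual registry dataset fields.
import pandas as pd
from lifelines import CoxPHFitter

def fit_graft_survival_model(df: pd.DataFrame) -> pd.DataFrame:
    """Assumed columns: 'years_to_failure', 'failed' (1 = failed, 0 = censored),
    'donor_age', 'recipient_age', 'recipient_diabetes', 'cold_ischaemia_hours',
    'hla_mismatches'."""
    cph = CoxPHFitter()
    cph.fit(
        df[["years_to_failure", "failed", "donor_age", "recipient_age",
            "recipient_diabetes", "cold_ischaemia_hours", "hla_mismatches"]],
        duration_col="years_to_failure",
        event_col="failed",
    )
    return cph.summary  # hazard ratios (exp(coef)) with confidence intervals
```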


Subject(s)
Graft Survival; Kidney Transplantation/statistics & numerical data; Adolescent; Adult; Aged; Cadaver; Child; Child, Preschool; Follow-Up Studies; HLA Antigens/isolation & purification; Humans; Infant; Infant, Newborn; Middle Aged; Proportional Hazards Models; Risk Factors; Time Factors; United Kingdom
11.
Clin Transpl ; : 107-13, 1998.
Article in English | MEDLINE | ID: mdl-10503089

ABSTRACT

A new allocation scheme for kidneys from adult cadaver donors was introduced in the UK on July 1st, 1998. The new scheme is based on data from an analysis of factors influencing transplant survival instigated by the Kidney Advisory Group (KAG) of the UKTSSA. A cohort of 6,363 first cadaver allografts performed in the UK between 1986 and 1993 was used for the analysis, with 99.6% one-year follow-up and 97.8% 5-year follow-up. HLA matching was one of a number of factors that were found to influence transplant survival, thus supporting the policy of exchange of kidneys based on matching. The new allocation scheme is based on 3 tiers in which cadaver organs are offered first in Tier 1 to patients with zero HLA antigen mismatches (000 matchgrade), in Tier 2 to favourably matched patients (matched for HLA-DR and mismatched for a maximum of one HLA-A and one HLA-B locus antigen: 100, 010, 110 matchgrades), and remaining kidneys in Tier 3 to non-favourably matched patients. A points score devised by a subgroup of the KAG to reflect natural justice and common sense is used as a discriminator between equally matched patients. The points are based on recipient age, donor-recipient age difference, waiting time, matchability for HLA antigens, sensitization to HLA antigens and the transplant unit balance of organ exchange. The performance of the scheme will be closely monitored, but computer simulations predict that there will be an overall improvement in transplant survival as a result of an increase in well-matched transplants.
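
A highly simplified sketch of a tiered, points-based allocation of this kind is shown below. The tier logic follows the matchgrade description above, but the point components and weights are invented for illustration and are not the actual UK scheme values.

```python
# Simplified, hypothetical sketch of tiered allocation with a points tiebreaker.
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class Candidate:
    patient_id: str
    hla_mismatches: Tuple[int, int, int]  # (A, B, DR) mismatch counts, e.g. (1, 0, 0)
    age: int
    donor_age: int
    years_waiting: float
    sensitised: bool

def tier(c: Candidate) -> int:
    a, b, dr = c.hla_mismatches
    if (a, b, dr) == (0, 0, 0):
        return 1                               # 000 matchgrade
    if dr == 0 and a <= 1 and b <= 1:
        return 2                               # favourably matched: 100/010/110
    return 3                                   # non-favourably matched

def points(c: Candidate) -> float:
    # Illustrative weights only; the real scheme uses a different point system.
    return (2.0 * c.years_waiting
            - 0.1 * abs(c.age - c.donor_age)   # penalise donor-recipient age gap
            + (5.0 if c.sensitised else 0.0))

def allocate(candidates: List[Candidate]) -> Optional[Candidate]:
    # Offer to the best tier first; points break ties within a tier.
    return min(candidates, key=lambda c: (tier(c), -points(c))) if candidates else None
```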


Subject(s)
Health Care Rationing/organization & administration; Kidney Transplantation/statistics & numerical data; Tissue and Organ Procurement/organization & administration; Adolescent; Adult; Aged; Cadaver; Female; Histocompatibility Testing; Humans; Male; Middle Aged; Tissue Donors; Tissue and Organ Procurement/statistics & numerical data; United Kingdom
14.
Transpl Int ; 7 Suppl 1: S102-3, 1994.
Article in English | MEDLINE | ID: mdl-11271175

ABSTRACT

Elective blood group O liver recipients appear to wait longer than most other groups for matched donors. The aim of this study was to confirm the suspected differences in elective waiting times in the UK using data from the United Kingdom Transplant Support Service, and to determine some of the factors responsible for them. The findings were that potential group O recipients waited significantly longer than other groups for transplantation, and that 22% of group O livers were going to non-O recipients. AB, the group with the shortest waiting time, was receiving 74.5% mismatched (but compatible) grafts from all other groups.
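
The waiting-time imbalance described here follows from ABO compatibility rules, which the short sketch below encodes; this is a generic illustration of ABO compatibility, not the UK allocation logic itself.

```python
# Minimal sketch of ABO compatibility: group O donors are compatible with every
# recipient group, so O organs can be taken up by non-O recipients, lengthening
# the wait for group O recipients who can only receive group O organs.
COMPATIBLE_RECIPIENTS = {
    "O":  {"O", "A", "B", "AB"},   # O is the universal donor
    "A":  {"A", "AB"},
    "B":  {"B", "AB"},
    "AB": {"AB"},                  # AB recipients can accept any donor group
}

def is_compatible(donor_group: str, recipient_group: str) -> bool:
    return recipient_group in COMPATIBLE_RECIPIENTS[donor_group]

# Example: a group O liver may go to an AB recipient, but not vice versa.
assert is_compatible("O", "AB") and not is_compatible("AB", "O")
```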


Subject(s)
ABO Blood-Group System; Blood Group Incompatibility; Liver Transplantation/immunology; Humans; Liver Transplantation/statistics & numerical data; Retrospective Studies; Tissue Donors/supply & distribution; United Kingdom; Waiting Lists