Results 1-20 of 62
1.
Clin Infect Dis; 76(3): e18-e25, 2023 02 08.
Article in English | MEDLINE | ID: mdl-36041009

ABSTRACT

BACKGROUND: In late 2021, the Omicron severe acute respiratory syndrome coronavirus 2 variant emerged and rapidly replaced Delta as the dominant variant. The increased transmissibility of Omicron led to surges in case rates and hospitalizations; however, the true severity of the variant remained unclear. We aimed to provide robust estimates of Omicron severity relative to Delta. METHODS: This retrospective cohort study was conducted with data from the British Columbia COVID-19 Cohort, a large provincial surveillance platform with linkage to administrative datasets. To capture the time of cocirculation with Omicron and Delta, December 2021 was chosen as the study period. Whole-genome sequencing was used to determine Omicron and Delta variants. To assess the severity (hospitalization, intensive care unit [ICU] admission, length of stay), we conducted adjusted Cox proportional hazard models, weighted by inverse probability of treatment weights (IPTW). RESULTS: The cohort was composed of 13 128 individuals (7729 Omicron and 5399 Delta). There were 419 coronavirus disease 2019 hospitalizations, with 118 (22%) among people diagnosed with Omicron (crude rate = 1.5% Omicron, 5.6% Delta). In multivariable IPTW analysis, Omicron was associated with a 50% lower risk of hospitalization compared with Delta (adjusted hazard ratio [aHR] = 0.50, 95% confidence interval [CI] = 0.43 to 0.59), a 73% lower risk of ICU admission (aHR = 0.27, 95% CI = 0.19 to 0.38), and a 5-day shorter hospital stay (aβ = -5.03, 95% CI = -8.01 to -2.05). CONCLUSIONS: Our analysis supports findings from other studies that have demonstrated lower risk of severe outcomes in Omicron-infected individuals relative to Delta.


Subjects
COVID-19, SARS-CoV-2, Humans, British Columbia/epidemiology, SARS-CoV-2/genetics, Retrospective Studies, COVID-19/epidemiology
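
The study above combines propensity-score-based inverse probability of treatment weights (IPTW) with an adjusted Cox proportional hazards model. A minimal sketch of that general workflow in Python, using simulated data and hypothetical column names (omicron, age, male, vaccinated, time, hospitalized) rather than the study's variables or code:

```python
# Illustrative IPTW-weighted Cox model (not the study's actual implementation).
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 5000
df = pd.DataFrame({
    "omicron": rng.integers(0, 2, n),      # 1 = Omicron case, 0 = Delta case
    "age": rng.uniform(18, 90, n),
    "male": rng.integers(0, 2, n),
    "vaccinated": rng.integers(0, 2, n),
})
# Simulated follow-up: time to hospitalization (days), censored at 60 days.
base_hazard = 0.002 * np.exp(0.02 * (df["age"] - 50) - 0.7 * df["omicron"])
df["time"] = np.minimum(rng.exponential(1 / base_hazard), 60.0)
df["hospitalized"] = (df["time"] < 60.0).astype(int)

# 1) Propensity score: probability of being an Omicron case given covariates.
X = df[["age", "male", "vaccinated"]]
ps = LogisticRegression(max_iter=1000).fit(X, df["omicron"]).predict_proba(X)[:, 1]

# 2) Stabilized inverse probability of treatment weights.
p_exposed = df["omicron"].mean()
df["iptw"] = np.where(df["omicron"] == 1, p_exposed / ps, (1 - p_exposed) / (1 - ps))

# 3) Weighted Cox model with robust variance; exp(coef) of "omicron"
#    plays the role of the adjusted hazard ratio reported in the abstract.
cph = CoxPHFitter()
cph.fit(df[["time", "hospitalized", "omicron", "age", "male", "vaccinated", "iptw"]],
        duration_col="time", event_col="hospitalized",
        weights_col="iptw", robust=True)
cph.print_summary()
```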
2.
Emerg Infect Dis; 29(10): 1999-2007, 2023 10.
Article in English | MEDLINE | ID: mdl-37640374

ABSTRACT

In British Columbia, Canada, initial growth of the SARS-CoV-2 Delta variant was slower than that reported in other jurisdictions. Delta became the dominant variant (>50% prevalence) within ≈7-13 weeks of first detection in regions within the United Kingdom and United States. In British Columbia, it remained at <10% of weekly incident COVID-19 cases for 13 weeks after first detection on March 21, 2021, eventually reaching dominance after 17 weeks. We describe the growth of Delta variant cases in British Columbia during March 1-June 30, 2021, and apply retrospective counterfactual modeling to examine factors for the initially low COVID-19 case rate after Delta introduction, such as vaccination coverage and nonpharmaceutical interventions. Growth of COVID-19 cases in the first 3 months after Delta emergence was likely limited in British Columbia because additional nonpharmaceutical interventions were implemented to reduce levels of contact at the end of March 2021, soon after variant emergence.


Subjects
COVID-19, SARS-CoV-2, Humans, British Columbia/epidemiology, SARS-CoV-2/genetics, Retrospective Studies, COVID-19/epidemiology, COVID-19/prevention & control
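
As a back-of-envelope complement to the "weeks to dominance" framing above (not a stand-in for the authors' counterfactual model), the time for a variant's share of cases to grow from an initial fraction to 50% can be derived from an assumed logistic growth advantage; the starting fraction and weekly advantages below are invented:

```python
# Illustrative only: weeks for a variant's case share to reach 50% under
# logistic growth. The inputs are assumed values, not study estimates.
import numpy as np

def weeks_to_dominance(p0: float, weekly_advantage: float, target: float = 0.5) -> float:
    """Weeks for the variant fraction to grow from p0 to `target`, assuming the
    log-odds of the variant increase linearly by `weekly_advantage` per week."""
    logit = lambda p: np.log(p / (1 - p))
    return (logit(target) - logit(p0)) / weekly_advantage

# Starting at 0.5% of cases with a growth advantage of 0.4/week
# (about a 1.5-fold weekly increase in odds), dominance takes ~13 weeks;
# halving the advantage (e.g., through reduced contacts) roughly doubles that.
print(round(weeks_to_dominance(0.005, 0.4), 1))  # ~13.2 weeks
print(round(weeks_to_dominance(0.005, 0.2), 1))  # ~26.5 weeks
```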
3.
Clin Transplant; 37(3): e14866, 2023 03.
Article in English | MEDLINE | ID: mdl-36512481

ABSTRACT

INTRODUCTION: The illicit drug toxicity (overdose) crisis has worsened across Canada; between 2016 and 2021, more than 28,000 individuals died of drug toxicity. Organ donation from persons who experience drug toxicity death (DTD) has increased in recent years. This study examines whether survival after heart or bilateral-lung transplantation differed by donor cause of death. METHODS: We studied transplant recipients in British Columbia who received heart (N = 110) or bilateral-lung (N = 223) transplantation from deceased donors aged 12-70 years between 2013 and 2019. Transplant recipient survival was compared by donor cause of death from drug toxicity or other. Five-year Kaplan-Meier estimates of survival and 3-year inverse probability treatment weighted Cox proportional hazards models were conducted. RESULTS: DTD donors made up 36% (40/110) of heart and 24% (54/223) of bilateral-lung transplantations. DTD donors were more likely to be young, white, and male. Unadjusted 5-year recipient survival was similar by donor cause of death (heart: 87% for DTD and 86% for non-DTD, p = .75; bilateral-lung: 80% for DTD and 76% for non-DTD, p = .65). Adjusted risk of mortality at 3 years post-transplant was similar between recipients of DTD and non-DTD donor heart (hazard ratio [HR]: .94, 95% confidence interval [CI]: .22-4.07, p = .938) and bilateral-lung (HR: 1.06, 95% CI: .41-2.70, p = .908). CONCLUSION: Recipient survival after heart or bilateral-lung transplantation from DTD donors and non-DTD donors was similar. Donation from DTD donors is safe and should be considered more broadly to increase organ donation.


Subjects
Drug-Related Side Effects and Adverse Reactions, Heart Transplantation, Lung Transplantation, Tissue and Organ Procurement, Humans, Male, Tissue Donors, British Columbia, Retrospective Studies, Graft Survival
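
A minimal sketch of the unadjusted part of the analysis above — Kaplan-Meier recipient survival by donor cause of death with a log-rank comparison — on simulated data with hypothetical columns (dtd_donor, years, died); the adjusted comparison would follow the IPTW-weighted Cox pattern sketched above:

```python
# Illustrative Kaplan-Meier comparison by donor cause of death (simulated data).
import numpy as np
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(7)
n = 300
df = pd.DataFrame({
    "dtd_donor": rng.integers(0, 2, n),                # 1 = drug toxicity death donor
    "years": np.minimum(rng.exponential(20, n), 5.0),  # follow-up capped at 5 years
})
df["died"] = (df["years"] < 5.0).astype(int)

for name, flag in [("non-DTD", 0), ("DTD", 1)]:
    grp = df[df["dtd_donor"] == flag]
    kmf = KaplanMeierFitter(label=name)
    kmf.fit(grp["years"], event_observed=grp["died"])
    surv5 = float(kmf.survival_function_at_times(5.0).iloc[0])
    print(f"{name} 5-year survival: {surv5:.2f}")

res = logrank_test(
    df.loc[df["dtd_donor"] == 1, "years"], df.loc[df["dtd_donor"] == 0, "years"],
    event_observed_A=df.loc[df["dtd_donor"] == 1, "died"],
    event_observed_B=df.loc[df["dtd_donor"] == 0, "died"])
print(f"log-rank p-value: {res.p_value:.3f}")
```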
4.
J Infect Dis; 225(8): 1387-1398, 2022 04 19.
Article in English | MEDLINE | ID: mdl-32215564

ABSTRACT

BACKGROUND: The influenza A(H3N2) vaccine was updated from clade 3C.3a in 2015-2016 to 3C.2a for 2016-2017 and 2017-2018. Circulating 3C.2a viruses showed considerable hemagglutinin glycoprotein diversification and the egg-adapted vaccine also bore mutations. METHODS: Vaccine effectiveness (VE) in 2016-2017 and 2017-2018 was assessed by test-negative design, explored by A(H3N2) phylogenetic subcluster and prior season's vaccination history. RESULTS: In 2016-2017, A(H3N2) VE was 36% (95% confidence interval [CI], 18%-50%), comparable with (43%; 95% CI, 24%-58%) or without (33%; 95% CI, -21% to 62%) prior season's vaccination. In 2017-2018, VE was 14% (95% CI, -8% to 31%), lower with (9%; 95% CI, -18% to 30%) versus without (45%; 95% CI, -7% to 71%) prior season's vaccination. In 2016-2017, VE against predominant clade 3C.2a1 viruses was 33% (95% CI, 11%-50%): 18% (95% CI, -40% to 52%) for 3C.2a1a defined by a pivotal T135K loss of glycosylation; 60% (95% CI, 19%-81%) for 3C.2a1b (without T135K); and 31% (95% CI, 2%-51%) for other 3C.2a1 variants (with/without T135K). VE against 3C.2a2 viruses was 45% (95% CI, 2%-70%) in 2016-2017 but 15% (95% CI, -7% to 33%) in 2017-2018 when 3C.2a2 predominated. VE against 3C.2a1b in 2017-2018 was 37% (95% CI, -57% to 75%), lower at 12% (95% CI, -129% to 67%) for a new 3C.2a1b subcluster (n = 28) also bearing T135K. CONCLUSIONS: Exploring VE by phylogenetic subcluster and prior vaccination history reveals informative heterogeneity. Pivotal mutations affecting glycosylation sites, and repeat vaccination using unchanged antigen, may reduce VE.


Subjects
Epidemics, Influenza Vaccines, Influenza, Human, Humans, Influenza, Human/epidemiology, Influenza, Human/prevention & control, Influenza A Virus, H3N2 Subtype, Phylogeny, Vaccine Efficacy, Vaccination, Canada/epidemiology, Seasons
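
In a test-negative design, VE is estimated as one minus the adjusted odds ratio of vaccination, comparing influenza test-positive with test-negative patients. A hedged sketch with made-up data and a single age-group covariate (the study adjusts for more factors and stratifies by clade):

```python
# Illustrative test-negative design estimate: VE = (1 - adjusted OR) x 100%.
# Data and column names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 2000
df = pd.DataFrame({
    "vaccinated": rng.integers(0, 2, n),
    "age_group": rng.choice(["1-8", "9-19", "20-64", "65+"], n),
})
# Simulate test positivity with a modest protective effect (true OR about 0.6).
logit_p = -0.5 - 0.5 * df["vaccinated"]
df["case"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

model = smf.logit("case ~ vaccinated + C(age_group)", data=df).fit(disp=0)
or_vax = np.exp(model.params["vaccinated"])
ci_low, ci_high = np.exp(model.conf_int().loc["vaccinated"])
print(f"VE = {100 * (1 - or_vax):.0f}% "
      f"(95% CI {100 * (1 - ci_high):.0f}% to {100 * (1 - ci_low):.0f}%)")
```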
5.
Am J Kidney Dis; 80(6): 740-750, 2022 12.
Article in English | MEDLINE | ID: mdl-35659570

ABSTRACT

RATIONALE & OBJECTIVE: Little is known about the risk of cardiovascular disease (CVD) in patients with various primary glomerular diseases. In a population-level cohort of adults with primary glomerular disease, we sought to describe the risk of CVD compared with the general population and the impact of traditional and kidney-related risk factors on CVD risk. STUDY DESIGN: Observational cohort study. SETTING & PARTICIPANTS: Adults with membranous nephropathy (n = 387), minimal change disease (n = 226), IgA nephropathy (n = 759), and focal segmental glomerulosclerosis (n = 540) from a centralized pathology registry in British Columbia, Canada (2000-2012). EXPOSURE: Traditional CVD risk factors (diabetes, age, sex, dyslipidemia, hypertension, smoking, prior CVD) and kidney-related risk factors (type of glomerular disease, estimated glomerular filtration rate [eGFR], proteinuria). OUTCOME: A composite CVD outcome of coronary artery, cerebrovascular, and peripheral vascular events, and death due to myocardial infarction or stroke. ANALYTICAL APPROACH: Subdistribution hazards models to evaluate the outcome risk with non-CVD death treated as a competing event. Standardized incidence rates (SIR) calculated based on the age- and sex-matched general population. RESULTS: During a median 6.8 years of follow-up, 212 patients (11.1%) experienced the CVD outcome (10-year risk, 14.7% [95% CI, 12.8%-16.8%]). The incidence rate was high for the overall cohort (24.7 per 1,000 person-years) and for each disease type (range, 12.2-46.1 per 1,000 person-years), and was higher than that observed in the general population both overall (SIR, 2.46 [95% CI, 2.12-2.82]) and for each disease type (SIR range, 1.38-3.98). Disease type, baseline eGFR, and proteinuria were associated with a higher risk of CVD and, when added to a model with traditional risk factors, led to improvements in model fit (R2 of 14.3% vs 12.7%), risk discrimination (C-statistic of 0.81 vs 0.78; difference, 0.02 [95% CI, 0.01-0.04]), and continuous net reclassification improvement (0.4 [95% CI, 0.2-0.6]). LIMITATIONS: Ascertainment of outcomes and comorbidities using administrative data. CONCLUSIONS: Patients with primary glomerular disease have a high absolute risk of CVD that is approximately 2.5 times that of the general population. Consideration of eGFR, proteinuria, and type of glomerular disease may improve risk stratification of CVD risk in these individuals. PLAIN-LANGUAGE SUMMARY: Patients with chronic kidney disease are known to be at high risk of cardiovascular disease. Cardiovascular risk in patients with primary glomerular diseases is poorly understood because these conditions are rare and require a kidney biopsy for diagnosis. In this study of 1,912 Canadian patients with biopsy-proven IgA nephropathy, minimal change disease, focal segmental glomerulosclerosis, and membranous nephropathy, the rate of cardiovascular events was 2.5 times higher than in the general population and was high for each disease type. Consideration of disease type, kidney function, and proteinuria improved the prediction of cardiovascular events. In summary, our population-level study showed that patients with primary glomerular diseases have a high cardiovascular risk, and that inclusion of kidney-specific risk factors may improve risk stratification.


Subjects
Cardiovascular Diseases, Glomerulonephritis, IGA, Glomerulonephritis, Membranous, Glomerulosclerosis, Focal Segmental, Nephrosis, Lipoid, Adult, Humans, Glomerulosclerosis, Focal Segmental/pathology, Glomerulonephritis, Membranous/pathology, Cardiovascular Diseases/epidemiology, Glomerulonephritis, IGA/pathology, Nephrosis, Lipoid/pathology, Proteinuria, Glomerular Filtration Rate, Risk Factors, British Columbia/epidemiology
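
The standardized incidence ratio (SIR) above is the observed number of events divided by the number expected if age- and sex-specific general-population rates applied to the cohort's person-time. A small worked example with invented strata and reference rates, using an exact Poisson confidence interval:

```python
# Illustrative SIR = observed / expected, with an exact Poisson 95% CI.
# Stratum person-years and reference rates are invented numbers.
import numpy as np
from scipy.stats import chi2

# Cohort person-years by (age band, sex) stratum and the matching
# general-population CVD rates (events per 1,000 person-years).
person_years = np.array([1200.0, 1500.0, 2600.0, 1800.0])
ref_rates = np.array([2.0, 4.0, 12.0, 20.0]) / 1000.0

observed = 212                                    # events seen in the cohort
expected = float(np.sum(person_years * ref_rates))

sir = observed / expected
# Exact Poisson limits for the observed count.
lo = chi2.ppf(0.025, 2 * observed) / 2 / expected
hi = chi2.ppf(0.975, 2 * (observed + 1)) / 2 / expected
print(f"SIR = {sir:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```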
6.
Value Health; 25(9): 1510-1519, 2022 09.
Article in English | MEDLINE | ID: mdl-35466049

ABSTRACT

OBJECTIVES: Invasive pneumococcal disease (IPD) and a variety of clinical syndromes caused by pneumococci, such as acute otitis media (AOM), acute sinusitis (AS), and community-acquired pneumonia (CAP), cause a substantial burden on healthcare systems. Few studies have explored the short-term financial burden of pneumococcal disease after the 13-valent pneumococcal conjugate vaccine (PCV13) introduction in infant immunization programs. This population-based study evaluated changes in costs associated with healthcare utilization for pneumococcal disease after the PCV13 introduction in the infant immunization program in British Columbia, Canada. METHODS: Individuals with pneumococcal disease were identified using provincial administrative data for the 2000 to 2018 period. Total direct healthcare costs were determined using case-mix methodology for hospitalization and fee-for-service codes for outpatient visits and medications dispensed. Costs were adjusted to 2018 Canadian dollars. Changes in the annual healthcare costs were evaluated across vaccine eras (pre-PCV13, 2000-2010; PCV13, 2011-2018) using generalized linear models, adjusting for the 7-valent pneumococcal conjugate vaccine program (2004-2010). RESULTS: During the 19-year study period, pneumococcal disease resulted in 6.3 million cases among 85 million total patient-years, resulting in total healthcare costs of $7.9 billion. More than 6.2 million cases were treated in the outpatient setting, costing $0.65 billion (8% of total costs associated with pneumococcal disease treatment), whereas the 370,000 hospitalized cases (3% of all cases) accrued $7.25 billion (92% of total costs). Healthcare costs for all studied infections nearly doubled over the study period from $248 million in 2000 to $476 million in 2018 (P = .003). In contrast, there were large declines in total annual costs in the PCV13 era for IPD (adjusted relative rate [aRR] 0.73; 95% confidence interval [CI] 0.56-0.95; P = .032), AOM (aRR 0.70; 95% CI 0.59-0.83; P = .001), and AS (aRR 0.68; 95% CI 0.54-0.85; P = .004) compared with the pre-PCV13 era. Total costs increased marginally in the PCV13 era for all-cause CAP (aRR 1.04; 95% CI 0.94-1.15; P = .484). CONCLUSIONS: This study confirms a temporal association in declining economic burden for IPD, AOM, and AS after the PCV13 introduction. Nevertheless, the total economic burden continues to be high in the PCV13 era, mainly driven by increasing CAP costs.


Subjects
Otitis Media, Pneumococcal Infections, Acute Disease, British Columbia/epidemiology, Health Care Costs, Humans, Incidence, Infant, Otitis Media/epidemiology, Otitis Media/prevention & control, Pneumococcal Infections/prevention & control, Pneumococcal Vaccines, Vaccination, Vaccines, Conjugate/therapeutic use
7.
J Antimicrob Chemother; 76(9): 2419-2427, 2021 08 12.
Article in English | MEDLINE | ID: mdl-34021757

ABSTRACT

BACKGROUND: Numerous studies have characterized the 13-valent pneumococcal conjugate vaccine (PCV13) programme's beneficial effects on acute otitis media (AOM) and acute sinusitis (AS) rates in children; however, few studies have examined the impact on adults. OBJECTIVES: This retrospective cohort study evaluates the overall effect of the PCV13 immunization programme on the incidence of AOM and AS at the population level. METHODS: Health administrative databases were linked to assess outpatient visits, hospitalizations and antibiotic utilization from 2000 to 2018. Multivariable Poisson regression was used to evaluate the impact of the PCV13 vaccine programme (2011-18) compared with the pre-PCV13 era (2000-10), overall and by age. RESULTS: From 2000 to 2018, the incidence of AOM decreased by 50% (62 to 31 per 1000 population) while sinusitis decreased by 18% (33 to 27 per 1000 population). In the PCV13 era, the incidence of AOM declined [incidence rate ratio (IRR): 0.70; 95% CI: 0.70-0.70], in parallel with decreased incidence of antibiotic utilization (IRR: 0.65; 95% CI: 0.64-0.65). A reduction was also observed in the incidence of AS during the PCV13 era compared with the pre-PCV13 era (IRR: 0.88; 95% CI: 0.88-0.88), mainly driven by declines among those younger than 65 years of age. In contrast, an increase in AS incidence was noted in individuals aged ≥65 years (IRR: 1.03; 95% CI: 1.02-1.03). A decrease in antibiotic prescription rates for sinusitis was observed for those under 65 years of age. CONCLUSIONS: The PCV13 immunization programme is associated with a reduction in the incidence of AOM and AS. Moreover, the associated use of antibiotics for these diagnoses has comparably decreased across paediatric, as well as adult populations.


Subjects
Otitis Media, Pneumococcal Infections, Sinusitis, Adult, British Columbia/epidemiology, Child, Humans, Incidence, Infant, Otitis Media/epidemiology, Otitis Media/prevention & control, Pneumococcal Infections/epidemiology, Pneumococcal Infections/prevention & control, Pneumococcal Vaccines, Retrospective Studies, Sinusitis/epidemiology, Sinusitis/prevention & control, Streptococcus pneumoniae, Vaccines, Conjugate
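
The incidence rate ratios above come from Poisson regression of case counts with population person-time as an offset. A hedged sketch on simulated annual counts with a single era indicator (the study's models also adjust for age and other factors):

```python
# Illustrative Poisson regression of annual AOM counts on vaccine era,
# with log(population) as the offset, yielding an incidence rate ratio (IRR).
# Counts and populations are invented.
import numpy as np
import pandas as pd
import statsmodels.api as sm

years = np.arange(2000, 2019)
df = pd.DataFrame({
    "pcv13_era": (years >= 2011).astype(int),
    "population": np.linspace(4.0e6, 5.0e6, len(years)),
})
rate = np.where(df["pcv13_era"] == 1, 0.031, 0.062)   # cases per person-year
rng = np.random.default_rng(11)
df["cases"] = rng.poisson(rate * df["population"])

X = sm.add_constant(df[["pcv13_era"]])
fit = sm.GLM(df["cases"], X, family=sm.families.Poisson(),
             offset=np.log(df["population"])).fit()
irr = np.exp(fit.params["pcv13_era"])
lo, hi = np.exp(fit.conf_int().loc["pcv13_era"])
print(f"IRR (PCV13 era vs pre-PCV13 era) = {irr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```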
8.
Clin Infect Dis; 69(12): 2101-2108, 2019 11 27.
Article in English | MEDLINE | ID: mdl-30856258

ABSTRACT

BACKGROUND: Latent tuberculosis infection (LTBI) screening and treatment is a key component of the World Health Organization (WHO) EndTB Strategy, but the impact of LTBI screening and treatment at a population level is unclear. We aimed to estimate the impact of LTBI screening and treatment in a population of migrants to British Columbia (BC), Canada. METHODS: This retrospective cohort included all individuals (N = 1 080 908) who immigrated to Canada as permanent residents between 1985 and 2012 and were residents in BC at any time up to 2013. Multiple administrative databases were linked to identify people with risk factors who met the WHO strong recommendations for screening: people with tuberculosis (TB) contact, with human immunodeficiency virus, on dialysis, with tumor necrosis factor-alpha inhibitors, who had an organ/haematological transplant, or with silicosis. Additional TB risk factors included immunosuppressive medications, cancer, diabetes, and migration from a country with a high TB burden. We defined active TB as preventable if diagnosed ≥6 months after a risk factor diagnosis. We estimated the number of preventable TB cases, given optimal LTBI screening and treatment, based on these risk factors. RESULTS: There were 16 085 people (1.5%) identified with WHO strong risk factors. Of the 2814 people with active TB, 118 (4.2%) were considered preventable through screening with WHO risk factors. Less than half (49.4%) were considered preventable with expanded screening to include people migrating from countries with high TB burdens, people who had been prescribed immunosuppressive medications, or people with diabetes or cancer. CONCLUSIONS: The application of WHO LTBI strong recommendations for screening would have minimally impacted the TB incidence in this population. Further high-risk groups must be identified to develop an effective LTBI screening and treatment strategy for low-incidence regions.


Subjects
Health Impact Assessment, Latent Tuberculosis/epidemiology, Practice Guidelines as Topic, Adolescent, Adult, Aged, Aged, 80 and over, British Columbia/epidemiology, Child, Child, Preschool, Emigrants and Immigrants, Female, Humans, Incidence, Infant, Infant, Newborn, Latent Tuberculosis/diagnosis, Male, Mass Screening/methods, Mass Screening/standards, Middle Aged, Regional Medical Programs, Retrospective Studies, World Health Organization, Young Adult
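
A minimal sketch of the case-classification rule described above — an active TB case counted as preventable when diagnosed at least 6 months after the person's earliest risk-factor diagnosis — using hypothetical identifiers and dates:

```python
# Illustrative "preventable TB" flag: TB diagnosed >= 6 months (183 days)
# after the earliest recorded risk-factor diagnosis. IDs and dates are made up.
import pandas as pd

tb_cases = pd.DataFrame({
    "person_id": [1, 2, 3],
    "tb_diagnosis_date": pd.to_datetime(["2010-07-01", "2011-03-15", "2012-01-10"]),
})
risk_factors = pd.DataFrame({
    "person_id": [1, 1, 2, 4],
    "risk_factor_date": pd.to_datetime(
        ["2009-01-01", "2010-05-01", "2011-01-20", "2008-06-30"]),
})

earliest_rf = risk_factors.groupby("person_id", as_index=False)["risk_factor_date"].min()
merged = tb_cases.merge(earliest_rf, on="person_id", how="left")
merged["preventable"] = (
    merged["risk_factor_date"].notna()
    & (merged["tb_diagnosis_date"] - merged["risk_factor_date"] >= pd.Timedelta(days=183))
)
print(merged[["person_id", "preventable"]])
# person 1: risk factor >6 months before TB  -> preventable
# person 2: risk factor <6 months before TB  -> not preventable
# person 3: no recorded risk factor          -> not preventable
```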
10.
J Am Soc Nephrol; 29(4): 1301-1308, 2018 04.
Article in English | MEDLINE | ID: mdl-29519800

ABSTRACT

The factors underlying the decline in living kidney donation in the United States since 2005 must be understood to inform strategies to ensure access to this option for future patients. Population-based estimates provide a better assessment of donation activity than do trends in the number of living donor transplants. Using data from the Scientific Registry of Transplant Recipients and the United States Census, we determined longitudinal changes in living kidney donation between 2005 and 2015, focusing on the effect of sex and income. We used multilevel Poisson models to adjust for differences in age, race, the incidence of ESRD, and geographic factors (including population density, urbanization, and daily commuting). During the study period, the unadjusted rate of donation was 30.1 and 19.3 per million population in women and men, respectively, and the adjusted incidence of donation was 44% higher in women (incidence rate ratio [IRR], 1.44; 95% confidence interval [95% CI], 1.39 to 1.49). The incidence of donation was stable in women (IRR, 0.95; 95% CI, 0.84 to 1.07) but declined in men (IRR, 0.75; 95% CI, 0.68 to 0.83). Income was associated with longitudinal changes in donation in both sexes, yet donation was stable in the highest two population income quartiles in women but only in the highest income quartile in men. In both sexes, living related donations declined, irrespective of income. In conclusion, living donation declined in men but remained stable in women between 2005 and 2015, and income appeared to have a greater effect on living donation in men.


Subjects
Kidney Transplantation, Living Donors/statistics & numerical data, Sex Factors, Adolescent, Adult, Aged, Ethnicity/statistics & numerical data, Female, Humans, Incidence, Income, Kidney Failure, Chronic/epidemiology, Kidney Failure, Chronic/surgery, Kidney Transplantation/economics, Living Donors/supply & distribution, Male, Middle Aged, Poisson Distribution, Registries, Rural Population/statistics & numerical data, Sex Distribution, Socioeconomic Factors, United States, Urban Population/statistics & numerical data, Young Adult
11.
Euro Surveill; 24(46), 2019 Nov.
Article in English | MEDLINE | ID: mdl-31771709

ABSTRACT

INTRODUCTION: The Canadian Sentinel Practitioner Surveillance Network reports vaccine effectiveness (VE) for the 2018/19 influenza A(H3N2) epidemic. AIM: To explain a paradoxical signal of increased clade 3C.3a risk among 35-54-year-old vaccinees, we hypothesise childhood immunological imprinting and a cohort effect following the 1968 influenza A(H3N2) pandemic. METHODS: We assessed VE by test-negative design for influenza A(H3N2) overall and for co-circulating clades 3C.2a1b and 3C.3a. VE variation by age in 2018/19 was compared with amino acid variation in the haemagglutinin glycoprotein by year since 1968. RESULTS: Influenza A(H3N2) VE was 17% (95% CI: -13 to 39) overall: 27% (95% CI: -7 to 50) for 3C.2a1b and -32% (95% CI: -119 to 21) for 3C.3a. Among 20-64-year-olds, VE was -7% (95% CI: -56 to 26): 6% (95% CI: -49 to 41) for 3C.2a1b and -96% (95% CI: -277 to -2) for 3C.3a. Clade 3C.3a VE showed a pronounced negative dip among 35-54-year-olds in whom the odds of medically attended illness were > 4-fold increased for vaccinated vs unvaccinated participants (p < 0.005). This age group was primed in childhood to influenza A(H3N2) viruses that for two decades following the 1968 pandemic bore a serine at haemagglutinin position 159, in common with contemporary 3C.3a viruses but mismatched to 3C.2a vaccine strains instead bearing tyrosine. DISCUSSION: Imprinting by the first childhood influenza infection is known to confer long-lasting immunity focused toward priming epitopes. Our findings suggest vaccine mismatch may negatively interact with imprinted immunity. The immunological mechanisms for imprint-regulated effect of vaccine (I-REV) warrant investigation.


Subjects
Immunologic Memory, Influenza A Virus, H3N2 Subtype/isolation & purification, Influenza Vaccines/administration & dosage, Influenza, Human/prevention & control, Population Surveillance/methods, Vaccination/statistics & numerical data, Vaccine Potency, Adult, Age Factors, Canada/epidemiology, Female, Hemagglutinin Glycoproteins, Influenza Virus/genetics, Humans, Influenza A Virus, H3N2 Subtype/genetics, Influenza A Virus, H3N2 Subtype/immunology, Influenza Vaccines/immunology, Influenza, Human/epidemiology, Influenza, Human/immunology, Influenza, Human/virology, Male, Middle Aged, Sentinel Surveillance
12.
Can Fam Physician; 65(5): e231-e237, 2019 05.
Article in English | MEDLINE | ID: mdl-31088889

ABSTRACT

OBJECTIVE: To evaluate the effects of the 2016 College of Physicians and Surgeons of British Columbia's (CPSBC's) opioid and benzodiazepine and z drug prescribing standards on the use of these medications in British Columbia. DESIGN: Interrupted time-series analysis of community-prescribing records over a 30-month period: January 2015 to June 2017. SETTING: British Columbia. PARTICIPANTS: Random sample of British Columbia residents with filled prescriptions during the study period. INTERVENTION: Introduction of CPSBC's opioid and benzodiazepine and z drug prescribing standards on June 1, 2016. MAIN OUTCOME MEASURES: Total weekly consumption of opioids (measured in morphine equivalents) and benzodiazepines and z drugs (measured in diazepam equivalents); and total monthly users of each class of medication. RESULTS: Total consumption of both medication classes began to decline in late 2015, and the rate of decrease did not statistically significantly change following the implementation of the CPSBC standards in June 2016. In contrast, introduction of the standards was associated with an immediate 2% decrease in the number of monthly users of opioids for pain (P < .001), culminating in a 9% decrease over the course of the following year (P < .001). This trend was driven largely by a decrease in the number of continuing users; minimal change was seen in the number of new users during the study period. Trends in monthly users of benzodiazepines and z drugs mirrored those seen for opioids for pain. CONCLUSION: Implementation of the 2016 CPSBC standards did not change a pre-existing downward trend in consumption of opioids or benzodiazepines and z drugs that began 6 months earlier. However, the standards did have a small effect on the number of monthly users of these medications, with a decrease in opioid prescribing among continuing users. Given the risk of destabilization of patients who are discontinued from opioid therapy, future research should assess how patient health outcomes are related to changing prescribing practices.


Subjects
Analgesics, Opioid/therapeutic use, Benzodiazepines/therapeutic use, Drug Prescriptions/standards, Hypnotics and Sedatives/therapeutic use, Practice Patterns, Physicians'/statistics & numerical data, British Columbia, Chronic Pain/drug therapy, Humans, Interrupted Time Series Analysis
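
Interrupted time-series analyses of this kind are commonly fit as segmented regressions with a baseline trend, a level change, and a slope change at the intervention date. A sketch on simulated weekly counts; the intervention week, column names, and Gaussian error model are assumptions, not the study's specification:

```python
# Illustrative segmented regression for an interrupted time series:
# weekly users ~ baseline trend + level change + slope change after the standards.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

n_weeks = 130                        # roughly 30 months of weekly data
t = np.arange(n_weeks)
intervention_week = 74               # approx. June 1, 2016 for a Jan 2015 start
post = (t >= intervention_week).astype(int)

rng = np.random.default_rng(5)
users = (60000 - 40 * t                           # pre-existing downward trend
         - 1200 * post                            # immediate level drop
         - 15 * post * (t - intervention_week)    # additional slope change
         + rng.normal(0, 400, n_weeks))

df = pd.DataFrame({"users": users, "t": t, "post": post,
                   "t_after": post * (t - intervention_week)})
fit = smf.ols("users ~ t + post + t_after", data=df).fit()
print(fit.params[["t", "post", "t_after"]])  # trend, level change, slope change
```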
13.
Am J Kidney Dis; 71(5): 636-647, 2018 05.
Article in English | MEDLINE | ID: mdl-29395484

ABSTRACT

BACKGROUND: The impact of dialysis exposure before nonpreemptive living donor kidney transplantation on allograft outcomes is uncertain. STUDY DESIGN: Retrospective cohort study. SETTING & PARTICIPANTS: Adult first-time recipients of kidney-only living donor transplants in the United States who were recorded within the Scientific Registry of Transplant Recipients for 2000 to 2016. FACTORS: Duration of pretransplantation dialysis exposure. OUTCOMES: Kidney transplant failure from any cause including death, death-censored transplant failure, and death with allograft function. RESULTS: Among the 77,607 living donor transplant recipients studied, longer pretransplantation dialysis exposure was independently associated with progressively higher risk for transplant failure from any cause, including death beginning 6 months after transplantation. Compared with patients with 0.1 to 3.0 months of dialysis exposure, the HR for transplant failure from any cause including death increased from 1.16 (95% CI, 1.07-1.31) among patients with 6.1 to 9.0 months of dialysis exposure to 1.60 (95% CI, 1.43-1.79) among patients with more than 60.0 months of dialysis exposure. Pretransplantation dialysis exposure varied markedly among centers; median exposures were 11.0 and 18.9 months for centers in the 10th and 90th percentiles of dialysis exposure, respectively. Centers with the highest proportions of living donor transplantations had the shortest pretransplantation dialysis exposures. In multivariable analysis, patients of black race, with low income, with nonprivate insurance, with less than high school education, and not working for income had longer pretransplantation dialysis exposures. Dialysis exposure in patients with these characteristics also varied 2-fold between transplantation centers. LIMITATIONS: Why longer dialysis exposure is associated with transplant failure could not be determined. CONCLUSIONS: Longer pretransplantation dialysis exposure in nonpreemptive living donor kidney transplantation is associated with increased risk for allograft failure. Pretransplantation dialysis exposure is associated with recipients' sociodemographic and transplantation centers' characteristics. Understanding whether limiting pretransplantation dialysis exposure could improve living donor transplant outcomes will require further study.


Subjects
Graft Rejection/epidemiology, Kidney Failure, Chronic/surgery, Kidney Transplantation/adverse effects, Renal Dialysis/statistics & numerical data, Adult, Age Factors, Cohort Studies, Female, Follow-Up Studies, Graft Survival, Humans, Incidence, Kidney Failure, Chronic/diagnosis, Kidney Failure, Chronic/mortality, Kidney Failure, Chronic/therapy, Kidney Transplantation/methods, Living Donors, Male, Middle Aged, Preoperative Period, Proportional Hazards Models, Registries, Renal Dialysis/methods, Retrospective Studies, Risk Assessment, Sex Factors, Survival Rate, Transplantation, Homologous, Treatment Outcome
14.
J Am Soc Nephrol; 28(12): 3647-3657, 2017 Dec.
Article in English | MEDLINE | ID: mdl-28982695

ABSTRACT

Donation after circulatory death (DCD) donors are an important source of kidneys for transplantation, but DCD donor transplantation is less common in the United States than in other countries. In this study of national data obtained between 2008 and 2015, recovery of DCD kidneys varied substantially among the country's 58 donor service areas, and 25% of DCD kidneys were recovered in only four donor service areas. Overall, 20% of recovered DCD kidneys were discarded, varying from 3% to 33% among donor service areas. Compared with kidneys from neurologically brain dead (NBD) donors, DCD kidneys had a higher adjusted odds ratio of discard that varied from 1.25 (95% confidence interval [95% CI], 1.16 to 1.34) in kidneys with total donor warm ischemic time (WIT) of 10-26 minutes to 2.67 (95% CI, 2.34 to 3.04) in kidneys with total donor WIT >48 minutes. Among the 12,831 DCD kidneys transplanted, kidneys with WIT≤48 minutes had survival similar to that of NBD kidneys. DCD kidneys with WIT>48 minutes had a higher risk of allograft failure (hazard ratio, 1.23; 95% CI, 1.07 to 1.41), but this risk was limited to kidneys with cold ischemia time (CIT) >12 hours. We conclude that donor service area-level variation in the recovery and discard of DCD kidneys is large. Additional national data collection is needed to understand the potential to increase DCD donor transplantation in the United States. Strategies to minimize cold ischemic injury may safely allow increased use of DCD kidneys with WIT>48 minutes.


Subjects
Death, Kidney Failure, Chronic/surgery, Kidney Transplantation, Tissue and Organ Procurement/methods, Adolescent, Adult, Aged, Child, Cold Ischemia, Delayed Graft Function, Donor Selection, Female, Graft Survival, Humans, Kidney/pathology, Male, Middle Aged, Odds Ratio, Proportional Hazards Models, Tissue Donors, Treatment Outcome, United States, Young Adult
15.
Kidney Int; 92(2): 490-496, 2017 08.
Article in English | MEDLINE | ID: mdl-28433384

ABSTRACT

In living donor transplantation, cold ischemia time is a concern in transplants involving kidney paired donation. The impact of cold ischemia time over eight hours is unknown. Here we examined the association of cold ischemia time with delayed graft function and allograft loss among 48,498 living recipients in the Scientific Registry of Transplant Recipients registry. The incidence of delayed graft function was low but significantly higher among patients with longer cold ischemia times (0-2.0 hours: 3.3%; 2.1-4.0 hours: 3.9%; 4.1-8.0 hours: 4.3%; 8.1-16.0 hours: 5.5%). In multivariate analyses, only those with cold ischemia times of 8.1-16.0 hours had increased odds of delayed graft function (odds ratio 1.47; 95% confidence interval 1.05-2.05) compared to patients with times of 0-2.0 hours. In multivariate time-to-event analyses, cold ischemia times of 16 hours or less were not associated with allograft loss from any cause including death or death-censored graft loss (hazard ratios for cold ischemia times between 8.0-16.0 hours of 0.97 [95% confidence interval 0.74-1.26] and 1.09 [0.81-1.48] compared to patients with times of 0-2.0 hours). The results were consistent in paired and non-kidney paired donation transplants and in those with living donors over 50 years of age. In subgroup analysis restricted to kidney paired donation recipients, there was no difference in the risk of delayed graft function with an odds ratio of 1.40 (0.88, 2.40) or all-cause graft loss with a hazard ratio of 0.89 (0.62, 1.30) in transplant recipients who received kidneys that were shipped versus not shipped. Thus, a cold ischemia time up to 16 hours has limited impact on living donor outcomes. These findings may help expand living donor transplantation through kidney paired donation.


Subjects
Cold Ischemia/statistics & numerical data, Kidney Transplantation/statistics & numerical data, Adult, Delayed Graft Function, Female, Graft Survival, Humans, Living Donors, Male, Middle Aged, Young Adult
16.
J Am Soc Nephrol; 27(9): 2833-41, 2016 09.
Article in English | MEDLINE | ID: mdl-26888475

ABSTRACT

Kidney retransplantation is a risk factor for decreased allograft survival. Repeated mismatched HLA antigens between first and second transplant may be a stimulus for immune memory responses and increased risk of alloimmune damage to the second allograft. Historical data identified a role of repeated HLA mismatches in allograft loss. However, evolution of HLA testing methods and a modern transplant era necessitate re-examination of this role to more accurately risk-stratify recipients. We conducted a contemporary registry analysis of data from 13,789 patients who received a second kidney transplant from 1995 to 2011, of which 3868 had one or more repeated mismatches. Multivariable Cox proportional hazards modeling revealed no effect of repeated mismatches on all-cause or death-censored graft loss. Analysis of predefined subgroups, however, showed that any class 2 repeated mismatch increased the hazard of death-censored graft loss, particularly in patients with detectable panel-reactive antibody before second transplant (hazard ratio [HR], 1.15; 95% confidence interval [95% CI], 1.02 to 1.29). Furthermore, in those who had nephrectomy of the first allograft, class 2 repeated mismatches specifically associated with all-cause (HR, 1.30; 95% CI, 1.07 to 1.58) and death-censored graft loss (HR, 1.41; 95% CI, 1.12 to 1.78). These updated data redefine the effect of repeated mismatches in retransplantation and challenge the paradigm that repeated mismatches in isolation confer increased immunologic risk. We also defined clear recipient categories for which repeated mismatches may be of greater concern in a contemporary cohort. Additional studies are needed to determine appropriate interventions for these recipients.


Subjects
Graft Rejection/immunology, HLA Antigens/immunology, Kidney Transplantation, Adolescent, Adult, Female, Graft Rejection/surgery, Humans, Male, Middle Aged, Reoperation, Retrospective Studies, Risk Assessment, Transplantation Immunology, Young Adult
17.
Kidney Int; 89(6): 1331-6, 2016 06.
Article in English | MEDLINE | ID: mdl-27165823

ABSTRACT

Concern about the long-term impact of delayed graft function (DGF) may limit the use of high-risk organs for kidney transplantation. To understand this better, we analyzed 29,598 mate kidney transplants from the same deceased donor where only 1 transplant developed DGF. The DGF associated risk of graft failure was greatest in the first posttransplant year, and in patients with concomitant acute rejection (hazard ratio: 8.22, 95% confidence interval: 4.76-14.21). In contrast, the DGF-associated risk of graft failure after the first posttransplant year in patients without acute rejection was far lower (hazard ratio: 1.15, 95% confidence interval: 1.02-1.29). In subsequent analysis, recipients of transplants complicated by DGF still derived a survival benefit when compared with patients who received treatment with dialysis irrespective of donor quality as measured by the Kidney Donor Profile Index (KDPI). The difference in the time required to derive a survival benefit was longer in transplants with DGF than in transplants without DGF, and this difference was greatest in recipients of lower quality kidneys (difference: 250-279 days for KDPI 20%-60% vs. 809 days for the KDPI over 80%). Thus, the association of DGF with graft failure is primarily limited to the first posttransplant year. Transplants complicated by DGF provide a survival benefit compared to treatment with dialysis, but the survival benefit is lower in kidney transplants with lower KDPI. This information may increase acceptance of kidneys at high risk for DGF and inform strategies to minimize the risk of death in the setting of DGF.


Subjects
Allografts/pathology, Delayed Graft Function/complications, Graft Rejection/mortality, Graft Survival, Kidney Failure, Chronic/therapy, Kidney Transplantation/mortality, Delayed Graft Function/mortality, Female, Humans, Kidney Failure, Chronic/mortality, Kidney Transplantation/adverse effects, Male, Middle Aged, Proportional Hazards Models, Renal Dialysis/mortality, Risk Factors, Time Factors, Transplantation, Homologous/adverse effects, Transplantation, Homologous/mortality
18.
J Am Soc Nephrol; 26(10): 2483-93, 2015 Oct.
Article in English | MEDLINE | ID: mdl-25814474

ABSTRACT

Strategies to increase expanded criteria donor (ECD) transplantation are needed. We quantified the extent to which ECD kidneys provide recipients with a lifetime of allograft function by determining the difference between patient survival and death-censored allograft survival (graft survival). Initial analyses compared 5-year outcomes in the Eurotransplant Senior Program (European) and the United States Renal Data System. Among European recipients ≥65 years, patient survival exceeded graft survival, and ECD recipients returned to dialysis for an average of 5.2 months after transplant failure. Among United States recipients ≥60 years, graft survival exceeded patient survival. Although patient survival in elderly recipients in the United States was low (49% at 5 years), the average difference in patient survival at 10 years in elderly recipients in the United States with an ECD versus non-ECD transplant was only 7 months. The probability of patient survival with a functioning allograft at 5 years was higher with ECD transplantation within 1 year after activation to the waiting list than with delayed non-ECD transplantation ≥3 years after activation to the waiting list. Subsequent analyses demonstrated that ECD transplants do not provide a lifetime of allograft function in recipients <50 years in the United States. These findings should encourage ECD transplantation in patients ≥60 years, demonstrate that rapid ECD transplantation is superior to delayed non-ECD transplantation, and challenge the policy in the United States of allowing patients <50 years to receive an ECD transplant.


Subjects
Kidney Transplantation, Kidney/physiology, Adolescent, Adult, Age Factors, Allografts, Female, Graft Survival, Humans, Male, Middle Aged, Time Factors, Tissue Donors, Tissue and Organ Procurement/standards, United States, Young Adult
19.
J Am Soc Nephrol; 24(11): 1872-9, 2013 Nov.
Article in English | MEDLINE | ID: mdl-23990679

ABSTRACT

Studies of racial disparities in access to living donor kidney transplantation focus mainly on patient factors, whereas donor factors remain largely unexamined. Here, data from the US Census Bureau were combined with data on all African-American and white living kidney donors in the United States who were registered in the United Network for Organ Sharing (UNOS) between 1998 and 2010 (N=57,896) to examine the associations between living kidney donation (LKD) and donor median household income and race. The relative incidence of LKD was determined in zip code quintiles ranked by median household income after adjustment for age, sex, ESRD rate, and geography. The incidence of LKD was greater in higher-income quintiles in both African-American and white populations. Notably, the total incidence of LKD was higher in the African-American population than in the white population (incidence rate ratio [IRR], 1.20; 95% confidence interval [95% CI], 1.17 to 1.24]), but ratios varied by income. The incidence of LKD was lower in the African-American population than in the white population in the lowest income quintile (IRR, 0.84; 95% CI, 0.78 to 0.90), but higher in the African-American population in the three highest income quintiles, with IRRs of 1.31 (95% CI, 1.22 to 1.41) in Q3, 1.50 (95% CI, 1.39 to 1.62) in Q4, and 1.87 (95% CI, 1.73 to 2.02) in Q5. Thus, these data suggest that racial disparities in access to living donor transplantation are likely due to socioeconomic factors rather than cultural differences in the acceptance of LKD.


Subjects
Income, Kidney Transplantation, Living Donors, Tissue and Organ Procurement/statistics & numerical data, Adolescent, Adult, Black or African American, Aged, Female, Humans, Male, Middle Aged, Socioeconomic Factors, United States, White People
20.
Front Public Health; 12: 1336038, 2024.
Article in English | MEDLINE | ID: mdl-38481842

ABSTRACT

Background: The COVID-19 pandemic has highlighted health disparities, especially among specific population groups. This study examines the spatial relationship between the proportion of visible minorities (VM), occupation types and COVID-19 infection in the Greater Vancouver region of British Columbia, Canada. Methods: Provincial COVID-19 case data between June 24, 2020, and November 7, 2020, were aggregated by census dissemination area and linked with sociodemographic data from the Canadian 2016 census. Bayesian spatial Poisson regression models were used to examine the association between proportion of visible minorities, occupation types and COVID-19 infection. Models were adjusted for COVID-19 testing rates and other sociodemographic factors. Relative risk (RR) and 95% Credible Intervals (95% CrI) were calculated. Results: We found an inverse relationship between the proportion of the Chinese population and risk of COVID-19 infection (RR = 0.98, 95% CrI = 0.96, 0.99), whereas an increased risk was observed for the proportions of the South Asian group (RR = 1.10, 95% CrI = 1.08, 1.12), and Other Visible Minority group (RR = 1.06, 95% CrI = 1.04, 1.08). Similarly, a higher proportion of frontline workers (RR = 1.05, 95% CrI = 1.04, 1.07) was associated with higher infection risk compared to non-frontline workers. Conclusion: Despite adjustments for testing, housing, occupation, and other socioeconomic status variables, there is still a substantial association between the proportion of visible minorities, occupation types, and the risk of acquiring COVID-19 infection in British Columbia. This ecological analysis highlights the existing disparities in the burden of disease among different visible minority populations and occupation types.


Subjects
COVID-19, Minority Groups, Humans, British Columbia/epidemiology, COVID-19/epidemiology, COVID-19 Testing, Pandemics, Bayes Theorem, Occupations
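
A heavily simplified, non-spatial sketch of a Bayesian Poisson regression of area-level case counts, written with PyMC: it uses an expected-count offset and omits the spatial random effects (e.g., CAR/BYM terms) that the analysis above would include, and all data and variable names are invented:

```python
# Illustrative Bayesian Poisson regression of area-level COVID-19 counts on the
# proportions of visible minorities and frontline workers, with an expected-count
# offset. Spatial random effects are deliberately omitted from this sketch.
import numpy as np
import pymc as pm

rng = np.random.default_rng(8)
n_areas = 200
prop_vm = rng.uniform(0, 0.8, n_areas)         # proportion visible minority (hypothetical)
prop_frontline = rng.uniform(0, 0.4, n_areas)  # proportion frontline workers (hypothetical)
expected = rng.uniform(5, 50, n_areas)         # expected counts from testing/population
cases = rng.poisson(expected * np.exp(0.8 * prop_vm + 1.2 * prop_frontline - 0.5))

with pm.Model():
    intercept = pm.Normal("intercept", 0.0, 2.0)
    b_vm = pm.Normal("b_vm", 0.0, 1.0)
    b_fl = pm.Normal("b_fl", 0.0, 1.0)
    log_mu = np.log(expected) + intercept + b_vm * prop_vm + b_fl * prop_frontline
    pm.Poisson("cases", mu=pm.math.exp(log_mu), observed=cases)
    idata = pm.sample(1000, tune=1000, chains=2, random_seed=8, progressbar=False)

# Posterior-mean relative risks per unit increase in each covariate.
print(np.exp(idata.posterior["b_vm"].mean().item()),
      np.exp(idata.posterior["b_fl"].mean().item()))
```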