1.
J Clin Hypertens (Greenwich); 25(4): 315-325, 2023 Apr.
Article in English | MEDLINE | ID: mdl-36919191

ABSTRACT

Retention in hypertension care, medication adherence, and blood pressure (BP) may have been affected by the COVID-19 pandemic. In a retrospective cohort study of 64 766 individuals with treated hypertension from an integrated health care system, we compared hypertension care during the year pre-COVID-19 (March 2019-February 2020) and the first year of COVID-19 (March 2020-February 2021). Retention in hypertension care was defined as receiving clinical BP measurements during COVID-19. Medication adherence was measured using prescription refills. Clinical care was assessed by in-person and virtual visits and changes in systolic and diastolic BP. The cohort had a mean (SD) age of 67.8 (12.2) years, 51.2% were women, and 73.5% were White. Of 60 757 individuals with BP measurements pre-COVID-19, 16 618 (27.4%) had no BP measurements during COVID-19. Medication adherence declined from 86.0% to 80.8% (p < .001). In-person primary care visits decreased from 2.7 (2.7) to 1.4 (1.9) per year, while virtual contacts increased from 9.5 (12.2) to 11.2 (14.2) per year (both p < .001). Among individuals with BP measurements, mean (SD) systolic BP was 126.5 (11.8) mm Hg pre-COVID-19 and 127.3 (12.6) mm Hg during COVID-19 (p = .14). Mean diastolic BP was 73.5 (8.5) mm Hg pre-COVID-19 and 73.5 (8.7) mm Hg during COVID-19 (p = .77). Even in this integrated health care system, many individuals did not receive clinical BP monitoring during COVID-19. Most individuals who remained in care maintained their pre-COVID BP. Targeted outreach may be necessary to restore care continuity and hypertension control at the population level.
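
The adherence measure above is refill-based. A common refill metric is the proportion of days covered (PDC), with 0.80 often used as the adherence threshold; the abstract does not state the study's exact algorithm, so the sketch below (with hypothetical refill records and field names) is only an illustration of how such a metric can be computed.

```python
from datetime import date, timedelta

def proportion_of_days_covered(fills, window_start, window_end):
    """Fraction of days in [window_start, window_end] covered by refills.

    fills: list of (fill_date, days_supply) tuples.
    Returns a float in [0, 1]; >= 0.80 is a common adherence cutoff.
    """
    window_days = (window_end - window_start).days + 1
    covered = set()
    for fill_date, days_supply in fills:
        for offset in range(days_supply):
            day = fill_date + timedelta(days=offset)
            if window_start <= day <= window_end:
                covered.add(day)
    return len(covered) / window_days

# Hypothetical refill history: three 90-day fills during the measurement year.
fills = [(date(2019, 3, 10), 90), (date(2019, 6, 20), 90), (date(2019, 10, 1), 90)]
pdc = proportion_of_days_covered(fills, date(2019, 3, 1), date(2020, 2, 29))
print(f"PDC = {pdc:.2f}, adherent = {pdc >= 0.80}")
```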


Subjects
COVID-19; Delivery of Health Care, Integrated; Hypertension; Humans; Female; Aged; Male; Hypertension/drug therapy; Hypertension/epidemiology; Retrospective Studies; Pandemics; COVID-19/epidemiology; Blood Pressure; Antihypertensive Agents/therapeutic use; Antihypertensive Agents/pharmacology
2.
Tob Use Insights; 16: 1179173X221134855, 2023.
Article in English | MEDLINE | ID: mdl-36636234

ABSTRACT

Introduction: Our primary purpose is to understand comorbidities and health outcomes associated with electronic nicotine delivery systems (ENDS) use. Methods: Study participants were Kaiser Permanente (KP) members from eight US regions who joined the Kaiser Permanente Research Bank (KPRB) from September 2015 through December 2019 and completed a questionnaire assessing demographic and behavioral factors, including ENDS and traditional cigarette use. Medical history and health outcomes were obtained from electronic health records. We used multinomial logistic regression to estimate odds ratios (ORs) and 95% confidence intervals (CIs) of current and former ENDS use according to member characteristics, behavioral factors, and clinical history. We used Cox regression to estimate hazard ratios (HRs) and 95% CIs comparing risk of health outcomes according to ENDS use. Results: Of 119 593 participants, 1594 (1%) reported current ENDS use and 5603 (5%) reported past ENDS use. ENDS users were more likely to be younger, male, gay or lesbian, and American Indian/Alaska Native or Asian. After adjustment for confounding, current ENDS use was associated with current traditional cigarette use (OR = 39.55; CI: 33.44-46.77), current marijuana use (OR = 6.72; CI: 5.61-8.05), history of lung cancer (OR = 2.64; CI: 1.42-4.92), non-stroke cerebrovascular disease (OR = 1.55; CI: 1.21-1.99), and chronic obstructive pulmonary disease (OR = 2.16; CI: 1.77-2.63). Current ENDS use was also associated with increased risk of emergency room (ER) visits (HR = 1.17; CI: 1.05-1.30) and death (HR = 1.84; CI: 1.02-3.32). Conclusions: Concurrent traditional cigarette use, marijuana use, and comorbidities were prevalent among those who used ENDS, and current ENDS use was associated with an increased risk of ER visits and death. Additional research is needed on the health risks associated with concurrent ENDS and traditional cigarette use in those with underlying comorbidities.
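
The odds ratios above come from multinomial logistic regression across ENDS-use categories. The study's covariate set and data are not reproduced here; the sketch below is a minimal, hypothetical illustration of that model form using statsmodels, with simulated data and stand-in covariates.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 5000
# Hypothetical covariates (stand-ins for the member characteristics used in the study).
X = pd.DataFrame({
    "age": rng.normal(55, 15, n).round(),
    "male": rng.integers(0, 2, n),
    "current_smoker": rng.integers(0, 2, n),
})
# Outcome coded 0 = never, 1 = former, 2 = current ENDS use; never use is the reference.
y = rng.choice([0, 1, 2], size=n, p=[0.94, 0.05, 0.01])

model = sm.MNLogit(y, sm.add_constant(X)).fit(disp=False)

# Exponentiated coefficients are odds ratios relative to never use,
# one column per non-reference outcome (former, current).
print(np.exp(model.params))
```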

3.
Prev Med; 165(Pt A): 107281, 2022 Dec.
Article in English | MEDLINE | ID: mdl-36191653

ABSTRACT

Attention to health equity is critical in the implementation of firearm safety efforts. We present our operationalization of equity-oriented recommendations in preparation for launch of a hybrid effectiveness-implementation trial focused on firearm safety promotion in pediatric primary care as a universal suicide prevention strategy. In Step 1 of our process, pre-trial engagement with clinician partners and a literature review alerted us that delivery of a firearm safety program may vary by patients' medical complexity, race, and ethnicity. In Step 2, we selected the Health Equity Implementation Framework to inform our understanding of contextual determinants (i.e., barriers and facilitators). In Step 3, we leveraged an implementation pilot across 5 pediatric primary care clinics in 2 health system sites to study signals of inequities. Eligible well-child visits for 694 patients and 47 clinicians were included. Our results suggested that medical complexity was not associated with program delivery. We did see potential signals of inequities by race and ethnicity, but these must be interpreted with caution. Though we did not initially plan to examine differences by sex assigned at birth, we discovered that clinicians may be more likely to deliver the program to parents of male than female patients. Seven qualitative interviews with clinicians provided additional context. In Step 4, we interrogated equity considerations (e.g., why and how these inequities exist). In Step 5, we will develop a plan to probe potential inequities related to race, ethnicity, and sex in the fully powered trial. Our process highlights that prospective, rigorous, exploratory work is vital for equity-informed implementation trials.


Subjects
Firearms; Suicide Prevention; Infant, Newborn; Humans; Male; Child; Female; Pilot Projects; Prospective Studies; Research Design
4.
Tob Use Insights; 15: 1179173X221096638, 2022.
Article in English | MEDLINE | ID: mdl-35492220

ABSTRACT

BACKGROUND: Although combustible cigarette use is an established risk factor for severe COVID-19 disease, there is conflicting evidence on the association of electronic cigarette use with SARS-CoV-2 infection and COVID-19 disease severity. METHODS: Study participants were from the Kaiser Permanente Research Bank (KPRB), a biorepository that includes adult Kaiser Permanente members from across the United States. Starting in April 2020, electronic surveys were sent to KPRB members to assess the impact of the COVID-19 pandemic. These surveys collected information on self-reported SARS-CoV-2 infection and COVID-related risk factors, including electronic cigarette and combustible cigarette smoking history. We also used electronic health record data to assess COVID-19 diagnoses, positive PCR lab tests, hospitalizations, and death. We used multivariable Cox proportional hazards regression to calculate adjusted hazard ratios (HRs) and 95% confidence intervals (CIs) comparing the risk of SARS-CoV-2 infection across e-cigarette use categories (never, former, and current). Among those with SARS-CoV-2 infection, we used multivariable logistic regression to estimate adjusted odds ratios (ORs) and 95% CIs comparing the odds of hospitalization or death within 30 days of infection across e-cigarette use categories. RESULTS: There were 126,475 individuals who responded to the survey and completed questions on e-cigarette and combustible cigarette use (48% response rate). Among survey respondents, 819 (1%) currently used e-cigarettes, 3,691 (3%) formerly used e-cigarettes, and 121,965 (96%) had never used e-cigarettes. After adjustment for demographic, behavioral, and clinical factors, there was no association between SARS-CoV-2 infection and former e-cigarette use (hazard ratio (HR) = 0.99; CI: 0.83-1.18) or current e-cigarette use (HR = 1.08; CI: 0.76-1.52). Among those with SARS-CoV-2 infection, there was no association between hospitalization or death within 30 days of infection and former e-cigarette use (odds ratio (OR) = 1.19; CI: 0.59-2.43) or current e-cigarette use (OR = 1.02; CI: 0.22-4.74). CONCLUSIONS: Our results suggest that e-cigarette use is not associated with an increased risk of SARS-CoV-2 infection or severe COVID-19 illness.
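
The hazard ratios above come from multivariable Cox proportional hazards regression with never use as the reference category. The sketch below is a minimal illustration of that model form using lifelines on simulated data; the column names, covariates, and follow-up definition are assumptions, not the study's analytic specification.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical person-level file (not the KPRB data): e-cigarette exposure
# dummies (never use is the reference), a stand-in covariate, follow-up time
# in days, and an infection indicator.
rng = np.random.default_rng(1)
n = 2000
ecig = rng.choice(["never", "former", "current"], size=n, p=[0.96, 0.03, 0.01])
df = pd.DataFrame({
    "former_ecig": (ecig == "former").astype(int),
    "current_ecig": (ecig == "current").astype(int),
    "age": rng.normal(55, 15, n),
})
# Simulated exponential event times, administratively censored at one year.
event_time = rng.exponential(2000, n)
df["followup_days"] = np.minimum(event_time, 365)
df["infected"] = (event_time <= 365).astype(int)

cph = CoxPHFitter()
cph.fit(df, duration_col="followup_days", event_col="infected")

# Hazard ratios (exp(coef)) with 95% confidence intervals for each covariate.
print(cph.summary[["exp(coef)", "exp(coef) lower 95%", "exp(coef) upper 95%"]])
```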

5.
Vaccine; 39(8): 1283-1289, 2021 Feb 22.
Article in English | MEDLINE | ID: mdl-33485643

ABSTRACT

BACKGROUND: In some settings, research methods to determine influenza vaccine effectiveness (VE) may not be appropriate because of cost, time constraints, or other factors. Administrative database analysis of viral testing results and vaccination history may be a viable alternative. This study compared VE estimates from outpatient research and administrative databases. METHODS: Using the test-negative, case-control design, data for the 2017-2018 and 2018-2019 influenza seasons were collected using: 1) consent, specimen collection, RT-PCR testing, and vaccine verification using multiple methods; and 2) an administrative database of outpatients with a clinical respiratory viral panel combined with electronic immunization records. Odds ratios for the likelihood of influenza infection by vaccination status were calculated using multivariable logistic regression, with VE = (1 - aOR) × 100. RESULTS: Research participants were significantly younger (P < 0.001), more often white than non-white (69% vs. 59%; P < 0.001), and less frequently enrolled through the emergency department (35% vs. 72%; P < 0.001) than administrative database participants. VE was significant against all influenza and influenza A in each season and both seasons combined (37-49%). Point estimates differed between methods, with higher VE in the research database, but the differences were not statistically significant owing to small sample sizes. When enrollment sites were analyzed separately, there were significant differences in VE estimates for all influenza (66% research vs. 46% administrative; P < 0.001) and influenza A (67% research vs. 49% administrative; P < 0.001) in the emergency department. CONCLUSIONS: The selection of the appropriate method for determining influenza vaccine effectiveness depends on many factors, including sample size and subgroups of interest; the results suggest that research estimates may be more generalizable. Other advantages of research databases for VE estimates include the lack of clinician-related selection bias for testing and less misclassification of vaccination status. The advantages of administrative databases are a potentially shorter time to VE results and lower cost.
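
The effectiveness formula above, VE = (1 - aOR) × 100, takes the adjusted odds ratio for vaccination from a multivariable logistic regression of influenza test result on vaccination status. The sketch below illustrates that calculation on simulated test-negative data; the variable names and covariates are assumptions, not the study's model.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical test-negative data: flu_positive is the case/control indicator,
# vaccinated is the exposure, age and male are stand-in covariates.
rng = np.random.default_rng(2)
n = 3000
df = pd.DataFrame({
    "vaccinated": rng.binomial(1, 0.5, n),
    "age": rng.normal(50, 20, n),
    "male": rng.binomial(1, 0.5, n),
})
# Simulate a protective vaccine effect (true OR around 0.5).
logit_p = -1.0 - 0.7 * df["vaccinated"] + 0.005 * (df["age"] - 50)
df["flu_positive"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

fit = smf.logit("flu_positive ~ vaccinated + age + male", data=df).fit(disp=False)
aor = np.exp(fit.params["vaccinated"])
ci_low, ci_high = np.exp(fit.conf_int().loc["vaccinated"])

# VE = (1 - aOR) x 100; the VE lower bound comes from the upper OR bound.
print(f"aOR = {aor:.2f}, VE = {(1 - aor) * 100:.0f}% "
      f"(95% CI {(1 - ci_high) * 100:.0f}% to {(1 - ci_low) * 100:.0f}%)")
```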


Subjects
Influenza Vaccines; Influenza, Human; Case-Control Studies; Data Management; Humans; Influenza A Virus, H3N2 Subtype; Influenza, Human/epidemiology; Influenza, Human/prevention & control; Seasons; Vaccination
6.
Exp Lung Res; 44(1): 51-61, 2018 Feb.
Article in English | MEDLINE | ID: mdl-29381088

ABSTRACT

PURPOSE/AIM: Low doses (30-80 mg/kg) of monocrotaline are commonly used to create experimental models of pulmonary hypertension in rats. At these doses, monocrotaline causes pulmonary endothelial apoptosis and acute lung injury, which ultimately result in pulmonary vascular disease. Higher doses of monocrotaline (300 mg/kg) are known to cause severe liver injury, but previous investigations with lower doses have not reported histology in other organs to determine whether the vascular injury with monocrotaline is pulmonary-selective or generalized. We therefore sought to determine whether monocrotaline causes extra-pulmonary injury at doses commonly used in pulmonary hypertension studies. MATERIALS AND METHODS: We performed left pneumonectomy on young male and female rats before administering 50-60 mg/kg monocrotaline 7 days later. We monitored serum chemistry and urine dipsticks during the first 3 weeks while the animals developed pulmonary hypertension. After 3 weeks, we sacrificed the animals and stained the lungs and highly vascular visceral organs (kidney, liver, and spleen) for elastin to evaluate the degree of vascular injury and remodeling. RESULTS: We did not observe proteinuria or significant transaminitis over the 3 weeks following monocrotaline. As previously published, monocrotaline caused severe pulmonary vascular disease with neointimal lesions and medial hypertrophy. We did not identify significant large or small arterial damage in the kidneys, liver, or spleen. Two external veterinary pathologists likewise identified no histopathology in the kidneys, liver, or spleen of these rats. CONCLUSIONS: We conclude that 50-60 mg/kg of monocrotaline causes a selective pulmonary vascular lesion and that male and female rats sustain little non-pulmonary damage over 3 weeks at these doses of monocrotaline.


Subjects
Monocrotaline/adverse effects; Pneumonectomy/adverse effects; Pulmonary Artery/pathology; Acute Lung Injury/chemically induced; Animals; Apoptosis/drug effects; Endothelial Cells/drug effects; Endothelial Cells/pathology; Female; Hypertension, Pulmonary/chemically induced; Lung/blood supply; Lung/pathology; Lung Diseases/chemically induced; Male; Rats
7.
Am J Med Qual; 31(6): 501-508, 2016 Nov.
Article in English | MEDLINE | ID: mdl-26491116

ABSTRACT

Sepsis is an inflammatory response triggered by infection, with the risk of in-hospital mortality fueled by disease progression. Early recognition and intervention by multidisciplinary sepsis programs may reverse the inflammatory response among at-risk patient populations, potentially improving outcomes. This retrospective study of a sepsis program enabled by a 2-stage sepsis clinical decision support (CDS) system sought to evaluate the program's impact, identify early indicators that may influence outcomes, and uncover opportunities for quality improvement. Data encompassed 16 527 adult hospitalizations from 2014 and 2015. Of 2108 non-intensive care unit patients screened in by the sepsis CDS, 97% were stratified by 177 providers. Risk of adverse outcomes improved by 30% from baseline to year end, with gains materializing and stabilizing at month 7 after the sepsis program went live. Early indicators likely to influence outcomes include patient age, recent hospitalization, electrolyte abnormalities, hypovolemic shock, hypoxemia, patient location when the sepsis CDS activated, and specific alert patterns.


Subjects
Decision Support Systems, Clinical; Interdisciplinary Communication; Sepsis/therapy; Aged; Aged, 80 and over; Female; Humans; Male; Middle Aged; Patient Care Team; Program Evaluation; Retrospective Studies; Sepsis/diagnosis; Sepsis/mortality; Treatment Outcome
8.
JRSM Open; 6(10): 2054270415609004, 2015 Oct.
Article in English | MEDLINE | ID: mdl-26688744

ABSTRACT

OBJECTIVE: To examine the diagnostic accuracy of a two-stage clinical decision support system for early recognition and stratification of patients with sepsis. DESIGN: Observational cohort study employing a two-stage sepsis clinical decision support system to recognise and stratify patients with sepsis. The first-stage component was a cloud-based clinical decision support system with 24/7 surveillance to detect patients at risk of sepsis; it delivered notifications to the patient's designated nurse, who then electronically contacted a provider. The second-stage component was a sepsis screening and stratification form integrated into the patient electronic health record, essentially an evidence-based decision aid, used by providers to assess patients at the bedside. SETTING: Urban, 284 acute-bed community hospital in the USA; 16,000 hospitalisations annually. PARTICIPANTS: Data on 2620 adult patients were collected retrospectively in 2014, after the clinical decision support system was implemented. MAIN OUTCOME MEASURE: 'Suspected infection' was the established gold standard used to assess the clinimetric performance of the clinical decision support system. RESULTS: A sepsis alert was activated for 417 (16%) of the 2620 hospitalised adult patients. Applying 'suspected infection' as the standard, the first-stage alert showed 72% sensitivity and 73% positive predictive value. Post-alert screening conducted by providers at the bedside of the 417 patients achieved 81% sensitivity and 94% positive predictive value. Providers documented against 89% of patients with an alert activated by the clinical decision support system and completed 75% of bedside screening and stratification of patients with sepsis within one hour of notification. CONCLUSION: A clinical decision support binary alarm system with cross-checking functionality improves early recognition and facilitates stratification of patients with sepsis.
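
Sensitivity and positive predictive value here are ratios over the two-by-two table of alert status against the 'suspected infection' reference standard. The sketch below just shows that arithmetic; the cell counts are hypothetical (chosen to roughly reproduce the first-stage 72%/73% figures), since the abstract reports only the resulting percentages.

```python
def sensitivity(tp, fn):
    """True positives / all reference-standard positives."""
    return tp / (tp + fn)

def positive_predictive_value(tp, fp):
    """True positives / all alert (test) positives."""
    return tp / (tp + fp)

# Hypothetical 2x2 counts for an alert evaluated against 'suspected infection'.
tp, fp, fn = 300, 110, 115
print(f"sensitivity = {sensitivity(tp, fn):.0%}")          # ~72%
print(f"PPV = {positive_predictive_value(tp, fp):.0%}")    # ~73%
```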

9.
Eur J Appl Physiol; 113(5): 1223-31, 2013 May.
Article in English | MEDLINE | ID: mdl-23160652

ABSTRACT

Cooling vests (CVs) are often used to reduce heat strain. CVs have traditionally used ice as the coolant, although other phase-change materials (PCMs) that melt at warmer temperatures have been used in an attempt to enhance cooling by avoiding the vasoconstriction that supposedly occurs when ice CVs are used. This study assessed the effectiveness of four CVs that melted at 0, 10, 20 and 30 °C (CV0, CV10, CV20, and CV30) when worn by 10 male volunteers exercising and then recovering in 40 °C air whilst wearing fire-fighting clothing. When compared with a non-cooling control condition (CON), only the CV0 and CV10 vests provided cooling during exercise (40 and 29 W, respectively), whereas all CVs provided cooling during resting recovery (CV0 69 W, CV10 66 W, CV20 55 W and CV30 29 W) (P < 0.05). In all conditions, skin blood flow increased during exercise and fell during recovery; it was lower in the CV0 and CV10 conditions than in the control condition during exercise (observed power 0.709) (P < 0.05), but not during resting recovery (observed power only 0.55). The participants preferred the CV10 to the CV0, which caused temporary erythema of the underlying skin, although this resolved overnight after each occurrence. Consequently, a cooling vest melting at 10 °C would seem to be the most appropriate choice for cooling during combined work and rest periods, although an ice vest (CV0) may also be appropriate if more insulation were worn between the cooling packs and the skin than was used in this study.


Subjects
Freezing; Hot Temperature; Protective Clothing; Adult; Case-Control Studies; Cold Temperature; Exercise; Firefighters; Humans; Male; Skin Temperature
10.
N Engl J Med; 358(13): 1402; author reply 1404-5, 2008 Mar 27.
Article in English | MEDLINE | ID: mdl-18367747
12.
Occup Med (Lond); 55(5): 380-4, 2005 Aug.
Article in English | MEDLINE | ID: mdl-15860484

ABSTRACT

AIM: To test the hypothesis that measures of aerobic fitness, body mass and indices of body composition will influence the metabolic and cardiovascular demands of simulated load-carriage tasks. METHOD: Twenty-eight healthy male volunteers, following assessment of maximal oxygen uptake (VO2max) and body composition, walked on a treadmill at 4 kph (1.11 m/s) for 60 min on gradients of 0, 3, 6 and 9% whilst carrying backpack loads of 0, 20 and 40 kg. During the final 3 min of each 5-min exercise bout, indirect respiratory calorimetry and heart rate data were collected and the 'steady-state' metabolic (VO2) and cardiovascular (heart rate) demands quantified. RESULTS: Absolute VO2max (ml/min) produced the strongest correlation (r = -0.64, P < 0.01) with the metabolic demand of heavy load-carriage (40 kg). The body composition index lean body mass/(fat mass + external load) produced a moderate correlation (r = -0.52, P < 0.01) with the metabolic demand of heavy load-carriage. The increases in metabolic and cardiovascular demands were greater when the load carried increased from 20 to 40 kg than from 0 to 20 kg at all four gradients. A model incorporating anthropometric and physiological characteristics with gradient and load explains 89% of the variability in the metabolic demands of load-carriage, compared with 82% using gradient and load alone. CONCLUSION: The results show that indices of body composition as well as absolute aerobic power influence the relative metabolic demands of load-carriage. Application of these measurements would ensure selection criteria for load-carriage occupations are based on lean muscle mass rather than running speed.
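
The 89% versus 82% comparison above is a variance-explained (R²) contrast between a model using gradient and load alone and one that adds anthropometric and physiological characteristics. The sketch below illustrates that kind of nested-model comparison with ordinary least squares on simulated data; the variable names, coefficients, and data are assumptions, not the study's model.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# 336 simulated observations (on the order of 28 subjects x 12 gradient/load
# conditions), with hypothetical oxygen cost (vo2, ml/min) driven by gradient,
# external load, body mass, and absolute VO2max.
rng = np.random.default_rng(3)
n = 336
df = pd.DataFrame({
    "gradient": rng.choice([0, 3, 6, 9], n),
    "load_kg": rng.choice([0, 20, 40], n),
    "body_mass": rng.normal(78, 9, n),
    "vo2max": rng.normal(3900, 500, n),
})
df["vo2"] = (600 + 60 * df["gradient"] + 18 * df["load_kg"]
             + 6 * df["body_mass"] - 0.05 * df["vo2max"]
             + rng.normal(0, 120, n))

# Base model: gradient and load only; full model adds subject characteristics.
base = smf.ols("vo2 ~ gradient + load_kg", data=df).fit()
full = smf.ols("vo2 ~ gradient + load_kg + body_mass + vo2max", data=df).fit()
print(f"gradient + load only:          R^2 = {base.rsquared:.2f}")
print(f"with subject characteristics:  R^2 = {full.rsquared:.2f}")
```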


Subjects
Body Composition/physiology; Oxygen Consumption/physiology; Weight-Bearing/physiology; Work Capacity Evaluation; Adult; Energy Metabolism/physiology; Exercise Test; Heart Rate/physiology; Humans; Linear Models; Male