Results 1-20 of 211
1.
Foot Ankle Orthop ; 9(3): 24730114241263093, 2024 Jul.
Article in English | MEDLINE | ID: mdl-39086381

ABSTRACT

Background: Patients with poor glycemic control are at increased risk of postoperative complications. Hemoglobin A1c (HbA1c) has traditionally been used to assess preoperative glycemic control, but with limitations. More recently, fructosamine has been tested preoperatively in patients undergoing elective total joint arthroplasty. This study aims to assess whether preoperative serum fructosamine can be used to predict adverse outcomes in patients undergoing foot and ankle surgery. Methods: This was a retrospective chart review of all patients who underwent foot and ankle surgeries at 2 level 1 trauma centers from January 2020 to December 2021. Of those, 305 patients were tested for HbA1c and fructosamine levels preoperatively. Adverse outcomes were assessed over 30 and 90 days. Outcomes of interest were surgical site infection, wound dehiscence, unplanned return to the operating room, unplanned readmission, and death. Data were analyzed using independent 2-sample t tests. A mixed effects model was used for multivariate analysis. P values less than .05 were considered statistically significant. Results: Preoperative serum fructosamine was significantly higher (P = .029) in those with complications within 90 days than in those without. The mean preoperative fructosamine level was 269.2 µmol/L (SD = 58.85) in those who had a complication vs 247.2 µmol/L (SD = 53.95) in those who did not. A clinically significant fructosamine threshold was determined using 2 different methods. Fructosamine was found to be noninferior to HbA1c in predicting postoperative complications. Conclusion: Fructosamine is a serum marker that reflects nearer-term glycemic control than HbA1c. Elevation in preoperative fructosamine is associated with increased perioperative complications within 90 days after foot and ankle surgery. Preoperative fructosamine may be used in patient optimization and risk stratification when determining candidacy and timing for elective foot and ankle surgeries. Level of evidence: Level III, retrospective cohort study.

2.
J Dent Sci ; 19(3): 1673-1679, 2024 Jul.
Article in English | MEDLINE | ID: mdl-39035261

ABSTRACT

Background/purpose: Interproximal contact loss may lead to food impaction and result in subsequent periodontal complications. The purpose of this prospective study was to investigate the peri-implant parameters of posterior implant-supported single crowns (SCs) with and without mesial proximal contact loss after 2 years of follow-up. Material and methods: Twenty-six patients with a total of 40 posterior implant-supported SCs with mesial adjacent natural teeth were observed for 24 months after crown insertion. The mesial proximal contacts were assessed with dental floss and classified as tight, weak, or open. The following peri-implant parameters were evaluated at six sites per tooth (mesiofacial, midfacial, distofacial, mesiolingual, midlingual, and distolingual) at the 6-, 12-, 18-, and 24-month follow-up visits: modified plaque index (MPI), modified gingival index (MGI), and probing depth (PD). Furthermore, radiographs were taken at the 12- and 24-month recall sessions to measure marginal bone loss (MBL). Results: At the 12-month observation, the incidence rates of weak and open contacts were 22.5% and 12.5%, respectively; after 24 months of clinical service, the rates were 12.9% and 25.6%. No significant differences were found between the tight, weak, and open contact groups in MPI, MGI, or PD (P > 0.05) at the 12- and 24-month follow-ups. The mean peri-implant parameters (MPI, MGI, PD, and MBL) did not differ significantly between the tight, weak, and open contact groups after 1 and 2 years of clinical service (P > 0.05). Conclusion: The presence of open, weak, and tight mesial proximal contacts had no significant effects on peri-implant tissue conditions.

3.
Res Sq ; 2024 Jun 11.
Article in English | MEDLINE | ID: mdl-38946990

ABSTRACT

Background: Sedentary behavior (SB) is detrimental to cardiometabolic disease (CMD) risk, which can begin in young adulthood. To devise effective SB-CMD interventions in young adults, it is important to understand which context-specific sedentary behaviors (CS-SB) are most detrimental to CMD risk, the lifestyle behaviors that co-exist with CS-SB, and the socioecological predictors of CS-SB. Methods: This longitudinal observational study will recruit 500 college-aged (18-24 years) individuals. Two laboratory visits will occur, spaced 12 months apart, at which a composite CMD risk score (based on arterial stiffness, metabolic and inflammatory biomarkers, heart rate variability, and body composition) will be calculated and questionnaires measuring lifestyle behaviors and different levels of the socioecological model will be administered. After each visit, total SB (T-SB; activPAL) and CS-SB (television, transportation, academic/occupational, leisure computer, "other"; ecological momentary assessment) will be measured across seven days. Discussion: It is hypothesized that certain CS-SB will show stronger associations with CMD risk than T-SB, even after accounting for coexisting lifestyle behaviors. It is expected that a range of intra-individual, inter-individual, and physical-environment socioecological factors will predict CS-SB. The findings from this study will support the development of an evidence-based, multi-level intervention to target SB reduction and mitigate CMD risk in CBYA.

4.
J Adolesc Health ; 75(3): 487-495, 2024 Sep.
Article in English | MEDLINE | ID: mdl-38980246

ABSTRACT

PURPOSE: Despite increasing use of long-acting reversible contraception (LARC) among U.S. adolescents, there is limited literature on factors affecting intrauterine device (IUD) or subdermal implant use. This study aimed to describe statewide rates and associated patient and provider factors of adolescent IUD or implant initiation and continuation. METHODS: This retrospective cohort study used N.C. Medicaid claims data. 10,408 adolescents were eligible (i.e., aged 13-19 years, female sex, continuous Medicaid enrollment, and an IUD or implant insertion or removal code from January 1, 2013, to October 1, 2015). Bivariate analyses assessed differences between adolescents using an IUD versus an implant. Kaplan-Meier curves were created to assess IUD or implant discontinuation through December 31, 2018. RESULTS: Adolescents initiated 8,592 implants and 3,369 IUDs (N = 11,961). There were significant differences in nearly all provider and patient factors between those who initiated implants versus IUDs. 16% of implants and 53% of IUDs were removed in the first year. Younger (i.e., <18 years), Hispanic, and Black adolescents had higher adjusted continuation of implants compared with older and White adolescents, respectively (both p < .001). Those whose IUD was inserted by an obstetrician/gynecologist had lower continuation of IUDs compared with those seen by non-obstetrician/gynecologist providers (p < .001). DISCUSSION: We found that age-related, racial, and ethnic disparities exist in both implant and IUD continuation. Practice changes to support positive adolescent experiences with implant and IUD insertions and removals are needed, including patient-centered health care provider training in contraception counseling, LARC initiation and removal training for adolescent-facing providers, and broader clinic capacity for LARC services.


Subjects
Long-Acting Reversible Contraception ; Medicaid ; Humans ; Adolescent ; Female ; Medicaid/statistics & numerical data ; United States ; Long-Acting Reversible Contraception/statistics & numerical data ; Long-Acting Reversible Contraception/trends ; Retrospective Studies ; Young Adult ; Intrauterine Devices/statistics & numerical data ; Intrauterine Devices/trends
5.
J Hypertens ; 42(9): 1624-1631, 2024 Sep 01.
Article in English | MEDLINE | ID: mdl-38860390

ABSTRACT

OBJECTIVES: Average values for self-measured blood pressure (SMBP) more accurately reflect a patient's risk of cardiovascular disease than do office measurements. Oftentimes, however, patients provide lists of individual home blood pressure (BP) measurements, and average values cannot be computed within the time constraints of a clinic visit. In contrast, the home BP load - defined as the proportion of BP values greater than a partition value (e.g., 130 mmHg) - can be easily calculated. We examined the utility of the BP load in predicting the mean SMBP and confirming elevated SMBP. METHODS: Four hundred twenty untreated adults at least 30 years of age acquired SMBP data twice in the morning and twice in the evening over 10 days. The 'true' SMBP was defined as the mean of these 40 determinations. RESULTS: Using all 10 days of BP data and a systolic BP threshold of 130 mmHg, the average SMBP associated with a home BP load of 0.50 was 130 mmHg, with a 95% prediction interval of 126-133 mmHg. True systolic SMBP was approximately 6 mmHg lower and higher at home BP loads of 0.25 and 0.75, respectively. There was a 90% probability that the true systolic SMBP was greater than 130 mmHg if the systolic home BP load was at least 0.60. Corresponding values for 3 days and 1 day of SMBP were at least 0.68 and at least 0.84, respectively. CONCLUSION: Our analysis demonstrates that the home BP load can be used to estimate the average BP acquired on home monitoring and confirm elevated SMBP.
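The home BP load described above is straightforward to compute at the point of care. A minimal sketch (function names are illustrative; the 130-mmHg partition value and the 10-day load threshold of 0.60 are taken from the abstract):

```python
# Sketch of the home BP "load": the proportion of self-measured systolic
# readings above a partition value (130 mmHg in the study).

def bp_load(readings_mmhg, partition=130):
    """Proportion of readings strictly greater than the partition value."""
    if not readings_mmhg:
        raise ValueError("no readings supplied")
    return sum(1 for r in readings_mmhg if r > partition) / len(readings_mmhg)

def likely_elevated(readings_mmhg, load_threshold=0.60, partition=130):
    """True when the load meets the 10-day threshold (0.60) the abstract
    associates with a 90% probability that true systolic SMBP > 130 mmHg."""
    return bp_load(readings_mmhg, partition) >= load_threshold

# 40 readings (twice each morning and evening over 10 days): 28 above 130.
readings = [134] * 28 + [126] * 12
print(round(bp_load(readings), 2))  # → 0.7
print(likely_elevated(readings))    # → True
```

Note that the abstract's thresholds rise to 0.68 and 0.84 when only 3 days or 1 day of readings are available, so `load_threshold` would need to be adjusted for shorter monitoring windows.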


Subjects
Blood Pressure Monitoring, Ambulatory ; Blood Pressure ; Humans ; Female ; Male ; Blood Pressure/physiology ; Middle Aged ; Blood Pressure Monitoring, Ambulatory/methods ; Adult ; Aged ; Hypertension/physiopathology ; Hypertension/diagnosis ; Blood Pressure Determination/methods
6.
J Am Geriatr Soc ; 2024 Jun 19.
Article in English | MEDLINE | ID: mdl-38895937

ABSTRACT

BACKGROUND: The population of people living with dementia (PLwD) continues to grow in Japan, where advance care planning (ACP) for PLwD is relatively new. Our aim was to evaluate the feasibility and cultural acceptability of a dementia-specific ACP communication skills toolkit for Japanese primary care clinicians. METHODS: We delivered 13 training sessions in primary care clinics across central Japan and conducted a post-training survey to assess whether the toolkit increased confidence in dementia-specific ACP communication skills, and to gauge the acceptability of the toolkit using the following four statements: (1) The language in the sessions was clear; (2) The sessions took an appropriate amount of time to complete; (3) The design of the sessions was an effective educational method; and (4) The sessions were culturally appropriate for communication with Japanese patients with dementia and their family members. We asked participants to respond using a 5-point Likert scale from strongly agree to strongly disagree. RESULTS: All participants were Japanese and included 80 physicians (mean age 39.8 years), 33 nurses (mean age 45.7 years), and 58 other participants (mean age 42.9 years), who were 30.0%, 87.9%, and 55.2% female, respectively. Most participants practiced in rural settings. In pre-/post-comparisons, participant confidence increased in determining capacity, understanding dementia prognosis, discussing goals of care, eliciting surrogates, recommending self-care practices to families, and leading family meetings (all p < 0.001). Most participants strongly agreed or agreed that the toolkit was an effective method (96.9%), took an appropriate amount of time (94.5%), contained clear language (89.8%), and was culturally appropriate (73.6%). CONCLUSIONS: A dementia-specific ACP communication skills toolkit can be delivered in Japan. Japanese primary care clinicians generally felt that it increased their confidence in ACP communication skills and was acceptable. The language, time, and design were well received, though further work is needed to improve the cultural appropriateness of the toolkit.

7.
Article in English | MEDLINE | ID: mdl-38771793

ABSTRACT

American Indian and Alaska Native (AI/AN) adolescents face health disparities resulting from historical traumas. There is a paucity of research focusing on mental health in AI/AN adolescents or on the relationship between cultural connection and health. This project assesses the relationship between cultural identity and markers of mental health and well-being for AI/AN adolescents. Adolescents 12 to 18 years old from the Lumbee Tribe of North Carolina participated in this mixed-methods study. Phase 1, discussed in this manuscript, involved surveys using validated instruments to assess cultural connection and markers of mental health and well-being. Characteristics of the 122 AI/AN youth who completed the survey included: mean age 14.9 years (SD = 2.0); 61% (n = 75) assigned female at birth; 56% (n = 70) identified as female; and 4.1% (n = 5) identified as non-binary. Mean tribal affiliation (TA) and ethnic identity (EI) scores suggest strong cultural connection (TA: M = 3.1/5, SD = 0.6; EI: M = 3.4/5, SD = 0.9). Sleep quality (M = 2.63/5) and positive stress management (M = 2.06/5) were low. Bivariate and logistic regression analyses demonstrated moderate positive correlations between EI and friendship, EI and emotional support, TA and friendship, and TA and emotional support. AI/AN adolescents in this sample have a moderate to strong connection with Native culture, marked by ethnic identity and tribal affiliation, and positive markers of mental health and well-being. Data from this study may be used for policy formulation to promote increased funding and programming addressing mental health for AI/AN youth.


Subjects
Indians, North American ; Humans ; Adolescent ; Female ; Male ; Indians, North American/ethnology ; Child ; Mental Health/ethnology ; North Carolina ; Alaska Natives ; Social Identification
8.
J Infect Dis ; 230(2): 485-496, 2024 Aug 16.
Article in English | MEDLINE | ID: mdl-38781438

ABSTRACT

BACKGROUND: Asymptomatic carriage of malaria parasites persists even as malaria transmission declines. Low-density infections are often submicroscopic, not detected with rapid diagnostic tests (RDTs) or microscopy but detectable by polymerase chain reaction (PCR). METHODS: To characterize submicroscopic Plasmodium falciparum carriage in an area of declining malaria transmission, asymptomatic persons >5 years of age in rural Bagamoyo District, Tanzania, were screened using RDT, microscopy, and PCR. We investigated the size of the submicroscopic reservoir of infection across villages, determined factors associated with submicroscopic carriage, and assessed the natural history of submicroscopic malaria over 4 weeks. RESULTS: Among 6076 participants, P. falciparum prevalences by RDT, microscopy, and PCR were 9%, 9%, and 28%, respectively, with roughly two-thirds of PCR-positive individuals harboring submicroscopic infection. Adult status, female sex, dry season months, screened windows, and bed net use were associated with submicroscopic carriage. Among 15 villages encompassing 80% of participants, the proportion of submicroscopic carriers increased with decreasing village-level malaria prevalence. Over 4 weeks, 23% of submicroscopic carriers (61 of 266) became RDT positive, with half exhibiting symptoms, while half (133 of 266) were no longer parasitemic at the end of 4 weeks. Progression to RDT-positive patent malaria occurred more frequently in villages with higher malaria prevalence. CONCLUSIONS: Microheterogeneity in transmission observed at the village level appears to affect both the size of the submicroscopic reservoir and the likelihood of submicroscopic carriers developing patent malaria in coastal Tanzania.


Subjects
Carrier State ; Malaria, Falciparum ; Plasmodium falciparum ; Humans ; Tanzania/epidemiology ; Female ; Malaria, Falciparum/transmission ; Malaria, Falciparum/epidemiology ; Malaria, Falciparum/parasitology ; Male ; Adult ; Adolescent ; Plasmodium falciparum/genetics ; Plasmodium falciparum/isolation & purification ; Child ; Carrier State/transmission ; Carrier State/epidemiology ; Carrier State/parasitology ; Young Adult ; Child, Preschool ; Prevalence ; Middle Aged ; Rural Population ; Polymerase Chain Reaction ; Microscopy ; Asymptomatic Infections/epidemiology ; Aged
9.
medRxiv ; 2024 May 13.
Article in English | MEDLINE | ID: mdl-38798425

ABSTRACT

In chronic disease epidemiology, investigation of disease etiology has largely focused on a single endpoint; the progression of chronic disease as a multi-state process is understudied, representing a knowledge gap. Most existing multi-state regression models require the Markov assumption and are unsuitable for estimating the progression of chronic diseases, which is largely non-memoryless. We propose a new non-Markov framework that allows past states to affect the transition rates of current states. The key innovation is that we convert a non-Markov process to a Markov process by dividing disease states into substates through conditioning on past disease history. Specifically, we apply cause-specific Cox models (CSC) that include past states as covariates to obtain the transition rates (TR) of substates, which are then used to obtain the transition probabilities (TP) and state occupational probabilities (SOP) of the substates. We applied our model to describe the progression of coronary heart disease (CHD) in the ARIC study, where CHD was modeled with healthy, at-risk, CHD, heart failure, and mortality states. We present transition rates, transition probabilities, and state occupational probabilities between states from age 45 to 95 years. In summary, the significance of our framework lies in that transition parameters between disease substates may shed new mechanistic light on chronic disease and may describe non-Markov processes more accurately than Markov regression models. Our method has the potential for wide application in chronic disease epidemiology.
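The substate construction can be made concrete with a toy example. This sketch uses hypothetical states and trajectories, and empirical counts stand in for the abstract's cause-specific Cox models; it shows only the core idea that conditioning on past history turns a history-dependent process into a Markov one over substates:

```python
# Toy illustration: a non-Markov process becomes Markov once each state is
# split into substates conditioned on past disease history.

from collections import Counter

# Hypothetical trajectories through states H (healthy), R (at risk), C (CHD)
paths = [
    ["H", "R", "C"],
    ["H", "R", "R"],
    ["H", "H", "R"],
    ["R", "C", "C"],
]

transitions = Counter()  # (substate, next raw state) -> count
occupancy = Counter()    # substate -> number of observed departures
for path in paths:
    # A substate is (current state, tuple of past states), so two subjects
    # in "R" reached via different histories occupy different substates.
    substates = [(state, tuple(path[:i])) for i, state in enumerate(path)]
    for a, b in zip(substates, substates[1:]):
        transitions[(a, b[0])] += 1
        occupancy[a] += 1

def p(next_state, substate):
    """Empirical transition probability P(next raw state | substate)."""
    return transitions[(substate, next_state)] / occupancy[substate]

# Out of state "R" reached from "H": one of two subjects moved on to "C".
print(p("C", ("R", ("H",))))  # → 0.5
```

In the paper's setting, the per-substate transition rates would come from regression models rather than raw counts, but the bookkeeping (states expanded by history) is the same.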

10.
J Child Orthop ; 18(2): 229-235, 2024 Apr.
Article in English | MEDLINE | ID: mdl-38567044

ABSTRACT

Purpose: The Greulich and Pyle atlas is the most widely used system for estimating skeletal maturity but has significant drawbacks, prompting the development of newer skeletal maturity systems, such as the modified Fels skeletal maturity system based on knee radiographs. To create a new skeletal maturity system, an outcome variable, termed a "skeletal maturity standard," must be selected for calibration of the system. Peak height velocity and 90% of final height are both considered reasonable skeletal maturity standards for skeletal maturity system development. We sought to answer two questions: (1) Does a skeletal maturity system developed using 90% of final height estimate skeletal age as well as it would if it were instead developed using peak height velocity? (2) Does a skeletal maturity system developed using 90% of final height perform as well in lower extremity length prediction as it would if it were instead developed using peak height velocity? Methods: The modified Fels knee skeletal maturity system was recalibrated based on the 90% of final height and peak height velocity skeletal maturity standards. These models were applied to 133 serially obtained, peripubertal anteroposterior knee radiographs collected from 38 subjects. Each model was used to estimate the skeletal age of each radiograph. Skeletal age estimates were also used to predict each patient's ultimate femoral and tibial length using the White-Menelaus method. Results: The skeletal maturity system calibrated with 90% of final height produced more accurate skeletal age estimates than the same system calibrated with peak height velocity (p < 0.05). The 90% of final height and peak height velocity models made similar femoral and tibial length predictions (p > 0.05). Conclusion: Using the 90% of final height skeletal maturity standard allows for simpler skeletal maturity system development than peak height velocity, with potentially greater accuracy.

11.
BMC Public Health ; 24(1): 921, 2024 Mar 29.
Article in English | MEDLINE | ID: mdl-38553694

ABSTRACT

BACKGROUND: The workplace can play an important role in shaping the eating behaviors of U.S. adults. Unfortunately, foods obtained in the workplace tend to be low in nutritional quality. Questions remain about the best way to promote healthy food purchases among employees and to what extent health promotion activities should be tailored to the demographic characteristics of the employees. The purpose of this study was to (1) assess the nutritional quality of lunchtime meal purchases by employees in cafeterias of a large organization, (2) examine associations between lunchtime meal quality selection and the demographic characteristics of employees, and (3) determine the healthfulness of foods and beverages offered in the cafeterias of this organization. METHODS: A cross-sectional analysis was conducted using secondary data from a food labeling study implemented in three worksite cafeterias. Demographic data were collected via surveys, and meal data were collected using a photo capture system for 378 participants. The Healthy Eating Index 2015 (HEI-2015) was used to determine meal quality and a total score for the menu of options available in the cafeterias during the study period. Summary statistics were generated, and analysis of variance (ANOVA) was used to compare HEI-2015 scores between groups. RESULTS: The mean HEI-2015 total score for the menu items offered (n = 1,229) in the cafeterias during the study period was 63.1 (SD = 1.83). The mean HEI-2015 score for individual lunchtime meal observations (n = 378) was 47.1 (SD = 6.8). In general, HEI-2015 total scores were higher for non-smokers, individuals who self-identified as Asian, those with higher physical activity levels, those who scored higher on numeracy and literacy assessments, and those reporting higher education levels, incomes, and health status.
CONCLUSIONS: The overall HEI-2015 scores indicate that the menu of options offered in the cafeterias and individual meal selections did not align with the Dietary Guidelines for Americans, and there were significant associations between average lunchtime meal quality scores and several demographic characteristics. These results suggest that healthy eating promotion activities in workplaces may need to be tailored to the demographic characteristics of the employees, and efforts to improve the food environment in the workplace could improve meal quality for all employees.


Subjects
Lunch ; Meals ; Adult ; Humans ; Cross-Sectional Studies ; Workplace ; Health Status ; Diet
13.
JAMA Pediatr ; 178(3): 306-308, 2024 Mar 01.
Article in English | MEDLINE | ID: mdl-38190303

ABSTRACT

This cross-sectional study compares monthly rates of long-acting reversible contraception (LARC) insertions among adolescents before and after an American Academy of Pediatrics (AAP) policy statement recommending LARC for this age group.


Subjects
Long-Acting Reversible Contraception ; Pregnancy in Adolescence ; Humans ; United States ; Adolescent ; Child ; Pregnancy ; Female ; Pregnancy in Adolescence/prevention & control ; Contraception ; Policy
14.
PLoS Negl Trop Dis ; 17(12): e0011274, 2023 Dec.
Article in English | MEDLINE | ID: mdl-38064489

ABSTRACT

Plasmodium ovale curtisi (Poc) and Plasmodium ovale wallikeri (Pow) represent distinct non-recombining Plasmodium species that are increasing in prevalence in sub-Saharan Africa. Though they circulate sympatrically, co-infection within human and mosquito hosts has rarely been described. Separate 18S rRNA real-time PCR assays that detect Poc and Pow were modified to allow species determination in parallel under identical cycling conditions. The lower limit of detection was 0.6 plasmid copies/µL (95% CI 0.4-1.6) for Poc and 4.5 plasmid copies/µL (95% CI 2.7-18) for Pow, or 0.1 and 0.8 parasites/µL, respectively, assuming 6 copies of 18S rRNA per genome. However, the assays showed cross-reactivity at concentrations greater than 10³ plasmid copies/µL (roughly 200 parasites/µL). Mock mixtures were used to establish criteria for classifying mixed Poc/Pow infections that prevented false-positive detection while maintaining sensitive detection of the minority ovale species down to 10⁰ copies/µL (<1 parasite/µL). When the modified real-time PCR assays were applied to field-collected blood samples from Tanzania and Cameroon, species identification by real-time PCR was concordant with nested PCR in 19 samples, but additionally detected two mixed Poc/Pow infections where nested PCR detected a single Po species. When real-time PCR was applied to oocyst-positive Anopheles midguts saved from mosquitoes fed on P. ovale-infected persons, mixed Poc/Pow infections were detected in 11/14 (79%). Based on these results, 8/9 P. ovale carriers transmitted both P. ovale species to mosquitoes, though both Po species could only be detected in the blood of two carriers. The described real-time PCR approach can be used to identify the natural occurrence of mixed Poc/Pow infections in human and mosquito hosts and reveals that such co-infections and co-transmission are likely more common than appreciated.
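The copies-to-parasites conversion underlying the reported detection limits is simple arithmetic. A quick check, assuming (as the abstract states) roughly 6 copies of 18S rRNA per parasite genome:

```python
# Convert assay limits of detection from plasmid copies/µL to an
# equivalent parasite density, assuming ~6 copies of 18S rRNA per genome.

COPIES_PER_GENOME = 6

def copies_to_parasites(copies_per_ul, copies_per_genome=COPIES_PER_GENOME):
    """Equivalent parasite density for a given copy concentration."""
    return copies_per_ul / copies_per_genome

print(round(copies_to_parasites(0.6), 2))  # Poc LOD → 0.1 parasites/µL
print(copies_to_parasites(4.5))            # Pow LOD → 0.75 (~0.8) parasites/µL
```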


Subjects
Anopheles ; Malaria ; Plasmodium ovale ; Animals ; Humans ; Real-Time Polymerase Chain Reaction/methods ; Plasmodium ovale/genetics ; RNA, Ribosomal, 18S/genetics ; Nucleic Acid Amplification Techniques ; Anopheles/genetics ; Malaria/diagnosis ; Malaria/epidemiology
15.
Chest ; 2023 Dec 09.
Article in English | MEDLINE | ID: mdl-38072392

ABSTRACT

BACKGROUND: Primary ciliary dyskinesia (PCD) is a rare disorder of motile cilia associated with situs abnormalities. At least 12% of patients with PCD have situs ambiguus (SA), including organ laterality defects falling outside normal arrangement (situs solitus [SS]) or mirror-image inversion (situs inversus totalis [SIT]). RESEARCH QUESTION: Do patients with PCD and SA have worse clinical outcomes than those with SS or SIT? STUDY DESIGN AND METHODS: This cross-sectional, multicenter study evaluated participants aged 21 years or younger with PCD. Participants were classified as having SA, including heterotaxy, or not having SA (SS or SIT). Markers of disease severity were compared between situs groups, adjusting for age at enrollment and severe CCDC39 or CCDC40 genotype, using generalized linear models and logistic and Poisson regression. RESULTS: Of 397 participants with PCD (mean age, 8.4 years; range, 0.1-21), 42 were classified as having SA, including 16 patients (38%) with complex cardiovascular malformations (CVM) or atrial isomerism, 13 patients (31%) with simple CVM, and 13 patients (31%) without CVM. Of these, 15 patients (36%) underwent cardiac surgery, 24 patients (57%) showed an anatomic spleen abnormality, and seven patients (17%) showed both. The remaining 355 participants did not have SA, including 152 with SIT and 203 with SS. Overall, 70 participants (17%) harbored the severe CCDC39 or CCDC40 genotype. Compared with participants without SA, those with SA showed lower median BMI z scores (P = .03), lower FVC z scores (P = .01), and more hospitalizations and IV antibiotic courses for acute respiratory infections during the 5 years before enrollment (P < .01). Participants with CVM requiring surgery or with anatomic spleen abnormalities showed lower median BMI z scores and more hospitalizations and IV therapies for respiratory illnesses compared with participants without SA.
INTERPRETATION: Children with PCD and SA have worse nutritional and pulmonary outcomes, with more hospitalizations for acute respiratory illnesses, than those with SS or SIT combined. Poor nutrition and increased hospitalizations for respiratory infections in participants with SA and PCD are associated with cardiovascular malformations requiring cardiac surgery, splenic anomalies, or both. TRIAL REGISTRY: ClinicalTrials.gov; Nos.: NCT02389049 and NCT00323167; URL: www.clinicaltrials.gov.

16.
Front Med (Lausanne) ; 10: 1272900, 2023.
Article in English | MEDLINE | ID: mdl-37937142

ABSTRACT

Background: Urinary stone disease (USD) historically has affected older men, but studies suggest recent increases in women, leading to a near-identical sex incidence ratio. USD incidence has doubled every 10 years, with disproportionate increases among children and adolescent and young adult (AYA) women. USD stone composition in women is frequently apatite (calcium phosphate), which forms with a higher urine pH, low urinary citrate, and an abundance of urinary uric acid, while men produce more calcium oxalate stones. The reasons for this epidemiological trend are unknown. Methods: This perspective presents the extent of USD with data from a Canadian province and a North American institution, offers explanations for these findings, and proposes potential solutions to decrease this trend. We also describe the economic impact of USD. Findings: There was a significant increase of 46% in overall surgical interventions for USD in Ontario; the incidence rose from 47.0/100,000 population in 2002 to 68.7/100,000 in 2016. In a single United States institution, the overall annual count of unique patients with USD rose from 10,612 in 2015 to 17,706 in 2019, and the proportion of women with USD was much higher than expected: 50.1% of 10-17-year-old patients were girls, and women made up 57.5% of the 18-34 age group and 53.6% of the 35-44 age group. The roles of obesity, diet, hormones, environmental factors, infections, and antibiotics, as well as the economic impact, are discussed. Interpretation: We confirm the significant increase in USD among women. We offer potential explanations for this sex disparity, including microbiological and pathophysiological aspects, and outline innovative solutions that may require steps beyond typical preventive and treatment recommendations.

17.
J Curr Glaucoma Pract ; 17(3): 157-165, 2023.
Article in English | MEDLINE | ID: mdl-37920372

ABSTRACT

Aims and background: Practice guidelines assert that high-risk glaucoma suspects should be treated. Yet there is ambiguity regarding what constitutes a high enough risk for treatment. The purpose of this study was to determine which factors contribute to the decision to treat glaucoma suspects and ocular hypertensive patients in an academic ophthalmology practice. Materials and methods: This was a retrospective cohort study of glaucoma suspects and ocular hypertensive patients at an academic ophthalmology practice from 2014 to 2020. Demographics, comorbidities, intraocular pressure (IOP), optical coherence tomography (OCT) findings, and visual field measurements were compared between treated and untreated patients. A multivariable logistic regression model assessed predictors of glaucoma suspect treatment. Results: Of the 388 patients included, 311 (80%) were untreated and 77 (20%) were treated. There was no statistical difference in age, race/ethnicity, family history of glaucoma, central corneal thickness (CCT), or any visual field parameter between the two groups. Treated glaucoma suspects had higher IOP, thinner retinal nerve fiber layers (RNFL), more RNFL asymmetry, thinner ganglion cell-inner plexiform layers (GCIPL), and a higher prevalence of optic disc drusen, disc hemorrhage, ocular trauma, and proliferative diabetic retinopathy (PDR) (p < 0.05 for all). In the multivariable model, elevated IOP (odds ratio [OR] 1.16, 95% confidence interval [CI] 1.04-1.30; p = 0.008), yellow temporal (OR 5.76, 95% CI 1.80-18.40; p = 0.003) and superior (OR 3.18, 95% CI 1.01-10.0; p = 0.05) RNFL quadrants, and a history of optic disc drusen (OR 8.77, 95% CI 1.96-39.34; p = 0.005) were significant predictors of glaucoma suspect treatment. Conclusion: Higher IOP, RNFL thinning, and optic disc drusen were the strongest factors in the decision to treat a glaucoma suspect or ocular hypertensive patient. RNFL asymmetry, GCIPL thinning, and ocular comorbidities may also factor into treatment decisions.
Clinical significance: Understanding the clinical characteristics that prompt glaucoma suspect treatment helps further define glaucoma suspect disease status and inform when treatment should be initiated. How to cite this article: Ciociola EC, Anderson A, Jiang H, et al. Decision Factors for Glaucoma Suspects and Ocular Hypertensive Treatment at an Academic Center. J Curr Glaucoma Pract 2023;17(3):157-165.
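The odds ratios and confidence intervals reported above come from exponentiating logistic-regression coefficients and their interval bounds. A minimal sketch of that conversion, using an illustrative coefficient and standard error (chosen so the result matches the IOP odds ratio of 1.16 [1.04-1.30] above; these inputs are assumptions, not values from the paper):

```python
import math

def odds_ratio_ci(beta, se, z=1.96):
    """Convert a logistic-regression coefficient and its standard
    error into an odds ratio with a 95% confidence interval."""
    return (math.exp(beta),
            math.exp(beta - z * se),
            math.exp(beta + z * se))

# Illustrative: a coefficient of ~0.148 per mmHg of IOP with
# SE ~0.057 corresponds to OR 1.16 (95% CI 1.04-1.30).
or_, lo, hi = odds_ratio_ci(0.148, 0.057)
print(f"OR {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

Because the coefficient scale is additive per unit (here, per mmHg of IOP), the exponentiated value is a multiplicative change in odds per unit increase.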

18.
Implement Sci ; 18(1): 64, 2023 Nov 23.
Article in English | MEDLINE | ID: mdl-37996884

ABSTRACT

BACKGROUND: Dual randomized controlled trials (DRCT) are type 2 hybrid studies that include two randomized trials: one testing implementation strategies and one testing an intervention. We argue that this study design offers efficiency by providing rigorous investigation of both implementation and intervention in one study and has potential to accelerate generation of the evidence needed to translate interventions that work into real-world practice. Nevertheless, studies using this design are rare in the literature. MAIN TEXT: We construct a paradigm that breaks down the components of the DRCT and provide a step-by-step explanation of features of the design and recommendations for use. A clear distinction is made between the dual strands that test the implementation versus the intervention, and a minimum of three randomized arms is advocated. We suggest an active treatment arm that includes both the implementation strategy and intervention that are hypothesized to be superior. We suggest two comparison/control arms: one to test the implementation strategy and the second to test the intervention. Further, we recommend selection criteria for the two control arms that place emphasis on maximizing the utility of the study design to advance public health practice. CONCLUSIONS: On the surface, the design of a DRCT can appear simple, but actual application is complex. We believe it is that complexity that has limited its use in the literature. We hope that this paper will give both implementation scientists and trialists who are not familiar with implementation science a better understanding of the DRCT design and encouragement to use it.
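The three-arm minimum described above (one active arm combining the implementation strategy and intervention, plus one comparison arm for each strand) can be sketched as a simple block randomization. The arm labels and block scheme here are hypothetical illustrations, not taken from the paper:

```python
import random

# Hypothetical arm labels for a minimal three-arm DRCT.
ARMS = [
    "implementation strategy + intervention",  # active arm
    "implementation comparison",               # tests the strategy
    "intervention comparison",                 # tests the intervention
]

def block_randomize(n_participants, seed=42):
    """Assign participants to the three arms in shuffled blocks of
    three, keeping group sizes balanced throughout enrollment."""
    rng = random.Random(seed)
    assignments = []
    while len(assignments) < n_participants:
        block = ARMS[:]
        rng.shuffle(block)
        assignments.extend(block)
    return assignments[:n_participants]

arms = block_randomize(9)
```

Permuted blocks keep the arms balanced even if enrollment stops early, which matters when each strand of the dual trial must be separately powered.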


Subjects
Research Design, Humans, Randomized Controlled Trials as Topic
19.
Article in English | MEDLINE | ID: mdl-37997295

ABSTRACT

KEY POINTS: We present the largest cohort of structured histopathology reports on primary ciliary dyskinesia-related chronic rhinosinusitis (PCD-CRS). Despite endoscopic differences, PCD-CRS and cystic fibrosis-related chronic rhinosinusitis (CF-CRS) had similar structured histopathology reports. Compared to healthy patients and those with idiopathic chronic rhinosinusitis without nasal polyps, patients with PCD-CRS had an increased neutrophil count.

20.
Foot Ankle Orthop ; 8(3): 24730114231200482, 2023 Jul.
Article in English | MEDLINE | ID: mdl-37786608

ABSTRACT

Background: The modified Lapidus (ML) is a powerful procedure for correction of hallux valgus (HV) with emerging techniques. Studies considering patient-reported outcomes, radiographic measures, complications, and implant costs are currently limited. Methods: Retrospective cohort study with prospectively collected Patient-Reported Outcomes Measurement Information System Physical Function (PROMIS-PF) Computerized Adaptive Test (CAT) scores, radiographic parameters (intermetatarsal angle, IMA; hallux valgus angle, HVA; and tibial sesamoid position, TSP), complications, and total operative time and implant costs, reviewed from 2014 to 2019. Results: Seventy-three feet (68 patients) underwent bunion correction by ML with lag-screw fixation. Median age was 55.8 years (IQR 45.6, 53.9), 4 of 73 (5.5%) were male, 11 of 73 (15.1%) were smokers, and 15 of 73 (20.6%) were diabetic (median HbA1c 6.4% [IQR 6.0, 7.4], none insulin dependent, 5 of 15 with neuropathy). Complications included 6 of 73 (8.2%) wound issues resolved with topical or oral treatment and 9 of 73 (12.3%) with painful or broken hardware requiring hardware removal. Two of 73 (2.7%) had persistent pain despite union. One of 73 (1.4%) was overcorrected and required first MTP arthrodesis. Of 3 nonunions (4.1%), 1 resolved with corrected hypothyroidism, 1 was asymptomatic and required no treatment, and 1 had a hallux valgus recurrence and sought revision surgery elsewhere. Preoperative radiographic angles were HVA 35 degrees and IMA 14 degrees, which improved at final postoperative follow-up to HVA 10 degrees and IMA 6 degrees. Tibial sesamoid position improved from 6.05 ± 1.00 to 2.22 ± 1.38. Thirty-two patients had preoperative and 42 had 1-year postoperative outcomes. PROMIS-PF (51% collection rate) was 43 (IQR 37, 52) preoperatively, 37 (31, 39) at 6 weeks, 46 (42, 51) at 3 months, and 49 (41, 53) at >360 days postoperatively. 
The drop in PROMIS-PF from the preoperative assessment to 6 weeks and the rise from 6 weeks to 3 months were both statistically significant. Preoperative and final postoperative PROMIS-PF scores were not significantly different. Implant cost averaged US$146. Discussion/Conclusion: We report low complication rates and implant costs with good postoperative functional and radiographic outcomes. PROMIS-PF decreased acutely postoperatively but recovered to and maintained high levels by 3 months. Level of Evidence: Level IV, case series.
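The PROMIS-PF results above are reported as medians with interquartile ranges rather than means, which suits skewed score distributions. A minimal stdlib sketch of that summary, using hypothetical scores for illustration (not the study's data):

```python
import statistics

def median_iqr(scores):
    """Return (median, Q1, Q3) using exclusive quartile cut points,
    as in statistics.quantiles' default method."""
    q1, q2, q3 = statistics.quantiles(scores, n=4)
    return q2, q1, q3

# Hypothetical preoperative PROMIS-PF scores, for illustration only.
pre = [37, 40, 43, 45, 52, 38, 44, 51]
m, q1, q3 = median_iqr(pre)
print(f"PROMIS-PF {m:g} (IQR {q1:g}, {q3:g})")
```

Reporting the IQR alongside the median conveys spread without assuming normality, which is why the abstract pairs each score with its (Q1, Q3) range.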
