ABSTRACT
We assessed SARS-CoV-2 seroprevalence in Japan during July-August 2023, with a focus on 2 key age groups, 0-15 and >80 years. We estimated overall seroprevalence of 45.3% for nucleocapsid antibodies and 95.4% for spike antibodies and found notable maternally derived spike antibodies in infants 6-11 months of age (90.0%).
Subject(s)
Antibodies, Viral , COVID-19 Vaccines , COVID-19 , SARS-CoV-2 , Humans , COVID-19/epidemiology , COVID-19/immunology , Seroepidemiologic Studies , Japan/epidemiology , SARS-CoV-2/immunology , Infant , Child , Antibodies, Viral/blood , Antibodies, Viral/immunology , Child, Preschool , Adult , Adolescent , Young Adult , COVID-19 Vaccines/immunology , COVID-19 Vaccines/administration & dosage , Aged, 80 and over , Infant, Newborn , Female , Male , Aged , Middle Aged , Spike Glycoprotein, Coronavirus/immunology
ABSTRACT
BACKGROUND: The aim of this study was to evaluate how the cumulative burden of clinically relevant, self-reported outcomes in childhood cancer survivors (CCSs) compares with that of a sibling control group and to explore how the burden corresponds to levels of care proposed by existing risk stratifications. METHODS: The authors invited 5925 5-year survivors from the Dutch Childhood Cancer Survivor Study (DCCSS LATER) cohort and their 1066 siblings to complete a questionnaire on health outcomes. Health outcomes were validated by self-reported medication use or medical record review. Missing data on clinically relevant outcomes in CCSs for whom no questionnaire data were available were imputed with predictive mean matching. We calculated the mean cumulative count (MCC) for clinically relevant outcomes. Furthermore, we calculated the 30-year MCC for groups of CCSs based on primary cancer diagnosis and treatment, ranked the 30-year MCC, and compared the ranking to levels of care according to existing risk stratifications. RESULTS: At a median of 18.5 years after 5-year survival, 46% of CCSs had at least one clinically relevant outcome. CCSs experienced 2.8 times more health conditions than siblings (30-year MCC = 0.79; 95% confidence interval [CI], 0.74-0.85 vs. 30-year MCC = 0.29; 95% CI, 0.25-0.34). CCSs' burden of clinically relevant outcomes consisted mainly of endocrine and vascular conditions and varied by primary cancer type. The ranking of the 30-year MCC often did not correspond with levels of care in existing risk stratifications. CONCLUSIONS: CCSs experience a high cumulative burden of clinically relevant outcomes that is not completely reflected by current risk stratifications. Choices for survivorship care should extend beyond primary tumor and treatment parameters and should also consider CCSs' current morbidity.
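To make the mean cumulative count (MCC) concrete, the sketch below implements a simplified recurrent-event estimator in which each outcome increment is weighted by the Kaplan-Meier probability of being alive just before the event time, so death is handled as a competing risk. The function, column layout, and toy numbers are illustrative assumptions, not the DCCSS LATER analysis code.

```python
import numpy as np

def mean_cumulative_count(followup, died, event_times):
    """Mean cumulative count of recurrent health outcomes with death as a competing risk.

    followup    : per-subject follow-up time (time to death or censoring)
    died        : per-subject boolean, True if the subject died at `followup`
    event_times : per-subject list of times of clinically relevant outcomes
    Returns (times, mcc) evaluated at every distinct event or death time.
    """
    followup = np.asarray(followup, dtype=float)
    died = np.asarray(died, dtype=bool)
    pooled = [np.asarray(t, dtype=float) for t in event_times if len(t) > 0]
    all_events = np.concatenate(pooled) if pooled else np.array([])
    grid = np.unique(np.concatenate([all_events, followup[died]]))

    surv, mcc = 1.0, 0.0            # Kaplan-Meier survival from death S(t-) and running MCC
    times, values = [], []
    for t in grid:
        at_risk = np.sum(followup >= t)            # subjects still under observation at t
        if at_risk == 0:
            break
        n_events = np.sum(all_events == t)         # recurrent outcomes occurring at t
        mcc += surv * n_events / at_risk           # increment weighted by P(alive just before t)
        n_deaths = np.sum(died & (followup == t))
        surv *= 1.0 - n_deaths / at_risk           # update the survival curve after t
        times.append(t)
        values.append(mcc)
    return np.array(times), np.array(values)

# Toy example: three survivors with 0, 1, and 2 outcomes; one dies at year 12
t, m = mean_cumulative_count(
    followup=[20.0, 12.0, 25.0],
    died=[False, True, False],
    event_times=[[], [5.0], [8.0, 15.0]],
)
print(dict(zip(t, np.round(m, 3))))
```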
Subject(s)
Cancer Survivors , Neoplasms , Child , Humans , Neoplasms/epidemiology , Neoplasms/therapy , Neoplasms/pathology , Self Report , Survivorship , Survivors
ABSTRACT
INTRODUCTION: Systemic osteogenesis has been speculated to be involved in the pathogenesis of ossification of the posterior longitudinal ligament (OPLL). Our purpose was to compare the radiologic prevalence and severity of heterotopic ossification in the foot tendons of Japanese patients with OPLL with those of controls and to determine their association with systemic heterotopic ossification. MATERIALS AND METHODS: Clinical and radiographic data of 114 patients with OPLL were collected from 2020 to 2022. Control data were extracted from a medical database of 362 patients with ankle radiographs. Achilles and plantar tendon ossification were classified as grades 0-4, and the presence of osteophytes at five sites in the foot/ankle joint was assessed by radiography. Factors associated with the presence and severity of each ossification were evaluated by multivariable logistic regression and linear regression analysis. RESULTS: The prevalence of Achilles and plantar tendon ossification (grade ≥ 2) was 4.0-5.5 times higher in patients with OPLL (40-56%) than in the controls (10-11%). The presence of Achilles tendon ossification was associated with OPLL, age, and coexisting plantar tendon ossification, and was most strongly associated with OPLL (standardized regression coefficient, 0.79; 95% confidence interval, 1.34-2.38). The severity of Achilles and plantar tendon ossification was associated with the severity of ossification of the entire spinal ligament. CONCLUSIONS: The strong association of foot tendon ossification with OPLL suggests that patients with OPLL have a systemic osteogenesis background. These findings will provide a basis for exploring new treatment strategies for OPLL, including control of metabolic abnormalities.
Subject(s)
Ossification of Posterior Longitudinal Ligament , Ossification, Heterotopic , Humans , Male , Ossification of Posterior Longitudinal Ligament/diagnostic imaging , Ossification of Posterior Longitudinal Ligament/pathology , Ossification of Posterior Longitudinal Ligament/complications , Female , Ossification, Heterotopic/diagnostic imaging , Ossification, Heterotopic/pathology , Middle Aged , Aged , Achilles Tendon/pathology , Achilles Tendon/diagnostic imaging , Tendons/pathology , Tendons/diagnostic imaging , Foot/pathology , Ankle/diagnostic imaging , Ankle/pathology , Ankle Joint/diagnostic imaging , Ankle Joint/pathology , Adult , Japan/epidemiology , Prevalence
ABSTRACT
BACKGROUND: In Japan, carbapenem-resistant Enterobacterales (CRE) infections were incorporated into the National Epidemiological Surveillance of Infectious Diseases (NESID) in 2014, necessitating mandatory reporting of all CRE infection cases. Subsequently, pathogen surveillance was initiated in 2017, which involved the collection and analysis of CRE isolates from reported cases to assess carbapenemase gene possession. In this surveillance, CRE is defined as (i) minimum inhibitory concentration (MIC) of meropenem ≥2 mg/L (MEPM criteria) or (ii) MIC of imipenem ≥2 mg/L and MIC of cefmetazole ≥64 mg/L (IPM criteria). This study examined whether the current definition of CRE surveillance captures cases with a clinical and public health burden. METHODS: CRE isolates from reported cases were collected from the public health laboratories of local governments, which are responsible for pathogen surveillance. Antimicrobial susceptibility tests were conducted on these isolates to assess compliance with the NESID CRE definition. The NESID data between April 2017 and March 2018 were obtained and analyzed using antimicrobial susceptibility test results. RESULTS: In total, 1681 CRE cases were identified during the study period, and pathogen surveillance data were available for 740 (44.0%) cases. Klebsiella aerogenes and Enterobacter cloacae complex were the dominant species, followed by Klebsiella pneumoniae and Escherichia coli. The rate of carbapenemase gene positivity was 26.5% (196/740), and 93.4% (183/196) of these isolates were of the IMP type. Meanwhile, 315 isolates were subjected to antimicrobial susceptibility testing. Among them, 169 (53.7%) fulfilled only the IPM criteria (IPM criteria-only group) and were susceptible to meropenem, while 146 (46.3%) fulfilled the MEPM criteria (MEPM criteria group). The IPM criteria-only group and MEPM criteria group significantly differed in terms of carbapenemase gene positivity (0% vs. 67.8%), multidrug resistance rates (1.2% vs. 65.8%), and mortality rates (1.8% vs. 6.9%). CONCLUSION: The identification of CRE cases based solely on imipenem resistance has had a limited impact on clinical management. Emphasizing resistance to meropenem is crucial in defining CRE, which pose both clinical and public health burdens. This emphasis will enable the efficient allocation of limited health and public health resources and the preservation of newly developed antimicrobials.
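As a concrete illustration of the two surveillance definitions quoted above (MEPM criteria: meropenem MIC ≥2 mg/L; IPM criteria: imipenem MIC ≥2 mg/L and cefmetazole MIC ≥64 mg/L), the following Python sketch classifies a single isolate from its MIC values. The function name and return labels are hypothetical conveniences, not part of NESID.

```python
def classify_cre(mic_meropenem: float, mic_imipenem: float, mic_cefmetazole: float) -> str:
    """Classify an Enterobacterales isolate against the NESID CRE definition.

    Thresholds (mg/L) follow the definition quoted in the abstract.
    Returns 'MEPM criteria', 'IPM criteria only', or 'not CRE'.
    """
    meets_mepm = mic_meropenem >= 2                          # meropenem MIC >= 2 mg/L
    meets_ipm = mic_imipenem >= 2 and mic_cefmetazole >= 64  # imipenem and cefmetazole criteria
    if meets_mepm:
        return "MEPM criteria"
    if meets_ipm:
        return "IPM criteria only"
    return "not CRE"

# Example: a meropenem-susceptible isolate captured only by the IPM criteria
print(classify_cre(mic_meropenem=0.5, mic_imipenem=4, mic_cefmetazole=128))  # -> IPM criteria only
```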
Subject(s)
Anti-Infective Agents , Imipenem , Humans , Meropenem/pharmacology , Imipenem/pharmacology , Public Health Surveillance , Bacterial Proteins/genetics , beta-Lactamases/genetics , Cefmetazole , Escherichia coli , Microbial Sensitivity Tests , Anti-Bacterial Agents/pharmacology
ABSTRACT
BACKGROUND: Stroke patients often face disabilities that significantly interfere with their daily lives. Poor nutritional status is a common issue amongst these patients, and malnutrition can severely impact their functional recovery post-stroke. Therefore, nutritional therapy is crucial in managing stroke outcomes. However, its effects on disability, activities of daily living (ADL), and other critical outcomes have not been fully explored. OBJECTIVES: To evaluate the effects of nutritional therapy on reducing disability and improving ADL in patients after stroke. SEARCH METHODS: We searched the trial registers of the Cochrane Stroke Group, CENTRAL, MEDLINE (from 1946), Embase (from 1974), CINAHL (from 1982), and AMED (from 1985) to 19 February 2024. We also searched trials and research registries (ClinicalTrials.gov, World Health Organization International Clinical Trials Registry Platform) and reference lists of articles. SELECTION CRITERIA: We included randomised controlled trials (RCTs) that compared nutritional therapy with placebo, usual care, or one type of nutritional therapy in people after stroke. Nutritional therapy was defined as the administration of supplemental nutrients, including energy, protein, amino acids, fatty acids, vitamins, and minerals, through oral, enteral, or parenteral methods. As a comparator, one type of nutritional therapy refers to all forms of nutritional therapies, excluding the specific nutritional therapy defined for use in the intervention group. DATA COLLECTION AND ANALYSIS: We used Cochrane's Screen4Me workflow to assess the initial search results. Two review authors independently screened references that met the inclusion criteria, extracted data, and assessed the risk of bias and the certainty of the evidence using the GRADE approach. We calculated the mean difference (MD) or standardised mean difference (SMD) for continuous data and the odds ratio (OR) for dichotomous data, with 95% confidence intervals (CIs). We assessed heterogeneity using the I² statistic. The primary outcomes were disability and ADL. We also assessed gait, nutritional status, all-cause mortality, quality of life, hand and leg muscle strength, cognitive function, physical performance, stroke recurrence, swallowing function, neurological impairment, and the development of complications (adverse events) as secondary outcomes. MAIN RESULTS: We identified 52 eligible RCTs involving 11,926 participants. Thirty-six studies were conducted in the acute phase, 10 in the subacute phase, three in the acute and subacute phases, and three in the chronic phase. Twenty-three studies included patients with ischaemic stroke, three included patients with haemorrhagic stroke, three included patients with subarachnoid haemorrhage (SAH), and 23 included patients with ischaemic or haemorrhagic stroke including SAH. There were 25 types of nutritional supplements used as an intervention. The numbers of studies that assessed disability and ADL as outcomes were nine and 17, respectively. For the intervention using oral energy and protein supplements, which was a primary intervention in this review, six studies were included.
The results for the seven outcomes of interest (disability, ADL, body weight change, all-cause mortality, gait speed, quality of life, and incidence of complications (adverse events)) were as follows: There was no evidence of a difference in reducing disability when 'good status' was defined as a modified Rankin Scale (mRS) score of 0 to 2 (for 'good status': OR 0.97, 95% CI 0.86 to 1.10; 1 RCT, 4023 participants; low-certainty evidence). Oral energy and protein supplements may improve ADL as indicated by an increase in the Functional Independence Measure (FIM) motor score, but the evidence is very uncertain (MD 8.74, 95% CI 5.93 to 11.54; 2 RCTs, 165 participants; very low-certainty evidence). Oral energy and protein supplements may increase body weight, but the evidence is very uncertain (MD 0.90, 95% CI 0.23 to 1.58; 3 RCTs, 205 participants; very low-certainty evidence). There was no evidence of a difference in reducing all-cause mortality (OR 0.57, 95% CI 0.14 to 2.28; 2 RCTs, 4065 participants; low-certainty evidence). For gait speed and quality of life, no study was identified. With regard to incidence of complications (adverse events), there was no evidence of a difference in the incidence of infections, including pneumonia, urinary tract infections, and septicaemia (OR 0.68, 95% CI 0.20 to 2.30; 1 RCT, 42 participants; very low-certainty evidence). The intervention was associated with an increased incidence of diarrhoea compared to usual care (OR 4.29, 95% CI 1.98 to 9.28; 1 RCT, 4023 participants; low-certainty evidence) and the occurrence of hyperglycaemia or hypoglycaemia (OR 15.6, 95% CI 4.84 to 50.23; 1 RCT, 4023 participants; low-certainty evidence). AUTHORS' CONCLUSIONS: We are uncertain about the effect of nutritional therapy, including oral energy and protein supplements and other supplements identified in this review, on reducing disability and improving ADL in people after stroke. Various nutritional interventions were assessed for the outcomes in the included studies, and almost all studies had small sample sizes. This led to challenges in conducting meta-analyses and reduced the precision of the evidence. Moreover, most of the studies had issues with the risk of bias, especially in terms of the absence of blinding and unclear information. Regarding adverse events, the intervention with oral energy and protein supplements was associated with a higher number of adverse events, such as diarrhoea, hyperglycaemia, and hypoglycaemia, compared to usual care. However, the quality of the evidence was low. Given the low certainty of most of the evidence in our review, further research is needed. Future research should focus on targeted nutritional interventions to reduce disability and improve ADL based on a theoretical rationale in people after stroke, and there is a need for improved methodology and reporting.
Subject(s)
Activities of Daily Living , Randomized Controlled Trials as Topic , Stroke Rehabilitation , Stroke , Humans , Stroke Rehabilitation/methods , Stroke/complications , Malnutrition/diet therapy , Malnutrition/prevention & control , Nutrition Therapy/methods , Quality of Life , Nutritional Status , Bias
ABSTRACT
BACKGROUND: Cardiac dysfunction due to cardiotoxicity from anthracycline chemotherapy is a leading cause of morbidity and mortality in childhood cancer survivors (CCS), and the cumulative incidence of cardiac events has continued to increase. This study aimed to identify an adequate indicator of cardiac dysfunction during long-term follow-up. PROCEDURE: In total, 116 patients (median age: 15.5 [range: 4.7-40.2] years) with childhood cancer who were treated with anthracycline were divided into three age groups for analysis (C1: 4-12 years of age, C2: 13-18 years of age, C3: 19-40 years of age), and 116 control patients of similar ages were divided into three corresponding groups (N1, N2, and N3). Layer-specific strains were assessed for longitudinal strain (LS) and circumferential strain (CS). The total and segmental intraventricular pressure gradients (IVPG) were also calculated based on Doppler imaging of the mitral inflow using Euler's equation. RESULTS: Conventional echocardiographic parameters were not significantly different between the patients and controls. In all age groups, all layers of the LS and the inner and middle layers of the basal and papillary CS were decreased compared with those of the corresponding control groups, and all IVPGs were decreased in C2 and C3. Interestingly, basal CS and basal IVPG in CCS showed a moderate correlation, and both tended to decrease rapidly with aging. Furthermore, basal IVPG and anthracycline dose showed significant correlations. CONCLUSIONS: Basal CS and total and basal IVPGs may be particularly useful indicators of cardiotoxicity in long-term follow-up.
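For readers unfamiliar with the Doppler-derived IVPG mentioned above: it is conventionally obtained by integrating the one-dimensional Euler equation along the mitral inflow scanline, using the blood density ρ and the measured velocity field v(s, t). The relation below is the standard textbook form, not an equation reproduced from this paper, and the sign convention for the base-to-apex pressure difference varies between reports.

```latex
% 1-D Euler equation along the inflow direction s (rho = blood density, v = Doppler velocity)
\frac{\partial p}{\partial s} = -\rho \left( \frac{\partial v}{\partial t} + v\,\frac{\partial v}{\partial s} \right)
% The intraventricular pressure difference follows by integrating along the scanline:
\Delta p(t) = \int_{\text{base}}^{\text{apex}} \frac{\partial p}{\partial s}\, ds
```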
Subject(s)
Cancer Survivors , Heart Diseases , Neoplasms , Humans , Child , Adolescent , Young Adult , Adult , Child, Preschool , Cardiotoxicity/drug therapy , Anthracyclines/adverse effects , Ventricular Pressure , Follow-Up Studies , Neoplasms/drug therapy , Neoplasms/complications , Heart Diseases/diagnosis , Heart Diseases/diagnostic imaging , Antibiotics, Antineoplastic/adverse effects
ABSTRACT
AIM: The head circumference to chest circumference (HC/CC) ratio has been used to identify low birth weight infants in developed countries. This study was conducted to examine whether the ratio could identify asymmetrical foetal growth restriction (FGR). METHODS: This retrospective observational study was conducted with 1955 infants (50.5% male) born at term between 2016 and 2020 at Tokyo Metropolitan Toshima Hospital, Japan. RESULTS: We found that 120 (6.1%) had FGR. Their mean birth weight was 3052.1 ± 367.3 g, and their mean gestational age was 39.1 ± 1.1 weeks. Logistic regression analysis showed that the association between the HC/CC ratio and FGR had a regression coefficient of -20.6 (p < 0.000). The linear regression analysis showed that the association between the HC/CC ratio and the birth weight z-score had a regression coefficient of -8.59 (p < 0.000). The coefficient of correlation was -0.33 (p < 0.001). The receiver operating characteristic curve for detecting FGR showed that the area under the curve was 0.75 and the cut-off value was 0.93, with sensitivity of 75.8% and specificity of 60.8%. CONCLUSION: Our study established associations of the HC/CC ratio with both FGR and birth weight z-scores and confirmed that the ratio provides an easy way to detect FGR in term-born infants.
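The screening logic described above (a logistic regression of FGR status on the HC/CC ratio, followed by ROC analysis and a Youden-index cut-off) can be sketched in a few lines of Python. The synthetic data below, including the direction and strength of the simulated association, are arbitrary illustrative assumptions, not the study's data or results.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_curve, roc_auc_score

# Hypothetical cohort: HC/CC ratio and an FGR label per term-born infant
rng = np.random.default_rng(0)
n = 500
hc_cc = rng.normal(0.96, 0.03, n)
# The sign and magnitude of this simulated association are illustrative only
p_fgr = 1 / (1 + np.exp(-(-2.5 + 15 * (hc_cc - 0.96))))
fgr = rng.binomial(1, p_fgr)
df = pd.DataFrame({"hc_cc": hc_cc, "fgr": fgr})

# Logistic regression of FGR status on the HC/CC ratio
model = LogisticRegression().fit(df[["hc_cc"]], df["fgr"])
print("regression coefficient:", model.coef_[0][0])

# ROC analysis using the predicted probability as the risk score
score = model.predict_proba(df[["hc_cc"]])[:, 1]
fpr, tpr, _ = roc_curve(df["fgr"], score)
print("AUC:", round(roc_auc_score(df["fgr"], score), 3))
best = np.argmax(tpr - fpr)  # Youden index J = sensitivity + specificity - 1
print("sensitivity:", round(tpr[best], 3), "specificity:", round(1 - fpr[best], 3))
```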
Subject(s)
Fetal Growth Retardation , Infant, Low Birth Weight , Infant, Newborn , Pregnancy , Infant , Female , Humans , Male , Fetal Growth Retardation/diagnosis , Birth Weight , Parturition , Gestational Age
ABSTRACT
A nationwide survey of SARS-CoV-2 antinucleocapsid seroprevalence among blood donors in Japan revealed that, as of November 2022, infection-induced seroprevalence of the population was 28.6% (95% CI 27.6%-29.6%). Seroprevalence studies might complement routine surveillance and ongoing monitoring efforts to provide a more complete real-time picture of COVID-19 burden.
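A binomial confidence interval like the 28.6% (95% CI 27.6%-29.6%) reported above is straightforward to reproduce once the numerator and denominator are known; the counts below are invented solely to show the calculation and are not the survey's actual sample sizes.

```python
from statsmodels.stats.proportion import proportion_confint

# Hypothetical counts: 2231 anti-nucleocapsid-positive donors out of 7800 sampled
positives, total = 2231, 7800
prevalence = positives / total
low, high = proportion_confint(positives, total, alpha=0.05, method="wilson")
print(f"seroprevalence {prevalence:.1%} (95% CI {low:.1%}-{high:.1%})")
```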
Subject(s)
COVID-19 , SARS-CoV-2 , Humans , Blood Donors , COVID-19/epidemiology , Japan/epidemiology , Seroepidemiologic Studies , Antibodies, Viral
ABSTRACT
OBJECTIVES: The COVID-19 pandemic has had variable effects on the rates of STIs reported across the globe. This study sought to assess how the number of STI reports changed during the pandemic in Japan. METHODS: We used national infectious disease surveillance data from the National Institute of Infectious Diseases (Tokyo, Japan) for the period between January 2013 and December 2021. We compared reported rates of chlamydia, gonorrhoea, condyloma acuminata and genital herpes, as well as total notifications for HIV/AIDS and syphilis during the pandemic versus previous years in Japan. We used a quasi-Poisson regression to determine whether any given week or month between January 2018 and December 2021 had a significant excess or deficit of STIs. Notification values above or below the 95% upper and lower prediction thresholds were considered as statistically significant. The start of the pandemic was defined as January 2020. RESULTS: Chlamydia generally remained within predicted range during the pandemic period. Reporting of gonorrhoea was significantly higher than expected throughout early-to-mid 2021 but otherwise generally remained within predicted range prior to 2021. Condyloma, herpes and HIV/AIDS reporting were transiently significantly lower than expected throughout the pandemic period, but no significant periods of higher-than-expected reporting were detected. Syphilis showed widespread evidence of significantly lower-than-predicted reporting throughout 2020 but eventually reversed, showing significantly higher-than-predicted reporting in mid-to-late 2021. CONCLUSIONS: The COVID-19 pandemic was associated with variable changes in the reporting of STIs in Japan. Higher-than-predicted reporting was more likely to be observed in the later phases of the pandemic. These changes may have been attributable to pandemic-related changes in sexual behaviour and decreased STI clinic attendance and testing, but further research on the long-term impact of the pandemic on STIs is necessary.
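A quasi-Poisson excess/deficit screen of the kind described above can be sketched as follows: fit notification counts from a baseline window, estimate overdispersion, and flag later weeks whose observed counts fall outside an approximate 95% prediction band. The column names, seasonal terms, baseline window, and normal-approximation band are assumptions for illustration, not the authors' exact model.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

def excess_deficit_flags(df: pd.DataFrame) -> pd.DataFrame:
    """df needs columns 'year', 'week' (1..52), 'cases'. Fits a quasi-Poisson baseline on
    years up to 2017 (hypothetical window) and flags observations outside a ~95% band."""
    df = df.copy()
    df["sin1"] = np.sin(2 * np.pi * df["week"] / 52.0)   # simple annual seasonality
    df["cos1"] = np.cos(2 * np.pi * df["week"] / 52.0)
    X = sm.add_constant(df[["year", "sin1", "cos1"]])
    train = df["year"] <= 2017
    # Quasi-Poisson: Poisson mean structure with Pearson-estimated dispersion
    fit = sm.GLM(df.loc[train, "cases"], X[train], family=sm.families.Poisson()).fit(scale="X2")
    mu = fit.predict(X)
    half_width = 1.96 * np.sqrt(fit.scale * mu)          # normal approximation to the band
    df["expected"] = mu
    df["upper"], df["lower"] = mu + half_width, np.clip(mu - half_width, 0, None)
    df["flag"] = np.where(df["cases"] > df["upper"], "excess",
                          np.where(df["cases"] < df["lower"], "deficit", "within range"))
    return df

# Tiny synthetic usage: weekly counts for 2013-2021 with Poisson noise
weeks = pd.DataFrame([(y, w) for y in range(2013, 2022) for w in range(1, 53)],
                     columns=["year", "week"])
rng = np.random.default_rng(0)
weeks["cases"] = rng.poisson(50 + 10 * np.sin(2 * np.pi * weeks["week"] / 52.0))
print(excess_deficit_flags(weeks)["flag"].value_counts())
```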
Subject(s)
Acquired Immunodeficiency Syndrome , COVID-19 , Chlamydia Infections , Chlamydia , Condylomata Acuminata , Gonorrhea , HIV Infections , Sexually Transmitted Diseases , Syphilis , Humans , Syphilis/epidemiology , Gonorrhea/epidemiology , Pandemics , Acquired Immunodeficiency Syndrome/epidemiology , Japan/epidemiology , HIV Infections/epidemiology , COVID-19/epidemiology , Sexually Transmitted Diseases/epidemiology , Condylomata Acuminata/epidemiology , Chlamydia Infections/epidemiology
ABSTRACT
Accurately estimating the timing of pathogen exposure plays a crucial role in outbreak control for emerging infectious diseases, including source identification, contact tracing, and vaccine research and development. However, since surveillance activities often collect data retrospectively after symptoms have appeared, obtaining accurate data on the timing of disease onset is difficult in practice and can involve "coarse" observations, such as interval or censored data. To address this challenge, we propose a novel likelihood function, tailored to coarsely observed data in rapid outbreak surveillance, along with an optimization method based on an ε-accelerated EM algorithm for faster convergence to find maximum likelihood estimates (MLEs). The covariance matrix of MLEs is also discussed using a nonparametric bootstrap approach. The performance of our proposed method is evaluated, in terms of bias and mean-squared error, through extensive numerical experiments as well as through its application to epidemiological surveillance data on cases of mass food poisoning. The experiments show that our method exhibits less bias than conventional methods, providing greater efficiency across all scenarios.
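To make the combination of an EM step with an acceleration scheme concrete, the sketch below fits an exponential onset-time distribution to interval-censored ("coarse") observations and applies an Aitken Δ²-type restart (the scalar analogue of the ε-algorithm) to the EM iterate sequence. This is a deliberately simplified illustration under an assumed exponential model, not the likelihood function or algorithm proposed in the paper.

```python
import numpy as np

def em_step(lam, intervals):
    """One EM update for Exp(lam) when each onset time is only known to lie in [a_i, b_i]."""
    a, b = intervals[:, 0], intervals[:, 1]
    ea, eb = np.exp(-lam * a), np.exp(-lam * b)
    # E-step: conditional mean of the latent onset time given a_i <= X_i <= b_i
    cond_mean = 1.0 / lam + (a * ea - b * eb) / (ea - eb)
    # M-step for the exponential rate
    return len(intervals) / cond_mean.sum()

def accelerated_em(intervals, lam0=1.0, tol=1e-10, max_iter=500):
    """EM with an Aitken delta-squared restart applied to the iterate sequence."""
    intervals = np.asarray(intervals, dtype=float)
    lam = lam0
    for _ in range(max_iter):
        l1 = em_step(lam, intervals)
        l2 = em_step(l1, intervals)
        denom = l2 - 2.0 * l1 + lam
        lam_acc = lam - (l1 - lam) ** 2 / denom if abs(denom) > 1e-14 else l2
        if lam_acc <= 0:                 # guard against an unstable accelerated step
            lam_acc = l2
        if abs(lam_acc - lam) < tol:
            return lam_acc
        lam = lam_acc
    return lam

# Coarse observations: each onset time is only known up to the reporting day
rng = np.random.default_rng(1)
onsets = rng.exponential(1.0 / 0.25, size=300)                     # true rate 0.25 per day
obs = np.column_stack([np.floor(onsets), np.floor(onsets) + 1.0])  # daily reporting intervals
print("estimated onset rate:", accelerated_em(obs))
```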
ABSTRACT
STUDY OBJECTIVE: To review closed reduction methods for anterior shoulder dislocation and perform the first comprehensive comparison of the individual methods in terms of success rate, pain, and reduction time. METHODS: We searched MEDLINE, PubMed, EMBASE, Cochrane, and ClinicalTrials.gov for randomized controlled trials registered until December 31, 2020. We performed a pairwise and network meta-analysis using a Bayesian random-effects model. Two authors independently performed screening and risk-of-bias assessment. RESULTS: We found 14 studies with 1,189 patients. In a pairwise meta-analysis, no significant difference was found in the only comparable pair, namely, the Kocher method versus the Hippocratic method (success rate: odds ratio, 1.21; 95% confidence interval [CI], 0.53 to 2.75; pain during reduction [visual analog scale]: standardized mean difference, -0.33; 95% CI, -0.69 to 0.02; reduction time [minutes]: mean difference, 0.19; 95% CI, -1.77 to 2.15). In network meta-analysis, FARES (Fast, Reliable, and Safe) was the only method significantly less painful than the Kocher method (mean difference, -4.0; 95% credible interval, -7.6 to -0.40). In the surface under the cumulative ranking curve (SUCRA) plot of success rate, FARES and the Boss-Holzach-Matter/Davos method showed high values. For pain during reduction, FARES had the highest SUCRA value in the overall analysis. In the SUCRA plot of reduction time, modified external rotation and FARES had high values. The only complication was 1 case of fracture with the Kocher method. CONCLUSION: Overall, Boss-Holzach-Matter/Davos and FARES demonstrated the most favorable values for success rates, whereas both FARES and modified external rotation were more favorable in reduction times. FARES had the most favorable SUCRA for pain during reduction. Future work directly comparing techniques is needed to better understand the difference in reduction success and complications.
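Since the ranking above relies on SUCRA values, a brief note on how they are derived may help: given the posterior probability that each treatment attains each rank, SUCRA is the normalized area under the cumulative ranking curve. The snippet below computes it from a hypothetical rank-probability matrix; the numbers are invented and do not reproduce this meta-analysis.

```python
import numpy as np

def sucra(rank_probs: np.ndarray) -> np.ndarray:
    """SUCRA per treatment from a (treatments x ranks) posterior rank-probability matrix.

    SUCRA_j = sum_{k=1..K-1} cum_prob_j(k) / (K - 1), where cum_prob_j(k) is the posterior
    probability that treatment j is among the best k treatments.
    """
    rank_probs = np.asarray(rank_probs, dtype=float)
    n_ranks = rank_probs.shape[1]
    cumulative = np.cumsum(rank_probs, axis=1)[:, :-1]   # drop the last column (always 1)
    return cumulative.sum(axis=1) / (n_ranks - 1)

# Hypothetical rank probabilities for three reduction methods (rows) over ranks 1..3 (columns)
probs = np.array([
    [0.70, 0.20, 0.10],   # a method that is mostly ranked best
    [0.20, 0.60, 0.20],
    [0.10, 0.20, 0.70],
])
print(np.round(sucra(probs), 2))   # -> [0.8 0.5 0.2]
```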
Subject(s)
Fractures, Bone , Shoulder Dislocation , Humans , Shoulder Dislocation/therapy , Network Meta-Analysis , Bayes Theorem , Pain , Fractures, Bone/complications
ABSTRACT
From 1 January 2022 to 4 September 2022, a total of 53 996 mpox cases were confirmed globally. Cases were predominantly concentrated in Europe and the Americas, while other regions also continued to observe imported cases. This study aimed to estimate the potential global risk of mpox importation and consider hypothetical scenarios of travel restrictions by varying passenger volumes (PVs) via the airline travel network. PV data for the airline network and the time of the first confirmed mpox case for a total of 1680 airports in 176 countries (and territories) were extracted from publicly available data sources. A survival analysis technique in which the hazard function was a function of effective distance was utilised to estimate the importation risk. The arrival time ranged from 9 to 48 days after the first case was identified in the UK on 6 May 2022. The estimated risk of importation showed that, regardless of geographic region, most locations would have an intensified importation risk by 31 December 2022. Travel restriction scenarios had a minor impact on the global airline importation risk against mpox, highlighting the importance of enhancing local capacities for the identification of mpox and of being prepared to carry out contact tracing and isolation.
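The effective-distance concept used above (Brockmann and Helbing's log-transformed flux fractions along the mobility network) can be illustrated with a short networkx sketch; the passenger volumes and airport codes below are made up to show the mechanics and are not the study's data.

```python
import math
import networkx as nx

# Hypothetical monthly passenger volumes between four airports (directed edges)
passenger_volume = {
    ("LHR", "JFK"): 900_000, ("LHR", "NRT"): 300_000,
    ("JFK", "NRT"): 250_000, ("JFK", "GRU"): 150_000,
    ("NRT", "GRU"): 50_000,
}

G = nx.DiGraph()
for (origin, dest), volume in passenger_volume.items():
    G.add_edge(origin, dest, volume=volume)

# Effective distance of each edge: d_mn = 1 - log(P_mn), where P_mn is the fraction of the
# origin's outgoing traffic that flows to destination n.
for origin in G.nodes:
    total = sum(G[origin][v]["volume"] for v in G.successors(origin))
    for dest in G.successors(origin):
        p = G[origin][dest]["volume"] / total
        G[origin][dest]["eff_dist"] = 1.0 - math.log(p)

# Shortest-path effective distance from an assumed outbreak source
dist = nx.single_source_dijkstra_path_length(G, "LHR", weight="eff_dist")
print(dist)   # smaller values = higher expected importation risk / earlier arrival
```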
Subject(s)
Mpox (monkeypox) , Humans , Travel , Airports , Contact Tracing , Europe/epidemiology
ABSTRACT
INTRODUCTION: During the COVID-19 pandemic, the incidence of many droplet-transmitted infections decreased due to increased mask-wearing and social distancing. In contrast, there has been concern that COVID-19 countermeasures, such as lockdowns, may increase legionellosis incidence via water stagnation. During the pandemic in Japan, four state of emergency declarations were imposed between 2020 and 2021, making it particularly suitable for testing this hypothesis. METHODS: We use country-level surveillance data from the National Institute of Infectious Diseases to track the relative incidence of legionellosis compared to invasive pneumococcal disease (IPD) during the COVID-19 pandemic in Japan, with a focus on the periods just after state of emergency declarations were lifted. RESULTS: The absolute numbers of legionellosis and IPD cases decreased in 2020 and 2021 compared to previous years. The average relative incidence of legionellosis, as well as its variance, significantly increased during the pandemic compared to previous years. There were no increases in the relative incidence of legionellosis during the periods immediately following emergency declaration liftings, but the relative incidence did increase considerably during the first two states of emergency. CONCLUSIONS: COVID-19 countermeasures appear more effective at decreasing the incidence of human-to-human transmitted infections, such as IPD, compared to environmentally transmitted infections, such as legionellosis. Though no evidence was found to suggest that legionellosis cases increased after state of emergency declarations, public health efforts should continue to emphasize the importance of routine sanitation and water system maintenance to prevent water stagnation and Legionella spp. contamination.
Subject(s)
COVID-19 , Legionellosis , Pneumococcal Infections , Humans , COVID-19/epidemiology , Pandemics , Incidence , Japan/epidemiology , Communicable Disease Control , Legionellosis/epidemiology , Pneumococcal Infections/epidemiology , Water
ABSTRACT
BACKGROUND: Evidence has demonstrated that excess sodium intake is associated with the development of several non-communicable diseases. The main source of sodium is salt. Therefore, reducing salt intake in foods is an important global public health effort to achieve sodium reduction and improve health. This study aimed to model salt intake reduction with 'umami' substances among Japanese adults. The umami substances considered in this study include glutamate or monosodium glutamate (MSG), calcium diglutamate (CDG), inosinate, and guanylate. METHODS: A total of 21,805 participants from the National Health and Nutrition Survey, aged 57.8 years on average, were included in the analysis. First, we employed a multivariable linear regression approach with overall salt intake (g/day) as a dependent variable, adjusting for food items and other covariates to estimate the contribution of salt intake from each food item that was selected through an extensive literature review. Assuming the participants already consume low-sodium products, we considered three scenarios in which salt intake could be reduced with the additional umami substances by up to 30%, 60% and 100%. We estimated the total amount of population-level salt reduction for each scenario by age and gender. Under the 100% scenario, Japan's achievement rates against the national and global salt intake reduction goals were also calculated. RESULTS: Without compromising the taste, the 100% or universal incorporation of umami substances into food items reduced the salt intake of Japanese adults by 12.8-22.3% on average at the population level, which is equivalent to 1.27-2.22 g of salt reduction. The universal incorporation of umami substances into food items changed the daily mean salt intake of the total population from 9.95 g to 7.73 g (from 10.83 g to 8.40 g for men and from 9.21 g to 7.17 g for women). This study suggested that approximately 60% of Japanese adults could achieve the national dietary goal of 8 g/day, while only 7.6% would meet the global recommendation of 5.0 g/day. CONCLUSIONS: Our study provides essential information on the potential salt reduction with umami substances. The universal incorporation of umami substances into food items would enable the Japanese to achieve the national dietary goal. However, the reduced salt intake level still falls short of the global dietary recommendation.
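The modeling step described above (a multivariable linear regression of total salt intake on food-item intakes, whose estimated contributions are then scaled down under each substitution scenario) can be sketched as follows. The food items, coefficients, column names, and synthetic data are hypothetical and do not come from the National Health and Nutrition Survey analysis.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical survey extract: daily intake (g) of salt-contributing foods plus total salt
rng = np.random.default_rng(0)
n = 1000
df = pd.DataFrame({
    "soy_sauce": rng.gamma(2, 4, n),
    "miso_soup": rng.gamma(2, 5, n),
    "pickles": rng.gamma(1.5, 3, n),
    "age": rng.integers(20, 85, n),
})
df["salt_g"] = (2.0 + 0.06 * df["soy_sauce"] + 0.05 * df["miso_soup"]
                + 0.04 * df["pickles"] + rng.normal(0, 0.8, n))

# Step 1: estimate each food item's contribution to total salt intake, adjusting for age
X = sm.add_constant(df[["soy_sauce", "miso_soup", "pickles", "age"]])
fit = sm.OLS(df["salt_g"], X).fit()

# Step 2: scenario - umami substitution removes a fraction of the salt attributable to the
# selected items (the 30%/60%/100% scenarios only change `reduction`)
reduction = 1.0
substitutable = ["soy_sauce", "miso_soup", "pickles"]
saved = sum(fit.params[item] * df[item] for item in substitutable) * reduction
print("mean intake before:", df["salt_g"].mean().round(2),
      "g/day; after:", (df["salt_g"] - saved).mean().round(2), "g/day")
```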
Subject(s)
East Asian People , Sodium Chloride, Dietary , Adult , Male , Humans , Female , Cross-Sectional Studies , Food , Sodium , Taste
ABSTRACT
We study selection bias in meta-analyses by assuming the presence of researchers (meta-analysts) who intentionally or unintentionally cherry-pick a subset of studies by defining arbitrary inclusion and/or exclusion criteria that will lead to their desired results. When the number of studies is sufficiently large, we theoretically show that a meta-analyst might falsely obtain (non)significant overall treatment effects, regardless of the actual effectiveness of a treatment. We examine all theoretical findings through extensive simulation experiments and practical clinical examples. Numerical evaluations demonstrate that the standard method for meta-analyses is vulnerable to such cherry-picking.
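The mechanism studied here, a meta-analyst who screens many candidate studies and keeps only those pointing in the desired direction, is easy to reproduce numerically. The simulation below is a minimal, generic sketch (standard inverse-variance fixed-effect pooling, a true effect of zero); it is not the authors' simulation design.

```python
import numpy as np

rng = np.random.default_rng(42)

def pooled_estimate(effects, ses):
    """Inverse-variance (fixed-effect) pooled estimate and its z statistic."""
    w = 1.0 / np.square(ses)
    est = np.sum(w * effects) / np.sum(w)
    se = np.sqrt(1.0 / np.sum(w))
    return est, est / se

n_studies, n_repeats = 60, 2000
false_positive = 0
for _ in range(n_repeats):
    ses = rng.uniform(0.1, 0.4, n_studies)
    effects = rng.normal(0.0, ses)            # every study estimates a true null effect
    # Cherry-picking: keep only the studies whose point estimate favours the treatment
    picked = effects > 0
    _, z = pooled_estimate(effects[picked], ses[picked])
    false_positive += abs(z) > 1.96
print("share of 'significant' pooled effects under a true null:", false_positive / n_repeats)
```

With dozens of candidate studies, the cherry-picked pooled effect is declared significant in essentially every repetition, illustrating how an arbitrary inclusion rule can manufacture a (non)finding.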
ABSTRACT
INTRODUCTION: Previous studies on patients with symptoms of spinal ligament ossification, including ossification of the posterior longitudinal ligament (OPLL) and ligamentum flavum (OLF), have not clarified whether obesity is a cause or consequence of these diseases and were limited by selection bias. Thus, we investigated the association between obesity and the prevalence of spinal ligament ossification in randomly selected asymptomatic subjects. MATERIALS AND METHODS: Between April 2020 and March 2021, 622 asymptomatic Japanese subjects who underwent computed tomography from the neck to the pelvis for medical check-up purposes were included. All subjects were divided into the following three groups: normal weight (body mass index [BMI] < 25 kg/m²), obese I (25 ≤ BMI < 30 kg/m²), and obese II (BMI ≥ 30 kg/m²). Factors associated with the presence of each type of spinal ligament ossification were evaluated using multivariate logistic regression analysis. RESULTS: The proportion of subjects with thoracic OPLL was significantly higher in the obese II group than in the other two groups (vs. normal weight, P < 0.001; vs. obese I, P < 0.001). BMI was associated with the prevalence of OLF, cervical OPLL, thoracic OPLL, and ossification of the anterior longitudinal ligament (OALL). BMI was most significantly associated with the prevalence of thoracic OPLL (β, 0.28; 95% confidence interval, 0.17-0.39). CONCLUSION: BMI was associated with the prevalence of OALL, cervical OPLL, thoracic OPLL, and OLF in asymptomatic subjects, suggesting that obesity is associated with the development of heterotopic ossification of the spinal ligaments.
Subject(s)
Ligamentum Flavum , Ossification of Posterior Longitudinal Ligament , Ossification, Heterotopic , Cross-Sectional Studies , Humans , Ligamentum Flavum/diagnostic imaging , Obesity/complications , Obesity/epidemiology , Ossification of Posterior Longitudinal Ligament/complications , Ossification of Posterior Longitudinal Ligament/diagnostic imaging , Ossification of Posterior Longitudinal Ligament/epidemiology , Ossification, Heterotopic/epidemiology , Osteogenesis
ABSTRACT
INTRODUCTION: A 28.2 µg twice-weekly formulation of teriparatide (2/W-TPD) was developed to provide efficacy for osteoporosis comparable to that of a 56.5 µg once-weekly formulation while improving the safety and persistence rate. In the current study, we aimed to elucidate the real-world persistence of 2/W-TPD and to identify the factors associated with the discontinuation of 2/W-TPD in patients with severe osteoporosis. MATERIALS AND METHODS: This retrospective study included 90 patients who were treated with 2/W-TPD at three hospitals in Japan. Patient information was collected, including age, sex, distance to the hospital, family structure, comorbidities, previous treatment for osteoporosis, timing of the injection, side effects, duration of 2/W-TPD treatment, Barthel index (BI), and bone mineral density (BMD) of the lumbar spine and femoral neck. We examined the factors influencing 2/W-TPD discontinuation using the Cox proportional hazards model. RESULTS: The 12-month completion rate of 2/W-TPD therapy was 47.5%. The Cox hazard analysis identified side effects [Hazard Ratio (HR) = 14.59, P < 0.001], low BMD of the femoral neck (HR = 0.04, P = 0.002), and morning injection (HR = 3.29, P = 0.006) as factors influencing the discontinuation of 2/W-TPD. Other variables, including age, did not contribute to the continuation of 2/W-TPD. CONCLUSION: The 1-year continuation rate of 2/W-TPD was higher than the previously reported value for the once-weekly formulation in a real-world setting, probably due to the lower incidence of side effects. Introducing 2/W-TPD may further improve the persistence of TPD therapy for osteoporosis.
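The discontinuation analysis described above (a Cox proportional hazards model for time to stopping 2/W-TPD) can be sketched with the lifelines package; the simulated cohort, covariate coding, and column names below are hypothetical placeholders, not the study data.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical cohort of 90 patients followed for up to 12 months
rng = np.random.default_rng(0)
n = 90
side_effects = rng.binomial(1, 0.3, n)
morning_injection = rng.binomial(1, 0.5, n)
femoral_neck_bmd = rng.normal(0.6, 0.08, n)
# Simulated hazard of discontinuation rises with side effects and morning injection
hazard = 0.04 * np.exp(1.5 * side_effects + 0.7 * morning_injection - 2.0 * (femoral_neck_bmd - 0.6))
time_to_stop = rng.exponential(1.0 / hazard)

df = pd.DataFrame({
    "months_on_therapy": np.minimum(time_to_stop, 12.0),   # administrative censoring at 12 months
    "discontinued": (time_to_stop <= 12.0).astype(int),
    "side_effects": side_effects,
    "morning_injection": morning_injection,
    "femoral_neck_bmd": femoral_neck_bmd,
})

cph = CoxPHFitter()
cph.fit(df, duration_col="months_on_therapy", event_col="discontinued")
cph.print_summary()   # exp(coef) = hazard ratio for discontinuation
```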
Subject(s)
Bone Density Conservation Agents , Osteoporosis , Bone Density , Bone Density Conservation Agents/adverse effects , Humans , Lumbar Vertebrae , Osteoporosis/complications , Retrospective Studies , Teriparatide/adverse effects
ABSTRACT
BACKGROUND: Interrupted time series (ITS) analysis has become a popular design to evaluate the effects of health interventions. However, the most common formulation for ITS, the linear segmented regression, is not always adequate, especially when the timing of the intervention is unclear. In this study, we propose a new model to overcome this limitation. METHODS: We propose a new ITS model, ARIMAITS-DL, that combines (1) the Autoregressive Integrated Moving Average (ARIMA) model and (2) distributed lag functional terms. The ARIMA technique allows us to model autocorrelation, which is frequently observed in time series data, and the decaying cumulative effect of the intervention. By contrast, the distributed lag functional terms represent the idea that the intervention effect does not start at a fixed time point but is distributed over a certain interval (thus, the intervention timing seems unclear). We discuss how to select the distribution of the effect, construct the model, diagnose the model fit, and interpret the results. Further, our model is applied, as an example, to a state of emergency (SoE) declared during the coronavirus disease 2019 pandemic in Japan. RESULTS: We illustrate the ARIMAITS-DL model with some practical distributed lag terms to examine the effect of the SoE on human mobility in Japan. We confirm that the SoE was successful in reducing the movement of people (15.0-16.0% reduction in Tokyo), at least between February 20 and May 19, 2020. We also provide the R code for other researchers to easily replicate our method. CONCLUSIONS: Our model, ARIMAITS-DL, is a useful tool as it can account for unclear intervention timing and distributed lag effects with autocorrelation, and it allows for flexible modeling of different types of impacts, such as impacts distributed uniformly or normally over time.
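A minimal version of the ARIMAITS-DL idea, an ARIMA error structure plus an intervention whose effect is smeared over several days by a distributed-lag weight, can be sketched with statsmodels (the authors provide R code; the Python translation here is ours). The mobility series, lag weights, and ARIMA order below are illustrative assumptions, not the specification fitted in the paper.

```python
import numpy as np
import pandas as pd
from scipy.stats import norm
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(0)
dates = pd.date_range("2020-01-15", periods=150, freq="D")

# Intervention indicator (hypothetical SoE from day 60) convolved with a normal-density lag
# weight, so the effect phases in over roughly two weeks instead of switching on instantly.
soe = (np.arange(len(dates)) >= 60).astype(float)
weights = norm.pdf(np.arange(15), loc=7, scale=3)
weights /= weights.sum()
soe_dl = np.convolve(soe, weights)[: len(dates)]

# Simulated mobility index: AR(1) noise plus a -15% effect entering through the lagged term
noise = np.zeros(len(dates))
for t in range(1, len(dates)):
    noise[t] = 0.7 * noise[t - 1] + rng.normal(0, 1)
mobility = 100 + noise - 15 * soe_dl

model = SARIMAX(pd.Series(mobility, index=dates), exog=soe_dl, order=(1, 0, 0), trend="c")
fit = model.fit(disp=False)
print(fit.params)   # the exog coefficient estimates the fully phased-in SoE effect
```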
Subject(s)
COVID-19 , COVID-19/epidemiology , COVID-19/prevention & control , Humans , Interrupted Time Series Analysis , Linear Models , Pandemics/prevention & control , Time Factors
ABSTRACT
BACKGROUND: The impact of promotional tweets from the official journal account (for Circulation Journal and Circulation Reports) on article viewership has not been thoroughly evaluated. METHODS AND RESULTS: We retrospectively collected journal viewership data for Circulation Journal and Circulation Reports from March 2021 to August 2021. We compared viewership between articles with (n=15) and without (n=250) tweets. After 1:4 propensity score matching (15 tweeted articles and 60 non-tweeted matched controls), journal viewership metrics within 7 days of the tweeting date (and the hypothetical tweeting date) were larger in tweeted articles than in non-tweeted articles (median [interquartile range] abstract page views: 89 [60-104] vs. 18 [8-41]). CONCLUSIONS: This pilot study suggests a positive relationship between journal-posted promotional tweets and article viewership.
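A 1:4 propensity score match of the kind described above can be sketched in a few lines: estimate each article's probability of being tweeted from pre-tweet characteristics, then pair every tweeted article with its four nearest non-tweeted neighbours on that score. The covariates, greedy matching without a caliper, and synthetic data are assumptions for illustration, not the study's protocol.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 265
df = pd.DataFrame({
    "tweeted": np.r_[np.ones(15), np.zeros(n - 15)].astype(int),
    "article_age_days": rng.integers(10, 180, n),     # hypothetical pre-exposure covariates
    "original_article": rng.binomial(1, 0.7, n),
})

# Step 1: propensity score = P(tweeted | covariates)
X = df[["article_age_days", "original_article"]]
df["pscore"] = LogisticRegression().fit(X, df["tweeted"]).predict_proba(X)[:, 1]

# Step 2: greedy 1:4 nearest-neighbour matching without replacement
controls = df[df["tweeted"] == 0].copy()
matched_control_idx = []
for _, row in df[df["tweeted"] == 1].iterrows():
    nearest = (controls["pscore"] - row["pscore"]).abs().nsmallest(4).index
    matched_control_idx.extend(nearest)
    controls = controls.drop(index=nearest)            # each control is used at most once
matched = df.loc[list(df[df["tweeted"] == 1].index) + matched_control_idx]
print(matched["tweeted"].value_counts())               # 15 tweeted vs. 60 matched controls
```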
Subject(s)
Social Media , Benchmarking , Humans , Japan , Pilot Projects , Retrospective Studies
ABSTRACT
BACKGROUND: Increases in human mobility have been linked to rises in novel coronavirus disease 2019 (COVID-19) transmission. The pandemic era in Japan has been characterized by changes in inter-prefectural mobility across state of emergency (SOE) declarations and travel campaigns, but these changes have not yet been quantified. METHODS: Using Yahoo Japan mobility data extracted from the smartphones of more than 10 million Japanese residents, we calculated the monthly number of inter-prefectural travel instances, stratified by residential prefecture and destination prefecture. We then used this adjacency matrix to calculate two network connectedness metrics, closeness centrality and effective distance, that reliably predict disease transmission. RESULTS: Inter-prefectural mobility and network connectedness decreased most considerably during the first SOE, but this decrease dampened with each successive SOE. Mobility and network connectedness increased during the Go To Travel campaign. Travel volume between distant prefectures decreased more than travel between prefectures with geographic proximity. Closeness centrality was found to be negatively correlated with the rate of COVID-19 infection across prefectures, with the strength of this association increasing in tandem with the infection rate. Changes in effective distance were more visible among geographically isolated prefectures (Hokkaido and Okinawa) than among metropolitan, central prefectures (Tokyo, Aichi, Osaka, and Fukuoka). CONCLUSION: The magnitude of reductions in human mobility decreased with each subsequent state of emergency, consistent with pandemic fatigue. The association between network connectedness and rates of COVID-19 infection remained visible throughout the entirety of the pandemic period, suggesting that inter-prefectural mobility may have contributed to disease spread.
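The closeness-centrality metric used above can be computed directly from an inter-prefectural travel matrix with networkx; the three-prefecture volumes below are invented to show the mechanics and are not the Yahoo Japan data.

```python
import networkx as nx

# Hypothetical monthly inter-prefectural travel counts (directed: residence -> destination)
travel = {
    ("Tokyo", "Kanagawa"): 1_200_000, ("Kanagawa", "Tokyo"): 1_100_000,
    ("Tokyo", "Osaka"): 300_000, ("Osaka", "Tokyo"): 280_000,
    ("Osaka", "Kanagawa"): 40_000, ("Kanagawa", "Osaka"): 35_000,
}

G = nx.DiGraph()
for (origin, dest), volume in travel.items():
    # Use inverse volume as the edge length, so heavily travelled links count as "closer"
    G.add_edge(origin, dest, length=1.0 / volume)

closeness = nx.closeness_centrality(G, distance="length")
print(closeness)   # higher values = prefecture more tightly connected to the rest of the network
```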