ABSTRACT
Students with developmental disabilities often feel anxious about the change in environment when moving from high school to university. Existing research, which is scarce, has focused on the mental health status of students with developmental disabilities entering university. This study investigated the frequency of autism spectrum disorder (ASD) and attention-deficit/hyperactivity disorder (ADHD) among first-year Japanese university students and their mental health risks after admission. We conducted a cross-sectional survey of university students within a month of admission, using the Autism Spectrum Quotient (AQ) and the Adult ADHD Test (A-ADHD) to estimate the frequency of ASD and ADHD. The Counseling Center Assessment of Psychological Symptoms (CCAPS)-Japanese (depression, eating concerns, hostility, social anxiety, family distress, alcohol use, generalized anxiety, and academic distress) evaluated their mental health condition. Of 711 students (20.3 ± 2.1 years; 330 male, 381 female), 61 (8.58%) showed either ASD or ADHD tendencies: 23 (3.23%) showed symptoms of ASD only, 34 (4.78%) of ADHD only, and 4 (0.56%) of both ASD and ADHD. No significant differences in the frequency of ASD and ADHD existed between sexes or across majors. The scores and the frequency of high-risk students (those over the cut-off points) on all CCAPS-Japanese subscales except alcohol use were significantly higher in the ASD and ADHD groups than in the control group, which showed no ASD or ADHD tendencies. The frequency of ASD and ADHD characteristics among first-year Japanese university students was 8.58%, and these students have a high risk of mental health problems when they enter university.
ABSTRACT
Owing to its mitogenic and angiogenic properties, basic fibroblast growth factor (bFGF) has been investigated as a means of promoting wound healing. However, its clinical efficacy has fallen short of expectations because of its instability. Heparin has been reported to stabilize bFGF. We therefore hypothesized that the combination of these agents would promote wound healing more effectively than bFGF alone, and performed a single-center, two-arm parallel, single-blind, prospective randomized controlled pilot study involving 12 patients who underwent split-thickness skin graft harvesting. To ensure a feasible clinical treatment model, commercially available agents were used. The patients were randomly assigned in a 1:1 ratio to either the control group treated with bFGF (n = 6) or the intervention group treated with bFGF and heparin (n = 6). The wound area and its variation were assessed weekly postoperatively, as was the number of days required for epithelialization. As a supplementary analysis, least-squares means were calculated using a linear mixed-effects model. The results of this study indicate that the combination of bFGF and heparin may promote wound healing more effectively than bFGF alone, consistent with our hypothesis. A multicenter trial based on these data is ongoing.
ABSTRACT
This study aimed to validate direct fast scarlet (DFS) staining for the diagnosis of eosinophilic colitis (EC). The study included 50 patients with EC and 60 controls. Among the 60 control samples, 39 and 21 were collected from the ascending and descending colon, respectively. We compared the median number of eosinophils and the frequency of eosinophil degranulation on hematoxylin and eosin (HE) and DFS staining between the EC and control groups. In the right hemi-colon, the eosinophil count on HE was useful in distinguishing EC from controls (41.5 vs. 26.0 cells per high-power field [HPF], p < 0.001), with an ideal cutoff value of 27.5 cells/HPF. However, this method was not useful in the left hemi-colon (12.5 vs. 13.0 cells/HPF, p = 0.990). The presence of degranulation on DFS distinguished the groups even in the left hemi-colon (58% vs. 5%, p < 0.001), and DFS staining also enabled a more accurate determination of degranulation than HE. Under the current standard for diagnosing EC (count on HE staining ≥20 cells/HPF), mucosal sampling from the left hemi-colon is problematic because the eosinophil count may not be elevated even in EC. Identifying degranulated eosinophils with DFS may improve diagnostic performance in such conditions.
ABSTRACT
BACKGROUND: Sarcopenia is known to bring about adverse outcomes in elderly populations and dialysis patients. However, whether it is a risk factor in kidney transplant recipients (KTRs) has not yet been established. In the present study, the association of sarcopenia with mortality was investigated in KTRs. METHODS: We conducted a single-center prospective cohort study and recruited KTRs who were more than 1 year posttransplant from August 2017 to January 2018. The participants were followed for 5 years, and the Kaplan-Meier method and Cox proportional hazards model were used to assess patient survival. RESULTS: A total of 212 KTRs with a median age of 54 years and a median transplant vintage of 79 months were enrolled in this study. Among them, 33 (16%) had sarcopenia at baseline according to the Asian Working Group for Sarcopenia 2019 criteria. During the 5-year follow-up period, 20 (9.4%) died, 5 returned to dialysis after graft loss, and 4 were lost to follow-up. The 5-year overall survival rate was 90%. After 1:1 propensity score matching, a matched cohort of 60 KTRs was generated. The overall survival rate was significantly lower in the sarcopenia group than in the non-sarcopenia group (p = 0.025, log-rank test). Furthermore, mortality risk was higher in the sarcopenia group than in the non-sarcopenia group (hazard ratio = 7.57, 95% confidence interval = 0.94-62). CONCLUSION: Sarcopenia was a predictor of mortality in KTRs. KTRs with suboptimal muscle status, who are at risk for poor survival, could benefit clinically from interventions for sarcopenia.
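The survival comparison above rests on Kaplan-Meier estimation. As a rough illustration of how the estimator works, here is a minimal pure-Python sketch (the follow-up data below are illustrative, not the study's cohort):

```python
# Minimal Kaplan-Meier estimator (illustrative sketch, not the study's data).
# Each subject is (time, event): event=1 for death, 0 for censoring.

def kaplan_meier(subjects):
    """Return [(time, survival_probability)] at each event time."""
    event_times = sorted({t for t, e in subjects if e == 1})
    survival = 1.0
    curve = []
    for t in event_times:
        at_risk = sum(1 for ti, _ in subjects if ti >= t)
        deaths = sum(1 for ti, e in subjects if ti == t and e == 1)
        survival *= 1.0 - deaths / at_risk  # multiply by conditional survival
        curve.append((t, survival))
    return curve

# Example: follow-up in months; 0 = censored (alive or lost), 1 = died.
cohort = [(12, 1), (20, 0), (25, 1), (40, 0), (60, 0), (60, 0)]
for t, s in kaplan_meier(cohort):
    print(t, round(s, 3))
```

A log-rank test then compares two such curves (e.g. sarcopenia vs. non-sarcopenia), and the Cox model expresses the group difference as a hazard ratio.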
Subjects
Kidney Transplantation; Propensity Score; Sarcopenia; Humans; Sarcopenia/mortality; Sarcopenia/complications; Sarcopenia/diagnosis; Kidney Transplantation/adverse effects; Kidney Transplantation/mortality; Male; Female; Middle Aged; Prospective Studies; Risk Factors; Survival Rate; Adult; Kaplan-Meier Estimate; Aged; Proportional Hazards Models; Transplant Recipients/statistics & numerical data; Kidney Failure, Chronic/mortality; Kidney Failure, Chronic/complications; Kidney Failure, Chronic/surgery
ABSTRACT
Introduction: Young adults are hesitant to receive coronavirus disease 2019 (COVID-19) vaccination owing to concerns about adverse events, despite the effectiveness of vaccines in preventing serious illness, hospitalization, and death associated with SARS-CoV-2 infection. Methods: A retrospective cohort study was conducted in Gifu University students receiving the mRNA-1273 vaccine and boosters to elucidate the actual incidence of adverse events and the factors that prevent them. We examined adverse events and identified potential risk factors through a self-administered questionnaire on the participants' physical condition after COVID-19 vaccination. Results: Local and systemic adverse events were highly frequent among university students after COVID-19 vaccination; however, there were no life-threatening cases or hospitalizations over two years. A higher number of vaccinations (p < 0.001), female sex (p < 0.001), and lower body mass index (BMI) (p = 0.002) were associated with an increased incidence of adverse events on the day of COVID-19 vaccination or the day after. Regular breakfast consumption was significantly associated with a decreased incidence of post-vaccination itching (p = 0.019) and of abdominal pain and diarrhea (p = 0.042), and sufficient sleep duration with a decreased incidence of post-vaccination abdominal pain and diarrhea (p = 0.042). Conclusions: Adverse events of the COVID-19 mRNA-1273 vaccine were highly frequent among Japanese university students. A higher number of vaccinations, female sex, and lower BMI were associated with a higher incidence of adverse events, whereas regular breakfast and sufficient sleep were associated with fewer adverse events. This study may suggest a possible solution to the worldwide problem of vaccine hesitancy.
ABSTRACT
OBJECTIVE: Risk factors for residual liver recurrence after resection of colorectal cancer liver metastases were analyzed separately for synchronous and metachronous metastases. METHODS: This retrospective study included 236 patients (139 with synchronous and 97 with metachronous lesions) who underwent initial surgery for colorectal cancer liver metastases from April 2010 to December 2021 at Fujita Health University Hospital. We performed univariate and multivariate analyses of risk factors for recurrence based on clinical background. RESULTS: For synchronous liver metastases, univariate analysis identified three factors associated with recurrence: positive lymph nodes (p = 0.018, HR = 2.067), ≥3 liver metastases (p < 0.001, HR = 2.382), and adjuvant chemotherapy, which was protective (p = 0.013, HR = 0.560). Multivariate analysis identified the same three factors. For metachronous liver metastases, univariate and multivariate analyses identified ≥3 liver metastases as a risk factor (p = 0.002, HR = 2.988), whereas adjuvant chemotherapy after hepatic resection was not associated with a lower risk of recurrence. In an inverse probability of treatment weighting analysis of patients with metachronous lesions stratified by adjuvant chemotherapy after primary resection, patients who had not received it had fewer recurrences when adjuvant therapy was administered after the subsequent liver resection, although the difference was not significant; patients who had received it showed less recurrence but derived less benefit from this treatment. CONCLUSION: Risk factors for liver recurrence after resection of synchronous liver metastases were positive lymph nodes, ≥3 liver metastases, and no postoperative adjuvant chemotherapy. Adjuvant chemotherapy is recommended after hepatic resection of synchronous liver metastases.
ABSTRACT
Background: Synthetic patient data (SPD) generation for survival analysis in oncology trials holds significant potential for accelerating clinical development. Various machine learning methods, including classification and regression trees (CART), random forest (RF), Bayesian networks (BN), and the conditional tabular generative adversarial network (CTGAN), have been used for this purpose, but how well they reflect actual patient survival data remains under investigation. Objective: The aim of this study was to determine the most suitable SPD generation method for oncology trials, focusing on both progression-free survival (PFS) and overall survival (OS), the primary evaluation end points in oncology trials. To this end, we conducted a comparative simulation of the 4 generation methods (CART, RF, BN, and CTGAN) and evaluated the performance of each. Methods: Using multiple clinical trial data sets, 1000 synthetic data sets were generated with each method for each clinical trial data set and evaluated as follows: (1) the median survival time (MST) of PFS and OS; (2) the hazard ratio distance (HRD), which indicates the similarity between the actual survival function and a synthetic survival function; and (3) visual analysis of Kaplan-Meier (KM) plots. Each method's ability to mimic the statistical properties of real patient data was evaluated from these multiple angles. Results: In most simulation cases, CART achieved high percentages of synthetic-data MSTs falling within the 95% CI of the MST of the actual data, ranging from 88.8% to 98.0% for PFS and from 60.8% to 96.1% for OS. In the HRD evaluation, the HRD values for CART were concentrated at approximately 0.9, whereas no consistent trend was observed for the other methods for either PFS or OS.
CART showed better similarity than RF: CART tends to overfit the training data, which in this setting preserves the statistical properties of the actual data, whereas RF (an ensemble learning approach) prevents such overfitting and thus smooths those properties away. In SPD generation, the goal is statistical properties close to the actual data, not a well-generalized prediction model. Neither the BN nor the CTGAN method could accurately reflect the statistical properties of the actual data, because these methods are not well suited to small data sets. Conclusions: As a method for generating SPD for survival data from small data sets, such as clinical trial data, CART proved the most effective compared with RF, BN, and CTGAN. CART-based generation might be improved further by incorporating feature engineering and other methods in future work.
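The MST evaluation above scores each generator by how often its synthetic medians land inside the 95% CI of the actual MST. A minimal sketch of that coverage metric (the values below are toy numbers, not the trial data):

```python
# Sketch of the MST-coverage metric used to score synthetic data sets:
# the share of synthetic median survival times (MSTs) falling inside the
# 95% CI of the actual MST. All values are illustrative.

def mst_coverage(synthetic_msts, ci_low, ci_high):
    """Percent of synthetic MSTs inside the actual MST's 95% CI."""
    inside = sum(1 for m in synthetic_msts if ci_low <= m <= ci_high)
    return 100.0 * inside / len(synthetic_msts)

# In the study, 1000 synthetic replicates per method would be summarized
# like this (here: 5 toy median-PFS values in months, and a toy CI):
msts = [9.8, 10.2, 11.5, 14.0, 10.9]
print(mst_coverage(msts, ci_low=9.5, ci_high=12.0))
```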
ABSTRACT
The purpose of this study was to investigate the utility of reconstructed CT images perpendicular to the artery for assessing arterial involvement by pancreatic cancer, and to compare interobserver variability between this approach and the current diagnostic imaging method. This retrospective study included patients with pancreatic cancer of the pancreatic body or tail who underwent preoperative pancreatic-protocol CT and distal pancreatectomy. Five radiologists used axial and coronal CT images (current method) and perpendicular reconstructed CT images (proposed method) to determine whether the degree of solid soft-tissue contact with the splenic artery was ≤180° or >180°. Generalized estimating equations were used to compare the diagnostic performance for solid soft-tissue contact >180° between the current and proposed methods, and Fleiss' κ statistics were used to assess interobserver variability. The sensitivity and negative predictive value for diagnosing solid soft-tissue contact >180° were higher (p < 0.001 for each), and the specificity (p = 0.003) and positive predictive value (p = 0.003) lower, with the proposed method than with the current method. Interobserver variability improved with the proposed method compared with the current method (κ = 0.87 vs. 0.67). Reconstructed CT images perpendicular to the artery thus showed higher sensitivity and negative predictive value for diagnosing solid soft-tissue contact >180° than the current method, with improved interobserver variability.
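Fleiss' κ, used above to quantify agreement among the five readers, can be computed directly from a table of rating counts. A minimal sketch with made-up reads (not the study's data):

```python
# Fleiss' kappa for multi-rater agreement (sketch with invented ratings).

def fleiss_kappa(table):
    """table[i][j] = number of raters assigning item i to category j."""
    n_items = len(table)
    n_raters = sum(table[0])
    # Observed per-item agreement P_i, then its mean P-bar
    p_items = [
        (sum(c * c for c in row) - n_raters) / (n_raters * (n_raters - 1))
        for row in table
    ]
    p_bar = sum(p_items) / n_items
    # Chance agreement P_e from the category marginals
    totals = [sum(row[j] for row in table) for j in range(len(table[0]))]
    grand = n_items * n_raters
    p_e = sum((t / grand) ** 2 for t in totals)
    return (p_bar - p_e) / (1.0 - p_e)

# 4 cases rated by 5 radiologists into two categories (<=180 deg, >180 deg):
ratings = [[5, 0], [4, 1], [0, 5], [1, 4]]
print(round(fleiss_kappa(ratings), 3))
```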
ABSTRACT
BACKGROUND: Stenotrophomonas maltophilia is a carbapenem-resistant Gram-negative pathogen increasingly responsible for difficult-to-treat nosocomial infections. OBJECTIVES: To describe the contemporary clinical characteristics and genomic epidemiology of patients colonized or infected by S. maltophilia in a multicentre, prospective cohort. METHODS: All patients with a clinical culture growing S. maltophilia were enrolled at six tertiary hospitals across Japan between April 2019 and March 2022. The clinical characteristics, outcomes, antimicrobial susceptibility and genomic epidemiology of cases with S. maltophilia were investigated. RESULTS: In total, 78 patients were included, representing 34 infection and 44 colonization cases. The median age was 72.5 years (IQR, 61-78), and males accounted for 53 cases (68%). The most common comorbidity was localized solid malignancy (39%). Nearly half of the patients (44%) were immunosuppressed, with antineoplastic chemotherapy accounting for 31%. The respiratory tract was the most common site of colonization (86%), whereas bacteraemia accounted for most infection cases (56%). The 30 day all-cause mortality rate was 21%, and was significantly higher in infection cases than in colonization cases (35% versus 9%; adjusted HR, 3.81; 95% CI, 1.22-11.96). Susceptibility rates to ceftazidime, levofloxacin, minocycline and sulfamethoxazole/trimethoprim were 14%, 65%, 87% and 100%, respectively. The percentage of infection ranged from 13% in the unclassified group to 86% in genomic group 6A. The percentage of non-susceptibility to ceftazidime ranged from 33% in genomic group C to 100% in genomic groups 6 and 7 and the geniculata genomic group. CONCLUSIONS: In this contemporary multicentre cohort, S. maltophilia primarily colonized the respiratory tract, whereas patients with bacteraemia had the highest mortality from this pathogen.
Sulfamethoxazole/trimethoprim remained consistently active, but susceptibility to levofloxacin was relatively low. The proportions of cases representing infection and of susceptibility to ceftazidime differed significantly across genomic groups.
Subjects
Anti-Bacterial Agents; Gram-Negative Bacterial Infections; Microbial Sensitivity Tests; Stenotrophomonas maltophilia; Humans; Stenotrophomonas maltophilia/genetics; Stenotrophomonas maltophilia/drug effects; Stenotrophomonas maltophilia/isolation & purification; Stenotrophomonas maltophilia/classification; Male; Aged; Japan/epidemiology; Female; Gram-Negative Bacterial Infections/epidemiology; Gram-Negative Bacterial Infections/microbiology; Gram-Negative Bacterial Infections/drug therapy; Middle Aged; Prospective Studies; Anti-Bacterial Agents/pharmacology; Anti-Bacterial Agents/therapeutic use; Cross Infection/microbiology; Cross Infection/epidemiology; Genome, Bacterial; Bacteremia/microbiology; Bacteremia/epidemiology; Molecular Epidemiology; Trimethoprim, Sulfamethoxazole Drug Combination/pharmacology; Trimethoprim, Sulfamethoxazole Drug Combination/therapeutic use
ABSTRACT
BACKGROUND: Medical care and long-term care utilization in the last year of life of frail older adults could be a key indicator of their quality of life. This study aimed to identify medical care expenditure (MCE) trajectories in the last year of life of frail older adults and to investigate the association between MCE and long-term care utilization in each trajectory. METHODS: This retrospective cohort study of three municipalities in Japan included 405 decedents (median age at death, 85 years; 189 women [46.7%]) from a cohort of 1,658 frail older adults aged ≥65 years who were newly certified at support level in the long-term care insurance program from April 2012 to March 2013. The study used long-term care and medical insurance claims data from April 2012 to March 2017. The primary outcome was MCE over the 12 months preceding death. Group-based trajectory modeling was conducted to identify the MCE trajectories, and a mixed-effects model was employed to examine the association between long-term care utilization and MCE in each trajectory. RESULTS: Participants were stratified into four groups based on MCE trajectories over the 12 months preceding death: rising (n = 159, 39.3%), persistently high (n = 143, 35.3%), minimal (n = 56, 13.8%), and descending (n = 47, 11.6%). Home-based long-term care utilization was associated with increased MCE in the descending trajectory (coefficient, 1.48; 95% confidence interval [CI], 1.35-1.62). Facility-based long-term care utilization was associated with reduced MCE in the rising trajectory (coefficient, 0.59; 95% CI, 0.50-0.69). Both home-based (coefficient, 0.92; 95% CI, 0.85-0.99) and facility-based (coefficient, 0.53; 95% CI, 0.41-0.63) long-term care utilization were associated with reduced MCE in the persistently high trajectory. CONCLUSIONS: These findings may facilitate the integration of medical and long-term care models at the end of life for frail older adults.
Subjects
Frail Elderly; Health Expenditures; Long-Term Care; Humans; Female; Aged, 80 and over; Male; Retrospective Studies; Frail Elderly/statistics & numerical data; Aged; Long-Term Care/economics; Long-Term Care/statistics & numerical data; Health Expenditures/statistics & numerical data; Terminal Care/economics; Japan; Quality of Life
ABSTRACT
BACKGROUND: Mirtazapine blocks 5-hydroxytryptamine (5-HT) type 2A, 5-HT2C, 5-HT3 and histamine H1 receptors, similarly to olanzapine. This study aimed to investigate the efficacy and safety of mirtazapine plus granisetron and dexamethasone for carboplatin (CBDCA)-induced nausea and vomiting in patients with thoracic cancers. METHODS: We conducted a prospective, open-label, single-arm, multicenter, phase II trial in four institutions in Japan. Registered patients were naïve to moderately and highly emetogenic chemotherapy and were scheduled to receive CBDCA at an area under the curve (AUC) of ≥4 mg/mL per minute. Patients received mirtazapine 15 mg/day orally at bedtime for four consecutive days, in combination with granisetron and dexamethasone. The primary endpoint was the complete response (CR; no emesis and no use of rescue medication) rate during the delayed period (24-120 h). RESULTS: Between July 2022 and July 2023, 52 patients were enrolled, and 48 were evaluated. CR rates in the delayed (24-120 h), overall (0-120 h), and acute (0-24 h) periods were 83.3%, 83.3%, and 100%, respectively. No grade 3 or higher treatment-related adverse events were observed, except in one patient who had grade 3 dry mouth as evaluated by the Common Terminology Criteria for Adverse Events version 5.0. CONCLUSIONS: Prophylactic antiemetic therapy with mirtazapine plus granisetron and dexamethasone shows promising efficacy and an acceptable safety profile. This three-drug combination appears to be a reasonable treatment approach in patients with thoracic cancers receiving a CBDCA-based regimen at AUC ≥4 mg/mL per minute.
Subjects
Antiemetics; Carboplatin; Dexamethasone; Granisetron; Mirtazapine; Nausea; Vomiting; Humans; Granisetron/administration & dosage; Granisetron/therapeutic use; Male; Mirtazapine/therapeutic use; Mirtazapine/administration & dosage; Female; Dexamethasone/administration & dosage; Dexamethasone/therapeutic use; Middle Aged; Aged; Nausea/chemically induced; Nausea/drug therapy; Vomiting/chemically induced; Vomiting/drug therapy; Prospective Studies; Carboplatin/adverse effects; Carboplatin/administration & dosage; Antiemetics/therapeutic use; Antiemetics/administration & dosage; Thoracic Neoplasms/drug therapy; Thoracic Neoplasms/pathology; Adult; Antineoplastic Combined Chemotherapy Protocols/adverse effects; Antineoplastic Combined Chemotherapy Protocols/therapeutic use; Aged, 80 and over; Japan; Drug Therapy, Combination
ABSTRACT
PURPOSE: This study aimed to assess the imaging features and postoperative natural course of 18F-fluorodeoxyglucose (FDG) uptake in the cervical muscles after neck dissection. MATERIALS AND METHODS: This study included 83 patients who underwent preoperative and postoperative 18F-FDG-PET/CT and were diagnosed with head and neck malignancy treated by neck dissection. Postoperative 18F-FDG-PET/CT was performed within 5 years after neck dissection. Preoperative and postoperative FDG uptake of the trapezius, sternocleidomastoid, scalene, pectoralis major, and deltoid muscles was visually assessed. Increased postoperative uptake was defined as visually higher postoperative than preoperative FDG uptake in the corresponding muscle. The maximum standardized uptake value (SUVmax) was measured in cases with increased postoperative uptake. RESULTS: Increased postoperative uptake was observed in 43 patients (52%). The trapezius (31/83, 37%), sternocleidomastoid (19/83, 23%), and scalene (12/83, 14%) muscles were involved, whereas the pectoralis major and deltoid muscles were not. Increased postoperative uptake was observed on the dissected side in all 43 patients. Significant associations between SUVmax estimated from the mixed-effects model and postoperative months were observed in the trapezius muscle (coefficient (β) = -0.038; 95% confidence interval (CI): [-0.047, -0.028]; p < 0.001) and sternocleidomastoid muscle (β = -0.015; 95% CI: [-0.029, -0.001]; p = 0.046). CONCLUSIONS: Increased postoperative uptake in the cervical muscles was observed on the dissected side in approximately half of the patients after neck dissection, and the SUVmax in the trapezius and sternocleidomastoid muscles decreased over time after surgery.
Subjects
Fluorodeoxyglucose F18; Head and Neck Neoplasms; Neck Dissection; Neck Muscles; Positron Emission Tomography Computed Tomography; Radiopharmaceuticals; Humans; Fluorodeoxyglucose F18/pharmacokinetics; Male; Female; Positron Emission Tomography Computed Tomography/methods; Middle Aged; Radiopharmaceuticals/pharmacokinetics; Aged; Neck Muscles/diagnostic imaging; Neck Muscles/metabolism; Head and Neck Neoplasms/surgery; Head and Neck Neoplasms/diagnostic imaging; Head and Neck Neoplasms/metabolism; Adult; Postoperative Period; Retrospective Studies; Aged, 80 and over
ABSTRACT
PURPOSE: To compare the diagnostic performance of 40 keV and 70 keV virtual monoenergetic images (VMIs) generated from dual-energy CT in the detection of pancreatic cancer. METHODS: This retrospective study included patients who underwent pancreatic-protocol dual-energy CT from January 2019 to August 2022. Four radiologists (1-11 years of experience), blinded to the final diagnosis, independently interpreted the 40 keV and 70 keV VMIs in random order and graded the presence or absence of pancreatic cancer. For each image set (40 keV and 70 keV VMIs), the sensitivity, specificity, positive predictive value (PPV), negative predictive value (NPV), and accuracy were calculated, and the diagnostic performance of each image set was compared using generalized estimating equations. RESULTS: Overall, 137 patients (median age, 71 years; interquartile range, 63-78 years; 77 men) were included, of whom 62 (45%) had pathologically proven pancreatic cancer. The 40 keV VMIs had higher specificity (75% vs. 67%; P < .001), PPV (76% vs. 71%; P < .001), and accuracy (85% vs. 81%; P = .001) than the 70 keV VMIs, but lower sensitivity (96% vs. 98%; P = .02) and NPV (96% vs. 98%; P = .004). Diagnostic confidence in patients with (P < .001) and without (P = .001) pancreatic cancer was higher with the 40 keV VMIs than with the 70 keV VMIs. CONCLUSIONS: The 40 keV VMIs showed better diagnostic performance for pancreatic cancer than the 70 keV VMIs, along with higher reader confidence.
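The reported sensitivity, specificity, PPV, NPV, and accuracy all derive from a 2×2 confusion matrix. A minimal sketch (the counts are hypothetical, chosen only to roughly mirror the reported 40 keV rates for 137 patients, 62 with cancer):

```python
# Diagnostic metrics from a 2x2 confusion matrix (illustrative counts).
# tp/fn: cancer cases called positive/negative; fp/tn: non-cancer cases.

def diagnostic_metrics(tp, fp, fn, tn):
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
        "accuracy": (tp + tn) / (tp + fp + fn + tn),
    }

m = diagnostic_metrics(tp=60, fp=19, fn=2, tn=56)
for name, value in m.items():
    print(name, round(value, 3))
```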
Subjects
Pancreatic Neoplasms; Radiography, Dual-Energy Scanned Projection; Sensitivity and Specificity; Tomography, X-Ray Computed; Humans; Pancreatic Neoplasms/diagnostic imaging; Male; Female; Retrospective Studies; Middle Aged; Aged; Tomography, X-Ray Computed/methods; Radiography, Dual-Energy Scanned Projection/methods; Predictive Value of Tests
ABSTRACT
Recently, conversion surgery (CS) has been reported to improve the prognosis of patients with unresectable pancreatic ductal adenocarcinoma (UR-PDAC) who respond favorably to intensive chemotherapy or chemoradiotherapy. However, few pretherapeutic parameters predict the attainability of CS in patients with UR-PDAC. The present study aimed to explore pretherapeutic predictors of the attainability of CS, retrospectively evaluating 130 patients with UR-PDAC treated at Gifu University Hospital (Gifu, Japan) from January 2015 to December 2021. Survival analysis was performed using the Simon and Makuch-modified Kaplan-Meier method, and the hazard ratio (HR) was estimated using a time-varying Cox regression model. The association between each predictor and CS was evaluated using univariate analysis and an age-adjusted Fine-Gray sub-distribution hazard model. Bootstrap bias-corrected analysis of the area under the receiver operating characteristic curve for predicting CS was used to determine the cut-off value for each predictor, and the cumulative incidence of CS was calculated after dividing patients into two groups by each cut-off value. Among the 130 patients analyzed, only 14 (11%) underwent CS. The median survival time was significantly longer in patients who underwent CS than in those without CS (56.3 vs. 14.1 months; P<0.001). The age-adjusted Fine-Gray sub-distribution hazard regression showed that total protein (TP) [HR 2.81, 95% confidence interval (CI) 1.19-6.65; P=0.018], neutrophil-to-lymphocyte ratio (NLR) (HR 0.53, 95% CI 0.31-0.90; P=0.020), and lymphocyte-to-monocyte ratio (LMR) (HR 1.28, 95% CI 1.07-1.53; P=0.006) were significantly associated with CS. Moreover, TP ≥6.8, NLR <2.84 and LMR ≥3.87 were associated with a higher cumulative incidence of CS.
In conclusion, pretherapeutic TP, NLR and LMR are clinically feasible biomarkers for predicting the attainability of CS in patients with UR-PDAC.
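The cut-off values above come from ROC analysis. A minimal sketch of an empirical AUC (the Mann-Whitney form) together with a Youden-index cutoff, using toy marker values rather than the trial data:

```python
# Empirical AUC and Youden-optimal cutoff (toy values, not trial data).

def auc_and_cutoff(pos, neg):
    """pos: marker values in patients reaching the outcome; neg: the rest.
    Assumes higher values favor the positive class."""
    # AUC = P(random positive > random negative); ties count 0.5
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    auc = wins / (len(pos) * len(neg))
    # Youden index: cutoff maximizing sensitivity + specificity - 1
    best_cut, best_j = None, -1.0
    for c in sorted(set(pos + neg)):
        sens = sum(p >= c for p in pos) / len(pos)
        spec = sum(n < c for n in neg) / len(neg)
        if sens + spec - 1 > best_j:
            best_j, best_cut = sens + spec - 1, c
    return auc, best_cut

# e.g. a hypothetical marker such as total protein, by outcome group:
pos = [7.0, 7.2, 6.9, 7.5]
neg = [6.2, 6.6, 7.1, 6.4]
auc, cut = auc_and_cutoff(pos, neg)
print(round(auc, 3), cut)
```

The study additionally bootstraps this AUC for bias correction; the cutoff logic is unchanged.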
ABSTRACT
BACKGROUND/AIM: Progress has been made with triplet preoperative chemotherapy regimens for advanced esophageal cancer. We performed a preliminary investigation of radiomics features of pathological lymph node metastasis after neoadjuvant chemotherapy using dual-energy computed tomography (DECT). PATIENTS AND METHODS: From January to December 2022, 36 lymph nodes from 10 patients with advanced esophageal cancer who underwent contrast-enhanced DECT after neoadjuvant chemotherapy and radical surgery in our department were studied. Radiomics features were extracted from iodine-based material decomposition images at the portal venous phase constructed by DECT using MATLAB analysis software. Receiver operating characteristic (ROC) analysis was performed and cut-off values were determined for the presence or absence of pathological metastasis. RESULTS: ROC analysis for the short axis of pathologically positive lymph nodes showed an AUC of 0.713, whereas long run emphasis (LRE) within the gray-level run-length matrix (GLRLM) showed a higher AUC of 0.812. Sensitivity and specificity were 0.222 and 1 for lymph nodes with a short axis >10 mm, 0.722 and 0.833 for LRE within the GLRLM, 0.889 and 0.667 for small zone emphasis (SZE) within the gray-level size zone matrix (GLSZM), and 0.722 and 0.778 for zone percentage (ZP) within the GLSZM, respectively. Discrimination of metastases using radiomics showed significantly higher sensitivity than a lymph node short axis >10 mm (odds ratios for LRE, SZE, and ZP: 9.1, 28, and 9.1, respectively). CONCLUSION: Radiomics analysis using DECT may enable a more detailed evaluation of lymph node metastasis after neoadjuvant chemotherapy.
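Long run emphasis, the GLRLM feature highlighted above, rewards long runs of identical gray levels. A minimal sketch on a toy image with horizontal runs only (real radiomics pipelines quantize gray levels and aggregate runs over several directions):

```python
# Long run emphasis (LRE) from a gray-level run-length matrix,
# computed on a toy 2x4 image using horizontal runs only.

def glrlm_horizontal(image):
    """Return {(gray_level, run_length): count} for horizontal runs."""
    runs = {}
    for row in image:
        i = 0
        while i < len(row):
            j = i
            while j < len(row) and row[j] == row[i]:
                j += 1  # extend the run of identical gray levels
            key = (row[i], j - i)
            runs[key] = runs.get(key, 0) + 1
            i = j
    return runs

def long_run_emphasis(runs):
    """LRE = sum over runs of count * length^2, divided by total run count."""
    total = sum(runs.values())
    return sum(cnt * length ** 2 for (_, length), cnt in runs.items()) / total

img = [[1, 1, 1, 2],
       [2, 2, 3, 3]]
print(long_run_emphasis(glrlm_horizontal(img)))
```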
Subjects
Esophageal Neoplasms; Radiomics; Humans; Lymphatic Metastasis/pathology; Lymph Nodes/diagnostic imaging; Lymph Nodes/pathology; Tomography, X-Ray Computed/methods; Esophageal Neoplasms/diagnostic imaging; Esophageal Neoplasms/drug therapy; Esophageal Neoplasms/surgery; Retrospective Studies
ABSTRACT
Introduction: Acute kidney injury (AKI), with a fatality rate of 8.6%, is one of the most common forms of multiorgan failure in the intensive care unit (ICU). AKI should therefore be diagnosed early and interventions implemented early. Urinary liver-type fatty acid-binding protein (L-FABP) could aid in the diagnosis of AKI. Methods: In this prospective, single-center, observational study, we included 100 patients with trauma. Urinary L-FABP levels were measured using a semi-quantitative rapid assay kit 6 and 12 h after injury and classified as negative, weakly positive, or strongly positive under two protocols. Under protocol 1, only the 6-h measurement was used: negative levels were considered "negative," and weakly and strongly positive levels were considered "positive." Under protocol 2, strongly positive levels at 6 h after injury were considered "positive," and negative or weakly positive levels at 6 h were considered "positive" if they were weakly or strongly positive at 12 h after injury. Results: Fifteen patients were diagnosed with AKI. Under protocol 1, the odds ratio (OR) was 20.55 (p = 0.001) after adjustment for the injury severity score (ISS), contrast media use, and shock index. When the L-FABP levels at 6 and 12 h were similarly adjusted for those three factors, the OR was 18.24 (p < 0.001). The difference in ORs between protocols 1 and 2 was 1.619 (p = 0.04). Discussion: Associations between urinary L-FABP and AKI can be examined more precisely by performing measurements at both 6 and 12 h after injury than at 6 h only.
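The ORs above are adjusted estimates from regression; the unadjusted version of such an association is simply an odds ratio from a 2×2 table. A minimal sketch with a Wald 95% CI (the counts are illustrative, not the study's):

```python
# Odds ratio with a Wald 95% CI from a 2x2 table (illustrative counts).
import math

def odds_ratio_ci(a, b, c, d):
    """a, b = AKI yes/no among test-positive; c, d = AKI yes/no among
    test-negative. Wald interval on the log odds ratio."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, lo, hi

# Hypothetical: 12/32 L-FABP-positive vs. 3/68 L-FABP-negative develop AKI.
or_, lo, hi = odds_ratio_ci(12, 20, 3, 65)
print(round(or_, 2), round(lo, 2), round(hi, 2))
```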
ABSTRACT
Background: Olaparib and niraparib (poly adenosine diphosphate [ADP]-ribose polymerase [PARP] inhibitors) have significant antitumor activity in patients with ovarian cancer. However, the incidence of nausea and vomiting among patients on these drugs in clinical trials is rather high, and there are no guidelines on antiemetic treatment for nausea caused by oral anticancer agents. This study aimed to investigate the incidence of nausea and vomiting caused by PARP inhibitors and the actual use of antiemetic therapy in patients with gynecologic cancer. Methods: Patients with gynecologic cancer who were scheduled to receive PARP inhibitors were enrolled. Data on PARP inhibitor-induced nausea and vomiting were collected from patient diaries for 21 days. The primary endpoint was the incidence of vomiting during the 21 days after starting olaparib or niraparib. Results: Overall, between January 2020 and March 2023, 134 patients were enrolled. Of the 129 patients who were evaluated, 28 (21.7%) received prophylactic antiemetics for 21 days, and 101 (78.3%) did not. The overall incidence of PARP inhibitor-induced vomiting was 16.3%. The incidence of vomiting in the group that did not receive antiemetic prophylaxis was 13.9%. On dividing the group that did not receive antiemetic prophylaxis into the olaparib and niraparib subgroups, the incidence of vomiting was found to be 18.6% for the olaparib group and 10.3% for the niraparib group. Conclusion: The incidence of emesis without antiemetic prophylaxis among patients on olaparib and niraparib ranged from 10% to 30%. Therefore, olaparib and niraparib can be classified as low emetogenic risk, and prophylactic antiemetic therapy at the time of treatment initiation may be unnecessary.
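The incidence figures above are simple proportions; for a sense of their precision, a confidence interval can be attached. A minimal sketch using a 95% Wilson score interval, with event counts back-calculated from the reported percentages (approximate, not the study's raw data):

```python
import math

# Illustrative: cumulative incidence with a 95% Wilson score interval.
# The counts below are inferred from the reported percentages and are
# approximate, NOT the study's raw data.

def wilson_ci(events, n, z=1.96):
    """Return (proportion, lower, upper) for a 95% Wilson interval."""
    p = events / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return p, center - half, center + half

# ~13.9% vomiting among the 101 patients without antiemetic prophylaxis:
p, lo, hi = wilson_ci(14, 101)
print(f"{p:.1%} (95% CI {lo:.1%}-{hi:.1%})")  # 13.9% (95% CI 8.4%-21.9%)
```

The Wilson interval is preferred over the normal-approximation interval for small event counts such as these.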
ABSTRACT
BACKGROUND/AIM: Renal dysfunction necessitates S-1 dose reduction. However, decreased dihydropyrimidine dehydrogenase (DPD) activity may lead to adverse events due to 5-FU. The guidelines provided by pharmaceutical companies state that total bilirubin (T-Bil) should be ≤1.5× the upper limit of normal (ULN) as a reference value for safely taking S-1. Nevertheless, the relationship between the degree of liver dysfunction and S-1 dose reduction has not been clearly established. PATIENTS AND METHODS: This study focused on patients who received S-1 monotherapy for various types of cancer. The primary outcome was defined as the variation between blood sampling results on the test day and the subsequent test. The variation data were categorized based on the T-Bil level: Low T-Bil group (≤2.25) and High T-Bil group (>2.25). RESULTS: The number of patients who underwent S-1 monotherapy was 883, with 7,511 sampling records in total; the Low T-Bil group included 7,245 records and the High T-Bil group 266. Examination of the effect of the T-Bil group on clinical outcomes revealed a correlation with red blood cell (RBC) count, platelet (PLT) count, and T-Bil level. When the impact of the interaction between the T-Bil group and each of these clinical outcomes (RBC count, PLT count, and T-Bil level) was determined, each outcome showed a significant decrease in the High T-Bil group compared with the Low T-Bil group. CONCLUSION: S-1 administration in patients with liver dysfunction accompanied by elevated T-Bil levels may cause thrombocytopenia.
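The grouping step described above amounts to splitting sampling records at the T-Bil threshold of 2.25 and comparing an outcome's change between groups. A minimal sketch with hypothetical records (the field names and values are assumptions, not the study's data):

```python
# Minimal sketch of the grouping rule: records split by T-Bil at 2.25,
# then a between-group summary of an outcome's change (here, platelet
# count change between consecutive samplings). All values hypothetical.

records = [
    {"t_bil": 0.8, "plt_change": -5},
    {"t_bil": 1.4, "plt_change": -12},
    {"t_bil": 2.6, "plt_change": -35},
    {"t_bil": 3.1, "plt_change": -48},
]

low  = [r["plt_change"] for r in records if r["t_bil"] <= 2.25]
high = [r["plt_change"] for r in records if r["t_bil"] > 2.25]

def mean(xs):
    return sum(xs) / len(xs)

print(mean(low), mean(high))  # -8.5 -41.5
```

A larger mean decline in the high-T-Bil group, as in this toy example, is the pattern the abstract reports for PLT count.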
Subjects
Liver Diseases, Humans, Retrospective Studies, Bilirubin, Liver Function Tests
ABSTRACT
OBJECTIVES: About 90% of Japanese kidney transplantations are conducted from living donors, so their safety and the maintenance of their renal function are critical. This study aimed to identify factors that affect the compensation of renal function in living kidney donors after donor nephrectomy. METHODS: In a retrospective cohort study, we reviewed data from 120 patients who underwent nephrectomy as living kidney transplant donors in our department from 2012 to 2021. Univariable and multivariable linear regression analyses were performed for donor factors affecting renal function after donor nephrectomy. RESULTS: The multivariable linear regression model revealed that the donor's age (p = 0.025), preoperative estimated glomerular filtration rate (eGFR) (p < 0.001), and hemoglobin A1c (HbA1c) (p = 0.043) were independent risk factors for eGFR at six months after nephrectomy. The eGFR deterioration was more strongly associated with age in females than in males, whereas higher HbA1c values were more strongly associated with eGFR deterioration in males. Higher donor age and higher HbA1c each enhance the deterioration of eGFR six months after living donor nephrectomy. The data suggest that older age, particularly in female donors, and higher preoperative HbA1c in male donors have a harmful impact on renal function compensation.
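The regression analyses above model postoperative eGFR as a linear function of donor factors. A minimal sketch of the univariable case, fitting a least-squares line of six-month eGFR on donor age with hypothetical data points (not the cohort's):

```python
# Illustrative univariable least-squares fit of postoperative eGFR on
# donor age; the data points below are hypothetical, NOT the cohort's.

def ols(xs, ys):
    """Return (slope, intercept) of the least-squares line y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    a = sxy / sxx
    return a, my - a * mx

ages = [35, 45, 55, 65]
egfr = [72, 66, 60, 54]   # eGFR at six months (mL/min/1.73 m^2)
slope, intercept = ols(ages, egfr)
print(slope, intercept)   # -0.6 93.0
```

A negative slope corresponds to the reported finding that older donor age predicts lower eGFR at six months; the multivariable model extends the same idea to age, preoperative eGFR, and HbA1c jointly.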