Results 1 - 20 of 38
1.
Clin Appl Thromb Hemost ; 30: 10760296231221535, 2024.
Article in English | MEDLINE | ID: mdl-38591958

ABSTRACT

Hepatocellular carcinoma (HCC) is associated with high mortality as a result of poor prognosis and a lack of effective treatment options. This study retrospectively analyzed the clinical value of the platelet-to-lymphocyte ratio (PLR) for differentiating early hepatocellular carcinoma from liver cirrhosis. Three hundred and nine (309) patients, comprising 155 with HCC and 154 with liver cirrhosis, were enrolled. General clinical characteristics and blood parameters of each patient were collected, calculated, and retrospectively analyzed. The Mann-Whitney U test was used to compare the two groups. Receiver operating characteristic (ROC) curve analysis was performed to investigate the diagnostic potential of PLR for predicting HCC at a cut-off with high accuracy (area under the curve [AUC] > 0.80). Hemoglobin (HB) concentration, red blood cell (RBC) count, neutrophil (NEU) count, platelet count, PLR, and neutrophil-to-lymphocyte ratio (NLR) were significantly higher in HCC patients than in liver cirrhosis patients (p < 0.05). ROC curve analysis showed that the AUC, optimal cut-off value, sensitivity, and specificity of PLR for predicting HCC were 0.912, 98.7, 81.2%, and 80.6%, respectively. These results suggest that PLR is a potential biomarker for predicting early HCC.
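
The reported AUC, cut-off, sensitivity, and specificity are the kind of quantities that fall out of a standard ROC analysis. A minimal sketch of that calculation using scikit-learn and Youden's J to pick the cut-off; the labels and PLR values are toy data, not the study's patients:

```python
# Hypothetical ROC/cut-off sketch: 1 = HCC, 0 = cirrhosis; PLR values are toy numbers.
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

labels = np.array([1, 1, 0, 0, 1, 0, 1, 0])
plr = np.array([160.0, 120.0, 70.0, 95.0, 140.0, 60.0, 180.0, 80.0])

fpr, tpr, thresholds = roc_curve(labels, plr)
auc = roc_auc_score(labels, plr)

youden_j = tpr - fpr                      # sensitivity + specificity - 1
best = int(np.argmax(youden_j))
print(f"AUC = {auc:.3f}")
print(f"optimal cut-off = {thresholds[best]:.1f}, "
      f"sensitivity = {tpr[best]:.1%}, specificity = {1 - fpr[best]:.1%}")
```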


Subjects
Carcinoma, Hepatocellular, Liver Neoplasms, Humans, Carcinoma, Hepatocellular/diagnosis, Retrospective Studies, Liver Neoplasms/diagnosis, Lymphocytes, Liver Cirrhosis/diagnosis
2.
Adv Ther ; 41(6): 2299-2306, 2024 Jun.
Article in English | MEDLINE | ID: mdl-38619722

ABSTRACT

INTRODUCTION: Some people with type 2 diabetes (T2D) require intensive insulin therapy to manage their diabetes, which can increase the risk of diabetes-related hospitalizations. We hypothesized that initiation of real-time continuous glucose monitoring (RT-CGM), which continuously measures a user's glucose values and provides threshold- and trend-based alerts, would reduce diabetes-related emergency department (ED) and inpatient hospitalizations and concomitant costs. METHODS: A retrospective analysis of US healthcare claims data using Optum's de-identified Clinformatics® Data Mart database was performed. The cohort consisted of commercially insured, CGM-naïve individuals with T2D who initiated the Dexcom G6 RT-CGM system between August 1, 2018, and March 31, 2021. Twelve months of continuous health plan enrollment before and after RT-CGM initiation was required to capture baseline and follow-up rates of diabetes-related hospitalizations and associated healthcare resource utilization (HCRU) costs. Analyses were performed for claims with a diabetes-related diagnosis code in either (1) any position or (2) first or second position on the claim. RESULTS: A total of 790 individuals met the inclusion criteria. The mean (SD) age was 52.8 (10.5) years, 53.3% were male, and 76.3% were white. For claims with a diabetes-related diagnosis code in any position, the number of individuals with ≥ 1 ED visit decreased by 30.0% (p = 0.01) and the number with ≥ 1 inpatient visit decreased by 41.5% (p < 0.0001). The number of diabetes-related visits and the average number of visits per person similarly decreased by at least 31.4%. Larger relative decreases were observed for claims with a diabetes-related diagnosis code in the first or second position on the claim. Total diabetes-related costs expressed as per-person-per-month (PPPM) decreased by $341 PPPM for any position and $330 PPPM for first or second position. CONCLUSION: Initiation of Dexcom G6 among people with T2D using intensive insulin therapy was associated with a significant reduction in diabetes-related ED and inpatient visits and related HCRU costs. Expanded use of RT-CGM could augment these benefits and result in further cost reductions.
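
A hedged sketch of the before/after comparison behind these figures: the share of people with at least one ED visit and the per-person-per-month (PPPM) cost change over matched 12-month windows. The four-row DataFrame and its column names are illustrative, not the Clinformatics schema:

```python
# Hypothetical pre/post comparison: each row is one person with 12 months of baseline
# and 12 months of follow-up.
import pandas as pd

df = pd.DataFrame({
    "ed_visits_pre":  [0, 2, 1, 0],
    "ed_visits_post": [0, 1, 0, 0],
    "cost_pre":       [12000, 30000, 18000, 9000],    # 12-month diabetes-related cost ($)
    "cost_post":      [8000, 25000, 12000, 7000],
})

share_pre = (df["ed_visits_pre"] >= 1).mean()          # share with >=1 ED visit, baseline
share_post = (df["ed_visits_post"] >= 1).mean()        # share with >=1 ED visit, follow-up
pppm_change = (df["cost_post"].mean() - df["cost_pre"].mean()) / 12

print(f"share with >=1 ED visit: {share_pre:.1%} -> {share_post:.1%}")
print(f"relative change: {(share_post - share_pre) / share_pre:+.1%}")
print(f"diabetes-related cost change: {pppm_change:,.0f} $ PPPM")
```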


Subjects
Blood Glucose Self-Monitoring, Diabetes Mellitus, Type 2, Hospitalization, Hypoglycemic Agents, Insulin, Humans, Diabetes Mellitus, Type 2/drug therapy, Diabetes Mellitus, Type 2/economics, Male, Female, Middle Aged, Retrospective Studies, Hospitalization/economics, Hospitalization/statistics & numerical data, Insulin/therapeutic use, Insulin/economics, Hypoglycemic Agents/therapeutic use, Hypoglycemic Agents/economics, Blood Glucose Self-Monitoring/economics, Blood Glucose Self-Monitoring/methods, Adult, Aged, Blood Glucose/analysis, Health Care Costs/statistics & numerical data, United States
3.
J Med Internet Res ; 26: e47125, 2024 Apr 18.
Article in English | MEDLINE | ID: mdl-38422347

ABSTRACT

BACKGROUND: The adoption of predictive algorithms in health care comes with the potential for algorithmic bias, which could exacerbate existing disparities. Fairness metrics have been proposed to measure algorithmic bias, but their application to real-world tasks is limited. OBJECTIVE: This study aims to evaluate the algorithmic bias associated with the application of common 30-day hospital readmission models and to assess the usefulness and interpretability of selected fairness metrics. METHODS: We used 10.6 million adult inpatient discharges from Maryland and Florida from 2016 to 2019 in this retrospective study. Three models predicting 30-day hospital readmissions were evaluated: the LACE Index, a modified HOSPITAL score, and a modified Centers for Medicare & Medicaid Services (CMS) readmission measure, each applied as-is (using existing coefficients) and retrained (recalibrated with 50% of the data). Predictive performance and bias measures were evaluated overall, between Black and White populations, and between low- and other-income groups. Bias measures included the parity of false negative rate (FNR), false positive rate (FPR), 0-1 loss, and the generalized entropy index. Racial bias, represented by FNR and FPR differences, was stratified to explore shifts in algorithmic bias across populations. RESULTS: The retrained CMS model demonstrated the best predictive performance (area under the curve: 0.74 in Maryland and 0.68-0.70 in Florida), and the modified HOSPITAL score demonstrated the best calibration (Brier score: 0.16-0.19 in Maryland and 0.19-0.21 in Florida). Calibration was better in White (compared to Black) populations and other-income (compared to low-income) groups, and the area under the curve was higher or similar in the Black (compared to White) populations. The retrained CMS model and the modified HOSPITAL score had the lowest racial and income bias in Maryland. In Florida, both of these models had the lowest income bias overall, and the modified HOSPITAL score showed the lowest racial bias. In both states, the White and higher-income populations showed a higher FNR, while the Black and low-income populations showed a higher FPR and a higher 0-1 loss. When stratified by hospital and population composition, these models demonstrated heterogeneous algorithmic bias in different contexts and populations. CONCLUSIONS: Caution must be taken when interpreting fairness measures at face value. A higher FNR or FPR could reflect missed opportunities or wasted resources, but these measures could also reflect health care use patterns and gaps in care. Relying solely on statistical notions of bias could obscure or underplay the causes of health disparity. Imperfect health data, analytic frameworks, and the underlying health systems must be carefully considered. Fairness measures can serve as a useful routine assessment to detect disparate model performance but are insufficient to inform mechanisms or policy changes. However, such an assessment is an important first step toward data-driven improvement to address existing health disparities.
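
The bias measures named above are straightforward to compute once predictions and group membership are available. A minimal sketch (toy arrays, not the discharge data) of per-group FNR, FPR, and 0-1 loss, plus the generalized entropy index:

```python
# Toy illustration of per-group error-rate parity and the generalized entropy index.
import numpy as np

y_true = np.array([1, 0, 1, 0, 1, 0, 0, 1])   # 1 = readmitted within 30 days
y_pred = np.array([1, 0, 0, 1, 1, 0, 1, 0])   # model's binary prediction
group  = np.array(["A", "A", "A", "A", "B", "B", "B", "B"])

def error_rates(y_t, y_p):
    fnr = np.mean(y_p[y_t == 1] == 0)   # missed readmissions
    fpr = np.mean(y_p[y_t == 0] == 1)   # false alarms
    loss = np.mean(y_t != y_p)          # 0-1 loss
    return fnr, fpr, loss

for g in np.unique(group):
    fnr, fpr, loss = error_rates(y_true[group == g], y_pred[group == g])
    print(f"group {g}: FNR={fnr:.2f}  FPR={fpr:.2f}  0-1 loss={loss:.2f}")

# Generalized entropy index (alpha = 2) over per-individual "benefits"
# b_i = y_pred - y_true + 1, following the Speicher et al. formulation.
b = y_pred - y_true + 1
gei = np.mean((b / b.mean()) ** 2 - 1) / 2
print(f"generalized entropy index: {gei:.3f}")
```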


Subjects
Medicare, Patient Readmission, Aged, Adult, Humans, United States, Retrospective Studies, Hospitals, Florida/epidemiology
4.
Environ Sci Pollut Res Int ; 30(54): 115322-115336, 2023 Nov.
Article in English | MEDLINE | ID: mdl-37884723

ABSTRACT

China's critical reliance on well-crafted public policies, coupled with effective execution of central government directives at the local level, drives achievement of the "dual carbon" goal of peaking CO2 emissions and attaining carbon neutrality. Examining policy records can therefore unveil the holistic strategy for attaining carbon neutrality while CO2 emissions peak and, at the same time, highlight potential obstacles to policy implementation. In this study, we adopt a policy-instruments perspective to investigate data on policies addressing peak CO2 emissions across 29 provincial administrative regions in China. We use NVivo 12 software to conduct a quantitative literature assessment and content analysis and establish a theoretical framework for the policy process. This framework encompasses dimensions such as political feasibility, regional coordination, attributes of low-carbon initiatives, and policy refinement. We then employ the model to carry out a retrospective analysis of policy documents pertaining to peak CO2 emissions in China. Our findings underscore the pivotal role of political feasibility in shaping policy effectiveness, while also highlighting the facilitative influence of regional coordination, shedding light on the essential synergy between provinces and cities in achieving emission-reduction goals. The results likewise highlight the motivating effect of specific attributes of low-carbon initiatives. Moreover, policy refinement is identified as a critical driver in advancing the path toward carbon neutrality. Consequently, to achieve carbon neutrality, it is imperative for every province and city to reach its CO2 emissions peak in sequence. Our research offers a comprehensive "China strategy," providing insights to guide future policy formulation and accelerate progress toward sustainable environmental objectives.


Subjects
Carbon Dioxide, Public Policy, Retrospective Studies, Carbon, China, Economic Development
5.
Front Surg ; 10: 1148274, 2023.
Article in English | MEDLINE | ID: mdl-37151867

ABSTRACT

Background: Approximately 3.2%-6% of the general population harbor an unruptured intracranial aneurysm (UIA). Ruptured aneurysms represent a significant healthcare burden, and preventing rupture relies on early detection and treatment. Most patients with UIAs are asymptomatic, and many of the symptoms associated with UIAs are nonspecific, which makes diagnosis challenging. This study explored symptoms associated with UIAs, the rate of resolution of such symptoms after microsurgical treatment, and the likely pathophysiology. Methods: Patients with UIAs who underwent microsurgical treatment from January 1, 2014, to December 31, 2020, at a single quaternary center were retrospectively identified. Analyses included the prevalence of nonspecific symptoms at clinical presentation and postoperative follow-up; comparisons of symptomatology by aneurysm location; and comparisons of patient demographics, aneurysm characteristics, and poor neurologic outcome at postoperative follow-up stratified by symptomatic versus asymptomatic presentation. Results: The analysis included 454 patients; 350 (77%) were symptomatic. The most common presenting symptom among all 454 patients was headache (n = 211 [46%]), followed by vertigo (n = 94 [21%]), cognitive disturbance (n = 68 [15%]), and visual disturbance (n = 64 [14%]). Among 328 patients assessed for postoperative symptoms, 258 (79%) experienced symptom resolution or improvement. Conclusion: This cohort demonstrates that the clinical presentation of patients with UIAs can involve vague and nonspecific symptoms. Early detection is crucial to prevent aneurysmal subarachnoid hemorrhage, and physicians should not rule out aneurysms in the setting of nonspecific neurologic symptoms.

6.
Clin Genitourin Cancer ; 21(5): 517-529, 2023 10.
Article in English | MEDLINE | ID: mdl-37248148

ABSTRACT

BACKGROUND: Prostate cancer (PC) is more likely to develop in men ≥65 years old than in those <65 years old. This study aimed to generate real-world evidence on treatment patterns, clinical outcomes, health care resource utilization (HCRU), and costs among older patients with metastatic castration-resistant PC (mCRPC). MATERIALS AND METHODS: A claims algorithm based on treatments expected for mCRPC was used to identify men ≥65 years old with mCRPC in the SEER-Medicare data between 2007 and 2019. The index date was defined as the start of first-line therapy (1L). Treatment patterns and all-cause and PC-specific HCRU and costs were measured in the 12-month preindex period and the postindex follow-up period. Time to next treatment or death (TNTD) and overall survival (OS) were assessed in the follow-up period. RESULTS: A total of 4758 patients met the eligibility criteria and received 1L treatment. Among these 1L patients, 57.4% subsequently received second-line (2L) treatment; among patients receiving 2L treatment, 49.3% subsequently received third-line (3L) treatment. Abiraterone, enzalutamide, and docetaxel were the most common regimens in 1L (41.9%, 22.0%, and 22.0%, respectively), 2L (33.3%, 32.7%, and 13.6%, respectively), and 3L (17.9%, 25.1%, and 22.3%, respectively). On average, patients had 1.2 inpatient admissions, 1.1 emergency room visits, and 27.6 outpatient visits per year during follow-up. The mean total all-cause and PC-related costs during the follow-up period were $111,060 and $99,540 per patient per year, respectively. Median TNTD was 9.3, 6.5, and 5.7 months for 1L, 2L, and 3L, respectively. Median OS from the start of 1L treatment for mCRPC was 21.5 months. DISCUSSION: Among older patients with mCRPC, high attrition from 1L to subsequent lines of therapy was observed. Median TNTD was <1 year and median OS was <2 years. These results highlight a need for more effective 1L mCRPC therapies to improve clinical outcomes for older patients.
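
Median OS and TNTD in claims cohorts like this are usually read off Kaplan-Meier curves. A minimal, illustrative sketch using the lifelines package; the durations and censoring indicators are invented, not the SEER-Medicare cohort:

```python
# Toy Kaplan-Meier fit for median OS from the start of 1L therapy.
from lifelines import KaplanMeierFitter

months_from_1l = [3.2, 21.5, 14.8, 30.1, 9.4, 25.0, 18.3, 40.2]
death_observed = [1, 1, 1, 0, 1, 1, 0, 1]     # 0 = censored at end of follow-up

kmf = KaplanMeierFitter()
kmf.fit(durations=months_from_1l, event_observed=death_observed, label="1L mCRPC")
print(f"median OS: {kmf.median_survival_time_:.1f} months")
```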


Subjects
Prostatic Neoplasms, Castration-Resistant, Male, Humans, Aged, United States, Prostatic Neoplasms, Castration-Resistant/drug therapy, Retrospective Studies, Medicare, Costs and Cost Analysis, Delivery of Health Care, Treatment Outcome, Health Care Costs
7.
Environ Sci Pollut Res Int ; 30(14): 40654-40669, 2023 Mar.
Article in English | MEDLINE | ID: mdl-36622601

ABSTRACT

Climate change exacerbates uncertainties in water resource management, while water supply and treatment are energy intensive and therefore put pressure on climate change mitigation; the interrelated and sometimes contradictory characteristics of the water-climate change (WC) nexus thus need to be studied. Nexus thinking and coordination of WC would affect many real-world practices and support sustainable socioeconomic development, since traditional single-target policies have sometimes failed. The ability to steer water production and use, as well as climate change mitigation, has therefore become a recent research hotspot. Furthermore, we find that there has been no comprehensive review of the impacts of the WC nexus in different areas on the Sustainable Development Goals (SDGs). Hence, this paper builds a core WC nexus and analyzes its effects on social and environmental aspects in several areas, including sewage treatment, energy transition, waste treatment, land management, and ocean management. The paper discusses how WC interlinkages are used to realize SDGs in those areas. Moreover, uncertainties derived from exogenous hydrology, climate change, and anthropogenic endogenous systems are of growing concern for practical problems. Finally, the implications offer valuable guidelines for the integrated management of water and carbon emissions, as well as for sustainable socioeconomic development in the future.


Subjects
Climate Change, Sustainable Development, Water, China, Social Security
8.
Ophthalmol Sci ; 3(1): 100215, 2023 Mar.
Article in English | MEDLINE | ID: mdl-36275199

ABSTRACT

Objective: To examine the data quality and usability of visual acuity (VA) data extracted from an electronic health record (EHR) system during ophthalmology encounters and provide recommendations for consideration of relevant VA end points in retrospective analyses. Design: Retrospective, EHR data analysis. Participants: All patients with eyecare office encounters at any 1 of the 9 locations of a large academic medical center between August 1, 2013, and December 31, 2015. Methods: Data from 13 of the 21 VA fields (accounting for 93% of VA data) in EHR encounters were extracted, categorized, recoded, and assessed for conformance and plausibility using an internal data dictionary, a 38-item listing of VA line measurements and observations including 28 line measurements (e.g., 20/30, 20/400) and 10 observations (e.g., no light perception). Entries were classified into usable and unusable data. Usable data were further categorized based on conformance to the internal data dictionary: (1) exact match; (2) conditional conformance, letter count (e.g., 20/30+2-3); (3) convertible conformance (e.g., 5/200 to 20/800); (4) plausible but cannot be conformed (e.g., 5/400). Data were deemed unusable when they were not plausible. Main Outcome Measures: Proportions of usable and unusable VA entries at the overall and subspecialty levels. Results: All VA data from 513 036 encounters representing 166 212 patients were included. Of the 1 573 643 VA entries, 1 438 661 (91.4%) contained usable data. There were 1 196 720 (76.0%) exact match (category 1), 185 692 (11.8%) conditional conformance (category 2), 40 270 (2.6%) convertible conformance (category 3), and 15 979 (1.0%) plausible but not conformed entries (category 4). Visual acuity entries during visits with providers from retina (17.5%), glaucoma (14.0%), neuro-ophthalmology (8.9%), and low vision (8.8%) had the highest rates of unusable data. Documented VA entries with providers from comprehensive eyecare (86.7%), oculoplastics (81.5%), and pediatrics/strabismus (78.6%) yielded the highest proportions of exact match with the data dictionary. Conclusions: Electronic health record VA data quality and usability vary across documented VA measures, observations, and eyecare subspecialty. We proposed a checklist of considerations and recommendations for planning, extracting, analyzing, and reporting retrospective study outcomes using EHR VA data. These are important first steps toward standardizing analyses and enabling comparative research.
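
A sketch of how such conformance classification might be implemented: each raw VA string is checked against a data dictionary, with special cases for letter counts and for fractions convertible to a 20-foot equivalent. The dictionary subset, regular expressions, and sample entries are assumptions for illustration, not the study's 38-item instrument:

```python
# Illustrative conformance classifier for raw VA strings.
import re

DICTIONARY = {"20/20", "20/30", "20/40", "20/80", "20/400", "20/800",
              "no light perception", "light perception", "hand motion"}

def classify(entry: str) -> str:
    value = entry.strip().lower()
    if value in DICTIONARY:
        return "1: exact match"
    m = re.fullmatch(r"(\d+)/(\d+)\s*[+-]\d+(-\d+)?", value)          # e.g. 20/30+2-3
    if m and f"{m.group(1)}/{m.group(2)}" in DICTIONARY:
        return "2: conditional conformance (letter count)"
    m = re.fullmatch(r"(\d+)/(\d+)", value)
    if m:
        num, den = int(m.group(1)), int(m.group(2))
        if f"20/{round(20 * den / num)}" in DICTIONARY:               # e.g. 5/200 -> 20/800
            return "3: convertible conformance"
        return "4: plausible but not conformed"
    return "unusable"

for raw in ["20/30", "20/30+2-3", "5/200", "5/400", "seen by tech"]:
    print(f"{raw!r:>14} -> {classify(raw)}")
```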

9.
Front Public Health ; 10: 945805, 2022.
Article in English | MEDLINE | ID: mdl-36052004

ABSTRACT

Background: Metabolic syndrome (MetS) encompasses several clinical presentations that include truncal obesity and insulin resistance at its core. MetS afflicts 23% of the adult US population, increasing their risk of diabetes and cardiovascular disease. Many studies have indicated the importance of a vegetarian diet in improving overall health and, more specifically, MetS components. Unfortunately, these findings have been inconsistent and cannot be extended to examine effects on MetS incidence in the younger adult population. Objective: This study aimed to conduct a retrospective analysis of vegetarian vs. non-vegetarian dietary status in young adults (age 18-24) in relation to MetS components in later adulthood (age 20-30). The study focuses on elucidating any relationship between a vegetarian diet and the MetS components of central obesity, hypertension, and hyperlipidemia. Methods: Waves 3 and 4 data were acquired from AddHealth. One-to-one propensity score matching paired vegetarians with non-vegetarians in a cohort of 535 women and 159 men. Logistic regression assessed the relationship between vegetarian status and MetS components, including truncal obesity (cm), hypertension (normal, pre-HT, HT1, and HT2), and hyperlipidemia (high and low). Results: MetS components from ages 20 to 30 are not associated with vegetarian dietary status. Truncal obesity [N = 694; M = 92.82 cm; OR 0.999; p = 0.893; 95% CI (0.980, 1.017)]; hypertension [N = 694; OR 0.949; p = 0.638; 95% CI (0.764, 1.179)]; hyperlipidemia [N = 694; OR 0.840; p = 0.581; 95% CI (0.453, 1.559)]. Conclusion: The current results were consistent with previous findings suggesting that consumption of a vegetarian diet cannot be directly linked to MetS outcomes. However, further investigation is warranted, as MetS is a risk factor for several chronic diseases.
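
A hedged sketch of that design (propensity-score matching followed by a logistic outcome model) on simulated data; the column names and simulated cohort are assumptions, not AddHealth variables:

```python
# Illustrative 1:1 propensity-score matching followed by logistic regression.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "vegetarian": (rng.random(n) < 0.3).astype(int),
    "age": rng.integers(18, 25, n),
    "female": rng.integers(0, 2, n),
    "hypertension": rng.integers(0, 2, n),
})

# 1) Propensity score: P(vegetarian | covariates).
X = sm.add_constant(df[["age", "female"]])
df["ps"] = sm.Logit(df["vegetarian"], X).fit(disp=0).predict(X)

# 2) Greedy 1:1 nearest-neighbour matching on the propensity score.
treated = df[df["vegetarian"] == 1]
controls = df[df["vegetarian"] == 0].copy()
matched_idx = []
for i, row in treated.iterrows():
    j = (controls["ps"] - row["ps"]).abs().idxmin()
    matched_idx += [i, j]
    controls = controls.drop(j)          # match without replacement
matched = df.loc[matched_idx]

# 3) Outcome model on the matched sample: odds ratio for hypertension.
out = sm.Logit(matched["hypertension"],
               sm.add_constant(matched[["vegetarian"]])).fit(disp=0)
print("OR for vegetarian status:", round(float(np.exp(out.params["vegetarian"])), 3))
```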


Subjects
Hypertension, Metabolic Syndrome, Adolescent, Adult, Diet, Vegetarian, Female, Humans, Hypertension/epidemiology, Male, Metabolic Syndrome/epidemiology, Obesity/epidemiology, Retrospective Studies, Young Adult
10.
Dermatol Ther (Heidelb) ; 12(11): 2547-2562, 2022 Nov.
Article in English | MEDLINE | ID: mdl-36155881

ABSTRACT

INTRODUCTION: The objective of this study was to conduct a retrospective analysis to understand the patient profile, treatment patterns, healthcare resource utilization, and cost of atopic dermatitis (AD) among patients eligible for targeted therapy in Taiwan. METHODS: A retrospective, claims-based analysis was undertaken using Taiwan's National Health Insurance Research Database from 01 January 2014 to 31 December 2017. Patients aged ≥ 2 years with at least one diagnosis code for AD during 2015 were identified. Patients with comorbid autoimmune diseases were excluded. Enrolled AD patients were categorized using claims-based treatment algorithms by disease severity and their eligibility for targeted therapy. A cohort of targeted therapy-eligible patients was formed, and a matched cohort of patients not eligible for targeted therapy was derived using propensity score matching based on age, gender, and the Charlson Comorbidity Index (CCI). Treatment patterns, resource utilization, and costs were measured during a 1-year follow-up period. RESULTS: A total of 377,423 patients with AD were identified for this study. Most patients had mild AD (84.5%; n = 318,830), with 11.9% (n = 45,035) having moderate AD and 3.6% (n = 13,558) having severe AD. Of the 58,593 moderate-to-severe AD patients, 1.5% (n = 897) were included in the targeted therapy-eligible cohort. The matched cohort consisted of 3558 patients. During the 1-year follow-up period, targeted therapy-eligible patients utilized antihistamines (85.5%), topical treatments (80.8%), and systemic anti-inflammatories (91.6%), including systemic corticosteroids (51.4%) and azathioprine (59.1%). During the first year of follow-up, targeted therapy-eligible patients (70.5%; 7.01 [SD = 8.84] visits) had higher resource utilization rates and more frequent AD-related outpatient visits than the matched cohort (40.8%; 1.85 [SD = 4.71] visits). Average all-cause direct costs during the 1-year follow-up were $2850 (SD = 3629) and $1841 (SD = 6434) for the targeted therapy-eligible and matched cohorts, respectively. AD-related costs were 17.7% ($506) of total costs for the targeted therapy-eligible cohort and 2.2% ($41) for the matched cohort. CONCLUSIONS: AD patients eligible for targeted therapy in Taiwan experienced a high resource and economic burden compared with their non-targeted-therapy-eligible counterparts.

11.
Vaccines (Basel) ; 10(7)2022 Jul 14.
Article in English | MEDLINE | ID: mdl-35891285

ABSTRACT

Hepatitis B vaccination protects newborns from contracting the hepatitis B virus, which may lead to chronic infection, liver failure, or death. Trends and racial differences in the administration of the hepatitis B (HepB) birth dose in 2018-2020 were examined in the targeted region. A retrospective analysis of electronic birth dose vaccination data for newborns in 2018-2020 was performed. Birth data from six birthing facilities and home delivery records were obtained from the DC Health Department Vital Statistics Division. These data represented 40,269 newborns and included the mother's race and ethnicity, health insurance type, birthing facility, and administration of the HepB birth dose. Descriptive analysis and multivariable logistic regression analysis were conducted. In addition, subgroup analysis by health insurance type was conducted given a significant interaction between race/ethnicity and health insurance type. A total of 34,509 newborns (85.7%) received the HepB birth dose within 12 h of birth or before discharge from the facility. Birth dose vaccination rates increased over the 3-year period (83.7% in 2018, 85.8% in 2019, 87.7% in 2020, p < 0.01). Multivariable logistic regression analysis revealed racial differences in HepB birth dose vaccination rates. Asian Americans had the highest rate of newborn vaccination consistently over the 3-year period. Conversely, African American infants were less likely to receive the birth dose than non-Hispanic Whites (aOR = 0.77, 95% CI: 0.71-0.83). Our research indicates that further studies are needed to explore HepB birth dose hesitancy among African Americans.

12.
Pak J Med Sci ; 38(4Part-II): 1004-1008, 2022.
Article in English | MEDLINE | ID: mdl-35634629

ABSTRACT

Objectives: To evaluate right ventricular diastolic function in patients with coronary slow flow phenomenon (CSFP) using Doppler tissue imaging (DTI). Methods: CSFP patients diagnosed by coronary angiography from June 2019 to December 2020 at the Third Hospital of Quzhou were retrospectively investigated, with a similar number of patients with normal coronary blood flow during the same period serving as the control group. Right ventricular systolic and diastolic function indices were measured via DTI. Results: No differences were found between the CSFP and control groups in terms of baseline data, RV end-systolic diameter, RV end-diastolic diameter, or RV ejection fraction. The peak velocity E in the early diastolic phase of the right ventricle was decreased in CSFP patients, while the peak velocity a in the late diastolic phase was increased, resulting in a lower E/a ratio. Conclusions: Right ventricular diastolic function is decreased in patients with CSFP, and this can be identified using DTI. DTI is valuable for evaluating right ventricular diastolic function in patients with CSFP.

13.
Front Oncol ; 11: 743765, 2021.
Article in English | MEDLINE | ID: mdl-34858820

ABSTRACT

OBJECTIVE: The combination of ipilimumab plus anti-PD-1 has recently been shown to significantly improve the survival of patients with metastatic melanoma resistant to anti-PD-(L)1 monotherapy. This study assessed the cost-effectiveness of ipilimumab plus anti-PD-1 therapy in this population from the US payer perspective. MATERIALS AND METHODS: A Markov model was created based on a retrospective analysis of patients with metastatic melanoma who were resistant to anti-PD-(L)1. Cost information was obtained from the Centers for Medicare and Medicaid Services and the published literature, and utility values were derived from the published literature. The model outputs were total cost, quality-adjusted life-years (QALYs), and the incremental cost-effectiveness ratio (ICER). Model uncertainty was addressed through sensitivity analysis, and subgroup analyses were also conducted. RESULTS: Ipilimumab plus anti-PD-1 provided an improvement of 1.39 QALYs and 2.48 life-years (LYs), at an ICER of $73,163 per QALY. The hazard ratio (HR) for overall survival (OS) was the variable with the greatest impact on the ICER. Compared with ipilimumab, the probability of ipilimumab plus anti-PD-1 being cost-effective was 94% at a willingness-to-pay (WTP) threshold of $150,000/QALY. Subgroup analysis showed that the ICER in the majority of subgroups was less than $150,000/QALY. CONCLUSIONS: Ipilimumab plus anti-PD-1 is likely to be cost-effective compared with ipilimumab for patients with metastatic melanoma resistant to anti-PD-(L)1 at a WTP threshold of $150,000/QALY.
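
The headline figure is an incremental cost-effectiveness ratio: incremental cost divided by incremental QALYs between the two strategies. A worked sketch follows; only the 1.39 incremental QALYs comes from the abstract, and the per-patient totals are placeholders chosen so the ratio reproduces the reported $73,163/QALY:

```python
# Worked ICER arithmetic with placeholder per-patient totals.
combo_cost, combo_qaly = 285_000.0, 3.00      # ipilimumab + anti-PD-1 (hypothetical totals)
ipi_cost, ipi_qaly = 183_303.0, 1.61          # ipilimumab alone (hypothetical totals)

icer = (combo_cost - ipi_cost) / (combo_qaly - ipi_qaly)   # $ per QALY gained
wtp = 150_000                                              # willingness-to-pay threshold
verdict = "cost-effective" if icer < wtp else "not cost-effective"
print(f"ICER = ${icer:,.0f}/QALY -> {verdict} at ${wtp:,}/QALY")
```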

14.
Front Pediatr ; 9: 763125, 2021.
Article in English | MEDLINE | ID: mdl-34869120

ABSTRACT

Objective: There is a lack of methods for assessing acute appendicitis in young children. The purpose of this study was to develop and internally validate a nomogram for predicting the severity of acute appendicitis in young children (<3 years old). Methods: We developed a prediction model based on a training dataset of 121 patients (<3 years old) with acute appendicitis. Admission information was collected between January 2010 and January 2021, including demographic characteristics, laboratory examinations, treatment, and pathology type. Logistic regression analysis was used to identify independent risk factors and establish the predictive model. The C-index and calibration curves were applied to evaluate the performance of the nomogram, and a corrected C-index was calculated for internal verification using bootstrapping validation. Decision curve analysis determined the clinical applicability of the prediction model. Results: Predictors contained in the prediction nomogram included weight for age, onset time (from symptom onset to hospital presentation), admission temperature, leukocyte count, neutrophil ratio, and total bilirubin. Logistic regression analysis showed that weight for age (X1) < -2.32 SD (P = 0.046), onset time (X2) > 2.5 days (P = 0.044), admission temperature (X3) > 38.5°C (P = 0.009), leukocyte count (X4) > 12.185 × 10⁹/L (P = 0.045), neutrophil ratio (X5) > 68.7% (P = 0.029), and total bilirubin (X6) > 9.05 µmol/L (P = 0.035) were significant predictors of the severity of appendicitis. The logistic regression equation was logit(P) = -0.149X1 + 0.51X2 + 1.734X3 + 0.238X4 + 0.061X5 + 0.098X6 - 75.229. The C-index of the nomogram was 0.8948 (95% CI: 0.8332-0.9567) and remained 0.8867 under bootstrapping validation. Decision curve analysis showed that when the threshold probability ranged from 14% to 88%, there was a net benefit to using this prediction model for the severity of appendicitis in young children. Conclusion: This novel nomogram incorporating weight for age, onset time, admission temperature, leukocyte count, neutrophil ratio, and total bilirubin can be conveniently used to estimate the severity of appendicitis in young children (<3 years old) and determine appropriate treatment options in time.
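
A minimal sketch of how the published equation would be applied to score a single child; the function name and the example predictor values are ours for illustration, not from the paper:

```python
# Applying the published logistic equation to one hypothetical child.
import math

def severe_appendicitis_probability(x1_weight_for_age_sd, x2_onset_days, x3_temp_c,
                                    x4_wbc_1e9_per_l, x5_neutrophil_pct, x6_tbil_umol_l):
    logit = (-0.149 * x1_weight_for_age_sd + 0.51 * x2_onset_days + 1.734 * x3_temp_c
             + 0.238 * x4_wbc_1e9_per_l + 0.061 * x5_neutrophil_pct
             + 0.098 * x6_tbil_umol_l - 75.229)
    return 1.0 / (1.0 + math.exp(-logit))

p = severe_appendicitis_probability(-2.5, 3.0, 38.8, 13.0, 70.0, 9.5)
print(f"predicted probability of severe appendicitis: {p:.2f}")
```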

15.
Curr Med Res Opin ; 37(7): 1189-1197, 2021 07.
Article in English | MEDLINE | ID: mdl-33944646

ABSTRACT

OBJECTIVE: Limited real-world information exists on the characteristics or treatment patterns of patients with peripheral T-cell lymphoma (PTCL). We report demographics, treatments, and direct healthcare resource utilization (HRU) in a large cohort of US patients newly diagnosed with PTCL. METHODS: Patients aged ≥18 years with a PTCL diagnosis between January 2011 and December 2016 were identified from the Inovalon MORE2 Registry. Continuous medical/pharmacy enrollment for 6 months prior to and ≥1 month after the first PTCL diagnosis was required. The main focus of this study was newly diagnosed patients receiving cyclophosphamide, doxorubicin, vincristine, and prednisone (CHOP) versus other chemotherapy. RESULTS: A total of 2971 patients with PTCL and chemotherapy information were included in the study; 1706 (57%) received CHOP and 1265 (43%) other chemotherapy. A majority of patients (51.7%) were female; mean (standard deviation) age at index was 61.0 (±16.0) years, the Charlson score was 4.1 (±2.9), and follow-up time was 24.6 (±16.7) months. During the variable follow-up period, HRU was similar for the CHOP and other chemotherapy cohorts; 58.1% and 59.3% had ≥1 all-cause hospitalizations, respectively. The proportion of patients with ≥1 PTCL-related hospitalizations was higher in the CHOP than in the other chemotherapy cohort (40.3% vs. 9.7%, respectively) and mean length of stay was longer (4.6 vs. 3.7 days per patient per month, respectively). CONCLUSIONS: This retrospective analysis of patients with PTCL revealed high levels of comorbidity and HRU; novel interventions that improve patient outcomes and reduce the HRU burden of PTCL are needed.


Subjects
Lymphoma, T-Cell, Peripheral, Adolescent, Adult, Antineoplastic Combined Chemotherapy Protocols/therapeutic use, Cost of Illness, Delivery of Health Care, Female, Humans, Lymphoma, T-Cell, Peripheral/drug therapy, Lymphoma, T-Cell, Peripheral/epidemiology, Male, Retrospective Studies
16.
J Vet Diagn Invest ; 33(3): 469-478, 2021 May.
Article in English | MEDLINE | ID: mdl-33745389

ABSTRACT

To evaluate the utility of random-effects linear modeling for herd-level evaluation of trace mineral status, we performed a retrospective analysis of the results of trace mineral testing of bovine liver samples submitted to the Michigan State University Veterinary Diagnostic Laboratory between 2011 and 2017. Our aim was to examine random-effects models for their potential to improve interpretation with minimal sample numbers. The database consisted of 1,658 animals distributed among 121 herds. Minerals were assayed by inductively coupled plasma mass spectrometry and included cobalt, copper, iron, molybdenum, manganese, selenium, and zinc. Intraclass correlation coefficients for each mineral were significantly different from zero (p < 0.001) and ranged from 0.38 for manganese to 0.82 for selenium, indicating that the strength of herd effects, which are presumably related to diet, varies greatly by mineral. Analysis of the distribution and standard errors of best linear unbiased predictor (BLUP) values suggested that testing 5-10 animals per herd could place herds within 10 percentile units across the population of herds with 70-95% confidence, the confidence level varying among minerals. Herd means were generally similar to BLUPs, suggesting that means could reasonably be compared to BLUPs with respect to the distributions reported here. However, caution in interpreting means relative to BLUPs should be exercised when animal numbers are small, the standard errors of the means are large, and/or the values are near the extremes of the distribution.
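
A hedged sketch of the underlying calculation: fit a random-intercept (herd) model, compute the intraclass correlation coefficient as between-herd variance over total variance, and extract per-herd BLUPs. The simulated DataFrame, column names, and the log-copper outcome are illustrative assumptions, not the diagnostic laboratory's data:

```python
# Simulated herd/animal data standing in for the liver-mineral results.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n_herds, per_herd = 12, 8
herd = np.repeat([f"herd{i}" for i in range(n_herds)], per_herd)
herd_effect = np.repeat(rng.normal(0.0, 0.4, n_herds), per_herd)   # between-herd variation
df = pd.DataFrame({
    "herd": herd,
    "log_copper": 5.0 + herd_effect + rng.normal(0.0, 0.3, n_herds * per_herd),
})

# Random-intercept model: log_copper ~ 1 + (1 | herd)
fit = smf.mixedlm("log_copper ~ 1", data=df, groups=df["herd"]).fit()

var_between = float(fit.cov_re.iloc[0, 0])      # herd (between) variance
var_within = float(fit.scale)                   # residual (within-herd) variance
icc = var_between / (var_between + var_within)  # intraclass correlation coefficient

blups = fit.random_effects                      # dict: herd -> BLUP of the herd deviation
print(f"ICC = {icc:.2f}")
print(f"BLUP for herd0 = {float(blups['herd0'].iloc[0]):+.3f}")
```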


Subjects
Cattle/metabolism, Liver/chemistry, Minerals/metabolism, Trace Elements/metabolism, Animals, Linear Models, Michigan, Retrospective Studies
17.
Sci Total Environ ; 770: 144677, 2021 May 20.
Article in English | MEDLINE | ID: mdl-33508673

ABSTRACT

The omnipresence of pharmaceuticals at relatively high concentrations (µg/L) in environmental compartments indicates their inadequate removal by wastewater treatment plants. Batch reactors seeded with activated sludge were therefore set up to assess the biotransformation of metformin, ranitidine, lidocaine, and atorvastatin. The main objective was to identify transformation products (TPs) through an integrated workflow for suspect and non-target screening based on reversed-phase liquid chromatography quadrupole time-of-flight mass spectrometry. To support identification, hydrophilic interaction liquid chromatography (HILIC) was used as a complementary tool to enhance the completeness of the workflow by capturing the more polar TPs. Structure assignment/elucidation of the candidate TPs was based mainly on interpretation of MS/MS spectra. Twenty-two TPs were identified, fourteen of them at high identification confidence levels (level 1: confirmed structure by reference standards; level 2: probable structure by library spectrum match and diagnostic evidence). Finally, retrospective analysis of the TPs was performed in influent and effluent wastewater sampled in Athens, Greece, over four consecutive years. The potential toxicological threat of the compounds to the aquatic environment was assessed, and atorvastatin and two of its TPs showed a potential risk to aquatic organisms.


Subjects
Pharmaceutical Preparations, Water Pollutants, Chemical, Biotransformation, Greece, Retrospective Studies, Risk Assessment, Tandem Mass Spectrometry, Wastewater/analysis, Water Pollutants, Chemical/analysis
18.
World J Clin Cases ; 8(21): 5128-5138, 2020 Nov 06.
Article in English | MEDLINE | ID: mdl-33269249

ABSTRACT

BACKGROUND: Many classification systems for thoracolumbar spinal fractures have been proposed to enhance treatment protocols, but none has achieved universal adoption. AIM: To develop a new patient scoring system for cases with a thoracolumbar injury classification and severity score (TLICS) of 4, namely the load-sharing thoracolumbar injury score (LSTLIS). METHODS: Building on TLICS, this study proposes the use of the established load-sharing classification (LSC) to develop an improved scoring system (LSTLIS). To demonstrate the reliability and reproducibility of LSTLIS, a retrospective analysis of patients with thoracolumbar vertebral fractures was conducted. RESULTS: A total of 102 cases were enrolled in the study. The scoring trend of LSTLIS is roughly similar to that of LSC, but the average deviation with the former is smaller than with the latter, so the robustness of LSTLIS scoring is better than that of LSC. LSTLIS can further classify patients with TLICS = 4, allowing this particular circumstance to be assessed more accurately, and the majority of LSTLIS recommendations are consistent with actual clinical decisions. CONCLUSION: LSTLIS is a scoring system that combines LSC and TLICS to compensate for the latter's inadequate coverage of anterior and middle column compression fractures. Following preliminary clinical verification, LSTLIS is feasible and reliable, is practical for comprehensively assessing these clinical circumstances, and provides clinically useful guidance.

19.
Diabetes Metab Syndr Obes ; 13: 4249-4260, 2020.
Article in English | MEDLINE | ID: mdl-33204131

ABSTRACT

BACKGROUND: The cost of care for diabetic foot ulcers has become a global economic burden. This study aimed to analyze changes in diabetic foot ulcer costs over time and to identify factors associated with them, so as to strengthen and improve the management of diabetic foot ulcers. METHODS: We retrospectively analyzed data from the electronic medical record system of our wound treatment center. The system homepage was queried using the national clinical version 2.0 disease diagnosis codes (ICD-10), and patients' basic information was exported. Through statistical analysis of these data, the socioeconomic changes and possible risk factors in the management of diabetic foot ulcers in recent years were obtained. RESULTS: There were 3654 patients included in the study, an average of 522 per year. The total cost per patient increased from ¥15,535.58 in 2014 to ¥42,040.60 in 2020, with an average of ¥21,826.91. The average length of stay ranged from 14.29 to 31.4 days over 2014 to 2020, with an overall average of 18.10 days. In addition, the average incidence of peripheral arterial disease among admitted patients with diabetic foot ulcers was as high as 81.9%, and the average amputation rate was 9.9%. The study showed that the total cost and length of stay of diabetic foot patients increased significantly from 2014 to 2020 and were related to age (>85 years), gender (male), peripheral arterial disease, and amputation (P < 0.05). CONCLUSION: The cost of diabetic foot ulcers and their complications increased significantly each year and was related to older age, co-morbidity, amputation, and duration of hospitalization. The prevention and treatment of diabetic foot ulcers have a long way to go; early comprehensive prevention and multi-disciplinary cooperation may still be an effective approach.

20.
World J Clin Cases ; 8(16): 3483-3492, 2020 Aug 26.
Article in English | MEDLINE | ID: mdl-32913855

ABSTRACT

BACKGROUND: Vaginal delivery is the ideal mode of delivery for the termination of a pregnancy. However, the cesarean section rate in China is much higher than the rate published by the World Health Organization in The Lancet in 2010. AIM: To retrospectively analyze the factors related to a failed trial of labor and the clinical indications for conversion to cesarean section, explore how to increase the trial of labor success rate, and determine the feasibility of reducing the rate of conversion to cesarean section. METHODS: A retrospective analysis was performed on 9240 women who met the conditions for vaginal delivery and underwent a trial of labor from January 2016 to December 2018 at our hospital. Among them, 8164 pregnant women who had a successful trial of labor were used as the control group, and 1076 pregnant women whose trial of labor failed and who were converted to an emergency cesarean section were used as the observation group. The patients' clinical data during hospitalization were collected for comparative analysis, the factors related to a failed trial of labor were examined, and reasonable prevention and resolution strategies were proposed to increase the success rate of trial of labor. RESULTS: The analysis revealed that advanced age (≥ 35 years old), macrosomia (≥ 4000 g), prolonged pregnancy (≥ 41 wk), use of uterine contraction drugs, primiparity, and fever during labor were associated with conversion to an emergency cesarean section after a failed trial of labor. Multivariate regression analysis showed that age, gestational age, primiparity, use of uterine contraction drugs, fever during labor, and newborn weight were associated with a higher probability of conversion to an emergency cesarean section. The following clinical indications were associated with conversion to cesarean section after a failed trial of labor: fetal distress (44.3%), social factors (12.8%), malpresentation (face presentation, persistent occipitoposterior position, and persistent occipitotransverse position) (9.4%), and cephalopelvic disproportion (8.9%). CONCLUSION: Conversion to emergency cesarean section after a failed trial of labor is affected by many factors. Medical staff should take appropriate preventive measures against the main factors, increase the trial of labor success rate, improve the quality of delivery, ensure the safety of mother and child during the perinatal period, and improve the relationship between doctors and patients.
