1.
Osteoporos Int ; 34(5): 925-933, 2023 May.
Article in English | MEDLINE | ID: mdl-36854747

ABSTRACT

PURPOSE: Interest in fractures in patients with multiple sclerosis (MS) and neuromyelitis optica spectrum disorder (NMOSD) has considerably increased in the last decade. However, few studies have compared the incidence of fractures between patients with MS and NMOSD using a nationwide database. This study aimed to evaluate the differences in the risk of fracture between patients with NMOSD and MS compared to that in healthy controls using cohort data from a Korean nationwide database. METHODS: In this retrospective cohort study, data from the National Health Insurance Service (NHIS) database from January 2010 to December 2017 were analyzed. A total of 1,217/1,329 patients with MS/NMOSD free of fractures at the index date were included. Matched controls were selected based on age, sex, and the presence of hypertension, diabetes mellitus, and dyslipidemia. The mean follow-up durations after the index date were 4.40/4.08 years for patients with MS/NMOSD and 4.73/4.28 for their matched controls. RESULTS: The adjusted hazard ratios (aHRs) with 95% confidence intervals of any, hip, and vertebral fractures were 1.81 (1.43-2.28), 3.36 (1.81-6.24), and 2.01 (1.42-2.99) times higher for patients with MS than for controls, respectively, and they were 1.85 (1.47-2.34), 3.82 (2.05-7.11), and 2.84 (1.92-4.21) times higher for patients with NMOSD than for controls, respectively. No significant differences were observed in the incidence of fractures between the MS and NMOSD groups. Patients with MS/NMOSD had a 1.8-fold higher risk of fracture than matched controls, and the risk of hip fracture was especially high (3- to 4-fold higher). CONCLUSIONS: Clinicians need to regularly assess patients with MS/NMOSD for the risk of fractures and take preventative measures to reduce it.


Subject(s)
Bone Fractures, Multiple Sclerosis, Neuromyelitis Optica, Humans, Neuromyelitis Optica/complications, Neuromyelitis Optica/epidemiology, Cohort Studies, Multiple Sclerosis/complications, Multiple Sclerosis/epidemiology, Retrospective Studies, Magnetic Resonance Imaging
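
The adjusted hazard ratios reported above come from Cox proportional hazards models fitted to the matched cohorts. Below is a minimal sketch of that kind of analysis, not the authors' code, using the lifelines library on simulated data; all column names and numbers are illustrative.

```python
# Sketch (not the study's code): adjusted hazard ratio for fracture from a Cox
# proportional hazards model, analogous to the aHRs reported above.
# The data are simulated and the column names are hypothetical.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 2000
ms_group = rng.integers(0, 2, n)                 # 1 = MS, 0 = matched control
age = rng.normal(45, 10, n)
sex_male = rng.integers(0, 2, n)

# Exponential event times with a true hazard ratio of ~1.8 for the MS group.
hazard = 0.02 * np.exp(np.log(1.8) * ms_group + 0.01 * (age - 45))
event_time = rng.exponential(1 / hazard)
censor_time = rng.uniform(1, 8, n)               # administrative censoring
df = pd.DataFrame({
    "followup_years": np.minimum(event_time, censor_time),
    "fracture": (event_time <= censor_time).astype(int),
    "ms_group": ms_group, "age": age, "sex_male": sex_male,
})

cph = CoxPHFitter()
cph.fit(df, duration_col="followup_years", event_col="fracture")
# exp(coef) for ms_group plays the role of the aHR (e.g. 1.81 for any fracture above).
print(cph.summary[["exp(coef)", "exp(coef) lower 95%", "exp(coef) upper 95%"]])
```
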
2.
Article in English | MEDLINE | ID: mdl-35902226

ABSTRACT

BACKGROUND: Neurodegeneration is associated with the pathogenesis of both multiple sclerosis (MS) and neuromyelitis optica spectrum disorder (NMOSD). Parkinson's disease (PD) is a representative neurodegenerative disease; however, whether MS or NMOSD is associated with the risk of PD is not known. METHODS: MS and NMOSD cohorts were collected from the Korean National Health Insurance Service between 1 January 2010 and 31 December 2017, using International Classification of Diseases 10th revision diagnosis codes and information in the Rare Intractable Disease management programme. The PD incidence rate after a 1-year lag period was calculated and compared with that of a control cohort matched for age, sex, hypertension, diabetes and dyslipidaemia in a 1:5 ratio. RESULTS: The incidence rates of PD in patients with MS and NMOSD were 3.38 and 1.27 per 1000 person-years, respectively, and were higher than those of their matched control groups. The adjusted HR of PD was 7.73 (95% CI, 3.87 to 15.47) in patients with MS and 2.61 (95% CI, 1.13 to 6.02) in patients with NMOSD compared with matched controls. In both the MS and NMOSD groups, there were no significant differences in relative risk when stratified by sex, age, diabetes, hypertension and dyslipidaemia. CONCLUSION: The risk of PD was higher in patients with MS and NMOSD than in healthy controls and was particularly high in patients with MS. Further investigations should be performed to determine the pathophysiology and occurrence of PD in patients with MS and NMOSD.
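
Incidence figures such as "3.38 per 1000 person-years" are simple person-time rates. The sketch below shows the calculation on made-up counts, not study data; the crude rate ratio it prints is only an unadjusted analogue of the adjusted HRs reported above.

```python
# Sketch of the person-time incidence calculation behind figures such as
# "3.38 PD cases per 1,000 person-years". All counts below are illustrative.
def incidence_per_1000_py(events: int, person_years: float) -> float:
    return 1000 * events / person_years

ms_rate      = incidence_per_1000_py(events=20, person_years=5917)   # ~3.38 per 1,000 PY
control_rate = incidence_per_1000_py(events=14, person_years=30000)  # 1:5 matched controls
print(f"MS: {ms_rate:.2f}, controls: {control_rate:.2f}, "
      f"crude rate ratio: {ms_rate / control_rate:.2f}")
```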

3.
BMC Med Inform Decis Mak ; 22(1): 210, 2022 08 08.
Article in English | MEDLINE | ID: mdl-35941636

ABSTRACT

BACKGROUND: While various quantitative studies based on the Unified Theory of Acceptance and Use of Technology (UTAUT) and the Technology Acceptance Model (TAM) exist in the general medical sector, only a few have been conducted in the behavioral sector, and they have all been qualitative interview-based studies. OBJECTIVE: The purpose of this study is to assess the adoption dimensions of a behavioral electronic health record (EHR) system for behavioral clinical professionals using a modified clinical adoption (CA) research model that incorporates a variety of micro, meso, and macro level factors. METHODS: A questionnaire survey with a quantitative analysis approach was administered using purposive sampling. We modified the existing CA framework to be suitable for evaluating the adoption of an EHR system by behavioral clinical professionals. We designed and verified questionnaires that fit the dimensions of the CA framework. The survey was performed in five US behavioral hospitals, and the adoption factors were analyzed using structural equation analysis. RESULTS: We derived a total of seven dimensions, omitting those determined to be unsuitable for behavioral clinical specialists to respond to. We polled 409 behavioral clinical experts from five hospitals. Ease of use and organizational support had a substantial impact on the use of the behavioral EHR system. Although the findings were not statistically significant, information and service quality did appear to have an effect on the system's ease of use. The primary reported benefit of behavioral EHR system adoption was the capacity to swiftly locate information, work efficiently, and access patient information via a mobile app, which resulted in more time for better care. The primary downside, on the other hand, was an unhealthy reliance on the EHR system. CONCLUSIONS: This study demonstrated that the CA framework can be a useful tool for evaluating organizational and social elements in addition to the EHR system's technical features. Both the EHR system's ease of use and organizational support should be considered for effective implementation of a behavioral EHR system. TRIAL REGISTRATION: The study was approved by the Institutional Review Board of Seoul National University Bundang Hospital (IRB No.: B-1904-534-301).


Subject(s)
Electronic Health Records, Physicians, Attitude of Health Personnel, Health Personnel, University Hospitals, Humans
4.
J Med Internet Res ; 23(9): e26802, 2021 09 13.
Article in English | MEDLINE | ID: mdl-34515640

ABSTRACT

BACKGROUND: Despite the fact that the adoption rate of electronic health records has increased dramatically among high-income nations, it is still difficult to properly disseminate personal health records. Token economy, through blockchain smart contracts, can better distribute personal health records by providing incentives to patients. However, there have been very few studies regarding the particular factors that should be considered when designing incentive mechanisms in blockchain. OBJECTIVE: The aim of this paper is to provide 2 new mathematical models of token economy in real-world scenarios on health care blockchain platforms. METHODS: First, roles were set for the health care blockchain platform and its token flow. Second, 2 scenarios were introduced: collecting life-log data for an incentive program at a life insurance company to motivate customers to exercise more, and recruiting participants for clinical trials of anticancer drugs. In our 2 scenarios, we assumed that there were 3 stakeholders: participants, data recipients (companies), and data providers (health care organizations). We also assumed that the incentives are initially paid out to participants by data recipients, who are focused on minimizing economic and time costs by adapting mechanism design. This concept can be seen as a part of game theory, since the willingness to pay of data recipients is important in maintaining the blockchain token economy. In both scenarios, the recruiting company can change the expected recruitment time and number of participants. Suppose a company considers the recruitment time to be more important than the number of participants and rewards. In that case, the company can increase the time weight and adjust the cost. When the reward parameter is fixed, the corresponding expected recruitment time can be obtained. Among the reward and time pairs, the pair that minimizes the company's cost was chosen. Finally, the optimized results were compared with the simulations and analyzed accordingly. RESULTS: To minimize the company's costs, reward-time pairs were first collected. It was observed that the expected recruitment time decreased as rewards grew, while the rewards decreased as the time cost grew. Therefore, the cost was represented by a convex curve, which made it possible to obtain a minimum (an optimal point) for both scenarios. Through sensitivity analysis, we observed that, as the time weight increased, the optimized reward increased, while the optimized time decreased. Moreover, as the number of participants increased, the optimized reward and time also increased. CONCLUSIONS: In this study, we were able to model the incentive mechanism of blockchain based on a mechanism design that recruits participants through a health care blockchain platform. This study presents a basic approach to incentive modeling in personal health records, demonstrating how health care organizations and funding companies can motivate one another to join the platform.


Subject(s)
Blockchain, Personal Health Records, Clinical Trials as Topic, Delivery of Health Care, Electronic Health Records, Humans, Reward
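
The cost minimization described above can be illustrated with a toy version of the mechanism: assume the expected recruitment time falls as the per-participant reward rises, and minimize the weighted sum of recruitment time and total rewards. The functional form T(r) = a/r and all constants below are assumptions for illustration, not the paper's model.

```python
# Illustrative sketch of the reward/time trade-off described above.
# T(r) = a / r (expected recruitment time falls as the reward rises) and the
# cost weights are assumptions, not the paper's actual model.
from scipy.optimize import minimize_scalar

a = 120.0            # scale of the time/reward trade-off (assumed)
n_participants = 50
w_time = 10.0        # weight the company places on recruitment time

def expected_time(reward: float) -> float:
    return a / reward

def company_cost(reward: float) -> float:
    # total cost = weighted recruitment time + total rewards paid out
    return w_time * expected_time(reward) + n_participants * reward

res = minimize_scalar(company_cost, bounds=(0.1, 100.0), method="bounded")
print(f"optimal reward per participant: {res.x:.2f}, "
      f"expected recruitment time: {expected_time(res.x):.2f}, "
      f"minimum cost: {res.fun:.2f}")
```

Under this assumed form the optimum is r* = sqrt(w_time * a / n_participants), so the optimal reward grows as the time weight grows, which is consistent with the sensitivity analysis described in the abstract.
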
5.
J Med Internet Res ; 22(9): e19907, 2020 09 09.
Article in English | MEDLINE | ID: mdl-32877350

ABSTRACT

BACKGROUND: The COVID-19 pandemic has caused major disruptions worldwide since March 2020. The experience of the 1918 influenza pandemic demonstrated that decreases in the infection rates of COVID-19 do not guarantee continuity of the trend. OBJECTIVE: The aim of this study was to develop a precise spread model of COVID-19 with time-dependent parameters via deep learning to respond promptly to the dynamic situation of the outbreak and proactively minimize damage. METHODS: In this study, we investigated a mathematical model with time-dependent parameters via deep learning based on forward-inverse problems. We used data from the Korea Centers for Disease Control and Prevention (KCDC) and the Center for Systems Science and Engineering (CSSE) at Johns Hopkins University for Korea and the other countries, respectively. Because the data consist of confirmed, recovered, and deceased cases, we selected the susceptible-infected-recovered (SIR) model and found approximated solutions as well as model parameters. Specifically, we applied fully connected neural networks to the solutions and parameters and designed suitable loss functions. RESULTS: We developed an entirely new SIR model with time-dependent parameters via deep learning methods. Furthermore, we validated the model against the conventional fourth-order Runge-Kutta method to confirm its convergent nature. In addition, we evaluated our model based on the real-world situation reported by the KCDC, the Korean government, and news media. We also cross-validated our model using data from the CSSE for Italy, Sweden, and the United States. CONCLUSIONS: The methodology and new model of this study could be employed for short-term prediction of COVID-19, which could help the government prepare for a new outbreak. In addition, from the perspective of measuring medical resources, our model is particularly powerful because it treats all the parameters as time-dependent, which reflects the exact status of viral spread.


Subject(s)
Betacoronavirus, Coronavirus Infections/epidemiology, Deep Learning, Theoretical Models, Neural Networks (Computer), Pandemics, Viral Pneumonia/epidemiology, COVID-19, Humans, Mass Media, Republic of Korea/epidemiology, SARS-CoV-2, Time Factors
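
The underlying epidemic model is the SIR system with time-dependent parameters. The study approximates the solutions and parameters with fully connected neural networks; the sketch below only integrates the SIR equations with an assumed, declining transmission rate beta(t) to show the model family being fitted.

```python
# SIR system with a time-dependent transmission rate beta(t), the model family
# fitted in the study above. Here beta(t) is an assumed decaying curve and the
# equations are integrated with a standard ODE solver; the paper instead
# approximates S, I, R and the parameters with neural networks.
import numpy as np
from scipy.integrate import solve_ivp

N = 51_000_000                      # population size (roughly South Korea)
gamma = 1 / 14                      # recovery rate (assumed)

def beta(t):
    # assumed transmission rate that decays as interventions take effect
    return 0.4 * np.exp(-0.03 * t) + 0.05

def sir(t, y):
    S, I, R = y
    dS = -beta(t) * S * I / N
    dI = beta(t) * S * I / N - gamma * I
    dR = gamma * I
    return [dS, dI, dR]

sol = solve_ivp(sir, t_span=(0, 180), y0=[N - 100, 100, 0], t_eval=np.arange(0, 181))
print("peak infections:", int(sol.y[1].max()), "on day", int(sol.t[sol.y[1].argmax()]))
```
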
6.
J Med Internet Res ; 22(11): e18582, 2020 11 13.
Article in English | MEDLINE | ID: mdl-33185553

ABSTRACT

BACKGROUND: Although the electronic health record system adoption rate has reached 96% in the United States, implementation and usage of health information exchange (HIE) is still lagging behind. Blockchain has come into the spotlight as a technology to solve this problem. However, there have been no studies assessing the perspectives of different stakeholders regarding blockchain-based patient-centered HIE. OBJECTIVE: The objective of this study was to analyze the awareness among patients, health care professionals, and information technology developers toward blockchain-based HIE, and to compare their perspectives on the platform using a qualitative research methodology. METHODS: In this qualitative study, we applied grounded theory and the Promoting Action on Research Implementation in Health Services (PARiHS) framework. We interviewed 7 patients, 7 physicians, and 7 developers, for a total of 21 interviewees. RESULTS: Regarding the leakage of health information, the patient group did not have concerns, in contrast to the physician and developer groups. Physicians were particularly concerned that errors in the data cannot be easily fixed due to the nature of blockchain technology. Patients were not against the idea of providing information for clinical trials or research institutions. They wished to be provided with the results of clinical research rather than being compensated for providing data. The developers emphasized that blockchain must be technically mature before it can be applied to the health care scene, and that standards for the medical information to be exchanged must first be established. CONCLUSIONS: The three groups' perceptions of blockchain were generally positive about the idea of patients having control over the sharing of their own health information. However, they were skeptical about the cooperation among various institutions and the implementation of data standardization during the establishment process, as well as about how the service would be employed in practice. Taking these factors into consideration during the planning, development, and operation of a platform will contribute to establishing practical treatment plans and more convenient tracking for both patients and physicians. Furthermore, it will help expand the related research and the blockchain-based health management industry.


Subject(s)
Blockchain/standards, Health Information Exchange/standards, Patients/statistics & numerical data, Research Design/trends, Adolescent, Adult, Aged, Delivery of Health Care, Health Personnel, Humans, Middle Aged, Qualitative Research, Young Adult
8.
J Med Internet Res ; 21(2): e11757, 2019 02 15.
Article in English | MEDLINE | ID: mdl-30767907

ABSTRACT

BACKGROUND: Prevention and management of chronic diseases are the main goals of national health maintenance programs. Previously widely used screening tools, such as Health Risk Appraisal, are limited in achieving this goal due to limitations such as their static characteristics and restricted accessibility and generalizability. Hypertension is one of the most important chronic diseases requiring management via the nationwide health maintenance program, and health care providers should inform patients about their risks of a complication caused by hypertension. OBJECTIVE: Our goal was to develop and compare machine learning models predicting high-risk vascular diseases for hypertensive patients so that they can manage their blood pressure based on their risk level. METHODS: We used a 12-year longitudinal dataset of the nationwide sample cohort, which contains the data of 514,866 patients and allows tracking of patients' medical history across all health care providers in Korea (N=51,920). To ensure the generalizability of our models, we conducted an external validation using another national sample cohort dataset, comprising one million different patients, published by the National Health Insurance Service. From each dataset, we obtained the data of 74,535 and 59,738 patients with essential hypertension and developed machine learning models for predicting cardiovascular and cerebrovascular events. Six machine learning models were developed and compared based on validation metrics. RESULTS: Machine learning algorithms enabled us to detect high-risk patients based on their medical history. The long short-term memory (LSTM)-based algorithm performed best in the within test (F1-score=.772; external test F1-score=.613), whereas the random forest-based algorithm showed better generalization than the other machine learning algorithms (within test F1-score=.757; external test F1-score=.705). Concerning the number of features, the LSTM-based algorithm performed best in the within test regardless of the number of features, whereas in the external test the random forest-based algorithm was the best, irrespective of the number of features. CONCLUSIONS: We developed and compared machine learning models predicting high-risk vascular diseases in hypertensive patients so that they may manage their blood pressure based on their risk level. By relying on the prediction model, a government can identify high-risk patients at the nationwide level and establish health care policies in advance.


Subject(s)
Cardiovascular Diseases/diagnosis, Cerebrovascular Disorders/diagnosis, Hypertension/diagnosis, Machine Learning/trends, Algorithms, Chronic Disease, Humans
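
The study above compares models by F1-score on internal and external test sets. Below is a minimal sketch of the random-forest arm of that comparison on synthetic data; the LSTM arm and the NHIS cohorts are not reproduced.

```python
# Sketch of the random-forest arm of the comparison above, evaluated by F1-score.
# Synthetic data stands in for the NHIS cohorts; the LSTM arm is omitted.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import f1_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=5000, n_features=30, weights=[0.8, 0.2], random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

clf = RandomForestClassifier(n_estimators=300, random_state=0)
clf.fit(X_train, y_train)
print("within-test F1:", round(f1_score(y_test, clf.predict(X_test)), 3))
# An external test would repeat f1_score on a cohort drawn from a different source,
# as the study does with the separate one-million-patient national sample cohort.
```
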
9.
Liver Int ; 38(5): 915-923, 2018 05.
Article in English | MEDLINE | ID: mdl-28940824

ABSTRACT

BACKGROUND & AIMS: Evaluation of the controlled attenuation parameter (CAP) is a promising noninvasive method for assessing hepatic steatosis. Despite the increasing reliability of the CAP for assessing steatosis in subjects with chronic liver disease, few studies have evaluated the CAP in asymptomatic subjects without overt liver disease. Therefore, we aimed to evaluate the usefulness of the CAP in a health check-up population. METHODS: We enrolled subjects who underwent abdominal ultrasonography (US), FibroScan (Echosens, France) and blood sampling during medical health check-ups. The CAP was measured using FibroScan, and increased CAP was defined as CAP ≥ 222 dB/m. RESULTS: A total of 1133 subjects were included; 589 subjects (52.0%) had fatty liver based on US, and 604 subjects (53.3%) had increased CAP. Increased CAP was significantly associated with metabolic abnormalities, including higher body mass index (BMI) [odds ratio (OR) = 1.33; 95% confidence interval (CI), 1.24-1.43; P < .001], higher alanine aminotransferase (ALT) (OR = 1.02; 95% CI, 1.01-1.04; P = .003), higher insulin (OR = 1.04; 95% CI, 1.00-1.08; P = .037), higher triglycerides (OR = 1.00; 95% CI, 1.00-1.01; P = .042) and older age (OR = 1.02; 95% CI, 1.00-1.03; P = .05). Furthermore, a comparison of clinical parameters among three groups (normal vs no fatty liver on US but increased CAP vs fatty liver on US) revealed that metabolic parameters, including blood pressure, BMI, waist circumference, aspartate aminotransferase (AST), ALT, triglycerides, fasting glucose, uric acid, insulin, homeostasis model assessment-estimated insulin resistance and liver stiffness measurements, gradually increased across the three groups (all P < .001). CONCLUSIONS: Increased CAP could be an early indicator of fatty liver disease with metabolic abnormalities that manifests even before a sonographic fatty change appears.


Subject(s)
Elasticity Imaging Techniques, Fatty Liver/diagnostic imaging, Fatty Liver/physiopathology, Liver/diagnostic imaging, Adult, Alanine Transaminase/blood, Female, Humans, Insulin/blood, Logistic Models, Male, Middle Aged, Multivariate Analysis, Reproducibility of Results, Retrospective Studies, Triglycerides/blood
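
Odds ratios of this kind come from multivariable logistic regression. The sketch below uses statsmodels on simulated data with hypothetical variable names; it is not the study's analysis.

```python
# Sketch of the multivariable logistic regression behind odds ratios such as
# "OR = 1.33 per unit of BMI" above. The data and variable names are synthetic.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 1133
df = pd.DataFrame({
    "bmi": rng.normal(24, 3, n),
    "alt": rng.normal(25, 10, n),
    "age": rng.normal(50, 12, n),
})
# Simulate the binary outcome "increased CAP" from an assumed logistic model.
logit = 0.29 * (df["bmi"] - 24) + 0.02 * (df["alt"] - 25) - 0.5
df["increased_cap"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = sm.add_constant(df[["bmi", "alt", "age"]])
fit = sm.Logit(df["increased_cap"], X).fit(disp=0)
or_table = np.exp(pd.concat([fit.params, fit.conf_int()], axis=1))
or_table.columns = ["OR", "CI 2.5%", "CI 97.5%"]
print(or_table.round(2))    # exponentiated coefficients are the odds ratios
```
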
10.
J Med Internet Res ; 19(12): e401, 2017 12 07.
Article in English | MEDLINE | ID: mdl-29217503

ABSTRACT

BACKGROUND: Personal health record (PHR)-based health care management systems can improve patient engagement and data-driven medical diagnosis in a clinical setting. OBJECTIVE: The purpose of this study was (1) to demonstrate the development of an electronic health record (EHR)-tethered PHR app named MyHealthKeeper, which can retrieve data from a wearable device and deliver these data to a hospital EHR system, and (2) to study the effectiveness of a PHR data-driven clinical intervention with clinical trial results. METHODS: To improve the conventional EHR-tethered PHR, we ascertained clinicians' unmet needs regarding PHR functionality and the data frequently used in the field through a cocreation workshop. We incorporated the requirements into the system design and architecture of the MyHealthKeeper PHR module. We constructed the app and validated the effectiveness of the PHR module by conducting a 4-week clinical trial. We used a commercially available activity tracker (Misfit) to collect individual physical activity data, and developed the MyHealthKeeper mobile phone app to record participants' patterns of daily food intake and activity logs. We randomly assigned 80 participants to either the PHR-based intervention group (n=51) or the control group (n=29). All of the study participants completed a paper-based survey, a laboratory test, a physical examination, and an opinion interview. During the 4-week study period, we collected health-related mobile data, and study participants visited the outpatient clinic twice and received PHR-based clinical diagnosis and recommendations. RESULTS: A total of 68 participants (44 in the intervention group and 24 in the control group) completed the study. The PHR intervention group showed significantly higher weight loss than the control group (mean 1.4 kg, 95% CI 0.9-1.9; P<.001) at the final week (week 4). In addition, triglyceride levels were significantly lower by the end of the study period (mean 2.59 mmol/L, 95% CI 17.6-75.8; P=.002). CONCLUSIONS: We developed an innovative EHR-tethered PHR system that allowed clinicians and patients to share lifelog data. This study shows the effectiveness of a patient-managed and clinician-guided health tracker system and its potential to improve patient clinical profiles. TRIAL REGISTRATION: ClinicalTrials.gov NCT03200119; https://clinicaltrials.gov/ct2/show/NCT03200119 (Archived by WebCite at http://www.webcitation.org/6v01HaCdd).


Subject(s)
Electronic Health Records/statistics & numerical data, Personal Health Records/psychology, Patient Participation/methods, Telemedicine/methods, Adult, Female, Humans, Male
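
The headline result above is a between-group difference in weight change with a 95% confidence interval. The sketch below shows one way such a comparison can be made, Welch's t-test with a normal-approximation CI, on simulated values; the trial may have used a different analysis.

```python
# Sketch of a between-group comparison behind a result such as "mean weight loss
# difference 1.4 kg (95% CI 0.9-1.9)". Values below are simulated, not trial data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
loss_intervention = rng.normal(1.8, 1.2, 44)   # kg lost over 4 weeks (simulated)
loss_control = rng.normal(0.4, 1.2, 24)

diff = loss_intervention.mean() - loss_control.mean()
t_res = stats.ttest_ind(loss_intervention, loss_control, equal_var=False)  # Welch's t-test
se = np.sqrt(loss_intervention.var(ddof=1) / 44 + loss_control.var(ddof=1) / 24)
ci = (diff - 1.96 * se, diff + 1.96 * se)
print(f"difference {diff:.2f} kg, 95% CI ({ci[0]:.2f}, {ci[1]:.2f}), p = {t_res.pvalue:.4f}")
```
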
11.
J Pers Med ; 14(3)2024 Mar 18.
Article in English | MEDLINE | ID: mdl-38541058

ABSTRACT

As healthcare paradigms shift towards community-based health promotion and personalized preventive healthcare through individual health risk assessments (HRAs), this study investigates the feasibility of accurately predicting adverse health events without relying on costly data acquisition methods such as laboratory tests. We assessed the incremental predictive value of four categories of predictor variables (demographic; lifestyle and family history; personal health device; and laboratory data), ordered by data acquisition cost, in the prediction of the risks of mortality and five chronic diseases. Machine learning methodologies were employed to develop risk prediction models, assess their predictive performance, and determine feature importance. Using data from the National Sample Cohort of the Korean National Health Insurance Service (NHIS), which includes eligibility, medical check-up, healthcare utilization, and mortality data from 2002 to 2019, our study involved 425,148 NHIS members who underwent medical check-ups between 2009 and 2012. Models using demographic, lifestyle, family history, and personal health device data, with or without laboratory data, showed comparable performance. A feature importance analysis of the models excluding laboratory data highlighted modifiable lifestyle factors, which are a superior set of variables for developing health guidelines. Our findings support the practicality of precise HRAs using demographic, lifestyle, family history, and personal health device data. This approach addresses HRA barriers, particularly for healthy individuals, by eliminating the need for costly and inconvenient laboratory data collection, advancing accessible preventive health management strategies.
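
The incremental-value question above reduces to training the same model on nested feature sets and comparing discrimination. A sketch on synthetic data follows; the feature grouping and model choice are illustrative, not the study's pipeline.

```python
# Sketch of the incremental-value comparison described above: train the same model
# on nested feature sets (with vs. without "laboratory" features) and compare AUROC.
# Synthetic data; the feature grouping is illustrative only.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=8000, n_features=20, n_informative=8, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

cheap_cols = list(range(12))   # stand-in for demographic/lifestyle/device features
all_cols = list(range(20))     # cheap features plus "laboratory" features

for name, cols in [("without lab data", cheap_cols), ("with lab data", all_cols)]:
    model = GradientBoostingClassifier(random_state=0).fit(X_tr[:, cols], y_tr)
    auc = roc_auc_score(y_te, model.predict_proba(X_te[:, cols])[:, 1])
    print(f"{name}: AUROC = {auc:.3f}")
```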

12.
Eye (Lond) ; 38(2): 364-371, 2024 02.
Article in English | MEDLINE | ID: mdl-37598260

ABSTRACT

PURPOSE: To evaluate the association between age-related macular degeneration (AMD) with or without visual disability (VD) and the risk of fracture using the National Health Insurance data in South Korea. METHODS: In total, 3,894,702 individuals who had taken part in health-screening programs between January 1, 2009, and December 31, 2009, were included in the cohort and followed until December 31, 2019. Participants with VD, which could be related to the severity of AMD, were defined as those with a loss of vision or visual field defect as certified by the Korean government's Ministry of Health and Welfare. The hazard ratio was calculated for groups (control and AMD with/without VD) using multivariable-adjusted Cox regression analysis. RESULTS: In total, 466,890 participants (11.99%) were diagnosed with fractures during the study period. An increased risk of fracture was observed in individuals with AMD compared with the control group (adjusted hazard ratio (aHR), 1.09; 95% confidence interval (CI), 1.06-1.11). Furthermore, among individuals with AMD, the increased risk of fracture was more prominent in those with VD (aHR 1.17, 95% CI 1.08-1.27) than in those without VD (aHR 1.08, 95% CI 1.06-1.11), compared with the reference group (control). CONCLUSIONS: AMD was associated with an increased risk of fracture even without VD. Fracture prevention should be considered in patients with AMD, especially when accompanied by VD.


Subject(s)
Macular Degeneration, Humans, Cohort Studies, Risk Factors, Macular Degeneration/complications, Macular Degeneration/epidemiology, Macular Degeneration/diagnosis, Republic of Korea/epidemiology, Proportional Hazards Models
13.
Int J Med Inform ; 181: 105300, 2024 Jan.
Article in English | MEDLINE | ID: mdl-37995386

ABSTRACT

BACKGROUND: Antibiotic stewardship programs (ASP) aim to reduce inappropriate use of antibiotics, but their labor-intensive nature impedes their wide adoption. The present study introduces explainable machine learning (ML) models designed to prioritize inpatients who would benefit most from stewardship interventions. METHODS: A cohort of inpatients who received systemic antibiotics and were monitored by a multidisciplinary ASP team at a tertiary hospital in the Republic of Korea was assembled. Data encompassing over 130,000 patient-days and comprising more than 160 features from multiple domains, including prescription records, laboratory and microbiology results, and patient conditions, were collected. Outcome labels were generated using medication administration history: discontinuation, switching from intravenous to oral medication (IV to PO), and early or late de-escalation. The models were trained using Extreme Gradient Boosting (XGB) and Light Gradient Boosting Machine (LGBM), with SHapley Additive exPlanations (SHAP) analysis used to explain the models' predictions. RESULTS: The models demonstrated strong discrimination when evaluated on a hold-out test set (AUROC: IV to PO, 0.81; early de-escalation, 0.78; late de-escalation, 0.72; discontinuation, 0.80). The models identified 41%, 16%, 22%, and 17% more cases requiring discontinuation, IV to PO, and early and late de-escalation, respectively, compared with the conventional length-of-therapy strategy, given that the same number of patients were reviewed by the ASP team. The SHAP results explain how each model makes its predictions, highlighting a unique set of important features that are well aligned with the clinical intuitions of the ASP team. CONCLUSIONS: The models are expected to improve the efficiency of ASP activities by prioritizing cases that would benefit from different types of ASP interventions, along with detailed explanations.


Subject(s)
Antimicrobial Stewardship, Humans, Anti-Bacterial Agents/therapeutic use, Length of Stay, Tertiary Care Centers, Republic of Korea
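
Below is a minimal sketch of the modelling approach named above, gradient-boosted trees explained with SHAP values, on synthetic stand-in data; the ASP features, labels, and tuning are not reproduced.

```python
# Sketch of the pipeline named above: an XGBoost classifier with SHAP values used to
# explain each prediction. Synthetic data replaces the stewardship (ASP) features.
import shap
from sklearn.datasets import make_classification
from xgboost import XGBClassifier

X, y = make_classification(n_samples=3000, n_features=25, n_informative=10, random_state=0)
model = XGBClassifier(n_estimators=200, max_depth=4)
model.fit(X, y)

explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X[:100])   # per-feature contributions for 100 cases
print(shap_values.shape)                        # (100, 25): one contribution per feature
```
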
14.
Diabetes Metab J ; 47(4): 514-522, 2023 07.
Article in English | MEDLINE | ID: mdl-37096375

ABSTRACT

BACKGROUND: Although obesity is a well-known risk factor for type 2 diabetes mellitus (T2DM), there are scant data discriminating the contributions of previous obesity and recent weight gain to developing T2DM. METHODS: We analyzed the Korean National Health Insurance Service-Health Screening Cohort data from 2002 to 2015, in which Korean residents underwent biennial health checkups. Participants were classified into four groups according to their obesity status (body mass index [BMI] ≥25 kg/m2) before and after turning 50 years old: maintaining normal (MN), becoming obese (BO), becoming normal (BN), and maintaining obese (MO). A Cox proportional hazards regression model was used to estimate the risk of T2DM, factoring in the covariates age, sex, BMI, presence of impaired fasting glucose or hypertension, family history of diabetes, and smoking status. RESULTS: A total of 118,438 participants (mean age, 52.5±1.1 years; men, 45.2%) were prospectively evaluated for incident T2DM. A total of 7,339 (6.2%) participants were diagnosed with T2DM during a follow-up period of 4.8±2.6 years. Incidence rates of T2DM per 1,000 person-years were 9.20 in MN, 14.81 in BO, 14.42 in BN, and 21.38 in MO. After factoring in covariates, participants in the BN (adjusted hazard ratio [aHR], 1.15; 95% confidence interval [CI], 1.04 to 1.27) and MO (aHR, 1.14; 95% CI, 1.06 to 1.24) groups were at increased risk of developing T2DM compared with MN, whereas those in the BO group (hazard ratio, 1.06; 95% CI, 0.96 to 1.17) were not. CONCLUSION: Having been obese before 50 years of age increased the risk of developing T2DM in the future, but becoming obese after 50 did not. Therefore, it is important to maintain normal weight from early adulthood to prevent future metabolic perturbations.


Subject(s)
Type 2 Diabetes Mellitus, Male, Middle Aged, Humans, Adult, Type 2 Diabetes Mellitus/diagnosis, Cohort Studies, Obesity/complications, Obesity/epidemiology, Republic of Korea/epidemiology
15.
JMIR Form Res ; 7: e36324, 2023 Oct 30.
Article in English | MEDLINE | ID: mdl-37902820

ABSTRACT

BACKGROUND: The surge in older demographics has inevitably resulted in a heightened demand for health care, and a shortage of nursing staff is impending. Consequently, there is a growing demand for the development of nursing robots to assist patients with urinary and bowel elimination. However, no study has examined nurses' opinions of smart devices that provide integrated nursing for patients' urinary and bowel elimination needs. OBJECTIVE: This study aimed to evaluate the feasibility of the Smart Excretion Care System tethered to electronic medical records in a tertiary hospital and community care setting and discuss the anticipated reductions in the burden of nursing care. METHODS: Focus group interviews were conducted using the Consolidated Criteria for Reporting Qualitative Research (COREQ) guidelines. The interviews were conducted in March 2021 and involved 67 nurses who had worked at Seoul National University Bundang Hospital for more than 1 year and had experience in assisting patients with excretion care. Data were collected using purposive and snowball sampling methods. RESULTS: A total of four themes relevant to the Smart Excretion Care System were found: (1) expected reductions in the burden of nursing care, (2) applicable indications (by departments and diseases), (3) preferred features/functions, and (4) expected benefits of using the Smart Excretion Care System in clinical facilities. Nurses from comprehensive nursing care wards had the highest burden when it came to excretion care. It was a common opinion that the Smart Excretion Care System would be very useful in intensive care units and should be applied first to patients with stroke or dementia. CONCLUSIONS: Excretion care is one of the most burdensome tasks for nurses, increasing their workload. The development of the Smart Excretion Care System as a digital health intervention could help improve nurses' work efficiency, reduce their burden, and extend to caregivers and guardians.

16.
Int J Gen Med ; 16: 4067-4076, 2023.
Article in English | MEDLINE | ID: mdl-37700744

ABSTRACT

Background: Inpatients commonly experience problems with elimination due to incontinence, urinary retention, and complications with indwelling catheters. Although elimination care (EC) is an important nursing area, few studies explore the burden of EC on nurses. Aim: To identify the burden of EC by analyzing nurses' opinions using a sequential explanatory mixed method. Methods: This research was conducted using a sequential explanatory mixed-methods design. A total of 59 nurses at a tertiary hospital in South Korea participated in the study from January 1 to March 31, 2022. For the quantitative analysis, information about the number of work delays due to EC, the time required for serving bedpans or changing diapers, the percentage of EC per shift, and the percentage of patients who need EC was collected through a survey. For the qualitative analysis, focus group interviews were conducted to identify the factors that make EC burdensome. Important themes were derived by analyzing nurses' opinions on EC. Results: For nurses in intensive care units, general wards, and integrated nursing care wards, the number of work delays due to EC was 3.6 ± 1.5, 2.3 ± 1.2, and 4.8 ± 2.4 (p<0.01), respectively. The mean percentage of EC work out of total nursing tasks per shift was 36.2 ± 19.0, 29.3 ± 14.4, and 43.8 ± 14.1 (p=0.02), respectively. The mean percentage of patients requiring EC out of the patients a nurse cares for was 85.4 ± 16.6, 41.3 ± 26.1, and 58.8 ± 21.9 (p<0.01), respectively. From the qualitative analysis, four themes related to nurses' EC burden were derived: physical burden, frequent care needs, delay of other jobs due to EC, and complications. Among them, frequent care needs were found to be the primary factor requiring consideration to reduce nurses' burden. Conclusion: This research found that EC is one of the most burdensome tasks that nurses want to avoid. To alleviate their burden, effective EC protocols or smart medical devices assisting with EC should be developed.
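
The three-group comparisons above report means ± SD with p-values; the abstract does not name the statistical test, so the sketch below shows a one-way ANOVA as one standard choice, on simulated values.

```python
# Sketch of a three-group comparison like the one above (ICU vs. general ward vs.
# integrated nursing care ward). The abstract does not name the test; a one-way
# ANOVA is shown here as one standard option. The values are simulated.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
icu        = rng.normal(3.6, 1.5, 20)   # work delays per shift due to elimination care
ward       = rng.normal(2.3, 1.2, 20)
integrated = rng.normal(4.8, 2.4, 19)

f_stat, p_value = stats.f_oneway(icu, ward, integrated)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
# stats.kruskal(icu, ward, integrated) would be the nonparametric alternative.
```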

17.
Front Neurosci ; 17: 1214652, 2023.
Article in English | MEDLINE | ID: mdl-37397465

ABSTRACT

Introduction: Cognitive impairment is a common feature of multiple sclerosis (MS) and neuromyelitis optica spectrum disorder (NMOSD). However, population-based studies of dementia risk in these disorders are lacking. In the present study, the risk of dementia in patients with MS and NMOSD in the Republic of Korea was estimated. Methods: Data analyzed in this study were obtained from the Korean National Health Insurance Service (KNHIS) database between January 2010 and December 2017. The study included 1,347 MS patients and 1,460 NMOSD patients ≥40 years of age who had not been diagnosed with dementia within 1 year prior to the index date. Matched controls were selected based on age, sex, and the presence of hypertension, diabetes mellitus, or dyslipidemia. Results: In MS and NMOSD patients, the risk of developing any dementia [adjusted hazard ratio (aHR) = 2.34; 95% confidence interval (CI) = 1.84-2.96 and aHR = 2.19; 95% CI = 1.61-3.00, respectively], Alzheimer's disease (AD; aHR = 2.23; 95% CI = 1.70-2.91 and aHR = 1.99; 95% CI = 1.38-2.88, respectively), and vascular dementia (aHR = 3.75; 95% CI = 1.91-7.35 and aHR = 3.21; 95% CI = 1.47-7.02, respectively) was higher than in the matched controls. NMOSD patients had a lower risk of any dementia and AD than MS patients after adjusting for age, sex, income, hypertension, diabetes, and dyslipidemia (aHR = 0.67 and 0.62, respectively). Conclusion: The risk of dementia was increased in MS and NMOSD patients, and the dementia risk was higher in MS than in NMOSD.

18.
JAMA Netw Open ; 6(4): e239955, 2023 04 03.
Article in English | MEDLINE | ID: mdl-37097632

ABSTRACT

Importance: Dexmedetomidine is a widely used sedative in the intensive care unit (ICU) and has unique properties that may be associated with reduced occurrence of new-onset atrial fibrillation (NOAF). Objective: To investigate whether the use of dexmedetomidine is associated with the incidence of NOAF in patients with critical illness. Design, Setting, and Participants: This propensity score-matched cohort study was conducted using the Medical Information Mart for Intensive Care-IV database, which includes records of patients admitted to the ICU at Beth Israel Deaconess Medical Center in Boston dating 2008 through 2019. Included patients were those aged 18 years or older and hospitalized in the ICU. Data were analyzed from March through May 2022. Exposure: Patients were divided into 2 groups according to dexmedetomidine exposure: those who received dexmedetomidine within 48 hours after ICU admission (dexmedetomidine group) and those who never received dexmedetomidine (no dexmedetomidine group). Main Outcomes and Measures: The primary outcome was the occurrence of NOAF within 7 days of ICU admission, as defined by the nurse-recorded rhythm status. Secondary outcomes were ICU length of stay, hospital length of stay, and in-hospital mortality. Results: This study included 22 237 patients before matching (mean [SD] age, 65.9 [16.7] years; 12 350 male patients [55.5%]). After 1:3 propensity score matching, the cohort included 8015 patients (mean [SD] age, 61.0 [17.1] years; 5240 males [65.4%]), among whom 2106 and 5909 patients were in the dexmedetomidine and no dexmedetomidine groups, respectively. Use of dexmedetomidine was associated with a decreased risk of NOAF (371 patients [17.6%] vs 1323 patients [22.4%]; hazard ratio, 0.80; 95% CI, 0.71-0.90). Although patients in the dexmedetomidine group had longer median (IQR) length of stays in the ICU (4.0 [2.7-6.9] days vs 3.5 [2.5-5.9] days; P < .001) and hospital (10.0 [6.6-16.3] days vs 8.8 [5.9-14.0] days; P < .001), dexmedetomidine was associated with decreased risk of in-hospital mortality (132 deaths [6.3%] vs 758 deaths [12.8%]; hazard ratio, 0.43; 95% CI, 0.36-0.52). Conclusions and Relevance: This study found that dexmedetomidine was associated with decreased risk of NOAF in patients with critical illness, suggesting that it may be necessary and warranted to evaluate this association in future clinical trials.


Subject(s)
Atrial Fibrillation, Critical Illness, Humans, Male, Aged, Middle Aged, Cohort Studies, Atrial Fibrillation/drug therapy, Atrial Fibrillation/epidemiology, Hypnotics and Sedatives/adverse effects, Intensive Care Units
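
The cohort above was built by 1:3 propensity score matching. The sketch below shows a simplified version on synthetic data: a logistic model estimates the probability of dexmedetomidine exposure, and each exposed patient is then greedily matched to the three nearest unexposed patients; calipers and covariate-balance diagnostics are omitted.

```python
# Simplified sketch of 1:3 propensity-score matching as described above.
# Synthetic data; no caliper or balance checks are shown.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000
X = rng.normal(size=(n, 5))                       # baseline covariates
p_treat = 1 / (1 + np.exp(-(X[:, 0] - 2.0)))      # exposure depends on covariate 0
treated = rng.binomial(1, p_treat)                # 1 = received dexmedetomidine

# Estimated propensity score: probability of exposure given covariates.
ps = LogisticRegression(max_iter=1000).fit(X, treated).predict_proba(X)[:, 1]

exposed = np.flatnonzero(treated == 1)
pool = list(np.flatnonzero(treated == 0))
matches = {}
for i in exposed:
    pool.sort(key=lambda j: abs(ps[j] - ps[i]))   # nearest remaining controls by PS
    matches[i], pool = pool[:3], pool[3:]
print(f"{len(matches)} exposed patients matched 1:3 to {3 * len(matches)} controls")
```
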
19.
BMJ Open Respir Res ; 10(1)2023 12 28.
Article in English | MEDLINE | ID: mdl-38154913

ABSTRACT

BACKGROUND: Existing models have performed poorly when predicting mortality for patients undergoing venovenous extracorporeal membrane oxygenation (VV-ECMO). This study aimed to develop and validate a machine learning (ML)-based prediction model for 90-day mortality in patients undergoing VV-ECMO. METHODS: This study included 368 patients with acute respiratory failure undergoing VV-ECMO from 16 tertiary hospitals across South Korea between 2012 and 2015. The primary outcome was 90-day mortality after ECMO initiation. The inputs included all available features (n=51) and those from the electronic health record (EHR) systems without preprocessing (n=40). The discriminatory strength of the ML models was evaluated in both internal and external validation sets. The models were compared with conventional models, such as respiratory ECMO survival prediction (RESP) and predicting death for severe acute respiratory distress syndrome on VV-ECMO (PRESERVE). RESULTS: Extreme gradient boosting (XGB) (area under the receiver operating characteristic curve [AUROC] 0.82, 95% CI 0.73 to 0.89) and light gradient boosting (AUROC 0.81, 95% CI 0.71 to 0.88) models achieved the highest performance with the EHR-only and all available feature sets. The developed models had higher AUROCs (95% CI 0.76 to 0.82) than RESP (AUROC 0.66, 95% CI 0.56 to 0.76) and PRESERVE (AUROC 0.71, 95% CI 0.61 to 0.81). Additionally, the XGB model achieved an AUROC of 0.75 for 90-day mortality in external validation, higher than RESP (0.70) and PRESERVE (0.67) in the same validation dataset. CONCLUSIONS: The ML prediction models outperformed previous mortality risk models. These models may be used to identify patients who are unlikely to benefit from VV-ECMO therapy during patient selection.


Subject(s)
Extracorporeal Membrane Oxygenation, Respiratory Distress Syndrome, Humans, Retrospective Studies, Hospital Mortality, Respiratory Distress Syndrome/therapy, Supervised Machine Learning
20.
Int J Nurs Stud ; 147: 104587, 2023 Nov.
Article in English | MEDLINE | ID: mdl-37741258

ABSTRACT

BACKGROUND: Most nursing homes in South Korea lack professional nursing services, resulting in transporting residents to hospitals for mild health problems and nursing treatment needs. While the number of nursing homes has increased, the number of registered nurses working in nursing homes has declined. In 2019, the Korean Ministry of Health and Welfare and the National Health Insurance Service launched the Special Nursing Units in Nursing Homes, a pilot nurse-led model, to resolve the lack of health and nursing services in nursing homes by mandating registered nurses' minimum staffing levels and protecting their scope of practice. OBJECTIVE: This study explored the effects of the Special Nursing Units model in Nursing Homes on healthcare utilization and cost among nursing home residents. DESIGN: A comparative effectiveness research design using propensity score matching. SETTING(S): Eighteen nursing homes were selected based on the region and number of beds. PARTICIPANTS: There were 323 matched-pairs of residents from the case and control groups. METHODS: Nursing homes with more than 30 beds were recruited nationwide, with 18 nursing homes being selected based on the region and number of beds. The case group included 323 older adults receiving professional nursing services by registered nurses under the Special Nursing Units model in Nursing Homes for more than six months consecutively in 18 nursing homes between April and December 2019. We matched control participants using propensity score matching with health insurance and long-term care data. We analyzed the differences in healthcare utilization and cost changes between the case and control groups using generalized estimating equations. RESULTS: The groups were not statistically different in baseline demographic or health-related characteristics. There were 26 (8.1%) and 30 (9.3%) deaths in the case and control groups, respectively, during the six months of the model, which was not statistically different (p = .576). The case group showed statistically significant decreases in healthcare utilization and costs, including hospitalization frequency (p = .008), length of stay (p = .002), and hospitalization costs (p = .003); outpatient visit frequency (p = .003) and costs (p < .001); and home healthcare frequency (p < .001) and cost (p < .001) than the control group. CONCLUSIONS: Professional nursing services by registered nurses under the Special Nursing Units model in Nursing Homes decreased healthcare utilization and costs. A nurse-led model in nursing homes, which includes mandating the minimum staffing levels of registered nurses and protecting their scope of practice, promises to improve resident health outcomes.


Subject(s)
Nurses, Nursing Homes, Humans, Aged, Case-Control Studies, Propensity Score, Delivery of Health Care, Patient Acceptance of Health Care
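
Changes in utilization and cost between the matched groups above were analyzed with generalized estimating equations. The sketch below fits a GEE clustered on the matched pair to simulated cost data; the variable names and the group-by-period formula are assumptions, not the study's exact specification.

```python
# Sketch of a generalized estimating equation (GEE) analysis like the one above:
# healthcare cost modelled by group and period, with observations clustered on the
# matched pair. The data, variable names, and model formula are assumed.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_pairs = 323
rows = []
for pair in range(n_pairs):
    for group in (0, 1):                      # 1 = Special Nursing Unit resident
        for period in (0, 1):                 # 0 = before, 1 = during the pilot
            cost = 100 + 10 * period - 25 * group * period + rng.normal(0, 15)
            rows.append({"pair": pair, "group": group, "period": period, "cost": cost})
df = pd.DataFrame(rows)

model = smf.gee("cost ~ group * period", groups="pair", data=df,
                cov_struct=sm.cov_struct.Exchangeable(),
                family=sm.families.Gaussian())
# The group:period coefficient is the difference-in-differences estimate of the effect.
print(model.fit().summary().tables[1])
```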