Results 1 - 20 of 163
1.
J Am Geriatr Soc; 2024 Aug 10.
Article in English | MEDLINE | ID: mdl-39126234

ABSTRACT

BACKGROUND: Older adults with severe aortic stenosis (AS) may receive care in a nursing home (NH) prior to undergoing transcatheter aortic valve replacement (TAVR). NH level of care can be used to stabilize medical conditions, to provide rehabilitation services, or for long-term care services. Our primary objective was to determine whether NH utilization pre-TAVR can be used to stratify patients at risk for higher mortality and poor disposition outcomes at 30 and 365 days post-TAVR. METHODS: We conducted a retrospective cohort study among Medicare beneficiaries who spent ≥1 day in an NH in the 6 months before TAVR (2011-2019). The intensity of NH utilization was categorized as low users (1-30 days), medium users (31-89 days), long-stay NH residents (≥100 days, with no more than a 10-day gap in care), and high post-acute rehabilitation patients (≥90 days, with more than a 10-day gap in care). The probabilities of death and disposition were estimated using multinomial logistic regression, adjusting for age, sex, and race. RESULTS: Among 15,581 patients, 9908 (63.6%) were low users, 4312 (27.7%) were medium users, 663 (4.3%) were high post-acute rehabilitation users, and 698 (4.4%) were long-stay NH residents before TAVR. High post-acute rehabilitation patients were more likely to have dementia, weight loss, falls, and extensive dependence in activities of daily living (ADLs) compared with low NH users. Mortality was greatest among high post-acute rehabilitation users: 5.5% at 30 days and 36.4% at 365 days. In contrast, low NH users had mortality rates similar to long-stay NH residents: 4.8% versus 4.8% at 30 days, and 24.9% versus 27.0% at 365 days. CONCLUSION: Frequent bouts of post-acute rehabilitation before TAVR were associated with adverse outcomes, and this metric may help determine which patients with severe AS could benefit from palliative and geriatric services.
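The adjusted multinomial model described in the methods can be sketched as follows; this is a minimal illustration only, and the file and column names (tavr_cohort.csv, nh_group, age, sex, race, outcome_365d) are hypothetical rather than taken from the study's Medicare analytic files.

```python
# Minimal sketch of a multinomial logistic model for post-TAVR outcomes,
# adjusted for age, sex, and race as in the abstract. All names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("tavr_cohort.csv")  # hypothetical analytic file

# Encode the outcome (e.g., home / facility / dead at 365 days) as integer codes.
df["outcome_code"] = df["outcome_365d"].astype("category").cat.codes

model = smf.mnlogit(
    "outcome_code ~ C(nh_group, Treatment(reference='low')) + age + C(sex) + C(race)",
    data=df,
).fit()
print(model.summary())

# Average predicted probability of each outcome category by NH utilization group.
print(model.predict(df).groupby(df["nh_group"]).mean())
```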

2.
BMC Pregnancy Childbirth; 24(1): 460, 2024 Jul 03.
Article in English | MEDLINE | ID: mdl-38961444

ABSTRACT

BACKGROUND AND AIMS: Although minimally invasive hysterectomy offers advantages, abdominal hysterectomy remains the predominant surgical method. Creating a standardized dataset and establishing a hysterectomy registry system present opportunities for early interventions to reduce volume and to guide the selection of benign hysterectomy methods. This research aims to develop a dataset for designing a benign hysterectomy registration system. METHODS: Between April and September 2020, a qualitative study was carried out to create a data set for enrolling patients who were candidates for hysterectomy. At this stage, the research team conducted an information needs assessment, identified the relevant data elements, developed the registry software, and performed field testing; subsequently, a web-based application was designed. In June 2023, the registry software was evaluated using data extracted from the medical records of patients admitted to Al-Zahra Hospital in Tabriz, Iran. RESULTS: Over two months, 40 patients undergoing benign hysterectomy were successfully registered. The final dataset for the hysterectomy patient registry comprises 11 main groups, 27 subclasses, and a total of 91 data elements. Mandatory data and essential reports were defined. Furthermore, a web-based registry system was designed and evaluated based on the data set and various scenarios. CONCLUSION: Creating a hysterectomy registration system is the initial stride toward identifying and registering hysterectomy candidates. This system captures information about procedure techniques and associated complications. In Iran, this registry can serve as a valuable resource for assessing the quality of care delivered and the distribution of clinical measures.


Subjects
Hospitals, Teaching; Hysterectomy; Registries; Humans; Female; Iran; Hysterectomy/methods; Hysterectomy/statistics & numerical data; Adult; Middle Aged; Referral and Consultation/statistics & numerical data; Qualitative Research; Datasets as Topic
3.
Article in English | MEDLINE | ID: mdl-38928942

ABSTRACT

BACKGROUND: Standardized health-data collection enables effective disaster responses and patient care. Emergency medical teams (EMTs) use the Japan Surveillance in Post-Extreme Emergencies and Disasters (J-SPEED) reporting template to collect patient data and submit data on treated patients to an EMT coordination cell. The World Health Organization's (WHO) EMT minimum dataset (MDS) offers an international standard for disaster data collection. GOAL: The goal of this study was to analyze the age and gender distribution of medical consultations provided by EMTs during disasters. METHODS: Data collected from 2016 to 2020 using the J-SPEED/MDS tools during six disasters in Japan and Mozambique were analyzed. Linear regression with data smoothing via the moving average method was employed to identify trends in medical consultations based on age and gender. RESULTS: A total of 31,056 consultations were recorded: 13,958 in Japan and 17,098 in Mozambique. Women accounted for 56.3% and 55.7% of examinees in Japan and Mozambique, respectively. Children accounted for 6.8% of consultations in Japan and 28.1% in Mozambique. Older adults accounted for 1.32 and 1.52 times as many consultations as adults in Japan and Mozambique, respectively. CONCLUSIONS: The study findings highlight the importance of considering age-specific healthcare requirements in disaster planning. Real-time data collection tools such as J-SPEED and the MDS, which generate both daily reports and raw data for in-depth analysis, facilitate the validation of equitable access to healthcare services, emphasize the specific needs of vulnerable groups, and enable the consideration of cultural preferences to improve healthcare provision by EMTs.
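As a rough illustration of the trend analysis mentioned above (moving-average smoothing followed by a linear fit), the sketch below uses hypothetical file and column names (jspeed_consultations.csv, date, age_group, sex); the actual J-SPEED export format is not described in the abstract.

```python
# Sketch: smooth daily consultation counts with a moving average, then fit a
# linear trend per age group and gender. Column and file names are hypothetical.
import numpy as np
import pandas as pd

consults = pd.read_csv("jspeed_consultations.csv", parse_dates=["date"])

for (age_group, sex), grp in consults.groupby(["age_group", "sex"]):
    daily = grp.groupby("date").size().asfreq("D", fill_value=0)   # daily counts
    smoothed = daily.rolling(window=7, min_periods=1).mean()       # moving average
    x = np.arange(len(smoothed))
    slope, _ = np.polyfit(x, smoothed.to_numpy(), deg=1)           # linear trend
    print(f"{age_group}/{sex}: {slope:+.2f} consultations per day")
```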


Subjects
Disasters; Humans; Female; Japan; Mozambique; Male; Aged; Middle Aged; Adult; Adolescent; Young Adult; Child; Child, Preschool; Infant; Emergency Medical Services/statistics & numerical data; Aged, 80 and over; Age Factors; Infant, Newborn; Sex Factors
4.
Front Physiol; 15: 1399374, 2024.
Article in English | MEDLINE | ID: mdl-38872836

ABSTRACT

Background: Infections and seizures are among the most common complications in stroke survivors. Infections are the most common risk factor for seizures, and stroke survivors who experience an infection are at greater risk of experiencing seizures. A predictive model that identifies which stroke survivors are at the greatest risk of a seizure after an infection can help providers focus on seizure prevention in higher-risk residents who experience an infection. Methods: A predictive model was generated from a retrospective study of the Long-Term Care Minimum Data Set (MDS) 3.0 (2014-2018, n = 262,301). Techniques included three data balancing methods (SMOTE for oversampling, ENN for undersampling, and SMOTEENN for combined over- and undersampling) and three feature selection methods (LASSO, Recursive Feature Elimination, and Principal Component Analysis). One balancing and one feature selection technique were applied, and the resulting dataset was then used to train four machine learning models (Logistic Regression, Random Forest, XGBoost, and Neural Network). Model performance was evaluated with AUC and accuracy, and interpretation used SHapley Additive exPlanations (SHAP). Results: The data balancing methods improved the prediction performance of the machine learning models, but feature selection did not remove any features and did not affect performance. Because all models achieved high accuracy (76.5%-99.9%), all four models were interpreted to obtain the most holistic view. SHAP values indicated that therapy (speech, physical, occupational, and respiratory), independence (activities of daily living for walking, mobility, eating, dressing, and toilet use), and mood (severity score, anti-anxiety medications, antidepressants, and antipsychotics) features contributed the most. That is, stroke survivors who received fewer therapy hours, were less independent, or had a worse overall mood were at greater risk of having a seizure after an infection. Conclusion: A tool that predicts seizures following an infection in stroke survivors can be interpreted by providers to guide treatment and prevent long-term complications. This promotes individualized treatment plans that can increase the quality of resident care.
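One branch of the pipeline described above (SMOTEENN balancing, a random forest, and SHAP interpretation) could look like the sketch below; the file name, column names, and hyperparameters are hypothetical, and the study's exact preprocessing is not reproduced.

```python
# Sketch of one pipeline branch: SMOTEENN balancing -> random forest -> SHAP.
# File, column, and parameter choices are hypothetical illustrations only.
import numpy as np
import pandas as pd
import shap
from imblearn.combine import SMOTEENN
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score, roc_auc_score
from sklearn.model_selection import train_test_split

data = pd.read_csv("mds_stroke_infection.csv")          # hypothetical MDS extract
X, y = data.drop(columns=["seizure"]), data["seizure"]
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

# Combined over- and undersampling, applied to the training split only.
X_bal, y_bal = SMOTEENN(random_state=0).fit_resample(X_tr, y_tr)

clf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X_bal, y_bal)
proba = clf.predict_proba(X_te)[:, 1]
print("AUC:", roc_auc_score(y_te, proba), "accuracy:", accuracy_score(y_te, proba > 0.5))

# SHAP attributions for the positive (seizure) class; shap may return a list or
# a 3-D array depending on the installed version.
sv = shap.TreeExplainer(clf).shap_values(X_te)
sv_pos = sv[1] if isinstance(sv, list) else sv[..., 1]
top = pd.Series(np.abs(sv_pos).mean(axis=0), index=X.columns).sort_values(ascending=False)
print(top.head(10))   # top contributing MDS features
```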

5.
BMC Health Serv Res; 24(1): 675, 2024 May 28.
Article in English | MEDLINE | ID: mdl-38807135

ABSTRACT

BACKGROUND: Disease registries are comprehensive databases that record detailed information on patients diagnosed with specific conditions, providing valuable insights into their diagnosis, treatment, and outcomes. This study aims to describe the pilot phase of the national pediatric Immune Thrombocytopenia (ITP) registry (NPITP) in Iran, serving as the inaugural interpretive report. METHODS: This patient-centered software system was implemented as a national program across multiple pediatric centers in Iran. Several focus groups were conducted to establish a minimum data set (MDS) comprising six main classes, 14 sub-classes, and 187 data elements. Following expert consensus on the final data set, a web-based software tool was developed by the dedicated IT team, accessible online and offline via https://disreg.sbmu.ac.ir/q/ITP.html. The registry included children aged between two months and 18 years with a platelet count below 100 × 10⁹/L, based on predefined inclusion criteria. RESULTS: Within a four-month period, a total of 60 ITP patients were registered, including 41 (68.3%) newly diagnosed cases, 68 (13.6%) persistent cases, and 14 (23.3%) with chronic ITP. The mean age of the registered patients was 55.93 ± 9.72 months. The most frequently observed bleeding symptoms were petechiae (68.3%), purpura (51.6%), and ecchymosis (13.3%). Among the newly diagnosed patients, 20 (33.3%) received intravenous immunoglobulin (IVIG), 17 (28.3%) were treated with prednisolone, and 17 (28.3%) received combined IVIG and steroid therapy. Of all patients, 40 (66.7%) demonstrated a complete response to treatment, while 16 (26.7%) exhibited a partial response. Four patients (6.7%) remained unresponsive to therapy. Treatment-related complications, such as Cushing's syndrome, edema, weight gain, hirsutism, and mood disorders, were reported in 10 patients (16.6%). However, the majority of patients (81.7%) did not experience therapy-related complications. CONCLUSION: The pilot phase of the NPITP registry successfully implemented a web-based software tool for data collection, aiming to enhance the quality of care, facilitate clinical research, and support health service planning in the future.


Subjects
Purpura, Thrombocytopenic, Idiopathic; Registries; Humans; Child; Iran/epidemiology; Purpura, Thrombocytopenic, Idiopathic/therapy; Purpura, Thrombocytopenic, Idiopathic/drug therapy; Child, Preschool; Adolescent; Male; Female; Infant; Pilot Projects
6.
Environ Monit Assess; 196(6): 567, 2024 May 22.
Article in English | MEDLINE | ID: mdl-38775991

ABSTRACT

The study evaluated agricultural soil quality using the Soil Quality Index (SQI) model in two Community Development Blocks, Ausgram-II and Memari-II, of Purba Bardhaman District. A total of 104 soil samples were collected (0-20 cm depth) from each Block to analyse 13 parameters: bulk density (BD), soil porosity, soil aggregate stability (SAS), water holding capacity (WHC), infiltration rate, available nitrogen (N), available phosphorus (P), available potassium (K), soil pH, soil organic carbon, electrical conductivity, soil respiration (SR), and microbial biomass carbon. The Integrated Quality Index (IQI) was applied using the weighted additive approach and a non-linear scoring technique to retain the Minimum Data Set (MDS). Principal Component Analysis (PCA) identified SAS, BD, available K, pH, available N, and available P as the key parameters contributing to the SQI in Ausgram-II, whereas WHC, SR, available N, pH, and SAS contributed the most to the SQI in Memari-II. Results revealed that Ausgram-II (0.97) had a notably higher SQI than Memari-II (0.69). In Ausgram-II, 99.72% of agricultural land showed a very high SQI (Grade I), whereas in Memari-II, 49.95% of land exhibited a moderate SQI (Grade III) and 49.90% a high SQI (Grade II). The Sustainable Yield Index (SYI), Sensitivity Index (SI), and Efficiency Ratio (ER) were used to validate the SQIs. A positive correlation was observed between SQI and paddy yield (R² = 0.82 and 0.72) and potato yield (R² = 0.71 and 0.78) in the Ausgram-II and Memari-II Blocks, respectively. This study could help evaluate agricultural soil quality and provide insights for decision-making in fertiliser management practices to promote agricultural sustainability.
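A minimal sketch of a weighted additive (IQI-style) index with non-linear sigmoid scoring is shown below; the indicator names, weights, and curve parameters are hypothetical placeholders, not the PCA-derived values of the study, and every indicator is scored with a "more is better" curve purely for illustration.

```python
# Sketch: non-linear ("more is better") scoring of MDS indicators followed by a
# weighted additive soil quality index. All names and weights are hypothetical.
import pandas as pd

def nonlinear_score(x, x_mean, slope=-2.5):
    """Sigmoid score in (0, 1): values well above the mean approach 1."""
    return 1.0 / (1.0 + (x / x_mean) ** slope)

soils = pd.read_csv("block_soil_samples.csv")       # hypothetical sample data
weights = pd.Series({                                # placeholder PCA-based weights
    "aggregate_stability": 0.22, "bulk_density": 0.18, "avail_K": 0.17,
    "pH": 0.15, "avail_N": 0.15, "avail_P": 0.13,
})

scores = soils[weights.index].apply(lambda col: nonlinear_score(col, col.mean()))
soils["SQI"] = (scores * weights).sum(axis=1)        # weighted additive index
print(soils["SQI"].describe())
```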


Subjects
Agriculture; Environmental Monitoring; Oryza; Soil; India; Soil/chemistry; Environmental Monitoring/methods; Oryza/growth & development; Nitrogen/analysis; Soil Pollutants/analysis; Phosphorus/analysis
7.
Cureus; 16(4): e58332, 2024 Apr.
Article in English | MEDLINE | ID: mdl-38752033

ABSTRACT

INTRODUCTION: Nonoperative care represents a cornerstone of adolescent idiopathic scoliosis (AIS) management, although no consensus exists on a minimal data set. We aimed to determine a consensus on the critical data points to obtain during clinical AIS visits. METHODS: A REDCap-based survey was distributed to the Pediatric Orthopaedic Society of North America (POSNA), the Pediatric Spine Study Group (PSSG), and the International Society on Scoliosis Orthopaedic and Rehabilitation Treatment (SOSORT). Respondents ranked the importance of data points in history, physical examination, and bracing during AIS visits. RESULTS: One hundred eighty-one responses were received (26% response rate), of which 86% were from physicians and 14% from allied health professionals. About 80% of respondents worked at pediatric hospitals or pediatric spaces within adult hospitals, and 82% were academic, with the majority (57%) seeing 150+ unique AIS patients annually. Most respondents recommended six-month follow-up for patients under observation (60%) and bracing (54%). Most respondents considered family history (75%) and pain (69%) important, with the majority (69%) asking about pain at every visit. Across all time points, Adam's forward bend test, shoulder level, sagittal contour, trunk shift, and curve stiffness were all considered critically important (>60%). At the first visit, scapular prominence, leg lengths, motor and neurological examination, gait, and iliac crest height were also viewed as critical. At the preoperative visit, motor strength and scapular prominence should also be documented. About 39% of respondents use heat sensors to monitor bracing compliance, and average brace wear since the prior visit was considered the most important (85%) compliance data point. CONCLUSIONS: This study establishes recommendations for a 19-item minimum data set for clinical AIS evaluation, including history, physical exam, and bracing, to allow for future multicenter registry-based studies.

8.
Heliyon; 10(9): e30054, 2024 May 15.
Article in English | MEDLINE | ID: mdl-38707457

ABSTRACT

Background: To reduce the risk of errors, patient safety monitoring in the medical imaging department is crucial. Interventions are required, and these can be provided as a framework for documenting, reporting, evaluating, and recognizing events that pose a threat to patient safety. The aim of this study was to develop a minimum data set and dashboard for monitoring adverse events in radiology departments. Material and methods: This developmental research was conducted in multiple phases, including content determination using the Delphi technique; database design using SQL Server; user interface (UI) development using PHP; and dashboard evaluation in three aspects: accuracy of calculations, UI requirements, and usability. Results: This study identified 26 patient safety (PS) performance metrics and 110 significant PS-related data components organized into 14 major groupings as the system contents. The UI was built with three tabs: pre-procedure, intra-procedure, and post-procedure. The evaluation results demonstrated the technical feasibility of the dashboard. Finally, the dashboard's usability was highly rated (76.3 out of 100). Conclusion: The dashboard can be used to supplement datasets to obtain a more accurate picture of the PS condition and to draw attention to characteristics that professionals might otherwise overlook or undervalue.

9.
BMC Emerg Med; 24(1): 94, 2024 May 31.
Article in English | MEDLINE | ID: mdl-38816720

ABSTRACT

BACKGROUND: Rainfall-induced floods represented 70% of the disasters in Japan from 1985 to 2018 and caused various health problems. To improve preparedness and preventive measures, more information is needed on the health problems caused by heavy rain. However, it has proven challenging to collect health data surrounding disasters due to various inhibiting factors such as environmental hazards and logistical constraints. In response to the Kumamoto Heavy Rain 2020, Emergency Medical Teams (EMTs) used J-SPEED (Japan-Surveillance in Post Extreme Emergencies and Disasters) as a daily reporting tool, collecting patient data and sending it to an EMTCC (EMT Coordination Cell) during the response. We performed a descriptive epidemiological analysis using J-SPEED data to better understand the health problems arising from the Kumamoto Heavy Rain 2020 in Japan. METHODS: During the Kumamoto Heavy Rain 2020, from July 5 to July 31, 2020, 79 EMTs used the J-SPEED form to submit daily reports to the EMTCC on the number and types of health problems they treated. We analyzed the 207 daily reports, categorizing the data by age, gender, and time period. RESULTS: Among the 816 reported consultations, women accounted for 51% and men for 49%. The majority of patients were elderly (62.1%), followed by adults (32.8%) and children (5%). The most common health issues included treatment interruption (12.4%), hypertension (12.0%), wounds (10.8%), minor trauma (9.6%), and disaster-related stress symptoms (7.4%). Consultations followed six phases during the disaster response, with the highest occurrence during the hyperacute and acute phases. Directly disaster-related events comprised 13.9% of consultations, indirectly related events 52.0%, and unrelated events 34.0%. As the response phases progressed, the proportions of directly and indirectly related events decreased while that of unrelated events increased. CONCLUSION: By harnessing data captured with J-SPEED, this research demonstrates the feasibility of collecting, quantifying, and analyzing data using a uniform format. Comparison of the present findings with those of two previous analyses of J-SPEED data from other disaster scenarios that varied in time, location, and/or disaster type showcases the potential of analyzing past experiences to advance knowledge on disaster medicine and disaster public health.


Subjects
Rain; Humans; Female; Male; Japan; Adult; Middle Aged; Aged; Child; Adolescent; Child, Preschool; Infant; Young Adult; Disasters; Aged, 80 and over; Emergency Medical Services/statistics & numerical data; Floods; Disaster Planning; Health Services Needs and Demand; Infant, Newborn
10.
Confl Health; 18(1): 28, 2024 Apr 08.
Article in English | MEDLINE | ID: mdl-38589881

ABSTRACT

BACKGROUND: The Red Cross and Red Crescent Movement (RCRC) utilizes specialized Emergency Response Units (ERUs) for international disaster response. However, data collection and reporting within ERUs have been time-consuming and paper-based. The Red Cross Red Crescent Health Information System (RCHIS) was developed to improve clinical documentation and reporting, ensuring accuracy and ease of use while increasing compliance with reporting standards. CASE PRESENTATION: RCHIS is an Electronic Medical Record (EMR) and Health Information System (HIS) designed for RCRC ERUs. It can be accessed on Android tablets or Windows laptops, both online and offline. The system securely stores data on the Microsoft Azure cloud, with synchronization facilitated through a local ERU server. The functional architecture covers all clinical functions of ERU clinics and hospitals, incorporating user-friendly features. A pilot study was conducted with the Portuguese Red Cross (PRC) during a large-scale event. Thirteen super users were trained and subsequently trained the staff. During the four-day pilot, 77 user accounts were created and 243 patient files were documented. Feedback indicated that RCHIS was easy to use, required minimal training time, and that the training was sufficient for full utilization. Real-time reporting facilitated coordination with the civil defense authority. CONCLUSIONS: The development and pilot use of RCHIS demonstrated its feasibility and efficacy within RCRC ERUs. The system addressed the need for an EMR and HIS solution, enabling comprehensive clinical documentation and supporting administrative reporting functions. The pilot study validated the training-of-trainers approach and paved the way for further domestic use of RCHIS. RCHIS has the potential to improve patient safety, quality of care, and reporting efficiency within ERUs. Automated reporting reduces the burden on ERU leadership, while electronic compilation enhances record completeness and correctness. Ongoing feedback collection and feature development continue to enhance RCHIS's functionality. Further training took place in 2023, and preparations for international deployments are under way. RCHIS represents a significant step toward improved emergency medical care and coordination within the RCRC and has implications for similar systems in other Emergency Medical Teams.

11.
Sci Rep; 14(1): 8491, 2024 Apr 11.
Article in English | MEDLINE | ID: mdl-38605150

ABSTRACT

The primary objective of this study was to develop soil quality indexes (SQIs) to reveal changes in soil quality (SQ) during vegetation restoration in the reclaimed waste dumps of the Hequ open-pit coal mine. The study built an SQI evaluation model for waste dumps based on the soil management assessment framework. The total data set (TDS) consisted of nine physicochemical property indicators, and the minimum data set (MDS) was selected using principal component analysis (PCA) and Norm values. SQ was comprehensively evaluated across the nine indicators using a non-linear membership function and the improved Nemerow index. The findings indicated a notable disparity in SQ between the reclaimed and unreclaimed areas, yet overall SQ remained low. In the TDS indicator system, organic matter had the highest weight and the greatest contribution to the soil quality of the waste dumps. In the MDS indicator system, organic matter and total nitrogen each had a weight of 0.5. According to the Nemerow index method, the average SQI_N of the five plots was 0.4352 ± 0.194; the average value obtained from the TDS was 0.581 ± 0.236, and the average value obtained from the MDS was 0.602 ± 0.351. The weighted additive method was employed to compute three SQIs, all of which yielded satisfactory outcomes. These evaluation methods indicate that the overall soil quality of the waste dumps is at a moderate level. The order of SQ among the waste dumps was: No. 4 lower > No. 1 > No. 2 > No. 3 > No. 4 upper. Specifically, the non-linear membership function indicated that pH, available nitrogen (AN), available phosphorus (AP), surface moisture content (SMC), and bulk density (BD) were the main factors limiting the SQIs across all waste dumps, whereas the crucial limiting indicators in unreclaimed areas were total phosphorus (TP) and total nitrogen (TN). This analysis demonstrates its efficacy in formulating strategies for SQ evaluation and targeted soil reclamation plans for waste dumps.
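For reference, one common formulation of the improved Nemerow index used in soil quality studies is sketched below; the membership scores are hypothetical, and the study's own membership functions and weights are not reproduced.

```python
# Sketch of an improved Nemerow soil quality index: it combines the mean and the
# minimum (worst) indicator score, scaled by (n - 1) / n. Scores are hypothetical.
import numpy as np

def nemerow_sqi(scores):
    scores = np.asarray(scores, dtype=float)   # membership scores in [0, 1]
    p_avg, p_min, n = scores.mean(), scores.min(), scores.size
    return np.sqrt((p_avg ** 2 + p_min ** 2) / 2.0) * (n - 1) / n

# Hypothetical membership scores for the nine indicators of one plot
plot_scores = [0.62, 0.55, 0.48, 0.71, 0.35, 0.58, 0.66, 0.52, 0.44]
print(f"SQI_N = {nemerow_sqi(plot_scores):.3f}")
```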

12.
Sci Total Environ; 924: 171707, 2024 May 10.
Article in English | MEDLINE | ID: mdl-38490429

ABSTRACT

Soil salinization is one of the major soil degradation threats worldwide, and parameters related to soil quality and ecosystem multifunctionality (EMF) are crucial for evaluating the success of reclamation efforts in saline-sodic wasteland (WL). Microbial metabolic limitation is also one of the main factors that influence EMF in agricultural cropping systems. A ten-year fixed-site field experiment was conducted to reveal the key predictors of soil quality index (SQI) values, microbial metabolic characteristics, and EMF in different farmland cropping systems. A random forest model showed that β-glucosidase (BG), cellobiosidase (CBH), and saturated hydraulic conductivity (SHC) among the SQI factors were the main driving forces of soil EMF. Compared to monoculture models, such as paddy field (PF) or upland field (UF), the converted paddy-field-to-upland-field (CF) cropping system was most effective at improving EMF in reclaimed saline-sodic WL, increasing this metric by 275.35%. CF integrates practices from both PF and UF planting systems, improves soil quality, and relieves microbial metabolic limitation. Specifically, both CF and PF significantly reduced soil pH (by 16-23%) and the sodium adsorption ratio (SAR) (by 65-83%) and significantly reduced the abundance of large macroaggregates. Moreover, CF significantly improved soil saturated hydraulic conductivity relative to PF and UF (p < 0.05), indicating an improvement in soil physical properties. Overall, although reclamation improved the SQI compared to WL (0.25), the EMF of CF (0.56) was significantly higher than that of the other treatments (p < 0.05). Thus, while increasing SQI can improve soil EMF, it is not as effective alone as when combined with more comprehensive efforts that focus on improving various soil properties and alleviating microbial metabolic limitations. Therefore, our results suggest that future saline-sodic wasteland reclamation efforts should avoid monoculture systems to enhance soil EMF.


Subjects
Ecosystem; Soil; Soil/chemistry; Sodium/chemistry; Adsorption
13.
Gerontologist; 64(2), 2024 Feb 01.
Article in English | MEDLINE | ID: mdl-37432373

ABSTRACT

BACKGROUND AND OBJECTIVES: Uncovering subgroups of nursing home (NH) residents who share similar preference patterns is useful for developing systematic approaches to person-centered care. This study aimed to (i) identify preference patterns among long-stay residents, and (ii) examine the associations of preference patterns with resident and facility characteristics. RESEARCH DESIGN AND METHODS: This study was a national cross-sectional analysis of Minimum Data Set assessments in 2016. Using resident-rated importance for 16 preference items in the Preference Assessment Tool as indicators, we conducted latent class analysis to identify preference patterns and examined their associations with resident and facility characteristics. RESULTS: We identified 4 preference patterns. The high salience group (43.5% of the sample) was the most likely to rate all preferences as important, whereas the low salience group (8.7%) was the least likely. The socially engaged (27.2%) and socially independent (20.6%) groups featured high importance ratings on social/recreational activities and on maintaining privacy/autonomy, respectively. The high salience group reported more favorable physical and sensory function than the other 3 groups and lived in facilities with higher activity staffing. The low salience and socially independent groups reported a higher prevalence of depressive symptoms, whereas the low salience or socially engaged groups reported a higher prevalence of cognitive impairment. Preference patterns also varied by race/ethnicity and gender. DISCUSSION AND IMPLICATIONS: Our study advances the understanding of within-individual variations in preferences and the role of individual and environmental factors in shaping preferences. The findings have implications for providing person-centered care in NHs.


Subjects
Nursing Homes; Patient Preference; Humans; Cross-Sectional Studies; Latent Class Analysis; Patient-Centered Care
14.
J Am Med Dir Assoc; 25(4): 606-609.e1, 2024 Apr.
Article in English | MEDLINE | ID: mdl-37573885

ABSTRACT

OBJECTIVES: Nursing home (NH) Minimum Data Set (MDS) assessments have frequently been used to measure medication use in epidemiologic studies, but there is little evidence on the accuracy of MDS-based medication records. We compared antipsychotic use estimated using 2 data sources: the MDS and NH electronic medication administration records (eMAR). DESIGN: Cross-sectional comparison. SETTING AND PARTICIPANTS: This analysis was based on MDS and linked eMAR data of 604 NH residents with dementia at 54 NHs in 10 states, participating in a cluster-randomized pragmatic trial (METRIcAL), from June 2019 to February 2020. METHODS: One admission, quarterly, or annual MDS assessment was chosen for each participant. The MDS assessment recorded the number of antipsychotic treatment days during a 7-day window. We then identified antipsychotic administrations during the corresponding window in the eMAR. We used the Cohen kappa to assess agreement in the proportion of participants on antipsychotics during the week and the intraclass correlation coefficient (ICC) to assess agreement on treatment days. We further used the eMAR data as a reference to calculate validity parameters. RESULTS: A total of 29.5% of study participants were identified as antipsychotic users based on the MDS vs 28.3% based on the eMAR data (kappa value: 0.96). The MDS-based average treatment duration was estimated to be 2.0 days, consistent with the eMAR-based estimate (1.8 days; ICC: 0.96). Sensitivity was 98.8% (95% CI 95.8%-99.9%), specificity was 97.9% (95% CI 96.1%-99.1%), the positive predictive value was 94.9% (95% CI 90.8%-97.3%), and the negative predictive value was 99.5% (95% CI 98.2%-99.9%). CONCLUSIONS AND IMPLICATIONS: Agreement between the MDS and eMAR on antipsychotic use is high, suggesting that the MDS is a valid tool for measuring antipsychotic use in epidemiologic studies. Further studies with large and diverse populations are warranted to confirm our findings.
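The agreement and validity calculations reported above can be illustrated with the short sketch below, using made-up binary indicators of antipsychotic use from each source; the trial data themselves are not reproduced here.

```python
# Sketch: Cohen's kappa plus sensitivity/specificity/PPV/NPV with eMAR as the
# reference standard. The 0/1 vectors below are hypothetical examples.
import numpy as np
from sklearn.metrics import cohen_kappa_score, confusion_matrix

mds_use  = np.array([1, 0, 1, 1, 0, 0, 1, 0])   # MDS-identified users
emar_use = np.array([1, 0, 1, 0, 0, 0, 1, 0])   # eMAR-identified users (reference)

kappa = cohen_kappa_score(mds_use, emar_use)
tn, fp, fn, tp = confusion_matrix(emar_use, mds_use).ravel()

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
ppv = tp / (tp + fp)
npv = tn / (tn + fn)
print(f"kappa={kappa:.2f}  sens={sensitivity:.1%}  spec={specificity:.1%}  "
      f"PPV={ppv:.1%}  NPV={npv:.1%}")
```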


Subjects
Antipsychotic Agents; Humans; Antipsychotic Agents/therapeutic use; Cross-Sectional Studies; Hospitalization; Nursing Homes; Skilled Nursing Facilities
15.
Health Sci Rep; 6(11): e1671, 2023 Nov.
Article in English | MEDLINE | ID: mdl-37920660

ABSTRACT

Background and Aims: Spinal canal stenosis is one of the most common vertebral column diseases and can lead to disability. Developing a registry system can support research on its prevention and effective treatment. This study designs a minimum data set (MDS) as the first step in creating a registry system for spinal canal stenosis. Method: This applied, descriptive study was performed in 2022. First, applicable data elements for the disease were selected from a wide range of English and Farsi references, including peer-reviewed articles, academic books, credible websites, and the medical records of hospitalized patients. From the extracted data, the preliminary MDS was designed as a questionnaire. The validity of the questionnaire was assessed by obtaining the opinions of experts (neurosurgeons, physiotherapists, epidemiologists, and health information management specialists), and its reliability was calculated via the Cronbach's α coefficient, which was 0.86. Finally, the MDS of the spinal canal stenosis national registry system (for Iran) was confirmed through a two-stage Delphi technique. Data were analyzed with descriptive statistics using SPSS 21. Results: The proposed MDS is offered in two general sets of data: administrative and clinical. For the administrative data set, 40 data elements in five classes were proposed, 26 of which were confirmed. In the clinical section, 95 data elements in 14 classes were proposed, 94 of which were confirmed. Conclusion: Since no spinal canal stenosis MDS is currently available, this study can be a turning point in standardizing data on this disease. Moreover, these precise, coherent, and standard data elements can contribute to improving disease management and enhancing public healthcare quality. The MDS proposed in this study can also help researchers and experts design spinal canal stenosis registry systems in other countries.
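The Cronbach's α reliability check mentioned in the methods can be computed as in the sketch below, applied to hypothetical expert ratings rather than the study's actual questionnaire data.

```python
# Sketch of Cronbach's alpha: k/(k-1) * (1 - sum of item variances / total variance).
# The rating matrix below is a made-up example (rows = experts, columns = items).
import numpy as np

def cronbach_alpha(item_scores):
    item_scores = np.asarray(item_scores, dtype=float)
    k = item_scores.shape[1]
    item_vars = item_scores.var(axis=0, ddof=1).sum()
    total_var = item_scores.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars / total_var)

ratings = np.array([[4, 5, 4, 3],
                    [5, 5, 4, 4],
                    [3, 4, 3, 3],
                    [4, 4, 5, 4]])
print(f"Cronbach's alpha = {cronbach_alpha(ratings):.2f}")
```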

16.
Environ Monit Assess; 195(12): 1536, 2023 Nov 27.
Article in English | MEDLINE | ID: mdl-38010593

ABSTRACT

A healthy soil is crucial for food security, but human activities, particularly industrialization, are degrading soil quality. This study aims to assess and compare the Soil Quality Index (SQI) at three industrial sites: the iron and steel industry, the open-cast coal mining industry, and the brick kiln industry, along with a control field. To compute the SQI, the weighted additive method was applied to nine selected physico-chemical properties of soil: soil temperature, soil moisture, bulk density, pH, nitrogen (N), phosphorus (P), potassium (K), organic carbon (OC), and electrical conductivity (EC). Principal Component Analysis coupled with multiple correlation methods was used to determine the Minimum Data Set. The most dominant factors near the iron and steel industrial site are soil temperature, N, K, and EC, while N, OC, soil moisture, P, K, and EC are the most significant factors near the open-cast coal mine. At the brick kiln industrial site, soil moisture, OC, K, soil temperature, and P have the highest loadings. The calculated SQI indicates that soil quality is best in the control field (0.6475), while the soil adjacent to the coal mining industrial site (0.1426) is in the worst state, followed by the iron and steel industrial site (0.1611) and the brick kiln industrial site (0.289). To attain sustainable agricultural practices in industrial landscapes, efficient management of nutrient contents and phytoremediation can be helpful.


Subjects
Coal Mining; Soil; Humans; Soil/chemistry; Environmental Monitoring/methods; Iron; Biodegradation, Environmental; Steel
17.
Front Plant Sci; 14: 1283457, 2023.
Article in English | MEDLINE | ID: mdl-37954986

ABSTRACT

Introduction: Soil quality plays an irreplaceable role in plant growth on restored grassland. However, few studies have investigated the comprehensive effects of restoration on both soil and vegetation properties in desertified grassland, which restricts the virtuous cycle of restored grassland ecosystems. Methods: Using three restoration patterns, enclosure plus grass (EG), enclosure intercropping shrub-grass (ESG), and enclosure plus sand-barrier and shrub-grass (ESSG), with three restoration durations (≤5, 7-9, and ≥15 years), we selected 28 physicochemical and microbial indicators and constructed a minimum data set (MDS) to analyze the influence of restoration measures on soil quality and ecological benefits in alpine desertified grassland. Results: The MDS comprised seven soil quality indicators: silt, total nitrogen (TN), carbon-nitrogen ratio (C/N), total potassium (TK), microbial biomass carbon (MBC), microbial biomass phosphorus (MBP), and fungi. The soil quality index (SQI) and ecological restoration effect index (EREI) in restored grasslands increased significantly by 144.83-561.24% and 87.21-422.12%, respectively, compared with unrestored grassland, and these positive effects increased with restoration duration. The increases in SQI and EREI were highest in ESSG, followed by EG and ESG. The rate of increase in SQI began to decline after 5 years in EG and ESG and after 7-9 years in ESSG, and the rate of increase in EREI was lower in EG than in ESSG in each restoration period. Our work revealed that ESSG was the optimum restoration pattern for desertified grassland, and that monitoring and management measures such as organic fertilization and reasonable mowing return should be carried out from 5 years onward in EG and ESG and from 7 years onward in ESSG to maintain sustainable ecological benefits. Discussion: The study highlights that soil quality, including microbial properties, is a key factor in evaluating the restoration effects of desertified grassland.

18.
BMC Health Serv Res; 23(1): 1010, 2023 Sep 19.
Article in English | MEDLINE | ID: mdl-37726768

ABSTRACT

BACKGROUND: Over the last ten years, many countries have started to develop constructive systems for registering common diseases and cancers. In this research, we aimed to determine the minimum data set (MDS) required to design an oral and lip squamous cell cancer registration system in Iran. METHODS AND MATERIAL: First, primary information elements related to disease registries were extracted from scientific papers published in reliable databases. After reviewing the books, related main guidelines, and 42 valid articles, the initial draft of a researcher-made questionnaire was compiled. To validate the questionnaire, two focus group meetings were held with 29 expert panel members. The final version of this questionnaire was prepared by extracting different questions and categories and receiving extensive feedback from specialists. Lastly, a final survey was conducted among the experts who had participated in the previous stage. RESULTS: Of the 29 experts participating in the study, 17 (58.62%) were men and 12 (41.38%) were women. The age of the experts ranged from 34 to 58 years. One hundred fourteen items, divided into ten main parts, were considered the main information elements of the registry design. The main minimum data set pertained to the demographic and clinical information of the patient, information related to the consumed drugs, initial diagnostic evaluations of the patient, biopsy, tumor staging at the time of diagnosis, clinical characteristics of the tumor, surgery, histopathological characteristics of the tumor, pathologic stage classification, radiotherapy details, follow-up information, and disease registry capabilities. The distinctive characteristics of the oral and lip squamous cell cancer registry system, such as the title of the disease registration programme, the population being studied, the geographic extent of the registration, its primary goals, the definition of the condition, the technique of diagnosis, and the kind of registration, are all included in a model. CONCLUSION: The benefits of designing and implementing disease registries include timely access to medical records, registration of information related to patient care and follow-up, the existence of standard forms and standard information elements, and the existence of an integrated information system at the country level.


Subjects
Carcinoma, Squamous Cell; Lip; Male; Humans; Female; Adult; Middle Aged; Carcinoma, Squamous Cell/epidemiology; Carcinoma, Squamous Cell/therapy; Biopsy; Books; Databases, Factual
19.
J Environ Manage; 345: 118582, 2023 Nov 01.
Article in English | MEDLINE | ID: mdl-37540979

ABSTRACT

Globally, agriculture has had a significant and often detrimental impact on soil. The continued capacity of soil to function as a living ecosystem that sustains microbes, plants, and animals (including humans), its metaphorical health, is of vital importance across geographic scales. Healthy soil underpins food production and ecosystem resilience against a changing climate. This paper focuses on assessing soil health, an area of increasing interest for farming communities, researchers, industry and policy-makers. Without accessible and reliable soil assessment, any management and interventions to improve soil health are likely to be sub-optimal. Here we explore available soil health assessments (SHAs) that may be feasible for farmers of varying income levels and suitable for broad geographic application. Whilst there is a range of existing approaches to SHA, we find that no one framework currently meets these broad aims. Firstly, reliance on expensive and logistically complex laboratory methods reduces viability and accessibility for many farmers. Secondly, lack of defined indicator baselines and associated thresholds or gradients for soil health prevents the assessment of soil measurements against achieving optima for a given set of local soil-climate conditions. Since soils vary greatly, these baselines and thresholds must be defined considering the local biogeographic context; it is inappropriate to simply transfer calibrated information between contexts. These shortcomings demand progress towards a feasible, globally applicable and context-relevant SHA framework. The most feasible SHAs we identified were developed locally in conjunction with farmers, who have been repeatedly found to assess the health of their soils accurately, often using relatively simple, observable indications. To progress, we propose assessment of which indicators add information to a SHA in local contexts, with a focus on sufficiency, to reduce data burden. Provision of a standardised protocol for measurement and sampling that considers the reliability and accuracy of different methods would also be extremely valuable. For greatest impact, future work should be taken forward through a cross-industry collaborative approach involving researchers, businesses, policy makers, and, above all, farmers, who are both experts and users.


Subjects
Farmers; Soil; Animals; Humans; Ecosystem; Reproducibility of Results; Agriculture
20.
Heliyon; 9(7): e17835, 2023 Jul.
Article in English | MEDLINE | ID: mdl-37519636

ABSTRACT

The role of biochar in improving the properties of problem soils is well known, but its long-term impact on lowland rice soil is not well recognized, and the soil quality indicators of biochar-amended lowland rice soil are not widely reported. We developed a soil quality index (SQI) for a biochar-amended lowland rice soil based on 17 soil properties (indicators). The field experiment consisted of six treatments of rice husk-derived biochar (RHB) at 0.5, 1, 2, 4, 8, and 10 t ha⁻¹, along with a control. An overall SQI encompassing the indicators was calculated using multivariate statistics (principal component analysis) and non-linear scoring functions after generation of a minimum data set (MDS). Sequential application of RHB improved the SQI by 4.85% and 16.02% with 0.5 t ha⁻¹ and 10 t ha⁻¹ RHB, respectively, over the recommended dose of fertilizer (control). PCA screening revealed that total organic carbon (Ctot), zinc (Zn), pH, and bulk density (BD) were the main soil quality indicators in the MDS, with contributions of 27.79%, 26.61%, 23.67%, and 14.47%, respectively. Apart from Ctot, Zn is one of the major contributors to the SQI, and RHB application can potentially be an effective agronomic practice to improve Zn status in lowland rice soil. The overall SQI was significantly influenced by RHB application even at 0.5 t ha⁻¹. The present study highlights that application of RHB improves soil quality even in fertile, well-managed lowland rice soil.
