ABSTRACT
A two-level hierarchical framework for early-stage sustainability assessment (FESSA) amongst a set of alternatives, applicable from the earliest stages of process or product development, is introduced, and its use in combination with an improved weighted-sum method for multi-criteria decision analysis (WSM-MCDA) in the presence of uncertainty is demonstrated through a case study based upon a real-world decision scenario from speciality polymer manufacture. The approach addresses the challenge faced by those responsible for innovation management in the manufacturing process industries: making decisions that are both timely and rational early in the innovation cycle, when knowledge gaps and uncertainty about the options tend to be at their highest. The Computed Uncertainty Range Evaluations (CURE) WSM-MCDA method provides better discrimination than the existing Multiple Attribute Range Evaluations (MARE) method without the computational burden of generating heuristic outcome distributions via Monte-Carlo simulation.
This paper introduces a framework that teams can use to think systematically about the wide range of criteria which go into deciding whether a proposed innovation enhances sustainability or not, and shows how an improved method for multiple-criteria decision analysis can be used to put it into practice, with an example drawn from the speciality chemicals industry. Innovation in the manufacturing process industries requires decisions to be made. In individual projects, scientists and technical managers must decide which technology, materials, and equipment to use. Equally, those responsible for directing a portfolio of projects must choose which projects to prioritise. In either case, early decision making is desirable to avoid sinking time and money into dead-end projects, and to identify what further work is needed for projects with a future. The earlier you decide, however, the harder it can be to obtain firm evidence (e.g. conclusive experimental data, fully validated costings, or life cycle impacts) upon which to base your decision. The growing societal expectation that sustainability criteria are factored into such decisions merely adds to the challenges faced by the decision maker. Decisions must be made upon the evidence that is available, combined with the informed judgement of those with knowledge of the system under consideration. This is best approached as a facilitated, team-based activity where assertions, assumptions and interpolations or extrapolations from the limited data can be tested and challenged. A sound decision-making process needs a suitable computational method for turning this complex qualitative and semi-quantitative assessment into a clear output indicator of potential success or failure for the options under consideration.
The method described in this paper addresses this need but, just as importantly, the methodology ensures that the thought process behind whatever decision is indicated is clearly and transparently documented for future reference.
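The core idea behind range-based weighted-sum MCDA can be sketched in a few lines. The following is a generic illustration under assumed inputs (hypothetical weights and criterion scores on a common benefit scale), not the CURE method itself: each option carries a [low, high] range per criterion, and the weighted sum of those bounds gives a score interval directly, which a Monte-Carlo sampling of the same ranges can only approximate at greater cost.

```python
import random

def wsm_interval(weights, lo, hi):
    """Weighted-sum score bounds from per-criterion [lo, hi] ranges.

    Assumes all criteria are 'benefit' criteria on a common scale and
    weights sum to 1, so the score interval is the weighted sum of the
    per-criterion bounds (no sampling needed).
    """
    return (sum(w * l for w, l in zip(weights, lo)),
            sum(w * h for w, h in zip(weights, hi)))

def wsm_monte_carlo(weights, lo, hi, n=10_000, seed=0):
    """Heuristic score distribution via uniform sampling of each range."""
    rng = random.Random(seed)
    scores = [sum(w * rng.uniform(l, h) for w, l, h in zip(weights, lo, hi))
              for _ in range(n)]
    return min(scores), max(scores)

# Hypothetical option: three criteria scored as (low, high) on a 0-10 scale.
weights = [0.5, 0.3, 0.2]
option_a = ([6, 4, 7], [8, 6, 9])
lo_a, hi_a = wsm_interval(weights, *option_a)
```

Sampled scores always fall inside the analytic interval, which is why the direct bound computation discriminates between options without the simulation burden.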
ABSTRACT
The European regulatory framework on chemicals is at a crossroads. There are calls for the framework to be more effective, by better protecting people and the environment. There is also room for it to be more efficient and cost-effective, by harmonising assessment practices across sectors and avoiding the need for unnecessary testing. At the same time, there is a political commitment to phase out animal testing in chemical safety assessments. In this commentary, we argue that these needs are not at odds with each other. On the contrary, the roadmap to phase out animal testing could also be the transition pathway to a more efficient, effective and sustainable regulatory ecosystem. Central to our proposal is a framework based on biological reasoning in which biological questions can be answered by a choice of methods, with non-animal methods progressively becoming the only choice. Within this framework, a tiered approach to testing and assessment allows for greater efficiency and effectiveness, while also introducing considerations of proportionality and cost-effectiveness. Testing strategies, and their component methods, should be developed in tandem and judged in terms of their outcomes (the protection levels they inform) rather than their ability to predict the outputs of animal tests.
ABSTRACT
To address marketing challenges in the agricultural sector, provide financial support for small-scale farmers over marketing seasons, manage price risks of agricultural products, and enhance the functioning of agricultural mercantile exchanges, it is feasible to implement an efficient and compliant warehouse receipt system (WRS) that aligns with the legal, institutional, social, and economic-financial conditions of a country. The aim of this study is to design, simulate, and assess the feasibility of an innovative WRS in the agricultural sector. To achieve this, a WRS was designed and evaluated for the maize crop in Iran. The research methodology of this study is divided into three main parts: design, simulation, and feasibility assessment of the WRS. The design process drew on the FAO (2009) guidance on warehouse receipt system development, considering the experiences of various countries and the institutional and financial regulations specific to Iran. Additionally, a dynamic programming model was used to simulate the system, and an agent-based model was utilized for system feasibility assessment. The study results demonstrated that it is possible to design an innovative and efficient WRS by involving five key actors, namely farmers, buyers (maize traders), banks, the mercantile exchange market, and the warehouse (a governmental institution), and establishing clear communications among them. Based on simulation results using the dynamic programming model, it was evident that four parameters, namely the annualized loan interest rate, the valuation coefficient for loan collateral, price volatility of the product over the marketing season, and the warehouse cost-to-product value ratio, significantly impact the adoption of the WRS by farmers.
In conclusion, the findings from the agent-based model revealed that setting the annualized loan interest rate at 8%, the collateral valuation coefficient at 85%, price fluctuation over the non-harvest period at 60%, and the warehouse cost-to-product value ratio at 2% can result in the participation of nearly 100% of farmers in the proposed WRS.
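The economic logic driving the four parameters above can be illustrated with a minimal farmer decision rule. This is a hedged sketch, not the paper's dynamic programming or agent-based model: a hypothetical farmer stores the crop, borrows against it at the collateral coefficient, and participates only if the expected price gain over the season covers loan interest plus warehouse cost.

```python
def store_is_profitable(p_harvest, p_expected, months=6,
                        annual_rate=0.08, collateral_coef=0.85,
                        warehouse_cost_ratio=0.02):
    """Illustrative WRS participation rule (hypothetical, per unit of crop).

    The farmer deposits the crop, borrows collateral_coef * harvest value
    at annual_rate for the storage period, and sells at p_expected.
    Storing pays off when the expected sale price, net of interest and
    warehouse cost, exceeds the price obtainable at harvest.
    """
    loan = collateral_coef * p_harvest
    interest = loan * annual_rate * months / 12
    warehouse = warehouse_cost_ratio * p_expected
    return p_expected - interest - warehouse > p_harvest
```

Under the abstract's parameter settings, even a modest expected price rise makes participation attractive, which is consistent with the near-universal adoption reported.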
ABSTRACT
BACKGROUND AND PURPOSE: While patient-specific quality assurance (PSQA) has been integral to intensity-modulated treatments, its value is debated. A systems approach to safety is essential for understanding complex systems like radiation oncology but is often overlooked in PSQA research. This study aims to elucidate PSQA's fundamental value and identify opportunities for enhancing safety in intensity-modulated treatments. MATERIALS AND METHODS: First, causal scenarios that could lead to patient harm were identified using a prospective safety assessment technique developed for complex systems. Second, PSQA's ability to mitigate these scenarios was evaluated using standard stability and control principles. The analysis also included safeguards related to PSQA, such as daily linac QA, equipment commissioning, and equipment design. RESULTS: Ten causal scenarios were identified, highlighting well-known issues like flawed algorithms, data corruption, and hardware errors. Mitigation is achieved through advanced dose calculation and optimization algorithms, software and data integration, and preconfigured beam data, which improve decision-making and system state determination. Modern linac control systems enhance all aspects of system stability and control. Commissioning, daily linac QA, and PSQA are effective in enhancing the determination of system states only when feedback is non-overlapping and unambiguous. CONCLUSION: Given equipment improvement and related safeguards, the feedback generated from PSQA has diminished in value. To better complement other safeguards, PSQA should evolve to provide automated, unambiguous detection of any potential catastrophic treatment deviations prior to treatment. This evolution would allow physicists to focus on more critical aspects of patient care in radiation oncology.
ABSTRACT
Objective: To construct and validate a predictive nomogram model for the survival of patients with ventilator-associated pneumonia (VAP) to enhance prediction of the 28-day survival rate in critically ill patients with VAP. Methods: A total of 1,438 intensive care unit (ICU) patients with VAP were screened through the Medical Information Mart for Intensive Care (MIMIC)-IV. On the basis of multivariable Cox regression analysis, nomogram performance in predicting the survival status of patients with VAP at 7, 14, and 28 days after ICU admission was evaluated using the C-index and area under the curve (AUC). Calibration and decision curve analysis curves were generated to assess the clinical value and effectiveness of the model, and risk stratification was performed for patients with VAP. Results: Through stepwise regression screening of univariable and multivariable Cox regression models, independent prognostic factors for the nomogram were determined, including age, race, body temperature, Sequential Organ Failure Assessment score, anion gap, bicarbonate concentration, partial pressure of carbon dioxide, mean corpuscular hemoglobin, and liver disease. The model had C-index values of 0.748 and 0.628 in the training and test sets, respectively. The receiver operating characteristic curve showed that the nomogram performed better in predicting 28-day survival status in the training set (AUC = 0.74) than in the test set (AUC = 0.66). Calibration and decision curve analysis results suggested that the nomogram had favorable predictive performance and clinical efficacy. Kaplan-Meier curves showed significant differences in survival between low-, medium-, and high-risk groups in the total set and training set (log-rank p < 0.05), further validating the effectiveness of the model. Conclusion: A nomogram predicting 7-, 14-, and 28-day survival of patients with VAP after ICU admission was constructed, contributing to risk stratification and decision-making for such patients.
The model is expected to play a positive role in supporting personalized treatment and management of VAP.
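The C-index reported above (0.748 training, 0.628 test) measures how often the model ranks pairs of patients correctly. A minimal from-scratch computation of Harrell's C-index, shown here as an illustration with toy data rather than the study's pipeline, makes the definition concrete:

```python
def concordance_index(times, events, risk_scores):
    """Harrell's C-index for right-censored survival data.

    times: observed follow-up times; events: 1 if the event (death) was
    observed, 0 if censored; risk_scores: model-predicted risk, where
    higher means worse prognosis. A pair (i, j) is usable when patient
    i's event occurred strictly before patient j's observed time; it is
    concordant when the earlier-event patient also has the higher risk.
    """
    concordant = tied = usable = 0
    n = len(times)
    for i in range(n):
        for j in range(n):
            if events[i] == 1 and times[i] < times[j]:
                usable += 1
                if risk_scores[i] > risk_scores[j]:
                    concordant += 1
                elif risk_scores[i] == risk_scores[j]:
                    tied += 1  # ties conventionally count half
    return (concordant + 0.5 * tied) / usable
```

A value of 0.5 corresponds to random ranking and 1.0 to perfect ranking, so the drop from 0.748 to 0.628 between training and test sets quantifies the loss of discrimination on unseen patients.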
ABSTRACT
The pervasive utilization of plastic tools in aquaculture introduces significant volumes of microplastic fibers, presenting a consequential risk through the leaching of additives such as phthalates. This study scrutinizes the leaching dynamics of six prevalent phthalate esters (PAEs) from thirteen plastic aquaculture tools comprising polyethylene terephthalate (PET), polypropylene (PP), and polyethylene (PE), with ΣPAEs ranging from 0.24 to 4.26 mg g-1. Di(2-ethylhexyl) phthalate (DEHP) and dibutyl phthalate (DBP) emerged as predominant, marking significant environmental concern. Over a 30-day period, leaching quantities of Σ6PAEs from PET, PP, and PE fibers reached 36.65 µg g-1, 21.87 µg g-1 and 19.11 µg g-1, respectively, influenced by factors such as time, temperature, turbulence, and salinity. Notably, turbulence exerted the most pronounced effect, followed by temperature, with negligible influence from salinity. A kinetic model consistent with interface-diffusion control was developed to predict the leaching behavior of PAEs, with activation energies (Ea) indicative of the process's thermodynamic nature. The application of this model to real-world aquaculture waters forecasted significant risks, in agreement with empirical data, underscoring the pressing need for regulatory and mitigation strategies against PAE contamination from aquaculture practices.
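Two textbook relations underlie the kinetic treatment described above, sketched here with hypothetical rate constants rather than the study's fitted values: diffusion-controlled leaching, in which cumulative leached mass grows with the square root of time, and the two-point Arrhenius estimate of activation energy from rate constants at two temperatures.

```python
import math

R = 8.314  # gas constant, J mol^-1 K^-1

def leached_mass(k, t):
    """Diffusion-controlled leaching sketch: cumulative leached mass
    proportional to sqrt(time), with a hypothetical rate constant k."""
    return k * math.sqrt(t)

def activation_energy(k1, T1, k2, T2):
    """Two-point Arrhenius estimate of Ea (J/mol).

    From ln k = ln A - Ea/(R*T) at two absolute temperatures:
    Ea = R * ln(k2/k1) / (1/T1 - 1/T2).
    """
    return R * math.log(k2 / k1) / (1 / T1 - 1 / T2)

# Example: a rate constant doubling between 288 K and 308 K implies
# an activation energy of roughly 26 kJ/mol.
ea = activation_energy(1.0, 288.0, 2.0, 308.0)
```

The magnitude of Ea is what lets such a model distinguish a surface-diffusion-limited process from, say, a reaction-limited one, and hence extrapolate leaching rates to field temperatures.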
ABSTRACT
Internet-of-Things (IoT) refers to low-memory connected devices used in various new technologies, including drones, autonomous machines, and robotics. The article aims to better understand cyber risks in low-memory devices and the challenges in IoT risk management. The article includes a critical reflection on current risk methods and their level of appropriateness for IoT. We present a dependency model tailored to current challenges in data strategies and make recommendations for the cybersecurity community. The model can be used for cyber risk estimation and assessment and generic risk impact assessment. The model is developed for cyber risk insurance for new technologies (e.g., drones, robots). Still, practitioners can apply it to estimate and assess cyber risks in organizations and enterprises. Furthermore, this paper critically discusses why risk assessment and management are crucial in this domain and what open questions on IoT risk assessment and risk management remain areas for further research. The paper then presents a more holistic understanding of cyber risks in the IoT. We explain how the industry can use new risk assessment and management approaches to deal with the challenges posed by emerging IoT cyber risks. We explain how these approaches influence policy on cyber risk and data strategy. We also present a new approach for cyber risk assessment that incorporates IoT risks through dependency modeling. The paper describes why this approach is well suited to estimating IoT risks.
ABSTRACT
Extreme weather events affect many areas around the world, and a country's or region's reaction to them can take many forms. In this article, we concentrate on policy responses, as typically found in laws, acts, or strategies. Recent research in climate change adaptation and environmental governance concluded that the severity of an extreme event is a crucial indicator of whether policy action will be taken; the event alone is a necessary, but insufficient, condition for policies to be introduced. In this context, we ask: Under which conditions can an extreme event deploy its focal power and induce policy introduction or change? To answer this question, we studied more than two centuries of flood risk management in Switzerland. We relied on qualitative and quantitative data, as well as process tracing techniques, to relate event characteristics and media, political, and policy contexts to policy change in flood risk management. Results indicate that two conditions made floods turn into focusing events and support paradigm shifts: high economic damage and a policy subsystem actor constellation favorable to change. We are convinced that our results are replicable for other natural disasters and for countries other than Switzerland. Supplementary Information: The online version contains supplementary material available at 10.1007/s10113-024-02316-2.
ABSTRACT
Background: Precautionary Allergen ("may contain") Labelling (PAL) is used by industry to communicate potential risk to food-allergic individuals posed by unintended allergen presence (UAP). In 2014, the World Allergy Organization (WAO) highlighted that PAL use was increasing, but often applied inconsistently and without regulation, which reduces its usefulness to consumers with food allergy and those purchasing food for them. WAO proposed the need for a regulated, international framework to underpin application of PAL. In 2019, the World Health Organization (WHO) and the Food and Agriculture Organization (FAO) of the United Nations convened an expert consultation to address the issue of PAL, the outputs of which are now being considered by the Codex Committee on Food Labelling (CCFL). Objectives: To summarise the latest data to inform the application of PAL in a more systematic way, for implementation into global food standards. Methods: A non-systematic review of issues surrounding precautionary labelling and food allergens in pre-packaged products. Results: Approximately 100 countries around the world have legislation on the declaration of allergenic ingredients. Just a few have legislation on UAP. Given the risks that UAP entails, unregulated PAL creates real-world difficulties because patients find it inconsistent and hard to interpret. The attempts made so far to rationalize PAL have both strengths and weaknesses. Conclusions: At a time when CCFL is considering the results of the FAO/WHO Expert Consultation 2020-2023, we summarise the prospects for developing effective and homogeneous legislation at a global level, and the areas of uncertainty that might hinder international agreement on a regulated framework for PAL of food allergens.
ABSTRACT
Overview: This study provides empirical data on the knowledge and practices of biosafety and biosecurity professionals and researchers involved in research on enhanced Potential Pandemic Pathogens (ePPPs) and Dual Use Research of Concern (DURC) within various U.S. sectors. The goal is to improve public health interventions and oversight for DURC and ePPP, contributing valuable insights for policy development. A notable finding was the association between larger biosafety/biosecurity teams and a higher likelihood of conducting high-risk biological research. Methods: A survey of 541 biosafety and biosecurity professionals was conducted between 8 March and 10 April 2024, with results analyzed using SAS at a significance level of 0.05. The study received approval from the Institutional Review Boards (IRBs) at Arizona State University and the University of Nevada, Reno. Results: Government organizations were more likely to conduct DURC compared to other sectors (e.g., Academic, Commercial, Consulting). Public institutions reviewed more experiments outside the scope of the U.S. DURC Policy than private for-profit institutions. Institutions with larger biosafety/biosecurity teams reported greater research activity and more effective non-compliance reporting mechanisms (e.g., anonymous hotlines, reporting forms). Additionally, financial support and the challenges of policy implementation varied significantly across sectors. Discussion: The findings emphasize the need for appropriate staffing and resource allocation for high-risk biosafety and biosecurity research. A differentiated regulatory approach and equitable distribution of resources are essential for effective oversight. Moreover, robust non-compliance reporting systems are critical to mitigating the risks associated with DURC and ePPP research.
ABSTRACT
Risk management is an important component of service delivery in supportive housing and Housing First programs. However, there is no evidence on the implementation of risk management approaches in these settings. This qualitative study examined what service providers working in supportive housing and Housing First programs in Canada identify as the programmatic and organizational factors that affect the prevention and management of high-risk behaviours and challenges (e.g., overdose, suicide attempts, non-suicidal self-injury, falls and fall-related injuries, fire-setting, hoarding, apartment takeovers, violence, property damage, drug selling) in their programs. In-depth interviews were completed with a purposive sample of 32 service providers. Data were analyzed using an integrative approach that incorporated techniques from qualitative description and thematic analysis. Four thematic factors, each comprising various barriers and facilitators, affected the management of high-risk issues in supportive housing and Housing First programs: [1] flexibility in addressing risk issues; [2] early identification of risk issues; [3] built environment and housing location; and [4] resource availability. Overall, the findings underscore how service providers aim to identify high-risk issues promptly, beginning as early as referral, and that their capacity to effectively do this and intervene accordingly is dynamically shaped by various aspects of the program model, environment, and availability of internal and external resources. Yet, the findings also highlight how risk management approaches may conflict with other programmatic goals and values, and the importance of considering these collectively. Systems-level changes to strengthen programs' capacity to prevent risk and implications for future research are discussed.
ABSTRACT
BACKGROUND: Early assessment of patients with suspected transient ischaemic attack (TIA) is crucial to provision of effective care, including initiation of preventive therapies and identification of stroke mimics. Many patients with TIA present to emergency medical services (EMS) but may not require hospitalisation. Paramedics could identify and refer patients with low-risk TIA, without conveyance to the ED. The safety and effectiveness of this model are unknown. AIM: To assess the feasibility of undertaking a fully powered randomised controlled trial (RCT) to evaluate clinical and cost-effectiveness of paramedic referral of patients who call EMS with low-risk TIA to a TIA clinic, avoiding transfer to the ED. METHODS: The Transient Ischaemic attack Emergency Referral (TIER) intervention was developed through a survey of UK ambulance services, a scoping review of evidence on prehospital care of TIA, and convening a specialist clinical panel to agree its final form. Paramedics in South Wales, UK, were randomly allocated to trial intervention (TIA clinic referral) or control (usual care) arms, with patients' allocation determined by that of attending paramedics. Predetermined progression criteria considered: proportion of patients referred to TIA clinic, data retrieval, patient satisfaction and potential cost-effectiveness. RESULTS: From December 2016 to September 2017, 89 paramedics recruited 53 patients (36 intervention; 17 control); 48 patients (31 intervention; 17 control) consented to follow-up via routine data. Three intervention patients, of seven deemed eligible, were referred to TIA clinic by paramedics. Contraindications recorded for the other intervention arm patients were: Face/Arms/Speech/Time positive (n=13); ABCD2 score >3 (n=5); already anticoagulated (n=2); crescendo TIA (n=1); other (n=8). Routinely collected electronic health records, used to report further healthcare contacts, were obtained for all consenting patients.
Patient-reported satisfaction with care was higher in the intervention arm (mean 4.8/5) than the control arm (mean 4.2/5). Health economic analysis suggests an intervention arm quality-adjusted life-year loss of 0.0094 (95% CI -0.0371, 0.0183), p=0.475. CONCLUSION: The TIER feasibility study did not meet its progression criteria, largely due to low patient identification and referral rates. A fully powered RCT in this setting is not recommended. TRIAL REGISTRATION NUMBER: ISRCTN85516498.
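The ABCD2 threshold used as a referral contraindication above can be made concrete. The following is an illustrative implementation of the standard published ABCD2 scheme (Age, Blood pressure, Clinical features, Duration, Diabetes), not code from the TIER study itself:

```python
def abcd2(age_ge_60, bp_ge_140_90, unilateral_weakness,
          speech_disturbance_no_weakness, duration_min, diabetes):
    """ABCD2 TIA risk score (0-7), per the standard published scheme.

    Age >= 60: 1 point; BP >= 140/90 mmHg: 1 point; unilateral weakness:
    2 points (or speech disturbance without weakness: 1 point); symptom
    duration >= 60 min: 2 points (10-59 min: 1 point); diabetes: 1 point.
    In TIER, a score > 3 excluded patients from paramedic clinic referral.
    """
    score = 0
    score += 1 if age_ge_60 else 0
    score += 1 if bp_ge_140_90 else 0
    if unilateral_weakness:
        score += 2
    elif speech_disturbance_no_weakness:
        score += 1
    if duration_min >= 60:
        score += 2
    elif duration_min >= 10:
        score += 1
    score += 1 if diabetes else 0
    return score
```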
ABSTRACT
The photochemical and photoelectrochemical reduction of CO2 is a promising approach for converting carbon dioxide into valuable chemicals (materials) and fuels. A key issue is ensuring the accuracy of experimental results in CO2 reduction reactions (CO2RRs) because of potential sources of false positives. This paper reports the results of investigations on various factors that may contribute to erroneous attribution of reduced-carbon species, including degradation of carbon species contained in photocatalysts, residual contaminants from synthetic procedures, laboratory glassware, environmental exposure, and the operator. The importance of rigorous experimental protocols, including the use of labeled 13CO2 and blank tests, to identify true CO2 reduction products (CO2RPs) accurately is highlighted. Our experimental data (eventually complemented with or compared to literature data) underline the possible sources of errors and, whenever possible, quantify the false positives with respect to the effective conversion of CO2 in clean conditions. This paper clarifies that the incidence of false positives is higher in the preliminary phase of photo-material development, when CO2RPs are in the range of a few tens of µg gcat-1 h-1, and becomes less important when significant conversions of CO2, reaching tens of mol gcat-1 h-1, are achieved. This paper suggests procedures for improving the reliability and reproducibility of CO2RR experiments, thus validating such technologies.
ABSTRACT
Background: This study aims to determine a generalized outcome and risk profile for patients undergoing orthognathic surgery for the definitive treatment of cleft lip and palate. Furthermore, we hope to determine the key factors that increase risk for cleft lip and palate patients undergoing orthognathic surgery. Methods: This study includes a systematic review using PubMed, MEDLINE, Cochrane, and Scopus. Data curation utilized Covidence software, with dual-reviewer screening and conflict resolution by a third party, focusing on publications with full texts available. Results: The initial search yielded 1697 articles. Following title, abstract, and full-text screening, a total of 62 articles were included in this review. A total of 70.9% of included articles had a moderate risk of bias, with the rest having a low risk of bias. The sample consisted of 2550 patients with an average age of about 20 years and an average follow-up of 16.8 months. The most employed procedure was Le Fort I osteotomy (99%). In terms of velopharyngeal function, there were notable increases in insufficiency and severity scores, with an average 63% worsening from baseline. That being said, patients experienced an average 33% improvement in speech articulation. Furthermore, the average horizontal movement was reported to be 6.09 mm with a subsequent relapse of 0.98 mm overall. Conclusions: This systematic review distills data from 62 articles and 2550 patients. It highlights the efficacy of orthognathic surgery in addressing oropharyngeal and aesthetic deficits. This study identifies relapse and velopharyngeal insufficiency as recurrent complications. These insights inform surgical refinement and patient counseling, laying a foundation for enhanced clinical protocols.
ABSTRACT
There is a need to utilise formal education to ensure and support the effective participation of communities in the disaster risk management process. The negative outcomes of various disasters in Türkiye show that society is inadequately prepared. Formal education activities therefore offer the most effective means of preparing for disasters. In this study, the content and infrastructure of a university-level curriculum for the management of disaster risks are presented at the conceptual level. A disaster literacy curriculum can contribute to the management of current and future disaster risks. However, there is a need to expand the implementation of the curriculum and to measure its effectiveness and feasibility as a public health intervention tool. Finally, the support of the national education system needs to be ensured.
Subjects
Risk Management, Humans, Risk Management/methods, Curriculum/trends, Literacy, Disaster Planning/methods
ABSTRACT
Until recently, plastic pollution research was focused on marine environments, and terrestrial and freshwater environments received attention only later. This discussion paper aims to put forward crucial questions on issues that limit our ability to conduct reliable plastic ecological risk assessments in rivers. Previous studies highlighted the widespread presence of plastics in rivers, but the sources and levels of exposure remain matters of debate. Field measurements have been carried out on the concentration and composition of plastics in rivers, but greater homogeneity in the choice of plastic sizes, particularly for microplastics by following the recent ISO international standard nomenclature, is needed for better comparison between studies. The development of additional relevant sampling strategies that are suited to the specific characteristics of riverine environments is also needed. Similarly, we encourage the systematic real-time monitoring of environmental conditions (e.g., topology of the sampling section of the river, hydrology, volumetric flux and velocity, suspended matter concentration) to better understand the origin of variability in plastic concentrations in rivers. Furthermore, ingestion of microplastics by freshwater organisms has been demonstrated under laboratory conditions, but the long-term effects of continuous microplastic exposure in organisms are less well understood. This discussion paper encourages an integrative view of the issues involved in assessing plastic exposure and its effects on biota, in order to improve our ability to carry out relevant ecological risk assessments in river environments.
ABSTRACT
BACKGROUND: Hospitals should adopt multiple methods to monitor incidents for a comprehensive review of the types of incidents that occur. In contrast to traditional incident reporting systems, the Green Cross (GC) method is a simple visual method to recognise incidents based on teamwork and safety briefings. Its longitudinal effect on patient safety culture has not been previously assessed. This study aimed to explore whether the implementation of the GC method in a postanaesthesia care unit changed nurses' perceptions of different factors associated with patient safety culture over 4 years. METHODS: A longitudinal quasi-experimental pre-post intervention design with a comparison group was used. The intervention unit and the comparison group, both consisting of nurses, were recruited from the surgical department of a Norwegian university hospital. The intervention unit implemented the GC method in February 2019. Both groups responded to the staff survey before and then annually between 2019 and 2022 on the factors 'work engagement', 'teamwork climate' and 'safety climate'. The data were analysed using logistic regression models. RESULTS: Within the intervention unit, relative to the changes in the comparison group, the results indicated significant large positive changes in all factor scores in 2019, no changes in 2020, significant large positive changes in 'work engagement' and 'safety climate' scores in 2021 and a significant medium positive change in 'work engagement' in 2022. At baseline, the comparison group had a significantly lower score in 'safety climate' than the intervention unit, but no significant baseline differences were found between the groups regarding 'work engagement' and 'teamwork climate'. CONCLUSION: The results suggest that the GC method had a positive effect on the nurses' perception of factors associated with patient safety culture over a period of 4 years.
The positive effect was completely sustained in 'work engagement' but was somewhat less persistent in 'teamwork climate' and 'safety climate'.
Subjects
Patient Safety, Safety Management, Humans, Longitudinal Studies, Patient Safety/statistics & numerical data, Patient Safety/standards, Norway, Male, Safety Management/methods, Safety Management/standards, Safety Management/statistics & numerical data, Female, Adult, Surveys and Questionnaires, Organizational Culture, Middle Aged
ABSTRACT
BACKGROUND: Patient safety is a critical concern in dentistry. Adverse events (AEs) can harm patients, increase costs, and decrease satisfaction. Understanding AE types and frequencies is crucial for effective risk management and quality improvement. This study analyzes incident reports to identify preliminary incident patterns as a starting point for developing risk management strategies. However, under-reporting limits the ability to identify true incident patterns, highlighting the need for improved reporting systems and encouragement of incident reporting. Further research is underway to develop such a system and promote reporting to ensure sufficient data quality for effective risk management. METHODS: A retrospective analysis of 1,618 incident reports from December 2018 to August 2023 was conducted. A validated classification system, developed from a 5-year retrospective analysis and approved by 14 experts, categorized patient safety incidents, aligning with Thailand's Hospital Accreditation standards. Descriptive statistics summarized AE frequency and distribution. RESULTS: Of the reports, 752 were patient safety, 503 personnel safety, and 363 organizational safety incidents. Top patient safety incidents included medical record errors (176), accidental damage (66), post-operative complications (65), medical emergencies (64), and communication errors (53). Personnel safety incidents involved inappropriate working conditions (135) and work-related injuries with contact transmission risk (117). Organizational safety incidents mainly concerned policy and operational processes (131). CONCLUSIONS: This study reveals the preliminary patterns of adverse events (AEs) in dental settings and underscores the limitations due to under-reporting, which affect the ability to fully understand true incident patterns. To effectively manage risks, there is a critical need to improve the existing incident reporting system and to encourage a culture of comprehensive reporting among dental professionals. Future efforts should focus on enhancing reporting systems to ensure high-quality data, enabling better identification of incident trends and supporting targeted risk management strategies to improve patient safety in dentistry.
Subjects
Medical Errors, Patient Safety, Risk Management, Retrospective Studies, Humans, Medical Errors/statistics & numerical data, Medical Errors/classification, Thailand, Schools, Dental, Quality Improvement
ABSTRACT
INTRODUCTION: The National Early Warning Score (NEWS/2) system was developed to enable the detection and early intervention of patients at risk of clinical deterioration. It has demonstrated good accuracy in identifying imminent critical outcomes but has limitations in its applicability to various patient types and in its ability to predict deterioration beyond 24 hours. Various studies have attempted to improve its predictive accuracy and clinical utility by modifying or adding variables to the standard NEWS/2 system. The purpose of this scoping review is to identify modifications to the NEWS and NEWS2 systems (e.g., the inclusion of additional patient demographic, physiological or other characteristics) and how those modifications influence predictive accuracy, to provide an evidence base for subsequent improvement of the system. METHODS AND ANALYSIS: The review will be structured using the Preferred Reporting Items for Systematic Reviews and Meta-Analyses extension for Scoping Reviews (PRISMA-ScR) and the Population, Intervention, Comparator, Outcome, Study design (PICOS) frameworks. Six databases (PubMed, ScienceDirect, Embase, CINAHL, Web of Science and Cochrane Library) will be searched in April 2024 for articles published in English. Article screening and data extraction will be conducted by two independent reviewers, with any conflicts resolved by discussion. The analysis will be descriptive, providing a summary of the modifications identified and their influence on the predictive accuracy of NEWS/NEWS2. ETHICS AND DISSEMINATION: Ethical approval is not required as data will be obtained from already published sources. Findings from this study will be disseminated via publication in a peer-reviewed journal.
Subjects
Early Warning Score, Humans, Research Design, Clinical Deterioration, Systematic Reviews as Topic
ABSTRACT
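For context on the record above: the modifications under review build on the standard (unmodified) NEWS2 aggregation, which sums banded points over seven physiological observations. A minimal sketch is given below; the band thresholds follow the published Royal College of Physicians NEWS2 chart using SpO2 Scale 1 only (Scale 2, for hypercapnic respiratory failure, is not handled), and the function name and parameter names are illustrative, not from the source.

```python
def news2_score(resp_rate, spo2, on_oxygen, systolic_bp, pulse, alert, temperature):
    """Standard NEWS2 aggregate score (RCP chart, SpO2 Scale 1)."""
    score = 0

    # Respiration rate (breaths/min)
    if resp_rate <= 8 or resp_rate >= 25:
        score += 3
    elif 21 <= resp_rate <= 24:
        score += 2
    elif 9 <= resp_rate <= 11:
        score += 1

    # Oxygen saturation, Scale 1 (%)
    if spo2 <= 91:
        score += 3
    elif spo2 <= 93:
        score += 2
    elif spo2 <= 95:
        score += 1

    # Supplemental oxygen in use
    if on_oxygen:
        score += 2

    # Systolic blood pressure (mmHg)
    if systolic_bp <= 90 or systolic_bp >= 220:
        score += 3
    elif systolic_bp <= 100:
        score += 2
    elif systolic_bp <= 110:
        score += 1

    # Pulse (beats/min)
    if pulse <= 40 or pulse >= 131:
        score += 3
    elif 111 <= pulse <= 130:
        score += 2
    elif pulse <= 50 or 91 <= pulse <= 110:
        score += 1

    # Consciousness: anything other than Alert on the ACVPU scale scores 3
    if not alert:
        score += 3

    # Temperature (deg C)
    if temperature <= 35.0:
        score += 3
    elif temperature >= 39.1:
        score += 2
    elif temperature <= 36.0 or 38.1 <= temperature <= 39.0:
        score += 1

    return score
```

A fully normal observation set scores 0; the maximum attainable score is 20. The studies covered by the review typically re-weight these bands or add variables (age, comorbidities, laboratory values) on top of this baseline sum.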
This paper presents a new conceptual framework, and a stepwise approach to populate it, for informing countermeasure development to support fitness-to-drive for professional drivers. Professional drivers are vital to the transport network; however, the job is demanding and drivers are vulnerable to impairments that may compromise safe driving. Countermeasures are any action or activity that mitigates the impact or frequency of occurrence of driver impairment. The framework proposes countermeasures to be delivered across three time points, Operational (during shift), Tactical (immediately after shift) and Strategic (outside of shift), and at multiple system levels (e.g., driver, manager, enforcement). The framework was successfully pilot tested with three different professional driver use cases: autonomous shuttles, taxis, and garbage trucks. This structured approach to countermeasure design offers potential to improve driver health and enhance road safety. The work was conducted within PANACEA, an EU project, grant agreement number 953426.