Results 1 - 20 of 42
1.
Sensors (Basel) ; 24(3)2024 Jan 23.
Article in English | MEDLINE | ID: mdl-38339436

ABSTRACT

This paper proposes a monitoring procedure based on characterizing state probability distributions estimated using particle filters. The work highlights what types of information can be obtained during state estimation and how the revealed information helps to solve fault diagnosis tasks. If a failure is present in the system, the output predicted by the model is inconsistent with the actual output, which affects the operation of the estimator. The heterogeneity of the probability distribution of states increases, and a large proportion of the particles lose their information content. The correlation structure of the posterior probability density can also be altered by failures. The proposed method uses various indicators that characterize the heterogeneity and correlation structure of the state distribution, as well as the consistency between model predictions and observed behavior, to identify the effects of failures. The applicability of the utilized measures is demonstrated through a dynamic vehicle model, where actuator and sensor failure scenarios are investigated.
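The abstract does not spell out the indicators, so as a minimal sketch (not the paper's actual implementation), the snippet below computes common particle-filter health measures that capture the effects described above: the effective sample size and normalized weight entropy as heterogeneity indicators, and the innovation norm as a prediction-observation consistency indicator. All names and numbers are illustrative assumptions.

```python
import numpy as np

def weight_degeneracy_indicators(weights):
    """Indicators of how heterogeneous the particle weights are."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()                               # normalise, just in case
    ess = 1.0 / np.sum(w ** 2)                    # effective sample size
    entropy = -np.sum(w * np.log(w + 1e-12))      # Shannon entropy of the weights
    entropy_norm = entropy / np.log(len(w))       # 1.0 = uniform, ~0 = degenerate
    return ess, entropy_norm

def innovation_norm(y_measured, y_predicted):
    """Consistency between observed and model-predicted outputs."""
    return float(np.linalg.norm(np.asarray(y_measured) - np.asarray(y_predicted)))

# toy usage: a healthy (near-uniform) and a degenerate weight vector
healthy = np.full(500, 1 / 500)
faulty = np.r_[0.9, np.full(499, 0.1 / 499)]
print(weight_degeneracy_indicators(healthy))   # ESS ~ 500, normalized entropy ~ 1.0
print(weight_degeneracy_indicators(faulty))    # ESS ~ 1.2, normalized entropy << 1.0
```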

2.
Entropy (Basel) ; 26(7)2024 Jun 30.
Article in English | MEDLINE | ID: mdl-39056933

ABSTRACT

This paper highlights that metrics from the machine learning field (e.g., entropy and information gain), normally used to assess the quality of a classifier model, can be used to evaluate the effectiveness of separation systems. To evaluate the efficiency of separation systems and their unit operations, entropy- and information-gain-based metrics were developed. The receiver operating characteristic (ROC) curve is used to determine the optimal cut point in a separation system. The proposed metrics are verified by simulation experiments conducted on a stochastic model of a waste-sorting system.
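As an illustration of how such metrics can be computed, the sketch below evaluates the entropy of a material stream, the information gain of one separation step, and a ROC-based cut point using Youden's J statistic. The counts, scores and the use of scikit-learn are assumptions for this example, not details taken from the paper.

```python
import numpy as np
from sklearn.metrics import roc_curve

def entropy(p):
    """Shannon entropy of a class/material distribution (in bits)."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0] / p.sum()
    return float(-np.sum(p * np.log2(p)))

def information_gain(parent_counts, child_counts):
    """Entropy reduction achieved by splitting a material stream."""
    parent = np.asarray(parent_counts, dtype=float)
    children = [np.asarray(c, dtype=float) for c in child_counts]
    n = parent.sum()
    weighted = sum(c.sum() / n * entropy(c) for c in children)
    return entropy(parent) - weighted

# one separation unit: input of 60 target / 40 non-target items split into two outputs
print(information_gain([60, 40], [[50, 10], [10, 30]]))

# ROC-based cut point for a per-item separation score (e.g. a sensor reading)
y_true = np.array([1, 1, 1, 0, 0, 1, 0, 0, 1, 0])
score = np.array([0.9, 0.8, 0.7, 0.6, 0.55, 0.5, 0.4, 0.3, 0.8, 0.2])
fpr, tpr, thr = roc_curve(y_true, score)
best = thr[np.argmax(tpr - fpr)]   # Youden's J as one possible optimality criterion
print("optimal cut point:", best)
```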

3.
Sensors (Basel) ; 23(5)2023 Feb 23.
Article in English | MEDLINE | ID: mdl-36904704

ABSTRACT

This paper describes a framework for detecting welding errors using 3D scanner data. The proposed approach employs density-based clustering to compare point clouds and identify deviations. The discovered clusters are then classified according to standard welding fault classes. Six welding deviations defined in the ISO 5817:2014 standard were evaluated. All defects were represented through CAD models, and the method was able to detect five of these deviations. The results demonstrate that the errors can be effectively identified and grouped according to the location of the different points in the error clusters. However, the method cannot separate crack-related defects as a distinct cluster.
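A hedged sketch of the core step, under the assumption that the comparison is done by nearest-neighbour distances between the scanned and nominal (CAD-derived) point clouds, followed by DBSCAN on the deviating points; the thresholds and synthetic data are illustrative, not the paper's values.

```python
import numpy as np
from scipy.spatial import cKDTree
from sklearn.cluster import DBSCAN

def deviation_clusters(scan_pts, nominal_pts, dev_threshold=0.5, eps=1.0, min_samples=10):
    """Return cluster labels for scan points deviating from the nominal geometry."""
    tree = cKDTree(nominal_pts)
    dist, _ = tree.query(scan_pts)              # distance of each scan point to the CAD model
    deviating = scan_pts[dist > dev_threshold]  # candidate defect points
    if len(deviating) == 0:
        return deviating, np.array([])
    labels = DBSCAN(eps=eps, min_samples=min_samples).fit_predict(deviating)
    return deviating, labels                    # label -1 is treated as noise

# toy usage with random data standing in for the 3D scan and the CAD reference
rng = np.random.default_rng(0)
nominal = rng.uniform(0, 100, size=(5000, 3))
scan = np.vstack([nominal + rng.normal(0, 0.1, nominal.shape),       # good surface
                  rng.normal([50, 50, 52], 0.3, size=(200, 3))])     # a local "defect" blob
pts, labels = deviation_clusters(scan, nominal)
print(len(pts), "deviating points in", len(set(labels) - {-1}), "cluster(s)")
```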

4.
Sensors (Basel) ; 22(11)2022 Jun 03.
Article in English | MEDLINE | ID: mdl-35684889

ABSTRACT

This research presents a framework that supports the development and operation of machine-learning (ML) algorithms to develop, maintain and manage the whole lifecycle of software-sensor models related to complex chemical processes. Our motivation is to take advantage of ML and edge computing and offer innovative solutions to the chemical industry for difficult-to-measure laboratory variables. The purpose of software-sensor models is to continuously forecast the quality of products to achieve effective quality control, maintain the stable production condition of plants, and support efficient, environmentally friendly, and harmless laboratory work. The literature review shows that quite a few ML models have been developed in recent years that support the quality assurance of different types of materials; however, the problems of continuous operation, maintenance and version control of these models have not yet been solved. The method uses ML algorithms and takes advantage of cloud services in an enterprise environment. Industry 4.0 technologies such as the Internet of Things (IoT), edge computing, cloud computing, ML, and artificial intelligence (AI) are its core techniques. The article outlines an information system structure and the related methodology based on data from a quality-assurance laboratory. During the development, we encountered several challenges resulting from the continuous development of ML models and the tuning of their parameters. The article discusses the development, version control, validation, lifecycle, and maintenance of ML models and presents a case study. The developed framework can continuously monitor the performance of the models and increase the amount of data on which the models are built. As a result, the most accurate, data-driven and up-to-date models are always available to quality-assurance engineers.
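The abstract describes the framework only at a high level; the toy sketch below illustrates one element of such a lifecycle, a software sensor that monitors its own rolling prediction error against incoming laboratory references and retrains when drift is detected. The class, thresholds and Ridge model are assumptions made for illustration, not the paper's architecture.

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.metrics import mean_absolute_error

class SoftSensor:
    """Toy software sensor with a monitor-and-retrain loop."""
    def __init__(self, error_limit=1.0, window=50):
        self.model = Ridge()
        self.error_limit = error_limit
        self.window = window
        self.X_hist, self.y_hist = [], []

    def fit(self, X, y):
        self.model.fit(X, y)
        self.X_hist, self.y_hist = list(X), list(y)

    def predict_and_monitor(self, X_new, y_lab):
        """Predict, store the new lab reference data and decide whether to retrain."""
        y_pred = self.model.predict(X_new)
        self.X_hist.extend(X_new)
        self.y_hist.extend(y_lab)
        recent_err = mean_absolute_error(self.y_hist[-self.window:],
                                         self.model.predict(self.X_hist[-self.window:]))
        if recent_err > self.error_limit:      # drift detected -> train a new model version
            self.model.fit(self.X_hist, self.y_hist)
        return y_pred, recent_err

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 3))
y = X @ [1.0, -2.0, 0.5] + rng.normal(0, 0.1, 200)
sensor = SoftSensor()
sensor.fit(X[:100], y[:100])
print(sensor.predict_and_monitor(X[100:110], y[100:110])[1])
```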


Subject(s)
Artificial Intelligence, Internet of Things, Cloud Computing, Machine Learning, Software
5.
Sensors (Basel) ; 23(1)2022 Dec 27.
Article in English | MEDLINE | ID: mdl-36616880

ABSTRACT

One of the main challenges of Industry 4.0 is how advanced sensors and sensing technologies can be applied to the Internet of Things layers of existing manufacturing systems. This is the so-called Brownfield Industry 4.0, where machines and processes of different types and ages need to be digitalized. Smart retrofitting is the umbrella term for solutions that digitalize existing manufacturing machines. This problem is critical for solutions that support human workers. The Operator 4.0 concept shows how we can efficiently support workers on the shop floor. The key indicator is the readiness level of a company, and the main bottleneck is the technical knowledge of the employees. This study proposes an education framework and a related Operator 4.0 laboratory that prepares students for the development and application of Industry 5.0 technologies. The concept of intelligent space is proposed as the basis of the educational framework, which can solve the problem of monitoring the stochastic nature of operators in production processes. The components of the intelligent space are detailed through the layers of the IoT in the form of a case study conducted at the laboratory. The applicability of indoor positioning systems is described with the integration of machine-, operator- and environment-based sensor data to obtain real-time information from the shop floor. The digital twin of the laboratory is developed in a discrete-event simulator, which integrates the data from the shop floor and can control production based on the simulation results. The presented framework can be utilized to design education for the Industry 5.0 generation.


Subject(s)
Industry, Students, Humans, Commerce, Computer Simulation, Intelligence
6.
J Environ Manage ; 323: 116165, 2022 Dec 01.
Article in English | MEDLINE | ID: mdl-36116263

ABSTRACT

Climate change can cause multiple potential health issues in urban areas, which are among the environments most susceptible to the presently increasing climate volatility. Urban greening strategies form an important part of the adaptation strategies that can ameliorate the negative impacts of climate change. The aim was to study the potential impacts of different kinds of greening against the adverse effects of climate change, including waterborne and vector-borne diseases, heat-related mortality, and surface ozone concentration, in a medium-sized Hungarian city. As greening strategies, large and pocket parks were considered, based on our novel location-identifier algorithm for climate risk minimization. A method based on publicly available data sources, including satellite images, climate scenarios and the urban macrostructure, was developed to evaluate the health-related indicator patterns in cities. The modelled future and current patterns of the indicators were compared. The results can help in understanding the possible future state of the studied indicators and in developing adequate greening strategies. Another outcome of the study is that it is not the type of health indicator but its climate sensitivity that determines the extent to which it responds to temperature rises and how effective greening strategies are in addressing the expected problem posed by the factor.


Subject(s)
Climate Change, Ozone, Cities, Health Impact Assessment, Hot Temperature, Ozone/analysis, Temperature, Urban Health
7.
Sensors (Basel) ; 21(10)2021 May 12.
Article in English | MEDLINE | ID: mdl-34065951

ABSTRACT

The targeted shortening of sensor development requires short and convincing verification tests. The goal of developing novel verification methods is to avoid or reduce an excessive amount of testing and to identify tests that guarantee that the assumed failure will not happen in practice. In this paper, a method is presented that yields the test loads for such a verification. The method starts by identifying the robustness-related requirements of the product from precise descriptions of the use-case scenarios in which the product is assumed to be working. Based on the logic of the Quality Function Deployment (QFD) method, a step-by-step procedure has been developed to translate the robustness requirements through the changes in design parameters, the phenomena causing them, and the physical quantities behind these phenomena, down to the test loads of the verification. The developed method is applied to the test plan of an automotive sensor. The method is general and can be used for any part of a vehicle, including mechanical, electrical and mechatronic ones, such as sensors and actuators. Nonetheless, the method is applicable in a much broader application area, even outside the automotive industry.

8.
Sensors (Basel) ; 20(23)2020 Nov 26.
Article in English | MEDLINE | ID: mdl-33256090

ABSTRACT

Real-time monitoring and optimization of production and logistics processes significantly improve the efficiency of production systems. Advanced production management solutions require real-time information about the status of products, production, and resources. As real-time locating systems (also referred to as indoor positioning systems) can enrich the available information, these systems have started to gain attention in industrial environments in recent years. This paper provides a review of the possible technologies and applications related to production control and logistics, quality management, safety, and efficiency monitoring. This work also provides a workflow to clarify the steps of a typical real-time locating system project, including the cleaning, pre-processing, and analysis of the data, to provide a guideline and reference for the research and development of indoor-positioning-based manufacturing solutions.
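As a rough illustration of the cleaning, pre-processing and analysis steps of such a workflow, the pandas sketch below filters gross outliers against the known hall geometry, resamples and smooths the track, and summarizes zone occupancy. The column names, zone grid and thresholds are assumptions, not details from the review.

```python
import numpy as np
import pandas as pd

# toy raw RTLS stream: timestamped (x, y) fixes of one tag, with a few gross errors
rng = np.random.default_rng(2)
raw = pd.DataFrame({
    "time": pd.date_range("2024-01-01 08:00", periods=600, freq="s"),
    "x": np.clip(rng.normal(20, 5, 600), 0, None),
    "y": np.clip(rng.normal(10, 3, 600), 0, None),
})
raw.loc[::97, ["x", "y"]] = 999.0                      # simulated gross errors

# 1) cleaning: drop fixes outside the known hall geometry
hall = raw[(raw.x.between(0, 60)) & (raw.y.between(0, 30))]

# 2) pre-processing: resample to 5 s and smooth with a short rolling median
track = (hall.set_index("time")[["x", "y"]]
             .resample("5s").mean()
             .rolling(3, min_periods=1).median())

# 3) analysis: time spent per zone (here, simple 10 m grid cells)
zones = (track // 10).astype("Int64")
occupancy = zones.value_counts().mul(5)                # seconds per zone
print(occupancy.head())
```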

9.
J Environ Manage ; 263: 110414, 2020 Jun 01.
Article in English | MEDLINE | ID: mdl-32174539

ABSTRACT

Countries have to work out and follow tailored strategies for the achievement of their Sustainable Development Goals. At the end of 2018, more than 100 voluntary national reviews had been published. The reviews are transformed by text mining algorithms into networks of keywords to identify country-specific thematic areas of the strategies and to cluster countries that face similar problems and follow similar development strategies. The analysis of the 75 VNRs has shown that SDG5 (gender equality) is the most discussed goal worldwide, as it is discussed in 77% of the analysed Voluntary National Reviews. SDG8 (decent work and economic growth) is the second most discussed goal, at 76%, while SDG1 (no poverty) is the least discussed goal, mentioned in only 48% of the documents, and SDG10 (reduced inequalities) in 49%. The results demonstrate that the proposed benchmark tool is capable of highlighting what kinds of activities can make significant contributions to achieving sustainable development.
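The exact text-mining pipeline is not given in the abstract; a minimal sketch of the general idea, building a keyword co-occurrence network and a country-similarity matrix from toy review texts, could look as follows. The bigram "keywords" and the toy documents are illustrative assumptions.

```python
import itertools
import networkx as nx
from sklearn.feature_extraction.text import CountVectorizer

# toy stand-ins for (pre-processed) voluntary national review texts
reviews = {
    "Country A": "gender equality decent work poverty education",
    "Country B": "gender equality climate action decent work",
    "Country C": "poverty inequality education health",
}

# keyword occurrence matrix (documents x terms)
vec = CountVectorizer(ngram_range=(2, 2))          # bigrams as crude "keywords"
X = (vec.fit_transform(reviews.values()) > 0).astype(int)
terms = vec.get_feature_names_out()

# keyword co-occurrence network: edge weight = number of reviews sharing both keywords
C = (X.T @ X).toarray()
G = nx.Graph()
for i, j in itertools.combinations(range(len(terms)), 2):
    if C[i, j] > 0:
        G.add_edge(terms[i], terms[j], weight=int(C[i, j]))

# country similarity for clustering: number of keywords shared by pairs of reviews
S = (X @ X.T).toarray()
print(sorted(G.edges(data=True))[:3])
print(S)
```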


Subject(s)
Economic Development, Sustainable Development, Data Mining, Poverty, Socioeconomic Factors
10.
J Environ Manage ; 238: 126-135, 2019 May 15.
Article in English | MEDLINE | ID: mdl-30849597

ABSTRACT

Strategic environmental assessment is a decision-support technique that evaluates policies, plans and programs in addition to identifying the most appropriate interventions in different scenarios. This work develops a network-based model to study interlinked ecological, economic, environmental and social problems and to highlight the synergies between policies, plans, and programs in environmental strategic planning. Our primary goal is to propose a methodology for the data-driven verification and extension of expert knowledge concerning the interconnectedness of the sustainable development goals and their related targets. A multilayer network model based on the time-series indicators of the World Bank open data over the last 55 years was assembled. By providing an objective and data-driven view of the correlated World Bank variables, the proposed layered multipartite network model highlights previously undiscussed interconnections, node centrality measures evaluate the importance of the targets, and network community detection algorithms reveal their strongly connected groups. The results confirm that the proposed methodology can serve as a data-driven decision-support tool for the preparation and monitoring of long-term environmental policies. The developed data-driven network model enables multi-level analysis of sustainability (goals, targets, indicators) and supports long-term environmental strategic planning. Through the relationships among indicators, the relationships among targets and goals can be modelled. The results show that the sustainable development goals are strongly interconnected, while the 5th goal (gender equality) is linked mostly to the 17th goal (partnerships for the goals). The analysis has also highlighted the importance of the 4th goal (quality education).
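A simplified sketch of the network-construction step, assuming edges are drawn between indicators whose time series are strongly correlated, with centrality and community detection applied afterwards; the indicator names, threshold and synthetic data are illustrative, not the World Bank series used in the paper.

```python
import numpy as np
import pandas as pd
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

# toy stand-in for World Bank indicator time series (years x indicators)
rng = np.random.default_rng(3)
years = 55
base = rng.normal(size=years).cumsum()
data = pd.DataFrame({
    "gdp_growth": base + rng.normal(0, 0.3, years),
    "school_enrolment": base + rng.normal(0, 0.5, years),
    "co2_per_capita": -base + rng.normal(0, 0.4, years),
    "female_labour_share": rng.normal(size=years).cumsum(),
})

# build a network: indicators are nodes, strong correlations are edges
corr = data.corr().abs()
G = nx.Graph()
for i in corr.index:
    for j in corr.columns:
        if i < j and corr.loc[i, j] > 0.7:          # arbitrary threshold for illustration
            G.add_edge(i, j, weight=corr.loc[i, j])

print(nx.degree_centrality(G))                      # importance of indicators/targets
communities = greedy_modularity_communities(G, weight="weight")
print([sorted(c) for c in communities])             # strongly connected groups
```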


Subject(s)
Conservation of Natural Resources, Goals, Ecology, Environmental Policy, Sustainable Development
11.
Sensors (Basel) ; 18(7)2018 Jul 19.
Article in English | MEDLINE | ID: mdl-30029510

ABSTRACT

Industry 4.0-based human-in-the-loop cyber-physical production systems are transforming the industrial workforce to accommodate the ever-increasing variability of production. Real-time operator support and performance monitoring require accurate information on the activities of operators. Tracing hundreds of activity times is a critical problem due to the enormous variability and complexity of products. To handle this problem, a software-sensor-based activity-time and performance measurement system is proposed. To ensure a real-time connection between operator performance and varying product complexity, fixture sensors and an indoor positioning system (IPS) were designed, and these multi-sensor data were merged with product-relevant information. The proposed model-based performance monitoring system tracks the recursively estimated parameters of the activity-time estimation model. As the estimation problem can be ill-conditioned and poor raw sensor data can result in unrealistic parameter estimates, constraints were introduced into the parameter-estimation algorithm to increase the robustness of the software sensor. The applicability of the proposed methodology is demonstrated on a well-documented benchmark problem of a wire-harness manufacturing process. The fully reproducible and realistic simulation study confirms that the indoor positioning system-based integration of primary sensor signals and product-relevant information can be efficiently utilized for the constrained recursive estimation of operator activity.
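A minimal sketch of the constrained recursive-estimation idea, assuming a linear activity-time model (cycle time = activity counts times activity times) and a recursive least-squares update whose estimates are projected onto physical bounds; the wire-harness numbers and bounds are invented for illustration.

```python
import numpy as np

class ConstrainedRLS:
    """Recursive least squares with simple projection of the estimates onto bounds."""
    def __init__(self, lower, upper, forgetting=0.99):
        self.lower = np.asarray(lower, dtype=float)
        self.upper = np.asarray(upper, dtype=float)
        self.theta = (self.lower + self.upper) / 2.0       # initial activity-time guesses
        self.P = np.eye(len(self.theta)) * 1e3
        self.lam = forgetting

    def update(self, x, y):
        x = np.asarray(x, dtype=float)
        k = self.P @ x / (self.lam + x @ self.P @ x)        # gain
        self.theta = self.theta + k * (y - x @ self.theta)  # parameter update
        self.P = (self.P - np.outer(k, x) @ self.P) / self.lam
        self.theta = np.clip(self.theta, self.lower, self.upper)  # keep estimates physical
        return self.theta

# toy example: x = number of each activity type in a cycle, y = measured cycle time;
# the "true" activity times are 12 s, 30 s and 8 s
rng = np.random.default_rng(4)
true_t = np.array([12.0, 30.0, 8.0])
rls = ConstrainedRLS(lower=[1, 1, 1], upper=[120, 120, 120])
for _ in range(300):
    counts = rng.integers(0, 5, size=3)
    cycle_time = counts @ true_t + rng.normal(0, 2.0)
    theta = rls.update(counts, cycle_time)
print(theta)   # should approach the true activity times
```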

12.
Sensors (Basel) ; 18(9)2018 Sep 14.
Article in English | MEDLINE | ID: mdl-30223464

ABSTRACT

Network science-based analysis of the observability of dynamical systems has been a focus of attention over the past five years. The maximum matching-based approach provides a simple tool to determine the minimum number of sensors and their positions. However, the resulting proportion of sensors is particularly small compared to the size of the system, and, although structural observability is ensured, the system demands additional sensors to provide the small relative order needed for fast and robust process monitoring and control. In this paper, two clustering- and simulated annealing-based methodologies are proposed to assign additional sensors to dynamical systems. The proposed methodologies simplify the observation of the system and decrease its relative order. The usefulness of the proposed method is demonstrated on a sensor-placement problem of a heat exchanger network. The results show that the relative order of the observability is decreased significantly by an increase in the number of additional sensors.
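For reference, the maximum-matching step mentioned above (not the clustering or simulated-annealing extensions) can be sketched as follows, using the standard bipartite construction; taking the unmatched nodes on the transposed state digraph as the sensor nodes is a modelling assumption made here by appealing to the controllability-observability duality.

```python
import networkx as nx

def sensor_nodes_by_matching(digraph):
    """Minimum sensor set via maximum matching on the transposed state digraph.

    A bipartite graph is built from 'out' and 'in' copies of every state; states
    left unmatched on the 'in' side of the transposed graph receive a sensor."""
    gt = digraph.reverse()                               # observability ~ dual of controllability
    bip = nx.Graph()
    bip.add_nodes_from((("out", u) for u in gt), bipartite=0)
    bip.add_nodes_from((("in", v) for v in gt), bipartite=1)
    bip.add_edges_from((("out", u), ("in", v)) for u, v in gt.edges())
    matching = nx.bipartite.hopcroft_karp_matching(bip, top_nodes=[("out", u) for u in gt])
    matched_in = {v for side, v in matching if side == "in"}
    return [v for v in gt if v not in matched_in]

# toy state-interaction digraph of a small process network
G = nx.DiGraph([(1, 2), (2, 3), (3, 4), (2, 5)])
print(sensor_nodes_by_matching(G))   # sensors end up at the "sink" states 4 and 5
```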

13.
ScientificWorldJournal ; 2014: 870406, 2014.
Article in English | MEDLINE | ID: mdl-24616651

ABSTRACT

During the last decade, various algorithms have been developed and proposed for discovering overlapping clusters in high-dimensional data. The two most prominent application fields of this research, proposed independently, are frequent itemset mining (developed for market basket data) and biclustering (applied to gene expression data analysis). The common limitation of both methodologies is their limited applicability to very large binary data sets. In this paper, we propose a novel and efficient method to find both frequent closed itemsets and biclusters in high-dimensional binary data. The method is based on simple but very powerful matrix and vector multiplication approaches that ensure that all patterns can be discovered in a fast manner. The proposed algorithm has been implemented in the commonly used MATLAB environment and is freely available to researchers.
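Although the original implementation is in MATLAB, the matrix/vector-multiplication idea can be sketched in a few lines of Python: the support of an itemset and its closure (and hence a bicluster) are obtained purely from products with the binary data matrix. The toy matrix is illustrative, and this is only the core operation, not the full enumeration algorithm.

```python
import numpy as np

# binary transaction-item matrix (rows: baskets / samples, columns: items / genes)
B = np.array([[1, 1, 0, 1],
              [1, 1, 1, 0],
              [0, 1, 1, 0],
              [1, 1, 0, 1]], dtype=int)

def support_and_closure(B, itemset):
    """Support of an itemset and its closure, using only matrix/vector products."""
    cover = (B[:, itemset] @ np.ones(len(itemset)) == len(itemset)).astype(int)
    support = int(cover.sum())                        # rows containing every item of the set
    closure = np.flatnonzero(B.T @ cover == support)  # items present in all covering rows
    return support, closure                           # (closure, cover) is also a bicluster

print(support_and_closure(B, [0, 1]))   # items {0, 1}: support 3, closure {0, 1}
print(support_and_closure(B, [0, 3]))   # items {0, 3}: support 2, closure {0, 1, 3}
```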


Subject(s)
Cluster Analysis, Data Mining/methods, Algorithms
14.
PLoS One ; 19(5): e0301262, 2024.
Article in English | MEDLINE | ID: mdl-38722864

ABSTRACT

Frequent sequence pattern mining is an excellent tool to discover patterns in event chains. In complex systems, events from parallel processes are present, often without proper labelling. To identify the groups of events related to a subprocess, frequent sequential pattern mining can be applied. Since most algorithms provide too many frequent sequences, which makes it difficult to interpret the results, it is necessary to post-process the resulting frequent patterns. The available visualisation techniques do not allow easy access to the multiple properties that support a faster and better understanding of the event scenarios. To address this issue, our work proposes an intuitive and interactive solution to support this task, introducing three novel network-based sequence visualisation methods that can reduce the time of information processing from a cognitive perspective. The proposed visualisation methods offer a more information-rich and easily understandable interpretation of sequential pattern mining results compared to the usual text-like outcome of pattern mining algorithms. The first uses the confidence values of the transitions to create a weighted network, while the second enriches the confidence-based adjacency matrix with the similarities of the transitive nodes. The enriched matrix enables a similarity-based multidimensional scaling (MDS) projection of the sequences. The third method uses a similarity measure based on the overlap of the occurrences of the events supporting the sequences. The applicability of the method is presented through an industrial alarm management problem and through the analysis of the clickstreams of a website. The method was fully implemented in a Python environment. The results show that the proposed methods are highly applicable for the interactive processing of frequent sequences, supporting the exploration of the inner mechanisms of complex systems.
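A compact sketch of the first two visualisation ideas, assuming the mining step has already produced transition confidence values: a confidence-weighted network and an MDS projection from a confidence-derived distance matrix. The alarm names and values are invented for illustration, and the similarity enrichment of the second method is simplified here to the confidence matrix itself.

```python
import numpy as np
import networkx as nx
from sklearn.manifold import MDS

# toy mining result: frequent transitions with their confidence values
sequences = {("alarm_A", "alarm_B"): 0.9,
             ("alarm_B", "alarm_C"): 0.7,
             ("alarm_A", "alarm_C"): 0.4,
             ("alarm_C", "alarm_D"): 0.8}

# method 1: confidence-weighted transition network
G = nx.DiGraph()
for (src, dst), conf in sequences.items():
    G.add_edge(src, dst, weight=conf)

# method 2: MDS projection of the nodes from a confidence-based distance matrix
nodes = sorted(G.nodes())
A = np.zeros((len(nodes), len(nodes)))
for (src, dst), conf in sequences.items():
    i, j = nodes.index(src), nodes.index(dst)
    A[i, j] = A[j, i] = conf
D = 1.0 - A                                   # high confidence -> small distance
np.fill_diagonal(D, 0.0)
coords = MDS(n_components=2, dissimilarity="precomputed", random_state=0).fit_transform(D)
print(dict(zip(nodes, coords.round(2))))
```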


Subject(s)
Algorithms, Data Mining/methods, Humans
15.
Sci Rep ; 14(1): 9036, 2024 Apr 19.
Article in English | MEDLINE | ID: mdl-38641683

ABSTRACT

In real-world classification problems, it is important to build accurate prediction models and provide information that can improve decision-making. Decision-support tools are often based on network models, and this article uses information encoded in social networks to address the problem of employee turnover. However, understanding the factors behind black-box prediction models can be challenging. Our question concerned the predictability of employee turnover, given information from a multilayer network that describes collaborations, together with perceptions assessing the performance of organizations, which indicate the success of cooperation. Our goal was to develop an accurate prediction procedure, preserve the interpretability of the classification, and capture the wide variety of specific reasons that explain positive cases. After feature engineering, we identified the variables with the best predictive power using decision trees and ranked them based on their added value, considering their frequent co-occurrence. We applied a Random Forest classifier with the SMOTE balancing technique for prediction. We calculated the SHAP values to identify the variables that contribute the most to individual predictions. As a last step, we clustered the sample based on the SHAP values to fine-tune the explanations of quitting due to different background factors.
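A minimal end-to-end sketch of this pipeline on synthetic data, assuming the imbalanced-learn and shap packages are available; the feature-engineering and decision-tree ranking steps are omitted, and all parameters and data are illustrative.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.cluster import KMeans
from imblearn.over_sampling import SMOTE     # assumes imbalanced-learn is installed
import shap                                  # assumes the shap package is installed

# toy imbalanced data standing in for the engineered network/survey features
X, y = make_classification(n_samples=600, n_features=8, weights=[0.9, 0.1], random_state=0)

# 1) balance the training data, 2) fit the forest
X_bal, y_bal = SMOTE(random_state=0).fit_resample(X, y)
rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_bal, y_bal)

# 3) SHAP values for the positive ("quitting") class
shap_vals = shap.TreeExplainer(rf).shap_values(X)
positive = shap_vals[1] if isinstance(shap_vals, list) else shap_vals[..., 1]

# 4) cluster employees by their SHAP profiles to group different quitting reasons
groups = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(positive)
print(np.bincount(groups))
```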

16.
Sci Rep ; 14(1): 14521, 2024 Jun 24.
Article in English | MEDLINE | ID: mdl-38914589

ABSTRACT

Identifying communities in multilayer networks is crucial for understanding the structural dynamics of complex systems. Traditional community detection algorithms often overlook the presence of overlapping edges within communities, despite the potential significance of such relationships. In this work, we introduce a novel modularity measure designed to uncover communities whose nodes share specific multiple facets of connectivity. Our approach leverages a null network that is an empirical layer of the multiplex network rather than a random network; it can be one of the network layers or the complement graph of such a layer, depending on the objective. By analyzing real-world social networks, we validate the effectiveness of our method in identifying meaningful communities with overlapping edges. The proposed approach offers valuable insights into the structural dynamics of multiplex systems, shedding light on nodes that share similar multifaceted connections.
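The abstract does not give the formula, so the sketch below shows one plausible formalization: a modularity-like score in which the usual random null term is replaced by the adjacency of an empirical null layer, rescaled to a comparable total weight. The rescaling, the resolution parameter and the toy two-layer network are assumptions, not the paper's definition.

```python
import numpy as np
import networkx as nx

def modularity_with_empirical_null(layer, null_layer, communities, gamma=1.0):
    """Modularity of `layer` where the null term comes from another empirical layer.

    Q = (1 / 2m) * sum_ij [A_ij - gamma * (m / m_null) * N_ij] * delta(c_i, c_j),
    with the null layer rescaled so both terms carry comparable total weight."""
    nodes = sorted(layer.nodes())
    A = nx.to_numpy_array(layer, nodelist=nodes)
    N = nx.to_numpy_array(null_layer, nodelist=nodes)
    m, m_null = A.sum() / 2.0, max(N.sum() / 2.0, 1e-12)
    label = {v: k for k, comm in enumerate(communities) for v in comm}
    delta = np.array([[label[u] == label[v] for v in nodes] for u in nodes])
    return float(((A - gamma * (m / m_null) * N) * delta).sum() / (2.0 * m))

# toy two-layer work/friendship multiplex on the same node set
work = nx.Graph([(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)])
friends = nx.Graph([(0, 3), (1, 4), (2, 5), (0, 1)])
friends.add_nodes_from(work.nodes())
partition = [{0, 1, 2}, {3, 4, 5}]
print(modularity_with_empirical_null(work, friends, partition))
```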

17.
Heliyon ; 10(8): e29437, 2024 Apr 30.
Article in English | MEDLINE | ID: mdl-38655321

ABSTRACT

This paper presents a methodology that aims to enhance the accuracy of probability density estimation in mobility pattern analysis by integrating prior knowledge of system dynamics and contextual information into the particle filter algorithm. The quality of the data used for density estimation is often inadequate due to measurement noise, which significantly influences the distribution of the measurement data. Thus, it is crucial to augment the information content of the input data by incorporating additional sources of information beyond the measured position data. These other sources can include the dynamic model of movement and the spatial model of the environment, which influences motion patterns. To effectively combine the information provided by positional measurements with system and environment models, the particle filter algorithm is employed, which generates discrete probability distributions. By subjecting these discrete distributions to exploratory techniques, it becomes possible to extract more certain information compared to using raw measurement data alone. Consequently, this study proposes a methodology in which probability density estimation is not solely based on raw positional data but rather on probability-weighted samples generated through the particle filter. This approach yields more compact and precise modeling distributions. Specifically, the method is applied to process position measurement data using a nonparametric density estimator known as kernel density estimation. The proposed methodology is thoroughly tested and validated using information-theoretic and probability metrics. The applicability of the methodology is demonstrated through a practical example of mobility pattern analysis based on forklift data in a warehouse environment.
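A small sketch of the central step, assuming the particle filter has already produced weighted position particles for a time step: a weighted kernel density estimate (via scipy's gaussian_kde) is compared with a KDE built directly from noisier raw measurements. All numbers and the pseudo-likelihood weights are illustrative.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(5)

# toy particle-filter output for one time step: 2-D position particles and weights
particles = rng.normal(loc=[12.0, 5.0], scale=[0.8, 0.4], size=(1000, 2))
weights = np.exp(-0.5 * np.sum((particles - [12.2, 5.1]) ** 2, axis=1))  # pseudo-likelihood
weights /= weights.sum()

# raw-measurement KDE (equal weights) vs. particle-weighted KDE
raw_measurements = particles + rng.normal(0, 1.0, particles.shape)       # noisier "raw" data
kde_raw = gaussian_kde(raw_measurements.T)
kde_pf = gaussian_kde(particles.T, weights=weights)

grid = np.array([[12.0, 12.2, 13.0], [5.0, 5.1, 6.0]])   # columns are query points
print("raw KDE:", kde_raw(grid).round(4))
print("PF  KDE:", kde_pf(grid).round(4))                 # more compact, more certain estimate
```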

18.
Heliyon ; 10(4): e25946, 2024 Feb 29.
Article in English | MEDLINE | ID: mdl-38404856

ABSTRACT

Detecting chemical, biological, radiological and nuclear (CBRN) incidents is a high-priority task and has been a topic of intensive research for decades. Ongoing technological, data processing, and automation developments are opening up new potential in CBRN protection, which has become a complex, interdisciplinary field of science. Accordingly, chemists, physicists, meteorologists, military experts, programmers, and data scientists are all involved in the research. The key to effectively enhancing CBRN defence capabilities is continuous and targeted development along a well-structured concept. Our study highlights the importance of predictive analytics by providing an overview of the main components of modern CBRN defence technologies, including a summary of the conceptual requirements for CBRN reconnaissance and decision-support steps, and by presenting the role and recent opportunities of information management in these processes.

19.
MethodsX ; 13: 102838, 2024 Dec.
Article in English | MEDLINE | ID: mdl-39071993

ABSTRACT

This article focuses on improving indoor positioning data through data reconciliation. Indoor positioning systems are increasingly used for resource tracking to monitor manufacturing and warehouse processes. However, measurement errors due to noise can negatively impact system performance. Redundant measurement involves the use of multiple sensor tags that provide position data on the same resource in order to identify errors in the physical environment. If measurement data are available from the entire physical environment, a map-based average measurement error can be determined by specifying the points in the examined area where measurement data should be compensated and to what extent. This compensation is achieved through data reconciliation, which improves real-time position data by considering the measurement error at the actual position as an element of the variance-covariance matrix. A case study in a warehouse environment is presented to demonstrate how discrepancies in position data from two sensor tags on forklifts can be used to identify layout-based errors. The algorithm is generally capable of handling the multi-sensor problem in indoor positioning systems. The key points are as follows:
• Layout-based errors are detected from the measurement error of the indoor positioning system.
• Redundant measurements and data reconciliation can improve the accuracy of such systems.
• The accuracy of position data is improved with the layout-based error map using a data reconciliation algorithm.
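A minimal sketch of the reconciliation step for two redundant tags, assuming the mounting offset has already been compensated and that the layout-based error map supplies position-dependent covariances; the geometry and variance values are illustrative, not the case-study numbers.

```python
import numpy as np

def reconcile(y1, y2, cov1, cov2):
    """Weighted least-squares fusion of two redundant position measurements.

    y1, y2 are tag positions already compensated for their mounting offset;
    cov1, cov2 come from a layout-based error map evaluated at those positions."""
    w1, w2 = np.linalg.inv(cov1), np.linalg.inv(cov2)
    cov_hat = np.linalg.inv(w1 + w2)
    x_hat = cov_hat @ (w1 @ y1 + w2 @ y2)
    return x_hat, cov_hat

# toy layout-based error map: measurement variance is larger near metal racks (x > 15 m)
def error_cov(position):
    sigma = 0.6 if position[0] > 15.0 else 0.2          # metres, illustrative values
    return np.diag([sigma ** 2, sigma ** 2])

tag_a = np.array([16.3, 7.9])      # two tags on the same forklift, offset already removed
tag_b = np.array([15.8, 8.2])
x_hat, cov_hat = reconcile(tag_a, tag_b, error_cov(tag_a), error_cov(tag_b))
print(x_hat, np.sqrt(np.diag(cov_hat)))   # reconciled position and its reduced uncertainty
```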

20.
Heliyon ; 10(9): e29764, 2024 May 15.
Article in English | MEDLINE | ID: mdl-38694130

ABSTRACT

The parameter identification of failure models for composite plies can be cumbersome due to the multiple effects that are consequences of brittle fracture. Our work proposes an iterative, nonlinear design of experiments (DoE) approach that finds the most informative experimental data to identify the parameters of the Tsai-Wu, Tsai-Hill, Hoffman, Hashin, maximum stress and Puck failure models. Depending on the data, the models perform differently; therefore, the parameter identification is validated by the Euclidean distance of the measured points to the closest points on the nominal surface. The resulting errors provide a basis for ranking the models, which helps to select the best-fitting one. Following the validation, the sensitivity of the best model is calculated by partial differentiation, and a theoretical surface is generated. Lastly, an iterative design of experiments is implemented to select the optimal set of experiments from which the parameters can be identified from the least data by minimizing the fitting error. In this way, the number of experiments required to identify a model of a composite material can be significantly reduced. We demonstrate how the proposed method selects the optimal experiments from generated data. The results indicate that if the dataset contains enough information, the method is robust and accurate. If the dataset lacks the necessary information, novel material tests can be proposed based on the optimal points of the parameter sensitivity of the generated failure-model surface.
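As an illustration of the identification step for one of the listed criteria, the sketch below fits the Tsai-Wu plane-stress parameters by linear least squares from measured failure states; the residual in the criterion value is used here as a simple proxy for the Euclidean-distance validation described in the abstract, and the stress values are invented.

```python
import numpy as np

def tsai_wu_features(stresses):
    """Row of the linear system for one measured failure state (s1, s2, t12)."""
    s1, s2, t12 = stresses
    return [s1, s2, s1 ** 2, s2 ** 2, t12 ** 2, 2.0 * s1 * s2]

def identify_tsai_wu(failure_points):
    """Least-squares identification of (F1, F2, F11, F22, F66, F12).

    Each measured failure point should give a criterion value of 1, so the
    parameters solve an ordinary linear least-squares problem."""
    A = np.array([tsai_wu_features(p) for p in failure_points])
    b = np.ones(len(failure_points))
    params, *_ = np.linalg.lstsq(A, b, rcond=None)
    residuals = A @ params - b          # how far each point is from criterion value 1
    return params, residuals

# toy measured failure stresses (s1, s2, t12) in MPa for a single ply
points = [(1500, 0, 0), (-1200, 0, 0), (0, 50, 0), (0, -200, 0),
          (0, 0, 70), (800, 30, 40), (-600, -100, 30)]
params, res = identify_tsai_wu(points)
print("parameters:", params)
print("criterion residuals:", res.round(3))
```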
