ABSTRACT
In recent years, the automotive industry has witnessed significant progress in the development of automated driving technologies. The integration of advanced sensors and systems in vehicles has led to the emergence of various functionalities, such as driving assistance and autonomous driving. Applying these technologies on the assembly line can enhance the efficiency, safety, and speed of transportation, especially in end-of-line production. This work presents a connected automated vehicle (CAV) demonstrator for developing autonomous driving systems and services for the automotive industry. Our prototype electric vehicle is equipped with state-of-the-art sensors and systems for perception, localization, navigation, and control. We tested various algorithms and tools for transforming the vehicle into a self-driving platform, and the prototype was simulated and tested in an industrial environment as a proof of concept for integration into assembly systems and end-of-line transport. Our results show the successful integration of self-driving vehicle platforms in the automotive industry, particularly in factory halls. We demonstrate the localization, navigation, and communication capabilities of our prototype in a demo area. This work anticipates a significant increase in efficiency and a reduction in operating costs in vehicle manufacturing, despite challenges such as currently low traveling speeds and high equipment costs. Ongoing research aims to enhance safety at higher vehicle speeds, making the approach a more viable business case for manufacturers, given the increasing standardization of automated driving equipment in cars. The main contribution of this paper lies in introducing the general concept architecture for integrating automated driving functionalities into end-of-line assembly and production systems, along with a case study of the effective development and implementation of such functionalities with a CAV demonstrator in a more standardized industrial operational design domain.
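The abstract does not detail the navigation-and-control loop; as a hedged illustration, a classic pure-pursuit steering law is one common way such a prototype can follow waypoints inside a factory hall. The function name, wheelbase value, and waypoint coordinates below are illustrative assumptions, not the demonstrator's actual stack:

```python
import math

def pure_pursuit_steering(pose, waypoint, wheelbase=2.5):
    """Steering angle (rad) toward a lookahead waypoint.

    pose = (x, y, heading) in metres/radians; classic pure-pursuit law:
    delta = atan2(2 * L * sin(alpha), lookahead_distance).
    """
    x, y, heading = pose
    dx, dy = waypoint[0] - x, waypoint[1] - y
    ld = math.hypot(dx, dy)                  # lookahead distance
    alpha = math.atan2(dy, dx) - heading     # bearing to target in body frame
    return math.atan2(2.0 * wheelbase * math.sin(alpha), ld)

# A waypoint straight ahead requires no steering; one to the left steers left.
print(round(pure_pursuit_steering((0, 0, 0), (10, 0)), 3))  # 0.0
print(pure_pursuit_steering((0, 0, 0), (10, 5)) > 0)        # True
```

In practice such a law sits below the localization and path-planning layers the abstract mentions, consuming waypoints that those layers provide.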
ABSTRACT
This study centers on creating a real-time algorithm to estimate brain-to-brain synchronization during social interactions, specifically in collaborative and competitive scenarios. This type of algorithm can provide useful information in the educational context, for instance, during teacher-student or student-student interactions. Positioned within the context of neuroeducation and hyperscanning, this research addresses the need for biomarkers as metrics for feedback, a missing element in current teaching methods. Implementing the bispectrum technique with multiprocessing functions in Python, the algorithm effectively processes electroencephalography signals and estimates brain-to-brain synchronization between pairs of subjects during competitive and collaborative activities that involve specific cognitive processes. Noteworthy differences, such as higher bispectrum values in collaborative tasks compared to competitive ones, emerge reliably, with 33.75% of results reaching statistical significance under a statistical test. While acknowledging progress, this study identifies areas of opportunity, including embedded operations, wider testing, and improved result visualization. Beyond academia, the algorithm's utility extends to classrooms, industries, and any setting involving human interactions. Moreover, the presented algorithm is shared openly to facilitate implementation by other researchers and is easily adjustable to other electroencephalography devices. This research not only bridges a technological gap but also contributes insights into the importance of interactions in educational contexts.
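The bispectrum estimate at the core of such an algorithm can be sketched as follows. This is a minimal single-process sketch (the published tool uses multiprocessing), and the segment length, frequency bins, and synthetic test signal are illustrative assumptions only:

```python
import cmath, math, random

def dft(x):
    """Plain DFT (adequate for short illustrative segments)."""
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * math.pi * k * n / N) for n in range(N))
            for k in range(N)]

def bispectrum(x, f1, f2, seg_len=64):
    """Segment-averaged triple product X(f1) X(f2) conj(X(f1+f2))."""
    segs = [x[i:i + seg_len] for i in range(0, len(x) - seg_len + 1, seg_len)]
    acc = 0
    for seg in segs:
        X = dft(seg)
        acc += X[f1] * X[f2] * X[f1 + f2].conjugate()
    return abs(acc / len(segs))

random.seed(0)
# Quadratically phase-coupled signal: components at bins 4, 8 and their sum 12.
sig = [math.cos(2 * math.pi * 4 * t / 64) + math.cos(2 * math.pi * 8 * t / 64)
       + math.cos(2 * math.pi * 12 * t / 64) + 0.1 * random.gauss(0, 1)
       for t in range(640)]
b_coupled = bispectrum(sig, 4, 8)   # large: the (4, 8, 12) triple is coupled
b_ref = bispectrum(sig, 3, 7)       # small: noise-only bins
print(b_coupled > b_ref)            # True
```

The bispectrum is sensitive to exactly this kind of quadratic phase coupling, which is why it is a plausible basis for a synchronization metric between two EEG channels.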
Subjects
Brain, Electroencephalography, Humans, Reproducibility of Results, Electroencephalography/methods, Algorithms, Students

ABSTRACT
According to the World Health Organization (WHO), stress can be defined as any type of alteration that causes physical, emotional, or psychological tension. A closely related concept that is sometimes confused with stress is anxiety. The difference between the two is that stress usually has an existing cause; once that trigger has passed, stress typically eases. In this respect, according to the American Psychiatric Association, anxiety is a normal response to stress and can even be advantageous in some circumstances. By contrast, anxiety disorders differ from temporary feelings of anxiousness or nervousness in that they involve more intense feelings of fear or anxiety. The Diagnostic and Statistical Manual of Mental Disorders (DSM-5) explicitly describes anxiety as excessive concern and fearful expectations about a series of events, occurring on most days for at least 6 months. Stress can be measured with standardized questionnaires; however, these instruments have some major disadvantages, the main one being the time needed to interpret them, since qualitative information must be transformed into quantitative data. Conversely, a physiological measure has the advantage of providing quantitative spatiotemporal information directly from brain areas, and it can be processed faster than qualitative sources. A typical option is the electroencephalographic (EEG) record. As a novelty, we propose applying time series (TS) entropies developed by us to inspect collections of EEGs obtained during stressful situations. We investigated a database of 23 persons, with 1920 samples (15 s) captured on 14 channels for 12 stressful events. Our parameters reflected that, of the 12 events, event 2 (family/financial instability/maltreatment) and event 10 (fear of disease and missing an important event) created more tension than the others. In addition, the most active lobes reflected by the EEG channels were the frontal and temporal lobes. The former is in charge of higher functions, self-control, and self-monitoring, while the latter handles auditory processing as well as emotional regulation. Thus, events E2 and E10 triggering frontal and temporal channels revealed the actual state of participants under stressful situations. The coefficient of variation revealed that E7 (fear of getting cheated/losing someone) and E11 (fear of suffering a serious illness) were the events with the most variation among participants. Similarly, AF4, FC5, and F7 (mainly frontal-lobe channels) were the most irregular on average across all participants. In summary, the goal of the dynamic entropy analysis is to process the EEG dataset to elucidate which events and brain regions are key for all participants, making it possible to determine directly which event was most stressful and which brain zone was most affected. This approach, itself a novelty, can also be applied to other caregiver datasets.
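The dynamic (windowed) entropy analysis described above can be sketched generically. The block below uses a plain sliding-window Shannon entropy of the amplitude histogram, not the authors' custom TS entropy measures, and the window/bin settings are illustrative assumptions:

```python
import math, random

def shannon_entropy(window, bins=8):
    """Shannon entropy (bits) of the amplitude histogram of one window."""
    lo, hi = min(window), max(window)
    width = (hi - lo) / bins or 1.0          # constant window -> single bin
    counts = [0] * bins
    for v in window:
        counts[min(int((v - lo) / width), bins - 1)] += 1
    n = len(window)
    return -sum(c / n * math.log2(c / n) for c in counts if c)

def dynamic_entropy(signal, win=128, step=64):
    """One entropy value per sliding window: a 'dynamic entropy' curve."""
    return [shannon_entropy(signal[i:i + win])
            for i in range(0, len(signal) - win + 1, step)]

random.seed(1)
flat = [1.0] * 256                           # constant signal: zero entropy
noisy = [random.gauss(0, 1) for _ in range(256)]
print(dynamic_entropy(flat)[0] == 0, dynamic_entropy(noisy)[0] > 1.5)
```

Running such a curve per channel and per event is what allows channel-wise and event-wise comparisons like the E2/E10 and AF4/FC5/F7 findings above.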
Subjects
Anxiety, Caregivers, Humans, Entropy, Brain, Electroencephalography

ABSTRACT
Nearly half of the world's urban population depends on aquifers for drinking water. These are increasingly vulnerable to pollution and overexploitation. Besides anthropogenic sources, pollutants such as arsenic (As) are also geogenic, and their concentrations have, in some cases, been increased by groundwater pumping. Almost 40 % of Mexico's population relies on groundwater for drinking water purposes; much of the groundwater in the aquifers of semi-arid and arid central and northern Mexico is contaminated by As. These are agricultural regions where irrigation water is primarily provided by intensive pumping of the aquifers, leading to long-standing declines in the water table. The focus of this study is the main aquifer within the Comarca Lagunera region in Northern Mexico. Although the scientific evidence demonstrates that health effects are associated with long-term exposure to elevated As concentrations, this knowledge has not yielded effective groundwater development and public health policy. A multidisciplinary approach - including the evaluation of geochemistry, human health risk, and development and public health policy - was used to provide a current account of these links. The dissolved As concentrations measured exceeded the corresponding World Health Organization guideline for drinking water in 90 % of the sampled wells; for the population drinking this water, the estimated probability of presenting non-carcinogenic health effects was >90 %, and the lifetime risk of developing cancer ranged from 0.5 to 61 cases in 10,000 children and 0.2 to 33 cases in 10,000 adults. The results suggest that insufficient policy responses are due to a complex and dysfunctional groundwater governance framework that compromises the economic, social, and environmental sustainability of this region. These findings may be valuable to other regions with similar settings that need to design and enact better-informed, science-based policies that recognize the value of a more sustainable use of groundwater resources and a healthier population.
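Risk figures of the kind reported above are typically derived from USEPA-style exposure equations (chronic daily intake, hazard quotient, slope-factor cancer risk). The sketch below uses the standard formulas with illustrative exposure parameters, not the study's values; note that cancer risk conventionally averages exposure over a full lifetime, which this simplified sketch does not do:

```python
def chronic_daily_intake(c_mg_l, ir_l_day, ef_days_yr, ed_yr, bw_kg, at_days):
    """USEPA-style chronic daily intake from drinking water, mg/kg/day:
    CDI = C * IR * EF * ED / (BW * AT)."""
    return c_mg_l * ir_l_day * ef_days_yr * ed_yr / (bw_kg * at_days)

RFD_AS = 0.0003   # oral reference dose for As, mg/kg/day
SF_AS = 1.5       # oral slope factor for As, (mg/kg/day)^-1

# Illustrative adult scenario: 0.05 mg/L As, 2 L/day, 350 d/yr, 30-yr exposure.
cdi = chronic_daily_intake(0.05, 2, 350, 30, 70, 30 * 365)
hq = cdi / RFD_AS    # hazard quotient; > 1 flags non-carcinogenic concern
risk = cdi * SF_AS   # incremental lifetime cancer risk (simplified averaging)
print(round(hq, 2), f"{risk:.1e}")  # 4.57 2.1e-03
```

With these assumed inputs the hazard quotient is well above 1 and the cancer risk is on the order of 10^-3, qualitatively consistent with the high risks the study reports for As-contaminated wells.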
Subjects
Arsenic, Drinking Water, Groundwater, Water Pollutants, Chemical, Child, Humans, Arsenic/analysis, Drinking Water/analysis, Environmental Monitoring/methods, Water Pollutants, Chemical/analysis, Mexico, Health Policy

ABSTRACT
BACKGROUND: The coronavirus disease (COVID-19) caused a novel pandemic, and knowledge about the virus's behaviour and the key performance indicators (KPIs) needed to forecast mortality risk is still limited. Moreover, relying on many complex and expensive biomarkers may be infeasible for low-budget hospitals. Timely identification of the risk of mortality of COVID-19 patients (RMCPs) is essential to improve hospitals' management systems and resource allocation standards. METHODS: For mortality risk prediction, this work proposes a COVID-19 mortality risk calculator based on a deep learning (DL) model, trained on a dataset provided by HM Hospitals, Madrid, Spain. A pre-processing strategy for unbalanced classes and feature selection is proposed. To evaluate the proposed methods, the Synthetic Minority Over-sampling TEchnique (SMOTE) and a data imputation approach based on the K-nearest neighbour algorithm are introduced. RESULTS: A total of 1,503 seriously ill COVID-19 patients with a median age of 70 years were included in the study: 927 (61.7%) males and 576 (38.3%) females. A total of 48 features were considered to evaluate the proposed method, with the following results: area under the curve (AUC) 0.93, F2 score 0.93, recall 1.00, accuracy 0.95, precision 0.91, specificity 0.9279, and maximum probability of correct decision (MPCD) 0.93. CONCLUSION: The results show that the proposed method performs well for mortality risk prediction in patients with COVID-19 infection. The MPCD score shows that the proposed DL model outperforms the alternatives on every dataset, even when evaluated with the over-sampling technique. The benefits of the data imputation algorithm for unavailable biomarker data are also evaluated. Based on these results, the proposed scheme could be an appropriate tool for assessing the risk of mortality and prognosis in critically ill COVID-19 patients.
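The SMOTE over-sampling and K-nearest-neighbour imputation steps can be sketched in miniature. These are simplified from-scratch versions for illustration, not the paper's implementation, and the toy feature values are assumptions:

```python
import math, random

def knn_impute(rows, k=3):
    """Fill None entries with the mean of that feature over the k nearest
    complete rows (distance measured on jointly observed features)."""
    out = [list(r) for r in rows]
    for r in out:
        if None not in r:
            continue
        def dist(s):
            return math.sqrt(sum((a - b) ** 2
                                 for a, b in zip(r, s) if a is not None))
        donors = sorted((s for s in rows if None not in s), key=dist)[:k]
        for j, v in enumerate(r):
            if v is None:
                r[j] = sum(s[j] for s in donors) / len(donors)
    return out

def smote(minority, n_new, k=2, seed=0):
    """Synthetic samples interpolated between a minority point and one of
    its k nearest minority neighbours (the core SMOTE idea)."""
    rng = random.Random(seed)
    new = []
    for _ in range(n_new):
        a = rng.choice(minority)
        nbrs = sorted(minority,
                      key=lambda s: sum((x - y) ** 2
                                        for x, y in zip(a, s)))[1:k + 1]
        b = rng.choice(nbrs)
        t = rng.random()
        new.append([x + t * (y - x) for x, y in zip(a, b)])
    return new

data = [[1.0, 2.0], [1.2, None], [0.9, 2.1], [1.1, 1.9]]
full = knn_impute(data)
print(full[1])                        # None replaced by a neighbourhood mean
synth = smote([[0, 0], [1, 1], [0, 1]], 4)
print(len(synth))                     # 4 synthetic minority samples
```

In a real pipeline, imputation is applied first so that SMOTE interpolates only between complete feature vectors of the minority (deceased) class.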
Subjects
COVID-19, Deep Learning, Aged, Algorithms, Area Under Curve, Female, Humans, Male, Prognosis

ABSTRACT
The presence of endocrine-disrupting chemicals (EDCs) in water resources has significant negative implications for the environment. Traditional water-treatment technologies are not completely efficient at removing EDCs from water. Therefore, research on sustainable remediation has been mainly directed to novel decontamination approaches, including nano-remediation. This emerging technology employs engineered nanomaterials to clean up the environment quickly, efficiently, and sustainably. Thus, nanomaterials have contributed to a wide variety of remediation techniques, such as adsorption, filtration, and coagulation/flocculation. Among the vast diversity of decontamination technologies, catalytic advanced oxidation processes (AOPs) stand out as simple, clean, and efficient alternatives. A vast diversity of catalysts has been developed and has demonstrated high efficiencies; however, the search for novel catalysts with enhanced performance continues. In this regard, nanomaterials used as nanocatalysts exhibit enhanced performance in AOPs due to their special nanostructures and larger specific surface areas. Therefore, in this review we summarize, compare, and discuss the recent advances in nanocatalysts, catalysts doped with metal-based nanomaterials, and catalysts doped with carbon-based nanomaterials for the degradation of EDCs. Finally, further research opportunities are identified and discussed to achieve the real-world application of nanomaterials to efficiently degrade EDCs in water resources.
Subjects
Endocrine Disruptors, Environmental Pollutants, Nanostructures, Water Pollutants, Chemical, Water Purification, Carbon, Endocrine Disruptors/analysis, Water Pollutants, Chemical/analysis

ABSTRACT
Analyzing data related to the condition of city streets and avenues can help inform decisions about public spending on mobility. Generally, streets and avenues are repaired only after a citizen report or when a major incident occurs, and it is uncommon for cities to have real-time reactive systems that detect the pavement problems that need fixing. This work proposes a solution that detects street anomalies through state analysis, using sensors within the vehicles that travel daily and connecting them to a fog-computing architecture on a V2I network. The system detects and classifies the main road problems or abnormal conditions in streets and avenues using machine learning algorithms (MLAs), comparing roughness against a flat reference. An instrumented vehicle obtained the reference through accelerometry sensors and then sent the data through a mid-range communication system. With these data, the system compared an artificial neural network (a supervised MLA) and a K-nearest neighbor classifier (also a supervised MLA) to select the best option for handling the acquired data. This system makes it possible to visualize street quality and map the areas with the most significant anomalies.
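The K-nearest-neighbor branch of the comparison can be sketched as follows. The feature choices (RMS and peak vertical acceleration per road segment) and all numeric values are hypothetical stand-ins for the instrumented-vehicle accelerometry:

```python
import math
from collections import Counter

def knn_predict(train, query, k=3):
    """train: list of (feature_vector, label); majority vote of k nearest."""
    nearest = sorted(train, key=lambda fl: math.dist(fl[0], query))[:k]
    return Counter(label for _, label in nearest).most_common(1)[0][0]

# Hypothetical per-segment features: (RMS vertical accel, peak accel), g units.
train = [((0.10, 0.3), "flat"), ((0.12, 0.35), "flat"), ((0.15, 0.4), "flat"),
         ((0.90, 2.5), "pothole"), ((1.00, 2.8), "pothole"), ((0.85, 2.2), "pothole"),
         ((0.50, 1.0), "crack"), ((0.55, 1.2), "crack"), ((0.45, 0.9), "crack")]

print(knn_predict(train, (0.95, 2.6)))   # pothole
print(knn_predict(train, (0.11, 0.33)))  # flat
```

The same labeled feature vectors could feed a small neural network, which is how a head-to-head comparison between the two supervised MLAs would be set up.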
Subjects
Algorithms, Machine Learning, Cluster Analysis, Computer Systems, Neural Networks, Computer

ABSTRACT
Environmental pollution is a critical issue that requires proper measures to maintain environmental health in a sustainable and effective manner. The growing persistence of several active pharmaceutical residues, such as antibiotics like tetracycline and anti-inflammatory drugs like diclofenac, in water matrices is considered an issue of global concern. Numerous sewage and drain waste lines from the domestic and pharmaceutical sectors contain an array of toxic compounds, so-called "emerging pollutants", which have adverse effects on the entire living ecosystem and damage its biodiversity. Therefore, effective solutions and preventive measures are urgently required to sustainably mitigate and/or remediate pharmaceutically active emerging pollutants from environmental matrices. In this context, the entry pathways of pharmaceutical waste into the environment are presented herein, through the entire lifecycle of a pharmaceutical product. To date, no detailed review is available on carbon dots (CDs) as robust materials with multifunctional features that support the sustainable mitigation of emerging pollutants from water matrices. CDs-based photocatalysts are emerging as an efficient alternative for the decontamination of pharmaceutical pollutants. The addition of CDs to photocatalytic systems plays an important role in their performance, mainly because of their up-conversion property, photoinduced electron transfer capacity, and efficient separation of electrons and holes. In this review, we analyze the strategies followed by different researchers to optimize the photodegradation of various pharmaceutical pollutants. In this manner, the effects of different parameters, such as pH, photocatalyst dosage, amount of carbon dots, and initial pollutant concentration, among others, are discussed. Finally, current challenges are presented from a pollution-prevention perspective and from a CDs-based photocatalytic remediation perspective, with the aim of suggesting possible research directions.
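Photodegradation performance in studies of this kind is commonly summarized by an apparent pseudo-first-order rate constant, obtained from the slope of ln(C0/C) versus time. A hedged sketch of that fit follows, using synthetic rather than measured concentrations:

```python
import math

def first_order_rate(times, concentrations):
    """Least-squares slope of ln(C0/C) vs t -> apparent rate constant k."""
    y = [math.log(concentrations[0] / c) for c in concentrations]
    n = len(times)
    tm, ym = sum(times) / n, sum(y) / n
    return (sum((t - tm) * (v - ym) for t, v in zip(times, y))
            / sum((t - tm) ** 2 for t in times))

# Synthetic decay C(t) = 100 * e^(-0.05 t) (illustrative, not measured data).
t = [0, 10, 20, 30, 40, 60]
c = [100 * math.exp(-0.05 * ti) for ti in t]
print(round(first_order_rate(t, c), 3))  # 0.05
```

Comparing such apparent constants across pH, catalyst dosage, CD loading, and initial concentration is exactly how the parameter effects discussed above are quantified.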
Subjects
Environmental Pollutants, Pharmaceutical Preparations, Carbon, Decontamination, Ecosystem

ABSTRACT
The ubiquitous occurrence, toxicological influence, and bioaccumulation of toxic entities, e.g., pesticides and toxic elements, in the environment, biota, and humans, directly or indirectly, pose severe social, ecological, and human health concerns. Much attention has been given to the rising bioaccumulation of toxins and their adverse impact on various environmental matrices. For example, the inappropriate and exacerbated use of xenobiotics and related hazardous substances has caused the deterioration of the agricultural environment, e.g., the fertile soils where plants are grown. Moreover, harmful toxins have negatively impacted human health through trophic chains. However, the analytical and regulatory considerations needed to effectively monitor and mitigate pesticides and toxic elements in environmental matrices are still lacking in the existing literature. For decades, the scientific community has overlooked these consequences, and the improvement of analytical detection methods and regulatory considerations is not yet fully covered. This review addresses that notable literature gap by stressing the development and deployment of robust analytical and regulatory considerations for the efficient abatement of hazardous substances. Following detailed information on the occurrence, toxicological influence, and bioaccumulation of pesticides and toxic elements, the most relevant analytical detection tools and regulatory measures are given herein, with suitable examples, to mitigate or reduce the damage caused by these pollutants.
Subjects
Environmental Pollutants, Pesticides, Water Pollutants, Chemical, Environmental Monitoring, Environmental Pollutants/toxicity, Hazardous Substances/toxicity, Humans, Pesticides/toxicity, Water Pollutants, Chemical/analysis

ABSTRACT
Non-pathological mental fatigue is a recurring but undesirable condition among people in office work, industry, and education. This type of mental fatigue can often lead to negative outcomes, such as performance reduction and cognitive impairment in education; loss of focus and burnout syndrome in office work; and accidents leading to injuries or death in the transportation and manufacturing industries. Reliable mental fatigue assessment tools are promising for improving the performance, mental health, and safety of students and workers, and at the same time for reducing risks, accidents, and the associated economic losses (e.g., medical fees and equipment repairs). The analysis of biometric (brain, cardiac, skin conductance) signals has proven effective in discerning different stages of mental fatigue; however, many of the studies reported in the literature rely on long fatigue-inducing tests and subject-specific models. Recent trends in the modeling of mental fatigue suggest the use of non-subject-specific (general) classifiers and a reduction in the time required for calibration procedures and experimental setups. In this study, we evaluate a fast, short-calibration mental fatigue assessment tool based on biometric signals and inter-subject modeling using multiple linear regression. The proposed tool does not require fatigue-inducing tests, which allows fast setup and implementation. Electroencephalography, photoplethysmography, electrodermal activity, and skin temperature from 17 subjects were recorded using an OpenBCI helmet and an Empatica E4 wristband. Correlations with self-reported mental fatigue levels (using the fatigue assessment scale) were calculated to find the best mental fatigue predictors. Three-class mental fatigue models were evaluated, and the best model obtained an accuracy of 88% using three features, the β/θ (C3) and α/θ (O2 and C3) ratios, from one minute of electroencephalography measurements. The results of this pilot study show the feasibility and potential of short-calibration procedures and inter-subject classifiers in mental fatigue modeling, and will contribute to the use of wearable devices in the development of tools oriented to the well-being of workers and students, as well as in daily living activities.
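The band-power ratio features named above (β/θ, α/θ) can be sketched with a plain DFT band-power computation on a synthetic channel. This is not the study's processing pipeline; the sampling rate, record length, and signal content are illustrative assumptions:

```python
import math

def band_power(x, fs, f_lo, f_hi):
    """Power in [f_lo, f_hi) Hz from a plain DFT of the signal."""
    N = len(x)
    p = 0.0
    for k in range(1, N // 2):
        if f_lo <= k * fs / N < f_hi:
            re = sum(x[n] * math.cos(2 * math.pi * k * n / N) for n in range(N))
            im = -sum(x[n] * math.sin(2 * math.pi * k * n / N) for n in range(N))
            p += (re * re + im * im) / N ** 2
    return p

fs, N = 128, 256
# Synthetic channel: strong 6 Hz (theta) plus weak 20 Hz (beta) component.
sig = [2.0 * math.sin(2 * math.pi * 6 * n / fs)
       + 0.5 * math.sin(2 * math.pi * 20 * n / fs) for n in range(N)]
theta = band_power(sig, fs, 4, 8)
beta = band_power(sig, fs, 13, 30)
print(beta / theta < 1)  # True: theta dominates -> low beta/theta ratio
```

In a fatigue model, ratios like these computed per channel become the regressors of the multiple linear regression against the self-reported fatigue scores.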
Subjects
Wearable Electronic Devices, Workplace, Biometry, Humans, Mental Fatigue/diagnosis, Pilot Projects

ABSTRACT
Rubber bushings and mounts are widely used in automotive applications as support and interface elements. In suspension systems, they are commonly employed to connect the damping structure to the chassis, where the viscoelastic nature of the material introduces a desirable filtering effect that reduces mechanical vibrations. When designing a suspension system, the available literature often treats viscoelastic mounts by introducing linear or nonlinear stiffness behavior. In this context, the present paper represents the rubber material with a proper viscoelastic model and considers the selection of different in-wheel motors. The influence of the mount's dynamic behavior on the suspension is studied and discussed thoroughly through numerical simulations and sensitivity analyses. Furthermore, guidelines are proposed to orient the designer when selecting these elements.
ABSTRACT
This study presents a neuroengineering-based machine learning tool developed to predict students' performance under different learning modalities. Neuroengineering tools are used to predict the learning performance obtained through two different modalities: text and video. Electroencephalographic signals were recorded in the two groups during learning tasks, and performance was evaluated with tests. The results show that the video group obtained better performance than the text group. A correlation analysis was implemented to find the most relevant features for predicting students' performance and to design the machine learning tool. This analysis showed a negative correlation between students' performance and the theta/alpha ratio and delta power, which are indicative of mental fatigue and drowsiness, respectively. These results indicate that users in a non-fatigued, well-rested state performed better during learning tasks. The designed tool obtained 85% precision at predicting learning performance, and correctly identified the video group as the most efficient modality.
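The correlation analysis described above can be sketched with a plain Pearson coefficient. The per-student values below are hypothetical, chosen only to illustrate the reported negative relationship between the theta/alpha ratio and test performance:

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical per-student values: higher theta/alpha (fatigue) -> lower score.
theta_alpha = [0.8, 1.1, 1.4, 1.7, 2.0, 2.3]
score =       [92,  88,  80,  74,  65,  60]
print(pearson_r(theta_alpha, score) < 0)  # True: strongly negative
```

Features whose |r| is largest against performance are the ones kept as inputs to the predictive tool.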
ABSTRACT
Coffee is one of the most important commodities traded on the international market, as well as the most popular beverage around the world. In Mexico, organic coffee cultivation (specifically, Arabica coffee crops) is in high demand and generates up to 500,000 jobs across 14 federal entities. Among the various coffee-producing states, Chiapas, Veracruz, and Oaxaca are responsible for 80% of the total coffee production in the country. Currently, Mexico is the leading producer of organic coffee in the world. However, recovery has been slow due to the large production losses suffered since 2012, caused by earlier and highly aggressive outbreaks of coffee leaf rust (CLR) in the country, whose infectious agent is Hemileia vastatrix (HV). This phenomenon is becoming frequent, and climate change effects could be the main contributors: the spontaneous proliferation in Mexico was driven by precipitation and temperature variability during the last decade. As a result, the biological interaction between coffee crops and their environment in Mexico has been harmed, and crucial characteristics, such as crop yield and quality, are being affected, directly by the negative effects of the greenhouse phenomenon and indirectly through diseases such as CLR. Therefore, this review discusses the contribution of climate change effects to the early development of CLR in Mexico. Focus is also given to possible schemes and actions taken around the world as control measures to adapt vulnerable coffee varieties to tackle this challenging issue.
Subjects
Basidiomycota, Coffee, Climate Change, Mexico, Plant Diseases

ABSTRACT
Among the different chemical and physical treatments used to remove color from textile effluents, bioremediation offers many benefits to the environment. In this study, we determined the potential of Spirulina platensis (S. platensis) for decolorizing indigo blue dye under different incubation conditions. The microalgae were incubated at different pH values (from 4 to 10) to identify the optimal discoloration condition; a pH of 4 was found to be optimal. The biomass concentration in all experiments was 1 g/L, which was able to decolorize the indigo blue dye by day 3. These results showed that S. platensis is capable of removing indigo blue dye at low biomass. However, this was dependent on the treatment conditions, with temperature playing the most crucial role. Two theoretical adsorption models, namely (1) a first-order rate equation and (2) a second-order rate equation, were compared with the observed adsorption-versus-time curves for different initial concentrations (from 25 to 100 mg/L). The comparison between models showed similar accuracy and agreement with the experimental values. The observed adsorption isotherms for three temperatures (30, 40, and 50 °C) were plotted, showing fairly linear behavior in the measured range. The adsorption equilibrium isotherms were estimated, providing an initial description of the dye removal capacity of S. platensis.
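The two rate models compared above can be written out directly. The sketch below fits synthetic (not experimental) uptake data and is meant only to show how the two models are discriminated by their residuals:

```python
import math

def q_first_order(t, qe, k1):
    """Pseudo-first-order uptake: q(t) = qe * (1 - exp(-k1 t))."""
    return qe * (1 - math.exp(-k1 * t))

def q_second_order(t, qe, k2):
    """Pseudo-second-order uptake: q(t) = qe^2 k2 t / (1 + qe k2 t)."""
    return qe * qe * k2 * t / (1 + qe * k2 * t)

def sse(model, params, data):
    """Sum of squared errors of a model against (time, uptake) data."""
    return sum((model(t, *params) - q) ** 2 for t, q in data)

# Synthetic uptake curve generated from second-order kinetics
# (qe = 10 mg/g, k2 = 0.02 g/mg/min); values are illustrative only.
data = [(t, q_second_order(t, 10, 0.02)) for t in (5, 10, 20, 40, 80, 160)]
err1 = sse(q_first_order, (10, 0.1), data)    # first-order, plausible k1
err2 = sse(q_second_order, (10, 0.02), data)  # the generating model
print(err2 < err1)  # True: the second-order model describes the data better
```

With real data, as the abstract notes, the two fits can be close enough that both models describe the curve with similar accuracy.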
ABSTRACT
This manuscript focuses on the hierarchical complexity of space-time deterministic and stochastic dynamical systems as applied to the study of pollution dispersion behavior. Considering the current environmental scope and the requisites for understanding the evolution of various environmentally relevant pollutants of high concern, several suitable mathematical models are presented. To study the pollution phenomenon at hand, we employ lumped linear or nonlinear structures, discussed directly through the relevant equations. To some extent, the researcher knows by intuition which model is more complex (or more suitable) than others, so the basic concepts are supported with linked references. The structural complexity features of each dynamical system are discussed in detail. A continuous dynamical system can be discretized, and a complexity measure can then be obtained from the associated time series. There is also a research gap in complexity theory, which generally deals with the behavior (the solutions of the representing differential equations) of a system. Taking all this into account to cover the remaining literature gaps, we survey a family of classical models used to describe pollution and bacterial dispersion in the environment. From this review, we assign a qualitative complexity measure to each modeling paradigm by taking into account the underlying space of definition of the model and the key issue of differentiability. For instance, a lumped linear set of differential equations is relatively simple with respect to its nonlinear counterpart because the former lives in three-dimensional (3-D) real space R3, where the notion of differentiability arises naturally, whereas the latter needs to translate that conception to a manifold by means of differential geometry. Going further, we reflect on this issue for random systems, where the notion of differentiability is transformed into an integral equivalent by means of Ito's lemma, and so on for more exotic modeling perspectives. Moreover, the study presents a qualitative measure of complexity in terms of the underlying sets and the feasibility of differentiability.
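The idea of obtaining a complexity measure from a discretized system's time series can be sketched with a standard ordinal (permutation) entropy applied to a simple discrete-time nonlinear system. The logistic map and parameter values below are illustrative stand-ins, not the manuscript's pollution models:

```python
import math
from itertools import permutations

def logistic_series(r, x0, n):
    """Discrete-time nonlinear system x_{t+1} = r x_t (1 - x_t)."""
    xs = [x0]
    for _ in range(n - 1):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

def permutation_entropy(xs, order=3):
    """Shannon entropy (bits) of ordinal patterns: a standard time-series
    complexity measure."""
    counts = {p: 0 for p in permutations(range(order))}
    for i in range(len(xs) - order + 1):
        w = xs[i:i + order]
        counts[tuple(sorted(range(order), key=w.__getitem__))] += 1
    n = sum(counts.values())
    return -sum(c / n * math.log2(c / n) for c in counts.values() if c)

periodic = logistic_series(3.2, 0.4, 500)   # settles onto a 2-cycle
chaotic = logistic_series(3.99, 0.4, 500)   # chaotic regime
pe_p, pe_c = permutation_entropy(periodic), permutation_entropy(chaotic)
print(pe_p < pe_c)  # True: chaos yields higher ordinal complexity
```

A continuous model would first be discretized (e.g., by an Euler step) before applying the same measure to its sampled trajectory.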
ABSTRACT
Here, the problem of designing two-degrees-of-freedom controllers for an unknown plant based on input-output measurements is discussed. Virtual reference feedback tuning aims at minimizing a cost function of the L2-norm type using a set of data, so no identification process is needed. When constructing this cost function, two model-matching problems are considered simultaneously, between the closed-loop transfer function and the sensitivity function. In the model-matching procedures, we design a virtual input and a virtual disturbance, respectively. Furthermore, two filters used to reprocess the input-output measurements are derived to prove the equivalence between virtual reference feedback tuning and model reference control. After constructing an identification cost without any knowledge of the plant, we derive a bound on the difference between the expected identification cost and its sample identification cost under the condition that the number of data points is finite. The correlation between the input and external noise is also considered in deriving this bound. We then derive a probabilistic bound to quantify this difference using probability inequalities and results from control theory. The required number of data points is obtained using a generalization of independent block sequences. Finally, two simulation examples demonstrate the effectiveness of the theory proposed in this paper.
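The data-driven fitting step of virtual reference feedback tuning can be sketched on a noiseless first-order example. The plant, reference model, and PI-type controller class below are illustrative assumptions, and the paper's two reprocessing filters (needed in the noisy case) are omitted:

```python
import random

# Unknown plant (used only to generate data): y(t+1) = a y(t) + b u(t).
a, b = 0.9, 0.5
m = 0.6   # desired closed loop (reference model): y(t+1) = m y(t) + (1-m) r(t)

random.seed(0)
u = [random.uniform(-1, 1) for _ in range(200)]
y = [0.0]
for t in range(len(u) - 1):
    y.append(a * y[t] + b * u[t])

# Virtual reference: the r that would have produced the measured y through
# the reference model, r(t) = (y(t+1) - m y(t)) / (1 - m).
rbar = [(y[t + 1] - m * y[t]) / (1 - m) for t in range(len(y) - 1)]
e = [rbar[t] - y[t] for t in range(len(rbar))]       # virtual tracking error

# PI-type controller u(t) = u(t-1) + th0 e(t) + th1 e(t-1):
# fit (th0, th1) by least squares on du(t) = u(t) - u(t-1).
rows = [(e[t], e[t - 1], u[t] - u[t - 1]) for t in range(1, len(e))]
s00 = sum(r[0] * r[0] for r in rows); s01 = sum(r[0] * r[1] for r in rows)
s11 = sum(r[1] * r[1] for r in rows)
c0 = sum(r[0] * r[2] for r in rows); c1 = sum(r[1] * r[2] for r in rows)
det = s00 * s11 - s01 * s01
th0 = (s11 * c0 - s01 * c1) / det
th1 = (s00 * c1 - s01 * c0) / det
print(round(th0, 3), round(th1, 3))  # (1-m)/b = 0.8 and -a(1-m)/b = -0.72
```

Because the ideal controller lies in the chosen class and the data are noiseless, least squares recovers it exactly; with noisy data, the filters derived in the paper become necessary for the equivalence with model reference control to hold.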
ABSTRACT
BACKGROUND: To date, a large number of acoustic therapies have been applied to treat tinnitus. The effect that those auditory stimuli produce is, however, not yet well understood. Furthermore, the conventional clinical protocol is based on a trial-and-error procedure, and there is no formal, adequate treatment follow-up. At present, the only way to evaluate acoustic therapies is by means of subjective methods such as visual analog scales and ad-hoc questionnaires. METHODS: This protocol seeks to establish an objective methodology to treat tinnitus with acoustic therapies based on the evaluation of electroencephalographic (EEG) activity. On the hypothesis that acoustic therapies should produce perceptual and cognitive changes at a cortical level, we propose to examine the neural electrical activity of patients suffering from refractory and chronic tinnitus at four different stages: at the beginning of the experiment, at one week of treatment, at five weeks of treatment, and at eight weeks of treatment. Four of the most efficient acoustic therapies reported to date are considered: retraining, auditory discrimination, enriched acoustic environment, and binaural. DISCUSSION: EEG has become a standard brain imaging tool for quantifying and qualifying neural oscillations, which are basically spatial, temporal, and spectral patterns associated with particular perceptual, cognitive, motor, and emotional processes. Neural oscillations have traditionally been studied on the basis of event-related experiments, where time-locked and phase-locked responses (i.e., event-related potentials) along with time-locked but not necessarily phase-locked responses (i.e., event-related (de)synchronization) are estimated. Both the potentials and the levels of synchronization related to auditory stimuli are herein proposed to assess the effect of acoustic therapies. TRIAL REGISTRATION: Registration Number: ISRCTN14553550. ISRCTN Registry: BioMed Central.
Date of Registration: October 31st, 2017.
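The event-related (de)synchronization measure proposed for the follow-up can be sketched as a percent band-power change relative to a pre-stimulus baseline. The signals below are synthetic illustrations, not patient data:

```python
import math

def power(x):
    """Mean power of a (band-filtered) epoch."""
    return sum(v * v for v in x) / len(x)

def erd_percent(event, baseline):
    """Event-related (de)synchronization as percent power change:
    negative = desynchronization, positive = synchronization."""
    return (power(event) - power(baseline)) / power(baseline) * 100

# Illustrative 10 Hz (alpha-band) epoch whose amplitude halves after onset.
baseline = [math.sin(2 * math.pi * 10 * t / 256) for t in range(256)]
event = [0.5 * v for v in baseline]
print(erd_percent(event, baseline))  # -75.0: power drops to a quarter
```

Computing this per channel and per frequency band at each of the four treatment stages is what would turn the subjective follow-up into an objective one.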
ABSTRACT
Epileptic encephalopathies (EE) is a term coined by the International League Against Epilepsy (ILAE) to refer to a group of epilepsies in which the ictal and interictal abnormalities may contribute to progressive cerebral dysfunction. Among them, two affect mainly children and are very difficult to deal with, Doose and Lennox-Gastaut syndromes, (DS and LGS, respectively). So far (Zavala-Yoe et al., J Integr Neurosci 15(2):205-223, 2015a and works of ours there), quantitative analysis of single case studies of EE have been performed. All of them are manifestations of drug resistant epileptic encephalopathies (DREES) and as known, such disorders require a lot of EEG studies through all patient's life. As a consequence, dozens of EEG records are stored by parents and neurologists as time goes by. However, taking into account all this massive information, our research questions (keeping colloquial wording by parents) arise: a) Which zone of the brain has been the most affected so far? b) On which year was the child better? c) How bad is our child with respect to others? We must reflect that despite clinical assessment of the EEG has undergone standardization by establishment of guidelines such as the recently published guidelines of the American Clinical Neurophysiology Society (Tsuchida et al., J Clin Neurophysiol 4(33):301-302, 2016), qualitative EEG will never be as objective as quantitative EEG, since it depends largely on the education and experience of the conducting neurophysiologist (Grant et al., Epilepsy Behav 2014(32):102-107, 2014, Rating, Z Epileptologie, Springer Med 27(2):139-142, 2014). We already answered quantitatively the above mentioned questions in the references of ours given above where we provided entropy curves and an entropy index which encompasses the complexity of bunches of EEG making possible to deal with massive data and to make objective comparisons among some patients simultaneously. 
Here, however, we have refined that index, and we also offer two additional measures, one spatial and one dynamic. Moreover, from those indices we derive what we call a temporal dynamic complexity path, which shows, on a standard 10-20 system head diagram, the evolution of the lowest complexity per brain zone with respect to the EEG period. These results make it possible to compare the progress of several patients quantitatively and graphically at the same time, answering the questions posed above. The results obtained showed that low spatio-temporal entropy indices can be associated with multiple seizure events in several patients simultaneously, and that seizure progress can be tracked in space and time with our entropy path, coinciding with the neurophysiologists' observations.
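Neither the refined index nor the complexity path is fully specified in the abstract. As a rough illustration of the general idea, a sketch assuming a precomputed (recording period × channel) matrix of entropy values and entirely hypothetical names:

```python
import numpy as np

# Standard 10-20 channel labels used for illustration.
CHANNELS = ["Fp1", "Fp2", "F3", "F4", "C3", "C4", "P3", "P4", "O1", "O2"]

def entropy_index(E):
    """Scalar summary of a collection of EEG records: the mean entropy
    over all recording periods (rows) and channels (columns).
    Lower values suggest more seizure-related regularity."""
    return float(np.mean(E))

def complexity_path(E, channels=CHANNELS):
    """Temporal dynamic complexity path (sketch): for each recording
    period, the channel of lowest entropy, i.e. the brain zone whose
    activity was least complex in that period."""
    return [channels[j] for j in np.argmin(E, axis=1)]
```

Plotting the returned channel sequence on a 10-20 head diagram, period by period, gives the kind of spatio-temporal path described above.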
Subjects
Brain Diseases/physiopathology, Epilepsy/physiopathology, Models, Neurological, Adolescent, Algorithms, Anticonvulsants/adverse effects, Anticonvulsants/therapeutic use, Brain/physiopathology, Brain Diseases/drug therapy, Child, Child, Preschool, Databases, Factual, Drug Resistance, Electroencephalography, Entropy, Epilepsy/drug therapy, Female, Guidelines as Topic, Humans, Infant, Male, Reference Values, Syndrome
ABSTRACT
Brain-computer interface (BCI) technology is developing fast, but it remains inaccurate, unreliable, and slow due to the difficulty of obtaining precise information from the brain. Consequently, the use of other biosignals to decode the user's control tasks has grown in importance. A traditional way to operate a BCI system is via motor imagery (MI) tasks. Since imagined movements activate cortical structures and vegetative mechanisms similar to those of voluntary movements, heart rate variability (HRV) has been proposed as a parameter to improve the detection of MI-related control tasks. However, HR is very susceptible to bodily needs and environmental demands, and as BCI systems require high levels of attention, perceptual processing, and mental workload, it is important to assess the practical effectiveness of HRV. The present study aimed to determine whether brain and heart electrical signals (HRV) are modulated by the MI activity used to control a BCI system, or whether HRV is modulated by the user's perceptions and responses resulting from the operation of a BCI system (i.e., the user experience). For this purpose, a database of 11 participants who were exposed to eight different situations was used. The sensory-cognitive load (intake and rejection tasks) was controlled in those situations. Two electrophysiological signals were utilized: electroencephalography and electrocardiography. From those biosignals, event-related (de)synchronization maps and event-related HR changes were respectively estimated. The maps and the HR changes were cross-correlated in order to verify whether both biosignals were modulated by MI activity. The results suggest that HR varies according to the experience undergone by the user in a BCI working environment, and not because of the MI activity used to operate the system.
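The cross-correlation step, aligning an event-locked time course (e.g., an ERD curve) with the event-related HR change and reading off the peak correlation and its lag, can be sketched as follows. This is a generic assumed implementation, not the authors' code:

```python
import numpy as np

def xcorr_peak(a, b, fs):
    """Normalized cross-correlation of two event-locked time courses
    sampled at the same rate. Returns (peak correlation, lag in
    seconds); a positive lag means the first series trails the second."""
    a = (np.asarray(a, float) - np.mean(a)) / np.std(a)
    b = (np.asarray(b, float) - np.mean(b)) / np.std(b)
    c = np.correlate(a, b, mode="full") / len(a)   # z-scored -> r in [-1, 1]
    lags = np.arange(-len(b) + 1, len(a))          # sample lags of 'full' mode
    k = np.argmax(np.abs(c))                       # strongest coupling, either sign
    return c[k], lags[k] / fs
```

A strong peak at a physiologically plausible lag would indicate that the two biosignals co-vary around the event, which is the kind of evidence the study uses to decide whether HR follows MI activity or the user experience.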
ABSTRACT
Doose and Lennox-Gastaut syndromes are rare generalized electroclinical disorders of early infancy with variable prognosis that manifest with very diverse kinds of seizures. Very frequently, these types of epilepsy become drug resistant, and obtaining reliable treatment results is very difficult. As a result, fighting these syndromes becomes a long-term (or endless) struggle for the young patient, the neurologist, and the parents. Many electroencephalographic (EEG) records thus accumulate during the child's life in order to monitor evolution and correlate it with medications. So, given such a collection of EEG records, three questions arise: (a) In which year was the child healthier (less affected by seizures)? (b) Which area of the brain has been the most affected? (c) What is the status of the child with respect to others (each of whom also has a collection of EEG records)? Answering these questions by traditionally scrutinizing the whole database is subjective, if not impossible. We propose to answer them objectively by means of time series entropies. We start with our modified version of the multiscale entropy (MSE), generalize it to a bivariate MSE (BMSE), and from these compute two indices. All were tested in a series of patients and coincide with the medical conclusions. To the best of our knowledge, our contribution is new.
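Standard multiscale entropy, which the abstract builds on, is the sample entropy of successively coarse-grained versions of a series (Costa's procedure). A self-contained sketch of that baseline, not of the authors' modified or bivariate versions:

```python
import numpy as np

def coarse_grain(x, tau):
    """Costa's coarse-graining: non-overlapping averages of length tau."""
    n = len(x) // tau
    return x[:n * tau].reshape(n, tau).mean(axis=1)

def sampen(x, m=2, tol=None):
    """Sample entropy: -log of the conditional probability that
    sequences matching for m points also match for m + 1 points."""
    x = np.asarray(x, float)
    if tol is None:
        tol = 0.2 * x.std()
    n = len(x)

    def pairs(mm):
        t = np.array([x[i:i + mm] for i in range(n - mm)])
        d = np.max(np.abs(t[:, None] - t[None, :]), axis=2)  # Chebyshev distance
        return (np.sum(d <= tol) - len(t)) / 2               # exclude self-matches

    a, b = pairs(m + 1), pairs(m)
    return -np.log(a / b) if a > 0 and b > 0 else float("inf")

def mse(x, max_tau=5, m=2):
    """Multiscale entropy curve: SampEn of successive coarse-grainings,
    with the tolerance fixed at 20% of the scale-1 standard deviation."""
    x = np.asarray(x, float)
    tol = 0.2 * x.std()
    return [sampen(coarse_grain(x, tau), m, tol) for tau in range(1, max_tau + 1)]
```

Regular signals (e.g., a sampled sine) yield low sample entropy, while white noise yields high entropy at scale 1 that decreases with the scale factor; it is this kind of complexity profile, computed per channel and per year, that the indices in the abstract summarize.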