ABSTRACT
Fusing data from many sources helps achieve improved analysis and results. In this work, we present a new algorithm for fusing data from multiple cameras with data from multiple lidars. The algorithm was developed to increase the sensitivity and specificity of autonomous vehicle perception systems, where the most accurate sensors measuring the vehicle's surroundings are cameras and lidar devices. Perception systems based on a single sensor type do not use the complete available information and achieve lower quality. A camera provides two-dimensional images; a lidar produces three-dimensional point clouds. We developed a method for matching pixels in a pair of stereoscopic images using dynamic programming, inspired by an algorithm used in bioinformatics to align amino acid sequences. We improve the quality of the basic algorithm using additional data from edge detectors, and we improve its performance by limiting the set of pixels to be matched based on the range of possible vehicle speeds. In the final step of our method, we perform point cloud densification, fusing lidar output with the stereo vision output. We implemented our algorithm in C++ with a Python API and released it as an open-source library named Stereo PCD, which fuses data from multiple cameras and multiple lidars very efficiently. In the article, we present quality and performance results of our approach on benchmark databases and compare our algorithm with other popular methods.
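The bioinformatics-inspired dynamic programming step can be illustrated with a minimal Needleman-Wunsch-style alignment of two scanlines. This is only a sketch: the intensity similarity score and gap penalty below are illustrative assumptions, not the parameters used in Stereo PCD.

```python
# Minimal Needleman-Wunsch-style alignment of two stereo scanlines.
# Pixels are 1-D intensities; the similarity score and gap penalty
# are illustrative values, not the paper's parameters.

def align_scanlines(left, right, gap=-2, max_diff=10):
    n, m = len(left), len(right)
    # score[i][j] = best score aligning left[:i] with right[:j]
    score = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        score[i][0] = i * gap
    for j in range(1, m + 1):
        score[0][j] = j * gap
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            # reward similar intensities, penalize large differences
            sim = max_diff - abs(left[i - 1] - right[j - 1])
            score[i][j] = max(score[i - 1][j - 1] + sim,
                              score[i - 1][j] + gap,
                              score[i][j - 1] + gap)
    # backtrack to recover pixel correspondences
    pairs, i, j = [], n, m
    while i > 0 and j > 0:
        sim = max_diff - abs(left[i - 1] - right[j - 1])
        if score[i][j] == score[i - 1][j - 1] + sim:
            pairs.append((i - 1, j - 1))
            i, j = i - 1, j - 1
        elif score[i][j] == score[i - 1][j] + gap:
            i -= 1
        else:
            j -= 1
    return list(reversed(pairs))

left = [10, 50, 52, 200, 10]
right = [50, 52, 200, 10, 10]  # same scene, shifted by about one pixel
pairs = align_scanlines(left, right)
print(pairs)  # [(1, 0), (2, 1), (3, 2), (4, 4)]
```

Matched index pairs directly give per-pixel disparities, which is the quantity stereo vision needs for depth estimation.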
ABSTRACT
The field of quantum computing (QC) is expanding, with efforts being made to apply it to areas previously covered by classical algorithms and methods. Bioinformatics is one such domain developing in terms of QC. This article offers a broad mapping review of QC methods and algorithms in bioinformatics, the first of its kind. It presents an overview of the domain and helps researchers identify further research directions in the early stages of this field. The work presented here shows the current state-of-the-art solutions, focuses on general future directions, and highlights the limitations of current methods. The gathered data include a comprehensive list of identified methods along with descriptions, classifications, and elaborations of their advantages and disadvantages. Results are presented not just in a descriptive table but also in an aggregated, visual format.
Subject(s)
Algorithms , Computational Biology , Quantum Theory , Computational Biology/methods , Humans
ABSTRACT
Multiple sclerosis (MS) is a demyelinating and neurodegenerative disease caused by a chronic autoimmune inflammatory response, yet its etiology is not completely understood. It is known that physical activity plays an essential role in improving quality of life, especially in neuropathological conditions. This study aimed to investigate the possible benefits of high-intensity interval training (HIIT) on bone and lipid metabolism markers and neuromotor abilities in MS patients. A total of 130 participants were recruited; 16 subjects with MS met the inclusion criteria and were included in the data analysis. The patients were randomly assigned to two groups: a control group (CG) (34.88 ± 4.45 yrs) that did not perform any physical activity and an exercise group (EG) (36.20 ± 7.80 yrs) that performed the HIIT protocol. The training program was conducted remotely by a kinesiologist and was performed three times a week for 8 weeks. At the beginning (T0) and the end of the study (T1), physical function tests and analyses of bone remodelling and lipid markers were performed. After 8 weeks of training, wall squat (s) (T0 = 27.18 ± 4.21; T1 = 41.68 ± 5.38, p ≤ 0.01) and Timed Up and Go test (s) (T0 = 7.65 ± 0.43; T1 = 6.34 ± 0.38, p ≤ 0.01) performances improved; lipid marker analysis showed a decrease in total (mg/dl) (T0 = 187.22 ± 15.73; T1 = 173.44 ± 13.03, p ≤ 0.05) and LDL (mg/dl) (T0 = 108 ± 21.08; T1 = 95.02 ± 17.99, p < 0.05) cholesterol levels. Additionally, the level of osteocalcin (µg/L), a marker of bone formation, increased (T0 = 20.88 ± 4.22; T1 = 23.66 ± 6.24, p < 0.05), and 25-OH vitamin D (µg/L) improved after 8 weeks (T0 = 21.11 ± 7.11; T1 = 27.66 ± 7.59, p < 0.05). HIIT improved lower limb strength and gait control, bone formation, and lipid management in MS patients.
Subject(s)
Bone Remodeling , High-Intensity Interval Training , Multiple Sclerosis , Humans , High-Intensity Interval Training/methods , Male , Female , Adult , Multiple Sclerosis/physiopathology , Multiple Sclerosis/blood , Multiple Sclerosis/therapy , Lipids/blood , Lipid Metabolism , Biomarkers/blood , Middle Aged , Quality of Life , Exercise Therapy/methods , Exercise/physiology
ABSTRACT
The electrocardiogram (ECG) is a physiological signal and a standard test that measures the heart's electrical activity, depicting the movement of the cardiac muscles. A review study was conducted on ECG signal analysis with artificial intelligence (AI) methods over the last ten years (2012-2022). Primarily, the methods of ECG analysis by software systems were divided into classical signal processing (e.g., spectrograms or filters), machine learning (ML), and deep learning (DL), including recurrent models, transformers, and hybrids. Secondly, the data sources and benchmark datasets were described; resources were grouped by ECG acquisition method into hospital-based portable machines and wearable devices. New trends such as advanced pre-processing, data augmentation, simulations, and agent-based modeling were also included. The study found that ECG analysis accuracy has improved each year through ML, DL, hybrid models, and transformers. Convolutional neural networks and hybrid models were the most targeted and proved efficient, and transformer models extended accuracy from 90% to 98%. The PhysioNet library helps acquire ECG signals and includes popular benchmark databases such as MIT-BIH and PTB, as well as challenge datasets. Similarly, wearable devices have been established as an appropriate option for monitoring patient health without time and place limitations and are also helpful for AI model calibration, with accuracies of 82%-83% so far on a Samsung smartwatch. In signal pre-processing, spectrogram generation through Fourier and wavelet transformations has emerged as the leading approach, yielding average accuracies of 90%-95%. Likewise, data augmentation using geometrical techniques is well considered; however, extraction- and concatenation-based methods need attention.
Since what-if analyses of healthcare or cardiac issues can be performed using complex simulations, the study also reviews agent-based modeling and simulation approaches for cardiovascular risk event assessment.
Subject(s)
Algorithms , Artificial Intelligence , Humans , Neural Networks, Computer , Software , Signal Processing, Computer-Assisted , Electrocardiography/methods
ABSTRACT
Currently, there are numerous methods that can be used to neutralize pathogens on devices, tools, or protective clothing, but the sterilizing agent must be selected so that it does not damage or change the properties of the material to which it is applied. Dry sterilization with vaporized hydrogen peroxide (VHP) in combination with UV-C radiation is a well-described and effective method of sterilization. This paper presents the design, construction, and analysis of a novel sterilization device. Verification of the sterilization process was performed, using classical microbiological methods and flow cytometry, on samples containing Geobacillus stearothermophilus spores, Bacillus subtilis spores, Escherichia coli, and Candida albicans. Flow cytometry results were in line with the standardized microbiological tests and confirmed the effectiveness of the sterilization process. It was also determined that mobile sterilization stations represent a valuable solution for public institutions and for businesses in the tourism sector, the sports and fitness industry, and other service sectors, e.g., cosmetic services. A key feature of this solution is the ability to adapt the device, within specific constraints, to the user's needs.
Subject(s)
Geobacillus stearothermophilus , Sterilization , Sterilization/methods , Bacillus subtilis , Hydrogen Peroxide , Spores , Spores, Bacterial
ABSTRACT
BACKGROUND: The five base-pair (bp) insertion/deletion (rs3039851) polymorphism in the PPP3R1 gene, which encodes calcineurin subunit B type 1, has been found to be associated with left ventricular hypertrophy (LVH) in hypertensive patients and in athletes. The aim of this study was to analyze the possible association between the PPP3R1:rs3039851 polymorphism and left ventricular mass (LVM) in full-term healthy newborns. METHODS: The study group consisted of 162 consecutive, full-term, healthy newborns. Two-dimensional M-mode echocardiography was used to assess LVM. The PPP3R1:rs3039851 polymorphism was identified by PCR-RFLP in genomic DNA extracted from cord blood leukocytes. RESULTS: No significant differences were found between newborns homozygous for the reference allele (5I/5I, n = 135) and newborns carrying at least one 5D allele (n = 27) for LVM standardized for body mass, body length or body surface area (LVM/BM, LVM/BL or LVM/BSA, respectively). However, the frequency of PPP3R1:rs3039851 genotypes with a 5D allele (5I/5D + 5D/5D) among newborns with the largest LVM/BM or LVM/BSA (upper tertile) was statistically significantly higher compared with the prevalence in individuals with the lowest values of both indices (lower tertile). CONCLUSIONS: Our results suggest that the PPP3R1:rs3039851 polymorphism may contribute to subtle variation in left ventricular mass at birth.
ABSTRACT
This study aimed to assess the post-effort transcriptional changes of selected genes encoding receptors for chemokines and interleukins in young, physically active men to better understand the immunomodulatory effect of physical activity. The participants, aged 16-21 years, performed either a maximal multistage 20 m shuttle-run test (beep test) or a repeated speed ability test. The expression of selected genes encoding receptors for chemokines and interleukins in nucleated peripheral blood cells was determined using RT-qPCR. Aerobic endurance activity was a positive stimulant that induced increased expression of the CCR1 and CCR2 genes following lactate recovery, while the maximum expression of CCR5 was found immediately post-effort. The increase in the expression of inflammation-related genes encoding chemokine receptors triggered by aerobic effort strengthens the theory that physical effort induces sterile inflammation. The different profiles of chemokine receptor gene expression induced by short-term anaerobic effort suggest that not all types of physical effort activate the same immunological pathways. A significant increase in IL17RA gene expression after the beep test confirmed the hypothesis that cells expressing this receptor, including Th17 lymphocyte subsets, can be involved in the creation of an immune response after endurance efforts.
Subject(s)
Physical Exertion , Receptors, CCR2 , Male , Humans , Receptors, CCR5/genetics , Chemokines/metabolism , Blood Cells/metabolism , Receptors, Interleukin , Inflammation/genetics
ABSTRACT
Respiration rate is an important healthcare indicator and has become a popular research topic in remote healthcare applications within the Internet of Things. Existing respiration monitoring systems have limitations in terms of convenience, comfort, and privacy. This paper presents Wi-Breath, a contactless, real-time respiration monitoring system based on off-the-shelf WiFi devices. The system monitors respiration using both the amplitude and the phase difference of the WiFi channel state information (CSI), which is sensitive to micro-movements of the human body. For better respiration detection accuracy, a signal selection method based on a support vector machine (SVM) is proposed to select the more appropriate of the amplitude and phase-difference signals. Experimental results demonstrate that Wi-Breath achieves an accuracy of 91.2% for respiration detection, with a 17.0% reduction in average error compared with state-of-the-art counterparts.
Subject(s)
Algorithms , Respiratory Rate , Humans , Monitoring, Physiologic , Wireless Technology , Delivery of Health Care
ABSTRACT
MicroRNAs (miRNAs) influence several biological processes involved in human disease. Biological experiments for verifying miRNA-disease associations are costly in terms of both money and time. Although numerous biological experiments have identified multiple types of associations between miRNAs and diseases, existing computational methods are unable to sufficiently mine the knowledge in these associations to predict unknown ones. In this study, we propose a heterogeneous graph attention network model based on meta-subgraphs (MSHGANMDA) to predict potential miRNA-disease associations. First, we define five types of meta-subgraphs from the known miRNA-disease associations. Then, we use meta-subgraph attention and meta-subgraph semantic attention to extract features of miRNA-disease pairs within and between these five meta-subgraphs, respectively. Finally, we apply a fully-connected layer (FCL) to predict the scores of unknown miRNA-disease associations, using cross-entropy loss to train our model end-to-end. To evaluate the effectiveness of MSHGANMDA, we apply five-fold cross-validation and obtain mean Accuracy, Precision, Recall, and F1-score values of 0.8595, 0.8601, 0.8596, and 0.8595, respectively. Experiments show that our model, which primarily utilizes multiple types of miRNA-disease association data, achieves the highest ROC-AUC value, 0.934, when compared with other state-of-the-art approaches. Furthermore, through case studies, we further confirm the effectiveness of MSHGANMDA in predicting unknown miRNA-disease associations.
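The evaluation described above reduces to computing standard confusion-matrix metrics per fold and averaging them. A minimal sketch of the per-fold metric computation, with made-up labels rather than the study's data:

```python
# Accuracy, precision, recall, and F1 for binary association predictions.
# The labels below are illustrative, not from the MSHGANMDA experiments.

def binary_metrics(y_true, y_pred):
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    accuracy = (tp + tn) / len(y_true)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return accuracy, precision, recall, f1

# 1 = known association, 0 = no known association
y_true = [1, 1, 0, 0, 1, 0, 1, 0]
y_pred = [1, 0, 0, 0, 1, 1, 1, 0]
acc, prec, rec, f1 = binary_metrics(y_true, y_pred)
print(acc, prec, rec, f1)  # 0.75 0.75 0.75 0.75
```

In five-fold cross-validation, these four values would be computed once per fold and averaged, which is how the reported means arise.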
Subject(s)
MicroRNAs , Humans , MicroRNAs/genetics , Computational Biology/methods , Algorithms
ABSTRACT
Non-invasive electrocardiogram (ECG) signals are useful in assessing heart condition and helpful in diagnosing cardiac diseases. However, the traditional way, i.e., medical consultation, requires effort, knowledge, and time to interpret ECG signals due to the large amount of data and its complexity. Neural networks have recently been shown to be efficient in interpreting biomedical signals, including ECG and EEG. The novelty of the proposed work is the use of spectrograms instead of raw signals. Spectrograms can easily be reduced by eliminating frequencies carrying no ECG information. Moreover, spectrogram calculation is time-efficient through the short-time Fourier transform (STFT), which presents the reduced data in a well-distinguishable form to a convolutional neural network (CNN). The data reduction was performed through frequency filtration with a specific cutoff value. These steps keep the architecture of the CNN model simple, and the model showed high accuracy. The proposed approach reduces memory usage and computational power by avoiding complex CNN models. The large, publicly available PTB-XL dataset was utilized, and two datasets were prepared for binary classification: spectrograms and raw signals. The proposed approach achieved the highest accuracy, 99.06%, which indicates that spectrograms are better suited than raw signals for ECG classification. Further, up- and down-sampling of the signals was also performed at various sampling rates, and the corresponding accuracies were recorded.
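The described pre-processing, an STFT followed by discarding high-frequency bins, can be sketched as follows. The window length, hop size, and cutoff here are illustrative assumptions, not the paper's settings.

```python
# Build a magnitude spectrogram of a 1-D signal via a short-time Fourier
# transform, keeping only low-frequency bins as the data-reduction step.
import cmath
import math

def stft_spectrogram(signal, win=32, hop=16, keep_bins=8):
    frames = []
    for start in range(0, len(signal) - win + 1, hop):
        seg = signal[start:start + win]
        # Hann window reduces spectral leakage between bins
        seg = [s * 0.5 * (1 - math.cos(2 * math.pi * i / (win - 1)))
               for i, s in enumerate(seg)]
        # keep only the first `keep_bins` DFT magnitudes: higher
        # frequencies carry little ECG information and are discarded
        mags = []
        for k in range(keep_bins):
            coeff = sum(s * cmath.exp(-2j * math.pi * k * n / win)
                        for n, s in enumerate(seg))
            mags.append(abs(coeff))
        frames.append(mags)
    return frames  # time x frequency matrix, ready for a 2-D CNN

# 256 samples of a synthetic 8 Hz sine sampled at 128 Hz;
# with win=32 this frequency falls exactly on DFT bin 2
fs = 128
sig = [math.sin(2 * math.pi * 8 * t / fs) for t in range(256)]
spec = stft_spectrogram(sig)
print(len(spec), len(spec[0]))  # frames x kept frequency bins
```

Each row of the resulting matrix is one time frame over the kept frequency bins, so the whole matrix can be treated as a small single-channel image for a CNN.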
Subject(s)
Heart Diseases , Neural Networks, Computer , Humans , Heart Rate , Electrocardiography , Filtration , Algorithms
ABSTRACT
Sport diagnostics is still in pursuit of the optimal combination of biochemical and hematological markers to assess training loads and the effectiveness of recovery. The biochemical and hematological markers selected for a panel should be specific to the sport and training program. Therefore, the aim of this study was to evaluate the usefulness of selected biochemical and hematological variables in professional long-distance and sprint swimming. Twenty-seven participants aged 15-18 years took part in the study. Alanine aminotransferase (ALT), aspartate aminotransferase (AST), lactate dehydrogenase (LDH) and alkaline phosphatase (ALP) activities and creatinine (Cr), C-reactive protein (CRP), ferritin, total bilirubin (TB), direct bilirubin (DB) and iron concentrations were measured for 10 weeks and compared with the traditional sport diagnostic markers of creatine kinase (CK) activity and urea (U) concentration. Additionally, capillary blood morphology was analyzed. An effective panel should consist of measurements of CK and AST activities and urea, TB, DB and ferritin concentrations. These markers provide a good overview of athletes' post-training effort changes, can help assess the effectiveness of their recovery regardless of sex or competitive distance and are affordable. Moreover, changes in ferritin concentration can indicate inflammation status and, when combined with iron concentration and blood morphology, can help to avoid iron deficiencies, anemia and adverse inflammatory states in swimmers.
Subject(s)
Bilirubin , Ferritins , Aspartate Aminotransferases , Biomarkers , Humans , Iron , Urea
ABSTRACT
With the popularity of wireless body sensor networks, real-time and continuous collection of single-lead electrocardiogram (ECG) data has become possible in a convenient way. Data mining of the collected single-lead ECG waves has therefore aroused extensive attention worldwide, with early detection of atrial fibrillation (AF) being a hot research topic. In this paper, a two-channel convolutional neural network combined with a data augmentation method is proposed to detect AF from single-lead short ECG recordings. It consists of three modules: the first denoises the raw ECG signals and produces 9-s ECG segments and heart rate (HR) values; then, the ECG segments and HR values are fed into the convolutional layers for feature extraction, followed by three fully connected layers that perform the classification. The data augmentation method generates synthetic signals to enlarge the training set and increase the diversity of the single-lead ECG signals. Validation experiments and comparison with state-of-the-art studies demonstrate the effectiveness and advantages of the proposed method.
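The augmentation idea, generating synthetic variants of each ECG segment to enlarge the training set, can be sketched with simple amplitude scaling and circular time shifts. These specific transforms and ranges are assumptions for illustration; the paper's augmentation method may differ.

```python
# Toy augmentation for 1-D ECG segments: random amplitude scaling and
# circular time shifts. Transform choices and ranges are illustrative.
import random

def augment(segment, n_copies=3, seed=42):
    rng = random.Random(seed)
    out = []
    for _ in range(n_copies):
        scale = rng.uniform(0.8, 1.2)        # mimic electrode gain changes
        shift = rng.randrange(len(segment))  # mimic different beat onsets
        shifted = segment[shift:] + segment[:shift]
        out.append([scale * x for x in shifted])
    return out

segment = [0.0, 0.1, 1.0, 0.2, 0.0, -0.1]  # a tiny synthetic "beat"
copies = augment(segment)
print(len(copies), len(copies[0]))
```

Each synthetic copy keeps the morphology of the original beat while varying its amplitude and position, which is the diversity the training set needs.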
Subject(s)
Atrial Fibrillation , Deep Learning , Atrial Fibrillation/diagnosis , Electrocardiography/methods , Heart Rate , Humans , Neural Networks, Computer
ABSTRACT
The Th1 cell subset is involved in the immunological response induced by physical exercise. The aim of this work was to evaluate the post-effort activation of the Ras/MAPK and JAK/STAT signaling pathways in T cells of young, physically active men. Seventy-six physically active, healthy men between 15 and 21 years old performed a standard physical exercise protocol (beep test). Phosphorylation levels of Ras/MAPK-related (p38 MAPK, ERK1/2) and JAK/STAT-related (STAT1, STAT3, STAT5, and STAT6) proteins were evaluated by flow cytometry in Th and Tc cells post-effort and during the lactate recovery period. The performed physical effort was not a strong enough physiological stimulant to provoke the phosphorylation of ERK1/2, p38 MAPK, STAT1, STAT3, STAT5, and STAT6 in T cells, at least for the duration of our study (until the end of the lactate recovery period). We conclude that more observation time-points, including shorter and longer times after the exercise, are required to determine whether the Ras/MAPK signaling pathway is involved in modulating the post-effort immunological response.
ABSTRACT
BACKGROUND: The assembly task is an indispensable step in sequencing the genomes of new organisms and studying structural genomic changes. In recent years, the dynamic development of next-generation sequencing (NGS) methods has raised hopes of making whole-genome sequencing a fast and reliable tool used, for example, in medical diagnostics. However, this is hampered by the slowness and computational requirements of current processing algorithms, which raises the need to develop more efficient algorithms. One possible approach, still little explored, is the use of quantum computing. RESULTS: We present a proof of concept of a de novo assembly algorithm that uses the Genomic Signal Processing approach, detecting overlaps between DNA reads by calculating the Pearson correlation coefficient and formulating the assembly problem as an optimization task (the Traveling Salesman Problem). Computations performed on a classical computer were compared with the results achieved by a hybrid method combining CPU and QPU calculations; for this purpose, a quantum annealer by D-Wave was used. The experiments were performed with artificially generated data and DNA reads coming from a simulator, with actual organism genomes used as input sequences. To our knowledge, this work is one of the few in which actual sequences of organisms were used to study the de novo assembly task on a quantum annealer. CONCLUSIONS: Our proof of concept showed that the use of a quantum annealer (QA) for the de novo assembly task might be a promising alternative to computations performed in the classical model. The current computing power of the available devices requires a hybrid approach (combining CPU and QPU computations). The next step may be developing a hybrid algorithm strictly dedicated to the de novo assembly task, exploiting its specificity (e.g., the sparsity and bounded degree of the overlap-layout-consensus graph).
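The overlap-detection step can be sketched as follows: encode bases numerically and score candidate suffix-prefix overlaps with the Pearson correlation coefficient. The base encoding, minimum overlap length, and example reads are illustrative assumptions, not the paper's exact setup.

```python
# Score a suffix-prefix overlap between two DNA reads by encoding bases
# as numbers and computing the Pearson correlation of the overlapping
# windows. Encoding and threshold choices are illustrative.
import math

ENC = {"A": 1.0, "C": 2.0, "G": 3.0, "T": 4.0}

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy) if sx and sy else 0.0

def best_overlap(read_a, read_b, min_len=4):
    a = [ENC[c] for c in read_a]
    b = [ENC[c] for c in read_b]
    best = (0.0, 0)
    # try every suffix of read_a against the equal-length prefix of read_b
    for k in range(min_len, min(len(a), len(b)) + 1):
        r = pearson(a[-k:], b[:k])
        if r > best[0]:
            best = (r, k)
    return best  # (correlation, overlap length)

r, k = best_overlap("ACGTACGG", "ACGGTTAC")
print(r, k)
```

Pairwise overlap scores like this one become edge weights of the read graph, on which the Traveling Salesman Problem is then formulated and handed to the annealer.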
Subject(s)
Computing Methodologies , Quantum Theory , Algorithms , Base Sequence , DNA/genetics , High-Throughput Nucleotide Sequencing/methods , Sequence Analysis, DNA/methods
ABSTRACT
Species tree estimation faces many significant hurdles. Chief among them is that the trees describing the ancestral lineages of each individual gene (the gene trees) often differ from the species tree. The multispecies coalescent is commonly used to model this gene tree discordance, at least when it is believed to arise from incomplete lineage sorting, a population-genetic effect. Another significant challenge in this area is that the molecular sequences associated with each gene typically provide limited information about the gene trees themselves. While the modeling of sequence evolution by single-site substitutions is well studied, few species tree reconstruction methods with theoretical guarantees actually address this latter issue. Instead, a standard, but unsatisfactory, assumption is that gene trees are perfectly reconstructed before being fed into a so-called summary method. Hence much remains to be done in the development of inference methodologies that rigorously account for gene tree estimation error, or completely avoid gene tree estimation in the first place. In previous work, a data requirement trade-off was derived between the number of loci m needed for an accurate reconstruction and the length of the locus sequences k. It was shown that to reconstruct an internal branch of length f, one needs m to be of the order of [Formula: see text]. That previous result was obtained under the restrictive assumption that mutation rates as well as population sizes are constant across the species phylogeny. Here we generalize this result beyond that assumption. Our main contribution is a novel reduction to the molecular clock case under the multispecies coalescent, which we refer to as a stochastic Farris transform.
As a corollary, we also obtain a new identifiability result of independent interest: for any species tree with [Formula: see text] species, the rooted topology of the species tree can be identified from the distribution of its unrooted weighted gene trees even in the absence of a molecular clock.
Subject(s)
Genetic Speciation , Models, Genetic , Phylogeny
ABSTRACT
BACKGROUND: A typical Copy Number Variation (CNV) detection process based on the depth of coverage in Whole Exome Sequencing (WES) data consists of several steps: (I) calculating the depth of coverage in sequencing regions, (II) quality control, (III) normalizing the depth of coverage, (IV) calling CNVs. Previous tools performed one normalization process per chromosome: all the coverage depths in the sequencing regions from a given chromosome were normalized in a single run. METHODS: Herein, we present the new CNVind tool for calling CNVs, in which the normalization process is conducted separately for each sequencing region. The total number of normalizations equals the number of sequencing regions in the investigated dataset; for example, when analyzing a dataset composed of n sequencing regions, CNVind performs n independent depth-of-coverage normalizations. Before each normalization, the application selects the k most correlated sequencing regions, using Pearson's correlation of the depth of coverage as the distance metric. The resulting subgroup of [Formula: see text] sequencing regions is then normalized, the results of all n independent normalizations are combined, and finally the segmentation and CNV calling process is performed on the resultant dataset. RESULTS AND CONCLUSIONS: We used WES data from the 1000 Genomes project to evaluate the impact of independent normalization on CNV calling performance and compared the results with state-of-the-art tools: CODEX and exomeCopy. The results proved that independent normalization significantly improves the specificity of rare CNV detection. For example, for the investigated dataset, we reduced the number of FP calls from over 15,000 to around 5,000 while maintaining a constant number of TP calls, equal to about 150 CNVs.
However, independent normalization of each sequencing region is a computationally expensive process; therefore, our pipeline is customized and can easily be run in a cloud computing environment, on a computer cluster, or on a single CPU server. To our knowledge, the presented application is the first attempt to implement this approach of independently normalizing the depth of WES data coverage.
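The region-selection step performed before each independent normalization can be sketched as follows: for one target region, pick the k other regions whose per-sample depth-of-coverage profiles are most Pearson-correlated with it. The toy coverage matrix is illustrative, not 1000 Genomes data.

```python
# For a target sequencing region, select the k most correlated other
# regions by Pearson correlation of per-sample depth of coverage.
import math

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy) if sx and sy else 0.0

def top_k_correlated(coverage, target, k):
    # coverage[i] = depth of coverage of region i across samples
    scores = [(pearson(coverage[target], row), i)
              for i, row in enumerate(coverage) if i != target]
    scores.sort(reverse=True)
    return [i for _, i in scores[:k]]

coverage = [
    [10, 20, 30, 40],  # region 0 (target)
    [11, 19, 31, 39],  # region 1: tracks region 0 closely
    [40, 30, 20, 10],  # region 2: anti-correlated
    [12, 22, 29, 41],  # region 3: also tracks region 0
    [25, 25, 24, 26],  # region 4: nearly flat
]
neighbours = top_k_correlated(coverage, target=0, k=2)
print(neighbours)
```

Only the selected subgroup is then normalized together, which is what makes each of the n normalizations independent of the rest of the chromosome.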
Subject(s)
DNA Copy Number Variations , Exome , Algorithms , Cloud Computing , High-Throughput Nucleotide Sequencing/methods , Exome Sequencing
ABSTRACT
In this work, the problem of classifying Polish court rulings based on their text is presented. We use natural language processing methods and classifiers based on convolutional and recurrent neural networks. We prepared a dataset of 144,784 authentic, anonymized Polish court rulings. We analyze various general language embedding matrices and multiple neural network architectures with different parameters. Results show that such models can classify documents with very high accuracy (>99%). We also include an analysis of wrongly predicted examples. Performance analysis shows that our method is fast and could be used in practice on typical server hardware with two central processing units (CPUs) or with a CPU and a graphics processing unit (GPU).
Subject(s)
Natural Language Processing , Neural Networks, Computer , Computers , Language , Poland
ABSTRACT
Third-generation DNA sequencers provided by Oxford Nanopore Technologies (ONT) produce a series of samples of the electrical current in the nanopore. This time series is used to detect the sequence of nucleotides. The task of translating current values into nucleotide symbols is called basecalling. Various solutions for basecalling have already been proposed; the earlier ones were based on Hidden Markov Models, but the best use neural networks or other machine learning models. Unfortunately, the accuracy scores achieved are still lower than those of competing sequencing techniques, such as Illumina's. Basecallers differ in input data type: currently, most of them work on raw data straight from the sequencer (a time series of current values), but the approach of using event data is also explored. Event data is obtained by preprocessing the raw data and dividing it into segments described by several features computed from the raw data values within each segment. We propose a novel basecaller that jointly processes raw and event data. We define basecalling as a sequence-to-sequence translation, and we use a machine learning model based on an encoder-decoder architecture of recurrent neural networks. Our model incorporates twin encoders and an attention mechanism. We tested our solution on simulated and real datasets, comparing the full model's accuracy with that of its components processing only raw or only event data, and with the existing ONT basecaller, Guppy. Results of numerical experiments show that joint raw and event data processing provides better basecalling accuracy than processing each data type separately. We provide an implementation called Ravvent, freely available under the MIT licence.
Subject(s)
Nanopores , DNA , Machine Learning , Neural Networks, Computer , Sequence Analysis, DNA/methods
ABSTRACT
In this work we present an automated approach to allergy recognition based on neural networks. Allergic reaction classification is an important task in modern medicine; currently it is done by humans, which has obvious drawbacks, such as subjectivity. We propose an automated method to classify prick-test allergic reactions using correlated visible-spectrum and thermal images of a patient's forearm. We test our model on a real-life dataset of 100 patients (1584 separate allergen injections). Our solution yields good results: 0.98 ROC AUC, 0.97 AP, and 93.6% accuracy. Additionally, we present a method to segment separate allergen injection areas from the image of the patient's forearm (multiple injections per forearm). The proposed approach can potentially reduce examination time while taking into consideration more information than human staff could.
Subject(s)
Allergens/immunology , Hypersensitivity/immunology , Neural Networks, Computer , Skin Tests/methods , Skin/immunology , Thermography/methods , Allergens/administration & dosage , Datasets as Topic , Female , Humans , Male
ABSTRACT
Everyday hygiene and professional realities, especially in economically developed countries, indicate the need to modify the standards of pro-health programs as well as modern hygiene and work ergonomics programs. These observations stem from the problem of premature death caused by civilization diseases. The biological mechanisms associated with susceptibility to financial risk are well described, but there is little data explaining the biological basis of neuroaccounting. Therefore, the aim of this study was to present the relationships between personality traits, cognitive competences, and the biological factors shaping behavioral conditions from a multidisciplinary perspective. This critical review paper is an attempt to compile the biological and psychological factors influencing the development of professional competences, especially in the area of accounting and finance. We analyzed existing literature from a wide range of scientific disciplines (including economics, psychology, and behavioral genetics) to create a background for pursuing multidisciplinary research models in the field of neuroaccounting. This would help in identifying the genetically based behavioral profile best suited to future successful financial and accounting specialists.