Results 1 - 14 of 14
1.
Entropy (Basel); 25(2), 2023 Jan 17.
Article in English | MEDLINE | ID: mdl-36832553

ABSTRACT

The article analytically summarizes the idea of applying Shannon's principle of entropy maximization to sets that represent the results of observations of the "input" and "output" entities of a stochastic model for evaluating variable small data. To formalize this idea, a sequential transition from the likelihood function to the likelihood functional and then to the Shannon entropy functional is described analytically. Shannon entropy characterizes the uncertainty caused not only by the probabilistic nature of the parameters of the stochastic data evaluation model but also by the interference that distorts the measurements of these parameters' values. Accordingly, based on Shannon entropy, it is possible to determine the best estimates of these parameters for maximally uncertain (per entropy unit) distortions that cause measurement variability. This postulate carries over naturally to the statement that estimates of the probability density of the parameters of the stochastic small-data model obtained by Shannon entropy maximization will also account for the variability of the measurement process. In the article, this principle is developed into an information technology for Shannon-entropy-based parametric and non-parametric evaluation of small data measured under the influence of interference. The article analytically formalizes three key elements: (1) instances of the class of parameterized stochastic models for evaluating variable small data; (2) methods for estimating the probability density functions of their parameters, represented by normalized or interval probabilities; and (3) approaches to generating an ensemble of random vectors of initial parameters.
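
A minimal numerical sketch of the entropy-maximization principle described above is given below: it estimates a discrete probability density for a model parameter by maximizing Shannon entropy subject to moment constraints taken from a small, noise-distorted sample. The support grid, the sample, and the chosen constraints are illustrative assumptions and do not reproduce the parametric/non-parametric technology formalized in the article.

```python
"""Minimal sketch (not the authors' implementation): maximum-entropy
estimation of a discrete probability distribution for a stochastic model
parameter, given only a small number of noisy measurements. The support
grid, the noise level and the moment constraints are illustrative
assumptions."""
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Small, interference-distorted sample of a model parameter (assumed data).
sample = 2.0 + 0.3 * rng.standard_normal(12)

# Discretized support for the parameter's probability distribution.
support = np.linspace(0.0, 4.0, 41)

def neg_entropy(p):
    p = np.clip(p, 1e-12, None)
    return np.sum(p * np.log(p))          # minimize -H(p)

constraints = [
    {"type": "eq", "fun": lambda p: np.sum(p) - 1.0},                      # normalization
    {"type": "eq", "fun": lambda p: p @ support - sample.mean()},          # match sample mean
    {"type": "eq", "fun": lambda p: p @ support**2 - np.mean(sample**2)},  # match 2nd moment
]

p0 = np.full(support.size, 1.0 / support.size)   # start from the uniform law
res = minimize(neg_entropy, p0, bounds=[(0.0, 1.0)] * support.size,
               constraints=constraints, method="SLSQP")

p_maxent = res.x
print("entropy of the estimate:", -neg_entropy(p_maxent))
print("estimated mean:", p_maxent @ support)
```

With only the normalization and two moment constraints, the optimizer returns the least-committal density compatible with the small sample, which is the general effect the article exploits.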

2.
Entropy (Basel); 25(12), 2023 Nov 21.
Article in English | MEDLINE | ID: mdl-38136447

ABSTRACT

Measurement is a typical way of gathering information about an investigated object, generalized by a finite set of characteristic parameters. The result of each iteration of the measurement is an instance of the class of the investigated object in the form of a set of values of characteristic parameters. An ordered set of instances forms a collection whose dimensionality for a real object is a factor that cannot be ignored. Managing the dimensionality of data collections, as well as classification, regression, and clustering, are fundamental problems of machine learning. Compactification is the approximation of the original data collection by an equivalent collection (with a reduced dimension of characteristic parameters) under control of the accompanying losses of information capacity. Related to compactification is the data completeness verification procedure, which is characteristic of data reliability assessment. If there are stochastic parameters among the characteristic parameters of the initial data collection, the compactification procedure becomes more complicated. To take this into account, this study proposes a model of a structured collection of stochastic data defined in terms of relative entropy. The compactification of such a data model is formalized as an iterative procedure aimed at maximizing the relative entropy of the sequential application of direct and reverse projections of data collections, taking into account estimates of the probability distribution densities of their attributes. A procedure for approximating the relative entropy function of compactification is proposed to reduce its computational complexity. To qualitatively assess compactification, the study undertakes a formal analysis that uses the information capacity of the data collection and the absolute and relative share of information losses due to compactification as its metrics. Given the semantic connection between compactification and completeness, the proposed metric is also relevant for the task of assessing data reliability. Testing the proposed compactification procedure proved both its stability and its efficiency in comparison with previously used analogues, such as the principal component analysis method and the random projection method.
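
For context, the sketch below reproduces only the two baseline compactification methods the article compares against (principal component analysis and random projection), together with a simple relative information-loss metric based on reconstruction error. The proposed relative-entropy procedure itself is not reproduced, and the data collection and target dimension are assumptions.

```python
"""Minimal sketch of the two baseline compactification methods (PCA and
random projection) with a simple relative information-loss metric. The
data and the reduced dimension are illustrative assumptions."""
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((200, 30)) @ rng.standard_normal((30, 30))  # assumed collection
k = 8                                                               # reduced dimension

Xc = X - X.mean(axis=0)
total = np.sum(Xc**2)                       # "information capacity" proxy: total variance

# PCA via SVD: project onto the k leading principal directions and reconstruct.
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
X_pca = (Xc @ Vt[:k].T) @ Vt[:k]
loss_pca = np.sum((Xc - X_pca)**2) / total

# Gaussian random projection with pseudo-inverse reconstruction.
R = rng.standard_normal((Xc.shape[1], k)) / np.sqrt(k)
Z = Xc @ R
X_rp = Z @ np.linalg.pinv(R)
loss_rp = np.sum((Xc - X_rp)**2) / total

print(f"relative information loss: PCA={loss_pca:.3f}, random projection={loss_rp:.3f}")
```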

3.
Entropy (Basel); 24(7), 2022 Jul 20.
Article in English | MEDLINE | ID: mdl-35885229

ABSTRACT

In this article, the concept (i.e., the mathematical model and methods) of computational phonetic analysis of speech with an analytical description of the phenomenon of phonetic fusion is proposed. In this concept, in contrast to existing methods, the multicriteria nature of the process of human cognitive perception of speech is presented in a strictly formal way, using the theoretical and analytical apparatus of information (entropy) theory, pattern recognition theory, and the acoustic theory of speech formation. The resulting concept makes it possible to reliably determine the individual phonetic alphabet inherent in a person, taking into account their dialect of speech and individual features of phonation, as well as to detect and correct errors in the recognition of language units. The experiments prove the superiority of the proposed scientific result over such common Bayesian decision-making concepts based on a Euclidean-type mismatch metric as the maximum likelihood method and the ideal observer method. The analysis of the speech signal carried out in the metric based on the proposed concept makes it possible, in particular, to reliably establish the phonetic saturation of speech, which objectively characterizes the environment of speech signal propagation and its source.
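
The baseline the experiments compare against, a Bayesian decision rule with a Euclidean-type mismatch metric, amounts to nearest-class-mean classification. The toy sketch below illustrates only that baseline; the phone templates and feature vectors are synthetic assumptions made for the example.

```python
"""Sketch of the Euclidean-mismatch baseline (nearest-class-mean decision,
i.e. maximum likelihood under equal spherical Gaussians), not the proposed
entropy-based concept. Phone templates and features are synthetic."""
import numpy as np

rng = np.random.default_rng(3)

# Synthetic "phonetic alphabet": 3 phone classes with 4-dimensional acoustic features.
means = {"a": np.array([1.0, 0.0, 0.5, 0.2]),
         "o": np.array([0.2, 1.0, 0.1, 0.8]),
         "u": np.array([0.7, 0.7, 1.0, 0.0])}

def classify(feature: np.ndarray) -> str:
    """Pick the phone whose template minimizes the Euclidean mismatch."""
    return min(means, key=lambda ph: np.linalg.norm(feature - means[ph]))

# A noisy realization of the phone "o" should still be recognized as "o".
observed = means["o"] + 0.15 * rng.standard_normal(4)
print("recognized phone:", classify(observed))
```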

4.
PLoS One; 19(4): e0299000, 2024.
Article in English | MEDLINE | ID: mdl-38630761

ABSTRACT

In the article, the extremal problem of finding the optimal placement plan for 5G base stations at certain points within a linear area of finite length is set. A fundamental feature of the author's formulation of the extremal problem is that it takes into account not only the points of potential placement of base stations but also the possibility of selecting the station instances to be placed at a specific point from a defined excess set, as well as the inseparable interaction of the placed 5G base stations within a self-organizing network (SON). The formulation of this extremal problem is brought to the form of a specific combinatorial model. The article proposes an adapted branch-and-bound method, which allows the process of synthesizing the architecture of a linearly oriented segment of a 5G network to select the best base station placement options for further evaluation of the resulting placement plans in the metric of defined performance indicators. As the final stage of synthesizing the optimal plan of a linearly oriented wireless network segment from the sequence of best placements, it is proposed to expand the parametric space of the design task with specific technical parameters characteristic of the 5G platform. The article presents a numerical example of solving an instance of the corresponding extremal problem. It is shown that the presented mathematical apparatus allows for the formation of a set of optimal placements taking into account the size of the non-coverage of the target area. To calculate this characteristic parameter, both an exact approach and two approximate approaches are formalized. The results of the experiment showed that for high-dimensional problems, the approximate approach reduces the computational complexity of implementing the adapted branch-and-bound method by more than six times, with a slight loss of accuracy of the optimal solution. The structure of the article includes Section 1 (introduction and state of the art), Section 2 (statement of the research and the proposed models and methods), Section 3 (numerical experiment and analysis of results), and Section 4 (conclusions and further research).
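
The placement problem can be illustrated with a toy branch-and-bound sketch that minimizes the non-coverage of a linear area given candidate points, an excess set of station types, and a station budget. All numerical values are assumptions, and the paper's SON-aware formulation and performance metrics are not reproduced.

```python
"""Toy branch-and-bound sketch for placing base stations on a linear area
so as to minimize the uncovered length. Candidate points, the excess set of
station types (coverage radii) and the station budget are assumptions."""
AREA = 10.0                          # length of the linear target area
POINTS = [1.0, 3.0, 5.0, 7.0, 9.0]   # candidate placement points
RADII = [0.8, 1.5, 2.2]              # excess set of station types (coverage radii)
BUDGET = 3                           # how many stations may be placed

def uncovered(intervals, area=AREA):
    """Length of [0, area] not covered by the union of the intervals."""
    covered, cur_lo, cur_hi = 0.0, None, None
    for lo, hi in sorted(intervals):
        lo, hi = max(lo, 0.0), min(hi, area)
        if hi <= lo:
            continue
        if cur_lo is None or lo > cur_hi:
            if cur_lo is not None:
                covered += cur_hi - cur_lo
            cur_lo, cur_hi = lo, hi
        else:
            cur_hi = max(cur_hi, hi)
    if cur_lo is not None:
        covered += cur_hi - cur_lo
    return area - covered

best = {"value": AREA, "plan": []}

def branch(i, plan, budget):
    value = uncovered([(p - r, p + r) for p, r in plan])
    if value < best["value"]:
        best["value"], best["plan"] = value, list(plan)
    # Bound: remaining stations can cover at most budget * 2 * r_max extra length.
    optimistic = value - budget * 2 * max(RADII)
    if i == len(POINTS) or budget == 0 or optimistic >= best["value"]:
        return
    branch(i + 1, plan, budget)                      # leave point i empty
    for r in RADII:                                  # or place one of the station types
        branch(i + 1, plan + [(POINTS[i], r)], budget - 1)

branch(0, [], BUDGET)
print(f"minimal non-coverage: {best['value']:.2f}, plan: {best['plan']}")
```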

5.
R Soc Open Sci; 11(7): 240206, 2024 Jul.
Article in English | MEDLINE | ID: mdl-39076361

ABSTRACT

Rerouting of direct information traffic under WiMax-1/2 technology control in the case of licensed frequency spectrum overload ensures communication continuity in the smart city's critical infrastructure. The support of such a process in a WiMax-1/2 cluster has its own specifics, which are worth formalizing analytically. The article presents a mathematical apparatus that allows the average service duration of an information message, during its transfer from the terminal to the WiMax-1/2 base station, to be estimated. Unlike its analogues, the presented concept adequately describes the investigated process for any number of terminals, taking into account both the queuing effect on their side and the cumulative query transmission mechanism inherent in WiMax-1/2 technology. The proposed mathematical apparatus therefore accounts for both the average duration of sending a request for the allocation of communication resources to the base station, accompanied by potential collisions, and the average duration of the information message's stay in the terminal queue. Experimental studies demonstrated the adequacy of the proposed mathematical apparatus for describing the investigated process. The experimental section also formulates the optimization problem of the investigated process arising from the management of contention-access parameters.
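
As a rough illustration (not the article's apparatus), the average service duration can be decomposed into a queueing component and a contention component. The sketch below uses an M/M/1 approximation for the terminal queue and geometric retries for the bandwidth-request contention; all rates are assumptions.

```python
"""Rough illustrative sketch, not the paper's apparatus: average service
duration = mean wait in the terminal queue (M/M/1 approximation) + mean time
to win the contention-based bandwidth request (geometric retries) + mean
transmission time. All numerical parameters are assumptions."""
lam = 2.0          # message arrival rate at a terminal, messages/s
mu = 5.0           # message transmission rate once resources are granted, messages/s
slot = 0.005       # duration of one bandwidth-request opportunity, s
p_success = 0.6    # probability a request survives collisions in a given slot

rho = lam / mu
assert rho < 1.0, "the terminal queue must be stable"

wait_in_queue = rho / (mu - lam)        # M/M/1 mean waiting time in the queue
contention = slot / p_success           # mean number of retries = 1 / p_success
service = 1.0 / mu                      # mean transmission time of the message

total = wait_in_queue + contention + service
print(f"average service duration ~ {total*1000:.1f} ms "
      f"(queue {wait_in_queue*1000:.1f} + contention {contention*1000:.1f} "
      f"+ transmission {service*1000:.1f})")
```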

6.
Heliyon; 10(2): e24708, 2024 Jan 30.
Article in English | MEDLINE | ID: mdl-38298719

ABSTRACT

The formalization of dependencies between datasets, taking into account specific hypotheses about data properties, is a constantly relevant task, which is especially acute when it comes to small data. The aim of the study is to formalize the procedure for calculating optimal estimates of the probability density functions of the parameters of linear and nonlinear, dynamic and static small data models, created taking into account specific hypotheses regarding the properties of the studied object. The research methodology includes probability theory and mathematical statistics, information theory, estimation theory, and stochastic mathematical programming methods. The mathematical apparatus presented in the article is based on the principle of maximizing information entropy on sets determined as a result of a small number of censored measurements of "input" and "output" entities in the presence of noise. These data structures became the basis for the formalization of linear and nonlinear, dynamic and static small data models with stochastic parameters, which include both controlled and noise-oriented input and output measurement entities. For all variants of the above-mentioned small data models, the problems of determining optimal estimates of the parameters' probability density functions are formulated and solved. The formulated optimization problems are reduced to canonical forms of the stochastic linear programming problem with probabilistic constraints.

7.
Comput Struct Biotechnol J; 24: 593-602, 2024 Dec.
Article in English | MEDLINE | ID: mdl-39297161

ABSTRACT

The approaches used in biomedicine to analyze epidemics take into account features such as exponential growth in the early stages, a slowdown in dynamics upon saturation, time delays in spread, segmented spread, evolutionary adaptations of the pathogen, and preventive measures based on universal communication protocols. All these characteristics are also present in modern cyber epidemics. Therefore, adapting effective biomedical approaches to epidemic analysis for investigating the development of cyber epidemics is a promising scientific research task. The article is dedicated to the problem of predicting the development of cyber epidemics at early stages. In such conditions, the available data are scarce, incomplete, and distorted, which makes it impossible to use artificial intelligence models for prediction. Therefore, the authors propose an entropy-extreme model, defined within the machine learning paradigm, to address this problem. The model is based on estimating the probability distributions of its controllable parameters from input data, taking into account the variability characteristic of the latter. The entropy-extreme instance, identified from a set of such distributions, indicates the most uncertain (most negative) trajectory of the investigated process. Numerical methods are used to analyze the generated set of development trajectories of the investigated process based on the assessments of the probability distributions of the controllable parameters and their variability characteristic. The result of the analysis includes characteristic predictive trajectories such as the average and median trajectories of the set, as well as the trajectory corresponding to the standard deviation band of the parameters' values. Experiments with real data on the infection of Windows-operated devices by various categories of malware showed that the proposed model outperforms the classical competitor (the least squares method) in predicting the development of cyber epidemics near the extremum of the time series that represents the deployment of such a process over time. Moreover, the proposed model can be applied without any prior hypotheses regarding the probabilistic properties of the available data.
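
The trajectory-ensemble idea can be illustrated with the hedged sketch below: controllable parameters are sampled from assumed interval distributions, a simple exponential growth law is propagated for each sample, and the mean, median, and a standard-deviation band are reported. This is not the authors' entropy-extreme model; the distributions, rates, and horizon are assumptions.

```python
"""Illustrative sketch of the trajectory-ensemble idea (not the authors'
entropy-extreme model): sample controllable parameters, propagate a simple
growth equation per sample, and summarize the ensemble. All distributions
and rates are assumptions."""
import numpy as np

rng = np.random.default_rng(7)

n0 = 100.0                                   # initially infected devices (assumed)
t = np.arange(0, 30)                         # days
gen = rng.uniform(0.25, 0.40, size=2000)     # generation (infection) rate samples
death = rng.uniform(0.05, 0.15, size=2000)   # death (cure/removal) rate samples

# One exponential trajectory per sampled parameter pair: N(t) = N0 * exp((g - d) t).
traj = n0 * np.exp(np.outer(gen - death, t))

mean_traj = traj.mean(axis=0)
median_traj = np.median(traj, axis=0)
upper = traj.mean(axis=0) + traj.std(axis=0)   # upper edge of the standard-deviation band

print("day 29 forecast:",
      f"mean={mean_traj[-1]:.0f}, median={median_traj[-1]:.0f}, mean+std={upper[-1]:.0f}")
```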

8.
Heliyon; 10(16): e36269, 2024 Aug 30.
Article in English | MEDLINE | ID: mdl-39224301

ABSTRACT

The Internet of Medical Things (IoMT) has transformed healthcare by connecting medical devices, sensors, and patients, significantly improving patient care. However, the sensitive data exchanged through IoMT is vulnerable to security attacks, raising serious privacy concerns. Traditional key sharing mechanisms are susceptible to compromise, posing risks to data integrity. This paper proposes a Timestamp-based Secret Key Generation (T-SKG) scheme for resource-constrained devices, generating a secret key at the patient's device and regenerating it at the doctor's device, thus eliminating direct key sharing and minimizing the risk of key compromise. Simulation results using MATLAB and Java demonstrate the T-SKG scheme's resilience against guessing, birthday, and brute-force attacks. Specifically, there is only a 9% chance of key compromise in a guessing attack if the attacker knows the key sequence pattern, while the scheme remains secure against brute-force and birthday attacks within a specified timeframe. The T-SKG scheme is integrated into a healthcare framework to securely transmit health vitals collected using the MySignals sensor kit. For confidentiality, the Data Encryption Standard (DES) with various block cipher modes (ECB, CBC, CTR) is employed.
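
A hypothetical sketch of the general timestamp-based idea is shown below: both parties derive the same key from a pre-shared secret and a synchronized time epoch, so no key is ever transmitted. This is not the T-SKG scheme itself; HMAC-SHA256, the pre-shared secret, and the 30-second window are assumptions made for the illustration.

```python
"""Hypothetical sketch of the timestamp-based idea (not the T-SKG scheme):
both sides derive the same secret key from a pre-shared secret and a coarse,
synchronized timestamp, so the key never travels over the network. The HMAC
construction and the 30-second window are assumptions."""
import hmac
import hashlib
import time

PRE_SHARED = b"patient-doctor-enrollment-secret"   # assumed out-of-band secret
WINDOW = 30                                         # seconds per key epoch

def derive_key(shared: bytes, epoch: int) -> bytes:
    """Derive a 256-bit key bound to one time epoch."""
    return hmac.new(shared, epoch.to_bytes(8, "big"), hashlib.sha256).digest()

epoch = int(time.time()) // WINDOW
key_patient = derive_key(PRE_SHARED, epoch)   # generated at the patient's device
key_doctor = derive_key(PRE_SHARED, epoch)    # regenerated at the doctor's device

assert key_patient == key_doctor              # no key was ever transmitted
print("epoch key:", key_patient.hex()[:16])
```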

9.
PLoS One; 18(12): e0295252, 2023.
Article in English | MEDLINE | ID: mdl-38064461

ABSTRACT

A typical element of the smart city's information and communication space is a 5G cluster, which, being an open system, serves both new and handover requests. In an ordinary 5G smart city cluster, Ultra-Reliable Low-Latency Communications (URLLC) and enhanced Mobile BroadBand (eMBB) traffic types prevail. The formation of an effective QoS policy for such an object (taking into account potentially active slicing technology) is an urgent problem. As a baseline, this research considers a Quality of Service (QoS) policy with constraints for context-defined URLLC and eMBB classes of incoming requests. Evaluating a QoS policy instance defined within the framework of the basic concept requires the formalization of both a complete qualitative metric and a computationally efficient mathematical apparatus for its calculation. The article presents exact and approximate methods for calculating such quality parameters as the probability of loss of typed requests and the utilization ratio of the communication resource, which depend on the implementation of the estimated QoS policy. The original parametric space includes both fixed characteristics (the amount of available communication resources, the load per request class) and characteristics controlled through the specifics of implementing the basic QoS concept. The paper empirically proves the adequacy of the presented mathematical apparatus for evaluating the QoS policy defined within the scope of the research. In the proposed qualitative metric, the author's concept is also compared with a parametrically close analogue: the well-known QoS policy scheme that takes into account the reservation of communication resources. The results of the comparison testify to the superiority of the author's approach in the proposed metrics.
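
As a simplified point of reference (not the article's exact apparatus), the loss probability and resource-utilization ratio for a fixed amount of communication resource can be computed with the classical Erlang-B recursion; the resource size and offered loads below are assumptions.

```python
"""Simplified illustration (not the paper's apparatus): single-class loss
probability and resource utilization via the stable Erlang-B recursion, as a
baseline for evaluating a QoS policy with a fixed communication resource.
Resource units and offered loads are assumptions."""
def erlang_b(servers: int, offered_load: float) -> float:
    """Blocking (loss) probability via the stable Erlang-B recursion."""
    b = 1.0
    for n in range(1, servers + 1):
        b = offered_load * b / (n + offered_load * b)
    return b

resource_units = 50        # available communication resource units (assumed)
load_urllc = 12.0          # offered URLLC load, Erlangs (assumed)
load_embb = 25.0           # offered eMBB load, Erlangs (assumed)

loss = erlang_b(resource_units, load_urllc + load_embb)
carried = (load_urllc + load_embb) * (1.0 - loss)
utilization = carried / resource_units

print(f"loss probability ~ {loss:.4f}, resource utilization ~ {utilization:.2f}")
```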


Subjects
Computer Communication Networks, Wireless Technology, Communication, Technology, Probability
10.
Sci Rep; 13(1): 22810, 2023 Dec 20.
Article in English | MEDLINE | ID: mdl-38129492

ABSTRACT

Security Information and Event Management (SIEM) technologies play an important role in the architecture of modern cyber protection tools. One of the main scenarios for the use of SIEM is the detection of attacks on protected information infrastructure. Considering that the ISO 27001, NIST SP 800-61, and NIST SP 800-83 standards objectively do not keep up with the evolution of cyber threats, research aimed at forecasting the development of cyber epidemics is relevant. The article proposes a stochastic concept for describing variable small data on the basis of Shannon entropy. The core of the concept is the description of small data by linear differential equations with stochastic characteristic parameters. The practical value of the proposed concept is embodied in a method for forecasting the development of a cyber epidemic at an early stage (in conditions of a lack of empirical information). In the context of the research object, the stochastic characteristic parameters of the model are the generation rate, the death rate, and the independent coefficient of variability of the measurement of the initial parameter of the research object. Analytical expressions for estimating the probability distribution densities of these characteristic parameters are proposed. These stochastic model parameters are assumed to be constrained to intervals, which allows the nature and type of the corresponding probability distribution density functions to be manipulated. The task of finding optimal probability distribution density functions of the model's characteristic parameters with maximum entropy is formulated. The proposed method makes it possible to generate sets of trajectories of the characteristic parameters' values with optimal probability distribution density functions. The example demonstrates both the flexibility and the reliability of the proposed concept and method in comparison with the numerical-series forecasting concepts implemented in MATLAB's function library.
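
One ingredient of the concept, the maximum-entropy probability density of a characteristic parameter constrained to an interval with a prescribed mean, can be sketched numerically as below; the interval and mean are assumptions, and the full forecasting method is not reproduced.

```python
"""Sketch, hedged as an illustration: on an interval [a, b] with a prescribed
mean, the maximum-entropy density belongs to the exponential family
p(x) proportional to exp(lam * x); lam is found numerically. The interval and
mean value are assumptions."""
import numpy as np
from scipy.optimize import brentq

a, b = 0.1, 0.9        # admissible interval for the generation rate (assumed)
target_mean = 0.35     # prescribed mean value of the rate (assumed)

x = np.linspace(a, b, 2001)
dx = x[1] - x[0]

def mean_of_maxent(lam: float) -> float:
    """Mean of the density p(x) ~ exp(lam * x) restricted to [a, b]."""
    w = np.exp(lam * (x - a))              # shifted for numerical stability
    return float((x * w).sum() / w.sum())

# lam = 0 gives the uniform density (mean (a + b) / 2 = 0.5); a smaller mean needs lam < 0.
lam = brentq(lambda l: mean_of_maxent(l) - target_mean, -200.0, 200.0)

density = np.exp(lam * (x - a))
density /= density.sum() * dx              # normalize on the grid
print(f"lambda = {lam:.3f}, recovered mean = {(x * density).sum() * dx:.3f}")
```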

11.
Sci Rep; 12(1): 16050, 2022 Sep 26.
Article in English | MEDLINE | ID: mdl-36163351

ABSTRACT

The main contribution of the investigation is a Markov model of the process of resource allocation management between subscribers of eMBB and mMTC services within a 5G cluster. The proposed model considers the organization of the channel resource in the format of resource blocks. The presented model makes it possible to estimate the average duration of IoT sessions, the average number of active multimedia/IoT sessions, the average number of channel resource units occupied by multimedia/IoT traffic, and the average number of resource blocks occupied by multimedia/IoT traffic. The metrics are generalized for three management schemes of the investigated process: balanced, competitive, and prospective. The first and third schemes enable static/dynamic distribution of channel resources into reserved and common segments for subscribers of eMBB and mMTC services. The proposed model is illustrated with an example showing how to assess the availability and efficiency of channel resource use in the 5G cluster of the cyber-physical system of the Situation Center of the Department of Information Technology of Vinnytsia City Council (Vinnytsia, Ukraine). The article also shows how to use the proposed model to select 5G network parameters that keep the probabilities of rejection of multimedia and IoT requests below a set threshold.
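
The kind of metrics such a Markov model yields can be illustrated with a classical two-class multirate loss model under complete sharing of resource blocks (product-form stationary distribution). The sketch below does not reproduce the paper's balanced/competitive/prospective schemes, and the block demands and offered loads are assumptions.

```python
"""Minimal sketch of a classical two-class multirate loss model with complete
sharing of resource blocks (product-form stationary distribution), shown only
to illustrate per-class rejection probabilities and mean occupied blocks.
Block demands and offered loads are assumptions."""
from math import factorial

C = 20                      # resource blocks in the 5G cluster (assumed)
b_embb, b_mmtc = 4, 1       # blocks needed per eMBB / mMTC session (assumed)
a_embb, a_mmtc = 2.0, 6.0   # offered loads in Erlangs (assumed)

states = [(n1, n2)
          for n1 in range(C // b_embb + 1)
          for n2 in range(C // b_mmtc + 1)
          if n1 * b_embb + n2 * b_mmtc <= C]

weight = {(n1, n2): a_embb**n1 / factorial(n1) * a_mmtc**n2 / factorial(n2)
          for n1, n2 in states}
G = sum(weight.values())                     # normalization constant

def blocking(extra_blocks: int) -> float:
    """Probability that an arriving session needing `extra_blocks` is rejected."""
    return sum(w for (n1, n2), w in weight.items()
               if n1 * b_embb + n2 * b_mmtc + extra_blocks > C) / G

mean_blocks = sum((n1 * b_embb + n2 * b_mmtc) * w for (n1, n2), w in weight.items()) / G
print(f"eMBB rejection ~ {blocking(b_embb):.4f}, "
      f"mMTC rejection ~ {blocking(b_mmtc):.4f}, "
      f"mean occupied blocks ~ {mean_blocks:.2f}")
```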


Subjects
Multimedia, Resource Allocation, Communication, Ukraine
12.
Sci Rep; 12(1): 7089, 2022 Apr 30.
Article in English | MEDLINE | ID: mdl-35490168

ABSTRACT

The functional safety assessment is one of the primary tasks both at the design stage and at the operation stage of critical infrastructure at all levels. The article's main contribution is an information technology for calculating the author's functional safety metrics to estimate an instance of the model of cyber-physical system operation. The calculation of the metric criteria analytically combines the results of expert evaluation of the system in VPR metrics and the results of statistical processing of information on the system's operation, presented in the parametric space of the Markov model of this process. The advantages of the proposed approach are the following: it requires orders of magnitude less empirical data to obtain objective estimates of the investigated system; it takes into account the configuration scheme and architecture of the security subsystem of the investigated system when calculating the metric; the evaluation results are complete, compact, and simple to interpret; and it makes it possible to assess the achievability of the limit values of the metric criteria based on the operation model of the investigated system. The paper demonstrates the application of the proposed technology to assess the functional safety of the model of a real cyber-physical system.


Subjects
Markov Chains, Physical Phenomena
13.
Sci Rep; 12(1): 12849, 2022 Jul 27.
Article in English | MEDLINE | ID: mdl-35896812

ABSTRACT

The article's main contribution is the description of the process by which the security subsystem counters the impact of typed cyber-physical attacks, formulated as a model of end states in continuous time. The input parameters of the model are the flow intensities of typed cyber-physical attacks, the flow intensities of possible cyber-immune reactions, and the set of probabilities of neutralization of cyber-physical attacks. The set of admissible states of the info-communication system is described taking into account possible variants of the development of the modeled process. The initial parameters of the model are the probabilities of the studied system being in the corresponding states at a particular moment. The dynamics of the info-communication system's life cycle are embodied in a matrix of transition probabilities. This matrix connects the initial parameters in the form of a system of Chapman's equations. The article presents a computationally efficient concept, based on Gershgorin's theorems, for solving such a system of equations with given initiating values. Based on the presented scientific results, the article proposes a concept for calculating the time to failure as an indicator of the reliability of an info-communication system operating under the probable impact of typed cyber-physical attacks. The adequacy of the model and concepts presented in the article is proved by comparing a statistically representative amount of empirical and simulated data. We emphasize that the highlights of the research are the end-state model of the security subsystem countering typed cyber-physical attacks in continuous time, the computationally efficient solution of Chapman's equation system based on Gershgorin's theorems, and the calculation of time to failure as a reliability indicator of the info-communication system operating under the probable impact of typed cyber-physical attacks.
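
A small illustrative sketch of the modelling style, not the article's method, is given below: a three-state continuous-time Markov model with one absorbing failure state, transient probabilities obtained from the Chapman-Kolmogorov equations via a matrix exponential, a mean-time-to-failure computation, and a Gershgorin-style bound on the generator's eigenvalues. All intensities are assumptions.

```python
"""Illustrative sketch, not the paper's method: a small continuous-time Markov
model of a security subsystem with one absorbing failure state. Transient
probabilities come from the Chapman-Kolmogorov equations via a matrix
exponential; the mean time to failure from the transient sub-generator; a
Gershgorin bound is shown for context. Intensities are assumptions."""
import numpy as np
from scipy.linalg import expm

# States: 0 = normal operation, 1 = attack being neutralized, 2 = failure (absorbing).
attack_rate = 0.4        # intensity of typed cyber-physical attacks, 1/h (assumed)
react_rate = 1.5         # intensity of cyber-immune reactions, 1/h (assumed)
p_neutralize = 0.9       # probability an attack is neutralized (assumed)

Q = np.array([
    [-attack_rate,              attack_rate,  0.0],
    [react_rate * p_neutralize, -react_rate,  react_rate * (1 - p_neutralize)],
    [0.0,                       0.0,          0.0],
])

p0 = np.array([1.0, 0.0, 0.0])
p_t = p0 @ expm(Q * 24.0)                       # state probabilities after 24 hours
print("state probabilities after 24 h:", np.round(p_t, 4))

# Mean time to failure: solve T = (-Q_transient)^(-1) * 1 over the transient states.
Qt = Q[:2, :2]
mttf = np.linalg.solve(-Qt, np.ones(2))[0]
print(f"mean time to failure from the normal state ~ {mttf:.1f} h")

# Gershgorin bound: every eigenvalue of Q lies within |q_ii| + row off-diagonal sum.
radius = max(abs(Q[i, i]) + np.sum(np.abs(np.delete(Q[i], i))) for i in range(3))
print(f"Gershgorin bound on the eigenvalue magnitudes of Q: {radius:.2f}")
```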

14.
PLoS One; 17(7): e0271536, 2022.
Article in English | MEDLINE | ID: mdl-35849601

ABSTRACT

The article examines the subject-system interaction session, where the system is understood as a base station and the subject as a mobile communication device. The peculiarity of the study is that it takes into account a phenomenon relevant to modern communication infrastructures: the base station divides information traffic into a subspace of guaranteed personalized traffic and a subspace of general-purpose traffic. The study considers a highly critical emergency in which the general-purpose traffic subspace may cease to be available at any moment. The presented mathematical apparatus describes the impact of such an emergency on the active communication sessions supported by the system while it receives new incoming requests of increasing intensity. To characterize this emergency situation, expressions adapted for practical application are presented to calculate such qualitative parameters as the probability of stability, the probability of failure, and unavailability.


Subjects
Communication, Computer Communication Networks, Probability