Results 1 - 20 of 212
1.
BMC Med Res Methodol ; 24(1): 171, 2024 Aug 06.
Article in English | MEDLINE | ID: mdl-39107695

ABSTRACT

BACKGROUND: Dimension reduction methods do not always reduce their underlying indicators to a single composite score. Furthermore, such methods are usually based on optimality criteria that require discarding some information. We suggest, under some conditions, using the joint probability density function (joint pdf or JPD) of a p-dimensional random variable (the p indicators) as an index or composite score. It is proved that this index is more informative than any alternative composite score. In two examples, we compare the JPD index with some alternatives constructed from traditional methods. METHODS: We develop a probabilistic unsupervised dimension reduction method based on the probability density of multivariate data. We show that the conditional distribution of the variables given the JPD is uniform, implying that the JPD is the most informative scalar summary under the most common notions of information. We also show that, under some widely plausible conditions, the JPD can be used as an index. For the JPD to serve as an index, in addition to having a plausible interpretation, all the random variables should have approximately the same direction (unidirectionality) and the same direction as the density values (co-directionality). We applied these ideas to two data sets: first, the 7 Brief Pain Inventory Interference scale (BPI-I) items obtained from 8,889 US Veterans with chronic pain and, second, a novel measure based on administrative data for 912 US Veterans. To estimate the JPD in both examples, among the available JPD estimation methods, we used its conditional specifications, identified a well-fitted parametric model for each factored conditional (regression) specification, and estimated their parameters by maximizing the corresponding likelihoods. Because conditional specification is not unique, the average of all estimated conditional specifications was used as the final estimate. Since a prevalent use of indices is ranking, we used measures of monotone dependence [e.g., Spearman's rank correlation (rho)] to assess the strength of unidirectionality and co-directionality. Finally, we cross-validated the JPD score against variance-covariance-based scores (factor scores in unidimensional models) and the "person parameter" estimates of (Generalized) Partial Credit and Graded Response IRT models. We used Pearson divergence as a measure of information and Shannon entropy to compare uncertainties (informativeness) across these alternative scores. RESULTS: An unsupervised dimension reduction method was developed based on the joint probability density (JPD) of the multi-dimensional data. Under regularity conditions, the JPD may be used as an index. For the well-established Brief Pain Inventory Interference scale (BPI-I; the 7-item short form) and for a new mental health severity index (MoPSI) with 6 indicators, we estimated the JPD scoring. Assuming unidimensionality, we compared factor scores and person scores of the Partial Credit, Generalized Partial Credit, and Graded Response models with JPD scoring. As expected, the rankings of all scores in both examples were monotonically dependent with various strengths. Shannon entropy was smallest for JPD scores. Pearson divergence of the estimated densities of the different indices against the uniform distribution was largest for JPD scoring. CONCLUSIONS: An unsupervised probabilistic dimension reduction is possible. When appropriate, the joint probability density function can be used as the most informative index. Model specification, estimation, and the steps to implement the scoring were demonstrated. As expected, when the assumptions required by factor analysis and IRT models are satisfied, JPD scoring agrees with these established scores. However, when these assumptions are violated, JPD scores preserve all the information in the indicators with minimal assumptions.
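
A minimal sketch of the scoring idea, using a Gaussian kernel density estimate in place of the authors' averaged conditional-specification estimator; the sample size, item count, and dependence structure below are illustrative assumptions, not the Veterans data.

```python
# Minimal sketch (not the authors' estimator): score multivariate indicators by their
# estimated joint density and check co-directionality against a naive composite.
# A Gaussian KDE stands in for the averaged conditional-specification estimator.
import numpy as np
from scipy.stats import gaussian_kde, spearmanr, norm

rng = np.random.default_rng(0)
n, p = 2000, 7                                   # hypothetical sample size and item count
cov = 0.5 * np.ones((p, p)) + 0.5 * np.eye(p)
Z = rng.multivariate_normal(np.zeros(p), cov, size=n)
X = -np.log(1.0 - norm.cdf(Z))                   # positively dependent, exponential-like items

kde = gaussian_kde(X.T)                          # joint pdf (JPD) estimate
jpd_score = kde(X.T)                             # density at each respondent's item profile
mean_score = X.mean(axis=1)                      # naive composite for comparison

rho, _ = spearmanr(jpd_score, mean_score)        # monotone dependence (co-directionality)
print(f"Spearman rho between JPD score and mean score: {rho:.3f}")
# A strongly negative rho is expected here: larger item values sit in lower-density
# regions, so the sign merely fixes the orientation of the index.
```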


Subjects
Probability, Humans, Pain/diagnosis, Severity of Illness Index, Pain Measurement/methods, Pain Measurement/statistics & numerical data, Mental Disorders/diagnosis, Statistical Models, Algorithms
2.
Hepatol Res ; 54(2): 151-161, 2024 Feb.
Article in English | MEDLINE | ID: mdl-37768830

ABSTRACT

AIM: To weigh the prognostic value of thyroid hormones in catastrophic acute-on-chronic liver failure (ACLF). METHODS: A retrospective cohort (n = 635) and two prospective cohorts (n = 353 and n = 198) were enrolled in this study. The performance of a newly developed prognostic score was assessed in terms of reliability, discrimination, and clinical net benefit. RESULTS: Among the thyroid hormones, thyroid-stimulating hormone (TSH) was identified as the most promising prognostic predictor for hepatitis B virus-related ACLF. The novel score (modified chronic liver failure-organ failure score [mCLIF-OFs]) was developed from weighted TSH and the other organ scores of the CLIF-OFs using the retrospective cohort (n = 635). The predicted risk and observed probability of death were comparable across the deciles of the mCLIF-OFs (Hosmer-Lemeshow χ2 = 4.28, p = 0.83; scaled Brier score = 11.9). The C-index of the mCLIF-OFs (0.885 [0.883-0.887]) for 30-day mortality was significantly higher than that of the CLIF-OFs, the chronic liver failure-sequential organ failure assessment score (CLIF-SOFAs), the CLIF-C ACLFs, the Model for End-Stage Liver Disease (MELD), and Child-Pugh (all p < 0.001). The absolute improvements in prediction error rate of the mCLIF-OFs over the above five scores ranged from 19.0% to 61.1%. In a probability density function analysis, the mCLIF-OFs showed the smallest overlapping coefficient (27.9%) among the above prognostic scores. Additionally, the mCLIF-OFs showed greater net benefit than the above five prognostic scores over a wide range of risk thresholds for death. Similar results were validated in two prospective ACLF cohorts with HBV and non-HBV etiologies. CONCLUSION: Weighted TSH portended the outcome of ACLF patients, and the hypothalamic-pituitary-thyroid axis could be treated as an additional "damaged organ". The novel mCLIF-OFs is a reliable prognostic score with better discriminative power and clinical net benefit than the CLIF-OFs, CLIF-SOFAs, CLIF-C ACLFs, MELD, and Child-Pugh scores.
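
A small illustration of the overlapping-coefficient comparison mentioned above, on synthetic score samples rather than the study's cohorts; the group means, spreads, and sample sizes are assumptions.

```python
# Overlapping coefficient (OVL) between the score densities of two outcome groups,
# computed as the integral of the pointwise minimum of two kernel density estimates.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(1)
score_survivors = rng.normal(8, 2, 400)         # hypothetical mCLIF-OFs-like scores
score_nonsurvivors = rng.normal(13, 2, 200)

f = gaussian_kde(score_survivors)
g = gaussian_kde(score_nonsurvivors)
x = np.linspace(0, 25, 2000)
ovl = np.trapz(np.minimum(f(x), g(x)), x)       # smaller OVL -> better discrimination
print(f"Overlapping coefficient: {ovl:.3f}")
```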

3.
J Math Biol ; 89(3): 30, 2024 Jul 17.
Article in English | MEDLINE | ID: mdl-39017723

ABSTRACT

To describe the transmission dynamics of maize streak virus infection, in this paper we first formulate a stochastic maize streak virus infection model in which the stochastic fluctuations are depicted by a logarithmic Ornstein-Uhlenbeck process. This approach is a reasonable way to simulate the random impacts on the main parameters, from both the biological and the mathematical perspectives. We then investigate the detailed dynamics of the stochastic system, including the existence and uniqueness of the global solution, the existence of a stationary distribution, and the exponential extinction of the infected maize and the infected leafhopper vector. In particular, by solving the five-dimensional algebraic equations corresponding to the stochastic system, we obtain the explicit expression of the probability density function near the quasi-endemic equilibrium of the stochastic system, which provides valuable insight into the stationary distribution. Finally, the model is discretized using the higher-order Milstein numerical method to illustrate our theoretical results numerically. Our findings provide a groundwork for better methods of preventing the spread of this type of virus.
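
A minimal sketch of a logarithmic Ornstein-Uhlenbeck perturbation and its Milstein discretisation (with constant diffusion the Milstein correction vanishes, so the update coincides with Euler-Maruyama); the rate name and parameter values are assumptions, not the paper's model.

```python
# Log-OU perturbation of a positive rate: log(beta) mean-reverts to log(beta_bar).
import numpy as np

rng = np.random.default_rng(2)
beta_bar, alpha, sigma = 0.5, 2.0, 0.3      # mean level, reversion speed, noise intensity (assumed)
dt, n_steps = 1e-3, 50_000

y = np.empty(n_steps)
y[0] = np.log(beta_bar)
for k in range(n_steps - 1):
    dB = rng.normal(0.0, np.sqrt(dt))
    y[k + 1] = y[k] + alpha * (np.log(beta_bar) - y[k]) * dt + sigma * dB

beta = np.exp(y)                            # always positive, mean-reverting rate
print(f"sample mean {beta.mean():.3f}, sd of log-beta {y.std():.3f} "
      f"(stationary theory {sigma / np.sqrt(2 * alpha):.3f})")
```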


Subjects
Maize streak virus, Mathematical Concepts, Biological Models, Plant Diseases, Stochastic Processes, Zea mays, Plant Diseases/virology, Plant Diseases/statistics & numerical data, Zea mays/virology, Animals, Maize streak virus/physiology, Computer Simulation, Insect Vectors/virology, Epidemics/statistics & numerical data, Hemiptera/virology
4.
Radiat Environ Biophys ; 63(2): 185-194, 2024 05.
Article in English | MEDLINE | ID: mdl-38565701

ABSTRACT

This paper describes events of anomalously high energy transfer to a micro-object by fragments of nuclei generated in nuclear interactions in the environment on board a spacecraft in flight in low-Earth orbit. An algorithm has been developed that allows for the calculation of the absorbed energy from one or more fragments - products of nuclear interactions. With this algorithm, the energy distributions for a spherical micro-volume in an aqueous medium were calculated, and the resulting absorbed-energy spectra from nuclear fragments and from primary cosmic rays were compared. The role of nuclear interactions in events of large energy transfer to micro-objects in the field of primary cosmic radiation has been evaluated. The calculations performed in this study showed that the energy deposited in a micro-volume by nuclear events can be several times higher than the energy imparted by primary space radiation.


Subjects
Cosmic Radiation, Energy Transfer, Algorithms, Spacecraft, Space Flight
5.
Sensors (Basel) ; 24(12)2024 Jun 07.
Article in English | MEDLINE | ID: mdl-38931504

ABSTRACT

A complete framework for predicting the attributes of sea clutter under different operational conditions, specified by wind speed, wind direction, grazing angle, and polarization, is proposed for the first time. This framework is composed of empirical spectra to characterize sea-surface profiles under different wind speeds, the Monte Carlo method to generate realizations of sea-surface profiles, the physical-optics method to compute the normalized radar cross-sections (NRCSs) from individual sea-surface realizations, and regression of the NRCS data (sea clutter) with an empirical probability density function (PDF) characterized by a few statistical parameters. The JONSWAP and Hwang ocean-wave spectra are adopted to generate realizations of sea-surface profiles at low and high wind speeds, respectively. The probability density functions of the NRCSs are regressed with K and Weibull distributions, each characterized by two parameters. The probability density functions in the outlier regions of weak and strong signals are regressed with a power-law distribution, each characterized by an index. The statistical parameters and power-law indices of the K and Weibull distributions are derived for the first time under different operational conditions. The study reveals succinct information about sea clutter that can be used to improve radar performance in a wide variety of complicated ocean environments. The proposed framework can be used as a reference or guideline for designing future measurement tasks to enhance existing empirical models of ocean-wave spectra, normalized radar cross-sections, and so on.
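
A hedged sketch of the distribution-regression step on synthetic amplitudes (SciPy has no built-in K distribution, so only the two-parameter Weibull fit is shown); the shape, scale, and sample size are assumptions, not physical-optics NRCS data.

```python
# Fit a two-parameter Weibull PDF to synthetic sea-clutter-like amplitudes
# and check the goodness of fit with a Kolmogorov-Smirnov test.
import numpy as np
from scipy.stats import weibull_min, kstest

rng = np.random.default_rng(3)
clutter = weibull_min.rvs(c=1.6, scale=1.0, size=20_000, random_state=rng)  # synthetic amplitudes

shape, loc, scale = weibull_min.fit(clutter, floc=0)     # two-parameter Weibull regression
stat, p = kstest(clutter, 'weibull_min', args=(shape, loc, scale))
print(f"shape {shape:.2f}, scale {scale:.2f}, KS p-value {p:.2f}")
```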

6.
J Environ Manage ; 351: 120005, 2024 Feb.
Article in English | MEDLINE | ID: mdl-38183951

ABSTRACT

Accurate estimation of potential wildfire behavior characteristics (PWBC) can improve wildfire danger assessment. However, most fire spread models estimate wildfire behavior with immeasurable uncertainties and are difficult to apply at large scales. In this study, a PWBC estimation model (named PWBC-QR-BiLSTM) was proposed by coupling the Bi-directional Long Short-Term Memory (BiLSTM) and quantile regression (QR) methods. Multi-source data, including fuel, weather, topography, infrastructure, and landscape variables, were input into the PWBC-QR-BiLSTM model to estimate the potential rate of spread (ROS) and fire radiative power (FRP) over western Sichuan, China, and then to estimate the probability density of ROS and FRP. Daily ROS and FRP were extracted from the Global Fire Atlas and the MOD14A1/MYD14A1 product. The optimal PWBC-QR-BiLSTM model was determined using the Non-dominated Sorting Genetic Algorithm II (NSGA-II). Results showed that the PWBC-QR-BiLSTM performed well in estimating potential ROS and FRP with high accuracy (ROS: R2 > 0.7 and MAPE < 30%; FRP: R2 > 0.8 and MAPE < 25%). The modal PWBC values extracted from the estimated probability density were closer to the observed values and can be regarded as a good indicator for wildfire danger assessment. The variable importance analysis also verified that fuel and infrastructure variables play an important role in driving wildfire behavior. This study suggests the potential of utilizing artificial intelligence to estimate PWBC and its probability density to improve guidance on wildfire management.
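
A sketch of the quantile-regression ingredient only: the pinball loss that a QR model such as the PWBC-QR-BiLSTM would minimise per quantile level; the toy targets and predictions are assumptions, and the BiLSTM itself is not reproduced here.

```python
# Pinball (quantile) loss: the per-quantile training objective of quantile regression.
import numpy as np

def pinball_loss(y_true, y_pred, q):
    """Average pinball loss for quantile level q in (0, 1)."""
    diff = y_true - y_pred
    return np.mean(np.maximum(q * diff, (q - 1) * diff))

y_true = np.array([3.0, 5.0, 7.0])
y_pred = np.array([2.5, 5.5, 6.0])
for q in (0.1, 0.5, 0.9):
    print(f"q = {q}: loss = {pinball_loss(y_true, y_pred, q):.3f}")
```

Predicting a dense set of quantiles this way yields an empirical CDF whose differences approximate a probability density for ROS or FRP.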


Subjects
Deep Learning, Fires, Wildfires, Artificial Intelligence, Reactive Oxygen Species, Conservation of Natural Resources/methods, China
7.
Entropy (Basel) ; 26(6)2024 May 26.
Article in English | MEDLINE | ID: mdl-38920461

ABSTRACT

Heat capacity data of many crystalline solids can be described in a physically sound manner by Debye-Einstein integrals in the temperature range from 0 K to 300 K. The parameters of the Debye-Einstein approach are obtained either by a Markov chain Monte Carlo (MCMC) global optimization method or by a Levenberg-Marquardt (LM) local optimization routine. In the case of the MCMC approach, the model parameters and the coefficients of a function describing the residuals of the measurement points are optimized simultaneously. Thereby, the Bayesian credible interval for the heat capacity function is obtained. Although the two regression tools (LM and MCMC) are completely different approaches, not only the values of the Debye-Einstein parameters but also their standard errors turn out to be similar. The calculated model parameters and their associated standard errors are then used to derive the enthalpy, entropy, and Gibbs energy as functions of temperature. By direct insertion of the MCMC parameters from all 4·10⁵ computer runs, the distributions of the integral quantities enthalpy, entropy, and Gibbs energy are determined.
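
A minimal sketch of the Levenberg-Marquardt route (SciPy's default for unbounded curve_fit) applied to a one-Debye-plus-one-Einstein heat-capacity model; the mixing weights, characteristic temperatures, and noise level are assumptions, not fitted literature values.

```python
# Fit Cp(T) ~ a*C_Debye(T; theta_D) + b*C_Einstein(T; theta_E) to synthetic data.
import numpy as np
from scipy.integrate import quad
from scipy.optimize import curve_fit

R = 8.314462618  # J mol^-1 K^-1

def debye_cv(T, theta_d):
    integral = quad(lambda x: x**4 * np.exp(x) / np.expm1(x)**2, 0, theta_d / T)[0]
    return 9 * R * (T / theta_d)**3 * integral

def einstein_cv(T, theta_e):
    x = theta_e / T
    return 3 * R * x**2 * np.exp(x) / np.expm1(x)**2

def model(T, a, theta_d, b, theta_e):
    return np.array([a * debye_cv(t, theta_d) + b * einstein_cv(t, theta_e) for t in T])

T = np.linspace(5, 300, 60)
cp_obs = model(T, 1.0, 250.0, 2.0, 600.0) + np.random.default_rng(4).normal(0, 0.05, T.size)

popt, pcov = curve_fit(model, T, cp_obs, p0=[1.0, 200.0, 2.0, 500.0])  # LM by default
perr = np.sqrt(np.diag(pcov))          # standard errors, as compared with the MCMC ones in the paper
print("parameters:", popt, "standard errors:", perr)
```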

8.
Entropy (Basel) ; 26(6)2024 Jun 15.
Article in English | MEDLINE | ID: mdl-38920526

ABSTRACT

When using traditional Euler deconvolution optimization strategies, it is difficult to distinguish between anomalies and their corresponding Euler tails (solutions that are often distributed outside the anomaly source, forming "tail"-shaped spurious solutions, i.e., misplaced Euler solutions, which must be removed or marked) with only the structural index. The nonparametric estimation method based on the normalized B-spline probability density (BSS) is used to separate the Euler solution clusters and mark different anomaly sources according to the similarity and density characteristics of the Euler solutions. For display purposes, the BSS needs to map the samples onto the estimation grid, at the points where the density will be estimated, in order to obtain the probability density distribution. However, if the size of the samples or of the estimation grid is too large, this process can lead to high memory consumption and excessive computation times. To address this issue, a fast linear binning approximation algorithm is introduced into the BSS to speed up the computation and save time. The sample data are quickly projected onto the estimation grid to facilitate the discrete convolution between the grid and the density function using a fast Fourier transform. A method involving multivariate B-spline probability density estimation based on the FFT (BSSFFT), in conjunction with fast linear binning approximation, is proposed in this paper. The results for two random normal distributions show the correctness of the BSS and BSSFFT algorithms, which is verified via a comparison with the true probability density function (pdf) and with Gaussian kernel smoothing estimation. The Euler solutions of two synthetic models are then analyzed using the BSS and BSSFFT algorithms. The results are consistent with their theoretical values, which verifies their correctness for Euler solutions. Finally, the BSSFFT is applied to Bishop 5X data, and the numerical results show that a comprehensive analysis of the 3D probability density distributions obtained with the BSSFFT algorithm, derived from the Euler solution subset (x0, y0, z0), can effectively separate and locate adjacent anomaly sources, demonstrating strong adaptability.
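
A sketch of the linear-binning-plus-FFT idea in one dimension, with a Gaussian kernel standing in for the normalized B-spline kernel of the BSS/BSSFFT method; the grid size, bandwidth, and mixture sample are assumptions.

```python
# Linear binning: each sample splits its unit weight between the two nearest grid nodes;
# the density is then the FFT-based convolution of the bin counts with the kernel.
import numpy as np
from scipy.signal import fftconvolve

rng = np.random.default_rng(5)
samples = np.concatenate([rng.normal(-2, 0.5, 5000), rng.normal(1, 1.0, 5000)])

grid = np.linspace(-5, 5, 1024)
dx = grid[1] - grid[0]

counts = np.zeros_like(grid)
idx = np.clip(((samples - grid[0]) / dx).astype(int), 0, grid.size - 2)
frac = (samples - grid[idx]) / dx
np.add.at(counts, idx, 1 - frac)
np.add.at(counts, idx + 1, frac)

h = 0.25                                               # bandwidth (assumed)
kernel = np.exp(-0.5 * (np.arange(-4 * h, 4 * h + dx, dx) / h) ** 2)
kernel /= kernel.sum()
density = fftconvolve(counts, kernel, mode='same') / (samples.size * dx)
print(f"integral of estimated density ~ {np.trapz(density, grid):.3f}")
```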

9.
Entropy (Basel) ; 26(2)2024 Jan 30.
Article in English | MEDLINE | ID: mdl-38392376

ABSTRACT

We deal with absolutely continuous probability distributions whose moments of all positive integer orders are finite. It is well known that any such distribution is either uniquely determined by its moments (M-determinate) or it is non-unique (M-indeterminate). In this paper, we follow the maximum entropy approach and establish a new criterion for the M-indeterminacy of distributions on the positive half-line (Stieltjes case). Useful corollaries are derived for M-indeterminate distributions on the whole real line (Hamburger case). We show how the maximum entropy is related to the symmetry property and to M-indeterminacy.

10.
Philos Trans A Math Phys Eng Sci ; 381(2242): 20210226, 2023 Feb 20.
Article in English | MEDLINE | ID: mdl-36587818

ABSTRACT

Magnetically confined plasmas are far from equilibrium and pose considerable challenges in statistical analysis. We discuss a non-perturbative statistical method, namely a time-dependent probability density function (PDF) approach that is potentially useful for analysing time-varying, large, or non-Gaussian fluctuations and bursty events associated with instabilities in the low-to-high confinement transition and the H-mode. Specifically, we present a stochastic Langevin model of edge-localized modes (ELMs) by including stochastic noise terms in a previous ODE ELM model. We calculate exact time-dependent PDFs by numerically solving the Fokker-Planck equation and characterize time-varying statistical properties of ELMs for different energy fluxes and noise amplitudes. The stochastic noise is shown to introduce phase-mixing and plays a significant role in mitigating extreme bursts of large ELMs. Furthermore, based on time-dependent PDFs, we provide a path-dependent information geometric theory of the ELM dynamics and demonstrate its utility in capturing self-regulatory relaxation oscillations, bursts and a sudden change in the system. This article is part of a discussion meeting issue 'H-mode transition and pedestal studies in fusion plasmas'.
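
A generic illustration of the time-dependent PDF approach (not the ELM model itself): an ensemble of Langevin trajectories in a double-well potential, with histograms as the Monte Carlo counterpart of solving the Fokker-Planck equation; the drift, noise amplitude, and snapshot times are assumptions.

```python
# Evolve dx = (x - x**3) dt + sqrt(2 D) dW for many paths and record the
# time-dependent PDF from ensemble histograms at selected times.
import numpy as np

rng = np.random.default_rng(6)
n_paths, dt, D = 10_000, 1e-3, 0.2
x = np.full(n_paths, -1.0)                      # all paths start in the left well
snapshot_times, snapshots = (0.5, 2.0, 4.0), {}

for step in range(int(4.0 / dt)):
    x += (x - x**3) * dt + np.sqrt(2 * D * dt) * rng.normal(size=n_paths)
    t = (step + 1) * dt
    for ts in snapshot_times:
        if ts not in snapshots and t >= ts:
            snapshots[ts] = np.histogram(x, bins=80, range=(-3, 3), density=True)[0]

for ts in snapshot_times:
    p_right = snapshots[ts][40:].sum() * (6 / 80)   # mass that has crossed to x > 0
    print(f"t = {ts}: P(x > 0) ~ {p_right:.2f}")
```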

11.
Biomed Eng Online ; 22(1): 8, 2023 Feb 04.
Article in English | MEDLINE | ID: mdl-36739411

ABSTRACT

OBJECTIVE: Probability density analysis was applied to automatically characterize center of pressure (COP) data for evaluating the balance ability of stroke patients. METHODS: Real-time COP coordinates were obtained from a precision force platform for 38 stroke patients during quiet standing with eyes open and with eyes closed. The COP data were characterized by the commonly used parameters, total sway length (SL), sway radius (SR), and envelope sway area (EA), and by the probability-density-based parameters, projection area (PA), skewness (SK), and kurtosis (KT), and the statistical correlations among these parameters were analyzed. The differences in both the conventional parameters and the probability density parameters between the eyes-open (EO) and eyes-closed (EC) conditions were compared. RESULTS: The PA from the probability density analysis is strongly correlated with SL and SR. Both the traditional parameters and the probability density parameters in the EC state are significantly different from those in the EO state. The resulting statokinesigrams were categorized into typical sway types through the probability density function for clinical evaluation of the balance ability of stroke patients. CONCLUSIONS: The probability density analysis of COP data can be used to characterize the posturography for evaluation of the balance ability of stroke patients.
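
A sketch of the descriptive quantities on synthetic COP traces, using common posturographic definitions; the recording length, step noise, histogram binning, and density threshold are assumptions rather than the paper's settings.

```python
# Conventional COP parameters plus probability-density-based ones on a synthetic trace.
import numpy as np
from scipy.stats import skew, kurtosis

rng = np.random.default_rng(7)
n = 3000                                            # ~30 s of quiet standing at 100 Hz
cop = np.cumsum(rng.normal(0, 0.02, size=(n, 2)), axis=0)    # ML and AP coordinates, cm

sl = np.sum(np.linalg.norm(np.diff(cop, axis=0), axis=1))        # total sway length
sr = np.mean(np.linalg.norm(cop - cop.mean(axis=0), axis=1))     # sway radius

# 2D histogram as an empirical density; projection area = area where density is non-negligible.
H, xe, ye = np.histogram2d(cop[:, 0], cop[:, 1], bins=40, density=True)
cell = (xe[1] - xe[0]) * (ye[1] - ye[0])
pa = np.sum(H > 0.05 * H.max()) * cell              # threshold is an illustrative choice
sk, kt = skew(cop[:, 1]), kurtosis(cop[:, 1])       # AP direction, for example
print(f"SL {sl:.1f} cm, SR {sr:.2f} cm, PA {pa:.2f} cm^2, SK {sk:.2f}, KT {kt:.2f}")
```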


Subjects
Postural Balance, Stroke, Humans, Standing Position, Probability
12.
Chaos Solitons Fractals ; 169: 113256, 2023 Apr.
Article in English | MEDLINE | ID: mdl-36820073

ABSTRACT

In this paper, we propose a stochastic SEIR-type model with asymptomatic carriers to describe the propagation mechanism of coronavirus (COVID-19) in the population. Firstly, we show that there exists a unique global positive solution of the stochastic system for any positive initial value. Then we adopt a stochastic Lyapunov function method to establish sufficient conditions for the existence and uniqueness of an ergodic stationary distribution of positive solutions to the stochastic model. In particular, under the same conditions as for the existence of a stationary distribution, we obtain the explicit form of the probability density around the quasi-endemic equilibrium of the stochastic system. Finally, numerical simulations are introduced to validate the theoretical findings.

13.
Sensors (Basel) ; 23(17)2023 Aug 29.
Article in English | MEDLINE | ID: mdl-37687945

ABSTRACT

The SOI-FET (silicon-on-insulator field-effect transistor) biosensor for virus detection is a promising device in the fields of medicine, virology, biotechnology, and environmental monitoring. However, the application of modern biosensors faces numerous problems and requires improvement. Some of these problems can be attributed to sensor design, while others can be attributed to technological limitations. The aim of this work is to conduct a theoretical investigation of the "antibody + antigen" complex (AB + AG) detection processes of a SOI-FET biosensor, which may also solve some of the aforementioned problems. Our investigation concentrates on analyzing and evaluating the probability of AB + AG complex detection. The Poisson probability distribution was used to estimate the probability of adsorption of the target molecules on the biosensor's surface and, consequently, to obtain correct detection results. Many implicit and unexpected causes of detection errors have been identified for AB + AG complexes using SOI-FET biosensors. We show that the accuracy and time of detection depend on the number of SOI-FET biosensors on a crystal.
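
A small numerical illustration of the Poisson argument: the chance that at least one of M independent sensors adsorbs at least one AB + AG complex during a read-out; the per-sensor mean and the sensor counts are assumed values, not the paper's parameters.

```python
# Detection probability grows with the number of sensors when adsorption is Poisson.
from scipy.stats import poisson

lam = 0.05                                   # mean complexes adsorbed per sensor per read-out (assumed)
p_zero = poisson.pmf(0, lam)                 # probability a single sensor captures nothing
for m in (1, 10, 100):
    print(f"{m:4d} sensors: detection probability {1.0 - p_zero**m:.3f}")
```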


Subjects
Biotechnology, Silicon, Adsorption, Probability
14.
Sensors (Basel) ; 23(3)2023 Jan 26.
Article in English | MEDLINE | ID: mdl-36772417

ABSTRACT

Most penalized maximum likelihood methods for tomographic image reconstruction based on Bayes' law include a freely adjustable hyperparameter to balance the data fidelity term and the prior/penalty term for a specific noise-resolution tradeoff. In many applications the hyperparameter is determined empirically in a trial-and-error fashion, and the optimal result is then selected from multiple iterative reconstructions. These penalized methods are not only time-consuming by their iterative nature, but also require manual adjustment. This study aims to investigate a theory-based strategy for Bayesian image reconstruction without a freely adjustable hyperparameter, to substantially save time and computational resources. The Bayesian image reconstruction problem is formulated by two probability density functions (PDFs), one for the data fidelity term and the other for the prior term. When formulating these PDFs, we introduce two parameters. While these two parameters ensure that the PDFs completely describe the data and prior terms, they cannot be determined from the acquired data; thus, they are called complete but unobservable parameters. Estimating these two parameters becomes possible under the conditional expectation and maximization for the image reconstruction, given the acquired data and the PDFs. This leads to an iterative algorithm, which jointly estimates the two parameters and computes the to-be-reconstructed image by maximizing the a posteriori probability; the approach is denoted joint-parameter-Bayes. In addition to the theoretical formulation, comprehensive simulation experiments are performed to analyze the stopping criterion of the iterative joint-parameter-Bayes method. Finally, given the data, an optimal reconstruction is obtained without any freely adjustable hyperparameter by satisfying the PDF condition for both the data likelihood and the prior probability, and by satisfying the stopping criterion. Moreover, the stability of joint-parameter-Bayes is investigated through factors such as initialization, the PDF specification, and renormalization in an iterative manner. Both phantom simulation and clinical patient data results show that joint-parameter-Bayes can provide reconstructed image quality comparable to the conventional methods, but with much less reconstruction time. To see the response of the algorithm to different types of noise, three common noise models are introduced to the simulation data: white Gaussian noise added to post-log sinogram data, Poisson-like signal-dependent noise added to post-log sinogram data, and Poisson noise applied to the pre-log transmission data. The experimental outcomes for white Gaussian noise reveal that the two parameters estimated by the joint-parameter-Bayes method agree well with the simulations. It is observed that, for all three noise models, the parameter introduced to satisfy the prior's PDF is the more sensitive one for stopping the iteration process. A stability investigation showed that initialization with a filtered back projection image is very robust. Clinical patient data demonstrated the effectiveness of the proposed joint-parameter-Bayes method and stopping criterion.


Subjects
Computer-Assisted Image Processing, X-Ray Computed Tomography, Humans, Bayes Theorem, Computer-Assisted Image Processing/methods, X-Ray Computed Tomography/methods, Algorithms, Computer Simulation, Imaging Phantoms
15.
Entropy (Basel) ; 25(2)2023 Jan 17.
Article in English | MEDLINE | ID: mdl-36832552

ABSTRACT

Output probability density function (PDF) tracking control of stochastic systems has always been a challenging problem in both theoretical development and engineering practice. Focused on this challenge, this work proposes a novel stochastic control framework so that the output PDF can track a given time-varying PDF. Firstly, the output PDF is characterised by the weight dynamics following the B-spline model approximation. As a result, the PDF tracking problem is transformed into a state tracking problem for the weight dynamics. In addition, the model error of the weight dynamics is described by multiplicative noise to establish its stochastic dynamics more effectively. Moreover, to better reflect practical applications in the real world, the tracking target is set to be time-varying rather than static. Thus, an extended fully probabilistic design (FPD) is developed based on the conventional FPD to handle multiplicative noise and to track the time-varying references in a superior way. Finally, the proposed control framework is verified by a numerical example, and a comparison with the linear-quadratic regulator (LQR) method is also included to illustrate the superiority of the proposed framework.
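
A sketch of the B-spline weight representation that such a tracking controller acts on: a target output PDF is approximated by a least-squares B-spline, and the spline coefficients are read off as the weight vector; the knot layout and the bimodal target are assumptions.

```python
# Approximate a target PDF by a cubic B-spline and extract its coefficient (weight) vector.
import numpy as np
from scipy.stats import norm
from scipy.interpolate import make_lsq_spline

x = np.linspace(-4, 4, 400)
target_pdf = 0.6 * norm.pdf(x, -1, 0.7) + 0.4 * norm.pdf(x, 1.5, 0.5)

k = 3
interior = np.linspace(-3, 3, 9)
t = np.r_[[x[0]] * (k + 1), interior, [x[-1]] * (k + 1)]    # clamped knot vector (assumed layout)
spl = make_lsq_spline(x, target_pdf, t, k=k)

weights = spl.c                                             # the finite "weight" state to be tracked
err = np.max(np.abs(spl(x) - target_pdf))
print(f"{weights.size} B-spline weights, max approximation error {err:.4f}")
```

Tracking the time-varying output PDF then reduces to steering this finite weight vector, which is the state-tracking reformulation described above.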

16.
Entropy (Basel) ; 25(4)2023 Mar 31.
Article in English | MEDLINE | ID: mdl-37190386

ABSTRACT

Typical human-scaled considerations of thermodynamic states depend primarily on the core of associated speed or other relevant distributions, because the wings of those distributions are so improbable that they cannot contribute significantly to averages. However, for long timescale regimes (slow time), previous papers have shown otherwise. Fluctuating local equilibrium systems have been proven to have distributions with non-Gaussian tails demanding more careful treatment. That has not been needed in traditional statistical mechanics. The resulting non-Gaussian distributions do not admit notions such as temperature; that is, a global temperature is not defined even if local regimes have meaningful temperatures. A fluctuating local thermodynamic equilibrium implies that any local detector is exposed to sequences of local states which collectively induce the non-Gaussian forms. This paper shows why tail behavior is observationally challenging, how the convolutions that produce non-Gaussian behavior are directly linked to time-coarse graining, how a fluctuating local equilibrium system does not need to have a collective temperature, and how truncating the tails in the convolution probability density function (PDF) produces even more non-Gaussian behaviors.

17.
Entropy (Basel) ; 25(3)2023 Mar 09.
Article in English | MEDLINE | ID: mdl-36981365

ABSTRACT

When a laser is transmitted through the atmosphere, turbulence can cause effects such as light intensity fluctuations and phase fluctuations, which seriously affect a number of optical engineering applications and climate research. Therefore, accurately obtaining real-time turbulence intensity information using active lidar remote sensing technology is of great significance. In this paper, based on residual turbulent scintillation theory, a Mie-scattering lidar method was developed to detect atmospheric turbulence intensity. By extracting light intensity fluctuation information from the Mie-scattering lidar return signal, the atmospheric refractive index structure constant, Cn2, representing the atmospheric turbulence intensity, could be obtained. Specifically, the scintillation effect on the detection path was analyzed, and the probability density distribution of the light intensity of the Mie-scattering lidar return signal was studied. It was verified that the probability density of the logarithmic light intensity basically follows a normal distribution under weak fluctuation conditions. The Cn2 profile based on Kolmogorov turbulence theory was retrieved with a layered, iterative method through the scintillation index. The method for detecting Kolmogorov turbulence intensity was then applied to the detection of non-Kolmogorov turbulence intensity; through detection using the scintillation index, the corresponding C̃n2 profile could be calculated. The detected C̃n2 and Cn2 profiles were compared with the Hufnagel-Valley (HV) night model in the Yinchuan area. The results show that the detection results are consistent with the overall change trend of the model. In general, it is feasible to detect a non-Kolmogorov turbulence profile using Mie-scattering lidar.
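
A sketch of the return-signal processing step under the weak-fluctuation assumption, using the standard plane-wave Rytov relation sigma_I^2 = 1.23 Cn2 k^(7/6) L^(11/6); the wavelength, path length, and synthetic log-normal intensities are assumptions, not the Yinchuan measurements or the paper's layered retrieval.

```python
# Estimate the scintillation index from intensity samples, check log-normality,
# and invert the plane-wave Rytov relation for Cn2.
import numpy as np
from scipy.stats import shapiro

rng = np.random.default_rng(8)
wavelength, L = 532e-9, 2000.0                 # lidar wavelength (m) and path length (m), assumed
k = 2 * np.pi / wavelength

sigma_lnI = 0.3                                # weak-fluctuation regime
I = np.exp(rng.normal(0.0, sigma_lnI, 5000))   # synthetic return intensities

scint_index = I.var() / I.mean() ** 2          # sigma_I^2
cn2 = scint_index / (1.23 * k ** (7 / 6) * L ** (11 / 6))
_, p = shapiro(np.log(I[:500]))                # log-intensity ~ normal under weak fluctuation
print(f"sigma_I^2 = {scint_index:.3f}, Cn2 ~ {cn2:.2e} m^(-2/3), Shapiro p = {p:.2f}")
```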

18.
Philos Trans A Math Phys Eng Sci ; 380(2218): 20210097, 2022 Mar 07.
Article in English | MEDLINE | ID: mdl-35034486

ABSTRACT

The variation of the statistical properties of an incompressible velocity field, a passive vector, and a passive scalar in isotropic turbulence was studied using direct numerical simulation. The structure functions of the gradients, and the moments of the dissipation rates, began to increase at about [Formula: see text] from the Gaussian state and grew rapidly at [Formula: see text] in the turbulent state. A contour map of the probability density functions (PDFs) indicated that the PDF expansion of the gradients of the passive vector and passive scalar begins at around [Formula: see text], whereas that of the longitudinal velocity gradient PDF is more gradual. The left tails of the dissipation rate PDFs were found to follow a power law with an exponent of 3/2 for the incompressible velocity and passive vector dissipation rates, and 1/2 for the scalar dissipation rate and the enstrophy; the exponents remained constant for all Reynolds numbers, indicating the universality of the left tail. The analytical PDFs of the dissipation rates and enstrophy of the Gaussian state were obtained and found to be Gamma distributions. It was shown that the number of terms contributing to the dissipation rates and the enstrophy determines the decay rates of the two PDFs for low to moderate amplitudes. This article is part of the theme issue 'Scaling the turbulence edifice (part 1)'.

19.
Acta Biotheor ; 70(4): 25, 2022 Sep 16.
Article in English | MEDLINE | ID: mdl-36112233

ABSTRACT

In this work, we study and analyze the aggregate death counts of COVID-19 reported by the United States Centers for Disease Control and Prevention (CDC) for the fifty states in the United States. To do this, we derive a stochastic model describing the cumulative number of deaths reported daily by the CDC from the first recorded COVID-19 death to June 20, 2021 in the United States, and provide a forecast for the death cases. The stochastic model derived in this work performs better than existing deterministic logistic models because it is able to capture irregularities in the sample path of the aggregate death counts. The probability distribution of the aggregate death counts is derived, analyzed, and used to estimate the count's per capita initial growth rate, carrying capacity, and expected value for each given day as of the time this research was conducted. Using this distribution, we estimate the expected first passage time at which the aggregate death count starts slowing down. Our results show that the expected aggregate death count was slowing down in all states as of the time this analysis was conducted (June 2021). A formula for predicting the end of COVID-19 deaths is derived. The expected daily death count for each state is plotted as a function of time. The probability density function for the current day, together with the forecast and its confidence interval for the next four days, and the root mean square error of our simulation results are estimated.
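
A hedged sketch of a stochastic logistic growth simulation with a first-passage estimate of when growth starts slowing (taken here as crossing K/2, the logistic inflection point); the parameters are made up rather than CDC-fitted, and the paper's model may differ in form.

```python
# Simulate dN = r N (1 - N/K) dt + sigma N dW for many paths and estimate the
# expected first time the cumulative count passes K/2.
import numpy as np

rng = np.random.default_rng(9)
r, K, sigma = 0.08, 1e5, 0.01                  # per-day growth rate, carrying capacity, noise (assumed)
dt, n_days, n_paths = 0.1, 600, 2000

N = np.full(n_paths, 50.0)
first_passage = np.full(n_paths, np.nan)
for step in range(int(n_days / dt)):
    dW = rng.normal(0, np.sqrt(dt), n_paths)
    N += r * N * (1 - N / K) * dt + sigma * N * dW
    newly = np.isnan(first_passage) & (N >= K / 2)
    first_passage[newly] = (step + 1) * dt

print(f"expected first passage of K/2 ~ {np.nanmean(first_passage):.1f} days")
```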


Subjects
COVID-19, Animals, United States/epidemiology
20.
Sensors (Basel) ; 23(1)2022 Dec 22.
Article in English | MEDLINE | ID: mdl-36616708

ABSTRACT

The self-noise level of a seismometer can determine the performance of the seismic instrument and limit the ability to use seismic data to solve geoscience problems. Accurately measuring and simultaneously comparing the self-noise models of different types of seismometers has long been a challenging task due to the constraints of observation conditions. In this paper, the self-noise power spectral density (PSD) values of nine types of seismometers are calculated using four months of continuous seismic waveforms from Malingshan seismic station, China, and nine self-noise models are obtained based on the probability density function (PDF) representation. For the STS-2.5 seismometer, the self-noise levels on the horizontal channels (E-W and N-S) are significantly higher than the level on the vertical channel (U-D) in the microseism band (0.1 Hz to 1 Hz), a computational bias caused by the misalignment of the sensors in the horizontal direction, while the remarkably elevated noise on the horizontal channels at low frequencies (<0.6 Hz) may originate from local variations in atmospheric pressure. For the very broadband seismometers Trillium-Horizon-120 and Trillium-120PA and the ultra-broadband seismometers Trillium-Horizon-360 and CMG-3T-360, there is a trade-off between the microseism band and the low-frequency range in the PSD curves of the vertical channel: when the self-noise level in the microseism band is high, the self-noise at low frequencies is relatively low. Although the self-noise level of the vertical component of the STS-2.5 is 3 dB to 4 dB lower than that of the other very broadband seismometers at frequencies below 1 Hz, its self-noise level at high frequencies (>2 Hz) is slightly higher than the others. From our observations, we conclude that none of the nine seismometers reaches the lowest noise level in all frequency bands within the working range.
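
A sketch of the PSD/PDF bookkeeping on synthetic noise rather than the Malingshan records: Welch PSDs of many segments are histogrammed on a dB grid per frequency to form a PDF representation; the sampling rate, segment length, and binning are assumptions.

```python
# Stack Welch PSDs of many segments into a probability density over dB levels per frequency.
import numpy as np
from scipy.signal import welch

rng = np.random.default_rng(10)
fs, seg_len, n_segments = 100.0, 360_000, 24      # 100 Hz, 1-hour segments, one day (assumed)
db_bins = np.arange(-200, -79, 1.0)

psd_hist = None
for _ in range(n_segments):
    x = rng.normal(0, 1e-7, seg_len)              # stand-in for one hour of recorded noise
    f, pxx = welch(x, fs=fs, nperseg=2**14)
    pxx_db = 10 * np.log10(pxx[1:])               # drop the zero-frequency bin
    if psd_hist is None:
        psd_hist = np.zeros((f.size - 1, db_bins.size - 1))
    for i, val in enumerate(pxx_db):
        j = np.searchsorted(db_bins, val) - 1
        if 0 <= j < db_bins.size - 1:
            psd_hist[i, j] += 1

pdf = psd_hist / psd_hist.sum(axis=1, keepdims=True)   # PDF of PSD values at each frequency
mode_db = db_bins[pdf.argmax(axis=1)]
print(f"modal noise level near 1 Hz: {mode_db[np.argmin(np.abs(f[1:] - 1.0))]:.1f} dB")
```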
