Results 1 - 20 of 74
1.
Value Health; 24(5): 648-657, 2021 May.
Article in English | MEDLINE | ID: mdl-33933233

ABSTRACT

OBJECTIVES: Coronavirus disease 2019 has put unprecedented pressure on healthcare systems worldwide, leading to a reduction of the available healthcare capacity. Our objective was to develop a decision model to estimate the impact of postponing semielective surgical procedures on health, to support prioritization of care from a utilitarian perspective. METHODS: A cohort state-transition model was developed and applied to 43 semielective nonpediatric surgical procedures commonly performed in academic hospitals. Scenarios of delaying surgery by 2 weeks were compared with delays of up to 1 year and with forgoing surgery altogether. Model parameters were based on registries, scientific literature, and the World Health Organization Global Burden of Disease study. For each surgical procedure, the model estimated the average expected disability-adjusted life-years (DALYs) per month of delay. RESULTS: Given the best available evidence, the 2 surgical procedures associated with the most DALYs owing to delay were bypass surgery for Fontaine III/IV peripheral arterial disease (0.23 DALY/month, 95% confidence interval [CI]: 0.13-0.36) and transaortic valve implantation (0.15 DALY/month, 95% CI: 0.09-0.24). The 2 surgical procedures with the least DALYs were placement of a shunt for dialysis (0.01, 95% CI: 0.005-0.01) and thyroid carcinoma resection (0.01, 95% CI: 0.01-0.02). CONCLUSION: Expected health loss owing to surgical delay can be objectively calculated with our decision model based on the best available evidence, which can guide prioritization of surgical procedures to minimize population health loss in times of scarcity. The model results should be placed in the context of different ethical perspectives and combined with capacity management tools to facilitate large-scale implementation.
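To make the modelling approach more concrete, the sketch below accumulates expected DALYs (years of life lost plus years lived with disability) for a cohort waiting for surgery in a minimal discrete-time state-transition model. All transition probabilities, the disability weights, and the remaining life expectancy are hypothetical placeholders, not parameters from this study.

```python
def dalys_from_delay(months_delayed,
                     p_die_per_month=0.004,        # hypothetical monthly mortality while waiting
                     p_deteriorate_per_month=0.02, # hypothetical monthly risk of deterioration
                     disability_weight=0.2,        # hypothetical disability weight while waiting
                     life_expectancy_years=15.0):  # hypothetical remaining life expectancy
    """Expected DALYs per patient from delaying surgery, in a minimal
    three-state cohort model: waiting -> deteriorated -> dead.
    DALYs = years of life lost (deaths on the waiting list) +
            years lived with disability (time spent waiting/deteriorated)."""
    waiting, deteriorated = 1.0, 0.0
    yll = yld = 0.0
    for _ in range(months_delayed):
        new_deaths = (waiting + deteriorated) * p_die_per_month
        new_deterioration = waiting * p_deteriorate_per_month
        yll += new_deaths * life_expectancy_years
        # person-months with disability, converted to years; deterioration doubles the weight
        yld += (waiting * disability_weight + deteriorated * 2 * disability_weight) / 12.0
        waiting -= waiting * p_die_per_month + new_deterioration
        deteriorated += new_deterioration - deteriorated * p_die_per_month
    return yll + yld

for months in (1, 6, 12):
    print(months, "month(s):", round(dalys_from_delay(months), 3), "DALYs")
```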


Subjects
COVID-19/complications, Computer Simulation, Population Health/statistics & numerical data, Surge Capacity/standards, Cohort Studies, Global Burden of Disease, Humans, Life Expectancy/trends, Probability Theory, Quality-Adjusted Life Years, Surge Capacity/statistics & numerical data
2.
J Math Biol; 80(6): 1971-1992, 2020 May.
Article in English | MEDLINE | ID: mdl-32253463

ABSTRACT

This paper introduces a new way to define a genome rearrangement distance, using the concept of mean first passage time from probability theory. Crucially, this distance provides a genuine metric on genome space. We develop the theory and introduce a link to a graph-based zeta function. The approach is very general and can be applied to a wide variety of group-theoretic models of genome evolution.
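A minimal sketch of the central quantity, the mean first passage time (MFPT) of a Markov chain, is below. The three-state chain is a toy stand-in for a space of genome arrangements; the paper's group-theoretic construction and the symmetrisation needed to obtain a genuine metric are not reproduced here.

```python
import numpy as np

def mean_first_passage_times(P, target):
    """Mean first passage time from every state to `target` for a
    discrete-time Markov chain with transition matrix P, solving
    m_i = 1 + sum_{k != target} P[i, k] * m_k."""
    n = P.shape[0]
    others = [i for i in range(n) if i != target]
    Q = P[np.ix_(others, others)]                    # transitions among non-target states
    m = np.linalg.solve(np.eye(len(others)) - Q, np.ones(len(others)))
    out = np.zeros(n)
    out[others] = m
    return out

# Toy chain over three genome arrangements with uniform random rearrangement moves.
P = np.array([[0.0, 0.5, 0.5],
              [0.5, 0.0, 0.5],
              [0.5, 0.5, 0.0]])
print(mean_first_passage_times(P, target=0))   # expected steps to reach arrangement 0
```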


Subjects
Molecular Evolution, Gene Rearrangement, Genetic Models, Phylogeny, Bacteria/genetics, Chromosome Inversion, Genome, Bacterial Genome, Likelihood Functions, Markov Chains, Mathematical Concepts, Probability Theory, Time Factors
3.
Value Health; 22(4): 446-452, 2019 Apr.
Article in English | MEDLINE | ID: mdl-30975396

ABSTRACT

BACKGROUND: Paired-gamble methods have been proposed to avoid the "certainty effect" associated with standard gamble methods. OBJECTIVE: This study examines the role of starting-point effects in paired-gamble methods; in particular, how the utilities so derived vary as a function of the probabilities of the stimulus lottery. METHODS: A sample of 455 members of the Spanish general population valued 9 health states via face-to-face interviews. Subjects were randomly placed into 3 subgroups, which differed in terms of the stimulus gamble's probability. Nonparametric tests and an interval regression model were used to test whether utilities change when the probability distribution is modified. RESULTS: Nonparametric tests showed that the probability of a health state being considered worse than death did not differ among subgroups. Nevertheless, changes in the stimulus gamble did produce significant differences in the distribution of utilities: the higher the probability of full health in the stimulus, the higher the utility elicited. Regression estimates support the existence of starting-point effects when the utilities are obtained under expected utility. Under prospect theory, the conclusions depend on the reference point considered. When the reference points used are death or the health state evaluated, we observe differences among these groups. Nevertheless, when full health is used, these differences disappear. CONCLUSION: This research suggests that paired-gamble methods may also be susceptible to starting-point effects. Yet the differences are small, and they disappear when the data are analyzed using prospect theory with full health as the reference point.


Subjects
Alcoholism/diagnosis, Health Status Indicators, Health Status, Probability Theory, Quality of Life, Alcoholism/mortality, Alcoholism/psychology, Alcoholism/therapy, Cost of Illness, Statistical Data Interpretation, Female, Humans, Male, Middle Aged, Statistical Models, Patient Preference, Probability, Quality-Adjusted Life Years, Socioeconomic Factors, Spain/epidemiology
4.
Proc Biol Sci; 283(1828), 2016 Apr 13.
Article in English | MEDLINE | ID: mdl-27053743

ABSTRACT

Classical probability theory has been influential in modelling decision processes, despite empirical findings that have been persistently paradoxical from classical perspectives. For such findings, some researchers have been successfully pursuing decision models based on quantum theory (QT). One unique feature of QT is the collapse postulate, which entails that measurements (or, in decision-making, judgements) reset the state to be consistent with the measured outcome. If there is quantum structure in cognition, then there has to be evidence for the collapse postulate. A striking a priori prediction is that opinion change will be slowed down (under idealized conditions, frozen) by continuous judgements. In physics, this is the quantum Zeno effect. We demonstrate a quantum Zeno effect in human decision-making, providing evidence that supports the use of quantum principles in decision theory, at least in some cases.
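The quantum Zeno effect invoked here has a simple numerical illustration: for a two-level system whose state rotates by a fixed total angle, frequent intermediate projective measurements keep the system pinned to its initial state. The toy simulation below (assumed parameters, not the paper's cognitive experiment) shows the survival probability rising toward 1 as measurements become more frequent.

```python
import numpy as np

def survival_probability(n_measurements, total_angle=np.pi / 2):
    """Probability that a two-level system, rotating by `total_angle` in all,
    is found in its initial state at every one of n equally spaced projective
    measurements: each measurement succeeds with probability cos^2(angle/n)."""
    theta = total_angle / n_measurements
    return np.cos(theta) ** (2 * n_measurements)

for n in (1, 2, 5, 20, 100):
    print(f"{n:3d} measurements: P(state unchanged) = {survival_probability(n):.4f}")
# One measurement after a pi/2 rotation never finds the initial state;
# one hundred intermediate measurements almost always do (the Zeno effect).
```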


Subjects
Decision Making, Decision Theory, Humans, Psychological Models, Probability Theory, Quantum Theory
5.
Value Health; 18(4): 413-24, 2015 Jun.
Article in English | MEDLINE | ID: mdl-26091595

ABSTRACT

BACKGROUND: Within the standard gamble approach to the elicitation of health preferences, no previous studies have compared probability equivalent (PE) and certainty equivalent (CE) techniques. OBJECTIVE: This study aimed to explore the differences between CE and PE techniques when payoffs are expressed in terms of life-years or quality of life. METHODS: Individuals were interviewed through both CE and PE techniques within an experimental setting. Inferential statistics and regression analysis were applied to process the data. Order and sequence effects were also investigated. RESULTS: On average, the elicitation technique did not affect individuals' risk attitude significantly. Individuals proved to be risk averse in gambles concerning life-years and risk seeking in those concerning quality of life. No order or sequence effect was observed. The risk premium, which measures the strength of risk attitude as the percentage variation between the individual's estimated PE or CE and the risk-neutral PE or CE, was affected by the kind of gamble the interviewee was presented with: it increased in gambles concerning health profiles, denoting a stronger risk propensity, and decreased in gambles concerning life-years, denoting a stronger risk aversion. CONCLUSION: The choice of the elicitation technique did not affect individuals' risk attitude significantly, which instead was sensitive to the kind of gamble.


Subjects
Gambling/psychology, Health Status, Probability Theory, Quality of Life/psychology, Quality-Adjusted Life Years, Risk-Taking, Choice Behavior, Female, Gambling/economics, Humans, Male
6.
Arch. med; 15(1): 33-45, Jun. 2015.
Article in Spanish | LILACS | ID: lil-776036

ABSTRACT

Objective: to confirm the diagnostic capacity of a methodology based on probability theory in cases of arrhythmia. Materials and methods: a blinded study was conducted in which 10 normal Holter recordings and 90 with different types of arrhythmia, from patients over 21 years of age, were analyzed. The conventional diagnosis was masked, and the probability of ranges of maximum, minimum, and intermediate heart rates per hour, and of the number of beats per hour, was calculated to determine the mathematical diagnosis according to the three previously established parameters. Finally, the conventional diagnosis, taken as the gold standard, was unmasked, and an analysis of diagnostic concordance was performed to differentiate normality from acute arrhythmia. Results: the normal Holter recordings presented mathematical values characteristic of normality or of evolution toward disease, while all pathological cases were diagnosed by the methodology as evolving toward disease or diseased. A sensitivity of 100%, a specificity of 70%, a PPV of 93.33%, and an NPV of 100% were obtained, and the Kappa coefficient was 0.82. Conclusions: the application of the methodology to the study of arrhythmic alterations reveals a mathematical order by which normality can be differentiated from disease and states of evolution toward disease can be detected even in clinical states with a normal diagnosis, which may be of preventive utility at the clinical level.
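For reference, the concordance statistics reported above (sensitivity, specificity, predictive values, Cohen's kappa) can be computed from a 2×2 confusion matrix as in the sketch below. The counts used are hypothetical and are not the study's tabulation.

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV, NPV and Cohen's kappa for a binary
    test compared against a gold-standard diagnosis."""
    n = tp + fp + fn + tn
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    ppv = tp / (tp + fp)
    npv = tn / (tn + fn)
    p_observed = (tp + tn) / n
    p_expected = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n**2
    kappa = (p_observed - p_expected) / (1 - p_expected)
    return sensitivity, specificity, ppv, npv, kappa

# Hypothetical counts: all diseased recordings flagged, a few normals flagged too.
print(diagnostic_metrics(tp=90, fp=3, fn=0, tn=7))
```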


Subjects
Cardiac Arrhythmias, Diagnosis, Ambulatory Electrocardiography, Probability Theory
7.
J Math Biol; 70(3): 647-78, 2015 Feb.
Article in English | MEDLINE | ID: mdl-24682331

ABSTRACT

The selection of dispersion is a classical problem in ecology and evolutionary biology. Deterministic dynamical models of two competing species differing only in their passive dispersal rates suggest that the lower mobility species has a competitive advantage in inhomogeneous environments, and that dispersion is a neutral characteristic in homogeneous environments. Here we consider models including local population fluctuations due to both individual movements and random birth and death events to investigate the effect of demographic stochasticity on the competition between species with different dispersal rates. In this paper, the first of two, we focus on homogeneous environments where deterministic models predict degenerate dynamics in the sense that there are many (marginally) stable equilibria with the species' coexistence ratio depending only on initial data. When demographic stochasticity is included the situation changes. A novel large carrying capacity ([Formula: see text]) asymptotic analysis, confirmed by direct numerical simulations, shows that a preference for faster dispersers emerges on a precisely defined [Formula: see text] time scale. We conclude that while there is no evolutionarily stable rate for competitors to choose in these models, the selection mechanism quantified here is the essential counterbalance in spatially inhomogeneous models including demographic fluctuations which do display an evolutionarily stable dispersal rate.


Subjects
Biological Evolution, Biological Models, Animals, Computer Simulation, Ecosystem, Biological Extinction, Mathematical Concepts, Monte Carlo Method, Population Dynamics, Probability Theory, Stochastic Processes, Systems Biology
8.
J Neurosci Methods; 240: 179-92, 2015 Jan 30.
Article in English | MEDLINE | ID: mdl-25479231

ABSTRACT

BACKGROUND: Neural information processing involves a series of nonlinear dynamical input/output transformations between the spike trains of neurons/neuronal ensembles. Understanding and quantifying these transformations is critical both for understanding neural physiology such as short-term potentiation and for developing cognitive neural prosthetics. NEW METHOD: A novel method for estimating Volterra kernels for systems with point-process inputs and outputs is developed based on elementary probability theory. These Probability Based Volterra (PBV) kernels essentially describe the probability of an output spike given q input spikes at various lags t1, t2, …, tq. RESULTS: The PBV kernels are used to estimate both synthetic systems where ground truth is available and data from the CA3 and CA1 regions of the rodent hippocampus. The PBV kernels give excellent predictive results in both cases. Furthermore, they are shown to be quite robust to noise and to have good convergence and overfitting properties. Through a slight modification, the PBV kernels are shown to also deal well with correlated point-process inputs. COMPARISON WITH EXISTING METHODS: The PBV kernels were compared with kernels estimated through least squares estimation (LSE) and through the Laguerre expansion technique (LET). The LSE kernels were shown to fare very poorly with real data due to the large amount of input noise. Although the LET kernels gave the best predictive results in all cases, they require prior parameter estimation. It was shown how the PBV and LET methods can be combined synergistically to maximize performance. CONCLUSIONS: The PBV kernels provide a novel and intuitive method of characterizing point-process input-output nonlinear systems.
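As a rough illustration of the idea behind a first-order PBV kernel, the sketch below estimates the probability of an output spike conditioned on a single input spike at each lag, relative to the baseline output rate, from synthetic binary spike trains. The data, the baseline normalization, and the parameters are assumptions for the example; the paper's exact kernel construction and the hippocampal recordings are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(0)

def first_order_pbv(x, y, max_lag):
    """Estimate P(output spike at t | input spike at t - lag) minus the baseline
    output rate, for lags 1..max_lag, from binary spike trains x (input) and y."""
    baseline = y.mean()
    kernel = np.zeros(max_lag)
    for lag in range(1, max_lag + 1):
        spike_times = np.nonzero(x[:-lag])[0]          # input spikes with room for the lag
        kernel[lag - 1] = y[spike_times + lag].mean() - baseline
    return kernel

# Synthetic point-process system: an input spike raises the output spike
# probability over the next three time bins.
T, max_lag = 200_000, 6
x = (rng.random(T) < 0.05).astype(int)
drive = np.convolve(x, [0.0, 0.4, 0.25, 0.1], mode="full")[:T]
y = (rng.random(T) < 0.01 + drive).astype(int)

print(np.round(first_order_pbv(x, y, max_lag), 3))    # large at lags 1-3, near zero after
```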


Subjects
Action Potentials, Neurons/physiology, Probability Theory, Computer-Assisted Signal Processing, Algorithms, Animals, CA1 Hippocampal Region/physiology, CA3 Hippocampal Region/physiology, Computer Simulation, Implanted Electrodes, Least-Squares Analysis, Male, Neurological Models, Monte Carlo Method, Motor Activity/physiology, Nonlinear Dynamics, ROC Curve, Long-Evans Rats
9.
J Chem Phys; 140(3): 034706, 2014 Jan 21.
Article in English | MEDLINE | ID: mdl-25669406

ABSTRACT

The interactions between proteins/peptides and materials are crucial to research and development in many biomedical engineering fields. The energetics of such interactions are key in the evaluation of new proteins/peptides and materials. Much research has recently focused on the quality of free energy profiles obtained with Jarzynski's equality, an equation widely used for biosystems. In the present work, considerable discrepancies were observed between the results obtained by Jarzynski's equality and those derived by umbrella sampling in biomaterial-water model systems. Detailed analyses confirm that such discrepancies arise only when the target molecule moves in the high-density water layer on a material surface. A hybrid scheme was therefore adopted based on this observation. The agreement between the results of the hybrid scheme and umbrella sampling confirms the earlier observation and points toward a fast and accurate estimation of adsorption free energy for large biomaterial interfacial systems.
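Jarzynski's equality estimates a free-energy difference from an exponential average of nonequilibrium work, dF = -kT ln<exp(-W/kT)>. The toy sketch below, with a synthetic Gaussian work distribution (for which the exact answer is mu - sigma^2/(2kT)), illustrates why the estimator converges slowly when work fluctuations are large, the kind of situation in which discrepancies against umbrella sampling can appear. All numbers are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)
kT = 2.5                   # roughly k_B * T in kJ/mol at ~300 K
mu, sigma = 20.0, 5.0      # hypothetical mean and spread of the work distribution

def jarzynski_estimate(work, kT):
    """Free-energy estimate from Jarzynski's equality: dF = -kT ln <exp(-W/kT)>."""
    return -kT * np.log(np.mean(np.exp(-work / kT)))

analytic = mu - sigma**2 / (2 * kT)   # exact result for a Gaussian work distribution
for n in (10**2, 10**4, 10**6):
    work = rng.normal(mu, sigma, size=n)
    print(f"n = {n:>7}: estimate {jarzynski_estimate(work, kT):6.2f}  (analytic {analytic:.2f})")
# Small samples are biased high because the average is dominated by rare,
# low-work trajectories that are seldom sampled.
```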


Subjects
Oligopeptides/chemistry, Water/chemistry, Adsorption, Molecular Dynamics Simulation, Probability Theory, Thermodynamics
10.
Top Cogn Sci; 5(4): 818-43, 2013 Oct.
Article in English | MEDLINE | ID: mdl-24019237

ABSTRACT

In recent years quantum probability models have been used to explain many aspects of human decision making, and as such quantum models have been considered a viable alternative to Bayesian models based on classical probability. One criticism often leveled at both kinds of models is that they lack a clear interpretation in terms of psychological mechanisms. In this paper we discuss the mechanistic underpinnings of a quantum walk model of human decision making and response time. The quantum walk model is compared to standard sequential sampling models, and the architectural assumptions of both are considered. In particular, we show that the quantum model has a natural interpretation in terms of a cognitive architecture that is massively parallel and involves both co-operative (excitatory) and competitive (inhibitory) interactions between units. Additionally, we introduce a family of models that includes aspects of both the classical and quantum walk models.


Subjects
Cognition, Decision Making/physiology, Psychological Models, Probability Theory, Probability, Quantum Theory, Humans, Reaction Time, Time Factors
11.
Behav Brain Sci; 36(3): 294-5, 2013 Jun.
Article in English | MEDLINE | ID: mdl-23673040

ABSTRACT

According to the foundations of economic theory, agents have stable and coherent "global" preferences that guide their choices among alternatives. However, people are constrained by information-processing and memory limitations and hence have a propensity to avoid cognitive load. We propose that this in turn will encourage them to respond to "local" preferences and goals influenced by context and memory representations.


Subjects
Cognition, Psychological Models, Probability Theory, Quantum Theory, Humans
12.
Top Cogn Sci; 5(1): 13-34, 2013 Jan.
Article in English | MEDLINE | ID: mdl-23335572

ABSTRACT

The idea that perceptual and cognitive systems must incorporate knowledge about the structure of the environment has become a central dogma of cognitive theory. In a Bayesian context, this idea is often realized in terms of "tuning the prior," widely assumed to mean adjusting prior probabilities so that they match the frequencies of events in the world. This kind of "ecological" tuning has often been held up as an ideal of inference, in fact defining an "ideal observer." But widespread as this viewpoint is, it directly contradicts Bayesian philosophy of probability, which views probabilities as degrees of belief rather than relative frequencies, and explicitly denies that they are objective characteristics of the world. Moreover, tuning the prior to observed environmental frequencies is subject to overfitting, meaning in this context overtuning to the environment, which leads (ironically) to poor performance in future encounters with the same environment. Whenever there is uncertainty about the environment, which there almost always is, an agent's prior should be biased away from ecological relative frequencies and toward simpler and more entropic priors.
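The overfitting point has a compact numerical illustration: a prior set exactly to observed environmental frequencies can predict future draws from the same environment worse than a smoother, more entropic prior. The Dirichlet(1) smoothing below is just one simple way to bias toward entropy, not the authors' proposal, and the environment frequencies are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(2)
true_p = np.array([0.55, 0.25, 0.15, 0.05])      # hypothetical environment frequencies

def log_loss(p, outcomes):
    return -np.mean(np.log(p[outcomes]))

n_train, n_test, trials = 20, 2000, 200
loss_tuned, loss_smooth = [], []
for _ in range(trials):
    train = rng.choice(len(true_p), size=n_train, p=true_p)
    test = rng.choice(len(true_p), size=n_test, p=true_p)
    counts = np.bincount(train, minlength=len(true_p))
    tuned = np.clip(counts / counts.sum(), 1e-12, None)     # prior = observed frequencies
    smooth = (counts + 1) / (counts.sum() + len(true_p))    # Dirichlet(1) smoothing
    loss_tuned.append(log_loss(tuned, test))
    loss_smooth.append(log_loss(smooth, test))

print("frequency-matched prior, mean log loss:", round(float(np.mean(loss_tuned)), 3))
print("smoothed (more entropic) prior, mean log loss:", round(float(np.mean(loss_smooth)), 3))
```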


Subjects
Bayes Theorem, Cognitive Science, Psychological Generalization, Knowledge, Probability Theory, Humans, Monte Carlo Method, Observer Variation, Philosophy
13.
J Comput Biol; 20(1): 1-18, 2013 Jan.
Article in English | MEDLINE | ID: mdl-23294268

ABSTRACT

The Dirichlet process is used to model probability distributions that are mixtures of an unknown number of components. Amino acid frequencies at homologous positions within related proteins have been fruitfully modeled by Dirichlet mixtures, and we use the Dirichlet process to derive such mixtures with an unbounded number of components. This application of the method requires several technical innovations to sample an unbounded number of Dirichlet-mixture components. The resulting Dirichlet mixtures model multiple-alignment data substantially better than do previously derived ones. They consist of over 500 components, in contrast to fewer than 40 previously, and provide a novel perspective on the structure of proteins. Individual protein positions should be seen not as falling into one of several categories, but rather as arrayed near probability ridges winding through amino acid multinomial space.
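The key ingredient that keeps the number of mixture components unbounded is the Dirichlet process itself; a minimal stick-breaking sketch of its mixture weights is below. The concentration parameter and the truncation level are illustrative, and the per-component Dirichlet densities over the 20 amino acids are only indicated in a comment.

```python
import numpy as np

rng = np.random.default_rng(3)

def stick_breaking(alpha, n_components):
    """Truncated stick-breaking draw of Dirichlet-process mixture weights:
    w_k = beta_k * prod_{j<k} (1 - beta_j), with beta_k ~ Beta(1, alpha)."""
    betas = rng.beta(1.0, alpha, size=n_components)
    remaining = np.concatenate([[1.0], np.cumprod(1.0 - betas[:-1])])
    return betas * remaining

alpha, truncation = 5.0, 1000
weights = stick_breaking(alpha, truncation)
# In the full model, each component k would carry its own Dirichlet density
# over the 20 amino acids; here we only look at how the weight is spread.
effective = int(np.sum(np.cumsum(np.sort(weights)[::-1]) < 0.99)) + 1
print("components covering 99% of the total weight:", effective)
```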


Subjects
Proteins/chemistry, Proteins/genetics, Sequence Alignment/statistics & numerical data, Algorithms, Bayes Theorem, Computational Biology, Likelihood Functions, Markov Chains, Mathematical Concepts, Statistical Models, Monte Carlo Method, Probability Theory, Nonparametric Statistics
14.
Environ Monit Assess; 185(5): 3917-29, 2013 May.
Article in English | MEDLINE | ID: mdl-22941202

ABSTRACT

The evaluation of the status of a municipal drinking water treatment plant (WTP) is important. The evaluation depends on several factors, including human health risks from disinfection by-products (R), disinfection performance (D), and cost (C) of water production and distribution. The Dempster-Shafer theory (DST) of evidence can combine the individual status with respect to R, D, and C to generate a new indicator, from which the overall status of a WTP can be evaluated. In the DST, the ranges of the different factors affecting the overall status are divided into several segments. The basic probability assignments (BPA) for each segment of these factors are provided by multiple experts and then combined to obtain the overall status. In assigning the BPA, the experts use their individual judgments, which can introduce subjective biases into the overall evaluation. In this research, an approach is introduced to avoid the assignment of subjective BPA. The factors contributing to the overall status were characterized using probability density functions (PDF). The cumulative probabilities for different segments of these factors were determined from the cumulative density function and then assigned as the BPA for these factors. A case study is presented to demonstrate the application of PDF in DST to evaluate a WTP, leading to the selection of the required level of upgrading for the WTP.
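The combination step described here is Dempster's rule; a minimal sketch for two basic probability assignments over a tiny frame of discernment ("good" vs. "poor" plant status) is below. The masses and the frame are hypothetical and far smaller than the R/D/C evidence structure used in the paper.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule of combination for two basic probability assignments,
    given as dicts mapping frozenset hypotheses to masses."""
    combined, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        intersection = a & b
        if intersection:
            combined[intersection] = combined.get(intersection, 0.0) + wa * wb
        else:
            conflict += wa * wb                      # mass assigned to contradictory pairs
    return {h: w / (1.0 - conflict) for h, w in combined.items()}

GOOD, POOR = frozenset({"good"}), frozenset({"poor"})
FRAME = GOOD | POOR                                  # total ignorance
# Hypothetical assessments of plant status from two lines of evidence.
m_risk = {GOOD: 0.6, POOR: 0.1, FRAME: 0.3}
m_disinfection = {GOOD: 0.4, POOR: 0.4, FRAME: 0.2}
for hypothesis, mass in dempster_combine(m_risk, m_disinfection).items():
    print(set(hypothesis), round(mass, 3))
```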


Subjects
Statistics as Topic, Water Purification/methods, Algorithms, Disinfection, Probability, Probability Theory, Water Purification/economics, Water Purification/statistics & numerical data
15.
Demography; 50(1): 237-60, 2013 Feb.
Article in English | MEDLINE | ID: mdl-23104205

ABSTRACT

In this article, we show how stochastic diffusion models can be used to forecast demographic cohort processes using the Hernes, Gompertz, and logistic models. Such models have been used deterministically in the past, but both behavioral theory and forecast utility are improved by introducing randomness and uncertainty into the standard differential equations governing population processes. Our approach is to add time-series stochasticity to linearized versions of each process. We derive both Monte Carlo and analytic methods for estimating forecast uncertainty. We apply our methods to several examples of marriage and fertility, extending them to simultaneous forecasting of multiple cohorts and to processes restricted by factors such as declining fecundity.
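A minimal sketch of the general recipe, not the article's Hernes/Gompertz estimates: treat the logit of the cumulative cohort proportion as a random walk with drift (a stochastic, linearized logistic model) and read forecast uncertainty off Monte Carlo sample paths. All parameter values below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(4)

def forecast_cohort(p0=0.05, drift=0.35, sigma=0.08, horizon=25, n_paths=5000):
    """Monte Carlo forecast of a cumulative cohort proportion (e.g. ever married)
    whose logit increases by a noisy, non-negative increment each year."""
    logit = np.full(n_paths, np.log(p0 / (1 - p0)))
    paths = np.empty((horizon, n_paths))
    for t in range(horizon):
        increments = np.maximum(0.0, drift + rng.normal(0.0, sigma, size=n_paths))
        logit = logit + increments            # keeps the cumulative proportion non-decreasing
        paths[t] = 1.0 / (1.0 + np.exp(-logit))
    return paths

paths = forecast_cohort()
for years_ahead in (5, 10, 20):
    lo, median, hi = np.percentile(paths[years_ahead - 1], [2.5, 50, 97.5])
    print(f"+{years_ahead} years: median {median:.2f}, 95% interval ({lo:.2f}, {hi:.2f})")
```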


Subjects
Birth Rate, Marital Status/statistics & numerical data, Probability Theory, Adult, Age Factors, Europe, Humans, Middle Aged, Monte Carlo Method, Stochastic Processes, Time Factors, Uncertainty
16.
Cogn Sci; 35(8): 1518-52, 2011.
Article in English | MEDLINE | ID: mdl-21951058

ABSTRACT

Order of information plays a crucial role in the process of updating beliefs across time. In fact, the presence of order effects makes a classical or Bayesian approach to inference difficult. As a result, existing models of inference, such as the belief-adjustment model, merely provide an ad hoc explanation for these effects. We postulate a quantum inference model for order effects based on the axiomatic principles of quantum probability theory. The quantum inference model explains order effects by transforming a state vector with different sequences of operators for different orderings of information. We demonstrate this process by fitting the quantum model to data collected in a medical diagnostic task and a jury decision-making task. To further test the quantum inference model, a new jury decision-making experiment is developed. Using the results of this experiment, we compare the quantum inference model with two versions of the belief-adjustment model, the adding model and the averaging model. We show that both the quantum model and the adding model provide good fits to the data. To distinguish the quantum model from the adding model, we develop a new experiment involving extreme evidence. The results from this new experiment suggest that the adding model faces limitations when accounting for tasks involving extreme evidence, whereas the quantum inference model does not. Ultimately, we argue that the quantum model provides a more coherent account of order effects than was previously possible.
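The mechanism behind quantum order effects can be seen in a two-dimensional toy calculation: projections onto non-commuting subspaces applied in different orders give different final probabilities, whereas classical conditioning is order-invariant. The projector angles below are arbitrary, and the example is not the paper's fitted model.

```python
import numpy as np

def projector(angle):
    """Rank-1 projector onto the direction `angle` in a 2-D real state space."""
    v = np.array([np.cos(angle), np.sin(angle)])
    return np.outer(v, v)

def sequential_probability(state, first, second):
    """Probability of accepting one piece of evidence and then the other,
    modelled as successive projective measurements of the belief state."""
    final = second @ (first @ state)
    return float(final @ final)              # squared norm of the post-measurement state

state = np.array([1.0, 0.0])                         # initial belief state (unit vector)
A, B = projector(np.pi / 6), projector(np.pi / 3)    # non-commuting evidence projectors

print("evidence A then B:", round(sequential_probability(state, A, B), 4))
print("evidence B then A:", round(sequential_probability(state, B, A), 4))
```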


Subjects
Bayes Theorem, Decision Theory, Logistic Models, Psychological Models, Probability Learning, Probability Theory, Cognitive Science, Decision Making, Humans, Logic, Mental Processes, Problem Solving
17.
Opt Express; 19(13): 12261-74, 2011 Jun 20.
Article in English | MEDLINE | ID: mdl-21716463

ABSTRACT

We investigate Hotelling observer performance (i.e., signal detectability) of a phased array system for tasks of detecting small inhomogeneities and distinguishing adjacent abnormalities in uniform diffusive media. Unlike conventional phased array systems, where a single detector is located on the interface between two sources, we consider a detector array, such as a CCD, on the phantom exit surface for calculating the Hotelling observer detectability. The signal detectability for adjacent small abnormalities (2 mm displacement) for the CCD-based phased array is related to the resolution of reconstructed images. Simulations show that acquiring high-dimensional data from a detector array in a phased array system dramatically improves the detectability for both tasks when compared to conventional single-detector measurements, especially at low modulation frequencies. It is also observed in all studied cases that there exists a modulation frequency that optimizes CCD-based phased array systems, at which detectability for both tasks is consistently high. These results imply that the CCD-based phased array has the potential to achieve high resolution and signal detectability in tomographic diffusive imaging while operating at a very low modulation frequency. The effect of other configuration parameters, such as the detector pixel size, on observer performance is also discussed.


Subjects
Computer-Assisted Image Interpretation/methods, Nephelometry and Turbidimetry/methods, Probability Theory, Optical Tomography/methods, Computer Simulation, Humans, Computer-Assisted Image Interpretation/instrumentation, Monte Carlo Method, Neoplasms/diagnosis, Nephelometry and Turbidimetry/instrumentation, Imaging Phantoms, Near-Infrared Spectroscopy/methods, Stochastic Processes, Optical Tomography/instrumentation
18.
Med Teach; 33(3): 224-33, 2011.
Article in English | MEDLINE | ID: mdl-21345062

ABSTRACT

Medical education research in general is a young scientific discipline that is still finding its own position in the scientific landscape. It is rooted in both the biomedical sciences and the social sciences, each with their own scientific language. A more unique feature of medical education (and assessment) research is that it has to be both locally and internationally relevant. This is not always easy and sometimes leads to purely idiographic descriptions of an assessment procedure with insufficient general lessons or generalised scientific knowledge being generated, or vice versa. For medical educational research, a plethora of methodologies is available to cater to many different research questions. This article contains consensus positions and suggestions on various elements of medical education (assessment) research. Overarching is the position that without a good theoretical underpinning and good knowledge of the existing literature, good research and sound conclusions are impossible to produce, and that there is no inherently superior methodology: the best methodology is the one best suited to answering the research question unambiguously. Although the positions should not be perceived as dogmas, they should be taken as very serious recommendations. Topics covered are: types of research, theoretical frameworks, designs and methodologies, instrument properties or psychometrics, costs/acceptability, ethics, infrastructure and support.


Subjects
Medical Education, Educational Measurement/methods, Research Design, Research/organization & administration, Consensus Development Conferences as Topic, Research Ethics, Humans, Probability Theory, Reproducibility of Results
19.
J Magn Reson; 207(2): 234-41, 2010 Dec.
Article in English | MEDLINE | ID: mdl-20937564

ABSTRACT

The ³He lung morphometry technique (Yablonskiy et al., JAP, 2009), based on MRI measurements of hyperpolarized gas diffusion in lung airspaces, provides unique information on the lung microstructure at the alveolar level. 3D tomographic images of standard morphological parameters (mean airspace chord length, lung parenchyma surface-to-volume ratio, and the number of alveoli per unit lung volume) can be created from a rather short (several seconds) MRI scan. These parameters are most commonly used to characterize lung morphometry but were not previously available from in vivo studies. The ³He lung morphometry technique is based on a previously proposed model of lung acinar airways, treated as cylindrical passages of external radius R covered by alveolar sleeves of depth h, and on a theory of gas diffusion in these airways. The initial works approximated the acinar airways as very long cylinders, all with the same R and h. The present work aims at analyzing effects of realistic acinar airway structures, incorporating airway branching, physiological airway lengths, a physiological ratio of airway ducts and sacs, and distributions of R and h. By means of Monte Carlo computer simulations, we demonstrate that our technique allows rather accurate measurement of the geometrical and morphological parameters of acinar airways. In particular, the error in determining one of the most important physiological parameters of lung parenchyma, the surface-to-volume ratio, does not exceed several percent. Second, we analyze the effect of the susceptibility-induced inhomogeneous magnetic field on the parameter estimates and demonstrate that this effect is rather negligible at B0 ≤ 3 T and becomes substantial only at higher B0. Third, we theoretically derive an optimal choice of MR pulse sequence parameters, which should be used to acquire a series of diffusion-attenuated MR signals, allowing a substantial decrease in acquisition time and improvement in the accuracy of the results. It is demonstrated that the optimal choice consists of three non-equidistant b values: b1 = 0, b2 ≈ 2 s/cm², b3 ≈ 8 s/cm².


Subjects
Diffusion Magnetic Resonance Imaging/methods, Helium, Lung/anatomy & histology, Algorithms, Bayes Theorem, Diffusion Magnetic Resonance Imaging/statistics & numerical data, Electromagnetic Fields, Humans, Statistical Models, Monte Carlo Method, Probability Theory, Pulmonary Alveoli/anatomy & histology, Pulmonary Emphysema, Reproducibility of Results