Results 1 - 20 of 36
1.
Opt Express ; 31(3): 4899-4919, 2023 Jan 30.
Article in English | MEDLINE | ID: mdl-36785446

ABSTRACT

Photon echoes in rare-earth-doped crystals are studied to understand the challenges of making broadband quantum memories using the atomic frequency comb (AFC) protocol in systems with hyperfine structure. The hyperfine structure of Pr3+ poses an obstacle to this goal because frequencies associated with the hyperfine transitions change the simple picture of modulation at an externally imposed frequency. The current work focuses on the intermediate case where the hyperfine spacing is comparable to the comb spacing, a challenging regime that has recently been considered. Operating in this regime may facilitate storing quantum information over a larger spectral range in such systems. In this work, we prepare broadband AFCs using optical combs with tooth spacings ranging from 1 MHz to 16 MHz in fine steps, and measure transmission spectra and photon echoes for each. We predict the spectra and echoes theoretically using the optical combs as input to either a rate equation code or a density matrix code, which calculates the redistribution of populations. We then use the redistributed populations as input to a semiclassical theory using the frequency-dependent dielectric function. The two sets of predictions each give a good, but different account of the photon echoes.
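
As a rough illustration of the AFC storage principle referenced above (not the paper's rate-equation or density-matrix calculation), the sketch below builds an idealized periodic absorption comb and shows that a transmitted pulse is re-emitted as an echo at the inverse of the tooth spacing; all parameter values are illustrative.

```python
import numpy as np

# Minimal AFC sketch: an idealized periodic absorption comb with tooth
# spacing delta (Hz) re-emits an input pulse as an echo near t = 1/delta.
delta = 2e6                      # comb tooth spacing, 2 MHz (illustrative)
bandwidth = 100e6                # spectral grid span, Hz
n = 2**14
freq = np.fft.fftfreq(n, d=1.0 / bandwidth)        # Hz grid

# Gaussian comb teeth with peak optical depth d0 and tooth width delta/8
d0, tooth_width = 3.0, delta / 8.0
detuning = (freq + delta / 2) % delta - delta / 2  # distance to nearest tooth
alpha_L = d0 * np.exp(-0.5 * (detuning / tooth_width) ** 2)

# Weak input pulse (flat spectrum over the comb) transmitted through the comb
transfer = np.exp(-alpha_L / 2.0)                  # field transmission
field_t = np.fft.ifft(transfer)                    # impulse response
t = np.arange(n) / bandwidth
intensity = np.abs(field_t) ** 2

# The echo shows up near t = 1/delta = 0.5 us
echo_window = (t > 0.4e-6) & (t < 0.6e-6)
print("echo arrival (us):", t[echo_window][np.argmax(intensity[echo_window])] * 1e6)
```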

2.
Opt Express ; 30(13): 23238-23259, 2022 Jun 20.
Article in English | MEDLINE | ID: mdl-36225009

ABSTRACT

X-ray tomography is capable of imaging the interior of objects in three dimensions non-invasively, with applications in biomedical imaging, materials science, electronic inspection, and other fields. The reconstruction process can be an ill-conditioned inverse problem, requiring regularization to obtain satisfactory results. Recently, deep learning has been adopted for tomographic reconstruction. Unlike iterative algorithms, which require a prior distribution specified in advance, deep reconstruction networks can learn a prior distribution by sampling the training data. In this work, we develop a Physics-assisted Generative Adversarial Network (PGAN), a two-step algorithm for tomographic reconstruction. In contrast to previous efforts, our PGAN utilizes maximum-likelihood estimates derived from the measurements to regularize the reconstruction with both the known physics and the learned prior. Compared with methods that incorporate less physics during training, PGAN can reduce the number of photons required at limited projection angles to achieve a given error rate. The advantages of using a physics-assisted learned prior in X-ray tomography may further enable low-photon nanoscale imaging.
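
A minimal sketch of the physics-assisted idea only, not the paper's PGAN: a Poisson maximum-likelihood data term derived from a Beer-Lambert measurement model is combined with a prior term, here a simple smoothness penalty standing in for the learned generative prior. The projection operator, sizes, and step size are all made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n_pix, n_meas = 64, 32
A = rng.random((n_meas, n_pix)) * 0.05            # stand-in projection operator
x_true = rng.random(n_pix)
photons = 1000.0
y = rng.poisson(photons * np.exp(-A @ x_true))    # Beer-Lambert + Poisson noise

def neg_log_likelihood_grad(x):
    lam = photons * np.exp(-A @ x)                # expected counts
    return A.T @ (y - lam)                        # gradient of the Poisson NLL

def prior_grad(x, strength=0.1):
    # placeholder for the learned prior: quadratic roughness penalty
    d = np.diff(x, prepend=x[:1], append=x[-1:])
    return strength * (d[:-1] - d[1:])

x = np.zeros(n_pix)
for _ in range(500):                              # plain gradient descent
    x -= 1e-4 * (neg_log_likelihood_grad(x) + prior_grad(x))

print("rms error:", np.sqrt(np.mean((x - x_true) ** 2)))
```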

3.
Opt Express ; 29(2): 1788-1804, 2021 Jan 18.
Article in English | MEDLINE | ID: mdl-33726385

ABSTRACT

A reconstruction algorithm for partially coherent x-ray computed tomography (XCT) including Fresnel diffraction is developed and applied to an optical fiber. The algorithm is applicable to a high-resolution, tube-based, laboratory-scale x-ray tomography instrument. The computing time is only a few times longer than that of the projective counterpart. The algorithm is used to reconstruct, including both projection and diffraction, a micrometer-scale tilt series of a graded-index optical fiber, using maximum likelihood and a Bayesian method based on the work of Bouman and Sauer. The inclusion of Fresnel diffraction removes some reconstruction artifacts, and the use of a Bayesian prior probability distribution removes others, resulting in a substantially more accurate reconstruction.
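
A minimal sketch of the Fresnel-diffraction forward model that such a reconstruction must invert (not the reconstruction code itself): a projected fiber-like object is propagated to the detector with the standard FFT-based Fresnel transfer function. The geometry and refractive-index values are illustrative.

```python
import numpy as np

wavelength = 7.1e-11          # ~17.5 keV, m (illustrative)
z = 5e-3                      # propagation distance, m
dx = 1e-6                     # detector pixel, m
n = 1024
x = (np.arange(n) - n / 2) * dx

# Toy object: a cylinder (fiber-like) of radius 60 um, projected thickness
radius = 60e-6
thickness = 2.0 * np.sqrt(np.clip(radius**2 - x**2, 0.0, None))
delta, beta = 1e-6, 1e-9      # refractive index decrement and absorption index
k = 2 * np.pi / wavelength
psi = np.exp(-k * beta * thickness) * np.exp(-1j * k * delta * thickness)

# Fresnel propagation via the transfer function H(f) = exp(-i*pi*lambda*z*f^2)
f = np.fft.fftfreq(n, d=dx)
H = np.exp(-1j * np.pi * wavelength * z * f**2)
intensity = np.abs(np.fft.ifft(np.fft.fft(psi) * H)) ** 2

print("edge-fringe contrast:", intensity.max() - intensity.min())
```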

4.
Article in English | MEDLINE | ID: mdl-35529769

ABSTRACT

Feature sizes in integrated circuits have decreased substantially over time, and it has become increasingly difficult to three-dimensionally image these complex circuits after fabrication. This can be important for process development, defect analysis, and detection of unexpected structures in externally sourced chips, among other applications. Here, we report on a non-destructive, tabletop approach that addresses this imaging problem through x-ray tomography, which we uniquely realize with an instrument that combines a scanning electron microscope (SEM) with a transition-edge sensor (TES) x-ray spectrometer. Our approach uses the highly focused SEM electron beam to produce a small x-ray generation region in a carefully designed target layer that is placed over the sample being tested. With the high collection efficiency and resolving power of a TES spectrometer, we can isolate x-rays generated in the target from background and trace their paths through regions of interest in the sample layers, providing information about the various materials along the x-ray paths through their attenuation functions. We have recently demonstrated our approach using a 240-sensor Mo/Cu bilayer TES prototype instrument on a simplified test sample containing features with sizes of ∼1 µm. Currently, we are designing and building a 3000-sensor Mo/Au bilayer TES spectrometer upgrade, which is expected to improve the imaging speed by a factor of up to 60 through a combination of increased detector number and detector speed.
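
A minimal sketch of the measurement principle (Beer-Lambert attenuation of the target-generated x-rays along a path through the sample layers); the attenuation coefficients, layer thicknesses, and layer stack below are toy values, not instrument data.

```python
import numpy as np

# Toy attenuation coefficients (1/m) at one target line, and a toy layer stack
mu = {"Cu": 4.0e4, "SiO2": 7.0e3, "W": 3.0e5}
layers = [("SiO2", 2e-6), ("Cu", 0.5e-6), ("SiO2", 2e-6), ("W", 0.2e-6)]

def transmission(layer_stack):
    """Fraction of target-line x-rays surviving the stack (Beer-Lambert)."""
    optical_depth = sum(mu[name] * t for name, t in layer_stack)
    return np.exp(-optical_depth)

print("transmission through stack:", transmission(layers))
# Removing one layer changes the count rate, which is what the tomographic
# reconstruction ultimately inverts voxel by voxel.
print("without the W layer:      ", transmission(layers[:-1]))
```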

5.
Microsc Microanal ; 25(1): 70-76, 2019 Feb.
Article in English | MEDLINE | ID: mdl-30869576

ABSTRACT

Using a commercial X-ray tomography instrument, we have obtained reconstructions of a graded-index optical fiber with voxels of edge length 1.05 µm at 12 tube voltages. The fiber manufacturer created a graded index in the central region by varying the germanium concentration from a peak value in the center of the core to a very small value at the core-cladding boundary. Using all 12 tube voltages, we show by a singular value decomposition that there are only two singular vectors with significant weight. Physically, this means that scans beyond two tube voltages contain largely redundant information. We concentrate on an analysis of the images associated with these two singular vectors. The first singular vector is dominant, and images of its coefficients at each voxel are similar to any of the single-energy reconstructions. Images of the coefficients of the second singular vector by themselves appear to be noise. However, by averaging the reconstructed voxels in each of several narrow bands of radii, we can obtain values of the second singular vector at each radius. In the core region, where we expect the germanium doping to go from a peak value at the fiber center to zero at the core-cladding boundary, we find that a plot of the two coefficients of the singular vectors forms a line in the two-dimensional space, consistent with the dopant decreasing linearly with radial distance from the core center. The coating, made of a polymer rather than silica, is not on this line, indicating that the two-dimensional results are sensitive not only to the density but also to the elemental composition.
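
A minimal sketch of the dimensionality test on synthetic data (not the measured reconstructions): per-voxel values from scans at several tube voltages are stacked into a matrix and the singular values inspected; a two-component object yields only two significant values.

```python
import numpy as np

rng = np.random.default_rng(1)
n_voltages, n_voxels = 12, 5000

# Synthetic two-component object: each voxel mixes a "silica-like" and a
# "germanium-like" material, each with its own made-up energy response.
response = rng.random((n_voltages, 2))            # response of each material vs voltage
composition = rng.random((2, n_voxels))           # per-voxel material amounts
data = response @ composition + 0.01 * rng.standard_normal((n_voltages, n_voxels))

s = np.linalg.svd(data, compute_uv=False)
print("singular values:", np.round(s / s[0], 4))  # only two are significant
```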

6.
Article in English | MEDLINE | ID: mdl-32856003

ABSTRACT

Patient-specific computational modeling is increasingly used to assist with visualization, planning, and execution of medical treatments. This trend is placing more reliance on medical imaging to provide accurate representations of anatomical structures. Digital image analysis is used to extract anatomical data for use in clinical assessment/planning. However, the presence of image artifacts, whether due to interactions between the physical object and the scanning modality or the scanning process, can degrade image accuracy. The process of extracting anatomical structures from the medical images introduces additional sources of variability, e.g., when thresholding or when eroding along apparent edges of biological structures. An estimate of the uncertainty associated with extracting anatomical data from medical images would therefore assist with assessing the reliability of patient-specific treatment plans. To this end, two image datasets were developed and analyzed using standard image analysis procedures. The first dataset was developed by performing a "virtual voxelization" of a CAD model of a sphere, representing the idealized scenario of no error in the image acquisition and reconstruction algorithms (i.e., a perfect scan). The second dataset was acquired by scanning three spherical balls using a laboratory-grade CT scanner. For the idealized sphere, the error in sphere diameter was less than or equal to 2% if 5 or more voxels were present across the diameter. The measurement error degraded to approximately 4% for a similar degree of voxelization of the physical phantom. The adaptation of established thresholding procedures to improve segmentation accuracy was also investigated.
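
A minimal sketch of the "virtual voxelization" idea under simple assumptions (voxel centers classified as inside or outside, roughly a 50 % threshold; equivalent-sphere diameter from the segmented volume); it is illustrative only, not the study's pipeline.

```python
import numpy as np

def voxelized_diameter_error(true_diameter, voxel_size, n=96):
    """Voxelize an ideal sphere and compare the recovered diameter to truth."""
    grid = (np.arange(n) - n / 2 + 0.5) * voxel_size
    X, Y, Z = np.meshgrid(grid, grid, grid, indexing="ij")
    inside = (X**2 + Y**2 + Z**2) <= (true_diameter / 2) ** 2
    # equivalent-sphere diameter from the segmented voxel volume
    volume = inside.sum() * voxel_size**3
    measured = (6 * volume / np.pi) ** (1 / 3)
    return (measured - true_diameter) / true_diameter

for voxels_across in (3, 5, 10, 20):
    err = voxelized_diameter_error(1.0, 1.0 / voxels_across)
    print(f"{voxels_across:2d} voxels across diameter: error = {100 * err:5.2f} %")
```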

7.
Article in English | MEDLINE | ID: mdl-34877164

ABSTRACT

Fundamental limits for the calculation of scattering corrections within X-ray computed tomography (CT) are found within the independent atom approximation from an analysis of the cross sections, CT geometry, and the Nyquist sampling theorem, suggesting large reductions in computational time compared to existing methods. By modifying the scatter by less than 1 %, it is possible to treat some of the elastic scattering in the forward direction as inelastic to achieve a smoother elastic scattering distribution. We present an analysis showing that the number of samples required for the smoother distribution can be greatly reduced. We show that fixed forced detection can be used with many fewer points for inelastic scattering, but that for pure elastic scattering, a standard Monte Carlo calculation is preferred. We use smoothing for both elastic and inelastic scattering because the intrinsic angular resolution is much poorer than can be achieved for projective tomography. Representative numerical examples are given.
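
A minimal sketch of the sampling argument only (not the paper's scatter code): because a smoothed scatter distribution on the detector is band-limited, the Nyquist criterion allows it to be evaluated at a few positions and interpolated rather than at every pixel. The scatter profile below is a made-up smooth function.

```python
import numpy as np

n_pixels = 2048
u = np.linspace(-1.0, 1.0, n_pixels)
scatter = 0.05 * (1 + 0.5 * np.cos(2 * np.pi * 1.5 * u)) * np.exp(-u**2)

# Evaluate on a coarse grid (as a Monte Carlo code would, at high cost per
# point) and reconstruct on the full grid by interpolation.
n_coarse = 33
u_coarse = np.linspace(-1.0, 1.0, n_coarse)
scatter_coarse = np.interp(u_coarse, u, scatter)       # stands in for MC estimates
scatter_rebuilt = np.interp(u, u_coarse, scatter_coarse)

rel_err = np.max(np.abs(scatter_rebuilt - scatter)) / scatter.max()
print(f"{n_coarse} sample points reproduce the profile to {100 * rel_err:.2f} %")
```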

8.
PLoS One ; 13(12): e0208820, 2018.
Article in English | MEDLINE | ID: mdl-30571779

ABSTRACT

PURPOSE: This paper lays the groundwork for linking Hounsfield unit measurements to the International System of Units (SI), ultimately enabling traceable measurements across X-ray CT (XCT) machines. We do this by characterizing a material basis that may be used in XCT reconstruction, giving the linear combinations of concentrations of chemical elements (in SI units of mol/m3) that may be observed at each voxel. By implication, linear combinations not in the set are not observable. METHODS AND MATERIALS: We formulated a model for our material basis with a set of measurements of elemental powders at four tube voltages, 80 kV, 100 kV, 120 kV, and 140 kV, on a medical XCT. The samples included 30 small plastic bottles of powders containing various compounds spanning the atomic numbers up to 20, and a bottle of water and one of air. Using the chemical formulas and measured masses, we formed a matrix giving the number of Hounsfield units per (mole per cubic meter) at each tube voltage for each of 13 chemical elements. We defined a corresponding matrix in units we call molar Hounsfield unit (HU) potency, the change in HU value that an added mole per cubic meter in a given voxel would produce in the measured HU value. We built a matrix of molar potencies for each chemical element and tube voltage and performed a singular value decomposition (SVD) on these to formulate our material basis. We determined that the dimension of this basis is two. We then compared measurements in this material space with theoretical measurements, combining XCOM cross section data with the tungsten anode spectral model using interpolating cubic splines (TASMICS), a one-parameter filter, and a simple detector model, creating a matrix similar to our experimental matrix for the first 20 chemical elements. Finally, we compared the model predictions to Hounsfield unit measurements on three XCT calibration phantoms taken from the literature. RESULTS: We predict the experimental HU potency values derived from our scans of chemical elements with our theoretical model built from XCOM data. The singular values and singular vectors of the model and powder measurements are in substantial agreement. Application of the Bayesian Information Criterion (BIC) shows that exactly two singular values and singular vectors describe the results over the four tube voltages. Our theoretical model gives a good account of the HU values from the literature, measured for the calibration phantoms at several tube voltages on several commercial instruments, without introducing additional parameters. CONCLUSIONS: We have developed a two-dimensional material basis that specifies the degree to which individual elements in compounds affect the HU values in XCT images of samples with elements up to atomic number Z = 20. We show that two dimensions are sufficient given the contrast and noise in our experiment. The linear combinations of concentrations of elements that can be observed using a medical XCT have been characterized, providing a material basis for use in dual-energy reconstruction. This approach provides groundwork for improved reconstruction and for the link of Hounsfield units to the SI.
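
A toy calculation of why a two-dimensional basis emerges, using a simplified attenuation model (a photoelectric term plus a Compton term) rather than XCOM/TASMICS data: because every element's energy dependence is a mix of the same two functions, the voltage-by-element potency matrix has rank two for any choice of spectra.

```python
import numpy as np

Z = np.arange(1, 21)                          # elements up to calcium
E = np.linspace(30.0, 140.0, 200)             # keV grid

# Toy physics: photoelectric ~ Z^4.6 / E^3 plus a Compton term ~ Z
# (Klein-Nishina energy dependence ignored); scales are arbitrary.
photoelectric = np.outer(1.0 / E**3, Z**4.6)  # (energy, element)
compton = np.outer(np.ones_like(E), Z.astype(float))

# Four made-up tube spectra (they only need to be different weightings)
spectra = np.stack([np.exp(-0.5 * ((E - mu) / 25.0) ** 2) for mu in (45, 55, 65, 75)])
spectra /= spectra.sum(axis=1, keepdims=True)

potency = spectra @ (photoelectric + 0.02 * compton)   # (voltage, element)
s = np.linalg.svd(potency, compute_uv=False)
print("singular values:", np.round(s / s[0], 6))        # only two significant
```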


Subjects
Theoretical Models , Imaging Phantoms , X-Ray Computed Tomography/methods , X-Ray Computed Tomography/standards , Calibration , Humans , X-Ray Computed Tomography/instrumentation
9.
Phys Teach ; 56(2), 2018 Feb.
Article in English | MEDLINE | ID: mdl-29542737

ABSTRACT

A measurement of a thermophysical property of water is made using items found in the author's home. Specifically, the ratio of the energy required to completely boil away the water to the energy required to heat it from the melting point to the boiling point is found to be 5.7. This may be compared to the standard value of 5.5. The close agreement is not representative of the actual uncertainties in this simple experiment. Heating water in a microwave oven can let a student apply the techniques of quantitative science based on questions generated by his or her scientific curiosity.
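
A back-of-the-envelope check of the quoted standard value using handbook constants (not the article's measurement); small differences in the constants used explain values between roughly 5.4 and 5.5.

```python
# Energy to boil the water away (latent heat of vaporization) divided by the
# energy to heat it from 0 C to 100 C (specific heat times temperature rise).
latent_heat = 2256.0        # kJ/kg, heat of vaporization at 100 C
specific_heat = 4.19        # kJ/(kg K)
delta_T = 100.0             # K, melting point to boiling point

ratio = latent_heat / (specific_heat * delta_T)
print(f"boil-away energy / heat-up energy = {ratio:.2f}")   # about 5.4
```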

10.
Appl Opt ; 57(4): 788-793, 2018 Feb 01.
Article in English | MEDLINE | ID: mdl-29400755

ABSTRACT

We have significantly accelerated diffraction calculations using three independent acceleration techniques. These innovations are restricted to cylindrically symmetric systems. In the first case, we consider Wolf's formula for the integrated flux in a circular region following diffraction of light from a point source by a circular aperture or a circular lens. Although the formula involves a double sum, we evaluate it with the effort of a single sum by using fast Fourier transforms (FFTs) to perform the convolutions. In the second case, we exploit properties of the Fresnel-Kirchhoff propagator in the Gaussian, paraxial optics approximation to propagate a partial wave from one optical element to the next. Ordinarily, this would involve a double loop over the radial variables on each element, but we have reduced the computational cost by a factor approximately equal to the smaller number of radius values. In the third case, we reduce the number of partial waves for which the propagation needs to be calculated in order to determine the throughput of an optical system of interest in radiometry when at least one element is very small, such as a pinhole aperture. As a demonstration of the benefits of the second case, we analyze intricate diffraction effects that occur in a satellite-based solar radiometry instrument.
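
A generic illustration of the first speed-up (not Wolf's actual series): when a double sum has convolution structure, all of its values can be formed at once with FFTs in O(N log N) instead of an O(N^2) double loop. The coefficient arrays here are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(2)
N = 4096
a, b = rng.standard_normal(N), rng.standard_normal(N)

# Direct double-loop evaluation of S_k = sum_m a_m * b_(k-m) (reference,
# truncated to the first 256 terms for speed)
direct = np.array([np.dot(a[: k + 1], b[k::-1]) for k in range(256)])

# FFT-based evaluation of the full (linear) convolution in one shot
n_fft = 2 * N
fast = np.fft.irfft(np.fft.rfft(a, n_fft) * np.fft.rfft(b, n_fft), n_fft)[:N]

print("max difference on checked terms:", np.max(np.abs(fast[:256] - direct)))
```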

11.
Opt Express ; 26(25): 32788-32801, 2018 Dec 10.
Article in English | MEDLINE | ID: mdl-30645441

ABSTRACT

The low-latency requirements of a practical loophole-free Bell test preclude time-consuming post-processing steps that are often used to improve the statistical quality of a physical random number generator (RNG). Here we demonstrate a post-processing-free RNG that produces a random bit within 2.4(2) ns of an input trigger. We use weak feedback to eliminate long-term drift, resulting in 24-hour operation with output that is statistically indistinguishable from a Bernoulli process. We quantify the impact of the feedback on the predictability of the output as less than 6.4×10⁻⁷ and demonstrate the utility of the Allan variance as a tool for characterizing non-idealities in RNGs.
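
A minimal sketch of the Allan-variance diagnostic on simulated bit streams (illustrative, not the paper's analysis): for an ideal Bernoulli(0.5) source the Allan variance of block averages scales as 1/(4τ), while slow drift in the bias produces an upturn at large averaging times.

```python
import numpy as np

def allan_variance(bits, taus):
    """Allan variance of block averages for each averaging length tau."""
    out = []
    for tau in taus:
        n_blocks = len(bits) // tau
        means = bits[: n_blocks * tau].reshape(n_blocks, tau).mean(axis=1)
        out.append(0.5 * np.mean(np.diff(means) ** 2))
    return np.array(out)

rng = np.random.default_rng(3)
n = 2**20
ideal = rng.integers(0, 2, n).astype(float)
drift = (rng.random(n) < (0.5 + 0.05 * np.sin(np.arange(n) / 2e5))).astype(float)

taus = [2**k for k in range(1, 15)]
# tau-scaled values: flat near 0.25 for the ideal source, rising with drift
print("ideal :", np.round(allan_variance(ideal, taus) * np.array(taus), 3))
print("drift :", np.round(allan_variance(drift, taus) * np.array(taus), 3))
```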

12.
Opt Express ; 25(22): 26728-26746, 2017 Oct 30.
Article in English | MEDLINE | ID: mdl-29092156

ABSTRACT

Preliminary experiments at the NIST Spectral Tri-function Automated Reference Reflectometer (STARR) facility have been conducted with the goal of providing the diffuse optical properties of a solid reference standard with optical properties similar to human skin. Here, we describe an algorithm for determining the best-fit parameters and the statistical uncertainty associated with the measurement. The objective function is determined from the profile log likelihood, including both experimental and Monte Carlo uncertainties. Initially, the log likelihood is determined over a large parameter search box using a relatively small number of Monte Carlo samples, such as 2·10⁴. The search area is iteratively reduced to include the 99.9999% confidence region, while doubling the number of samples at each iteration until the experimental uncertainty dominates over the Monte Carlo uncertainty. Typically this occurs by 1.28·10⁶ samples. The log likelihood is then fit to determine a 95% confidence ellipse. The inverse problem requires the values of the log likelihood at many points. Our implementation uses importance sampling to calculate these points on a grid in an efficient manner. Ultimately, the time-to-solution is approximately six times the cost of a Monte Carlo simulation of the radiation transport problem for a single set of parameters with the largest number of photons required. The approach is found to be 64 times faster than our implementation of Particle Swarm Optimization.
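
A minimal sketch of the search strategy only, with a toy objective in place of the radiation-transport Monte Carlo: the approximate log likelihood is evaluated on a grid over a parameter box, the box is shrunk around the 99.9999 % confidence region, and the sample count is doubled each iteration. All parameter values are illustrative.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
true_params = np.array([0.7, 1.8])              # e.g. two optical parameters, made up

def approx_log_likelihood(p, n_samples):
    # toy stand-in for "forward Monte Carlo + comparison with measurement":
    # a smooth bowl plus MC noise that decreases as 1/sqrt(n_samples)
    exact = -0.5 * np.sum(((p - true_params) / 0.05) ** 2)
    return exact + rng.standard_normal() / np.sqrt(n_samples)

box = np.array([[0.0, 2.0], [0.0, 4.0]])        # initial search box
n_samples = 2e4
cutoff = stats.chi2.ppf(0.999999, df=2) / 2.0   # allowed drop in log likelihood

for iteration in range(5):
    g0 = np.linspace(box[0, 0], box[0, 1], 21)
    g1 = np.linspace(box[1, 0], box[1, 1], 21)
    ll = np.array([[approx_log_likelihood(np.array([a, b]), n_samples)
                    for b in g1] for a in g0])
    keep = ll >= ll.max() - cutoff              # grid points inside the region
    i, j = np.where(keep)
    pad0, pad1 = g0[1] - g0[0], g1[1] - g1[0]   # keep a one-cell margin
    box = np.array([[g0[i].min() - pad0, g0[i].max() + pad0],
                    [g1[j].min() - pad1, g1[j].max() + pad1]])
    n_samples *= 2
    print(f"iter {iteration}: box = {np.round(box, 3).tolist()}, samples = {n_samples:.0f}")
```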

13.
Med Phys ; 44(9): e202-e206, 2017 Sep.
Article in English | MEDLINE | ID: mdl-28901619

ABSTRACT

PURPOSE: The goal is to determine whether dual-energy computed tomography (CT) leads to a unique reconstruction using two basis materials. METHODS: The beam-hardening equation is simplified to the single-voxel case. The simplified equation is rewritten to show that the solution can be considered to be linear operations in a vector space followed by a measurement model that is a sum of exponentials of the coordinates. The case of finding the concentrations of two materials from measurements of two spectra with three photon energies is the simplest non-trivial case and is considered in detail. RESULTS: Using a material basis of water and bone, with photon energies of 30 keV, 60 keV, and 100 keV, a case with two solutions is demonstrated. CONCLUSIONS: Dual-energy reconstruction using two materials is not unique, as shown by an example. Algorithms for dual-energy, dual-material reconstructions need to be aware of this potential ambiguity in the solution.
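
A concrete numerical illustration of the ambiguity with made-up attenuation values chosen only so the two solutions are easy to verify (the paper's example uses realistic water and bone cross sections at 30 keV, 60 keV, and 100 keV): two distinct concentration pairs give identical measurements under both spectra.

```python
import numpy as np

mu = np.array([[1.0, 2.0],     # attenuation of (material 1, material 2)
               [2.0, 1.0],     # at three photon energies (arbitrary units)
               [3.0, 3.0]])
spectra = np.array([[1.0, 1.0, 0.0],    # spectrum 1 weights over the energies
                    [0.0, 0.0, 1.0]])   # spectrum 2 (monochromatic)

def measure(c):
    """Single-voxel measurement model: sum of exponentials per spectrum."""
    return spectra @ np.exp(-mu @ c)

c_a = np.array([0.2, 0.8])
c_b = np.array([0.8, 0.2])              # a distinct concentration pair
print("measurements A:", measure(c_a))
print("measurements B:", measure(c_b))  # identical, so the reconstruction is ambiguous
```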


Subjects
Algorithms , X-Ray Computed Tomography , Humans , Photons
14.
Math J ; 19, 2017.
Article in English | MEDLINE | ID: mdl-29706749

ABSTRACT

The derivation of the scattering force and the gradient force on a spherical particle due to an electromagnetic wave often invokes the Clausius-Mossotti factor, based on an ad hoc physical model. In this article, we derive the expressions including the Clausius-Mossotti factor directly from the fundamental equations of classical electromagnetism. Starting from an analytic expression for the force on a spherical particle in a vacuum using the Maxwell stress tensor, as well as the Mie solution for the response of dielectric particles to an electromagnetic plane wave, we derive the scattering and gradient forces. In both cases, the Clausius-Mossotti factor arises rigorously from the derivation without any physical argumentation. The limits agree with expressions in the literature.
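
A quick numerical evaluation of the Rayleigh-limit expressions that contain the Clausius-Mossotti factor (the standard small-particle forms, with illustrative beam and particle values, not results from the article's derivation).

```python
import numpy as np

c0 = 2.998e8                      # speed of light, m/s
wavelength = 1064e-9              # vacuum wavelength, m
n_medium, n_particle = 1.00, 1.57 # air and polystyrene (approximate)
radius = 100e-9                   # particle radius, m
m = n_particle / n_medium
cm = (m**2 - 1.0) / (m**2 + 2.0)  # Clausius-Mossotti factor

k = 2 * np.pi * n_medium / wavelength
intensity = 1e9                   # W/m^2 at the point of interest (assumed)
grad_intensity = 1e15             # W/m^3, an assumed local intensity gradient

# Rayleigh-limit scattering and gradient forces
C_scat = (8.0 / 3.0) * np.pi * (k * radius) ** 4 * radius**2 * cm**2
F_scat = n_medium / c0 * C_scat * intensity
F_grad = 2 * np.pi * n_medium * radius**3 / c0 * cm * grad_intensity

print(f"Clausius-Mossotti factor: {cm:.3f}")
print(f"scattering force: {F_scat:.2e} N, gradient force: {F_grad:.2e} N")
```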

15.
Article in English | MEDLINE | ID: mdl-34877089

ABSTRACT

The goal of this study was to compare volumetric analysis in computed tomography (CT) with the length measurement prescribed by the Response Evaluation Criteria in Solid Tumors (RECIST) for a system with known mass and unknown shape. We injected 2 mL to 4 mL of water into vials of sodium polyacrylate and into disposable diapers. Volume measurements of the sodium polyacrylate powder were able to predict both mass and proportional changes in mass within a 95 % prediction interval of width 12 % and 16 %, respectively. The corresponding figures for RECIST were 102 % and 82 %.

17.
Opt Express ; 24(13): 14100-23, 2016 Jun 27.
Article in English | MEDLINE | ID: mdl-27410570

ABSTRACT

We consider the problem of sorting, by size, spherical particles of order 100 nm radius. The scheme we analyze consists of a heterogeneous stream of spherical particles flowing at an oblique angle across an optical Gaussian mode standing wave. Sorting is achieved by the combined spatial and size dependencies of the optical force. Particles of all sizes enter the flow at a point, but exit at different locations depending on size. Exiting particles may be detected optically or separated for further processing. The scheme has the advantages of accommodating a high throughput, producing a continuous stream of continuously dispersed particles, and exhibiting excellent size resolution. We performed detailed Monte Carlo simulations of particle trajectories through the optical field under the influence of convective air flow. We also developed a method for deriving effective velocities and diffusion constants from the Fokker-Planck equation that can generate equivalent results much more quickly. With an optical wavelength of 1064 nm, polystyrene particles with radii in the neighborhood of 275 nm, for which the optical force vanishes, may be sorted with a resolution below 1 nm.
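
A qualitative Langevin sketch of the sorting mechanism with illustrative parameters (not the paper's Monte Carlo model): a flow carries particles across a Gaussian beam while a size-dependent transverse gradient force deflects them toward or away from the axis, so sizes separate at the exit. The force amplitudes and their sign change near 275 nm are simply asserted here, not computed from Mie theory.

```python
import numpy as np

rng = np.random.default_rng(5)
kB_T = 4.1e-21                 # J, room temperature
viscosity = 1.8e-5             # Pa*s (air)
waist = 5e-6                   # m, beam waist
v_flow = 2e-3                  # m/s, flow speed along x
dt, n_steps, n_particles = 1e-5, 2000, 300
y_in = 1.5e-6                  # m, entry offset from the beam axis

def exit_y(radius, trap_depth):
    """trap_depth > 0 attracts toward the axis, < 0 repels (size-dependent)."""
    gamma = 6 * np.pi * viscosity * radius          # Stokes drag
    D = kB_T / gamma                                # diffusion constant
    x = np.full(n_particles, -2.0 * waist)
    y = np.full(n_particles, y_in)
    for _ in range(n_steps):                        # Euler-Maruyama steps
        envelope = np.exp(-2 * (x**2 + y**2) / waist**2)
        f_y = -4 * trap_depth * y / waist**2 * envelope   # gradient force
        x += v_flow * dt
        y += f_y / gamma * dt + np.sqrt(2 * D * dt) * rng.standard_normal(n_particles)
    return y

for radius, depth in [(250e-9, 2e-19), (275e-9, 0.0), (300e-9, -2e-19)]:
    y = exit_y(radius, depth)
    print(f"radius {radius*1e9:.0f} nm: exit y = {y.mean()*1e6:6.2f} "
          f"+/- {y.std()*1e6:4.2f} um")
```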

18.
Proc SPIE Int Soc Opt Eng ; 9700, 2016 Mar 24.
Article in English | MEDLINE | ID: mdl-27453623

ABSTRACT

The National Institute of Standards and Technology (NIST) has maintained scales for reflectance and transmittance over several decades. The scales are primarily intended for regular transmittance, mirrors, and solid surface scattering diffusers. The rapidly growing area of optical medical imaging needs a scale for volume scattering of diffuse materials that are used to mimic the optical properties of tissue. Such materials are used as phantoms to evaluate and validate instruments under development intended for clinical use. To address this need, a double-integrating sphere based instrument has been installed to measure the optical properties of tissue-mimicking phantoms. The basic system and methods have been described in previous papers. An important attribute in establishing a viable calibration service is the estimation of measurement uncertainties. The use of custom models and comparisons with other established scales enabled uncertainty measurements. Here, we describe the continuation of those efforts to advance the understanding of the uncertainties through two independent measurements: the bidirectional reflectance distribution function and the bidirectional transmittance distribution function of a commercially available solid biomedical phantom. A Monte Carlo-based model is used and the resulting optical properties are compared to the values provided by the phantom manufacturer.

19.
Atmos Meas Tech ; 9: 1627-1636, 2016 Apr 13.
Article in English | MEDLINE | ID: mdl-27453761

ABSTRACT

Laser absorption spectroscopy (LAS) has been used over the last several decades for the measurement of trace gases in the atmosphere. For over a decade, LAS measurements from multiple sources and tens of retroreflectors have been combined with sparse-sample tomography methods to estimate the 2-D distribution of trace gas concentrations and the underlying fluxes from point-like sources. In this work, we consider the ability of such a system to detect and estimate the position and rate of a single point leak, which may arise as a failure mode of carbon dioxide storage. The leak is assumed to be at a constant rate, giving rise to a plume with a concentration distribution that depends on the wind velocity. We demonstrate the ability of our approach to detect a leak using numerical simulation and also present a preliminary measurement.
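
A minimal sketch of the forward model and a grid-search retrieval with toy dispersion coefficients (not the paper's plume model or reconstruction): a constant point leak produces a wind-aligned Gaussian plume, each beam measures a path-averaged concentration, and a search over candidate leak positions and rates picks the best match.

```python
import numpy as np

wind_speed = 3.0                                 # m/s, wind along +x (assumed)

def plume(x, y, leak_x, leak_y, rate):
    """Ground-level Gaussian plume concentration (arbitrary units, toy sigmas)."""
    dx, dy = x - leak_x, y - leak_y
    downwind = dx > 0
    dxp = np.where(downwind, dx, 1.0)            # avoid divide-by-zero upwind
    sig_y, sig_z = 0.10 * dxp, 0.05 * dxp        # toy dispersion parameters
    c = rate / (np.pi * wind_speed * sig_y * sig_z) * np.exp(-dy**2 / (2 * sig_y**2))
    return np.where(downwind, c, 0.0)

def path_average(leak_x, leak_y, rate, end, n=200):
    """Average concentration along a beam from the origin to `end`."""
    t = np.linspace(0.0, 1.0, n)
    return plume(t * end[0], t * end[1], leak_x, leak_y, rate).mean()

retros = [(100.0, y) for y in (-40.0, -20.0, 0.0, 20.0, 40.0)]   # reflector positions, m
true_leak = (30.0, 5.0, 2.0)                                     # x, y, rate (synthetic)
data = np.array([path_average(*true_leak, end=r) for r in retros])

best, best_err = None, np.inf
for lx in np.arange(10, 60, 5.0):
    for ly in np.arange(-20, 21, 5.0):
        model = np.array([path_average(lx, ly, 1.0, end=r) for r in retros])
        if model @ model == 0:
            continue
        rate = data @ model / (model @ model)    # least-squares rate for this position
        err = np.sum((data - rate * model) ** 2)
        if err < best_err:
            best, best_err = (lx, ly, rate), err

print("recovered leak (x, y, rate):", np.round(best, 2))
```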
