Results 1 - 20 of 48
1.
J Comput Appl Math ; 257: 195-211, 2014 Feb 01.
Article in English | MEDLINE | ID: mdl-25202164

ABSTRACT

The finite element method (FEM) is a widely employed numerical technique for approximating the solution of partial differential equations (PDEs) in various science and engineering applications. Many of these applications benefit from fast execution of the FEM pipeline. One way to accelerate the FEM pipeline is to exploit advances in modern computational hardware, such as many-core streaming processors like the graphics processing unit (GPU). In this paper, we present the algorithms and data structures necessary to move the entire FEM pipeline to the GPU. First, we propose an efficient GPU-based algorithm to generate local element information and to assemble the global linear system associated with the FEM discretization of an elliptic PDE. To solve the corresponding linear system efficiently on the GPU, we implement a conjugate gradient method preconditioned with a geometry-informed algebraic multigrid (AMG) method. We propose a new fine-grained parallelism strategy, a corresponding multigrid cycling stage, and an efficient data mapping to the many-core architecture of the GPU. Comparison of our on-GPU assembly with a traditional serial CPU implementation shows up to an 87× speedup. Focusing on the linear system solver alone, we achieve a speedup of up to 51× over a comparable state-of-the-art serial CPU linear system solver. Furthermore, the method compares favorably with other GPU-based sparse linear solvers.
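The structure of the preconditioned conjugate gradient iteration described in this abstract can be sketched in a few lines. The sketch below is a minimal stand-in only: it uses a Jacobi (diagonal) preconditioner rather than the paper's geometry-informed AMG preconditioner, which is far beyond a short example, and solves a tiny 1D Poisson-like system of the kind FEM assembly produces.

```python
# Preconditioned conjugate gradient (PCG) sketch in pure Python.
# A Jacobi (diagonal) preconditioner stands in for the geometry-informed
# AMG preconditioner described in the abstract; the iteration structure
# is the same regardless of the preconditioner used.

def pcg(A, b, tol=1e-10, max_iter=100):
    """Solve A x = b for an SPD matrix A (list of lists) with Jacobi PCG."""
    n = len(b)
    x = [0.0] * n
    r = b[:]                                  # residual r = b - A x  (x = 0)
    z = [r[i] / A[i][i] for i in range(n)]    # apply M^{-1} (Jacobi)
    p = z[:]
    rz = sum(r[i] * z[i] for i in range(n))
    for _ in range(max_iter):
        Ap = [sum(A[i][j] * p[j] for j in range(n)) for i in range(n)]
        alpha = rz / sum(p[i] * Ap[i] for i in range(n))
        x = [x[i] + alpha * p[i] for i in range(n)]
        r = [r[i] - alpha * Ap[i] for i in range(n)]
        if sum(ri * ri for ri in r) ** 0.5 < tol:
            break
        z = [r[i] / A[i][i] for i in range(n)]
        rz_new = sum(r[i] * z[i] for i in range(n))
        p = [z[i] + (rz_new / rz) * p[i] for i in range(n)]
        rz = rz_new
    return x

# 1D Poisson-like SPD tridiagonal system, as arises from FEM assembly.
A = [[2.0, -1.0, 0.0], [-1.0, 2.0, -1.0], [0.0, -1.0, 2.0]]
b = [1.0, 0.0, 1.0]
x = pcg(A, b)   # exact solution is [1, 1, 1]
```

On the GPU version the matrix-vector product and the vector updates are the operations mapped to many-core hardware; the control flow above is unchanged.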

2.
Breast J ; 19(1): 49-55, 2013.
Article in English | MEDLINE | ID: mdl-23186153

ABSTRACT

Histologic confirmation of axillary nodal metastases preoperatively avoids a sentinel node biopsy and enables a one-step surgical procedure. The aim of this study was to establish the local positive predictive value of axillary ultrasound (AUS) and guided needle core biopsy (NCB) in axillary staging of breast cancer, and to identify factors influencing yield. A prospective audit was undertaken of 142 consecutive patients (screening and symptomatic) presenting from 1 December 2008 to 31 May 2009 with breast lesions categorized R4-R5, who underwent a preoperative AUS and proceeded to surgery. Ultrasound-guided NCB was performed on nodes radiologically classified R3-R5. Lymph node size, number, and morphological features were documented. Yield was correlated with tumor size, grade, and histologic type. AUS/NCB findings were correlated with postsurgical pathologic findings to determine the sensitivity, specificity, and positive and negative predictive values of AUS and NCB. A total of 142 patients underwent surgery, of whom 52 (37%) had lymph node metastases on histology. All had a preoperative AUS; 51 (36%) had abnormal ultrasound findings, and 46 of these (90%) underwent axillary node NCB, of which 24 (52%) were positive. The smallest tumor size associated with positive nodes at surgery was 11.5 mm. The sensitivity of AUS was 65% and its specificity 81%, with a positive predictive value (PPV) of 67% and a negative predictive value (NPV) of 80%. The sensitivity of ultrasound-guided NCB was 75%, with a specificity of 100%, PPV of 100%, and NPV of 64%. The sensitivity of AUS for lobular carcinoma was 36%, versus 76% for all other histologies; the sensitivity of NCB for lobular cancer was 33%, versus 79% for all other histologies. The most significant factor producing discordance between preoperative AUS and definitive histologic evidence of lymph node metastasis was tumor type. Accurate preoperative lymph node staging was prejudiced by lobular histology (p < 0.0019).
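The predictive values reported for AUS can be cross-checked from the sensitivity, specificity, and prevalence given in the abstract (52/142 node-positive) via Bayes' rule; the small computation below reproduces them to within rounding.

```python
# Cross-check the reported AUS predictive values from sensitivity,
# specificity, and prevalence (52/142 node-positive on histology).

def predictive_values(sens, spec, prev):
    ppv = sens * prev / (sens * prev + (1 - spec) * (1 - prev))
    npv = spec * (1 - prev) / (spec * (1 - prev) + (1 - sens) * prev)
    return ppv, npv

prev = 52 / 142          # 37% of patients had nodal metastases on histology
ppv, npv = predictive_values(0.65, 0.81, prev)
# ppv ~ 0.66 and npv ~ 0.80, consistent (given rounding of the reported
# sensitivity and specificity) with the 67% and 80% stated above.
```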


Subjects
Breast Neoplasms/pathology, Ductal Breast Carcinoma/secondary, Lobular Carcinoma/secondary, Lymph Nodes/diagnostic imaging, Lymph Nodes/pathology, Adult, Aged, Aged 80 and over, Axilla, Needle Biopsy, Breast Neoplasms/surgery, Ductal Breast Carcinoma/surgery, Lobular Carcinoma/surgery, Female, Humans, Lymph Nodes/surgery, Lymphatic Metastasis, Medical Audit, Middle Aged, Neoplasm Micrometastasis/diagnostic imaging, Neoplasm Micrometastasis/pathology, Neoplasm Staging, Predictive Value of Tests, Preoperative Care, Prospective Studies, Ultrasonography
3.
Appl Numer Math ; 63: 58-77, 2013 Jan.
Article in English | MEDLINE | ID: mdl-23585704

ABSTRACT

The Immersed Boundary (IB) method is a widely-used numerical methodology for the simulation of fluid-structure interaction problems. The IB method utilizes an Eulerian discretization for the fluid equations of motion while maintaining a Lagrangian representation of structural objects. Operators are defined for transmitting information (forces and velocities) between these two representations. Most IB simulations represent their structures with piecewise linear approximations and utilize Hookean spring models to approximate structural forces. Our specific motivation is the modeling of platelets in hemodynamic flows. In this paper, we study two alternative representations - radial basis functions (RBFs) and Fourier-based (trigonometric polynomials and spherical harmonics) representations - for the modeling of platelets in two and three dimensions within the IB framework, and compare our results with the traditional piecewise linear approximation methodology. For different representative shapes, we examine the geometric modeling errors (position and normal vectors), force computation errors, and computational cost and provide an engineering trade-off strategy for when and why one might select to employ these different representations.

4.
JCO Clin Cancer Inform ; 7: e2300057, 2023 07.
Article in English | MEDLINE | ID: mdl-37490642

ABSTRACT

PURPOSE: To determine prognostic and predictive clinical outcomes in metastatic hormone-sensitive prostate cancer (mHSPC) and metastatic castrate-resistant prostate cancer (mCRPC) on the basis of a combination of plasma-derived genomic alterations and lipid features in a longitudinal cohort of patients with advanced prostate cancer. METHODS: A multifeature classifier was constructed to predict clinical outcomes using plasma-based genomic alterations detected in 120 genes and 772 lipidomic species as informative features in a cohort of 71 patients with mHSPC and 144 patients with mCRPC. Outcomes of interest were collected over 11 years of follow-up. In the mHSPC state, these were early failure of androgen-deprivation therapy (ADT) and exceptional response to ADT; in the mCRPC state, early death (poor prognosis) and long-term survival. The approach was to build binary classification models that identified discriminative candidates with optimal weights to predict outcomes. To achieve this, we built multi-omic feature-based classifiers using traditional machine learning (ML) methods, including logistic regression with sparse regularization, multi-kernel Gaussian process regression, and support vector machines. RESULTS: The levels of specific ceramides (d18:1/14:0 and d18:1/17:0) and the presence of CHEK2 mutations, AR amplification, and RB1 deletion were identified as the most crucial factors associated with clinical outcomes. Using ML models, the optimal multi-omic feature combination yielded AUC scores of 0.751 for predicting mHSPC survival and 0.638 for predicting ADT failure, and, in the mCRPC state, 0.687 for prognostication and 0.727 for exceptional survival. These models were superior to models built on a limited number of candidate features for developing multi-omic prognostic and predictive signatures.
CONCLUSION: A ML approach that incorporates multiple omic features significantly improves prediction accuracy for metastatic prostate cancer outcomes. These models will require validation in independent data sets.
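The AUC scores reported above have a simple probabilistic reading: the AUC is the probability that a randomly chosen positive case receives a higher classifier score than a randomly chosen negative case. The sketch below computes it directly from that definition; the toy scores are illustrative only, not the study's data.

```python
# AUC as the probability that a randomly chosen positive case scores
# higher than a randomly chosen negative case (ties count one half).
# The toy scores below are illustrative, not the study's data.

def auc(pos_scores, neg_scores):
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos_scores) * len(neg_scores))

pos = [0.9, 0.8, 0.55, 0.4]   # classifier scores for event cases
neg = [0.7, 0.5, 0.3, 0.2]    # classifier scores for non-event cases
score = auc(pos, neg)         # 13 of 16 pairs ordered correctly -> 0.8125
```

An AUC of 0.5 corresponds to random ranking and 1.0 to perfect separation, which is why values such as 0.727 and 0.751 indicate useful but imperfect discrimination.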


Subjects
Castration-Resistant Prostatic Neoplasms, Male, Humans, Castration-Resistant Prostatic Neoplasms/diagnosis, Castration-Resistant Prostatic Neoplasms/genetics, Castration-Resistant Prostatic Neoplasms/therapy, Androgen Antagonists/therapeutic use, Lipidomics, Multiomics, Retrospective Studies, Genomics
5.
IEEE Trans Vis Comput Graph ; 28(12): 4546-4557, 2022 Dec.
Article in English | MEDLINE | ID: mdl-34191729

ABSTRACT

Robustly handling collisions between individual particles in a large particle-based simulation has been a challenging problem. We introduce particle merging-and-splitting, a simple scheme for robustly handling collisions between particles that prevents inter-penetrations of separate objects without introducing numerical instabilities. This scheme merges colliding particles at the beginning of the time-step and then splits them at the end of the time-step. Thus, collisions last for the duration of a time-step, allowing neighboring particles of the colliding particles to influence each other. We show that our merging-and-splitting method is effective in robustly handling collisions and avoiding penetrations in particle-based simulations. We also show how our merging-and-splitting approach can be used for coupling different simulation systems using different and otherwise incompatible integrators. We present simulation tests involving complex solid-fluid interactions, including solid fractures generated by fluid interactions.
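The central invariant of the merging step is conservation of momentum: the merged particle carries the combined mass and momentum of the colliding pair. The 1D sketch below shows only that invariant; the full method also stores and restores the particles' relative motion at the split, which this toy version omits.

```python
# Minimal 1D sketch of the merge step: two colliding particles are
# replaced for the duration of a time-step by one particle carrying
# their combined mass and momentum.  The split step here simply hands
# the merged velocity back to both particles; the paper's scheme also
# restores relative motion, which this sketch omits.

def merge(m1, v1, m2, v2):
    m = m1 + m2
    v = (m1 * v1 + m2 * v2) / m       # momentum-conserving merged velocity
    return m, v

def split(m, v, m1, m2):
    return (m1, v), (m2, v)           # both leave with the merged velocity

m, v = merge(1.0, 2.0, 3.0, -1.0)     # head-on collision
(p1, p2) = split(m, v, 1.0, 3.0)

# Momentum before and after the merge/split cycle is identical.
momentum_before = 1.0 * 2.0 + 3.0 * (-1.0)
momentum_after = p1[0] * p1[1] + p2[0] * p2[1]
```

Because the pair spends the whole time-step as one particle, its neighbors interact with the merged state, which is what lets collisions influence the surrounding simulation without inter-penetration.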

6.
IEEE Trans Vis Comput Graph ; 27(9): 3781-3793, 2021 Sep.
Article in English | MEDLINE | ID: mdl-32248111

ABSTRACT

Extraction of multiscale features using scale-space is one of the fundamental approaches to analyze scalar fields. However, similar techniques for vector fields are much less common, even though it is well known that, for example, turbulent flows contain cascades of nested vortices at different scales. The challenge is that the ideas related to scale-space are based upon iteratively smoothing the data to extract features at progressively larger scale, making it difficult to extract overlapping features. Instead, we consider spatial regions of influence in vector fields as scale, and introduce a new approach for the multiscale analysis of vector fields. Rather than smoothing the flow, we use the natural Helmholtz-Hodge decomposition to split it into small-scale and large-scale components using progressively larger neighborhoods. Our approach creates a natural separation of features by extracting local flow behavior, for example, a small vortex, from large-scale effects, for example, a background flow. We demonstrate our technique on large-scale, turbulent flows, and show multiscale features that cannot be extracted using state-of-the-art techniques.
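The core idea of treating neighborhood size as scale can be illustrated with a much simpler split than the one used here. The sketch below separates a 1D sampled signal into a large-scale part (a moving average over a neighborhood of radius r) and a small-scale residual; it is a simplified stand-in only, since the paper performs a localized Helmholtz-Hodge decomposition of a vector field, not a moving average.

```python
import math

# Simplified illustration of neighborhood-based scale separation for a
# 1D sampled signal: the large-scale part is a moving average over a
# neighborhood of radius r, and the small-scale part is the residual.
# (A stand-in for the localized Helmholtz-Hodge split described above;
# only the large/small decomposition with exact reconstruction is shown.)

def scale_split(signal, r):
    n = len(signal)
    large = []
    for i in range(n):
        window = signal[max(0, i - r): min(n, i + r + 1)]
        large.append(sum(window) / len(window))
    small = [signal[i] - large[i] for i in range(n)]
    return large, small

# A slow wave carrying a superimposed fast oscillation.
signal = [math.sin(0.2 * i) + 0.3 * math.sin(2.5 * i) for i in range(50)]
large, small = scale_split(signal, 4)
# large + small reconstructs the original signal exactly; growing r
# pushes more of the fast oscillation into the small-scale part.
```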

7.
IEEE Trans Vis Comput Graph ; 26(1): 162-172, 2020 Jan.
Article in English | MEDLINE | ID: mdl-31425105

ABSTRACT

High-order finite element methods (HO-FEM) are gaining popularity in the simulation community due to their success in solving complex flow dynamics. There is an increasing need to analyze the data produced as output by these simulations. Simultaneously, topological analysis tools are emerging as powerful methods for investigating simulation data. However, most current approaches to topological analysis have had limited application to HO-FEM simulation data for two reasons. First, current topological tools are designed for linear data (polynomial degree one), but the polynomial degree of the data output by these simulations is typically higher (routinely up to polynomial degree six). Second, the simulation data and quantities derived from it have discontinuities at element boundaries, and these discontinuities do not match the input requirements of the topological tools. One solution to both issues is to transform the high-order data into low-order, continuous inputs for topological analysis. Nevertheless, there has been little work evaluating the possible transformation choices and their downstream effect on the topological analysis. We perform an empirical study of two commonly used data transformation methodologies along with the recently introduced L-SIAC filter for processing high-order simulation data. Our results show that diverse behaviors are possible. We offer guidance on how best to assemble a pipeline for topological analysis of HO-FEM simulations with the currently available implementations of topological analysis.
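Why the choice of transformation matters can be seen in a one-element toy case: a degree-2 field with an interior maximum looks monotone if it is linearized by sampling only at the element vertices. This is a generic illustration of the hazard, not one of the specific transformations evaluated in the paper.

```python
# Why naive low-order resampling can break topological analysis: a
# quadratic field on one element has an interior maximum, but sampling
# it only at the element vertices (a common linearization) hides it.

def quadratic(x):
    return 1.0 - (2.0 * x - 1.0) ** 2     # max of 1 at x = 0.5, zero at ends

vertex_samples = [quadratic(0.0), quadratic(1.0)]          # both 0.0
dense_samples = [quadratic(i / 10.0) for i in range(11)]   # resolves the peak

# The vertex-only (linear) view misses the interior critical point,
# so any topological analysis run on it sees a flat field.
assert max(vertex_samples) == 0.0
assert max(dense_samples) == 1.0
```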

8.
IEEE Trans Vis Comput Graph ; 15(6): 1227-34, 2009.
Article in English | MEDLINE | ID: mdl-19834193

ABSTRACT

Visual representations of isosurfaces are ubiquitous in the scientific and engineering literature. In this paper, we present techniques to assess the behavior of isosurface extraction codes. Where applicable, these techniques allow us to distinguish whether anomalies in isosurface features can be attributed to the underlying physical process or to artifacts from the extraction process. Such scientific scrutiny is at the heart of verifiable visualization--subjecting visualization algorithms to the same verification process that is used in other components of the scientific pipeline. More concretely, we derive formulas for the expected order of accuracy (or convergence rate) of several isosurface features, and compare them to experimentally observed results in the selected codes. This technique is practical: in two cases, it exposed actual problems in implementations. We provide the reader with the range of responses they can expect to encounter with isosurface techniques, both under "normal operating conditions" and also under adverse conditions. Armed with this information--the results of the verification process--practitioners can judiciously select the isosurface extraction technique appropriate for their problem of interest, and have confidence in its behavior.
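The verification step described above boils down to comparing an expected convergence rate against one observed from errors measured at two grid resolutions, via p = log(e1/e2) / log(h1/h2). A minimal sketch with hypothetical error values:

```python
import math

# Observed order of accuracy from errors measured at two grid spacings:
#   p = log(e1 / e2) / log(h1 / h2)
# If an isosurface feature is expected to converge at second order,
# halving h should roughly quarter the error.

def observed_order(e1, h1, e2, h2):
    return math.log(e1 / e2) / math.log(h1 / h2)

# Hypothetical errors from a second-order-accurate extraction code:
p = observed_order(4.0e-3, 0.1, 1.0e-3, 0.05)
# p = 2.0 here; an observed order well below the expected one is the
# kind of mismatch that exposed implementation problems in two codes.
```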

9.
IEEE Trans Biomed Eng ; 55(1): 31-40, 2008 Jan.
Article in English | MEDLINE | ID: mdl-18232344

ABSTRACT

Because numerical simulation parameters may significantly influence the accuracy of the results, evaluating the sensitivity of simulation results to variations in parameters is essential. Although the field of sensitivity analysis is well developed, systematic application of such methods to complex biological models is limited due to the associated high computational costs and the substantial technical challenges for implementation. In the specific case of the forward problem in electrocardiography, the lack of robust, feasible, and comprehensive sensitivity analysis has left many aspects of the problem unresolved and subject to empirical and intuitive evaluation rather than sound, quantitative investigation. In this study, we have developed a systematic, stochastic approach to the analysis of sensitivity of the forward problem of electrocardiography to the parameter of inhomogeneous tissue conductivity. We apply this approach to a two-dimensional, inhomogeneous, geometric model of a slice through the human thorax. We assigned probability density functions for various organ conductivities and applied stochastic finite elements based on the generalized polynomial chaos-stochastic Galerkin (gPC-SG) method to obtain the standard deviation of the resulting stochastic torso potentials. This method utilizes a spectral representation of the stochastic process to obtain numerically accurate stochastic solutions in a fraction of the time required when employing classic Monte Carlo methods. We have shown that a systematic study of sensitivity is not only easily feasible with the gPC-SG approach but can also provide valuable insight into characteristics of the specific simulation.
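The efficiency claim rests on spectral stochastic methods needing only a handful of well-chosen evaluations where Monte Carlo needs many random samples. The toy below shows just that quadrature ingredient: the mean of a scalar response of one Gaussian parameter from a 3-point Gauss-Hermite rule. It is a stand-in only; the actual gPC-SG method expands the full PDE solution spectrally, and the "response" here is an illustrative assumption.

```python
import math

# Minimal spectral stand-in for one ingredient of gPC: compute the mean
# of a response u(xi), with xi ~ N(0,1), by Gauss-Hermite quadrature on
# three nodes instead of many Monte Carlo samples.

# 3-point probabilists' Gauss-Hermite rule: nodes and weights.
nodes = [-math.sqrt(3.0), 0.0, math.sqrt(3.0)]
weights = [1.0 / 6.0, 2.0 / 3.0, 1.0 / 6.0]

def response(xi, sigma=0.3):
    # Toy response depending on an uncertain conductivity-like factor.
    return math.exp(sigma * xi)

mean = sum(w * response(x) for x, w in zip(nodes, weights))
exact = math.exp(0.3 ** 2 / 2)   # E[exp(sigma * Z)] for Z ~ N(0,1)
# Three evaluations already land within ~1e-5 of the exact mean,
# versus thousands of samples for comparable Monte Carlo accuracy.
```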


Subjects
Body Surface Potential Mapping/methods, Computer-Assisted Diagnosis/methods, Electrocardiography/methods, Heart Conduction System/physiopathology, Cardiovascular Models, Computer Simulation, Electric Conductivity, Finite Element Analysis, Humans, Statistical Models, Stochastic Processes
10.
Int Semin Surg Oncol ; 5: 20, 2008 Aug 11.
Article in English | MEDLINE | ID: mdl-18694514

ABSTRACT

Mastectomy rates may be affected by patient choice. A total of 203 patients who had a total mastectomy for breast cancer were invited to complete questionnaires at routine follow-up clinics to ascertain whether they had been offered a choice of breast-conserving surgery (BCS), and to establish the reasons for their preference. Questionnaires were checked against medical and nursing records to confirm the reasons for the patients' choice of mastectomy. 130 patients (64%) chose to have a mastectomy, reporting that they felt safer (n = 119), wanted to decrease the risk of further surgery (n = 87), and/or wished to avoid radiotherapy (n = 34). Some were advised not to have BCS because of a large tumor size, central or multifocal tumors, and/or associated extensive microcalcification on mammography (n = 29). 24 patients had BCS as their first operation but required repeat surgery for involved or narrow excision margins. Despite being advised that there is no difference between the survival rates of mastectomy and breast-conserving surgery, many patients still feel safer with mastectomy.

11.
IEEE Trans Vis Comput Graph ; 14(3): 680-92, 2008.
Article in English | MEDLINE | ID: mdl-18369273

ABSTRACT

Streamline integration of fields produced by computational fluid mechanics simulations is a commonly used tool for the investigation and analysis of fluid flow phenomena. Integration is often accomplished through the application of ordinary differential equation (ODE) integrators--integrators whose error characteristics are predicated on the smoothness of the field through which the streamline is being integrated--smoothness which is not available at the inter-element level of finite volume and finite element data. Adaptive error control techniques are often used to ameliorate the challenge posed by inter-element discontinuities. As the root of the difficulties is the discontinuous nature of the data, we present a complementary approach of applying smoothness-enhancing accuracy-conserving filters to the data prior to streamline integration. We investigate whether such an approach applied to uniform quadrilateral discontinuous Galerkin (high-order finite volume) data can be used to augment current adaptive error control approaches. We discuss and demonstrate through numerical example the computational trade-offs exhibited when one applies such a strategy.
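Streamline integration advances a seed point through the velocity field with an ODE integrator, exactly the setting whose error behavior this abstract examines. A self-contained RK4 sketch on an analytic rigid-rotation field, whose exact streamlines are circles, so the distance from the origin should be preserved:

```python
import math

# RK4 streamline integration sketch.  The velocity field is a rigid
# rotation, so the exact streamline through any seed is a circle and
# the distance from the origin should be preserved by the integrator.

def velocity(p):
    x, y = p
    return (-y, x)                # rigid rotation about the origin

def rk4_step(p, h):
    def add(a, b, s):
        return (a[0] + s * b[0], a[1] + s * b[1])
    k1 = velocity(p)
    k2 = velocity(add(p, k1, h / 2))
    k3 = velocity(add(p, k2, h / 2))
    k4 = velocity(add(p, k3, h))
    return (p[0] + h / 6 * (k1[0] + 2 * k2[0] + 2 * k3[0] + k4[0]),
            p[1] + h / 6 * (k1[1] + 2 * k2[1] + 2 * k3[1] + k4[1]))

p = (1.0, 0.0)
for _ in range(1000):
    p = rk4_step(p, 0.01)         # integrate 10 time units around the circle
radius = math.hypot(p[0], p[1])   # stays ~1 for this smooth field
```

The error estimates underpinning RK4's step control assume the smoothness this analytic field has everywhere; at the inter-element discontinuities of finite volume and finite element data those assumptions break down, which is the problem the filtering approach targets.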


Subjects
Algorithms, Computer Graphics, Image Enhancement/methods, Computer-Assisted Image Interpretation/methods, Three-Dimensional Imaging/methods, Rheology/methods, Computer-Assisted Signal Processing, Computer Simulation, Finite Element Analysis, Theoretical Models, Computer-Assisted Numerical Analysis, Systems Integration
12.
IEEE Trans Vis Comput Graph ; 14(6): 1539-46, 2008.
Article in English | MEDLINE | ID: mdl-18989007

ABSTRACT

Methods that faithfully and robustly capture the geometry of complex material interfaces in labeled volume data are important for generating realistic and accurate visualizations and simulations of real-world objects. The generation of such multimaterial models from measured data poses two unique challenges: first, the surfaces must be well-sampled with regular, efficient tessellations that are consistent across material boundaries; and second, the resulting meshes must respect the nonmanifold geometry of the multimaterial interfaces. This paper proposes a strategy for sampling and meshing multimaterial volumes using dynamic particle systems, including a novel, differentiable representation of the material junctions that allows the particle system to explicitly sample corners, edges, and surfaces of material intersections. The distributions of particles are controlled by fundamental sampling constraints, allowing Delaunay-based meshing algorithms to reliably extract watertight meshes of consistently high quality.

13.
IEEE Trans Vis Comput Graph ; 24(1): 903-912, 2018 01.
Article in English | MEDLINE | ID: mdl-28866517

ABSTRACT

As the finite element method (FEM) and the finite volume method (FVM), both traditional and high-order variants, continue their proliferation into various applied engineering disciplines, it is important that the visualization techniques and corresponding data analysis tools that act on the results produced by these methods faithfully represent the underlying data. Stated another way: the interpretation of data generated by simulation needs to be consistent with the numerical schemes that underpin the specific solver technology. As the verifiable visualization literature has demonstrated, visual artifacts produced by the introduction of either explicit or implicit data transformations, such as data resampling, can sometimes distort or even obfuscate key scientific features in the data. In this paper, we focus on the handling of elemental continuity, which is often only C0 continuous or piecewise discontinuous, when visualizing primary or derived fields from FEM or FVM simulations. We demonstrate that traditional data handling and visualization of these fields introduce visual errors. In addition, we show how the use of the recently proposed line-SIAC filter provides a way of handling elemental continuity issues in an accuracy-conserving manner, with the added benefit of casting the data in a smooth context even if the representation is element-discontinuous.

14.
IEEE Trans Vis Comput Graph ; 24(12): 3268-3296, 2018 12.
Article in English | MEDLINE | ID: mdl-29990196

ABSTRACT

This article surveys the history and current state of the art of visualization in meteorology, focusing on visualization techniques and tools used for meteorological data analysis. We examine characteristics of meteorological data and analysis tasks, describe the development of computer graphics methods for visualization in meteorology from the 1960s to today, and visit the state of the art of visualization techniques and tools in operational weather forecasting and atmospheric research. We approach the topic from both the visualization and the meteorological side, showing visualization techniques commonly used in meteorological practice, and surveying recent studies in visualization research aimed at meteorological applications. Our overview covers visualization techniques from the fields of display design, 3D visualization, flow dynamics, feature-based visualization, comparative visualization and data fusion, uncertainty and ensemble visualization, interactive visual analysis, efficient rendering, and scalability and reproducibility. We discuss demands and challenges for visualization research targeting meteorological data analysis, highlighting aspects in demonstration of benefit, interactive visual analysis, seamless visualization, ensemble visualization, 3D visualization, and technical issues.

16.
Int Semin Surg Oncol ; 4: 30, 2007 Dec 22.
Article in English | MEDLINE | ID: mdl-18154682

ABSTRACT

AIMS: This paper describes a simple technique of axillary and breast massage which improves the successful identification of blue sentinel nodes using patent blue dye alone. METHODS: Patent blue dye was injected into the subdermal part of the retroareolar area in 167 patients having surgical treatment for invasive breast cancer. Three-stage axillary lymphatic massage was performed prior to making the axillary incision for sentinel lymph node biopsy. All patients had completion axillary sampling or clearance. RESULTS: A blue lymphatic duct leading to the first-drainage lymph nodes was identified in 163 (97%) of the patients. Results are compared with 168 patients who had sentinel lymph node biopsy using blue dye without axillary massage. Allergic reactions were observed in four patients (1.2%). CONCLUSION: Three-stage axillary lymphatic massage improves the successful identification of a blue sentinel lymph node in breast cancer patients.

17.
IEEE Trans Vis Comput Graph ; 13(6): 1704-11, 2007.
Article in English | MEDLINE | ID: mdl-17968128

ABSTRACT

This paper describes a method for constructing isosurface triangulations of sampled, volumetric, three-dimensional scalar fields. The resulting meshes consist of triangles that are of consistently high quality, making them well suited for accurate interpolation of scalar and vector-valued quantities, as required for numerous applications in visualization and numerical simulation. The proposed method does not rely on a local construction or adjustment of triangles as is done, for instance, in advancing wavefront or adaptive refinement methods. Instead, a system of dynamic particles optimally samples an implicit function such that the particles' relative positions can produce a topologically correct Delaunay triangulation. Thus, the proposed method relies on a global placement of triangle vertices. The main contributions of the paper are the integration of dynamic particle systems with surface sampling theory and PDE-based methods for controlling the local variability of particle densities, as well as detailing a practical method that accommodates Delaunay sampling requirements to generate sparse sets of points for the production of high-quality tessellations.

18.
IEEE Trans Vis Comput Graph ; 12(1): 114-25, 2006.
Article in English | MEDLINE | ID: mdl-16382613

ABSTRACT

The purpose of this paper is to present a ray-tracing isosurface rendering algorithm for spectral/hp (high-order finite) element methods in which the visualization error is both quantified and minimized. Determination of the ray-isosurface intersection is accomplished by classic polynomial root-finding applied to a polynomial approximation obtained by projecting the finite element solution over element-partitioned segments along the ray. Combining the smoothness properties of spectral/hp elements with classic orthogonal polynomial approximation theory, we devise an adaptive scheme which allows the polynomial approximation along a ray-segment to be arbitrarily close to the true solution. The resulting images converge toward a pixel-exact image at a rate far faster than sampling the spectral/hp element solution and applying classic low-order visualization techniques such as marching cubes.
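The ray-isosurface intersection described above reduces to 1D root-finding on g(t) = f(ray(t)) - isovalue. The paper projects the spectral/hp solution onto a polynomial per ray segment and applies polynomial root-finding; the sketch below substitutes simple bisection on an analytic field to show the same reduction.

```python
# Ray-isosurface intersection as 1D root-finding along the ray.
# Bisection on an analytic field stands in here for the polynomial
# root-finding applied to per-segment projections in the paper.

def field(p):
    # Analytic scalar field: distance from origin (isosurfaces = spheres).
    return (p[0] ** 2 + p[1] ** 2 + p[2] ** 2) ** 0.5

def ray(origin, direction, t):
    return tuple(o + t * d for o, d in zip(origin, direction))

def intersect(origin, direction, isovalue, t0, t1, tol=1e-12):
    g = lambda t: field(ray(origin, direction, t)) - isovalue
    assert g(t0) * g(t1) < 0, "isovalue must be bracketed on [t0, t1]"
    while t1 - t0 > tol:
        tm = 0.5 * (t0 + t1)
        if g(t0) * g(tm) <= 0:
            t1 = tm
        else:
            t0 = tm
    return 0.5 * (t0 + t1)

# Ray from (-2, 0.5, 0) along +x hits the unit-sphere isosurface f = 1
# at t = 2 - sqrt(0.75).
t = intersect((-2.0, 0.5, 0.0), (1.0, 0.0, 0.0), 1.0, 0.0, 2.0)
```

Bisection converges linearly; the polynomial projection in the paper is what allows both faster root isolation and the quantified bound on visualization error.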


Subjects
Algorithms, Computer Graphics, Image Enhancement/methods, Computer-Assisted Image Interpretation/methods, Three-Dimensional Imaging/methods, Theoretical Models, User-Computer Interface, Computer Simulation, Finite Element Analysis
19.
Stud Health Technol Inform ; 119: 228-33, 2006.
Article in English | MEDLINE | ID: mdl-16404050

ABSTRACT

The Material Point Method is used in the computational simulation of ballistic projectile damage to the heart. A transversely isotropic hyperelastic material model is used to model the myocardium. Computed estimates of tissue damage are used to characterize damaged tissue. The method's potential to estimate the damaged tissue in the complete torso is considered.


Subjects
Computer Simulation, Penetrating Wounds, Connective Tissue, Humans, Anatomic Models
20.
J Sci Comput ; 63(3): 745-768, 2016 Jun 01.
Article in English | MEDLINE | ID: mdl-25983388

ABSTRACT

In this paper, we present a method based on Radial Basis Function (RBF)-generated Finite Differences (FD) for numerically solving diffusion and reaction-diffusion equations (PDEs) on closed surfaces embedded in ℝ^d. Our method uses a method-of-lines formulation, in which surface derivatives that appear in the PDEs are approximated locally using RBF interpolation. The method requires only scattered nodes representing the surface and normal vectors at those scattered nodes. All computations use only extrinsic coordinates, thereby avoiding coordinate distortions and singularities. We also present an optimization procedure that allows for the stabilization of the discrete differential operators generated by our RBF-FD method by selecting shape parameters for each stencil that correspond to a global target condition number. We show the convergence of our method on two surfaces for different stencil sizes, and present applications to nonlinear PDEs simulated both on implicit/parametric surfaces and more general surfaces represented by point clouds.
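The heart of RBF-FD is that derivative stencil weights come from solving a small linear system built from a radial kernel. The 1D sketch below computes such weights with a Gaussian RBF on a three-node stencil and applies them to differentiate sin(x); it is a toy only, since the paper works on surfaces with scattered nodes and tunes the shape parameter per stencil.

```python
import math

# 1D sketch of the RBF-FD idea: weights w for approximating a derivative
# at a stencil center satisfy  sum_j w_j * phi(x_j - x_i) = phi'(x_c - x_i)
# for each stencil node x_i, a small dense linear system per stencil.

def solve(A, b):
    """Tiny Gaussian elimination with partial pivoting for small systems."""
    n = len(b)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for c in range(n):
        piv = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[piv] = M[piv], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def rbf_fd_weights(nodes, center, eps=1.0):
    phi = lambda r: math.exp(-(eps * r) ** 2)                   # Gaussian RBF
    dphi = lambda d: -2.0 * eps ** 2 * d * math.exp(-(eps * d) ** 2)
    A = [[phi(xi - xj) for xj in nodes] for xi in nodes]
    b = [dphi(center - xi) for xi in nodes]
    return solve(A, b)

nodes = [-0.1, 0.0, 0.1]
w = rbf_fd_weights(nodes, 0.0)
# Apply the stencil to f = sin; the result approximates cos(0) = 1.
deriv = sum(wi * math.sin(xi) for wi, xi in zip(w, nodes))
```

The shape parameter eps controls the conditioning of the small system, which is exactly what the paper's per-stencil shape-parameter selection against a global target condition number stabilizes.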
