1.
Ann Surg Oncol; 16(10): 2943-52, 2009 Oct.
Article in English | MEDLINE | ID: mdl-19582506

ABSTRACT

BACKGROUND: Invisible near-infrared (NIR) fluorescent light can provide high-sensitivity, high-resolution, real-time image guidance during oncologic surgery, but currently available imaging systems do not display this invisible light in the context of surgical anatomy. The FLARE imaging system overcomes this major obstacle. METHODS: Color video was acquired simultaneously, and in real time, along with two independent channels of NIR fluorescence. Grayscale NIR fluorescence images were converted to visible "pseudo-colors" and overlaid onto the color video image. Yorkshire pigs weighing 35 kg (n = 5) were used for final preclinical validation of the imaging system. A six-patient pilot study was conducted in women undergoing sentinel lymph node (SLN) mapping for breast cancer. Subjects received (99m)Tc-sulfur colloid lymphoscintigraphy. In addition, 12.5 µg of indocyanine green (ICG) diluted in human serum albumin (HSA) was used as an NIR fluorescent lymphatic tracer. RESULTS: The FLARE system permitted facile positioning in the operating room. NIR light did not change the appearance of the surgical field. Simultaneous pan-lymphatic and SLN mapping was demonstrated in swine using clinically available NIR fluorophores and the dual-NIR capability of the system. In the pilot clinical trial, a total of nine SLNs were identified by (99m)Tc lymphoscintigraphy and nine SLNs were identified by NIR fluorescence, although the results differed in two patients. No adverse events were encountered. CONCLUSIONS: We describe the successful clinical translation of a new NIR fluorescence imaging system for image-guided oncologic surgery.
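
As a rough illustration of the overlay step described in METHODS (not the FLARE implementation itself), a grayscale NIR frame can be mapped to a pseudo-color and alpha-blended onto the color video frame; the array layout, the green colormap, and the blending weight below are assumptions:

    # Minimal sketch: pseudo-color a grayscale NIR frame and blend it onto color video.
    # Assumes 8-bit inputs; the green colormap and alpha weighting are illustrative only.
    import numpy as np

    def overlay_nir(color_bgr: np.ndarray, nir_gray: np.ndarray, alpha: float = 0.6) -> np.ndarray:
        """color_bgr: HxWx3 uint8 video frame; nir_gray: HxW uint8 NIR fluorescence image."""
        nir = nir_gray.astype(np.float32) / 255.0
        # Map NIR intensity to a single pseudo-color (here green), scaled by signal strength.
        pseudo = np.zeros_like(color_bgr, dtype=np.float32)
        pseudo[..., 1] = 255.0 * nir                      # green channel carries the NIR signal
        # Blend only where fluorescence is present, keeping anatomy visible elsewhere.
        weight = (alpha * nir)[..., None]
        out = (1.0 - weight) * color_bgr.astype(np.float32) + weight * pseudo
        return np.clip(out, 0, 255).astype(np.uint8)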


Subject(s)
Breast Neoplasms/diagnosis; Fluorometry/methods; Indocyanine Green; Lymph Nodes/pathology; Aged; Animals; Breast Neoplasms/diagnostic imaging; Breast Neoplasms/surgery; Coloring Agents; Female; Fluorescence; Humans; Intraoperative Period; Lymph Nodes/diagnostic imaging; Lymphatic Metastasis; Middle Aged; Pilot Projects; Radionuclide Imaging; Radiopharmaceuticals; Sentinel Lymph Node Biopsy; Surgery, Computer-Assisted; Sus scrofa; Technetium Tc 99m Sulfur Colloid
2.
J Biomed Opt; 12(5): 051902, 2007.
Article in English | MEDLINE | ID: mdl-17994885

ABSTRACT

We present a novel methodology for combining breast image data obtained at different times, in different geometries, and by different techniques. We combine data based on diffuse optical tomography (DOT) and magnetic resonance imaging (MRI). The software platform integrates advanced multimodal registration and segmentation algorithms, requires minimal user experience, and employs computationally efficient techniques. The resulting superposed 3-D tomographs facilitate tissue analyses based on structural and functional data derived from both modalities, and readily permit enhancement of DOT data reconstruction using MRI-derived a priori structural information. We demonstrate the multimodal registration method using a simulated phantom, and we present initial patient studies confirming that tumorous regions in the patient's breast identified by both imaging modalities exhibit significantly higher total hemoglobin concentration (THC) than the surrounding normal tissue. The average THC in the tumorous regions is one to three standard deviations larger than the overall breast average THC for all patients.
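
That comparison amounts to a z-score of the tumor-region THC against the whole-breast THC distribution; a minimal sketch, assuming a reconstructed THC map `thc_volume` and a boolean `tumor_mask` (both hypothetical names):

    # Sketch: express mean tumor THC as a z-score relative to the whole-breast THC distribution.
    import numpy as np

    def tumor_thc_zscore(thc_volume: np.ndarray, tumor_mask: np.ndarray) -> float:
        breast_mean = thc_volume.mean()
        breast_std = thc_volume.std()
        tumor_mean = thc_volume[tumor_mask].mean()
        return (tumor_mean - breast_mean) / breast_std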


Subject(s)
Breast Neoplasms/diagnosis; Image Interpretation, Computer-Assisted/standards; Magnetic Resonance Imaging/methods; Magnetic Resonance Imaging/standards; Subtraction Technique/standards; Tomography, Optical/methods; Tomography, Optical/standards; Female; Humans; Image Interpretation, Computer-Assisted/methods; Imaging, Three-Dimensional/methods; Imaging, Three-Dimensional/standards; Reference Standards; Software; United States
3.
Med Image Anal; 10(1): 96-112, 2006 Feb.
Article in English | MEDLINE | ID: mdl-16150629

ABSTRACT

The efficacy of radiation therapy depends on the accuracy of the patient setup at each daily fraction. A significant problem is reproducing, at every fraction of the treatment process, the patient position established during treatment planning. We propose and evaluate an intensity-based automatic registration method using multiple portal images and the pre-treatment CT volume. We perform both geometric and radiometric calibrations to generate high-quality digitally reconstructed radiographs (DRRs) that can be compared against portal images acquired immediately before treatment dose delivery. We use a graphics processing unit (GPU) to generate the DRRs in order to gain computational efficiency. We also perform a comparative study of various similarity measures and optimization procedures. A simple similarity measure such as local normalized correlation (LNC) performs best as long as the radiometric calibration is done carefully. Using the proposed method, we achieved better than 1 mm average repositioning error in a series of phantom studies using two open-field (i.e., 41 cm²) portal images with a 90° vergence angle.
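
Local normalized correlation can be sketched as block-wise normalized cross-correlation averaged over small windows of the DRR and the portal image; the window size and the handling of flat patches below are assumptions, not the paper's exact formulation:

    # Sketch of local normalized correlation (LNC): average the normalized cross-correlation
    # computed over small non-overlapping windows of the DRR and the portal image.
    import numpy as np

    def local_normalized_correlation(drr: np.ndarray, portal: np.ndarray, win: int = 16) -> float:
        h, w = drr.shape
        scores = []
        for y in range(0, h - win + 1, win):
            for x in range(0, w - win + 1, win):
                a = drr[y:y + win, x:x + win].astype(np.float64)
                b = portal[y:y + win, x:x + win].astype(np.float64)
                a -= a.mean()
                b -= b.mean()
                denom = np.sqrt((a * a).sum() * (b * b).sum())
                if denom > 1e-12:                     # skip flat windows
                    scores.append((a * b).sum() / denom)
        return float(np.mean(scores)) if scores else 0.0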


Subject(s)
Image Processing, Computer-Assisted/methods; Posture; Radiotherapy Planning, Computer-Assisted/methods; Tomography, X-Ray Computed; Algorithms; Humans; Phantoms, Imaging; Reproducibility of Results; Statistics as Topic
4.
Stud Health Technol Inform; 98: 397-403, 2004.
Article in English | MEDLINE | ID: mdl-15544314

ABSTRACT

We report on a stereoscopic video-see-through augmented reality system that we developed for medical applications. Our system allows interactive in-situ visualization of 3D medical imaging data. For high-quality rendering of the augmented scene we utilize the capabilities of the latest generation of graphics cards. Fast, high-precision multiplanar reconstruction (MPR) and volume rendering are realized with OpenGL 3D textures. We provide a tracked hand-held tool for interacting with the medical imaging data in its actual location. This tool is represented as a virtual tool in the space of the medical data, and the user can assign different functions to it: selecting arbitrary MPR cross-sections, guiding a locally volume-rendered cube through the medical data, changing the transfer function, and so on. Tracking works in conjunction with retroreflective markers, which either frame the workspace for head tracking or are attached to instruments for tool tracking. We use a single head-mounted tracking camera that is rigidly fixed to the stereo pair of cameras providing the live video view of the real scene. The user's spatial perception is based on stereo depth cues as well as on the kinetic depth cues received through viewpoint variation and interactive data visualization. The AR system achieves compelling real-time performance at 30 stereo frames per second and exhibits no time lag between the video images and the augmenting graphics. Thus, the physician can interactively explore the medical imaging information in situ.
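
A hedged CPU-side sketch of the MPR step (the paper uses OpenGL 3D textures; here trilinear interpolation with scipy stands in, and the plane parameterization is an assumption):

    # Sketch: extract an oblique multiplanar reconstruction (MPR) from a 3D volume by sampling
    # a plane defined by a center point and two in-plane axes (CPU stand-in for 3D-texture lookup).
    import numpy as np
    from scipy.ndimage import map_coordinates

    def extract_mpr(volume, center, u_axis, v_axis, size=256, spacing=1.0):
        u = np.asarray(u_axis, float); u /= np.linalg.norm(u)
        v = np.asarray(v_axis, float); v /= np.linalg.norm(v)
        s = (np.arange(size) - size / 2.0) * spacing
        uu, vv = np.meshgrid(s, s, indexing="ij")
        # Plane points in voxel coordinates: center + uu*u + vv*v, shape (size, size, 3).
        pts = np.asarray(center, float) + uu[..., None] * u + vv[..., None] * v
        coords = np.moveaxis(pts, -1, 0)                 # shape (3, size, size) for map_coordinates
        return map_coordinates(volume, coords, order=1, mode="constant", cval=0.0)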


Subject(s)
Diagnostic Imaging; User-Computer Interface; Ultrasonography
5.
Stud Health Technol Inform; 85: 455-60, 2002.
Article in English | MEDLINE | ID: mdl-15458132

ABSTRACT

We have developed an augmented reality visualization system that helps the physician perform ultrasound-guided needle biopsies. For a needle biopsy, the needle has to be inserted into an anatomical target to remove a tissue sample. Ultrasound guidance is routinely used, e.g., for breast needle biopsies; the real-time ultrasound images allow the physician to locate the target and to monitor the needle position. Our system uses a combination of an optical laser guide and a virtual guide in the augmented image to provide intuitive guidance for needle placement. There is no need to track the needle, i.e., no need to instrument the needle for tracking. In phantom tests, users performed well with the system without prior training. This paper describes the special features of our system and the workflow of the needle placement procedure.


Subject(s)
Biopsy, Needle/instrumentation; Computer Simulation; Diagnosis, Computer-Assisted/instrumentation; Surgery, Computer-Assisted/instrumentation; Ultrasonography/instrumentation; User-Computer Interface; Breast/pathology; Computer Systems; Female; Humans; Imaging, Three-Dimensional/instrumentation; Mathematical Computing; Models, Anatomic; Transducers; Ultrasonography, Mammary/instrumentation
6.
Stud Health Technol Inform; 94: 151-7, 2003.
Article in English | MEDLINE | ID: mdl-15455881

ABSTRACT

A navigation system can increase the speed and accuracy of MR-guided interventions that use scanners with closed high-field magnets. We report on first needle placement experiments performed with an Augmented Reality (AR) navigation system. AR visualization provides very intuitive guidance, resulting in a faster procedure. The accuracy of the needle placement depends on the registration accuracy of the system. In the present trials, the needle was placed to within 1 mm of the target center; however, in a small number of cases substantially larger errors occurred, most likely caused by needle bending.


Subject(s)
Biopsy, Needle/methods; Computer Simulation; Magnetic Resonance Imaging/methods; User-Computer Interface; Humans
7.
Int J Radiat Oncol Biol Phys; 76(5): 1592-8, 2010 Apr.
Article in English | MEDLINE | ID: mdl-20133069

ABSTRACT

PURPOSE: To develop and clinically demonstrate the use of on-line real-time megavoltage (MV) fluoroscopy for gated treatment delivery verification. METHODS AND MATERIALS: Megavoltage fluoroscopy (MVF) image sequences were acquired using a flat panel equipped for MV cone-beam CT in synchrony with the respiratory signal obtained from the Anzai gating device. The MVF images can be obtained immediately before or during gated treatment delivery. A prototype software tool (named RTReg4D) was developed to register MVF images with phase-sequenced digitally reconstructed radiograph images generated from the treatment planning system based on four-dimensional CT. The image registration can be used to reposition the patient before or during treatment delivery. To demonstrate the reliability and clinical usefulness, the system was first tested using a thoracic phantom and then prospectively in actual patient treatments under an institutional review board-approved protocol. RESULTS: The quality of the MVF images for lung tumors is adequate for image registration with phase-sequenced digitally reconstructed radiographs. The MVF was found to be useful for monitoring inter- and intrafractional variations of tumor positions. With the planning target volume contour displayed on the MVF images, the system can verify whether the moving target stays within the planning target volume margin during gated delivery. CONCLUSIONS: The use of MVF images was found to be clinically effective in detecting discrepancies in tumor location before and during respiration-gated treatment delivery. The tools and process developed can be useful for gated treatment delivery verification.


Subject(s)
Fluoroscopy/methods; Phantoms, Imaging; Radiotherapy Planning, Computer-Assisted/methods; Respiratory-Gated Imaging Techniques/methods; Cone-Beam Computed Tomography; Four-Dimensional Computed Tomography; Humans; Lung Neoplasms/diagnostic imaging; Lung Neoplasms/radiotherapy; Prospective Studies; Respiration; Respiratory-Gated Imaging Techniques/instrumentation; Software; Spine/diagnostic imaging
8.
Article in English | MEDLINE | ID: mdl-20426135

ABSTRACT

We propose a novel method to detect the current state of a quasi-periodic system from image sequences, which in turn enables us to synchronize or gate the image sequences so as to obtain images of the organ system at similar configurations. The method uses the cumulated phase shift in the spectral domain of successive image frames as a measure of the net motion of objects in the scene. The proposed method is applicable to 2D and 3D time-varying sequences and is not specific to the imaging modality. We demonstrate its effectiveness on X-ray angiographic sequences and on cardiac and liver ultrasound sequences. Knowledge of the current (cardiac or respiratory) phase of the system opens up the possibility of a purely image-based cardiac and respiratory gating scheme for interventional and radiotherapy procedures.
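
A minimal sketch of the idea of accumulating an inter-frame spectral phase shift into a gating signal; this is an illustrative stand-in, not the published algorithm, and the power weighting is an assumption:

    # Sketch: derive a 1-D gating signal from an image sequence by accumulating the mean
    # phase shift between the Fourier spectra of successive frames (weighted by spectral power).
    import numpy as np

    def gating_signal(frames):
        """frames: sequence of 2-D images; returns one cumulative-motion value per frame."""
        signal = [0.0]
        prev = np.fft.fft2(frames[0])
        for frame in frames[1:]:
            cur = np.fft.fft2(frame)
            cross = cur * np.conj(prev)
            weights = np.abs(cross)
            # Power-weighted average of the inter-frame phase shift = net-motion estimate.
            shift = np.sum(np.abs(np.angle(cross)) * weights) / (np.sum(weights) + 1e-12)
            signal.append(signal[-1] + shift)
            prev = cur
        return np.array(signal)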


Subject(s)
Algorithms; Artificial Intelligence; Cardiac-Gated Imaging Techniques/methods; Image Enhancement/methods; Image Interpretation, Computer-Assisted/methods; Oscillometry/methods; Humans; Reproducibility of Results; Respiratory-Gated Imaging Techniques; Sensitivity and Specificity
9.
Med Image Comput Comput Assist Interv; 12(Pt 1): 828-36, 2009.
Article in English | MEDLINE | ID: mdl-20426065

ABSTRACT

In this paper we propose a novel similarity metric and a method for deformable registration of two images for a specific clinical application. The basic assumption in almost all deformable registration approaches is that explicit correspondences exist between pixels across the two images. This principle is used to design image (dis)similarity metrics such as the sum of squared differences (SSD) or mutual information (MI). The assumption is strongly violated, for instance, within specific regions of images of the abdominal or pelvic region of a patient taken at two different time points. Nevertheless, some clinical applications require a smooth deformation field for all regions of the image, including the boundaries of such regions. In this paper, we propose a deformable registration method that utilizes a priori intensity distributions of the regions delineated on one of the images to devise a new similarity measure that varies across regions of the image and establishes a smooth and robust deformation field. We present validation results of the proposed method in mapping bladder, prostate, and rectum contours of computed tomography (CT) volumes of 10 patients acquired for prostate cancer radiotherapy treatment planning and verification.
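
One way to make a similarity term region-dependent, sketched here under the assumption of per-region intensity histograms; this is an illustration only, not the authors' exact metric:

    # Sketch: a region-dependent (dis)similarity term that scores each voxel of the deformed image
    # against the a priori intensity histogram of the region it falls in on the delineated image.
    # The histogram binning and the label map are assumed inputs.
    import numpy as np

    def region_likelihood_cost(moving_warped, region_labels, region_hists, bin_edges):
        """moving_warped: deformed image; region_labels: integer label per voxel;
        region_hists[label]: normalized intensity histogram (length len(bin_edges)-1)."""
        bins = np.clip(np.digitize(moving_warped, bin_edges) - 1, 0, len(bin_edges) - 2)
        cost = 0.0
        for label, hist in region_hists.items():
            mask = region_labels == label
            probs = hist[bins[mask]] + 1e-12
            cost += -np.log(probs).sum()          # negative log-likelihood inside this region
        return cost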


Subject(s)
Imaging, Three-Dimensional/methods; Pattern Recognition, Automated/methods; Prostatic Neoplasms/diagnostic imaging; Prostatic Neoplasms/radiotherapy; Radiotherapy, Computer-Assisted/methods; Subtraction Technique; Tomography, X-Ray Computed/methods; Algorithms; Artificial Intelligence; Humans; Male; Radiographic Image Enhancement/methods; Radiographic Image Interpretation, Computer-Assisted/methods; Radiotherapy, Conformal/methods; Reproducibility of Results; Sensitivity and Specificity
10.
Article in English | MEDLINE | ID: mdl-20425965

ABSTRACT

This paper describes a novel method for improving the navigation and guidance of devices and catheters in electrophysiology and interventional cardiology procedures using volumetric data fusion. The clinical workflow includes the acquisition and reconstruction of CT data from a C-arm X-ray angiographic system and the real-time acquisition of volumetric ultrasound datasets with a new intracardiac real-time 3D ultrasound catheter. Mono- and multi-modal volumetric registration methods, as well as visualization modes suitable for real-time fusion, are described; these are the key components of this work. Evaluation on phantom and in vivo animal data shows that it is feasible to register and track the motion of real-time 3D intracardiac ultrasound in C-arm CT.


Subject(s)
Catheter Ablation/methods; Echocardiography, Three-Dimensional/methods; Electrophysiologic Techniques, Cardiac/methods; Subtraction Technique; Surgery, Computer-Assisted/methods; Tomography, X-Ray Computed/methods; Algorithms; Computer Systems; Humans; Phantoms, Imaging
11.
Med Image Anal; 12(5): 577-85, 2008 Oct.
Article in English | MEDLINE | ID: mdl-18650121

ABSTRACT

The fusion of tracked ultrasound with CT has benefits for a variety of clinical applications; however, extensive manual effort is usually required for correct registration. We developed new methods that allow one to simulate medical ultrasound from CT in real time, reproducing the majority of ultrasonic imaging effects. They are combined with a robust similarity measure that assesses the correlation of a combination of signals extracted from CT with ultrasound, without knowing the influence of each signal. This serves as the foundation of a fully automatic registration that aligns a 3D ultrasound sweep with the corresponding tomographic modality using a rigid or an affine transformation model, without any manual interaction. These techniques were evaluated in a study involving 25 patients with indeterminate lesions in the liver and kidney. The clinical setup, acquisition, and registration workflow are described, along with an evaluation of the registration accuracy with respect to physician-defined ground truth. Our new algorithm registers correctly, without any manual interaction, in 76% of the cases; the average RMS target registration error (TRE) over multiple target lesions throughout the liver is 8.1 mm.
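
One way to read "correlation with an unknown combination of CT-derived signals" is a patch-wise least-squares fit scored by its coefficient of determination; the sketch below is that interpretation only, with the patch size and the choice of CT signals (intensity and gradient magnitude) as assumptions:

    # Sketch: within each patch, fit ultrasound intensities as a least-squares linear combination
    # of CT intensity and CT gradient magnitude, and score the fit by its R^2 value.
    import numpy as np

    def combined_signal_similarity(us, ct, ct_grad, patch=16):
        h, w = us.shape
        scores, weights = [], []
        for y in range(0, h - patch + 1, patch):
            for x in range(0, w - patch + 1, patch):
                u = us[y:y + patch, x:x + patch].ravel().astype(np.float64)
                A = np.column_stack([ct[y:y + patch, x:x + patch].ravel(),
                                     ct_grad[y:y + patch, x:x + patch].ravel(),
                                     np.ones(u.size)])
                coef, *_ = np.linalg.lstsq(A, u, rcond=None)
                resid = u - A @ coef
                var = np.var(u) * u.size
                if var > 1e-12:
                    scores.append(1.0 - resid @ resid / var)   # patch-wise goodness of fit
                    weights.append(var)
        return float(np.average(scores, weights=weights)) if scores else 0.0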


Subject(s)
Image Interpretation, Computer-Assisted/methods; Kidney Neoplasms/diagnosis; Kidney Neoplasms/surgery; Liver Neoplasms/diagnosis; Liver Neoplasms/surgery; Surgery, Computer-Assisted/methods; Tomography, X-Ray Computed/methods; Ultrasonography/methods; Algorithms; Artificial Intelligence; Humans; Pattern Recognition, Automated/methods
12.
Med Image Comput Comput Assist Interv; 11(Pt 2): 668-75, 2008.
Article in English | MEDLINE | ID: mdl-18982662

ABSTRACT

Despite rapid advances in interventional imaging, navigating a guide wire through the abdominal vasculature remains a difficult task, and not only for novice radiologists. Since this navigation is mostly based on 2D fluoroscopic image sequences from a single view, the process is slowed down significantly by missing depth information and patient motion. We propose a novel approach for 3D dynamic roadmapping in deformable regions by predicting the location of the guide wire tip in a 3D vessel model from the tip's 2D location, respiratory motion analysis, and the view geometry. In a first step, the method compensates for the apparent respiratory motion in 2D space before backprojecting the 2D guide wire tip into three-dimensional space using a given projection matrix. To counteract errors in the projection parameters and the motion compensation, as well as the ambiguity caused by vessel deformation, we establish a statistical framework that computes a reliable estimate of the guide wire tip location within the 3D vessel model. With this 2D-to-3D transfer, the navigation can be performed from arbitrary viewing angles, independent of the static perspective view of the fluoroscopic sequence. Tests on a realistic breathing phantom and on synthetic data with a known ground truth clearly reveal the superiority of our approach over naive methods for 3D roadmapping. The concepts and information presented in this paper are based on research and are not commercially available.
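
The geometric core of the backprojection step can be sketched as intersecting the ray defined by the 2D tip and the projection matrix with the vessel model; motion compensation and the statistical estimation described above are omitted, and the nearest-centerline-point rule is an assumption:

    # Sketch: backproject a 2-D guide-wire tip into the 3-D ray defined by a 3x4 projection
    # matrix P, then pick the 3-D vessel-centerline point closest to that ray.
    import numpy as np

    def backproject_tip(P, tip_uv, centerline_pts):
        M, p4 = P[:, :3], P[:, 3]
        cam_center = -np.linalg.solve(M, p4)                      # X-ray source position
        direction = np.linalg.solve(M, np.array([tip_uv[0], tip_uv[1], 1.0]))
        direction /= np.linalg.norm(direction)
        # Distance of each centerline point to the backprojection ray.
        diff = centerline_pts - cam_center
        along = diff @ direction
        perp = diff - np.outer(along, direction)
        dists = np.linalg.norm(perp, axis=1)
        return centerline_pts[np.argmin(dists)]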


Subject(s)
Abdomen/blood supply; Angiography/methods; Catheterization/methods; Imaging, Three-Dimensional/methods; Pattern Recognition, Automated/methods; Radiography, Abdominal/methods; Radiography, Interventional/methods; Algorithms; Humans; Radiographic Image Enhancement/methods; Radiographic Image Interpretation, Computer-Assisted/methods; Reproducibility of Results; Sensitivity and Specificity
13.
Med Image Comput Comput Assist Interv; 10(Pt 1): 136-43, 2007.
Article in English | MEDLINE | ID: mdl-18051053

ABSTRACT

The fusion of 3D freehand ultrasound with CT and CTA has benefits for a variety of clinical applications; however, considerable manual work is usually required for correct registration. We developed new methods that allow one to simulate medical ultrasound from CT in real time, reproducing the majority of ultrasonic imaging effects. The second novelty is a robust similarity measure that assesses the correlation of a combination of multiple signals extracted from CT with ultrasound, without knowing the influence of each signal. This serves as the foundation of a fully automatic registration that aligns a freehand ultrasound sweep with the corresponding 3D modality using a rigid or an affine transformation model, without any manual interaction. We also present the initialization and the global and local parameter optimization schemes used, together with a validation on abdominal CTA and ultrasound imaging of 10 patients.


Subject(s)
Artificial Intelligence; Image Interpretation, Computer-Assisted/methods; Models, Biological; Pattern Recognition, Automated/methods; Subtraction Technique; Tomography, X-Ray Computed/methods; Ultrasonography/methods; Algorithms; Computer Simulation; Image Enhancement/methods; Imaging, Three-Dimensional/methods; Reproducibility of Results; Sensitivity and Specificity
14.
Article in English | MEDLINE | ID: mdl-17354978

ABSTRACT

We present an intensity-based deformable registration algorithm for 3D ultrasound data. The proposed method uses a variational approach and combines the characteristics of a multilevel algorithm with the properties of ultrasound data in order to provide a fast and accurate deformable registration method. In contrast to previously proposed approaches, we use no feature points and no interpolation technique, but compute a dense displacement field directly. We demonstrate that this approach, although it involves solving large PDE systems, reduces the computation time if implemented using efficient numerical techniques. The performance of the algorithm is tested on multiple 3D ultrasound images of the liver. Validation is performed by simulations, similarity comparisons between original and deformed images, visual inspection of the displacement fields, and visual assessment of the deformed images by physicians.
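
As a loose illustration of computing a dense displacement field directly by iterative descent plus smoothing, a simplified demons-style stand-in is sketched below; this is not the paper's multilevel variational PDE solver, and the step size and smoothing width are assumptions:

    # Simplified sketch of dense intensity-based registration: alternate a gradient-descent
    # step on the SSD term with Gaussian smoothing of the displacement field as regularization.
    import numpy as np
    from scipy.ndimage import gaussian_filter, map_coordinates

    def register_dense(fixed, moving, iters=50, step=0.5, sigma=2.0):
        """fixed, moving: 3-D volumes of equal shape; returns a per-axis displacement field."""
        grid = np.indices(fixed.shape).astype(np.float64)
        disp = np.zeros_like(grid)                               # one displacement component per axis
        for _ in range(iters):
            warped = map_coordinates(moving, grid + disp, order=1, mode="nearest")
            diff = warped - fixed
            grads = np.gradient(warped)
            for d in range(fixed.ndim):
                disp[d] -= step * diff * grads[d]                # descend on 0.5*||warped - fixed||^2
                disp[d] = gaussian_filter(disp[d], sigma)        # diffusion-like regularization
        return disp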


Subject(s)
Image Enhancement/methods; Image Interpretation, Computer-Assisted/methods; Imaging, Three-Dimensional/methods; Liver/diagnostic imaging; Pattern Recognition, Automated/methods; Subtraction Technique; Ultrasonography/methods; Algorithms; Artificial Intelligence; Feasibility Studies; Humans; Reproducibility of Results; Sensitivity and Specificity
15.
Radiology; 240(1): 230-5, 2006 Jul.
Article in English | MEDLINE | ID: mdl-16720866

ABSTRACT

The purpose of this study was to evaluate the feasibility and performance of an augmented reality (AR) visualization prototype for virtual computed tomography (CT)-guided interventional procedures in a multimodality abdominal phantom. With the aid of AR guidance, three radiologists performed 30 attempts at targeting simulated liver lesions of different sizes (range, 5-15 mm) with a biopsy needle. The position of the needle tip relative to the lesion was verified by using ultrasonography and CT. With AR guidance, lesions were successfully targeted with the first needle pass in all cases. On the basis of these results, AR visualization for CT-guided intervention appears feasible and allows intuitive and accurate lesion targeting in a phantom. SUPPLEMENTAL MATERIAL: radiology.rsnajnls.org/cgi/content/full/2401040018/DC1


Subject(s)
Biopsy, Needle/methods; Tomography, X-Ray Computed/instrumentation; User-Computer Interface; Abdomen; Calibration; Feasibility Studies; Humans; Image Processing, Computer-Assisted; Liver/diagnostic imaging; Phantoms, Imaging; Tomography, X-Ray Computed/methods
16.
Radiology; 238(2): 497-504, 2006 Feb.
Article in English | MEDLINE | ID: mdl-16436814

ABSTRACT

PURPOSE: To evaluate an augmented reality (AR) system in combination with a 1.5-T closed-bore magnetic resonance (MR) imager as a navigation tool for needle biopsies. MATERIALS AND METHODS: The experimental protocol had institutional animal care and use committee approval. Seventy biopsies were performed in phantoms by using 20 tube targets, each with a diameter of 6 mm, and 50 virtual targets. The position of the needle tip in AR and MR space was compared in multiple imaging planes, and virtual and real needle tip localization errors were calculated. Ten AR-guided biopsies were performed in three pigs, and the duration of each procedure was determined. After successful puncture, the distance to the target was measured on MR images. The confidence limits for the achieved in-plane hit rate and for lateral deviation were calculated. A repeated measures analysis of variance was used to determine whether the placement error in a particular dimension (x, y, or z) differed from the others. RESULTS: For the 50 virtual targets, a mean error of 1.1 mm ± 0.5 (standard deviation) was calculated. A repeated measures analysis of variance indicated no statistically significant difference (P > .99) in the errors in any particular orientation. For the real targets, all punctures were inside the 6-mm-diameter tube in the transverse plane. The needle depth was within the target plane in 11 biopsy procedures; the mean distance to the center of the target was 2.55 mm (95% confidence interval: 1.77 mm, 3.34 mm). For nine biopsy procedures, the needle tip was outside the target plane, with a mean distance to the edge of the target plane of 1.5 mm (range, 0.07-3.46 mm). In the animal experiments, the puncture was successful in all 10 cases, with a mean target-needle distance of 9.6 mm ± 4.85. The average procedure time was 18 minutes per puncture. CONCLUSION: Biopsy procedures performed with a combination of a closed-bore MR system and an AR system are feasible and accurate.


Subject(s)
Biopsy, Needle/methods; Magnetic Resonance Imaging; Animals; Models, Animal; Swine
17.
Article in English | MEDLINE | ID: mdl-16685944

ABSTRACT

This paper introduces a novel method for ultrasound calibration of both spatial and temporal parameters. The main advantage of this method is that it does not require a phantom, which is usually expensive to fabricate. Furthermore, the method does not require extensive image processing. For spatial calibration, we solve an optimization problem established by a set of equations that relate the orientations of a line (i.e., a calibration pointer) to the intersection points appearing in the ultrasound image. The line orientation is obtained through calibration of both ends of the calibration pointer. Temporal calibration is achieved by processing the captured pointer orientations and the corresponding image positions of the intersections, along with the timing information. The effectiveness of the unified method for both spatial and temporal calibration is apparent from the quality of the 3D reconstructions of a known object.
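
A hedged sketch of the kind of optimization involved in the spatial calibration: solve for an image-to-probe transform (and pixel scales) that places each segmented intersection pixel on the tracked pointer line. The parameterization, pose conventions, and pixel-scale model below are assumptions, not the paper's formulation:

    # Sketch: least-squares spatial calibration from point-on-line constraints.
    import numpy as np
    from scipy.optimize import least_squares
    from scipy.spatial.transform import Rotation

    def point_to_line_dist(p, a, d):
        """Distance from point p to the line through a with unit direction d."""
        return np.linalg.norm((p - a) - np.dot(p - a, d) * d)

    def calibrate(pixels, probe_poses, line_points, line_dirs):
        """pixels: (N,2) intersection pixels; probe_poses: (N,4,4) tracker<-probe transforms;
        line_points/line_dirs: (N,3) pointer lines in tracker space."""
        def residuals(x):
            rot, trans, scale = Rotation.from_rotvec(x[:3]), x[3:6], x[6:8]
            res = []
            for (u, v), T, a, d in zip(pixels, probe_poses, line_points, line_dirs):
                p_img = np.array([scale[0] * u, scale[1] * v, 0.0])   # pixel -> image plane (mm)
                p_probe = rot.apply(p_img) + trans                    # image -> probe
                p_trk = T[:3, :3] @ p_probe + T[:3, 3]                # probe -> tracker
                res.append(point_to_line_dist(p_trk, a, d / np.linalg.norm(d)))
            return np.array(res)
        x0 = np.array([0, 0, 0, 0, 0, 0, 0.1, 0.1])                   # rotvec, translation, scales
        return least_squares(residuals, x0).x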


Subject(s)
Algorithms; Equipment Failure Analysis/methods; Image Enhancement/methods; Image Interpretation, Computer-Assisted/methods; Imaging, Three-Dimensional/methods; Ultrasonography/methods; Calibration; Image Enhancement/standards; Phantoms, Imaging; Reproducibility of Results; Sensitivity and Specificity; Time Factors; Ultrasonography/standards