Results 1 - 20 of 73
1.
Medicine (Baltimore) ; 65(1): 46-55, 1986 Jan.
Article in English | MEDLINE | ID: mdl-3484536

ABSTRACT

The detection of anti-cardiolipin antibodies and anti-DNA antibody idiotypes has shown utility in a prospective assessment of 42 lupus patients over a 1-year study period. However, so broad is the range of clinical and serological features included in the diagnostic category of SLE that even a combination of tests will often inadequately reflect disease activity at a given time. For the foreseeable future the value of laboratory investigations will probably lie in supporting clinical judgment of the nature of a patient's illness and the severity of the target organ's dysfunction.


Subject(s)
Antibodies, Anti-Idiotypic/immunology; Cardiolipins/immunology; DNA/immunology; Lupus Erythematosus, Systemic/immunology; Adolescent; Adult; Humans; Immunoglobulin G/immunology; Immunoglobulin Idiotypes/immunology; Immunoglobulin M/immunology; Middle Aged
2.
J Nucl Med ; 36(8): 1476-88, 1995 Aug.
Article in English | MEDLINE | ID: mdl-7629598

ABSTRACT

UNLABELLED: We compared nine scatter correction methods based on spectral analysis which process SPECT projections. METHODS: Monte Carlo simulation was used to generate histories of photons emitted from a realistic 99mTc phantom. A particular projection was considered. Information regarding the history, location and energy of the photons detected in this projection was analyzed to test the assumptions underlying each scatter correction method. Relative and absolute quantification and signal-to-noise ratio were assessed for each scatter corrected image. RESULTS: For the simulated data, two methods do not enable activity quantification. Among the methods requiring some parameters to be calibrated, the dual-energy window method shows the best compromise between accuracy and ease of implementation but introduces a bias in relative quantification. In this respect, a triple-energy window technique is more accurate than the dual-window method. A factor analysis approach results in more stable quantitative accuracy (error approximately 10%) for a wide range of activity but requires a more sophisticated acquisition mode (30 energy windows). CONCLUSION: These results show that a scatter correction method using spectral analysis can be used to substantially improve accurate quantification.


Subject(s)
Monte Carlo Method; Tomography, Emission-Computed, Single-Photon; Humans; Models, Structural; Scattering, Radiation; Signal Processing, Computer-Assisted; Technetium
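As a concrete illustration of the window-based scatter corrections compared in the study above, the sketch below implements generic dual-energy window (DEW) and triple-energy window (TEW) estimates. The scaling factor k, the window widths, and the array sizes are illustrative assumptions, not parameters taken from the paper.

```python
# Minimal sketch of generic DEW and TEW scatter estimates (assumed parameters).
import numpy as np

def dew_correct(peak_counts, lower_counts, k=0.5):
    """DEW: scatter in the photopeak modelled as k times the counts in a
    single lower (Compton) window."""
    return np.clip(peak_counts - k * lower_counts, 0.0, None)

def tew_correct(peak_counts, left_counts, right_counts,
                w_peak=20.0, w_left=3.0, w_right=3.0):
    """TEW: scatter under the photopeak approximated by the trapezoid spanned
    by two narrow windows flanking it (widths in keV)."""
    scatter = (left_counts / w_left + right_counts / w_right) * w_peak / 2.0
    return np.clip(peak_counts - scatter, 0.0, None)

# Toy usage on a single 64 x 64 projection of simulated counts
rng = np.random.default_rng(0)
peak = rng.poisson(100, (64, 64)).astype(float)
low = rng.poisson(40, (64, 64)).astype(float)
primary_estimate = dew_correct(peak, low)
```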
3.
J Nucl Med ; 29(12): 1971-9, 1988 Dec.
Article in English | MEDLINE | ID: mdl-3264020

ABSTRACT

The detection of scattered radiation is recognized as one of the major sources of error in single photon emission computed tomography (SPECT). In this work three scatter correction techniques have been assessed and compared. Scatter coefficients and parameters characteristic of each technique have been calculated through Monte Carlo simulations and experimentally measured for various source geometries. Their dependence on the source/matter distribution and their spatial non-stationarity have been described. Each of the three scatter correction methods has then been tested on several SPECT phantom studies. The three methods provided comparable results. Following scatter compensation, both image quality and quantitative accuracy improved. In particular a slight improvement in spatial resolution and a statistically significant increase in cold lesion contrast, hot lesion recovery coefficient, and signal/noise ratio have been demonstrated with all methods.


Subject(s)
Tomography, Emission-Computed/methods; Models, Structural; Monte Carlo Method; Scattering, Radiation
4.
J Nucl Med ; 41(8): 1400-8, 2000 Aug.
Article in English | MEDLINE | ID: mdl-10945534

ABSTRACT

UNLABELLED: We determined the relative effect of corrections for scatter, depth-dependent collimator response, attenuation, and finite spatial resolution on various image characteristics in cardiac SPECT. METHODS: Monte Carlo simulations and real acquisition of a 99mTc cardiac phantom were performed under comparable conditions. Simulated and acquired data were reconstructed using several correction schemes that combined different methods for scatter correction (3 methods), depth-dependent collimator response correction (frequency-distance principle), attenuation correction (nonuniform Chang correction or within an iterative reconstruction algorithm), and finite spatial resolution correction (use of recovery coefficients). Five criteria were considered to assess the effect of the processing schemes: bull's-eye map (BEM) uniformity, contrast between the left ventricle (LV) wall and the LV cavity, spatial resolution, signal-to-noise ratio (SNR), and percent errors with respect to the known LV wall and liver activities. RESULTS: Similar results were obtained for the simulated and acquired data. Scatter correction significantly improved contrast and absolute quantitation but did not have noticeable effects on BEM uniformity or on spatial resolution and reduced the SNR. Correction for the depth-dependent collimator response improved spatial resolution from 13.3 to 9.5 mm in the LV region, improved absolute quantitation and contrast, but reduced the SNR. Correcting for attenuation was essential for restoring BEM uniformity (78% and 89% without and with attenuation correction, respectively [ideal value being 100%]) and accurate absolute activity quantitation (errors in estimated LV wall and liver activity decreased from 90% without attenuation correction to approximately 20% with attenuation correction only). Although accurate absolute activity quantitation was achieved in the liver using scatter and attenuation corrections only, correction for finite spatial resolution was needed to estimate LV wall activity within 10%. CONCLUSION: The respective effects of corrections for scatter, depth-dependent collimator response, attenuation, and finite spatial resolution on different image features in cardiac SPECT were quantified for a specific acquisition configuration. These results give indications regarding the improvements to be expected when using a specific processing scheme involving some or all corrections.


Subject(s)
Heart/diagnostic imaging; Phantoms, Imaging; Tomography, Emission-Computed, Single-Photon/methods; Humans; Image Processing, Computer-Assisted; Liver/metabolism; Radiopharmaceuticals/pharmacokinetics; Reproducibility of Results; Scattering, Radiation; Tissue Distribution; Tomography, Emission-Computed, Single-Photon/instrumentation
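The nonuniform Chang correction named in the abstract above is a first-order, post-reconstruction attenuation correction: each pixel of the reconstructed slice is divided by its attenuation factor averaged over projection angles. The sketch below is a minimal 2D illustration of that idea, using rotation-based line integrals over an attenuation map; the pixel size, angle count, and interpolation settings are assumptions, and the published implementation may differ.

```python
# Minimal 2D sketch of a first-order (Chang-type) attenuation correction.
import numpy as np
from scipy.ndimage import rotate

def chang_correction_map(mu_map, n_angles=64, pixel_cm=0.4):
    """Per-pixel correction factors 1 / mean_over_angles(exp(-integral of mu))."""
    att_sum = np.zeros_like(mu_map, dtype=float)
    for theta in np.linspace(0.0, 360.0, n_angles, endpoint=False):
        mu_rot = rotate(mu_map, theta, reshape=False, order=1, mode="constant")
        # line integral of mu from each pixel to the image edge (detector side)
        integral = np.cumsum(mu_rot, axis=0) * pixel_cm
        att_sum += rotate(np.exp(-integral), -theta, reshape=False,
                          order=1, mode="nearest")
    return 1.0 / np.clip(att_sum / n_angles, 1e-6, None)

# usage: corrected_slice = reconstructed_slice * chang_correction_map(mu_map)
```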
5.
J Nucl Med ; 42(12): 1737-46, 2001 Dec.
Article in English | MEDLINE | ID: mdl-11752068

ABSTRACT

UNLABELLED: The use of H(2)(15)O PET scans for the measurement of myocardial perfusion reserve (MPR) has been validated in both animal models and humans. Nevertheless, this protocol requires cumbersome acquisitions such as C(15)O inhalation or (18)F-FDG injection to obtain images suitable for determining myocardial regions of interest. Regularized factor analysis is an alternative method proposed to define myocardial contours directly from H(2)(15)O studies without any C(15)O or FDG scan. The study validates this method by comparing the MPR obtained by the regularized factor analysis with the coronary flow reserve (CFR) obtained by intracoronary Doppler as well as with the MPR obtained by an FDG acquisition. METHODS: Ten healthy volunteers and 10 patients with ischemic cardiopathy or idiopathic dilated cardiomyopathy were investigated. The CFR of patients was measured sonographically using a Doppler catheter tip placed into the proximal left anterior descending artery. The mean velocity was recorded at baseline and after dipyridamole administration. All subjects underwent PET imaging, including 2 H(2)(15)O myocardial perfusion studies at baseline and after dipyridamole infusion, followed by an FDG acquisition. Dynamic H(2)(15)O scans were processed by regularized factor analysis. Left ventricular cavity and anteroseptal myocardial regions of interest were drawn independently on regularized factor images and on FDG images. Myocardial blood flow (MBF) and MPR were estimated by fitting the H(2)(15)O time-activity curves with a compartmental model. RESULTS: In patients, no significant difference was observed among the 3 methods of measurement-Doppler CFR, 1.73 +/- 0.57; regularized factor analysis MPR, 1.71 +/- 0.68; FDG MPR, 1.83 +/- 0.49-using a Friedman 2-way ANOVA by ranks. MPR measured with the regularized factor images correlated significantly with CFR (y = 1.17x - 0.30; r = 0.97). In the global population, the regularized factor analysis MPR and FDG MPR correlated strongly (y = 0.99x; r = 0.93). Interoperator repeatability on regularized factor images was 0.126 mL/min/g for rest MBF, 0.38 mL/min/g for stress MBF, and 0.34 for MPR (19% of mean MPR). CONCLUSION: Regularized factor analysis provides well-defined myocardial images from H(2)(15)O dynamic scans, permitting an accurate and simple measurement of MPR. The method reduces exposure to radiation and examination time and lowers the cost of MPR protocols using a PET scanner.


Subject(s)
Cardiomyopathy, Dilated/diagnostic imaging; Coronary Circulation/physiology; Heart/diagnostic imaging; Tomography, Emission-Computed; Analysis of Variance; Cardiomyopathy, Dilated/physiopathology; Case-Control Studies; Echocardiography, Doppler; Factor Analysis, Statistical; Female; Fluorodeoxyglucose F18; Humans; Male; Middle Aged; Oxygen Radioisotopes; Radiopharmaceuticals; Tomography, Emission-Computed/methods; Water
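The MBF values behind the MPR figures above come from fitting H(2)(15)O time-activity curves with a compartmental model. The sketch below shows the standard single-tissue water model, Ct(t) = MBF * Ca(t) convolved with exp(-(MBF/p)t), fitted for MBF; the partition coefficient p, the discrete convolution, and the absence of spillover and blood-volume terms are simplifying assumptions rather than the authors' exact model.

```python
# Sketch of a single-tissue compartment fit for water-PET MBF (assumed model).
import numpy as np
from scipy.optimize import curve_fit

def tissue_curve(t, mbf, p, ca, dt):
    """MBF (ml/min/g) times Ca convolved with a monoexponential impulse
    response; t in minutes, dt is the frame spacing in minutes."""
    irf = np.exp(-(mbf / p) * t)
    return mbf * np.convolve(ca, irf)[: len(t)] * dt

def fit_mbf(t, ca, ct, dt, p=0.91):
    """Least-squares fit of MBF to a measured tissue curve ct."""
    popt, _ = curve_fit(lambda tt, mbf: tissue_curve(tt, mbf, p, ca, dt),
                        t, ct, p0=[1.0], bounds=(0.01, 6.0))
    return float(popt[0])

# Myocardial perfusion reserve as the stress/rest flow ratio:
# mpr = fit_mbf(t, ca_stress, ct_stress, dt) / fit_mbf(t, ca_rest, ct_rest, dt)
```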
6.
J Nucl Med ; 23(11): 984-7, 1982 Nov.
Article in English | MEDLINE | ID: mdl-6982315

ABSTRACT

A method for estimation of organ volume is proposed, based on analysis of individual slices obtained from SPET images. In a phantom simulating clinical circumstances, the data show that a threshold level at 46% of the maximum activity most closely predicts the true volume over a wide range above one liter. A level of 45% better predicted volumes of less than one liter. For phantoms of 839 ml or less, the error was 6.3 ml (one standard error of estimation). This level seems to be independent of the plane or position of the phantom and also independent of the amount of scattering material around it. Nonradioactive voids ("holes") within a phantom may be included or excluded at will, provided their edges are not tangent to the edge of the phantom. When they are tangent, their edges are not distinguishable from the edge of the phantom and their volumes are excluded. Knowledge of organ volumes has both diagnostic and therapeutic importance and could lead to more precise quantitation of the total radioactivity contained in an organ or space.


Subject(s)
Tomography, Emission-Computed/methods; Models, Structural
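The fixed-threshold volume estimate described above reduces to counting voxels above a fraction of the maximum activity and multiplying by the voxel volume, as in the sketch below. The 46%/45% levels follow the abstract; the voxel size is an illustrative assumption.

```python
# Sketch of a fixed-threshold volume estimate from a SPET/SPECT volume.
import numpy as np

def threshold_volume_ml(volume, voxel_ml, fraction=0.46):
    """Volume (ml) of the region exceeding `fraction` of the maximum counts."""
    threshold = fraction * volume.max()
    return float(np.count_nonzero(volume > threshold)) * voxel_ml

# usage: vol_large = threshold_volume_ml(spect_volume, voxel_ml=0.6)
#        vol_small = threshold_volume_ml(spect_volume, 0.6, fraction=0.45)
```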
7.
Semin Nucl Med ; 8(2): 125-46, 1978 Apr.
Article in English | MEDLINE | ID: mdl-684440

ABSTRACT

Computer processing can improve the quality of scintigrams in several ways. It can increase the accuracy with which the image approximates the activity distribution by reversing degradation. It can selectively enhance normal or abnormal structures of interest. It can optimize the use of the display system presenting the image. The usefulness of computer processing must be determined by observer testing and clinical experience. The need to correct distortion in both intensity (nonuniformity) and space can be avoided by attention to calibration and to the setup of the imaging device employed and by use of the sliding energy window technique. Nonuniformity correction, especially for quantitative studies, should not be done using a flood field, as this may actually decrease accuracy. Instead, any necessary correction should employ the sensitivity matrix, which measures the variation of sensitivity to a point source with the position of the source. Statistical fluctuations (noise) and degradation of resolution are commonly corrected using linear, stationary techniques [concepts which are defined and developed in the text], but nonstationary techniques frequently appear to be more successful at the expense of increased processing time. Techniques of choice for pure smoothing are nine-point binomial smoothing and variable shape averaging, and those for both sharpening and smoothing (preferred for most modern, high-count scintigrams) are unsharp masking, Metz or Wiener filtering, and bi-regional sharpening. Structures of interest can be enhanced by methods which detect and emphasize changes in local distributions of slope and curvature of intensity. High-quality display devices are essential to reap any benefits from degradation correction. Such devices, which must have appropriately high sensitivity and must avoid display artifacts, have become available only recently. Use of the display should be matched to the processing done. Contrast enhancement, e.g. by histogram equalization, to make optimal use of the display intensity range for each image, is often helpful. Most scintigram processing is done using computers with about 32K 16-bit words. Floating-point hardware is often useful. Most processing methods require 1-30 seconds on such computers, usually under 15 seconds. Processing time tends to be negligible compared with the time needed for user specification of the processing to be done, so the quality of command languages should be of concern. Careful observer studies using phantoms have shown processing to improve detectability of lesions when a single display is used for both processed and unprocessed images, but not when unprocessed images on standard analog displays are compared to processed images on common computer displays...


Subject(s)
Computers; Radiographic Image Enhancement/methods; Radionuclide Imaging; Data Display; Humans; Models, Biological; Radionuclide Imaging/instrumentation
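Two of the operations named above, nine-point binomial smoothing (a 3x3 kernel from the outer product of [1, 2, 1]) and unsharp masking (adding back a scaled difference between the image and its smoothed version), are sketched below. The `amount` parameter is an illustrative choice.

```python
# Sketch of nine-point binomial smoothing and unsharp masking.
import numpy as np
from scipy.ndimage import convolve

BINOMIAL_3X3 = np.outer([1, 2, 1], [1, 2, 1]) / 16.0

def binomial_smooth(image):
    """Nine-point binomial (3x3) smoothing."""
    return convolve(image.astype(float), BINOMIAL_3X3, mode="nearest")

def unsharp_mask(image, amount=1.0):
    """Sharpen by adding back a scaled high-frequency residual."""
    smoothed = binomial_smooth(image)
    return image + amount * (image - smoothed)
```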
8.
IEEE Trans Med Imaging ; 2(1): 19-23, 1983.
Article in English | MEDLINE | ID: mdl-18234584

ABSTRACT

A black and white display was compared to a color display using the heated object spectrum, with the aim of determining which display was better for the detection of small abnormalities in a reasonably complicated background. Receiver operating characteristic curves with location (LROC) were generated for each type of display for each observer. The data were analyzed by taking the LROC curves in pairs for each observer, fitting a binary ROC curve using a model, and applying both parametric and nonparametric statistical tests to the paired differences. It was found, in this study, that the black and white display was significantly "better" than the color display. There was no significant difference between the two displays in the time required to make a decision. The technique of using paired ROC curves is suggested as an appropriate and powerful test for intercomparing different imaging procedures and techniques in medicine.
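A minimal sketch of the paired-comparison idea, not the authors' analysis: pair each observer's detection performance on the two displays (for example, an area under the fitted ROC curve) and apply paired parametric and nonparametric tests. The per-observer areas below are made-up placeholders.

```python
# Paired parametric and nonparametric tests on per-observer ROC areas
# (placeholder data, illustrative only).
from scipy.stats import ttest_rel, wilcoxon

area_bw    = [0.86, 0.91, 0.83, 0.88, 0.90]   # black-and-white display
area_color = [0.80, 0.85, 0.81, 0.84, 0.86]   # heated-object colour display

t_stat, t_p = ttest_rel(area_bw, area_color)
w_stat, w_p = wilcoxon(area_bw, area_color)
print(f"paired t-test p = {t_p:.3f}, Wilcoxon signed-rank p = {w_p:.3f}")
```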

9.
IEEE Trans Med Imaging ; 8(3): 245-50, 1989.
Article in English | MEDLINE | ID: mdl-18230522

ABSTRACT

An approach to image analysis and processing, called holospectral imaging, is proposed for dealing with Compton scattering contamination in nuclear medicine imaging. The method requires that energy information be available for all detected photons. A set of frames (typically 16) representing the spatial distribution at different energies is then formed. The relationship between these energy frames is analyzed, and the original data is transformed into a series of eigenimages and eigenvalues. In this space it is possible to distinguish the specific contribution to the image of both primary and scattered photons and, in addition, noise. Under the hypothesis that the contribution of the primary photons dominates the image structure, a filtering process can be performed to reduce the scattered contamination. The proportion of scattered information removed by the filtering process is evaluated for all images and depends on the level of residual quantum noise, which is estimated from the size of the smaller eigenvalues. Results indicate a slight increase in the statistical noise but also an increase in contrast and greatly improved ability to quantitate the image.
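The eigenimage idea described above can be sketched generically (this is not the exact holospectral algorithm): stack the energy frames as rows of a matrix, take an SVD, keep the leading components assumed to carry the primary-photon structure, and rebuild the frames from the truncated expansion. The number of retained components is an assumption.

```python
# Generic eigenimage filtering of a set of energy frames via truncated SVD.
import numpy as np

def filter_energy_frames(frames, n_keep=2):
    """frames: (n_energy, ny, nx) counts; returns SVD-filtered frames."""
    n_e, ny, nx = frames.shape
    X = frames.reshape(n_e, -1).astype(float)
    mean = X.mean(axis=0, keepdims=True)
    U, s, Vt = np.linalg.svd(X - mean, full_matrices=False)
    X_filt = mean + (U[:, :n_keep] * s[:n_keep]) @ Vt[:n_keep]
    return X_filt.reshape(n_e, ny, nx)
```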

10.
Phys Med Biol ; 40(8): 1357-74, 1995 Aug.
Article in English | MEDLINE | ID: mdl-7480118

ABSTRACT

We present a new uniformity correction (Fourier energy correction) which is designed to correct for gamma camera non-uniformity caused by variations of the energy response function within a wide spectral range. A convolution model is used to describe the spatial distortions of the energy response function. The model is solved in Fourier space. A preliminary flood acquisition is required to obtain energy-dependent Fourier weights which are used to correct subsequent acquisitions. The influence of the parameters involved in the correction procedure is studied and the Fourier energy correction is compared to a conventional multiplicative energy correction for different acquisition geometries. The Fourier energy correction appears especially useful when the energy information associated with each detected photon is analysed using a fine sampling, or when windows different from the photopeak window are used.


Subject(s)
Gamma Cameras; Radionuclide Imaging/methods; Biophysical Phenomena; Biophysics; Fourier Analysis; Gamma Cameras/statistics & numerical data; Humans; Models, Theoretical; Phantoms, Imaging; Radionuclide Imaging/instrumentation; Radionuclide Imaging/statistics & numerical data; Scattering, Radiation
11.
Phys Med Biol ; 44(9): 2289-306, 1999 Sep.
Article in English | MEDLINE | ID: mdl-10495122

ABSTRACT

Dynamic image sequences allow physiological mechanisms to be monitored after the injection of a tracer. Factor analysis of medical image sequences (FAMIS) provides a synthesis of the information in such a sequence: it estimates a limited number of structures (factor images), assuming that the tracer kinetics (factors) are similar at each point inside a structure. A spatial regularization method for computing factor images (REG-FAMIS) is proposed to remove irregularities due to noise in the original data while preserving discontinuities between structures. REG-FAMIS has been applied to two sets of simulations: (a) dynamic data with Gaussian noise and (b) dynamic studies in emission tomography (PET or SPECT) that reproduce realistic tomographic acquisition parameters and noise characteristics. Optimal regularization parameters are estimated in order to minimize the distance between reference images and regularized factor images. Compared with conventional factor images, the root mean square error between regularized images and reference factor images is improved by a factor of 3 for the first set of simulations, and by a factor of about 1.5 for the second set. In all cases, regularized factor images are qualitatively and quantitatively improved.


Subject(s)
Factor Analysis, Statistical; Image Processing, Computer-Assisted; Algorithms; Artifacts; Computer Simulation; Image Enhancement/methods; Kinetics; Models, Statistical; Tomography, Emission-Computed
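The underlying factor model can be sketched as follows: the dynamic sequence is approximated as factor_images @ factor_curves with non-negativity constraints. Plain NMF is used here as a generic stand-in; the REG-FAMIS spatial regularization described above is not reproduced.

```python
# Generic non-negative factor decomposition of a dynamic image sequence
# (a stand-in for FAMIS-style analysis, without spatial regularization).
import numpy as np
from sklearn.decomposition import NMF

def famis_like_factors(dynamic, n_factors=3):
    """dynamic: (n_frames, ny, nx) non-negative image sequence."""
    n_t, ny, nx = dynamic.shape
    A = dynamic.reshape(n_t, -1).T                  # pixels x time
    model = NMF(n_components=n_factors, init="nndsvda", max_iter=500)
    factor_images = model.fit_transform(A)          # pixels x K
    factor_curves = model.components_               # K x time
    return factor_images.T.reshape(n_factors, ny, nx), factor_curves
```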
12.
Phys Med Biol ; 44(11): 2821-34, 1999 Nov.
Article in English | MEDLINE | ID: mdl-10588287

ABSTRACT

Results of principal component analysis depend on data scaling. Recently, based on theoretical considerations, several data transformation procedures have been suggested to improve the performance of principal component analysis of image data with respect to the optimum separation of signal and noise. The aim of this study was to test some of those suggestions and to compare several data transformation procedures for principal component analysis experimentally. The experiment was performed with simulated data, and the performance of individual procedures was compared using the non-parametric Friedman test. The optimum scaling found was the one that equalizes the noise variance across the observed images. For data with a Poisson distribution, the optimum scaling was the norm used in correspondence analysis. Scaling mainly affected the definition of the signal space. Once the dimension of the signal space was known, the differences in error of data and signal reproduction were small. The choice of data transformation depends on the amount of available prior knowledge (level of noise in individual images, number of components, etc.), on the type of noise distribution (Gaussian, uniform, Poisson, other), and on the purpose of the analysis (data compression, filtration, feature extraction).


Subject(s)
Image Processing, Computer-Assisted; Nuclear Medicine/methods; Phantoms, Imaging; Computer Simulation; Models, Theoretical; Normal Distribution; Poisson Distribution; Reproducibility of Results
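The scaling identified as optimal for Poisson data, the correspondence-analysis norm, divides each element by the square root of the product of its row and column sums before the principal component step, as sketched below. The (pixels x frames) layout is an assumption.

```python
# Correspondence-analysis scaling followed by an SVD-based PCA step.
import numpy as np

def correspondence_scaled(X):
    """X: non-negative (n_pixels, n_frames) data matrix."""
    row = X.sum(axis=1, keepdims=True)
    col = X.sum(axis=0, keepdims=True)
    return X / np.sqrt(np.clip(row * col, 1e-12, None))

def leading_components(X, n):
    """Leading principal components of a (scaled) data matrix via SVD."""
    Xc = X - X.mean(axis=0, keepdims=True)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    return U[:, :n] * s[:n], Vt[:n]
```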
13.
Rofo ; 128(4): 486-90, 1978 Apr.
Article in English | MEDLINE | ID: mdl-148414

ABSTRACT

A variety of new non-invasive imaging techniques are now at the physician's disposal for patient management. However, the place of each new imaging modality within the general diagnostic armamentarium is far from established. Not only are comparative assessments of these modalities needed, but pre-defined strategies for their use are also required for an intelligent approach to a difficult medical problem. The increasing cost of medicine and the decreasing resources available for development heighten the need for such an approach. This paper offers an initial solution to the problem.


Subject(s)
Radionuclide Imaging/methods; Tomography, X-Ray Computed/methods; Ultrasonography; Biliary Tract Diseases/diagnosis; Decision Making; Female; Humans; Neoplasm Metastasis; Placenta Previa/diagnosis; Pregnancy; Pulmonary Embolism/diagnostic imaging
14.
Artif Intell Med ; 9(3): 205-25, 1997 Mar.
Article in English | MEDLINE | ID: mdl-9071462

ABSTRACT

Advances in imaging techniques have been mirrored by advances in the use of computers to extract and interpret image data. Helping clinicians to make effective use of this superabundance of information is a key aim of medical informatics research. One approach is to incorporate digital images and image processing within a knowledge-based decision aid for radiologists. This paper describes a generic design for such aids. The work is based on an abstract model of decision-making which is used to organize the presentation of information from a knowledge base. We describe a system in which the model of decision-making is augmented to describe processes underlying image interpretation. The augmented model, implemented as a logic program, is used to control the application of image processing operators to detect and describe radiological signs. Through the use of this model we are able to combine information from image processing with information from a symbolic knowledge base. The operation of the model is illustrated by considering three different applications in the domain of breast X-rays or mammograms.


Subject(s)
Radiographic Image Interpretation, Computer-Assisted; Humans; Models, Theoretical; Radiography
15.
Nucl Med Commun ; 5(7): 421-37, 1984 Jul.
Article in English | MEDLINE | ID: mdl-6335742

ABSTRACT

A protocol has been designed to enable an intercomparison of a number of different SPECT systems to be performed. While results are still not ideal, a considerable improvement of current systems with respect to those available prior to 1983 was noted. Resolution of up to 11 mm was found for a radius of rotation corresponding to the head, and reasonable uniformity could be achieved. Further improvement using noncircular orbits, better collimators, and careful control of uniformity can be achieved. Such systems should, it is hoped, be of clinical value. However, if one had to select a system, which one would it be? The answer must surely be to mount a Siemens head, with the Elscint electronics on a Philips stand, with a cut-away surround from GE. But that might be difficult, and possibly even expensive. Back to the drawing board!


Subject(s)
Tomography, Emission-Computed/instrumentation; Tomography, Emission-Computed/standards
16.
Nucl Med Commun ; 13(9): 673-99, 1992 Sep.
Article in English | MEDLINE | ID: mdl-1448241

ABSTRACT

Working Group 1 of the European project COST-B2 on quality assurance of nuclear medicine software has been concerned with the development of an appropriate mechanism for the transfer of nuclear medicine image data files between computer systems from different vendors. To this end a protocol based upon Report No. 10 of the American Association of Physicists in Medicine (AAPM) [1] was adopted. A previous publication [2] gave a specification (V3.2) for an intermediate file format with a list of key-value pairs for the header data associated with nuclear medicine image data files. This paper presents a revised specification for the intermediate file format and associated keys, now called V3.3, which has evolved from the experience in using the earlier version. It is hoped that the modifications proposed will improve the definition and usability of the file format as given in the earlier version.


Subject(s)
Nuclear Medicine; Radiology Information Systems/standards; Software/standards; European Union; Models, Structural
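A minimal sketch of reading a key-value image header of the kind described above. The "key := value" line syntax and the leading "!" marker are assumptions for illustration; the V3.3 specification defines the actual grammar and key list.

```python
# Read an assumed "key := value" style header into a dictionary.
from typing import Dict

def read_header(path: str) -> Dict[str, str]:
    header: Dict[str, str] = {}
    with open(path, encoding="ascii", errors="replace") as fh:
        for line in fh:
            line = line.strip().lstrip("!")
            if not line or ":=" not in line:
                continue
            key, _, value = line.partition(":=")
            header[key.strip().lower()] = value.strip()
    return header

# usage: hdr = read_header("study.hdr")
```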
17.
Nucl Med Commun ; 5(1): 35-40, 1984 Jan.
Article in English | MEDLINE | ID: mdl-6399731

ABSTRACT

Twenty-six children underwent a total of 75 99Tcm-DTPA scans. Two experienced operators, using different techniques, were able to identify parenchyma in 77 of 111 kidneys. Parenchymal regions of interest were drawn by an entirely operator-dependent technique and by a relatively operator-independent technique. Transit time analysis through the parenchyma of both normal and abnormal kidneys, using a matrix algorithm, showed a high degree of correlation (tau = 0.595, P less than 0.001). The two techniques concurred in the studies to which the analysis could not be applied, generally because of poor technical factors or gross hydronephrosis with insufficient parenchyma to generate a time-activity curve. Either technique may be used in clinical practice to identify renal parenchyma for further analysis.


Subject(s)
Pentetic Acid; Radioisotope Renography/methods; Technetium; Adolescent; Child; Child, Preschool; Humans; Infant; Technetium Tc 99m Pentetate; Ureteral Obstruction/diagnostic imaging
18.
Nucl Med Commun ; 9(8): 545-52, 1988 Aug.
Article in English | MEDLINE | ID: mdl-3050650

ABSTRACT

Following injection for renography, 99Tcm-labelled diethylenetriamine-pentacetic acid (DTPA) rapidly enters the extravascular space. Background therefore comprises two components, a falling intravascular signal and an extravascular signal which initially rises. We estimated the relative magnitudes of these two components in terms of their impact on the calculation of differential renal function and individual kidney glomerular filtration rate (IKGFR) from the second phase of the 99Tcm-labelled DTPA renogram in 56 paediatric kidneys. We expressed each of the two background signals as a GFR equivalent. The GFR equivalent of the intravascular signal recorded from a peri-renal background region of interest (ROI), scaled by a factor equal to the ratio of the pixel numbers in the renal and background ROIs, was -39 (S.D. 14) ml min-1. The GFR equivalent of the extravascular signal was smaller than this and opposite to it at 23 (S.D. 10) ml min-1, giving a median ratio for the two equivalents of -1.68. Because of the opposing effects of the two background components on the second phase of the renogram, techniques recently described for the quantification of IKGFR from the renogram, and which eliminate the intravascular component, offer no theoretical advantage over a method of analysis which uses 'direct' subtraction of the total background signal. In practice, however, these new techniques are superior in their handling of 'noisy' data, consistently giving a lower coefficient of variation in their estimation of IKGFR.


Subject(s)
Glomerular Filtration Rate; Organometallic Compounds; Pentetic Acid; Radioisotope Renography/methods; Technetium; Child; Humans; Technetium Tc 99m Pentetate
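A generic sketch of the 'direct' background-subtraction approach referred to above, not the paper's GFR-equivalent analysis: scale the background ROI curve by the ratio of ROI pixel counts (as in the abstract), subtract it, fit the second-phase uptake slope, and split a total GFR between the kidneys in proportion to the slopes. All inputs are illustrative.

```python
# Background-corrected second-phase slope and proportional GFR split.
import numpy as np

def background_corrected_slope(renal_curve, bkg_curve, n_renal_pix, n_bkg_pix, t):
    """Subtract the pixel-scaled background curve and fit a linear uptake slope."""
    corrected = renal_curve - bkg_curve * (n_renal_pix / n_bkg_pix)
    slope, _ = np.polyfit(t, corrected, 1)
    return slope

def split_gfr(slope_left, slope_right, total_gfr_ml_min):
    """Divide a total GFR between kidneys in proportion to their uptake slopes."""
    total = slope_left + slope_right
    return (total_gfr_ml_min * slope_left / total,
            total_gfr_ml_min * slope_right / total)
```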
19.
Nucl Med Commun ; 6(7): 377-88, 1985 Jul.
Article in English | MEDLINE | ID: mdl-2995889

ABSTRACT

Five Gottingen minipigs from the same litter aged 1 month entered the study and right nephrectomies were performed at staggered intervals between weeks 1 and 5. Absolute renal uptake of 99Tcm-DMSA, expressed as a percentage of the administered dose, was recorded in each animal weekly before right nephrectomy and at weekly intervals for 5 weeks post-nephrectomy. Before nephrectomy absolute renal uptake of 99Tcm-DMSA ranged between 16.8 and 21.5% (mean 18.6%) per kidney with no difference between right and left kidneys. After right nephrectomy, uptake in the left kidney approximately doubled within 1 week. No correlation was shown between renal parenchymal mass and absolute 99Tcm-DMSA uptake in paired or solitary normal kidneys undergoing growth or compensatory growth.


Subject(s)
Kidney/diagnostic imaging; Succimer; Sulfhydryl Compounds; Technetium; Animals; Female; Kidney/growth & development; Kidney/metabolism; Male; Radionuclide Imaging; Succimer/metabolism; Swine; Technetium/metabolism; Technetium Tc 99m Dimercaptosuccinic Acid
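A sketch of an absolute-uptake calculation of the kind reported above: background-corrected kidney counts, corrected for depth attenuation, expressed as a percentage of the counts equivalent to the administered dose. The exponential depth correction and the mu value are illustrative assumptions, not the study's protocol.

```python
# Percent of administered dose taken up by a kidney (assumed depth correction).
import numpy as np

def percent_uptake(kidney_counts, bkg_counts, dose_counts, depth_cm, mu_per_cm=0.153):
    """Background- and attenuation-corrected kidney counts as % of dose counts."""
    corrected = (kidney_counts - bkg_counts) * np.exp(mu_per_cm * depth_cm)
    return 100.0 * corrected / dose_counts
```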
20.
Nucl Med Commun ; 9(12): 973-85, 1988 Dec.
Article in English | MEDLINE | ID: mdl-3064020

ABSTRACT

Since 99Tcm-DTPA is diffusible and not significantly protein bound in plasma, it rapidly enters the extravascular space following injection. Therefore, during the first few minutes of the DTPA renogram, the period on which the measurements of individual kidney glomerular filtration rate and differential function are based, background activity comprises a rising extravascular signal and a falling intravascular signal. The aim of this study was to measure the ratio of these two signals in background present within the renal region of interest (ROI) and compare it with the ratio in a background ROI. An appropriate background ROI is one in which the ratio is equal to that in background in the renal ROI. To pursue this aim, we quantified the rates of change of the intravascular and extravascular activities in background and, by comparing them with the rate of increase of filtered activity, expressed them as GFR equivalents (the intravascular being negative). It is impossible, from a single renogram, to separate the rising extravascular signal from the signal due to filtered activity, and therefore impossible to quantify the extravascular GFR equivalent present in background within the renal ROI. We therefore studied six patients undergoing bone marrow transplantation before and after cyclosporin treatment. By comparing the dynamic renographic data between the two sequential studies, the substantial fall in GFR (from 107 +/- 12 S.D. to 49 +/- 7 ml min-1) permitted separate quantification of the extravascular GFR equivalent in the renal ROI in both studies. Three of the patients were studied on a third occasion after cyclosporin. In two, GFR remained low and these studies were paired with corresponding baseline studies, while in the other it increased and this was compared with the nephrotoxic study, giving a total of nine paired studies between which GFR changed. The ratio of intravascular to extravascular GFR equivalents in a background ROI placed above the kidney was considerably greater, and in a background ROI below the kidney considerably less, than that in the renal ROI. A background ROI which was the difference between the renal ROI and a perirenal ROI, 2 pixels outside the renal ROI along the horizontal and 1 pixel outside along the vertical, gave a ratio almost identical to that of the background within the renal ROI (renal ROI ratio:background ROI ratio = 1.09 +/- 0.17 S.D., n = 18).(ABSTRACT TRUNCATED AT 400 WORDS)


Subject(s)
Organometallic Compounds; Pentetic Acid; Radioisotope Renography/methods; Technetium; Adult; Cyclosporins/adverse effects; Cyclosporins/therapeutic use; Female; Glomerular Filtration Rate; Humans; Kidney/drug effects; Male; Technetium Tc 99m Pentetate