Results 1 - 4 of 4
1.
Chirurgia (Bucur) ; 112(4): 449-456, 2017.
Article in English | MEDLINE | ID: mdl-28862122

ABSTRACT

Queues in hospitals directly affect quality of life and should therefore take priority over other types of queues. The aim of this paper is to design a future value stream map of the system and the patient pathway, in terms of quality improvement, in order to decrease non-value-added activities for breast cancer patients, doctors, and nurses in the radiology unit of a Training and Research University Hospital in Kocaeli, Turkey. At present, increasing demand combined with insufficient resources degrades healthcare services, producing poor quality and long queues during the diagnosis and treatment processes. For this paper, data were collected from personal observations, information technology units, and authorized employees; the existing data tracking and record-keeping systems were too poor to reveal the current situation on their own. This paper provides an example of current and future value stream maps showing, step by step, where the bottlenecks are, how they can be removed, and what specific benefits this brings to the healthcare system. In light of these outcomes, it is strongly suggested that the hospital apply the European Guidelines for quality assurance in breast cancer screening and diagnosis together with the improvement suggestions mentioned above, using lean applications.
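As a rough illustration of how a value stream map quantifies waste in such a pathway, the sketch below uses hypothetical step names and durations (not data from the study) to compute lead time, value-added time, and process cycle efficiency for a simplified radiology pathway, flagging the step with the longest queue as the bottleneck.

# Hypothetical value stream metrics for a simplified radiology pathway.
# Step names and durations are illustrative only, not data from the paper.
steps = [
    # (step name, value-added minutes, waiting minutes before the step)
    ("registration",          5,  30),
    ("mammography",          15,  90),
    ("radiologist reading",  10, 240),
    ("biopsy scheduling",     5, 480),
    ("result communication", 10, 120),
]

value_added = sum(v for _, v, _ in steps)      # time that benefits the patient
waiting = sum(w for _, _, w in steps)          # non-value-added queue time
lead_time = value_added + waiting              # total door-to-result time

# Process cycle efficiency: share of lead time that actually adds value.
pce = value_added / lead_time

bottleneck = max(steps, key=lambda s: s[2])    # step with the longest queue

print(f"lead time: {lead_time} min, value-added: {value_added} min")
print(f"process cycle efficiency: {pce:.1%}")
print(f"bottleneck: {bottleneck[0]} ({bottleneck[2]} min waiting)")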


Subject(s)
Breast Neoplasms/diagnosis , Breast Neoplasms/surgery , Critical Pathways , Mammography , Standard of Care , Breast Neoplasms/diagnostic imaging , Critical Pathways/standards , Diagnosis, Differential , Early Detection of Cancer/standards , Europe , Female , Guidelines as Topic , Humans , Mammography/standards , Mass Screening , Predictive Value of Tests , Sensitivity and Specificity , Societies, Medical , Turkey
2.
Med Image Anal ; 85: 102741, 2023 04.
Article in English | MEDLINE | ID: mdl-36638747

ABSTRACT

One of the greatest scientific challenges in network neuroscience is to create a representative map of a population of heterogeneous brain networks, which acts as a connectional fingerprint. The connectional brain template (CBT), also called a network atlas, is a powerful tool for capturing the most representative and discriminative traits of a given population while preserving its topological patterns. The idea of a CBT is to integrate a population of heterogeneous brain connectivity networks, derived from different neuroimaging modalities or brain views (e.g., structural and functional), into a unified holistic representation. Here we review current state-of-the-art methods designed to estimate well-centered and representative CBTs for populations of single-view and multi-view brain networks. We start by reviewing each CBT learning method, then introduce the evaluation measures used to compare the representativeness of CBTs generated by single-view and multigraph integration methods, separately, based on the following criteria: centeredness, biomarker reproducibility, node-level similarity, global-level similarity, and distance-based similarity. We demonstrate that the deep graph normalizer (DGN) method significantly outperforms all other multi-graph and single-view integration methods for estimating CBTs across a variety of healthy and disordered datasets, in terms of centeredness, reproducibility (i.e., reproducibility of graph-derived biomarkers that disentangle typical from atypical connectivity variability), and preservation of topological traits at both local and global graph levels.
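As one concrete example of the centeredness criterion mentioned above, the sketch below scores a candidate CBT by its mean Frobenius distance to every network in the population, with a lower score indicating a more centered template. The exact formulation is an assumption for illustration; the reviewed methods may define and normalize the measure differently.

import numpy as np

def centeredness_score(cbt, population):
    """Mean Frobenius distance between a candidate CBT and each network
    in the population; lower means the template is more centered.
    cbt: (n, n) connectivity matrix, population: (k, n, n) stack."""
    return float(np.mean([np.linalg.norm(cbt - net, ord="fro") for net in population]))

# Toy example: 20 random symmetric networks over 35 regions.
rng = np.random.default_rng(0)
nets = rng.random((20, 35, 35))
nets = (nets + nets.transpose(0, 2, 1)) / 2   # make each network symmetric

element_wise_mean = nets.mean(axis=0)         # naive single-view baseline
print("centeredness of element-wise mean:", centeredness_score(element_wise_mean, nets))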


Subject(s)
Brain Mapping , Magnetic Resonance Imaging , Humans , Reproducibility of Results , Magnetic Resonance Imaging/methods , Brain Mapping/methods , Brain , Neuroimaging , Biomarkers
3.
Neural Netw ; 151: 250-263, 2022 Jul.
Article in English | MEDLINE | ID: mdl-35447482

ABSTRACT

Multigraphs with heterogeneous views present one of the most challenging obstacles to classification tasks due to their complexity. Several works based on feature selection have recently been proposed to disentangle the problem of multigraph heterogeneity. However, such techniques have major drawbacks. First, the bulk of these works relies on vectorization and flattening operations, failing to preserve and exploit the rich topological properties of the multigraph. Second, they learn the classification process in a dichotomized manner, where the cascaded learning steps are pieced together independently. Hence, such architectures are inherently agnostic to the cumulative estimation error from step to step. To overcome these drawbacks, we introduce MICNet (multigraph integration and classifier network), the first end-to-end graph neural network (GNN) based model for multigraph classification. First, we learn a single-view graph representation of a heterogeneous multigraph using a GNN-based integration model. The integration process helps tease apart the heterogeneity across the different views of the multigraph by generating a subject-specific graph template that preserves the multigraph's geometrical and topological properties, conserves node-wise information, and reduces the size of the graph (i.e., the number of views). Second, we classify each integrated template using a geometric deep learning block, which enables us to grasp the salient graph features. We train these two blocks end to end using a single objective function to optimize classification performance. We evaluate MICNet on gender classification using brain multigraphs derived from different cortical measures and demonstrate that it significantly outperforms its variants, showing its potential for multigraph classification.
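A minimal sketch of the end-to-end idea is shown below. It is a simplified stand-in, not the authors' MICNet code: the learnable softmax view fusion, the single GCN-style layer, the layer sizes, and the random toy data are all assumptions. It illustrates the two blocks trained jointly with one objective: fuse the views of a multigraph into a subject-specific adjacency, propagate over it, and classify.

import torch
import torch.nn as nn
import torch.nn.functional as F

class MultigraphIntegratorClassifier(nn.Module):
    """Toy end-to-end model: fuse V views of an n x n multigraph into one
    adjacency, run one GCN-style propagation, then classify the subject."""
    def __init__(self, n_views, n_nodes, hidden=32, n_classes=2):
        super().__init__()
        self.view_scores = nn.Parameter(torch.zeros(n_views))  # learnable view weights
        self.gcn = nn.Linear(n_nodes, hidden)                   # node feature transform
        self.readout = nn.Linear(hidden, n_classes)

    def forward(self, multigraph):                  # multigraph: (V, n, n)
        w = torch.softmax(self.view_scores, dim=0)  # convex combination of views
        fused = torch.einsum("v,vij->ij", w, multigraph)  # subject-specific template
        x = torch.eye(fused.shape[0])               # identity node features
        h = F.relu(self.gcn(fused @ x))             # one propagation + transform
        return self.readout(h.mean(dim=0))          # graph-level logits

# Toy training loop on random data (80 subjects, 4 views, 35 regions).
torch.manual_seed(0)
data = torch.rand(80, 4, 35, 35)
labels = torch.randint(0, 2, (80,))
model = MultigraphIntegratorClassifier(n_views=4, n_nodes=35)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for epoch in range(5):
    opt.zero_grad()
    logits = torch.stack([model(g) for g in data])
    loss = F.cross_entropy(logits, labels)   # single objective for both blocks
    loss.backward()
    opt.step()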


Subject(s)
Brain , Neural Networks, Computer
4.
Brain Imaging Behav ; 15(4): 2081-2100, 2021 Aug.
Article in English | MEDLINE | ID: mdl-33089469

ABSTRACT

The estimation of a connectional brain template (CBT) that integrates a population of brain networks while capturing shared and differential connectional patterns across individuals remains unexplored in gender fingerprinting. This paper presents the first study to estimate gender-specific CBTs using multi-view cortical morphological networks (CMNs) estimated from conventional T1-weighted magnetic resonance imaging (MRI). Specifically, each CMN view is derived from a specific cortical attribute (e.g., thickness), encoded in a network quantifying the dissimilarity in morphology between pairs of cortical brain regions. To this aim, we propose the Multi-View Clustering and Fusion Network (MVCF-Net), a novel multi-view network fusion method that can jointly identify consistent and differential clusters of multi-view datasets in order to capture similar and distinct connectional traits of samples simultaneously. MVCF-Net estimates a representative and well-centered CBT for the male and female populations independently, to eventually identify their fingerprinting regions of interest (ROIs), in four main steps. First, we apply a multi-view network clustering model based on manifold optimization, which groups CMNs into shared and differential clusters while preserving their alignment across views. Second, for each view, we linearly fuse the CMNs belonging to each cluster, producing local CBTs. Third, for each cluster, we non-linearly integrate the local CBTs across views, producing a cluster-specific CBT. Finally, by linearly fusing the cluster-specific centers we estimate the final CBT of the input population. MVCF-Net produced the most centered and representative CBTs for the male and female populations and identified the most discriminative ROIs marking gender differences. The two most gender-discriminative ROIs involved the lateral occipital cortex and pars opercularis in the left hemisphere and the middle temporal gyrus and lingual gyrus in the right hemisphere.
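The four fusion steps can be pictured with the sketch below. It is a schematic only: the cluster assignments are made up to stand in for the manifold-based clustering stage, and an element-wise median is used as a placeholder for the paper's non-linear cross-view integration; the actual MVCF-Net operators differ. Per cluster, CMNs are averaged within each view, the resulting local CBTs are integrated across views, and the cluster-specific centers are finally averaged into one population CBT.

import numpy as np

rng = np.random.default_rng(0)
n_subjects, n_views, n_rois = 30, 4, 35

# Toy multi-view cortical morphological networks: (subjects, views, ROIs, ROIs).
cmns = rng.random((n_subjects, n_views, n_rois, n_rois))

# Step 1 stand-in: assume cluster labels were produced by the multi-view
# clustering stage (here they are random, purely for illustration).
labels = rng.integers(0, 3, size=n_subjects)

cluster_cbts = []
for c in np.unique(labels):
    members = cmns[labels == c]               # CMNs assigned to this cluster
    # Step 2: linear fusion within each view -> one local CBT per view.
    local_cbts = members.mean(axis=0)         # (views, ROIs, ROIs)
    # Step 3: non-linear integration across views (element-wise median here).
    cluster_cbts.append(np.median(local_cbts, axis=0))

# Step 4: linear fusion of the cluster-specific centers into the final CBT.
final_cbt = np.mean(cluster_cbts, axis=0)     # (ROIs, ROIs)
print("final CBT shape:", final_cbt.shape)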


Subject(s)
Connectome , Magnetic Resonance Imaging , Brain/diagnostic imaging , Cluster Analysis , Female , Humans , Male