Results 1 - 8 of 8
1.
Sensors (Basel) ; 21(2)2021 Jan 13.
Article in English | MEDLINE | ID: mdl-33450881

ABSTRACT

Unmanned aerial vehicles (UAVs) have become a very popular way of acquiring video in contexts such as remote data acquisition or surveillance. Unfortunately, their viewpoint is often unstable, which tends to negatively impact the automatic processing of their video stream. To counteract the effects of an inconsistent viewpoint, two video processing strategies are classically adopted, namely registration and stabilization, which suffer from distinct issues: jitter and drifting, respectively. Following our prior work, we suggest that the motion estimators used in both situations can be modeled to take their inherent errors into account. By acknowledging that drift and jitter errors are of a different nature, we propose a combination that limits their influence and builds a hybrid solution for jitter-free video registration. In that prior work, however, our modeling was restricted to 2D-rigid transforms, which are rather limited in the case of airborne videos. In the present paper, we extend and refine its theoretical grounding. This addition allows us to show how to adapt the approach in practice to perspective transforms, which our study shows to be much more accurate for this problem. A lightweight implementation enables us to automatically register stationary UAV videos in real time. Our evaluation includes traffic surveillance recordings of up to 2 h and shows the potential of the proposed approach when paired with background subtraction tasks.
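The gap between the 2D-rigid and perspective motion models discussed above can be sketched as follows. This is a minimal illustration, not the authors' implementation; the matrices and corner coordinates are made-up values chosen only to show that a perspective (homography) model can express warps a rigid model cannot.

```python
import numpy as np

def apply_homography(H, pts):
    """Apply a 3x3 perspective transform to Nx2 points via homogeneous coordinates."""
    pts_h = np.hstack([pts, np.ones((len(pts), 1))])  # lift to homogeneous
    out = pts_h @ H.T
    return out[:, :2] / out[:, 2:3]                   # perspective divide

# A 2D-rigid transform (rotation + translation) is the special case whose
# bottom row is [0, 0, 1] and whose upper-left 2x2 block is a rotation.
theta = np.deg2rad(5.0)
rigid = np.array([[np.cos(theta), -np.sin(theta), 10.0],
                  [np.sin(theta),  np.cos(theta), -4.0],
                  [0.0,            0.0,            1.0]])

# A full perspective transform adds nonzero bottom-row terms, which no
# rigid model can express -- one reason it fits airborne viewpoints better.
persp = rigid.copy()
persp[2, :2] = [1e-4, -5e-5]

corners = np.array([[0.0, 0.0], [640.0, 0.0], [640.0, 480.0], [0.0, 480.0]])
print(apply_homography(rigid, corners))
print(apply_homography(persp, corners))
```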

2.
Biosensors (Basel) ; 10(9)2020 Aug 21.
Article in English | MEDLINE | ID: mdl-32825735

ABSTRACT

BACKGROUND: At present, the assessment of autonomy in daily living activities, one of the key symptoms in Alzheimer's disease (AD), relies on clinical rating scales. METHODS: In total, 109 participants were included. In particular, 11 participants during a pre-test in Nice, France, and 98 participants (27 AD, 38 mild cognitive impairment-MCI-and 33 healthy controls-HC) in Thessaloniki, Greece, carried out a standardized scenario consisting of several instrumental activities of daily living (IADLs), such as making a phone call or preparing a pillbox, while being recorded. Data were processed by a video signal analysis platform in order to extract kinematic parameters and detect the activities undertaken by the participant. RESULTS: The video analysis data can be used to assess IADL task quality and provide clinicians with objective measurements of the patients' performance. Furthermore, the analysis reveals that the HC group significantly outperformed the MCI group, which in turn performed better than the AD participants. CONCLUSIONS: Accurate activity recognition data for the analyses of performance on IADL activities were obtained.


Subject(s)
Activities of Daily Living, Dementia, Aged, Female, Greece, Humans, Male, Middle Aged, Video Recording
3.
IEEE Trans Image Process ; 28(7): 3357-3371, 2019 Jul.
Article in English | MEDLINE | ID: mdl-30714921

ABSTRACT

The land cover reconstruction from monochromatic historical aerial images is a challenging task that has recently attracted increasing interest from the scientific community with the proliferation of large-scale epidemiological studies involving retrospective analysis of spatial patterns. However, the efforts made by the computer vision community in remote-sensing applications are mostly focused on prospective approaches through the analysis of high-resolution multi-spectral data acquired by the advanced space programs. Hence, four contributions are proposed in this paper. They aim at providing a comparison basis for the future development of computer vision algorithms applied to the automation of land cover reconstruction from monochromatic historical aerial images. First, a new multi-scale multi-date dataset composed of 4.9 million non-overlapping annotated patches of French territory between 1970 and 1990 has been created with the help of geography experts. This dataset has been named HistAerial. Second, an extensive comparison study of state-of-the-art texture feature extraction and classification algorithms, including deep convolutional neural networks (DCNNs), has been performed and is presented in the form of an evaluation. Third, a novel low-dimensional local texture filter named rotated-corner local binary pattern (R-CRLBP) is presented as a simplification of the binary gradient contours filter through the use of an orthogonal combination representation. Finally, a novel combination of low-dimensional texture descriptors, including the R-CRLBP filter, is introduced as a light combination of local binary patterns (LCoLBPs). The LCoLBP filter achieved state-of-the-art results on the HistAerial dataset while conserving a relatively low-dimensional feature vector space compared with the DCNN approaches (17 times shorter).
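The R-CRLBP and LCoLBP descriptors above are variants of the classic local binary pattern (LBP) family. The details of those variants are not reproduced here; the following is only a sketch of the basic 8-neighbour LBP they build on, with a made-up neighbour ordering for illustration.

```python
import numpy as np

def lbp_code(patch):
    """Basic 8-neighbour local binary pattern code for a 3x3 grayscale patch:
    each neighbour >= centre contributes one bit, clockwise from top-left."""
    center = patch[1, 1]
    neighbours = [patch[0, 0], patch[0, 1], patch[0, 2], patch[1, 2],
                  patch[2, 2], patch[2, 1], patch[2, 0], patch[1, 0]]
    code = 0
    for bit, n in enumerate(neighbours):
        if n >= center:
            code |= 1 << bit
    return code

def lbp_histogram(img):
    """256-bin LBP histogram over all interior pixels of a grayscale image;
    such histograms serve as the texture feature vector for classification."""
    hist = np.zeros(256, dtype=int)
    for i in range(1, img.shape[0] - 1):
        for j in range(1, img.shape[1] - 1):
            hist[lbp_code(img[i - 1:i + 2, j - 1:j + 2])] += 1
    return hist
```

The "low-dimensional" descriptors in the paper shrink this 256-bin space; the plain histogram above illustrates why that matters for feature-vector length.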

4.
Sensors (Basel) ; 17(7)2017 Jun 29.
Article in English | MEDLINE | ID: mdl-28661440

ABSTRACT

Visual activity recognition plays a fundamental role in several research fields as a way to extract semantic meaning from images and videos. Prior work has mostly focused on classification tasks, where a label is given for a video clip. However, real-life scenarios require a method to browse a continuous video flow, automatically identify relevant temporal segments, and classify them according to target activities. This paper proposes a knowledge-driven event recognition framework to address this problem. The novelty of the method lies in the combination of a constraint-based ontology language for event modeling with robust algorithms to detect, track and re-identify people using color-depth sensing (Kinect® sensor). This combination makes it possible to model and recognize longer and more complex events and to incorporate domain knowledge and 3D information into the same models. Moreover, the ontology-driven approach enables human understanding of system decisions and facilitates knowledge transfer across different scenes. The proposed framework is evaluated with real-world recordings of seniors carrying out unscripted daily activities in hospital observation rooms and nursing homes. Results demonstrate that the proposed framework outperforms state-of-the-art methods in a variety of activities and datasets, and is robust to variable and low-frame-rate recordings. Further work will investigate how to extend the proposed framework with uncertainty management techniques to handle strong occlusion and ambiguous semantics, and how to exploit it to further support clinicians in the timely diagnosis of cognitive disorders, such as Alzheimer's disease.
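The idea of composing primitive detections into longer events under temporal constraints, as the constraint-based ontology language above does, can be sketched as follows. This is not the authors' ontology language; the event labels, the rule, and the gap threshold are hypothetical, chosen only to show the composite-event pattern.

```python
from dataclasses import dataclass

@dataclass
class Interval:
    """A detected primitive event with a label and a time span in seconds."""
    label: str
    start: float
    end: float

def before(a, b, max_gap=5.0):
    """Allen-style 'before' constraint: a ends at most max_gap seconds before b starts."""
    return a.end <= b.start <= a.end + max_gap

def recognise_phone_call(primitives):
    """Toy composite-event rule (hypothetical labels): a phone call is a
    'pick_up_phone' primitive followed shortly by 'hold_phone_to_ear'."""
    picks = [p for p in primitives if p.label == "pick_up_phone"]
    holds = [p for p in primitives if p.label == "hold_phone_to_ear"]
    return any(before(p, h) for p in picks for h in holds)
```

Because the rule is declarative, a clinician can read why an event fired, which is the human-understandability benefit the abstract points to.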

5.
Front Aging Neurosci ; 7: 98, 2015.
Article in English | MEDLINE | ID: mdl-26082715

ABSTRACT

Currently, the assessment of autonomy and functional ability involves clinical rating scales. However, scales are often limited in their ability to provide objective and sensitive information. By contrast, information and communication technologies may overcome these limitations by capturing more fully the functional as well as cognitive disturbances associated with Alzheimer's disease (AD). We investigated the quantitative assessment of autonomy in dementia patients based not only on gait analysis but also on the participants' performance on instrumental activities of daily living (IADL) automatically recognized by a video event monitoring system (EMS). Three groups of participants (healthy controls, mild cognitive impairment, and AD patients) had to carry out a standardized scenario consisting of physical tasks (single and dual task) and several IADL, such as preparing a pillbox or making a phone call, while being recorded. Afterwards, video sensor data were processed by an EMS that automatically extracts kinematic parameters of the participants' gait and recognizes their carried-out activities. These parameters were then used for the assessment of the participants' performance levels, here referred to as autonomy. Autonomy assessment was approached as a classification task using artificial intelligence methods that take as input the parameters extracted by the EMS, here referred to as the behavioral profile. Activities were recognized by the EMS with high precision. The most accurately recognized activities were "prepare medication" with 93% and "using phone" with 89% precision. The diagnostic group classifier obtained a precision of 73.46% when combining the analyses of physical tasks with IADL. In a further analysis, the autonomy group classifier obtained a precision of 83.67% when combining physical tasks and IADL.
Results suggest that it is possible to quantitatively assess IADL functioning supported by an EMS and that, even based on the extracted data alone, the groups could be classified with high accuracy. This means that the use of such technologies may provide clinicians with diagnostically relevant information to improve autonomy assessment in real time while decreasing observer bias.

6.
J Alzheimers Dis ; 44(2): 675-85, 2015.
Article in English | MEDLINE | ID: mdl-25362036

ABSTRACT

Over the last few years, the use of new technologies to support elderly people, and dementia patients in particular, has received increasing interest. We investigated the use of a video monitoring system for automatic event recognition for the assessment of instrumental activities of daily living (IADL) in dementia patients. Participants (19 healthy subjects (HC) and 19 mild cognitive impairment (MCI) patients) had to carry out a standardized scenario consisting of several IADLs, such as making a phone call, while they were recorded by 2D video cameras. After the recording session, data were processed by a video signal analysis platform in order to extract kinematic parameters and detect the activities undertaken by the participant. We compared our automated activity quality prediction as well as cognitive health prediction with direct observation annotation and neuropsychological assessment scores. Overall, activities were correctly detected automatically with a sensitivity of 85.31% and a precision of 75.90%. Activity frequency differed significantly between MCI and HC participants (p < 0.05). In all activities, differences in execution time could be identified in the manually and automatically extracted data. We obtained statistically significant correlations between both manually and automatically extracted parameters and neuropsychological test scores (p < 0.05). However, no significant differences were found between the groups according to the IADL scale. The results suggest that it is possible to assess IADL functioning with the help of an automatic video monitoring system and that, even based on the extracted data, significant group differences can be obtained.
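The sensitivity and precision figures reported above follow directly from detection counts. As a reminder of the formulas (the counts below are hypothetical, chosen only to illustrate; the study reports 85.31% sensitivity and 75.90% precision):

```python
def detection_metrics(tp, fp, fn):
    """Sensitivity (recall) and precision from detection counts:
    tp = true positives, fp = false positives, fn = missed detections."""
    sensitivity = tp / (tp + fn)   # fraction of real activities detected
    precision = tp / (tp + fp)     # fraction of detections that are correct
    return sensitivity, precision

# Hypothetical example counts, not taken from the study:
s, p = detection_metrics(tp=120, fp=38, fn=21)
```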


Subject(s)
Activities of Daily Living, Cognitive Dysfunction/diagnosis, Pattern Recognition, Automated/methods, Video Recording/methods, Aged, Biomechanical Phenomena, Cognitive Dysfunction/physiopathology, Cognitive Dysfunction/psychology, Female, Humans, Male, Neuropsychological Tests, Sensitivity and Specificity
7.
Comput Biol Med ; 42(2): 257-64, 2012 Feb.
Article in English | MEDLINE | ID: mdl-22204867

ABSTRACT

We present software (ETHOWATCHER®) developed to support ethography, object tracking and extraction of kinematic variables from digital video files of laboratory animals. The tracking module allows controlled segmentation of the target from the background, extracting image attributes used to calculate the distance traveled, orientation, length, area and a path graph of the experimental animal. The ethography module allows recording of catalog-based behaviors from the environment or from video files, continuously or frame-by-frame. The output reports the duration, frequency and latency of each behavior and the sequence of events in a time-segmented format set by the user. Validation tests were conducted on kinematic measurements and on the detection of known behavioral effects of drugs. This software is freely available at www.ethowatcher.ufsc.br.
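Kinematic variables like the distance traveled reduce to simple geometry once the tracking module has produced a centroid per frame. A minimal sketch of that last step (not ETHOWATCHER's code; the track coordinates and frame rate are made-up values):

```python
import math

def path_length(track):
    """Total distance traveled along a sequence of (x, y) centroids,
    summed over consecutive frame-to-frame displacements."""
    return sum(math.dist(a, b) for a, b in zip(track, track[1:]))

def mean_speed(track, fps):
    """Average speed (distance units per second) for a track sampled at fps frames/s."""
    duration = (len(track) - 1) / fps
    return path_length(track) / duration if duration > 0 else 0.0

# Hypothetical 3-frame centroid track at 2 frames per second:
track = [(0.0, 0.0), (3.0, 4.0), (3.0, 4.0)]
speed = mean_speed(track, fps=2.0)
```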


Subject(s)
Behavior, Animal/physiology, Image Processing, Computer-Assisted/methods, Motor Activity/physiology, Software, Video Recording/methods, Animals, Behavior, Animal/drug effects, Biomechanical Phenomena/physiology, Caffeine/pharmacology, Female, Motor Activity/drug effects, Rats, Rats, Wistar, Reproducibility of Results, User-Computer Interface
8.
Article in English | MEDLINE | ID: mdl-21096517

ABSTRACT

Behavior studies on the neurobiological effects of environmental, pharmacological and physiological manipulations in lab animals try to correlate these procedures with specific changes in animal behavior. Parameters such as duration, latency and frequency are assessed from the visually recorded sequences of behaviors to distinguish changes due to the manipulation. Since the behavioral recording procedure is intrinsically interpretative, high variability in experimental results is expected and common, owing to observer-related influences such as experience, knowledge, stress, fatigue and personal biases. Here, we present a computer program that supports the assessment of inter- and intra-observer concordance using statistical indices (e.g., Kappa and Kendall coefficients and the concordance index). The software was tested in a case study with 4 different observers, naïve to behavioral recording procedures. On paired analysis, the highest agreement achieved was 0.76 (concordance index) and 0.47 (Kappa coefficient, where 0 is no agreement and 1 is total agreement). Observers showed poor concordance indices (lower than 0.7), underscoring concerns about observer recording stability and about precise morphological definitions of the recorded behaviors. These indices can also be used to train observers and to refine behavioral catalogue definitions, as they are related to different aspects of behavioral recording.
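The Kappa coefficient reported above corrects raw agreement for the agreement two observers would reach by chance. A minimal sketch of Cohen's kappa for two observers' categorical recordings (not the presented program's code; the label sequences are made up):

```python
from collections import Counter

def cohens_kappa(obs1, obs2):
    """Cohen's kappa between two observers' label sequences of equal length:
    (observed agreement - chance agreement) / (1 - chance agreement)."""
    assert len(obs1) == len(obs2) and len(obs1) > 0
    n = len(obs1)
    observed = sum(a == b for a, b in zip(obs1, obs2)) / n
    c1, c2 = Counter(obs1), Counter(obs2)
    # chance agreement: product of each label's marginal frequencies
    expected = sum(c1[lab] * c2[lab] for lab in set(obs1) | set(obs2)) / n**2
    return (observed - expected) / (1 - expected)

# Hypothetical recordings of 4 behavior bouts by two observers:
kappa = cohens_kappa(["groom", "rear", "groom", "rear"],
                     ["groom", "rear", "groom", "groom"])
```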


Subject(s)
Behavior, Animal/physiology, Software, Animals, Humans, Mice