Results 1 - 10 of 10
1.
Autism Res ; 16(7): 1360-1374, 2023 07.
Article in English | MEDLINE | ID: mdl-37259909

ABSTRACT

Early behavioral markers for autism include differences in social attention and orienting in response to one's name when called, and differences in body movements and motor abilities. More efficient, scalable, objective, and reliable measures of these behaviors could improve early screening for autism. This study evaluated whether objective and quantitative measures of autism-related behaviors elicited from an app (SenseToKnow) administered on a smartphone or tablet and measured via computer vision analysis (CVA) are correlated with standardized caregiver-report and clinician administered measures of autism-related behaviors and cognitive, language, and motor abilities. This is an essential step in establishing the concurrent validity of a digital phenotyping approach. In a sample of 485 toddlers, 43 of whom were diagnosed with autism, we found that CVA-based gaze variables related to social attention were associated with the level of autism-related behaviors. Two language-related behaviors measured via the app, attention to people during a conversation and responding to one's name being called, were associated with children's language skills. Finally, performance during a bubble popping game was associated with fine motor skills. These findings provide initial support for the concurrent validity of the SenseToKnow app and its potential utility in identifying clinical profiles associated with autism. Future research is needed to determine whether the app can be used as an autism screening tool, can reliably stratify autism-related behaviors, and measure changes in autism-related behaviors over time.
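The concurrent-validity analysis summarized above comes down to correlating app-derived CVA features with standardized caregiver-report and clinician-administered scores. A minimal sketch of that step in Python follows; the column names and synthetic data are placeholders, not SenseToKnow's actual variables.

```python
# Minimal sketch of the concurrent-validity step: correlate app-derived CVA
# features with standardized clinical scores. Column names and the synthetic
# data are placeholders, not SenseToKnow's actual variables.
import numpy as np
import pandas as pd
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
n = 485
df = pd.DataFrame({
    "gaze_percent_social": rng.random(n),     # CVA gaze variable (placeholder)
    "autism_symptom_score": rng.random(n),    # caregiver/clinician measure (placeholder)
    "response_to_name_rate": rng.random(n),
    "language_score": rng.random(n),
    "bubble_pop_accuracy": rng.random(n),
    "fine_motor_score": rng.random(n),
})

pairs = [
    ("gaze_percent_social", "autism_symptom_score"),
    ("response_to_name_rate", "language_score"),
    ("bubble_pop_accuracy", "fine_motor_score"),
]
for app_feature, clinical_measure in pairs:
    rho, p = spearmanr(df[app_feature], df[clinical_measure])
    print(f"{app_feature} vs {clinical_measure}: rho={rho:.2f}, p={p:.3g}")
```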


Subject(s)
Autism Spectrum Disorder , Autistic Disorder , Humans , Autistic Disorder/diagnosis , Autistic Disorder/psychology , Autism Spectrum Disorder/diagnosis , Cognition
2.
IEEE Trans Affect Comput ; 14(2): 919-930, 2023.
Article in English | MEDLINE | ID: mdl-37266390

ABSTRACT

Atypical facial expression is one of the early symptoms of autism spectrum disorder (ASD), characterized by reduced regularity and lack of coordination of facial movements. Automatic quantification of these behaviors can offer novel biomarkers for screening, diagnosis, and treatment monitoring of ASD. In this work, 40 toddlers with ASD and 396 typically developing toddlers were shown developmentally appropriate and engaging movies presented on a smart tablet during a well-child pediatric visit. The movies consisted of social and non-social dynamic scenes designed to evoke certain behavioral and affective responses. The front-facing camera of the tablet was used to capture the toddlers' faces. Facial landmark dynamics were then automatically computed using computer vision algorithms. Subsequently, the complexity of the landmark dynamics was estimated for the eyebrow and mouth regions using multiscale entropy. Compared to typically developing toddlers, toddlers with ASD showed higher complexity (i.e., less predictability) in these landmark dynamics. This complexity in facial dynamics contained novel information not captured by traditional facial affect analyses. These results suggest that computer vision analysis of facial landmark movements is a promising approach for detecting and quantifying early behavioral symptoms associated with ASD.
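The multiscale entropy measure described above can be reproduced generically by coarse-graining a landmark time series and computing sample entropy at each scale. The sketch below is a generic implementation under that assumption, not the authors' exact pipeline.

```python
# Sketch of multiscale (sample) entropy for a facial-landmark time series,
# e.g. the vertical displacement of a mouth landmark across video frames.
# Generic implementation; not the authors' exact pipeline.
import numpy as np

def sample_entropy(x, m=2, r_factor=0.2):
    """Sample entropy of a 1-D signal with embedding dimension m."""
    x = np.asarray(x, dtype=float)
    r = r_factor * np.std(x)
    n = len(x)

    def count_matches(dim):
        templates = np.array([x[i:i + dim] for i in range(n - dim)])
        count = 0
        for i in range(len(templates)):
            dist = np.max(np.abs(templates - templates[i]), axis=1)
            count += np.sum(dist <= r) - 1  # exclude the self-match
        return count

    b, a = count_matches(m), count_matches(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

def multiscale_entropy(x, max_scale=5):
    """Coarse-grain the signal at increasing scales, then compute sample entropy."""
    x = np.asarray(x, dtype=float)
    mse = []
    for scale in range(1, max_scale + 1):
        n = (len(x) // scale) * scale
        coarse = x[:n].reshape(-1, scale).mean(axis=1)
        mse.append(sample_entropy(coarse))
    return mse

# landmark_y would be a per-frame landmark coordinate extracted from the video.
landmark_y = np.random.randn(900)  # placeholder: ~30 s of 30 fps video
print(multiscale_entropy(landmark_y))
```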

3.
J Supercomput ; : 1-29, 2023 May 18.
Article in English | MEDLINE | ID: mdl-37359342

ABSTRACT

The Internet of Things enables the ubiquitous connection of all things, generating countless time-tagged data streams called time series. However, real-world time series are often plagued by missing values due to noise or malfunctioning sensors. Existing methods for modeling such incomplete time series typically involve preprocessing steps, such as deletion or missing-data imputation using statistical learning or machine learning methods. Unfortunately, these methods unavoidably destroy temporal information and introduce error accumulation into the subsequent model. To this end, this paper introduces a novel continuous neural network architecture, named Time-aware Neural Ordinary Differential Equations (TN-ODE), for incomplete time-series modeling. The proposed method not only supports imputing missing values at arbitrary time points, but also enables multi-step prediction at desired time points. Specifically, TN-ODE employs a time-aware Long Short-Term Memory as an encoder, which effectively learns the posterior distribution from partially observed data. Additionally, the derivative of the latent states is parameterized with a fully connected network, thereby enabling continuous-time latent dynamics generation. The proposed TN-ODE model is evaluated on both real-world and synthetic incomplete time-series datasets through data interpolation and extrapolation tasks as well as a classification task. Extensive experiments show that the TN-ODE model outperforms baseline methods in terms of mean squared error on imputation and prediction tasks, as well as accuracy on the downstream classification task.
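As a rough illustration of the encoder-plus-latent-ODE structure described above, the sketch below pairs a standard LSTM over (value, time-gap) pairs with a latent state whose derivative is a small fully connected network, integrated by explicit Euler steps. It follows the spirit of TN-ODE but is not the authors' implementation (which uses a time-aware LSTM and its own solver).

```python
# Simplified sketch of an encoder + latent-ODE model for irregularly sampled
# series, in the spirit of TN-ODE but not the authors' implementation: an LSTM
# consumes (value, time-gap) pairs, and the latent state is evolved by a fully
# connected ODE function integrated with explicit Euler steps.
import torch
import torch.nn as nn

class LatentODEModel(nn.Module):
    def __init__(self, input_dim=1, hidden_dim=32, latent_dim=16):
        super().__init__()
        # Encoder sees each observation plus the time gap since the previous one.
        self.encoder = nn.LSTM(input_dim + 1, hidden_dim, batch_first=True)
        self.to_latent = nn.Linear(hidden_dim, latent_dim)
        # Derivative of the latent state, parameterized by a small MLP.
        self.ode_func = nn.Sequential(
            nn.Linear(latent_dim, 64), nn.Tanh(), nn.Linear(64, latent_dim)
        )
        self.decoder = nn.Linear(latent_dim, input_dim)

    def forward(self, values, times, query_times, n_euler_steps=10):
        # values: (batch, seq, input_dim); times: (batch, seq); query_times: (batch, q)
        gaps = torch.diff(times, dim=1, prepend=times[:, :1])
        _, (h, _) = self.encoder(torch.cat([values, gaps.unsqueeze(-1)], dim=-1))
        z = self.to_latent(h[-1])            # latent state at the last observation
        t0 = times[:, -1]
        outputs = []
        for k in range(query_times.shape[1]):
            dt = (query_times[:, k] - t0).unsqueeze(-1) / n_euler_steps
            zk = z
            for _ in range(n_euler_steps):   # explicit Euler integration
                zk = zk + dt * self.ode_func(zk)
            outputs.append(self.decoder(zk))
        return torch.stack(outputs, dim=1)   # (batch, q, input_dim)

# Toy usage with irregular observation times and arbitrary query times.
model = LatentODEModel()
values = torch.randn(4, 20, 1)
times = torch.sort(torch.rand(4, 20), dim=1).values
query = times[:, -1:] + torch.tensor([[0.1, 0.2, 0.3]])
print(model(values, times, query).shape)     # torch.Size([4, 3, 1])
```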

4.
Sci Rep ; 13(1): 7158, 2023 05 03.
Article in English | MEDLINE | ID: mdl-37137954

ABSTRACT

Differences in social attention are well-documented in autistic individuals, representing one of the earliest signs of autism. Spontaneous blink rate has been used to index attentional engagement, with lower blink rates reflecting increased engagement. We evaluated novel methods using computer vision analysis (CVA) for automatically quantifying patterns of attentional engagement in young autistic children, based on facial orientation and blink rate, which were captured via mobile devices. Participants were 474 children (17-36 months old), 43 of whom were diagnosed with autism. Movies containing social or nonsocial content were presented via an iPad app, and simultaneously, the device's camera recorded the children's behavior while they watched the movies. CVA was used to extract the duration of time the child oriented towards the screen and their blink rate as indices of attentional engagement. Overall, autistic children spent less time facing the screen and had a higher mean blink rate compared to neurotypical children. Neurotypical children faced the screen more often and blinked at a lower rate during the social movies compared to the nonsocial movies. In contrast, autistic children faced the screen less often during social movies than during nonsocial movies and showed no differential blink rate to social versus nonsocial movies.
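As an illustration of how blink rate can be derived from per-frame facial landmarks, the sketch below uses the common eye-aspect-ratio heuristic; the landmark layout, threshold, and minimum closure duration are illustrative assumptions, not the study's CVA pipeline.

```python
# Sketch of a blink-rate estimate from per-frame eye landmarks using the
# eye aspect ratio (EAR); threshold and landmark ordering are illustrative.
import numpy as np

def eye_aspect_ratio(eye):
    """eye: (6, 2) array of landmarks ordered around the eye contour."""
    vertical = np.linalg.norm(eye[1] - eye[5]) + np.linalg.norm(eye[2] - eye[4])
    horizontal = np.linalg.norm(eye[0] - eye[3])
    return vertical / (2.0 * horizontal)

def blink_rate(ear_series, fps=30.0, threshold=0.2, min_frames=2):
    """Count closures where EAR stays below threshold for >= min_frames frames."""
    below = np.asarray(ear_series) < threshold
    blinks, run = 0, 0
    for closed in below:
        run = run + 1 if closed else 0
        if run == min_frames:          # count each closure exactly once
            blinks += 1
    minutes = len(ear_series) / fps / 60.0
    return blinks / minutes if minutes > 0 else 0.0

# ear_per_frame would come from facial landmarks detected in each video frame.
ear_per_frame = 0.3 + 0.02 * np.random.randn(1800)   # placeholder: ~1 min at 30 fps
print(f"blinks per minute: {blink_rate(ear_per_frame):.1f}")
```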


Subject(s)
Attentional Blink , Autistic Disorder , Humans , Child, Preschool , Infant , Attention , Vision, Ocular
5.
J Child Psychol Psychiatry ; 64(1): 156-166, 2023 01.
Article in English | MEDLINE | ID: mdl-35965431

ABSTRACT

BACKGROUND: Early differences in sensorimotor functioning have been documented in young autistic children and infants who are later diagnosed with autism. Previous research has demonstrated that autistic toddlers exhibit more frequent head movement when viewing dynamic audiovisual stimuli, compared to neurotypical toddlers. To further explore this behavioral characteristic, in this study, computer vision (CV) analysis was used to measure several aspects of head movement dynamics of autistic and neurotypical toddlers while they watched a set of brief movies with social and nonsocial content presented on a tablet. METHODS: Data were collected from 457 toddlers, 17-36 months old, during their well-child visit to four pediatric primary care clinics. Forty-one toddlers were subsequently diagnosed with autism. An application (app) displayed several brief movies on a tablet, and the toddlers watched these movies while sitting on their caregiver's lap. The front-facing camera in the tablet recorded the toddlers' behavioral responses. CV was used to measure the participants' head movement rate, movement acceleration, and complexity using multiscale entropy. RESULTS: Autistic toddlers exhibited significantly higher rate, acceleration, and complexity in their head movements while watching the movies compared to neurotypical toddlers, regardless of the type of movie content (social vs. nonsocial). The combined features of head movement acceleration and complexity reliably distinguished the autistic and neurotypical toddlers. CONCLUSIONS: Autistic toddlers exhibit differences in their head movement dynamics when viewing audiovisual stimuli. Higher complexity of their head movements suggests that their movements were less predictable and less stable compared to neurotypical toddlers. CV offers a scalable means of detecting subtle differences in head movement dynamics, which may be helpful in identifying early behaviors associated with autism and providing insight into the nature of sensorimotor differences associated with autism.
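For illustration, head movement rate and acceleration can be approximated from a per-frame head-pose angle by finite differences, as sketched below; the yaw trace, threshold, and frame rate are illustrative assumptions, and complexity could be estimated with the multiscale entropy sketch shown earlier.

```python
# Sketch of head-movement rate and acceleration from a per-frame head-pose
# angle (e.g. yaw in degrees) estimated by computer vision. Thresholds and
# units are illustrative, not the study's parameters.
import numpy as np

def head_movement_features(yaw_deg, fps=30.0, movement_threshold_deg_s=15.0):
    yaw = np.asarray(yaw_deg, dtype=float)
    velocity = np.gradient(yaw) * fps            # deg/s
    acceleration = np.gradient(velocity) * fps   # deg/s^2
    moving = np.abs(velocity) > movement_threshold_deg_s
    return {
        "movement_rate": moving.mean(),                  # fraction of frames in motion
        "mean_abs_acceleration": np.abs(acceleration).mean(),
    }

yaw_per_frame = np.cumsum(np.random.randn(900)) * 0.1   # placeholder yaw trace
print(head_movement_features(yaw_per_frame))
```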


Subject(s)
Autism Spectrum Disorder , Autistic Disorder , Infant , Child, Preschool , Humans , Child , Autistic Disorder/diagnosis , Head Movements , Systems Analysis , Autism Spectrum Disorder/diagnosis
6.
IEEE Access ; 10: 34022-34031, 2022.
Article in English | MEDLINE | ID: mdl-36339795

ABSTRACT

Eye movement assessments have the potential to help in the diagnosis and tracking of neurological disorders. Cerebellar ataxias cause profound and characteristic abnormalities in smooth pursuit, saccades, and fixation. Oculomotor dysmetria (i.e., hypermetric and hypometric saccades) is a common finding in individuals with cerebellar ataxia. In this study, we evaluated a scalable approach for detecting and quantifying oculomotor dysmetria. Eye movement data were extracted from iPhone video recordings of the horizontal saccade task (a standard clinical task in ataxia) and combined with signal processing and machine learning approaches to quantify saccade abnormalities. Entropy-based measures of eye movements during saccades differed significantly between 72 individuals with ataxia and dysmetria and 80 participants with ataxia or Parkinson's disease without dysmetria. A template matching-based analysis demonstrated that saccadic eye movements in patients without dysmetria were more similar to the ideal saccade template. A support vector machine was then used to train and test the ability of multiple signal processing features in combination to distinguish individuals with and without oculomotor dysmetria. The model achieved 78% accuracy (sensitivity = 80%, specificity = 76%). These results show that the combination of signal processing and machine learning approaches applied to iPhone video of saccades allows for the extraction of information pertaining to oculomotor dysmetria in ataxia. Overall, this inexpensive and scalable approach for capturing important oculomotor information may be a useful component of a screening tool for ataxia and could allow frequent at-home assessments of oculomotor function in natural history studies and clinical trials.
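A generic version of the classification step described above, cross-validating an SVM on per-participant signal features and reporting sensitivity and specificity from the confusion matrix, might look like the following; the feature matrix and labels are placeholders.

```python
# Sketch of training an SVM on saccade signal features and reporting
# sensitivity and specificity via cross-validated predictions; the features
# and labels below are random placeholders, not the study's data.
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import confusion_matrix

# X: per-participant features (e.g. entropy measures, template-matching scores);
# y: 1 = oculomotor dysmetria present, 0 = absent.
rng = np.random.default_rng(0)
X = rng.normal(size=(152, 6))          # placeholder feature matrix
y = rng.integers(0, 2, size=152)       # placeholder labels

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
y_pred = cross_val_predict(clf, X, y, cv=5)

tn, fp, fn, tp = confusion_matrix(y, y_pred).ravel()
print(f"accuracy={(tp + tn) / len(y):.2f}, "
      f"sensitivity={tp / (tp + fn):.2f}, specificity={tn / (tn + fp):.2f}")
```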

7.
JAMA Pediatr ; 175(8): 827-836, 2021 08 01.
Article in English | MEDLINE | ID: mdl-33900383

ABSTRACT

Importance: Atypical eye gaze is an early-emerging symptom of autism spectrum disorder (ASD) and holds promise for autism screening. Current eye-tracking methods are expensive and require special equipment and calibration. There is a need for scalable, feasible methods for measuring eye gaze. Objective: Using computational methods based on computer vision analysis, we evaluated whether an app deployed on an iPhone or iPad that displayed strategically designed brief movies could elicit and quantify differences in eye-gaze patterns of toddlers with ASD vs typical development. Design, Setting, and Participants: A prospective study in pediatric primary care clinics was conducted from December 2018 to March 2020, comparing toddlers with and without ASD. Caregivers of 1564 toddlers were invited to participate during a well-child visit. A total of 993 toddlers (63%) completed study measures. Enrollment criteria were aged 16 to 38 months, healthy, English- or Spanish-speaking caregiver, and toddler able to sit and view the app. Participants were screened with the Modified Checklist for Autism in Toddlers-Revised With Follow-up during routine care. Children were referred by their pediatrician for diagnostic evaluation based on results of the checklist or if the caregiver or pediatrician was concerned. Forty toddlers subsequently were diagnosed with ASD. Exposures: A mobile app displayed on a smartphone or tablet. Main Outcomes and Measures: Computer vision analysis quantified eye-gaze patterns elicited by the app, which were compared between toddlers with ASD vs typical development. Results: Mean age of the sample was 21.1 months (range, 17.1-36.9 months), and 50.6% were boys, 59.8% White individuals, 16.5% Black individuals, 23.7% other race, and 16.9% Hispanic/Latino individuals. Distinctive eye-gaze patterns were detected in toddlers with ASD, characterized by reduced gaze to social stimuli and to salient social moments during the movies, and previously unknown deficits in coordination of gaze with speech sounds. The area under the receiver operating characteristic curve discriminating ASD vs non-ASD using multiple gaze features was 0.90 (95% CI, 0.82-0.97). Conclusions and Relevance: The app reliably measured both known and new gaze biomarkers that distinguished toddlers with ASD vs typical development. These novel results may have potential for developing scalable autism screening tools, exportable to natural settings, and enabling data sets amenable to machine learning.
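The discrimination analysis reported above reduces to scoring each toddler from multiple gaze features and computing the area under the ROC curve with a confidence interval. Below is a sketch of that computation with an illustrative logistic-regression scorer and a bootstrapped CI, not the study's actual model or data.

```python
# Sketch of discriminating ASD vs. non-ASD from multiple gaze features, with a
# cross-validated score and a bootstrapped 95% CI for the AUC. The classifier,
# features, and labels are illustrative placeholders.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
X = rng.normal(size=(993, 8))               # placeholder gaze features
y = (rng.random(993) < 0.04).astype(int)    # placeholder diagnosis labels

scores = cross_val_predict(LogisticRegression(max_iter=1000), X, y,
                           cv=5, method="predict_proba")[:, 1]

aucs = []
for _ in range(1000):                        # bootstrap the AUC
    idx = rng.integers(0, len(y), len(y))
    if len(np.unique(y[idx])) == 2:          # need both classes in the resample
        aucs.append(roc_auc_score(y[idx], scores[idx]))
print(f"AUC={roc_auc_score(y, scores):.2f}, "
      f"95% CI=({np.percentile(aucs, 2.5):.2f}, {np.percentile(aucs, 97.5):.2f})")
```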


Subject(s)
Autism Spectrum Disorder/diagnosis , Fixation, Ocular , Mobile Applications , Child, Preschool , Computers, Handheld , Female , Humans , Infant , Male , Primary Health Care , Prospective Studies
8.
J Child Psychol Psychiatry ; 62(9): 1120-1131, 2021 09.
Article in English | MEDLINE | ID: mdl-33641216

ABSTRACT

BACKGROUND: This study is part of a larger research program focused on developing objective, scalable tools for digital behavioral phenotyping. We evaluated whether a digital app delivered on a smartphone or tablet using computer vision analysis (CVA) can elicit and accurately measure one of the most common early autism symptoms, namely failure to respond to a name call. METHODS: During a pediatric primary care well-child visit, 910 toddlers, 17-37 months old, were administered an app on an iPhone or iPad consisting of brief movies during which the child's name was called three times by an examiner standing behind them. Thirty-seven toddlers were subsequently diagnosed with autism spectrum disorder (ASD). Name calls and children's behavior were recorded by the camera embedded in the device, and children's head turns were coded by both CVA and a human. RESULTS: CVA coding of response to name was found to be comparable to human coding. Based on CVA, children with ASD responded to their name significantly less frequently than children without ASD. CVA also revealed that children with ASD who did orient to their name exhibited a longer latency before turning their head. Combining information about both the frequency and the delay in response to name improved the ability to distinguish toddlers with and without ASD. CONCLUSIONS: A digital app delivered on an iPhone or iPad in real-world settings using computer vision analysis to quantify behavior can reliably detect a key early autism symptom-failure to respond to name. Moreover, the higher resolution offered by CVA identified a delay in head turn in toddlers with ASD who did respond to their name. Digital phenotyping is a promising methodology for early assessment of ASD symptoms.
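As a simplified illustration of how response to name and its latency can be scored from computer vision output, the sketch below thresholds a per-frame head-yaw trace within a window after the name call; the threshold, window, and frame rate are illustrative assumptions, not the study's coding scheme.

```python
# Sketch of scoring response to name from a per-frame head-yaw trace: a
# response is a head turn past a yaw threshold within a time window after
# the name call. Threshold and window are illustrative values.
import numpy as np

def response_to_name(yaw_deg, name_call_frame, fps=30.0,
                     turn_threshold_deg=30.0, window_s=5.0):
    """Return (responded, latency in seconds or None)."""
    end = name_call_frame + int(window_s * fps)
    window = np.asarray(yaw_deg[name_call_frame:end])
    turned = np.abs(window) > turn_threshold_deg
    if not turned.any():
        return False, None
    latency = np.argmax(turned) / fps     # first frame past the threshold
    return True, latency

yaw = np.zeros(600)                        # placeholder: facing the screen...
yaw[330:] = 45.0                           # ...then turning toward the examiner
print(response_to_name(yaw, name_call_frame=300))   # (True, 1.0)
```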


Subject(s)
Autism Spectrum Disorder , Autistic Disorder , Autism Spectrum Disorder/diagnosis , Autistic Disorder/diagnosis , Child , Child, Preschool , Humans , Infant
9.
Sci Rep ; 10(1): 18641, 2020 10 29.
Article in English | MEDLINE | ID: mdl-33122811

ABSTRACT

Eye movements are disrupted in many neurodegenerative diseases and are frequent and early features in conditions affecting the cerebellum. Characterizing eye movements is important for diagnosis and may be useful for tracking disease progression and response to therapies. Assessments are limited as they require an in-person evaluation by a neurology subspecialist or specialized and expensive equipment. We tested the hypothesis that important eye movement abnormalities in cerebellar disorders (i.e., ataxias) could be captured from iPhone video. Videos of the face were collected from individuals with ataxia (n = 102) and from a comparative population (Parkinson's disease or healthy participants, n = 61). Computer vision algorithms were used to track the position of the eye which was transformed into high temporal resolution spectral features. Machine learning models trained on eye movement features were able to identify abnormalities in smooth pursuit (a key eye behavior) and accurately distinguish individuals with abnormal pursuit from controls (sensitivity = 0.84, specificity = 0.77). A novel machine learning approach generated severity estimates that correlated well with the clinician scores. We demonstrate the feasibility of capturing eye movement information using an inexpensive and widely accessible technology. This may be a useful approach for disease screening and for measuring severity in clinical trials.
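One plausible way to turn an eye-position trace into the high-temporal-resolution spectral features mentioned above is band power computed with Welch's method, sketched below; the sampling rate and band edges are illustrative assumptions, not the study's parameters.

```python
# Sketch of turning an eye-position trace extracted from video into spectral
# features for a pursuit classifier, using Welch's method. Sampling rate and
# frequency bands are illustrative values.
import numpy as np
from scipy.signal import welch

def spectral_features(eye_x, fs=30.0, bands=((0.1, 1.0), (1.0, 3.0), (3.0, 6.0))):
    """Band power of horizontal eye position in a few frequency bands."""
    freqs, psd = welch(np.asarray(eye_x, dtype=float), fs=fs, nperseg=256)
    features = {}
    for lo, hi in bands:
        mask = (freqs >= lo) & (freqs < hi)
        features[f"power_{lo}-{hi}Hz"] = np.trapz(psd[mask], freqs[mask])
    return features

eye_x = np.sin(2 * np.pi * 0.4 * np.arange(900) / 30.0)   # placeholder pursuit trace
print(spectral_features(eye_x))
```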


Subject(s)
Cell Phone , Cerebellum/physiology , Eye Movements , Machine Learning , Pursuit, Smooth , Adolescent , Adult , Child , Child, Preschool , Female , Humans , Male , Young Adult
10.
Autism ; 23(3): 619-628, 2019 04.
Article in English | MEDLINE | ID: mdl-29595333

ABSTRACT

This study aimed to demonstrate the capability of computer vision analysis to detect atypical orienting and attention behaviors in toddlers with autism spectrum disorder. One hundred and four toddlers, 16-31 months old (mean = 22 months), participated in this study. Twenty-two of the toddlers had autism spectrum disorder and 82 had typical development or developmental delay. Toddlers watched video stimuli on a tablet while the built-in camera recorded their head movement. Computer vision analysis measured participants' attention and orienting in response to name calls. Reliability of the computer vision analysis algorithm was tested against a human rater. Differences in behavior were analyzed between the autism spectrum disorder group and the comparison group. Reliability between computer vision analysis and human coding for orienting to name was excellent (intraclass correlation coefficient 0.84, 95% confidence interval 0.67-0.91). Only 8% of toddlers with autism spectrum disorder oriented to name calling on >1 trial, compared to 63% of toddlers in the comparison group (p = 0.002). Mean latency to orient was significantly longer for toddlers with autism spectrum disorder (2.02 vs 1.06 s, p = 0.04). Sensitivity for autism spectrum disorder of atypical orienting was 96% and specificity was 38%. Older toddlers with autism spectrum disorder showed less attention to the videos overall (p = 0.03). Automated coding offers a reliable, quantitative method for detecting atypical social orienting and reduced sustained attention in toddlers with autism spectrum disorder.
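The agreement statistic reported above (an intraclass correlation between automated and human coding) can be computed from the two raters' scores with a standard two-way random-effects ICC(2,1); the sketch below uses placeholder ratings, not the study's data.

```python
# Sketch of a two-way random-effects, absolute-agreement ICC(2,1) between
# computer vision analysis and a human coder, computed from the ANOVA mean
# squares. Ratings below are synthetic placeholders.
import numpy as np

def icc_2_1(ratings):
    """ratings: (n_subjects, k_raters) array."""
    ratings = np.asarray(ratings, dtype=float)
    n, k = ratings.shape
    grand_mean = ratings.mean()
    row_means = ratings.mean(axis=1)
    col_means = ratings.mean(axis=0)
    ms_rows = k * np.sum((row_means - grand_mean) ** 2) / (n - 1)
    ms_cols = n * np.sum((col_means - grand_mean) ** 2) / (k - 1)
    ss_total = np.sum((ratings - grand_mean) ** 2)
    ss_error = ss_total - (ms_rows * (n - 1) + ms_cols * (k - 1))
    ms_error = ss_error / ((n - 1) * (k - 1))
    return (ms_rows - ms_error) / (
        ms_rows + (k - 1) * ms_error + k * (ms_cols - ms_error) / n)

# Column 0: CVA-derived orienting scores; column 1: human-coded scores (placeholders).
rng = np.random.default_rng(0)
truth = rng.random(104)
ratings = np.column_stack([truth + 0.05 * rng.standard_normal(104),
                           truth + 0.05 * rng.standard_normal(104)])
print(f"ICC(2,1) = {icc_2_1(ratings):.2f}")
```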


Subject(s)
Attention/physiology , Autism Spectrum Disorder/physiopathology , Photic Stimulation/methods , Child, Preschool , Computers , Female , Humans , Infant , Male , Reproducibility of Results , Sensitivity and Specificity