1 - 19 of 19
1.
Science; 384(6701): eadh9979, 2024 Jun 14.
Article En | MEDLINE | ID: mdl-38870291

Understanding cellular architectures and their connectivity is essential for interrogating system function and dysfunction. However, we lack technologies for mapping the multiscale details of individual cells and their connectivity in the human organ-scale system. We developed a platform that simultaneously extracts spatial, molecular, morphological, and connectivity information of individual cells from the same human brain. The platform includes three core elements: a vibrating microtome for ultraprecision slicing of large-scale tissues without losing cellular connectivity (MEGAtome), a polymer hydrogel-based tissue processing technology for multiplexed multiscale imaging of human organ-scale tissues (mELAST), and a computational pipeline for reconstructing three-dimensional connectivity across multiple brain slabs (UNSLICE). We applied this platform for analyzing human Alzheimer's disease pathology at multiple scales and demonstrating scalable neural connectivity mapping in the human brain.


Alzheimer Disease; Brain; Molecular Imaging; Humans; Alzheimer Disease/diagnostic imaging; Brain/diagnostic imaging; Molecular Imaging/methods; Phenotype; Hydrogels/chemistry; Connectome
2.
Ultrasound Med Biol; 50(6): 825-832, 2024 Jun.
Article En | MEDLINE | ID: mdl-38423896

OBJECTIVE: B-lines assessed by lung ultrasound (LUS) outperform physical exam, chest radiograph, and biomarkers for the diagnosis of acute heart failure (AHF) in the emergent setting. However, the use of LUS is limited to trained professionals and suffers from interpretation variability. The objective was to use transfer learning to create AI-enabled software that can help novice users automate LUS B-line interpretation. METHODS: Data from an observational AHF LUS study provided standardized cine clips for AI model development and evaluation. A total of 49,952 LUS frames from 30 patients were hand scored and used to train a convolutional neural network (CNN) to interpret B-lines at the frame level. A random independent evaluation set of 476 LUS clips from 60 unique patients was used to assess model performance. The AI models scored the clips on both a binary and an ordinal 0-4 multiclass scale. RESULTS: A multiclassification AI algorithm had the best performance at the binary level when applied to the independent evaluation set, with an AUC of 0.967 (95% CI 0.965-0.970) for detecting pathologic conditions. Compared with a blinded expert reviewer, the 0-4 multiclassification AI algorithm achieved a linear weighted kappa of 0.839 (95% CI 0.804-0.871). CONCLUSIONS: The multiclassification AI algorithm is a robust, well-performing model for both binary and ordinal multiclass B-line evaluation. This algorithm has the potential to be integrated into clinical workflows to assist users with quantitative and objective B-line assessment for the evaluation of AHF.
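To make the reported clip-level metrics concrete, the sketch below aggregates hypothetical frame-level 0-4 scores into a clip score and computes a linear weighted Cohen's kappa and a binary AUC with scikit-learn. The max-over-frames aggregation rule, the pathologic threshold, and the toy data are illustrative assumptions, not the authors' published method.

```python
# Minimal sketch (assumptions: max-over-frames clip aggregation; toy data, not study data).
import numpy as np
from sklearn.metrics import cohen_kappa_score, roc_auc_score

def clip_scores(frame_scores_per_clip):
    """Aggregate per-frame 0-4 B-line scores into one ordinal score per clip (assumed rule: max)."""
    return np.array([int(np.max(frames)) for frames in frame_scores_per_clip])

# Toy example: 4 clips, each with a list of frame-level model scores and an expert clip score.
model_frames = [[0, 1, 0], [2, 3, 3], [4, 4, 3], [1, 1, 2]]
expert_clip = np.array([0, 3, 4, 2])
model_clip = clip_scores(model_frames)

kappa = cohen_kappa_score(expert_clip, model_clip, weights="linear")   # ordinal agreement
auc = roc_auc_score((expert_clip >= 3).astype(int), model_clip)        # assumed binary threshold for "pathologic"
print(f"linear weighted kappa={kappa:.3f}, binary AUC={auc:.3f}")
```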


Heart Failure; Lung; Ultrasonography; Humans; Heart Failure/diagnostic imaging; Lung/diagnostic imaging; Ultrasonography/methods; Acute Disease; Male; Female; Aged; Middle Aged; Image Interpretation, Computer-Assisted/methods; Machine Learning
3.
Article En | MEDLINE | ID: mdl-38082806

Commercial ultrasound vascular phantoms lack the anatomic diversity required for robust pre-clinical interventional device testing. We fabricated individualized phantoms to test an artificial intelligence-enabled ultrasound-guided surgical robotic system (AI-GUIDE), which allows novices to cannulate deep vessels. After segmenting vessels on computed tomography scans, vessel cores, bony anatomy, and a mold tailored to the skin contour were 3D-printed. Vessel cores were coated in silicone, surrounded in tissue-mimicking gel tailored for ultrasound and needle insertion, and dissolved with water. One upper arm and four inguinal phantoms were constructed. Operators used AI-GUIDE to deploy needles into phantom vessels. Two groin phantoms were tested due to imaging artifacts in the other two phantoms. Six operators (medical experience: none, 3; 1-5 years, 2; 5+ years, 1) inserted 27 inguinal needles with 81% (22/27) success in a median of 48 seconds. Seven operators performed 24 arm injections, without tuning the AI for arm anatomy, with 71% (17/24) success. After excluding failures due to motor malfunction and a defective needle, the success rate was 100% (22/22) in the groin and 85% (17/20) in the arm. Individualized 3D-printed phantoms permit testing of surgical robotics across a large number of operators and different anatomic sites. AI-GUIDE operators rapidly and reliably inserted a needle into target vessels in the upper arm and groin, even without prior medical training. Virtual device trials in individualized 3D-printed phantoms may improve rigor of results and expedite translation. Clinical Relevance: Individualized phantoms enable rigorous and efficient evaluation of interventional devices and reduce the need for animal and human subject testing.


Artificial Intelligence; Needles; Animals; Humans; Ultrasonography; Phantoms, Imaging; Ultrasonography, Interventional/methods
4.
Article En | MEDLINE | ID: mdl-38083265

Fatigue impairs cognitive and motor function, potentially leading to mishaps in high-pressure occupations such as aviation and emergency medical services. The current approach is primarily based on self-assessment, which is subjective and error-prone. An objective method is needed to detect severe and likely dangerous levels of fatigue quickly and accurately. Here, we present a quantitative evaluation tool that uses less than two minutes of facial video, captured using an iPad, to assess fatigue vs. alertness. The tool is fast, easy to use, and scalable, since it uses cameras readily available on consumer electronic devices. We compared the classification performance of a Long Short-Term Memory (LSTM) deep neural network with that of a Random Forest (RF) classifier applied to engineered features informed by domain knowledge. Preliminary results on an 11-subject dataset show that RF outperforms LSTM, with added interpretability from the engineered features. For the RF classifiers, the average areas under the receiver operating characteristic curve, based on 11-fold and individualized 11-fold cross-validation, are 0.72 ± 0.16 and 0.80 ± 0.12, respectively. Equal error rates are 0.34 and 0.26, respectively. This study presents a promising approach for rapid fatigue detection. Additional data will be collected to assess generalizability across populations.
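For readers unfamiliar with the equal error rate quoted above, the sketch below shows one common way to estimate the EER from ROC points with scikit-learn; the synthetic labels and scores are placeholders, not study data.

```python
# Minimal sketch: equal error rate (EER) from an ROC curve (synthetic scores, not study data).
import numpy as np
from sklearn.metrics import roc_curve

def equal_error_rate(y_true, y_score):
    """EER is the operating point where false-positive and false-negative rates are (approximately) equal."""
    fpr, tpr, _ = roc_curve(y_true, y_score)
    fnr = 1.0 - tpr
    idx = int(np.argmin(np.abs(fnr - fpr)))      # closest crossing on the sampled curve
    return float((fpr[idx] + fnr[idx]) / 2.0)

rng = np.random.default_rng(0)
y = rng.integers(0, 2, 200)                      # fatigued (1) vs. alert (0), synthetic labels
scores = y * 0.5 + rng.normal(0, 0.5, 200)       # imperfect classifier scores
print(f"EER ≈ {equal_error_rate(y, scores):.2f}")
```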


Memory, Long-Term; Neural Networks, Computer; ROC Curve; Electronics
5.
Annu Int Conf IEEE Eng Med Biol Soc; 2022: 238-242, 2022 Jul.
Article En | MEDLINE | ID: mdl-36085649

As advances in microscopy imaging provide an ever clearer window into the human brain, accurate reconstruction of neural connectivity can yield valuable insight into the relationship between brain structure and function. However, manual tracing by humans is slow, laborious, and requires domain expertise. Automated methods are thus needed to enable rapid and accurate analysis at scale. In this paper, we explored deep neural networks for dense axon tracing and incorporated axon topological information into the loss function with the goal of improving performance on both voxel-based segmentation and axon centerline detection. We evaluated three approaches using a modified 3D U-Net architecture trained on a mouse brain dataset imaged with light sheet microscopy and achieved a 10% increase in axon tracing accuracy over previous methods. Furthermore, adding centerline awareness to the loss function outperformed the baseline approach across all metrics, including an 8% increase in Rand index.
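The abstract does not spell out the exact topological term, so the PyTorch sketch below only illustrates the general idea of a centerline-aware loss: a standard voxel-wise term plus an extra penalty restricted to a precomputed axon-centerline (skeleton) mask. The weighting and the source of the skeleton mask are assumptions, not the paper's formulation.

```python
# Illustrative sketch of a centerline-aware segmentation loss (assumed form, PyTorch).
import torch
import torch.nn.functional as F

def centerline_aware_loss(pred, target, centerline, alpha=0.5, eps=1e-6):
    """pred: sigmoid probabilities; target: binary axon mask; centerline: binary skeleton mask (same shape)."""
    bce = F.binary_cross_entropy(pred, target)                 # voxel-wise term
    inter = (pred * target).sum()
    dice = 1.0 - (2.0 * inter + eps) / (pred.sum() + target.sum() + eps)
    cl = centerline.bool()
    cl_bce = F.binary_cross_entropy(pred[cl], target[cl]) if cl.any() else pred.new_tensor(0.0)
    return bce + dice + alpha * cl_bce                         # extra weight on centerline voxels

# Toy usage on a random 3D patch
p = torch.rand(1, 1, 8, 8, 8)
t = (torch.rand_like(p) > 0.7).float()
c = (torch.rand_like(p) > 0.95).float()
print(centerline_aware_loss(p, t, c).item())
```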


Algorithms; Imaging, Three-Dimensional; Animals; Axons; Brain/diagnostic imaging; Humans; Imaging, Three-Dimensional/methods; Mice; Neural Networks, Computer
6.
Annu Int Conf IEEE Eng Med Biol Soc; 2022: 1675-1681, 2022 Jul.
Article En | MEDLINE | ID: mdl-36086232

Lung ultrasound (LUS) is gaining support as a diagnostic tool for the diagnosis and management of COVID-19 and a number of other lung pathologies. B-lines are a predominant feature in COVID-19; however, LUS requires a skilled clinician to interpret findings. To facilitate interpretation, our main objective was to develop automated methods to classify B-lines as pathologic vs. normal. We developed transfer learning models based on ResNet networks to classify B-lines as pathologic (at least 3 B-lines per lung field) vs. normal using COVID-19 LUS data. Assessment of B-line severity on a 0-4 multi-class scale was also explored. For binary B-line classification at the frame level, all ResNet models pretrained with ImageNet yielded higher performance than the baseline non-pretrained ResNet-18; the pretrained ResNet-18 had the best equal error rate (EER) of 9.1%, vs. 11.9% for the baseline. At the clip level, all pretrained network models resulted in better Cohen's kappa agreement (linear-weighted) and clip score accuracy, with the pretrained ResNet-18 having the best Cohen's kappa of 0.815 [95% CI: 0.804-0.826] and ResNet-101 the best clip scoring accuracy of 93.6%. Similar results were shown for multi-class scoring, where pretrained network models outperformed the baseline model. A class activation map is also presented to guide clinicians in interpreting LUS findings. Future work aims to further improve the multi-class assessment of B-line severity with a more diverse LUS dataset.
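A minimal transfer-learning setup along these lines might look like the torchvision sketch below; the two-class head, the function name, and the torchvision ≥ 0.13 weights API are assumptions rather than the paper's exact configuration.

```python
# Sketch: ImageNet-pretrained ResNet-18 repurposed for binary B-line classification (assumed setup).
import torch.nn as nn
from torchvision import models

def build_bline_classifier(num_classes=2, pretrained=True):
    weights = models.ResNet18_Weights.IMAGENET1K_V1 if pretrained else None  # torchvision >= 0.13 API
    model = models.resnet18(weights=weights)
    model.fc = nn.Linear(model.fc.in_features, num_classes)  # pathologic vs. normal (or 5 classes for 0-4 scoring)
    return model

model = build_bline_classifier()
# Fine-tune with a standard cross-entropy objective on LUS frames; clip-level scores
# can then be derived by aggregating frame-level predictions.
```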


COVID-19; Deep Learning; COVID-19/diagnostic imaging; Humans; Lung/diagnostic imaging; Thorax; Ultrasonography
7.
Biosensors (Basel); 11(12), 2021 Dec 18.
Article En | MEDLINE | ID: mdl-34940279

Hemorrhage is a leading cause of trauma death, particularly in prehospital environments when evacuation is delayed. Obtaining central vascular access to a deep artery or vein is important for administration of emergency drugs and analgesics, rapid replacement of blood volume, invasive sensing, and emerging life-saving interventions. However, central access is normally performed by highly experienced critical care physicians in a hospital setting. We developed a handheld AI-enabled interventional device, AI-GUIDE (Artificial Intelligence Guided Ultrasound Interventional Device), capable of directing users with no ultrasound or interventional expertise to catheterize a deep blood vessel, with an initial focus on the femoral vein. AI-GUIDE integrates with widely available commercial portable ultrasound systems and guides a user in ultrasound probe localization, venous puncture-point localization, and needle insertion. The system performs vascular puncture robotically and incorporates a preloaded guidewire to facilitate the Seldinger technique of catheter insertion. Results from tissue-mimicking phantom and porcine studies under normotensive and hypotensive conditions provide evidence of the technique's robustness. Key performance metrics in a live porcine model included a mean time to acquire the femoral vein insertion point of 53 ± 36 s (5 users with varying experience, 20 trials), a total time to insert the catheter of 80 ± 30 s (1 user, 6 trials), and a mean of 1.1 (normotensive, 39 trials) and 1.3 (hypotensive, 55 trials) needle insertion attempts (1 user). These performance metrics in a porcine model are consistent with those of experienced medical providers performing central vascular access on humans in a hospital.


Catheterization, Central Venous; Robotic Surgical Procedures; Ultrasonography, Interventional; Animals; Artificial Intelligence; Femoral Vein/diagnostic imaging; Humans; Swine
8.
Front Hum Neurosci; 14: 222, 2020.
Article En | MEDLINE | ID: mdl-32719593

Modern operational environments can place significant demands on a service member's cognitive resources, increasing the risk of errors or mishaps due to overburden. The ability to monitor cognitive burden and associated performance within operational environments is critical to improving mission readiness. As a key step toward a field-ready system, we developed a simulated marksmanship scenario with an embedded working memory task in an immersive virtual reality environment. As participants performed the marksmanship task, they were instructed to remember numbered targets and recall the sequence of those targets at the end of the trial. Low and high cognitive load conditions were defined as the recall of three- and six-digit strings, respectively. Physiological and behavioral signals recorded included speech, heart rate, breathing rate, and body movement. These features were input into a random forest classifier that significantly discriminated between the low- and high-cognitive load conditions (AUC = 0.94). Behavioral features of gait were the most informative, followed by features of speech. We also showed the capability to predict performance on the digit recall (AUC = 0.71) and marksmanship (AUC = 0.58) tasks. The experimental framework can be leveraged in future studies to quantify the interaction of other types of stressors and their impact on operational cognitive and physical performance.
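To make the classification setup concrete, the scikit-learn sketch below trains a random forest on a placeholder feature matrix and reports a cross-validated AUC plus feature importances; the feature names and data are illustrative only, not the study's features or results.

```python
# Sketch: random forest on engineered physiological/behavioral features (placeholder data).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

feature_names = ["stride_time_var", "speech_pause_rate", "heart_rate", "breathing_rate"]  # illustrative names
rng = np.random.default_rng(1)
X = rng.normal(size=(120, len(feature_names)))           # trials x features (synthetic)
y = rng.integers(0, 2, 120)                              # 0 = low load, 1 = high load (synthetic)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
auc = cross_val_score(clf, X, y, cv=5, scoring="roc_auc").mean()
clf.fit(X, y)
ranked = sorted(zip(feature_names, clf.feature_importances_), key=lambda t: -t[1])
print(f"cross-validated AUC ≈ {auc:.2f}")
print("feature ranking:", ranked)
```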

9.
Ultrasound Med Biol; 46(10): 2667-2676, 2020 Oct.
Article En | MEDLINE | ID: mdl-32622685

The purpose of this study was to develop an automated method for classifying liver fibrosis stage ≥F2 based on ultrasound shear wave elastography (SWE) and to assess the system's performance in comparison with a reference manual approach. The reference approach consists of manually selecting a region of interest from each of eight or more SWE images, computing the mean tissue stiffness within each region of interest, and taking the median of those means as the resulting stiffness value. The 527-subject database consisted of 5526 SWE images and pathologist-scored biopsies, with data collected from a single system at a single site. The automated method integrates three modules that assess SWE image quality, select a region of interest from each SWE measurement, and perform machine learning-based, multi-image SWE classification for fibrosis stage ≥F2. Several classification methods were developed and tested using fivefold cross-validation with training, validation, and test sets partitioned by subject. Performance metrics were the area under the receiver operating characteristic curve (AUROC), specificity at 95% sensitivity, and the number of SWE images required. The final automated method yielded an AUROC of 0.93 (95% confidence interval: 0.90-0.94) versus 0.69 (95% confidence interval: 0.65-0.72) for the reference method, 71% specificity at 95% sensitivity versus 5%, and four images per decision versus eight or more. In conclusion, the automated method reported in this study significantly improved the accuracy of ≥F2 classification from SWE measurements and reduced the number of measurements needed, which has the potential to streamline clinical workflow.
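The reference manual summary described above (mean stiffness per ROI, then the median of those means) reduces to a few lines of NumPy; the sketch below assumes each ROI is passed in as an array of per-pixel stiffness values, and the synthetic ROIs are placeholders.

```python
# Sketch of the reference stiffness summary: median of per-image ROI means.
import numpy as np

def reference_stiffness_kpa(roi_values_per_image):
    """roi_values_per_image: list (>= 8 SWE images) of arrays of stiffness values (kPa) inside each ROI."""
    per_image_means = [float(np.nanmean(v)) for v in roi_values_per_image]
    return float(np.median(per_image_means))

# Example with synthetic ROIs from eight SWE acquisitions
rng = np.random.default_rng(0)
rois = [rng.normal(loc=8.0, scale=1.0, size=500) for _ in range(8)]
print(f"reference stiffness ≈ {reference_stiffness_kpa(rois):.2f} kPa")
```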


Elasticity Imaging Techniques/methods; Image Processing, Computer-Assisted; Liver Cirrhosis/classification; Liver Cirrhosis/diagnostic imaging; Adolescent; Adult; Aged; Aged, 80 and over; Female; Humans; Male; Middle Aged; Retrospective Studies; Young Adult
10.
Annu Int Conf IEEE Eng Med Biol Soc; 2019: 993-997, 2019 Jul.
Article En | MEDLINE | ID: mdl-31946060

Endometrial thickness is closely related to gynecological function and is an important biomarker in transvaginal ultrasound (TVUS) examinations for assessing female reproductive health. Manual measurement is time-consuming and subject to high inter- and intra-observer variability. In this paper, we present a fully automated endometrial thickness measurement method using deep learning. Our pipeline consists of: 1) endometrium segmentation using a VGG-based U-Net, and 2) endometrial thickness estimation using the medial axis transform. We conducted experimental studies on 137 2D TVUS cases (74 secretory phase, 63 proliferative phase). On a test set of 27 cases (277 images), the segmentation Dice score was 0.83. For thickness measurement, we achieved mean absolute errors of 1.23/1.38 mm and root mean squared errors of 1.79/1.85 mm on two different test sets. These results are well within the clinically acceptable range of ±2 mm. Furthermore, our phase-stratified analysis shows that the measurement variance in the secretory phase is higher than in the proliferative phase, largely due to the high variability of endometrial appearance in the secretory phase. Future work will extend our current algorithm toward different clinical outcomes for a broader spectrum of clinical applications.
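Step 2 of the pipeline (thickness from the medial axis transform) can be sketched with scikit-image as below; taking twice the maximum medial-axis distance as the thickness, and the pixel-spacing handling, are assumptions rather than the paper's exact rule.

```python
# Sketch: endometrial thickness from a binary segmentation via the medial axis transform (assumed rule).
import numpy as np
from skimage.morphology import medial_axis

def endometrial_thickness_mm(mask, pixel_spacing_mm):
    """mask: 2D boolean endometrium segmentation; returns an estimated thickness in mm."""
    skeleton, distance = medial_axis(mask, return_distance=True)
    radii_px = distance[skeleton]                 # half-thickness at each medial-axis point, in pixels
    if radii_px.size == 0:
        return 0.0
    return 2.0 * float(radii_px.max()) * pixel_spacing_mm

# Toy example: a 10-pixel-thick horizontal band with 0.2 mm pixels -> about 2 mm
mask = np.zeros((64, 64), dtype=bool)
mask[27:37, 5:60] = True
print(f"{endometrial_thickness_mm(mask, 0.2):.2f} mm")
```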


Deep Learning; Endometrium; Algorithms; Endometrium/diagnostic imaging; Female; Humans; Observer Variation; Ultrasonography
11.
Article En | MEDLINE | ID: mdl-30440285

Diffuse liver disease is common, driven primarily by the high prevalence of non-alcoholic fatty liver disease (NAFLD). It is currently assessed by liver biopsy to determine fibrosis, often staged from F0 (normal) to F4 (cirrhosis). A noninvasive assessment method would allow a broader population to be monitored longitudinally, facilitating risk stratification and assessment of treatment efficacy. Ultrasound shear wave elastography (SWE) is a promising noninvasive technique for measuring tissue stiffness that has been shown to correlate with fibrosis stage. However, this approach has been limited by variability in stiffness measurements. In this work, we developed and evaluated an automated framework, called SWE-Assist, that checks SWE image quality, selects a region of interest (ROI), and classifies the ROI to determine whether the fibrosis stage is at or exceeds F2, which is important for clinical decision-making. Our database consists of 3,392 images from 328 cases. Several classifiers, including random forest, support vector machine, and convolutional neural network (CNN), were evaluated. The best approach utilized a CNN and yielded an area under the receiver operating characteristic curve (AUROC) of 0.89, compared to the conventional stiffness-only AUROC of 0.74. Moreover, the new method requires a single image per decision, vs. 10 images per decision for the baseline. A larger dataset is needed to further validate this approach, which has the potential to improve the accuracy and efficiency of non-invasive liver fibrosis staging.


Elasticity Imaging Techniques/methods; Liver Cirrhosis/diagnostic imaging; Adult; Area Under Curve; Female; Humans; Male; Middle Aged; Neural Networks, Computer; Support Vector Machine
12.
Abdom Radiol (NY); 43(4): 786-799, 2018 Apr.
Article En | MEDLINE | ID: mdl-29492605

Ultrasound (US) imaging is the most commonly performed cross-sectional diagnostic imaging modality in the practice of medicine. It is low-cost, non-ionizing, portable, and capable of real-time image acquisition and display. US is a rapidly evolving technology with significant challenges and opportunities. Challenges include high inter- and intra-operator variability and limited image quality control. Tremendous opportunities have arisen in the last decade as a result of exponential growth in available computational power coupled with progressive miniaturization of US devices. As US devices become smaller, enhanced computational capability can contribute significantly to decreasing variability through advanced image processing. In this paper, we review leading machine learning (ML) approaches and research directions in US, with an emphasis on recent ML advances. We also present our outlook on future opportunities for ML techniques to further improve clinical workflow and US-based disease diagnosis and characterization.


Abdomen/diagnostic imaging; Machine Learning; Ultrasonography/methods; Forecasting; Humans
13.
IEEE Trans Robot; 33(1): 81-91, 2017 Feb.
Article En | MEDLINE | ID: mdl-28190986

A system for automatically pointing ultrasound (US) imaging catheters will enable clinicians to monitor anatomical structures and track instruments during interventional procedures. Off-the-shelf US catheters provide high-quality US images from within the patient. While this method of imaging has proven effective for guiding many interventional treatments, significant training is required to overcome the difficulty of manually steering the imager to point at desired structures. Our system uses closed-form four degree-of-freedom (DOF) kinematic solutions to automatically position the US catheter and point the imager. Algorithms for steering and imager pointing were developed for a range of useful diagnostic and interventional motions. The system was validated on a robotic test bed by steering the catheter within a water environment containing phantom objects. While the system described here was designed for pointing ultrasound catheters, these algorithms are applicable to accurate 4-DOF steering and orientation control of any long, thin, tendon-driven tool with single or bi-directional bending.
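The paper's closed-form 4-DOF solution is not reproduced in the abstract, so the sketch below only illustrates the widely used constant-curvature approximation for a tendon-driven catheter tip, mapping bend angle, handle roll, and handle translation to a tip position. Only a single bending plane is shown for brevity; treat it as a generic model, not the authors' kinematics.

```python
# Generic constant-curvature forward kinematics for a tendon-driven catheter (illustrative, not the paper's model).
import numpy as np

def catheter_tip_position(theta, phi, d, bend_len=50.0):
    """theta: bend angle (rad), phi: handle roll (rad), d: handle translation (mm), bend_len: bending-section length (mm)."""
    if abs(theta) < 1e-9:
        p = np.array([0.0, 0.0, bend_len])                  # straight configuration
    else:
        r = bend_len / theta                                # constant-curvature bending radius
        p = np.array([r * (1.0 - np.cos(theta)), 0.0, r * np.sin(theta)])
    roll = np.array([[np.cos(phi), -np.sin(phi), 0.0],
                     [np.sin(phi),  np.cos(phi), 0.0],
                     [0.0,          0.0,         1.0]])     # rotation about the catheter axis
    return roll @ p + np.array([0.0, 0.0, d])               # translate along the insertion axis

print(catheter_tip_position(theta=np.pi / 4, phi=np.pi / 2, d=10.0))
```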

14.
Article En | MEDLINE | ID: mdl-27754495

We present an instrument tracking and visualization system for intra-cardiac ultrasound catheter-guided procedures, enabled through the robotic control of ultrasound catheters. Our system allows for rapid acquisition of 2D ultrasound images and accurate reconstruction and visualization of a 3D volume. The reconstructed volume addresses the limited field of view, an inherent problem of ultrasound imaging, and serves as a navigation map for procedure guidance. Our robotic system can track a moving instrument by continuously adjusting the imaging plane and visualizing the instrument tip. The overall instrument tracking accuracy is 2.2 mm RMS in position and 0.8° in angle.

15.
J Ultrasound Med; 32(12): 2185-2190, 2013 Dec.
Article En | MEDLINE | ID: mdl-24277902

With the proliferation of portable sonography and the increase in nontraditional users, there is an increased need for automated decision support to standardize results. We developed algorithms to evaluate the presence or absence of "B-lines" on thoracic sonography as a marker for interstitial fluid. Algorithm performance was compared against the average of scores from 2 expert clinical sonographers. On the algorithm development set, 90% of the algorithm scores were within 1 point of the average expert scores. On the independent set, a perfect match was achieved. We believe these are the first reported results of computerized B-line scoring.


Algorithms; Dyspnea/diagnostic imaging; Image Interpretation, Computer-Assisted/methods; Lung Diseases/diagnostic imaging; Pattern Recognition, Automated/methods; Thorax/diagnostic imaging; Ultrasonography/methods; Adult; Aged; Aged, 80 and over; Dyspnea/etiology; Female; Humans; Image Enhancement/methods; Lung Diseases/complications; Male; Middle Aged; Observer Variation; Reproducibility of Results; Sensitivity and Specificity; Single-Blind Method; Young Adult
16.
IEEE Int Conf Robot Autom; 2013: 5794-5799, 2013 Dec 31.
Article En | MEDLINE | ID: mdl-24683501

Intracardiac echocardiography (ICE) catheters enable high-quality ultrasound imaging within the heart, but their use in guiding procedures is limited by the difficulty of manually pointing them at structures of interest. This paper presents the design and testing of a catheter steering model for robotic control of commercial ICE catheters. The four actuated degrees of freedom (4-DOF) are the two catheter handle knobs, which produce bi-directional bending, in combination with rotation and translation of the handle. An extra degree of freedom in the system allows the imaging plane (dependent on orientation) to be directed at an object of interest. A closed-form solution for forward and inverse kinematics enables control of the catheter tip position and the imaging plane orientation. The proposed algorithms were validated on a robotic test bed using electromagnetic sensor tracking of the catheter tip. The ability to automatically acquire imaging targets in the heart may improve the efficiency and effectiveness of intracardiac catheter interventions by allowing visualization of soft tissue structures that are not visible under standard fluoroscopic guidance. Although the system has been developed and tested for manipulating ICE catheters, the methods described here are applicable to any long, thin, tendon-driven tool (with single or bi-directional bending) requiring accurate tip position and orientation control.

17.
Article En | MEDLINE | ID: mdl-29862385

Real-time 3D ultrasound (3DUS) imaging offers improved spatial orientation information relative to 2D ultrasound. However, 3DUS volume visualization alone is inadequate for guiding minimally invasive intra-cardiac procedures, where real-time visual feedback of the instrument tip location is crucial. This paper presents a set of enhanced visualization functionalities that track the tip of an instrument in slice views in real time. A user study with an in vitro porcine heart indicates a speedup of over 30% in task completion time.

18.
Med Image Comput Comput Assist Interv; 14(Pt 1): 105-112, 2011.
Article En | MEDLINE | ID: mdl-22003606

Intra-cardiac 3D ultrasound imaging has enabled new minimally invasive procedures. Its narrow field of view, however, limits its efficacy in guiding beating-heart procedures, where geometrically complex and spatially extended moving anatomic structures are often involved. In this paper, we present a system that performs electrocardiograph-gated 4D mosaicing and visualization of 3D ultrasound (3DUS) volumes. Real-time operation is enabled by a GPU implementation. The method is validated on phantom and porcine heart data.
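The core of ECG gating, binning each acquired 3DUS volume by its phase within the R-R interval before compounding, can be sketched as below; the bin count and the nearest-R-peak lookup are assumptions, and the GPU compounding step itself is omitted.

```python
# Sketch: assign 3DUS volumes to cardiac-phase bins from ECG R-peak times (assumed gating scheme).
import numpy as np

def cardiac_phase_bins(volume_times, r_peaks, n_bins=10):
    """volume_times, r_peaks: sorted acquisition / R-peak timestamps (s). Returns a bin index per volume, or None."""
    bins = []
    for t in volume_times:
        i = int(np.searchsorted(r_peaks, t, side="right")) - 1   # last R-peak at or before t
        if i < 0 or i + 1 >= len(r_peaks):
            bins.append(None)                                    # outside the recorded cardiac cycles
            continue
        phase = (t - r_peaks[i]) / (r_peaks[i + 1] - r_peaks[i]) # fraction of the R-R interval
        bins.append(min(int(phase * n_bins), n_bins - 1))
    return bins

# Volumes falling in the same bin would then be mosaicked into one gated 4D frame.
print(cardiac_phase_bins([0.15, 0.42, 0.95, 1.3], r_peaks=[0.0, 0.8, 1.6]))
```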


Electrocardiography/methods; Heart/physiology; Imaging, Three-Dimensional/methods; Ultrasonography/methods; Animals; Cardiac Surgical Procedures/methods; Computer Graphics; Electromagnetic Phenomena; Image Processing, Computer-Assisted; Phantoms, Imaging; Software; Swine; Time Factors; Ultrasonics; User-Computer Interface
19.
Article En | MEDLINE | ID: mdl-22256219

In this paper, we describe our prototype of an ultrasound guidance system that addresses the need for an easy-to-use, cost-effective, and portable technology to improve ultrasound-guided procedures. The system consists of a lockable, articulating needle guide that attaches to an ultrasound probe and a user interface that provides real-time visualization of the predicted needle trajectory overlaid on the ultrasound image. The needle guide ensures proper needle alignment with the ultrasound imaging plane. Moreover, the calculated needle trajectory is superimposed on the real-time ultrasound image, eliminating the need for the practitioner to estimate the target trajectory and thereby reducing injuries from needle readjustment. Finally, the guide is lockable to prevent needle deviation from the desired trajectory during insertion. This feature also frees one of the practitioner's hands to complete simple tasks that would otherwise require a second practitioner. Overall, our system reduces the experience needed to develop the fine hand movements and dexterity required for traditional ultrasound-guided procedures. The system has the potential to increase efficiency, safety, and quality, and to reduce costs, for a wide range of ultrasound-guided procedures. Furthermore, in combination with portable ultrasound machines, this system will enable these procedures to be performed more easily by unskilled practitioners in non-ideal settings such as the battlefield and other disaster relief areas.
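Because the guide constrains the needle to the imaging plane, the predicted trajectory overlay reduces to a 2D line in image coordinates. The sketch below computes such a line from an assumed entry pixel, guide angle, and pixel spacing; these parameters and the geometry convention are illustrative, not the prototype's actual calibration.

```python
# Sketch: in-plane needle trajectory for overlay on the ultrasound image (illustrative geometry).
import numpy as np

def needle_trajectory_px(entry_px, angle_deg, mm_per_px, needle_len_mm, step_mm=1.0):
    """entry_px: (row, col) where the needle enters the image plane; angle_deg: guide angle from the probe face.
    Assumes rows increase with depth and the needle advances toward increasing columns."""
    a = np.deg2rad(angle_deg)
    s = np.arange(0.0, needle_len_mm + step_mm, step_mm)     # distance travelled along the needle (mm)
    rows = entry_px[0] + (s * np.sin(a)) / mm_per_px         # axial (depth) component
    cols = entry_px[1] + (s * np.cos(a)) / mm_per_px         # lateral component
    return np.stack([rows, cols], axis=1)                    # polyline to draw over the live image

pts = needle_trajectory_px(entry_px=(5, 40), angle_deg=45.0, mm_per_px=0.2, needle_len_mm=60.0)
print(pts[:3])
```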


Diagnosis, Computer-Assisted/instrumentation; Needles; Ultrasonography/instrumentation; Humans; Phantoms, Imaging; User-Computer Interface
...