Results 1 - 6 of 6
1.
Sensors (Basel) ; 24(11)2024 May 21.
Article in English | MEDLINE | ID: mdl-38894080

ABSTRACT

Bridges are critical components of transportation networks, and their condition affects societal well-being, the economy, and the environment. The need to automate inspection and maintenance has made structural health monitoring (SHM) systems a key research pillar for assessing bridge safety and health. The last decade brought a boom in innovative bridge SHM applications alongside the rise of next-generation smart and mobile technologies. A key advancement in this direction is the use of smartphones and their built-in sensors as SHM devices. This focused review reports recent advances in bridge SHM backed by smartphone sensor technologies and provides case studies on bridge SHM applications. The review covers model-based and data-driven SHM prospects that use smartphones as the sensing and acquisition portal and conveys three distinct messages in terms of technological domain and level of mobility: (i) vibration-based dynamic identification and damage-detection approaches; (ii) deformation and condition monitoring empowered by computer vision-based measurement capabilities; and (iii) drive-by or pedestrianized bridge monitoring approaches, together with miscellaneous SHM applications featuring unconventional/emerging technologies and new research domains. The review is intended to bring together bridge engineering, SHM, and sensor technology audiences around the decade-long multidisciplinary experience accumulated within the smartphone-based SHM theme, and presents exemplary cases spanning a variety of levels of mobility.


Subject(s)
Smartphone , Humans , Monitoring, Physiologic/instrumentation , Monitoring, Physiologic/methods
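The vibration-based identification approaches surveyed above typically begin by extracting a bridge's dominant modal frequency from an accelerometer record. A minimal sketch of that step via FFT peak picking, using a synthetic 2.5 Hz mode plus noise in place of real smartphone data:

```python
import numpy as np

def dominant_frequency(accel, fs):
    """Estimate the dominant vibration frequency (Hz) of an
    acceleration record via FFT peak picking."""
    accel = np.asarray(accel, dtype=float)
    accel = accel - accel.mean()              # remove DC offset
    spectrum = np.abs(np.fft.rfft(accel))
    freqs = np.fft.rfftfreq(len(accel), d=1.0 / fs)
    return freqs[np.argmax(spectrum)]

# Synthetic record: a 2.5 Hz bridge mode plus noise, sampled at 100 Hz
fs = 100.0
t = np.arange(0, 30, 1.0 / fs)
rng = np.random.default_rng(0)
signal = np.sin(2 * np.pi * 2.5 * t) + 0.3 * rng.standard_normal(t.size)
print(round(dominant_frequency(signal, fs), 2))  # ≈ 2.5
```

Real smartphone records would additionally need detrending and bandpass filtering to suppress handling noise before peak picking.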
2.
Sensors (Basel) ; 23(23)2023 Dec 02.
Article in English | MEDLINE | ID: mdl-38067945

ABSTRACT

Video game trailers are very useful tools for attracting potential players. This research analyzes the emotions that arise while viewing video game trailers and the link between those emotions, storytelling, and visual attention. The methodology consisted of a three-step task test with potential users: the first step identified the perception of indie games; the second step used an eye-tracking device to record gaze plots, heat maps, and fixation points, linking them to attention, viewing patterns, and non-visible areas; the third step interviewed users to understand their impressions and administered questionnaires on the emotions related to the trailer's storytelling and expectations. The results show an effective assessment of visual attention together with visualization patterns, non-visible areas that may affect game expectations, fixation points linked to very specific emotions, and perceived narratives based on the gaze plot. The mixed methodological approach made it possible to obtain relevant data on the link between the emotions perceived by the user and the areas of attention captured by the device. The proposed methodology enables developers to understand the strengths and weaknesses of the information being conveyed so that they can tailor the trailer to the expectations of potential players.


Subject(s)
Video Games , Video Games/psychology , Emotions , Perception , Visual Perception
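The heat maps mentioned above are conventionally built by accumulating a Gaussian kernel at each recorded fixation. A minimal sketch with hypothetical fixation coordinates (the kernel width and frame size are assumptions, not the study's settings):

```python
import numpy as np

def fixation_heatmap(points, width, height, sigma=20.0):
    """Accumulate a Gaussian kernel at each gaze fixation to
    produce an attention heat map over the stimulus frame."""
    ys, xs = np.mgrid[0:height, 0:width]
    heat = np.zeros((height, width))
    for px, py in points:
        heat += np.exp(-((xs - px) ** 2 + (ys - py) ** 2) / (2 * sigma ** 2))
    return heat

# Three fixations clustered on one element, one stray fixation
fixations = [(60, 40), (62, 44), (58, 42), (150, 90)]
heat = fixation_heatmap(fixations, width=200, height=120)
peak_y, peak_x = np.unravel_index(np.argmax(heat), heat.shape)
print(peak_x, peak_y)  # peak lands inside the fixation cluster
```

The hottest cell falls at the cluster's center of mass, which is how dense fixation groups surface as areas of attention.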
3.
Soft Robot ; 10(3): 467-481, 2023 Jun.
Article in English | MEDLINE | ID: mdl-36251962

ABSTRACT

Equipping soft robotic grippers with sensing and perception capabilities faces significant challenges due to their high compliance and flexibility, limiting their ability to successfully interact with the environment. In this work, we propose a sensorized soft robotic finger with an embedded marker pattern that integrates a high-speed neuromorphic event-based camera to enable finger proprioception and exteroception. A learning-based approach involving a convolutional neural network is developed to process event-based heat maps and achieve specific sensing tasks. The feasibility of the sensing approach for proprioception is demonstrated by showing its ability to predict the two-dimensional deformation of three points located on the finger structure, whereas the exteroception capability is assessed in a slip detection task that can classify slip heat maps at a temporal resolution of 2 ms. Our results show that the proposed approach can enable complete sensorization of the finger for both proprioception and exteroception using a single camera without negatively affecting the finger compliance. Using such a sensorized finger in robotic grippers may provide safe, adaptive, and precise grasping for handling a wide category of objects.


Subject(s)
Robotics , Fingers , Neural Networks, Computer , Proprioception , Hand Strength
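The CNN's input described above is an event-based heat map accumulated over a short time window. A minimal sketch of that accumulation step (the network itself is omitted), assuming events arrive as (x, y, timestamp) tuples; the array shape and event values are illustrative only:

```python
import numpy as np

def event_heatmap(events, shape, t0, window=0.002):
    """Accumulate DVS events (x, y, t) that fall inside a 2 ms
    window into a per-pixel count map (the classifier's input)."""
    heat = np.zeros(shape, dtype=np.int32)
    for x, y, t in events:
        if t0 <= t < t0 + window:
            heat[y, x] += 1
    return heat

# Two events inside the window at pixel (3, 2); one arrives too late
events = [(3, 2, 0.0005), (3, 2, 0.0012), (7, 5, 0.0030)]
hm = event_heatmap(events, shape=(8, 10), t0=0.0)
print(hm[2, 3], hm.sum())  # 2 2
```

Because only pixels that changed fire events, slip shows up as bursts of counts along the marker pattern, which is what makes 2 ms classification windows feasible.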
4.
Sensors (Basel) ; 21(16)2021 Aug 06.
Article in English | MEDLINE | ID: mdl-34450760

ABSTRACT

To overcome the limitation in flight time and enable unmanned aerial vehicles (UAVs) to survey remote sites of interest, this paper investigates an approach involving the collaboration with public transportation vehicles (PTVs) and the deployment of charging stations. In particular, the focus of this paper is on the deployment of charging stations. In this approach, a UAV first travels with some PTVs, and then flies through some charging stations to reach remote sites. While the travel time with PTVs can be estimated by the Monte Carlo method to accommodate various uncertainties, we propose a new coverage model to compute the travel time taken for UAVs to reach the sites. With this model, we formulate the optimal deployment problem with the goal of minimising the average travel time of UAVs from the depot to the sites, which can be regarded as a reflection of the quality of surveillance (QoS) (the shorter the better). We then propose an iterative algorithm to place the charging stations. We show that this algorithm ensures that any movement of a charging station leads to a decrease in the average travel time of UAVs. To demonstrate the effectiveness of the proposed method, we make a comparison with a baseline method. The results show that the proposed model can more accurately estimate the travel time than the most commonly used model, and the proposed algorithm can relocate the charging stations to achieve a lower flight distance than the baseline method.
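The iterative relocation idea can be illustrated with a much-simplified stand-in for the paper's algorithm: each charging station moves to the centroid of the sites currently assigned to it, a k-means-style update that never increases the average site-to-station distance. Straight-line distance stands in for travel time here, and the PTV legs, Monte Carlo estimation, and coverage model are all omitted:

```python
import numpy as np

def relocate_stations(sites, stations, iters=50):
    """Alternate nearest-station assignment with centroid moves;
    each move weakly decreases the average site-to-station distance."""
    sites = np.asarray(sites, dtype=float)
    stations = np.asarray(stations, dtype=float)
    for _ in range(iters):
        # distance from every site to every station
        d = np.linalg.norm(sites[:, None] - stations[None], axis=2)
        assign = d.argmin(axis=1)
        for k in range(len(stations)):
            served = sites[assign == k]
            if len(served):
                stations[k] = served.mean(axis=0)
    return stations

# Two clusters of remote sites; stations settle at the cluster centers
sites = [(0, 0), (0, 2), (10, 0), (10, 2)]
stations = relocate_stations(sites, [(2.0, 1.0), (8.0, 1.0)])
print(np.round(stations, 1))  # → [[ 0.  1.] [10.  1.]]
```

The monotone-improvement property mirrors the paper's guarantee that any accepted station movement reduces the average UAV travel time.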

5.
Front Robot AI ; 8: 630935, 2021.
Article in English | MEDLINE | ID: mdl-33718442

ABSTRACT

Sensory feedback is essential for the control of soft robotic systems and to enable deployment in a variety of different tasks. Proprioception refers to sensing the robot's own state and is of crucial importance for deploying soft robotic systems outside of laboratory environments, i.e., where no external sensing, such as motion capture systems, is available. A vision-based sensing approach for a soft robotic arm made from fabric is presented, leveraging the high-resolution sensory feedback provided by cameras. No mechanical interaction between the sensor and the soft structure is required, and consequently the compliance of the soft system is preserved. The integration of a camera into an inflatable, fabric-based bellow actuator is discussed. Three actuators, each featuring an integrated camera, are used to control the spherical robotic arm and simultaneously provide sensory feedback on the two rotational degrees of freedom. A convolutional neural network architecture predicts the two angles describing the robot's orientation from the camera images. Ground truth data is provided by a motion capture system during the training phase of the supervised learning approach and its evaluation thereafter. The camera-based sensing approach is able to provide estimates of the orientation in real time with an accuracy of about one degree. The reliability of the sensing approach is demonstrated by using the sensory feedback to control the orientation of the robotic arm in closed loop.
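The closed-loop use of the camera-based estimate can be sketched with a simple proportional controller, a stand-in for whatever controller the arm actually uses; the CNN is replaced here by the true orientation plus roughly one degree of noise, matching the reported sensing accuracy:

```python
import numpy as np

def p_control_step(est_angles, target, gain=0.5):
    """One proportional-control update computed from the
    camera-based orientation estimate (degrees)."""
    est = np.asarray(est_angles, dtype=float)
    tgt = np.asarray(target, dtype=float)
    return gain * (tgt - est)

# Simulate the loop: the arm moves by the commanded correction,
# while the (stand-in) vision estimate carries ~1 degree of noise.
rng = np.random.default_rng(1)
angles = np.array([0.0, 0.0])          # current orientation (deg)
target = np.array([15.0, -10.0])
for _ in range(40):
    estimate = angles + rng.normal(0.0, 1.0, size=2)  # ±1° sensing
    angles = angles + p_control_step(estimate, target)
print(np.round(angles))
```

The residual wobble around the target is set by the sensing noise scaled through the gain, which is why the roughly one-degree accuracy is sufficient for orientation control.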

6.
Med Biol Eng Comput ; 56(2): 297-305, 2018 Feb.
Article in English | MEDLINE | ID: mdl-28714049

ABSTRACT

Hardness, dimensions, and location of biological tissues are important parameters for electronic palpation protocols with standardized performance. This study presents a novel fluid-type tactile sensor able to measure size and depth of heterogeneous substances in elastic bodies. The new sensor is very simple and can be easily fabricated. It consists of an image sensor, LED lights, and a touchpad filled with translucent water. The intensity field of the light traveling in the touchpad is analyzed to estimate the touchpad shape which conforms to the shape of an object in contact. The use of the new sensor for measuring size and depth of heterogeneous substances inside elastic bodies as well as hardness of elastic bodies is illustrated. Results obtained for breast cancer dummies demonstrate the effectiveness of the proposed approach.


Subject(s)
Palpation/instrumentation , Touch , Breast Neoplasms/diagnosis , Equipment Design , Female , Humans , Models, Theoretical , Vision, Ocular
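The core inference step above, recovering the contact region's size and location from the drop in the light intensity field, can be sketched as a threshold-and-centroid computation; the 20% intensity-drop threshold and the synthetic image are illustrative assumptions, not the paper's calibration:

```python
import numpy as np

def contact_region(intensity, baseline, drop=0.2):
    """Segment pixels whose light intensity fell by more than
    `drop` relative to the no-contact baseline; return the region's
    area (pixel count) and centroid as a size/location estimate."""
    mask = intensity < baseline * (1.0 - drop)
    area = int(mask.sum())
    if area == 0:
        return 0, None
    ys, xs = np.nonzero(mask)
    return area, (float(xs.mean()), float(ys.mean()))

# A 4x4 pressed region darkens the otherwise uniform light field
base = np.ones((20, 20))
img = base.copy()
img[8:12, 5:9] = 0.5
area, centroid = contact_region(img, base)
print(area, centroid[0], centroid[1])  # 16 6.5 9.5
```

Mapping the measured intensity drop back to indentation depth, and hence to hardness, would additionally require the per-device calibration of the light-propagation model described in the paper.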