Results 1 - 15 of 15
1.
Genet Sel Evol ; 56(1): 29, 2024 Apr 16.
Article in English | MEDLINE | ID: mdl-38627636

ABSTRACT

BACKGROUND: With the introduction of digital phenotyping and high-throughput data, traits that were previously difficult or impossible to measure directly have become easily accessible, offering the opportunity to enhance the efficiency and rate of genetic gain in animal production. It is of interest to assess how behavioral traits relate indirectly to production traits during the performance testing period. The aim of this study was to assess the quality of behavior data extracted from day-wise video recordings and to estimate the genetic parameters of behavior traits and their phenotypic and genetic correlations with production traits in pigs. Behavior was recorded for 2008 purebred female pigs, beginning at on-test at about 10 weeks of age and continuing for 70 days until off-test, totaling 119,812 day-wise records. Behavior traits included time spent eating, drinking, laterally lying, sternally lying, sitting, and standing, and meters of distance traveled. A quality control procedure was created for algorithm training and adjustment, standardizing recording hours, removing culled animals, and filtering unrealistic records. RESULTS: Production traits included average daily gain (ADG), back fat thickness (BF), and loin depth (LD). Single-trait linear models were used to estimate heritabilities of the behavior traits, and two-trait linear models were used to estimate genetic correlations between behavior and production traits. The results indicated that all behavior traits are heritable, with heritability estimates ranging from 0.19 to 0.57, and showed low-to-moderate phenotypic and genetic correlations with production traits. Two-trait linear models were also used to compare traits at different intervals of the recording period. To analyze redundancies in the behavior data during the recording period, the averages of various recording time intervals for the behavior and production traits were compared.
Overall, the average of the 55- to 68-day recording interval had the strongest phenotypic and genetic correlation estimates with the production traits. CONCLUSIONS: Digital phenotyping is a new, low-cost method for recording behavior phenotypes, but thorough data-cleaning procedures are needed. Evaluating behavioral traits at different time intervals offers deeper insight into how they change throughout the growth period and how they relate to production traits, which may be recorded less frequently.
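The interval comparison above can be sketched with a plain phenotypic correlation between a behavior trait averaged over a recording window and a production trait. This is only an illustration with fabricated numbers; the study itself estimated parameters with single- and two-trait linear mixed models, not raw correlations.

```python
# Sketch: Pearson (phenotypic) correlation between a behavior trait averaged
# over a recording interval (e.g. days 55-68) and a production trait.
# All data below are invented for illustration.
from math import sqrt

def pearson(x, y):
    """Plain Pearson correlation coefficient between two equal-length lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def interval_mean(daily_records, start_day, end_day):
    """Average day-wise records over a closed day interval (1-indexed)."""
    window = daily_records[start_day - 1:end_day]
    return sum(window) / len(window)

# Hypothetical pigs: 70 daily eating times (min/day) and one ADG value each.
daily_eating = [[60 + d * 0.2 + i for d in range(70)] for i in range(5)]
adg = [0.80, 0.82, 0.85, 0.88, 0.90]  # kg/day, fabricated
eating_55_68 = [interval_mean(rec, 55, 68) for rec in daily_eating]
r = pearson(eating_55_68, adg)
```

With real records, the same windowing would be repeated for each candidate interval to find the one most strongly correlated with the production traits.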


Subject(s)
Feeding Behavior, Swine/genetics, Female, Animals, Phenotype, Linear Models
2.
Sci Rep ; 13(1): 1855, 2023 02 01.
Article in English | MEDLINE | ID: mdl-36725967

ABSTRACT

The signal modelling framework JimenaE dynamically simulates Boolean networks. In contrast to SQUAD, it calculates all system states systematically rather than heuristically; these features are not present in CellNetAnalyzer or BoolNet. JimenaE is an expert extension of Jimena, with newly optimized code, network conversion into different formats, and rapid convergence both for system state calculation and for all three network centralities. It determines network states with higher accuracy and allows networks to be dissected, identifying the type and amount of network control for each protein with high accuracy. Biological examples demonstrate this: (i) High plasticity of mesenchymal stromal cells for differentiation into chondrocytes, osteoblasts, and adipocytes, where differentiation-specific network control focuses on wnt, TGF-beta, and PPAR-gamma signaling. JimenaE allows individual proteins to be studied and interactions (or autocrine loops) to be removed or added, and it accurately quantifies the effects as well as the number of system states. (ii) Dynamical modelling of cell-cell interactions of the plant Arabidopsis thaliana with Pseudomonas syringae DC3000: we analyze for the first time the pathogen perspective and its interaction with the host. We then provide a detailed analysis of how plant hormonal regulation stimulates specific proteins and of which protein has which type and amount of network control, including a detailed heatmap of the A. thaliana response distinguishing between two states of the immune response. (iii) In an immune response network of dendritic cells confronted with Aspergillus fumigatus, JimenaE accurately calculates the specific values for centralities and protein-specific network control, including chemokine and pattern recognition receptors.
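The systematic (non-heuristic) idea behind enumerating all system states can be illustrated on a toy Boolean network: evaluate every one of the 2^n states exhaustively and keep the fixed points. This is a generic sketch, not JimenaE's actual algorithm or API.

```python
# Sketch: exhaustive enumeration of all states of a small Boolean network,
# collecting the fixed points (steady states). Toy network, not JimenaE code.
from itertools import product

def fixed_points(update_funcs):
    """Enumerate all 2^n states; return those the update map sends to themselves."""
    n = len(update_funcs)
    stable = []
    for state in product((0, 1), repeat=n):
        nxt = tuple(f(state) for f in update_funcs)
        if nxt == state:
            stable.append(state)
    return stable

# Toy 3-node network: a mutual-inhibition switch plus a follower node.
rules = [
    lambda s: 1 - s[1],   # node 0 is ON when node 1 is OFF
    lambda s: 1 - s[0],   # node 1 is ON when node 0 is OFF
    lambda s: s[0],       # node 2 copies node 0
]
print(fixed_points(rules))  # → [(0, 1, 0), (1, 0, 1)]
```

The two fixed points are the two stable configurations of the toggle switch; exhaustive enumeration like this is exact but only feasible for modest n, which is why heuristic tools trade completeness for scale.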


Subject(s)
Proteins, Software, Signal Transduction, Cell Communication, Cell Differentiation
3.
Animals (Basel) ; 13(2)2023 Jan 10.
Article in English | MEDLINE | ID: mdl-36670787

ABSTRACT

The objectives were to determine the sensitivity, specificity, and cutoff values of a visual-based precision livestock technology (NUtrack), and to determine the sensitivity and specificity of sickness score data collected via live observation by trained human observers. At weaning, pigs (n = 192; gilts and barrows) were randomly assigned to one of twelve pens (16/pen), and treatments were randomly assigned to pens. Sham-pen pigs all received subcutaneous saline (3 mL). In LPS pens, all pigs received subcutaneous lipopolysaccharide (LPS; 300 µg/kg BW; E. coli O111:B4; in 3 mL of saline). For the last treatment, eight pigs were randomly assigned to receive LPS and the other eight received the sham treatment (same methods as above; half-and-half pens). Human data from the day of the challenge showed a high true-positive and low false-positive rate (88.5% sensitivity; 85.4% specificity; 0.871 area under the curve, AUC); however, these values declined when half-and-half pigs were scored (75% sensitivity; 65.5% specificity; 0.703 AUC). Precision technology measures had excellent AUC, sensitivity, and specificity for the first 72 h after treatment, and AUC values were >0.970 regardless of pen treatment. These results indicate that precision technology has greater potential for identifying pigs during a natural infectious disease event than trained professionals using timepoint sampling.
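The diagnostic metrics reported above have simple definitions that can be sketched directly. The labels, scores, and cutoff below are fabricated; the study's cutoffs came from ROC analysis of real sickness scores.

```python
# Sketch: sensitivity, specificity at a cutoff, and ROC AUC via the
# rank (Mann-Whitney) formulation. All numbers are invented.
def sens_spec(labels, scores, cutoff):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP) at a score cutoff."""
    tp = sum(1 for y, s in zip(labels, scores) if y == 1 and s >= cutoff)
    fn = sum(1 for y, s in zip(labels, scores) if y == 1 and s < cutoff)
    tn = sum(1 for y, s in zip(labels, scores) if y == 0 and s < cutoff)
    fp = sum(1 for y, s in zip(labels, scores) if y == 0 and s >= cutoff)
    return tp / (tp + fn), tn / (tn + fp)

def auc(labels, scores):
    """Probability a random positive outscores a random negative (ties = 0.5)."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0 for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

labels = [1, 1, 1, 0, 0, 0]              # 1 = LPS-challenged, 0 = sham
scores = [0.9, 0.8, 0.4, 0.5, 0.2, 0.1]  # hypothetical sickness scores
se, sp = sens_spec(labels, scores, cutoff=0.45)
# se = 2/3 (one LPS pig falls below the cutoff), sp = 2/3 (one sham above it)
```

Sweeping the cutoff and plotting sensitivity against 1 − specificity traces the ROC curve whose area the AUC summarizes.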

4.
Transl Anim Sci ; 6(3): txac082, 2022 Jul.
Article in English | MEDLINE | ID: mdl-35875422

ABSTRACT

Animal behavior is indicative of health status, and changes in behavior can indicate health issues (e.g., illness, stress, or injury). Currently, human observation (HO) is the only method for detecting behavior changes that may indicate problems in group-housed pigs. While HO is effective, it has limitations: it is time consuming, it disrupts natural behaviors, and it cannot be maintained continuously. To address these limitations, a computer vision platform (NUtrack) was developed to identify (ID) and continuously monitor specific behaviors of group-housed pigs on an individual basis. The objectives of this study were to evaluate the capabilities of the NUtrack system and to evaluate changes in behavior patterns of group-housed nursery pigs over time. The NUtrack system was installed above four nursery pens to monitor the behavior of 28 newly weaned pigs during a 42-d nursery period. Pigs were stratified by sex and litter and randomly assigned to one of two pens (14 pigs/pen) for the first 22 d. On day 23, pigs were split into four pens (7 pigs/pen). To evaluate the NUtrack system's capabilities, 800 video frames containing 11,200 individual observations were randomly selected across the nursery period. Each frame was visually evaluated to verify the NUtrack system's accuracy for ID and classification of behavior. The NUtrack system achieved an overall ID accuracy of 95.6%. ID accuracy was 93.5% during the first 22 d and increased (P < 0.001) to 98.2% for the final 20 d. Of the ID errors, 72.2% were due to mislabeled IDs and 27.8% were due to loss of ID. The NUtrack system classified lying, standing, walking, at the feeder (ATF), and at the waterer (ATW) behaviors accurately at rates of 98.7%, 89.7%, 88.5%, 95.6%, and 79.9%, respectively. Behavior data indicated that the time budget for lying, standing, and walking in nursery pigs was 77.7% ± 1.6%, 8.5% ± 1.1%, and 2.9% ± 0.4%, respectively.
In addition, behavior data indicated that nursery pigs spent 9.9% ± 1.7% and 1.0% ± 0.3% of their time ATF and ATW, respectively. Results suggest that the NUtrack system can detect, identify, maintain ID of, and classify specific behaviors of group-housed nursery pigs for the duration of the 42-d nursery period. Overall, results suggest that, with continued research, the NUtrack system may provide a viable real-time precision livestock tool with the ability to assist producers in monitoring behaviors and potential behavior changes of group-housed pigs.
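A time budget like the one reported above is just the fraction of observed time assigned to each behavior class. The per-frame labels below are fabricated to mirror the paper's percentages; the real system summarizes 42 days of continuous video.

```python
# Sketch: deriving a time budget (percent of observed time per behavior)
# from per-frame behavior labels. Counts are invented for illustration.
from collections import Counter

def time_budget(frame_labels):
    """Fraction of frames per behavior, expressed as percentages."""
    counts = Counter(frame_labels)
    total = len(frame_labels)
    return {behavior: 100.0 * n / total for behavior, n in counts.items()}

frames = (["lying"] * 777 + ["standing"] * 85 + ["walking"] * 29
          + ["ATF"] * 99 + ["ATW"] * 10)   # 1000 fabricated frame labels
budget = time_budget(frames)
# budget["lying"] == 77.7 with these made-up counts
```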

5.
Int J Mol Sci ; 22(5)2021 Mar 05.
Article in English | MEDLINE | ID: mdl-33807854

ABSTRACT

We observed substantial differences in predicted Major Histocompatibility Complex II (MHCII) epitope presentation of SARS-CoV-2 proteins across populations, but only minor differences in predicted MHCI epitope presentation. A comparison of this predicted epitope MHC coverage during the early phase of infection spread (until day 15 after reaching 128 observed infection cases) revealed highly significant negative correlations with the case fatality rate. Specifically, this was observed across populations for MHC class II presentation of the viral spike protein (p-value: 0.0733 for linear regression), the envelope protein (p-value: 0.023), and the membrane protein (p-value: 0.00053), indicating that the high case fatality rates of COVID-19 observed in some countries appear to be related to poor MHC class II presentation, and hence a weak adaptive immune response, against these viral envelope proteins. Our results highlight the general importance of the SARS-CoV-2 structural proteins for immunological control during early infection spread, based on a global census of various countries that takes the case fatality rate into account. Other factors, such as health systems and control measures, become more important after the early spread. Our study should encourage further studies on MHCII alleles as potential risk factors in COVID-19, including assessment of local populations and specific allele distributions.
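The population-level analysis above rests on ordinary least-squares regression of case fatality rate on predicted epitope coverage. A minimal sketch with fabricated coverage scores and fatality rates (the study's values came from epitope-prediction tools and country-level case data):

```python
# Sketch: least-squares fit of y ~ a + b*x with R^2, the kind of simple
# linear regression used to relate coverage to fatality. Data are invented.
def ols(x, y):
    """Return (slope, intercept, R^2) of the least-squares line."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((a - mx) ** 2 for a in x)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    b = sxy / sxx
    a = my - b * mx
    ss_res = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return b, a, 1.0 - ss_res / ss_tot

coverage = [0.2, 0.4, 0.5, 0.7, 0.9]   # hypothetical MHCII coverage scores
cfr = [9.0, 7.5, 6.0, 4.0, 2.5]        # hypothetical early case fatality, %
slope, intercept, r2 = ols(coverage, cfr)
# slope < 0: higher predicted coverage tracks lower early fatality
```

A p-value for the slope would additionally require its standard error and a t-distribution, which is omitted here for brevity.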


Subject(s)
COVID-19/mortality, Histocompatibility Antigens Class II/genetics, Histocompatibility Antigens Class II/immunology, SARS-CoV-2/chemistry, Viral Structural Proteins/chemistry, Adaptive Immunity, Alleles, COVID-19/immunology, COVID-19/transmission, Computational Biology/methods, Data Correlation, Epitopes, B-Lymphocyte/genetics, Epitopes, B-Lymphocyte/immunology, Epitopes, T-Lymphocyte/genetics, Epitopes, T-Lymphocyte/immunology, HLA Antigens/genetics, Histocompatibility Antigens Class I/genetics, Histocompatibility Antigens Class I/immunology, Humans, Mortality, SARS-CoV-2/immunology, Viral Structural Proteins/immunology
6.
Mil Med ; 186(Suppl 1): 281-287, 2021 01 25.
Article in English | MEDLINE | ID: mdl-33499491

ABSTRACT

INTRODUCTION: The U.S. Space Force was stood up on December 20, 2019 as an independent branch under the Air Force, consisting of about 16,000 active duty and civilian personnel focused singularly on space. In addition to the Space Force, the plans by NASA and private industry for exploration-class long-duration missions to the moon, near-earth asteroids, and Mars make semi-independent medical capability in space a priority. Current practice for space-based medicine is limited and relies on a "life-raft" scenario for emergencies. Discussions by working groups on military space-based medicine include placing a Role III-equivalent facility in a lunar surface station. Surgical capability is a key requirement for that facility. MATERIALS AND METHODS: To prepare for the eventuality of surgery in space, it is necessary to develop low-mass, low-power miniature surgical robots that could serve as a celestial replacement for existing terrestrial robots. The current study focused on developing semi-autonomous capability in surgical robotics, specifically task automation. Two approaches to sensing end-effector tissue interaction were developed: visual feedback from the robot to detect tissue contact, and motor current waveform measurements to detect contact force. RESULTS: Using a pixel-to-pixel deep neural network, we achieved an accuracy of nearly 90% for contact/no-contact detection. Large torques were predicted well by a trained long short-term memory recurrent network, but the technique did not predict small torques well. CONCLUSION: Surgical capability on long-duration missions will require human/machine teaming with semi-autonomous surgical robots. Our existing small, lightweight, low-power miniature robots perform multiple essential tasks in one design, including hemostasis, fluid management, and suturing for traumatic wounds, and are fully insertable for internal surgical procedures.
To prepare for the eventuality of an emergency surgery in space, it is essential that automated surgical robot capabilities be developed.


Subject(s)
Aerospace Medicine, Robotics, Humans, Moon
7.
J Insect Sci ; 20(6)2020 Nov 01.
Article in English | MEDLINE | ID: mdl-33135753

ABSTRACT

The horn fly, Haematobia irritans L. (Diptera: Muscidae), is a persistent pest of cattle globally. A threshold of 200 flies per animal is considered the standard management goal; however, determining when that threshold has been exceeded is difficult using visual estimates that tend to overestimate the actual fly densities and are, at best, subjective. As a result, a more reliable and durable method of determining horn fly densities is needed. Here, we describe the methods commonly used to quantify horn fly densities including visual estimates and digital photography, and provide examples of quantification software and the prospect for computer automation methods.


Subject(s)
Entomology/methods, Insect Control/methods, Muscidae, Animals, Entomology/instrumentation, Insect Control/instrumentation, Photography/veterinary, Population Density
8.
Sensors (Basel) ; 20(13)2020 Jun 30.
Article in English | MEDLINE | ID: mdl-32630011

ABSTRACT

Tracking individual animals in a group setting is a demanding task for computer vision and animal science researchers. When the objective is months of uninterrupted tracking and the targeted animals lack discernible differences in their physical characteristics, the task introduces significant challenges. To address these challenges, a probabilistic tracking-by-detection method is proposed. The tracking method uses, as input, visible keypoints of individual animals provided by a fully-convolutional detector. Individual animals are also equipped with ear tags that are used by a classification network to assign unique identification to instances. The fixed cardinality of the targets is leveraged to create a continuous set of tracks, and the forward-backward algorithm is used to assign ear-tag identification probabilities to each detected instance. Tracking achieves real-time performance on consumer-grade hardware, in part because it does not rely on complex, costly, graph-based optimizations. A publicly available, human-annotated dataset is introduced to evaluate tracking performance. This dataset contains 15 half-hour-long videos of pigs with various ages/sizes, facility environments, and activity levels. Results demonstrate that the proposed method achieves an average precision and recall greater than 95% across the entire dataset. Analysis of the error events reveals the environmental conditions and social interactions that are most likely to cause errors in real-world deployments.
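The forward-backward step mentioned above can be sketched on a toy two-identity track: noisy per-frame ear-tag classifier probabilities act as emissions, a "sticky" transition matrix encodes that a track rarely changes identity, and smoothing yields per-frame identity posteriors. The model and numbers below are illustrative, not the paper's actual parameters.

```python
# Sketch: forward-backward smoothing of per-frame identity probabilities.
# emissions: T x N likelihoods; transition: N x N; prior: length N.
def forward_backward(emissions, transition, prior):
    """Return T x N smoothed (posterior) state probabilities."""
    n = len(prior)
    T = len(emissions)
    # Forward pass, normalized at each step to avoid underflow.
    alpha = [[prior[i] * emissions[0][i] for i in range(n)]]
    alpha[0] = [a / sum(alpha[0]) for a in alpha[0]]
    for t in range(1, T):
        a = [emissions[t][j] * sum(alpha[t - 1][i] * transition[i][j]
                                   for i in range(n)) for j in range(n)]
        alpha.append([x / sum(a) for x in a])
    # Backward pass (normalization is harmless: posteriors are renormalized).
    beta = [[1.0] * n for _ in range(T)]
    for t in range(T - 2, -1, -1):
        b = [sum(transition[i][j] * emissions[t + 1][j] * beta[t + 1][j]
                 for j in range(n)) for i in range(n)]
        beta[t] = [x / sum(b) for x in b]
    # Combine and normalize per frame.
    post = []
    for t in range(T):
        g = [alpha[t][i] * beta[t][i] for i in range(n)]
        post.append([x / sum(g) for x in g])
    return post

emis = [[0.9, 0.1], [0.4, 0.6], [0.8, 0.2]]  # noisy tag reads, 2 identities
trans = [[0.95, 0.05], [0.05, 0.95]]          # identities rarely swap
post = forward_backward(emis, trans, prior=[0.5, 0.5])
# the ambiguous middle frame is pulled toward identity 0 by its neighbors
```

This is why smoothing helps a tracker: a single bad tag read in the middle of a confident track is overruled by its temporal context.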


Subject(s)
Algorithms, Animal Identification Systems, Housing, Animal, Livestock, Animals, Datasets as Topic, Swine
9.
Sensors (Basel) ; 19(4)2019 Feb 19.
Article in English | MEDLINE | ID: mdl-30791377

ABSTRACT

Computer vision systems have the potential to provide automated, non-invasive monitoring of livestock animals; however, the lack of public datasets with well-defined targets and evaluation metrics presents a significant challenge for researchers. Consequently, existing solutions often focus on achieving task-specific objectives using relatively small, private datasets. This work introduces a new dataset and method for instance-level detection of multiple pigs in group-housed environments. The method uses a single fully-convolutional neural network to detect the location and orientation of each animal, where both body part locations and pairwise associations are represented in the image space. Accompanying this method is a new dataset containing 2000 annotated images with 24,842 individually annotated pigs from 17 different locations. The proposed method achieves over 99% precision and over 96% recall when detecting pigs in environments previously seen by the network during training. To evaluate the robustness of the trained network, it is also tested on environments and lighting conditions unseen in the training set, where it achieves 91% precision and 67% recall. The dataset is publicly available for download.

10.
Biomed Sci Instrum ; 51: 289-96, 2015.
Article in English | MEDLINE | ID: mdl-25996730

ABSTRACT

Shifting demographics in the U.S. have created an urgent need to reform the policies, practices, and technology associated with delivering healthcare to geriatric populations. Automated monitoring systems can improve quality of life while reducing healthcare costs for individuals aging in place. For these systems to be successful, both activity detection and localization are important, but most existing research focuses on only one of these technologies, and systems that do collect both treat the data sources separately. Here, we present SLAD (Simultaneous Localization and Activity Detection), a novel framework for simultaneously processing data collected from localization and activity classification systems. Using a hidden Markov model and machine learning techniques, SLAD fuses these two sources of data in real time using a probabilistic likelihood framework, which allows activity data to refine localization, and vice versa. To evaluate the system, a wireless sensor network was deployed to collect RSSI data and IMU data concurrently from a wrist-worn watch; the RSSI data were processed using a radial basis function neural network localization algorithm, and the resulting position likelihoods were combined with the likelihoods from an IMU activity classification algorithm. In an experiment conducted in an indoor office environment, the proposed method produced 97% localization accuracy and 85% activity classification accuracy.
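The core fusion idea can be sketched as multiplying the two independent likelihood sources over joint (location, activity) states with a compatibility prior, then normalizing. The states, likelihoods, and compatibility weights below are invented for illustration; SLAD additionally chains such steps through a hidden Markov model over time.

```python
# Sketch: single-step fusion of localization and activity likelihoods.
# compat[loc][act] is a prior weight on the joint state (e.g. "cooking"
# is unlikely in the bedroom). All numbers are fabricated.
def fuse(loc_like, act_like, compat):
    """Return the normalized joint posterior over (location, activity)."""
    joint = {}
    for loc, pl in loc_like.items():
        for act, pa in act_like.items():
            joint[(loc, act)] = pl * pa * compat[loc][act]
    z = sum(joint.values())
    return {k: v / z for k, v in joint.items()}

loc_like = {"kitchen": 0.6, "bedroom": 0.4}   # from RSSI localizer
act_like = {"cooking": 0.5, "sleeping": 0.5}  # from IMU classifier
compat = {"kitchen": {"cooking": 1.0, "sleeping": 0.1},
          "bedroom": {"cooking": 0.1, "sleeping": 1.0}}
post = fuse(loc_like, act_like, compat)
# the compatibility prior sharpens location: (kitchen, cooking) dominates
```

Marginalizing the joint posterior over activities gives a refined location estimate, and vice versa, which is exactly the mutual-refinement behavior described above.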

11.
Article in English | MEDLINE | ID: mdl-25570416

ABSTRACT

As a first step toward building a smart home behavioral monitoring system capable of classifying a wide variety of human behavior, a wireless sensor network (WSN) system is presented for RSSI localization. The low-cost, non-intrusive system uses a smart watch worn by the user to broadcast data to the WSN, where the strength of the radio signal is evaluated at each WSN node to localize the user. A method is presented that uses simultaneous localization and mapping (SLAM) for system calibration, providing automated fingerprinting associating the radio signal strength patterns to the user's location within the living space. To improve the accuracy of localization, a novel refinement technique is introduced that takes into account typical movement patterns of people within their homes. Experimental results demonstrate that the system is capable of providing accurate localization results in a typical living space.
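The fingerprinting idea described above reduces, at its simplest, to matching a live RSSI vector against calibration fingerprints. The rooms, node count, and signal values below are invented; the paper builds its fingerprint map automatically via SLAM rather than from a hand-made table.

```python
# Sketch: nearest-fingerprint RSSI localization. Fingerprints are mean RSSI
# (dBm) seen at each of 3 hypothetical WSN nodes; all values fabricated.
from math import dist

fingerprints = {
    "kitchen": [-40, -70, -80],
    "bedroom": [-75, -45, -60],
    "living":  [-60, -60, -50],
}

def localize(rssi):
    """Return the room whose fingerprint is nearest in Euclidean distance."""
    return min(fingerprints, key=lambda room: dist(fingerprints[room], rssi))

print(localize([-42, -68, -78]))  # → kitchen
```

Refinements such as the movement-pattern prior mentioned above would re-weight these distances by how plausible each room transition is.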


Subject(s)
Behavior, Monitoring, Ambulatory/methods, Wireless Technology, Activities of Daily Living, Algorithms, Calibration, Automatic Data Processing, Humans, Internet, Lasers, Movement, Probability, Signal Processing, Computer-Assisted
12.
Stud Health Technol Inform ; 184: 235-41, 2013.
Article in English | MEDLINE | ID: mdl-23400163

ABSTRACT

The availability of digital stereoscopic video feedback on surgical robotic platforms allows for a variety of enhancements through the application of computer vision. Several of these enhancements, such as augmented reality and semi-automated surgery, benefit significantly from identification of the robotic manipulators within the field of view. A method is presented for the extraction of robotic manipulators from stereoscopic views of the operating field that uses a combination of marker tracking, inverse kinematics, and computer rendering. This method is shown to accurately identify the locations of the manipulators within the views. It is further demonstrated that this method can be used to enhance 3D reconstruction of the operating field and produce augmented views.


Subject(s)
Connective Tissue/surgery, Imaging, Three-Dimensional/methods, Plastic Surgery Procedures/instrumentation, Robotics/methods, Surgery, Computer-Assisted/methods, User-Computer Interface, Humans
13.
Surg Endosc ; 26(12): 3413-7, 2012 Dec.
Article in English | MEDLINE | ID: mdl-22648119

ABSTRACT

BACKGROUND: Accurate real-time 3D models of the operating field have the potential to enable augmented reality for endoscopic surgery. A new system is proposed to create real-time 3D models of the operating field that uses a custom miniaturized stereoscopic video camera attached to a laparoscope and an image-based reconstruction algorithm implemented on a graphics processing unit (GPU). METHODS: The proposed system was evaluated in a porcine model that approximates the viewing conditions of in vivo surgery. To assess the quality of the models, a synthetic view of the operating field was produced by overlaying a color image on the reconstructed 3D model, and an image rendered from the 3D model was compared with a 2D image captured from the same view. RESULTS: Experiments conducted with an object of known geometry demonstrate that the system produces 3D models accurate to within 1.5 mm. CONCLUSIONS: The ability to produce accurate real-time 3D models of the operating field is a significant advancement toward augmented reality in minimally invasive surgery. An imaging system with this capability will potentially transform surgery by helping novice and expert surgeons alike to delineate variance in internal anatomy accurately.
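The geometry underlying image-based stereo reconstruction is depth from disparity for a rectified pair: Z = f·B/d. The focal length, baseline, and disparity below are illustrative placeholders, not the parameters of the paper's miniaturized camera.

```python
# Sketch: depth from disparity for a rectified stereo pair, Z = f * B / d.
# All camera parameters are assumed values for illustration.
def depth_from_disparity(focal_px, baseline_mm, disparity_px):
    """Depth (mm) of a point observed with the given pixel disparity."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_mm / disparity_px

f = 800.0   # focal length in pixels (assumed)
b = 5.0     # stereo baseline in mm (assumed, small for a miniature camera)
z = depth_from_disparity(f, b, disparity_px=40.0)
# z == 100.0 mm: a 40 px disparity places the surface 10 cm from the camera
```

Because depth error grows with distance (disparity shrinks as 1/Z), a short-baseline laparoscopic camera works best at the close ranges typical of the operating field, consistent with the ~1.5 mm accuracy reported above.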


Subject(s)
Imaging, Three-Dimensional, Laparoscopy/methods, Animals, Computer Systems, Swine
14.
Stud Health Technol Inform ; 173: 92-6, 2012.
Article in English | MEDLINE | ID: mdl-22356964

ABSTRACT

Robotic surgical platforms require vision feedback systems, which often consist of low-resolution, expensive, single-imager analog cameras. These systems are retooled for 3D display by simply doubling the cameras and outboard control units. Here, a fully-integrated digital stereoscopic video camera employing high-definition sensors and a class-compliant USB video interface is presented. This system can be used with low-cost PC hardware and consumer-level 3D displays for tele-medical surgical applications including military medical support, disaster relief, and space exploration.


Subject(s)
Depth Perception, Imaging, Three-Dimensional/methods, Robotics, Video-Assisted Surgery/economics, Video-Assisted Surgery/instrumentation, Minimally Invasive Surgical Procedures
15.
Stud Health Technol Inform ; 163: 454-60, 2011.
Article in English | MEDLINE | ID: mdl-21335838

ABSTRACT

Motor-based tracking and image-based tracking are considered for three-dimensional in vivo tracking of the arms of a surgical robot during minimally invasive surgery. Accurate tracking is necessary for tele-medical applications and for the future automation of surgical procedures. An experiment is performed to compare the accuracy of the two methods, and results show that the positioning error of image-based tracking is significantly less than that of motor-based tracking.


Subject(s)
Image Interpretation, Computer-Assisted/methods, Imaging, Three-Dimensional/methods, Pattern Recognition, Automated/methods, Photogrammetry/methods, Robotics/methods, Surgery, Computer-Assisted/methods, User-Computer Interface