Results 1 - 20 of 24
1.
Trends Genet ; 35(3): 223-234, 2019 03.
Article in English | MEDLINE | ID: mdl-30691868

ABSTRACT

Data commons collate data with cloud computing infrastructure and commonly used software services, tools, and applications to create biomedical resources for the large-scale management, analysis, harmonization, and sharing of biomedical data. Over the past few years, data commons have been used to analyze, harmonize, and share large-scale genomics datasets. Data ecosystems can be built by interoperating multiple data commons. Curating, importing, and analyzing the data in a data commons can be quite labor intensive. Data lakes offer an alternative: they simply provide access to data, with curation and analysis deferred until later and delegated to those who access the data. We review software platforms for managing, analyzing, and sharing genomic data, with an emphasis on data commons, but also cover data ecosystems and data lakes.


Subject(s)
Cloud Computing/trends , Genomics/methods , Information Dissemination/methods , Software , Big Data , Biomedical Research/trends , Computational Biology/trends , Humans
2.
Sensors (Basel) ; 22(15)2022 Jul 26.
Article in English | MEDLINE | ID: mdl-35898074

ABSTRACT

There is a growing body of literature recognizing the importance of multi-robot coordination and modular robotics. This work evaluates the secure coordination of an Unmanned Aerial Vehicle (UAV), via a drone simulation in Unity, and an Unmanned Ground Vehicle (UGV), as a rover. Each robot is equipped with sensors that gather information to send to a cloud server, where all computations are performed. Each vehicle is registered on a blockchain ledger, which provides network security. In addition, relevant information and alerts are displayed on a website for users. UAV-UGV cooperation allows for autonomous surveillance thanks to the UAV's high-vantage field of view. Furthermore, offloading computation to the cloud lowers the cost of the microcontrollers by reducing their complexity. Lastly, blockchain technology mitigates the security issues posed by adversarial or malicious robotic nodes that connect to the cluster without agreeing to privacy rules and norms.


Subject(s)
Cloud Computing , Robotic Surgical Procedures , Robotics , Cloud Computing/standards , Cloud Computing/trends , Computer Simulation , Privacy , Robotic Surgical Procedures/standards , Robotic Surgical Procedures/trends , Robotics/instrumentation , Robotics/methods , Unmanned Aerial Devices/standards
4.
Sensors (Basel) ; 20(3)2020 Feb 02.
Article in English | MEDLINE | ID: mdl-32024221

ABSTRACT

The recent development of human-carried mobile devices has driven the rapid growth of mobile crowdsensing systems. Most existing mobile crowdsensing systems depend on the crowdsensing service of the deep cloud. With increasing scale and complexity, there is a trend toward enhancing mobile crowdsensing with the edge computing paradigm to reduce latency and computational complexity and to improve expandability and security. In this paper, we propose an integrated solution to stimulate strategic users to contribute more to truth discovery in edge-assisted mobile crowdsensing. We design an incentive mechanism consisting of a truth discovery stage and a budget-feasible reverse auction stage. In the truth discovery stage, we estimate the truth for each task in both the deep cloud and the edge cloud. In the budget-feasible reverse auction stage, we design a greedy algorithm that selects winners to maximize the quality function under the budget constraint. Through extensive simulations, we demonstrate that the proposed mechanism is computationally efficient, individually rational, truthful, budget feasible, and constant approximate. Moreover, the proposed mechanism shows great superiority in terms of estimation precision and expandability.
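The greedy, budget-constrained winner selection described in this abstract can be sketched as a short loop. This is an illustrative sketch only, not the paper's mechanism: the record layout (`id`, `bid`, `quality`) and the quality-per-cost ranking rule are assumptions, and a real budget-feasible auction would additionally specify a truthful payment rule.

```python
# Hedged sketch of greedy winner selection under a budget constraint.
# The data layout (id/bid/quality) is an illustrative assumption.

def select_winners(users, budget):
    """Pick users in descending order of quality-per-cost, skipping any
    user whose bid no longer fits in the remaining budget."""
    ranked = sorted(users, key=lambda u: u["quality"] / u["bid"], reverse=True)
    winners, spent = [], 0.0
    for u in ranked:
        if spent + u["bid"] <= budget:
            winners.append(u["id"])
            spent += u["bid"]
    return winners, spent
```

For example, with a budget of 3 and bids (2, 5, 1) carrying qualities (10, 10, 1), the second user's bid no longer fits after the first is chosen, so the first and third users win.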


Subject(s)
Cell Phone , Algorithms , Cloud Computing/trends , Computer Security/trends , Data Collection/trends , Humans , Records
5.
Genome Res ; 25(10): 1417-22, 2015 Oct.
Article in English | MEDLINE | ID: mdl-26430150

ABSTRACT

The last 20 years have been a remarkable era for biology and medicine. One of the most significant achievements has been the sequencing of the first human genomes, which has laid the foundation for profound insights into human genetics, the intricacies of regulation and development, and the forces of evolution. Incredibly, as we look into the future over the next 20 years, we see the very real potential for sequencing more than 1 billion genomes, bringing even deeper insight into human genetics as well as the genetics of millions of other species on the planet. Realizing this great potential for medicine and biology, though, will only be achieved through the integration and development of highly scalable computational and quantitative approaches that can keep pace with the rapid improvements to biotechnology. In this perspective, I aim to chart out these future technologies, anticipate the major themes of research, and call out the challenges ahead. One of the largest shifts will be in the training used to prepare the class of 2035 for their highly interdisciplinary world.


Subject(s)
Biology/trends , Genetic Research , Genome, Human , Genomics/trends , Biotechnology/trends , Cloud Computing/trends , Computational Biology/trends , Data Collection/trends , Electronic Data Processing , Forecasting , Humans , Information Storage and Retrieval/trends
7.
J Med Internet Res ; 20(8): e10886, 2018 08 08.
Article in English | MEDLINE | ID: mdl-30089608

ABSTRACT

BACKGROUND: Outbreaks of several serious infectious diseases have occurred in recent years. In response, to mitigate public health risks, countries worldwide have dedicated efforts to establishing information systems for effective disease monitoring, risk assessment, and early warning management for international disease outbreaks. A cloud computing framework can effectively provide the required hardware resources and information access and exchange to conveniently connect information related to infectious diseases and develop a cross-system surveillance and control system for infectious diseases. OBJECTIVE: The objective of our study was to develop a Hospital Automated Laboratory Reporting (HALR) system based on such a framework and evaluate its effectiveness. METHODS: We collected data for 6 months and analyzed the cases reported within this period by the HALR and the Web-based Notifiable Disease Reporting (WebNDR) systems. Furthermore, system evaluation indicators were gathered, including sensitivity and specificity. RESULTS: The HALR system reported 15 pathogens and 5174 cases, and the WebNDR system reported 34 cases. In a comparison of the two systems, sensitivity was 100% and specificity varied according to the reported pathogen. In particular, the specificities for Streptococcus pneumoniae, Mycobacterium tuberculosis complex, and hepatitis C virus were 99.8%, 96.6%, and 97.4%, respectively. However, the specificities for influenza virus and hepatitis B virus were only 79.9% and 47.1%, respectively. After the reported data were integrated with patients' diagnostic results in their electronic medical records (EMRs), the specificities for influenza virus and hepatitis B virus increased to 89.2% and 99.1%, respectively. CONCLUSIONS: The HALR system can provide early reporting of specified pathogens according to test results, allowing for early detection of outbreaks and providing trends in infectious disease data. The results of this study show that the sensitivity and specificity of early disease detection can be increased by integrating the reported data in the HALR system with cases' clinical information (eg, diagnostic results) in EMRs, thereby enhancing the control and prevention of infectious diseases.
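The sensitivity and specificity figures quoted above reduce to standard confusion-matrix ratios. As a generic illustration (the counts in the example are hypothetical, not taken from the study):

```python
# Standard confusion-matrix metrics; example counts below are made up.

def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
    return tp / (tp + fn), tn / (tn + fp)
```

With, say, 34 true positives, no false negatives, 990 true negatives, and 10 false positives, this yields 100% sensitivity and 99% specificity.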


Subject(s)
Cloud Computing/trends , Communicable Diseases/epidemiology , Electronic Health Records/trends , Population Surveillance/methods , Humans
9.
Curr Med Sci ; 41(6): 1134-1150, 2021 Dec.
Article in English | MEDLINE | ID: mdl-34939144

ABSTRACT

The application of artificial intelligence (AI) technology in the medical field has a long history of development. In turn, long-standing problems and challenges in the medical field have prompted diverse research teams to continue exploring AI in depth. With the development of advanced technologies such as the Internet of Things (IoT), cloud computing, big data, and 5G mobile networks, AI technology has been adopted more widely in the medical field. In addition, the deep integration of AI and IoT technology enables the gradual improvement of medical diagnosis and treatment capabilities, so that services can be provided to the public more effectively. In this work, we examine the technical foundations of IoT, cloud computing, big data analysis, and machine learning in clinical medicine, together with specific algorithmic concepts such as activity recognition, behavior recognition, anomaly detection, and decision-support systems, to describe scenario-based applications: remote diagnosis and treatment collaboration, the neonatal intensive care unit, the cardiology intensive care unit, emergency first aid, venous thromboembolism, monitoring and nursing, image-assisted diagnosis, and others. We also systematically summarize the application of AI and IoT in clinical medicine, analyze the main challenges, and comment on trends and future developments in this field.


Subject(s)
Artificial Intelligence/trends , Big Data , Clinical Medicine/trends , Cloud Computing/trends , Internet of Things/trends , Algorithms , Humans , Machine Learning
10.
PLoS One ; 16(8): e0256223, 2021.
Article in English | MEDLINE | ID: mdl-34415945

ABSTRACT

Cryptographic cloud storage makes optimal use of cloud storage infrastructure to outsource sensitive and mission-critical data. The continuous growth of encrypted data outsourced to cloud storage requires continuous updating. Attacks such as file injection are reported to compromise user confidentiality as a consequence of information leakage during updates. Dynamic schemes are therefore required to provide forward privacy guarantees: updates should not leak information to the untrusted server about previously issued queries. The challenge is thus to design an efficient searchable encryption scheme with dynamic updates and forward privacy guarantees. In this paper, a novel private multi-linked dynamic index for encrypted document retrieval, named Pindex, is proposed. The multi-linked dynamic index is constructed using a probabilistic homomorphic encryption mechanism and secret orthogonal vectors. Full security proofs for correctness and forward privacy in the random oracle model are provided. Experiments on the real-world Enron dataset demonstrate that our construction is practical and efficient. The security and performance analysis of Pindex shows that the dynamic multi-linked index guarantees forward privacy without significant loss of efficiency.
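The forward-privacy goal can be illustrated with a toy per-keyword-counter index. This is a deliberately simplified sketch in the spirit of generic forward-private schemes, not the Pindex construction from the paper: each update is stored under a fresh pseudorandom token, so the untrusted server cannot link a new update to any previously issued search query until the keyword is searched again.

```python
import hashlib
import hmac

class ForwardPrivateIndex:
    """Toy per-keyword-counter index (illustrative only; NOT Pindex).
    The client keeps the key and the per-keyword counters; the server
    stores only token -> document-id pairs."""

    def __init__(self, key: bytes):
        self.key = key
        self.counters = {}   # client state: keyword -> number of updates
        self.store = {}      # untrusted server: pseudorandom token -> doc id

    def _token(self, keyword: str, i: int) -> bytes:
        # Token for the i-th update of a keyword: PRF(key, keyword || i).
        return hmac.new(self.key, f"{keyword}|{i}".encode(), hashlib.sha256).digest()

    def add(self, keyword: str, doc_id: str):
        i = self.counters.get(keyword, 0)
        self.store[self._token(keyword, i)] = doc_id  # looks random to the server
        self.counters[keyword] = i + 1

    def search(self, keyword: str):
        # Searching reveals the tokens for counters 0..n-1, and only then
        # can the server link those earlier updates to this keyword.
        n = self.counters.get(keyword, 0)
        return [self.store[self._token(keyword, i)] for i in range(n)]
```

A real scheme would also encrypt the stored identifiers and handle deletions; the sketch only shows why per-update fresh tokens prevent the update-time leakage the abstract describes.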


Subject(s)
Abstracting and Indexing/standards , Cloud Computing/trends , Computer Security/trends , Algorithms , Confidentiality/standards , Humans , Outsourced Services/standards , Privacy , Records
11.
PLoS One ; 15(1): e0226981, 2020.
Article in English | MEDLINE | ID: mdl-31905210

ABSTRACT

This paper explores the significance of narrative in collaborative reasoning through a qualitative case study of two teams of intelligence analysts who took part in an exercise on an online collaborative platform. Digital ethnographic methods were used to analyze the chat transcripts of the analysts as they reasoned with evidence provided in a difficult, fictional intelligence-type problem and produced a final intelligence report. These chat transcripts provided a powerful "microscope" into the reasoning processes and interactions involved in complex, collaborative reasoning. We found that individuals and teams used narrative to solve the kinds of complex problems organizations and intelligence agencies face daily. We observed that team members generated what we term "micro-narratives", which provided a means for testing, assessing, and weighing alternative hypotheses through mental simulation in the context of collaborative reasoning. The creation of micro-narratives assisted the teams' reasoning with evidence, an integral part of collaborative reasoning and intelligence analysis. Micro-narratives were combined into, and compared with, an ideal or 'virtual' narrative that informed the judgements the team came to in their final intelligence report. The case study developed in this paper provides evidence that narrative thought processes play an important role in complex collaborative problem-solving and reasoning with evidence. This is contrary to a widespread perception that narrative thinking is fundamentally distinct from formal, logical reasoning.


Subject(s)
Cooperative Behavior , Intelligence , Narration , Problem Solving , Cloud Computing/trends , Decision Making , Humans , Intersectoral Collaboration , Judgment , Thinking
12.
PLoS One ; 15(9): e0239053, 2020.
Article in English | MEDLINE | ID: mdl-32946491

ABSTRACT

To deal with dynamically changing users' credentials in identity-based encryption (IBE), providing an efficient key revocation method is a very important issue. Recently, Ma and Lin proposed a generic method of designing a revocable IBE (RIBE) scheme that uses the complete subtree (CS) method by combining IBE and hierarchical IBE (HIBE) schemes. In this paper, we propose a new generic method for designing an RIBE scheme that uses the subset difference (SD) method instead of the CS method. In order to use the SD method, we generically design an RIBE scheme by combining IBE, identity-based revocation (IBR), and two-level HIBE schemes. If the underlying IBE, IBR, and HIBE schemes are adaptively (or selectively) secure, then our RIBE scheme is also adaptively (or selectively) secure. In addition, we show that the layered SD (LSD) method can be applied to our RIBE scheme, and that a chosen-ciphertext secure RIBE scheme can also be designed generically.
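For context, the complete subtree (CS) method mentioned above covers all non-revoked users with the maximal subtrees of the binary user tree that contain no revoked leaf. A minimal sketch follows; representing subtrees as half-open leaf ranges is an illustrative choice, not the papers' notation.

```python
# Complete Subtree cover sketch: users are leaves 0..N-1 of a binary tree.

def cs_cover(lo, hi, revoked):
    """Return the maximal subtrees, written as half-open leaf ranges
    [lo, hi), that contain no revoked leaf."""
    if not any(lo <= r < hi for r in revoked):
        return [(lo, hi)]   # whole subtree is revocation-free: cover it
    if hi - lo == 1:
        return []           # a single revoked leaf: cover nothing here
    mid = (lo + hi) // 2    # recurse into both child subtrees
    return cs_cover(lo, mid, revoked) + cs_cover(mid, hi, revoked)
```

With 8 users and user 5 revoked, the cover is the subtrees over leaves [0, 4), [4, 5), and [6, 8): update keys for those three nodes reach every non-revoked user while the revoked leaf receives nothing.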


Subject(s)
Computer Security/trends , Identity Theft/prevention & control , Algorithms , Cloud Computing/trends , Models, Statistical , Models, Theoretical , Software
13.
J Diabetes Sci Technol ; 14(6): 1107-1110, 2020 11.
Article in English | MEDLINE | ID: mdl-33050727

ABSTRACT

With the recent pivot to telehealth as a direct result of the COVID-19 pandemic, there is an imperative to ensure that access to affordable devices and technologies with remote monitoring capabilities for people with diabetes becomes equitable. In addition, expanding the use of remote Diabetes Self-Management Education and Support (DSMES) and Medical Nutrition Therapy (MNT) services will require new strategies for achieving long-term, effective, continuous, data-driven care. The current COVID-19 pandemic has especially impacted underserved US communities that were already disproportionately impacted by diabetes. Historically, these same communities have faced barriers in accessing timely and effective diabetes care including access to DSMES and MNT services, and diabetes technologies. Our call to action encourages all involved to urge US Federal representatives to widen access to the array of technologies necessary for successful telehealth-delivered care beyond COVID-19.


Subject(s)
Cloud Computing/trends , Coronavirus Infections/epidemiology , Diabetes Mellitus/therapy , Health Services Accessibility/trends , Pneumonia, Viral/epidemiology , Telemedicine/trends , Universal Health Care , COVID-19 , Coronavirus Infections/therapy , Democracy , Diabetes Complications/epidemiology , Diabetes Complications/therapy , Diabetes Mellitus/epidemiology , Health Services Accessibility/organization & administration , Healthcare Disparities/organization & administration , Healthcare Disparities/trends , Humans , Inventions/trends , Medically Underserved Area , Pandemics , Patient Education as Topic/methods , Patient Education as Topic/organization & administration , Patient Education as Topic/trends , Pneumonia, Viral/therapy , Self-Management/methods , Self-Management/trends , Telemedicine/methods , Telemedicine/organization & administration
14.
Health Informatics J ; 25(2): 315-329, 2019 06.
Article in English | MEDLINE | ID: mdl-28480788

ABSTRACT

Social media has enabled information sharing across massively large networks of people without the financial resources and time otherwise required by print and electronic media. Mobile social media applications have overwhelmingly changed the information-sharing landscape. However, with the advent of such applications at an unprecedented scale, the privacy of information is compromised to a large extent if breach mitigation is inadequate. Since healthcare applications are also being developed for mobile devices to benefit from the power of social media, cybersecurity and privacy concerns for such sensitive applications have become critical. This article discusses the architecture of a typical mobile healthcare application, in which customized privacy levels are defined for the individuals participating in the system. It then elaborates on how communication across a social network in a multi-cloud environment can be made more secure and private, especially for healthcare applications.


Subject(s)
Computer Security/standards , Privacy , Cloud Computing/standards , Cloud Computing/trends , Computer Security/trends , Humans , Social Networking , Telemedicine/methods , Telemedicine/trends
15.
Neural Netw ; 108: 339-354, 2018 Dec.
Article in English | MEDLINE | ID: mdl-30245433

ABSTRACT

Trustworthiness is a comprehensive quality metric used to assess the quality of services in service-oriented environments. However, predicting the trust of cloud services from multi-faceted Quality of Service (QoS) attributes is a challenging task because of the complicated, non-linear relationships between QoS values and the corresponding trust result. Recent research reveals that Artificial Neural Networks (ANNs) and their variants achieve a reasonable degree of success in trust prediction problems. However, challenges with respect to weight assignment, training time, and kernel functions keep ANNs and their variants under continuous refinement. Hence, this work presents a novel multi-level Hypergraph Coarsening based Robust Heteroscedastic Probabilistic Neural Network (HC-RHRPNN) to predict the trustworthiness of cloud services in order to build high-quality service applications. HC-RHRPNN employs hypergraph coarsening to identify informative samples, which are then used to train HRPNN, improving its prediction accuracy and minimizing runtime. The performance of HC-RHRPNN was evaluated on the Quality of Web Service (QWS) dataset, a public QoS dataset, in terms of classifier accuracy, precision, recall, and F-score.
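The evaluation metrics named at the end of this abstract are standard classifier ratios. As a generic illustration (the counts in the example are hypothetical, not from the QWS experiments):

```python
# Standard classification metrics; the example counts are made up.

def precision_recall_f1(tp, fp, fn):
    """Precision = TP/(TP+FP); recall = TP/(TP+FN);
    F1 is the harmonic mean of precision and recall."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return precision, recall, f1
```

For instance, 8 true positives with 2 false positives and 2 false negatives give precision, recall, and F1 of 0.8 each.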


Subject(s)
Cloud Computing/trends , Models, Statistical , Neural Networks, Computer , Algorithms , Cloud Computing/standards , Computer Systems/standards , Computer Systems/trends , Forecasting , Humans
16.
Neuroinformatics ; 16(1): 43-49, 2018 01.
Article in English | MEDLINE | ID: mdl-29058212

ABSTRACT

The availability of cloud computing services has enabled the widespread adoption of the "software as a service" (SaaS) approach for software distribution, which utilizes network-based access to applications running on centralized servers. In this paper we apply the SaaS approach to neuroimaging-based age prediction. Our system, named "NAPR" (Neuroanatomical Age Prediction using R), provides access to predictive modeling software running on a persistent cloud-based Amazon Web Services (AWS) compute instance. The NAPR framework allows external users to estimate the age of individual subjects using cortical thickness maps derived from their own locally processed T1-weighted whole brain MRI scans. As a demonstration of the NAPR approach, we have developed two age prediction models that were trained using healthy control data from the ABIDE, CoRR, DLBS and NKI Rockland neuroimaging datasets (total N = 2367, age range 6-89 years). The provided age prediction models were trained using (i) relevance vector machines and (ii) Gaussian processes machine learning methods applied to cortical thickness surfaces obtained using Freesurfer v5.3. We believe that this transparent approach to out-of-sample evaluation and comparison of neuroimaging age prediction models will facilitate the development of improved age prediction models and allow for robust evaluation of the clinical utility of these methods.


Subject(s)
Aging , Cerebral Cortex/diagnostic imaging , Cloud Computing , Databases, Factual , Adolescent , Adult , Aged , Aged, 80 and over , Aging/pathology , Aging/physiology , Cerebral Cortex/cytology , Cerebral Cortex/physiology , Child , Cloud Computing/trends , Databases, Factual/trends , Forecasting , Humans , Magnetic Resonance Imaging/trends , Middle Aged , Young Adult
17.
Mil Med ; 183(11-12): e438-e447, 2018 11 01.
Article in English | MEDLINE | ID: mdl-29425378

ABSTRACT

Introduction: This position paper summarizes the development and the present status of Department of Defense (DoD) and other government policies and guidances regarding cloud computing services. Due to the heterogeneous and growing biomedical big datasets, cloud computing services offer an opportunity to mitigate the associated storage and analysis requirements. Having on-demand network access to a shared pool of flexible computing resources creates a consolidated system that should reduce potential duplications of effort in military biomedical research. Methods: Interactive, online literature searches were performed with Google, at the Defense Technical Information Center, and at two National Institutes of Health research portfolio information sites. References cited within some of the collected documents also served as literature resources. Results: We gathered, selected, and reviewed DoD and other government cloud computing policies and guidances published from 2009 to 2017. These policies were intended to consolidate computer resources within the government and reduce costs by decreasing the number of federal data centers and by migrating electronic data to cloud systems. Initial White House Office of Management and Budget information technology guidelines were developed for cloud usage, followed by policies and other documents from the DoD, the Defense Health Agency, and the Armed Services. Security standards from the National Institute of Standards and Technology, the Government Services Administration, the DoD, and the Army were also developed. Government Services Administration and DoD Inspectors General monitored cloud usage by the DoD. A 2016 Government Accountability Office report characterized cloud computing as being economical, flexible and fast. A congressionally mandated independent study reported that the DoD was active in offering a wide selection of commercial cloud services in addition to its milCloud system. 
Our findings from the Department of Health and Human Services indicated that the security infrastructure in cloud services may be more compliant with the Health Insurance Portability and Accountability Act of 1996 regulations than traditional methods. To gauge the DoD's adoption of cloud technologies, proposed metrics included cost factors, ease of use, automation, availability, accessibility, security, and policy compliance. Conclusions: Since 2009, plans and policies have been developed for the use of cloud technology to help consolidate and reduce the number of data centers, which was expected to reduce costs, improve environmental factors, enhance information technology security, and maintain mission support for service members. Cloud technologies were also expected to improve employee efficiency and productivity. Federal cloud computing policies within the last decade have also offered increased opportunities to advance military healthcare. It was assumed that these opportunities would benefit consumers of healthcare and health science data by allowing more access to centralized cloud computing facilities to store, analyze, search, and share relevant data, to enhance standardization, and to reduce potential duplications of effort. We recommend that cloud computing be considered by DoD biomedical researchers for increasing connectivity, presumably by facilitating communications and data sharing, among the various intramural and extramural laboratories. We also recommend that policies and other guidances be updated to include developing additional metrics that will help stakeholders evaluate the above-mentioned assumptions and expectations.


Subject(s)
Cloud Computing/trends , Government Programs/methods , Policies , Biomedical Research/methods , Biomedical Research/trends , Cloud Computing/legislation & jurisprudence , Government Programs/trends , Humans , Military Medicine/methods , Military Medicine/trends , United States , United States Department of Defense/organization & administration , United States Department of Defense/statistics & numerical data
18.
Hosp Pediatr ; 8(7): 394-403, 2018 07.
Article in English | MEDLINE | ID: mdl-29871887

ABSTRACT

OBJECTIVES: Shared care plans play an essential role in coordinating care across health care providers and settings for children with medical complexity (CMC). However, existing care plans often lack shared ownership, are out-of-date, and lack universal accessibility. In this study, we aimed to establish requirements for shared care plans to meet the information needs of caregivers and providers and to mitigate current information barriers when caring for CMC. METHODS: We followed a user-centered design methodology and conducted in-depth semistructured interviews with caregivers and providers of CMC who receive care at a tertiary care children's hospital. We applied inductive, thematic analysis to identify salient themes. Analysis occurred concurrently with data collection; therefore, the interview guide was iteratively revised as new questions and themes emerged. RESULTS: Interviews were conducted with 17 caregivers and 22 providers. On the basis of participant perspectives, we identified 4 requirements for shared care plans that would help meet information needs and mitigate current information barriers when caring for CMC. These requirements included the following: (1) supporting the accessibility of care plans from multiple locations (eg, cloud-based) and from multiple devices, with alert and search features; (2) ensuring the organization is tailored to the specific user; (3) including collaborative functionality such as real-time, multiuser content management and secure messaging; and (4) storing care plans on a secure platform with caregiver-controlled permission settings. CONCLUSIONS: Although further studies are needed to understand the optimal design and implementation strategies, shared care plans that meet these specified requirements could mitigate perceived information barriers and improve care for CMC.


Subject(s)
Caregivers/psychology , Child Health Services , Chronic Disease , Cloud Computing , Disabled Children , Health Personnel/organization & administration , Adult , Child , Child Health Services/organization & administration , Child Health Services/trends , Cloud Computing/trends , Health Personnel/trends , Humans , Interviews as Topic , Patient Care Planning , Perception , Professional-Family Relations , Qualitative Research , Stakeholder Participation
19.
Article in English | MEDLINE | ID: mdl-29597013

ABSTRACT

INTRODUCTION: A newly developed total implant telemetry system for cardiovascular (CV), electrophysiological, and body temperature measurement was evaluated. Cloud-based transmission of the physiological signals allowed the quality of the signals to be assessed despite the physical separation between the instrumented animals and the evaluating home laboratory. The new system is intended to be used for safety pharmacological evaluations of drug candidates in various species. METHODS: Two female minipigs, 6 Labrador-mixed breed dogs, and 4 female Cynomolgus monkeys were instrumented with a newly developed total implant system (TSE SYSTEMS). The implants feature a microprocessor, internal memory (1 GB), 2 solid-state pressure-tipped catheters, amplifiers, and a radio transmitter. Sampling rates for each measurement can be selected within a range between 0.1 and 1 kHz. Biological signals are selected in a programmable fashion on a session-by-session basis according to a user-defined protocol. The pressure sensors are at the tip of an electrical lead whose length is customized to each species. Core temperature measurement and activity monitoring (3D accelerometer) are included in the system. Digital transmission range using a single antenna is 5 m, with up to 16 animals held together and monitored simultaneously. The range can be expanded with more antennas in an array coupled to a single receiver. The antenna/receiver station consists of a single USB-powered mobile unit connected to a PC or laptop. The battery life provides 110 days of continuous recording. The dogs and minipigs were instrumented and monitored in Germany. A novel cloud-based data transmission system was developed so that the physiological signals from the Cynomolgus monkeys, still kept in Mauritius, could be monitored in real time from the data evaluation laboratory in Germany.
After recovery from the surgical implantation, aortic pressure (AP), left ventricular pressure (LVP), ECG and body temperature were recorded for 24 hr monitoring sessions in all animals. Additionally, moxifloxacin (10, 30 and 100 mg/kg) was tested in the dog model using a modified Latin square cross-over study design. RESULTS: The implant was well tolerated and the animals recovered rapidly from the implantation procedure. Excellent signal quality was obtained and stable hemodynamic and electrophysiological parameters could be measured, with little signal artefact or drop-out, over 24 h in each species. After oral dosing of moxifloxacin to the dogs, a substantial, dose-dependent increase in the QT-interval duration could be shown, as anticipated for this agent. Cloud-based data acquisition from the animals in Mauritius and the data evaluation lab in Germany worked well. CONCLUSION: This new CV telemetry system provides a novel alternative to fluid-filled catheter telemetry systems and the coupling to a cloud-based data transmission allows for flexibility in the location of the instrumented animals and data acquisition and the location of the site for data analysis. For the first time it is technically feasible to conduct a CV safety pharmacology study in Cynomolgus monkeys without having to ship them long distances to the home laboratory.


Subject(s)
Blood Pressure/physiology , Body Temperature/physiology , Cloud Computing , Heart Rate/physiology , Remote Sensing Technology/methods , Telemetry/methods , Animals , Anti-Bacterial Agents/pharmacology , Blood Pressure/drug effects , Body Temperature/drug effects , Cloud Computing/trends , Cross-Over Studies , Dogs , Female , Heart Rate/drug effects , Macaca fascicularis , Male , Moxifloxacin/pharmacology , Remote Sensing Technology/instrumentation , Remote Sensing Technology/trends , Swine , Swine, Miniature , Telemetry/instrumentation , Telemetry/trends
20.
Ann N Y Acad Sci ; 1387(1): 112-123, 2017 01.
Article in English | MEDLINE | ID: mdl-27801987

ABSTRACT

Big Data is no longer solely the purview of big organizations with big resources. Today's routine tools and experimental methods can generate large slices of data. For example, high-throughput sequencing can quickly interrogate biological systems for the expression levels of thousands of different RNAs, examine epigenetic marks throughout the genome, and detect differences in the genomes of individuals. Multichannel electrophysiology platforms produce gigabytes of data in just a few minutes of recording. Imaging systems generate videos capturing biological behaviors over the course of days. Thus, any researcher now has access to a veritable wealth of data. However, the ability of any given researcher to utilize that data is limited by her/his own resources and skills for downloading, storing, and analyzing the data. In this paper, we examine the necessary resources required to engage Big Data, survey the state of modern data analysis pipelines, present a few data repository case studies, and touch on current institutions and programs supporting the work that relies on Big Data.


Subject(s)
Biomedical Research/methods , Cloud Computing , Computer Communication Networks , Systems Biology/methods , Access to Information , Animals , Biomedical Research/trends , Cloud Computing/trends , Computer Communication Networks/instrumentation , Computer Communication Networks/trends , Data Mining/methods , Data Mining/trends , Decision Making, Computer-Assisted , Genomics/methods , Genomics/trends , Humans , Image Processing, Computer-Assisted , Internet , Software , Systems Biology/instrumentation , Systems Biology/trends