Results 1 - 20 of 43
1.
Eur Radiol ; 2023 Dec 04.
Article in English | MEDLINE | ID: mdl-38047974

ABSTRACT

Creating a patient-centered experience is becoming increasingly important for radiology departments around the world. The goal of patient-centered radiology is to ensure that radiology services are sensitive to patients' needs and desires. This article provides a framework for addressing the patient's experience by dividing their imaging journey into three distinct time periods: pre-exam, day of exam, and post-exam. Each time period has aspects that can contribute to patient anxiety. Although there are components of the patient journey that are common to all regions of the world, there are also unique features that vary by location. This paper highlights innovative solutions from different parts of the world that have been introduced in each of these time periods to create a more patient-centered experience. CLINICAL RELEVANCE STATEMENT: Adopting innovative solutions that help patients understand their imaging journey and decrease their anxiety about undergoing an imaging examination is an important step in creating a patient-centered imaging experience. KEY POINTS: • Patients often experience anxiety during their imaging journey, and decreasing this anxiety is an important component of patient-centered imaging. • The patient imaging journey can be divided into three distinct time periods: pre-exam, day of exam, and post-exam. • Although components of the imaging journey are common, there are local differences across regions of the world that need to be considered when constructing a patient-centered experience.

2.
Sensors (Basel) ; 23(9)2023 Apr 26.
Article in English | MEDLINE | ID: mdl-37177511

ABSTRACT

With the rapid development of cloud storage and cloud computing technology, users tend to store data in the cloud for more convenient services. To ensure the integrity of cloud data, scholars have proposed cloud data integrity verification schemes to protect users' data security. Internet of Things (IoT) storage environments for big data and medical big data show an even stronger demand for data integrity verification, while at the same time requiring schemes with more comprehensive functionality. Existing data integrity verification schemes are mostly applied in the cloud storage environment but cannot be successfully applied to IoT environments for big data and medical big data storage. To solve this problem, combining the characteristics and requirements of IoT data storage and medical data storage, we designed an SM2-based offline/online efficient data integrity verification scheme. The scheme uses the SM4 block cipher to protect the privacy of the data content and uses a dynamic hash table to realize dynamic updating of data. Based on the SM2 signature algorithm, the scheme also realizes offline tag generation and batch auditing, reducing the computational burden on users. Security proofs and efficiency analysis show that the scheme is safe and efficient and can be used in a variety of application scenarios.
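The dynamic hash table at the heart of such schemes can be pictured as a client-side table of per-block tags that supports verification and per-block updates. This is a simplified sketch using SHA-256 as a stand-in for the scheme's SM2/SM3-based tags; the class name and block contents are hypothetical.

```python
import hashlib

class IntegrityTable:
    """Client-side dynamic hash table of per-block tags.

    Stand-in for SM2/SM3-based tags: each tag here is a SHA-256 digest of
    (block index, block content), so individual blocks can be verified and
    updated without rehashing the whole file.
    """

    def __init__(self, blocks):
        self.tags = {i: self._tag(i, b) for i, b in enumerate(blocks)}

    @staticmethod
    def _tag(index, block):
        return hashlib.sha256(str(index).encode() + block).hexdigest()

    def verify(self, index, block):
        # Challenge the server for block `index` and check it against the stored tag.
        return self.tags.get(index) == self._tag(index, block)

    def update(self, index, new_block):
        # Dynamic update: only the affected tag is recomputed.
        self.tags[index] = self._tag(index, new_block)

blocks = [b"record-0", b"record-1", b"record-2"]
table = IntegrityTable(blocks)
assert table.verify(1, b"record-1")
assert not table.verify(1, b"tampered")
table.update(1, b"record-1-v2")
assert table.verify(1, b"record-1-v2")
```

A real scheme replaces the plain digest with an unforgeable signature tag, so the cloud server cannot fabricate proofs for data it no longer holds.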

3.
Sensors (Basel) ; 23(5)2023 Feb 27.
Article in English | MEDLINE | ID: mdl-36904822

ABSTRACT

With continuous advancements in Internet technology and the increased use of cryptographic techniques, the cloud has become the obvious choice for data sharing. Generally, the data are outsourced to cloud storage servers in encrypted form. Access control methods can be used on encrypted outsourced data to facilitate and regulate access. Multi-authority attribute-based encryption is a propitious technique for controlling who can access encrypted data in inter-domain applications such as sharing data between organizations, sharing data in healthcare, etc. The data owner may require the flexibility to share the data with known and unknown users. The known or closed-domain users may be internal employees of the organization, while unknown or open-domain users may be outside agencies, third-party users, etc. In the case of closed-domain users, the data owner becomes the key issuing authority, and in the case of open-domain users, various established attribute authorities perform the task of key issuance. Privacy preservation is also a crucial requirement in cloud-based data-sharing systems. This work proposes the SP-MAACS scheme, a secure and privacy-preserving multi-authority access control system for cloud-based healthcare data sharing. Both open- and closed-domain users are considered, and policy privacy is ensured by disclosing only the names of policy attributes, while the values of the attributes are kept hidden. A characteristic comparison with similar existing schemes shows that our scheme simultaneously provides features such as a multi-authority setting, an expressive and flexible access policy structure, privacy preservation, and scalability. Our performance analysis shows that the decryption cost is reasonable. Furthermore, the scheme is demonstrated to be adaptively secure under the standard model.
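The policy-privacy idea, publishing attribute names while hiding their values, can be illustrated with hashed value commitments. This sketch is not the SP-MAACS construction (which builds on multi-authority ABE); the salt-based commitment, attribute names, and values below are assumptions chosen purely for illustration.

```python
import hashlib, os

def commit(value, salt):
    # Only a commitment to the attribute value is published, never the value itself.
    return hashlib.sha256(salt + value.encode()).hexdigest()

# The data owner publishes attribute *names* in the policy; values stay hidden.
salt = os.urandom(16)
policy = {
    "department": commit("cardiology", salt),
    "role": commit("physician", salt),
}

def satisfies(policy, user_attrs, salt):
    # The policy is satisfied only if every named attribute matches the hidden value.
    return all(commit(user_attrs.get(name, ""), salt) == c
               for name, c in policy.items())

assert satisfies(policy, {"department": "cardiology", "role": "physician"}, salt)
assert not satisfies(policy, {"department": "oncology", "role": "physician"}, salt)
```

An observer of the policy learns which attributes matter ("department", "role") but not the required values, which is the partially hidden policy property the abstract describes.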


Assuntos
Confidencialidade , Privacidade , Humanos , Computação em Nuvem , Segurança Computacional , Disseminação de Informação , Atenção à Saúde
4.
Sensors (Basel) ; 23(24)2023 Dec 12.
Article in English | MEDLINE | ID: mdl-38139615

ABSTRACT

Large-scale incorporation of new energy generation units based on renewable sources, such as wind and photovoltaic power, drastically alters the structure of the power system. Because of the intermittent nature of these sources, switching in grids (connection and disconnection) occurs much more frequently than with conventional sources. As a result, the power system will inevitably experience a large number of transients, which raises questions about the stability of the system and the quality of the electrical energy. Therefore, measuring various types of transients in power systems is crucial for stability, power quality, fault analysis, protection design, and insulation design. Transient recorders currently in use are generally expensive and only suitable for particular locations in power systems, and the number of installed recorders is insufficient for a comprehensive analysis of the problems that may occur. Hence, it is important to have inexpensive and efficient transient recorders that can be installed at multiple points in the power system on various types of objects. It is also essential to have a transient record database with open access, which can be used by researchers to develop new analysis techniques based on artificial intelligence. This paper proposes an inexpensive measurement and acquisition system designed to record transient phenomena on different objects within the power system. The system is designed to use autonomous power, a standardized data acquisition module, a low-budget system for transmitting recorded transient events to the server via a mobile network, and a sensor system adapted to the object where transients are recorded. The proposed system is designed to be used for all types of objects in the power system where transients may occur, such as power lines, transmission towers, surge arresters, and transformers. All components of the system are described, and the system is tested under laboratory conditions. The modular nature of the system allows customization to the specifics of the location in the power system by choosing appropriate components. The calibration method of the custom-designed Rogowski coil is described. A cost analysis of the proposed system and a power consumption analysis are performed. The results show that the system's performance meets application requirements at a low cost.
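A Rogowski coil outputs a voltage proportional to the derivative of the measured current (v = M·di/dt), so the recorder must integrate the coil signal to recover the current waveform. A minimal numerical sketch, with an assumed mutual inductance and test current (not values from the paper):

```python
import math

# Illustrative values only: M and the test current are assumptions.
M = 1e-6            # mutual inductance, H
f = 50.0            # mains frequency, Hz
amp = 100.0         # test current amplitude, A
dt = 1e-5           # sampling interval, s
n = int(0.02 / dt)  # samples over one 50 Hz period

# Simulated coil voltage for i(t) = amp * sin(2*pi*f*t): v = M * di/dt.
v = [M * amp * 2 * math.pi * f * math.cos(2 * math.pi * f * k * dt)
     for k in range(n)]

# Trapezoidal integration recovers i(t), up to the 1/M scale factor.
i_rec, acc = [0.0], 0.0
for k in range(1, n):
    acc += 0.5 * (v[k] + v[k - 1]) * dt
    i_rec.append(acc / M)

peak = max(i_rec)
assert abs(peak - amp) / amp < 0.01  # recovered amplitude within 1%
```

In a real recorder the integration (analog or digital) and the calibration of M are exactly what the coil calibration procedure mentioned above must establish.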

5.
Entropy (Basel) ; 25(2)2023 Feb 15.
Article in English | MEDLINE | ID: mdl-36832728

ABSTRACT

In the cloud, uploading encrypted data is the most effective way to ensure that the data are not leaked. However, data access control is still an open problem in cloud storage systems. To provide an authorization mechanism that limits the comparison of one user's ciphertexts with another's, public key encryption supporting the equality test with four flexible authorizations (PKEET-FA) was presented. Subsequently, the more functional identity-based encryption supporting the equality test (IBEET-FA) further combined identity-based encryption with flexible authorization. Bilinear pairing has long been a candidate for replacement due to its high computational cost. Hence, in this paper, we use general trapdoor discrete log groups to construct a new and secure IBEET-FA scheme that is more efficient. The computational cost of the encryption algorithm in our scheme was reduced to 43% of that of the scheme of Li et al., and in the Type 2 and Type 3 authorization algorithms, the computational cost of both was reduced to 40% of that of the scheme of Li et al. Furthermore, we prove that our scheme is one-way secure against chosen identity and chosen ciphertext attacks (OW-ID-CCA) and indistinguishable against chosen identity and chosen ciphertext attacks (IND-ID-CCA).
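The equality-test idea, deciding whether two ciphertexts encrypt the same message without decrypting either, can be sketched with a toy construction in which each ciphertext carries a comparison tag. A real IBEET-FA scheme derives the tag so that only testers holding an authorization trapdoor can compare; the plain hash tag below is an illustrative simplification, not the paper's construction.

```python
import hashlib, os

def encrypt(pk_seed, m):
    """Toy sketch: ciphertext = one-time pad + equality tag H(m).

    In a real scheme the tag is only comparable under an authorization
    trapdoor; here it is a plain hash purely to show the mechanism.
    """
    pad = hashlib.sha256(pk_seed).digest()
    ct = bytes(a ^ b for a, b in zip(m.ljust(32, b"\0"), pad))
    tag = hashlib.sha256(m).hexdigest()
    return ct, tag

def equality_test(c1, c2):
    # Compare two users' ciphertexts without decrypting either one.
    return c1[1] == c2[1]

a = encrypt(os.urandom(32), b"report-42")
b = encrypt(os.urandom(32), b"report-42")
c = encrypt(os.urandom(32), b"report-43")
assert equality_test(a, b)       # same plaintext under different keys
assert not equality_test(a, c)   # different plaintexts
```

Note that the ciphertext bodies of `a` and `b` differ (different pads), yet the test still links them; restricting who may run that test is exactly what the four flexible authorization types govern.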

6.
Sensors (Basel) ; 22(21)2022 Nov 07.
Article in English | MEDLINE | ID: mdl-36366266

ABSTRACT

The limitations of the classic PACS (picture archiving and communication system), such as the backward-compatible DICOM network architecture and poor security and maintenance, are well known. They are challenged by various existing solutions employing cloud-related patterns and services. However, a full-scale cloud-native PACS has not yet been demonstrated. This paper introduces a vendor-neutral cloud PACS architecture. It is divided into two main components: a cloud platform and an access device. The cloud platform is responsible for the nearline (long-term) image archive, data flow, and backend management. It operates in multi-tenant mode. The access device is responsible for the local DICOM (Digital Imaging and Communications in Medicine) interface and serves as a gateway to cloud services. The cloud PACS was first implemented in an Amazon Web Services environment. It employs a number of general-purpose services designed or adapted for a cloud environment, including Kafka, OpenSearch, and Memcached. Custom services, such as a central PACS node, queue manager, and flow worker, also developed as cloud microservices, provide DICOM support, external integration, and a management layer. The PACS was verified using image traffic from, among others, computed tomography (CT), magnetic resonance (MR), and computed radiography (CR) modalities. During the test, the system reliably stored and accessed image data. In subsequent tests, scaling behavior differences between the monolithic Dcm4chee server and the proposed solution are shown: a growing number of parallel connections did not influence the monolithic server's overall throughput, whereas the performance of the cloud PACS noticeably increased. In the final test, different retrieval patterns were evaluated to assess performance under different scenarios. The current production environment stores over 450 TB of image data and handles over 4000 DICOM nodes.


Subjects
Radiology Information Systems, Cloud Computing, Computers, Software, Tomography, X-Ray Computed
7.
Sensors (Basel) ; 22(23)2022 Nov 27.
Article in English | MEDLINE | ID: mdl-36501937

ABSTRACT

For the monitoring and processing of network data, wireless systems are widely used in many industrial applications. With the assistance of wireless sensor networks (WSNs) and the Internet of Things (IoT), smart grids are being explored in many distributed communication systems. They collect data from the surrounding environment and transmit it with the support of a multi-hop system. However, there is still a significant research gap in energy management for IoT devices and smart sensors. Many solutions have been proposed by researchers to cope with efficient routing schemes in smart grid applications, but reducing energy holes and making intelligent data-forwarding decisions remain major problems. Moreover, managing network traffic on grid nodes while balancing the communication overhead on the routing paths is also a demanding challenge. In this research work, we propose a secure edge-based energy management protocol for a smart grid environment with support for multi-route management. It strengthens the ability to predict the data-forwarding process and improves the management of IoT devices by utilizing a correlation analysis technique. Moreover, the proposed protocol increases the system's reliability and achieves security goals by employing lightweight authentication with sink coordination. To demonstrate the superiority of our proposed protocol over the chosen existing work, extensive experiments were performed on various network parameters.
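One way to picture energy-hole avoidance in such a multi-hop network is energy-aware next-hop selection: among neighbours closer to the sink, forward to the one with the most residual energy, so no single node on a popular path is drained first. The scoring rule below is an assumption for illustration, not the paper's correlation-based predictor.

```python
def next_hop(neighbours, my_dist):
    """Pick a forwarding neighbour for a node at distance `my_dist` to the sink.

    neighbours: list of (node_id, distance_to_sink, residual_energy).
    Only neighbours strictly closer to the sink are candidates; among them,
    the one with the most residual energy is chosen to avoid energy holes.
    """
    candidates = [n for n in neighbours if n[1] < my_dist]
    if not candidates:
        return None
    return max(candidates, key=lambda n: n[2])[0]

nbrs = [("a", 3, 0.9), ("b", 2, 0.4), ("c", 5, 1.0)]
assert next_hop(nbrs, 4) == "a"   # "c" is farther from the sink; "a" beats "b" on energy
assert next_hop(nbrs, 1) is None  # no closer neighbour available
```

Repeating this choice hop by hop spreads traffic across routes, which is the load-balancing behaviour the multi-route management aims for.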

8.
Sensors (Basel) ; 22(16)2022 Aug 10.
Article in English | MEDLINE | ID: mdl-36015727

ABSTRACT

The digital transformation disrupts various professional domains in different ways, though one aspect is common: the unified platform known as cloud computing. Corporate solutions, IoT systems, analytics, business intelligence, and numerous tools, solutions, and systems use cloud computing as a global platform. Migrations to the cloud are increasing, causing it to face new challenges and complexities. One of the essential segments is data storage. Data storage on the cloud is neither simple nor conventional; rather, it is becoming more and more complex due to the versatility and volume of data. This research develops a framework that provides a comprehensive solution for cloud storage in terms of replication; instead of formal recovery channels, erasure coding is proposed for this framework, which has previously proven to be a trustworthy mechanism for the job. The proposed framework provides a hybrid approach that combines the benefits of replication and erasure coding to attain an optimal storage solution, focused specifically on reliability and recovery. Learning and training mechanisms were developed to provide dynamic structure building in the future and to test the data model. RAID architecture is used to formulate different configurations for the experiments. RAID-1 to RAID-6 are divided into two groups, with RAID-1 to 4 in the first group and RAID-5 and 6 in the second, further categorized based on FTT, parity, failure range, and capacity. Reliability and recovery are evaluated on data at rest on the server side and on data in transit at the virtual level. The overall results show the significant impact of the proposed hybrid framework on cloud storage performance. RAID-6c on the server side emerged as the best configuration for optimal performance. Mirroring for replication using RAID-6 and erasure coding for recovery work in complete coherence, providing good results for the current framework while highlighting interesting and challenging paths for future research.
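The recovery property that erasure coding contributes can be sketched with RAID-4/5-style single XOR parity; RAID-6 adds a second, independently computed parity so that two simultaneous failures are tolerated. A minimal sketch of the single-parity case:

```python
from functools import reduce

def xor_parity(blocks):
    # RAID-4/5 style single parity: P = D0 ^ D1 ^ ... ^ Dn-1, byte by byte.
    return bytes(reduce(lambda a, b: a ^ b, col) for col in zip(*blocks))

def recover(blocks, parity, lost):
    # Any single lost data block is the XOR of the parity with the survivors.
    survivors = [b for i, b in enumerate(blocks) if i != lost]
    return xor_parity(survivors + [parity])

data = [b"AAAA", b"BBBB", b"CCCC"]
p = xor_parity(data)
assert recover(data, p, 1) == b"BBBB"  # lost block rebuilt from parity
```

Storage overhead is one parity block per stripe, versus a full extra copy for mirroring, which is exactly the replication-versus-erasure-coding trade-off the hybrid framework balances.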


Subjects
Cloud Computing, Information Storage and Retrieval, Computers, Reproducibility of Results
9.
Sensors (Basel) ; 22(18)2022 Sep 16.
Article in English | MEDLINE | ID: mdl-36146368

ABSTRACT

Cloud storage has become a keystone for organizations to manage large volumes of data produced by sensors at the edge as well as information produced by deep and machine learning applications. Nevertheless, the latency produced by geographically distributed systems deployed on any of the edge, the fog, or the cloud leads to delays that are observed by end-users in the form of high response times. In this paper, we present an efficient scheme for the management and storage of Internet of Things (IoT) data in edge-fog-cloud environments. In our proposal, entities called data containers are coupled, in a logical manner, with nano/microservices deployed on any of the edge, the fog, or the cloud. The data containers implement a hierarchical cache file system including storage levels such as in-memory, file system, and cloud services for transparently managing the input/output data operations produced by nano/microservices (e.g., a sensor hub collecting data from sensors at the edge or machine learning applications processing data at the edge). Data containers are interconnected through a secure and efficient content delivery network, which transparently and automatically performs the continuous delivery of data through the edge-fog-cloud. A prototype of our proposed scheme was implemented and evaluated in a case study based on the management of electrocardiogram sensor data. The obtained results reveal the suitability and efficiency of the proposed scheme.
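The hierarchical cache file system described above, in-memory, then file system, then cloud, behaves like a read-through tiered store that promotes hot objects to faster tiers. A minimal sketch, with plain dicts standing in for the real storage services:

```python
class TieredStore:
    """Sketch of a data container's hierarchical cache.

    Three tiers ordered fastest to slowest; reads search downward and
    promote hits to all faster tiers so the next read is served locally.
    The dict-backed tiers are stand-ins for real memory/file/cloud backends.
    """

    def __init__(self):
        self.memory, self.filesystem, self.cloud = {}, {}, {}
        self.tiers = [self.memory, self.filesystem, self.cloud]

    def put(self, key, value):
        self.cloud[key] = value  # durable tier of record

    def get(self, key):
        for depth, tier in enumerate(self.tiers):
            if key in tier:
                # Promote to every faster tier on a hit.
                for faster in self.tiers[:depth]:
                    faster[key] = tier[key]
                return tier[key]
        raise KeyError(key)

store = TieredStore()
store.put("ecg-001", b"samples...")
assert store.get("ecg-001") == b"samples..."   # first read: served from the cloud tier
assert "ecg-001" in store.memory               # now cached in the fastest tier
```

This promotion-on-read behaviour is what hides edge-to-cloud latency from the nano/microservices issuing the I/O.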


Subjects
Cloud Computing, Computer Communication Networks, Electrocardiography, Internet
10.
Environ Monit Assess ; 195(1): 44, 2022 Oct 28.
Article in English | MEDLINE | ID: mdl-36302915

ABSTRACT

Farming involves a plethora of difficult responsibilities, and plant monitoring is one of them. There is also an urgent need for more alternative techniques for detecting plant diseases, which are now lacking. The agriculture and agricultural support sectors in India provide employment for the great majority of the country's people, and the country's agricultural production is directly connected to its economic growth rate. To sustain healthy plant development, a variety of processes must be followed, including consideration of environmental factors and water supply management for optimal crop production. The traditional method of irrigation is inefficient and uncertain in its outcomes. Disease attacks destroy more than 18% of the world's agricultural produce annually. Because it is difficult to perform these activities manually, identifying plant diseases is essential to decreasing losses in the agricultural product business. In addition to diagnosing a wide range of plant ailments, our method also includes the identification of infections as a prophylactic step. We describe a farm-based module that includes numerous cloud data centers and data conversion devices for accurately monitoring and managing farm information and environmental elements. The procedure involves imaging the plant's visually obvious symptoms in order to identify disease, and a recommended treatment is delivered through an application to minimize harm. Increased productivity resulting from the suggested approach would help both the agricultural and irrigation sectors. The plant area module is fitted with a mobile camera that captures images of all the plants in the area, and the plants' information is saved in a database accessible from any computer with Internet access. The plant's name, the type of illness afflicting it, and an image of the plant are recorded. In a wide range of applications, bots are used to collect images of various plants as well as to prevent disease transmission. To ensure that all information is retained, data is collected and stored in cloud storage, as it becomes essential to regulate the condition. Based on our research on a wide set of images of healthy and diseased fruit and plant leaves, real-time diagnosis of plant leaf diseases can be achieved with 98.78% accuracy in a laboratory environment. We utilized 40,000 photographs, and then analyzed 10,000 photos to construct a DCDM deep learning model, which was then used to train additional models on the data set. Using a cloud-based image diagnosis and classification service, consumers may receive information about their condition in less than a second, with the process requiring only 0.349 s on average.


Subjects
Cloud Computing, Environmental Monitoring, Mobile Applications, Plant Diseases, Humans, Environmental Monitoring/instrumentation, India, Plant Diseases/prevention & control
11.
Sensors (Basel) ; 21(24)2021 Dec 15.
Article in English | MEDLINE | ID: mdl-34960455

ABSTRACT

Information technology is based on data management between various sources. Software projects, as varied as simple applications or as complex as self-driving cars, are heavily reliant on the amounts and types of data ingested by one or more interconnected systems. Data is not only consumed but also transformed or mutated, which requires copious amounts of computing resources. One of the most exciting areas of cyber-physical systems, autonomous vehicles, makes heavy use of deep learning and AI to mimic the highly complex actions of a human driver. Attempting to map human behavior (a large and abstract concept) requires large amounts of data, used by AIs to increase their knowledge and better attempt to solve complex problems. This paper outlines a full-fledged solution for managing resources in a multi-cloud environment. The purpose of this API is to accommodate ever-increasing resource requirements by leveraging the multi-cloud and using commercially available tools to scale resources and make systems more resilient while remaining as cloud-agnostic as possible. To that effect, the work herein consists of an architectural breakdown of the resource management API, a low-level description of the implementation, and an experiment aimed at proving the feasibility and applicability of the systems described.
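A cloud-agnostic resource API of this kind typically hides each provider behind a common interface, so scaling logic is written once. A minimal sketch; the provider names, the even-spread policy, and the method names are assumptions for illustration, not the paper's API.

```python
class Provider:
    """Adapter for one cloud; a real adapter would call that cloud's SDK."""

    def __init__(self, name):
        self.name, self.instances = name, 0

    def scale_to(self, n):
        self.instances = n

class MultiCloud:
    """Caller-facing API: scale total capacity without provider-specific code."""

    def __init__(self, providers):
        self.providers = providers

    def scale(self, total):
        # Spread requested capacity evenly; the remainder goes to the first providers.
        base, extra = divmod(total, len(self.providers))
        for i, p in enumerate(self.providers):
            p.scale_to(base + (1 if i < extra else 0))

cloud = MultiCloud([Provider("aws"), Provider("azure"), Provider("gcp")])
cloud.scale(7)
assert [p.instances for p in cloud.providers] == [3, 2, 2]
```

Swapping the even-spread policy for a cost- or latency-aware one changes only `scale`, which is the resilience-through-abstraction argument the paper makes.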


Subjects
Autonomous Vehicles, Cloud Computing, Humans, Software
12.
BMC Med Inform Decis Mak ; 20(1): 10, 2020 Jan 28.
Article in English | MEDLINE | ID: mdl-31992273

ABSTRACT

BACKGROUND: Cloud storage facilities (CSF) have become popular among internet users. There are limited data on CSF usage among university students in lower-middle-income countries, including Sri Lanka. In this study we present CSF usage among medical students at the Faculty of Medicine, University of Kelaniya. METHODS: We undertook a cross-sectional study at the Faculty of Medicine, University of Kelaniya, Sri Lanka. Stratified random sampling was used to recruit students representing all the batches. A self-administered questionnaire was given. RESULTS: Of 261 (90.9%) respondents, 181 (69.3%) were female. CSF awareness was 56.5% (95% CI: 50.3-62.6%) and CSF usage was 50.8% (95% CI: 44.4-57.2%). Awareness was higher in males (P = 0.003) and lower in senior students. Of CSF-aware students, 85% knew about Google Drive and 70.6% used it; 73.6% and 42.1% knew about Dropbox and OneDrive, and 50.0% and 22.0% used them, respectively. There was no association between CSF awareness and pre-university entrance or undergraduate examination performance. Inadequate knowledge, time, accessibility, and security and privacy concerns limited CSF usage. 69.8% indicated that they would like to undergo training on CSF as an effective tool for education. CONCLUSION: CSF awareness and usage among the students were 56.5% and 50.8%, respectively. Google Drive is the most popular CSF. Lack of knowledge, accessibility, and concerns about security and privacy limited CSF usage among students. The majority were interested in training on CSF, and undergraduate Information Communication Technology (ICT) curricula should introduce CSF as effective educational tools.
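The reported confidence intervals can be reproduced with the normal approximation for a proportion; small differences from the published 50.3-62.6% come from rounding of the point estimate.

```python
import math

# 95% CI for the awareness proportion: 56.5% of n = 261 respondents.
p, n = 0.565, 261
se = math.sqrt(p * (1 - p) / n)      # standard error of a proportion
lo, hi = p - 1.96 * se, p + 1.96 * se
# ≈ 50.5% to 62.5%, matching the paper's 50.3-62.6% up to rounding of p.
assert abs(lo - 0.505) < 0.005
assert abs(hi - 0.625) < 0.005
```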


Subjects
Cloud Computing/statistics & numerical data, Students, Medical/psychology, Cross-Sectional Studies, Female, Humans, Male, Sri Lanka, Surveys and Questionnaires
13.
Sensors (Basel) ; 20(18)2020 Sep 21.
Article in English | MEDLINE | ID: mdl-32967094

ABSTRACT

As the cost of medical care services rises and medical care professionals become scarce, medical services organizations and institutes must consider implementing medical Health Information Technology (HIT) frameworks. HIT permits health organizations to streamline their substantial processes and offer services in a more productive and cost-effective way. With the rise of Cloud Storage Computing (CSC), an enormous number of organizations and enterprises have moved their healthcare data sources to distributed storage. As the data can be requested at any time from anywhere, the availability of data becomes an urgent need. Nonetheless, outages in cloud storage significantly affect the availability level. Like the other basic factors of cloud storage (e.g., reliability, performance, security, and protection), availability also directly impacts the data in cloud storage for e-Healthcare systems. In this paper, we systematically review cloud storage mechanisms concerning the healthcare environment. Additionally, state-of-the-art cloud storage mechanisms are critically reviewed for e-Healthcare systems based on their characteristics. In short, this paper summarizes the existing literature on cloud storage and its impact on healthcare, and it likewise provides researchers, medical specialists, and organizations with a solid foundation for future studies in the healthcare environment.


Subjects
Cloud Computing, Information Storage and Retrieval, Telemedicine, Reproducibility of Results
14.
Sensors (Basel) ; 20(3)2020 Jan 21.
Article in English | MEDLINE | ID: mdl-31973154

ABSTRACT

The current agricultural water panorama in many Mediterranean countries is composed of desalination facilities, wells (frequently overexploited), the public water utility network, and several consumer agents with different water needs. This distributed water network requires centralized management methods for its proper use, which are difficult to implement as the different agents are usually geographically separated. In this sense, the use of enabling technologies such as the Internet of Things can be essential to the proper operation of these agroindustrial systems. In this paper, an Internet of Things cloud architecture based on the FIWARE standard is proposed for interconnecting the several agents that make up the agroindustrial system. In addition, this architecture includes an efficient management method based on a model predictive control technique, which is aimed at minimizing operating costs. A case study inspired by three real facilities located in Almería (southeast Spain) is used as the simulation test bed. The obtained results show how around 75% of the total operating costs can be saved with the application of the proposed approach, which could be very significant in decreasing the cost of desalinated water and, therefore, in maintaining the sustainability of the agricultural system.
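The cost-minimizing idea behind the management method can be pictured with a greedy, cheapest-source-first dispatch for a single period; the paper instead uses model predictive control, which optimizes over a whole horizon. Source names, prices, and capacities below are illustrative assumptions, not data from the Almería facilities.

```python
# (name, unit cost in EUR/m3, hourly capacity in m3) -- assumed values.
sources = [
    ("public-network", 0.6, 30.0),
    ("well", 0.3, 20.0),
    ("desalination", 0.5, 50.0),
]

def dispatch(demand):
    """Meet `demand` from the cheapest available sources first."""
    plan, cost = {}, 0.0
    for name, price, cap in sorted(sources, key=lambda s: s[1]):
        take = min(cap, demand)
        plan[name], demand, cost = take, demand - take, cost + take * price
        if demand <= 0:
            break
    return plan, cost

plan, cost = dispatch(60.0)
assert plan["well"] == 20.0 and plan["desalination"] == 40.0
assert abs(cost - (20 * 0.3 + 40 * 0.5)) < 1e-9  # cheapest-first meets demand
```

MPC improves on this greedy rule by anticipating future demand and prices (e.g., filling reservoirs when energy for desalination is cheap), which is where the reported cost savings come from.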

15.
Sensors (Basel) ; 20(5)2020 Mar 10.
Article in English | MEDLINE | ID: mdl-32164160

ABSTRACT

Recently, the rapid development of the Internet of Things (IoT) has led to exponential growth of non-scalar data (e.g., images, videos). Local services are far from satisfying storage requirements, and cloud computing fails to effectively support heterogeneous distributed IoT environments such as wireless sensor networks. To effectively provide smart privacy protection for video data storage, we take full advantage of three patterns of edge computing (multi-access edge computing, cloudlets, and fog computing) to design a hierarchical edge computing architecture, and we propose a low-complexity and highly secure scheme based on it. The video is divided into three parts stored in completely different facilities. Specifically, the most significant bits of key frames are stored directly in local sensor devices, while the least significant bits of key frames are encrypted and sent to the semi-trusted cloudlets. Non-key frames are compressed with two-layer parallel compressive sensing, encrypted by the 2D logistic-skew tent map, and then transmitted to the cloud. Simulation experiments and theoretical analysis demonstrate that the proposed scheme not only provides smart privacy protection for big video data storage based on hierarchical edge computing, but also avoids additional computation burden and storage pressure.
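The chaotic-map encryption step can be illustrated with a keystream generator. The paper uses a 2D logistic-skew tent map; the 1D logistic map below is a simplified stand-in showing how chaotic iterates become key bytes, and the seed and parameter values are arbitrary.

```python
def logistic_keystream(x0, r, nbytes):
    """Keystream from the logistic map x -> r*x*(1-x).

    For r near 4 the iterates are chaotic: tiny seed changes produce a
    completely different byte sequence, so (x0, r) act as the secret key.
    """
    x, out = x0, bytearray()
    for _ in range(nbytes):
        x = r * x * (1 - x)
        out.append(int(x * 256) % 256)
    return bytes(out)

def xor_cipher(data, key):
    return bytes(a ^ b for a, b in zip(data, key))

frame = b"non-key frame bytes"
ks = logistic_keystream(0.3141592, 3.99, len(frame))
ct = xor_cipher(frame, ks)
assert ct != frame
assert xor_cipher(ct, ks) == frame  # same parameters regenerate the keystream
```

Because the keystream is regenerated from two small numbers, the sensor side stores no key material proportional to the video size, which suits resource-limited edge devices.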

16.
Sensors (Basel) ; 20(17)2020 Aug 31.
Article in English | MEDLINE | ID: mdl-32878202

ABSTRACT

Recent developments in cloud computing allow data to be securely shared between users. This can be used to improve the quality of life of patients and medical staff in the Internet of Medical Things (IoMT) environment. However, in the IoMT cloud environment, there are various security threats to the patient's medical data. As a result, security features such as encryption of collected data and access control by legitimate users are essential. Many studies have been conducted on access control techniques using ciphertext-policy attribute-based encryption (CP-ABE), a form of attribute-based encryption, among various security technologies, and studies are underway to apply them to the medical field. However, several problems persist. First, as the secret key does not identify the user, a user may maliciously distribute the secret key, and such users cannot be traced. Second, attribute-based encryption (ABE) increases the size of the ciphertext depending on the number of attributes specified, which wastes cloud storage and makes decryption computationally expensive for users, who must therefore employ outsourcing servers. Third, a verification process is needed to prove that the results computed on the outsourcing server are computed properly. This paper focuses on the IoMT environment in a study of a CP-ABE-based medical data sharing system with key abuse prevention and verifiable outsourcing in a cloud environment. The proposed scheme can protect the privacy of user data stored in a cloud environment in the IoMT field, and if there is a problem with a secret key delegated by a user, it can trace the user who first delegated the key, preventing the key abuse problem. In addition, the scheme reduces the user's burden when decrypting ciphertext and calculates accurate results through a server that supports constant-sized ciphertext output and verifiable outsourcing. The goal of this paper is to propose a system that enables patients and medical staff to share medical data safely and efficiently in an IoMT environment.

17.
BMC Bioinformatics ; 20(Suppl 9): 366, 2019 Nov 22.
Article in English | MEDLINE | ID: mdl-31757212

ABSTRACT

BACKGROUND: Several large public repositories of microarray and RNA-seq data are available; two prominent examples are ArrayExpress and NCBI GEO. Unfortunately, there is no easy way to import and manipulate data from such resources, because the data are stored in large files, requiring large bandwidth to download and special-purpose data manipulation tools to extract the subsets relevant to a specific analysis. RESULTS: TACITuS is a web-based system that supports rapid query access to high-throughput microarray and NGS repositories. The system is equipped with modules capable of managing large files, storing them in a cloud environment, and extracting subsets of data in an easy and efficient way. The system also supports importing data into Galaxy for further analysis. CONCLUSIONS: TACITuS automates most of the pre-processing needed to analyze high-throughput microarray and NGS data from large publicly available repositories. The system implements several modules to manage large files in an easy and efficient way. Furthermore, it can interoperate with the Galaxy environment, allowing users to analyze data through a user-friendly interface.


Subjects
Big Data, Data Collection, Software, Transcriptome/genetics, Cell Line, Tumor, Databases, Genetic, Humans, User-Computer Interface
18.
Sensors (Basel) ; 20(1)2019 Dec 30.
Artigo em Inglês | MEDLINE | ID: mdl-31905910

ABSTRACT

In the IoT (Internet of Things) environment, smart homes, smart grids, and telematics constantly generate data with complex attributes. These data have low heterogeneity and poor interoperability, which complicates data management and value mining. The promising combination of blockchain and the Internet of Things, known as BCoT (blockchain of things), can solve these problems. This paper introduces DCOMB (dual combination Bloom filter), an innovative method that converts the computational power of Bitcoin mining into computational power for queries. Using DCOMB, the paper builds a blockchain-based IoT data query model in which queries are implemented solely through the mining hash calculation. The model combines IoT data streams with blockchain timestamps, improving the interoperability of the data and the versatility of the IoT database system. Experimental results show that DCOMB achieves higher random-read query performance and a lower error rate than COMB (combination Bloom filter), and that both DCOMB and COMB outperform MySQL (My Structured Query Language).
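A Bloom filter, the structure underlying COMB and DCOMB, answers membership queries with no false negatives and a tunable false-positive rate. A minimal sketch using double hashing (the parameters are illustrative, not the paper's tuning, and this is not the dual-combination construction itself):

```python
# Minimal Bloom filter sketch: k bit positions per item are derived from
# two SHA-256 halves via double hashing. Membership tests may yield false
# positives but never false negatives.
import hashlib

class BloomFilter:
    def __init__(self, m=1024, k=5):
        self.m, self.k = m, k
        self.bits = bytearray(m)

    def _indexes(self, item):
        digest = hashlib.sha256(item.encode()).digest()
        h1 = int.from_bytes(digest[:8], "big")
        h2 = int.from_bytes(digest[8:16], "big")
        return [(h1 + i * h2) % self.m for i in range(self.k)]

    def add(self, item):
        for idx in self._indexes(item):
            self.bits[idx] = 1

    def might_contain(self, item):
        # True means "possibly present"; False means "definitely absent".
        return all(self.bits[idx] for idx in self._indexes(item))

bf = BloomFilter()
for record in ["sensor-42:reading-1", "sensor-42:reading-2"]:
    bf.add(record)
```

DCOMB's contribution, per the abstract, is reusing the hash computation already performed during mining to drive these index lookups, rather than the filter itself.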

19.
Sensors (Basel) ; 18(6)2018 Jun 04.
Article in English | MEDLINE | ID: mdl-29867037

ABSTRACT

In recent years, the Internet of Things (IoT) has found wide application and attracted much attention. Since most end terminals in IoT have limited storage and computing capabilities, outsourcing data from local devices to the cloud has become a trend. To further reduce communication bandwidth and storage space, data deduplication is widely adopted to eliminate redundant data. However, since data collected in IoT are sensitive and closely related to users' personal information, protecting users' privacy becomes a challenge. Because the channels, such as the wireless channels between terminals and cloud servers in IoT, are public and the cloud servers are not fully trusted, data have to be encrypted before being uploaded to the cloud. Encryption, however, makes deduplication by the cloud server difficult, because ciphertexts differ even when the underlying plaintexts are identical. In this paper, we build a centralized privacy-preserving duplicate-removal storage system that supports both file-level and block-level deduplication. To avoid leaking statistical information about the data, Intel Software Guard Extensions (SGX) technology is used to protect the deduplication process on the cloud server. Experimental analysis demonstrates that the new scheme significantly improves deduplication efficiency and enhances security. We envision that this privacy-preserving duplicate-removal system will be of great use in centralized IoT storage environments.
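One standard way to reconcile encryption with deduplication is convergent (message-locked) encryption: the key is derived from the plaintext itself, so identical blocks yield identical ciphertexts that the server can deduplicate without seeing plaintext. The sketch below illustrates that idea for block-level deduplication; it is not the paper's SGX-protected protocol, and the XOR keystream stands in for a real block cipher:

```python
# Sketch of block-level deduplication over convergent encryption.
# key = H(plaintext), so equal blocks produce equal ciphertexts and
# equal fingerprints. The XOR "cipher" is a toy stand-in only.
import hashlib

def convergent_encrypt(block: bytes) -> tuple[str, bytes]:
    key = hashlib.sha256(block).digest()              # key derived from content
    stream = hashlib.sha256(key + b"stream").digest()
    ct = bytes(b ^ stream[i % 32] for i, b in enumerate(block))
    return hashlib.sha256(ct).hexdigest(), ct         # (fingerprint, ciphertext)

store = {}  # fingerprint -> ciphertext, kept once per unique block

def upload(block: bytes) -> bool:
    """Returns True if the block was a duplicate and not re-stored."""
    fp, ct = convergent_encrypt(block)
    if fp in store:
        return True
    store[fp] = ct
    return False

upload(b"patient-record-chunk-A")
```

The statistical leakage inherent in this index (the server learns which blocks repeat) is precisely what the paper's scheme hides by running the deduplication logic inside an SGX enclave.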

20.
J Med Syst ; 42(8): 152, 2018 Jul 05.
Article in English | MEDLINE | ID: mdl-29974270

ABSTRACT

To achieve confidentiality, authentication, and integrity of medical data, and to support fine-grained access control, we propose a secure electronic health record (EHR) system based on an attribute-based cryptosystem and blockchain technology. In our system, we use attribute-based encryption (ABE) and identity-based encryption (IBE) to encrypt medical data, and identity-based signatures (IBS) to implement digital signatures. To combine the functions of ABE, IBE, and IBS in a single cryptosystem, we introduce a new cryptographic primitive called combined attribute-based/identity-based encryption and signature (C-AB/IB-ES). This greatly simplifies the management of the system and avoids introducing separate cryptographic systems for different security requirements. In addition, we use blockchain techniques to ensure the integrity and traceability of medical data. Finally, we present a demonstration application for a medical-insurance scenario.
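The blockchain's role here, integrity and traceability, rests on hash chaining: each entry commits to the hash of its predecessor, so any tampering with an earlier record is detectable. A minimal sketch of that mechanism (illustrative only; it does not reproduce the C-AB/IB-ES cryptosystem or a real blockchain):

```python
# Sketch of hash-chained records: each entry stores the hash of the
# previous entry, so modifying any record breaks verification of the
# chain from that point onward.
import hashlib
import json

def entry_hash(entry: dict) -> str:
    return hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()
    ).hexdigest()

def append_record(chain: list, payload: str) -> None:
    prev = entry_hash(chain[-1]) if chain else "0" * 64
    chain.append({"payload": payload, "prev": prev})

def verify_chain(chain: list) -> bool:
    return all(
        chain[i]["prev"] == entry_hash(chain[i - 1])
        for i in range(1, len(chain))
    )

chain = []
append_record(chain, "encrypted EHR blob #1")
append_record(chain, "encrypted EHR blob #2")
```

In the paper's system the payloads would be C-AB/IB-ES ciphertexts and signatures, with the chain maintained by blockchain consensus rather than a single party.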


Subjects
Cloud Computing, Computer Security, Electronic Health Records, Algorithms, Computer Systems, Confidentiality, Health Insurance