Results 1 - 9 of 9
1.
Sensors (Basel) ; 23(1)2022 Dec 29.
Article in English | MEDLINE | ID: mdl-36616956

ABSTRACT

The ever-increasing needs for data acquisition, storage and analysis in transportation systems have led to the adoption of new technologies and methods to provide efficient and reliable solutions. Both highways and vehicles nowadays host a wide variety of sensors collecting different types of highly fluctuating data such as speed, acceleration and direction. The sheer volume and variety of these data call for big data techniques and analytics in the context of state-of-the-art intelligent transportation systems (ITS). Moreover, the scalability needs of fleet and traffic management systems point towards designing and deploying distributed architecture solutions that can be expanded in order to avoid technological and/or technical lock-in. Based on the needs and gaps detected in the literature, as well as the available technologies for data gathering, storage and analysis for ITS, the aim of this study is to provide a distributed architecture platform that addresses these deficiencies. The architectural design of the proposed system engages big data frameworks and tools (e.g., the NoSQL database MongoDB, Apache Hadoop) as well as analytics tools (e.g., Apache Spark). The main contribution of this study is the introduction of a holistic platform for the needs of the ITS domain, offering continuous collection, storage and data analysis capabilities. To achieve that, different modules of state-of-the-art methods and tools were combined into a unified platform that supports the entire cycle of data acquisition, storage and analysis at a single point. This leads to a complete solution for ITS applications which lifts the limitations imposed on legacy and current systems by the vast amounts of rapidly changing data, while offering a reliable system for acquisition and storage as well as timely analysis and reporting of these data.


Subjects
Big Data, Data Science, Records, Data Analysis
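The aggregation step such a platform performs can be illustrated with a plain-Python sketch. Field names and values below are hypothetical, not taken from the paper; in a production deployment this would typically run as a Spark job over data stored in MongoDB:

```python
from collections import defaultdict

# Hypothetical sensor records as they might arrive from highway sensors;
# the fields (segment, speed_kmh) are illustrative.
readings = [
    {"segment": "A1", "speed_kmh": 92.0},
    {"segment": "A1", "speed_kmh": 88.0},
    {"segment": "B7", "speed_kmh": 54.0},
    {"segment": "B7", "speed_kmh": 60.0},
    {"segment": "B7", "speed_kmh": 57.0},
]

def average_speed_per_segment(records):
    """Group readings by road segment and compute the mean speed --
    the kind of aggregation an analytics layer would run continuously."""
    totals = defaultdict(lambda: [0.0, 0])  # segment -> [sum, count]
    for r in records:
        acc = totals[r["segment"]]
        acc[0] += r["speed_kmh"]
        acc[1] += 1
    return {seg: s / n for seg, (s, n) in totals.items()}

print(average_speed_per_segment(readings))  # {'A1': 90.0, 'B7': 57.0}
```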
2.
Sensors (Basel) ; 22(4)2022 Feb 18.
Article in English | MEDLINE | ID: mdl-35214517

ABSTRACT

The continuous evolution of information technology (IT) is radically transforming many technical areas and social aspects, reshaping the way we behave and seek entertainment and leisure services. In that context, tourism experiences need to enhance the level of user involvement and integration and to create an ever more personalized and connected experience, by leveraging the differentiated tourist services and information locally present in the territory, by encouraging the active participation of customers, and by taking advantage of the ever-increasing presence of sensors and Internet of Things (IoT) devices deployed in many settings. However, the deep fragmentation of services and technologies adopted in the tourism context, together with the heterogeneity of the information provided by customer sensing and IoT devices, clashes with an effective organization of smart tourism. This article presents APERTO5.0 (an Architecture for Personalization and Elaboration of services and data to Reshape Tourism Offers 5.0), an innovative architecture aiming at a thorough integration and facilitation of the organization and blending of tourism services and information, to enable the re-provisioning of novel services as advanced aggregates or re-elaborations of existing ones. The proposed solution demonstrates its effectiveness in the context of smart tourism on the real use case of the Italian part of the "Francigena way" (a historical pilgrim path).


Subjects
Information Technology, Tourism, Technology
3.
Entropy (Basel) ; 24(6)2022 Jun 07.
Article in English | MEDLINE | ID: mdl-35741515

ABSTRACT

In this paper, we propose a distributed secure delegated quantum computation protocol, by which an almost classical client can delegate a (dk)-qubit quantum circuit to d quantum servers, where each server is equipped with a 2k-qubit register that is used to process only k qubits of the delegated quantum circuit. None of the servers can learn any information about the input and output of the computation. The only requirement on the client is the ability to prepare single qubits in one of the four states (|0⟩ + e^(iθ)|1⟩)/√2, where θ ∈ {0, π/2, π, 3π/2}. The only requirement on the servers is that each pair of them shares some entangled states (|0⟩|+⟩ + |1⟩|−⟩)/√2 as ancillary qubits. Instead of assuming that all servers are interconnected directly by quantum channels, we introduce a third party into our protocol that is designed to distribute the entangled states between those servers. This simplifies the quantum network because the servers do not need to share a quantum channel. In the end, we show that our protocol guarantees unconditional security of the computation in the situation where all servers, including the third party, are honest-but-curious and allowed to cooperate with each other.
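The client's state-preparation requirement can be checked numerically. The sketch below (plain Python, no quantum library) builds the four states (|0⟩ + e^(iθ)|1⟩)/√2 and verifies their norms and overlaps:

```python
import cmath
import math

SQRT2 = math.sqrt(2)

def client_state(theta):
    """Amplitudes of (|0> + e^{i*theta}|1>)/sqrt(2) as a length-2 vector."""
    return [1 / SQRT2, cmath.exp(1j * theta) / SQRT2]

thetas = [0, math.pi / 2, math.pi, 3 * math.pi / 2]
states = [client_state(t) for t in thetas]

def overlap(u, v):
    """Inner product <u|v> between two single-qubit states."""
    return sum(a.conjugate() * b for a, b in zip(u, v))

# Every state is normalized: <psi|psi> = 1.
for s in states:
    assert abs(sum(abs(a) ** 2 for a in s) - 1) < 1e-12

# States whose angles differ by pi/2 have overlap magnitude 1/sqrt(2);
# states whose angles differ by pi are orthogonal.
assert abs(abs(overlap(states[0], states[1])) - 1 / SQRT2) < 1e-12
assert abs(overlap(states[0], states[2])) < 1e-12
```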

4.
Sensors (Basel) ; 21(4)2021 Feb 20.
Article in English | MEDLINE | ID: mdl-33672605

ABSTRACT

The demand for online services is increasing. Services that would once require a long time to understand, use and master are becoming as transparent as possible to users, who tend to focus only on the final goals. Combined with the advantages of unmanned vehicles (UVs), from the unmanned factor to the reduced size and costs, we found an opportunity to bring users a wide variety of UV-supported services, through the Internet of Unmanned Vehicles (IoUV). We analyzed current solutions and identified scalability and genericity as the principal concerns. We then proposed a solution that combines several services and UVs, available from anywhere at any time, on a cloud platform. The solution considers a distributed cloud architecture, composed of users, services, vehicles and a platform, interconnected through the Internet. Each vehicle provides the platform with an abstract and generic interface for the essential commands. This modular design therefore makes it easier to create new services and to reuse the different vehicles. To confirm the feasibility of the solution we implemented a prototype comprising a cloud-hosted platform and the integration of custom-built small-sized cars, a custom-built quadcopter, and a commercial Vertical Take-Off and Landing (VTOL) aircraft. To validate the prototype and the vehicles' remote control, we created several services accessible via a web browser and controlled through a computer keyboard. We tested the solution on a local network, remote networks and mobile networks (i.e., 3G and Long-Term Evolution (LTE)) and demonstrated the benefits of decentralizing the communications into multiple point-to-point links for remote control. Consequently, the solution can provide scalable UV-based services, with low technical effort, for anyone, anytime and anywhere.
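The abstract, generic command interface that each vehicle exposes to the platform might look like the following sketch (class and method names are hypothetical, not taken from the paper):

```python
from abc import ABC, abstractmethod

class Vehicle(ABC):
    """Hypothetical generic interface every UV exposes to the platform."""

    @abstractmethod
    def command(self, action: str, **params) -> str: ...

class SmallCar(Vehicle):
    """Toy ground vehicle: records the generic commands it receives."""

    def __init__(self):
        self.log = []

    def command(self, action, **params):
        self.log.append((action, params))
        return "ack"

def dispatch(vehicle: Vehicle, action: str, **params) -> str:
    # The platform only ever talks to the abstract interface, so a new
    # vehicle type (quadcopter, VTOL, ...) plugs in without platform changes.
    return vehicle.command(action, **params)

car = SmallCar()
assert dispatch(car, "move", direction="forward", speed=0.5) == "ack"
assert dispatch(car, "stop") == "ack"
print(car.log)
```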

5.
Entropy (Basel) ; 21(4)2019 Apr 02.
Article in English | MEDLINE | ID: mdl-33267071

ABSTRACT

Recently, deep learning has achieved state-of-the-art performance in more aspects than traditional machine-learning methods based on shallow architectures. However, in order to achieve higher accuracy, it is usually necessary to extend the network depth or to ensemble the results of different neural networks. Increasing network depth or ensembling different networks increases the demand for memory and computing resources. This makes it difficult to deploy deep-learning models in resource-constrained scenarios such as drones, mobile phones, and autonomous driving. Improving network performance without expanding the network scale has become a hot research topic. In this paper, we propose a cross-architecture online-distillation approach to solve this problem by transmitting supplementary information between different networks. We use the ensemble method to aggregate networks of different structures, thus forming better teachers than traditional distillation methods. In addition, discontinuous distillation with progressively enhanced constraints is used to replace fixed distillation in order to reduce the loss of information diversity in the distillation process. Our training method improves the distillation effect and achieves strong network-performance improvement. We used some popular models to validate the results. On the CIFAR100 dataset, AlexNet's accuracy was improved by 5.94%, VGG by 2.88%, ResNet by 5.07%, and DenseNet by 1.28%. Extensive experiments were conducted to demonstrate the effectiveness of the proposed method. On the CIFAR10, CIFAR100, and ImageNet datasets, we observed significant improvements over traditional knowledge distillation.
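The core distillation ingredients mentioned here (temperature-softened teacher outputs, an ensemble teacher, and a KL-divergence loss) can be sketched in plain Python; the logit values are purely illustrative:

```python
import math

def softmax(logits, T=1.0):
    """Temperature-scaled softmax: higher T gives softer targets."""
    exps = [math.exp(z / T) for z in logits]
    s = sum(exps)
    return [e / s for e in exps]

def kl_divergence(p, q):
    """KL(p || q): the usual distillation loss between teacher
    distribution p and student distribution q."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Hypothetical logits for one sample from two ensemble members and a student.
teacher_logits_a = [3.0, 1.0, 0.2]
teacher_logits_b = [2.5, 1.4, 0.1]
student_logits   = [2.0, 1.5, 0.5]

T = 4.0
# Ensemble teacher: average the members' softened distributions.
pa, pb = softmax(teacher_logits_a, T), softmax(teacher_logits_b, T)
teacher = [(x + y) / 2 for x, y in zip(pa, pb)]
student = softmax(student_logits, T)

loss = kl_divergence(teacher, student)
assert loss >= 0                            # KL is non-negative
assert kl_divergence(teacher, teacher) < 1e-12  # zero only when identical
```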

6.
Sensors (Basel) ; 18(5)2018 May 17.
Article in English | MEDLINE | ID: mdl-29772850

ABSTRACT

A general framework of data fusion is presented based on projecting the probability distribution of true states and measurements around the predicted states and actual measurements onto the constraint manifold. The constraint manifold represents the constraints to be satisfied among true states and measurements, which is defined in the extended space with all the redundant sources of data such as state predictions and measurements considered as independent variables. By the general framework, we mean that it is able to fuse any correlated data sources while directly incorporating constraints and identifying inconsistent data without any prior information. The proposed method, referred to here as the Covariance Projection (CP) method, provides an unbiased and optimal solution in the sense of minimum mean square error (MMSE), if the projection is based on the minimum weighted distance on the constraint manifold. The proposed method not only offers a generalization of the conventional formula for handling constraints and data inconsistency, but also provides a new insight into data fusion in terms of a geometric-algebraic point of view. Simulation results are provided to show the effectiveness of the proposed method in handling constraints and data inconsistency.
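For the special case of two independent, redundant measurements of the same scalar and the linear constraint x1 = x2, the minimum weighted-distance projection described above reduces to classic inverse-variance fusion. A sketch with illustrative numbers:

```python
def fuse_redundant(z1, var1, z2, var2):
    """Project the measurement pair (z1, z2) onto the constraint manifold
    {x1 = x2} under the minimum Mahalanobis (inverse-variance weighted)
    distance. For this linear case the projection collapses to the
    classic inverse-variance fusion formula."""
    w1, w2 = 1 / var1, 1 / var2
    fused = (w1 * z1 + w2 * z2) / (w1 + w2)
    fused_var = 1 / (w1 + w2)
    return fused, fused_var

# Two sensors measure the same quantity; the first is four times as certain.
x, v = fuse_redundant(10.0, 1.0, 14.0, 4.0)
assert abs(x - 10.8) < 1e-12  # pulled toward the more certain measurement
assert abs(v - 0.8) < 1e-12   # fused variance is below both inputs
```

Inconsistent data would show up here as a large constraint residual |z1 − z2| relative to the combined variance, which is how a projection-based framework can flag it without prior information.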

7.
J Biomed Semantics ; 15(1): 9, 2024 Jun 06.
Article in English | MEDLINE | ID: mdl-38845042

ABSTRACT

BACKGROUND: In healthcare, collaboration between different caregivers is increasing, especially considering the shift to homecare. To provide optimal patient care, efficient coordination of data and workflows between these different stakeholders is required. To achieve this, data should be exposed in a machine-interpretable, reusable manner. In addition, there is a need for smart, dynamic, personalized and performant services provided on top of these data. Flexible workflows should be defined that realize their desired functionality, adhere to use-case-specific quality constraints and improve coordination across stakeholders. User interfaces should allow configuring all of this in an easy, user-friendly way. METHODS: A distributed, generic, cascading reasoning reference architecture can solve the presented challenges. It can be instantiated with existing tools built upon Semantic Web technologies that provide data-driven semantic services and construct cross-organizational workflows. These tools include RMLStreamer to generate Linked Data, DIVIDE to adaptively manage contextually relevant local queries, Streaming MASSIF to deploy reusable services, AMADEUS to compose semantic workflows, and RMLEditor and Matey to configure rules to generate Linked Data. RESULTS: A use case demonstrator is built on a scenario that focuses on personalized smart monitoring and cross-organizational treatment planning. The performance and usability of the demonstrator's implementation are evaluated. The former shows that the monitoring pipeline efficiently processes a stream of 14 observations per second: RMLStreamer maps JSON observations to RDF in 13.5 ms, a C-SPARQL query to generate fever alarms is executed on a window of 5 s in 26.4 ms, and Streaming MASSIF generates a smart notification for fever alarms based on severity and urgency in 1539.5 ms. DIVIDE derives the C-SPARQL queries in 7249.5 ms, while AMADEUS constructs a colon cancer treatment plan and performs conflict detection with it in 190.8 ms and 1335.7 ms, respectively. CONCLUSIONS: Existing tools built upon Semantic Web technologies can be leveraged to optimize continuous care provisioning. The evaluation of the building blocks on a realistic homecare monitoring use case demonstrates their applicability, usability and good performance. Further extending the available user interfaces of some tools is required to increase their adoption.


Subjects
Home Care Services, Workflow, Semantics, Humans
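The fever-alarm step of the monitoring pipeline can be mimicked in plain Python; the threshold, class, and field names below are illustrative stand-ins for the actual C-SPARQL query over RDF observations:

```python
from dataclasses import dataclass

@dataclass
class Observation:
    """Hypothetical body-temperature observation in a 5 s window."""
    patient: str
    temp_c: float

def fever_alarms(window, threshold=38.0):
    """Return the observations in the window that should raise a fever
    alarm. The 38.0 degC threshold is an illustrative value, not taken
    from the paper."""
    return [obs for obs in window if obs.temp_c >= threshold]

window = [
    Observation("p1", 37.2),
    Observation("p2", 38.6),
    Observation("p1", 39.1),
]
alarms = fever_alarms(window)
assert [a.patient for a in alarms] == ["p2", "p1"]
```

In the actual architecture this filter is expressed declaratively (and derived adaptively by DIVIDE) rather than hard-coded, which is what makes the queries contextually relevant per patient.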
8.
Sensors (Basel) ; 8(3): 1755-1773, 2008 Mar 13.
Article in English | MEDLINE | ID: mdl-27879791

ABSTRACT

Sensors provide some of the basic input data for risk management of natural and man-made hazards. Here the word 'sensors' covers everything from remote sensing satellites, providing invaluable images of large regions, through instruments installed on the Earth's surface, to instruments situated in deep boreholes and on the sea floor, providing highly detailed point-based information from single sites. Data from such sensors are used in all stages of risk management: hazard, vulnerability and risk assessment in the pre-event phase, information to provide on-site help during the crisis phase, and data to aid in recovery following an event. Because data from sensors play such an important part in improving understanding of the causes of risk, and consequently in its mitigation, considerable investment has been made in the construction and maintenance of highly sophisticated sensor networks. In spite of the ubiquitous need for information from sensor networks, the use of such data is hampered in several ways. Firstly, information about the presence and capabilities of sensor networks operating in a region is difficult to obtain due to a lack of easily available and usable meta-information. Secondly, once sensor networks have been identified, their data are often difficult to access due to a lack of interoperability between dissemination and acquisition systems. Thirdly, the transfer and processing of information from sensors is limited, again by incompatibilities between systems. The current situation therefore leads to inefficiency and limited use of available data that have an important role to play in risk mitigation. In view of this situation, the European Commission (EC) is funding a number of Integrated Projects within the Sixth Framework Programme concerned with improving the accessibility of data and services for risk management. Two of these projects, 'Open Architecture and Spatial Data Infrastructure for Risk Management' (ORCHESTRA, http://www.eu-orchestra.org/) and 'Sensors Anywhere' (SANY, http://sany-ip.eu/), are discussed in this article. These projects have developed an open distributed information technology architecture and have implemented web services for accessing and using data emanating, for example, from sensor networks. These developments are based on existing data and service standards proposed by international organizations. The projects seek to extend the ideals of the EC directive INSPIRE (http://inspire.jrc.it), which was launched in 2001 and whose implementation began this year (2007), into the risk management domain. Thanks to the open nature of the architecture and services being developed within these projects, they can be implemented by any interested party and accessed by all potential users. The architecture is based around a service-oriented approach that makes use of Internet-based applications (web services) whose inputs and outputs conform to standards. The benefit of this philosophy is that it is expected to favor the emergence of an operational market for risk management services in Europe, it eliminates the need to replace or radically alter the hundreds of already operational IT systems in Europe (drastically lowering costs for users), and it allows users and stakeholders to achieve interoperability while using the system most adequate to their needs, budgets, culture, etc. (i.e., it has flexibility).

9.
Technol Health Care ; 24(6): 827-842, 2016 Nov 14.
Article in English | MEDLINE | ID: mdl-27392830

ABSTRACT

BACKGROUND: The integration of heterogeneous electronic health record systems by building an interoperable nationwide electronic health record system provides undisputable benefits in health care, such as superior health information quality, prevention of medical errors and cost savings. OBJECTIVE: This paper proposes a semi-distributed system architecture for an integrated national electronic health record system that incorporates the advantages of the two dominant approaches, the centralized architecture and the distributed architecture. METHODS: The high-level design of the main elements of the proposed architecture is provided, along with diagrams of execution and operation and the data synchronization architecture for the proposed solution. RESULTS: The proposed approach effectively handles issues related to redundancy, consistency, security, privacy, availability, load balancing, maintainability, complexity and interoperability of citizens' health data. CONCLUSIONS: The proposed semi-distributed architecture offers a robust interoperability framework without requiring healthcare providers to change their local EHR systems. It is a pragmatic approach that takes into account the characteristics of the Greek national healthcare system, along with the national public administration data communication network infrastructure, to achieve EHR integration at acceptable implementation cost.


Subjects
Computer Communication Networks/organization & administration, Electronic Health Records/organization & administration, Systems Integration, Greece, Humans