Results 1 - 20 of 25
1.
Sensors (Basel) ; 23(10)2023 May 18.
Article in English | MEDLINE | ID: mdl-37430781

ABSTRACT

In cross-border transactions, the transmission and processing of logistics information directly affect the trading experience and efficiency. Internet of Things (IoT) technology can make this process more intelligent, efficient, and secure. However, most traditional IoT logistics systems are provided by a single logistics company, and these independent systems must withstand high computing loads and network bandwidth demands when processing large-scale data. Additionally, because of the complex network environment of cross-border transactions, the platform's information security and system security are difficult to guarantee. To address these challenges, this paper designs and implements an intelligent cross-border logistics platform that combines serverless architecture with microservice technology. The system can uniformly distribute the services of all logistics companies and partition microservices according to actual business needs. It also designs corresponding Application Programming Interface (API) gateways to solve the interface-exposure problem of microservices, thereby ensuring system security. Furthermore, asymmetric encryption is used within the serverless architecture to protect cross-border logistics data. Experiments validate the advantages of combining serverless architecture and microservices: the platform's operating costs and system complexity are significantly reduced in cross-border logistics scenarios, and resources can be scaled and billed according to application requirements at runtime. The platform effectively improves the security of cross-border logistics service processes and meets cross-border transaction needs in terms of data security, throughput, and latency.
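To make the asymmetric-encryption step concrete, the following is a minimal sketch, assuming textbook RSA on small primes purely for illustration; a production platform would use a vetted cryptography library and hybrid encryption rather than this toy:

```python
# Toy sketch of the asymmetric-encryption idea applied to cross-border
# logistics data. Textbook RSA with tiny primes -- for illustration only,
# NOT secure; a real system would use a vetted library and hybrid
# (asymmetric + symmetric) encryption.

def make_keypair(p: int, q: int, e: int = 17):
    """Build a textbook RSA keypair from two small primes."""
    n = p * q
    phi = (p - 1) * (q - 1)
    d = pow(e, -1, phi)          # private exponent: e * d == 1 (mod phi)
    return (e, n), (d, n)        # (public key, private key)

def encrypt(msg: int, pub) -> int:
    e, n = pub
    return pow(msg, e, n)

def decrypt(cipher: int, priv) -> int:
    d, n = priv
    return pow(cipher, d, n)

pub, priv = make_keypair(61, 53)         # n = 3233
waybill_id = 1234                        # a hypothetical logistics field
cipher = encrypt(waybill_id, pub)
assert decrypt(cipher, priv) == waybill_id
```

The same split applies in the serverless setting: functions hold only the public key for encryption, while the private key stays with the authorized decrypting service.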

2.
Sensors (Basel) ; 23(15)2023 Jul 29.
Article in English | MEDLINE | ID: mdl-37571578

ABSTRACT

The rapid development of Internet of Things (IoT) communication devices has brought about significant convenience. However, simultaneously, the destruction of communication infrastructure in emergency situations often leads to communication disruptions and challenges in information dissemination, severely impacting rescue operations and the safety of the affected individuals. To address this challenge, IoT big data analytics and unmanned aerial vehicle (UAV) technologies have emerged as key elements in the solution. By analyzing large-scale sensor data, user behavior, and communication traffic, IoT big data analytics can provide real-time communication demand prediction and network optimization strategies, offering decision support for post-disaster communication reconstruction. Given the unique characteristics of post-disaster scenarios, this paper proposes a UAV-assisted communication coverage strategy based on IoT big data analytics. This strategy employs UAVs in a cruising manner to assist in communication by partitioning the target area into multiple cells, each satisfying the minimum data requirements for user communication. Depending on the distribution characteristics of users, flight-communication or hover-communication protocols are selectively employed to support communication. By optimizing the UAV's flight speed and considering the coverage index, fairness index, and average energy efficiency of the mission's target area, the Inner Spiral Cruise Communication Coverage (IS-CCC) algorithm is proposed to plan the UAV's cruising trajectory and achieve UAV-based communication coverage. Simulation results demonstrate that this strategy can achieve energy-efficient cruising communication coverage in regions with complex user distributions, thereby reducing energy consumption in UAV-based communication.
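The cell-partition and cruising idea can be illustrated with a small sketch; note this is only a hypothetical rendering of an inward spiral over a grid, not the paper's actual IS-CCC algorithm:

```python
# Hypothetical sketch of the partition step behind an inner-spiral cruise:
# divide a square target area into an n x n grid of cells and visit them
# in an outside-in spiral, as a cruising UAV might. Grid size and visit
# order are illustrative assumptions, not the paper's exact IS-CCC design.

def inner_spiral(n: int):
    """Return grid cells (row, col) in outside-in spiral order."""
    top, bottom, left, right = 0, n - 1, 0, n - 1
    order = []
    while top <= bottom and left <= right:
        for c in range(left, right + 1):              # top edge, left to right
            order.append((top, c))
        for r in range(top + 1, bottom + 1):          # right edge, downward
            order.append((r, right))
        if top < bottom:
            for c in range(right - 1, left - 1, -1):  # bottom edge, leftward
                order.append((bottom, c))
        if left < right:
            for r in range(bottom - 1, top, -1):      # left edge, upward
                order.append((r, left))
        top, bottom, left, right = top + 1, bottom - 1, left + 1, right - 1
    return order

path = inner_spiral(3)
assert path[0] == (0, 0) and path[-1] == (1, 1)   # ends at the centre cell
assert len(path) == 9                              # every cell visited once
```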

3.
Sensors (Basel) ; 23(3)2023 Jan 29.
Article in English | MEDLINE | ID: mdl-36772538

ABSTRACT

Internet of Things (IoT) finance extends financial services to the whole physical commodity society with the help of IoT technology to realize financial automation and intelligence. However, the security of IoT finance still needs to be improved. Blockchain has the characteristics of decentralization, immutability, and faster settlement, and has gradually been applied to the field of IoT finance. Blockchain is also considered an effective way to resolve the problems of the traditional supply chain finance industry, such as the inability to transmit core enterprise credit, the failure of full-chain business information connections, and the difficulty of clearing and settlement. In supply chain finance, the strongest enterprise in the chain applies to the bank for a credit guarantee to obtain loans, and the funds circulate through the supply chain so that every enterprise in the chain can obtain working capital and realize profits, maximizing common interests. In this paper, a financial management platform based on the integration of blockchain and supply chain finance is designed and implemented. Blockchain is used to integrate supply chain finance, synchronize the bank account payment system, realize the automatic flow of funds and process supervision, and automatically settle account periods based on smart contracts. The four functional modules of the system are designed using the Unified Modeling Language (UML), and the model-view-controller (MVC) architecture is selected as the main architecture of the system. System tests show that the proposed platform effectively improves system security and can use the information in the blockchain to provide multi-level financing services for enterprises in supply chain finance.

4.
Sensors (Basel) ; 23(2)2023 Jan 14.
Article in English | MEDLINE | ID: mdl-36679767

ABSTRACT

Mobile applications have grown rapidly over the past few decades to offer futuristic services such as autonomous vehicles, smart farming, and smart cities. Such applications require ubiquitous, real-time, and secure communications to deliver services quickly. Toward this aim, sixth-generation (6G) wireless technology offers superior performance with high reliability, enhanced transmission rates, and low latency. However, managing the resources of the aforementioned applications is highly complex in such a precarious network. An adversary can perform various network-related attacks (i.e., data injection or modification) to jeopardize the regular operation of smart applications. Therefore, incorporating blockchain technology in smart applications can be a prominent solution to tackle security, reliability, and data-sharing privacy concerns. Motivated by this, we present a case study on public safety applications that utilizes the essential characteristics of artificial intelligence (AI), blockchain, and a 6G network to handle data-integrity attacks on crime data. The case study is assessed using various performance parameters, considering blockchain scalability, packet drop ratio, and training accuracy. Lastly, we explore different research challenges of adopting blockchain in the 6G wireless network.


Subjects
Artificial Intelligence, Blockchain, Reproducibility of Results, Intelligence, Agriculture, Computer Security
5.
Sensors (Basel) ; 23(4)2023 Feb 10.
Article in English | MEDLINE | ID: mdl-36850606

ABSTRACT

A cognitive radio network (CRN) is an intelligent network that can detect unoccupied spectrum space without interfering with the primary user (PU). Spectrum scarcity arises from static channel allocation, which the CRN alleviates. Spectrum handoff (SHO) management is a critical problem that must be addressed in the CRN to ensure uninterrupted connectivity and profitable use of unallocated spectrum space for secondary users (SUs). SHO has some disadvantages, namely communication delay and power consumption, so reducing the number of handoffs should be a priority. This study proposes dynamic spectrum access (DSA) to check for channels available to the SU during handoff, using a machine-learning-assisted metaheuristic algorithm. Simulation results show that the proposed support vector machine-based red deer algorithm (SVM-RDA) is resilient and has low complexity. The experimental evaluation reports the number of handoffs, unsuccessful handoffs, handoff delay, throughput, signal-to-noise ratio (SNR), SU bandwidth, and total spectrum bandwidth. The technique anticipates handoff delay and minimizes the number of handoffs, improving system performance during SHO. The results show that the recommended method makes better predictions with fewer handoffs than the three baseline methods considered.
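The stay-or-handoff decision an SVM learns can be sketched as a linear decision function over channel features; the weights below are purely illustrative assumptions, standing in for the trained SVM-RDA model:

```python
# Minimal stand-in for an SVM-style handoff decision: a linear score over
# channel features (SNR, channel idle probability) classifies whether the
# secondary user should stay on the channel or hand off. The weights and
# bias are illustrative; the paper trains a real SVM tuned by the red
# deer metaheuristic.

def handoff_decision(snr_db: float, idle_prob: float,
                     w=(0.8, 5.0), bias=-6.0) -> str:
    """Linear score > 0 -> stay on channel; otherwise hand off."""
    score = w[0] * snr_db + w[1] * idle_prob + bias
    return "stay" if score > 0 else "handoff"

assert handoff_decision(15.0, 0.9) == "stay"      # strong, mostly idle channel
assert handoff_decision(2.0, 0.1) == "handoff"    # weak, busy channel
```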

6.
Sensors (Basel) ; 23(9)2023 Apr 22.
Article in English | MEDLINE | ID: mdl-37177403

ABSTRACT

The aim of the peer-to-peer (P2P) decentralized gaming industry has shifted towards realistic gaming environment (GE) support for game players (GPs). Recent innovations in the metaverse have motivated the gaming industry to look beyond augmented reality and virtual reality engines, which improve the realism of virtual game worlds. In gaming metaverses (GMs), GPs can play, socialize, and trade virtual objects in the GE. On game servers (GSs), the collected GM data are analyzed by artificial intelligence models to personalize the GE for each GP. However, communication with GSs suffers from high latency, bandwidth concerns, and issues regarding the security and privacy of GP data, which pose a severe threat to the emerging GM landscape. Thus, we propose a scheme, Game-o-Meta, that integrates federated learning into the GE, with GP data being trained on local devices only. We envision the GE over a sixth-generation tactile internet service to address the bandwidth and latency issues and assure real-time haptic control. In the GM, the GP's game tasks are collected and trained on the GS, and then a pre-trained model is downloaded by the GP and trained further using local data. The proposed scheme was compared against traditional schemes on parameters such as GP task offloading, GP avatar rendering latency, and GS availability. The results indicated the viability of the proposed scheme.

7.
Sensors (Basel) ; 22(11)2022 May 26.
Article in English | MEDLINE | ID: mdl-35684668

ABSTRACT

Integrating information and communication technology (ICT) with energy grid infrastructure introduces the smart grid (SG), which simplifies energy generation, transmission, and distribution. ICT is embedded in selected parts of the grid network, which partially deploys the SG and raises various issues such as energy losses, either technical or non-technical (i.e., energy theft). Therefore, energy theft detection plays a crucial role in reducing the energy generation burden on the SG and meeting consumer demand for energy. Motivated by these facts, in this paper we propose a deep learning (DL)-based energy theft detection scheme, referred to as GrAb, which uses a data-driven analytics approach. GrAb uses a DL-based long short-term memory (LSTM) model to predict energy consumption from smart meter data. Then, a threshold calculator is used to derive a consumption threshold. Both the predicted energy consumption and the threshold value are passed to a support vector machine (SVM)-based classifier that categorizes the energy losses into technical, non-technical (energy theft), and normal consumption. The proposed data-driven theft detection scheme identifies various forms of energy theft (e.g., smart meter data manipulation or clandestine connections). Experimental results show that GrAb identifies energy theft more accurately than state-of-the-art approaches.
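The final classification step of such a pipeline can be sketched as follows, assuming a simple relative-deviation rule in place of the paper's trained SVM; the threshold and labels are illustrative:

```python
# Sketch of the prediction-vs-threshold classification idea under
# simplifying assumptions: given a predicted consumption (the paper uses
# an LSTM for this) and a deviation threshold, the metered reading is
# mapped to normal / technical loss / non-technical loss. This simple
# rule is an illustrative stand-in for the paper's SVM classifier.

def classify_loss(predicted_kwh: float, metered_kwh: float,
                  threshold: float = 0.15) -> str:
    deviation = (predicted_kwh - metered_kwh) / predicted_kwh
    if abs(deviation) <= threshold:
        return "normal"
    # metered far below prediction -> suspicious under-reporting
    return "non-technical (theft)" if deviation > 0 else "technical"

assert classify_loss(10.0, 9.5) == "normal"
assert classify_loss(10.0, 6.0) == "non-technical (theft)"
```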


Subjects
Deep Learning, Computer Communication Networks, Physical Phenomena, Support Vector Machine, Theft
8.
Sensors (Basel) ; 22(13)2022 Jun 26.
Article in English | MEDLINE | ID: mdl-35808325

ABSTRACT

In the Smart Grid (SG), Transactive Energy Management (TEM) is one of the most promising approaches to boost consumer participation in energy generation and energy management and to establish decentralized energy market models using Peer-to-Peer (P2P) trading. In P2P, a prosumer produces electric energy on their premises using Renewable Energy Resources (RES) such as solar energy, wind energy, etc. This generated energy is then traded with consumers (who need the energy) in a nearby locality. P2P facilitates energy exchange in the localized micro-energy markets of the TEM system. Such decentralized P2P energy management can cater to diverse prosumer and utility business models. However, existing P2P approaches suffer from several issues, such as a single point of failure and network bandwidth, scalability, trust, and security concerns. To handle these issues, this paper proposes a Decentralized and Transparent P2P Energy Trading (DT-P2PET) scheme using blockchain. The proposed DT-P2PET scheme aims to reduce the grid's energy generation and management burden while increasing profit for both consumers and prosumers through a dynamic pricing mechanism. The DT-P2PET scheme uses Ethereum-blockchain-based Smart Contracts (SCs) and the InterPlanetary File System (IPFS) for P2P energy trading. Furthermore, a recommender mechanism is introduced to increase the number of prosumers. The Ethereum SCs are designed and deployed to perform P2P trading in real time. The DT-P2PET scheme is evaluated on parameters such as profit generation (for both prosumer and consumer), data storage cost, network bandwidth, and data transfer rate in comparison with existing approaches.


Subjects
Blockchain, Commerce, Computer Systems, Information Storage and Retrieval
9.
Sensors (Basel) ; 20(4)2020 Feb 22.
Article in English | MEDLINE | ID: mdl-32098444

ABSTRACT

With the continuous development of science and engineering technology, our society has entered the era of the mobile Internet of Things (MIoT). MIoT refers to the combination of advanced manufacturing technologies with the Internet of Things (IoT) to create a flexible digital manufacturing ecosystem. Wireless communication technology in the IoT is a bridge between mobile devices. Therefore, introducing machine learning (ML) algorithms into MIoT wireless communication has become a research direction of interest. However, traditional key-based wireless communication methods exhibit security problems and cannot meet the security requirements of the MIoT. Based on research into physical-layer communication and the support vector data description (SVDD) algorithm, this paper establishes a radio frequency fingerprint (RFF or RF fingerprint) authentication model for communication devices. Communication devices in the MIoT are accurately and efficiently identified by extracting the radio frequency fingerprint of the communication signal. In simulation experiments, this paper combines the neighborhood component analysis (NCA) method and the SVDD method to establish a communication device authentication model. At a signal-to-noise ratio (SNR) of 15 dB, the authentication success rate (ASR) for authentic devices and the detection success rate (RSR) for rogue devices are both 90%.

10.
Sensors (Basel) ; 20(9)2020 Apr 26.
Article in English | MEDLINE | ID: mdl-32357404

ABSTRACT

Log anomaly detection is an efficient method to manage modern large-scale Internet of Things (IoT) systems. More and more works apply natural language processing (NLP) methods, in particular word2vec, to log feature extraction. Word2vec can extract the relevance between words and vectorize the words. However, the computational cost of training word2vec is high. Anomalies in logs depend not only on an individual log message but also on the log message sequence. Therefore, the word vectors from word2vec cannot be used directly; they need to be transformed into vectors of log events and further into vectors of log sequences. To reduce computational cost and avoid multiple transformations, in this paper we propose an offline feature extraction model, named LogEvent2vec, which takes the log event as the input of word2vec, extracts the relevance between log events, and vectorizes log events directly. LogEvent2vec can work with any coordinate transformation method and anomaly detection model. After obtaining the log event vectors, we transform them into log sequence vectors by bary or tf-idf weighting, and three kinds of supervised models (Random Forests, Naive Bayes, and Neural Networks) are trained to detect anomalies. We have conducted extensive experiments on a real public log dataset from BlueGene/L (BGL). The experimental results demonstrate that LogEvent2vec can reduce computational time by a factor of 30 and improve accuracy compared with word2vec. LogEvent2vec with bary and Random Forests achieves the best F1-score, and LogEvent2vec with tf-idf and Naive Bayes needs the least computational time.
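The aggregation from event vectors to a sequence vector can be sketched as below; the toy event vectors are assumptions, and only the tf-idf weighting idea is illustrated:

```python
# Sketch of the aggregation step: combine per-event vectors (which
# LogEvent2vec learns with word2vec over event IDs) into one log-sequence
# vector using tf-idf weights. The two-dimensional toy vectors are made
# up; only the weighting scheme itself is shown.
import math

def tfidf_sequence_vector(sequence, event_vectors, corpus):
    """Weight each event's vector by its tf-idf within `sequence`."""
    n_docs = len(corpus)
    dim = len(next(iter(event_vectors.values())))
    out = [0.0] * dim
    for event in set(sequence):
        tf = sequence.count(event) / len(sequence)           # term frequency
        df = sum(1 for seq in corpus if event in seq)        # document frequency
        idf = math.log((1 + n_docs) / (1 + df)) + 1          # smoothed idf
        for i, x in enumerate(event_vectors[event]):
            out[i] += tf * idf * x
    return out

vecs = {"E1": [1.0, 0.0], "E2": [0.0, 1.0]}
corpus = [["E1", "E2"], ["E1", "E1"], ["E2"]]
v = tfidf_sequence_vector(["E1", "E1", "E2"], vecs, corpus)
assert len(v) == 2 and v[0] > 0 and v[1] > 0
```

The resulting sequence vectors would then feed a downstream classifier such as a Random Forest.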

11.
Sensors (Basel) ; 20(5)2020 Feb 29.
Article in English | MEDLINE | ID: mdl-32121445

ABSTRACT

The proper utilization of road information can improve the performance of relay-node selection methods. However, existing schemes are only applicable to a specific road structure, which limits their application in real-world scenarios where more than one road structure usually exists in the Region of Interest (RoI), or even within the communication range of a sender. In this paper, we propose an adaptive relay-node selection (ARNS) method based on the exponential partition to implement message broadcasting in complex scenarios. First, we improve a relay-node selection method for curved road scenarios by redefining the optimal position in light of the distribution of obstacles. Then, we propose a criterion for classifying road structures based on their broadcast characteristics. Finally, ARNS is designed to adaptively apply the appropriate relay-node selection method based on the exponential partition in realistic scenarios. Simulation results on a real-world map show that the end-to-end broadcast delay of ARNS is reduced by at least 13.8% compared to the beacon-based relay-node selection method, and by at least 14.0% compared to the trinary partitioned black-burst-based broadcast protocol (3P3B)-based relay-node selection method. Broadcast coverage in curved road scenarios with obstacles increases by 3.6-7%, benefiting from the consideration of the obstacle distribution. Moreover, ARNS achieves a higher and more stable packet delivery ratio (PDR) than existing methods, owing to the adaptive selection mechanism.

12.
Sensors (Basel) ; 20(3)2020 Jan 30.
Article in English | MEDLINE | ID: mdl-32019128

ABSTRACT

The development of the Internet of Things (IoT) plays a very important role in processing data at the edge of a network. It is therefore very important to protect the privacy of IoT devices when these devices process and transfer data. A mesh signature (MS) is a useful cryptographic tool that allows a signer to sign any message anonymously. As a result, the signer can hide specific identity information in the mesh signature; that is, identifying information (such as a personal public key) can be hidden in a list of tuples consisting of public keys and messages. Therefore, we propose an improved mesh signature scheme for IoT devices in this paper. IoT devices, viewed as the signers, may sign their published data through the proposed mesh signature scheme, and their specific identities can be hidden in a list of possible signers. Additionally, a mesh signature consists of atomic signatures, which can be reusable. Therefore, for the large amounts of data published by IoT devices, atomic signatures on the same data can be reused so as to decrease the number of signatures generated by the IoT devices in our proposed scheme. Compared with the original mesh signature scheme, the proposed scheme has lower computational costs for generating the final mesh signature and for signature verification. Since atomic signatures are reusable, the proposed scheme has further advantages in generating the final mesh signature by reconstructing atomic signatures. Furthermore, according to our experiment, when the proposed scheme generates a mesh signature on a 10 MB message, the memory consumption is only about 200 KB. It is therefore feasible to use the proposed scheme to protect the identity privacy of IoT devices.
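The reuse of atomic signatures can be sketched as a cache keyed by message, here with HMAC-SHA256 standing in for the paper's atomic signature primitive (an assumption made for brevity):

```python
# Toy illustration of reusing atomic signatures: each message's atomic
# signature is computed once and cached, so repeated data published by an
# IoT device does not trigger re-signing. HMAC-SHA256 is an illustrative
# stand-in for the paper's actual atomic signature primitive.
import hashlib
import hmac

class AtomicSigner:
    def __init__(self, key: bytes):
        self._key = key
        self._cache = {}          # message -> atomic signature
        self.sign_calls = 0

    def atomic_sign(self, message: bytes) -> bytes:
        if message not in self._cache:
            self.sign_calls += 1  # only count fresh signatures
            self._cache[message] = hmac.new(self._key, message,
                                            hashlib.sha256).digest()
        return self._cache[message]

    def mesh_sign(self, messages) -> bytes:
        """Combine atomic signatures into one aggregate digest."""
        agg = hashlib.sha256()
        for m in messages:
            agg.update(self.atomic_sign(m))
        return agg.digest()

signer = AtomicSigner(b"device-key")
signer.mesh_sign([b"reading-1", b"reading-2", b"reading-1"])
assert signer.sign_calls == 2     # "reading-1" was signed only once
```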

13.
Sensors (Basel) ; 20(16)2020 Aug 11.
Article in English | MEDLINE | ID: mdl-32796687

ABSTRACT

Wireless Rechargeable Sensor Networks (WRSN) are not yet fully functional and robust because their operating parameters assume a fixed control velocity and location. This study proposes a novel WRSN scheme with mobile sink (MS) velocity-control strategies for charging nodes and collecting their data. The strip space of the deployed network area is divided into sub-locations with corresponding velocity variants based on nodes' energy expenditure demands. Energy-bottleneck nodes in each sub-location are identified from gathered data on the nodes' residual energy and expenditure. A minimum reliable energy-balanced spanning tree is constructed from the collected data to optimize data transmission paths, balance energy consumption, and reduce data loss during transmission. Experimental comparisons with other methods in the literature show that the proposed scheme is more effective than its competitors at reducing the network packet loss rate, balancing the nodes' energy consumption, and charging the nodes.

14.
Sensors (Basel) ; 20(4)2020 Feb 21.
Article in English | MEDLINE | ID: mdl-32098092

ABSTRACT

Many remote sensing scene classification algorithms improve their classification accuracy by additional modules, which increases the parameters and computing overhead of the model at the inference stage. In this paper, we explore how to improve the classification accuracy of the model without adding modules at the inference stage. First, we propose a network training strategy of training with multi-size images. Then, we introduce more supervision information by triplet loss and design a branch for the triplet loss. In addition, dropout is introduced between the feature extractor and the classifier to avoid over-fitting. These modules only work at the training stage and will not bring about an increase in model parameters at the inference stage. We use ResNet18 as the baseline and add the three modules to the baseline. We perform experiments on three datasets: AID, NWPU-RESISC45, and OPTIMAL. Experimental results show that our model combined with the three modules is more competitive than many existing classification algorithms. In addition, ablation experiments on OPTIMAL show that dropout, triplet loss, and training with multi-size images improve the overall accuracy of the model on the test set by 0.53%, 0.38%, and 0.7%, respectively. The combination of the three modules improves the overall accuracy of the model by 1.61%. It can be seen that the three modules can improve the classification accuracy of the model without increasing model parameters at the inference stage, and training with multi-size images brings a greater gain in accuracy than the other two modules, but the combination of the three modules is better still.
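The triplet loss mentioned above has a standard form, sketched here with plain lists; the margin and embeddings are illustrative:

```python
# Standard triplet-loss computation of the kind the abstract describes:
# pull the positive sample toward the anchor and push the negative away
# by at least a margin. Squared Euclidean distance; values illustrative.

def triplet_loss(anchor, positive, negative, margin: float = 1.0) -> float:
    """max(0, d(a, p) - d(a, n) + margin) with squared Euclidean distance."""
    def d(u, v):
        return sum((x - y) ** 2 for x, y in zip(u, v))
    return max(0.0, d(anchor, positive) - d(anchor, negative) + margin)

# Positive close to anchor, negative far away -> loss collapses to zero.
assert triplet_loss([0.0, 0.0], [0.1, 0.0], [3.0, 0.0]) == 0.0
```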

15.
Sensors (Basel) ; 20(1)2019 Dec 30.
Article in English | MEDLINE | ID: mdl-31905910

ABSTRACT

In the IoT (Internet of Things) environment, smart homes, smart grids, and telematics constantly generate data with complex attributes. These data have low heterogeneity and poor interoperability, which brings difficulties to data management and value mining. The promising combination of blockchain and the Internet of Things, known as BCoT (blockchain of things), can solve these problems. This paper introduces an innovative method, DCOMB (dual combination Bloom filter), which first converts the computational power of Bitcoin mining into computational power for queries. The DCOMB method is then used to build a blockchain-based IoT data query model. DCOMB can implement queries using only mining hash calculation. This model combines the data stream of the IoT with the timestamp of the blockchain, improving the interoperability of data and the versatility of the IoT database system. The experimental results show that the random-read performance of DCOMB queries is higher than that of COMB (combination Bloom filter), and the error rate of DCOMB is lower. Meanwhile, both DCOMB and COMB deliver better query performance than MySQL (My Structured Query Language).
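The membership-query idea underlying COMB/DCOMB can be sketched with a minimal Bloom filter; the dual-filter construction and mining-hash reuse of DCOMB itself are not reproduced:

```python
# Minimal Bloom filter sketch illustrating the hash-only membership
# queries behind COMB/DCOMB: no false negatives, a tunable false-positive
# rate, and lookups that cost only k hash computations. Parameters are
# illustrative defaults.
import hashlib

class BloomFilter:
    def __init__(self, m_bits: int = 1024, k_hashes: int = 3):
        self.m, self.k = m_bits, k_hashes
        self.bits = 0                       # bit array packed into an int

    def _positions(self, item: str):
        for i in range(self.k):
            h = hashlib.sha256(f"{i}:{item}".encode()).digest()
            yield int.from_bytes(h[:8], "big") % self.m

    def add(self, item: str):
        for pos in self._positions(item):
            self.bits |= 1 << pos

    def might_contain(self, item: str) -> bool:
        return all(self.bits >> pos & 1 for pos in self._positions(item))

bf = BloomFilter()
bf.add("tx-0xabc")
assert bf.might_contain("tx-0xabc")         # Bloom filters never miss an added item
```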

16.
Sensors (Basel) ; 19(20)2019 Oct 22.
Article in English | MEDLINE | ID: mdl-31652631

ABSTRACT

Wireless sensor networks (WSN) have deeply influenced the working and living styles of human beings. Information security and privacy for WSN are particularly crucial. Cryptographic algorithms are extensively exploited in WSN applications to ensure security. They are usually implemented in specific chips to achieve high data throughput with fewer computational resources. Cryptographic hardware should be rigorously tested to guarantee the correctness of encryption operations. Scan design significantly improves the test quality of chips and is thus widely used in the semiconductor industry. Nevertheless, scan design provides a backdoor for attackers to deduce the cipher key of a cryptographic core. To protect the security of the cryptographic system, we first present a secure scan architecture, in which automatic test control circuitry is inserted to isolate the cipher key in test mode and clear sensitive information at mode switching. Then, the weaknesses of this architecture are analyzed, and an enhanced scheme using the concept of test authorization is proposed. If the correct authorization key is applied within the specified time, the normal test can be performed; otherwise, only a secure scan test can be performed. The enhanced scan scheme ensures the security of cryptographic chips while retaining the advantages of scan design.

17.
Heliyon ; 10(7): e28861, 2024 Apr 15.
Article in English | MEDLINE | ID: mdl-38601595

ABSTRACT

In the context of increasingly diversified blockchain technology, interoperability among heterogeneous blockchains has become key to further advancing the field. Existing cross-chain technologies, while facilitating data and asset exchange between different blockchains to some extent, have exposed issues such as insufficient security, low efficiency, and inconsistent standards. These issues create significant obstacles to scalability and seamless communication among blockchains within a multi-chain framework. To address this, this paper proposes an efficient method for cross-chain interaction in a multi-chain environment. Building upon the traditional sidechain model, the method employs smart contracts and hash time-locked contracts (HTLCs) to design a cross-chain interaction scheme. This approach decentralizes the execution of the locking, verifying, and unlocking stages of cross-chain transactions, effectively avoiding the centralization risks associated with third-party entities. It also greatly enhances the efficiency of fund transfers between the main chain and sidechains, while ensuring the security of cross-chain transactions to some extent. Additionally, this paper proposes a novel cross-chain data interaction strategy: through smart contracts on the main chain, data from sidechains can be uploaded, verified, and stored on the main chain, achieving convenient and efficient cross-chain data sharing. The contribution of this paper is a decentralized protocol that coordinates the execution of cross-chain interactions without the need to trust external parties, thereby reducing the risk of centralization and enhancing security. Experimental results validate the effectiveness of the solution in increasing transaction security and efficiency, with significant improvements over existing models. Our experiments emphasize the system's ability to handle a variety of transaction scenarios with improved throughput and reduced latency, highlighting the practical applicability and scalability of the approach.
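The HTLC mechanism at the heart of such schemes can be sketched in plain Python; the field names and integer clock are illustrative, and real deployments implement this logic in on-chain smart contracts:

```python
# Hash time-locked contract (HTLC) logic sketched off-chain: funds unlock
# only when the hash preimage is presented before the deadline, and the
# sender can refund only after the deadline passes. Names and the integer
# clock are illustrative assumptions; on-chain contracts enforce the same
# rules with block timestamps.
import hashlib

class HTLC:
    def __init__(self, hashlock: bytes, timelock: int, amount: int):
        self.hashlock, self.timelock, self.amount = hashlock, timelock, amount
        self.state = "locked"

    def claim(self, preimage: bytes, now: int) -> bool:
        """Recipient claims with the secret before the deadline."""
        if (self.state == "locked" and now < self.timelock
                and hashlib.sha256(preimage).digest() == self.hashlock):
            self.state = "claimed"
            return True
        return False

    def refund(self, now: int) -> bool:
        """Sender reclaims the funds after the deadline passes."""
        if self.state == "locked" and now >= self.timelock:
            self.state = "refunded"
            return True
        return False

secret = b"cross-chain-secret"
htlc = HTLC(hashlib.sha256(secret).digest(), timelock=100, amount=5)
assert not htlc.refund(now=50)            # too early to refund
assert htlc.claim(secret, now=50)         # correct preimage, in time
assert htlc.state == "claimed"
```

Running the same hashlock on two chains lets either both transfers complete (the revealed secret unlocks both) or both expire, which is what makes the swap atomic.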

18.
Article in English | MEDLINE | ID: mdl-38954570

ABSTRACT

In recent years, data-driven remote medical management has received much attention, especially for survival time forecasting. By monitoring patients' physical characteristic indexes, intelligent algorithms can be deployed to implement efficient healthcare management. However, such purely medical data-driven scenes generally lack multimedia information, which brings challenges to analysis tasks. To deal with this issue, this paper introduces the idea of ensemble deep learning to enhance feature representation ability and thus knowledge discovery in remote healthcare management. A multiview deep learning-based efficient medical data management framework for survival time forecasting, named MDL-MDM, is proposed. First, basic monitoring data on patients' body indexes are encoded, serving as the data foundation for forecasting tasks. Then, three different neural network models (a convolutional neural network, a graph attention network, and a graph convolution network) are selected to build a hybrid computing framework. Their combination yields a multiview feature learning framework for efficient medical data management. In addition, experiments are conducted on a real medical dataset of cancer patients in the US. Results show that the proposal can predict survival time with a 1% to 2% reduction in prediction error.

19.
Math Biosci Eng ; 20(6): 10428-10443, 2023 Apr 06.
Article in English | MEDLINE | ID: mdl-37322940

ABSTRACT

With the development of intelligent aquaculture, the aquaculture industry is gradually switching from traditional crude farming to an intelligent industrial model. Current aquaculture management mainly relies on manual observation, which cannot comprehensively perceive fish living conditions or support water quality monitoring. Given this situation, this paper proposes a data-driven intelligent management scheme for digital industrial aquaculture based on a multi-object deep neural network (Mo-DIA). Mo-DIA comprises two aspects: fish state management and environmental state management. In fish state management, a double-hidden-layer BP neural network is used to build a multi-objective prediction model, which can effectively predict fish weight, oxygen consumption, and feeding amount. In environmental state management, a multi-objective prediction model based on an LSTM neural network exploits the temporal correlation of the collected water quality data series to predict eight water quality attributes. Finally, extensive experiments conducted on real datasets demonstrate the effectiveness and accuracy of the proposed Mo-DIA.


Subjects
Aquaculture, Neural Networks (Computer), Animals, Agriculture, Water Quality, Fishes
20.
Math Biosci Eng ; 20(5): 8766-8781, 2023 Mar 07.
Article in English | MEDLINE | ID: mdl-37161221

ABSTRACT

As college students are among the most active user groups across social media, effective sentiment analysis of college public opinion remains significant. Capturing the direction of public opinion in the student community in a timely manner and guiding students to develop sound values can aid the ideological management of universities. Recurrent neural networks have been the mainstream technology for sentiment analysis. Nevertheless, existing research emphasizes semantic characteristics in the vertical direction while failing to capture semantic characteristics in the horizontal direction; in other words, sentiment analysis models need more balance. To remedy this gap, this paper presents a novel sentiment analysis method based on multi-scale deep learning for college public opinion. To capture bidirectional semantic characteristics, a typical sequential neural network with two propagation paths is selected as the backbone and extended with more layers in the horizontal direction. This design balances both model depth and model breadth. Finally, experiments on a real-world social media dataset confirm the efficiency of the proposed analysis model.
