LLM-Twin: mini-giant model-driven beyond 5G digital twin networking framework with semantic secure communication and computation.
Hong, Yang; Wu, Jun; Morello, Rosario.
Affiliation
  • Hong Y; Graduate School of Information, Production and System, Waseda University, Fukuoka, 8080135, Japan.
  • Wu J; Graduate School of Information, Production and System, Waseda University, Fukuoka, 8080135, Japan. jun.wu@ieee.org.
  • Morello R; Department of Information Engineering, University "Mediterranea" of Reggio Calabria, Via Graziella, 89122, Reggio Calabria, Italy.
Sci Rep ; 14(1): 19065, 2024 Aug 17.
Article in En | MEDLINE | ID: mdl-39154033
ABSTRACT
Beyond 5G networks provide solutions for next-generation communications; in particular, digital twin networks (DTNs) have gained increasing popularity for bridging physical and digital space. However, current DTNs pose several challenges, especially when applied to scenarios that require efficient and multimodal data processing. First, current DTNs are limited in communication and computational efficiency, since they must transmit large amounts of raw data collected from physical sensors and ensure model synchronization through high-frequency computation. Second, current DTN models are domain-specific (e.g., E-health), making it difficult to handle DT scenarios with multimodal data processing requirements. Finally, current security schemes for DTNs introduce additional overheads that impair efficiency. To address these challenges, we propose a large language model (LLM)-empowered DTN framework, LLM-Twin. First, based on LLMs, we propose digital twin semantic networks (DTSNs), which enable more efficient communication and computation. Second, we design a mini-giant model collaboration scheme, which enables efficient deployment of LLMs in DTNs and is adapted to handle multimodal data. Then, we design a native security policy for LLM-Twin that does not compromise efficiency. Numerical experiments and case studies demonstrate the feasibility of LLM-Twin. To our knowledge, this is the first work to propose LLM-based semantic-level DTNs.
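The abstract describes the core data-flow idea of DTSNs only at a high level: the twin receives compact semantic summaries rather than raw sensor streams. The sketch below is a minimal, hypothetical illustration of that pattern, not the paper's method; the toy rule-based `semantic_encode` stands in for the LLM-based semantic encoder, and all names (`semantic_encode`, `twin_update`, the `pump-07` reading) are assumptions introduced here for illustration.

```python
# Minimal sketch of semantic-level transmission in a DTN.
# Assumption: a toy rule-based encoder replaces the paper's LLM-based
# semantic encoder; only the transmit-semantics-not-raw-data pattern is shown.
import json
import zlib


def semantic_encode(raw_readings: dict) -> bytes:
    """Map raw sensor data to a compact, task-relevant semantic summary."""
    summary = {
        "device": raw_readings["device_id"],
        "state": "overheat" if raw_readings["temp_c"] > 80 else "normal",
        "mean_vibration": round(
            sum(raw_readings["vibration"]) / len(raw_readings["vibration"]), 3
        ),
    }
    # Only the compressed summary crosses the network, not the raw stream.
    return zlib.compress(json.dumps(summary).encode())


def twin_update(payload: bytes) -> dict:
    """Digital-twin side: decode the semantic payload and update twin state."""
    return json.loads(zlib.decompress(payload))


if __name__ == "__main__":
    raw = {
        "device_id": "pump-07",  # hypothetical physical asset
        "temp_c": 91.2,
        "vibration": [0.11, 0.14, 0.09] * 100,  # stand-in for a raw sample stream
    }
    payload = semantic_encode(raw)
    print("raw size:", len(json.dumps(raw)), "bytes;",
          "semantic payload:", len(payload), "bytes")
    print("twin state:", twin_update(payload))
```

Running the script shows the intended effect: the transmitted semantic payload is a small fraction of the raw sample size, which is the efficiency argument the abstract makes for replacing raw-data synchronization with semantic communication.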

Full text: 1 Collection: 01-internacional Database: MEDLINE Language: En Journal: Sci Rep Year: 2024 Document type: Article Affiliation country: Japan Country of publication: United Kingdom