Results 1 - 20 of 52
1.
IEEE Sens J ; 23(2): 955-968, 2023 Jan.
Article in English | MEDLINE | ID: mdl-36913217

ABSTRACT

Recently, unmanned aerial vehicles (UAVs) have been deployed in the Novel Coronavirus Disease 2019 (COVID-19) vaccine distribution process. To address the issues of fake vaccine distribution and real-time monitoring and control of massive UAV fleets at nodal centers (NCs), the authors propose SanJeeVni, a blockchain (BC)-assisted UAV vaccine distribution scheme operating over sixth-generation enhanced ultra-reliable low-latency communication (6G-eRLLC). The scheme considers user registration, vaccine requests, and distribution through a public Solana BC setup, which assures a scalable transaction rate. Based on vaccine requests at production setups, UAV swarms are triggered to deliver vaccines to NCs. An intelligent edge offloading scheme is proposed to support UAV coordinate and routing path setups. The scheme is compared against fifth-generation (5G) uRLLC communication. In simulation, we achieve an 86% improvement in service latency, a 12.2% reduction in UAV energy consumption, and 76.25% more UAV coverage with 6G-eRLLC, as well as a significant improvement of [Formula: see text]% in storage cost against the Ethereum network, which indicates the scheme's efficacy in practical setups.

2.
Sensors (Basel) ; 23(14)2023 Jul 15.
Article in English | MEDLINE | ID: mdl-37514719

ABSTRACT

With the development of the Internet of Things (IoT), the number of devices will increase tremendously, which in turn demands more wireless communication resources. It has been shown in the literature that non-orthogonal multiple access (NOMA) offers high multiplexing gains due to the simultaneous transfer of signals, while massive multiple-input-multiple-output (mMIMO) offers high spectrum efficiency due to its high antenna gain and multiplexing gains. Therefore, a downlink mMIMO NOMA cooperative system is considered in this paper. Users at the cell edge in a 5G cellular system generally suffer from poor signal quality, as they are far away from the base station (BS) and expend high battery power to decode the signals superimposed through NOMA. Thus, this paper uses a cooperative relay system and proposes an mMIMO NOMA double-mode model to reduce battery expenditure and increase the cell-edge user's energy efficiency and sum rate. In the mMIMO NOMA double-mode model, two modes of operation are defined; depending on the relay's battery level, these modes are chosen to maximize the system's energy efficiency. Comprehensive numerical results show the improvement in the proposed system's average sum rate and average energy efficiency compared with a conventional system. In a cooperative NOMA system, the BS transmits a signal to a relay, and the relay forwards the signal to a cluster of users. This cluster formation depends on the user positions and geographical restrictions concerning the relay equipment. Therefore, it is vital to form user clusters for efficient and simultaneous transmission, and this paper also presents a novel method for efficient cluster formation.
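As a concrete illustration of the superposition-coding idea behind NOMA described above, the following is a minimal sketch (not taken from the paper) of the two-user downlink sum rate, assuming Rayleigh fading and an illustrative power split that favours the cell-edge user; the power fractions, channel statistics, and noise level are assumptions.

```python
import numpy as np

# Hedged sketch: two-user downlink NOMA sum rate with superposition coding.
# Power fractions, channel gains, and noise power are illustrative assumptions.
rng = np.random.default_rng(0)

P = 1.0                   # total BS transmit power (normalized)
N0 = 1e-2                 # noise power
a_far, a_near = 0.8, 0.2  # more power allocated to the weak (cell-edge) user

# Rayleigh-fading channel gains; the far user has higher path loss on average
h_near = np.abs(rng.normal(0, 1, 10_000) + 1j * rng.normal(0, 1, 10_000)) ** 2
h_far = 0.1 * np.abs(rng.normal(0, 1, 10_000) + 1j * rng.normal(0, 1, 10_000)) ** 2

# Far user decodes its own signal, treating the near user's signal as interference
R_far = np.log2(1 + a_far * P * h_far / (a_near * P * h_far + N0))
# Near user removes the far user's signal via SIC, then decodes its own signal
R_near = np.log2(1 + a_near * P * h_near / N0)

print(f"average sum rate: {np.mean(R_far + R_near):.2f} bit/s/Hz")
```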

3.
Sensors (Basel) ; 23(21)2023 Nov 02.
Article in English | MEDLINE | ID: mdl-37960626

ABSTRACT

The Internet of Things (IoT) is among the most widely deployed technologies in manufacturing, automation, transportation, robotics, and agriculture, owing to its sensing capability. It plays a vital role in digital transformation and smart revolutions in critical infrastructure environments. However, handling heterogeneous data from different IoT devices is challenging from the perspective of security and privacy. An attacker can target the sensor communication between two IoT devices to jeopardize the regular operation of IoT-based critical infrastructure. In this paper, we propose an artificial intelligence (AI) and blockchain-driven secure data dissemination architecture to deal with critical infrastructure security and privacy issues. First, we reduce dimensionality using principal component analysis (PCA) and explainable AI (XAI) approaches. Then, we apply different AI classifiers, such as random forest (RF), decision tree (DT), support vector machine (SVM), perceptron, and Gaussian Naive Bayes (GaussianNB), to classify the data as malicious or non-malicious. Furthermore, we employ an interplanetary file system (IPFS)-driven blockchain network that secures the non-malicious data. In addition, to strengthen the security of the AI classifiers, we analyze data poisoning attacks on the dataset, which manipulate sensitive data and mislead the classifier into producing inaccurate results. To overcome this issue, we provide an anomaly detection approach that identifies malicious instances and removes the poisoned data from the dataset. The proposed architecture is evaluated using performance metrics such as accuracy, precision, recall, F1 score, and the receiver operating characteristic (ROC) curve. The findings show that the RF classifier outperforms the other AI classifiers in terms of accuracy, i.e., 98.46%.
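A minimal sketch of the classification stage described above, assuming scikit-learn and a synthetic stand-in for the IoT traffic dataset: PCA reduces dimensionality before each of the listed classifiers is trained and scored. The dataset, the 95% variance target, and the default hyperparameters are assumptions for illustration, not the paper's configuration.

```python
# Sketch: PCA-based dimensionality reduction followed by the five classifiers
# named in the abstract, each scored on held-out synthetic data.
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.svm import SVC
from sklearn.linear_model import Perceptron
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# synthetic stand-in for labelled IoT sensor-traffic features (malicious vs. not)
X, y = make_classification(n_samples=2000, n_features=40, n_informative=10,
                           random_state=42)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=42)

for name, clf in [("RF", RandomForestClassifier()), ("DT", DecisionTreeClassifier()),
                  ("SVM", SVC()), ("Perceptron", Perceptron()),
                  ("GaussianNB", GaussianNB())]:
    model = make_pipeline(StandardScaler(), PCA(n_components=0.95), clf)
    model.fit(X_tr, y_tr)
    print(name, accuracy_score(y_te, model.predict(X_te)))
```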

4.
Sensors (Basel) ; 23(2)2023 Jan 14.
Article in English | MEDLINE | ID: mdl-36679767

ABSTRACT

Mobile applications have grown rapidly over the past few decades to offer futuristic services such as autonomous vehicles, smart farming, and smart cities. Such applications require ubiquitous, real-time, and secure communications to deliver services quickly. Toward this aim, sixth-generation (6G) wireless technology offers superior performance with high reliability, enhanced transmission rates, and low latency. However, managing the resources of the aforementioned applications is highly complex in such precarious networks. An adversary can perform various network-related attacks (i.e., data injection or modification) to jeopardize the regular operation of smart applications. Therefore, incorporating blockchain technology into smart applications can be a prominent solution to tackle security, reliability, and data-sharing privacy concerns. Motivated by this, we present a case study on public safety applications that utilizes the essential characteristics of artificial intelligence (AI), blockchain, and a 6G network to handle data integrity attacks on crime data. The case study is assessed using various performance parameters, including blockchain scalability, packet drop ratio, and training accuracy. Lastly, we explore different research challenges of adopting blockchain in the 6G wireless network.
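To make the data-integrity idea concrete, here is a hedged, stand-alone sketch of hash chaining, the basic mechanism a blockchain uses to make injection or modification of stored records detectable; it is not the paper's implementation, and the record fields are hypothetical.

```python
import hashlib
import json

# Minimal sketch: chain record hashes so any later modification of a stored
# crime record breaks verification of every subsequent block.
def record_hash(record: dict, prev_hash: str) -> str:
    payload = json.dumps(record, sort_keys=True) + prev_hash
    return hashlib.sha256(payload.encode()).hexdigest()

chain = []
prev = "0" * 64
for rec in [{"id": 1, "event": "report filed"}, {"id": 2, "event": "evidence logged"}]:
    prev = record_hash(rec, prev)
    chain.append({"record": rec, "hash": prev})

def verify(chain) -> bool:
    # recompute the chain from the genesis hash and compare stored hashes
    prev = "0" * 64
    for block in chain:
        if record_hash(block["record"], prev) != block["hash"]:
            return False
        prev = block["hash"]
    return True

print(verify(chain))                         # True: chain is intact
chain[0]["record"]["event"] = "tampered"
print(verify(chain))                         # False: integrity violation detected
```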


Subject(s)
Artificial Intelligence, Blockchain, Reproducibility of Results, Intelligence, Agriculture, Computer Security
5.
Sensors (Basel) ; 23(4)2023 Feb 10.
Article in English | MEDLINE | ID: mdl-36850606

ABSTRACT

A cognitive radio network (CRN) is an intelligent network that can detect unoccupied spectrum space without interfering with the primary user (PU). Spectrum scarcity arises from static channel allocation, which the CRN addresses. Spectrum handoff (SHO) management is a critical problem that must be addressed in the CRN to ensure uninterrupted connectivity and profitable use of unallocated spectrum space for secondary users (SUs). SHO has some disadvantages, i.e., communication delay and power consumption; to overcome these drawbacks, reducing the number of handoffs should be a priority. This study proposes the use of dynamic spectrum access (DSA) to check for channels available to the SU during handoff, using a machine-learning-based metaheuristic algorithm. The simulation results show that the proposed support vector machine-based red deer algorithm (SVM-RDA) is resilient and has low complexity. The experimental setup evaluates the suggested algorithm in terms of the number of handoffs, unsuccessful handoffs, handoff delay, throughput, signal-to-noise ratio (SNR), SU bandwidth, and total spectrum bandwidth. The study shows improved system performance during SHO: the proposed technique anticipates handoff delay and minimizes the number of handoffs. The results show that the recommended method makes better predictions with fewer handoffs than the three other methods considered.
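The sketch below illustrates only the machine-learning half of the approach: an SVM trained to predict whether a candidate channel will be available to the secondary user. The features (SNR, PU occupancy ratio, idle time) and the labeling rule are synthetic assumptions, and the red deer metaheuristic used in the paper for optimization is not reproduced.

```python
# Sketch of the learning component only: an SVM predicting channel availability
# for the secondary user from synthetic channel observations.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(1)
n = 3000
snr = rng.normal(10, 5, n)          # observed SNR on the channel (dB)
pu_duty = rng.uniform(0, 1, n)      # recent primary-user occupancy ratio
idle_time = rng.exponential(5, n)   # time since last PU transmission
X = np.column_stack([snr, pu_duty, idle_time])
# assumed label rule: channel usable when PU activity is low and SNR adequate
y = ((pu_duty < 0.4) & (snr > 5)).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=1)
clf = SVC(kernel="rbf", C=1.0).fit(X_tr, y_tr)
print(classification_report(y_te, clf.predict(X_te)))
```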

6.
Sensors (Basel) ; 23(16)2023 Aug 21.
Article in English | MEDLINE | ID: mdl-37631830

ABSTRACT

Hybrid beamforming is a viable method for lowering the complexity and expense of massive multiple-input multiple-output systems while achieving high data rates on par with digital beamforming. To this end, the purpose of the research reported in this paper is to assess the effectiveness of the three architectural beamforming techniques (analog, digital, and hybrid beamforming) in massive multiple-input multiple-output systems, with particular focus on hybrid beamforming. In hybrid beamforming, groups of antennas share a small number of radio frequency chains, unlike digital beamforming, where each antenna has its own radio frequency chain. Beam formation toward a particular angle depends on the channel state information. Further, massive multiple-input multiple-output is discussed in detail along with performance parameters such as bit error rate, signal-to-noise ratio, achievable sum rate, power consumption, and energy efficiency. Finally, the three beamforming techniques are compared.
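As a small worked example of the analog (phase-only) weighting that a single-RF-chain front end applies, the following sketch computes the array gain of a uniform linear array steered toward a target angle; the array size, element spacing, and angles are illustrative assumptions rather than values from the paper.

```python
# Hedged sketch: analog beamforming gain of a uniform linear array with
# phase-only weights steered toward a target direction.
import numpy as np

def steering_vector(n_antennas: int, angle_deg: float, spacing: float = 0.5):
    k = np.arange(n_antennas)
    return np.exp(1j * 2 * np.pi * spacing * k * np.sin(np.deg2rad(angle_deg)))

N = 64                                          # antennas at the base station
target = 20.0                                   # desired user direction (degrees)
w = steering_vector(N, target) / np.sqrt(N)     # phase-only analog weights

for angle in [0.0, 10.0, 20.0, 30.0]:
    a = steering_vector(N, angle)
    gain_db = 20 * np.log10(np.abs(np.conj(w) @ a))
    print(f"angle {angle:5.1f} deg -> array gain {gain_db:6.1f} dB")
```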

7.
Sensors (Basel) ; 23(5)2023 Mar 01.
Article in English | MEDLINE | ID: mdl-36904915

ABSTRACT

Topic modeling is a statistical, unsupervised machine learning technique for mapping a high-dimensional corpus to a low-dimensional topical subspace, but there is room for improvement. A topic model's topics are expected to be interpretable as concepts, i.e., to correspond to human understanding of a topic occurring in texts. While discovering corpus themes, inference constantly draws on the vocabulary, whose size affects topic quality, and the corpus contains many inflectional forms. Since words that frequently appear in the same sentence are likely to share a latent topic, practically all topic models rely on co-occurrence signals between terms in the corpus. Topics become weaker in languages with extensive inflectional morphology because of the abundance of distinct tokens; lemmatization is often used to preempt this problem. Gujarati is a morphologically rich language, as a word may have several inflectional forms. This paper proposes a deterministic finite automaton (DFA)-based lemmatization technique for the Gujarati language that transforms inflected word forms into their root words. The set of topics is then inferred from this lemmatized corpus of Gujarati text. We employ statistical divergence measurements to identify semantically less coherent (overly general) topics. The results show that the lemmatized Gujarati corpus yields more interpretable and meaningful topics than unlemmatized text. Finally, the results show that lemmatization reduces the vocabulary size by 16% and improves semantic coherence on all three measurements, Log Conditional Probability, Pointwise Mutual Information, and Normalized Pointwise Mutual Information, from -9.39 to -7.49, -6.79 to -5.18, and -0.23 to -0.17, respectively.
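The following sketch illustrates one of the coherence measures named above, Normalized Pointwise Mutual Information, computed from document co-occurrence counts; the toy English corpus and topic words are assumptions purely for illustration, whereas the paper applies such measures to the lemmatized Gujarati corpus.

```python
# Sketch: topic coherence via Normalized Pointwise Mutual Information (NPMI)
# over document co-occurrence counts for a tiny toy corpus.
import math
from itertools import combinations

docs = [{"farm", "crop", "rain"}, {"farm", "crop"}, {"rain", "river"},
        {"crop", "rain", "river"}]
topic_words = ["farm", "crop", "rain"]   # top words of one hypothetical topic
eps, n = 1e-12, len(docs)

def p(*words):
    # document-level probability of all given words co-occurring
    return sum(all(w in d for w in words) for d in docs) / n

npmi_scores = []
for wi, wj in combinations(topic_words, 2):
    pij = p(wi, wj) + eps
    pmi = math.log(pij / (p(wi) * p(wj) + eps))
    npmi_scores.append(pmi / (-math.log(pij)))

print(f"topic NPMI coherence: {sum(npmi_scores) / len(npmi_scores):.3f}")
```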

8.
Sensors (Basel) ; 23(9)2023 Apr 22.
Article in English | MEDLINE | ID: mdl-37177403

ABSTRACT

The aim of the peer-to-peer (P2P) decentralized gaming industry has shifted towards realistic gaming environment (GE) support for game players (GPs). Recent innovations in the metaverse have motivated the gaming industry to look beyond augmented reality and virtual reality engines, which improve the realism of virtual game worlds. In gaming metaverses (GMs), GPs can play, socialize, and trade virtual objects in the GE. On game servers (GSs), the collected GM data are analyzed by artificial intelligence models to personalize the GE for each GP. However, communication with GSs suffers from high latency, bandwidth concerns, and issues regarding the security and privacy of GP data, which pose a severe threat to the emerging GM landscape. Thus, we propose a scheme, Game-o-Meta, that integrates federated learning into the GE, with GP data being trained on local devices only. We envision the GE over a sixth-generation tactile internet service to address the bandwidth and latency issues and assure real-time haptic control. In the GM, the GP's game tasks are collected and used to train a model on the GS; the GP then downloads the pre-trained model and trains it further using local data. The proposed scheme is compared against traditional schemes based on parameters such as GP task offloading, GP avatar rendering latency, and GS availability. The results indicate the viability of the proposed scheme.
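A minimal federated-averaging sketch of the idea described above: each player's device runs local training and only model weights are aggregated on the game server. The linear model, client data, and number of rounds are illustrative assumptions, not the paper's setup.

```python
# Sketch of federated averaging (FedAvg): clients train locally, the server
# only averages model weights, and raw local data never leaves the device.
import numpy as np

def local_update(weights, X, y, lr=0.1, epochs=5):
    w = weights.copy()
    for _ in range(epochs):                      # plain linear-regression gradient steps
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

rng = np.random.default_rng(0)
true_w = np.array([1.5, -2.0])
clients = []
for _ in range(5):                               # five game players' devices
    X = rng.normal(size=(50, 2))
    y = X @ true_w + rng.normal(scale=0.1, size=50)
    clients.append((X, y))

global_w = np.zeros(2)
for rnd in range(10):                            # communication rounds
    local_ws = [local_update(global_w, X, y) for X, y in clients]
    global_w = np.mean(local_ws, axis=0)         # server averages the weights

print("global model after FedAvg:", np.round(global_w, 3))
```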

9.
Sensors (Basel) ; 23(15)2023 Jul 27.
Article in English | MEDLINE | ID: mdl-37571505

ABSTRACT

With the onset of 5G technology, the number of users is increasing drastically, and these users demand better service from the network. This study examines the working frequencies of the millimeter-wave bands. Working in the millimeter-wave band has the disadvantage of interference. This study therefore analyzes the impact of different interference conditions on unmanned aerial vehicle use scenarios, such as open-air gatherings and indoor-outdoor sports stadiums. Performance analysis was carried out in terms of received power and path-loss readings.
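As a rough illustration of the kind of link-budget arithmetic behind received-power and path-loss readings, the following sketch evaluates free-space path loss at candidate millimeter-wave carriers; the frequencies, distances, transmit power, and antenna gains are assumptions, and real deployments would add interference and blockage terms.

```python
# Sketch: free-space path loss and received power at mmWave carriers.
import math

def fspl_db(distance_m: float, freq_hz: float) -> float:
    c = 3e8
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / c)

tx_power_dbm, tx_gain_dbi, rx_gain_dbi = 30.0, 20.0, 10.0   # assumed link budget
for f_ghz in (28, 60):                    # candidate mmWave carriers
    for d in (50, 200, 500):              # link distances in metres
        pl = fspl_db(d, f_ghz * 1e9)
        prx = tx_power_dbm + tx_gain_dbi + rx_gain_dbi - pl
        print(f"{f_ghz} GHz, {d:4d} m: path loss {pl:6.1f} dB, Prx {prx:6.1f} dBm")
```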

10.
BMC Gastroenterol ; 22(1): 118, 2022 Mar 10.
Article in English | MEDLINE | ID: mdl-35272611

ABSTRACT

BACKGROUND: The natural history and incidence of hepatocellular carcinoma (HCC) arising from indeterminate liver lesions are not well described. We aimed to define the incidence of HCC in a cohort of patients undergoing surveillance by magnetic resonance imaging (MRI) and to estimate any associations with incident HCC. METHODS: We performed a retrospective follow-up study, identifying MRI scans in which indeterminate lesions had been reported between January 2006 and January 2017. Subsequent MRI scan reports were reviewed for incident HCC arising from indeterminate lesions, data were extracted from electronic patient records, and survival analysis was performed to estimate associations with baseline factors. RESULTS: One hundred and nine patients with indeterminate lesions on MRI were identified. HCC developed in 19 (17%) patients over a mean follow-up of 4.6 years. Univariate Cox proportional hazards analysis found incident HCC to be significantly associated with baseline low platelet count (hazard ratio (HR) = 7.3 (95% confidence interval (CI) 2.1-24.9)), high serum alpha-fetoprotein level (HR = 2.7 (95% CI 1.0-7.1)), and alcohol consumption above fourteen units weekly (HR = 3.1 (95% CI 1.1-8.7)). Multivariate analysis, however, found that only low platelet count was independently associated with HCC (HR = 5.5 (95% CI 0.6-5.1)). CONCLUSIONS: HCC arises in approximately one fifth of indeterminate liver lesions over 4.6 years and is associated with a low platelet count at the time of first diagnosis of an indeterminate lesion. Incidence of HCC was more common in people with viral hepatitis and in those consuming > 14 units of alcohol per week. Our data may be used to support a strategy of enhanced surveillance in patients with indeterminate lesions.
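For readers unfamiliar with the survival analysis used here, the following is an illustrative Cox proportional-hazards fit on synthetic data (the study cohort is not public), using the lifelines Python package; the covariate names and effect sizes are assumptions for demonstration only.

```python
# Sketch: Cox proportional-hazards model relating baseline factors to incident
# HCC, fitted on a synthetic cohort with the lifelines package.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(7)
n = 109
df = pd.DataFrame({
    "low_platelets": rng.integers(0, 2, n),
    "high_afp": rng.integers(0, 2, n),
    "alcohol_gt_14_units": rng.integers(0, 2, n),
})
# synthetic follow-up times and events loosely shaped by the covariates
hazard = 0.03 * np.exp(1.5 * df["low_platelets"] + 0.7 * df["high_afp"])
df["years"] = rng.exponential(1 / hazard).clip(max=10)
df["hcc"] = (df["years"] < 10).astype(int)   # event observed before censoring

cph = CoxPHFitter()
cph.fit(df, duration_col="years", event_col="hcc")
cph.print_summary()   # hazard ratios and confidence intervals per covariate
```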


Subject(s)
Hepatocellular Carcinoma, Liver Neoplasms, Hepatocellular Carcinoma/complications, Follow-Up Studies, Humans, Liver Neoplasms/complications, Magnetic Resonance Imaging/methods, Retrospective Studies
11.
Sensors (Basel) ; 22(11)2022 May 26.
Article in English | MEDLINE | ID: mdl-35684668

ABSTRACT

Integrating information and communication technology (ICT) with energy grid infrastructure introduces smart grids (SGs), which simplify energy generation, transmission, and distribution. The ICT is embedded in selected parts of the grid network, which partially deploys the SG and raises various issues such as energy losses, either technical or non-technical (i.e., energy theft). Therefore, energy theft detection plays a crucial role in reducing the energy generation burden on the SG and meeting consumer demand for energy. Motivated by these facts, in this paper we propose a deep learning (DL)-based energy theft detection scheme, referred to as GrAb, which uses a data-driven analytics approach. GrAb uses a DL-based long short-term memory (LSTM) model to predict energy consumption from smart meter data. Then, a threshold calculator is used to compute a threshold value for the energy consumption. Both the predicted energy consumption and the threshold value are passed to a support vector machine (SVM)-based classifier to categorize the energy losses into technical, non-technical (energy theft), and normal consumption. The proposed data-driven theft detection scheme identifies various forms of energy theft (e.g., smart meter data manipulation or clandestine connections). Experimental results show that the proposed scheme (GrAb) identifies energy theft more accurately than state-of-the-art approaches.
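A compact sketch of the pipeline stages described above, assuming Keras and scikit-learn: an LSTM forecasts consumption from smart-meter history, deviations from the forecast are compared with a threshold, and an SVM classifies the resulting pattern. The synthetic meter series, the injected theft window, and the threshold value are assumptions for illustration, not the paper's data or parameters.

```python
import numpy as np
from tensorflow import keras
from sklearn.svm import SVC

rng = np.random.default_rng(3)
# synthetic hourly smart-meter readings with a daily cycle
series = 5 + np.sin(np.arange(600) * 2 * np.pi / 24) + rng.normal(0, 0.2, 600)

# windowed supervised pairs: 24 past readings -> next reading
X = np.array([series[i:i + 24] for i in range(len(series) - 24)])[..., None]
y = series[24:]

model = keras.Sequential([keras.layers.Input((24, 1)),
                          keras.layers.LSTM(32),
                          keras.layers.Dense(1)])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=10, verbose=0)
pred = model.predict(X, verbose=0).ravel()

# inject a synthetic theft window where the meter under-reports actual usage
observed = y.copy()
observed[400:450] *= 0.3
deviation = observed - pred

threshold = 1.0                                  # assumed deviation threshold
labels = np.where(deviation < -threshold, 2,     # 2: suspected theft
         np.where(np.abs(deviation) > threshold, 1, 0))  # 1: other loss, 0: normal
clf = SVC().fit(np.column_stack([pred, observed, deviation]), labels)
print({int(c): int((labels == c).sum()) for c in np.unique(labels)})
```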


Subject(s)
Deep Learning, Computer Communication Networks, Physical Phenomena, Support Vector Machine, Theft
12.
Sensors (Basel) ; 22(11)2022 May 31.
Article in English | MEDLINE | ID: mdl-35684802

ABSTRACT

The emerging need for high data rates, low latency, and high network capacity encourages wireless networks (WNs) to build intelligent and dynamic services, such as intelligent transportation systems, smart homes, smart cities, industrial automation, etc. However, WNs are impeded by several security threats, such as data manipulation, denial-of-service, injection, man-in-the-middle, and session hijacking attacks, which deteriorate the security performance of the aforementioned WN-based intelligent services. Toward this goal, various security solutions, such as cryptography, artificial intelligence (AI), access control, and authentication, have been proposed by the scientific community around the world; however, they fall short of fully tackling the aforementioned security issues. This necessitates a technology, i.e., blockchain, that offers decentralization, immutability, transparency, and security to protect the WN from security threats. Motivated by these facts, this paper presents a survey of WNs in the context of security and privacy issues with blockchain-based solutions. First, we analyze the existing research and highlight security requirements, security issues in different generations of WNs (4G, 5G, and 6G), and a comparative analysis of existing security solutions. Then, we showcase the influence of blockchain technology and prepare an exhaustive taxonomy of blockchain-enabled security solutions in WNs. Further, we also propose a blockchain- and 6G-based WN architecture to highlight the importance of blockchain technology in WNs. Moreover, the proposed architecture is evaluated against different performance metrics, such as scalability, packet loss ratio, and latency. Finally, we discuss various open issues and research challenges for blockchain-based WN solutions.


Subject(s)
Blockchain, Artificial Intelligence, Humans, Motivation, Privacy, Technology
13.
Sensors (Basel) ; 22(13)2022 Jun 26.
Article in English | MEDLINE | ID: mdl-35808325

ABSTRACT

In the Smart Grid (SG), Transactive Energy Management (TEM) is one of the most promising approaches to boost consumer participation in energy generation and energy management and to establish decentralized energy market models using Peer-to-Peer (P2P) trading. In P2P, a prosumer produces electric energy on their premises using Renewable Energy Resources (RES) such as solar energy, wind energy, etc. This generated energy is then traded with consumers (who need the energy) in a nearby locality. P2P facilitates energy exchange in the localized micro-energy markets of the TEM system. Such decentralized P2P energy management could cater to diverse prosumer and utility business models. However, existing P2P approaches suffer from several issues, such as single point of failure, network bandwidth, scalability, trust, and security. To handle the aforementioned issues, this paper proposes a Decentralized and Transparent P2P Energy Trading (DT-P2PET) scheme using blockchain. The proposed DT-P2PET scheme aims to reduce the grid's energy generation and management burden while also increasing profit for both consumers and prosumers through a dynamic pricing mechanism. The DT-P2PET scheme uses Ethereum-blockchain-based Smart Contracts (SCs) and the InterPlanetary File System (IPFS) for P2P energy trading. Furthermore, a recommender mechanism is also introduced in this study to increase the number of prosumers. The Ethereum SCs are designed and deployed to perform P2P trading in real time in the proposed DT-P2PET scheme. The DT-P2PET scheme is evaluated on various parameters, such as profit generation (for both prosumers and consumers), data storage cost, network bandwidth, and data transfer rate, in comparison with existing approaches.
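This is not the paper's Solidity contract; it is a plain-Python sketch of the matching and dynamic-pricing idea in a P2P energy market, where the clearing price moves between an assumed feed-in floor and retail ceiling according to supply and demand. All prices, peers, and offers are hypothetical.

```python
# Sketch: dynamic clearing price plus naive seller-to-buyer matching.
from dataclasses import dataclass

@dataclass
class Offer:
    peer: str
    kwh: float

FEED_IN, RETAIL = 0.05, 0.20      # assumed price floor and ceiling (per kWh)

def clearing_price(total_supply: float, total_demand: float) -> float:
    if total_supply <= 0:
        return RETAIL
    ratio = min(total_demand / total_supply, 1.0)
    return FEED_IN + ratio * (RETAIL - FEED_IN)   # scarcer supply -> higher price

sellers = [Offer("prosumer_A", 6.0), Offer("prosumer_B", 4.0)]
buyers = [Offer("consumer_X", 5.0), Offer("consumer_Y", 3.0)]
price = clearing_price(sum(s.kwh for s in sellers), sum(b.kwh for b in buyers))
print(f"clearing price: {price:.3f} per kWh")

# naive matching: fill buyer requests from sellers in order (demand <= supply here)
supply = iter(sellers)
cur = next(supply)
for b in buyers:
    need = b.kwh
    while need > 0:
        take = min(need, cur.kwh)
        cur.kwh -= take
        need -= take
        print(f"{b.peer} buys {take:.1f} kWh from {cur.peer} at {price:.3f}")
        if cur.kwh == 0 and need > 0:
            cur = next(supply)
```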


Subject(s)
Blockchain, Commerce, Computer Systems, Information Storage and Retrieval
14.
Sensors (Basel) ; 23(1)2022 Dec 24.
Article in English | MEDLINE | ID: mdl-36616797

ABSTRACT

With the rapid growth of data and processing in the cloud, accessing those data has become easier; on the other hand, this poses many technical and security challenges to the users of these provisions. Fog computing makes these technical issues manageable to some extent. Fog computing is one of the promising solutions for handling the big data produced by the IoT, which are often security-critical and time-sensitive. Massive IoT data analytics over a fog computing structure is emerging and requires extensive research to enable more proficient knowledge extraction and smart decisions. Although big data analytics is advancing, it does not yet consider fog data analytics. There remain many challenges, including heterogeneity, security, accessibility, resource sharing, network communication overhead, real-time processing of complex data, etc. This paper explores various research challenges and their solutions using next-generation fog data analytics and IoT networks. We also performed an experimental analysis based on fog computing and cloud architectures. The results show that fog computing outperforms the cloud in terms of network utilization and latency. Finally, the paper concludes with future trends.

15.
Multimed Syst ; 28(4): 1189-1222, 2022.
Article in English | MEDLINE | ID: mdl-34276140

ABSTRACT

The COVID-19 pandemic has spread rapidly across the globe, infecting millions of people and claiming hundreds of thousands of lives. Over the years, the role of artificial intelligence (AI) has been on the rise as its algorithms become more and more accurate, and it is thought that its role in strengthening the existing healthcare system will be the most profound. Moreover, the pandemic brought an opportunity to showcase the potential of integrating AI with healthcare, as the current infrastructure worldwide is overwhelmed and crumbling. Due to AI's flexibility and adaptability, it can be used as a tool to tackle COVID-19. Motivated by these facts, in this paper we survey how AI techniques can handle the COVID-19 pandemic and present the merits and demerits of these techniques. This paper presents a comprehensive end-to-end review of all the AI techniques that can be used to tackle all areas of the pandemic. Further, we systematically discuss the issues of COVID-19 and, based on the literature review, suggest potential countermeasures using AI techniques. In the end, we analyze various open research issues and challenges associated with integrating AI techniques into the COVID-19 response.

16.
Medicina (Kaunas) ; 58(2)2022 Feb 18.
Article in English | MEDLINE | ID: mdl-35208634

ABSTRACT

A coronavirus outbreak caused by a novel virus known as SARS-CoV-2 originated in the latter half of 2019. COVID-19's abrupt emergence and unchecked global expansion highlight the inability of current healthcare services to respond to public health emergencies promptly. This paper comprehensively reviews the different aspects of human life affected by COVID-19. It then discusses various tools and technologies from the leading domains and their integration into people's lives to overcome issues resulting from pandemics. This paper further provides a detailed review of existing and probable Artificial Intelligence (AI), Internet of Things (IoT), Augmented Reality (AR), Virtual Reality (VR), and Blockchain-based solutions. The COVID-19 pandemic brings several challenges from the viewpoint of a nation's healthcare, security, privacy, and economy. AI offers different predictive services and intelligent strategies for detecting coronavirus signs, promoting drug development, enabling remote healthcare, detecting fake news, and identifying security attacks. The incorporation of AI in the COVID-19 outbreak brings robust and reliable solutions to enhance healthcare systems, increases users' life expectancy, and boosts the nation's economy. Furthermore, AR/VR helps in distance learning, factory automation, and setting up work-from-home environments. Blockchain helps in protecting consumers' privacy and securing medical supply chain operations. IoT is helpful in remote patient monitoring, remote sanitizing via drones, managing social distancing (using IoT cameras), and much more in combating the pandemic. This study covers an up-to-date analysis of the use of blockchain technology, AI, AR/VR, and IoT for combating the COVID-19 pandemic across various applications. These technologies provide new emerging initiatives and use cases to deal with the COVID-19 pandemic. Finally, we discuss challenges and potential research paths that will promote further research into future pandemic outbreaks.


Subject(s)
COVID-19, Pandemics, Artificial Intelligence, Humans, SARS-CoV-2, Technology
17.
Comput Electr Eng ; 103: 108352, 2022 Oct.
Article in English | MEDLINE | ID: mdl-36068837

ABSTRACT

The proliferating outbreak of COVID-19 raises global health concerns and has brought many countries to a standstill. Several restraint strategies, such as lockdowns and quarantines, have been imposed to suppress and flatten the mortality curve. Artificial Intelligence (AI) techniques could be a promising way to leverage these restraint strategies. However, real-time decision-making necessitates a cloud-oriented AI solution to control the pandemic. Though many cloud-oriented solutions exist, they have not been fully exploited for real-time data accessibility and high prediction accuracy. Motivated by these facts, this paper proposes a cloud-oriented AI-based scheme referred to as D-espy (i.e., Disease-espy) for disease detection and prevention. The proposed D-espy scheme performs a comparative analysis between Autoregressive Integrated Moving Average (ARIMA), vanilla Long Short-Term Memory (LSTM), and stacked LSTM techniques, which shows the dominance of the stacked LSTM in terms of prediction accuracy. Then, a Medical Resource Distribution (MRD) mechanism is proposed for the optimal distribution of medical resources. Next, a three-phase analysis of the COVID-19 spread is presented, which can benefit governing bodies in deciding on lockdown relaxation. Results show the efficacy of the D-espy scheme, with 96.2% prediction accuracy, compared with existing approaches.
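A sketch of the kind of model comparison described above on a synthetic case-count series, assuming statsmodels for ARIMA and Keras for the stacked LSTM; the model orders, layer sizes, training budget, and the toy epidemic curve are assumptions, so the resulting errors are illustrative only.

```python
# Sketch: ARIMA versus stacked-LSTM forecasting on a synthetic case-count series.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA
from tensorflow import keras
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(5)
t = np.arange(200)
cases = 50 * np.exp(0.02 * t) + rng.normal(0, 20, 200)   # toy epidemic curve
train, test = cases[:170], cases[170:]

# ARIMA baseline
arima = ARIMA(train, order=(2, 1, 2)).fit()
arima_fc = arima.forecast(steps=len(test))

# stacked LSTM: two recurrent layers over 14-step input windows
win = 14
X = np.array([cases[i:i + win] for i in range(170 - win)])[..., None]
y = cases[win:170]
lstm = keras.Sequential([keras.layers.Input((win, 1)),
                         keras.layers.LSTM(32, return_sequences=True),
                         keras.layers.LSTM(16),
                         keras.layers.Dense(1)])
lstm.compile(optimizer="adam", loss="mse")
lstm.fit(X, y, epochs=30, verbose=0)

# recursive multi-step forecast with the stacked LSTM
history = list(cases[170 - win:170])
lstm_fc = []
for _ in range(len(test)):
    nxt = lstm.predict(np.array(history[-win:])[None, :, None], verbose=0)[0, 0]
    lstm_fc.append(nxt)
    history.append(nxt)

print("ARIMA MAE:", mean_absolute_error(test, arima_fc))
print("stacked LSTM MAE:", mean_absolute_error(test, lstm_fc))
```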

18.
BMC Gastroenterol ; 21(1): 143, 2021 Mar 31.
Article in English | MEDLINE | ID: mdl-33789586

ABSTRACT

BACKGROUND AND AIMS: Alcohol use disorders (AUD) cause 7.2% of UK hospital admissions per year. Most are not managed by hepatologists, and liver disease may be missed. We used the Enhanced Liver Fibrosis (ELF) test to investigate the prevalence and associations of occult advanced liver fibrosis in AUD patients not known to have liver fibrosis. METHODS: Liver fibrosis was assessed using ELF in prospective patients referred to the Royal Free Hospital Alcohol Specialist Nurse (November 2018-December 2019). Known cases of liver disease were excluded. Patient demographics, blood tests, imaging data, and alcohol histories were recorded. Advanced fibrosis was categorised as ELF ≥ 10.5. RESULTS: The study included 99 patients (69% male, mean age 53.1 ± 14.4) with a median alcohol intake of 140 units/week (IQR 80.9-280) and a mean duration of harmful drinking of 15 years (IQR 10-27.5). The commonest reason for admission was symptomatic alcohol withdrawal (36%). The median ELF score was 9.62 (range 6.87-13.78). An ELF score ≥ 10.5 was recorded in 28/99 (29%) patients, of whom 28.6% had normal liver tests. Within the previous 5 years, 76% had attended A&E without assessment of liver disease. The ELF score was not associated with recent alcohol intake (p = 0.081) or inflammation (p = 0.574). CONCLUSION: Over a quarter of patients with AUD had previously undetected advanced liver fibrosis as assessed by ELF testing. ELF was not associated with liver inflammation or recent alcohol intake. The majority had recent missed opportunities for investigation of liver disease. We recommend clinicians use non-invasive tests to assess liver fibrosis in patients admitted to hospital with AUD.


Subject(s)
Alcoholism, Nurse Specialists, Adult, Aged, Biomarkers, Female, Humans, Liver/pathology, Liver Cirrhosis/diagnosis, Liver Cirrhosis/epidemiology, Liver Cirrhosis/pathology, Liver Function Tests, Male, Middle Aged, Prospective Studies
19.
BMC Gastroenterol ; 21(1): 268, 2021 Jun 28.
Article in English | MEDLINE | ID: mdl-34182924

ABSTRACT

BACKGROUND: Alcohol is the main cause of chronic liver disease. The Enhanced Liver Fibrosis (ELF) test is a serological biomarker for fibrosis staging in chronic liver disease; however, its utility in alcohol-related liver disease warrants further validation. We assessed the diagnostic and prognostic performance of ELF in alcohol-related liver disease. METHODS: Observational cohort study assessing paired ELF and histology from 786 tertiary care patients with chronic liver disease due to alcohol (n = 81) and non-alcohol aetiologies (n = 705). Prognostic data were available for 64 alcohol patients over a median of 6.4 years. Multiple ELF cut-offs were assessed to determine diagnostic utility for moderate fibrosis and cirrhosis. Survival data were assessed to determine the ability of ELF to predict liver-related events and all-cause mortality. RESULTS: ELF identified cirrhosis and moderate fibrosis in alcohol-related liver disease independently of aminotransferase levels, with areas under receiver operating characteristic curves of 0.895 (95% CI 0.823-0.968) and 0.923 (95% CI 0.866-0.981) respectively, which were non-inferior to non-alcohol aetiologies. The overall performance of ELF was assessed using the Obuchowski method: alcohol = 0.934 (95% CI 0.908-0.960); non-alcohol = 0.907 (95% CI 0.895-0.919). Using ELF < 9.8 to exclude and ≥ 10.5 to diagnose cirrhosis, 87.7% of alcohol cases could have avoided biopsy, with a sensitivity of 91% and specificity of 85%. A one-unit increase in ELF was associated with 2.6-fold (95% CI 1.55-4.31, p < 0.001) greater odds of cirrhosis at baseline and a 2.0-fold greater risk of a liver-related event within 6 years (95% CI 1.39-2.99, p < 0.001). CONCLUSIONS: ELF accurately stages liver fibrosis independently of transaminase elevations as a marker of inflammation and has superior prognostic performance to biopsy in alcohol-related liver disease.
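To make the dual cut-off logic concrete, the following sketch applies ELF < 9.8 to exclude and ELF ≥ 10.5 to diagnose cirrhosis on a small synthetic cohort and reports the resulting sensitivity, specificity, and proportion of biopsies avoided; the cohort size and score distributions are assumptions for demonstration, not the study data.

```python
# Sketch: dual ELF cut-offs applied to a synthetic cohort. Scores between the
# two cut-offs are indeterminate and would still require biopsy.
import numpy as np

rng = np.random.default_rng(11)
n = 500
cirrhosis = rng.integers(0, 2, n).astype(bool)
elf = np.where(cirrhosis, rng.normal(11.2, 0.9, n), rng.normal(9.0, 0.9, n))

diagnosed = elf >= 10.5          # rule-in cut-off
excluded = elf < 9.8             # rule-out cut-off
indeterminate = ~diagnosed & ~excluded

sens = (diagnosed & cirrhosis).sum() / cirrhosis.sum()
spec = (excluded & ~cirrhosis).sum() / (~cirrhosis).sum()
print(f"sensitivity {sens:.2f}, specificity {spec:.2f}, "
      f"biopsies avoided: {1 - indeterminate.mean():.1%}")
```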


Subject(s)
Liver Cirrhosis, Liver Diseases, Biomarkers, Biopsy, Cohort Studies, Humans, Liver/pathology, Liver Cirrhosis/diagnosis, Liver Cirrhosis/pathology, Liver Diseases/pathology, Liver Function Tests, Prognosis
20.
J Gastroenterol Hepatol ; 36(6): 1435-1449, 2021 Jun.
Article in English | MEDLINE | ID: mdl-33171534

ABSTRACT

BACKGROUND AND AIM: Mortality from alcohol-related liver disease (ArLD) is increasing, and liver fibrosis stage is the best mortality predictor. Non-invasive tests (NITs) are increasingly used to detect fibrosis, but their value as prognostic tests in chronic liver disease, and in particular in ArLD, is less well recognized. We aimed to describe the prognostic performance of four widely used NITs (Fibrosis-4 test [FIB4], Enhanced Liver Fibrosis [ELF] test, FibroScan, and FibroTest) in ArLD. METHODS: Applying systematic review methodology, we searched four databases from inception to May 2020. Inclusion/exclusion criteria were applied to searches using Medical Subject Heading terms and keywords. The first and second reviewers independently screened results, extracted data, and performed risk-of-bias assessment using the Quality in Prognosis Studies tool. RESULTS: Searches produced 25,088 articles. After initial screening, 1020 articles were reviewed independently by both reviewers. Eleven articles remained after screening for eligibility: one on ELF, four on FibroScan, four on FIB4, one on FIB4 + FibroScan, and one on FibroTest + FIB4. Areas under the receiver operating characteristic curve for outcome prediction ranged from 0.65 to 0.76 for FibroScan, 0.64 to 0.83 for FIB4, 0.69 to 0.79 for FibroTest, and 0.72 to 0.85 for ELF. Studies scored low-to-moderate risk of bias for most domains but high risk in the confounding and statistical reporting domains. The results were heterogeneous in outcomes and reporting, making pooling of data unfeasible. CONCLUSIONS: This systematic review returned 11 papers, six of which were conference abstracts and one an unpublished manuscript. While the heterogeneity of studies precluded direct comparisons of NITs, each NIT performed well in individual studies in predicting prognosis in ArLD (area under the receiver operating characteristic curve > 0.7 in each NIT category) and may add value to prognostication in clinical practice.


Subject(s)
Elasticity Imaging Techniques/methods, Liver Cirrhosis/diagnosis, Alcoholic Liver Diseases/diagnosis, Liver Function Tests/methods, Female, Humans, Liver Cirrhosis/etiology, Alcoholic Liver Diseases/complications, Male, Prognosis, ROC Curve