Results 1 - 20 of 27
1.
Sensors (Basel) ; 24(3)2024 Jan 25.
Article in English | MEDLINE | ID: mdl-38339519

ABSTRACT

Indoor localization of a mobile target is a prominent application of wireless sensor networks (WSNs), with significant practical value and scientific interest. Interference, obstacles, and energy consumption are critical challenges for indoor deployments, where battery replacement is costly. The proposed tracking system addresses latency, energy consumption, and accuracy, presenting an innovative solution for mobile localization. This paper introduces a novel self-localization algorithm for mobile targets built on a wake-up media access control (MAC) protocol. The tracking application is based on trilateration from received signal strength indication (RSSI) measurements. Simulations are implemented in the Objective Modular Network Testbed in C++ (OMNeT++) discrete-event simulator using the C++ programming language, and the RSSI values used are based on real indoor measurements. In addition, an approach for determining the optimal RSSI model parameters is applied to set the simulation parameters. Simulation results show a significant reduction in power consumption and high accuracy, with an average error of 1.91 m in 90% of cases. The method optimizes overall energy consumption, with only 2.69% of the energy consumed during the localization of 100 different positions.
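A minimal C++ sketch of the RSSI-based trilateration step described in this abstract, assuming a standard log-distance path-loss model; the reference RSSI, path-loss exponent, and anchor placement are illustrative assumptions, not values reported in the paper.

```cpp
// Hedged sketch: RSSI-to-distance conversion plus trilateration with three
// anchors. Path-loss parameters and anchor coordinates are assumptions.
#include <cmath>
#include <cstdio>

struct Anchor { double x, y, rssi; };

// Invert the log-distance model: RSSI(d) = rssi_d0 - 10*n*log10(d/d0).
double rssiToDistance(double rssi, double rssi_d0 = -40.0,
                      double n = 2.7, double d0 = 1.0) {
    return d0 * std::pow(10.0, (rssi_d0 - rssi) / (10.0 * n));
}

// Subtract the third circle equation from the first two to obtain a
// 2x2 linear system A*[x y]^T = b, then solve it with Cramer's rule.
bool trilaterate(const Anchor a[3], double& x, double& y) {
    double d[3];
    for (int i = 0; i < 3; ++i) d[i] = rssiToDistance(a[i].rssi);

    double A[2][2], b[2];
    for (int i = 0; i < 2; ++i) {
        A[i][0] = 2.0 * (a[2].x - a[i].x);
        A[i][1] = 2.0 * (a[2].y - a[i].y);
        b[i] = d[i] * d[i] - d[2] * d[2]
             + a[2].x * a[2].x - a[i].x * a[i].x
             + a[2].y * a[2].y - a[i].y * a[i].y;
    }
    double det = A[0][0] * A[1][1] - A[0][1] * A[1][0];
    if (std::fabs(det) < 1e-9) return false;  // collinear anchors
    x = (b[0] * A[1][1] - b[1] * A[0][1]) / det;
    y = (A[0][0] * b[1] - A[1][0] * b[0]) / det;
    return true;
}

int main() {
    Anchor anchors[3] = {{0, 0, -62.0}, {10, 0, -70.0}, {0, 10, -66.0}};
    double x, y;
    if (trilaterate(anchors, x, y))
        std::printf("estimated position: (%.2f, %.2f) m\n", x, y);
}
```

With three non-collinear anchors the two difference equations yield a unique estimate; a real deployment would average several RSSI samples before inverting the path-loss model.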

2.
Sensors (Basel) ; 23(19)2023 Sep 30.
Article in English | MEDLINE | ID: mdl-37837021

ABSTRACT

Wake-up receivers are gaining importance in power-aware wireless sensor networks, as they significantly reduce power consumption during RF reception and enable asynchronous communication with low latency. However, their performance still lags behind that of off-the-shelf RF transceivers, and there is a growing demand for higher sensitivity, enhanced reliability, and lower latency at the lowest possible power consumption. In this article, our goal is to advance the performance of wake-up receivers based on off-the-shelf components and low-frequency pattern matchers. Through a systematic investigation, we propose multiple improvements to wake-up receiver performance and reliability. We introduce an improved passive envelope detector and realize a wake-up receiver for the 868 MHz band that achieves a power consumption of 5.71 µW and a latency of 9.02 ms. The proposed wake-up receiver can detect signals down to an average power level of -61.6 dBm. These results represent significant advances over the existing state of research on wake-up receivers based on low-frequency pattern matchers, which has not reported comparable figures for signal detection, power consumption, and latency.
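The address-matching step performed by a low-frequency pattern matcher can be illustrated with a short sketch; the 16-bit address and the bit-level interface are assumptions made for illustration, not details taken from the article.

```cpp
// Hedged sketch: address matching in a wake-up receiver, as a low-frequency
// pattern matcher behind a passive envelope detector might perform it.
#include <cstdint>
#include <cstdio>

constexpr uint16_t NODE_ADDRESS = 0xA5C3;  // hypothetical node wake-up address

// Shift demodulated OOK bits into a register and compare against the node
// address; a match would assert the wake-up interrupt toward the main MCU.
class PatternMatcher {
public:
    bool pushBit(bool bit) {
        shiftReg_ = static_cast<uint16_t>((shiftReg_ << 1) | (bit ? 1u : 0u));
        return shiftReg_ == NODE_ADDRESS;
    }
private:
    uint16_t shiftReg_ = 0;
};

int main() {
    PatternMatcher matcher;
    // Feed the bits of the correct address MSB-first (simulated envelope output).
    for (int i = 15; i >= 0; --i) {
        bool bit = (NODE_ADDRESS >> i) & 1u;
        if (matcher.pushBit(bit))
            std::printf("address match: waking main transceiver\n");
    }
}
```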

3.
Sensors (Basel) ; 22(9)2022 Apr 23.
Article in English | MEDLINE | ID: mdl-35590944

ABSTRACT

The Wireless Sensor Network (WSN) is one of the most promising solutions for supervising multiple phenomena and for the digitisation of the Internet of Things (IoT). The Wake-up Receiver (WuRx) is one of the simplest and most effective solutions for energy-constrained networks: it allows energy-autonomous, on-demand communication for continuous monitoring instead of keeping the conventional radio active. The routing process is one of the most energy- and time-consuming processes in WSNs, so it is crucial to conceive an energy-efficient routing scheme. In this paper, we propose a novel Wake-up Receiver-based routing protocol called Clustered WuRx based on Multicast wake-up (CWM), which ensures both energy optimisation and time efficiency in indoor scenarios. In our approach, the network is divided into clusters, and each Fog Node maintains the routes from every node in its cluster to itself. When a sink requires information from a given node, its corresponding Fog Node uses a multicast wake-up mechanism to simultaneously wake up the intended node and all the intermediate nodes that will be used in the routing process. Measurement results demonstrate that our approach exhibits higher energy efficiency and drastically improves delivery delay compared with other routing protocols.
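A rough C++ sketch of the multicast wake-up mechanism as the abstract describes it: a Fog Node stores a route per cluster node and wakes the target plus all relays with a single multicast wake-up packet. The data structures and the sendWakeUpPacket() helper are hypothetical, not the paper's implementation.

```cpp
// Hedged sketch of the CWM idea: precomputed routes per cluster node, and one
// multicast wake-up covering the whole path before data forwarding starts.
#include <cstdio>
#include <map>
#include <set>
#include <vector>

using NodeId = int;

class FogNode {
public:
    void setRoute(NodeId target, std::vector<NodeId> route) {
        routes_[target] = std::move(route);  // relays in order, target last
    }

    // Handle a sink request: one multicast wake-up addressed to every node on
    // the stored route, so target and relays wake up simultaneously.
    void onSinkRequest(NodeId target) {
        auto it = routes_.find(target);
        if (it == routes_.end()) return;
        std::set<NodeId> wakeGroup(it->second.begin(), it->second.end());
        sendWakeUpPacket(wakeGroup);
    }

private:
    void sendWakeUpPacket(const std::set<NodeId>& group) {
        std::printf("multicast wake-up to %zu node(s)\n", group.size());
    }
    std::map<NodeId, std::vector<NodeId>> routes_;
};

int main() {
    FogNode fog;
    fog.setRoute(7, {3, 5, 7});   // hypothetical route: relays 3 and 5, target 7
    fog.onSinkRequest(7);
}
```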

4.
Sensors (Basel) ; 22(6)2022 Mar 10.
Article in English | MEDLINE | ID: mdl-35336342

ABSTRACT

With the introduction of Internet of Things (IoT) technology in several sectors, wireless, reliable, and energy-saving communication in distributed sensor networks is more important than ever. Wake-up technologies are therefore becoming increasingly important, as they significantly reduce the energy consumption of wireless sensor nodes. In indoor environments, the use of wireless sensors is generally more challenging due to signal fading and reflections and therefore needs to be investigated critically. This paper presents a performance analysis of wake-up receiver (WuRx) architectures based on two low-frequency (LF) amplifier approaches with regard to sensitivity, power consumption, and packet error rate (PER). Factors that affect system performance were compared and analyzed through analytical modeling, simulations, and experimental studies of both architectures. The developed WuRx operates in the 868 MHz band using on-off keying (OOK) signals and supports address detection to wake up only the targeted network node. Using an indoor setup, the received signal strength indicator (RSSI) and PER were determined in different rooms and at different distances to build a wireless sensor network. The results show a detection probability for wake-up packets (WuPts) of about 90% at indoor distances of up to 34 m.
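One way to relate the reported detection probability to packet error rate is the standard independent-bit-error model; a brief sketch follows, with the packet length and BER values chosen purely for illustration rather than taken from the paper.

```cpp
// Hedged sketch: relating bit error rate to wake-up packet error rate.
// Assumes independent bit errors; packet length and BER values are illustrative.
#include <cmath>
#include <cstdio>

// Probability that an N-bit wake-up packet contains at least one bit error.
double packetErrorRate(double ber, int packetBits) {
    return 1.0 - std::pow(1.0 - ber, packetBits);
}

int main() {
    const int wupBits = 24;  // hypothetical preamble + address length
    for (double ber : {1e-4, 1e-3, 5e-3}) {
        double per = packetErrorRate(ber, wupBits);
        std::printf("BER %.0e -> PER %.4f (detection probability %.2f%%)\n",
                    ber, per, 100.0 * (1.0 - per));
    }
}
```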

5.
Ann Emerg Med ; 64(5): 537-46, 2014 Nov.
Article in English | MEDLINE | ID: mdl-24970245

ABSTRACT

STUDY OBJECTIVE: Acute HIV infection is a clinical diagnosis aided by technology. Detecting the highly infectious acute stage of HIV infection is critical to reducing transmission and improving long-term outcomes. The Maricopa Integrated Health System implemented nontargeted, opt-out HIV screening with a fourth-generation antigen/antibody combination HIV assay in our adult emergency department (ED) at Maricopa Medical Center to assess the prevalence of both acute and chronic unrecognized HIV. METHODS: Eligible patients aged 18 to 64 years were tested for HIV if they did not opt out and had blood drawn as part of their ED care. Patients were not eligible if they had a known HIV or AIDS diagnosis, exhibited altered mental status, were current residents of a long-term psychiatric or correctional facility, or prompted a trauma activation. Reactive test results were delivered by a physician with the assistance of a linkage-to-care specialist. Specimens with a reactive fourth-generation assay result underwent confirmatory testing. RESULTS: From July 11, 2011, through January 5, 2014, 27,952 HIV screenings were performed for 22,468 patients tested for HIV; 78 (0.28%) had new HIV diagnoses. Of those, 18 (23% of all new diagnoses) were acute HIV infections, and 22 patients (28%) had a CD4 count of less than 200 cells/µL or an opportunistic infection. CONCLUSION: HIV testing with a fourth-generation antigen/antibody laboratory test producing rapid results is feasible in an ED. Unexpectedly, nearly one quarter of patients with undiagnosed HIV had acute infections, which would have been more difficult to detect with previous testing technology.


Subject(s)
AIDS Serodiagnosis/methods , Emergency Service, Hospital , HIV Infections/diagnosis , AIDS Serodiagnosis/statistics & numerical data , Acute Disease , Adolescent , Adult , Aged , Arizona/epidemiology , Emergency Service, Hospital/statistics & numerical data , Female , HIV Infections/epidemiology , Humans , Male , Middle Aged , Treatment Refusal , Young Adult
6.
Am Heart J ; 158(6): 1018-23, 2009 Dec.
Article in English | MEDLINE | ID: mdl-19958870

ABSTRACT

OBJECTIVE: The study aimed to determine the impact on eptifibatide-associated bleeding of implementing a computerized dosing algorithm in the cardiac catheterization suite. BACKGROUND: Excessive dosing of eptifibatide is associated with increased bleeding rates and hospital mortality. Although dose adjustment based on renal function has been recommended, its implementation and clinical impact have not been assessed in daily practice. METHODS: A computerized algorithm was implemented in January 2006 to calculate the appropriate eptifibatide infusion dose (1 µg kg⁻¹ min⁻¹ for creatinine clearance <50 mL/min or 2 µg kg⁻¹ min⁻¹ for creatinine clearance ≥50 mL/min) using the Cockcroft-Gault formula. All patients had hemoglobin measured before and the day after the procedure. Bleeding within 24 hours and mortality during hospitalization were compared in consecutive patients before and after implementation of the algorithm. RESULTS: A total of 334 patients qualified for inclusion (pre-algorithm n = 91, post-algorithm n = 243). There was an increase in the proportion of patients receiving recommended doses of eptifibatide (74.7% pre-algorithm vs 97.5% post-algorithm, P
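A compact sketch of the dose-selection rule the abstract describes, combining the Cockcroft-Gault creatinine-clearance estimate with the 50 mL/min infusion-rate threshold; the patient values are illustrative, and this is an illustration of the published rule rather than clinical software.

```cpp
// Hedged sketch: Cockcroft-Gault creatinine clearance driving the eptifibatide
// infusion rate (1 vs 2 µg/kg/min around a 50 mL/min threshold).
#include <cstdio>

// Cockcroft-Gault creatinine clearance in mL/min
// (serum creatinine in mg/dL, weight in kg; multiply by 0.85 for women).
double creatinineClearance(int ageYears, double weightKg,
                           double serumCreatinineMgDl, bool female) {
    double crcl = (140.0 - ageYears) * weightKg / (72.0 * serumCreatinineMgDl);
    return female ? 0.85 * crcl : crcl;
}

// Infusion rate per the algorithm stated in the abstract.
double eptifibatideInfusionRate(double crclMlMin) {
    return (crclMlMin < 50.0) ? 1.0 : 2.0;  // µg per kg per minute
}

int main() {
    double crcl = creatinineClearance(70, 75.0, 1.8, /*female=*/false);
    std::printf("CrCl = %.1f mL/min -> infusion %.0f ug/kg/min\n",
                crcl, eptifibatideInfusionRate(crcl));
}
```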

Subject(s)
Algorithms , Drug Dosage Calculations , Drug Therapy, Computer-Assisted , Hemorrhage/chemically induced , Hemorrhage/mortality , Peptides/administration & dosage , Platelet Aggregation Inhibitors/administration & dosage , Eptifibatide , Female , Humans , Male , Middle Aged , Peptides/adverse effects , Platelet Aggregation Inhibitors/adverse effects , Retrospective Studies
7.
Postgrad Med ; 121(3): 160-70, 2009 May.
Article in English | MEDLINE | ID: mdl-19491554

ABSTRACT

We evaluated the impact of a 15-hospital, rural, multi-state intensive care unit (ICU) telemedicine program. Acute Physiology, Age, and Chronic Health Evaluation (APACHE III) scores, raw mortality rates, and actual-to-predicted length of stay (LOS) ratios and mortality ratios were used. Surveys evaluated program impact in smaller facilities and satisfaction of the physicians staffing the remote center. Smaller facilities' staff reported improvements in the quality of critical care services and reduced transfers. In regional hospitals, acuity scores increased (retention of sicker patients) while raw mortality was the same or lower. Length of stay ratios were reduced in these hospitals. In the tertiary hospital, actual-to-predicted ICU and hospital mortality and LOS ratios decreased.
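The actual-to-predicted ratios used as outcome measures reduce to a one-line calculation against the APACHE III prediction; the sketch below uses made-up numbers, not data from the study.

```cpp
// Hedged sketch of the outcome metrics named in the abstract: actual-to-
// predicted mortality and LOS ratios, with illustrative values only.
#include <cstdio>

// Ratio < 1.0 means outcomes better than the severity-adjusted prediction.
double actualToPredicted(double actual, double predicted) {
    return actual / predicted;
}

int main() {
    double mortalityRatio = actualToPredicted(/*observed deaths*/ 42.0,
                                              /*APACHE III predicted*/ 55.0);
    double losRatio = actualToPredicted(/*mean actual ICU LOS, days*/ 3.1,
                                        /*predicted LOS, days*/ 3.8);
    std::printf("mortality ratio %.2f, LOS ratio %.2f\n",
                mortalityRatio, losRatio);
}
```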


Subject(s)
Intensive Care Units/organization & administration , Outcome Assessment, Health Care/methods , Program Evaluation/methods , Rural Health Services/organization & administration , Telemedicine/organization & administration , Aged , Female , Follow-Up Studies , Humans , Male , Retrospective Studies , Time Factors , United States
8.
J Crit Care ; 24(3): 458-63, 2009 Sep.
Article in English | MEDLINE | ID: mdl-19327322

ABSTRACT

BACKGROUND: Atrial fibrillation (AF) complicates the course of up to 60% of patients after cardiac surgery, and current prophylactic measures are inadequate. Corticosteroids down-regulate activation of the proinflammatory response (including C-reactive protein) after cardiopulmonary bypass and have been suggested to reduce the risk of postoperative AF. OBJECTIVE: The goal of this meta-analysis was to determine (i) the efficacy of corticosteroids in preventing AF after cardiac surgery and (ii) the impact of different dosage regimens on this outcome. DATA SOURCES: Sources included MEDLINE, Embase, the Cochrane Database of Systematic Reviews, and citation review of relevant primary and review articles. STUDY SELECTION: The study identified prospective, randomized, placebo-controlled clinical trials that evaluated the role of corticosteroids in preventing AF after cardiac surgery. DATA EXTRACTION: Data were abstracted on study design, study size, type of cardiac surgery, corticosteroid dosage regimen, and the incidence of AF in the first 72 hours after surgery. The total cumulative dose of corticosteroid was classified as low dose (<200 mg/d), moderate dose (200-1000 mg/d), high dose (1001-10,000 mg/d), or very high dose (>10,000 mg/d) of hydrocortisone equivalents. Meta-analytic techniques were used to analyze the data. DATA SYNTHESIS: We identified 7 relevant studies that included 1046 patients. The corticosteroid regimen differed across studies, with the total cumulative dose varying from 160 to 21,000 mg of hydrocortisone equivalents; one study each used low-dose and very high-dose corticosteroid. Overall, the use of corticosteroids was associated with a significant reduction in the risk of postoperative AF (odds ratio 0.42; 95% confidence interval 0.27 to 0.68; P = .0004). However, significant heterogeneity was noted between studies. When the low-dose and very high-dose studies were excluded, the treatment effect was highly significant (odds ratio 0.32; 95% confidence interval 0.21 to 0.50; P < .00001) with insignificant heterogeneity. CONCLUSIONS: Moderate-dose corticosteroid (hydrocortisone) should be considered for the prevention of AF in high-risk patients undergoing cardiac surgery. Although the optimal dose, dosing interval, and duration of therapy are unclear, a single dose given at induction may be adequate. The interaction between corticosteroids, beta-blockers, and amiodarone requires further study.
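A short sketch of the cumulative-dose classification stated in the abstract (hydrocortisone equivalents binned into the four dose bands); the example doses are illustrative values within the 160-21,000 mg range the abstract mentions, not per-study data.

```cpp
// Hedged sketch: the four corticosteroid dose bands from the abstract,
// expressed as a simple classification over hydrocortisone equivalents.
#include <cstdio>
#include <string>

std::string classifyDose(double hydrocortisoneEqMgPerDay) {
    if (hydrocortisoneEqMgPerDay < 200.0)    return "low dose";
    if (hydrocortisoneEqMgPerDay <= 1000.0)  return "moderate dose";
    if (hydrocortisoneEqMgPerDay <= 10000.0) return "high dose";
    return "very high dose";
}

int main() {
    for (double dose : {160.0, 500.0, 4000.0, 21000.0})
        std::printf("%.0f mg/d -> %s\n", dose, classifyDose(dose).c_str());
}
```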


Subject(s)
Atrial Fibrillation/prevention & control , Cardiac Surgical Procedures , Cardiopulmonary Bypass/adverse effects , Glucocorticoids/therapeutic use , Postoperative Complications/prevention & control , Dose-Response Relationship, Drug , Glucocorticoids/administration & dosage , Humans , Hydrocortisone/administration & dosage , Hydrocortisone/therapeutic use , Inflammation Mediators/blood , Perioperative Care/methods , Postoperative Complications/etiology
9.
J Am Coll Cardiol ; 53(9): 802-10, 2009 Mar 03.
Article in English | MEDLINE | ID: mdl-19245974

ABSTRACT

OBJECTIVES: The aim of this study was to evaluate the impact of echocardiographic contrast utilization on patient diagnosis and management. BACKGROUND: Contrast echocardiography (CE) has improved visualization of endocardial borders. However, its impact on patient management has not been evaluated previously. METHODS: We prospectively enrolled 632 consecutive patients with technically difficult echocardiographic studies who received intravenous contrast (Definity, Lantheus Medical Imaging, Billerica, Massachusetts). Quality of studies, number of left ventricular (LV) segments visualized, estimated ejection fraction, presence of apical thrombus, and management decisions were compared before and after contrast. RESULTS: After CE, the percentage of uninterpretable studies decreased from 11.7% to 0.3%, and technically difficult studies decreased from 86.7% to 9.8% (p < 0.0001). Before contrast, 11.6 ± 3.3 of 17 LV segments were seen, which improved after CE to 16.8 ± 1.1 (p < 0.0001). An LV thrombus was suspected in 35 patients and was definite in 3 patients before CE. After contrast, only 1 patient had a suspected thrombus, and 5 additional patients with thrombus were identified (p < 0.0001). A significant impact of CE on management was observed: additional diagnostic procedures were avoided in 32.8% of patients and drug management was altered in 10.4%, with a total impact (procedures avoided, change in drugs, or both) in 35.6% of patients. The impact of contrast increased with worsening quality of the nonenhanced study and was highest in intensive care units. A cost-benefit analysis showed significant savings with contrast use ($122/patient). CONCLUSIONS: The utilization of CE in technically difficult cases improves endocardial visualization and impacts cardiac diagnosis, resource utilization, and patient management.


Subject(s)
Contrast Media , Echocardiography, Transesophageal/methods , Ventricular Dysfunction, Left/diagnostic imaging , Contrast Media/economics , Cost-Benefit Analysis , Echocardiography, Transesophageal/economics , Female , Heart Ventricles/diagnostic imaging , Humans , Male , Middle Aged , Prospective Studies , Radionuclide Imaging , Stroke Volume , United States , Ventricular Function, Left
11.
Crit Care Med ; 35(2): 584-8, 2007 Feb.
Article in English | MEDLINE | ID: mdl-17205004

ABSTRACT

BACKGROUND AND OBJECTIVES: Clinical chemistry is an important component of the diagnosis of many conditions, and advances in laboratory science have brought many new diagnostic tools to the intensive care unit clinician, including new biomarkers of cardiac injury like troponin T and I. Interpretation of these clinical laboratory results requires knowledge of the performance of these tests. SETTING AND PATIENTS: This article reviews the interpretation and performance of diagnostic markers of myocardial injury in patients with diverse clinical conditions of interest to critical care practitioners. CONCLUSIONS: Cardiac troponin I and T, regulatory components of the contractile apparatus, are sensitive indicators of myocardial injury and have become central to the diagnosis of myocardial infarction. The troponins are also released in a number of clinical situations in which thrombotic complications of coronary artery disease and resultant acute myocardial infarction have not occurred. These situations include conditions like pulmonary embolism, sepsis, myocarditis, and acute stroke. Elevated troponins in these conditions are thought to emanate from injured myocardial cells and, in most circumstances, have been associated with adverse outcomes. Practitioners should be mindful of the wide spectrum of diseases that may result in elevated troponin when interpreting these measurements.


Subject(s)
Heart Diseases/blood , Heart Diseases/diagnosis , Intensive Care Units , Troponin I/blood , Troponin T/blood , Humans
14.
Crit Care ; 8(2): 87-8, 2004 Apr.
Article in English | MEDLINE | ID: mdl-15025762

ABSTRACT

Coronary artery disease remains a common problem in industrialized countries. Percutaneous coronary interventions are usually performed via the femoral approach. Arterial puncture-closing devices have been developed in the hope of avoiding manual compression and shortening the period of bed rest. In a recent meta-analysis published in the Journal of the American Medical Association, these devices showed only marginal benefits over manual compression. Further well-designed studies are necessary to document the comparative effects of these devices versus manual compression.


Subject(s)
Angioplasty, Balloon, Coronary/adverse effects , Cardiac Catheterization/adverse effects , Coronary Artery Disease/therapy , Hemostatic Techniques/instrumentation , Punctures , Wound Healing , Bandages , Femoral Artery , Hematoma/etiology , Hematoma/prevention & control , Humans , Meta-Analysis as Topic , Postoperative Hemorrhage/etiology , Postoperative Hemorrhage/prevention & control , Pressure
15.
Crit Care Med ; 32(1): 256-62, 2004 Jan.
Article in English | MEDLINE | ID: mdl-14707589

ABSTRACT

OBJECTIVE: The development of practice guidelines for the conduct of intra- and interhospital transport of the critically ill patient. DATA SOURCE: Expert opinion and a search of Index Medicus from January 1986 through October 2001 provided the basis for these guidelines. A task force of experts in the field of patient transport provided personal experience and expert opinion. STUDY SELECTION AND DATA EXTRACTION: Several prospective and clinical outcome studies were found. However, much of the published data comes from retrospective reviews and anecdotal reports. Experience and consensus opinion form the basis of much of these guidelines. RESULTS OF DATA SYNTHESIS: Each hospital should have a formalized plan for intra- and interhospital transport that addresses a) pretransport coordination and communication; b) transport personnel; c) transport equipment; d) monitoring during transport; and e) documentation. The transport plan should be developed by a multidisciplinary team and should be evaluated and refined regularly using a standard quality improvement process. CONCLUSION: The transport of critically ill patients carries inherent risks. These guidelines promote measures to ensure safe patient transport. Although both intra- and interhospital transport must comply with regulations, we believe that patient safety is enhanced during transport by establishing an organized, efficient process supported by appropriate equipment and personnel.


Subject(s)
Critical Care/standards , Guideline Adherence , Patient Transfer/standards , Transportation of Patients/standards , Critical Illness , Female , Humans , Male , Monitoring, Physiologic/standards , Policy Making , Risk Assessment , United States
16.
J Emerg Med ; 25(4): 409-13, 2003 Nov.
Article in English | MEDLINE | ID: mdl-14654182

ABSTRACT

Gatherings of large numbers of people at concerts, sporting events, and other occasions produce an assembled population with the potential for a wide variety of illnesses and injuries. The collection of large numbers of people in a single location has led some authors to recommend placing resuscitation equipment or other medical services in close proximity to these activities. These recommendations notwithstanding, data on the frequency of critical illness at mass gatherings (a group exceeding 1000 persons) are difficult to ascertain. The purpose of this study was therefore to describe the incidence of critical illness among assembled populations at mass gatherings. An observational prospective study was conducted of patient encounters at a large, multipurpose, indoor mass-gathering complex in Houston, Texas between September 1, 1996 and June 30, 1997. Demographic, treatment, disposition, and diagnostic data were analyzed in a computerized database. Of the 3.3 million attendees at the 253 events analyzed during the 10-month study period, there were 2762 (0.08%) patient encounters. Fifty-two percent were women, and mean age was 32 ± 15.6 years. Of these patients, 51.1% were patrons; the remainder were employees or contractors of the facility. A wide variety of illnesses was seen, with trauma (39.5%), headache (31%), and other medical complaints (29.5%) being most frequent. Disposition included 95.3% of patients being discharged back to the event and 2.2% being counseled to seek other medical attention. One hundred twenty-nine patients (4.7%) were referred to the Emergency Department (ED); of these, 70 were transferred for abrasions, lacerations, or skeletal injuries and 13 for chest pain. Of those referred to the ED, 50 (38.7%) were transported by ambulance, only 17.4% were admitted to telemetry, and none were admitted to an ICU. It is concluded that critical illness at mass gatherings is infrequent, with very few patients admitted to telemetry and none to an ICU. Cost-benefit should be considered carefully when determining the allocation of resources for these activities.


Subject(s)
Critical Illness/epidemiology , Emergency Medical Services/statistics & numerical data , Recreation , Adult , Disaster Planning , Female , Humans , Male , Music , Prospective Studies , Regression Analysis , Sports , Texas/epidemiology , Wounds and Injuries/epidemiology , Wounds and Injuries/therapy
19.
Chest ; 123(4): 1313, 2003 Apr.
Article in English | MEDLINE | ID: mdl-12703498