Results 1 - 20 of 25
1.
Neurocrit Care ; 2024 May 10.
Article in English | MEDLINE | ID: mdl-38730118

ABSTRACT

BACKGROUND: Optimal pharmacologic thromboprophylaxis dosing is not well described in patients with subarachnoid hemorrhage (SAH) with an external ventricular drain (EVD). Our patients with SAH with an EVD who receive prophylactic enoxaparin are routinely monitored using timed anti-Xa levels. Our primary study goal was to determine the frequency of venous thromboembolism (VTE) and secondary intracranial hemorrhage (ICH) for this population of patients who received pharmacologic prophylaxis with enoxaparin or unfractionated heparin (UFH). METHODS: A retrospective chart review was performed for all patients with SAH admitted to the neurocritical care unit at Emory University Hospital between 2012 and 2017. All patients with SAH who required an EVD were included. RESULTS: Of 1,351 patients screened, 868 required an EVD. Of these 868 patients, 627 received enoxaparin, 114 received UFH, and 127 did not receive pharmacologic prophylaxis. VTE occurred in 7.5% of patients in the enoxaparin group, 4.4% in the UFH group (p = 0.32), and 3.2% in the no VTE prophylaxis group (p = 0.08). Secondary ICH occurred in 3.83% of patients in the enoxaparin group, 3.51% in the UFH group (p = 1), and 3.94% in the no VTE prophylaxis group (p = 0.53). As steady-state anti-Xa levels increased from 0.1 units/mL to > 0.3 units/mL, there was a trend toward a lower incidence of VTE. However, no correlation was noted between rising anti-Xa levels and an increased incidence of secondary ICH. When compared, neither enoxaparin nor UFH use was associated with a significantly reduced incidence of VTE or an increased incidence of ICH. CONCLUSIONS: In this retrospective study of patients with nontraumatic SAH with an EVD who received enoxaparin or UFH VTE prophylaxis or no VTE prophylaxis, there was no statistically significant difference in the incidence of VTE or secondary ICH. 
For patients receiving prophylactic enoxaparin, achieving higher steady-state target anti-Xa levels may be associated with a lower incidence of VTE without increasing the risk of secondary ICH.

2.
J Appl Stat ; 50(14): 2889-2913, 2023.
Article in English | MEDLINE | ID: mdl-37808611

ABSTRACT

In this paper, we present an efficient statistical method (denoted 'Adaptive Resources Allocation CUSUM') to robustly detect hotspots with limited sampling resources. Our main idea is to combine multi-armed bandit (MAB) and change-point detection methods to balance the exploration and exploitation of resource allocation for hotspot detection. Further, a Bayesian weighted update is used to update the posterior distribution of the infection rate. Then, the upper confidence bound (UCB) is used for resource allocation and planning. Finally, CUSUM monitoring statistics are used to detect the change point as well as the change location. For performance evaluation, we compare the proposed method with several benchmark methods in the literature and show that the proposed algorithm achieves a lower detection delay and higher detection precision. Finally, the method is applied to hotspot detection in a real case study of county-level daily positive COVID-19 cases in Washington State (WA) and demonstrates its effectiveness with very limited distributed samples.
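The UCB-plus-CUSUM combination described above can be sketched in a few lines. This is an illustrative toy, not the authors' implementation: the function names and the simple mean-shift CUSUM score are assumptions.

```python
import math

def cusum_update(stat, x, pre_mean, post_mean):
    """One CUSUM step for a mean shift from pre_mean to post_mean:
    accumulate the drift past the midpoint, floored at zero."""
    return max(0.0, stat + (x - (pre_mean + post_mean) / 2.0))

def ucb_pick(counts, successes, t):
    """UCB1 allocation: pick the stream with the highest estimated
    rate plus an exploration bonus that shrinks with sample count."""
    best, best_val = 0, -1.0
    for i, (n, s) in enumerate(zip(counts, successes)):
        if n == 0:
            return i  # sample every stream at least once
        val = s / n + math.sqrt(2.0 * math.log(t) / n)
        if val > best_val:
            best, best_val = i, val
    return best
```

At each time step one would call `ucb_pick` to choose which stream to sample, then feed the observation into that stream's `cusum_update` statistic and alarm when it crosses a threshold.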

3.
J Appl Stat ; 50(14): 2999-3029, 2023.
Article in English | MEDLINE | ID: mdl-37808612

ABSTRACT

Count data occur widely in many bio-surveillance and healthcare applications, e.g. the numbers of new patients of different types of infectious diseases from different cities/counties/states repeatedly over time, say, daily/weekly/monthly. For this type of count data, one important task is the quick detection and localization of hot-spots in terms of unusual infectious rates so that we can respond appropriately. In this paper, we develop a method called Poisson assisted Smooth Sparse Tensor Decomposition (PoSSTenD), which not only detects when hot-spots occur but also localizes where they occur. The main idea of our proposed PoSSTenD method is as follows. First, we represent the observed count data as a three-dimensional tensor with (1) a spatial dimension for location patterns, e.g. different cities/counties/states; (2) a temporal dimension for time patterns, e.g. daily/weekly/monthly; and (3) a categorical dimension for different types of data sources, e.g. different types of diseases. Second, we fit this tensor with a Poisson regression model and decompose the infectious rate into two components: a smooth global trend and local hot-spots. Third, we detect when hot-spots occur by building a cumulative sum (CUSUM) control chart and localize where they occur via LASSO-type sparse estimation. The usefulness of our proposed methodology is validated through numerical simulation studies and a real-world dataset recording the annual number of 10 different infectious diseases from 1993 to 2018 for 49 mainland states in the United States.
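As an illustration of a CUSUM control chart applied to count data, here is a standard Poisson CUSUM in isolation. This is a simplified sketch, not the PoSSTenD implementation; the baseline rate `lam0` and hotspot rate `lam1` are assumed inputs.

```python
import math

def poisson_cusum(counts, lam0, lam1, threshold):
    """Poisson CUSUM: accumulate log-likelihood-ratio increments for
    rate lam1 vs. baseline lam0, floored at zero; alarm when the
    statistic crosses the threshold. Returns the first alarm index,
    or None if no alarm is raised."""
    llr_per_count = math.log(lam1 / lam0)
    stat = 0.0
    for t, x in enumerate(counts):
        stat = max(0.0, stat + x * llr_per_count - (lam1 - lam0))
        if stat > threshold:
            return t
    return None
```

With a baseline of 1 case/day, a jump to around 10 cases/day trips the chart almost immediately, while baseline-level counts keep the statistic at zero.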

5.
J Appl Stat ; 50(14): 2951-2969, 2023.
Article in English | MEDLINE | ID: mdl-37808618

ABSTRACT

Multistage sequential decision-making occurs in many real-world applications such as healthcare diagnosis and treatment. One concrete example is when doctors need to decide which kind of information to collect from subjects so as to make good medical decisions cost-effectively. In this paper, an active learning-based method is developed to model the doctors' decision-making process that actively collects necessary information from each subject in a sequential manner. The effectiveness of the proposed model, especially its two-stage version, is validated in both simulation studies and a case study of common bile duct stone evaluation for pediatric patients.

6.
Seq Anal ; 42(2): 150-181, 2023.
Article in English | MEDLINE | ID: mdl-37645693

ABSTRACT

The active quickest detection problem with unknown post-change parameters is studied under a sampling control constraint: there are p local streams in a system, but one is able to take observations from only one of these p local streams at each time instant. The objective is to raise a correct alarm as quickly as possible once a change occurs, subject to both false alarm and sampling control constraints. Here we assume that exactly one of the p local streams is affected and that the post-change distribution involves unknown parameters. In this context, we propose an efficient greedy-cyclic-sampling-based quickest detection algorithm and show that it is asymptotically optimal in the sense of minimizing the detection delay under both false alarm and sampling control constraints. Numerical studies are conducted to show the effectiveness and applicability of the proposed algorithm.
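One plausible reading of a greedy-cyclic sampling rule is sketched below as a toy. This is not the authors' algorithm: the choice to cycle while all local statistics are zero and to sample greedily otherwise is an assumption made for illustration.

```python
def greedy_cyclic_sample(local_stats, last_arm, p):
    """Decide which of the p streams to observe next: cycle through
    streams while no local detection statistic shows evidence of a
    change (pure exploration), otherwise greedily sample the stream
    with the largest local statistic (exploitation)."""
    if max(local_stats) <= 0.0:
        return (last_arm + 1) % p
    return max(range(p), key=lambda i: local_stats[i])
```

The observation taken from the chosen stream would then update that stream's local statistic, and a global alarm is raised when any local statistic exceeds a threshold calibrated to the false alarm constraint.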

7.
Front Cell Dev Biol ; 11: 1167111, 2023.
Article in English | MEDLINE | ID: mdl-37305684

ABSTRACT

Chromatin immunoprecipitation followed by sequencing (ChIP-seq) has revolutionized the study of epigenomes, and the massive increase in ChIP-seq datasets calls for robust and user-friendly computational tools for quantitative ChIP-seq. Quantitative ChIP-seq comparisons have been challenging due to the noisiness and variation inherent to ChIP-seq and epigenomes. By employing statistical approaches catered to the ChIP-seq data distribution, together with sophisticated simulations and extensive benchmarking studies, we developed and validated CSSQ as a nimble statistical analysis pipeline capable of differential binding analysis across ChIP-seq datasets with high confidence and sensitivity and a low false discovery rate for any defined regions. CSSQ models ChIP-seq data as a finite mixture of Gaussians that faithfully reflects the ChIP-seq data distribution. Through a combination of Anscombe transformation, k-means clustering, and estimated maximum normalization, CSSQ minimizes noise and bias from experimental variation. Further, CSSQ utilizes a non-parametric approach and incorporates comparisons under the null hypothesis by unaudited column permutation to perform robust statistical tests, accounting for the small numbers of replicates typical of ChIP-seq datasets. In sum, we present CSSQ as a powerful statistical computational pipeline tailored for ChIP-seq data quantitation and a timely addition to the toolkits of differential binding analysis for deciphering epigenomes.
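The Anscombe transformation mentioned above is a standard variance-stabilizing step and is easy to state in isolation; a minimal sketch (the CSSQ pipeline itself involves much more than this single step):

```python
import math

def anscombe(x):
    """Anscombe transform, 2*sqrt(x + 3/8): approximately stabilizes
    the variance of Poisson-like counts (such as read counts), so that
    downstream Gaussian-based modeling is better justified."""
    return 2.0 * math.sqrt(x + 0.375)
```

After this transform, counts with Poisson noise have approximately unit variance regardless of their mean, which is why it is a common preprocessing step before Gaussian mixture modeling.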

8.
Technometrics ; 65(1): 33-43, 2023.
Article in English | MEDLINE | ID: mdl-36950530

ABSTRACT

In many real-world problems of real-time monitoring of high-dimensional streaming data, one wants to detect an undesired event or change quickly once it occurs, but under a sampling control constraint: in resource-constrained environments, one may only be able to observe or use data from selected components for decision-making at each time step. In this paper, we propose to incorporate multi-armed bandit approaches into sequential change-point detection to develop an efficient bandit change-point detection algorithm, based on the limiting Bayesian approach to incorporate prior knowledge of potential changes. Our proposed algorithm, termed Thompson-Sampling-Shiryaev-Roberts-Pollak (TSSRP), consists of two policies per time step: the adaptive sampling policy applies the Thompson Sampling algorithm to balance exploration for acquiring long-term knowledge against exploitation for immediate reward gain, and the statistical decision policy fuses the local Shiryaev-Roberts-Pollak statistics to determine whether to raise a global alarm via sum shrinkage techniques. Extensive numerical simulations and case studies demonstrate the statistical and computational efficiency of our proposed TSSRP algorithm.
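The Thompson Sampling step of an adaptive sampling policy might look like the following sketch, under the assumption of Beta posteriors over per-stream change probabilities. This is an illustration of the general technique, not the TSSRP implementation.

```python
import random

def thompson_pick(alphas, betas, k, rng=random):
    """Thompson Sampling: draw one sample from each stream's Beta
    posterior over its change probability, then observe the k streams
    with the largest draws. Randomness in the draws supplies
    exploration; the posterior means supply exploitation."""
    draws = [rng.betavariate(a, b) for a, b in zip(alphas, betas)]
    ranked = sorted(range(len(draws)), key=lambda i: draws[i], reverse=True)
    return ranked[:k]
```

After observing the chosen streams, the corresponding `alphas`/`betas` would be updated with the new evidence, so streams that look suspicious get sampled more often without starving the rest.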

9.
Neurocrit Care ; 38(2): 320-325, 2023 04.
Article in English | MEDLINE | ID: mdl-35831731

ABSTRACT

BACKGROUND: COVID-19 surges led to significant challenges in ensuring critical care capacity. In response, some centers leveraged neurocritical care (NCC) capacity as part of the surge response, with neurointensivists providing general critical care for patients with COVID-19 without neurologic illness. The relative outcomes of NCC critical care management of patients with COVID-19 remain unclear and may help guide further surge planning and provide broader insights into general critical care provided in NCC units. METHODS: We performed an observational cohort study of all patients requiring critical care for COVID-19 across four hospitals within the Emory Healthcare system during the first three surges. Patients were categorized on the basis of admission to intensive care units (ICUs) staffed by general intensivists or neurointensivists. Patients with primary neurological diagnoses were excluded. Baseline demographics, clinical complications, and outcomes were compared between groups using univariable and propensity score matching statistics. RESULTS: A total of 1141 patients with a primary diagnosis of COVID-19 required ICU admission. ICUs were staffed by general intensivists (n = 1071) or neurointensivists (n = 70). Baseline demographics and presentation characteristics were similar between groups, except for patients admitted to neurointensivist-staffed ICUs being younger (59 vs. 65, p = 0.027) and having a higher PaO2/FiO2 ratio (153 vs. 120, p = 0.002). After propensity score matching, there was no correlation between ICU staffing and the use of mechanical ventilation, renal replacement therapy, and vasopressors. The rates of in-hospital mortality and hospice disposition were similar in neurointensivist-staffed COVID-19 units (odds ratio 0.9, 95% confidence interval 0.31-2.64, p = 0.842). 
CONCLUSIONS: COVID-19 surges precipitated a natural experiment in which neurology-trained neurointensivists provided critical care in a comparable context to general intensivists treating the same disease. Neurology-trained neurointensivists delivered comparable outcomes to those of general ICUs during COVID-19 surges. These results further support the role of NCC in meeting general critical care needs of neurocritically ill patients and as a viable surge resource in general critical care.


Subject(s)
COVID-19, Neurology, Humans, Surge Capacity, Critical Care/methods, Intensive Care Units
10.
Mil Med ; 188(3-4): e771-e779, 2023 03 20.
Article in English | MEDLINE | ID: mdl-34557921

ABSTRACT

INTRODUCTION: Occupational exposure to repetitive, low-level blasts in military training and combat has been tied to subconcussive injury and poor health outcomes for service members. Most low-level blast studies to date have focused on explosive breaching and firing heavy weapon systems; however, there is limited research on the repetitive blast exposure and physiological effects that mortarmen experience when firing mortar weapon systems. Motivated by anecdotal symptoms of mortarmen, the purpose of this paper is to characterize this exposure and its resulting neurocognitive effects in order to provide preliminary findings and actionable recommendations to safeguard the health of mortarmen. MATERIALS AND METHODS: In collaboration with the U.S. Army Rangers at Fort Benning, blast exposure, symptoms, and pupillary light reflex were measured during 3 days of firing 81 mm and 120 mm mortars in training. Blast exposure analysis included the examination of the blast overpressure (BOP) and cumulative exposure by mortarman position, as well as comparison to the 4 psi safety threshold. Pupillary light reflex responses were analyzed with linear mixed effects modeling. All neurocognitive results were compared between mortarmen (n = 11) and controls (n = 4) and cross-compared with blast exposure and blast history. RESULTS: Nearly 500 rounds were fired during the study, resulting in a high cumulative blast exposure for all mortarmen. While two mortarmen had average BOPs exceeding the 4 psi safety limit (Fig. 2), there was a high prevalence of mTBI-like symptoms among all mortarmen, with over 70% experiencing headaches, ringing in the ears, forgetfulness/poor memory, and taking longer to think during the training week (n ≥ 8/11). Mortarmen also had smaller and slower pupillary light reflex responses relative to controls, with significantly slower dilation velocity (P < 0.05) and constriction velocity (P < 0.10). 
CONCLUSION: Mortarmen experienced high cumulative blast exposure coinciding with altered neurocognition that is suggestive of blast-related subconcussive injury. These neurocognitive effects occurred even in mortarmen with average BOP below the 4 psi safety threshold. While this study was limited by a small sample size, its results demonstrate a concerning health risk for mortarmen that requires additional study and immediate action. Behavioral changes like ducking and standing farther from the mortar when firing can generally help reduce mortarmen BOP exposure, but we recommend the establishment of daily cumulative safety thresholds and daily firing limits in training to reduce cumulative blast exposure, and ultimately, improve mortarmen's quality of life and longevity in service.


Subject(s)
Blast Injuries, Military Personnel, Humans, Military Personnel/psychology, Quality of Life, Explosions, Blast Injuries/complications, Blast Injuries/diagnosis
11.
J Appl Stat ; 49(7): 1636-1662, 2022.
Article in English | MEDLINE | ID: mdl-35707553

ABSTRACT

In many real-world applications of monitoring multivariate spatio-temporal data that are non-stationary over time, one is often interested in detecting hot-spots with spatial sparsity and temporal consistency, instead of detecting system-wise changes as in the traditional statistical process control (SPC) literature. In this paper, we propose an efficient method to detect hot-spots through tensor decomposition; our method has three steps. First, we fit the observed data to a Smooth Sparse Decomposition Tensor (SSD-Tensor) model that serves as a dimension reduction and de-noising technique: it is an additive model decomposing the original data into a smooth but non-stationary global mean, sparse local anomalies, and random noise. Next, we estimate the model parameters in a penalized framework that includes Least Absolute Shrinkage and Selection Operator (LASSO) and fused LASSO penalties; an efficient recursive optimization algorithm is developed based on the Fast Iterative Shrinkage Thresholding Algorithm (FISTA). Finally, we apply a Cumulative Sum (CUSUM) control chart to monitor the model residuals after removing global means, which helps to detect when and where hot-spots occur. To demonstrate the usefulness of our proposed SSD-Tensor method, we compare it with several other methods, including scan statistics and LASSO-based, PCA-based, and T2-based control charts, in extensive numerical simulation studies and on a real crime rate dataset.
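The LASSO penalty inside FISTA reduces, at each iteration, to the soft-thresholding proximal operator, which can be sketched on its own (illustrative; the SSD-Tensor optimization with the fused penalty is considerably richer):

```python
def soft_threshold(v, lam):
    """Soft-thresholding operator, the proximal map of the LASSO
    penalty: shrink each entry toward zero by lam, zeroing out small
    entries entirely. This is the core per-iteration step in FISTA
    for an l1-penalized objective."""
    return [max(abs(x) - lam, 0.0) * (1.0 if x >= 0 else -1.0) for x in v]
```

The zeroing behavior is what produces sparse anomaly estimates: residual entries smaller than the penalty level are declared noise, and only the survivors are flagged as local hot-spots.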

13.
Technometrics ; 64(4): 502-512, 2022.
Article in English | MEDLINE | ID: mdl-37388823

ABSTRACT

High-dimensional data has become prevalent due to the easy accessibility of sensors in modern industrial applications. However, one specific challenge is that it is often difficult to obtain complete measurements due to limited sensing power and resource constraints. Furthermore, distinct failure patterns may exist in the systems, and it is necessary to identify the true failure pattern. This work focuses on the online adaptive monitoring of high-dimensional data in resource-constrained environments with multiple potential failure modes. To achieve this, we propose to apply the Shiryaev-Roberts procedure at the failure mode level and to utilize the multi-armed bandit to balance exploration and exploitation. We further discuss the theoretical properties of the proposed algorithm to show that it can correctly isolate the failure mode. Finally, extensive simulations and two case studies demonstrate that the change-point detection performance and the failure mode isolation accuracy can be greatly improved.
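The Shiryaev-Roberts recursion at the heart of the procedure is compact; a minimal sketch, where the likelihood ratios `lrs` (post-change vs. pre-change density at each observation) are assumed to be computed elsewhere:

```python
def shiryaev_roberts(lrs):
    """Shiryaev-Roberts recursion: R_t = (1 + R_{t-1}) * LR_t, starting
    from R_0 = 0, where LR_t is the likelihood ratio of the post-change
    vs. pre-change distribution at time t. Returns the full trajectory;
    an alarm is raised when R_t exceeds a chosen threshold."""
    r = 0.0
    history = []
    for lr in lrs:
        r = (1.0 + r) * lr
        history.append(r)
    return history
```

Under no change the likelihood ratios hover near 1 and the statistic grows only linearly, whereas after a change the ratios exceed 1 on average and the statistic grows geometrically, which is what makes the threshold crossing fast.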

14.
J Neurosurg ; 136(1): 115-124, 2022 Jan 01.
Article in English | MEDLINE | ID: mdl-34087804

ABSTRACT

OBJECTIVE: Cerebral vasospasm and delayed cerebral ischemia (DCI) contribute to poor outcome following subarachnoid hemorrhage (SAH). With the paucity of effective treatments, the authors describe their experience with intrathecal (IT) nicardipine for this indication. METHODS: Patients admitted to the Emory University Hospital neuroscience ICU between 2012 and 2017 with nontraumatic SAH, either aneurysmal or idiopathic, were included in the analysis. Using a propensity-score model, this patient cohort was compared to patients in the Subarachnoid Hemorrhage International Trialists (SAHIT) repository who did not receive IT nicardipine. The primary outcome was DCI. Secondary outcomes were long-term functional outcome and adverse events. RESULTS: The analysis included 1351 patients, 422 of whom were diagnosed with cerebral vasospasm and treated with IT nicardipine. When compared with patients with no vasospasm (n = 859), the treated group was significantly younger (mean age 51.1 ± 12.4 years vs 56.7 ± 14.1 years, p < 0.001), had a higher World Federation of Neurosurgical Societies score and modified Fisher grade, and were more likely to undergo clipping of the ruptured aneurysm as compared to endovascular treatment (30.3% vs 11.3%, p < 0.001). Treatment with IT nicardipine decreased the daily mean transcranial Doppler velocities in 77.3% of the treated patients. When compared to patients not receiving IT nicardipine, treatment was not associated with an increased rate of bacterial ventriculitis (3.1% vs 2.7%, p > 0.1), yet higher rates of ventriculoperitoneal shunting were noted (19.9% vs 8.8%, p < 0.01). In a propensity score comparison to the SAHIT database, the odds ratio (OR) to develop DCI with IT nicardipine treatment was 0.61 (95% confidence interval [CI] 0.44-0.84), and the OR to have a favorable functional outcome (modified Rankin Scale score ≤ 2) was 2.17 (95% CI 1.61-2.91). 
CONCLUSIONS: IT nicardipine was associated with improved outcome and reduced DCI compared with propensity-matched controls. There was an increased need for permanent CSF diversion but no other safety issues. These data should be considered when selecting medications and treatments to study in future randomized controlled clinical trials for SAH.


Subject(s)
Calcium Channel Blockers/administration & dosage, Calcium Channel Blockers/therapeutic use, Nicardipine/administration & dosage, Nicardipine/therapeutic use, Subarachnoid Hemorrhage/complications, Intracranial Vasospasm/drug therapy, Intracranial Vasospasm/etiology, Adult, Age Factors, Aged, Ruptured Aneurysm, Aortic Rupture/complications, Aortic Rupture/surgery, Calcium Channel Blockers/adverse effects, Critical Care, Endovascular Procedures, Female, Humans, Spinal Injections, Male, Middle Aged, Neurosurgical Procedures, Nicardipine/adverse effects, Propensity Score, Retrospective Studies, Treatment Outcome
15.
J Pediatr Gastroenterol Nutr ; 73(5): 636-641, 2021 Nov 01.
Article in English | MEDLINE | ID: mdl-34224492

ABSTRACT

BACKGROUND: Definitive non-invasive detection of pediatric choledocholithiasis could allow more efficient identification of those patients who are most likely to benefit from therapeutic endoscopic retrograde cholangiopancreatography (ERCP) for stone extraction. OBJECTIVE: To craft a pediatric choledocholithiasis prediction model using a combination of commonly available serum laboratory values and ultrasound results. METHODS: Laboratory and imaging results from 316 pediatric patients who underwent intraoperative cholangiogram or ERCP due to suspicion of choledocholithiasis were retrospectively reviewed and compared to the presence of common bile duct stones on cholangiography. Multivariate logistic regression with supervised machine learning was used to create a predictive scoring model. Monte Carlo cross-validation was used to validate the scoring model, and a score threshold that would provide at least 90% specificity for choledocholithiasis was determined in an effort to minimize non-therapeutic ERCP. RESULTS: Alanine aminotransferase (ALT), total bilirubin, alkaline phosphatase, and common bile duct diameter on ultrasound were found to be the key clinical variables for determining the likelihood of choledocholithiasis. At the dictated specificity threshold of 90.3%, the model yielded a sensitivity of 40.8% and an overall accuracy of 71.5% in detecting choledocholithiasis. The positive predictive value was 71.4% and the negative predictive value was 72.1%. CONCLUSION: Our novel pediatric choledocholithiasis predictive model is a highly specific tool to suggest ERCP in the setting of likely choledocholithiasis.
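The threshold-selection step (choosing the smallest score cutoff that achieves a target specificity) can be sketched as follows. This illustrates the design goal only; it is not the authors' model, data, or exact procedure, and the function name is an assumption.

```python
def choose_threshold(scores, labels, min_specificity=0.9):
    """Pick the smallest score threshold whose specificity (the
    true-negative rate among stone-free patients, labeled 0) meets the
    target, mirroring the >=90%-specificity design goal. A patient is
    predicted positive when score >= threshold. Returns None if no
    threshold qualifies or there are no negatives."""
    negatives = [s for s, y in zip(scores, labels) if y == 0]
    if not negatives:
        return None
    for t in sorted(set(scores)):
        spec = sum(s < t for s in negatives) / len(negatives)
        if spec >= min_specificity:
            return t
    return None
```

Prioritizing specificity over sensitivity is the right trade-off here because the cost being minimized is a non-therapeutic invasive procedure, not a missed screen.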


Subject(s)
Choledocholithiasis, Child, Cholangiography, Endoscopic Retrograde Cholangiopancreatography, Choledocholithiasis/diagnostic imaging, Choledocholithiasis/surgery, Common Bile Duct, Humans, Retrospective Studies, Sensitivity and Specificity
16.
J Am Soc Echocardiogr ; 34(12): 1253-1261.e4, 2021 12.
Article in English | MEDLINE | ID: mdl-34284098

ABSTRACT

BACKGROUND: The authors retrospectively evaluated the impact of ultrasound enhancing agent (UEA) use in the first transthoracic echocardiographic (TTE) examination, regardless of baseline image quality, on the number of repeat TTE examinations and length of stay (LOS) during a heart failure (HF) admission. METHODS: There were 9,115 HF admissions associated with admission TTE examinations over a 4-year period (5,337 men; mean age, 67.6 ± 15.0 years). Patients were grouped into those who received UEAs (contrast group) in the first TTE study and those who did not (noncontrast group). Repeat TTE examinations were classified as justified if performed for concrete clinical indications during hospitalization. RESULTS: Of the 9,115 admissions for HF (5,600 in the contrast group, 3,515 in the noncontrast group), 927 patients underwent repeat TTE studies (505 in the contrast group, 422 in the noncontrast group), which were considered justified in 823 patients. Of the 104 patients who underwent unjustified repeat TTE studies, 80 (76.7%) belonged to the noncontrast group and 24 to the contrast group. UEA use increased from 50.4% in 2014 to 74.3%, and the rate of unjustified repeat studies decreased from 1.3% to 0.9%. The rates of unjustified repeat TTE imaging were 2.3% and 0.4% in the noncontrast and contrast groups, respectively, and patients in the contrast group were less likely to undergo unjustified repeat examinations (odds ratio, 0.18; 95% CI, 0.12-0.29; P < .0001). The mean LOS was significantly lower in the contrast group (9.5 ± 10.5 vs 11.1 ± 13.7 days). The use of UEA in the first TTE study was also associated with reduced LOS (linear regression, ß1 = -0.47, P = .036), with 20% lower odds of prolonged (>6 days) LOS. CONCLUSIONS: The routine use of UEA in the first TTE examination for HF, irrespective of image quality, is associated with reduced unjustified repeat TTE testing and may reduce LOS during an index HF admission.


Subject(s)
Echocardiography, Heart Failure, Aged, Aged 80 and over, Heart Failure/diagnostic imaging, Heart Failure/epidemiology, Hospitalization, Humans, Middle Aged, Retrospective Studies, Ultrasonography
17.
Drug Deliv Transl Res ; 11(6): 2328-2343, 2021 12.
Article in English | MEDLINE | ID: mdl-34165731

ABSTRACT

Lymph nodes (LNs) are tissues of the immune system that house leukocytes, making them targets of interest for a variety of therapeutic immunomodulation applications. However, achieving accumulation of a therapeutic in the LN does not guarantee equal access to all leukocyte subsets. LNs are structured to enable sampling of lymph draining from peripheral tissues in a highly spatiotemporally regulated fashion in order to facilitate optimal adaptive immune responses. This structure results in restricted nanoscale drug delivery carrier access to specific leukocyte targets within the LN parenchyma. Herein, a framework is presented to assess the manner in which lymph-derived macromolecules and particles are sampled in the LN to reveal new insights into how therapeutic strategies or drug delivery systems may be designed to improve access to dLN-resident leukocytes. This summary analysis of previous reports from our group assesses model nanoscale fluorescent tracer association with various leukocyte populations across relevant time periods post administration, studies the effects of bioactive molecule NO on access of lymph-borne solutes to dLN leukocytes, and illustrates the benefits to leukocyte access afforded by lymphatic-targeted multistage drug delivery systems. Results reveal trends consistent with the consensus view of how lymph is sampled by LN leukocytes resulting from tissue structural barriers that regulate inter-LN transport and demonstrate how novel, engineered delivery systems may be designed to overcome these barriers to unlock the therapeutic potential of LN-resident cells as drug delivery targets.


Subject(s)
Lymphatic Vessels, Drug Carriers, Drug Delivery Systems, Leukocytes, Lymph Nodes
18.
J Appl Stat ; 48(1): 154-175, 2021.
Article in English | MEDLINE | ID: mdl-34113056

ABSTRACT

Sepsis is one of the biggest risks to patient safety, with a natural mortality rate between 25% and 50%. It is difficult to diagnose, and no validated standard for diagnosis currently exists. A commonly used scoring criterion is the quick sequential organ failure assessment (qSOFA); however, it demonstrates very low specificity in ICU populations. We develop a method to personalize the thresholds in qSOFA that incorporates easily measured patient baseline characteristics. We compare the personalized threshold method to qSOFA, to five previously published methods that obtain an optimal constant threshold for a single biomarker, and to machine learning algorithms based on logistic regression and AdaBoost, using patient data in the MIMIC-III database. The personalized threshold method achieves higher accuracy than qSOFA and the five published methods and has performance comparable to the machine learning methods. Personalized thresholds, however, are much easier to adopt in real-life monitoring than machine learning methods: they are computed once for a patient and then used in the same way as qSOFA, whereas the machine learning methods are hard to implement and interpret.
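For reference, the standard qSOFA score that the method builds on can be written as below; the personalization idea would replace the fixed cutoffs with baseline-adjusted ones. The parameter names are assumptions for illustration, not the paper's notation.

```python
def qsofa(resp_rate, sys_bp, gcs, rr_thresh=22, sbp_thresh=100):
    """Standard qSOFA: one point each for respiratory rate >= 22/min,
    systolic blood pressure <= 100 mmHg, and altered mentation
    (Glasgow Coma Scale < 15); a score >= 2 flags possible sepsis.
    Personalized thresholds would pass patient-baseline-adjusted
    values for rr_thresh and sbp_thresh instead of the fixed cutoffs."""
    score = 0
    score += resp_rate >= rr_thresh
    score += sys_bp <= sbp_thresh
    score += gcs < 15
    return score
```

Because the personalized thresholds are computed once per patient, bedside use is unchanged: the same three measurements are compared against (now patient-specific) cutoffs.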

19.
Neurosurgery ; 88(3): 574-583, 2021 02 16.
Article in English | MEDLINE | ID: mdl-33313810

ABSTRACT

BACKGROUND: Aneurysmal subarachnoid hemorrhage (aSAH) is associated with disproportionately high mortality and long-term neurological sequelae. Management of patients with aSAH has changed markedly over the years, leading to improvements in outcome. OBJECTIVE: To describe trends in aSAH care and outcome in a high-volume single-center 15-year cohort. METHODS: All new admissions diagnosed with subarachnoid hemorrhage (SAH) to our tertiary neuro-intensive care unit between 2002 and 2016 were reviewed. Trend analysis was performed to assess temporal changes, and a step-wise regression analysis was done to identify factors associated with outcomes. RESULTS: Out of 3970 admissions of patients with SAH, 2475 patients proved to have a ruptured intracranial aneurysm. Over the years of the study, patient acuity increased by Hunt & Hess (H&H) grade, as did related complications. Endovascular therapies became more prevalent over the years and were correlated with better outcome. Functional outcome improved overall, yet the main effect was noted in the low- and intermediate-grade patients. Several parameters were associated with poor functional outcome, including long-term mechanical ventilation (odds ratio 11.99, 95% CI [7.15-20.63]), acute kidney injury (3.55 [1.64-8.24]), pneumonia (2.89 [1.89-4.42]), hydrocephalus (1.80 [1.24-2.63]), diabetes mellitus (1.71 [1.04-2.84]), seizures (1.69 [1.07-2.70]), H&H grade (1.67 [1.45-1.94]), and age (1.06 [1.05-1.07]), while an endovascular approach to treating the aneurysm, compared with clip-ligation, had a positive effect (0.35 [0.25-0.48]). CONCLUSION: This large, single-referral-center, retrospective analysis reveals important trends in the treatment of aSAH. It also demonstrates that despite improvement in functional outcome over the years, systemic complications remain a significant risk factor for poor prognosis. The historic H&H determination of outcome is less valid with today's improved care.


Subject(s)
Ruptured Aneurysm/surgery, Endovascular Procedures/trends, Intensive Care Units/trends, Intracranial Aneurysm/surgery, Subarachnoid Hemorrhage/surgery, Adult, Aged, Ruptured Aneurysm/diagnostic imaging, Cohort Studies, Female, Humans, Intracranial Aneurysm/diagnostic imaging, Male, Middle Aged, Predictive Value of Tests, Retrospective Studies, Risk Factors, Subarachnoid Hemorrhage/diagnostic imaging, Time Factors, Treatment Outcome
20.
Neurocrit Care ; 33(2): 458-467, 2020 10.
Article in English | MEDLINE | ID: mdl-31933216

ABSTRACT

BACKGROUND: Critically ill aneurysmal subarachnoid hemorrhage (aSAH) patients suffer from systemic complications at a high rate. Hyperglycemia is a common intensive care unit (ICU) complication and became a focus after aggressive glucose management was associated with improved ICU outcomes. Subsequent research has suggested that glucose variability, not a specific blood glucose range, may be a more appropriate clinical target. Glucose variability is highly correlated with poor outcomes in a wide spectrum of critically ill patients. Here, we investigate the changes between subsequent glucose values, termed the "inter-measurement difference," as an indicator of glucose variability and its association with outcomes in patients with aSAH. METHODS: All SAH admissions to a single tertiary referral center between 2002 and 2016 were screened. All aneurysmal cases with more than 2 glucose measurements were included (n = 2451). We calculated several measures of variability, including simple variance, the average consecutive absolute change, the average absolute change by time difference, within-subject variance, median absolute deviation, and the average or median consecutive absolute percentage change. Predictor variables also included admission Hunt and Hess grade, age, gender, cardiovascular risk factors, and surgical treatment. In-patient mortality was the main outcome measure. RESULTS: In a multiple regression analysis, nearly all forms of glucose variability were found to be correlated with in-patient mortality. The consecutive absolute percentage change, however, was most predictive: OR 5.2 (95% CI 1.4-19.8) for percentage change and 8.8 (1.8-43.6) for median change, when controlling for the defined predictors. Survival to ICU discharge was associated with lower glucose variability (consecutive absolute percentage change 17% ± 9%) compared with the group that did not survive to discharge (20% ± 15%, p < 0.01). Interestingly, this finding was not significant in patients with pre-admission poorly controlled diabetes as indicated by HbA1c (OR 0.45 [0.04-7.18], by percentage change); the effect is driven mostly by non-diabetic patients and those with well-controlled diabetes. CONCLUSIONS: Reduced glucose variability is highly correlated with in-patient survival and lower long-term mortality in aSAH patients. This finding was observed in non-diabetic and well-controlled diabetic patients, suggesting a possible benefit from personalized glucose targets based on baseline HbA1c and from minimizing variability. The inter-measurement percentage change as an indicator of glucose variability is not only predictive of outcome, but is also an easy-to-use tool that could be implemented in future clinical trials.
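The key metric above, the consecutive absolute percentage change, is straightforward to compute from a patient's glucose series; a minimal sketch (averaging the per-step changes is an assumption about the exact summary used):

```python
def consecutive_pct_change(glucose):
    """Inter-measurement variability: the absolute percentage change
    between each pair of consecutive glucose values, averaged over the
    series. Returns 0.0 for a series with fewer than two values."""
    changes = [abs(b - a) / a for a, b in zip(glucose, glucose[1:])]
    return sum(changes) / len(changes) if changes else 0.0
```

For example, a series drifting 100 → 110 → 99 mg/dL has two steps of 10% each, so the metric is 0.10, in the range the abstract reports for survivors (17% ± 9%) and non-survivors (20% ± 15%).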


Subject(s)
Subarachnoid Hemorrhage, Glucose, Hospital Mortality, Humans, Retrospective Studies, Treatment Outcome