1.
Entropy (Basel) ; 23(6)2021 Jun 16.
Article in English | MEDLINE | ID: mdl-34208771

ABSTRACT

Aims: Bubble entropy (bEn) is an entropy metric with a limited dependence on parameters. bEn does not directly quantify the conditional entropy of the series, but it assesses the change in entropy of the ordering of portions of its samples of length m, when adding an extra element. The analytical formulation of bEn for autoregressive (AR) processes shows that, for this class of processes, the relation between the first autocorrelation coefficient and bEn changes for odd and even values of m. While this is not an issue per se, it triggered ideas for further investigation. Methods: Using theoretical considerations on the expected values for AR processes, we examined a two-step-ahead estimator of bEn, which considered the cost of ordering two additional samples. We first compared it with the original bEn estimator on a simulated series. Then, we tested it on real heart rate variability (HRV) data. Results: In the experiments, both examined alternatives exhibited comparable discriminating power. However, for values of 10

2.
Adv Exp Med Biol ; 1194: 457, 2020.
Article in English | MEDLINE | ID: mdl-32468562

ABSTRACT

Cancer research has yielded tremendous gains over the last two decades, with remarkable results addressing this major worldwide public health problem. Continuous technological developments and persistent research have led to significant progress in targeted therapies. This paper focuses on the study of mathematical models that best describe the development of malignant tumours induced in experimental animals of a particular species following chemical carcinogenesis with a complete carcinogen known as 3,4-benzopyrene. The purpose of this work is to study the phenomenon of chemical carcinogenesis, and the inhibition and growth of malignant tumours.


Subject(s)
Carcinogenesis, Computer Simulation, Models, Biological, Neoplasms, Animals, Carcinogenesis/pathology, Carcinogens, Disease Models, Animal, Neoplasms/chemically induced, Neoplasms/physiopathology, Neoplasms/prevention & control
3.
Entropy (Basel) ; 20(1)2018 Jan 13.
Article in English | MEDLINE | ID: mdl-33265148

ABSTRACT

Sample Entropy is the most popular definition of entropy and is widely used as a measure of the regularity/complexity of a time series. On the other hand, it is a computationally expensive method which may require a large amount of time when applied to long series or to a large number of signals. The computationally intensive part is the similarity check between points in m-dimensional space. In this paper, we propose new algorithms, or extend already proposed ones, aiming to compute Sample Entropy quickly. All algorithms return exactly the same value for Sample Entropy, and no approximation techniques are used. We compare and evaluate them using cardiac inter-beat (RR) time series. We investigate three algorithms. The first is an extension of the k-d trees algorithm, customized for Sample Entropy. The second is an extension of an algorithm initially proposed for Approximate Entropy, again customized for Sample Entropy, but also improved to deliver even faster results. The last is a completely new algorithm, presenting the fastest execution times for specific values of m, r, time series length, and signal characteristics. These algorithms are compared with the straightforward implementation, directly resulting from the definition of Sample Entropy, in order to give a clear picture of the speedups achieved. All algorithms assume the classical approach to the metric, in which the maximum norm is used. The key idea of the last two suggested algorithms is to avoid unnecessary comparisons by detecting them early. We use the term unnecessary to refer to those comparisons for which we know a priori that they will fail the similarity check. The number of avoided comparisons proves to be very large, resulting in a correspondingly large reduction in execution time, making these the fastest algorithms available today for the computation of Sample Entropy.
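The straightforward implementation named above, directly from the definition (maximum norm, self-matches excluded), can be sketched as follows; treating r as an absolute tolerance rather than a fraction of the standard deviation is an assumption of this sketch:

```python
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    """O(n^2) Sample Entropy from the definition: count template pairs
    of length m (B) and m+1 (A) whose Chebyshev distance is <= r,
    excluding self-matches, and return -ln(A/B)."""
    x = np.asarray(x, dtype=float)
    n = len(x)

    def count(mm):
        c = 0
        # the same n - m starting positions are used for both lengths,
        # so the two counts are directly comparable
        for i in range(n - m):
            for j in range(i + 1, n - m):
                if np.max(np.abs(x[i:i + mm] - x[j:j + mm])) <= r:
                    c += 1
        return c

    b = count(m)        # matches of length m
    a = count(m + 1)    # matches of length m + 1 (a subset of b)
    return -np.log(a / b)
```

Since every length-(m+1) match is also a length-m match, A ≤ B and the result is non-negative; the faster algorithms in the paper reproduce this value exactly while skipping comparisons known to fail.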

4.
Ann Noninvasive Electrocardiol ; 21(5): 508-18, 2016 Sep.
Article in English | MEDLINE | ID: mdl-27038287

ABSTRACT

BACKGROUND: Deceleration capacity (DC) of heart rate has proved an independent mortality predictor in post-myocardial infarction patients. The original method (DCorig) may produce negative values (9% in our analyzed sample). We aimed to improve the method and to investigate whether DC also predicts arrhythmic mortality. METHODS: Time series from 221 heart failure patients were analyzed with DCorig and a new variant, DCsgn, in which decelerations are characterized based on windows of four consecutive beats rather than on anchors. After 41.2 months, 69 patients experienced sudden cardiac death (SCD) surrogate end points, while 61 died. RESULTS: (SCD+ vs. SCD- group) DCorig: 3.7 ± 1.6 ms versus 4.6 ± 2.6 ms (P = 0.020) and DCsgn: 4.9 ± 1.7 ms versus 6.1 ± 2.2 ms (P < 0.001). After Cox regression (gender, age, left ventricular ejection fraction, filtered QRS, NSVT≥1/24h, VPBs≥240/24h, mean 24-h QTc, with each DC index added to the model separately), DCsgn (continuous) was an independent SCD predictor (hazard ratio [H.R.]: 0.742, 95% confidence interval [C.I.]: 0.631-0.871, P < 0.001). DCsgn ≤ 5.373 (dichotomous) presented an H.R. of 1.815 for SCD (95% C.I.: 1.080-3.049, P = 0.024); areas under the receiver operating characteristic (ROC) curves (AUC): 0.62 (DCorig) and 0.66 (DCsgn), P = 0.190 (chi-square). Results for the deceased versus alive group: DCorig: 3.2 ± 2.0 ms versus 4.8 ± 2.4 ms (P < 0.001) and DCsgn: 4.6 ± 1.4 ms versus 6.2 ± 2.2 ms (P < 0.001). In Cox regression, DCsgn (continuous) presented an H.R. of 0.686 (95% C.I.: 0.546-0.862, P = 0.001) and DCsgn ≤ 5.373 (dichotomous) presented an H.R. of 2.443 for total mortality (TM) (95% C.I.: 1.269-4.703, P = 0.008). AUC/ROC: 0.71 (DCorig) and 0.73 (DCsgn), P = 0.402. CONCLUSIONS: DC predicts both SCD and TM. DCsgn avoids negative values, improving the method, albeit not at a statistically significant level.
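The anchor-based computation behind DCorig (phase-rectified signal averaging, as introduced by Bauer et al.) can be sketched roughly as follows; the simple anchor criterion and the absence of artifact filtering are simplifying assumptions of this sketch, and the paper's DCsgn variant replaces the anchors with four-beat windows:

```python
import numpy as np

def deceleration_capacity(rr):
    """Sketch of anchor-based DC. rr: RR intervals in ms.

    Anchors are beats longer than their predecessor (decelerations).
    Four-beat segments around each anchor are averaged (PRSA) and DC is
    (X(0) + X(1) - X(-1) - X(-2)) / 4, which can come out negative --
    the behaviour the DCsgn variant is designed to avoid."""
    rr = np.asarray(rr, dtype=float)
    segs = []
    for i in range(2, len(rr) - 1):
        if rr[i] > rr[i - 1]:                # deceleration anchor
            segs.append(rr[i - 2:i + 2])     # [X(-2), X(-1), X(0), X(1)]
    X = np.mean(segs, axis=0)                # phase-rectified average
    return (X[2] + X[3] - X[0] - X[1]) / 4.0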


Subject(s)
Arrhythmias, Cardiac/mortality, Arrhythmias, Cardiac/physiopathology, Heart Failure/mortality, Heart Failure/physiopathology, Heart Rate Determination/methods, Heart Rate/physiology, Aged, Death, Sudden, Cardiac, Deceleration, Echocardiography, Electrocardiography, Female, Humans, Male, Middle Aged, Prospective Studies
5.
Bioengineering (Basel) ; 11(2)2024 Feb 11.
Article in English | MEDLINE | ID: mdl-38391661

ABSTRACT

The objective of this study was to evaluate the effectiveness of machine learning classification techniques applied to nerve conduction studies (NCS) of motor and sensory signals for the automatic diagnosis of carpal tunnel syndrome (CTS). Two methodologies were tested. In the first methodology, motor signals recorded from the patients' median nerve were transformed into time-frequency spectrograms using the short-time Fourier transform (STFT). These spectrograms were then used as input to a deep two-dimensional convolutional neural network (CONV2D) for classification into two categories: patients and controls. In the second methodology, sensory signals from the patients' median and ulnar nerves were subjected to multilevel wavelet decomposition (MWD), and statistical and non-statistical features were extracted from the decomposed signals. These features were utilized to train and test classifiers. The classification target was set to three categories: normal subjects (controls), patients with mild CTS, and patients with moderate to severe CTS, based on conventional electrodiagnosis results. The results of the classification analysis demonstrated that both methodologies surpassed previous attempts at automatic CTS diagnosis. The classification models utilizing the motor signals transformed into time-frequency spectrograms exhibited excellent performance, with an average accuracy of 94%. Similarly, the classifiers based on the sensory signals and the features extracted from multilevel wavelet decomposition showed significant accuracy in distinguishing between controls, patients with mild CTS, and patients with moderate to severe CTS, with an accuracy of 97.1%. The findings highlight the efficacy of incorporating machine learning algorithms into the diagnostic processes of NCS, providing a valuable tool for clinicians in the diagnosis and management of neuropathies such as CTS.

6.
Bioengineering (Basel) ; 9(12)2022 Dec 14.
Article in English | MEDLINE | ID: mdl-36551006

ABSTRACT

Even though non-steroidal anti-inflammatory drugs are the most effective treatment for inflammatory conditions, they have been linked to negative side effects. A promising approach to mitigating potential risks is the development of new compounds able to combine anti-inflammatory with antioxidant activity, to enhance activity and reduce toxicity. The implication of reactive oxygen species in inflammatory conditions has been extensively studied, based on the pro-inflammatory properties of generated free radicals. Drugs with dual activity (i.e., inhibiting inflammation-related enzymes, e.g., LOX-3, and scavenging free radicals, e.g., DPPH) could find various therapeutic applications, such as in cardiovascular or neurodegenerative disorders. The challenge we embarked on using deep learning was the creation of appropriate classification and regression models to discriminate pharmacological activity and selectivity, as well as to discover future compounds with dual activity prior to synthesis. An accurate filter algorithm was established, based on knowledge from compounds already evaluated in vitro, that can separate compounds with low, moderate or high activity. In this study, we constructed a customized, highly effective one-dimensional convolutional neural network (CONV1D), with accuracy scores up to 95.2%, that was able to identify dual-active compounds, being LOX-3 inhibitors and DPPH scavengers, as an indication of simultaneous anti-inflammatory and antioxidant activity. Additionally, we created a highly accurate regression model that predicted the exact effectiveness value of a set of recently synthesized compounds with anti-inflammatory activity, scoring a root mean square error of 0.8. Eventually, we succeeded in observing how those newly synthesized compounds differentiate from each other with respect to a specific pharmacological target, using deep learning algorithms.

7.
Bioengineering (Basel) ; 8(11)2021 Nov 10.
Article in English | MEDLINE | ID: mdl-34821747

ABSTRACT

Recent literature has revealed a long discussion about the importance and necessity of nerve conduction studies in carpal tunnel syndrome management. The purpose of this study was to investigate the possibility of automatic detection of median nerve mononeuropathy, based on electrodiagnostic features, and of decision making about carpal tunnel syndrome. The study included 38 volunteers, examined prospectively. Detection was based on common electrodiagnostic criteria used in everyday clinical practice, as well as on new features selected on physiological and mathematical grounds. Machine learning techniques were used to combine the examined characteristics for a stable and accurate diagnosis. Automatic electrodiagnosis reached an accuracy of 95% compared to the standard neurophysiological diagnosis of the physicians with nerve conduction studies, and 89% compared to the clinical diagnosis. The results show that the automatic detection of carpal tunnel syndrome is possible and can be employed in decision making, excluding human error. It is also shown that the novel features investigated can be used for the detection of the syndrome, complementary to the commonly used ones, increasing the accuracy of the method.

8.
J Biomed Inform ; 43(2): 307-20, 2010 Apr.
Article in English | MEDLINE | ID: mdl-19883796

ABSTRACT

The aim of this work is to present an automated method that assists in the diagnosis of Alzheimer's disease and also supports monitoring of the progression of the disease. The method is based on features extracted from the data acquired during an fMRI experiment. It consists of six stages: (a) preprocessing of the fMRI data, (b) modeling of the fMRI voxel time series using a Generalized Linear Model, (c) feature extraction from the fMRI data, (d) feature selection, (e) classification using classical and improved variations of the Random Forests algorithm and Support Vector Machines, and (f) conversion of the trees of the Random Forest to rules which have physical meaning. The method is evaluated using a dataset of 41 subjects. The results indicate the validity of the method in the diagnosis (accuracy 94%) and monitoring of Alzheimer's disease (accuracy 97% and 99%).


Subject(s)
Alzheimer Disease/diagnosis, Decision Trees, Linear Models, Magnetic Resonance Imaging/methods, Adolescent, Aged, Aged, 80 and over, Algorithms, Alzheimer Disease/classification, Databases, Factual, Disease Progression, Female, Humans, Male, Reproducibility of Results, Young Adult
9.
Stud Health Technol Inform ; 273: 255-257, 2020 Sep 04.
Article in English | MEDLINE | ID: mdl-33087622

ABSTRACT

Smart devices, including the popular smart watches, often collect information on the heart beat rhythm and transmit it to a central server for storage or further processing. A factor introducing important limitations on the amount of data collected, transmitted and finally processed is the battery life of the mobile device or smart watch. Some devices choose to transmit the mean heart rate over relatively long periods of time to save power. Heart Rate Variability (HRV) analysis gives useful information about the human heart by examining only the heart rate time series. Its discriminating capability is affected by the amount of information available to process. Ideally, the whole RR interval time series should be used. We investigate here how this discriminating capability is affected when the analysis is based on mean heart rate values transmitted over relatively long time periods. We show that we can still obtain useful information, and the discriminating power remains remarkable, even when the amount of available data is relatively small.


Subject(s)
Heart, Heart Rate, Humans
10.
Comput Methods Programs Biomed ; 91(1): 48-54, 2008 Jul.
Article in English | MEDLINE | ID: mdl-18423927

ABSTRACT

The approximate entropy (ApEn) is a measure of system complexity. The implementation of the method is computationally expensive and requires execution time proportional to the square of the size of the input signal. We propose here a fast algorithm which speeds up the computation of approximate entropy by detecting early those vectors that are not similar and excluding them from the similarity test. Experimental analysis with various biomedical signals revealed a significant improvement in execution times.
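A definition-based sketch in which the inner similarity test abandons a vector pair at the first coordinate exceeding r illustrates the kind of comparison such a speedup avoids; this is an illustration of the early-rejection idea, not the authors' algorithm itself:

```python
import numpy as np

def apen_fast(x, m=2, r=0.2):
    """Approximate Entropy (definition-based sketch).

    Under the max norm, two vectors fail the similarity test as soon as
    one coordinate pair differs by more than r, so the inner comparison
    bails out early instead of checking all m coordinates."""
    x = np.asarray(x, dtype=float)
    n = len(x)

    def phi(mm):
        total = 0.0
        for i in range(n - mm + 1):
            c = 0
            for j in range(n - mm + 1):
                for k in range(mm):
                    if abs(x[i + k] - x[j + k]) > r:
                        break          # early rejection of this pair
                else:
                    c += 1             # all mm coordinates within r
            total += np.log(c / (n - mm + 1))
        return total / (n - mm + 1)

    return phi(m) - phi(m + 1)
```

For a perfectly regular (constant) signal every template matches every other, so both phi terms are zero and ApEn is 0.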


Subject(s)
Algorithms, Diagnosis, Computer-Assisted/methods, Entropy, Models, Biological, Models, Statistical, Signal Processing, Computer-Assisted, Computer Simulation, Data Interpretation, Statistical, Time Factors
11.
Comput Biol Med ; 37(5): 642-54, 2007 May.
Article in English | MEDLINE | ID: mdl-16904097

ABSTRACT

The goal of this paper is to examine the classification capabilities of various prediction and approximation methods and to suggest which are most likely to be suitable for the clinical setting. Various prediction and approximation methods are applied in order to detect and extract those which provide the best differentiation between control and patient data, as well as between members of different age groups. The prediction methods are local linear prediction, local exponential prediction, the delay times method, autoregressive prediction and neural networks. Approximation is computed with local linear approximation, least squares approximation, neural networks and the wavelet transform. These methods are chosen since each has a different physical basis and thus extracts and uses time series information in a different way.


Subject(s)
Heart Rate/physiology, Adult, Age Factors, Aged, Coronary Disease/physiopathology, Electrocardiography/classification, Electrocardiography/statistics & numerical data, Electrocardiography/trends, Electrocardiography, Ambulatory/classification, Electrocardiography, Ambulatory/statistics & numerical data, Electrocardiography, Ambulatory/trends, Female, Forecasting, Fourier Analysis, Heart Failure/physiopathology, Humans, Least-Squares Analysis, Linear Models, Male, Middle Aged, Neural Networks, Computer, Time Factors
12.
IEEE Trans Biomed Eng ; 64(11): 2711-2718, 2017 11.
Article in English | MEDLINE | ID: mdl-28182552

ABSTRACT

Objective: A critical point in any definition of entropy is the selection of the parameters employed to obtain an estimate in practice. We propose a new definition of entropy aiming to reduce the significance of this selection. Methods: We call the new definition Bubble Entropy. Bubble Entropy is based on permutation entropy, where the vectors in the embedding space are ranked. We use the bubble sort algorithm for the ordering procedure and count instead the number of swaps performed for each vector. Doing so, we create a more coarse-grained distribution and then compute the entropy of this distribution. Results: Experimental results with both real and synthetic HRV signals showed that bubble entropy presents remarkable stability and exhibits increased descriptive and discriminating power compared to all other definitions, including the most popular ones. Conclusion: The definition proposed is almost free of parameters. The most common ones are the scale factor r and the embedding dimension m. In our definition, the scale factor is totally eliminated and the importance of m is significantly reduced. The proposed method presents increased stability and discriminating power. Significance: After the extensive use of some entropy measures in physiological signals, typical values for their parameters have been suggested, or at least, widely used. However, the parameters are still there, application and dataset dependent, influencing the computed value and affecting the descriptive power. Reducing their significance or eliminating them alleviates the problem, decoupling the method from the data and the application, and eliminating subjective factors.
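The swap-counting idea can be sketched as follows; the Rényi-2 entropy of the swap distribution and the normalization by log((m+1)/(m-1)) reflect one reading of the published definition and should be treated as assumptions of this sketch:

```python
import math
from collections import Counter

def bubble_count(v):
    """Number of swaps bubble sort performs to order v (its inversion count)."""
    v = list(v)
    swaps = 0
    for i in range(len(v)):
        for j in range(len(v) - 1 - i):
            if v[j] > v[j + 1]:
                v[j], v[j + 1] = v[j + 1], v[j]
                swaps += 1
    return swaps

def bubble_entropy(x, m=10):
    """Bubble Entropy sketch: for each embedded vector count the bubble-sort
    swaps, take the entropy of the resulting coarse-grained distribution,
    and compare embedding dimensions m and m+1."""
    def h(mm):
        n = len(x) - mm + 1
        dist = Counter(bubble_count(x[i:i + mm]) for i in range(n))
        # Renyi entropy of order 2 of the swap-count distribution
        return -math.log(sum((c / n) ** 2 for c in dist.values()))
    return (h(m + 1) - h(m)) / math.log((m + 1) / (m - 1))
```

Note that no tolerance r appears anywhere: only the ordering of samples matters, which is the sense in which the definition is "almost free of parameters".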


Subject(s)
Computer Simulation, Entropy, Signal Processing, Computer-Assisted, Adult, Aged, Algorithms, Electrocardiography, Heart Failure/physiopathology, Heart Rate/physiology, Humans, Middle Aged
13.
Vasc Endovascular Surg ; 40(2): 95-101, 2006.
Article in English | MEDLINE | ID: mdl-16598356

ABSTRACT

The authors reviewed a 2-year experience with abdominal aortic aneurysm (AAA) repair to determine if patients who were excluded from endovascular aneurysm repair (EVAR) because of anatomic criteria (Group III) represented a higher risk for subsequent open aneurysm repair than either patients undergoing EVAR (Group II) or those patients who preferentially underwent open repair (Group I). Between January 2001 and December 2003, 107 patients underwent AAA repair. Open repair was recommended in patients <70 years of age and without significant comorbidities (Group I). There were 35 patients in Group I; 72 patients were evaluated for EVAR; 29 patients underwent EVAR (Group II), and 43 were excluded and underwent open repair (Group III). Exclusion criteria were those recommended by the graft manufacturers. Patients in Group I were significantly younger than those in Groups II and III (p < 0.0001). Gender, incidence of diabetes, and hypertension were similar in all groups. Patients in Group III had a greater incidence of coronary artery disease (CAD) than those in Groups I and II, trending toward statistical significance (p = 0.06). Aneurysm size in Group II was statistically smaller than in Group I or III. Group III had significantly more complications (25.6% vs 5.7% and 6.9%) than either Group I or II (p < 0.015). Cardiac complications were similar in all groups. Three patients in Group III required prolonged intubation and 3 in Group III developed renal insufficiency. A history of CAD was predictive of complications (21.8% vs 5.8%, p < 0.024), as was inclusion in Group III. There were 2 deaths in this series, both in Group III. Length of stay was significantly less in Group II (4.17 ± 2.36 days) than in Group I (6.57 ± 1.84 days) or Group III (12.30 ± 9.82 days) (p = 0.0001). Open aneurysm repair can be safely performed in younger good-risk patients (Group I) with results equivalent to EVAR (Group II) but with slightly longer length of stay (LOS). In older patients with suitable anatomy, EVAR can be performed with minimal morbidity and short LOS. Older patients not suitable for EVAR (Group III) constitute a higher-risk group because of the increased incidence of CAD and the need for more complex repairs. However, the mortality rate in this group was only 4.6%.


Subject(s)
Aortic Aneurysm, Abdominal/surgery, Catheterization/adverse effects, Coronary Artery Disease/surgery, Extremities/blood supply, Ischemia/etiology, Myocardial Infarction/etiology, Patient Selection, Postoperative Complications/etiology, Renal Insufficiency/etiology, Vascular Surgical Procedures/adverse effects, Aged, Aged, 80 and over, Aortic Aneurysm, Abdominal/complications, Aortic Aneurysm, Abdominal/mortality, Aortic Aneurysm, Abdominal/therapy, Catheterization/instrumentation, Coronary Artery Disease/complications, Coronary Artery Disease/mortality, Coronary Artery Disease/therapy, Elective Surgical Procedures/adverse effects, Female, Humans, Ischemia/epidemiology, Ischemia/mortality, Length of Stay, Male, Middle Aged, Morbidity, Myocardial Infarction/epidemiology, Myocardial Infarction/mortality, Postoperative Complications/epidemiology, Postoperative Complications/mortality, Renal Insufficiency/epidemiology, Renal Insufficiency/mortality, Retrospective Studies, Risk Factors, Stents
14.
Article in English | MEDLINE | ID: mdl-26737009

ABSTRACT

R-R interval signals that come from different subjects are regularly aligned and averaged according to the horological starting time of the recordings. We argue that the horological time is a faulty alignment criterion and provide evidence in the form of a new alignment method. Our main motivation is that the human heart rate (HR) rhythm follows a circadian cycle, whose pattern can vary among different classes of people. We propose two novel alignment algorithms that consider the HR circadian rhythm, the Puzzle Piece Alignment Algorithm (PPA) and the Event Based Alignment Algorithm (EBA). First, we convert the R-R interval signal into a series of time windows and compute the mean HR per window. Then our algorithms search for matching circadian patterns to align the signals. We conduct experiments using R-R interval signals extracted from two databases in the Physionet Data Bank. Both algorithms are able to align the signals with respect to the circadian rhythmicity of HR. Furthermore, our findings confirm the presence of more than one pattern in the circadian HR rhythm. We suggest an automatic classification of signals according to the three most prominent patterns.
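The windowing step both algorithms share (converting the R-R interval signal into a series of time windows with a mean HR per window) can be sketched as follows; the window length and units are assumptions of this sketch:

```python
import numpy as np

def mean_hr_windows(rr_ms, window_s=300.0):
    """Convert an R-R interval series (ms) into mean heart rate (bpm)
    per fixed-length time window -- the preprocessing step before the
    circadian-pattern matching."""
    t = np.cumsum(rr_ms) / 1000.0                 # beat times in seconds
    edges = np.arange(0.0, t[-1] + window_s, window_s)
    means = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        in_win = rr_ms[(t > lo) & (t <= hi)]      # beats in this window
        if len(in_win):
            means.append(60000.0 / in_win.mean())  # mean RR (ms) -> bpm
    return np.array(means)
```

The alignment algorithms then shift these per-window sequences against each other to match circadian patterns, instead of trusting the recordings' wall-clock start times.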


Subject(s)
Circadian Rhythm/physiology, Heart Rate/physiology, Adult, Aged, Algorithms, Automation, Data Interpretation, Statistical, Databases, Factual, Female, Humans, Male, Middle Aged, Pattern Recognition, Automated, Signal Processing, Computer-Assisted
15.
IEEE Trans Inf Technol Biomed ; 16(4): 615-22, 2012 Jul.
Article in English | MEDLINE | ID: mdl-22106154

ABSTRACT

The accurate diagnosis of diseases with a high prevalence rate, such as Alzheimer's, Parkinson's, diabetes, breast cancer, and heart diseases, is one of the most important biomedical problems, whose management is imperative. In this paper, we present a new method for the automated diagnosis of diseases based on an improvement of the random forests classification algorithm. More specifically, the dynamic determination of the optimum number of base classifiers composing the random forest is addressed. The proposed method differs from most of the methods reported in the literature, which follow an overproduce-and-choose strategy, where the members of the ensemble are selected from a pool of classifiers known a priori. In our case, the number of classifiers is determined during the growing procedure of the forest. Additionally, the proposed method produces an ensemble that is not only accurate but also diverse, ensuring the two important properties that should characterize an ensemble classifier. The method is based on an online fitting procedure and is evaluated using eight biomedical datasets and five versions of the random forests algorithm (40 cases). The method decided the number of trees correctly in 90% of the test cases.


Subject(s)
Algorithms, Decision Trees, Diagnosis, Computer-Assisted, Disease/classification, Databases, Factual, Humans
16.
Injury ; 43(9): 1355-61, 2012 Sep.
Article in English | MEDLINE | ID: mdl-22560130

ABSTRACT

Despite the establishment of evidence-based guidelines for the resuscitation of critically injured patients who have sustained cardiopulmonary arrest, rapid decisions regarding patient salvageability in these situations remain difficult even for experienced physicians. Regardless, survival is limited after traumatic cardiopulmonary arrest. One applicable, well-described resuscitative technique is the emergency department thoracotomy-a procedure that, when applied correctly, is effective in saving small but significant numbers of critically injured patients. By understanding the indications, technical details, and predictors of survival along with the inherent risks and costs of emergency department thoracotomy, the physician is better equipped to make rapid futile versus salvageable decisions for this most severely injured subset of patients.


Subject(s)
Cardiopulmonary Resuscitation/statistics & numerical data, Emergency Medical Services, Heart Arrest/surgery, Thoracotomy/statistics & numerical data, Wounds, Nonpenetrating/surgery, Wounds, Penetrating/surgery, Adolescent, Adult, Cardiopulmonary Resuscitation/methods, Child, Child, Preschool, Female, Heart Arrest/etiology, Humans, Infant, Male, Practice Guidelines as Topic, Thoracotomy/methods, United States/epidemiology, Wounds, Nonpenetrating/complications, Wounds, Penetrating/complications, Young Adult
17.
J Trauma Acute Care Surg ; 73(3): 625-8, 2012 Sep.
Article in English | MEDLINE | ID: mdl-22929493

ABSTRACT

BACKGROUND: Venous thromboembolism (VTE) is a significant risk in trauma patients. Although low-molecular weight heparin (LMWH) is effective in VTE prophylaxis, its use for patients with traumatic intracranial hemorrhage remains controversial. The purpose of this study was to evaluate the safety of LMWH for VTE prophylaxis in blunt intracranial injury. METHODS: We conducted a retrospective multicenter study of LMWH chemoprophylaxis on patients with intracranial hemorrhage caused by blunt trauma. Patients with brain Abbreviated Injury Scale score of 3 or higher, age 18 years or older, and at least one repeated head computed tomographic scan were included. Patients with previous VTE; on preinjury anticoagulation; hospitalized for less than 48 hours; on heparin for VTE prophylaxis; or required emergent thoracic, abdominal, or vascular surgery at admission were excluded. Patients were divided into two groups: those who received LMWH and those who did not. The primary outcome was progression of intracranial hemorrhage on repeated head computed tomographic scan. RESULTS: The study included 1,215 patients, of which 220 patients (18.1%) received LMWH and 995 (81.9%) did not. Hemorrhage progression occurred in 239 of 995 control subjects and 93 of 220 LMWH patients (24% vs. 42%, p < 0.001). Hemorrhage progression occurred in 32 patients after initiating LMWH (14.5%). Nine of these patients (4.1%) required neurosurgical intervention for hemorrhage progression. CONCLUSION: Patients receiving LMWH were at higher risk for hemorrhage progression. We were unable to demonstrate safety of LMWH for VTE prophylaxis in patients with brain injury. The risk of using LMWH may exceed its benefit. LEVEL OF EVIDENCE: Therapeutic study, level IV.


Subject(s)
Brain Injuries/complications, Hemorrhage/chemically induced, Heparin, Low-Molecular-Weight/administration & dosage, Venous Thromboembolism/prevention & control, Adult, Aged, Anticoagulants/administration & dosage, Anticoagulants/adverse effects, Brain Injuries/diagnosis, Brain Injuries/therapy, Case-Control Studies, Female, Follow-Up Studies, Hemorrhage/epidemiology, Heparin, Low-Molecular-Weight/adverse effects, Hospital Mortality, Humans, Incidence, Injury Severity Score, Male, Middle Aged, Primary Prevention/methods, Reference Values, Retrospective Studies, Risk Assessment, Safety Management, Societies, Medical, Survival Analysis, Trauma Centers, Treatment Outcome, Venous Thromboembolism/epidemiology, Venous Thromboembolism/etiology, Wounds, Nonpenetrating/complications, Wounds, Nonpenetrating/diagnosis, Wounds, Nonpenetrating/therapy
18.
IEEE Trans Inf Technol Biomed ; 13(4): 512-8, 2009 Jul.
Article in English | MEDLINE | ID: mdl-19273030

ABSTRACT

In this study, heartbeat time series are classified using support vector machines (SVMs). Statistical methods and signal analysis techniques are used to extract features from the signals. The SVM classifier compares favorably to other neural network-based classification approaches under leave-one-out cross validation. The performance of the SVM with respect to other state-of-the-art classifiers is also confirmed by the classification of signals with a very low signal-to-noise ratio. Finally, the influence of the number of features on the classification rate was also investigated for two real datasets. The first dataset consists of long-term ECG recordings of young and elderly healthy subjects. The second consists of long-term ECG recordings of normal subjects and subjects suffering from coronary artery disease.
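The leave-one-out protocol can be sketched generically; the nearest-centroid classifier below is a hypothetical stand-in for the SVM and feature extraction used in the paper:

```python
import numpy as np

def loo_accuracy(X, y, fit_predict):
    """Leave-one-out cross validation: train on all samples but one,
    classify the held-out sample, repeat for every sample."""
    hits = 0
    for i in range(len(X)):
        mask = np.arange(len(X)) != i
        hits += fit_predict(X[mask], y[mask], X[i]) == y[i]
    return hits / len(X)

def nearest_centroid(Xtr, ytr, xte):
    """Stand-in classifier: predict the class whose training-set
    centroid is closest to the test sample."""
    labels = np.unique(ytr)
    d = [np.linalg.norm(xte - Xtr[ytr == c].mean(axis=0)) for c in labels]
    return labels[int(np.argmin(d))]
```

With only tens of long-term recordings per dataset, leave-one-out uses the data maximally while keeping the test sample strictly out of training at each fold.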


Subject(s)
Artificial Intelligence, Heart Rate/physiology, Signal Processing, Computer-Assisted, Adult, Aged, Aged, 80 and over, Algorithms, Coronary Artery Disease/physiopathology, Electrocardiography, Humans, Male, Models, Statistical