1.
Eur J Surg Oncol; 48(6): 1189-1197, 2022 Jun.
Article in English | MEDLINE | ID: mdl-35183411

ABSTRACT

BACKGROUND: Prehabilitation is a promising method to enhance postoperative recovery, especially in patients with cancer. Particularly during times of social distancing, home-based programmes may be a suitable way to increase compliance and effectiveness.
METHODS: In line with the PRISMA guidelines, a systematic review was conducted of trials investigating the effect of home-based prehabilitation (HBP) in patients undergoing surgery for cancer. The primary outcome was postoperative functional capacity (6-min walk test, 6MWT). Secondary outcomes were postoperative complications and compliance.
RESULTS: Five randomized controlled trials were included, with 351 patients undergoing surgery for colorectal, oesophagogastric, bladder and non-small cell lung cancer. Three studies reported significant improvement after eight weeks. The meta-analysis showed a significant improvement of the 6MWT in the prehabilitation group compared with the control group preoperatively (MD 35.06; 95% CI 11.58 to 58.54; p = .003) and, relative to baseline, eight weeks postoperatively (MD 44.91; 95% CI 6.04 to 83.79; p = .02). Compliance ranged from 63% to 83%, with no significant difference between the prehabilitation and control groups. These data must be interpreted with caution because of high heterogeneity and small sample sizes.
DISCUSSION: In conclusion, HBP may enhance the overall functional capacity of patients undergoing oncological surgery compared with standard of care. It could be a promising alternative to hospital-based prehabilitation given the current pandemic and further digitalization in the future. To increase the accessibility and effectiveness of prehabilitation, home-based solutions should be investigated further.
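Pooled mean differences such as those quoted above come from inverse-variance weighting of the individual study estimates. The sketch below is a minimal fixed-effect version (the review itself may well have used a random-effects model; `pooled_md` is a hypothetical helper) that recovers each study's standard error from the width of its 95% CI:

```python
import math

def pooled_md(effects):
    """Fixed-effect inverse-variance pooling of mean differences.

    `effects` is a list of (md, ci_low, ci_high) tuples; the standard
    error is recovered from the 95% CI width (CI = md +/- 1.96 * SE).
    """
    weights, weighted = [], []
    for md, lo, hi in effects:
        se = (hi - lo) / (2 * 1.96)   # half-width of CI divided by 1.96
        w = 1.0 / se ** 2             # inverse-variance weight
        weights.append(w)
        weighted.append(w * md)
    pooled = sum(weighted) / sum(weights)
    se_pooled = math.sqrt(1.0 / sum(weights))
    return pooled, (pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled)
```

With a single study, the pooled estimate and CI simply reproduce that study's values, which is a useful sanity check.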


Subjects
COVID-19; Carcinoma, Non-Small-Cell Lung; Colorectal Neoplasms; Lung Neoplasms; COVID-19/epidemiology; Colorectal Neoplasms/surgery; Communicable Disease Control; Humans; Postoperative Complications/epidemiology; Postoperative Complications/etiology; Preoperative Care/methods
2.
Artif Intell Med; 28(3): 281-306, 2003 Jul.
Article in English | MEDLINE | ID: mdl-12927337

ABSTRACT

In this work, we develop and evaluate several least squares support vector machine (LS-SVM) classifiers within the Bayesian evidence framework in order to preoperatively predict malignancy of ovarian tumors. The analysis includes exploratory data analysis, optimal input variable selection, parameter estimation, and performance evaluation via receiver operating characteristic (ROC) curve analysis. LS-SVM models with linear and radial basis function (RBF) kernels, and logistic regression models, were built on 265 training samples and tested on data from 160 newly collected patients. The LS-SVM model with nonlinear RBF kernel achieves the best performance on the test set, with area under the ROC curve (AUC), sensitivity and specificity equal to 0.92, 81.5% and 84.0%, respectively. The best average performance over 30 runs of randomized cross-validation is also obtained by an LS-SVM RBF model, with AUC, sensitivity and specificity equal to 0.94, 90.0% and 80.6%, respectively. These results show that LS-SVM models have the potential to give a reliable preoperative distinction between benign and malignant ovarian tumors and to assist clinicians in making a correct diagnosis.
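The LS-SVM classifier used here replaces the usual SVM quadratic programme with a single linear system in the dual variables. A minimal sketch with an RBF kernel follows (hypothetical helper names and fixed hyper-parameters; the paper tunes them within the Bayesian evidence framework):

```python
import numpy as np

def rbf_kernel(A, B, sigma=1.0):
    # Gaussian RBF kernel matrix between the rows of A and B
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def lssvm_train(X, y, gamma=10.0, sigma=1.0):
    """Train an LS-SVM classifier: one linear system instead of a QP."""
    n = len(y)
    Omega = np.outer(y, y) * rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = y                       # constraint row: y' alpha = 0
    A[1:, 0] = y
    A[1:, 1:] = Omega + np.eye(n) / gamma   # regularized kernel block
    rhs = np.concatenate(([0.0], np.ones(n)))
    sol = np.linalg.solve(A, rhs)
    return sol[0], sol[1:]             # bias b, dual weights alpha

def lssvm_predict(X, y, alpha, b, Xnew, sigma=1.0):
    # decision value: sum_i alpha_i y_i K(x, x_i) + b
    return np.sign(rbf_kernel(Xnew, X, sigma) @ (alpha * y) + b)
```

The nonlinear RBF kernel lets this model separate classes that no linear boundary can, e.g. an XOR-style layout of four points.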


Subjects
Adnexal Diseases/diagnosis; Bayes Theorem; Computational Biology/methods; Ovarian Neoplasms/diagnosis; Artificial Intelligence; Diagnosis, Computer-Assisted; Diagnosis, Differential; Female; Humans; Logistic Models; Middle Aged; Predictive Value of Tests; ROC Curve; Reproducibility of Results; Sensitivity and Specificity
3.
IEEE Trans Neural Netw; 14(2): 447-50, 2003.
Article in English | MEDLINE | ID: mdl-18238028

ABSTRACT

In this paper, we present a simple and straightforward primal-dual support vector machine formulation of the problem of principal component analysis (PCA) in dual variables. By considering a mapping to a high-dimensional feature space and applying the kernel trick (Mercer's theorem), kernel PCA is obtained as introduced by Schölkopf et al. (2002). While least squares support vector machine classifiers have a natural link with kernel Fisher discriminant analysis (minimizing the within-class scatter around targets +1 and -1), PCA can be interpreted as a one-class modeling problem with a zero target value around which one maximizes the variance. The score variables are interpreted as error variables within the problem formulation. In this way, primal-dual constrained optimization interpretations of linear and kernel PCA are obtained in the same style as for least squares support vector machine classifiers.
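In the dual variables, kernel PCA comes down to an eigendecomposition of the centered Gram matrix, with the score variables given by projections onto the leading eigenvectors. A minimal sketch (hypothetical helper name; the paper's primal-dual LS-SVM derivation arrives at the same eigenvalue problem):

```python
import numpy as np

def kernel_pca(X, n_components=2, sigma=1.0):
    """Kernel PCA: eigendecompose the centered Gram matrix (kernel trick)."""
    n = len(X)
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = np.exp(-d2 / (2 * sigma ** 2))       # RBF Gram matrix
    # double-center K so the mapped data have zero mean in feature space
    J = np.eye(n) - np.ones((n, n)) / n
    Kc = J @ K @ J
    vals, vecs = np.linalg.eigh(Kc)          # ascending eigenvalues
    order = np.argsort(vals)[::-1][:n_components]
    vals, vecs = vals[order], vecs[:, order]
    # score variables: projections onto the kernel principal axes
    return vecs * np.sqrt(np.maximum(vals, 0))
```

Because the Gram matrix is centered, each column of scores sums to (numerically) zero, mirroring the zero-mean property of ordinary PCA scores.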

4.
IEEE Trans Neural Netw; 12(4): 809-21, 2001.
Article in English | MEDLINE | ID: mdl-18249915

ABSTRACT

In this paper, the Bayesian evidence framework is applied to least squares support vector machine (LS-SVM) regression in order to infer nonlinear models for predicting a financial time series and the related volatility. On the first level of inference, a statistical framework is related to the LS-SVM formulation, which allows one to include the time-varying volatility of the market through an appropriate choice of several hyper-parameters. These hyper-parameters are inferred on the second level of inference, and those related to the volatility are used to construct a volatility model within the evidence framework. Model comparison is performed on the third level of inference in order to automatically tune the parameters of the kernel function and to select the relevant inputs. The LS-SVM formulation allows one to derive analytic expressions in the feature space, and practical expressions are obtained in the dual space by replacing the inner product with the related kernel function using Mercer's theorem. The one-step-ahead prediction performance obtained on the weekly 90-day T-bill rate and the daily DAX30 closing prices shows that significant out-of-sample sign predictions can be made with respect to the Pesaran-Timmermann test statistic.
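For regression, the LS-SVM dual solution is again a single linear system, this time with a constant constraint row instead of the label vector. A minimal sketch with an RBF kernel and fixed hyper-parameters (hypothetical names; the paper infers the regularization and kernel parameters on the higher levels of Bayesian inference rather than fixing them):

```python
import numpy as np

def lssvm_regress(X, y, gamma=100.0, sigma=1.0):
    """LS-SVM regression: the dual solution is a single linear system."""
    n = len(y)
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = np.exp(-d2 / (2 * sigma ** 2))       # RBF Gram matrix
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0                           # constraint row: 1' alpha = 0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma        # regularized kernel block
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    b, alpha = sol[0], sol[1:]

    def predict(Xnew):
        d2n = ((Xnew[:, None, :] - X[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2n / (2 * sigma ** 2)) @ alpha + b

    return predict
```

A constant target function is reproduced exactly (alpha = 0, b equal to the constant), which makes a convenient correctness check.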

5.
Neural Comput; 14(5): 1115-47, 2002 May.
Article in English | MEDLINE | ID: mdl-11972910

ABSTRACT

The Bayesian evidence framework has been successfully applied to the design of multilayer perceptrons (MLPs) in the work of MacKay. Nevertheless, the training of MLPs suffers from drawbacks such as the nonconvex optimization problem and the choice of the number of hidden units. In support vector machines (SVMs) for classification, as introduced by Vapnik, a nonlinear decision boundary is obtained by first mapping the input vector nonlinearly to a high-dimensional kernel-induced feature space, in which a linear large-margin classifier is constructed. Practical expressions are formulated in the dual space in terms of the related kernel function, and the solution follows from a (convex) quadratic programming (QP) problem. In least squares SVMs (LS-SVMs), the SVM problem formulation is modified by introducing a least squares cost function and equality instead of inequality constraints, and the solution follows from a linear system in the dual space. Implicitly, the least squares formulation corresponds to a regression formulation and is also related to kernel Fisher discriminant analysis. The least squares regression formulation has advantages for deriving analytic expressions in a Bayesian evidence framework, in contrast to the classification formulations used, for example, in Gaussian processes (GPs). The LS-SVM formulation has clear primal-dual interpretations, and without the bias term, one explicitly constructs a model that yields the same expressions as have been obtained with GPs for regression. In this article, the Bayesian evidence framework is combined with the LS-SVM classifier formulation. Starting from the feature space formulation, analytic expressions are obtained in the dual space on the different levels of Bayesian inference, while posterior class probabilities are obtained by marginalizing over the model parameters. Empirical results obtained on 10 public domain data sets show that the LS-SVM classifier designed within the Bayesian evidence framework consistently yields good generalization performance.
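The evidence framework's alternation between a posterior estimate of the weights and closed-form re-estimates of the hyper-parameters can be illustrated on plain ridge regression. The sketch below is a simplified MacKay-style level-2 inference loop (hypothetical names; a stand-in for, not a reproduction of, the LS-SVM feature-space derivation in the article):

```python
import numpy as np

def evidence_ridge(Phi, t, iters=50):
    """MacKay's evidence approximation for linear (ridge) regression:
    alternate between the posterior mean of the weights and closed-form
    re-estimates of alpha (prior precision) and beta (noise precision)."""
    n, d = Phi.shape
    alpha, beta = 1.0, 1.0
    eig = np.linalg.eigvalsh(Phi.T @ Phi)    # eigenvalues of the design Gram
    for _ in range(iters):
        A = alpha * np.eye(d) + beta * Phi.T @ Phi
        m = beta * np.linalg.solve(A, Phi.T @ t)   # posterior mean
        lam = beta * eig
        g = np.sum(lam / (lam + alpha))      # effective number of parameters
        alpha = g / (m @ m)                  # level-2 update of the prior
        beta = (n - g) / np.sum((t - Phi @ m) ** 2)  # noise re-estimate
    return m, alpha, beta
```

On noisy data from a known linear function, the posterior mean recovers the true slope and the inferred noise precision matches the injected noise level, with no cross-validation involved.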


Subjects
Artificial Intelligence; Bayes Theorem; Least-Squares Analysis; Normal Distribution