Results 1 - 3 of 3
1.
JMIR Ment Health; 10: e42420, 2023 May 10.
Article in English | MEDLINE | ID: mdl-37163323

ABSTRACT

BACKGROUND: One reason students visit counseling services is that they are called in on the basis of self-reported health survey results. However, there is no agreed standard for such calls.

OBJECTIVE: This study aims to develop a machine learning (ML) model that predicts students' mental health problems in 1 year and in the following year from the content of the health survey and the answering time (response time, response time stamp, and answer date).

METHODS: Data were obtained from the responses of 3561 (62.58%) of the 5690 undergraduate students at University A in Japan (a national university) who completed the health survey in 2020 and 2021. We performed 2 analyses: in analysis 1, a mental health problem in 2020 was predicted from demographics, health survey answers, and answering time in the same year; in analysis 2, a mental health problem in 2021 was predicted from the same input variables as in analysis 1. We compared several ML models, including logistic regression, elastic net, random forest, XGBoost, and LightGBM, and used the adopted model to compare results with and without the answering-time variables.

RESULTS: On the basis of the model comparison, we adopted the LightGBM model. Both analyses achieved adequate performance under both conditions (eg, in analysis 1, the Matthews correlation coefficient [MCC] was 0.970 with the answering-time condition and 0.976 without it; in analysis 2, the MCC was 0.986 with the answering-time condition and 0.971 without it). In both analyses and both conditions, responses to the questions about campus life (eg, anxiety and the future) had the highest impact (Gain 0.131-0.216; Shapley additive explanations 0.018-0.028), and 5 to 6 of the top 10 input variables by Shapley additive explanations came from the campus life questions. Contrary to our expectation, including the answering-time variables did not substantially improve the prediction of students' mental health problems; however, certain variables derived from the answering time did help improve the prediction and affected the predicted probability.

CONCLUSIONS: These results demonstrate the possibility of predicting mental health across years using health survey data. Demographic and behavioral data, including answering time, were effective, as were the self-rating items. The model shows how the characteristics of health surveys and the advantages of ML can be used synergistically, and these findings can inform improvements to health survey items and calling criteria.
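
The prediction pipeline summarized above (a gradient-boosted tree classifier evaluated with the MCC and interpreted through Gain and Shapley additive explanations) can be sketched roughly as follows. This is a minimal illustration on synthetic data, not the authors' code: the feature layout, labels, split, and hyperparameters are all assumptions.

```python
# Minimal sketch, assuming synthetic stand-ins for the survey data; the paper's
# actual features, preprocessing, and hyperparameters are not given here.
import numpy as np
import shap
from lightgbm import LGBMClassifier
from sklearn.metrics import matthews_corrcoef
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical feature matrix: demographics, survey answers, and
# answering-time variables (response time, time stamp, answer date).
n_students, n_features = 3561, 40
X = rng.normal(size=(n_students, n_features))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=n_students) > 1).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=0
)

model = LGBMClassifier(n_estimators=300, learning_rate=0.05, random_state=0)
model.fit(X_train, y_train)

# Evaluate with the Matthews correlation coefficient, as in the abstract.
print("MCC:", round(matthews_corrcoef(y_test, model.predict(X_test)), 3))

# Feature impact: split gain from the booster and SHAP values per feature.
gain = model.booster_.feature_importance(importance_type="gain")
shap_values = shap.TreeExplainer(model).shap_values(X_test)
```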

2.
Multivariate Behav Res; 57(4): 658-678, 2022.
Article in English | MEDLINE | ID: mdl-33750245

ABSTRACT

There has been growing interest in psychological measurements that use the multiple-alternative forced-choice (MAFC) response format because of its resistance to response biases. Although several models have been proposed for the data obtained from such measurements, none have succeeded in incorporating response time information. Given that many psychological measurements are now administered by computer, it would be beneficial to develop a joint model of the MAFC item response and the response time. The present study proposes the first model that combines a cognitive process model of the observed response time with a forced-choice item response model. Specifically, the proposed model is based on the linear ballistic accumulator model of response time, which is substantially extended by reformulating its parameters to incorporate the MAFC item responses. The model parameters are estimated with a Markov chain Monte Carlo (MCMC) algorithm. A simulation study confirmed that the proposed approach could appropriately recover the parameters. Two empirical applications are reported to demonstrate the use of the proposed model and to compare it with existing models. The results showed that the proposed model could be a useful tool for jointly modeling MAFC item responses and response times.
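
As a rough illustration of the race process this joint model builds on, the sketch below simulates a single MAFC trial under a standard linear ballistic accumulator. The paper's reparameterization of the accumulator parameters in terms of item and person parameters, and its MCMC estimation, are not reproduced here; all parameter values are illustrative assumptions.

```python
# Minimal sketch of an LBA race over MAFC alternatives: each alternative gets
# its own accumulator, and the first one to reach threshold determines both
# the chosen option and the response time. Values are illustrative only.
import numpy as np

def simulate_lba_trial(drifts, b=1.0, A=0.5, s=0.25, t0=0.2, rng=None):
    """drifts: mean drift rate per alternative; b: threshold;
    A: maximum start point (start ~ Uniform[0, A]);
    s: trial-to-trial SD of drift rates; t0: non-decision time."""
    rng = rng or np.random.default_rng()
    starts = rng.uniform(0, A, size=len(drifts))   # random start points
    rates = rng.normal(drifts, s)                  # sampled drift rates
    rates = np.where(rates > 0, rates, 1e-6)       # keep accumulators moving
    finish = (b - starts) / rates + t0             # finishing time per option
    choice = int(np.argmin(finish))                # fastest accumulator wins
    return choice, float(finish[choice])

# Example: a 4-alternative item where option 0 best matches the respondent.
choice, rt = simulate_lba_trial(drifts=np.array([1.5, 0.8, 0.7, 0.6]))
print(choice, round(rt, 3))
```

The point of this structure is that the same race produces both the endorsed alternative and its latency, which is what lets the item response and the response time be modeled jointly.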


Subjects
Personality, Linear Models, Markov Chains, Monte Carlo Method, Reaction Time
3.
Behav Res Methods; 52(3): 1091-1107, 2020 Jun.
Article in English | MEDLINE | ID: mdl-32394181

ABSTRACT

The two-alternative multidimensional forced-choice measurement of personality has attracted researchers' attention for its tolerance to response bias. Moreover, when personality measurement is conducted on computers, the response time can be collected along with the item response. The objective of this study is therefore to propose a Thurstonian D-diffusion item response theory (IRT) model, which combines two key existing frameworks: the Thurstonian IRT model for forced-choice measurement and the D-diffusion IRT model for response time in personality measurement. The proposed model reflects the psychological theories behind the data-generating mechanism of the item response and the response time. A simulation study reveals that the proposed model can successfully recover the parameters and factor structure in typical application settings. An application to real data reveals that the proposed model yields parameter estimates similar to, but still distinct from, those of the original Thurstonian IRT model, and that this difference can be explained by the response time information. In addition, the proposed model successfully reflects the distance-difficulty relationship between the response time and the respondent's latent relative position.
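
To make the combination concrete, the sketch below simulates one pairwise forced-choice response and its response time from a Wiener diffusion process whose drift is the difference between the Thurstonian utilities of the two statements. This is an assumed, simplified rendering, not the paper's exact D-diffusion parameterization or estimation procedure, and every numeric value is chosen purely for illustration.

```python
# Rough sketch: Thurstonian utilities set the drift of a diffusion process
# whose first passage through a boundary yields the choice and the response
# time. The paper's actual parameterization may differ.
import numpy as np

def simulate_pair(theta, lambda_i, lambda_j, mu_i, mu_j,
                  a=2.0, t0=0.3, dt=0.001, rng=None):
    """theta: latent trait scores; lambda/mu: statement loadings/intercepts;
    a: boundary separation; t0: non-decision time; dt: Euler step size."""
    rng = rng or np.random.default_rng()
    # Thurstonian part: mean utilities of the two statements and their difference.
    drift = (mu_i + lambda_i @ theta) - (mu_j + lambda_j @ theta)
    # Diffusion part: evidence starts midway between the boundaries 0 and a.
    x, t = a / 2.0, 0.0
    while 0.0 < x < a:
        x += drift * dt + rng.normal(0.0, np.sqrt(dt))
        t += dt
    return (1 if x >= a else 0), t0 + t            # 1 = statement i preferred

choice, rt = simulate_pair(theta=np.array([0.5, -0.2]),
                           lambda_i=np.array([1.0, 0.0]),
                           lambda_j=np.array([0.0, 1.0]),
                           mu_i=0.0, mu_j=0.0)
print(choice, round(rt, 3))
```

When the respondent's latent position makes the two statements nearly equally attractive, the drift is close to zero and first-passage times grow, which mirrors the distance-difficulty relationship described in the abstract.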


Subjects
Personality, Psychological Theory, Psychometrics, Reaction Time