ABSTRACT
Visualization is a valuable tool in problem solving, especially for citizen science games. In this study, we analyze data from 36,351 unique players of the citizen science game Foldit over a period of 5 years to understand how their choices of visualization options are affected by expertise and problem type. We identified clusters of visualization options and found that experts and novices view puzzles differently, and that experts change their views depending on puzzle type. These results can inform new design approaches that help both novice and expert players visualize novel problems, develop expertise, and problem solve.
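The clustering of visualization options mentioned above could, for example, be carried out by encoding each player's enabled view options as a binary vector and grouping similar vectors. The sketch below is a hypothetical illustration of that kind of analysis, not the authors' actual pipeline; the option names, the toy data, and the choice of k-means with two clusters are all assumptions.

# Hypothetical sketch: cluster players by which view options they enable.
# Option names and data are invented for illustration only.
import numpy as np
from sklearn.cluster import KMeans

# Each row is one player; each column is a 0/1 flag for a hypothetical
# Foldit view option (cartoon backbone, sidechains, H-bonds, hydrophobics).
option_matrix = np.array([
    [1, 0, 1, 0],
    [1, 1, 1, 0],
    [0, 0, 1, 1],
    [1, 0, 0, 1],
])

kmeans = KMeans(n_clusters=2, n_init=10, random_state=0)
labels = kmeans.fit_predict(option_matrix)   # cluster assignment per player
centers = kmeans.cluster_centers_            # average option usage per cluster
print(labels, centers)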
ABSTRACT
Many studies have shown that games can be a useful tool for making boring or difficult tasks more engaging. However, serious game design is still a relatively nascent field, and such experiences can be hard to learn and unmotivating. In this paper, we explore the use of learning and motivation frameworks to improve player experience in the well-known citizen science game Foldit. Using Cognitive Load Theory (CLT) and Self-Determination Theory (SDT), we developed six interface and mechanical changes to the tutorial levels of Foldit designed to increase engagement and retention. We tested these features with new players of Foldit and collected both behavioral data, using game metrics, and prior experience data, using self-report measures. This study offers three major contributions: (1) we document the process of operationalizing CLT and SDT as new game features, a methodology not previously used in game design; (2) we show that the user interface, specifically the level selection screen, significantly affects how players progress through the game; and (3) we find that a player's expertise, whether from prior domain knowledge or prior gaming experience, increases their engagement. We discuss the implications of these findings as well as how these implementations can generalize to other designs.
ABSTRACT
The emerging field of game user research (GUR) investigates interaction between players and games and the surrounding context of play. Game user researchers have explored methods from, for example, human-computer interaction, psychology, interaction design, media studies, and the social sciences. They've extended and modified these methods for different types of digital games, such as social games, casual games, and serious games. This article describes several current GUR methods. A case study illustrates two specific methods: think-aloud and heuristics.
Subjects
Research Design, Software Design, Video Games, Humans, User-Computer Interface
ABSTRACT
The emerging field of game user research (GUR) investigates interaction between players and games and the surrounding context of play. Game user researchers have explored methods from, for example, human-computer interaction, psychology, interaction design, media studies, and the social sciences. They've extended and modified these methods for different types of digital games, such as social games, casual games, and serious games. This article focuses on quantitative analytics of in-game behavioral user data and its emergent use by the GUR community. The article outlines open problems emerging from several GUR workshops. In addition, a case study of a current collaboration between researchers and a game company demonstrates game analytics' use and benefits.