Biometric Storyboards: visualising meaningful gameplay events

From the CHI 2011 workshop - “Brain and Body Interfaces: Designing for Meaningful Interaction”

For more information about this talk visit the workshop at brainandbody.physiologicalcomputing.net

Meta

Mirza-Babaei, P. & McAllister, G. 2011. Biometric Storyboards: visualising meaningful gameplay events. In Proceedings of CHI 2011 BBI workshop, Vancouver, Canada. [PDF]

Mirza-Babaei, P., Nacke, L.E., McAllister, G. 2012. Biometric Storyboards: Toward a Better Understanding of Player Experience. In Proceedings of CHI 2012 Game User Research Workshop, Austin, TX, USA. [PDF]

Abstract

This paper describes the use of facial electromyography (EMG) as a measure of positive and negative emotional valence during interactive experience. Thirteen boys played a car racing video game on an Xbox platform while facial EMG data were collected. Through video review, positive and negative events during play were identified. The zygomaticus muscle EMG, which controls smiling, was found to be significantly greater during positive events as compared to negative. The corrugator muscle EMG, which controls frowning, was found to be significantly greater during negative events. The results of this study demonstrate that positive valence can be measured during interactive experiences with physiologic measures. This study also found that the corrugator EMG can still measure negative valence during high intensity interactive play in spite of the confounding factor of mental effort. These methods appear useful for associating the player’s emotion with game events, and could be applied to HCI in general.

Meta

Richard L. Hazlett, Measuring emotional valence during interactive experiences: boys at video game play, Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, April 22-27, 2006, Montréal, Québec, Canada

Annotation

This paper describes an investigation into the usefulness and reliability of facial EMG as a method of monitoring user experience. To do this, the team constructed an experiment in which participants played the racing game “Juiced” while their corrugator and zygomaticus EMG data were recorded. The results show that the zygomaticus EMG can indeed provide an ongoing measure of the player’s positive emotional valence, and that the corrugator EMG may serve as a practical measure of emotional valence as well.
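The analysis pattern behind this kind of study can be sketched in a few lines: segment the EMG record by video-coded events, then compare mean amplitude across event types. The following Python is an illustrative sketch only; the function names and toy data are assumptions, not from the paper.

```python
# Compare mean facial-EMG amplitude during video-coded positive vs.
# negative game events. Names and sample values are illustrative.

def mean_emg_by_event(samples, events):
    """samples: list of (timestamp, amplitude); events: list of
    (start, end, label) segments coded from the gameplay video."""
    totals = {}
    counts = {}
    for start, end, label in events:
        for t, amp in samples:
            if start <= t < end:
                totals[label] = totals.get(label, 0.0) + amp
                counts[label] = counts.get(label, 0) + 1
    return {label: totals[label] / counts[label] for label in totals}

# Toy zygomaticus trace: amplitude is higher during the "positive" segment.
zygo = [(0.0, 1.0), (1.0, 1.2), (2.0, 3.1), (3.0, 2.9), (4.0, 1.1)]
coded = [(0.0, 2.0, "negative"), (2.0, 4.0, "positive")]
means = mean_emg_by_event(zygo, coded)
```

A significance test over such per-event means (rather than a raw comparison) is what would support the paper's conclusion; the sketch only shows the segmentation step.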

Abstract

Understanding players’ visual attention patterns within an interactive 3D game environment is an important research area that can improve game level design and graphics. Several graphics techniques use a perception based rendering method to enhance graphics quality while achieving the fast rendering speed required for fast-paced 3D video games. Game designers can also enhance game play by adjusting the level design, texture and color choices, and objects’ locations, if such decisions are informed by a study of players’ visual attention patterns in 3D game environments. This paper seeks to address this issue. We present results showing different visual attention patterns that players exhibit in two different game types: action-adventure games and first person shooter games. In addition, analyzing visual attention patterns within a complex 3D game environment presents a new challenge because the environment is very complex with many rapidly changing conditions; the methods used in previous research cannot be used in such environments. In this paper, we will discuss our exploration seeking a new approach to analyze visual attention patterns within interactive 3D environments.

Meta

Magy Seif El-Nasr, Su Yan, Visual attention in 3D video games, Proceedings of the 2006 ACM SIGCHI international conference on Advances in computer entertainment technology, June 14-16, 2006, Hollywood, California

Annotation

In this paper, the research team sought to investigate visual attention in video games by having subjects play Soul Calibur II, Legacy of Kain: Blood Omen 2, and Halo 2 (while attempting to control for game familiarity by dividing participants based on their gaming experience) while a head-mounted eye tracker recorded their eye movements. While the results are limited, due mostly to the small subject pool, the data collected show a certain degree of consistency between subjects and may be used to inform future research.

Abstract

We developed a biofeedback game in which players can take other physical actions besides simply “relaxing”. We used the skin conductance response to sense a player’s surge of excitement, and penalized players when they attacked enemies in situations where they were not calm enough to meet the biofeedback threshold. We conducted a subjective experiment to see whether people found the game enjoyable. Most participants felt the game was enjoyable.

Meta

Munekata, N., Nakamura, T., Tanaka, R., Domon, Y., Nakamura, F., & Matsubara, H. A Biofeedback Game with Physical Actions. ICEC 2010: 381-388.

Annotation

Case study in which Munekata et al. developed a game in which players wielded a sword-like controller while their GSR was recorded and fed into the game engine. This data was used to augment in-game events, punishing players who attempted to attack enemies while their SCR was above a certain threshold (calculated from their baseline data). Participants were asked to play the game six times in total, three times with biofeedback augmentation and three times without, and were then interviewed about their experience. The data collected indicate not only that the participants found the game entertaining, but also that they seemed to intuitively understand the effect of their psychophysiological state on the game environment.
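The core biofeedback rule described here can be sketched compactly: derive a calm/excited cutoff from baseline readings, then gate attacks on the current SCR. This Python sketch is not the authors' code; the threshold formula and the 20% margin are assumptions for illustration.

```python
# Gate in-game attacks on skin conductance: attacks made while the
# player's SCR sits above a baseline-derived threshold are penalized.

def scr_threshold(baseline_samples, margin=0.2):
    """Derive a calm/excited cutoff from resting SCR readings.
    The margin above the baseline mean is an assumed parameter."""
    baseline = sum(baseline_samples) / len(baseline_samples)
    return baseline * (1.0 + margin)

def attack_allowed(current_scr, threshold):
    """The game only rewards an attack made while the player is calm."""
    return current_scr <= threshold

# Resting readings give a baseline mean of 1.0, so the cutoff is 1.2.
threshold = scr_threshold([1.0, 1.2, 0.8])
```

In the study itself the threshold came from each player's own baseline session, which is why the sketch takes the resting samples as input rather than hard-coding a value.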

Abstract

In this workshop we study the research themes and the state-of-the-art of brain-computer interaction. Brain-computer interface research has seen much progress in the medical domain, for example for prosthesis control or as biofeedback therapy for the treatment of neurological disorders. Here, however, we look at brain-computer interaction especially as it applies to research in Human-Computer Interaction (HCI). Through this workshop and continuing discussions, we aim to define research approaches and applications that apply to disabled and able-bodied users across a variety of real-world usage scenarios. Entertainment and game design is one of the application areas that will be considered.

Meta

Nijholt, A., & Tan, D. Playing with your brain: brain-computer interfaces and games, Proceedings of the international conference on Advances in computer entertainment technology, June 13-15, 2007, Salzburg, Austria

Annotation

Notes on a workshop discussing possible applications of brain-computer interfaces. The topics discussed include using thoughts to control computers (as an alternative input device), the application of biofeedback in user testing, and the possible development of adaptive user interfaces that react to users’ cognitive states. The paper also includes a brief discussion of the challenges inherent in applying brain-computer interfaces to HCI research.

Abstract

Digital games offer rich media content and engaging action, accessible individually or in groups collaborating or competing against each other. This makes them promising for use as stimulus in research settings. This paper examines the advantages and challenges of using games in experimental research with particular focus on strict stimulus control through the following four areas: (1) matching and regulating task type, (2) data segmentation and event coding, (3) compatibility between participants and (4) planning and conducting data collection. This contribution provides a breakdown of the steps necessary for using a digital game in experimental studies, along with a checklist for researchers illustrating variables that potentially affect the reliability and validity of experiments. We also offer a practical study example. Ideally, the identification of the methodological and practical considerations of employing games in empirical research will also prove useful in interpreting and evaluating experimental work utilizing games as stimulus.

Meta

Järvelä, S., Ekman, I., Kivikangas, J.M., & Ravaja, N. “Digital games as experiment stimulus”, Proceedings of the 2012 DiGRA Nordic Conference, Tampere, Finland

Annotation

In an attempt to address issues encountered in previous studies, the authors of this paper offer a number of methods through which researchers may be able to use video games as an experimental stimulus in such a way as to produce reliable, reproducible results. The paper addresses several of the challenges inherent in using games as a stimulus, and how they might be met. Practical and methodological considerations are discussed in detail, with special attention paid to the need to control for variables such as player experience and in-game events. The team also addresses the need for researcher familiarity, recommends that those looking to use games in their research familiarize themselves with the medium, and provides a checklist of questions for guiding experimental design.

Abstract

Utilising biometric data has become an increasingly active area in the video games user research community, and a number of academic papers have been published introducing various biometric-based analysis techniques in video games research. This paper aims to quantify the value of biometric methods as an addition to traditional observation-based user research methodologies, and their respective contributions to the production of formative feedback during the development of video games. Our results show that observation-based techniques can expose the majority of issues relating to usability, however the biometrics-based approach enabled researchers to discover latent issues related to players’ feelings, immersion and gameplay experience and, in certain categories of issue, reveal up to 63% more issues than observation alone.

Meta

Mirza-Babaei, P., Long, S., Foley, E., & McAllister, G. Understanding the Contribution of Biometrics to Games User Research. Proc. DIGRA (2011).

Annotation

This paper seeks to quantify the objective value of including biometrics in user testing. The authors begin with an overview of traditional user research methods, then briefly discuss several physiological measurements that can be used in user testing. For this particular paper, the team chose to monitor galvanic skin response while participants played two first-person shooters (one of which had been well received, while the other had received far fewer positive reviews). Subjects were filmed and their GSR monitored during gameplay.

Data collected during player trials was subjected to two different modes of analysis: the first, a traditional post-gameplay observational review of the captured video footage; the second, a biometric-based approach in which players’ physiological data was mapped to the video and used to shape the post-gameplay interview questions (e.g. “Can you explain what happened here?”, in reference to a specific point in time at which a biometric micro-event had been identified in the record). The data collected over the course of this study show that including biometric data in user testing confers a significant advantage over traditional observation-and-interview testing alone.
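The biometric review step can be pictured as a simple scan over the GSR record for sharp rises, whose timestamps are then revisited in the footage and raised in the interview. The sketch below is an assumption about how such micro-events might be flagged; the paper does not specify its detection criterion, and the jump threshold here is invented.

```python
# Flag timestamps where the GSR trace jumps sharply between samples,
# so the matching video moments can seed interview questions.

def find_micro_events(gsr, min_jump=0.5):
    """gsr: list of (timestamp_seconds, conductance). Returns the
    timestamps at which the sample-to-sample rise exceeds min_jump."""
    events = []
    for (t0, v0), (t1, v1) in zip(gsr, gsr[1:]):
        if v1 - v0 > min_jump:
            events.append(t1)
    return events

# Toy trace with two sharp rises; their timestamps mark review points.
trace = [(0, 2.0), (1, 2.1), (2, 2.9), (3, 3.0), (4, 4.2)]
flagged = find_micro_events(trace)
```

The point of the technique is that the flagged moments anchor the interview to concrete gameplay events rather than to the player's unprompted recollection.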

Abstract

This paper describes an investigation into how real-time but low-cost biometric information can be interpreted by computer games to enhance gameplay without fundamentally changing it. We adapted a cheap sensor (the Lightstone meditation sensor device by Wild Divine) to record and transfer biometric information about the player (via sensors that clip over their fingers) into a commercial game engine, Half-Life 2. During game play, the computer game was dynamically modified by the player’s biometric information to increase the cinematically augmented “horror” affordances. These included dynamic changes in the game shaders, screen shake, and the creation of new spawning points for the game’s non-playing characters (zombies); all these features were driven by the player’s biometric data. To evaluate the usefulness of this biofeedback device, we compared it against a control group of players who also had sensors clipped on their fingers, but for the second group the gameplay was not modified by the biometric information of the players. While the evaluation results indicate biometric data can improve the situated feeling of horror, there are many design issues that will need to be investigated by future research, and the judicious selection of theme and appropriate interaction is vital.

Meta

Dekker, A., & Champion, E. 2007. Please Biofeed the Zombie: Enhancing the Gameplay and Display of a Horror Game Using Biofeedback. Situated Play: Proceedings of DiGRA 2007, 550-558. Tokyo: The University of Tokyo

Annotation

In the experiment outlined in this paper, researchers used a commercially available biofeedback device to capture information about players’ physical states and feed it into an existing game engine, with the goal of determining the usefulness of such a technique as a method of augmenting gameplay. Each test subject was asked to play through the “Ravenholm” level of Half-Life 2, once as a control condition and once through a version of the same level altered to change certain game variables in response to biometric information from a combination biosensor monitoring both electrocardiogram heart rate variability and GSR. These changes fell into two categories. The first comprised changes that were visible and obvious to the player during gameplay, in that the display of in-game elements would change in response to changes in biometric data (e.g. movement speed increased with heart rate). The second category, referred to in the paper as “stealth mode”, consisted of hidden changes that rewarded players for controlling their physical responses to mental pressure (e.g. if a participant’s heart rate fell below 0.5 of their calibration average, they became invisible to enemies). Players were interviewed after playing through each level to glean additional insight into their experiences, then asked to view footage of their performance and discuss which elements affected their experience.
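The two categories of modification lend themselves to a short sketch. Only the speed-follows-heart-rate idea and the 0.5-of-calibration invisibility rule come from the paper; the function names and the linear scaling formula are assumptions for illustration.

```python
# Two biometric-driven game rules: a visible one (movement speed
# scales with heart rate) and a hidden "stealth mode" reward.

def movement_speed(base_speed, heart_rate, calibration_avg):
    """Visible change: speed rises and falls with the player's
    heart rate relative to their calibration average (assumed
    linear scaling)."""
    return base_speed * (heart_rate / calibration_avg)

def invisible_to_enemies(heart_rate, calibration_avg):
    """'Stealth mode' reward from the paper: players whose heart
    rate falls below 0.5 of their calibration average become
    invisible to enemies."""
    return heart_rate < 0.5 * calibration_avg
```

Keeping the calibration average per-player, as the study did, means both rules adapt to individual resting physiology rather than to fixed absolute values.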

Several issues were found during the evaluation stage, chief among them that participants did not attempt to exert conscious control over their breathing to see how, or whether, it affected gameplay. This may indicate that, for such methods to be useful, the cause-and-effect relationship between actions and in-game consequences must be explicit. Subjects’ unfamiliarity with the game also seemed to have a significant detrimental effect on gameplay, as did the presence of the sensor itself, which interfered with players’ ability to play using the standard keyboard-and-mouse configuration. Consequently, despite the fact that the information collected from the prototype did correlate with the answers given in the post-game evaluation, the exploration was unable to reach any concrete conclusions. The authors recommend that future experiments be designed to control for variation due to factors such as user experience, and that less intrusive monitoring methods be used, so as not to affect gameplay itself.

Abstract

In this paper we investigate the relation between immersion in a game and the player’s intensity of physical behaviours, in order to explore whether these behaviours can be reliably used as indicators of player experience. Immersion in the game was manipulated by means of screen size (20” vs 42” screen), and sound pressure level (60dBA vs 80 dBA), according to a 2 x 2 design. The effects of these manipulations on self-reported experience (including arousal and presence) and behavioural intensity (controller tilt and button pressure) were measured. Results showed that sound pressure level in particular strongly influenced both the self-reported measures of people’s affective reactions and feelings of presence and the force people applied to the interface device. Results from controller tilt demonstrated that participants did move along with the dynamics of the game. The measure was, however, not sensitive to either of the two manipulations of sensory immersion. In the paper the implications for the use of behavioural indicators of player experience in general and the feeling of presence are discussed.

Meta

Hoogen, W.M. van den, IJsselsteijn, W.A. & Kort, Y.A.W. de (2009). Effects of sensory immersion on behavioural indicators of player experience: movement synchrony and controller pressure. In B. Atkins & H. Kennedy (Eds.), Breaking new ground: innovation in games, play, practice and theory, London: DiGRA.

Annotation

This paper investigates whether certain behaviors, specifically controller tilt and button pressure, can be used as reliable indicators of a player’s level of immersion in a game (self-reported levels of presence and emotional experience). To test this, the researchers conducted an experiment in which screen size and sound pressure level were manipulated to create different environmental experiences. Data was collected from sensors embedded in the controllers used during gameplay, and player experience was reported in post-game self-assessments. According to the self-reports, players experienced a greater level of immersion and spatial presence with a larger field of view and higher sound pressure. However, greater feelings of immersion were not tied to any significant differences in the intensity of physical behavior; neither controller rotation nor button pressure increased with changes in either the field of view or sound pressure.

Abstract

In search of suitable methods for measuring the affective state of video-game players, this study investigates the hypothesis that the player’s state of arousal will correspond with the pressure used to depress buttons on a gamepad. A video game was created that would detect the force of each button press during play. It was found that as the difficulty level of the game increased, players would hit the gamepad buttons significantly harder.

Meta

Sykes, J., & Brown, S. Affective gaming: measuring emotion through the gamepad, CHI ‘03 extended abstracts on Human factors in computing systems, April 05-10, 2003, Ft. Lauderdale, Florida, USA

Annotation

This paper describes an attempt to investigate how player affect might be measured through the gameplay hardware itself, using pressure sensors, and whether the data collected from such a system could serve as a reliable proxy for player emotion during gameplay. The paper discusses previous research into the use of GSR as an input method for game augmentation, but argues that such a method is inappropriate for “traditional, fast-paced video games”. The researchers propose that pressure sensors could instead be used to measure player affect, since the amount of pressure one exerts on the gamepad should increase with the amount of stress one is feeling at a given point in time. To test this, an experiment was conducted in which participants were asked to play a version of “Space Invaders” using a gamepad modified to include pressure sensors. The data collected were consistent with the hypothesis: mean button pressure did steadily increase with level difficulty. Without a way to measure valence, however, it is impossible to tell whether the subjects’ arousal was positive or negative (i.e. whether they were more engaged, or simply more frustrated, as the game progressed).
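The study's analysis reduces to grouping logged button presses by difficulty level and comparing the group means. The sketch below is illustrative only; the data values and names are invented, and it simply demonstrates the hypothesis pattern (mean pressure rising with difficulty) rather than the study's actual data.

```python
# Group logged (difficulty_level, pressure) pairs from an instrumented
# gamepad and compute the mean press force per difficulty level.

def mean_pressure_by_level(presses):
    """presses: list of (difficulty_level, pressure) readings."""
    totals, counts = {}, {}
    for level, pressure in presses:
        totals[level] = totals.get(level, 0.0) + pressure
        counts[level] = counts.get(level, 0) + 1
    return {level: totals[level] / counts[level] for level in totals}

# Invented log in which press force rises across three difficulty levels.
log = [(1, 0.8), (1, 1.0), (2, 1.3), (2, 1.5), (3, 1.9), (3, 2.1)]
means = mean_pressure_by_level(log)
```

As the annotation notes, a monotonic trend in these means indicates rising arousal, but without a valence measure it cannot distinguish engagement from frustration.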