Research

Virtual experience, real consequences: the potential negative emotional consequences of virtual reality gameplay

2020 | Raymond Lavoie, Kelley Main, Corey King, Danielle King

Abstract

As virtual reality (VR) technology enters mainstream markets, it is imperative that we understand its potential impacts on users, both positive and negative.

In the present paper, we build on the extant literature’s focus on the physical side effects of VR gameplay (e.g., cybersickness) by focusing on VR’s potential to intensify users’ experiences of negative emotions. We first conducted a preliminary survey to assess users’ emotional responses during VR gameplay, with the results suggesting that certain VR situations can in fact produce intense negative emotional experiences. We then designed an interactive scenario intended to elicit low to moderate amounts of negative emotion, wherein participants played out the scenario in either VR (using the HTC Vive) or on a laptop computer. Compared to the participants who enacted the scenario on the laptop, those in the VR condition reported higher levels of absorption, which in turn increased the intensity of their negative emotional response to the scenario.

A follow-up questionnaire administered several hours later revealed that the intensified negative emotions resulting from VR had a significant positive correlation with negative rumination (i.e., harmful self-related thoughts related to distress). These results show that VR gameplay has the potential to elicit strong negative emotional responses that could be harmful for users if not managed properly. We discuss the practical and policy implications of our findings.

How are your robot friends doing? A design exploration of graphical techniques supporting awareness of robot team members in teleoperation

2020 | Stela H. Seo, James E. Young, Pourang Irani | Interface by ZenFri

Abstract

While teleoperated robots continue to proliferate in domains including search and rescue, field exploration, and the military, human error remains a primary cause of accidents and mistakes. One challenge is that teleoperating a remote robot is cognitively taxing, as the operator needs to understand the robot’s state and monitor all of its sensor data.

In a multi-robot team, an operator additionally needs to monitor other robots’ progress, states, notifications, errors, and so on to maintain team cohesion. One strategy for supporting the operator in comprehending this information is to improve teleoperation interface designs to present data carefully. We present a set of prototypes that simplify complex team robot states and actions, with the aim of helping the operator understand information from the robots easily and quickly.

We conduct a series of pilot studies to explore a range of design parameters used in our prototypes (text, icon, facial expression, use of color, animation, and number of team robots), and develop a set of guidelines for graphically representing team robot states in remote team teleoperation.

Social Robotics for Nonsocial Teleoperation: Leveraging Social Techniques to Impact Teleoperator Performance and Experience

2020 | Daniel J. Rea, Stela H. Seo, James E. Young | Interface by ZenFri

Abstract

Purpose of Review Research has demonstrated the potential for robotic interfaces to leverage human-like social interaction techniques, for example, autonomous social robots as companions, as professional team members, or as social proxies in robot telepresence. We propose that there is an untapped opportunity to extend the benefits of social robotics to more traditional teleoperation, where the robot does not typically communicate with the operator socially.

We argue that teleoperated robots can and should leverage social techniques to shape interactions with the operator, even in use cases such as remote exploration or inspection that do not involve using the robot to communicate with other people.

Recent Findings The core benefit of social robotics is to leverage human-like and thus familiar social techniques to communicate effectively or shape people’s mood and behavior. Initial results provide proofs of concept for similar benefits of social techniques applied to more traditional teleoperation; for example, we can design teleoperated robots as social agents to facilitate communication or to shape operator behavior, or teleoperated robots can leverage knowledge of operator psychology to change perceptions, potentially improving operation safety and performance.

Summary This paper provides a proposal and roadmap for leveraging social robotics techniques in more classical teleoperation interfaces.

Conveyor: A Dual-Task Paradigm for Studying VR Dialogue Interfaces

2018 | Patrick Dubois, Daniel J. Rea, Kevin Hoang, Meghan Chua, Danielle King, Corey King, James E. Young, Andrea Bunt

Abstract

VR applications can enhance players’ sense of presence within virtual environments. A common scenario has a player working on a task while simultaneously making dialogue selections in VR. We investigate a dual-task experiment design for this scenario and report an initial study using four interfaces for dialogue selection. We found that interface naturalness (one measure of immersion) seems to play a large role in player preference, regardless of selection speed.

The virtual takeover: The influence of virtual reality on consumption

2018 | Raymond Lavoie, Corey King

Abstract

Looking at current trends, this paper explores the influence of virtual reality (VR) on consumption. Specifically, we focus on the influence that VR has on consumer spending by suggesting that identities created in VR will influence consumption behaviour in the real world. While other forms of technology allow consumers to create alternative identities, we suggest that the unique aspects of VR, bolstered by forthcoming advances, will make identities created in VR relatively more self‐important and more salient in real-world consumption.

We also propose implications for marketing research and practice.

Poor Thing! Would You Feel Sorry for a Simulated Robot? A comparison of empathy toward a physical and a simulated robot

2015 | Stela H. Seo, Denise Geiskkovitch, Masayuki Nakane, Corey King, James E. Young

Abstract

In designing and evaluating human-robot interactions and interfaces, researchers often use a simulated robot due to the high cost of robots and the time required to program them. However, it is important to consider how interaction with a simulated robot differs from interaction with a real robot; that is, do simulated robots provide authentic interaction? We contribute to a growing body of work that explores this question and maps out simulated-versus-real differences by explicitly investigating empathy: how people empathize with a physical or simulated robot when something bad happens to it. Our results suggest that people may empathize more with a physical robot than with a simulated one, a finding that has important implications for the generalizability and applicability of simulated HRI work. Empathy is particularly relevant to social HRI and is integral to, for example, companion and care robots. Our contribution additionally includes an original and reproducible HRI experimental design to induce empathy toward robots in laboratory settings, and an experimentally validated empathy-measuring instrument from psychology for use in HRI.

"Distinguishing the types of flow"

SSHRC Partnership Development Grant with University of Manitoba [Asper School of Business] and Arizona State University, 2017-2020

"3D scene analysis and semantic labeling for augmented reality in mobile applications"

NSERC Engage, with University of Manitoba [Department of Computer Science], 2014-2015

"Computer vision 3D environment recognition in augmented reality mobile games"

MITACS Accelerate, with University of Manitoba [Department of Computer Science], 2014

"Dynamics between consumers and augmented reality"

MITACS Accelerate, with University of Manitoba [Asper School of Business], 2014

"Middleware-Layer for Mobile Game Architecture"

MITACS Accelerate, with University of Saskatchewan [Department of Computer Science], 2013