EILAB Experience 2 | Data Collection

By Todd Blayone

EILAB Experience 2: Script

Narrated by April Stauffer | Written by Todd J.B. Blayone

Introduction

Scientific research typically starts with an interesting question, moves to a review of existing research, and then proceeds to generating data related to that question. Along with many of the social sciences, research in educational informatics follows this model. This means the success of a research project is often directly related to the quality of the data generated. For this reason, facilitating the collection of quality data stands at the heart of the EILab, both as a physical infrastructure and as a virtual community of associated researchers.

Methodologies for generating data related to human-computer interactions can be divided into three basic types: 1) participant self-reporting, 2) researcher observation of live or recorded activity, and 3) automated response tracking. Each of these approaches, taken separately, possesses strengths and limitations. Self-report instruments allow researchers to gather data from large populations efficiently and affordably, but may provide data of uneven reliability. Direct observation of activity in a controlled setting often achieves higher levels of reliability, but can be time-consuming and expensive. Automated response-tracking systems are also reliable but expensive. Moreover, they are often focused on a very specific dimension and, for that reason, may function best as sources of supplemental data. Of course, in each case, limitations are contextual and can be mitigated to various degrees.

Interestingly, from the perspective of educational informatics, each of these data-collection methodologies is facilitated through a symbiotic relationship between human researchers and digital computers, or more specifically, between human minds and “cognitive” machines. In the case of self-reports, human researchers establish survey questions, and the computer both presents them to participants and analyzes their responses. In the case of observation, the computer augments the visual and auditory perceptions of the researcher, providing many sets of “eyes” and “ears” capable of high-fidelity recording and playback. Finally, in the case of automated data collection, human-designed computer applications play the lead role, tracking and recording specific sets of phenomena, most of which are difficult or impossible for humans to detect, let alone track, on their own.

The EILab provides rich support for all three methodologies. However, its current affordances have been optimized for a mixed-method approach focusing on: 1) simultaneous, multi-perspective recording and automated tracking of human-computer interactions, and 2) capturing interactions of individuals with mobile devices, including populations with cognitive challenges or atypical functioning, such as individuals diagnosed with Autism Spectrum Disorder.

Let’s take a closer look at these affordances:

Multi-Perspective Recorded Observation

The physical EILab features a furnished participant-activity area and a glassed-in observation room. The entire facility is configured with multiple high-speed networks and an array of audio- and video-recording affordances. On the hardware side, these include: 1) four networked, ceiling-mounted cameras (two offering high definition, and two providing high degrees of rotation and zoom), 2) high-definition, tripod-mountable cameras, 3) several studio-quality audio-capture devices, 4) an audio mixer, and 5) a specially configured, dual-display PC workstation providing the horsepower and audio-visual connectors to make it all “sing.”

On the software side, the PC workstation is configured with Media Recorder, one of three major applications in the Noldus scientific software suite. This application is capable of recording simultaneous data streams from up to four cameras and multiple audio devices. It supports the capture of discrete, synchronized video streams, and several picture-in-picture (PIP) configurations.
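
Media Recorder itself is commercial, closed-source software, but the core idea of multi-stream capture can be illustrated with a short Python sketch using OpenCV. The camera indices, frame size and filenames below are assumptions for illustration; true frame-level synchronization across cameras also requires shared timestamps or hardware triggering, which this sketch omits.

```python
import time

import cv2

CAMERA_INDICES = [0, 1]   # hypothetical device indices
FPS = 30.0
FRAME_SIZE = (1280, 720)  # width, height
DURATION_S = 60           # record for about one minute

caps = [cv2.VideoCapture(i) for i in CAMERA_INDICES]
fourcc = cv2.VideoWriter_fourcc(*"mp4v")
writers = [cv2.VideoWriter(f"camera_{i}.mp4", fourcc, FPS, FRAME_SIZE)
           for i in CAMERA_INDICES]

start = time.monotonic()
try:
    while time.monotonic() - start < DURATION_S:
        # Poll each camera in turn; frames go to separate files so the
        # streams stay discrete, as in a multi-stream recording session.
        for cap, writer in zip(caps, writers):
            ok, frame = cap.read()
            if ok:
                writer.write(cv2.resize(frame, FRAME_SIZE))
finally:
    for cap in caps:
        cap.release()
    for writer in writers:
        writer.release()
```
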
In addition to media-recording software, the EILab workstation is also configured with camera-control software, allowing full configuration and pan-tilt-zoom (PTZ) control for each camera. In a typical recording session, the cameras are controlled on one display while source video streams and audio levels are monitored on the other. Although achieving optimal recordings sometimes requires dexterity, particularly when a participant wanders during an observation, much can be achieved via predefined configurations and hot-key sequences.
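
The Noldus camera-control software is likewise proprietary, but the preset-recall idea behind those hot keys can be sketched generically. The host address, /ptz path and query parameters below are invented for illustration; real network cameras expose PTZ control through ONVIF or vendor-specific APIs.

```python
import requests

CAMERA_HOST = "192.168.0.50"  # hypothetical camera address

def recall_preset(preset_id: int) -> None:
    """Recall a stored pan-tilt-zoom preset on a network camera.

    The endpoint and parameters here are hypothetical placeholders
    for whatever PTZ API a given camera actually provides.
    """
    response = requests.get(
        f"http://{CAMERA_HOST}/ptz",
        params={"action": "recall", "preset": preset_id},
        timeout=2,
    )
    response.raise_for_status()

# A hot key would simply be bound to a call like this one:
recall_preset(1)  # e.g., preset 1 = close-up of the participant's desk
```
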

Computer-Automated Tracking

Currently, the EILab is configured to support the identification and tracking of emotional states via the Noldus FaceReader application. (Future acquisitions may include an eye-tracking solution and other types of human-response systems.)

FaceReader uses artificial intelligence to classify the facial expressions of live or recorded participants according to Ekman’s basic, or universal, emotions: happy, sad, angry, surprised, scared, disgusted and neutral. FaceReader can also track gaze direction, head orientation, subject characteristics and subtle facial states based on muscular configurations.

Three layers power FaceReader. The first detects a participant’s face. The second models the face. The third performs the actual classification of facial expressions. The first two layers are based on well-established scientific algorithms. The third layer was developed by training an artificial neural network on over 10,000 manually annotated images!
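
FaceReader’s internals are proprietary, but the same three-layer structure can be sketched generically in Python. In this sketch, layer one uses OpenCV’s stock Haar-cascade face detector, while the modelling and classification layers are explicit placeholders standing in for FaceReader’s face model and trained neural network; the input filename is hypothetical.

```python
import cv2
import numpy as np

EMOTIONS = ["happy", "sad", "angry", "surprised",
            "scared", "disgusted", "neutral"]

# Layer 1: face detection (OpenCV's bundled Haar cascade).
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def model_face(gray, box):
    """Layer 2: face modelling. FaceReader fits a detailed face model;
    this placeholder simply crops and normalizes the detected region."""
    x, y, w, h = box
    face = cv2.resize(gray[y:y + h, x:x + w], (48, 48))
    return face.astype("float32") / 255.0

def classify(face):
    """Layer 3: expression classification. A trained neural network
    belongs here; uniform placeholder scores are returned instead."""
    scores = np.full(len(EMOTIONS), 1.0 / len(EMOTIONS))
    return dict(zip(EMOTIONS, scores))

frame = cv2.imread("participant.png")  # hypothetical input frame
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
for box in detector.detectMultiScale(gray, scaleFactor=1.3, minNeighbors=5):
    print(classify(model_face(gray, box)))
```
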

FaceReader’s main output is an analysis of the facial expressions of a participant. This output can be visualized in several different charts and exported to log files. Under optimal conditions, which include controlled lighting (to avoid facial shadows) and limited participant movement (to maintain a software-controlled facial lock), FaceReader delivers a wealth of data that would be onerous to collect manually, even with significant observational experience. This data can be related to numerous analytical constructs such as motivational flow, self-efficacy and digital competency.
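
As a concrete illustration of working with such exported logs, the sketch below parses a per-sample CSV of emotion intensities and plots them over time. The filename, column names and file layout are invented for the example; FaceReader’s actual export format differs by version and configuration.

```python
import csv
from collections import defaultdict

import matplotlib.pyplot as plt

LABELS = ["happy", "sad", "angry", "surprised",
          "scared", "disgusted", "neutral"]

times = []
series = defaultdict(list)
# Hypothetical layout: one row per sample, a "time_s" column, and one
# intensity column (0-1) per emotion label.
with open("facereader_log.csv", newline="") as f:
    for row in csv.DictReader(f):
        times.append(float(row["time_s"]))
        for label in LABELS:
            series[label].append(float(row[label]))

for label in LABELS:
    plt.plot(times, series[label], label=label)
plt.xlabel("Time (s)")
plt.ylabel("Expression intensity")
plt.legend()
plt.show()
```
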

Mobile Device Usage

The EILab is equipped with a broadly representative inventory of mobile devices from leading manufacturers. During observations within the EILab environment, the displays of these devices are typically mirrored on a large screen for observation and recording, and this video stream is synchronized with those focused on the participant. Some EILab researchers are conducting observations using freely or commercially available mobile applications. Others are developing custom applications to examine particular theoretical constructs.
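
As a minimal illustration of that synchronization step, the sketch below computes the offset between a mirrored-display recording and a room-camera recording from their wall-clock start times. The timestamps, and the idea that each recording’s metadata carries such a start time, are assumptions for the example.

```python
from datetime import datetime

# Hypothetical start times read from each recording's metadata.
screen_start = datetime.fromisoformat("2016-05-10T14:03:02.250")
camera_start = datetime.fromisoformat("2016-05-10T14:03:01.000")

# The camera started earlier, so trim this much from its head so both
# streams begin at the same instant before frame-level analysis.
offset_s = (screen_start - camera_start).total_seconds()
print(f"Trim {offset_s:.3f} s from the camera recording.")
```
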

Special Needs Populations

The EILab is a member of the Inclusive Design Institute (IDI), which addresses the challenge of designing information and communication systems for all potential users, including users with disabilities, varying language needs and diverse cultural preferences. Currently, several graduate research projects are assessing human-computer interactions among participants diagnosed with Autism Spectrum Disorder. In order to support these projects in the controlled EILab setting, special efforts have been made to construct a safe environment that accommodates the behavioral characteristics and sensory needs of this population across the lifespan. Researchers are constantly exploring new ways to study and serve this population.