EILAB Experience 3 | Data Analysis, Part 1

EILAB Experience 3: Script

Narrated by April Stauffer | Written by Todd J.B. Blayone

Introduction

Previously, we discussed data collection in the EILab, and observed how research practices today are facilitated by a partnership between humans and digital computers. This observation holds true for data analysis. Whether quantitative or qualitative, performed live or on recorded data, regardless of the theoretical frameworks involved, data analysis in the EILab represents a partnership between researchers and “cognitive machines.”

When we speak about the EILab’s data-analysis affordances, we focus on four software applications that provide robust data-analysis functionality and sophisticated interfaces for a human-computer research partnership.

• Noldus FaceReader
• Noldus The Observer XT
• Transana
• QSR NVivo

Here, we introduce the first two applications, FaceReader and The Observer XT from Noldus Information Technology. A second video will introduce Transana and QSR NVivo.

Noldus FaceReader

FaceReader is an automated tracking and analysis system for facial expressions that captures data from either a live or recorded video stream. It also provides several basic analysis functions. FaceReader’s main output is a classification of an observed participant’s facial expressions into seven universal expressions: happy, sad, angry, surprised, scared, disgusted and neutral. Each expression is assigned an intensity value between 0 and 1, based on an extensive database of expert assessments.

Importantly, because a facial expression often reflects several emotions at once, multiple expressions may register simultaneously with high intensity. Therefore, the sum of the intensity values for the seven expressions at a particular point in time is normally not equal to 1.
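
To make this output concrete, here is a minimal sketch in Python of the kind of classification record described above. The `ExpressionSample` structure, its field names and the example values are illustrative assumptions, not FaceReader’s actual API; FaceReader reports these intensities through its own interface and log exports.

```python
from dataclasses import dataclass

# Illustrative only: this simply models the seven-expression output described above.
EXPRESSIONS = ("happy", "sad", "angry", "surprised", "scared", "disgusted", "neutral")

@dataclass
class ExpressionSample:
    timestamp: float                 # seconds from the start of the video
    intensities: dict[str, float]    # one value in [0, 1] per expression

    def dominant(self) -> str:
        """Expression with the highest intensity at this time point."""
        return max(self.intensities, key=self.intensities.get)

sample = ExpressionSample(
    timestamp=12.4,
    intensities={"happy": 0.82, "sad": 0.03, "angry": 0.01, "surprised": 0.65,
                 "scared": 0.02, "disgusted": 0.01, "neutral": 0.10},
)
print(sample.dominant())                 # -> "happy"
print(sum(sample.intensities.values()))  # 1.64: the sum need not equal 1
```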

Additionally, FaceReader automatically estimates the emotional state of a participant based on the amplitude, duration and continuity of emotional responses. Each time the emotional state changes, state data is recorded.
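
The idea of recording data only when the state changes can be pictured with a deliberately simplified sketch. This is not Noldus’s estimation algorithm, which weighs amplitude, duration and continuity; it merely logs a new entry whenever the dominant expression changes.

```python
def state_changes(samples: list[tuple[float, dict[str, float]]]) -> list[tuple[float, str]]:
    """Given (timestamp, per-expression intensities) pairs, log a new entry
    each time the dominant expression changes.

    A simplified stand-in for FaceReader's state estimation, which also
    considers the amplitude, duration and continuity of the response.
    """
    changes: list[tuple[float, str]] = []
    current = None
    for timestamp, intensities in samples:
        dominant = max(intensities, key=intensities.get)
        if dominant != current:
            changes.append((timestamp, dominant))
            current = dominant
    return changes

print(state_changes([(0.0, {"happy": 0.1, "neutral": 0.8}),
                     (0.5, {"happy": 0.7, "neutral": 0.2})]))
# -> [(0.0, 'neutral'), (0.5, 'happy')]
```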

Classification, intensity and state data can be analyzed in several different chart formats, and in text or “log” files. The Detailed log file contains all the emotional classifier outputs per time point. Log files can be opened in a spreadsheet program like Excel for basic statistical analysis.
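
As a hedged example of that kind of basic analysis, the sketch below loads a detailed log with pandas and computes descriptive statistics. The file name, separator and column names are assumptions about the exported format; check the actual log before running it.

```python
import pandas as pd

# Assumption: the detailed log has been exported as a tab-separated text file with
# one row per time point and one column per expression. The file name and column
# names below are placeholders.
log = pd.read_csv("participant01_detailed_log.txt", sep="\t")

expressions = ["Happy", "Sad", "Angry", "Surprised", "Scared", "Disgusted", "Neutral"]

# Basic descriptive statistics, comparable to what one might compute in Excel.
print(log[expressions].describe())        # mean, std, min, max per expression
print(log[expressions].mean().idxmax())   # expression with the highest mean intensity
```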

One can also import FaceReader log files into The Observer XT for more complex analysis. Indeed, The Observer XT makes it possible to combine FaceReader data with manually tagged events and with data from other systems, like eye trackers or physiological data acquisition systems.
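
The Observer XT performs this synchronization and combination itself. Purely to illustrate the underlying idea of aligning two time-stamped streams, here is a sketch using pandas; the file names, the shared `Time` column and the 50 ms tolerance are assumptions.

```python
import pandas as pd

# Illustration only: The Observer XT handles this kind of synchronization internally.
# The sketch aligns two hypothetical time-stamped exports on their nearest timestamps,
# which is the basic idea behind combining FaceReader data with eye-tracking samples.
face = pd.read_csv("facereader_log.txt", sep="\t").sort_values("Time")
gaze = pd.read_csv("eyetracker_log.txt", sep="\t").sort_values("Time")

combined = pd.merge_asof(face, gaze, on="Time", direction="nearest",
                         tolerance=0.05)   # only pair samples within 50 ms
print(combined.head())
```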

Noldus The Observer XT

The Observer XT is a sophisticated and complex event recorder for the collection, management, analysis and presentation of observational data. It was first developed as an automated system to collect observations of behavioral patterns in animals. However, the flexibility and powerful analysis features of The Observer XT made it suitable for almost any study involving observational data. In the context of educational informatics and the observation of human-computer interactions in the EILab, it provides rich functionality for synchronized playback, event tagging and statistical analysis of multiple streams of recorded video data.

Using The Observer XT to perform data analysis may be summarized as follows. Having captured video streams of human-computer interactions as described in our previous video, a researcher opens the streams for each participant in The Observer XT, performs a close observation from all available perspectives, and adds event tags or codes that were created in an earlier research phase in relation to the theoretical framework being applied. Such tags can be applied to any observable, recorded phenomena. These include:

• software application events triggered by the participant
• competence or self-efficacy indicators
• emotional valence indicators
• any other visible phenomena in the recorded video streams deemed significant

The Observer XT supports a “computerized” shorthand for event tags consisting of a subject, event/behaviour, optional related modifiers and an automatically generated time stamp.
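
The elements of such a tag can be modelled with a small sketch; the class and field names are illustrative, since The Observer XT records these elements through its own coding interface.

```python
from dataclasses import dataclass, field
import time

# Illustrative model of an Observer XT-style event tag: subject, behaviour,
# optional modifiers, and an automatically generated time stamp.
@dataclass
class EventTag:
    subject: str
    behaviour: str
    modifiers: list[str] = field(default_factory=list)
    timestamp: float = field(default_factory=time.time)  # generated automatically

tag = EventTag(subject="Participant 01",
               behaviour="opens application",
               modifiers=["web browser"])
print(tag)
```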

Simultaneously, the researcher can overlay and synchronize other data (e.g., emotional classification/state or eye-tracking data) for the participant being analyzed. All these synchronized layers of data can be visualized and analyzed. The analysis functions of The Observer XT allow the researcher to produce lists, tables, graphical representations or statistical calculations to answer specific research questions.
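
For instance, the simplest of those summaries, a frequency count per behaviour code, could look like the sketch below, run over a hypothetical list of coded events.

```python
from collections import Counter

# Hypothetical coded events: (timestamp in seconds, behaviour code).
events = [
    (12.4, "opens application"),
    (15.1, "hesitates"),
    (22.8, "opens application"),
    (31.0, "asks for help"),
    (44.6, "hesitates"),
]

# Frequency of each behaviour code, the simplest summary one might report.
frequencies = Counter(code for _, code in events)
for code, count in frequencies.most_common():
    print(f"{code}: {count}")
```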
