Data Overview

Understanding the data

In this section we describe the data readings provided by the system and the associated methods used to derive the following physiological insights:

  • Expressivity or Facial muscle activation, via Electromyography (EMG)
  • Heart-rate and Heart-rate variability monitoring, via Photoplethysmography (PPG)
  • Head activity (attitude, velocity and position), via Inertial Measurement Unit (IMU)
  • Eye movement and object of interest, via Eye tracking

Valence and arousal are often plotted on a two-dimensional graph called the Dimensional Model. The activation axis, plotted vertically, ranges from deactivation (low arousal) to activation (high arousal). The valence axis, plotted horizontally, ranges from negative to positive.

Based on the dimensional emotion model approach (see the circumplex model of affect, Russell, 1980), all emotional states can be represented within the affect space. In short, valence (x-axis) can be described as the polarity of the affective state, i.e., its level of positivity or negativity. Arousal (y-axis), on the other hand, is the physiological and behavioural intensity of an affective state: for example, the corresponding increase in heart rate or loudness of our voice, ranging from low (sleepy) to high (excited or stimulated) levels.
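
As an illustration, a (valence, arousal) reading can be mapped onto a quadrant of the circumplex model. This is a hypothetical sketch; the [-1, 1] normalisation and the quadrant labels are illustrative assumptions, not system outputs:

```python
# Hypothetical sketch: mapping a (valence, arousal) reading onto a quadrant
# of the circumplex model. The [-1, 1] normalisation and the quadrant labels
# are illustrative assumptions.

def affect_quadrant(valence: float, arousal: float) -> str:
    """Classify an affective state by its circumplex-model quadrant.

    valence: negative (-1) .. positive (+1)
    arousal: deactivation (-1) .. activation (+1)
    """
    if arousal >= 0:
        return "excited / elated" if valence >= 0 else "tense / distressed"
    return "calm / relaxed" if valence >= 0 else "bored / depressed"

print(affect_quadrant(0.7, 0.6))   # positive valence, high arousal
print(affect_quadrant(-0.5, 0.4))  # negative valence, high arousal
```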

Example:

Arousal & Valence Plot

Reference:

Posner, J., Russell, J. A., & Peterson, B. S. (2005). The circumplex model of affect: an integrative approach to affective neuroscience, cognitive development, and psychopathology. Development and psychopathology, 17(3), 715–734. https://doi.org/10.1017/S0954579405050340

Facial Muscle Activation (Expressivity)

Method: Electromyography (EMG)

EMG (electromyography) records the movement of our muscles by capturing the electrical activity generated by muscle contraction. The muscles receive signals from the spinal cord via motor neurons which innervate the muscle directly at the neuromuscular junction. This innervation causes the release of Calcium ions within the muscle, ultimately creating a mechanical change in the tension of the muscle. As this process involves depolarization (a change in the electrochemical gradient), the difference in current can be detected by EMG.

EMG activity (usually measured in µV) correlates with the amount of muscle activation: the stronger the muscle activation, the higher the recorded voltage amplitude.

The amplitude of the EMG signal is calculated from the root mean square (RMS) envelope of the filtered signal, computed using rolling (moving) windows. The RMS output is commonly used in EMG analysis, as it provides direct insight into the power of the EMG activation at a given time, in a simple form, as shown in the graph below.
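
The rolling RMS computation can be sketched as follows (a minimal illustration using NumPy; the window length and µV units are assumptions for the example):

```python
import numpy as np

def rms_envelope(signal, window):
    """Rolling RMS envelope of a (pre-filtered) EMG signal.

    Squares the signal, averages it over a moving window, then takes the
    square root; the output has the same length as the input.
    """
    squared = np.asarray(signal, dtype=float) ** 2
    kernel = np.ones(window) / window        # moving-average kernel
    return np.sqrt(np.convolve(squared, kernel, mode="same"))

# A constant 100 µV signal has an RMS envelope of 100 µV away from the edges.
emg = np.full(1000, 100.0)
env = rms_envelope(emg, window=50)
print(env[500])  # → 100.0
```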

Example of EMG amplitude coming from the activation of the zygomaticus muscles (top), corrugator muscle (middle) and frontalis muscles (bottom).

In this example, a user was recorded performing three expressions twice each: a smile (top graph), a frown (middle graph) and a surprised expression (bottom graph). For each expression, some muscles were activated more than others. For example, during smiling, the zygomaticus sensors (right and left, shown in orange and brown) and the orbicularis sensors (right and left, shown in green and purple) are activated intensely, above the activation of the remaining sensors.


The signal measured by the EMG sensors provides insight into the contractions and configurations made by the facial muscles during a VR experience. These can be voluntary or spontaneous (e.g., as a response to a stimulus). However, spontaneous and naturalistic expressions differ from posed voluntary expressions (see Duchenne versus non-Duchenne smile, citation) in terms of intensity, duration and configuration.

As the face is the richest source of valence information (Ekman, 2009), facial EMG provides a window to tracking valence changes.

What is provided for each sensor:

  • EMG contact (impedance)
  • EMG raw sensor data
  • EMG filtered sensor data
  • EMG signal amplitude

See more information on data outputs in the 'Data Acquisition' section.

Reference: Ekman P. (2009). Darwin's contributions to our understanding of emotional expressions. Philosophical transactions of the Royal Society of London. Series B, Biological sciences, 364(1535), 3449–3451. https://doi.org/10.1098/rstb.2009.0189

Heart-rate monitoring

Method: Photoplethysmogram (PPG)

The PPG (photoplethysmography) sensor, embedded within the emteqPRO Mask, uses light-based technology to detect the systolic peaks (and the rate of blood flow) as controlled by the heart's pumping action.

Throughout the cardiac cycle, blood pressure increases and decreases periodically – even in the outer layers and small vessels of the skin. Peripheral blood flow can be measured using optical sensors attached to the forehead, fingertip, the ear lobe or other capillary tissue.

The PPG sensor provides a signal output comparable to the electrocardiogram (ECG) gold-standard method; see the ECG/PPG graph below. The ECG records the electrical activity generated by heart muscle depolarization, which propagates in pulsating electrical waves towards the skin. Although the amount of electricity is in fact very small, it can be measured with ECG electrodes attached to the skin. However, ECG typically requires the attachment of multiple wet sensors in the chest area, which can be cumbersome and obtrusive for the VR user.

How does it work?

Graph showing ECG and PPG sensor signals.

A typical blood-flow measuring device, such as the PPG sensor, has an LED that sends light into the tissue and records how much light is absorbed or reflected back to the photodiode (a light-sensitive sensor).

PPG sensors are dry and do not require skin preparation, unlike ECG devices.


In the graph above, the peaks of the signals (referred to as R for ECG and P for PPG) are outlined for each sensor modality. These peaks correspond to the cardiac cycle, and from the intervals between them we can extract other useful metrics, including beats per minute (BPM) and heart-rate (or pulse-rate) variability (HRV) measures.
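
For example, average heart rate follows directly from the peak timestamps, since each inter-peak interval corresponds to one heartbeat. A minimal sketch (timestamps in seconds are assumed):

```python
def bpm_from_peaks(peak_times_s):
    """Average heart rate (BPM) from successive peak timestamps in seconds.

    Works for ECG R-peaks and PPG P-peaks alike: each inter-peak interval
    is one heartbeat, so BPM = 60 / mean interval.
    """
    intervals = [b - a for a, b in zip(peak_times_s, peak_times_s[1:])]
    return 60.0 / (sum(intervals) / len(intervals))

# Peaks spaced 0.75 s apart correspond to 80 beats per minute.
print(bpm_from_peaks([0.0, 0.75, 1.5, 2.25]))  # → 80.0
```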

What is provided:

  • PPG raw sensor data
  • PPG proximity sensor data
  • PPG average heart-rate in beats per minute (BPM)
  • Heart-rate variability features (see below)

Heart Rate Variability

Heart rate variability (HRV) is used to measure changes in physiological arousal and stress. HRV features are traditionally calculated from RR-interval time series extracted from ECG sensor data. More specifically, the RR-interval time series represents the distances between successive heartbeats (RR intervals).

Alternatively, the distances between successive heartbeats can be calculated from photoplethysmography (PPG) sensor data. In our devices, a PPG sensor is placed on the forehead to measure the user's blood flow. From the blood flow, pulse-to-pulse intervals (PP intervals) can be extracted. Like RR intervals, PP intervals represent the distances between successive heartbeats, but they are calculated from the PPG signal.

Technically, the PPG signal lags behind the ECG signal by the time required for the blood flow to propagate, so there is a small difference between RR intervals and PP intervals. However, there is a high correlation (median = 0.97) between RR intervals and PP intervals [1]. Additionally, there are no significant differences (p > 0.05) between HRV parameters computed using RR intervals and those computed using PP intervals (also known as PRV when computed from PPG). Thus, HRV can be reliably estimated from PP intervals.

In summary, our HRV features are calculated from the PPG sensor data. For simplicity, we refer to them as HRV features instead of PRV features.

What is provided: Heart Rate Variability (HRV) features provided by the emteqPRO system.

Name              Description                                    Normal values
mean RR-interval  Mean value of the RR intervals (milliseconds)  787.7 (79.2)
mean heartrate    Mean heart rate (beats per minute)             76.17 (7.7)
sdnn              Standard deviation of the RR intervals         136.5 (33.4)
rmssd             Root mean square of successive differences     27.9 (12.3)
sdsd              Standard deviation of successive differences   136.5 (33.4)
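
These time-domain features can be computed directly from a list of RR (or PP) intervals. A minimal sketch using Python's standard library (definitions follow the table above; note that conventions vary between implementations, e.g. sample vs. population standard deviation):

```python
import statistics

def hrv_features(rr_ms):
    """Time-domain HRV features from RR (or PP) intervals in milliseconds.

    Sample statistics are used here; some toolkits use population
    statistics instead, so values may differ slightly.
    """
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]  # successive differences
    return {
        "mean_rr": statistics.fmean(rr_ms),
        "mean_hr": 60000.0 / statistics.fmean(rr_ms),  # ms per minute / mean RR
        "sdnn": statistics.stdev(rr_ms),
        "rmssd": statistics.fmean([d * d for d in diffs]) ** 0.5,
        "sdsd": statistics.stdev(diffs),
    }

feats = hrv_features([800.0, 820.0, 790.0, 810.0, 805.0])
print(feats["mean_rr"])  # → 805.0
```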

See more information on data outputs in the Data Acquisition section.

Head activity

Method: Inertial Measurement Unit (IMU)

Head-movement tracking can be attained via the inertial measurement unit (IMU) integrated within the emteqPRO system. The IMU contains a gyroscope, an accelerometer and a magnetometer, each outputting data along three axes: x, y and z. Such sensors are non-invasive and can be easily integrated into wearable solutions. A wealth of research utilises inertial sensing for activity recognition in active experimental protocols and for inferring the underlying emotional state of the user.

What is provided:

  • Imu/Accelerometer.{x | y | z}
  • Imu/Magnetometer.{x | y | z}
  • Imu/Gyroscope.{x | y | z}
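
The per-axis channels above can be combined in simple ways. For example, the accelerometer's vector magnitude indicates whether the head is moving; a minimal sketch (assuming readings in g, mirroring the `Imu/Accelerometer.{x | y | z}` outputs):

```python
import math

def accel_magnitude(x, y, z):
    """Magnitude of the accelerometer vector (assumed to be in g).

    Roughly 1 g when the head is still (gravity only); deviations from
    1 g indicate linear head movement.
    """
    return math.sqrt(x * x + y * y + z * z)

# A stationary sample, with gravity along a single axis:
print(accel_magnitude(0.0, 0.0, 1.0))  # → 1.0
```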

See more information on data outputs in the Data Acquisition section.

Eye movement & Pupillometry

Method: Eye tracking (with VR headset only)

Eye tracking is a method that enables continuous monitoring of eye movements, which in turn allows us to track where the eyes are pointing (and therefore what they are looking at). Eye trackers are sensors that measure eye motion relative to the head, as well as pupil changes. Some of the most popular eye trackers use computer-vision techniques, via a video feed and infrared lights, to track the pupil. Such sensors are embedded within commercial VR headsets, such as the HTC VIVE Pro Eye and the Pico Neo 2.
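
As an illustration of the kind of processing these signals enable, here is a minimal, hypothetical blink-detection sketch over an eye-openness trace (the signal name, its 0-to-1 scale and the threshold are illustrative assumptions, not the SDK's actual output format):

```python
def detect_blinks(openness, threshold=0.2):
    """Flag blink onsets in an eye-openness trace (1.0 open, 0.0 closed).

    A blink onset is a sample that drops below the threshold while the
    previous sample was still above it.
    """
    blinks = []
    for i in range(1, len(openness)):
        if openness[i] < threshold <= openness[i - 1]:
            blinks.append(i)
    return blinks

trace = [1.0, 0.9, 0.1, 0.05, 0.8, 1.0, 0.15, 0.9]
print(detect_blinks(trace))  # → [2, 6]
```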

What is provided:

  • Object of interest detection and Event-tagging
  • Pupillometry
  • Eye blinks

Note: These features are only provided via the emteqVR Unity SDK

See more information on data outputs in the Unity SDK Data Points & Data Sections and Recording Data section.

Best Practice for Data Collection

To help you prepare for and run your data collection process, we have gathered some recommendations on best practice.

Each data collection will be unique, dependent on your own requirements. However, there are some commonalities between them:

Preparation

  • Ensure the PC or Laptop is fully updated with the latest emteq software.
    • This includes Windows updates, Steam, Pico and any other supporting software required to prevent any interruptions.
  • Clean the sensors with alcohol wipes to ensure good contact.
  • Check the available disk space and battery levels on controllers and headset as appropriate.
  • If using the Open Face SDK app, ensure you have created all required annotations.
  • Ensure that participants are not wearing excessive makeup or covering their face with hair.
    • This also includes not chewing gum or wearing eyeglasses.
  • Remind them not to adjust the mask after the initial fit and calibration is completed.

During Data collection

  • Ensure a good, comfortable yet tight fit of the emteqPRO mask is applied before proceeding with calibration.
    • Use the SuperVision mask UI to ensure good skin-to-sensor contact is attained (sensors are in stable green).
    • This will improve data quality and insights, while also preventing repetition of steps.
  • Do not move or alter the fit of the mask once calibrated. If the mask is adjusted, perform another quick expression calibration and you are ready to go.
  • Note-taking can be useful as a reminder of any major mask adjustments or technical hiccups (such as stuttering performance).
    • This can be useful when you later review large batches of data you have collected.
  • External video recording of the participant may be helpful as a reference.
Tip

You can use our SDK to create notes and events on the fly, with specific timestamps, annotated directly onto your data.

Post Data Collection

  • Ensure data are saved with the appropriate dates and codes so they can be easily retrieved later.
  • Do not delete the dab files. Remember that dab files can be converted to CSV at any later stage, or by using our cloud insights function in SuperVision.
  • Use our cloud storage for secure data saving.
  • If you are subscribed to our Cloud AI insights, download additional affective metrics for your data and run further analysis.
  • Clean the sensors with alcohol wipes to ensure appropriate sanitation.