For offline data analysis the system provides:
When you initiate data saving, or when using our sample Unity scene, the output data will be saved on your PC under the Documents folder. The exact path will differ depending on the application in use. For example:
C:\Users\[user name]\Documents\EmteqLabs\[application name]\Upload
From each recording session you will find a .dab and a .json file. The .dab file contains the sensor data, and the .json file contains the event markers created by the Unity SDK. Please refer to the SDK Data Point & Data section for further information on the event markers.
In the following sections we will describe the general format of the sensor data (.dab) saved when using the emteqPRO system, and how to convert it to a .csv file.
The ‘sensor data file' refers to the data collected in the .dab file when using the Unity SDK and/or SuperVision. The sensor data file (e.g. date_time_name.dab) is a binary-formatted data file.
The .dab file contains the raw sensor data. It can be quickly converted to comma-separated files (.csv), which can be easily opened, read, and consumed by external data-processing software, e.g. Python, MATLAB, and R.
Additional data points and affective insights can be added using our cloud-based ML models. The process is explained in the section Obtaining affective insights.
You can convert a .dab file into a comma-separated file (.csv) using the dab2csv converter from the EmteqDabTools package, which we provide as a download. Once you have downloaded and installed EmteqDabTools.exe, you can convert .dab files on the go.
Detailed CSV specification: For version management, the official emteqPRO CSV Specification is available for reference. Please refer to this file for detailed row/column information by version.
The .csv file generated contains all the data recorded from the time the data recording was triggered (see #Time/Seconds.referenceOffset) until the end of the recording. The .csv file will look similar to the one shown below
The .csv file contains:
Information about the system and the frequency rates is exposed in the first section. In the CSV example above, line 32 contains the headers for each column, corresponding to each sensor channel currently available in real time. All comments are indicated by the symbol ‘#' at the start of the line. The system information contains the following:
The sensor data and physiological insights are output directly under the system information section in the .csv file.
This contains ‘Time' (time elapsed since the start of the recording), then EMG contact (impedance), EMG raw, EMG filtered, and EMG amplitude (RMS) for each sensor (from ‘0', the first EMG sensor, to ‘6', the last), followed by the raw PPG sensor data, PPG average heart rate (BPM), HRV metrics, and IMU sensor data for each axis.
Important: The row or line on which the headers are output can vary between hardware and firmware versions.
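Because all comment lines start with ‘#' and the header row position can vary, it is convenient to let the CSV reader skip commented lines rather than hard-coding a row number. Below is a minimal sketch using pandas; the file content is a tiny illustrative stand-in (the referenceOffset value and number of columns are not real), but the column names come from the table further down.

```python
import io

import pandas as pd

# Miniature stand-in for a Dab2CSV output file: commented system-information
# lines (prefixed with '#') followed by the header row and sensor records.
# Real files contain many more comment lines and columns; the values here
# are illustrative only.
sample = """\
# emteqPRO system information (illustrative)
#Time/Seconds.referenceOffset: 694224000.0
Time,Emg/Amplitude[0],HeartRate/Average
0.00,12.5,72.0
0.02,13.1,72.0
"""

# comment='#' makes pandas drop every commented line, so the first
# uncommented row is taken as the header regardless of where it falls.
df = pd.read_csv(io.StringIO(sample), comment="#")
print(df.columns.tolist())  # ['Time', 'Emg/Amplitude[0]', 'HeartRate/Average']
```

For a real recording, replace the `io.StringIO(sample)` argument with the path to your .csv file.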
The table below shows a detailed view of the measures provided.

| Measure | Description |
|---|---|
| Time | Time elapsed since the start of the recording. |
| Faceplate/FaceState | Discrete OFF>ON face information indicating when the device is detected as being worn. |
| Faceplate/FitState | Abstract continuous measurement of mask ‘fit', with higher values representing the ideal state of system performance/quality. |
| Faceplate/FitState.any# | Supplementary data counting the number of electrode pairs which have any contact on either electrode of the pair. |
| Faceplate/FitState.both# | Supplementary data counting the number of electrode pairs which are both in contact. |
| Faceplate/FitState.settled# | Supplementary data counting the number of electrode pairs with settled contact. |
| Emg/ContactState[0-6] | Discrete (OFF>ON>STABLE>SETTLED) contact information (8-bit value). |
| Emg/Contact[0-6] | Impedance measurement of electrode-to-skin contact when #Emg/Properties.contactMode is AC mode. |
| Emg/Raw[0-6] | Raw analog signal from each EMG measurement sensor. |
| Emg/RawLift[0-6] | Supplementary AC contact-mode signal (may be removed in future versions). |
| Emg/Filtered[0-6] | Filtered EMG data containing only in-band EMG measurements. |
| Emg/Amplitude[0-6] | Amplitude of the in-band muscle EMG signal, acquired by a moving-window RMS over Emg/Filtered. |
| HeartRate/Average | Average beats-per-minute (BPM) of the cardiac cycle as measured from the photoplethysmographic (PPG) sensor on the user's forehead. |
| HRV/rr | Average RR interval (milliseconds). |
| HRV/sdnn | Standard deviation of the RR intervals. |
| HRV/rmssd | Root mean square of successive differences. |
| HRV/sdsd | Standard deviation of successive differences. |
| Ppg/Raw.ppg | Raw photoplethysmographic (PPG) sensor reading, which detects variation in blood volume within the skin of the user's forehead. |
| Ppg/Raw.proximity | Raw proximity sensor reading. |
| Imu/Accelerometer.(x y z) | IMU sensor reading of linear acceleration for the X, Y, and Z axes. |
| Imu/Magnetometer.(x y z) | IMU sensor reading of magnetic-field strength on the X, Y, and Z axes, which can be used to derive absolute orientation or compensate for gyroscopic drift. |
| Imu/Gyroscope.(x y z) | IMU sensor reading of angular velocity on the X, Y, and Z axes. |
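The Emg/Amplitude channels above are described as a moving-window RMS over Emg/Filtered. A minimal sketch of that computation is shown below; note that the window length used here is an assumption for illustration, since the firmware's actual window size is not specified in this section.

```python
import numpy as np


def moving_rms(signal: np.ndarray, window: int) -> np.ndarray:
    """Moving-window RMS: square the signal, average over a sliding
    window, then take the square root."""
    squared = np.square(signal.astype(float))
    kernel = np.ones(window) / window
    # mode="same" keeps the output aligned sample-for-sample with the input.
    return np.sqrt(np.convolve(squared, kernel, mode="same"))


# Illustrative use on a synthetic filtered-EMG-like signal.
rng = np.random.default_rng(0)
filtered = rng.normal(0.0, 1.0, size=500)
amplitude = moving_rms(filtered, window=50)  # window size is an assumption
```

In practice you would apply `moving_rms` to an `Emg/Filtered[n]` column loaded from the .csv file, or simply use the `Emg/Amplitude[n]` column already provided.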
Important: The raw data values can be output directly in Volts if you select the Dab2CSV ‘Normalised' function.
For more information on each feature output in the CSV file, please refer to the CSV Specification.
We provide some sample data extraction and analysis scripts in the ‘Data Processing’ section.
Seconds.referenceOffset in the system information can be found in the last commented lines before the sensor data in the .csv file. It defines the absolute date-time at which the recording started, expressed as a J2000 epoch timestamp (seconds elapsed since 01-01-2000).
Converting the epoch timestamp to a human-readable date & time

To convert Seconds.referenceOffset to Unix time, add the offset of +946684800 seconds between 01-01-1970 and 01-01-2000:

unixTime = Seconds.referenceOffset + 946684800

The resulting Unix time can then be converted to a human-readable date-time. If you want to do the same for each sample (or observation), simply add this Unix starting time to the ‘Time' column values.
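The conversion above can be sketched in a few lines of Python; the input value of 0.0 used in the example is illustrative only (read the real value from the Seconds.referenceOffset comment line in your .csv file).

```python
from datetime import datetime, timezone

# Offset between the Unix epoch (1970-01-01) and the J2000 epoch (2000-01-01).
J2000_TO_UNIX = 946_684_800


def j2000_to_datetime(seconds_reference_offset: float) -> datetime:
    """Convert a J2000 epoch timestamp to an absolute UTC date-time."""
    unix_time = seconds_reference_offset + J2000_TO_UNIX
    return datetime.fromtimestamp(unix_time, tz=timezone.utc)


# Illustrative value; a referenceOffset of 0.0 is exactly the J2000 epoch.
start = j2000_to_datetime(0.0)
print(start.isoformat())  # 2000-01-01T00:00:00+00:00
```

To timestamp each individual sample, add the computed Unix start time to the values in the ‘Time' column before converting.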
The ‘Time' column in the sensor data output (usually the second column from the left) gives the time in seconds at which each record was captured, with ‘0.0' being the beginning of the recording. ‘Time' between records is regular in the .csv and is governed by the fastest configured measurement: if all measurements are set to 50 Hz the .csv will have 50 records per second, but if a single measurement is set to 1 kHz the .csv will have 1000 records per second.
‘Time' may not always be perfectly regular, as clocks may drift over very long recordings. Also, on rare occasions, whenever there is an ‘AdsMiss' event there will be a larger step. This shouldn't occur, but any consumer of the data should be aware of potential gaps in the data due to PC or device performance issues. Refer to the CSV Specification for the specific AdsMiss message information.
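Since such gaps are possible, it is worth sanity-checking the ‘Time' column before analysis. A minimal sketch using NumPy is shown below; the 1.5x tolerance threshold is a heuristic assumption, and should be chosen based on your configured sample rate.

```python
import numpy as np


def find_time_gaps(time_s: np.ndarray, nominal_rate_hz: float,
                   tolerance: float = 1.5) -> np.ndarray:
    """Return indices where the step between consecutive samples exceeds
    `tolerance` times the nominal sample period (a heuristic threshold)."""
    expected_step = 1.0 / nominal_rate_hz
    steps = np.diff(time_s)
    return np.nonzero(steps > tolerance * expected_step)[0]


# Illustrative 50 Hz timeline with one artificial gap inserted.
t = np.arange(0.0, 1.0, 0.02)
t[30:] += 0.5  # simulate a large step (e.g. an 'AdsMiss' event)
print(find_time_gaps(t, nominal_rate_hz=50.0))  # [29]
```

Any reported indices mark where the recording jumps forward; downstream analyses (e.g. HRV windows) should treat those spans as missing data rather than contiguous samples.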
The .csv sensor data and .json event data files saved during a data recording session can be imported and analysed in data-analysis tools such as MATLAB, Python, and R. Event data and sensor data can be synchronised using the timestamps/‘Time' values (for more information see the CSV Specification).
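One way to synchronise the two streams in Python is an as-of merge, which attaches each event to the nearest preceding sensor record. This is a sketch on tiny illustrative frames: in practice you would load the .csv and .json files, and the `Timestamp` and `Label` event field names here are assumptions, as is the premise that both streams are already on a common time base (convert them first if they are not).

```python
import pandas as pd

# Illustrative stand-ins for the sensor data (.csv) and event markers (.json).
# Field names 'Timestamp' and 'Label' are hypothetical.
sensors = pd.DataFrame({"Time": [0.00, 0.02, 0.04, 0.06],
                        "HeartRate/Average": [72.0, 72.0, 72.5, 73.0]})
events = pd.DataFrame({"Timestamp": [0.03], "Label": ["StimulusOnset"]})

# merge_asof (default direction='backward') matches each event with the
# last sensor record whose Time is <= the event's Timestamp.
merged = pd.merge_asof(events, sensors, left_on="Timestamp", right_on="Time")
print(merged["Time"].iloc[0])  # 0.02
```

Both frames must be sorted by their time columns before calling `merge_asof`; here they already are.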
To help you get started with data processing, we provide sample analysis scripts for Python along with some sample data. The associated data for both scripts can be downloaded from the Downloads section of this support site.
Please Contact Support if you require any additional information.
Physiological data analysis references:
Fridlund, A. J., & Cacioppo, J. T. (1986). Guidelines for human electromyographic research. Psychophysiology, 23(5), 567-589.
An integral part of the emteqPRO system is an AI module that recognizes the affective state of the user from physiological data. This module is based on novel data fusion and machine learning models that recognize the user's arousal and valence using the PPG, EMG, and IMU sensors. The models output three-level arousal (low, medium, and high) and three-level valence (negative, neutral, and positive), providing an output (arousal and valence) every 10 seconds.
To obtain the affective insights for a particular recording, you can use the ‘Data Insights' tab in the SuperVision application, which requires an active internet connection to function. There you should provide two files:
Note: Please ensure the version number of the data insights you are using corresponds to the version of the system you are currently recording data with. The file requirements and the data outputs may be updated in future versions.
Once the processing is done, you can download the resulting .csv file in the ‘Your Insights' section. The generated .csv file will contain all the data from the original input data file, with six additional columns representing the machine learning models' output:
Based on the dimensional emotion model approach (see the circumplex model of affect; Russell, 1980), all emotional states can be represented within the affect space. In short, valence (x-axis) can be described as the polarity of the affective state, i.e. its level of positivity or negativity. Arousal (y-axis), on the other hand, is the physiological and behavioural intensity of an affective state, for example the corresponding increase in heart rate or in the loudness of our voice, ranging from low (sleepy) to high (excited or stimulated) levels.
Other data insights:
Posner, J., Russell, J. A., & Peterson, B. S. (2005). The circumplex model of affect: an integrative approach to affective neuroscience, cognitive development, and psychopathology. Development and psychopathology, 17(3), 715–734. https://doi.org/10.1017/S0954579405050340