The data in the storage is always structured in the same way:

- Project
  - Participant_1
    - Task_1
      - sensor_data.txt
      - video_data.mp4
    - Task_2
    - Task_n
  - Participant_2
  - Participant_n
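As an illustration, a minimal sketch of walking this hierarchy in Python (the root path and the printed output are assumptions for the example, not part of the storage specification):

```python
from pathlib import Path

# Hypothetical storage root; folder names follow the layout above.
root = Path("Project")

for participant in sorted(root.iterdir()):
    if not participant.is_dir():
        continue  # skip stray files at the project level
    for task in sorted(participant.iterdir()):
        if not task.is_dir():
            continue
        sensor_file = task / "sensor_data.txt"
        video_file = task / "video_data.mp4"
        if sensor_file.exists():
            print(f"{participant.name}/{task.name}: {sensor_file.name}")
        if video_file.exists():
            print(f"{participant.name}/{task.name}: {video_file.name}")
```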
Sensor data is saved in `.txt` format and includes a header and detailed data points.
The header contains metadata about the recording: file information, device specifics, sensor details such as the sampling rates (50 Hz), and calibration information. The table below describes each header field.
Header Field | Description |
---|---|
File Information | |
File/DateTime | The creation date and time of the data file |
File/FileFormat | The internal format identifier of the file |
File/User | The username assigned to this data |
File/Tags | User-defined tags associated with the file |
File/Category | The task category related to the data |
File/Exercise | Details of the task, such as the video name for video-related tasks |
Device Information | |
ConnectDevice/Name | The name of the device (e.g., iPad) used for data collection |
Software/Build.buildTag | The build tag for the software version |
Firmware/Build.buildTag | The build tag for the firmware version |
Device/Version.serialId | The serial ID of the device |
Device/Version.hardware | The hardware version of the device |
IMU Sensors (Accelerometer, Gyroscope, Magnetometer, Euler) + Pressure Sensor | |
sensor/Version.model | The model of the IMU sensors |
sensor/Properties.rawDivisor | The divisor used to convert raw sensor data into standardized units (m/s², deg/s, µT, deg, hPa) |
sensor/Config.hertz | The sampling rate of the IMU sensors |
Navigation (OCO™) and Proximity Sensors | |
Nav/Properties.rawDivisor | The divisor used to convert raw navigation data into standardized units [mm] |
Nav/Properties.sensorCount | The total number of navigation sensors |
Nav/Properties.ids[] | The IDs of the navigation sensors included in the setup |
Nav/Config/Raw.hertz | The sampling rate of the navigation sensors |
Prox/Properties.formula | The formula used to convert raw proximity data into standardized units [mm] |
Prox/Properties.sensorCount | The total number of proximity sensors |
Prox/Properties.ids[] | The IDs of the proximity sensors included in the setup |
Prox/Config/Raw.hertz | The sampling rate of the proximity sensors |
Calibration (if an expression is calibrated, both the left and right sides are considered calibrated) | |
Calibration/SmileL | Indicates whether the smile expression is calibrated on the left side (true/false) |
Calibration/SmileR | Indicates whether the smile expression is calibrated on the right side (true/false) |
Calibration/BrowraiseL | Indicates whether the eyebrow raise expression is calibrated on the left side (true/false) |
Calibration/BrowraiseR | Indicates whether the eyebrow raise expression is calibrated on the right side (true/false) |
Calibration/FrownL | Indicates whether the frown expression is calibrated on the left side (true/false) |
Calibration/FrownR | Indicates whether the frown expression is calibrated on the right side (true/false) |
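For example, a minimal sketch of checking the calibration flags, assuming the header has already been parsed into a plain Python dict (the `header` values shown here are made up):

```python
# Made-up header values; the field names follow the table above.
header = {
    "Calibration/SmileL": True,
    "Calibration/SmileR": True,
    "Calibration/BrowraiseL": True,
    "Calibration/BrowraiseR": True,
    "Calibration/FrownL": False,
    "Calibration/FrownR": False,
}

EXPRESSIONS = ["Smile", "Browraise", "Frown"]

# Per the note above, the left and right flags of a calibrated
# expression are expected to agree.
for expr in EXPRESSIONS:
    left = header.get(f"Calibration/{expr}L", False)
    right = header.get(f"Calibration/{expr}R", False)
    print(f"{expr}: calibrated={left and right}")
```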
Key data points included in each frame:
The data is provided in raw format. To convert the values into standardized units, apply the divisors specified in the header file.
The table below lists the standardized units associated with each sensor:
Sensor | Standardized Unit |
---|---|
Accelerometer | m/s² |
Gyroscope | deg/s |
Magnetometer | µT |
Euler | deg |
Navigation (OCO™) | mm |
Proximity | mm |
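As a rough sketch, the raw-to-standardized conversion could look like the following. The divisor values here are illustrative only; real values must be read from the `sensor/Properties.rawDivisor` and `Nav/Properties.rawDivisor` header fields. Proximity is omitted because it is converted with the `Prox/Properties.formula` field rather than a divisor.

```python
# Illustrative divisors; real values come from the header fields
# sensor/Properties.rawDivisor and Nav/Properties.rawDivisor.
raw_divisors = {
    "accelerometer": 1000.0,  # raw -> m/s²
    "gyroscope": 16.0,        # raw -> deg/s
    "magnetometer": 10.0,     # raw -> µT
    "euler": 100.0,           # raw -> deg
    "pressure": 100.0,        # raw -> hPa
    "navigation": 10.0,       # raw -> mm
}

def to_standard_units(sensor: str, raw_value: float) -> float:
    """Convert a raw sample by dividing by the sensor's header divisor."""
    return raw_value / raw_divisors[sensor]

print(to_standard_units("accelerometer", 9810.0))  # 9.81 m/s² with the divisor above
```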
When optional analytics is enabled in the OCO Supervision app, insights are included in the output `.txt` file:
Video recordings are saved in `.mp4` format with consistent specifications:
Advanced insights can be generated through the OCO Data Lab portal (see detailed instructions). These insights are produced by our most advanced models and, once processing is complete, are available in the "Processed Files" section.
For each file, the following outputs are generated:

- `filename_output.csv`: contains predictions for all available algorithms, with each column representing a different algorithm.
- `filename_probabilities_output.csv`: contains probability scores for each class prediction across all algorithms. Each column follows the format `{algorithm_name}_pred_proba_{class_name}`.
- `filename_output.mp4`: if a video was recorded with the iPad camera, an output video is generated that includes both the recording and overlaid prediction probabilities.

Our models currently support the following analyses:
The system classifies various facial expressions, which can reveal emotional responses and engagement levels. Possible expressions (with their numeric mappings) include:
Recognized physical activities can provide context for emotional data and overall engagement. The activities detected (with their numeric mappings) are:
Note: The numeric mappings for each class are used in the output file for class probabilities, where each class is represented by its corresponding number.
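To illustrate, here is a sketch of reading one algorithm's probability columns from the output CSV. The file name and the algorithm name `expression` are hypothetical; only the `{algorithm_name}_pred_proba_{class_name}` column pattern comes from the description above.

```python
import pandas as pd

# Hypothetical file name and algorithm name; the column pattern
# {algorithm_name}_pred_proba_{class_name} is described above.
df = pd.read_csv("filename_probabilities_output.csv")

algorithm = "expression"
prefix = f"{algorithm}_pred_proba_"
proba_cols = [c for c in df.columns if c.startswith(prefix)]

# Most likely class per row: argmax over that algorithm's
# probability columns, then strip the prefix to get the class name.
predicted = df[proba_cols].idxmax(axis=1).str[len(prefix):]
print(predicted.head())
```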