Data Collection
Each data collection project consists of a series of tasks and a group of participants. On the Home screen, click the New Project button to create a new project for data collection.
Step 1: Project Setup
- Project Name and Description: Choose a unique project name and optionally add a description.
- Tags: Create tags (e.g., 'control', 'experiment', 'male', 'female') to label participants for filtering and analysis (see the example after this list). Tags are optional and are included in the metadata.
- Click Next to proceed to the Task Screen.
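As a purely illustrative example of how tags can be used after export, the short Python sketch below filters a hypothetical project metadata file by tag. The file name and field names are assumptions; the actual OCOsense export layout may differ.

```python
# Illustrative only: filter exported participant metadata by tag.
# "project_metadata.json", "participants" and "tags" are assumed names,
# not the documented OCOsense export format.
import json

with open("project_metadata.json") as f:   # hypothetical export file
    project = json.load(f)

experiment_group = [
    p for p in project["participants"]
    if "experiment" in p.get("tags", [])
]
print(f"{len(experiment_group)} participants tagged 'experiment'")
```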
Step 2: Task Configuration
Create Tasks: Begin by creating your first task. You can rename tasks using the edit icon and add additional tasks by tapping Add Task (top right). Each task requires a data configuration (described below), with optional data and analytics settings.
Data Configurations:
- Sensor data only: Records only sensor data from the OCOsense glasses, which is automatically uploaded to the cloud when an internet connection is available.
- Sensor data + labels: Allows users to label sensor data using pre-set labels created by the researcher.
- Instructional text: Displays specified text for participants to read while sensor data is recorded.
- Instructional text + labels: Shows text while allowing labels to be added to sensor data.
- Video player: Plays a video from the iPad's local storage while sensor data is recorded. The video is synchronized with the sensor data but is not uploaded to the cloud.
Optional Data:
- Camera: Choose to record video using the iPad's front or back camera alongside sensor data. Camera data is synchronized with the sensor data and is uploaded to the cloud. Only available when the data configuration is set to Sensor data only or Video player.
- Lab Streaming Layer: Streams sensor data to integrate with other devices, allowing synchronized data collection across platforms (see the receiver sketch below). For more details on how to use Lab Streaming Layer, visit our Lab Streaming Layer support page.
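If you consume the stream on another machine, a minimal receiver can be written with the open-source pylsl package, as sketched below. The stream name used here is an assumption; check the name or type that the app actually advertises on your network.

```python
# Minimal LSL receiver sketch (pip install pylsl).
# The stream name "OCOsense" is an assumption, not a documented value.
from pylsl import StreamInlet, resolve_stream

streams = resolve_stream("name", "OCOsense")  # blocks until a matching stream is found
inlet = StreamInlet(streams[0])

while True:
    sample, timestamp = inlet.pull_sample()   # one multichannel sensor sample + LSL timestamp
    print(timestamp, sample)
```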
Optional Analytics:
- Expression Intensity and Certainty: Provides the intensity (magnitude) and confidence level of detected expressions.
- Key Metrics (an illustrative calculation is sketched after this list):
- Attention: Defined as the percentage of frames in which the user's head remains within a 12.5-degree "attention circle" centered on the iPad. The 12.5-degree radius is half of the iPad's typical horizontal field of view of about 25 degrees at a normal viewing distance. Range: 0% to 100%
- Expressivity: Defined as how much of the total time has been spent in recognized expressions, without taking into account the intensity of the expressions. Range: 0% to 100%
- Interaction: Defined as the percentage of time the user interacts with the hosting device (iPad/iPhone) through screen taps. Range: 0% to 100%
- Valence: Measures emotional tone based on both the presence and intensity of expressions, from positive (smile) to negative (frown). Range: -100% (completely negative) to 100% (completely positive)
- Engagement: A measure combining attention and expressivity, indicating the level of user involvement. Engagement is high when the user is both attentive (within the attention circle) and expressive. Range: 0% to 100%
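The sketch below illustrates, in Python, how metrics of this kind can be derived from per-frame data. The frame fields and the engagement formula are assumptions made for clarity; the app computes the actual values internally, and they may differ from this sketch.

```python
# Illustrative sketch of the key metrics above, computed from per-frame data.
# The Frame fields and the engagement combination are assumptions for
# illustration only; they are not the app's documented formulas.
from dataclasses import dataclass

@dataclass
class Frame:
    head_angle_deg: float    # angle of the head direction from the iPad centre
    expression: str | None   # e.g. "smile", "frown", or None
    intensity: float         # 0.0-1.0, meaningful only when an expression is present
    screen_tap: bool         # True if the user tapped the screen in this frame

def key_metrics(frames: list[Frame]) -> dict[str, float]:
    n = len(frames)
    attention = 100 * sum(f.head_angle_deg <= 12.5 for f in frames) / n
    expressivity = 100 * sum(f.expression is not None for f in frames) / n
    interaction = 100 * sum(f.screen_tap for f in frames) / n
    valence = 100 * sum(
        (f.intensity if f.expression == "smile" else -f.intensity)
        for f in frames if f.expression in ("smile", "frown")
    ) / n
    # Assumed combination: engagement is high only when the user is both
    # attentive and expressive (the exact formula is not documented here).
    engagement = (attention * expressivity) / 100
    return {"attention": attention, "expressivity": expressivity,
            "interaction": interaction, "valence": valence,
            "engagement": engagement}
```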
Additional tasks can be added at any time after the project is created.
Step 3: Adding Participants
- Participant Information: Enter the participant's name or ID and add additional details or tags as needed.
- Tap Save; you can then add more participants or proceed to calibration or data collection.
- To add more participants, tap Add New. You can also select participants from a list.
Calibration
Calibration is essential for accurate expression detection and additional statistical insights.
- Initiate Calibration: Tap the Calibrate button to start the calibration process.
- Expressions: Calibrate smile, frown, and eyebrow raise. A red dot indicates an uncalibrated expression, while green indicates a completed calibration.
- Saved Calibration: Calibration files are saved locally and associated with the participant ID, so repeated calibration is not necessary for returning participants.
Calibration Process
- The participant starts in a neutral position and follows on-screen instructions.
- Press START to begin recording each expression.
- The participant must hold the expression for 3 seconds, tracked by an on-screen timer. If the expression is released early, the participant must restart it; once it has been held for the full 3 seconds, they can proceed to the next expression (see the sketch after this list).
- If the glasses vibrate during calibration, they need to be reset. Instructions will appear on-screen, and the reset completes automatically once the participant returns to a neutral position.
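As a rough illustration of the hold-and-restart rule (not the app's actual code), the timing logic can be sketched as follows, with is_expression_held() standing in for the expression detector in the glasses.

```python
# Illustrative sketch: the expression must be held continuously for 3 seconds,
# and releasing it early restarts the timer. is_expression_held() is a
# hypothetical stand-in for the detector inside the app.
import time

HOLD_SECONDS = 3.0

def wait_for_held_expression(is_expression_held) -> None:
    held_since = None
    while True:
        if is_expression_held():
            if held_since is None:
                held_since = time.monotonic()           # hold just started
            elif time.monotonic() - held_since >= HOLD_SECONDS:
                return                                  # held long enough, move on
        else:
            held_since = None                           # released early: restart
        time.sleep(0.05)
```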
With this setup, your project is ready for data collection. Be sure to monitor each step closely to ensure data accuracy.
Step 4: Record Data
- Once a project with at least one task and one participant is created, data can be recorded. From the task screen, select the task for which you want to record data.
- For each data collection configuration, you can edit the file name using the pen icon next to 'File name'. Tap the red button (top right) to start and stop the recording.
- Top left (sensor data and video): The only information on the screen is the video being recorded (front-facing or back-facing camera). You can change the camera before starting the recording. No other information is displayed, to avoid distractions.
- Top right (sensor data and labels): You can create categories and subcategories of labels. During data collection, participants or researchers can tap a label, and it is added to the raw data file. The iPad's front/back camera cannot be used in this mode.
- Bottom left (instructional text): The researcher can upload a text file (.txt) for the participant to read, or instructions for the participant to follow. The text file must be saved in, and uploaded from, the iPad Folder. The iPad's front/back camera cannot be used in this mode, and the text file is not uploaded to the cloud.
- Bottom right (video player): The researcher can upload a video for the participant to watch, or instructions for the participant to follow. The video must be saved in, and uploaded from, the iPad Folder. The video shown to the participant is not uploaded to the cloud.
- If the iPad is connected to Wi-Fi, the sensor data and video recorded from the iPad are uploaded to the cloud automatically; a message confirms successful upload of the sensor data and the video data separately. If the iPad is not connected to Wi-Fi, data is stored temporarily on the iPad and is uploaded to the cloud automatically once a Wi-Fi connection is established. Large files are uploaded in batches. Note that raw data cannot be accessed or downloaded from the OCOsense app but is available for download via the cloud.
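The upload behaviour described above can be thought of as a store-and-forward queue, sketched below for illustration only; the helper functions are hypothetical and this is not the OCOsense app's implementation.

```python
# Illustrative sketch of the store-and-forward behaviour: recordings are kept
# locally and uploaded whenever a connection is available, larger backlogs in
# batches. wifi_available(), pending_recordings() and upload() are hypothetical
# helpers, not part of the OCOsense app or its API.
import time

BATCH_SIZE = 5  # assumed batch size for illustration

def sync_loop(wifi_available, pending_recordings, upload):
    while True:
        if wifi_available():
            pending = pending_recordings()      # locally stored sensor/video files
            for i in range(0, len(pending), BATCH_SIZE):
                for recording in pending[i:i + BATCH_SIZE]:
                    upload(recording)           # e.g. confirm sensor and video separately
        time.sleep(60)                          # re-check the connection periodically
```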