How Did Neurable Validate Our AI’s Ability to Track Focus?

5 min read
Neurable Team

While building good focus habits and racking up Focus Points should feel easier—and even fun—over time, the scientific process behind validating how our electroencephalogram (EEG) sensors and AI capture, track, and measure focus was incredibly rigorous and serious. Here’s how we did it.

Proving EEG Performance: Enten vs. Quick20

In 2021, we ran a series of 45 experiments with 132 participants across 337 sessions using one of our early prototypes, the “Enten,” to confirm that our consumer-friendly EEG sensors captured brain data on par with the Quick20 EEG hardware used by neuroscientists and neurosurgeons in clinical settings, and that our AI accurately estimated focus.

A 2021 photo of our Enten headphone prototype. The Enten had 11 EEG electrodes on each ear pad, including reference (Ref) and ground (GND) electrodes.

First, we needed to prove our Enten headset could reliably capture neural activity as well as Quick20 headsets, so we ran a series of industry-standard EEG protocols: Eyes Open and Closed, Auditory Steady-State Response (ASSR), and P300. In each of these experiments, our headset’s performance at picking up neural activity was on par with, or even slightly better than, the industry-standard hardware.

Spectrogram results from the Eyes Open and Closed Experiment. This recording is from a single participant during two minutes of the experiment, in which the eyes are open for 30 seconds and then closed for 30 seconds. The procedure is repeated twice, as recorded by the Enten headphones (left) and the Quick20 (middle and right).

Our Eyes Open and Closed experiment measured Alpha wave activity; closed eyes produce greater alpha oscillations. Participants were instructed to keep their eyes open for 30 seconds, then closed for 30 seconds. We ran the experiment twice, concluding our sensors were at least on par with, and perhaps slightly better than, Quick20 headsets at picking up ear-centric Alpha activity.
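
For readers who want a concrete picture of this analysis, here is a minimal Python sketch of how alpha band power can be compared between eyes-open and eyes-closed segments using scipy’s Welch periodogram. It is an illustration only: the 250 Hz sampling rate, single channel, and fixed 30-second segments are assumptions for the example, not our actual processing pipeline.

from scipy.signal import welch

FS = 250  # assumed sampling rate in Hz (illustrative, not our real pipeline)

def alpha_power(segment, fs=FS):
    # Average power spectral density in the 8-12 Hz alpha band.
    freqs, psd = welch(segment, fs=fs, nperseg=fs * 2)
    band = (freqs >= 8) & (freqs <= 12)
    return psd[band].mean()

def eyes_closed_to_open_ratio(eeg, fs=FS):
    # eeg: one ear-electrode channel, 30 s eyes open followed by 30 s eyes closed.
    open_seg = eeg[: 30 * fs]
    closed_seg = eeg[30 * fs : 60 * fs]
    return alpha_power(closed_seg, fs) / alpha_power(open_seg, fs)

# A ratio well above 1 reflects the expected effect: alpha oscillations
# grow stronger when the eyes are closed.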

Our Auditory Steady-State Response (ASSR) experiment measured how the brain’s auditory pathways respond to certain sound stimuli. We played participants a series of tones modulated at specific frequencies, expecting to see an increase at those frequencies in their brainwaves, and we did, particularly from the electrodes over the auditory cortex. We were excited to discover Neurable’s sensors were more sensitive at picking up these signals than the Quick20’s sensors.

Our P300 experiment used an unexpected visual prompt to stimulate a rapid change in neural activity. When participants saw something “odd,” we expected our headset to record a signal spike around 300 milliseconds after the stimulus was presented. In our experiment, participants were shown four words and told to press a button when they saw the word “BLUE,” which appeared 25% of the time. We saw similar P300 amplitudes in both the Enten and Quick20 EEGs.

Results from the P300 Experiment. The participant was instructed to press a key every time the word “BLUE” appeared on the screen, ignoring all the other words. This experiment was performed twice: once with the Quick20 and once with the Enten.
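
If you are curious how a P300 is typically pulled out of raw EEG, here is a minimal sketch of the standard approach: cut the recording into short epochs time-locked to each stimulus, then average the target (“BLUE”) and non-target epochs separately and compare them around 300 milliseconds. The sampling rate, epoch window, and baseline correction below are illustrative assumptions, not a description of our exact analysis.

import numpy as np

FS = 250                 # assumed sampling rate in Hz
WINDOW = int(0.8 * FS)   # keep 800 ms of EEG after each stimulus onset

def p300_averages(eeg, onsets, is_target):
    # Stack one epoch per stimulus, time-locked to its onset sample.
    epochs = np.stack([eeg[o : o + WINDOW] for o in onsets])
    # Remove each epoch's mean so slow drifts don't dominate the average.
    epochs = epochs - epochs.mean(axis=1, keepdims=True)
    is_target = np.asarray(is_target, dtype=bool)
    return epochs[is_target].mean(axis=0), epochs[~is_target].mean(axis=0)

# The P300 appears as a larger positive deflection in the target average
# near sample int(0.3 * FS), i.e., about 300 ms after the word appears.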

Proving Our AI Could Predict Focus

After we validated our hardware, we turned our attention to validating our AI by asking participants to complete a series of experiments called Distraction Stroop Tasks and Interruption by Notification, which were designed to mimic how focus might be maintained or disrupted in a real-world office setting. We wanted to demonstrate that these tasks would cause changes in focus levels and that our proprietary algorithm would be able to detect those changes in attention consistently, across several days, regardless of participant. And we did! Our AI correctly captured 80% ± 4.1% of distractions across subjects, time points, and conditions.

Results from our Stroop task experiments with notification distractions layered in to mimic a real-world focus killer: push notifications. In experiment 2A, when participants heard a sound, they had to switch windows, respond to a chatbot, and quickly return to the Stroop task. In experiment 2B, every time participants heard the notification, they had to solve a math problem. In experiment 2C, when participants heard the notification, they had to recall an object that was the same color as the word on the screen. Below each experiment is the number of participants and valid sessions.

Distraction Stroop Tasks experiment: The Stroop effect (also known as cognitive interference) is a psychological phenomenon describing the difficulty people have naming a color when it’s used to spell the name of a different color. During each trial of this experiment, we flashed the words “Red” or “Yellow” on a screen. Participants were asked to respond to the color of the words and ignore their meaning by pressing one of four keys on the keyboard (“D,” “F,” “J,” and “K”), which were mapped to the colors “Red,” “Green,” “Blue,” and “Yellow,” respectively. Trials in the Stroop task were categorized as congruent, when the text content matched the text color (e.g., the word “Red” shown in red), and incongruent, when the text content did not match the text color (e.g., the word “Red” shown in a different color). The incongruent case was counter-intuitive and more difficult, so we expected to see lower accuracy, higher response times, and a drop in Alpha band power in incongruent trials. To mimic the chaotic, distraction-filled environment of in-person office life, we added an additional layer of complexity by floating the words on different visual backgrounds (a calm river, a roller coaster, a calm beach, and a busy marketplace). Both the behavioral and neural data we collected showed consistently different results in incongruent trials, such as longer reaction times and lower Alpha waves, particularly when the words appeared on top of the marketplace background, the most distracting scene.
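
To make the trial logic concrete, here is a small sketch of how a trial can be labeled congruent or incongruent and how a keypress can be scored against the mapping described above. The function names and trial structure are made up for illustration; only the key-to-color mapping comes from the experiment description.

# Key-to-color mapping from the experiment description above.
KEY_TO_COLOR = {"D": "Red", "F": "Green", "J": "Blue", "K": "Yellow"}

def label_trial(word, ink_color):
    # Congruent when the word's meaning matches the color it is drawn in.
    return "congruent" if word == ink_color else "incongruent"

def is_correct(ink_color, key_pressed):
    # Participants respond to the ink color and ignore the word's meaning.
    return KEY_TO_COLOR.get(key_pressed) == ink_color

# Example: the word "Red" drawn in yellow is an incongruent trial, and
# pressing "K" (mapped to Yellow) is the correct response.
print(label_trial("Red", "Yellow"))  # incongruent
print(is_correct("Yellow", "K"))     # True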

Interruption by Notification: It’s widely known that push notifications decrease focus levels. In our three Interruption by Notification experiments, participants performed the Stroop tasks described above with and without push notifications, which consisted of a sound played at a random time followed by a prompt to complete an activity. Our behavioral analysis and focus metrics showed that, on average, participants had slower reaction times and were less accurate during blocks of time with distractions than during blocks without them.
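
As a rough illustration of that behavioral comparison, the sketch below computes mean reaction time and accuracy for blocks with and without notification interruptions. The trial fields ("rt", "correct", "distracted") are assumptions for the example, not our actual data format.

from statistics import mean

def summarize_blocks(trials):
    # trials: list of dicts with 'rt' (seconds), 'correct' (bool), 'distracted' (bool).
    summary = {}
    for distracted in (False, True):
        subset = [t for t in trials if t["distracted"] == distracted]
        label = "with notifications" if distracted else "without notifications"
        summary[label] = {
            "mean_rt": mean(t["rt"] for t in subset),
            "accuracy": mean(1.0 if t["correct"] else 0.0 for t in subset),
        }
    return summary

# Slower mean_rt and lower accuracy in the "with notifications" blocks
# mirror the pattern we observed in the behavioral data.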

This is a high-level overview of our results. For those who enjoy digging into the scientific details, read our full white paper here.

Never Stop Iterating, Experimenting, Improving

Since 2021’s Enten prototype, we’ve gone through two product-cycle iterations, arriving at the MW75 Neuro, our most cutting-edge Neurable AI device to date. We have continued to evolve our focus algorithm, too, so it performs well for anyone regardless of age, race, head shape, hairstyle, skin type, and so on.

After nearly a decade of research with thousands of users, we’re proud to say our technology and proprietary AI still perform as well as, or better than, industry-standard EEGs, a big leap toward bringing brain-computer interface neurotechnology out of the research lab and into daily life. You could say we’ve been laser-focused on creating products that help people banish distractions and build better focus habits. What’s your Focus Goal for today?

