Neurable co-founders Ramses Alcaide and Adam Molnar recently sat down to discuss the state of BCI technology and neuroethics today—head over to Part 1 of this Q&A here to check out that conversation. Now, they’re sharing their perspectives on where the industry is headed, and the steps companies need to take to achieve broader consumer adoption.
Because BCI technology is still in its early days, there’s the opportunity to set standards when it comes to ethics. How do you think companies should go about doing this?
Adam: I believe there are two parts to this answer, and you need both to successfully set those standards. First, you need to make sure that each individual is informed from the start about what they’re getting themselves into—they should know exactly what data they’ll be giving you, whether it will go anywhere else, and how you’re going to protect it. Second, you need to stick to your word and actually protect and manage that data.
At Neurable, we built a unique encryption method specifically for each individual’s information. We’re explicit about not selling our users’ data. That’s part of our moral fiber as a company.
Ramses: Big tech has changed everything when it comes to privacy. We don’t want to follow that model, and we refuse to build our business by monetizing our users.
Also, as I’ve said before, companies need to stop overhyping their products. Some neurotechnology companies have created more of a show than a product, which hurts the entire industry. There needs to be solid evidence and reasoning behind anything that you create, which is why we publicly released a whitepaper explaining the technology behind Enten. To us, transparency is key both to setting standards and to building consumer trust.
How much transparency have you seen in BCI startups? Is it enough?
Adam: As a neurotechnology startup that released a whitepaper giving a behind-the-scenes look into our technology, we’re definitely in the minority. Generally—whether invasive or non-invasive—BCI companies can seem like a black box, which makes it hard to evaluate their levels of transparency when you have no insight into what they’re actually doing.
Ramses: It’s also important to us to leave ourselves open to conversations with consumers. We have a Neurable Discord channel, which allows us to be open with the public about what exactly we’re doing.
What excites you about the road ahead for your industry?
Adam: What I’m particularly excited about with Enten is the fact that we get to build a product alongside consumers, which automatically means that their preferences and comfort are built in. There’s a difference between doing it that way, versus simply building it, shipping it, and retroactively asking for feedback. That’s why we decided to launch Enten with an Indiegogo campaign—it enabled us to start building that consumer trust and relationship from the very start.
Ramses: In terms of industry trends, I think it’s important to point out that Enten is a pair of headphones, which a consumer can take on and off. The choice to let EEG sensors gather their data is completely theirs. So, I think the positive reception to the launch reflects how BCI companies are increasingly realizing how important transparency is to consumers, who want more autonomy over the data they offer up. Wearables, by nature, allow that kind of autonomy.
More broadly, though, we’re seeing a growing trend of industry leaders advocating for increased transparency and consumer rights when it comes to emerging neurotechnology. For instance, Dr. Rafael Yuste, a professor at Columbia University’s Neurotechnology Center, has begun to draft “neurorights” with his team—those are essentially the ethical foundations that would regulate the use and development of advanced neurotechnologies. I want to see more startups acknowledge that these types of guardrails are critical for trust and transparency when it comes to consumer products, not just in academia.
Adam: With Enten, and with our own ethical standards, we’re not reinventing the wheel. We’ve always wanted to be a company that first and foremost does good, and we firmly believe that if we follow good data practices, consumer confidence and broader adoption will follow.