
An AR brain-computer interface that shows the color, speed, and shape of our emotions

If we could see our emotions through a Brain-Computer Interface (BCI) device based on Augmented Reality (AR), what color, shape, or velocity would they be?

“Is it really possible to tell someone else what one feels?” ― Leo Tolstoy, Anna Karenina

Over the course of evolution, the human species has learnt to mask its true feelings. We smile in polite society even when we hurt or rage inside, so much so that at times even we don’t realize what we are actually feeling.

What if there were a BCI platform that could show us our true feelings on a screen? Would they shock us? Would we be relieved at finally being aware of what we’ve known all along?

Kammil Carranza

“My basic inspiration came from understanding something that you cannot see but you do feel. Something similar to ghosts”

Kammil Carranza, Creative Director and Project Manager at Augmented Island Studios, has come up with just such a device, one that uses AR to display an individual’s true emotions by rendering their perception on a screen through color and velocity.

The Sociable spoke with Carranza about his project Daydreamers.

“Sometimes you feel angry, but you might wonder what your anger looks like or whether it’s different from another person’s anger. So I wanted to give these abstract concepts a representation,” he says.

Read More: Virtual Reality Takes Consciousness Research into Mystic Realms of the Divine Play

Daydreamers builds a new perception of reality, proposing that the world around us be constructed as a direct response to our brain waves, which can be interpreted as the emotions we are experiencing.

Inspiration

Carranza started the project as part of his master’s thesis at the Institute for Advanced Architecture of Catalonia (IAAC) under Professors Luis Fraguada and Elizabeth Bigger, within the advanced interaction research line, with a focus on how wearable technology can augment the senses.

Read More: Brain-computer interface helps turn brainwave patterns into speech

He reveals that in the beginning, the product was about recording dreams, so that people could experience other people’s dreams as well as see them on an AR platform.

“My basic inspiration came from understanding something that you cannot see but you do feel. Something similar to ghosts,” he says.

How it Works

For Daydreamers to work, the user wears a Muse headband, an EEG headset that measures brain activity. Based on the data recorded by this sensor, the user sees custom digital content set in the real world, with AR serving as the visualization platform.
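To give a sense of the sensing side, here is a minimal sketch of reading EEG samples from a Muse-style headband, assuming the device has been bridged to Lab Streaming Layer (for example with the muselsl tool). The stream lookup, assumed sampling rate, and simple alpha-band calculation are illustrative assumptions, not Daydreamers’ actual code.

```python
# Minimal sketch: pull EEG samples from an LSL stream (e.g., a Muse headband
# bridged with `muselsl stream`) and estimate alpha-band power per channel.
# Assumptions: an EEG stream is on the network; sampling rate is 256 Hz.
import numpy as np
from pylsl import StreamInlet, resolve_byprop

streams = resolve_byprop("type", "EEG", timeout=10)
if not streams:
    raise RuntimeError("No EEG stream found -- is the headset streaming?")
inlet = StreamInlet(streams[0])

fs = 256          # assumed sampling rate of the headband
window = []       # one-second rolling buffer of samples

while True:
    sample, _timestamp = inlet.pull_sample()
    window.append(sample)
    if len(window) >= fs:                    # once per second, compute band power
        data = np.array(window)              # shape: (samples, channels)
        spectrum = np.abs(np.fft.rfft(data, axis=0)) ** 2
        freqs = np.fft.rfftfreq(len(data), d=1.0 / fs)
        alpha = spectrum[(freqs >= 8) & (freqs <= 12)].mean(axis=0)
        print("alpha power per channel:", alpha)
        window.clear()
```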

“If you feel excitement or fear, it will change its shape, color and also the velocity of some elements”

In simple terms, if you are wearing the headset and looking at a tree, the screen will show the tree in a shape and color that reflect your emotional state.

“I can pick the data and send it to the phone, transforming the content in real-time. So if you feel excitement or fear, it will change its shape, color and also the velocity of some elements,” Carranza explains.

The screen starts to reflect the emotions of the user by changing the pattern, velocity, and colors of whatever the user perceives.

Read More: Terence McKenna’s ‘cyberdelic’ predictions for Virtual Reality 25 years on

“So if you are depressed, it starts to show slow movement or goes purple and a bit faded. It converts the user data in a way to express his/her own emotions,” he says.
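To make that mapping concrete, here is a small sketch of how an emotion estimate might be translated into the color, fading, and movement speed described above. The valence/arousal inputs, thresholds, and color choices are assumptions for illustration, not Daydreamers’ actual parameters.

```python
# Illustrative sketch: map a crude valence/arousal estimate to visual
# parameters (hue, saturation, animation speed) like those Carranza describes.
# All scores, thresholds, and colors here are assumptions for illustration.
from dataclasses import dataclass

@dataclass
class VisualParams:
    hue: float         # 0-360 degrees
    saturation: float  # 0-1, lower = more faded
    speed: float       # relative animation-speed multiplier

def emotion_to_visuals(valence: float, arousal: float) -> VisualParams:
    """valence and arousal are assumed to be normalized to [-1, 1]."""
    if valence < -0.3 and arousal < 0.0:
        # low mood: slow, purple, faded -- as in the depression example above
        return VisualParams(hue=280.0, saturation=0.4, speed=0.5)
    if arousal > 0.5:
        # excitement or fear: fast movement, hot colors, full saturation
        return VisualParams(hue=20.0, saturation=1.0, speed=2.0)
    # neutral baseline
    return VisualParams(hue=120.0, saturation=0.7, speed=1.0)

print(emotion_to_visuals(valence=-0.6, arousal=-0.2))
# VisualParams(hue=280.0, saturation=0.4, speed=0.5)
```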

Carranza did not have a commercial application in mind while creating the project, but he believes it offers social insights. The research supports the benefits of creating a BCI that can control digital and physical content.

Read More: Brain-computer interface allows for telepathic piloting of drones

Even if the digital content exists in a virtual world, the device offers a reasonable understanding of how the physical and digital worlds can be mixed. In the process, users can not only delve into themselves for self-awareness, but also develop empathy by understanding another person’s emotions.

Navanwita Sachdev

An English literature graduate, Navanwita is a passionate writer of fiction and non-fiction, as well as a published author. She hopes her desire to be a nosy journalist will be satisfied at The Sociable.
