
An AR brain-computer interface that shows the color, speed and shape of our emotions

If we could see our emotions through a brain-computer interface (BCI) device based on augmented reality (AR), what color, shape, or velocity would they be?

“Is it really possible to tell someone else what one feels?” ― Leo Tolstoy, Anna Karenina

With evolution, the human species has learnt to mask its true feelings. We smile in polite society even when we hurt or rage inside. So much so that, at times, even we don’t realize what we are actually feeling.

What if there were a BCI platform that could show us our true feelings on a screen? Would they shock us? Would we be relieved at finally being aware of what we’ve known all along?

Kammil Carranza

“My basic inspiration came from understanding something that you cannot see but you do feel. Something similar to ghosts”

Kammil Carranza, Creative Director and Project Manager at Augmented Island Studios, has come up with just such a device, one that uses augmented reality to display an individual’s true emotions by rendering their perception of the world on a screen through color and velocity.

The Sociable spoke with Carranza about his project Daydreamers.

“Sometimes you feel angry, but you might wonder what your anger looks like or whether it’s different from another person’s anger. So I wanted to give these abstract concepts a representation,” he says.

Read More: Virtual Reality Takes Consciousness Research into Mystic Realms of the Divine Play

Daydreamers builds a new perception of reality, reconstructing the world that surrounds us as a direct response to our brain waves, which are interpreted as the emotions we are experiencing.

Inspiration

Carranza started the project as part of his master’s thesis at the Institute for Advanced Architecture of Catalonia (IAAC) under Professors Luis Fraguada and Elizabeth Bigger, within the advanced interaction research line, with a focus on how wearable technology can augment the senses.

Read More: Brain-computer interface helps turn brainwave patterns into speech

He reveals that, in the beginning, the product was about recording dreams, so that people could experience other people’s dreams as well as see them on an AR platform.

“My basic inspiration came from understanding something that you cannot see but you do feel. Something similar to ghosts,” he says.

How it Works

For Daydreamers to work, the user wears a headband called Muse, an EEG headset that measures brain activity. Based on the data recorded by this sensor, the user sees custom digital content placed in the real world, with AR as the visualization platform.
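Carranza has not published the code behind this pipeline, but the general idea of streaming the headband’s EEG signal and reducing it to a few numbers can be sketched roughly as follows. The sketch assumes the Muse is streaming over the Lab Streaming Layer (for example via the open-source muselsl tool); the stream setup and the alpha/beta band-power math are illustrative assumptions, not the Daydreamers implementation.

```python
# Minimal sketch: reading EEG samples streamed by a Muse headband over
# the Lab Streaming Layer (e.g. via the open-source muselsl tool).
# The stream type, window length and band-power math are illustrative
# assumptions, not the Daydreamers implementation.
import numpy as np
from pylsl import StreamInlet, resolve_byprop

# Find an EEG stream on the local network and open an inlet to it.
streams = resolve_byprop('type', 'EEG', timeout=10)
inlet = StreamInlet(streams[0])

fs = 256          # Muse's nominal sampling rate (Hz)
window = []       # rolling one-second buffer of samples

while True:
    sample, _ = inlet.pull_sample()      # one multi-channel EEG sample
    window.append(sample)
    if len(window) >= fs:                # roughly one second of data
        data = np.array(window)          # shape: (samples, channels)
        spectrum = np.abs(np.fft.rfft(data, axis=0))
        freqs = np.fft.rfftfreq(len(data), d=1.0 / fs)
        # Crude band-power estimates, often used as calm/engagement proxies.
        alpha = spectrum[(freqs >= 8) & (freqs < 13)].mean()
        beta = spectrum[(freqs >= 13) & (freqs < 30)].mean()
        print(f"alpha={alpha:.1f} beta={beta:.1f}")
        window.clear()
```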

“If you feel excitement or fear, it will change its shape, color and also the velocity of some elements”

In simple terms, if you are wearing the headset and looking at a tree, the screen will show the tree in a shape and color that reflect your emotional state.

“I can pick the data and send it to the phone, transforming the content in real-time. So if you feel excitement or fear, it will change its shape, color and also the velocity of some elements,” Carranza explains.

The screen starts to reflect the user’s emotions by changing the pattern, velocity, and color of whatever the user perceives.

Read More: Terence McKenna’s ‘cyberdelic’ predictions for Virtual Reality 25 years on

“So if you are depressed, it starts to show slow movement or goes purple and a bit faded. It converts the user data in a way to express his/her own emotions,” he says.
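The project’s actual emotion-to-visual mapping has not been published; as a rough illustration of the idea described above, a single arousal-style score derived from the EEG could drive hue, saturation, and animation speed along these lines (the function, formula, and constants below are assumptions for illustration only).

```python
# Illustrative sketch (not the project's published mapping): convert a
# single "arousal" proxy from EEG band power into the hue, saturation
# and animation speed that Daydreamers overlays on what the user sees.
import colorsys

def visual_params(arousal: float):
    """arousal is assumed to be normalized to 0..1 (0 = flat/low, 1 = excited)."""
    speed = 0.2 + 1.8 * arousal          # slow movement at low arousal
    hue = 0.75 - 0.70 * arousal          # purple (~0.75) shifting toward warm tones
    saturation = 0.3 + 0.7 * arousal     # "a bit faded" when arousal is low
    r, g, b = colorsys.hsv_to_rgb(hue, saturation, 1.0)
    return (r, g, b), speed

color, speed = visual_params(arousal=0.15)
print(color, speed)   # muted purple tones and slow animation, per the article's example
```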

Carranza did not have a commercial application in mind while creating the project, but he believes it offers social insights. The research supports the benefits of creating a BCI that can control both digital and physical content.

Read More: Brain-computer interface allows for telepathic piloting of drones

Even if the digital content is in a virtual reality world, the device offers a reasonable understanding of how to mix the physical and digital worlds. In the process, not only can users delve into themselves for self-awareness, but they can also develop empathy by understanding another person’s emotions.

Navanwita Sachdev

An English literature graduate, Navanwita is a passionate writer of fiction and non-fiction, as well as a published author. She hopes her desire to be a nosy journalist will be satisfied at The Sociable.
