ROBO SPACE
A computer that understands how you feel
by Staff Writers
Boulder CO (SPX) Jul 31, 2019

Kragel is combining machine learning with brain imaging to learn more about how images impact emotions.

Could a computer, at a glance, tell the difference between a joyful image and a depressing one?

Could it distinguish, in a few milliseconds, a romantic comedy from a horror film?

Yes, and so can your brain, according to research published this week by University of Colorado Boulder neuroscientists.

"Machine learning technology is getting really good at recognizing the content of images - of deciphering what kind of object it is," said senior author Tor Wager, who worked on the study while a professor of psychology and neuroscience at CU Boulder. "We wanted to ask: Could it do the same with emotions? The answer is yes."

Part machine-learning innovation, part human brain-imaging study, the paper, published Wednesday in the journal Science Advances, marks an important step forward in the application of "neural networks" - computer systems modeled after the human brain - to the study of emotion.

It also sheds new light on how and where images are represented in the human brain, suggesting that what we see - even briefly - could have a greater, swifter impact on our emotions than we might assume.

"A lot of people assume that humans evaluate their environment in a certain way and emotions follow from specific, ancestrally older brain systems like the limbic system," said lead author Philip Kragel, a postdoctoral research associate at the Institute of Cognitive Science. "We found that the visual cortex itself also plays an important role in the processing and perception of emotion."

The Birth Of EmoNet
For the study, Kragel started with an existing neural network, called AlexNet, which enables computers to recognize objects. Using prior research that identified stereotypical emotional responses to images, he retooled the network to predict how a person would feel when they see a certain image.
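Retooling a pretrained network this way is a standard transfer-learning pattern: the layers that already recognize visual features are left fixed, and only a new output layer is trained on the emotion labels. The toy sketch below illustrates that pattern in plain Python - the three-dimensional "features" and emotion labels are invented stand-ins for AlexNet activations and the study's 20 categories, not the authors' actual code.

```python
EMOTIONS = ["amusement", "craving", "horror"]  # stand-ins for the 20 categories

def frozen_features(image):
    # Placeholder for the pretrained AlexNet layers, which stay untouched;
    # here an "image" is already a small feature vector.
    return image

class EmotionHead:
    """A new final layer trained to map fixed features to emotion labels."""
    def __init__(self, n_features, labels, lr=0.1):
        self.labels = labels
        self.lr = lr
        self.w = {lab: [0.0] * n_features for lab in labels}

    def scores(self, feats):
        return {lab: sum(wi * xi for wi, xi in zip(self.w[lab], feats))
                for lab in self.labels}

    def predict(self, feats):
        s = self.scores(feats)
        return max(s, key=s.get)

    def train(self, data, epochs=20):
        # Simple multi-class perceptron updates; only this new head learns.
        for _ in range(epochs):
            for feats, label in data:
                guess = self.predict(feats)
                if guess != label:
                    for i, xi in enumerate(feats):
                        self.w[label][i] += self.lr * xi
                        self.w[guess][i] -= self.lr * xi

# Tiny synthetic "dataset": each emotion clusters on one feature axis.
data = [([1.0, 0.0, 0.1], "amusement"),
        ([0.0, 1.0, 0.0], "craving"),
        ([0.1, 0.0, 1.0], "horror")]

head = EmotionHead(3, EMOTIONS)
head.train([(frozen_features(x), y) for x, y in data])
print(head.predict([0.9, 0.1, 0.0]))  # prints "amusement"
```

The design choice mirrors the article's description: the expensive, general-purpose visual machinery is reused as-is, and only the mapping from its features to emotion categories is learned from the labeled images.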

He then "showed" the new network, dubbed EmoNet, 25,000 images ranging from erotic photos to nature scenes and asked it to categorize them into 20 categories such as craving, sexual desire, horror, awe and surprise.

EmoNet could accurately and consistently categorize 11 of the emotion types. But it was better at recognizing some than others. For instance, it identified photos that evoke craving or sexual desire with more than 95 percent accuracy. But it had a harder time with more nuanced emotions like confusion, awe and surprise.
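Per-category figures like these boil down to counting, for each emotion label, how often the model's prediction matched the consensus label for images in that category. A minimal sketch of that bookkeeping, using invented labels rather than the study's data:

```python
from collections import defaultdict

def per_category_accuracy(true_labels, predicted_labels):
    """Fraction of images in each true category the model got right."""
    hits = defaultdict(int)
    totals = defaultdict(int)
    for truth, guess in zip(true_labels, predicted_labels):
        totals[truth] += 1
        if truth == guess:
            hits[truth] += 1
    return {cat: hits[cat] / totals[cat] for cat in totals}

# Invented example labels, for illustration only.
truth = ["craving", "craving", "awe", "awe", "surprise"]
guess = ["craving", "craving", "awe", "surprise", "confusion"]
print(per_category_accuracy(truth, guess))
# {'craving': 1.0, 'awe': 0.5, 'surprise': 0.0}
```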

Even a simple color elicited a prediction of an emotion: When EmoNet saw a black screen, it registered anxiety. Red conjured craving. Puppies evoked amusement. If there were two of them, it picked romance. EmoNet was also able to reliably rate the intensity of images, identifying not only the emotion an image might elicit but how strong it might be.

When the researchers showed EmoNet brief movie clips and asked it to categorize them as romantic comedies, action films or horror movies, it got it right three-quarters of the time.

What You See Is How You Feel
To further test and refine EmoNet, the researchers then brought in 18 human subjects.

As a functional magnetic resonance imaging (fMRI) machine measured their brain activity, they were shown 4-second flashes of 112 images. EmoNet saw the same pictures, essentially serving as the 19th subject.

When activity in the neural network was compared to that in the subjects' brains, the patterns matched up.

"We found a correspondence between patterns of brain activity in the occipital lobe and units in EmoNet that code for specific emotions. This means that EmoNet learned to represent emotions in a way that is biologically plausible, even though we did not explicitly train it to do so," said Kragel.

The brain imaging itself also yielded some surprising findings. Even a brief, basic image - an object or a face - could ignite emotion-related activity in the visual cortex of the brain. And different kinds of emotions lit up different regions.

"This shows that emotions are not just add-ons that happen later in different areas of the brain," said Wager, now a professor at Dartmouth College. "Our brains are recognizing them, categorizing them and responding to them very early on."

Ultimately, the researchers say, neural networks like EmoNet could be used in technologies that help people digitally screen out negative images or find positive ones. They could also be applied to improve computer-human interaction and to advance emotion research.

The takeaway for now, says Kragel:

"What you see and what your surroundings are can make a big difference in your emotional life."


