Monkeys and Machines Process Images in Surprisingly Similar Ways

08 July 2025
A new study reveals how primate brains and AI systems overlap when interpreting visual scenes, offering fresh clues about how we see the world.

When a monkey and a machine look at the same image, how much do their inner worlds align?

According to new research from Yale University, more than we might expect.

In a groundbreaking study, scientists compared how rhesus macaques and artificial neural networks interpret complex visual information. Using a shared set of natural images, from birds mid-flight to bustling street scenes, the team tracked how neurons in monkey brains fired and how deep learning systems responded.

The result? Striking parallels between biological and artificial perception. “It’s not that the machine sees exactly like the monkey,” says co-author John Murray, a Yale neuroscientist. “But the underlying patterns of processing, how different elements of a scene are weighted or grouped, showed remarkable alignment.”

The researchers recorded brain activity in macaques as they viewed each image, focusing on areas of the brain responsible for higher-level vision. They then compared this to responses generated by convolutional neural networks, AI models that power modern visual recognition tools.

What emerged was a shared visual logic: both systems seemed to prioritize similar elements in a scene, suggesting that certain strategies for interpreting the world may be universally effective, whether evolved or engineered.
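The kind of comparison described above is often done with representational similarity analysis (RSA): build a matrix of pairwise dissimilarities between images for each system, then check how well the two matrices agree. The study's exact method is not specified here, so the sketch below is illustrative; the array shapes, function names, and toy data are assumptions.

```python
# Illustrative sketch of representational similarity analysis (RSA),
# a common way to compare neural recordings with network activations.
# Not the study's actual pipeline; shapes and names are hypothetical.
import numpy as np

def rdm(responses):
    """Representational dissimilarity matrix: 1 minus the Pearson
    correlation between response patterns for every pair of images.
    `responses` has shape (n_images, n_units)."""
    return 1.0 - np.corrcoef(responses)

def rsa_score(brain_responses, model_responses):
    """Spearman correlation between two RDMs, computed over the
    upper triangle so each image pair is counted once."""
    iu = np.triu_indices(brain_responses.shape[0], k=1)
    a = rdm(brain_responses)[iu]
    b = rdm(model_responses)[iu]
    # Rank-transform both vectors; Pearson on ranks equals Spearman.
    a_ranks = np.argsort(np.argsort(a))
    b_ranks = np.argsort(np.argsort(b))
    return np.corrcoef(a_ranks, b_ranks)[0, 1]

# Toy usage: 20 images, 50 recorded neurons vs. 128 model units.
rng = np.random.default_rng(0)
brain = rng.normal(size=(20, 50))
# A random linear readout of the same responses shares structure,
# so the alignment score should come out well above zero.
model = brain @ rng.normal(size=(50, 128))
print(rsa_score(brain, model))
```

A score near 1 means the two systems treat the same image pairs as similar or dissimilar, which is the "shared visual logic" the article describes, without requiring that individual neurons and units match one-to-one.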

Why does this matter?

Beyond helping us refine machine vision, the study could offer rare insight into how human perception works and where it might diverge from both AI and our primate cousins. It also deepens the ongoing conversation around the nature of intelligence, both organic and synthetic.

“This gives us a new lens on cognition,” says Murray. “It shows us how far machines have come but also how deeply rooted some visual strategies might be in biological brains.”

The team hopes the findings will inspire cross-disciplinary research where neuroscience, computer science, and cognitive psychology meet to better understand not just how we see, but how we make sense of what we see.

Because in the end, whether you’re a monkey, a machine, or a human, the question remains the same: what exactly are we looking at, and how do we know?


The full study is available on Yale University's website.