Brain imaging has potential to read minds someday, including in court

It's a given of criminal proceedings that some people will lie in the courtroom if it suits their purposes. They'll lie about where they were, what they saw or did, why they did it, how they felt. Whom to believe?

One day, we won't have to guess. We'll know, because the answer will show up in the activity captured in their brain scans. We will, in essence, be able to read their minds, possibly using a machine from a remote location. And researchers at Carnegie Mellon University are leading us closer to that point -- for good or ill, depending on one's view of privacy.

Marcel Just, director of CMU's Center for Cognitive Brain Imaging, and colleague Tom Mitchell, head of the Machine Learning Department, which collects and decodes big data, are at the forefront of "reading" thoughts, feelings, intentions and memories based on images of brain activity. Such states of mind can figure heavily in culpability, so pinpointing them could change how we view criminality.

"There are criteria for what we decide to punish people for," said Just, a psychology professor. "It's the basis for deciding guilt or innocence -- did you intentionally drive the knife into someone, what were you thinking when you did it, did you intend it, plan it, was it self-defense, do you feel remorse? All these things are completely implemented in a person's brain. If they had malevolent intent, it's in the brain, not the fist."

To get at those thoughts and feelings, CMU researchers measure brain signals via functional magnetic resonance imaging, or fMRI, and use machine learning to find patterns in the resulting data. So far they have succeeded in precisely identifying which letter of the alphabet a subject sees and which emotion a subject experiences, without relying on self-reporting -- the first time such a thing has been possible, according to a study they released in July, funded by the National Institute of Mental Health.

The study tapped 10 actors from CMU's drama school and had them run the gamut of emotions. Researchers scanned their brains while they viewed the words for various feelings -- anger, disgust, envy, fear, happiness, lust, pride, sadness and shame.

Inside the fMRI scanner, the actors entered these emotional states multiple times in random order. Researchers used the neural activation patterns from early scans to identify the emotions the same participants felt in later scans.

In all cases, the computer models beat random guessing -- which, with nine emotions, would be right only about 11 percent of the time -- and in some cases identified the correct emotion as much as 60 or even 80 percent of the time.
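For readers who want to see the shape of that procedure, here is a minimal sketch of train-on-early-scans, test-on-later-scans emotion decoding. It is illustrative only: the synthetic voxel data, the run counts and the choice of a logistic-regression classifier are assumptions made for the example, not details of the CMU team's actual pipeline.

    # A minimal sketch of the decoding approach described above: fit a
    # classifier to activation patterns from early scans, then predict
    # the emotion felt during later scans. All data here is synthetic.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import accuracy_score

    rng = np.random.default_rng(0)

    EMOTIONS = ["anger", "disgust", "envy", "fear", "happiness",
                "lust", "pride", "sadness", "shame"]
    N_VOXELS = 500                 # one activation value per voxel
    RUNS_TRAIN, RUNS_TEST = 4, 2   # early scans vs. later scans

    # Pretend each emotion has a characteristic activation pattern.
    prototypes = rng.normal(size=(len(EMOTIONS), N_VOXELS))

    def simulate_runs(n_runs, noise=1.5):
        """One noisy observation of every emotion per scanning run."""
        X, y = [], []
        for _ in range(n_runs):
            for label, proto in enumerate(prototypes):
                X.append(proto + rng.normal(scale=noise, size=N_VOXELS))
                y.append(label)
        return np.array(X), np.array(y)

    X_train, y_train = simulate_runs(RUNS_TRAIN)   # early scans
    X_test, y_test = simulate_runs(RUNS_TEST)      # later scans

    model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
    accuracy = accuracy_score(y_test, model.predict(X_test))
    print(f"decoding accuracy: {accuracy:.0%} (chance: {1 / len(EMOTIONS):.0%})")

Run as written, the sketch decodes the synthetic patterns well above the roughly 11 percent chance rate -- the same kind of better-than-chance comparison the study reports.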

All the subjects were willing participants and were thinking and feeling in real time. Identifying thoughts that occurred in the past is still a major challenge, and the researchers have not attempted to extract data against a subject's will.

"We have maximal participation of subjects," Just said. "We can only get the information out if they think of it actively. If they don't cooperate we have nothing. But in the future that constraint will go away.

"When that capability is developed, it will revolutionize forensic science. It will become 'CSI-fMRI.' This research opens a new world of possible evidence for assessing a person's responsibility for their actions."

Say you put someone on the stand and ask why he stabbed the victim, Just posited, and "a machine from a remote location dredges up what he was feeling. We can't dredge out memories at this point, but they're in there, stored as electro-chemical activity."

What if the court could accurately assess remorse at sentencing?

"Remorse is certainly a brain state. It's physically in your brain as a pattern of electrical activity. In principle we should be able to detect it. We can't right now when someone is being sentenced, but at some point we will be able to.

"How long will it take us to find it? It took decades to find this much."

For one thing, he said, there are limits to fMRI's sensitivity. The scanner's three-dimensional pixels, or voxels, are relatively large, and its timing resolution is much slower than actual brain activity.

"We're going to need more sensitive instruments to get to it," Just said.

Ethics are another matter, he said.

"In America we hold privacy very dear and people are horrified by the invasion of it. I'm not anxious to give it up, but it's interesting to think about a world with much less of it. What if everyone could look at everything and we all knew it? What kind of a world would that be?"

(Reach Pittsburgh Post-Gazette writer Sally Kalson at skalson@post-gazette.com. Distributed by Scripps Howard News Service, www.shns.com.)
