The Psychopathic Brain, part two

(This is part two in a two-part post on the psychopathic brain. You can find part one here.)


Testing someone for psychopathy is a time-consuming process. “You have to do a very thorough study. It takes me three to four hours with somebody,” says Jonathan Pincus. “That’s the time I spend with the individual, examining him, questioning him and talking to him — and it takes a psychiatrist at least double that time.” That all comes before conducting a PET scan, and that time doesn’t include the background information one gleans from examining the medical, police, and school records a modern neurologist would have at his disposal. This raises a question: if it takes the foremost experts on psychopathy at least three hours to determine that someone is psychopathic, how can we trust the gut reaction of a 19th century policeman? How seriously should we take the claim that Lizzie Borden was psychopathic?

In the 1890s, one could hardly say that the field of psychology was even in its infancy. Hysteria was treated with vibrators. Lunatic asylums were very much in vogue. This was an era before malaria was used to treat insanity, and likewise before lobotomies had even been invented. The ideas of Jung and Freud were fifteen years from gaining traction in the scientific community, and forty years from receiving any sort of mainstream acceptance. All that to say, even among medical professionals there was no actual understanding of what a psychopath looked like, and the perception among the general population was even more naïve.

So what does it mean that Lizzie Borden seemed too calm and collected? Ultimately, it means nothing at all. One would expect a child who finds the murdered remains of her father to be in hysterics. But how we would expect people to react often bears little similarity to how they actually react. People in the aftermath of trauma have been noted to act in a variety of ways. Some get manic. Some go into shock and become catatonic. Lizzie’s behavior in the hours after her father’s death is meaningless in determining her guilt.

Crime historian Bill James offers a simple paradigm for how to think about criminal evidence. “Real evidence bears not the slightest resemblance to the so-often-cited structure of motive, means, and opportunity.” James points out that thousands of people have the motive, means, and opportunity to commit crimes every day and do not. “Suppose that you try to apply this concept of ‘proof’ to some ordinary event. Let us suppose that the prosecution is trying to prove that you purchased a melon last Saturday.” Virtually everyone you’ll come into contact with could be said to have the motive, means, and opportunity to purchase a melon. You get hungry, and melons are delicious. Melons are extremely cheap. Almost everyone lives in close enough proximity to a grocer that they could purchase one. None of this constitutes proof that you purchased a melon.

Too often, people are accused and convicted of crimes due to the inappropriate amount of weight placed on circumstantial evidence. “Real evidence that I purchased a watermelon,” says James, “is like a sales receipt for a watermelon with my fingerprints on it, a check that I wrote to the grocery store for that amount on that date, and a videotape of me carrying a watermelon out of the store. There’s a half-eaten watermelon in the refrigerator and watermelon rinds in the garbage; you got me.” What matters most, then, is physical evidence, not the directional arrow that gives you a place to start the search for suspects. “Motive, means and opportunity, you’ve got squat.”

What, then, is the physical evidence against Lizzie Borden? In short, there is none. No murder weapon was ever found. There were no witnesses who saw her commit the crime. Perhaps most significant of all, there was no blood on her. Not on her hands; not on her clothes; not in her hair; not on her shoes. No clothes in the home were found to have blood on them. Even the police officers who raised eyebrows at Lizzie’s alleged coldness at the crime scene didn’t fail to notice that she didn’t have a drop of blood on her. If you can stomach it, go back to my description of Abby’s attack. Then try to imagine a scenario by which such an attacker could avoid getting blood on them.



In 2012, a Swiss medical doctor named Franz Messerli found himself looking at how much chocolate is eaten in various countries. He noticed a strange pattern: the more chocolate the average person of a country ate, the more Nobel prizes were won by citizens of those countries. The Japanese, for example, eat a little more chocolate than the Chinese – about two kilograms per year, compared to less than half of that in China – and the Japanese have more Nobel laureates per capita than the Chinese. In the Netherlands, they eat about three times as much chocolate as the Japanese; the Netherlands earns Nobel prizes at ten times the rate of Japan. (Messerli also noticed – one imagines with a good deal of pride – that his native Switzerland leads the pack both in terms of chocolate consumption and Nobel prizes.) Says Messerli: “since chocolate consumption has been documented to improve cognitive function, it seems most likely that in a dose-dependent way, chocolate intake provides the abundant fertile ground needed for the sprouting of Nobel laureates.” It would seem that eating chocolate makes us smarter.

Not so fast. It would take a serious leap of faith to think that binging on Hershey bars will turn anyone into a world-class chemist or economist. Ashutosh Jogalekar, writer for Scientific American, says, “if only three rules of scientific deduction were inscribed on the doors of every university and research organization in the world, one of them should be that ‘correlation does not mean causation.’” Jogalekar is critical of Messerli’s argument. Messerli has roughly shown that chocolate consumption is strongly correlated to Nobel prizes. But is there any reason at all to think the two are more than loosely related? “What I find absolutely baffling,” Jogalekar says, “is that he makes no attempt to dissect other possible contributing factors. In fact at the end of the article he acknowledges the existence of such factors and then proceeds to dismiss them.” Chocolate consumption is correlated to affluence; affluence is correlated to more educational opportunities. Perhaps winning the Nobel prize is more a factor of education than it is of how much chocolate one eats.
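The confounding that Jogalekar describes can be sketched with a toy simulation. In this hypothetical model (all of the numbers are invented for illustration), “affluence” drives both chocolate consumption and Nobel-prize rates, and neither has any direct effect on the other — yet the two still come out strongly correlated:

```python
import random

random.seed(0)

# Hypothetical model: affluence drives BOTH chocolate consumption
# and Nobel-prize rates. Chocolate has no causal effect on prizes.
n = 50
affluence = [random.gauss(50, 15) for _ in range(n)]
chocolate = [a * 0.1 + random.gauss(0, 0.5) for a in affluence]  # kg/year
nobels = [a * 0.3 + random.gauss(0, 1.5) for a in affluence]     # prizes per 10M people

def pearson(xs, ys):
    """Pearson correlation coefficient, computed from scratch."""
    m = len(xs)
    mx, my = sum(xs) / m, sum(ys) / m
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

r = pearson(chocolate, nobels)
print(f"correlation between chocolate and Nobels: r = {r:.2f}")
```

A strong correlation falls out of the simulation even though, by construction, chocolate does nothing; controlling for the confounder (here, affluence) would make the relationship vanish. That is exactly the move Jogalekar faults Messerli for skipping.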

The logic that underlies the claim, “chocolate eaters win more Nobel prizes” is the same logic that James Fallon employs when he says “decreased activity in the orbital cortex causes psychopathy.” Chris Chambers, senior research fellow in cognitive neuroscience at Cardiff University, asks, “Suppose we were to find that psychopaths, on average, show reduced activity in a particular brain region compared with a healthy control group. What would that mean, exactly?” Fallon is assuming that this reduced activity demonstrates psychopathy; Chambers isn’t so sure. “Maybe the reduced activity caused psychopathy. Or maybe it was the symptoms of psychopathy that caused changes in that part of the brain. Or maybe the brain activity is completely unrelated to psychopathy – a mere witness to the crime.” Knowing that psychopaths have a certain brain issue is one thing; understanding the complex relationship between those two things is quite another.

But there’s more. Fallon has shown that, at least among the set of psychopaths who made themselves available for PET scans, those people have decreased function in their orbital cortex. Has he likewise shown that everyone with that brain artifact is a psychopath? This is called the “reverse inference fallacy.” If all cars have four wheels, is everything with four wheels a car? Of course not. Likewise, if James Fallon has physical characteristics correlated with psychopathy, is he the born killer, as he says “in one sense” that he is? Or does the fact that he’s a law-abiding citizen who has never raped or murdered anyone demonstrate that he is, in fact, not a psychopath after all? When all is said and done, is what qualifies someone for such a definition the expression of the behavior, or the potential for that behavior? A person with charisma, a great singing voice, and a repertoire of songs is not a performing artist until she steps onto a stage and performs; can we call a person with the fire triangle of psychopathic traits a psychopath before they behave psychopathically?


The Psychopathic Brain, part one

Lizzie Borden took an axe
And gave her mother forty whacks
When she saw what she had done
She gave her father forty one


On the morning of August 4th, 1892, Andrew and Abby Borden were murdered in their home in Fall River, Massachusetts. Andrew was a tall, slender man with a wispy chinstrap beard. He had worked much of his life as a carpenter and an undertaker, two occupations that proved lucrative during the Civil War. He invested in mill stock and real estate and amassed a fortune, estimated to be the rough equivalent of $10 million today. His first wife Sarah died in 1863, leaving him two daughters: the 12-year-old Emma and the 3-year-old Lizzie. Two years later, he married Abby Gray. Abby’s father peddled tin from a pushcart. She was short and stocky and 63 at the time of her death.

Abby was killed first. She had been in a second-floor guest bedroom, where the family kept the sewing machine, changing the pillowcases. She was struck on the side of the head with a hatchet (or “hatchet-like weapon”) and fell face-down on the floor, causing contusions to her nose and forehead. Her attacker then straddled her back and delivered nineteen additional blows to the back of the skull. Some forensic experts believe Abby was killed as early as 9:30 in the morning; others argue she may have been alive as much as an hour later.

Though there is some debate over when, exactly, Abby was attacked, Andrew’s time of death has been well-established. Andrew arrived home at 10:45 a.m., having been out earlier that morning to inspect some properties he had under construction. The door was unlocked for him by the live-in maid, Maggie Sullivan, and he was escorted to the first-floor sitting room by Lizzie. She suggested that he take a nap and opened the windows to make the room feel more comfortable. According to her inquest testimony, at that point she went to the barn to find sinkers for an upcoming fishing trip. At about 11:00, Lizzie reentered the house, where she found her father’s body and screamed for help.

Suspicion almost immediately fell on Lizzie. She was one of only two people known to be in the home at the time of the murders. (The other, of course, was Maggie, who was violently ill at the time and was on the third floor sleeping.) It was later reported that Lizzie had attempted to buy cyanide from a pharmacist the day before the murders. When questioned, her answers were not always consistent. Did she go to the barn to find sinkers, or did she go out there to eat pears? Why was she out there for twenty minutes when everything she said she accomplished could have been done in five? Was she in the kitchen when her father arrived home, or was it the dining room? Or was she coming down the staircase as Maggie was opening the door for Andrew?

Those factors may have initially raised suspicion, but it was two others that cemented it in the minds of the police and prosecutors. The first was the fact that, three days after the crime, Lizzie was seen burning a dress. This could only be interpreted in one way: she was destroying evidence of her crime. The second factor was more subtle but equally damning. She was just too calm. One officer remarked that he was “disturbed” by the fact that she showed no agitation at all. Shouldn’t she be in hysterics? Seeing Lizzie, minutes after seeing her father’s mutilated body, so collected and coherent convinced him that she might be behind the events. It convinced him that she was a psychopath.

More than a hundred years later, most scholars stick to some variation on that theme. The famous psychic Sylvia Browne, confirming Lizzie’s guilt, claimed that Lizzie was bipolar. Victoria Lincoln suggested that Lizzie committed the murders while in a fugue state. Jules Ryckebusch, a professor at Fall River’s Bristol Community College, sees it in the same vein. “There’s an almost erotic association with that kind of violence. Nineteen or 20 ax blows! She got a kick out of continuing to slaughter her.”

The mayor and the city marshal came to the Borden house the Saturday after the murders. “I suppose you are here to arrest me,” said Lizzie, as calm as ever.



Dr. James Fallon is a neuroscientist at the University of California – Irvine. Fallon has a robust oval face with a white-tinged beard. Were he to be cast in one of The Hobbit movies, he would not seem out of place. He even goes so far as to describe his scientific career in a similar vein. “I’ve been a neuroscientist for about forty years. And most of that forty years I’ve been what’s called a small-time scientist. Small lab, small grants. Most scientists are like this. We’re kind of hobbits.” Fallon studies the biological basis for behavior: how genes and neurotransmitters and the like determine our actions. “But then,” he explains, “for some reason, I got into something else, just recently. And it all grew out of one of my colleagues asking me to analyze a bunch of brains of psychopathic killers.”

Pinpointing a cause of psychopathy is anything but straightforward, but Fallon is among a growing number of scientists who believe that violent behavior is caused by the combination of traumatic childhood abuse, brain dysfunction, and mental illness. “Two-thirds of murderers have all three factors,” says Georgetown neurologist Jonathan Pincus. “The others have two of the three.” None of those factors taken by itself is enough to cause violent behavior. But two or more in tandem have a terrifying synergy, feeding off each other to create something far worse than any individual component. You could think of this as the neurological equivalent of what is known as the “fire triangle”: heat, fuel, and oxygen are not dangerous in and of themselves; put them together in a certain balance and you get a blaze that rages out of control.

Fallon believes we can be even more specific. “The pattern is that those people, every one of them I looked at who was a murderer, had damage to their orbital cortex.” To put it in simple terms, the orbital cortex is the part of the brain that evaluates the feelings of fear, aggression, and anxiety that come from the almond-shaped part of the temporal lobe called the amygdala. When the orbital cortex isn’t doing its job, the amygdala runs unimpeded. “What’s left? What takes over?” Fallon asks. “The part of the brain that drives your id-type behaviors, which is rage, violence, eating, sex, drinking.” Nothing is left to temper the part of the brain that generates our most primal urges. If we oversimplify, we can think of the amygdala as being like a gas can and the orbital cortex as being like a valve that regulates how much fuel is released. When that valve is damaged, gas starts leaking everywhere.

But damage to the brain is only part of the story. Fallon believes that another key piece of the puzzle is a factor called the MAO-A gene, otherwise known as the “warrior gene.” People with the warrior gene respond more aggressively to provocation than the average person. The MAO-A gene regulates an enzyme that breaks down certain neurotransmitters, including serotonin. Serotonin has a calming effect on the brain and increases feelings of well-being. According to Fallon, people with the warrior gene have their brains bathed in serotonin in utero. “Your whole brain becomes insensitive to serotonin. It doesn’t work later in life.” Serotonin is, in effect, like water from a fire hose; in the warrior gene brain, those fire hoses have no effect.

In October of 2005, Fallon made what could only be described as a disturbing discovery. “I was looking at many scans, scans of murderers mixed in with schizophrenics, depressives and other normal brains. Out of serendipity, I was also doing a study on Alzheimer’s and as part of that, had brain scans from me and everyone in my family right on my desk. I got to the bottom of the stack, and saw this scan that was obviously pathological.” It showed the characteristic damage to the orbital cortex. “There’s almost nothing here,” he said. Knowing the scan belonged to a member of his family, Fallon decided to break the blinding that prevented him from knowing whose brain was pictured. But the scan did not belong to a family member. The damaged brain was his own.

He dug deeper and had genetic tests done. Sure enough, he has the high-risk variant of MAO-A. “I’m 100%. I have the pattern, the risky pattern,” he says. “In a sense, I’m a born killer.” His mother then prompted him to take a closer look at his heritage. “My mother said to me, ‘You’re talking as if you come from a normal family.'” He didn’t. His grandfather’s grandfather’s grandfather’s grandfather was hanged for matricide. In all, there were seven alleged murderers in his family tree. Sure, one of his cousins was Ezra Cornell, the founder of Cornell University. But another cousin was Lizzie Borden.

First Impressions, Or Why Relationships are Terrible

“Social progress, unless we’re careful, can merely be the means by which we replace the obviously arbitrary with the not so obviously arbitrary.”

– Malcolm Gladwell

Frank Bernieri, a psychologist at the University of Toledo, conducted an experiment in which he trained two interviewers for six weeks on how to properly conduct an effective job interview. Those two then interviewed 98 volunteers and evaluated how likely they would be to hire them. Then the first few seconds of the interviews – the time from when the volunteers walked in the door to when they shook the interviewer’s hand – were shown to a new set of observers. “On nine out of the eleven traits the applicants were being judged on, the observers significantly predicted the outcome of the interview,” Bernieri says. “The strength of the correlations was extraordinary.” It’s not so much that the interviews themselves didn’t matter. Rather, everything that followed those first thirty seconds was colored by the initial impression the applicants made.

Compare that to a similar experiment conducted by Nalini Ambady, an experimental psychologist at Stanford. Ambady cut down video footage of Harvard professors teaching their classes until she had ten-second video clips of the professors’ facial expressions and other physical cues. She then turned the sound off and showed them to neutral observers. The observers had no difficulty evaluating the professors on a fifteen-item checklist of personality traits. What’s more, those evaluations were almost exactly the same when she pared the clips down from ten seconds to five seconds, and again when she pared them down from five seconds to two seconds. Two seconds is all it takes to form a consistent impression of a professor’s personality. More astonishing still, when Ambady compared those evaluations with the ones made by students after a full semester of classes, she found that they were virtually identical. As Malcolm Gladwell put it, “A person watching a two-second silent video clip of a teacher he has never met will reach conclusions about how good that teacher is that are very similar to those of a student who sits in the teacher’s class for an entire semester.”

If you think of the state of dating in the context of these studies, I think it starts to become obvious why relationships are so hard. We make an intuitive judgment about a person’s desirability as a sexual partner within the first thirty seconds of meeting them. We color every subsequent interaction with them through that lens. Bernieri’s best applicants weren’t the ones who would be best at the job; Ambady’s highest-rated professors weren’t necessarily the best teachers. In both those cases and with dating, those rated the best were just the ones that struck the reviewers – initially and intuitively – as the most likable. When we find someone likable or desirable, we overlook a lot of things that would put us off. I’m not trying to make a case for arranged marriage (I’ll save that for another blog post). Rather, I’m just wondering: if we’re aware of that pitfall in modern relationships, could we avoid much of the heartache, infidelity, and divorce that is plaguing us today?


Mid-Century Romances and the Benefits of Ignorance

In the middle part of last century, there was a Minnesota forester named Wayne Hanson. Wayne was about six feet tall, with brick-red hair, droopy ears and rows of teeth like a Mako shark. He was also something of a rapscallion. One year, while managing a public forest, he encountered a state senator wading along a river bank, fly fishing without a license. Rather than cite him and collect a small fine, Wayne parlayed this encounter into political pressure to get a bill passed. Wayne split his time between working in the north woods and living in Minneapolis near Folwell Park. He went to church at Wesley United Methodist, one of those sprawling, Romanesque Minneapolis churches erected during the Minneapolis building boom of the 1890s. In 1949, shortly after committing to a Wesley Bible study, Wayne met Gladys Granlund, an assembly-line worker at Honeywell. Despite the considerable ethnic tension – he was Norwegian and she was Swedish – they married six years later. This is lucky for me. Wayne and Gladys were my grandparents.

Of course, this is the polished and abridged version of their story. During my freshman year of college, after my Fall semester romance fizzled and died, Grandpa took me out for coffee and pie at a Perkins. He ordered blueberry and I got banana cream. In what I can only guess was an effort to make me feel better, he detailed for me his own experiences as a bachelor. Turns out, while he was dating my grandmother, he had a second girlfriend, deliberately chosen because of her name: my grandpa’s weekend lass was also named Gladys. He didn’t want to risk calling either one by the wrong name and letting his secret slip. Rather than making me feel better, though, that just made me feel worse: it demolished my understanding of my grandparents and their relationship.

Maybe I shouldn’t have been so naïve, but the childish notion that my grandparents were in some way a perfect couple had merit. It gave me a model to emulate and aspire to. You could argue that a better understanding of my grandpa’s humanity had merit of its own, and there’s no doubt that it did. But if given the option, I’m not sure I’d make the tradeoff.

We live in the information age, and part of our ethos is that ignorance is bad, and knowledge is good. Shakespeare might as well have been speaking for our generation when he wrote, “I say there is no darkness but ignorance.” Plato described ignorance as the root and stem of all evil, and that thought was echoed by Camus two thousand years later: “The evil that is in the world almost always comes of ignorance.” But I disagree. I think there are some things that we are all better off not knowing. The problem is that there is no way to tell in advance whether knowing something will build you up or tear you down.

When I was in elementary school, my grandparents would take me and my siblings on road trips. We’d ride in a big white Chevy conversion van, with a backseat that folded into a bed and a 10-inch TV in an overhead compartment. My grandma would always have the coffee-flavored Nestle Nips in the glove compartment. I remember now that grandpa would honk the horn from time to time. There was an official explanation for this behavior: he always honked at the pretty girls. Twenty years later, it strikes me as possible that my grandpa was just an angry driver who was aggressive with his horn. I don’t feel a need to justify his behavior. I just guess at this point I’d rather not know.

Why Breakups are Lopsided

“Doubt is not a pleasant condition, but certainty is absurd.”

– Voltaire


In the early 1960s, the psychologists Stanley Schachter and Jerry Singer conducted an experiment where they gave a group of subjects epinephrine. They told some of the subjects what symptoms to expect – heart palpitations, hand tremors, sweaty palms, and the like. The rest were misled about the effects of the drug. What Schachter and Singer discovered was that the people who knew what epinephrine would do to their bodies were, predictably, able to make sense of things when their hands started shaking. The second group, though, lacked that ready explanation. When their palms started sweating and their hearts were beating uncontrollably, they convinced themselves that they were either uncontrollably angry or deliriously happy. A stimulus demanded an explanation, and rather than let it go unexplained, they created one out of thin air.

This latter group engaged in a behavior called “self-justification.” Self-justification, in the vernacular of psychology, refers to the process by which we convince ourselves that our behavior makes sense. When we tout the gas mileage of a new vehicle, for example, or point to our apartment’s proximity to grocery stores, we are engaging in self-justifying behavior. In itself, this is neither inherently good nor bad. But it demonstrates an interesting reality. We like to believe that we act rationally: we consider our options, weigh the variables, and then make an appropriate decision. But psychological research suggests we do the opposite. We make a decision, and then we convince ourselves that it was the correct one.


An important component of self-justification is called cognitive dissonance. Cognitive dissonance, as defined by the social psychologist Elliot Aronson, is the “state of tension that occurs whenever an individual simultaneously holds two cognitions (ideas, attitudes, beliefs, opinions) that are psychologically incompatible.” When this occurs, we change one or both cognitions to make them more compatible (or “consonant”). Let me give a practical example. Most Christians believe it is a duty to give to the poor. At the same time, when asked to give to a homeless person, many will decline. The two cognitions “I have a duty to give to the poor” and “I don’t want to give this person my money” are incompatible. To reduce dissonance, such a person might tell themselves, “It’s better to give to a shelter,” or, “They would spend the money on alcohol; it’s not my duty to subsidize their abuse.” As a result, we modify our opinions in order to reduce that psychological tension.

Cognitive dissonance can have a profound effect on our attitudes. In one of my favorite social psychology studies, subjects were given either $.25 or $20 to describe a man as good looking. Those in the latter group rated the man’s attractiveness as the same before and after the experiment: they could justify telling a white lie in exchange for $20. But those given a quarter came to believe their lie. The cognitions “I am an honest person” and “This man is good looking” are perfectly compatible if you can convince yourself you believe the second. Consequently, the quarter group rated the man’s attractiveness as significantly higher than the $20 group. Cognitive dissonance has been used to explain attitude shifts in marijuana use, workplace theft, smoking, as well as countless other phenomena. Now I want to look at how it impacts how we deal with breakups.



The whole point of this discussion is to shed some light on a particular type of breakup. I’m thinking of a couple that had no clear, major dysfunction (i.e., neither party cheated on the other). Things have gone well, or reasonably well for some time, but one of the two (for sake of simplicity, let’s say it’s the woman*) starts to feel some doubts about the relationship. Her uncertainty grows and she feels conflicted. Sooner or later, she decides to end things. Self-justification predicts that a week after the breakup, she will feel much more confident in that decision than the week before. (Have you ever noticed that all the idioms we use to mean “to go through with a decision” employ violent imagery? Bite the bullet. Take the plunge. Pull the trigger. Our language seems to intuitively recognize the truth that the moments surrounding a decision are tumultuous.) Rather than incrementally working our way from doubt to certainty and then acting accordingly, it is in the act of deciding that we shrug off our doubts.

But the one getting dumped does not get the benefit of having made a decision: the decision was made for him. Psychologically speaking, everything is stacked in the favor of the one who initiates the breakup. Whereas the very act of breaking up makes her more certain of her decision, he is left to sort through the maelstrom of negative emotions. This, to me, is where things get particularly interesting. Cognitive dissonance predicts that the more difficult the breakup is on the one being dumped, the easier it will be for the dumper to get over it.

The cognition “I care about this person” is dissonant with “My decision is causing him pain.” But there is greater dissonance between the cognitions “I care about this person” and “My decision is causing him an enormous amount of pain.” There are many ways the mind can reduce the tension between the two, but the one that seems most common is to add the cognition, “I would not want to cause him this much pain if it wasn’t the right decision. Therefore I made the right decision.”

This isn’t just an exercise in psychological speculation. I have seen many friends stunned by the apparent coldness of someone they love, unable to process how quickly they have fallen from their relational bliss. Continuing with my example, it’s not the woman’s insensitivity to his pain that is causing her to move on. In fact, it’s quite the opposite: her attentiveness to his struggle is the very thing that’s convincing her she made the right decision. On the flip side of that coin, it’s easy for the dumper to get annoyed with the lingering struggle of the dumpee. But it’s important to remember that doing the dumping gives you a 90 meter head start in a 100 meter dash.

*I want to be clear: this is not gender-specific. These psychological pressures are exerted on both sexes equally.