“When people lie … they are trying to create a false belief in another person, but to attempt this, the liar must understand that the other person has beliefs, knowledge and opinions that are different from theirs and thus can be manipulated. You need, stresses Talwar, a ‘theory of mind’ that differentiates ‘my mind’ from ‘your mind,’ something that requires a high level of intellectual development.”
The main point of crime writer Alex Brett’s essay on lying is that most of us can’t detect when another person is lying. In a long-term study, “[e]ven customs agents and police officers couldn’t spot the lying [subjects] more than 50 per cent of the time. In other words, their accuracy was no better than chance. They might as well have flipped a coin.” Another study by psychologist Paul Ekman “showed video tapes of liars and truth-tellers to various groups of experts, including polygraph operators, robbery investigators, judges and psychiatrists, and asked them to try to identify the lies. All tried their best. None of the groups performed better than chance.” Secret service agents, however, had a 60% rate of detection, and the studies also “identified an extremely rare group of people who could nail a fibber 80 to 90 per cent of the time, more accurate than a polygraph. What, the researchers wondered, were these human lie detectors seeing that the rest of us miss?”
Analysis of about 200 papers concerned with clues to deception turned up no consistent red flag that marked a lie: “Lying, it turns out, is highly individual. What signals a lie in one person might be a cue to truthfulness in another.” (But that “rare group” must be picking up on something … Both discussions of the studies I’ve linked to here note that “liars often try psychologically to distance themselves from their falsehoods, and so tend to include fewer references to themselves in their stories.” Maybe that’s one clue.)
Not only are obvious and assumed cues (sweaty palms, shuffling feet, blinking, hesitant speech, shifty gaze, etc.) inconsistent among liars and lies, but “according to Harvard neuroscientist Stephen Kosslyn, even the brain seems to function differently depending on the type of lie and the person lying.”
Using MRI to look at the brains of people lying, Kosslyn “found that quite distinct neural systems were activated when you’re lying based on a well memorized, coherent alternative story versus when you’re making something up on the fly. Some of the structures that are active when you’re spewing out memorized lies are known to be involved in the retrieval of stored memory, but these systems are more active when you’re lying than when you’re telling the truth, so it’s not just a case of drawing forth a memory. And, says Kosslyn, fabricating lies on the spot will light up another part of the brain known as the anterior cingulate, a structure involved in monitoring errors. ‘Probably what’s going on is you’re trying to suppress the truth. It’s not activated when you’re using memorized lies.’ People even use different parts of the brain when lying about themselves (‘I didn’t do it.’) than when lying about someone else (‘He didn’t do it.’).”
There’s more on lying at Psychology Today, including the interesting finding that depressed people lie less, and are deceived less, than others.