In a Heinlein book I ran across the 'art of the lie.' One way is to tell the truth in such a way that nobody believes you. (Tricky, but it can be done -- I've seen sociopaths do it.) The other is to tell only part of the truth.
The second is really slick because it goes beyond just lying by omission.
First, everything you say is verifiable. For example, black children do indeed account for a high number of firearm homicide victims.
Second, the lying-by-omission part is the extra data you don't supply (e.g., that older gang members manipulate and use young kids because of lighter sentences, that the definition of 'children' can go up to 23, that most of these victims are directly involved in criminal activity). But as you will see, calling it a lie is... well, not exactly right.
Third, the supplied, but edited, information usually leads to a specific emotional -- not rational -- conclusion. "POOR CHILDREN ARE BEING SLAUGHTERED!" If you had that other information, that isn't the conclusion you'd jump to -- which is why the data you're supplied is carefully edited. The person telling you this isn't lying per se, but he is manipulating you toward a desired conclusion. A conclusion that you 'make.'
Fourth, it creates a 'yes set.'
Super short version: after three yeses, it's harder to say no to the fourth point. Add to that, after three yeses we start to trust the person and stop checking/verifying the accuracy of what they're saying. So the small lies start at four, and the whoppers start coming in at eight.
https://www.youtube.com/watch?v=9vMrSSOXfrw
and
https://www.youtube.com/watch?v=cFdCzN7RYbw
Fifth is confidence.
How's that for a left turn?
Realize that people WAY overestimate their ability to detect falsehood. A big part of this is that they're looking for nonverbal cues that indicate unease, uncertainty (can I get away with this?) and inner conflict. All of which are really common with young children lying to their parents.
Yet when someone presents something that is true, there's a certain body language that accompanies it. It's a form of confidence in the information and in oneself. Try it yourself. Look in the mirror and say, "The sun comes up in the east." Then say, "The sun comes up in the south." You're conflicted saying something you know isn't true. You can fake the confidence the second time, but the first time you don't have to.
Here's the rub with lying by providing only half of the needed information. You don't have to fake it. You can say what you're saying with absolute conviction because it is -- partially at least -- true. If you can keep from thinking about those partials, you can be very, very convincing. Mostly because you've convinced yourself.
Sixth is that it doesn't create cognitive dissonance.
The first four minutes of this clip convey some important concepts -- especially 'that satisfies' and 'allow something to be true.'
https://www.youtube.com/watch?v=36GT2zI8lVA
We have a lot of biases and heuristics that we use to make everyday decisions. And believe me, they are incredibly important for functioning (look at the persuasion clip again in this light). No BS: if we didn't have them, you couldn't get through the day, starting with putting your shoes on. This ties back to the difference between knowledge and belief, which is too big to get into right now, but is well worth looking into.
When we are supplied information, we run it through these filters. We're not just looking for BS, but also deciding what we're going to do about it. If something doesn't gel with our beliefs, it's going to be like large particulate matter in a screen. It gets hung up. But things we do agree with pass through like smoke -- unfiltered and with ease.
Now that's all a fancy way of saying that if you believe something already, you're not going to spot either edited information or an outright lie about the subject. You're certainly not going to spot the Holland Tunnel-sized holes in the logic. For example:
"Guns kill people! So they should be illegal!"
"Ummmm...the number of legally owned firearms used in homicides is statistically meaningless. The killers are already breaking the existing gun laws."
"That means we need stricter gun laws!"
Now if you've bought into the rhetoric of 'guns are evil' this makes perfect sense. If you haven't, "Wait...what?"
So another way of saying 'doesn't create cognitive dissonance' is "it aligns with your pre-existing biases."
Seventh is operant conditioning.
Many people have heard the Hitler/Goebbels quote about 'tell a lie big enough' but have you heard the rest of the quote? Here it is:
If you tell a lie big enough and keep repeating it, people will eventually come to believe it. The lie can be maintained only for such time as the State can shield the people from the political, economic and/or military consequences of the lie. It thus becomes vitally important for the State to use all of its powers to repress dissent, for the truth is the mortal enemy of the lie, and thus by extension, the truth is the greatest enemy of the State.
The thing about this is, if you get beaten over the head enough times with an idea, you begin to give it credibility -- even if you don't buy it. Let me give you a few examples of completely made-up concepts whose validity we now take for granted: assault rifles, muscle memory, rape culture, instinctive flinch response, racism, religion and rights.
Do we actually know what those are? Seriously, can ANYONE give a solid definition that is verifiable? Or is it an 'everyone knows what you're talking about'? Yet I can guarantee you we can argue over them. Man, all I have to do is say any of those words and there's a fight. That we're fighting over it unconsciously acknowledges that we've accepted the validity of the concept -- even if we rabidly disagree with it.
We've been conditioned not to question the validity of the ideas, but instead to fight over which one is right. If I can get you to stop questioning and instead buy into a belief (pro or con), I've just secured my power base. That being done, you're way more susceptible to the tailored information I'm feeding you once you've bought into 'the big lie.'
Eighth -- and finally -- get into the habit of asking "What aren't you telling me?"
I gotta tell you, this is a barrel of laughs, especially the deer-in-the-headlights look. In truth, though, it's more for you to develop the habit of looking beyond the narrative. To look at the 'yes set' and ask, 'Where's the spin?' To look at people who are feeding you carefully prepared information and ask, 'What are you getting out of telling me this?'
Basically, asking "So what do you want me to do about this?" Including me doing nothing about what you're doing. (And believe me, THAT is a biggie.)
So right there is a list of things you can do to help you spot spin-doctored lies and statistics.