The rules of public debate were honed in ancient Athens, where participants weren’t protected by the anonymity of a YouTube comments section.

The art of public debate is an old one; it arose in recorded history alongside Athenian democracy some 2,500 years ago, which seems to suggest that debate and democracy are inextricably intertwined.

But, as you’ll learn in the section on “Avoiding Logical Fallacies” later in this post, correlation does not imply causation. In fact, “False Cause” is one of the 13 original logical fallacies identified by the philosopher Aristotle around 350 BC. These fallacies are errors in reasoning that undermine not only your argument, but also any chance that participants will learn something from the discussion at all.

Pastafarians have shown that global warming correlates with the bittersweet decline in piracy, not a rise in atmospheric carbon.

Today, public debate is having a popular renaissance, thanks to the rise of social media and collapse of political apathy. But when you argue something controversial, particularly online, how often does anyone change his or her mind? How often do you change yours? How often is information exchanged in a way that guides debaters closer to compromise?

Almost never, right?

But, how often is someone insulted, offended, frustrated, unfriended, or blocked?


Internet access has reinvigorated public conversation, but it seems we have all but forgotten the old ground rules that make that discussion productive. The result is a lot of negativity and confusion that could be cleared up.

Understand the Other Person’s Argument

“It is the mark of an educated mind to be able to entertain a thought without accepting it.”
– Aristotle

If you want to enjoy productive disagreements, the first (probably subconscious) belief you need to do away with is the idea that you know more than everyone who disagrees with you. First, you probably don’t. Second, even if you do, you may still be wrong. Bill Gates, for example, thought the Internet was going to be a flop.

More importantly, if you understand why your friend disagrees with you, you’ll be better able to persuade them to your point of view. The seventeenth-century philosopher Blaise Pascal put it like this:

“When we wish to correct with advantage, and to show another that he errs, we must notice from what side he views the matter, for on that side it is usually true, and admit that truth to him, but reveal to him the side on which it is false. He is satisfied with that, for he sees that he was not mistaken, and that he only failed to see all sides. Now, no one is offended at not seeing everything; but one does not like to be mistaken.”

By beginning your argument from a place of respect and understanding, you give your ideological opponent a little breathing room (reasoning room, really) to open up to your ideas.

Modern psychology backs this up. Agreeing with someone inspires neural coupling, the tendency for humans to mirror the thought processes of others. As we’ve previously noted, this is one of the goals of good storytelling. When you include their beliefs in your story, because you understand and genuinely accept them, your opponent stops being the enemy.

Neuroscientists also suggest that guiding people toward changing their own minds is far more effective than an aggressive lecture. As we’ve also covered in other blog posts, the best way to do this is by asking questions, not making statements.

Questions can get better results than edicts.

Most importantly, you need to listen to the response. The mind you change might be your own. But even if no one changes their mind after this type of exchange, mutual respect can help you avoid the hurt feelings that so often clutter Thanksgiving dinners and Facebook posts.

Be Aware of Your Cognitive Biases

“The high-minded man must care more for the truth than for what people think.”
– Aristotle

We humans are social animals, influenced by the conventional wisdom of the crowd for millions of years. That’s why the Temple of Luxor in Egypt, constructed around 1400 BC, is inscribed with a maxim you’ve probably committed to memory: “Know thyself.”

That is, of course, a more difficult task than your 17-year-old self suspected. But we should still strive to know ourselves at least as well as the computer algorithms tracking our every like, tweet, credit card purchase, song, show, psychology quiz, and personal network. Like the algorithms, we should strive to understand how and why the crowd affects our personal decisions.

Keep in mind that the “digital bubble” only accentuates the job your brain is already doing every day, the job brains have been doing since the Luxor Temple was built: filtering out information that doesn’t agree with your worldview, discounting it as false, suspicious, unimportant, out of context, or simply irrelevant. What remains after you’ve undercut or ignored all that uncomfortable data is your cognitive bias.

First floated by two Israeli psychologists in 1972, the idea that we subconsciously curate information to fit preconceived opinions seems obvious; other people do it all the time. That’s why you’re having this argument in the first place, correct? They think your source is obviously “fake news,” when in reality, theirs is the fake news. How can you get them to see the light?

The bad news is that you can’t. Evolutionary psychologists suspect that we value consistency and loyalty over critical thinking because those are the mindsets that most benefit the tribe. We simply couldn’t survive unless we were surrounded by like-minded people as part of a mutual aid society. And the best way to build a prosperous, functioning, like-minded community is to subconsciously trick yourselves into liking the same things, not to punch holes in other people’s poor reasoning.

Cognitive biases are also really handy mental shortcuts. If you automatically avoid dark, creepy alleys at night without really thinking about why, you’re going to get mugged less often. The problem is that if you assume your biases are unimpeachable truths, you’re going to make bad decisions. You’ll also be an insufferable debate partner.

The good news is that you can overcome your own cognitive biases, at least partially. When you’re happy, well-fed, well-rested, and not under attack, you’ll be better able to navigate your own preconceptions. Provided you know what they are. Provided you know yourself.

A comprehensive overview of cognitive biases would require an entire blog post, and a long one at that. Happily, several have already been written. Here’s a list of the 58 most common cognitive biases, which you use regularly. This gentleman has lumped them into four clusters of biases, which makes them easier to remember. Finally, Harvard Business Review offers tips for transcending your cognitive biases, so you can make better business decisions.

And, more to the point, enjoy more productive disagreements.

Avoiding Logical Fallacies

“It is evident that some reasonings are genuine, while others seem to be so but are not. This happens with arguments, as also elsewhere, through a certain likeness between the genuine and the sham.” – Aristotle

So opens Aristotle’s On Sophistical Refutations, in which he introduces the 13 original logical fallacies. He had noticed that the orators of his day would often use rhetorical tricks to sway a crowd emotionally into supporting their side, rather than convincing the people with rational discourse. He analyzed their arguments, poked holes in them, and began the long list of logically unsound arguments forbidden in formal debate.

For example, you’re probably familiar with Equivocation (“to call by the same name”), which occurs when someone deliberately misuses a word that has two meanings. The classic example: “All trees have bark. All dogs bark. Therefore, all dogs are trees.”

That’s obviously fallacious, but not all equivocations are so obvious. Take “Evolution is just a theory.” In common parlance, a theory can mean “opinion” or “conjecture.” In the scientific community, a theory is a well-substantiated explanation. When someone deliberately swaps one meaning for the other to imply that evolution is an opinion no better than an uneducated eight-year-old’s, you have a logical fallacy.

Today, Aristotle’s original list of logical fallacies has expanded, and you can even download a free poster listing the 24 top offenders in today’s online arguments. It’s worth memorizing and understanding them for several reasons.

First, we all know when an argument is illogical, even if we can’t put our finger on why. When a Creationist lands a zinger like, “evolution is just a theory,” he may feel pretty good about his witty debate style. His Creationist friends will probably applaud it. But the proponent of evolution, and her allies, will just roll their eyes and be annoyed by the gambit.

If she’s memorized the common logical fallacies, and if the audience is familiar with the ground rules of productive argument, she could simply say, “Your logical fallacy is Equivocation, and your argument is invalid. Try again.” But until a majority of people understand the importance and usefulness of avoiding logical fallacies, debaters will be forced to use wit rather than reason to contradict them.

For instance, Richard Dawkins clapped back: “Evolution is just a theory? Well, so is gravity, and I don’t see you jumping out of buildings.” That’s as witty as the original logical fallacy. Strip away the cleverness, however, and you’ve got two people calling each other stupid, and two camps that have come no closer to agreement.

Take the time to learn what arguments are logical, however, and you will have done your part to move the discussion forward.

What Does A Productive Argument Look Like?

“Anybody can become angry – that is easy, but to be angry with the right person and to the right degree and at the right time and for the right purpose, and in the right way – that is not within everybody’s power and is not easy.”
― Aristotle

The most effective arguments, the ones that transform nations, win elections, and guide popular culture, are almost never the ones that follow the ground rules of productive debate. More often they are cynical appeals to emotion, cognitive bias, and the sort of tribalism that makes critical thinking seem unwise, dangerous, or even treasonous.

Which is the point. As social creatures, we instinctively want broad agreement, not the best solution. If you feel like you’re under attack, even in a silly online argument, that tendency to retreat into the crowd-approved consistency of conventional wisdom only intensifies. You have allies there. You will be safe there. Tribalism means never having to admit you’re wrong, or responsible if indeed you are wrong.

If you risk critical thinking, however, you also risk being held personally responsible for your position. You risk losing an argument. You risk changing your mind. A commitment to the ground rules of productive debate means accepting the possibility that you are wrong—and may need to admit that publicly.

So, what does a logical argument look like? It looks like two people who have educated themselves about the value of a logical argument, and who are willing to sacrifice an effective, emotional slogan in order to identify a less inspiring truth. It looks like two people who understand and admit to their own biases, and are actively trying to see through them.

It looks like two people who are willing to be vulnerable, who are willing to risk the self-inflicted shame of being wrong. Like two people who are actively trying to see the other side of the argument, and incorporate it into their own worldview.

In short, it looks like an endangered species. But with a little effort on your part, that could change.