What is Knowledge?


In this article, we explore a definition of knowledge and how the question 'what is knowledge?' differs from the question 'what is truth?' We'll look at a standard approach to defining knowledge and how postmodernists treat the problem of knowledge. We also look at how belief and what one believes relates to what one knows.

The Knowledge Problem

Studying knowledge is one of those perennial topics—like the nature of matter in the hard sciences—that philosophy has been refining since before the time of Plato. The discipline, epistemology, comes from two Greek words: episteme (επιστημη), meaning knowledge, and logos (λογος), meaning a word or reason. Epistemology literally means to reason about knowledge. Epistemologists study what makes up knowledge, what kinds of things we can know, what the limits are to what we can know, and even whether it’s possible to know anything at all.

Coming up with a definition of knowledge has proven difficult but we’ll take a look at a few attempts and examine the challenges we face in doing so. We’ll look at how prominent philosophers have wrestled with the topic and how postmodernists provide a different viewpoint on the problem of knowledge. We’ll also survey some modern work being done in psychology and philosophy that can help us understand the practical problems with navigating the enormous amounts of information we have at our disposal and how we can avoid problems in the way we come to know things.

Do We Know Stuff?

In order to answer that question, you probably have to have some idea what the term “know” means. If I asked, “Have you seen the flibbertijibbet at the fair today?” I’d guess you wouldn’t know how to answer. You’d probably start by asking me what a flibbertijibbet is. But most adults tend not to ask what knowledge is before they can evaluate whether they have it or not. We just claim to know stuff and most of us, I suspect, are pretty comfortable with that. There are lots of reasons for this but the most likely is that we have picked up a definition over time and have a general sense of what the term means. Many of us would probably say that knowing something is true involves:

  1. Certainty – it’s hard if not impossible to deny
  2. Evidence – it has to be based on something
  3. Practicality – it has to actually work in the real world
  4. Broad agreement – lots of people have to agree it’s true

But if you think about it, each of these has problems. For example, what would you claim to know that you would also say you are certain of? Let’s suppose you’re not intoxicated, high, or in some other way out of your “right” mind, and you conclude that you know you’re reading an article on the internet. You might go further and claim that denying it would be crazy. Isn’t it at least possible that you’re dreaming or that you’re in something like the Matrix and everything you see is an illusion? Before you say such a thing is absurd and only those who were unable to make the varsity football team would even consider such questions, can you be sure you’re not being tricked? After all, if you are in the Matrix, the robots that created the Matrix would be making you believe you are not in the Matrix and that you’re certain you aren’t.

What about the “broad agreement” criterion? The problem with this one is that many things we might claim to know are not, and could not be, broadly agreed upon. Suppose you are experiencing a pain in your arm. The pain is very strong and intense. You might tell your doctor that you know you’re in pain. Unfortunately though, only you can claim to know that (and as an added problem, you don’t appear to have any evidence for it either—you just feel the pain). So at least on the surface, it seems you know things that don’t have broad agreement by others.

These problems and many others are what intrigue philosophers and are what make coming up with a definition of knowledge challenging. Since it’s hard to nail down a definition, it also makes it hard to answer the question “what do you know?”

What is Knowledge?

As with many topics in philosophy, a broadly-agreed-upon definition is difficult. But philosophers have been attempting to construct one for centuries. Over the years, a trend has developed in the philosophical literature and a definition has emerged that has such wide agreement it has come to be known as the “standard definition.” While agreement with the definition isn’t universal, it can serve as a solid starting point for studying knowledge.

The definition involves three conditions and philosophers say that when a person meets these three conditions, she can say she knows something to be true. Take a statement of fact: The Seattle Mariners have never won a World Series. On the standard definition, a person knows this fact if:

  1. The person believes the statement to be true
  2. The statement is in fact true
  3. The person is justified in believing the statement to be true

The key terms—believes, true, and justified—earmark the three conditions that must be met, and because of those terms the definition is also called the “tripartite” (three-part) definition, or “JTB” for short. Many, many books have been written on each of the three terms, so I can only briefly summarize here what is going on in each. I will say up front, though, that epistemologists spend most of their time on the third condition.
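The tripartite definition can be stated compactly. Writing B_s(p) for “subject s believes proposition p” and J_s(p) for “s is justified in believing p” (notation chosen here just for illustration; it isn’t part of the standard presentation), the definition reads:

```latex
% JTB: s knows that p if and only if
% (1) s believes p, (2) p is true, and (3) s is justified in believing p
K_s(p) \iff B_s(p) \land p \land J_s(p)
```

The Gettier cases mentioned later in this article target the right-to-left direction of this biconditional: they describe situations in which all three conditions on the right seem to hold and yet we hesitate to say the person knows.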

Belief

First, beliefs are things people have. Beliefs aren’t like rocks or rowboats that you come across while strolling along the beach. They’re in your head and are generally viewed as just the way you hold the world (or some aspect of the world) to be. If you believe that the Mariners never won a World Series, you just accept it as true that the Mariners really never won a World Series. Notice that accepting that something is true implies that what you accept could be wrong. In other words, it implies that what you think about the world may not match up with the way the world really is. This implies that there is a distinction between belief and truth. There are some philosophers—notably postmodernists and existentialists—who think such a distinction can’t be made, which we’ll examine more below. But in general, philosophers claim that belief is in our heads and truth is about the way the world is. In practical terms, you can generally figure out what you or someone else believes by examining behavior. People will generally act according to what they really believe rather than what they say they believe—despite what Dylan says.

Truth

Something is true if the world really is that way. Truth is not in your head but is “out there.” The statement, “The Mariners have never won a World Series” is true if the Mariners have never won a World Series. The first part of that sentence is in quotes on purpose. The phrase in quotes signifies a statement we might make about the world, and the second, unquoted phrase is supposed to describe the way the world actually is. The reason philosophers write truth statements this way is to give sense to the idea that a statement about the world could be wrong or, more accurately, false (philosophers refer to the part in quotes as a statement or proposition). Perhaps you can now see why beliefs are different from truth statements. When you believe something, you hold or accept that a statement or proposition is true. It could be false; that’s why your belief may not “match up” with the way the world really is. For more on what truth is, see the Philosophy News article, “What is Truth?”

Justification

If the seed of knowledge is belief, what turns belief into knowledge? This is where justification (sometimes called ‘warrant’) comes in. A person knows something if they’re justified in believing it to be true (and, of course, it actually is true). There are dozens of competing theories of justification. It’s sometimes easier to describe when a belief isn’t justified than when it is. In general, philosophers agree that a person isn’t justified if their belief is:

  1. a product of wishful thinking (I really wish you would love me so I believe you love me)
  2. a product of fear or guilt (you’re terrified of death and so form the belief in an afterlife)
  3. formed in the wrong way (you travel to an area you know nothing about, see a white spot 500 yards away and conclude it’s a sheep)
  4. a product of dumb luck or guesswork (you randomly form the belief that the next person you meet will have hazel eyes and it turns out that the next person you meet has hazel eyes)

Beliefs come in all shapes and sizes, and it’s hard to find a single theory of justification that can account for everything we would want to claim to know. You might be justified in believing that the sun is roughly 93 million miles from the earth in a very different way than you would be justified in believing that God exists or that you have a minor back pain. Even so, justification is a critical element in any theory of knowledge and is the focus of much philosophical thought.

[Incidentally: while JTB is generally considered a starting point for a definition, it is by no means the final word. Many philosophers reject the JTB formulation altogether and others think that, at the very least, JTB needs to be “fixed up” somehow. Regarding this latter category, a small paper written by a philosopher named Edmund Gettier kicked off a brouhaha that made philosophers doubt that JTB was sufficient for knowledge. Gettier’s paper was roughly two and a half pages long (almost unheard of in philosophy) but has become so important that the issues he raised are known as The Gettier Problem.]

People-centered Knowledge

You might notice that the description above puts the focus of knowing on the individual. Philosophers talk of individual persons being justified and not the ideas or concepts themselves being justified. This means that what may count as knowledge for you may not count as knowledge for me. Suppose you study economics and you learn principles in the field to some depth. Based on what you learn, you come to believe that psychological attitudes have just as much of a role to play in economic flourishing or deprivation as the political environment that creates economic policy. Suppose also that I have not studied economics all that much but I do know that I’d like more money in my pocket. You and I may have very different beliefs about economics and our beliefs might be justified in very different ways. What you know may not be something I know even though we have the same evidence and arguments in front of us.

So the subjective nature of knowledge is partly based on the idea that beliefs are things that individuals have and that those beliefs are justified or not justified. When you think about it, that makes sense. You may have more evidence or different experiences than I have, and so you may believe things I don’t or may have evidence for something that I don’t have. The bottom line is that “universal knowledge”—something everybody knows—may be very hard to come by. Truth, if it exists, isn’t like this. Truth is universal. It’s our access to it that may differ widely.

Rene Descartes and the Search for Universal Knowledge

A lot of people are uncomfortable with the idea that there isn’t universal knowledge. Philosopher Rene Descartes (pronounced day-cart) was one of them. When he was a young man, he was taught a bunch of stuff by his parents, teachers, priests and other authorities. As he came of age, he, like many of us, started to discover that much of what he was taught either was false or was highly questionable. At the very least, he found he couldn’t have the certainty that many of his educators had. While many of us get that, deal with it, and move on, Descartes was deeply troubled by this.

One day, he decided to tackle the problem. He hid himself away in a cabin and attempted to doubt everything of which he could not be certain. Since it wasn’t practical to doubt every belief he had, Descartes decided that it would be sufficient to subject the foundations of his belief system to doubt; the rest of the structure would then “crumble of its own accord.” He first considered the things he had come to believe by way of the five senses. For most of us these are pretty stable items, but Descartes found that it was rather easy to doubt their truth. The biggest problem is that the senses can sometimes be deceptive. And after all, could he be certain he wasn’t insane or dreaming when he saw that book or tasted that honey? So while the senses might be fairly reliable, they don’t provide us with certainty—which is what Descartes was after.

Next he looked at mathematics. If certainty is to be found, it must be here. He reasoned that the outcomes of mathematical formulas and theorems hold both in dreams and in waking, so at the very least mathematics fares better than the senses. But he developed an argument from which he could not spare even math. Suppose there is an evil genius, he thought, that is “supremely powerful and clever,” bent upon deceiving Descartes, and who developed mathematics as a device to carry out his evil deceptions (The Matrix should be coming to mind about now). Descartes found there was no way to rule out this possibility. Whether it’s highly unlikely isn’t the point. Descartes was looking for certainty, and if there is even a slim possibility that he’s being deceived, he had to throw out mathematics too.

Unfortunately, this left Descartes with nowhere to turn. He found that he could be skeptical about everything and was unable to find a certain foundation for knowledge. But then he hit upon something that changed modern epistemology. He discovered that there was one thing he couldn’t doubt: the fact that he was a thinking thing. In order to doubt it, he would have to think. He reasoned that it’s not possible to doubt something without thinking about the fact that you’re doubting. If he was thinking, then he must be a thinking thing, and so he found that it was impossible to doubt that he was a thinking being.

This seemingly small but significant truth led to his most famous contribution to Western thought: cogito ergo sum (I think, therefore I am). Some mistakenly think that Descartes was implying with this idea that he thinks himself into existence. But that wasn’t his point at all. He was making a claim about knowledge. Really what Descartes was saying is: I think, therefore I know that I am.

The story doesn’t end here for Descartes but for the rest of it, I refer you to the reading list below to dig deeper. The story of Descartes is meant to illustrate the depth of the problems of epistemology and how difficult and rare certainty is, if certainty is possible—there are plenty of philosophers who think either that Descartes’ project failed or that he created a whole new set of problems that are even more intractable than the one he set out to solve.

Postmodernism and Knowledge

Postmodern epistemology is a growing area of study and is relatively new on the scene compared with definitions that have come out of the analytic tradition in philosophy. Generally, though, it means taking a specific, skeptical attitude towards certainty and a subjective view of belief and knowledge. Postmodernists see truth as much more fluid than classical (or modernist) epistemologists do. Using the terms we learned above, they reject the idea that we can ever be fully justified in holding that our beliefs line up with the way the world actually is. We can’t know that we know.
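One way to put this claim in the notation of epistemic logic (an illustrative gloss, not the postmodernists’ own formalism) is as a denial of the so-called KK principle, which says that knowing entails knowing that you know:

```latex
% KK (positive introspection): if s knows p, then s knows that s knows p
K_s\, p \rightarrow K_s K_s\, p
```

On the reading sketched here, even if our beliefs happen to line up with the world, we are never in a position to verify from the inside that they do, so the consequent never holds for us.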

Perspective at the Center

In order to have certainty, postmodernists claim, we would need to be able to “stand outside” our own beliefs and look at our beliefs and the world without any mental lenses or perspective. It’s similar to wondering what it would be like to watch ourselves meeting someone for the first time. We can’t do it. We can watch the event of the meeting on a video, but the experience of meeting can only be had by us, from “inside” our minds and bodies. Since it’s not possible to stand outside our minds, all the parts that make up our minds influence our view of what is true. Our intellectual and social background, our biases, our moods, our genetics, other beliefs we have, our likes and dislikes, our passions (we can put all these under the label of our “cognitive structure”) all influence how we perceive what is true about the world. Further, say the postmodernists, it’s not possible to set aside these influences or lenses. We can certainly reduce their intensity here and there, and come to recognize biases and adjust for them. But it’s not possible to completely shed all the lenses that color our view of things, and so it’s not possible to be certain that we’re getting at some truth “out there.”

Many have called out what seems to be a problem with the postmodernist approach. Notice that as soon as a postmodernist makes a claim about truth and knowledge, they seem to be making a truth statement! If all beliefs are seen through a lens, how do we know the postmodernists’ beliefs are “correct”? That’s a good question, and the postmodernist might respond by saying, “We don’t!” But then, why believe it? Because of this obvious problem, many postmodernists attempt simply to live with postmodernist “attitudes” towards epistemology and avoid saying that they’re making claims that would fit into traditional categories. We have to change our perspective to understand the claims.

Community Agreement

To be sure, postmodernists do tend to act like the rest of us when it comes to interacting with the world. They drive cars, fly in airplanes, make computer programs, and write books. But how is this possible if they take such a fluid view of knowledge? Postmodernists don’t eschew truth in general. They reject the idea that any one person’s beliefs about it can be certain. Rather, they claim that truth emerges through community agreement. Suppose scientists are attempting to determine whether the planet is warming and whether humans are the cause. This is a complex question, and a postmodernist might say that if the majority of scientists agree that the earth is warming and that humans are the cause, then that’s true. Notice that the criterion for “truth” is that scientists agree. To use the taxonomy above, this would be the “justification condition.” So we might say that postmodernists accept the first and third conditions of the tripartite view but reject the second: the idea that there is a truth outside our minds to which beliefs need to align.

When you think about it, a lot of what we would call “facts” are determined in just this way. For many years, scientists believed in a substance called “phlogiston.” Phlogiston was stuff that existed in combustible substances (like wood and metal) and was released when those substances were burned. Phlogiston was even believed by some to have negative weight, which helped explain puzzling changes in weight during burning. That theory has since been rejected and replaced by more sophisticated views involving oxygen and oxidation.

So, was the phlogiston theory true? The modernist would claim it wasn’t because it has since been shown to be false. It’s false now and was false then even though scientists believed it was true. Beliefs about phlogiston didn’t line up with the way the world really is, so it was false. But the postmodernist might say that phlogiston theory was true for the scientists that believed it. We now have other theories that are true. But phlogiston theory was no less true then than oxygen theory is now. Further, they might add, how do we know that oxygen theory is really the truth? Oxygen theory might be supplanted some day as well but that doesn’t make it any less true today.

Knowledge and the Mental Life

As you might expect, philosophers are not the only ones interested in how knowledge works. Psychologists, social scientists, cognitive scientists and neuroscientists have been interested in this topic as well and, with the growth of the field of artificial intelligence, even computer scientists have gotten into the game. In this section, we’ll look at how work being done in psychology and behavioral science can inform our understanding of how human knowing works.

Thus far, we’ve looked at the structure of knowledge once beliefs are formed. Many thinkers are interested in how belief formation itself shapes our perception of what we think we know. Put another way, we may form a belief that something is true, but the way our minds formed that belief has a big impact on why we think we know it. The science is uncovering that, in many cases, the process of forming the belief went wrong somewhere and our minds have actually tricked us into believing it’s true. These mental tricks may be based on good evolutionary principles: they are (or at least were at some point in our past) conducive to survival. But we may not be aware of this trickery and may be entirely convinced that we formed the belief in the right way and so have knowledge. The broad term used for this phenomenon is “cognitive bias,” and mental biases have a significant influence over how we form beliefs and how we perceive the beliefs we form.1

Wired for Bias

A cognitive bias is a typically unconscious “mental trick” our minds play that leads us to form beliefs that may be false, or that attend to some facts while leaving out others, so that these beliefs align with other things we believe, promote mental safety, or provide grounds for sticking to a set of goals we want to achieve. Put more simply, mental biases cause us to form false beliefs about ourselves and the world. The fact that our minds do this is not necessarily intentional or malevolent and, in many cases, the outcomes of these false beliefs can be positive for the person who holds them. But epistemologists (and ethicists) argue that the ends don’t always justify the means when it comes to belief formation. As a general rule, we want to form true beliefs in the “right” way.

Ernest Becker, in his important Pulitzer Prize-winning book The Denial of Death, attempts to get at the psychology behind why we form the beliefs we do. He also explores why we may be closed off to alternative viewpoints and why we tend to become apologists (defenders) of the viewpoints we hold. One of his arguments is that we as humans build an ego (in the Freudian sense; what he calls “character armor”) out of the beliefs we hold, and those beliefs tend to give us meaning and are strengthened when more people hold the same viewpoint. In a particularly searing passage, he writes:

Each person thinks that he has the formula for triumphing over life’s limitations and knows with authority what it means to be a man [N.B. by ‘man’ Becker means ‘human’ and uses masculine pronouns as that was common practice when he wrote the book], and he usually tries to win a following for his particular patent. Today we know that people try so hard to win converts for their point of view because it is more than merely an outlook on life: it is an immortality formula. . . in matters of immortality everyone has the same self-righteous conviction. The thing seems perverse because each diametrically opposed view is put forth with the same maddening certainty; and authorities who are equally unimpeachable hold opposite views! (Becker, Ernest. The Denial of Death, pp. 255-256. Free Press.)

In other words, being convinced that our viewpoint is correct and winning converts to that viewpoint is how we establish ourselves as persons of meaning and significance, and this inclination is deeply ingrained in our psychological equipment. This is not only why biases are so prevalent but why they’re difficult to detect. We are, argue Becker and others, wired towards bias. Jonathan Haidt agrees and goes so far as to say that reason and logic are not the cure but a core part of the wiring that causes the phenomenon.

Anyone who values truth should stop worshipping reason. We all need to take a cold hard look at the evidence and see reasoning for what it is. The French cognitive scientists Hugo Mercier and Dan Sperber recently reviewed the vast research literature on motivated reasoning (in social psychology) and on the biases and errors of reasoning (in cognitive psychology). They concluded that most of the bizarre and depressing research findings make perfect sense once you see reasoning as having evolved not to help us find truth but to help us engage in arguments, persuasion, and manipulation in the context of discussions with other people. (Haidt, Jonathan. The Righteous Mind: Why Good People Are Divided by Politics and Religion (p. 104). Knopf Doubleday Publishing Group.)

Biases and Belief Formation

Research in social science and psychology is uncovering myriad ways in which our minds play these mental tricks. For example, Daniel Kahneman discusses the impact emotional priming has on the formation of a subsequent idea. In one study, when participants were asked about happiness as it related to their romantic experiences, those who had had a lot of dates in the past reported that they were happy about their life, while those who had had no dates reported being lonely, isolated, and rejected. When they were subsequently asked about their happiness in general, they carried the context of their dating happiness over to their happiness in general, regardless of how well or badly the rest of their lives seemed to be going. A person who would have rated their overall happiness as “very happy” when asked about general happiness alone might rate it as only “somewhat happy” if they were asked about their romantic happiness just prior and that romantic happiness was more negative than positive.

This type of priming can significantly impact how we view what is true. Answers to questions about whether we need more gun control, or whether we should regulate fatty foods, will change right after a local shooting or right after someone suffers a heart scare. The same question will get two different responses from the same person depending on whether he or she was primed or not. Jonathan Haidt relates similar examples.

Psychologists now have file cabinets full of findings on ‘motivated reasoning,’ showing the many tricks people use to reach the conclusions they want to reach. When subjects are told that an intelligence test gave them a low score, they choose to read articles criticizing (rather than supporting) the validity of IQ tests. When people read a (fictitious) scientific study that reports a link between caffeine consumption and breast cancer, women who are heavy coffee drinkers find more flaws in the study than do men and less caffeinated women. (Haidt, p. 98)

There are many other biases that influence our thinking. When we ask the question, “what is knowledge?” this research has to be a part of how we answer it. Biases and their influence would fall under the broad category of the justification condition we looked at earlier, and the research should inform how we view the way beliefs are justified. Justification is not merely the application of a philosophical formula. There are a host of psychological and social influences at play when we seek to justify a belief and turn it into knowledge.2 We can also see how this research lends credence to the philosophical position of the postmodernists. At the very least, even if we hold that we can get past our biases and move nearer to the truth, we have good reason to be careful about the things we assert as true and to adopt a tentative stance towards the truth of our beliefs.

In a day when “fake news” is a big concern and the amount of information for which we’re responsible grows each day, how we justify the beliefs we hold becomes an even more important enterprise. I’ll use a final quote from Haidt to conclude this section:

And now that we all have access to search engines on our cell phones, we can call up a team of supportive scientists for almost any conclusion twenty-four hours a day. Whatever you want to believe about the causes of global warming or whether a fetus can feel pain, just Google your belief. You’ll find partisan websites summarizing and sometimes distorting relevant scientific studies. Science is a smorgasbord, and Google will guide you to the study that’s right for you. (Haidt, pp. 99-100)

Making Knowledge Practical

Well, most of us aren’t like Descartes. We have lives and don’t want to spend time trying to figure out whether we’re the cruel joke of some clandestine mad scientist. But we do care about this topic whether we “know” it or not. A bit of reflection exposes just how important having a solid view of knowledge is, and spending some focused time thinking more deeply about knowledge can help us get better at knowing.

Really, knowledge is at the root of many (dare I say most) challenges we face in a given day. Once you get past basic survival (though even things as basic as finding enough food and shelter involve challenges related to knowledge), we’re confronted with knowledge issues on almost every front. Knowledge questions range from larger, weightier ones like figuring out who our real friends are, what to do with our career, how to spend our time, which politician to vote for, how to spend or invest our money, or whether to be religious, to more mundane ones like which gear to buy for our hobby, how to solve a dispute between the kids, where to go for dinner, or which book to read in our free time. We make knowledge decisions all day, every day, and some of those decisions deeply impact our lives and the lives of those around us.

So all these decisions we make about factors that affect the way we and others live are grounded in our view of knowledge—our epistemology. Unfortunately, few spend enough time thinking about the root of their decisions, and many make knowledge choices based on how they were raised (my mom always voted Republican so I will), what’s easiest (if I don’t believe in God, I’ll be shunned by my friends and family), or just good, old-fashioned laziness. But of all the things to spend time on, it seems thinking about how we come to know things should be at the top of the list, given the central role it plays in just about everything we do.

Updated January, 2018: Removed dated material and general clean up; added section on cognitive biases.
Updated March, 2014: Removed reference to dated events; removed section on thought experiment; added section on Postmodernism; minor formatting changes


  1. While many thinkers have written on cognitive biases in one form or another, Jonathan Haidt in his book The Righteous Mind and Daniel Kahneman in his book Thinking, Fast and Slow have done seminal work to systematize and provide hard data around how the mind operates when it comes to belief formation and biases. There is much more work to be done for sure, but these books, part philosophy, part psychology, part social science, provide the foundation for further study in this area. The field of study is already large and growing, so I can only provide a thumbnail sketch of how belief formation is influenced by our minds and other factors. I refer the reader to the source material on this topic for further study (see reading list below).
  2. For a strategy on how we can adjust for these natural biases that our minds seem wired to create, see the Philosophy News article, “How to Argue With People”. I also recommend Carol Dweck’s excellent book Mindset.

For Further Reading

  • Epistemology: Classic Problems and Contemporary Responses (Elements of Philosophy) by Laurence BonJour. One of the better introductions to the theory of knowledge. Written at the college level, this book should be accessible for most readers but have a good philosophical dictionary on hand.
  • Belief, Justification, and Knowledge: An Introduction to Epistemology (Wadsworth Basic Issues in Philosophy Series) by Robert Audi. This book has been used as a text book in college courses on epistemology so may be a bit out of range for the general reader. However, it gives a good overview of many of the issues in the theory of knowledge and is a fine primer for anyone interested in the subject.
  • The Theory of Knowledge: Classic and Contemporary Readings by Louis Pojman. Still one of the best books for primary source material. The edited articles have helpful introductions and Pojman covers a range of sources so the reader will get a good overview from many sides of the question. Written mainly as a textbook.
  • The Stuff of Thought: Language as a Window into Human Nature  by Steven Pinker. While not strictly a book about knowledge per se, Pinker’s book is fun, accessible, and a good resource for getting an overview of some contemporary work being done mainly in the hard sciences.
  • The Selections From the Principles of Philosophy by René Descartes. A good place to start to hear from Descartes himself.
  • Descartes’ Bones: A Skeletal History of the Conflict between Faith and Reason by Russell Shorto. This book is written as a history so it’s not strictly a philosophy tome. However, it gives the general reader some insight into what Descartes and his contemporaries were dealing with and is a fun read.
  • On Bullshit by Harry Frankfurt. One gets the sense that Frankfurt was being a bit tongue-in-cheek with this small, engaging tract. It’s more of a commentary on the social aspect of epistemology and worth reading for that reason alone. Makes a great gift!
  • On Truth by Harry Frankfurt. Like On Bullshit but on truth.
  • A Rulebook for Arguments by Anthony Weston. A handy reference for constructing logical arguments. This is a fine little book to have on your shelf regardless of what you do for a living.
  • Warrant: The Current Debate  by Alvin Plantinga. Now over 25 years old, “current” in the title may seem anachronistic. Still, many of the issues Plantinga deals with are with us today and his narrative is sure to enlighten and prime the pump for further study.
  • Thinking, Fast and Slow by Daniel Kahneman. The book to begin a study on cognitive biases.
  • The Righteous Mind by Jonathan Haidt. A solid book that dabbles in cognitive biases but also in why people form and hold beliefs and how to start a conversation about them.
  • The Denial of Death by Ernest Becker. A neo (or is it post?) Freudian analysis of why we do what we do. Essential reading for better understanding why we form the beliefs we do.
  • Mindset: The New Psychology of Success by Carol S. Dweck. The title reads like a self-help book but the content is actually solid and helpful for developing an approach to forming and sharing ideas.
