Knowledge vs. Belief

Knowledge: “… a familiarity, awareness or understanding of someone or something, such as facts, information, descriptions, or skills, which is acquired through experience or education by perceiving, discovering, or learning. Knowledge can refer to a theoretical or practical understanding of a subject. It can be implicit (as with practical skill or expertise) or explicit (as with the theoretical understanding of a subject); it can be more or less formal or systematic.” __ Wikipedia “Knowledge”

Belief: “… a state of the mind, treated in various academic disciplines, especially philosophy and psychology, as well as traditional culture, in which a subject roughly regards a thing to be true.” __ Wikipedia “Belief”

A belief becomes knowledge when it is validated by solid evidence. In practice, though, validation can only establish a probabilistic likelihood that a belief is true or false; certainty remains out of reach. This distinction was once the province of philosophers, but with the coming of science, philosophy has been forced to make room for empirical and statistical testing.
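To make the probabilistic point concrete, here is a minimal sketch in Python of Bayesian updating, the standard formal model of how evidence shifts the likelihood of a belief. All of the numbers are illustrative assumptions, not data:

    # Bayesian updating: a belief never becomes certain; it only
    # becomes more (or less) probable as evidence accumulates.
    # All numbers below are illustrative assumptions, not data.

    def update(prior, p_evidence_if_true, p_evidence_if_false):
        """Return the posterior probability of a belief after one
        piece of evidence, via Bayes' rule."""
        numerator = p_evidence_if_true * prior
        denominator = numerator + p_evidence_if_false * (1.0 - prior)
        return numerator / denominator

    belief = 0.50  # start agnostic: 50% credence
    # Suppose each observation is 4x as likely if the belief is true.
    for n in range(5):
        belief = update(belief, 0.8, 0.2)
        print(f"after observation {n + 1}: credence = {belief:.3f}")

Even after five confirming observations the credence approaches, but never reaches, 1.0; that gap is exactly the distinction drawn above between validated belief and certain knowledge.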

Persons who confuse the two concepts often spend much of their lives mired in futile controversy. When beliefs become systematised and incorporated into ideologies, they can acquire a lock-grip upon an individual’s mind and life trajectory.

Ideology: “… a set of conscious and unconscious ideas that constitute one’s goals, expectations, and actions. An ideology is a comprehensive normative vision, a way of looking at things, as argued in several philosophical tendencies (see political ideologies), and/or a set of ideas proposed by the dominant class of a society to all members of this society (a “received consciousness” or product of socialization)…” __ Wikipedia “Ideology”

Ideologies incorporate many beliefs. But ideologies also influence the ongoing formation and retention of new beliefs.

Knowledge, beliefs, and ideologies all incorporate significant subconscious and unconscious components. But of the three, knowledge is closer to the surface — and the most vulnerable to real-world testing and validation.

Ideally, all of our beliefs (and ideologies) would be supported well enough to be treated as “knowledge.” But that is rarely true. Usually, our beliefs and belief systems are built on a foundation of loose sand. But that does not necessarily make us more humble in our beliefs — often quite the opposite.

Sometimes we feel that if we express our beliefs loudly enough and often enough, they will become the equivalent of supported knowledge. This is “the big lie” approach used so effectively by Hitler and by Soviet propagandists (some of whom are still active in Moscow), except here it is deployed by our own subconscious — against our own rational selves.

The big lie is, in fact, a common method of personal and social manipulation that is as old as the human race. It even predates spoken language. Gestures, body language, facial expressions, touch — all of these can be even more effective in deception and self-deception than speech or writing.

Here at the Al Fin Dangerous Child Institute, we often say: “Everything you think you know, just ain’t so.” But that is not strictly true. Some forms of knowledge are tested on a daily basis, and prove themselves true. The knowledge of a surgeon in the operating room, of a chef in the kitchen, of a taxi driver finding the best way through varying traffic and other changing obstructions, of a boat captain navigating through dangerous shoals in a storm at night — these and many other examples of combined implicit and explicit knowledge provide a more meaningful view of “truth” or “reality” than one will find in ideologies and belief systems.

Whenever you are tempted to “set someone straight” about “beliefs” on a subject where you feel yourself more knowledgeable, try providing supportable information rather than unsupportable opinions. No matter how often opinions are repeated, they do not gain more weight. Opinions — beliefs — can only gain weight when they are supported by testable “facts.”

It is more than enough for each person to try to rein in his own biases, delusions, and illusions. Trying to do so for another person is futile, although just a few carefully weighted words can sometimes help. If one is truly trying to assist someone back onto what he sees as “the rational path”, the fewer words said, the better.

Here is an example of an attempt to introduce a touch of reality into the thinking of a person who believes strongly in the promise of big solar energy:
http://www.americanthinker.com/2014/07/five_fatal_flaws_of_solar_energy.html

Notice the relative brevity, and the reduction of the argument to 5 main points. These 5 points are briefly elaborated and supported by analogies and numerical arguments. The article is framed so as to offer testable information that might lead a reader into a deeper study of the issue.

Contrast the above approach with the approach so often used by green partisans, who generally avoid the pivotal issues that must be faced in order to make their case.

An honest argument sticks to pertinent facts and to claims that can be falsified. A dishonest argument avoids the key issues, making liberal use of logical fallacies and appeals to cognitive biases in order to dance around them.

And this is the problem faced by modern societies today: the big 3 shapers of opinion — media, academia, and government — are all dedicated to pushing particular agendas, rather than to helping individuals sort out the truth of things for themselves. We expect such things from government and media, but when teachers and professors commonly choose the dark side — and victimise their students in this way — people may feel it is time to bring out the guillotines.

And once the mobs get started chopping heads, they rarely know when to stop.
