Cultural bias

Cultural bias is interpreting and judging phenomena in terms particular to one's own culture. This is a danger in any field of knowledge that claims objectivity and universality, such as philosophy and the natural sciences. The problem of cultural bias is central to social and human sciences, such as economics, psychology, anthropology and sociology, which have had to develop methods and theories to compensate for or eliminate cultural bias.

Cultural bias occurs when people of a culture make assumptions about conventions, including conventions of language, notation, proof and evidence. They can then mistake these assumptions for laws of logic or nature.

Numerous such biases are believed to exist, concerning cultural norms for color, location of body parts, mate selection, concepts of justice, linguistic and logical validity, acceptability of evidence, and taboos. In brief, nearly any normative human belief appears to be shaped by culture, and thus can reasonably be examined as a potential cultural bias. See goodness and value theory.

People who read English often assume that it is natural to scan a visual field from left to right and from top to bottom. In most Western countries, a light switch usually turns a light on when flipped up, north is at the top of a map, and "up" conventionally signifies the larger quantity or the better state. As another example, Japanese readers do not place an X in a check-box to indicate acceptance; there, an X indicates refusal.

These conventions are generally useful: once one is accustomed to light switches behaving a certain way, one need not learn a rule for each individual switch, only the general rule. Unfortunately, when people move between cultures, or design something for a different group, they often fail to consider which conventions carry over and which do not.
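The point about notational conventions can be made concrete with number formatting: the same quantity is written with different decimal and grouping separators in different cultures, and software that hard-codes one convention embeds a cultural bias. The sketch below uses a small illustrative separator table (a real system would draw on proper locale data rather than this hypothetical dictionary):

```python
# Decimal and grouping separators differ by culture: "1,234.56" in the
# US is written "1.234,56" in much of continental Europe.

# Illustrative separator table; a real system would use locale data.
CONVENTIONS = {
    "en_US": {"decimal": ".", "group": ","},
    "de_DE": {"decimal": ",", "group": "."},
}

def format_number(value: float, convention: str) -> str:
    """Format a number using the given culture's separators."""
    sep = CONVENTIONS[convention]
    # Produce US-style separators first, then swap them for the
    # target culture's separators.
    text = f"{value:,.2f}"
    return text.translate(
        str.maketrans({",": sep["group"], ".": sep["decimal"]})
    )

print(format_number(1234.56, "en_US"))  # 1,234.56
print(format_number(1234.56, "de_DE"))  # 1.234,56
```

A reader accustomed to one convention may parse the other's output as a different number entirely, which is precisely the kind of mistaken "law of notation" described above.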

Linguistic and ethnic groups often do not share these notational assumptions. Notational and operative assumptions can undermine control systems when the users come from a different culture than the designers. In safety-critical systems, control panels and similar devices must therefore be validated to prevent errors arising from cultural biases.
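One way such validation might be sketched is as a pre-deployment check that compares the designer's assumed conventions against those of the target users and flags any mismatches. The convention table below is illustrative, populated only with the examples already mentioned in this article, not authoritative locale data:

```python
# Sketch of a design-review check: flag conventions that differ
# between the designers' culture and the users' culture.
# Table values are illustrative, taken from the examples above
# (an X in a check-box means acceptance in the US but refusal in
# Japan; maps in both places put north at the top).
CONVENTIONS = {
    "US": {"checkbox_x": "accept", "map_top": "north"},
    "JP": {"checkbox_x": "refuse", "map_top": "north"},
}

def convention_mismatches(designer: str, users: str) -> list[str]:
    """Return the conventions that differ between two cultures."""
    d, u = CONVENTIONS[designer], CONVENTIONS[users]
    return [key for key in d if d[key] != u[key]]

print(convention_mismatches("US", "JP"))  # ['checkbox_x']
```

In a safety-critical setting, each flagged mismatch would prompt a redesign or an explicit relabeling, rather than relying on users to override their ingrained convention.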

As one might expect, the effects on philosophy are profound, and not merely limited to ethics.

Philosophically, people trained in Western reasoning habitually assume that the accumulation of independent evidence constitutes proof that an event occurred, even if no one witnessed it.

Western scientific reasoning is even more generous. It assumes that independent evidence can prove theories, indicate the existence of unobservable objects, and validate new forms of logic. Further, in some sense, these theories, objects and logics are only conditionally true: they can cease to hold simply through a change in recorded human experience.

At the same time, scientific reasoning discards entire classes of explanations without consulting evidence, such as the assumption that supernatural beings cannot affect physical experiments.

These assumptions can be denied, and when they are, large parts of science and logic fail, taking much of our knowledge with them.

Significantly for a philosopher, none of these skeptical denials requires one to be skeptical of logic or language. Thus, these forms of skepticism do not obviously invalidate themselves, as a direct denial of the validity of logic or language would. This means that they might be formally valid.

Wittgenstein proposed a more practical way of classifying cultural biases. He held that the critical question is whether we would enforce a bias coercively, by locking up or medicating people who defied it; that is, whether people who violated the norm would be considered "crazy." He regarded cultural biases of this sort as much more strongly held, perhaps grounded in human biology or physical reality in some way.

Compare with disconfirmation bias.