A cognitive bias is a pattern of deviation in judgment that occurs in particular situations, leading to perceptual distortion, inaccurate judgment, illogical interpretation, or what is broadly called irrationality. Implicit in the concept of a "pattern of deviation" is a standard of comparison with what is normatively expected; this may be the judgment of people outside those particular situations, or may be a set of independently verifiable facts. A continually evolving list of cognitive biases has been identified over the last six decades of research on human judgment and decision-making in cognitive science, social psychology, and behavioral economics.
Cognitive biases are instances of evolved mental behavior. Some are presumably adaptive, for example, because they lead to more effective actions in given contexts or enable faster decisions when faster decisions are of greater value (heuristics). Others presumably result from a lack of appropriate mental mechanisms (bounded rationality), or simply from mental noise and distortions.
Bias arises from various processes that are sometimes difficult to distinguish. These include information-processing shortcuts (heuristics), mental noise and the mind's limited information-processing capacity, emotional and moral motivations, and social influence.
The notion of cognitive biases was introduced by Amos Tversky and Daniel Kahneman in 1972 and grew out of their experience of people's innumeracy, or inability to reason intuitively with greater orders of magnitude. They and their colleagues demonstrated several replicable ways in which human judgments and decisions differ from rational choice theory. They explained these differences in terms of heuristics: rules which are simple for the brain to compute but which introduce systematic errors. For instance, the availability heuristic occurs when the ease with which something comes to mind is used to estimate how often (or how recently) it has been encountered.
These experiments grew into the heuristics and biases research program which spread beyond academic psychology into other disciplines including medicine and political science. It was a major factor in the emergence of behavioral economics, earning Kahneman a Nobel Prize in 2002. Tversky and Kahneman developed prospect theory as a more realistic alternative to rational choice theory.
Critics of Kahneman and Tversky such as Gerd Gigerenzer argue that heuristics should not lead us to conceive of human thinking as riddled with irrational cognitive biases, but rather to conceive rationality as an adaptive tool that is not identical to the rules of formal logic or the probability calculus.
Biases can be distinguished on a number of dimensions. For example, there are biases specific to groups (such as the risky shift) as well as biases at the individual level.
Some biases affect decision-making, where the desirability of options has to be considered (e.g., the sunk cost fallacy). Others, such as illusory correlation, affect judgment of how likely something is, or of whether one thing is the cause of another. A distinctive class of biases affects memory, such as consistency bias (remembering one's past attitudes and behavior as more similar to one's present attitudes).
Some biases reflect a subject's motivation, for example, the desire for a positive self-image leading to egocentric bias and the avoidance of unpleasant cognitive dissonance. Other biases are due to the particular way the brain perceives, forms memories, and makes judgments. This distinction is sometimes described as "hot cognition" versus "cold cognition", since motivated reasoning can involve a state of arousal.
Among the "cold" biases, some are due to ignoring relevant information (e.g., neglect of probability), whereas some involve a decision or judgment being affected by irrelevant information (for example, the framing effect, where the same problem receives different responses depending on how it is described) or by excessive weight given to an unimportant but salient feature of the problem (e.g., anchoring).
Some biases reflect motivation, in particular the motivation to hold positive attitudes toward oneself; this accounts for the fact that many biases are self-serving or self-directed (e.g., illusion of asymmetric insight, self-serving bias, projection bias). There are also biases in how subjects evaluate in-groups and out-groups: in-groups are evaluated as more diverse and "better" in many respects, even when those groups are arbitrarily defined (ingroup bias, outgroup homogeneity bias).
Some cognitive biases belong to the subgroup of attentional biases, which refer to increased attention paid to certain stimuli. It has been shown, for example, that people addicted to alcohol and other drugs pay more attention to drug-related stimuli. Common psychological tests used to measure these biases are the Stroop task and the dot-probe task.
A 2012 Psychological Bulletin article suggests that at least eight seemingly unrelated biases can be produced by the same information-theoretic generative mechanism. The article shows that noisy deviations in the memory-based information processes that convert objective evidence (observations) into subjective estimates (decisions) can produce regressive conservatism, Bayesian conservatism, illusory correlations, the better-than-average and worse-than-average effects, the subadditivity effect, exaggerated expectation, overconfidence, and the hard–easy effect.
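The core idea of that mechanism can be illustrated with a toy simulation (a hedged sketch, not the article's actual model): when an unbiased noisy channel converts true values into estimates on a bounded scale, the estimates regress toward the middle of the scale, so small true values are overestimated and large ones underestimated, even though no individual estimate is deliberately biased.

```python
import random

random.seed(0)

# Toy sketch: convert "objective evidence" into "subjective estimates"
# through a noisy channel, then look at the average estimate conditional
# on the true value. The noise itself is unbiased (zero-mean Gaussian);
# the bounded response scale alone produces the regressive pattern.
def noisy_estimate(true_value, noise_sd=15.0, lo=0.0, hi=100.0):
    """Add zero-mean noise, then clamp to the valid scale [lo, hi]."""
    e = random.gauss(true_value, noise_sd)
    return max(lo, min(hi, e))

def mean_estimate(true_value, trials=20000):
    """Average subjective estimate for a fixed objective value."""
    return sum(noisy_estimate(true_value) for _ in range(trials)) / trials

# Small true values are pulled up toward the middle of the scale,
# large true values are pulled down: regressive conservatism.
low = mean_estimate(10.0)   # noticeably above 10
high = mean_estimate(90.0)  # noticeably below 90
print(low, high)
```

Names such as `noisy_estimate` and the parameter values here are illustrative assumptions; the point is only that unbiased noise plus a bounded scale suffices to generate a systematic, directional distortion.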
Many social institutions rely on individuals to make rational judgments. A fair jury trial, for example, requires that the jury ignore irrelevant features of the case, weigh the relevant features appropriately, consider different possibilities open-mindedly, and resist fallacies such as appeal to emotion. The various biases demonstrated in these psychological experiments suggest that people will frequently fail to do all these things. However, these failures are systematic and directional, and therefore predictable.
Cognitive biases are also related to the persistence of superstition and to large social issues such as prejudice, and they hinder public acceptance of non-intuitive scientific knowledge.
In the heuristics-and-biases literature, it is almost impossible to make an accurate and unbiased decision, as the "rational" decision is usually sandwiched between two contradictory biases. For example, one may overestimate one's abilities because of the Dunning–Kruger effect, or underestimate them because of the false consensus effect. As a practical example, a person asked to estimate how far they can throw a flying disc must get the estimate exactly right on the spot; any deviation in either direction counts as biased judgment.