Intelligence: IQ, EQ, and Other Academic Metrics May Not Match Their Public Usage

A theory that can be explained simply, understood easily, and applied by anyone who reads it is also easily abused and pulled away from its original purpose. So it is with the intelligence test.

Monster Box
12 min read · Dec 3, 2021

On a free day, you are watching a TV episode in which a kid's parents make him take an IQ test to find out why he gets bad grades so often at school. You see the parents' worry and suspense while their child sits the test, then their almost ridiculous joy when the result places their child near the top of the participating kids. You laugh, but you also wonder how smart you are yourself. A few basic Google searches turn up a reputable-looking IQ test site with a 20-page test. After an hour of wrestling with pictures and numbers, you complete it.

You press send…

“You need to pay $1 to receive the detailed result.”

You transfer the money. You can't let the time and energy you spent go to waste.

“Congratulations. Your IQ is 115. You are in the top 15% of the most intelligent people.”

That’s right. You are satisfied with the result. But is the truth that simple?

1. A brief history of the concept of intelligence.

Although intelligence may also exist in other life forms, such as some plants and non-human animals, or in artificial form in computers, it is most often discussed and studied in humans. The "intelligence" in this article refers to human intelligence.

Despite being one of the most popular research topics in philosophy and psychology, and more recently in modern biology and neuroscience, "intelligence" has never had a universally accepted definition, and its constituent parts remain contested.

Broadly, intelligence relates to human intellectual activity: the ability to learn from experience, adapt to new situations, understand and handle abstract concepts, and use knowledge to manipulate one's environment [1]. Modern studies describe intelligence in many ways, including logical thinking, understanding, self-awareness, learning, emotional knowledge, reasoning, planning, creativity, critical thinking, and problem-solving.

Tracing the history of research on human intelligence, mainstream accounts usually begin with the work of Sir Francis Galton (Charles Darwin's cousin) in the late 1800s, recognized as the first scientific study of intelligence. Galton was interested in gifted individuals and argued that intelligence is a general intellectual capability and, at the same time, a product of biological evolution. To test his hypothesis, he built an anthropometric laboratory that measured visitors' reaction times on psychophysiological tests, along with other physical characteristics [2].

In one of his experiments, Galton found that people's ability to hear high notes declines sharply with age, and that humans are inferior to cats at perceiving high-pitched tones [2]. This posed a problem for any psychophysiological theory of intellect built on evolutionary continuity, since it had previously been assumed that wisdom simply accumulates and advances over time. Galton's experiments showed that intelligence does not develop linearly or increase with age, and that on at least some measures of "response rate" cats outperform humans. The findings helped Galton see the need to formulate a general theory and test it against empirical data rather than mere speculation, paving the way for more and more theories (and controversies) in the field.

Interest in human intelligence is actually far older than that. And the prevailing modern theories of intelligence are sometimes similar to, or borrowed from, much older ideas, whether or not their authors perceive (or admit) it.

Ideas of "intelligence", or similar concepts describing the human intellect, have existed since ancient times. Plato, typically, imagined that in everyone's mind there exists a block of wax whose properties (size, moisture, hardness, purity) differ from person to person. People with soft wax learn well but forget quickly; people with hard wax, conversely, learn more slowly but remember longer. Later scientists no longer treated the brain as a waxy mass, replacing it with modern biological concepts such as neurons, synapses, and connections, but they still acknowledged Plato's idea as an early metaphor for the human mind [3].

Or consider Aristotle's view of the importance of logical reasoning to intelligence, introduced in the fourth century BC and confirmed by studies in the twentieth century. Closer to our time, Thomas Hobbes distinguished "natural" from "learned" intelligence in his 1651 Leviathan, foreshadowing later scholars in noting that emotion and motivation can supplement or impede the use of intelligence. And John Locke, the famous seventeenth-century empiricist, once noted a connection between mental speed and intelligence, an idea that Kahneman and other cognitive psychologists revived in the second half of the twentieth century under the label "fast and slow thinking" [3].

In general, the thinking of a number of pre-nineteenth-century philosophers provided background and foundations for modern ideas, though with many limitations. The pre-Galton conception, for example, indirectly cast intelligence as something formed by age, alongside many other notions tied to the social order of the time. Galton's experiments showed this to be false, at least by indicating that the response speed of the senses tends to decrease with age. But Galton's own conception was also somewhat misleading: if the senses played the central role in intelligence, then near-sighted people would tend to be less intelligent than people with healthy eyesight.

Among modern studies after Galton, the most prominent are two opposing schools of intelligence familiar to every social science student:

(1) The view of Charles Spearman, an English psychologist, with his Two-Factor theory [4a, b]. The first and more important factor, which he called the "general factor", or "g", represents an individual's general intelligence and affects performance on all intelligence-demanding activities. The second factor relates to an individual's specific ability in a particular field. When a person takes a math test, for example, performance draws on a factor common to all math problems (g) and a specific factor for the particular kind of math involved. What the "g" factor actually was, however, Spearman did not know. He considered it a form of "mental energy": each person has a fixed amount that can be allocated at different times to different intellectual activities, and differences in general intelligence largely come down to differences in it.

(2) Challenging Spearman's view, the American psychologist Louis L. Thurstone proposed describing intelligence through a Primary Mental Abilities model of seven distinct factors (verbal comprehension, word fluency, number facility, memory, perceptual speed, reasoning, and spatial visualization) instead of just two overly general ones [5]. Thurstone's work was followed by the idea of many different types of intelligence, or Multiple Intelligences, from another American psychologist, Howard Gardner [6]. Gardner holds that there is no single intelligence; rather, there exist many types of intelligence that are independent, distinct, and representative of particular skills. Most activities, he argues, involve a combination of two or more forms of intelligence. The most prominent popular theories today, such as "Emotional Intelligence" and "Social Intelligence", derive from Gardner's ideas of intrapersonal and interpersonal intelligence.

By the end of the twentieth century, most proposed theories were built on Spearman's and Thurstone's early ideas about intelligence. Later studies note, however, that the two theories complement rather than oppose each other. Intelligence remains a gray area of research: there are many theories to explain it, but none persuasive enough to rule out its rivals.

Still, even without a single universal definition, researchers can build mental measurements on the intelligence models they have developed. Their theories may not yet match "reality", but each comes with clear experimental methods grounded in its core ideas.

2. Quantifying intelligence.

Purely speculative ideas become meaningless if they cannot be tested experimentally, especially in the age of science.

The first to lay foundations for the scientific measurement of intelligence were Alfred Binet and his collaborators Victor Henri and Théodore Simon, who assisted the French Ministry of Education in developing tests to identify children with cognitive difficulties who therefore needed more attention in school [7]. Binet and Simon completed a set of 30 questions focusing on what Binet considered the main characteristics of intelligence, such as the ability to read, make inferences, and solve problems [8]. Binet also coined the concept of an individual's mental age: the level of intellectual performance of an individual relative to the average performance at that chronological age. The Binet-Simon scale was later revised by the Stanford University psychologist Lewis Terman and published as the "Stanford-Binet Scale" [9]. This test uses a single metric, the "intelligence quotient" (IQ), to represent an individual's intelligence score.

In other words, the IQ measurement system was originally designed to find the reasons behind some individuals' lack of cognitive ability, not to certify one person's cognitive superiority over another.
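The original Stanford-Binet scoring made the "quotient" in IQ literal: following William Stern's mental quotient, it divided mental age by chronological age and multiplied by 100. A minimal sketch of that classic ratio formula (modern tests have since replaced it with deviation scoring):

```python
def ratio_iq(mental_age: float, chronological_age: float) -> float:
    """Classic ratio IQ from the early Stanford-Binet:
    IQ = (mental age / chronological age) * 100."""
    return mental_age / chronological_age * 100

# A 10-year-old performing like an average 12-year-old:
print(ratio_iq(12, 10))   # 120.0
# A child performing exactly at age level scores 100 by definition:
print(ratio_iq(8, 8))     # 100.0
```

The ratio form breaks down for adults, since "mental age" stops growing while chronological age does not, which is one reason later tests moved to deviation scoring.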

As with the problem of definition, the proposed tools for measuring intelligence have attracted much controversy. After Terman published the Stanford-Binet test, the American psychologist David Wechsler, dissatisfied with its limitations, developed a new set of questions. Like Thurstone and Gardner, Wechsler believed that intelligence involves many different intellectual abilities and thought the Stanford-Binet scale over-weighted the idea of a single common intelligence. He therefore created the Wechsler Intelligence Scale for Children (WISC) and the Wechsler Adult Intelligence Scale (WAIS), two individual intelligence tests that, alongside the Stanford-Binet scale, remain among the oldest still in common use today [10a, b].

It is vital to distinguish between intelligence and models of intelligence. The theories of Spearman, Thurstone, Gardner, and others are all models that attempt to explain the causal relationship between influencing factors and a person's intellectual capacity. IQ, by contrast, is simply a scale for one aspect of intelligence. Although measurement methods are routinely standardized and have reference value, there is so far no method that quantifies intelligence in its true nature, if such a nature even exists.
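That standardization is what gives a modern IQ score its meaning: "deviation IQ" tests scale raw scores onto a normal distribution with mean 100 and standard deviation 15 (a common convention; some tests use 16). Under that convention, the "IQ 115, top 15%" claim from the opening anecdote can be checked directly; the exact figure is closer to 16%:

```python
from statistics import NormalDist

# Deviation IQ: scores scaled to mean 100, standard deviation 15.
iq = NormalDist(mu=100, sigma=15)

score = 115                        # one standard deviation above the mean
below = iq.cdf(score)              # fraction of the population at or below 115
print(f"percentile: {below:.1%}")  # percentile: 84.1%
print(f"top: {1 - below:.1%}")     # top: 15.9%
```

The percentile is a property of the scaling convention, not of the test taker: the score is defined by where a raw result falls in the reference population's distribution.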

3. Still a long way from wide adoption in society.

A theory that can be explained simply, understood easily, and applied by anyone who reads it is also easily abused and pulled away from its original purpose. So it is with the intelligence test. Early intelligence models served specific research purposes, were relatively specialized, and were not intended to measure society at large.

When Galton came up with the idea of "eugenics", rooted in biological determinism and the belief that differences in mental ability are primarily genetic and that humanity could improve its stock by selectively breeding for "good" genes, he probably did not foresee that people would do the opposite: eliminate those believed to carry "bad" genes. The Binet-Simon scale, created to help children with poor intellectual ability, became the foundation for tests serving "eugenics*" and for tools to evaluate and classify military recruits [11]. After World War I, military psychologists widely disseminated intelligence testing in the community, which helped psychology develop but also strongly promoted the eugenics movement in the United States, active from the late nineteenth into the early twentieth century. The eugenicists of this period believed that poor intellectual capacity was inherited and should be prevented through surgical sterilization or outright isolation. And they acted on it, leaving a great stigma in history [12]. In addition, the racial reading of intelligence highlighted by Charles Murray and Richard Herrnstein in "The Bell Curve" [13], which claims IQ is the best predictor of socioeconomic class, academic achievement, and the likelihood of committing crimes over a lifetime, has supplied further "material" for extremist ideas in society.

*Eugenics: a set of beliefs and practices that aims to improve the genetic quality of a population by excluding those judged inferior and encouraging those judged superior [14].

Obviously, extreme theories such as eugenics and IQ-based racism survived only a short time under constant criticism and rebuttal from the academic community, and were eventually proven wrong. But before fading into obscurity, the eugenics movement brought disaster to hundreds of thousands of people through misguided sterilization policies.

Incomplete scientific theories, besides carrying risks when widely applied, also create gray areas where pseudoscience grows and non-scientific factors get exaggerated. Emotional intelligence is one example. The term first appeared in scientific texts in the 1960s and was later given a formal academic treatment by Salovey and Mayer [15], but it became popular only after Daniel Goleman published "Emotional Intelligence" in 1995, bringing the concept to the public [16]. Since his book, an industry worth millions of dollars has emerged. The concept's influence is now so widespread that Harvard Business School has published a collection of best-selling articles on emotional intelligence, corporations run numerous trainings in emotional intelligence and entrepreneurship, and hundreds of commercial books have been published on the subject [17]. Goleman popularized emotional intelligence with the claim that "it could be more important than IQ" [18], but studies comparing IQ and EQ have shown the opposite [19].

In short, across thousands of years of history, from the first ideas of ancient philosophers to complex neurological studies, humans have never fully decoded the origin and nature of our species' intelligence. Today's models of intelligence are still fragments of a larger picture, and no one is sure whether the current structure is the final, accurate one. The field is still developing, testing, and applying its models on a narrow scale; it has never asserted their superiority for broad application to the masses, and no one dares guarantee that current knowledge of intelligence can radically solve real-world problems.

Perhaps Homo sapiens still has a long way to go toward understanding the nature of our species' intelligence, thanks to … intelligence. But we need to take small, careful steps, starting by knowing the limits of available knowledge (up to the present moment), and not straying into areas beyond the boundaries of science, areas for which no one takes responsibility.

The greatest danger in worshiping intelligence is the tendency to despise the "lack of intelligence". With narrow, incomplete definitions receiving so much public attention, this is indeed a nightmare. How will we judge our children, or ourselves, if the IQ results that come back are not what we expected?

Accepting and denying the result are equally lousy choices. Leave the work of measuring, and of applying the measurements, to the professionals; they know what to do. A measure has never been a tool suitable for everyone, still less a tool suitable for every purpose.

___________

References:

[1] Encyclopædia Britannica, “Human Intelligence | Definition, Types, Test, Theories, & Facts.”

‌‌[2] Galton F., “Inquiries into human faculty and its development,” MacMillan Co., 1883.

[3] Adler, M. J. (Ed.), “Great books of the Western world”, 54 vols., Chicago: Encyclopedia Britannica, 1987.

[4a] Spearman, C., “‘General Intelligence,’ Objectively Determined and Measured,” American Journal of Psychology, 1904.

[4b] Spearman, C., "The Abilities of Man," London: Macmillan, 1927.

[5] Thurstone L. L., “Primary Mental Abilities. In: The Measurement of Intelligence,” Springer, Dordrecht, 1973.

[6] Gardner. H., “Frames of Mind: The Theory of Multiple Intelligences,” New York: Basic., 2011.

‌[7] Binet, A., & Simon, T., “The development of intelligence in children (The Binet-Simon Scale),” Williams & Wilkins Co., 1916.

[8] Binet, A., & Simon, T., “The intelligence of the feeble-minded,” Williams & Wilkins Co., 1916.

[9] Terman, L. M., & Merrill, M. A., “Stanford-Binet intelligence scale: Manual for the third revision,” Boston: Houghton Mifflin, 1973.

[10a] Wechsler, D., “The measurement of adult intelligence. Baltimore,” MD: Williams & Wilkins, 1939.

[10b] Wechsler, D., “Wechsler intelligence scale for children,” New York: The Psychological Corporation, 1949.

[11] D. J. Kevles, “Testing the Army’s Intelligence: Psychologists and the Military in World War I,” The Journal of American History, vol. 55, no. 3, p. 565, Dec. 1968, doi: 10.2307/1891014.

‌[12] Edwin Black, “The Horrifying American Roots of Nazi Eugenics,” History News Network, 2003.

[13] R. J. Herrnstein and C. Murray, "The Bell Curve: Intelligence and Class Structure in American Life," New York: Free Press, 1994.

‌[14] “Eugenics | Description, History, & Modern Eugenics,” Encyclopædia Britannica.

[15] Salovey, P., & Mayer, J. D., “Emotional intelligence,” Imagination, Cognition and Personality, 9(3), 185–211, 1990.

[16] Goleman, D., “Emotional intelligence: Why it can matter more than IQ,” New York: Bantam Books, 1995.

[17] Goleman, D., Boyatzis, R., & McKee, A., “HBR’s 10 must reads on emotional intelligence,” Cambridge, MA: Harvard Business Review Press, 2015.

[18] Goleman, D., “Introduction. In Emotional Intelligence: Why it Can Matter More than IQ,” New York: Bantam Books, 2005.

[19] Joseph, D. L., & Newman, D. A., “Emotional intelligence: An integrative meta-analysis and cascading model,” Journal of Applied Psychology, 95(1), 54–78, 2010.

