Schools have historically used IQ scores to place students in programs or to identify students with intellectual difficulties. However, IQ scores and tests are no longer the only measure of a person’s cognitive ability or potential.

Today, IQ scores are somewhat controversial, as cultural and environmental factors may also influence how well a person performs on a test. That said, IQ tests remain one tool for assessing a person’s intelligence.

Keep reading to find out the average IQ in the United States and other countries around the world.


When psychologists first developed the current IQ test, they set the average score of the norming scale at 100. An individual’s score is then expressed in terms of standard deviations above or below that average, which means that most scores fall close to 100.
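
As a rough sketch of how this works, and assuming a standard deviation of 15 points, as used on the Wechsler scales, the conversion from a test result to a modern “deviation IQ” can be written as:

\[
\text{IQ} = 100 + 15z, \qquad z = \frac{x - \mu}{\sigma}
\]

where \(x\) is the person’s raw score and \(\mu\) and \(\sigma\) are the mean and standard deviation of the norming sample. For example, someone scoring one standard deviation above the mean (\(z = 1\)) would receive an IQ of \(100 + 15 \times 1 = 115\).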

In 2010, two researchers published a report on the average IQ in 108 countries and provinces. According to the report, the U.S., countries in Europe, and countries in East Asia had averages within the expected range. However, African countries consistently scored around or below 70.

Other researchers have since discredited the notion that people in African countries have a lower average IQ, suggesting that there were flaws in the way the original team sampled the populations for the 2010 report.

The IQ test dates back to the late 1800s. The first test designed to measure intelligence looked at how quickly a person responded to stimuli. However, researchers largely abandoned this method once it became clear that reaction speed did not accurately predict a person’s intelligence.

Alfred Binet created the first modern intelligence test in 1905. He developed it to determine whether a child would be able to keep up with their peers in the educational system, using age as the benchmark.

He created a test that arranged questions based on the average ability of children of different ages. In this way, the test could show how a child performed compared with other children of a similar age.

For example, if a child could answer questions intended for children 2 years older, that child would test as being 2 years ahead in “mental age.” Binet would then subtract the child’s real age from that “mental age” to give an intelligence score.

Though Binet’s model was an improvement in determining intelligence, it had some flaws.

William Stern proposed a different model: the intelligence quotient (IQ). Instead of subtracting one age from the other, Stern proposed dividing a person’s mental age by their actual age. The formula he proposed was (mental age) / (chronological age).
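
As a quick worked example, and assuming the later convention of multiplying Stern’s ratio by 100 to avoid fractions, a child with a mental age of 10 and a chronological age of 8 would score:

\[
\text{IQ} = \frac{\text{mental age}}{\text{chronological age}} \times 100 = \frac{10}{8} \times 100 = 125
\]

A child performing exactly at their age level would score 100, which is why 100 became the reference point for an average result.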

Still, Stern geared his version of the IQ test toward children. Because mental age levels off in adulthood while chronological age keeps increasing, the quotient did not work for adults.

Eventually, David Wechsler solved this issue by comparing test scores with those of a person’s peers and normalizing the average score to 100.

As a result, the quotient is no longer a quotient at all. Instead, it is a comparison of how a person performs relative to their peers.

The U.S. military adapted this approach into a multiple choice test, which it went on to use widely. Over time, educational and work settings also started to use IQ tests to help determine a person’s intellectual strengths.


Many other intelligence tests are in use today. Some of the most popular tests for measuring intelligence include the:

  • Stanford-Binet Intelligence Scale
  • Wechsler Intelligence Scale for Children
  • Differential Ability Scales
  • Wechsler Intelligence Scale for Adults
  • Peabody Individual Achievement Test

Licensed psychologists administer these tests.

There are also a number of commercial intelligence tests that both companies and individuals can purchase. These can help assess how quickly a person can pick up certain tasks and the ways in which they think.

In workplace settings, employers can use these types of tests to help match people to roles that fit their natural abilities and skill sets.

An IQ score may provide part of the answer when it comes to a person’s intellectual capabilities, but it is not a perfect system. It does not show a person’s full range of intellect. For example, it does not account for their creativity or social intelligence.

In addition, IQ can vary greatly by country and region. Factors that can affect IQ scores include:

  • access to education
  • rates of infectious disease
  • nutrition
  • cultural norms

In fact, one study found that the prevalence of infectious disease is one of the most important factors in predicting IQ scores. The researchers suggested that children who experience infectious illness use their energy to fight the illness rather than to support brain development.

A similar study found that, even within the U.S., people living in states with higher rates of childhood illness had lower average IQ scores than those in other states.

Focusing solely on IQ scores as a measure of intelligence is neither fair nor accurate. A person’s true ability to succeed at school, work, and in other aspects of life has to do with a huge range of factors, not just their IQ score.

The average IQ score in the U.S. is around 100.

However, while IQ scores may provide some insight into a person’s overall intellectual capability, people should avoid placing too much emphasis on the results of these tests.

Cultural factors, nutrition, access to education, and illness can all play a role in how well a person will score on an IQ test.