Published on Psychology Today
When my son Edwin was a baby, his favorite thing to do was to bounce a ball. We didn’t have many ball toys in the house, but we did happen to have a lot of balls for my dog, which meant that for months, Edwin almost exclusively played with dog toys. For long stretches of time, he would sit and bounce a ball over and over again until it rolled away, then he’d smile to himself and do it again. He was so obsessed with balls that anytime we gave him anything remotely round, he would throw it onto the ground to see if it would bounce. This might seem like a simple behavior, but what he was doing was actually quite amazing: He was exploring whether new objects fit into his category for “ball,” which he defined as round objects that bounce. He ended up breaking a lot of toys this way, but it was a pretty smart heuristic, as balls do indeed tend to be round, and they do tend to bounce when you throw them.
Perhaps you’ve never given much thought to how we form categories, as it’s just a natural and automatic part of being human. But the fact that humans have the capacity to categorize objects is quite remarkable, and it is one of the abilities that allow us to store an incredible amount of information in our brains. Having categories for things like “apple” and “dog” and “ball” means that we can group things that are alike, which helps us easily store new information and make quick inferences about things we’ve never seen before. Imagine for a second that we didn’t have the ability to form categories; every time we encountered a brand-new object, we’d have to learn about it from scratch. Because we have categories, whenever we encounter a new apple, we don’t have to learn about it; we can just apply what we know about other apples and safely assume that this new apple is a piece of fruit that tastes sweet and is nutritious. Likewise, whenever we encounter a new dog, we know something about the way it likely behaves, what it eats, and whether it’s safe to approach.
As an essential part of human cognition, categorization begins to develop in the first few months of life. For example, by 3 to 4 months of age, infants can form distinct categories for dogs and cats. How do we know? Researchers interested in early categorization abilities typically present infants with the same kind of thing over and over again, and then see how the infants react when shown something new. For example, one group of researchers showed young infants a photo of a cat, then a photo of a different cat, then a different cat after that, and so on until the infants lost interest and stopped paying attention to the photos. Once the infants were sufficiently bored with cat pictures, the researchers presented them with a new photo of a cat and a photo of a dog. If the infants stayed bored and looked away from both photos, the researchers could deduce that the infants couldn’t distinguish between the dog and the cat, and thus treated them both as if they belonged to the same boring category. However, while the infants in this study did look away when they saw the new cat photo, they were suddenly interested again when they saw the dog, suggesting that they recognized that the dog was from a new category. This demonstrates that even 3- and 4-month-olds can form very basic categories for animals (Quinn et al., 1993).
It’s a good thing that infants can do this so quickly, because again, the ability to form categories is a powerful way for us to store information and make predictions about new things. However, categorization, despite its benefits, also has some downsides. Namely, our propensity to store information in categories can extend beyond the categorization of objects to the categorization of people. This might sound useful in some cases; for example, knowing that “3-year-olds” generally consist of tiny humans prone to throwing insane temper tantrums might help you steer clear of anyone who fits that cute but high-maintenance category. However, these kinds of categories, called social categories, can also have negative consequences. Indeed, if you were to form categories for groups of people, you could easily make a false assumption about someone new purely based on their category membership. In fact, this behavior is the very definition of “prejudice,” which is the tendency to make assumptions about someone purely based on their membership in a particular group. On top of that, when people place themselves in a group, or what we might call an “ingroup,” there will inevitably be others who are left out of that group—people we call “outgroup” members.
Research does suggest that even infants can make distinctions between ingroup and outgroup members early in life, purely because they prefer things that are most familiar to them. For example, although newborns don’t show any preferences for faces based on race, 3-month-old infants prefer images of adults of their own race over images of adults of other races (Kelly et al., 2005). Similarly, 5- and 6-month-old infants prefer to look at people who speak their native language, and 10-month-old infants are even more willing to take toys from people who speak their native language than from people who speak a foreign language (Kinzler, Dupoux, & Spelke, 2007). These preferences could form the basis for later category formation, which often relies on grouping members based on their features. Furthermore, research shows that young children expect people in the same group—usually marked in research by some arbitrary shared feature, like a shirt color—to like the same things and behave in similar ways (e.g., Liberman, Woodward, & Kinzler, 2017), suggesting that features like skin color, native language, or even shirt color can form the basis for social category membership.
This doesn’t mean that all infants will grow into children and adults who categorize people based on their race, but it does mean that from an early age, infants start to treat people with whom they rarely have contact as different. Regular exposure to people of other races, however, can counteract these effects. For example, children who live in neighborhoods where they are often exposed to people of other races are better at differentiating between faces of people from other races than children who don’t have the same exposure (Bar-Haim et al., 2006). The same is true for infants who have exposure to people who speak different languages and have different types of accents.
The take-home message is that forming categories is essential to our everyday lives. But the same tendency that allows us to sort objects into boxes can also lead us to put people into boxes, which has some obvious downsides. To counter our natural urge to categorize, it’s important to remember that while categorizing objects is incredibly useful, it is often more useful to think of people as individuals, even if they do belong to religious, social, or cultural groups. And exposing infants to lots of different kinds of individuals may help them see people as just that.
Bar-Haim, Y., Ziv, T., Lamy, D., & Hodes, R. M. (2006). Nature and nurture in own-race face processing. Psychological Science, 17(2), 159-163.
Kelly, D. J., Quinn, P. C., Slater, A. M., Lee, K., Ge, L., & Pascalis, O. (2007). The other-race effect develops during infancy: Evidence of perceptual narrowing. Psychological Science, 18(12), 1084-1089.
Kinzler, K. D., Dupoux, E., & Spelke, E. S. (2007). The native language of social cognition. Proceedings of the National Academy of Sciences, 104(30), 12577-12580.
Liberman, Z., Woodward, A. L., & Kinzler, K. D. (2017). The origins of social categorization. Trends in Cognitive Sciences, 21(7), 556-568.
Quinn, P. C., Eimas, P. D., & Rosenkrantz, S. L. (1993). Evidence for representations of perceptually similar natural categories by 3-month-old and 4-month-old infants. Perception, 22(4), 463-475.