Are you eyebrow or blush blind? TikTok users seem to think so: people have been taking to the social media platform to bemoan makeup wearers’ supposed inability to notice when they’re blindly following makeup trends rather than wearing what looks good on their faces.
Though “eyebrow blindness” and other types of “beauty blindness” aren’t exactly destined to appear in psychological diagnostic manuals any time soon, people’s tendency to follow trends—even when they look silly or take risks doing so—is definitely a real phenomenon. It turns out that we’re hardwired to signal our affiliations with others (in this case, influencers with in-the-moment brows)—and that the desire to signal those affiliations can override our best intentions (or our secret knowledge that a new look may not be right for us).
But makeup girlies and Sephora kids aren’t the only ones susceptible to using trends in a bid to fit in or stand out—in fact, there’s hot debate about whether the phenomenon gets particularly fierce during mid-life. So what does psychology say about trend following—and is there a way to break free from the desire to embrace the newest fad?
An evolutionary thirst for social connection
First, some reassurance: Following trends isn’t a sign of character weakness or mental health trouble. Instead, says psychologist Pamela B. Rutledge, who specializes in the psychological science behind media and technology, it’s perfectly normal. The reason? “Social connection,” a psychological concept that refers to humans’ core need to belong to a social group and connect with others.
“Social connection is a really primary motivation,” says Rutledge. “In fact, it was necessary for our survival back in the day; we are very much driven to be tribal.”
Social connection is so important that it’s considered one of humans’ basic needs. Multiple psychological frameworks argue that human connections are as essential as access to food and shelter. These include Abraham Maslow’s canonical hierarchy of needs, first proposed in the 1940s, and the popular theory of self-determination introduced in the 1980s.
That’s because humans evolved to rely on one another. Evidence of group efforts to survive goes back almost as far as humans themselves. For example, archaeologists have uncovered evidence that early hominins worked together to transport stone tools over long distances about two million years ago in what is now Kenya—collaboration that allowed them to survive in a harsh environment.
And though following a trend on social media isn’t exactly as important as outrunning a saber-toothed tiger, human evolution has primed our brains to attune themselves to social signals.
How the social brain signals your identity to others
Social cognition—the processes that dictate what we notice and how we respond to others—largely takes place in the parts of the brain that oversee sight, pattern recognition, decision-making, empathy, and similar functions. These include the amygdala (which detects danger and difference) and the prefrontal cortex (involved in higher executive function and decision-making).
Thanks to these complex cognitive systems, our brains are really good at identifying patterns and prompting behavior that communicates our social status. As a result, says Rutledge, “You can use almost anything to signal your membership or affiliation” with a social group.
Giving off the right social signals lets others know which group you belong to or want to be affiliated with—and those signals are monitored and interpreted by others. Known as identity signaling, these behaviors can range from putting a political bumper sticker on a car to selecting a brand of clothing.
These signals aren’t just for others: On a group level, identity signaling helps drive the broader culture. Groups adopt identifiers and symbols that let people feel as if they belong—or make it clear they want to stand out. That helps explain why micro-trends like “Instagram brows” or blush placement are so compelling.
And psychologists have documented that when trends hit the mainstream, many early adopters move on to new markers in an effort to signal that they’re on the cutting edge of culture—a social group in itself.
Has social media changed how humans signal their social status to one another? Not exactly, says Rutledge. Rather, “It has certainly allowed certain trends to spread faster and farther than they would have otherwise.” Take fashion: While trends once trickled down from haute couture runways into everyday use over the course of years, social media now allows micro-trends to emerge and die out within weeks.
But why do trends themselves elicit such a pull? Blame evolution again, Rutledge suggests. “Our brains are hardwired to notice things that are weird,” she says. “If it’s not normal, we have to check it out.” Thus, we notice outliers and attention-grabbers, giving trends even more cachet for those bold enough to follow them.
Everyone is susceptible
No one is exempt from following fads, noticing trends, or signaling their perceived or real belonging to social groups, says Rutledge. But one group in particular is especially prone to following trends—even risky ones: tweens, teens, and young adults.
As children start down the road toward adult independence, they look for ways to express their individuality. Ironically, this can produce sometimes-desperate attempts to prove affiliation with socially credible in-groups—and can stoke a desire to stand out with the help of viral trends.
“You have to figure out how to make your way in the world,” says Rutledge. “And in order to do that, you have to figure out who you are.” Since we’re social creatures, that development takes place in a social environment, Rutledge says—and a growing awareness of social signals means a growing interest in following (or abandoning) trends. Popularity “doesn’t really matter, even though it feels like it does,” says Rutledge. “But from a biological perspective, it matters for getting a mate.”
Accordingly, the teenage brain undergoes rapid development in the very areas essential to social cognition. Studies suggest that the ability to recognize and read faces peaks during adolescence, contributing to a hyperawareness of one’s peers. Meanwhile, the prefrontal cortex—the region associated with logic and decision-making—is the last brain area to fully develop, making it little wonder that teens can turn to eating Tide pods or inhaling cinnamon to impress their friends.
By the same token, says Rutledge, older adults tend to feel more secure in their identity, a sense that may protect them from being as susceptible to every passing fad. Indeed, research shows that social attention varies by age, with older adults paying less attention to social cues than their younger counterparts.
But even adults can fall victim to the desire to signal social difference. Consider the midlife crisis: Though the paradigm is still hotly debated, some research has shown that individuals may be more sensitive to social reward—positive input from members of their social circles—during midlife.
Nor is social media the only place to find virality and socially relevant trends: Rutledge cites everything from flags to shoulder pads to tattoos and cars as social signals.
So the next time you put on makeup—or get dressed, order a meal in public, decide on a bumper sticker, or purchase a vehicle—consider that your decision may not be as personal as you think. “It’s really just an ingrained response,” says Rutledge—one that can be explained by our need to fit in with (or stand out from) the crowd.