Gen Z hates boomers, everyone hates millennials, and no one thinks about Gen X. Why are we so obsessed with talking about generations?

Gen Z is soft, millennials are embarrassing, boomers are evil, and no one has thought about Gen X in years. Even if we can't remember exactly which ages define each cohort, many people can offer up these generational stereotypes on command. These supposedly profound differences have been used to explain shifting attitudes about certain colors, the rising popularity of spicy foods, and even the perceived onset of adulthood. But while generational framings are ubiquitous, just how real are these fault lines?

The Pew Research Center has spent decades conducting surveys and research about what each generation thinks, feels, and is doing. Its start and end dates for generations became the standard for news publications, academic research, and dinner-table arguments. But late this spring, Pew announced it would no longer use generational labels such as millennial and Gen Z in its research. In doing so, Pew quietly ended a tradition that had in recent years become a source of growing frustration (and heated debate) in social-science circles.

The problem, said Kim Parker, the center's director of social-trends research, is that what we call a generation covers too wide a time span to offer any useful insight. Generations span 15 to 18 years, Parker explained to me in an email, making it challenging to home in on a handful of attributes that realistically apply to the whole group. A 27-year-old is likely to experience today's lightning-fast social and technological changes differently than a 39-year-old, for instance, though both are considered millennials under Pew's definition. And it may be hard to generalize about a generation whose eldest members were already in the workforce when the 2008 recession hit but whose youngest were just leaving elementary school.

To account for this "great diversity of thought, experience and behavior within generations," Parker wrote in an essay on the decision, Pew will reframe generational research in the context of "age cohorts": groupings of people born in a particular time frame who might have experienced major societal events in meaningfully similar ways. "For example, it could be a group/cohort that came of age politically when Obama was president, or it could be young adults who were in college during the pandemic, or we could group people by birth decade," a Pew spokesperson said.

"The question isn't whether young adults today are different from middle-aged or older adults today," Parker wrote. "The question is whether young adults today are different from young adults at some specific point in the past."

Pew's announcement raises questions about the validity of the stream of generational content we've been served. Is there really a cohesive Gen Z? Does it make sense to compare millennials to boomers? Are 20-year-olds always just 20-year-olds? At its core, Pew's decision makes clear that generations — and the distinctions we draw between them — are simply made up.

But if generations are fake, why do we care so much about them?

Fake generations

You're probably familiar with each generation's unique flavor of malcontent. Millennials (born from 1981 to 1996, according to Pew) are lazy, self-obsessed, and slow to launch. Baby boomers (1946 to 1964) are entitled, selfish, and basically the root of all societal ills. Gen Zers (1997 to 2012) are tech-obsessed, psychologically fragile, and either too woke or not woke enough (looks like the jury's out on that one). And Gen Xers (1965 to 1980) — who cares? Yawn!

The idea of generations was born roughly a century ago. The sociologist Karl Mannheim developed the notion of discrete "generation units" in his 1928 essay "The Problem of Generations." When a group of people experience a historical or cultural event at a formative age, Mannheim argued, they develop a distinct consciousness that becomes a part of their shared identity. In a New Yorker essay published in 2021, Louis Menand linked this idea to the interwar explosion in high-school attendance in the US: In 1910, only 14% of Americans between 14 and 17 were in school, but by 1940 that had rocketed to 73%. Menand argued that the high-school boom gave rise to the "teenager" — a whole new social category and marketing demographic. 


While the idea of generations percolated for decades, today's intense obsession with age cohorts can be traced to the 1991 book "Generations." In a recent paper, the Skidmore College sociologists Andrew M. Lindner, Sophia Stelboum, and Azizul Hakim say the book's authors, William Strauss and Neil Howe, drew on "a long lineage of quasi-scientific romantic historical generational thinking" to help popularize today's generational terminology. The book even came up with the term "millennial."

"Since the release of Strauss and Howe's influential book, the generational labels of 'Baby Boomer,' 'Gen X,' 'Millennials,' and 'Gen Z' have appeared in dozens of trade paperbacks, thousands of newspaper headlines, and all over social media," Lindner and his colleagues said. "Each of these labels is associated with a package of supposed psychological traits, behavioral patterns, and political commitments typical of each respective generation (e.g., being narcissistic, parting one's hair in the middle, destroying the global economy)."

Through our generation-tinted cultural lens, the outfits worn by the character Portia on the second season of the HBO series "The White Lotus" are seen not just as the sartorial missteps of a judgment-challenged 20-something but as a comment on how an entire generation's sense of style was broken by social media. The consulting behemoth McKinsey's speculation on the future of work focuses not just on technological developments but on a workplace generation gap between Gen Zers and everyone else. Declining birth rates? Generational. Climate activism? Also generational. The list goes on.

But social scientists have long chafed at the idea of using generations to understand our changing culture, and there are numerous problems with overdeploying the generational framework. For one, there's only so much to be gleaned about a person from a haphazardly drawn, nearly two-decade window that happens to encompass their birth year. Generational discussions also tend to ignore critical variables such as race, education, and gender — as Pew researchers pointed out, generational stereotypes carry a distinctly upper-class bias. And they often amplify points of perceived difference instead of reflecting the similarities between groups. At one point or another, boomers, Gen Xers, millennials, and Gen Zers have all been branded (in so many words) as the self-absorbed sociopaths of their time.

A good chunk of the generational fascination comes down to people's interest in what the kids are up to. But even then, polls about Gen Z's attitudes typically leave out crucial context. "The issue there is that young adults will change as they get older," Parker told me. "So we can't really assess how their attitudes and behaviors are unique without the benefit of historical data." 

To see whether young people's attitudes toward work, for instance, are actually all that different from older workers' attitudes, researchers would need data on young people's views on work over time. But that kind of historical data is unfortunately lacking. So it's hard, if not impossible, to compare Gen Z's thoughts on the workplace with Gen X's thoughts on the workplace when they were the same age. And without that historical data, what you're really comparing is how 20-year-olds feel about work with how 50-year-olds feel about it.

'Kids these days'

Artificial or not, generational tensions have calcified into easy shorthand for advertisers, writers, and consultants. Popular reporting on generations tends to fill in information gaps with generalizations that flatten the various groups into catchphrases. But Philip N. Cohen, a sociologist at the University of Maryland, College Park, suggests this troublesome reflex can't necessarily be chalked up to cynicism or malice — at least, not entirely. It also comes from a real, compassionate human desire to connect with each other. Generation talk, he said, can help people scratch the itch of understanding, especially during periods of rapid social and technological change.

"Stereotyping is just very powerful, whether you love it or hate it," Cohen told me.

"When you're clicking on an article about generation concepts, it may be because you're irritated or cackling at the stereotype that's being portrayed in the headline, but it's also because you're trying to understand how the culture is changing. And I think there's a great impulse there."


Cohen has nonetheless been among the most vocal recent critics of generational labeling in social research — and specifically of Pew's role in perpetuating what he views as one of pop social science's greatest myths. In 2021 he published a Washington Post op-ed article and an open letter asking that the think tank "do the right thing" and "help put an end to the use of arbitrary and misleading 'generation' labels and names." More than 200 social scientists signed on.

The circulation of Cohen's letter coincided with the release of "The Generation Myth." Written by Bobby Duffy, the director of the Policy Institute at King's College London, the book argued that "generational thinking" muddied the factors that actually do shape people's views and behavior over time. Duffy grouped these into three categories: "period effects" (major, era-defining events that affect everyone, such as the COVID-19 pandemic), "life-cycle effects" (the typical milestones of an average person's life within a given society, such as getting married or having kids), and "cohort effects" (the overlapping experiences of people in the same age group). The trouble with generational thinking, according to Duffy, is that it homes in on cohort effects at the expense of other key mechanisms of social change.
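Duffy's three categories map onto what statisticians call the age-period-cohort (APC) problem. As a schematic (my gloss on the standard framework, not an equation from Duffy's book), the expected value of some outcome Y for a respondent of age a, surveyed in year p and born in year c, can be written as:

```latex
\[
  \mathbb{E}[Y] \;=\; \mu + f(a) + g(p) + h(c), \qquad c = p - a .
\]
```

Because birth year is, by definition, survey year minus age, the three terms are perfectly entangled: any pattern credited to a cohort can be re-described as some mix of age and period effects. That built-in ambiguity is one reason generational claims are so hard to test without strong outside assumptions.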

By and large, Cohen shares Duffy's view that generational labels make it tough for both experts and laypeople to distinguish between generational traits and universal, or multifactorial, occurrences. "If an event comes along and changes things for everybody — you know, like a war, a recession, a pandemic — those things are not generational, and the changes that follow aren't examples of generational change," he told me. "But because we're fixated on generational labels, we might assume they are. We might say, 'Oh, kids these days are different,' when really it's that people these days are different, and kids are kids."

Duffy, Cohen, and the signatories of Cohen's open letter believe that the crutch of generational labels does more than oversimplify the complexity of demographic diversity. By creating rigid boxes for research, the labels stifle the potential for scientific breakthroughs. They can also warp data, generating conclusions that fail to encapsulate the full picture.

To its credit, Pew has been transparent in acknowledging how the use of generational labels may have tilted its analyses. In one of the center's recent blog posts, researchers revisited a 2017 report asserting that millennials were less likely than prior generations of young adults to move residences within the next year. By running the dataset through a new statistical model that decoupled generation from age and period, the researchers arrived at a new conclusion: "The apparent differences between the generations are better explained by other factors in the model, not generation."
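Pew hasn't published the code behind that re-analysis, but a minimal sketch can show what "decoupling generation from age and period" means in practice. Everything below is hypothetical: synthetic data, illustrative variable names, and a plain regression rather than Pew's actual statistical model. Only the generational cut points come from Pew's published definitions.

```python
# Hypothetical sketch of an age-period-generation regression on synthetic
# survey data. NOT Pew's model; the variables and effect sizes are invented.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 5_000
year = rng.integers(2005, 2023, n)   # survey year (the "period")
age = rng.integers(18, 75, n)        # respondent age at time of survey
birth_year = year - age

def generation(by):
    # Pew's published cut points; everyone born before 1965 is lumped together
    if by <= 1964:
        return "boomer_or_older"
    if by <= 1980:
        return "gen_x"
    if by <= 1996:
        return "millennial"
    return "gen_z"

df = pd.DataFrame({"year": year, "age": age})
df["gen"] = [generation(by) for by in birth_year]

# Simulate a "likelihood of moving" outcome driven ONLY by age and period,
# with no true generational effect at all.
df["moved"] = (
    0.4 - 0.005 * df["age"] + 0.002 * (df["year"] - 2005)
    + rng.normal(0, 0.1, n)
)

# Naive model: generation alone appears to "explain" the outcome.
naive = smf.ols("moved ~ C(gen)", data=df).fit()

# Decoupled model: once age and survey year are controlled for, the
# generation coefficients shrink toward zero.
full = smf.ols("moved ~ age + C(year) + C(gen)", data=df).fit()

print(naive.params.filter(like="gen"))
print(full.params.filter(like="gen"))
```

Because the simulated outcome is driven entirely by age and period, the naive model's sizable "generational" gaps evaporate once age and survey year are controlled for, the same pattern Pew described when it found that apparent differences between generations were better explained by other factors.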

We want to understand each other

How can categories that are so arbitrary and often unscientific also be so present in our lives? There's a simple answer: Despite their glass-half-empty assessments, generational boxes seem to resonate with people. Lindner, Stelboum, and Hakim said in their paper that "after decades of exposure" to "heavily marketed" generational labels, Americans generally identified with the categories they'd been slotted into. This particularly applied to people born in the center of their generational cohorts; a millennial born between 1986 and 1990 would feel more "millennial" than their peers born in the five years before or after that window. 


In some cases, generational labels can offer something useful.

Pew "does believe generational research can be a useful tool in the right context," Parker told me. "It helps capture societal change in a way that the public can understand and identify with. Also, outside of the commonly used definitions of generations, people can understand at a very basic level what generational change is — my generation is different from my parents' and my grandparents'. And my young-adult children are experiencing the world in a different way than I did."

Parker and her colleagues note, however, that young adults have always faced different societal circumstances than their parents did at the same age. Pew's president, Michael Dimock, pointed out in a blog post that elders have always tended to "express some degree of concern or alarm" when younger people depart from the norms of the elders' own coming-of-age. There's a reason the "kids these days" trope is, well, a trope. Whatever generational labels the future brings, it seems fated that the senior citizens of 2123 will fixate on the supposed character deficiencies of their younger counterparts, their work ethic, perhaps, or their self-obsession, while the youth bemoan the problems their elders' mistakes have left them. The more things change, the more they stay the same.


Kelli María Korducki is a journalist covering work, tech, and culture. She's based in New York City.
