It All Makes Sense Now: New Brain Research Says Adolescence Ends at 32

Now I have a scientific excuse for aimlessness and underachievement

Science continues to swagger through time and space, accelerating as it goes. There’s no doubt about scientific progress, and there’s little doubt it will continue. The result is that our understanding of the world, and our places within it, remains uneasily contingent. As scientific advances bring new insights, we are compelled to adapt our thinking and to accommodate what we learn.

That’s the theory, anyway. Practice in what we call the real world tells us our endeavor to adapt remains an industrious work in progress. Some days are better than others.

Age of the Dolt

New studies counterintuitively inform us that, even amid the scientific advances, we’re getting intellectually dimmer. Perhaps that’s because, as Marshall McLuhan and others suggested, the medium is the message. The media, as we currently find it, is akin to regurgitated junk food, packed with stupefying saturated fats. Our new media diet is a long way from the Mediterranean.

Some contend that the age of the dolt can be blamed overwhelmingly on a degeneration (or “enshittification”) of social-media platforms, further aggravated by AI-slop excretions. The result is the fatuity of a babbling commons. The jury is still out, perhaps engrossed in TikTok videos. A conclusive verdict might never be reached.

Getting back to the steady advance of the scientific plow, we observe that it generates considerable dissonance in its wake. As science exposes ignorance, long-held beliefs are often found wanting. The science of two centuries ago, even a century ago, yielded what was then state-of-the-art knowledge, but subsequent breakthroughs resulted in unsettling reassessments and revaluations. Things we believed during earlier eras lose their scientific purchase and much of their quotidian utility. We think we’ve come a long way — and we have, relatively speedily at that — but, if we look ahead, posterity might regard us as gawping primitives, much as we perceive our benighted forebears.

Whenever I read the latest news from the world of scientific research, I try to maintain my intellectual equanimity. Yes, presuming the methodology and execution are sound, the latest discovery is important, indisputably valuable, and a step in the right direction, namely forward. At the same time, however, everything is a work in progress, and the last word might never be declaimed. (If we ever arrive at a last word, will anybody be able to record it, much less remember it?)

The story of human achievement, with science opening ever-fresh vistas, is a long one. As actors, we play our small parts, then leave the stage. A hundred years from now, this snapshot in time is apt to look a lot less vibrant than it feels to those of us living it now. The narrative will continue to unfold long after we’re gone, and our accomplishments will be surpassed and relegated to the historical record (or not).

Brain Research to the Rescue

Still, even accounting for those caveats, I was struck by an article published on the BBC website regarding a study that suggests our brains, and their neural filaments, age differently than was previously believed.

Researchers at the University of Cambridge scanned the brains of approximately 4,000 people, mapping the connections between their brain cells. They found that the brain goes through five distinct phases in life, with critical turning points at the ages of nine, 32, 66, and 83.

As you doubtless noticed, there are significant gaps between those inflection points. Perhaps the most interesting range is the one between the ages of nine and 32, a period the researchers deemed “adolescence.”

Granted, some definitions of adolescence extend from the age of 10 to 30, but most people probably conceive of adolescence as inhabiting a shorter span. Most would agree with the World Health Organization (WHO), which defines adolescence as a period that begins at about 10 years of age and ends at about 19. We denizens of the Western world typically consider ourselves adults by the time we reach our late teens or early 20s, when many are still ensconced at an educational institution.

Not many people, I aver, would regard themselves as adolescents at 32. But that’s what this new research suggests. The University of Cambridge researchers claim their findings show that the brain stays in the adolescent phase until our early 30s, which is when we peak in our brain development, and presumably in our cognitive acuity. These results, according to the learned folks at Cambridge, could help us understand how and why the risks of mental-health disorders and dementia intensify at different points of our lives.

I realize, as noted above, that all knowledge is contingent. What we know now is apt to be modified or supplanted by scientific advances yet to occur. Nonetheless, the results of this study provided me with some exculpatory consolation. You see, I’ve always felt that I was a bit slow off the mark. I’ve thought that my path through life was chronically belated and digressive.

It’s All Downhill from Here, and the View Could Be Better

Adolescence was, for me, a long-running project. I blundered through much of my 20s, staggered into my 30s, and only got my act — such as it was — in passable order from my mid-30s onward. Even then, the act was no spellbinder.

Through it all, though, I thought I was the only one stuck in a perpetual miasma of feckless adolescence. The brain research from Cambridge helpfully suggests otherwise. It’s good to know, but I still rue my ungainly start in the marathon of life.

Near the end of the article, we are told that not everyone will experience the same cadence in their brain’s developmental cycles. Mileage, as always, will vary. Still, the researchers claim the broad categories they’ve demarcated will apply, give or take a couple of years here or there.

Having gained provisional insight into how brain science shaped my past, I turn my attention reluctantly to what’s ahead, according to the research. Frankly, the view is unedifying, if not bleak. The findings indicate that I have a few years of cognitive brightness remaining, after which I will begin to gradually and inevitably dim. At some point, I suppose my brain will flicker spasmodically and burn out, like a spent bulb. Game over. My brain cells will have snapped, crackled, and popped.

Still, I suppose there’s time to tackle new challenges, to push myself to new heights or — if that’s too grandiose — to pedal as fast as I can to avoid a premature slide into oblivion.

On the whole, though, I take comfort in Cambridge’s brain research. A large portion of my life, which I’ve looked back on as a time of prodigality, now makes a little more sense to me. I thank science for its illuminations, even as I know that its light of learning and reason will become even brighter in the years to come.
