How To Be Spiritual In A Material World

04 Sep 2024

What God, Quantum Mechanics and Consciousness Have In Common

 

In my 20s, I had a friend who was brilliant, charming, Ivy-educated and rich, heir to a family fortune. I’ll call him Gallagher. He could do anything he wanted. He experimented, dabbling in neuroscience, law, philosophy and other fields. But he was so critical, so picky, that he never settled on a career. Nothing was good enough for him. He never found love for the same reason. He also disparaged his friends’ choices, so much so that he alienated us. He ended up bitter and alone. At least that’s my guess. I haven’t spoken to Gallagher in decades.

There is such a thing as being too picky, especially when it comes to things like work, love and nourishment (even the pickiest eater has to eat something). That’s the lesson I gleaned from Gallagher. But when it comes to answers to big mysteries, most of us aren’t picky enough. We settle on answers for bad reasons, for example, because our parents, priests or professors believe it. We think we need to believe something, but actually we don’t. We can, and should, decide that no answers are good enough. We should be agnostics.

Some people confuse agnosticism (not knowing) with apathy (not caring). Take Francis Collins, a geneticist who directs the National Institutes of Health. He is a devout Christian, who believes that Jesus performed miracles, died for our sins and rose from the dead. In his 2006 bestseller The Language of God, Collins calls agnosticism a “cop-out.” When I interviewed him, I told him I am an agnostic and objected to “cop-out.”

Collins apologized. “That was a put-down that should not apply to earnest agnostics who have considered the evidence and still don’t find an answer,” he said. “I was reacting to the agnosticism I see in the scientific community, which has not been arrived at by a careful examination of the evidence.” I have examined the evidence for Christianity, and I find it unconvincing. I’m not convinced by any scientific creation stories, either, such as those that depict our cosmos as a bubble in an oceanic “multiverse.”

People I admire fault me for being too skeptical. One is the late religious philosopher Huston Smith, who called me “convictionally impaired.” Another is megapundit Robert Wright, an old friend, with whom I’ve often argued about evolutionary psychology and Buddhism. Wright once asked me in exasperation, “Don’t you believe anything?” Actually, I believe lots of things, for example, that war is bad and should be abolished.

But when it comes to theories about ultimate reality, I’m with Voltaire. “Doubt is not a pleasant condition,” Voltaire said, “but certainty is an absurd one.” Doubt protects us from dogmatism, which can easily morph into fanaticism and what William James called a “premature closing of our accounts with reality.” Below I defend agnosticism as a stance toward the existence of God, interpretations of quantum mechanics and theories of consciousness. When considering alleged answers to these three riddles, we should be as picky as my old friend Gallagher.

THE PROBLEM OF EVIL

Why do we exist? The answer, according to the major monotheistic religions, including the Catholic faith in which I was raised, is that an all-powerful, supernatural entity created us. This deity loves us, as a human father loves his children, and wants us to behave in a certain way. If we’re good, He’ll reward us. If we’re bad, He’ll punish us. (I use the pronoun “He” because most scriptures describe God as male.)

My main objection to this explanation of reality is the problem of evil. A casual glance at human history, and at the world today, reveals enormous suffering and injustice. If God loves us and is omnipotent, why is life so horrific for so many people? A standard response to this question is that God gave us free will; we can choose to be bad as well as good.

The late, great physicist Steven Weinberg, an atheist who died in July 2021, slaps down the free will argument in his book Dreams of a Final Theory. Noting that Nazis killed many of his relatives in the Holocaust, Weinberg asks: Did millions of Jews have to die so the Nazis could exercise their free will? That doesn’t seem fair. And what about kids who get cancer? Are we supposed to think that cancer cells have free will?

On the other hand, life isn’t always hellish. We experience love, friendship, adventure and heartbreaking beauty. Could all this really come from random collisions of particles? Even Weinberg concedes that life sometimes seems “more beautiful than strictly necessary.” If the problem of evil prevents me from believing in a loving God, then the problem of beauty keeps me from being an atheist like Weinberg. Hence, agnosticism.

THE PROBLEM OF INFORMATION

Quantum mechanics is science’s most precise, powerful theory of reality. It has predicted the outcomes of countless experiments and spawned countless applications. The trouble is, physicists and philosophers disagree over what it means, that is, what it says about how the world works. Many physicists—most, probably—adhere to the Copenhagen interpretation, advanced by Danish physicist Niels Bohr. But that is a kind of anti-interpretation, which says physicists should not try to make sense of quantum mechanics; they should “shut up and calculate,” as physicist David Mermin once put it.

Philosopher Tim Maudlin deplores this situation. In his 2019 book Philosophy of Physics: Quantum Theory, he points out that several interpretations of quantum mechanics describe in detail how the world works. These include the GRW model proposed by Ghirardi, Rimini and Weber; the pilot-wave theory of David Bohm; and the many-worlds hypothesis of Hugh Everett. But here’s the irony: Maudlin is so scrupulous in pointing out the flaws of these interpretations that he reinforces my skepticism. They all seem hopelessly kludgy and preposterous.

Maudlin does not examine interpretations that recast quantum mechanics as a theory about information. For positive perspectives on information-based interpretations, check out Beyond Weird by journalist Philip Ball and The Ascent of Information by astrobiologist Caleb Scharf. But to my mind, information-based takes on quantum mechanics are even less plausible than the interpretations that Maudlin scrutinizes. The concept of information makes no sense without conscious beings to send, receive and act upon the information.

Introducing consciousness into physics undermines its claim to objectivity. Moreover, as far as we know, consciousness arises only in certain organisms that have existed for a brief period here on Earth. So how can quantum mechanics, if it’s a theory of information rather than matter and energy, apply to the entire cosmos since the big bang? Information-based theories of physics seem like a throwback to geocentrism, which assumed the universe revolves around us. Given the problems with all interpretations of quantum mechanics, agnosticism, again, strikes me as a sensible stance.

MIND-BODY PROBLEMS

The debate over consciousness is even more fractious than the debate over quantum mechanics. How does matter make a mind? A few decades ago, a consensus seemed to be emerging. Philosopher Daniel Dennett, in his cockily titled Consciousness Explained, asserted that consciousness clearly emerges from neural processes, such as electrochemical pulses in the brain. Francis Crick and Christof Koch proposed that consciousness is generated by networks of neurons oscillating in synchrony.

Gradually, this consensus collapsed, as empirical evidence for neural theories of consciousness failed to materialize. As I point out in my recent book Mind-Body Problems, there are now a dizzying variety of theories of consciousness. Christof Koch has thrown his weight behind integrated information theory, which holds that consciousness might be a property of all matter, not just brains. This theory suffers from the same problems as information-based theories of quantum mechanics. Theorists such as Roger Penrose, who won the 2020 Nobel Prize in Physics, have conjectured that quantum effects underpin consciousness, but this theory is even more lacking in evidence than integrated information theory.

Researchers cannot even agree on what form a theory of consciousness should take. Should it be a philosophical treatise? A purely mathematical model? A gigantic algorithm, perhaps based on Bayesian computation? Should it borrow concepts from Buddhism, such as anatta, the doctrine of no self? All of the above? None of the above? Consensus seems farther away than ever. And that’s a good thing. We should be open-minded about our minds.

So, what’s the difference, if any, between me and Gallagher, my former friend? I like to think it’s a matter of style. Gallagher scorned the choices of others. He resembled one of those mean-spirited atheists who revile the faithful for their beliefs. I try not to be dogmatic in my disbelief, and to be sympathetic toward those who, like Francis Collins, have found answers that work for them. Also, I get a kick out of inventive theories of everything, such as John Wheeler’s “it from bit” and Freeman Dyson’s principle of maximum diversity, even if I can’t embrace them.

I’m definitely a skeptic. I doubt we’ll ever know whether God exists, what quantum mechanics means, how matter makes mind. These three puzzles, I suspect, are different aspects of a single, impenetrable mystery at the heart of things. But one of the pleasures of agnosticism—perhaps the greatest pleasure—is that I can keep looking for answers and hoping that a revelation awaits just over the horizon.

 

 

Original article here


02 Sep 2024

The health benefits of seaweed – a bath full of bladderwrack might be just what the doctor ordered

 

Seaweed, the colourful macroalgae that grows in the ocean, is a food source for marine life and humans. Each type of seaweed has a unique set of nutrients and can boost vitamin and mineral intake if eaten regularly.

Seaweed is widely consumed in Asia, and a staple ingredient in many Japanese, Korean and Chinese dishes. For example, nori is well known as the seaweed used to wrap sushi rolls, while wakame or kelp are often found in comforting ramen noodle dishes. These seaweeds impart an instantly recognisable savoury taste – known as umami flavour – to food and can add a variety of vitamins and minerals to meals.

Seaweed types can be broadly classified according to their colour: red, green and brown. Around 145 species of seaweed are eaten around the world. Seaweed is valued for its antioxidant properties, which help protect cells against free radical damage.

Many seaweeds contain phenolic and flavonoid compounds, which are important antioxidants, as well as omega-3 fatty acids, amino acids, fibre, vitamins A, C and E – and minerals such as copper, zinc and iodine.

The seaweeds with the greatest nutritional value include those often referred to as the “wracks”. Bladder, toothed and spiral wracks often have notably higher antioxidant contents than many other seaweeds. However, nutritional content varies depending on the type of seaweed, harvest location and the season in which it’s harvested.

As well as being nutritious, seaweed has potential as a source of valuable antimicrobial agents. Research from Queen’s University Belfast has shown that seaweed species can fight off the growth of harmful microorganisms in animals, some of which are becoming resistant to antibiotics.

While eating seaweed has an array of benefits, there are a few things to watch out for. Due to the high iodine content in many seaweed species, seaweed consumption could exacerbate thyroid conditions or interfere with thyroid medications.

Some reports also suggest that, depending on the habitat, seaweed species may accumulate heavy metals such as cadmium, which has been reported to cause liver and kidney toxicity, and mercury, a known neurotoxin.

It’s important, then, to check different countries’ maximum contaminant levels (MCLs) for heavy metal concentrations in seaweed products and only purchase those that have been tested and deemed safe for consumption.

Seaweed Bathing

But diet is not the only way to experience the benefits of seaweed.

In recent years, there has been an upsurge in seaweed bathing, which is believed to be beneficial to health. A bath full of bladderwrack might not seem very appealing – but this strap-like, olive-coloured seaweed, found along the coasts of the North Sea, Atlantic Ocean and Pacific Ocean, has been shown to have beneficial effects as a topical treatment for various skin issues, from skin ageing to wound healing.

The tradition of seaweed bathing, which has been practised for hundreds of years, was originally recommended for arthritis sufferers to ease pain, stiffness and inflammation of joints. Nowadays, it is also recommended for athletes after a strenuous workout because seaweed’s high magnesium content is believed to be soothing for aching muscles.

Many companies that specialise in the harvesting of seaweed for bathing purposes aim to do so sustainably. This means that the seaweed is cut from the reef at low tide and is never pulled from the root, to ensure regrowth. As demand increases, there is a growing awareness of the challenges around growing and harvesting methods, technical know-how and the environmental impact of expanding production.

The moisturising effects of seaweed are also prized by the beauty industry as today’s consumers place greater value on natural – and environmentally friendly – skincare ingredients.

Containing extracts of seaweeds such as toothed wrack and sea spaghetti, seaweed-based skincare lines claim anti-photoaging, hydrating and protective, nourishing and skin-plumping effects. But check the label: the closer to the top of the ingredient list, the greater the amount of seaweed in the product.

So, to enhance your health, seaweed can be easily incorporated into your lifestyle. Add it to your diet, in moderation, by exploring seaweed recipes and by sprinkling dried, ground seaweed mixtures featuring dulse, wakame, sea spaghetti and wracks into smoothies, over salads and even on pizza.

It is also possible to buy hand-harvested seaweed products for adding to the bath, so that the benefits of submerging in marine algae can be realised from home – no matter how far from the sea we may live.

 

 

Original article here


31 Aug 2024

The Forgotten Medieval Habit of ‘Two Sleeps’

 

It was around 23:00 on 13 April 1699, in a small village in the north of England. Nine-year-old Jane Rowth blinked her eyes open and squinted out into the moody evening shadows. She and her mother had just awoken from a short sleep.

Mrs Rowth got up and went over to the fireside of their modest home, where she began smoking a pipe. Just then, two men appeared by the window. They called out and instructed her to get ready to go with them.

As Jane later explained to a courtroom, her mother had evidently been expecting the visitors. She went with them freely – but first whispered to her daughter to “lye still, and shee would come againe in the morning”. Perhaps Mrs Rowth had some nocturnal task to complete. Or maybe she was in trouble, and knew that leaving the house was a risk.

Either way, Jane’s mother didn’t get to keep her promise – she never returned home. That night, Mrs Rowth was brutally murdered, and her body was discovered in the following days. The crime was never solved.

Nearly 300 years later, in the early 1990s, the historian Roger Ekirch walked through the arched entranceway to the Public Record Office in London – an imposing gothic building that housed the UK’s National Archives from 1838 until 2003. There, among the endless rows of ancient vellum papers and manuscripts, he found Jane’s testimony. And something about it struck him as odd.

Originally, Ekirch had been researching a book about the history of night-time, and at the time he had been looking through records that spanned the era between the early Middle Ages and the Industrial Revolution. He was dreading writing the chapter on sleep, thinking that it was not only a universal necessity – but a biological constant. He was sceptical that he’d find anything new.

So far, he had found court depositions particularly illuminating. “They’re a wonderful source for social historians,” says Ekirch, a professor at Virginia Tech, US. “They comment upon activity that’s oftentimes unrelated to the crime itself.”

But as he read through Jane’s criminal deposition, two words seemed to carry an echo of a particularly tantalising detail of life in the 17th Century, which he had never encountered before – “first sleep”.

“I can cite the original document almost verbatim,” says Ekirch, whose exhilaration at his discovery is palpable even decades later.

In her testimony, Jane describes how just before the men arrived at their home, she and her mother had arisen from their first sleep of the evening. There was no further explanation – the interrupted sleep was just stated matter-of-factly, as if it were entirely unremarkable. “She referred to it as though it was utterly normal,” says Ekirch.

A first sleep implies a second sleep – a night divided into two halves. Was this just a familial quirk, or something more?

An omnipresence

Over the coming months, Ekirch scoured the archives and found many more references to this mysterious phenomenon of double sleeping, or “biphasic sleep” as he later called it.

Some were fairly banal, such as the mention by the weaver Jon Cokburne, who simply dropped it into his testimony incidentally. But others were darker, such as that of Luke Atkinson of the East Riding of Yorkshire. He managed to squeeze in an early morning murder between his sleeps one night – and according to his wife, often used the time to frequent other people’s houses for sinister deeds.

 

 

When Ekirch expanded his search to include online databases of other written records, it soon became clear the phenomenon was more widespread and normalised than he had ever imagined.

For a start, first sleeps are mentioned in one of the most famous works of medieval literature, Geoffrey Chaucer’s The Canterbury Tales (written between 1387 and 1400), which is presented as a storytelling contest between a group of pilgrims. They’re also included in the poet William Baldwin’s Beware the Cat (1561) – a satirical book considered by some to be the first ever novel, which centres around a man who learns to understand the language of a group of terrifying supernatural cats, one of whom, Mouse-slayer, is on trial for promiscuity.

But that’s just the beginning. Ekirch found casual references to the system of twice-sleeping in every conceivable form, with hundreds in letters, diaries, medical textbooks, philosophical writings, newspaper articles and plays.

The practice even made it into ballads, such as “Old Robin of Portingale”: “…And at the wakening of your first sleepe, / You shall have a hot drink made, / And at the wakening of your next sleepe, / Your sorrows will have a slake…”

Biphasic sleep was not unique to England, either – it was widely practised throughout the preindustrial world. In France, the initial sleep was the “premier somme”; in Italy, it was “primo sonno”. In fact, Ekirch found evidence of the habit in locations as distant as Africa, South and Southeast Asia, Australia, South America and the Middle East.

One colonial account from Rio de Janeiro, Brazil in 1555 described how the Tupinambá people would eat dinner after their first sleep, while another – from 19th Century Muscat, Oman – explained that the local people would retire for their first sleep before 22:00.

And far from being a peculiarity of the Middle Ages, Ekirch began to suspect that the method had been the dominant way of sleeping for millennia – an ancient default that we inherited from our prehistoric ancestors. The first record Ekirch found was from the 8th Century BC, in the 12,109-line Greek epic The Odyssey, while the last hints of its existence dated to the early 20th Century, before it somehow slipped into oblivion.

How did it work? Why did people do it? And how could something that was once so completely normal have been forgotten so completely?

A spare moment

In the 17th Century, a night of sleep went something like this.

From as early as 21:00 to 23:00, those fortunate enough to afford them would begin flopping onto mattresses stuffed with straw or rags – or with feathers, if they were wealthy – ready to sleep for a couple of hours. (At the bottom of the social ladder, people would have to make do with nestling down on a scattering of heather or, worse, a bare earth floor – possibly even without a blanket.)

At the time, most people slept communally, and often found themselves snuggled up with a cosy assortment of bedbugs, fleas, lice, family members, friends, servants and – if they were travelling – total strangers.

To minimise any awkwardness, sleep involved a number of strict social conventions, such as avoiding physical contact or too much fidgeting, and there were designated sleeping positions. For example, female children would typically lie at one side of the bed, with the oldest nearest the wall, followed by the mother and father, then male children – again arranged by age – then non-family members.

A couple of hours later, people would begin rousing from this initial slumber. The night-time wakefulness usually lasted from around 23:00 to about 01:00, depending on what time they went to bed. It was not generally caused by noise or other disturbances in the night – and neither was it initiated by any kind of alarm (these were only invented in 1787, by an American man who – somewhat ironically – needed to wake up on time to sell clocks). Instead, the waking happened entirely naturally, just as it does in the morning.

The period of wakefulness that followed was known as “the watch” – and it was a surprisingly useful window in which to get things done. “[The records] describe how people did just about anything and everything after they awakened from their first sleep,” says Ekirch.

Under the weak glow of the Moon, stars, and oil lamps or “rush lights” – a kind of candle for ordinary households, made from the waxed stems of rushes – people would tend to ordinary tasks, such as adding wood to the fire, taking remedies, or going to urinate (often into the fire itself).

For peasants, waking up meant getting back down to more serious work – whether this involved venturing out to check on farm animals or carrying out household chores, such as patching cloth, combing wool or peeling the rushes to be burned. One servant Ekirch came across even brewed a batch of beer for her Westmorland employer one night, between midnight and 02:00. Naturally, criminals took the opportunity to skulk around and make trouble – like the murderer in Yorkshire.

But the watch was also a time for religion.

For Christians, there were elaborate prayers to be completed, with specific ones prescribed for this exact parcel of time. One father called it the most “profitable” hour, when – after digesting your dinner and casting off the labours of the world – “no one will look for you except for God”.

Those of a philosophical disposition, meanwhile, might use the watch as a peaceful moment to ruminate on life and ponder new ideas. In the late 18th Century, a London tradesman even invented a special device for remembering all your most searing nightly insights – a “nocturnal remembrancer”, which consisted of an enclosed pad of parchment with a horizontal opening that could be used as a writing guide.

But most of all, the watch was useful for socialising – and for sex.

As Ekirch explains in his book, At Day’s Close: A History of Nighttime, people would often just stay in bed and chat. And during those strange twilight hours, bedfellows could share a level of informality and casual conversation that was hard to achieve during the day.

For husbands and wives who managed to navigate the logistics of sharing a bed with others, it was also a convenient interval for physical intimacy – if they’d had a long day of manual labour, the first sleep took the edge off their exhaustion and the period afterwards was thought to be an excellent time to conceive copious numbers of children.

Once people had been awake for a couple of hours, they’d usually head back to bed. This next step was considered a “morning” sleep and might last until dawn, or later. Just as today, when people finally woke up for good depended on what time they went to bed.

An ancient adaptation

According to Ekirch, there are references to the system of sleeping twice peppered throughout the classical era, suggesting that it was already common then. It’s casually dropped into works by such illustrious figures as the Greek biographer Plutarch (from the First Century AD), the Greek traveller Pausanias (from the Second Century AD), the Roman historian Livy and the Roman poet Virgil.

Later, the practice was embraced by Christians, who immediately saw the watch’s potential as an opportunity for the recital of psalms and confessions. In the Sixth Century AD, Saint Benedict required that monks rise at midnight for these activities, and the idea eventually spread throughout Europe – gradually filtering through to the masses.

But humans aren’t the only animals to discover the benefits of dividing up sleep – it’s widespread in the natural world, with many species resting in two or even several separate stretches. This helps them to remain active at the most beneficial times of day, such as when they’re most likely to find food while avoiding ending up as a snack themselves.

One example is the ring-tailed lemur. These iconic Madagascan primates, with their spooky red eyes and upright black-and-white tails, have remarkably similar sleeping patterns to preindustrial humans – they’re “cathemeral”, meaning they’re up at night and during the day.

“There are broad swaths of variability among primates, in terms of how they distribute their activity throughout the 24-hour period,” says David Samson, director of the sleep and human evolution laboratory at the University of Toronto Mississauga, Canada. And if double-sleeping is natural for some lemurs, he wondered: might it be the way we evolved to sleep too?

Ekirch had long been harbouring the same hunch. But for decades, there was nothing concrete to prove this – or to illuminate why it might have vanished.

Then, back in 1995, Ekirch was doing some online reading late one night when he found an article in the New York Times about a sleep experiment from a few years before.

 

 

The research was conducted by Thomas Wehr, a sleep scientist from the National Institute of Mental Health, and involved 15 men. After an initial week of observing their normal sleeping patterns, they were deprived of artificial illumination at night to shorten their hours of “daylight” – whether naturally or electrically generated – from the usual 16 hours to just 10. The rest of the time, they were confined to a bedroom with no lights or windows, and fully enveloped in its velvety blackness. They weren’t allowed to play music or exercise – and were nudged towards resting and sleeping instead.

At the start of the experiment, the men all had normal nocturnal habits – they slept in one continuous shift that lasted from the late evening until the morning. Then something incredible happened.

After four weeks of the 10-hour days, their sleeping patterns had been transformed – they no longer slept in one stretch, but in two halves roughly the same length. These were punctuated by a one-to-three-hour period in which they were awake. Measurements of the sleep hormone melatonin showed that their circadian rhythms had adjusted too, so their sleep was altered at a biological level.

Wehr had reinvented biphasic sleep. “It [reading about the experiment] was, apart from my wedding and the birth of my children, probably the most exciting moment in my life,” says Ekirch. When he emailed Wehr to explain the extraordinary match between his own historical research, and the scientific study, “I think I can tell you that he was every bit as exhilarated as I was,” he says.

More recently, Samson’s own research has backed up these findings – with an exciting twist.

Back in 2015, together with collaborators from a number of other universities, Samson recruited local volunteers from the remote community of Manadena in northeastern Madagascar for a study. The location is a large village that backs on to a national park – and there is no infrastructure for electricity, so nights are almost as dark as they would have been for millennia.

The participants, who were mostly farmers, were asked to wear an “actimeter” – a sophisticated activity-sensing device that can be used to track sleep cycles – for 10 days, to track their sleep patterns.

“What we found was that [in those without artificial light], there was a period of activity right after midnight until about 01:00-01:30 in the morning,” says Samson, “and then it would drop back to sleep and to inactivity until they woke up at 06:00, usually coinciding with the rising of the Sun.”

As it turns out, biphasic sleep never vanished entirely – it lives on in pockets of the world today.

A new social pressure 

Collectively, this research has also given Ekirch the explanation he had been craving for why much of humanity abandoned the two-sleep system, starting from the early 19th Century. As with other recent shifts in our behaviour, such as a move towards depending on clock-time, the answer was the Industrial Revolution.

“Artificial illumination became more prevalent, and more powerful – first there was gas [lighting], which was introduced for the first time ever in London,” says Ekirch, “and then, of course, electric lighting toward the end of the century. And in addition to altering people’s circadian rhythms, artificial illumination also naturally allowed people to stay up later.”

However, though people weren’t going to bed at 21:00 anymore, they still had to wake up at the same time in the morning – so their rest was truncated. Ekirch believes that this made their sleep deeper, because it was compressed.

As well as altering the population’s circadian rhythms, the artificial lighting lengthened the first sleep, and shortened the second. “And I was able to trace [this], almost decade by decade, over the course of the 19th Century,” says Ekirch.

(Intriguingly, Samson’s study in Madagascar involved a second part – in which half the participants were given artificial lights for a week, to see if they made any difference.

In this case, the researchers found that it had no impact on their segmented sleep patterns. However, the researchers point out that a week may not be long enough for artificial lights to lead to major changes. So the mystery continues…)

Even if artificial lighting was not fully to blame, by the end of the 20th Century, the division between the two sleeps had completely disappeared – the Industrial Revolution hadn’t just changed our technology, but our biology, too.

A new anxiety

One major side-effect of much of humanity’s shift in sleeping habits has been a change in attitudes. For one thing, we quickly began shaming those who oversleep, and developed a preoccupation with the link between waking up early and being productive.

“But for me, the most gratifying aspect of all this,” says Ekirch, “relates to those who suffer from middle-of-the-night insomnia.” He explains that our sleeping patterns are now so altered, any wakefulness in the middle of the night can lead us to panic. “I don’t mean to make light of that – indeed, I suffer from sleep disorders myself, actually. And I take medication for it…” But when people learn that this may have been entirely normal for millennia, he finds that it lessens their anxiety somewhat.

However, before Ekirch’s research spawns a spin-off of the Paleo diet, and people start throwing away their lamps – or worse, artificially splitting their sleep in two with alarm clocks – he’s keen to stress that the abandonment of the two-sleep system does not mean the quality of our slumber today is worse.

Despite near-constant headlines about the prevalence of sleep problems, Ekirch has previously argued that, in some ways, the 21st Century is a golden age for sleep – a time when most of us no longer have to worry about being murdered in our beds, freezing to death, or flicking off lice, when we can slumber without pain, the threat of fire, or having strangers snuggled up next to us.

In short, single periods of slumber might not be “natural”. And yet, neither are fancy ergonomic mattresses or modern hygiene. “More seriously, there’s no going back because conditions have changed,” says Ekirch.

So, we may be missing out on confidential midnight chats in bed, psychedelic dreams, and night-time philosophical revelations – but at least we won’t wake up covered in angry red bites.

 

 

Original article here


28 Aug 2024

20-Somethings Are in Trouble

 

What if I told you that one age group is more depressed, more anxious, and lonelier than any other in America?

You might assume I’m talking about teens. Mood disorders, self-harm, and suicide have become more common among adolescents in recent years; article after article reports that social media is toxic for teen girls especially, eroding their self-esteem and leaving them disconnected. Or you might think of older adults, often depicted in popular culture and news commentary as isolated and unhappy, their health declining and their friends dropping away.

So perhaps you’d be surprised to hear the results of a Harvard Graduate School of Education survey on mental health in America: Young adults are the ones most in crisis. Even Richard Weissbourd, who led the study in 2022, was taken aback. His team found that 36 percent of participants ages 18 to 25 reported experiencing anxiety and 29 percent reported experiencing depression—about double the proportion of 14-to-17-year-olds on each measure. More than half of young adults were worried about money, felt that the pressure to achieve hurt their mental health, and believed that their lives lacked meaning or purpose. Teenagers and senior citizens are actually the two populations with the lowest levels of anxiety and depression, Weissbourd’s research has found.

Other studies of young adults have similarly alarming findings. According to the CDC, in 2020, depression was most prevalent among 18-to-24-year-olds (and least prevalent among those 65 or older). A 2023 Gallup poll found that loneliness peaked at ages 18 to 29. And, according to one meta-analysis spanning four decades, more and more young adults reported loneliness each year. When Weissbourd repeated his survey last year, young-adult anxiety and depression had also risen, to 54 and 42 percent, respectively. Still, the struggles of young adults have gone largely unnoticed. When Weissbourd got his data, “it was really upsetting,” he told me. “What is going on here? And why aren’t we talking about it more?”

The phase between adolescence and adulthood has long been daunting: You’re expected to figure out who you are, to create a life for yourself. That might sound exciting, as if all the doors are wide open, but much of the time it’s stressful—and modern challenges are making it harder. Young adults are more vulnerable than ever, but much of American society doesn’t see them that way.

One thing that gets Jennifer Tanner fired up is the myth that young adulthood is a carefree time. Many people see it as a perfect juncture, when you’re old enough to have agency but young enough to be free of big responsibilities. Commonly, though, it’s the inverse: You have new obligations but not the wisdom, support, or funds to handle them. Tanner is a developmental researcher studying “emerging adulthood,” typically defined as the years from age 18 to 29, and she thinks that many more established adults wish they could go back to that period and do things differently; in hindsight, it might seem like a golden age of possibility. “Everybody who’s 40 is like, I wish I was 18.” Meanwhile, young adults are “like, The world’s on my shoulders and I have no resources,” she told me. “We’re gaslighting the hell out of them all the time.”

Of course, being a teen isn’t easy either. Depression and anxiety are increasing among adolescents. But in high school, you’re more likely to have people keeping an eye on you, who’ll notice if you’re upset at home or if you don’t show up to school. Adults know that they should protect you, and they have some power to do it, Weissbourd said. After you graduate from high school or college, though, you might not have anyone watching over you. The friends you had in school may scatter to different places, and you may not be near your family. If you’re not regularly showing up to a workplace, either, you could largely disappear from the public eye. And if life is taking a toll, mental-health resources can be hard to come by, Tanner told me, because psychologists tend to specialize either in childhood and adolescence or adult services, which generally skew older.

As soon as you become independent, you’re expected to find housing, land a satisfying job, and connect with a community. But achieving those hallmarks of adulthood is getting harder. College tuition has skyrocketed, and many young people are saddled with student loans. With or without such debt, finding a place to live can feel impossible, given the current dearth of affordable housing. In 2022, a full half of renters spent more than 30 percent of their income on rent and utilities—a precarious situation when you haven’t yet built up savings. Under rising financial stress, finding fulfilling work can come second to paying the bills, Weissbourd explained. But that might mean missing out on a career that gives you a sense of self-worth and meaning. Jillian Stile, a clinical psychologist who works with young adults, told me that a lot of her clients are “feeling like a failure.”

 

 

On top of that, the social worlds that young people once occupied are crumbling. In the recent past, young adults were more likely to marry and have kids than they are today. They might have befriended other parents or co-workers, or both. Commonly, they’d belong to a religious congregation. Now they’re marrying and starting families later, if at all. Those with white-collar jobs are more likely to work remotely or to have colleagues who do, making it hard to find friends or mentors through work, Pamela Aronson, a sociologist at the University of Michigan at Dearborn, told me. Religious-participation rates have plunged. Americans in general are spending more time alone, and they have fewer public places to hang out and talk with strangers. For young adults who haven’t yet established social routines, the decline of in-person gatherings can be especially brutal. “Until you build those new systems around yourself that you contribute to, and they contribute back to your health and well-being,” Tanner told me, “you’re on shaky ground.”

Sources of companionship inevitably shift. Today, for example, more young people are getting support (emotional and financial) from parents; 45 percent of 18-to-29-year-olds live with their folks. But that can be isolating if you don’t also have friends nearby. Family bonds, no matter how wonderful, aren’t substitutes for a group of peers going through this sometimes-scary life phase at the same time.

Without a sense of belonging, the world can seem bleak. In Weissbourd’s study, 45 percent of young adults said they had a “sense that things are falling apart,” 42 percent said gun violence in schools was weighing on them, 34 percent said the same of climate change, and 30 percent reported worrying about political leaders being incompetent or corrupt. These issues don’t affect only young adults, but they might feel particularly grim if you can’t imagine what your life will look like in a decade. When it comes to “anxiety and depression,” Weissbourd told me, “it’s not only about your past—it’s about how you imagine your future.” And young adults? “They’re not hopeful.”

A rocky start to adulthood could cast a shadow over the rest of someone’s life. Aronson reminded me that, on average, Millennials have “less wealth than their predecessors at the same age—because their incomes were lower, because they started their jobs during a recession.” Gen Z spends a greater portion of its money on essentials than Millennials did at their age. That doesn’t bode well for Gen Z’s future finances. And there are other concerns: Maybe, if you can’t afford to pursue a rewarding job when you’re young, you’ll work your way up in a career you don’t care about—and end up feeling stuck. Perhaps if you don’t make genuine friends in young adulthood—commonly a time when people form long-lasting bonds—you’ll be lonelier in middle age. And if you lean exclusively on your parents, what will you do when they die?

Leaving individual young adults responsible for overcoming societal obstacles clearly isn’t working. “I don’t think we’re going to therapize or medicate our way out of this problem,” Weissbourd, a therapist himself, told me. He wants to see more “social infrastructure”: Libraries might arrange classes, volunteer opportunities, or crafting sessions that would be open to people of all ages but that could allow isolated young people to feel part of something. Doctors might ask young-adult patients about loneliness and offer resources to connect them with other people. Colleges could assign students an adviser for all four years and offer courses to guide students through the big questions about their place in the world. (Weissbourd teaches one at Harvard called “Becoming a Good Person and Leading a Good Life.”) Aronson suggested that workplaces should hold mentoring programs for young employees. And of course, student-loan-debt forgiveness, government support for higher education, affordable housing, and more extensive mental-health-care coverage wouldn’t hurt.

First, older adults need to acknowledge this crisis. Seeing young people as worthy of empathy means understanding today’s challenges, but it might also involve recalling one’s own youth as it really was—and finding compassion for one’s past self. While older adults may have regrets, they probably did their best with the perspective and resources they had. And they could stand to remind the young adults in their lives: Even flawed choices can lead to a life that, however imperfect, encompasses real moments of joy, accomplishment, and self-knowledge. If our culture romanticized that growth a little more and the golden glow of youth a little less, young adults might feel less alone in their distress. They might even look forward to finding out what’s next.

 

 

Original article here

 

