You’re not growing horns, but tech use is not without physical risks
By Lance Ulanoff
Last week, for almost 24 terrifying hours, mankind believed that overuse of smartphones could lead to us growing horns on the back of our heads.
This is not as nutty as it sounds. We’ve long believed, and often rightly so, that overuse of any kind of technology can lead to all sorts of medical calamities. Entire industries have sprung up to help protect us from the threat of curved backs, aching wrists, arthritic fingers, and twisted necks. But an entire generation evolving based on a couple of decades of smartphone use had the unmistakably sickly-sweet smell of myth.
A Tale Told by Us
This questionable news traveled fast and wide thanks to a very official, trustworthy-sounding source. Multiple academic papers from a research team in Queensland, Australia, found that – and this was the stunner – 33% (yes, read that as “one-third”) of the population of Australia had developed horn-like bone spurs at the base of their skulls. This new report, by the way, was the more conservative one. An earlier 2016 study put the number of teens with the horn-like protrusions at 41%.
The study was careful to point out that this was not a genetic trait (“My daddy had tiny horns on the back of his head and so did his daddy, and so on. You’re gonna have ‘em, too”). Instead, it was a posture thing. The latest study noted that the prevalence of these protrusions, which are up to 31 mm long, indicates that it’s not a degenerative bone condition connected to age, but some other factor.
What factor? The researchers, naturally, had an idea. I suspect they started looking around the research lab, their building, and maybe even took a walk street side where they encountered an entire population staring at their phones. Then they ran back upstairs and jotted it down.
From the paper:
“We hypothesize EEOP [enlarged external occipital protuberance] may be linked to sustained aberrant postures associated with the emergence and extensive use of hand-held contemporary technologies, such as smartphones and tablets. Our findings raise a concern about the future musculoskeletal health of the young adult population and reinforce the need for prevention intervention through posture improvement education.”
That paragraph ignited a mini global firestorm. I mean, we knew these things were bad for us, but to trigger cranial evolution — and, yes, the researchers do throw “evolutionary changes” into the mix — well that’s a genetic leap too far.
Horns Come Home
Granted, the minute I read the initial story, I felt the back of my head for a tell-tale lump or tiny “horn,” just as I suspect you’re doing now. By the time I arrived home last week, the news had already infected my son who asked with a straight face if I’d checked my head.
However, by then, the tide of information had turned on this tech-consequence science. I smiled back at my son and said it wasn’t true. He said, “but The Washington Post.” I countered with, “The New York Times.” By the afternoon, the Gray Lady ran a debunker that noted:
“Experts give the report mixed reviews, noting that the study is based on looking back at X-rays taken in the past, lacks a control group and cannot prove cause and effect. In addition, the subjects were people who were having enough neck trouble to visit a chiropractic clinic and require X-rays, so it’s not clear what bearing the results have on the rest of the population.”
There’s also the reality of evolution, which, most science tells us, happens over eons, not decades. In addition, the adaptations are just that: changes that help living things better navigate their changing world and environment. I’m not sure horns would qualify for humans, unless you spent a lot of time in the bullfighting ring – substituting yourself for the bull.
In other words, don’t start buying hats with a special space for the horns you’ll surely grow.
We can, in effect, shake off this ill-conceived head-horn study and go back to staring down at our phones.
But I think that would dismiss an underlying truth about technology use: too much of anything isn’t good for us. The reason so many people accepted the study at face (or head) value is not that they detected horns growing out of the backs of their own heads, but the gnawing concern that, just maybe, they’re overdoing it with tech.
This is not a new worry, and the negative effects of some technology — from sleepless nights because of too much blue light from our screens to carpal tunnel syndrome from prolonged use of keyboards and mice — are very real.
Back in the 1970s, when we still called TV the “boob tube” and parents realized they could let even just a handful of broadcast channels babysit their kids, there was real concern that sitting too close to the TV (my preferred distance was 3.7 feet) could harm eyesight. Now, ignoring the fact that I wear glasses, researchers eventually agreed that television wasn’t blinding young children, though the content wasn’t doing their young psyches any favors, either.
Still, the nagging fear that the latest technology could cause us physical harm has persisted and it hasn’t always been wrong.
In the earliest days of our computer revolution, health concerns were more prosaic. In 1982, PC Magazine published a lengthy article about ergonomics and noted that “although computers don’t cause permanent physiological or psychological disorders, their improper designs and use can lead to such maladies as headaches, back pains, eye strain, stress, and frustration.”
Later, we began to realize that even with the best design, prolonged use of computers and, especially, lengthy sessions in a chair were not exactly recipes for long-term health.
In general, technology design has moved more quickly than ergonomics. Usually, the designs of products like your desktop computer or desk survive years without a single fundamental change. We stared at bulky, low-resolution, slow-refresh-rate CRT screens for decades. Standing desks, for instance, are a relatively new phenomenon after centuries of desk-bound work.
Smartphones and tablets get more slab-like all the time, but the way we hold, carry, and stare at them remains relatively unchanged. Ergonomic updates like blue-light-reducing night modes are a relatively new phenomenon.
My point is, we can comfortably laugh off the idea that we’re evolving into a population of goat-people, but the researchers’ mistake was likely not one of accuracy so much as messaging. They mistook, and maybe we did, too, an ailment for evolution.
Computing ergonomics already tell us that it’s not healthy to bend your neck at a 45-degree angle for hours at a time (looking up or down) and that the best desk posture is one where we can sit straight and stare straight into our screens. Right now I’m looking at an HD screen that floats almost weightlessly on an adjustable arm, ensuring I can keep every pixel at eye level.
No one is going to start walking around with their smartphones held up in front of their face (unless they enjoy walking smack into a wall). Not wanting to miss a single Beyoncé Insta update, we walk and sit with our necks craned at a chiropractor-inducing angle. That’s unlikely to change. But this somewhat spurious report and the media’s insistence on spreading the myth almost unchecked might have had an unintended benefit.
Tell me that the next time you’ve spent 30 minutes staring down at your phone, you won’t stop, rub the back of your neck, and then start searching furtively for the first signs of a horn. It won’t be there, but your neck will appreciate the break nonetheless.