News as Entertainment

Published: Jun 4, 2009 by Joe Larabell

I was just reading an editorial article on the recent “rise-and-fall” of media reports concerning the novel A/H1N1 outbreak. The author seems to confuse a lack of panic with “scoffing” at the alarmist stance taken by the media at the start of what is clearly a pandemic of global proportion. I can certainly relate. The only opinions that seem to make it into the mainstream media these days are those at the extremes. On the one hand, you have those who believe that the current strain of A/H1N1 is a lethal killer waiting to jump out of the shadows and take out half the planet’s population. On the other hand, you have those who believe the whole scare was trumped up by the media outlets as a publicity stunt (well… ok, maybe nobody has used those exact words… yet). What you don’t hear nearly as much (unless you consider the WHO and CDC websites “media”) are the balanced reports of what’s really going on and how likely we really are to fall victim to this particular threat.

Those who would have you believe that Armageddon is at hand point to the last huge killer pandemic in 1918, when the Spanish Flu claimed around 50 million lives. In that case, experts say, the same strain showed up in a milder form the previous Spring but came back with a vengeance when the regular flu season started the following Autumn. The Spanish Flu of 1918 was eventually identified as another strain of H1N1 (nobody knew that at the time, obviously – apparently, someone was able to recover samples of the 1918 virus from the Alaskan permafrost). Every year since then (and even before then) has brought with it some kind of flu virus (twice a year, actually, if you consider that flu season in the Southern Hemisphere is six months out of phase with that of the Northern Hemisphere). Since one of the survival mechanisms of the flu virus is constant mutation [1], every year’s seasonal flu is slightly different from the year before. Health officials attempt to predict these mutations and concoct a vaccine each year specific to the predicted strains – and most of the time it works. What tripped the alarm this year is: (a) the fact that there hasn’t been an A/H1N1 virus circulating in recent memory so it was believed that humans had no natural immunity, (b) the virus seemed to be affecting more than just the very young and the very old, and (c) it appeared in the Spring – somewhat unusual for influenza… and very similar to the situation leading up to the pandemic in 1918. But all this is circumstantial evidence. In fact, at least one study found that the reason older folks (who are usually the hardest hit by seasonal influenza) were not being affected in their usual proportion was that many actually did have a natural immunity. 
It seems a strain of A/H1N1 may have circulated around the mid-1950s and those affected then (who, for the most part, survived) are the older folks of today – those who, for some reason, aren’t being affected by A/H1N1 in the proportions usually expected for seasonal influenza. Could this be the reason for the skewed age curves this time around?

I suppose it does no good to point out that more people are killed in automobile mishaps (45,000 in the US in 2002) than by influenza (20,000 in the US in 2001). That would be comparing apples to oranges. And it’s probably moot to point out that so far only 117 people have died from novel A/H1N1 (as of 3 June 2009, WHO) because we don’t yet know whether we’re on the leading edge of a major pandemic or the falling edge of an out-of-season and comparatively mild outbreak. Moreover, we don’t know what lies in store come next November, or any November for that matter.

But isn’t that the point – that we simply don’t know? One of the comments attached to the above-referenced editorial said: “To know the past is to predict the future.” Actually, I would correct that to read “To know the past completely is to know that it’s simply not possible to predict the future.” Especially when we’re predicting said future based on something that happened 90 years ago and hasn’t happened (in the same intensity) since.

Let’s look at the timeline:

  • 1918: Some form of H1N1 claims some 50 million lives. History suggests the same virus may have appeared in a milder form in the Spring and then mutated to a more virulent form the following Autumn.
  • Some 40-plus years pass, each year seeing an influenza virus that does not cause widespread death, at least as far as we know (no news is apparently good news in this case).
  • Mid-1950s: Some form of A/H1N1 makes its rounds with no unusual fanfare.
  • Another 50-plus years pass, some years worse than others (can anyone say "Hong Kong Flu"?), but nothing matching 1918 in virulence or mortality.
  • 2009: The A/H1N1 virus appears again, in the Spring, causing 117 deaths (so far).

And from this history, we’re supposed to believe that it’s likely that we’ll be faced with a more virulent form of the current virus this coming Autumn. I’m not trying to offend those who are genuinely concerned but, to me, the numbers seem to say that it’s just as likely – no, far more likely – that the mutations produced by this particular strain will present no more of a threat than they did in the mid-1950s, or than any other strain of influenza has since 1918. Influenza has been with us for at least 500 years. It constantly mutates [1] and new strains circulate every year during the colder part of the year. The only difference is that, today, we have a global news infrastructure whose very existence depends on big, bad things happening across the globe – and a viewing public with an apparently insatiable thirst for disaster.

Sure… odds are that eventually the planet will be hit by another global pandemic of 1918 proportions. And… odds are that Tokyo will be hit by another killer earthquake. However, the odds that either of those events will happen in any particular year are slim, at best. Those who say we’re overdue for a huge disaster because one hasn’t happened for so long (1923 for Tokyo, 1918 for influenza) simply don’t know how probability works. The elapsed time since any given event doesn’t make us overdue for another occurrence any more than having a coin land heads-up 50 times in a row makes it more likely that tails (or heads) will show up on the next toss (unless, of course, the root cause is a crooked coin). When an event is rare but catastrophic (like killer earthquakes or global pandemics), the human mind tends to give it more weight than an event that is relatively more frequent but less of a disaster (like automobile accidents). So we panic over the appearance of A/H1N1 while climbing into our cars every morning without a moment’s pause.
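The coin-toss point is easy to check for yourself. Here’s a quick simulation sketch (the function name, streak length, and trial count are all arbitrary choices of mine): it flips a fair coin many times and measures how often heads follows a run of five consecutive heads. If streaks really made the next outcome more (or less) likely, the estimate would drift away from one half – it doesn’t.

```python
import random

def prob_heads_after_streak(num_flips=200_000, streak=5, seed=42):
    """Estimate P(heads on the next flip | the previous `streak`
    flips were all heads) for a fair coin. Because flips are
    independent, this should come out close to 0.5 no matter how
    long the streak is."""
    rng = random.Random(seed)
    streaks_seen = 0       # times we observed a run of `streak` heads
    followed_by_heads = 0  # times that run was followed by heads
    run = 0                # current run of consecutive heads
    for _ in range(num_flips):
        flip_is_heads = rng.random() < 0.5
        if run >= streak:
            streaks_seen += 1
            if flip_is_heads:
                followed_by_heads += 1
        run = run + 1 if flip_is_heads else 0
    return followed_by_heads / streaks_seen

print(prob_heads_after_streak())  # close to 0.5 – no "overdue" effect
```

The same reasoning is why "Tokyo hasn’t had a big quake since 1923" tells you nothing about next year’s odds – unless, as with the crooked coin, there’s an underlying mechanism (like accumulating fault stress) that actually changes the probabilities over time.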

I’m glad that we (collectively – meaning governments and other institutions) have procedures in place for addressing catastrophic disasters. And I’m glad that we have a global news infrastructure that keeps us informed of potential risks as they appear. But when you put the two together – advance notice of possible disasters which, in turn, triggers the very procedures designed to address those disasters – and then mix in the human tendency to prioritize things according to their effect while ignoring their probability, you have a recipe for panic. And panic has never solved any problem.

So my advice, when faced with news of impending doom, is to realize that these days news really is entertainment, that “situation normal” simply doesn’t make for good news copy, that rare events continue to be rare despite how long it’s been since the last such event, and that a modicum of preparation is far more useful than a head-full of panic. And just ignore those who would tell you otherwise.

This post was originally published as:


