Between the wake of living and the insensibility of death: the experience of now

It’s an old and familiar trope; as a young man, it would enrage me.

Picture it: an old person, who is tired of living, decides that they are ready to die. Then, they close their eyes and die, as if the matter were decided in that moment — probably after some important milestone had passed and some important wisdom had been imparted.

The decision itself to die is not, I think, the key issue. Death as the ultimate sacrifice, in the name of some higher principle or for the benefit of some other person, has always tickled my adolescent fancy. Likewise, for as long as I can remember, I have always thought suicide to be an appropriate response to a cruel and terminal illness, even if it isn’t the choice I would make for myself.

I think the trope enraged me because it eulogized a decision to acquiesce to death’s inevitable and final ushering for no other reason than the old person’s indifference to life. The old person could live longer; they simply choose not to because they don’t much see the point in living any longer. It seemed to me to be the ultimate betrayal of the very idea of life, in all of its stubborn glory. Death is not an undiscovered country; it is an insensibility to be resisted at all costs until the very moment of consumption and consummation.

However, now that I have made it to middle age, I have found that the trope no longer enrages me. The decision to acquiesce to death, however unpalatable such acquiescence may be to me, even seems to make sense, once the nature of lived experience is rightly understood.

When I was younger, lived experience seemed much more concrete and enduring, even after it had already been lost under the wake of living, because the amount of lived experience I could remember seemed to be much more than the experience I had forgotten. Sure, I couldn’t remember every detail of waking life but, on the whole, it felt like my experiences lived on with me in my memories.

At forty-five, however, the ledger of memories and lived experience is not at all balanced. I have undeniably forgotten much more of my life than I can now remember. I can no longer pretend otherwise: experience is gone forever once it is lived and our very fallible and fleeting memories can’t preserve or resurrect it. In terms of the experience of lived experience, the only difference between living and death is that the now of living is experienced and the now of death is not. The past is as unknowable as the future, whatever the fantasy of memory might otherwise try to tell us.

Now that this insight has taken root, it has become much easier for me to imagine a time when I will be able to look forward into death and look back onto life and not really see that much difference in terms of the experience of lived experience. As a young person, the experience of now was a supernova that illuminated all horizons; today, it is a star bright enough for me to look back with fondness and forward with anticipation, despite the shadows growing all around me; looking out towards 80 or 90 (and, hopefully, 100 or 120), it is very easy to imagine that the experience of now might feel like a pale, dim light in a universe of nothing stretching in all directions. If that is the case, persistence for the sake of persistence might not seem to really add or subtract from the final ledger; and acquiescence to an insensible future might not seem so different from an attachment to the insensible past. Maybe, just maybe, I will also be ready to close my eyes and slip away quietly.

But, let me say this now! If some future Sterling starts nattering on about going gently into that good night, he is a rogue and a fraud! Hear me now and believe me later: attach every machine, do all the surgeries, and give me every drug; do whatever it takes to keep my faint ember of consciousness aglow, no matter the suffering I may endure. I expect future Sterling will feel the same; however, because younger Sterling would probably be enraged at my defence of the enraging trope, I shall err on the side of caution: let my will today bind his then. If future Sterling ever loses sight of the faint ember of his experience in the engulfing insensibility of past and future, give him a stiff rum or two and send him to bed. I’m sure he will be fine in the morning. He’s probably just had a bad day. Plus, if he has got to go, he will probably want to go quietly in his own bed, enveloped in a nice light glow.

Losing my religion: the unknowable self and the myth of a well-ordered society

I suspect that you and I don’t really know anything.

Today, thanks to a lot of trial and error, we humans have a pretty good understanding of what we need to do to distinguish between plausible and implausible beliefs. If we run controlled double-blind and repeatable experiments that generate a sufficient amount of data of sufficient quality, we can use statistical methods to confidently identify those beliefs that are false and those that are plausibly true but still in need of further testing. Considered from this perspective, it seems pretty obvious to me that you and I don’t really know anything. Most of our beliefs have not been tested in this way. 

To start, almost all of our beliefs about the universe are taken on faith that the people doing the work of understanding the universe are doing it correctly. To be sure, this is probably a sensible approach for you and me to take. It certainly seems much more efficient to rely on a specialized community of inquirers to undertake this work, but it doesn’t change the fact that you and I don’t really know what the scientific community knows. Their well-tested beliefs are, for us, articles of faith, even if we can expect them to be much more reliable than the articles of faith generated by theologians interpreting ancient texts. And if this is true, it is true whenever we rely on others to formulate and test beliefs on our behalf. Beliefs that we don’t test ourselves are, for us, articles of faith.

With that conclusion in mind, take a few minutes to catalogue all the beliefs that you have and rely on each day that are formulated for you and/or tested by others. If you are honest with yourself, I am pretty sure the list will be quite long. And while it is tempting to believe that we have good reason to rely on others for all of these beliefs, I’m willing to bet that you have not tested that belief either. I, for one, can admit that I have not tested it — and most of my other beliefs. I also feel pretty comfortable guessing that you and I are in the same boat. 

And this, I think, is the crucial consideration. We might be able to shrug off the fact that particle physics is for us a matter of faith, but I suspect it will be much more unsettling to realize that you and I never properly test a whole range of beliefs that fundamentally shape our sense of self, our identity, and our daily experience of living.

Consider: Am I happy or unhappy today? Am I happier or less happy than I was yesterday? Last week? Last year? Am I better off now than I was three years ago? Am I consistently making choices that support my well-being? Did I go to the right university? Was I right not to go to university? Am I in the right career? Are my goals for the future the right goals? Am I with the right partner? Would I have been happier with no children or more children? Am I the person I wanted to become? Who was I? Which of my memories are accurate? How accurate? And so on. For all of these questions and many more, there are objective and measurable answers. I’m also willing to bet that your answers to these kinds of questions are a mix of educated guesses, received wisdom, and Magic 8-Ball proclamations. 

To further complicate matters, it is very likely that some of these questions can’t ever be properly answered. We could, for example, carefully track our self-reported experiences of happiness over a long enough period of time to come up with some plausible theories about what makes us happy and then test those theories with more data. However, we probably will never be able to adequately test whether any particular life choice was the right one to make. There are no do-overs in life. As a result, we can’t even generate the data that would put us in a position to make a valid assessment. Furthermore, in the face of this certain uncertainty, it seems likely that we can’t even reliably assess these choices in the here and now because we don’t have the well-tested beliefs upon which to assess the expected outcomes. So, even if we want to evaluate our life choices before we make them (overlooking the important consideration that many people don’t), we don’t even have the correct data for that evaluation. 

One plausible way to sidestep these concerns is to simply stipulate a lower burden of proof for these kinds of beliefs. Perhaps, it doesn’t really matter if we have properly tested beliefs about our happiness, our favourite foods, or our career path. One might be happy to claim that the good life requires only that we can tell ourselves a convincing story in the here and now that we are happy, well-off and that the events of our lives brought us here. All’s well that we can describe as ending well! And while I suspect that this tactic might actually be the best explanation for our species’ reproductive success up to this point (i.e. that we have a curious ability to reimagine suffering as a net benefit), I remain suspicious of the notion that we should lower the burden of proof for these kinds of beliefs. A delusion is a delusion is a delusion, even if we can convince ourselves that we are happy about it. 

In the face of this uncertainty, however, I suspect the only appropriate conclusion is to give up on the notion that we can ever definitively know ourselves. We are constantly evolving animals that are bound in the flow of time and, as a result, there are beliefs about ourselves that we can never properly test. We have to rely on hunches, received wisdom and wild guesses because we have no other option. It isn’t because we are inherently mystical or otherworldly. It is because we are constrained by our temporal existence. The much larger and crucial delusion, I think, is the belief that we could know with certainty who we are and what we value. Once we give up on that idea, the notion that we don’t know ourselves with God-like certainty seems much less unsettling and becomes just another mundane limitation of human existence.

And while this conclusion might be well and good on the personal level, it creates one teensy-weensy little issue when we turn our attention to society and its organization: the fundamental and essential assumption of a liberal democracy and a market economy is that you and I can know our own well-being and happiness, know it better than anyone else, and reason effectively about it. Thanks to research in neuroscience and behavioural psychology, we now know with some certainty that these assumptions are false. We are poor reasoners in general but especially about what we value. Additionally, many of our beliefs about our own well-being are demonstrably false (e.g., people remember happiness that they did not experience and forget pain that they did). So, if it is true that most of our beliefs are inadequately tested and that we can’t even make accurate judgments about what we value or think to be good, democracies and markets are, at best, arbitrarily organizing society and, at worst, guaranteed to do it poorly. Garbage in, garbage out, as the saying goes. And to be clear, this is also true for authoritarian strongmen, councils of nerds, and any other social-political system that depends on anyone looking deep within themselves to figure out who they are, what they value, or what they want to become. The root problem is the practical constraints of inquiry. There is no social architecture that will solve that problem for us.

What then of politics, society, and its organization, if we can’t count on people knowing themselves with any certainty? 

First, I think we need to recognize and accept that our present-day social and political habits, institutions, and systems are largely the consequence of chance (akin to biological evolution), prone to constant change, and persist only as long as we allow them to persist. They are an expression of our need to organize ourselves, they reflect the environment in which they developed, and they emerge like any other natural phenomenon. They can become better or worse (relative to a host of benchmarks), none of them will magically persist over time, and there is no reason to think that solutions from hundreds and even thousands of years ago will work for today’s challenges. We need to accept that society’s organization is an ever-evolving and accidental by-product of the on-going effort to solve many different, discrete and often intertwined problems.

Second, I think we need to get out of the habit of appealing to any claims that rely on introspection alone, in the same way that we almost got out of the habit of appealing to claims about the one true God. There are a lot of well-tested and plausible beliefs that we can use to guide our efforts to organize ourselves and direct our problem-solving efforts. The challenge, of course, is that even well-tested beliefs don’t all necessarily point to the same conclusion and course of action. In those cases, we must resist the temptation to frame the debate in terms of largely unanswerable questions like “what’s best for me”, “whose vision of the good life is correct,” or “who worships the right God.” Instead, we need to look to well-tested beliefs, run good experiments, and always account for all the costs and benefits of whatever approach we settle on in the here and now, recognizing that with new evidence we may need to adapt and change.

Finally, for those of us who think that we should settle our disagreements based on well-tested beliefs rather than dubious claims grounded in introspection, we need to lead by example. I think this will primarily involve asking the right sort of questions when we disagree with others. For example, what well-tested evidence do we have for one conclusion or the other? What kind of evidence do we need to decide the matter? What experiments can we run to get the necessary evidence? We will also need to get in the habit of discounting our own beliefs, especially if they are based on nothing more than introspection or received wisdom. And this might actually be the toughest hurdle to overcome both personally and practically. It is very natural to become attached to our own bright ideas before they are properly tested. Once attached, it becomes much easier to discount the evidence against them. To further complicate matters, humans also seem to be too easily motivated to action by strongly expressed convictions that align with preconceived notions, whether they are well-tested or not. Asking for evidence before action and expressing doubts about one’s own convictions might not resonate with the very people we need to sway. Unfortunately, but not surprisingly, there is no easy, general, all-purpose way to solve this problem. People who want to motivate others to action will always need to strike the tricky balance between rhetoric and honest communication. We don’t need to be puritans about honest communication, but we also shouldn’t use the human condition as an excuse to spin outright lies — even in the service of thoroughly tested beliefs.

Descartes is often credited with kicking off modernity when he famously doubted the existence of everything but his own thinking mind. In the very many years since he reached his pithy and highly quotable conclusion, we have learned a lot more about the best methods of inquiry and have developed a well-tested and always evolving understanding of the world. More recently, thanks to those methods of inquiry and their application in neuroscience and behavioural psychology, it is becoming increasingly clear that we can’t know much of anything from introspection alone — including ourselves. There is nothing you, I, or Descartes can know with any certainty by looking inwards for answers. Unfortunately, we continue to rely on habits, institutions, and systems which presuppose that you or I have privileged and certain knowledge about our own well-being, values, and optimal outcomes. This may partly explain — in conjunction with other issues (hello, massive inequality) — why liberal democratic political systems that rely on free markets are in crisis these days.

It was fashionable in the late 20th century to talk as if we had escaped modernism, but postmodernism, I think, only takes Descartes’ modernism to its logical conclusion, while willfully overlooking the fact that we humans have become pretty good at understanding the world around us. To set ourselves on a new path, to really escape the gravity well of modernism, we need to set aside the Cartesian notion that the aim of inquiry is absolute certainty and that such certainty can be found through introspection. Instead, we need to accept that we really don’t know ourselves, whatever our heartfelt convictions might tell us, and look instead to well-tested beliefs to guide and organize our lives, both individually and collectively.

Who died and made content king? Survivorship bias, confirmation bias, and a farcical aquatic ceremony

When I first started using social media, thirty Helens agreed: “content is king!” 

And, at the time, it certainly felt that way. Perfectly crafted tweets seemed to be retweeted again and again; insightful blogs seemed to lead to comment after comment; great articles were always bookmarked. 

I suspect, however, that content looked kingly only because we content creators looked at tiny samples of high-performing content and jumped quickly to conclusions. Survivorship bias ran rampant; it was primarily the bias of content creators at work, and content creators really, really wanted to believe that expertly crafted content could compel others to action.

Much later, in the early days of live streaming on Facebook, a video I shot and shared live went “viral”. It received something like half-a-million views in twelve hours or so. For a social media nerd like me, let me tell you, there is no greater thrill than hitting refresh every few seconds and seeing the number of views on your post jump by hundreds and, at times, thousands. Like slot machine enthusiasts everywhere, I found the bells and whistles almost more important than the jackpot itself.

And, on the face of it, it seemed like the sort of video that should earn a lot of attention. My phone had captured a pretty special moment in a powerful story, even if the video quality was questionable and the audio mediocre. The story — we content enthusiasts had been telling ourselves for years — was much more important than the technical specifications of the media that shared it. And, this video was a perfect case in point! A live, raw and powerful moment was the stuff of social media glory! I had always known it, but now here was the proof! One more bias was joyfully confirmed.

Then, I watched that short video of a woman laughing in a Chewbacca mask. Do you even remember it? It was the video that blew up in those early days of live streaming on Facebook. Sure, it was vaguely amusing, but was it really that share worthy? Was it really earning all those views and engagements? Was this really the kingly content that the social media prophecy had foretold?  

Then, it occurred to me: Facebook had just launched its live stream functionality and they wanted it to make a splash. My phone had been rattling every two seconds to let me know whenever anyone streamed live for the first time. Moreover, because it was a new service, it had appeared on my phone using the default settings for notifications, which is something like “maximum racket.” In other words, Facebook was making every effort to put as many eyeballs as possible on any content that was shared live.  

Facebook’s effort to boost the visibility of its live stream service should come as no surprise. They wanted people to use the service right away and they wanted those people who used it right away to experience success right away. Easy success would hook users and those who were hooked would talk it up to others. The first hit is always free. 

I am reminded of all of this because of a recent article about TikTok and the author’s naive attempt to explain why some videos on this service have earned big numbers. To be blunt: I wouldn’t be at all surprised if the people running TikTok are specifically manipulating things behind the scenes to generate big media-story-worthy numbers. You are the product, after all; they need you to be active; and, what’s a few inflated numbers between friends?  

However, even if the people running TikTok aren’t intentionally manipulating the numbers, there is a much more plausible explanation why some content is getting more attention than other content. Dumb chance. When enough content gets in front of enough people, some of that content will earn more attention and, from there, it can snowball. That’s it; that’s all. There is nothing in the content itself that will definitively explain its success. In the same way that we can’t know in advance which genetic adaptations will lead to an organism’s reproductive success, we can’t know in advance which features of our content will lead to its reproductive success.

Circling back to those early days of social media and the quest for the holy content grail, if there was any truth in our collective hope that content is king, I suspect it was this: the experience of kingly content is probably symptomatic of the fact that humans tend to socialize with people much like themselves and become more like the people with whom they socialize as they socialize with them. 

So, at the outset, specific social media channels were attractive to a particular community of users who were already pretty similar in terms of interests, values, and identity. There wasn’t a lot of content being created, so any content that was shared was bound to earn whatever attention was out there to earn. Because the people using the tools were already pretty similar, they came up with similar theories to explain the success of some content and those theories became self-reinforcing. As people shared content that fit their theories of success, the successful content was more likely to match the theories because there was more content out in the world that aligned with the theory. For example, if you claim that red aces are always drawn because they are special and you add more red aces to the deck every time one is drawn, your theory is bound to look true whether there is anything special about red aces or not. 
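The red-ace deck can be sketched as a tiny urn-style simulation (the function name and the exact numbers are mine, purely illustrative): start with a standard deck’s two red aces, and every time a red ace is drawn, add another one back in, the way a community that shares more of what fits its theory keeps planting theory-confirming content in the world.

```python
import random

def red_ace_share(draws=10_000, reinforce=True, seed=42):
    """Share of draws that come up red ace, drawing (with replacement)
    from a deck that starts with 2 red aces and 50 other cards.
    If `reinforce` is set, every red-ace draw adds one more red ace
    to the deck: the theory plants its own evidence."""
    rng = random.Random(seed)
    red, other = 2, 50
    hits = 0
    for _ in range(draws):
        if rng.random() < red / (red + other):
            hits += 1
            if reinforce:
                red += 1  # the deck is restacked in the theory's favour
    return hits / draws

print(f"fixed deck (chance alone): {red_ace_share(reinforce=False):.3f}")
print(f"self-reinforcing deck:     {red_ace_share(reinforce=True):.3f}")
```

Left alone, red aces surface at roughly their base rate of 2/52; with reinforcement, their share climbs steeply over time, even though nothing about a red ace itself ever changed.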

Eventually, these theories about what made content shareable, engaging or whatever were internalized as norms, values and aesthetic sensibilities. In this context, content starts to look kingly and almost magical because its attractiveness is rooted in a sense of “we”. We are the kind of people who think a tweet will be more engaging if the hashtag is at the end of the copy instead of at the beginning, so we see it as such and act accordingly. In other words, the apparent kingliness of content is an expression of a particular community’s sense of shared identity. If a particular community of we has power and influence, it will influence the tastes of other communities. And so on.

But here, I think, is the nub of the matter: this isn’t some kind of social media gaffe or millennial voodoo. It has always been like this for all content everywhere. The success of content is best explained by the communities that behold it, their sense of “we”, and their power and status. Shakespeare’s plays, for example, seem kingly to us only because an influential group of people took a liking to them at a time when there wasn’t much competition for people’s attention. When you are the only show in town, it is very easy to set the taste.

If I am right about this (and I’d bet that I am not the first to claim it), I suspect a lot of content lovers and creators will react to my conclusion with nihilistic rage. “If there is nothing in the artefact of creation itself that guarantees success or could guarantee success, what is the point of creating at all? Why create if what is produced is of secondary importance or, dear god, not important at all? Oh woe is us!” However, I want to make the case that this frown can be turned upside down.

On the one hand, if your aim is to create content and be recognized as a content creator, the path forward is pretty simple: do your best to ingratiate yourself with whatever community is the tastemaker community for the kind of content you want to create. Meet, greet and emulate. Play the game well enough and long enough, and you will probably get a shot at shifting the community’s taste. No magic or special natural gifts required. You don’t need to be the anointed one. Being pleasant and patient should do the trick.

Alternatively, if you enjoy creating content for its own sake and have no particular desire or need to be recognized as a content creator by the relevant tastemaker community, you are free to create in accord with whichever standard(s) you want. Who cares what the tastemakers think? They no longer control the means of creative production or distribution. Go forth and create! Celebrate the fact that you have enough time and the means to create, even if no one is looking. On the other hand, if it turns out that you don’t want to suck up to tastemakers to earn a living as a content creator and have better things to do with your time than create for the fun of it, so be it. The choice is yours and, to be frank (you be Jane), having that choice is pretty lucky too.

I can think of only two groups of people who will be in a jam: those who desperately want to be recognized as content creators but don’t want to suck up to the relevant tastemaker community, and those who are ignored by that community even when they do suck up. For them, only Nietzschean frustration awaits.

If you are among this lot, I can offer only this advice: storm the tastemaking gates until you are accepted, ingratiate yourself with a marginalized or underserved community and hope their day is yet to come, or ride the early adopting wave of some new technology like the printing press or social media. However, whichever path you take, please remember: if you end up holding something that feels like a sword of divine right, the underlying mechanism that provided it to you remains the same, whether you were finally picked by the cool kids or the uncool kids somehow suddenly turned cool. The sword doesn’t make you or your content king; nor does the farcical aquatic ceremony that put it in your hand. Instead, it is the community who thinks of “you” as “we”.

My answer to the ultimate question of life, the universe, and everything: four four through it all

If the mystery of the human condition can be characterized as a kind of puzzle or riddle, the answer and/or punchline can be aphorized, I think, through four banal facts and four mollifying delusions.

I can’t say that anyone will necessarily gain anything by knowing and understanding these facts; nor can I say that they will gain anything by ridding themselves of the delusions.

If anything, I am pretty sure the delusions persist precisely because they are useful to most people most of the time. Whether they become more or less useful will be settled by evolution eventually — and not by you or me.

Four banal facts:

  1. Almost all human behaviour is perfectly predictable. Some human behaviour may be random or the result of chance.
  2. Human behaviour and all the products of human behaviour are expressions of the human disposition to allocate resources according to status.
  3. Human society and its organization requires the exercise of power. The risk of abuse is omnipresent. Some will guard against it; some won’t.
  4. We die and will be forgotten.

Four mollifying delusions:

  1. Humans have free will and are masters of their own destiny.
  2. The truth will set you free.
  3. Democracy is the worst form of government, except for all the others.
  4. Immortality is possible.

That’s it; that’s all. If you like my solution or enjoy talking about the puzzle, let’s start a club. You bring the (alcoholic) punch. I will bring the (vegan) pie.

The spade has turned: from weltschmerz to mono no aware and back again

After many years of digging, I am pretty sure I have hit bedrock.

The spade has turned.

Human behaviour and all the products of human behaviour are best understood as expressions of the human tendency to allocate resources according to status.

Practically speaking, a person’s survival depends on their membership in a group and their status in it. Additionally, success — however it is defined over and beyond survival and reproduction — also depends on and is defined by an individual’s membership in a group and their status in it.

There is nothing — absolutely nothing — that trumps, circumvents or transcends the ceaseless churn of seeking, gaining, maintaining and losing the approval of others.

And so it goes. Fine weather, isn’t it?

Unreal city: a black hole of dazzling light

A picture of the unreal city at night.

I remember the moment, but I can’t place it in time.

We were returning to Waterloo from Toronto. It was night. The stream of lights heading east on the 401 was an endless milky way.

It struck me: behind each set of quivering headlights, there was at least one person. It struck me: on one side of this narrow strip of highway, heading east towards a moderately-sized metropolitan city, there was a galaxy of human experience, unique, distinct, breathing and, like me, living at the centre of its own universe. It struck me.

At high school in Ottawa, I remember it often felt like I was surrounded on all sides by unknown, colourless, cardboard people, who reappeared over and over again like the recycled backgrounds of a low-budget cartoon. Many years later and long after the moment of dread on the highway heading west from Toronto to Waterloo, it struck me: I was as colourless and unreal to the unknown others as they were to me.

I now live in Toronto, a city as unreal as Eliot’s. From my window, I see towers and towers of existences. When I walk to and from work, there is always a hornet’s nest of activity. When I shop within minutes of my home, I see faces that I know I will never see again. Like a shovel of dirt from a wild and healthy field, these few blocks of my existence are teeming with life.

If I reflect for too long on the scale of life in this city and on this planet, it obviates me. If I focus instead on the energy, colours and details of this urban microcosm, I am dazzled by it all, and happy to play the role of cardboard cutout to the unknown universes of life booming and buzzing around me.

Oh, unreal city, at the centre of a black hole where all light is trapped, could it be as dazzling as this?

My own private iconoclasm: making the word flesh once and for all

After many, many years of reading, writing and thinking, I have arrived at the rather unremarkable conclusion that reading, writing and thinking are neither important nor unimportant in and of themselves.

They are human activities like any other and, as such, their value is ultimately determined by other humans. They can influence others — if they influence at all — only because of the values and valuings of families, peers, and communities. They can’t convince, compel, or convert on their own. They do not have quasi-divine and human-independent power to influence humans and their affairs.

I mention this only because I suspect that I may have implicitly believed all these years that reading, writing and thinking did have quasi-divine powers, even if I can’t recall ever explicitly thinking to myself, “if I read, write and think just so, people will have no choice but to understand and agree.” Why else would I spend so much time reading and rereading, writing and rewriting, thinking and rethinking? Of course, I enjoyed it, but there are many other enjoyable activities I might have pursued instead. The intensity of my dedication seems to imply that I was hoping for something more.

School, university and academia probably helped to engender this implicit hope for the quasi-divine power of reading, writing and thinking. From the earliest days of school until the very end of academia, I was taught that the correct reading, writing and thinking would produce and, perhaps, even compel the appropriate mark, degree, or publication. It was as if there was a kind of magic at work — a magic that inevitably produced success when it was invoked correctly.

The implicit hope for the quasi-divine power of reading, writing and thinking was also stoked in the early days of social media. Time and again, it was (and is) claimed that there is a uniquely correct way to succeed at social media. Do it correctly, we were (are) told, and the followers, likes, pageviews and advertising dollars will inevitably flow. In the end, we have learned that there isn’t anything entirely unique about social media. Like any other human activity, there are many familiar but not entirely certain paths to success and failure.

I suppose William Carlos Williams also contributed to my implicit hope. As a young man, I was entranced by the idea that he remained a doctor, lived in Paterson, New Jersey, and, nevertheless, became a towering literary figure. According to the official hagiography, he opted out of the lifestyle of a poet but, nevertheless, became one of the greatest. I assumed, at the time, that it was the power of his words and talent that helped him overcome the geography of his choices. I see now that I overlooked the true power of the relationships he maintained.

I am tempted to be troubled by the non-divine nature of reading, writing and thinking, to characterize it as a problem, and to draw some profound conclusion, but I’ve been down that path too many times before to make the same mistake again. The absence of God only seems troubling if you characterize it as an absence, but to do so is a mistake. That which never existed can’t be absent because it was never present to begin with, no matter how it might have otherwise felt.   

The only real consequence of this realization is that I must give up on an ancient and essentially childish dream. Neither the bug-eating mystic in the desert, nor the stone-throwing philosopher on the mountain, nor the house-call-making poet in New Jersey can, by the sheer force of reading, writing and thinking, legislate on behalf of the world. Read, write, and think if you enjoy it, but don’t expect or pretend that it will have any more influence on humans and their affairs than counting blades of grass, memorizing all the digits of pi, or surfing off the coast of Maui. To influence human affairs, one must be a part of them. There is no escaping that fact of human existence.

Thus spoke Zarathustra.