Why aren’t more teams outside of sport playing Moneyball? Because they’re human, stupid.

After reading Michael Lewis’ Moneyball, I asked myself: why hasn’t this data-driven approach to the evaluation and recruitment of talent been embraced by more teams and organizations outside of professional sports? Why, after all of these years and with the very tangible success of professional sports to look to as an example, are we still evaluating and recruiting talent like we always have? 

After all, when you cut through the sound and the fury of Lewis’ tale, the innovation described in Moneyball is pretty straightforward. Billy Beane and Paul DePodesta of the Oakland Athletics use data to identify players who are undervalued by other teams and then sign them to contracts at a bargain price. Essentially, they get more for less by exploiting information the other teams ignore. It’s smart, but it’s also a tactic that every bargain hunter, thrifter and value investor understands. Because the core idea described in Moneyball is so straightforward and has been so widely celebrated, you would think (or, at least, I would) the data-driven approach to the evaluation and recruitment of talent described in Moneyball (or something approximating it) would have swept through all other industries by now.

Instead, it seems that most teams and organizations rely on recruitment practices that are probably older than baseball. You know the drill: after a largely arbitrary sorting process based on self-reported data points (e.g. a resume is pulled out of a hat based on a crappy keyword search or because a friend-of-a-friend recommends that it be pulled), the evaluation of a potential hire boils down to a highly subjective gut-check, which may or may not be based on an assessment of the candidate’s skills in highly artificial circumstances. A few reference checks later — which everyone agrees are useless — and, blammo, a new hire is being onboarded. If a professional sports team recruited like this, it would be out of business in no time. How is it possible that so many teams and organizations continue to recruit in this essentially arbitrary fashion? 

Overlooking the rhetorical nature of my question, you might reply, “well, probably because most teams, organizations and industries don’t have access to the kind of dataset baseball has. Baseball has always been kind of nutty for numbers.” To which I might reply: great point, Dave, but there is no necessary reason why a baseball-like dataset couldn’t be developed and maintained by, say, a professional association. Isn’t the market supposed to identify opportunities like this and fill them? Potential employers, it is fair to say, would probably pay oodles of money to access this kind of data, if it led to better and less costly hires. Moreover, I would quickly add, not giving you a chance to get a word in edgewise, because that’s how I roll, once someone is hired, a team or organization can create and maintain as much data about the new hire and their performance as they would like. So, if some hungry-for-success team or organization wants to evaluate a new hire based on their contribution to the success of the team or organization, generating the right kind of data should be a straightforward exercise once the person is onboarded.

Instead, much like the recruitment process itself, the evaluation of new hires seems to be largely a matter of feel. If a new hire “fits” into their new team and seems to contribute, the recruitment process is normally judged a success, whether or not the person measurably contributes to the success of the team or organization. To be fair, group harmony and team cohesion are always going to play a role in any team’s success. However, group harmony and team cohesion are very often a by-product of teamwork rather than a catalyst for it. Whether or not a person “fits” is probably irrelevant, so long as they make some effort to cooperate and work well with others. Proximity and time will take care of the rest. 

Before you interrupt me with another objection that I already have a clever reply to, it was probably around this point in my thinking and writing that the penny dropped. Duh, Sterling, of course, most organizations and industries hire based on “feel”, where “feel” more or less translates into, “yep, gut sure says that they’re like me.” We humans are tribal. From the very outset of our lives, we tend to form relationships and social groups based on physical proximity and physical similarities. Why would it be dramatically different for the workforce? Well, Sterling, I guess I was assuming that competition and/or the desire to achieve our aims would have nudged us to adopt more rational, coherent and less arbitrary approaches to building teams and organizations. Whether an organization is for-profit or not-for-profit, it makes much more sense to recruit people who measurably contribute to the achievement of the organization’s aims rather than people who just happen to look and talk like the friends-of-friends we have in common.

Think about it, if the jocks — of all people — have figured this one out, why hasn’t anyone else? 

Then, it was around this point that another penny dropped for me. Most people agree that Michael Lewis’ version of the events in Moneyball “torques” the facts for the sake of a more compelling story. In particular, it seems likely that there was far less conflict and debate about the data-driven approach Beane and DePodesta championed. Strictly speaking, once a certain caricature of scouts and scouting is set aside, the difference between player evaluation and acquisition as it was traditionally done in baseball and the approach described in Moneyball is one of degree rather than kind. Moreover, by the time that Beane and DePodesta had turned to data to drive their player acquisitions, amateur data aficionados had already been using data to dissect and criticize professional baseball’s approach to player evaluation and acquisition for some time. The notion that data could lead to better recruitment practices was already well and truly in the air.

It’s also important to remember that Beane and DePodesta were evaluating and recruiting players who had already been through a very long and very difficult vetting process. To be among the players who are even on the radar of being considered for a spot on a professional baseball team, a lot of people in the baseball community would have already vouched for that player in some way or another. It’s not like the Athletics were using data to recruit hockey goalies to be catchers or signing Tim from the mailroom. If a team is trying to decide between signing this guy and that guy, and everyone already agrees that both of them are part of the very exclusive club known as professional baseball, why wouldn’t you roll the dice and pick the cheaper guy if the data also seems to predict he would do fine? Shorn of Lewis’ drama, the Athletics faced a pretty simple choice. On the one hand, they could continue evaluating and recruiting talent as they always had and expect the same middling results or, on the other, they could take a chance on a newish approach broadly recognized as having some merit, generate results no worse than they might otherwise expect, and save money while doing it. Really, when you think about it, it’s a no-brainer, but, “the not-so-remarkable tale of safely entrenched insiders making an even safer bet that works out better than expected” doesn’t make for compelling dust jacket copy.

With all of that throat-clearing now well in hand (uh, gross), the answer to the question I started with is this, I think: teams and organizations outside of professional sports haven’t yet broadly adopted a data-driven approach to the evaluation and recruitment of talent because, all things considered, the age-old approaches work well enough; as a result, no well-established insiders have felt compelled to try something new. On the one hand, successful organizations tend to attract a lot of people who have already been vetted in some fashion. Randomly picking, more or less, among those people who present themselves for selection is probably a safe bet and, if random selection is a safe bet, why not also pick people “like me,” if it will make you and everyone else on the team feel more comfortable with the new hire? On the other hand, struggling organizations tend to cut employees rather than make new hires and, you can be sure, any hires they do make are going to be on the safe and familiar side. In other words, even after very many years of working together in groups to achieve different aims, it seems that we humans haven’t confronted any situation that would compel us to change how we recruit people or how we evaluate their contributions to our efforts. And, if it hasn’t happened yet, don’t hold your breath! Businesses fail every day and entire industries have collapsed over the years and yet these very negative consequences have not driven business or industry insiders to fundamentally and systematically rethink how they evaluate and acquire talent. If it hasn’t happened yet, I doubt it will happen anytime soon.

Now, if you are like me, at this point, you might be somewhat disheartened to realize that organizations build their teams using methods that wouldn’t look out of place on the schoolyard (e.g. pick that kid, he dresses like us!). However, if you are a normal human being, you are probably actually thinking, “Are you serious? Did you really only just figure out that hiring decisions are primarily an exercise of ‘like’ hiring ‘like’?” Well, sort of. I have always understood that humans have a habit of grouping together based on superficial similarities and excluding those who are superficially different, but I have always thought of it as a bad habit, which would eventually be broken, both at the individual and group level, either consciously as people and societies matured or unconsciously through something like competition. What has dawned on me (thanks to Moneyball and baseball!?) is that the human tendency to socialize, build teams and act collectively by looking for and finding people “like us” is so fundamental that nothing will ever compel us to change, other than a true evolutionary shift in our DNA, which, strictly speaking, is just a fancy way of saying, “if people who embrace difference reproduce more than all those other people who prefer homogeneity.” It’s “we like us” and “different like them” all the way down. 

Moreover, on a personal level, it is also dawning on me that whatever I have accomplished in my life is probably best understood as being a consequence of my similarities to others rather than my differences. I’m not a beautiful unique snowflake; I’m a me-too drug. And, yes, while I am one hundred per cent talking about social privilege, I am also driving at something that runs deeper. Returning again to evolution (which probably should be the subordinate clause that starts every discussion about human nature), in my experience, evolution is often characterized as a triumph of difference because it is a heritable difference in phenotype that leads to a reproductive advantage that, over many generations, leads to a new breeding population. Hurray for difference — so long as you overlook the fact that the difference is one tiny bit in a whole lot of sameness. Without the sameness, the little bit of difference wouldn’t ever take hold in a breeding population. To put it bluntly, if you are too different, your difference ain’t being passed along to anyone because you won’t get the opportunity to reproduce and, if you are really different, the breeding might not even work. In other words, what makes you and me human are the ways in which we are the same; insofar as we aspire to be unique, it is only possible because we are like others — and not in spite of it.

And that is the moral of a completely different after-school special than the ones I watched growing up.

The alpha and omega of our humanity: an all-too-familiar trope

It is a standard narrative trope. We’ve encountered and enjoyed it millions of times: life is not as it seems. Our hero is not what she appears to be. Behind the veil of illusion, there is a different and more profound reality to discover.

If I have previously reflected on the ubiquity of this trope, I probably concluded that it is so commonplace only because it is a very easy idea to hang a story around.

I am now wondering if there is something much more fundamental to the trope. I am wondering if an ability and willingness to treat direct experience as an illusion behind which a more fundamental reality exists is the ultimate source of our humanness.

Take, for example, religion and science.

On the one hand, religion tells us that the savage and unpredictable storm is really an angry god. On the other hand, science tells us the storm is really an atmospheric disturbance created by the interplay of fundamental laws. The explanations are different, but they both rely on the idea that the truth of the matter is very different from what we directly experience. For them both, direct experience is an illusion behind which a more fundamental reality exists.

With these examples in mind, take a closer look at learning, creativity, language, consciousness, hermeneutics, the human reproductive cycle — really, just about anything fundamental to understanding humans as humans — and the trope turns up time and again. It seems to be as ubiquitous in reality as it is in our stories. If that is correct, perhaps, the trope appears in our stories so often only because we are fundamentally the kinds of beings that make sense of the world from that perspective. Perhaps, our stories mirror and reinforce an innate way of looking at the world.

Now, if this is true, here is a curious thing.

True happiness, the sages often tell us, is found only when we learn to appreciate the here and now, our given circumstances, the moment. Unhappiness, we are told time and again, is rooted in an inability or an unwillingness to appreciate the inherent value of direct experience. We suffer unnecessarily only when we grasp for superfluous wants beyond the here and now.

If the sages and I are both correct, it looks like our ability and willingness to treat direct experience as an illusion behind which a more fundamental reality exists is not only what makes us fundamentally human but also the root cause of our unhappiness. We are our most human, it seems, when we treat the given as something to be looked through and dismissed. We are also our most unhappy when we fail to appreciate what we are experiencing in the moment.

My highly speculative and totally-talking-out-of-my-ass theory to make sense of this apparent conflict is that the species is coping with a recent adaptation. Our brains, at some point in our recent evolutionary history, developed an ability to treat direct experience as an illusion behind which a more fundamental reality exists. From this adaptation, many of our most distinctively human traits have sprung. We are, nevertheless, fundamentally mammals and, for most of our evolution, we were animals that took direct experience as a given and succeeded because of it. We carry both traits in us now because they both helped us to succeed over the course of our existence.

The big worry for me, however, is that the ability to treat direct experience as an illusion behind which a more fundamental reality exists might actually be a maladaptive trait. Because of it, we dominate and control the environment like no other species and are reproducing at a frenetic rate. On first impression, this seems like the very definition of an evolutionary win. However, if our species gets wiped out in the next century or two because of our domination of the environment and frenetic population growth, that will be an indisputable loss. Our “success” might be so fleeting in geological terms that in a few thousand years no trace of us will remain beyond a curious spike in carbon emissions. If that is the case, as a species, we would have been much better off never developing the traits that allowed us to dominate nature and reproduce so frenetically.

There, of course, remains an outside chance that enough people will recognize the reality of what is coming and act together to make the dramatic changes necessary to avert the species’ oblivion. Perhaps, our day-to-day existence will become so difficult that we will have no choice but to change our ways before it is too late. There is even the faint hope of some kind of technological fix. And while the colonization of other planets is also feasible, abandoning the ship does not really seem like much of a solution or a victory, when we were the ones who scuttled it.

It seems we have painted ourselves into the corner of a familiar story. A catastrophic outcome is inevitable and only a miracle will save us. Is there a plucky band of misfits assembling now who will save us, thanks to their courage and conviction? Perhaps, a higher power or powers has already picked the chosen one and will reveal his or her true destiny shortly. Perhaps, that flickering light is not a star but a starship racing towards us, laden with the technology and know-how we will need to survive and flourish. One can only hope that there is some reality in these well-worn fantasies, but that in itself is an all-too-familiar story.

The condition of my humanity: arrogant humility

It would be fair to say that I have spent most of my life thinking about the human condition.

The catalyst for this lifelong reflection was the profound realization, at the age of nineteen, that God does not exist. At the time, it seemed that the fact of God’s non-existence was a big deal. I also thought that a full and proper understanding of this fact would have profound consequences for the way I, you, all of us should live. I expected profound consequences because we live in ways that have been built on and around the idea that God exists. Remove the keystone of God’s existence, I thought, and the structure of everything would fall away, and we could rebuild everything anew. I read, I argued, I taught and, in the end, I realized that God’s existence or non-existence is pretty much irrelevant to deciding how we should live.

Then, it occurred to me that capital-T truth does not exist. It seemed to me that this was the fundamentally important fact, for more or less the same reasons that I thought God’s non-existence was so important. Again, I hoped that if I thought long and hard enough about it, I would identify some profound implications for the way I, you, all of us should live. I read, I argued, I taught and, in the end, I realized that the existence or non-existence of capital-T truth is as irrelevant to how we live as the existence or non-existence of God, for more or less the same reasons. Whatever you or I may believe about the nature of truth, it doesn’t really matter when it comes to deciding how we should live.

Then, it occurred to me that a fully naturalized and evolutionary understanding of consciousness was the key. Because culture and society begin and end with humans, it seemed reasonable to conclude that a better understanding of the human nervous system would lead to profound implications for the way I, you, all of us should live. Moreover, for the first time in human history we had tools that allowed us to exorcise the quasi-divine conception of self we had inherited from our ancestors. The moon may have already been conquered by others but we are the first humans to tread on the very stuff of the human condition. And while it remains theoretically possible that some unimaginable discovery yet to be made will falsify the conclusion that I am about to share with you (and that you should really be able to anticipate by now), after reading, arguing, and teaching, I have reached the conclusion that we will never be able to draw unassailable and universally compelling conclusions about how we should live based on a fully naturalized and evolutionary understanding of consciousness either.

The crucial words here are “unassailable” and “universally compelling”. With the benefit of hindsight, I see now that I was hoping to find a conclusion, a claim, an idea, something that would win in every argument and always compel all others to action. I was doing what prophets and priests and philosophers and warlords have been doing since time immemorial. I was trying to derive an “ought” from an “is” and hoping that the “ought” would be so magical and powerful that everyone would be swayed by it. The subtle and not terribly sophisticated difference is that I was trying to derive an unyielding “ought” from a “not is” instead of an “is.” Rather than saying, “x, therefore you must do y”, I was saying (or hoping for), “not x, therefore you must do y.” For example, instead of “God is love, therefore, we should do good,” I was hoping for “there is no God, therefore, we should do good.” And while it remains intuitively plausible to me even now that there is some special significance in the fact that things like God and capital-T truth don’t exist, I know that it is as nonsensical to draw unconditional moral claims based on what is not as it is to draw unconditional moral claims based on what is.

And, as important as that conclusion may be, the far more important insight, I think, is that the very idea of an unassailable and universally compelling argument is a coercive fantasy. It is essentially the hope that might and right are identical and that rightness can in and of itself compel others to believe and act. It is also an idea that leads, I think, either to passivity or to oppression because, if right and might are one and the same, either unpopular beliefs are not quite right or there is something not quite right with everyone who fails to accept and act on beliefs we think are right. If a belief, idea or way of life fails to compel acceptance and motivate action, we either think less of that which was not compelling or think less of the people who failed to be compelled. So, either we end up believing and doing nothing because the burden of proof is impossibly high or we do whatever we want because disagreement is proof that those who disagree with us are somehow broken or not fully human and, for this reason, don’t deserve our consideration and can be compelled to do anything we want.

It’s also crucial, I think, to realize that might comes in many forms, is expressed in many ways, and is never in itself a measure of rightness whatever its form or expression. Most people, for example, would probably now accept the notion that the strength of a person’s muscles has no bearing on the validity of their beliefs, and yet many today still believe that the strength of a country’s military or its economy is a measure of the rightness of its moral and political values. Vote-getting, profit-making, and fundraising are often thought to be legitimate measures of rightness but they really only indicate what can attract votes, profits, and charity at any given point in time. An argument, a speech or an essay may be persuasive, but this in itself is proof only of its persuasiveness. Charm may be non-violent, but there is no reason to think that a consensus built on it is any more true than a consensus built on fear. Might comes in many forms, and it never makes right — even when it is expressed in a way we admire or by people we like.

I should, nevertheless, be explicit on this point: coercion is an inescapable fact of social and political life. We must sometimes coerce people to do things they would rather not do (remember: forcing people not to interfere in the lives of others is a form of coercion too). However, we should always coerce cautiously and from a place of humility, respect and empathy, recognizing that there will be times when we will also be coerced to do something we would rather not do. Most importantly, we must never conclude that our ability to force a person to do something that they would rather not do proves anything about the merits of our beliefs, our way of life or our worldview. Coercion becomes oppression, I think, precisely when we start to believe that our might — whether it be physical, intellectual, emotional, financial, electoral, anything — is proof that we are right. It is one thing to force people to comply with, say, a political or legal decision with which they do not fully agree, while at the same time recognizing that the decision may be imperfect. It is something altogether different to force compliance and, at the same time, insist that coercion would be unnecessary if only those who were being coerced were more rational, compassionate, or open-minded — or whatever term we might use to signal that they are to blame for not seeing it our way. We must, I think, always remain mindful of the fact that any one of us — and not just those people who we think are the bad guys — can walk the path of good intentions from coercion to oppression.

With that important caveat in mind, we must, nevertheless, carry on living and, in my own case, I have come to embrace an attitude of what might be called arrogant humility. I’m arrogant enough to think I have a pretty good shot at making pretty good judgments about what is or is not the best course of action in most situations, when I do the work to gather and consider enough of the relevant evidence. I am also humble enough to accept that I often get it wrong, that I have blind spots, and that some of my most cherished and well-considered beliefs might be totally wrong. In short, I’ve come to trust my judgement, while at the same time accepting its limitations and failings. I am no longer looking for something — or a not-something — to validate my beliefs, decisions and failings.

I will not, however, claim that all people should necessarily adopt this attitude. I can’t ignore the fact that much good has come from people who have put their faith in God, who pursue the Truth, or stand their ground in the name of moral facts that they consider to be self-evident. I am also well aware that much evil has been done in the name of God, Truth, and indubitable moral facts written into the bones of nature. However, when I consider the evidence available, I am not convinced that these attitudes necessarily lead to good or evil. Whether a person has faith in God or in their own judgement, they must consider the evidence and make judgments based on it. They and I may sometimes disagree over what counts as admissible evidence, but a shared commitment to the fact that might does not make right and right does not make might seems to me to be much more important than a shared opinion about the nature of God.

And once I set aside worries about the existence or non-existence of God, Truth, and Human Nature, it was much easier for me to see that there is both too little and too much to say about the human condition. From one perspective, we are simple, fleeting and trivial creatures who, like all the other quirks and quarks in a cold, vast and indifferent universe, are, in principle, perfectly predictable. From another perspective, the human condition is an unimaginably rich and cacophonous kaleidoscope of boundless possibility and each human life is unique, beautiful, and precious. The human condition is a lot like the weather, I think. Seen from on high, it is simple and perfectly predictable, but, closer to the ground, it is complex, varied and difficult to predict, and, at the eye of the storm, no two storms are ever quite the same for those who experience them — no matter what the experts, instruments, and equations may say.

And that’s all I have to say about that (I think).

As Insignificant As A Star: The Brief Light of Consciousness

“We’re made of star stuff,” Carl Sagan famously quipped.

Sagan makes this claim, in part, because of what we are made of. We humans, like all other animals and much of the living matter on Earth, are made of carbon, nitrogen, and oxygen. These elements, we know, were created in stars long ago.

Sagan also makes this claim because he wants to make us feel special. He adds, in a curiously Hegelian turn of phrase, “We are a way for the cosmos to know itself.”

In this way, Sagan adds his voice to a chorus of opinion about the nature of human consciousness. Like Sagan, many other people want to characterize the fact of our consciousness as something profoundly special. They want human consciousness to be much more than one more mere phenomenon of the universe. Sagan wants us to feel special because we are conscious of the universe and can come to know it.

Sagan’s claim about the specialness of humans, however, like all such claims, does not make much sense.

Yes, we are made of matter that originated in stars. That matter, however, has existed in one form or another for billions of years. It will exist for billions more. The amount of time it will be animated by our consciousness is imperceptibly short. From this perspective, consciousness and whatever it might come to know is of no more or less significance than anything else.

Consciousness, nevertheless, is precious to us. From our perspective, it should be. Its temporality, its finitude, its ephemeralness, its very nature shouldn’t diminish its preciousness to us. It only seems less precious, I think, when we fantasize, like Sagan, about its special significance.

We humans seem to have a desperate need to make ourselves out to be much more than we are. Even a cosmologist like Sagan, who is all too aware of the vastness and scale of the universe, succumbs to this desperation. It is this desperation to be more than we are, I think, that leads either to hubristic fantasy or pointless nihilism.

Instead, we should accept and embrace our indifferent and fleeting place in the vastness of the universe. It is, after all, the most plausible account of our place in the universe. It may also be the key to truly enjoying our brief time as conscious and experiencing matter here on our pale blue dot of a planet.

*

SUPPORT MY THINKING AND WRITING ON PATREON

*

The Performance of Teaching: Not Ready For Its Close Up.

Teaching, I’ve discovered, is a bit like theatre.

I’ve always known, of course, that performance is an important aspect of effective teaching, especially when the size of the class is more than a handful of students.

I’ve now learned that the kind of performance involved in teaching, like the performance involved in theatre, does not translate directly to video very well.  

I learned this recently while developing a video version of Brains, Minds, and Human Nature, a course I developed and delivered for Carleton University’s Learning in Retirement program.  

Originally, I had imagined I would make the video version of the course simply by delivering and recording new lectures using the lecture notes and slides that I used for the class. As soon as I tested the idea, a few weeks ago, I realized it wouldn’t work.

There’s a casualness of speech and tone in classroom teaching, which doesn’t transfer well to the detailed attention of an audiovisual recording. Similarly, a recording requires a pace and intensity that would be over the top in the classroom.

Repetition, in order to reinforce key details, is essential in classroom teaching. In an audiovisual recording, which can be stopped and played again immediately, that kind of repetition quickly becomes tiresome.

After a few false starts, I developed a script for the video which is much shorter and much more focussed than I thought it would be, covering only a few of the ideas I presented in the course. It will work, I think, but it will be different from a formal course.

Hopefully, it will be ready for sharing fairly soon, depending on the approach I adopt for its visual components. I’m considering a simple approach and a more elaborate approach. I’m inclined to keep it simple, but I won’t know for sure until I get the audio recorded and drop it into a video editor.

If you’d like me to send you a link to the video, when it’s posted, drop me a quick note at sterling.lynch@gmail.com or leave a comment below.

My Learning In Retirement Course Wraps Up: More to Come!

Six weeks goes quickly when you’re preparing new lectures and delivering them to a group of highly engaged and attentive students.

My Learning In Retirement course, Brains, Minds, and Human Nature, wrapped up at Carleton University last Tuesday.

I haven’t received the official feedback yet, but I expect it will be positive overall.

I’d say most people in the class enjoyed the course and my lecturing style. A few people even took the time to express their enthusiastic appreciation directly to me. One woman told me that she enjoyed my humor. I’ve also got a few more Facebook friends.

For me, it was an exceptional experience. It’s rare to have the rapt attention of any number of people, but all the more rare while teaching. In sharp contrast to most undergraduate classes, everyone present wanted to be in the classroom and was very eager to engage with the ideas I presented. 

If you’d like to learn more about the course, I produced fairly detailed lecture notes to go along with my slides. Take a look below and send me your thoughts.  

  1. Introduction and Overview
  2. The Politics of Your Brain: Anarchy Not Monarchy
  3. The Unconscious: An Altogether New Kind of Beast
  4. The Geography of You: Where Do You Begin and End?
  5. You’re Not in Charge: Free Will and Moral Responsibility
  6. Exorcising the Ghost in the Machine: A New Understanding for an Old Vision of Self

I will also convert this course material into a series of YouTube videos. I expect the videos will have a slightly different tone, given the nature of the medium, but I expect the series will cover much of the same ground that the course did.

If you’d like to be notified when I produce and post the first video, send me an email: sterling.lynch@gmail.com. I’m also available for tutoring — one-on-one or in small groups. 

Think you’ve got a lecture series in you? Contact the good people at Learning In Retirement. The program is exceptionally well-administered. I highly recommend the program to any teacher who is ready to bring their passion to a group of people who are ready to share in it and to learn from it.

The Geography of You: Where Are Your Borders?

Where do you begin and where do you end?

If you’re like most people, your answer to this question is probably something like, “I begin inside my skull, at a spot about an inch or two behind my eyes, and I extend only as far as my skin.”

Other than the feeling that this is the extent of your geography, is there any other reason to believe that these are the true borders of you?

Perhaps, and probably not.

Your sense of your identity’s geography, like everything else about you and your mind, has roots in your brain. Moreover, the parts of the brain responsible for this feeling can be influenced, damaged, and manipulated to change the feeling of where you begin and end.

In controlled experiments, for example, subjects can be induced to believe fake rubber hands, mannequins, and even other people are a part of who they are — in the same way that you currently believe your hand is a part of you. Similarly, damage to the brain can cause a person to deny that one of his limbs belongs to him — in the same way that you are likely to deny that another person’s limb belongs to you. Last but not least, a person can be induced to believe, by seizure activity, intentional stimulation of the brain, and psychoactive chemicals, that they exist outside of their body — in the same way you think you exist inside your body now. In other words, that feeling of where you begin and end is not set in stone and is open to influence and manipulation by stimuli in your environment.

Once we recognize and accept this fact about our sense of self, it becomes much easier to second-guess the presumption that a mind — yours or mine — necessarily originates in one body or brain. If the feeling of where a person begins and ends can change depending on how the brain is stimulated, there doesn’t seem to be any reason to accept as natural and given the very modern notion that a mind is something that originates in and, ultimately, belongs to one body or brain. We might even come to question whether or not this modern notion is the correct understanding of the relationship between a mind and the environment in which it emerges.

For example, many indigenous people often talk as if the land is a part of who they are, in a very concrete sense. There is an easy temptation to understand such talk allegorically, but, if a brain can be induced to believe that a fake rubber hand is a part of its identity, presumably a brain can also evolve to see the land around it as a part of its identity, in a way that is as concrete as the feeling that your hand is a part of you.

More importantly, we can and should turn this observation on its head and ask, instead, if it is our modern sense of self that has been distorted, say, by colonialism and capitalism. It is, after all, much easier to exploit other people and the world around us, when we believe that our identity extends no further than our skin.

Intrigued?

If this line of reasoning has piqued your interest, please take a look at some of my other posts that discuss our new and growing understanding of the brain.

If you’re feeling more ambitious, take a look at Robert A. Burton’s A Skeptic’s Guide to the Mind. It’s a friendly and accessible introduction to a fascinating topic and discusses some of the research I’ve mentioned in this post.

I’m also in the process of developing a little (and free!) online course, which will explore the implications of the research described in Burton’s book (and others) from a philosophical perspective.

If you want to be the first to receive what I develop, sign up for my email list or subscribe to my YouTube channel.

If you would prefer a personal guided tour through this research and its implications, let’s talk.

The Politics of Your Brain: Anarchy in the You, ‘kay?

For as long as the Western mind has thought about itself, it has thought of its nature in essentially authoritarian and paternalistic terms.

Whether it’s Plato’s charioteer driving two willful horses or Freud’s ego struggling to contain and direct the family feud between the Id and the Super-Ego, we Westerners tend to think of the mind as a kind of political community where one part of the mind — typically, the conscious mind — controls, dominates, and otherwise rules the unruly aspects of our nature.

Only Nietzsche, as far as I know, ever challenged this authoritarian account of human nature. When he examined the contents of consciousness, he did not see a single conscious mind ruling the roost. Instead, he saw a loose confederacy of minds, with one part of the conscious mind taking credit for the decisions and work of others — not unlike a king or modern day politician. According to Nietzsche, the King of the Mind thinks and feels like he is in charge, but the conviction is a self-serving illusion.

The latest psychological research and neuroscience are much more in line with Nietzsche’s understanding of the mind than the authoritarian model at the core of the Western intellectual tradition, and at the core of the model you probably use to make sense of your own day-to-day existence. It is becoming increasingly clear that your conscious mind is not in charge most of the time, and, as Nietzsche suggests, it is most often preoccupied with the task of justifying and accounting for decisions made elsewhere in the brain.

The metaphor that now comes up often when psychologists and neuroscientists discuss the interpretative function of your brain is that of the press secretary in American presidential politics. The press secretary is primarily responsible for weaving a narrative that makes sense of the giant multi-headed beast that is the US government, and has to do it in such a way that it is reasonably plausible for everyone to believe that the President is totally and unequivocally running the show at all times.

Unlike the President’s press secretary, however, who is primarily concerned with knitting the wool of a story to pull over the eyes of the press gallery and the public at large, our internal press secretary is as concerned with pulling the wool over our own eyes as it is with pulling the wool over other people’s eyes. In fact, all the evidence so far indicates that the press secretary in our brain is far better at fooling us than it is at fooling the people around us.

If this seems implausible to you, I am sympathetic to your incredulity. After all, the latest research on our brains flies in the face of one of — if not the — fundamental ideas of the Western intellectual and political tradition: the idea of the autonomous, self-aware, rational agent, who is the king of the castle of his mind. It is such a hard idea to accept and internalize that even a best-selling author and scientist, in a recent book, seems unable to do so, even when he is explicitly writing about the discoveries of this new research!

Michio Kaku, a professor of theoretical physics who also happens to host a national science radio show in the United States, writes in The Future of the Mind that he thinks the best analogy for the brain, given the new research, is that of a large, complex corporation with a special command center where the CEO makes the brain’s final decisions — even after quoting scientists who indicate, in those very quotes, that no one part of the brain is in charge!

For example, Kaku quotes Steven Pinker, a leading psychologist, who writes “the intuitive feeling we have that there’s an executive ‘I’ that sits in a control room of our brain, scanning the screens of the senses and pushing the buttons of our muscles, is an illusion [35].” Then, on the very same page, only a few sentences down from the Pinker quote, Kaku writes in bolded text, “Final decisions are made by the CEO in the command centre.” This is exactly the notion that Pinker, as quoted, has described as an illusion. It’s like a written example of cognitive dissonance!

To put the final nail in the coffin of Kaku’s corporate characterization of the mind, here are a few juicy quotes from Michael S. Gazzaniga’s Who’s In Charge? Gazzaniga, it should be noted, is also quoted in the section of the book where Kaku builds the case for his CEO metaphor.

  • “We have thousands, if not millions, of wired-in predilections for various actions and choices. […] The brain has millions of local processors making important decisions. It is a highly specialized system with critical networks distributed throughout the 1,300 grams of tissue. There is no one boss in the brain. You are certainly not the boss of the brain [44].”
  • “It’s a dog-eat-dog world going on in your brain with different systems competing to make it to the surface to win the prize of conscious recognition [66].”
  • “Our subjective awareness arises out of our dominant left hemisphere’s unrelenting quest to explain these bits and pieces that have popped into consciousness. Notice that popped is in the past tense. The interpreter that weaves our story only weaves what makes it into consciousness. Because consciousness is a slow process, whatever has made it to consciousness has already happened. It is a fait accompli [103].”

Gazzaniga is so sure of this new understanding of the brain that he thinks the only question left for us to confront is whether or not we can hold people responsible for their actions now that we know how the brain actually works. Remember, our entire legal tradition is built around the notion that our conscious minds can and should regulate our behavior, and it is becoming increasingly clear that our conscious mind, at best, can only tell a story about decisions that are made elsewhere in the brain, and often for reasons the conscious mind can’t possibly know! (For the record, Gazzaniga thinks there is “no scientific reason not to” hold people responsible for their actions, but I wasn’t entirely satisfied with the account he offers in this book. I will set that discussion aside for another day.)

So, all this is to say, if you are struggling with the notion that no one part of your brain is running the show of your life, and, of all the parts driving your behavior, it’s very rarely your conscious mind, then take solace in the fact that you are in very good company. Even a Professor of Theoretical Physics — who deals with difficult and counter-intuitive ideas in physics all the time — is having a hard time swallowing this particular pill.

And the reason why it is hard to swallow is simple to understand once you accept the facts. The entire edifice of the Western religious, moral, legal, and political tradition is built on a notion of the human self that is demonstrably wrong. To my knowledge, as of yet, no one has systematically assessed whether that edifice crumbles or whether it can stand on our new understanding of the brain.

Exciting days, don’t you think!?

As a first introduction to this conceptual revolution, take a look at Strangers to Ourselves or Who’s In Charge?

I’m also in the process of developing a little online course, which will explore the implications of the research described in these books (and others) from a philosophical perspective.

If you want to be the first to receive what I develop, sign up for my email list or subscribe to my YouTube channel.

If you would prefer a personal guided tour through this research and its implications, let’s talk.

High School Über Alles

1. Statistically speaking, the population of a typical high school is a representative sample of the community in which it is located.

2. Accordingly, the persons and behaviors encountered there are normally representative of the community in which the high school is located.

3. It seems (although, I know of no formal study) there are significant similarities concerning the kinds of persons and behaviors encountered in most high schools.

4. Therefore, it can be predicted that the kinds of persons and behaviors encountered in high school are representative of the kinds of persons and behaviors one will encounter throughout life.

5. In some circumstances, as persons age, they have more opportunity to select and determine which persons they interact with directly on a regular basis and, for this reason, some kinds of persons and behaviors may be encountered less often, but they are, nevertheless, still likely to exist in the general population.

6. I would have preferred to realize this earlier in life but am pleased to know it now.