Language Barrier

Pod-and-blog-fade seems to be running rampant in post-election libertarian and philosophy circles. I can’t help but wonder if it’s a combination of political hangover and something like a sigh of relief now that certain existential threats have been postponed. Everywhere else, lefty entertainment and philosophy podcasts and blogs have begun their four-to-eight-year pity party, wherein they cry about the president to the exclusion of any other form of content. Technically, that’s why I voted for Trump: to make these people cry… but I’ve got a bit of buyer’s remorse now.

Anyway, I’m back on the content-producing bandwagon. Today, I’m talking about words.


I expect most of my readers are well aware of the rules of grammar and have a decently expansive vocabulary. I’m not going to make a “top ten” list of fun punctuation marks… I mean, who hasn’t heard of an interrobang? I’m not going to share my fun story about arguing over ancient Greek grammar with Jehovah’s Witnesses (subject-object relationships are more important when you haven’t discovered punctuation yet). Instead, I’m discussing the philosophy of language in broad strokes.

As far as I can tell, most people haven’t critically examined the relationship between language and the world around them (unless they’ve smoked a lot of weed or suffered severe concussions). As such, most people have intuitively assumed one of two paradigms concerning the operation of language. If this describes you, understand that I’m not talking down to you; this subject is esoteric enough in the realm of philosophy to be compared to particle physics or the study of neolithic attitudes towards one’s in-laws. It is, however, an important issue to address when engaging in philosophical discussions.

Now that the disclaimers are out of the way, what are these two paradigms of language people assume? The first could be called “linguistic realism”: the belief that words and sentences directly correlate to reality (in some cases, one could even say that words and reality are commensurate). In the case of thinkers like Plato and Aristotle, the word “justice” is an actual expression of some form or concept. When a poor soul makes the mistake of using the word “justice” near Socrates, Socrates assumes that the man must know the Platonic form of justice so thoroughly as to be able to utter the word itself. Aristotle is a little more grounded, but he still assumes a sort of direct correlation between the word “justice” and manifestations in meatspace of someone “giving that which is owed”. In the modern age, that attitude is usually expressed by people who really enjoy Rhonda Byrne, people who think that bad words are bad due to some innate quality of the word itself, and people who deride the idea of words changing meaning over time, as well as the creation of new words. I used to be a linguistic realist.

The second paradigm of language could be called “postmodern nominalism” or “naive nominalism”. This position holds that words have very little correlation to reality; as a matter of fact, the best way to describe the position would be “the belief that words exist as nothing more than a game between individuals, wherein rules are made up concerning the meaning and use of words, with little to no relation to the world outside of said game.” In the case of thinkers like Peter Abelard and Ludwig Wittgenstein, the meaning of a word depends on something along the lines of social consensus and common usage. When I say “tree”, it only means “that thing growing out of the ground, made out of wood, and bearing leaves” if I am speaking to someone who comprehends English and understands the botanical context of the statement. In a different context, the term “tree” could refer to a shape, such as that of a propane tree, a family tree, or a decision tree. To a non-English-speaker, it may as well be any other set of phonemes: it’s pure gibberish. In the modern age, that attitude is usually expressed by people who really enjoy saying “a rose by any other name…”, people who think that bad words are bad because of some historical or class-related context, and people who live-tweet their Netflix-and-chill experience with their cis-gendered binary life-partner.

One of the clearest ways to delineate between these two positions is to inquire as to the nature of dictionaries. For example, if I hear or read a word I do not recognize, I obviously go to the dictionary… well… to Google’s dictionary, at least. When I read the definition of the word, I am reading one of two things: either the common context for the use of the particular term at the time of publication, or the “actual meaning” of the word. For example, if I were given the word “obacerate”, I would obviously have to google it or look it up in a century-old edition of the OED. When I get the definition “to interrupt one’s speech”, is that what the word means in some innate sense, or is that simply a description of how the word has been used in the past? If I were to begin using the word in colloquial conversation, would it mean “to interrupt one’s speech”, or could it take on a new meaning based on the context in which I use it or the context in which others understand it? If I only ever used the word “obacerate” when referencing covering someone’s mouth or punching them in the jaw, could the word take on that connotation?

If one says “the word means what the word means, regardless of context,” one is likely a linguistic realist. If one says “the word hasn’t been used for almost a hundred years; it can mean whatever society begins to use it as,” one is likely a naive nominalist. A more apparent, but less cut-and-dried, example would be the use of words like “tweet”, which could be either onomatopoeia for bird sounds or an activity which takes place on the website Twitter. If the word were to fall out of common parlance concerning birds, would the meaning of the word have changed once Webster cuts out the atavistic use?

As is typically the case, I get the feeling that most people who bother to read this far are asking themselves “Why do I care about this hair-splitting over words?” If you are, you are right to do so. In day-to-day conversation, words just mean what they mean. If there is a misunderstanding, we need merely exchange one word for a synonym or offer a definition to contextualize the use of a particular word. In philosophy (and, therefore, any sufficiently advanced field of thought), though, these sorts of distinctions become important.

For example, if I assume that words have innate meanings and are either direct representations of something or a sort of manifestation of the thing itself, then when I start talking about something like colors, thoughts, phenomena, property norms… you know, abstractions… it can get hairy if I’m speaking to someone with a different set of preconceptions about language. I’m a sort of compatibilist nominalist: I greatly appreciate Peter Abelard’s contributions to the philosophy of language, and I’m a recovering linguistic realist. As I will eventually get to in the 95 Theses, and have already covered in the Patreon subscribers-only content, the human experience appears to be one which takes place entirely within one’s mind.

Whoa. Hit the brakes. That likely seems either patently obvious or totally insane, depending on who’s reading it. It’s either obvious that one has a consciousness which navigates a never-ending stream of sense-data and never grasps a “thing-in-itself” beyond those sense-inputs, or it’s insane to start talking like a Cartesian or Kantian solipsist: of course one sees, touches, tastes, smells, and hears the world around them and discusses these things with others…

…Which is a divide similar to the one between the linguistic realists and the postmodern nominalists. As far as I’m concerned, though, my mind is locked away from the world and only sees it as mediated through sense organs, nerve connections, chemical emulsions, brain wrinkles, and more. The only way I can make sense of all those inputs is to pick out regularities and assign concepts to those regularities. Through this systematic approach to those sense inputs, one can create a noetic and epistemic framework by which one can interact (albeit through mediation similar to that of the senses) with the world outside of one’s mind.

After all that fancy noesis and epistemology is underway, it becomes useful to apply language to this framework. If I consistently see a woody creature growing from the earth and bearing leaves and fruit, and I wish to express that set of concepts to someone else (who is, to my senses, just another set of perceptions, but whom I assume to be someone like myself), it helps to have a name, a sound, a mark, etc. to signify that set of concepts. And so the basis for the word “tree” is created. Intuitive concepts such as causality, correlation, etc. also exist in that bundle of sense inputs and later receive names. Even if trees, causality, or a world beyond the phenomena don’t actually exist, the sense inputs I have mistaken for these things still do. The reason I bring up abstractions of relationships, such as causality, is that they seem to relate to certain aspects of grammar. For example, subject-object relationships and prepositions seem to presuppose these causal and abstracted relationships.

Now, of course, there are hundreds of years of philosophy of language at work here, and I couldn’t hope to give even a cursory tour, let alone a thorough examination, of my particular flavor of the discipline. The reason I tried to give this 2,000-word summary of the idea is twofold. First, I think this is an issue that underlies a lot of misunderstandings and disagreements at the more superficial levels of human interaction. From the comical dilemmas over who’s allowed to say “faggot” or “nigger” to the more fundamental issues of whether or not “rights” or “norms” exist and in what manner, these conflicting theories of language are at play. The 95 Theses will go into the idea more in-depth, and if the Patreon subscribers demand it, I’ll explore the idea further.

Second, I want to announce the upcoming glossary page on the website. I am often accused of mutilating language or using words in a way that only I can understand them. Less often, I’m accused of using too many technical words for people to keep up. I hope to remedy some of these issues by providing a cheat sheet of sorts to help people keep up with me and to understand what I am saying when I use words in a more precise way than they are commonly presented in dictionary definitions and colloquial use. Of course, I need feedback on which words should go in said glossary so, please, do comment on this post and send me emails about my abuses of language.

TL;DR: Philosophy of language is a very involved field of study, but nearly everyone is a philosopher of language, provided they speak a language. Even those who haven’t critically analyzed their understanding of how language relates to the world are walking around with a bundle of assumptions as to what they mean when they speak certain words, and as to whether those words have some innate quality to them or are just part of some social game being played with other speakers of the same dialect. Most of those assumptions can be categorized as either “linguistic realism” (words are directly related to things and act as an avatar of the things to which they relate) or “postmodern nominalism” (words don’t mean anything in and of themselves and only vaguely gesture at socially agreed-upon concepts). There are other, more nuanced positions one can hold, but usually only as a result of actively engaging in the philosophy of language, an exercise I strongly recommend for those who are able.

Liberty Classroom: an Invaluable Tool

If you are reading this near the end of November in 2016, you can get some major discounts and provide a great deal of support to the Mad Philosopher project by going to Tom Woods Liberty Classroom and subscribing.  If you are reading this at any other time, you can still provide a great amount of value to the project by doing so.

Tom Woods Liberty Classroom is easily one of the most undervalued resources available on the internet, as it provides a legitimate PhD-level resource on a number of crucial subjects such as history and economics.  The term “legitimate” is important, here, as what most universities provide is only half-true and full of leftist propaganda.  This resource is the closest to comprehensive and the closest to unbiased as can be found.

Click Here to get some coupon codes and subscribe.  This affiliate program is definitely one of the best ways to support the Mad Philosopher project, second only to just sending me Bitcoin directly.


Here’s some free samples (the best stuff is behind the paywall, obviously):

The best way to fulfill the maxim “Carpe Veritas” is to subscribe to Liberty Classroom and take advantage of everything such a subscription provides.


Chapter 1: Epistemic Assumptions

Thesis #1: One is solely informed by experience

“We must, as in all other cases, set the apparent facts before us and, after first discussing the difficulties, go on to prove, if possible, the truth of all the common opinions about these affections of the mind, or, failing this, of the greater number and the most authoritative; for if we both resolve the difficulties and leave the common opinions undisturbed, we shall have proved the case sufficiently.”[1] As a read through the canon of philosophy[2] will evidence, there is a long-standing tradition of beginning by stating atomic, self-apparent facts and then exploring the ramifications of accepting those facts. While some philosophers may begin with assumptions more apparent and verifiable than others, it remains the case that all worldviews are predicated on basic assertions made by the one (or group) who crafted said worldview.

This assertion is, itself, a self-apparent truth. There is no real way to prove that all reason is derived from immediate facts, only to disprove it. The principle of non-contradiction is one such principle: a thing cannot both be and not be in the same mode at the same time.[3] There is no way to conclusively prove this to be the case, but it is the foundation of all our reasoning. I assert that any example presented contrary to this claim is either a convoluted instance of my assertion or an exercise in irrationality and absurdity.[4] I will arbitrarily select one of the many available examples of a beginning paradigm which attempts to circumvent this reality. A common line of reasoning in modern American society runs: “There exist, among men, a large percentage of bad actors who harm others. We wish to be protected from bad actors. Therefore, we must place men in positions of authority over other men in order to protect them from bad actors.”[5] Of course, in this case, bad actors will undoubtedly be introduced into the aforementioned positions of authority, amplifying rather than mitigating the negative effects of bad actors in society.[6] This is one of innumerable examples which demonstrate the impossibility of escaping the paradigm I have presented.
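For those who prefer symbols to prose, the principle of non-contradiction is conventionally rendered in propositional logic as follows (this is the standard textbook notation, not anything original to me):

```latex
% The principle of non-contradiction (PNC):
% a proposition P and its negation cannot both hold
% in the same mode at the same time.
\neg \left( P \land \neg P \right)
```

Note that the formula captures only the logical skeleton of the principle; the qualifications “in the same mode at the same time” live in the interpretation of $P$, not in the symbols themselves.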

As can be assumed, these self-apparent facts are apparent only through the experience of the one to which the fact is apparent. Each of these (and all subsequent) experiential facts are, themselves, informed solely by experience. Even the most extremely outlandish claims to the reception of knowledge, like divine revelation or telepathy, are in their own way experiential. Ignoring whether or not it is possible or likely that one can have a vision or spontaneously altered awareness which is factual or true, what is guaranteed to be the case is that those who honestly make this claim have had an experience of such which has informed their worldview.

Reason, then, as the faculty by which one can analyze and make judgments about one’s environment, is ultimately derived from experience.[7] The experience of fundamental principles, like the PNC, allows one to generate the praxis[8] of reason. By using the tools and flexing the muscles of the mind, one can begin to develop the faculty of reason.

Thesis #2: Reason dictates one’s understanding of the universe

One without reason, like an animal, exists in a perpetual cycle of stimulus and response. No different than a complex computer program, the sum of an animal’s behaviors is dictated by a genetic, instinctual rubric by which it eats when it is hungry, mates when it is fertile, and flees predators when threatened. Every nuance in its behavior is simply a property of its programming. This can lead to amusing circumstances when an animal’s conditioning is no longer appropriate for its environment, such as a dog refusing to walk through a doorway due to cues which lead it to believe the door is closed, or Andrew Jackson’s parrot swearing so profusely that it had to be removed from its owner’s funeral.[9] These amusing behaviors, though, are prime indicators of the lack of a key characteristic which makes man unique from the animals: reason.

Both man and animals have experiences: certain events as perceived through the senses. However, man has the unique experience of experiencing that he is experiencing. In other words, “We are not only aware of things, but we are often aware of being aware of them. When I see the sun, I am often aware of my seeing the sun; thus ‘my seeing the sun’ is an object with which I have acquaintance.”[10] Experience itself is clearly not sufficient, then, to be considered reason or a source of reason. Experience as the animals have it (animal experience, as I will refer to it) is little more than a sensational input to an organic calculator which produces a result. That result, even, is no more than an action of the body which, in turn, generates further sensational input. This cycle simply repeats itself thousands of times per minute, millions of minutes in succession, until the animal dies. The experience of man (or just “experience”, as I will call it), however, is different.
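Since I keep leaning on the “organic calculator” analogy, it can be sketched in a few lines of code. This is purely an illustrative toy model of my own (no claim about actual neurology): animal experience is a fixed stimulus-to-response table, while the reflective agent additionally records its own experiences as objects it can later examine.

```python
def animal_response(stimulus):
    """Pure 'organic calculator': input maps directly to output,
    with no awareness of the mapping itself."""
    instinct_table = {
        "food": "eat",
        "predator": "flee",
        "mate": "court",
    }
    return instinct_table.get(stimulus, "ignore")


class ReflectiveAgent:
    """Experiences, and is aware of experiencing: it keeps a record
    of its own stimulus-response pairs that it can later reflect on."""

    def __init__(self):
        self.memory = []

    def respond(self, stimulus):
        # The instinctual layer is still present in man...
        response = animal_response(stimulus)
        # ...but the experience itself is also retained as an object.
        self.memory.append((stimulus, response))
        return response

    def reflect(self):
        # Reflection: past experience becomes an object of experience,
        # available for reinterpretation ("my seeing the sun").
        return [f"I responded '{r}' to '{s}'" for s, r in self.memory]
```

The point of the sketch is only the structural difference: `animal_response` loops through stimulus and response forever, while `ReflectiveAgent.reflect` makes the cycle itself available for inspection, which is what the next theses put to work.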

Man still experiences via the senses, but a slightly more complex process operates after that initial sense experience. If a man is still in his infancy, is drunk, is caught sufficiently off-guard, is mentally disabled, or is one of my critics (or is any combination of the above), it is incredibly likely that he will have a form of animal experience whereby reason doesn’t enter the picture until some time after an instinctual, automatic response takes place. Even so, there will be an opportunity later to reflect on the experience and interpret it as one wishes (though, at times, that opportunity is ignored). More commonly, an individual has the opportunity to process sense perceptions with a rational mindset, deliberating, for example, whether he should say one particular sentence or another while on a date.

In this example of a date, our subject, whom we will name Mike, can draw on experiences from the past to inform the present choice. Upon reflecting on how poorly his last date went, Mike may opt to avoid describing in graphic detail what it feels like to shoot oneself in the leg over a veal entree… at least on the first date. This is an example of how one’s understanding is a direct result of one’s internal narrative. After experiencing the horror and disappointment of a first date ending abruptly and with no prospects of a second, Mike would have the rational faculty to reminisce over the experience in order to find a way to succeed in the future. Having reached an understanding that such behavior is not conducive to a successful date, he can choose to avoid that behavior in the future. This applies in all circumstances, not just the aforementioned date. If, say, Mike were to decide to read this book, after reading a miserable and arrogant introduction, he may come to an understanding that this book is not worth it and return to watching football, never to read philosophy again (that sorry bastard).

Of course, it is possible that one’s interpretation of an experience can be flawed. In the case of Mike, it’s possible that his earlier failed date had less to do with his choice of conversation and more to do with the fact that his would-be girlfriend was a vegan with a touch of Ebola. In the case of his current date, it is distinctly possible that his current would-be girlfriend is a red-blooded anarchist meat-eater who listens to Cannibal Corpse songs when she eats dinner at home. By misinterpreting previous experiences, Mike is going to spoil his chances with a real keeper. For this reason, I find it necessary to delineate between one’s subjective understanding of particular instances, which may or may not be accurate, and one’s faculty of understanding.

Thesis #3: One’s understanding of the universe dictates one’s behavior

As we addressed when discussing the differences between animal experience and actual experience, man behaves in a manner distinct from animals. Due to man’s faculty of reason, understanding and justification interject themselves between the phenomena of stimulus and response. In any instance of stimulus, a man must choose to assent to the stimulus and choose to respond. In the case of Mike, while reading my book, he would be exposed to the stimuli of mind-expansion, intellectual challenge, existential intrigue, and more. Being unaccustomed to such stimuli, our subject, while incredulous of them, assents and then chooses to cease reading and retreat to the comforts of the familiar simulated manhood of football. In the case of a dog, however, whatever new stimuli it is exposed to are immediately either perceived through the filter of instinct or disregarded outright, much like a blind man being the recipient of a silent and rude gesture. As those stimuli are perceived, the dog’s instinct causes it to behave in one manner or another. For instance, being of domesticated genetic stock and trained to assist his blind owner in particular ways, the dog may maul the one performing the rude gesture, with no rational process involved, merely organic calculation.

This difference, however, does not mean that man is devoid of animal experience or instinct. As mentioned before, under certain circumstances man can behave in a manner consistent with animal experience. As a matter of fact, instinct may play at least half of the role in man’s experience and understanding. Man is clearly not the “tabula rasa” of Avicenna and Locke.[11] As I have asserted, the faculty of reason is inborn. Evidence for this claim exists in that infants instinctively act on stimuli in order to feed, cry, swim, and flail their limbs; there are also contemporary scientific claims that the brain operates as an organic calculator, the evidence of which likewise appears in the behavior and brain structure of infants. Additionally, evolutionary psychologists have observed similar phenomena in grown adults concerning phobias, pain reactions, sexual attraction, and many other areas of the human experience. As will be addressed later in this book, it is even possible that the rational faculty my argument hinges so heavily upon is, in fact, nothing more than a uniquely complex form of animal experience.[12] Until such time as I address such claims, though, we will continue to operate under the belief that rationality exists per se.

Understanding and habituation, then, drastically impact one’s behavior because they are the medium by which one’s experience informs and dictates one’s behavior. Through experience of particular sensations, and the application of reason to those sensations, man can come to understand his environment. Through application of reason to any given circumstance of stimuli, he can then choose an action understood to be most appropriate in any circumstance. Habituation, additionally, impacts man through the instinctual inclination to maintain a certain consistency in one’s actions. In the case of Mike, this would result in choosing to watch sports over reading philosophy.

Thesis #4: The epistemic and phenomenological endeavors of philosophy (and, by extension, certain areas of physics which pertain to the human experience) are crucial to one’s understanding of the universe and one’s resultant behavior.

In choosing to watch sports rather than read philosophy, Mike is attempting to avoid the discomfort of a new experience for which he is ill-equipped. However, in avoiding that experience, Mike is attempting to shirk his need to engage in public discourse and exposure to culture. Whether or not he succeeds in such an endeavor is less important to us now than what such an experience represents. The experiences of public discourse and culture are key experiences which inform one’s understanding and behavior. Our example in the introduction to this book concerning the need for communication and language is a prime example of the fundamentals of public discourse and culture. “This mushroom bad” clearly establishes certain cultural norms as well as informing one’s attitudes towards certain concepts. In the case of Mike, it could be a friend coaching him with dating advice or beer commercials during the football game altering his expectations of his date. If he had read my book, Mike would be more likely to succeed in his date, having better equipped himself with a tool set for working with the human condition.

These tools have been graciously provided for us through the long-standing traditions of philosophy; most notable in this instance are epistemology and phenomenology. Through the study of knowledge and how man acquires it[13] and of experiences and how man feels what he does,[14] philosophy can aid significantly in one’s quest to understand what and how he knows what he does, and how to influence those around him. Most of what has been written in this chapter is lifted directly from discussions I have had regarding various works in epistemology and phenomenology. In this regard, I believe this work is a paradigm example of the assertion made: that one of the most crucial kinds of experience for the formation of one’s understanding is of a social and philosophical nature.

A strong cultural and public formation of one’s understanding is crucial because a well-informed understanding can ultimately provide maximal utility to an individual and society,[15] whereas a poorly-informed understanding can effectively cripple one’s ability to develop one’s rational faculties or provide much utility to oneself or others. As was mentioned earlier, one’s subjective, personal understanding can be flawed. Some merely make a small error in their reasoning, while others may be mentally disabled, whether by material means or by a cripplingly misinformed understanding. The strongest influence on both the possibility of an accurate understanding and that of mental disability is this public influence on the individual. As discussed in the intro, when done correctly, philosophy creates the circumstances most conducive to a well-informed worldview.

In this way, we see that one is solely informed by personal experience. That experience allows one to develop inherent faculties such as reason. Reason, in turn, allows one to analyze one’s experiences and engage one’s culture. This analysis generates an understanding and worldview within the individual, which also has a bearing on one’s habits. This understanding is the premise on which one makes a decision regarding how to behave in any given circumstance. As forming an accurate worldview is crucial to one’s successes, philosophy (the strongest candidate in this regard) is crucial to forming said worldview.

95 Theses

[1] Aristotle, Nicomachean Ethics (Oxford World’s Classics), p. 118.

[2] The widely accepted list of “most significant philosophers to date”.

[3] We will explore the Principle of Non-Contradiction, or the PNC, more thoroughly in Chapter 3: Orders of Knowledge.

[4] A claim which is logically self-defeating, whose conclusions deny the very premises on which it is built.

[5] This is an example of how philosophies written in the mid-17th century (Hobbes’ Leviathan) have percolated through the social consciousness for centuries and are no longer questioned.

[6] Additional examples and further exploration of absurdity can be found in Hobbes’ Leviathan, chapter 5.

[7] The next chapter will explore this concept more fully.

[8] The method by which one, through either experience or theoretical knowledge (“knowledge that”), can develop practical, active knowledge (“knowledge how”).

[9] Samuel G. Heiskell, Andrew Jackson and Early Tennessee History, vol. 3.

[10] Bertrand Russell, The Problems of Philosophy, ch. 5.

[11] “Tabula rasa” refers to a “scraped tablet” or “blank slate”, evoking a description of the mind in which there is initially no knowledge or activity whatsoever.

[12] In Chapter 2: “The Embodied Mind”.

[13] Epistemology.

[14] Phenomenology.

[15] In this case, I’m using the term “utility” in a very loose way. The best definition of “utility”, though, would be “the capacity for a thing to provide or contribute to one’s flourishing.”

Abstract of the 95 Theses

Assumptions and their descendants:

From Aristotle[1] to Zeno, every man who has claimed the title “philosopher” has made basic assumptions from which all his later works (if rigorously done) are derived. Even those who demand a priori proof of even the most atomic basis for argumentation (such as those in the Cartesian tradition[2]) make assumptions somewhere, no matter how well disguised or hidden they may be. There is nothing wrong with doing so, though; being an experiential creature, man can only begin to reason from some given truth of which he has experience. “The pre-existent knowledge required is of two kinds. In some cases admission of the fact must be assumed, in others comprehension of the meaning of the term used, and sometimes both assumptions are essential… Recognition of a truth may in some cases contain as factors both previous knowledge and also knowledge acquired simultaneously with that recognition-knowledge, this latter, of the particulars actually falling under the universal and therein already virtually known.”[3]

Because it is the case that one must begin from assumptions, it is in one’s best interest to select the most fundamental and apparent assumptions and build up from there with the assistance of reason and observation. When one follows these assumptions to their logical conclusion, one will likely see the errors of one’s assumptions if the results are absurd or impossible. At that point, one must select an improved set of assumptions and move forward, repeating this process as many times as is necessary. I use epistemic assumptions here, as my childhood experiences in Cartesianism have shown me the impossibility of accurately describing the universe if one is an epistemic skeptic or nihilist.

In addition to selecting a certain type of assumption, one must be deliberate in what quantity of assumptions one makes. If too few assumptions are made, there will be insufficient material from which to derive cogent syllogisms or conclusions, trapping one in the tiny cell of skepticism. Choosing too many or too advanced assumptions will short-circuit the philosophical process of discovering where the assumptions will lead and will necessarily result in the desired (and likely incorrect) conclusions of the author. Also, too many or too complex assumptions place one’s work beyond the accessibility of critics, in that no critic can hope to verify one’s claims based on one’s assumptions if the assumptions themselves are opaque, obscurantist, or simply a secret to all but the author.

As was implied by an earlier paragraph, and would logically follow from this conversation concerning the quantity and quality of assumptions, certain enlightenment-era questions and practices ought to be bracketed4 for later discussion. If everyone were forced to synthesize their own version of the Cogito, or the world of noumena, the practice of philosophy would have halted midway through the enlightenment, with each new philosopher attempting to invent a square wheel. That is not to say that skepticism should not be addressed; only that it doesn’t necessarily have to be the starting point. Nor does it mean that one’s assumptions suffice on their own; they ought to result in an empirically falsifiable claim by which one could determine the validity of one’s assumptions.

The physical world and our understanding:

Why would my project run straight from epistemological assumptions into physics? The physical sciences are the first source of certitude after the basic epistemological claims are made. It is far simpler to state that we can know things, that the primary engine for any knowledge is our experience, and then to discuss that experience, than it is to make such an epistemological claim and immediately begin attempting to discuss experience or knowledge of some transcendent or ethical claim, as such experience is often derived from some manner of physical experience to begin with.

This is because philosophy, like reason, operates from the ground up: first building a foundation, then building arguments atop that foundation. “…If a house has been built, then blocks must have been quarried and shaped. The reason is that a house having been built necessitates a foundation having been laid, and if a foundation has been laid blocks must have been shaped beforehand.”5 As our immediate experiences are derived from our bodily senses, which are confined to matters of a physical nature, so too must our immediate foundations be. Even universal and unavoidable principles, like the principle of non-contradiction or many ethical principles, are made known to one by way of physical sense experience (with assistance from reason, of course). In addition to the foundation which physics provides on an experiential level, it also provides a conceptual basis. One cannot properly ask “why?” without first asking “what?” and “how?” Physics, when done properly, effectively shows one what happens in our physical universe and how it does so.

Metaphysics6, as the name would imply, can also be appropriately appealed to at this stage of development. As a counterpart to the physical studies of how our universe operates, metaphysics applies a slightly less experiential and more rational, but very similar, method to immaterial questions regarding our experience. Metaphysics and I have had a very rocky on-again-off-again relationship throughout my life. As a confessed former adherent of scientism, for quite some time I denied that metaphysics could rightly be said to exist at all. I am sure that by the time my life ends, I will have left and returned to metaphysics at least once more, but each time such an event occurs, our understanding and appreciation of each other grows.

Ontology as derived from experience:

Why ontology? If ontology is to be understood as the study of existence or existents, then it would naturally follow from our study of our experience to move on to the study of the things we are experiencing, namely, that which exists. There is a question more likely to be asked by a modern readership. That is, “why theism?” I have long struggled with the discussion of theism or atheism in the realm of philosophy. Even as a “scientist”, I was agnostic as to whether there existed some being beyond the physical realm, primarily because both positive and negative claims as to theism are empirically unfalsifiable.

However, that was at a period when I was still immature, both biologically and philosophically. I have come to realize (as will be discussed in the Theses)7 that the assumptions on which one builds one’s philosophy necessarily result in either a positive or negative claim concerning theism. In the case of any teleological philosophy, it must result in a positive claim and, conversely, in the case of any nihilist philosophy, it must result in a negative claim.

Also, after physics is able to establish an empirical validation of one’s assertions, it must be relegated to the role of double-checker, simply checking all later claims against man’s experiences, ensuring that no claims made by other fields of study run contrary to that experience. Naturally, after physics establishes what happens and how, the philosopher must ask why it happens, or another way of phrasing “why” would be, “what is the practical universal significance of such an event?”

Although the question asks for the practical universal significance, and despite the claims made by postmodernists, it is not in any way untoward or egotistical to presume that the universal significance of such an event must, in some way, be centered upon ourselves. There is a twofold reason that this is the case. Firstly, the nature of man is such that he feels a compelling need to search for meaning in his existence; any teleological philosophy would rightly assign an end to that compulsion. Secondly, our definition of philosophy is predicated on the assumption that man is capable of discerning a relevant place in the cosmos for himself. Ultimately, in this case, the absurdist is right: it matters not whether there is a significant place for man in the universal sense, for man can always make one.

In knowing man’s role and significance in the cosmos, one possesses a tool set which one can use to determine what one ought to do. Now, many will refer to Hume at this point and will insist that “one cannot derive an ought from an is,”8 but rather than conclusively disproving my point, they merely indicate their lack of understanding of Hume. The prohibition of deriving an ought from an is assumes that the realm of “is” consists merely of objective, impersonal, atomic facts. If one allows value claims into one’s ontology, or one’s category of “is”, it becomes inevitable that the is/ought distinction collapses. These value claims are clearly not empirical, but that brings us back to our earlier discussion of the relationship between the sciences and philosophy: the moment that certain supplementary matters of fact are allowed into the realm of discourse, such as metaphysical, psychological, teleological, or ontological assertions, it stands to reason that one can derive an ought from an is.

Even in the event that objective values do not exist, the subjective values of individuals must be informed by a proper understanding of physics, metaphysics, and ontology. If one values a particular activity or outcome, one’s ability to achieve such a result is dependent on properly navigating reality. Many would-be “oughts” are simply impossible or absurd and are beyond the human capacity for comprehension, let alone accomplishment; thus, the realm of values to which one can assent is limited by the same factors which have confined our definition of the philosophical activity thus far. Even after one assents to a rationally consistent and metaphysically possible value, the methods by which one achieves such an outcome are dependent on the nature of reality and the actor’s ability to navigate it. With these strictures in place, it is defensible to claim that one can derive an “ought” from an “is”.

The problem of evil and subsequent ethical prescriptions:

All philosophers are eventually faced with the question which plagues all men: “Why does life suck?” It finds itself phrased in many different ways but, since the time of Epicurus, the problem of evil has remained central to the discourse of philosophy. The most common phrasing would be something akin to, “If there exists an omnipotent, omniscient and omnibenevolent god, how can he allow innocent people to suffer as horribly as they do?”9 Usually, there are citations of disease and natural disasters killing small children to this effect.10

Different philosophers and traditions provide different answers, some more radical than others. Some, such as Epicurus, would say that the problem of evil is sufficient cause for a practical atheistic hedonism. Others, such as Pascal, argue quite the opposite. Not the least of the responses, while being more or less outside the theistic spectrum, would be the approach popular in the ancient East (and the answer I once held myself): “Life simply sucks.” While my answer now is slightly more refined, the practical application of it remains mostly the same. So, what to do about the problem of evil? This is, again, more clearly and articulately discussed in the Theses11 than I could hope to write here. It will suffice to say, for now, that our understanding of man’s telos must accommodate the problem of evil.

What can one do about the problem of evil? I believe that the answer is twofold. In the case of the philosopher, one is obligated to, at least, address and accommodate it before moving on with one’s reasoning. Each man, however, must be able to address and accommodate the problem in his daily life. While the appearances of these two courses of action are very similar, I believe that each requires individual attention. The problem of evil serves as a strong device for proofreading philosophical assertions; insofar as one’s philosophy can or cannot address the problem, one can quickly assess the practical viability of said philosophy. The personal approach, while strongly tied to the philosophical one, need not be as rigorous or well-reasoned as the philosophical. The great acts of kindness displayed by those such as Blessed Teresa of Calcutta or Saint Nicholas are no less great a response to the problem of evil for any lack of philosophical argumentation for their actions. In this work, I hope to articulate the philosophical side of the problem, and in a later work I hope to provide practical tools for living in accordance with that philosophical approach.

As will be discussed in this work, in all reality, the problem of evil only exists in the form of a problem because of the innate desires of man. Man bears in his heart the desire and freedom to excel. Whether one is aware of it or not, a majority of his actions are caused by or strongly influenced by that desire. Despite the common formulation of the problem of evil, it is less an ontological statement of “How can this thing possibly exist?” and more a plaintive cry of “Why do I want this, if the universe conspires such that I cannot have it?” One must be able and willing to address the problem and either overcome or circumvent it in order to achieve the self-fulfillment sought after by all men.

Conclusion

My aforementioned salon discussions have operated as a club of sorts, with the working title of Lucaf Fits, which is an acronym for “Let us create a foundation for it to stand.” As the basis of logic, reason, philosophy, and ultimately all human endeavors, a solid rational foundation is required for all meaningful discourse and progress. “Lucaf Fits” serves well as both a goal and a mantra for my group and myself. With this work, I hope to begin setting forth a foundation on which my other discourses may stand.

This work, as I have already said, is to be a starting place, not an exhaustive foundation or even an introductory work like the Summa or the Prolegomena. In sharing this work, I am exposing the beginnings of my internal discourse to the harsh elements of the social world. I hope to be met with great amounts of constructive criticism and support from my peers and superiors, but I am not so confident as to expect it.

Regardless of the social and financial success or failure of “A Philosopher’s 95 Theses”, I intend to continue this line of work, exploring and expanding the 95 Theses, following them to their logical conclusions, and modifying the foundation as needed to most successfully pursue the goal of philosophy. I also hope that with sufficient time, effort, and experience, I can one day move beyond such foundational works and into a more practical style of discourse and argumentation. I believe that foundations such as those outlined here will necessarily lead to the conclusions that I so frequently argue and strive to engender in social media and day-to-day life; I hope one day to have derived those points from this foundation so that others may see the validity of my position and actions. If, however, my conclusions are invalid and do not follow from the premises I am currently laying out, then just as well: being shown so will guide me to the Truth, which is far more valuable to a philosopher than public affirmation.

Because such discussion is directed at the revision of one’s arguments and beliefs, I will likely revise and correct this work over time. I have already, in the writing of this introduction, revised a few of the theses contained within this book, and have since edited each one a number of times, so as to more appropriately maintain their cohesion and logical validity. While I hope that such causes for revision will appear less and less frequently until, one day, I have acquired Truth, I am skeptical that such a time or event will occur in my lifetime, or even in this world at all.

The ideas contained herein are the product of nearly two decades of oral discussion12 and revision, as well as excessive reading of philosophers across time and traditions. I am simultaneously both encouraged and discouraged by the genealogy of my current position. Having run the gamut of political, economic, religious and philosophical stances in my short lifetime, I am emboldened in saying that I have recognized my own mistakes and intellectual frailty enough times now to be more willing and able to admit my own mistakes when they are made. At the same time, however, I find myself skeptical of any truth claims I do make, now, because of my long list of fallacious stances in the past.

With luck and a fair degree of self-control, God willing, I will be able to make use of another seven or eight decades in this endeavor. That, I would hope, will be sufficient time to complete the revisions to this and my later works. Perhaps, one day, my ideas will be perpetuated in the traditions of philosophy. Perhaps commentaries on my work will be required reading in some institutions.

After all, the entire tradition of philosophy consists of free ideas. I do not mean “free” as in without cost, for many of the greatest and worst of the world’s philosophies have been crafted at great price. I mean “free” in the sense that the ideas, granted an appropriate environment, will spread and flourish like wildflowers. As I mentioned before, these ideas are as much a part of the intellectual atmosphere as any other cultural trend or idea. In many cases, these ideas are so liberated from the moorings of their original author that they are falsely attributed to one who was unwittingly synthesizing an already existing work.

It is an obligation of the philosopher to give credit where it is due. One ought especially to give citations to one’s contemporaries, as they are still present to take advantage of what approbations and criticisms come their way. To only a marginally lesser degree, one ought also to give credit to those who have come before and laid the foundations on which one now builds, both so that one is not falsely assumed to be the progenitor of another’s work and so that one’s readership may be able to find the primary sources for their own edification. That being said, one must not be so averse to inadvertent plagiarism as to hinder actual progress. A healthy balance must be struck between progress and citation.

In addition to the intellectual and social coin of credit given where it is due, actual coin ought to be given as well. Being merely human, a philosopher still needs food, shelter, and time. When one works full-time performing menial and self-debasing labor (as is common in this age), it can be difficult or impossible to set aside sufficient time, resources, and motivation for such an undertaking as philosophy. Even if the ideas and art of philosophy, like all other intellectual or artistic works, ought to be unbound by financial constraints, the one producing the work remains bound by them. I can justify selling this work, as opposed to making it freely available to all, only because it is being sold at an affordable price and because I am willing to donate copies and excerpts to those who can and will benefit from it but cannot possibly afford it13.

I make this financial case for philosophers with a caveat: no man should solely be a philosopher. If one is not working some form of job at least part-time, or arranging for one’s self-sufficiency, to supplement both one’s wallet and one’s mind, then one must be working in some capacity either for survival or for art. A man’s mind can stagnate on outdated and fallacious thought if he is not careful to keep both his body and his social life healthy and active. Even if one makes enough money from teaching or publication (which, I understand, is rare), one must at least volunteer for a local, personal charity in which one works with other people and worldviews.

To this effect, I intend to continue this course my life has taken and see where it leads. I hope you, my reader, are willing and able to make use of this work and to aid me in my quest for Truth.

95 Theses

1Technically, Albertus de Saxonia is alphabetically prior to Aristotle, but he is much less known.

2The philosophers who followed in Descartes’ footsteps, maintaining a skeptical stance towards all facts that are not entirely doubt-free.

3Aristotle “Posterior Analytics” book one

4Set aside with the intent to more thoroughly explore at a later time, it is a technique to be used only on concepts that are not crucial to the discussion at hand.

5Aristotle “Posterior Analytics” book two

6From Greek: “after physics”. While the name denotes only that it was the subject Aristotle would teach after physics, it can be said to deal with the non-material aspects of physical inquiry.

7Chapters 5 and 13

8Hume “A Treatise of Human Nature” book 3

9Hospers “An Introduction to Philosophical Analysis” p. 310

10Dostoevsky “The Brothers Karamazov” is an excellent example of such descriptions.

11Book 5

12 In this case, I consider social media as a form of oral discussion.

13Ironically, I qualify under my own rubric for a free copy.

Philosophy in Seven Sentences

I’ve previously presented a brief review of Christian Apologetics (which seems to have vanished… I will have to write a second one or re-publish it). From the same author, InterVarsity Press has recently published Philosophy in Seven Sentences. Now that I’ve read the book (twice), I feel compelled to share it with my readers.

I love teaching/tutoring, especially audiences yet uncorrupted by academic ignorance and apathy. A few years ago, I taught a series of philosophy classes to a local homeschool group. It was well received, it paid the bills, and it gave both myself and my audience a newfound appreciation for the science and art that is philosophy.

The average age of the class was somewhere in the vicinity of thirteen or fourteen, so they were largely unaware of philosophy altogether (which is a shame). I had four lectures with which to cover all the bases of “Philosophy 101” in a manner amenable to a young audience. I decided on pulling four themes/philosophers from history and simply walking the class through a philosophical exercise of exploring those themes. Almost the entirety of my preparation time was spent choosing the four themes. In the end, I chose Plato’s (Socrates’) Apology, Aristotle’s categories (basic logic), Descartes’ cogito, and Kant’s categorical imperative. Of course, each philosopher served as a foil for his contemporaries and his inheritors, thereby covering the bases of philosophy’s history. Having taken two Philosophy 101 classes (from two different schools, long story), I get the feeling this is a popular way to teach such courses.

All this dry nostalgia is to set the stage for a brief overview of “Philosophy in Seven Sentences”. Typically, this would be a full-on “teaching from the text” post, but this book is literally fresh off the presses and both you and Douglas Groothuis would be better served if you ponied up the small amount of money required to acquire the text itself. That said, I do intend to give the text its due justice.

In eight short chapters, averaging about sixteen pages each, Groothuis takes one sentence per chapter (plus a short challenge at the end) and gives an excellent introduction to both the tools and traditions of philosophy. Typically, such a text will either attempt to impress its readers with technical terms, obscure references, and complicated methods of presentation, or it will be written so casually and simplistically as to render a rich and beautiful tradition banal and empty. Groothuis manages to dance a fine line between condescension and elitism, speaking plainly and straightforwardly but also challenging even seasoned readers to step up to his level of mastery concerning the material at hand.

I genuinely enjoy reading primary sources which, I guess, makes me weird; secondary and tertiary sources are generally less appealing to me, but I read any material with a sufficient insight-to-page-count ratio. As a case-in-point, I’ve already read many of the texts referenced in “Philosophy in Seven Sentences”. Even so, Groothuis manages to take a broad array of information, presumably acquired through extensive reading, discussion, and lecturing, and distill it down to one of the highest insight-to-page-count concentrations I have seen, even for someone with reasonable familiarity with the material presented.

The seven sentences in question are well-selected: spanning history and traditions from ancient Greece with Protagoras, Socrates, and Aristotle, to the early Church with Augustine, to the enlightenment with Descartes and Pascal, to modern existentialism with Kierkegaard. While I may have selected a couple different sentences (exchanging Pascal for Nietzsche and Kierkegaard for Camus or Sartre), Groothuis tells a progressive narrative which begins, dialectically and historically, with Protagoras’ “Man is the measure of all things,” and concludes with Kierkegaard’s pointed and melancholy “The greatest hazard of all, losing one’s self, can occur very quietly in the world, as if it were nothing at all.”

Readers who have no prior exposure to philosophy proper should, at least, recognize three or more of these quotes, as they have become memes referenced and repeated throughout popular culture. “Man is the measure of all things,” “I think, therefore I am,” and “The unexamined life is not worth living,” are referenced in popular films, shows, books, and songs. Descartes’ contribution, in particular, is the subject of a great many common jokes. I once owned a t-shirt which read “I drink, therefore I am.” Groothuis does an excellent job of correcting misconceptions concerning these sentences without becoming a party-pooper.

Usually, a book I enjoy reading is full of highlights, annotations, and sticky notes. Every page of Human Action and Existentialism is a Humanism has some sort of mark on it. One would expect, then, that an unmarked book would be a sign of disinterest and, typically, one would be correct. In the case of “Philosophy in Seven Sentences”, though, nearly every line would be highlighted (defeating the purpose of highlighting) and there is no need for annotating the text; it is clear, concise, and wastes no time or space in exploring, if not the history of philosophy, a powerful narrative through the tradition of philosophy.

I have never before encountered a book better suited to serve as a textbook for an intro to philosophy class. Admittedly, this book would likely be better received in a Christian institution than elsewhere but, even elsewhere, it far outstrips any conspicuously secular text both in demonstrating the techniques of the philosophical exercise and in exploring the philosophical tradition. I guess I’ve been salivating over this book long enough and ought to move on to “teaching”.

The general plot of the book begins with Protagoras’ exploration of subjectivity. Given that the pre-socratics are the progenitors of western philosophy, it makes perfect sense that one would start the narrative there. With a quick glance over extant pre-socratic works, one largely has a choice between the Zenos’ contributions of stoicism and obnoxious math problems, Pythagoras’ trigonometry, Heraclitus’ almost Buddhist sense of impermanence and meaninglessness, or Protagoras’ relativism. While Zeno (either one), Pythagoras, Heraclitus, et al. each contributed quite a lot to philosophy as a whole, Protagoras sets a particular stage for Plato and Aristotle to really get the show going.

“Man is the measure of all things,” could easily be the opening line of a stage play concerning the history of philosophy. I know from firsthand witness that the phrase has hung on the wall of many dorm rooms that have borne witness to activities often reserved for cheap motel rooms outside of town; it has also, quite contrarily, remained very near the heart of philosophical discourse for over two millennia.

Such a mentality is easy for the philosophically-minded to slip into. As the exercise of philosophizing often consists of comparing and contrasting (AKA “measuring”) experiences, narratives, and ideas, it’s a natural temptation to declare oneself (or one’s kind) “the measure of all things”. Given the absence of an immediately apparent alternative to man, as far as measuring is concerned, Protagoras can’t really be blamed for making such a claim. Groothuis does an excellent job of exploring Protagoras’ position, the rationale behind it, what such a position means, and the ultimate results of such a position. I don’t have the ability or word count to do so.

Moving on, a younger and arguably more famous contemporary of Protagoras is reported to have said “The unexamined life is not worth living.” Of course, if man is the measure of all things, then such an examination is likely to be very short in duration. Groothuis shows the tension between Socrates/Plato’s views on the transcendental nature of reality and Protagoras’ more materialist understanding of reality. While also setting up an opposition between Protagoras’ camp and the Socratic camp (which remains in the narrative all the way through Kierkegaard), he describes Socrates and his basis for such an extreme statement as “The unexamined life is not worth living,” in its own right as well. Admittedly, I feel that, despite explicitly addressing the key issue in interpreting Socrates (he didn’t write anything down, so all we have is other people’s accounts of what he said), Groothuis blurs the line between Socrates and Plato as far as their ideas are concerned.

Regardless of whether Plato or Socrates ought to get the credit allotted by Groothuis, they effectively prepare the stage for Aristotle, who begins the discussion of man’s nature. Ultimately, the issue of man’s nature is what Augustine, Descartes, Pascal, and Kierkegaard are called to opine upon. Each one comes from a particular philosophical school and era in history and, therefore, has something unique to contribute to the discussion, and Groothuis demonstrates a depth and breadth of knowledge of both the philosophers and their ideas.

This book is a must-read and must-have for anyone who is even fleetingly interested in matters beyond dinner, dates, and this week’s sportsball game. This goes for the engineer who did everything in his power to avoid liberal arts as well as the philosophy master’s students who may need a reminder on the basics, a reminder of where philosophy 101 students stand, or a textbook from which to teach. This book is one of the few secondary sources I will suggest, and I plan on snagging a few of the books listed in the bibliography for my personal extra-credit.

TL;DR: Philosophy in Seven Sentences, by Douglas Groothuis, is a paradigm example of how the more knowledgeable one is concerning a particular subject, the better one ought to be at explaining it in terms everyone can understand and, hopefully, enjoy. Derived from a popular introductory lecture style, Groothuis’ work takes seven deep, meaningful, and crucial sentences from the history of philosophy. While I may have chosen sentences from Nietzsche, Rousseau, or Sartre instead, I would not have been even remotely capable of laying out so much information in so concise and readable a narrative. If anyone has a hard time keeping up with the terminology or argumentation in this blog, “Philosophy in Seven Sentences” is my most highly recommended starting place (followed by Liberty Classroom).

Introduction to the 95 Theses

Introduction

“A Philosopher’s 95 Theses”: a silly and audacious title for a work by a college dropout with little to no substantive endorsements. What is this work even supposed to be? This work is primarily an attempt to begin a systematized and traceable discussion concerning my particular brand of philosophy. Having spoken in various public forums, from the classroom, to hosting salon discussions (thank you, Voltaire), to water cooler discussions, to arguing on Facebook (a noble means of communication, to be sure), to teaching and tutoring homeschoolers, to managing a blog, I have found that many people in my generation and social stratum lack even rudimentary exposure to true philosophy or even formal logic. This isn’t the case for everyone, but it is for a majority. Many times, people disagree with my statements or beliefs, not because of any logical or ideological error on my part, but rather because of a lack of understanding of how conclusions follow from premises. Ultimately, the discussions betray no understanding of the objective material at hand, but merely emotional attachments to already-existing prejudices as well as a fundamental lack of foundation from which they are arguing. When presented with this fact, others are wont to accuse me of the same. In this work, I hope both to soundly establish a defense against such accusations and to begin to spread a culture of “lower-class intellectualism”: a culture of self-education and intellectual progress compatible with and available to “the lower class”, economically speaking. The first step in doing so would be to make something accessible and affordable available to what I call “my social stratum”, as well as simply raising awareness of alternatives to the current institutions which are fueled by big money and political agendas.

Clearly, as a starting place, this work is merely the beginning of what I hope to be an expansive and pervasive body of work. I hope to one day move beyond this project of establishing my foundations to making these concepts concrete and practical, providing a certain utility to all that would be open to a paradigm shift from our current postmodern sensibilities. From this body of work, I intend to expand and build on these ninety-five theses using the same style and methods contained herein, as well as writing a series of philosophically weighted articles concerning how one ought to live from day to day.

As most anyone who reads this work can tell, there is nothing groundbreaking or even original in this work, other than the arrangement of these ideas pulled from the atmosphere of the philosophical tradition. As a foundational work, I would expect this piece to be fairly conventional. Besides, as one prone to taking things too far and stating the outrageous, I want to give myself a moderate baseline from which to work in order to give some credence to my more extreme assertions which I have begun to publish already, alongside this work.

Despite the conventional content, I chose a particularly evocative title (if I do say so myself). The title “A Philosopher’s 95 Theses” is an unabashed attempt to cash in on the fairy tale of Martin Luther’s dramatic secession from the Church. There is a narrative in which Luther made his secession official by posting the 95 Theses on the church doors as an overt “Eff-You” to the Church. While evidential support for this re-telling of history is nonexistent, the actual format and concept of the work itself is worthy of emulation. This is certainly the case if this is to be the beginning of a break from the status quo of contemporary philosophy.

To be honest, the title and style of this work were suggested to me by a friend who seemed quite earnest in wanting me to write down my thoughts for his own edification. The suggestion was made primarily from a religious awareness of the Theses as a work of philosophy which could be easily adapted to a social media format; the concise nature of each thesis makes it easily tweeted in ninety-five segments. He challenged me to post ninety-five philosophical theses in ninety-five days on Twitter and Facebook, in order to encourage me to begin writing my ideas in a codified and discussion-friendly format. After a hilariously disorganized and epistemically infuriating four months, I had ninety-five theses, a ton of notes from the discussions that were sparked (by the early theses, at least; I think many friends and loved ones lost interest around #35 or so), and a new-found energy for attempting to publish something of worth.

The name and format of the original “95 Theses” have been lifted, but much of the argumentation and content has been abandoned, as Luther and I have very different intentions and circumstances concerning our respective works. Where Luther simultaneously affirmed and protested various Church doctrines and principles of theology, I intend to do the same for the philosophical doctrines which many contemporary philosophers have professed. As such, rather than explicitly arguing the finer points of revelation and redemption, I intend to establish a solid foundation for later arguments in the philosophical realms.

As I will address in detail later, philosophy is a historical and holistic entity. Due to the nature of philosophy, I don’t expect to have come up with any original material, even if I know not where it has been written before. In the words of Descartes, drawing on Cicero, “One cannot conceive anything so strange and implausible that it has not already been said by one philosopher or another.”1 The ideas and truths of philosophy are simply “in the air”, as it were. One of the marks of truth in the philosophical world is its longevity. Many of the ideas that emerge in these theses, as well as in my other works, are strongly rooted in classical philosophy as it has survived to this day.

I borrow heavily from existing works, as all philosophers do. I give credit where I can recall or research the original source, but it would be impossible to trace the genealogy of every idea which springs from my mind. This arrangement of concepts and their relationships is likely original, but the ideas themselves are old and deep-rooted. It is the perennial duty of the philosopher to water, trim, and tend the tree of knowledge which is philosophy: to hold the ideas in one’s mind, to criticize and correct errors, and generally to allow the Truth to become known. Not a bonsai tree, but a veritable orchard of delicious and ripe fruits.

This work, hopefully, will establish a faux a priori2 foundation from which I can assert all of my later reasoning. Now is your chance, critics. Now is the time, in this work, to correct my premises, my errors, my moments of weakness, before I attempt to plumb the depths of truth in this vessel I have cobbled together. It will be too late, I am sure, to point out that I overlooked a basic truth here and now once I have arrived at some conclusion incomprehensible and flawed.

I have grandstanded long enough on what philosophy is, without giving an appropriate definition and description of it. One should not assume that one’s use of terms is identical to that of one’s readers or opponents.

What is philosophy and why bother?

I believe that all who can rightly claim to be philosophers will recognize certain fundamental characteristics which I believe to be necessary conditions for philosophy. It must be rational, as even the most blasé and stale philosophy assumes the basic precepts of logic, non-contradiction, and the ability of the mind to grasp truth. It must be consistent, as rationality simply cannot allow for the possibility that the principle of non-contradiction is invalid; therefore, all rational things are self-consistent. It must be empirically viable, as our experiences determine our understanding of the universe and, subsequently, the truth (the theses themselves will discuss this3); we cannot hold a belief which predicts or necessitates an experience divergent from what we actually experience. It must be universal, as any truth which is contingent upon circumstance is not a truth, but merely a fact.

In addition to these necessary attributes of the practice itself, I believe it must also produce certain results, fruits if you will, lest it be nothing but a mental exercise. Without ethical agency, this exercise would have no bearing on our lives as a prescriptive measure which, in the absence of an equivalent authority for prescription, would result in aimless and irrational lives, driven simply by the reptilian and hedonistic pleasures of our own genome. Without utility, this exercise would be superfluous to any other activity man would undertake; very few (and no sane) men would choose an impotent and laborious endeavor at the expense of something enjoyable and productive. Ultimately, without truth, there would be no rhyme or reason to the philosophical endeavor; if it were to be self-consistent and pursue truth, it must actually be capable of, and ultimately accomplish, the task of acquiring Truth. For these reasons, I assert with a fair degree of certitude that the purpose and goal of philosophy, as well as its necessary and sufficient conditions (and, therefore, its constituent elements, such as theology, physics, etc.), is to create an internally consistent, logically sound, empirically viable, and universal worldview which possesses ethical agency, utility, and (ultimately) Truth.

As mentioned in the above definition, philosophy possesses many constituent elements and tools of which it avails itself. As a reading of Aristotle or many of the enlightenment philosophers will support, I find that it is most natural to begin the philosophical journey in the realm of epistemology or phenomenology. A definition of each is in order, I believe, before addressing the practicality of such a method. Epistemology, taken from the Greeks, can simply be considered “the philosophy of knowledge and thinking, an explanation for how one thinks and knows”. Similarly, phenomenology would be “the philosophy of experience, an explanation for how one experiences and interprets those experiences”, also from the Greeks.

An approach starting from the angle of the philosophy of thought and experience does present some inherent issues, as in Hegel’s famous criticism of Kant:

“We ought, says Kant, to become acquainted with the instrument, before we undertake the work for which it is to be employed; for if the instrument be insufficient, all our trouble will be spent in vain… But the examination of knowledge can only be carried out by an act of knowledge. To examine this so-called instrument is the same thing as to know it. But to seek to know before we know is as absurd as the wise resolution of Scholasticus, not to venture into the water until he had learned to swim.”4

Hegel presents a very pragmatic alternative approach, which was quite popular with later Hegelian philosophers, like Marx. Essentially, he asserts that one ought to simply begin thinking and doing philosophy and will learn how one learns by witnessing one’s own experiences, much like how one learns to swim. As one can see, in reading the first ten or so theses, my assumptions and their descendants take a very Hegelian approach to early epistemology.

Amongst the historical traditions of philosophy, a debate as old as the pre-Socratic philosophies rages to this day: the theists vs. the atheists. Despite the greatest attempts of the moralist atheists, though, the arguments between theism and atheism ultimately deal with a more fundamental question. Whether or not there is a God is ultimately an argument as to whether there is any Truth at all. Again, as the theses address, either the universe is nihilistic (devoid of any fundamental or objective meaning and purpose) or it is teleological (purposeful and directed)5. The most common theistic argument made is one concerning teleology: “What’s the point, if there’s no point?” Conversely, the atheist makes an absurdist or existentialist argument (presenting logically inconsistent facts, or asserting that the universe itself is logically inconsistent): “If there is no point, I can make one.” These arguments will be addressed in the theses6.

Ultimately, all forms of science and pseudo-science (assuming that they are rational and logically rigorous) are constituent elements of philosophy. If our definition of philosophy is accurate, then all rational activities which are directed at the goal of achieving ethics, utility, or Truth are elements of the grand attempt that is philosophy. The scientific endeavors are all part of the philosophical school of physics, by which one establishes the empirical viability of any particular philosophical view. The pseudo-sciences, ranging from sociology, to psychology, to astrology, to magic (again, assuming that they are rational and logically rigorous), can sometimes be appropriated into either physics or metaphysics. Some rare cases may even wander further from physics into epistemology or phenomenology, but all intellectual pursuits are ultimately an element of philosophy. Many of the individuals who pursue these endeavors lose sight of the forest for the trees, but that does not make their work any less valuable to the philosopher.

Bertrand Russell asserts, in chapter fifteen of “The Problems of Philosophy”, that science becomes science by divorcing itself from philosophy once it becomes useful. Josef Pieper similarly contends that scientific inquiry is capable of achieving conclusions which are resolute and unyielding, whereas the philosophical endeavor cannot.7 Both Russell and Pieper have a distinctly post-enlightenment flavor in this regard, which is unfortunate. They both fail to see that science is but a tool and a field contained within philosophy. Science may try to distinguish itself from its mother, with such cultural figureheads as Neil deGrasse Tyson outright ridiculing her, but it can never truly extricate itself from the frameworks from which it came. Instead, it would be more appropriate for the specialists to concern themselves with their specialties and for the philosophers to draw on them when needed.

Above all, reason is the driving force of man and his works. Above all rational pursuits, philosophy reigns. While not all men may have the ability to be great philosophers, all men are called to be philosophers nonetheless. If in no other way, one must examine one’s choices and one’s life in such a manner as to achieve the best outcome available. Unfortunately, in this day and age, I fear that even this minor task proves to be too much for most.

It is no surprise, really, that this task has proven too much for my generation. The heart of philosophy is discourse and my generation is illiterate and disjointed in this regard. Rather than bemoaning our state of affairs, however, I ought to concern myself presently with the discursive nature of philosophy. Whether the discussion be oral debate in the city square, essays and books written in the solitude of a cave or study, or a college dropout’s ramblings on social media, philosophy only flourishes when an idea is shared, tested, refined, and put into practice. The manner in which this discourse and implementation takes shape is varied and veiled, but it is very real, even today.

The ideas and themes in popular philosophy pervade every area of our society, especially in the United States of America. They are boiled down to aphorisms and images and spread like a plague or meme through the cultural ether. I say “especially in America” as our nation was founded on a social experiment derived from the popular philosophies of the time (social contract theory), and that is a tradition that has continued for two centuries. Those that participate in the creation and sharing of art in society play a crucial role in the spread of these ideas.

Literature has been a long-suffering companion to philosophy. As far back as Homer and Gilgamesh, we see philosophical themes and musings riddle the characters and narratives of a culture. In more modern times, with the rise of the printing press, we saw an emergence of overtly philosophical fiction and some less overtly philosophical fiction. There was such literature before the press, to be sure; just look at the classics. However, I find it unlikely that “Candide” or “Thus Spake Zarathustra” would have lasted the way the “Iliad” or “The Divine Comedy” have in the absence of the press. Even popular works of both fiction and nonfiction, whether intentionally or not, are rife with philosophical commitments.

These commitments are equally prevalent in film. While film is a fairly recent advancement in technology, it shares a common lineage with literature: we can easily trace its heritage from screenplay to stage play to the oral traditions which stand as the forebears of ancient literature. For the sake of this discussion, I will consider video games and television shows as film, as their storytelling devices and methods are more-or-less identical. In addition to the words and language used in literature, film also presents ideas and commitments through the visual medium: certain images or arrangements can, consciously or unconsciously, link certain ideologies and characters together. The same holds true for music, sculpture, painting, and any artistic or cultural endeavor, really, even dance.

Through public discourse and the permeation of cultural works, philosophy drives a society’s zeitgeist8. The uninterested or uneducated who participate in cultural events, from watching movies, to going to school, to being subjected to advertising, have their minds and views molded by the underlying philosophy. Through exposure and osmosis, ideas that were once held in contempt have become mainstream and vice versa. This is the natural cycle of philosophy, and it is always made possible by the liberty of the minds of true philosophers. Even if the zeitgeist demands that the world be one way or another, free thinkers are always at liberty to pursue the truth and share that quest with others through discourse.

Philosophical Schools, the Good and the Bad

Philosophies, taken in their historical and cultural context, ultimately tend to land in two categories: that of “the man” and that of “the rebel”. Whatever cultural or institutional norm a culture may have, it exists because of the philosophers who brought those concepts to light and shared them via the public discourse. Those ideas that find favor with the ruling class or establishment naturally become the driving force of a society or state. Those ideas which are newer and less conformist tend to become popular amongst the counter-culture. It is important to note: this observation does not lend any judgment to the truth value of any one idea or another, simply to its cultural impetus. It is the duty of the free-thinking philosopher to sort through these ideas, regardless of the cultural context, and to ascertain the objective truth value of each respective idea. This often makes his philosophy unpalatable to both “the man” and his reactionaries. (C’est la vie.)

This cultural presence and impetus of popular ideas is revealed in every cultural work. From little nuances in color choice, sentence structure, and musical tonality, to overt themes and statements, certain ideologies become manifest to an audience. Some of these manifestations are analytical and conscious; others are more insidious or subconscious. The two most prominent contemporary examples are in the mainstream news and popular film, where phrasing and imagery are specifically designed to impart a worldview and philosophy on the unwashed masses.

It is no mistake or coincidence that the more authoritarian a state becomes, the more strictly social discourse and cultural works are censored. It is always in the best interest of the establishment to engender in its subjects conformity of thought and philosophy. The most intuitive and frequently used methods toward that end are limiting the subjects of discourse and subverting the thoughts of the masses. I believe that now, like at any other time in history, the people of the world are having their thoughts and philosophies subverted and censored by the social and political establishments around the globe. An easy example of this phenomenon would be the blind adherence to material reductionism, Neo-Darwinism, and cultural relativism which is strictly enforced in academia as well as by societal pressure, despite the lack of compelling rational evidence to support any of the three.

It is possible, however, that the prevalence of “bad philosophy” in popular culture is less a conspiracy of idiocy and more the benign zeitgeist of an uneducated time. Regardless of whether it is intentional or incidental, there is a silver lining in this situation. Philosophy, when misused, can be a powerful tool for subjugation, but it is also, by its fundamental nature, liberating. Philosophy, as the pursuit of truth by rational means, necessarily drives its earnest adherents to freedom. By questioning the reasoning behind the social structures and institutional norms one encounters, one comes to understand where the truth lies and liberates oneself from the lies perpetuated by a society devoid of reason. Because of this, we see a dichotomy emerge: popular culture and its discontents. Now, this doesn’t mean that philosophers cannot enjoy and partake in the fruits of popular culture; it simply means that one ought to be aware of what is being imparted upon oneself, especially when there is a surplus of material available.

In reality, there are several misconceptions and misused concepts in the realm of contemporary philosophy. One popular misconception concerning philosophy and intellectualism is that it is a domain primarily inhabited by out-of-touch nerds arguing about stupid questions. “Which would win in a skirmish, the Enterprise or the Executor?” While the answer is obvious after a short bit of reflection (the Enterprise), it is a dilemma that only a specific and small demographic will ever face, and a question of questionable practical significance. I have witnessed in both the media and the general public a rising belief that those who contemplate such questions are to be considered intellectual and philosophical, at the expense of those who are actually deserving of the titles.

Of course, those who are deserving of the title have long been plagued by equally absurd-sounding puzzles. “When removing stones from a pile of stones, at which point is it no longer a pile?” While the answer may appear obvious to a mathematician or engineer (the pile is a designated set; it remains a pile even if there are no units in the set), this puzzle, the ancient sorites paradox, has far-reaching implications for the way man thinks and knows, or in other words, for the realm of epistemology.
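For the reader who has never met this puzzle formally, its structure can be written out in a few lines of standard notation (this formalization is my illustrative gloss, not part of the theses): each premise seems individually undeniable, yet together they force an absurd conclusion.

```latex
% A standard formalization of the sorites ("heap") paradox.
% Let Pile(n) mean "n stones arranged together constitute a pile."
\begin{align*}
&\text{Premise 1: } \mathit{Pile}(10000) && \text{(ten thousand stones are surely a pile)} \\
&\text{Premise 2: } \forall n \, \bigl( \mathit{Pile}(n) \rightarrow \mathit{Pile}(n-1) \bigr) && \text{(removing one stone never unmakes a pile)} \\
&\text{Conclusion: } \mathit{Pile}(0) && \text{(by repeated application: zero stones are a pile)}
\end{align*}
```

The paradox lies in deciding which premise to reject, and why; that decision is an epistemological one, not a mathematical one.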

Without philosophy, man would lack a crucial tool of introspection and rationality. The very question “What is knowledge?” does not have a satisfactorily categorical answer. Through our pursuits in philosophy, man has made great strides in addressing such fundamental questions, which have evolved from “What is justice?” to “How can I be certain I exist?” and now to a wider, more complex assortment of queries. The fact remains, we must always ask, “How do I know this?”

These questions form our culture and our ethos. Or, rather, the pursuit of answers to this class of questions drives the popular zeitgeist. Even banal entertainment, like prime-time television and late-night talk shows, touches on the questions which plague all sentient beings. “Why am I here?”, “Why am I unhappy?”, and “What’s for lunch?”9 are all questions which people are desperately trying to answer, whether they are aware of it or not. Philosophy attempts to codify and rationalize the pursuit of these answers and to make it accessible to our contemporaries and future generations, not only for our own sakes, but for the sake of man as well. It frequently does so by taking our common assumptions and putting them to the test.

In each age and culture, certain ideas become popular and omnipresent: polytheism in ancient Greece, for example, or Christianity in 13th-century Europe, or social Darwinism in the early 20th century. As these examples show, many of the common assumptions of a time fall by the wayside as a culture’s awareness evolves. In the words of Pascal: “Whatever the weight of antiquity, truth should always have the advantage, even when newly discovered, since it is always older than every opinion men have held about it, and only ignorance of its nature could [cause one to] imagine it began to be at the time it began to be known.”10 Some of those changes are for the better (the shift from superstition to reason) and some for the worse (the social ideology which fostered Nazism). In the long run, however, philosophy always allows the individual and his culture to learn from the past. Typically, though (as I indicated above), this puts the individual at odds with his culture until the culture can catch up with him. This is often what makes the more notable philosophers those considered nonconformist.

A popular postmodern mindset in today’s philosophical landscape has attempted to generate that notoriety artificially, through non-conformity with philosophy itself. What I mean is, its adherents attempt to protest even philosophy as such. This is a trend which began in the enlightenment and found its perfection in the existentialist movement. Where enlightenment philosophers tended either to decry the philosophical mindset as some form of mental illness or to feel the need to announce that philosophy isn’t a “real” science, existentialists were (and are) wont to denounce not just the rationale of philosophy, but the very existence of logic altogether.

Absurdity is, fundamentally, a denial or violation of the principle of non-contradiction: asserting that something both is and is not in the same mode at the same time. Absurdism is a whole realm of postmodern philosophy in which one, such as Jean-Paul Sartre, attempts to use the tools of philosophy without following the rules of logic. While such attempts are entertaining and mind-expanding, they are just as the name says: absurd. As the 95 Theses (like all philosophy) assume the existence and necessity of logic and rationality, this treatment of absurdism will be short and off-handed. Even so, Sartre, Camus, and other existentialists manage to contribute observations and arguments of value to those pursuing truth. I hope, in other works, to address the good and the bad of absurdist philosophy, but not today. All of this will be outlined explicitly in the theses themselves11; this overview merely serves to prepare a novice for the oncoming vocabulary contained in this work.
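For the uninitiated, the principle that the absurdist denies can be stated in a single line of standard propositional notation (again, my gloss for the novice, not a formula drawn from the theses):

```latex
% The principle of non-contradiction (PNC):
% for any proposition P, it is not the case that
% P and not-P both hold in the same mode at the same time.
\neg \left( P \land \neg P \right)
```

Everything that follows concerning nihilism and absurdism turns on whether one grants this single line.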

Nihilism is not a new concept in philosophy, but it has recently found a surge in popularity after the World War and all of its continuations. It is tempting to deny the existence of meaning when witnessing the most inhumane behaviors perpetrated by man. “What is the meaning in millions of men killed by other men?” can easily become “What is the meaning?” However, for a being capable of asking such a question, the answer literally precedes the question: if one is able to ask whether or not something has meaning, there is, at a minimum, the meaningful production of that question. The absurdist looks no further than the mind of the inquirer, asserting that the inquirer/philosopher must give meaning to an otherwise meaningless world (and ultimately violating the principle of non-contradiction to do so). In this way, nihilism, in using meaningful discourse to establish that there is no meaning besides the absurd, is itself absurd. The philosopher, by contrast, asks, “From whence does that desire for meaning come?”

In order to make sense of the universe at large, philosophy must be logical, taking the evidence available to the philosopher and arranging it into a coherent narrative which is both satisfying and capable of producing utility and accurate predictions of cosmic behavior. From the fact that our minds and our philosophical endeavors operate in such a way, and the fact that they are successful in doing so, we conclude that the universe itself must follow a form of logic. While the human intellect may be limited to codifying and adapting a series of laws to describe the universe’s behavior, distinct from that behavior itself, the universe’s behavior is quite clearly consistent and logical, regardless of our perception of it.

This, of course, brings us to the subject of relativism. Relativism, in all but its softest forms, asserts and assumes the absence of objective existence, whether in the form of moral reality or of physical or ontological reality. Moral relativism and its twin, cultural relativism, assert that, because of the diversity of contradicting perceptions of ethical truth, there can be no absolute moral truth. Naïve relativism follows this form of logic to its inevitable conclusion: anything about which contradictory observations or beliefs can be held does not exist objectively; therefore reality itself does not objectively exist. While some form of scientific study is at times used in an attempt to justify such an assertion, it is typically an extreme reaction to scientism.

As objectionable as relativism is, it is at least identifiable and easily refuted. Scientism, however, is a beast of a different nature. Scientism is a strict adherence to the scientific method predicated on the philosophy of materialism: a union of empirical positivism and material reductivism. Anything not immediately falsifiable12 is of no consequence and ought to be done away with. Not all elements of scientism are bad (and I say this as a former adherent of scientism); a strict adherence to the methods of reason and empirical observation is what has elevated the school of physics to become the driving force of modern society it is today.

In recent centuries, most noticeably the twentieth, there was a sudden surge in scientific thought and progress across the civilized world. Innumerable factors contributed to this phenomenon and, thankfully, I have no intention of going into detail concerning them. At the moment, I am far more concerned with the fruits of this technological renaissance than with its causes. In the nineteenth century, the perpetual swell of knowledge and increasing standards of living appeared to be infinitely sustainable. This led to an optimism in the whole of society, but most especially in philosophy and its constituent sciences.

Confidence in science’s ability to cure all of humanity’s ills was joined by a popular trend in science known as reductionism. It was widely believed that science’s messianic qualities were a result of its perceived ability to reduce the most complex psychological or biological ailments to some simple alchemical formula (female hysteria and electroshock therapy come to mind), and that even the darkest and most troubling metaphysical questions could be exorcized with a simple application of mystical scientific hand-waving. Reductionism isn’t a modern invention; even the pre-Socratics strove to reduce all things to one atomic principle (the world is air/water/fire/flux/love/whatever), but never before was it so widespread and influential as during the rise of modernism and postmodernism.

Unfortunately, in all their excitement over the leaps and bounds that were being made in their discoveries, true scientists (those who study the physical sciences) became “scientists” (those who adhere to the philosophy of scientism). Subsequently, some bad science was introduced into the realm of scientism without sufficient criticism. A handful of non-falsifiable theories, like Neo-Darwinism and String Theory, have managed to masquerade their way into the cult of scientism and are now defended with a fervor and blindness rivaled only by the most ridiculous of religions. While it is not currently my goal to write a full-fledged indictment of scientism and other instances of bad science, I am compelled at least to demonstrate that materialism is insufficient, and to direct my readers to a work that more than completely shows that materialism and Neo-Darwinism are incomplete and illogical worldviews13. Just as with misguided science, many are equally prone to jihad in favor of bad philosophy (i.e. relativism and consequentialism14). Some of these people have legitimate excuses for doing so (public education and the demographics of their upbringing come to mind); ultimately, their excuses can be reduced to the defense, “I didn’t know any better.” Some despicable men, however, are quite aware of the logical fallacies they commit in the name of furthering an agenda contrary to the pursuit of Truth.

Sophists, since ancient Greece, have always profited from making defenses of the indefensible, whether for the acquisition of wealth or the silencing of their own consciences. Whenever an ill-informed or malignant trend emerges in a culture, it is certain that some sophist or another will emerge from the woodwork to champion it. Unfortunately for true philosophers, most sophists find their roots in philosophy and academia. This is unfortunate because, to the unwashed, the sophist and the philosopher are indistinguishable from one another, save that the sophist defends the fulfillment of base desires while the philosopher demands intellectual rigor and consistency. These sophists were the enemy of the ancients and are the enemy of philosophy today. As certain historians (like Cicero) have noted, there has been a noticeable trend of cultures falling for sophistry not long before their demise. In our modern culture, we see popular philosophy dominated by sophistry and intellectual vacuity. In academic philosophy, it would appear that a certain apathy toward the common man and common culture has gripped the hearts of philosophers as they discuss the impractical and esoteric. Worse, though, than the philosopher turned sophist is the celebrity or lawyer turned “philosopher”. Lawyers are paid to play by the rules and obfuscate the truth; celebrities are paid because they make people feel good. Both of these careers are antithetical to the pursuit of truth. When one who makes a career of pursuing personal interest (whether his own or his clients’) turns his attention to announcing some ethical, social, scientific, or really any intellectual claim, he ought to be met with close scrutiny. An example which has plagued America (and the world) in recent years is the Hollywood zeitgeist of celebrities loudly and aggressively endorsing the political ideologies of the radical left. While these endorsements ought to be received skeptically, we have instead seen a widespread voice of agreement in the public forum. This is no different from the phenomenon observed by historians of bygone empires and cultures.

The same cult of irresponsibility and self-promotion in both popular culture and academia that existed in ancient Athens still plagues true philosophers today. At times, given the ascetic15 nature of the philosophical disciplines, it can be incredibly tempting to compromise one’s integrity for the sake of wealth or popularity which a philosopher would never see otherwise. Additionally, even without being aware of it, it is common to confuse one’s ideas with one’s self, which leads one to take justified criticism poorly and leaves no room for improvement and correction of ideas. When one is more concerned with being well-liked or turning a profit than with a genuine, loving pursuit of wisdom and truth, it can only end badly.

As Socrates is credited with saying (more likely a paraphrase of his entire body of work), “The unexamined life is not worth living.” In order to successfully achieve eudaemonia16 or Truth, one must be vigilant and develop the ability to accurately assess one’s self. As will be expressed in the theses, one’s experience and examination of that experience are fundamental to one’s understanding of the universe and subsequent actions. Additionally, since eudaemonia and truth are the goals of the philosopher, it is clear that any philosopher and, truly, every man must live an examined life.

Now, this is not to say that every man must so thoroughly analyze and examine every atomic facet of his life in perpetual stoic apatheia. In fact, the reality is quite the opposite. While the philosopher must develop a categorical and pervasive habit of self-assessment, this could be crippling in other endeavors. Some men are simply incapable of this degree of introspection and others live in an environment which disallows such behavior. Even these men, though, can and ought to engage in what could rightly be called a “partially examined life”17: a lifestyle in which one at least routinely examines one’s conscience and actions. Training in and awareness of philosophy are invaluable tools in such an endeavor.

After all, our definition of philosophy clearly illustrates that philosophy is universally applicable. In clearly defining how the universe operates and why, as well as exploring what our actions must be in any given circumstance, philosophy establishes itself as the prime candidate to be the very center of culture and individual lives.

Through careful examination of one’s self and of the universe at large, one can come to an understanding of what one needs in order to acquire self-fulfillment. The desire for self-fulfillment is already the driving force behind culture. In developing and advancing the understanding required to achieve self-fulfillment, one contributes to the formation of a culture of self-fulfillment. This culture, informed by philosophy, would be a haven for those seeking eudaimonia.

As the centerpiece of ancient Greek culture and subsequently of philosophy, eudaimonia deserves a more thorough examination and definition. While it is alluded to in the 95 Theses, it may not get the fullest treatment it deserves. It then falls on the introduction here to give at least a high-altitude explanation with which to work. Eudaimonia as it is used here and in the theses can most easily be described as “the freedom to excel”. This means not only the presence of the mental faculties required to conceptualize and pursue excellence, but also the material and metaphysical circumstances required. In truth, I believe that this has always been the pursuit of man: to live in a culture of eudaimonia.

Philosophy: a Brief Genealogy

Regardless of which narrative one adheres to concerning the origins of man, there are certain circumstances which must have occurred at some point. While the beginnings of just such a narrative exist in the theses, I will attempt to imagine the worst-case scenario for the point I am attempting to illustrate. That point is that philosophy has existed from the inception of the human race. With the emergence or creation of the first man, whether he was a mutated member of an ancestor race or created fully formed from the dirt by the very hand of God, his was the unique responsibility of siring the human race. While language and conceptualization may not be required in order to find a mate, they could certainly help. However, from the birth of the first progeny of man, communication and conceptualization become necessary for the continuation of the species. In order for her offspring to survive long enough to fulfill their duty to the species, our Eve must be able to express the concepts necessary for survival. Even if one assumes that genetics supplied her offspring with instincts concerning fight-or-flight responses or aversions to creepy-crawlies that could be harmful, these would be insufficient for teaching, “This mushroom is bad,” or, “This is how you kill a boar,” when those are one-chance circumstances which drastically impact survival.

It is clearly in the best interest of humanity’s survival to build on and diversify the material each generation inherits. “This mushroom is bad,” can only take one so far; it certainly does not place one at the top of the food chain. However, inquiry, discovery, and purpose can drive a nomadic people, scratching a meager sustenance from the earth, to ever greater achievements. I may not be able to kill a bear in hand-to-hand combat (I have never had the chance to try), but I don’t have to. By virtue of the utility of philosophy (and its constituent physical sciences), I live in an environment which is naturally repulsive to bears (though, in the instance of this region, the case was quite the opposite until recently); as added protection, though, I have many tools at my disposal, not the least of which is my Mosin–Nagant.

Aside from mere survival, though, philosophy also provides mankind with an awareness of purpose and ethics which provides far more utility and impetus than survival alone, especially once the requirements for survival are met. In the pursuit of eudaimonia, we can imagine a genealogy of thought moving from, “This mushroom is bad,” to, “Why is this mushroom bad?” to, “Why is?”, with as many intermediary steps as one could imagine. Alongside this line of reasoning, we also see a diversification of material, branching from mere survival and pagan “gods of the gaps” into physics (including biology, astronomy/astrology, chemistry/alchemy, etc.), metaphysics, epistemology, theology, and so on.

While all these endeavors are oriented toward one end (the creation of an internally consistent, logically sound, empirically viable, and universal worldview which possesses ethical agency, utility, and, ultimately, Truth), they are sufficiently detailed and esoteric that one could spend one’s entire life in devotion to one small element of a particular area of philosophy. This should not, however, be used as a justification for skepticism18, as it would only serve as justification if philosophy were a solitary venture. Philosophy, by its nature, is collaborative. Each area of philosophy, no matter how distinct from another in focus and subject, bears at a minimum a holistic relationship to the others. In the same way that each area of study collaborates with the others, so too must individual philosophers. This relationship among the areas of study is due, in part, to their common material and practical significance; each area of philosophy informs the others and serves as a check against fallacious reasoning.

Being a human endeavor, philosophy finds itself the victim of human error quite frequently. As optimistic and teleological as my views are concerning this endeavor, I am not ignorant of the inherent shortcomings and roadblocks it faces. I fully expect that, even in the case of my own contributions, I will find myself (many years from now) arguing against the very assertions I make in this work. These shortcomings often lead to the development of dead-ends and half-truths. Some of these are quite speedily identified and handily defeated (like geocentrism), but many others are quite bothersome. Concepts which are rooted in truth or bear tangential resemblances to the truth often mislead the philosophical discourse. One need only look as far as Epicurus’ problem of evil and its subsequent resolution, or Puritanism, or the Copenhagen Interpretation, or Marxism to see what kind of damage can be done by philosophy run awry. These mistakes, as damaging as they may be, will ultimately become footnotes in philosophy as failed experiments, as the utility of accurate reasoning becomes apparent and the march of the true philosopher continues unabated.

As the definition I am using for philosophy states, philosophy is an ongoing pursuit of truth (or, the Truth). All legitimate philosophers have, at one point or another, made a categorical assertion regarding truth. Even most faux philosophers make categorical assertions regarding truth, even if that assertion is a naive and misguided utterance of, “There is no truth.” While I do not necessarily believe that the “end of philosophy” has some metaphysical role to play in directing philosophy or that it may be attainable in this world, I do believe that the simple utility of truth allows and encourages “those who have eyes to see” to be diligent in selecting the philosophies to which they subscribe. This “natural selection” of memes will, naturally, lead toward the end of philosophy. I know this sounds quite similar to the Darwinist narrative which I rejected mere pages ago, and it should, as there are some good ideas buried amidst the bad science. The survival of the fittest, as Herbert Spencer is credited with having formulated it, is one such concept.

Such memes as survival of the fittest are a prime contemporary example of how philosophical concepts tend to simply be a part of the atmosphere in which society functions. Most everyone has heard that phrase in one memorable context or another, even if they have no idea or a misconceived notion of what it means. In the case of philosophical culture, or rather the culture of philosophers, far more obscure and odd concepts are part of the atmosphere. In this way, a well-read and intelligent philosopher may breathe in Descartes, Scholasticus, Nietzsche, and Groothuis in order to utter forth a synthesis of these elements unique unto himself, even if it is identical to another’s work.

What utterance do I have to make? What can one such as myself bring to the banquet table of philosophy? I desire to partake of the feast about which those before me have written, but what can I do to pay admission? As will be clear to those who will bother to read these Theses, I am not yet sure, but I hope to one day have applied myself thoroughly enough to this, my vocation, so as to be worthy to touch the garment of lady philosophy.

This work, itself, is an attempt to codify my existing ideas in a format suitable for public development and critique. Philosophy is, by its nature, discursive and social. I could not rightly call myself a philosopher if I were merely to wonder at the cosmos. Only if I were to share my wonder with others and argue my way to the truth alongside my companions would I be worthy of such a name. This is the first of a thousand steps towards the banquet for which I was created. I hope to bring along as many as can come with me to sing the praises of the Grand Architect of such a marvel as creation.

All I can rightly ask of philosophy and of those philosophers who would aid me in this journey would be that I contribute one more voice to this chorus as old as man: to be heard and considered by others, to have what truth I can find be perpetuated while my own shortcomings be disregarded. A lesson I have learned from Ayn Rand: to be considered sophomoric and redundant is still, at least, to be considered. If I could rightly ask more, however, I would ask that I be granted a personal fulfillment of my unslakable thirst for answers.

Hopefully, I can play an integral role in this chorus, can make an impact. I want to bring the practice of true philosophy back from the grave that enlightenment dug, existentialism filled, and postmodernism hid in the woods. The death of god19 was less a death of god and more the abortion of philosophy. I want to aid in the restoration of Lady Philosophy to her former glory, to clothe her once again in dignity and honor, and to bring her back to the common people, not as an object of rape, but of royalty. This novitiate book is the inauspicious beginning of such a daunting career choice.

95 Theses

1“Discourse on the Method of Rightly Conducting One’s Reason and of Seeking Truth in the Sciences” Pt. 2

2Self-evident and deductively reasoned

3Chapter 1: Epistemic Assumptions

4Hegel, Encyclopaedia of the Philosophical Sciences p10

5Chapter 5: Teleology?

6Also Ch 5

7“Leisure: The Basis of Culture” p110

8German: “Spirit of the times”

9“Time is an illusion, lunchtime doubly so.” Douglas Adams

10Groothuis, On Pascal (Stamford: Thomson Learning, 2003), 202

11Chapter 5

12 a theory resulting in an empirically verifiable prediction which, if inaccurate, determines that the theory is wrong

13Groothuis “Christian Apologetics” chapter 13

14An ethical school of thought which argues that the result of an action determines the ethical quality of said action

15Self-disciplinary and abstinent

16Flourishing and fulfillment

17 A phrase that is certainly as old as the Socrates quote from before, but never better implemented than as by the people on the Partially Examined Life podcast: http://www.partiallyexaminedlife.com/

18 disbelief that it is possible for one to obtain truth or knowledge of the truth

19Nietzsche used the phrase “god is dead” quite frequently. Most notable of which is his parable of the madman from “The Gay Science” book three.

NonViolent Communication

About a year ago, I read “Nonviolent Communication: A Language of Life” (2nd Edition). I was very resistant to giving NVC a chance. My introduction to it was some people on Free Talk Live talking about it, and it sounded like some sort of cult-y, Scientology-like notion: “if we all learn to pray and talk with hippie vibes, the world will be healed.” Hearing about it from Stefan Molyneux next sealed the deal (he is a de-facto cult leader). Satya Nadella made this book required reading for Microsoft execs, which made me wonder if it was becoming a mainstream fad and made me even more resistant to the idea. Also, the name itself seemed off-putting to me. I figured (and still do) that any language that doesn’t consist of veiled or direct threats is, by default, non-violent.

Then certain people, whose opinions and the degree of thought behind them I always respect even when I disagree, re-introduced me to the idea of NVC. After Brian Sovryn explained that it has less to do with non-violence and more to do with empathy, I started to reconsider. Seeing Adam Kokesh put it to work on Christopher Cantwell, of all people, sealed the deal. I saw the way that Kokesh (someone whom I’ve always been suspicious of) managed to basically shut down the angry part of Cantwell’s brain and get a begrudging admission that NVC may be an effective tool. I was still very, very suspicious of the whole idea in general, but I knew I had to at least research it before dismissing it.

I bought the book on Amazon for something like $15 and read it in a few weeks, taking it a few pages at a time. The book is easy to read, short and sweet, and gives actionable suggestions. While the methods of NVC aren’t useful in every circumstance (philosophical discourse, for instance), they are incredibly effective at smoothing out day-to-day interactions with people, especially adversarial people. I am, by no means, a peaceful parent, but I’m looking into that as well. I can say this much, though: after giving NVC a shot, I’ve gotten incredible results with my middle child. It used to seem like her sole purpose in life was to antagonize me, but we’re making excellent progress in getting along, thanks to Rosenberg.

The way I understand NVC to operate is thus:
We, in our culture today, are addicted to counter-productive emotions. We have developed a habit of being outraged at things. The internet has proven instrumental in fueling this addiction to outrage, as there’s always something out there for anyone to be mad at. Addictions work in cycles: stimulus, reaction, dopamine/adrenaline/etc., brain-drugs wear off, repeat. In the case of outrage, something touches on an unresolved need or desire within us, we get mad and lash out at whoever or whatever touched that nerve, we get a release of feel-good drugs in our brains, we feel good about being miserable, and we repeat ad infinitum. What NVC seems to do is interject itself between the stimulus and the reaction, closing that loop prematurely. This is how addictions are broken, how good habits are formed, and how someone can talk down a 280 lb thug before getting their face punched in.
It is also a method of communicating that, in closing that loop prematurely, leads people into uncharted areas of their own mental experience and opens them up to actually exploring alternative ways of seeing the world, which is useful when discussing crucial matters such as human flourishing.

As it stands now, I understand NVC in an almost entirely scholastic sense, but my early efforts at putting it into practice have already made family and work far more manageable. I recommend everyone read this book. I don’t think it’s some sort of silver-bullet to eliminating the state, as some do, but I do believe that this is a tool set that is irreplaceable if one wants to flourish in a post-state society.


Admittedly, the metaphysics in the book is very kludgy, but that’s to be expected. Ignoring the metaphysics and treating the work as a rhetorical tool seems much more efficacious and fits well with other practices in rhetoric, such as the Trivium.

Moral Ambiguity

The time has already come for another dose of procedural philosophy.

As is always the case with procedural philosophy, some homework is in order. If you want to get the most out of this post, you should read or listen to the post on “Paradigmatic Awareness”. Today, we are talking about ethics directly, as opposed to the usual posts about how ethics impacts our relationships. Ethics, like all terms, requires a shared definition in order to be useful.

Ethics is the study of principles which dictate the actions of rational actors. Some will note that this closely parallels some people’s definition of economics. This is not an accident, but this phenomenon will have to be addressed later. There is a glut of ethical theories which assume different premises and result in wildly different prescriptions. This is a problem for an individual who is genuinely concerned with pursuing an absolute truth by which to live. Being one such person, I must admit I’m still searching; but I can help others make it as far as I have and ask others to do the same for me.

“But wait, ain’t you one o’ dem Catholic fellers?” Yes, I am. The Church has a pretty solid grasp on its doctrine and dogma (of which there is surprisingly little) and has built an ethics on top of that, something akin to a divine-law-meets-metaphysical-utilitarianism to which it appeals in every ethical discussion. One will notice that I do not advocate a moral stance which violates the doctrinal positions of the Church. I am fortunate that my quest for the truth has not yet forced me to choose between my own faculty of reason and the divine law of my faith. One will also notice that I staunchly oppose certain modern positions of the Church, especially in cases surrounding the “divine right of kings” and compromise with injustice, such as, “You have to pay taxes, because of the politically expedient manner in which we interpret the ‘Epistle to Diognetus’, a letter written thousands of years ago.” (CCC 2240) What I am trying to say is that “God said so” is never sufficient justification for one’s actions, but what God said may nonetheless be rationally justifiable.

That tangent segues nicely to where we are going today. Ethics operates identically to the method outlined in “Paradigmatic Awareness” in many ways, with some variation. As the numerous postmodern moral nihilists are wont to point out, ethics faces an important problem: the is/ought divide. This problem, popularized by Hume, essentially points out that objective material knowledge of what is does not give rise to ethical prescription without first approaching what is with a subjective value assessment, an ought. This is where the procedure outlined in “Paradigmatic Awareness” becomes crucial.

Simply put, I must determine, by way of intuition and abduction, from what is to what I (should) value. Ultimately, anything could conceivably be the basis of ethical reasoning; hedonism, consequentialism, stoicism, legalism, virtue ethics, divine law, statism, nihilism, and anarchism are all predicated on different values and represent a fraction of existing ethical frameworks. Many are compatible with each other; as a matter of fact, most ethical frameworks are ultimately either nihilist or teleological in nature and tend to complement others of the same nature.

Ethics, really, is the ultimate product of philosophy. Philosophy can answer any question, “How did the universe come to be?” “What is it made of?” “How can we know anything?”, but without answering “Why should I care?” it has no real utility. I propose that the best answer to “Why should I care?” is “because, if this worldview is factually true, you ought to do X and here is why.”

Of course, an ethics which is too esoteric or complex for common application and immediate results is as equally useless as a philosophy with no ethics whatsoever. This is where rules become attractive; “thou shalt not” and “always do” are certainly the result of most or all ethics. For instance, if I were a Kantian (I am NOT), I would value the rationality and identity of individuals, which results in the mandate that people be ever treated as ends only and never means; followed to its logical conclusion, one could say, “Thou shalt not enslave others.” Those that lack the faculties or resources to consider the corpus of Kant (a waste of time, really) can simply rely on the rules which fall out of his work. Without an understanding for the cause of these rules, though, one cannot reliably improvise in a circumstance not outlined in the rules, nor can they discuss ethical matters in an intelligible way. “You can’t do that, because this book said so” is a laughable claim, regardless of the book in question.

Everyone considers themselves an intelligent person and feels themselves to be very ethically-minded. They are correct in thinking and feeling so. Even psychopaths have a set of motivating factors for behaving in the way that they do. However, such a set of motivations, even in the form of a rule-set, does not qualify as an ethical framework. As a matter of fact, if one does not pursue the full rational grounding of one’s motivations, one will likely adopt a heterogeneous hodgepodge of contradictory rules from various sources. Any ethical claim which feels intuitive or justifies an action one desires can be easily adopted and, with a little mental gymnastics, incorporated into one’s rule set without too much apparent contradiction.

This results in an emotional minefield scattered with beliefs such as, “I value property rights above all else, so we have to steal from people to prevent theft.” All one needs to do is go on the internet and read the intellectually toxic political arguments found in nearly every comments section and they will see what I am talking about. The problem is not the argument or even the belief held (though, by definition, nearly every political belief is wrong), but instead the lack of paradigmatic awareness. If someone lacks the foundational knowledge of what is, a clear definition of one’s values, or a grasp of logic sufficient to put it all together, it is impossible to assess others’ claims or to sufficiently convey one’s own belief. Instead, such people (regardless of whether one’s claim is factual or not) are forced to resort to dismissive name-calling and an arsenal of rhetorical and formal fallacies.

So, then, the same prescription in “Paradigmatic Awareness” applies in ethics as well. When confronted with a radical and apparently nonsensical claim such as, “You have a duty to vote, even if it is merely a choice between two evils,” it is important to inquire as to the value and basis for such a claim. Conversely, when meeting resistance to a claim one has forwarded, it is crucial to present the premises and method used to reach the contested claim, lest one look no different than a generic social justice warrior or fundamentalist republican.

Also, just like with paradigmatic awareness, if someone is not willing or able to have a calm rational discourse, they are not providing an opportunity for critical thought. They are wasting everyone’s time. One’s time is better spent writing blog posts no one will read, reading books, or smashing one’s face in with a hammer rather than getting into a shouting match with a morally illiterate person. The goal, as is the case with all of philosophy, is pursuing truth; one cannot do so while stooping to the level of the ignorant. However, if one pursuing truth happens to bring others along, all the better.

Ultimately, my motivation for writing this post is twofold. I want to invite people to critically assess this approach and help me do a better job of understanding how I ought to live my life. I also want to find someone, anyone, who can play by the rules I’ve outlined and believe to be absolutely crucial to communication and progress. I honestly desire for someone to prove me wrong. The ethic that I have managed to cobble together over the last twenty years is incredibly taxing. I would love to (re)apply for welfare, to stop going to church, to stop trying and start partying… but I can’t. My rationality and what little virtue I do possess prevent me from doing so. I think I could do well as a Fascist (which I believe to be the only logically consistent alternative to anarchy), but no one has proven me wrong yet, so as to grant me the opportunity to try my hand at it.

Remember, despite the immense and demonstrable utility that it provides, anarchism is a moral philosophy. It holds the utmost value for human rights and, as a result, human flourishing. When an anarchist says “you shouldn’t do that,” they aren’t forcing someone else to behave in a manner consistent with their opinion. Anarchists cannot point a gun at someone and demand that they refrain from doing so, nor can they vote and delegate that task to someone else.

TL;DR: If someone wants the privilege of criticizing the actions and ethics of others, they ought to put in the work of critically assessing their own position and actions. If people cannot communicate the reasons for the rules they are so wont to broadcast, they are wasting everyone’s time.

Paradigmatic Awareness

Why can’t we all just get along? When it comes to discussion, why can’t we seem to understand one another?

As is outlined extensively in my yet-unfinished book, epistemology (how we know what we know) is a field of intense and voluminous study. I will do my utmost to remain concise and direct today, but we will see if I can manage to get my point across.
Among thinking people, there is a disturbing trend of missing each other’s points and progressively resorting to name-calling and physical altercation. Friendships end, wars erupt, libraries are burned… all over a misunderstanding as to whether Star Trek: TOS is better or worse than J.J. Abrams’ reboot. This phenomenon is easy to see every four years in America, when just under half of the population suddenly erupts in closed-minded and aggressive rhetoric over which master we should be owned by and what behaviors we ought to compel with the violence of the state. For many people, this argument continues on a daily basis (Thanks, Obama).

Very, very rarely does one actually change one’s mind or realize that one was wrong. On the occasions that one does, it is rarely a result of dialogue but instead a result of a personal and concrete experience of one’s worldview and reality not comporting. This sort of event is at the heart of every popular feel-good drama about a grouchy old person overcoming his racism. My purely subjective standard for judging a philosopher’s ability to philosophize is their willingness and ability to change their mind and admit error by way of dialogue as opposed to concrete experience.

While very few people may be called to be philosophers, everyone ought to be capable and willing to do philosophy, lest they be vulnerable to misanthropy, self-dehumanization, and falling for vicious and criminal ideologies. What is required in order to do philosophy? There is a multitude of tools required and yet another multitude of tools that are merely useful. The first two, the most fundamental and primary, of these tools are logic and paradigmatic awareness. Of course, one is a prerequisite for the other.

What is logic? Logic, contrary to popular belief, does not refer to “all of the not-emotional things that happen in my brain”. Logic is a science and an art as old as man’s pursuit of knowledge. As a science, its body of theories and research has been steadily growing through the generations. As an art, the technique and skill of those who wield it waxes and wanes with times and cultures. Logic is the place where language, reason, and objective observation meet. Logic, in its purest form, is the exploration of the principle of non-contradiction and its application to our experience of reality. The quest for knowledge requires a reliable and finely-tuned toolset. The study of logic, epistemology, and phenomenology has been directed towards the development of these tools since their inception.

Even though some high schools teach introductory classes on deductive symbolic logic and may touch on inductive reasoning, logic has been widely abandoned by our education system and, by extension, society at large. Without a working knowledge of and praxis concerning deduction, induction, abduction, and the interrelationship of the three, one cannot be expected to be consistent in their beliefs, claims, and behaviors. Unfortunately, a blogcast of this length and quality is insufficient to teach such a skill. Fortunately, there is a vast body of material available on the internet for those that wish to be rational.

A grossly oversimplified and brief introduction of the three is required, though, before I can address paradigmatic awareness. Deduction, then, is described as “arguing from the general to the specific”. A classic, if not entirely reliable, example is the famous “all men are mortal” syllogism.
“All men are mortal. Socrates is a man. ∴ Socrates is mortal.”
In this case, it assumes general premises such as “all men are mortal” and uses the principle of non-contradiction to reach the conclusion, “Socrates is mortal.” So long as the premises are factual and there is no error in the logic, the conclusion must be true.
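For the programmers in the audience, the deductive pattern of the syllogism can be sketched in a few lines of Python. This is purely my own illustration (the sets, the function, and their names are invented for the sketch, not drawn from any logic library): treating “all men are mortal” as a subset relation makes the certainty of deduction visible, since the conclusion cannot fail while the premises hold.

```python
# Hypothetical extensions of "mortal" and "man" for illustration only.
mortals = {"Socrates", "Plato", "Fido"}
men = {"Socrates", "Plato"}

def is_mortal(name: str) -> bool:
    # Premise 1: all men are mortal, i.e. the set of men is a
    # subset of the set of mortals. If this premise fails, the
    # deduction cannot proceed at all.
    assert men <= mortals, "premise 'all men are mortal' does not hold"
    # Premise 2: the name belongs to the set of men. Given premise 1,
    # membership in `men` entails membership in `mortals`.
    return name in men

print(is_mortal("Socrates"))  # prints True
```

The point of the sketch is the `assert`: so long as the general premise is factual and the reasoning valid, the particular conclusion follows necessarily, which is exactly the character of deduction described above.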
Induction, in simple formulation, is arguing from specifics to the general. An example frequently addressed in modern philosophy is the claim, “The sun will rise tomorrow.” This claim is based on the consistency of such an occurrence in the past, as well as the absence of any predictors indicating that it would cease (for example, the sun vanishing would leave some pretty significant clues). Induction does not produce certainty in the way deduction can, but it does produce well-reasoned and reliable guesses which have a particular utility about them.

Abduction can be considered “making the strongest case”. When a question presents itself which requires an answer and neither a deductive nor an inductive argument is possible, one can produce an answer which does not contradict accepted deductive and inductive claims and is, itself, self-consistent. Using tools such as observation, Occam’s razor, intuition, and a detailed understanding of one’s paradigm (we’ll address this in a minute), one can make a compelling case as to why their chosen belief is true.

This brings us to the interrelation of the three. Because valid deductive reasoning produces certainty, one’s inductive claims cannot be allowed to contradict it. If one is committed to a particular inductive claim which is found in contradiction with deductive claims, they must first demonstrate a flaw in the premises or logic of the existing deductive claim. This same priority is given to induction over abduction for the same reasons.

Of course, this description ignores the source of the general premises with which this whole process began. In reality, premises are produced by abductive reasoning and ratified by the simple Popperian principle of trial and error. This means that, per Gödel, any complete philosophical worldview cannot prove itself to be factual. Only by comparing a worldview’s predictions and claims against one’s experience of reality, or by confirming the strength of the premises’ defense, can one ultimately justify any particular worldview.

This finally brings us to paradigmatic awareness. To those who have read this far: I salute you. Using a modified version of Thomas Kuhn’s definition of “paradigm”, a paradigm is the set of established or assumed claims which take priority before the claim in question, based on the rubric I briefly described when addressing logic. Why does something so simple-yet-esoteric matter? It may sound intuitive once described, but despite its intuitive qualities, very few (if any) people truly possess paradigmatic awareness.

For instance, when faced with a claim one may find absurd, such as “We need to tax every transaction possible in order to pay for government guns,” it is possible that the (clearly incorrect) individual may have a valid logical argument to reach that conclusion. More likely, they hold, either implicitly or explicitly, flawed premises from which they derived an absurd conclusion. There is really no point in discussing the conclusion itself so long as the premises are left unacknowledged and unaddressed. Communication simply isn’t possible without commonly accepted paradigms between communicants.

This is where the standard of being able to change one’s mind comes into play; in the process of exploring the premises held by someone else which resulted in an apparently absurd claim, three beneficial results may arise. In exploring the paradigm of someone else, you may bring to light counter-intuitive or implicit premises that your conversant may never have previously critically assessed. Additionally, it will give you the opportunity to cast doubt on another’s premises, allowing them the otherwise impossible moment of self-reflection. Lastly, of course, by holding a counter-factual presented by someone else, there is always a chance (however slim) that you may realize that you, yourself, are wrong.

Now, one cannot explore others’ worldviews without expecting the same intellectual courtesy in return. By following the advice given above and explaining what you are doing along the way, you can effectively provide an education in communication skills and logic that far exceeds the meager offerings most people are exposed to. This will give them a greater chance to entertain your correct but unpopular claims like, “Taxation is theft.” Additionally, anyone unwilling to explore their own premises or yours is clearly not interested in intellectually honest dialogue directed at obtaining truth and, therefore, is not worth your time or energy; a handy resource management tool, if you ask me.

So, why can’t we get along? Because no one is given the tools required to even consider getting along. Why can’t we understand what each other is saying? Because we don’t try hard enough. Remember, no unwilling student can learn; this includes yourself.

TL;DR: Listen to what people claim. Ask, “How did you reach that conclusion?” Make it a point to maintain an awareness of your opponent’s paradigm. Genuinely search for the truth in their words. Expect and demand that they reciprocate the effort, lest you waste both parties’ time and energy.
As I said on Facebook the other day (while re-realizing some flaws in the AnCap worldview):
I love being a philosopher. My worldview is constantly shifting and undulating… but always gradually comporting itself more closely to reality. Where fleeting moments of intuition can, decades later, be given meaning and purpose and carefully constructed arguments and justifications can crumble, there is where humility and virtue can grow. The fires of truth and the crucible of reason can lay bare natural and artificial landscapes of mind alike, and enrich the soil for new growth and the return of the most robust ideas to carry on their existence.

Surprise! Another Post

On rare occasion, I am surprised. Sometimes, it is something as mild as hearing a decent song on the radio. Other times it is something as extreme as finding scorpions in my hair. Yesterday, I was surprised to be inspired by an atheist podcast I listen to… so here’s what I was inspired to write about. Surprise can be unpleasant, hilarious, or any blend of the sensations in-between. What, exactly, is surprise? A neurobiologist with a higher IQ and worse social life than my own may be able to answer this question better, but I thought it was worth exploring.

I contend that surprise occurs when someone experiences a state of affairs contrary to their noetic framework. An easy example would be when an evil clown appears before you and you shit your pants in surprise.

The cause for surprise is not the clown itself, it is the experiential contradiction to one’s noetic framework. In this example, it is the implicit (or explicit) belief that one holds which states, “I live in a world in which evil clowns do not appear before me without warning,” being violated which causes surprise. Other common beliefs which are frequently upset could be, “this is the last step in a flight of stairs”, “you’ll love this joke”, or “my bed isn’t full of spiders”. That gut-wrenching shock occurs simply because those beliefs were incontrovertibly disproven.
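Incidentally, information theory formalizes something close to this intuition: the “surprisal” of an event is defined as the negative log of the probability one’s beliefs assigned to it, so the less an event was expected, the greater the shock. A small sketch (the probability values below are made up purely for illustration):

```python
import math

# Surprisal in bits: -log2(p). An event your beliefs deemed near-certain
# carries almost no surprise; a near-"impossible" one carries a great deal.
def surprisal_bits(probability):
    return -math.log2(probability)

print(surprisal_bits(0.99))  # "this is the last step in the stairs": tiny surprise
print(surprisal_bits(1e-9))  # "an evil clown just appeared": enormous surprise
```

This maps neatly onto the claim above: the clown is not surprising in itself; what generates the jolt is the rock-bottom probability your noetic framework had assigned to clown-appearances.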

A great many of our entertainment dramas play off of this reality. Coming-of-age flicks like “My Girl”, feel-good dramas like “Gran Torino”, horror films like “Alien”, etc. all demonstrate or assume the audience’s or protagonist’s belief structure and proceed to surprise the audience and protagonist over the course of two-ish hours. Showing the protagonist and audience that the world (either the real one, or the fictional one which is the center of attention) doesn’t work the way they thought it did is pretty much the singular impetus of the plot.

But, why should someone care about surprise? Well, as it turns out, it took me about two years to come up with an answer to that question. I was surprised when presenting this idea to my wife… she got mad at me, which was unexpected. It turns out, two years ago she brought this idea to my attention, but I couldn’t find a place in my worldview that could be enriched by such a line of questioning… and so I forgot the conversation altogether. </anecdote>
You may laugh, but my newly-realized reason for caring about surprise is an ethical one. As any poor soul still reading this post ought to know, I am a virtue ethicist. What does surprise have to do with human flourishing, though? Well, the connections are twofold.

Firstly, surprise is an opportunity for discipline. When one is surprised, as I already explained, it’s because they are faced with a reality that is distinct from the one in their head. In science, this is called a “discovery” or “falsification” (in my under-caffeinated state, I can’t remember what exactly the rubric is for declaring something a “discovery”). In a horror movie, it’s called “being dead”. What it really is, though, is an opportunity to correct one’s beliefs and resultant behavior.

For example, if one consistently wins at a competition of skill (e.g. chess, first-person shooters, martial arts, etc.) and is surprised by a loss, it is an opportunity for them to fill whatever blind spot they had. If their physique or mind is demonstrably superior, there must be a blind spot in their knowledge of their particular sport. After a surprise loss, they can survey the playing field and the actions of their opponent with a new perspective, analyzing which implicit beliefs they held which resulted in their loss. Another example: if one is surprised by a bed full of spiders, they are given the opportunity to incorporate that knowledge and develop the habit of checking their bed before staggering in and collapsing in a drunken heap. Maybe they could even discern the cause for a bed full of spiders and develop habits which prevent such a possibility in the first place.

I used to be surprised quite frequently in my younger years, probably due to the fact that I was an immature, insufferable know-it-all. Nowadays, I am pleasantly surprised at the rare occasion of surprise in my life. This brings me to the second reason a virtue ethicist would be concerned about the nature of surprise: surprise can serve as an excellent self-diagnostic tool. The frequency and trends of a person’s surprise can express to the surprisee their general attitudes and their epistemic strengths and weaknesses. This, again, is divided in two ways: determining the cause for one’s lack of surprises and revealing epistemic blind-spots. In the case of lack of surprise, I can think of three reasons one would be infrequently surprised:

  1. They have an unusually accurate worldview, resulting in few instances where they would be surprised by inaccuracies
  2. They are a Taoist sage, with a certain expectation of epistemic inaccuracy built into their worldview, “It’s not surprising that I was wrong, as I am always wrong” or, alternatively, “I hold no beliefs… so none of my beliefs can be shown false.”
  3. Or, this person could just be a total jerk. “I knew that all along”, “Did I just think of that? I had to have… because I am the greatest”, “That can’t be an evil clown standing in front of me… because I didn’t predict that it was possible.”

While it is the case that a virtue ethicist such as myself would insist that one strive for omniscience, resulting in a total lack of surprise due to cause #1, I am aware that such an achievement is impossible for a human being qua the human condition. Therefore, the most practical solution to the question of surprise would be one of fine-tuning. Finding the appropriate blend of omniscience, Taoist apathy, jerkiness, and surprise-ability is likely to be the most direct path to flourishing with regards to surprise. Despite the credit I would like to give myself, I don’t think I’ve yet found the appropriate balance of the four… I’m likely less surprised simply because I’m now a mature insufferable know-it-all.

The second useful diagnostic tool that surprise provides us is one of trends. If someone is frequently surprised by similar things, for instance that the people around them are smarter than they expected, they are likely to have an implicit belief that everyone around them is an idiot. Alternatively, if one is consistently surprised that the guy they are dating is a jerk, maybe they have an implicit set of beliefs that gives them a poor taste in men. These can also be positive surprises. An example would be if a shy person with low self-esteem presents a rare idea to a group and the idea is surprisingly well-received; there is then likely a set of implicit beliefs that leads the shy individual to underestimate their own intelligence.

By keeping a record of one’s surprises, they are more likely to find the appropriate fine-tuning of their behaviors and worldview in order to flourish. As always, knowing oneself is most of the battle when virtue is concerned, and surprise can be a valuable asset in the discovery of oneself.