The Role of Philosophy in Daily Life

One might read the previous chapter and question whether philosophy is more than esoteric navel-gazing.  Admittedly, I didn’t do a very good job of presenting it in a manner that would appeal to “Plumber Joe”.  Why should one concern oneself with trying to figure out all the little details about how the universe operates and why?  Shouldn’t it be sufficient to figure out how these more concrete tools at my disposal can contribute to my quality of life?  I can make more money, get better employee benefits, and have more self-satisfaction if I simply tend my garden[1] and work on much more real things.  Besides: lifting weights, buying cars, and playing guitar are easier activities than questioning fundamental assumptions about reality and considerably increase my value in the sexual market by comparison.

I, myself, feed my growing family by way of more practical considerations than discussing the specific ontological status of contracts.  I’m a facilities manager by trade and a philosopher by vocation.  Given that practical considerations generally have more market value than philosophical ones, why would one choose to engage philosophy?  There are a number of answers that, cumulatively, make a compelling case for such activity.  For now, I will focus on the more practical aspects and save the more psychological and ephemeral ones for later in this book.

One of the key aspects of the philosophical exercise is epistemology.  What epistemology effectively boils down to is the study of knowledge: what it means to know something and by what mechanism one comes to know something.  At first, it may seem like a dumb line of inquiry.  One knows something if they believe something and it happens to be true; they know these things because experience leads them to believe such things with accuracy.

As anyone who has had experience with mind-altering substances, mental illness, or living with a pathological liar will attest, sometimes knowing things isn’t as easy as people initially think.  This has been the case throughout history, as well.  If I see an omen, or an angel comes down and tells me something will happen at an appointed time, could that belief rightly be called knowledge?  What if an authority figure tells me something?  Hell, even my senses are suspect: how many times has someone looked at an object and misjudged its size or distance, witnessed a mirage, heard or felt something that didn’t correspond to anyone else’s experience, or fallen for any number of other illusions?

Descartes[2] wondered whether he was the only mind in existence and whether a spirit of some sort might be causing him to have a vision of all the other phenomena he experienced.  This line of reasoning is called solipsism[3].  Solipsistic reasoning has since been extended to “Matrix”-like brain-in-a-vat thought experiments and universe-simulation theories.  One doesn’t need to get as involved as Descartes, though; a quick trip on drugs or a bout of mental instability will give one sufficient experience of “seeing things that aren’t really there” to begin doubting one’s senses.

Epistemic problems don’t even need to be that far-reaching, either.  For example, inexplicably, there is a growing number of people who believe the Earth is flat, that crystals have magic healing powers, that children should be encouraged to undergo irreversible, unhealthy, and life-altering plastic surgery, and so many more absurdities.  Just yesterday, I was led to believe that I had to be somewhere at a certain time… and both the time and location were incorrect.

Understanding the nature of knowledge in a deeper and more reflective manner has, however, been quite useful in preventing situations such as the one that occurred yesterday.  For example, exploring common occurrences of human fallibility in theory helps to identify instances in reality and navigate people through them.  When attempting to coordinate multiple contractors, administrators, and customers, heightened awareness of epistemic difficulties and solutions has been invaluable.

Something related to epistemology and equal in utility is the study of ontology.  Ontology is the study of existence: what things exist and in what manner they do so.  Again, this may seem as obviously superfluous as epistemology at first, and one could just as easily be surprised.  The earlier epistemic examples of “experiencing things that aren’t really there” apply to ontology as well, of course.  But what if I told you that a great many things we take for granted as existants[4] are of dubious ontological status?

There are the obvious things like God, space aliens, astrological energies, political authority, true love… and some less obvious things like consciousness, free will, fundamental particles, or that fortune that Nigerian prince still owes you[5].  One can’t be certain of the existence (or non-existence) of these things if one doesn’t have a firm grasp on one’s methods of knowing things but, even then, it can be difficult to prove or disprove the existence of such things.

This is where the bottom-up approach of philosophy I mentioned in the previous chapter becomes pertinent.  If one can secure knowledge of or, at least, confidence in the existence of some things, it becomes easier to bring other things into that sphere of knowledge by way of understanding the relationships between the two.  Since Descartes’s famous cogito[6], philosophers have largely attempted to prove their own existence or the existence of the phenomena they experience and used that as a starting place from which to prove the existence of the other furniture of the world that we all take for granted.

I’m sure that this doesn’t seem practical just yet.  “I know I’m hungry because I feel hungry and I know that this bacon cheeseburger I’m about to eat is real because I can see, smell, touch, and taste it.”  Fair enough.  But what if there is a God and he hates people who eat cheeseburgers?  Alternatively, what if that meat isn’t real meat but is some science experiment grown in a vat and happens to be riddled with prions[7]?  Knowing either of those circumstances may give one sufficient reason to modify one’s behavior.

The same goes for whether or not the cow and pig that were, ostensibly, butchered to produce one’s meal possess consciousness and are capable of experiencing meaningful mental events.  If one were convinced that were the case, one would likely become a vegetarian, posthaste.  Otherwise, why wouldn’t one eat baby-burgers with dolphin sauce?

That took a dark turn, but the question still stands.  There is a great deal of human suffering that one can witness, assuming one believes that other humans exist and are capable of mental faculties comparable to one’s own.  A good portion of this suffering is, directly or indirectly, a result of epistemic or ontological mistakes made either by those who are suffering or by others who have those unfortunate individuals within their sphere of influence.

This is why ethics is the oldest and most-engaged field of study throughout the history of philosophy.  The pre-Socratics[8] were primarily concerned with “how does one live the good life?” and secondarily concerned with “how does the world work?”  Socrates, Plato, and Aristotle had similar priorities.  Medieval thinkers in Europe and the Middle East alike were primarily concerned with “how does one be holy?” and secondarily concerned with “how does God work?”  Enlightenment-era and modern thinkers have been primarily concerned with “what is justice?” and secondarily concerned with political institutions such as monarchy and various forms of government (such as democracy, republicanism, communism, etc.).  Only recently has postmodernism shifted the focus from “how does one live the good life?” to “how can we best undermine all of the institutions which were built by Europeans of bygone eras?”, with living the good life becoming a secondary philosophical pursuit.

Of course, one can’t know how one ought to act without first knowing at least a little bit about the world one is trying to navigate, hence my initial focus on epistemology and ontology.  For example, one cannot determine that one ought to act to minimize the suffering of others if one does not first establish that there are others who can suffer and that suffering is undesirable.  The same dilemma applies when determining that one ought to live by the prescriptions of a book written thousands of years ago or refraining from eating a delicious and juicy steak.

A quick survey of ethical theories will present so many varieties of premises and conclusions that one is liable to despair at the outset of such an investigation.  Do not worry; I hope that, by the end of this book, you will have a firm enough grasp of philosophical methodology and (possibly) the reality of the matter which philosophy engages that you will be well on your way to making sense of ethics.

For now, I think it should suffice to say that ethics is the most practically applicable area of philosophy because its primary focus is influencing how one acts.  Ethics takes into account the various circumstances an actor finds himself in and applies a rubric by which he can or should act.  As the ancient Greeks phrased it, the problem is “how does one live the good life?”  Such an inquiry is obviously directed at happiness and, hey, who doesn’t like being genuinely happy?

Admittedly, this rubric must take into account objective facts about the world, such as what things exist and in what manner, as well as subjective matters, such as the goals of the individual actor, and that process is where things get hairy.  The methodology one uses to sort through the furniture of the world and the subjective goals of the individual actor is the source of the plethora of divergent ethical theories[9].

Ultimately, this introduction to the basics of philosophy is directed at establishing in your mind the plausibility of philosophy having practical utility in daily life.  I do not know you, the reader, personally but I am confident that it is a rare exception to find an individual completely lacking in ethical awareness.  How often does one encounter phrases like “that’s just wrong,” “people should just,” “such-and-such are as bad as Hitler,” “you really should go vegan/to church/vote/to college” or other variations of statements directed at modifying or justifying one’s behavior?  Whether those claims relate to a consistent and expansive network of ethical calculations and value judgements or not, those are ethical frameworks in action.

Even if one isn’t aware of the genealogy of those ethical compunctions, I can guarantee that they are derived from some philosophical work or another.  It is important to be aware of that genealogy, though; without the ability to critically examine the consistency of ethical claims one can fall victim to con artists and well-meaning do-gooders alike.  How many political campaigns have stemmed from undeserved patriotism or lies generating outrage?  How many people donate money to charities that simply show a sad image and ask for money, only to line the pockets of fraudsters?  Philosophy can help prevent such things.

[1] This is a barely-veiled allusion to “Candide” by Voltaire.  It’s an exceptional work of scathing philosophical satire.  It’s not as much fun if one hasn’t familiarized oneself with Leibniz’s optimism.

[2] René Descartes: French philosopher from the turn of the 17th century; inaugurated a tradition of inquiry in modern philosophy called “Cartesian”, which centers on mind-body dualism and problems of knowledge.

[3] Solipsism: The belief that one’s self is the only thing that can be known to exist as such.

[4] Existants (n): Things that exist.

[5] If you don’t get the reference, just look up “Nigerian Prince scam” on the internet.

[6] “Cogito ergo sum.” translated as “I think, therefore I am.”

[7] A prion is a unique vector of disease: a misfolded protein that propagates through a host organism by inducing normal proteins to misfold in turn, spreading somewhat like a virus does.

[8] Pre-Socratics (n): The philosophers who lived in the Mediterranean region before the time of Socrates (the end of the 5th century BC).

[9] This dilemma is made strikingly clear by the observation of David Hume in “A Treatise of Human Nature” wherein he indicates that moral obligation is a concept of a different category than facts about the world.  This is commonly called the is-ought divide.  I will address this particular issue in the chapter on human action.

The Nature of Philosophy

As is the case with most cultural pursuits which hearken back into the dark recesses of history, philosophy has no universally agreed-upon definition.  Even in academic circles, the definitions of the enterprise called “philosophy” are likely to be as numerous as the number of philosophy department chairs one asks.  This is a phenomenon[1] that vexes many analytic-minded[2] philosophers, given their obsession with necessary and sufficient conditions[3].

While I write and think very much like an analytic, I do not feel that it is absolutely crucial to assign philosophy a definition which outlines necessary and sufficient conditions.  At the same time, however, I am not inclined to do as postmodern[4] and continental[5] thinkers tend to do and simply hand-wave the issue, saying “it’s a family of activities that generally resemble each other”.  The only remaining option, then, is to make an attempt at crafting a heuristic[6] for identifying philosophical activities as opposed to any other activities within the scope of human intellectual experience.

Looking at the historical context of philosophy, one may get a feel for the “family resemblance” of philosophical activities.  This helps one create a genealogy of philosophy.  This genealogy begins with the ancient thinkers, who were predominantly concerned with “living the good life” as well as understanding how the world worked.  One of the tools that was of utmost importance to the ancient thinkers and has maintained its utility (at least, up until the point where the postmodernists took over) is logic.  In the middle ages of Europe and comparable periods of time in locales such as India and Japan, there was a burgeoning attempt to ascertain the fundamental qualities of existence; admittedly, this was universally in a religious or theistic context of some form or another, but that does not negate the contributions made.

In the more modern eras, from the enlightenment[7] to today, the philosophical enterprise has been predominantly directed at understanding the manner in which man interacts with reality, from the nature of sense experience to the nature of knowledge and its acquisition.  Additionally, there has been a great deal of emphasis on the manner in which the individual interacts with mankind at large and how that interaction ought to be conducted.

Depending on one’s definitions and motivations for constructing a narrative, philosophy can be seen as the progenitor of, handmaid to, or companion of nearly every other activity in human intellectual life.  Modern scientific methods are the product of ancient natural studies and enlightenment-era epistemology[8].  Computer science is predicated on mathematical principles and linguistic theories which have been formed through philosophical discourse.  Theology is, by and large, the application of philosophical tools to puzzles related to spiritual revelations and religious doctrines.  Economics[9] is the result of a-priori[10] reasoning in conjunction with philosophical tools of introspection and observation.  These relationships cannot be ignored, but the exact nature of these relationships is at the heart of many lively debates.

I could go (and have gone) on a much more rigorous exploration of the necessary and sufficient conditions for something to be considered philosophy, but that sort of exercise is better suited for a longer, more exhaustive, procedural work.  For now, I think it would be most prudent to do a quick breakdown of the etymology[11] of the word “philosophy”.  The word, itself, hails from ancient Greek and effectively means “love of wisdom”.

Of course, nothing in Greek translates so directly into English.  For example, ancient Greek has at least four words for love (arguably, a few more).  This particular root, “-philia”, would be most appropriately used in the context of a dispassionate desire for (non-sexual) intimacy, such as that of close friends.  Additionally, “sophos” is a Greek word that denotes a wide array of practical and virtuous skills and habits regarding wisdom, rather than just the sterile modern English concept of knowing a lot or having advanced experience.

The best I can do to describe the Greek root of the term is to say that it is “an actionable desire to develop intellectual virtue and put it into practice in the world at large”.  This takes many different forms, as demonstrated by Socrates and Diogenes relentlessly badgering their neighbors concerning how wrong their ideas of how the world worked really were, while Aristotle, Pythagoras, Epicurus, and Zeno started schools and lectured ad nauseam.  Later in history, the general attitude of a philosopher had largely homogenized into academic bookishness and the writing of essays and long-form treatises.  The exact nature of each essay and treatise may be radically divergent with regard to content, method, and end, though.

Ultimately, taking into account all these diverse enterprises and the influence of postmodern thought, I believe that any human enterprise directed at creating an internally consistent, logically sound, empirically viable, and universal worldview which possesses ethical actionability, utility, and (ultimately) Truth can be rightly considered to be “philosophy”.[12]

In order to attempt to construct a worldview that correlates to reality, there are a great many prerequisites that must first be met.  For example, there is the assumption that there is a reality to which a worldview can correlate.  Another example would be establishing the fundamentals of logic in such a way as to be certain of their utility[13].  Yet another assumption would be that one is capable of constructing a worldview at all.

Rather than dragging my readers through the most meticulous and technical aspects of post-enlightenment thought, I’d like to discuss the general methodology of philosophy and, if my readers are so inclined so as to investigate these problems in their fullness, I can recommend some starting places.[14]  These problems of philosophy are quite significant, and I believe that these issues ought to be examined, but they are not issues for beginners or the faint of heart.

Instead, I recommend first familiarizing oneself with the fundamentals of philosophical methodology and beginning to explore this new way of perceiving reality.  Even though it has taken many different forms throughout history and our contemporary academic landscape, the fundamental methodology of philosophy has found no better expression than that of the trivium and quadrivium of the middle ages in Europe.  Although these fields of study were crafted in a theistic environment and are, therefore, often ignored or denigrated by modern (leftist) scholars, the methodology they present is still quite valid, even if it may have been used to reach illicit conclusions.

The trivium consists of three stages of thought: the logic, the grammar, and the rhetoric.  Initially, these stages of thought were applied exclusively to language (hence their names).  The logic was the basis of linguistic thought; it contained the a priori principles such as the law of identity[15], the principle of non-contradiction[16], and the resultant laws of induction.  The grammar demonstrated the rules of language which reflected the logical principles outlined earlier; subject-object relations and other syntax relationships are important to maintaining fidelity to the logical principles underlying that communication[17].  The rhetoric refined the above skill sets so as to aid a thinker[18] in convincing others of the facts which he had uncovered through the application of logic and grammar.

Since its inception as a linguistic methodology, the trivium quickly expanded into a philosophical methodology.  This is partly due to the close relationship that language and philosophy have always held and partly due to the axiomatic nature of the trivium lending itself to the inquiries of philosophy.  In essence, a thinker must first establish the furniture of the world (the fundamental principles and the objects of those principles), then explore the relationships between those objects, and then find a means by which to express those relationships.  For example, the “Socrates is a man” syllogism I referenced in the footnote on this page contains material that isn’t merely linguistic.  The categories “Socrates”, “man”, and “being” are assumed to correlate to realities in the observable world.  Additionally, the grammar of the statement establishes a relationship among those categories which is assumed to correlate to the observable world.  This trend is maintained through the rest of the syllogism:

Socrates is a man,

All men are mortal,

∴ Socrates is a mortal.

At each level of the syllogism, new categories and relationships are assumed or established.  On a linguistic level, logic serves as the structural framework for the grammar to populate with the symbols for Socrates, man, etc. and the rhetoric is the manner in which one would express this syllogism to others and defend the validity of the syllogism.  On a philosophical level, the logic serves as the source for the objects Socrates, man, etc. the grammar denotes the relationships between those symbols, and the rhetoric serves as the means by which these ideas move from my mind to the page for your mind to reassemble[19].
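The distinction between the logical objects and the grammatical relationships among them can be illustrated in a formal proof language.  Here is a minimal sketch in Lean 4; the names `Person`, `Man`, `Mortal`, and `socrates` are hypothetical, standing in for the categories discussed above, not part of any library.

```lean
-- A sketch of the "Socrates is a man" syllogism.  The "logic" supplies
-- the objects and inference rules; the "grammar" is carried by the
-- typing discipline relating subjects to predicates.
theorem socrates_is_mortal
    (Person : Type)               -- the domain of discourse
    (Man Mortal : Person → Prop)  -- the categories "man" and "mortal"
    (socrates : Person)           -- the object "Socrates"
    (h1 : Man socrates)           -- premise: Socrates is a man
    (h2 : ∀ p, Man p → Mortal p)  -- premise: all men are mortal
    : Mortal socrates :=
  h2 socrates h1                  -- ∴ Socrates is mortal
```

The proof term `h2 socrates h1` is just universal instantiation followed by modus ponens, which is all the rhetoric of the syllogism ultimately has to defend.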

This quick introduction into the methodology of philosophy will be expounded upon in the next chapter, as we explore the role of philosophy in daily life or, as the ancient Greeks put it, “how does one live the good life?”

[1] Phenomenon (n): The object of a person’s perception or discussion; an event of which the senses or the mind are aware.

[2] Analytic Philosophy (n): A school or tradition of philosophical thought predominantly populated by English-speaking philosophers which emphasizes procedural methodology and strict definitions and application of logic.

[3] Necessary and Sufficient Conditions (n):  The requirements of any given subject to meet a definition; necessary qualities are qualities which, if absent, preclude subjects from being defined as such and sufficient qualities are qualities that, if present, allow a subject to be defined as such.

[4] Postmodern (adj): Relating to a school of thought which maintains certain attitudes such as indefinability, plurality of reality, and subjective narrative ontologically trumping objective reality.

[5] Continental (adj): Relating to a school or tradition of philosophical thought predominantly populated by thinkers from mainland Europe which emphasizes meta-philosophical influences on philosophy such as culture and economics.

[6] Heuristic (n): A practical rule of thumb; a method for identifying or classifying things that is useful in practice without guaranteeing precision.

[7] Enlightenment Era (n): A period in European philosophical history, commonly accepted to be from as early as the 16th century to the end of the 18th century; the era is marked by a sudden surge in scientific advance, political upheaval, and sheer number of philosophical schools of thought.

[8] Epistemology (n): The study of knowledge, the manner and mechanisms by which one knows.

[9] Austrian Economics.  This will be discussed in Chapter 4: Political Philosophy and its Discontents.

[10] A priori (adj): A logical justification for a claim based on syllogisms, moving from given premises to their necessary conclusions.  This is often set in opposition to a posteriori or “empirical” reasoning.

[11] Etymology (n): The study of the origins of words and the changes in their forms and meanings throughout history.

[12] There is a good amount of jargon in this proposed definition; as these terms appear later in this book, they will be defined in more detail.

[13] Utility (n): The capacity for a thing to provide or contribute to accomplishing one’s end, usually in the context of alleviating discomfort.

[14] “The Problems of Philosophy” by Bertrand Russell, “Cartesian Meditations” by Edmund Husserl, and (for the preeminent masochist) “Critique of Pure Reason” by Immanuel Kant.

[15] Law of Identity (logic): A=A (A equals A), A≠¬A (A does not equal not-A)

[16] Principle of Non-Contradiction (logic): The logical principle that something cannot both be and not be in the same mode at the same time. (Abbreviated as PNC)

[17] For example, in the over-used case of the “Socrates is a man” syllogism, if you were to mistake the subject-object relationship, you could end up with things like “Man is a Socrates”, which is not only incorrect but nonsensical.

[18] i.e. The philosopher

[19] There are deeper epistemic realities hidden in this discussion of the trivium method, but those will be addressed in the coming chapters of this book.

Language Barrier

Pod-and-blog-fade seems to be running rampant in the post-election libertarian and philosophy circles. I can’t help but wonder if it’s a combination of political hangover and something like a sigh of relief as certain existential threats have been postponed. Everywhere else, lefty entertainment and philosophy podcasts and blogs have begun their four-to-eight-year pity-party, wherein they cry about the president to the exclusion of any other form of content. Honestly, part of why I voted for Trump was to make these people cry… but I’ve got a bit of buyer’s remorse now.

Anyway, I’m back on the content-producing bandwagon. Today, I’m talking about words.


I expect most of my readers are well aware of the rules of grammar and have a decently expansive vocabulary. I’m not going to make a “top ten” list of fun punctuation marks… I mean, who hasn’t heard of an interrobang? I’m not going to share my fun story about arguing over ancient Greek grammar with Jehovah’s Witnesses (subject-object relationships are more important when you haven’t discovered punctuation yet). Instead, I’m discussing the philosophy of language in broad strokes.

As far as I can tell, most people haven’t critically examined the relationship between language and the world around them (unless they’ve smoked a lot of weed or suffered severe concussions). As such, most people have intuitively assumed one of two paradigms concerning the operation of language. If this describes you, understand that I’m not talking down to you; this is something esoteric enough in the realm of philosophy as to be compared to particle physics or studying neolithic attitudes towards one’s in-laws. It is, however, an important issue to address when engaging in philosophical discussions.

Now that the disclaimers are out of the way, what are these two paradigms of language people assume? The first is that of what could be called “linguistic realism”: it’s a belief that words and sentences directly correlate to reality (in some cases, one could even say that words and reality are commensurate). In the case of thinkers like Plato and Aristotle, the word “justice” is an actual expression of some form or concept. When a poor soul makes the mistake of using the word “justice” near Socrates, Socrates assumes that the man must know the platonic form of justice so thoroughly so as to be able to utter the word, itself. Aristotle is a little more grounded, but he still assumes a sort of direct correlation between the word “justice” and manifestations in meatspace of someone “giving that which is owed”. In the modern age, that attitude is usually expressed by people who really enjoy Rhonda Byrne, people who think that bad words are bad words due to some innate quality of the word itself, and people who deride the idea of words changing meaning over time as well as the creation of new words. I used to be a linguistic realist.

The second paradigm of language could be called “postmodern nominalism” or “naive nominalism”. This position holds that words have very little correlation to reality; as a matter of fact, the best way to describe the position would be “the belief that words exist as nothing more than a game between individuals wherein rules are made up concerning the meaning and use of words, with little to no relation to the world outside of said game.” In the case of thinkers like Peter Abelard and Ludwig Wittgenstein, the meaning of a word depends on something along the lines of social consensus and common usage. When I say “tree”, it only means “that thing growing out of the ground, made out of wood, and bearing leaves” if I am speaking to someone who comprehends English and understands the botanical context of the statement. In a different context, the term “tree” could refer to a shape, such as that of a propane tree, a family tree, or a decision tree. To a non-English-speaker, it may as well be any other set of phonemes: it’s pure gibberish. In the modern age, that attitude is usually expressed by people who really enjoy saying “a rose by any other name…”, people who think that bad words are bad because of some historical or class-related context, and people who live-tweet their netflix-and-chill experience with their cis-gendered binary life-partner.

One of the clearest ways to delineate between these two positions is to inquire as to the nature of dictionaries. For example, if I hear or read a word I do not recognize, I obviously go to the dictionary… well… to google’s dictionary, at least. When I read the definition of the word, I am reading one of two things: I’m either reading the common context for the use of the particular term at the time of publication, or I am reading the “actual meaning” of the word. For example, if I were given the word “obacerate”, I would obviously have to google it or look it up in a century-old edition of the OED. When I get the definition “to interrupt one’s speech”, is that what the word means in some innate sense, or is that simply a description of how the word has been used in the past? If I were to begin using the word in colloquial conversation, would it mean “to interrupt one’s speech”, or could it take on a new meaning based on the context in which I use it or the context in which others understand it? If I only ever used the word “obacerate” when referencing covering someone’s mouth or punching them in the jaw, could the word take on that connotation?

If one says “the word means what the word means, regardless of context” one is likely a linguistic realist. If one says “the word hasn’t been used for almost a hundred years, it can mean whatever society begins to use it as” one is likely a naive nominalist. A more apparent, but less cut-and-dried example would be the use of words like “tweet”, wherein it could either be onomatopoeia for bird sounds or an activity which takes place on the website, twitter. If the word were to fall out of common parlance concerning birds, would the meaning of the word have changed once Webster cuts out the atavistic use of the word?

As is typically the case, I get the feeling that most people who bother to read this far are asking themselves “Why do I care about this hair-splitting over words?” If you are, you are right to do so. In day-to-day conversation, words just mean what they mean. If there is a misunderstanding, we need merely exchange one word for a synonym or offer a definition to contextualize the use of a particular word. In philosophy (and, therefore, any sufficiently advanced field of thought), though, these sorts of distinctions become important.

For example, if I assume that words have innate meanings and are either direct representations of something or are a sort of manifestation of the thing, itself, then when I start talking about something like colors, thoughts, phenomena, property norms… you know, abstractions, it can get hairy if I’m speaking to someone from a different set of preconceptions about language. I’m a sort of compatibilist nominalist. I greatly appreciate Peter Abelard’s contributions to the philosophy of language and I’m a recovering linguistic realist. As I will eventually get to in the 95 Theses, and I have already covered in the Patreon subscribers-only content, the human experience appears to be one which takes place entirely within one’s mind.

Whoa. Hit the brakes. That likely seems either patently obvious or totally insane, depending on who’s reading it. It’s either obvious that one has a consciousness which navigates a never-ending stream of sense-data and never grasps a “thing-in-itself” beyond those sense-inputs, or it’s insane to start talking like a Cartesian or Kantian solipsist: of course one sees, touches, tastes, smells, and hears the world around them and discusses these things with others…

…Which is a similar divide as the one between the linguistic realists and the postmodern nominalists. As far as I’m concerned, though, my mind is locked away from the world and only sees it as mediated through sense organs, nerve connections, chemical emulsions, brain wrinkles, and more. The only way I can make sense of all those inputs is to pick out regularities and assign concepts to those regularities. Through this systematic approach to those sense inputs, one can create a noetic and epistemic framework by which one can interact (albeit through similar mediation as the senses) with the world outside of one’s mind.

After all that fancy noesis and epistemology is underway, it becomes useful to apply language to this framework. If I consistently see a woody creature growing from the earth and bearing leaves and fruit, and I wish to express that set of concepts to someone else (who is obviously a similar set of sense perceptions, but I assume to be someone like myself), it helps to have a name, a sound, a mark, etc. to signify that set of concepts. And the basis for the word “tree” is created. The intuitive concepts such as causality, correlation, etc. also exist in that bundle of sense inputs and later receive names. If trees, causality, or even a world beyond the phenomena don’t actually exist, the sense inputs I have mistaken for these things still do. The reason I bring up abstractions of relationships, such as causality, is because they seem to relate to certain aspects of grammar. For example, subject-object relationships and prepositions seem to presuppose these causal and abstracted relationships.

Now, of course, there are hundreds of years of philosophy of language at work, and I couldn’t hope to provide a thorough examination of even my particular flavor of it. The reason I tried to give this 2,000-word summary of the idea is twofold. First, I think that this is an issue that underlies a lot of misunderstandings and disagreements on the more superficial levels of human interaction. From the comical dilemmas over who’s allowed to say “faggot” or “nigger” to the more fundamental issues of whether or not “rights” or “norms” exist and in what manner, these conflicting theories of language are at play. The 95 Theses will go into the idea in more depth and, if the Patreon subscribers demand it, I’ll explore the idea further.

Second, I want to announce the upcoming glossary page on the website. I am often accused of mutilating language or using words in a way that only I can understand them. Less often, I’m accused of using too many technical words for people to keep up. I hope to remedy some of these issues by providing a cheat sheet of sorts to help people keep up with me and to understand what I am saying when I use words in a more precise way than they are commonly presented in dictionary definitions and colloquial use. Of course, I need feedback on which words should go in said glossary so, please, do comment on this post and send me emails about my abuses of language.

TL;DR: Philosophy of language is a very involved field of study, but nearly everyone is a philosopher of language, provided they speak a language. Even if one hasn’t critically analyzed their understanding of how language relates to the world, they are walking around with a bundle of assumptions as to what they mean when they speak certain words, and whether those words have some innate quality to them or whether they are just some sort of social game being played with other speakers of that same dialect. Most of those assumptions can be categorized as either “linguistic realism” (words are directly related to things and act as an avatar of the things to which they relate) or “postmodern nominalism” (words don’t mean anything in and of themselves and only vaguely gesture at socially agreed-upon concepts). There are other, more nuanced positions that people can hold, but usually only as a result of actively engaging in the philosophy of language, an exercise I strongly recommend for those who are able.

Liberty Classroom: an Invaluable Tool

If you are reading this near the end of November in 2016, you can get some major discounts and provide a great deal of support to the Mad Philosopher project by going to Tom Woods Liberty Classroom and subscribing.  If you are reading this at any other time, you can still provide a great amount of value to the project by doing so.

Tom Woods Liberty Classroom is easily one of the most undervalued resources available on the internet, as it provides a legitimate PhD-level resource on a number of crucial subjects such as history and economics.  The term “legitimate” is important, here, as what most universities provide is only half-true and full of leftist propaganda.  This resource is the closest to comprehensive and the closest to unbiased as can be found.

Click Here to get some coupon codes and subscribe.  This affiliate program is definitely one of the best ways to support the Mad Philosopher project, second only to just sending me Bitcoin directly.

 

Here are some free samples (the best stuff is behind the paywall, obviously), but the best way to fulfill the maxim “Carpe Veritas” is to subscribe to Liberty Classroom and take advantage of everything such a subscription provides.


Chapter 3: Orders of Knowledge

We have thus far introduced ratio and intellectus. As a quick refresher: intellectus (or intellect) is the inborn faculty by which one experiences the self, and the predecessor to reason; ratio, or reason, is the development of said faculty. However, in addressing the human epistemic experience and briefly examining the manner in which our minds operate, we have completely overlooked the primary concern of modern epistemology. Knowledge, in all of its complexity, still haunts our exploration of our epistemic assumptions.

While the exact definition and importance of knowledge is hotly contested in this postmodern environment, one definition tends to maintain its resilience. Knowledge, in my mind, is limited to what is called “propositional knowledge”. The experiential basis of propositional knowledge we have already discussed ought simply to be called “experience”. I define propositional knowledge as “justified true belief”. Now, as the contentious discussion that rages on will demonstrate, this definition is neither flawless nor self-sufficient, but that should not overshadow its usefulness or accuracy.

A brief examination of the Stanford Encyclopedia of Philosophy’s page on knowledge1 illustrates the key issues with the above definition, drawing on the works of those such as Gettier. No matter how complex and detailed the discussion becomes, the utility of the above definition is undeniable. Much like Russell’s discussion of our knowledge of universals,2 we already have an intuitive understanding of what knowledge is. As a matter of fact, we use that intuitive understanding to critique our proposed definitions; the chief example of this is the Gettier problems. A brief explanation is in order: the Gettier problems are a series of hypothetical instances contrived such that the definitive requirements for knowledge are met, but the conclusion flies in the face of our intuitive understanding of knowledge. A workable solution to such a dilemma is simple: we must accommodate for such an intuitive element in our definition. For now, “a justified true belief in which the justification is factual and sufficiently related to the truth at hand” will suffice, as that is, more or less, our intuitive understanding of knowledge (ignoring the verbosity of the definition). “Justified true belief” is a good shorthand for this definition. More work clearly ought to be done to develop a rigorous and categorical definition for knowledge, but that is not the intent of this work. Besides, I am confident that whatever rigorous categorical definition is found will simply be a more detailed and explicit form of the one I have given.
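For the programmatically inclined, my amended definition can be caricatured as a simple checklist. The field names below are my own illustrative inventions, not standard philosophical machinery:

```python
# A toy model of "justified true belief" plus the Gettier-motivated
# amendment: the justification must itself be factual and sufficiently
# related to the truth at hand. Purely illustrative.
from dataclasses import dataclass

@dataclass
class Claim:
    believed: bool                 # the subject holds the belief
    true: bool                     # the proposition obtains
    justified: bool                # the subject has a justification
    justification_factual: bool    # the justification is itself factual
    justification_related: bool    # ...and sufficiently related to the truth

def is_knowledge(c: Claim) -> bool:
    """Plain JTB, amended per the definition given above."""
    return (c.believed and c.true and c.justified
            and c.justification_factual and c.justification_related)

# A Gettier-style case: a justified true belief whose justification is
# only accidentally connected to the truth fails the amended test.
gettier = Claim(believed=True, true=True, justified=True,
                justification_factual=True, justification_related=False)
print(is_knowledge(gettier))  # False
```

The amendment does all the work in the Gettier case: drop the last two fields and the function would wrongly count such a case as knowledge.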

Now why, at the beginning of chapter three, do I suddenly launch into definitions, qualifications, and disclaimers with nary a mention of the next thesis in the sequence of ninety-five? Simply put, the next several theses operate with this definition of knowledge in mind, and the mere definition of a word does not justify the use of a thesis when I am limited to a mere ninety-five. One more minor but crucial point must first be made, however: our intuitive use for knowledge is the formation of a reliable worldview, predicated on the reliability of the mind. As with my explanation of experiential knowledge, man is a habitual creature: our understanding, use of, and reliance on propositional knowledge is no exception. With this tedium out of the way, we may now proceed.

Thesis #7: One gains first-order knowledge by the exploration of logic as pertains to “self-apparent” principles and facts…

As I explicated in the first two chapters, “self-apparent” principles and facts are experiential in nature. Even the existence of a “self” is derived from the experience of reflecting on one’s experiences; this knowledge is not inherent to the mind, brain, man, whatever. Even the definitive and logical truths we find to be “self-apparent” are derived from a more primary experience. The easiest example is that of a triangle. A triangle is a closed two-dimensional polygon with three angles and sides, the angles of which total one hundred eighty degrees. We can identify triangles by these factors, but before we could discover these attributes of triangles, we must first have an experiential knowledge of spatial relationships and basic math and geometry by which to identify and express these characteristics.

In the last chapter, we established certain epistemic tools through our mental experiences. While it is quite productive and enlightening to turn these tools on themselves in a manner similar to that which Hegel discusses in his Introduction to the Philosophical Encyclopedia3, it is not required in order to begin observing and acknowledging the world at large. We can establish undeniable matters of truth and fact using syllogistic reasoning coupled with experience (most especially self-apparent facts). Our definitions of knowledge and triangles are prime examples of such a practice. This method is simple enough: one first states a definitive fact derived from experience, then, through the use of the PNC, explores the implications of such a fact; so long as nothing is self-contradictory or contrary to experience, it can be assumed to be first-order knowledge (or, knowledge proper). If the logical exploration results in a contradiction, one must first check their logic before throwing out the initial premise. This work is, itself, an example of such a practice; our first chapter begins with three assumptions made due to their self-apparent nature, and here we are, two chapters later, still exploring the logical ramifications of such assumptions.

My current experience, aside from self-apparent principles, is my only source of immediate knowledge. If our friend Mike, from the first chapter, is experiencing a particular event, say the fateful day he shot himself in the leg, he has a whole array of experiential facts at his disposal as well as deductive reasoning to assist him in knowing certain facts. He has the experience of a raw coldness in his thigh as well as a ringing in his ears, both of which are undeniable. Mike calls such an experience “pain” or “injury”. Also, he experiences recalling memories of having dropped the handgun and attempting to recover it on its descent.4 Deductive reasoning may not be able to establish with certainty who or what is at fault for his current circumstance, but it is sufficient in analyzing the circumstance itself, which, to be frank, is far more important when faced with a circumstance such as:

  • I am experiencing phenomena congruent with severe injury

  • If one wishes not to die, when faced with serious injury, one ought to pursue medical assistance

  • I do not want to die

  • I should seek out medical assistance

rather than to pursue the line of inquiry consistent with “why?”
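Purely as an illustration (a toy of my own devising, not anything Mike is consciously doing), the deductive move above can be sketched as naive forward chaining over a set of facts:

```python
# A minimal sketch of the syllogism above as rule-based forward chaining.
# The fact and rule names are my own illustrative shorthand.

facts = {"phenomena_of_severe_injury", "wishes_not_to_die"}

# Each rule pairs a set of antecedent facts with a consequent fact;
# this one encodes the second premise of the syllogism.
rules = [
    ({"phenomena_of_severe_injury", "wishes_not_to_die"},
     "seek_medical_assistance"),
]

def forward_chain(facts, rules):
    """Repeatedly apply any rule whose antecedents are all satisfied."""
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for antecedents, consequent in rules:
            if antecedents <= derived and consequent not in derived:
                derived.add(consequent)
                changed = True
    return derived

print("seek_medical_assistance" in forward_chain(facts, rules))  # True
```

The point of the sketch is the mechanical character of deduction: given the premises, the conclusion falls out without any appeal to “why?”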

Syllogistic, or deductive, reasoning is ultimately a practice in exploring the ramifications of the PNC as it applies to a particular claim. In the above example, it pertains to one’s particular experiences of pains and desires. As an astute logician will note, the above syllogism cleverly cheated; it introduced a non-immediate experience or a non-deductive inference. The premise, “if one wishes not to die, when faced with serious injury, one ought to pursue medical assistance,” is not necessarily an experiential fact or a deductively ascertained claim. However, herein lie two details which require attention: intuition and second-order knowledge. The latter will be discussed soon; all we need note now is that one can make legitimate first-order claims which are informed by second-order knowledge, so long as one is cognizant that they are doing so and verifies its congruence with the paradigm5 established by one’s first-order knowledge. The case of intuition, though, is slightly more complex. As discussed earlier6, there is a distinctly observable reality that the human mind inherently possesses certain faculties, the ones addressed so far being intelligence and instinct. The exact cause of these inherent faculties is beside our current line of investigation. We will simply play the pragmatist for now; we will treat intuition as a brute fact and discuss its causes and specifics later. In the case of Mike, he would likely have an intuitive response to his gunshot wound, attempting to staunch the blood flow and such; a shorthand for this series of responses would be “to pursue medical assistance”.

…it is highly falsifiable, and applies to physical and metaphysical fact as well as matters of truth

The above is a particular instance of what is essentially the only true type of knowledge: the only circumstance of a “justified true belief”. Nothing beyond the definitive and falsifiable justification of immediate experience and deductive reasoning can provide a greater degree of certainty. This certainty is not, however, absolute. It qualifies to be called certain due to its immediacy and falsifiability. Falsifiability is the circumstance and burden of proof one would have in disproving a particular claim.7

Karl Popper, having posited falsifiability as crucial to epistemological study and having built an entire body of work on such a principle, is a valuable asset to one such as myself. Anchoring an entire philosophical worldview on a few epistemic assumptions, I must be diligent in exploring these assumptions and securing them as best I can. Unfortunately for me, Popper is simultaneously more pessimistic and optimistic than myself; making use of his work will require diligence. We both agree that knowledge is always suspect. It is always subject to criticism and correction. In his ardent desire to avoid supporting authoritarianism8, he seems to fall into a trap of epistemological absurdity in which “all knowledge is human… it is mixed with our errors, our prejudices, our dreams, and our hopes… all we can do is grope for truth even though it be beyond our reach.”9 As the previous chapters10 show, I agree that our knowledge is limited and influenced by the human condition, but to assert (unfalsifiably, I might add) that truth is unobtainable due to that reality undermines the very premise of such a claim. Besides, to strive for the admittedly impossible is to waste one’s time. One’s energy would be better spent, at a minimum, on more practical asymptotic activities instead (like curing disease or pursuing pleasure or enlightenment).

With how jealously I withhold the title of “knowledge”, the degree of confidence one can have in their beliefs hinges on falsifiability. In order to claim something as knowledge11, one must be making a claim which is immediately apparent and clearly falsifiable. Falsification of this (and every other) form of knowledge is, in truth, a good thing. Falsification provides an opportunity for better refinement and correction of an otherwise flawed worldview.12 One should always open themselves to rational and rigorous criticisms, so as to avoid becoming a relic-bearer of Lady-Philosophy’s garment.13

This isn’t to say that the first time something unpredictable or inconsistent emerges one ought to throw out their entire worldview and sequester themselves in a mire of Cartesian doubt. Quite the opposite is the case: one ought to defend such a claim until such a time as it is sufficiently disproven or falsified. We will explore this more later. For now, it will suffice to point out that single incidents of inaccuracy in one’s beliefs may in fact be flukes; only cumulative or consistent error is sufficient cause for radical reevaluation.

Now, many may mistake this epistemic framework for some Kantian a priori reasoning or some assertion of continental brute facts. Neither of these is the case at hand. These self-apparent facts are, in fact, theory-laden. Even the most fundamental facts one can select, such as the Cartesian cogito,14 still contain some degree of implicit theory. In the case of the cogito, there is at least the predicate assumption that there is a causal relationship between actions and existents (that the experience of thought must be attributed to a thinker) and that the PNC obtains. The issue is not one of selecting a brute fact or discovering an a priori truth, but rather of finding a sufficient fact on which to vest one’s philosophy, because all self-apparent facts are, without exception, theory-laden.15

Of all the things we have allowed into our ontology thus far, this theory-ladenness itself must either be a form of brute fact, an inherent fact that there is no fundamental starting-place to understanding the world,16 or must be an inextricable attribute of man’s mind. I am in favor of both of the proposed options, actually. I believe that the universe is an elegant and logically constituted entity which has no one logical predicate on which all else hinges, but rather is an intricate and interdependent network of logically constituted laws, the absence of any one of which would cause a total collapse. Because of that holistic nature of reality, our minds are equally constituted as such in order to accurately form a conception of the universe. This inherent holism, then, is an aspect of one’s intellect.

As mentioned, this knowledge pertains to physical and metaphysical fact, as well as truth claims. So far, in this work, the most prominent first-order claim pertaining to physical fact I have made is that one has embodied experiences. Falsifying such a claim may be somewhat difficult to do experientially with our current technological limitations. However, it could be quite easy to locate a logical inconsistency with the claim. For example, one could at least cast doubt on such a claim by finding an inconsistency between the epistemic claim that one is capable of abstract thought and the insistence on the primacy of material senses. I clearly have not found one, else I would have asserted otherwise, but the purpose of publishing a work such as this is to allow others to double-check my claims.

In similar fashion, we have made first-order metaphysical claims. Chief among them would be that one’s understanding dictates one’s behavior. Rather, a more specific case of that assertion would be that man operates with an intermediary function between stimulus and response. The easiest manner in which one could falsify such a claim, as far as I can tell, would be to demonstrate that it is superfluous to forming a sufficient paradigm for all second- and third-order reasoning. I have not yet addressed the framework in which one would do so, but we will get to it shortly.

This naturally brings us to truth claims. Technically, either everything or nothing we have discussed thus far qualifies as a truth claim, given the common usage of the term “truth claim”. As far as I am concerned, a “truth claim” is distinguished from a factual claim (such as the two we discussed above) with regards to its subject matter. A factual claim has to do with a state of affairs in specific or categorical situations whereas a truth claim regards a matter of transcendental realities. This will be addressed in more detail in the next chapter, but for now, we can refer to the PNC as one such claim. While I believe it to be impossible, one can falsify the PNC simply by illustrating a logically cogent circumstance in which something both is and is not in the same mode at the same time.

Thesis #8: Through the marrying of multiple first-order concepts and further introduction of experience, one gains second-order knowledge…

As the thesis indicates, second-order knowledge17 is predicated on first-order knowledge. The sum total of one’s first-order knowledge creates a paradigm on which one’s second-order knowledge can be built. Having already shown themselves to be self-apparent, rationally cogent, and non-contradictory, first-order claims can be relied upon to fact check one’s second-order claims. In such a circumstance that one encounters or forms a second-order claim, they must critically assess its validity against the paradigm in which they are operating.

Through the application of deductive reasoning, one takes self-apparent logical principles and analyzes their relationships. By analyzing the relationships between their conclusions, they remove themselves from the self-apparent by a minor degree. This line of reasoning has few applications outside of mathematics without the added element of experience. Practically speaking, the marrying of multiple first order concepts and adding experiential data is fairly straightforward.

Mike, now medically stabilized, can effortlessly begin to assess what happened from the perspective of strong belief. He has already ascertained that he is injured and that he dropped a loaded gun. By drawing from experience, he knows it is incredibly likely that, in fumbling to catch the gun, he pulled the trigger. He also has a strong belief that the other two people who had possession of a handgun at the time were executing proper gun safety and were not in such a position so as to fire a gun at an angle corresponding to his wound. All of this evidence, along with the deductive arsenal provided by his first-order paradigm, can (rightly) lead him to the conclusion that he did, in fact, shoot himself in the leg.

The belief he has that his companions were executing proper gun safety is primarily due to experience and corroboration. He has witnessed them demonstrate their skill, knowledge, and conscientiousness many times before while shooting. Additionally, they are responsible for his knowledge of the rules and basics of gun safety and use. Adding to his certainty that he did in fact shoot himself would be one of his companions serving as a witness to the event: “Dude, you just shot yourself!” In their own way, corroboration and communication are a form of experience which is usable in the development of second-order knowledge. Anyone can present a claim to another; without a well-developed discourse between the two, in addition to the critical thinking skills required to assess that discourse, such an interaction is meaningless. If some stranger (or even a friend) simply walks up to you and makes a claim, anything from “the sky is blue” to “Elvis lives”, and leaves promptly thereafter, there was no opportunity to expand one’s knowledge base. However, as will be explored later in this chapter and especially in the next chapter, someone can make an argument for a second-order belief, and that allows for the opportunity to expand one’s knowledge base or at least reassess one’s existing knowledge base.

To one familiar with logic, this thesis essentially concerns itself with induction. While Russell explores induction quite thoroughly in chapter six of his “Problems of Philosophy”, he fails to provide a concise definition for quick reference. I will suggest a definition and then recommend that the more ambitious of my readers read Russell for more detail. I would define induction as “the rational function by which one forms a strong belief by repeated experience and logical inference.”
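My proposed definition can be caricatured in a few lines of code. The function below is my own illustrative invention, and it models only the “repeated experience” half of induction, not the logical inference:

```python
# Induction as a frequency heuristic: repeated experience of an outcome
# strengthens one's belief that it will recur. Illustrative only.

def strength_of_belief(observations, outcome):
    """Fraction of past observations in which the outcome held."""
    if not observations:
        return 0.0
    return sum(1 for o in observations if o == outcome) / len(observations)

# A thousand consecutive sunrises yield maximal (but still inductive,
# and therefore fallible) confidence that the sun will rise tomorrow.
history = ["sun_rose"] * 1000
print(strength_of_belief(history, "sun_rose"))  # 1.0
```

Note that even a strength of 1.0 here is a strong belief, not certainty, which is precisely the gap between second-order and first-order knowledge.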

Clearly, the study of physics18 lands solidly in this category. The empirical and observational study of the world which makes use of logic, mathematics, and repeated experimentation has been developed with the intent and end19 of forming a cohesive and reliable framework of second-order knowledge. Physics has proven invaluable in expanding our knowledge and providing for vast improvements in our quality of life and shows no signs of slowing in pursuit of that end. However, some have fallen victim to the ideology of scientism, believing that this material study of the world must be predicated on a purely material ontology and is the alpha and omega of knowledge. As I have already illustrated, science is predicated on a first-order paradigm and is part of a larger framework of philosophy. I am reminded again of Russell:

“The man who has fed the chicken every day throughout its life at last wrings its neck instead, showing that more refined views as to the uniformity of nature would have been useful to the chicken”20

As an aside that my broader ideology and disposition will not allow me to leave unaddressed: who is crazier, the chicken who distrusts the farmer and awaits and prepares for such a time that the common belief in the farmer’s benevolence is falsified, or the chicken who is content with the utility of daily meals?

… this order of knowledge is less falsifiable than the first.

Like first-order claims, second-order claims cannot contradict each other. In the popular case of science, it is easy to claim that this is not the case. For example, Newtonian gravity is still used universally for most every day-to-day practical application of physics, such as architecture or demolition, while Einstein’s theories of relativity have effectively falsified Newton’s theories. That claim, though, is naive; certain aspects of Newtonian mechanics have been shown inaccurate and ineffective, but that does not mean that there were not accurate observations, predictions, and knowledge claims contained therein.21 In less esoteric knowledge bases, this reality is more evident. One cannot simultaneously claim that the sun will rise tomorrow and claim that it will not. Mike cannot claim that he shot himself in the leg and that he did not, nor can the chicken claim both that the farmer will wring its neck and that he will refrain from doing so.

In reality, if any two second-order claims are found to be contradictory, they are likely inconsistent with the first-order paradigm one established prior to making such second-order claims. This is because no second-order claim can be made without first assuming the accuracy of one’s first-order paradigm and verifying that second-order claim against it. In such a circumstance that there is a true contradiction between two second-order claims (as opposed to a merely apparent contradiction) which are both supported or necessitated by one’s first-order paradigm, one must reassess their first-order paradigm in order to ensure that some mistake was not made which would result in such a contradiction.

If there is no flaw in the first-order paradigm, one must move on to pitting the contradictory strong beliefs against each other and attempt to falsify them. In most cases, second-order claims are experientially falsifiable. Induction, as its primary use, makes predictions about the world and about certain logical results. In these cases, one needs only to seek out instances in which the predictions made are consistently or severely inaccurate.
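That standard of consistent or severe inaccuracy, rather than a single fluke, can be sketched as a toy rule. The window and tolerance values below are arbitrary choices of my own, purely for illustration:

```python
# A toy falsification trigger: flag a strong belief for reevaluation only
# when its recent predictions have failed at an intolerable rate.

def should_reevaluate(prediction_errors, window=10, tolerance=0.3):
    """prediction_errors: 1 for a failed prediction, 0 for a success.
    Returns True only when a full recent window of predictions exceeds
    the tolerated error rate, so isolated flukes never trigger
    reevaluation on their own."""
    recent = prediction_errors[-window:]
    return len(recent) == window and sum(recent) / window > tolerance

print(should_reevaluate([0] * 9 + [1]))  # False: a single fluke
print(should_reevaluate([1] * 10))       # True: consistent error
```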

Thesis # 9: Through the extension of trends in the aforementioned orders of knowledge and the marrying of multiple second-order concepts, one can gain third-order knowledge: this order is rarely falsifiable by any means other than proving logical inconsistencies concerning the first-and-second-order paradigms and between third-order knowledge claims

While it may not be clear in what I have written thus far, I have attempted to remain as politically correct and uncontroversial as possible while still saying what is necessary to convey my point. Unfortunately, this is the point at which I must descend into touchy material. Mike may have a weak belief that he shot himself because of karma or divine punishment. He may believe that he was predestined to shoot himself or that the CIA had implanted a microchip in his ass that made him do so. Any or all of these beliefs may be true. So long as they do not contradict the paradigms established by the first- and second-order knowledge sets or each other, it is justifiable to believe such things22. Those examples are clearly a bit extreme, but it wouldn’t be out of line to say that Mike’s justifications for these claims may be better reasoned and more defensible than many claims that people at large take to be determined matters of fact. We will address that in the next section of this chapter.

Typically, third-order knowledge claims reside in the realm of such things as esoteric sciences, religious discussions, conspiracy theories, and (especially) politics. These realms are not always populated solely by third-order claims, but they do tend that way in the common man’s mind. Other than by showing a logical inconsistency with the pre-existing paradigms, it is difficult to establish a falsifying element in third-order claims, which is likely part of the reason why the average man tends to vest so much of his mental narrative in the realm of weak beliefs: they have the illusion of being bulletproof to the logically illiterate.

This is not a dismissal of weak belief. While this type of knowledge is frequently abused, it does have its utility. Sufficient practical reliability and utility can secure third-order concepts against ridicule. Many times throughout history, some person or organization has made a third-order claim which, by way of abductive reasoning or by advances in the rational or technological tools at man’s disposal, has since established itself as second-order knowledge. Abductive reasoning can best be described as an appeal to a compelling explanation for an otherwise unintelligible or gratuitous circumstance. In the words of C.S. Peirce, “The surprising fact, C, is observed. But if A were true, C would be a matter of course. Hence, there is reason to suspect that A is true.”23 This abductive reasoning is easily third-order knowledge, and can even see itself promoted to the second order, given sufficient supporting evidence.

In the case of scientific and religious discussion, one ought to be diligent in first securing their claims well within the realm of second-order knowledge. Many times, a great deal of cultural upheaval and unnecessary suffering result from people aggressively supporting and advancing weak beliefs in such a way so as to make them mandatory for all. Two easy, controversial, opposed, and equally ridiculous examples are those of six-day-creationism and Neo-Darwinism. Both stand on weak paradigms and contradict matters of scientific and metaphysical fact which are quite cemented as second-order knowledge. It is acceptable to hold religious or scientific beliefs which are third-order, but only so long as one remembers that they are beholden to the standards established by their preceding paradigms.

Thesis #10: Through the collaboration of certain philosophers (and philosophy’s constituents) throughout history, there has been established a series of compelling arguments and traditions as they apply to the truth and meaning of the universe; one must be willing to adopt certain elements from these traditions, but not without first assessing the validity of and categorizing such elements

All of this chapter thus far likely appears to be a matter of stating the obvious. It is possible that one or another of my readers will claim that this model in no way resembles the actual process of knowing and knowledge. I challenge such a reader to provide a more practical, reliable, and accurate model so that I may adopt it. For now, I will extol the cash value24 of this model.

An interesting concept introduced by the sophists in the “new atheism” movement is meme theory.25 A grossly oversimplified view of meme theory is this: individuals create and transmit memes betwixt one another much like viruses, only instead of causing illness, they are ideas held in the mind. The memes that survive are those which provide the most utility or are in some other way given opportunity to spread. This theory was created with the express purpose of attempting to discredit religions as some sort of “meme engineering scheme” in which religious leaders, over the course of centuries and millennia, create and finely tune memes which grant the leaders control over those infected by the memes. If true, this would make religions some sort of mental terrorist organizations.

All sentient creatures, in communicating, are meme engineers. When I form a thought and pass it on to another, I am a meme engineer. When taking ideas in and deciding which to share, which to disregard, and which to modify, I am also participating in meme engineering. All of philosophy, including science and theology both, is party to meme engineering. This does not mean that philosophy is some evil organization creating zombies from a careful application of a trade millennia old, but rather the opposite. While there are bad actors who attempt to abuse ideology and reason to bend the weak-minded to their devices26, meme engineering is the primary engine of progress.
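The survival dynamic just described, in which memes with greater utility out-propagate their rivals, can be sketched as a toy simulation. Every name and number below is invented purely for illustration; "utility" is crudely modeled as the probability that a holder successfully transmits the idea:

```python
import random

random.seed(0)  # reproducible illustration

# Hypothetical memes; the probability stands in for "utility" (how readily
# a current holder transmits the idea to someone new in a given round).
transmission = {"useful": 0.6, "middling": 0.3, "useless": 0.05}

population = 1000
holders = {name: 10 for name in transmission}  # ten initial carriers each

for generation in range(20):
    for name, p in transmission.items():
        # Each current holder attempts one transmission this generation.
        converts = sum(1 for _ in range(holders[name]) if random.random() < p)
        holders[name] = min(population, holders[name] + converts)

# The high-utility meme saturates the population; the low-utility one barely spreads.
print(holders)
```

The point of the sketch is only that differential spread requires no conspiracy of meme engineers; it falls out of the transmission rates alone.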

It is important to note that memes are more like sound bites than full-fledged ideas. Certain images, affectations, or catchphrases are good representations of memes. Where one can easily remember, recite, or recognize a phrase like, “Form follows function,” they may have no concept of its point of origin or even what it means. Only through some form of learning or education does one come to know that it is a principle key to the architectural field, and one too often forgotten.

Many people, for any number of possible reasons, do not critically assess their belief structures. Our culture has engendered a distinctly emotional and anti-reason attitude. Many insist that “people need to learn to think,” when what they really mean is “they ought to learn to think like me.” The social understanding of the term “critical thought” has shifted to mean dogmatic neoliberal belief. Our political, religious, educational, and economic landscape clearly illustrates this attitude. Additionally, a popular activity has emerged: asking a random selection of people on the street elementary questions concerning these subjects and sharing their absolutely incoherent answers.

Ultimately, this unwillingness to critically assess one’s beliefs in the manner I have thus far outlined has become so widespread for so long that many cultures of intolerance to reason have developed. It is, quite literally, impossible to speak cogently, intelligently, and civilly with a large swath of the population. Neoliberalism, fundamentalism, scientism, fideism, and any number more “-ism”s have evolved from their origins as mere theories or rubrics for action into monstrous, insular, intolerant, and aggressive codes of dogma which cannot coexist in a world with rational actors capable of critical thought. This does not mean that all who subscribe to “-ism”s are mindless warrior drones ever ready to jihad in the name of science, faith, or civil rights; some are quite intelligent, if mistaken. Likewise, some number of “-ism”s have managed to maintain their proper mindset, application, and scope in an otherwise irrational environment.

If one is careful to examine both their own and others’ belief structures, one can inoculate themselves against bad memes and avoid being misdirected. Nearly every individual is rational to some degree. As a result, even the most unintelligent or mistaken individual tends to utter claims which bear some degree of truth. I hope that, through this work and those to follow, I may be successful in distilling said truths from the many, many ideologies and theories to which I have been exposed and arranging them in such a fashion so as to be accurate enough to piss absolutely everyone off. I believe that with proper education or training in logical thought, many will be able to make use of this model of knowing and believing in such a way that, even if they are unsuccessful in forming an accurate worldview, they may at least be able to behave and discuss in a civil and intelligent manner.

As can be inferred from the discussion of this framework, the order in which a particular piece of knowledge falls is contingent on the knower, not the meme (or claim). The argument for a concept establishes its order, not the idea itself. A clear example would be in the realm of ethics, in which one can make a particular claim (murder is wrong), and one’s method of justifying the claim can land it in any particular category. Kant can claim, “Murder is wrong because blah, blah, categorical imperative, blah, blah,” and it would at least qualify as a strong belief. “Murder is wrong,” says the local minister, “because I have a strong abductive argument for the existence of God and the Bible as a moral authority,” and his claim would be, at a minimum, third-order knowledge. When you ask the first person you see at the supermarket (as I have) and get the response, “Murder is wrong because… what are you, a psycho? It just is!” you have just encountered a claim with no knowledge content worth consideration.

One cannot possibly double-check every claim that they encounter, especially in this era of information overload. Categorization of ideas can help. Our current society sees an instinctive application of this solution; when presenting an idea (especially concerning a political issue) to one’s acquaintances, one is frequently faced with a dismissive response coupled with a particular categorization (“Oh, this is just that liberal/republican crap”). This can be done in a conscious and responsible manner. After assessing a claim one encounters, they can categorize the claim based on its premises, its subject matter, and the stances its proponents tend to take on issues other than the claim at hand. In doing this, the next time one encounters the same or related claims, they can expediently determine whether said claims operate in an acceptable and cogent framework. Admittedly, this process can result in one overlooking valuable information due to the manner in which it is presented. For this reason, I find that it would be ideal for one to maintain a stoic agnosticism when overwhelmed and explore one claim at a time, remembering always the larger picture.

The necessity and importance of collaboration cannot be overshadowed by the pitfalls of the human condition. In interacting with others in the philosophical space, one is able to expand their knowledge base, refine and correct mistakes, and increase the number of creative minds working on any given problem. Also, this interaction tends to leave a record. Once upon a time, letters, books, and diaries left a record for later philosophers to engage. In today’s era, those technologies certainly persist, but we have the additional technologies of the internet and all it has to offer. Most notable are the permanence and accessibility of data, attributes that will likely increase in scope as cryptography and open-source technologies become cultural mainstays.

Many ideas which have survived the ravages of human history have been passed down generationally, being improved, corrected, and reassessed with each passing century. Not all, but likely some of these ideas and worldviews contain a series of compelling arguments and methodological traditions, hence their survival. It would be a missed opportunity if one did not make an earnest attempt to analyze and selectively accept the accurate and useful from these traditions. As long as one’s first-order claims are factual and true, it ultimately doesn’t matter which first-order claims are made; a properly formed reason has the capacity to derive the type of worldview pursued by the philosophers: one that is internally consistent, logically sound, empirically viable and universal, possessing ethical agency, utility, and Truth.

95 Theses

1 http://plato.stanford.edu/entries/knowledge-analysis/

2“Problems of Philosophy” Chapter 9

3Hegel, Encyclopaedia of the Philosophical Sciences p10

4Gun safety protip: don’t do that.

5 Which will be discussed later in this chapter as well

6Ch 2: The Embodied Mind

7Falsifiability is a concept I have shamelessly stolen from Karl Popper and turned to my own uses. I will point the curious reader to his “Conjectures and Refutations”.

8A desire I share as an anarchist.

9Karl Popper Conjectures and Refutations p39

10As well as thesis 95

11First-order knowledge

12Popper p35

13Boethius’ Consolation of Philosophy p2

14Descartes “Meditations on First Philosophy” Chapter 2

15An idea that, while appearing to be simple, contains implicit meanings and beliefs within it.

16Holistic theory of knowledge

17Also called “strong belief”

18 The branch of philosophy which concerns itself with what our modern culture calls science, namely, a study of the material world

19Greek: telos. “That for the sake of which”

20Russell “Problems of Philosophy” Chapter 6

21For a more thorough exploration of both this specific example, and the principles which underlay it, I reference the reader to Thomas Kuhn’s “Structure of Scientific Revolutions”.

22 I seriously wonder what paradigms he would have to establish in order to simultaneously believe all four claims. If he has reliable second-order knowledge on which to base his accusations against the CIA, I want to hear it.

23Groothuis “Christian Apologetics” p434

24The practical results of embracing a particular idea

25Richard Dawkins “The Selfish Gene”

26We will call these people “sophists” or “government officials”.

Chapter 2: The Embodied Mind


Thesis #5: One’s experience is phenomenological in nature and derived from the senses; the development of the mind and our understanding of the universe is therefore derived from sense experience and interpretation of said experience

In the previous chapter1, I established that all knowledge is experiential. Even matters of “divine revelation”, ESP, or any other alleged spontaneous acquisitions of knowledge are still experiential in nature, as one is still experiencing such an event within their own mind, regardless of whether or not it is actually happening in a manner consistent with how one perceives it taking place. When we first addressed this state of affairs, it was in the context of one being solely informed by experience. In this instance, we are approaching it from an incrementally more nuanced position: that one’s experience is phenomenological in nature and derived from the senses.

Man has an inborn faculty of intellect. The intellect is a complex and frustratingly mysterious thing; I will describe it in as concrete and simple terms as possible. In the words of medieval philosophers, the intellect is the capacity by which matters of fact make themselves apparent, “like a landscape to the eye”2. This is the primary faculty by which one experiences the world, providing man with direct apprehension of the things around him. Essentially, intellect is the seed containing the mind, the ratio3 within man. This is seen in an infant as he begins to focus on various elements within his environment and as he gathers rudimentary sense data.

With sufficient time and experience, the capacity (seed) of intellectus can grow into the faculty of reason. Again, using the medieval scholars, “Ratio is the power of discursive, logical thought, of examination, of definition and drawing conclusions.”4 A more modern and specific definition would be “possessing the qualities of, or capacities for, self-awareness and a fundamental potential to learn and think logically”. The manner in which the intellect receives those experiences is sensational5; an infant may have a very basic set of instinctual “programs” by which they “know” how to feed, breathe, cry, and squirm, but they do not even have control over the movements of their own limbs, let alone any cognitive faculties. The intellect allows the infant to begin gaining control of their movements through the repeated cycle of stimulus and response in each of its limbs. Through prolonged exposure to patterns in environmental stimuli, the infant begins to expect the patterns to continue in the same manner: the first fledgling sparks of reason.

Before continuing to analyze the relationship between intellect and reason, it would be prudent to expand on thesis number two: “Reason dictates one’s understanding of the universe.” Reason, or the ratio we defined above, is a uniquely human experience. As mentioned previously, animal “experience” is nothing more than a perpetual cycle of stimulus and response. Conversely, humans have the experience of experiencing; or rather, the intellect serves as an intermediary step between stimulus and response. The intellect, as it develops into reason, begins to identify apparent patterns and categories. This pattern recognition is not infallible6, but is the basis of all human experience. While reluctant to abandon his skepticism, Bertrand Russell expresses an opinion very similar to, and more detailed than, this one in his Problems of Philosophy7. His term for this process, which I will borrow, is “induction”.

Following induction, both Russell and I approach “deduction”. Deductive reasoning, also called syllogistic reasoning, is a matter of logical calculation. Through induction, one can begin to assume patterns, and can even express them syllogistically. “If the stove top is red, it is hot” is a simple premise, which can be derived from simple experiences. Upon witnessing that the stove top is in fact red, one can assert, “If the stove is red, it is hot. The stove is red. Therefore it is hot.” This is an assertion which is derived from a combination of experience and reason. However, no degree of experience can account for the initial element of reason that dictates that such a syllogism is possible, let alone reliable. Modern research into early human development, though, has discovered that there are strong indications of innate mathematical reasoning within infants. I assert that these mathematical operations are an example of that very intellectus earlier mentioned. Ultimately, mathematics is an expression of logic in its purest form8, meaning that logic is something more than just a mere brute fact9: it is a faculty inherent to man.
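To sketch what “logical calculation” means here, the stove syllogism’s form (modus ponens) can be mechanically checked against every possible assignment of truth values; the validity of the form is independent of the facts about any particular stove. The function names below are mine, chosen for illustration:

```python
from itertools import product

def implies(p: bool, q: bool) -> bool:
    # Material implication: "if p then q" fails only when p is true and q is false.
    return (not p) or q

# Modus ponens: from (p -> q) and p, conclude q.
# The form is valid iff ((p -> q) and p) -> q holds under every truth assignment.
valid = all(
    implies(implies(red, hot) and red, hot)
    for red, hot in product([False, True], repeat=2)
)
print(valid)  # True
```

That the calculation comes out valid for all four assignments is precisely what experience alone cannot supply: the table only confirms a form that reason had to furnish in the first place.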

Deduction can express hypotheses beyond the realm of immediate experience. While our first example was purely experiential and practical, a brief survey of the philosophical tradition will show that deductive reasoning can be (and is) applied to every imaginable circumstance. The accuracy of these deductions is wholly contingent on two virtues: the accuracy of the premises as they relate to reality, and adherence to what Russell calls the “Laws of Thought”10. They are as follows:

  • The law of identity: ‘Whatever is, is.’
  • The law of contradiction: ‘Nothing can both be and not be.’
  • The law of excluded middle: ‘Everything must either be or not be.’

In other words, the “Laws of Thought” are another manner of describing the principle of non-contradiction. The best formulation I have seen of the PNC to date is, “The logical principle that something cannot both be and not be in the same mode at the same time.”

We are fortunate that we are inherently conditioned such that these principles are immediately apparent to us, for they are, themselves, unprovable. Our experiences can serve to reinforce these principles and, through their applications, prove their utility even if one cannot prove them in themselves. Through experiences of particular instances, we can come to a greater understanding of the nuances of such a simple and self-apparent set of principles. All the laws of reason, which will be explained and elaborated as they become pertinent in this work, are simply expressions of the particular nuances of the PNC.
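For the curious reader, the three laws can also be stated formally. In the Lean 4 sketch below (my formalization, for illustration only), identity and non-contradiction are provable from the rules of the logic itself, while excluded middle must be assumed as an axiom; this accords with the point above, since a formal system can only "prove" these principles by presupposing them in its own rules of inference:

```lean
-- The law of identity: whatever is, is.
theorem identity (a : Prop) : a = a := rfl

-- The law of contradiction: nothing can both be and not be.
theorem non_contradiction (p : Prop) : ¬(p ∧ ¬p) :=
  fun h => h.2 h.1

-- The law of excluded middle: everything must either be or not be.
-- Not derivable constructively; taken as an axiom via Lean's classical logic.
theorem excluded_middle (p : Prop) : p ∨ ¬p :=
  Classical.em p
```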

The more abstract or complex lines of deductive and inductive thought are no doubt somewhat removed from immediate experience, either by way of their conceptual nature setting them apart from the physical world or by speaking of physical events that are not within a proximate vicinity to the one deducing. This does not make the reasoning any more or less valid. For example, one can engage in mathematical exercises concerning triangles without referring to any actually existing instances of a triangle. Another instance would be a deduction that determines all kangaroos are mammals, even if one has never seen one before (and isn’t likely to… how many people go to Australia, really?). Both are valid regardless of whether the one doing the deducing is immediately, experientially present to the subject matter or not.

These rules of logic and their applications obtain in such a manner that renders relativism (in all but its softest forms) impossible. Something is said to “obtain” if it is necessarily true in every instance, such as triangles having three sides or the PNC. I say that these obtain in such a way as to render relativism impossible because relativism is, fundamentally, a denial of objective truth. Extreme relativism denies all objective truths, whereas softer forms deny only particular categories of truth, such as moral truths. This denial necessarily results in violations or denials of the PNC. Any instance in which one says, “there is no objective truth,” is an instance in which they are categorically denying categorical statements. This is an age-old objection to relativist thinking11 which has simply been hand-waved away by the proponents of relativism. Admittedly, there are more refined and delicate relativist arguments, but they all fall prey to this fallacy at some point or another.

Thesis #6: The mind is an embodied entity; all language and imagining is clearly based in bodily experience and all imaginable entities outside the immediate physical world are conceptualized in a sensational metaphor.

This experiential and embodied basis of our knowledge is clearly evident in our language. Every aspect of our imagination is physical in nature. It is fitting that, when discussing material circumstances, one should use material language. For instance, “that dog is sitting under the tree.” That statement can be a literal expression of a matter of fact. However, while it may feel intuitive, the same material language is used to express abstract concepts. For instance, “The prospect of war weighs heavy on my heart.” In this case, “the prospect of war” is immaterial and possesses no weight as a result. Additionally, one’s heart is unaffected by some immaterial state of affairs external to the person in whose chest it resides. I do not mean to claim that the above statement is devoid of meaning or veracity, but wish to illustrate the metaphorical nature in which we express immaterial concepts. While I lack the space and attention span to enumerate the various metaphorical uses of material language in the style of Wittgenstein, I contend that there is no instance of using language in a literal and comprehensible manner when expressing an immaterial state of affairs.

Upon brief inspection, I see three common ways in which embodied language references phenomena metaphorically. Firstly, it is used with regards to invisible material things, many of which we see the effects of but never the things themselves. Secondly, it is used with regards to metaphysical or spiritual12 entities. Thirdly, we employ embodied language with regards to ontological, or divine, concepts13. It would be prudent to at least exemplify each of these categories and the relationships between them.

Many will object to my asserting that we use embodied language with regards to material objects metaphorically. “Of course we use material language when speaking of material things!” they say, “why would it be a metaphorical use?” With some invisible material things, like most gasses or electrical currents, metaphorical language is unnecessary; it is literally the case that air can push, pull, heat, or cool things, as can electric currents14 and the like. However, in the case of more esoteric fields such as particle physics or quantum mechanics, we do use physical language metaphorically. A couple of easy examples would be the “color” of quarks or the “spin” of particles. Quarks are too small to be directly perceived by way of light and color, but the choice of “colors” provides certain useful conceptual assumptions based on our knowledge of actual colors. The same type of metaphor applies to the “spin” of particles, providing those that study and discuss these things with applicable and comprehensible language to do so, even if the terms are literally meaningless in such a context.

Admittedly, I have not yet allowed metaphysical or ontological existents15 into this framework, but that doesn’t disallow this analysis of language from entering our discussion. Even if such immaterial things do not actually exist, we still speak of them, and the manner in which we speak of them is indicative of the point I am making presently. Metaphysical entities, such as the principles of logic which were discussed earlier or the fundamental laws of physics, are frequently discussed in the language of math or logic; however, they are frequently expressed in physical language in order to make them practically useful. In the case of a particle’s “spin”, quantum particles travel along vectors as if they have angular momentum, like a spinning object, despite not necessarily spinning. Additionally, non-physical narratives, whether fictional or real (dreams, out-of-body experiences, revelations, ghosts, angels, etc.), are expressed in physical metaphor. An easy example would be the common narrative which occurs in reports of out-of-body experiences: “I was outside my body, kind of floating above it. I was there, but I wasn’t; I could see everything, but not like one does with their eyes. I was also in the next room over and still inside my body at the same time. I could see a long, dark tunnel, but it wasn’t really there, with a light at the end.” The only intelligible manner in which we embodied creatures can describe a circumstance which was clearly non-spatial and non-bodily is by use of spatial and visual language in an approximate metaphor.

Before we discuss ontological language, we must first define “ontology”. Ontology, as it is frequently used, is typically assumed to mean “the philosophy of hierarchy” or “the study of existents”. In my usage, ontology is best defined as “the philosophy of that which precedes physics and metaphysics”. This means that there are ontological commitments inherent within the fields of physics and metaphysics which, themselves, require investigation. These commitments typically involve the status of things as either existing or not, and the relationships and nature of substances and logical principles.

As one can assume from the above definition, ontological language tends to be complex and ambiguous at times. This area of study tends to involve exclusively mathematical concepts, the nature of eternity/infinity, discussions pertaining to God, and ideas16. Not one of the things on that list is a material or sensible thing. Typically, in the case of God, anthropomorphic language has become so prevalent as to make caricatures of the actual concepts themselves (i.e. God is a bearded angry old man in the sky who smites people for petty acts of impoliteness). Not one of those terms is easily applicable to ontology, let alone accurate metaphorical language for ontological concepts. However, this gross abuse of language does not detract from the fact that the only way a human can grasp such concepts as infinity, especially when attempting to avoid instantiating an infinite17, is through metaphorical use of embodied language.

Additionally, we, as (apparently) willing creatures, tend to use mindful language to express the behavior of non-willing and/or necessary beings. Where we may have refined our language in physics since Empedocles, “Things fall because like things desire to be proximate to like things,”18 we certainly still fall into this trap. Again, it is most common in the more esoteric areas of physics and in ontological discussions, such as particles “seeking out each other” or being “entangled” despite lacking a will or an actual entangling medium. That doesn’t change the fact that we use a language that is limited to embodied experience as a metaphor for more advanced concepts.

There is yet another likely mistake that one can make in reading this chapter. That mistake would be assuming that I am conflating the mind with the body (or the brain). I will not make a case for materialism, idealism, or substance dualism here. Instead, I intend to explore the manner in which we express such concepts linguistically.

One of the most interesting cases of language operating in an unexpected manner is with regards to the self. For example, phrases commonly used are “my body”, “my mind”, “my soul”, and “my self”. We speak of certain aspects of ourselves in the same manner we would speak of our property: “my car”, “my robot slave”, etc. This linguistic phenomenon implies two things. Firstly, it implies that one’s mind, body, soul, self, and property are each distinct entities which are not reducible to one another. Additionally, it implies that what exactly an individual is is either an amalgamation of the above-listed possessives, or something radically distinct from them.

We will address the question of what exact relationship the mind and body have, whether they are the same thing, one reduced to the other, or two distinct and intermarried elements, later in this book19. The additional question of what, precisely, the individual is will be addressed briefly, but it will require far more space and time in order to reach a meaningful answer than I have available in this work. It will also require more intermediary steps than the mere twenty needed to discuss the mind-body problem. For now, it will suffice to merely express the manner in which our mind is embodied, practically speaking.

For fear of being accused of making the same mistake that Nietzsche made,20 I feel compelled to leave a disclaimer at the end of this chapter. I recognize that, being a young American, my sole focus in this chapter has been the way an individual thinks and speaks in American English. However, I believe, based on my limited grasp of Latin and Japanese as well as my exposure to Hebrew, Greek, and Spanish, that this argument still obtains in some manner or another in every human language, with some slight modifications.

95 Theses

1Ch1, “Epistemic Assumptions”

2 Pieper pg 139

3Reason

4 Pieper 139. Also, Thesis #22

5Pertaining to the senses

6A state of epistemic affairs where one is incapable of being wrong

7Russell, Problems of Philosophy chapter 6

8 citation

9Something that simply exists without the possibility of explanation

10Russell ch 7

11The discussion between Thrasymachus and Socrates in Plato’s Republic (Book one, Chapter one) is an easy example.

12 I am not equivocating the two, mind you

13 In this case, the two may at times be equivocated

14Also, electromagnetism

15Simply defined, “a thing which exists”

16 Not to mention imaginary things like unicorns and free national healthcare

17For an introductory example of this type of reasoning, I recommend reading “The Cambridge Companion to Arabic Philosophy”.

18Aristotle attributes such a claim to Empedocles in his work De Anima

19 Chapters 8 & 9

20Namely, being a philologist instead of something a little more… real.

Just Another Friendly Argument 1: Dan

 

Discussing:

Water rights, the tragedy of the commons, cost-benefit analysis, (im)migration, how I may very well be incorrect, muh roads/highways, competition between railroads and highways, ethics vs. economic utility and government vs. individuals, cardinal vs. ordinal values, ethics vs. morals and “thou shalt not murder”, evolutionary biology/psychology, sustainability in human action, Zomia and the nature of history, transgender restrooms and democracy, the psychology of voting, the housing crisis, Keynesian economics and my communist roots, Trump-flavored cancer, mass extinction, labor prices and economic growth, minimum wage and education.

This is an audio-only post, and I expect that (provided this becomes a recurring segment) it will remain audio-only.  It’s a little bit longer than most podcasts, but I hope you enjoy it.  As always, I crave feedback, so let me know what you think, so I can do a better job.

Carpe Veritas,

Mad Philosopher

Chapter 1: Epistemic Assumptions


Thesis #1: One is solely informed by experience

“We must, as in all other cases, set the apparent facts before us and, after first discussing the difficulties, go on to prove, if possible, the truth of all the common opinions about these affections of the mind, or, failing this, of the greater number and the most authoritative; for if we resolve both the difficulties and leave the common opinions undisturbed, we shall have proved the case sufficiently.”1 As a read through the canon of philosophy2 will evidence, there is a long-standing tradition of beginning with and stating atomic, self-apparent facts, followed by exploring the ramifications of accepting those facts. While some philosophers may begin with assumptions more apparent and verifiable than others, it remains the case that all worldviews are predicated on basic assertions which are made by the one (or group) who crafted said worldview.

This assertion is, itself, a self-apparent truth. There is no real way to prove that all reason is derived from immediate facts, only to disprove it. The principle of non-contradiction is one such principle: a thing cannot both be and not be in the same mode at the same time3. There is no way to conclusively prove this to be the case, but it is the foundation of all our reasoning. I assert that any example that could be presented contrary to this claim is either simply a convoluted example of my assertion or is an exercise in irrationality and absurdity4. I will choose to arbitrarily select one out of all the available examples of a beginning paradigm which attempts to circumvent this reality. A common line of reason in modern American society is the claim that “There exist, among men, a large percentage of bad actors who harm others. We wish to be protected from bad actors. Therefore we must place men in positions of authority over other men in order to protect them from bad actors.”5. Of course, in this case, there will undoubtedly be bad actors introduced into the aforementioned positions of authority, amplifying rather than mitigating the negative effects of bad actors in society.6 This is one of innumerable examples which demonstrate the impossibility of escaping the paradigm I have presented.

As can be assumed, these self-apparent facts are apparent only through the experience of the one to whom they are apparent. Each of these (and all subsequent) experiential facts is, itself, informed solely by experience. Even the most outlandish claims to the reception of knowledge, like divine revelation or telepathy, are in their own way experiential. Setting aside whether it is possible or likely that one can have a vision or spontaneously altered awareness which is factual or true, what is guaranteed to be the case is that those who honestly make such a claim have had an experience of it which has informed their worldview.

Reason, then, as the faculty by which one can analyze and make judgments about one’s environment, is ultimately derived from experience7. The experience of fundamental principles, like the PNC, allows one to generate the praxis8 of reason. By using the tools and flexing the muscles of the mind, one can begin to develop the faculty of reason.

Thesis #2: Reason dictates one’s understanding of the universe

One without reason, like an animal, exists in a perpetual cycle of stimulus and response. No different from a complex computer program, the sum of all an animal’s behaviors is dictated by a genetic, instinctual rubric by which the animal eats when it is hungry, mates when it is fertile, and flees predators when threatened. Every nuance in its behavior is simply a property of its programming. This can lead to amusing circumstances when an animal’s conditioning is no longer appropriate for its environment, such as dogs refusing to walk through doorways due to certain cues which lead them to believe the door is closed, or Andrew Jackson’s parrot swearing so profusely it had to be removed from its owner’s funeral9. These amusing behaviors, though, are prime indicators of the lack of a key characteristic which makes man unique among the animals: reason.

Both man and animals have experiences: certain events as perceived through the senses. However, man has the unique experience of experiencing that he is experiencing. In other words, “We are not only aware of things, but we are often aware of being aware of them. When I see the sun, I am often aware of my seeing the sun; thus ‘my seeing the sun’ is an object with which I have acquaintance.”10 Experience, itself, is clearly not sufficient, then, to be considered reason or a source of reason. Experience, as the animals have it (animal experience as I will refer to it), is little more than a sensational input to an organic calculator which produces a result. That result, even, is no more than an action of the body which, in turn, generates further sensational input. This cycle simply repeats itself thousands of times per minute, millions of minutes in succession, until the animal dies. The experience of man (or just “experience”, as I will call it), however, is different.

Man still experiences via the senses, but a slightly more complex process operates after that initial sense experience. If a man is still in his infancy, is drunk, is caught sufficiently off-guard, is mentally disabled, or is one of my critics (or is any combination of the above), it is incredibly likely that he will have a form of animal experience in which reason doesn’t enter the picture until some time after an instinctual and automatic response takes place. Even so, there will be an opportunity later to reflect on the experience and interpret it as one wishes (though, at times, that opportunity is ignored). More commonly, an individual has the opportunity to process sense perceptions with a rational mindset, deliberating, for example, whether he should say one particular sentence or another while on a date.

In this example of a date, one (we will name him Mike) can draw on experiences from the past to inform the present choice. Upon reflecting on how poorly his last date went, Mike may opt to avoid describing in graphic detail what it feels like to shoot oneself in the leg over a veal entree… at least on the first date. This is an example of how one’s understanding is a direct result of one’s internal narrative. After experiencing the horror and disappointment of a first date ending abruptly and with no prospects of a second, Mike would have the rational faculty to reminisce over the experience in order to find a way to succeed in the future. Having reached an understanding that such behavior is not conducive to a successful date, he can choose to avoid that behavior in the future. This applies in all circumstances beyond the aforementioned date. If, say, Mike were to decide to read this book, after reading a miserable and arrogant introduction, he may come to an understanding that this book is not worth it and return to watching football, never to read philosophy again (that sorry bastard).

Of course, it is possible that one’s interpretation of an experience can be flawed. In the case of Mike, it’s possible that his earlier failed date had less to do with his choice of conversation and more to do with the fact that his would-be girlfriend was a vegan with a touch of Ebola. In the case of his current date, it is distinctly possible that his current would-be girlfriend is a red-blooded anarchist meat-eater who listens to Cannibal Corpse songs when she eats dinner at home. By misinterpreting previous experiences, Mike is going to spoil his chances with a real keeper. For this reason, I find it necessary to delineate between one’s subjective understanding of particular instances, which may or may not be accurate, and one’s faculty of understanding.

Thesis #3: One’s understanding of the universe dictates one’s behavior

As we addressed when discussing the differences between animal experience and the experience of man, man behaves in a manner distinct from animals. Due to man’s faculty of reason, understanding and justification interject themselves between the phenomena of stimulus and response. In any instance of stimulus, a man must choose to assent to the stimulus and choose to respond. In the case of Mike, while reading my book, he would be exposed to the stimuli of mind-expansion, intellectual challenge, existential intrigue, and more. Being unaccustomed to such stimuli, our example, while incredulous of the stimuli, assents and then chooses to cease reading and retreat to the comforts of the familiar simulated manhood of football. In the case of a dog, however, whatever new stimuli it is exposed to are immediately either perceived through the filter of instinct or disregarded outright, much like a blind man being the recipient of a silent and rude gesture. As those stimuli are perceived, the dog’s instinct causes it to behave in one manner or another. For instance, being of domesticated genetic stock and trained to assist his blind owner in particular ways, he may maul the one performing the rude gesture, with no rational process involved, merely organic calculation.

This difference, however, does not mean that man is devoid of animal experience or instinct. As mentioned before, under certain circumstances, man can behave in a manner consistent with animal experience. As a matter of fact, instinct may play at least half of the role in man’s experience and understanding. Man is clearly not the “tabula rasa” of Avicenna and Locke11. As I have asserted, the faculty of reason is inborn. Evidence exists to support my claim: infants instinctively act on stimuli in order to feed, cry, swim, and flail their limbs. There are also contemporary scientific claims that the brain operates as an organic calculator, evidence of which likewise appears in the behavior and brain structure of infants. Additionally, evolutionary psychologists have observed similar phenomena in grown adults concerning phobias, pain reactions, sexual attraction, and many other areas of the human experience. As will be addressed later in this book, it is even possible that this rational faculty my argument hinges so heavily on is, in fact, nothing more than a uniquely complex form of animal experience12. Until such time as I address such claims, though, we will continue to operate under the belief that rationality exists per se.

Understanding and habituation, then, drastically impact one’s behavior because they are the medium by which one’s experience informs and dictates one’s behavior. Through experience of particular sensations, and the application of reason to those sensations, man can come to understand his environment. Through application of reason to any given circumstance of stimuli, he can then choose an action understood to be most appropriate in any circumstance. Habituation, additionally, impacts man through the instinctual inclination to maintain a certain consistency in one’s actions. In the case of Mike, this would result in choosing to watch sports over reading philosophy.

Thesis #4: The epistemic and phenomenological endeavors of philosophy (and, by extension, certain areas of physics which pertain to the human experience) are crucial to one’s understanding of the universe and one’s resultant behavior.

In choosing to watch sports rather than read philosophy, Mike is attempting to avoid the discomfort of a new experience for which he is ill-equipped. However, in avoiding that experience, Mike is attempting to shirk his need to engage in public discourse and exposure to culture. Whether or not he succeeds in such an endeavor is less important to us now than what such an experience represents. The experiences of public discourse and culture are key experiences which inform one’s understanding and behavior. Our example in the introduction to this book concerning the need for communication and language illustrates the fundamentals of public discourse and culture. “This mushroom bad,” clearly establishes certain cultural norms as well as informing one’s attitudes towards certain concepts. In the case of Mike, it could be a friend coaching him with dating advice or beer commercials during the football game altering his expectations of his date. If he had read my book, Mike would be more likely to succeed in his date, having better equipped himself with a tool set for working with the human condition.

These tools have been graciously provided for us through the long-standing traditions of philosophy; most notable in this instance are epistemology and phenomenology. Through the study of knowledge and how man acquires it13, and of experiences and how man feels what he does14, philosophy can aid significantly in one’s quest for understanding what and how he knows what he does and how to influence those around him. Most of what has been written in this chapter is lifted directly from discussions I have had regarding various works in epistemology and phenomenology. In this regard, I believe this work is a paradigm example of the assertion made: that one of the most crucial kinds of experience for the formation of one’s understanding is one of a social and philosophical nature.

A strong cultural and public formation of one’s understanding is crucial because a well-informed understanding can ultimately provide maximal utility to an individual and society15, whereas a poorly-informed understanding can effectively cripple one’s ability to develop one’s rational faculties or provide much utility to oneself or others. As was mentioned earlier, one’s subjective, personal understanding can be flawed. Some merely make a small error in their reasoning, while others may be mentally disabled either by material means or due to a cripplingly misinformed understanding. The strongest influence on both the possibility of an accurate understanding and the possibility of mental disability is that public influence on the individual. As discussed in the intro, when done correctly, philosophy creates the circumstances most conducive to a well-informed worldview.

In this way, we see that one is informed solely by personal experience. That experience allows one to develop inherent faculties such as reason. Reason, in turn, allows one to analyze one’s experiences and engage one’s culture. This analysis generates an understanding and worldview within the individual, which has a bearing on one’s habits as well. This understanding is the premise on which one makes a decision regarding how to behave in any given circumstance. As forming an accurate worldview is crucial to one’s successes, philosophy (the strongest candidate in this regard) is crucial to forming said worldview.

95 Theses

1Aristotle’s Nicomachean Ethics (Oxford World’s Classics) p.118

2The widely accepted list of “most significant philosophers to-date”.

3We will explore the Principle of Non-Contradiction, or the PNC, more thoroughly in chapter 3: Orders of Knowledge.

4A claim which is logically self-defeating, whose conclusions deny the very premises on which it is built.

5This is an example of how philosophies written in the mid-17th century (Hobbes’ Leviathan) have percolated through the social consciousness for centuries and are no longer questioned.

6Additional examples and further exploration of absurdity can be found in Hobbes’ Leviathan, chapter 5.

7The next chapter will explore this concept more fully.

8The method by which one, through either experience or theoretical knowledge (“knowledge that”), can develop practical, active knowledge (“knowledge how”).

9 Volume 3 of Samuel G. Heiskell’s Andrew Jackson and Early Tennessee History

10“Problems of Philosophy” Bertrand Russell ch.5

11“Tabula rasa” refers to a “scraped tablet” or “blank slate”, evoking a description of the mind in which there is initially no knowledge or activity whatsoever.

12In Chapter 2: “The Embodied Mind”

13epistemology

14phenomenology

15In this case, I’m using the term “utility” in a very loose way. The best definition of “utility”, though, would be, “the capacity for a thing to provide or contribute to one’s flourishing.”

Abstract of the 95 Theses

Assumptions and their descendants:

From Aristotle1 to Zeno, every man who has claimed the title “philosopher” has made basic assumptions from which all their later works (if rigorously done) are derived. Even those who demand a priori proof of even the most atomic basis for argumentation (such as those in the Cartesian tradition2) make assumptions somewhere, no matter how well disguised or hidden they may be. There is nothing wrong with doing so, though; being an experiential creature, man can only begin to reason from some given truth of which he has experience. “The pre-existent knowledge required is of two kinds. In some cases admission of the fact must be assumed, in others comprehension of the meaning of the term used, and sometimes both assumptions are essential… Recognition of a truth may in some cases contain as factors both previous knowledge and also knowledge acquired simultaneously with that recognition-knowledge, this latter, of the particulars actually falling under the universal and therein already virtually known.”3

Because it is the case that one must begin from assumptions, it is in one’s best interest to select the most fundamental and apparent assumptions and build up from there with the assistance of reason and observation. When one follows these assumptions to their logical conclusion, one will likely see the errors of one’s assumptions if the results are absurd or impossible. At that point, one must select an improved set of assumptions and move forward, repeating this process as many times as is necessary. I use epistemic assumptions here, as my childhood experiences in Cartesianism have shown me the impossibility of accurately describing the universe if one is an epistemic skeptic or nihilist.

In addition to selecting a certain type of assumption, one must be deliberate in what quantity of assumptions one makes. If too few assumptions are made, there will be insufficient material from which to derive cogent syllogisms or conclusions, trapping one in the tiny cell of skepticism. Choosing too many or too advanced assumptions will short-circuit the philosophical process of discovering where the assumptions will lead and will necessarily result in the desired (and likely incorrect) conclusions of the author. Also, too many or too complex assumptions place one’s work beyond the accessibility of critics, in that no critic can hope to verify one’s claims based on one’s assumptions if the assumptions themselves are opaque, obscurantist, or simply a secret to all but the author.

As was implied by an earlier paragraph, and would logically follow from this conversation concerning the quantity and quality of assumptions, certain Enlightenment-era questions and practices ought to be bracketed4 for later discussion. If each philosopher were forced to synthesize his own version of the Cogito, or of the world of noumena, the practice of philosophy would have halted midway through the Enlightenment, with each new philosopher attempting to invent a square wheel. That is not to say that skepticism should not be addressed; only that it doesn’t necessarily have to be the starting point. Nor does it mean that one’s assumptions suffice on their own; they ought to result in an empirically falsifiable claim by which one could determine the validity of one’s assumptions.

The physical world and our understanding:

Why would my project run straight from epistemological assumptions into physics? The physical sciences are the first source of certitude after the basic epistemological claims are made. It is far simpler to state that we can know things, that the primary engine for any knowledge is our experience, and then to discuss that experience, than it is to make such an epistemological claim and immediately begin attempting to discuss experience or knowledge of some transcendent or ethical claim, as such experience is often derived from some manner of physical experience to begin with.

This is because philosophy, like reason, operates from the ground up: first building a foundation, then building arguments atop that foundation. “…If a house has been built, then blocks must have been quarried and shaped. The reason is that a house having been built necessitates a foundation having been laid, and if a foundation has been laid blocks must have been shaped beforehand.”5 As our immediate experiences are derived from our bodily senses, which are confined to matters of a physical nature, so too must our immediate foundations be. Even universal and unavoidable principles, like the principle of non-contradiction or many ethical principles, are made known to one by way of physical sense experience (with assistance from reason, of course). In addition to the foundation which physics provides on an experiential level, it also provides a conceptual basis. One cannot properly ask “why?” without first asking “what?” and “how?” Physics, when done properly, effectively shows one what happens in our physical universe and how it does so.

Metaphysics6, as the name would imply, can also be appropriately appealed to in this stage of development. As a counterpart to the physical studies of how our universe operates, metaphysics applies a slightly less experiential and more rational but very similar method as physics to immaterial questions regarding our experience. Metaphysics and I have had a very rocky on-again-off-again relationship throughout my life. As a confessed former adherent of scientism, for quite some time I disavowed that metaphysics could even rightly be considered to exist. I am sure that by the time my life ends, I will have left and returned to metaphysics at least once more, but each time such an event occurs, our understanding and appreciation of each other grows.

Ontology as derived from experience:

Why ontology? If ontology is to be understood as the study of existence or existents, then it would naturally follow from our study of our experience to move on to the study of the things we are experiencing, namely, that which exists. There is a question more likely to be asked by a modern readership, though: “why theism?” I have long struggled with the discussion of theism or atheism in the realm of philosophy. Even as a “scientist”, I was agnostic as to whether there existed some being beyond the physical realm, primarily because both the positive and the negative claim as to theism are empirically unfalsifiable.

However, that was at a period of time where I was still immature, both biologically and philosophically. I have come to realize (as will be discussed in the Theses)7, that one’s assumptions on which one builds one’s philosophy necessarily result in either a positive or negative claim concerning theism. In the case of any teleological philosophy, it must result in a positive claim and, conversely, in the case of any nihilist philosophy, it must result in a negative claim.

Also, after physics is able to establish an empirical validation of one’s assertions, it must be relegated to the role of double-checker, simply checking all later claims against man’s experiences, ensuring that no claims made by other fields of study run contrary to that experience. Naturally, after physics establishes what happens and how, the philosopher must ask why it happens, or another way of phrasing “why” would be, “what is the practical universal significance of such an event?”

Although the question asks for the practical universal significance, and despite the claims made by postmodernists, it is not in any way untoward or egotistical to presume that the universal significance of such an event must, in some way, be centered upon ourselves. There is a twofold reason that this is the case. Firstly, the nature of man is such that he feels a compelling need to search for meaning in his existence; any teleological philosophy would rightly assign an end to that compulsion. Secondly, our definition of philosophy is predicated on the assumption that man is capable of discerning a relevant place in the cosmos for himself. Ultimately, in this case, the absurdist is right: it matters not whether there is a significant place for man in the universal sense or not; man can always make one.

In knowing man’s role and significance in the cosmos, one possesses a tool set which one can use to determine what one ought to do. Now, many will refer to Hume at this point and will insist that “One cannot derive an ought from an is,”8 but rather than conclusively disproving my point, they merely indicate their lack of understanding of Hume. The prohibition of deriving an ought from an is assumes that the realm of “is” consists merely of objective, impersonal, atomic facts. If one allows value claims into one’s ontology, or one’s category of “is”, it becomes inevitable that the is/ought distinction collapses. These value claims are clearly not empirical, but that brings us back to our earlier discussion about the relationship between the sciences and philosophy: the moment certain supplementary matters of fact are allowed into the realm of discourse, such as metaphysical, psychological, teleological, or ontological assertions, it can easily stand to reason that one can derive an ought from an is.

Even in such an event that objective values do not exist, the subjective values of individuals must be informed by a proper understanding of physics, metaphysics, and ontology. If one values a particular activity or outcome, one’s ability to achieve such a result is dependent on properly navigating reality. Many would-be “oughts” are simply impossible or absurd and are beyond the human capacity for comprehension, let alone accomplishment; thus, the realm of values to which one can assent is limited by the same factors which have confined our definition of the philosophical activity thus far. Even after one assents to a rationally consistent and metaphysically possible value, the methods by which one achieves such an outcome are dependent on the nature of reality and the actor’s ability to navigate it. With these strictures in place, it is defensible to claim that one can derive an “ought” from an “is”.

The problem of evil and subsequent ethical prescriptions:

All philosophers are eventually faced with the question which plagues all men: “Why does life suck?” It finds itself phrased in many different ways but, since the time of Epicurus, the problem of evil has remained central to the discourse of philosophy. The most common phrasing would be something akin to, “If there exists an omnipotent, omniscient and omnibenevolent god, how can he allow innocent people to suffer as horribly as they do?”9 Usually, there are citations of disease and natural disasters killing small children to this effect.10

Different philosophers and traditions provide different answers, some more radical than others. Some, such as Epicurus, would say that the problem of evil is sufficient cause for a practical atheistic hedonism. Others, such as Pascal, argue quite the opposite. Not the least of the responses, while being more or less outside the theistic spectrum, would be the approach popular in the ancient East (and the answer I once held myself): “Life simply sucks”. While my answer now is slightly more refined, the practical application of it remains mostly the same. So, what to do about the problem of evil? This is, again, more clearly and articulately discussed in the Theses11 than I could hope to write here. It will suffice to say, for now, that our understanding of man’s telos must account for the problem of evil.

What can one do about the problem of evil? I believe that the answer is twofold. In the case of the philosopher, one is obligated to, at least, address and account for it and move on with one’s reasoning. Each man, however, must be able to address and account for the problem in his daily life. While the appearances of these two courses of action are very similar, I believe that each requires individual attention. The problem of evil serves as a strong device for proofreading philosophical assertions; insofar as one’s philosophy can or cannot address the problem, one can quickly assess the practical viability of said philosophy. The personal approach, while strongly tied to the philosophical one, need not be as rigorous or well-reasoned as the philosophical. The great acts of kindness displayed by those such as Blessed Teresa of Calcutta or Saint Nicholas are no less great a response to the problem of evil for any lack of philosophical argumentation for their actions. In this work, I hope to articulate the philosophical side of the problem; in a later work, I hope to provide practical tools for living in accordance with that philosophical approach.

As will be discussed in this work, in all reality, the problem of evil only exists in the form of a problem because of the innate desires of man. Man bears in his heart the desire and freedom to excel. Whether one is aware of it or not, a majority of his actions are caused by or strongly influenced by that desire. Despite the common formulation of the problem of evil, it is less an ontological statement of “How can this thing possibly exist?” and more a plaintive cry of “Why do I want this, if the universe conspires such that I cannot have it?” One must be able and willing to address the problem and either overcome or circumvent it in order to achieve the self-fulfillment sought after by all men.

Conclusion

My aforementioned saloon discussions have operated as a club of sorts, with the working title of Lucaf Fits, which is an acronym for “Let us create a foundation For it to stand.” As the basis of logic, reason, philosophy, and ultimately all human endeavors, a solid rational foundation is required for all meaningful discourse and progress. “Lucaf Fits” serves well as both a goal and mantra for my group and myself. With this work, I hope to begin setting forth a foundation on which my other discourses may stand.

This work, as I have already said, is to be a starting place, not an exhaustive foundation or even an introductory work like the Summa or Prolegomenon. In sharing this work, I am exposing the beginnings of my internal discourse to the harsh elements of the social world. I hope to be met with great amounts of constructive criticism and support from my peers and superiors, but I am not so confident as to expect it.

Regardless of the social and financial success or failure of “A Philosopher’s 95 Theses”, I intend to continue this line of work, exploring and expanding the 95 Theses, following them to their logical conclusions, and modifying the foundation as needed to most successfully pursue the goal of philosophy. I also hope that with sufficient time, effort, and experience, I can one day move beyond such foundational works and into a more practical style of discourse and argumentation. I believe that foundations such as those outlined here will necessarily lead to the conclusions that I so frequently argue and strive to engender in social media and day-to-day life; I hope one day to have outlined from this foundation those points so that others may see the validity of my position and actions. If, however, my conclusions are invalid and do not follow from the premises I am currently laying out, then just as well, as it will guide me to the Truth, which is far more valuable to a philosopher than public affirmation.

Because such discussion is directed at the revision of one’s arguments and beliefs, I will likely revise and correct this work over time. I have already, in the writing of this introduction, revised a few of the theses contained within this book, and have since edited each one a number of times, so as to more appropriately maintain their cohesion and logical validity. While I hope that such causes for revision will appear less and less frequently until, one day, I have acquired Truth, I am skeptical that such a time or event will occur in my lifetime, or even in this world at all.

The ideas contained herein are the product of nearly two decades of oral discussion12 and revision, as well as excessive reading of philosophers across time and traditions. I am simultaneously both encouraged and discouraged by the genealogy of my current position. Having run the gamut of political, economic, religious and philosophical stances in my short lifetime, I am emboldened in saying that I have recognized my own mistakes and intellectual frailty enough times now to be more willing and able to admit my own mistakes when they are made. At the same time, however, I find myself skeptical of any truth claims I do make, now, because of my long list of fallacious stances in the past.

With luck and a fair degree of self-control, God willing, I will be able to make use of another seven or eight decades in this endeavor. That, I would hope, will be sufficient time to complete the revisions to this and my later works. Perhaps, one day, my ideas will be perpetuated in the traditions of philosophy. Perhaps commentaries on my work will be required reading in some institutions.

After all, the entire tradition of philosophy consists of free ideas. I do not mean “free” as in without cost, for many of the greatest and worst of the world’s philosophies have been crafted at great price. I mean “free” in the sense that the ideas, granted an appropriate environment, will spread and flourish like wildflowers. As I mentioned before, these ideas are as much a part of the intellectual atmosphere as any other cultural trend or idea. In many cases, these ideas are so liberated from the moorings of their original author that they are falsely attributed to one who was unwittingly synthesizing an already existing work.

It is an obligation of the philosopher to give credit where it is due. One ought especially to give citations to one’s contemporaries, as they are still present to take advantage of what approbations and criticisms come their way. To only a marginally lesser degree, one ought also to give credit to those who have come before and laid the foundations on which one now builds, both so that one is not falsely assumed to be the progenitor of another’s work and so that one’s readership may be able to find the primary sources for their own edification. That being said, one must not be so averse to inadvertent plagiarism as to hinder actual progress. A healthy balance must be struck between progress and citation.

In addition to the intellectual and social coin of credit given where it is due, actual coin ought to be given as well. Being merely human, a philosopher still needs food and shelter and time. When one works full-time performing menial and self-debasing labor (as is common in this age), it can be difficult or impossible to set aside sufficient time, resources, and motivation for such an undertaking as philosophy. Even if the ideas and art of philosophy ought to be unbound by financial constraints like all other intellectual or artistic works, the one producing the work is. I can justify selling this work as opposed to making it freely available to all only because it is being sold at an affordable price and because I am willing to donate copies and excerpts to those who can and will benefit from it but cannot possibly afford it13.

I make this financial case for philosophers with a caveat: no man should be solely a philosopher. If one does not work some form of job at least part-time, or arrange for one's own self-sufficiency, to supplement both one's wallet and one's mind, then one must be working in some other capacity, whether for survival or for art. A man's mind can stagnate on outdated and fallacious thought if he is not careful to keep both his body and his social life healthy and active. Even if one makes enough money from teaching or publication (which, I understand, is rare), one must at least volunteer for a local, personal charity in which one works with other people and worldviews.

To this effect, I intend to continue this course my life has taken and see where it leads. I hope you, my reader, are willing and able to make use of this work and to aid me in my quest for Truth.

95 Theses

1 Technically, Albertus de Saxonia is alphabetically prior to Aristotle, but he is much less known.

2 The philosophers who followed in Descartes' footsteps, maintaining a skeptical stance towards all facts that are not entirely doubt-free.

3 Aristotle, "Posterior Analytics", book 1.

4 Set aside with the intent to more thoroughly explore at a later time; it is a technique to be used only on concepts that are not crucial to the discussion at hand.

5 Aristotle, "Posterior Analytics", book 2.

6 From the Greek: "after physics". While the name denotes only that it was the subject Aristotle would teach after physics, it can be said to deal with the non-material aspects of physical inquiry.

7 Chapters 5 and 13.

8 Hume, "A Treatise of Human Nature", book 3.

9 Hospers, "An Introduction to Philosophical Analysis", p. 310.

10 Dostoevsky's "Brothers Karamazov" is an excellent example of such descriptions.

11 Book 5.

12 In this case, I consider social media a form of oral discussion.

13 Ironically, I qualify under my own rubric for a free copy.

Philosophy in Seven Sentences

I’ve previously presented a brief review of Christian Apologetics (which seems to have vanished… I will have to write a second one or re-publish it). From the same author, InterVarsity Press has recently published Philosophy in Seven Sentences. Now that I’ve read the book (twice), I feel compelled to share it with my readers.

I love teaching/tutoring, especially audiences yet uncorrupted by academic ignorance and apathy. A few years ago, I taught a series of philosophy classes to a local homeschool group. It was well received, it paid the bills, and it gave both me and my audience a newfound appreciation for the science and art that is philosophy.

The average age of the class was somewhere in the vicinity of thirteen or fourteen, so the students were largely unaware of philosophy altogether (which is a shame). I had four lectures with which to cover all the bases of “Philosophy 101” in a manner amenable to a young audience. Ultimately, I decided on pulling four themes/philosophers from history and simply walking the class through a philosophical exercise of exploring those themes. Almost the entirety of my preparation time was spent choosing the four themes. In the end, I chose Plato's (Socrates') Apology, Aristotle's categories (basic logic), Descartes' cogito, and Kant's categorical imperative. Of course, each philosopher served as a foil for his contemporaries in the history of philosophy and for his inheritors, thereby covering the bases of philosophy's history. Having taken two Philosophy 101 classes (from two different schools, long story), I get the feeling this is a popular way to teach such courses.

All this dry nostalgia is to set the stage for a brief overview of “Philosophy in Seven Sentences”. Typically, this would be a full-on “teaching from the text” post, but this book is literally fresh off the presses, and both you and Douglas Groothuis would be better served if you ponied up the small amount of money required to acquire the text itself. That said, I do intend to do the text justice.

In eight short chapters, averaging about sixteen pages each, Groothuis takes one sentence per chapter (plus a short challenge at the end) and gives an excellent introduction to both the tools and traditions of philosophy. Typically, such a text will either attempt to impress its readers with technical terms, obscure references, and complicated methods of presentation, or it will be written so casually and simplistically as to render a rich and beautiful tradition banal and empty. Groothuis manages to walk a fine line between condescension and elitism, speaking plainly and straightforwardly while also challenging even seasoned readers to step up to his level of mastery of the material at hand.

I genuinely enjoy reading primary sources which, I guess, makes me weird; secondary and tertiary sources are generally less appealing to me, but I will read any material with a sufficient insight-to-page-count ratio. As a case in point, I had already read many of the texts referenced in “Philosophy in Seven Sentences”. Even so, Groothuis manages to take a broad array of information, presumably acquired through extensive reading, discussion, and lecturing, and distill it down to one of the highest insight-to-page-count concentrations I have seen, even for someone with reasonable familiarity with the material presented.

The seven sentences in question are well selected, spanning history and traditions from ancient Greece with Protagoras, Socrates, and Aristotle, to the early Church with Augustine, to the Enlightenment with Descartes and Pascal, to modern existentialism with Kierkegaard. While I may have selected a couple of different sentences (exchanging Pascal for Nietzsche and Kierkegaard for Camus or Sartre), Groothuis tells a progressive narrative which begins, dialectically and historically, with Protagoras’ “Man is the measure of all things,” and concludes with Kierkegaard’s pointed and melancholy “The greatest hazard of all, losing one’s self, can occur very quietly in the world, as if it were nothing at all.”

Readers who have no prior exposure to philosophy proper should, at least, recognize three or more of these quotes, as they have become memes referenced and repeated throughout popular culture. “Man is the measure of all things,” “I think, therefore I am,” and “The unexamined life is not worth living,” are referenced in popular films, shows, books, and songs. Descartes’ contribution, in particular, is the subject of a great many common jokes. I once owned a t-shirt which read “I drink, therefore I am.” Groothuis does an excellent job of setting straight misconceptions concerning these sentences without becoming a party-pooper.

Usually, a book I enjoy reading is full of highlights, annotations, and sticky notes. Every page of “Human Action” and “Existentialism is a Humanism” has some sort of mark on it. One would expect, then, that an unmarked book would be a sign of disinterest and, typically, one would be correct. In the case of “Philosophy in Seven Sentences”, though, nearly every line would merit highlighting (defeating the purpose of highlighting), and there is no need for annotating the text; it is clear, concise, and wastes no time or space in exploring, if not the history of philosophy, a powerful narrative through the tradition of philosophy.

I have never before encountered a book better suited to serve as a textbook for an intro to philosophy class. Admittedly, this book would likely be better received in a Christian institution than elsewhere but, even elsewhere, it far outstrips any conspicuously secular text, both in demonstrating the techniques of the philosophical exercise and in exploring the philosophical tradition. I guess I’ve been salivating over this book long enough and ought to move on to “teaching”.

The general plot of the book begins with Protagoras’ exploration of subjectivity. Given that the pre-Socratics are the progenitors of western philosophy, it makes perfect sense that one would start the narrative there. With a quick glance over extant pre-Socratic works, one largely has a choice between the two Zenos’ contributions of Stoicism and obnoxious math problems, Pythagoras’ trigonometry, Heraclitus’ almost Buddhist sense of impermanence and meaninglessness, or Protagoras’ relativism. While Zeno (either one), Pythagoras, Heraclitus, et al. each contributed quite a lot to philosophy as a whole, Protagoras sets a particular stage for Plato and Aristotle to really get the show going.

“Man is the measure of all things,” could easily be the opening line of a stage play concerning the history of philosophy. I know from firsthand witness that that phrase has hung on the wall of many dorm rooms that have borne witness to activities often reserved for cheap motel rooms outside of town; it has also, quite contrarily, remained very near the heart of philosophical discourse for over two millennia.

Such a mentality is easy for the philosophically-minded to slip into. As the exercise of philosophizing often consists of comparing and contrasting (AKA “measuring”) experiences, narratives, and ideas, it’s a natural temptation to declare oneself (or one’s kind) “the measure of all things”. Given the absence of an immediately apparent alternative to man, as far as measuring is concerned, Protagoras can’t really be blamed for making such a claim. Groothuis does an excellent job of exploring Protagoras’ position, the rationale behind it, what such a position means, and the ultimate results of such a position. I don’t have the ability or word count to do so.

Moving on, a younger and arguably more famous contemporary of Protagoras is reported to have said “The unexamined life is not worth living.” Of course, if man is the measure of all things, then such an examination is likely to be very short in duration. Groothuis shows the tension between Socrates/Plato’s views on the transcendental nature of reality and Protagoras’ more materialist understanding of reality. While setting up an opposition between Protagoras’ camp and the Socratic camp (an opposition which remains in the narrative all the way through Kierkegaard), he also describes Socrates, and the basis for so extreme a statement as “The unexamined life is not worth living,” in his own right. Admittedly, I feel that, despite explicitly addressing the key issue in interpreting Socrates (he didn’t write anything down, so all we have is other people’s accounts of what he said), Groothuis blurs the line between Socrates and Plato as far as their ideas are concerned.

Regardless of whether Plato or Socrates ought to get the credit allotted by Groothuis, they effectively prepare the stage for Aristotle, who begins the discussion of man’s nature. Ultimately, the issue of man’s nature is what Augustine, Descartes, Pascal, and Kierkegaard are called to opine upon. Each one comes from a particular philosophical school and era in history and, therefore, has something unique to contribute to the discussion, and Groothuis demonstrates a depth and breadth of knowledge of both the philosophers and their ideas.

This book is a must-read and must-have for anyone who is even fleetingly interested in matters beyond dinner, dates, and this week’s sportsball game. This goes for the engineer who did everything in his power to avoid liberal arts as well as the philosophy master’s students who may need a reminder on the basics, a reminder of where philosophy 101 students stand, or a textbook from which to teach. This book is one of the few secondary sources I will suggest, and I plan on snagging a few of the books listed in the bibliography for my personal extra-credit.

TL;DR: Philosophy in Seven Sentences, by Douglas Groothuis, is a paradigm example of how the more knowledgeable one is concerning a particular subject, the better one ought to be at explaining it in terms everyone can understand and, hopefully, enjoy. Derived from a popular introductory lecture style, Groothuis’ work takes seven deep, meaningful, and crucial sentences from the history of philosophy. While I may have chosen sentences from Nietzsche, Rousseau, or Sartre instead, I would not have been even remotely capable of laying out so much information in so concise and readable a narrative. If anyone has a hard time keeping up with the terminology or argumentation in this blog, “Philosophy in Seven Sentences” is my most highly recommended starting place (followed by Liberty Classroom).

Introduction to the 95 Theses

Introduction

“A Philosopher’s 95 Theses”: a silly and audacious title for a work by a college dropout with little to no substantive endorsements. What is this work even supposed to be? This work is primarily an attempt to begin a systematized and traceable discussion concerning my particular brand of philosophy. Having spoken in various public forums, from the classroom, to hosting salon discussions (thank you, Voltaire), to water cooler discussions, to arguing on Facebook (a noble means of communication, to be sure), to teaching and tutoring homeschoolers and managing a blog, I have found that many people in my generation and social stratum lack even rudimentary exposure to true philosophy or even formal logic. This isn’t the case for everyone, but it is for a majority. Many times, people disagree with my statements or beliefs, not because of any logical or ideological error on my part, but rather from a lack of understanding of how conclusions follow from premises. Ultimately, the discussions betray no understanding of the objective material at hand, but merely emotional attachments to already-existing prejudices, as well as a fundamental lack of foundation from which to argue. When presented with this fact, others are wont to accuse me of the same.

In this work, I hope both to soundly establish a defense against such accusations and to begin to spread a culture of “lower-class intellectualism”: a culture of self-education and intellectual progress compatible with and available to “the lower class”, economically speaking. The first step of doing so would be to make something accessible and affordable available to what I call “my social stratum”, as well as simply raising awareness of alternatives to the current institutions which are fueled by big money and political agendas.

Clearly, as a starting place, this work is merely the beginning of what I hope to be an expansive and pervasive body of work. I hope one day to move beyond this project of establishing my foundations to making these concepts concrete and practical, providing a certain utility to all who would be open to a paradigm shift from our current postmodern sensibilities. From this foundation, I intend to expand and build on these ninety-five theses using the same style and methods contained herein, as well as writing a series of philosophically weighted articles concerning how one ought to live from day to day.

As most anyone who reads this work can tell, there is nothing groundbreaking or even original in this work, other than the arrangement of these ideas pulled from the atmosphere of the philosophical tradition. As a foundational work, I would expect this piece to be fairly conventional. Besides, as one prone to taking things too far and stating the outrageous, I want to give myself a moderate baseline from which to work in order to give some credence to my more extreme assertions which I have begun to publish already, alongside this work.

Despite the conventional content, I chose a particularly evocative title (if I do say so myself). The title “A Philosopher’s 95 Theses” is an unabashed attempt to cash in on the fairy tale of Martin Luther’s dramatic secession from the Church. There is a narrative in which Luther made his secession official by posting the 95 Theses on the church doors as an overt “Eff-You” to the Church. While evidential support for this re-telling of history is nonexistent, the actual format and concept of the work itself is worthy of emulation. This is certainly the case if this is to be the beginning of a break from the status quo of contemporary philosophy.

To be honest, the suggestion for the title and style of this work was presented to me by a friend who seemed quite earnest in wanting me to write out my thoughts for his own edification. The suggestion was made primarily from a religious awareness of the Theses as a work of philosophy which could be easily adapted to a social media format; the concise nature of each thesis makes it easily tweeted in ninety-five segments. He challenged me to post ninety-five philosophical theses in ninety-five days on Twitter and Facebook in order to encourage me to begin writing my ideas in a codified and discussion-friendly format. After a hilariously disorganized and epistemically infuriating four months, I had ninety-five theses, a ton of notes from the discussions that were sparked (by the early theses, mostly; I think many friends and loved ones lost interest around #35 or so), and a new-found energy for attempting to publish something of worth.

The name and format of the original “95 Theses” have been lifted, but much of the argumentation and content has been abandoned, as Luther and I have very different intentions and circumstances concerning our respective works. Where Luther simultaneously affirmed and protested various Church doctrines and principles of theology, I intend to do the same for the philosophical doctrines which many contemporary philosophers have professed. As such, rather than explicitly arguing the finer points of revelation and redemption, I intend to establish a solid foundation for later arguments in the philosophical realms.

As I will address in detail later, philosophy is a historical and holistic entity. Due to the nature of philosophy, I don’t expect to have come up with any original material, even if I know not where it has been written before. In the words of Descartes, drawing on Cicero, “One cannot conceive anything so strange and implausible that it has not already been said by one philosopher or another.”1 The ideas and truths of philosophy are simply “in the air”, as it were. One of the marks of truth in the philosophical world is its longevity. Many ideas that emerge in these theses, as well as my other works, are strongly rooted in classical philosophy as it has survived to this day.

I borrow heavily from existing works, as all philosophers do. I give credit where I can recall or research the original source, but it would be impossible to trace the genealogy of every idea which springs from my mind. This arrangement of concepts and their relationships is likely to be original, but the ideas themselves are old and deep-rooted. It is the perennial duty of the philosopher to water, trim, and tend the tree of knowledge which is philosophy: to hold the ideas in one’s mind, to criticize and correct errors, and generally to allow the Truth to become known. Not a bonsai tree, but a veritable orchard of delicious and ripe fruits.

This work, hopefully, will establish a faux a priori2 foundation from which I can assert all of my later reasoning. Now is your chance, critics. Now is the time, in this work, to correct my premises, my errors, my moments of weakness, before I attempt to plumb the depths of truth in this vessel I have cobbled together. It will be too late, I am sure, once I have arrived at some incomprehensible and flawed conclusion, to point out that I had overlooked a basic truth here and now.

I have grandstanded long enough on what philosophy is, without giving an appropriate definition and description of it. One should not assume that one’s use of terms is identical to that of one’s readers or opponents.

What is philosophy and why bother?

I believe that all who can rightly claim to be philosophers will recognize certain fundamental characteristics which I believe to be necessary conditions for philosophy. It must be rational, as even the most blasé and stale philosophy assumes the basic precepts of logic, non-contradiction, and the ability of the mind to grasp truth. It must be consistent, as rationality simply cannot allow for the possibility that the principle of non-contradiction is invalid; therefore, all rational things are self-consistent. It must be empirically viable, as our experiences determine our understanding of the universe and, subsequently, the truth (the theses themselves will discuss this3); we cannot hold a belief which predicts or necessitates an experience divergent from what we actually experience. It must be universal, as any truth which is contingent upon circumstance is not a truth, but merely a fact.

In addition to these necessary attributes of the practice itself, I believe it must also produce certain results, fruits if you will, lest it be nothing but a mental exercise. Without ethical agency, this exercise would have no bearing on our lives as a prescriptive measure which, in the absence of an equivalent authority for prescription, would result in aimless and irrational lives, driven simply by the reptilian and hedonistic pleasures of our own genome. Without utility, this exercise would be superfluous to any other activity man would undertake; very few (and no sane) men would choose an impotent and laborious endeavor at the expense of something enjoyable and productive. Ultimately, without truth, there would be no rhyme or reason to the philosophical endeavor; if it is to be self-consistent and pursue truth, it must actually be capable of, and ultimately accomplish, the task of acquiring Truth. For these reasons, I assert with a fair degree of certitude that the purpose and goal of philosophy, as well as its necessary and sufficient conditions (and, therefore, its constituent elements, such as theology, physics, etc.), is to create an internally consistent, logically sound, empirically viable, and universal worldview which possesses ethical agency, utility, and (ultimately) Truth.

As mentioned in the above definition, philosophy possesses many constituent elements and tools of which it avails itself. As a reading of Aristotle or many of the enlightenment philosophers will support, I find that it is most natural to begin the philosophical journey in the realm of epistemology or phenomenology. A definition of each is in order, I believe, before addressing the practicality of such a method. Epistemology, taken from the Greeks, can simply be considered “the philosophy of knowledge and thinking, an explanation for how one thinks and knows”. Similarly, phenomenology would be “the philosophy of experience, an explanation for how one experiences and interprets those experiences”, also from the Greeks.

An approach starting from the angle of the philosophy of thought and experience does present some inherent issues, as in Hegel’s famous criticism of Kant:

“We ought, says Kant, to become acquainted with the instrument, before we undertake the work for which it is to be employed; for if the instrument be insufficient, all our trouble will be spent in vain… But the examination of knowledge can only be carried out by an act of knowledge. To examine this so-called instrument is the same thing as to know it. But to seek to know before we know is as absurd as the wise resolution of Scholasticus, not to venture into the water until he had learned to swim.”4

Hegel presents a very pragmatic alternative approach, which was quite popular with later Hegelian philosophers, like Marx. Essentially, he asserts that one ought to simply begin thinking and doing philosophy and will learn how one learns by witnessing one’s own experiences, much like how one learns to swim. As one can see, in reading the first ten or so theses, my assumptions and their descendants take a very Hegelian approach to early epistemology.

Amongst the historical traditions of philosophy, a debate as old as the pre-Socratic philosophies rages to this day: the theists vs. the atheists. Despite the greatest attempts of the moralist atheists, though, the arguments between theism and atheism ultimately deal with a more fundamental question. Whether or not there is a God is ultimately an argument as to whether there is any Truth at all. Again, as the theses address, either the universe is nihilist (devoid of any fundamental or objective meaning and purpose) or it is teleological (purposeful and directed)5. The most common theistic argument made is one concerning teleology: “What’s the point, if there’s no point?” Conversely, the atheist makes an absurdist or existential argument (presenting logically inconsistent facts, or asserting that the universe itself is logically inconsistent): “If there is no point, I can make one.” These arguments will be addressed in the theses6.

Ultimately, all forms of science and pseudo-science (assuming that they are rational and logically rigorous) are constituent elements of philosophy. If our definition of philosophy is accurate, then all rational activities which are directed at the goal of achieving ethics, utility, or Truth are elements of the grand attempt that is philosophy. The scientific endeavors are all part of the philosophical school of physics, by which one establishes the empirical viability of any particular philosophical view. The pseudo-sciences, ranging from sociology, to psychology, to astrology, to magic (again, assuming that they are rational and logically rigorous), can sometimes be appropriated into either physics or metaphysics. Some rare cases may even wander further from physics into epistemology or phenomenology, but all intellectual pursuits are ultimately an element of philosophy. Many of the individuals who pursue these endeavors lose sight of the forest for the trees, but that does not make their work any less valuable to the philosopher.

Bertrand Russell asserts, in chapter fifteen of “The Problems of Philosophy”, that science becomes science by divorcing itself from philosophy once it becomes useful. Josef Pieper similarly contends that scientific inquiry is capable of achieving conclusions which are resolute and unyielding, whereas the philosophical endeavor cannot.7 Both Russell and Pieper have a distinctly post-Enlightenment flavor in this regard, which is unfortunate. They both fail to see that science is but a tool and a field contained within philosophy. Science may try to distinguish itself apart from its mother, with such cultural figureheads as Neil deGrasse Tyson outright ridiculing her, but it can never truly extricate itself from the frameworks from which it came. Instead, it would be more appropriate for the specialists to concern themselves with their specialty and the philosophers to draw on them when needed.

Above all, reason is the driving force of man and his works. Above all rational pursuits, philosophy reigns. While not all men may have the ability to be great philosophers, all men are called to be philosophers, nonetheless. If in no other way, one must examine one’s choices and one’s life in such a manner as to achieve the best outcome available. Unfortunately, in this day and age, I fear that even this minor task proves to be too much for most.

It is no surprise, really, that this task has proven too much for my generation. The heart of philosophy is discourse and my generation is illiterate and disjointed in this regard. Rather than bemoaning our state of affairs, however, I ought to concern myself presently with the discursive nature of philosophy. Whether the discussion be oral debate in the city square, essays and books written in the solitude of a cave or study, or a college dropout’s ramblings on social media, philosophy only flourishes when an idea is shared, tested, refined, and put into practice. The manner in which this discourse and implementation takes shape is varied and veiled, but it is very real, even today.

The ideas and themes in popular philosophy pervade every area of our society, especially in the United States of America. They are boiled down to aphorisms and images and spread like a plague or meme through the cultural ether. I say “especially in America” as our nation was founded on a social experiment derived from the popular philosophies of the time (social contract theory), and that is a tradition that has continued for two centuries. Those that participate in the creation and sharing of art in society play a crucial role in the spread of these ideas.

Literature has been a long-suffering companion to philosophy. As far back as Homer and Gilgamesh, we see philosophical themes and musings riddle the characters and narratives of the culture. In more modern times, with the rise of the printing press, we saw an emergence of overtly philosophical fiction and some less-overtly philosophical fiction. There was such literature before the press, to be sure; just look at the classics. However, I find it unlikely that “Candide” or “Thus Spake Zarathustra” would have lasted the way the “Iliad” or “The Divine Comedy” have in the absence of the press. Even popular works of both fiction and nonfiction, whether intentionally or not, are rife with philosophical commitments.

These commitments are equally prevalent in film. While film is a fairly recent advancement in technology, it shares a common lineage with literature. We can easily trace its heritage from screenplay to stage play to the oral traditions which stand as the forebears of ancient literature. For the sake of this discussion, I will consider video games and television shows as film, as their storytelling devices and methods are more-or-less identical. In addition to the words and language used in literature, film also presents ideas and commitments through the visual medium: certain images or arrangements can, consciously or unconsciously, link certain ideologies and characters together. The same holds true for music, sculpture, painting, and really any artistic or cultural endeavor, even dance.

Through the public discourse and permeation of cultural works, philosophy drives a society’s zeitgeist8. The uninterested or uneducated who participate in cultural events, from watching movies and going to school to being subjected to advertising, have their minds and views molded by the underlying philosophy. Through exposure and osmosis, ideas that were once held in contempt have become mainstream, and vice versa. This is the natural cycle of philosophy, and it is always made possible by the liberty of the minds of true philosophers. Even if the zeitgeist demands that the world be one way or another, the free thinkers are always at liberty to pursue the truth and share that quest with others through discourse.

Philosophical Schools, the Good and the Bad

Philosophies, taken in their historical and cultural context, ultimately tend to land in two categories: that of “the man” and that of “the rebel”. Whatever cultural or institutional norm may exist for a culture, it exists because of the philosophers who have brought those concepts to light and shared them via the public discourse. Those ideas that find themselves in favor with the ruling class or establishment naturally become the driving force of a society or state. Those ideas which are newer and less conformed tend to become popular amongst the counter-culture. It is important to note: this observation does not lend any judgment to the truth value of any one or another idea, simply its cultural impetus. It is the duty of the free-thinking philosopher to sort through these ideas, regardless of the cultural context, and to ascertain the objective truth value of each respective idea. This often makes their philosophies unpalatable to both “the man” and his reactionaries. (C’est la vie.)

This cultural presence and impetus of popular ideas is revealed in every cultural work. From little nuances in color choice, sentence structure, and musical tonality to overt themes and statements, certain ideologies become manifest to an audience. Some of these manifestations are analytical and conscious; others are more insidious or subconscious. The two most prominent contemporary examples are the mainstream news and popular film, where phrasing and imagery are specifically designed to impart a worldview and philosophy on the unwashed masses.

It is no mistake or coincidence that the more authoritarian a state becomes, the more strictly social discourse and cultural works are censored. It is always in the best interest of the establishment to engender in its subjects a conformity of thought and philosophy. The most intuitive and frequently used methods towards that end are limiting the subjects of discourse and subverting the thoughts of the masses. I believe that now, as at any other time in history, the people of the world are having their thoughts and philosophies subverted and censored by the social and political establishments around the globe. An easy example of this phenomenon would be the blind adherence to material reductionism, Neo-Darwinism, and cultural relativism which is strictly enforced in academia as well as by societal pressure, despite the lack of compelling rational evidence to support any of the three.

It is possible, however, that the prevalence of “bad philosophy” in popular culture is less a conspiracy of idiocy and more the benign zeitgeist of an uneducated time. Regardless of whether it is intentional or incidental, there is a silver lining in this situation. Philosophy, when misused, can be a powerful tool for subjugation, but it is also, by its fundamental nature, liberating. Philosophy, as the pursuit of truth by rational means, necessarily drives its earnest adherents to freedom. By questioning the reasoning behind the social structures and institutional norms one encounters, one comes to understand where the truth lies and liberates oneself from the lies perpetuated by a society devoid of reason. Because of this, we see a dichotomy emerge: popular culture and its discontents. Now, this doesn’t mean that philosophers cannot enjoy and partake in the fruits of popular culture; it simply means that one ought to be aware of what is being imparted upon oneself, especially when there is a surplus of material available.

There are several misconceptions and maligned concepts in the realm of contemporary philosophy. One popular misconception concerning philosophy and intellectualism is that it is a domain primarily inhabited by out-of-touch nerds arguing about stupid questions. “Which would win in a skirmish, the Enterprise or the Executor?” While the answer is obvious after a short bit of reflection (the Enterprise), it is a dilemma that only a small and specific demographic will ever face. It is also a question of questionable practical significance. I have witnessed in both the media and the general public a rising belief that those who contemplate such questions are to be considered intellectual and philosophical, at the expense of those who actually deserve the titles.

Of course, those who are deserving of the title have long been plagued by equally absurd-sounding puzzles. “When removing stones from a pile of stones, at which point is it no longer a pile?” While the answer may appear obvious to a mathematician or engineer (the pile is a designated set; it remains a pile even if there are no units in the set), the question has far-reaching implications for the way man thinks and knows: in other words, for the realm of epistemology.

Without philosophy, man would lack a crucial tool of introspection and rationality. The very question “What is knowledge?” does not have a satisfactory categorical answer. Through our pursuits in philosophy, man has made great strides in addressing such fundamental questions, which have evolved from “What is justice?” to “How can I be certain I exist?” and now to a wider, more complex assortment of queries. The fact remains, we must always ask, “How do I know this?”

These questions form our culture and our ethos. Or, rather, the pursuit of answers to this class of questions drives the popular zeitgeist. Even banal entertainment, like prime time television and late night talk shows, touches on the questions which plague all sentient beings. “Why am I here?”, “Why am I unhappy?”, and “What’s for lunch?”9 are all questions which people are desperately trying to answer, whether they are aware of it or not. Philosophy attempts to codify and rationalize the pursuit of these answers and to make it accessible to our contemporaries and future generations, not only for our own sakes, but for the sake of man as well. These attempts frequently proceed by taking our common assumptions and putting them to the test.

In each age and culture, there are certain ideas that become popular and omnipresent. Examples would be polytheism in ancient Greece, Christianity in 13th century Europe, or social Darwinism in the early 20th century. As can be seen in these examples, many of the common assumptions of a time fall by the wayside as a culture’s awareness evolves. In the words of Pascal: “Whatever the weight of antiquity, truth should always have the advantage, even when newly discovered, since it is always older than every opinion men have held about it, and only ignorance of its nature could [cause one to] imagine it began to be at the time it began to be known.”10 Some of those changes are for the better (the shift from superstition to reason) and some for the worse (the social ideology which fostered Nazism). In the long run, however, philosophy always allows the individual and his culture to learn from the past. Typically, though (as I indicated above), this puts the individual at odds with his culture until the culture can catch up with him. This is often why the more notable philosophers are those who were considered nonconformists.

A popular postmodern mindset in the philosophical landscape today has attempted to artificially generate that notoriety through philosophical non-conformity. What I mean is, its adherents attempt to protest even philosophy itself. This is a trend which began in the Enlightenment and found its perfection in the existentialist movement. Where Enlightenment philosophers tended either to decry the philosophical mindset as some form of mental illness or to feel the need to announce that philosophy isn’t a “real” science, existentialists were (and are) wont to denounce not just the rationale of philosophy, but the very existence of logic altogether.

Absurdity is, fundamentally, simply denying or violating the principle of noncontradiction: asserting that something both is and is not in the same mode at the same time. Absurdism is a whole realm of postmodern philosophy in which one, such as Albert Camus, attempts to use the tools of philosophy without following the rules of logic. While such attempts are entertaining and mind-expanding, they are just what the name says: absurd. As the 95 Theses (like all philosophy) assume the existence and necessity of logic and rationality, this treatment of absurdism will be short and off-handed. Even so, Camus, Sartre, and other existentialists manage to contribute observations and arguments of value to those pursuing truth. I hope, in other works, to address the good and the bad of absurdist philosophy, but not today. The role of logic will be explicitly outlined in the theses themselves11; this brief treatment will simply help prepare a novice for the oncoming vocabulary contained in this work.

Nihilism is not a new concept in philosophy, but it has recently found a surge in popularity in the wake of the World Wars and all of their continuations. It is tempting to deny the existence of meaning when witnessing the most inhumane behaviors being perpetrated by man. “What is the meaning in millions of men killed by other men?” can easily become “What is the meaning?” However, for a being capable of asking such a question, the answer literally precedes the question. If one is able to witness and analyze whether or not something has meaning, there is, at a minimum, the meaning found in the production of that question. The absurdist looks no further than the mind of the inquirer, asserting that the inquirer/philosopher must give meaning to an otherwise meaningless world (and ultimately violating the PNC to do so). In this way, nihilism, in using a meaningful discourse to establish that there is no meaning besides the absurd, is itself absurd. The philosopher, by contrast, asks, “From whence does that desire for meaning come?”

In order to make sense of the universe at large, philosophy must be logical: it takes the evidence available to the philosopher and arranges it into a coherent narrative which is both satisfying and capable of producing utility and accurate predictions of cosmic behavior. From the fact that our minds and our philosophical endeavors operate in such a way, and the fact that they are successful, we conclude that the universe itself must follow a form of logic. While the human intellect may be limited to codifying and adapting a series of laws to describe the universe’s behavior, distinct from that behavior itself, the universe’s behavior is quite clearly consistent and logical, regardless of our perception of it.

This, of course, brings us to the subject of relativism. Relativism, in all but its softest forms, asserts and assumes the absence of objective existence, whether in the form of moral reality or of physical or ontological reality. Moral relativism and its twin, cultural relativism, assert that, because of the diversity of contradicting perceptions of ethical truth, there can be no absolute moral truth. Naïve relativism follows this form of logic to its inevitable conclusion: anything that can have contradictory observations or beliefs concerning it does not exist objectively; therefore reality itself does not objectively exist. While, at times, some form of scientific study is used in an attempt to justify such an assertion, typically it is an extreme reaction to scientism.

As objectionable as relativism is, it is at least identifiable and easily refuted. Scientism, however, is a beast of a different nature. Scientism is a strict adherence to the scientific method predicated on the philosophy of materialism; it is a union of empirical positivism and material reductivism. Anything not immediately falsifiable12 is of no consequence and ought to be done away with. Not all elements of scientism are bad (this coming from a former adherent); a strict adherence to the methods of reason and empirical observation is what has elevated the school of physics to become the driving force of modern society that it is today.

In recent centuries, most noticeably the twentieth, there was a sudden surge in scientific thought and progress throughout the civilized world. There were innumerable factors that contributed to this phenomenon and, thankfully, I have no intention of going into detail concerning them. At the moment, I am far more concerned with the fruits of this technological renaissance than its causes. In the nineteenth century, the perpetual swell of knowledge and increasing standards of living appeared to be infinitely sustainable. This led to an optimism in the whole of society, but most especially in philosophy and its constituent sciences.

Confidence in science’s ability to cure all of humanity’s ills was joined by a popular trend in science known as reductionism. It was widely believed that science’s messianic qualities were a result of its perceived ability to reduce the most complex psychological or biological ailments to some simple alchemical formula (female hysteria and electroshock therapy come to mind) and that even the darkest and most troubling metaphysical questions could be exorcized with a simple application of mystical scientific hand-waving. Reductionism isn’t a modern invention; even the pre-Socratics strove to reduce all things to one atomic principle (the world is air/water/fire/flux/love/whatever), but never before was it so widespread and influential as during the rise of modernism and postmodernism.

Unfortunately, in all their excitement over the leaps and bounds that were being made in their discoveries, true scientists (those who study the physical sciences) became “scientists” (those who adhere to the philosophy of scientism). Subsequently, some bad science was introduced into the realm of scientism without sufficient criticism. A handful of non-falsifiable theories, like Neo-Darwinism and String Theory, have managed to charade their way into the cult of scientism and are now defended with a fervor and blindness rivaled only by the most ridiculous of religions. While it is not currently my goal to write a full-fledged indictment of scientism and other instances of bad science, I am compelled to at least demonstrate that materialism is insufficient and to direct my readers to a work that more than completely shows that materialism and Neo-Darwinism are incomplete and illogical worldviews13. Just as many are prone to crusade in favor of misguided science, many are equally prone to jihad in favor of bad philosophy (i.e. relativism and consequentialism14). Some of these people have legitimate excuses for doing so (public education and the demographics of their upbringing come to mind); ultimately, their excuses can be reduced to the defense of, “I didn’t know any better.” Some despicable men, however, are quite aware of the logical fallacies they commit in the name of furthering an agenda contrary to the pursuit of Truth.

Sophists, since ancient Greece, have always profited from making defenses of the indefensible, whether for the acquisition of wealth or the silencing of their own consciences. Whenever an ill-informed or malignant trend emerges in a culture, it is certain that some sophist or another will emerge from the woodwork to champion it. Unfortunately for true philosophers, most sophists find their roots in philosophy and academia. This is unfortunate because, to the unwashed, sophists and philosophers are indistinguishable from one another, save that the sophist defends the fulfillment of his base desires while the philosopher demands intellectual rigor and consistency. These sophists were the enemy of the ancients and are the enemy of philosophy today. As certain writers throughout history (like Cicero) have noted, there has been a noticeable trend of cultures falling for sophistry not long before their demise. In our modern culture, we see popular philosophy dominated by sophistry and intellectual vacuity. In academic philosophy, it would appear that a certain apathy towards the common man and common culture has gripped the hearts of philosophers as they discuss the impractical and esoteric.

Worse, though, than the philosopher turned sophist is the celebrity or lawyer turned “philosopher”. Lawyers are paid to play by the rules and obfuscate the truth. Celebrities are paid because they make people feel good. Both of these careers are antithetical to the pursuit of truth. In the case that one who makes a career of pursuing personal interest (whether his own or his clients’) turns his attention to announcing some ethical, social, scientific, or really any intellectual claim, he ought to be met with close scrutiny. An example which has plagued America (and the world) in recent years is the Hollywood zeitgeist of celebrities loudly and aggressively endorsing the political ideologies of the radical left. While these endorsements ought to be received skeptically, we have instead seen a widespread voice of agreement in the public forum. This is no different from the phenomenon observed by historians of bygone empires and cultures.

The same cult of irresponsibility and self-promotion in both popular culture and academia that existed in ancient Athens still plagues true philosophers today. At times, given the ascetic15 nature of the philosophical disciplines, it can be incredibly tempting to compromise one’s integrity for the sake of wealth or popularity which a philosopher would never see otherwise. Additionally, even if one is unaware of doing so, it is common to confuse one’s ideas with one’s self, which leads one to take justified criticism poorly and leaves no room for improvement and correction of ideas. When one is more concerned with being well-liked or turning a profit than with engaging in a genuine, loving pursuit of wisdom and truth, it can only end badly.

As Socrates is credited with having said (though it is more likely a paraphrase of his entire body of work), “The unexamined life is not worth living.” In order to successfully achieve eudaimonia16 or Truth, one must be vigilant and develop the ability to accurately assess one’s self. As will be expressed in the theses, one’s experience and one’s examination of that experience are fundamental to one’s understanding of the universe and subsequent actions. Additionally, seeing as eudaimonia and truth are the goals of the philosopher, it is clear that any philosopher and, truly, every man must live an examined life.

Now, this is not to say that every man must so thoroughly analyze and examine every atomic facet of his life in perpetual stoic apatheia. In fact, quite the opposite. While the philosopher must develop a categorical and pervasive habit of self-assessment, such a habit could be crippling in other walks of life. Some men are simply incapable of this degree of introspection, and others live in an environment which disallows such behavior. Even these men, though, can and ought to engage in what could rightly be called a “partially examined life”17: a lifestyle in which one at least routinely examines one’s conscience and actions. Training in and awareness of philosophy are invaluable tools in such an endeavor.

After all, our definition of philosophy clearly illustrates that philosophy is universally applicable. In clearly defining how the universe operates and why, as well as exploring what our actions must be in any given circumstance, philosophy establishes itself as the prime candidate to be the very center of culture and individual lives.

Through careful examination of one’s self and of the universe at large, one can come to an understanding of what one needs in order to acquire self-fulfillment. The desire for self-fulfillment is already the driving force behind culture. In developing and advancing the understanding required to achieve self-fulfillment, one contributes to the formation of a culture of self-fulfillment. This culture, informed by philosophy, would be a haven for those seeking eudaimonia.

As the centerpiece of ancient Greek culture and subsequently of philosophy, eudaimonia deserves a more thorough examination and definition. While it is alluded to in the 95 Theses, it may not get the fullest treatment it deserves. It then falls on the introduction here to give at least a high-altitude explanation with which to work. Eudaimonia as it is used here and in the theses can most easily be described as “the freedom to excel”. This means not only the presence of the mental faculties required to conceptualize and pursue excellence, but also the material and metaphysical circumstances required. In truth, I believe that this has always been the pursuit of man: to live in a culture of eudaimonia.

Philosophy: a Brief Genealogy

Regardless of which narrative one adheres to concerning the origins of man, there are certain circumstances which must have occurred at some point. While the beginnings of just such a narrative exist in the theses, here I will imagine the worst-case scenario for the point I am attempting to illustrate. That point is that philosophy has existed from the inception of the human race. With the emergence or creation of the first man, whether he was a mutated member of an ancestor race or created fully formed from the dirt by the very hand of God, his was the unique responsibility of siring the human race. While language and conceptualization may not be required in order to find a mate, they could certainly help. From the birth of the first progeny of man, however, communication and conceptualization become necessary for the continuation of the species. In order for her offspring to survive long enough to fulfill their duty to the species, our Eve must be able to express the concepts necessary for survival. Even if one assumes that genetics supplied her offspring with instincts concerning fight-or-flight responses or aversions to harmful creepy-crawlies, those instincts would be insufficient for teaching the offspring, “This mushroom is bad,” or, “This is how you kill a boar,” when these are one-chance circumstances which drastically impact survival.

It is clearly in the best interest of humanity’s survival to build on and diversify the material each generation inherits. “This mushroom is bad,” can only take one so far; it certainly does not place one at the top of the food chain. However, inquiry, discovery, and purpose can drive a nomadic people, scratching a meager subsistence from the earth, to ever greater achievements. I may not be able to kill a bear in hand-to-hand combat (I have never had the chance to try), but I don’t have to. By virtue of the utility of philosophy (and its constituent physical sciences), I live in an environment which is naturally repulsive to bears (though, in this region, the case was quite the opposite until recently); as added protection, though, I have many tools at my disposal, not the least of which is my Mosin–Nagant.

Aside from mere survival, though, philosophy also provides mankind with an awareness of purpose and ethics which provides far more utility and impetus than survival alone, especially once the requirements for survival are met. In the pursuit of eudaimonia, we can imagine a genealogy of thought moving from, “This mushroom is bad,” to, “Why is this mushroom bad?” to, “Why is?” with as many intermediary steps as necessary. Alongside this line of reasoning, we also see a diversification of material, branching from mere survival and pagan “gods of the gaps” into physics (including biology, astronomy/astrology, chemistry/alchemy, etc.), metaphysics, epistemology, theology, and so on.

All these endeavors are oriented towards one end: the creation of an internally consistent, logically sound, empirically viable, and universal worldview which possesses ethical agency, utility, and (ultimately) Truth. Even so, they are sufficiently detailed and esoteric that one could spend one’s entire life in devotion to one small element of a particular area of philosophy. This should not, however, be used as a justification for skepticism18, as it would only serve as justification if philosophy were a solitary venture. Philosophy, by its nature, is collaborative. Each area of philosophy, no matter how distinct from the others in focus and subject, bears at a minimum a holistic relationship to the rest. In the same way that each area of study collaborates with the others, so too must individual philosophers. This relationship between the areas of study is due, in part, to their common material and practical significance; each area of philosophy informs the others and serves as a check against fallacious reasoning.

Being a human endeavor, philosophy finds itself the victim of human error quite frequently. As optimistic and teleological as my views are concerning this endeavor, I am not ignorant of the inherent shortcomings and roadblocks such an endeavor faces. I fully expect that, even in the case of my own contributions, I will find myself (many years from now) arguing against the very assertions I make in this work. These shortcomings often lead to the development of dead ends and half-truths. Some of these are quite speedily identified and handily defeated (like geocentrism), but many others are quite bothersome. Concepts which are rooted in truth or bear tangential resemblances to the truth often mislead the philosophical discourse. One need only look as far as Epicurus’ problem of evil and its subsequent resolution, or Puritanism, or the Copenhagen Interpretation, or Marxism to see what kind of damage can be done by philosophy gone awry. These mistakes, as damaging as they may be, will ultimately become a footnote in philosophy as failed experiments, as the utility of accurate reasoning becomes apparent and the march of the true philosopher continues unabated.

As the definition I am using for philosophy states, philosophy is an ongoing pursuit of truth (or, the Truth). All legitimate philosophers have, at one point or another, made a categorical assertion regarding truth. Even most faux philosophers make categorical assertions regarding truth, even if that assertion is a naive and misguided utterance of, “There is no truth.” While I do not necessarily believe that the “end of philosophy” has some metaphysical role to play in directing philosophy, or that it may be attainable in this world, I do believe that the simple utility of truth allows and encourages “those who have eyes to see” to be diligent in selecting the philosophies to which they subscribe. This “natural selection” of memes will, naturally, lead towards the end of philosophy. I know this sounds quite similar to the Darwinist narrative which I rejected mere pages ago, and it should, as there are some good ideas buried amidst the bad science. The survival of the fittest, as Herbert Spencer is credited with having formulated it, is one such concept.

Memes such as survival of the fittest are a prime contemporary example of how philosophical concepts tend to become simply a part of the atmosphere in which society functions. Most everyone has heard that phrase in one memorable context or another, even if they have no idea, or a misconceived notion, of what it means. In the case of philosophical culture, or rather the culture of philosophers, far more obscure and odd concepts are part of the atmosphere. In this way, a well-read and intelligent philosopher may breathe in Descartes, Scholasticus, Nietzsche, and Groothuis in order to utter forth a synthesis of these elements unique unto himself, even if it is identical to another’s work.

What utterance do I have to make? What can one such as myself bring to the banquet table of philosophy? I desire to partake of the feast about which those before me have written, but what can I do to pay admission? As will be clear to those who bother to read these Theses, I am not yet sure, but I hope one day to have applied myself thoroughly enough to this, my vocation, as to be worthy to touch the garment of Lady Philosophy.

This work, itself, is an attempt to codify my existing ideas in a format suitable for public development and critique. Philosophy is, by its nature, discursive and social. I could not rightly call myself a philosopher if I were merely to wonder at the cosmos; only if I share my wonder with others and argue my way to the truth alongside my companions would I be worthy of such a name. This is the first of a thousand steps towards the banquet for which I was created. I hope to bring along as many as can come with me to sing the praises of the Grand Architect of such a marvel as creation.

All I can rightly ask of philosophy and of those philosophers who would aid me in this journey is that I contribute one more voice to this chorus as old as man: to be heard and considered by others, and to have what truth I can find perpetuated while my own shortcomings are disregarded. A lesson I have learned from Ayn Rand: to be considered sophomoric and redundant is still, at least, to be considered. If I could rightly ask more, however, I would ask to be granted a personal fulfillment of my unslakable thirst for answers.

Hopefully, I can play an integral role in this chorus and make an impact. I want to bring the practice of true philosophy back from the grave that the Enlightenment dug, existentialism filled, and postmodernism hid in the woods. The death of god19 was less a death of god and more the abortion of philosophy. I want to aid in the restoration of Lady Philosophy to her former glory, to clothe her once again in dignity and honor, and to bring her back to the common people, not as an object of rape, but of royalty. This novitiate book is the inauspicious beginning of such a daunting career choice.

95 Theses

1“Discourse on the Method of Rightly Conducting One’s Reason and of Seeking Truth in the Sciences” Pt. 2

2Self-evident and deductively reasoned

3Chapter 1: Epistemic Assumptions

4Hegel, Encyclopaedia of the Philosophical Sciences p10

5Chapter 5: Teleology?

6Also Ch 5

7“Leisure: The Basis of Culture” p110

8German: “Spirit of the times”

9“Time is an illusion, lunchtime doubly so.” Douglas Adams

10Groothuis, On Pascal (Stamford: Thomson Learning, 2003), 202

11Chapter 5

12A theory resulting in an empirically verifiable prediction which, if inaccurate, determines that the theory is wrong

13Groothuis “Christian Apologetics” chapter 13

14An ethical school of thought which argues that the result of an action determines the ethical quality of said action

15Self-disciplinary and abstinent

16Flourishing and fulfillment

17A phrase that is certainly as old as the Socrates quote from before, but never better implemented than by the people on the Partially Examined Life podcast: http://www.partiallyexaminedlife.com/

18Disbelief that it is possible for one to obtain truth or knowledge of the truth

19Nietzsche used the phrase “god is dead” quite frequently, most notably in his parable of the madman from “The Gay Science”, book three.

SCIENCE! and Epistemology

Today’s resource suggestion is a little more involved than previous ones: Karl Popper’s Conjectures and Refutations.  This book primarily concerns itself with the problem of doing science from an epistemic standpoint.  This may not seem too important to the project I have been engaged in with this blog, but anyone who reads the book will likely see the connection very quickly.  My post on Paradigmatic Awareness is, essentially, a synthesis of this work and another by Thomas S. Kuhn, which will likely be another resource suggestion soon enough.

While Popper was primarily interested in the philosophy of science in this book, I believe his insights apply to all of epistemology, not just the study of the material world.  As a classical liberal, Popper extends his epistemic reasoning out to his own version of social contract theory.  I think that, while he had a good basis to work from and an amazing intellect, he made the mistake that many classical liberals made: he forgot that the institutions he advocated would never go away; where tolerance, as he imagined it, was only supposed to be implemented so long as it was practically useful to collective flourishing, it has become the monster that it is today… inspired by his own words.

So, please read Conjectures and Refutations.  It will broaden your understanding of how one can say that one knows what one knows and how science as an exercise ought to be done, and it will reveal a great deal of the social philosophy that has gotten the western world into the trouble it is in now.

http://www.amazon.com/Conjectures-Refutations-Scientific-Knowledge-Routledge/dp/0415285941

Police Accountability or Racism? Your Choice.

I haven’t been posting very much this week.  To those of you that care, I apologize; I’ve been working on editing, formatting and writing new chapters for the book I’m trying to get done by the end of this year.  Between an insane work schedule and the amount of effort I’ve been pouring into this book, I haven’t had time to even feed or bathe myself properly (gross, I know).

Anyway, there was this little gem I found the other day, and I had to carve out some extra time to share it with you guys.  Lately Chris Cantwell has been concerning himself more with how different genetic and cultural factors are not conducive to freedom, as opposed to sticking to the “thin” libertarian brutalism he used to be so well-known for.  Basically, he’s gone “thick-right” to the same degree as those that have gone “thick-left” in the libertarian movement.

One of the interesting results of this move is how quickly and effectively he resorts to tearing apart left-libertarians, even as compared to before.  Today’s Daily Resource Suggestion is this video in which he argues with CopBlock, an organization that used to be a police accountability group but is now a Black Lives Matter soapbox.

I don’t agree with everything Cantwell says (I never have), but he is definitely in the top three celebritarians as far as rational consistency, epistemic rectitude, and actually researching the subjects he discusses.

More Science Complaints

As a fun follow-up to my recent post concerning some of the troubles with how people do science, I present to you an otherwise very smart man who would rather try to fix politics than academia.

This article is primarily about The Needless Complexity of Academic Writing and the ill effects it has on academia as a whole.

Related to that article is a fun example of what he’s talking about:

20 Grad Theses explained in common terms

Scientists Are No Longer Skeptics

I have previously shared resource suggestions critiquing the way in which “science” is done today.  I do this not out of distaste for science, but because I love it.  My relationship with science was the primary reason I chose (and later dropped out of) my college major.

I wanted to do science, but discovered that the way science is done today is totally broken.  I believe the reason for that brokenness is the lack of philosophical grounding in the science community coupled with the pernicious influence of state funding.

Today’s Resource Suggestion is an exploration of one small symptom of science’s brokenness.

http://www.vox.com/2015/5/13/8591837/how-science-is-broken

About The Author (and his ideas)

Howdy! I am the titular Mad Philosopher of this particular work. I am a philosopher in my late twenties. Rather than focusing your ire on my lack of years, though, you may feel more vindicated by directing such feelings towards my lack of academic credentials. I am a proud college dropout who routinely speaks out against the academic industry.

How can a man claim the title of “philosopher” without a degree or a chair at university? What are the necessary and sufficient conditions for one to be a philosopher? I would argue that a philosopher is one who habitually engages in the activity of philosophy. Of course, philosophy itself is quite controversial. Is it merely thinking deep thoughts or questioning authority, or is it building a vocabulary and grammar for describing and discussing the human experience? Is it the activity of stoners and pedophile Greeks or is it the activity of academics and lawyers?

I am working on publishing a book dedicated, in small part, to addressing this controversy. In the meantime, readers of this blog (and listeners of the podcast) will notice a few family resemblances betwixt the entries on this blog which may inform the readers concerning what I believe philosophy to be. Readers of this introduction will receive the added bonus of my current working definition being explicitly provided here:
“Philosophy is the ongoing exercise of attempting to create an internally consistent, logically sound, empirically viable, and universal worldview which possesses ethical agency, utility, and (ultimately) Truth.”

I have been engaged in just such an exercise ever since I began reading the Nicomachean Ethics at the naive and virginal age of eight years. This has resulted in incalculable quantities of reading, writing, and arguing over the course of a couple decades. Also in that course of time, I have camped under the open sky for just shy of one thousand nights, earned the rank of Eagle Scout, renounced the honors associated with such an award, attended and dropped out of university (earning an associate’s degree in philosophy despite being a mere 20 elective credits short of a bachelor’s degree), married a (still) smokin’ hot woman, sired three beautiful daughters, and had a bunch of other life experiences that likely only matter to me. These experiences have informed my worldview, though, and I thought it only fair to share them with you.

I tend a 500 square foot microfarm which provides nearly one ton of food each year. I make a meager living working facilities and maintenance at a church. I make time, daily, to work on this blog and my books as a matter of vocation and passion. I host philosophy clubs, play Dungeons and Dragons, shoot guns, do landscaping work, and tutor in writing, logic, and philosophy on the side.

More important than the man, I believe, would be his ideas. I doubt you are reading this blog to get to know me, personally, and are instead interested in engaging some unique and challenging views presented in a rational and grounded manner. Why else would someone read a blog titled “Mad Philosopher”? I cannot guarantee that any of these ideas presented will be unique in their substance, given that it is far more common for one to read numerous sources and simply synthesize a new arrangement of old ideas. I do guarantee, however, that I do what I can to make these ideas digestible to all audiences, that I try to make the form of the discussion engaging and bite-sized, and that these ideas are central to a series of worldviews and schools of thought which I contend ought to be at the heart of a fulfilling and eudaemonic life.

Many individuals, across the entire spectrum of intellectual ability, strive to eschew labels and “-ism”s in order to not bring others’ baggage into a discussion prematurely and to avoid feeling constrained by specific doctrines or dogmas. It may be my semi-religious upbringing speaking when I say it, but I find labels and “-ism”s to have a very unique and indispensable utility. For instance, I can provide you with a list of ideologies and “-ism”s which are the strongest influences on my worldview and method of reason, and that will help frame the discussion on this blog in such a manner that you are less likely to misinterpret my arguments.

As a matter of fact, that is what I intend to do. I will list here a series of ideologies and methods to which I owe my worldview, in order of philosophical priority, with each successive entry on the list obtaining only insofar as it is compatible with the preceding entries. I, Mad Philosopher, am a/an:

  • Epistemic Popperian: Of course, I have to put the most complicated entry at the top of the list. In all reality, it’s not too complex; only the terminology is. Basically, I believe that “knowledge” defined as “justified true belief” is something to be approximated due to phenomenological limitations of the human mind (we can’t necessarily trust our senses and interpretation of experience). When one makes a knowledge claim, it must be accompanied by falsifying criteria: criteria that, if met, would force one to renounce the held belief. This is (ostensibly) the driving mechanism behind the scientific method. I like to think that this is the underlying operating principle for all of my claims, given that I have had ample opportunities to change my mind concerning a great many important subjects. Reading this blog will gradually expose one to this catalogue of mind-changes.
  • Anarchist: This blog is technically about philosophical subjects in general. However, I choose subjects for blog posts based primarily on discussions I have IRL (in real life) and in various spots on the internet. As such, most of my posts center on the most contentious of my beliefs. Anarchism is, by far and away, the most controversial: not because people would disagree with the premise (people shouldn’t murder, coerce, or steal from others), but because they don’t want to apply that claim to their own behavior in an intellectually consistent manner. As far as the AnCom vs AnCap debate is concerned, I like to call myself “merely an anarchist”, but I am fairly economically literate, which would make most people consider me an AnCap by default.
  • Catholic: Yes, an anarchist can be Catholic and vice-versa. I have not fully explored this discussion in a blog post yet, but I assure you, it’s on its way. For now, it will have to suffice to say that I believe the doctrines of the Church have sufficient falsifiability criteria to be provisionally assented to and that the doctrinal moral teachings of the Church bolster rather than contradict the Non-Aggression Principle in any of its more intelligible forms. One will notice that I have issues with Catholic social teaching and a great many non-doctrinal claims. These issues are informed by the preceding entries on this list as well as a simple rational and critical inquiry into the teachings of such figureheads as Aquinas and Augustine.
  • Optimist: As a Catholic, I believe that this must, in fact, be the best of all possible worlds (it would have to follow from the claim of an omnipotent, omniscient, omnibenevolent God). There is the glaring issue of the problem of evil, regarding which I have several posts in the works. Given my issues with Aquinas, I am disinclined to endorse the Augustinian Theodicy (which is really a construction of Aquinas’) and instead hold to a cross between the Irenaean Theodicy and what I call the Rorschach Theodicy.
  • Brutalist: Almost as if to balance the claim of optimism (this is the best of all possible worlds), I also believe that this world sucks. Mankind has largely been concerned with the activity of enslaving, domesticating, and murdering itself throughout all of known history (excepting, possibly, pre-agricultural-revolution, pre-government times), and this has resulted in a world wherein humans are a tortured, maligned wreck: unfathomable potential squandered by the lazy and criminal. This is why I listen to Death Metal.
    Taking on the label of brutalist is a sort of double-entendre, as there is the general disposition of a metalhead which is called “brutalism” and there is a line of libertarian/anarchist thought which strictly adheres to the Non-Aggression Principle. When I say I’m an anarchist, my particular brand of anarchism very closely resembles that of the brutalists to begin with. I do have various ethical and virtue-oriented prescriptions above and beyond that which the brutalists allow for, that’s why Catholicism precedes brutalism in priority on this list.
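The Popperian entry at the top of that list is the most mechanical of the bunch, so a small sketch in code may make it concrete. This is purely my own illustration, not anything from Popper or this blog; every name in it is a hypothetical invention. The idea: a knowledge claim travels with its falsifying criteria, and is only ever held provisionally until an observation meets those criteria.

```python
# Illustrative sketch of falsificationism (all names are hypothetical):
# a claim is paired with explicit falsifying criteria, and is renounced
# the moment any observation satisfies them.
from dataclasses import dataclass
from typing import Callable, List


@dataclass
class Claim:
    statement: str
    # Returns True when an observation meets the falsifying criteria.
    falsified_by: Callable[[str], bool]


def review(claim: Claim, observations: List[str]) -> str:
    """Hold the claim only provisionally; renounce it on any falsifier."""
    for obs in observations:
        if claim.falsified_by(obs):
            return "renounced"
    return "provisionally held"


swans = Claim(
    statement="All observed swans are white",
    falsified_by=lambda color: color != "white",
)
print(review(swans, ["white", "white"]))  # provisionally held
print(review(swans, ["white", "black"]))  # renounced
```

The point of the sketch is only structural: a claim with no conceivable falsifier (no observation for which `falsified_by` could return True) never leaves the "provisionally held" state, which is exactly what makes it epistemically suspect on this view.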

The name “Mad Philosopher”, itself, is a double-entendre. It’s obviously an homage to the popular phrase “mad scientist”, which seems appropriate: a mad scientist is often depicted as a social outcast reviled by other scientists and engineers for holding unorthodox views and implementing unorthodox methods. Would not this blog be the philosophical equivalent? That aside, I consider myself “Mad” in the same spirit as the mad scientist. Additionally, I am mad… well… livid, enraged, infuriated, wrathful, incensed, disturbed, repulsed, inflamed, and tempestuously, violently so. It is beyond my comprehension how one could be aware of the circumstance of contemporary culture and not at least feel a twinge of the pain, outrage, or guilt that I feel is warranted and just.

This blog is an opportunity for me to sublimate some degree of the infernal wrath I harbor, so as to maintain a level head in my day-to-day life while also hoping that others’ minds will catch fire as well. While I expect no amount of success with this project, if I were to have my way, this blog would generate a sufficient following so as to instill a culture of resistance and intentionality. This culture would aid in making the world a better place in general, but also (more importantly to me) aid in the possibility of starting an actual intentional community outside the reach of Empire, so that I can achieve some semblance of freedom in my lifetime. Oh, and it couldn’t hurt to get some bitcoin and sell some merch on the side.

Carpe Veritas,
Mad Philosopher

Morality and Ethics

It seems that my philosophy posts get less feedback than my more political or religious posts. I find this disappointing but unsurprising. Today’s post, as you could guess from the introduction, is a philosophy post.

Deontology and virtue, morality and ethics… I started discussing these relationships last week. They were only briefly exposed and not defended, so I guess I should probably defend those claims. The first claim was that any statement of “should” or “ought” concerning a person’s actions is either an ethical or a moral statement, without exception. I don’t know if this statement really needs defense, as it is merely a definition. I would define moral or ethical statements, broadly, as statements that concern themselves with how one ought to behave or act.

Moral and ethical statements obviously rely on a framework for a determination of truth value. One cannot say “One ought to voluntarily work towards the extinction of the human race,” without a justification for such a claim. One such justification could be “Human beings are destroying the global ecosystem, therefore one ought to voluntarily extinct themselves.” That justification, though, can only be said to be valid if it is operating in a framework which dictates that moral statements are derived from some cosmic preservation principle (ignoring that humans are a natural part of that global ecosystem), or an aesthetic principle that is dependent upon the one uttering the statement, or a misinformed understanding of how one ought to achieve a particular valued state of affairs (if you value nature, humans ought to extinct themselves). Validity does not necessitate actually obtaining in reality, though.

In order to obtain, the statement and its framework must comport to objective reality while also being logically valid and based on factual premises. “Don’t murder because Jesus said so,” is an example of failing to meet these criteria while also stating a moral truth. I argue that “Thou shalt not murder,” is an easily defended and true objective moral fact. However, appealing to something Jesus is purported to have said is not an argument in defense of a statement; it is merely appealing to an authority hidden behind two thousand years of history. Additionally, exclusively using the Bible as a moral framework is impossible; without additional work done outside the realm of Scripture to inform one’s interpretation of it, one will inevitably arrive at ridiculous statements, such as “homosexuality and abortion aren’t sins because Jesus never mentioned them.”

If “Thou shalt not murder” is an objective moral fact, it requires some form of deductive or inductive argument to demonstrate its categorical nature and its unimpeachability. There have been numerous arguments made for such a claim, and I don’t feel like pointing them all out. The first ones that come to mind, though, are Kant’s formulation of the categorical imperative in “Grounding for the Metaphysics of Morals”, Rothbard’s defense of the NAP in “War, Peace, and the State”, and Ayn Rand’s formulation in “Man’s Rights”. Essentially, the shortest and easiest formulation of “Thou shalt not murder” is thus:

  • Murder can be defined as “killing an individual against their will without first facing the threat of murder from that individual”.
  • The definition of a right necessarily extends to all individuals. If one has a belief in a right they or another possesses, it must necessarily extend to all individuals.
    ∴ If one individual has a right to defend themselves from murder, all individuals have the right to do so.
  • If one denies the right of another to be secure from murder (demonstrated by killing them against their will) one is denying this right to themselves, thereby willing the possibility that they may be killed by another.
  • In willing that another be able to murder oneself, it makes murdering this individual definitionally impossible, as unwillingness is a necessary condition for murder.
    ∴ If one murders another individual (or argues for the legitimacy of doing so), it does not revoke murder’s definitional status as a violation of a right.

I don’t fully agree with this argument, but it is the shortest and most straightforward case for objective moral facts.

Of course, if one is arguing for objective moral facts, they are a deontologist of some sort. Most deontologists are divine lawyers, trying to figure out God’s commandments based on revelation. While a noble effort, such activities are rarely compelling to those outside of whatever cult the divine lawyer is a part of. Deontology, then, is better suited to pursuing objective moral facts by way of rational and axiomatic inquiry into the nature of reality and of man’s relationships.

Where Kant or any social justice warrior will argue that deontological maxims can be positive statements of rights, I argue that only the inverse is true. One cannot say with axiomatic certitude that “one must affirm life” is an objective, and therefore categorical, moral fact, because it breaks down in limit cases (and some not-so-limiting cases). For example, if one is witnessing a murder taking place, can one kill the murderer? Or, if one eats something unhealthy or neglects to devote all their resources to the sustenance of the brain-dead or the starving people on the opposite side of the globe, are they committing a crime? Pope Francis’ answers aside, I argue that these are obviously not the case. This line of reasoning is what has led to my mantra of “Murder, coercion, and theft are categorically unjust.” Thus far, these are the three behaviors I have found to be inconsistent with reason in every instance, by definition.

So, “Thou shalt not commit murder, coercion, or theft,” is a deontological objective moral fact: something that simply exists, no more or less than the matter from which my body is constructed, if in a different modality. Of course, as I’ve said before, this is a certainly stronger moral framework than what is seen in mainstream culture, but it is still incredibly impoverished. One cannot necessarily achieve flourishing by simply ensuring that their interactions with others are voluntary, as one may still do stupid and ill-advised things. The agent in question, of course, cannot be coerced into not engaging in these voluntary, but ill-advised, actions. They can, however, be discouraged by rational persuasion. Enter: ethics.

Ethical statements, unlike moral statements, are not predicated on objective moral facts. These are positive statements that can be built on top of moral statements. These statements are subjective, based upon positive value judgments. “If one values the virtuous life, they ought to pursue virtuous actions,” for example. Some are very simple: “If one wishes to make money, they ought to provide products or services in trade with those who have money.” Others may be more complex: “If one wishes to prevent fishing entire species into extinction, one ought to purchase the bodies of water in which the fish reside or construct fish farms.” These more complex ethical statements are usually at the heart of the heated debates found on Facebook and in politics.

These statements have a common grammar and syntax; they are all if->then statements. The “if” portion of the statement is an assessment of the value in question. Usually, the value in question is an aesthetic or pragmatic issue. In other words, it is either “I like this thing because it makes me feel good” or “This is a thing that I need/want in order to be fulfilled.” The “then” portion of the statement is the place in which action is informed. Once one has determined the value in question, the “then” portion is where understanding the causal nature of reality can say “this is the way that is most likely to achieve that valued outcome”. In order to utter a true ethical statement, then, one must actually understand the innumerable influences of reality on the particular valued outcome in question, at least sufficiently to make an accurate and informed guess. In the realm of human action, economics, biology, and other areas of philosophy are crucial in generating an accurate “then” statement. The reason I argue this is the case is simple: unintended consequences or acts of ignorance are unlikely to accomplish the valued objective and are more likely to prevent its accomplishment. Walter Block, in his “Defending the Undefendable”, demonstrates this very clearly, concisely, and evocatively.
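Since ethical statements are claimed to share one grammar, a trivial sketch can make that structure explicit. This is purely illustrative (the class and its fields are my hypothetical invention, not part of the framework being described): the “if” slot holds the subjective value judgment, and the “then” slot holds the prescription that causal knowledge of reality supplies.

```python
# Illustrative sketch only: modeling the if->then grammar of ethical
# statements. The class and field names are hypothetical inventions.
from dataclasses import dataclass


@dataclass
class EthicalStatement:
    value: str         # the "if": a subjective value judgment
    prescription: str  # the "then": action informed by the causal nature of reality

    def render(self) -> str:
        return f"If one wishes {self.value}, then one ought to {self.prescription}."


# The simple example from the post, expressed in this grammar:
money = EthicalStatement(
    value="to make money",
    prescription="provide products or services in trade with those who have money",
)
print(money.render())
```

The sketch underlines the division of labor described above: disputes about the `value` field are aesthetic or pragmatic, while disputes about the `prescription` field are empirical questions about what actually causes the valued outcome.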

This understanding of morality and ethics is why I have attempted to eschew use of the terms “good” and “bad/evil”. These words, in our common parlance, and even in philosophy, have been reduced to mere aesthetic judgments. There is little distinction between “this pizza is good” and “giving money to hobos is good” or “sushi is bad” and “drugs are bad”. As a basis of morality or ethics, then, these aesthetic judgments are essentially meaningless. I can say, “If you think drugs are bad, then you shouldn’t do them,” but that is the extent to which an ethical statement can be produced based on that flimsy of an “if” statement. If something can be determined as immoral (or unjustifiable, as I tend to refer to it), there is no need to make an additional aesthetic statement about it. If one is attempting an ethical prescription to others, they ought to have a more compelling case for the “if” in question than “it’s icky and I don’t like it”.

Remember, anarchy is a philosophy of personal responsibility. If you want to accomplish an ethical action, such as bettering the livelihood of the impoverished in the third world, one ought to ensure that they are well-informed as to what course of action is most likely to result in achieving that valued outcome. For example, just throwing money, food, and Bibles at them creates a perverse incentive to remain poor and continue to receive free stuff from other people. However, bringing an industry specific to that region (for example some sort of crop or livestock that will grow better in that region than elsewhere or acquisition of a natural resource found in that area) to the people and employing those that are willing to work will improve the infrastructure and quality of life for all of the people in the area.

TL;DR: In the interest of producing valued outcomes and maintaining one’s own integrity, individuals ought to attempt to develop a solid moral and ethical awareness and grammar. In order to pursue this end, an awareness of deontological principles and the causal nature of reality is a necessary skill. Objective moral facts are few in number but categorical in scope: “Thou shalt not.” Ethical statements are subjective and as numerous as there are value judgments, but they must be informed by the objective causal nature of the universe. Before arguing on Facebook about “If we just…” or “If you don’t think this, you’re stupid”, it would behoove the agent in question to assess their aesthetic premises and their fundamental values. After expressing those premises, the discussion is a matter of clarifying the “then” portion of an ethical statement.

Moral Ambiguity

The time has already come for another dose of procedural philosophy.

As is always the case with procedural philosophy, some homework is in order. If you want to get the most out of this post, you should read or listen to the post about “Paradigmatic Awareness”. Today, we are talking about ethics directly, as opposed to the usual posts about how ethics impacts our relationships. Ethics, like all terms, requires a shared definition in order to be useful.

Ethics is the study of principles which dictate the actions of rational actors. Some will note that this closely parallels some people’s definition of economics. This is not an accident, but this phenomenon will have to be addressed later. There is a glut of ethical theories which assume different premises and result in wildly different prescriptions. This is a problem for an individual who is genuinely concerned with pursuing an absolute truth by which to live. Being one such person, I must admit I’m still searching; but I can help others make it as far as I have and ask others to do the same for me.

“But wait, ain’t you one o’ dem Catholic fellers?” Yes, I am. The Church has a pretty solid grasp on its doctrine and dogma (of which there is surprisingly little) and has built an ethics on top of that, something akin to a divine-law-meets-metaphysical-utilitarianism to which it appeals in every ethical discussion. One will notice that I do not advocate a moral stance which violates the doctrinal positions of the Church. I am fortunate that my quest for the truth has not yet forced me to choose between my own faculty of reason and the divine law of my faith. One will also notice that I staunchly oppose certain modern positions of the Church, especially in cases surrounding “divine right of kings” and compromise with injustice, such as “You have to pay taxes, because of the politically expedient manner in which we interpret the ‘Epistle to Diognetus’, a letter written nearly two thousand years ago.” (CCC-2240) What I am trying to say here is that “God said so” is never sufficient justification for one’s actions, but what “God said so” prescribes may nonetheless be rationally justifiable.

That tangent segues nicely to where we are going today. Ethics operates identically to the method outlined in “Paradigmatic Awareness” in many ways, with some variation. As the numerous postmodern moral nihilists are wont to point out, ethics faces an important problem: the is/ought divide. This problem, popularized by Hume, essentially points out that objective material knowledge of what is does not give rise to ethical prescription without first approaching what is with a subjective value assessment, an ought. This is where the procedure outlined in “Paradigmatic Awareness” becomes crucial.

Simply put, I must determine by way of intuition and abduction from what is to what I (should) value. Ultimately, anything could conceivably be the basis of ethical reasoning; hedonism, consequentialism, stoicism, legalism, virtue ethics, divine law, statism, nihilism, and anarchism are all predicated on different values and represent a fraction of existing ethical frameworks. Many are compatible with each other; as a matter of fact, most ethical frameworks are ultimately either nihilist or teleological in nature and tend to complement others of the same nature.

Ethics, really, is the ultimate product of philosophy. Philosophy can answer any question, “How did the universe come to be?” “What is it made of?” “How can we know anything?”, but without answering “Why should I care?” it has no real utility. I propose that the best answer to “Why should I care?” is “because, if this worldview is factually true, you ought to do X and here is why.”

Of course, an ethics which is too esoteric or complex for common application and immediate results is as useless as a philosophy with no ethics whatsoever. This is where rules become attractive; “thou shalt not” and “always do” are certainly the result of most or all ethics. For instance, if I were a Kantian (I am NOT), I would value the rationality and identity of individuals, which results in the mandate that people be treated always as ends and never merely as means; followed to its logical conclusion, one could say, “Thou shalt not enslave others.” Those that lack the faculties or resources to consider the corpus of Kant (a waste of time, really) can simply rely on the rules which fall out of his work. Without an understanding of the cause of these rules, though, one cannot reliably improvise in a circumstance not outlined in the rules, nor can one discuss ethical matters in an intelligible way. “You can’t do that, because this book said so” is a laughable claim, regardless of the book in question.

Everyone considers themselves to be an intelligent person and feels themselves to be very ethically-minded. They are correct in thinking and feeling so. Even psychopaths have a set of motivating factors for behaving in the way that they do. However, such a set of motivations, even in the form of a rule-set, does not qualify as an ethical framework. As a matter of fact, if one does not pursue the full rational grounding of one’s motivations, they will likely adopt a heterogeneous hodgepodge of contradictory rules from various sources. Any ethical claim which feels intuitive or justifies a desired action can be easily adopted and, with a little mental gymnastics, incorporated into one’s rule set without too much apparent contradiction.

This results in an emotional minefield scattered with beliefs such as, “I value property rights above all else, so we have to steal from people to prevent theft.” All one needs to do is go on the internet and read the intellectually toxic political arguments found in nearly every comments section and they will see what I am talking about. The problem is not the argument or even the belief held (though, by definition, nearly every political belief is wrong), but instead the lack of paradigmatic awareness. If someone lacks the foundational knowledge of what is, a clear definition of one’s values, or a grasp of logic sufficient to put it all together, it is impossible to assess others’ claims or to sufficiently convey one’s own belief. Instead, such people (regardless of whether one’s claim is factual or not) are forced to resort to dismissive name-calling and an arsenal of rhetorical and formal fallacies.

So, then, the same prescription in “Paradigmatic Awareness” applies in ethics as well. When confronted with a radical and apparently nonsensical claim such as, “You have a duty to vote, even if it is merely a choice between two evils,” it is important to inquire as to the value and basis for such a claim. Conversely, when meeting resistance to a claim one has forwarded, it is crucial to present the premises and method used to reach the contested claim, lest one look no different than a generic social justice warrior or fundamentalist Republican.

Also, just like with paradigmatic awareness, if someone is not willing or able to have a calm rational discourse, they are not providing an opportunity for critical thought. They are wasting everyone’s time. One’s time is better spent writing blog posts no one will read, reading books, or smashing one’s face in with a hammer rather than getting into a shouting match with a morally illiterate person. The goal, as is the case with all of philosophy, is pursuing truth; one cannot do so while stooping to the level of the ignorant. However, if one pursuing truth happens to bring others along, all the better.

Ultimately, my motivation for writing this post is twofold. I want to invite people to critically assess this approach and help me do a better job of understanding how I ought to live my life. I also want to find someone, anyone, who can play by the rules I’ve outlined and believe to be absolutely crucial to communication and progress. I honestly desire for someone to prove me wrong. The ethic that I have managed to cobble together over the last twenty years is incredibly taxing. I would love to (re)apply for welfare, to stop going to church, to stop trying and start partying… but I can’t. My rationality and what little virtue I do possess prevent me from doing so. I think I could do well as a Fascist (which I believe to be the only logically consistent alternative to anarchy), but no one has proven me wrong yet, so as to grant me the opportunity to try my hand at it.

Remember, despite the immense and demonstrable utility that it provides, anarchism is a moral philosophy. It holds the utmost value for human rights and, as a result, human flourishing. When an anarchist says “you shouldn’t do that,” they aren’t forcing someone else to behave in a manner consistent with their opinion. Anarchists cannot point a gun at someone and demand that they refrain from the action in question, nor can they vote and delegate that task to someone else.

TL;DR: If someone wants the privilege of being able to criticize the actions and ethics of others, they ought to put in the work of critically assessing their own position and actions. If people cannot communicate the reasons for the rules they are so wont to broadcast, they are wasting everyone’s time.

An Open Letter to Mom and Dad

 

Dear Mom and Dad,

We rarely find time to talk anymore. I guess that’s what happens when you have eight kids and your son has three more. Rushed, oft-interrupted, and emotionally-charged bursts of conversation are not conducive to mutual understanding, and I understand you are too busy to read and understand everything I write. While considering this reality, I’ve decided to address my confusion over our philosophical disagreements and consolidate my ruminations into the most direct and concise letter I can write, for you to read at your leisure. Depending on how the letter turns out, I may publish it as an open letter on my blog, for others to better understand as well.

Really, the heart of my confusion is centered on mom’s disparaging and dismissive attitude towards my ideas and understanding of the world. I have arrived at this stage of my understanding primarily due to your influence. Dad’s perennial pragmatism and skepticism gave me a high standard and difficult challenge for rational methodology, and mom’s example for action has given me a healthy respect for intuition and substantial consideration regarding virtuous and moral action. In a way, I guess I’m concerned that I may have put you on a pedestal and now require more from you than you can provide, but I am extremely reluctant to admit that possibility. So, here I will write the things I feel you have taught me and how they have led me to the conclusions I have reached; hopefully, it will give us somewhere to begin understanding each other.

If an idea or approach is discovered to be false or does not work, eschew it for what is and does:

When I was a little kid, I often had great ideas or plans which were poorly engineered: clubhouses which required far more than the few pieces of scrap wood I had available, for instance. While he may not have had the greatest method of explaining why, dad was very good at pointing out why the idea was impossible and providing a more realistic, comparable plan. After the school system had demonstrated that it wasn’t working, mom pulled me out and attempted home schooling, at which point you perpetually modified and refined the curricula and methods of schooling. Trying different methods for allowance, chores, discipline, and personal liberties, keeping what worked and dropping what didn’t, was a constant state of affairs growing up. It seems that ethos is still in full force today.

It shouldn’t take too much explanation to see how this ethos has had an effect on my journey thus far. Primarily, identifying and learning from mistakes. Whether it be my approach to studies, finances, personal life choices, whatever, I’m not afraid to admit error and strive to rectify it, and to rectify the subsequent mistakes made in the attempt to rectify, ad infinitum. Philosophically, I have always had a set of needs. I’ve applied this ethos to fulfilling those needs, moving through pursuits such as paleontology, vulcanology, meteorology, astronomy/ology, cryptozoology, theology, astrophysics and demonology, ultimately settling on philosophy. Along this path, I’ve found what fulfills this need and what doesn’t.

This process has served as a useful tool for self-awareness, but I will save that for later. For now, I will move to the things you have shown me which have been consistently shown to work.

Deontological maxims supersede practical considerations:

This is a truth that was long and hard to learn. For a long period of time, possibly due to the environment in my early childhood, it was hard to critically assess the position that, “The ends justify the means.” “If my goal is noble enough and attainable, the most direct course of action to get there must be taken, regardless of how undesirable the course of action may be.” This claim, in its myriad forms, consistently saw resistance from you. “Murder is still murder, even if it’s for a good cause,” was a common response I would get.

As I warmed up to the idea, for example, that the ten commandments are non-negotiable, I explored the real-world and hypothetical ethical dilemmas which would test such a deontological maxim; trying to expose inconsistencies and contradictions with such an approach became a daily exercise. So far, after trying to break deontology, all I have found is that a clearly-defined and concise set of maxims is the most resilient and reliable basis for moral action. Sometimes, these maxims set a standard too difficult to achieve; this is due to human failings, though, not the mind of God to which we ascribe these maxims.

It is infinitely more honorable to set a moral standard, strive to meet it, and fail than to set a low standard or otherwise make no effort:

These moral maxims, such as “Thou shalt honor the LORD above all else,” “Thou shalt not murder, steal, or covet,” and their necessary conclusions, “Love your neighbor as I have loved you,” and “Uphold the dignity of the human person,” can be more demanding than one can manage at times. This is not an indictment of these maxims, but instead an empirical fact of the human condition. When faced with this fact, one may choose to dissemble, to rationalize or justify their failures and accept them, or, worse, to simply give up altogether. I’ve lost too many friends and seen too many others lose friends to this temptation. Seeing you strive to more consistently meet that standard, and succeed, has demonstrated the honor in doing so.

Rather than striving to meet such a standard, I would often attempt to reinterpret these maxims or rationalize my status. You dissuaded me from doing so, mostly by example. It helped that, as I explored limit cases of these maxims, you made an effort to resolve issues or directed me to resources wherein others made the effort. Often, neither you nor the sources could provide a compelling resolution, but instead gave me the tools needed to do so for myself. The important trend through this process was the need for integrity: if someone abandons honesty to themselves and their standards, it is tantamount to lying.

Acting justly is more important than comfort:

Between the maxims mentioned above, the need to act in accordance with those maxims, and the need for integrity, one has a duty to accept responsibility for their situation. Again, this is something I learned from your example first, and by exploring the philosophy behind it later. Simply assessing your circumstances and making what is ostensibly the best choice available, even when it will be difficult or uncomfortable. Those instances when we would move, switch to hippie food/medicine, move to homeschooling, etc. seemed to demonstrate that duty and the discomfort associated with it. Discussing my situations concerning college, marriage, kids, work, etc. with you also followed that trend.

To engage in or directly benefit from immoral action is to be complicit in that act:

Part of acting justly despite discomfort is to avoid immoral action. When I was younger, I had a hard time understanding why you would discourage ideas for what would be a clearly profitable venture, ranging from selling vices to running (relatively) harmless scams. The recent example would be not wanting Tommy to be a security guard for a pot shop. While I may disagree with you on specific questions of morality, I think we all agree now that selling one’s morals for profit is unacceptable.

That which is immediate and actionable supersedes distant, future, or theoretical concerns:

Selling cocaine out of the Church garage may pay the bills, and may even make enough to be comfortable on top of paying them, but the ends do not justify the means. There’s a story stuck in my head that I think dad told me, but even if it was someone else, it sounds like all the other stories about poop brownies and the like. There was an Olympic rowing team that lived together, and whenever someone wanted to do something, the team would ask them, “Will it make the boat go faster?” At face value, this would seem to justify the idea that the sole justification of the means is, in fact, the end.

That interpretation is incredibly naive, though. The Olympic rowers found themselves in the circumstance that they were Olympic rowers; the Olympics were upon them and they had a demonstrable and immediate goal of making the boat go the fastest. In their case, the Olympics were as distant or theoretical as getting shot is when on a battlefield or being corralled onto a train in 1939 Poland. That is to say, not very abstract. When faced with a choice, as one is thousands of times a day, the primary consideration of that choice ought to be, “Is this option just, in and of itself?” and then whether the demonstrable outcome of the action will “make the boat go faster”. Only after that analysis do the “what if?” and the big picture enter into the equation.

This is how I was coached with regards to Boy Scouts, college prep, financial issues… Dave Ramsey’s version of this is “debt is bad, mmk? Avoid selling your future for unnecessary gains (like one does with a car loan). Use what is on-hand to solve the problem.”

It is impossible to judge the heart of another; for your own sake, you must give them the benefit of the doubt even when judging their actions:

The way I have best seen this expressed is Hanlon’s Razor: “Never attribute to malice that which is adequately explained by stupidity.” Dad has consistently stated and re-stated this claim in some form or another at every occasion I have judged another person. It took an embarrassingly long time to come around to the idea. Philosophically, I call it the “phenomenological/epistemic barrier”. That is, one is privy only to one’s own internal experience; it is impossible to directly apprehend the outside world, especially the internal experiences of others. One has indirect access to others’ behavior (the same way they have access to the behavior of a rock, tree, or beast) but not to the internal experience corresponding to the behavior.

One can, with varying degrees of ease, judge the behavior. For example, dismembering an infant with scissors can easily be identified as the crime of murder, regardless of whether the murderer’s internal experience reflects that behavior. The CIA could have slipped the murderer some crazy drugs, he could be indoctrinated by the medical school system to do so, or he could simply have dementia. I can’t judge his internal experience and call him evil or insist that he is going to hell, but I can say that he has murdered a baby. However, some cases are not so clear-cut and it would not be unjustified to err on the side of caution.

Question the auspices of authority (the only authority is epistemic):

This is something that I think I watched you learn which, of course, is what taught me. My early life experiences like my appendicitis ordeal and elementary school career demonstrated the need for skepticism when interacting with an individual or institution, even if they have the credentials (like an M.D., 100-ish years of history to back them up, or a teaching certificate). The authority of the doctor, teacher, administrator, or priest is not some metaphysical or divine attribute, but instead an epistemic one. The doctor is an authority in medicine insofar as his knowledge of the field is accurate. Not all doctors, teachers, etc. are created equal. Hearkening back to how those who have no standards tend to dissemble and rationalize, those that lack authority tend to lean on their credentials and auspices of authority and, subject to skepticism, are therefore not to be trusted.

Independent research and conceptual reasoning countermand the status quo:

Alongside authority, the status quo is also subject to skepticism. Your rejection (or partial rejection) of vaccines, standard education models, debt-oriented finances, moral/legal equivalence, and the “2.4 kids and a puppy” paradigm is the logical extension of the skeptical approach to the auspices of authority. Independent research can be anything from getting a second opinion from another authority to actually doing the requisite work oneself. Very little on the internet is true, of course. For that matter, very little outside the internet is true, either. This makes independent research incredibly difficult; by extension, that difficulty makes finding an actual authority equally difficult.

What, then, can one rely on when searching for factual or true knowledge? Conceptual reasoning can guide the process, at least. The application of careful deduction, induction, and abduction is ultimately the only tool one has in discernment between different claims, authorities, or options. Of course, like a hammer and nails, reason is useless without experience. All epistemic crises aside, the facts one is able to discern as immediate and actionable often come into conflict with and overcome the status quo. That’s because the status quo is an emergent property of human nature.

The human condition is such that utopia and systematization are impossible:

Back in my Marxist days, dad frequently said things like “people don’t work that way”, “You can’t program society like a computer”, and “who is going to program the computer you put in charge?” Meanwhile, mom was vocally denouncing standardization, especially in education but also in medicine and just about everything else. That, coupled with the Scriptural education you provided, paints a pretty clear picture about the relationship between the human condition and utopia. Utopia being the Greek word St. Thomas More made up which means “no-place”.

Namely, that relationship is radically irreconcilable. In spite of rejecting gnosticism, I am certain that corporeal paradise as we can conceive it, is fundamentally opposed to the human condition. This is not a failing of the human condition, but instead one of utopia. Utopia, in all of its implementations, requires humans to be standardizable, equal, replaceable, and incapable of growth or change. Humans are none of those things; attempts to make them such are doomed to failure.

Coercion doesn’t work, and neither do rules:

Coercion is essentially any engagement which can be reduced to, “Do/don’t do X, or else.” In hindsight, almost every moral crisis I had faced until recent years was a result of being coerced. Sometimes, the coercion was an explicit statement as above. Other times, the coercion was inferred from consistent exposure to the above statement or the behavioral equivalent. I don’t want to air dirty laundry, new or old, especially as everything is essentially forgiven and forgotten or is still a secret and not yet beyond the statute of limitations. Having been on both the giving and receiving end of coercion, even in the form of rules that are “for your own good”, I have seen how such behavior does infinitely more harm than good and, on a long enough timeline, ultimately fails to accomplish its intended end. Besides, the ends do not justify the means and coercion undermines the human dignity of the victim in every instance.

Contracts are bullshit:

This is something I have to pin on dad, so you can skip this portion, mom. This comes primarily from our discussions on social contract theory. I unknowingly used to place undue metaphysical belief in the social contract. You brought this to my attention by demonstrating how the social contract has no effect on the physical world. In a world such as Hobbes’ state of nature, there is no difference between two people backstabbing each other over a limited resource and the leviathan’s people/leaders backstabbing each other over other issues. The social contract has no more effect in the real world than any other metaphysical fairy-tale. I can believe in ghosts all I want, but that will not change your behavior. The same is true for “real” contracts. Ultimately, any contract signed is nothing more than a promise which alludes to the integrity and ability of the signers to uphold that promise, a la the social contract. Admittedly, there is a difference between the social contract and a “real” contract. That is, a social contract attempts to coerce its “signers” with the boogeyman of anarchy, while a “real” contract attempts to coerce its signers with the threat of government violence. But we’ve already had this discussion.

The dignity of the human person:

More important than the practical issues concerning coercion, there is a moral issue. Being created in the image of their creator and being given a special moral quality which is at the center of salvation history, there is a certain revealed dignity to human persons. Even “natural man”, a.k.a. Pagans, are aware of this dignity, expressed in our reason, will, and relationship to each other and the divine. Actual catechesis aside, you taught me this by way of debate, example, and counter-example, just like all the other items in this letter.

I’m going to circumvent the whole Plato vs. Aristotle, “human being” vs. “human doing” debate and just assent to people possessing their own dignity by virtue of being human. Ultimately, that’s the only available underpinning for individuals’ duties and rights, but I’m trying to avoid getting too philosophical and lengthy in this letter. I’m just going to stick to the duty (or right) to life, in the interest of time. Simply by virtue of our relationship with our creator, humans have inalienable rights. Chief among those, that from which they are all derived, is the duty to life.

Simply put, it means murder is wrong. By extension, coercion (the threat of murder) and theft (depriving one of the resources they use for living) are wrong. Accidental murder, that is, killing someone through avoidable circumstance, is still murder. For example: “If I leave this toxic waste near the well, people may get poisoned and die. Oh well, I will do it anyway.” So, abortion, murder proper, the death penalty, and war are necessarily violations of human dignity. Additionally, abdication of one’s humanity and person-hood is an offense against human dignity. I imagine this is the basis of mom’s paranoia concerning drugs, but I’m not sure. I am sure, though, that intentionally allowing oneself to be objectified, abased, or to lose one’s free will/discipline is a violation of human dignity, just as if one had done the same to someone else.

I guess this is as good a place as any to ask why you changed your mind with regards to the American proxy war in the Middle East. When Bush Jr. wanted to re-invade Afghanistan and Iraq, I fell for the propaganda. You were quick to try and dissuade me from that position. A decade later, I came to your earlier position by a different avenue, that is, by way of the dignity of the human person. I was surprised, then, that mom is so anxious to continue that war and the slaughter of millions of innocents that she tried to dissuade me from supporting. Dad is a bit more coy on the subject, but I think he agrees with mom.

Find what you love and pursue it; make it a tool for survival:

I have a million interests and desires, but they all grow from a root desire which is a love affair I have with Truth and my family. Unfortunately, there is a very limited market for these things in a world rife with lies and captivated with misanthropy. That’s not an excuse, but an assessment of my situation. Why does it matter, though? I mean, the aspect of the “american dream” you preached to me the most was entrepreneurship and the ability to turn one’s loves into a tool for living. So, then, I ought to determine how I and my family are called to live and do what we can to fulfill that vocation.

“If you’re not growing, you’re dead.” Another nice soundbite from dad that I now totally agree with. In each aspect of one’s person, if they are not growing, they are dead. Spiritual, mental, and physical growth, at a minimum, is required for one to uphold one’s dignity and pursuit of Truth/flourishing/perfection/“the good”/whatever. Mental growth is clearly the aspect of person-hood I am most disposed towards, with a constant pursuit of numerous “-logy”s and “-ism”s and such, seeking to ground my rational faculties in Truth. Mental growth alone has its limits. To pursue mental growth, spiritual and physical growth are required. People and action are required.

I am confident in a great many beliefs I have as to what my own vocation has in store for me, and only slightly less confident in what I feel my family’s vocation is. Of course, to come to such conclusions, I have to constantly work together with them; I know only myself, and must rely on them to know themselves.

Exit Strategy. Have a concrete goal with demonstrable success/failure criteria and have a contingency plan:

There is so much I have to write on this and the preceding subject, as the main initiative for this letter is to try to figure out where our misunderstandings lie in general, but most especially concerning moving to New Hampshire and later fleeing the american empire. Unfortunately, I’m running out of steam for writing this letter, so I’m sure you’ve run out of steam and time to read it.

One of the many books dad is never going to write inspired this one. I know I took his treatise on eschatology and turned it into a practical tool, but you grab truth where you can find it. I don’t know how much I need to expound on the heading, it seems straightforward enough.

So, what?

This collection of beliefs and lessons has obviously influenced my worldview at large. I think I’ve spent far too much space and time exploring these ideas, so I will try to wrap this up quickly. Really, I can’t understand why you would be so dismissive and crude about the things I have come to understand and what I intend to do. I totally understand disagreeing, as we have always had disagreements, but those disagreements were (generally) calm and rational. Yelling, name-calling, and repeating fallacies is unproductive and neither calm nor rational. It certainly won’t change my mind as previous discussions have.

I don’t find the beliefs I have to be too extreme. Due to the dignity of the human person, no one has the right to murder, coerce, or steal from another. One has a duty to life, in the fullest philosophical sense of the words. One has an obligation to uphold whatever responsibilities and obligations one takes on. One must have rational justification for one’s actions, derived from these first principles.

I find myself in a position where I have taken on the responsibility for the well-being of four other people whom I love dearly. I have this responsibility in the midst of a disturbing situation. This situation is one where I live in a culture centered on misanthropy and death. A society where my children and I are treated as livestock, coerced into various behaviors by the perpetual threat of murder, routinely stolen from, and ridiculed for pointing these things out. A brief study of history demonstrates an unavoidable cycle of imperialism, that we are currently in one of those cycles, and the fates of those unable to predict such historical cycles. Most importantly, the situation is such that a murderous gang of kidnappers with no accountability, far more firepower than I possess, and a predilection for kidnapping children from those who have beliefs such as mine operates in my neighborhood (funded by the money stolen from me, no less).

A simple cost/benefit analysis reveals a clear course of action, especially when the well-being of my children, all the way down to the state of their immortal souls, hangs in the balance. We must assess what fundamental needs we have, what desires we have, and how to change our environment to best fulfill those needs. In order to achieve the flourishing we seek, we must be able to avoid or counter the coercion, murder, and theft we may encounter. That is categorically impossible where we currently live; therefore, we must go somewhere else. We must go somewhere where we will either not encounter such things or have more of a fair fight against them. The simple matter of fact is that it is too late in this place to fight back, and I don’t want myself or my children to face the circumstances that naive Catholics faced in first-century Rome, 18th-century Prussia, 20th-century Poland/Germany/France, and at least a dozen other places and times.

I am fully aware that I am to be a martyr, but martyrdom comes in all shapes and sizes. I would like to be a martyr worth emulation, even if never recognized by historians. I would not hesitate to kill or die for my children, so why should I hesitate to forego creature comforts and worldly status? If the status quo is such that I could take advantage of criminal activity, imperial decadence, and misanthropic agendas if only I would forego my conscience or “move to Somalia”, I would side with morality, reason, and my conscience. Not for my sake, but for my kids, so that they will not have this dilemma foisted on them because I didn’t feel like addressing it.

I don’t need you to understand. I don’t need you to agree with or condone my ideas or actions. What I need is to understand you and your actions, and to give you a chance to prove me wrong. I wrote this down so you could read it at leisure and approach the discussion more calmly and rationally, and so that you could see that I still value our relationship and your opinions, even if they are wrong.

 

The Dark Side: Crime, Vice, Sin

Today, we explore the dark side of humanity: crime, vice, and sin.

As readers of previous posts and my facebook page are well aware, I use these terms quite frequently. I have come to realize that, despite my best efforts to contextualize the use of these terms, many people are either unable or unwilling to understand what I mean by crime, vice, and sin. Today, I plan on setting things straight such that I don’t have to explain it quite as frequently.

As can easily be guessed, being a philosopher and an anarchist, I do not believe the contemporary and common use of the term “crime” is valid. As I have expressed already, the laws of man are inherently unjust; as such, the term “criminal” cannot apply to an identical class of things as the term “illegal”, as is commonly assumed in our culture. Instead, I define a crime as any action intentionally or negligently directed at the invasion or destruction of another’s life, liberty, or property. In other words, it is an action which violates someone’s rights or duties.

Easy examples consist of incidents of murder, coercion, and theft. Some such instances of these crimes are difficult to discern outright, as would be the case of unreasonable bank fees, protection rackets, systematic coercion, or deprivation of life essentials. There exist any number of examples that could be presented. It is crucial to have a clearly defined set of necessary and sufficient conditions for what is to be considered a crime for reference in these more veiled instances of crime, given the dire consequences.

I doubt anyone is reading this, let alone anyone accepts or wishes to help me refine these conditions, but I am compelled to attempt a definition. The result should be intuitive, but still analytically sound such as to justify one’s response. I believe that if one demonstrates resolve with regards to performing an action, has a demonstrable ability to perform such an action, and the action in question is an immediate or direct and demonstrable causal violation of someone else’s life, liberty or property, the action in question is a crime. In this way, holding a gun to someone’s head and demanding a particular behavior or taking someone else’s property without consent is a crime. Conversely, making idle threats, wishing cancer at people, and using incandescent light bulbs are not crimes as they do not meet the conditions I have outlined to be necessary and sufficient.
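Purely as an illustrative sketch, and not part of the original argument, the three conditions above (demonstrated resolve, demonstrated ability, and a direct, demonstrable causal violation of life, liberty, or property) can be encoded as a simple predicate; the class and field names here are my own invention:

```python
# Illustrative sketch of the necessary-and-sufficient conditions for "crime"
# described above. All names are hypothetical, chosen for readability.

from dataclasses import dataclass

@dataclass
class Action:
    resolve_demonstrated: bool     # the actor shows resolve to perform the act
    ability_demonstrated: bool     # the actor demonstrably can perform the act
    direct_causal_violation: bool  # the act directly violates life, liberty, or property

def is_crime(action: Action) -> bool:
    """Each condition is necessary; jointly, they are sufficient."""
    return (action.resolve_demonstrated
            and action.ability_demonstrated
            and action.direct_causal_violation)

# Holding a gun to someone's head and demanding a behavior: all three hold.
armed_robbery = Action(True, True, True)
# An idle threat: no demonstrated ability, no direct causal violation.
idle_threat = Action(True, False, False)

print(is_crime(armed_robbery))  # True
print(is_crime(idle_threat))    # False
```

Note how this captures the examples in the paragraph above: armed robbery satisfies all three conditions, while idle threats, wishing cancer on people, and using incandescent light bulbs each fail at least one.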

Now is a good time to point out why my definition of a crime possesses more utility than the non-aggression principle (NAP). The most commonly accepted iteration of the NAP can be and is used to justify coercing, stealing from, and even murdering people for things like using incandescent light bulbs, belonging to a different community, smoking tobacco, driving a car, refusing vaccines, and just about any other non-criminal action that could be considered a nuisance by some. These justifications are logically consistent when using the NAP as one’s initial premise. Of course, attempting to do such things to someone for using the wrong light bulb is, itself, aggression. The issue hinges on people’s definition of “aggression”, and any definition which does not result in counter-intuitive or absurd claims will be equivalent to my definition of crime. A similar issue arises with the less popular objectivist “non-initiation of force” principle.

[Image: a handy flowchart I found that explains this reasoning]

If I were to simply claim that my definition of crime and prescription as to how to handle it were the extent of moral and ethical reasoning required, we may very well witness a conservative’s nightmare: legions of communist, polygamist, sodomites freebasing coke and praying to Allah simply because it isn’t a crime to do so. Of course, it’s equally likely that we would see a liberal’s nightmare emerge: mobs of tobacco-chewing, corporatist, racist, fundamentalist Christians chugging liters of soda while deforesting the Amazon. What I am alluding to, obviously, is that there are courses of action which are not crimes but are not conducive to human flourishing. The main focus of this portion of the post is vice. A vice is any non-criminal activity which would prevent or inhibit the participant from pursuing their telos.

Again, I am guilty of referencing my still-unfinished book. A quick primer is in order. “Telos”, a Greek term which has been at the center of philosophical discourse since Aristotle, essentially means “end” or “purpose”. I argue that any individual is beholden to a certain hierarchy of teloi (plural of “telos”), but that is a discussion best left to my book or later posts. For now, we can simply say that eudaimonia is any individual’s ultimate goal. Another Greek word: “eudaimonia” is a very technical and precise term which, for our current uses, can be reduced to “free and productive flourishing”.

Any activity which would limit one’s freedom, productivity, or well-being can be considered a vice. Addiction, mind-altering substances, dependency, time-wasting activities, body-harming practices, character-undermining activities, prophylactics… essentially the traditional list of vices are good examples of what can be considered a vice.  Now, am I a tee-totaling puritan hellbent on avoiding anything fun? I play video games, drink alcohol, smoke cigars, stay up late, work a 40-hour wage-slave job, and so much more. I am still dependent on others’ skills and resources. I still rely on less-than-perfect activities to sublimate my aggression and discomfort. I still use Google, Facebook, and Windows. In other words, I still have my vices.

As I will likely discuss in an upcoming post, the virtues of prudence and temperance are paramount in flourishing. With regards to handling vice, prudence and temperance are also key. While it would be ideal for people to simply commit to being a taoist or stoic sage, an ascetic monk, or whatever and eschew all vice outright, it is not entirely possible and may, itself, be a vice of sorts. Instead of abandoning the real world for some gnostic exercise in death, most people may flourish best by approaching their own vice from the perspective of a responsible cost/benefit analysis. There is a reason I smoke cigars rarely as opposed to mainlining heroin daily.

Whereas “How do I deal with criminals?” warrants a near-infinite number of discussions, “How do I deal with a vicious person?” is pretty straightforward. If one’s vices are, in fact, vices and not crimes, they ought to be free from coercion, murder, or theft, like any other human being. If their vices are beyond the realm of tolerance, such as someone vigorously masturbating in public, they can be refused service, reprimanded, shunned, etc. The social norm can be enforced without resorting to criminal actions against someone. Social norms, tolerance, and exile are ideas that will be more thoroughly explored when I get around to talking about cities, the Dunbar number, and intentional communities.

If any of my nine readers are Christians, they are likely pulling out their hair and screaming, “SMOKING WEED WILL LAND YOU IN HELL!” I jest. In all seriousness, though, a great many vices and all crimes are sins. If a crime is someone violating another’s rights and a vice is someone preventing their own flourishing, where is sin in this whole mess? I’m going to try to keep this short and sweet. So far, I’ve written very little on relationships. There are a handful of reasons this is the case, but now I’m compelled to do so.

Sin is relational. I can pretend that I have a relationship with you, my anonymous, silent reader. If I start hiding pictures of my manhood in my posts or if every post were to gradually devolve into senseless diatribes against Ronald McDonald and the lizard Jews, I would be damaging my relationship with those of you who expect philosophy from me. I would be sinning against you.

If I am in relationship with an omnipotent, omniscient, omnivalent, omnibenevolent, omni-omni, being… especially one that created me personally for the sake of us coming into full communion with each other… any action which would make me less omni-omni and therefore less able to come into communion with Him would be a sin against Him. The same applies to any action which would otherwise damage our relationship.

TL;DR: If someone is intentionally and willfully acting in direct violation of another’s rights, they are committing a crime. If someone is doing something which prevents or inhibits human flourishing but isn’t a crime, they are committing a vicious act. Sin is any activity which damages a relationship. In this way, sins against God would be actions which damage one’s relationship with God. As always: you ought to defend yourself from criminals, reprimand and ignore vicious people, and avoid sin.

Paradigmatic Awareness

Why can’t we all just get along? When it comes to discussion, why can’t we seem to understand one another?

As is outlined extensively in my yet-unfinished book, epistemology (how we know what we know) is a field of intense and voluminous study. I will do my utmost to remain concise and direct today, but we will see if I can manage to get my point across.
Among thinking people, there is a disturbing trend of missing each other’s points and progressively resorting to name-calling and physical altercation. Friendships end, wars erupt, libraries are burned… all over a misunderstanding as to whether Star Trek: The Original Series is better or worse than J.J. Abrams’ reboot. This phenomenon is easy to see every four years in America, when just under half of the population suddenly erupts in closed-minded and aggressive rhetoric over which master we should be owned by and what behaviors we ought to compel with the violence of the state. For many people, this argument continues on a daily basis (Thanks, Obama).

Very, very rarely does one actually change one’s mind or admit that one was wrong. On the occasion that one does so, it is rarely a result of dialogue, but instead a result of a personal and concrete experience of one’s worldview failing to comport with reality. This sort of event is at the heart of every popular feel-good drama about a grouchy old person overcoming his racism. My purely subjective standard by which I judge a philosopher’s ability to philosophize is their willingness and ability to change their mind and admit error by way of dialogue as opposed to concrete experience.

While very few people may be called to be philosophers, everyone ought to be capable of and willing to do philosophy, lest they be vulnerable to misanthropy, self-dehumanization, and falling for vicious and criminal ideologies. What is required in order to do philosophy? There is a multitude of tools required and yet another multitude of tools that are merely useful. The first two, the most fundamental and primary, of these tools are logic and paradigmatic awareness. Of course, one is a prerequisite for the other.

What is logic? Logic, contrary to popular belief, does not refer to “all of the not-emotional things that happen in my brain”. Logic is a science and an art as old as man’s pursuit of knowledge. As a science, the body of theories and research has been steadily growing through the generations. As an art, the technique and skill of those who wield it waxes and wanes with times and cultures. Logic is the place where language, reason, and objective observation meet. Logic, in its purest form, is the exploration of the principle of non-contradiction and its application to our experience of reality. The quest for knowledge requires a reliable and finely-tuned toolset. The study of logic, epistemology, and phenomenology has been directed towards the development of these tools since their inception.

Even though some high schools teach introductory classes on deductive symbolic logic and may touch on inductive reasoning, logic has been widely abandoned by our education system and, by extension, society at large. Without a working knowledge of and praxis concerning deduction, induction, abduction, and the interrelationship of the three, one cannot be expected to be consistent in one’s beliefs, claims, and behaviors. Unfortunately, a blogcast of this length and quality is insufficient to teach such a skill. Fortunately, there is a vast body of material available on the internet for those who wish to be rational.

A grossly oversimplified and brief introduction to the three is required, though, before I can address paradigmatic awareness. Deduction, then, is described as “arguing from the general to the specific”. A classic, if not entirely reliable, example is the famous “all men are mortal” syllogism.
“All men are mortal. Socrates is a man. ∴ Socrates is mortal.”
In this case, it assumes general premises such as “all men are mortal” and uses the principle of non-contradiction to reach the conclusion, “Socrates is mortal.” So long as the premises are factual and there is no error in the logic, the conclusion must be true.
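For the programmers among my nine readers, the syllogism above can be caricatured in a few lines of Python — a toy sketch of the deductive form, not a theorem prover; the function and names are my own invention for illustration:

```python
# Toy deduction: apply the general premise "all men are mortal" to a specific case.
# The rule and the set of individuals are invented for this example.

def deduce_mortal(individual: str, men: set[str]) -> bool:
    """Conclude mortality for the individual if both premises hold."""
    all_men_are_mortal = True          # general premise, taken as given
    is_a_man = individual in men       # specific premise
    # Valid form + true premises => the conclusion cannot be false.
    return all_men_are_mortal and is_a_man

men = {"Socrates", "Plato"}
print(deduce_mortal("Socrates", men))  # True: Socrates is a man, so Socrates is mortal
```

The point is the shape of the reasoning: a general premise plus a specific premise yields a conclusion that cannot be false so long as the premises are true and the form is valid.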
Induction, in simple formulation, is arguing from specifics to the general. An example frequently addressed in modern philosophy is the claim, “the sun will rise tomorrow.” This claim is based on the consistency of such an occurrence in the past as well as the absence of any predictors indicating that it would cease (for example, the sun vanishing would leave some pretty significant clues). Induction does not produce certainty in the way that deduction may, but instead produces well-reasoned and reliable guesses which have a particular utility about them.

Abduction can be considered “making the strongest case”. If a question presents itself which requires an answer and neither a deductive nor an inductive argument is possible, one can produce an answer which does not contradict accepted deductive and inductive claims and is, itself, self-consistent. Using tools such as observation, Occam’s razor, intuition, and a detailed understanding of one’s paradigm (we’ll address this in a minute), one can make a compelling case as to why their chosen belief is true.

This brings us to the interrelation of the three. Due to the certainty produced by valid deductive reasoning, one’s inductive claims cannot come into contradiction with such claims. If one is committed to a particular inductive claim which is found in contradiction with deductive claims, they must first demonstrate a flaw in the premises or logic of the existing deductive claim. This same priority is given induction over abduction for the same reasons.

Of course, this description ignores the source of the general premises with which this whole process began. In reality, premises are produced by abductive reasoning and ratified by the simple Popperian principle of trial and error. This means that, per Gödel, any complete philosophical worldview cannot prove itself to be factual. Only by comparing a worldview’s predictions and claims against one’s experience of reality, or by confirming the strength of the premises’ defense, can one ultimately justify any particular worldview.

This finally brings us to paradigmatic awareness. To those who have read this far: I salute you. Using a modified version of Thomas Kuhn’s definition of “paradigm”, a paradigm is the set of established or assumed claims which take priority before the claim in question, based on the rubric I briefly described when addressing logic. Why does something so simple-yet-esoteric matter? It may sound intuitive once described, but despite its intuitive qualities, very few (if any) people truly possess paradigmatic awareness.

For instance, when faced with a claim one may find absurd, such as “We need to tax every transaction possible in order to pay for government guns,” it is possible that the (clearly incorrect) individual has a valid logical argument to reach that conclusion. More likely, they hold, either implicitly or explicitly, flawed premises from which they derived an absurd conclusion. There is really no point in discussing the conclusion itself so long as the premises are left unacknowledged and unaddressed. Communication simply isn’t possible without commonly accepted paradigms between communicants.

This is where the standard of being able to change one’s mind comes into play; in the process of exploring the premises held by someone else which resulted in an apparently absurd claim, three beneficial results may arise. In exploring the paradigm of someone else, you may bring to light counter-intuitive or implicit premises that your conversant may never have previously critically assessed. Additionally, it will give you the opportunity to cast doubt on another’s premises, allowing them the otherwise impossible moment of self-reflection. Lastly, of course, by entertaining a counter-factual presented by someone else, there is always a chance (however slim) that you may realize that you, yourself, are wrong.

Now, one should not always explore others’ worldviews without expecting the same intellectual courtesy in return. By following the advice given above and explaining what you are doing along the way, you can effectively provide an education in communication skills and logic that far exceeds the meager offerings most people are exposed to. This will give them a greater chance to entertain your correct but unpopular claims like, “Taxation is theft.” Additionally, anyone unwilling to explore their own premises or yours is clearly not interested in intellectually honest dialogue directed at obtaining truth and, therefore, is not worth your time or energy; a handy resource-management tool, if you ask me.

So, why can’t we get along? Because no one is given the tools required to even consider getting along. Why can’t we understand one another? Because we don’t try hard enough. Remember: no unwilling student can learn, and that includes yourself.

TL;DR: Listen to what people claim. Ask, “How did you reach that conclusion?” Make it a point to maintain an awareness of your opponent’s paradigm. Genuinely search for the truth in their words. Expect and demand that they reciprocate the effort, lest you waste both parties’ time and energy.
As I said on Facebook the other day (while re-realizing some flaws in the AnCap worldview):
I love being a philosopher. My worldview is constantly shifting and undulating… but always gradually comporting itself more closely to reality. Where fleeting moments of intuition can, decades later, be given meaning and purpose and carefully constructed arguments and justifications can crumble, there is where humility and virtue can grow. The fires of truth and the crucible of reason can lay bare natural and artificial landscapes of mind alike, and enrich the soil for new growth and the return of the most robust ideas to carry on their existence.

Surprise! Another Post

On rare occasion, I am surprised. Sometimes, it is something as mild as hearing a decent song on the radio. Other times, it is something as extreme as finding scorpions in my hair. Yesterday, I was surprised to be inspired by an atheist podcast I listen to… so here’s what I was inspired to write about. Surprise can be unpleasant, hilarious, or any blend of the sensations in-between. What, exactly, is surprise? A neurobiologist with a higher IQ and a worse social life than my own may be able to answer this question better, but I thought it was worth exploring.

I contend that surprise occurs when someone experiences a state of affairs contrary to their noetic framework. An easy example would be when an evil clown appears before you and you shit your pants in surprise.

The cause for surprise is not the clown itself; it is the experiential contradiction to one’s noetic framework. In this example, it is the implicit (or explicit) belief one holds which states, “I live in a world in which evil clowns do not appear before me without warning,” being violated which causes surprise. Other common beliefs which are frequently upset include “this is the last step in a flight of stairs”, “you’ll love this joke”, and “my bed isn’t full of spiders”. That gut-wrenching shock occurs simply because those beliefs were incontrovertibly disproven.

A great many of our entertainment dramas play off of this reality. Coming-of-age flicks like “My Girl”, feel-good dramas like “Gran Torino”, horror films like “Alien”, etc. all demonstrate or assume the audience’s or protagonist’s belief structure and proceed to surprise the audience and protagonist over the course of two-ish hours. Showing the protagonist and audience that the world (either the real one, or the fictional one which is the center of attention) doesn’t work the way they thought it does is pretty much the singular impetus of the plot.

But, why should someone care about surprise? Well, as it turns out, it took me about two years to come up with an answer to that question. I was surprised when presenting this idea to my wife… she got mad at me, which was unexpected. It turns out, two years ago she brought this idea to my attention, but I couldn’t find a place in my worldview that could be enriched by such a line of questioning… and so I forgot the conversation altogether. </anecdote>
You may laugh, but my newly-realized reason for caring about surprise is an ethical one. As any poor soul still reading this post ought to know, I am a virtue ethicist. What does surprise have to do with human flourishing, though? Well, the connections are twofold.

First, surprise is an opportunity for discipline. When one is surprised, as I already explained, it is because one is faced with a reality distinct from the one in one’s head. In science, this is called a “discovery” or “falsification” (in my under-caffeinated state, I can’t remember exactly what the rubric is for declaring something a “discovery”). In a horror movie, it’s called “being dead”. What it really is, though, is an opportunity to correct one’s beliefs and resultant behavior.

For example, if one consistently wins at a competition of skill (e.g. chess, first-person shooters, martial arts) and is surprised by a loss, it is an opportunity to fill whatever blind spot they had. If they have a demonstrably superior physique or mind, the loss must stem from a blind spot in their knowledge of their particular sport. After a surprise loss, they can survey the playing field and the actions of their opponent with a new perspective, analyzing which implicit beliefs resulted in their loss. Another example: if one is surprised by a bed full of spiders, they are given the opportunity to incorporate that knowledge and develop the habit of checking the bed before staggering in and collapsing in a drunken heap. Maybe they could even discern the cause of a bed full of spiders and develop habits which prevent such a possibility in the first place.

I used to be surprised quite frequently in my younger years, probably due to the fact that I was an immature, insufferable know-it-all. Nowadays, I am pleasantly surprised at the rare occasion of surprise in my life. This brings me to the second reason a virtue ethicist would be concerned about the nature of surprise: surprise can serve as an excellent self-diagnostic tool. The frequency and trends of a person’s surprise can express to the surprisee their general attitudes and their epistemic strengths and weaknesses. This, again, is divided in two ways: determining the cause of one’s lack of surprises and revealing epistemic blind spots. In the case of lack of surprise, I can think of three reasons one would be infrequently surprised:

  1. They have an unusually accurate worldview, resulting in few instances where they would be surprised by inaccuracies.
  2. They are a Taoist sage, with a certain expectation of epistemic inaccuracy built into their worldview, “It’s not surprising that I was wrong, as I am always wrong” or, alternatively, “I hold no beliefs… so none of my beliefs can be shown false.”
  3. Or, this person could just be a total jerk. “I knew that all along”, “Did I just think of that? I had to have… because I am the greatest”, “That can’t be an evil clown standing in front of me… because I didn’t predict that it was possible.”

While it is the case that a virtue ethicist such as myself would insist that one strive for omniscience, resulting in a total lack of surprise due to cause #1, I am aware that such an achievement is impossible for a human being qua the human condition. Therefore, the most practical solution to the question of surprise would be one of fine-tuning. Finding the appropriate blend of omniscience, Taoist apathy, jerkiness, and surprise-ability is likely to be the most direct path to flourishing with regards to surprise. Despite the credit I would like to give myself, I don’t think I’ve yet found the appropriate balance of the four… I’m likely less surprised simply because I’m now a mature insufferable know-it-all.

The second useful diagnostic tool that surprise provides us with is one of trends. If someone is frequently surprised by similar things, for instance that the people around them are smarter than expected, they likely hold an implicit belief that everyone around them is an idiot. Alternatively, if one is consistently surprised that the guy they are dating is a jerk, maybe they have an implicit set of beliefs that gives them a poor taste in men. These can also be positive surprises. For example, if a shy person with low self-esteem presents a rare idea to a group and the idea is surprisingly well-received, then there is likely a set of implicit beliefs that leads the shy individual to underestimate their own intelligence.

By keeping a record of one’s surprises, one is more likely to find the appropriate fine-tuning of one’s behaviors and worldview in order to flourish. As always, knowing oneself is most of the battle where virtue is concerned, and surprise can be a valuable asset in the discovery of oneself.
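For what it’s worth, such a record needn’t be fancy. Here is a hypothetical sketch of a “surprise journal” in Python — the class, the tags, and the trend threshold are all my own invention, not a prescription:

```python
# A minimal "surprise journal": log each surprise with a short tag, then look
# for recurring tags, which hint at an implicit belief worth examining.
from collections import Counter

class SurpriseJournal:
    def __init__(self):
        self.entries = []  # list of (tag, description) pairs

    def record(self, tag: str, description: str) -> None:
        self.entries.append((tag, description))

    def trends(self, threshold: int = 2) -> list[str]:
        """Tags recurring at least `threshold` times suggest an implicit belief."""
        counts = Counter(tag for tag, _ in self.entries)
        return [tag for tag, n in counts.items() if n >= threshold]

journal = SurpriseJournal()
journal.record("underestimating-people", "Coworker solved the problem first")
journal.record("underestimating-people", "My idea was well-received at the meeting")
journal.record("spiders", "Bed was full of spiders again")
print(journal.trends())  # ['underestimating-people']
```

A recurring tag is exactly the sort of trend described above: evidence of an implicit belief quietly generating the same surprise over and over.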