Thursday, July 28, 2016

A Picture Is Worth A Thousand Words - or is it an indication that we are no longer capable of prolonged thought, that our mental capacity is waning?


I love those quirky sayings that crop up on social media and other sites on the Internet. Here's one I came across recently:


"A Picture Is Worth A Thousand Words" is an old saying and we appreciate that a picture can speak multitudes in a flash of insight, but are we taking this to an extreme? Have we gone beyond the pale in this age of online social networking? I am continuing to challenge the validity of some of the myriads of posters we see that are all too glibly believed and lauded as truth on social media sites.

In a previous post I demonstrated (I hope) that not everything these truth posters seem to say so eloquently, and with so few words, should be taken at face value. And this is not the first time; here are a couple that, with some thought, proved to be not quite what they seem.


Here's another one and I think it relates a little to the previous one:


Let's just do a little interrogation of the thinking behind this sandwich board. 
  • Didn't the person who wrote the slogan already implicitly hold the belief that it would be beneficial to make this belief known? 
  • Is it not true that the person who wrote this slogan about belief wants us to believe them, with the goal of producing better behaviour, because they themselves have faith that right beliefs underlie right behaviours?
  • Hasn't their faith in the power of beliefs to cause better behaviour, already been assumed by the behaviour involved in producing the sign?
However sarcastic my comments might seem, I don't suppose that anyone misses the important truth implied by the sign. There are differences in the degree to which people hold beliefs. Some give mental assent to beliefs without any consequent change in behaviour. As La Rochefoucauld wrote, "Hypocrisy is the compliment that vice pays to virtue."

Here is a variation on the sandwich board, and another reason why it is simplistic, and erroneous more often than it is true. Like the sandwich board, the idea is (apparently) that beliefs are not responsible for actions. What it fails to account for is the reality that we can hold some beliefs that we don't act on, and which are therefore mere intellectual assent, while other beliefs are held on a deeply existential basis and are the fountain of all our actions.







Friday, July 8, 2016

In Memory of Elie Wiesel

Is it necessarily true that when an outspoken, passionate eyewitness to terrible atrocities passes from this life, we have entered a period in which we are less likely to remember the lessons of history? If that is indeed the case, we are in for even more perilous times ahead.

Elie Wiesel, a Jewish writer, professor, political activist, Nobel Laureate and Holocaust survivor, passed away over the weekend of July 4, as the United States of America celebrated its independence, an independence that marked what has been described as the greatest experiment in liberty the world has known. In his best-known work, "Night", he describes in gruesome, graphic detail the prolonged death by hanging of a young boy at the hands of the Nazis. The older men hanged with him died within seconds because of their greater body weight; the young boy suffered for half an hour.

An onlooker was heard to mutter under his breath with increasing desperation, "Where is God? Where is He?" From out of nowhere, Wiesel says, a voice within him spoke to his own heart, saying, "Right there on the gallows; where else?"
It was Wiesel's commitment to make sure the voice of those whose blood cries out would never be forgotten. But even more important than remembering the atrocities of that time, we ought, for the sake of humanity, to be aware of the ramifications of the thought of Viktor Frankl, another Holocaust survivor, who reminds us that these were ideas that had their genesis in the "lecture halls" of universities, quite removed from Hitler, who brought the evil tree to fruition. It was Hitler who took those ideas of eugenics, superior races, and the philosophy of nihilism to their ultimate conclusion; while others were still only thinking them, he enacted these heinous ideas.

The warning is clear. If we continue to seek to understand humanity in terms of an erroneous perception of human nature, we will continue to reap the nefarious results.

What are some of the influences governing today's perceptions of human nature that are being widely promoted and are beginning to take a toll on how we perceive ourselves, and how that perception is played out?

The following is copied from the Discovery Institute. Written by Richard Weikart, July 18, 2008, it traces a history of bad ideas with regard to human nature:
 


The Dehumanizing Impact of Modern Thought: Darwin, Marx, Nietzsche, and Their Followers


Viktor Frankl, a Holocaust survivor who endured the horrors of Auschwitz, astutely commented on the way that modern European thought had helped prepare the way for Nazi atrocities (and his own misery). He stated, "If we present a man with a concept of man which is not true, we may well corrupt him. When we present man as an automaton of reflexes, as a mind-machine, as a bundle of instincts, as a pawn of drives and reactions, as a mere product of instinct, heredity and environment, we feed the nihilism to which modern man is, in any case, prone. I became acquainted," Frankl continued, "with the last stage of that corruption in my second concentration camp, Auschwitz. The gas chambers of Auschwitz were the ultimate consequence of the theory that man is nothing but the product of heredity and environment--or, as the Nazi liked to say, of 'Blood and Soil.' I am absolutely convinced that the gas chambers of Auschwitz, Treblinka, and Maidanek were ultimately prepared not in some Ministry or other in Berlin, but rather at the desks and in the lecture halls of nihilistic scientists and philosophers."1
As a Christian undergraduate in the 1970s, I was drawn to the study of modern European intellectual history in part by the realization that much modern thought had debased humanity, as Frankl suggested. My concerns were originally stimulated by reading C. S. Lewis, especially The Abolition of Man, and several of Francis Schaeffer's works, but they were reinforced by courses I took in intellectual history and the history of philosophy. In my own private studies, I was dismayed by the vision of humanity sketched out in B. F. Skinner's Beyond Freedom and Dignity, which it seemed to me would lead to dystopias, such as the fictional ones in 1984 and Brave New World or the real one described by Alexander Solzhenitsyn in his novels and in The Gulag Archipelago.
A few modern thinkers specifically criticized the "anthropocentric" view that humans are special, made in the image of God. In the nineteenth and early twentieth centuries the famous German Darwinist Ernst Haeckel, for example, blasted Christianity for advancing an "anthropocentric" and dualistic view of humanity.2 Today the famous bioethicist Peter Singer, along with the atheistic Darwinian biologist Richard Dawkins, argue that based on the Darwinian understanding of human origins, we need to desanctify human life, divesting ourselves of any notion that humans are created in the image of God and thus uniquely valuable.3 An evolutionary ecologist at the University of Texas, Eric Pianka, fights overtly against anthropocentrism, even expressing the wish that 90% of the human population will be extinguished, perhaps by a pandemic.4
Often, however, modern thinkers have masked the dehumanizing impact of their ideas by calling their philosophy "humanism" of one form or another, implying that their views exalt humanity. However, most attempts at exalting humanity have ironically resulted in diminishing humanity, demonstrating the biblical truth: "He who exalts himself will be abased."

After the waning of Romanticism in mid-nineteenth century Europe, many intellectuals embraced science as the sole arbiter of knowledge, including knowledge about humanity and society. The renowned, but quirky, French thinker Auguste Comte gained many disciples for his philosophy of positivism, which rejected any knowledge not obtained through empirical, scientific investigation (except, of course, this epistemological claim itself is not subject to empirical demonstration, so it seems to me that his epistemology is self-defeating). Comte hoped to initiate the scientific study of society, coining the term sociology for this endeavor. He was optimistic that a scientific study of humanity would lead humans to practice altruism, another term he coined. Though Comte considered all metaphysics, including religion, unknowable, he wanted to create a religion of humanity which would place humans on the highest pedestal. Most of Comte's disciples, such as John Stuart Mill, embraced his positivist epistemology but rejected his religion of humanity, especially in the ludicrous form he presented in his later writings (which involved many specific religious practices, including praying to a female that one admires).

Though not as prominent as positivism in the nineteenth century, materialism also increased in influence in the mid-nineteenth century. Though positivism rejected all metaphysical claims, including materialist ones, it shared many common features with materialism nonetheless. Both materialists and positivists idolized science as the only path to knowledge. By extending scientific investigation to humanity itself, however, they made assumptions about human nature that were not subject to scientific investigation. Effectively they dismissed body-soul dualism, thus reducing humanity to matter in motion. Also, their insistence that the scientific method could provide knowledge about all features of human life led them to embrace determinism. By the late nineteenth century some prominent thinkers were rebelling against reductionism and determinism, but in the nineteenth century, these views gained currency to such an extent that Francis Galton, the cousin of Darwin and the founder of the eugenics movement, coined the phrase, "nature versus nurture" to frame the intellectual debate over humanity. Galton's phrase is still commonly invoked in intellectual discourse about human behavior.
Galton and many of his contemporaries rejected free will, claiming with circular logic that science had disproven this supposedly antiquated religious conception. (This was circular reasoning because they defined science to exclude free will, and then claimed that science disproved free will). Their insistence on determinism effectively ostracized religious or spiritual conceptions of human nature. The new fields of psychology, sociology, and anthropology, which only became institutionalized in the late nineteenth and early twentieth centuries, generally embraced this deterministic view of human behavior.

By rejecting free will and embracing determinism, Galton and his contemporaries were left with three main options: humans were either the product of their biological makeup, or they were the product of their environment, or they were the product of some combination of heredity and environment. Either form of determinism (or hybrids thereof) reduces humans to inputs, either from internal or external influences. They deny independent human agency and thus strip humanity of any moral responsibility.
In the mid-nineteenth century environmental determinism was more prominent than biological determinism. The philosopher Maurice Mandelbaum argues that one of the ideas dominating nineteenth-century philosophy was the "malleability of man," i.e., the idea that human nature is shaped largely by external forces, such as culture, education, and training.5 The father of John Stuart Mill exemplified this perspective, rigorously educating his son from an early age. Mill became a leading voice in Europe touting the power of education and training in shaping human intellect and behavior. Many mid-nineteenth-century liberals and socialists embraced this vision of environmental determinism.
Karl Marx is a prominent example of a socialist committed to environmental determinism. He called his perspective "scientific socialism," because he believed that his analysis was based on immutable economic and social laws. He was convinced that social institutions and even human nature itself were shaped by economic forces. If economic conditions changed, human nature would change accordingly. In Marx's view private property was the source of all the evils in human society, especially the oppression of the urban workers by the bourgeois capitalists. Private property thus spawned a class struggle in every age. Religion, morality, law, political structures, and other institutions and cultural factors were merely tools of the propertied classes to oppress the unpropertied masses.
Marx's primary motivation was not establishing human equality, though his socialist philosophy did militate toward greater equality. Rather Marx's primary concern was liberating humanity from oppression and tyranny. This is a laudable goal, and anyone who has read Marx's Capital or Friedrich Engels' Condition of the Working Class in 1844 should recognize that Marx had legitimate grounds for complaint. Many factory workers, not to mention the unemployed, lived in squalor and misery. Marx rightly criticized the dehumanizing effects of the Industrial Revolution. Nonetheless, when we examine the practices of Marxist regimes in the twentieth century, we see incredible oppression and tyranny. The quest for freedom was turned on its head. Why?
I suggest it is largely because of Marx's faulty view of human nature. Neither Lenin nor Stalin nor Mao nor Pol Pot nor Castro nor any other Marxist leader could alter human nature by ridding their society of private property. Changing the economy could not bring about utopia, because human behavior is not determined solely by the economy. Marxist philosophy failed because it denied to humanity its spiritual character, its free will, and also the Christian insistence on original sin. Alexander Solzhenitsyn clearly depicted the Soviet problem with altering human nature in his novel, One Day in the Life of Ivan Denisovich. In this novel the prisoners in the Soviet labor camp, who are supposedly being reeducated to become good Soviet citizens, continue to act as capitalists in any way they can, even while incarcerated. The protagonist expressed at one point that the Soviet regime simply could not change his nature.
In the late nineteenth century, especially by the 1890s, the pendulum swung away from environmental determinism, and biological determinism increased its influence among European thinkers. Galton was a pivotal figure in this development, publishing his seminal work, Hereditary Genius, in 1869. Galton’s influence was profound, especially since he convinced his cousin Charles Darwin that heredity was more important than environmental influences in shaping human intellect and behavior. Many Darwinists in the late nineteenth and early twentieth centuries came to believe--as Galton and Darwin also did--that many human character traits, such as loyalty, thrift, and diligence (or on the negative side--deceit and laziness), were biologically innate, not malleable moral traits, as most Europeans had previously thought.
Darwinists in various fields--especially in biology, medicine, psychiatry, and anthropology--were in the forefront promoting biological determinism. Cesare Lombroso, the famous Italian psychiatrist who founded criminal anthropology, built his ideology on Darwinism. He argued that criminals were atavistic creatures, throwbacks to ancestors in the evolutionary process. He was most famous for promoting the idea that criminality was hereditary, not formed through environmental influence. One of the most prominent popularizers of Darwinism in Germany, the famous materialist Ludwig Büchner, published The Power of Heredity and Its Influence on the Moral and Mental Progress of Humanity in 1882. In the midst of his extended argument for biological determinism of mental and moral traits, Büchner showed where his vision of humanity led. He stated, "In the flow [of time] the individual is nothing, the species is everything; and history, just as nature, marks each of its steps forward, even the smallest, with innumerable piles of corpses."6
By the 1890s and especially in the early twentieth century, the eugenics movement gained popularity, especially in medical circles, in Europe and the United States. Eugenics was driven in part by fears that modern institutions had set aside the beneficial aspects of natural selection. Eugenicists continually played on the specter of weak and sickly human beings preserved through modern medicine, hygiene, and charitable institutions, while the more intelligent and supposedly better human beings were beginning to voluntarily restrict their reproduction. This was producing biological degeneration, according to many eugenicists. Their solution? Introduce artificial selection by restricting the reproduction of the so-called "inferior" and encouraging the "superior" to procreate. Biological determinism permeated the eugenics movement, which pressed for marriage restrictions, compulsory sterilization, and sometimes even involuntary euthanasia for the disabled, because they were deemed biologically inferior.
Another prominent feature of the biological determinism of the early twentieth century was its stress on racial inequality. In Europe racist ideologies proliferated in the 1890s and early 1900s, partly under the influence of Darwinism and biological determinism. Many biologists, anthropologists, and physicians considered black Africans or American Indians less evolved than Europeans. As Europeans colonized vast stretches of the globe, many scientists proclaimed that non-Europeans were culturally inferior to Europeans. Further, they believed that these cultural differences were manifestations of biological inferiority.
By reducing humanity to their biological makeup, these Darwinian-inspired biological determinists contributed to the dehumanization process. Many nineteenth-century Darwinists emphasized the continuities between humans and animals, with Darwin himself arguing that all the differences between humans and animals were quantitative, not qualitative. Darwin even explained the origin of morality as the product of completely naturalistic evolutionary processes. The idea that humans were "created from animals," to use a famous phrase from Darwin, rather than created in the image of God, gained greater currency in the nineteenth century.
Just as one form of environmental determinism--Marxism--produced unfathomable misery for millions of humans, so did biological determinism. Hitler's National Socialism was based on a biological determinist vision of humanity that stressed racial inequality. Nazism endorsed discrimination--and ultimately even death--for those with allegedly inferior biological traits. On the other hand, it hoped to promote evolutionary advance for the human species by fostering higher reproductive levels of those considered superior biologically. Hitler's regime ended up killing about 200,000 disabled Germans, 6 million Jews, and hundreds of thousands of Gypsies in their effort to improve the human race.7
While many modern thinkers, especially scientists, psychologists, and social scientists, have embraced one form of determinism or another, many thinkers have followed the nineteenth-century philologist and philosopher Nietzsche in rebelling against determinism. Nietzsche attempted to rescue humanity from scientific reductionism by positing radical individual freedom. He believed that all knowledge and truth are created by humans, not imposed on us by some external reality. We cannot blame the environment, nor biology, nor God for our character and behavior. Nietzsche rejected the idea that humans have fixed natures or essences. Rather, the choices we make as individuals shape our destiny. Many subsequent existentialists and post-modern thinkers have exulted in Nietzsche's liberation from reductionism and determinism.
While Nietzsche's emphasis on free will might seem to rescue humanity from the degrading philosophies of environmental or biological determinism, it does nothing of the sort. It only elevates a small elite of humanity, whom Nietzsche called the Superman, or more literally, Overman. Nietzsche's freedom was freedom only for these Supermen, the creative geniuses (like himself) who would rise above the hoi polloi. He had nothing but disdain for the masses, whom he thought incapable of exercising true freedom. What Nietzsche contemptuously called the herd instinct of the masses fitted them for nothing other than submission to the domination of the Superman.
Despite its stress on freedom, then, Nietzsche's philosophy is really a philosophy that aims at enslavement. Power ultimately decides not only who rules politically, but also what counts as truth. Nietzsche rejected any form of fixed truth or morality, thus undermining the very notion of humanity and human rights. Nietzsche despised weakness, compassion, and humanitarianism, preferring strength and domination. He was especially vehement in his rejection of Christian ethics, because it catered to the weak and downtrodden. His aristocratic morality aimed at justifying and benefiting the strong and powerful.
In the twentieth century many existentialist philosophers, such as Heidegger and Sartre, embraced the general contours of Nietzsche's philosophy, denying that humans have any fixed essence and stressing radical free will in human decisions. Later in the twentieth century, however, many postmodern thinkers, though heavily influenced by Nietzsche, have reduced the element of individual agency still important to Nietzsche. Many literary scholars emphasized the written text over the author, who disappeared from consideration. Human intent became irrelevant in interpreting human documents. Dehumanization thus spiraled even further downward, as all human values were construed as socially constructed.
Now that I have sketched out in broad strokes some of the dehumanizing influences of modern European thought and culture, I would like to suggest why this should be important to us. Not all environmental determinism leads to Marxism, nor does all biological determinism lead to the Holocaust. Not all existentialism or postmodernism leads to immoral behavior, either. However, false conceptions of humanity can lead to destructive behavior and harmful policies, both by societies and by individuals. It can and does affect the way we treat other human beings. Human rights are meaningless in a world of determinism or social (or individual) constructivism.
The underlying vision of human nature in any society shapes the political and social institutions, the laws, and the entire culture in far-reaching ways. The converse is also true--the political, social, and legal developments in a society influence its view of human nature and the dignity of human life. People who believe that humans are created in the image of God will have different values, ideals, practices, and institutions than those who view humans as merely the sum total of environmental and biological inputs, or those who believe that humans can create whatever truths they desire.
NOTES:
1Viktor E. Frankl, The Doctor and the Soul: From Psychotherapy to Logotherapy (New York: Vintage Books, 1986), xxvii.
2Ernst Haeckel, Die Welträthsel: Gemeinverständliche Studien über Monistische Philosophie (Bonn: Emil Strauss, 1903), 11.
3Peter Singer, Writings on an Ethical Life (New York, 2000), 77-78, 220-21; Richard Dawkins, “The Word Made Flesh,” The Guardian (December 27, 2001).
4Eric Pianka, “Biology 301M. Ecology, Evolution, and Society,” at www.zo.utexas.edu/courses/bio301; accessed 4-3-06; “Student Evaluations [for Dr. Pianka]--Spring 2004,” at www.zo.utexas.edu/courses/bio357/357evaluations.html, accessed 4-3-06; “Excerpts from Student Evaluations [for Dr. Pianka]--Fall 2004,” at www.zo.utexas.edu/courses/bio357/357evaluations.html, accessed 4-3-06.
5Maurice Mandelbaum, History, Man, and Reason: A Study in Nineteenth-Century Thought (Baltimore: Johns Hopkins University Press, 1971).
6Ludwig Büchner, Die Macht der Vererbung und ihr Einfluss auf den moralischen und geistigen Fortschritt der Menschheit (Leipzig: Ernst Günthers Verlag, 1882), 100.
7See Richard Weikart, From Darwin to Hitler: Evolutionary Ethics, Eugenics, and Racism in Germany (New York: Palgrave Macmillan, 2004); and my forthcoming book, Hitler’s Ethic.




Wednesday, June 29, 2016

Loving God and Knowing God - What is the Difference?

At a Ligonier Conference, a panel of speakers answered questions posed by the audience. Among them was one that got me thinking about the word "know", or in Greek, "ginosko".





The question was asked: "What is the difference between knowing God and loving God?" As some have said, it would be difficult to differentiate, because of the nature of God and the nature of humankind; if there is a difference at all, it is a very fine line. Why is that? According to the Westminster Confession of Faith, The Shorter Catechism:

Q. 1. What is the chief end of man?

A. Man’s chief end is to glorify God, [1] and to enjoy him forever.[2]

God, by his very nature, is so worthy of adoration and worship, being perfect in every possible sense, that this is due him from all his creatures. Humankind is constituted in such a way that, when the blindness of sin is taken away so that we may know Him, then in as much as we truly know him, spontaneous worship necessarily follows, as surely as the day follows the rising of the sun, because the original and fitted purpose of humanity is restored to its proper relation to God through the only mediator, Christ. But this fitness of worship is true not only of the special creation exemplified in humanity; the whole of creation is endowed with this "rightness" of worship. At Jesus' triumphal entry into Jerusalem, the people's common adoration of their Lord reached such a pitch that "the whole multitude of disciples began to praise God joyfully in a loud voice for all the miracles they had seen". Seeing what must have appeared to them as inordinate worship, the religious leaders told Jesus to rebuke his disciples, to which Jesus responded: "if they remain silent, the very stones will cry out." [3]

Although the proper relation between God and humankind is, on the part of man, no doubt obligatory, it becomes not a duty but a privilege, when, and to the extent that, the veil is taken away. This privilege, or undeserved honour, by which grace has revealed the ultimate purpose for which we have come into existence, then elicits its proper response, just as surely as the proper response of frost is to give way when the sun begins to light the shadows in which it lies. It is then that we are fit for our original purpose: to enjoy who God is, and to adore Him for what God is.

So, to know Him is to love Him.

Seen in this light we begin to understand the depth of meaning in the word "to know" as it is used in Scripture. It is used in the sense of being in an intimate, loving relationship. This makes it distinct from mere intellectual "knowledge", such as what one might mean when one says one "knows" about God. We therefore view "knowing", as it is often used in Scripture, as being intimately in relation with the person in view. A striking example occurs in Luke chapter one, in Mary's dialogue with her heavenly visitor, who proclaims that, as one who had gained favour with God, she would bear a child, and that the "Lord God will give Him the throne of His father David, and He will reign over the house of Jacob forever. His kingdom will never end!" Upon hearing this Mary exclaimed: "How shall this be, seeing I know not a man?" [4] Immediately we perceive that this Greek word "ginosko", translated as "know", may be, and often is, used in a very intimate sense, connoting a deep relationship. The word in this case must be understood on these grounds, as pertaining to a personal relationship, even, as here, an intimate physical relationship. It cannot mean that she didn't know any man in the merely intellectual sense, since she was already engaged to one. I know of Barack Obama in that intellectual sense; I don't know him in the sense of having met him personally.

Having built, then, a foundation for the word "ginosko", or "know", in the sense often intended, as an intimate and personal way of relating, it is not hard to see that it exists in close relation to the idea of love.

At this point I would remind the reader of God's omniscience. God by nature is perfect, and one of those perfections involves the extent to which God knows his creation. To be omniscient is to have complete, unlimited knowledge. "The eyes of the Lord are in every place, keeping watch on the evil and the good". [5] "Can anyone teach God knowledge, since He judges those on high?" [6] When King David drew Solomon close for instruction he said: "As for you, my son Solomon, know the God of your father, and serve Him with a loyal heart and with a willing mind; for the Lord searches all hearts and understands all the intent of the thoughts. If you seek Him, He will be found by you; but if you forsake Him, He will cast you off forever" [7]

And so, for instance, in the early chapters of Genesis, "when the man and his wife hid themselves from the presence of the LORD God among the trees of the garden" [8], do we suppose that God didn't know where they were or why they hid? When "the LORD God called unto Adam, and said unto him, Where art thou?" [9] Was this for His own benefit, or was He in fact suggesting: Adam, I know where you are and what you are thinking, and why you have attempted to hide, but do you yourself really know what this means? Are you aware of just what you have got yourself into? Do you know where you are, in relation to me, now? Do you know what must now inevitably follow? The day they ate, the prophecy came about: their natural and proper relationship with God ended, and estrangement began. And then God pronounced the curses upon them and their posterity and banished them from the garden.

The importance of this distinction between mere mental agreement ("knowing" as it is commonly understood) and knowing by intimate relation, as it is intended in Scripture, is laid bare when we consider a passage that strikes a deadly chord once we come to appreciate the subtlety.

"Not everyone who says to Me, 'Lord, Lord,' will enter the kingdom of heaven, but only he who does the will of My Father in heaven. Many will say to Me on that day, 'Lord, Lord, did we not prophesy in Your name, and in Your name drive out demons and perform many miracles?' Then I will tell them plainly, 'I never knew you. Away from me, you evildoers!'" [10]

Clearly it cannot be intended from this verse that God in Christ, the one who said "before Abraham was, I am" [11], never knew these self-appointed prophets, miracle workers and exorcists. To understand it in the sense that he didn't know of their existence, or what spirit they were of, is to deny a fundamental attribute of the Godhead: His omniscience. Therefore it must be seen in the light that we are now considering. He must mean that he has no relationship with these people. They laid claim to knowing him, but he disowned them in terms of a relationship: "I never knew you", he said with emphasis. He, of course, knew of them, but he bore no relationship to them. They were not his people.

Now we turn to a derivative of the word “ginosko” (“to know”): the Greek word “proginosko”, which means “to know beforehand”. We can appreciate the prefix when we consider the commonly used medical term “prognosis”, which a doctor uses when she predicts the course of a disease and its outcome; it has the same origin as the word we are studying. So the word has the future in mind. It is predictive knowledge.

Turning again to the scriptures, we learn from the context what intent the author had in mind when using this word. With this in mind, it requires no stretch of the imagination to understand that “proginosko”, or foreknowledge, speaks of God’s omniscience in relation to the future. Indeed the scriptures leave nothing to the imagination in Isaiah 46:10,11:


“Declaring the end from the beginning, and from ancient times the things that are not yet done, saying, My counsel shall stand, and I will do all my pleasure:...yea, I have spoken it, I will also bring it to pass; I have purposed it, I will also do it.”

In Romans chapter 10, Paul speaks of his love for his own people, prays for Israel’s salvation, and lays out how they have gone astray by trusting in a righteousness that is not according to faith but according to the law. In the spirit of reconciliation he explains:


“...there is no difference between the Jew and the Greek: for the same Lord over all is rich unto all that call upon him. For whosoever shall call upon the name of the Lord shall be saved.” [12]

Drawing from various sources in the Old Testament, Paul explores the advantages Israel enjoyed in having the oracles of God, advantages by which they were made the more guilty because of their subsequent disobedience. “To whom much is given, much is required.”

He is careful to raise the question: If Israel is God’s chosen people, then why don’t they (by and large) accept the Gospel? Is it because they haven’t heard? Is it because they weren’t sent a preacher? Why is it they don’t believe? It is important to realize that Paul declares their guilt in not believing the Gospel from the Old Testament itself. He even shows that the Old Testament prophets Moses and Isaiah foretold a coming day when that which they hoped for, the God whom they sought, would be hidden from them but revealed to those who were not looking. That revelation was given to another people, the Gentiles, to provoke them to jealousy.

Having built his case demonstrating their estrangement, he then asks:

I say then, Hath God cast away his people?

Which he immediately answers:

God forbid. For I also am an Israelite, of the seed of Abraham, of the tribe of Benjamin.

As if to say, “Of course not. See, am I not an Israelite? Have I not believed? Have I not been saved?” But then he qualifies this affirmation concerning Israel with the statement:

God hath not cast away his people which he foreknew. [14]

And so we come crashing back to the relevance of this word proginosko or “foreknew”. To recap:

We saw that the word “know” can be used in a strictly formal sense: God “knows” everything, a simple acknowledgment of his omniscience.

But we have also seen that a legitimate use of the word “know” is knowing in the sense of being in intimate relationship with someone. In the case of those whom Christ told to depart from him, he was declaring the lack of this intimate relationship. “I never knew you,” he declared, though it is equally obvious that he knew them in the other sense.

From the context, and from the infinite capacity to know all that is knowable which we rightly attribute to God, it must follow that in that context the word means that Jesus was never in relationship with these self-appointed people who claimed to be his.

We must now ask:

May the word proginosko, or “foreknew”, be used in precisely the same two ways that ginosko is used in senses 1 and 2?

May it be used not only of the way God knows the future, but also of a love relationship that preexisted with those who were yet to come into existence? Does it speak of the love God had set upon a distinct people, distinguished from all humanity by faith, who were planned beforehand to come into existence in the passage of time?

This may be answered by first assuming that the verse in question refers only to the sort of “knowing” exemplified in sense number 1.

Doesn’t God foreknow every human being that came into existence?

In fact that assumption would render the statement senseless. If God has not cast off any whom he foreknew, and (in sense 1) he foreknows everyone, then everyone is saved: that entails Universalism. But this cannot be right, because later in this same chapter Paul does speak of some being cast off, referring to the same people:

“For if the casting away of them be the reconciling of the world, what shall the receiving of them be, but life from the dead?” [15]

Therefore, since the word “foreknew” is not coherent here in sense number 1, it follows that we ought to understand it in sense number 2.

God hath not cast away his people: those with whom he was in intimate relation before they were even born. These were predestined to be loved by Him in existence, even as He loved them before they arrived in due time.

If Christ was:

“... the Lamb slain from the foundation of the world...” [16], meaning, for all intents and purposes, that in the heart of God from the creation was the intention that “God will provide for Himself the lamb…” [17], then the lamb was also slain with those in view for whom the sacrifice was already intended, from the beginning of creation.

We must assure ourselves that we are not reading into the text that which we wish to affirm, but that we allow the text to speak for itself, and allow scripture to interpret scripture.

How then does this sense concur with other passages?

We don’t have far to look. After making the point that God had not rejected Israel completely, qualifying this rejection with the exception of those “whom he foreknew”, Paul immediately quotes scripture to build a case intended to persuade his hearers of this distinct division within Israel. He quotes Elijah appealing to God against Israel, who “killed the prophets, and tore down the altars”. Paul then draws attention to Elijah’s plea to God: “I am the only one left, and they are seeking my life as well”. But what did Elijah mean when he said “I am the only one left”? The only one of whom?

Paul is introducing the idea of the true remnant of Israel to which Elijah referred. He reiterates God’s comforting answer to Elijah: “I have reserved for Myself seven thousand men who have not bowed the knee to Baal.” [18] Paul then follows this up with his explanation of how things are still working according to this plan. He sees his own experience in a similar light to Elijah’s. He himself, an Israelite, is also one of the “remnant chosen by grace”. In Paul’s understanding, his experience of the call of God on the Damascus road was no less than the experience of Elijah; both were examples of a calling out from the chosen people. He did not see it as a novelty, but as part of the continuing experience of that which was from the beginning. He sees himself, therefore, along with all others who embraced Christ, as the true remnant of Israel, whereas the rest of Israel “failed to obtain, but the elect did. The others were hardened...” It is important to notice that the word “chosen” in verse 5 is identical in the Greek to the word “elect” in verse 7.

This must call into question, then, the almost universal acceptance of Christian Zionism within Evangelical Christianity. Christian Zionism in effect regards all Jewish people (irrespective of their view of Christ) as God’s “chosen people”, God’s elect, as indeed Jewish people see themselves to this day. But then one must ask:

Are there two chosen peoples? Are there the chosen ones who are those who are Jews by blood relation to Abraham, and are there those chosen by the election of grace?

[1] Psalm 86, [2] Psalm 16:5-11, [3] Luke 19:37 fwd. [4] Luke 1:21 fwd. [5] Proverbs 15:3, [6] Job 21:22

[7] 1 Chronicles 28:9, [8] Genesis 3:8, [9] Genesis 3:9, [10] Matthew 7:23, [11] John 8:58, [12] Romans 10:12,13, [13] Romans 10:14-21, [14] Romans 11:2, [15] Romans 11:15, [16] Revelation 13:8, [17] Genesis 22:8, [18] Romans 11:4, 1 Kings 19:18

Saturday, June 25, 2016

Mitch Stokes, David Hume, An Atheist And Me

Mitch Stokes is a Christian philosopher who had the good fortune to study under some of the most influential Christian thinkers of our day. For some people that might conjure up big-ticket names such as Billy Graham, John Stott, or perhaps Ravi Zacharias. Others might think of Joel Osteen or Jerry Falwell. Still others might question the very idea of whether a Christian may legitimately be interested in philosophy in the first place. The names he actually studied under, Alvin Plantinga, Nicholas Wolterstorff and others, may not be household names, but in philosophy their influence is profound. To many, unfortunately, reason and rationality are mistakenly perceived as the enemy of our faith. In fact this attitude, which reigns very strongly in certain quarters of Evangelicalism, has been responsible for the continued marginalization of the Christian voice in many Western and increasingly secular countries.

Here is a guest post by Mitch Stokes PhD. from Crossway about the role of apologetics:


1. Apologetics is as much for believers as it is for unbelievers.


Let’s roughly define apologetics as the use of arguments to remove doubt or unbelief (I’ll qualify this in the next point). The point here is that unbelief often comes from our own hearts and minds, despite the fact that we’re Christians. For my own part, apologetics has always been something I do as much for me as for others.

2. Apologetics can be used preemptively.


Here’s the qualifier I mentioned above: although we often use apologetic arguments to remove doubts, we can also use them to prevent doubts. Teaching apologetics to young believers can be a preemptive strike on unbelief.

This won’t prevent all doubting, but it can certainly mitigate it. This point is particularly important for parents. Notice that points (1) and (2) imply that apologetics is for absolutely everyone—Christians and non-Christians, doubters and non-doubters (i.e., not-currently-doubters).

3. There is a difference between knowing that Christianity is true and showing that it’s true.


Ultimately, we know that Christianity is true because the Holy Spirit opens our eyes to its truth (which should remind us to steep our apologetics in prayer).

That’s not to say that arguments can’t confirm or further support our Spirit-induced belief—or that arguments are never part of coming to faith—but the arguments we use on ourselves are sometimes different from the arguments we use to try to show someone else that Christianity is true.

4. No one has all the answers.


Be realistic and avoid the temptation to think that in order to address your neighbor’s skepticism you must first have all the answers. No one has all the answers. When you don’t know something, say so and be fine with it. Know your limits.

This isn’t an excuse to be sloppy or to avoid the hard work of study, but rather an encouragement to be humble, and to therefore be relaxed and gentle. Also, be prepared to come to the realization that the more you learn, the more you’ll see how complicated the issues are. This is just a design feature of learning.

5. There are no airtight arguments.


Although there may be strong arguments for Christianity, none of them is absolutely compelling, forcing everyone to believe the conclusion upon pain of irrationality. To put it differently, there are no proofs for Christianity in the strong sense of "proof."

This shouldn’t be troubling—after all, there are few, if any, arguments whose conclusion can’t be avoided somehow—even if this avoidance puts the person into some intellectual contortions. Can you prove that there is actually a computer in front of you and that you’re not in the Matrix? No. So don’t expect more from arguments than they can deliver.

6. Don’t mistake the strength of your loyalty to Christ for the strength of your argument.


We can often mistake the strength of our commitment to Jesus for the rational strength of our arguments for Christianity. Properly acknowledging the limitations of an argument doesn’t imply that you’re somehow hedging on your profession of faith. Similarly, acknowledging that there are good arguments for atheism or agnosticism doesn’t mean that you’re being disloyal.

7. The strength of arguments is person-relative.


A watershed experience for me in graduate school was seeing equally brilliant philosophers, each of whom knew all the same arguments, come to wildly different conclusions. When we evaluate arguments, all of us weigh them against our own unique set of background beliefs, experiences, temperaments, proclivities, and emotions.

And though this doesn’t mean that “anything goes” when evaluating arguments, neither are arguments purely a matter of logic and observation. Everyone is unique and no one is neutral. By the way, none of this implies that truth is relative.

8. Apologetic method is person-relative.


This will be controversial among die-hard devotees to specific methods, but don’t get too caught up in “schools” of apologetic method. It is helpful to become familiar with them, and even fine to have a favorite, but the best “method” for the job will depend on many factors. Some of these factors include your background/expertise, interests, personality, and temperament (as well as those of your audience). Your approach will also depend on the physical setting. A lecture hall is different from a coffee shop or the internet.

Again, this doesn’t mean that just any old thing is fine—or that all methods or approaches are equally good. I sometimes think of it in terms of learning martial arts styles: it’s best to learn a number of them, taking the things that work best (for you) from each one. Learn them but don’t get too distracted with their categorization.

9. Apologetics is more a matter of planting than a matter of harvesting.


Changing someone’s mind isn’t the only goal of apologetics. In fact, that’s unlikely to happen in the moment. Rather, think of any apologetic encounter as planting a seed that will come to fruition later. Or perhaps you’re simply helping prepare the soil so that someone else can plant.

That’s not to say you shouldn’t pray for God to do big things, but remember that we often don’t get to see firsthand those big things. So you shouldn’t be discouraged (or angry or defensive) when the person you’re talking with doesn’t agree with you. It’s not all on your shoulders.

10. Apologetics is ultimately about people.


It’s easy to get caught up in ideas, concepts, and arguments—especially for people who are naturally drawn to apologetics. But apologetics is a means to an end, a means of helping people to live for Jesus.

An apologetic encounter isn’t a sales pitch; neither is it a fight (my above martial arts example was a training metaphor, not one about attitude). Love the people you come into contact with. Ask them questions and genuinely listen to their answers. Be gentle and humble.

Be like Jesus.

Mitch Stokes at Crossway

Mitch Stokes (PhD, University of Notre Dame) is a senior fellow of philosophy at New St. Andrews College in Moscow, Idaho. In addition to studying philosophy under world-renowned philosopher Alvin Plantinga, Stokes holds degrees in religion and mechanical engineering, and holds five patents in aeroderivative gas turbine technology. His most recent book is How to Be an Atheist: Why Many Skeptics Aren't Skeptical Enough.

Years ago, the Christian thinker C.S. Lewis opined-

"To be ignorant and simple now—not to be able to meet the enemies on their own ground—would be to throw down our weapons. . . . Good philosophy must exist, if for no other reason, because bad philosophy needs to be answered.”

Bad philosophy needs to be answered. But why? Because bad philosophy leads to a bad world. Philosophy comes from an amalgam of Greek words that combine to mean a love of wisdom. And we know that Christ is our Wisdom. We also know that philosophy is very much into thinking logically and in a disciplined way that helps us spot fallacies, promote truth and better reflect that which is real. Again we understand that this is linked to Christ who is the Word come down from heaven and who became flesh and dwelt among us. The Greek word for that "Word" is Logos, from which we get the word "logic". He is our reasoning Word.

So, unlike many who grimace at the thought of Christians involved in philosophy, we should acknowledge that, so long as we distinguish between the wisdom of God and the wisdom that comes from below, this discipline is a prerequisite in today's philosophically driven world. Rather than finding a proof-text to discourage the Christian from thinking in more philosophically adept ways, we need philosophy to disarm the ever more sophisticated attempts to present other worldviews as equal or superior to the wisdom that comes from above. Philosophy, whether it is recognized or not, is what steers our ship. And it is not merely the individual's ship; it is the ship we call culture, the society in which we live and move and raise our children. Here is a classic reason from the scriptures to engage in philosophy, because our world is driven by ideas that do not "bow their knee to Christ":


"For though we live in the flesh, we do not wage war according to the flesh. The weapons of our warfare are not the weapons of the world. Instead, they have divine power to demolish strongholds.  We tear down arguments, and every presumption set up against the knowledge of God; and we take captive every thought to make it obedient to Christ.…"  


In today's world, without a fundamental grasp of the discipline of ordered thinking that philosophy teaches, there is very little ability to "tear down arguments" that oppose the knowledge of God. It is difficult enough even with some knowledge of ordered thinking.

Here is a video that demonstrates, I hope, something of why every Christian is called to give a reason for their hope, to give a defense of the faith. This is the unadulterated version of Stokes' video on the early skeptic empiricist David Hume, showing how Hume's skepticism couldn't really account for reason and logic without itself appealing to reason.


The next video is one in which an atheist takes issue with Stokes' views.


Friday, June 24, 2016

Blaming Christians for Orlando?

THE MEDIA HITS ROCK BOTTOM
ERIC METAXAS
It’s not news that the news media treat Christians unfairly. But to blame us for terrorism?

Two Sundays ago, an ISIS-inspired terrorist killed forty-nine people at a gay night club in Orlando. Yet just three days after the attack, the New York Times editorial board laid the blame for Omar Mateen’s self-professed act of Islamic terrorism squarely at the feet of…believers in traditional marriage. I’m not kidding.
For those confused about how Christians and social conservatives are responsible for a radical jihadist’s actions, the Times helpfully explains: Our “corrosive politics,” they write, paved the way for this monstrosity. And by “corrosive politics,” they make it clear they mean defense of the natural family and created differences between the sexes. The Daily Beast followed up, accusing conservatives who are mourning the tragedy of “exploiting the LGBT community.” Evidently if your politics don’t line up with the goals of the sexual left, you’re not allowed to shed tears for the victims of terrorism.
But by far the most disturbing response, at least to me, came from CNN’s Anderson Cooper, who decided to publicly shame Florida Attorney General Pam Bondi during a live interview. While Bondi tried to explain what Florida is doing to help the victims and their families, Cooper raked her over the coals about her opposition to same-sex “marriage.”
In fact, he all but called her a hypocrite for defending the Florida constitution which—at the time—defined marriage as the union of man and woman. An attorney general’s job, of course, is to uphold and defend her state’s constitution. But Anderson Cooper did not seem to care.
As Mollie Hemingway remarked at The Federalist, apparently Cooper and CNN cannot fathom how anyone could oppose gay “marriage” and also grieve the murder of fifty fellow human beings. The implication by the media is clear: If you haven’t been on board with the LGBT political program, you’re partially responsible for what happened in Orlando.
Let me just tell you my first reaction to this: I was angry—very angry. I wanted to get on the air and scream from the rooftops how absurd, immoral, and unfair this kind of equivalence is. A self-proclaimed ISIS devotee committed the worst mass murder in this country since 9/11, and the media can think of no one to blame but conservatives and Christians!
Now that I’ve had some time to compose myself, I think it’s important we don’t respond with anger. In fact, my BreakPoint colleagues and I debated whether we should even dignify this foolishness with a response. We decided to do so for a couple of reasons.
First, although we can expect to see more abuse of Christians in the news, we cannot let this become the new normal. Not in America. And we should respond by defying the caricatures—just like the Orlando Chick-fil-A managers did when they opened their stores on a Sunday to feed blood donors. And then there’s Lutheran Church Charities, which sent comfort dogs to help mourners in Orlando.
Second, the media has a long history of botching religion stories. Recall when The New York Times described the Church of the Resurrection as “the site where many Christians believe that Jesus is buried.” As Mollie Hemingway points out, sin and salvation are foreign concepts to journalists in America today. And the three-dimensional idea of grieving the deaths of those who disagreed with our politics doesn’t fit two-dimensional narratives. So when the Anderson Coopers of the world insist loving people means approving of everything they do, we must reply that love often means exactly the opposite.
No, we’re not responsible for what happened in Orlando. But it’s our love and proclamation of God’s grace, not our angry self-defense, that will prove it. 

Tuesday, June 21, 2016

Peter Boghossian On Faith- A Critique



There is a known principle in philosophy that one might call “the principle of charitable representation”. What this entails is that when critiquing a view that one believes is mistaken, or erroneous, one ought- as a sign of integrity- first define the view that is opposed, in a way that fairly represents the argument, and, as a mark of goodwill, present it in its best possible light.

But wait, perhaps I’m getting ahead of myself. If we ask what philosophic inquiry looks like, then what follows are some rudiments of formal inquiry employed in the pursuit of philosophy. After covering some fundamentals, we will look at an example of a philosopher presenting arguments for the view he holds, and then we will critique that view in turn.

An argument, in the sense used above, isn’t about bickering or verbal abuse; it is rational thinking comprising a premise or premises and a process of logical reasoning from which a conclusion follows. The process may be thought of as a house, whose roof is the final assertion, the logical conclusion of the argument. An assertion or truth claim not supported by reasoned arguments is like a roof without walls: it will shelter nobody. The arguments are the reasons that hold up the conclusion, the thing claimed to be true. If the conclusion has strong supporting arguments, the roof will hold up to repeated challenges to its integrity. If it is supported weakly, then as the walls cave in under pressure the whole roof may collapse to the ground. Needless to say, do not seek shelter under that sort of roof.


[Image: TheStandardForm.png, an example argument in standard form with premises P1 and P2 and conclusion C]

The above argument is logically valid. That is, if the premises P1 and P2 are true, then the conclusion C is guaranteed true. The argument, then, depends on two things: it may fail if one or both premises are untrue, and it may fail if the logic is invalid. In this case the logic is valid, so the soundness of the argument depends only on the premises. On the face of it, a business studies major would do well to include logic, so there is reason to question premise 1; it may in fact be untrue.

Logical arguments sometimes employ deductive reasoning, and sometimes induction. A valid deductive argument, provided its premises meet the standard of truth, guarantees the truth of its conclusion.

“If a valid argument has true premises, then the argument is said to be sound.”

In this case, if there is general agreement with the premises, then there can be no argument with the conclusion.

“if the premises are true, then it would be impossible for the conclusion to be false.”

However, consider this: a deductive argument developed from false premises may still be valid, yet its conclusion is not guaranteed to be true. The logic of the deduction is not at fault, but because a premise is false, the argument is unsound and cannot establish its conclusion.

“Either Elizabeth owns a Honda or she owns a Saturn.

Elizabeth does not own a Honda.

Therefore, Elizabeth owns a Saturn.

It is important to stress that the premises of an argument do not have actually to be true in order for the argument to be valid.” (Internet Encyclopedia of Philosophy)
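The Elizabeth example is the classical form known as disjunctive syllogism, and its validity can be checked mechanically: an argument is valid exactly when no assignment of truth values makes every premise true while the conclusion is false. The following is my own illustrative sketch in Python (not part of the Internet Encyclopedia of Philosophy's text), with "Honda" and "Saturn" as the two atomic propositions:

```python
from itertools import product

# Disjunctive syllogism about Elizabeth:
#   P1: she owns a Honda or she owns a Saturn
#   P2: she does not own a Honda
#   C:  she owns a Saturn
premises = [lambda honda, saturn: honda or saturn,   # P1
            lambda honda, saturn: not honda]         # P2
conclusion = lambda honda, saturn: saturn            # C

# Search every truth-value assignment for a counterexample:
# a case where all premises are true but the conclusion is false.
counterexamples = [
    (honda, saturn)
    for honda, saturn in product([True, False], repeat=2)
    if all(p(honda, saturn) for p in premises)
    and not conclusion(honda, saturn)
]

# No counterexample exists, so the form is valid.
print("valid" if not counterexamples else "invalid")  # prints "valid"
```

Note that this check says nothing about whether Elizabeth actually owns either car: if premise P1 happens to be false in fact, the argument remains valid but is unsound, which is exactly the IEP's point.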

Inductive arguments, by contrast, are those in which an inference is made: the premises render the conclusion more or less likely to be true, and the strength of the inference depends on how well informed its premises are.

“In an inductive argument, the premises are intended only to be so strong that, if they were true, then it would be unlikely that the conclusion is false. There is no standard term for a successful inductive argument. But its success or strength is a matter of degree, unlike with deductive arguments. A deductive argument is valid or else invalid.” (Internet Encyclopedia of Philosophy)
[Image: ArgumentTerminology.png, a diagram of argument terminology]

Another sign of integrity in logical argument is that, to show this attitude of charity, the better party will not make out that their argument is deductive if there is any doubt about it. According to the rule of charity, the interlocutor should claim deductive certainty only when it is incontrovertible. With this principle in mind, many will state things like: “In all fairness it seems that…” or “given that... it is appropriate to reckon…” or “we ought therefore to account for this by…”.

In the following YouTube video, entitled “Is Science the Only Source of Absolute Truth?”, the Christian thinker and philosopher Mitch Stokes demonstrates the charitable principle by his use of understatement in critiquing the claim “that science is the only source of absolute truth”. We may contrast this with the overexuberance of the skeptic philosopher Bertrand Russell, who made this absolute claim for science:

“I conclude that, while it is true that science cannot decide questions of values, that is because they cannot be intellectually decided at all, and lie outside the realm of truth and falsehood. Whatever knowledge is attainable, must be attained by scientific methods; and what science cannot discover, mankind cannot know.”



Defining terms is, in formal and informal refutations, a fundamental step, one that ensures that everyone involved is clear as to what the argument is about, and ensures that people don’t talk past each other. It is the effort to minimize unnecessary misunderstanding at the outset. By avoiding equivocation, whether an opponent purposely misleads with ambiguous language or whether it happens by accident, its practice in the long run saves time and energy. The opponent states the opposing argument in her own words, to show that they have not misunderstood the argument, and that discourse is then possible. Just as polite conversation is guaranteed by rules of social etiquette- such as don’t interrupt while the other is speaking- so too, do the rules of engagement guarantee more satisfactory dialogue. There ought- in the name of consistency- to be at least some attempt to standardize terms.

My pre-teen son was having a conversation with an Irish boy who didn’t understand his Kinglish (Kiwi English) very well. With a puzzled look, the boy asked his mother, “Mum, what’s a ‘cave’?” To which his mother replied, “You know, a kev.” Instantly the two boys could continue the discussion on equal terms, because each understood what the other was saying. So defining terms is important, clearing away misunderstandings that would cloud the real issues of difference. Defining terms is crucial; it makes dialectical discourse possible.

Many will be cognizant of “the straw-man argument”, where an opponent uses a poor representation of an opposing argument to make it easy to demolish or refute it. Obviously the straw-man argument is an unfair, unreasonable tactic, and disingenuous for anyone to employ it, because the idea that is being “demolished” is not a true representation of the opposing view.

Many arguments are won on the basis of rhetoric rather than reason. Rhetoric is the art of persuasion, and it has no scruples about its tactics. It is free to use hyperbole and emotive language, and to leverage the prejudices and foibles of human nature. Style, psychology and presentation are as important to rhetoric as rationality, logic, coherence and correspondence are to reason.

It has become common for people to appeal to pseudo-straw-man arguments, attempts to make the opposing argument look weak and foolish at the same time. For example, the claim that an argument for the existence of God is equivalent to an argument for the existence of a “flying spaghetti monster” is really a covert straw-man argument with little more subtlety than a sledgehammer. As Professor John Lennox pointed out, Richard Dawkins did not write a 400-page book attempting to refute belief in Wotan or Zeus, nor does anyone know many people who came to believe in Santa Claus as adults. The interlocutor who uses this language is trying to persuade by ingratiating herself with her sympathisers, appealing to their prejudices and to the sentiments of the many who deride and mock the beliefs of theists. It is an emotive tactic employing sarcasm, enticing favour by ridicule and mockery, treating the theist as an idiot by association.

But “the principle of charitable representation” goes beyond merely refraining from the straw-man argument. It requires the opponent to fulfil an obligation to honesty and integrity: to make the best case possible on behalf of the position she opposes, and then to refute that best possible case with all the energy and logical discourse she is capable of. When people do this with integrity, no one can accuse them of doing a dishonest job of their refutation. It relies on what one may call the etiquette of philosophic best practice: at least at the beginning, walking a mile in the shoes of the opposition, putting oneself in the other’s place. This shows a commitment to honesty and integrity that cannot be gainsaid.

Now, having laid the groundwork for an honest, and clearly defined way of proceeding with a dialectical discussion, we turn to the subject at hand.

Peter Boghossian is an eloquent and outspoken speaker, “an American philosophy instructor, social activist, author, speaker, and atheism advocate”. He is “Assistant Professor of Philosophy at Portland State University. His primary research areas are critical thinking, philosophy of education, and moral reasoning.” (Wikipedia)


In a presentation entitled “Jesus, The Easter Bunny and Other Delusions: Just Say No!”, which may be seen on YouTube, he presents an argument claiming: “My thesis tonight is simple, and should be uncontroversial: faith-based belief processes are unreliable, and will not lead one to the truth.” With charity in mind, let us ignore for the moment his betrayal, evident even in the title of his discourse, of the etiquette we might expect of a philosopher who is paid to give students an appreciation for good philosophy. He is, after all, largely preaching to the choir in this instance; it is an address to skeptics.

Aside from some preliminary statements that help define the terms of his thesis, he begins by outlining some poor ways of reasoning that people have employed in attempts at gaining knowledge. He highlights some uncontroversial, reliable conclusions about unreliable methods of reasoning such as astrology, homeopathy and dowsing. The evidence he cites is an appeal to the authority of the Mayo Clinic, which conducted trials on the effectiveness of homeopathy. In another example he appeals to the authority of Shawn Carlson's “A Double-blind Test of Astrology”, published in the journal Nature in 1985. These and other examples he uses to make his first major point:

“Unreliable processes, lead to unreliable conclusions, that is if the process one uses is unreliable the conclusions one comes to cannot be relied upon, cannot... ”

So far, none of this is remarkable or contested, and may pass without further comment.

Drawing an analogy with finding the dimensions of his missing bathroom door, Boghossian demonstrates some unreliable, even foolish, methods of inquiry, asking his dog being one of them.

He mentions that one of the genuine possibilities may entail asking an expert. This appeal to authority is, in his own terms, of genuine epistemic value for the purposes of inquiry. Some people, in a mistaken understanding of the “appeal to authority” fallacy, claim that any appeal to authority is out of court. That isn't so. As C.S. Lewis pointed out: “A man who jibbed at authority in other things- as some people do in religion- would have to be content to know nothing all his life.” For those who weren't brought up in the days of horse-drawn carriages, the word “jibbed” means to stop and refuse to go on. In Lewis's view, then, if we took the mistaken notion that authorities are not to be referred to, science would cease to advance knowledge and philosophy would come to a standstill. It would also render senseless Boghossian's appeal to the twin epistemic goals that we will look at next.

As an example of the value of referring to authorities, we ought to acknowledge that every person who ever made a significant contribution to knowledge in any field stands, in all humility, on the shoulders of those who went before. So, for instance, we have Newton, who gave us a better understanding of gravity. Working from Newton's position, Einstein moved us from Newtonian physics to relativity. From Einsteinian relativity we, or rather some scientists (Stephen Hawking and Leonard Mlodinow among them), have graduated to quantum mechanics. But then we ought also to know that at this cutting edge of scientific knowledge not everything is yet settled. Roger Penrose is just one such scientist, one who has worked with perhaps the world's best known scientist, Stephen Hawking.

“The universe has a purpose.” - Roger Penrose

Sir Roger Penrose calls string theory a "fashion," quantum mechanics "faith," and cosmic inflation a "fantasy." Coming from an armchair theorist, these declarations might be dismissed. But Penrose is a well-respected physicist who co-authored a seminal paper on black holes with Stephen Hawking. He argues that known laws of physics are inadequate to explain the phenomenon of consciousness. Penrose does not hold to any religious doctrine, and refers to himself as an atheist.

This mistaken idea (that referring to authorities is verboten) is, ironically, itself derived from a poor authority, repeated by unthinking, uncritical people with blind faith in that authority. In that form it is a knowledge deterrent, a science stopper. The claim that any reference to an authority is a fallacy amounts to little short of dishonesty; in practice everyone refers to authorities. The real danger is in trusting an authority so much that we unquestioningly accept what they have propounded, and that is blind faith, which, in agreement with Boghossian, we should all be opposed to. The other danger, of course, is quoting or referring to an authority on an issue where the authority has no expertise or is unqualified to comment. Stephen Hawking is an indubitable authority in physics. But he is no philosopher; metaphysics is not his area of expertise. Accordingly, when he said “Philosophy is dead” he demonstrated the danger of using his mantle of scientific authority to propound something he has no qualifications to support. This problem has been exacerbated in recent years by specialization, where a person's particular field so encompasses all their energies that they are in fact quite narrow in terms of overall knowledge. So, there is no problem with Boghossian's reference to an expert on the size of missing doors.

Here, Boghossian underlines an important concept, the twin epistemic goals. Epistemology is the study of how we come to know what we know. These twin goals he rightly explains as increasing true beliefs and decreasing false beliefs:

“We want to maximize the number of true beliefs that we have...but if this was our only goal, then we could just believe everything we read, think, and hear...but...because we have a second goal, that we want to minimize the number of false beliefs that we have... But again, this doesn’t mean not believing anything, because that would mean we wouldn’t have any true beliefs.”

Another way of putting this is to imagine that truth is to the mind what food is to the body. A radical skeptic might then be thought of as a person who starves herself to death by rejecting all claims to truth, while conversely a gullible person believes anything at all that is presented to her mind, and thus chokes to death trying to consume it all, or perhaps dies of obesity-related disease. To increase the likelihood of maximizing true beliefs and minimizing false beliefs, the reliability of the methods used must be taken into account. As noted earlier, referring to authorities is an integral, acceptable, even necessary part of the epistemic goal. An important part of the epistemological process is consciously evaluating the reliability of the authorities you choose to listen to.

“To do this we cannot use processes that are unreliable... Are there any commonalities among processes that take one away from reality?...Processes that decrease the likelihood that the conclusion one comes to will be true?...There are two.

The first, is that these processes are not based on evidence...

The second commonality, is that they’re based upon what one thinks is evidence- but is not actually evidence. I term such ostensibly benign superstitious beliefs, ‘gateway beliefs’. The Easter Bunny is an example of a conclusion, for which certain epistemic agents, certain people, think that there’s sufficient evidence….starting out with seemingly inconsequential beliefs for which one lacks sufficient evidence, may lay the framework, for one to cognitively habituate oneself, to lending one’s belief to other propositions or other things, which one also thinks that there’s sufficient evidence- when no such evidence exists. Now, we have finally set the groundwork to talk about faith. What is faith?” - Peter Boghossian

At this point I have no substantial disagreement; in fact Boghossian is doing me a favour, in that I have for some time suspected that many people (and I see this among believers and non-believers alike) hold beliefs that are not based on good grounds, or that are not warranted by the evidence they hold- or rather don't hold. Boghossian is giving voice to my own loosely held suspicions, arrived at by observing a tendency among contemporary Christians to venerate and propagate a form of faith that is unscriptural, and that tends to keep people as perpetual children.

“Well, faith is belief without evidence, if one had evidence one wouldn’t need faith”

I have to interrupt Boghossian at this point. His definition of faith seems common amongst atheists, but to take it as indicative of reality is merely to feed a confirmation bias, or to accede to the availability heuristic. With the wrongness of the straw-man argument clearly in view, I must ask the reader: is it a sign of integrity to give one's own definition of something, a definition not shared by the opposing view? Evidently this is a violation of one of the most basic laws of thought. A violation at this level is, well, I won't say unforgivable, but at the very least surprising. Boghossian has violated the law of identity: in his case, “A is not A”. If I talk about Christian faith, the definition I would use is that which is accepted as standard, one which has a history of use in the Christian community- or as close to it as I could possibly express.

I cannot speak for other belief systems; I speak here only on behalf of the one religion that he has lumped in with all the rest. Which brings me to my next point, the fallacy of hasty generalization- or, to be more precise, a variation of that fallacy known as “the hasty conclusion”. He has made the bold assertion that faith is belief without evidence, but from which religions has he drawn this conclusion? Where is the supporting evidence? A bare assertion is not acceptable without supporting evidence. I have no idea what his qualifications are with regard to religions, but astonishingly he appears to have made the assertion on insufficient evidence- he appears, then, to be guilty of the very thing he claims is the problem among those who express faith! In fact he offers no evidence that his definition of faith is accurate across the broad spectrum of faiths. It may well be true that other faiths are guilty of believing things without proper warrant; on this I cannot comment. But to accuse Christianity of this is inaccurate. He is therefore at least guilty of using faith in a generic sense, which may well apply to other religions but not to Christianity. It is an insult to intelligence to summarize the faith of all religions indiscriminately, and then compare that with the beliefs of atheistic materialism. If you totalled up all the penalties awarded against every national rugby team in the world, and then compared that total with the number of penalties awarded against the All Blacks- would anyone in their right mind agree that this demonstrates the All Blacks are the cleanest players of all? To do so is unbridled bigotry. Each must be taken on its own merits, or there must be justification for the generality.

Clearly we have already laid the groundwork for fair and constructive ways to debate an issue and arrive at truth. One of those crucial issues is the defining of terms. To put it another way, I don't believe in the “faith” that Boghossian doesn't believe in either! In fact, what Boghossian is describing is not faith but fideism. What makes his argument particularly galling is that he is ascribing to believers- and here I speak only for the Christian faith- something of which he lays no claim to have experience. Who is qualified to define faith? Someone who claims no experience of it? Isn't that a little like someone from Sub-Saharan Africa proposing to define what it's like to live in the Arctic Circle? Given that faith- even as it is spoken of legitimately amongst Christians- has several meanings, which must always be understood from the context in which the word is used, it is not difficult to discern the sense in which we must use it for this exercise.

But to make matters clearer, let's examine a definition of “fideism”, and see if it fits Boghossian's view of “faith”. Consider this account of the definition given by the exemplary Christian philosopher Alvin Plantinga:

“Alvin Plantinga has noted that fideism can be defined as an ‘exclusive or basic reliance upon faith alone, accompanied by a consequent disparagement of reason and utilized especially in the pursuit of philosophical or religious truth’. Correspondingly, Plantinga writes, a fideist is someone who ‘urges reliance on faith rather than reason, in matters philosophical and religious’ and who ‘may go on to disparage and denigrate reason’. Notice, first, that what the fideist seeks, according to this account, is truth. Fideism claims that truths of a certain kind can be grasped only by foregoing rational inquiry and relying solely on faith.” (Alvin Plantinga, as quoted in the Stanford Encyclopedia of Philosophy)

To recap, we ask again how Boghossian has described what he calls “faith”. This is his definition: “Well, faith is belief without evidence; if one had evidence one wouldn't need faith”. Clearly this comports well with Plantinga's description of fideism. Fideism is faith or belief- that is, confidence or trust that what one believes is true- but it is confidence without reason. Others refer to it as “blind” faith. One need only ask: how is supporting evidence arrived at? The broad answer, as discussed above, is by reason. So fideism is belief without reason; fideism is belief without supporting evidence or reasons. Fideism is holding something to be true without sufficient arguments to vindicate that belief. It is in fact the roof without supporting walls spoken of earlier. Fideism is to Christianity what multiverses are to some scientists- held without any reason or evidence.

Apparently, his definition of faith (at least as regards Christianity) has been confused with something else. So, having made clear how Boghossian has described faith as it should not be, it seems only fitting to define it as it should be. As I am speaking only on behalf of Christianity, I will offer the Christian understanding or definition of faith. Now, this may not be the faith of some who pose as Christians, nor the definition some who are Christians might express, but it is an historic and well accepted definition:

“Faith is the substantiation of things hoped for, the evidence of things unseen” (Hebrews 11:1)

Clearly, then, this is not faith without evidence, as authoritatively defined. Nor is it faith as it has been understood in practice for many centuries. It is my contention, then, that Boghossian has- either by willing equivocation or unwittingly- conflated fideism with faith. True, there is a relation, but the distinction is as important to Christians as it may be vague to others.

Speaking only on behalf of Christianity: nowhere are people urged to believe the tenets of the Christian belief system without adequate reason. Saint Paul is an authoritative source with regard to the orthodox doctrine of the Christian texts. Saint Luke records an instance in the life of Paul where he criticizes people of other belief systems for the lack of evidence supporting their belief in that which they purport to worship. He lambasts the Greeks of Athens, on Mars Hill, for fideism, belief without evidence:

“I perceive that in all things ye are too superstitious. For as I passed by, and beheld your devotions, I found an altar with this inscription, ‘TO THE UNKNOWN GOD’. Whom therefore ye ignorantly worship…” (Acts 17:22-31)

Synonyms for the word “superstitious” relate to beliefs that are groundless or unfounded, irrational, illusory or mythical.

There is another point that needs to be raised in all of this. The word “faith” is most often associated with religion, and because its true meaning- at least with regard to Christianity- has been deliberately denied to the Christian faith by people with an ideological axe to grind, it is commonly used in a pejorative sense. Occasionally one might hear it used in connection with other disciplines, such as science.

Here is one such example where it is used in an acceptable way, one in which its true sense is apparent:

“Paul Davies, a brilliant physicist at ASU, says that ‘the right scientific attitude’- now listen to this, Paul Davies is not a theist- ‘the right scientific attitude is essentially theological; science can only proceed if the scientist adopts an essentially theological worldview; even the most atheistic scientist accepts as an act of faith the existence of a law-like order in nature that is, at least in part, comprehensible to us.’ Einstein said ‘I cannot imagine the scientist without that profound faith’- note the word.” - John Lennox

When a “scientist accepts as an act of faith the existence of a law-like order in nature” what is she in effect doing? Here is a similar account by C.S. Lewis:

“Men became scientific because they expected Law in Nature, and they expected Law in Nature because they believed in a Legislator. In most modern scientists this belief has died…”

What has died? And what remains? Among many scientists, belief in a supreme legislator has died. The faith or belief in a supreme being who legislated the order in nature has, in the hallowed halls of science and academia, largely passed away; but what remains is a belief, or faith, in the law-like order of nature, by which scientists expect that their investigations into nature will reward their efforts. That wasn't, for the most part, how it was for the early empiricists.

But let's change the language somewhat. “Faith” and “belief” are such religious terms, aren't they? The modern concept of science has chalked up so many successes that the momentum of its success has carried it beyond the confines of its earlier progenitors. As Lewis pointed out, faith in God prompted a faith in nature, because of a principle of correspondence: the God of scripture was also the author of nature; therefore, they inferred, just as we understand many things by scripture, so too will we gain knowledge from examining the other “book” he wrote. What Einstein called “the eternal mystery of the Universe”, its intelligibility, was something he couldn't account for; in his mind it was a puzzle. Its accessibility to human endeavour and reason- the fact that a characteristic of the whole Universe may be encapsulated in an elegant formula only an inch long, with just a few characters- was, to Einstein's prodigious mind, a matter of deep wonder. Why? Because a materialistic worldview, the view that nature is all there is, cannot give a credible answer to that question. The laws governing the Universe are still confidently assumed, but the reason they were originally inferred has been lost. Einstein, like many others, had inherited the attitude of expectation, driven by the impetus of its own success, but without the divine explanation.

All of this points to the reality that scientists are exercising faith in the law-like order of nature without rational grounds to support that confidence. They are, in short, fideists. Ah, but we know that isn't true, is it? No- and why not? Because the ongoing success of science in explaining reality is now the ground upon which the contemporary scientist rests his confidence. But as others have cogently argued, the scientific endeavour might never have got off the ground had it not been for the general conditions of faith in God infused in the culture in which science was nurtured. To follow the historical relation of science and religion, see Alister E. McGrath, Science & Religion: An Introduction.

But I promised that we would depart from these religious terms, faith and belief, didn't I? Review the previous two paragraphs and, as you do, ask the question: did those paragraphs make sense? Note that I am not asking whether what I said was true- you will no doubt have your own view on that- but I do ask you to acknowledge that you had no difficulty understanding what I was saying: that it was intelligible, and followed accepted conventions of grammar and semiotics. My point is that words such as “faith” and “belief”, as regards our subject, may be exchanged quite happily for words or phrases such as “inferred”, “confidently assumed”, and “attitude of expectation”. Suddenly the words largely associated with religion have been replaced with synonymous terms, but without the religious connotation, without the stigma.

While “inference to the best explanation” is an accepted term in philosophy, it is also accepted as a non-deductive logical argument in science. In fact every reference to the laws of nature has a non-deductive implication. And this is not merely my opinion; it is accepted even by atheist or skeptic philosophers such as David Hume and Bertrand Russell, both highly respected and renowned for their influence in philosophy and logic. How much science can be done without reference to the laws of nature? In short, if my definition of faith with regard to the Christian religion is correct- that it infers the existence of God from evidence, or infers belief in miracles from reliable sources- then Christians are doing no more than what is required of science. Ironically, it was Charles Darwin who explored and popularized inference to the best explanation among the scientific community with his work “On the Origin of Species”.

To summarize then:

Peter Boghossian has not observed the philosopher's obligation to provide an honest “best case” for the position he opposes.

He failed to correctly define terms that were integral to the disagreement.

He engaged in some rhetorical tactics, appealing to prejudice.

In providing a generic definition of faith, he failed in regard to Christianity. His definition may well apply to some or all other faiths, but not to Christianity; he is therefore guilty of hasty generalization.

If his argument succeeds against Christianity, then it succeeds against science and philosophy as well, since logical inference is used and accepted by all.

He failed to show how Christian ways of knowing are unreliable. The three examples he used of belief without evidence were pseudoscientific practices that have nothing whatsoever to do with the Christian faith- which, judging from the title of his presentation, is the main target of his skepticism.