Explaining Language and Thought Naturalistically

James Fodor / 25 August 2014

During August 2014, Oxford Professor of mathematics John Lennox delivered a series of lectures in Australia on "Science and Faith in God." Considered a leading figure of the evangelical intelligentsia, Lennox made claims that James Fodor, President of the Secular Society of the University of Melbourne, disputes in a series of five articles.

In Part 1, Fodor critiques Lennox's claim that modern science owes its development to Christianity.

In this Part 2, Fodor refutes Lennox's claim that language and semantic meaning cannot be explained naturalistically.

Part 3 deals with Lennox's claim that naturalistic science is unable to provide an explanation for the intelligibility of the universe.

Part 4 critiques the role Christians played in many important reformist social movements.

Part 5 deals with Lennox's references to the so-called "evils of atheism".


Here I continue my critique of the major arguments raised by John Lennox at the recent "Cosmic Chemistry?" public lecture and the "Faith has its Reasons" conference. In this piece, I will address the claim made by Lennox that language and semantic meaning cannot in principle be explained naturalistically. As far as I could tell, Lennox made two slightly different (though related) claims on this subject: first, that the mental activity of the mind is not reducible to the physical mechanisms of the brain; and second, that the semantic-bearing component of language - the fact that language means something - cannot in principle be explained by reductionistic, materialistic theories. I will address each of these assertions in turn, arguing that in both cases Lennox grossly overstates his case, and fails to substantiate his claims.

What did Lennox Actually Say?

"That writing there that you take to have meaning cannot be reduced to the physics and chemistry of the paper and ink on which these symbols appear… the problem is that it cannot be explained reductionistically."
"The one area when explanations do not move from the complex to the simple is in language."
"You cannot reduce the mental to the physical."

Physicalism and the Mind

The question of whether the mind is purely the product of the physical brain is an exceptionally old, complex, and controversial one. As before (see part one of this series), Lennox gave no hint of this complexity in his presentation. He did briefly mention Thomas Nagel, a philosopher who wrote the well-known paper What Is It Like to Be a Bat?, in which he argued that mental states have an essentially subjective character which many argue is difficult or impossible to explain naturalistically. Lennox might also have mentioned thinkers like John Searle and Roger Penrose, both of whom are to varying degrees critical of reductive physicalist explanations of the mind.

On the other hand, Lennox made no mention at all of the many philosophers of mind and cognitive scientists, including Paul and Patricia Churchland, David Marr, Daniel Dennett, Jerry Fodor, Hilary Putnam, and many others, who do think that such a reductive program is possible. Indeed, a recent PhilPapers survey (http://philpapers.org/surveys/results.pl) found that 57% of philosophers 'accept or lean towards' physicalism about the mind, compared to 27% who support non-physicalism. Obviously this is not proof that Lennox is wrong, but at the very least it surely casts considerable doubt on his confident and unqualified claims that such reductive physicalism is not possible. Lennox presented no real evidence or arguments for his position (and there are many arguments on both sides), but simply made assertions and spoke as if the issue was already decided, which is by no means the case.

Reductionism in Semantics

I will now move on to examine Lennox's related claim that linguistic meaning cannot be explained reductionistically - that this is the "only area" where explanations do not move from "the complex to the simple". Personally, I find such an assertion to be patently absurd, as it seemingly constitutes a blanket rejection of an enormous proportion of contemporary research in cognitive science. Let me provide just a few illustrative examples of such research.

Linguists study language at multiple levels, considering separately how syntax, semantics, pragmatics, and other factors contribute to meaning. Context-free grammars are used in linguistics to explain how highly complex sentences can be constructed out of smaller components by following simple rules. Semantic networks and neural networks are models used to explain how meaning is instantiated in a web of beliefs about related concepts, which can sometimes be developed without explicit 'top down' instruction. Formal semantics applies the tools of logic to study how the meanings of complex expressions are built up systematically from the meanings of their parts. Neurolinguistics uses lesion studies and neuroimaging techniques to determine which regions of the brain are responsible for particular language functions, including articulating speech, processing sounds, and even processing different types of semantic content. Machine translation and other natural language processing techniques are finding increasing use in the field of computational linguistics, which uses computer algorithms and machine learning to identify structures within, and hence to infer the meaning of, pieces of language.
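The first of these ideas is easy to make concrete. A context-free grammar builds arbitrarily complex sentences out of simple parts by repeatedly applying rewrite rules, and that process can be sketched in a few lines of Python. The particular grammar, vocabulary, and function names below are my own toy inventions for illustration, not drawn from any specific linguistic theory:

```python
import random

# A toy context-free grammar: each non-terminal symbol maps to a list
# of possible expansions (sequences of terminals and non-terminals).
GRAMMAR = {
    "S":   [["NP", "VP"]],
    "NP":  [["Det", "N"], ["Det", "Adj", "N"]],
    "VP":  [["V", "NP"]],
    "Det": [["the"], ["a"]],
    "Adj": [["small"], ["curious"]],
    "N":   [["linguist"], ["sentence"]],
    "V":   [["parses"], ["constructs"]],
}

def generate(symbol="S", rng=random):
    """Recursively expand a symbol into a flat list of words."""
    if symbol not in GRAMMAR:            # terminal: emit the word as-is
        return [symbol]
    expansion = rng.choice(GRAMMAR[symbol])
    words = []
    for part in expansion:
        words.extend(generate(part, rng))
    return words

print(" ".join(generate()))              # e.g. "the curious linguist parses a sentence"
```

Each call picks one expansion for the current symbol and recurses, so a handful of simple rules generates a combinatorially large set of well-formed sentences, which is exactly the "complex from simple" pattern Lennox claims language cannot exhibit.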

I am certainly not saying that these techniques, fields, and theories constitute a complete, or even a nearly complete, reductive account of semantic meaning. Obviously we still have a great deal to learn, and much remains a mystery. What I am saying is that over the past few decades we have made very considerable progress in understanding meaning and how it is instantiated in the brain, and there is ample reason to suppose that such progress will continue. Perhaps the reductionist program will eventually reach its limits and falter, but I don't think the field at present shows any real signs of doing so.


In arguing that the mind cannot be reduced to the brain, and that semantic meaning cannot be explained reductionistically, Lennox made sweeping claims with little or no justification. He ignored the immensely complex philosophical issues, and failed to even mention any thinkers who disagree with him or engage with their arguments. He also ignored the many examples of significant progress that has already been made in linguistics, computer science, psychology, and neuroscience in developing reductive accounts of semantic meaning. Overall, Lennox's argument was unconvincing, and almost completely lacking in substantive content.
