Commonsense and the Embodied Mind


For too long, the Philosophy of Mind has been struggling to get out of the 17th and 19th centuries – represented by the basic mind-body dualism of Descartes and the materialist reductionism which, rooted in 18th century empiricism, flowered in the 19th century, with the growing assumption that science would eventually tell us all we need to know about everything. It was not helped by the 20th century debates about meaning and the argument put forward by the Logical Positivists that empirical verification alone constituted a valid basis for factual claims, and the more recent materialist assumption that the mind is no more than a description of brain activity, an illusion to be dispelled by neuroscience.

A commonsense view of the mind recognizes that it is not the same thing as the brain, although acknowledging that the brain is the motor of mental activity. Nor does it make any sense to try to ‘locate’ the mind elsewhere. The mind is what we know and use as we experience ourselves as embodied human individuals, dependent upon our physical bodies, but enmeshed in a social and cultural world. We know what minds are because we read biographies, make friendships and interact with other people in a way that recognizes them as separate beings but in most ways much like ourselves. We also have a fundamental sense of who we are, not just through our experience of reasoning and reflecting, but in our engagement with and acknowledgement by others.

It is therefore refreshing to see that commonsense is at last breaking into the dead ends of traditional dualism and neuro-determinism. It has been a long time coming, though it was anticipated by such major 20th century thinkers as Husserl, Heidegger and Merleau-Ponty. The key thing – emphasized by Merleau-Ponty – is that the mind is embodied. Neatly summarized by Rachel Paine (in The Philosophers' Magazine, 1st quarter, 2016), the view is expressed as ‘4EA’. The four ‘E’s are:

Embodied – we are living beings within the world.

Embedded – within a social as well as a biological environment.

Enactive – we build up and share our ‘world’ along with others.

Extended – we are not ‘in’ the brain, but our selves extend out into the world.

added to which is the ‘A’…

Affective – we feel ourselves to be within a world with others; we do not observe a world or ourselves in a detached way, simply because we are part of the world.

To me, it makes sense to think of ourselves in this way; whereas both the reductionist approach of neuroscience and the attempt to revive a rather crude form of mind-body dualism seem to be creating problems by straining after theories that simply do not match experienced reality.

At last I sense that the old cognitive log-jam in the Philosophy of Mind is starting to shift, aided, of course, by the very sensible criticisms of neuroscience offered by Raymond Tallis amongst others. Commonsense might eventually prevail.

This is my second blog post on the issue of the mind and neuroscience. If you’re interested in reading more on this topic, visit my website.


Walking on unsafe ground


To me, there is something fascinating and threatening about walking through an area of volcanic activity, as here amidst the ‘Craters of the Moon’ in North Island, New Zealand. Steam hissing from fissures in the earth; bubbling pools of hot mud; the glooping sound as bubbles of mud burst into the air; the smell of sulphur. They are reminders that the habitable world is fragile, and that most of the universe is hostile to what we celebrate as life. We are, as the Buddha put it, like froth on the crest of a wave. We have nothing as of right; no environment – however carefully controlled – can ever give total protection; we tread carefully, recognising that life may not provide quite what we expect of it.

I’ve been reminded of unsafe ground recently, working on a book about two theologians who fought on opposite sides of the same part of the Front in 1916. The ground over which they worked, bringing back the dead and wounded, was as unsafe and unreal as one might possibly imagine: an interlocking series of deep and flooded shell-holes, each deadly for anyone chancing to slip down their glutinous sides.  For Paul Tillich and Pierre Teilhard de Chardin, their conventional world of the pre-war years had been shattered and they spent the rest of their lives walking on what must have seemed very unsafe ground.

How we deal with the fragile and uncertain nature of life is a question for philosophy and a challenge for religion. Mostly we avoid thinking about it, comforted by the familiar. But it does not take much – a medical examination, an accident, an unexpected redundancy or bereavement or, as in today’s news, a horrendous mudslide, to rekindle the sense of our own vulnerability.

Morbid thoughts?  Perhaps, but also realistic ones. Celebrate when you can, but always count yourself lucky; you never know what’s ahead.

Courage, action and philosophy


Morning coffee on arrival at the Front, near Verdun, in 1916. The man on the right of the group is Teilhard de Chardin, the French scientist and religious mystic. A moment of normality and comradeship in the midst of hell. How did they find the courage to go on? And how did it shape Teilhard’s theology? (This image appeared in the Teilhard Album, published by Collins; its provenance and copyright are being sought.)

I’m working on a book about the First World War and the impact it had on two religious thinkers – Paul Tillich and Pierre Teilhard de Chardin – the one a German Protestant chaplain, the other a French Catholic stretcher-bearer. By chance, they found themselves on opposite sides of exactly the same ridge to the west of Verdun in 1916. The experience of the war was to shape their thinking and their lives, until both found themselves living as exiles in New York, the one having been banned from Germany by the Nazis, the other from France by the Jesuit hierarchy.

In exploring their lives, I find that their story touches on many issues in philosophy and theology that have developed over the last century. With just one chapter to go in the penultimate draft, here is the concluding paragraph of Chapter 10, where I have been exploring how men found the courage to enter into the hell of the trenches.

‘It is a feature of the best philosophy and theology that, beyond critical analysis or dogma, it encourages us to think about the values and goals to which we commit ourselves. Its value is illustrated by the courage of those who act, with their eyes open, in a world where the future always appears uncertain but the past, with hindsight, sadly inevitable.’

Does the world make sense? Are we fated to attempt to make sense of it? How, in times of confusion or trauma, do we find the courage to act decisively?  War – and especially the monstrous experience of entering a killing ground across mud and barbed wire – sets a question mark over all our easy philosophical and religious assumptions.

Through Mud and Barbed Wire is scheduled to be self-published (my first attempt at circumventing the conventional publishing route) this autumn.

Neuroscience as an antidote to commonsense? I doubt it!


Advances in neuroscience have given us new insights into the workings of the brain, at least to the extent that the measurement of blood flow suggests which parts of the brain are operating at any one time. When we make a decision, the only physical evidence for how we do it is in terms of brain activity, just as when we go for a walk, the only physical evidence for how we do that is the movement of muscles and limbs, along with the corresponding unconscious brain activity. Such an observation is in accord with a common sense view of the mind, for few today would subscribe to the idea that we have a disembodied self, independently capable of pulling our physical puppet-strings. We think, we walk and we decide what to do – that is how we experience ourselves. We are real and we are embodied.

I am utterly frustrated, therefore, by those who take a further step and try to suggest that the self is nothing other than neural activity, or that our every decision is an illusion, created by neural activity that has taken place prior to our becoming aware of it. They suggest that, because they can detect activity even a fraction of a second before we make a decision, it is not we who have made the decision at all, but our brains, and therefore that we have no more than an illusion of being in charge or of being morally responsible for our actions. At this point, neurodeterminism parts company with common sense. We know what it is to agonise over a decision and then take responsibility for it, and no analysis in terms of neural activity is going to render that process illusory, any more than a Mozart symphony is rendered illusory by being analysed in terms of a sequence of sound waves. Of course there is no symphony without sound waves, nor some extra-terrestrial ghost of Mozart, but no list of frequencies is going to replace what we mean by the symphony or our experience of hearing it.

Neurodeterminism only makes sense if we assume that the human brain is the cause of its own activity and that human social interaction and communication are merely its by-products. Indeed, some enthusiasts for neuroscience mock the common sense view that we have of ourselves as thinking, choosing, creating, conscious beings as a relic of a pre-scientific outlook. If it can’t be measured, it can’t exist!

In fact, I would argue that the relationship between self and brain is exactly the reverse. Communication and social interaction, with the development of signs and language, provided the context within which natural selection favoured the development of mental capacity. Those best able to identify one another, communicate and make good decisions about how to act together were able to survive in a competitive world, and the brain capacity that made possible such thought and communication therefore increased over time. To suggest otherwise requires belief in some external force that appears to have determined that hominids should have ever-increasing cranial capacity. But – if natural selection is a valid way of looking at evolution – it just doesn’t work that way. Change requires context and competition. It is because we flourish as a species if we think, decide and communicate that our brains have developed over time. Pure Darwin.

Notice that it is the reality of countless individuals in their interaction with one another and with their environment that enables this evolution to take place; it provides the context within which increasing brain-power makes sense. But, quite apart from evolution, we also know that the brain is plastic and constantly changing. It responds to our choices and actions. As we learn a new skill, the relevant neural pathways enlarge to reflect that achievement and to facilitate it further. We don’t find that we have a new skill because the neural pathways have changed; they change as we learn the skill!

This popular and ‘reductive’ misconception of neuroscience is not just a matter of putting the cart before the horse; it is having a cart with no horse at all – and that is a recipe for going nowhere, and for having no explanation of how the cart arrived in its present position! Let Darwin come to the rescue of commonsense on this one!

What happens in the brain mirrors and continues to make possible what happens to us as persons and as social agents. We are more than our brains, and even if neuroscience were one day to achieve the impossible and give a full description of the activity of each and every neuron, it would still not explain what consciousness is like, or what it means to be a human being. That may be a common sense view, but I think it is none the worse for that!

For more on my views on The Philosophy of Mind, visit my website.

Naked thoughts?


Another extract from The Philosopher’s Beach Book

George Formby got his cheeky implications wrong. In his ‘Hi-Tiddly-Hi-Ti Island’ (a Pacific beach paradise with fantasy images to rival Gauguin, but without the latter’s sinister side) he offers what appears to be an escalating scale of sexual provocation when he suggests that ‘the girls there are all full of sport’, leading to ‘and wear their frocks a trifle short’, and ending with ‘and some are simply wrapt in thought’. Trouble is, it doesn’t quite work like that – and never did, not even in the 1950s when he recorded the song. No; when it comes to nakedness, glimpses of flesh and wisps of material are always going to be more sexy than straightforward, in-your-face, all-shapes-and-sizes-accepted nakedness.

We’ve already looked at paradise, so what about nakedness? Should the genuine philosopher, straining to appreciate the existential implications of being alive, prefer to be naked on the beach?

Among ‘textiles’, nakedness is associated with sex; but among the naturist fraternity it is just the most beautiful and natural way to be. Feeling the breeze and sun on your body, outside with no clothes on, you feel ‘in’ the world in a way that the clothed cannot. Even under an overcast sky, with a breeze whipping in off the water, there is something wonderful and liberating about displaying your goose bumps to the universe at large.

And clothes are problematic anyway. They make social and existential statements that may or may not reflect the reality of who we are. Uniforms take away our individuality and encourage us to conform to the social role they represent. A major feature of the ingenuity of British teenagers lies in modifying their school uniform in a way that is provocative and rebellious while remaining just about within the letter of the school law. But clothes make statements in so many ways: the hijab and burqa, the veil, the clerical collar, the punk outfit, the studded leather jacket, the judge’s wig. Clothes are eloquent. But are they necessarily honest? Martin Heidegger, in his Being and Time (caution: this is a great book, but not an easy read), argued that we are often tempted to adopt particular social masks rather than being ourselves, to play a role rather than act with authenticity. Clothes play a large part in affirming such masks.

Clothes may also reflect the wearer’s attitude. The extreme example of this is The Dandy – the title also of a new book from Nigel Rodgers. He makes the point that Dandyism is not simply a matter of fastidiousness of dress; it is a way of thinking about oneself, an aloofness of thought and behaviour, an elevated state of mind. At a more mundane level, designer labels perform the same function – they say more about you than flesh ever can.

But there is a negative side to nakedness, for it is associated with death and with poverty. We are encouraged in the New Testament to clothe the naked as a means of giving them basic aid, and none can forget the terrible images of naked bodies piled up in heaps at Auschwitz. To choose to be naked is one thing, to have it imposed is quite another. Those who are naked have no status, they reveal their human vulnerability; to humiliate someone, first remove their clothes.

But ever since Adam and Eve were said to feel shame at their nakedness, and reached for the fig leaves, a minority of religious people have been trying to regain their lost innocence. A major dispute within the Jain community in its early days was whether monks should remain naked (‘sky clad’ was the delightful term used) or accept a simple form of clothing. Some became clothed, but nakedness remained the ideal. Nakedness for the Jain is a sign of renunciation, of absolute simplicity and innocence. Those who have nothing, not even clothes, symbolise the value of non-possessiveness. And that, of course, holds true for the long tradition of naked asceticism within Indian religions. Of course, the temperature helps; naked ascetics do not thrive in polar regions.

Simplicity is one thing, innocence another, and the quest for innocence through nakedness is best exemplified in the Adamites, a 17th century English sect who undressed to worship. Nakedness expressed innocence, absolute equality (and, no, we’re not talking physical features of a personal kind) and open honesty within the community. They saw it as a return to the Garden of Eden, a celebration of what humankind was meant to be, going naked and unashamed before God.

So, as you look about you on your literal or metaphorical beach, consider what the textiles are saying with their clothes, however casual or minimal; from designer gear to distressed jeans, clothes are eloquent at presenting a personal image that may or may not be the truth about the wearer. Clothes categorise people and therefore also divide them. By contrast, there is a natural camaraderie on a naturist beach.

And if philosophy, particularly of the existentialist sort, includes exploring who we are and how we relate to the world and to other people, affirming ourselves in honesty and acting with authenticity, would it not be better if philosophers remained naked? Weather permitting!

For more about The Philosopher’s Beach Book, click here.

Did you bring your camera? (from The Philosopher’s Beach Book)


The joy of the small-format digital camera is that it enables you to capture the moment with ease, edit out the unwanted details in Photoshop, shrink the image to an appropriate size for emailing and then send it to your friends. Even the most fleeting experience can be captured and shared, if not with a camera, at least with a phone. How things have changed since aristocrats, arriving in Venice as part of their grand European tour, would commission a painting to commemorate their visit. Canaletto thrived on wealthy patrons who wanted to capture the moment.

But Canaletto also anticipated (in his own most wonderful style) the curse of the small digital camera, in that both he and it tend to make everything in the scene so sharp. That, of course, is no doubt what he, and those who market cameras today, parade as a virtue; people want sharp pictures. But tear your eyes away from the Canalettos as they hang in the gallery and observe for a moment those who are viewing them. They first stand back, taking in the scene, but almost immediately move forward to scan the painting, enjoying the small tableaus that are to be found in every corner, each detail revealing something of the life of the city. Some, like moles, almost sniff their way across the canvas, luxuriating in fine brushstrokes and acute observation. Take a Canaletto home from your travels and you have not one picture but a whole album. So also the small format camera, with its short focal length lens, can render everything sharp across the frame: the foreground characters are set against an equally sharp background that constantly distracts the eye. You are tempted, as with a Canaletto, to explore detail.

‘Reader View’ on iPhone or iPad?

If you use an iPhone or iPad, I need your advice.

[Screenshot: my site with Reader View off]

When you visit my site on an iPhone, this is what you see. Notice the ‘lines of text’ icon on the left of the address bar. This is  the ‘Reader View’ button.  All of my ‘Notes for Students’ pages, in addition to my home page, are now ‘Reader View compatible’, but do you use Reader View if you visit using Safari on your iPhone or iPad?

I asked three people who seem to be on their iPhones all the time and none of them even knew what Reader View was. Do you? It would make reading my notes much easier.

When you visit a ‘Reader View compatible’ page, there is a little icon representing lines of text to the left of the address bar. Click on that icon and you get the main column of text set out in an appropriate size for your phone or pad, without any of the menus or additional columns. It makes reading chunks of text much easier. If you want to go back to see the whole page, you just click that icon again. This is what you see on my home page (as it is at the time of writing)…

[Screenshot: my home page with Reader View on]

This is significant for anyone designing a web page – and it may change the way I present my pages as I develop my site. Up to now, the standard way to make a site compatible with smaller screens is to make it ‘responsive’ – in other words, it either expands or shrinks to fit the screen available, generally presenting just a vertical menu followed by a text column if you are browsing from a mobile. But the result is that many of the options for those with large screens are not available. You can’t see my comments in the side column, or run down my ‘contents’ list for the page as you work your way through the notes.

Using Reader View, you can glance at the whole page – just using your fingers to enlarge the menus or whatever else you want to see – and then hit ‘Reader View’ to get on with reading the text, returning to the full view whenever you need to.

It would be ideal to design pages with ‘Reader View’ in mind – making sure that the first column just has photos and text, making for an easy read, and keeping everything else to the side for those who want to refer to the menus.

If you have not used it before, give it a try and let me know what you think. If you already use ‘Reader View’ for my notes, how does that work for you?

And is there an equivalent for those using Android?  How do you generally view multi-column pages?

Over the next couple of months – while students are off doing more important things on beaches or up mountains – I shall be updating my site (apologies… it has been much neglected during the writing of three more books), so your advice at this stage would be most welcome.