The Stretch Cycle

Experiential learning – and what other kind of learning is there? – relies at its most basic upon four components: a safe environment from which to launch exploration; the hunger to discover and learn new things; the occasion to experience new things within a measured, controlled space; the opportunity to reflect on those experiences in a way that permits us to incorporate the reflection into the enhanced place from which the adventure began.

All this can be summarised conveniently in what I call The Stretch Cycle, a cycle that turns out to be more or less ubiquitous in all aspects of education.


Take the sequence we follow every day: we start in the comfort of our beds and home; we are motivated to get out of bed and/or leave home to do some business in the rest of the world; that business may occasion some discomfort and should do so if we are learning and making some kind of progress, and we are to some extent out of balance; but there are limits to how much we can experience and deal with, so we eventually make our way back towards our comfort, rebalancing our lives and accommodating our experiences in a process of reflection, to arrive back in our comfort zone but changed and enriched and strengthened. (We will look at where this process can become destructive, negative and stressful later on.)

On a day-to-day basis this cycle is commonplace and unremarkable, but we less commonly identify the same process in other areas of life. For example, every lesson we teach in a school should follow the same sequence, perhaps several times. We take students (and, if we are good teachers, we also take ourselves) from a position of relative comfort through a process of exploration, discovery, challenge, absorption, accommodation and back to a position of comfort but in an enhanced place. Teachers also take themselves through this cycle because if they are not experiencing the learning process themselves, however many times they may have taught a particular topic, they are almost certainly not teaching it well either.

However many times we have taught a topic, we should always teach it as if we have never taught it before.

Where lessons do not follow such a pattern it is hard for experience to become learning. Things can break down in many ways:

  • Some students will be reluctant to leave their comfort zone at all, which represents a motivational dysfunction;
  • Some students will find it unpleasant and threatening to be challenged in a process that unbalances them and requires them to revise what they already know or assume; this is a dysfunction of safety and trust;
  • Some students will not want to experience whatever new ideas or adventures are on offer because they will touch an area of uncertainty or fear in them that they are unwilling to tackle at that time, for example someone who gets to an airport but then finds themself unable to board the plane; this is a dysfunction of capacity;
  • Some students, even when they have left their comfort zone and had new experiences, will be reluctant to reflect deeply on them in a way that converts them from being mere experiences into being learning experiences; this is a dysfunction of growth and change.

Effecting the transitions between each of the four phases of the stretch cycle requires the teacher as facilitator to understand what can enhance and what can inhibit these processes. Too much imbalance, too challenging a discomforting experience, experiential overload that requires too long to be absorbed and accommodated, and too large a distance between one level of comfort and the next can each not only inhibit the effectiveness of the process, but destroy it, even turning it into something destructive and negative.

Experienced teachers know which areas of their syllabus will occasion more or less difficulty and they adjust the speed with which they deliver them accordingly. But too much new material can have a devastating effect on student confidence and make further progress even with easier material very difficult to achieve. Moreover, each student will have different capacities for absorption of new experiences, will start from different positions of comfort (and sometimes will not be at all comfortable), and will be more or less willing to engage with the learning objectives. Some students will accept on trust that the teacher is taking them somewhere they need to go; others will question whether they need to go there at all other than in grudging acceptance of the requirements of the syllabus.

So in addition to managing the four phases of the stretch cycle, we need to deal with an additional four transitions and the things that may enhance or inhibit them. We also need to be able to assess the travel from one level of comfort to the next if we are to make a wise decision about how much new material to cover and how challenging to make the new experiences that need to be absorbed. And we need to do this for each student, insofar as it is possible (which is often not very much).

It is worth considering the equivalent cycle in connection with other processes. What, for example, is the relationship between education and society? What seems clear is that the values and practices of education both influence and are influenced by the political expectations of society and the kinds of citizenship that society expects.


Here education is seen by society as a way of inculcating its values, and that involves a transition from living how we like to living how society wants us to live; society in its turn attempts to regulate what education inculcates, expecting education to embody the same values and ideals that society espouses. So a totalitarian society will regulate its educational institutions to support acceptance of the authority of teachers as a model of acceptance of the authority of the state; a liberal society will encourage education to promote free and critical thinking. Whether in the final analysis education creates society or society creates education is a moot point; probably they emerge together, but what is certain is that a state that becomes aware that its education system is changing the parameters of citizenship – for example from the acceptance to the questioning of authority – will move quickly to impose tighter controls over it.

The tension between education and training adds further complexity to this interdependence. Commercial and industrial interests constantly complain, in a way that has not changed for at least two hundred years since the education of the masses began, that education does not produce sufficiently literate and numerate students. This complaint is based upon the perception that the purpose of education is to provide the employees that commerce and industry need, but this is to confuse education with training. To be prosperous, societies need trained employees, but not necessarily educated employees; in some more totalitarian or authoritarian societies the notion that citizenship involves education would be greeted with horror: an educated population is much more difficult to control than a trained population because training by definition meets the needs of society whereas education does not necessarily do so. Remember Socrates, accused of “corrupting the youth of Athens” by encouraging them to think for themselves.

Unfortunately, matters are not quite that simple, especially in a rapidly-changing world. When education and training could remain unchanged for long periods of time, typically working-lifetimes, it was perfectly possible to train someone in youth and leave them to work without much further training for forty or fifty years. Now that the world is changing so rapidly, this is no longer the case, and we need a more dynamic relationship between education and training in which education advances capacity that is then brought to bear in a process of convergence into training, and training is itself enhanced by a divergent process that turns it back into education. Without this dynamic, training quickly becomes obsolete and education quickly becomes unproductive and irrelevant.


But even this needs further qualification depending on how far from the needs of contemporary training education goes and how far the needs of contemporary training can fall behind education. By and large the population, and education, have been wrong-footed by the rise of the digital revolution, for example, so now the complaint is not just that education does not supply sufficiently numerate and literate adults, but that they are not digitally savvy either (notwithstanding their adeptness at social media).

This training-education cycle maps onto the stretch cycle in an obvious way: education here consists of the challenging processes that unbalance our comfort; training consists of the regular processes that confirm it. Divergence arises from unbalancing ourselves; convergence arises from reflecting on and accommodating new experiences.

When we consider the creative dimension of citizenship we find ourselves with a different cycle, although one that still exhibits the same characteristics.


Here education provides us with the received knowledge, skills and ideas from the past, and imagination puts them together in ways that stretch them and unbalance them in a process that leads to creation. Some creations are more successful than others, and so reflection brings us back to a new educational body of understanding that enhances what can be passed on, having refined it against the successes and failures of the creative process.

Where education fails to generate imaginative activity – where we rest content with the skills and knowledge of the past but make no attempt to extend them to deal with the question how we are to create the future – it does not add to the reservoir of human knowledge and achievement, and so it cannot rise to the challenge of creating the future. So education is forever balancing the demands of training and the challenges of creativity.


So we address the question whether the skills we need to understand the past are sufficient to create the future using a model of education that encompasses both convergent and divergent learning, imaginative and reflective creativity. And all are embraced by The Stretch Cycle.

On Storytelling

Elsewhere we have seen that it is one of the primary functions of education to help us to redefine our meta-narratives, the over-arching stories by means of which we make sense of our lives, and in particular to do so in circumstances where changes in the world render our existing stories inadequate.

Part of my argument was that liberal education – the kind espoused by the International Baccalaureate and by the United World Colleges – basically lacks a compelling meta-narrative. Both organisations think otherwise because they believe in the values Kurt Hahn advocated, and especially making education a force for peace and sustainability. But they are blind to the limitations of peace and sustainability that are exposed by their lack of a meta-narrative that defines the kind of peace and the kind of sustainability for which they strive. Under such circumstances the things they nobly advocate – diversity, tolerance, responsibility, reflection and the ideal of creating a better world – fall foul of the lack of a framework within which it is possible to understand diversity, tolerance, responsibility, reflection or the nature of a better world. While we are sick we want to be healthy just as while we are at war we want to be at peace, but neither a state of health nor a state of peace is capable of generating the conditions which give them meaning unless they are experienced within the structures of a meta-narrative that gives them meaning.

The problem for the IB and for the UWC movement, in other words, is that they are so frightened of becoming in some sense or other quasi-religious on the one hand (although they have become exactly that in the minds of some of their more devoted advocates) and so frightened of being imperialistic on the other (despite their blindness to their own embodiment of the fundamental imperialism of western liberalism) that they deny themselves any capacity to generate or be defined and guided by a meta-narrative that could possibly give them meaning and provide the framework within which diversity, tolerance, responsibility, reflection and the nature of a better world could be defined.

A typical and probably universal characteristic of this kind of deficiency – not of vitamins or nourishment but of persuasive and pervasive intelligible meaning – is that a movement comes to be defined by what it stands against rather than what it stands for, and draws its energy from burning what it hates. The problem, as Lesslie Newbigin once said so powerfully of the Enlightenment, is that because it fuelled its flames on the wood of Christianity, when the fuel was exhausted the fire died. And that is the problem with all movements that define themselves over against something without having the courage to define a meta-narrative that gives shape and meaning to themselves: in the embers of the funeral-pyres that mark their triumph over what they oppose can always be discerned the lack of passion, the ennui, that will ensure their own death.

Although conceived in the context of education, this thesis can equally be applied to other phenomena, and most starkly to the recent tragedy of the United Kingdom and the Pyrrhic victory of the Brexiteers. The point, on the one hand, is that the Brexit campaigners had a story, albeit a completely crazy story (but that a story is crazy does not mean that it will not be popular): the story of taking back control of Britain, of reconstituting a long-lost greatness, of re-establishing something called Britishness over against its dilution with supposed tidal waves of immigrants with allegiances to other cultures and religions. The point, on the other hand, is that the EU campaigners had no such compelling story because Europe itself has no compelling story: it is a coalition of self-interest ruled by bureaucrats none of whom is capable of articulating any kind of meta-narrative over against the crazy meta-narratives being invented by nationalists across the continent that will undoubtedly tear the European Union apart.

On the other side of the Atlantic we can see a different version of the same phenomenon: people would rather hitch their wagon to an incoherent decisiveness however irrational than wallow in the undifferentiated soup of reasoned meaninglessness.

Just as with the IB and the UWC, the EU is terrified both of becoming quasi-religious and of becoming imperialistic, and so it has steadfastly refused to define a meta-narrative, a story, within the structure of which to give itself real meaning, real purpose and real backbone. It was born, two generations removed, from a desire to avoid a repetition of the horrors of war that scarred the twentieth century, but it is in danger of incubating precisely the conditions in which the extremism that leads to war can germinate and grow. In particular, it is creating conditions in which the meta-narratives of the far right or even perhaps the far left, although at the moment that seems less likely, will fill a vacuum and supply populations weakened by prosperity and disenchanted with a wilderness of purposelessness with something to believe in. Such populations will always be susceptible to the kind of platitudinous lies that enabled the Brexiteers to win despite having absolutely no coherent plan for what to do having done so.

Which brings us back to the role of education in defining not just any meta-narrative, but a positive meta-narrative capable of galvanising the resolve of those who wish the world to be a better place in the face of those who exhibit instead only the characteristics of nihilistic opportunism. Without such meta-narratives we have no defence against the forces of darkness that arise from, represent and reinforce the meta-narratives of destruction. And in that case the world is in a more precarious state now than it has been since the rise of National Socialism and the publication of Mein Kampf.

On Nietzsche

Nietzsche presented the world with what is essentially a simple choice: either choose to allow yourselves to be defined by something that you regard as essentially outside of and other than yourselves, or choose to define yourselves. He chose the latter unequivocally and regarded as weak and feeble anyone who chose the former. For Nietzsche, to define oneself is to take command of one’s own humanity, and to take command of one’s own humanity is to become fully human, some semblance at least of the Übermensch.

Unfortunately, there is a paradox here: even the Übermensch must define himself or herself according to some set of parameters, some story, some meta-narrative, even if that narrative has been self-generated. In other words, even der Übermensch must regulate himself in relation to some super-story that gives shape and meaning to life and existence; otherwise, there can be no purpose and no reason to do or say or write anything.

Nietzsche falls foul of the same blind-spot that Descartes could not see through: that one must use language, and one must define one’s use of language in relation to some purpose. Descartes thought language was not a social construct, or perhaps more properly we should say failed to see that it was; Nietzsche fails to see that meaning is similarly a social construct, and therefore the product of a story that we choose to tell and choose to take seriously. There can be no question of a compelling self-sufficient, self-authenticating story that somehow obliges us to live by it and to measure ourselves according to it, for to admit that would be to reintroduce the externally-defined meaning that it was Nietzsche’s entire life-work to repudiate.

Of course, Nietzsche would say and does say that the authenticity of speech comes from the act of speaking and requires no other authentication: for him we each become essentially our own god, whose speaking is creation; Meister Eckhart would have said the same thing centuries earlier had he lived long enough.

But there is a convergence here, a limit-point upon which hinges and changes much of our understanding of the world: the distinction between defining ourselves in relation to something external and something internal is bogus; both amount to the same thing: we have no choice but to choose. Sartre essentially saw this in his notion of bad faith, mauvaise foi: that to pretend we have no choice when we have nothing but choice, nothing but freedom and being condemned to freedom, is the ultimate human self-deception, folly, lie and sin. So even if we pretend to ourselves that we are defined by something external to ourselves, we lie to ourselves: we are defined by what we choose to be defined by. So there is no difference between external and internal; there is only a difference between honesty and self-deception. No story has the power to authenticate itself; only we can authenticate it. There is no difference between a belief in something external as the source of meaning and purpose and belief in something internal as that source: both must be chosen. We can pretend to ourselves that we have not chosen, that something has somehow imposed itself upon us, authenticated itself, justified itself, but it is a lie: we have chosen, one way or the other.

Nietzsche’s paradox then reduces to the question of how we resolve conflict: what do we do when our choices differ? Der Übermensch must face this challenge because it cannot be avoided: to rely upon or appeal to some higher authority, some self-authenticating and definitively authoritative source, is to recreate God in some form as an ultimate external power; so we must find an alternative resolution to conflict, and hope that it is something other than war conceived as an end to war. (Or should we instead embrace war as therefore unavoidable?)

Some choose their own way, but most choose the way others have chosen, a poor substitute for what an earlier generation would have thought of as the ways God has chosen. So the paradox reduces to a response to a simple command: choose! For Nietzsche we are defined by what we choose; there is no right or wrong, no good or bad; there is only the brute facticity of the nature of the world insofar as it can be shaped or determined by our choices. “Right” and “wrong” are labels we give as short-cuts to socially-agreed practices; sometimes we agree with those practices and sometimes we do not; sometimes we need to change society’s views and in so doing change what is called right and wrong; saying that I do or believe or say something because it is right only means that I believe it or that society believes it. What matters is that I say it or deny it; that is its right and mine. The only thing that can prevent me from saying what I wish to say is fear for myself or those I love; but to be silenced by fear is not to acknowledge the right, only to flee in the face of the wrong.

Of course every society tries to absolutise its own rights and wrongs; that is ultimately what it does when it projects them upon some external being it calls its god. But there is all the difference in the world between accepting that something is right or wrong by society’s current lights and allowing that it is right or wrong according to the transcendent perception of some all-knowing, all-seeing eternal timeless being: the difference is that between a temporary conviction or aberration and an everlasting truth.

So what we might call “Nietzsche’s Fork”, the choice between an externally- and internally-defined source of authority, is really a disguised form of an entirely different fork: the choice between integrity and – what? how odd that English, which has words for almost everything, doesn’t really have a word for it! – disingenuousness, perhaps, hypocrisy on another reading, but we probably need a neologism like disintegrity; rather more lamely between authenticity and inauthenticity. We can pretend that we are forced to believe something from an external source, but this is just self-deception: there is only our choice; and the choice is between integrity and disintegrity, hypocrisy or self-deception.

But let us return to the Master …

We are unknown to ourselves, we knowers: and with good reason. We have never looked for ourselves, so how are we ever supposed to find ourselves? How right is the saying: ‘Where your treasure is, there will your heart be also’; our treasure is where the hives of our knowledge are. As born winged-insects and intellectual honey-gatherers we are constantly making for them, concerned at heart with only one thing – to ‘bring something home’. As far as the rest of life is concerned, the so-called ‘experiences’, who of us ever has enough seriousness for them? or enough time? I fear we have never really been ‘with it’ in such matters: our heart is simply not in it and not even our ear! On the contrary, like somebody divinely absent-minded and sunk in his own thoughts who, the twelve strokes of midday having just boomed into his ears, wakes with a start and wonders ‘What hour struck?’, sometimes we, too, afterwards rub our ears and ask, astonished, taken aback, ‘What did we actually experience then?’ or even, ‘Who are we, in fact?’ and afterwards, as I said, we count all twelve reverberating strokes of our experience, of our life, of our being – oh! and lose count . . . We remain strange to ourselves out of necessity, we do not understand ourselves, we must confusedly mistake who we are, the motto ‘everyone is furthest from himself’ applies to us for ever, we are not ‘knowers’ when it comes to ourselves . . .

Nietzsche, On the Genealogy of Morals, Preface §1.

What can one say? We are unknown to ourselves because the honey-gatherer has no interest in anything that has not already been conceived to be of value to the hive, which is to say in anything it has not already been taught to think of as valuable to the hive: no experience is worth reflecting upon that does not immediately appeal to prior knowledge, which is to say, prejudice. This is the paradox of education, that the knowledge we acquire of and from the past can appear too important to those blind to the requirements of the future. The twelve strokes of midday make no impression on us; we wake and ask what time it is, but it is already too late. “The watchman waketh, but in vain” (Psalm 127:1).

I had not read the Genealogy when in the mid-1990s in Princeton I wrote the first (and so-far only) draft of Between Silence and the Word: A Study in Creation. But the opening line echoes Nietzsche’s in the Genealogy: “We do not know ourselves very well; neither, fortunately for us, does God”. We are not “knowers” when it comes to ourselves, says Nietzsche, but neither are we completely ignorant, as he supposes. Fragments of who we are emerge as we act and speak, but other actions and words consist only of lies. This dichotomy, this “fork”, is what defines us: that between integrity and hypocrisy, between owning what we have done and said, and pretending, in our own personal version of Sartre’s “bad faith”, that we were not free, or not responsible, or not informed, or, like Pooh-Bah in The Mikado, “not there”. We are altogether too preoccupied with and blinded by the need to “bring something home” to see the opportunities that might come to us were we to stay out there in the world.

The more we think about it, the more clearly we can see that we are blinded by not one but several (indeed, innumerable) layers of self-deception, and perhaps there is no greater folly than to believe that in having removed only one layer of self-deception we can then see clearly. On the contrary: there is no greater self-deception than to believe that we can measure the extent of our own blindness. We are as a prisoner who, in escaping from his cell, forgets that he has yet to break out of the dungeon and scale the walls of his prison. We are as those chained and compelled to watch forever the shadows in Plato’s cave who, in breaking their chains and escaping from the darkness, are blinded by refracted light and do not see that the cave is situated atop a precipice in a desert wherein can be found no life.

“We are unknown to ourselves, we knowers”: but is it with good reason, or is our reason only another layer of our blindness? Certainty, clarity of thought, can come to us by virtue of a number of strategies: one can simply be to refuse to consider any further the possibility that we might be wrong. But we can always be wrong, and the price we pay for escaping from the inevitability that we might be wrong is to guarantee that we are wrong by making up our minds.

And this is really the point of “We do not know ourselves very well; neither, fortunately for us, does God”: that even God can be wrong, which is to say that whatever metaphor we choose to employ to describe the source of our greatest certainty cannot escape from the possibility that we might be wrong. And any “theology” that imputes the kind of perfection to God that exempts God from being condemned to this kind of uncertainty “by definition” is no more than a projection into the divine of a human hankering after a final certainty. There is no certainty: that is our fate and the source of our life.

Nietzsche started his life as a student of theology and those studies never left him; his writing is completely saturated with metaphors and allusions that arise from the world-view of the very Judaeo-Christian tradition he despised. But in failing to escape from the metaphors of the Judaeo-Christian tradition he also failed to provide us with new metaphors through which to create different narratives in relation to which we can understand and shape our lives. So how do we create new meta-narratives to supersede the worn-out metaphors of the ancient world?

How many layers of self-deception must we remove before we begin to see more clearly, and can we be sure that even the metaphors of the ancient world will not come to seem less time-worn if we can but see them in a new light?

Here is the Master once again, peeling away another layer of the existential onion:

So let us give voice to this new demand: we need a critique of moral values, the value of these values should itself, for once, be examined – and so we need to know about the conditions and circumstances under which the values grew up, developed and changed (morality as result, as symptom, as mask, as tartuffery, as sickness, as misunderstanding; but also morality as cause, remedy, stimulant, inhibition, poison), since we have neither had this knowledge up till now nor even desired it. People have taken the value of these ‘values’ as given, as factual, as beyond all questioning; up till now, nobody has had the remotest doubt or hesitation in placing higher value on ‘the good man’ than on ‘the evil’, higher value in the sense of advancement, benefit and prosperity for man in general (and this includes man’s future). What if the opposite were true? What if a regressive trait lurked in ‘the good man’, likewise a danger, an enticement, a poison, a narcotic, so that the present lived at the expense of the future? Perhaps in more comfort and less danger, but also in a smaller-minded, meaner manner? . . . So that morality itself were to blame if man, as species, never reached his highest potential power and splendour? So that morality itself was the danger of dangers? . . .

Nietzsche, op. cit., Preface §7.

To which one is inclined to add, “provided it is not education that is the danger of dangers” inasmuch as, like morality, it can prevent us from creating a future by mesmerising us with the blindness of the past.

On Painting

I do not paint and I probably cannot paint, but I am fascinated by painting. The only quality that a painting needs, to my mind, is that it compels me to look at it, and continues to compel me to look at it even after some considerable time. It is not necessary to be able to say what qualities give it this power, only that it has them. The same is true of music: music only needs to compel you to listen. And writing needs only to compel you to read. Or speech to listen.

But for me at least there is something special about painting. At one level, the level of physics, of paint and of canvas, it is what it is, timelessly. I remember walking innocently into the Groeningemuseum gallery in Bruges, finding myself confronted by Jan van Eyck’s George van der Paele, and being literally struck dumb by its compelling power. I still cannot look at it without feeling that moment of transcendence again: that here something touches eternity. Yes, it is famously regarded as one of the greatest paintings of the late middle ages (1434-36), and it has been analysed to death in terms of its perspective, use of colour, imagery, proportion, … you name it, it has been said. And then there is the painting, which has the power to strike you dumb as Job was silenced by the final appearance of God.

Of course, not all paintings have such a dramatic effect, and some of the most famous famously disappoint. The trouble is that we are infected by fame in a way that renders us incapable of seeing; it is as if we see only the fame and not the picture. La Joconde or Mona Lisa is perhaps the best example: it is a wonderful painting, but it is hard to see it anew because our expectations are already well beyond realisation. (The same is true of human beauty: it is often, perhaps always, more difficult to see the true beauty of someone who is also, at least according to contemporary taste and therefore ‘fame’, superficially beautiful.) Not so the Botticelli room in the Uffizi in Florence: there is nowhere else like it in the world, with The Birth of Venus on one wall and Primavera on the other (to say nothing of the Uccello on a third). This is not fame: these paintings transcend their fame; to see them is to be inspired and humbled simultaneously and realise that whatever one had been told or seen or heard is nothing in comparison with the real thing.

But there are other experiences that are less dramatic. Tate Modern in London hosted a Mark Rothko exhibition in 2008, and it silenced the cynicism in almost everyone who went to see it. No reproduction in a book or on a website could prepare a viewer for the power of these vast canvases, these floating, mesmerising colours, images that demanded and compelled one’s attention.

Once something has been done there will always be plenty of critics and cynics who will say that anyone could have done it. It isn’t true. Consider writing, which is perhaps an easier way to appreciate the same point: these are just words, familiar words, written in a line. Anyone could have written them in the same sequence. The point is that nobody did. And so it is with painting: anyone can buy paint and a canvas; few of us can produce anything that compels attention with them; almost nobody can create timeless images as van Eyck or Botticelli or Rothko created them.

This asymmetry is one of the most mysterious of phenomena: that once someone has done something, once something has been done, suddenly everyone can see that what they have done could have been done by almost anyone; except that it wasn’t, and nobody did, and in most cases nobody even dreamed of doing what they have done.

Here the writer and the artist are joined in their creative struggles: the blank sheet of paper and the empty canvas allow infinite numbers of possibilities; but they must choose one. And neither knows at the start what they will choose to write or paint; only that they must write and paint, almost as if the writing or the picture were a child whose time had come and who was demanding to be brought into the world with all the pain and danger that accompanies any birth. And the writer and the painter endure in some senses a greater pain even than a mother, for their child is lost to them as soon as it is born. There may be a little editing, a little retouching, a few changes of heart, but essentially it is already gone on a life of its own, to be seen and read and used and abused by a public neither author nor painter may ever meet or know.

But painting is not writing and a painting is not a writing. Paintings command because they are before us in their totality; writing must be experienced sequentially and integrated by the reader in a way that differs sharply from the immediacy with which the painting confronts and commands the viewer. To read is to journey; to view is to arrive. Of course, a reader finishes reading and integrates what has been read, and a viewer sees a painting differently with every viewing, but the experiences are different and the sequences not the same.

A painting can compel us to look at it in a way that a piece of writing cannot compel us to read it. The difference is plain: you cannot hang a novel on a wall and expect someone to admire it; with a painting the opposite is true: you can and must hang it on a wall, but you cannot experience it sequentially in the way that writing must be savoured piece by piece. Book “signings” are a poor and often embarrassing substitute for exhibitions precisely because what the author signs cannot be appreciated in the present, whereas the artist can show everything immediately.

Or so we may think. But the notion of “immediacy” implicit in such a statement equates looking with seeing and seeing with understanding. In fact few great paintings can be taken in at a glance, still less appreciated. They can be looked at, but not seen. And so the viewer must intuit the quality of the painting and feel its compelling power if he or she is to move to buy it or to spend some considerable time admiring it.

It is like falling in love in the true sense of the word: to stand before such a painting is to feel oneself made alive by its promise, the promise that however long we are compelled to look at it, we will never tire of it. And so it is however many times we read great writing, or listen to great music, or enjoy great conversation: we are made alive by the promise of the other.

On Failing

Praise of failure as a necessary part of the path to success has become something of a growth industry in recent years as we have come to appreciate better the learning opportunities that failing affords. That does not make the experience of failure any more palatable, but it can help to contextualise it.

However, there is a mistake that is commonly made here, and I have already to some extent made it even in these four or five lines, which is the assumption that something good must come of everything bad. We like to tell ourselves this to console ourselves, and telling ourselves that “every cloud has a silver lining” may help to soften the blow or alleviate the pain; the trouble is that it isn’t always true. And the reason why that matters is that if we allow ourselves to believe the lie that it is possible to find the positives in anything, we will be particularly distraught on the many occasions when we just can’t. And the reason why sometimes we just can’t is that sometimes there just isn’t anything positive to find.

So let’s throw away this false assumption, this consolation, and face the fact that we will all from time to time fail; let’s equally, without making as much fuss about it, recognise that sometimes we will fail through no fault of our own, just because there is stuff going on out there that is neither rational, justifiable nor fair; and let’s not forget that sometimes we will fail purely and simply because we mess up, get something wrong, don’t prepare thoroughly enough, find ourselves in competition with someone who is better or deemed better by those with the decision-making power, and that sometimes we are just not good enough.

Everybody fails sooner or later. Even those who seem to live lives blessed by the gods fall over eventually. Or so we like to think. But it is just as important to set this consolation aside as well: there is absolutely no point or purpose or consolation to be had from the fact that others fail, too. Allowing ourselves a self-indulgent smirk when someone we deem more fortunate or successful than ourselves comes a cropper is debilitating and demotivating; as a way of thinking about and rationalising our own misfortunes and failures it absolutely sucks. Why? Because it helps us to achieve absolutely nothing: it makes us feel good for something that represents no positive achievement, no step forward, no learning, no advancement whatsoever. So it removes one of our strongest motives to make progress, by affording us satisfaction even though we have achieved nothing.

Let’s face it: sometimes we will fail for no better reason than that the world plays one of its unkind tricks that has no moral or rational justification; we just find ourselves dumped on our backsides and staring up at the stars wondering whether it is worth getting up. And sometimes the reason for our failure will be that we just weren’t good enough; maybe we will never be good enough to achieve something we aspire to. But telling ourselves that only makes sense when it becomes incontrovertibly true. As someone once said, engineering says that the bumble bee can’t fly, but fortunately nobody told the bumble bee. The only absolute certainty is that you can’t succeed if you don’t try. But there is no law of the universe that says that you can’t fail and still succeed except the one we find floating about in our own heads because we allow it room to be there, the one that says “There, I told you so: you’ve never been any good at anything; you’re such a loser!”

There is a horrible phrase that occurs in some ghastly American boy-makes-good baseball movie or other whose title I can’t remember and frankly don’t want to remember: “You show me a good loser, and I’ll show you a loser!” But actually we need to turn this on its head: we all have to be good losers; we all have to lose and smile and affirm ourselves and tell ourselves how well we did even though we lost or failed or fell or metaphorically died, even and perhaps especially when it was because we messed up. Ask a stand-up comic; ask actors who have dried; ask politicians who have been defeated; ask job applicants who have been rejected despite the best CVs and the best experience: the thing that makes the difference is this: those who fall over and remain flat on their faces staring into the mud, those, that is, who believe that failure is forever, subscribe to a self-fulfilling delusion that there is something wrong or unnatural about failing.

And yes, it may be unfair, unjustifiable, irrational, a complete coincidence or an elaborate conspiracy, but telling yourself that will not make you more likely to get up and try again. And hating the person who succeeds where you fail, or worse still envying them their success, will make absolutely no contribution to your own recovery at all. Better to say “Well done! You deserved it! I’ll achieve as much next time.” So let’s change the metaphor: the great merit of falling over and lying flat on your back is that it affords an excellent opportunity to gaze at the stars.

Our greatest glory lies, not in never failing, but in rising every time we fall.

Attributed to Confucius.

On Writing

The art of writing is an exploration of our unknown selves. As Adam Phillips has said, we do not write to say what we believe; we write to discover what we believe. The empty page or the blank screen is an invitation: come fill it with your words and discover who you are. All real writing is about self-exploration. The greatest writing gives the author more pleasure than it can ever give a reader because it tells the author something about himself or herself that would otherwise never have been known.

People who do not write, or do not write much, sometimes imagine that the great writers start with a worked-out plan and then painstakingly execute it to a pre-defined formula. Great writing never works like that. The author no more knows the way the story will unravel than the reader, and part of the pleasure of writing is to have one’s story and one’s characters take over the plot and start to force it to go in directions that the author might never have imagined or even wished. The creative process is about bringing into being that which but for that process would have no being. As Wittgenstein once so powerfully put it, “The first time I knew I believed that was when I heard myself saying it”.

“We do not know ourselves very well. Neither, fortunately for us, does God.” Those two sentences stand at the start of my unpublished work Between Silence and the Word: A Study in Creation that I wrote over three blissful summers in Princeton in the mid ’90s. What that book argues is that if we share anything with God at all, if a God there be, it is that we find out who we are by speaking, by bringing into words what would otherwise remain buried in silence and unknown. And it is not – it most emphatically is not – that but for these words only we would know the secrets of our hearts; it is that but for these words even we would not know the secrets of our hearts. Self-expression is creation and self-discovery and self-realisation; we become more by speaking, writing, acting, doing. As Eberhard Jüngel once put it, Gottes Sein ist im Werden: God’s being is in becoming. And God becomes through speaking as we become through writing.

Of course one must first learn the language of writing. One can no more write well if one can neither spell nor use grammar correctly than one can play an instrument without mastering the techniques required by that instrument. But what makes writing so powerful and evocative is that it becomes a conversation with a part of oneself that would otherwise be completely inaccessible. Or perhaps I should say a part that, for most people, would be quite inaccessible. There may be people who can speak to themselves inwardly in the same way, but for most of us the act of writing sets up an other over against which and through which to argue and discover what one really believes.

And the problem is that what one really believes may be as shocking to oneself as what other people believe is sometimes shocking. On a daily and lifelong basis we are engaged in a battle with ourselves to discover what we can tolerate and what we have a need to suppress; Adam Phillips says the same: life is a battle to manage those aspects of ourselves that we find unacceptable. But acceptable to whom? If I am not who I believe myself to be, who is it that deems me acceptable? If my truer or better self is hidden, sometimes exposed and expressed in writing, who is it that I, the writer, before these revelations, am to myself? What makes us “rather bear those ills we have than fly to others that we know not of”? Why would we? Isn’t our comfort zone the place we prefer to be? But writing is not entirely voluntary; writers are driven to write, to expose what might otherwise remain hidden forever about themselves or about the world.

Inside the house of the mind that writing explores there are false doors, distorting mirrors, convoluted staircases and terrifying dungeons. Sometimes writing, for all its courage, is deceived by those false images and distorting devices, and repeats what it thinks it is obliged to repeat or what it deems safe to repeat rather than explore the spaces behind those deflectors. We come to a door and face a choice whether to open it or turn aside to more familiar and comfortable places, thereby adding another layer to our collusion with self-inflicted blindness.

But there is another kind of distortion to which writers can fall prey, and it is far less easy to see than a door or a reflection in a mirror: sometimes we sense that our writing is developing in a direction that leads to dangerous discomfort and disturbance, and we veer away from it instinctively as a helmsman will avoid breaking waves. But great writing has to venture closer to the source of danger, and its ability to navigate the waters around it is its strength and its lasting power.

The possibility of newness gives rise to interesting philosophical questions, and to some extent those questions cast doubt on that possibility. After all, everything said uses words that are not new, yet they can be combined in ways in which they have never been combined before. Sometimes we are confused in this respect by a common and pernicious fallacy: that if the number of words we have is finite, the number of things we can say with them must also be finite. The analogy with numbers shows that this is untrue: the ten digits 0, 1, …, 9 can be arranged into infinitely many numerals, and their unending decimal expansions form a set so vast that its members cannot even be counted. So if that is true of ten digits, it is all the more true of the hundreds of thousands of words in a language. What follows from this is that programming a computer to generate a complete list of all possible sequences of words is not merely difficult: since the list never ends, no such program could ever finish.
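The arithmetic behind this can be made concrete with a toy sketch in Python (the four-word mini-vocabulary is of course invented for illustration): even a vanishingly small lexicon yields an explosion of distinct sequences, and any enumeration of all finite sequences can only ever print a prefix of an endless list.

```python
from itertools import product

# A hypothetical mini-lexicon of just four words.
vocabulary = ["light", "dark", "word", "silence"]

def count_sequences(vocab_size: int, length: int) -> int:
    """Number of distinct sequences of exactly `length` words."""
    return vocab_size ** length

# With only 4 words, sequences of length 10 already number over a million.
print(count_sequences(len(vocabulary), 10))  # 4**10 = 1048576

# Enumerating all finite sequences in order of length: the comprehension
# below stops at length 2 deliberately, because the full enumeration,
# over all lengths, never terminates.
first_few = [" ".join(seq) for n in (1, 2) for seq in product(vocabulary, repeat=n)]
print(len(first_few))  # 4 + 16 = 20 sequences of lengths 1 and 2
```

With a realistic vocabulary of a few hundred thousand words the exponent is the same but the base is vastly larger, which is the point of the paragraph above: finiteness of vocabulary places no finite bound on what can be said.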

So as we add one word to another, what governs the navigation of our thought through the complex tree of possibilities that can arise? When we start with a blank sheet of paper and a pen, what is the relation between what has not yet been said and what will soon have been said? There is something here about order and disorder, about energy and entropy, about information and knowledge, about the “no longer” and the “not yet”. And in this creation, this marshalling of the molecules of ink from their inchoate reservoir into lines upon a page, irrevocable change occurs: something new enters the world. And what was not written – like the missing lives we will never live – will remain unwritten, for even if and when we return and revisit the idea and write it again, it will not be the same; it will not be as it was when first we drew upon the page.

Picasso famously once said that he decided and discovered what and how to draw by drawing, and the same is true of writing: this blank page has no destiny, no text that is already its own but as-yet unwritten and somehow embedded in it waiting to be discovered; what will occupy the third line cannot be decided until we have first filled the second. Writing is to the page as painting to the canvas and as making love is to the beloved. Someone – I haven’t yet been able to trace it (George Steiner in Real Presences?) – once asked “For which of us has ever made love anew?” To which the only appropriate answer is “Someone who has never made love anew has never made love at all”.

Gabriel García Márquez allegedly once wrote “It is not that as we grow old we cease to fall in love, but that as we cease to fall in love we grow old” (I need to check the exact quote), but more probably wrote “It is not that as we grow old we cease to dream, but that as we cease to dream we grow old”, and so it is with writing and the creation of newness. It is possible, indeed highly likely, that our brains die when we stop thinking new thoughts that arise from and with the creation of new neural pathways, and so, to adapt Márquez, “It is not that as we grow old we cease to think new thoughts and dream new dreams, but that as we cease to think new thoughts and dream new dreams we grow old”.

To be truly in love is to be made alive by the promise of the other (I first wrote “by the presence of the other”, but “promise” is far better and far deeper and has far more scope, and I note the change of mind because the difference is itself worth drawing attention to). A writer is in this sense in love with the blank sheet of paper because it promises something that almost nothing else can provide. And so the pleasure of writing, which for some is little short of a compulsion, lies in the encounter with promise in a way that is almost exactly analogous to the desire for the other that drives us when we are in love: the writer can no more be herself when she does not write than the lover can be himself when separated from the beloved.

Adam Phillips (again from @Brainpickings) with his Jewish background, perhaps unconsciously or even consciously affected and influenced by the Talmudic tradition, observes that it is in conversation that we find ourselves, not in monologue (I think I may already have said this), but I am not sure he is entirely right, for writing is a peculiar kind of monologue, albeit a monologue where the written begins to become the interlocutor. The writing, like but not quite like the speaking, which can so easily be ephemeral, lost and forgotten, assumes its own existence: “What I have written, I have written”, as Pontius Pilate notoriously once said. And there is an interesting point here, for human writing is in this sense as the biblical authors conceived of divine speaking. Unlike human speaking in its ephemeral transience, divine speaking is conceived by them as permanent and irrevocable: “For so shall my word be that goes forth from my mouth; it shall not return to me empty, but shall accomplish that which I purpose, and prosper in the thing for which I sent it” (Isaiah 55:11). And the Hebrew word for “Word”, dabar, is the same word as the Hebrew verb “to drive”; words are “driven out”, “sent” on a purposive course that will achieve an end. When for the biblical authors God speaks, God creates: “And God said, ‘Let there be light!’, and there was light”. This is not a causality of speech and action; speech is action; there is here no distinction between God’s existence as one who speaks and God’s existence as one who creates. And so it is, albeit on a more modest scale, with human writing (and, to a lesser extent, with human speaking).

Words assume a life of their own: they are written or uttered and then taken into a world of language and culture where their significance can no longer be entirely under the control of their author. They can be a gift, but they can as easily be a curse, for what can be understood can also be misunderstood. But like our children, our words also remain to a greater or lesser extent our responsibility, or at least things for which we feel responsibility, and we do not like them to suffer ill-treatment or be abused. So the much-argued issue of the significance of authorial intention is nicely illustrated by the analogy with parenting: yes, our parents made us, but they are not responsible for everything we have become or every purpose we have pursued. “Our words, insofar as they mean anything at all, must mean far more than we can ever know” (Michael Polanyi, Personal Knowledge).

Writing creates an other which is “to us” and “for us” (Martin Buber’s pro me), but it is not an other which can ever remain ours, and so it must also be for and to others. And so the creation of the written or spoken other cannot but involve separation, which is to say that the creation of the written or spoken other must involve the practice that Rilke ascribes to love: “We need in love to practise only this: letting each other go” (Requiem for a Friend).

This “letting go” connects with another aspect of Phillips’s thought: that when we speak or write we must place over there, as an “other”, something that has already gone beyond our control and therefore may as easily challenge or offend us as please us, something that we have created that demands that we come to terms with it. And some of this may so offend us that, as we noted before, we veer away from it or wish we had not written it or had not said it (or even had not made it). And so the written as other becomes like an interlocutor, a conversational partner who will not always say what we would like to be said. So Phillips is right to say that much of life in general, and this is especially so of our creative lives, involves coming to terms with aspects of ourselves that we are uncomfortable with, and even aspects of ourselves that we would rather repress. For him, psychoanalysis offers a possible but not guaranteed path through this morass of uncertainty, but writing can also afford such a remedy.

Writing also involves letting go because if we seek to write while retaining complete control of what is written the process cannot flourish any more than a human relationship can flourish where one party tries to control the other. To move from monologue to dialogue the written as other must be permitted the capacity to go to places and in directions we might not initially either imagine or intend.

Writing that is not creative, which is to say writing that is turning the handle in a predictable and stereotyped way in which what comes later is predictable from what comes first (Ilya Prigogine says something like this in one of his books on complexity theory where he observes that in most books what comes in the second half is entirely predictable from what comes in the first), is better called scribbling (I am looking for a better word; perhaps “drafting” would do, but even that isn’t right; perhaps “dictating” does it because of the obvious double-meaning). To scribble is to make marks mindlessly, marks to which one has no personal or essential connection, marks that might as well not be made for all the difference they make to the great scheme of things. Lawyers are the quintessential scribblers because their writing is intended to eliminate ambiguity and diversity and possibility in order to achieve an entirely controlled and unambiguous text that cannot be misinterpreted.

Ursula K. Le Guin reminds us (also courtesy of @Brainpickings) of the partnership between the writer and the reader, for to all great writing there must correspond great reading, just as to all great speaking there must correspond great listening. Great reading augments the creative process by imagining new worlds that are stimulated by writing but not constrained by it. Just as what is written cannot contain or convey all that might have been written or all that has not been written, so what is imagined by the reader from what is read cannot be known to the writer, and so the question whether the author “intended” the written to be understood as it is by the reader is just the wrong question: far better to ask how much has been understood by countless readers from what one author has written and be content with that as a measure of writing’s greatness.

“He allowed himself to be swayed by his conviction that human beings are not born once and for all on the day their mothers give birth to them, but that life obliges them over and over again to give birth to themselves.”

Gabriel García Márquez, Love in the Time of Cholera

We are reborn in writing because we become another to ourselves, but that for some is a reason not to write, not to think, not to speak; it is as if we are afraid that in becoming we lose our being rather than discover and enhance it; it is as if in change we identify an enemy rather than the friend that is the source and essence of life. Some ask about continuity of self in a world of change and flux where everything is being renewed, but presence is less important than promise, and history than future. What we have been liberated from in the transformative creation that dissolves the past in order to create the future is any sense that we are obliged to be in our present what we may have been in our past. In the renewal that creates discontinuity we are set free.

And this is what is really entailed in “the promise of the other” that makes us alive, whether the other be writing, speaking, a person or a lover: that in communion with this other we are transformed and remade, and in the hope that has no expectation we are reborn.

And as an afterthought which is another promise rather than an ending, it is important to recognise that, just as there is no finite limit to what can be written, we must live with the terrible realisation that what we can write but do not write may never be written, rather as the lives we can live but do not live will never be lived.

End of Term – I

At first, nobody noticed. There had been too many false dawns, broken promises, unrealised dreams. What was heralded as something that would change everything had changed almost nothing, except for the worse. Learning had long been confused with memorising, but now knowledge had dissolved into information, skill into technique, quality into quantity, and value into price. It was not reasonable to criticise anyone for cynicism because there was so much to be cynical about. Everything, almost, except the thing that changed education, and in changing education, changed everything.

What everyone expected and supposed was that to program a computer – actually rather a lot of computers – to be good at something one first needed to be good at it oneself. Good enough at least to know what it was that one was trying to program. And then it turned out that one didn’t. All one needed to be able to do – and of course this “all” is profoundly ironic – was to know how to enable the computer to learn. And once the computer could learn, it didn’t matter how good the programmers were; all that mattered was how good the computer’s learning had become.

Nobody noticed because everyone was so focused on the result and its implications for human thought that they missed the real point. AlphaGo had beaten one of the best Go players in the world by four games to one despite the fact that none of the people who programmed it was remotely good at the game. What they were good at was enabling the computer to learn, and programming what was needed to play the game, even if they were themselves incapable of playing it very well.

And that was really the message: all we need to be able to do is to discover what needs to be done in order to be good at something and then program it. Nobody needs to be able to do it; nobody even needs to be able to understand how the computer does it; we just need to know how to tell the computer to do it.

They quickly found, or actually, to be honest, stole from a little-known twentieth-century philosopher of science, a name for it: The Domain of Sophistication; the place, or rather the territory, where computers start to be able to do things better than we can do them and where ideas we understood at the outset become so complex that we can no longer understand them. Not just by calculating faster, or memorising more, but by being able to operate in territory of great complexity better than any human being can or ever could operate. And the thing that really delivered the killer punch was that these systems were interactive: they not only learned from their human interactions; they learned even better from their interactions with themselves. They learned how to learn and how to teach learning by teaching themselves how to learn.

So, at first, nobody noticed. The odd, unpredictable move was viewed as a curiosity, an interesting intellectual challenge, almost as we once entertained ourselves by watching physically deformed people in freak shows: how odd; how weird; how strange; how amusing; how inhuman. But if the system could learn to play one of the most sophisticated of games by teaching itself and playing itself, why could it not learn to do anything else that its creators thought sophisticated? In particular, why could something that could teach itself to learn by learning not also teach itself to teach by teaching? Why could not a system that could learn to play a game better than any human being by playing human beings and itself also learn how to teach a subject better than any human being by teaching human beings and itself?

So, at first, nobody noticed.