Regression Analysis Redefined

Introduction: Regression Analysis Redefined

            We live in the time of the world regression. How do we then respond to such a world, wherein what appears to have become the most plausible sensibility is the least sensible, the most probable the least possible? In a word, that time can run backward, that history can fold in on itself, and that culture can regress into, and unto, its childhood, even yet its primordial inexistence. And though such an event in some of its symptoms can be measured statistically, this volume of studies suggests that we redefine the analytic of regression. To do so, we might use the following rubrics:

            1. That regression is present any time one desires to base reality upon fantasy, and has thereby lost the ability to distinguish between the two.

            2. That regression is present any time nostalgia is in the ascendant, no matter the cultural thematics or personalist narratives involved.

            3. That regression is present whenever childhood, a mere phase of life, is exalted as both innocent and yet also wise at once.

            4. That regression is present if and when youth and its experiences, once again, a brief phase in human existence, are negatively sanctioned, limited, mocked, or bullied.

            5. And most importantly, when history is itself understood as the handmaiden of myth, and thus its auto-teleology is aborted, regression is the source of this inauthenticity.

            The exaltation of childhood, the disdaining of youth, the disbelief in reality through ‘anti-science’, the dismantling of history through ideology, the inability to discriminate between fantasy and the world as it is – perhaps observed most popularly in our entertainment fictions and, more darkly, in our moral panics – and, most insipidly, the inveigling of a marketeering that plays upon our personal desires to re-attain lost youth or yet childhood in the form of generational nostalgia in fashion, popular music, and, once again emerging from the shadows, in mores and norms: all of this is the source of the world-crisis today. ‘I want to return our education system to about 1930’, says Dennis Prager, the founder of PragerU, a private sector purveyor of fantasist school curricula, ‘but without all the bad stuff’. Which would be? The only thing that comes to mind that would perhaps be better would be textual literacy – more people read books a century ago than today; but at the same time we must ask, what kind of books? – but this too would have to be oriented to other more contemporary forms, such as that digital, in order for it to be salutary to literacy in general. This is but a single example of hundreds globally, which would include populist and nationalist movements in politics, ethnic-based religious affiliations and churches, charter schools based upon ethnicity feigned or historical, government policies that pander to the neuroses of otherwise absent parents, and so on. Let us recast as questions each of the five points listed above, which designate types of regressive presence.

            1. How can one distinguish between what is real and what is non-real?

            The irreal is a third form of general human experience which occurs only when something ‘irruptive’, an event or a presence which breaks into waking reality as if one suddenly and momentarily dreamed awake, makes itself known. These kinds of experiences are rare and we, in our modernity, no longer interpret them in the traditional mode of the visionary or the religious-inspired presence. That they continue to occur sporadically is certainly of interest, given that the cultural matrix which might be seen to have generated them in the first place is long lost. This phenomenological concept can serve us in a different manner today: anything that tends to hitch itself up to the authentically irruptive but is not itself irreal is fantasy, pure and simple. The difference between Israel and Zion is a current example of a political attempt to base a modern nation-state on a legendary construct. Similar historical examples abound: Victorian England’s smittenness with Arthurian Britain, given ideological, that is, unhistorical, literacy by Malory; the Second and Third Reichs’ genuflections directed to Nordic mythos, given artistic transcendence, but equally non-historical this time, by Wagner. Is there now a Zionist composer or children’s author about?

            2. Why do our desires for youthfulness take on a nostalgic formula?

            Mostly for market purposes, childhood and youth are extended far beyond their phase-of-life appropriateness. It may well be that the reappearance of neoconservative or even neo-fascist norms regarding child-raising and the curtailing of youthful desire and wonder are the result of simple economics: the market targeting the only people with non-responsible disposable income, coupled with the general lack of control over anything but consuming by which children and youth are characterized. In this sense, youth consumption is no different from anorexia: a simple attempt to exert agency in an otherwise adult world. Even if this is the case, however, such regressions are no less than evil, as they strike at the heart of what makes youth profound. Hazlitt, writing at the time when ‘youth’ itself was a novel concept, is correct when he states that youth’s very lack of experience not only makes it a unique period of human existence, but also gives it its patent sense of wonder, wanderlust, desireful passion, and naïve compassion all at once. From our first love to our first knowing brush with death, such events appear once again to be irruptive, so filled with wonder are they. The very absence of the human irreal in mature being prompts a regressive desire to ‘return’ to our salad days, green not so much with envy but with a desperate melancholic anxiety.

            3 and 4. How is it possible that the absence of experience generates wisdom?

            It isn’t. If experience can sometimes harden our biases, turning us into ironic bigots, it also has the power to banish prejudice, and for all time. Akin to the jaded hypostasis that suffering makes one insightful – for the artist this may be true in some cases; for the rest of us, suffering produces primarily misery and, secondarily, resentment, even ressentiment – the lack of temporally adjudicated biographical experience in a life is, writ small, the lack of historical consciousness in a culture. What adults are reacting to in the child-mind is a naivety that appears to make suffering blissful; if only we could manage to bracket the world so easily! And what we are reacting to in the mind of the youth is the ability to dare to question the world as it is. Now this second aspect of the illusion of the absence of experience is an excellent tutor, if only we adults would take it up with all due seriousness. Instead, we seek to limit the questions of youth just as we limit youth’s ability to express its phase-of-life’s essential characters: wonder, desire, passion, romance, and most importantly, its rebellion against authority. If we merely took the last facet of the youthful gem and lived it, leaving the other more phantasmagorically inclined imagery behind us where it belongs, we would be by far the better for it.

            5. How do we attain an ‘effective historical consciousness’?

            The phrase is Gadamer’s, and points to a kind of working pragmatics that, in its ‘fusion of horizons,’ generates Phronesis, or practical wisdom. One simple way to approach a sophisticated state of being is to recall to ourselves the how-to skills associated with a specific material task, such as fixing something around the house or cooking a meal, a project in the workplace or helping a child with their studies. These are aspects of a consciousness directed ad hoc, or to some specific task or object. They are also the stuff of Weber’s ‘rational action directed towards a finite goal’. Finite goal agency is, in turn, a manner of thinking about the self: I am an actor who needs to get from here to there – what do I need to do to accomplish this movement? The process by which I do so, whatever its content, is a temporal one, but one that belies its own historicity due to its intense focus on what is at hand. Nevertheless, time has passed, and a small part of one’s own personal history has been acted out. Now think of species-being in History as a form of agentive action directed to specific, if various, series of goals. This can not only provide some inspiration in anxious times, when once again, the mythic apocalypse is being contrived as an overlay upon very material conflicts regarding resources and their distribution, it can also give us, as individuals, the sense that what we do matters within the wider cultural history of which we are a part.

            Finally, the redefined regression analysis (RRA) differs from demythology in that it cannot take place through art. It is an aspect of critical and reflective thought alone. Its effect may be equally disillusionary, but its means must stay analytic, never adopting either the allegorical or the agenda narrative. It also differs from a deontology, which is to be seen more as another effect therefrom than as a source method. Demythology is an anti-transcendentalist critique that is perhaps best performed in art, deontology similarly in philosophy, but RRA in the sciences, and specifically in the human sciences, their critical allies.

            This volume of essays, both popular and scholarly, is dedicated to redefining the analysis of regression in all its forms. It does so at a time when we are witnessing a worldwide regression, the psyche of which is desperate, anxious, and fearful, all of the very weakest aspects of our shared human character. Instead of giving in to those base impulses, grasp rather the more noble cast of compassionate critique, both in your own life and in the life of the world itself.

            The following two articles first appeared in edited form in peer-reviewed journals which are now defunct. They are reprinted here in their original state for the first time.

            2011v    ‘On Distinguishing Between Criticism and Critique in the Light of Historical Consciousness’, in Journal of Arts and Culture, Volume 2, #3, Nov. 2011, pp. 71-78. ISSN 0976-9862

            2012v    ‘Is there Hermeneutic Authenticity in Pedagogical Praxis?’, in Journal of Education and General Studies, Volume 1, #8, July 2012, pp. 180-187. ISSN 2277-0984

Te Deum Tedium

Te Deum Tedium (Godforsaken Talk)

            The objective factors in the ascendancy of neo-fascism in our times are well known: the demographics of biopower, the two-income earning family as a general necessity, the marginalization of male labor, the public appearance of alternative family and community due to technological advances in logistics and the military, and so on. But none of these, either alone or together, should be enough to convince a human being that their world is coming to an end. Change, certainly, but not apocalypse. So if the more macro and historical factors have been exhausted without yielding an explanation at this human level, what other variables might be present that would turn this specifically difficult trick?

            I am going to suggest here that there is one such stressor in particular, which in turn contributes to an existential anxiety: the kind of concern that leads a person to believe in the coming void, and not merely become frustrated that the world has left one behind. For the Calvinists, it was their earthly or material conditions which were taken to be a sign that they themselves were to be saved, that they were of the elect. The Reformation had brought with it a renewed interest in the sense that one could not know of one’s fate until and unless the day of judgment arrived. One’s Christian destiny was predetermined, true enough, but one lived on in ignorance of the final result of this prejudgment. Originally adopted and thence adapted from the Egyptian scales of judgment, with Horus asking the shade if it had struck a balance between its potential and its acts in life – the few who punched above their ethical weight class were honored in the afterlife, but woe to those who did not rise even to their own gifts, no matter how slight – the Christian version of evaluation eventually did not need to ask, per se; rather, one was simply informed of one’s record upon death. So it was for a person, and thence for a culture: the apocalypse, a personal judgment writ large and an historical one completing the narrative in the ‘end of all history’, was to evaluate an entire species’ accomplishments and its deficits alike. To be found wanting as a soul within the arc of the Oversoul was to determine one’s final fate.

            And for all eternity. How could there then be a more stimulating motive to make one’s earthly existence into a paragon of the good? The Reformation sectarians who invented the Protestant Work Ethic could in no way find fulfilling the idea that one could not, in principle, know anything at all about one’s destiny. Just as there had been signs of God’s presence in the world, the narrative of the Medieval period suggestive in the sense of the authorship, the creation, of that world as being autographed by a divine hand, so there must be similar signage which pointed to, in an individuated sense this time, a greater meaning for one’s life. This sensibility, originally regionally Dutch alone, rapidly spread through the Anabaptists and into North America with the Puritans and, by the early 18th century, the Baptists themselves. It should be recalled that this American church, now associated with the historic South and Mid-West, had its origins squarely in the Yankee mindset, with the very first Baptist church still standing in Providence. This is not insignificant, for it was the unique amalgam of faith and works which animates much Christian orientation in today’s America that could only have been forged in the revitalized region of Puritanism and its work ethic. Indeed, part of the Salem effect, perhaps its largest part, was the sense that those who worked through uncanny means were simply cheaters to the general ethic, whilst most others slaved away in the duller light of the day to day.

            So then as now. The alternative genders, the wealthy urban professionals, the intellectuals, the leisure and vice of the inheritors and the like, all these are the contemporary witches. They have attained such numbers and power that surely this too is a sign, this time of the end times; the day of judgment must be nigh. Puritanism may have lost its purity, but it has maintained both its faith and its works, or, better, it has fostered a faith in works and, at the same time, a working faith. And if divine judgment seems distant and even a trifle aloof in our modernity, earthly judgment can itself provide a sign, a way to winnow those who might yet be saved from those who have given up salvation for the salacious salivations of this world alone. In order to make that evaluation, of course, the remaining Puritans have to wrest power from those accursed, as well as from those who may well have cursed themselves: those who were never Christian, certainly, but also those who had been, but then had let their mortal desires overtake their better sense of self. This is the political aspect of sectarianism: a way to prove that evaluation still exists.

            But in order to vouchsafe its efficacy one must go a step further, and it is this that I will suggest is the motivating leitmotif of Evangelicalism today. If for half a millennium Protestants could rest something of their living soul, their conscience, upon the pillow of earthly wealth and success and thus, correspondingly, upon a relative lack of material impoverishment and failure, the loss of these worldly props would prompt a crisis, not just in culture, but rather in existence. If one loses the signs of one’s elect status, this is no mean historical shift. It is not a question of demographics, technology, economics or politics, but rather one of ontology itself. I am no longer amongst the elect, or I am in danger of losing that status. There could be nothing more devastating, to the point of its appearance as a patent and potent evil in one’s life: the very worst thing that could ever even be imagined. I mock them not, but am rather attempting to convey some of the emotion that must be present in any heart which has witnessed the very promise and premise of its eternal existence suddenly vanish.

            Any one of us can surely empathize with such a tragedy. The loss of a loved one would come the closest, but even here, while it calls into question one’s own life and one’s future, one indeed lives on, even perhaps with the solace that we might at some point ‘meet again’, as the old song has it. But to be told, even in indirect terms, that one’s eternity is now annulled, that one is at least as liable to find oneself in hell as in heaven, overtakes even the most intimate of losses. So too then does the kind of mourning involved overtake any personal grief. For such faithful, no matter that this intuitive belief has been muted by both the day to day and its distractions as well as the simple passage of time blunting the edge of its soteriological suasion, such a loss has to be reckoned with before the time in question, if there could be any possibility that salvation was still an option.

            Enter leaders who are either cynical opportunists, narcissists, or perhaps even a few authentically concernful persons who, like their needy followers, also see their souls awry, and thus the faithful must risk choosing a political Anti-Christ of Revelation in order to make a meaningful choice at all. This only adds to their burden, which the rest of us may witness if we care to do so; tragic, solemn, and desperate as it is. For at its deepest level, sectarianism and neo-fascism in today’s society rest upon the sense that those involved within its ambulatory aura are trying to save themselves, and for all time. In doing so, they have asked, nay, begged us to join them. That we refuse to do so, that we indeed mock them instead, is only the further proof that we are the damned after all, and that God would forgive His faithful even our outright murders, since we had the same choice they themselves did, and rejected it out of hand.

            And so this is our current scene: a large minority of the once-elect searching with all due diligence and desire, desperation and doxa for any possible sign that their eternal souls will not suffer the dismal dirge of a devil’s drag. That the rest of us are blind to both the metaphysics, and much more importantly, the social reality of this ultimate motivation, truly is a sign that we are in for a coming hell on earth.

            G.V. Loewen is the author of 58 books in ethics, education, religion, aesthetics, social theory and health, as well as fiction. He was professor of the interdisciplinary human sciences for over two decades.

The Good-in-Itself versus the Good-for-Oneself

The Good-in-itself versus the Good-for-Oneself (an excursus in grounded meta-ethics)

            The term ‘meta-ethics’ first presents an inherent contradiction. Ethics is, by definition, about the space of action in the world. It is grounded only in the sense that it occurs in medias res, on the ground, whilst running along. It is perhaps typical of analytic philosophy to make the goal an ‘in itself’ and then the means to it quite contextual. The leap of faith is simple enough: can we establish a principle – in this case, about the essence of morality – based upon all that is unprincipled in itself? This faith does not, perhaps ironically, include a moral judgment, for ‘unprincipled’ is meant to suggest only that which is relative to condition. Think of Durkheim’s understanding of ‘deviant’, which was highly statistical, and in which the normative was equally seen as simply that which most people in fact do, or believe. It is the same, and even more so, for his kindred concept of ‘pathological’, which is deliberately contrasted with the stark and even jarring term ‘normal’, so disdained today. The social fact, to again borrow from the same thinker, that everyone is ‘normal somewhere’, belies without entirely betraying the sense that our shared condition is not experienced in an identity relation with itself. But if one seeks a principle, one would either have to assume that there are actions which lend themselves to a choate whole, as in a structure made of differing but corresponding elements, or that at some point, with the presence of enough of a certain kind of action or sets of actions, a Gestalt belatedly arrives which can be thenceforth named ‘morality’, or some other like category.

            As a hermeneutic thinker, I am cautious about such claims. Ethics is never by itself, or thus an ‘in itself’. It is quite unlike physics in these regards: physics, though certainly not acting in the proverbial ‘closed system’, is nevertheless highly predictable in its correlative effects, and can be, with great aplomb, analytically worked out backwards, as it were, to specific precipitates and even ‘causes’. Now it is not that ethics so named is random, entirely spontaneous, or improvised on the spot every time. Clearly, there is some relationship between the action of the good and the good in principle, and it is thus a matter of discovering more exactly what that connection may be, how it has altered itself over time, and how living human beings perceive it. But unlike the formal study of meta-ethics, what I will suggest here is that we begin quite inductively and without any principled goal in mind, by attempting to understand a Verstehen of Verständnis, if you will, which ultimately returns to a Selbstverständnis. This ‘selfhood’ is not, in the end, about oneself, but rather about the self as it is currently experienced and acted out by our contemporaries in the world as it is. Insofar as it is neither overly personal nor overtly subjective, this selfhood should contain within it at least a semblance of a principle.

            Let us begin then by taking a familiar example in which the contrast between action and order may be glimpsed. It is very often the case, in teaching undergraduates of any age or possessed of any credo, that they imagine that their personal experience can by itself generate facts, or that what they have known is the whole of social reality. Long ago, when I was still myself possessed of a sense of experiential superiority, I responded to a hapless young student who, reacting to the statistic about youthful marriage which, at the time, had it that 85% of marriages entered into before the age of 26 ended in divorce, objected that her parents had been high school sweethearts, meeting one another at age 14, married at 18, and were yet together some decades later. Congratulate them, I replied, they are part of the 15%. This generated buckets of belly-laughs from the rest of the class and I am sure the poor thing was humiliated. That I only felt some minor bad conscience about it years later suggests that ethics, at least of the pedagogic variety, had been conspicuously absent in this specific case.

            At the same time, such an event served the wider case quite well, as it pointed out, rather pointedly, that one’s own experiences were not enough to understand fully the human condition. Now, if we take the same sensibility to ethics, we might argue that since one’s own actions in the world are not representative of any kind of morality which might be known by other means, they are also not the fullest expressions thereof. What is meant by this latter remark? If one does not know the good, one can neither be a representative of it nor express it through one’s actions. This is a moral statement, and as such, it evaluates the value of the principle, not by its enactment, but rather, to borrow from Foucault, by virtue of its ‘enactmental complex’. The status of morals in society is one of the salient variables for analytic philosophy’s idea of what meta-ethics might be. The term ‘status’ implies both its state and its value, what it is in itself and how we esteem it, even if we do not precisely know what it is, or what else it may be, unto itself. The old-hat problem of our perception of the world comes immediately to mind, but morality, as Durkheim for one stringently reminded us, is not of ‘this’ world at all. It is social alone, for, as he puts it, ‘there is no other moral order apart from society’. Before Vico, one could read ‘not of this world’ as implying an otherworld; in the premodern sense, one of divinity but also one of spirit. With the Enlightenment, ‘spirit’ disconnected itself from the divine, became ‘objective’, as it were, and whether dialectical in nature or more simply existential, the one thing it no longer was, was essential; spirit had become its own deep deontology.

            Now however this may strike us, one day as liberating, the next, alienating, and either way, certainly as a foreground to our favorite modernist expression, that of ‘freedom’, the deontology of morality did little enough to thenceforth favor ethics as any kind of ‘in itself’. Ethics, in our day, is more often thought of in the context of business, the white collar professions, medicine, or the law. We do not regularly hear of ethics as a stand-alone discourse, and when I tell people that I am an ‘ethicist’, or that is one of my philosophical areas of study, they always ask, ‘do you mean professional ethics or business ethics or…’ and so on. It is clear enough that ethics, recently divorced from morality, has accomplished what Aristotle began only through the sleight of hand of popular language in use. And while morality is itself shunned as both dreadfully old-fashioned as well as avoided since it is perceived as a prime candidate for interpersonal conflict, ethics has almost vanished entirely from the ‘open space of the public’. And if one gets a bemused response to being an ‘ethicist’, just think of how people might react if one introduced oneself as a ‘moralist’!

            The foregoing should not be seen as a digression. Just as personal experience does not in itself comprehend the world, the actions based upon each of our individuated experiences cannot in fact construct an ethics, let alone a morality. In our day, the quest for principles is either a mirror for the ventures in technique and technology which seek indefinite perfection – research in stem cells, in artificial intelligence, in extraterrestrial contact, in cybernetics or cyber-organicity including portable or downloadable-uploadable consciousness – or it is simply another one of the same type. The moral objection to each and any of these is that they are the latter-day Babel, and can thus only be the products of an arrogant but still mortal mind which seeks to be as a God already and always is. One could ask the question, in return, ‘are moral objections always in themselves moral?’ but this would take us beyond the scope of this brief commentary. Instead, though not in lieu of, let me suggest that meta-ethics as framed by analytic thought is fraught with a problem similar in likeness to that of making something perfect, or at least, superior to what it had been. One, we are not sure if morality is in fact superior to ethics: the timeless quality of moral principles is obviated by history. History slays morality just as disbelief murders Godhead. We know from both our personal experience and our more worldly discourses thereof, that what is good for one goose is not necessarily good for the next, let alone the ganders, diverse in themselves.  Two, what is it about the character of ethical thought, and the witness of ethical action, that necessarily requires us to hitch it up to a more static system of principles? 
            The foregoing should not be seen as a digression. Just as personal experience does not in itself comprehend the world, the actions based upon each of our individuated experiences cannot in fact construct an ethics, let alone a morality. In our day, the quest for principles is either a mirror for the ventures in technique and technology which seek indefinite perfection – research in stem cells, in artificial intelligence, in extraterrestrial contact, in cybernetics or cyber-organicity including portable or downloadable-uploadable consciousness – or it is simply another one of the same type. The moral objection to each and any of these is that they are the latter-day Babel, and can thus only be the products of an arrogant but still mortal mind which seeks to be as a God already and always is. One could ask the question, in return, ‘are moral objections always in themselves moral?’, but this would take us beyond the scope of this brief commentary. Instead, though not in lieu of it, let me suggest that meta-ethics as framed by analytic thought is fraught with a problem similar to that of making something perfect, or at least superior to what it had been. One, we are not sure if morality is in fact superior to ethics: the timeless quality of moral principles is obviated by history. History slays morality just as disbelief murders Godhead. We know from both our personal experience and our more worldly discourses thereof that what is good for one goose is not necessarily good for the next, let alone the ganders, diverse in themselves. Two, what is it about the character of ethical thought, and the witness of ethical action, that necessarily requires us to hitch it up to a more static system of principles? We have already stated that ethics is not about the random, and if we take our proverbial chances in the world each day, we do so with the prior knowledge that almost all others are very much aware of doing the same thing, and thus as a society we are alert to a too egregious over-acting, and that of all kinds. Durkheim’s sense of the source of morality again comes to the fore: here, morality is understood as a working resource which expresses its historical essence through the action of ethics. As such, there is nothing to be gained by esteeming morality for its own sake or even contrasting it with the discourse of ethics in a manner that exalts its status.

            The virtuous must be decided not by a grounding, but rather on the grounds of all that which is at first needful of some kind of adjustment. How we access the frames by which we make ethical decisions is certainly of interest, but I suspect that most people do not refer to principles in so doing. Instead, they rely on what has worked for them in the past, their ‘previous prejudices’, which can appear to them as if they were a set of principles in spite of what we have just observed about the essential parochiality of personal experience. In a word, prejudice is not principle. Certainly, morality attempts, and mostly in good faith I imagine, to overcome the individuated quality of merely biographical self-understanding. All the more so is it not present to mind when we act. It is not that morality is utterly moribund, a relic alongside other ontotheological constructs fit only for the museum of thought and never for the world as it is, but we would do better to work on a more effective discourse concerning ethics and, specifically, ethical action, with perhaps the only pseudo-ideal present being the irruptive figure of the neighbor. This anti-socius works against the moral order of society and thus momentarily stands outside of ethical discourse as well. But the action of the neighbor serves as an expression of the essence of our truly shared condition and as such, reminds us of the radical authenticity that must be present in order for ethics to have any reality at all.


The ‘S’ Word

The ‘S’ Word (Hint: this is no fecal matter)

            It is perhaps ironic, given its most human and personal quality, that sex is the topic that most people find the most difficult to speak about. If death is the unfathomable topic, and religion as well as politics are the ones most likely to lead to conflict, it is sex that all agree upon in an oddly related manner. It is quite fathomable, and simply speaking about it, at least, is unlikely to lead to conflict per se. And yet it remains taboo in all agrarian and post-agrarian social organizations. Any investigation into the reasons for this perduring sensibility would have to be anthropological in scope, and I am not equipped, so to speak, to perform such an analysis. What I can do, however, is ask a number of questions about the current version of the taboo, to see how it employs the same principle as the Durkheimian sacred in order to traverse world-scale historical and cultural boundaries and thus maintain its place as the almost ominous elephant in our shared rooming house of society.

            Before the agrarian epoch, as may be observed in the remaining ethnographic contexts over the past century and a half, eating and sex were equivalents. In a great many such horticultural organizations, even the word for both acts is exactly the same. This notion that consumption was somehow related to consummation suggested a nascent mysticism. The Christian cult, in its first florescence, held that the Agape, the ‘love-feast’, was a mimesis of the union between Man and God, between the material and the mystical, and it is very likely that in pre-canonical Christianity, a strong erotic component was present, as it was in almost all of its Levantine competitors. The two key factors in Christianity’s ultimate success, well before it became imperially legal and thence ultimately official, were that, unlike its competition, it admitted both males and females equally, and that it did not hitch itself up to a specific laboring class. Though, as Weber notes, it was the artisanal class of the Roman Empire that was first attracted to its ideas – which reminds us that Nietzsche’s comment that Christianity was a ‘slave religion’ must be taken metaphorically only, however else one may take it – these radically new sensibilities regarding ethics quickly spread. The artisans, used to working for aristocracy and thus witnessing both its splendor and leisure without ever being able to partake in them themselves, were the most obvious first catch of Pauline pastoralism, and it is rather this other historical point that lies in Nietzsche’s favor when he also characterizes Christianity as a religion based upon ressentiment.

            The Agape would likely scandalize today’s evangelicals, but it also served to promote the anti-gay stance that gradually became associated with the new instantiation of Abrahamic social relations. Evangels tend to want to have things both ways, as it were, but in this case, one must accept the usual bimodal eros untethered in order to maintain the boundary against same-sex unions and associated activities. Repressing the former only yields the presence of the latter, as the early Christians were aware but which our own versions of them desire to deny. Yet as early as Hellenistic times, as Foucault relates in his celebrated if regrettably truncated ‘History of Sexuality’, tracts and texts abounded exhorting people to abandon same-sexuality in favor of what was to become the dominant act. The Hellenes’ arguments were, by our standards, often earthy, laying out in the plainest language, for example, the advantages of womanhood as a sexual being in terms of there being willingly present a full three apertures, to stay civil, as opposed to the mere two available in men. That these are pre-Christian positions is instructive; the sense that large-scale Near Eastern civilizations had an immense demographic and hence military advantage over those Mediterranean was already very clear. With the Alexandrian empire at its height, these same early Europeans had at first hand come up against the great hordes of Asia Minor and well beyond, before Alexander himself wisely chose to stake his uttermost outpost in southern Afghanistan and proceed no further.

            Gay unions do not reproduce, and this basic biological fact contributed mightily to the sense that such activities would, in the end, result in the loss of culture as a whole. It is this sense that became a true sensibility, and may be seen today not only inside evangelical circles. There is yet a widespread notion that any departure from doxic sexuality is dangerous, even promoting of a crisis. That there are differences along more picayune lines – that sexual activity should be the sole purview of formal marriage, say, rather than of youth and its attendant ‘fornication’ – does nothing to obviate the more general agreement that in order for a culture to preserve itself, it must bear its own children. This last can be emphasized as a rider to the previous because anti-gay sentiment is often linked up with that anti-immigration. This too has both an irony and an authenticity to it: these ‘others’ still know how to breed! That is the essence of their threat to us. Even if we can scoff at the outlandish claims of Moscow and Tehran that ‘there are no homosexuals in our country’, what cannot be sniffed at is that however many are present abroad, their combined presence has no effect on the ability of these other cultures to think to dominate, and on a global scale. The clearest sign of this emerging dominance lies of course in the market, and hence in economic power, rather than that military, but the latter is coming along as well. In an age where large standing armies are obsolete, the link between demography and power is much more indirect than it was for the Alexandrians and their immediate successors. Even so, such a link is not entirely vacant. All of this contributes to the anti-gay line, and its historical bases, if not necessarily its contemporary concerns, are eminently factual.

            Aside from these more objective factors, there remains the residue of mystagogic variables which proclaim that informal sexuality, let alone same-sexuality, constitutes a betrayal of the covenant Man has with God. If Adam’s rib is more truly his upstanding member, then we allow ourselves to perform the singular act of mystical union, a kind of personal Agape, if you will, and with the same goal as had the early cultists, and this aside from admitting a great variety of obvious japes; the ‘ribbed’ condom, say. This singular goal, to reproduce the new ethics and spread the glad tidings – the one is the action and the other the resultant act – had as its resistance the previously dominant same-sexuality which was seen in Greek and Roman cultures as both a form of mentorship and of simple pleasureful leisure, as well as the sensibility that the Gods were themselves equally capable of desire, even lust. Zeus’ intimate and unceasing peregrinations were well known, but it cannot be more clear that the pre-Christian Mediterranean also made less transparent distinctions amongst love, lust, desire, and pleasure which became more rigid in the following epoch. As Nietzsche cleverly put it ‘Christianity gave Eros a poison to drink; he did not die of it, to be sure, but degenerated into vice.’

            And the scope of what constituted vice thus became much wider, so that by the time of the Troubadours and the incipience of romantic love, the chief draw of this new feeling was not so much that contrasting-sexuality be abandoned, but rather that its very formality had gotten in the way of its authentic celebration and thus union. And though it is certainly the case that in some arranged or pseudo-arranged marriages, the latter the ideal of evangelicals, love can arise after the fact of formal union, most Westerners agree that love precedes marriage and must do so if the formal socially sanctioned relationship is to have any authenticity and perdurance itself. And it is not that vociferous Christians entirely disagree with this notion either, it is just that one should refrain from materially consummating such love in sexual union pre-maritally. This mutual chastity, it is argued, can only heighten desire, and thus and thence the longing of the lovers for one another. It is, perhaps oddly, an ethic borrowed not from the history of religion but rather from that of poetry and the courtships of the medieval romances.

            We have briefly seen that there are a number of related and unrelated factors at work which backdrop the ongoing taboo surrounding sex in our society. Foucault himself warned us that we should talk less about sex, perhaps contra Salt-N-Pepa and a myriad of others, and actually do more of it. This Dionysian cast was not at all absent for the early Christians, so we are left to explicate, insofar as we can given the vicissitudes of history more generally, how we have moved ourselves from doing to talking, from openness to secrecy, from blitheness to neurosis, about sex. If we can do so with candor, it may be the case that we begin to see more clearly the relationships between technology and same-sexuality, between demographics and political economy, between morality and ethics, and of the utmost, those tensions interior to the intimacies between actual human partners, no matter their ‘orientations’. If we yet seek a unio mystica between and amongst human beings, if we continue to imagine that reality may be reenchanted through desire and even lust, then by working through what can only ever be a partial talk therapy of the phenomenology of sexuality, we may find ourselves much closer to understanding our culture’s essential ideal: that of love itself.

            G.V. Loewen is the author of 58 books in ethics, education, health, aesthetics and social theory, as well as fiction. He was professor of the interdisciplinary human sciences for over two decades.

A Brief History of Real Time

A Brief History of Real Time (very brief, very real)

            Hawking’s well known A Brief History of Time provides for us a cosmology, though really more a biography than a history of the career of the scientific understanding of temporality, time in the abstract and as an abstraction. It is astonishing that the human mind, historical through and through and with no conception of being without time in both senses – not having it as well as being outside of it – could conceptualize cosmic time in such an intimate fashion. Even so, it is of limited usefulness to mortal consciousness to be able to contemplate the infinite in any form or by any formula. Heidegger, in his much earlier History of the Concept of Time, the prolegomena to his masterwork, Being and Time, provides a more down to earth set of proposals. That which is closest to us, the history within which we are compelled by the happenstance of birth to live and in-dwell, is generally that which is least well known to us, in part, due to the drama of the cosmic, which science sets out to script in a comprehensible manner. The perhaps bastard child of religion, science seeks to take hold in the same territory as its somewhat absent parent once sought to explain. This is its truer limit: that modernity accepts the fruits of science, its applied innovations, and rejects its methods, as Sagan aptly stated. In doing so, we find ourselves inhabiting a kind of divided time; one half shot through with superstition, the other shot up with technology.

            The evidence of this temporal schism is all around us. The creationist drives his SUV, the cybernetician attends church, the pilgrim hops on a commercial jet, the hermit is a virtual globetrotter, the atheist worships nature, the most avid of empirical religions. It is a challenge, in our day, to know what time actually is, hence the projection of cosmological narrative in an effort to overtake mere history, human and thus passing. While Hawking’s book was a best-seller – and who among us is a physicist, after all? – Heidegger’s book remained unpublished for many years after it was written in 1925. It is fair to say that no one reads philosophy either – who among us is a philosopher, after all – but there is more to it. In the effort to assuage our anxious doubt about the exact time in which we live, we have reconstructed temporality as a mere fact of nature; something to be observed and explained, rather than witnessed and understood.

            On the one hand, it is a case of ‘plus ça change’, as is said. Science is indeed new wine but its bottles are ancient. The life-blood of the redeemer is no longer poured from them, but something of the sort remains as an aftertaste, just as God Himself maintains an afterlife as we speak. Real time, that which humans in-dwell, begins only with history itself. Before this, time had no serious meaning. In the original human groups, the only division of labor was that of age and thus experience. The legend of the Fall contains the recognition that man and woman are different, and we are ashamed of this fact, not because of the difference, but rather due to its self-discovery. For in social contract cultures, sex and gender were oddly irrelevant to social reproduction. The realization that the beginning of surplus altered the very fabric of what it meant to be a social animal is certainly a source of shame, and we bear that stigma to this day, and the more so. Men and women were distanciated from one another from this point onward, further dividing the human sense of time. As production gradually outgrew reproduction, these divisions only multiplied, if not exactly in the same sense as the edict given voice by the eviction.

            For we were not so much expelled from a place but rather from a time which was non-time, ahistorical and not even prehistory, for the latter term implies that history has already begun and thus we are able to recognize what came beforehand. This kind of timeless time is yet better thought of as non-historical, and thus also as non-human. The social contract is the real world expression of Eden, and so it was seen by the Enlightenment thinkers, though for moral reasons and not those temporal. In any case, temporality today consists of a dual flight from the shame of being so divided. On the one hand, we delude ourselves that we still have some connection with our origins in the primordial primavera of the garden, by prevaricating the mythos associated with non-history. In doing so, we ignore the fact that there exists an essential and qualitative break between our beings and the Being made choate in primordial non-time, preferring instead to imagine that Edenic life was simply present on the horizon of a remote antiquity, which is nevertheless somehow measurable. Biblical chronology is only the most literal example of this delusion. On the other hand, we demand of ourselves a Neuzeit, to borrow from Koselleck, which promotes a much too recent chasm, that associated with the revolutions of the eighteenth century.

            This divided temporal selfhood is experienced as subjection to the either/or of the vapid ‘culture’ wars, and the misuse, or rather, abuse, of terms such as ‘ideology’, ‘value’, and ‘truth’. That one is ‘traditional’, that another is ‘contemporary’, that one is reactionary and another progressive, that one is conservative or liberal, fascist or anti-fascist, when in reality all exist in the insularity of their own self-imposed fascist reactions to all things which might offer an ounce of perspective. Yet if the divided temporality were not present as a phenomenological structure, as a foundation for schismosis of institutional and political life the both, such symptoms as ‘value’ conflict would be at most agues rather than the plagues they have become. In the effort to avoid living in our own time as it is, both the grand and the grandiose shift their process of self-validation onto the culture of our long-dead cousins or those others who have not yet lived at all.

            Heidegger stressed the need to live in real time. The transition from Mythos to Logos, the most important process in the history of consciousness, could only be made real by experiencing life as an ongoingness, in its fullest presence, and called to conscience by that which is nearest to me. If mine ownmost death occupies such a salient rhetorical place in Heidegger, such is it that mine ownmost life, in the meanwhile, receives its most encouraging support from every less studied page of the great thinker’s works. Truly more of an ethics than an ontology, Being and Time recognizes the ‘andness’ of these two conceptions as essential humanity. Our very beings are historical, nothing more but also nothing less. Therefore, it is the Logos which is the fitting metaphysics for any historical being, and not the Mythos. That we remain so entertained by the latter is also a symptom of what he refers to as ‘entanglement’. Instead, I have reiterated that the new mythology is demythology, nothing more, but also nothing less. The Neuzeit actually begins some 2500+ years ago, and even if it has not quite yet come to its fullest expression, the process of demythology is as a force of nature, equally cosmic, but thus far wholly human and hence providing us with the only certain relevancy in our otherwise divided times.

            G.V. Loewen is the author of 57 books in ethics, education, aesthetics, health and social theory, as well as fiction. He was professor of the interdisciplinary human sciences for over two decades.

The Dreams of the Perpetrators

The Dreams of the Perpetrators (A deathless Arcadia in Ego)

            “We do not know the dreams of the enthusiasts, the victors…” Koselleck intones in his Holocaust study ‘Terror and Dream’. And we are immediately reminded of the deepest of connections; that all humans, no matter their worldly merits or deficits, sleep and dream, as Whitman declaimed. The content of such dreams must differ, depending upon the dreamer, we might assuage ourselves. But it is not so much the character which is at stake but rather the conditions in which I might find myself, now sleeping peacefully, now fitfully, now lethargic and thence insomniacal. “…they dreamed as well, but hardly anyone knows how the content of their dreams related to the visions of those that were crushed by the temporary victors.” Koselleck finishes. If the murderer sleeps and dreams as well as does his victim, what then characterizes the difference which we feel must be present?

            In the dreamscape, I am not free to master the otherhood of the self. How often have I seen the looks of reproach, even revulsion, on the faces of the young women I encounter in this dream or that. As often those willing, lustful, playful. Why does the lover turn to the one who hates? Mostly, we do not ask such questions, preferring to dwell on the ‘how’ of it all, which in such cases might be able to be explicated by an advanced neuroscience. And what drives the compunction of my dreaming self, along with its compulsions, so that dreaming content is so often conflicted, even if the act of dreaming and its attendant Traumdeutung occur precisely so I can ‘process’ the real-time conflicts of the day to day? I once hauled a girl in full Blytonesque school kit into a specific room to beat her. I equally foreswore having sex with a young woman who, after we kissed somewhat diffidently, told me she ‘could not do this’. I ‘decided’ to assault another in an office but her look of absolute disgust stopped me cold. I was myself accosted by many, but since I am male, I took it in my supposedly so-masculine stride and allowed ‘nature’ to take its burlesque course. All these were but dreams, at once the playing out of suppressed desires, so we are told, but at the same time, themselves hermeneutic commentaries on those same desires. And why are there scenes which we know so well that are never replicated in the dreamscape? I have never been a death camp guard, that I recall. I have never been the pope. I have only once or twice been emplaced as another gender. I seem to be stuck on myself, in myself.

            It is commonplace to acknowledge a kind of gatekeeping mechanism between one’s desires and one’s sociality. This ‘superego’ style of boundary maintenance keeps the extremities of the ‘id’ from becoming too real in the world of both the ego and its fellows. Koselleck notes that “It is a characteristic common to all camp dreams that the actual terror could no longer be dreamed. Phantasy of horror was here surpassed by actuality.” When indeed the extremes of human intent turn to action in the world, as they do all too often, it appears, we no longer have the ability to separate the unreal from reality. The very unreality of human horror is suggestive that those who perpetrate it have themselves lost the means of dreaming it. What can no longer be processed by the unconscious aspect of my mind breaks forth into the open space of other minds. Is it a mere case of bad manners, wherein we can no longer keep our hands to ourselves, as it were? A case of being a child in an adult’s body, having a childish mind but the capabilities and resources of a mature being? Certainly, cognitively disabled persons who are violent manifest this kind of admixture, attacking their caregivers with willing wantonness and yet somehow also knowing that they are, for whatever rationale, exempt from any serious consequence, unlike the rest of us. There are, however, darker disabilities than those which prevent maturational growth. Such a list would include the lack of compassion, absence of empathy, ignorance of otherness, and the like, which we observe as being regularly present in much politics of our time. There seem to be few enough public figures who do not express such disabilities, at least in their rhetoric. Anyone who stakes their own claim to existence through annulling the other’s equal claim seems the willing vehicle for a desire so vain as to be bereft of self-recognition. 
There is a certain solipsism in political life which strides bodily over the claims of others to exist at all.

            Are these then some of the monstrous forms that the ‘dream of reason’ has produced for us moderns? Have we been regressed to the inferior forms of pre-modernity, recreating a world in which the other is automatically an enemy, and at best, a passingly dormant threat? Is youth the assassin of adulthood, or is it the other way round? In my vain desire to be ever youthful, my dreams speak to me not so much of desire alone, but of slaying the process of aging before it can itself do me in. I no longer want to possess the young female; I want to be her. To live again from the point of optimal departure, to have not a care for health and fitness, to be the envy of all who are called to witness my outward beauty, to have the market pander to my every whim. Surely there is a link between the industry-contrived charisma of a Taylor Swift and the very much self-constructed charisma of an Adolf Hitler. Practicing endlessly in front of the mirror, the latter, cast into an autonomic obloquy by his social anxiety, could not rely on himself to stand and deliver in any spontaneous manner. This contraption, so calculated yet never cool with itself, unlike Swift’s, is mimicked in the death camps. The rationalized precision of mass murder makes the desireful sprees of splayed-open recent nightmares look amateurish. The terrorist of today can only ever dream of being the Fourth Reich. As well the politician?

            Yet the chief character of human reason is that it does not dream. Reason is the tool of the waking mind alone, conscious of itself without becoming self-conscious. This may be a key: that we are capable of compassion only in forgetting the self. When we proffer our desires unto others with the expectancy they will comply, we are lost. The parent who demands obedient children is the living archetype of this fascist fantasy. The lawmaker who expresses only his own druthers is their child, along with the barking coach, the banal teacher, the masturbating school administrator, the self-serving civil servant, the insolent official. Even the best of reason, held within its mortal coil, does not necessarily escape its own monsters. Aristotle’s exclusion of the female, his xenophobic hatred of barbarians, Russell’s disdain of women, Foucault’s reckless abandon. And then what of my own dreams? We know that violent sexual imagery, a leitmotif of Wagnerian proportions in the libidinal world, is so commonplace within the dreamscape as to not excite comment. Yes, analytically, perhaps. The psychoanalyst’s guild, a new priesthood born at the height of modernity but actually practicing a postmodern art, one which we have of late suppressed, perhaps inevitably but certainly ironically, allots our confessional and thence allows our confession. If unreason is demonic, then reason has become the new religion, its ‘spirit’, if you will, the ghost in our shared mechanization; what we might have called ‘conscience’ if it weren’t for our collective disenchantment.

            Mostly, we are jaded with ourselves. How can it be that my mere dreams are more exciting, and assuredly also more immoral, than my waking life? Would I trade the one for the other? It has been done before: “The compulsion to de-realize oneself in order to become paralyzed at the final stage of existence led also to an inversion of temporal experience. Past, present, and future ceased to be a framework for orienting behavior.” Koselleck is aware that both memory and anticipation, dual phenomenological forces that act as a bulwark against absolute desire, have no place in the camp, just as dreams are themselves taken outside of human and historical time, instituting their own vapid irreality in its stead. Oddly, there are living spaces which seek to mimic such primordial experiences, including the casino and the church service, the vacation and the spectacle. It is as if we remain possessed, not by the collective unconscious and its memory of the visionary, the creation of all things and their destruction as well, but rather the pressing absence of vision in our current and very much conscious condition. Is it also then the case, that along with compassion, we must bid final farewell to futurity itself?

            In dreaming desire, there are no real consequences. In order to make such fantasies real, we must disarm and thence dismiss no less than history along with biography. The perpetrators dream awake. This is how they can commit the impassioned acts of horror upon the others who now appear to them as mere projections, in their way or submissive, it matters not. It is not a case of decorum managing desire, or even compassion trumping the passions. It is rather that the vision of primordial Man has been reconstructed, and at cost, in the picayune and rationalized manner which modernity requires of it. No less costly than the first murder, the most recent one is yet less authentic since it is so seldom necessary. I am no longer an endangered species. In my fullest presence, I have become the one who endangers, and mine ownmost death can only be owned in life by the killing of others. This is the unreasoned monstrosity of a faux-phenomenological phantasy: that there are no unwilling victims, that I no longer dream alone.

            G.V. Loewen is the author of 57 books in ethics, education, aesthetics, health and social theory, as well as fiction. He was professor of the interdisciplinary human sciences for over two decades.

Bring Me the ‘Head’ of Sergio Garcia

Bring me the ‘Head’ of Sergio Garcia (anomie and anonymity)

            Samuel Peckinpah’s 1974 low-budget pulp film ‘Bring me the head of Alfredo Garcia’ has recently become something of a cult classic. Pop culture references abound, perhaps inevitably, but truest to pedigree was the 1991 aborted attempt by the band Iron Prostate to record a song entitled ‘Bring me the head of Jerry Garcia’. Given the much more famous band the named person led, perhaps such an act would have been met with gratitude. Of course, we are not demanding the actual head of the professional golfer, but rather what is inside it. Why so? Simply because he is an excellent example of someone who is a well enough known figure to have engendered a persona for himself. He, as with any athlete or entertainer, is someone who is both immediately recognizable and yet completely anonymous. More than this, he has publicly, earlier in his career, given fans and followers, detractors and disdainers alike, much fodder to believe that Garcia himself believes in another world which both influences this one, and yet is utterly impassive in its influence; in short, a manner in which to assuage or avoid anomie.

            Anomie is, in a word, subjective alienation. Garcia’s plaint, voiced especially after close shaves in major golf championships, that the ‘golf gods’ were out to get him and such-like, suggested that he was feeling anomic about his vocation, picked on, singled out. How many of us have had the same feeling, less publicly perhaps, but even so? Our very anonymity promotes it. ‘Who cares about us?’, we might well ask, on our way to work Monday mornings, along with ‘Another week killed’, on each corresponding Friday evening, but temporary relief aside, what is the meaning of it all? Emile Durkheim coined the term in his 1897 analysis of suicide, rather appropriately, and stated that according to known data at the time, ‘anomic suicide’ was the most common form in European society. This contrasted with other forms, such as ‘altruistic’, where one sacrifices oneself for the group, or ‘egoistic’, where one is certain that a quick ‘goodbye cruel world’ is the best response to unfulfilled desires. Given that only about twelve percent of suicides leave notes, it is possible that this category may be under-reported, but however that may be, Durkheim found that his analysis had also generated an empty-set category, which I have elsewhere named ‘fatalistic’ to balance out his conceptions of the other three. A fatalistic suicide would be the kind to be found in an Ibsen play, for instance, or Romeo and Juliet, but in real life, they are exceedingly rare. The very opposite of the anomic, the fatalistic would occur if there were too many structures and strictures in place, prohibiting agency.

            But anomie, by contrast, implies the lack of community and thus responsibility or obligation in one’s life. It is alienation sourced in the absence of salient structures, but not in Marx’s directly structural sense where the person, especially the worker, is subjected to forces that are much more abstract, such as competition with other workers or the commodified reduction of the human being into his ‘labor power’. The anomic person loses her personhood through a lack of the looking-glass selfhood which tells her who she is to others and for others. It is a kind of externally enforced solipsism, and many young people today suffer from anomie, which is yet the leading cause of suicide for this demographic in our own time. The lack of connection, the absence of affection, the abyss of meaningfulness, all combine to threaten our sense of purpose in life. But when one does have that sense, and everything else is also in place, one can still be thwarted. And this is where the ‘gods’, golf fans or no, come in.

            Vindicated and no doubt also relieved, Garcia belatedly won the Masters in 2017. Ever since, he’s cut a different figure than in his youth. He rolls with it, even when the rock itself isn’t rolling. In a word, he has become a mature being, understanding that life is sometimes simply about ‘that’s life’, a song sung by Sinatra on behalf of everyone else who lives or who has ever lived. There has been no more talk of the ‘golf gods’, for example. All this is inherently public, and no one should claim to actually know who the professional athlete is. But personae also change over time, just as do social fact data like the suicide rate. In traditional social organizations, there was only one kind of suicide, that undertaken by the person on behalf of the collective, the one ceding ultimate moral precedence to the many who, collectively, were thought of as one thing in any case. But in modernity, the situation became more complex, and Durkheim was the first to investigate it. His was the first social science study to make use of statistics, and his method-breaking analytics ultimately put forth a much more important idea than his list of categories of suicide: that of the ‘social fact’. Four years earlier, he had stated deadpan that there was ‘no other moral order than that of society’. Therefore, ‘moral’ facts, as they had been called by J.S. Mill and others, were innately social in nature, echoing Marx and Engels’ 1846 epigram, ‘consciousness itself is a social product’. But The German Ideology did not actually appear in print until 1932, so Durkheim had independently come to this similar conclusion through an inductive study, unlike Marx.

            Induction, Sherlock Holmes’ actual method pace Conan Doyle’s terminological error, proceeds from observations netted into facts of experience. ‘I can’t make bricks without clay’, Holmes testily editorializes to Watson. Quite so; Garcia’s experience in major championships provided a bounty of clay for him to reason, perhaps gratuitously but even so, that there was another force at work, denying him his due results. For those of you who have no interest in golf per se, a bumper sticker I once saw puts it best: ‘As if life weren’t hard enough, I play golf.’ Yet I don’t think we can judge whatever was in the golfer’s mind too severely, because as stated, we have to ask ourselves how many times we might have come to a similar conclusion through this guise of induction; ‘I should have had it, made it, got it, owned it, (or him or her et al), so why didn’t I?’ The social facts that may have leaned up against our individual desires are sometimes obscure, potentially requiring, on the one hand, a full-blown scientific analysis of the kind in which Durkheim excelled. On the other, we are also sometimes loath to admit that we, as individuals, most often do not have ‘what it takes’ to beat the world at its own game. And by the world, we mean of course the very social reality in which we are enveloped from birth until death.

            It is an aspect of adolescent angst to begin to discover this swathe of social facts, mostly ranged against us given the presence of so many others in this our shared world. Before this, parents and others try to give children almost anything they might self-interestedly desire and demand. But this kind of thing can’t go on forever. Indeed, the egotist as adult who keeps faith only in his desire is at risk for suicide, because the world cannot – or in this person’s mind, will not – provide as parents once may have done. So, we hear of ‘common sense’ parenting techniques that suggest weaning one’s child away from directly met demands at the earliest age possible. Yet this is not at all a sense derived from custom and innate sensibility, but rather from none other than induction, a process of logic and reflective reasoning. Adults know from experience that most of life in mass society will be anonymous, fraught with the ever-present problem of anomie. Children must be gradually, and gently, introduced to such a world, one to be their own just as it is already ours. It is not simply a matter of a parent not wishing their child to ‘make the same mistakes’, but rather a more abstract understanding that, while it cannot pinpoint exact contexts or moments in a life that will end in frustration and even loss, nevertheless knows that such moments will occur, other things being equal. One’s singular will, no matter how assertive and confident, cannot match itself each time to that of the world.

            For the one who yet imagines themselves larger than life, suicide is one outcome. But homicide may well be another, such acts perpetrated by those who imagine that their will really is, after all, stronger than the world’s and that they will prove this to be so by murdering the very others who have the unmitigated gall to stand in their way, for whatever reason. This yet darker path has no apparent limits, given historical precedents such as the Holocaust and other genocides. Seen in this wider light, the suicide is by far the more ethical person; as an egotist, he wants to die, as an altruist, she dies for those who remain, but as an ‘anomist’, this person truly sees no way out and in this, he is in ethical error. Still, such a mistaken act is far better than murder, and in this, induction has also played a role. In its heartfelt attempt to overcome anonymity, observation and experience combine to tell the anomic person not so much of his impending fate, but rather the way life could, even should, be lived by human beings. It is, therefore, not that the alienated individual in our time feels nothing, but rather that she understands precisely what she lacks yet does not see a way of getting it. The fates, as it were, are ranged against her. One of the lesser attributes of calling out the golf gods is, of course, that golf is a trivial pursuit in and of itself, while social anomie is a very serious condition. Garcia has no doubt realized this over time, no differently than the rest of us are compelled to do. Entertainment ‘culture’ provides for us formulaic salves, whereupon we, equally formulaically, react with a ‘you think you’ve got problems!’ after viewing this or that melodrama, fact or fancy. Just so, the artificial relief of contemporary anomie is a major function of entertainment, sport or otherwise. Part voyeuristic, part ressentiment-inducing, but mostly anomie-reducing, popular culture sashays ever onward simply because the relations of production in our version of social organization do not themselves change.

            But what is the upshot of all of this? One, we have the means within our own heads to utilize a logical process through which we gain a better self-understanding of both alienation and anonymity at the level of the person. Two, anomie is not something that is inevitable, as Durkheim himself noted. Three, anonymity has its upside, for who among us, aside from the true narcissist, would want to be known by all? Given the fashionable anxiety about the erosion of privacy in modern life, anonymity may well become a kind of precious value in the day to day. Four, we know by now that ‘fate’, though still not of our individual making all of the time, is nonetheless historically conjured, and does not appear in an irruptive manner. There are no golf gods after all. Just bad shots and sometimes some bad luck as well. Given all of this, taking our own shots when best we can will go some way in inducing within our experiences the sensibility that not only can we pick our battles, we don’t need to care all that much what others think or do not think of us, as long as we support their right to try to live a human life. Our relative anonymity vouchsafes the first, our inductive sense testifies to the second. If there is anything somewhat fishy about Durkheim’s analytic, it may be that anomie is itself a kind of nostalgic category, perhaps assuming that the collective life, or even having generous and compassionate community around one at all times, is the better way to live. If so, the authentically modern person must embrace, along with her newfound sense of radical freedom, the understanding that she is by her superior character able to in fact be alone in life, and that this is, for such a person, the new godhead after all.

            G.V. Loewen is the author of 56 books in ethics, education, health, social theory and aesthetics, as well as fiction. He was professor of the interdisciplinary human sciences for over two decades.

Understanding and Comprehension

Verstehen und Verständnis (Understanding and Comprehension)

            In my life I am confronted with both history and world. Though neither is of my own making, they nevertheless dominate my existence, and could be said both to originate and to predestine it. History includes the tradition, all that which is customary in the sense of Hexis, which I must learn in order to function as a viable being in the social world-as-it-is. But history also includes all that has come to be known as ‘cultural capital’; the very ‘stuff’ of the human career, from ancient archaeology through to contemporary popular culture. It is an open question how much of this capital must be possessed in order to maintain social viability. To each is granted his specialties, perhaps, and in a complex organic social organization wherein each of us plays multiple social roles and thus wears many hats, the daily discourse of informal interaction is muted to the point where Verstehen, or ‘interpretive understanding’, is rarely called upon. This bracketing of interpretation is significant due not only to its being so by design, but also, and as a result of this, that each of us is left to assume that the other knows what she is about, as well as knowing what we ourselves meant by this or that interaction. In a word, we favor Deutung over Verstehen in everyday life.

            Since our very sanity is at stake, both in terms of our self-perception and that of the others, Deutung, or ‘interpreted meaning’ must indeed carry the day to day upon its presumptive shoulders. Most of us presume one another to already and always understand what is called for, and thus what is also called forth, in a great variety of social contexts. But since all of these are learned, and mostly in our childhood and youth, we cannot be certain that either we ourselves or any other person has fully comprehended every possible social context and thus knows how to act to a tee in each. The ‘social’ person used to be casually equated with the most ‘sociable’ person. The very conception of what can constitute a ‘social’ gathering might be interpreted by class and status, ethnicity or language, work relationship or family pedigree. Though all are social in the anthropological sense, the vast majority of us are not students of society in any formal way. It is perhaps ironic that though we do learn to tread on much or even most of the stages upon which civil behavior is to be enacted, very few of us choose to dive more deeply into just how all of this ‘sociality’ actually works.

            And there is, on the face of it, no good reason to do so. If all of us know the score in terms of how to act, what to say, and of course, the opposite as well – the shalt not is duly implied by the shalt – then what would be the point of expending the effort in order to take everything apart just to see how it functioned? Would there not be a risk, in reassembly, that I would then get it wrong? Taking society apart to examine its mechanics and its organics both would seem a risky, even radical act. And since there is no clear consensus on society itself being ‘broken’ – though many claim to know just where and how this or that part of it is broken and should be replaced by some other part, equally proclaimed as being ‘obvious’ – the idea of taking it all down and reconstructing from point zero seems to have little merit. Even the most astute and vigorous social scientist does not attempt such a cosmogonical feat, contenting herself with an examination only, noting the parts and how they interact, just as the ethnographer of modern culture might stalk the streets and observe actual human interactions. An examination of society generates the means by which Deutung can be performed. It does not require any consistent interpretation but rather looks more like a connect-the-dots diagram. The pieces lack order until the connections are made, surely, but just as certainly, all the pieces are present and can be ordered. And it is the resulting order that generates both customary performance but also much of the history of the world itself.

            Between Max Weber and Georg Simmel, ‘Verstehen-Sociology’ gained much traction in the first third of the twentieth century, making a deep impression upon other seminal sociological thinkers such as Alfred Schutz and Robert Merton. It faded post-war given the ascendancy of functionalism, which is instructive for our topic, as interpretation was just that tool of questioning that mere examination need not access. Taking it all apart and rebuilding it required interpretation; poring over it and noting its connections, as functionalism mostly did, required only already interpreted meaning. Not that this is an either/or situation. Indeed, one might argue that one would have to come to an understanding of the parts themselves before generating an overall comprehension of the whole. While interpretational understanding could do the first, it seemed only functionalism could do the second. Even so, in between these two stood social phenomenology. This third stance suggested that it was the very interactions of living persons that constructed and reconstructed the cultural parts of the social whole on an ongoing and even daily basis. In short, society itself was a performance, or better, a performative understanding of itself.

            The roots of this more active interpretation are diverse. Durkheim’s famous epigrammatic definition of organized religious belief is a case in point: ‘Religion is society worshipping itself’. No more concise editorial can be imagined. But within this pithy fruit is the very essence of performative social action. Borrowing from the older idea that in order for a deity to exist at all it must be actively worshipped, the very structure of society itself has to be borne up by our participation in its manifestations. I am a social actor, yes, but I am also a social founder through my action in the world. Similarly, and perhaps just as profoundly, history too is enacted insofar as we are historical beings. This is not to say that what is known as ‘the past’ must be reenacted – though Mesoamerican cultures have this as a benchmark understanding of human history; the most famous of these consistent performances is the re-enactment of the Columbian Conquest – but at the very least, history ‘continues’ only due to our actions in the present and our resolute being oriented towards the future.

            What is interesting for both the student and the social actor is the fact that performance requires interpretation. The customary scripts may be given and learned by rote, but there is always slippage between an ideal and a reality. The spectrum of improvisation is vast, ranging from stifling a belch during a dinner conversation to reworking a political speech given the pace of world events. Society in its mode of being-social is very serious theater indeed, and part of the fulfilling quality of actual theatrical performance, stagecraft and script, paid actors and paying audience, is that all involved recognize themselves outside of the drama and cast upon the wider social stage. We are, in our own way, not so different from the conquered cultures of Central America, though our settings are perhaps more abstract and work not so directly with historical narratives but rather with an indirect allegory. This too is by design; like the examination of the mechanisms of society without taking the whole thing apart, allegory allows each of us to take the broader view without fully imbricating, and perhaps thus also immolating, ourselves within or upon society’s reason-of-being. For if it is only the ‘moron’ who resists social custom and flouts both mores and moralities alike, most of us would rather preserve our semblance of sanity as part of our own social performance, so that we never risk the ire of others who can justifiably say to us, ‘Well, I’m doing my part to keep all this nonsense together, why aren’t you?’.

            Hence the suspicion and even aspersion cast up against all those who question things ‘too much’. The professional philosopher, who is only doing his job, is perhaps the only socially sanctioned role wherein the practitioner can legitimately say as part of his vocation that nothing is sacred. Outside of this, we are all expected to give at least nominal service to specific renditions of what the majority feels society to be, and to be about. The ‘what’ of society is the first, the ‘why’ the second. Even the thinker, in order to be judged as human at all, must bracket her radical investigations, insights, and indictments some of the time. Society is at once a performance and a comprehension. In true Durkheimian fashion, it performs itself in order to comprehend itself. But because such real drama requires ongoing improvisation and interpretation, society is also made up not so much of cut and dried mechanisms as quick-witted examinations and re-enactments. Society gives the appearance of a machine only insofar as we social actors have honed our individual and thence collective skills upon its various stages. A well-polished theatrical performance never leaves the fiction of its allegory, but a well-staged social performance upshifts itself into a stable social reality.

            The question for each of us then is, do I continue to improve my acting skills within the historical context of the culture into which I was born, in order to preserve the manner in which that culture enacts itself in the wider world, or do I pretend only and ever to be an understudy, letting others do the work which is in fact the duty of all? Or, do I walk off the stage entirely, into a hitherto obscured human drama called ‘conscience’ by the world and ‘the future’ by history? In fact, it is our shared birthright, as both gift and task, to accomplish both of these essential acts. For society as mechanism is only the resulting case, and society as performance the manner in which the present presents its case. In order for any modern culture to reproduce itself, it must eschew pure reproduction in favor of a self-understanding, a Selbstverständnis or holistic comprehension-of-itself that is developed in history but seeks the future of and in all things. Interpretive understanding is both a tool and yet a way of life for human beings. But comprehension is the mode of being of the social whole, which means no one person can provide it, even for himself. That we are social beings tells upon us in two elemental ways: we are called to conscience by the ongoingness of the world, and we are called toward the future by the ongoing presence of history, which distinguishes the social world from the world at large. That I am myself never socially larger than that historical life is for me the Kerygma of existence.

Ethics and Personhood

Ethics and Personhood: ‘you can’t have one without the other’

            There is an agentive aspect to making the distinction between a morality and an ethics. Yet just here we are already relativists, for morality was never simply one of many, but rather ‘the’ only game in town. Even the recognition, found in the Hebrew scriptures, that there are in fact other gods – just don’t worship them – presupposes in an essential manner that one’s own morality is at the very least superior to those of the others. So, to speak of ‘a’ morality, one amongst many, is to engage an historical sensibility utterly absent during the actual epochs when morals themselves were in the ascendancy. Then, morality could command because the one upon whom it made its demands was not a fully individuated person in the contemporary sense. The shalt and shalt not of a moral code impinged not upon agency per se but rather upon one’s sanity, if saneness is thought of in the sociological sense of fully understanding what is customary.

            For the Greeks, the ‘moron’ was the one who resisted custom; mores, traditions, rituals and the like, or was akin to a child who simply did not yet understand them and thus his duties towards same. And though it seems somewhat amusing that the one who went against the fates was none other than the ‘hyper-moron’, for our purposes we can borrow from the pithy pop lyricist Neil Peart and reiterate with him that for us today, ‘fate is just the weight of circumstances’. Just so, circumstance for any pre-modern human being could be conceived as fate simply because of the singular presence of morality. Bereft of competition, moral principles could very well give the impression that they were good for all times and places, to the point of convincing the would-be moralist that any sane human being would hold to them. I say ‘would-be’, because though moralizing always seems to be in fashion – demarcating the fine line between righteousness and self-righteousness – to actually be a moralist one requires at least some comparative data.

            It was just this that was missing in premodern social organizations, no matter their ‘level’ of cultural complexity. It is not a coincidence that our first serious stab at ethics occurred in the cosmopolitan settings of the Alexandrian Empire. It is well known that Aristotle’s attempt to disengage ethics from metaphysics didn’t quite work, not due to the person-friendly ideas therein – his conception of friendship is still basically our own; the most noble form of love – but due rather to the lack of persons themselves. Even so, the abruptly multicultural scenes of a relatively impartial imperialism forced upon the customary the customs of the others, unheard of, alien, eye-opening. It was the beginning of perspective in the more radical, experiential sense of the term. And the origin of recognizing that one’s culture was simply one of many also prompted the incipience of imagining the possibility that a single human being might just have a slightly different understanding of ‘his’ customs than did his intimate neighbor.

            Yet this too is an abstraction. The history of ideas presents a far more choate brevis: the Socratic citizen which gains a worldly consciousness, the Pauline persona for which each step crosses a limen between history and destiny, the Augustinian subject which redeems itself and thus adds a self-consciousness – one is responsible for one’s own past; history is also and suddenly biography – and thence, fast-forwarding through Machiavelli, Hobbes and Locke, the process of individuation is greatly augmented until the 18th century, wherein we first hear of the authentic individual, the Enlightenment’s fabled ‘sovereign selfhood’. It is here, belatedly, that the ‘which’ becomes a ‘who’.

            In literary reflection, the mythic hero which is only begrudgingly human, and then only for a brief period of existence, is gradually transmuted to the person who acts heroically and thence often also dies a human death. Between the hero and the person lies the saint. Between mythology and biography there is hagiography. And while the self-styled heroic author may sometimes engage in autohagiography – Crowley is perhaps an exemplar of self-satire in this regard, though the reader is led both ways there – in general, modern literature casts very much human beings into human crises. We have to turn to epic fantasy to attain the echo of the mythic, but in so doing, we also in general cast aside our shared humanity. I resist here the opportunity to provide an alternative to this lot. In any case, it is mortality rather than mere morality that retains its own de profundis in the face of anonymous social relations and mass society.

            The Socratic citizen is lesser in distancing himself from the ‘examined life’. This early Selbstverständnis has elements of an ethics about it: the idea of virtue, the sense that one should think for oneself over against institutions and customs alike, the weighing of one’s experience in contrast to received wisdom, the questioning of authority. But I feel that it also instrumentalizes youth, seeks the vigor of the question only to enthrall it to the rigor of the argument. Inasmuch as it ‘corrupts’, it also uses youth for its own purposes. In this it feels more like a mission than a mere mission statement. Similarly with the Pauline pilgrim: one is individuated in the face of a transcendental judgment by which the mythic re-enters history through the back door, as it were. The more radical ‘you have heard it said, but…’ is muted by the sense that the objection to history is both final and ahistorical. It vaults the apodeictic into a kind of aphasia, wherein language itself is lost to Logos just as history is lost to Time. That this inability to give voice to one’s own experience is made singular through the redemption or damnation of the soul only underscores the absence of ethics in this kind of liminal spatiality. With Augustine, we are presented with a morality under the guise of an ethics. Self-consciousness is the basis for a redemptive strike; picketing sin in the knowing manner of the one who has sinned but then has broken good, for the good, and for good, in judging the self and finding it wanting. But this is a narrow understanding of the self as its subjectivity is limited to an auto-moralizing; in a word, the subject is subjected to itself.

            In this self-conscious subjection, I appear before myself as a shadow, awaiting the completion and uplifting of secular being through the death of sin. The world is itself the untended garden, its overgrown paths serpentine and thus leading one on but never out. I dwell in this undergrowth as my soul dwelleth only in the shadow of Being. There is no way in which a holistic and authentic selfhood can germinate here. For this, we have to wait for the being-ahead of the will to life to overtake the nostalgic desire for either childhood or death itself. Both are impersonal events, abstracted into Edenic paradise on the one hand, the paradise of the firmament on the other. Only in our own time does our childhood become our own – if only for a moment given the forces of socialization and marketing, schooling and State – and as well do we, if we are resolute, face our ownmost deaths, the ‘death which is mine own’ and can only mean the completion of my being. It is the happenstance of birth, the wonder of the child, the revolution of youth, the Phronesis of mature adulthood, and the singular ownmost of death, which altogether makes the modern individual a person.

            Given this, the history of ethics as a series of truncated attempts to present agency and responsibility over against ritual and duty – and in this, we should never understand Antigone as representing an ethics; her dilemma lies between conflicting duties and customs, not between a morality and an ethics – comes to its own self-understanding in the person-in-the-world. In doing so, it recapitulates its own history but one now lensed through a ‘completed’ ethics; self-reflection seems Socratic, anxiety has its Pauline mood, resoluteness an Augustinian one, being-ahead its evolutionary futurism, and its confrontation with tradition its messianic medium. The presence of key moments of the history of ethics geared into our interiority – we use the term ‘conscience’ for this odd amalgamation of quite different, if related, cultural phenomena – allows us to live as if we were historical beings cast in the setting of timeless epic. Though we no longer write myth – at most, the new mythology is demythology – we are yet able to be moved by it, think it larger than life, imagine ourselves as mortal heroes. The formula for this Erlebnis-seeking is pat enough: the rebellious youth takes her show on the road, discovering along the way that some key elements of what she disdained are in fact her tacit allies; trust, faith, and love. In coming of age as a person, our heroine gains for herself an ethics, differing from the received but suffocating morality of the family compact, deferring the perceived but sanctimonious mores of the social contract. If her quest is to reevaluate all values, her destiny is to return to at least a few of them after being otherwise. The new ethics she presents to the world after conquering her own moralizing mountain is simply the action in the world obverse to her own act of being in that selfsame world.

            This is the contemporary myth, our own adventure and not that of our ancestors, however antique. Its heroes are fully human but indeed only demonstrate this by overcoming the dehumanizing effects of anonymity and abstraction both. In short, today’s epic hero becomes human, and indeed this is her entire mission. Everyone her own messiah? Perhaps not quite that, not yet. For the godhead forced upon the youth, even though not her own, confronts her with the idea that there could be something more to life than what meets the shuttered eye. In its very parochiality, the heroine is made witness to the possibility that her world is but a shadow of the Being-of-the-world itself. It is in this realization that the adventure begins and the young halfling of a person, beset by market personas and upset by parental identities, strikes out with all of her ‘passions unabated’, as well as all of her ‘strength of hatred’, in order to gain the revolution all youth must gain. The very presence of this literary formula in media today at the very least cuts both ways; at once it is a surrogate for the real fight in which youth must engage, and thus presents a decoy and a distraction therefrom, but perhaps it also exemplifies and immortalizes that same fight, inspiring youth to take up its visionary sword and slice through the uncanny knot that shrouds our future being and history alike. If so, then with personhood comes also ethics; an agency in the world that acts as no one has ever acted heretofore. If so, then the most profound wisdom that we can offer our youth is the sensibility that what we are must not, and never, be repeated.

Truthful Fiction, Fictional Truth

            World Game, the ruling force, blends false and true.

            The ever-eternally fooling force, blends us in, too. – Nietzsche

            A god now made an animal does not suggest forbearance. In our resentment, we thus resent the truth; happenstance and death. But in our enduring creativity, we do not merely suppress this state of affairs, at its most base, the ‘human condition’, but imagine attaining a novel godhead. This striving for a new divinity is the source of not only the historical religious world systems, but of all imaginative works of the human consciousness. Its fictional content belies its truthful form.

            Let us take a famous macrocosmic example, oft repeated in the microcosm of the human relations. In ‘Acts’, it is related that not only has the dialectic of tradition and revolution been uplifted in and into the ‘Holy Spirit’ – a synthetic conception of the thetic ‘old God of morals’ and its antithesis, the ethical God on earth – but that this new force has generalized the original thesis to apply to all human beings. The Gentiles are also saved or at least, savable. For the first time, at Antioch, the term ‘Christian’ is applied to this new community of believers, some few years before Paul’s letter to the Galatians and thus about 15 years after the Crucifixion. Though this is not the first time such a dialectic which blends fantasy and reality appears in the history of religion, it does represent the advent in the West of the utter democracy of divinity and the equally infinite goodness of grace. The fact that this is new is oddly and even ironically underscored by the fiction that it was forecast in the tradition.

            In the bourgeois marriage, the thesis of the man runs headlong into the antithesis of the woman, generating a synthesis in the child. The child is neither and yet is also both. Its fact is its novel existence, brought about by the Aufheben of conjugality. Its fiction is that it ‘belongs’ to the parents, but in all creative work, including the birth and socialization of a child, an equal element of fantasy must be in play. For to only acknowledge the factual conditions of mortality and finiteness, of difference and uniqueness, would be to put the kibosh on trying to do any of that creative work at all. It would place us as species-being back in a pre-Promethean landscape of shadow and even terror. But there is also no lack of danger in the means by which we give a future to ourselves. In both macrocosm and microcosm the same risk thus presents itself: what if the fiction overtakes the truth?

            If so, in the first, we have religion instead of faith, mere belief without enlightenment; and in the second, we conjure only loyalty in place of trust, fear instead of respect. So if it is truly said that humans cannot live by truth alone, neither can we completely abjure it. The material conditions of human life, the ‘bread’, are by themselves not sufficient to make us fully human. The ‘faith’, imagination, creativity, fantasy, fiction, is what not only fulfills our desires in some analytic sense, but also completes our being in the existential one.

            What then is ‘truthful fiction’, or ‘fictional truth’? I don’t think we can entirely make them discrete. Myth is accepted as nothing but fiction, and yet it contains elements of truth, not only about the human character, however hypostasized, but also about the cosmogonical aspects of our shared world. Myth responds to the perduring and sometimes perplexing duet of questions that challenge us through our very presence in the world; how has the world come to be, and how have I come to be in that selfsame world? Mythic fantasy supplies us with an autobiography writ larger than life. It is not to be read as either history or as a ‘mere’ tall tale, but is rather that synthetic form which uplifts and conserves all that is of value in both the thesis of fact and the antithesis of fiction. It is very much then a ‘truthful fiction’, and, looking at ourselves in its refracted but not distorted glass, its function and its form as well come together for us in an almost miraculous mirror.

            Contrast this with the meticulous mirror of nature that is provided human consciousness by science. If myth is our shared ‘truthful fiction’, then I will suggest here that its iconoclastic child, science, is our equally collective ‘fictional truth’. Historically, science was the synthesis of myth and life, of imagination and experience. It too is thus a dialectical form, even a syncretistic one. Its truth is well-known: it is the only consistent and logical understanding of nature that we humans have at our current disposal. But its fiction is that it has completely vanquished the imagination, not so much from the source of its questions, but rather from its methods, and particularly from its results. It is a myth, for example, that the cosmology of science is not also epic myth. It is a fiction that science overtakes the fictional in order to maintain its human interest. Like the God that entered history, suspending for all time and for all comers the sense that divinity by definition is a distant and alien thing, the idea that science exits that same history is equally a fantasy. For science, like myth, is a wholly human production and thus relies as much upon our imagination and ingenuity as upon experience throughout its process, from question through method to result and thence explanation. It is especially evident that in scientific explanation there is a concerted and historically consistent effort to efface all traces of mythic sense, replacing them with a hard-nosed experiential sensibility. The fact that even evangelical educational rehabilitation centers targeting youth advertise only ‘evidence-based’ therapies – whatever other more dubious practices may be present therein – is but one example of the astonishing success the fiction of science has generated for itself.

            Just so, the very fact that ‘fictional truth’ is so available for even the non-believer to utilize should remind us of nothing other than the soteriological generalization recounted in ‘Acts’. Authors who have written in the history of science, especially those who speak of its origins and its early development, from the Milesian School to the Copernican Revolution and onwards, are, in part, repeating the act of cosmogony, of Genesis, and within these actions, the process of the dialectic. This is not to say that there is, or can be, nothing new in the world. The synthetic term, the apex of the dialectical triangle, is justifiably seen as a novel form, performing a hybrid function: at once reminding us of reality while providing the means for a being defined by its finiteness to live on in its face.

            Thus we should not regard the sometimes annoying, even disturbing, blend of fiction and truth as an impediment to the greater experience of life or even to the lesser knowledge of that life as experienced. The ‘world game’ is assuredly afoot, its mystery far outstripping any detective adventure born of and thence borne on the imagination alone. That ‘we too’ are part of its yet mysterious mix, its blithe blending of our beings into both a history of acts which are not our own and a biography which very much is, however much we sometimes attempt to avoid its action, is, in the end, the most blessed of gifts that any divine animal could imagine for itself.

            G.V. Loewen is the author of 55 books in ethics, religion, education, aesthetics, health, and social theory, as well as fiction. He was a professor of the interdisciplinary human sciences for over two decades.