Flatteries not Included (The Problem of False Other-directedness)

            One aspect of David Riesman’s famous analysis of post-war society that is often overlooked is the sense that the ‘other’, in his ethical rubric, presents an inauthentic otherness. In following our literal neighbors, in ‘keeping up with the Joneses’, we are not only aping an ideal form by means of idealized formulae, we are striving to homogenize society; to make everyone into the same thing. Riesman’s other-directedness, which he rightly casts as both unethical and cowardly without quite explaining why this is more profoundly the case deontologically, is thus not about otherness at all, only ‘the others’ in the sense of a diaspora of Das Man. Insofar as one is left with making what appears only as a decision of individual character – a way in which to distinguish ourselves from a merely individuated life, another aspect of modernity of which Riesman is correctly critical – we leave in possession of an incomplete analytic, suggesting in turn that such a decision cannot itself be fully either made or kept.

            Riesman’s ‘other’ is simply another version of myself. I look at him with envy or disdain, resentment and, in a crisis, even ressentiment. Yet he is nonetheless an intimate stranger; familiar in every way that society seems to count. He has either what I have or what I would like to have. I regard him thence with covetousness, which goes beyond the antique sensibility that his trophy wife is more attractive than mine. Or, conversely, I play the other’s role for him, with similar sentiments abounding. None of this is otherness per se, only what is ‘next’ in line. And all the more so, it is not the Other, the radically irreal Otherness of the uncanny. Are there then three kinds of others with which I must live? The next person, like me in all outward respects and most inward ones as well – we often underestimate the mental sameness occurring in mass society, as it is somehow disturbing to imagine myself as much less unique than I would desire – is another; a representative of the herd, to be harsh, an expression of the generalized other, to be discursive, a mimesis of class filiation, and that in both senses, to be critical, or yet a ‘fellowman’, to borrow from Schutz. All of these themselves demean the humanity of this next person, and yet all of them are correct in their own way about what he is in society.

            Most mature adults will recognize the great difficulty in procuring friendship as one ages. We are wary of letting just anyone in on who we are, preferring to display only the what for public consumption. This, in spite of the corresponding fact that friends hailing from other phases of the contemporary lifespan have changed beyond recognition, especially those much-vaunted childhood friends. Yet we tend not to seek replacements for friendships come adrift or gone awry, suggesting that our perspective has become one of ‘well, any further friendships will ultimately go the same way, and if not, we will all die out of them in any case’. Romantic relationships are subject to the same stern logic, but survive its lens more easily given the erotic desires present for some decades after youth. Either way, however, authentic otherness is the last thing persons seek when surrounding themselves with serial circles of acquaintance, very often the most any of us is willing to commit to during working adulthood. Indeed, the frisson of fascination exerted by fictional limns of the Other as an irruptive force exerts more pull than does otherness as a cultural fact. Once again, the otherworld requires no real commitment from us, given its own cameo ethereality. If the potential friend might be relatively blameless in the face of our diffidence, the ghost has only itself to blame for same.

            The reliance on sameness to distinguish otherness presents, even so, a more complex problem for ethics and for sociality alike. Though it is reasonable to a point to prefer those who are deemed ‘like us’, to fall in love with ‘kindred spirits’, at least of the earthly kind, or to idolize historical figures who appear to embody our own ideals, whatever they may be, what is less reasoned is the sensibility which overdevelops out of such liaisons. We learn, from a young age, whom to shun, and these cleavages fall mostly along class and status lines. In-marriage rates exhibit a shocking social class homogeneity, and even those ‘progressive couples’ who do not share a skin colour or even a religion, if any, find that they share almost everything else, especially when measured against the most important variables for match-making or even simply hooking up. For women, anything else is slumming, and for men, just another notch on one’s belt, so to speak. Authentic otherness is inadmissible in marriage; there is too much at stake for elemental disagreements to carry the day. But even acquaintances, who may not share anywhere near as much as do spouses, quickly co-construct a list of topics that will have to remain taboo. Within families it is proverbial that one does not discuss either religion or politics, and perhaps more recently, sexuality as well. Each contemporary person travels in a set of mostly disconnected circles, a gentler rendering of living secret lives, if one is deemed sane, or of having multiple personalities, if one is not.

            These social circles are themselves bound by either similar tasks, viewpoints, status backgrounds, or yet beliefs, such as a church membership, and persons who appear in one circle are more likely never to frequent another. Simmel’s ‘web of group affiliations’ still provides one of the most insightful analyses of this aspect of modern society. Circles may be casual or formal, or may move from one to the other depending on the occasion. They may accept new members, if those more veteran tire of one another’s direct company, or they may hive off into yet smaller groups, driven by a competition for in-group status. In none of this, however, do we discover the differences associated with authentic otherness. To do so, one must be willing essentially to throw over one’s own druthers and connections, once cherished and now perished. Two of my oldest friends, hailing from vastly different cultural backgrounds, nevertheless married decades ago and are yet together. The parents of the woman refused to speak to her for nine years after she had taken up with him. Only when the couple produced their own children did the newly-minted grandparents seek them out. This kind of dynamic will no doubt be familiar to many, even if very few persons take the risk of striving to know the authentic other.

            Yet one can say this and still be well within the normative definitions of otherness. The one who is truly different to me is oddly familiar in that she is eminently recognizable as a societal sore thumb. At the same time, the dominant genders and their relations present an ongoing normative context shot through with apparent conflict and difference. Men and women continue to be raised quite differently in our society and indeed, in all cultures succeeding those of the social contract. The chief reason why the total divorce rate has hovered around fifty percent for many decades is not so much economic – women appearing en masse in a non-crisis-mode workforce starting in the 1970s is often cited as the most important variable here; let us suggest that this is merely a vehicle for divorce and not a motive for it – but rather that men and women find one another to be stunningly unrecognizable, and this as a human being, not simply as another person. Every dominant-gender marriage is thus an odd exercise in internecine yet still cross-cultural ethnography. Participant observation rules the day, and one of the major reasons why youthful intimacies are so erotically inclined, aside from the general sexual repression of our puritanical educational institutions, is that sex is by far the easiest thing for two people to share with one another. It generates both authentic and inauthentic intimacy; it tends to play us beautifully false to one another.

            When the overt passions fade, young people change up and the dance continues elsewhere. If there is also a sense that ‘the grass is always greener’ there is also a growing sense that one needs to ‘settle down’ at some point or other, and so a balance is eventually struck. Subjectively, same-sex relationships are more convenient for such persons, as they do not participate in the wider cross-cultural gender conflict. Of course, objectively they remain more difficult, since the rest of us still cast aspersions on them, and that precisely because they are seen as avoiding a perduring conflict, one that is nevertheless necessary for the reproduction of society as a whole. It is a simple case of appearing to not be ‘doing one’s part’, ‘sharing the load’, ‘taking a hit for the team’, and so on. Any alternative gender may be hung up on such crosses, and this same diaphanous resentment is at work in other, if related, arenas having to do with the interface of sexuality and gender and the character of the polis, such as women who do not support reproductive rights and who thus vote ‘pro-life’: ‘I raised my own children, why can’t she?’. The underlying pattern to such sensitivities acts like a leitmotif; in this case, it is the perception that someone is cheating.

            It does take a tremendous effort to construct a long-term intimate companionship with an authentic other, and the dominant genders have been experimenting with this task for millennia. Those who have forsaken this norm, however jaded and jaundiced it may be as a principle, and certainly never something ‘natural’, are in their turn consigned to a number of margins, not least that of apparent cowardice. It may well be a wondrous thing for men and women to love one another, but how, exactly, does one go about doing such a thing? To face this question squarely is not just to be a ‘square’. There is enough queerness in heterodoxy to make most of us blink at anything yet further down that proverbial side-street. What we find in adult relationships of all kinds is a practice which acts at a safe distance, all the while safeguarding the perimeter with which the relationship has surrounded itself. Marriage and like companionships represent the epitome of this construction, which is why, even for younger persons, it requires a fair bit of work to undo. Though statistically consistent even as it moves in and out of pop-culture fashion, ‘swinging’, mutual and consenting, provides a failsafe for formal intimacy whereby one preserves the once-again edible cake. Alternative genders may themselves be acted out in such spaces, but we lack the data to state that those who play-act the margins are more compassionate towards their reality.

            In all of this, we flatter ourselves. But the world-as-it-is does not include such pat and happy ends. Our tendency to pursue the faux otherness of distant cultural items such as cuisine and popular art forms, as well as to genuflect toward political positions of ‘multi-culturalism’ and ‘inclusivity’, betrays our deeper motives. We seek only the kind of difference that cements our sameness, that cannot sabotage our sense of what we are and which allows us to decoy ourselves away from the question of who we might become. That we ultimately become other to all that we have been presents Dasein with its ownmost completedness. In contemplating this, however, we are brought bodily into the question of the Other as Anxiety and as the Nothing which comes to me; that it shall come to all others itself means nothing, and this is where normative understandings of otherness let us down the most palpably. Perhaps we can rather suggest that the flight from authentic otherness in life is a proprioceptive resonance of the denial of death; it is the faux equivalent of imagining a form of consciousness immortal; it is the method by which we learn to die by ourselves. In this, we cannot entirely dismiss its patent cowardice as outside of all ethics, even if we might ideally state that resoluteness in life is the better practice of that to be tested in the face of the absence of that self-same life.

            G.V. Loewen is the author of over 60 books. He was professor of the interdisciplinary human sciences for over two decades.

Does Gratitude lead to Complacency? (The shared character of past and future)

            To be given respite in the face of a crisis is our greatest hope. Once given, once taken, how does this affect our character? Just now, and just then, I was compelled to be resolute, facing down the end and facing up to my personal challenge; the end of complacency, of whatever sort. Resolute being, one of the elemental ‘existentials’ of Dasein, places my being before itself, and thus as well wills my personhood to walk away from itself, itself as it is today. Cultures of all credo and stripe face this same task, and by it, all of them are challenged both bodily and mightily. It is perhaps not implausible to imagine that the courage which is demanded of a single human being in the face of the as yet unknown future might somehow be scaled to suit the needs of that same person’s society. The question of individual character might become a way in which to interrogate cultural merit, a kind of ‘superorganic’ structure which germinates in the basic subsistence of any social organization. The primordial society had no sense of history, and yet, painstakingly and imperceptibly, walked into a future, even though the concept itself could not take hold in this original imagination. Any time we today shun this movement, we are regressing into this first being; the proto-human who, in spite of himself, evolved a penetrating and visionary consciousness.

            Resoluteness is Greek, while gratitude is Hebrew. This is one mythopoetic manner of understanding the mystagogical function of the two contrasting ethical stances. That the former is superior to the latter in theory alone does not immediately help us, for it was born in a desultory and dismal dismay: the future is nothing but the end; it’s all downhill from here. For the Hebrews, the stance is itself weaker, but the motive superior: the future is ours to walk toward and though it’s all uphill from here, nevertheless, the vantage will be worth it. With the demise of Christian metaphysics in German idealism, the willing being had but resoluteness to call upon in order to become that futural figure. Can one be grateful for the loss of gratitude? As it is so often used as a mere platitude, being grateful lacks the essential kick which propels Dasein to complete the arc of its thrown project. At the same time, resoluteness alone often dismisses what has in fact already been accomplished, and to our credit. Today, we must then ask, what is resolute gratitude? What is the means by which Dasein discloses to itself not only its futurity as a being-ahead-of-itself, but as well, its own beingness-as-it-has-been, which would include its accomplishments?

            Due to a serious health condition, I lived under the impression of the loss of futural being for about 18 months. I was recently given a clean bill of health, a second chance at life, if you will, and found it just as difficult to accept the latter as I did the former. I had become resolute, and had found gratitude, but only concerning the past. I was resolute before the sense that the past was now all I had or could have had, and grateful for this past. But taken in this way, the conceptions become salves and vanish from the vocabulary of vocation, the erudition of ethics. Here lies one of the clues to resolute gratitude: that both must orient themselves toward only the future of Dasein. One may refer to what one has completed only in the sense of Schutz’s ‘I can do it again’, as a writer might say to herself, ‘I have written so many books, why should I not write another?’, and so on. In support of this self-reference which is not back-referencing, I must as well only refer to my prior experience in the manner Schutz has also detailed, when he quotes ‘I cannot swim in the same river twice’. Experience would indeed lose its value, not only as the basis for human knowledge but as well for any ethics, if it itself could only be repeated. This is why, in the primordial human trope, experience is limited to the daily round and to a small suite of crises in which all who live must be challenged by the call to that same life. Childbirth as the future, dying which is the past, hunting and gathering and storytelling and child-raising, as the present presents itself. Is it only the scale and detail of these essential rites of passage which has been altered over the eons?

            I want to suggest that for our own time, what has in fact been altered in a qualitative manner are the implications of mine ownmost death. During the interminable tenure of the social contract, there were no persons, and only parts of the mechanical whole dropped away. The ethnographic witness of mourning rituals in subsistence societies, however marked by astonishment and shot through with romance, nevertheless tells us that there is no one, only the many. One loved one’s group, unto death, and in that death the love of the group holds utter sway over the shared emotions. Here, experience of the human condition is the same thing for all. We, so far removed from both the complete intimacy of the cohort – Freud’s ‘horde’ has been, in English, trailed away from itself with the over-emphasis on sheer size rather than cohesiveness, which is the other aspect the term suggests; his sense that it was paternalistic is almost assuredly an ironic projection, imported from his own analysis of the modern State – and the daily necessity for its nurturing and nourishment, cannot but see in experience only difference, not sameness. Just so, philosophers too have made it an ambition to convince us that experience must be ever new; Erlebnis and not mere Erfahrung. The lack of the novel in our lives is assuaged by the invention of theatrical experience, such as that to be found in sports and entertainment fiction. But there is nothing truly new in a game which has itself been played thousands of times, or in a script designed to appeal to a known market. In spite of this, we can be so captivated by the ongoing action that we forget the other chief aspect of authentic experience: its presence enacts not action but rather an act.

            In this, individuated experience, becoming an ‘in hand’ through its generalized call to conscience, reenacts the moments of ‘collective effervescence’, to use Durkheim’s phrase, to be found in contexts of crisis which the primordial human community endured or celebrated. That we cannot feel the presence of ‘others’ is precisely due to their being others to ourselves. This was not the case originally, and no ethic of the future would ever imply that it should so be again. We experience life only as our life, and this, in turn, invokes in us both resoluteness and gratitude. On the one hand, I am alienated by my solo adventures; ultimately, no one can fully share them, and this comes home to me most intensely when I am tasked with completing my own Dasein, when I am faced with finitude. But on the other hand, I am liberated by the very same sensibility; no one else has experienced life quite the same way as have I! This is a marvel, a wonder, and perhaps still for some, a miracle. Narrative thus becomes a means of communicating an unshared vision, rather than one of iterating a vision already known to all. Not only did this shift in human consciousness open up language to both religion and to science, it transformed cosmology itself, freeing it from being the vehicle only for cosmogony. Until the ethic of the individual emerged, gently beginning in the West with the Pre-Socratics and much more radically given a futural model in the life of Jesus, our story of the universe was the story of its creation alone.

            Today, origin myths are mostly of interest to folklorists and writers of fantasy quest narratives. This ‘lorecraft’ constructs in turn a ‘worldcraft’, in a manner not so different from what must have occurred during the social contract itself. Cosmogony thus remains as a part of the theater by which the lack of novelty in modern life is partly compensated, thus as well retaining an integral aspect of its cultural value; the latter-day spectacle of the pulp fiction epic is our version of each evening’s fireside tale, told and retold in increments, night after starry night. But cosmology proper, liberated from the umbilical uroboros, is now able to investigate for itself the reality of the universe as it can be known without recompense and as only and ever presenting to our astonished senses the radically new. Cosmology is, in a word, the centerpiece of authentic human experience, for no other realm of our yet shared understanding is as alien and wondrous. It can be so simply due to its non-human character, and in this, it tells us its own story, bereft of, and unrelated to, our human concerns. No cosmogony has this function, and indeed, just the opposite; origin myths relate human experience to the universe, not the other way round. This is also why almost all contemporary adventure epics chart a course backward rather than into the unknown. They are attempts to recover the recipe for respite alone, and mistake their ancient form – the extended, originally oral, narrative – for their present function – to impel the present to overcome itself.

            In this, we can be, both as a culture and as persons, too grateful for the past. The resale market for cosmogonical stories remains a leading ledger of this error. We are ourselves led away from the world-as-it-is, for that is after all the function of entertainment cast only as itself. The melodramas of fiction and sports, whether live-action or ‘virtual’, present to us a world askew, a world righted, a world askew then righted, or more disturbingly, a ‘right world’; a world which is seen as being itself in the right. Seldom are we met with the future of our own world, with all of its rightness and wrongness fully in our face. ‘Is this not after all the real world?’, we may ask ourselves. ‘If so, I cannot be entertained by it; I must be resolute only, and take my gratitude from that which allows me to dispense with my obligation to the future of that world.’ In short, the future is seen only as a task, rather than as well a gift. History is also both of these, but with the past, we overemphasize the giftedness therein and turn away from its challenge. Our stance towards the future is the very opposite; we overdo the task in front of us and forget what a great gift, indeed, the greatest of gifts, it is to have a future at all.

            And just as a person can fall ill and be forced to contemplate the lack of that future and the end of one’s life, the completion of one’s Dasein, so a culture entire can sicken itself to the point of disbelief in the future, of itself and in principle. Our half-planned technical apocalypse is a dangerous gesture in this regard. The future causes in us a basic resentment toward life if we take it only as a task. Our very will to life, so essential and indeed, seen as an essence in its supplanting of the animal’s survival instinct, is muted by this overstatement of the unknown as only a threat. Along with this, the dredging of the salvaged selvedge of historical druthers distracts us from becoming conscious that what we have been, as a species, presents just as much of a challenge to us – for it tells us who we are and why, and speaks these wisdoms to us without rancor but also outside of all salvation – as it does a gift. The authentic disposition of Dasein’s response to the call to conscience as concernful being is that the past and future must be understood as equal parts curse and blessing. We cannot, as the cosmogonical viewpoint had it, simply choose the one and not the other, just as we cannot, as Nietzsche reminds us, choose joy without sorrow. We cannot choose the past without the future since it is we who walk forward resolutely from the one toward the other. Just so, this movement cannot be accomplished without gratitude, for futurity is something elemental to our being, and not merely an unknown factor to be discerned with time, an alien language to be deciphered with study. The future is, in its authenticity, of the same ethical presence as is the past, and thus requires of us the self-same sensibility; that of resolute gratitude and grateful resoluteness. Only by way of this will experience confer upon us its overcoming of complacency, and the universe will continue to be open to our wonder.

            G.V. Loewen is the author of over 60 books, and was professor of the interdisciplinary human sciences for over two decades.

Before Good and Evil (a non-moral reality)

            A generally overlooked aspect of Engels’ social evolutionary schema, one that closes the circle around its dynamic if not its scale, is the absence of a God concept in what he refers to as ‘primitive communism’. Marx later writes, ‘for the communist man, the idea of God cannot occur’. That is to say, even the very idea of a God becomes impossible in the communist mode of production. For Engels, the cultures of the social contract were to be the model of the relations of production in what remains today an hypothetical communist society. In his schematic, the quirk occurs late in the day, almost as if it were a plot device, necessary because, after unrolling a tight tapestry of human history and prehistory alike – and for the first time, making a connection between them without either regressing into metaphysics or flirting with outright bigotry – the reader finds the climax requires the usual suspension of disbelief. While this is fine for commercial fiction, it is not so fine for philosophy. That the means of production do not change from the Bourgeois mode of production to that of communism more than implies that communism is capitalism bereft of pre-capitalist symbolic formations.

            This is not, on the face of it, an insoluble problem in practice, only for the model. It is somewhat difficult to believe that neither Marx nor Engels was aware of this tipsiness in an otherwise reasonable ‘model of’, but this is precisely the point here: if Engels strove to create a ‘model of’, Marx desired rather a ‘model for’. Given the challenge of transforming the same model from one to the other, it is perhaps unsurprising that the logic of the dialectic abruptly drops off just when one would expect to see its culmination. A literary scholar once suggested to me that a failed novel is the worst thing, but a failed philosophy is but a work in progress. While such a sentiment is itself reasonable, the key is to continue that work. Let’s reexamine the connections between the origin and the destination in Engels, in order to clarify both the motive and thence the rationale for constructing the schema the way in which he did.

            ‘Primitive communism’ is the less romantic version of Rousseau’s social contract. It becomes even less sentimental in Durkheim’s ‘mechanical solidarity’, and downright Third Reichish in Malinowski’s diaries, not intended for publication, wherein he vents that the ‘savages should all be obliterated’. Yes, living-in with a bunch of superstitious morons would likely get old, as the famous ethnographer discovered for himself, but then again, this was precisely the point of Marx and Engels when they dedicated their corpus to a demythology of modern man. In the nineteenth century, when social evolutionary schemas were all the rage, Darwin’s revelations only fostered a deepening of the sense that what one saw regarding ‘progress’ was not merely cultural, but had to do with the ‘species essence’, as Marx has it. This post-Enlightenment problem was not quite overcome even in the work of some of the greatest of its revolutionary thinkers, including Nietzsche, Freud, and Heidegger. For each, there is a point wherein metaphysics, the idea of Man, capital ‘M’, creeps back in. From a purely authorial point of view, this is a subjective reaction to becoming over-enamored with one’s own ideas. This is the more easily solved aspect of the problem. Less simple is the aspect which lies at the discursive level: from Aristotle to Foucault, metaphysics, in its broadest sense and most distanciated case, re-presences itself. At the far end, ethics does not manage to sever its umbilical cord to metaphysics, and at the near end, the archaeological structures of discourse, their ‘événements’ and their orthographies, go trundling into the same. It appears that it is not an easy thing, at all, to overcome the idea of the ideas.

            Yet for the vast bulk of our species’ tenure on this planet, and presumably, for all of the millions of years before this, wherein our hominid ancestors rusticated, metaphysics didn’t, equally at all, exist. This is the perduring strength of Engels’ understanding: the original human condition provides all of the symbolic clues necessary to convert capitalism into communism. A cosmology without gods, a cosmogony of transformation, and an apolitical polis; what more could one ask for? This was humanity not beyond good and evil, but rather before.

            Gauguin and D.H. Lawrence were liberated by this discovery, but Malinowski was apparently appalled by it. Even so, one would have to more minutely distinguish the types of societies each of these European interlopers lived in, in order to more fully appreciate the implications of Engels’ own work. Melanesia is not Eden, though Polynesia appeared to be a closer approximation thereof. And Mestizo Meso-America, however sunny and sexy when compared with a paranoid and ultimately also delusional Interwar Europe, could only be compared with subsistence social organizations, at a stretch, in the remotest village conditions. Rousseauist romance aside for a moment, Engels was himself the polar opposite of any sentimentalist, having disowned his father, a great capitalist and solemn Protestant Bourgeois, and thence studying the working conditions in the heart of industrial England, producing the first ever full-fledged ethnography in 1845. No romance here, one would suspect, but even there, even then, Engels did find the love of his life, rescuing a 12-year-old girl from the mills and later marrying her when she ‘came of age’, to use a period expression. In a word, Engels cut a rather more heroic figure than the dreamy Rousseau, embittered Lawrence and escapist Gauguin. For the feminist, Engels was able to do so because he had also shed the misogynist contraptions of his forebears and peers alike. Marx was unable to claim the same for himself, we would suggest.

            However this may be, what is certain is that Rousseau’s image of the ‘noble savage’ itself cut two ways. Was it then the savagery or the nobility that evolutionary discourse would favor? In Nietzsche, they appear to almost become the same thing, and thence in Freud as well; hence the ongoing problem of repression. Darwin, on his part, seemed aloof to the distinction, which may well be par for the course for the harder sciences; ‘it is what it is’, could be an empiricist motto. But all of this discursive hand-wringing in the face of human history comes just before 1859 and thenceforth in the implicatory interregnum between Darwin’s ‘Origin’ and his 1871 ‘Descent’. Afterwards, hand-wringing gives way to head-shrinking.

            Metaphysics, as a projection of human aspiration, served equally well as a set of ideals as it did ideal conditions; it proposed, in its diverse contents cross-culturally, that while humanity actually lived like this in the present, in the future it could live like that. At first, even death was but a metaphor. One needed to shed the human being which I am in order to ascend to the new culture. There is thus an exiguous, but still continuous, connection between the exhortations found in Gilgamesh and those of The Will to Power. In a word, my life as it is and how it has been is but a shadow of either what is to come, or what it should be. The discursive rendering of the saint, metaphysics as morality, quickly came to define not only the standard of ideal conduct in the world – and this as a role model, a ‘model for’; which in turn suggests that the dialectic should have been able, if left to its own internal logical devices, to overcome any flaw in Engels’ schema, since in metaphysics we do have a general example of what once was merely a ‘model of’ transmuting itself into a ‘model for’ – but as well the rubric by which one, indeed, anyone, could attain such an ideal. These are the timeless codes, from Hammurabi to the Decalogue, which connote a space transcendent to history, a space which is not a place and which can be simply called ‘Time’. In this, metaphysics reinvents the absence of history which was, forever and ever, the condition of our species and its direct predecessors.

            The timeless time of the social contract was attractive to Engels both as a model of a society which endured in spite of itself and its own serious limitations, as well as politically; as a model for the re-creation of a similar set of relations of production which would, in their own way, withstand the test of historical time. Communism is thus granted the status of an Eden-in-practice. Like any utopian scheme, Engels’ dialectical materialism presents its terminus as at the least indefinite, and in this, aspires to bring the metaphysical metaphor to ground. That we have not yet been able to slough off the ‘old gods’ of pre-capitalist symbolic forms does not slay the utopian loyalist but rather summons her to further heroics, discursive or otherwise. In our own day, climate clamor, identity ideology, gender genuflection, and hysteria in the face of the facts of human history fashionably dominate popular discourse regarding the future, however indefinite it may be or yet become. Not that Engels was himself either an ill-considered thinker or a person who dwelt in the clouds; quite the opposite. But any time one ‘gets an idea in one’s head’, as it were, the deeper meaning of such a phrase comes to the fore in light of the represencing of metaphysical aspirations, this time at a very subjective level. It allows us to mistake the personal for the political, the ideological for the theoretical, even the factual for the fanciful. It blinds us to both the vicissitudes of historical time – our conception thereof does not admit to there ever being a ‘forever’, either in the distant past or the projected future – as well as the evidence, fragmentary and yet possessed of its own miracle: that even in the fossil record of quasi-timeless geological time, there is still change, albeit glacial. The toolkit of Homo erectus showed almost no alteration over a span of up to two million years, but, in the end, it was transformed, as more sophisticated proto-humans arose. This cannot possibly be called a memory, but only a fact. In this, we learn that experience has a too-intimate effect upon us; through it alone we are become bigots, the deniers of worlds.

            What Engels did realize, before the logical slippage, was that too great a cleaving to ‘models of’ meant a more challenging effort regarding ‘models for’. There is no sign, in running through his evolutionary model, that anything unexpected was to occur. Marx noted, perhaps more to himself than to anyone else, that capital presented the most liberating possibility of any human condition theretofore, simply because there was not only the vast potential of its industrial-technical means of production, but there was also, and for the first time, social mobility built into the system itself. Romantic pseudo-history has culture heroes flung to the top of antique societies, but these figures are exceedingly rare. Whether or not capital can overcome the metaphysics it has inherited from the social organizations occurring in history between the bookended communisms remains to be seen. Social mobility itself cuts both ways. That one can improve one’s subjective lot also means that one can sabotage it. And when an entire culture history ‘breaks bad’, it is the great plot device of an ideology to glorify the implausible in order to suppress the impossible.

            G.V. Loewen is the author of over 60 books. He was professor of the interdisciplinary human sciences for over two decades.

The Reign in Spain

The Reign in Spain (falls mainly on the king)

            After having survived a quite literal mudslinging, Spain’s monarch must also have just as literally encountered the very ground of his rule. The sovereign, as a social role, is both the body politic and the territory, the land, whereupon his subjects rusticate. Bataille’s political sociology remains the best take on an anthropological history of the idea of the sovereign, but today we expect a ruler whose role is both archaic and even anachronistic to work, perhaps with irony, to get back to his earthy roots. A monarch today represents the people over against the government and other interests. He is a relatively free agent, apparently apolitical but not non-political, symbolic of a set of values of which all are supposedly supportive. Today, the list of such values which can be represented in this old-world manner is likely much shorter than it had been in the past, but we cannot be sure of this, mainly because historic records are not only penned by the privileged, the literate, the cultured, but also preserved by them. We have an official line, prevalent in all types of history known to us, to the point that it would not be an exaggeration to say that all history is, to a great extent, official history.

            The sovereign was, however, not originally an historical figure at all. The position was an Aufhebung, not only propelled to the apex of the societal pyramid, but floating above that point. Like the all-seeing eye of Masonic lore, it was held in space by its divine assignation in feudalism, by its being perceived as the worldly source of Mana in traditional societies, or by its having secured a rather happenstance superiority in resource access and distribution, as in early irrigation civilizations. Held in space by the otherworld, and conversely, held in place by our shared world over which the sovereign presided but from which he must also exempt himself, the ruler’s rule is one shot through with distanciation. Today, of course, the remaining monarchs have come down to earth, with the date of 1688 being important in that regard. 1789 would not have been possible without the movement from monarch to parliament. Yet it is 1789 and not 1688 which allows us to become nostalgic for the monarchy and, in regions where such persons yet exist, such as Spain, imagine that the sovereign has a populist responsibility, an authentic obligation to ‘the people’ which, in turn, is the only thing that authenticates his existence as well as the continued existence of the role itself.

            Just as we have made God a fellow traveller, so the sovereign must also fall into that same worldly line. Lineage is now part of an antiquarian, even a dilettantish or yet Whiggish, history, and nothing more. A royal genealogy may be romantic, but it gives the current title-holder no moral purchase upon how responsible one is or what responsibilities one has. And the personalization of religion, which is easier to shoulder than that of politics due to the abstract and essential quality of the divine, is both a practice-run at making leadership itself worldly, as well as a hedge. The nautical phrase, ‘having one anchor out to windward’ applies to modern religion, especially Protestantism, in that we can still claim belief. We speak to a personalized godhead but we still have faith that someone is listening to us. Our relationship with sovereignty is muddier than this.

            Apropos, today’s monarchs are philanthropists in every sense of the term. They work for charitable organizations, they lend their status to benevolent causes, they labor on behalf of non-governmental organizations, they travel the world for the cause of surface diplomacy – nothing important actually ‘gets done’ on such junkets; monarchs do not negotiate the brass tacks of contemporary geopolitics – and they make appearances at arts and cultural events. They are taxed by their abstract origin; they must appear to be everywhere at once. To be seen but not heard in this overtaxed manner makes the sovereign into a young child. The monarch has no voice in any case, and to ‘blame’ him for his nation’s woes, natural or cultural it matters not, is to mistake both his person and his role. In the capacity of the former, he is like any of the rest of us, covered in mud by mudslides, suffocating to death if in the wrong place at the wrong time. As to the latter, the monarch has no political power, no Realpolitik, if you will. And while many of us have imagined, perhaps as children ourselves, that it would be a lark to fling mud at a king no less, the act is itself symbolic, participating in that near-primordial order of affairs where the sovereign’s very being is lived on the land through and by myself.

            This same land had betrayed its people, murdering them ruthlessly and anonymously. Ergo, the king had demonstrated that self-same betrayal. This was no mere matter of sympathetic magic; the sovereign is the land as well as the people, and so in him, through a natural disaster, an internecine conflict occurred. The Lisbon earthquake was interpreted by some as evidence for the absence of God in the world. The world had, in that case, betrayed itself, shuddering to its foundations the culture that had grown from it, shaking in its essence with the parturition from the source of its own creation. There is no Erda in our contemporary narrative. Wisdom comes not from the earth but rather from the greater cosmos, the only remaining presence that can mimic both the distanciated being of the divine and its royal representative, as well as the abstract quality of the moral Mana necessary to keep everything in its static place. Just so, all populist politicians, none of them remotely royal or abstract, claim to be ‘the anointed’ – a recent report had one Trump follower referring to him using that exact phrase – and if one is loyal to them, they shall return the earth to its former order. The ‘again’ of these slogans is what is truly disturbing about them, not the idea of greatness.

            But Bataille reminds us that an authentic sovereign had no need to make claims of any kind. Just as the one who possesses what possesses her, the person of faith, the one who has no need to express or expound that faith to others – her acts alone speak the voice of the greater being, which is why some faiths refer to them as ‘works’; a direct nod to the sense that the divine ‘works’ through us – the sovereign acts without having to take action, utters without speaking, works without laboring. No mere politician can accomplish any of these things, but neither should they try to do so. Self-sacrifice is the lot of the modern leader, for she remains a person even when occupying her lead role. Not only was the sovereign never a self, he had no personal relationships. The people were his embodied action in the world, the land his deeper hearth. ‘The world is deep’, Nietzsche intones, the seriousness of Zarathustra’s ‘Midnight Song’ given an oddly fitting sanctity and transcendence by Mahler setting it into his Third Symphony. Yes, the world is deep. Yet we have today chosen to live only upon it, and not within its embrace. This, for the mythologist, is the truer source of the climate crisis and the overuse of our shared ecosystem.

            Divorced from the earth, our leaders no longer ‘earthly’ in that ancient sense but rather entirely worldly, we must alone confront the sheer scale of anonymous natural forces which can suddenly impinge upon our existence. The ‘natural’ disaster can sometimes be avoided with planning and foresight, and this is the argument of the Spaniards who were made victims by the recently value-neutral earth. Insurance companies, ironically still comfortable with using the phrase ‘act of God’, cannot replace creation, only repair destruction, for they are not themselves Gods. Insurance can only take action, not render act. Because we are persons, our Gods personalized, our leaders elevated but not exalted, we must come to terms with both action and labor, ‘own’ our responsibilities but not author them, and leave the act to history and the work to the arts. Only a God resurrects; its representative, more akin to a mobile organ, presides over a ritual laying on of hands, acts as the vehicle for Mana, and wields it on behalf of the people at large. The sovereign sacrifices all that is merely human, and unknowingly, for from the beginning of his presence he is no longer human. The Dalai Lama is perhaps the last vestige of the sovereign whom Bataille brilliantly analyses. Not a person, not quite human, he is gendered only for convenience, dressed only as a sign is dressed. His lot is no pillar of fire by night, but even so, the sovereign is expected to guide his people through his decisions. The body of the sovereign is culpable if other bodies fail; in this case, the earthly corpus lashing out, taking the people’s corpses into itself, in an excessive ritual of inhuman inhumation.

            What of our own expectations? It is commonly said that we expect ‘too much’ from our politicians, and not only given the dynamics of office and how one attains it. But this hypertrophic trophy, the leader, cannot connote a victory other than one political. It is not that we expect too much of the person but rather of the position. The reality is that a politician is not a sovereign, a person not a God, the office of policies not a temple of wisdoms. So, when the earth reminds us of its own current status, forever now apart from the transformational cosmology of the social contract and, more recently, divorced from its ability to at least provide recurring subsistence as a ‘land’ does for its people, we shall suffer. It is part of our drive for Babel redux that compels us to lay our too-possessive hands upon the earth, but in this we mistake the relationship a God had with the earth; we imagine that the earth was enthralled to the Mana of Being, rather than existing as its own form of being. Just so, since we are not Gods, our beings must remain ‘in the world’ and not within the earth. For only the dead make the earth their home.

            The castigation of Castile is a case of mistaken identity. At once, the politics of identity is called into question: who leads? As well, the idea of identity politics emerges more fully: we shall seek to resurrect not ourselves – once again, only I as a God could do so – but instead our tribe; that which existed before there were either sovereigns or divinities. The question is itself recurring: can we manifest the community of the social contract on a global scale without descending into the mechanical solidarity which made society possible in the first place?

            G.V. Loewen is the author of over 60 books. He was professor of the interdisciplinary human sciences for over two decades.

The ‘Zeitmotif’ and the New Art

The ‘Zeitmotif’ and the New Art (Metaphorical Realism)

            In his 1851 essay on music drama, Wagner outlines the structure of the leitmotif, a recurring theme which can be used to convey a wider sense of emotion concerning a character, an abstract force, a place, or even an intent. Used in this way, an audience is reminded of the presence of the essence of the character, force, or place etc., just as all of his principal characters themselves represent more abstract qualities in the human and historical imagination. Characters, by definition, cannot be archetypes, but they can cleave to them. And this is what we are presented with through the use of the leitmotif: it is the connecting link between person and persona, character and type, experience and essence. Insofar as the entirety of the romantic aesthetic was transformed by Wagner’s idea, the leitmotif as a concept in a sense contains itself.

            I would like to suggest a corresponding conception for history, rather than art, as an alternative to Zeitgeist, the ‘spirit of the age’ or of the times, which is either too narrow temporally if in fact realistic, or unrealistic if overextended in time, and increasingly so the more time it is claimed to be able to represent. Its thus-far place-filler converse, ‘leitgeist’, suggests a kind of transhistorical presence, a ‘recurring spirit’ which, akin to that of religious ethics or soteriological doctrines, also by definition cannot adhere to any specific time period. So then, the fourth possible term available would be ‘Zeitmotif’: a temporally contained motive which also implies historical motivation. In modern art too, the leitmotif is present, and in all arenas of popular art Wagner remains the benchmark. Video game soundtracks are a ubiquitous example of the use of leitmotifs. Every important character has their own theme, for instance, as do certain kinds of events or yet scenes. Brands have theme-songs or specific melodies attached to them, and even cartoonish characters in children’s video games have simple ditties which, due to their repetitive and even omnipresent replay, have become instantly recognizable, even outside of their digital and interactive contexts.

            By contrast, the Zeitmotif does not so much recur as characterize. Indeed, if we were to subjunct modern art to modernity more generally, we could call the leitmotif itself a Zeitmotif, for it is something which in itself is characteristic of the art of a particular age. For pre-modern or traditional relations of production or cosmologies, the leitgeist would certainly qualify as a Zeitmotif, for it too is characteristic of an important aspect of a world-system, yet as well constrained by a specific historical time period. It is not at all a contradiction to state that historical conceptions, even self-conceptions, can include the transhistorical. The reality of the former does not even imply, objectively speaking, the wider reality of the latter. Thus the ‘presence of God’, so presumed as universal in premodern contexts, has for us fallen away, both from ourselves and perhaps more radically, from itself as well. The Fall of Man becomes the less-dressed rehearsal for that of the divine.

            The history of art presupposes that each innovation seeks for itself pride of place amongst the competition to represent not only its individuated subject matter, but the very age in which it finds itself present. This intent could well be called a leitgeist, but the content of each of these successive styles or genres is clearly much more limited. We can argue that the origin of modernity in art appears with Goya, and especially his willingness to, quite without romance though indeed with the highest human drama, depict violence and horror. This period in art only begins to reach its far horizon with the advent of digital imagery. It descends to its symbolic nadir perhaps with Picasso – Guernica comes to mind – where a Catalonian catatonia is catastrophically catalogued. A Zeitmotif ignores the ‘school’ of artists, their nationality, and their respective lifespans as long as none of these fall outside the wider period identified by a Zeitgeist. In this, the two concepts are themselves linked, just as the archetypical suasion of a leitmotif casts its arm loosely round the figure of a leitgeist. The Zeitmotif in the new art acts as an aesthetic centrifuge.

            In the older sensibility, a corresponding Zeitmotif could not be subjectively identified. This is so due to its content, which spoke of transcendence and not history, the otherworld rather than the this-world. The Madonna figure is certainly, from an art historian’s perspective, a leitmotif in the sense of it being a recurring expression, and, from the view of the history of ideas, a Zeitmotif, since it is a ‘sign of the times’, as it were. But in itself, and from the context of this genre’s ‘production and consumption’ – more empathically, ‘creation and assumption’ – it is simply a snapshot of an ongoing presence, a leitgeist, rendered and portrayed yes, but neither invented by nor in any way tethered to the artist or the viewer. Only in modernity is the Zeitmotif both acknowledged as a ‘thing’ which exists and which has various contents, and as well understood as being itself a form of formal analysis. The lack of the subjective analytic in premodernity does no disservice to its art, of course, but it does place it somewhat outside our general ken. We are conscious of its presence as representative of a worldview, but not as the presence of a God.

            To reiterate most plainly: today, any leitgeist is but a Zeitmotif, any Zeitmotif a temporally truncated non-recurring leitmotif, as well as an emblem of a Zeitgeist. How then do these additional terms help us explicate apparently qualitatively different periods in art, belief, cultural practice, and even realismus?

            Our ability to recognize contrast between and amongst successive, or even simultaneous, genres in the arts provides a first clue. Brahms and Bruckner shared a number of important things: they were both major composers of their time, they had their respective followings, neither married, and both fell in love with women who were utterly unavailable to them. They met but once, enjoying together with gusto the one thing that, perhaps to their minds, they truly shared: a love of south German cuisine and beer. Brahms’ avatar was Beethoven; Bruckner’s, Wagner. Brahms had himself already endured the fashionable conflict between his followers and Wagner’s, the latter supplanted by Bruckner’s after Wagner’s death in 1883. Even so, viewed over the longer term, there is no authentic conflict here. Both composers’ apical ancestor was Beethoven, and the romanticism of the 19th century did not truly vanish until the advent of Schoenberg, a few years after both men’s own respective passings. The case is instructive on more than one count. First, we must learn to distinguish mere fashion from history proper, as the identification of a Zeitmotif can only occur within the ambit of the latter. Next, we can also begin to apprehend the phenomenological property of ‘taste’; its ‘regimes’, to borrow from Bourdieu, too formal to themselves provide egress from their also fashionable frames. For institutionalization does not an epistemology grant. Finally, but not exhaustively, a Zeitmotif collects to itself that which can often appear to be in opposition. More profound aesthetically, but no different structurally, is late 19th-century Vienna when juxtaposed with more recent feuds between popular music groups, such as The Beatles with The Rolling Stones, that same band with The Who, or Metallica with Guns N’ Roses. A Zeitmotif thus does not make distinct an aesthetic judgment, but is rather a simple phenomenological rubric.

            Further, the presence of a Zeitmotif is not limited to a specific form of content. In our example, we would have to identify as well an entire ‘Wagnerian’ persona versus a ‘Brahmsian’ one. This analytic points neither to acolyte nor, much less, to martinet, but instead attempts to disclose, in the existential sense of the term, the Dasein of the form in question; its beingness in the world. Nietzsche was, for a short time, a Wagnerian, but we would never think of the philosopher today as wearing that heart on his sleeve. And speaking of Dasein, Heidegger was a member of the NSDAP for about two years, before belatedly realizing the fuller import of that movement’s own leitmotifs. Today, only the embittered critic trades in Nazified insults when it comes to the lineage of great thinkers. By contrast, a Zeitmotif has staying power; but always within its own amalgamated arc! And yet further, and as more evidence of the parochial quality of the presence of a Zeitmotif in situ, as it were, one would strain to find a Wagnerian, a Brucknerian, or a Brahmsian today, and while each composer still has his respective fans, mostly it is the case that if one loves one, one loves all three and others besides. So, what can construct a Zeitmotif also, over time, serves to ‘deconstruct’ it – in the looser sense, surely, but indeed including the characteristics ‘differing’ (we recognize the distinct styles of yet musically kindred composers) and ‘deferring’ (we abjure judgment, recusing ourselves from stating with any final emphasis which composer is ‘superior’) – without our analytic losing its hold over the historically inclined framework, once in place.

            While a leitmotif recurs not merely to reiterate its charge’s presence but as well to remind the listener or what-have-you that indeed this presence has in fact reappeared, a Zeitmotif is much more static, relying upon its somewhat standoffish but always-fullest presence to stifle any lapse in our collective memory. In this, it maintains an advantage over the older leitgeist, which, in its very abstraction, must avail itself of its uncanny properties. Like any vision, it then risks the perennial problem of having utter authority over the visionary but that over no other. In direct contrast – and perhaps also as an historicist ‘replacement’ of the leitgeist? – the Zeitmotif’s presence rests in the mere presentification of itself; it is simply there, and thus its version of risk is that we, equally simply, neglect to note its ongoing presence, since it can rapidly recede into mundanity, as with everything else.

            Perhaps it is reasonable to conclude that it is this very otiose quality which animates the Zeitmotif, and is in turn reanimated by it. Even at the level of fashion, our spirits are at least titillated by there being the appearance of conflict. In narrative, of course, it is proverbial that ‘if one does not have conflict, one does not have a story’; that is, at all. We may venture to say the same of life, in spite of our almost innate sense that we must avoid conflict as best we can. Simmel’s discussion of the irreal, though not quite uncanny, quality of there being a life which can only be lived by experiencing a number of different lives and thus inhabiting, or yet indwelling, a number of different phases of life, resonates here; for the concept of the Zeitmotif comes home to us most intimately when we understand ourselves through its ledgered yet lustrous lens.

            G.V. Loewen is the author of over 60 books. He was professor of the interdisciplinary human sciences for over two decades.

An Ethicist Looks at Youth Pornography

An Ethicist Looks at Youth Pornography (a self-inflicted study)

            “I thought I’d become entranced with myself. I got into it because I wanted to have fun and it’s my body, right? But instead of it being ‘hey, look at me’, it quickly became ‘hey, look at all the people looking at me.’ It was all about the numbers.” (19-year-old female university student).

            “My only regret is that I started too young. I was twelve. I wouldn’t recommend it before, say, 16, as my body is now no different than it was at 16. But at 12 I was so taken with myself and that I was in control, you know? And all these thousands of people following me. But ten years later, I look at those people and say, ‘Uh, excuse me? You’re following a naked 12-year-old. There’s something seriously wrong here.’” (22-year-old female university graduate).

            “Girls who are on the net want to be on the net. It’s that simple. Many do, most don’t. It’s like anything else you do, from vaping to playing volleyball. Most don’t, but some do, who cares? And yeah, you’re told about risks, but how many suicides have there actually been? I’ve read of three in the backstory news over the past twenty years. Three! Out of tens of thousands of girls per month, who knows, maybe way more. You’ve got a better chance of being struck by lightning, speaking of risks.” (18-year-old female high school graduate).

Introduction:

            When I consulted as an expert for the senate committee tasked with setting new government policy preventing access to violent pornography by minors, I was struck by the assumptions everyone in the conversation made about the topic itself. Eventually, Bill S-210 (age verification for online porn sites) was adopted by said body on April 18, 2023. Its coverage had, perhaps inevitably, generalized itself from restricting access to ‘violent’ pornography to all online pornographic sites. In good faith, I did not suspect the bill’s sponsors of any prior intent to widen its scope. Indeed, given that one could not properly define ‘violence’ in sexual portrayals with any efficacy, and that even if one could, such (per)versions of intimacy would be mixed in with all other possible versions, given the scope of the sites in question themselves, any bill seeking to restrict minors’ access thereto would, in the end, have to ensure blanket coverage. I supported the bill as is, in practice.

            But the process left many unanswered questions. Why did minors seek out pornography, even participate directly in making it? Why did adults seek to limit such access, even ban it outright? The usual arguments hailing from developmental psychology were, to a philosopher’s mind, verging on the vacuous. Psychology itself is the source of our knowledge of children’s sexuality. Children are sexual beings. The question must rather run along the lines of the sexualization of children for adults. And this is not a question for psychology at all, but instead, one for ethics. As the American Psychiatric Association defines pedophilia using phrases such as ‘a prurient interest in children under 12’, by its own discursive and policy standards, the banning of youth access to pornography must, in turn, be argued as well along lines other than those from psychology. The argument I put forward to the committee is that, under Canadian law, persons under the age of 18 cannot have sex for money. This was the only point of consistency wherein an outright ban of access for those between ages 12 and 17 would make any ethical sense. Since pornography as an industry is not truly about sex but rather money, minors should not be able to participate in it.

            But what of pornography itself? Between parent-pandering politicians, schools concerned about lawsuits, psychologists and counselors drumming up business for themselves, and NPOs fomenting the latest moral panics, it was clear that neither clarity nor objectivity was to be found in the public sphere regarding issues surrounding youth and shared sexuality. In order to discover the reality of such a conflicted and ideologically laden scene, it was equally clear that one had to properly study it oneself. And for better or worse, I did.

The Study:

            For the past three years I employed a battery of mixed qualitative methods, including unobtrusive measures, indirect participation, and interview as well as dialogue. Participants were solicited from their online profiles found either on porn servers or from my own academic networks, and the therapists were recruited from the Psychology Today listings. I was uncomfortable from the outset about asking actual minors about their intimate doings and so I did not attempt to do so. This is a weakness in the study, as I had only past and indirect access to youth participation in pornography, through the voices of those who were youth in that past but had in the interim, before the study commenced, become legal adults. The epigraphs above are examples of hundreds of like interview out-takes. Some of the methods involved deception, including posing as a female youth online to attract groomers in hopes of disclosing the process by which illicit pornographers recruit their victims, and posing as a patient with a pornography addiction in order to access psychotherapists’ on-the-ground practices and methods of combating this medically real health issue. As a veteran member of four university research ethics boards and a co-founder of two, I was well aware of both the pitfalls of engaging in deliberate deception during research as well as the ‘dangers to self’ involved in certain kinds of human-subjects ethnography. Indeed, as an ethicist, it was often my role on such bodies to look for possible risks to researchers, and over the years I found many. This specific study presented a number of risks, since I was interacting virtually with both criminals and at-risk young adults. Perhaps ironically, perhaps fittingly, the vehicle of digital media lessened those risks for my vocation just as the informants claimed it did for theirs.

            Such a study would not have passed any ethics board I sat on – not on my watch, at least – but since I am long outside of the institutional circle, itself mostly concerned with litigation against it and less so with the truth of things, as an independent scholar I remained uniquely qualified to engage in this kind of research, having twenty years of social science fieldwork behind me, much of it in arenas of social deviance and other marginal communities such as UFO cults, American Civil War reenactors, and artists. I had as well authored the first detailed scientific study of a specific genre of sexuality, the BDSM theatre, which appeared variously in peer reviewed journals as well as in my monographs of 2006 and 2011B. Even so, this recent study was different from any other I have completed in a number of important ways: 1. I was no longer able, nor did I feel it necessary, to include amongst the methods those of direct participation. 2. The atmosphere surrounding the topic at hand was muddied beyond any possible clarity by moralizing, anxiety, and fashionable politics, as well as a vague fear of technology in general; and 3. Given 2, it is unlikely anyone will pay the least bit of attention to the nonetheless interesting results thereof.

A Summary of the Responses:

            All research vectors requiring the suite of methods outlined above were pursued simultaneously. The process evoked, in the traditionalist view, a sense of that old-world ethnographic immersion, with the major exception that I had no novel ‘natural language’ to learn, as one would do, with pith helmet atop head and notebook in hand, ‘among the natives’. Nevertheless, I found the denizens of the pornographic scene to indeed be restless in their own, sometimes fetching, manner:

            “I was like, ‘Okay, I know I’m hot’. All my friends adored me. I wanted to pay for my own college. So, I get on there and I’ve got thousands, then tens of thousands of views and so on. I felt like I was the hottest thing out there. It was very empowering. But then I checked out the competition and it was like, ‘Okay, yeah, she’s kinda hot too’, and ‘oh, uh, okay, she’s hot’, and ‘Hmm, damn, she really is hot!’ and on and on, right? And then the whole thing became kind of a spiteful, vindictive battle of who could generate the most followers and you know these were all the same people following everyone, because young guys, and huh, I guess old guys too, can’t just look at one pretty girl.” (19-year-old female university student).

            The motivation for intimate expression and display was, in the majority of cases, income-related, especially for youth but also for young adults:

            “My parents couldn’t afford college. I was the first person in my family to ever go, and the only reason I went was because I did my own internet porn. It was by far the easiest way to make money. No managers bitching you out, no guys harassing you at the workplace, no minimum wage and then getting home and taking three showers and still the fast-food grease smell is on you. Its shit, utter shit, anywhere else teens work, right? So as soon as I actually was a teenager, I got on there. I’d seen my older sister work fast-food and it killed her. Not me.” (21-year-old female college graduate).

            The sense that making pornography, even illicitly, was a superior form of both self-expression and of employment, was a major theme in interview:

            “Don’t talk to me about morality. Is it a ‘good’ thing to put on a micro-skirt and sashay your way around a restaurant, smiling and flirting and flaring your skirts and bending over surreptitiously just to generate bigger tips? Is that ‘moral’ behavior? No, you wanna look, you’re gonna pay. And the only way a young person can balance those books is by doing porn. I can’t say I love it, but its way better than anything else out there. So, save me the lecture on responsibility. I fucking told my mom to shut it, I’ve paid for college through it, and what do you know? She did.” (25-year-old female graduate student).

            I was unable to access more than a handful of young males who were willing to speak of their online activities, legal or no, but those who did manifested an apparently sincere understanding of their female counterparts:

            “I don’t know if you ever worked a shit-job in your life, no offense. But people don’t know just how badly girls are treated out there. Guys like me, almost all guys, think girls are just objects for their amusement and desire. I got so turned off by that. And then one day my girlfriend told me she was doing porn, take it or leave it. Well, if her, why not me? It only seemed fair, you dig? But only when I got into it did I gain empathy for women. There was no danger for me at least. I read that the audience for young guys is either gay men or middle-aged women! Makes me laugh because I’m not gay and I was a teenager at the time. Like, hey, my mom and her friends think I’m the shit! It was a huge joke, but the money was better than anything I could have made short of becoming an actual sex worker. But then I’d have to have actual sex with my ‘mom’, so, uh, no way!” (21-year-old male university student).

            As in most professions, amateur pornography favors men, in this case mainly because the vast majority of workers are female, even though the audience for pornography of all genres is evenly split between the dominant genders. Even so, doing pornography was still found to be alienating for some in this study:

            “With all the tech toys out there, I learned quickly that I could have much more intense pleasure than any man would be capable of giving me. Overnight, it was like, ‘well, who needs men?’. And many women I know feel that way. Like, in general. Virtual solo sex for money. Sounds perfect, you know? No obligations to anyone, no health risks like STDs, no chance of rape or whatever. And cost-free admiration. Who cares what they’re doing, right? Some people I know get off on others getting off on them. I guess they could be called exhibitionists. But all these labels do is make things clinical. It’s irrelevant. The only thing that matters in the end is the money, on the one side, and the lack of real community on the other.” (27-year-old female white-collar worker).

            The anomie, or subjective alienation, expressed by some in interview was, however, not a universal sensitivity. Feelings of loneliness and of being used developed only over time, and were associated strongly with older participants. Those younger adults who had been manufacturing and distributing illegal pornography for some years as youths shrugged off suggestions of any potential Weltschmerz in waiting:

            “Do you really think I’m going to be doing this at age thirty? One, no one would watch. Two, I should have two degrees by then and some normal job. I might even be married, who knows? That’s the whole thing about people who worry about teens and sex. They don’t understand that it’s just a phase of life, like anything else. Old people don’t have sex, or not much of it. Young people do. It’s that simple. Do you jack yourself off and record it? ‘Hey girls, the famous philosopher is fucking himself on-line! A can’t miss, that one.’ No offense, no really, but you get it right? I mean, I appreciate you doing a study like this because like, no one knows what this shit is really about.” (19-year-old female high school graduate).

            Very often, during any research process, participants themselves suggest promising lines of inquiry. So, I began to ask that seemingly simple question – what is this all about? – and the responses were intriguingly critical:

            “What is this all about? Well, for me, it’s about control. My parents tell me what to do 24/7 and after a certain age it’s like, ‘Well, go fuck yourselves’. Hah, and then, it’s, well, I can fuck myself but in a good way, unlike what others try and do to me. Okay, so now I’m in college but I still live at home. There’s nothing illegal like them hitting me, but there are still rules. The economy forces young people to stay young for far too long. I get that. I can’t afford anything by myself. Even if and when I get a degree, is that really going to set me free? Making porn is an insurance policy; that’s what it’s all about.” (18-year-old female university student).

            With more veteran producers, a semblance of a politics emerges:

            “Okay, good question, if vague. For me, it’s about exercising some sort of agency in a world that cares nothing for me. What are my skills? I have a great body and lots of energy. Fine, what else? Do I sell my body and my face for next to nothing waiting tables, or do I sell it on-line for decent wages? You tell me. You didn’t have to make that choice, no offense. But anyone who moralizes at me and anyone else who hates what I do needs to look in the mirror. Are they jealous of my youth? Are they the same people who leer at my peers who do wait tables? Yeah, I’ve ‘converted’ a few of my friends. They’ve seen the light, hah! No more butt-pinches and slaps at the fast-food joint, no more stares and comments at the sit-down restaurant. You get the picture. Long live the internet!” (23-year-old female sex worker).

            In spite of the consistent if not constant caveats generated by government agencies and NPOs alike regarding the risks to youth who involve themselves in pornography, whether as viewers or actual producers, when asked about such risks and their attendant campaigns, respondents were universally critical:

            “The only time I was stalked was when I worked fast food. You get all kinds in places like that, and all the wrong kinds, whether its people with disabilities, criminals, unhappy husbands, INCELs, you name it. And you know, like, right away, ‘this guy is dangerous’. On-line there’s no contact. If there is any danger to it, well, two can play that game, right? You expose me I expose you. The police can track your IP and the rest of it. Don’t insult me or play with me on-line. You have no idea where I live or even who I am. Those few girls who were threatened with exposure, maybe one or two killed themselves, well, how did that even happen, right? I have no fear of ruined reputation because there’s a million girls out there who look basically just like me. Do I live in Lithuania? No, but she might. And when I was still in high school it was like, ‘okay, make my day asshole’.” (20-year-old female university student).

            Not all research participants were as confident, nay, belligerent, as were some, but even the more cautious ones sneered at the nay-sayers:

            “You have to be smart with it. Of course you do. I would not tell a young girl to try this. I started when I was 15 and I learned quickly what not to do. Never invite anyone into a chat. Never focus on one consumer at the expense of others. Never say you’re single. Never offend anyone, like by saying anything about their own sexual prowess or ego. Obviously, never mention where you live or what school you go to or your real name, I mean, a ten-year-old knows that part of it, not that she should be doing what I do, but really. The biggest thing is that you’re being paid to be someone’s fantasy object, and as long as it stays at that level, there’s no risk. Like, none at all.” (19-year-old female sex worker).

            I asked producers what was going through their minds during actual performances, and correspondingly, as reported further on, I asked therapists what transpired mentally during their respective interactions with those who did, or had, performed:

            “When you’re live it’s all about the act. You’re getting pleasure and so are they. Nothing else should intrude upon this ‘duet’, if you will. It’s a total fantasy only in the sense that I would never be together for real with anyone who views me, and they know that. But they can dream, and when they do, I’m there for them, almost equally for real. The thing that pisses me off is now the 3D AI ‘girls’ are stealing my views and you know it’s not other teenagers making those. It’s some loser who has tech gear and skill and he’s making money from some of the same people I used to make money from. Pretty soon all the moralizers can just go home, with that going on. Who knows, maybe some of those religious fanatics are actually making the AI shit, trying to put real girls out of business!” (22-year-old college graduate).

            I had not thought of that possibility, as ludicrous as it may sound on the face of it. Whoever is generating artificial sex objects, however, is panning for the same guttural gold as are real persons; that much was clear. Another common response:

            “Okay, so it’s a business like any other. There’s you and there’s the competition. So, you innovate, just like any good entrepreneur. As far as the AI stuff goes, well, I have a video where I slash myself on the back of my arm and it bleeds a little, no biggie. And I say, ‘No fucking sex doll or AI mock can do that, boys.’ And some people are turned on by that, and word gets around, right? I got good responses from that one, a lot of views. People said they really appreciate me ‘being real’, and that I’m ‘not a coward’. And though I’m not quite real in one sense, I do have guts. It takes guts to make porn, which is something the haters like to forget. You try it.” (20-year-old female university student).

            The therapists and counselors involved in the study were not of one mind in their responses to being shown patterned interview out-takes with young adults. Many were shy of making any final judgment at all, which was consistent with their professional duty to act as resources rather than evaluators. The following was commonplace, whether I myself was feigning illness or no:

            “We should never moralize about sex. It doesn’t help at all. Especially for young women, I feel they are driven to place themselves at risk because they are looking for some reassurance. Not only that they are beautiful, because they know that it’s no great turn of trick to be beautiful at their age, but much more so, a kind of validation that they have some social worth, that they have a place in society more generally. What kind of place is, of course, another matter entirely.” (middle-aged female psychotherapist).

            A male professional counterpart added what turned out to be an equally well-travelled road:

            “I’ve worked as a counselor for only two decades, so while I’m still young, I have increasing difficulty identifying with youth. You told me you had the same issue as a professor, when you were still teaching. It makes me raise my eyebrows, when a teenager tells me she’s making porn, but I don’t judge. That only makes what might be a bad situation worse. Instead, I ask such a person, ‘what’s in it for you?’. I get very similar responses as you have shown me from your study. The sum of such responses is, I dare say, quite convincing.” (middle-aged male psychologist).

            Professional psychologists and counselors varied only in their methods of guiding minors or others, and in turn based this variance on whether or not the client in question actually wanted to get out of the business. No clinician or counselor with whom I spoke, either as a health research colleague or as a ‘patient’, said that they had ever recommended to a porn producer that they stop, let alone suggested that they were necessarily placing themselves at risk, contrasting mightily with the journalistic, political, and other grassroots voices regarding the topics at hand:

            “I don’t want to ever say to a young person involved in porn that, ‘there’s no risks’, but we have to look at the stats. We know that 95% of violence and abuse against minors happens in the home and from family members or friends thereof. 95% of the other 5% happens in the schools or in other like contexts, as in, where there may be coaches, music teachers, ballet instructors, and the like. We know this, and we have known this for some time. But it is only very recently that stories of such abuse are appearing, and some very high-profile ones, like the Olympic gymnasts and what-have-you. And yet parents blithely drop their kids off at ballet or whatever, and those same kids, when older, with their same trained and disciplined figures, may be making porn, because they know they have the right type of looks for it. And only then do parents hit the roof. So, there’s a problem with the whole discourse surrounding risk in our society, and I for one am glad you’re doing this expository study on one of the core arenas of these misconceptions.” (middle-aged female clinician).

            I have argued elsewhere that most organized activities for youth in our culture serve multiple, often conflicting purposes. Henry Giroux is the most sophisticated name in this part of critical discourse, but alas, I could not access him to comment upon this study. Yet psychologists themselves appeared aware that there was a studied hypocrisy afoot when it came to comparing activities such as sports and the arts with pornography. I then, in turn, threw that out in the direction of the pornographers themselves:

            “Hah! Well, that really makes me laugh. I was in ballet for years. That’s exactly how I got this body and the confidence to strut my butt, right? But dance is like all the rest of it for us girls. The adults bark at you, touch you when and where they should not under the guise of ‘positioning’, some parents even still spank their kids if they’re younger. The dance teachers don’t dare but they tell on you, right? I got it up until I was 12. Now I spank myself for money and I’m in complete control of it, which I never was as a little kid. So yup, hypocrisy? That’s basically any older adult’s middle name as far as I’m concerned.” (19-year-old female university student).

            There were many respondents who also did not see any serious difference between doing sports or dance and doing porn, given the apparel and physiognomic feats required for many athletic and performing arts venues:

            “Yeah, well, the thing of it is, what I wear online and what I wore in dance or when I was in track at school; not much difference. And I’m still doing crazy things with my body either way, so no real difference there either. And the people who showed up to watch me play volleyball in high school weren’t all there to watch the game, if you know what I mean. Same with track, same with dance. The bottom line, excuse the expression, is that people want to look at young girls, the less clothes the better, and so we’ve got all kinds of ways people can do just that. And my parents never batted an eye at it. So, it’s all porn, at the end of the day. All of it.” (18-year-old female high school graduate).           

            When I asked how making porn itself, illicitly or no, compared with just viewing it, after explaining that I was consulting for the Senate committee, a number of responses shared the following themes:

            “The viewers are losers, at least in one sense. But I’ve read other studies of porn usage. On the one hand, you have the stereotype, the INCEL guy who could never get a date, or that’s how those people think of themselves, anyways. I’ve always found that there’s someone for everyone out there, sad but true. But on the other, you have some married guy with a professional job and an attractive wife but they now have kids and he’s not getting enough. Women too, of course. So that audience isn’t losers at all, and so I have to perform with both in mind. But as far as the difference between making and just viewing it, producers have the bods and the guts, the consumers are just anyone, and they might be cowards too but I don’t really judge or care about that.” (23-year-old female sex worker).

            The other category of respondent was the groomers, but since they were, by their own tacit admission, criminals, and their process of recruiting for underage sex labor was shot through with both a cloying extortion and hortatory clichés that I felt even an eight-year-old would not fall for – though apparently, I remain naïve about such entrapment – I do not consider any of it worthy of reproduction here. Rather, I end the results section with a typical summary of responses of amateur and unaffiliated professional producers when asked to characterize the essence of the falderal surrounding their chosen workplace and their activities within:

            “It’s not for everyone. But what is? Don’t tell me I can’t do it because you don’t like it. Or you pretend you don’t. Too fucking bad. Look, I’m 18. Everywhere I go people stare at me. Do I get paid for any of that? Do I get a guy, young or old, come up to me and give me a hundred bucks and say ‘Sorry, miss, I was leering at you. I know this doesn’t make up for it, but take it anyways and just know I’d never do anything more than just look’. Never. Never in a million years would that ever happen on the street. But hey, I discovered a wondrous land where it does happen! And in that land, that’s all guys do, is ‘just look’. You hear people yelling ‘keep it real’. No, reality is what sucks. Virtual reality is a godsend. I’ll be making porn until no one is willing to pay me for it. And every critic can just suck on that. Full stop.” (18-year-old female high school graduate).

The Analytic Upshot:

            In every field study I have conducted as principal investigator, I have found that the commonplace sociological rubric regarding people defending their own values is true to life. The sentiments expressed by sex workers, of whatever age or style of performance, were no different. Even if their community is disparate, partly fictional, and connected only loosely, they still felt that they were a part of something greater than themselves as individuals. Many saw themselves as rebels with a noble cause, even as social critics. Policies which censored them or targeted them in other ways were disdained and mocked, the apparent hypocrisies of their political and parental vendors exposed. I was myself asked, on some occasions, about my role in such censorship, and I explained that, as an ethicist, I would like to see some formal accountability within the organizations benefitting from uploading their materials and profiting from them, as often as not without the original creator’s knowledge. But even this was a hedge, and I knew it. Better to state that the distinction between a shared everyday reality, which is always public and must place the whole of itself before any specific part thereof, and the semi-private reality of the internet and like venues, needs to be preserved insofar as the former does not find itself too engrossed in the latter. For cultures too can become addicted.

            The most important points raised by respondents in an ethical sense were those directed against the idea that pornography was somehow qualitatively different from other activities youth partook in, and that the conception and definition of risk within its scope was severely overblown. For myself, and from an analytic standpoint alone, there may be a sense that if young people in any society become too taken with themselves in one relatively narrow way – the perfect physical and sexual specimen – then their once-respective identities would be as narrowed. As an ethicist, I think this is the greatest danger at the level of personhood. At the level of character, I feel that there is a danger of a craven cowardice in virtual sexuality, precisely due to there not being a real other whom one must confront, conflict with, reassure and rapproche, and most of all, try to love. Given that almost all respondents themselves appeared to understand these dual dangers when questioned about them, and put their lack of interest in these ethical themes down to simply not being part of the phase of life we generally refer to as youth – thereby implying that when they were more mature, such themes would then take on more weight in their lives – I could not in turn simply dismiss such a reply. We do not yet have the longitudinal data to document this implied transition either way.

            In light and in lieu of this present absence, I will end this summary with a final quotation in this regard:

            “No one does this forever. I’m certainly not planning on it. In ten years, I’ll be married and probably have at least one kid. I’ll look like everyone else you see; that is, not great! My husband will want to fuck me at his discretion, my kids will want me to feed them, drive them somewhere, help them with their homework, all that. Right now, ‘all that’ feels like a kind of death. So, what’s so wrong with living a little before you start to die?” (19-year-old female university student).

            G.V. Loewen is the author of over 60 books in ethics, education, aesthetics, health and social theory, as well as fiction. He was professor of the interdisciplinary human sciences for over two decades.

Artificial Stupidity

Artificial Stupidity (Forget about AI; this is the real danger)

            Perhaps one of the oddest contradictions of human history is the dual character of that very history. This is so due to the fact that the major source of cultural incompetence is the same heritage through which we have become competent. The past of our world culture, all that we have known as a species and all we have experienced as individuals, is contained in this history. Some of it has been lost forever due to the vicissitudes of a history ongoing and ever-changing, and some of it remains still a secret, yet to be rediscovered. Weekly, we read of startling archaeological finds, many due to the improved technology of location and geophysics. Entire large cities which have escaped our ken for perhaps millennia emerge with all of their romance intact but with only a partial ability to communicate their knowledge to us. Shipwrecks laden with untold riches, tombs divulging their latent treasures, and all of it speaking the uncanny Ursprache of the dead up against whom we have abruptly brushed. The past does not share a precise language with the present. It tells us of its own experiences only indirectly. We understand what we can, acknowledging that there will always be something or other lost in proverbial translation.

            Because we cannot choose what has been preserved and what has been lost, we must take these finds as they are and thus hopefully as well for what they are: the slippage between the cultural competencies of the past and the needs of the present waxes and wanes depending upon temporal provenience and just how much sheer ‘stuff’ has survived. Mitigating the rapidly receding horizon whereupon remote antiquity calls across an ever-increasing chasm of incomprehension is the historical fact that the farther back one travels in the human career, the simpler things get. Cultural complexity, in its vast majority, is a product of our own era, a well-documented affair which, though subject to ideological suasion and attendant ‘rewriting’, is nevertheless almost fully present to us. From the Internet Archive to the Library of Congress and every modernist reliquary in between, the records remain. Do we have as fair a sample of the earliest of writings? Records of warehouse holdings, which spoke of the agrarian advent, a novel way of life, mingle with narratives of once solely oral myths and epics, which spoke of the way of life preceding it. The symbolic order of subsistence societies – nomadic, horticultural at most, tiny in their population load, relatively intimate with the doings of relevant animals and an always enveloping wider nature – is the murky and indirect source of what are yet world institutions. And it is one of the most extraordinary facts of history that most of the beliefs of the present have their roots in a stage of cultural development following hard upon the social contract.

            Religions with gods, the evaluated afterlife, gendered divisions of labor, domestication of animals, semi-sedentary communities, none of these was the least bit present in humanity’s earliest attempts at culture. Indeed, it is only very recently that we have begun to question them, ten to twenty millennia after their original appearance. The chief factor which disallows their critical interrogation is not that they are stupid in themselves. For a great length of time they served their diverse purposes, dexterous and generalizable, adept at adapting to other societal changes, the most profound of which came with the advent of agriculture. The symbolic literacy of antiquity contained the competence proper to those periods. For us, the question must be: is much of that same order able to function in our own society? It is rather the shame we feel, as a culture and perhaps also unconsciously: we, who would like to identify with our ancestors through our respect for their many achievements – the shibboleth at hand is of course the one about the shoulders of giants, though in fact culture has always been a collective enterprise and the singular revolutionary figure is merely a moment of cultural crystallization; a statement that the culture in question is almost done with itself and needs to move on – feel that we have been failed by our predecessors. In turn, we have failed them, perhaps through our lack of respect and our ignorance about their ways. Even so, this dual sense of being let down is not so much historical as it is personal, and its present-day source lies in the family dynamic and its cross-generational conflict.

            The past is the parent of the present. The present presents to the past its future selfhood through the child. At first content to learn by rote the manners and mannerisms of memorial consciousness, the onset of adolescence prepares the child to become a being of the future. One must literally begin to look ahead of oneself and one’s current state and status. That I cannot know the future through my own experience, that no ‘culture’ solely of the future can yet exist, inspires me to take up the ongoing historical task of creating such a thing. The future is the dialectical apex of a triangle whose bases are the past as thesis and the present as antithesis. Futurity, an elemental aspect of Dasein as a being which is always ahead of itself, expresses the Aufheben of time as experienced through culture. History is cultural time.

            If we are sometimes dismayed that time does not wait for us, we are comforted overmuch by the converse condition: history waits on us for too long. This is the case due to the perduring presence of antique symbolic items which, in their own time of efflorescence, expressed the order of those days. More than just awaiting us, however, symbolic history in addition waits upon us, is our servant, and in its savant presence we are carried away by what is in reality a narrow wisdom. With the loss of the visionary by way of the exposition of the real through science – the visionary being the apparatus which presumed upon the utter and final presence of the vision as the only means by which humanity could not only predict the future but as well actually attain it – the entire architecture of agrarian symbology collapses. Modernity states emphatically that there is no otherworld.

            Momentary in the social contract – souls unevaluated and presently returning to animate the newly born – the otherworld was then intimately a part of the this-world. Transformer beings crossed such a threshold, something only mysterious to those who lacked that specific ability. In a word, the limen to the otherworld held no mystery in itself. Here, in the primordial mindset of our human ancestors – how far back we have no way of knowing – the uncanny was merely an augmentation of reality, and not one other to it. In this, we can begin to comprehend that even with the earliest appearance of sedentism and agriculture, this perhaps original human cosmology had run its course. It was not long before it became quite formalized; the evaluations of the living were projected upon the dead. The otherworld was divorced from what was considered to be real in itself, and placed at a distance therefrom. Hierarchies in the social order, jarringly novel and also often harsh, enabled a process of judgment that no longer contained within it the will of the community as a culture entire. The ‘sentencing circles’ of the social contract became themselves null and void. Instead of the scapegoat, the law; instead of the wilderness, the prison. City replaced village, herding and thence harvesting replaced hunting, processing replaced gathering. None of this is an effort at nostalgia, quite the opposite; but can we bring the same objectivity to bear upon the ongoing presence of agrarian worldviews in our own times, while acknowledging the total loss of that which preceded them?

            If mathematics is the unknowing language of nature, self-consciously, if haltingly, understood by human beings, there is no ‘mind of God’. If the cosmos is a repetitive affair, indefinite in size and yet finite in its own history, as well unknowing, there is no ‘purpose’ to existence. It is arguably the most radical cultural item of modernity that we have granted ourselves the ability to make our own meaningfulness, bereft of either judge or judgment. For some, the otherworld as a personal hallucination conjures a remnant, a vestige of the experience of the hunter and the gatherer, alone in a forest now otherwise metaphysically forbidding, as well as physically fading fast. Though the collective unconscious may well have preserved cultural memory from these earliest periods of human consciousness, even here, in dream and waking dream alike, in the reveries of the writer and the revelations of the thinker, we remain children of our own time and no other; beings of our own world and none other. Yet given that this world, in its rather self-conscious appraisal as both the ‘this-world’ and the only world, is shot through with reminders that our present-day cultural self-understanding includes everything from the past which both burdens our endeavors and at the same time urges them onward, it is arguably the greatest challenge of our age to sort through all that may still serve us as a function without form. For the latter is gone. That agrarian framework which itself was built upon the formalization of yet earlier cosmological rubrics was lost in the shift to capital and its industrial-technical means of production. The Zeitgeist of the society it birthed, ‘bourgeois’ and individuated, places me in an ‘iron cage’ not so much of economics, but rather of symbolics and of the symbolic life.
Insanity and ‘magical thinking’ are the only spaces of the visionary, but such a culture as has sequestered the human imagination to an ideal arithmetic fosters its own idiot-savant quality. And we are as impressed with it as we in turn imagine our ancestors were of their own.

            The incompetencies of the past were often of a logistical and technical nature. That our predecessors could not observe what we now take for granted, the cosmos included, does not necessarily mean that they envisioned less than we. But we are not going to find more than the decayed and perhaps also decadent pith of those visions in the myths and mantras of ages lost. Our entire conception of essence may itself be the vestigial bigotry of bygone ballads. But if that is so, as Nietzsche for one suggested, then existence too can be called into question as symbolic of its own absence of a future conscientiousness. The romantics sought to replace it by living, the existentialists, ironically, by being. But present existence, historical in its very character, holds within it both an unquiet mélange of melodies, the sirens of stupidity, and a space within which is held all that can ignore and thus avoid the quite artificial rocks to which we are yet being drawn.

            G.V. Loewen is the author of over sixty books, and was professor of the interdisciplinary human sciences for over two decades.

On Truth and Lie in a Virtual Sense

On Truth and Lie in a Virtual Sense (it’s not 1872 anymore)

            In what is arguably the most important short essay of the 19th century, the youthful Nietzsche belatedly answers the querulous query, ‘what is truth?’ made notorious, if still resolutely apt, by Pilate. For some millennia, it was recognized that though reality could possess lies – especially the social reality constructed solely by human beings – truth, by contrast, could not. But in ‘On Truth and Lie in an Extra-Moral Sense’, (1872), Nietzsche casts aside that distinction. Truth is simply its own form of lie, coinage which has long lost its imprint of precise value and now passes as mere metal. Truth is both metaphoric and metonymic, an exalted form of euphemism that covers over the reality of it itself having been constructed and imagined by that same human consciousness which, oddly, even perversely for Nietzsche, finds succor in the misplaced ‘will to truth’. This jarring statement finds its legendary lines in “…how shadowy and transient, how aimless and arbitrary the human intellect looks within nature. There were eternities during which it did not exist. And when it is all over with the human intellect, nothing will have happened. For this intellect has no additional mission which would lead it beyond human life.” We can call this ‘nihilism’ if we want; nevertheless, in the cosmic order of things seen from the vantage point of the Victorian period, it is more true than any human truth.

            Nietzsche, however, does not dwell for overlong in the cosmic. His question is, and ever after, not so much ‘what is truth?’, but rather, ‘what is human?’ If “…to be truthful is to employ the usual metaphors”, then in a moral sense, truthfulness means merely “…the duty to lie according to a fixed convention.” We do hear, from time to time, the phrase ‘conventional truths’, which are taken to point to a kind of statement existing exiguously between truisms and ‘trivial truths’, the former chestnuts of uncertain origin but precise provenience, and the latter simple statements of self-definition; certain only because they can only reference themselves. But Nietzsche tells us that all truths are such only by convention, thus erasing these other, perhaps cowardly, distinctions. The most famous passage of the paper occurs just above these reminders, and after reproducing it here, I want to provide some discursive context, both before and after, in order to aid understanding of just how it was possible that Nietzsche, at age 28 – the same age at which Hume wrote his magnum opus A Treatise of Human Nature – was able to come up with such a succinctly damning statement of one of humanity’s most cherished possessions. “What then is truth? A movable host of metaphors, metonymies, and anthropomorphisms: in short, a sum of human relations which have poetically and rhetorically intensified, transferred, and embellished, and which, after long usage, seem to a people to be fixed, canonical, and binding. Truths are illusions which we have forgotten are illusions; they are metaphors that have become worn out and have been drained of sensuous force…”

            The previous year, Darwin’s The Descent of Man had appeared, making clear the evolutionary connection between the great apes and human beings, something which was only implied in his revolutionary 1859 work. We shared the primate branch with other creatures; apes and humans had a common ancestor. Recasting the ‘great chain of being’ was not what was most seriously unsettling about Darwin’s work, but rather that humans were to be included in it, as another animal, but one simply more evolved. Nietzsche himself found this fact regrettable in the extreme, but also found within it the source of the death of godhead, something some commentators imagine him celebrating. The son of a Protestant minister, Nietzsche was, instead, moved to devote much of the rest of his working life to coming up with both a new ethics to replace the one sourced in the divine assignation of conscience within human consciousness and a now ‘post-metaphysical’ cosmology centered around not the will to truth, but rather the will to power, ‘and nothing besides’.

            But in fact, the seeds for the exposition of the illusory qualities of human truths were sown long before Darwin’s somewhat indirect framework had taken hold over the philosophical imagination. ‘Perspectivism’, usually attributed to Nietzsche as well as fashionably misattributed to post-colonial discourses, actually first occurs with any force in Vico’s The New Science, (1725), wherein he speaks of cultures and peoples having different truths, in which they wholeheartedly believe as if theirs were the sole human knowing of the things themselves. ‘New’, of course, refers to the human sciences, the Geisteswissenschaften, as a complement for, and contrast to, the sciences of nature. The German translator of J.S. Mill’s System of Logic, (1843), came up with the term as well as its contrasting one, which ever since has given students thereof problems. Naturwissenschaften is straightforward enough, but ‘Geistes’! These ‘sciences of the spirit’ were, in the main, unimpressive to Nietzsche, with the exception that they exposed the relativistic quality of truth on the ground. Anyone who has travelled outside of their own locale knows that the sole remaining truth about truth is that its status can adhere to anything we humans need it to.

            Closer to Nietzsche’s own time, aside from Mill’s important work – it should be noted that Mill was a vigorous supporter of the nascent feminist social science, and was personal friends with a number of its progenitrixes – Marx and Engels had penned The German Ideology – 1846, but not published until 1932 – in which the phrase ‘consciousness too is a social product’ presages in a much more concise manner Nietzsche’s argument. From Vico and Hume to Mill and Marx, the sense that truth was more than merely ‘elusive’ – a sensibility hailing from the natural sciences – had been germinating in serious discourse. The irony here is, perhaps, that the entire heart of Enlightenment discourse, officially dedicated to the truth of things bereft of moral overlays, ended up losing truth itself by jettisoning its moral sources and backdrop. And it was Nietzsche who first noticed this irony.

            His essay too went unpublished for some time, but eventually this acknowledgement of evolution, on the natural science side, and of cultural perspectivism, on that of the social, gave rise to an entire discursive framework within which truth found its place beside all other human faculties; institutions, subsistence practices, cosmologies, magic, kinship, the rites of passage and so on. By 1923, W. I. Thomas’ famous ‘principle’ could be uttered almost in passing: if people define situations as real, they are real in their consequences. This is the working version of Nietzsche’s essay in a single sentence. By the mid-1930s Robert Merton could sum up the source of all inquiry into truths within the reality the Geisteswissenschaften studied in a single, precise question of his own: ‘Who benefits?’. In a word, a truth, of whatever form or function, existed due to someone or other gaining something from its remaining extant. Truths which do not function in this manner are soon overtaken by others, but the character of human truth is not altered by their simple replacement, any more than it is by their reproduction, the latter of which Nietzsche himself had concentrated most of his analysis upon.

            Today, we face another challenge to the traditional model of what a truth is or can be. If we now understand truth to be extramoral, or ‘non-moral’, what then of truths which are wholly virtual? When I first placed a virtual reality helmet upon my surefire rational head, I was astonished not only at the simulacra available, but the more so, by my ‘natural’ reactions thereto. I hesitated and even leapt back from a virtual ‘cliff’; I automatically bent forward to pet a virtual dog which, just to keep things ‘real’, had the ability to pick up a bone with its rather alien snout. I knew the experience was not real in the usual sense, and yet I still had the experience. Virtual reality is thus more like a vision, but one which can be shared through technology. The visionary has now an audience greater than himself, even if the content of the visions is just as hallucinatory as that of ages antique, filled with the equally aged who could at least be truthful to themselves. Virtual reality is the scion of the sciences of the ‘spirit’, and its panoramas, its melodramas, its illusions are exactly what would animate Nietzsche’s own sensibility had he dreamed up the idea. By contrast, the sciences of nature too have their own child, ‘augmented reality’, which is a misnomer, because what it shows to our senses through a technological prosthetic are things which are actually already there in the world. There is no ‘virtuality’ about this augmentation; yet it is not reality per se that is being augmented, but rather our sensate. We are enabled to see the guts of things, for instance, in a manner reminiscent of Husserl’s gradually building ‘glancing ray’ which, bereft of the hyletic sphere, gets at the essence of things. We can see around corners, inside compartments, splice wires and inspect semiconductors, and this is how a precise and cool empiricism would likely interpret transcendental phenomenology’s ‘noesis’.
It is a literalist litany of ‘to the things themselves’.

            And when we are dealing with mere things, truth and reality coincide most closely. Things alone, however, cannot hold our human interest. We know we are the far more curious phenomenon, and perhaps the greater proportion of that more fascinating character comes from our ability to find truth in the illusory, to make beliefs real through acting upon them, and yet also to be able to analyze and critique these attempts, seeing them as well for what they are. A consciousness that understands the very truth of truth is the result, and to my mind this is a laudable achievement. For Nietzsche, the tacit question which resonates from his seminal essay might run along the lines of ‘why then have truth at all?’. He answers it, in so many words: “So long as it is able to deceive without injuring, that master of deception, the intellect, is free; it is released from its former slavery and celebrates its Saturnalia. It is never more luxuriant, richer, prouder, more clever and more daring. […] The intellect has now thrown the token of bondage from itself.” If the cosmic truth of human existence is sobering – and perhaps a new reality of a constructed intelligence will, in fact, carry humanity’s intellect ‘beyond human life’ and thus into a more ‘truthful’ future – the worldly truths we humans have taken for a wider reality have done far more than act as agents of self-deception. Our ability to conceive of something we call the ‘truth’ is far more profound than even our corresponding ability to believe in it and thenceforth act upon it. We need the concept of truth in the same way that Nietzsche much later notes that ‘we are more in love with love itself’ than we are with the beloved. We love the truth, but truths are of passing adoration. Truth then, might be one of those Durkheimian concepts which, akin to the sacred, are able to overleap discursive shifts in metaphysics and even societal shifts in modes of production.
Nietzsche is correct about Truth and truths alike, and yet is it not more than true that in spite of this redolent gem of self-understanding, what more fully animates the human endeavor – patient and cumulative experiment in its natural science aspect, impassioned and visionary dream on that of ‘spirit’ – is that reality, after all, has itself always been virtual?

            G.V. Loewen is the author of over 60 books. He was professor of the interdisciplinary human sciences for over two decades.

Coincidence and Signage

Coincidence and Signage (The ‘prose of the world’ revisited)

            Wherever I go, I observe signs. Many are simply functional; some explaining traffic flow, keeping everyone safely moving in the appropriate directions. Other like signage would include the shingles of business and government, church and non-profit. These are basic locational signs, some with directions, which are then duplicated at the sites themselves. Without directional signage, daily life would become a hierarchical jumble of conflicting ‘local knowledge’; those having lived in a neighborhood or city having a distinct advantage over newcomers. Even so, such signage is yet handy for the experienced city-dweller, and all the more so, for those who venture forth from urban areas into those rural. Function is seldom turned to form at this level of the sign, and only when the state is anxious to make this or that political declaration, itself a sign of a different dimension, does directional signage become a vehicle for ideology.

            The other major type of signage available to us in our contemporary scene is that of marketing. It too is primarily functional, but not entirely. While one could perhaps make a gentle argument that directional and locational signage betrays our penchant for social order in its most general, even vaguest, sense, advertising carries a double intent somewhat more bodily. While first oriented to only selling a product or a service, the presence of advertising is after all also a function of a specific economic sensibility, that of capital, wherein entrepreneurs compete for market and franchise through advertising. Here, the undertone of ideology is slightly more manifest than in mere locational and directional signage, for the very rubric of capital is contained within the ad, whereas all known cultures have some semblance of basic order about them, and one that is generally seen as value neutral. Running a red light might well get the communist killed as effectively as it would the capitalist. This said, advertising is not as neutral, and when I recently saw an ad on a nearby campus exhorting students to join the Marxist society I was bemused by its patent irony. It is clear that if one desires to share anything at all, advertising is the most effective way to do it, lending credence to the everyday sense we have that the ubiquity of sheer shill is nothing more than what can be taken from it at face value.

            But the combination of signage and outcome lends to us another level of sensate; we are used to every sign being utterly honest about its information and direction. When our digital maps are slow to update, we can get frustrated, as this or that business or other site has, in the interim, moved or simply vanished. If I am on such and such street then I expect to find this or that address along it. I am aware, of course, that larger-scale businesses and other like concerns have more than one location, and so I must sift accordingly, but this is a very different challenge than not being able to find a site at all. I read a piece of signage, and through my success in following its directions, I transform that signage into a sign; that is, its information imparted becomes a force in my life, and one that has led to success. We are quite used to this tacit process of transformation and think nothing of it in the day-to-day. But its presence in our consciousness has a profound impact on how we understand and give meaning to the world more widely, as well as to the myriad of chance interactions we encounter in our smaller lives. This is so because we have already mastered, within our normative and standard rubric of routine, the ability to read the world as if it were a text.

            Foucault, as a prologue to his famed book, The Order of Things, (1966), speaks of the medieval period being dominated by a worldview that based itself upon the ‘prose of the world’. Herein, signs were everywhere, just as is in our time, signage. Signs of what, exactly? Simply put, signs that the world was an autograph creation of God. God’s work spoke for itself, in a further sense, but if one desired to accrue meaningfulness to this odd dynamic of the otherworld and the this-world, one had to learn to ‘read off’ that signature which marked a great variety of phenomena. Our interpretation of the divine hand as quill thus re-marked, more than it did simply remark upon, that same world, creating an exegetically inclined prose. The natural world had become its own scripture. Foucault’s own point is that this worldview was about to give way to the incipient version of our own, wherein sign devolves to mere signage, and the world worlds its own way, apart from either human or divine design. Here, I want to suggest that the presence of the prose of the world has not in fact been utterly overtaken by modernity, but has rather been transmuted into what, even as a baser metal than the ideals of the alchemical mindset, is still functioning as sign and indeed, even signatory.

            A modernist glance might note that God too had to advertise His works, and so marketing has in fact a longer history than it might seem, but this would ignore the contrast between a source who very much had the time to wait for His market to show authentic interest in the message being marketed, as well as the theological fact that whether or not one was ‘sold’ by its soteriological suasion, the events of the apocalypse and judgment would occur in any case. This is manifestly not the condition for contemporary marketers, hence the sometimes desperate haste and powerful panache present in their attempts to convince us that their own version of salvation is worth our while, and for that matter, our money. But advertising firms owe the majority of their success not to their inventive signage but rather to the more profound and historical fact that we are used to reading signage as if it were a sign. Advertising would have no power over us without this ability, and this interpretative skill, itself sourced in the primordial need for human beings to make their mortal lives meaningful and thus tolerable in some manner, was brought to its most sophisticated head during the medieval period.

            Anything could be a sign. What today we mainly put down to chance, happenstance, and more strongly, coincidence or yet the déjà vu of the psychologists, was for our ancestors something to be noted. Like the Logos itself, not all of which could be directly understood by humanity, the prose of the world was both present and complete. Our human faculties, on the one hand, could access some of its truths, but our human failings, on the other, prohibited an holistic comprehension of the mind of God. Indeed, we only see the charlatan in the place of the mystagogue in today’s world, wherein those who claim to know the mind of God and thus its divine will as well, are generally seen for what they are. We have no record of such figures existing in previous eras, and this makes sense insofar as they had the open book of God’s creation and will before them, and all had the same access thereto. It was the lot of the gnostic to at first make a claim about knowing the truth of things in essence and thenceforth attempt to vouchsafe such a venture through the interpretation of the world as a complement to scripture. There was certainly seen to be a symbiosis between the two, but the addition of the world as a source of God’s truth and will greatly amplified the presence of a form of cultural literacy during the related historical periods, dominated as they were by the second wave of agrarianism, that of the feudal order in the West.

            It is this very literacy that advertising now uses, rather unbeknownst to itself. Mammon has perhaps replaced Yahweh, especially amongst the latter’s original acolytes, we might gently suggest, but Yahweh seldom advertised in any indirect way. No, He made demands, followed or not, but ones that all could understand if not exactly relate to. The world cast as a prose document has its source in the God who didn’t beat around the burning bush, so to speak. Yahweh’s competitors were often obfuscatory; meaning was obscure and thus meaningfulness a chore. It is this simplicity of demand that modern marketers borrow from the Judaic dynamic, as well as the sense that they would become as Yahweh was; a mascot for a better life.

            For advertising in capital ultimately sells us its own form of earthly salvation. ‘Better living’ is its mantra, status only its mantle. What we demand from our own mode of production is worldly success, just as did the Calvinists who, not at all by coincidence, imagined that such itself would be none other than a sign of the divine salvation to come. They were to be saved, just as were the chosen people before them. It is confusing at the level of symbolics, to say the least, that the very people who so disdained the Jews sought to so emulate their situation. No doubt the worldly competition within the mercantile affairs of incipient capital was projected upon the deeper canvas of a competition for a reserved place in paradise. It was no more ludicrous for the originally marginal community of sectarians to claim that worldly wealth was a sign of God’s favor and grace than it was for a marginal ethnicity to claim for itself salvation by equally earthly kinship. But all such claims must be taken in the context of both the anxiousness that accompanies our daily rounds and the existential anxiety that is the hallmark of the human condition more generally. If such rationalizations appear as nonsensical today, it is due to our own displacing of such an anxiety into our demand for a better life in the here and now.

            Does then advertising have the grace to make that desire into reality? Perhaps not by itself, but what it does do is reduce the level of happenstance in our mundane lives so that we might be able to more appreciate the possible irruptive presence of authentic signs into that same life. Coincidence is, at base, constructed from the sameness that marks mundanity just as did God’s autograph mark the medieval world. Reason in that latter and now mostly absent world was of a superior form of consciousness, but today it comes across as the mere rationalization of an inferior form. With abundant irony, it is marketing which today mimics the call to conscience which once animated an entire culture’s aspirations. Can we then use this seemingly omnipotent source of signage in capital to both engender a better material condition for all, but as well, and more deeply, engage in a latter-day eschatological ecumenism, one in which there is re-presenced the sense that each of us, as a human being, is subject to the same ultimate forces and is the object of the same essential conditions?

            G.V. Loewen is the author of over 60 books. He was professor of the interdisciplinary human sciences for over two decades.

Malice and Co.

Malice and Co. (The Nobel and the Noble)

            When my wife and I were living back on the West Coast we knew a retired teacher who not only had the grace to read my first short fiction collection but also the generosity to extoll my ‘genius’ in an hours-long conversation afterward. During this too-pleasant evening he told us of an encounter with one of his youthful students. Then twelve, she had become attached to him in the classroom, and what do you know, the first day of summer had brought her newly minted teen self to his front door, unannounced but promptly revealing every intent to intimately engage with him. To his credit, he gently ran her off, never to return. But indeed, such a moment must force every man to ask of himself a challenging question, ‘what would I have done in his place?’. Writ small, this is the same question that history poses to each of us, man or woman or other, and its usual content runs ‘would I have worked in a death camp or been one of its victims or, in turn, done nothing at all?’. As an ethicist, in fact I cannot say what I would have done. Like an ominous version of the contextual jest, one would have ‘had to have been there’ to really get it.

            I doubt very much many of us could know, given the hypotheticals of alternate biographies and all that such might imply. Certainly, as a young professor, I had a conga line of young women at my door – brazenly so since all of them were of legal age or older – and while I was still single, I acted upon many such calls. But twelve or thirteen seems a different matter. So, when it was revealed that Alice Munro’s daughter had been molested by Munro’s second husband at all of nine years old, with him claiming it was merely a scene out of Lolita after all, I cringed. No, the character in Nabokov was twelve, not nine, and there is a world of difference at that age. Lolita also had already been placed in a criminal circumstance by Humbert, and the reader is left with both having to trust his account of things thenceforth as well as presume that the young woman was hoping to ease her predicament; ‘well, at least he won’t kill me if I have regular sex with him’. And while it is highly unlikely that any nine-year-old would be the initiator of such circumstance, at twelve or thirteen, it might be a different story. As indeed it should be, barring intimacy. I say this because by adolescence a child needs to have that sense that she is becoming her own person. In many families with whom I have consulted, there was an ‘Electraic’ tension between mother and daughter, beginning around that age: ‘She mocks me, hates me even, is jealous of my looks and freedom and thinks dad admires me and not her. Maybe he does. She attempts to control me, and yet she still gets to sleep with him. I know how to fuck her over big time, just watch me’, and so on. Of course, the father is still culpable if he enables such desires, but the desires themselves are perfectly understandable and, as an assertion of nascent selfhood, even laudable.

            But not at nine. This fellow, who served no jail time, was clearly a villain, but such proved as well to be the case for the Nobel novelist. It is this latter fact which is causing conniptions in so-called cultural circles, but once again, there is much evidence to vouchsafe the authenticity of Munro’s feelings. Upon divorce, the child who remains from this now moribund union is often subjected to resentment, even hatred. She is a reminder of a bond now sundered, the once gift of love become the spawn of bitterness. Munro’s daughter was abused twice over, first by her step-father and then by her mother, who wholly bought into the Lolita idea. This kind of thing is no odd slap in the face, also not to be countenanced of course, but rather constitutes an outright betrayal. But does any of this impinge upon Munro’s creative works, and if so, how so?

            Somewhat akin to the proverbial death camp question, such a relationship ambiguates established legacies. One thing I do know is that it’s not a problem for me. I always disdained Munro’s work; nostalgic navel-gazing from gloom and doom baby-boom. But intriguingly, and perhaps ironically, the discovery that the author herself was a villain with real feelings and conflict in her existence, which it appears she tried to suppress for decades, might well make her work the more interesting. It would have to be something big to do so, at least for myself as a fiction writer and a scholar in aesthetics. Yet culture history is replete with villains, many of such standing as to make Munro, Woody Allen and like company look themselves like nine-year-olds. The most important case must be that of Richard Wagner, whose towering genius is often seen as tainted by his vehement political anti-Semitism. It could be argued that Wagner himself had a role, however cameo, in the murder of twelve million in the camps and sixty million more elsewhere around the globe. ‘Go big’ must have been his mantra, given the Ring cycle and many other grand artistic works. But even here, with his personal sensibilities presumably reflected or at least refracted in his creations, we are left with ambiguity. His call to his Jewish musicians to ‘lose their Jewishness’ since otherwise they were ‘the perfect human beings’ might be interpreted as simply a reminder that ethnicity of any sort is both window-dressing and crutch, and decoys the noble soul away from his authenticity as a superior human being. If that was the case, I would wholly agree.

            Other famous cases of handwringing at history remain at our newly gnarled fingertips. Heidegger, also no fan of ‘The Jews’, nevertheless saved both his mentor and his lover, both Jewish, from the Nazi onslaught, suggesting that it was not ethnicity itself that he disdained but rather simple inferiority. Husserl, one of the great modern philosophers and the founder of phenomenology as a serious discourse, and Hannah Arendt, who went on to become arguably the most important female thinker of the twentieth century, were certainly neither of them inferior in any way. Richard Strauss was pushed out of his job as the Reich’s Art Director because he defended working with Jewish writers and musicians. Uh, yeah, Wagner, Heidegger, Strauss. Who is Alice Munro again?

            But aside from the wider historical context and career of what has to be by now a cliché – ‘I found out my hero was a villain, woe is me!’ – we must, as with the problem of history in general, turn the critical lens upon ourselves. That there exist people who might well wish me dead simply tells me I have lived my own life, and without reserve. One owns one’s own iniquities, and I am fortunate, equally simply, that my list contains nothing overly villainous, such as molesting children or, for that matter, running a death camp. But facts and fancies are ill-matched, and just as Nietzsche slyly reminded us that pride ultimately triumphs over memory, the critic’s own desires might well be able to vanquish history itself. For instance, I have been referred to as a child pornographer, and by someone I grew up with no less. Given the commonplace and wholly fictional idea that an author must always be culling from his own personal experience, I had to blink at the implications of such an outrageous charge. Disgusted by Lolita and Romeo and Juliet alike, for my first published fictional work I wrote something more inspiring and, in fact, more real to life, if not actually my life. To my mind, this is what a good fiction author does. They don’t just look, as one of Munro’s peers has done, at Heinlein’s If this goes on…, or yet The Odyssey, and say, ‘well, how about telling the same story but from a female perspective?’ Uh, how about it? No, rather they take up a famous trope and completely redo it, from the inside out, making it once again our own, instead of the piece of comforting nostalgia it has over the centuries become. This, by the way, was the entire intent of Queen of Hearts. Both Camelot and Calvary are now once again authentically our own stories, and not those of our distant, and dreary, ancestors.

            For distant and dreary are, at last, perhaps the two things that link Munro’s personal villainy and her cultural works. In both sets of narratives there is much suppression, much decoy behavior. That she knew these very human errors personally, and not simply by way of a creative imagination, makes her writings at once more real and less artistic. Since never the twain completely meet, each of us must then decide for herself whether we prefer art, or rather life.

            G.V. Loewen is the author of sixty books in ethics, education, aesthetics, health, social theory and other areas, as well as fiction. He was professor of the interdisciplinary human sciences for over two decades.