The Future is Plastic (Sculpting Fluid Change)

            With the major oil-producing nations shying away from a variety of bans on plastic use and waste, given that the petrochemical industry is facing a shortage of expanding commodity markets and such countries as India, Iran, and Russia reserve their ‘right to develop’; with microplastics in water supplies, gigantic festoons of plastic littering the remote oceans, plastic detritus on the beaches – to the point that certain crustaceans are now using plastic bottle caps and the like as makeshift ‘shells’; inventive creatures they must be – and with plastic recycling losing its trendiness, the bit character in The Graduate (1967) may have said more than he meant in counseling the young Dustin Hoffman about the most promising careers: “The future (really) is plastic!” This film, meant as comedy but in fact a tragedy – the culminating scenes have Hoffman playing Harold Lloyd in an updated chase sequence borrowed wholly from Girl Shy (1924), but the happy ending of Lloyd’s daredevil antics is not repeated in the more recent effort – reminds one of nothing other than the contrast between plastic items themselves, brightly colored, whimsical, toy-like, and their lingering effect upon the environment. Indeed, ‘malingering’ might be the more apt term, given their notoriously long half-lives.

            But the conception of plastic predates the actual material invention, the latter seen in interwar-period ‘Bakelite’ and other like artifacts, as varied as vintage poker chips, early electric shavers, toothbrushes, and shoe-horns, to name a few. Plastik in German is ‘sculpture’, as in the art form. And the ability to mold this new liquid polymer-like substance into any possible shape desired could only accrue to itself the same name, Anglicized but carrying the same methodic meaning. Sculpted plastic did itself appear in the galleries soon after the war, taking its place among the modernist movement, yet also pushing it along toward pop art. Plastic as a substance is seemingly as value-neutral as is its conception. The latter connotes change, not permanence, so there is an irony of contrast between the idea and the product, given once again the fact that plastic is so difficult to break down and few organisms in nature have, so to speak, the guts to do so. Certainly, we humans appear to lack them, as it is far more convenient to make like the crab and turn away from the world, sheltering under our very much artificial shells.

            Even so, the film’s enduring epigram must also be taken more literally than as a general suggestion to get a job in a specific and growing industry. The future is, by definition, plastic; fluid, as yet unformed, to be molded, the very outcome of present-day change which in turn is the future’s ownmost harbinger. The littoral litter of actual plastic objects and their shards and fragments does nothing to alter this profoundly existential condition. Unless, that is, the world does itself become uninhabitable due to its becoming inundated with things made of plastic. It is not a momentary irony after all, this contrast between the conception and the object, the idea and the product, the meaningful word and the passing thing. But we must ask, is the nascent drive to cleanse the earth of this cast-off debris, transmuting as it does into gaily Lovecraftian remnants – one can imagine that Cthulhu itself, rising from the ocean depths, is after all made up of a million tons of plastic waste held together with giant fish nets – simply a matter of rehabilitating the health of the ecosystem or does it carry some other, more essential sentiment, within it?

            The idea of the future is, oddly, itself a recent invention. For the Greeks, the future was to be as tragic as the fates of the young would-be lovers in the Hoffman film, escaped from their normative prisons, yes, but then realizing, in the final frames, that they had now come face to face with an utterly unknown – and for them, seated side by side at the very back of a bus, just as unknowable – time to come. There is no being-ahead in the Greek mythos, of course, but during the transition toward logos, the mythic temporality was shed before the mythic sensibility ever was. The past was venerated, the present deplored, the future dreaded. Speaking of rehabilitation, the first light that shone from a future point appears in the resurrection of the Christian mythos; it speaks of a future that is better than what has been. This is an impressive volte-face given the druthers of classical thought, and represents, through the midwifery of the Hebrews, a re-uptake of Egyptian thought concerning both personal destiny and the structure of the afterlife more generally. Perhaps paradoxically, the idea of a future being, as well as of a future world, is actually an older sensibility than is the idea of decay and the overall running down of things. The future as a conception comes from the past as an actuality. What is more truly resurrected is thus not a particular culture hero but rather an entire outlook, a worldview that seeks to overcome both the torpor of the present and the ultimate breakdown of the future.

            This novel vantage presents to itself an equally unexplored panorama. That the Greeks maintained vestiges of their older temporality, a cycle in which the usual linear histories are inverted – the past was somehow ahead of them and thus could be known, as is dramatized in some of the most famous literary sequences that have survived from this period, such as those that speak of ‘predestination’ in Oedipus Rex or Antigone, while the future was ‘behind’ them and was thus unknown to the present – tells us of their abject fear of the future as a looming historical space. The ‘horror vacui’ of their Geometric period in vase painting was, for the Greeks, seemingly imported into a wider worldview. Blank space, either on the surfaces of clay vessels or in the temporal imagination, could be neither condoned nor countenanced. There is a residue of this even in our present-day imagination, since the future ‘itself’ has not changed and can itself never be present for us. Toffler’s Future Shock (1970) is a well-known popular attempt to bring the future, in essence, into the present, filling up the otherwise void spaces with its abrupt presence. The author speaks of urban renewal projects, where in a short space of time the entire landscape has been transformed. This is the general character of city life, in one sense, and it is no coincidence that temples remain the most enduring structures in these otherwise fluid and very much plastic spaces. Temples stand not because of their vintage if oft warmed-over architectural styles, but rather due to the worldview they represent and the morality they express, both of which are not only archaic to capital and to modernity more widely, but as well contradict them.

            Their contrary character mimics the temporal inversion of the Greek mindset regarding history: what it was and what it meant. An urban core church tells us that the future is the past, that what is to come is actually behind us, that its origins are very much its destinations, and that we complete our mortal being in the death of the present alone. Mythos, in its timeless and principled mannerisms, can duly afford both this contradiction – in itself there is no temporal conflict as history cannot exist in myth – and its benediction; it is rather through the logos that the future regains its promise and the present thus becomes promissory. To see the temple as a mere relic is to enforce the linearity of the very Word which the new belief and its attendant world-system have bequeathed to us. But it is a literal enforcement even so, for at once it can take refuge in the umbrella ethic, imported from the East, that earthly life was to be transcended, and thus even the places of worship upon the earth would be annulled in their meaningfulness and annihilated in their objectificity, as well as being able to hang the Logos up above its own worldly speech; to not do this second part meant to hang oneself, tethered to a world forsaken and thus doomed: ‘my words fly up, my thoughts remain below’, as Shakespeare has it. Here, thought, a form of the Logos, is meant to itself retrieve the Being of mythos. No wonder, then, that we are reserved in the face of any future.

            Though history can be concretized as ‘the past’, as an official account to be found in government records such as Hansard, courtroom transcripts, policy manuals, papal tracts or missals, and many other like documents, it remains fluid due to countering events such as new archaeological discoveries or historical interpretations, as well as the vicissitudes of mortal memory and even the popular culture misrepresentations of both historical cultures and otherwise well-documented events. The future is, by definition, plastic, but by redefinition, so is the past. The present lies in a Husserlian flux, even fluxion, so that its fluidity is as undeniable as is its sheer immanence. Its ‘pure presence’, however eidetic and hence rather unavailable in its tendency to be unavailing of itself, could be seen as another way in which to ‘avoid a void’, as it were. If there was a well-ensconced horror of the vacuum in spatial representation, as the logos gained preeminence, this sentiment found itself transposed to the very cosmos: ‘nature abhors a vacuum’. Today, cosmology fills in the greatest vacuum yet discovered by science, that of open intergalactic space, with ‘dark matter’ and even darker energy that shines not observably but in fact historically, refracting the ‘ether’ of the Victorians. These and like efforts speak to us not of a simple accumulation of knowledge but the more so of a mimesis: that while nature might abhor nothingness, history deplores it, humanity avoids it – including my personal death – and temporality absolves itself from it. Thus to be plastic is to adopt an adeptly adaptive response to self-negation.

            The unshaped space is, at best, a place-to-be. Unlived time is imminent alone, without presence. Idioms such as the ‘virgin landscape’, ‘virgin seas’, ‘untapped energy’, even inertia itself, all testify to the sense that what is new is also exciting, even if it might be feared. To be the first to discover or explore something is to become a vehicle for the future. This is a metaphor of mythos, but one absorbed by the history of logos; in our very individuality we grant the safest of harbors to the idea of both uniqueness and thence the ability to be the first one to have done this or that, this specific way and no other. Simply because it is I, as an I, no one else could fill that void. Yet the goal is ever the same: to happen across a blankness and conjure forth a tapestry, to take the mute and give it voice, to transform the nothing into a something. This act is fluidity; it enacts change. Through this ability, we are able to see the future even if we have yet to fully experience it. The trick remains, however, to see in a future something which is itself different from what has previously filled such diverse voids; gaps in knowledge being perhaps the most important. Lloyd’s futurity is preferable to Hoffman’s, but between them we are called to witness the dual poles of human possibility: that I can busily color in the bald heralds of death without considering their augury and their ability to import the future into my very presence, or I can, with resolute being, step into each of them and move through them, only filling them up in passing, and thereby gaining the wisdom of that which moves all mortal life.

            G.V. Loewen is the author of over 60 books, and was professor of the interdisciplinary human sciences for over two decades.

Now you say it, now you don’t (recanting recantation)

            What is the character of the take-back? What could have so changed for me that I am myself transformed in return? That what I stated to be the case, either for myself, for another, or for the world, was either in error, ignorant or deliberate, moral or empirical, or could never have been nearest the truth in the first place? In recanting, I must pivot, change my mind or heart, or so be changed by ensuing events, including the contents of my own experiences as a person. Of course, changing one’s tune may be enforced unethically and externally, for instance by an authoritarian parent, but these kinds of recantations are themselves false. A forced choice is in fact no choice at all. Rather, I must be convinced that altering tack is not only in my best interest but also comes to me, at the least, as if I had made the choice to backtrack of my own free will.

            Three modes of recantation stand out: those of remorse, regret, and reserve. They have slightly different ethical inclinations, and thus as motives, carry a somewhat diverse suasion about them. Remorse may certainly be faked, but the conception itself generally has to do with a sense that I have indeed erred and that the error was one of character and not simply act. Regret, by contrast, has in it a sense that I could well feel it even if its source is my being caught out; that I regret not getting away with my error, most especially in its not becoming a new truth and thus able to stand alone in a more longitudinal fashion. Reserve is the most objective source of recantation. It suggests that something in the world has changed, unexpectedly or in some other manner unlikely or improbable, and my statement of the facts meant to hold into the near future is thus rendered obsolete. Reserve is built into predictions or even predications from the start, and one might even note this or that possibility as a caveat. The least sophisticated form of reserve is the ‘margin of error’ employed by predictive statistics, nodding both to the vicissitudes of sample size and the foregoing ‘history’ of the kind of test involved. Here, a take-back is equally simple: once in a while the most probable outcome does not occur.

            Importing this sensibility into the ethical life reduces human existence to a mere game of chance. At its most basic level, probability does have an agency all its own. Even so, calculating ‘the odds’ and applying them to situations where I either seek to ‘get away’ with something or other, or further, tell myself that it is unlikely I am misrecognizing my own motives by way of a reassurance that I am working for the good, is itself a form of bad faith. This is one reason why reserve is so attractive. Within its probabilistic preserve, I am neither morally nor ethically culpable. Unless the odds themselves have been misrepresented – and in this, one would already have inserted a different kind of source for potential recantation – the numbers stand alone, telling their own tale; there is no ‘school’ to be minded in such cases, and I cannot speak either inside or outside thereof. Yet in its very attraction, reserve seems to promise a way around having to face up to either authentic remorse or being compelled to exhibit regret, no matter the outcome. This is surely why those who are neither predicting the weather, election results, nor yet stock values are tempted to imagine that acts of character are no different from risk assessments.

            Reserve is, however, a possible candidate for ethical action if it is employed before any decision or statement is actually made. Though somewhat archaic, we regularly see in literature descriptions of characters who ‘act with reserve’, or who present themselves as ‘reserved’. These are understood by the reader to be observers of the human character, including their own. They neither tilt at windmills nor jump in the fire. They are associated with level-headedness, but of a moral kind and not the ‘cool under fire’ type who may well be a hothead in terms of what decisions he has previously made to place him in those kinds of situations. The reserved person is also one whom others seek out for advice or even judgment. Such characters are often more conservative than their peers, but not always. To say to oneself or to another that one harbors ‘reservations’ about this or that decision is to always be ahead of the moment. One cannot be reserved either about action or within its heady movement. Just so, the person ‘with reserve’ is seen as much more likely to have come to the correct conclusion before such action duly commences. It is only when such a character begins to become too enamored of her own observations and predictions that her countenance is altered from one of quiet confidence to a more unbridled arrogance, and this is where both remorse and regret awake to the doings of the day.

            A winning record does not by itself produce this change. One can be proved right without anyone else being aware. Entire novels have centered around this type of character, often a child, whose witness to adult doings is unmarred by the accumulated politics of experience. Such a character suffers if she discloses the truth too often, or in too sensitive a condition, but nonetheless she endures as a figure of the truth. The child in literature is oft used as a guileless messiah; she is relatively newly born to a has-been world, suggesting the ‘twice-born’ status of an elect, and she thus as well has no specific loyalty to how that world is itself run, or has been run, in the past. Hence, she is unreserved in her ability to stand back and behold within reserve. She has no agency other than her bare witness, and whatever suffering she endures at the hands of adults, the narrative can either itself take an heroic stand against it, having the youthful character never blink, never break, or in a more tragic tone, gradually but relentlessly convert the child into a wholly agentive, but otherwise utterly flawed, adult.

            And herein do we ourselves witness the appearance of both remorse and regret. In the main, the hero feels the former, the anti-hero the latter. Remorse centers on our conception of the betrayal of conscience, and this may include our own as an approximation of that of the other, or, if the other in question does not in fact feel herself to have been betrayed, nevertheless I may have betrayed myself; my own standards of ethical conduct have been transgressed; I have ‘fallen below’ my better selfhood. Conscience, whatever its ultimate source, is both the origin and the destination of remorse. One might go so far as to suggest that remorse is best characterized as a wholly internal conversation with oneself, as opposed to regret, which at some point must be recognized by others. The courtroom expression ‘the showing of remorse’, meant to facilitate a lighter sentence or a more compassionate judgment, lends itself to the fakery of charm. Authentic remorse only discloses itself, and that as an elemental ethical aspect of Dasein’s ownmost being; it is never simply displayed. In this, remorse cannot be ‘shown’, only expressed indirectly, either by one’s subsequent actions or yet inactions. Remorsefulness as an emotional state may precede such a disclosure and thence carry through to the point wherein the other has finally pardoned my error rather than merely corrected it – here we speak of forgiveness in the West or forbearance in the East, though the latter term seems to have a wider temporal usage; one can be forbearing in the same way as one can be reserved, for example, while the sense of ‘being forgiving’ or having a ‘forgiving’ personality is more awkward, even a misunderstanding of the concept – or it may become a more permanent fixture, depending on the scope and scale of my error. In mighty contrast to merely regretting an otherwise passing faux pas – here, we are often told by a friend or lover that ‘no one else noticed it, no worries’, or such-like – remorseful being is an ethical inclination of Dasein’s ownmost call to conscience, and indeed, characterizes this call in all of its arcs, returning to itself the very source of its phenomenological disposition as a being who acts as opposed to one who can only enact, such as a God or hero.

            While remorse utters a disquisitive discourse in which I am in turn called to confront my own actions, once taken, or, for the character whose combination of reserve and unflinching self-examination is superior, even before any action commences, regret is a concept that is defined only and always after the fact. Regret thus speaks, rather, inquisitively; it is always on the make to find out as precisely as possible the chances against it; that is, how likely it is to be compelled to feel itself. Remorse does not seek to avoid its own presence, while regret’s entire predisposition is to the contrary. I do not wish to regret my actions, decisions, words or deeds, nor do I wish to regret my interactions with others, especially those whom I love. But in all this, I am self-interested and to a tee. For regret is the care of the self spoken into being by way of bad faith. Remorse is a part of my very being, an authentic ‘existentiell’ of Dasein’s concernfulness and indeed, a catalyst thereto. It is part of the character of the ‘I can do it again’ as a manner of both basic learning and ethical improvement. Regret, though at first shunning the converse phenomenological realization that ‘I cannot swim in the same river twice’, has to work to overcome itself in order to at least feel a sense of relief, let alone joy, that this is in fact the essential case for human beings. To say one thing in its favor, regret has the ability to reorient my sensibilities to that relief: ‘I do not wish to return after all, I am glad it’s over, I live for today and thence for the future, and I will not live in the past.’ Indeed, regret may be so placed; it is a resident of what has come before, and I do not wish to revisit it. Remorse, in its turn, while not compelling me to return to the source of my regret, does ever move me to consider reserve to be the superior witness as itself an aspect of being-ahead.

            Regret at length utters a recantation of itself, generally without changing our ethical character. Remorse recants any such take-back, and instead settles in, in order to reshape, however slightly, the interior of our conscience. It seeks to avoid the use of recanting not only for appearance’s sake – this is another reason why it can only disclose and never display – but also as a fail-safe against human ethical error more generally. For remorseful being to work as does anxiety itself, I must orient myself not only to the futural, but as well to understand that any relevant human future can only come about if by definition it speaks no language of the past. Regret seeks the past as succor for its misery, and even remorse must eventually let go its hold over our being-concerned. Even reserve must count as one of its reservations its own self-witness, so that it does not become a simple barrier to change. At the same time, we are, as beings of finiteness and finitude alike, ethically called upon to ‘live without reserve’. How we navigate the situated conditions wherein the dynamic made of contemplation and of action wills its outcome will in turn define both ourselves and our consciences.

            G.V. Loewen is the author of over 60 books, and was professor of the interdisciplinary human sciences for over two decades.

The Decoy of Self-Improvement (a conflict of metaphysical expectations)

            I am a thrown project, arcing over what is at hand, stumbling through what is closest to me. I find I am a being in the world, a being which is completed only in mine ownmost death. I inherit nothing of my own, at first, and this cultural persona yet resonates with archetypes universal as well as with the apical ancestry of the specific culture history into which I have come. As a boy, I had a certain set of role models after which I could shape myself: the adventurer, the warrior, the navigator, the architect, the bard and so on. The list of gendered archetypes for men is no longer than that for womankind, but it is much more projective, opening onto the world and indeed, taking the world for its own. And while it is an open question whether or not the hero’s life is still superior to that of the person, we are today confined by the dynamic extant between personhood and persona, an unquiet keep into which no hero can tread.

            To insert the heroic into modernity we have invented the popular discourse of self-improvement. I am not a hero, for I live in the world of humanity alone, but I may believe that I yet can act heroically, mimicking not the character of an archetype but simply some of its behavior. Our culture heroes, after the agrarian revolution, are figures like ourselves, augmented human beings, demi-gods due to a mixed birth, miscegenative misfits who are thus mis-aligned in both the social world and that dreamscape of the pantheon. The agrarian culture heroine is marked by her divorce from animality. In pre-agrarian societies, these beings are defined by their ability to change their incarnative presence, animal spirits who can take the shape of a human being and change back again, as well as take on many other forms, relevant or appropriate to their task at hand. In my home region, it is Raven who is the leading figure in what for us is now a most alien sensibility. Raven discovered the first people in a giant clamshell washed up on a remote beach, the metaphorical image connoting some kind of deep culture memory of the Bering Strait crossings, some 20,000 to 40,000 years ago. We are told that Raven was as astonished as were the people themselves, and this too is of profound import: across the pre-agrarian consciousness, humans and animals share not only a common nature, they share a common humanity as part of that nature.

            This is the metaphysics of transformation rather than that of transfiguration, which appears much later in human history. And at this later time there is as well a split, a schism, between the great irrigation civilizations of the East and those of the Middle East and West. In the former, transcendental metaphysics came into its own, with the goal of leaving this life for something that carried one’s being far beyond it. In the West, the this-world was understood as a proving ground for the otherworld, and, in passing over the evaluative limen which demarcated the two, one was transfigured. The concepts are distinct: in transformational metaphysics, it is a two-way street. One can change into something else for a time, and then change back, as the need arose. It is highly likely this idea came from the seasonal rounds subsistence societies were compelled to rigidly follow. Even the village sites changed, and in Raven’s geographic region the winter habitation sites were considered permanent, those for the summer, nomadic and temporary, shifting to follow fish, game, and plant food. The community took on a mobile form and format in the warmer months, and settled down into a rich symbolic harvest of narrative, theater, song and dance, during those colder. It was in winter that the animal spirits and others more radically Other, such as the world-transformer Kanekelak, or the Thunderbird, appeared and thence convened with Raven’s children and all of their relations. In these cultures, the mask represented this convention of Being, allowing the transformation of the hunter and the gatherer into something archetypical.

            In the metaphysics of transfiguration, there is no going back. It is strictly a one-way street, and in the West, it was the Egyptians who invented this sensibility. There were no seasonal rounds in massive irrigation societies, from the Yellow River in China, to the Indus-Harappan in India, to Sumer and Mesopotamia, through Babylon and to the Nile. Sedentism proper had taken over, writing was invented as well as slavery, large-scale warfare, and the priesthood, this last nothing more than a ‘calumniation’, according to Nietzsche. The Epic of Gilgamesh agrees with him and indeed broadens the critique, for its major ethical theme exhorts the hero to turn his back on the accumulated wealth of the new epoch and return to the garden; the world’s undomesticated larder which by itself never quite generated enough surplus for the social stratigraphy we accept as ‘natural’ to have taken hold. It is today ours to live with as best we can, but the perduring voice of the first mythic narratives still gives us pause: what if we could engender the perfect society, the best way of human life?

            If the culture hero as a figure is the frame within which I seek to improve myself, then the return to paradise is the goal. The sensibility is still agrarian, however, for I wish to become something other to myself at present and then never go back to it. It may well be that the conflict between pre-agrarian goals attained by agrarian means is what, at base, sabotages my efforts to make today’s society into an earthly Nirvana, wherein all are treated justly and all have what they need to live at a certain qualitative standard. We have yet to discover an authentically modern self-understanding, bereft of either aspects of the social contract – the idea of paradise itself – or those of the archaic civilizations – that I can transfigure myself and thus become more than I have been. There may be, in spite of these vast gulfs of both history and memory alike, still some points of contact. Raven is a pragmatist at heart. His transformational abilities are to be employed ad hoc, and never to simply gain status. It is of especial relevance that the huge surpluses that were in fact generated by the coastal chiefdoms were here redistributed through status-enhancing displays. The Potlatch, one of Bataille’s examples of the corresponding outlet for this set of cultures’ ‘accursed share’, saw both gift-giving and destruction of valuable objects, the ritual sacrifice of slaves, and alliance-marriage of young women. It must have been a lurid, outlandish spectacle, with its combination of grotesquerie and wanton vandalism, its deep cultural theater and the very presence of the transformer beings themselves, perhaps at a bit of a distance, their forms blending with the shadows of the giant conifers and the overshadows of the more distant mountains.

            For ritual too would become more staid with the advent of agriculture. Even its grimmest displays – like the cutting out of the heart of a slave or war prisoner at the top of the cultic Meso-American pyramid; in one stroke the formidable obsidian blade would slice through the ribcage, for the heart must still be beating as it was held up to the God in question – were moments of climax. Propitiation had been altered from a simple orison to the cougar when one killed a deer, or a women’s chorus on the beach willing the safe return of the whale-hunters and their canoes, to a highly rehearsed and therefore rote repetition of liturgical prayer, in the recesses of temples meant to ape mystery without their spaces actually being mysterious, such as the cave in which one of the first people witnessed the transformers’ secret song and dance. With sedentary society, highly stratified and specialized, generating uncounted surpluses of both foodstuffs and the mouths it had to feed, cosmogony gradually loosened its hold upon cosmology, and humanity itself, by shifting its sense of the temporal into an historical cycle rather than one timeless and eternally recurring, began to insert itself into the workings of the universe.

            But nowhere in human history and prehistory alike do we find the idea of self-improvement. It is a distinctly modern sensibility, even if it attempts an amalgam of more ancient sources. I am not a hero, yet I can act heroically; I have never experienced paradise, and yet I can create my own; I seek no Olympic summit but rather only to move institutional mountains. The symbolic decoy of this novel approximation of Dasein’s own authentic arc lies in its departure from our existential lot. I cannot be an allegory of myself; I cannot live as does the archetype, for indeed the latter does not ‘live’ in any real sense at all. Even here, however, such odd delusions are not fatal, for the entire worldview with which they had been associated is long past. No, the truer decoy, beyond any symbolic distraction, rests in the sensibility that only the individual person has the mandate to improve himself, and more than this, only himself. Yet further, that the individual person is the only space in which there could be improvement, implying that society as a whole is thereby bettered only because solitary persons have elected, of their own free will and perhaps goodness of heart, to better themselves. This radically inductive approach to cultural evolution is both utterly new – pace the social planners and utopianists from More to Skinner and everyone in between – and oddly blind to its disconnect from the world. Its ethic – that I as a role model foster more compassionate attitudes and actions amongst others with whom I interact – is equivocal. Its light comes in the form of the neighbor, who is the most radically disjunctive of archetypes since he is fully human and yet has abandoned his humanity in a transformational manner. The neighbor exempts herself from the bonds and bounds of all social roles, but yet returns to the world after her heroic act is completed. The world, in the interim, has not itself been altered.

            Let me suggest then that self-improvement outside of either symbolic distraction or the delusion of induction can be understood as the irruption of the neighbor, this libertine of compassion. Such action turned to act is, phenomenologically speaking, an expression of Dasein’s call to conscience; it is bereft of the self-conscious, as in its personal Potlatch it throws to the winds all possible worry and transforms concern to care, but more importantly, it is also devoid of self-consciousness, in that the sense that I must render care to myself first and foremost is also discarded. The neighbor is a presence outside of the present; it is an action becoming act, a being-within-the-worlding, and a figure without archetype. Its humanity is perhaps primordial, and only its ethic, historical. It decoys nothing, and yet it improves something other than the self, which, in its transformation, also enacts something outside of itself and without self-reference. It allows me to become part of that which is closest to me, and, for a moment, the world is no longer simply at hand, but rather has arced itself up to meet my thrownness and take me into its essential embrace.

            G.V. Loewen is the author of over 60 books, and was professor of the interdisciplinary human sciences for over two decades.

A Lion in a Christian Den (My Ethnographic Church-Hopping)

            There is a well-known distinction made in the sociology of religion between religious belief and religious behavior. Ritual, that which engages in a public and thus shared manner of experiencing action in the world, with a view to integrating and maintaining community, is considered an external and thence observable set of behaviors. This is contrasted with belief, an internal sense or orientation that is itself maintained by the faith of that same community. The most concise and accurate definition of their amalgam comes of course from Durkheim: “Religion is society worshipping itself.” Certainly, but what then of faith? In investigating this related, but different, question, I found myself over the past quarter century attending a diversity of churches in some very different geographic and cultural regions of North America. I will briefly summarize two outstanding examples below, before attempting an equally cursory analysis.

            Mississippi: For three years I found myself in the very heart of what Mencken satirically called ‘a miasma of Methodism, a backwater of Baptism’ and so on, but in spite of appearances, these deepest of southerners more endured the ritualism of their ancestral beliefs than exhibited any sheer fanaticism concerning them. As one neighbor of mine said, ‘We’re like 7-Up; you like us, we like you’. Amicable enough, but the rider to such a sentiment included the sense that one should live and let live in the very much ‘when in Rome’ style. I too was something of an appearance, even an apparition, being a stranger in the strangest land I have ever experienced. My ad hoc but abrupt criticism of people’s beliefs and behaviors could be put down to my being a foreigner, even an ‘alien’, but there is only so long a community of like persons can put up with such before inviting the interloper to take his leave. Before this inevitable moment came, however, I had been equally invited to a great number of churches, since not only was there a plethora of choices scattered round the haunted landscape, but I also had a great diversity of contacts through my professional employ.

            I attended a Methodist church frequented by people of my ‘class’ – which did not merely refer to socio-economic status; not at all – and ‘race’ – self-explanatory in this region – and found it to be a convivial hearth of semi-reflective self-analysis. Much depends upon the minister, of course, his druthers and his education, and the more so, his concept of faith. These Methodists were engaged in a self-critique which did not extend fully into their society of upbringing, but preferred to lead by implication: ‘If I falter, it is not so much the sources of my character but the way in which I, as a character, behave’. By contrast, the Southern Baptist Convention uttered criticism only in the direction of others. I attended an example of this denomination and found it to be in most ways the very opposite of the Methodists. It was overtly anti-intellectual, defensive in its posture, preening in its delivery, and was unconcerned about the hallmark of the distinction noted at the beginning of this piece: that people who heard the sermon could not recall anything of its content when asked promptly after service ended. It was enough to see and be seen. The Mormon students who were in my classrooms were an ingratiating bunch, and I visited their ‘spaces’ and found them to be genuinely interested in learning as much as they could about other viewpoints. These were young people, often quite literally on their youth missions, and they were, in this region, often at extreme risk for violence to their persons, as Mormonism remains the devil’s work in Baptist and Evangelical territories. I also worked with a Mormon colleague whose favorite band was Van Halen and who had taken a doctorate in the social sciences. All of this likely mediocre education had made no impression upon his beliefs, but had completely altered his behavior. I also attended the Church of the Nazarene. This community was made up of blue collar professionals who had climbed one social class above their parents. It was ‘whiter than white’, excuse the apt and oft-used regional expression, and my black students looked at me with great concern and dismay upon their faces when I related my experiences with that sect. And speaking of which, I also received invitations from Black Baptist students, and these forays, simply due to my own status and the culture shock felt perhaps more by their community than by myself upon darkening their doors, made for what was by far the most genuine Christian experience of any. The Black churches were ebullient, joyful, and emotional without reserve and reservation. They certainly had their own version of the ‘false consciousness’ about them, and why not, given the circumstances of their parishioners. If salvation was unnecessary for many whites – the white churches exhibited a great self-assuredness not so much that they were in the right doctrinally but that those who accepted their sectarian sensibilities could do no wrong thus-wise – black congregants took up the work of being saved with great gusto and passion. In a word, the black churches were proud, the white, merely prideful.

            Cape Breton Island: An equally marginal economic and cultural region, this ‘white person’s reserve’ – again, excuse the local flavor – unexpectedly had a great many similarities to the deep south. It had been marginalized by historical and economic circumstances; all who could get out had gotten out long ago. It too had a haunted landscape, filled with relics, antique graveyards, historical sites and towns lost to time. The churches were, however, themselves mostly abandoned, which contrasted mightily with Mississippi and contiguous states. My wife and I sat inside venerable piles with fewer than ten others upon numerous occasions, and we were by far the youngest people present, with the exception of the pastors themselves, who were always in their twenties. The only church that was able to maintain any sort of community was the Roman Catholic, and all others were essentially extended family affairs, in perhaps a fitting mimesis of the original churches of this area, settled as it was so far back in European North American history as to have lost the ability to think itself into a future at all. The United churches had here become as had the Presbyterian and Wesleyan churches elsewhere on the continent: the last vestiges of an ailing demographic willing themselves in and out of a collective grave. Belief was sacrosanct, but in a politely delicate manner reminiscent of arsenic and old lace. There were no abandoned churches in the old south, not even museum conversions, but indeed this latter was the better fate of churches in Cape Breton.

            Whereas ritualism was mostly avoided in Mississippi and like regions, the Cape Breton churches gave the appearance of only being able to go through the motions, perhaps reflecting the very lives of their fading converts. Interestingly, tradition was cited as the chief rationale for maintaining such small parishes, and this in turn implies that most active reflection upon faith itself had long been replaced by a rote genuflection. It was personally disturbing that the two persons who had reached out to us most intimately died almost immediately after we had begun our social ties with them, one in his 80s, but the other in her 20s. They had given us the distinct impression that they had been moved by our interest and our interpretations of their work, which made their unexpected passing all the more resonant of the general passage of the wider cultural landscape and thus religion within it. The only other kind of church in this region could be called ‘new age’, or even ‘hippyesque’, and my impression of these meeting places – like some evangelicals, they disdained the term ‘church’ and did not themselves use it – was that they had collected all of those who had no familial networks through which one gained access to either the Catholic or, especially, the United options.

            Yet in almost every other way of life, the deep south and the extreme maritime regions enacted the same sensibilities and nursed the same sensitivities. Though the American Civil War yet resonated in Mississippi, it was not impossible that the Anglo-French war, occurring a century and more earlier, still had some effect in Cape Breton. One could argue that the island never did recover from the final obliteration of Louisbourg. Its simulacrum, a brilliantly executed if only slightly more profound version of Caesar’s Palace, did not, in its faux resurrection, bring any of the rest of the region with it into the very much seasonal light of a niche tourist market.

            Reflections: A small church is today simply a gathering place for those who have grown up together. It is both a surrogate and genuine family, and one cannot simply show up out of nowhere and expect to be treated as one of its own. This is what large suburban churches, such as the ‘Alliance’ network and like others, are for. Now living in Winnipeg, my wife and I have found a small church that in general acts in a Christian manner, but here too, because of my own ethnic background, a Mennonite church can afford to exhibit its ‘welcome here’. Both sides of my family are from Winnipeg, and I am myself connected to well-known scions of the Mennonite presence, even if at a generational distance. All of this is highly suggestive that due to both the utter erosion of religion’s explanatory power – its cosmogony has no such force up against scientific cosmology – and the serial scandals that plague almost all churches of whatever credo and covenant, many of them to do with sexual abuse, even the word ‘church’ has begun to accrue to itself a kind of difficult baggage. And just as, also sociologically speaking, all churches begin as cults, some also end the same way.

            At the same time, modernity has fostered its own hallmark of an absence of community, at all levels and in all of its institutions. It is a relatively simple thing to debunk belief, and an objective history of consciousness has shown that the very concept of the soul is at the least a cultural figment, at best a place-holder for an as-yet unexplored mechanism of the human psyche. We are mostly content to have supplanted its presence with an amalgam of personal conscience and the law. We have thus successfully displaced the spirit and its mortal expression in the church, but a perduring question remains: how does one replace the human heart?

            G.V. Loewen is the author of over 60 books, and was professor of the interdisciplinary human sciences for over two decades.

The ‘Ambitextrous’ (Overtone and Undertone)

            Multiple meanings in literature, marketing, politics and even within the interactions of the day to day and the face to face are nothing new. They allow the creative person to explore the human imagination, the wordsmith to get a kick, or the passive-aggressive personality to take a shot. Playing deliberately from both hands, however, the ambidextrous text presents to us a more calculated version of the double intent. The more so, such ambitexterity seeks not to be revealed, and this is its chief departure from the coincidence, pun or clever play on words. Here, the merely clever slides into the sly, the amicable wink into the leer. It is particularly evident in marketing and politics that the ambitextrous is being employed, but beyond any specific usage thereof, the very ability for it to be used underlies the social structure as a whole. While the essential polysemy of language in general presents an overtone – something that desires to be known and thus attempts to take the fore – ambitexterity occurs as a converse to this, as in fact an undertone.

            One of many possible examples of the former in popular culture, amicable, clever but in an inoffensive manner, a wink only rather than a wink followed hard on by a nod, occurs in album titles. One need only recall The Who’s 1971 Who’s Next, wherein we ourselves acknowledge the sense of it being the band’s next release, perhaps the implication that they as a band were in line for something or other – given all of the famous deaths and breakups of the period, for instance – as well as the visual jape of the band members themselves urinating on a concrete pillar and, having done their business, asking the simple question of the consumer. A decade later saw the release of Rush’s live album Exit…Stage Left, where no fewer than three possible senses may be taken: the band leaving the stage, the stage itself having been left by the band, and the stage as a space being what is left over after the band’s exeunt. Hundreds of other examples might be cited, but the point is self-evident: such overtones of polysemy are meant to be understood and quite consciously so.

            It is otherwise with the ambitextrous. Though its use might be regarded as value-neutral, its underhandedness in both its method and its goal sabotages any possible ethic that could have been seen to arise therefrom. Given that I came to the concept through writing what I hoped was no flippant flop – an oversize narrative that I took great literary pains to keep from being a novel; the end result was more a failed novel than something radically new – I also realized that a calculated effort to move the reader into another space of meaning through the unmarked vehicle of a canonical prose form was nothing more than a deception, however sophisticated or no. This instance can serve as a cautionary device for those future readers of St. Kirsten – sub-titled ‘the last novel’, and here there was authentic polysemy: at the time it was to be the final novel I myself would write; or, if not, it was the previous one, the ‘last’ one, the one beforehand; and thirdly, it was meant to be the final novel ever written by anyone, a concerted conceit but also a well-advised critique of the novelist in general, since, in a word, after this point there could no longer be a novel written at all – due out sometime in 2025. In principle, the creative effort must remain the most focused, but also the smallest, version of the ambitextrous.

            For its truer homeland is that of propaganda, and in all of its forms. As Zizek has suggested, ‘only when one comes to believe in the truth-value of propaganda can it itself be taken for the truth’. The latter is not as important as the former; one has to value the very idea of being misled. Why would anyone so value such a force? Does it seek to ever provide a suitable and tolerable veil for an oft-intolerable reality? Not quite, as this is rather the function of the social form itself, and we have understood this general principle at least since Durkheim. He suggests that ‘the air is no less heavy for the fact that we do not feel its weight’. Point taken: socialization is the most successful form of ‘propaganda’, if we are uncharitable. But if we are more objective, we understand that in order for any society to function at all, its cultural apparatus must be accepted in the majority by the majority. Its symbolic forms betray their function when investigated by either the native speaker or an outsider – even if the tools to ply such a trade must be learned formally and institutionally and are never a part of any culture’s primary socialization – and thus there is no enduring mystery about their presence. Much of historical analysis rests on these same pinions, and it is thus but a short step from dissecting a society of the past to one extant in our own time.

            The ambitexterity of ‘society’ as an abstraction rests in its ability to maintain a loyal fellowship, not a sycophantic follow-ship. Society and its polis are thus not ‘political’ in the specific sense of them being geared into the desire for power. Society has a power over us because we grant that authority to it through upholding cultural norms and participating in their corresponding forms of life. Culture trumps society just as history trumps morality. We are vehicles, in daily life, of both the passive symbols of our shared culture as well as active expressions thereof. This is why adolescence itself has at least two functions: it hones the adult’s skill in ‘maintaining the right’ in the face of youthful challenge, but at the same time, youth allows adulthood to make necessary adjustments to the social order, and in a most ad hoc manner. In this way, culture cleaves to itself the fluidity it needs to survive historical changes. It needs rebellion as much as it needs revolution, and it is up to the adult to winnow the one from the other, since the very incompleteness of socialization to be found in the adolescent disallows such persons from making that same distinction themselves.

            So far, we have seen the ambitextrous as a false mimesis of polysemy, as a calculated creative effort, and as an effect of how society itself functions through its symbolic forms. None of this is particularly underhanded, but in each of the foregoing examples, the undertonal quality is, nevertheless, present. Now we are better prepared to examine the purely propagandistic effect of the ambitextrous; this is not only its authentic practice but as well its highest self-regard. If successful in hoodwinking us into imagining that our way of life, our manner of unthought, our sense of right and our suite of prejudices are not simply the best way but in fact the only way for human beings to live, then it has served its highest master. Propaganda is least effective in any of these regards when it is served directly from the State. We are generally aware that this or that politician seeks to gain power and thence maintain it. Secondarily, the status of being someone who actually makes decisions is also in play. The vast majority of us have no such power, no such authority, and this is the majority explanation of why we tend to treat our children, and especially our adolescents, so badly. Contrary to a fashionable script, this includes almost all white heterosexual males as well; no power, no authority. The stage is thus set for the ambitextrous to take firm hold.

            Its leading edge is advertising. No matter the product being shilled, it is the landscape into which this item is set that holds the truer sale. We see non-whites, recently in a super-abundance which reflects nothing of their demographic ratio at large, but what are they doing? They are adding a pigment to an otherwise utterly Bourgeois setting. We see non-whites driving cars that in reality they cannot afford, living in gracious executive homes that are purchased by an insignificant number of their peers, spouting off in a tongue foreign to their ears, and driving their faux children to distraction by their ambitious social-climbing, made to look second nature in ads whilst in reality being a desperation of anxiousness. Just so, in order to remind us that this social order being portrayed is after all white at heart, we are yet called to witness white people doing all of the same things but mustered up with a sense of panache that non-whites are yet to master. With a salacious Schadenfreude, parents curb teenage desires in killjoy compartments, while very much in the background a reliable automobile is so noted. Reliability is itself being sold, in this sense, since teens are notoriously unreliable in every way, and it is thus an adult’s responsibility to introduce them to a general responsibility, which apparently includes never even kissing one another before one marries. Being married is thus likened to driving a reliable car; the commodity fetish in this case is not about the product at all, but rather about a sensibility.

            The ambitextrous sells what is taken for common sense, all the while actually being a sensitivity over against both change and the human imagination. It is a fear of desire, an anxiousness over personhood. It compels obedience not to the State nor even to society, both of which have their own, self-authenticating mechanisms of symbolic persuasion, as we have seen, but rather to our own worst selves; the self that masks selfishness with both a self-absorbed consumption and an aping role-play of the martinet, the one who mimics an authority he does not actually possess. That children are the chief victims of this masquerade troubles us not at all, for such of our own memories of childhood as have survived at all, and which are not diluted by the sentimental – the major function of the ambitextrous in advertising is to present family life as the very home and hearth of human happiness, another unutterable lie given the abuse statistics, for one – remind us of our having been chattel. The fascism we endured was only overcome by our converting into the fascist figurehead. We now not only live the lie of ambitexterity, we are that lie.

            In this, the ambitextrous has successfully merged propaganda with socialization. In all of the efforts of the Tyro of the State, nothing political has ever come close to the rate of success to be found in contemporary advertising. And though we can find other spaces in which the ambitextrous is present – the schools are the most obvious example – in none do we find the sheer shameless showcase of purveying sentiment in the name of mere commodity. The latter is only a bauble, a representation of a hobby or the stuff of the dilettante. It is an ongoing astonishment, for the thinking person, to weekly witness the witless wonder of a way of life based upon so contented a self-delusion.

            G.V. Loewen is the author of over 60 books, and was professor of the interdisciplinary human sciences for over two decades.

Does Gratitude lead to Complacency?

Does Gratitude lead to Complacency? (The shared character of past and future)

            To be given respite in the face of a crisis is our greatest hope. Once given, once taken, how does this affect our character? Just now, and just then, I was compelled to be resolute, facing down the end and facing up to my personal challenge; the end of complacency, of whatever sort. Resolute being, one of the elemental ‘existentials’ of Dasein, places my being before itself, and thus as well wills my personhood to walk away from itself, itself as it is today. Cultures of all credo and stripe face this same task, and by it, all of them are challenged both bodily and mightily. It is perhaps not implausible to imagine that the courage which is demanded of a single human being in the face of the as yet unknown future might somehow be scaled to suit the needs of that same person’s society. The question of individual character might become a way in which to interrogate cultural merit, a kind of ‘superorganic’ structure which germinates in the basic subsistence of any social organization. The primordial society had no sense of history, and yet, painstakingly and imperceptibly, walked into a future, even though the concept of it could not itself take hold in this original imagination. Any time we today shun this movement, we are regressing into this first being; the proto-human who, in spite of himself, evolved a penetrating and visionary consciousness.

            Resoluteness is Greek, while gratitude is Hebrew. This is one mythopoetic manner of understanding the mystagogical function of the two contrasting ethical stances. That the former is superior to the latter in theory alone does not immediately help us, for it was born in the desultory of dismal dismay; the future is nothing but the end, it’s all downhill from here. For the Hebrews, the stance is itself weaker, but the motive superior: the future is ours to walk toward and though it’s all uphill from here, nevertheless, the vantage will be worth it. With the demise of Christian metaphysics in German idealism, the willing being had but resoluteness to call upon in order to become that futural figure. Can one be grateful for the loss of gratitude? As it is so often used as a mere platitude, being grateful lacks the essential kick which propels Dasein to complete the arc of its thrown project. At the same time, resoluteness alone often dismisses what has in fact already been accomplished, and to our credit. Today, we must then ask, what is resolute gratitude? What is the means by which Dasein discloses to itself not only its futurity as a being-ahead-of-itself, but as well, its own beingness-as-it-has-been, which would include its accomplishments?

            Due to a serious health condition, I lived under the impression of the loss of futural being for about 18 months. I was recently given a clean bill of health, a second chance at life, if you will, and found it just as difficult to accept the latter as I did the former. I had become resolute, and had found gratitude, but only concerning the past. I was resolute before the sense that the past was now all I had or could have had, and grateful for this past. But taken in this way, the conceptions become salves and vanish from the vocabulary of vocation, the erudition of ethics. Here lies one of the clues to resolute gratitude: that both must orient themselves toward only the future of Dasein. One may refer to what one has completed only in the sense of Schutz’s ‘I can do it again’, as a writer might say to herself, ‘I have written so many books, why should I not write another?’, and so on. In support of this self-reference which is not back-referencing, I must as well only refer to my prior experience in the manner Schutz has also detailed, when he quotes ‘I cannot swim in the same river twice’. Experience would indeed lose its value, both as the basis for human knowledge and, as well, for any ethics, if it itself could only be repeated. This is why, in the primordial human trope, experience is limited to the daily round and to a small suite of crises in which all who live must be challenged by the call to that same life. Childbirth as the future, dying which is the past, hunting and gathering and storytelling and child-raising, as the present presents itself. Is it only the scale and detail of these essential rites of passage which have been altered over the eons?

            I want to suggest that for our own time, what has in fact been altered in a qualitative manner are the implications of mine ownmost death. During the interminable tenure of the social contract, there were no persons, and only parts of the mechanical whole dropped away. The ethnographic witness of mourning rituals in subsistence societies, however marked by astonishment and shot through with romance, nevertheless tells us that there is no one, only the many. One loved one’s group, unto death, and in that death the love of the group holds utter sway over the shared emotions. Here, experience of the human condition is the same thing for all. For us, so far removed from both the complete intimacy of the cohort – Freud’s ‘horde’ has been, in English, trailed away from itself with the over-emphasis on sheer size rather than cohesiveness, which is the other aspect the term suggests; his sense that it was paternalistic is almost assuredly an ironic projection, imported from his own analysis of the modern State – and the daily necessity for its nurturing and nourishment, cannot but see in experience only difference, not sameness. Just so, philosophers too have made it an ambition to convince us that experience must be ever new; Erlebnis and not mere Erfahrung. The lack of the novel in our lives is assuaged by the invention of theatrical experience, such as that to be found in sports and entertainment fiction. But there is nothing truly new in a game which has itself been played thousands of times, or in a script designed to appeal to a known market. In spite of this, we can be so captivated by the ongoing action that we forget the other chief aspect of authentic experience: its presence enacts not action but rather an act.

            In this, individuated experience, becoming an ‘in hand’ through its generalized call to conscience, reenacts the moments of ‘collective effervescence’, to use Durkheim’s phrase, to be found in contexts of crisis which the primordial human community endured or celebrated. That we cannot feel the presence of ‘others’ is precisely due to their being others to ourselves. This was not the case originally, and no ethic of the future would ever imply that it should so be again. We experience life only as our life, and this, in turn, invokes in us both resoluteness and gratitude. On the one hand, I am alienated by my solo adventures; ultimately, no one can fully share them, and this comes home to me most intensely when I am tasked with completing my own Dasein, when I am faced with finitude. But on the other hand, I am liberated by the very same sensibility; no one else has experienced life quite the same way as have I! This is a marvel, a wonder, and perhaps still for some, a miracle. Narrative thus becomes a means of communicating an unshared vision, rather than one of iterating a vision already known to all. Not only did this shift in human consciousness open up language to both religion and science, it transformed cosmology itself, freeing it from being the vehicle only for cosmogony. Until the ethic of the individual emerged, gently beginning in the West with the Pre-Socratics and much more radically given a futural model in the life of Jesus, our story of the universe was the story of its creation alone.

            Today, origin myths are mostly of interest to folklorists and writers of fantasy quest narratives. This ‘lorecraft’ constructs in turn a ‘worldcraft’, in a manner not so different from what must have occurred during the social contract itself. Cosmogony thus remains as a part of the theater by which the lack of novelty in modern life is partly compensated, thus as well retaining an integral aspect of its cultural value; the latter-day spectacle of the pulp fiction epic is our version of each evening’s fireside tale, told and retold in increments, night after starry night. But cosmology proper, liberated from the umbilical uroboros, is now able to investigate for itself the reality of the universe as it can be known without recompense and as only and ever presenting to our astonished senses the radically new. Cosmology is, in a word, the centerpiece of authentic human experience, for no other realm of our yet shared understanding is as alien and wondrous. It can be so simply due to its non-human character, and in this, it tells us its own story, bereft of and unrelated to our human concerns. No cosmogony has this function, and indeed, just the opposite; origin myths relate human experience to the universe, not the other way round. This is also why almost all contemporary adventure epics chart a course backward rather than into the unknown. They are attempts to recover the recipe for respite alone, and mistake their ancient form – the extended, originally oral, narrative – for their present function – to impel the present to overcome itself.

            In this, we can be, both as a culture and as persons, too grateful for the past. The resale market for cosmogonical stories remains a leading ledger of this error. We are ourselves led away from the world-as-it-is, for that is after all the function of entertainment cast only as itself. The melodramas of fiction and sports, whether live-action or ‘virtual’, present to us a world askew, a world righted, a world askew then righted, or more disturbingly, a ‘right world’; a world which is seen as being itself in the right. Seldom are we met with the future of our own world, with all of its rightness and wrongness fully in our face. ‘Is this not after all the real world?’, we may ask ourselves. ‘If so, I cannot be entertained by it; I must be resolute only, and take my gratitude from that which allows me to dispense with my obligation to the future of that world.’ In short, the future is seen only as a task, rather than as well a gift. History is also both of these, but with the past, we overemphasize the giftedness therein and turn away from its challenge. Our stance towards the future is the very opposite; we overdo the task in front of us and forget what a great gift, indeed, the greatest of gifts, it is to have a future at all.

            And just as a person can fall ill and be forced to contemplate the lack of that future and the end of one’s life, the completion of one’s Dasein, so a culture entire can sicken itself to the point of disbelief in the future, of itself and in principle. Our half-planned technical apocalypse is a dangerous gesture to this regard. The future causes in us a basic resentment toward life if we take it only as a task. Our very will to life, so essential and indeed, seen as an essence in its supplanting of the animal’s survival instinct, is muted by this overstatement of the unknown as only a threat. Along with this, the dredging of the salvaged selvedge of historical druthers distracts us from becoming conscious that what we have been, as a species, presents just as much of a challenge to us – for it tells us who we are and why, and speaks these wisdoms to us without rancor but also outside of all salvation – as it does a gift. The authentic disposition of Dasein’s response to the call to conscience as concernful being is that the past and future must be understood as equal parts curse and blessing. We cannot, as the cosmogonical viewpoint had it, simply choose the one and not the other, just as we cannot, as Nietzsche reminds us, choose joy without sorrow. We cannot choose the past without the future since it is we who walk forward resolutely from the one toward the other. Just so, this movement cannot be accomplished without gratitude, for futurity is something elemental to our being, and not merely an unknown factor to be discerned with time, an alien language to be deciphered with study. The future is, in its authenticity, of the same ethical presence as is the past, and thus requires of us the self-same sensibility; that of resolute gratitude and grateful resoluteness. Only by way of this will experience confer upon us its overcoming of complacency, and the universe will continue to be open to our wonder.

            G.V. Loewen is the author of over 60 books, and was professor of the interdisciplinary human sciences for over two decades.

The Reign in Spain

The Reign in Spain (falls mainly on the king)

            After having survived a quite literal mudslinging, Spain’s monarch must also have just as literally encountered the very ground of his rule. The sovereign, as a social role, is both the body politic and the territory, the land, whereupon his subjects rusticate. Bataille’s political sociology remains the best take on an anthropological history of the idea of the sovereign, but today we understand a ruler, whose role is both archaic and even anachronistic, to be working, perhaps with irony, to get back to his earthy roots. A monarch today represents the people over against the government and other interests. They are a relatively free agent, apparently apolitical but not non-political, symbolic of a set of values of which all are supposedly supportive. Today, the list of such values which can be represented in this old-world manner is likely much shorter than it had been in the past, but we cannot be sure of this, mainly due to the fact that historic records are not only penned by the privileged, the literate, the cultured, but also preserved by them. We have an official line, prevalent in all types of history known by us, to the point that it would not be an exaggeration to imply that all history is, to a great extent, official history.

            The sovereign was, however, not originally an historical figure at all. The position was an Aufhebung, not only propelled to the apex of the societal pyramid, but floating above that point. Like the all-seeing eye of Masonic lore, it was held in space by its divine assignation in feudalism, by its being perceived as the worldly source of Mana in traditional societies, or by its having secured a rather happenstance superiority in resource access and distribution, as in early irrigation civilizations. Held in space by the otherworld, and conversely, held in place by our shared world over which the sovereign presided but from which he must also exempt himself, the ruler’s rule is one shot through with distanciation. Today, of course, the remaining monarchs have come down to earth, with the date of 1688 being important to that regard. 1789 would not have been possible without the movement from monarch to parliament. Yet it is 1789 and not 1688 which allows us to become nostalgic for the monarchy and, in regions where such persons yet exist, such as Spain, imagine that the sovereign has a populist responsibility, an authentic obligation to ‘the people’ which, in turn, is the only thing that authenticates his existence as well as the continued existence of the role itself.

            Just as we have made God a fellow traveller, so the sovereign must also fall into that same worldly line. Lineage is now part of an antiquarian, even a dilettantish or yet Whiggish, history, and nothing more. A royal genealogy may be romantic, but it gives the current title-holder no moral purchase upon how responsible one is or what responsibilities one has. And the personalization of religion, which is easier to shoulder than that of politics due to the abstract and essential quality of the divine, is both a practice-run at making leadership itself worldly, as well as a hedge. The nautical phrase, ‘having one anchor out to windward’ applies to modern religion, especially Protestantism, in that we can still claim belief. We speak to a personalized godhead but we still have faith that someone is listening to us. Our relationship with sovereignty is muddier than this.

            Apropos, today’s monarchs are philanthropists in every sense of the term. They work for charitable organizations, they lend their status to benevolent causes, they labor on behalf of non-governmental organizations, they travel the world for the cause of surface diplomacy – nothing important actually ‘gets done’ on such junkets; monarchs do not negotiate the brass tacks of contemporary geopolitics – and they make appearances at arts and cultural events. They are taxed by their abstract origin; they must appear to be everywhere at once. To be seen but not heard in this overtaxed manner makes the sovereign into a young child. The monarch has no voice in any case, and to ‘blame’ him for his nation’s woes, natural or cultural it matters not, is to mistake both his person and his role. In the capacity of the former, he is like any of the rest of us, covered in mud by mudslides, suffocating to death if in the wrong place at the wrong time. As to the latter, the monarch has no political power, no Realpolitik, if you will. And while many of us have imagined, perhaps as children ourselves, that it would be a lark to fling mud at a king no less, the act is itself symbolic, participating in that near-primordial order of affairs where the sovereign’s very being is lived on the land through and by myself.

            This same land had betrayed its people, murdering them ruthlessly and anonymously. Ergo, the king had demonstrated that self-same betrayal. This was no mere matter of sympathetic magic; the sovereign is the land as well as is the people, and so in him, through a natural disaster, an internecine conflict occurred. The Lisbon earthquake was interpreted by some as evidence for the absence of God in the world. The world had, in that case, betrayed itself, shuddering to its foundations the culture that had grown from it, shaking in its essence with the parturition from the source of its own creation. There is no Erda in our contemporary narrative. Wisdom comes not from the earth but rather from the greater cosmos, the only remaining presence that can mimic both the distanciated being of the divine and its royal representative, as well as the abstract quality of the moral Mana necessary to keep everything in its static place. Just so, all populist politicians, none of them remotely royal or abstract, claim to be ‘the anointed’ – a recent report had one Trump follower referring to him using that exact phrase – and if one is loyal to them, they shall return the earth to its former order. The ‘again’ of these slogans is what is truly disturbing about them, not the idea of greatness.

            But Bataille reminds us that an authentic sovereign had no need to make claims of any kind. Just as the one who possesses what possesses her, the person of faith, the one who has no need to express or expound that faith to others – her acts alone speak the voice of the greater being, which is why some faiths refer to them as ‘works’; a direct nod to the sense that the divine ‘works’ through us – the sovereign acts without having to take action, utters without speaking, works without laboring. No mere politician can accomplish any of these things, but neither should they try to do so. Self-sacrifice is the lot of the modern leader, for she remains a person even when occupying her lead role. Not only was the sovereign never a self, he had no personal relationships. The people were his embodied action in the world, the land his deeper hearth. ‘The world is deep’, Nietzsche intones, the seriousness of Zarathustra’s ‘Midnight Song’ given an oddly fitting sanctity and transcendence by Mahler setting it into his Third Symphony. Yes, the world is deep. Yet we have today chosen to live only upon it, and not within its embrace. This, for the mythologist, is the truer source of the climate crisis and the overuse of our shared ecosystem.

            Divorced from the earth, our leaders no longer ‘earthly’ in that ancient sense but rather entirely worldly, we must alone confront the sheer scale of anonymous natural forces which can suddenly impinge upon our existence. The ‘natural’ disaster can sometimes be avoided with planning and foresight, and this is the argument of the Spaniards who were made victims by the recently value-neutral earth. Insurance companies, ironically still comfortable with using the phrase ‘act of God’, cannot replace creation, only repair destruction, for they are not themselves Gods. Insurance can only take action, not render act. Because we are persons, our Gods personalized, our leaders elevated but not exalted, we must come to terms with both action and labor, ‘own’ our responsibilities but not author them, and leave the act to history and the work to the arts. Only a God resurrects; its representative, more akin to a mobile organ, presides over a ritual laying on of hands, acts as the vehicle for Mana, and wields it on behalf of the people at large. The sovereign sacrifices all that is merely human, and unknowingly, for from the beginning of his presence he will not be human. The Dalai Lama is perhaps the last vestige of the sovereign whom Bataille brilliantly analyses. Not a person, not quite human, he is gendered only for convenience, dressed only as a sign is dressed. His lot is no pillar of fire by night, but even so, the sovereign is expected to guide his people through his decisions. The body of the sovereign is culpable if other bodies fail; in this case, the earthly corpus lashing out, taking the people’s corpses into itself, in an excessive ritual of inhuman inhumation.

            What of our own expectations? It is commonly said that we expect ‘too much’ from our politicians, and not only given the dynamics of office and how one attains it. But this hypertrophic trophy, the leader, cannot connote a victory other than one political. It is not that we expect too much of the person but rather of the position. The reality is that a politician is not a sovereign, a person not a God, the office of policies not a temple of wisdoms. So, when the earth reminds us of its own current status, forever now apart from the transformational cosmology of the social contract and, more recently, divorced from its ability to at least provide recurring subsistence as a ‘land’ does for its people, we shall suffer. It is part of our drive for Babel redux that compels us to lay our too-possessive hands upon the earth, but in this we mistake the relationship a God had with earth; that we imagine the earth was enthralled to the Mana of Being, rather than it itself existing as its own form of being. Just so, since we are not Gods, our beings must remain ‘in the world’ and not within the earth. For only the dead make the earth their home.

            The castigation of Castile is a case of mistaken identity. At once, the politics of identity is called into question: who leads? As well, the idea of identity politics emerges more fully: we shall seek to resurrect not ourselves – once again, only I as a God could do so – but instead our tribe; that which existed before there were either sovereigns or divinities. The question is itself recurring: can we manifest the community of the social contract on a global scale without descending into the mechanical solidarity which made society possible in the first place?

            G.V. Loewen is the author of over 60 books. He was professor of the interdisciplinary human sciences for over two decades.

An Ethicist Looks at Youth Pornography

An Ethicist Looks at Youth Pornography (a self-inflicted study)

            “I thought I’d become entranced with myself. I got into it because I wanted to have fun and it’s my body, right? But instead of it being ‘hey, look at me’, it quickly became ‘hey, look at all the people looking at me.’ It was all about the numbers.” (19-year-old female university student).

            “My only regret is that I started too young. I was twelve. I wouldn’t recommend it before say, 16, as my body is now no different than it was at 16. But at 12 I was so taken with myself and that I was in control, you know? And all these thousands of people following me. But ten years later, I look at those people and say, ‘Uh, excuse me? You’re following a naked 12-year-old. There’s something seriously wrong here.’” (22-year-old female university graduate).

            “Girls who are on the net want to be on the net. It’s that simple. Many do, most don’t. It’s like anything else you do, from vaping to playing volleyball. Most don’t, but some do, who cares? And yeah, you’re told about risks, but how many suicides have there actually been? I’ve read of three in the backstory news over the past twenty years. Three! Out of tens of thousands of girls per month, who knows, maybe way more. You’ve got a better chance of being struck by lightning, speaking of risks.” (18-year-old female high school graduate).

Introduction:

            When I consulted as an expert for the Senate committee tasked with setting new government policy preventing access to violent pornography by minors, I was struck by the assumptions everyone in the conversation made about the topic itself. Eventually, Bill S-210 (age verification for online porn sites) was adopted by said body on April 18, 2023. Its coverage had, perhaps inevitably, generalized itself from restricting access to ‘violent’ pornography to all online pornographic sites. In good faith, I did not suspect the bill’s sponsors of any prior intent to widen the scope of the bill, and indeed, given that one could not properly define ‘violence’ in sexual portrayals with any efficacy, and that even if one could, such (per)versions of intimacy would be mixed in with all other possible versions, given the scope of the sites in question themselves, any bill seeking to restrict access thereto for minors would, in the end, have to entail blanket coverage. I supported the bill as is, in practice.

            But the process left many unanswered questions. Why did minors seek out pornography, even participate directly in making it? Why did adults seek to limit such access, even ban it outright? The usual arguments hailing from developmental psychology were, to a philosopher’s mind, verging on the vacuous. Psychology itself is the source of our knowledge of children’s sexuality. Children are sexual beings. The question must rather run along the lines of the sexualization of children for adults. And this is not a question for psychology at all, but instead, one for ethics. As the American Psychiatric Association defines pedophilia using phrases such as ‘a prurient interest in children under 12’, by its own discursive and policy standards, the banning of youth access to pornography must, in turn, be argued as well along lines other than those from psychology. The argument I put forward to the committee is that, under Canadian law, persons under the age of 18 cannot have sex for money. This was the only point of consistency wherein an outright ban of access for those between ages 12 and 17 would make any ethical sense. Since pornography as an industry is not truly about sex but rather money, minors should not be able to participate in it.

            But what of pornography itself? Between parent-pandering politicians, schools concerned about lawsuits, psychologists and counselors drumming up business for themselves, and NPOs fulminating about the latest moral panics, it was clear that neither clarity nor objectivity was to be found in the public sphere regarding issues surrounding youth and shared sexuality. In order to discover the reality of such a conflicted and ideologically laden scene, it was equally clear that one had to properly study it oneself. And for better or worse, I did.

The Study:

            For the past three years I employed a battery of mixed qualitative methods, including unobtrusive, indirect participation, and interview as well as dialogue. Participants were solicited from their on-line profiles found either on porn servers or from my own academic networks, and the therapists were recruited from the Psychology Today listings. I was, from the outset, uncomfortable about asking actual minors about their intimate doings and so I did not attempt to do so. This is a weakness in the study, as I had only past and indirect access to youth participation in pornography, through the voices of those who were youth in that past but had in the interim, before the study commenced, become legal adults. The epigraphs above are examples of hundreds of like interview out-takes. Some of the methods involved deception, including posing as a female youth online to attract groomers in hopes of disclosing the process by which illicit pornographers recruit their victims, and posing as a patient with a pornography addiction in order to access psychotherapists’ on-the-ground practices and methods of combatting this medically real health issue. As a veteran member of four university research ethics boards and a co-founder of two, I was well aware of both the pitfalls of engaging in deliberate deception during research as well as the ‘dangers to self’ involved in certain kinds of human subjects ethnography. Indeed, as an ethicist, it was often my role on such bodies to look for possible risks to researchers and over the years, I found many. This specific study presented a number of risks, since I was interacting virtually with both criminals and at-risk young adults. Perhaps ironically, perhaps fittingly, the vehicle of digital media lessened those risks for my vocation just as the informants claimed it did for theirs.

            Such a study would not have passed any ethics board I sat on – not on my watch, at least – but since I am long outside of the institutional circle, itself mostly concerned with litigation against it and less so about the truth of things, as an independent scholar I remained uniquely qualified to engage in this kind of research, having twenty years of social science fieldwork behind me, much of it in arenas of social deviance and other marginal communities such as UFO cults, American Civil War reenactors, and artists. I had as well authored the first detailed scientific study of a specific genre of sexuality, the BDSM theatre, which appeared variously in peer reviewed journals as well as in my monographs of 2006 and 2011B. Even so, this recent study was different from any other I have completed in a number of important ways: 1. I was no longer able, nor did I feel it necessary, to include amongst the methods those of direct participation; 2. the atmosphere surrounding the topic at hand was muddied beyond any possible clarity by moralizing, anxiety, and fashionable politics, as well as a vague fear of technology in general; and 3. given 2, it is unlikely anyone will pay the least bit of attention to the nonetheless interesting results thereof.

A Summary of the Responses:

            All vectors requiring the suite of methods outlined above were ongoing simultaneously. This evoked, in the traditionalist view, a sense of that old-world ethnographic immersion, with the major exception that I had no novel ‘natural language’ to learn, as one would do, with pith helmet atop head and notebook in hand, ‘among the natives’. Nevertheless, I found the denizens of the pornographic scene to indeed be restless in their own, sometimes fetching, manner:

            “I was like, ‘Okay, I know I’m hot’. All my friends adored me. I wanted to pay for my own college. So, I get on there and I’ve got thousands, then tens of thousands of views and so on. I felt like I was the hottest thing out there. It was very empowering. But then I checked out the competition and it was like, ‘Okay, yeah, she’s kinda hot too’, and ‘oh, uh, okay, she’s hot’, and ‘Hmm, damn, she really is hot!’ and on and on, right? And then the whole thing became kind of a spiteful, vindictive battle of who could generate the most followers and you know these were all the same people following everyone, because young guys, and huh, I guess old guys too, can’t just look at one pretty girl.” (19-year-old female university student).

            The motivation for intimate expression and display was, in the majority of cases, income-related, especially for youth but also for young adults:

            “My parents couldn’t afford college. I was the first person in my family to ever go, and the only reason I went was because I did my own internet porn. It was by far the easiest way to make money. No managers bitching you out, no guys harassing you at the workplace, no minimum wage and then getting home and taking three showers and still the fast-food grease smell is on you. It’s shit, utter shit, anywhere else teens work, right? So as soon as I actually was a teenager, I got on there. I’d seen my older sister work fast-food and it killed her. Not me.” (21-year-old female college graduate).

            The sense that making pornography, even illicitly, was a superior form both of self-expression and of employment was a major theme in interview:

            “Don’t talk to me about morality. Is it a ‘good’ thing to put on a micro-skirt and sashay your way around a restaurant, smiling and flirting and flaring your skirts and bending over surreptitiously just to generate bigger tips? Is that ‘moral’ behavior? No, you wanna look, you’re gonna pay. And the only way a young person can balance those books is by doing porn. I can’t say I love it, but it’s way better than anything else out there. So, save me the lecture on responsibility. I fucking told my mom to shut it, I’ve paid for college through it, and what do you know? She did.” (25-year-old female graduate student).

            I was unable to access more than a handful of young males who were willing to speak of their online activities, legal or no, but those that did manifested an apparently sincere understanding for their female counterparts:

            “I don’t know if you ever worked a shit-job in your life, no offense. But people don’t know just how badly girls are treated out there. Guys like me, almost all guys, think girls are just objects for their amusement and desire. I got so turned off by that. And then one day my girlfriend told me she was doing porn, take it or leave it. Well, if her, why not me? It only seemed fair, you dig? But only when I got into it did I gain empathy for women. There was no danger for me at least. I read that the audience for young guys is either gay men or middle-aged women! Makes me laugh because I’m not gay and I was a teenager at the time. Like, hey, my mom and her friends think I’m the shit! It was a huge joke, but the money was better than anything I could have made short of becoming an actual sex worker. But then I’d have to have actual sex with my ‘mom’, so, uh, no way!” (21-year-old male university student).

            As in most professions, amateur pornography favors men, in this case mainly because the vast majority of workers are female, even though the audience for pornography of all genres is evenly split between the dominant genders. Even so, doing pornography was still found to be alienating for some in this study:

            “With all the tech toys out there, I learned quickly that I could have much more intense pleasure than any man would be capable of giving me. Overnight, it was like, ‘well, who needs men?’. And many women I know feel that way. Like, in general. Virtual solo sex for money. Sounds perfect, you know? No obligations to anyone, no health risks like STDs, no chance of rape or whatever. And cost-free admiration. Who cares what they’re doing, right? Some people I know get off on others getting off on them. I guess they could be called exhibitionists. But all these labels do is make things clinical. It’s irrelevant. The only thing that matters in the end is the money, on the one side, and the lack of real community on the other.” (27-year-old female white-collar worker).

            The anomie, or subjective alienation, expressed by some in interview was, however, not a universal sensitivity. Feelings of loneliness and of being used developed only over time, and were associated strongly with older participants. Those younger adults who had been manufacturing and distributing illegal pornography for some years as youths shrugged off suggestions of any potential Weltschmerz in waiting:

            “Do you really think I’m going to be doing this at age thirty? One, no one would watch. Two, I should have two degrees by then and some normal job. I might even be married, who knows? That’s the whole thing about people who worry about teens and sex. They don’t understand that it’s just a phase of life, like anything else. Old people don’t have sex, or not much of it. Young people do. It’s that simple. Do you jack yourself off and record it? ‘Hey girls, the famous philosopher is fucking himself on-line! A can’t miss, that one.’ No offense, no really, but you get it right? I mean, I appreciate you doing a study like this because like, no one knows what this shit is really about.” (19-year-old female high school graduate).

            Very often, during any research process, participants themselves suggest promising lines of querying. So, I began to ask that seemingly simple question, and the responses were intriguingly critical:

            “What is this all about? Well, for me, it’s about control. My parents tell me what to do 24/7 and after a certain age it’s like, ‘Well, go fuck yourselves’. Hah, and then, it’s, well, I can fuck myself but in a good way, unlike what others try and do to me. Okay, so now I’m in college but I still live at home. There’s nothing illegal like them hitting me, but there are still rules. The economy forces young people to stay young for far too long. I get that. I can’t afford anything by myself. Even if and when I get a degree, is that really going to set me free? Making porn is an insurance policy; that’s what it’s all about.” (18-year-old female university student).

            With more veteran producers, a semblance of a politics emerges:

            “Okay, good question, if vague. For me, it’s about exercising some sort of agency in a world that cares nothing for me. What are my skills? I have a great body and lots of energy. Fine, what else? Do I sell my body and my face for next to nothing waiting tables, or do I sell it on-line for decent wages? You tell me. You didn’t have to make that choice, no offense. But anyone who moralizes at me and anyone else who hates what I do needs to look in the mirror. Are they jealous of my youth? Are they the same people who leer at my peers who do wait tables? Yeah, I’ve ‘converted’ a few of my friends. They’ve seen the light, hah! No more butt-pinches and slaps at the fast-food joint, no more stares and comments at the sit-down restaurant. You get the picture. Long live the internet!” (23-year-old female sex worker).

            In spite of the consistent if not constant caveats generated by government agencies and NPOs alike regarding the risks to youth who involve themselves in pornography, whether as viewers or actual producers, when asked about such risks and their attendant campaigns, respondents were universally critical:

            “The only time I was stalked was when I worked fast food. You get all kinds in places like that, and all the wrong kinds, whether it’s people with disabilities, criminals, unhappy husbands, INCELs, you name it. And you know, like, right away, ‘this guy is dangerous’. On-line there’s no contact. If there is any danger to it, well, two can play that game, right? You expose me I expose you. The police can track your IP and the rest of it. Don’t insult me or play with me on-line. You have no idea where I live or even who I am. Those few girls who were threatened with exposure, maybe one or two killed themselves, well, how did that even happen, right? I have no fear of ruined reputation because there’s a million girls out there who look basically just like me. Do I live in Lithuania? No, but she might. And when I was still in high school it was like, ‘okay, make my day asshole’.” (20-year-old female university student).

            Not all research participants were as confident, nay, belligerent, as were some, but even the more cautious ones sneered at the nay-sayers:

            “You have to be smart with it. Of course you do. I would not tell a young girl to try this. I started when I was 15 and I learned quickly what not to do. Never invite anyone into a chat. Never focus on one consumer at the expense of others. Never say you’re single. Never offend anyone, like by saying anything about their own sexual prowess or ego. Obviously, never mention where you live or what school you go to or your real name, I mean, a ten-year-old knows that part of it, not that she should be doing what I do, but really. The biggest thing is that you’re being paid to be someone’s fantasy object, and as long as it stays at that level, there’s no risk. Like, none at all.” (19-year-old female sex worker).

            I asked producers what was going through their minds during actual performances, and correspondingly, as reported further on, I asked therapists what transpired mentally during their respective interactions with those who did, or had, performed:

            “When you’re live it’s all about the act. You’re getting pleasure and so are they. Nothing else should intrude upon this ‘duet’, if you will. It’s a total fantasy only in the sense that I would never be together for real with anyone who views me, and they know that. But they can dream, and when they do, I’m there for them, almost equally for real. The thing that pisses me off is now the 3D AI ‘girls’ are stealing my views and you know it’s not other teenagers making those. It’s some loser who has tech gear and skill and he’s making money from some of the same people I used to make money from. Pretty soon all the moralizers can just go home, with that going on. Who knows, maybe some of those religious fanatics are actually making the AI shit, trying to put real girls out of business!” (22-year-old college graduate).

            I had not thought of that possibility, as ludicrous as it may sound on the face of it. Whoever is generating artificial sex objects, however, is panning for the same guttural gold as are real persons; that much was clear. Another common response:

            “Okay, so it’s a business like any other. There’s you and there’s the competition. So, you innovate, just like any good entrepreneur. As far as the AI stuff goes, well, I have a video where I slash myself on the back of my arm and it bleeds a little, no biggie. And I say, ‘No fucking sex doll or AI mock can do that, boys.’ And some people are turned on by that, and word gets around, right? I got good responses from that one, a lot of views. People said they really appreciate me ‘being real’, and that I’m ‘not a coward’. And though I’m not quite real in one sense, I do have guts. It takes guts to make porn, which is something the haters like to forget. You try it.” (20-year-old female university student).

            The therapists and counselors involved in the study were not of one mind in their responses to being shown patterned interview out-takes with young adults. Many were shy of making any final judgment at all, which was consistent with their professional duty to act as resources rather than evaluators. The following was commonplace, whether I myself was feigning illness or no:

            “We should never moralize about sex. It doesn’t help at all. Especially for young women, I feel they are driven to place themselves at risk because they are looking for some reassurance. Not only that they are beautiful, because they know that it’s no great turn of trick to be beautiful at their age, but much more so, a kind of validation that they have some social worth, that they have a place in society more generally. What kind of place is, of course, another matter entirely.” (middle-aged female psychotherapist).

            A male professional counterpart added what turned out to be a similarly well-travelled road:

            “I’ve worked as a counselor for only two decades, so while I’m still young, I have increasing difficulty identifying with youth. You told me you had the same issue as a professor, when you were still teaching. It makes me raise my eyebrows, when a teenager tells me she’s making porn, but I don’t judge. That only makes what might be a bad situation worse. Instead, I ask such a person, ‘what’s in it for you?’. I get very similar responses as you have shown me from your study. The sum of such responses is, I dare say, quite convincing.” (middle-aged male psychologist).

            Professional psychologists and counselors varied only in their methods of guiding minors or others, and in turn, based this variance on whether or not the client in question actually wanted to get out of the business. No clinician or counselor with whom I spoke, either as a health research colleague or as a ‘patient’, said that they had ever recommended to a porn producer that they stop, let alone suggested that they were necessarily placing themselves at risk, contrasting mightily with the journalistic, political, and other grassroots voices regarding the topics at hand:

            “I don’t want to ever say to a young person involved in porn that, ‘there’s no risks’, but we have to look at the stats. We know that 95% of violence and abuse against minors happens in the home and from family members or friends thereof. 95% of the other 5% happens in the schools or in other like contexts, as in, where there may be coaches, music teachers, ballet instructors, and the like. We know this, and we have known this for some time. But it is only very recently that stories of such abuse are appearing, and some very high-profile ones, like the Olympic gymnasts and what-have-you. And yet parents blithely drop their kids off at ballet or whatever, and those same kids, when older, with their same trained and disciplined figures, may be making porn, because they know they have the right type of looks for it. And only then do parents hit the roof. So, there’s a problem with the whole discourse surrounding risk in our society, and I for one am glad you’re doing this expository study on one of the core arenas of these misconceptions.” (middle-aged female clinician).

            I have argued elsewhere that most organized activities for youth in our culture serve multiple, often conflicting purposes. Henry Giroux is the most sophisticated name in this part of critical discourse, but alas, I could not access him to comment upon this study. Yet psychologists themselves appeared aware that there was a studied hypocrisy afoot when it came to comparing activities such as sports and the arts with pornography. I then, in turn, threw that out in the direction of the pornographers themselves:

            “Hah! Well, that really makes me laugh. I was in ballet for years. That’s exactly how I got this body and the confidence to strut my butt, right? But dance is like all the rest of it for us girls. The adults bark at you, touch you when and where they should not under the guise of ‘positioning’, some parents even still spank their kids if they’re younger. The dance teachers don’t dare but they tell on you, right? I got it up until I was 12. Now I spank myself for money and I’m in complete control of it, which I never was as a little kid. So yup, hypocrisy? That’s basically any older adult’s middle name as far as I’m concerned.” (19-year-old female university student).

            There were many respondents who also did not see any serious difference between doing sports or dance and doing porn, given the apparel and physical feats required for many athletic and performing arts venues:

            “Yeah, well, the thing of it is, what I wear online and what I wore in dance or when I was in track at school; not much difference. And I’m still doing crazy things with my body either way, so no real difference there either. And the people who showed up to watch me play volleyball in high school weren’t all there to watch the game, if you know what I mean. Same with track, same with dance. The bottom line, excuse the expression, is that people want to look at young girls, the less clothes the better, and so we’ve got all kinds of ways people can do just that. And my parents never batted an eye at it. So, it’s all porn, at the end of the day. All of it.” (18-year-old female high school graduate).           

            When I asked how making porn itself, illicitly or no, compared with just viewing it, after explaining that I was consulting for the Senate committee, a number of responses shared the following themes:

            “The viewers are losers, at least in one sense. But I’ve read other studies of porn usage. On the one hand, you have the stereotype, the INCEL guy who could never get a date, or that’s how those people think of themselves, anyways. I’ve always found that there’s someone for everyone out there, sad but true. But on the other, you have some married guy with a professional job and an attractive wife but they now have kids and he’s not getting enough. Women too, of course. So that audience isn’t losers at all, and so I have to perform with both in mind. But as far as the difference between making and just viewing it, producers have the bods and the guts, the consumers are just anyone, and they might be cowards too but I don’t really judge or care about that.” (23-year-old female sex worker).

            The other category of respondent was the groomers, but since they were, by their own tacit admittance, criminals, and their process of recruiting for underage sex labor was shot through with both a cloying extortion and hortatory clichés that I felt even an eight-year-old would not fall for – though apparently, I remain naïve about such entrapment – I do not consider any of it worthy of reproduction here. Rather, I end the results section with a typical summary of responses of amateur and unaffiliated professional producers when asked to characterize the essence of the falderal surrounding their chosen workplace and their activities within:

            “It’s not for everyone. But what is? Don’t tell me I can’t do it because you don’t like it. Or you pretend you don’t. Too fucking bad. Look, I’m 18. Everywhere I go people stare at me. Do I get paid for any of that? Do I get a guy, young or old, come up to me and give me a hundred bucks and say ‘Sorry, miss, I was leering at you. I know this doesn’t make up for it, but take it anyways and just know I’d never do anything more than just look’. Never. Never in a million years would that ever happen on the street. But hey, I discovered a wondrous land where it does happen! And in that land, that’s all guys do, is ‘just look’. You hear people yelling ‘keep it real’. No, reality is what sucks. Virtual reality is a godsend. I’ll be making porn until no one is willing to pay me for it. And every critic can just suck on that. Full stop.” (18-year-old female high school graduate).

The Analytic Upshot:

            In every field study I have conducted as principal investigator, I have found that the commonplace sociological rubric regarding people defending their own values is true to life. The sentiments expressed by sex workers, of whatever age or style of performance, were no different. Even if their community is disparate, partly fictional, and connected only loosely, they still felt that they were a part of something greater than themselves as individuals. Many saw themselves as rebels with a noble cause, even as social critics. Policies which censored them or targeted them in other ways were disdained and mocked, the apparent hypocrisies of their political and parental vendors exposed. I was myself asked, on some occasions, about my role in such censorship, and I explained that, as an ethicist, I would like to see some formal accountability within the organizations benefitting from uploading their materials and profiting from them, as often as not without the original creator’s knowledge. But even this was a hedge, and I knew it. Better to state that the distinction between a shared everyday reality which is always public and must place the whole of itself before any specific part thereof, and the semi-private reality of the internet and like venues, needs to be preserved insofar as the former does not find itself too engrossed in the latter. For cultures too can become addicted.

            The most important points raised by respondents in an ethical sense were those directed against the idea that pornography was somehow qualitatively different from other activities youth partook in, and that the conception and definition of risk within its scope was severely overblown. For myself, and from an analytic standpoint alone, there may be a sense that if young people in any society become too taken with themselves in one relatively narrow way – the perfect physical and sexual specimen – then their once-respective identities would be as narrowed. As an ethicist, I think this is the greatest danger at the level of personhood. At the level of character, I feel that there is a danger of a craven cowardice in virtual sexuality, precisely due to there not being a real other whom one must confront, conflict with, reassure and rapproche, and most of all, try to love. Given that almost all respondents themselves appeared to understand these dual dangers when questioned about them, and put their lack of interest in these ethical themes down to simply not being part of the phase of life we generally refer to as youth – thereby implying that when they were more mature, such themes would then take on more weight in their lives – I could not in turn simply dismiss such a reply. We do not yet have the longitudinal data to document this implied transition either way.

            In light and in lieu of this present absence, I will end this summary with a final quotation to these regards:

            “No one does this forever. I’m certainly not planning on it. In ten years, I’ll be married and probably have at least one kid. I’ll look like everyone else you see; that is, not great! My husband will want to fuck me at his discretion, my kids will want me to feed them, drive them somewhere, help them with their homework, all that. Right now, ‘all that’ feels like a kind of death. So, what’s so wrong with living a little before you start to die?” (19-year-old female university student).

            G.V. Loewen is the author of over 60 books in ethics, education, aesthetics, health and social theory, as well as fiction. He was professor of the interdisciplinary human sciences for over two decades.

On Truth and Lie in a Virtual Sense

On Truth and Lie in a Virtual Sense (it’s not 1872 anymore)

            In what is arguably the most important short essay of the 19th century, the youthful Nietzsche belatedly answers the querulous query, ‘what is truth?’, made notorious, if still resolutely apt, by Pilate. For some millennia, it was recognized that though reality could possess lies – especially the social reality constructed solely by human beings – truth, by contrast, could not. But in ‘On Truth and Lie in an Extra-Moral Sense’ (1872), Nietzsche casts aside that distinction. Truth is simply its own form of lie, currency which has long lost its imprint of precise value and stands on the memory of its being metal alone. Truth is both metaphoric and metonymic, an exalted form of euphemism that covers over the reality of it itself having been constructed and imagined by that same human consciousness which, oddly, even perversely for Nietzsche, finds succor in the misplaced ‘will to truth’. This jarring statement finds its legendary lines in “…how shadowy and transient, how aimless and arbitrary the human intellect looks within nature. There were eternities during which it did not exist. And when it is all over with the human intellect, nothing will have happened. For this intellect has no additional mission which would lead it beyond human life.” We can call this ‘nihilism’ if we want; nevertheless, in the cosmic order of things seen from the vantage point of the Victorian period, it is more true than any human truth.

            Nietzsche, however, does not dwell overlong in the cosmic. His question is, ever after, not so much ‘what is truth?’, but rather, ‘what is human?’ If “…to be truthful is to employ the usual metaphors”, then in a moral sense, truthfulness means merely “…the duty to lie according to a fixed convention.” We do hear, from time to time, the phrase ‘conventional truths’, which are taken to point to a kind of statement existing exiguously between truisms and ‘trivial truths’, the former chestnuts of uncertain origin but precise provenience, and the latter simple statements of self-definition; certain only because they can only reference themselves. But Nietzsche tells us that all truths are such only by convention, thus erasing these other, perhaps cowardly, distinctions. The most famous passage of the paper occurs just above these reminders, and after reproducing it here, I want to provide some discursive context, both before and after, in order to aid understanding of just how it was possible that Nietzsche, at age 28 – the same age at which Hume wrote his magnum opus A Treatise of Human Nature – was able to come up with such a succinctly damning statement of one of humanity’s most cherished possessions. “What then is truth? A movable host of metaphors, metonymies, and anthropomorphisms: in short, a sum of human relations which have been poetically and rhetorically intensified, transferred, and embellished, and which, after long usage, seem to a people to be fixed, canonical, and binding. Truths are illusions which we have forgotten are illusions; they are metaphors that have become worn out and have been drained of sensuous force…”

            Two years earlier, Darwin’s The Descent of Man (1871) had appeared, making clear the evolutionary connection between the great apes and human beings, something which was only implied in his revolutionary 1859 work. We shared the primate branch with other creatures; apes and humans had a common ancestor. What was most seriously unsettling about Darwin’s work was not the recasting of the ‘great chain of being’ as such, but rather that humans were to be included in it, as another animal, simply a more evolved one. Nietzsche himself found this fact regrettable in the extreme, but also found within it the source of the death of godhead, something some commentators imagine him celebrating. The son of a Protestant minister, Nietzsche was, instead, moved to devote much of the rest of his working life to constructing both a new ethics, replacing the one sourced in the divine assignation of conscience within human consciousness, and a now ‘post-metaphysical’ cosmology centered not on the will to truth but on the will to power, ‘and nothing besides’.

            But in fact, the seeds for the exposition of the illusory qualities of human truths were sown well before Darwin’s somewhat indirect framework had taken hold over the philosophical imagination. ‘Perspectivism’, usually attributed to Nietzsche as well as fashionably misattributed to post-colonial discourses, actually first occurs with any force in Vico’s The New Science (1725), wherein he speaks of cultures and peoples having different truths, in each of which they wholeheartedly believe as if it were the sole human knowing of the things themselves. ‘New’, of course, refers to the human sciences, the Geisteswissenschaften, as a complement for, and contrast to, the sciences of nature. The German translator of J.S. Mill’s System of Logic (1843) came up with the term as well as its contrasting one, and the pair has given students thereof problems ever since. Naturwissenschaften is straightforward enough, but ‘Geistes’! These ‘sciences of the spirit’ were, in the main, unimpressive to Nietzsche, with the exception that they exposed the relativistic quality of truth on the ground. Anyone who has travelled outside of their own locale knows that the sole remaining truth about truth is that its status can adhere to anything we humans need it to.

            Closer to Nietzsche’s own time, aside from Mill’s important work – it should be noted that Mill was a vigorous supporter of the nascent feminist social science, and was personal friends with a number of its progenitrixes – Marx and Engels had penned The German Ideology – written in 1846, but not published until 1932 – in which the phrase ‘consciousness too is a social product’ presages Nietzsche’s argument in a much more concise manner. From Vico and Hume to Mill and Marx, the sense that truth was more than merely ‘elusive’ – a sensibility hailing from the natural sciences – had been germinating in serious discourse. The irony here is, perhaps, that the entire heart of Enlightenment discourse, officially dedicated to the truth of things bereft of moral overlays, ended up losing truth itself by jettisoning its moral sources and backdrop. And it was Nietzsche who first noticed this irony.

            His essay too went unpublished for some time, but eventually this twin acknowledgement – of evolution, on the natural science side, and of cultural perspectivism, on the social – gave way to an entire discursive framework within which truth found its place beside all other human faculties: institutions, subsistence practices, cosmologies, magic, kinship, the rites of passage and so on. By 1923, W. I. Thomas’ famous ‘principle’ could be uttered almost in passing: “If people believe something to be real, it is real in its consequences.” This is the working version of Nietzsche’s essay in a single sentence. By the mid-1930s Robert Merton could sum up the source of all inquiry into truths within the reality the Geisteswissenschaften studied in a single, precise question of his own: ‘Who benefits?’. In a word, a truth, of whatever form or function, existed due to someone or other gaining something from its remaining extant. Truths which do not function in this manner are soon overtaken by others, but the character of human truth is not altered by their simple replacement, any more than it is by their reproduction – the latter being where Nietzsche himself concentrated most of his analysis.

            Today, we face another challenge to the traditional model of what a truth is or can be. If we now understand truth to be extramoral, or ‘non-moral’, what then of truths which are wholly virtual? When I first placed a virtual reality helmet upon my surefire rational head, I was astonished not only at the simulacra available but, more so, by my ‘natural’ reactions thereto. I hesitated and even leapt back from a virtual ‘cliff’; I automatically bent forward to pet a virtual dog which, just to keep things ‘real’, had the ability to pick up a bone with its rather alien snout. I knew the experience was not real in the usual sense, and yet I still had the experience. Virtual reality is thus more like a vision, but one which can be shared through technology. The visionary now has an audience greater than himself, even if the content of the visions is just as hallucinatory as those of ages antique, filled with the equally aged who could at least be truthful to themselves. Virtual reality is the scion of the sciences of the ‘spirit’, and its panoramas, its melodramas, its illusions are exactly what would have animated Nietzsche’s own sensibility had he dreamed up the idea. By contrast, the sciences of nature too have their own child, ‘augmented reality’, which is a misnomer, because what it shows to our senses through a technological prosthetic are things which are actually already there in the world. There is no ‘virtuality’ about this augmentation; yet it is not reality per se that is being augmented, but rather our sensate faculties. We are enabled to see the guts of things, for instance, in a manner reminiscent of Husserl’s gradually building ‘glancing ray’ which, bereft of the hyletic sphere, gets at the essence of things. We can see around corners, inside compartments, splice wires and inspect semi-conductors, and this is how a precise and cool empiricism would likely interpret transcendental phenomenology’s ‘noesis’. It is a literalist litany of ‘to the things themselves’.

            And when we are dealing with mere things, truth and reality coincide most closely. Things alone, however, cannot hold our human interest. We know we are the far more curious phenomenon, and perhaps the greater proportion of that more fascinating character comes from our ability to find truth in the illusory, to make beliefs real through acting upon them, and yet also to be able to analyze and critique these attempts, seeing them as well for what they are. A consciousness that understands the very truth of truth is the result, and to my mind this is a laudable achievement. For Nietzsche, the tacit question which resonates from his seminal essay might run along the lines of ‘why then have truth at all?’. He answers it, in so many words: “So long as it is able to deceive without injuring, that master of deception, the intellect, is free; it is released from its former slavery and celebrates its Saturnalia. It is never more luxuriant, richer, prouder, more clever and more daring. […] The intellect has now thrown the token of bondage from itself.” If the cosmic truth of human existence is sobering – and perhaps a new reality of a constructed intelligence will, in fact, carry humanity’s intellect ‘beyond human life’ and thus into a more ‘truthful’ future – the worldly truths we humans have taken for a wider reality have done far more than act as agents of self-deception. Our ability to conceive of something we call the ‘truth’ is far more profound than even our corresponding ability to believe in it and thenceforth act upon it. We need the concept of truth in the same way that, as Nietzsche much later notes, we are ‘more in love with love itself’ than we are with the beloved. We love the truth, but truths are of passing adoration. Truth, then, might be one of those Durkheimian concepts which, akin to the sacred, are able to overleap discursive shifts in metaphysics and even societal shifts in modes of production. Nietzsche is correct about Truth and truths alike, and yet is it not more than true that, in spite of this redolent gem of self-understanding, what more fully animates the human endeavor – patient and cumulative experiment in its natural science aspect, impassioned and visionary dream on that of ‘spirit’ – is that reality, after all, has itself always been virtual?

            G.V. Loewen is the author of over 60 books. He was professor of the interdisciplinary human sciences for over two decades.

Malice and Co.

Malice and Co. (The Nobel and the Noble)

            When my wife and I were living back on the West Coast we knew a retired teacher who not only had the grace to read my first short fiction collection but also the generosity to extoll my ‘genius’ in an hours-long conversation afterward. During this too-pleasant evening he told us of an encounter with one of his youthful students. Then twelve, she had become attached to him in the classroom, and what do you know, the first day of summer had brought her newly minted teen self to his front door, unannounced but promptly revealing every intent to intimately engage with him. To his credit, he gently ran her off, never to return. But indeed, such a moment must force every man to ask of himself a challenging question: ‘what would I have done in his place?’. Writ small, this is the same question that history poses to each of us, man or woman or other, and its usual content runs, ‘would I have worked in a death camp, or been one of its victims, or, in turn, done nothing at all?’. As an ethicist, in fact I cannot say what I would have done. Like an ominous version of the contextual jest, one would have ‘had to have been there’ to really get it.

            I doubt very much that many of us could know, given the hypotheticals of alternate biographies and all that such might imply. Certainly, as a young professor, I had a conga line of young women at my door – brazenly so, since all of them were of legal age or older – and while I was still single, I acted upon many such calls. But twelve or thirteen seems a different matter. So, when it was revealed that Alice Munro’s daughter had been molested by Munro’s second husband at all of nine years old, with him claiming it was merely a scene out of Lolita after all, I cringed. No, the character in Nabokov was twelve, not nine, and there is a world of difference at that age. Lolita also had already been placed in a criminal circumstance by Humbert, and the reader is left both having to trust his account of things thenceforth and having to presume that the young woman was hoping to ease her predicament: ‘well, at least he won’t kill me if I have regular sex with him’. And while it is highly unlikely that any nine-year-old would be the initiator of such a circumstance, at twelve or thirteen, it might be a different story. As indeed it should be, barring intimacy. I say this because by adolescence a child needs to have that sense that she is becoming her own person. In many families with whom I have consulted, there was an ‘Electraic’ tension between mother and daughter, beginning around that age: ‘She mocks me, hates me even, is jealous of my looks and freedom and thinks dad admires me and not her. Maybe he does. She attempts to control me, and yet she still gets to sleep with him. I know how to fuck her over big time, just watch me’, and so on. Of course, the father is still culpable if he enables such desires, but the desires themselves are perfectly understandable and, as an assertion of nascent selfhood, even laudable.

            But not at nine. This fellow, who served no jail time, was clearly a villain, but such proved also to be the case for the Nobel novelist. It is this latter fact which is causing conniptions in so-called cultural circles, but once again, there is much evidence to vouch for the authenticity of Munro’s feelings. Upon divorce, the child who remains from the now moribund union is often subjected to resentment, even hatred. She is a reminder of a bond now sundered, the one-time gift of love become the spawn of bitterness. Munro’s daughter was abused twice over, first by her step-father and then by her mother, who wholly bought into the Lolita idea. This kind of thing is not the odd slap in the face – itself not to be countenanced, of course – but rather constitutes an outright betrayal. But does any of this impinge upon Munro’s creative works, and if so, how so?

            Somewhat akin to the proverbial death camp question, such a relationship ambiguates established legacies. One thing I do know is that it’s not a problem for me. I always disdained Munro’s work: nostalgic navel-gazing from gloom and doom baby-boom. But intriguingly, and perhaps ironically, the discovery that the author herself was a villain, with real feelings and real conflict in her existence which it appears she tried to suppress for decades, might well make her work the more interesting. It would have to be something big to do so, at least for myself as a fiction writer and a scholar in aesthetics. Yet cultural history is replete with villains, many of such standing as to make Munro, Woody Allen and like company look themselves like nine-year-olds. The most important case must be that of Richard Wagner, whose towering genius is often seen as tainted by his vehement political anti-Semitism. It could be argued that Wagner himself had a role, however cameo, in the murder of twelve million in the camps and sixty million elsewhere around the globe. ‘Go big’ must have been his mantra, given the Ring cycle and many other grand artistic works. But even here, with his personal sensibilities presumably reflected or at least refracted in his creations, we are left with ambiguity. His call for his Jewish musicians to ‘lose their Jewishness’, since otherwise they were ‘the perfect human beings’, might be interpreted as simply a reminder that ethnicity of any sort is both window-dressing and crutch, and decoys the noble soul away from his authenticity as a superior human being. If that was the case, I would wholly agree.

            Other famous cases of this handwringing at history remain at our newly gnarled fingertips. Heidegger, also no fan of ‘The Jews’, nevertheless saved both his mentor and his lover, both Jewish, from the Nazi onslaught, suggesting that it was not ethnicity itself that he disdained but rather simple inferiority. Husserl, one of the great modern philosophers and the founder of phenomenology as a serious discourse, and Hannah Arendt, who went on to become arguably the most important female thinker of the twentieth century, were certainly neither of them inferior in any way. Richard Strauss was pushed out of his post as president of the Reich Music Chamber because he defended working with Jewish writers and musicians. Uh, yeah: Wagner, Heidegger, Strauss. Who is Alice Munro again?

            But aside from the wider historical context and career of what has to be by now a cliché – ‘I found out my hero was a villain, woe is me!’ – we must, as with the problem of history in general, turn the critical lens upon ourselves. That there exist people who might well wish me dead simply tells me I have lived my own life, and without reserve. One owns one’s own iniquities, and I am fortunate, equally simply, that my list contains nothing overly villainous, such as molesting children or, for that matter, running a death camp. But facts and fancies are ill-matched, and just as Nietzsche slyly reminded us that pride ultimately triumphs over memory, the critic’s own desires might well be able to vanquish history itself. For instance, I have been referred to as a child pornographer, and by someone I grew up with, no less. Given the commonplace and wholly fictional idea that an author must always be culling from his own personal experience, I had to blink at the implications of such an outrageous charge. Disgusted by Lolita and Romeo and Juliet alike, for my first published fictional work I wrote something more inspiring and, in fact, more real to life, if not actually my life. To my mind, this is what a good fiction author does. They don’t just look, as one of Munro’s peers has done, at Heinlein’s If This Goes On…, or yet The Odyssey, and say, ‘well, how about telling the same story but from a female perspective?’. Uh, how about it? No, rather they take up a famous trope and completely redo it, from the inside out, making it once again our own, instead of the piece of comforting nostalgia it has over the centuries become. This, by the way, was the entire intent of Queen of Hearts. Both Camelot and Calvary are now once again authentically our own stories, and not those of our distant, and dreary, ancestors.

            For distant and dreary are, at last, perhaps the two things that link Munro’s personal villainy and her cultural works. In both sets of narratives there is much suppression, much decoy behavior. That she knew these very human errors personally, and not simply by way of a creative imagination, both makes her writings more real and at once less artistic. Since never the twain completely meet, each of us must then decide for ourselves whether we prefer art or, rather, life.

            G.V. Loewen is the author of sixty books in ethics, education, aesthetics, health, social theory and other areas, as well as fiction. He was professor of the interdisciplinary human sciences for over two decades.