Identity Fetishism

Identity Fetishism (the objectification of echoism)

            In over-identifying with the generalized other, the echoist sacrifices at first self-interest, as does the traditional altruist, but also thence the very self, if the pursuit of the other’s needs and desires overcomes the one who so pursues. The echoist is commonly seen as the figure who expresses personality traits opposite to those of the narcissist, though it is not correct to assume that this latter always acts in his own best interest. The narcissist is, after all, blinded by his own loyalty not to the self as he actually is, but to an idealized selfhood into which he has placed a fallible reality. The echoist, on her part, either denies that the self has needs at all or, more usually, places these needs below those claimed by the other, giving them a lesser value. The danger for psychology as a discourse when using these Greek ethical constructions is that, aside from their original caveat against extremities – the golden mean or moderation in all things was a Greek mantra – the best kind of personality is held to contain within it a balance of self and other, which is a mere technical manner of stating that a neo-Christian selfhood is the newer ideal. Not the Christian in the original, and more radical, sense, for the neighbor figure is after all the ultimate altruist. He is, as I have stated elsewhere, simply the libertine of compassion.

            Not so the ‘balanced’ self, which identifies with the generalized other as if this abstract and very much collective presence is expressed in now this individual before me, now that. Such a middle road, the fairway between the Scylla of the echoist and the Charybdis of the narcissist, and thus the fair way to adjudicate between one’s own needs and those of the other or others, always as well denies the selfhood as it is. It is a personalized way of doing the same to the world as it is. For in gifting oneself to the other, we do generally gain our own desires, be they having to do with public acclaim, a sense of personal vindication, a veneer of the virtuous, or being a model citizen, in no order and perhaps also in toto. Augustine is himself cautious about self-sacrifice, not due to Jesus becoming the Christ through so enacting it, but rather because for lesser beings, it would be a challenge to sort out one’s intents. Are we truly selfless in our actions? Did we actually put the other before ourselves? Do we rather seek to become an echo of the savior, to ‘borrow status’, to use a sociological turn of phrase?

            The narcissist seeks all such things, and in spades. In this, he is by far the easier to identify, and perhaps somewhat perversely, to identify with as well. He is unsure of his own person and thus desires to build around it a persona, the bastion against self-doubt constructed of that same anxious architecture. A persona is, however, still a more authentic expression of the lack of selfhood than is the fullest leap into the generalized other. A persona, though a mask, yet must be carried by its wearer. Not so otherness, of course, for it is irruptive, if rare, and especially in modernity. Not so the Other, capital ‘O’, which is alien and, we would suggest, generally incomprehensible even if fully present to our senses bemused. But the case is different when it comes to echoism. This otherness, generalized in G.H. Mead’s sense that one has by a certain age internalized social norms and is able to exemplify them in one’s day-to-day or quotidian conduct – something which the over-identification with specific guises of the generalized other ironically allows one as a person, and even as a citizen, to forego – is not taken on as one does a costume of oneself, as in narcissism, but is rather slipped into bodily, as if one were able to simply up and transfer one’s being into a ready-made vessel. Anyone who has adopted for themselves a form of identity politics has indulged in this fantasy.

            This is why one might suggest that there has been an objectification of echoism. The classic echoist, whom one might recognize casually as a ‘doormat’ or even a masochist, gears herself into the needs of singular others, usually serially and repetitively. It is these persons who are most at risk of domestic abuse, for example. The echoist internalizes the sense that she is of little value, or that her only value is in being a servant of another, aggrandizing his needs if he has no merit, or, if authentic value is present, then aiding his genuine quest. Either way, the echoist denies the self. It is a pressing weakness of the genius that he demands an echo; first from a person, then a community, and thence from the world itself. When Mahler consulted Freud in the Netherlands in 1910, the second edition of The Interpretation of Dreams had recently appeared and its author was by then as world-famous as was the celebrity composer and conductor. Aside from uttering the expected ‘what a meeting of giants, wish I could have been there’, we can more seriously remind ourselves that no archaeologies of selfhood, no high-flying hermeneutics, no ambitious analyses were involved. No, Freud simply told Mahler that he was being a prick. For Mahler’s marriage had been shipwrecked by his demand that Alma, once the hottest young woman in Vienna and an aspiring composer in her own right, should utterly sacrifice her own needs and desires to his superior gifts. And as challenging as it would be to compete with a Mahler, this was manifestly not what Alma was in any case trying to do. Freud told Mahler to instead aid his estranged wife’s quest, and ‘who better to do it’, for Pete’s sake. More deeply and consistently, to take his mate’s needs as seriously as he took his own. Note to self, and dear reader.

            Alma was neither echoist nor narcissist. But then again, neither was her husband. So, what comes out of this historical vignette is both an illustration of the problem of identifying just exactly where our selfhood lies, especially in relation to others, and also, by extension, of where we might find the place or the space wherein our best self resides. For many today, these questions are too challenging to confront in any authentic manner. Hence the mass objectification of echoism as a parallax to the much more individuated construction of a persona. Statements such as ‘I am a person of color, a trans-person, a Proud Boy, a Christian “first”, a liberal, a conservative, a survivor of the residential schools, a Holocaust survivor, an abuse victim, a revolutionary, a woman, a man’ and a myriad of others, if held to be front and center in even casual conversation and in one’s political opinions, if taken to be the defining characteristic of one’s selfhood, are all decoys, meant to help one avoid the anguish of being a self and to short-circuit the essential relation between anxiety and personhood. With all the patent irony of modernism, it is psychotherapy itself which plays upon these projections. And even if we place our faith in the analytic process – which involves a gradual unmasking of persona in order to confront the authentic self in all of its patently fragile mortality – we must, in the end, also abandon the wider conception of faith as well.

            But what of the second term in our title? Speaking of faith, the fetish item, ethnographically, contained the Mana of some otherwise amorphous and animistic force. It might be the famed Churinga stones of the Australians, it might be the disembodied artifacts pinned into the shaman’s mesa in Mexico, or yet the ‘figurines of the Virgin Mary’, to borrow from King Crimson. Marx lights upon this conception and realizes that in capital, it is the commodity which now is seen as ‘Mannic’, excuse the obvious pun. Part of the object’s ‘surplus’, indirectly linked to the broader economic conception of surplus value, lies in its ability to transfer the consumer’s desire by objectifying it. The ‘finest’ marques, such as Ferrari, have mastered not the marketing of self-indulgence, but rather the ability to place the person in intimate association with the thing, as if the driver of a legendary auto is direct kindred with the shaman and their traditional fetish. Certainly, when I drove an expensive Jaguar just for fun, I felt a kind of augmented power, as if the prosthetic was mimicking an extramundane quality, something that the shaman’s tried and true trickery also mimicked. I also felt that the big cat was a mere extension of myself, and not just of my body, but rather of my very being.

            And this is what the idolaters of identity also seek. In their absence of selfhood, they desire to deny their very existence as human beings first, as historical beings, as beings endowed, by evolution or otherwise, with both reason and imagination, and as cleaving to a very much mutable ‘human nature’ which is not, and has never been, one thing, let alone the one thing they have, like a long line of crucified simulacra, hung themselves upon.

            G.V. Loewen is the author of 59 books in ethics, education, religion, social theory, aesthetics and health, as well as fiction. He was professor of the interdisciplinary human sciences for over two decades.

The Technical and the Ethical

The Technical and the Ethical (what they share, and what they don’t)

            It is commonplace to hear that our morality has not ‘kept up’ with our technology, or that the latter proceeds at a more ‘rapid’ pace than the former. But seen from the perspective of action in the world, morality and technology occupy the same place relative to what is technical, or involves technique, and what is ethical; morality in action or conscience enacted. Both morality and technology proceed from our Promethean humanity. We require external prosthetics, from the simplest of wooden and stone tools of our distant ancestors to the quantum accelerators of our own day, in order to make a culture at all or indeed to survive the night. The domestication of fire was key in this regard. Morality is an idealized prosthetic, an extension of our mental life into the world, just as technology extends the capacities of our bodily form. It is plausible that our very idea of ‘embodiment’, a theological term but also one phenomenological, originates in this early distinction between our physical and mental capabilities and endurances. For to feel ensconced in a material vehicle as something other than is the world might well play on an ancestral sensibility brought about by that very duet of prosthetic extensions; I am more than I seem to be.

            Is this ‘more’ defined only by our evolving extensions, or might it also be the case that embodiment is of the essence of things in this matter; that I have not only made myself into a ‘more’ in the life I know and share with other human beings, but that I am also to be more, perhaps in a further life, or, to extend the logic of extension itself, a further part of this life? Conceptions of the afterlife, in their earliest form, saw it as a mere transition between earthly lives, a kind of eternal recurrence, not necessarily of the exact same thing, as in Nietzsche’s radically life-affirming formulation, but simply as another round of something similar, especially in social contract cultures wherein this earliest idea of existential extension arises. Even with rudimentary social stratification, political power, and the presence of consistent material surplus, this first afterlife is not altered. What we observe is a growing difference in burial rituals rather than the abandonment of the rites of passage in general. But with the advent of sedentary mass cultures we do see the idea that some have different destinies in the afterlife than others. Yet even here, the essence of the purpose of the afterlife, though altered from its primordial recurrence theme, remains consistent for all who thus enter it; some kind of evaluation is at stake, and that of one’s conscience and not one’s essence.

            By this point, an idea that must have been percolating within our ancestral breast for eons has appeared bodily in the world: the sense that embodiment as a locus for technological and moral extensions has a purpose beyond itself. Not only is life to be extended, but so also is its meaning. In this, we humans are gifted with the fuller sense of the Promethean ethic. Indeed, it was not so much the ability to live in ignorance of our own deaths, as did the Gods themselves, though for us only for a limited period, that riled the Greek pantheon, but rather that mortal life should have a meaning beyond itself. And this idea, implicated in the gift of the demigod, essentially annulled the difference between Gods and humans; both had now indefinite existences from which purpose and meaningfulness might be gleaned. Worse still, from the divine perspective at least, was that meaning itself for an omniscient and omnipotent being was not truly relevant, or at the very least, occurred in whole cloth as with everything else such a being would bring into being with its own presence. For us, rather, making meaning as we go along, though long the order of the quotidian day, abruptly upshifted itself into the essence of life, the very reason that we lived at all.

            What fills the conceptual as well as experiential gaps between technology and morality and actual human life in the world is, respectively, technique and ethics, the technical and the ethical. What they share is their fundamentally ad hoc basis: both technique and ethics respond to a specific circumstance, as often as not unpredicted or at least, unexpected. And just as there is not a one-to-one correspondence between morality and ethics – the idea that the former contains timeless principles as ideals and the latter is the space of real-time action wherein what is good in one case might not be in the next, and so on – so there is no levels-identity relation between technology, an umbrella term for anything prosthetic in culture, and technique, the actual use of tools and as well the skills involved in the construction thereof. Ethics is not morality simply brought down to earth, but rather moral ideas enacted and thus modified in the world on an ongoing basis. If such an image rapidly becomes blurred, we understand that ‘life is vague’, as Gadamer sagely notes, though it sounds like a mere truism and might even contain a nascent sense that ‘this life’ is vague when compared to some other life. Just so, the very ad hoc character of human life – we are constantly faced with differing circumstance and indeed, the very sense that life is mostly circumstantial both in action and in essence; we often meet our mates by chance, we do not ask to be born, etc. – forces upon us a reckoning: if our extensory apparatus seeks to ameliorate the chance quality of existence, if our extended sensibilities based upon prior experience purpose themselves as the assuagement of our limited conscience – conscience can no more predict the future with due certainty than can technique itself – then might it be the case that, because we are aware of ideals in the first place, there is another kind or form of life within which such ideals actually exist?

            This speculation should be familiar from Peter Berger’s 1967 argument concerning the possible reality of the afterlife, whatever its culturally defined character. It has its germ in James’ legendary Gifford lectures of 1901, wherein ‘the reality of the unseen’ is of great moment in the career of belief. Not just this, but as well, and rather more darkly, ‘the sacrifice of the intellect’ must also be had if one is to authentically adopt a religious suasion. At some point, reflection must cease, reason give itself away, in order for faith alone to carry the day. The idea is that our ability to at least imagine an ideal way of life, one which at once does away with the need for both the technical and the ethical, is suggestive of ‘another’ world or yet an otherworld where the very idea of the ideal is also moribund. We do not of course reiterate any of this argument here. For us, ideals arise through the human ability to make history; though ad hoc in its action, life remains to be lived by a being who is possessed of both memory and anticipation; two sibling, if contrasting, phenomenological dispositions of Dasein. Because of this, we are able to say to ourselves, ‘well, that was different, but it reminds me of the time when I…’ and such-like. Or, by way of comparison, ‘I have never experienced anything the like…’. All culture, all history, is only possible by way of the constant remarking upon the difference between what is similar and what is contrasting, and indeed how and by how much such experiences do in fact contrast. Technique is not only technology in use but also the reflexive process through which the former is modified by having become part of a human life; for now, technology has no life of its own. Ethics, rather, is not only morality lensed by action in the world but is as well its own domain, and this is the point at which the technical and the ethical part ways.

            For what they do not share with one another is autonomous form and thence formulation. The technical, the realm of technique alone, is always enthralled to the task at hand. It cannot compare itself to its ideal. One does hear, ‘well, if all other things had been equal, this would have worked’. Applied scientists, who should be wiser than all of this, are often the source of this plaint. But one also hears another refrain, one that only partially quotes the original source, and that is ‘for every action there is an equal and opposite reaction’. Yes, ‘in a closed system’, as the actual text concludes. Human life, history, culture, and the world at large are manifestly not such closed systems as Newton ideally described for his local physics. Those who misuse this famous epigram, speaking in part of his third law of motion, do so to render human relations, especially those political, simplistic, in order to manipulate others. In so doing, however, they have, perhaps inadvertently but nonetheless bodily, moved from the technical to the ethical. They have expressed what is of the utmost for our shared humanity; not at all our ability to extend our physical capabilities through equally material prosthetics but rather our inability to know our ends, both singular and indeed collective. It is only ethics which speaks, Janus-like but without duplicity, to this human dilemma, and not and never technique. In doing so, we are brought face to face with the existential import of being able to at once have an awareness of how we would ideally act ‘if all else was equal’ and the ‘system was closed’, and thus the equal understanding that we must in any case take action without direct recourse to either ideals or to an ideal world.

            G.V. Loewen is the author of 59 books in ethics, education, religion, aesthetics, social theory and health, as well as fiction. He was professor of the interdisciplinary human sciences for over two decades.

Can Communism Contribute to Culture?

Can Communism Contribute to Culture? (after giving birth to it)

            The question of culture within a communist mode of production is a highly speculative one. Not least due to the historical facts; there has never been an authentically communist society. Engels sought to close the circle on history itself, by reprising ‘primitive communism’ writ large and sourced in the largesse of rationalized industrial production. Social contract societies are the original human cultures, so in one sense, culture is itself a child of communism, or perhaps less ideologically, communalism. These types of social organization, referred to as having ‘mechanical solidarity’ by Durkheim and being pre-political in Pierre Clastres’ sensibility – here, only the presence of surplus generates social hierarchy and all that this radically novel form of social relations entails and has, in the interim, entailed – seem taken quite unawares by Engels’ appropriation of them as a test model for the future of humanity. This moment represents best the 18th century Rousseauist sense that Marx and Engels brought to 19th century social thought. And there is a dose of early romanticism, healthy or no, in all such utopian imaginings, from Plato’s ideal state to the relatively stateless vision of Ayn Rand. Such a moment is reiterative when examined through the lens of the arts, the chief contributor to culture in its narrower sense.

            For fraudulent communism, the cases are mostly negative. From Shostakovich’s serial denunciations, to Brecht’s remorseful disillusionment, to the official non-personhood of Nicolae Bretan and many others, the arts tend to suffer, often ignominiously, under pseudo-communist regimes of all stripes and hues. Just as fraudulent religion contributes nothing to the value and history of belief, so a fraudulent politics can offer nothing to the culture and dynamic of ‘political man’. But Marx singles out the artist, among all other possible social roles, in his early examination of the merits of industrial or technocratic communism. One of the arguments he makes is both rational and ethical; give everyone the opportunity to evidence whether or not they have artistic genius. In China, there is a piano school wherein thousands of pre-selected students study. In the closing scenes of the wincingly intimate documentary of Yo-Yo Ma, he is shown speechless and with eyes glinting, standing in a studio listening to a ten-year-old Chinese girl play Chopin. The legendary cellist, one of the greatest artists of our time, is in awe. For the young lady is not merely reproducing Chopin with utter perfection, and doing so sporting an oversize pink plastic watch on her wrist to boot, she is Chopin. Aside from such extramundane factors as speculative reincarnation, her very being speaks volumes regarding Marx’s suggestion. For him, it was simply a question of available numbers. Only by extending the opportunity stream and structure of universal education can we identify such talents.

            China too is hardly communist in Marx and Engels’ sense, but unlike other social experiments of similar type, it has realized that its apical intellectual ancestors – both very much Western of course, in direct contradiction to all the nonsense emanating from Beijing about China being non-Western or even anti-Western in some whole-souled fashion – were correct; one had to have consistent and highly rationalized industrial means of production before any communist relations of production could take hold. And the only manner of reaching the former status is through capitalism, not communism, as Marx himself clearly stated. China backed into Engels’ historical curve, as it were, with the seeming inevitability that a controlled economy is either a dead-end regarding the dialectical fulfillment of history through the demise of class conflict – and ultimately the ‘withering away’ of the state itself – or that what we are witnessing, with dubious privilege, is just another transition point along the way to authentic communist relations. This latter claim seems to me to be fraught with potential rationalization, even abuse. For primitive communism, the first society, was also the most radically democratic, and this without surplus of any kind, which is probably the more germane aspect of any of this. The hypothetical communism of Marx and Engels presumes upon variables that on the ground feel almost as extramundane as does reincarnation: one, that an entire large-scale populace would have an equal and representative say in the doings of a skeleton government; two, that such leaders as they may be would themselves be Platonic ideals, ‘philosopher-kings’ politburo style; and three, that politics would continue to be of interest at all, in a society that on the one hand cannot imagine even the question of God, as Marx once again states, and on the other, accepts and endorses the sensibility that politics should wholly replace religion with regard to human passion and interest, as well as ‘belief’.

            But there is no need to believe in something which is factual, in the world as it is, and without the credulous. We may not know all there is to potentially know about our own political doings, but there is never a true mystery in the sense that some part of politics has itself departed from the quotidian in some irruptive manner. Even hypothetical communism appears otherworldly given that its goal is to eliminate itself, end history, vanquish ideology, transform individual will into that collective, and install a world ‘government’ that governs without itself being a state! All of this together does indeed require a leap of faith, enormous and enchanted at once. But the question of political alternatives, no matter how stylized and romantic, is yet quite salient to our time, when democracies, partial as they may be, seem disenchanted with themselves, and many appear to long for authoritarian practices in power as well as in personal relations. The tired adage ‘be careful what you wish for’ seems to make no impression on such persons. Far from the mostly long mute ideologues of post-war versions of Neo-Marxism, it is rather the unstudied and uncultured franchises who desire to be dominated and told what to do – in spite of their rhetoric of freedom and individual responsibility; the only consistency here is the truer call to ‘let me be responsible for dominating and dictating to my own children et al’ – that present to contemporary historical relations their gravest threat.

            For history too does end within any authoritarian circle. The opposite of that sidereal, this enclosure pens its own history, ‘rewrites’ itself, as we saw in the Reich then and in Florida now, and thus pens itself inside it. That said, reactionary pseudo-history is likely no less a fraud than are the ‘politically correct’ rewrites that equally scan the career of human endeavor for examples and exemplars favorable only to their narrowed and ideologically inclined druthers. PragerU has its corresponding entity in the DEI sensitivity; one might well say that they deserve one another, just as did, at least at the level of statehood, the Soviet Union deserve the Third Reich and vice-versa, however awful this may be to contemplate. Do then the actual Taliban deserve the self-proclaimed ‘American Taliban’? Does the Third-Wave ‘Feminist’ deserve the neo-liberal economist? One could go on of course, but the point here is that it is commonplace for the political pendulum, to borrow another cliché, once pulled back in one direction, to entail an equal and opposite swing. The oscillation thence initiated cannot be halted in any rapid manner, and we find ourselves swinging to and fro along with everything else. The pendulum is its own metronome, setting the pace of public discourse and the level of political interest. Dialogue is absent, as is historical consciousness. One does not understand history, or the history of thought, on purpose. In this, we also may say that we deserve our own shared ignorance.

            For Marx, the question of culture was, as ever, a dialectical one. It is just that, as perceptive as he was of the reality of the social conditions in which he found himself alive, he yet seemed unable to extend this same profundity within his own analytic. If he had, he would have noted its inconsistencies, which in turn have allowed, and perhaps even invited, the light readings both Lenin and Mao brought to their early studies, not to mention their personal vendettas projected onto a mostly unknowing social world. It is always possible, of course, that both Marx and Engels knew full well of the challenges to their own logic inherent in their claims, and simply ignored them in order to further revolutionary ambitions. I would like to doubt this was truly the case, as in any major thinker, there can be found lapses of both reason and imagination alike. That it would take such a lapse, perhaps calculated and controlled, in order for communism to recreate culture anew, as in the Chopin example – and is this an authentic contribution to culture? – and especially so, to actually give birth to a new culture entirely, suggests that any future attempt approaching the vision of Marx and Engels should hope that it never achieves its political goals.

            G.V. Loewen is the author of 59 books in ethics, education, religion, aesthetics, health and social theory, as well as fiction. He was professor of the interdisciplinary human sciences for over two decades.

The Anomie of the People

The Anomie of the People (subjective alienation today)

            In the Economic and Philosophical Manuscripts of 1844, Marx outlines the four forms of alienated consciousness. In a sense, this quartet of disharmony in turn forms the Gestalt of Proletarian unthought, just as it provides for the Bourgeois outlook an odd, even perverse, set of rationalizations for their own continuing alienation. Capital is more complex now than it was in the mid-19th century, and the failure of the middle classes of our own time has highlighted not so much Marx’s ideas as those of his successor in the human sciences, Emile Durkheim. For a moment, we must admit that the former might well have seen in the latter a yet further decoy, but perhaps not. Subjective alienation, or anomie, is just as real as are the others, objective and structural as they may be. What Durkheim was confronting as a discursive manifest was the same thing that an individual person confronted as a producer and consumer as well as a human being: is it only the case that objectively alienated labor by necessity contributes directly to anomic existence, or is it more interesting than this?

            Let us first review Marx’s conceptions, keeping in mind that in the interim many mitigating factors have been created, for better or worse, to mute at least the effects of the problem at hand. The four forms of alienated consciousness are as follows: 1. Alienation from the product of work: workers produce objects that for the most part they cannot themselves afford, or that are not even ‘meant’ for them. 2. Alienation from other workers: workers are placed in a do-or-die competition with one another, thereby sabotaging any sense of a wider solidarity. 3. Alienation from work itself: most work is unfulfilling in any deeper sense, ‘it’s just a job’, and 4. Alienation from human potential: this is by far the most profound of the forms and speaks to our species-being being distanced from its own broader abilities. In this, capital inherits the worst of the religious pre-modern worldview, but without any of the entailing grace or salvation about it; one is born, one works, one dies.

            Each of the forms has undergone extensive mutation, some more, some less. 1. For the most part, workers can in fact afford the objects they help produce, and for some, such as contractors and skilled labor, the potential exists for them to construct such objects, such as executive homes, for themselves and more or less by themselves, over time. 2. Unions, which Marx and Engels disdained, have eased the sense that workers are each other’s enemies and only that, though the globalization of labor has heightened the anxiety around finding and keeping a job at a living wage. At the same time, the more skills one has, the less likely it is that an employer can afford to lose, not you yourself, but the class of worker in which you have placed yourself. 3. Much work has been augmented to become more existentially fulfilling, though it remains a servitude in the service sector; Durkheim himself made this first point not long after Marx’s death, and suggested that wages earned could ‘borrow’ from the prestige of wages spent, however frivolously. The newspaper The Hindu noted some twenty years ago or so that Europeans spent on average about one billion dollars on ice cream products per annum, for instance. 4. We are yet quite unsure of the scope of human potential, and presumably we are far from reaching its zenith. Marx himself stated that capital was the most liberating form of economic organization to date since it did free up some few people to reach their individual potentials and thus display something of the role-model to others. It is an open question whether or not an authentic communism would do as well. Even so, this final and most damning form of objective alienation remains a plague on our species-being, though one could certainly argue that wage labor is hardly the only factor in its ongoing presence.

            Durkheim was dissatisfied with the structural explanation of alienated consciousness in the main due to its utter ignoring of the chief locus of perception in Bourgeois relations, that of the individual. In this sense, Marx’s analysis presented itself as a contradiction in terms, and it was not the only one extant in the 1844 manuscripts. One can only be reminded at this juncture that Marx and Engels also ignored the fact that communism, as a still hypothetical mode of production, entailed no alteration in the means of production, unlike every other sea-change of this sort in history. In Marx, communism was simply capitalism bereft of pre-modern sentiments; the symbolic forms of the theistic period would somehow drop off, altering the relations of production but not the technical and industrial means. Communism thus is presented as an exception to the ruling dynamic of history – class conflict – and the only way one can rationalize this odd conclusion to Engels’ historical model is that within communism class conflict does itself end. But this is putting the cart before the horse in logical terms. Beyond this, though often seen as a mere aside, Marx’s analysis of the role of the artist ‘under’ communism also ignores the most profound aspect of what the artist does in society; she works against the grain, most simply, opening up human consciousness by transgressing norms and thereby transcending alienation as well. It is unclear how, in the communist mode of production, the artist would have anything to do at all, or if she did, would be able, or allowed, to do it.

            All this aside, Durkheim’s main interest was complementing the structural model at the personal level. All very well to bandy about large-scale factors; at the end of the day, real people bore the results of their world-historical confluence. If revolution was consciousness in the making, how then could it occur at all without individuals processing their perceptions of their own alienation? Indeed, they do so, and the means by which they do so Durkheim called the anomic relations of production. Anomie is subjective alienation; its symptoms are anxiousness, angst, embitteredness, resentment, and even neurosis and ressentiment. In a word, anomie is a most serious affair, and even if it be seen as a mere symptom of objectively defined alienated consciousness within Bourgeois relations, what it presents to us is a full-blooded symptomology of the entire mode of production. Durkheim’s genius lay in his ability to take the most minute moment and see in it the whole of the relevant Zeitgeist. Witness his analysis of deviance in his 1893 The Division of Labor in Society, perhaps still the most famous example of inductive thinking in the human sciences. But anomie and its further effects – as in, suicide – appear as a working conception four years later. Part of another four-term model, the anomic person is alienated from his own selfhood. To him, this is a more present form of inexistence than any structural item could be. A job is a job; it is not a life. To be fair, we speak from our own time, and Durkheim, whether or not he was a critic of the fact that capital had augmented in significant ways its panoply of distractions by the fin de siècle period, had the vision to understand that this relatively free mode of production could not survive its socialist detractors for any length of time if it had not become more appealing to the worker himself. Nonetheless, in doing so, the symbolic life of the pre-modern period abruptly slipped away, leading to disenchantment, something that Durkheim’s major sibling thinker, Max Weber, became famous for analyzing. But for the former, Entzauberung, the loss of the ‘magical’ quality of and in the world, was not an end in itself, but rather something which had been transposed, with a variety of plausible substitutions taking the place of the once religious-inspired worldview aspects. Instead of a local sect, a local sports team, instead of a pilgrimage site, a sports stadium. Instead of a saint himself, a Taylor Swift herself, and so on. For Durkheim, all of these transpositions involved the perennial career of the concept of the sacred, something that Marx and Engels ignored, and something that Weber stated, rather perfunctorily, could not truly exist in modernity, just as he so claimed for authentic charisma. But we can compare Joan of Arc to Tiger Woods along such lines, Durkheim might have said. The sacred was for Durkheim a kind of meta-conception, something that survived even shifts in the mode of production, from subsistence to agrarianism to industry and perhaps yet to intelligent technologies. For Engels, such shifts were all-inclusive, so concepts such as the sacred, or ideas such as archetypes, for that matter, were inadmissible to his modeling. This is clearly an oversight at best, especially in light of what we have already mentioned regarding his apparently incomplete premises for the ‘final’ shift from capital to communism.
The only way to make one kind of sense of such a model, aside from the usual inability to predict the future, of which all human analytics fall short, is that communism ends symbolic forms, and in their entirety. As Marx put it, distinguishing his much more radical ‘atheism’ from that of Feuerbach, ‘For the communist man, the question of God cannot arise.’

            Needless to say, Durkheim’s vision of the sacred was much broader and deeper than any of this. He was aware, as was Engels, of cosmologies which had no Gods at all, but unlike his German compatriot, he used this knowledge in his own analyses. By 1912, with the publication of the legendary The Elementary Forms of the Religious Life, appearing in the same year as the first essays of both Scheler’s Ressentiment and Freud’s Totem and Taboo, Durkheim had formalized the dialectic between trans-historical concepts such as the sacred, ritual, or the archetypes and their contrasting historical forms, such as specific pantheons or godheads, rituals in their ethnographic detail, and beliefs. Once again, as a clearly sibling analytic to Weber’s distinction between historical and ideal types, the sense that any specific mode of production would be immune to alienation in general, and anomie in particular, might be called into question. Durkheim had, somewhat ironically, painted himself into an analytic corner. At the same time, his understanding of that which can transcend historical alterations of world-orders and even worldviews was, akin to art itself, indeed the chief anonymous manner of initiating those very shifts themselves!

            This insight is of the utmost. In modernity, art has replaced religious belief, popular art, religious behavior. But the idea of the sacred remains intact, as does the enactment of ritual and the identification with the archetypes, though such lists thereof vary. Finally, we may state with more confidence that anomie, though also likely a local guise of another kind of presence, specific to human consciousness and perhaps even primordial and thence Promethean in its origins – such a sensibility Heidegger casts as Sorgeheit; the dialectical apex or synthesis resulting from the Aufheben of alienation and anxiety – leads mostly not to suicide at all but rather to care or concernfulness, and thus allows us a glimpse of the possibility of a human future wherein alienation is itself overcome.

            G.V. Loewen is the author of 59 books in ethics, education, aesthetics, religion, social theory and health, as well as fiction. He was professor of the interdisciplinary human sciences for over two decades.

A Modernist Gospel

A Modernist Gospel (H.G. Wells’ The Sleeper Awakes, 1899)

            Published first as a serial and thence complete in the same year as Acton began the bulk of his ‘Lectures on Modern History’, and several months before Freud’s The Interpretation of Dreams and yet several more before Nietzsche ultimately succumbed to a genetic brain disorder that had also claimed the lives of his father and brother, Wells’ early dystopian novel came hard on the heels of a series of legendary hits including The Invisible Man and The War of the Worlds. The original bore the title ‘When the Sleeper Wakes’, which was altered, along with some other minor edits, by Wells in the 1910 edition. In the preface to the third edition of 1921, he remarks that he no longer felt that such a future would in fact be the destiny of humankind. Over a century later, faced with his vision of an autocratic capitalist hierarchy made manifest in a social organization kindred with that portrayed in Fritz Lang’s Metropolis (1926) and satirized in Aldous Huxley’s Brave New World (1932), we are not quite as sure as was the author of our collective fate.

            Wells disliked Metropolis, and we can infer that he felt it relied upon a plot device to ‘awaken’ one of the elites to the misery of the minions who supported him and his peers at the expense of their lives. Our contemporary geopolitics bears Lang out, but as well the Wells of 1898, when The Sleeper Awakes was originally penned. If it is plausible that most of our very much knowing elites take little enough care to ensure their longitudinal position in society, it is perhaps equally unlikely that, if indeed apprised of such conditions, any one of them would become the revolutionary hero we see in both Wells’ novel and Lang’s film. In the novel, the character Graham is cast as a modern messiah, as well as representing the incarnation of a myth, long disused by the literary future. He does not sleep for six days, then falls into a trancelike slumber upon the seventh, in an allusion to the Creator God of the Ancient Hebrews. After awakening 203 years later, he is held prisoner for three days and thence emerges, studying the changes for another three days before reaching a decision to carry forward the revolt which had originally been engineered and thence coopted by the great capitalist figure, Ostrog. This ‘eastern Gothic barbarian’-become-manager allusion is also transparent. Ultimately, Graham does ensure the people’s revolution is successful, but only through his self-sacrifice. In the climactic personal scene, he stands aloof to personal love, that of Helen Wotton, the young woman who has been his voice of conscience.

            The novel is thus only temporarily dystopian, and its theme is subjectively about self-sacrifice, objectively about political manipulation and exploitation, one of Wells’ leitmotifs. Even if he is arguably the most visionary author in the English language – it is a challenge to see anything new in science fiction and related genres if one knows Wells’ entire corpus of fiction – he was still a child of his time. Socialism and eugenics dominated his outlook; he saw both as the chief manners of improving the human race. That we have rejected both almost entirely – the human genome project and the social welfare state are perhaps the residue of these once much grander ideas – Wells might well have seen as a final acquiescence to the thralldom of capital. He writes still later, in 1923, that with the publication of Men Like Gods he had ‘tired of telling brighter tales of the human future to a world intent on destroying itself’. No reflective person today would not share his pain.

            Wells himself takes great pains with his thick description of the world, c. 2101. That it is peppered with imaginative innovations in the technical realm does nothing to distract the outsider from its basic inequality and injustice. The novel is a handy read for anyone who desires some much-needed perspective on our own reality, 2024. If anything, we have travelled nearer to Wells’ vision in the interim; half-way, he might judge us, if he could see our condition today. That we too have our technical spices and distractions, that our ability to do things in and to the world has far outmatched our ability to both think and care about the self-same world, all this he would have recognized and indeed predicted with his usual accuracy. But his late Victorian prose serves a more profound purpose than immersion; it allows the reader to just as painfully work through this terrifying vision with tools that are not made for such work. In this, we are cast back upon our own contemporary ethics, and each of us falls back upon her respective conscience, both of which seem unwilling, or yet unable, given their entanglement, to vouchsafe a humane future. We are as is the sleeper. Wells’ agenda is to awaken us, and that at a structural level, not at all one ideological. He is well aware that even if we do not literally sleep, we are yet asleep to all that truly matters to humanity.

            Helmuth Plessner reminds us that by ‘dividing the universe into fields of action, the world loses its face’. That we harbor the means of self-destruction and, once again, have entered a cycle wherein politicians are more amenable to committing global suicide on our behalf, he understands as merely the logical consequence of making a technique of cosmology. Oddly, we can understand that this ‘discursifying’ of creation begins with the original Western gospels, its four-square of discipleship reporting as allegorical disciplines; the taxman representing government, the doctor representing the sciences, the seer the remaining enchantedness of the world, and finally, the youth, who represents the future. We understand the final three years of Jesus’ existence through lenses of action, each with the germs of their respective fields. Our ongoing harvest has left much of those four fields fallow, and Wells plays upon this with his contrast between the cynical rationalizations of Ostrog and the call to conscience of Wotton. The fruit of the gospel remains the sustenance of only the most marginal. Graham is referred to as ‘God’ and as ‘the one who has come’ and so on, in various moments when the people are encouraging or agitating for his presence and his Word.

            It was not at all peculiar for fin de siècle authors to rewrite the gospels in modernist forms, or yet pen new gospels entirely – Thus Spake Zarathustra is of course the stand-out in this regard; once again in four books – and this interest speaks of their disenchantment with the idea of progress and their sense of the coming apocalypse. That August 1914 ended the bright-eyed gaze of both evolution as progress and Western culture as objective spirit, allowing John Bury to recapitulate a ‘history of the idea’ itself by 1920, should present a serious caveat to our own contemporary world visions, humane or inhumane the both. That Wells was able to conjure, in his own inimitable and unsparing style, a story resolutely current to the denizens of a different age, is an enduring testament to his own prescient imagination. But that we have celebrated many others of his works which only at best indirectly touch upon the key problems our species faces, presents a much more dubious record of our willingness to close our own hearts off to our consciences, thereby denuding consciousness itself of its built-in compass.

            At once straddling the genres of fantasy, horror, science fiction, dystopia, and tall tale, The Sleeper Awakes is, finally, simply a very solid and relevant narrative that sold well on the backs of Wells’ early legendary works. Its challenge to us is not so much literary – the novel of today has displaced third party external description with a deeply introverted sense of what is going on in the character’s mind, and this not gleaned by way of described emotions but rather through ongoing reflection and its corresponding personal action – but very much as a statement of a critical politics. To reply to such a pointed query is to make manifest our shared reality as it is, and not as stated by either government or corporation. That Wells has provided to us both the model and the goal leaves us in his inestimable debt.

            G.V. Loewen is the author of 59 books in ethics, education, social theory, art, religion, and health, as well as fiction. He was professor of the interdisciplinary human sciences for over twenty years.

Autobiography in Fiction

Autobiography in Fiction (when the author isn’t quite dead)

            An old friend of mine recently read one of my short stories and noted how I had used my own first name as the narrator’s, the only time I have ever done so. “Did this suggest that you saw yourself in his role, or that part of the story was something that happened to you personally?”, he asked. These are two intriguing questions, and admittedly, they put a flea in my ear to examine my entire corpus of fiction in response. The perduring question that backdrops them is of course, ‘how much reality is there in fiction?’, and that in general. The source-point of such reality, however much of it may or may not be present, is itself problematic; personal memory. Asking if the reader can trust the writer is not so different than asking if the writer can trust his own experience. Indeed, my experience of writing fiction is that it is a form of waking dream, so there may well be as much of the unconscious life in the text as there is conscious memory of waking experience.

            However this may be, such questions remain, and each author, in her desire to become a discursive label rather than a mere person, must confront them in some manner or other. For myself, I began by listing each of the moments where I had quite calculatingly borrowed from my life experience. This kind of material is specific, at first not metaphorical and not to be interpreted as anything but the most convenient of plot devices. Such an overview produced more than I had imagined, and while I have never written the much-vaunted ‘autobiographical novel’ – D.H. Lawrence’s Sons and Lovers, for example – I am guilty of pilfering autobiographical memory in a lesser sense – along the lines of, say, H.G. Wells’ Tono-Bungay, by way of contrasting case. My early novella used a setting from my childhood that I knew well. My first novel used two outré experiences I actually had to help set its partially phantasmagorical tone. Certain characters in short fiction were gently based upon this or that person I had known, more or less well. In others, I placed a part of myself, named or unnamed, in the role of observer, or principal actor. In one short, I was an aspiring writer who lacked commercial success, for instance. In my first mainstream novel, About the Others, many dream-sequences were personal memories, and the protagonist is a retired professor who is too sure of his own profundity. Hmmm, all this sounds vaguely familiar. In my second such effort, the novel The Understudies, one of the three principals is, once again, a retired professor and philosophical author, though this time one full of self-doubt rationalized by a nostalgic sexual swagger. My blushes, Watson.

            Suffice it to say that after such a cursory examination of the presence of the author in his work, there was much to be accounted for, even at the level of plot. But what of the level of metaphor and meaning? Dare I ask, given the lay of the lexical land thus far? That youth figure prominently in most of my fiction, that their task is one of coming of age, of confronting injustice, of working through their own conflict and building character quite literally, might suggest that I myself am yet undergoing a similar self-understanding. Youth becoming adults is a veritable leitmotif in my corpus. Youth learning to live, to love, to gain community, encountering danger and death, are recurring themes. Youth unjustly treated, even ill-treated, at the hands of adults, and that same youth becoming political, dangerous, engaged in self-styled campaigns of justice, thinking little of parricide or what-have-you, on their road to a higher freedom. The pre-Barthes literary critic would pause in wonder at it all; does this author desire to relive his youth in a more noble manner? Or is he yet still a youth in vital areas of his own character?

            Far more so than general non-fiction, let alone scholarly work, does fiction expose the reader to the writer, and that for better or worse. Some authors manipulate this dynamic in their favor, by posing as far more experienced or worldly than they actually are or ever were. There may well be a vicarious element to fiction that is more the act of the writer than that of the reader, though we do not as often think of it this way. And it is the case, perhaps tellingly, that writing fiction allows the author to purvey not only his desires upon a public, unsuspecting or no, but also, more radically, his vision. It is this latter that dominates my own fiction; not vain desire so much as perhaps demythology in vain. I generally write agenda fiction, so by that standard alone it can never be understood as art, quite aside from my not being an artist myself. Such an agenda could be interpreted, however, as giving voice to much that is absent in my own existence, more pointedly than even the wider reality of its lack in our shared world. If Nietzsche, somewhat self-effacingly, tells us that, after all, ‘the philosopher has only his opinions’, then what mere fiction author could say more?

            Such a two-front examination of fictional narrative, on the one hand, deliberate borrowing from reality for plot decoration or device, for character sketch or place setting and, on the other, the inveigling of the authorial unconscious into the very fabric of the literary textile, has one further insight of note: that we ourselves as human beings live a dual existence. At once, we are waking selves charged with the socius’ diktat to perform as normative a set of roles as we can muster to ourselves, and somewhat in spite of this or even because of it pending circumstance, we are as well all that which social norms seek to deny. It is through fiction, literary or no, that the writer explores the fluid dynamic which exists between these existential states; the one attempting to be graceful but the other perhaps approaching grace itself.

            G.V. Loewen is the author of 59 books in ethics, education, aesthetics, religion, social theory and health, as well as fiction. He was professor of the interdisciplinary human sciences for over two decades.

Regression Analysis Redefined

Introduction: Regression Analysis Redefined

            We live in the time of the world regression. How do we then respond to such a world, wherein what appears to have become the most plausible sensibility is the least sensible, the most probable the least possible? In a word, that time can run backward, that history can fold in on itself, and that culture can regress into, and unto, its childhood, even yet its primordial inexistence. And though such an event in some of its symptoms can be measured statistically, this volume of studies suggests that we redefine the analytic of regression. To do so, we might use the following rubrics:

            1. That regression is present any time one desires to base reality upon fantasy, and has thereby lost the ability to distinguish between the two.

            2. That regression is present any time nostalgia is in the ascendant, no matter the cultural thematics or personalist narratives involved.

            3. That regression is present whenever childhood, a mere phase of life, is exalted as both innocent and yet also wise at once.

            4. That regression is present if and when youth and its experiences, once again, a brief phase in human existence, are negatively sanctioned, limited, mocked, or bullied.

            5. And most importantly, when history is itself understood as the handmaiden of myth, and thus its auto-teleology is aborted, regression is the source of this inauthenticity.

            The exaltation of childhood, the disdaining of youth, the disbelief in reality through ‘anti-science’, the dismantling of history through ideology, the inability to discriminate between fantasy and the world as it is – perhaps observed most popularly in our entertainment fictions and more darkly, in our moral panics – and, most insipidly, the inveigling of a marketeering that plays upon our personal desires to re-attain lost youth or yet childhood in the form of generational nostalgia in fashion, popular music, and once again, emerging from the shadows, in mores and norms – all this is the source of the world-crisis today. ‘I want to return our education system to about 1930’, says Dennis Prager, co-founder of PragerU, a private sector purveyor of fantasist school curricula, ‘but without all the bad stuff’. Which would be? The only thing that comes to mind that would perhaps be better would be textual literacy – more people read books a century ago than today; but at the same time we must ask, what kind of books? – but this too would have to be oriented to other, more contemporary forms, such as the digital, in order for it to be salutary to literacy in general. This is but a single example of hundreds globally, which would include populist and nationalist movements in politics, ethnic-based religious affiliations and churches, charter schools based upon ethnicity feigned or historical, government policies that pander to the neuroses of otherwise absent parents, and so on. Let us recast as questions each of the five points listed above, which designate types of regressive presence.

            1. How can one distinguish between what is real and what is non-real?

            The irreal is a third form of general human experience which occurs only when something ‘irruptive’, an event or a presence which breaks into waking reality as if one suddenly and momentarily dreamed awake, makes itself known. These kinds of experiences are rare and we, in our modernity, no longer interpret them in the traditional mode of the visionary or the religious-inspired presence. That they continue to occur sporadically is certainly of interest, given that the cultural matrix which might be seen to have generated them in the first place is long lost. This phenomenological concept can serve us in a different manner today: anything that tends to hitch itself up to the authentically irruptive but is not itself irreal is fantasy, pure and simple. The difference between Israel and Zion is a current example of a political attempt to base a modern nation-state on a legendary construct. Similar historical examples abound: Victorian England’s smittenness with Arthurian Britain, given ideological, that is, unhistorical, literacy by Malory; the Second and Third Reichs’ genuflections directed to Nordic mythos, given artistic transcendence, but equally non-historical this time, by Wagner. Is there now a Zionist composer or children’s author about?

            2. Why do our desires for youthfulness take on a nostalgic formula?

            Mostly for market purposes, childhood and youth are extended far beyond their phase-of-life appropriateness. It may well be that the reappearance of neoconservative or even neo-fascist norms regarding child-raising and the curtailing of youthful desire and wonder are the result of simple economics: the market targeting the only people with non-responsible disposable income, coupled with the general lack of control over anything but consuming by which children and youth are characterized. In this sense, youth consumption is no different from anorexia: a simple attempt to exert agency in an otherwise adult world. Even if this is the case, however, such regressions are no less than evil, as they strike at the heart of what makes youth profound. Hazlitt, writing at the time when ‘youth’ itself was a novel concept, is correct when he states that youth’s very lack of experience is what makes it not only a unique period of human existence, but also gives it its patent sense of wonder, wanderlust, desireful passion, and naïve compassion all at once. From our first love to our first knowing brush with death, such events appear once again to be irruptive, so filled with wonder are they. The very absence of the human irreal in mature being prompts a regressive desire to ‘return’ to our salad days, green not so much with envy as with a desperate melancholic anxiety.

            3 and 4. How is it possible that the absence of experience generates wisdom?

            It isn’t. If experience can sometimes harden our biases, turning us into ironic bigots, it also has the power to banish prejudice, and for all time. Akin to the jaded hypostasis that suffering makes one insightful – for the artist this may be true in some cases, for the rest of us, suffering produces primarily misery, secondarily, resentment, even ressentiment – lack of temporally adjudicated biographical experience in a life is, writ small, the lack of historical consciousness in a culture. What adults are reacting to in the child-mind is a naivety that appears to make suffering blissful; if only we could manage to bracket the world so easily! And what we are reacting to in the mind of the youth is the ability to dare to question the world as it is. Now this second aspect of the illusion of the absence of experience is an excellent tutor, if only we adults would take it up with all due seriousness. Instead, we seek to limit the questions of youth just as we limit youth’s ability to express its phase-of-life’s essential characters: wonder, desire, passion, romance, and most importantly, its rebellion against authority. If we merely took the last facet of the youthful gem and lived it, leaving the other more phantasmagorically inclined imagery behind us where it belongs, we would be by far the better for it.

            5. How do we attain an ‘effective historical consciousness’?

            The phrase is Gadamer’s, and points to a kind of working pragmatics that, in its ‘fusion of horizons’, generates Phronesis, or practical wisdom. One simple way to approach a sophisticated state of being is to recall to ourselves the how-to skills associated with a specific material task, such as fixing something around the house or cooking a meal, a project in the workplace or helping a child with their studies. These are aspects of a consciousness directed ad hoc, or to some specific task or object. They are also the stuff of Weber’s ‘rational action directed towards a finite goal’. Finite goal agency is, in turn, a manner of thinking about the self: I am an actor who needs to get from here to there – what do I need to do to accomplish this movement? The process by which I do so, whatever its content, is a temporal one, but one that belies its own historicity due to its intense focus on what is at hand. Nevertheless, time has passed, and a small part of one’s own personal history has been acted out. Now think of species-being in History as a form of agentive action directed to specific, if various, series of goals. This can not only provide some inspiration in anxious times, when, once again, the mythic apocalypse is being contrived as an overlay upon very material conflicts regarding resources and their distribution, but can also give us, as individuals, the sense that what we do matters within the wider cultural history of which we are a part.

            Finally, the redefined regression analysis (RRA) differs from demythology in that it cannot take place through art. It is an aspect of critical and reflective thought alone. Its effect may be equally disillusionary, but its means must stay analytic, never adopting either the allegorical or the agenda narrative. It also differs from a deontology, which is to be seen more as another effect therefrom than as a source method. Demythology is an anti-transcendentalist critique that is perhaps best performed in art, deontology similarly in philosophy, but RRA in the sciences, and specifically in the human sciences, their critical allies.

            This volume of essays, both popular and scholarly, is dedicated to redefining the analysis of regression in all its forms. It does so at a time when we are witnessing a worldwide regression, the psyche of which is desperate, anxious, and fearful, all of the very weakest aspects of our shared human character. Instead of giving in to those base impulses, grasp rather the more noble cast of compassionate critique, both in your own life and in the life of the world itself.

            The following two articles first appeared in edited form in peer-reviewed journals which are now defunct. They are reprinted here in their original state for the first time.

            2011v    ‘On Distinguishing Between Criticism and Critique in the Light of Historical Consciousness’, in Journal of Arts and Culture, Volume 2, #3, Nov. 2011. Pp. 71-78 dc. ISSN 0976-9862

            2012v    ‘Is there Hermeneutic Authenticity in Pedagogical Praxis?’, in Journal of Education and General Studies, Volume 1, #8, July 2012. Pp. 180-187 dc. ISSN 2277-0984

The Good-in-Itself versus the Good-for-Oneself

The Good-in-Itself versus the Good-for-Oneself (an excursus in grounded meta-ethics)

            The term ‘meta-ethics’ first presents an inherent contradiction. Ethics is, by definition, about the space of action in the world. It is grounded only in the sense that it occurs in medias res, on the ground, whilst running along. It is perhaps typical of analytic philosophy to make the goal an ‘in itself’ and then the means to it quite contextual. The leap of faith is simple enough: can we establish a principle – in this case, about the essence of morality – based upon all that is unprincipled in itself? This faith does not, perhaps ironically, include a moral judgment, for ‘unprincipled’ is meant to suggest only that which is relative to condition. Think of Durkheim’s understanding of ‘deviant’, which was highly statistical, and in which the normative was equally seen as simply that which most people in fact do, or believe. It is the same, and even more so, for his kindred concept of ‘pathological’, which is deliberately contrasted with the stark and even jarring term ‘normal’, so disdained today. The social fact, to again borrow from the same thinker, that everyone is ‘normal somewhere’, belies without entirely betraying the sense that our shared condition is not experienced in an identity relation with itself. But if one seeks a principle, one would either have to assume that there are actions which lend themselves to a choate whole as in a structure made of differing but corresponding elements, or that at some point, with the presence of enough of a certain kind of action or sets of actions, a Gestalt belatedly arrives which can be thenceforth named ‘morality’, or some other like category.

            As a hermeneutic thinker, I am cautious about such claims. Ethics is never by itself, or thus an ‘in itself’. It is quite unlike physics in these regards, for physics, though certainly not acting in the proverbial ‘closed system’, is nevertheless highly predictable in its correlative effects, and can be, with great aplomb, analytically worked out backwards, as it were, to specific precipitates and even ‘causes’. Now it is not that ethics so named is random, entirely spontaneous, or improvised on the spot every time. Clearly, there is some relationship between the action of the good and the good in principle, and it is thus a matter of discovering more exactly what that connection may be, how it has altered itself over time, and how living human beings perceive it. But unlike the formal study of meta-ethics, what I will suggest here is that we begin quite inductively and without any principled goal in mind, by attempting to understand a Verstehen of Verständnis, if you will, which ultimately returns to a Selbstverständnis. This ‘selfhood’ is not, in the end, oneself, but rather about the self as it is currently experienced and acted out by our contemporaries in the world as it is. Insofar as it is neither overly personal nor overtly subjective, this selfhood should contain within it at least a semblance of a principle.

            Let us begin then by taking a familiar example in which the contrast between action and order may be glimpsed. It is very often the case, in teaching undergraduates of any age or possessed of any credo, that they imagine that their personal experience can by itself generate facts, or that what they have known is the whole of social reality. Long ago, when I was still myself possessed of a sense of experiential superiority, I responded to a hapless young student who, in reacting to the statistic about youthful marriage – which, at the time, had it that 85% of marriages entered into before the age of 26 ended in divorce – objected that her parents had been high school sweethearts, meeting one another at age 14, married at 18, and were yet together some decades later. Congratulate them, I replied, they are part of the 15%. This generated buckets of belly-laughs from the rest of the class and I am sure the poor thing was humiliated. That I only felt some minor bad conscience about it years later suggests that ethics, at least of the pedagogic variety, had been conspicuously absent in this specific case.

            At the same time, such an event served the wider case quite well, as it pointed out, rather pointedly, that one’s own experiences were not enough to understand fully the human condition. Now, if we take the same sensibility to ethics, we might argue that since one’s own actions in the world are not representative of any kind of morality which might be known by other means, they are also not the fullest expressions thereof. What is meant by this latter remark? If one does not know the good, one can neither be a representative of it nor express it through one’s actions. This is a moral statement, and as such, it evaluates the value of the principle, not by its enactment, but rather, to borrow from Foucault, by virtue of its ‘enactmental complex’. The status of morals in society is one of the salient variables for analytic philosophy’s idea of what meta-ethics might be. The term ‘status’ implies both its state and its value, what it is in itself and how we esteem it, even if we do not precisely know what it is, or what else it may be, unto itself. The old-hat problem of our perception of the world comes immediately to mind, but morality, as Durkheim for one stringently reminded us, is not of ‘this’ world at all. It is social alone, for, as he calls it, ‘there is no other moral order apart from society’. Before Vico, one could read ‘not of this world’ as implying an otherworld; in the premodern sense, one of divinity but also one of spirit. With the Enlightenment, ‘spirit’ disconnected itself from the divine, became ‘objective’, as it were, and whether dialectical in nature or more simply existential, the one thing it no longer was, was essential; spirit had become its own deep deontology.

            Now, however this may strike us, one day as liberating, the next, alienating, and either way, certainly as a foreground to our favorite modernist expression, that of ‘freedom’, the deontology of morality did little enough to thenceforth favor ethics as any kind of ‘in itself’. Ethics, in our day, is more often thought of in the context of business, the white-collar professions, medicine, or the law. We do not regularly hear of ethics as a stand-alone discourse, and when I tell people that I am an ‘ethicist’, or that this is one of my philosophical areas of study, they always ask, ‘do you mean professional ethics or business ethics or…’ and so on. It is clear enough that ethics, recently divorced from morality, has accomplished what Aristotle began only through the sleight of hand of popular language in use. And while morality is itself shunned as dreadfully old-fashioned, as well as avoided because it is perceived as a prime candidate for interpersonal conflict, ethics has almost vanished entirely from the ‘open space of the public’. And if one gets a bemused response to being an ‘ethicist’, just think of how people might react if one introduced oneself as a ‘moralist’!

            The foregoing should not be seen as a digression. Just as personal experience does not in itself comprehend the world, the actions based upon each of our individuated experiences cannot in fact construct an ethics, let alone a morality. In our day, the quest for principles is either a mirror for the ventures in technique and technology which seek indefinite perfection – research in stem cells, in artificial intelligence, in extraterrestrial contact, in cybernetics or cyber-organicity including portable or downloadable-uploadable consciousness – or it is simply another one of the same type. The moral objection to each and any of these is that they are the latter-day Babel, and can thus only be the products of an arrogant but still mortal mind which seeks to be as a God already and always is. One could ask the question, in return, ‘are moral objections always in themselves moral?’ but this would take us beyond the scope of this brief commentary. Instead, though not in lieu of, let me suggest that meta-ethics as framed by analytic thought is fraught with a problem similar to that of making something perfect, or at least, superior to what it had been. One, we are not sure if morality is in fact superior to ethics: the timeless quality of moral principles is obviated by history. History slays morality just as disbelief murders Godhead. We know from both our personal experience and our more worldly discourses thereof that what is good for one goose is not necessarily good for the next, let alone the ganders, diverse in themselves. Two, what is it about the character of ethical thought, and the witness of ethical action, that necessarily requires us to hitch it up to a more static system of principles? We have already stated that ethics is not about the random, and if we take our proverbial chances in the world each day, we do so with the prior knowledge that almost all others are very much aware of doing the same thing, and thus as a society we are alert to a too egregious over-acting, and that of all kinds. Durkheim’s sense of the source of morality again comes to the fore: here, morality is understood as a working resource which expresses its historical essence through the action of ethics. As such, there is nothing to be gained by esteeming morality for its own sake or even contrasting it with the discourse of ethics in a manner that exalts its status.

            The virtuous must be decided not by a grounding, but rather on the grounds of all that which is at first needful of some kind of adjustment. How we access the frames by which we make ethical decisions is certainly of interest, but I suspect that most people do not refer to principles in so doing. Instead, they rely on what has worked for them in the past, their ‘previous prejudices’, which can appear to them as if they were a set of principles in spite of what we have just observed about the essential parochiality of personal experience. In a word, prejudice is not principle. Certainly, morality attempts, and mostly in good faith I imagine, to overcome the individuated quality of merely biographical self-understanding. All the more so is it not present to mind when we act. It is not that morality is utterly moribund, a relic alongside other ontotheological constructs fit only for the museum of thought and never for the world as it is, but we would do better to work on a more effective discourse concerning ethics and, specifically, ethical action, with perhaps the only pseudo-ideal present being the irruptive figure of the neighbor. This anti-socius works against the moral order of society and thus momentarily stands outside of ethical discourse as well. But the action of the neighbor serves as an expression of the essence of our truly shared condition and as such, reminds us of the radical authenticity that must be present in order for ethics to have any reality at all.

            G.V. Loewen is the author of 58 books in ethics, education, religion, aesthetics, health and social theory, as well as fiction. He was professor of the interdisciplinary human sciences for over two decades.

Writing as a Vocation

Writing as a Vocation (a personalist accounting II)

            I never imagined I would become a writer. Even after my twentieth book was published, I thought of myself as an educator, a professor, and a pedagogue, but not a writer. I was simply a thinker who happened to enjoy writing. After I had finished with my administrative role and found that the vast majority of time had been taken up with its duties, divers and sundry as they were, the sheer amount of freed-up time lent itself to that very imagination. A well-known Canadian novelist was my first victim. His delicate indelicacy, ‘it’s quite a bit better than I thought it was going to be’, encouraged me to take at least the idea of writing more seriously. Almost forty books later, I have thought of myself as both a thinker and a writer now for some years. But what does it actually mean to be a writer? What does it mean to write?

            Writing as a Means of Communication:

            Writing is the greatest legacy of the first agrarian period. Other aspects of culture and civilization bequeathed to us from that antique epoch include mass warfare, caste slavery, and steep social hierarchies, as well as abstracted religious systems and gender inequality, all quite dubious historical gifts. Even monumental architecture might be seen as something of an unnecessary luxury. But the ability to record one’s thoughts, or simply describe the facts at hand, has made humanity a much more conscious, as well as self-conscious, species than it ever would have become without it. The first two ‘genres’ of written text exemplify the contrast between the senses and the imagination. The former is expressed in records of warehouse holdings, in the earliest of cuneiform; the latter, in the great mythic narratives, such as Gilgamesh, which in its oral form is far older than even its first recorded rendition. Myth and fact divided the mind of antiquity and they are with us still, though both in somewhat muted form. The mythic has been personalized in a sense beyond belief, which in fact must be shared as part of culture to be truly authentic to itself. Fact has become a signpost for the absence of imagination, which is both ironic and ultimately impoverished. Throughout their conjoined career, myth and fact, fantasy and reality, continue to attract us in spite of their now stilted quality.

            They are able to do so because they continue to communicate things which are of the essence to our kind. On the one hand, writing allows one person to share their vision with another, no matter how outlandish are its contents or premises. With it written down, any reader can judge for themselves whether or not to take it with a pinch of salt or a drop of strychnine. We are able to read of distant places, exotic sources, crazed witness and unexpected encounter. We no longer need to presume it is some version of ‘Livingstone’ whom we meet in the heart of darkness or elsewhere, nor do we presume upon ourselves that we are always and utterly sane if only we manage to shun the irreal or the irruptive. On the other hand, the entire cosmic order is made more accessible to each of us through writing. These need not be the facts of a Gradgrind or for that matter, a Tyler, and the fact that one is, fortunately, a fictional educator and the other, perhaps regrettably, was not, impinges not a jot upon the reader’s sensibilities. Our question immediately becomes, ‘is this fact of merit, does it possess any value other than its descriptive presence?’. The judgment we carry into fiction is not entirely distinct from that which we carry unto fact.

            And it is writing that gives us this more sophisticated grace. We can discriminate between reality and fantasy after all, if only more of us would do so in our own time. Writing is the stringent gatekeeper against any who would sully fact with fiction but, as well, and sometimes in direct contrast to this function, it is also the means by which fact merges into fiction, and something of the fictional, in its ludic veridicity, appeals to us as if it were the thing itself. Writing represences the world as it is, and it makes present to us other possible worlds. In doing so, we find ourselves in the possession of a naked sword, visionary and keen, which, in a singular cut, can tear away the veils we tend to place over both our social normativity and our global inequity. At every level, from the most personal to the utterly dispassionate, writing reveals our truths to ourselves. Been molested? Write about it; let everyone know. Free others to communicate and come together to halt injustice. Fallen in love? Tell us all about it, for we too have such yearnings. Allow us to dream together in a waking state, overly conscious of our singularity, overtly impassioned by our desire for community. An undiscovered world awaits all readers of both astronomy and history, fantasy and science fiction. In a sense, writing does not discriminate such fields so distinctly as does discourse, and this is one of the chief differences between writing in the Derridean sense and the ‘tracing’ of nature through language in the Saussurean.

            Either way, communication remains writing’s primary role in culture, whether or not the intention of the author recurs in their works, and without respect to the reader’s own intentions, whether it is to be simply entertained, informed, or enlightened. To each her own epiphany, one might respond to the text in hand, and from each their own experience. For writing has one further sidereal quality: that it becomes part of the reader’s world and his experiences thereof and therein, forgetting its ‘original’ source-point and reaching over any differences in biography and even history that once lay between writer and reader. In this, writing cannot in itself ever be parochial. For we living beings, this status provides for us an egress from our own rather sheltered perspectives and oft-shuttered imaginations.

            Writing as a Personal Experience:

            Non-fiction writing is an exercise in waking from what Schutz has framed as the ‘wide-awake consciousness’. This may at first seem redundant: how does one awake from the already waking life? Social reality provides for us a seldom penetrable insulation of norms, rituals, symbolic forms, and abstract beliefs within which no thought is necessary. As long as I run on my cultural and historical rails I need not blink at the world. But upon writing about this oft otherwise mute witness, I am compelled to reflect upon my sense of that same world, and what had been predictable and routine becomes much more experiential and even beckons an incipient adventure. Writing about the world as it is, insofar as each one of us can grasp it, is to awake from the day-to-day of the waking life. It is to simply become conscious, rather than to ‘raise consciousness’, for consciousness is always already with us and we are consciousness embodied. This awakening is also not a specific moral directive, such as ‘becoming woke’ or even ‘waking up’. It is a phenomenological disposition that pauses when it encounters the ‘of course’ statements associated with any automatic, or even automated, defense of society in the majority view. This is the hallmark of non-fiction: that it at once describes how things actually are and asks the reader to reflect upon, and question after, such truths. Non-fiction explicates to us that things are not quite as they seem to be, without suggesting how such things might be or might have been in the same way that fiction does.

            Fiction, by contrast, is less limited by the world. It may present different worlds, more or less plausible, and thence judged in terms of how recognizable to the unthought norms of the day they may be. If non-fiction writing awakens us to the subtext of life and living-on, writing fiction is to experience a waking dream. When we read the fiction of others, we note that our own perceptions are enlarged, not in reference to the world per se, but rather to our own respective psyches. That the collective unconscious of humanity may also prove to be within our reach, at least once in a while, is testament to the function of the mythic as it plays within a reality itself bereft of myth. The latitude of interpretation associated with reading fiction is also wider than that of non-fiction, as readers may feel more free to bring their own experience into the text. Similarly, writing fiction sources itself in the author’s own experience, and those experiences which have been related to him by others he has known, sometimes intimately, sometimes vicariously. A commonplace projective trope thus begins with such rhetorical questions, ‘what if I had known her better?’, or, ‘what if we had never met?’, and the like. In fiction, we are able to step outside of the facts at hand and imagine something else, indeed, almost anything else. This is why the creative character of fiction cannot be entirely divorced from the ‘discoverable’ sensibility associated with facts. If it were, the world would lose its historical essence and humanity would be forever stunted in its species-maturity.

            My own experience with writing has fully participated in both major realms. For myself, scholarly non-fiction is shot through with the dialectic, as is appropriate for a hermeneutic phenomenologist. My more general non-fiction works are attempts to communicate difficult analyses to literate lay-people no matter their own backgrounds. It is the latter which is much more challenging for the writer to accomplish with any aplomb, and my originally mediocre assays have, over the years, given way to more modest, and thus more effective, offerings. At the same time, I take some satisfaction in making nominal contributions to aesthetics, ethics, education, and psychology, all emanating from my philosophical base. It would be past vain to enumerate such titles, but two examples, from both ends of the writerly spectrum, so far stand out: Aesthetic Subjectivity: glimpsing the shared soul (2011) is my major statement about art and its attendant discourses. The title is mine, the subtitle, the publisher’s, denoting a sudden and apt insight on their part. This book received a number of interdisciplinary reviews and was an unqualified success. But scholarly books are, by definition, elusive, and this work is now sadly out of print. In contrast, The Penumbra of Personhood: ‘anti-humanism’ reconsidered (2020) was a nightmare to write and no doubt the worse to read. I vowed to never write another large-scale scholarly work and to this day I have not, though I am planning one for 2024 in spite of this cherished interregnum. ‘Penumbra’ nearly finished me as a non-fiction writer, and was a reminder of how the vocation of writing can take over one’s life, sacrificing it in the service of the almighty text.

            As a belated writer of fiction, I have experienced similar distensions of ability and result. I am, first of all, sometimes taken aback by my waking dreams and how certain aspects of my unconscious life have found their way onto the page for all to peruse. Do I really have a penchant for grotesque violence? Have I never moved beyond adolescence in my desires? Though many would agree that life lived as an adult can be frustrating, and sometimes even the coach of despair, even so, at the end of any read, I would hope no one would wish for a life like those my characters have been given and thus have had to live out. And just as art and life remain distinct, where there is no art one can yet suggest there is also a distinctive absence of life. So my fiction has within it a semblance of both at once. Since for the most part I write agenda fiction, by definition it cannot be art, no matter what kind of literary sophistication it may be said to have, and I make no claims in this regard. I write verse, not poetry, and I write books, not novels. I have never considered myself to be, or to yet become, either a poet or a novelist, but I have penned seventeen novels nonetheless, along with a novella, two short story collections, and an arc of folktales. This last, Raven Today, has been called my most ‘beautiful’ work by readers apparently in the know, and perhaps amusingly, is the only work of fiction I have produced which contains no bad language.

            As with the non-fiction, I may be forgiven for citing just two books here. About the Others was my first adult mainstream title, and this failed art novel was meant as a tribute to my favorite author, H.G. Wells, who himself had quite a number of them. It has some autobiographical elements, and as such is the only work of fiction I have written that relies upon this otherwise commonplace source material. But if my first mainline attempt was much-flawed, if still a tolerable page-turner, my second was, in my own view, perfect. That The Understudies remains unpublished reminds the author that his view of perfection may not at all be understood by others. This too is the common lot for writers of all sorts, and one must inhale that displeasing atmosphere as best one can, expelling it in new directions and perhaps relieving oneself of this or that delusion in the process. Writing fiction is about the literary sleight of hand, so to move from a pleasant illusion to a sometimes unsavory disillusion reworks the story from the outside in. And of course there is a world of difference between writing and publishing, especially in the fiction industry where, because of at least the potential for profit – unlike, and especially, scholarly works – editors and presses become agitated if ‘fit’ for catalogue is at all transgressed.

            Writing fiction is not a thankless vocation. Its task is to step into worlds hitherto unknown and uncharted, but its gift is that you are the one who becomes the first to know, the first to map, and these new worlds come to love you as much as you have given them the reciprocal gift of life.

            Writing as a Discursive Activity:

            All writers contribute to discourse, the conversation of the history of consciousness. If ‘dialogue is what we are’, as Gadamer has declared, discourse is that dialogue written down, a record of thought itself, and not merely thoughts, which any person may have, and in the most fleeting of fashions. Discourses come in many forms, and one need not be dismayed if philosophy is not on one’s writerly menu. Few read it, for one, and fewer understand it. And though it is not economics, the ‘dismal science’, philosophical discourse is often discouraging, as it leaves nothing sacred and unmasks even the sweetest of sentiment for what it may be or contain within. It is, in a word, not for the faint of heart, and if one has any hint of the Pollyanna, it will leave that fake Sophia naked and utterly at risk for her estranged sister’s truth.

            Given this, it is no holiday to write either. Perhaps it is this slough-filled pilgrimage which is the truer source of the action in my fictional works! I do find myself alternating between non-fiction and fiction, sometimes writing both at once, as I am currently doing. But discourse is immune to authorial sentiment. And if the author is himself dead, as Barthes famously reminded us, perhaps the writer lives yet. I have stated that we today dwell in the period of the afterlife of God, so it is not a stretch to imagine as well an afterlife of the author as a kind of remanential writer. This figure is itself discursive, and is made up, if you will, of all those who continue to author works in spite of that particular literary function being surpassed or superseded. That there is no autograph which can contain the text, that there can be no signature which vouchsafes it, is, even so, not to say that the reader can do more than rewrite the read in the light of her own experience and sensibilities. Penetrating non-fiction, as well as reflective fiction, in fact disallows complacency of any kind on the part of the reader, and tells us instead that discourse is alive and well, fully matriculated from its birth, divine or no, and fully accepting, and acting upon, its birthright.

            Hence writing is to experience the presence of discourse in one’s life. It is creative, in its guise as fictional, constructive as factual, but either way, it remains a wholly discursive act. That I became a writer tells me in turn that the vocation of writing adopted me as its own child, as it has done for countless others and, one would hope, will continue to do as long as there exists a human consciousness worthy of its precious record.

            G.V. Loewen is the author of 58 books in ethics, education, aesthetics, health and social theory, as well as fiction. He was professor of the interdisciplinary human sciences for over two decades.

Teaching as a Vocation

Teaching as a Vocation (a personalist accounting)

            Sometimes those who can do, also teach. And teaching is also a doing, at least of sorts. Shaw’s perhaps unknowing indictment is well taken, however, for the vast majority of what passes for teaching in our contemporary systems, at whatever level, is tantamount to mass regurgitation and within the framework of patent unthought. This is what needs be for any social system to reproduce itself without too many mutations, not unlike the patterns our genetic proteins must follow. At the same time, the world does not wait for us, nor for any systemic congeries we have constructed for ourselves. So, within the mass, there must always be the mutant, as it were, the catalyst for a transformation of thinking and even of human experience itself, so that reproduction is itself given new life. And those who feel that assignation, who treat teaching in the traditional sense of vocation, are perhaps more apt to become those agents of necessary change. What follows is a brief narrative of both pedagogy as a discourse and of my own experience becoming both a pedagogue and an education theorist.

            Teaching as a Vocational Experience:

            Though I taught my first few classes as a graduate assistant in the Winter Term of 1989, I only became a sessional instructor five years later. Five years after that, I was awarded my first tenure-stream professorial position, and for some twenty years occupied this perch in various units and in three universities, ending my career after a five-year chairpersonship of a liberal arts department. A quarter-century in the university classroom, with more than 140 courses taught over that time, and my experience was one of some irony. At the beginning, I felt the calling of teaching as an authentic assignation, but by the end, I felt nothing of the sort. Does the saint recuse himself from his hagiographic similitude? Does the pilgrim quit his progress? Or for that matter, does the dictator ever simply step down? Clearly, one’s personal sense of what one must do can shift over time. My friends have suggested that I teach still, just in different and distanciated venues, sometimes digitally, other times informally, and I have done various writing workshops and series over the past few years, though now even these are fading memories. I have not been inside a bona fide classroom in over eight years.

            As vocations go, teaching has many rewards, both in the light and, to be discussed below, in the shadows as well. But teaching presumes that one can also learn from one’s students, alter one’s pedagogic trajectory to fit their needs, or have at least the nerve, if not the outright gall, to suggest to them that they do not know their needs, or are only partially conscious thereof. This may seem rash, but any vocation demands also vision. The saint does not brook debate regarding ideal action in the world, and indeed seeks to make mere living action into transhistorical act. The pilgrim will not be detoured from her goal, however afar, and in turn will not be deterred from pursuing it by all means, even if such sometimes stray into the unmentionable. The dictator’s Diktat is indeed generally unfit to print, but nevertheless, it commends itself with utmost consistency to the principle of vocation. Teaching, much less glamorous than any of these, is nonetheless safer, and to the point of complacency. The goal of university teaching is to be, speaking of ideals, open-ended, improvisatory, iconoclastic, critical. Its actual character tends toward the routine, even the otiose, as evidenced by my own professors, trained as they were in mid-century, and many by canonical figures. By the time I possessed the terminal degree from a world top-40 institution, I was but once removed from the likes of Talcott Parsons – I possess to this day many of his office files in which he stored his accumulation of journal articles, as well as the papers themselves – Erving Goffman, Claude Lévi-Strauss, Raymond Firth, Victor Turner, Virginia Olesen, as well as others, including the great Dorothy Smith. With this last I had the privilege of dialogue much later in my career, when I myself was nominally worthy thereof. Yet in spite of, or perhaps because of, their elite training in the human sciences, my own professors’ course outlines were sometimes forty years backdated, sometimes even non-existent, for these were the days when university administrations actually kept their distance from the pedagogic scene.

            I had numerous teachers of merit, but by far the most important was no less than Dorothy Heathcote, the legendary drama pedagogue, by whom I was taught firsthand in the summer of 1980 when I was but fourteen. It was a transformational experience, that summer festival workshop series, with same-aged peers and the most brilliant pedagogue for youth I have ever known. It was she who told me that I had the potential to teach, and she who took the first step with me and showed me a path upon which that potential could evolve into a practice. For many years as a professor in my own right, I attempted to conjure for my students that same sidereal realm in which she moved so effortlessly. Heathcote was compassionate, fearless, unbounded, and quick on her feet. She had about her an aura of gentle invincibility; this is the only manner of description that comes to mind when I think of her. She showed me that the best pedagogue did not so much live and die by her students’ aptitudes or abilities, but rather helped each student understand the very meaning of life and death in its relation to experience, to knowledge, and to education.

            My longest-lasting teacher, and also my most personal, has always been my sister, a five-decade veteran of the public schools, in which she occupied almost every role imaginable, from itinerant music teacher to principal, through drama director and superintendent’s office curriculum planner. That she continues apace today, working as the field supervisor in teacher training for her regional university, attests to the truer sense of vocation in pedagogy which is no longer present for myself. One’s experience of teaching as a vocation includes moments of ethical fulfillment – the most commonplace is when a student relates how you have transformed their life and given them a keen drive to succeed or at the very least a hope and an aspiration to be more than they had been before – as well as a consistent sense of existential contentment. No one I know has had more of these future-oriented moments than has my sister, and every one well-earned.

            That I have a number of life-long friendships that began in the classroom is a lasting blessing. That I met the young woman who was to become my future wife in the classroom strikes me as a kind of miracle. The many thousands of students, most of them marginal and many the first in their respective families to attempt college, have of course come and gone. Those once known fairly intimately I now know nothing of. Those who were obstreperous have long been forgotten. And all this is as it should be, for another principle of the vocation is its not quite diffident, but indeed quite dispassionate, stance and instantiation of itself. Assignation is itself impersonal, for whatever the source of such, be it the Fates or the Furies or both, could have chosen anyone in the end. A vocation is the result of a Valkyric light, shone upon the fragile being merely in the world and making him of that selfsame world.

            Teaching as a Fix, and as a Pimp:

            But teaching as a vocation has its shadow side. If there is magic in it, there is also present sorcery. For myself, I was an attention-seeker, and the fact that I could transfix large audiences, keeping them on the edge of their collective seats for up to 90 minutes, only fuelled the sense that I was, as an individual, more than my vocation would, or should, admit. My narcissism could be rationalized away as being in the service of good product, and clever production. If the classroom experience with Professor Loewen was a commodity worth the price, even in steeply ascending university tuitions, I became, in that space, my own fetish object. I bathed in the applause, and I glowed in the admiration of people far too young to make any worthwhile distinctions of mature character. I came to need the fix, captivating, enervating, and especially offending cohorts of students, getting younger and younger as I myself aged. At present, long outside of such contexts, I have to police myself yet regarding the motives for my more critical work. That I am not always entirely successful any readership will attest. The fact that my course evaluations bore no signs of my self-interest was remarkable but also an important relief. One could say, ‘whatever it takes to get up there and kill it’, but as an ethicist, I maintain my doubts. Teaching as a vocation might cater to the fix, but it does not admit the fixation.

            Nor does it the lust. I was a member of what I think, and hope, to be the final generation of academic gigolos. A young male professor, the campus menace and, at least in my gendered druthers, the patent nemesis of the coed. What I can say is that I never cheated on any one of them, and that they were all adults. That I even fell in love with three of those otherwise uncounted might also be worth something. And of course, my wife of more than two decades rose to the very apex of this otherwise somewhat sordid pyramid scheme cum bedroom farce. The teaching vocation cast as a pimp is unique to the university, or at least one would hope that it is, and as such it places a more stringent ethical demand upon the advanced pedagogue. Institutions have belatedly framed policy surrounding ‘campus romance’ as it is still sometimes sentimentally referred to, as if this were still 1950 or so, and I was witness to these changes, for such policies were non-existent not only when I was myself a student but also for about a third of my professional career. Romance or no, intimate liaisons with one’s own students are not recommended, and I say that as perhaps one of the very least prudish persons on the planet. Inevitably, one’s emotions, or worse, one’s desires, obviate the nobility of the pedagogic plane. It is not that all students must be ‘treated the same’, as if they are but lab rats, but rather that each student must be given their ownmost care and concernfulness, that which is most apt for their current condition, and most astute regarding their current abilities. Beyond this, the tables of desire can be easily turned. I was myself stalked no fewer than five times, and those represent only the cases of which I was aware: by four women and one man. Fortunately, the fellow involved was absolutely non-threatening, and only one of the young women was, at least to me, who is hardly GQ, unattractive. Even so, desire is a game that two can play, further obfuscating both the discourse and dialogue which must be present in authentic teaching.

            Teaching as a Discursive Activity:

            And speaking of which, late in my teaching career I somewhat randomly became an educational theorist. I have now written two books and a number of articles in the field, and I was both astonished and honored that my 2012 book has been used in multiple programs for curricular and pedagogic renewal. For me, the study of teaching became almost as important as teaching itself, and I was able to, as a more mature pedagogue, bring this work into the classroom, thus making it more historically conscious of itself, and allowing students to begin to claim a sense of the wider contexts within which teaching has functioned both as a critical discourse and as its very opposite. My enduring idols of modern education are John Dewey and William James, two pillars of pragmatism but more than this, two transcendental teachers and very much public figures. The present work in digital media I have undertaken with my corporate co-founder and business partner, Avinash Pillay, a true genius of the new age and someone who himself has all the makings of an effective pedagogue, remains profoundly in debt to Dewey and James, and their own attempts at disseminating more widely the history of ideas and the philosophy of consciousness entire, halting and of course technically limited as they were in their own time.

            To read what other teachers have to say about teaching is kindred with reading writers writing about writing, but more on that in a companion piece. Suffice to say that experience is not only a great teacher in itself, but also, in its own shadowy form, a purveyor of bias, even bigotry. ‘I know how to do this and I don’t need to learn anything new about it’ might well be the least of it, regarding the poor attitudes the veteran teacher can accumulate. More subtle, and thus more dangerous, is the evolving sense that I can master any classroom ‘situation’, and that I am the master of any student. That I am unassailable not only in my opinions, but also in my very presence. That I, in a word, have moved beyond the need to risk myself.

            But in fact, within authentic dialogue, there is not only present the dialectic, which is objective without being objectificatory, but also, at a personal and a subjective ‘level’, a ‘diacritic’ function which entails that participants willingly risk not merely their beliefs but their entire manner of being: the way they have lived until this moment. The teacher is a mere resource and more experienced participant in the realm of dialogue. It is an intensely hermeneutic realm, and what I mean by this is simply that it entails translation, interpretation, and interaction unframed by specific discursive tropes. I have written at length about ‘hermeneutic pedagogy’, so suffice to state here that if either student or teacher is unwilling or unable to place one’s very reason for being on the pedagogic table, the results emanating from any lesser classroom or other context will tend toward the merely reproductive. This is not a case of the professor giving over his authority to his students, or even the by now cliché sense that classrooms should be ‘student-centered’. Even learning-centered classrooms, in contrast to, say, teaching-centered ones, still do not reach the apogee of authenticity in dialogue. Of course, the standards of intimacy which can be tractioned in various classes and courses must be set with utter awareness of the students’ own presence and their willingness to risk themselves. But I have always pushed my students to expand not only their perspective in relation to history and thought itself, but also with regard to their own capabilities. I have encouraged them to ask any question, no matter how impolitic or unfashionable, and to speak to any topic, if only to express their incomplete knowledge, which is in turn a more profound expression of our incomplete beings, to be finished only in mine ownmost death. This concept of incompleteness is of the utmost in a serious pedagogy, for it reminds young people that, no matter the life-phase or one’s ‘amount’ of experience both personal and cultural, we are, ideally, always learning, and that the new is only what fully overturns and overcomes what we once thought we knew.

            Teaching the Vocation of Teaching:

            Lastly, I would like to add a few lines about how one’s sense of vocation in general is itself transformed by the experience of teaching.

            A vocation begins with wide eyes and bright imaginings. It resonates with childlike wonder and perhaps also even a smidgeon of childish anticipation, as if each new classroom were an unopened birthday gift of unknown proportion and value. It should carry one through many other vicissitudes of a life, its own exiguous thread enduring any strain, suffering any insult, and shrugging away any care. And this personal function may last the entire life course, even if its objective content and very character be altered, as it has been for myself. Teaching as a vocation should also stand aloof from both bribe and blackmail, for it should fear no evil other than being wary of that within the shrouds of its own shadows. Over time, one’s own sense of what one is doing alters its vantage point, pointing away from imminent joy and as well pleasure eminent, and toward the more practiced sensibility of ‘Am I doing this well, what can be better, how have my students changed over time, what now does the world ask of all of us?’ and other like queries ongoing. One progresses from painstakingly constructing course outlines, living and dying by every course evaluation, memorizing entire lectures and the like, to being able to gain the larger pattern and paint the more complete picture, of being able to walk into any classroom absolutely cold and simply flick on the killswitch, and of not being overly concerned about either the latest pretty face or the most recent and in fact non-teachable failure, both of which will ever be present as long as one remains an active teacher. These changes represent to oneself both a personal evolution and one discursive and dialogic.

            A vocation ends simply when one decides to end one’s relationship with it. Its presence then becomes a kind of remnant, but a good-natured one, and one not given to haunting either our incomplete dreams or our doubts about what we in fact have accomplished, however distant and dated such may be. When I left teaching I was momentarily lost for purpose in action, but I was never alone, for the experience of assignation is fully portable across any specific series of vocations, and this by itself is perhaps the most profound thing one learns by having had one in the first place.

            G.V. Loewen is the author of 58 books in education, ethics, aesthetics, health and social theory, as well as fiction. He was professor of the interdisciplinary human sciences for over two decades.