On Living at the Edge of Meaning
There is a novel by the Japanese writer Yoko Ogawa called The Memory Police in which things disappear from a small island. Not catastrophically, but quietly. Roses vanish. Then birds. Then calendars, ferries, and photographs. Then people begin to forget, not just the objects themselves, but the capacity to miss them. The world grows thinner as a result, and nobody mourns, because the grief disappears along with the thing itself.
I have been thinking about this novel for a long time; not as fiction, but as a form of diagnosis.
We are living, like in Ogawa’s novel, through a slow subtraction. Not of physical objects, at least not yet, but of the structures that make experience feel weighted, significant, continuous. Community, shared narrative, trust in institutions, the assumption that what one does today connects meaningfully to something larger and lasting: these are dissolving, and the dissolution is so gradual that most people do not register the loss. They have already forgotten what the thing felt like when it was there.
Short-termism and the failure of feeling
The human brain evolved to solve immediate threats: predators, hunger, rival tribes. It did not evolve to feel the full weight of problems that unfold over decades or centuries. This is not stupidity. It is a profound mismatch between cognitive hardware and the scale of modern problems. Climate change, infrastructure decay, institutional erosion: these are slow bleeds, not wounds. And slow bleeds are easily normalized. By the time the loss is undeniable, the capacity to name it is already half gone.
This is what the anthropologist Joseph Tainter identified in his study of civilizational collapse: complexity accumulates until the cost of maintaining it exceeds its returns. Then the whole system simplifies, rapidly and in ways that leave survivors unable to reconstruct what was lost, because the knowledge required to reconstruct it was itself part of what collapsed (Tainter 4). The people left behind are not stupid. They simply live in a thinner world and have no memory of the thicker one.
The philosopher at the end of things
There is a particular kind of thinker who occupies a difficult position. One who sees the pattern clearly and who carries, as a result, a weight of perception that cannot easily be shared, because the people around them are still furnished with illusions that, however false, provide genuine comfort and motivation.
Albert Camus wrote about absurdity: the confrontation between the human need for meaning and the universe’s silence in response. His solution was revolt: to refuse to look away, to insist on living fully inside the absurdity rather than fleeing into false consolation (Camus 53–54). But Camus was working with individual mortality. Civilizational or species mortality is a problem of a different scale, and the revolt required is larger, and lonelier.
Oswald Spengler, writing in the aftermath of the First World War, argued that civilizations follow organic cycles (spring, summer, autumn, winter) and that the West had entered its late autumn, what he called the age of Zivilisation: technically brilliant, but spiritually exhausted, and culturally imitative (Spengler 31). The philosopher at such a moment does not have the luxury of optimism. But neither, Spengler suggested, do they have the luxury of despair. One lives in one’s time, and does what is possible.
The thinning of infrastructure
The thinning is not only existential. It is material.
Modern civilization runs on systems of extraordinary complexity and interdependence (electrical grids, global supply chains, financial networks, the internet), and these systems were built for efficiency, not resilience. They are tightly coupled, which in engineering terms means that failures propagate rapidly and widely (Perrow 4). Most people have no idea how fragile they are. The knowledge required to survive outside these systems (how to grow food, preserve it, source clean water, practice basic medicine) has largely been lost at the population level. It is not stored anywhere accessible. It has simply disappeared, like the roses in Ogawa’s novel.
Climate change does not need to make the planet uninhabitable to produce catastrophic outcomes. It needs only to push infrastructure past its tolerance threshold at enough points simultaneously. A billion displaced people do not merely suffer: they destabilize every political system they touch. And destabilized systems make irrational decisions: historically, wars over remaining resources, and eventually the deployment of weapons capable of something far more final than displacement.
The act of doing
Viktor Frankl, writing under the abhorrent conditions of the Nazi concentration camps, argued that meaning does not require favorable conditions, but it does require engagement with what is actually present, however diminished. The will to meaning persists even when objects of meaning are stripped away. What matters is not the quality of the experience, but the fact of presence inside it: the act of witnessing, of remaining conscious and deliberate, as its own justification (Frankl 76–77).
This is a different posture from optimism. It does not require believing that the arc is launching upward. It requires only continuing to engage with the reality that is actually here: the thinning world, the honest questions, the weight of clear perception. The narrator of Ogawa’s novel keeps writing even as the world loses its objects. The act of writing does not restore what is lost. But it is not nothing. It is, in fact, quite a lot.
“Everything can be taken from a man but one thing: the last of the human freedoms — to choose one’s attitude in any given set of circumstances, to choose one’s own way.”
— Viktor E. Frankl, Man’s Search for Meaning (77)
The question is not whether meaning survives the thinning, but whether the act of remaining present (of doing, in the full weight of that word) is itself a form of meaning sufficient to sustain a life. There are philosophers who have concluded that it is. There are people living inside that conclusion right now, carrying the perception clearly, continuing anyway.
That continuance is a quiet form of resistance. Not loud. Not optimistic. But real.
The world thins. Some of us notice. Some of us keep writing.
Works Cited
Camus, Albert. The Myth of Sisyphus. Translated by Justin O’Brien, Vintage International, 1991.
Frankl, Viktor E. Man’s Search for Meaning. Beacon Press, 2006.
Ogawa, Yoko. The Memory Police. Translated by Stephen Snyder, Pantheon Books, 2019.
Perrow, Charles. Normal Accidents: Living with High-Risk Technologies. Princeton University Press, 1999.
Spengler, Oswald. The Decline of the West. Translated by Charles Francis Atkinson, Alfred A. Knopf, 1926.
Tainter, Joseph A. The Collapse of Complex Societies. Cambridge University Press, 1988.
There is a particular kind of exhaustion that has no good name. It is not the tiredness that follows a long day of honest work, the kind that earns deep sleep and a dreamless night. It is something older, more corrosive. It lives in the marrow. It has settled into us collectively, quietly, the way damp settles into old walls: invisible until the plaster starts to bow.
In 2024, the American Psychological Association found that stress levels across the United States had reached their highest recorded point in nearly two decades. Not since the aftermath of 2008 had so many people, across income brackets and across generations, described themselves as overwhelmed and barely holding on. Globally, the World Health Organization reported that rates of anxiety and depression had climbed twenty-five percent above pre-pandemic baselines and simply... stayed there. The emergency ended, but the wound did not close.
We need each other more than ever. But we are so very distracted.
The average person now spends upwards of seven hours a day looking at a screen; more, if they work remotely. Algorithms, those patient and remorseless architects of our attention, have learned with terrifying precision exactly what flavor of outrage or grief or spectacle will keep our eyes moving, our thumbs scrolling. We are not browsing anymore. We are being browsed. Catalogued. Served back to ourselves in slightly more enraging versions, until we are too overstimulated to feel anything and too understimulated to act.
And we are tired. We are oh so very tired.
Our muscles ache almost as much as our bones, and our voices are harsh and raspy, and bark like the worst of secrets spat out at the worst of moments. One in three adults in the developed world now reports chronic sleep deprivation. One in five reports that they have no one they would consider a close confidant. This is up from one in twenty just thirty years ago. We are, in the clinical language of researchers and quietly in the private language of our own hearts, desperately, profoundly lonely. Not the romantic kind that songs are written about. The grey, grinding kind that makes ordinary afternoons feel like something to survive.
We stumble and cry as we slip and fall, tearing our flesh in tiny, neat slits. Clean, but not. We are bothered, but do not break.
Watch any city street at rush hour, or rather, look at it, look past the AirPods and the lowered gazes and the carefully performed purposefulness, and you will see it. People moving through each other like water around stones. There is grief being carried in those bodies. There is fear. There is the residue of news cycles that do not pause: of wars that grind on in corners of the world we name and then forget; of wildfires and floods and disasters arriving in places that were told they were safe; of economic squeezes that turn the simple act of buying groceries into a small, humiliating arithmetic. We howl and twist into the mud and the blood and the roots and the rocks, and our human symmetries slip away into something far older than we.
Something animal. Something pre-language. Just the body, trying to persist.
We are trapped. But we have always been trapped in a silent contract we never signed. Just like all the other animals: things reduced to numbers, to be bought and sold by the supposed best of us. Used and discarded. Tools for trade. Meat to be manipulated, molested, and passed around until no longer useful.
The language of markets has colonized everything now. Our attention is inventory. Our data is a commodity. Our fears are a revenue stream: packaged, segmented, monetized by platforms worth more than the economies of mid-sized nations. We did not agree to this, not in any meaningful sense. We clicked accept on terms we did not read, for services we could not afford to refuse, offered by entities with no face and no address and no particular interest in whether we flourish or merely persist long enough to generate another quarter's worth of engagement metrics.
I want it to stop. We want it to stop. Surely, we must.
There are days I think we have forgotten what it feels like to want something without being told to want it first. We have outsourced so much (our navigation, our memory, our social lives, our sense of what matters) that the self begins to feel like a subtenant in its own house. We put ourselves first, stupid and vain apes that we are, and by doing so we have created a world that somehow serves almost none of us. The wealthiest eight people on earth now hold more than the bottom half of humanity combined. We built that. We ratified it with every transaction, every election, every time we decided that a stranger's suffering was not quite our problem.
We have turned our backs upon everything else. We have no one else to turn to.
And yet.
After the Los Angeles fires of early 2025, strangers drove hours to deliver supplies to people whose neighborhoods had burned while they slept. After the floods in Valencia, in Brazil, in the Sahel, ordinary people organized in group chats and neighborhood networks and did the work that institutions were too slow, too structured, too self-preserving to do. When the lights go out (literally or otherwise) something in us still moves toward each other. Still reaches. Still lifts.
This is not sentimentality. It is data. It is ten thousand years of evidence that we survive not by our individual brilliance but by our stubborn, irrational, magnificent tendency to help each other anyway.
Reach for my hand in the dark. I will help lift you up, just as surely as you lift me.
I mean this as plainly and as urgently as it sounds. Not as a metaphor. Not as a motivational poster. As instruction. As the only thing that has ever worked, in the end, in any civilization, across any catastrophe you care to name. We are going to need each other, not abstractly, by way of speeches and manifestos, but in the concrete, unglamorous way of checking in on someone, of showing up, of asking the question and then waiting for the actual answer instead of the acceptable one.
We cannot get out by ourselves. We never could. The self-made myth was always exactly that: a fiction stitched together in retrospect over a lifetime of invisible dependencies, of mothers and teachers and strangers who held a door. The most radical thing any of us can do, in this particular darkness, is to refuse the fiction of our own self-sufficiency and admit, out loud, that we are here because someone reached for us once, and that we owe the same reaching to someone else.
So. Here is my hand. I cannot promise you light. But I can promise you company in the dark, and that, as it turns out, may just be the same thing.
Works Cited
American Psychological Association. Stress in America 2024: A Nation in Crisis. APA, 2024, www.apa.org/news/press/releases/stress.
Cacioppo, John T., and William Patrick. Loneliness: Human Nature and the Need for Social Connection. W. W. Norton, 2008.
Cox, Daniel A. "The State of American Friendship: Change, Challenges, and Loss." Survey Center on American Life, American Enterprise Institute, 8 June 2021, www.americansurveycenter.org/research/the-state-of-american-friendship-change-challenges-and-loss.
DataReportal. Digital 2025: Global Overview Report. Kepios, Jan. 2025, datareportal.com/reports/digital-2025-global-overview-report.
National Sleep Foundation. Sleep in America Poll 2024: Sleep and Performance. National Sleep Foundation, 2024, www.thensf.org/sleep-in-america-polls.
Oxfam International. Inequality Inc.: How Corporate Power Divides Our World and the Need for a New Era of Public Action. Oxfam, Jan. 2024, www.oxfam.org/en/research/inequality-inc.
Twenge, Jean M. iGen: Why Today's Super-Connected Kids Are Growing Up Less Rebellious, More Tolerant, Less Happy — and Completely Unprepared for Adulthood. Atria Books, 2017.
United Nations Office for the Coordination of Humanitarian Affairs. Global Humanitarian Overview 2025. OCHA, 2024, www.unocha.org/global-humanitarian-overview-2025.
United States Surgeon General. Our Epidemic of Loneliness and Isolation: The U.S. Surgeon General's Advisory on the Healing Effects of Social Connection and Community. U.S. Department of Health and Human Services, 2023, www.hhs.gov/sites/default/files/surgeon-general-social-connection-advisory.pdf.
World Health Organization. World Mental Health Report: Transforming Mental Health for All. WHO, 2022, www.who.int/publications/i/item/9789240049338.
In every horror movie, there is a moment when the protagonist understands.
Not with a jump-scare, or in the shocking discovery of the first body, but with a quieter, more devastating recognition that the situation was always already what it now appears to be.
The house was never safe. The town was never innocent. The monster never arrived, because it was and had always been here, hiding patiently in the architecture of the ordinary.
In this moment, what changes is not the facts, but the quality of attention brought to bear on them. The world does not transform. The protagonist changes, and what they become, in that becoming, is someone who can no longer not-see.
I believe we are having such a moment, not in film but in the world.
This revelation did not arrive with screeching music or the sudden appearance of something monstrous. Like the most unsettling truths, it arrived in the middle of an ordinary sunny afternoon, as a logical conclusion one has delayed for years.
It suddenly became clear that the world is finished with us. Not finishing, as in the process of concluding its business with humanity, but finished in the way a sentence is finished. In the way a calculation is finished.
The result has been arrived at. What remains is not a resolution but a remainder.
Horror, as a genre, has always known this. It has known that its monsters and its dread and its long corridors leading to inevitable rooms are not metaphors for our anxieties but dispatches from a reality we have been too comfortable, or perhaps too cowardly, to inhabit directly.
Perhaps, then, horror is not escapism but the one genre that has been telling the truth all along.
We have had it exactly backwards.
The critical consensus (that horror is a lower genre, a pleasure-mechanism for processing fear at a safe remove, a symbolic theater in which we exorcise what we cannot otherwise confront) assumes that horror takes us away from the real world into some sort of managed darkness.
But consider the opposite possibility: that horror alone has been adequately describing the world as it actually is, and that every other genre represents the true escapism, the true flight from conditions that horror has been mapping with terrible fidelity for centuries.
In In the Dust of This Planet, the writer and philosopher Eugene Thacker distinguishes between the world-for-us, the world-in-itself, and the world-without-us. Briefly: the ‘world-for-us’ is the world of human meaning, human scale, and human narrative; the ‘world-in-itself’ is the world as philosophy has attempted to think it, stripped of our projections; and the ‘world-without-us’ is the world as it will be when we are no longer here to make it mean anything. Horror, he argues, is the genre that most directly confronts this third category. Not because it depicts monsters, but because it depicts a world in which human meaning-making fails, and the universe declines to organize itself around our survival or our significance.
The horror film does not take place in a different world. It takes place in this one, with one crucial variable adjusted: the assumption of our continuance has been removed. Strip this assumption from the world as we ordinarily experience it, and you do not get fantasy. Rather, you get a more accurate picture. The monster is not an intrusion from elsewhere. It is what was always here when the lights of our self-importance go out. It is the world-without-us, pressing its face against the glass.
Horror narratives have a certain grammar, and it is worth studying as one would study the grammar of any language that carries information about the nature of things. Characters are introduced and eliminated in sequence. The ones who survive longest are not saved. They are simply the last. Therefore, survival in horror is not a category but a position, a temporary location in a queue whose terminus is fixed.
The final girl does not escape the logic of the film; she merely occupies its concluding position. She is the remainder, not the reprieve.
This grammar maps with uncomfortable precision onto the situation we now collectively inhabit. We are not, as the more optimistic framings of ecological crisis would have it, at a crossroads, as if two genuinely open futures stretched before us and we had only to choose the right path. We are further along in the film than that. The crossroads are far behind us, somewhere in the middle of the last century, in the moment when the shape of what was coming became statistically legible and was nonetheless declined as useful information. What we inhabit now is not decision but sequence. Not a crisis, but a queue.
The horror of this position is that it does not feel like horror. It feels like just another afternoon. The horror film's great formal achievement is the gap it opens between what the audience knows and what the characters experience: we see the monster in the background while the character pours coffee, continues a conversation, laughs at something that will not matter in four minutes. This gap, between knowledge and the texture of ordinary experience, is precisely the gap we inhabit now. The data is available. The projections are not secret. Yet the coffee gets poured, the conversation continues, and the laugh happens. We are the characters who have not yet turned around, and we are the audience that can already see what is standing directly behind us.
The philosopher and writer Gary J. Shipley has written of consciousness as the organ of its own torment: the thing that sees clearly enough to know what it cannot change. The peculiar position of the early twenty-first-century human is precisely this: we have developed instruments of perception (scientific, digital, philosophical) sensitive enough to describe our situation with precision, and we have discovered that this precision does not confer power.
We can see the queue, but we cannot leave it. Lucidity here is not liberation. It is the specific horror of knowing which scene you are in.
There is an important distinction to be made between a world that is hostile to us and a world that is done with us, and the distinction is not a comfort. Hostility implies ongoing relationships, implies that we remain significant enough to be targeted, and implies a monster with an agenda. What I mean by a world finished with us is something colder and more complete. It is the horror not of malevolence but of conclusion.
Consider what it means for a process to conclude. A calculation finishes. The result exists; the variables that generated it become, in a sense, beside the point. They were necessary to arrive at the result, but the result does not require their continued existence. The world (and here I mean the world in its fullest sense, the biogeochemical systems, the climatic patterns, the evolutionary processes, the thermodynamic drift of all things toward their eventual equilibrium) has been running a calculation of which we were a part. And it is finishing. We were inputs. The output is indifferent to us.
This is not nihilism (or not only nihilism), but an attempt to adequately describe the quality of the situation. Nihilism still tends to make the human its reference point. Ultimately, it is a claim about the absence of meaning for us. What I am reaching toward is something more radical: a description of a world that has simply moved on, in which our extinction is not a tragedy in any cosmically legible sense but something more like a bookkeeping event.
A line crossed out. An account settled.
Author Charlene Elsby's philosophical work is useful here because it focuses attention on the way consciousness arrives too late to its own conditions; that is, how the phenomenological subject is always catching up with a situation that has already shaped it, decided its coordinates, and determined the field in which its apparent choices occur.
Scale this up from the individual subject to the species, and the structure holds, and darkens. We did not choose the conditions of our creation. We will not choose the conditions of our conclusion. What we have, in between, is the awareness, the specific, agonizing, occasionally beautiful awareness, of the fact of this.
Here, consciousness is not the crowning achievement of nature but its most elaborate mistake: the part of the world that knows what the world is doing and cannot do anything but know it.
The horror film has two modes, and one of them is lying. The jump-scare is the lie: the sudden percussion of shock, the body jolted into alertness, the momentary confusion of alarm with meaning. It simulates danger as an event, as punctuation, as a thing that arrives and then is over. It is the mode that most flatters us, because it implies that the threat is locatable, that it announces itself, that we have a body capable of responding to it in the ancient fight-or-flight terms our nervous systems were designed for. We jump, and then we breathe, and the breathing feels like safety.
But the other mode is the slow burn: the long corridor, the creeping establishment of an atmosphere that makes the very air of the film feel wrong before a single explicable thing has happened. This is the honest mode. This is the one doing the real work. Dread is not shock. Dread is the condition of knowing, at a level below articulation, that the situation is irrecoverable, and having to continue to live inside it while this knowledge saturates everything, colors every ordinary moment with its implication. The slow burn is great not because it frightens us but because it describes, with more accuracy than most other forms, the texture of certain kinds of knowing.
The unraveling of the living world is a slow burn. It does not announce itself with a single event sufficient to organize a response. It moves like the wrong quality of light in a room you cannot identify as wrong until the wrongness becomes undeniable.
The statistics accumulate. The seasons misbehave. Species disappear at the edge of attention. The coral goes white. The permafrost exhales ancient gases into air that grows incrementally thicker with consequence.
None of this is the moment. There is no moment. There is only the slow, terrible sense that the world is a film in which the music changed a long time ago, and we are only now realizing the change.
What the slow burn asks of its audience is an unusual kind of attention, not the alert, reactive attention of shock but something more like the attention of a person who has decided to look steadily at something they would prefer not to see. Philosopher Simone Weil called this kind of attention a form of love, not sentiment but the willingness to let the object of attention be what it is, without the distortion of our preferences for it. To look at what is happening to the living world with this kind of attention is to submit to the horror film's slow burn in real time, with no exit and no end credits (at least, not for us).
The final girl is horror's gift to its audience: the assurance that someone will remain to witness, to remember, to carry the knowledge of what occurred forward into a future where it might mean something. She is the narrative's last stand against the monster. Not the triumph of the human, exactly, but the continuation of the human, in the insistence that there will be someone to tell the story. She survives so that the story can be told. She is the storytelling function in human form.
There is no final girl for a species. This is the formal problem that extinction poses to every narrative we have (every genre, every structure of meaning we have constructed around the assumption of our continuance). Survival narratives presuppose a remainder. Tragedy presupposes an audience. History presupposes someone to record and someone to read the record. Even nihilism, that most anti-narrative of positions, requires a subject to assert it. The extinction of the subject is the one event that all our narrative machinery is constitutionally incapable of processing, because the machinery runs on the subject. It is the one horror for which there is no witness, no one to walk out of the burning house into the morning to breathe the air and know that they are still there.
What we are left with, then, is the attempt to think a story that has no final girl. To hold in mind an event that our minds were not designed to hold, using tools that will not survive the event they are trying to describe.
This is not a conclusion of despair, but an authentic attempt to understand the full implication of what it means to wait in the queue.
The horror film ends. The credits roll. Someone walks out of the theater, and the world is still there.
What does it mean to be inside a horror film that ends without credits, without theater, without someone walking out?
Perhaps it means the attention we bring to our situation is all we have, and that it is not nothing. Not salvation (nothing is salvation; the film is still the film) but not nothing. Elsby's forensic attention to the philosophically inconvenient, Shipley's willingness to let the worst thought think itself all the way through, Thacker's patient taxonomy of a darkness that does not resolve: these are not solutions. They are examples of what it looks like to be in a horror film with your eyes open.
To wait in the line without looking away from the fact of the line. To bring, to the situation of being finished with, something that at least has the dignity of refusing to pretend otherwise.
What changes in the revelation scene is not the facts, but the attention given. The world does not transform; the protagonist does.
I want to end here, with the question of what kind of attention is adequate to a world that has finished its business with us.
Not because I have an answer. Rather, because the question seems to me the only honest place to stand.
Horror, at its best, does not offer resolution. It offers (for those willing to sit with it past the jump-scares and the genre mechanics, into the atmosphere of the thing) a change in the quality of attention: a willingness to look at the world-without-us and not immediately reach for the consolation of our own centrality. A practice of seeing (different from a practice of hope) which may well be the very precondition for any thinking adequate to the actual situation.
We are in the movie.
We have been in the film for longer than we knew. This knowledge does not get us out. But it does change how we move through it.
And that change, that turn toward the thing rather than away from it, that decision to bring full attention to what is actually the case, is the only form of honesty available to us now.
It will not save us. Nothing will save us. But it is what thinking looks like when it refuses to lie.
Works Cited
Clover, Carol J. Men, Women, and Chain Saws: Gender in the Modern Horror Film. Princeton University Press, 1992.
Elsby, Charlene. Hexis. Atropos Press, 2018.
Shipley, Gary J. Warewolff! Eraserhead Press, 2016.
Thacker, Eugene. In the Dust of This Planet: Horror of Philosophy, Vol. 1. Zero Books, 2011.
Weil, Simone. "Reflections on the Right Use of School Studies with a View to the Love of God." Waiting for God, translated by Emma Craufurd, Harper & Row, 1951, pp. 57–65.
Mark Fisher Today: AI, Capitalism, and the Digital Age
The work of the late Mark Fisher (1968–2017), one of the most influential cultural theorists and critics of the early 21st century, continues to resonate deeply with modern audiences eight years after his untimely death. Fisher, known for his piercing analysis of late capitalism, the cultural malaise of modern life, and for expanding Derrida’s project of hauntology, developed a philosophical approach that combined cultural criticism, personal reflection, and political theory. His profound yet accessible writings have carried his influence and legacy far beyond academic audiences.
In many ways, Mark Fisher can be considered an oracle for our overworked, stretched-too-thin society on the brink of collapse. This raises the question: if Fisher were alive today, what would he think of our world, one where capitalism is more entrenched than ever, digital technology permeates every moment, and artificial intelligence stands ready to reshape how we live and work? It would not be a stretch to say that he would most likely not be a fan. So, with that in mind, let’s look at Fisher’s philosophy to imagine his take on our world today.
Capitalist Realism
Arguably Fisher’s most well-known concept, capitalist realism refers to the troubling sense that it is far easier to imagine the end of the world than to picture the end of capitalism. In other words, we are stuck in a self-glorifying trap, full of glorious noise but signifying nothing. This is perhaps nowhere more evident than now, eight years after Fisher’s death, in the face of increasing climate disasters, rising economic inequality, and the looming threat of AI-driven job losses. Our current language around these issues exposes Fisher’s prescience: we discuss our problems as if they are inevitable, not as things that could be solved by changing the system itself. How, then, do we remove ourselves from a system we are already enmeshed in? And how do we stop the commercialization of emerging technologies from enmeshing us further?
This brings us to artificial intelligence. Fisher would, no doubt, be highly skeptical of AI, specifically its use by corporations and governments not to liberate us (much like the promises of Communism), but to make capitalist systems run more efficiently. This type of automated decision-making, especially in the realms of hiring, policing, and welfare, often reinforces existing inequalities, making the logic of capitalism seem like a natural, technical necessity.
The Slow Digital Erosion of Attention & the Death of Cultural Innovation
Fisher was skeptical of digital technology’s promise to connect and empower us, calling social media and our constant digital connectivity a communicative parasite, one that saps our attention and blurs the line between work and leisure. Today this feels strangely prophetic. With AI-powered tools promising to make us more productive than ever and notifications pinging around the clock, our attention has never been more fragmented. Fisher might argue that AI is not freeing us to pursue higher meaning but deepening our sense of alienation, as each moment of our lives becomes a potential site of productivity.
Fisher feared this was leading to cultural stagnation, arguing that instead of creating new art or ideas, we are stuck endlessly remixing and rebooting the past. Generative AI, which works by mimicking a catalog of existing styles, is an excellent modern example. AI can produce a “new” song or story in the style of a lost artist, but it rarely offers anything by way of originality or authenticity. Fisher would likely see this as symptomatic of a deeper malaise: a culture haunted by its own history, unable to imagine a different, better future. In this sense, AI does not break the cycle of stagnation but only hastens it.
This eeriness surrounding AI’s ability to imitate human creativity (more on Fisher's take on the 'weird and eerie' to come!), this technological uncanniness felt in the face of digital creations that are at once familiar and unsettling, makes it feel as if we are haunted by the ghosts of a cultural past, endlessly recycled by machines.
Can We Resist? Or Is There No Escape?
Fisher was deeply concerned with our growing sense of reflexive impotence: the feeling that we know things are wrong, but cannot imagine any real alternatives to change them. Today’s AI debates often focus on managing its risks and making it less biased, instead of questioning why we need so much AI in the first place. We should be asking why the technology was created, what its intended uses and goals are, and who it ultimately benefits. Fisher would likely challenge us to think beyond the idea of AI's inevitability and the logic of endless growth, and to ask instead: what if we used technology to build creativity and genuine solidarity, instead of simply speeding up the status quo?
Mark Fisher’s work remains an extremely important and powerful tool for understanding our digital, now AI-driven, world. In the face of a world haunted by the past and paralyzed by the present, it reminds us to imagine and fight for something radically different from our consumer society. Ultimately, Fisher’s advice for today would be the same as during his life: we cannot accept the present as something unchangeable. This is a message we need now more than ever, a message of hope from one we lost too soon.
References:
Fisher, Mark, et al. K-Punk: The Collected and Unpublished Writings of Mark Fisher (2004-2016). Repeater Books, 2018.
Andrei Tarkovsky’s fifth film, and his last as a citizen of the Soviet Union, Stalker (1979) follows three men, referred to only as the Writer, the Professor, and their guide, the titular Stalker, as they navigate a forbidden area called the Zone. Hidden in this unnatural setting, guarded by checkpoints and surrounded by traps, lies a secret place: a room said to grant one’s innermost desire.
Stalker is an existential film about man’s search for meaning amidst the noise and chaos of the world. It is also a film about the beauty and horror of the world itself, full of wide, contrasting shots of decaying industrial structures and natural landscapes full of green grass and flowing water. In each case, Nature has broken down man’s impressive architecture, depicted in deteriorating walls and torn metal, as if each detail of the environment bears a mystery in the debris scattered about by some unknown catastrophe.
The Stalker is himself obsessed with his mission of taking others to the miraculous room and giving them the opportunity to have their greatest wishes fulfilled, yet he has never crossed the room’s threshold himself. The journey is for others, though he yearns for one of his own that would change the world. Like John the Baptist of the Christian New Testament, he knows the cross is not his to bear. He is only preparing the way for someone better, someone truly worthy: a Savior.
But the journey through the Zone reveals the true nature of the Writer and the Professor, both superficial and vain in their materialism (and fueled by ulterior motives). The Stalker slowly loses hope that humanity can improve the world, his doubt reflected in his gaunt face and haunted expression. He is tormented by his past failures and unable to comprehend the meaning behind himself and the Zone, each doubt stripping away at his crumbling faith little by little. Like the greatest of existentialists, the Stalker is looking for hope among the hopeless.
Jean-Paul Sartre (echoing Heidegger) used the terms dread and anguish to describe an essential state that, while unpleasant, is the first step towards an appreciation of freedom and authenticity (“Tarkovsky’s Stalker”). This sense of hopelessness can be seen in both the Writer and the Professor, though their motives for navigating the Zone (i.e., seeking freedom) are vastly different. The Writer, a poet without a muse (or audience), seeks to reclaim a sense of purpose despite his personal and material wealth. The Professor, however, secretly plans to plant a bomb at the uncanny site, fearing what may happen if the secrets of the Zone fall into the wrong hands. The Professor, in the words of the Stalker, seeks to destroy what he believes will be either humanity’s redemption or its greatest temptation: hope.
The Stalker, having glimpsed but never reached freedom himself, is driven not only by a sense of failure but bound by a sense of duty (Botz-Bornstein). Near the end of the film, the Stalker’s wife addresses the audience, breaking the fourth wall to profound effect. Looking directly into the camera, this at once devoted yet lonely woman speaks of her husband’s physical and mental absence like a widow, describing him as a holy fool: one so devoted to an individual calling that they inadvertently doom themselves to failure. Like Camus’ vision of Sisyphus, the fool carries their burdens both for and despite the gods. Cursed by their sense of duty, yet transformed by it, they exist in a foreign place that is familiar yet hidden. A Zone.
And it is in this Zone, this metaphorical mental battleground full of puzzles and secrets, that the Stalker finds his happiness: a tiny speck of hope for something better (Pourtova 779). The three men return from the Zone, the room’s threshold untouched by any of them. The Stalker returns home and retires to his bed, riddled with existential angst and despair, crying out for understanding and peace, repeating the words “It isn’t enough” (evoking visions of Nietzsche in his final madness). Despite his torment and mental anguish, he clings to some small piece of curiosity, though some may call it hope.
The film concludes with the Stalker’s disabled daughter, Monkey, using telekinetic powers, no doubt caused by her father’s time in the Zone. Though it remains hidden from him, the change he so desperately wants and the freedom he desires lie in the very place he paradoxically abandons in search of them: home. This is the tragedy of the Stalker, and no doubt often of man himself.
Tarkovsky’s Stalker is an existential cinematic experience depicting man’s struggle for freedom and salvation. It is an analogy of man’s struggle for meaning in a meaningless world, with the Zone acting as a reflection of each character’s mind (each full of secret desires, regrets, and horror). Yet it is also a metaphor for hope. The Stalker continues to take others into the danger of the Zone in the hopes that he will find the key to happiness yet never dares to reach for that key himself. Like Sisyphus, one must imagine the Stalker happy, as he creates his own meaning in the face of a stark reality he refuses to accept. This is where he finds his freedom, his hope, and, ultimately, peace.
Works Cited:
Andrei Tarkovsky as an Existentialist Cinematic ..., https://www.highonfilms.com/andrei-tarkovsky-existentialist-cinematic-philosopher/.
Botz-Bornstein, Thorsten. “Realism, Dream, and ‘Strangeness’ in Andrei Tarkovsky.” Film- Philosophy, vol. 8, no. 3, 2004, doi:10.3366/film.2004.0028.
Pourtova, E. “Andrei Tarkovsky: Stalker of the Unconscious.” Journal of Analytical Psychology, vol. 62, no. 5, 2017, pp. 778–786, doi:10.1111/1468-5922.12365.
Tarkovsky, Andrei. Stalker. Janus Films, 1979.
Tarkovsky's Stalker: existentialism and mental health, https://www.thelancet.com/journals/lanpsy/article/PIIS2215-0366(19)30474-2/fulltext.
In the United States, today is the Super Bowl: the annual sporting event that generates over $500 million in revenue through $7 million ad slots and halftime shows whose total costs exceed $13 million. Profits aside, the Super Bowl is also a damning metaphor for the transformation of desire into spectacle under capitalism.
French philosopher Jean-François Lyotard’s controversial yet often misunderstood postmodern philosophy offers a distinct lens through which to investigate this phenomenon, specifically his critique of grand narratives. According to Lyotard, universal ideologies such as Marxism or Enlightenment progress have collapsed and been replaced by competing micro-narratives. The Super Bowl is a perfect metaphor for this shift, substituting collective political goals with hyper-individualized consumption, such as luxury box seats marketed as “communal experiences” despite their exclusivity. Prioritizing performance over cultural authenticity mirrors Lyotard’s claim that capitalism reduces even communal traditions to pure engines of profit.
Lyotard’s concept of the ‘libidinal economy,’ an economy driven by libidinal 'energies' or 'intensities' that flow through all structures (including the human body and social events), concerns the distribution and arrangement of desire and identification, and the complex relationship between sexuality and the unconscious. It hints at capitalism’s reliance on raw, often irrational desire rather than ideological coercion. The Super Bowl’s halftime spectacles and limited-edition merchandise channel collective longing into commodified experiences, sustaining the system through emotional intensity. This “cathexis of desire,” as Lyotard calls it, reflects his view that capitalism thrives by diverting human energy into consumptive acts, where the anticipation of purchases (such as planning Super Bowl parties) often outweighs the satisfaction of ownership.

Yet this system breeds an interesting paradox: while capitalism frames consumer choice as self-expression, it also homogenizes identity. Organic food purchases, for example, marketed as acts of individuality, mask structural inequities (exploitative labor practices, premium pricing) that exclude marginalized groups. Lyotard’s skepticism of universal progress narratives illustrates how such “choices” often reinforce systemic inequality as dissent becomes co-opted. This can be clearly seen in the corporate greenwashing campaigns strategically advertised during the Super Bowl. Yet the Super Bowl is anything but green, and the overall costs are steep. Environmentally speaking, a Super Bowl stadium generates up to forty tons of waste per game, a sobering reminder of capitalism’s blatant neglect of sustainability. As for exploitation, the luxury goods produced en masse for the big event are often riddled with labor and human rights abuses.
Such byproducts of capitalism are hidden behind brightly colored, mass-produced products and experiences that, while entertaining, do little to improve the consumer’s quality of life.
Lyotard dismissed universal justice narratives, which clarifies society’s tolerance for such inequities when they are presented as localized issues rather than systemic failures. Resistance, therefore, lies in the “little narratives” that challenge capitalist logic without claiming universal solutions. Worker cooperatives like Evergreen redefine corporate hierarchies, while Patagonia’s “Don’t Buy This Jacket” ad subverts consumption, even as its irony (boosting sales despite its anti-consumerist messaging) underscores Lyotard’s warning that critique can reinforce the very system it targets.
The Super Bowl, in its gratuitous displays of excess, serves as a stark metaphor for capitalism's inherent contradictions, showcasing human creativity and technological prowess while simultaneously revealing the system's capacity for alienation and inequality. The hyperreality it constructs, while offering a seductive escape from the trials and tribulations of life in a capitalist system, ultimately reinforces the existing power structures. The challenge of navigating this landscape lies in harnessing our collective desires and creativity towards more equitable ends.
The emergence of alternative economic models and the resulting shifts in consumer consciousness may offer a temporary sense of hope, but, without a fundamental reimagining of our economic systems (and social values), such efforts risk being consumed by the relentless logic of capital accumulation. The Super Bowl ultimately stands as a shining monument to capitalism's innate ability to commodify every aspect of human experience, from sport to social justice: a reminder that true liberation cannot be found in the shallow waters of hyperreality but in genuine social and economic transformation.
References:
1. Environmental Protection Agency. (2024). Textiles: Material-Specific Data. EPA.
2. Evergreen Cooperatives. (2025). About Us. Evergreen Cooperatives.
3. First Insight. (2024). Gen Z Shopping Behaviors Support Sustainability. First Insight.
4. IKEA. (2020). IKEA launches furniture buy-back and resale program. IKEA.
5. International Labour Organization. (2024). World Employment and Social Outlook. ILO.
6. Knox, S. (2025). Late-stage capitalism has made the Super Bowl too expensive even for the millionaires. Deadspin.
7. Lyotard, J.F. (1984). The Postmodern Condition: A Report on Knowledge. University of Minnesota Press.
8. Lyotard, J.F. (1993). Libidinal Economy. Indiana University Press.
9. Lyotard, J.F. (1988). The Differend: Phrases in Dispute. University of Minnesota Press.
10. Nielsen. (2024). Super Bowl LVII Draws Nearly 115 Million Viewers. Nielsen.
11. Patagonia. (2011). Don't Buy This Jacket - Black Friday and the New York Times. Patagonia.
12. Santino, J. (2016). The Super Bowl: America's Holiday. Sport in American History.
13. Waste Management. (2024). Sustainability Report. Waste Management, Inc.
14. World Economic Forum. (2024). The State of Fashion: 2024. McKinsey & Company.
The spoken word has always been a challenge for me. Social phobias and anxiety transform my speech into something clumsy and stilted, especially in public settings, where every word feels like it is under scrutiny. On these occasions, I worry that my words do not carry the authority or intention I mean to convey; that my performance (because that is what speech often feels like) is lacking in the eyes of those who hear it. My voice shakes and falters, betraying the very confidence I try so hard to project. I wonder if people see through me, if they notice the cracks in my composure, the doubt that seeps through.
For me, the written word is preferable, but no less faulty. Writing offers the luxury of revision: the ability to write and rewrite thoughts until they feel polished, but even this process has its flaws. Memories can be misremembered or forgotten altogether, leaving gaps that imagination eagerly fills. Fiction becomes reality as I unconsciously embellish details to make sense of events or to soften their edges. The written word may feel more deliberate than speech, but it is no less susceptible to distortion.
We are all unreliable narrators of our own stories. We erase what is hurtful, damning, or inconvenient. We rewrite roles for ourselves, recasting our most villainous pursuits as confused or misguided, wholly ignoring our intentions or, worse yet, the consequences of our actions. We are like actors playing the heroes in narratives that are often anything but heroic. And while we may not be the monsters we believe ourselves to be, we must admit that some are among us.
French philosopher Jacques Derrida’s famous assertion that “there is nothing outside the text” (French: Il n’y a pas de hors-texte) offers an interesting lens through which to examine this struggle between communication and self-perception. While many take this statement to suggest a world confined entirely within language, where nothing exists beyond words, Derrida’s meaning remains notoriously misunderstood. What he is saying is that all experience, whether spoken, written, or otherwise, is always mediated through interpretation. The text, in this sense, is not limited to writing but includes all forms of signification and meaning.
This parallels my personal experience with spoken and written communication. If everything is an interpretation, my words, no matter how carefully I choose them, are subject to constant reinterpretation by others (and by myself over time). Speech and writing are not pure channels for thought but are shaped by context, history, and the biases of both speaker and listener. Derrida’s thoughts on the impossibility of fully capturing or controlling meaning both frustrate and liberate me.
Whenever I write or speak, I am constructing a version of myself, a text, that will be read differently depending on who is reading and under what circumstances. This applies even to myself: my own understanding will inevitably change over time. This endless process of interpretation echoes Derrida’s best-known concept, deconstruction: the process of revealing hidden assumptions and contradictions within texts, recognizing in the process that no single interpretation can claim absolute authority.
Part of my struggle may lie in the significance I place on the idea of authenticity. I want my words, spoken and written, to reflect who I am. Yet defining my self feels like an impossible task. Am I the anxiety-ridden performer stumbling over my words in public, or am I the careful writer who crafts each sentence with exacting precision?
Perhaps I am both. Or neither, depending on the moment in time.
Derrida reminds us that authenticity is a construct shaped by language and interpretation. If there is no fixed meaning outside the text, then there may also be no fixed self; only a series of narratives we create and recreate as we navigate the world. This does not reduce the importance of seeking honesty in communication but invites us instead to embrace the fluidity of identity and meaning.
What I have learned from my struggles with communication is that perfection is something neither attainable nor necessary. The beauty of words, whether spoken or written, does not lie in their impeccability, but in their ability to connect us with others. Even the clumsiest of words can carry meaning. Even the most imperfect stories can reveal certain truths.
Though I may never speak or write with ease or without doubt, I am learning to find grace in my imperfections. While my words may falter, fail, or fall short of what I want to express, they are mine.
Perhaps that is enough.
Reference:
1. Derrida, J. (1967). Of grammatology (G. C. Spivak, Trans.). Baltimore, MD: Johns Hopkins University Press.
“One of the recurring philosophical questions is: 'Does a falling tree in the forest make a sound when there is no one to hear?' Which says something about the nature of philosophers, because there is always someone in a forest. It may only be a badger, wondering what that cracking noise was, or a squirrel a bit puzzled by all the scenery going upwards, but someone.”
-Terry Pratchett, Small Gods
The movement known as humanism began in 14th-century Italy with Francesco Petrarch (1304-1374), who promoted a curriculum based on ethics, grammar, and poetry, effectively laying the groundwork for an educational archetype that emphasized human dignity and civic virtue. By the 15th century, humanism had spread throughout Europe as an attempt to reconcile classical ideas with Christian teachings, emphasizing moral philosophy over abstract theology. The movement would inspire Renaissance artists such as Donatello and Michelangelo to embrace humanist principles, particularly the belief that art could elevate not only the individual but the whole of society.
The 18th century, however, marked a drastic turning point in humanist thinking, as Enlightenment thinkers expanded beyond classical revival to embrace scientific rationality and political reform. Philosophers such as Baruch Spinoza, Jean-Jacques Rousseau, and Thomas Paine began to redefine divinity as the totality of nature and to frame human rights as inherent and universal. Ludwig Feuerbach and Karl Marx criticized traditional institutions, arguing that religion alienated humans from their true potential, while Charles Darwin presented a naturalistic explanation for human existence that eschewed theological claims of divine creation. Such critiques laid the foundations of secular humanism, which prioritized empirical evidence and ethical autonomy over religious dogma.
The 20th century would see the rise of institutionalized humanism, which placed the responsibility for leading an ethical life of personal fulfillment on human beings rather than the supernatural. Advocates called for compassion, global cooperation, and reason as they addressed contemporary issues like environmental sustainability and civil rights. 1941 would see the founding of the American Humanist Association, which championed secular governance and science while adapting to the rising challenges of modernity.
Over time, humanism has adapted to address the changing needs of its era, promoting the honorable core tenets of empathy, reason, and the pursuit of knowledge. Grounded in human potential, it is an attempt to transcend divisions and embody the Renaissance ideal of excellence in service of the common good. However, I posit that there is an inherent danger in the movement, one grounded in oppressive structures, that complicates its legacy and renders it inadequate, even dangerous, for addressing contemporary challenges.
Environmental proponents (myself included) indict humanism for legitimizing ecological exploitation through its anthropocentric worldview. Anthropocentrism, the belief that humans are the most valuable creatures on the planet, lies at the root of global crises like biodiversity loss and climate change. Experts argue that humanism creates a definite schism between humans and nature, repurposing the Earth’s ecosystems as a resource to be used and commodified instead of nurtured. It is humanism’s narrative of mastery and progress, I argue, that accelerates our current environmental collapse.
Humanism’s ethical focus on human flourishing, while well intended, fundamentally ignores nonhuman interests, perpetuating the dominant mentality that nature exists solely to serve human needs. This undermines the idea of biodiversity conservation and reinforces our destructive and unsustainable consumption patterns. Even social appeals for environmental stewardship hide an innate belief in human exceptionalism, placing it over the importance of environmental humility. Furthermore, secular theorists echo concerns over the legacy of humanism, framing it as a Eurocentric project that masks imperial violence while legitimizing hierarchies under the guise of enlightenment.
I am not, however, advocating the religious criticism of humanism, which condemns the movement for eroding moral absolutes, making it a scapegoat for moral relativism, free sexuality, and the disintegration of traditional family values. Rather, like Foucault, I am rejecting the idea of a fixed human nature, and advocate instead for a critical ontology that seeks to reinvent subjectivity beyond Enlightenment constraints, echoing Nietzsche’s charge to overcome the human by embracing creativity and multiplicity over rationalist dogma.
This proposition rests on the following arguments.
1. Human agency is an illusion shaped by ideological/material forces beyond our control.
2. Anthropocentrism creates environmental exploitation which jeopardizes the stability of the planet.
3. Universalist claims hide exclusionary practices that fundamentally reinforce capitalist, colonial, and patriarchal systems.
4. The category of ‘human’ fails to account for posthuman realities.
This final point is derived from Jacques Derrida’s later work, The Animal That Therefore I Am, in which he not only critiqued humanism but dismantled the epistemological foundations of anthropocentrism, revealing its contingency and violence. In our Anthropocene, this deconstruction is an ethical necessity, forcing us to recognize nonhuman agency and redistribute our responsibilities, reimagining humanity as part of the ecological network instead of placing ourselves above it. Derrida offers a path beyond humanism’s innate exceptionalism towards an ethics of shared vulnerability and care.
To be clear, I am not arguing against humanism’s ethical aspirations but am seeking to challenge its foundational assumptions. Perhaps the task is to recreate humanism in such a way that human and nonhuman animals coexist under the umbrella term ‘animals’? Whatever form humanism takes, if we are to live sustainably, humankind must learn to reclaim its rightful place within nature instead of claiming mastery over it.
References:
• https://www.britannica.com/topic/humanism
• https://www.bop.gov/foia/docs/humanism_manual.pdf
• Derrida, Jacques. The Animal That Therefore I Am. Translated by David Wills, Fordham University Press, 2008.
• https://thehumanist.com/commentary/humanism-and-history-a-brief-look-at-our-resolutions-and-social-justice-issues/
• https://www.worldhistory.org/Renaissance_Humanism/
• https://www.worldhistory.org/timeline/Renaissance_Humanism/
• Nietzsche, Friedrich. Human, All Too Human: A Book for Free Spirits. Translated by R. J. Hollingdale, Cambridge University Press, 1996.
• Pratchett, Terry. Small Gods. HarperTorch, 1992.