the nonsense review 001
2025-05-26 22:58:56
Information has become a form of garbage, not only incapable of answering the most fundamental human questions but barely useful in providing coherent direction to the solution of even mundane problems. To say it still another way: The milieu in which Technopoly flourishes is one in which the tie between information and human purpose has been severed, i.e., information appears indiscriminately, directed at no one in particular, in enormous volume and at high speeds, and disconnected from theory, meaning, or purpose.
All of this has called into being a new world.
Neil Postman, 1993, from the book Technopoly: The Surrender of Culture to Technology
Sensemaking is the umbrella term for the action of interpreting things we perceive. I engage in sensemaking when I look at a pile of objects in a drawer and decide that they are spoons — and am therefore able to respond to a request from whoever is setting the table for “five more spoons.” When I apply subjective values to those spoons — when I reflect that “these are cheap-looking spoons, I like them less than the ones we misplaced in the last house move” — I am engaging in a specific type of sensemaking that I refer to as “meaning-making.”
Vaughn Tan, 2024, from the article “AI's meaning-making problem”
When we discuss faculties like discernment, taste, subjectivity, intellect, etc., we have to confront the reality that these are associated with one’s will to act and with the performance of actions themselves. This is beyond the scope of Tan’s article, but judgement is the logical predecessor to action.
There’s a lot of constriction and manipulation in this domain that needs to be undressed, Inshaa Allah.
Meaning-making is about subjective choice, which is why it underlies judgment, aesthetics, taste, and fashion. Everyone does it, but not everyone does it equally well and equally consciously. The human ability to make meaning is inherently connected to our ability to be partisan or arbitrary…
This passage got me thinking about the concepts of “free speech”, “free will” and “freedom”: how these are all employed according to Western ideals, and how they are the fundamentals of a socially constructed narrative born of the cultural bubble of self-determination in Western civilization.
From the Enlightenment era to the American Revolution, with gestation periods during the Revolution in France, onward toward Western industrialization and hegemonic expanse, heaving through two World Wars and affecting the generations that followed, nearly coming to a halt during the conflict in Vietnam, until finally emerging worried and worn against the twilight of the “War against Terror” and left to lie emaciated by the onset of what I call “Disutopia” between 2016 and 2021.
So my gut is telling me that with the epistemic vacuum that AI is likely to contribute to over the next few years, hostility is on the horizon.
I’m not talking about something like “general intelligence” (an AI with humanlike intellectual capabilities, which I don’t believe in anyhow). AI doesn’t have to get any better than it is right now, and in my experience it’s already quite serviceable. I think society has been plodding down a winding road through an open field, a road about to end abruptly with no consensus as to which way to turn next.
The dilution of meaning among humans will be exacerbated, not caused, by AI.
Related to Tan’s article and the Neil Postman quote that prefaced this post is Scott Werner’s The Coming Knowledge-Work Supply-Chain Crisis.
The average person probably has no idea what a “knowledge worker” or a “pull request” is. That’s okay. The point is that AI generates more data than the people who deal with data for a living can manage. The average person, if thoughtfully provoked, can come to discover that we are already swimming in data.
How’s the water?
I’m almost finished with The Technology Trap by Carl Benedikt Frey. It’s a long read. Skip to the later chapters, where all the historical context laid out in the beginning is cross-referenced; you’ll appreciate the author’s sober perspective. One caveat is that it was published in 2019. Usually a six-year-old book wouldn’t seem too far removed from being timely, but a lot has changed in society, politics, technology and the economy since then. Or has it?
Since I started it, articles like “Thinking is being outsourced to AI” humor me.
If AI was actually being used to maximize the collective good it would be trained to do menial and cumbersome tasks, not creative work that replaces jobs and supplants critical thought. But it’s not meant to ease or empower. More and more of us are being deliberately made “redundant” as bosses attempt to remove power from workers and consolidate it among themselves.
The same could’ve been said about electricity, steam power and computers, with the exception of “critical thought”, unless you count the printing press and the internet (although most people thought the internet would bolster and democratize critical thinking, or so I’ve read).
Additionally, what counts as “menial and cumbersome tasks” is a matter of interpretation and responsibility. I may be wrong, but technically speaking, AI (i.e., a large language model) is trained to do math with words. It just so happens that you can get a lot of work done that way, in the same way you could’ve gotten a lot done with hot water a few centuries ago.
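For the sake of illustration, here is a deliberately toy sketch of what “math with words” means in practice: words are mapped to numbers, and guessing the next word is arithmetic over those numbers. Everything in it is made up for illustration (a five-word vocabulary, random stand-in weights); real models learn their parameters from enormous amounts of text, but the basic move is the same.

```python
# A toy illustration of "doing math with words": words become numbers
# (vectors), and guessing the next word is arithmetic over those numbers.
# The vocabulary and weights here are invented; this is not a real model.
import numpy as np

vocab = ["the", "water", "is", "hot", "cold"]
token_id = {word: i for i, word in enumerate(vocab)}

rng = np.random.default_rng(0)
embeddings = rng.normal(size=(len(vocab), 8))  # each word becomes an 8-number vector
weights = rng.normal(size=(8, len(vocab)))     # stand-in for learned parameters

def next_word_probs(context):
    """Average the context word vectors, multiply by the weight matrix,
    and squash the result into a probability for each word in the vocabulary."""
    vecs = np.array([embeddings[token_id[w]] for w in context])
    logits = vecs.mean(axis=0) @ weights
    exp = np.exp(logits - logits.max())        # softmax
    return exp / exp.sum()

probs = next_word_probs(["the", "water", "is"])
for word, p in sorted(zip(vocab, probs), key=lambda pair: -pair[1]):
    print(f"{word}: {p:.2f}")
```

Turn words into numbers, do arithmetic, turn the numbers back into a guess about the next word. That’s the whole trick, scaled up.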
As Frey mentions, historical data suggests that productivity increases driven by technological advancements don’t result in people working less out of want for leisure. They work less because the technology proves their job to be, well, menial and cumbersome. The people whose jobs are safe tend to work more—with the exception of the rich guys who weren’t working that often to begin with.
Rich men wear leisure suits. A working man wears a robe, if anything at all, while he gets ready to go to work.
I wrote down somewhere that AI is the first major technology to threaten the livelihood of the skilled, educated, “white-collar” world (i.e., “knowledge workers”), and that if it becomes as impactful as it’s feared to be, it would flatten the socioeconomic strata separating the people who get depressed and order DoorDash from the people who are depressed delivering DoorDash.
This decade has been psychologically hard on the upper-middle class, per their custom of fashionable lateness.
Although I do agree with this,
People will graduate college not knowing how to write, hardly knowing how to read, and only developing the ability to prompt machines into generating words for them. There were and are deep issues with our education system, a system that in many ways has looked more like running kids down an assembly line than fostering their individual traits and strengths. But this compounding crisis is creating a newfound generational gap. If it isn’t already clear, it soon will become apparent that there is a divide, a pre-AI generation and a post-AI generation. The critical thinking gap, in addition to massive gaps in numerous other skills and cognitive abilities, will make itself more visible day by day.
This article has a lot of graphs and charts that challenge the common conception of what it means to be able to read.
As of right now I’ve recorded one confirmed case of the American public school system graduating a young woman who cannot read or write—with honors.
The Age of Myth-Making - Untangled with Charley Johnson [1]:
In Orality and Literacy, Walter J. Ong traces the evolution from orality to literacy, and along the way, conveys the power and import of reading and writing. If we didn’t evolve to read and write, we wouldn’t have knowledge as we know it. As Ong writes, literacy is “absolutely necessary for the development not only of science but also of history, philosophy, explicative understanding of literature and of any art, and indeed for the explanation of language.” Don’t believe him? Try to do a li’l calculus or explain Plato’s Republic without reading or writing. As Nicholas Carr writes in The Shallows, “The written word liberated knowledge from the bounds of individual memory” and “opened to the mind broad new frontiers of thought and expression.” Our reading, writing, and critical thinking habits are continually changing in big ways.
Orality and Literacy is a good book. It must have been a major work for its time because a lot of people refer to it.
The above article also refers to a statement by Marshall McLuhan, teacher and colleague of Neil Postman:
In an age of information overload, I think we’re likely to turn increasingly to myths. As Marshall McLuhan foresaw, “When a man is overwhelmed by information…he resorts to myth. Myth is inclusive, time saving, and fast.” Myth offers a structure and context for efficiently making sense of new information. It does so by tapping not into reason but emotion and the subconscious mind. Take the rise of conspiracy theories. As Carr writes, their spread has “less to do with the nature of credulity than with the nature of faith.”
Postman’s Technopoly and his later work Building a Bridge to the 18th Century: How the Past Can Improve Our Future address the significance of religion and narrative in “meaning-making”, that is, in managing information and statements about the world around us.
All of the articles I shared here are from writers who are on Substack, which serves you a heuristics-driven feed of related material, like a Twitter for humanities majors. So this is me acting as something like a human algorithm, making sense of all this writing about things not making sense, both already and soon.
-
[1] In connection with the earlier article about AI and knowledge-work bottlenecks, the author of this article is holding a free workshop on “how to critically analyze data”. ↩