Whither Original Thought

This essay's debut reading by the author, Kent Pitman, aired on the May 14, 2025 episode of the Lispy Gopher radio show. It's about 12 minutes long (from 5:43 to 17:54 in the show).

A variant on the famous sculpture by Rodin called The Thinker, but in this case the thinker is seated in a rolling office chair, staring at a computer screen, his wastebasket full of wadded up paper. Except it's a statue, so all of this is cast into carved marble.

Did the great philosophers usher in the birth of thought or the death of it?

These great thinkers gave organization to the discipline of philosophy for us, but must it be organized? Must it be a discipline? Or is it, like religion, something that can and must tolerate not just variety but continual rebirth?

Is a Catholic remiss for not having studied to be a Protestant--or a Buddhist? Or is it legitimate simply to have faith?

Can one write a great book without having read any?

I’m not talking probabilities here, but absolutes. Does legitimacy rightly rest on result or on pedigree?

For one who is longing for inspiration or handrails, options to explore what others have done abound. But is there shame in standing tall and inspired without having peered into the store windows or sifted the garbage bins of others who’ve come before?

If everyone with a good idea got it from someone else, who did that person get it from? Are we so sure that new ideas come from study and not from lack of it?

If I do occasionally think interesting thoughts, why must I frame them in terms of deltas--of minor incremental changes--from the work of another philosopher, as if my only contribution to thought can be an incremental improvement on what has been previously done? Can there be no credit for independent invention? Was my invention of the thesaurus or the computer hard disk of no interest just because someone else independently invented these first?

We might use incremental comparison to shorthand an experience at a cocktail party. “It was just like my childhood, but with student loans and beer.” But do we revere as groundbreaking those architects or artists who present their work as incremental variations of other works? Could a new last page to a great novel win a literary award? Do we ask a great author to describe his characters as variations of familiar characters from other books so that we can see more clearly what incremental contribution they have made over works that have come before? How strange that would be.

And why must I study the ancient or even newly fallen philosophers, and not they me? Well, there is of course the minor issue of death, but is that really fair? Because I have not had the good fortune to die are my thoughts less valid? Were I to fling myself from a balcony or slip with a razor shaving, would this make my arguments immediately more valid? “Think you can win an argument with me? I’ll show you!” I cry as I dive from the bridge into the waiting ocean. “He was such a deep thinker,” my obituary will surely read.

I was around for the not-quite-beginning of computer science--or at least of modern software engineering. A few came before me but comparatively few. It was a time when computers were mostly blank and we made programs from little more than nothing. More importantly, we actually made things, created them from whole cloth, building them up from smaller to bigger.

Today if programmers expend time thinking how to make something from scratch, we’re told it’s probably already there on the net somewhere. “Just download some packaged thing, grab the relevant parts and be done.” It’s as if we were hired as dress designers and then admonished to just find an existing design at amazon.com, slap on a few accessories, and ship it as art.

No need to ever begin anew when now there’s cut&paste. Mix and match can be empowering to some, and time-saving to anyone. It frees you of the need to always create. But to have no right to ever reconsider, to challenge, or to recreate the past? That’s disempowering. And just sad.

When I was first learning to program, there was quite literally no one there to tell us what not to think. We had the joyous freedom to think, and people valued us for it. There was a premium on thinking, and no one fussed over the details of what we had done before or what we knew of others’ deeds but merely valued us for our potential thought and deed, building forward.

Today that world has fallen away. It’s as if someone in Thought Control Central thinks the hard thought is now done and well-tested. It’s time for some detailing, and then we can just be done altogether. To go back and think from the beginning is of little value to whoever runs this modern show. It’s a distraction from the commercial pressure to just “get things done.”

Programmers bow to units of packaging called libraries. An apt metaphor in fact, because I think they often represent the packaged death of thought, in the same way that great philosophy texts can represent as much the death of thought as the birth of it. Grand thoughts, once having been thought, need not be revisited. Any goal of--much less right to--the simple joy of discovery is outright pooh-poohed.

I find it hard to concentrate amidst the constant din of dead philosophers, so I sometimes struggle not to read, or at least to defer reading. I’d rather think first and then find afterward how my own thoughts compared to others that came, sometimes only by accident, before me.

Paradoxically, the push to care ends up being a veiled push to not care. That is, if I can be made to care enough to study what others have thought and said, then perhaps I’ll find less need to care about muddying the waters myself with similar thoughts differently shaped or expressed. The risk that I might say again the same thing as has come before, or some uselessly different thing, but in different words, is apparently unbearable, if not offensive, to many. New ideas should be easily recognizable and not make one have to think so hard. Usually best to be silent, or at least for curiosity to be properly vetted, properly directed, properly efficient. Better to rely on old ideas as much as possible. We are a society comfortable with our status quo. “Can’t innovation just be more of what we have already?” it seems to cry out.

After all, what’s on the shelves already is, as my career-army father would often say, “good enough for government work,” by which he meant the government isn’t concerned with perfection, just enough to stay ahead of the wolves. Government bureaucracy as a standard for adequate thought. Well, my dad at least seemed to say it with a knowing understanding that it wasn’t the pinnacle of everything, merely relief from the sense that one must be responsible for everything. To make an apple pie from scratch, Carl Sagan reminded us, one must first invent the universe.

A right not to care sometimes is fine. As I write this, I’m enjoying some well-earned time off-grid here in Italy. It’s good to have time to live away from life, a time to not care. I don’t get much of that. Not caring is good for a while, but a life of not caring is not good. And the idea that one might be held back from thought is downright depressing, so the tiny semantic slip from “no need to care” to “the need not to care” is important.

Computer libraries are not just handy tools for quickly getting up to speed, they are the tyrant that philosophy becomes, preying on those with the courage to confront a curiosity about life’s purpose, ethics, or the existence of God on their own terms.

You have read Kant, or Nietzsche, or Wittgenstein, haven’t you? I’m sorry, but I haven’t, or perhaps haven’t yet. I don’t close the door. But I’m a slow reader. I’m sure they were great guys with many interesting thoughts. I wish I’d known them personally and could have had coffee with them to chat. I adore such things and have no disinterest in them. I gravitate to people who think such thoughts. But if I must spend my time reading them, I won’t have thoughts of my own. So I read a little, and am grateful for that time, but I also just think anew. And I celebrate the fact that others read. I’m not anti-reading, just pro-choice. I chat up people who read faster than I do, adoring the interactive exploration with those lucky ones who have time both to read and to think, or occasionally just to read and to recite.

I celebrate my illiteracy. OK, perhaps not complete illiteracy, because I have done some reading. But let’s be frank, my reading has been more haphazard perhaps than that of others. I just try to turn that to a strength. Embarrassed as I am when people ask me about obviously famous philosophers and I blankly look at them and say “sorry, not sure,” I am also proud. We all have handicaps, but also strengths. My fear, or often just indignation, is not that there is illegitimacy in what I do but just that the will of society to judge is so strong that I will be judged illegitimate for that which is intentionally me. I actually think anew and am not convinced by all these better-read souls that this is a bad plan for life, even as they are occasionally willing to threaten my right to eat over it.

I’m not too proud to look up a word. A friend asked me about epistemology the other day, and I said I didn’t know, that I’d have to look it up. Good word, it turns out. It seems to name what I have done a truckload of in life. I learned from many who probably knew the word, but that doesn’t mean I had to know it. And I’m happy to have it. But neither was I impeded without it. Perhaps better, I was allowed the luxury of shaping such a concept on my own and to my own spec, or the chance I might discover something new and subtle and beautiful along the way, from a fresh angle, unburdened by vocabulary, unintimidated by the past.

A library, whether a computer library or a library of philosophical teachings, is a lovely thing as long as it is merely optional. Whether I want to visit on a once-a-year pilgrimage or end each day curled up in a chair reading and shooed out by a tired librarian wanting to go home, it is my option, and option is a beautiful thing.

If legitimacy of thought is premised on my having read what’s in a library, then each new addition to the stacks chips away at my time on Earth, delegitimizing my thoughts. Libraries, then, become enablers of tyranny, if not a first-class tyranny in their own right. And at that point, we might as well burn every one of them and free ourselves from the chains of “it’s already been done.”

Kent Pitman, June 2014, Castiglion Fiorentino, Tuscany, Italy

Copyright © 2025 by Kent Pitman.

An Important Footnote. This is what I’d call a “mood piece,” an expression of a particular feeling or sentiment, not a specific proposal other than cause to rethink certain assumptions from a new point of view. Under no circumstances should my remarks here be construed as anti-science or a condemnation of the idea of learning or learning institutions. And especially the idea of ever burning anything down should be seen only as a metaphor and not a specific call to action. I shouldn't have to say these things, but the world has gotten weird, and ugly, and occasionally far too literal. I still reserve, and here also exercise, my right to colorfully critique even things I support.


Author’s Notes: If you got value from this post, please “Share” it.

I would even go so far as to say the periodic questioning of those things we hold dear is a moral imperative. I owe a detailed explanation of that thought, but on another day.

The graphic was created at abacus.ai using Claude Sonnet 3.7 and Seedream. I did post-processing in Gimp only to reduce the overall size and to change the image format from png to jpg.

I wrote this in early June 2014 while at a writing retreat hosted by Cary Tennis at Le Santucce in Castiglion Fiorentino, Italy.