15 Comments
Brandon

This whole AI thing has been really weighing on me emotionally the last couple of months. I have this crazy feeling of loss, like something has broken and won't be put back. I have been half-hoping that a solar flare would set us back 50 or 60 years, just to give me more time - before what, I don't know.

I am hoping now, like el gato malo says in his piece linked below, that the rise of AI and the knowledge that almost anything could be fake, could be a scam, may force us to lose "trust in anything out of immediate sensory sphere" and thus ruin "long distance trust" and globalization, thereby forcing us to return to real life and the real world.

https://boriquagato.substack.com/p/will-ai-shrink-the-world?selection=1568c0cd-2c6b-4f6a-bd27-adf1499cc285

The Delinquent Academic

I felt a bit like that a year ago, but whether technology continues to improve, destroying what's left of our humanity, or a combination of factors devolves our technology use and our understanding of it, I've tried to be at peace with either outcome, and to search for a little cranny in the world where I can just chill out and do my thing.

I do think Frank Herbert’s idea of the Butlerian Jihad is genius, and that there is already a reaction against AI. I saw a Ted Gioia piece the other day that suggests … everybody basically hates AI. It would be interesting if the consumerist market turned away from leaning into AI productions - but to trust the market on such a thing seems naive.

Brandon

Yeah, I saw you mention the Butlerian Jihad the other day and had to look it up. Such a cool idea! I'd join that jihad.

The Delinquent Academic

Haha, if it happens in like, ten years, we'll also be fighting all the nerds who want to keep their AI girlfriends lol

A.C. Cargill, All-Human Author

I share your fears. We need to stop AI. Just because something can be done doesn't mean it should be.

The Man Behind the Screen

As with anything, AI/LLMs are ultimately tools developed for human use. And as with any tools, they can be misused and abused, leading to them becoming detrimental and potentially harmful. We have seen some of the results of that in creative fields.

My stance on AI is largely in line with your own. I don't care for it. I don't want it. I won't use it, at least not to my active knowledge. My reasons for this are many, ranging from a lack of reliability on the research side - you mentioned the potential usefulness of such models for answering simple questions quickly, but I've seen many heaping double-handfuls of instances where the answers to even simple questions presented by these LLMs are flat out wrong - to the one we most closely align on: the question of personal integrity.

I think where we differ most is that my distaste for it does extend to others. When I see obvious signs that others have used AI in their work, I'm inclined to avoid it. It's not a 100% thing; there are exceptions, some of whom use this very platform. Given the creative background I have, though, a pang of disgust inevitably forms in my stomach when I see writers turn to AI for accompanying artwork, or illustrators turn to it for blurbs of text.

Ultimately, the tool is here to stay. I don't think we're going to get rid of it, so I'll instead hope that we can refine it into something that's more positive than what it is.

I'll hope for that, but I won't, and don't, expect it.

The Delinquent Academic

Well said bro - and you're right that the tool is here to stay. It is very hard to imagine how it will change humanity's interaction with art and the role of the artist in the future; given that, from cave paintings in the Paleolithic to Medici patronage in the Renaissance to Tolkien's influence on the West, I cannot imagine a world where art is not important. How it will be important is an interesting question. Will the humble writer, like you and me, still exist? In some manner, I believe we have to. Writing IS thinking. And without thinking … well. However, the corruption in the schools and university system means whole generations are being raised who cannot write and therefore cannot think. We have entered the post-literate age, where intelligence may be declining. And I do share that same disgust you describe when I see it - I'm just not going to let it ruin my day anymore haha (the machines are coming ahhhhh! lol).

James the Hun

I absolutely adore Gattaca and that scene.

Anton: How are you doing this, Vincent? How have you done any of this?

Vincent: "You wanna know how I did it? This is how I did it, Anton: I never saved anything for the swim back."

The music is killer, too.

Sometimes I watch it just for inspiration.

https://www.youtube.com/watch?v=GM-znjDGubE

Stephen Riddell

I tend to agree about not using LLMs in creative writing, especially for core editing functions like reshaping the structure of a narrative. I tested out Google Gemini when I was writing a setting guide for a D&D game, and while the suggestions it made were alright for that very basic kind of creativity, it is ultimately a parrot that only produces derivative ideas.

While this didn't really matter in a D&D setting guide, I'm currently attempting to write a fantasy novel for young teenage boys. I've used Gemini for spellchecking so far, but I don't need an LLM to tell me how my story could better conform to the Hero's Journey or some other hackneyed plot structure.

That being said, I'm not convinced by Bukowski's vision of the artist. As an artist who has actually gone mad, and experienced many of the other things in his list, it feels very good to have the 'fire' burning in you at the time. However, I was actually just being pig-headed and one of the key steps in regaining my sanity was to realise how shallow this vision of life as an artist is.

This vision is obviously very attractive to young men - pretty much everyone I knew believed it - but it is much easier to make art when you aren't cynically detaching yourself from the society you exist in. Since regaining my sanity, I've actually been more productive in my art because I've placed a higher value on my close personal relationships.

These days, I prefer to think of art more in the manner of Kurt Vonnegut, "Practice any art, music, singing, dancing, acting, drawing, painting, sculpting, poetry, fiction, essays, reportage, no matter how well or badly, not to get money and fame, but to experience becoming, to find out what's inside you, to make your soul grow."

The Delinquent Academic

I would guess it depends on what one means by 'fire'; I had not thought it could mean losing one's sanity - that is far from ideal. I am interested to hear your story sometime. I would agree with you - and wrote in the article - that writing itself does not come before my relationships, or even existence itself. To place one's passion, one's career, or whatever, above others you are responsible for is a childish narcissism I feel our culture encourages. Haha thanks for the quote - I should have used that!

Katrina Biggs

As you mentioned, AI is going to become incredibly useful for research, even though what's garnered will still have to be carefully analysed, and a recent comment about its usefulness for OIA requests made me see a use for it there, too. However, for writing where we want the writer's 'voice', we need the writer. AI can probably learn how to copy a writer's voice, but I'm not sure it will be able to surprise the reader by writing something that's influenced by our surroundings, current circumstances, or a single unique experience.

The Delinquent Academic

Yeah hopefully, but even if it can - to me it is the deception that would anger me. If I read something and thought, 'wow, this person can really write!' and then found out it was written by AI, I'd feel fleeced, dirty, my mind … abused in some way. It sounds dramatic, but I believe the sanctity of human relations (the writer-reader relationship being an important one) is being devalued in our society across the board, and this can only be a bad thing.

Katrina Biggs

Actually, I think I would feel the same as you re: feeling cheated if I found that AI had written something that seemed 'real'. It would be a bit like giving someone some accolades for having written something superb, only to find that someone else had in fact written it for them.

Mr. Raven

As someone who uses generative AI in my paper cuts and short animations, and who is up front about that, I respect your stance. I live in a wilderness area, and I certainly do not want some transhumanist dystopia. I am actually a big Ted Kaczynski fan and struggle on a daily basis with how much I should use these machines, which are increasingly like genies that grant wishes, but which also, as you point out, deskill people in doing so.

The Delinquent Academic

I respect that you are open about using it, and I do understand that it can increase efficiency and workflow within a multifaceted creation-arc, as it were. I think being aware that they are kind of like a genie is the biggest hurdle, really; then you know when to limit your use so it does not become unhealthy. I would wager most people, even if they have perhaps thought about it before, are not aware of their dependence on machines. I am definitely leaning the Ted way too - it is hard not to look around and see people … dehumanize themselves at the hands of machines.
