Writing is hard. Wouldn’t it be nice if it were easier?
With advancements in LLMs and AI tools it seems like some are celebrating the end of putting pen to paper (or fingers to keyboard).
But I believe that to do so is to approve our own execution.
Writing is pervasive. Words permeate your life, and for most people written words are the form of communication with the most “utility” (versus more informal spoken language). Writing is what you mostly use in your job, your education, and other formal settings: emails, texts, Slack messages, school exams, police reports, letters from the bank… And I believe that writing and thinking are tightly wound together. Perhaps indivisible.
So if the AI is doing the writing – who is doing the thinking?
An abridged history of “the button”
First, rewinding a little. Much has been written about the ability of LLMs to produce content that does a good job of mimicking, or bettering, human writers. We often talk about chats or conversations, as in ChatGPT, but this is really writing, not talking, even if combined with a program that then turns the generated words into audio output. Many of us have played around with these tools and found that the output is surprisingly persuasive. It’s also really easy. And the most recent updates, including GPT-4o, are even more compelling. For example, it was widely reported when ChatGPT and other models passed the Multistate Bar Exam (at the time scoring in the 90th percentile of test-takers, and maybe higher now) (see GPT-4 Passes the Bar Exam). If you’ve played around with these tools you know what I mean.
And indeed, recent reports indicate that many people are now actively using AI tools in their everyday work and life. School teachers are reporting that when they set take-home assignments, many responses come back with evidence that they may have been AI-generated. ChatGPT set the record for the fastest-growing user base of any online product (they now have around 180 million users). Anecdotally I know of Engineering Managers who use LLMs to write performance reviews for their reports, CEOs who get ideas for their next LinkedIn posts, and PMs who use the tool to write up PRDs.
I began to think about types of writing across a spectrum. On one hand you have the most intellectual, strategic and unique work you could imagine – writing a company’s strategy, designing a new approach for scientific research or crafting the most beautiful love story. And on the other hand you have the most boring and rote work you could imagine – repeated and almost identical tax filings or writing up detailed minutes from a meeting…
Much of what we spend our time doing as humans ends up falling somewhere between these two extremes. In this gray area there are many individual pieces of work that feel repetitive and boring – wouldn't it be great if some piece of technology could help accelerate this work? Even sitting hammering away at this keyboard a part of me wishes I could have the AI finish the rest of this blog so I could get outside to enjoy a sunny SF afternoon. Wouldn’t it be better if I just “pushed the button”?
Writing is thinking
The writing that we do every day is a sort of manifestation of the thinking that we’re doing. Whether it’s emails, Slack or documents, the act of writing forces the brain to engage in the topic at hand.
Frank Bruni in the New York Times says: Writing is thinking, but it's thinking slowed down — stilled — to a point where dimensions and nuances otherwise invisible to you appear.
My concern is that we are sleepwalking into a world where we’ve forgotten how to do the type of deep thinking that comes from writing. And deep writing and thought can only happen when they are built on a foundation. Even when writing the most mundane words on a page you are probably doing more thinking than you realize. In the act of crafting a well-written Slack reply you’re probably getting in the head of your colleague, thinking carefully about how they will respond and what is important to get them on side. Or in the process of writing up a performance review for a direct report you’re thinking about how she will react when you deliver the news that she isn’t ready for a promotion yet.
John Warner has written a wonderful piece about the impact of AI on High School English teaching. He says this about writing:
[Writing] is an embodied process that connects me to my own humanity, by putting me in touch with my mind, the same way a vigorous hike through the woods can put me in touch with my body. For me, writing is simultaneously expression and exploration. In a piece like this, writing is the expression and exploration of an idea (or collection of ideas). It is only through the writing that I can fully understand what I think.
In a relevant parallel – he also writes about how advances in artificial intelligence reveal some of the central pointlessness of the way that high school English is currently taught. He argues that AI models can easily churn out high-scoring essays precisely because the grading rewards rote structure, rather than creativity and deep thought.
We should be especially worried about AI being used in schools or universities, not because the technology isn’t potentially useful, or because of dogmatic fears around “cheating”, but because it’s so easy to use as a shortcut to skip learning how to think.
Writing is meaning
Writing also has other meanings in the world that we live in. Producing words requires you to expend energy to think. Writing is a sort of pre-historic crypto network, with a “proof of work” embedded in the fact that these well-written words required prior education, effort and thought. So the words we produce and trade have meaning attributed to them. It’s very easy to speak words, but the act of writing something down often requires you to access a deeper level of commitment and thought. Hence “can I get that in writing?”
Ethan Mollick, the author of a new book called “Co-Intelligence”, says: The fact that a professor takes the time to write a good letter is a sign that they support the student’s application. We are setting our time on fire to signal to others that this letter is worth reading. I have written many long documents to vendors, financial services regulators and banks. Often these documents served to reassure, convince or challenge. They were detailed and formally written – and these attributes imbued them with the meaning that we were serious people. What now?
And writing words is a huge part of many of our jobs – and we derive personal meaning and satisfaction from writing those words, whether alone or collaboratively with our colleagues. My first job was as a management consultant. Perhaps looking back we spent too long thinking through client proposals, final reports and slide decks. But that was our life, and we believed it was important. Through the process of collaboratively writing we built relationships, learnt a craft and developed skills. What happens once it is easy to get 80 or 90% of the way there with just a click of a button? Who will teach young graduates in their first job if partners can use “Grad Scheme GPT” instead, at a fraction of the price?
Mollick continues: In a world in which the AI gives an instant, pretty good, near universally accessible shortcut, we’ll soon face a crisis of meaning in creative work of all kinds. We are going to need to reconstruct meaning, in art and in the rituals of creative work.
I’m not opposed to using AI tools, but we need to be aware of the implications of using these tools in the way in which the corporate and broader world functions.
Words without meaning
Most of the media coverage discussing AI and LLMs focusses on capabilities for creating new content – for example “write me an essay” or “tell me the answer to this question”. However once you have used these tools you’ll see that the writing produced sometimes only has the appearance of being well written.
As you dig deeper, many AI tools still struggle to produce truly unique writing. Instead, content is generic – well written, but lacking a strong point of view (unless you tell it to have one). Responses are often overly reliant upon the prompts you use (i.e. you’ll get what feels like whatever you wrote regurgitated back at you). LLMs also struggle on topics that are not widely written about. And the thought process that powers an LLM’s outputs is not easily exposed to the user – it’s consequently hard to reliably learn from them, since they don’t “explain their working”.
The problem is that we might be training ourselves not to look too closely. When it’s so easy to push the button, you have an incentive to accept the output.
Words without meaning are worthless. I think that we want writing that is surprising and beautiful. And unique insights or new ways of looking at a problem are what enable progress. It’s the topics that haven’t yet been written about that are most valuable for humans to spend time thinking about, whether that’s specific feedback to a direct report who is learning a new skill, or pioneering research in areas that impact our whole species.
John Warner put it more simply: “It is, at its core, a bullshitter.”
AI is simply averaging out the many words that it has previously read across its petabytes of training data. My fear is a dystopian future – not the Matrix or Terminator, but a world awash with meaningless and cheap words. In such a world we’d need to find new ways to do business, and re-imagine how we use writing.
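That “averaging” can be made concrete with a toy sketch – a bigram word model, which is vastly simpler than a real LLM and not how transformers actually work, but it shows the same underlying idea: predict each next word by sampling from the frequencies seen in training text, so the output is literally a statistical average of what the model has read. (The corpus and function names here are illustrative, not from any real system.)

```python
from collections import Counter, defaultdict
import random

# A tiny illustrative training corpus (any text would do).
corpus = (
    "writing is thinking and thinking is hard . "
    "writing is easy when the machine is writing ."
).split()

# Count, for each word, which words follow it and how often.
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def generate(start, n_words, seed=0):
    """Extend `start` by sampling each next word in proportion to
    how often it followed the previous word in the training text."""
    rng = random.Random(seed)
    words = [start]
    for _ in range(n_words):
        counts = following[words[-1]]
        if not counts:  # dead end: word never seen with a successor
            break
        choices, weights = zip(*counts.items())
        words.append(rng.choices(choices, weights=weights)[0])
    return " ".join(words)
```

Every word such a model can ever produce is one it has already seen; calling `generate("writing", 5)` yields a plausible-looking but derivative string assembled entirely from the frequencies of its training data – the “bullshitter” quality in miniature.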
Some examples:
For legal agreements and contracts – AI will make it cheap and easy to generate contracts. But without lawyers spending time writing and editing such a document, how would you have confidence that both signatories understand what they are committing to? Legal negotiations are generally not one-sided; rather, they are the act of collaboratively writing and thinking about a document to create a common understanding that both parties are happy with (or at least understand).
For communicating with customers – the cost of words is going to zero. It has never been easier to sell, onboard, apologize, warn, explain, etcetera, etcetera. But humans are social animals – written communication will lose its impact in a post-AI world if customers believe that the words they are reading are likely AI-generated (even if they’re not!). Perhaps we’ll need to find new ways of communicating with customers that disclose and highlight when a human is involved. Will handwritten letters or ink signatures make a comeback? Will brands invest in physical locations in order to build deeper connections with their customers “in the real world”?
In Education – Educators are turning to AI to generate lesson plans, prepare for classes or produce handouts – in short, using AI to write the words they need to support their teaching. However there are concerns that teachers themselves will no longer think deeply about the structure and content of their classes. How will we ensure that our children are getting the best quality education? And to further complicate matters students are increasingly using AI as well, eroding their ability to think for themselves too. If we want our young people to develop into the best versions of themselves we need to find ways to ensure that both educators and students are not over-reliant on AI.
I think that Humans need to learn to write.
Because writing is thinking.
And thinking is what makes us human.
I also read and enjoyed Centaurs and Cyborgs on the Jagged Frontier
Thanks to Alex Kotran (CEO AI Education Project) for reading and feedback