So, when a company or organization is trying to decide where their content will come from, they need to prioritize establishing trust first and foremost, or their content marketing efforts will fall short. This is an important consideration when evaluating whether you will use an LLM (like ChatGPT, Grok, Claude, or others) to write your content for you, or whether you will hire a real human to do the work.
Much has been debated over the last year about what AI is capable of versus what skilled human writers can do, so we’re not going to rehash that discussion here. (If you want to know what we have to say on the topic, we covered it in great depth here.) But what technology proponents still seem to be ignoring is the way AI is failing to progress in ways that truly matter for businesses that need quality content to reach their target audiences.
Organizations that want to offer their audience helpful, trustworthy, actionable advice (which should be all organizations!) need to pay attention to the ways in which AI-generated content continues to fall short in order to understand why relying on humans to generate content is still so important.
I’m going to detail some AI failures I’ve personally witnessed recently to offer some clarity about where AI is still severely limited. What I want you to understand is that these types of AI failures are not one-offs, and they’re not meaningless. Bad AI outcomes occur routinely, and they directly shape how your audience perceives your content, which can stifle your brand’s growth.
The main takeaway is you shouldn’t be using an LLM to write your content for you! Here’s why…
I’m helping coach my son’s running team, but I’m not a runner (they just needed someone – anyone). The team is training to run a 5K, and I wanted to be prepared, so I used an LLM to try to find a nice 3.1-mile running loop from my house that I could use to train. I crafted a detailed prompt describing what I was looking for and provided my address as a starting point. It “thought” and then gave me a long, comprehensive response. I started reading it and was delighted! It reminded me to be cautious on the road and detailed a lovely route that would maximize scenery and minimize hills (yes!). The response said it had specifically considered places that would be safe for pedestrians. Great!
But there were a few problems. First, one of the roads it told me to turn on doesn’t exist. It was a mere AI hallucination. Once I figured out which road it meant based on the roads listed before and after it, I realized the road it was telling me to run on would be an awful one for pedestrians because it’s very busy with very little shoulder to use. Then I realized that the planned route was a 5-mile run from my home to a popular local running trail. So, instead of giving me the 3.1-mile loop I requested, the LLM gave me instructions for a 10+ mile round trip to get to somewhere else where I could then run some more. Not at all helpful!
Lessons to Learn: If you use an LLM during any part of your content creation, it’s imperative that you check the response thoroughly for errors. Nothing will ruin your credibility faster with your audience than publishing wrong information or bad advice!
Lacking Wisdom
After trying (and failing) to get a clear running route mapped out, I took a step back and decided to ask the LLM for a 5K training plan that I could follow instead, assuming I could make up a route as I went. What it gave me was a detailed day-by-day plan covering the next 8 weeks of preparation for a 5K race. One of the first things I noticed was that the longest runs of the week were always on a Sunday, which won’t work for me, but I figured I could just revise the prompt to specify that I wanted Sundays to be rest days. I was hopeful that I was on the right track.
And then I noticed the biggest problem with the LLM’s response – it lacked the wisdom to understand that someone looking for a plan to get ready for the shortest distance race possible was not an advanced runner. While this is obvious to people, the LLM hadn’t been programmed to understand this kind of logical nuance.
The training plan it gave me contained information that was far too sophisticated for my needs (like references to tempo runs and recovery runs). I was overwhelmed by recommendations about unfamiliar concepts like “negative splits” and “overstriding.” There was a bizarre mismatch between a plan meant to teach a novice runner how to get started and verbiage meant to help an advanced runner improve.
Lessons to Learn: Correct information precedes wisdom, so without accuracy as a foundation, wisdom can’t emerge. This is a real problem when you want to generate the kind of content that provides wise recommendations and actionable advice. First, your content must be accurate; then it needs to match your audience’s knowledge level and needs. Your audience wants helpful, expert insights from someone who knows the industry well and understands their goals and pain points well enough to give advice that’s specifically aimed at them. Remember, for your content to be effective, it must be customized for both the audience and the channel!
Offering Information Instead of Knowledge
In a separate instance, I was trying to decide where to meet up with a fellow mom from our kids’ school. Normally, I would pick a coffeehouse for an iced latte and some good conversation, but she isn’t a coffee drinker, so I wanted to find someplace better suited to her preferences. I asked an LLM to give me five possible places in Rockford, Michigan where I could meet up with a friend to talk that weren’t coffeehouses but had a similar atmosphere. I had a few places in mind already, but I was curious to see whether the LLM would identify any places I hadn’t thought of. …It certainly came up with some interesting options!
The first place the response suggested was a top-rated burger place in Rockford, Illinois. (That would be a little far to drive!) The second place it suggested was the closest library, which is incredibly small and cramped, not to mention that libraries generally frown on lengthy casual conversations. (The response mentioned that I could also consider reserving a meeting room, which I found particularly hilarious.) The third suggestion had strong date vibes. (We’re not that kind of friends!) The remaining suggestions were walking trails, which I suppose could work if we wanted to bring our own food or drinks? All in all, the response was awful. It contained factual errors and completely lacked the kind of first-hand knowledge that someone asking for a local recommendation would want.
Lessons to Learn: LLMs typically spit back responses that contain the kind of information you could find anywhere, which is fine if you’re writing a research paper. If you want a dictionary that sounds a little friendlier, LLMs are a great choice! But if you want first-hand knowledge, nuanced understanding, real-world advice, or a deep dive into something, nothing compares to real human experience. Remember, because LLMs aren’t actual people, they don’t have personal experiences they can draw on. The best they can do is aggregate real people’s experiences that have been shared publicly in the hopes that something in there might be what someone is asking about. This kind of “shot in the dark” approach doesn’t portray your business well and isn’t an effective use of your time or marketing budget!
Failing to Recognize Nuance
Vernacular can vary significantly across user groups, which means that LLMs need to handle all kinds of nuances. While one person might ask, “Can you give me…” in a prompt, another might say “Please make…” or “Use this to generate…” and mean the same thing. Recently, I published an article and wanted some social media post ideas I could use on LinkedIn, Facebook, and X to promote it. I’m not sure how I asked “incorrectly,” but what the LLM gave me was examples of existing posts from other marketing companies that had posted on the same topic my article covered. I revised the prompt and got exactly what I wanted, so I could base my posts on the ideas the LLM provided. But the exchange illustrated how LLMs often miss nuance that humans wouldn’t, answering strictly based on what the prompt said rather than what it meant.
This kind of thing happens with the AI-generated “conversation starter” prompts that appear on social platforms as well. For example, I follow a lot of local farms on Facebook because I like to pick fresh fruit with my kids. Recently, one of the farms posted, “Good news! Our big donut machine is back in operation so we’re back to making our famous donuts. Stop in soon and get one!” and Facebook gave the following suggested prompts for followers to comment: “Can I place an order online?”, “Donut shop daily specials”, “Donut machine repair process”, and “Donut machine capacity”. The first two are fine, but the last two are laughable because the LLM the platform is leveraging doesn’t understand that a business posting about a fixed piece of machinery likely has no knowledge of (or desire to discuss) the repair process that got it back up and running …nor would people who love eating donuts care to ask. These follow-up questions sound unnatural because they lack the kind of nuanced understanding that people have when making conversation with each other. Sure, the original post highlighted the repair of a donut machine, but not because machine repair work was of value to the farm’s audience; it mattered because of what it meant – hot, fresh donuts were back.
Lessons to Learn: Machines can copy people, but they aren’t people. So, before you use an LLM to generate content for your business, ask yourself which one your audience would rather talk to: a machine or a human. There’s still a very big difference! Using an LLM to create resources for your audience or interact with them can turn people off when they feel like the technology doesn’t understand what they really need or are asking. Losing that personal touch can alienate leads and existing customers in a way that your business can’t afford!
When you need content written that sounds like a human wrote it, contact us. We’re real people really writing …and it shows! Our content powers the digital marketing strategies that deliver more traffic, convert more leads, and retain more customers. Find out more about our content marketing services today!