It bothers me. A lot. I used to go on LinkedIn to learn something. Now all I learn is that more and more of the content is AI generated. For instance, I'd never known humans to use this symbol “—” so much. Have 90% of people suddenly started doing that, or are all these posts I am seeing generated by the same bot?

In the software and management consulting world, the primary product is advice. Nowadays, that advice is increasingly packaged by LLMs. While AI is an incredible accelerator, "pure" AI output often lacks the grit, the nuance, and the specific horror stories that a human consultant brings to a project.

If you're reviewing a proposal, a technical audit, or a strategy piece, this post might be helpful. I've worked with three different AIs to create the 10 tips below. Used in combination, they may help you tell whether the content you are reading was generated by an AI and perhaps never reviewed by a human.

1. The "Tapestry & Delve" language
Although this will likely change over time, it seems like current AI models have "favourite" words that they use with telling statistical frequency.
2. AI is trained on "balanced" writing. It loves to group benefits or features into sets of three.

3. AI models are programmed to be helpful and optimistic. They almost always end an article or a post by zooming out to a "grand vision" of the future.

4. AI can explain how a system works, but it struggles to describe how it fails in the real world.

5. This is a specific rhetorical phrase that I find the easiest to spot right now. I am told AI models use it to sound profound.

6. Look at the article from a distance. Do all the paragraphs look to be about the same length?

7. While these are used in a grammatically correct way, statistically they appear far more often in machine writing than in typical human writing.

8. AI bots speak from a position of total certainty and universal truth. That is, unless you point out a mistake, at which point they are forced to issue an apology.

9. AI offers analogies that are either too basic or too obvious. I guess on the flip side they are safe metaphors.

10. In professional services, and perhaps in any respected field, we cite reports (Gartner, McKinsey, etc.) or reputable publications. The AI bot doesn't seem to need to do that.
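The "favourite words" tell from tip 1 lends itself to a quick mechanical check. Below is a minimal sketch in Python that counts a few marker words and em-dashes in a piece of text. The marker list (`MARKER_WORDS`) and the choice of tells are my own illustrative picks based on the observations above, not a validated detector.

```python
import re

# Illustrative marker words drawn from the tips above; this is a rough
# screen, not a reliable AI detector.
MARKER_WORDS = ["tapestry", "delve"]

def ai_tell_score(text: str) -> dict:
    """Count a few crude 'AI tells': marker words and em-dashes."""
    lower = text.lower()
    counts = {w: len(re.findall(r"\b" + w + r"\b", lower))
              for w in MARKER_WORDS}
    counts["em_dash"] = text.count("\u2014")  # the "—" symbol
    return counts

sample = "Let's delve into the rich tapestry of delivery \u2014 a journey."
print(ai_tell_score(sample))
# → {'tapestry': 1, 'delve': 1, 'em_dash': 1}
```

A non-zero score proves nothing on its own; a human editor can use every one of these words legitimately. It is the combination of several tells at once that should raise an eyebrow.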
Why it matters
In a world where anyone can generate a 2,000+ word article in 10 seconds, the value of originality has never been higher. If your content meets some of the criteria in the list above, your clients will subconsciously (or perhaps even consciously) know it, and they will assume you are taking shortcuts with the rest of your work too.

What should you do instead? Sure, use AI to draft, but then "break" the machine's patterns. Read it, change the language until you can be proud of it, then read it again. Add your own 3 AM stories, use a real-world metaphor, and please, please delete the word "tapestry."
About the author: Plamen is an experienced Software Delivery consultant helping organisations around the world identify their path to success and follow it.