By Donald Inglis • January 12, 2026
Social media seems, once again, to be ablaze with the famous question: how many r’s are in strawberry? It is one of those prompts that reliably resurfaces every few months, usually accompanied by screenshots of ChatGPT confidently giving the wrong answer. On the surface, it is a harmless curiosity. A bit of fun. But it also reveals something far more important about how AI works, and where its limitations still sit.

It often looks like modern AI can accomplish any task. Want a fun marketing image? Easy. Need a blog post written? Done. Want to use AI to create a romantic song for your wedding anniversary? You’ll have it in seconds. Yet despite how magical the technology seems, AI still falls surprisingly flat when it comes to certain basic tasks. Tasks you would expect a seven-year-old to handle with ease.

It is amusing, and slightly baffling, to see ChatGPT struggle with something as simple as counting letters in a word. But it is not just ChatGPT being glitchy or careless. There are structural reasons why large language models struggle with certain words more than others.

Take the question itself: how many r’s are there in the word strawberry? For most people, the answer is immediate. You picture the word, scan it, and count. Three. For ChatGPT, the process is completely different. It does not “see” words as letters in sequence. It predicts likely outputs based on patterns it has learned from enormous volumes of text. When asked, what answer does it give? Just a clear and confident: “two.”
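For readers who are curious about the mechanics, here is a rough sketch in Python using tiktoken, an open-source tokenizer library published by OpenAI. It is an illustration rather than a description of ChatGPT’s exact internals, but it shows the key point: a language model receives words as multi-letter chunks called tokens, not as individual letters, which is why a letter count that is trivial for ordinary code is awkward for the model.

```python
# Illustrative only: shows how a tokenizer chops a word into chunks.
# Requires the open-source tokenizer library: pip install tiktoken
import tiktoken

encoding = tiktoken.get_encoding("cl100k_base")  # a tokenizer used by some OpenAI models

word = "strawberry"
token_ids = encoding.encode(word)
chunks = [encoding.decode([token_id]) for token_id in token_ids]

print(chunks)           # multi-letter chunks such as ['str', 'aw', 'berry'], not single letters
print(word.count("r"))  # 3 -- counting letters directly is trivial for ordinary code
```

The exact chunks vary from model to model, but the general picture holds: the letters inside each chunk are never handled one by one, so the model has to guess the count from patterns rather than simply looking.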
So, for all the billions in investment, the vast computing power, the pressure on global energy and water resources, and the near-mythical reputation AI now carries, it still cannot reliably answer how many r’s are in strawberry. That should give anyone pause before using AI for things that really matter.

Why this matters for tax, finance, and professional advice

The strawberry example is trivial, but the underlying issue is not. AI systems are designed to produce plausible responses, not guaranteed correct ones. When they get things wrong, they often do so with complete confidence. That is a dangerous combination in areas like tax, accounting, and compliance.

With self-assessment deadlines approaching, it is tempting for business owners to ask AI questions such as:

- Can I claim this expense?
- Do I need to register for VAT?
- How should I structure my income to be more tax-efficient?

AI can produce an answer quickly, and it will often sound reasonable. The problem is that it may be outdated, oversimplified, or simply incorrect for your specific circumstances. UK tax law is nuanced, highly contextual, and frequently updated. AI does not understand your full financial picture unless you give it every detail, and even then, it cannot apply professional judgement in the way a qualified adviser can.

The same risk applies when using AI for business communications or financial decisions. Using AI to draft explanations, summaries, or documents without proper review can introduce subtle errors. A missed exception, a misquoted threshold, or an outdated allowance can all undermine confidence and potentially create problems later.

Using AI safely and sensibly in practice

AI is not useless. Far from it. But it needs to be used with care and clear boundaries. Here are a few practical guidelines to help reduce risk:

- Treat AI as a starting point, not a final answer. It can help you think, outline, or draft, but it should never be the last word on technical matters.
- Always verify facts against authoritative sources, such as HMRC guidance, legislation, or professional manuals.
- Do not rely on AI for personalised tax advice.
- Review anything important before acting on it. If you would not be comfortable explaining it to HMRC, it should not be based on an unchecked AI response.
- Be especially cautious with deadlines, thresholds, and eligibility criteria. These are areas where AI errors are common and costly.

AI can save time, spark ideas, and help with structure and clarity. What it cannot do, at least not yet, is replace professional judgement, accountability, or detailed technical understanding. If it can confidently miscount the letters in strawberry, it can just as confidently misstate a tax rule. The difference is that one is a joke on social media, and the other can have real financial consequences.

How we can help

At Inglis, we support individuals and businesses with clear, practical accounting advice you can rely on. We understand that tools like AI can be useful, but when it comes to tax, compliance, and financial decisions, having a trusted adviser still matters.

If you would like a second opinion on a tax question, help making sense of your numbers, or reassurance that you are doing the right thing, we are always happy to talk things through. You can call us on 01904 787 973 or book a call with our team.