The Setup
David (not his real name) is a 55-year-old operations director at a mid-sized logistics company in Singapore. He had been asked by his CEO to attend a Claude Cowork workshop — and he had agreed, reluctantly, primarily because declining felt politically awkward.
He arrived fifteen minutes early, found a seat at the back, and spent the first ten minutes scrolling through his emails. When asked in the opening check-in whether he thought AI was going to change his job, he said: "I hope not." The room laughed. He did not.
He had spent thirty years in operations management and had watched a dozen technology rollouts promise transformation and deliver frustration. He was not there to be excited. He was there to tick a box.
The Problem We Found
Every Friday afternoon, David spent between two and three hours producing his weekly operations report. He pulled figures from three different spreadsheets, cross-referenced them against the previous week's numbers, wrote a narrative summary of the week's performance, flagged any issues requiring management attention, and formatted everything according to the company's reporting template. He had done this every single Friday for six years.
When asked how long this process had taken on average this year, he calculated it almost immediately: "About 130 hours. Just on that one report." He paused. "I've never actually added it up before."
The realisation that he had spent the equivalent of more than three full working weeks this year on a single recurring report — without questioning whether it could be done differently — was the moment his scepticism began to crack.
What We Built
In the first 45 minutes of the session, David and his trainer mapped the weekly report process in detail: which spreadsheets, which columns, which calculations, what the narrative structure looked like, where the formatting template lived, and what the most common management questions were that the report needed to anticipate.
In the next 75 minutes, David used Claude Cowork to build a workflow that read those three spreadsheets every Friday morning, ran the calculations, identified week-on-week changes that exceeded a defined threshold, drafted the narrative summary using his own previous reports as tone and structure references, and produced a formatted document saved to his shared drive ready for review.
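The week-on-week threshold check at the heart of that workflow is simple to express. The sketch below is illustrative only: the metric names, figures, and the 10% threshold are assumptions for the example, not David's actual configuration.

```python
# Flag metrics whose week-on-week change exceeds a defined threshold.
# Names, numbers, and the 10% threshold are hypothetical examples.

THRESHOLD = 0.10  # flag changes larger than 10% in either direction

def flag_changes(this_week: dict, last_week: dict,
                 threshold: float = THRESHOLD) -> list:
    """Return (metric, % change) pairs whose change exceeds the threshold."""
    flagged = []
    for metric, current in this_week.items():
        previous = last_week.get(metric)
        if not previous:  # new metric or zero baseline: no ratio to compute
            continue
        change = (current - previous) / previous
        if abs(change) > threshold:
            flagged.append((metric, round(change * 100, 1)))
    return flagged

this_week = {"on_time_deliveries": 912, "fuel_cost": 48200, "open_tickets": 31}
last_week = {"on_time_deliveries": 905, "fuel_cost": 41800, "open_tickets": 30}

print(flag_changes(this_week, last_week))  # only fuel_cost moved more than 10%
```

The point of the example is not the code itself: David never wrote any. He described the rule in plain language, and the workflow applied it each week, surfacing only the movements worth a manager's attention.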
The first version took twelve minutes to produce and had three errors — a calculation that was slightly off, a formatting detail that did not match the template, and a section where the narrative summary was too generic. David corrected all three in the next 30 minutes by describing exactly what was wrong. The second version was, in his words, "honestly better than what I write myself on a tired Friday afternoon."
The Result
David has used the workflow every week since the training session. He reviews the output, makes minor edits where needed — usually fewer than ten minutes of work — and submits it. His Friday afternoons, previously consumed by report production, are now available for the actual operations thinking that the report was supposed to enable but never left him time for.
In the six months since the session, he has built three additional workflows: one for supplier performance tracking, one for monthly variance reporting, and one that processes incoming maintenance requests and categorises them by urgency and resource requirements. None of these required any additional training — the skill he developed building the first workflow transferred directly.
When asked what surprised him most, he did not say the technology. He said: "That it took someone asking me to add up how many hours I spend on that report before I ever questioned it. The AI just made it obvious that I should have questioned it years ago."
What This Case Study Tells Us About AI Adoption
David's experience is not unusual. Across our training cohorts in Singapore, the participants who arrive most sceptical are frequently the ones who get the most out of the session. Not because they are converted by a demo, but because they have thirty years of domain knowledge about their work — they know exactly what the repetitive patterns are, exactly what good output looks like, and exactly what they would do with the time recovered if they had it.
That domain knowledge is the asset that makes Claude Cowork powerful. The AI can execute. It cannot decide which problems are worth solving, what good output looks like for a specific role in a specific organisation, or whether the time saved is being reinvested in something that matters. Those judgments are the experienced professional's contribution — and they are irreplaceable.
Age is not a barrier to AI adoption. Technical experience is not a prerequisite. What is required is the willingness to describe a problem precisely and to evaluate a result honestly, skills that a 55-year-old operations director has in abundance. Read about why this matters in our broader piece: What Is an AI-Native Professional?
Could This Be You?
If you have a recurring process that follows the same pattern every week or every month — a report, a document, a set of communications — the question is not whether Claude Cowork could automate it. It almost certainly could. The question is whether you are willing to spend two hours building the automation rather than 130 hours a year doing it manually.
Our Claude Cowork Workshop Singapore is structured exactly this way: you bring your most expensive recurring process, we help you build the automation for it during the session, and you leave with something working. Join our WhatsApp community to connect with other non-technical professionals who are doing the same.
For the programme overview and how to book: Claude Cowork Training.
Bring your most expensive recurring process
You leave with a working workflow. Just like David did.
Book a workshop