AI is rewriting our job descriptions
How AI is quietly reshaping what we do all day and why arts/culture organizations need to get ahead of it
Kristin Darrow | February 2026
A new study published by Harvard Business Review this week stopped me mid-scroll. It described exactly what I’ve been living.
The study, "AI Doesn't Reduce Work — It Intensifies It" by UC Berkeley researchers Aruna Ranganathan and Xingqi Maggie Ye, followed 200 employees at a U.S. tech company for eight months. Their finding? Workers who adopted AI tools didn't work less. They worked more - faster pace, broader scope, longer hours - often without anyone asking them to. AI made "doing more" feel so possible, so accessible, and frankly so rewarding that people just... kept going.
I read that and thought: yep. I was awake at 2:15am on Tuesday nursing Claude along on a big AI research project. I was wired and happy doing the work. But exhausted. This is increasingly my daily life.
I literally use AI all day long. As a full-time AI practitioner/consultant, it’s my job. But I’m also human first. And I can confirm — the cognitive fatigue of “AI-first” is real. The widening of my own job description is real. I'm doing things today I wouldn't have attempted a year ago, not because someone handed me a new job spec, but because AI made it feel suddenly within reach. It's empowering. And it's exhausting.
What the study actually found
The researchers identified three distinct patterns of intensification that I think every organization — tech or otherwise — should be paying attention to:
Task expansion. People started absorbing work that used to belong to others. Product managers began writing code. Researchers tackled engineering tasks. Work that previously would have been outsourced or deferred got pulled in-house because AI made it feel “doable.” Meanwhile, engineers spent increasing time reviewing and coaching colleagues' AI-assisted output - a hidden workload that showed up in Slack threads and quick desk-side conversations, not on anyone's task list.
Blurred boundaries. AI reduced the friction of starting any task to nearly zero. So people began slipping "quick prompts" into lunch breaks, meetings, idle moments. Typing a line to an AI felt more like chatting than working. But over time, the natural pauses in the workday disappeared. As one worker put it: breaks stopped feeling like breaks.
Relentless multitasking. Workers ran multiple AI threads simultaneously - manually coding while AI generated alternatives, reviving long-deferred tasks in the background, juggling open loops everywhere. This produced a feeling of momentum. But the reality was constant attention-switching, frequent output-checking, and a growing sense of always-on.
The researchers described a self-reinforcing cycle: AI speeds up tasks → expectations for speed rise → reliance on AI deepens → scope of work expands → workload intensifies. All of it, silently. One engineer summed it up perfectly: you thought you'd work less. But you don't work less. You work the same amount or more.
Why this matters
Here's where it gets personal and where it connects to my work with arts and culture organizations.
I’ve just come from the world where this AI job transformation landed first. I spent years in product management and engineering leadership, including working in a coaching network alongside Marty Cagan at Silicon Valley Product Group, coaching tech companies on how to innovate. My job as a product coach was to help tech companies change 1) how they build, 2) how they solve problems, and 3) how they decide which problems to solve. I watched firsthand as ChatGPT landed and software engineers' roles started shapeshifting overnight — then cascading to product managers, then designers, then marketers, then leadership. I had a front-row seat at ground zero of the AI job shift.
Every major article about AI's impact on work - including this HBR study - now points to software engineering as the canary in the coal mine. The job transformation starts there and radiates outward. I know that world intimately. I lived it.
And now I'm here, back in arts and culture, because I believe this sector deserves better than being last in line for a transformation that's already underway.
What this means for arts and culture
The HBR study describes a tech company. But the patterns it identifies - task expansion, boundary erosion, cognitive overload - are not unique to tech. They're what happens whenever AI tools land in a workplace without intentional structure around them.
And here's the thing: arts and culture organizations are already resource-constrained. Staff are already wearing multiple hats. The 15-person development team is already doing the work of 25. When AI enters that environment and people start absorbing even more tasks because they can - without anyone rethinking job descriptions, workloads, or team structure - the burnout risk isn't theoretical. It's inevitable.
The phrase from the study that stuck with me most: organizations need to develop an "AI practice" - a set of intentional norms and routines that structure how AI is used, when it's appropriate to stop, and how work should and should not expand. Not just hand people the tools and expect them to self-regulate. That doesn't work. The research proves it (and so does my own experience: I'm self-employed, so my "boss" is me, and right now I'm not so good at setting boundaries on work).
The bigger opportunity (and my bigger thesis)
Here's where I want to flip the script.
This isn't just a cautionary tale. It's a massive opportunity - especially for smaller organizations.
For the past 20 years, technology adoption in the nonprofit/arts sector has followed a painfully slow arc. New CRM? That's a 3-year project. Website redesign? 18 months if you're lucky. Digital transformation? A decade-long journey with mixed results.
We don't have to repeat that arc.
The AI moment is different. Fundamentally different. The tools are accessible, often low-cost and low “switching cost,” and they don't require a massive implementation (yes, you definitely want to work WITH your IT team no matter what).
The point is that a 15-person organization can leverage AI in ways that would have required a 50-person team just two years ago. The smaller and more nimble you are, the more leverage you have right now.
My premise — the big thesis I'm building toward in all my work — is this: arts and culture organizations can leapfrog the slow, painful technology adoption cycles of the Web 2.0 era. We can build lean, scrappy teams that leverage AI to automate operational work, sharpen fundraising, accelerate content creation, and free up humans to do what humans do best — create meaning, build community, make art.
But only if we're intentional about it. Only if we design the practice, not just deploy the tools.
That means rethinking job descriptions before they rewrite themselves. It means building organizational muscle for continuous learning and adaptation - what the product world calls "discovery." It means leadership that frames the big problems worth solving, not leadership that mandates specific AI tools.
What I'm offering
I've spent years helping for-profit tech companies and nonprofits master these principles. Now I'm bringing that experience - the product operating model, the innovation coaching, the firsthand AI transformation knowledge - to arts and culture.
If you're an arts leader, you are starting (or will soon start) to see your staff quietly absorb more and more work with AI tools. You might start wondering whether that's a win or a warning sign. From the bleeding edge, let me tell you firsthand… it's both. The time to get intentional about AI practice is now, before unhealthy work patterns calcify into place.
I'm here to help you design the practice — not just adopt the tools. I'm a rare combination of tech architect and org architect, and AI is my jam.
Reach out: kristin@matters.work
Kristin Darrow is an independent consultant working with arts and culture organizations on AI strategy and organizational transformation. She brings a decade and a half of product leadership experience from Tessitura Network and Silicon Valley Product Group to help cultural organizations leverage AI responsibly and effectively.