
Dec 16, 2025 | 5 minute read

QA Power-Up: AI Workflow Insights from a Testing Expert

written by Kevin Cheng


AI is a big timesaver, and the way I work has changed because of it.

Every day I review my assigned tickets, assess the complexity, and estimate how much time each one needs. I knock the fast ones out so we can ship value quickly. When there's an urgent ticket or a customer waiting for a feature, that takes priority. But even then, quick fixes keep coming in throughout the day, and I'm always asking myself: can I finish this in the next little while, ship it, and come back to the bigger thing? I don't want to be a roadblock for the team.

To keep track of everything, I group my Chrome tabs by feature. For example, quotes go in one group and rule promotions in another. When I'm done, I close the whole group. With everything organized on my screen, I can always see which one I can knock out fast and keep things moving.

AI Is Like a Colleague Next to Me

This is where AI fits into my workflow. I treat it like someone sitting at the desk next to me, someone I can hand off the mechanical parts of my job to.

When I need to build a test, I open a chat window and say: I need to build this test. Here's the new endpoint. Here's the request body. Here's the response. Basically the same JSON format I use in Postman. Then I tell it to write a new test following the exact patterns in my framework.
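
Here's roughly what that handoff looks like. The endpoint, fields, and values below are made-up placeholders for illustration, not our real API:

```json
{
  "endpoint": "POST /v2/quotes",
  "request_body": {
    "quote": {
      "customer_id": "cust-123",
      "items": [{ "sku": "SKU-001", "quantity": 2 }]
    }
  },
  "expected_response": {
    "status": 201,
    "body": { "quote_id": "quote-456", "status": "created" }
  }
}
```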

What happens next is pretty cool. It crawls through my project, studies the structure, and figures out where things should go. Most of the time, I'd say 80-90%, it's accurate. I just tweak a few things and it learns for next time.

AI Workflows

The real shift came when I started digging into AI workflows. The problem I kept running into was repetition: every time I needed to create a changelog, I'd go through the same sequence of prompts. Create a changelog for this feature. Then, put it in that folder. Then, tweak this content. Then, add relative links. Four or five follow-up prompts, every single time.

Workflows solve that by bundling all those instructions into a markdown file: human readable, basically a checklist for the AI. And I didn't even write the workflow file manually. I just told the AI to create one for me: "make a workflow for creating changelogs."
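
I won't share our actual file, but a changelog workflow along these lines might look something like this sketch (the folder names and conventions are placeholders, not ours):

```markdown
# Workflow: Create Changelog

<!-- Illustrative sketch: paths and conventions below are placeholders -->

## Inputs
- Date
- Feature name

## Steps
1. Create a new file under docs/changelogs/ named <date>-<feature>.md.
2. Write the changelog content following the format of existing entries.
3. Add relative links to the related feature documentation.
4. If any required detail is missing, ask the user before writing the file.
```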

Now when I need a changelog, I trigger the workflow with a slash command, give it the date and feature name, and submit. The AI crawls through the project, creates the file in the right folder, and formats everything according to our conventions. If it needs more information, it asks.
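
With a made-up command name, that invocation is as short as:

```
/create-changelog 2025-12-16 rule-promotions
```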

One thing I like about this approach is that the workflow is always learning. When something's off, like a link formatting issue or a file landing in the wrong place, I don't just fix the output. I tell the AI to update the workflow itself, and next time it gets it right.

It's like training an assistant: you point out what to do differently, and it adjusts how it operates going forward.

End-to-End Tests: Same Approach

I've built a similar workflow for our end-to-end test suite. I give it an endpoint, request body, and response, and it creates the test following our existing patterns.

The workflow file also doubles as documentation, which has been a nice side effect. I structured it with two sections: a user guide that explains how to use it, then the AI instructions underneath.

Those AI instructions are basically a checklist. Understand the test requirements. Check if it's a new endpoint, and if so, create it in this path. Review existing steps so it doesn't create duplicates that bloat the project. For things like authentication and response validation, it checks what it can reuse before creating anything new. Then it creates the feature file, adds the appropriate tags, and includes cleanup steps.
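
Sketched out with placeholder names and paths (none of these are our real ones), that file's shape is roughly:

```markdown
# Workflow: Create End-to-End Test

## User Guide
Trigger with /create-e2e-test and provide the endpoint, request body,
and expected response.

## AI Instructions
<!-- Illustrative sketch: paths, helpers, and tags are placeholders -->
1. Understand the test requirements from the endpoint, request body,
   and response.
2. If the endpoint is new, add it under tests/endpoints/.
3. Review existing steps first; reuse authentication and
   response-validation helpers instead of creating duplicates.
4. Create the feature file, add the appropriate tags, and include
   cleanup steps.
```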

It's like recording what I would do as a human and telling the robot to follow the same process. That's really what it comes down to.

When to Use This Approach

My rule for when to build a workflow is simple: if you're doing something repeatedly, in almost the same pattern every time, that's a candidate. You put in the work once to capture how you do it, and then every time after that is just a slash command and a few details.

That said, not everything fits. Quick one-off tasks, things with a lot of variation, or situations where I need to clarify requirements with the team still happen the traditional way. The goal isn't to automate everything. It's to automate the predictable stuff so I have more time for the parts that actually need human judgment.

The workflows themselves are committed to our repository, so anyone on the team can use them. That's been a good way to share this approach beyond just my own work.

The Token Efficiency Win

Here's something I didn't expect when I started using workflows: they're way more token-efficient.

A normal conversation might be four or five back-and-forths, and each prompt costs tokens. Worse, each follow-up re-sends the conversation history so far, so the cost compounds. With a workflow, you front-load all the instructions into one prompt. Even if the AI makes eight code changes internally, it counts as one interaction, because you've given enough detail upfront that there's no back-and-forth.

Say each follow-up re-sends a few thousand tokens of accumulated context: collapsing five prompts into one can easily cut a task's token use in half or better. Run dozens of these per week and the math works out pretty clearly. It's changed how I think about structuring my prompts in general.
