Jira automation (Woohoo!), Focus you can trust, Evidence mode, Excel exports for clusters, and Improved agent reasoning
Apr 6, 2026
·
AI

Jira automation
Jira automation is here. Whenever an issue is created or updated, the NEXT AI agent automatically retrieves and compiles relevant customer evidence, then posts it as a comment on the ticket: customer reels, real quotes, quantified data, and links back to the source. Every ticket arrives pre-loaded with the context your team needs to make confident decisions, without anyone having to search for it.
Connect your Jira instance via OAuth, select the projects you want to monitor, and the agent handles the rest.
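To make the flow concrete, here is a minimal sketch of what such an integration could look like, assuming a webhook receiver and a hypothetical `gather_evidence` agent call. The comment endpoint is Jira's standard REST API; everything else (function names, evidence shape, instance URL) is illustrative, not NEXT's actual implementation:

```python
import json
from urllib import request

JIRA_BASE = "https://your-team.atlassian.net"  # placeholder instance URL

def gather_evidence(issue_key: str) -> list[dict]:
    """Hypothetical stand-in for the agent's retrieval/compilation step."""
    return [{"quote": "Checkout keeps timing out", "source": "Q1 interview #4"}]

def format_evidence_comment(evidence: list[dict]) -> str:
    """Render gathered evidence as a Jira comment body."""
    lines = ["Customer evidence for this ticket:"]
    for item in evidence:
        lines.append(f'- "{item["quote"]}" ({item["source"]})')
    return "\n".join(lines)

def handle_webhook(event: dict, auth_header: str) -> None:
    """React to Jira's issue-created / issue-updated webhook events."""
    if event.get("webhookEvent") not in ("jira:issue_created", "jira:issue_updated"):
        return
    issue_key = event["issue"]["key"]
    body = format_evidence_comment(gather_evidence(issue_key))
    req = request.Request(
        f"{JIRA_BASE}/rest/api/2/issue/{issue_key}/comment",
        data=json.dumps({"body": body}).encode(),
        headers={"Content-Type": "application/json", "Authorization": auth_header},
    )
    request.urlopen(req)  # posts the compiled evidence back to the ticket
```

The key property is that the handler only reacts to create/update events and every comment it posts is assembled from retrieved evidence, not free text.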
Focus you can trust
When you set a focus, you're telling the agent exactly what data matters. This week, four changes make sure it listens — and that you can always see what's in scope.
Highlight count stays visible. Previously, confirming your focus hid the highlight count and left the area feeling cramped. The focus area now keeps the count on screen in a cleaner two-line layout with better spacing between tags. At a glance you know how much data the agent is working with and where that data is coming from.
Your selections stick. Setting a focus manually used to be undermined by auto-suggestions that widened the scope and replaced your choices. Suggestions now layer on top of what you already selected — your tags and filters remain the foundation, and the system adds from there.
Smarter defaults from tag groups. When setting the focus, the system now recognizes which tags belong together. Instead of requiring every tag to match (which over-narrows results), related tags are offered as alternatives — for example, "Product Issue or Pain Point." You get broader, more useful defaults without having to rearrange chips by hand.
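The matching rule described above can be sketched in a few lines: tags inside a group combine with OR, while separate groups still combine with AND. The group contents here are made up for illustration:

```python
def matches_focus(item_tags: set[str], groups: list[set[str]]) -> bool:
    # Within a group, any one tag may match (OR); across groups, all must (AND).
    return all(group & item_tags for group in groups)

# "Product Issue or Pain Point" acts as one group rather than two hard filters:
groups = [{"Product Issue", "Pain Point"}, {"Q1 Interviews"}]
print(matches_focus({"Pain Point", "Q1 Interviews"}, groups))  # True
print(matches_focus({"Pain Point"}, groups))                   # False
```

Requiring every tag individually would reject the first item too, which is exactly the over-narrowing the grouped defaults avoid.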
Consistent refinement across every flow. Behind the scenes, the way the agent narrows focus has been rebuilt into a single, dedicated step. Your existing scope always serves as the starting point, and any refinement — whether it's a new tag, a date range, or an updated description — is added on top rather than rewriting what you set. This applies uniformly, so focus behaves the same way no matter where you're working.
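One way to picture the additive rule (field names and scope shape here are hypothetical, purely for illustration): set-valued scope such as tags is unioned, and fields the user already set are never overwritten:

```python
def refine_scope(scope: dict, refinement: dict) -> dict:
    """Merge a refinement into an existing focus without discarding it."""
    merged = dict(scope)
    for key, value in refinement.items():
        if isinstance(value, set) and isinstance(merged.get(key), set):
            merged[key] = merged[key] | value   # add new tags on top
        else:
            merged.setdefault(key, value)       # never overwrite user choices
    return merged

scope = {"tags": {"Pain Point"}, "date_range": "last 90 days"}
refinement = {"tags": {"Checkout"}, "date_range": "last 7 days"}
print(refine_scope(scope, refinement))
# tags grow to include "Checkout"; date_range stays "last 90 days"
```

Whatever the real data model looks like, the invariant is the same: the existing scope is the starting point, and refinement only ever adds to it.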
New mode: Evidence
Until now, every analysis in NEXT AI assumed you already had a hypothesis to test or a structure to cluster around. But many questions start broader: "What are customers saying about checkout?" or "Show me everything related to pricing." Forcing those into modes designed for data crunching added friction and muddied the results.
A new Evidence Mode gives you a dedicated space for open-ended exploration. Point the agent at a topic and it gathers all relevant evidence into a single, organized view. No hypothesis required, no clustering to compute — just everything that matches, in one place. It's available alongside Hypothesis and Cluster modes, so you choose the approach that fits the question.
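Conceptually, Evidence Mode reduces to a retrieval pass with no scoring or grouping step. A naive keyword sketch captures the shape (the real agent presumably uses far richer retrieval than substring matching):

```python
def evidence_view(highlights: list[dict], topic_terms: list[str]) -> list[dict]:
    # No hypothesis to test, no clusters to compute: collect every match.
    terms = [t.lower() for t in topic_terms]
    return [h for h in highlights if any(t in h["text"].lower() for t in terms)]

highlights = [
    {"text": "Checkout fails on mobile", "source": "support ticket"},
    {"text": "Love the new dashboard", "source": "NPS survey"},
]
print(evidence_view(highlights, ["checkout"]))  # only the checkout highlight
```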
Excel exports for clusters
We've upgraded highlight downloads in clusters from CSV (workable, but clunky to open and format) to native Excel (.xlsx) files. Same fields, same column order, but the file opens cleanly in Microsoft Excel or any spreadsheet app, with no import dialogs or encoding issues.
Improvements to agent reasoning
In Evidence and Hypothesis modes, the agent now draws a hard line between structure and substance: only steps backed by actual evidence produce meaningful output. When there's nothing to cite, the response says so plainly instead of generating plausible-sounding detail (hallucinating). You can trust that every claim in the output traces back to a real source — and when the data isn't there, you'll know.
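The gating rule amounts to a simple guard: a step with no citations renders a plain "no evidence" line instead of generated detail. The function shape below is illustrative, not the agent's internal API:

```python
def render_step(claim: str, citations: list[str]) -> str:
    """Emit a claim only when it is backed by at least one real source."""
    if not citations:
        return "No supporting evidence found for this step."
    return f"{claim} [sources: {'; '.join(citations)}]"

print(render_step("Users abandon checkout on the payment screen",
                  ["interview-12", "ticket-8841"]))
print(render_step("Users dislike the new pricing page", []))
# → "No supporting evidence found for this step."
```

The point of the hard line: the structure of the answer (its steps) can exist without evidence, but its substance (the claims) cannot.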