From Data Pipelines to Agentic Workflows: A Shift in Analytics Engineering

What are agentic workflows in analytics engineering and why do they matter?
Agentic workflows are systems that actively support users in structuring, transforming, and combining data, guiding decisions rather than requiring every step to be manually engineered into a pipeline. For analytics engineering, this shift matters because it redistributes how data work is organized: reducing the bottleneck on specialist teams, making self-service more realistic for business users, and shortening the path from raw data to actionable insight without replacing the need for human judgment and strong data foundations.
For many organizations, becoming truly data-driven still depends on something seemingly basic yet critical: getting data ready for analytics. Despite modern data platforms and increasingly powerful tooling, preparing data is still often manual and time-consuming. As organizations scale their data use, this quickly becomes a bottleneck.
At the same time, a new approach is emerging: agentic workflows that actively support users in structuring, transforming, and combining data. Rather than relying solely on manually engineered pipelines, these workflows introduce systems that guide decisions and reduce the effort required to work with data.
The hidden constraint in data-driven organizations
Most companies already have some of the right building blocks in place. Data warehouses are set up, dashboards are live, and use cases are defined. Yet behind the scenes, a significant part of the work still relies on specialized expertise, which can become a bottleneck as demand grows.
Ingesting data is only the first step. Turning raw data into something usable requires interpretation, transformation, and alignment with existing models. It also requires understanding how datasets relate to each other, often based on context that lives in people’s heads rather than in documentation. As demand grows, so does the pressure on these teams: more requests come in, time-to-insight increases, and progress slows. This is exactly where agentic workflows come in: not by replacing the platform, but by reshaping how data work is distributed, supported, and executed across the organization, augmenting the teams involved rather than replacing them.
From pipelines to agentic workflows
What if preparing and integrating data didn’t require translating every step into code first? What if the process could be guided instead of engineered?
This is where we see a shift towards more agentic workflows. Instead of building pipelines step by step, users interact with systems that can analyze data patterns, support them in making decisions along the way, and guide the sequence of steps needed to move from raw data to a usable dataset. In practice, this looks different from traditional data work. A user uploads a dataset, and instead of writing transformations, the system suggests how the data could be structured, highlights inconsistencies, and proposes how it could be combined with existing datasets. The user reviews, adjusts, and approves, making the workflow iterative and interactive rather than predefined and static.
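To make that interaction pattern concrete, here is a minimal Python sketch of the suggest, review, approve loop. The dataset, rule logic, and function names are illustrative assumptions, not a real product API:

```python
# Hypothetical sketch of an agentic prep loop: the system proposes
# transformations, the user approves a subset, and only those are applied.

def suggest_fixes(rows):
    """Scan raw rows and propose transformations instead of
    requiring the user to write them up front."""
    suggestions = []
    for col in rows[0]:
        values = [r[col] for r in rows if r[col] is not None]
        # Detect numeric values stored as strings, a common inconsistency.
        if all(isinstance(v, str) and v.replace(".", "", 1).isdigit() for v in values):
            suggestions.append(("cast_numeric", col))
        # Highlight columns with missing values for the user to review.
        if any(r[col] is None for r in rows):
            suggestions.append(("flag_missing", col))
    return suggestions

def apply_approved(rows, suggestions, approved):
    """Apply only the suggestions the user explicitly approved."""
    for action, col in suggestions:
        if (action, col) not in approved:
            continue
        if action == "cast_numeric":
            for r in rows:
                if r[col] is not None:
                    r[col] = float(r[col])
    return rows

raw = [
    {"region": "EMEA", "revenue": "1200.5"},
    {"region": "APAC", "revenue": "980"},
    {"region": None,   "revenue": "450.0"},
]

proposed = suggest_fixes(raw)             # the system proposes
approved = {("cast_numeric", "revenue")}  # the user reviews and approves
clean = apply_approved(raw, proposed, approved)
```

The point is the shape of the interaction, not the specific rules: the system surfaces options, and nothing changes until a human approves it.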
There’s nothing fundamentally new about this. We’ve been abstracting and automating parts of data workflows for years. What’s different now is that systems are increasingly playing an active role in helping users make decisions, rather than just executing predefined steps. At the same time, it’s important to stay realistic about what this means. These systems don’t actually understand the business context behind a dataset. They work with patterns, metadata, and signals in the data itself, which means making sense of that still requires human judgment. In practice, these workflows are most effective when they support domain experts rather than replace them. The user remains responsible for interpreting and validating the system’s suggestions.
This changes the workflow entirely. It becomes less about building pipelines and more about guiding outcomes.
Why this matters in practice
From a strategy perspective, this shift goes beyond efficiency. It changes how organizations operate.
It makes self-service more realistic. Instead of waiting for an analytics engineer to model a new dataset, a business user can start working with it more directly, supported by the system. This does not remove the need for strong data foundations, but it makes the complexity more manageable and accessible.
Beyond efficiency, it changes how organizations scale. Today, meeting the growing demand for analytics-ready data typically means adding more engineering capacity, which is already scarce. With agentic workflows, part of that effort is absorbed by the system itself, allowing teams to handle more work without growing at the same pace.
On top of that, it shortens the path from data to insight. When the effort required to prepare and integrate data decreases, organizations can move faster: new datasets can be explored earlier, use cases can be tested sooner, and decisions can be made with less delay. It also creates more room for experimentation, allowing teams to learn faster and iterate more easily.
At the same time, it introduces more consistency. Instead of every team structuring data in their own way, the system provides guidance, not by enforcing strict rules, but by nudging users towards better and more standardized decisions. This, however, raises the bar for governance. As more decisions are supported by AI, it becomes even more important to understand how those decisions are made, who is responsible, and how changes can be traced.
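One way to make that traceability concrete is to record every AI-supported decision as an auditable entry: what was suggested, who decided, and when. A minimal sketch, with illustrative field names that are assumptions rather than a standard schema:

```python
# Hypothetical sketch of an audit trail for AI-suggested changes,
# so it stays clear how decisions were made and who is responsible.

import json
from datetime import datetime, timezone

def record_decision(log, suggestion, user, accepted):
    """Append a traceable entry for every suggestion the system made,
    whether it was accepted or rejected."""
    log.append({
        "suggestion": suggestion,
        "decided_by": user,
        "accepted": accepted,
        "decided_at": datetime.now(timezone.utc).isoformat(),
    })

audit_log = []
record_decision(audit_log, "cast revenue to numeric", "analyst@example.com", True)
record_decision(audit_log, "drop rows with null region", "analyst@example.com", False)

# The log can be stored alongside the dataset for later review.
serialized = json.dumps(audit_log, indent=2)
```

Rejected suggestions are logged too; knowing what the system proposed but a human declined is often as important for governance as knowing what was applied.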
How agentic workflows are changing the role of Analytics Engineers
As agentic workflows become more guided and interactive, the role of Analytics Engineering shifts. Instead of focusing primarily on building pipelines, the focus moves towards designing systems that others can use effectively. This includes defining standards, embedding best practices, and ensuring that data governance is part of the workflow from the start.
The role becomes less about doing the work and more about enabling it. At the same time, abstraction should not come at the cost of understanding. Even if systems take over part of the process, organizations still need to maintain clarity on how their data is structured and interpreted. Otherwise, new bottlenecks will simply emerge in a different form.
Looking ahead
We are moving towards a world where data platforms do more than store and process data. They actively support how data is interpreted and used. This is part of a broader shift towards more agentic systems, where software not only executes tasks but actively supports users in navigating complex workflows.
Agentic workflows are not a one-size-fits-all solution. More complex use cases will continue to require deep expertise and tailored approaches, but the direction is clear: data work is becoming more interactive, more accessible, and less dependent on manual implementation.
This does not replace Analytics Engineers. It increases their leverage by allowing them to focus on higher-value problems, while more routine work becomes accessible to a broader user base. Ultimately, the biggest shift is not just technological, but organizational. It is about how teams collaborate, how responsibilities are distributed, and how decisions are made around data. In practice, this is also where the biggest blocker lies. Technology can enable agentic workflows, but without an updated operating model, the full potential remains out of reach.
That means being concrete about how data teams are structured, how work flows between engineers and business users, and where human judgment is required. It also means treating AI governance not as a compliance afterthought, but as an active enabler, one that gives teams the confidence to act on AI-supported decisions. Organizations that get this right don't just adopt new tooling; they redesign how data work gets done.
What this means for your strategy
For leaders shaping their Data & AI strategy, agentic workflows are worth taking seriously, not as a technology trend, but as an organizational opportunity. The organizations that benefit most won’t necessarily be the ones with the most advanced platforms, but those that rethink how data work is organized: who does what, where expertise is needed, and how more people can contribute without losing quality or control. That is a strategic question before it is a technical one.
This is also where we see our role. On the strategy side, we help organizations redesign workflows, clarify responsibilities, and identify where agentic support can create the most value without compromising governance. On the delivery side, we help translate that into practice through the underlying Data Engineering and Analytics Engineering work required to rebuild pipelines, prepare analytics-ready data, and embed these capabilities into the data platform.
Are we only scaling our technology stack, or are we also evolving how people interact with data? Because the real leverage isn’t in the platform itself, but in closing the gap between business understanding and data execution.
To sum up, here is what this article covered:
- Most organizations already have many of the right building blocks in place, yet preparing data for analytics still depends on specialized expertise, and as demand grows, that becomes a bottleneck.
- Agentic workflows change this not by replacing the platform, but by reshaping how data work is distributed: users interact with systems that suggest, highlight, and guide, while humans review, adjust, and approve.
- This does not remove the need for human judgment. These systems work with patterns and metadata; making sense of that in a business context still requires domain expertise.
- The organizational impact is significant: self-service becomes more realistic, teams can scale without adding engineering capacity at the same pace, and the path from data to insight gets shorter.
- For analytics engineers, the role shifts from building pipelines to designing systems others can use with governance and standards embedded from the start.
- Ultimately, the biggest shift is not technological but organizational: rethinking who does what, where expertise is truly needed, and how more people can contribute without losing quality or control.
Written by

Yannick Bosch
Data & AI Strategist