A guide to Gartner and Forrester research for asset managers
Asset managers are absorbing pressure from every direction. Fee compression is a real pain point, with fund expense ratios dropping by more than half over the past two decades. At the same time, private markets are experiencing explosive growth, bringing a wave of data complexity that your legacy systems weren’t built to handle. On top of that, AI adoption is accelerating.
Taken together, these forces are reshaping how asset managers operate, and they raise a more fundamental question: is your data fit for what comes next?
If you’re looking for guidance, analyst research is a great place to start. But most data quality evaluation frameworks were written for generic enterprise use cases, and don’t always map to the way data moves through investment operations.
Read on to learn how to use Gartner’s Magic Quadrant and Critical Capabilities for Augmented Data Quality Solutions, alongside the Forrester Wave™ for Data Quality Solutions, to identify the capabilities that matter most for your operating model, and how to implement them in practice.
Why vendor rankings aren’t enough
Analyst research shouldn’t be read as a scoreboard, so asking “Who ranked the best?” won’t give you the answers you need. A vendor can look strong in a general market evaluation and still fall short in the areas that matter most when your company is managing a complex, dynamic portfolio.
For asset managers, doing data quality well calls for a more practical reading of available analyst research. You are not looking for a platform with strong general capabilities, but for tooling that helps you organize, understand, and improve the data your business depends on every day.
The five areas below deserve your closest attention:
- Start with analytics and AI readiness
For most asset managers, the most useful starting point in Gartner’s Critical Capabilities report is the analytics and AI readiness use case. AI readiness goes beyond future experimentation to ask whether your data is fit for the work you need it to do now: portfolio analytics, risk models, performance measurement, exposure monitoring, and AI-assisted workflows. Each of these use cases depends on quality data that’s timely, explainable, and fit for purpose.
Here, Gartner’s Magic Quadrant and Critical Capabilities reports are especially useful when read in tandem. The Critical Capabilities report can help you compare vendor strength by use case, while the Magic Quadrant shows how the market as a whole is moving. The Magic Quadrant says modern augmented data quality solutions must support not only profiling, but also ongoing monitoring, rule management, active metadata, remediation, and AI-enhanced usability.
For asset managers, look to understand: Can the platform I’m exploring do more than simply profile the data? Can it help me maintain trust in my data as it moves through downstream processes that shape our investment decisions, disclosures, and risk management?
- Look closely at observability and anomaly detection
This is one of the strongest themes that runs across available analyst research on the data quality market.
Forrester says buyers should prioritize continuous data observability and real-time anomaly detection to prevent downstream data issues. Gartner similarly identifies profiling and monitoring/detection as a core market requirement, with metadata helping support automatic detection of outliers, anomalies, patterns, and drifts.
This is especially relevant for asset managers because many data issues do not show up at ingestion. A file may arrive in the expected structure and still cause problems later, such as stale prices feeding valuation processes, identifier mismatches affecting exposure views, inconsistent issuer hierarchies breaking compliance checks, and delayed benchmark data distorting client reporting.
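The stale-price case above can be sketched in a few lines. This is a minimal illustration, not any vendor’s implementation: the record layout, tolerance, and function name are all assumptions for the sake of the example.

```python
from datetime import date

# Hypothetical price records: (security_id, price, as_of_date).
# A price is "stale" when its as_of_date lags the valuation date
# by more than a configurable tolerance.

def find_stale_prices(prices, valuation_date, max_age_days=1):
    """Return security IDs whose price is older than the tolerance."""
    stale = []
    for security_id, _price, as_of in prices:
        if (valuation_date - as_of).days > max_age_days:
            stale.append(security_id)
    return stale

prices = [
    ("US0378331005", 189.95, date(2024, 3, 15)),  # fresh
    ("GB0002374006", 23.10, date(2024, 3, 12)),   # three days old
]
print(find_stale_prices(prices, valuation_date=date(2024, 3, 15)))
```

A check this simple passes structural validation at ingestion, which is exactly why the reports emphasize ongoing monitoring: the file arrives in the expected shape, and only a freshness rule catches the problem before it feeds valuation.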
This is also where data quality gates become operationally important. In an asset-management context, a data quality gate means embedding validation directly into the flow of data before it moves downstream. This might mean stopping a price file if key identifiers do not match the security master, blocking benchmark data if effective dates are missing, or flagging incomplete ESG attributes before they feed portfolio construction, exposure analysis, or investor reporting. Used this way, data quality gates are a way to prevent bad data from turning into reconciliation work, reporting exceptions, or poor model inputs later in the process.
This is precisely what asset managers should look for in analyst research – not only whether a platform can detect an issue, but whether it can enforce trust at the point where the data enters the operational chain. This capability builds directly on Gartner’s emphasis on deploying monitoring and validation rules in batch or real time and on Forrester’s focus on embedded remediation and proactive impact analysis.
For asset managers, ensure that any platform you evaluate can identify and contain issues before they become manual reconciliation work, reporting defects, or risk-reporting exceptions.
- Use automation to end manual reconciliation
Gartner predicts that by 2027, 70% of organizations seeking Data & Analytics governance tools will prioritize automation and choose solutions that minimize manual intervention (Gartner, Magic Quadrant for Augmented Data Quality Solutions, 2026; Critical Capabilities for Augmented Data Quality Solutions, 2026). Gartner also says self-learning algorithms are reducing the need for a human in the loop and that vendors are investing in automation to suggest rules, prioritize alerts, recommend remediation, and drive efficiency.
For asset managers, this has a direct impact on operating cost.
Manual effort is still one of the biggest cost drivers for asset managers when it comes to managing data quality: reconciling data across sources, investigating breaks, resolving duplicates, validating hierarchies, and tracking down why one downstream system does not agree with another. A platform that surfaces issues without helping teams triage and remediate them increases visibility, but doesn’t offset workload.
When analysts reference AI-assisted rule creation, remediation workflows, and workflow orchestration, asset managers should pay attention. Forrester says buyers should seek predictive insights, automated root-cause analysis, proactive impact analysis, and embedded remediation workflows. Gartner points to natural-language rule creation, AI-assisted interactions, and agentic workflows as increasingly important market capabilities.
For asset managers, it all boils down to whether or not a platform will actually reduce workload in the current operating model. Look for evidence that your tool can reuse rules across pipelines, prioritize meaningful exceptions, assign ownership clearly, and automate repetitive stewardship tasks. Those are the capabilities that help companies move from reactive reconciliation to scalable control.
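Rule reuse and exception prioritization can be made concrete with a small sketch. This is a generic pattern, not any vendor’s API: the registry, decorator, and severity convention (lower number = more urgent) are assumptions for illustration.

```python
# Shared rule registry: each rule is defined once and can be applied
# by any pipeline, instead of being re-implemented per feed.
RULES = []

def rule(name, severity):
    """Decorator that registers a validation rule under a name and severity."""
    def register(fn):
        RULES.append((name, severity, fn))
        return fn
    return register

@rule("non_negative_position", severity=2)
def non_negative_position(record):
    return record.get("quantity", 0) >= 0

@rule("has_issuer", severity=1)  # severity 1 = most urgent (assumption)
def has_issuer(record):
    return bool(record.get("issuer"))

def run_rules(records):
    """Return failed checks sorted by severity so the worst surface first."""
    exceptions = []
    for rec in records:
        for name, severity, fn in RULES:
            if not fn(rec):
                exceptions.append((severity, name, rec))
    return sorted(exceptions, key=lambda e: (e[0], e[1]))
```

The point of the sketch is the shape of the workflow: rules live in one place, every pipeline runs the same checks, and stewards see a severity-ordered exception queue rather than an undifferentiated list of alerts.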
- Read lineage and metadata as a regulatory and AI issue, not just a governance feature
Lineage can sound abstract in analyst research, but for asset managers, lineage and metadata are real business tools.
You need to know and be able to show where particular figures come from, how they have been transformed, and why they should be trusted. That applies across performance reporting, client disclosures, risk calculations, and internal oversight.
It’s also increasingly important for AI readiness. If firms are going to use AI in research, operations, analytics, or risk reporting, they need confidence that the underlying data is traceable and explainable.
For asset managers, lineage is much more than a governance feature; it’s a practical way to defend outputs, understand impact, and trust the data behind risk reporting, investment decisions, and AI-enabled decisions.
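At its simplest, lineage means that every derived figure carries the inputs and transformations that produced it, so a reported number can be traced back to its sources on demand. The sketch below is a toy model of that idea; the field names and NAV example are purely illustrative.

```python
from dataclasses import dataclass, field

@dataclass
class Figure:
    """A value plus the metadata needed to explain where it came from."""
    name: str
    value: float
    sources: list = field(default_factory=list)   # upstream Figures
    transform: str = "raw input"                  # how it was produced

def derive(name, value, transform, *sources):
    """Create a figure whose lineage records its inputs and transformation."""
    return Figure(name, value, sources=list(sources), transform=transform)

def trace(figure, depth=0):
    """Yield a human-readable lineage trail for a figure."""
    yield "  " * depth + f"{figure.name} = {figure.value} ({figure.transform})"
    for src in figure.sources:
        yield from trace(src, depth + 1)

positions = Figure("position_value", 1_000_000.0)
fees = Figure("accrued_fees", 2_500.0)
net = derive("net_asset_value", positions.value - fees.value,
             "position_value - accrued_fees", positions, fees)
print("\n".join(trace(net)))
```

Real lineage tooling captures this automatically from pipelines and queries rather than by hand, but the payoff is the same: when a client or regulator asks where a number came from, the answer is a traversal, not an investigation.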
- Focus on operational and transactional data quality
Gartner defines operational and transactional data quality as support for controlling the quality of data created by, maintained by, and housed in operational or transactional applications. What does that mean for asset managers?
The most costly data issues are the ones that show up in day-to-day processes: breaks between positions and accounting records, mismatches between internal and external reference data, missing attributes in downstream reporting flows, or inconsistent values moving across trade, valuation, compliance, and reporting systems.
For this reason, data quality for asset managers needs to be embedded close to where data is created, maintained, and moved. The goal is to stop issues from spreading in the first place, not just fix them after the fact.
Does your platform pass the test?
As an asset manager, here are the most important questions to focus on as you read data quality research by Gartner and Forrester:
- Can the platform monitor data continuously, not just profile it once for a point-in-time snapshot? Gartner and Forrester both treat observability, anomaly detection, and ongoing monitoring as core modern capabilities.
- Does it actually reduce manual work through smart automation? Gartner is explicit that automation is becoming central and that buyers are prioritizing solutions that minimize manual intervention.
- Can it connect data quality to governance, lineage, and impact analysis? Asset managers need explainability, defensibility, and a clear view of downstream impact.
- Does it support AI readiness in practice? Gartner and Forrester both anchor the market around AI readiness, but the real test is whether the platform can make data fit for analytics, models, and increasingly unstructured-data use cases.
- Can it handle the complexity of a highly regulated financial environment? Forrester notes Ataccama’s strong adoption in financial services, and Gartner repeatedly highlights regulated industries across the market. Those are more meaningful fit signals for asset managers than generic enterprise breadth.
The bottom line
The analyst research is useful for asset managers, but only if you interpret it through the realities of your industry.
Gartner’s Critical Capabilities report is the best place to start because it shows how vendors perform against specific use cases. The Magic Quadrant helps explain the broader market direction. Forrester’s report is useful as a cross-check because it reinforces the importance of observability, automation, remediation, and AI readiness.
Read together, the reports point asset managers toward a clear set of priorities: analytics and AI readiness, observability, automation, lineage, and operational data quality. Those are the areas that determine whether a data quality platform becomes a strategic control point or just another layer in the stack.
Get the analyst research
See how leading analysts evaluate today’s data quality platforms, then compare what matters most for your business. Use the reports to assess observability, automation, lineage, AI readiness, and operational fit with more confidence.
Anja Duricic
Anja is our Product Marketing Manager for ONE AI at Ataccama, with over 5 years in data, including her time at GoodData. She holds an MA from the University of Amsterdam and is passionate about the human experience, learning from real-life companies, and helping them with real-life needs.