UnitedHealth’s data quality management made mainstream news when the company released the results of an independent audit tied to the United States Department of Justice’s Medicare Advantage data-audit reviews. The audit produced 23 action plans, with remediation targeted for March 2026, in cooperation with the DOJ investigation.
UnitedHealth’s public action-plan timeline illustrates the broader point. When scrutiny hits, everyone asks the same question: “How can you prove your data is trustworthy?”
A Difficult Stretch: UnitedHealth’s Data Quality Management
UnitedHealthcare is the nation’s largest health insurer, providing benefits to more than 50 million Americans. In December 2024, UnitedHealthcare CEO Brian Thompson was slain in an act of violence that also triggered a surge of public anger and renewed scrutiny of how UnitedHealth handles benefit coverage and payment decisions.
| Category | Signal | Why it matters |
|---|---|---|
| Progress | Published a 23-action improvement plan with timelines | Stakeholders can track execution against a defined plan. |
| Progress | Standardized documentation and processes in Medicare Advantage–sensitive areas | Reduces process variance and improves audit defensibility. |
| Challenge | Unmanaged documentation and process gaps weaken regulated audit defense | Inconsistency and unmanaged gaps can escalate into compliance and payment integrity risk. |
| Challenge | Reputation fell 33 spots: #59 (2020) → #92 (2025) in the Axios Harris Poll 100 | Trust erosion increases scrutiny and pressure for proof. |
| Challenge | Brand health dropped 16.1 points: 15.4 average → −0.7 low after the December 2024 CEO killing | Transparent evidence and visible controls matter even more. |
Data Quality Auditing
External reviews test whether an organization can prove its processes with consistent data, documentation, and control evidence. Think of it as a trust triangle: modern regulators and consumer markets expect three kinds of data management proof:
- Data Security protects confidentiality and integrity, while enforcing identity-driven access.
- Data Quality focuses on data condition (accuracy, completeness, timeliness, consistency).
- Data Governance defines ownership, policies, decision rights, and oversight design.
How UnitedHealth’s audit findings signal data quality problems
Color key: 🟦 Risk Assessment (RA) • 🟩 Care Services Mgmt (CSM) • 🟧 Manufacturer Discounts (MDD)
| # / Area | Finding (short) | Likely cause | Proof of correction |
|---|---|---|---|
| 🟦01-RA | Annual policy review + logs | Lifecycle gaps | Version history + annual attest |
| 🟦02-RA | Policy inventory + compliance | No master index | Policy register + gap closure |
| 🟦03-RA | Validate adherence | Weak enforcement | Control tests + exceptions log |
| 🟦04-RA | Central policy repository | Docs scattered | Repo live + migration evidence |
| 🟦05-RA | Roadmap to centralize tooling | Tool sprawl | Roadmap + milestones met |
| 🟦06-RA | Chart review/data submit policy | Inconsistent SOPs | Umbrella SOP + training proof |
| 🟦07-RA | Cross-area “umbrella” policies | Duplicates/drift | Consolidate + retire list |
| 🟦08-RA | HouseCalls annual refresh | Refresh cadence gap | Annual review records |
| 🟦09-RA | Approved coding credentials | Credential variance | Approved list + onboarding checks |
| 🟦10-RA | Quarterly coding standards + comms | Slow dissemination | Quarterly release notes + ack |
| 🟦11-RA | Independent coding audits | Independence gap | Compliance audit charter + results |
| 🟦12-RA | ASM/Submission Services policy | Process complexity | SOP + control mapping |
| 🟦13-RA | UHC oversight policy for RA | Oversight not evidenced | KPIs + oversight action logs |
| 🟩14-CSM | Central eGRC tracking | Tracking scattered | eGRC owners/dates/artifacts |
| 🟩15-CSM | Monitoring + escalation | Follow-through variance | Escalation rules + trend reports |
| 🟩16-CSM | Due dates to prevent repeats | Deadline slippage | Aging/SLA reports + closure |
| 🟩17-CSM | Fully remediate audit findings | Partial remediation | Closure packets + re-testing |
| 🟩18-CSM | Define QM responsibilities | Role ambiguity | Policy + RACI + committee reports |
| 🟩19-CSM | Reconcile public remediation | Conflicting summaries | Reconciliation workflow + resolution |
| 🟧20-MDD | Strengthen controls (roadmap) | Control gaps | Roadmap + control test results |
| 🟧21-MDD | Clearer exclusion reporting | Low transparency | New report samples + reconciliation |
| 🟧22-MDD | Escalation for disputes/non-pay | Ad hoc handling | Case logs + comms trail |
| 🟧23-MDD | Automate + streamline documentation | Manual work + doc sprawl | Automation logs + retired docs |
Importance of “Data Quality Management” in healthcare
Data Quality Management (DQM) isn’t reactively cleaning data when something looks wrong. It is the foundation of system trust—controls, standards, and proof of data accuracy.
DQM must be able to answer three questions on demand:
- Can we prove the data is correct? (accuracy + validity)
- Can we prove how it got here? (lineage + audit trail)
- Can we prove who can use it and why? (access + purpose + identity)
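Those three questions can be made concrete as a single “proof bundle” per record. The sketch below is a minimal illustration, not UnitedHealth’s or any vendor’s implementation; the record fields, rule names, and lineage hops are hypothetical examples.

```python
from dataclasses import dataclass, field

# Hypothetical sketch: one record's "proof bundle" answering the three DQM
# questions on demand. Field and rule names are illustrative only.

@dataclass
class ProofBundle:
    record_id: str
    checks_passed: dict                                 # validity rule -> bool
    lineage: list = field(default_factory=list)         # ordered (step, timestamp) hops
    access_grants: dict = field(default_factory=dict)   # user -> stated purpose

def prove(record: dict, rules: dict, lineage: list, grants: dict) -> ProofBundle:
    """Evaluate validity rules and package lineage + access evidence together."""
    results = {name: rule(record) for name, rule in rules.items()}
    return ProofBundle(record["id"], results, lineage, grants)

# Example: a claims-like record with two validity rules.
rules = {
    "npi_is_10_digits": lambda r: len(r.get("npi", "")) == 10,
    "paid_amount_nonnegative": lambda r: r.get("paid", -1) >= 0,
}
record = {"id": "clm-001", "npi": "1234567890", "paid": 125.50}
lineage = [("intake_feed", "2025-01-03T10:00Z"), ("claims_norm", "2025-01-03T10:05Z")]
grants = {"auditor_7": "regulatory review evidence"}

bundle = prove(record, rules, lineage, grants)
print(all(bundle.checks_passed.values()))  # True when every validity rule passes
```

The point of the bundle is that accuracy, lineage, and access evidence travel together, so “show me the proof” is one lookup rather than three separate forensic exercises.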
Turning audits into evidence: a practical Data Governance Framework + workspace
Healthcare governance, risk, and compliance are personal priorities for me. That’s why partnering at Cognizant with ServiceNow’s Data Quality leaders—and helping unveil the free Innovation Labs Data Quality Workspace at the ServiceNow Toronto World Forum—mattered. It gives teams a purpose-built way to profile platform data, track the core data quality dimensions, and show scorecard-grade evidence before an audit or incident forces the issue.
Trust collapses when data has no proof
Markets punish uncertainty faster than enterprises can remediate it. Meanwhile, customers interpret “we’ll fix it by next year” as “it wasn’t controlled last year.”
This Integrated Risk Management dynamic explains the real business risk of weak data foundations:
- Regulators expect repeatable evidence, not reassurance.
- Customers expect transparency, not complexity.
- Boards expect measurable controls, not narrative updates.
Healthcare Data Quality Management Lessons
Auditors often label findings as “operational.” However, those issues almost always map to core data quality dimensions—and they surface faster as healthcare AI shifts from assist to act. Therefore, treat every audit signal as a data-quality signal.
Practical takeaway: When documentation varies, it starts as an efficiency problem. Then it escalates into a payment integrity and audit defense risk—because you can’t prove the data you submitted is complete, consistent, and supported.
A simplified “audit-to-data-quality” mapping
| Audit signal (from public summaries) | Data quality dimension | Trust risk created | What good looks like |
|---|---|---|---|
| Non-standard documentation in risk-related workflows (Reuters) | Consistency, completeness | Questions about integrity of submitted data | Standard templates, required fields, validations, evidence links |
| Action plans to centralize policies + track findings (UnitedHealth Group) | Governance, traceability | Repeat findings, slow remediation | Policy lifecycle + control library + remediation SLAs |
| Recommendations to standardize audits + automate (Reuters) | Timeliness, auditability | Controls fail silently at scale | Continuous monitoring + automated control tests + dashboards |
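“Standard templates, required fields, validations” in the table above boils down to an enforcement point that rejects rows before they enter the system. Here is a minimal sketch under assumed field names (`member_id`, `dx_code`, `service_date`) and a simplified ICD-10-CM shape check; a production rule catalog would be far richer.

```python
import re

# Hedged sketch of "required fields + validations at the enforcement point":
# reject a submission unless every critical field is present and well-formed.
# Field names and the pattern below are illustrative assumptions.
REQUIRED = ("member_id", "dx_code", "service_date")
DX_PATTERN = re.compile(r"^[A-TV-Z][0-9][0-9A-Z](\.[0-9A-Z]{1,4})?$")  # ICD-10-CM shape

def validate_submission(row: dict) -> list:
    """Return a list of defects; an empty list means the row may enter the system."""
    defects = [f"missing:{f}" for f in REQUIRED if not row.get(f)]
    if row.get("dx_code") and not DX_PATTERN.match(row["dx_code"]):
        defects.append("invalid:dx_code")
    return defects

print(validate_submission({"member_id": "M1", "dx_code": "E11.9",
                           "service_date": "2024-06-01"}))  # []
print(validate_submission({"member_id": "M2", "dx_code": "ZZZZ"}))
# ['missing:service_date', 'invalid:dx_code']
```

The defect list, not a silent drop, is the design choice that matters: every rejection becomes an auditable artifact that feeds the exceptions log.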
Create Data Quality wins
Strive for this outcome: executives stop asking “Are we compliant?” and start asking “Show me the evidence.”
| Leadership question (sharpened) | Trusted data quality proof to show | Where it typically lives in a “Trusted Data” workspace |
|---|---|---|
| Which decisions rely on this dataset—and what harm happens if it’s wrong? | Decision register + impact/risk statement | Decision map / Data product overview |
| Which fields are critical, and who owns them? | CDE list + named business/technical stewards | Data inventory → CDEs tab |
| Which rules block unsupported values from entering the system? | Rule catalog (validations) + enforcement points | Rule catalog + ingestion controls |
| Which monitoring catches drift, missingness, or spikes within hours? | Observability alerts + SLAs + anomaly history | Monitoring / Alerts dashboard |
| Which lineage proves the source and transformation steps? | Lineage diagram + ETL/transform log references | Lineage / Provenance view |
| Which approvals document policy changes and exceptions? | Approval history + policy versioning + exception rationale | Governance → Policies & exceptions |
| Which identity controls prove least privilege and appropriate access? | Role/entitlement model + access reviews + audit logs | Access model + review evidence |
| Which dashboards show quality by domain, product, and vendor? | DQ scorecards segmented by domain/product/vendor | Executive scorecards |
| Which incidents link directly to data defects (and repeat causes)? | Incident/problem linkage + root-cause trends | Defects → Incidents & problems |
| Which controls are automated, and which still rely on heroics? | Control inventory + automation coverage % + manual steps | Controls → Automation coverage |
| Which evidence pack could we hand to a regulator today? | Exportable evidence pack (dated, complete, traceable) | Evidence pack generator / Exports |
| Which AI agents consume this data, and how do we constrain them? | AI data usage register + guardrails + allow/deny policies | AI governance → Data consumers |
Minimum viable evidence pack (that demonstrates the data)
Controls fail when they live in slide decks instead of systems. Therefore, regulated industries need a Data Governance Framework that runs like an operational product: measurable, testable, and observable.
| Evidence pack item | What it must include (minimum) | Outcome it enables |
|---|---|---|
| Data inventory + CDEs | Dataset list, CDE flags, owners | Clear accountability |
| Definitions + stewardship | Glossary, decision rights, steward assignments | Consistent interpretation |
| Rule catalog + thresholds | Rules by domain, thresholds, enforcement points | Prevent defects early |
| Profiling baseline + trends (90 days) | Baseline metrics + 90-day trendline + variance notes | Proves stability over time |
| Exceptions log + remediation proof | Exceptions, approvals, fixes, re-test evidence | Shows controlled deviation |
| Lineage snapshot per CDE | Source → transforms → targets + timestamps | Traceability & auditability |
| Access model | Identity, roles, purpose, least-privilege mapping | Access governance proof |
| Control test results | Automated test runs, pass/fail, coverage | Repeatable assurance |
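The “profiling baseline + trends (90 days)” item above implies a concrete control: compare today’s metric to its rolling baseline and flag drift. This is a toy sketch; the 3-sigma tolerance and the completeness metric are assumptions for illustration, not a standard.

```python
from statistics import mean, pstdev

def drift_flag(history: list, today: float, sigmas: float = 3.0) -> bool:
    """True when today's metric sits more than `sigmas` std devs off the baseline."""
    base, spread = mean(history), pstdev(history)
    return abs(today - base) > sigmas * max(spread, 1e-9)  # guard zero variance

# 90 days of daily completeness for a critical field (fraction of non-null values).
history = [0.985 + 0.002 * (i % 3 - 1) for i in range(90)]  # oscillates 0.983-0.987

print(drift_flag(history, today=0.986))  # False: within normal variance
print(drift_flag(history, today=0.91))   # True: a drop worth an exception record
```

Wiring the `True` branch to issue management (ticket, owner, SLA) is what turns a metric into a control with evidence, rather than a dashboard nobody defends in an audit.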
Conclusion: Trust starts with demonstrated, trusted data
Data management is the foundation of marketplace leadership, trust, and generative AI. Reputation doesn’t recover on promises alone. Instead, trust returns when an organization can show consistent documentation, governed controls, monitored quality, and an audit-ready evidence trail.
Strong governance and observability stop being back-office hygiene and become a public promise you can defend. A governance workspace operationalizes trust by centralizing:
- Policy registry: standards, definitions, approvals, review cadence.
- Control library: “what we test,” mapped to regulations and internal risk.
- Profiling results: dashboards by domain/system/table, trending over time.
- Issue management: tickets, owners, remediation SLAs, and root-cause tags.
- Lineage views: source → transformations → downstream consumers.
- Audit evidence vault: artifacts, logs, approvals, and “why” narratives.
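The lineage view in that list is ultimately just a graph the workspace can store, render, and export. Below is a minimal sketch with hypothetical node names; a real platform would attach timestamps, owners, and transform logic to each edge.

```python
from collections import defaultdict

# Illustrative lineage edges: source -> transforms -> downstream consumers.
# Node names are hypothetical examples, not real systems.
edges = [
    ("claims_feed", "normalize"),
    ("normalize", "risk_score_calc"),
    ("risk_score_calc", "cms_submission"),
    ("risk_score_calc", "exec_scorecard"),
]

def downstream(node: str, edges: list) -> list:
    """All consumers reachable from `node`, in breadth-first discovery order."""
    graph = defaultdict(list)
    for src, dst in edges:
        graph[src].append(dst)
    seen, queue, out = set(), [node], []
    while queue:
        for nxt in graph[queue.pop(0)]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
                out.append(nxt)
    return out

print(downstream("normalize", edges))
# ['risk_score_calc', 'cms_submission', 'exec_scorecard']
```

Queried in reverse, the same graph answers the impact question auditors actually ask: if this upstream field is wrong, which submissions and scorecards inherited the defect?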
Other UnitedHealth Data Quality Management Resources
- 7 Ways Data Interoperability Improves Healthcare
- Application impact analysis: a risk-based approach to business continuity and disaster recovery – PubMed
- Association of Artificial Intelligence (AI) and Robotic Process Automation (RPA) | Groups | LinkedIn
- Broad’s AI COVID-19 Solutions
- Data Quality in Health Research: Integrative Literature Review – PMC
- Enterprise Global Cyber Fraud Prevention- Methods: Detection & Mitigation, & IS Best Practices
- FedRAMP – Glossary | CSRC
- Getting Ahead of Global Regulations
- Glossary of Health Coverage and Medical Terms
- Governance, Risk, and Compliance (GRC)
- GRC Industry Reference Matrix
- Healthcare Compliance Simplified Framework
- HealthCare.gov glossary
- Healthcare Cybersecurity – Heal Security Inc.
- HL7 Standards | HL7 International
- Humanizing Health: Elevate Respect
- IDC Saudi Arabia CIO Summit 2023
- KAUST: AI-Healthcare Innovation
- Measuring Quality | Stanford Health Care
- Middle East’s Top CIO50 Innovation Leaders. #7 is the most visionary Healthcare CIO
- SaaS Compliance Frameworks
- Security and IT Glossary
- Transforming Healthcare Software Catalogs