Data sharing is infrastructure, not compliance

Data portability is having a policy moment. Across the UK, Australia, the EU and Singapore, governments are converging on a familiar promise: make data portable, standardised, and usable so that markets become easier to compare, easier to switch within, and easier to innovate in.

But portability is often the wrong frame.


Portability is a mechanism. What governments are actually trying to build is a governed data-sharing capability: standardised meaning, trusted access, clear accountability, and a marketplace of intermediaries and services that can use data safely.


At Friday Initiatives, we see this pattern across every data regime: data legislation is rarely about data. It’s about four recurring policy roots: autonomy, fairness, competition/innovation, and security/sovereignty. Rights are the lever. Outcomes are the prize.


And here’s the uncomfortable truth: most portability and sharing regimes will underdeliver unless we treat data readiness as infrastructure, not compliance.


The real blocker: most industries cannot share what they cannot define, clean, or value


This is the practical failure mode policymakers underestimate most:

  • No infrastructure: no interoperable rails, no shared identifiers, no implemented controls

  • No clean data: inconsistent semantics, missing metadata, weak quality, no lineage (see the sketch after this list)

  • No awareness of use cases: organisations can’t see the “why,” so they don’t invest

  • No valuation: if data value isn’t measurable, capital allocation stays low

  • Talent concentration: the capability sits in Big Tech, not in national industries or SMEs
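
To make the “no clean data” point concrete, here is a minimal sketch in Python, with illustrative field names, of what an ecosystem-ready record could look like: every value travels with its meaning, its origin, and its capture time, and anything missing that metadata is rejected before it is shared.

from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class SharedRecord:
    value: float
    unit: str                # semantics: what the number means
    definition_uri: str      # semantics: link to the agreed definition
    source_system: str       # lineage: where the value originated
    extracted_at: datetime   # lineage: when it was captured

def is_shareable(record: SharedRecord) -> bool:
    # Reject anything that lacks the metadata an ecosystem needs to trust it.
    return all([
        record.unit,                                   # no bare numbers
        record.definition_uri.startswith("https://"),  # definition is resolvable
        record.source_system,                          # origin is traceable
        record.extracted_at.tzinfo is not None,        # timestamps are unambiguous
    ])

record = SharedRecord(
    value=1250.0,
    unit="kWh",
    definition_uri="https://example.org/defs/monthly-consumption",  # hypothetical
    source_system="billing-core",
    extracted_at=datetime.now(timezone.utc),
)
assert is_shareable(record)

Nothing here is sophisticated engineering; the gap in most industries is that these fields do not exist at all.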


So governments announce portability and sharing frameworks, but the economy doesn’t have the data capability layer required to deliver the outcomes. The result is predictable: low adoption, shallow interoperability, and limited innovation.


And there’s a further risk that rarely gets said out loud: if industries cannot build the capability layer, they outsource it to whoever can. That usually means dominant platforms, and it directly undermines the sovereignty objective.


The consent problem: checkbox portability will not deliver the policy outcome


The problem is not consent in the abstract. The problem is consent-as-infrastructure: using consumer prompts and click-through journeys as the primary control mechanism for ecosystem-scale data sharing.


Consent-led models are widely understood to underperform. They lead to fatigue, information asymmetry, technical complexity, and increased legal and operational risk—without reliably producing safer outcomes.


If governments believe “consumer choice” is the engine, they’ll keep building consent-heavy experiences that look compliant but fail to deliver real switching or innovation.


Consent becomes:

  • a UX tax

  • a legal fiction (“informed” only in theory)

  • a substitute for real governance


The alternative is not “no accountability”. It is the opposite: purpose-led sharing with enforceable controls—clear purpose limits, auditability, liability, and trust mechanisms. In many contexts, that means relying on appropriate lawful bases (including legitimate interests where it applies), backed by safeguards that are real in operations, not just in policy.
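
As a minimal sketch of what purpose-led sharing can look like in practice, assume a hypothetical registry of permitted purposes per dataset and an append-only audit log. The point is that the purpose limit is enforced by the infrastructure rather than by a consent pop-up, and every decision, including denials, leaves evidence:

import json
from datetime import datetime, timezone

# Purpose limits are set by governance, not by a consent prompt.
PERMITTED_PURPOSES = {
    "account_transactions": {"switching_comparison", "affordability_check"},
}

AUDIT_LOG = []  # stand-in for an append-only, independently reviewable store

def request_access(dataset: str, requester: str, purpose: str) -> bool:
    # Grant access only for a registered purpose, and record every decision.
    granted = purpose in PERMITTED_PURPOSES.get(dataset, set())
    AUDIT_LOG.append(json.dumps({
        "ts": datetime.now(timezone.utc).isoformat(),
        "dataset": dataset,
        "requester": requester,
        "purpose": purpose,
        "granted": granted,  # denials are evidence too
    }))
    return granted

assert request_access("account_transactions", "comparison-service", "switching_comparison")
assert not request_access("account_transactions", "ad-network", "profiling")

The log, not the click, is what an auditor or regulator can actually test.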


Without that governance layer—standards, incentives, accountability, purpose constraints, auditability—nothing scales.


Or worse: industries hand the capability to Big Tech, the only players with the infrastructure and talent to run it, and the sovereignty objective collapses from the start.


What infrastructure actually means in a data regime


When we say data readiness is infrastructure, we’re not talking about a one-off standards deliverable or a compliance checklist.


We mean the operational capability that makes data sharing real: the definitions, controls, and execution patterns that allow data to move across an ecosystem with trust and accountability intact.


In practice, that looks like:

  • Semantic interoperability (not just APIs). Standards must include meaning: definitions, categories, quality thresholds, and governance (see the sketch after this list).

  • Trust architecture baked in. Accreditation (or equivalent), auditability, clear liability, enforceable safeguards.

  • Execution layers for switching. Comparison must become action—otherwise switching remains theoretical.

  • Shared utilities to stop the “Big Tech gravity” problem. Sector utilities, reference implementations, interoperable tooling—so SMEs can participate without rebuilding everything.

  • A valuation and use-case discipline. Industries invest when value is measurable. You need frameworks that connect data sharing to outcomes (revenue, efficiency, decision quality, risk mitigation, public value).
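
To illustrate the first item, here is a minimal sketch of a field-level data contract, with hypothetical names drawn from an energy-tariff example: each field carries an agreed definition, a quality threshold, and an accountable steward alongside the technical type where most API specs stop.

from dataclasses import dataclass

@dataclass(frozen=True)
class FieldContract:
    name: str
    dtype: str               # technical shape (where most API specs stop)
    definition: str          # agreed meaning across the ecosystem
    category: str            # e.g. identifier, measure, attribute
    min_completeness: float  # quality threshold the publisher must meet
    steward: str             # governance: who is accountable for this field

TARIFF_CONTRACT = [
    FieldContract("meter_id", "string",
                  "Regulator-issued meter point identifier",
                  "identifier", 1.00, "industry-registry"),
    FieldContract("unit_rate_per_kwh", "decimal",
                  "Price per kWh including tax",
                  "measure", 0.99, "supplier-data-office"),
]

def conforms(contract: list[FieldContract], completeness: dict[str, float]) -> bool:
    # A publisher's feed is conformant only if every field meets its threshold.
    return all(completeness.get(f.name, 0.0) >= f.min_completeness for f in contract)

assert conforms(TARIFF_CONTRACT, {"meter_id": 1.0, "unit_rate_per_kwh": 0.995})

None of this is exotic engineering; the hard part is agreeing the definitions and thresholds across an industry, which is exactly why it is infrastructure.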


This is why Friday’s internal premise matters: data is never neutral—it exists within purpose, context, and consequence. If you don’t operationalise purpose and context, portability just increases risk and friction.


Our takeaway


Most portability regimes will fail for the same reason: they treat data sharing like a compliance obligation, not a capability layer.


But the endgame governments are aiming for is not “download your data”. It is a functioning market system, built on standards, controls, incentives, accountability, and real operational infrastructure.


That is what portability is trying to force into existence.


And it is why portability is the headline, but governed data sharing is the endgame.

 
 
 
