
SHIELD — H: Harden

Hardening is proactive — it reduces the attack surface before threats materialise.

TL;DR

Classify data first, then protect it proportionally. Design DLP policies by tier — not one policy for all environments. Apply row-level and column-level security on sensitive Dataverse data. Store secrets in Azure Key Vault, never in solutions. Encryption is default; residency must be configured. Hardening is ongoing, not one-time.

Applies To

Audience: Security Architect · Data Protection Officer · Platform Admin
Character: Ongoing
Frameworks: SHIELD · DIALOGE (Data — Dataverse security) · BOLT (Guardrails — DLP)

SHIELD Harden and SCALE-OPS Containment

SHIELD Harden defines the security rationale for DLP policies and data protection controls. SCALE-OPS Containment covers the deployment and management of those policies across environments. Both are required — design without deployment is a policy document; deployment without design is an accident.


What Harden Means in SHIELD

Harden covers the controls that protect data at rest, in transit, and in use — ensuring that sensitive and regulated data is classified, protected, and governed before it is ever exposed to a threat. Unlike Defend, which responds to threats after they emerge, Harden eliminates the conditions that make threats possible.

Power Platform's reach across the enterprise data estate makes Harden particularly important. A single licensed user can connect to HR systems, financial databases, CRM records, and external services — all within a single canvas app or cloud flow. Without deliberate data hardening, that connectivity becomes data exposure.

Hardening is not a one-time configuration exercise. The data landscape changes — new tables are created, new connectors are added, new regulations come into effect, and new solutions are built that touch data in ways not previously anticipated. Harden is a continuous posture, reviewed and strengthened as the platform evolves.


Why Harden Decisions Matter

Data protection failures in Power Platform typically follow one of three patterns:

Classification failures — sensitive data exists in the platform but has never been identified as sensitive. A Dataverse table containing salary information has the same security role configuration as a table containing product catalogue data. Nobody made an explicit decision about this; it was simply never addressed.

DLP failures — a connector that accesses sensitive internal data is in the same DLP bucket as a connector that posts to a public social media platform. A maker builds a flow that extracts HR records and sends them to an unapproved external service. The DLP policy that would have blocked this was never designed for this combination.

Access control failures — sensitive columns are visible to every user who has read access to the table. Row-level security was never configured. A user who should only see their own region's data can see all regions' data because the security model was never scoped.

All three are preventable with deliberate data hardening — classification first, then protection calibrated to the classification.


The Core Questions Harden Answers

  • Is sensitive and regulated data identified and classified across the platform?
  • Are DLP policies designed, tiered, and enforced — not just applied as defaults?
  • Is access to sensitive data restricted at the row and column level?
  • Is data encrypted at rest and in transit?
  • Is data residency configured for regulated workloads?
  • Are secrets, API keys, and credentials stored securely — not embedded in solutions?
  • Does the organisation understand its privacy obligations and has it met them?

Data Classification — The Foundation of Harden

Data classification is the prerequisite for every other Harden control. You cannot apply appropriate protection to data you have not classified. Classification answers the question: how sensitive is this data, and what protection does it require?

Classification tiers:

| Tier | Description | Examples |
|---|---|---|
| Public | Information that can be freely shared externally | Product catalogues, published pricing, public website content |
| Internal | Information for internal use — not sensitive but not for external sharing | Internal policies, meeting notes, project documentation |
| Confidential | Sensitive business information requiring controlled access | Financial forecasts, M&A activity, strategic plans, personnel decisions |
| Regulated | Data subject to legal or regulatory protection requirements | PII, health data, financial account data, payment card data |

Classification in Power Platform:

Classification is applied at the Dataverse table level — every table containing data above the Internal tier should have its classification documented and its access controls calibrated accordingly. Microsoft Purview extends classification to the data content itself — sensitivity labels can be applied to Dataverse records and columns based on the data they contain.

The classification process: Before any enterprise solution goes live, every Dataverse table it creates or accesses should be classified. This is not a retrospective exercise — classification embedded in the solution design process costs a fraction of what classification remediation costs after production deployment.
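To make "protection calibrated to the classification" concrete, the tier model can be expressed as a lookup from tier to minimum required controls. This is a minimal sketch: the tier names come from the table above, but the control identifiers (`row_level_security`, `residency_verified`, and so on) and the mapping itself are illustrative assumptions, not platform features.

```python
from enum import IntEnum

class Tier(IntEnum):
    """Classification tiers in ascending sensitivity."""
    PUBLIC = 0
    INTERNAL = 1
    CONFIDENTIAL = 2
    REGULATED = 3

# Illustrative mapping from tier to the minimum Harden controls it demands.
REQUIRED_CONTROLS = {
    Tier.PUBLIC: set(),
    Tier.INTERNAL: {"security_roles"},
    Tier.CONFIDENTIAL: {"security_roles", "row_level_security"},
    Tier.REGULATED: {"security_roles", "row_level_security",
                     "column_level_security", "residency_verified"},
}

def missing_controls(tier: Tier, applied: set) -> set:
    """Return the controls a table still lacks for its classification."""
    return REQUIRED_CONTROLS[tier] - applied
```

A Regulated table that only has security roles configured would be flagged for row-level security, column-level security, and residency verification — exactly the gap analysis the classification process is meant to drive.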


Data Loss Prevention Policies — The Primary Technical Guardrail

DLP policies are Power Platform's primary mechanism for governing what connectors can be used together — preventing data from flowing between systems in ways that were not authorised.

How DLP Policies Work

DLP policies classify connectors into three buckets:

  • Business — connectors approved for use with business data. Connectors in the Business bucket can only interact with other Business bucket connectors within the same app or flow.
  • Non-Business — connectors not approved for business data. They can be used, but not in combination with Business bucket connectors.
  • Blocked — connectors that cannot be used in this environment at all.

A connector in the Business bucket and a connector in the Non-Business bucket cannot be used in the same app or flow. This prevents a flow from reading sensitive data from an approved internal system (Business) and sending it to an unapproved external service (Non-Business).
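The bucket rules above can be sketched as a simple validation: a connector set is rejected if it includes a Blocked connector or mixes Business with Non-Business. The connector names, their bucket assignments, and the default-bucket assumption here are hypothetical examples; real policies are evaluated by the platform itself, not by custom code.

```python
from enum import Enum

class Bucket(Enum):
    BUSINESS = "Business"
    NON_BUSINESS = "Non-Business"
    BLOCKED = "Blocked"

# Hypothetical bucket assignments under one DLP policy.
POLICY = {
    "dataverse": Bucket.BUSINESS,
    "sharepoint": Bucket.BUSINESS,
    "socialpost": Bucket.NON_BUSINESS,   # stand-in for a social connector
    "unapproved-saas": Bucket.BLOCKED,
}

def violations(connectors):
    """Reasons a single app or flow's connector set breaks the policy."""
    problems = []
    used_buckets = set()
    for name in connectors:
        # Unlisted connectors fall into the policy's default bucket
        # (assumed to be Non-Business in this sketch).
        bucket = POLICY.get(name, Bucket.NON_BUSINESS)
        if bucket is Bucket.BLOCKED:
            problems.append(f"{name}: blocked in this environment")
        else:
            used_buckets.add(bucket)
    if {Bucket.BUSINESS, Bucket.NON_BUSINESS} <= used_buckets:
        problems.append("mixes Business and Non-Business connectors")
    return problems
```

Two Business connectors pass; pairing a Business connector with a Non-Business one is flagged, which is precisely the HR-records-to-external-service pattern described earlier.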

DLP Policy Design Principles

Tier policies to environment risk: DLP policies should become progressively more restrictive as environments move toward production and as the sensitivity of data they contain increases. A Developer personal environment can be permissive — makers need the freedom to experiment. The Default Environment should be restrictive — it is open to all licensed users and cannot be deleted. Production environments should be tightly controlled.

| Environment Type | DLP Posture |
|---|---|
| Developer (personal) | Permissive — broad connector access for experimentation |
| Shared Development | Moderate — approved connectors for the solution being built |
| Test / UAT | Mirrors production — makers test against real DLP constraints |
| Production | Restrictive — only approved connectors; premium and custom connectors explicitly governed |
| Default Environment | Restrictive — block premium connectors, no production workloads |

Document the rationale, not just the policy: Every DLP policy should have a documented business rationale — why specific connectors are in their assigned buckets, why certain connectors are blocked, and what the process is for requesting exceptions. When a maker asks why they cannot use a specific connector, the answer should be immediately available and coherent — not "that's just how it's configured."

Tenant-level vs environment-level policies: Tenant-level DLP policies apply to all environments and cannot be overridden by environment-level policies. Environment-level policies can be more restrictive than tenant-level policies but not more permissive. The tenant-level policy sets the floor; environment-level policies can raise it further.

For regulated organisations, the most sensitive restrictions — blocking specific connectors entirely across the tenant, or preventing data from leaving the Microsoft cloud — should be implemented at the tenant level to ensure they cannot be bypassed by environment-level changes.
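The floor-and-raise relationship can be illustrated by taking the stricter of the tenant-level and environment-level classifications for a connector. This is a simplification for intuition — actual DLP evaluation combines all applicable policies, with the strictest outcome winning — and the restrictiveness ordering is an assumption of the sketch.

```python
# Assumed ordering: higher value = more restrictive.
RESTRICTIVENESS = {"Business": 0, "Non-Business": 1, "Blocked": 2}

def effective_bucket(tenant_bucket: str, env_bucket: str) -> str:
    """The stricter of the tenant floor and the environment overlay wins."""
    return max(tenant_bucket, env_bucket, key=RESTRICTIVENESS.__getitem__)
```

A connector blocked at the tenant level stays blocked regardless of the environment policy; an environment can tighten Business to Non-Business, but never loosen the tenant floor.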

The Default Environment — A Special Case

The Default Environment is open to every licensed user in the tenant. It cannot be deleted. It should never contain production workloads. It requires the most restrictive DLP policy in the tenant:

  • Block all premium connectors in the Default Environment
  • Block custom connectors in the Default Environment
  • Permit only standard connectors relevant to personal productivity (SharePoint, Teams, Outlook, OneDrive)
  • Monitor the Default Environment closely via CoE tooling — it is the most ungoverned surface in the tenant

Dataverse Security — Row-Level and Column-Level Protection

Dataverse's security model provides granular data access controls that operate independently of which application or interface is used to access the data. Unlike DLP policies, which govern connector interactions, Dataverse security governs access to the data itself.

Row-Level Security

Row-level security controls which records a user can see within a table — independent of whether they have general read access to the table. It is implemented through the security role scope model:

  • User scope — the user can only see records they own
  • Business Unit scope — the user can see records owned by users in their business unit
  • Parent: Child Business Units scope — the user can see records in their business unit and all child business units
  • Organisation scope — the user can see all records in the environment

The enterprise implication: Row-level security is the difference between a regional sales manager seeing all customer records in the organisation versus seeing only their region's records. For solutions where different users should see different subsets of data — by region, by team, by sensitivity, by business unit — row-level security must be designed explicitly. It does not happen by default.
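The four scopes can be sketched as a read-privilege check against a business-unit hierarchy. The hierarchy, user names, and scope strings below are hypothetical; Dataverse evaluates this natively through security roles, so the code is a mental model, not an implementation.

```python
from dataclasses import dataclass

@dataclass
class Record:
    owner: str
    business_unit: str

# Hypothetical hierarchy: child business unit -> parent (None at the root).
PARENT_BU = {"emea-sales": "sales", "apac-sales": "sales", "sales": None}

def in_subtree(bu, ancestor):
    """True if bu is the ancestor itself or one of its descendants."""
    while bu is not None:
        if bu == ancestor:
            return True
        bu = PARENT_BU.get(bu)
    return False

def can_read(user: str, user_bu: str, scope: str, record: Record) -> bool:
    """Sketch of the four Dataverse read-privilege scopes."""
    if scope == "organisation":
        return True
    if scope == "parent_child":
        return in_subtree(record.business_unit, user_bu)
    if scope == "business_unit":
        return record.business_unit == user_bu
    return record.owner == user  # "user" scope
```

A manager scoped to Parent: Child Business Units at `sales` can read `emea-sales` records; a user scoped to User cannot read a colleague's records even in the same business unit.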

Column-Level Security

Column-level security restricts access to specific columns within a table — independent of the user's broader table access. A user with read access to a Contact table may still be restricted from seeing salary, social security number, or health-related columns.

Column-level security is implemented through column security profiles — separate from security roles — and must be assigned explicitly to users or teams.

When column-level security is required:

| Data Type | Column-Level Security Required |
|---|---|
| Salary and compensation data | Always |
| Social security / national ID numbers | Always |
| Health and medical information | Always |
| Payment card data | Always |
| Date of birth | Typically |
| Home address | Typically |
| Personal email and phone | Context-dependent |

The test: if a data breach involving this column would require regulatory notification or cause material harm to the individuals whose data it contains — column-level security is required.
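Conceptually, column-level security acts as a projection applied on every read: secured columns are stripped unless a column security profile grants them. A minimal sketch, with hypothetical column names — the platform enforces this itself, independent of which app or API performs the read.

```python
# Hypothetical set of columns protected by column security profiles.
SECURED_COLUMNS = {"salary", "national_id"}

def project(record: dict, granted: set) -> dict:
    """Strip secured columns unless the reader's profiles grant them."""
    return {
        col: val for col, val in record.items()
        if col not in SECURED_COLUMNS or col in granted
    }

contact = {"name": "A. Example", "salary": 90000, "national_id": "XX-1234"}
```

A reader with table access but no column security profile sees only `name`; granting a profile for `salary` reveals that column and nothing more.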

Business Units

Business units create an organisational hierarchy within a Dataverse environment — the primary mechanism for isolating data between organisational groups. Users belong to a business unit; records are owned by users in a business unit; security role scopes operate against business unit membership.

For organisations with complex data isolation requirements — different subsidiaries, regulated and non-regulated data in the same environment, regional isolation requirements — business unit structure must be designed as part of the security architecture, not retrofitted after data has accumulated.


Encryption

Data at Rest

All data stored in Dataverse is encrypted at rest using Microsoft-managed encryption keys by default. This encryption applies to all Dataverse database, file, and log storage — providing baseline protection without any configuration requirement.

Customer-managed keys: For organisations with the highest sensitivity requirements — regulated industries, organisations with specific key management obligations — Dataverse supports customer-managed encryption keys (CMK) via Azure Key Vault. CMK gives the organisation control over the encryption keys protecting their Dataverse data, enabling key rotation, key revocation, and independent key management outside Microsoft's infrastructure.

CMK is appropriate for organisations that:

  • Have regulatory requirements for key ownership and management
  • Need the ability to revoke Microsoft's access to their data by revoking the encryption key
  • Are subject to bring-your-own-key (BYOK) compliance requirements

CMK introduces operational complexity — key management, rotation schedules, access governance for the Key Vault — that must be managed deliberately. Enabling CMK without the operational processes to manage it creates a new risk: loss of access to data if keys are mismanaged.

Data in Transit

All data transmitted between Power Platform services and between Power Platform and connectors is encrypted in transit using TLS 1.2 or higher. This is enforced by the platform and does not require configuration.

For custom connectors — connectors built by the organisation to access internal APIs — the API endpoint must enforce HTTPS. HTTP endpoints are not acceptable for custom connectors accessing sensitive data.
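A pre-deployment review script can enforce this mechanically by rejecting any custom connector endpoint whose scheme is not `https`. A minimal sketch; the URL is a placeholder, and a real review would sit alongside the organisation's connector approval process rather than replace it.

```python
from urllib.parse import urlparse

def endpoint_allowed(url: str) -> bool:
    """Reject custom-connector endpoints that do not enforce HTTPS."""
    return urlparse(url).scheme == "https"
```

Anything other than an `https` endpoint — plain HTTP, or a non-HTTP scheme — fails the check.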


Data Residency

Data residency defines where data is stored geographically — a critical requirement for regulated industries and organisations operating under data sovereignty legislation.

Dataverse Data Residency

Dataverse environments are provisioned in a specific geographic region — the region is selected at environment creation and determines where data is stored. For regulated workloads:

  • Verify the environment is provisioned in the required region before any data is created
  • Understand that some platform features — AI Builder processing, certain analytical capabilities — may process data outside the primary region
  • Review Microsoft's Power Platform data residency documentation for the specific services in use and their data handling locations

Cross-Boundary Data Flows

DLP policies govern which connectors can be used — but they do not govern where connected systems store their data. A flow that passes Dataverse data to an external SaaS connector may be sending that data to a server in a different geographic region.

For regulated workloads with strict data residency requirements:

  • Assess every external connector for the geographic location of its data processing and storage
  • Use VNet integration to keep Power Platform traffic within the organisational network perimeter
  • Consider restricting certain external connectors in regulated environments regardless of DLP bucket classification


Privacy Compliance

Power Platform solutions that process personal data — names, email addresses, locations, identification numbers, health information — carry privacy obligations under GDPR, CCPA, and other applicable privacy regulations.

GDPR and Personal Data

For organisations subject to GDPR, every Power Platform solution that processes personal data must have:

  • Lawful basis for processing — the organisation must have a documented legal basis for processing the personal data the solution handles
  • Data minimisation — the solution should collect only the personal data necessary for its stated purpose
  • Retention limits — personal data should not be retained beyond the period necessary for its purpose
  • Data subject rights — the organisation must be able to respond to data subject access requests, erasure requests, and portability requests for data held in Dataverse

The CoE Starter Kit's compliance components support the documentation of data handling for each solution — capturing business justification, data classification, and support arrangements as part of the maker attestation process.

Sensitive Data Governance

Beyond formal regulatory requirements, organisations should establish a sensitive data governance policy for Power Platform:

  • Which data classifications require DPO review before a solution is deployed?
  • What is the process for handling a data subject access request for data held in Dataverse?
  • What retention and deletion controls are in place for personal data?
  • How are data breaches involving Power Platform data detected and reported?

Secrets Management — Azure Key Vault Integration

Custom connectors, Power Automate HTTP actions, and integrations with external systems frequently require API keys, client secrets, and connection strings. These credentials must never be embedded in solution components or environment variable values — they belong in Azure Key Vault.

Environment variables of type Secret: Power Platform environment variables support a Secret type — where the value is a reference to an Azure Key Vault secret rather than a value stored in Dataverse. The platform retrieves the secret at runtime; the credential is never stored in the solution or visible in the Admin Center.

Why Key Vault integration matters:

  • Secrets stored in Key Vault are rotated without solution changes — the Key Vault reference remains valid
  • Access to Key Vault secrets is governed and audited independently
  • Key Vault secrets can be revoked immediately if credentials are compromised
  • Secrets are never visible in solution exports, pipeline artefacts, or source control

The rule: Every credential, API key, and connection string used by a Power Platform solution in a production environment must be stored in Azure Key Vault and referenced via environment variable. No exceptions.
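The pattern is indirection: the solution stores a reference (vault name plus secret name), and the credential is fetched at runtime. A minimal sketch with a stand-in vault client — `SecretReference`, `FakeVault`, and all names used are illustrative, not the platform's or Azure SDK's actual API.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class SecretReference:
    """What the solution stores: a pointer, never the credential."""
    vault: str
    secret_name: str

class FakeVault:
    """Stand-in for a Key Vault client, to keep the sketch self-contained."""
    def __init__(self, secrets):
        self._secrets = secrets
    def get_secret(self, vault: str, name: str) -> str:
        return self._secrets[(vault, name)]

def resolve(ref: SecretReference, client) -> str:
    """Fetch the credential at runtime; it never lands in the solution."""
    return client.get_secret(ref.vault, ref.secret_name)

ref = SecretReference("contoso-kv", "erp-api-key")
client = FakeVault({("contoso-kv", "erp-api-key"): "s3cr3t-value"})
```

Exporting the solution carries only the reference; rotating or revoking the secret in the vault takes effect without touching the solution at all.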


Maturity Levels

| Level | Description |
|---|---|
| Basic | DLP policy applied to all environments including Default. Basic data classification for obvious sensitive tables. Encryption at rest via Microsoft-managed keys. No custom connectors in production without review. |
| Intermediate | Data classification documented for all Dataverse tables in production. DLP policies tiered by environment risk with documented rationale. Row-level security implemented for all tables with access scope requirements. Column-level security implemented for regulated fields. Secrets in Key Vault for all production integrations. Data residency verified for regulated workloads. |
| Advanced | Full classification inventory maintained. DLP policies reviewed quarterly. Microsoft Purview sensitivity labels applied. Customer-managed keys for highest-sensitivity environments. Privacy compliance framework documented. Automated data classification scanning. Column-level security coverage verified regularly. GDPR data subject rights process defined and tested. |

Safe Zone

Solutions using only public or internal data, standard Microsoft connectors, and no regulated personal data can operate at Basic maturity.

Any solution that meets one or more of the following must reach Intermediate or Advanced maturity before Go-Live:

  • Processes personally identifiable information (PII)
  • Processes financial, health, legal, or other regulated data
  • Connects to external services via custom or premium connectors
  • Is subject to GDPR, HIPAA, SOC2, or other compliance frameworks
  • Contains salary, compensation, or sensitive HR data
  • Is accessible by external or guest users


Common Mistakes

  • DLP policies applied without rationale — policies exist but nobody can explain why a connector is in its assigned bucket. When an exception is requested, there is no baseline to evaluate it against.
  • No data classification — all Dataverse tables treated equally regardless of the sensitivity of the data they contain. Security controls applied uniformly at the lowest common denominator.
  • Row-level security never designed — all users with table read access can see all records. A regional salesperson can see every customer across every region.
  • Column-level security overlooked — sensitive fields (salary, national ID, health data) visible to every user who has access to the table, because column-level security was never configured.
  • Secrets embedded in solutions — API keys and connection strings stored as plain text environment variable values, visible in solution exports and source control.
  • Default Environment DLP neglected — the most widely accessible environment in the tenant has no restrictive DLP policy. Any licensed user can build flows that exfiltrate data.
  • Data residency assumed, not verified — regulated workloads deployed without verifying the environment's geographic region. Data resides in a jurisdiction that does not meet regulatory requirements.
  • Privacy compliance undocumented — solutions process personal data without a documented lawful basis, retention policy, or data subject rights process. Discovered during a regulatory audit.
  • CMK without operational processes — customer-managed keys enabled without key rotation schedules, access governance for Key Vault, or a plan for what happens if the key is compromised or lost.

Readiness Checklist

Data Classification

- [ ] Classification tiers defined for the organisation (Public, Internal, Confidential, Regulated or equivalent)
- [ ] All Dataverse tables in production classified
- [ ] Microsoft Purview sensitivity labels configured for regulated data (where applicable)
- [ ] Classification reviewed when new tables are added to production

DLP Policies

- [ ] DLP policy applied to every environment, including the Default Environment
- [ ] Default Environment policy blocks premium and custom connectors
- [ ] DLP policies tiered by environment risk — progressively more restrictive toward production
- [ ] Business rationale documented for every connector classification decision
- [ ] Tenant-level DLP policy in place for organisation-wide restrictions
- [ ] DLP policy exception process defined and governed

Dataverse Security

- [ ] Security roles designed by job function — least-privilege principle applied
- [ ] Row-level security scope defined for all tables with access requirements
- [ ] Column-level security profiles created for all regulated and sensitive fields
- [ ] Business unit structure designed where organisational data isolation is required
- [ ] Security model reviewed when user responsibilities change

Encryption and Secrets

- [ ] All production integration credentials stored in Azure Key Vault
- [ ] Environment variables of type Secret used — no plain-text credentials
- [ ] Customer-managed keys assessed for highest-sensitivity environments
- [ ] Custom connector API endpoints enforce HTTPS

Data Residency

- [ ] Production environment geographic region verified against residency requirements
- [ ] Cross-boundary data flows assessed for regulated connectors
- [ ] VNet integration assessed for regulated workloads

Privacy Compliance

- [ ] Personal data inventory — all Dataverse tables containing PII identified
- [ ] Lawful basis for processing documented for each personal data use case
- [ ] Data retention policy defined and implemented
- [ ] Data subject rights process defined — access, erasure, portability
- [ ] GDPR / CCPA / applicable regulation compliance assessed and documented


Part of the SHIELD Framework — powerplatform.wiki Last updated: March 2026 Last reviewed: March 2026