Dynamics 365 Finance & Operations now offers a robust archiving capability that moves historical, inactive data out of live tables while keeping it accessible for audits and analytics.
Here’s what you’ll find in this guide:
- How the archiving process works
- Which data types are supported today
- How to safely include your custom fields and tables
If you’re still deciding whether archiving is the right next step, start with our blog: “D365 F&O Database Optimization”. It covers practical clean-up techniques and explains how they fit into a broader data lifecycle strategy, which is perfect context before diving into Dataverse-based retention.
Why Archive?
- Performance: Reduce row counts in large tables to speed up queries and batch processing.
- Cost & Operations: Smaller databases mean lower capacity usage, faster backups/restores, and smoother refreshes and upgrades.
- Compliance: Retain historical data securely for audits and regulatory requirements without bloating your live system.
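If you are unsure where archiving would pay off first, a quick row-count survey of the largest tables is a reasonable starting point. This is only an illustrative sketch, assuming you can run ad-hoc SQL against a sandbox copy of the database; it is not part of the archiving feature itself.
-- Illustrative: find the largest tables to see where archiving would have the most impact
SELECT TOP (20)
    t.name AS TableName,
    SUM(p.rows) AS RowCnt
FROM sys.tables t
JOIN sys.partitions p
    ON p.object_id = t.object_id
    AND p.index_id IN (0, 1) -- heap or clustered index only, avoids double counting
GROUP BY t.name
ORDER BY RowCnt DESC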
The Archiving Lifecycle in D365 F&O
- Criteria & Marking – Choose the functional scenario and define archival rules (e.g., completed or closed status older than X months).
- Replication to Dataverse – Copy data from live tables to long-term storage managed via Dataverse.
- Reconciliation – Verify that all marked rows are present in long-term retention (an illustrative check follows this list).
- Move & Delete – Transfer validated rows to history tables and remove them from live tables to reduce operational size.
- Access & Restore – Use dedicated inquiry pages to view history, restore data if needed, or purge when permitted.
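To make the reconciliation step concrete, here is a rough, illustrative check, assuming the general ledger scenario and ad-hoc SQL access in a sandbox: it shows how live rows are distributed across SysDataStateCode values next to the row count already present in the history table. The individual state-code values are managed by the framework, so treat this as the shape of a sanity check, not an official reconciliation query.
-- Illustrative only: archive data-state distribution in the live table vs. rows in the history table
SELECT 'Live' AS Source, SYSDATASTATECODE AS DataState, COUNT(*) AS RowCnt
FROM GENERALJOURNALACCOUNTENTRY
GROUP BY SYSDATASTATECODE
UNION ALL
SELECT 'History', NULL, COUNT(*)
FROM GENERALJOURNALACCOUNTENTRYHISTORY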
What Can You Archive Today?
Initial support focuses on high-volume areas for maximum impact:
- Finance: General ledger, tax transactions
- Supply Chain: Inventory transactions, inventory journals, sales orders
Microsoft plans to add more scenarios in future releases (e.g., Commerce, Customer Invoice Data).
Working with Archived Data
- Read Access: History data remains queryable in F&O via dedicated inquiry pages.
- Analytics: Archived data in Dataverse can be accessed through Fabric or Synapse for reporting (see the example after this list).
- Deletion: When retention periods expire, purge data from long-term storage according to policy.
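For the analytics path, archived rows surface as tables in the lake that Fabric or Synapse can query with ordinary SQL. The sketch below is hypothetical: the database and table names are placeholders, and the actual names depend on how your Fabric/Synapse link and long-term retention setup are configured.
-- Hypothetical: read archived GL rows from the Synapse/Fabric SQL endpoint.
-- Replace [your_lake_database] and the table name with the objects exposed in your environment.
SELECT TOP (100) *
FROM [your_lake_database].[dbo].[generaljournalaccountentryhistory]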
Including Custom Tables and Fields – Must-Do Steps
Most real-world implementations include extensions, and the archiving
framework supports them. However, you need to make sure your schemas are
aligned for a smooth process:
1. Mirror Custom Fields in the History Table
The history table should include all columns from the live table. Exceptions: platform-managed columns like SysRowVersion and SysDataStateCode don’t need to be duplicated.
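As a quick spot check before running the full comparison scripts later in this post, you can confirm that a given custom column exists on both the live table and its history table. The field name below is a placeholder for your own extension field.
-- 'MYCUSTOMFIELD' is a placeholder; two result rows mean both tables carry the column
SELECT TABLE_NAME, COLUMN_NAME, DATA_TYPE
FROM INFORMATION_SCHEMA.COLUMNS
WHERE COLUMN_NAME = N'MYCUSTOMFIELD'
  AND TABLE_NAME IN (N'GENERALJOURNALACCOUNTENTRY', N'GENERALJOURNALACCOUNTENTRYHISTORY')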
2. Add Custom Fields to the BI (Virtual) Entity
Dataverse long-term retention relies on virtual/BI entities to interact
with F&O data.
- Add your custom fields to the relevant BI entity.
- Refresh the entity in Dataverse so the archiving process recognizes them.
- Important: Do not add relationships between BI entities.
If you’ve extended created/modified fields, follow the recommended
naming conventions in the BI entity.
Table field | BI entity
CreatedDateTime | CreatedOn
ModifiedDateTime | SysModifiedDateTime
CreatedBy | SysCreatedBy
ModifiedBy | SysModifiedBy
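A simple way to double-check the naming (illustrative, using the GL BI entity name that also appears in the validation scripts below) is to list the columns the BI entity exposes and confirm the CreatedOn/SysModified* names are present. The BI entity shows up in INFORMATION_SCHEMA just like the tables do, which is what the scripts below rely on.
-- List the columns exposed by the BI entity to verify the created/modified naming conventions
SELECT COLUMN_NAME
FROM INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_NAME = N'GeneraljournalaccountentryBiEntity'
ORDER BY COLUMN_NAME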
Validation Tip
Before going live, run schema comparison queries:
- Compare live vs. history tables
- Compare history vs. BI entity
This helps catch missing columns early and prevents archival job
failures or reconciliation mismatches.
Use the provided scripts to validate whether fields match the expected schema.
Live table vs. history table:
DECLARE @tmpDataTable TABLE (SrNo int, Scenario varchar(10), LiveTable varchar(100), HistoryTable varchar(100), BiEntity varchar(100))

INSERT INTO @tmpDataTable
VALUES (1, 'GL', 'GENERALJOURNALACCOUNTENTRY', 'GENERALJOURNALACCOUNTENTRYHISTORY', 'GeneraljournalaccountentryBiEntity')

SELECT * FROM @tmpDataTable

-- Select query to identify schema mismatches (live-table columns missing from the history table)
SELECT T.SrNo, T.Scenario, L.COLUMN_NAME, L.TABLE_NAME, T.HistoryTable, T.BiEntity --, *
FROM @tmpDataTable T
JOIN INFORMATION_SCHEMA.COLUMNS L
    ON T.LiveTable = L.TABLE_NAME
LEFT OUTER JOIN INFORMATION_SCHEMA.COLUMNS H
    ON L.COLUMN_NAME = H.COLUMN_NAME AND T.HistoryTable = H.TABLE_NAME
WHERE --L.TABLE_NAME IN (N'SALESTABLE') AND
    H.COLUMN_NAME IS NULL
    AND L.COLUMN_NAME NOT IN ('SYSDATASTATECODE', 'SYSROWVERSION')
ORDER BY T.SrNo
History table vs. BI entity:
DECLARE @tmpDataTable TABLE (SrNo int, Scenario varchar(10), LiveTable varchar(100), HistoryTable varchar(100), BiEntity varchar(100))

INSERT INTO @tmpDataTable
VALUES (1, 'GL', 'GENERALJOURNALACCOUNTENTRY', 'GENERALJOURNALACCOUNTENTRYHISTORY', 'GeneraljournalaccountentryBiEntity')

-- Select query to identify schema mismatches (history-table columns missing from the BI entity);
-- the driving side (L) is the history table, so the comparison matches this section's title
SELECT T.SrNo, T.Scenario, L.COLUMN_NAME, L.TABLE_NAME, T.HistoryTable, T.BiEntity --, *
FROM @tmpDataTable T
JOIN INFORMATION_SCHEMA.COLUMNS L
    ON T.HistoryTable = L.TABLE_NAME
LEFT OUTER JOIN INFORMATION_SCHEMA.COLUMNS H
    ON L.COLUMN_NAME = H.COLUMN_NAME AND T.BiEntity = H.TABLE_NAME
WHERE --L.TABLE_NAME IN (N'SALESTABLE') AND
    H.COLUMN_NAME IS NULL
    AND L.COLUMN_NAME NOT IN ('SYSDATASTATECODE', 'SYSROWVERSION')
ORDER BY T.SrNo, L.TABLE_NAME, L.COLUMN_NAME
3. Add New Tables to an Archive Scenario
If you need to include custom tables in an archive scenario, you can
extend the existing setup. These tables should relate either directly or
indirectly to the main live table.
What to do:
- Create a corresponding history table and BI entity that mirror the structure of the live table.
- You generally don’t need to replicate indexes; a clustered index on RecId is usually sufficient for history tables.
- Extend the ArchiveAutomationJobRequestCreator class for the scenario to add the new table to the archive table chart.
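Once the custom table, its history table, and its BI entity exist, you can reuse the validation scripts above by adding an extra row to @tmpDataTable. The object names here are placeholders for your own artifacts.
-- Hypothetical custom table added to the same validation pattern
INSERT INTO @tmpDataTable
VALUES (2, 'GL', 'MYCUSTOMTABLE', 'MYCUSTOMTABLEHISTORY', 'MycustomtableBiEntity')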
Operational Guardrails & Best Practices
- Start in a sandbox: Validate the full archive flow end-to-end (criteria → replicate → reconcile → move/delete → inquiry/restore).
- Align with retention policies: Work with compliance and legal teams to define time-based and event-based rules.
- Build safeguards: Use role-based access, approvals, and backout plans for initial runs.
- Purge aggressively: Remove data from history tables and Dataverse using bulk delete APIs for maximum cost savings.
Putting It All Together – Sample Rollout Plan
- Pilot scope: Archive sales orders older than 24 months that are fully invoiced and closed.
- Extend schema: Add custom fields to SalesTableHistory and the corresponding BI entity; refresh in Dataverse.
- Dry run in sandbox: Validate replication, reconciliation, and accessibility via inquiry pages.
- Cutover window: Execute in production during a planned maintenance window; monitor database size and job telemetry.
- Analytics check: Ensure your Fabric/Synapse model includes both live and historical layers for complete reporting.
Resources
- Archive data in Dynamics 365 finance and operations apps with Dataverse [learn.microsoft.com]
- Archive customization (preview): Add custom fields/tables to history & BI entities [learn.microsoft.com]
