Monday, November 17, 2025

D365 F&O Database Optimization

As your business expands, so does the Dynamics 365 Finance & Operations database. This growth can affect performance, lengthen maintenance windows, and drive up storage costs. To keep things sustainable, here’s a practical approach to reducing database size: (1) Start with Quick Wins – Cleanup Routines, (2) Archive Historical and Inactive Data, (3) Offload Analytics – Use Fabric or Synapse Link, (4) Optimize Your Sandboxes, and (5) Governance – Monitor and Iterate.

Step 1: Start with Quick Wins – Cleanup Routines

Begin by tidying up the data you don’t need every day. Categorize your data based on importance and compliance requirements. Archive anything you need to keep for regulatory reasons, and safely delete less critical items.

Microsoft offers several built-in cleanup routines that help reduce database size and improve performance. These routines cover different modules and system administration areas and should be scheduled regularly as part of your data hygiene plan.

Why bother?
Removing old logs, history, and staging data can make queries faster and backups leaner, with minimal risk when the routines are tested properly.

Examples of common cleanup routines:

  • Batch job history cleanup
  • Database log cleanup
  • Notification cleanup
  • DMF staging and execution history cleanup

You’ll find these under System Administration > Periodic Tasks or within modules like Inventory Management, Sales, Procurement, and Production Control. If the standard jobs don’t meet your needs, you can deploy custom scripts.
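
Before scheduling anything, it helps to know where the space is actually going. The T-SQL below is a minimal sketch for listing the largest tables; it assumes you are running it against a sandbox database copy (never production), and the TOP count is arbitrary:

  -- List the 20 largest tables by reserved space.
  -- Run against a sandbox database copy, not production.
  SELECT TOP (20)
      t.name AS TableName,
      SUM(CASE WHEN ps.index_id IN (0, 1) THEN ps.row_count ELSE 0 END) AS RowCnt,
      SUM(ps.reserved_page_count) * 8 / 1024 AS ReservedMB
  FROM sys.dm_db_partition_stats AS ps
  JOIN sys.tables AS t ON t.object_id = ps.object_id
  GROUP BY t.name
  ORDER BY ReservedMB DESC;

Tables that dominate this list and map to one of the cleanup routines above are your best first candidates.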

Best practice tips:

  • Always check the business impact before running a cleanup.
  • Test in a sandbox environment first.
  • Schedule jobs outside business hours to avoid disruption.

For more details, see https://learn.microsoft.com/en-us/dynamics365/fin-ops-core/dev-itpro/sysadmin/cleanuproutines.

Step 2: Make It Durable – Archive Historical and Inactive Data

Cleanup routines are great for regular maintenance, but archiving is your long-term strategy for improving performance, meeting compliance requirements, and cutting storage costs.

How Archiving Works in Dynamics 365 F&O

Here’s the high-level process:

  1. Define your criteria – Identify which records in your live tables are ready for archiving (see the sketch after this list).
  2. Replicate to Dataverse – Copy those records into long-term retention.
  3. Reconcile – Confirm that all marked records exist in Dataverse.
  4. Move and delete – Shift data to history tables and remove it from live tables to shrink your operational database.
  5. Access when needed – Use dedicated inquiry pages to view archived data, restore it to live tables, or purge it when appropriate.
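
To make step 1 concrete: “ready for archiving” is normally a business rule on the live table, such as invoiced sales orders older than a few years. The T-SQL below is purely an illustration of that selection logic; the real criteria are configured in the archive feature, and the status value, date field, and retention window are assumptions:

  -- Illustration only: archive criteria are set up in the archive feature,
  -- not written as ad hoc SQL. Values below are examples, not recommendations.
  SELECT SalesId
  FROM SalesTable
  WHERE SalesStatus = 3                                      -- Invoiced
    AND CreatedDateTime < DATEADD(year, -3, SYSUTCDATETIME());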

What You Can Archive Today

  • Finance: General ledger, tax transactions
  • Supply Chain: Inventory transactions, inventory journals, sales orders
    (More data types are planned in future releases.)

Why It Matters

Archiving reduces the size of your operational database, which means:

  • Faster performance on large tables
  • Easier backups, refreshes, and upgrades
  • Lower capacity costs

Customization Tips

If you have custom fields or tables:

  • Mirror them in the corresponding history tables and BI entities.
  • Refresh the virtual entity in Dataverse so those fields are included in long-term retention.
  • Avoid adding relationships to BI entities; keep the history table schema aligned with the live table (except for platform-managed fields like SysRowVersion and SysDataState).

 

Step 3: Offload Analytics – Use Fabric or Synapse Link

When you need deep analytical reporting or large-scale historical insights, avoid querying the transactional database directly. Instead, offload data from Dynamics 365 Finance & Operations to a Lakehouse using Microsoft Fabric or Synapse Link.
This approach ensures:

  • Faster performance for operational workloads
  • Scalable analytics without impacting day-to-day transactions
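
For example, a year-over-year sales summary can run entirely against the Lakehouse once the relevant tables are exported. The sketch below is a hypothetical report; dbo.salesline and its columns are assumptions that depend on which tables you choose to sync:

  -- Hypothetical reporting query against tables exported to a Fabric
  -- Lakehouse / Synapse Link; table and column names are assumed.
  SELECT
      dataareaid            AS LegalEntity,
      YEAR(createddatetime) AS OrderYear,
      SUM(lineamount)       AS TotalLineAmount
  FROM dbo.salesline
  GROUP BY dataareaid, YEAR(createddatetime)
  ORDER BY LegalEntity, OrderYear;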

 

Step 4: Optimize Your Sandboxes

Data Cleansing

  • Remove older transactional records that are no longer needed.
  • Use Microsoft’s standard cleanup routines wherever possible.
  • If standard routines don’t cover your needs, create custom scripts.

Pro Tip: For bulk deletions in sandbox databases, use SQL queries directly. Deleting through the F&O application layer can be slow and resource-intensive.
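
A minimal sketch of what that can look like, assuming a hypothetical MyCustomStagingTable and a two-year cutoff; deleting in batches keeps the transaction log and locking under control:

  -- Sandbox databases only, never production. The table name and retention
  -- window are placeholders; adjust them to your own data.
  DECLARE @rows INT = 1;
  WHILE @rows > 0
  BEGIN
      DELETE TOP (50000)
      FROM MyCustomStagingTable
      WHERE CreatedDateTime < DATEADD(year, -2, SYSUTCDATETIME());
      SET @rows = @@ROWCOUNT;
  END;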

Sandbox Review

  • Audit active projects and check how sandboxes are being used.
  • Reduce the number of sandboxes if they’re underutilized to cut costs and free up resources.

Step 5: Governance – Monitor and Iterate

Good governance keeps everything running smoothly.

  • Use the Power Platform admin center and its capacity views to monitor storage usage and trends.
  • Set a quarterly review cycle:
    • Validate cleanup results
    • Revisit archive criteria
    • Audit analytics pipelines

Implementation Checklist

  • Start with low-risk cleanups: Identify candidates, test in a sandbox, and schedule recurring jobs.
  • Define archival criteria: Use factors like age, status, or legal hold. Begin with a pilot scenario (e.g., Sales orders).
  • Handle customizations: Include custom fields and tables in history tables and BI entities, then refresh in Dataverse.
  • Shift analytics: Move reporting to Synapse Link or Fabric, catalogue your “gold” datasets, and retire direct reporting on live tables.

Key Takeaways

  • Begin with cleanup routines for quick wins.
  • Move to archiving for a sustainable way to shrink your production database.
  • Offload heavy reporting to Fabric or Synapse Link so analytics don’t compete with transactions.
  • Keep sandboxes lean and revisit storage as part of regular governance.

If you’re ready to explore advanced strategies for long-term retention and archiving, check out our companion post: “Archiving D365 F&O Data with Dataverse Long-Term Retention”.

In that article, we dive into:

  • How Dataverse supports compliance and scalability
  • Advanced approaches for archiving beyond basic clean-up
  • Leveraging Dataverse for analytics and long-term data management
