
How to Interpret the Batch Pre-Processing (BPP) Dashboard

Essential knowledge

Author: Fluent Commerce

Changed on: 12 Mar 2025

Overview


This document explains how to interpret the Batch Pre-Processing (BPP) dashboard, focusing on the Change Rate Efficiency metric. It provides methods for analyzing inventory batch trends and optimizing data submissions. Implementation partners can use these insights to enhance batch performance and deduplication, while customers gain better visibility into changed vs. unchanged records. The dashboard supports a date range of up to 31 days, with historical data limited to the last five months.

Key points

  • Change Rate Efficiency (%) = (Total Changed Records ÷ Total Processed Records) × 100%. This metric indicates the proportion of records in each batch identified as changed.
  • Low Efficiency (Near 0%):
    Indicates many records are unchanged. In high-volume scenarios, this means the system must filter through many redundant entries, potentially delaying the processing of true updates.
  • Moderate Efficiency (10% – 50%):
    Reflects a balanced mix of updates and unchanged records—common for many business workflows.
  • High Efficiency (Above 80%):
    Shows that nearly all submitted records were detected as changed. For low-volume jobs, this means you are mostly sending updated data.
  • Remember:
    Achieving balance is key. While our system automatically filters out unchanged records, the volume and proportion of changed records can still impact processing time and overall system performance.
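Worked example: a batch of 10,000 processed records with 2,500 changed records yields a Change Rate Efficiency of (2,500 ÷ 10,000) × 100 = 25%, placing it in the moderate band.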

Understanding the BPP Metrics

The BPP dashboard provides insights into how batch deduplication and data filtering work before records reach the main inventory processing engine. For each batch run, the dashboard displays:

  • Total Records Processed:
    The number of inventory records evaluated in the batch.
  • Total Records Changed:
    The number of records identified as updated (i.e., different from their previous state) by our change detection logic.
  • Total Records Unchanged:
    The number of records filtered out as duplicates or unchanged.
  • Change Rate Efficiency (%):
    Calculated as (Total Changed Records ÷ Total Processed Records) × 100, this percentage indicates the proportion of records deemed changed.
  • BPP Job Completion Time:
    The timestamp when the batch job finished processing.

Together, these metrics show how effectively the BPP step filters out unchanged records so that only new or updated data proceeds to the main inventory processing engine.
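As a quick illustration, the sketch below computes the efficiency metric from a batch's counts and maps it onto the bands described in this guide. It is a minimal Python example with assumed names, not part of any Fluent Commerce API; note that the guide does not name the 50%–80% range, so the classifier leaves it unlabeled.

```python
def change_rate_efficiency(total_processed: int, total_changed: int) -> float:
    """Change Rate Efficiency (%) = (Total Changed Records / Total Processed Records) x 100."""
    if total_processed == 0:
        return 0.0  # an empty batch has no meaningful efficiency
    return (total_changed / total_processed) * 100.0


def classify_efficiency(pct: float) -> str:
    """Map a percentage onto the bands named in this guide."""
    if pct > 80:
        return "high"       # nearly all records detected as changed
    if 10 <= pct <= 50:
        return "moderate"   # balanced mix of changed and unchanged records
    if pct < 10:
        return "low"        # mostly unchanged, redundant submissions
    return "between bands"  # the 50-80% range is not named in this guide


# Example: 4,000 changed out of 50,000 processed records
pct = change_rate_efficiency(total_processed=50_000, total_changed=4_000)
print(f"{pct:.1f}% -> {classify_efficiency(pct)}")  # 8.0% -> low
```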

How to Interpret Change Rate Efficiency Data

1. Low Change Rate Efficiency (Near 0%)

  • What It Means:
    Most records in the batch remain unchanged, suggesting a high level of redundancy. In high-volume submissions, although our system automatically filters out unchanged data, the filtering process can take time, potentially delaying the update of the changed records.
  • What to Consider:
    • Review the submitted data to ensure the volume of unchanged records is intentional.
    • If you’re sending large batches with a low change rate, be aware that the deduplication process might add some latency.
    • Consider whether adjusting the submission frequency could help balance the load without compromising data accuracy; one client-side pre-filtering approach is sketched below.
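If much of your batch is redundant, one option is to pre-filter on the client side before submitting. The sketch below is a minimal, hypothetical approach that fingerprints each record and resubmits only records whose content changed; it assumes records are dicts carrying a stable "ref" identifier, and it is illustrative rather than Fluent Commerce's own change-detection logic.

```python
import hashlib
import json


def record_fingerprint(record: dict) -> str:
    # Hash a canonical JSON form so that key order does not affect the result
    payload = json.dumps(record, sort_keys=True).encode("utf-8")
    return hashlib.sha256(payload).hexdigest()


def filter_changed(records: list[dict], last_seen: dict[str, str]) -> list[dict]:
    """Keep only records whose fingerprint differs from the previous submission."""
    changed = []
    for record in records:
        fp = record_fingerprint(record)
        if last_seen.get(record["ref"]) != fp:  # "ref" is an assumed identifier field
            changed.append(record)
            last_seen[record["ref"]] = fp
    return changed


# Example: only the record whose quantity changed is submitted again
seen: dict[str, str] = {}
batch1 = [{"ref": "SKU-1", "qty": 5}, {"ref": "SKU-2", "qty": 9}]
batch2 = [{"ref": "SKU-1", "qty": 5}, {"ref": "SKU-2", "qty": 7}]
filter_changed(batch1, seen)         # first run: both records are new
print(filter_changed(batch2, seen))  # [{'ref': 'SKU-2', 'qty': 7}]
```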

2. Moderate Change Rate Efficiency (Around 10% – 50%)

  • What It Means:
    A balanced mix of changed and unchanged records is being detected. 
  • What to Consider:
    • Continue to monitor the metric over time.
    • If the efficiency drifts significantly, it may be worth reviewing your data update practices.
    • In this scenario, the BPP logic is effectively filtering out unchanged records while processing the updates appropriately.

3. High Change Rate Efficiency (Above 80%)

  • What It Means:
    Nearly all submitted records are flagged as changed, which is common when you’re sending low volumes of largely updated data.
  • What to Consider:
    • If you’re intentionally sending low volumes of highly updated data, the system will process these efficiently.
    • However, if you observe this in a high-volume context, it may be a signal to evaluate whether the deduplication step (BPP) is necessary.
    • In sustained high-efficiency scenarios, you might consider configuring your job to bypass BPP, allowing records to flow directly into the Rubix engine workflow. This could reduce processing overhead and improve overall performance.

Note: Ensure this change aligns with your business requirements and test thoroughly before implementing.
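To make that decision more systematic, you could track the metric across recent batches and flag jobs where high efficiency is sustained. The helper below is a hypothetical sketch: the 80% threshold and ten-batch window are assumptions to tune against your own data, and it only surfaces candidates rather than performing any configuration change.

```python
def is_bypass_candidate(recent_efficiencies: list[float],
                        threshold_pct: float = 80.0,
                        min_batches: int = 10) -> bool:
    """True when each of the last `min_batches` batches exceeded the threshold."""
    window = recent_efficiencies[-min_batches:]
    return len(window) >= min_batches and all(e > threshold_pct for e in window)


# Example: ten consecutive batches above 80% suggest BPP is filtering very little
history = [92.0, 95.5, 88.1, 97.0, 91.2, 99.0, 94.4, 90.3, 96.7, 93.8]
print(is_bypass_candidate(history))  # True
```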

Additional Scenarios to Consider

1. Fluctuating Change Rate Efficiency
  • Scenario: The efficiency percentage swings widely between batches (e.g., 10% one day and 90% the next).
  • Interpretation:
    This variability could indicate that your update patterns shift over time, for example periodic bulk updates alternating with smaller incremental changes.
  • Recommendation:
    Examine your update schedule and submission patterns. Smoothing out these spikes by spreading updates more evenly may help stabilize the efficiency metric and system performance; the monitoring sketch after this list shows one way to detect such swings.
2. Sudden Drop in Change Rate Efficiency
  • Scenario: The metric drops from around 50% to under 10% and stays low.
  • Interpretation:
    Nearly all records in the batches are unchanged, which might suggest issues such as a stalled update process or changes in data submission patterns.
  • Recommendation:
    Verify that your data feeds and transformation logic are functioning correctly. Ensure that genuine changes are captured in the batches. Once issues are resolved (or if the low change rate is expected), adjust your submission frequency accordingly.
3. Sudden Spike in Change Rate Efficiency
  • Scenario:
    Your Change Rate Efficiency metric, which typically hovers in a moderate range (e.g., 10%–50%), suddenly jumps to near 100% and stays high over several batches.
  • Interpretation:
    A spike to nearly 100% means that almost every record in the batch is detected as changed compared to its previous state. It’s important to note that even if you perform a full data refresh, our deduplication logic will only flag a record as changed if key attributes (such as quantity and other critical fields) have actually been modified. Therefore, a sudden spike indicates that nearly every record is genuinely updated, not just re-sent without differences. This change in behavior could be due to a deliberate shift in your submission process or might signal an unexpected issue in your data pipeline.
  • Recommendation:
    • If Intentional:
      Verify that your submission process is designed to update nearly all records. Monitor the impact on processing times, as a high change rate—even in low-volume jobs—can increase processing duration. If the deduplication step becomes redundant under these conditions, consider bypassing BPP to streamline processing.
    • If Unintentional:
      Review your data submission configuration and change detection logic. Ensure that only records with actual modifications are being flagged as changed. This may involve auditing your data inputs and checking for misconfigurations that could be causing unchanged records to be mistakenly marked as updated.
  • Processing Considerations:
    Remember, a high change rate means more records require processing, which can lead to longer processing times. Balancing the volume of records with the genuine rate of change is key to maintaining optimal performance.
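The three scenarios above can also be watched for programmatically. The sketch below is an illustrative monitor over a series of per-batch efficiency values; all thresholds and window sizes are assumptions to calibrate against your own baseline.

```python
from statistics import stdev


def detect_fluctuation(effs: list[float], max_stdev: float = 20.0) -> bool:
    """Scenario 1: wide swings between batches (high standard deviation)."""
    return len(effs) >= 2 and stdev(effs) > max_stdev


def detect_sustained_drop(effs: list[float], floor: float = 10.0, batches: int = 5) -> bool:
    """Scenario 2: efficiency stays under the floor for several consecutive batches."""
    window = effs[-batches:]
    return len(window) >= batches and all(e < floor for e in window)


def detect_sustained_spike(effs: list[float], ceiling: float = 95.0, batches: int = 5) -> bool:
    """Scenario 3: efficiency jumps to near 100% and stays there."""
    window = effs[-batches:]
    return len(window) >= batches and all(e > ceiling for e in window)


history = [35.0, 42.0, 38.0, 97.0, 98.5, 99.0, 97.8, 99.2]
print(detect_fluctuation(history))      # True: large swings across the series
print(detect_sustained_drop(history))   # False
print(detect_sustained_spike(history))  # True: the last five batches are near 100%
```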


Final Thoughts

By consistently monitoring the BPP dashboard and Change Rate Efficiency trends, you can make data-driven decisions to optimize your inventory update process. In summary, these insights help you to:

  • Optimize Batch Submission Strategies:
    Balance submission frequency and batch size to ensure timely updates.
  • Manage Processing Delays:
    Understand that while our system filters out unchanged records, high volumes with low change rates may introduce some processing latency.
  • Decide on BPP Necessity:
    Use the efficiency metric to determine if the BPP deduplication step is beneficial or if bypassing it (for low-volume, high-change scenarios) might enhance performance.
  • Ensure Timely Updates:
    Monitor your metrics to maintain a smooth, efficient inventory ingestion pipeline.

By understanding and acting on these insights, you can keep your inventory processing running efficiently and effectively, ensuring that the right data is processed at the right time.
