Troubleshooting: How to Verify Chart Data and Compare Performance Objectives Charts with Native Jira Gadgets

This article outlines a practical way to troubleshoot and verify the data behind your Performance Objectives gadgets when chart results don’t look quite right or differ from what you expect.

Isolate and Validate a Few Issues

If you’ve already created a chart and you’re unsure about the results you see - for example, the numbers seem too high or too low - here is how to validate your data and understand where the differences come from. To do this effectively, start small - use only a few issues. This makes it easier to review their Jira history and see the data behind your chart results.

How to Do This:

  1. Start with your existing chart where the data seems off or unclear

  • Identify and click on the bar or segment with the fewest issues.

(A smaller set makes it easier to inspect results one by one)

  • Open the resulting Jira issue list.

  • Create a saved filter from this list - you’ll use it to re-check data consistency.

  2. Duplicate your chart

  3. Apply the saved filter in the Data Source of the duplicated gadget

  • In the data source settings:

    • Disable the date range.

    • Remove all additional filters.

    • Apply only your new saved filter.

  • Keep the metric that matches your reporting needs.

  • Display the data by Issue Key so each issue is clearly visible.

  4. Isolate mismatched issues

  • Continue refining the dataset until only the mismatched issues remain.

  5. Inspect issue histories

  • Open the history of each remaining issue.

  • Check transitions, updates, and field changes to verify the chart data or identify the differences.

  6. Re-check the data in a native Jira gadget

As an additional validation step, you can use a native Jira gadget that reports on the same metric - for example:

  • Sprint Report → for sprint-related metrics

  • Average Time in Status gadget → for time-in-status metrics

  • Created vs. Resolved Issues, Control Chart, or others depending on the KPI

Apply the same saved filter in the Jira gadget and compare the results. If differences appear, you can gradually remove issues from the filter to narrow the dataset further.
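If you want to double-check a time-in-status value by hand while reviewing an issue’s history, the calculation can be sketched in a few lines. This is an illustrative sketch only: the `changelog` list below is hypothetical sample data shaped loosely like the status transitions you would read from an issue’s history (or from Jira’s REST API with `expand=changelog`), not output from any real instance.

```python
from datetime import datetime

# Hypothetical status transitions for one issue, as you would read them
# from the issue history: when each transition happened, and the
# from/to status names.
changelog = [
    {"created": "2024-03-01T09:00:00", "from": "To Do", "to": "In Progress"},
    {"created": "2024-03-03T09:00:00", "from": "In Progress", "to": "Done"},
]

def hours_in_status(changelog, status):
    """Sum the hours an issue spent in `status`, based on its transitions."""
    total = 0.0
    entered = None
    for entry in changelog:
        ts = datetime.fromisoformat(entry["created"])
        if entry["to"] == status:
            entered = ts
        elif entry["from"] == status and entered is not None:
            total += (ts - entered).total_seconds() / 3600
            entered = None
    return total

print(hours_in_status(changelog, "In Progress"))  # 48.0
```

Summing the intervals this way for each issue in your saved filter gives you a manual baseline to compare against both the Performance Objectives chart and the native gadget.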

Common Reasons for Differences

Differences in time or issue counts between Performance Objectives and native Jira gadgets usually come from:

  • Different data source settings

  • Different date ranges

  • Rounding variations

  • Different metric definitions

The best way to pinpoint the cause is to narrow your dataset and review the history of the specific issues involved.
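As a concrete illustration of how rounding variations alone can produce different totals, consider two gadgets that round at different points in the calculation. The values below are hypothetical and only demonstrate the effect:

```python
# Illustrative only: the same underlying durations, totalled with two
# common rounding strategies, give different results.
durations_hours = [1.4, 2.4, 3.4]  # hypothetical time-in-status values

# Strategy A: round each value first, then sum.
rounded_then_summed = sum(round(h) for h in durations_hours)  # 1 + 2 + 3 = 6

# Strategy B: sum first, then round the total.
summed_then_rounded = round(sum(durations_hours))             # round(7.2) = 7

print(rounded_then_summed, summed_then_rounded)  # 6 7
```

A one-unit difference like this is normal between tools that round per issue versus per total, which is why narrowing the dataset to specific issues is the reliable way to confirm the cause.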

Watch a Short Video

Here’s a short video showing how to:

  • Identify the chart segment with the fewest issues

  • Create and apply a saved filter

  • Review each issue’s history

  • Compare results with a native Jira gadget

Troubleshooting-isolate-issues-tutorial.mp4