Refresh dataset notification

Slimani Member
edited December 2023 in General

Hello dear community,

my use case is:

We refresh more than 200 datasets (from an SAP source) in the same refresh workflow.

For reasons we don't know, some of these datasets are sometimes not refreshed.

How can I track the list of datasets that should be refreshed but are not?

The issue is that the refresh workflow is not in a failed state.

Thank you in advance

Best regards


Best Answer

  • Henry Simms Administrator
    edited October 24 Answer ✓

    To solve this, you should be able to use the Stop execution when refresh failed option on the Source steps in your workflow (available from Aperture 2.8.4 onwards). This causes the workflow to fail if auto-refresh on the source from the External System fails for any reason.

    With the setting enabled, my job execution will fail when the External System cannot be reached:

    Without the setting enabled (the default), the job will complete with a warning in the execution details:

    You can also add an Automation to notify / alert (or carry out some other action) when any of these failures or warnings occur, for example:

    This will send the following notification:

    Note that the "Send notification" action and notifications inbox in my screenshot above are new in version 2.12, but in prior versions you can still send emails or trigger workflows / schedules.
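    Alongside the failure/warning automation, you can also check for stale datasets yourself. As a minimal sketch (the record shape here — `name` / `lastRefreshTime` keys — is an assumption standing in for whatever dataset metadata you can export or fetch from your Data Studio instance), flag anything not refreshed since a cutoff:

    ```python
    from datetime import datetime, timedelta, timezone

    def find_stale_datasets(datasets, cutoff):
        """Return names of datasets not refreshed since `cutoff`.

        A dataset that has never been refreshed (lastRefreshTime is None)
        is always considered stale.
        """
        return [d["name"] for d in datasets
                if d["lastRefreshTime"] is None or d["lastRefreshTime"] < cutoff]

    # Made-up records standing in for the real metadata response.
    now = datetime.now(timezone.utc)
    datasets = [
        {"name": "SAP_MATERIALS", "lastRefreshTime": now - timedelta(hours=1)},
        {"name": "SAP_VENDORS",   "lastRefreshTime": now - timedelta(days=3)},
        {"name": "SAP_ORDERS",    "lastRefreshTime": None},  # never refreshed
    ]
    stale = find_stale_datasets(datasets, now - timedelta(days=1))
    print(stale)  # expected: ['SAP_VENDORS', 'SAP_ORDERS']
    ```

    Run against your full list of ~200 datasets after each scheduled refresh, this gives you exactly the "should have refreshed but didn't" list, independent of the workflow's completed/failed state.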

Answers

  • We have had similar cases: the workflow is in a completed state, but there is no refresh and/or the dataset is empty. If I remember right, the error message (when you go to "Show Job") tells you that there has been a JDBC error, so something has happened with the connection. These cases should have a "failed" state, not "completed".

    BR

    Juha
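    Since the JDBC error only shows up in the job details while the job itself reports completed, one workaround is to scan the execution details for refresh warnings yourself. A minimal sketch, assuming you can export the job details to a structure like the one below (the `status`/`steps`/`message` fields and the warning text are illustrative, not the product's actual schema):

    ```python
    def refresh_warnings(job):
        """Return warning messages from a job that completed but hit JDBC errors."""
        return [s["message"] for s in job["steps"]
                if job["status"] == "COMPLETED" and "JDBC" in s.get("message", "")]

    # Hypothetical exported execution details.
    job = {
        "status": "COMPLETED",  # the job itself did not fail...
        "steps": [
            {"name": "Source: SAP_VENDORS",
             "message": "Refresh skipped: JDBC connection error"},  # ...but this source did
            {"name": "Source: SAP_MATERIALS", "message": ""},
        ],
    }
    print(refresh_warnings(job))  # expected: ['Refresh skipped: JDBC connection error']
    ```

    Anything this returns is a dataset that silently missed its refresh despite the "completed" status.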

  • Henry Simms Administrator

    I was looking at this behaviour again recently, and here are a couple of useful additional points:

    Table in External System has new columns added after initial load

    If columns are added to a table in your DBMS, this will not prevent the table being auto-refreshed by a workflow. However, the new columns won't be brought into the Data Studio Dataset. Looking in the Job Details, you will see a warning, but the workflow's execution will never fail:

    To bring the new column into the Dataset, manually refresh and Configure:

    Table in External System has columns removed after initial load

    In this scenario, workflow execution will fail if the Stop execution when refresh failed option is checked on the Source step. The Job Details will also indicate the issue: