How to capture errors into a table if an error occurs during execution of a Workflow or Schedule?

Aditi (Member) · edited December 2023 in General

How can I capture errors into a table when an error occurs during the execution of a Workflow or Schedule?

Answers

  • Henry Simms (Administrator)

    Hi @Aditi

    There are a few approaches I can think of here. You could consume the /jobs API endpoint, either in a different tool or in Data Studio itself, to capture failing job executions.
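    If you go down the API route, the general shape of that polling is sketched below in Python. Treat it as a rough starting point only: the base URL, authentication header, endpoint path and JSON field names here are assumptions, so check the REST API documentation in your own Data Studio instance for the exact details.

        # Rough sketch: poll the Data Studio jobs endpoint and append failed
        # executions to a local CSV. Endpoint path, auth header and field names
        # are assumptions - verify them against your instance's API docs.
        import csv
        import requests

        BASE_URL = "https://datastudio.example.com/api/v2"  # hypothetical instance URL
        API_KEY = "your-api-key"
        LOG_FILE = "workflow_error_log.csv"

        def fetch_failed_jobs():
            """Return job executions whose status looks like a failure (field names assumed)."""
            resp = requests.get(
                f"{BASE_URL}/jobs",
                headers={"Authorization": f"Bearer {API_KEY}"},
                timeout=30,
            )
            resp.raise_for_status()
            jobs = resp.json()  # assumed: a list of job execution objects
            return [j for j in jobs if str(j.get("status", "")).lower() == "failed"]

        def append_to_log(failed_jobs):
            """Append one row per failed job to the error-log CSV."""
            with open(LOG_FILE, "a", newline="") as f:
                writer = csv.writer(f)
                for job in failed_jobs:
                    writer.writerow([
                        job.get("workflowName"),  # assumed field names
                        job.get("endTime"),
                        job.get("errorMessage"),
                    ])

        if __name__ == "__main__":
            append_to_log(fetch_failed_jobs())

    You could run something like this on a schedule (or rebuild the equivalent logic in Data Studio itself) so that failed executions accumulate in a table over time.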

    A simpler solution would be to use an Automation triggered by the Workflow Failed event, where the event data (error message, error time, workflow name, etc.) are passed as parameters into a second workflow that logs the event in a table.

    The automation would look something like this:

    And the workflow to log errors would take the parameters from the event, add them to a row, and then insert that row into the error log dataset, which is multi-batch:

    The Workflow Error Log table in this case ends up looking like this, when a couple of workflow executions have failed:
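    To make the data flow concrete, the logging workflow does the equivalent of the following, sketched here in Python purely as an illustration (in Data Studio it is built from a Source step, a Transform step and a Snapshot step; the column names and example values below are made up):

        # Illustration only: the real workflow is configured in the Data Studio UI.
        # Each failed execution contributes one row, appended as a new batch to the
        # multi-batch error log dataset.
        def log_workflow_failure(error_log, workflow_name, error_time, error_message):
            """Build a single row from the event parameters and append it as a new batch."""
            row = {
                "Workflow name": workflow_name,
                "Error time": error_time,
                "Error message": error_message,
            }
            error_log.append([row])  # one new batch per failed execution

        # Example: two failed executions -> two batches in the error log
        error_log = []
        log_workflow_failure(error_log, "Example workflow A", "2023-12-01 09:14", "example error")
        log_workflow_failure(error_log, "Example workflow B", "2023-12-01 10:30", "example error")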

  • Henry, are you able to share the workflow for this? I am interested in setting up logging. Thanks

  • @Henry Simms

    I want to create a log of when datasets are loaded. I created an Automation triggered by the 'Dataset Loaded' event to run a workflow (-DATASET-LOADED). I have 4 parameters for the workflow, and they are mapped to event data within the automation.

    In the workflow -DATASET-LOADED I have the 4 parameters, and they are set to 'configurable at runtime'. My source data is a blank Excel file that just has the column names set to the 4 parameters. Then I have a Transform step that sets each column to use one of the parameters. Finally, it goes to a dataset DATASET-LOADED-LOG via a Snapshot step.

    When I load data, a new batch is created in DATASET-LOADED-LOG, but each batch is empty, so it's just an empty file. (Note: I had to exclude DATASET-LOADED-LOG from the automation's event, otherwise it would be an endless loop!)

    Any pointers on how I can get the data written to the snapshot correctly?

    Thanks

  • Henry Simms (Administrator)

    Hi @James T Sidebotham, it does sound like you've followed the correct steps to get this working. Some of the common errors when setting up events and passing event data through workflows (via parameters) to a dataset are:

    1. Not setting the parameter values correctly in the Automation
    2. Not applying the parameter values to Dataset values in the Workflow

    The best way to avoid (or detect) parameter values not being set is to mark the workflow parameter as "required", so that you'll get an error in the job log if the "logging" workflow is invoked without values for those parameters:

    In the Automation, click the spanner (🔧) icon to assign event data to params:

    To bring workflow parameter values into the log snapshot via the logging workflow, the simplest approach is to use a Constant function in a Transform step and then select from the list of parameters in your workflow:

    I've exported a small working example of what you're trying to achieve in the dmxd below (for version 2.14.9 and above).

    It can be tested by refreshing the "Input File" dataset with one of the test files in this zip:

  • @Henry Simms Hi, and thank you for that very detailed explanation. I had actually set everything up correctly per your instructions. However, after a closer look I noticed that my source schema template (used in the first step of the workflow) did not have an empty row. I had assumed that the column mapping was all that was required, but with no rows in the source there was nothing for the Transform step to populate. After I reloaded the schema with one empty row, it worked correctly.

    Thank you for your fast response though, much appreciated.