Force workflow failure due to failing rows

beth.kirby Member
edited December 2023 in General

Is there a way I can force a whole workflow to fail as a result of failing rows in a validation step (at a desired tolerance level)? I want my workflow to stop and fail when I have a certain number of failing rows. I have email notifications set up with a fire event to email me when there are failing rows, but I also want the full workflow to fail. Sorry, I cannot seem to find this in the user documentation. I have tried to force a failure by appending a new column of 0s and 1s (0 for failures) and dividing a constant by the new column, but this does not fail the whole workflow, it just gives errors in the rows where it divides by 0.
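To make the attempt concrete, this is roughly what I was doing (pandas is used here purely as a stand-in for the workflow steps, and the column names are made up). The divide-by-zero only produces row-level errors rather than stopping the run:

```python
import pandas as pd

# 0/1 flag appended after the validation step (0 = failing row)
df = pd.DataFrame({"value": [10, 20, 30], "passes_validation": [1, 0, 1]})

# dividing a constant by the flag: 1/0 becomes inf (a row-level error),
# but no exception is raised, so the workflow still completes
df["forced_error"] = 1 / df["passes_validation"]
print(df)
```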

Best Answer

  • Akshay Davis Experian Super Contributor
    edited February 2022 Answer ✓

    @beth.kirby there is a custom extension you could add which throws an exception under defined conditions. This will fail the workflow. If you contact your account manager or the EDQ Consultant you're working with, they can assist you in getting set up with this.
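    To illustrate the idea only (this is not the extension itself; the real one is built against the Data Studio SDK and your account team can provide the specifics, so the names and the 5% tolerance below are just assumptions), the logic amounts to raising an exception once failures exceed a tolerance, which is what fails the workflow:

    ```python
    def check_tolerance(total_rows: int, failing_rows: int, tolerance: float = 0.05) -> None:
        """Raise an exception (failing the workflow) if failures exceed the tolerance."""
        if total_rows and failing_rows / total_rows > tolerance:
            raise RuntimeError(
                f"{failing_rows}/{total_rows} rows failed validation "
                f"(tolerance {tolerance:.0%}); aborting workflow"
            )

    check_tolerance(total_rows=1000, failing_rows=80)  # 8% > 5%, so this raises
    ```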

Answers

  • Josh Boxer Administrator

    Hi Beth, can you explain the goal of failing the workflow?

    Is the workflow writing the Validate results to a Snapshot step, and is the goal here to delete/not add the entire batch of results if they are below a set threshold? Or is the workflow doing something else?

  • This workflow's output is the input to another workflow, which I do not want to run if there are failing rows. @Josh Boxer

  • Akshay Davis Experian Super Contributor

    @beth.kirby You could append a column to the dataset which is Pass/Fail or 0/1 and have a Filter step prior to the output, filtering to allow only Pass rows through. If you reduce the result of the validation to a single pass/fail cell for the overall result, you could just do a cartesian join on the dataset to append that for the final filter.

    This would still show the workflow as having completed successfully, but the subsequent workflow should receive no data.
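    Purely as an illustration of that wiring (pandas standing in for the Validate, join and Filter steps, with invented column names), the logic would behave roughly like this:

    ```python
    import pandas as pd

    df = pd.DataFrame({"id": [1, 2, 3], "passes_validation": [1, 0, 1]})

    # single overall pass/fail cell for the whole batch
    overall = pd.DataFrame({"batch_passes": [int(df["passes_validation"].all())]})

    # cartesian join the one-cell result back onto every row, then filter on it
    df = df.merge(overall, how="cross")
    output = df[df["batch_passes"] == 1]

    print(output)  # empty here, so the downstream workflow receives no data
    ```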

  • Josh Boxer Administrator

    The 'Row count' function might be helpful to fail either on a specific number of rows or on a threshold comparing the count of failing rows against the count of source rows.

  • I had considered this, but the second workflow outputs the file into a data storage container, so where possible I wanted to avoid multiple uploads of blank files. If there is no alternative, then I will try this method, thank you @Akshay Davis

    @Josh Boxer Could you please expand on your suggestion? Thanks in advance

  • Josh Boxer Administrator

    If the volume of records being processed changes each time, then you might want to use a threshold (tolerance level) rather than a fixed number of rows to determine a failure. You could calculate this using the Row count function, or, since you already have a 1/0 column in this case, use a Group step to work out the volume of passing rows vs total rows and compare that against your tolerance.
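    As a rough sketch of that calculation (pandas standing in for the Group step / Row count function; the column name and the 95% tolerance are assumptions):

    ```python
    import pandas as pd

    df = pd.DataFrame({"passes_validation": [1, 1, 0, 1, 1, 1, 0, 1, 1, 1]})

    passes = df["passes_validation"].sum()   # volume of passing rows
    total = len(df)                          # total source rows
    pass_rate = passes / total

    TOLERANCE = 0.95  # e.g. require at least 95% of rows to pass
    batch_ok = pass_rate >= TOLERANCE
    print(f"{passes}/{total} rows passed ({pass_rate:.0%}) -> batch_ok={batch_ok}")
    ```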