Duplicate triggering of a workflow with the data dropzone

2 questions please:

When a file is dropped in the data dropzone, it gets removed almost instantly; sometimes we don't even see the file appear in the folder. Users therefore think the file was not processed, so they submit it again.

Q1. Is there a way to see whether the file has been picked up? (Especially for data consumers, who cannot see the Jobs menu.) Is there a place where these files are stored after being taken from the data dropzone?

Q2. How can we prevent duplicate triggering of the same workflow? I can see that both workflow runs execute, and this may give bad results, especially if the input dataset is changed mid-process. Is there a way to add a "workflow NOT already started" condition to the notification that runs the workflow?

  • Josh Boxer (Administrator)

    For security, files are removed once they have been processed.

    External system dropzones let you set the polling time, so the file would sit in the folder for longer, if that is a desirable approach so that your users don't get confused: https://community.experianaperture.io/discussion/857/aperture-data-studio-2-8-1

    It sounds like you are using an Automation to run multiple Workflows when a Dataset gets updated. Automations trigger Workflows in parallel. Are the Workflows related? You could instead set the Automation to run a Schedule containing multiple Workflows, which will run in sequence.

    Lastly (and probably simplest), create a Scoreboard Chart as a Dashboard widget (one the Consumer would have access to) to show the timestamp of the latest file [note the Source metadata dropdown].

  • @Josh Boxer Thanks for the quick response. I shall use the last method mentioned to check whether the file is loaded. Thank you.

    Do you have an answer for Q2? I have a notification that triggers the workflow when the dataset is loaded, so if the file is dropped twice, the workflow runs twice. We can advise users not to drop a file more than once, but occasionally they still will, and then two jobs for the same workflow run at the same time. Is there any way to check, before triggering the workflow, that it is not currently running?

  • Josh Boxer (Administrator)

    What is the issue with the Workflow running twice?

    I don't really have an answer, but some trial and error with something like:

    • Automation 1: Data loaded AND Workflow completed within N mins
    • Automation 2: Automation 1 time period expired (meaning Workflow still running)
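    Outside the product, the general pattern behind "only trigger if not already running" is a mutual-exclusion guard. A minimal sketch of a lock-file guard, assuming a hypothetical script that wraps the workflow trigger (the lock path and timeout are illustrative, not Aperture features):

    ```python
    import os
    import sys
    import time
    from pathlib import Path

    LOCK = Path("/tmp/workflow.lock")  # hypothetical lock location
    MAX_AGE_SECS = 30 * 60             # treat locks older than 30 min as stale

    def try_acquire() -> bool:
        """Atomically create the lock file; return False if a run holds it."""
        # Clear a stale lock left behind by a crashed run
        if LOCK.exists() and time.time() - LOCK.stat().st_mtime > MAX_AGE_SECS:
            LOCK.unlink()
        try:
            # O_CREAT | O_EXCL makes creation atomic: only one process succeeds
            fd = os.open(LOCK, os.O_CREAT | os.O_EXCL | os.O_WRONLY)
            os.close(fd)
            return True
        except FileExistsError:
            return False

    def release() -> None:
        LOCK.unlink(missing_ok=True)

    if __name__ == "__main__":
        if not try_acquire():
            print("Workflow already running; skipping duplicate trigger.")
            sys.exit(0)
        try:
            pass  # trigger the workflow here
        finally:
            release()
    ```

    A second drop while the first run holds the lock simply exits instead of starting a parallel job, which is effectively what Josh's two-Automation timeout idea approximates inside the product.
    
    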