Workflow triggered via datadropzone - Dynamic Source in Workflow

George_Stephen
edited December 2023 in General

I see that a source in a workflow can be defined as dynamic, and we can pass the source name when running online or running through a schedule. But when running through a notification (Version 2.8.8), I see no option to pass the source parameter. Has anyone found a solution for this?

Comments

  • Josh Boxer
    Josh Boxer Administrator

    Hi George

    Assuming your Workflow has a Workflow parameter set up (in the Published version): https://docs.experianaperture.io/data-quality/aperture-data-studio-v2/get-started/create-a-workflow/#workflowparameters

    Your notification (called an Automation in later versions of Data Studio) will have a settings icon where you can enter a value.

    If you are trying to use a value from within the Workflow, then you would do this using a Fire event step. Here is an example:

    https://community.experianaperture.io/discussion/829/use-a-value-in-your-data-as-the-export-file-name


  • @Josh Boxer It is not a workflow parameter. I made the source dynamic ("Can be supplied at execution" + "Must be supplied"). This prompts for a source when the workflow is run online, and there is also an option to provide the source when it is scheduled. But there is no option to provide the source within the notification that runs the workflow.

    (I am not able to attach screenshots here; I can email them if you need screenshots.)

  • Josh Boxer
    Josh Boxer Administrator

    Hi George

    Your account should now let you attach a screenshot, which might help explain the issue.

    I don't fully understand what you are trying to achieve, but you:

    a) have a Workflow with a Source that must be supplied at execution

    b) do not want to manually execute the Workflow providing data

    c) do want to automate Workflow execution based on an Event?

    What is the Event(s)? A specific Dataset has been updated (via a dropzone)?

    How many different Datasets do you want to run through the Workflow?

  • I have attached the details. Here are the specific answers:

    a) have a Workflow with a Source that must be supplied at execution : Yes

    b) do not want to manually execute the Workflow providing data : Yes

    c) do want to automate Workflow execution based on an Event? : Yes

    What is the Event(s)? A specific Dataset has been updated (via a dropzone)? : Yes, a dropzone Dataset has been loaded.

    How many different Datasets do you want to run through the Workflow? : 50+

    The person submitting the file for validation to the dropzone should not need to go into Aperture. They will get an immediate indicator that the file has been picked up and has started processing. Once the validation is complete, the result will be exported, which may take more time.

    Hope this clarifies.
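    In Data Studio terms the steps above are configured in the UI, but the intended flow can be sketched generically: pick up each dropped file, immediately write a status marker for the submitter, then run the longer validation/export and update the marker. A minimal Python sketch; the directory layout, file names, and `process_drop` helper are all hypothetical, not part of the product:

    ```python
    import tempfile
    from pathlib import Path

    def process_drop(dropzone: Path, status_dir: Path) -> list:
        """Pick up each dropped file, flag it 'processing' immediately,
        then run the (slower) validation and flag it 'done'."""
        handled = []
        for dropped in sorted(dropzone.glob("*.csv")):
            marker = status_dir / (dropped.stem + ".status")
            marker.write_text("processing")  # immediate indicator for the submitter
            # ... long-running validation and export would happen here ...
            marker.write_text("done")        # final status once the export finishes
            handled.append(dropped.name)
        return handled

    # Demo with temporary directories standing in for the dropzone
    with tempfile.TemporaryDirectory() as tmp:
        drop = Path(tmp) / "dropzone"; drop.mkdir()
        status = Path(tmp) / "status"; status.mkdir()
        (drop / "customers.csv").write_text("id,name\n1,Ann\n")
        print(process_drop(drop, status))                  # → ['customers.csv']
        print((status / "customers.status").read_text())   # → done
    ```

    The key design point is that the status marker is written before the slow validation starts, which is exactly the ordering guarantee being discussed in this thread.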

  • Josh Boxer
    Josh Boxer Administrator

    Thanks for the detail. I think you want to take a different approach: make the 'status' workflow from your original question a re-usable workflow, then call it as a step or sub-process from any of the 'validation' workflows (then you don't need to include it in any notifications/automations):

    https://community.experianaperture.io/discussion/849/best-practices-designing-complex-workflows

    https://docs.experianaperture.io/data-quality/aperture-data-studio-v2/collaborate-and-re-use-data/re-use-workflows/

  • @Josh Boxer This was my initial plan, but you mentioned in my previous question that there is no guarantee that the status step will execute first:

     "If you look at the Job details page you will see that workflows are optimised so that steps do not get processed linearly/sequentially."

    https://community.experianaperture.io/discussion/934/duplicate-triggering-of-workflow-with-datadropzone#latest

    This is the reason I decided to have a separate workflow and make it run first in the notification. This is working OK, but it forces me to create 50 workflows instead of one.

  • @Josh Boxer Thanks for the meeting. I tried adding the status workflow as a re-usable workflow, but that has the same problem as the notification: there is no option to provide the dataset name in the re-usable workflow.

    I also tried physically copying all the steps from the status workflow into the main workflow. This does not work either: all the export steps are executed together, so we cannot guarantee that the status step executes at the beginning.

    Attached are the screenshots.

    So I suggest improving the notification/automation step to provide the option to supply a dataset in the workflow assignment. Please let me know if you have any questions.

    thanks, George

  • Josh Boxer
    Josh Boxer Administrator

    Thanks for the update, George. We will investigate further.

  • Ivan Ng
    Ivan Ng Administrator

    Hi George,

    Thank you so much for your input. I'm pleased to inform you that the feature you suggested in this post is now available in Aperture Data Studio v2.11.4 (and onwards). Do let us know if you have any questions or suggestions.


  • Thanks!