Workflow triggered via datadropzone - Dynamic Source in Workflow
I see that a source in a workflow can be defined as dynamic, and we can pass the source name when running online or running through a schedule. But when running through a notification (Version 2.8.8), I see no option to pass the source parameter. Has anyone found a solution for this?
Assuming your Workflow has a Workflow parameter set up (in the Published version): https://docs.experianaperture.io/data-quality/aperture-data-studio-v2/get-started/create-a-workflow/#workflowparameters
Your notification (called an Automation in later versions of Data Studio) will have a settings icon where you can enter a value.
If you are trying to use a value from within the Workflow, then you would do this using a Fire event step. Here is an example:
@Josh Boxer It is not a Workflow parameter. I made the source dynamic ("Can be supplied at execution" + "Must be supplied"). This prompts for the source when run online, and there is also an option to provide the source when scheduling. But there is no option to provide the source within the notification that triggers the workflow.
(I am not able to attach screenshots here. I can email if you need screenshots.).
Your account should now let you attach a screenshot, which might help explain
I am not fully understanding what you are trying to achieve, but you
a) have a Workflow with a Source that must be supplied at execution
b) do not want to manually execute the Workflow providing data
c) do want to automate Workflow execution based on an Event?
What is the Event(s)? A specific Dataset has been updated (via a dropzone)?
How many different Datasets do you want to run through the Workflow?
I have attached the details. Here are the specific answers:
a) have a Workflow with a Source that must be supplied at execution : Yes
b) do not want to manually execute the Workflow providing data : Yes
c) do want to automate Workflow execution based on an Event? : Yes
What is the Event(s)? A specific Dataset has been updated (via a dropzone)? : Yes, a dropzone dataset has loaded.
How many different Datasets do you want to run through the Workflow? : 50+
The person submitting the file for validation in the dropzone does not need to go into Aperture. They will get an indicator immediately that the file has been picked up and has started processing. Once the validation is complete, the result will be exported, which may take more time.
Hope this clarifies.
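The flow described above (watch a dropzone, then kick off one parameterized Validation workflow per arriving file) can be sketched roughly as below. This is only an illustration of the intent, not a real integration: `run_workflow`, the `DROPZONE` path, and the polling loop are all hypothetical stand-ins, since Data Studio's notification step currently has no way to pass the source name. The point is that a single parameterized workflow could serve all 50+ datasets if the trigger could supply the source:

```python
from pathlib import Path

DROPZONE = Path("/data/dropzone")  # hypothetical dropzone directory
seen: set[str] = set()             # files already dispatched

def run_workflow(workflow: str, source: str) -> str:
    """Stand-in for triggering a published Workflow with a dynamic source.

    This is exactly the hook the notification/automation lacks today:
    a way to pass `source` at execution time.
    """
    # A real implementation would invoke the product's execution
    # mechanism here; this sketch just records what would be started.
    return f"started '{workflow}' with source '{source}'"

def poll_once() -> list[str]:
    """One polling pass over the dropzone.

    Starts the single parameterized 'Validation' workflow for every
    newly arrived file, instead of maintaining one workflow copy per
    dataset (50+ copies in the scenario above).
    """
    results = []
    for f in sorted(DROPZONE.glob("*.csv")):
        if f.name not in seen:
            seen.add(f.name)
            results.append(run_workflow("Validation", f.name))
    return results
```

With a hook like this, the submitter never opens Aperture: dropping the file is the trigger, and the workflow name stays constant while only the source varies.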
Thanks for the detail. I think you want to take a different approach: make the 'status' workflow from your original question a Re-usable workflow, then call it as a step (sub-process) from any of the 'Validation' workflows. Then you don't need to include it in any notifications/automations.
@Josh Boxer This was my initial plan, but you mentioned in my previous question that there is no guarantee that the status step will execute first.
"If you look at the Job details page you will see that workflows are optimised so that steps do not get processed linearly/sequentially."
This is the reason I decided to have a separate workflow and make it run first in the notification. This works, but it forces me to create 50 workflows instead of one.
@Josh Boxer Thanks for the meeting. I tried adding the status workflow as a re-usable workflow, but that has the same problem as the notification: there is no option to provide the dataset name to the re-usable workflow.
I also tried copying all the steps from the status workflow into the main workflow. This does not work either: all the export steps are executed together, so we cannot guarantee that the status step executes first.
Attached are the screenshots.
So I suggest improving the notification/automation step to provide an option to supply the dataset in the workflow assignment. Please let me know if you have any questions.
Thanks for the update George, we will investigate further