Data Studio v2.0

Hi

When I have a snapshot at the end of a workflow, the workflow runs without errors if it is 200k lines or fewer. When I expanded the file beyond that (I tried 250k, 210k, etc.), the application shuts down and I have to stop and start the server and log back in.

The failure you see below is due to the job kicking me out; it occurs every time I run this workflow, and with any workflow over 200k lines. Below that, the snapshot runs fine and in a good timeframe.

Is there a setting that limits the number of rows allowed in a snapshot? Should I manage snapshots with large data differently?

To be clear, the failure is the system shutting down while running this workflow, and it happens every time I run it.

Stopping and starting the server gets me back in.

Is anyone aware of this issue? Any assistance would be appreciated.

Thanks, Carolyn

 


Answers

  • Clinton Jones (Experian Elite)

    Hi Carolyn, this sounds like you might have a workflow with the Address Validation step in it, with some questionable data quality in the address data, and the address validation is causing the workflow to terminate prematurely.

    I think the number of lines is pretty irrelevant here; it depends on what is happening in the data between 200k and, say, 210k that is presenting the issue.

    There are a couple of clean-up routines you can apply to the data before running it against an API integration, such as removing noise:

    https://docs.experianaperture.io/datastudio/create-functions.htm?Highlight=denoise

    Denoise - denoising transforms values by translating all letters to uppercase and retaining only letters and digits. This option returns all matches where the denoised lookup value matches the denoised value in the lookup column.
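    As a rough sketch only (this is not Data Studio's actual implementation), a transform along the lines of that description could look like the following Python snippet; `denoise` is just an illustrative helper name here:

        def denoise(value: str) -> str:
            # Uppercase everything, then keep only letters and digits,
            # per the denoise description above.
            return "".join(ch for ch in value.upper() if ch.isalnum())

        # Two differently punctuated addresses reduce to the same key:
        print(denoise("10 Downing St."))   # 10DOWNINGST
        print(denoise("10, downing-st"))   # 10DOWNINGST

    The point is that noise like punctuation, spacing, and casing no longer breaks matching, which is why it is worth running before an address validation step.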
