Aperture workflow export via JDBC - Issue

Hi,

Please can you help/advise? 

I am trying to export a dataset containing 79k records (83 columns) to an HDI table, and the export is failing. However, if we limit the export to 4 or 100 records, it works perfectly fine.

Also, please find the error below:

Unexpected errors have occurred during Workflow execution:
9003: An unexpected error has occurred during Workflow execution
4056: Error when preparing JDBC statement parameter

Answers

  • Josh Boxer
    Josh Boxer Administrator
    edited February 20

    If I understand correctly, you are able to export the first 100 rows successfully, but the full 79k records return an error?

    There is an option in the Export step ‘Continue Workflow execution on JDBC exports error’, which will continue processing starting from the next record/batch (rather than failing completely). The step will produce a ‘Show failing rows’ output that includes the first database error message that impacted one or more rows in each failing batch. Re-exporting these failed rows with a smaller batch size can identify which row is causing the failure if this cannot be determined from the message.

    https://docs.experianaperture.io/data-quality/aperture-data-studio-v2/prepare-and-move-data/export-data/#export
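
    To illustrate why a single bad row can fail a whole batch, here is a rough sketch of how batched JDBC inserts typically behave. This is generic JDBC only, not Data Studio's actual export code; the URL, table, columns and values are made-up placeholders:

        import java.sql.*;

        public class BatchInsertSketch {
            public static void main(String[] args) throws SQLException {
                // Placeholder URL; a real HDI export would use the cluster's
                // actual Hive JDBC endpoint and credentials.
                String url = "jdbc:hive2://example-hdi:443/default";
                try (Connection conn = DriverManager.getConnection(url);
                     PreparedStatement ps = conn.prepareStatement(
                             "INSERT INTO target_table (col1, col2) VALUES (?, ?)")) {
                    String[][] rows = {{"ok", "1"}, {"also ok", "2"}, {"bad value", "3"}};
                    for (String[] row : rows) {
                        ps.setString(1, row[0]);
                        ps.setString(2, row[1]);
                        ps.addBatch();
                    }
                    try {
                        ps.executeBatch();
                    } catch (BatchUpdateException e) {
                        // A single bad row fails the whole batch, and the driver
                        // usually reports only the first error. This is why
                        // re-exporting the failed rows with a smaller batch size
                        // helps narrow down the offending row.
                        System.err.println("Batch failed: " + e.getMessage());
                    }
                }
            }
        }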

    Also, if you look in the Data Studio log file you might be able to see a more detailed error message.

  • JenJ
    JenJ Member
    edited February 20

    Hi Josh,

    Yes, that's right - I am able to export a few records to the Hadoop table, but not the 79k records with 83 columns. I have already checked the option you mentioned in the workflow settings. No luck!

    See the outcomes below:

    1. 4 records – Successfully exported to Hadoop Table
    2. 34 records – Failed export to Hadoop Table
    3. 170 records – Failed export to Hadoop Table
    4. 79k records – Failed export to Hadoop Table

    Also, see the attached screenshot.

  • Josh Boxer
    Josh Boxer Administrator

    Check the log file and the failed rows file.

    If still unclear please raise a Support ticket with the details and we will help you investigate further.

  • JenJ
    JenJ Member

    Hi Josh,

    Thanks for looking into this one - no failed rows were created. I have not raised a support ticket before, so I will check with the Aperture support team and raise one.

    Thanks again.

    Best Regards,

    Jencil.

  • Henry Simms
    Henry Simms Administrator

    Given that very small exports work, it sounds like the connection is either timing out, hitting some kind of data transfer size threshold, or perhaps being blocked somewhere on its route to HDI.
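
    If a timeout is suspected, standard JDBC lets you raise the login and per-statement timeouts to rule that out. A minimal sketch, with placeholder values and URL (note that not all drivers honour setQueryTimeout):

        import java.sql.*;

        public class TimeoutCheckSketch {
            public static void main(String[] args) throws SQLException {
                String url = "jdbc:hive2://example-hdi:443/default"; // placeholder
                DriverManager.setLoginTimeout(120); // seconds to wait for the connection
                try (Connection conn = DriverManager.getConnection(url);
                     Statement st = conn.createStatement()) {
                    st.setQueryTimeout(600); // seconds per statement; 0 means no limit
                    st.execute("SELECT 1");
                }
            }
        }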

    I would recommend setting up the spy log, which may give more information about what's failing from the driver's perspective.
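
    For example, if the HDI connection uses a DataDirect driver, spy logging is typically enabled through a SpyAttributes connection property. This is a hypothetical sketch - the URL prefix, host and log path are placeholders, so check your driver's documentation for the exact syntax:

        import java.sql.Connection;
        import java.sql.DriverManager;
        import java.sql.SQLException;

        public class SpyLogSketch {
            public static void main(String[] args) throws SQLException {
                // SpyAttributes asks a DataDirect driver to log every JDBC call,
                // including prepared-statement parameters, to a file, which
                // should show exactly where the export is failing.
                String url = "jdbc:datadirect:hive://example-hdi:443"
                        + ";SpyAttributes=(log=(file)/tmp/jdbc-spy.log;timestamp=yes)";
                try (Connection conn = DriverManager.getConnection(url, "user", "password")) {
                    System.out.println("Connected with spy logging enabled");
                }
            }
        }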

    I wouldn't use the Atomic Database Update option for large volumes as it will be inefficient.
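
    To illustrate the difference: an atomic update wraps the entire export in one transaction, so all 79k rows stay uncommitted until the end and any failure rolls everything back, whereas per-batch commits keep each transaction small and a failure loses at most one batch. A rough sketch in generic JDBC (table, columns and batch size are made up):

        import java.sql.*;
        import java.util.List;

        public class CommitStrategySketch {
            static final int BATCH_SIZE = 1000; // made-up batch size

            // Per-batch commits: the database never holds one huge open
            // transaction, and a failure affects at most one batch.
            static void exportInBatches(Connection conn, List<String[]> rows)
                    throws SQLException {
                conn.setAutoCommit(false);
                try (PreparedStatement ps = conn.prepareStatement(
                        "INSERT INTO target_table (col1, col2) VALUES (?, ?)")) {
                    int pending = 0;
                    for (String[] row : rows) {
                        ps.setString(1, row[0]);
                        ps.setString(2, row[1]);
                        ps.addBatch();
                        if (++pending == BATCH_SIZE) {
                            ps.executeBatch();
                            conn.commit(); // commit as each batch completes
                            pending = 0;
                        }
                    }
                    if (pending > 0) {
                        ps.executeBatch();
                        conn.commit();
                    }
                }
                // An atomic export would instead call conn.commit() exactly once,
                // after every batch has executed, rolling everything back on
                // any failure.
            }
        }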