Export to JDBC - not loading but Job completing

Hi

I have an export step that writes to a JDBC database. We have another table in the same database that's working, and we have checked the connection and it works.

When I load to my database it says the job completed, but you can clearly see 0 seconds at the export stage (loading 900K lines, 48 columns, normally takes about 1 hour 47 mins). There are no errors in the log, and the job doesn't fail.

Is there anywhere I can look to find out why it's not loading?

Do I need to make sure the fields in my schema exactly match what the table has?

So, here's the export step:


Key columns has all 48 columns, in the same sequence as the database.

When I run the job, all the steps run in a second (1000 rows),

but it's not doing the export step.

Transform 7 is the last step before the export (1000 lines, so quite quick).

On a previous run it went for over an hour, stated complete, and hadn't actually worked, but there was no error in the server log and the steps showed the export taking 0 seconds. It's not actually doing anything.


Any ideas on what I need to do or look for? I did expect errors on fields going to the database, but no errors are listed.

This is quite urgent to get working, so any ideas would be a great help.

thanks

Answers

  • Carolyn Contributor

    Seeing this error in the log. Not sure if this helps find the issue:


  • Sueann See Administrator

    @Carolyn this is something we might not have a quick answer to. Have you logged a case with the support team?

  • Ivan Ng Administrator

    Hi Carolyn,

    I would advise that you raise a support ticket for this. From the little information I can see in the error message, it does seem to indicate that the schema of the table you're exporting to doesn't match. You need to ensure that the columns in your output match the columns in the external table. We would recommend the following:

    1. Create a dataset in Data Studio that matches the target table's schema.
      - An easy way to do this is to load the table itself as a new dataset, using the "Use column headings only" option to bypass loading the actual rows.

    2. In your workflow, just before the Export step, use a Map to target step to map your output columns to the dataset you uploaded in step 1. Once done, connect this to the Export step. This ensures that your columns match the ones in the target table; to double-check the target schema on the database side, see the query sketch just after this list.
      - Note: You do not need to map all columns in the target. Columns that are not mapped will be null (for INSERT) or left unchanged (for UPDATE).
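
    A query along these lines lists the target table's columns and types so you can line your output up against them. This assumes SQL Server (the driver named later in this thread); 'MyTargetTable' is a placeholder for your actual table name.

    ```sql
    -- List the target table's columns, types and sizes, in order,
    -- so the workflow output can be matched against them.
    -- 'MyTargetTable' is a hypothetical name.
    SELECT COLUMN_NAME, DATA_TYPE, CHARACTER_MAXIMUM_LENGTH,
           NUMERIC_PRECISION, NUMERIC_SCALE
    FROM   INFORMATION_SCHEMA.COLUMNS
    WHERE  TABLE_NAME = 'MyTargetTable'
    ORDER BY ORDINAL_POSITION;
    ```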

    Let me know if you have further questions. Thanks.

    Best Regards,

    Ivan Ng

  • Sueann See Administrator
    @Carolyn you mentioned that Key columns has all 48 columns. It should really contain only the primary key of the table you are trying to insert/update (a quick way to check the key is sketched below). Not sure if this solves your problem? If not, please log a ticket for further investigation.
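
    For reference, one way to see which columns actually make up the primary key on the SQL Server side; the table name below is a placeholder.

    ```sql
    -- List the primary key columns of the target table; only these
    -- should go into the Export step's key columns.
    SELECT kcu.COLUMN_NAME, kcu.ORDINAL_POSITION
    FROM   INFORMATION_SCHEMA.TABLE_CONSTRAINTS AS tc
    JOIN   INFORMATION_SCHEMA.KEY_COLUMN_USAGE AS kcu
           ON  kcu.CONSTRAINT_NAME = tc.CONSTRAINT_NAME
           AND kcu.TABLE_NAME = tc.TABLE_NAME
    WHERE  tc.TABLE_NAME = 'MyTargetTable'
      AND  tc.CONSTRAINT_TYPE = 'PRIMARY KEY'
    ORDER BY kcu.ORDINAL_POSITION;
    ```
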
  • Carolyn Contributor

    Hi Everyone

    After some tracing in the background we identified the problem. However, I do have a few questions, if that's okay:

    1. In my file, I have added longitude and latitude to an address as two separate columns. In Aperture I have these as numeric, and in the database it's loading into they are decimal. I do a Map to target prior to loading.

    My layout looks like this:

    However, when we did a trace, it was failing, because it's loading it like this:

    It's converting it to varchar. We didn't change the Aperture end, as it made no difference what I put in the schema; instead we changed the database it's writing back to, making the column varchar, and it worked. However, they would like to have this as numeric in future.

    2. Also, is there a way to set the character length on the Aperture side, or is 4000 the default and that's okay?

    3. Can I get an understanding of what all the options do on an export to a database? We are not sure which one to use going forward. At the moment I have done an insert, as the database was empty, but going forward, every time we have a new full load, what should we be using:


    Can you please advise

    thanks

    Carolyn

  • Sueann See Administrator
    edited August 12

    @Carolyn

    1. Is N'145.095965915286' the sample value that you are trying to insert as a decimal? It looks like you have to convert it to a real decimal value first: for example, North would be represented as a positive number whereas South would be a negative number. If you insert a value of 145.095965915286 instead, it should work as a decimal. What database are you using, by the way?
    2. The character limit should ultimately be defined at the database. You need to understand what kind of data you are saving and the maximum length/size you are expecting, so that you can specify an appropriate size. As a best practice, specify what you really need rather than a column length much larger than required, for reasons such as query performance and the risk of hitting the maximum row size and being unable to add new columns in the future. You can use Aperture Data Studio to enforce data quality, so that data which exceeds the expected character length never reaches the database and causes an error on insert. (See the sizing sketch at the end of this post.)
    3. It will depend on what you anticipate in your new full load. Do you expect more rows that are totally new (inserts) or more rows that need updates?

    Insert - does only inserts. If you try to insert a row that already exists, this will cause an error if you have defined a unique primary key on your database table. However, if you are not using any unique keys, it will just insert another row, possibly creating a duplicate if you have inserted it before.

    Update - does only updates. So if you try to update a row that does not exist, it does nothing.

    Delete - does only deletes. So if you try to delete a row that does not exist, it does nothing.

    Insert or update - tries inserts first. If the row already exists (based on keys) then tries updating instead.

    Update or Insert - tries updates first. If the row does not exist, then tries insert instead.


    The difference between Insert or update and Update or Insert is about performance. If you anticipate more rows to be updated, use Update or Insert. If you anticipate more rows to be inserted, use Insert or update.
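
    For intuition, here is roughly what Update or Insert amounts to on the database side. This is only a sketch of the semantics in T-SQL, not what Data Studio literally executes, and the table, column names and values are hypothetical.

    ```sql
    -- "Update or Insert" semantics, sketched: try the update first;
    -- if no row matched the key, insert instead.
    UPDATE dbo.Addresses
    SET    Latitude = 45.0959, Longitude = 145.0959
    WHERE  AddressId = 42;

    IF @@ROWCOUNT = 0
        INSERT INTO dbo.Addresses (AddressId, Latitude, Longitude)
        VALUES (42, 45.0959, 145.0959);
    ```

    "Insert or update" is the mirror image: attempt the insert first, and fall back to an update if the key already exists.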
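
    And on the sizing question (point 2): instead of defaulting to 4000, one way to choose a sensible length is to profile what the data actually contains; a sketch with hypothetical table and column names:

    ```sql
    -- Longest value currently stored, as a guide for picking a
    -- column length with some headroom.
    SELECT MAX(LEN(AddressLine1)) AS longest_value
    FROM   dbo.Addresses;
    ```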

  • Carolyn Contributor

    Hi

    The longitude and latitude are derived by Aperture using address validation. I haven't formatted them in any way; one is a positive number and one is a negative number. Should these be numeric?

  • Sueann See Administrator

    @Carolyn I had to edit my post to include more details about how "Insert" works in different scenarios. If you try to insert a row that already exists, this will cause an error if you have defined a unique primary key on your database table. However, if you are not using any unique keys, it will just insert another row, possibly creating a duplicate if you have inserted it before.

    Which dataset/data are you using for address validation specifically?

  • Carolyn Contributor

    We are having some really interesting issues when exporting via JDBC.

    A job will run and say it failed, but when you view the details, the export step is either Started or Running, and it just keeps going until the job falls off the queue. There is no option to cancel, as it says it's finished. We are doing traces at the database end and can see nothing happening.

    I've had one complete on 1000 rows, with no errors and no failed lines, and it didn't load.

    We will continue to do traces, but the front end doesn't look right.

    Is anyone aware of known issues in this space?

    thanks

  • Ivan Ng Administrator
    edited August 13

    Hi Carolyn,

    I believe you're raising two issues here. Issue 1, as I understand it, is that you see a job has failed, but when looking into the job details, the Export step still shows a Started/Running status? This is interesting. I've not seen it before.

    For your second issue, as I understand it, you have a job with a Completed status, but the rows are not inserted/updated in the SQL database? This seems wrong. Usually the insert/update will fail if you have invalid rows, a datatype mismatch, or a column nullability or value length incompatibility, but that should be flagged rather than the job getting a Completed status.

    Please can you open a support ticket and we can look into this. Thanks.

  • Carolyn Contributor

    Hi

    We are getting no clear messages. Can you assist with what some of these might mean?

    An unexpected error has occurred during Workflow execution. The supplied value is not a valid instance of data type numeric. Check the source data for invalid values. An example of an invalid value is data of numeric type with scale greater than precision.


    An unexpected error has occurred during Workflow execution. [ApetureDataStudio] [SQLServer JDBC Server] Object has been closed


    thanks

  • Ivan Ng Administrator

    Hi Carolyn,

    The first error message suggests that you have invalid or incorrect values for the data type specified for the column. Is this the lat/long columns you mentioned earlier? You may want to check whether any values exceed the precision/scale defined for the column.
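
    To make precision and scale concrete: DECIMAL(p, s) holds p digits in total, s of them after the decimal point, so DECIMAL(18, 12) leaves only 6 digits before the point. A small illustration (the type and values here are made up, not taken from your table):

    ```sql
    -- TRY_CAST returns NULL instead of raising an error when a value
    -- does not fit, which makes it handy for hunting offending rows.
    SELECT TRY_CAST('145.095965915286'  AS DECIMAL(18, 12)) AS fits,
           TRY_CAST('1450959659.15286'  AS DECIMAL(18, 12)) AS too_many_leading_digits, -- NULL
           TRY_CAST('N145.095965915286' AS DECIMAL(18, 12)) AS not_numeric;             -- NULL
    ```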

    As for the error message "[ApetureDataStudio] [SQLServer JDBC Server] Object has been closed", I've not really seen this one much. Can I ask what is being executed when you hit this error? Are you exporting to or importing from the SQL DB? What's the number of rows like? Also, can I ask about the setup of your server? Is it sitting behind a proxy?

    Lastly, I tried to simulate your initial issue where you wanted to export the latitude and longitude values from the Validate Address step into decimal-type columns in the SQL DB table. I found that I needed to convert the Latitude and Longitude columns to numeric first before I was able to export to decimal columns. This is basically what I did:

    My workflow:

    My SQL Table design:



    Post export results:
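
    One more thought: since your current workaround stores lat/long as varchar, a check along these lines (table and column names are hypothetical) would flag any values that would block switching the columns back to decimal later:

    ```sql
    -- Find varchar latitude values that will not convert cleanly
    -- to DECIMAL, before altering the column type back to numeric.
    SELECT AddressId, Latitude
    FROM   dbo.Addresses
    WHERE  Latitude IS NOT NULL
      AND  TRY_CAST(Latitude AS DECIMAL(18, 12)) IS NULL;
    ```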


  • Ian Buckle Administrator

    @Carolyn I have made the support team aware of the issues you are facing, and they will reach out to you to investigate further.

  • Carolyn Contributor

    Hi

    Thanks, that would be great. I am seeing some really weird behaviour in the jobs listing when we run. We changed the latitude and longitude to number prior to running, as mentioned above, and had no luck.

    The job states failed; go into the view and you see it's running the export and has the error. Then it runs for hours until it drops off the jobs list. If you stop and start the server, the status of the export step changes to Started.

    We are doing SQL traces at the back end and it's just not doing anything; this morning it looks like it found the connection and then just dropped it. We know the server connection works, as we have a dataset with a small number of columns that can connect and load to the same server we are writing back to.

    Any assistance would be great
