Column exceeds maximum allowed?

stevenmckinnon New Member
edited October 11 in General

Anyone got any ideas? I was getting the problem on the postcode column, so I ran a length check and restricted it to a maximum of 9 characters, and converted anything containing just spaces to NULLs, hence the "New Postcode" column.
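For what it's worth, the cleansing I applied is roughly equivalent to this (a Python sketch outside Data Studio, just to show the logic; the function name is illustrative):

```python
def clean_postcode(value):
    """Trim to 9 characters and convert whitespace-only values to NULL."""
    if value is None or str(value).strip() == "":
        return None          # whitespace-only -> NULL
    return str(value)[:9]    # enforce the 9-character maximum

print(clean_postcode("AB1 2CD"))       # "AB1 2CD"
print(clean_postcode("   "))           # None
print(clean_postcode("EC1A 1BB9999"))  # "EC1A 1BB9"
```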



Best Answers

  • stevenmckinnon New Member
    Accepted Answer

    Thanks Henry ... I will definitely do that. I am trying to run a few things to isolate the rows causing me the problem, as I no longer just want to fix the problem, I really want to know what caused it. Just me taking things too far as always, but my OCD won't let me leave it alone. Will let you know how I get on.



Answers

  • Clinton Jones Experian Super Contributor

    Hello @stevenmckinnon, this looks like a JDBC-related error. Can you open a support ticket so we can get some technical resources to dig into why this is happening? https://www.edq.com/documentation/contact-support/uk/



  • Clinton Jones Experian Super Contributor

    Hi @stevenmckinnon, I looked at this again and it seems pretty clear that you may be trying to update the SQL database from your export with data that is too wide for the target.

    Depending on your schema definition for the target in the database you may have set some maximum lengths or types for the target table.

    The easiest way to identify the offending rows is to either create a rule ahead of the export that filters out records that are too long, or to run a profile on the reworked data and look at the longest length. You can do this with a suitable validation rule.

    Here's a simple one for a length check:

    Here's a different one for multiple format checks:


    These will also catch NULLs and over-length postcodes.

    You could also use a regular expression match or leverage a constant.
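    As a rough illustration of the regular-expression approach (a Python sketch; the pattern below is a simplified UK postcode format for demonstration, not the full official specification):

    ```python
    import re

    # Simplified UK postcode pattern (illustrative only): one or two letters,
    # one or two digits optionally followed by a letter, a space, then a digit
    # and two letters.
    POSTCODE_RE = re.compile(r"^[A-Z]{1,2}\d{1,2}[A-Z]?\s\d[A-Z]{2}$", re.IGNORECASE)

    def is_valid_postcode(value):
        # NULLs and over-length values fail the check as well
        if value is None or len(value) > 9:
            return False
        return bool(POSTCODE_RE.match(value.strip()))

    print(is_valid_postcode("EC1A 1BB"))  # True
    print(is_valid_postcode("12345"))     # False: purely numeric
    print(is_valid_postcode(None))        # False: NULL
    ```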

  • Henry Simms Experian Super Contributor

    Hi @stevenmckinnon, in this particular case it looks like your workflow is exporting to SQL Server with the CREATE table option, rather than INSERT:

    In this case, the error

    java.sql.SQLException: [Pandora][SQLServer JDBC Driver][SQLServer]The size (<length>) given to the column '<col name>' exceeds the maximum allowed for any data type (8000).

    refers to SQL Server's limit of 8000 bytes (the equivalent of 8000 single-byte characters) for char and varchar columns. Data Studio is attempting to create a column in the new target table that is wider than SQL Server will allow. A solution could be to allow Data Studio to create columns with varchar(max), allowing for longer values to be stored, but using up more storage space on the SQL Server instance. I'll look at whether we want to allow this option.
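    To make the limit concrete, here's a sketch of the check (Python, outside Data Studio; the 8000 figure comes from the error message, and a single-byte encoding is assumed so one character equals one byte):

    ```python
    SQL_SERVER_MAX_VARCHAR_BYTES = 8000  # char/varchar limit per the error message

    def exceeds_limit(value, encoding="latin-1"):
        # A value wider than 8000 bytes cannot fit a varchar(n) column;
        # only varchar(max) would accept it.
        if value is None:
            return False
        return len(value.encode(encoding)) > SQL_SERVER_MAX_VARCHAR_BYTES

    print(exceeds_limit("AB1 2CD"))   # False
    print(exceeds_limit("X" * 8001))  # True
    ```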

    In the meantime, it's likely that the very long values in the postcode column indicate that something has gone wrong in the length check. If you can show us how you're doing that, and some results, we should be able to work it out!

  • Hi Henry and Clinton

    I should have mentioned at the start of my post that I had included a length check on the postcode when I first encountered this problem, and it didn't seem to fix it. I have put a filter in after the length check to exclude all the "TRUE" values.

    In the export step I am using CREATE rather than INSERT.


  • Henry Simms Experian Super Contributor

    That looks like it should work Steve.

    One thing I noticed: Should the length check be applied to the New_Postcode column rather than Postcode?

    To help diagnose the problem further, you could also Branch after the final Filter step and Profile the new postcode column, checking the max length there to make sure there's nothing remaining that looks wrong:


    In show data on the Profile step, you would see a Longest Length stat:
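    The Longest Length stat is essentially this (a Python sketch of the idea, not Data Studio's implementation):

    ```python
    def longest_length(values):
        # Ignore NULLs, measure the rest, report the maximum length found
        lengths = [len(v) for v in values if v is not None]
        return max(lengths) if lengths else 0

    print(longest_length(["AB1 2CD", "EC1A 1BB", None]))  # 8
    ```

    If that number comes back larger than 9 after your length check, something upstream isn't doing what you expect.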


  • Clinton Jones Experian Super Contributor

    @stevenmckinnon and have you set the postcode column up as accepting NULLs?

  • So I also spotted the "New Postcode" bit and ran the workflow again after fixing that mistake, but it spat out the same error message. So I ran the profile step and have attached the output. Anything obvious in it?


  • Clinton Jones Experian Super Contributor

    At least one row ... so ultimately you have one or more rows that are problematic. You may need to apply a Remove Noise on this column, as you may have trailing spaces?
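    Trailing spaces are easy to miss because they inflate the measured length without being visible. A minimal sketch of the trimming idea (Python, illustrative):

    ```python
    def remove_noise(value):
        # Strip leading/trailing whitespace; "AB1 2CD   " measures 10 characters
        # but only 7 are real data
        return value.strip() if value is not None else None

    raw = "AB1 2CD   "
    print(len(raw), len(remove_noise(raw)))  # 10 7
    ```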


  • Henry Simms Experian Super Contributor

    FYI I updated the previous comment to give more detail, so please re-read if you were confused before!

  • Hi Henry

    Just to keep you updated: I ran a filter to remove all numerical values, as a UK postcode can't be purely numerical, and it took out approx. 21,000 rows. I re-ran the workflow and it is now processing, so you were right: there must have been something hiding in the numerical values, like a value in scientific notation.
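    For anyone else hitting this, the "purely numerical" check can be sketched like this; note how a scientific-notation value still counts as numeric, which is likely how a very wide value slipped through (Python, illustrative):

    ```python
    def is_purely_numeric(value):
        # True for anything that parses as a number, including
        # scientific notation such as "1.2E+15"
        try:
            float(value)
            return True
        except (TypeError, ValueError):
            return False

    print(is_purely_numeric("1.2E+15"))  # True: scientific notation is numeric
    print(is_purely_numeric("AB1 2CD"))  # False: a real postcode survives the filter
    ```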

    Out of interest: my workflow imported 5 tables. 4 were from JDBC and 1 was a snapshot. I can edit the column values on the JDBC tables through the Profile -> Edit column function, but I couldn't work out how to do a similar task on the snapshot. The snapshot has 14m rows in it and took several hours to produce, so I didn't fancy re-creating it just for testing purposes. Is it possible to edit the column values of a snapshot?

  • Henry Simms Experian Super Contributor

    Hi Steve, good to hear you're making progress.

    When you're changing the data types / data standardisation rules on a column in a JDBC table via the Preview -> Edit dialog, you're affecting how the values will get loaded in. Because a snapshot is already a loaded table, the same thing isn't possible. However, you can manipulate the data based on its type for a snapshot:

    If you use the Datatype function, you can view the type for each value:

    You can check and change data types in the Transform step. In this example, I can change any values with Decimal type to Alphanumeric:

    To see the result more clearly, I've added additional columns to show the data type for values in the "number" column before and after the conversion

    Note that some conversions will throw errors: Converting a Decimal to an Alphanumeric is fine, but converting an Alphanumeric to a Decimal would fail with an error if the value couldn't actually be converted.
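    The asymmetry between the two conversions can be sketched like this (Python; the function names are illustrative, not Data Studio functions):

    ```python
    def to_alphanumeric(value):
        # Decimal -> Alphanumeric always succeeds: every value has a text form
        return str(value)

    def to_decimal(value):
        # Alphanumeric -> Decimal can fail when the text isn't a number
        try:
            return float(value)
        except ValueError:
            raise ValueError(f"Cannot convert {value!r} to Decimal")

    print(to_alphanumeric(3.14))  # "3.14"
    print(to_decimal("42"))       # 42.0
    # to_decimal("AB1 2CD") would raise ValueError
    ```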
