CSV append failed due to data being "out of range"

Siim Neljandik shared this question 4 years ago
Answered

Hello,

A user tried to upload data to Yellowfin with the CSV uploader. When they uploaded the file, they got only the generic, unhelpful "Oh no! An error has occurred during processing, please try again later" message.

When I looked into the logs, I found that the real error was "Error: java.sql.BatchUpdateException: Data truncation: Out of range value for column". When I checked the table, the column was decimal(5,0), while the data in the newer file required decimal(8,2). Is there a way to enable Yellowfin to dynamically change the column size during an append, or is the Yellowfin CSV import unusable for files with variable field sizes?
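For reference, the failure can be reproduced outside Yellowfin with a minimal sketch like the following (assuming a MySQL target running in strict mode, which the "Data truncation" wording suggests; the table and column names here are made up):

    CREATE TABLE csv_import (amount DECIMAL(5,0));
    -- Values from the original file fit in five digits with no decimals
    INSERT INTO csv_import (amount) VALUES (12345);
    -- A value from the newer file needs DECIMAL(8,2) and fails with
    -- "Out of range value for column 'amount'"
    INSERT INTO csv_import (amount) VALUES (123456.78);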

System information:

Application Version: 8.0.2
Build: 20190820
Java Version: 1.8.0_171
Operating System: Windows Server 2012 R2 6.3 (amd64)

Replies (6)

Hi Siim,

Thanks for reaching out. If you take a look at the Do section under the Prepare CSV File header in our CSV Data Imports page, which can be found here, you'll see that one of the requirements is "Consistent Data per Column". Decimal(8,2) and decimal(5,0) are not the same data type, and I suspect that if you attempted to append data in a standard RDBMS in a similar manner, you'd run into an identical issue unless you first scripted and executed an Alter Table query to change the data type of the applicable column.

That said, once you import CSV data into Yellowfin, you can head into the RDBMS where the data was imported and run that Alter Table query to change the data type, at which point you should be able to successfully import the decimal(8,2) CSV data.

I think dynamically changing data types on import would cause more problems than it solves: if the data type changed every time an append ran, that could lead to data loss and other undesired effects. Keeping the step manual is also a safeguard, since users should be aware of differences in the data types they're working with. If you're using the same data source and tables, data types really shouldn't change all that often, so for these typically rare scenarios, altering the data type manually should do the trick.
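As a rough illustration (assuming a MySQL-family database, since the "Data truncation" error text points that way; the table and column names below are placeholders, not anything Yellowfin generates):

    -- Widen the column so both the old (5,0) and new (8,2) values fit;
    -- existing rows are kept and converted in place
    ALTER TABLE csv_import MODIFY COLUMN amount DECIMAL(8,2);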

Regards,

Mike

Hello,

If you use the overwrite option, is the table fully rebuilt, or does it keep the existing table structure? If the structure is kept, we could do an initial dummy upload with the highest expected values to initialize the table and then overwrite it with the actual data. The whole reason we wanted to use the Yellowfin CSV upload was to let a lower-level user upload data without our input; having to review and alter tables would defeat that purpose. Another way this issue could be mitigated is if the column size could be changed during the data preparation step, but I didn't see that kind of option when I tested it.

While I understand that from a database administration standpoint decimal(8,2) and decimal(5,0) are different data types, in the real world they are treated as the same thing. This will most likely confuse our less technical users, who would be in charge of the data upload.

All the best,

Siim

Hi Siim,

Thanks for your response. I understand how a non-technical user may not grasp the difference between these data types, and so it may be confusing. However, my point is that from a SQL perspective these are different data types, and it is the RDBMS that is throwing the truncation error, not Yellowfin. As you alluded to, though, if you alter the table and change the column to the data type that accommodates both, in this case decimal(8,2), there shouldn't be an issue. Altering a table to change a column does not rebuild the table, nor does it delete the data in the column the query is applied to:

[screenshot attachments]

If data needs to be altered before insertion, one option available from within Yellowfin would be to utilize our ETL functionality, but if your workflow is intended for non-technical users, I suspect that method would not be suitable either. It is still an option from within Yellowfin if you need to change a data type, though. Ultimately, it's a matter of bringing in the correct data type in the first place, and there are really only a few ways of doing so: obtaining compatible data from the source, utilizing ETL to convert to the compatible data type, or manually changing the data type in the RDBMS.

Regards,

Mike

Hi Siim,

I just wanted to check in and see whether my last response was helpful. Do you need anything else from us on this?

Regards,

Mike

Hello,

The workaround we went with is to create a dummy CSV with large values to initialize the table with wide enough fields, and then overwrite it with the actual data. This way users without database access or knowledge can handle the upload on their own.
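For example, a dummy file along these lines (the header and values here are placeholders, sized to the widest data we expect) initializes the table with columns wide enough for decimal(8,2) values:

    id,amount
    1,999999.99

Once the table exists, the real files are uploaded with the overwrite option, replacing the dummy row while keeping the table structure.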

All the best,

Siim N

Hi Siim,

Thanks for letting us know the method you chose to achieve this. With that, I'll go ahead and close out this case, but please don't hesitate to reach out with any further questions or concerns on this or anything else.

Regards,

Mike
