I have a text field on HDFS whose value can be very long, more than 4000 characters. I had to export it to Oracle using Sqoop. In the Oracle table, this field is defined with the data type VARCHAR2(4000 BYTE).
Sqoop export fails with the following error:
Caused by: java.sql.SQLException: ORA-01461: can bind a LONG value only for insert into a LONG column
Oracle treats a bind value longer than the column's defined limit as a LONG, and a LONG can only be inserted into a LONG column. Hence the error, which is not very informative.
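Note that VARCHAR2(4000 BYTE) counts bytes, not characters, so with multibyte encodings a value can overflow the column while still being under 4000 characters. A minimal sketch of that byte-vs-character gap, assuming UTF-8 data:

```python
# VARCHAR2(4000 BYTE) limits BYTES. A multibyte character (the Euro
# sign is 3 bytes in UTF-8) inflates the byte length well past the
# character count, so even a 2000-character value can overflow.
value = "€" * 2000

print(len(value))                   # character count: 2000
print(len(value.encode("utf-8")))   # byte count: 6000 -- over the 4000 BYTE limit
```

So the overflow can happen even when a character count of the HDFS data looks safe.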
Use a CLOB data type for that field in Oracle. A CLOB can store values far longer than 4000 characters.
Don't forget to add the --map-column-java option to the Sqoop export command: there is no CLOB type in Java/Hive, so Sqoop has to know to treat this value as a String. The full command is shown below.
sqoop export --connect jdbc:oracle:thin:@hostname:1521/sid \
  --username user --password passwd \
  --table Oracle_Table_Name \
  --columns col1,col2,col3 \
  --map-column-java col3=String \
  --export-dir 'location' \
  --input-fields-terminated-by '\001' --input-lines-terminated-by "\n" \
  --input-null-string '\\N' --input-null-non-string '\\N'
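Before running the export, it can be worth scanning the HDFS data for fields that would overflow a 4000-byte column. A small sketch, assuming the same delimiters as the command above ('\001' between fields, '\n' between lines) and that the long text is the third column; the function name and sample data are illustrative, not part of Sqoop:

```python
# Sketch: flag rows whose field at field_index would overflow
# VARCHAR2(4000 BYTE). Delimiters match the sqoop command above.
def oversized_rows(lines, field_index=2, limit=4000, encoding="utf-8"):
    """Yield (line_number, byte_length) for rows whose field exceeds limit bytes."""
    for lineno, line in enumerate(lines, start=1):
        fields = line.rstrip("\n").split("\x01")
        if field_index < len(fields):
            nbytes = len(fields[field_index].encode(encoding))
            if nbytes > limit:
                yield lineno, nbytes

# Example on in-memory sample data (the second row's third field is 5000 bytes):
sample = ["a\x01b\x01short\n", "c\x01d\x01" + "x" * 5000 + "\n"]
print(list(oversized_rows(sample)))  # [(2, 5000)]
```

Rows flagged by such a check are the ones that would trigger ORA-01461 against a VARCHAR2(4000 BYTE) column.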
One thought on “Sqoop Export a Long Text Field”
Thanks for this:
--map-column-java col3=String \
helped me out.