07 Sep 2016
Issue with CHAR, VARCHAR, and DATE data types when exporting/importing between Hive and Teradata using TDCH

Hello,
I am trying to import/export tables between Teradata and Hive, but if the Hive table has CHAR/VARCHAR/DATE data types, I get the error below from the TDCH connector:

INFO tool.ConnectorExportTool: ConnectorExportTool starts at 1473252737445
INFO common.ConnectorPlugin: load plugins in file:/tmp/hadoop-unjar6516039745100009834/teradata.connector.plugins.xml
INFO hive.metastore: Trying to connect to metastore with URI thrift://el3207.bc:9083
INFO hive.metastore: Connected to metastore.
INFO processor.TeradataOutputProcessor: output postprocessor com.teradata.connector.teradata.processor.TeradataBatchInsertProcessor starts at: 1473252738715
INFO processor.TeradataOutputProcessor: output postprocessor com.teradata.connector.teradata.processor.TeradataBatchInsertProcessor ends at: 1473252738715
INFO processor.TeradataOutputProcessor: the total elapsed time of output postprocessor com.teradata.connector.teradata.processor.TeradataBatchInsertProcessor is: 0s
INFO tool.ConnectorExportTool: com.teradata.connector.common.exception.ConnectorException: CHAR(6) Field data type is not supported
        at com.teradata.connector.common.tool.ConnectorJobRunner.runJob(ConnectorJobRunner.java:140)
        at com.teradata.connector.common.tool.ConnectorExportTool.run(ConnectorExportTool.java:62)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:90)
        at com.teradata.connector.common.tool.ConnectorExportTool.main(ConnectorExportTool.java:780)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:136)

16/09/07 14:52:19 INFO tool.ConnectorExportTool: job completed with exit code 14006

 

An alternative I can see is changing the CHAR/VARCHAR/DATE columns to STRING type in the Hive table and then doing the import/export.
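For example, instead of altering the original table, a view (or a one-off staging table) that casts the unsupported columns to STRING could serve as the export source. This is only a sketch; the table and column names below are hypothetical:

```sql
-- Hypothetical Hive table with types TDCH rejects:
--   orders(order_code CHAR(6), customer VARCHAR(50), order_dt DATE)

-- Option 1: a view that casts everything to STRING
CREATE VIEW orders_export AS
SELECT CAST(order_code AS STRING) AS order_code,
       CAST(customer   AS STRING) AS customer,
       CAST(order_dt   AS STRING) AS order_dt
FROM orders;

-- Option 2: a staging table (CTAS), if the connector cannot read from a view
CREATE TABLE orders_export_stg AS
SELECT CAST(order_code AS STRING) AS order_code,
       CAST(customer   AS STRING) AS customer,
       CAST(order_dt   AS STRING) AS order_dt
FROM orders;
```

The original table stays unchanged either way, but this still means maintaining an extra object per table, so it is not ideal.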

 

Changing the existing tables is not an optimal solution. Can anyone please help with this?

 

Thanks,

 
 
