Tags for tdch (11 results)

I am not able to use the enclosedby and escapedby arguments in the Teradata Connector for Hadoop. I get the following error when I pass these arguments; here I am trying to set enclosedby to a double quote and escapedby to a forward slash. The error goes away when I remove the enclosedby and escapedby arguments.
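
For reference, a minimal sketch of the kind of command involved, with hypothetical connection details; one thing worth checking is that the quote and slash characters are single-quoted so the shell passes them through to TDCH unchanged:

    # Hypothetical example only: host, credentials, table and paths are placeholders,
    # and the jar path / tool class name depend on the TDCH version installed.
    hadoop jar /path/to/teradata-connector.jar \
        com.teradata.connector.common.tool.ConnectorImportTool \
        -url jdbc:teradata://td-host/database=source_db \
        -username td_user -password td_pass \
        -jobtype hdfs -fileformat textfile \
        -sourcetable my_table \
        -targetpaths /user/hduser/my_table \
        -separator ',' \
        -enclosedby '"' \
        -escapedby '/'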

I am trying to load data from HDFS/Hive into Teradata using the batch.insert method.
Is there a way to set an errorlimit property for the load so that the job will not fail when the errorlimit is set to X and X-1 or fewer records fail to load?
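
For context, a minimal sketch of the kind of batch.insert export described, with hypothetical connection and table names; whether an error limit can be applied to this method is exactly the open question, so the sketch only shows the basic job:

    # Hypothetical example only: connection details, databases and tables are placeholders,
    # and the jar path / tool class name depend on the TDCH version installed.
    hadoop jar /path/to/teradata-connector.jar \
        com.teradata.connector.common.tool.ConnectorExportTool \
        -url jdbc:teradata://td-host/database=target_db \
        -username td_user -password td_pass \
        -jobtype hive \
        -sourcedatabase default -sourcetable staging_table \
        -targettable target_table \
        -method batch.insert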

Using TDCH, what is the best way to import multiple tables into Hive from Teradata? Is there an option to move an entire database from Teradata to Hive?
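
As far as I know a TDCH job is defined per table, so one common workaround, sketched here with hypothetical names, is a small wrapper script that runs one import job per table:

    # Hypothetical wrapper: one ConnectorImportTool job per Teradata table, landing in Hive.
    # Table list, connection details and the jar path are placeholders.
    for TABLE in customers orders order_items; do
        hadoop jar /path/to/teradata-connector.jar \
            com.teradata.connector.common.tool.ConnectorImportTool \
            -url jdbc:teradata://td-host/database=source_db \
            -username td_user -password td_pass \
            -jobtype hive \
            -sourcetable "$TABLE" \
            -targetdatabase hive_target_db -targettable "$TABLE" \
            -nummappers 4
    done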

Hi All,

We are trying to read data from a Hive table and, after a lot of transformations, load it into a Teradata table. So we are trying to use "load_from_hcatalog". But while selecting the data, we get the error below:

 

I have a Hive table that is too big for my Teradata database, but if I can exclude some of the columns, it should be OK. Since I don't want to keep a duplicate copy of the table with fewer columns on my Hadoop server, I have two choices for the import: 1) use a view to filter out the columns, or 2) use a query to filter out the columns.
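
A possible third route, hedged because I have not verified it against this exact setup, is TDCH's -sourcefieldnames/-targetfieldnames parameters, which are meant to project a subset of columns without materializing a second copy of the table; a sketch with hypothetical names:

    # Hypothetical example only: export just three columns of a wide Hive table to Teradata.
    # Connection details, databases, tables and column names are placeholders.
    hadoop jar /path/to/teradata-connector.jar \
        com.teradata.connector.common.tool.ConnectorExportTool \
        -url jdbc:teradata://td-host/database=target_db \
        -username td_user -password td_pass \
        -jobtype hive \
        -sourcedatabase default -sourcetable wide_table \
        -sourcefieldnames "id,event_ts,status" \
        -targettable narrow_table \
        -targetfieldnames "id,event_ts,status" \
        -method batch.insert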

Hi all, 
I am encountering the following error when trying to use TDCH to load data into Teradata.

When exporting 2 billion+ records from Hadoop into Teradata using TDCH (Teradata Connector for Hadoop) with the command below and the "batch.insert" method,

I'm using TDCH to export Hive data into a Teradata table. For this I need to specify the number of mappers for my TDCH job. So my question is: is the number of mappers we give to the TDCH job just a hint to TDCH, or will the number of mappers created by TDCH always equal the number given in the option?
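
For context, a short sketch of where the option goes, again with hypothetical names; whether it acts as a hint or an exact count is the question being asked, not something the sketch settles:

    # Hypothetical example only: request 16 mappers for a Hive -> Teradata export.
    hadoop jar /path/to/teradata-connector.jar \
        com.teradata.connector.common.tool.ConnectorExportTool \
        -url jdbc:teradata://td-host/database=target_db \
        -username td_user -password td_pass \
        -jobtype hive \
        -sourcetable my_hive_table -targettable my_td_table \
        -method batch.insert \
        -nummappers 16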

Hello All,
 
I am currently in the process of developing a complete end-to-end solution that will migrate objects/data from Teradata to Hadoop and vice versa.
 
While researching, I found the articles on the Teradata Connectors for Hadoop (Command Line Interface) and their respective tutorial documents.
 
 

Good Evening,
 

Hi,
I have a couple of questions regarding the Hadoop connectors:

  • What connector do I need to integrate Hadoop and Teradata using an ETL tool like Talend or Informatica PowerCenter?
  • Can I use all three connectors side by side and deploy them in a sandbox environment?

 Regards,
Joseph