0 - 20 of 20 tags for #TPT

We want to offload LOB objects from Teradata tables where the size is greater than 64K. I am trying to evaluate all the options at our disposal here.

Hi there,
I'm new to TPT and have got it working. I want to use job variable files to hold things such as the username and password, and have got this working with the main script by using tbuild -f and -v.
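For reference, a job variable file is just a comma-separated list of name = 'value' assignments, and the names are referenced in the script as @name. A minimal sketch (the system, user, and table names here are placeholders):

```
/* jobvars.txt -- all values below are placeholders */
TdpId         = 'my_td_system',
UserName      = 'etl_user',
UserPassword  = 'etl_password',
TargetTable   = 'mydb.target_tbl'
```

The file is then picked up with tbuild -f myjob.tpt -v jobvars.txt, and the script refers to @TdpId, @UserName, and so on.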


Dear All
Our reporting tool generates the following SQL query to be executed on TD 15.10:

I have a quick clarification on the TPT Architecture.
I'm executing a TPT script on my local PC which uses the ODBC operator to pull data from a non-Teradata data source. I'm then loading the data into Teradata using the Load operator.
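That producer/consumer split shows up directly in the APPLY statement; a sketch, assuming operators named ODBC_OP and LOAD_OP and a hypothetical two-column target table:

```
APPLY
  ('INSERT INTO mydb.target_tbl (col1, col2) VALUES (:col1, :col2);')
TO OPERATOR (LOAD_OP[1])        /* consumer: Load operator */
SELECT * FROM OPERATOR (ODBC_OP[1]);  /* producer: ODBC operator */
```

The bracketed counts are the number of operator instances; the data flows from the producer's SELECT into the consumer's INSERT.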

My main objective is to transfer data from Teradata to Hadoop. TPT is the fastest way of doing this (even though I am unsure whether to go with TDCH or TPT). From TPT 15.0 onwards, direct export to HDFS is supported, as I read in the documentation and on one of the forums.
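From TPT 15.0 the DataConnector operator can reportedly reach HDFS directly; a rough sketch of a consumer definition (the HadoopHost value and paths are placeholders, and the exact attribute set should be checked against your TPT version's documentation):

```
DEFINE OPERATOR HDFS_WRITER
TYPE DATACONNECTOR CONSUMER
SCHEMA *
ATTRIBUTES
(
  VARCHAR HadoopHost    = 'namenode:8020',      /* placeholder */
  VARCHAR DirectoryPath = '/user/etl/export',   /* placeholder */
  VARCHAR FileName      = 'export.dat',
  VARCHAR Format        = 'Delimited',
  VARCHAR TextDelimiter = '|'
);
```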

I have been frequently seeing this error in my jobs failure output:
"Type:1 (0= TPT err, 1= DBS err, 2=CLI err)
(While doing TPT API initiate)"
I looked at the component log file and I could not find any information which would give me a lead.

Masters: I want to know how others are extracting TPT statistics and what the best way is to get this info. I plan to capture these and move them into an audit table. I know there is &ACTIVITYCOUNT etc. in BTEQ or MultiLoad, but I want to know the easy ways when it comes to TPT.

We ran a job using TPT 14.10 and found the following information in the logs:
Data Block size:        1992440
Buffers/Block:            31
Data Buffer size:        64272
Total available memory:        10000000
Largest allocable area:        10000000

We are facing the following issue in configuring the TPT API library with Informatica 9.6.1.
We are using a different user for the Informatica services instead of root.
Please see the output below.
etluat:/home/infauat > echo $TWB_ROOT                                 

We just upgraded our TD version from TD 12, and we are in the process of replacing all the FastLoad scripts with TPT Load.
Earlier, with FastLoad, it used to load the good records and move the bad records to the ET table, and the job would then fail.

Does the TPT SQL Inserter operator support an error limit attribute? Looking at the manual, it seems it doesn't, but the Stream operator does seem to support it. Currently we are looking for alternatives to the Load operator for low-volume imports, but our alternative solution must support error-limit and skip-row features.
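For comparison, the Stream operator's error limit is just an operator attribute; a minimal sketch of a definition, with placeholder log/error table names:

```
DEFINE OPERATOR STREAM_OP
TYPE STREAM
SCHEMA *
ATTRIBUTES
(
  VARCHAR TdpId        = @TdpId,
  VARCHAR UserName     = @UserName,
  VARCHAR UserPassword = @UserPassword,
  VARCHAR LogTable     = 'mydb.target_lg',  /* placeholder */
  VARCHAR ErrorTable   = 'mydb.target_et',  /* placeholder */
  INTEGER ErrorLimit   = 1                  /* stop after the first rejected row */
);
```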

TPT DataConnector producer error : "Delimited Data Parsing error: Input record exceeds allocated storage"
I am using the TPT DataConnector producer and SQL Inserter operator to load CLOB data from a tab-delimited file. The maximum length of the text I am loading into the CLOB is approximately 90000.
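That parsing error usually means an input field is longer than the storage the schema declares for it; a sketch of a schema where the text column's declared length covers the ~90000-character maximum (the column names are placeholders):

```
DEFINE SCHEMA CLOB_SCHEMA
(
  ROW_ID   VARCHAR(20),    /* placeholder key column */
  DOC_TEXT CLOB(100000)    /* must be >= the longest input value */
);
```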

Hi All,
I am loading varchar(max) / nvarchar(max) data from SQL Server to Teradata using TPT and the DataDirect drivers. The trimmed length of the varchar(max) column is 5356651. I am unable to load this data, as the DataDirect drivers report varchar(-1) instead of varchar(max).
Any idea how I can load this data into Teradata?


When trying to insert into the log table, getting a deadlock on the TPT script

I am getting a deadlock issue when trying to insert into the log table (TABLE_LOG_TABLE_1).
I am not sure what is wrong. Please let me know if you need more information.

Is there any way I can forcefully/manually stop or fail TPT while it is loading the staging tables (while TPT is in the application phase), so that the table being loaded is locked and there is no way to access it?
I need to perform some tests on TPT-locked tables.

Hi guys,
I've no idea what could be the cause of this error. The TD Docs (http://www.info.teradata.com/htmlpubs/DB_TTU_14_10/index.html#page/General_Reference/B035_1096_112K/TPT.41.1267.html) describe the error, but the explanation doesn't say how to solve the problem.

When I used VARDATE in a TPT transformation, everything worked perfectly until the DATE column was left unfilled in the source extract.
The first row containing an empty date field causes this error:
DATACON_O2EV: TPT10672 Error: Data contains a separator character which is unmatched by the formatting string.
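One workaround, if VARDATE insists on matching the format string even for empty fields, is to read the column as plain VARCHAR and do the conversion in the APPLY's SELECT; a sketch, with a placeholder column name:

```
DEFINE SCHEMA SRC_SCHEMA
(
  /* read as VARCHAR so an empty field parses cleanly */
  TXN_DATE VARCHAR(10)
);

/* later, in the APPLY's SELECT list, map '' to NULL, e.g.:
   CASE WHEN TXN_DATE = '' THEN NULL ELSE TXN_DATE END  */
```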

Hello, I am trying to run multiple TPT jobs in parallel (using the Selector operator to select data from a Teradata table that has BLOB and CLOB columns, and the DataConnector operator to write the data to files in deferred mode), and I am getting the following error: