100 - 150 of 216 tags for tpt


Is it possible to combine two heterogeneous sources and load them into one particular target? If so, how, using TPT?
Ex: I'm combining data from Oracle and DB2 and loading into one Teradata table.

We are now using TPT and we have a requirement to load multiple files at the same time. Initially we used the wildcard character (*) in the file name, but we ran into its limitations. Is there any way for TPT to load data files whose names match a regular expression, such as ABC_AAA_[0-9].dat or ABC_AAA_BBB_[0-9][0-9].dat?
Here is the example case:
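Since the DataConnector's FileName attribute accepts only simple wildcards rather than regular expressions, one possible workaround (a sketch in a wrapper script; the file names below are hypothetical) is to expand the pattern outside TPT and hand TPT an explicit list of files:

```python
import re

def match_data_files(names, patterns):
    """Return only the names that fully match one of the regex patterns."""
    compiled = [re.compile(p) for p in patterns]
    return [n for n in names if any(c.fullmatch(n) for c in compiled)]

# Patterns from the question; the file names below are hypothetical.
patterns = [r"ABC_AAA_[0-9]\.dat", r"ABC_AAA_BBB_[0-9][0-9]\.dat"]
names = ["ABC_AAA_1.dat", "ABC_AAA_BBB_12.dat", "ABC_AAA_XYZ.dat"]

matched = match_data_files(names, patterns)
print(matched)  # ['ABC_AAA_1.dat', 'ABC_AAA_BBB_12.dat']
```

The matched names could then be written one per line to a list file and handed to the DataConnector via its file-list feature (FileName pointing at the list file, with FileList='Y'); check that attribute against your TPT version's documentation before relying on it.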

Teradata Parallel Transporter, a high-performance parallel and scalable extract and load utility for Teradata, allows users to launch ETL processes that interact with various sources and targets by creating and submitting TPT job scripts.

Hi everyone.  I'm trying to load SQL Server tables into Teradata 14 using OLE DB via Teradata's OleLoad tool.  I'm having trouble with attributes defined as VARCHAR(MAX) in SQL Server - it seems that this is a LOB data type.  Here is the script that OleLoad is generating:

Greetings Experts,

What is the basic difference between instances and sessions in TPT?

If I declare a MaxSessions attribute of 10 for an operator on a 10-AMP system, and the consumer operator uses all 10 sessions (as the producer is able to keep the consumer busy)

I am looking to upgrade our current version of the Teradata TPT API from 13.00 to 13.10. Can anyone tell me if these downloads are available?
TPT API 13.10 X8664
TPT Export Operator 13.10 X8664
TPT Load Operator 13.10 X8664
TPT Stream Operator 13.10 X8664
TPT Update Operator 13.10 X8664
Thanks in advance.

Hello everyone,

I'm trying to transfer data from a MySQL table into a Teradata table. The export uses the ODBC operator and the import uses the LOAD operator.
All records go into the ET table, usually with 2673 (source parcel length incorrect) errors; some of them fail with a 6760 (invalid timestamp field).

I am trying to load data from Hadoop to Teradata using TPT.
The delimiter in these files is the default Hadoop delimiter (Ctrl-A, hex value 01).
Does TPT support this delimiter, and how do I specify it?
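The DataConnector operator can take the delimiter as a hexadecimal value, which handles non-printable characters like Ctrl-A. A sketch (operator, schema, and file names are hypothetical; verify the TextDelimiterHEX attribute against your TPT version's documentation):

```
DEFINE OPERATOR FILE_READER
TYPE DATACONNECTOR PRODUCER
SCHEMA SOURCE_SCHEMA
ATTRIBUTES
(
    VARCHAR FileName = 'part-00000',
    VARCHAR Format = 'Delimited',
    /* Ctrl-A (0x01) specified as a hex value */
    VARCHAR TextDelimiterHEX = '01'
);
```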

The article "Teradata Parallel Transporter Supports Quoted VARTEXT in TTU14.00" has a passing comment that it is possible to change TPT's handling of zero-length strings:
"The user can specify how empty data values are to be handled:

Hello Everyone,
I've been learning TPT and have been given various scenarios to execute, specifically ones that deal with the FastLoad protocol. There is one scenario that I am having trouble figuring out, which is as follows:
TARGET TABLE has N columns.

Is there a way, within a TPT module, to accept OS environment variables?
I am aware of the global variable file (good for all modules) and the local variable file (good for a particular module), but I need to get variables passed in for a specific execution of a module (i.e., the parameter values will change with each execution).
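One approach, assuming a shell wrapper is acceptable: tbuild's -u option accepts job-variable assignments on the command line, so an OS environment variable can be forwarded at submit time (the file and variable names here are hypothetical):

```
# Shell: forward the environment variable SRC_FILE as the job variable SourceFile
tbuild -f load_job.tpt -u "SourceFile='${SRC_FILE}'"

# Inside load_job.tpt, reference the job variable with the @ prefix, e.g.:
#     VARCHAR FileName = @SourceFile
```

Command-line values passed via -u override both the global and local variable files, so this fits the per-execution requirement.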

I am trying to load a file into a table using the TPT Load operator. I am using TPT version … I am facing the following problem while loading the data: my target table has a column TRANS_NBR INTEGER NOT NULL, and the file has data like 817911111111111 for this column.

I have a requirement where, for one of our clients, I need to incorporate the filename into a Teradata table. The client has a Teradata 13 machine. We have a fixed-width source flat file and we are using a TPT script in "TEXT" format.


I am loading a Teradata (V13.10.03.08) database using Informatica v9.1.0 (with Hotfix 1). In Informatica, I am using a TPT connection.
I have the following properties on the TPT connection string used in Informatica:
Tenacity 4, Max/Min Sessions 10/10, Sleep 6, Block Size 64000, System Operator: Export.

Hi All,
   I have a TPT script for extracting data from a table and writing it to a file using a SELECT operator; this works perfectly for me.
Now I have this entire script stored in an Oracle column with a CLOB datatype.

Is it possible to load an IBM EBCDIC file using TPT? There is an EBCDIC file on the mainframe, and we are trying to load it into Teradata, which is installed on a Linux server. Is it possible to use TPT to connect to the mainframe and load the file directly?

Hello Teradata experts,
I am getting EXPORT_OPERATOR: TPT12108: Output Schema does not match data from SELECT statement
with the following TPT script. Could anyone hint at what I am doing wrong, please?
I am using UTF8, as we have Unicode columns as well.

For months, I've successfully been using TPT/tbuild, executed via a DOS batch script.  Now IT gave me a fresh, newly installed PC, and toward the end of the tbuild run I am getting:
unhandled win32 exception occurred in executor.exe.

We are using Informatica PowerExchange for TPT. When we tried to load a table with 140 columns and a row length of 55K, the TPT connection threw an invalid-row-length error. However, the same works fine with a relational connection.

Can someone please explain the usage and the proper syntax of the DEFINE SCHEMA 'name' DELIMITED 'tablename'; introduced in the TPT 14.00 reference manual (diagram on page 62 / Nov 2011)?
I assume it generates the proper schema based on an existing table, but I could not get the syntax to work. For example:


Hi All,


I have to write a TPT script to move data from a Teradata table to a flat file on Unix. I know we should use an Export operator as producer and a DataConnector as consumer, but I have a few things to clarify about whether the following is possible when writing TPT on Unix.

Hi everybody,
I have to load a file that contains 2 or 3 columns (there is no field delimiter).
Into the target table I want to load only 2 columns; the 3rd column (if it exists in the file) has to be ignored.
FileReader: VARCHAR Format='Text'
        DESCRIPTION 'Desc1'

I am getting the below error while trying to connect to an Oracle DB using DataDirect 7.1 ODBC drivers from a TPT script. I am able to test the connection using the example executable program in DataDirect, and it works fine.
Teradata Version:  13.10
DataDirect Driver: 7.1 (64 bit trial for testing purpose)
Below is the Log.

Hello.  I need to export large numbers of tables to CSV files for customer deliverables.  I am working under Teradata 13.10.  For VARCHAR fields, I need to include a character string delimiter (double quote) in the CSV.  I have created a tbuild/TPT export script as described in Example 10 of the TPT User Guide, but can't find a way to get the string delimiter

This article describes how to preserve data integrity when a Teradata Parallel Transporter (Teradata PT) export to stream job restarts.

I have a TPT restart question.  After the TPT job failed, the developer cleared the log table/error tables (except the checkpoint files on Linux).

Teradata Parallel Transporter is the premiere client load/unload tool, providing parallel extraction and loading of critical data.


My TPT script gets data from some pipes and loads it using the UPDATE operator, and for a few weeks everything worked well.

But for the last 3 days I have been getting this error in the middle of the load process:

I'm trying to install Teradata 14 on a Windows 2003 x64 server manually.
I've installed the core modules without problems, but when I install TPT, it gives me the error: 'java' is not recognized as an internal or external command, operable program or batch file.

Hi Teradata gurus,
I have developed a process in Java that reads some files, adds some fields, and loads the data into some pipes for my TPT script.
In my tests I'm loading 6 pipes with 6 separate threads.
To load this I have these producer operators:

I am trying to export data using the TPT Export operator. It succeeds if I don't use a QUALIFY clause in the SQL, but fails if I use one.
Can we use QUALIFY (e.g., QUALIFY ROW_NUMBER() OVER (PARTITION BY col1 ORDER BY col2, col3) = 1) in TPT export SQL?

I'm creating a Notify Exit routine for an Update operator in TPT.  Part of my routine prints on screen the name and number of the table I'm loading/updating:
printf("++++\t\tTable Name  : %1.*s\n",

Does anyone have a template script for TPT that extracts data from Sybase?

I have a problem with TPT, and have never gotten past this error.

I'm using TPT with the LOAD_OPERATOR to load a flat file (Format='Text') into an empty table.  I'm receiving the following error:

Dear All,

I'm new to Teradata; currently I'm facing the error

TPT10507: CLI Error 215: MTDP: EM_CONNECT(215): Error found in local machine during connect.

while using TPT with Data Connector and Update Operator.

Any suggestions as to what kind of error this is?

I learned that I can use the TPT ODBC operator to select data from another ODBC database and then load that data into a Teradata database using the LOAD operator.

Hi there,

Can TPT be used to export ASCII data as UTF-8? Meaning, the data stored in the Teradata table is ASCII, but we want a flat-file export that is in UTF-8 format. Is this possible?

As far as I have seen in the TPT User Guide, it appears that the character sets of the data and the export file should be the same.
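One thing worth trying, sketched under the assumption that the session character set controls the encoding of the exported file: since ASCII is a byte-for-byte subset of UTF-8, running the export job with a UTF8 session character set should produce a file that is valid UTF-8 (the job name below is hypothetical):

```
USING CHARACTER SET UTF8
DEFINE JOB export_ascii_as_utf8
(
    /* Export operator as producer, DataConnector writer as consumer, as usual */
    ...
);
```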

Hi everyone,
I would like to know if it's possible, using some system table (such as dbc.dbqlogtbl or similar), to get the number of sessions opened by a MultiLoad/FastLoad/FastExport/TPT job.
In dbqlogtbl it seems that Teradata records only the parent session, which creates the child sessions...

Hi everybody,

In the current implementation of our load scripts with MultiLoad and TPump, we use the system variables holding the result counts (&sysinscnt, &sysupdcnt, &sysetcnt, &sysuvcnt) for logging and further processing.

Hi everybody,

I'm currently evaluating the use of TPT for load jobs in a DWH environment. One of the basic problems is that we often have a random number of blanks at the end of the data rows in the input files. This can't easily be changed because the files are delivered from several different systems.

Hi Teradata Masters,

I'm trying to provide data to be consumed by a Load operator in my TPT script through a named pipe.
My DataConnector producer (everything runs locally on Windows):

    DESCRIPTION 'Defines file-reading options'

Hi ,

I have a simple mapping where I take data from an Oracle database, use an Expression transformation to add certain parameter values, and then push the data to a Teradata box.

Now, it runs fine when I use a relational connection; however, when I use a TPT connection it shows all rows being processed but nothing gets inserted into the table.


- With the 32-bit TPTAPI libraries (TelApi 13.10) I could connect to and fetch data from a Teradata database (13.0).