
I need to export a few Chinese and Japanese characters stored in Teradata to a file.


I have tried different options to export the data through TPT, but I am either not able to see the Unicode characters in the generated file or I run into the conflicting-data-length error.
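A sketch of the approach that usually resolves both symptoms, assuming the standard $EXPORT and $FILE_WRITER operator templates (the system, table, and file names here are placeholders): declare the session character set at the top of the job script, and if you hand-write a DEFINE SCHEMA, size the character columns in bytes for that character set (up to 3 bytes per character under UTF8, exactly 2 under UTF16), which is also what the conflicting-data-length message is complaining about.

```
USING CHARACTER SET UTF8
DEFINE JOB export_unicode
(
  APPLY TO OPERATOR ($FILE_WRITER ATTRIBUTES (FileName = 'unicode_out.txt', Format = 'Delimited'))
  SELECT * FROM OPERATOR ($EXPORT ATTRIBUTES (SelectStmt = 'SELECT name_col FROM mydb.mytab;'));
);
```

With the templates, TPT derives the schema itself; a hand-written schema for a UNICODE VARCHAR(100) column would need VARCHAR(300) under UTF8.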


I am executing my TPT export, and each time it writes detailed logging to the logs folder below.
This detailed logging causes TPT to fail with an out-of-space error in the logs folder.
Log Folder : 

I am using TPT to export data to a file and move it into S3 buckets. I am curious to know whether there is a way to export the data and write it directly into AWS S3 buckets.
Thanks & Regards,
Srivignesh KN
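For what it's worth: recent TTU releases ship a Teradata Access Module for Amazon S3, which lets the DataConnector write straight to a bucket instead of a local file; older releases need the intermediate file plus a separate upload step. A hedged sketch of the consumer-side attributes (the module name, bucket, object, and init-string keywords here should all be checked against your TTU documentation, and the bucket/object values are hypothetical):

```
VARCHAR AccessModuleName = 'libs3axsmod.so',
VARCHAR AccessModuleInitStr = 'S3Bucket=my-bucket S3Object=export.csv S3Region=us-east-1'
```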

Using Teradata Parallel Transporter with simplified syntax, we get this error:

TPT02638: Error: Conflicting data length for column(3) - "sr_abo_netto_do_oferty". Source column's data length (16) Target column's data length (8).


Source table:

I am having difficulty using TPT to bulk-load dates into a table. My DEFINE SCHEMA looks like:

Error : TPT_INFRA: TPT02638: Error: Conflicting data length for column(5) - STATE. Source column's data length (16) Target column's data length (8).

EXPORT_OPERATOR: TPT12108: Output Schema does not match data from SELECT statement


Code : 

Hi All,
Can anyone let me know whether it is possible to perform a lookup in TPT? I have around 100K records in a file, and I want to load them into a table by looking them up against another table (joining on a column shared by the file and the table). Thanks in advance!

Any assistance would be welcomed. I cannot seem to put the interval in a format that Teradata likes. The column data looks like "00 02:00:00.000000". We are extracting it from Oracle and loading into Teradata version 15.10. I am able to load most other data types without a problem, but I cannot find the format needed to make this work.
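One approach that generally works for INTERVAL DAY TO SECOND targets: declare the field as VARCHAR in the TPT schema and convert it in the APPLY statement, since a string in 'DD HH:MM:SS.FFFFFF' form (like "00 02:00:00.000000") is a valid cast source in Teradata. A sketch with hypothetical table and column names, using the SQL Inserter template (the bulk-load protocols are pickier about expressions, so verify the cast is accepted if you switch operators):

```
APPLY ('INSERT INTO tgt_db.tgt_tab (id, duration)
        VALUES (:id, CAST(:duration AS INTERVAL DAY(2) TO SECOND(6)));')
TO OPERATOR ($INSERTER);
```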

I would like to know whether there is a way to disable TPT restartability: each time TPT executes, I would like it to run from the beginning rather than restarting from the last checkpoint.
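TPT decides to restart by finding checkpoint files left over from the previous run, so the usual way to force a from-scratch run is to remove them before each execution. A sketch, assuming the job is started with an explicit job name; twbrmcp is the checkpoint-cleanup utility shipped with TPT, but verify the exact usage for your TTU version:

```
tbuild -f myjob.tpt MYJOB
# remove MYJOB's checkpoint files so the next run starts from the beginning
twbrmcp MYJOB
```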

If the TPT attributes are INTEGER SkipRows = 1 and VARCHAR SkipRowsEveryFile = 'N' (and there is more than one file), how will the loader know which file the row is to be skipped from? How will the loader behave in this scenario?

Any help will be much appreciated.  Thank you.
Teradata Parallel Transporter Version
Job log: /apps/tpt/Datos/TDA_DESA/Tptlogs/root-924.out
Job id is root-924, running on TDExpress1403_Sles10
Teradata Parallel Transporter Update Operator Version

HI All,
I don't know where to look for this error in my script. Please help.

Hi All,

I created an ODBC connection entry in the ODBCINI file defined in .profile. The OS is RHEL (Red Hat Enterprise Linux Server release 5.9 (Tikanga)). The TPT version is

Hi All,
Is there any way to pass the ASCII value of a character in TextDelimiter for the TPT Export operator? I want to export a tab-delimited file, but when I give a tab as the separator, it takes a space as the delimiter. If we could give the ASCII code instead of the actual character, it might produce the export file with a tab as the delimiter.
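Recent TPT releases accept either the literal word 'TAB' or a hex code for the delimiter, so you do not have to embed a real tab character in the script. A sketch of the DataConnector attributes (check which of the two forms your TTU version supports):

```
VARCHAR Format = 'Delimited',
VARCHAR TextDelimiter = 'TAB'       /* special value meaning a single tab */
/* or, on versions with hex support: */
VARCHAR TextDelimiterHEX = '09'
```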

Hi Friends,
I am new to this forum.
I have two queries. I kindly request your help.

Hi everyone, I'm trying to load some data into a Teradata table using TPT.
I am having trouble with a column that uses the TIMESTAMP(0) format; its mask is 'DD/MM/YYYY HH:MI:SS' or 'DD/MM/YYYYBHH:MI:SS', because the data we are being provided uses that format (for example: 21/12/2015 16:49:24). The data comes in a ';'-delimited file.
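One way that usually works: read the field as VARCHAR in the TPT schema and give the target column a matching FORMAT in its DDL, so Teradata performs the character-to-timestamp conversion on insert (B in a Teradata format string stands for a blank). A hypothetical DDL sketch:

```
CREATE TABLE mydb.mytab
(
  id INTEGER,
  created_at TIMESTAMP(0) FORMAT 'DD/MM/YYYYBHH:MI:SS'
);
```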

I have installed FastExport but apparently it cannot output CSV files.
So now I'm trying to get ahold of Teradata parallel transporter, to export a large table (hundreds of millions of rows).
I've read this post and the quickstart guide for TPT, but I still don't know how to export a CSV from a table.
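A minimal export-to-delimited-file sketch using the standard $EXPORT and $FILE_WRITER operator templates (the system credentials, table, and file names are placeholders; in practice you would put the logon values in a job variables file passed with -v rather than inline):

```
DEFINE JOB export_csv
(
  APPLY TO OPERATOR ($FILE_WRITER ATTRIBUTES
                     (FileName = 'big_table.csv', Format = 'Delimited', TextDelimiter = ','))
  SELECT * FROM OPERATOR ($EXPORT ATTRIBUTES
                          (SelectStmt = 'SELECT * FROM mydb.big_table;'));
);
```

Note that Format = 'Delimited' writes plain delimited text, not RFC-style CSV with quoting; if the data can contain commas, pick a delimiter that cannot occur in it.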

Hi , 
Can anyone explain whether the TPT Stream operator (TPump) can be used to load data from a file to a table and from a table to a file? If it is possible, could you kindly provide a script for the same.
In my view, the TPT Stream operator is best suited for mini-batch loads. Could you confirm this as well, please?
Thank you,

DESCRIPTION 'Load a Teradata table from a file'
DEFINE SCHEMA Trans_n_Accts_Schema
(
  Account_Number VARCHAR(50),
  Trans_Number   VARCHAR(50),
  Trans_Date     VARCHAR(50),
  Trans_ID       VARCHAR(50),
  Trans_Amount   VARCHAR(50)
);

I tried exporting the data using TPT in binary mode and it worked; the file was created.
But when I examine the exported data file, I am not able to make sense of it.
Is it a length-prefixed file (LPF)?
How do I interpret the data, i.e., what are the starting and ending points of a record?
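For what it's worth, the file TPT writes with Format = 'Binary' is length-prefixed: each record is a 2-byte unsigned integer giving the data length (client byte order, so little-endian on Intel hardware), followed by that many bytes of row data (indicator bytes, if any, come first, then the fields in schema order). A small sketch for splitting such a file into raw records, assuming little-endian prefixes:

```python
import struct

def split_binary_export(data):
    """Split a TPT Format='Binary' export (as bytes) into raw records.

    Each record is assumed to be a 2-byte little-endian unsigned
    length prefix followed by that many bytes of row data.
    """
    records = []
    pos = 0
    while pos + 2 <= len(data):
        # read the 2-byte length prefix
        (length,) = struct.unpack_from('<H', data, pos)
        pos += 2
        # slice out exactly that many bytes of row data
        records.append(data[pos:pos + length])
        pos += length
    return records
```

Counting the records this yields, and eyeballing a few of them in a hex dump, is a cheap way to validate the file without loading it into a table.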

Hello All,
I have exported a table to a binary file through TPT in binary mode.
I want to understand the exported file's format.
Is it a length-prefixed file, and if not, how can I read it?
What is the file format when we export data in binary mode using TPT?
Please help.

I have exported data using TPT with the format set to binary, and the file was generated. How can I validate my data without loading it into a table? Is there any way to interpret the Teradata binary data file?

Recently I noticed that TPT is not handling multi-line text-box fields. I am loading data from Oracle to Teradata using TPT. One of the fields has data like the below (multi-line values for a single record):
thanks for your feedback

Hi all, this is my first time here, and I hope someone can help me.
I need to know whether it is possible to create multiple output files using TPT, where every output file name is derived from the value of a table field.
I have a table with hundreds of millions of rows and two fields: Filename and Type.

Running Teradata Parallel Transporter :  Load Operator and ODBC Operator Version
I'm attempting to load several tables from an Oracle database with the ODBC and Load operators (streaming data with "Job is running in Buffer Mode").
Several tables work great without specifying a BufferSize for the Load Operator:

Hi All,
I have a TPT script with only an Update operator and two declared error tables (errortable1, errortable2), but only errortable2 is generated during the run. Why is the first error table not generated?

Hi All,
I am facing the TPT error below. I am not sure where to correct the code.


**** 18:42:33 TPT10508: RDBMS error 3707: Syntax error, expected something like a name or a Unicode delimited identifier or an 'UDFCALLNAME' keyword between the 'USING' keyword and the 'DELETE' keyword.

I am trying to load a file that has 2181224338 records into a Teradata table using a TPT script. The DataConnector attributes are as below, and I am using the LOAD operator. The version of TTU is






I am getting the below error if I specify -L parameter (log path) in tbuild command:
TPT_INFRA: TPT04106: Error: Conflicting job names have been specified
'tbuild' command argument <job name>: 'WAQ_JOB' and 'C:/Program Files/Teradata/Client/15.00/Teradata Parallel Transporter/logs'.
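TPT04106 here suggests tbuild parsed the log path as a second job name: either the path (which contains a space in 'Program Files') lost its quoting on the command line, or the installed tbuild version does not recognize -L and treated the path as a positional argument. A sketch of the form to try, with the job name last and the path quoted (check `tbuild` usage output to confirm -L is supported on your release):

```
tbuild -f myjob.tpt -L "C:/Program Files/Teradata/Client/15.00/Teradata Parallel Transporter/logs" WAQ_JOB
```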

We are using MERGE INTO in a TPT DDL operator to load data into the destination tables.
Our problem is how to report the number of rows inserted/updated by the MERGE INTO statement, because we can't use NOTIFY in the DDL operator (or, if it is possible, we don't know how).
Please if you have done this before, can you share an example?

Hi ,
I am deleting 80 million records from a table; TPT is taking 12 hours to complete, whereas BTEQ deletes the data in just 2 hours. TPT should actually be faster than BTEQ. Please suggest how I can avoid this long run through TPT.

I am receiving a delimited file with 20 fields; the problem is that after the last field there are blanks padding out the rest of the record. It is some kind of mix between delimited and fixed-length.

How can I use a different target database (not TdpId) than what I declare in WorkingDatabase?  I want the log and error tables to go to a work database, but the target table is in a different database.
For example:
Work.ExampleTable_ET = error table in Work database
Person.ExampleTable = target table in Person database
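The Load operator lets you qualify each work table independently of WorkingDatabase, so one common pattern is to fully qualify the target table and point the log/error tables at the work database explicitly. A sketch of the relevant attributes (table names follow the example above; attribute names are per the Load operator, so adjust if you use Update/Stream):

```
VARCHAR TargetTable = 'Person.ExampleTable',
VARCHAR LogTable    = 'Work.ExampleTable_LG',
VARCHAR ErrorTable1 = 'Work.ExampleTable_ET',
VARCHAR ErrorTable2 = 'Work.ExampleTable_UV'
```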

Teradata Parallel Transporter (TPT) provides a comprehensive set of features for loading data warehouses.

Hello All,
Since FastLoad only loads from flat files, can we use tables as the source with the TPT LOAD operator here?


Hi Folks,
The Teradata TPT API could be very useful in our department for tailoring the operation of TPT. I tried building the getbuffer 64-bit sample program with Visual Studio 2010 and I'm getting 23 errors, all similar to the following. I know my way around C++, but I don't understand anything about dllimport/dllexport.

I have been unable to read JSON data from a flat file into Teradata v15. Here is my setup.



Sorry, I had posted this earlier, but I accidentally unpublished it while looking at it.


Hi All,
I have some questions on TPT usage with Informatica. We have some tables (with billions of records) in Oracle, and we have to load them into non-empty Teradata tables in parallel.
My manager suggested using TPT. Will that be advantageous? What are the benefits of using TPT instead of traditional MultiLoad?

Hi All,
Can we export multiple SELECT statements into respective files using TPT? If yes, could you please provide sample code.
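One way is to give the job several steps, each pairing an $EXPORT with its own SelectStmt and a $FILE_WRITER with its own FileName; the steps run in sequence. A sketch with hypothetical statements and file names:

```
DEFINE JOB multi_export
(
  STEP export_customers
  (
    APPLY TO OPERATOR ($FILE_WRITER ATTRIBUTES (FileName = 'customers.txt'))
    SELECT * FROM OPERATOR ($EXPORT ATTRIBUTES (SelectStmt = 'SELECT * FROM mydb.customers;'));
  );
  STEP export_orders
  (
    APPLY TO OPERATOR ($FILE_WRITER ATTRIBUTES (FileName = 'orders.txt'))
    SELECT * FROM OPERATOR ($EXPORT ATTRIBUTES (SelectStmt = 'SELECT * FROM mydb.orders;'));
  );
);
```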

I am new to Teradata. I am trying to load data from a CSV file into Teradata 13.0 on VMware using TPT, where the columns are enclosed in double quotes (") and delimited with a pipe (|).
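The DataConnector can strip the quotes for you on newer TPT releases via the quoted-data attributes. A sketch of the producer-side settings (check whether these attributes exist on the TTU version paired with your 13.0 system; on older releases the usual workaround is to load the quotes as data and trim them in SQL afterwards):

```
VARCHAR Format = 'Delimited',
VARCHAR TextDelimiter = '|',
VARCHAR QuotedData = 'Optional',
VARCHAR OpenQuoteMark = '"',
VARCHAR CloseQuoteMark = '"'
```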


We are trying our first job with TPT and are having problems connecting to our source database. We are receiving this error:
TPT17101: Fatal error received from ODBC driver:
MSG='[DataDirect][ODBC lib] Data source name not found and no default driver specified'

I'm trying to run a TPT job using Windows PowerShell but I'm getting this error:

Good Evening,

Hi All -
I have worked with the MLoad, FLoad, and FastExport utilities, and I am a bit new to TPT. I am trying to unload data from one server to another server using TPT, but I am getting an error, shown below:
$ tbuild -f MOVE_DATA.txt -v jobvars.txt
Teradata Parallel Transporter Version

I'm just trying to confirm that, in order to get Windows Authentication to work, you exclude the UserName and UserPassword attributes and let the DSN handle it. Is that correct?
If that is correct, is the driver I'm using not compatible with the ODBC operator?