There is now a new command line interface to Teradata Parallel Transporter!

TPT users have been asking for this for a while now, and we have delivered. Here is a brief summary of what it is and how to use it.

The command line name for this feature is "tdload". "tdload" is a front-end "wrapper" to TPT that enables the user to load data from a (comma) delimited flat file into Teradata without the need to write a script!

We use Unix-convention command line options for the user to provide the necessary essentials (i.e. file name, user name, password, target table name, etc.).

The syntax is as follows (for example):

tdload -f filename -u username -p password -h hostname -t tablename

and the command line options are:

-f file name
-u user name
-p password
-h host name (tdpid)
-t target table name
-d delimiter character
-j job variable file

The default delimiter character is the comma (",") but this can be changed easily on the command line.
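
For example, to load a pipe-delimited file instead (the file, credential, and host names here are illustrative; the single quotes keep the Unix shell from interpreting the pipe character):

tdload -f sales.txt -u myuser -p mypassword -h mytdpid -t sales_table -d '|'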

We also provide multi-letter command line options for those who want a little more control over their load job:

--SourceFileName
--SourceTextDelimiter
--TargetUserName
--TargetUserPassword
--TargetTdpId
--TargetAccountId
--TargetTable
--TargetMaxSessions
--TargetMinSessions
--TargetErrorLimit
--TargetWorkingDatabase
--LoadInstances
--FileInstances

Single-letter command line options use a single hyphen; multi-letter command line options use a double hyphen.
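
For example, the job shown earlier could also be written with the multi-letter options, adding a session limit (all values illustrative):

tdload --SourceFileName sales.txt --TargetUserName myuser --TargetUserPassword mypassword --TargetTdpId mytdpid --TargetTable sales_table --TargetMaxSessions 8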

"tdload" also has the intelligence to determine the best loading method. It will first detect whether it is possible to use the Load operator (FastLoad protocol). If not, then it will check whether the Update operator can be used (MultiLoad protocol). If the target table cannot be loaded by either of those operators, then "tdload" will check whether the Stream operator (TPump  protocol) is installed. If so, then the Stream operator will be used. If not, then the load job will use the SQL Inserter operator.

Under the covers, "tdload" will generate the TPT script and then run TPT. When the job is complete, "tdload" will delete the script, so there is no footprint. Since "tdload" is a feature of TPT, the output binary logs will still be generated.

Just like TPT, "tdload" supports the use of a job variable file. The job variable file for "tdload" supports the single-letter command line options and the multi-letter command line options. "tdload" will perform the necessary conversions to job variables and attributes that TPT understands.
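
For example, a job variable file (here called jobvars.txt; all values are illustrative) might contain:

TargetTdpId = 'mytdpid',
TargetUserName = 'myuser',
TargetUserPassword = 'mypassword',
TargetTable = 'sales_table'

and the job would then be run as:

tdload -j jobvars.txt -f sales.txt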

Easy Loader is available as an efix to TPT 13.0. It is only available on the patch server and all 4 packages must be downloaded in order to get this feature, as we had to make changes to the TPT Infrastructure as well as the Load, Update and Stream operators.

Also, there is a README file on the patch server that provides more detailed information.

Discussion
ketfos 5 comments Joined 05/09
14 Jun 2009

Is Teradata Parallel Transporter available in 12.0?

feinholz 76 comments Joined 05/08
15 Jun 2009

TPT has been available since TTU8.0 (actually before that, but we do not support anything prior to TTU8.2 right now).

So, yes TPT is available as part of the TTU 12.0 set of products.

--SteveF

ketfos 5 comments Joined 05/09
30 Jun 2009

What files should I look for to find the Teradata Parallel Transporter utility on my Unix box?
My system admin says he has installed the Teradata Tools and Utilities Rel 12.0 Parallel Transporter for HP-UX.

feinholz 76 comments Joined 05/08
07 Jul 2009

Easy Loader is available in TPT13.0. After installing TPT 13.0, you need to download the latest patches for the infrastructure (tbuild) and the affected operators (pload, pupdate, pstream).

--SteveF

ketfos 5 comments Joined 05/09
10 Jul 2009

Is the command line syntax shown in this article available in Teradata 12.0 on Unix?
If yes, what is the syntax to invoke TPT from the command line in Unix?

The syntax is as follows (for example):

tdload -f filename -u username -p password -h hostname -t tablename

feinholz 76 comments Joined 05/08
30 Jul 2009

Easy Loader is available in 13.0. Invoke this command-line only version with the 'tdload' command.

TPT is already command line driven, but in 12.0 it requires a script, and it is invoked through:

tbuild -f . . . . .
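
For instance, a typical invocation with a job variable file looks something like this (the file names here are illustrative):

tbuild -f myjob.txt -v jobvars.txt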

Please refer to the TPT documentation for all command line options.

--SteveF

saurabhpal 1 comment Joined 01/10
31 Jan 2010

I have the Source table at Oracle and I want to import the data into a Teradata table using TPT. Do I have to export the data from Oracle to a flat file to use the Easy Loader Utility or is it possible to import it directly from the Oracle DB?
I am using Teradata 13.0.

Thanks,
Saurabh

feinholz 76 comments Joined 05/08
01 Feb 2010

Easy Loader does not support the movement of data from Oracle to Teradata yet. However, you can still use script-based TPT to perform that move without landing data to disk.

--SteveF

frankjustme 2 comments Joined 09/09
08 Feb 2010

Tried this today after I patched my TPT up to the latest 13.0 and it worked very nicely. Thanks, nice tip.

Kjell 1 comment Joined 09/09
15 Apr 2010

Is it possible to use an access module together with Easy Loader or do you have to use a standard TPT script to make use of this functionality?

Thanks,

Kjell.

feinholz 76 comments Joined 05/08
15 Apr 2010

Access Modules cannot be used with Easy Loader at this time. You would need to use the standard TPT script for that.

In its first release, Easy Loader only supports delimited data from flat files.

Thanks!

--SteveF

hyperj 11 comments Joined 09/09
27 Apr 2010

Easy Loader is installed on the Teradata Express for VMware Player downloads for anyone who would like to give it a quick spin around the block. It is a great utility and works very well; it saves a lot of time and confusion!

zhenya.li 4 comments Joined 05/10
12 May 2010

Can I use a tab as the delimiter with Easy Loader?

Thanks!

feinholz 76 comments Joined 05/08
12 May 2010

Yes. Specify the option like this:

$ tdload -d TAB . . . .

(no quotes around the word TAB)

--SteveF

zhenya.li 4 comments Joined 05/10
12 May 2010

It works correctly.
Thanks.

Tnewbee 1 comment Joined 05/10
24 Jun 2010

Hi!

I have heard about TPT but never used it. I need to copy 100 prod tables to dev. Can I use TPT for this? If yes, how?

feinholz 76 comments Joined 05/08
28 Jun 2010

Script-based TPT can do this. If you want them moved in parallel you will need to write multiple scripts (one for each table).

If you do not mind them being sequential, you can write a batch script to loop through the table names and run an Export-to-Load with a single TPT script (just changing the table names on each iteration through the loop).

TPT can be used to execute a type of FastExport-to-FastLoad type of job within a single script, without using named pipes or landing the data to files.
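
A minimal sketch of such a batch loop (assuming a Unix shell, a file of table names, and a TPT script that picks up the table name from a job variable via tbuild's -u option; all names here are illustrative):

while read TABLENAME; do
    tbuild -f export_to_load.txt -u "TableName='${TABLENAME}'" "copy_${TABLENAME}"
done < tables.txt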

--SteveF

zhenya.li 4 comments Joined 05/10
02 Jul 2010

When I use tdload to load a file that has only one row in it, I get an EOF error.
The log is as follows. Why did this happen, and how can I resolve this problem?

Teradata Parallel Transporter Version 13.00.00.02
Job log: /opt/teradata/client/13.0/tbuild/logs/NST_APLI2_wk-1744.out
Job id is NST_APLI2_wk-1744, running on tera-etl1
Teradata Parallel Transporter DataConnector Version 13.00.00.02
$FILE_READER Instance 1 directing private log report to 'FileReaderLog-1'.
Teradata Parallel Transporter Load Operator Version 13.00.00.02
$LOAD: private log specified: LoadLog
$FILE_READER DataConnector Producer operator Instances: 1
$FILE_READER Operator instance 1 processing file '/db/td_tmp_test/export_tsv/NST_APLI2.tsv'.
$LOAD: connecting sessions
$LOAD: preparing target table
$LOAD: entering Acquisition Phase
$FILE_READER: TPT19350 I/O error on file '/db/td_tmp_test/export_tsv/NST_APLI2.tsv'.
$FILE_READER: TPT19435 pmRead failed. EOF encountered before end of record (35)
$FILE_READER: TPT19305 Fatal error reading data.
$FILE_READER Total files processed: 0.
$LOAD: disconnecting sessions
$LOAD: Total processor time used = '0.54 Second(s)'
$LOAD: Start : Fri Jul 2 18:28:08 2010
$LOAD: End : Fri Jul 2 18:28:17 2010
Job step MAIN_STEP terminated (status 12)
Job NST_APLI2_wk terminated (status 12)
Teradata Load Utility Version 13.00.00.02

feinholz 76 comments Joined 05/08
02 Jul 2010

There could be a few reasons:

1. the file does not have an end-of-record marker
2. the record does not have enough fields for the schema
3. make sure you have downloaded the latest TPT patches (especially the "tbuild" patch)
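
For reason 1, a quick check on Unix is to dump the last few bytes of the file and look for the trailing newline (the file name is illustrative):

tail -c 16 myfile.txt | od -c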

--SteveF

zhenya.li 4 comments Joined 05/10
02 Jul 2010

Thank you very much.
I checked my file.
It does have a CRLF mark at the end of the 1st line and an EOF mark at the second line.
And if my file has 2 rows in it, tdload worked correctly.
My tbuild version is 13.00.00.02. Is this the latest one?
Where can I download the latest version?

feinholz 76 comments Joined 05/08
02 Jul 2010

13.00.00.02 is not the latest version.
You will probably have to go to Teradata At Your Service to get to the patches.

--SteveF

Rashmi_kp 1 comment Joined 09/10
14 Sep 2010

Hi,
We are using a TPT connection to Informatica. When we try to connect through Teradata native ODBC, it works fine. But when we connect through Teradata TPT, it throws this error: TPTWR_3802 : [ERROR] Type:(Teradata DBS Error), Error: (Database '$$FDS_Schema' does not exist.)

Can anyone please help me to resolve this?

feinholz 76 comments Joined 05/08
14 Sep 2010

I am not sure how this relates to Easy Loader. Easy Loader does not support a connection to Informatica. Nor does it support ODBC connections of any type.

Can you please be more specific as to the problem you are experiencing? The messages you show here are not TPT error messages.

--SteveF

daveemc 3 comments Joined 08/10
09 Dec 2010

you are almost there - 2 enhancements:
1) import to table even if it doesn't exist (ezloader creates it - everything as varchar(1000) if necessary) OR read first row as header; parse field definitions as follows:
mydb.import_data.txt
field1:i|field2:v(20)|field3|field4:n(10,2)
would
create table as mydb.import_data_txt(
field1 integer,
field2 varchar(20),
field3 varchar(1000),
field4 numeric(10,2)) primary index(field1);
then import to table
2) allow the import file to be a named pipe

These are simple enhancements that pay substantial dividends.

Good job on Easy Loader! Kudos!

feinholz 76 comments Joined 05/08
09 Dec 2010

Thank you for the feedback. Definitely some things to think about.

Our next step will be to support the Export-to-Load scenario, where the user will be able to supply a source table (or just a SELECT statement) and a target table, and we will generate the necessary job to get the data copied from table to table.

However, that scenario will also require the target table to be created ahead of time.

There are a lot of challenges involved with leaving the table creation up to the application. I really do not recommend it because there is more to creating a table than just naming fields and data types.

You are also asking us to support a new type of record format and that enhancement is probably not going to be high on our priority list.

As to the named pipe, I find pipes pretty tricky. Named pipes are not supported in the same manner across all platforms (which is why we have a named pipe access module) and synchronization is needed between the creator of the pipe, the reader, the writer, etc.

However, these are good suggestions and we will definitely take them into consideration going forward.

Thanks!

--SteveF

feinholz 76 comments Joined 05/08
17 Dec 2010

By the way, I would like to know whether the following:

1) import to table even if it doesn't exist (ezloader creates it - everything as varchar(1000) if necessary) OR read first row as header; parse field definitions as follows:
mydb.import_data.txt
field1:i|field2:v(20)|field3|field4:n(10,2)

is supported in any product that you know of, and if so, which one(s)?

If so, we would like to explore if there is a standard of some nature in this area.

As to the named pipe, that is not as desirable due to the nature of the synchronization that is needed with named pipes (just look at how user-unfriendly it is to try to use FastExport and FastLoad with named pipes).

We actually have a Named Pipe Access Module and it will one day be available for use with Easy Loader, but right now we are not enabling that functionality.

Thanks!

--SteveF

go161300 11 comments Joined 06/09
04 Jan 2011

It was working on TD 13.0. Good!
Is it possible to specify a different database for the error tables?
Also, when I tried loading the same data file a second time, the MultiLoad protocol was used, as you mentioned.
Just INSERT DML, not UPDATE DML?

feinholz 76 comments Joined 05/08
06 Jan 2011

For right now, EasyLoader is only to be used for loading data, and by "loading" we mean "inserting". There would be no way for EasyLoader to guess what the UPDATE syntax would look like and in its first set of releases, we are not allowing the user to provide the DML themselves.

The idea behind EasyLoader is to make things "easy". If anything beyond "easy" is needed, then the user will write a script.

The enhancements being planned for EasyLoader involve more support for more operators, but again with "ease of use" in mind (i.e. support the ODBC operator to move data from Oracle to Teradata without the need for a script).

Supporting UPDATEs is currently not on the list of enhancements.

For specifying a different database for the error tables, try using job variable names "ErrorTable1" and "ErrorTable2" and fully qualify your error table names there. We do not document those just yet as we need to introduce these features bit-by-bit. We will document more job variables in upcoming releases.
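
For example, these two lines in the job variable file would direct both error tables to a different database (the database and table names here are illustrative):

ErrorTable1 = 'errdb.mytable_ET',
ErrorTable2 = 'errdb.mytable_UV'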

--SteveF

surish711 2 comments Joined 04/11
20 Apr 2011

Hi Feinholz, I am sorry, I don't know where to post this as I am a new member. I want to learn Teradata Parallel Transporter starting from the basics. Could you please direct me to some easy materials and presentations?

Appreciate your quick reply!

feinholz 76 comments Joined 05/08
20 Apr 2011

Provide me with your email and I can send you some information.
This link will also take you to our documentation material.
You can take a look at the Reference Manual and User Guide (use the latest, TPT 13.10).
http://www.info.teradata.com/templates/eSrchResults.cfm?prodline=&txtpid=&txtrelno=&txtttlkywrd=TBUILD&rdsort=Title&srtord=Asc&nm=Teradata+Parallel+Transporter

--SteveF

RahulVerma 1 comment Joined 05/11
04 May 2011

Hi Feinholz..
I have gone through the PDFs at the above link.
I have a situation where I need to load data from more than one .csv file into a Teradata database table. Will TPT be a good method, or could you suggest something else? I need to make sure that the database doesn't crash due to excessive load as well.

Please reply soon.
Have a nice day!

rameshg2521 4 comments Joined 06/11
07 Jun 2011

New to TPT, and I have a scenario here to load a target table with a TPT script. I did try with the Export operator and then the Load operator but am getting messages like "TPT03050: Schema 'oci_cust_base_Schema' is undefined." I have 2 source tables and I need to do a null check with COALESCE and join the 2 source tables to load the target table in Teradata. Can anyone suggest how I can achieve this? Any sample script will be really helpful. Many thanks.

feinholz 76 comments Joined 05/08
08 Jun 2011

For RahulVerma:

Loading data into a single table from multiple files is a strength of TPT. Of course, performance will rely on disk speed, CPU bandwidth, etc. Our documentation explains how to accomplish this (i.e. we support the wildcard syntax).

For Ramesh:

If your joins and coalescing can be accomplished through a SELECT statement (to pull data out of 2 or more tables), then TPT can support it. In order to help you further, it would always be most helpful for you to include everything you can about the job that did not work (i.e. version of TPT, contents of the script, output showing the error, etc.). Without any of this information, it is quite difficult to assist.

--SteveF

rameshg2521 4 comments Joined 06/11
13 Jun 2011

Hi Fein, here is the script and error message.

DEFINE JOB Export_to_File
DESCRIPTION 'Export channel rows from the delta base file.'
(
DEFINE SCHEMA DELTA_BASE_Schema_tpt
(
SUB_ACCT_NO_OCI VARCHAR(16) ,
SERV_ID_OCI VARCHAR(10) ,
SERV_CDE_OCI VARCHAR(8) ,
ALA_STMT_DESC_ALA VARCHAR(24) ,
SUB_ACCT_NO_OCI_trans VARCHAR(16) ,
SERV_ID_OCI_trans VARCHAR(10) ,
SERV_CDE_OCI_trans VARCHAR(8) ,
ALA_STMT_DESC_ALA_trans VARCHAR(24)

);

DEFINE OPERATOR DDL_OPERATOR()
DESCRIPTION 'TERADATA PARALLEL TRANSPORTER DDL OPERATOR'
TYPE DDL
ATTRIBUTES
(
VARCHAR PrivateLogName = 'BASE_DELTA_ddl__privatelog_ddl',
VARCHAR TdpId = 'servername',
VARCHAR UserName = 'uid',
VARCHAR UserPassword = 'pwd',
VARCHAR AccountID,
VARCHAR ErrorList = '3807'
);

DEFINE OPERATOR FILE_WRITER()
DESCRIPTION 'TERADATA PARALLEL TRANSPORTER DATA CONNECTOR OPERATOR'
TYPE DATACONNECTOR CONSUMER
SCHEMA DELTA_BASE_Schema_tpt
ATTRIBUTES
(
VARCHAR PrivateLogName='BASE_DELTA_writer_privatelog',
VARCHAR FileName = 'delta_base_test_10.txt',
VARCHAR DirectoryPath ='/app/instance/DS_files/CSG/',
VARCHAR IndicatorMode= 'N',
VARCHAR OpenMode= 'Write',
VARCHAR extDelimiter ='|',
VARCHAR Format = 'DELIMITED'
);

DEFINE OPERATOR Export_OPERATOR()
DESCRIPTION 'TERADATA PARALLEL TRANSPORTER EXPORT OPERATOR'
TYPE EXPORT
SCHEMA DELTA_BASE_Schema_tpt
ATTRIBUTES
(
VARCHAR PrivateLogName ='delta_exportoper_privatelog',
VARCHAR TdpId = 'servername',
VARCHAR UserName = 'Uname',
VARCHAR UserPassword = 'pwd',
VARCHAR WorkingDatabase = 'DB',
VARCHAR SelectStmt = 'SELECT
cast(BASE.SUB_ACCT_NO_OCI as VARCHAR(16))
,cast(BASE.SERV_ID_OCI as VARCHAR(10))
,cast(BASE.SERV_CDE_OCI as VARCHAR(8))
,cast(BASE.ALA_STMT_DESC_ALA as VARCHAR(24))
,cast(DELTA.SUB_ACCT_NO_OCI_trans as VARCHAR(16))
,cast(DELTA.SERV_ID_OCI_trans as VARCHAR(10))
,cast(DELTA.SERV_CDE_OCI_trans as VARCHAR(8))
,cast(DELTA.ALA_STMT_DESC_ALA_trans as VARCHAR(24))

FROM DB.BASE BASE
Full join DB.DELTA DELTA
ON BASE.SUB_ACCT_NO_OCI=DELTA.SUB_ACCT_NO_OCI_trans
AND BASE.SERV_ID_OCI=DELTA.SERV_ID_OCI_trans'
);

STEP setup_export_to_file
(
APPLY TO OPERATOR (FILE_WRITER() )
SELECT * FROM OPERATOR (EXPORT_OPERATOR() [1] );
);

);

Message is as below:

Job id is dsadm-542, running on dshead-dt-a1d
Found CheckPoint file: /opt/teradata/client/13.10/tbuild/checkpoint/dsadmLVCP
This is a restart job; it restarts at step setup_export_to_file.
Teradata Parallel Transporter DataConnector Version 13.10.00.02
FILE_WRITER Instance 1 directing private log report to 'BASE_DELTA_writer_privatelog-1'.
Teradata Parallel Transporter Export Operator Version 13.10.00.02
Export_OPERATOR: private log specified: delta_exportoper_privatelog
FILE_WRITER: TPT19007 DataConnector Consumer operator Instances: 1
FILE_WRITER: TPT19003 ECI operator ID: FILE_WRITER-32255
FILE_WRITER: TPT19222 Operator instance 1 processing file '/app/instance/DS_files/CSG/oci_cust_delta_base_test_10.txt'.
Export_OPERATOR: connecting sessions
Export_OPERATOR: TPT10508: RDBMS error 3707: Syntax error, expected something like an 'ON' keyword between the word 'BASE' and '.'.
Export_OPERATOR: disconnecting sessions
Export_OPERATOR: Total processor time used = '0.03 Second(s)'
Export_OPERATOR: Start : Mon Jun 13 15:27:49 2011
Export_OPERATOR: End : Mon Jun 13 15:27:51 2011
FILE_WRITER: TPT19221 Total files processed: 0.
Job step setup_export_to_file terminated (status 12)
Job dsadm terminated (status 12)

The question is: can the join be supported? I also need to add COALESCE.

Many thanks for the help. I need a sample script if you have any. Please email me, if you don't mind, at rameshg2521@gmail.com.

yeldarb96 1 comment Joined 08/11
18 Aug 2011

Will the Teradata Parallel Transporter Infrastructure be available on the em64t platform? My TPT Loader is running as 64-bit, but tdload looks like it can only run as 32-bit (i386 platform). Should I expect issues?
I'm running this on Windows Server 2008.
Thanks.

feinholz 76 comments Joined 05/08
18 Aug 2011

TPT only runs in 32-bit mode, but can do so on 64-bit platforms.
When using the script-based TPT, you should not install any 64-bit portions.
The main 4 operators (Load, Update, Export and Stream) are available in 64-bit for TPTAPI only.

--SteveF

lgscheurich 3 comments Joined 09/06
18 Aug 2011

If we install TPT 13, will it work with TD version 12? Upgrading the database to 13 is not an option right now. Thanks!

feinholz 76 comments Joined 05/08
18 Aug 2011

Yes it will.

--SteveF

lgscheurich 3 comments Joined 09/06
18 Aug 2011

Awesome! As you know, I'm a huge fan of TPT and am trying to see about getting it more heavily used at my current employer. Not only do we have ODBC data sources, but we also are about to receive 168 files from an outside vendor that we want to be able to quickly load. Thanks!

feinholz 76 comments Joined 05/08
18 Aug 2011

Oh, hey, I did not notice who was writing until just now! How are you? Glad to hear you are pushing TPT. Right thing to do. :)

--SteveF

lgscheurich 3 comments Joined 09/06
18 Aug 2011

Doing great! A bit different being on the non-vendor side of things, but good. Our environment is a good fit for TPT. As we move forward, expect more questions... :-)

feinholz 76 comments Joined 05/08
18 Aug 2011

Fire away whenever you want. You can also just send them to my email if you want.

--SteveF

rameshg2521 4 comments Joined 06/11
26 Oct 2011

Currently in our environment TPT version 13.10 is installed and the scripts run well, but the checkpoint is getting created by default with username+sequence number in the /opt/teradata/client/13.10/tbuild/checkpoint directory. If any other TPT job runs at the same time, it uses the default checkpoint that was created by another job. To make the checkpoint name unique, starting with the job name, how can I specify this at the tbuild command level? Please suggest.

feinholz 76 comments Joined 05/08
26 Oct 2011

All TPT users should provide a unique job name so that the checkpoint/restart files are unique.
The job name is the last argument on the command line.

For example:

$ tbuild -f -v . . . .

You do not have to supply a command line option letter if you put the job name as the last item on the command line.

This should be fully documented.
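
A concrete example (the script and job variable file names are illustrative; the final argument is the unique job name):

$ tbuild -f myscript.txt -v jobvars.txt my_unique_jobname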

--SteveF

rameshg2521 4 comments Joined 06/11
28 Oct 2011

Thank you so much, Fein; it worked. The checkpoint file and the .out file are now created prefixed with the job name. I really appreciate your help.

JMBERTONCELLI 1 comment Joined 11/10
02 Dec 2011

hello,

I have been using FASTLOAD for many months and I just switched to TDLOAD. However, I found that table column names starting with $, _, etc. do not work that well with TDLOAD! If column names starting with special characters are supported, then why is TDLOAD unable to deal with them?
jm.

feinholz 76 comments Joined 05/08
02 Dec 2011

This question has been asked several times across several different media.

The "$" has special meaning in TPT syntax. The user is supposed to be able to enclose column names within double-quotes to get around it.

(Someone I think tried that and it did not work for them and we are looking into it.)

So, right now, you would not be able to use "tdload" for that job, you would need to write a script and enclose the column names in double-quotes.

--SteveF

feinholz 76 comments Joined 05/08
27 Jan 2012

Unfortunately, I cannot stop my work in order to provide a private tutorial on TPT. This article outlines, in a short amount of space, all that is needed to load data using TPT without a script.

The TPT User Guide has other scenarios and script examples for you to use. We also ship script samples when you install the product.

Our Reference manual, although large (TPT can be a very complex tool with a lot of advanced features), also provides background and concepts of TPT.

All of those should help get you started.

If you have specific questions, I will try to answer them.

--SteveF

lokmac 5 comments Joined 08/09
11 Apr 2012

Is there a way to specify to use the LDAP authentication mechanism when it is not the default for users?

feinholz 76 comments Joined 05/08
12 Apr 2012

Yes. Check the TPT Reference manual for the use of the (I think) LogonMech and LogonMechData attributes.

--SteveF

lokmac 5 comments Joined 08/09
12 Apr 2012

LogMech can be used for TPT scripts, but I can't see a LogMech parameter for use with "tdload". Are you able to confirm that it does exist, and if so, what the correct command is? I can't find any reference to it in the 13.10 reference manual.

feinholz 76 comments Joined 05/08
12 Apr 2012

tdload (EasyLoader) generates scripts under the covers. In order to do this, we have created templates for all operators and in these templates we have assigned job variables to every supported attribute by every operator.

Since multiple operators can support the same type of attributes, we have to differentiate between the operators so that you can assign different values to the different operators. Usually it is done by a "source" operator and a "target" operator.

Thus, for the Load operator you would specify TargetLogonMech and TargetLogonMechData.
For the Export operator you would specify SourceLogonMech and SourceLogonMechData.
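
So, for an LDAP logon to the target system, a job variable assignment along these lines should do it (treat the exact value as illustrative):

TargetLogonMech = 'LDAP'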

--SteveF

Siva@ 6 comments Joined 05/12
06 Jun 2012

Hi Feinholz, I have a doubt. We are trying to implement TPT to load our tables, but it seems there is not much difference in the time taken to load the table using MLOAD versus TPT MLOAD. So it would be nice if you could clear my doubt, which is: can we reduce the time taken for a job using TPT when compared to the other normal utilities like MLOAD and FASTLOAD?

Siva@ 6 comments Joined 05/12
06 Jun 2012

For your information, I tried to load 2 million records using MLOAD as well as TPT MLOAD, but the time taken by MLOAD is less than the time taken by TPT MLOAD, so that's the doubt :)

feinholz 76 comments Joined 05/08
06 Jun 2012

If you have a MultiLoad job script that reads from a single file and processes 2 million rows, converting that to a TPT script with a DataConnector operator and an Update operator with the same number of sessions does not necessarily mean the TPT job will run faster. After all, the same number of records still have to be read in and the same number of rows are sent to Teradata.

It depends on where your bottleneck is. The bottleneck could be on the network side. The bottleneck could be the disk I/O. The bottleneck could be CPU availability.

What version of TPT are you using?

If you are using a release that supports MultipleReaders, try setting the MultipleReaders attribute in the DC operator to 'Yes' and assign the number of instances for the DC operator to 2 or 3 and see if that helps. We sometimes get a boost in performance with large files when we have multiple processes reading in the data.
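
In script terms, that combination looks something like this (a sketch showing only the relevant pieces; the schema, operator, and file names are illustrative):

DEFINE OPERATOR FILE_READER
TYPE DATACONNECTOR PRODUCER
SCHEMA MY_SCHEMA
ATTRIBUTES
(
VARCHAR FileName = 'mydata.txt',
VARCHAR Format = 'Delimited',
VARCHAR OpenMode = 'Read',
VARCHAR MultipleReaders = 'Yes'
);

and then ask for more than one instance when the operator is referenced:

SELECT * FROM OPERATOR (FILE_READER[3]);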

--SteveF

ericdinks 1 comment Joined 03/12
15 Jun 2012

I am new here and I have a question.

I need to convert a timestamp to a date using a TPT script. CREATED is the TIMESTAMP(3) field and I want to feed that into a derived data field called CREATED_DATE.

This is the script I am using, but I am getting error messages. Please help.

:CREATED (TIMESTAMP(3), FORMAT '||'''YYYYMMDDHHMISS'''||'),
:CREATED_DATE (TIMESTAMP(3), FORMAT '||'''YYYYMMDDHHMISS'''||'),
:FACTIVE,
:FDELETED,
:EXPIRE1,
:HAZMAT,
:REFNO4,
:REFNO5,

Siva@ 6 comments Joined 05/12
20 Jun 2012

Sorry, Feinholz, for a very late reply. The version we are using is this:

Teradata Parallel Transporter Version 13.10.00.06.

So, is that MultipleReaders option to use multiple sessions available with this version?

feinholz 76 comments Joined 05/08
20 Jun 2012

Yes, TPT 13.10 supports MultipleReaders.
Please consult the TPT 13.10 Users Guide.

--SteveF

ArpanRoy 2 comments Joined 05/12
20 Jun 2012

Hi,
Requesting your expert advice on one of the TPT issues I'm facing. Below is my TPT code:

DEFINE JOB My_JOB
DESCRIPTION 'Load script from LRF'
(
DEFINE SCHEMA INPUTLAYOUT
(
SLS_PSTN_USR_ID VARCHAR(7)
,FRST_NM VARCHAR(40)
,LST_NM VARCHAR(40)
,MID_INIT_NM VARCHAR(1)
,EMPL_IND VARCHAR(1)
);

DEFINE OPERATOR o_mload
TYPE update
SCHEMA *
ATTRIBUTES
(
VARCHAR TdpId = @TdpId
,VARCHAR UserName = @UserName
,VARCHAR UserPassword = @UserPassword
,VARCHAR AccountId = @AccountId
,INTEGER MaxSessions = @MaxSessions
,INTEGER MinSessions = @MinSessions
,VARCHAR TargetTable = @TargetTable
,VARCHAR WorkingDatabase = @WorkingDatabase
,VARCHAR LogTable = @LogTable
,VARCHAR ErrorTable1 = @ErrorTable1
,VARCHAR ErrorTable2 = @ErrorTable2
,VARCHAR WorkTable = @WorkTable
,VARCHAR AmpCheck = 'None'
,VARCHAR PrivateLogName = 'Load'
);

DEFINE OPERATOR o_tpump
TYPE stream
SCHEMA *
ATTRIBUTES
(
VARCHAR TdpId = @TdpId
,VARCHAR UserName = @UserName
,VARCHAR UserPassword = @UserPassword
,VARCHAR AccountId = @AccountId
,INTEGER MaxSessions = @MaxSessions
,INTEGER MinSessions = @MinSessions
,VARCHAR TargetTable = @TargetTable
,VARCHAR WorkingDatabase = @WorkingDatabase
,VARCHAR MacroDatabase = @MacroDatabase
,VARCHAR LogTable = @LogTable
,VARCHAR ErrorTable1 = @ErrorTable1
,VARCHAR ErrorTable2 = @ErrorTable2
,VARCHAR WorkTable = @WorkTable
,VARCHAR AmpCheck = 'None'
,VARCHAR PrivateLogName = 'Load'

);
DEFINE OPERATOR o_bteq
TYPE inserter
SCHEMA *
ATTRIBUTES
(
VARCHAR TdpId = @TdpId
,VARCHAR UserName = @UserName
,VARCHAR UserPassword = @UserPassword
,VARCHAR AccountId = @AccountId
,INTEGER MaxSessions = @MaxSessions
,INTEGER MinSessions = @MinSessions
,VARCHAR TargetTable = @TargetTable
,VARCHAR WorkingDatabase = @WorkingDatabase
,VARCHAR LogTable = @LogTable
,VARCHAR ErrorTable1 = @ErrorTable1
,VARCHAR ErrorTable2 = @ErrorTable2
,VARCHAR WorkTable = @WorkTable
,VARCHAR AmpCheck = 'None'
,VARCHAR PrivateLogName = 'Load'
);
DEFINE OPERATOR o_fastload
TYPE load
SCHEMA *
ATTRIBUTES
(
VARCHAR TdpId = @TdpId
,VARCHAR UserName = @UserName
,VARCHAR UserPassword = @UserPassword
,VARCHAR AccountId = @AccountId
,INTEGER MaxSessions = @MaxSessions
,INTEGER MinSessions = @MinSessions
,VARCHAR TargetTable = @TargetTable
,VARCHAR WorkingDatabase = @WorkingDatabase
,VARCHAR LogTable = @LogTable
,VARCHAR ErrorTable1 = @ErrorTable1
,VARCHAR ErrorTable2 = @ErrorTable2
,VARCHAR WorkTable = @WorkTable
,VARCHAR AmpCheck = 'None'
,VARCHAR PrivateLogName = 'Load'
);

DEFINE OPERATOR RO_4_INPUTLAYOUT
TYPE DATACONNECTOR PRODUCER
SCHEMA INPUTLAYOUT
ATTRIBUTES
(

VARCHAR FileName = @FileName
,VARCHAR DirectoryPath = @DirectoryPath
,VARCHAR Format = 'Delimited'
,VARCHAR TextDelimiter = '|'
,VARCHAR OpenMode = 'Read'
,VARCHAR IndicatorMode = 'N'
,VARCHAR PrivateLogName = 'Read'

);

APPLY
(

'
UPDATE tgt_tbl
SET
frst_nm=COALESCE(:frst_nm,''*'')
,lst_nm=COALESCE(:lst_nm,''*'')
,mid_init_nm=:mid_init_nm
,empl_ind=:empl_ind
,updt_dt_tm=CURRENT_TIMESTAMP(0)

WHERE
sls_pstn_usr_id=:sls_pstn_usr_id
AND empl_ind = ''N''
;'
,

'
INSERT INTO tgt_tbl
(
sls_pstn_usr_id
,frst_nm
,lst_nm
,mid_init_nm
,empl_ind
,mgt_lvl_cd
,brgn_ind
,silo_ind
,load_dt_tm
)
VALUES
(
:sls_pstn_usr_id
,COALESCE(:frst_nm,''*'')
,COALESCE(:lst_nm,''*'')
,:mid_init_nm
,:empl_ind
,''*''
,''*''
,''*''
,CURRENT_TIMESTAMP(0)
)
;'
)
INSERT FOR MISSING UPDATE ROWS
IGNORE DUPLICATE UPDATE ROWS
IGNORE MISSING UPDATE ROWS
IGNORE DUPLICATE INSERT ROWS
TO OPERATOR ( @load_op[@LoadInst] )
select *
from OPERATOR ( RO_4_INPUTLAYOUT[@ReadInst] );
);

Even though I'm using INSERT FOR MISSING UPDATE ROWS, IGNORE DUPLICATE UPDATE ROWS, IGNORE MISSING UPDATE ROWS, IGNORE DUPLICATE INSERT ROWS, if the record exists in the file as well as the table, it is still being sent to the error table.

What should I do to prevent the data from going to the error table?

feinholz 76 comments Joined 05/08
20 Jun 2012

Thank you for providing the script.
I would need to see the console output to verify that you are indeed using the Update operator.
You are using a job variable for the operator name and so just from the script I cannot tell that you are running the Update operator.
(Granted, since the duplicate rows are in the error table, you are most likely using the Update operator, but I would like verification.)
The next step is to add the following attribute (and value) to the Update operator definition:
VARCHAR TraceLevel = 'all'
And re-run the job and then get the log (the .out file in the "logs" directory) and provide that file for me so I can look at the trace information.

--SteveF

ArpanRoy 2 comments Joined 05/12
21 Jun 2012

The log after adding TraceLevel = 'all' is so huge that the tlogview command is not working.

The main problem here is that the same table is being updated from 2 different files, and the data in those files are NOT MUTUALLY EXCLUSIVE.
When the first file is loaded, some data is updated/inserted, and the same goes for the second file.
Now, as we are using one filter in the script, when the first file runs, it detects the same row that was already inserted by the second file
as an INSERT and gets a duplicate PRIMARY KEY error.

So I want to tell TPT that even if there is a DUPLICATE PK ERROR (the same row exists in the target table), just ignore it; don't insert it into the error table.

I tried to add another UPDATE statement in the APPLY clause, to make a DUMMY UPDATE, and while running it says, "INVALID UPSERT STATEMENT in APPLY".
Can you please help with how to add multiple UPDATE statements in the same APPLY clause?

Sending the job log without tracing level all.

===================================================================
= =
= TERADATA PARALLEL TRANSPORTER =
= =
= STREAM OPERATOR VERSION 12.00.00.05 =
= =
= OPERATOR SUPPORT LIBRARY VERSION 12.00.00.02 =
= =
= COPYRIGHT 2001-2009, TERADATA CORPORATION. ALL RIGHTS RESERVED. =
= =
===================================================================

**** 13:02:52 Processing starting at: Thu Jun 21 13:02:52 2012

===================================================================
= =
= Module Identification =
= =
===================================================================

Stream Operator for Solaris running Streams TCP/IP
StreamMain : 09.00.00.12
StreamCLI : 09.00.00.09
StreamSess : 09.00.00.12
StreamSort : 05.02.00.01
StreamUtil : 09.00.00.08
PcomMBCS : 09.00.00.01
PcomMsgs : 09.00.00.01
PcomNtfy : 09.00.00.01
PcomPx : 09.00.00.02
PcomUtil : 09.00.00.05
PcomCLI : 09.00.00.15
CLIv2 : 12.00.00.02

===================================================================
= =
= Attribute Definitions =
= =
===================================================================

**** 13:02:53 Options in effect for this job:
OperatorType: Consumer
Instances: 1
Character set: Not specified; will use default
Checkpoint: No checkpoint in effect
Notify: Not enabled
Error limit: No limit in effect
Tenacity: 4 hour limit to successfully connect
Sleep: 6 minute(s) between connect retries
Buffers: 3
Pack: 20
Robust: In effect

===================================================================
= =
= Column/Field Definition =
= =
===================================================================

Column Name Offset Length Type
============================== ====== ====== ================
SLS_PSTN_USR_ID 0 7 VARCHAR
FRST_NM 9 40 VARCHAR
LST_NM 51 40 VARCHAR
MID_INIT_NM 93 1 VARCHAR
EMPL_IND 96 1 VARCHAR
============================== ====== ====== ================
INDICATOR BYTES NEEDED: 1
EXPECTED RECORD LENGTH: 100
**** 13:02:53 Operator Command ID for External Command Interface: o_tpump6763

===================================================================
= =
= Control Session Connection =
= =
===================================================================

**** 13:02:53 Connecting to RDBMS: 'xxxxxxxxxxx'
**** 13:02:53 Connecting with UserId: 'XXXload'

===================================================================
= =
= Teradata Database Information =
= =
===================================================================

**** 13:02:53 Teradata Database Version: '12.00.03.46 '
**** 13:02:53 Teradata Database Release: '12.00.03.33 '
**** 13:02:53 Maximum request size supported: 1MB
**** 13:02:53 Session character set: 'ASCII'
**** 13:02:53 RDBMS supports upsert SQL
**** 13:02:53 Data Encryption: supported
**** 13:02:53 RDBMS supports Array Support
**** 13:02:54 Restart log table 'XXXWORK.LGsatgt_tbl_pstn' has been created
**** 13:02:55 Error table 'XXXWORK.M12173_46974_21249_ET' has been created

===================================================================
= =
= Special Session Connection =
= =
===================================================================

**** 13:03:03 Maximum number of special sessions requested: 24
**** 13:03:03 Minimum number of special sessions required: 24

Instance Assigned Connected Result
======== ======== ========= ======================
1 24 24 Successful
======== ======== ========= ======================
Total 24 24 Successful
**** 13:03:07 Current working DATABASE set: 'USER_30_DAY_TABLES'

===================================================================
= =
= Load Phase =
= =
===================================================================

**** 13:03:08 Starting to send data to the RDBMS
**** 13:15:59 Checkpoint complete. Rows sent: 28806
**** 13:16:00 Checkpoint complete. Rows sent: 28806
**** 13:16:02 Finished sending rows to the RDBMS

Instance Rows Sent
======== =============
1 28806
======== =============
Total 28806

**** 13:16:02 Load Statistics for DML Group 1 :
Type Database Table or Macro Name Activity
====== ======================== ========================= ============
UPDATE USER_30_DAY_TABLES tgt_tbl 407
INSERT USER_30_DAY_TABLES tgt_tbl 0

===================================================================
= =
= Operator Task Cleanup =
= =
===================================================================

**** 13:16:02 Dropping macro 'tpump_macros.M12173_46974_21249_000_000'
**** 13:16:02 Macro 'tpump_macros.M12173_46974_21249_000_000' has been dropped
**** 13:16:02 Error table XXXWORK.M12173_46974_21249_ET contains 28399 rows
**** 13:16:02 Dropping restart log table 'XXXWORK.LGsatgt_tbl_pstn'
**** 13:16:03 Restart log table 'XXXWORK.LGsatgt_tbl_pstn' has been dropped

===================================================================
= =
= Logoff/Disconnect =
= =
===================================================================

**** 13:16:03 Logging off all sessions

Instance Cpu Time
======== ================
1 15.78 Seconds

**** 13:16:05 Total processor time used = '15.78 Second(s)'
. Start : Thu Jun 21 13:02:52 2012
. End : Thu Jun 21 13:16:05 2012
. Highest return code encountered = '0'.
**** 13:16:05 This job terminated

br230015 7 comments Joined 09/11
27 Jun 2012

Hi Feinholz,
I installed TPT 13.10 on a Linux 32-bit landing server and am getting the following error. I would be thankful for any help.

[root@tpls-dr-poc testing]# tdload -f file1.txt -u ut_lodusr01 -h 10.16.5.52 -t dt_test.TPT_TEST
Teradata Load Utility Version 13.10.00.09

Password: ********
Teradata Parallel Transporter Version 13.10.00.09
Job log: /opt/teradata/client/13.10/tbuild/logs/root-19.out
Job id is root-19, running on tpls-dr-poc
Found CheckPoint file: /opt/teradata/client/13.10/tbuild/checkpoint/rootLVCP
This is a restart job; it restarts at step MAIN_STEP.
Teradata Parallel Transporter DataConnector Version 13.10.00.09
$FILE_READER Instance 1 directing private log report to 'FileReaderLog-1'.
$FILE_READER Instance 1 restarting.
$FILE_READER: TPT19008 DataConnector Producer operator Instances: 1
Teradata Parallel Transporter Load Operator Version 13.10.00.04
$LOAD: private log specified: LoadLog
$FILE_READER: TPT19003 ECI operator ID: $FILE_READER-3498
$LOAD: connecting sessions
$LOAD: preparing target table
$LOAD: entering Acquisition Phase
$FILE_READER: TPT19222 Operator instance 1 processing file 'file1.txt'.
$FILE_READER: TPT19350 I/O error on file 'file1.txt'.
$FILE_READER: TPT19109 Data Schema Column count (3) conflicts with columns received (0) in record 2.
$LOAD: disconnecting sessions
$FILE_READER: TPT19221 Total files processed: 0.
$LOAD: Total processor time used = '0.47 Second(s)'
$LOAD: Start : Wed Jun 27 14:01:27 2012
$LOAD: End : Wed Jun 27 14:01:51 2012
Job step MAIN_STEP terminated (status 8)
Job root terminated (status 8)

Table definition
CREATE SET TABLE DT_TEST.TPT_TEST ,NO FALLBACK ,
NO BEFORE JOURNAL,
NO AFTER JOURNAL,
CHECKSUM = DEFAULT,
DEFAULT MERGEBLOCKRATIO
(
Acc_no VARCHAR(255) CHARACTER SET LATIN NOT CASESPECIFIC,
Name VARCHAR(255) CHARACTER SET LATIN NOT CASESPECIFIC,
Salary BIGINT)
PRIMARY INDEX ( Acc_no );
data file entry
more file1.txt
1,BR,50

Teradata

br230015 7 comments Joined 09/11
27 Jun 2012

I was able to successfully load using the following command:
tdload -f file1.txt -u ut_lodusr01 -h 10.16.5.52 -t dt_test.tpt_test

But I am getting the following error while loading through a script:

[root@tpls-dr-poc testing]# tbuild -f tpt_test.txt
Teradata Parallel Transporter Version 13.10.00.09
TPT_INFRA: Syntax error at or near line 101 of Job Script File 'tpt_test.txt':
TPT_INFRA: At "SELECT" missing SEMICOL_ in Rule: Job Definition Body
Compilation failed due to errors. Execution Plan was not generated.
Job script compilation failed.
Job terminated with status 8.

Table definition
CREATE SET TABLE DT_TEST.TPT_TEST ,NO FALLBACK ,
NO BEFORE JOURNAL,
NO AFTER JOURNAL,
CHECKSUM = DEFAULT,
DEFAULT MERGEBLOCKRATIO
(
Acc_no VARCHAR(255) CHARACTER SET LATIN NOT CASESPECIFIC,
Name VARCHAR(255) CHARACTER SET LATIN NOT CASESPECIFIC,
Salary BIGINT)
PRIMARY INDEX ( Acc_no );
data file entry
more file1.txt
1,BR,50

Thanks

Teradata

Siva@ 6 comments Joined 05/12
27 Jun 2012

Hi FEINHOLZ,

Waiting for your reply to my post :)

feinholz 76 comments Joined 05/08
27 Jun 2012

If you have a syntax error in the script, the best approach is to post the entire script so that someone can try to find the error.

--SteveF

feinholz 76 comments Joined 05/08
27 Jun 2012

For the post about this:

$FILE_READER: TPT19350 I/O error on file 'file1.txt'.
$FILE_READER: TPT19109 Data Schema Column count (3) conflicts with columns received (0) in record 2.

Please provide the first few rows of the data file.
Make sure you do not have an end-of-record marker where one should not be.

--SteveF

Siva@ 6 comments Joined 05/12
27 Jun 2012

Hi Feinholz, this is my entire script. I think it will help your analysis.

DEFINE JOB LOAD_RESULTS_TABLE
DESCRIPTION 'INSERTING ROWS INTO RESULTS TABLE'
(
DEFINE SCHEMA PRODUCT_SOURCE_SCHEMA
(
A CHAR(02),
B CHAR(02),
C CHAR(02),
D CHAR(05),
);
DEFINE OPERATOR FILE_READER
TYPE DATACONNECTOR PRODUCER
SCHEMA PRODUCT_SOURCE_SCHEMA
ATTRIBUTES
(
VARCHAR FILENAME = 'DD:INFILE',
VARCHAR FORMAT = 'UNFORMATTED',
VARCHAR OPENMODE = 'READ',
VARCHAR MULTIPLEREADERS = 'Y'
);
DEFINE OPERATOR UPDATE_OPERATOR
DESCRIPTION ' LOAD OPERATOR FOR TD '
TYPE UPDATE
SCHEMA *
ATTRIBUTES
(
VARCHAR PRIVATELOGNAME = 'SIVA.ABCD',
VARCHAR TDPID ='8',
VARCHAR USERNAME = @JOBVAR_USERNAME,
VARCHAR USERPASSWORD = @JOBVAR_PASSWORD,
VARCHAR LOGTABLE = 'employee_tptlog',
VARCHAR TARGETTABLE = 'employee_table',
VARCHAR ERRORTABLE1 = 'employee_table_E1',
VARCHAR ERRORTABLE2 = 'employee_table_E2'
);
APPLY
(
'INSERT INTO employee_table
( A
,B
,C
,D
,DATE_ENTRY
,DATE_STAMP
,TIME_STAMP
)
VALUES
(:A
,:B
,:C
,:D
,DATE
,DATE
,TIME);')
TO OPERATOR (UPDATE_OPERATOR [2])
SELECT * FROM OPERATOR (FILE_READER [2]);
);

So I tried submitting it through a JCL job. But I got the following error.

****************************************************** *************************
TPT_INFRA: Syntax error at or near line 65 of Job Script File 'dd:SYSIN':
TPT_INFRA: At "2" missing RPAREN_ in Rule: OPERATOR Clause
Compilation failed due to errors. Execution Plan was not generated.
Job script compilation failed.
******************************** Bottom of Data ********************************
For this I searched your posts and found a solution saying we need to change the emulator code page to 1037.
But I am not aware of how to do this; it would be really helpful if you could look into it.

Waiting for a solution :)

feinholz 76 comments Joined 05/08
28 Jun 2012

One thing I do see is that you have a comma after the specification of column "D" in the schema object.

--SteveF

br230015 7 comments Joined 09/11
28 Jun 2012

How do I confirm that the Stream operator is working?

Teradata

Siva@ 6 comments Joined 05/12
28 Jun 2012

Okay, thanks, feinholz. That comma thing was a mistake; I cleared it. But I am getting an error because of that [2]. Does this symbol '[' have anything to do with mainframes?

So I got this error:

TPT_INFRA: At "2" missing RPAREN_ in Rule: OPERATOR Clause
Compilation failed due to errors. Execution Plan was not generated.

feinholz 76 comments Joined 05/08
11 Jul 2012

Please take a look at this thread, specifically regarding the terminal emulator code page that should be used.
http://forums.teradata.com/forum/tools/tpt-multiple-instances-using-mainframe-host-syntax-error

--SteveF

21 Jul 2012

Hi Fein,

I am doing a FastExport and it is working fine, but it is always exporting the data with some binary junk values.

Please help.

R's,
Arka

feinholz 76 comments Joined 05/08
23 Jul 2012

What record format are you specifying?
Most formats are binary formats.
How many bytes of "junk"?
If 2 at the beginning of each record, then it is possible you are seeing the 2-byte record length.
If it is within the record, you are probably seeing the 2-byte field lengths for the VARCHAR fields.

--SteveF

ericsun2 5 comments Joined 06/10
24 Aug 2012

Is there 64-bit tdload and tbuild available for Linux?

Thanks

feinholz 76 comments Joined 05/08
24 Aug 2012

Not yet. They will be available in TTU 14.10.

--SteveF

sarath@TDDEV 2 comments Joined 09/12
11 Sep 2012

Hi Steven,

In TPT there is a way to trim trailing blanks using the "TrimTrailingBlanks" option. Because of the way our source gives us the data, we get trailing blanks. Is there a way to do the same thing using TDLOAD?

Thanks

Oscar101 2 comments Joined 09/12
13 Sep 2012

Hi, I am getting the following error while moving data from one environment to another...
Could you please help me resolve it?

ERROR:com.teradata.jdbc.jdbc_4.util.jdbcexception:
[Teradata Database] [TeraJDBC 13.10.00.03] [Error 3134] [SQLState HY008] The request was aborted by abort session command.

feinholz 76 comments Joined 05/08
13 Sep 2012

TPT does not use the Teradata JDBC driver.
TPT has no JDBC integration.

--SteveF

lokmac 5 comments Joined 08/09
13 Sep 2012

Hi Oscar101,

This page is only related to the "tdload" or "Easy Loader for Teradata Parallel Transporter" utility.

If you have a question about something that isn't related to this utility (which JDBC isn't) please post it in the relevant section of the Teradata Forums (http://forums.teradata.com/forum).

feinholz 76 comments Joined 05/08
13 Sep 2012

Well, that is true about a lot of the postings on this page, but so far we have not enforced rules as to how/what/where people can post their questions. Yes, it would be nice for people to only post questions that are related to the top title of this page. But sometimes, posts become conversations which morph into various topics.

--SteveF

lokmac 5 comments Joined 08/09
13 Sep 2012

I'm not stopping anyone. But they might find a broader audience with experience in the subject area if they post it in the forum (particularly if not related to TPT).

sarath@TDDEV 2 comments Joined 09/12
14 Sep 2012

Hi Steven,

I posted this question 2 days back, but since it is going to get lost in the other posts before getting answered, I am posting it again. Can you please answer my question?

In TPT there is a way to trim trailing blanks using the "TrimTrailingBlanks" option. Because of the way our source gives us the data, we get trailing blanks. Is there a way to do the same thing using TDLOAD?

Thanks

feinholz 76 comments Joined 05/08
14 Sep 2012

No, that option is not currently available with Easy Loader.

--SteveF

babugandi 2 comments Joined 09/12
30 Sep 2012

Hi, I am able to load the data to a table from the flat file using the script below. But the values in the flat file are enclosed with ", so it is loading those into the table. In order to avoid loading the ", is there any attribute to be defined?

DEFINE JOB FILE_LOAD
DESCRIPTION 'Load a Teradata table from a file'
(
DEFINE SCHEMA Trans_n_Accts_Schema
(
Transaction_Id VARCHAR(256),
Risk_Classification VARCHAR(256) ,
Risk_Classification_Date VARCHAR(256),
Revised_Risk_Class_Reason VARCHAR(256),
Revised_Risk_Class_Author_Appr VARCHAR(256),
Revised_Risk_Classification VARCHAR(256)
);
DEFINE OPERATOR DDL_OPERATOR
TYPE DDL
ATTRIBUTES
(
VARCHAR PrivateLogName = 'ddl_log',
VARCHAR TdpId = @myTdpId,
VARCHAR UserName = @myUserName,
VARCHAR UserPassword = @myUserPassword,
VARCHAR ErrorList = '3807'
);
DEFINE OPERATOR FILE_READER
TYPE DATACONNECTOR PRODUCER
SCHEMA Trans_n_Accts_Schema
ATTRIBUTES
(
VARCHAR PrivateLogName = 'dataconnector_log',
VARCHAR DirectoryPath = '/ap02d/cfdw/data_files/ERISK/incoming/',
VARCHAR FileName = 'RISK_CLASS_DATA.csv',
VARCHAR Format = 'Delimited',
VARCHAR OpenMode = 'Read',
VARCHAR TextDelimiter = ','
);
DEFINE OPERATOR LOAD_OPERATOR
TYPE UPDATE
SCHEMA *
ATTRIBUTES
(
VARCHAR PrivateLogName = 'update_log',
VARCHAR TdpId = @myTdpId,
VARCHAR UserName = @myUserName,
VARCHAR UserPassword = @myUserPassword,
VARCHAR TargetTable = 'db_main' || '.Trans',
VARCHAR LogTable = 'db_main' || '.LG_Trans',
VARCHAR ErrorTable1 = 'db_main' || '.ET_Trans',
VARCHAR ErrorTable2 = 'db_main' || '.UV_Trans',
VARCHAR WorkTable = 'db_main' || '.WRK_Trans'
);

STEP Setup_Tables
(
APPLY
('DROP TABLE ' || 'db_main' || '.ET_Trans;'),
('DROP TABLE ' || 'db_main' || '.UV_Trans;'),
('DROP TABLE ' || 'db_main' || '.LG_Trans;'),
('DROP TABLE ' || 'db_main' || '.WRK_Trans;')
TO OPERATOR (DDL_OPERATOR);
);

STEP Load_Trans_Table
(
APPLY
('INSERT INTO ' || 'db_main' || '.Trans(Transaction_Id,
Risk_Classification,
Risk_Classification_Date,
Revised_Risk_Class_Reason,
Revised_Risk_Class_Author_Appr,
Revised_Risk_Classification,
ECTL_DATA_SRCE_ID)
VALUES(:Transaction_Id,
:Risk_Classification,
:Risk_Classification_Date,
:Revised_Risk_Class_Reason,
:Revised_Risk_Class_Author_Appr,
:Revised_Risk_Classification,
9);')
TO OPERATOR (LOAD_OPERATOR[2])
SELECT * FROM OPERATOR(FILE_READER[2]);
);
);

got_boys_123 1 comment Joined 08/12
02 Oct 2012

Hi,

Our shop is pretty new to TPT and tdload. I am trying to test tdload but I'm having problems when I try to pass an Account ID.

I'm executing tdload using the following command:
tdload --SourceFileName file.txt --TargetUserName user_1 --TargetUserPassword password_1 --TargetAccountId $L$DEV0 --TargetWorkingDatabase target_db --TargetTable target_tbl --TargetTdpId test_tdpid

I get the following error message:
Teradata Parallel Transporter Version 13.10.00.04
TPT_INFRA: TPT04094: Error: Line 1 of tbuild Command Line: Invalid token expression:
no tokens.
Job script preprocessing failed.
Job terminated with status 8.
Teradata Load Utility Version 13.10.00.04
account_id: '$L$DEV0'
+ Res=0

This problem occurs when I specify the --TargetAccountId option. The process works when I try to remove "--TargetAccountId $L$DEV0" but I need to supply it as it is required in our shop. I'm pretty sure the Account ID ($L$DEV0) works because I am able to use it in BTEQ.

Does anybody know how I can resolve this? The 2nd problem I have with this is that tdload returns a 0 return code even though the process failed. Is this a bug?

I appreciate any help in advance. Thanks!

feinholz 76 comments Joined 05/08
08 Oct 2012

Babugandi: in TPT14.0 we introduced the support for quoted delimited strings. The 14.0 documentation will provide the details.
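
(For reference, the quoted-data support is controlled through DataConnector operator attributes; if memory serves, it is along the lines of the following, but please verify the exact attribute names in the 14.0 documentation:

VARCHAR QuotedData = 'Yes'
)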

--SteveF

feinholz 76 comments Joined 05/08
08 Oct 2012

got_boys_123: did you try enclosing $L$DEV0 in double-quotes? Also, on Unix, try to put a backslash ("\") in front of the "$" characters. The reason for the failure is that the Unix shell is processing the "$" before it gets to tdload.
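
For example, on a Unix shell either of these forms keeps the "$" characters intact (single quotes or backslashes; note that plain double quotes still allow the shell to expand "$" there). The other options from your command line are omitted here for brevity:

tdload --TargetAccountId '$L$DEV0' ...
tdload --TargetAccountId \$L\$DEV0 ...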

--SteveF

cloghin 4 comments Joined 04/12
25 Jan 2013

tdload has a -S option to generate and persist the TPT script. This could be useful to save time in generating correct TPT scripts. Where can the TPT script be located? It was not in the working directory. Thanks.

cab56 2 comments Joined 02/13
08 Feb 2013

I am attempting to use tdload to copy data from one DB to another DB. When I use a simple form of the SelectStmt job variable like 'select * from tableName;' then all works okay. My problem is that many times I do not wish to select all rows from the table and I would like to do something like 'select * from tableName where dateCol = '2013-01-15';', but I cannot seem to get this to work. I've tried to define a new job variable such as WhereClause = ' where dateCol = ''2013-01-15'' ' and then concatenating this @WhereClause in the SelectStmt setting. This also did not work. How can I specify a SelectStmt query with a WHERE clause like the above? I am sure that this can be done, correct? I am using version 14.00.00.05 of tdload.

cab56

feinholz 76 comments Joined 05/08
11 Feb 2013

Please provide me with the command line you are submitting, and the contents of the job variable file.

--SteveF

cab56 2 comments Joined 02/13
11 Feb 2013

Here you go, more details than you will ever need, I hope. Thanks for the assistance.

Here is the actual tdload command: tdload -S -j demo_rtl_auth_fraud_lvl0.par.prd rtl_auth_fraud_lvl0
And here is the standard output from the above command:

Teradata Load Utility Version 14.00.00.05
TDLOAD: TPT05530: Warning: Both SourceTableName and SelectStmt are specified.
Value for SourceTableName will be ignored.
Teradata Parallel Transporter Version 14.00.00.05
Job log: /dwh_app/dw/dev/logs/tpt/dwhd-1625.out
Job id is dwhd-1625, running on lp1dev
Teradata Parallel Transporter Export Operator Version 14.00.00.05
$EXPORT: private log specified: ExportLog
Teradata Parallel Transporter Load Operator Version 14.00.00.05
$LOAD: private log specified: LoadLog
$EXPORT: connecting sessions
$LOAD: connecting sessions
$LOAD: preparing target table
$LOAD: entering Acquisition Phase
$EXPORT: sending SELECT request
$EXPORT: entering End Export Phase
$EXPORT: Total Rows Exported: 0
$LOAD: entering Application Phase
$LOAD: Statistics for Target Table: 'd_rtl_auth_fraud_lvl0_cab'
$LOAD: Total Rows Sent To RDBMS: 0
$LOAD: Total Rows Applied: 0
$LOAD: disconnecting sessions
$EXPORT: disconnecting sessions
$EXPORT: Total processor time used = '0.155862 Second(s)'
$EXPORT: Start : Mon Feb 11 15:25:01 2013
$EXPORT: End : Mon Feb 11 15:25:25 2013
$LOAD: Total processor time used = '0.166719 Second(s)'
$LOAD: Start : Mon Feb 11 15:25:01 2013
$LOAD: End : Mon Feb 11 15:25:28 2013
Job step MAIN_STEP completed successfully
Job dwhd completed successfully

Here is the job variable file contents;

SourceTable = 'd_rtl_auth_fraud_lvl0',
SourceTdpId = 'NKPC1TDC',
SourceUserName = 'terloadp',
SourceUserPassword = 'xxxxxx',
SourceWorkingDatabase = 'total_db',
SelectStmt = 'select * from d_rtl_auth_fraud_lvl0 where data_date = ''2013-01-10'';',
TargetTable = 'd_rtl_auth_fraud_lvl0_cab',
TargetTdpId = 'NKPD1TDC',
TargetUserName = 'terloadd',
TargetUserPassword = 'xxxxx',
TargetWorkingDatabase = 'total_db_d'

When I run this command - select count(*) from total_db.d_rtl_auth_fraud_lvl0 where data_date = '2013-01-10';
524 rows are returned. (This is my source table.)

When I look at the contents of the output of the tdload command when the -x option is used, I see this fragment and the select SQL:

jv_name: SelectStmt.
Leaving ValidateOption with result: 0
(option) name: SelectStmt.
(option) value: select * from d_rtl_auth_fraud_lvl0 where data_date = 2013-01-10;.
Leaving GetVariable with result: 0

And I see the following in these files, created at runtime:

dwhd on lp1dev /opt/teradata/client/14.00/tbuild/logs> lt
-rw-r--r-- 1 dwhd dwhd 425 Feb 11 15:01 TptScript_18940040_jv
-rw-r--r-- 1 dwhd dwhd 556 Feb 11 15:01 TptScript_18940040

dwhd on lp1dev /opt/teradata/client/14.00/tbuild/logs> cat TptScript_18940040_jv
SourceTable = 'd_rtl_auth_fraud_lvl0'
, SourceTdpId = 'NKPC1TDC'
, SourceUserName = 'terloadp'
, SourceUserPassword = 'xxxxxx'
, SourceWorkingDatabase = 'total_db'
, SelectStmt = 'select * from d_rtl_auth_fraud_lvl0 where data_date = 2013-01-10;'
, TargetTable = 'd_rtl_auth_fraud_lvl0_cab'
, TargetTdpId = 'NKPD1TDC'
, TargetUserName = 'terloadd'
, TargetUserPassword = 'xxxxx'
, TargetWorkingDatabase = 'total_db_d'

dwhd on lp1dev /opt/teradata/client/14.00/tbuild/logs> cat TptScript_18940040
DEFINE JOB LOAD_JOB
DESCRIPTION 'Loading Data From File To Teradata Table'
(
SET TargetMinSessions = 1;
SET TargetMaxSessions = 32;
SET LogTable = @TargetWorkingDatabase || '.' || @TargetTable || '_Log';
SET SourceMinSessions = 1;
SET SourceMaxSessions = 32;
SET SourceInstances = 1;
SET ExportPrivateLogName = 'ExportLog';
SET LoadInstances = 1;
SET UpdatePrivateLogName = 'UpdateLog';

APPLY $INSERT TO OPERATOR ($UPDATE() [@LoadInstances])
SELECT * FROM OPERATOR ($EXPORT() [@SourceInstances]);
);

Please notice that in the generated script above, data_date = 2013-01-10 no longer has any quotes around it. The data_date column is a DATE column, so obviously I need those quotes. This is the primary issue: how do I code the SelectStmt in the job variable file so that the needed quotes are retained?
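(A possible stopgap, assuming the session runs in Teradata mode, where a DATE column can be compared against its internal integer form, (year - 1900) * 10000 + month * 100 + day, would be to avoid the embedded quotes entirely:

SelectStmt = 'select * from d_rtl_auth_fraud_lvl0 where data_date = 1130110;',

Here 1130110 is 2013-01-10 in that integer form. This only sidesteps the stripped quotes; it does not explain why they are stripped.)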

Also, once we get beyond this little issue, how do I control the location of the generated runtime TPT script? It seems to go into the installation directory's file system, and this is not what I want.

Are there more tdload manuals and/or documentation besides what I read in the TPT Reference?

cab56

teradatauser2 29 comments Joined 04/12
19 Feb 2013

I heard there is one more tool that can move data from one server to another; say I want to move data from a prod server to a dev server. TPT was used for that as well. Can tdload be used for this? Can you refer me to the right link, if you are aware of one?
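For instance, would a job variable file shaped like cab56's above work for this, with the source pointing at prod and the target at dev (all names below are placeholders)?

SourceTable = 'mytable',
SourceTdpId = 'PRODTDPID',
SourceUserName = 'produser',
SourceUserPassword = 'xxxxxx',
SourceWorkingDatabase = 'mydb',
TargetTable = 'mytable',
TargetTdpId = 'DEVTDPID',
TargetUserName = 'devuser',
TargetUserPassword = 'xxxxxx',
TargetWorkingDatabase = 'mydb'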

GLOGASUBRAMANI 1 comment Joined 08/11
11 Jun 2013

Hi feinholz,
Could you please let me know if we can force tdload to use STREAM/INSERTER/LOAD/UPDATE? I.e., can we force tdload to use STREAM instead of UPDATE?
Thanks,
Logasubramani
 

feinholz 76 comments Joined 05/08
11 Jun 2013

Right now, the user cannot tell tdload which operator to use. We are aware of this type of enhancement and will consider it in a future release.
The idea behind tdload is to remove the user from making these types of decisions. The future enhancement will probably center around whether the user wishes to use a load slot or not, and not which specific operator to use.
We would like to head in the direction where the user never has to have control over the specific operator that is used, but we will see how long it takes to get there.
 

--SteveF

satyakatta 1 comment Joined 10/11
07 Apr 2015

I am running the TPT JMS Access Module and I am getting the error "Could not find a valid end of record (EOR) - DATA ERROR". My JMS queue test message is in fixed-width format.
It seems the issue is that the end of record cannot be identified from the JMS queue (the source).
What do we need to add in the source to identify the end of record (EOR)?
Could you please help us?
The log file is below:
hex: Hexadecimal formatted display from address 0846D5B0 for 24 bytes.
hex:  0000  41 42 43 44.45 46 47 2E.41 42 43 44.45 46 47 2E  *ABCDEFG.ABCDEFG.*
hex:  0010  41 42 43 44.45 46 47 2E                          *ABCDEFG.        *
     !ERROR! Could not find a valid EOR
     !ERROR! rcode=35
     !ERROR! returning rcode=35
     Last RecCount=0
     RecCount=0
     Buffer->Data
hex: Hexadecimal formatted display from address 00000000 for 0 bytes.
TPT_DATACONNECTOR_OPERATOR: TPT19350 I/O error on file 'ESB_JMS_LG_TC_SHIP_EV'.
     *** DATA ERROR > **************************************
     pmGetID entry <<<<
     PMPROCS................................. 14.00.00.02
     PMRWFMT................................. 14.00.00.02
     PMTRCE.................................. 13.00.00.02
     PMMM.................................... 03.00.00.01
     DCUDDI.................................. 14.00.00.11
     PMHEXDMP................................ 05.00.00.01
     PMUNXDSK................................ 14.00.00.05
     Data description at error:
     File name: 'ESB_JMS_LG_TC_SHIP_EV'
     Access Module I/O
     Record number: 1
     Data Format: 'TEXT'
     Access Mode: 'Read'
     Buffer Size: 64000
     Buffer Position: 0
     Buffer End: 24
  Hexadecimal formatted display from address 0846D5B0 for 24 bytes.
   0000  41 42 43 44.45 46 47 2E.41 42 43 44.45 46 47 2E  *ABCDEFG.ABCDEFG.*
   0010  41 42 43 44.45 46 47 2E                          *ABCDEFG.        *
     Record Count: 0
     EOR bytes
  Hexadecimal formatted display from address 082CB9C6 for 0 bytes.
     *** < DATA ERROR **************************************
     pmGetErrText entry <<<<
     LastReturnCode=35
     pmRead rc=35
     'EOF encountered before end of record'
TPT_DATACONNECTOR_OPERATOR: TPT19435 pmRead failed. EOF encountered before end of record (35)
TPT_DATACONNECTOR_OPERATOR: TPT19305 Fatal error reading data.
Setting exit code = 12.
Thanks, satya 

feinholz 76 comments Joined 05/08
08 Apr 2015

Please send this request to the proper forum.
http://forums.teradata.com/forum/tools
 

--SteveF

raavimistry 1 comment Joined 02/14
14 May 2015

Hi,
In tdload, RowErrFileName is not available.
Can you please tell me whether there is any option that provides this feature?
I need to reject invalid records from the file without tdload failing.
Please help me on this.
 
 

jammy19031989 9 comments Joined 07/13
05 Jan 2016

I am new to tdload and would like clarification on the questions below.
 
Can tdload load 2 files into a single table with a single tdload command?
Can tdload load 2 files into 2 tables with a single tdload command?
Can we have user-defined log table and error table names, and also specify where they need to be stored?
To save and view the script there is -S, but where does the script get stored?
Also, I believe we can't communicate with Oracle or any other database outside Teradata through Easy Loader.
 
Thanks!
 

feinholz 76 comments Joined 05/08
06 Jan 2016

Some of these answers depend on the version of TPT that you have installed.
 
tdload can support multiple files to load into a single table. You must use wildcard syntax for the file name in order to achieve this.
 
tdload cannot load into more than one table.
 
Yes, you can have user-defined log table names and error table names (depending on the version of TPT you are using). You would specify the job variable names associated with the operator you want to use (Load or Update).
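For example, something like this in the job variable file (a sketch only; these names follow the OperatorName + AttributeName convention visible in the generated script earlier in this thread, so verify the exact names against the templates shipped with your TPT version):

LoadLogTable = 'mydb.mytable_log',
LoadErrorTable1 = 'mydb.mytable_et',
LoadErrorTable2 = 'mydb.mytable_uv'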
 
The -S option should put the script in the directory from which the job is being executed.
 
tdload currently does not support the ODBC operator. We are working on that in the 16.0 timeframe. No commitment should be inferred from this statement.
 

--SteveF

jammy19031989 9 comments Joined 07/13
11 Jan 2016

Thanks!! I am working on TD15.
'tdload can support multiple files to load into a single table. You must use wildcard syntax for the file name in order to achieve this."
 
Could you please be a little more precise about this?
 
tdload -f X01.txt -u usr_nm -p pwd -h TDPID -t tab_nm -d '|~' --TargetWorkingDatabase DLANALYTICS -S job11
 
Here I am trying to load a file called X01.txt.
How can I load X01.txt and X02.txt with a single tdload command?

jammy19031989 9 comments Joined 07/13
13 Jan 2016

Could you please reply to my above post?
 
What is the wildcard syntax to load the files X01.txt and X02.txt into a single table?
Besides, I see the tdload job completed unsuccessfully, and when I look at the log in the "logs" folder, I am unable to see the error table names. Can you please specify where they are stored?

feinholz 76 comments Joined 05/08
13 Jan 2016

Try specifying "X*.txt".
Error tables are stored on the database, most likely in the same database as your target table.
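For example, quoting the pattern so the shell passes it through to tdload instead of expanding it (options as in your earlier command):

tdload -f "X*.txt" -u usr_nm -p pwd -h TDPID -t tab_nm -d '|~' --TargetWorkingDatabase DLANALYTICS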
 

--SteveF

jammy19031989 9 comments Joined 07/13
14 Jan 2016

Yeah, I have tried that, but it throws an error saying the file "X*.txt" is not found.
It seems to be treating '*' as a literal character.
Could you please verify this and confirm?
(I would like to process more than one file into a single table and am looking for the correct wildcard character.)

jammy19031989 9 comments Joined 07/13
14 Jan 2016

In addition to the above concern, I would like to clarify one more.
 
The job variable file looks like the one below; it is for loading a file into a table:
 
SourceFileName='/export/home/gssjamaa/TFACP200/ORA_TFACP200_049_CE3.txt',
TargetUserName='usrnm',
TargetUserPassword='**',
TargetTdpId='tdpid',
TargetLogonMech='ldap',
t='T_SRC_TFACP200_TPB',
d='|~',
TargetWorkingDatabase='FBA_T'
 
While running the tdload command, it throws an error like:
 
"$FILE_READER: TPT19203 Required attribute 'OpenMode' not found.  Defaulting to 'Read'."
 
It seems unable to process the file even though the file exists in the source path.
 
On the other hand, when running the command itself, I am able to process the file:
 
"tdload -f "/export/home/gssjamaa/TFACP200/ORA_TFACP200_04*_CE3.txt" -u usrnm -p ** -h tdpid --TargetLogonMech ldap -t T_SRC_TFACP200_TPB -d '|~' --TargetWorkingDatabase FBA_T -S job121"
 
Is there any problem with the job variable file due to which I am unable to process the source file?

jammy19031989 9 comments Joined 07/13
18 Jan 2016

Could you please reply to my above two questions? An early reply would be very much appreciated.
 
Thanks in advance!

jammy19031989 9 comments Joined 07/13
18 Jan 2016

I have rectified the error. I am able to process multiple files using the wildcard character.
 
Thanks!!
Please reply on the "job variable file" issue alone.
