akd2k6 54 posts Joined 12/14
14 Dec 2014
Teradata tpt

Hi All, I am new to this forum and to Teradata.

I am trying to create a unix script that will sync a table from one database to another (for COB sync) with the TPT utility.

 

I need the script to unload from the source table and load to the target table via TPT, unloading to a delimited file if possible. I also need to generate the DML of the unload file and save it for future use.

 

Can you please tell me whether there is anything in TPT like the .ml script that FastExport creates automatically, and if not, how to achieve that?

Also I want the TPT load to be an upsert, which tdload is not doing. I tried with tdload but it only does inserts.

 

feinholz 1234 posts Joined 05/08
15 Dec 2014

The TPT documentation has sample scripts to help you with this.
When you install TPT, there is also a samples directory with sample scripts for all of the popular scenarios (like the one you seek).
TPT does not have a feature like FastExport's (to create an ML script).
As for tdload, right now it is only used for inserts.
(The original concept behind the Easy Loader feature is to quickly and easily perform inserts from a flat file into a table without the need for a script.)

--SteveF

akd2k6 54 posts Joined 12/14
15 Dec 2014

Thanks Steve, I have gone through the TPT scripts from this forum and understood the approach.
1. Is there any way I can generate the DML of the file and save it to a flat file for future use?
2. Also, if I am creating a delimited unload file (say '|' separated), it fails for tables whose columns are not all varchar. Is there any solution? I am confused by the many solutions for this on the forum.
3. Is there any way I can unload the table with TPT and load the unloaded file with MultiLoad? In this case I need the DML, which I am not able to generate in TPT.

feinholz 1234 posts Joined 05/08
15 Dec 2014

3. Why would you want to export with TPT and load back in with MultiLoad? Why not just load back in with TPT?
 
2. For delimited output, it depends on which version of TPT you are using. Earlier versions of TPT only supported writing out delimited data in VARCHAR format, so you had to CAST in the SELECT statement, or use the SQL Selector operator and specify ReportMode='Y', which forces the DBS to convert all columns to character. For later versions (I think starting in 14.10 or 15.0), the DataConnector file writer will do the conversion for you, and the schema does not have to be comprised of all VARCHAR columns.
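For example, on an older version the CAST workaround would look something like this (the column names and lengths here are made up):
 
SelectStmt = 'SELECT CAST(col1 AS VARCHAR(11)), CAST(col2 AS VARCHAR(30)) FROM mydb.mytab;'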
 
1. Please explain why you need to save the DML away for future use.
 

--SteveF

akd2k6 54 posts Joined 12/14
27 Dec 2014

Hi Steve, I am now using TPT unload and load. I need to execute the below SQL:
SQL = 'SELECT *  FROM xxx.WZ1D02_BITPROC where CAST(fec_proceso as date format  'YYYY-MM-DD') >= '2014-12-02';'
 
I am keeping this in the job variable file but getting this error:

Teradata Parallel Transporter Version 14.10.00.05

TPT_INFRA: TPT02026: Error: Line 12 of Local Job Variable File:

Quoted string lacks a single quote terminator character (')

 

 

Can you please advise how I can write the SQL query in the job variable file? My current job variable file entry is as below, and it is completely dynamically generated before calling tbuild.

 

UsrId = 'xxxxx'
Pwd = 'xxxxx'
Tab = 'WZ1D02_BITPROC'
DB = 'xxx'
Tdp = 'xxx'
PrivateLog1 = 'file_writer_privatelog1_25559114'
PrivateLog2 = 'file_writer_privatelog2_25559114'
PrivateLog3 = 'file_writer_privatelog3_25559114'
Database_Table = 'xxx.WZ1D02_BITPROC'
OP_FILE ='WZ1D02_BITPROC.20141227200452.dat'
DIR_PATH = 'xxx/xxx/'
SQL = 'SELECT *  FROM xxx.WZ1D02_BITPROC where CAST(fec_proceso as date format 'YYYY-MM-DD') >= '2014-12-02';'

 

 

feinholz 1234 posts Joined 05/08
29 Dec 2014

A string attribute value must be enclosed in single quotes.
Thus, you need to escape the single quotes inside that string.
You need to double all of the single quotes inside the quoted string.
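Applied to your job variable, the line would become:
 
SQL = 'SELECT *  FROM xxx.WZ1D02_BITPROC where CAST(fec_proceso as date format ''YYYY-MM-DD'') >= ''2014-12-02'';'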
 

--SteveF

akd2k6 54 posts Joined 12/14
05 Jan 2015

Thanks Steve, it worked.
I am unloading the data with FastExport or TPT to a comma separated file using select * from tab; and trying to load the same unloaded csv file into the target table.
The source and target table columns are not all varchar; they can be char, date, timestamp, decimal, etc.
 
Can this loading be achieved by MultiLoad? If so, can you please give the options in MultiLoad?

feinholz 1234 posts Joined 05/08
05 Jan 2015

MultiLoad is only a load tool.
MultiLoad can load csv data. The record format is called "vartext".
In the schema definition, all columns must be defined as varchar in order to use vartext loading.
The MultiLoad Reference Manual has the necessary information for you.
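As a rough sketch, a vartext layout in a MultiLoad script looks like this (the field names here are hypothetical; see the manual for the exact syntax):
 
.LAYOUT DATA_LAYOUT;
.FIELD col1 * VARCHAR(20);
.FIELD col2 * VARCHAR(300);
 
.IMPORT INFILE datafile
   FORMAT VARTEXT ','
   LAYOUT DATA_LAYOUT
   APPLY label;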
 
Question: why are you unloading with FastExport (or TPT) and loading with MultiLoad? What is the ultimate goal here? And just fyi, you should always be using TPT.

--SteveF

Anonyme 16 posts Joined 12/14
06 Jan 2015

Hi Steve, hi everybody.
Our company is switching from Oracle to Teradata and, to tell you the truth, I am finding it confusing to work with TPT scripts instead of SQL*Loader.
Please, can you point me to some resources to read on this website or elsewhere so I can get up to speed as soon as possible with Teradata, and especially with TPT scripts?
Thank you.

feinholz 1234 posts Joined 05/08
06 Jan 2015

Well, this thread is probably not the proper place for this topic.
You should start a new thread/topic.
TPT is too large of a product to answer in a single post.
 
We have documentation for TPT in http://www.info.teradata.com/.
Look in the Tools And Utilities section.
We have a Reference Manual, User Guide and Quick Start Guide.
You can also email me at steven.feinholz@teradata.com.
 

--SteveF

Anonyme 16 posts Joined 12/14
07 Jan 2015

Thank you so much Steve, I am going to check the website you gave me.
For further information I will email you.
 

akd2k6 54 posts Joined 12/14
25 Jun 2015

Hi Steve, I am able to load the tables with the tdload utility as:
tdload -f ${UNLOAD_FILE}  -t ${TARGET_DB}.${TARGET_TAB_NAME}  -u ${UsrId} -p ${Pwd}  -h ${Tdp} -L ${COB_LOG_DIR} --TargetWorkingDatabase ${TARGET_UTIL_DB} -d '|' ${tdload_checkpoint_log_file}
But while loading into actual production the command fails, as it does not have access to create or drop tables in the actual prod database. On analysis I found that tdload is creating the error tables in the actual database, not in the working database we are passing. These temporary tables are not created in the utility database, where we do have permission; only the _log table is created in the utility database. Is there any way to create the temporary _ET and _UV tables in the utility database instead of the actual target database? If not, I think it is a huge drawback, as most prod applications will not have access to create and drop tables at run time, and we could not use this tool.

feinholz 1234 posts Joined 05/08
25 Jun 2015

Fully qualify the error table names.
 

--SteveF

akd2k6 54 posts Joined 12/14
25 Jun 2015

Can you please give an example of the command? I could not find any such option in tdload to specify the error table names.

akd2k6 54 posts Joined 12/14
25 Jun 2015

$ tdload
Usage: tdload -f filename -u username -t tablename
              [-h hostname] [-p password] [-c charset_id]
              [-d delimiter] [-j filename] [-L LogFilePath]
              [-r CheckpointDirectory] [-R RetryLimit]
              [-z CheckpointInterval] [-v] [-x] [--help] [JobName]

Short options description:

 

akd2k6 54 posts Joined 12/14
25 Jun 2015

In that help output, there is no option to specify the error table names.

feinholz 1234 posts Joined 05/08
25 Jun 2015

Use the long names.
--ErrorTable1
--ErrorTable2
 
And just make sure to put the databasename and table name.
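For example, building on your earlier command (the shell variables are your own placeholders):
 
tdload -f ${UNLOAD_FILE} -t ${TARGET_DB}.${TARGET_TAB_NAME} -u ${UsrId} -p ${Pwd} -h ${Tdp} --TargetWorkingDatabase ${TARGET_UTIL_DB} -d '|' --ErrorTable1 ${TARGET_UTIL_DB}.${ERROR_TBL1} --ErrorTable2 ${TARGET_UTIL_DB}.${ERROR_TBL2}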
 

--SteveF

akd2k6 54 posts Joined 12/14
01 Jul 2015

Thanks a lot Steve, it worked. Now I have hit a new issue while unloading a table with tbuild.
Command:
tbuild -f  ${COB_SCRIPTS_DIR}/tpt_unload.ksh -u "Pwd='xxxx'"  -v  /temp/INT_CRD_PORTFOLIO_MASTER_E_test.txt

 

but it is failing with the below error:

TPT_INFRA: TPT02638: Error: Conflicting data length for column(8) - "PRCH_DISC_LCYAMT". Source column's data length (16) Target column's data length (8).
SQL_SELECTOR: TPT15105: Error 13 in finalizing the table schema definition
SQL_SELECTOR: disconnecting sessions
SQL_SELECTOR: Total processor time used = '0.02965 Second(s)'
SQL_SELECTOR: Start : Wed Jul  1 14:31:07 2015
SQL_SELECTOR: End   : Wed Jul  1 14:31:08 2015
FILE_WRITER: TPT19221 Total files processed: 0.
Job step setup_export_to_file terminated (status 12)

 

Can you please advise on this? The source tables are correct and they have correct data.

 

DDL of that column: PRCH_DISC_LCYAMT DECIMAL(23,5),...
 
Sample data in that column:
 
PRCH_DISC_LCYAMT
0.07390
0.00890
0.02190
0.00860

 

 

akd2k6 54 posts Joined 12/14
01 Jul 2015

For your reference, the TPT script is as follows:
 

DEFINE JOB LOAD_EMPLOYEE_TABLE_FROM_FILE
DESCRIPTION 'EXPORT SAMPLE WZ1D02_BITPROC TABLE TO A FILE'
(
 DEFINE SCHEMA WZ1D02_BITPROC_SCHEMA FROM TABLE @Database_Table;

   DEFINE OPERATOR DDL_OPERATOR()
   DESCRIPTION 'TERADATA PARALLEL TRANSPORTER DDL OPERATOR'
   TYPE DDL
   ATTRIBUTES
   (
      VARCHAR PrivateLogName = @PrivateLog1,
      VARCHAR TdpId          = @Tdp,
      VARCHAR UserName       = @UsrId,
      VARCHAR UserPassword   = @Pwd,
      VARCHAR AccountID,
      VARCHAR ErrorList      = '3807'
   );

   DEFINE OPERATOR FILE_WRITER()
   DESCRIPTION 'TERADATA PARALLEL TRANSPORTER DATA CONNECTOR OPERATOR'
   TYPE DATACONNECTOR CONSUMER
   SCHEMA WZ1D02_BITPROC_SCHEMA
   ATTRIBUTES
   (
      VARCHAR PrivateLogName    = @PrivateLog2,
      VARCHAR DirectoryPath     = @DIR_PATH,
      VARCHAR FileName          = @OP_FILE,
      VARCHAR IndicatorMode     = 'N',
      VARCHAR OpenMode          = 'Write',
      VARCHAR Format            = 'Delimited',
      VARCHAR TextDelimiter     = '|'
   );

   DEFINE OPERATOR SQL_SELECTOR
   TYPE SELECTOR
   SCHEMA WZ1D02_BITPROC_SCHEMA
   ATTRIBUTES
   (
      VARCHAR PrivateLogName = @PrivateLog3,
      VARCHAR TdpId= @Tdp,
      VARCHAR UserName= @UsrId,
      VARCHAR UserPassword= @Pwd,
      VARCHAR SelectStmt= @SQL,
      VARCHAR ReportMode='Y'
   );

   STEP setup_export_to_file
   (
      APPLY TO OPERATOR (FILE_WRITER() [1] )
      SELECT * FROM OPERATOR (SQL_SELECTOR);
   );
);

 

feinholz 1234 posts Joined 05/08
01 Jul 2015

In the operator definition for extracting data from Teradata, make sure to set MaxDecimalDigits to 38 (or whatever the maximum precision of your decimal columns is).
We default the precision to 18.

--SteveF

akd2k6 54 posts Joined 12/14
01 Jul 2015

Which operator, Steve, out of those 3 in my script? Also, can you please help with the syntax?

feinholz 1234 posts Joined 05/08
01 Jul 2015

I only see one operator that extracts data out of Teradata.
The SQL Selector operator.
Please check the documentation; you will want to add something like:
INTEGER MaxDecimalDigits=38
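In your script that is the SQL_SELECTOR definition; with the new attribute it would read:
 
   DEFINE OPERATOR SQL_SELECTOR
   TYPE SELECTOR
   SCHEMA WZ1D02_BITPROC_SCHEMA
   ATTRIBUTES
   (
      VARCHAR PrivateLogName = @PrivateLog3,
      VARCHAR TdpId= @Tdp,
      VARCHAR UserName= @UsrId,
      VARCHAR UserPassword= @Pwd,
      VARCHAR SelectStmt= @SQL,
      VARCHAR ReportMode='Y',
      INTEGER MaxDecimalDigits=38
   );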
 

--SteveF

akd2k6 54 posts Joined 12/14
06 Jul 2015

Thanks Steve, it worked, but now I have run into another issue.
My requirement is: I have to unload data from a production table and load it into a COB table. The production data can involve inserts, updates, deletes, etc.
Also this is not for only one table; rather, I have to do it with one generic script, passing only the table name and letting the script take care of the rest. Unloading to an ascii delimited file is also a requirement, not a direct table-to-table copy; I have to unload to a file and then load.
Can you please advise on the best approach, given that the table structure, name, etc. are not known beforehand and everything should be parameterized.
 
I did it like this:
1. Created one unload script:

tbuild -f  ${WORKING_DIR}/tpt_unload.ksh -u "Pwd='${Pwd}'"  -v $tpt_job_variable_file ${tdunload_checkpoint_log_file} -n|tee -a ${LOG_NAME} ${tpt_unload_log_file}

 

tpt_unload.ksh

DEFINE JOB LOAD_EMPLOYEE_TABLE_FROM_FILE
DESCRIPTION 'EXPORT SAMPLE WZ1D02_BITPROC TABLE TO A FILE'
(
 DEFINE SCHEMA WZ1D02_BITPROC_SCHEMA FROM TABLE @Database_Table;

   DEFINE OPERATOR DDL_OPERATOR()
   DESCRIPTION 'TERADATA PARALLEL TRANSPORTER DDL OPERATOR'
   TYPE DDL
   ATTRIBUTES
   (
      VARCHAR PrivateLogName = @PrivateLog1,
      VARCHAR TdpId          = @Tdp,
      VARCHAR UserName       = @UsrId,
      VARCHAR UserPassword   = @Pwd,
      VARCHAR AccountID,
      VARCHAR ErrorList      = '3807'
   );

   DEFINE OPERATOR FILE_WRITER()
   DESCRIPTION 'TERADATA PARALLEL TRANSPORTER DATA CONNECTOR OPERATOR'
   TYPE DATACONNECTOR CONSUMER
   SCHEMA WZ1D02_BITPROC_SCHEMA
   ATTRIBUTES
   (
      VARCHAR PrivateLogName    = @PrivateLog2,
      VARCHAR DirectoryPath     = @DIR_PATH,
      VARCHAR FileName          = @OP_FILE,
      VARCHAR IndicatorMode     = 'N',
      VARCHAR OpenMode          = 'Write',
      VARCHAR Format            = 'Delimited',
      VARCHAR TextDelimiter     = '|'
   );

   DEFINE OPERATOR SQL_SELECTOR
   TYPE SELECTOR
   SCHEMA WZ1D02_BITPROC_SCHEMA
   ATTRIBUTES
   (
      VARCHAR PrivateLogName = @PrivateLog3,
      VARCHAR TdpId= @Tdp,
      VARCHAR UserName= @UsrId,
      VARCHAR UserPassword= @Pwd,
      VARCHAR SelectStmt= @SQL,
      VARCHAR ReportMode='Y',
      INTEGER MaxDecimalDigits=38
   );

   STEP setup_export_to_file
   (
      APPLY TO OPERATOR (FILE_WRITER() [1] )
      SELECT * FROM OPERATOR (SQL_SELECTOR);
   );
);

 

 

2. Loaded the unloaded file with:
 
tdload -f ${UNLOAD_FILE} -t ${TARGET_DB}.${TARGET_TAB_NAME} -u ${UsrId} -p ${Pwd} -h ${Tdp} -L ${COB_LOG_DIR} --TargetWorkingDatabase ${TARGET_UTIL_DB} -d '|' --ErrorTable1 ${TARGET_UTIL_DB}.${ERROR_TBL1} --ErrorTable2 ${TARGET_UTIL_DB}.${ERROR_TBL2} --LogTable ${LOG_TBL} --TargetMaxSessions ${TD_UTIL_MAX_SESSN} --TargetMinSessions ${TD_UTIL_MIN_SESSN} ${tdload_checkpoint_log_file}|tee -a $LOG_NAME ${tdload_log_file}

 

I am not able to meet the below requirements:
 
1. tdload is not able to load duplicate records even though my table is NUPI.
 
2. Upsert is not handled the way we can do it in MultiLoad. With MultiLoad we can handle it because fexp automatically creates the mload script, and in that we get the schema, field names, etc.
 
Can you please advise which approach will meet the requirements. The most important requirement is that the unload file must be ascii delimited, so it can be stored on tape for backup.

akd2k6 54 posts Joined 12/14
06 Jul 2015

Hi Steve, can you please advise if anything can be done about this problem:
tdload is not allowing me to load duplicate records even though my table is NUPI.

feinholz 1234 posts Joined 05/08
06 Jul 2015

Easy Loader (tdload) is meant to do simple jobs. Load delimited data from a file into a table.
And Easy Loader will use a set of rules to determine the best load protocol (the best/fastest operator for the job).
As soon as you introduce a lot of parameters and requirements, you will need to write a script.
Easy Loader does not currently have the ability for the user to force a particular operator.
Easy Loader does not know that you want to be able to load duplicates.
If you have the ability to write a script for the extract, you should write a script for the import.
 
We will continue to make improvements to Easy Loader and add the ability for the user to dictate more parameters and requirements, but it is not currently available.
 

--SteveF

akd2k6 54 posts Joined 12/14
06 Jul 2015

Yes Steve, I know the scripting. But I got stuck at the schema and the insert/update queries, which need to be hardcoded or read from a file. For this I would have to create all the queries and schema files manually instead of generating them automatically. Is there any way I can use that unloaded file and load it without manually creating the schema and insert queries? I have no problem creating a complex script; the only challenge is creating the schema and queries manually for all the tables.

feinholz 1234 posts Joined 05/08
06 Jul 2015

What version of TPT are you using?
You can use the same commands for defining the schema for the IMPORT as you use for the Export. The data would have the same layout.
And when loading from a delimited data file into a table, the data has to match the target table.
So you can use the schema of the target table to define the layout of the data in the file.
Use:
DEFINE SCHEMA <name> FROM TABLE <target table name>;

--SteveF

akd2k6 54 posts Joined 12/14
06 Jul 2015

I am using Teradata 14.10. Can you please help me with the script for loading?

feinholz 1234 posts Joined 05/08
06 Jul 2015

Well, first of all you can eliminate the need for specifying the operator definitions in the script.
Use templates instead.
So, you can have a script that does this:
   DEFINE JOB <job name>
   (
      DEFINE SCHEMA <schema name> FROM TABLE <target table name>;
      STEP setup_step
      (
         APPLY
            <some commands>
         TO OPERATOR ($DDL);
      );
      STEP load_step
      (
         APPLY
            <DML statements>
         TO OPERATOR ($UPDATE)
         SELECT * FROM OPERATOR ($FILE_READER);
      );
   );
 
I used the UPDATE operator in this example because you might want to load duplicates, and the Load operator cannot do that.
In the job variable file, use the job variables from the template files.
The template files can be found in:
   <twb-root-install-directory>/templates
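As a sketch, the matching job variable file might contain entries like these (the exact variable names vary by TPT version; the template files themselves are the authoritative list):
 
   TargetTdpId         = 'xxx'
   TargetUserName      = 'xxxxx'
   TargetUserPassword  = 'xxxxx'
   TargetTable         = 'xxx.WZ1D02_BITPROC'
   SourceDirectoryPath = 'xxx/xxx/'
   SourceFileName      = 'WZ1D02_BITPROC.20141227200452.dat'
   SourceFormat        = 'Delimited'
   SourceTextDelimiter = '|'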
 
 

--SteveF

akd2k6 54 posts Joined 12/14
06 Jul 2015

I went through some scripts available on the portal but found two things hardcoded in all of them:
1. the schema
2. the insert/update queries with table-specific fields.
My source and target tables have the same DDL. Can you please help me with a load script that makes these two dynamic?
 
 

akd2k6 54 posts Joined 12/14
06 Jul 2015

What are the <some commands> and <DML statements>?
Shouldn't these two be dynamic?

feinholz 1234 posts Joined 05/08
06 Jul 2015

Everything in a TPT script is substitutable.
Even the Insert statement.
You can do this if you want us to generate the INSERT statement for you:
   APPLY $INSERT TO OPERATOR ($LOAD)
 
For an UPDATE statement you have to generate it, but you can use a variable for that too:
   APPLY @my_update_dml_stmt TO OPERATOR ($UPDATE)
 
And create the UPDATE statement in the job variable file, or on the command line. Just make sure the UPDATE statement is in single quotes.
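As a minimal upsert sketch (the table and column names here are made up; check the APPLY syntax in the Reference Manual for the exact form):
 
   APPLY
      ('UPDATE mydb.mytab SET col2 = :col2 WHERE col1 = :col1;',
       'INSERT INTO mydb.mytab (col1, col2) VALUES (:col1, :col2);')
      INSERT FOR MISSING UPDATE ROWS
   TO OPERATOR ($UPDATE)
   SELECT * FROM OPERATOR ($FILE_READER);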
 
 
 

--SteveF

akd2k6 54 posts Joined 12/14
06 Jul 2015

Looks like I am very near to what I need. Can you please help with the values of these two, say for a table employee:
1. <some commands>
2. <DML statements>

feinholz 1234 posts Joined 05/08
06 Jul 2015

<some commands> will just be any commands you need to use to create the target table.
If you do not need them, then do not use the DDL operator.
I already wrote to you about the DML statements. That part is up to you.

--SteveF

akd2k6 54 posts Joined 12/14
06 Jul 2015

Sorry Steve, I am totally lost. I think I am very near, but lost, and need your help. Can you please help with the whole script that I would use for the load where I need to load duplicate records? For the update, do I need to write an update query, or will it be an insert query with the Update operator? Totally lost. :)
It would be great if you could paste the whole script that I can use. Sorry to bother you.

feinholz 1234 posts Joined 05/08
06 Jul 2015

I cannot paste the whole script (I have a lot of other work I need to get to).
I have already provided almost the entire script.
For now, you will need to put the entire INSERT or UPDATE statement in the script and get your job working, and then experiment with how to parameterize it.
You do not have to use an UPDATE statement if all you are doing is loading data into a table. You can use an INSERT statement. And you can use the $INSERT syntax if you want TPT to generate the INSERT statement for you.
If you need to keep duplicate rows, then use $UPDATE for the Update operator instead of $LOAD for the Load operator (the Load operator will cause Teradata to throw away the duplicate records).
 

--SteveF

TDThrottle 51 posts Joined 11/11
06 Jul 2015

Hi Abhishek,
For TPT sample scripts, refer to $TERADATA_HOME/Teradata Parallel Transporter/sample/userguide on your client system.
Thanks!!

akd2k6 54 posts Joined 12/14
09 Jul 2015

sure Steve, Thanks a lot.

akd2k6 54 posts Joined 12/14
09 Jul 2015

Hi Steve, I am facing a new problem. I am trying this same activity with fexp and mload, since fexp itself creates the mload script for me with the proper DML, so I don't need to think about the DML, queries, etc. I just change the database and it works. But I have got stuck on a unicode column issue.
I am trying to unload data from one table to a file with a fexp script and then load the file into another table with the same DDL using mload.
But records are getting rejected and sent to the error table when I hit a column that is defined as unicode and contains special characters.
Column:
MERCH_NM_TXT VARCHAR(100) CHARACTER SET UNICODE NOT CASESPECIFIC,
fexp script
-------------
fexp -c UTF8   << EOD
.RUN FILE $TD_SOURCE_LOGIN_FILE;
.LOGTABLE ${SOURCE_UTIL_DB}.${FEXP_UTIL_TBL_NM};
.BEGIN EXPORT
SESSIONS ${TD_UTIL_MAX_SESSN} ${TD_UTIL_MIN_SESSN};
LOCKING $SOURCE_DB.${SOURCE_TAB_NAME} FOR ACCESS
$UNLD_SQL
.EXPORT
        OUTFILE $TARGET_FILE
        MLSCRIPT $MLOAD_SCRIPT;
.END EXPORT;
.LOGOFF;
EOD
 
mload script
cat $mloadscript_file |mload -c "UTF-8"
 
Most of the records are loaded, but a few get rejected where some arrow-like character (similar to ->, though not exactly that; I can't paste it here) is present.
 
Can you please advise how I can load all the data successfully. I don't think there is a problem with the data, as the data is already present in one table and I am just unloading it and loading it into another table with the same DDL.

feinholz 1234 posts Joined 05/08
09 Jul 2015

Look at the MultiLoad script.
(I do not know how FastExport generates it, so you may want to post it here if you want)
The database defines column sizes in terms of "characters".
MultiLoad works with data in "bytes".
With a session character set of UTF8, each character can take up 3 bytes of storage (on the client side).
Therefore, look at the MultiLoad script and make sure the column, defined as VARCHAR(100) in Teradata, is defined as VARCHAR(300) in the MultiLoad script.
And make sure the script (and/or the command line) specifies that you want to use the UTF8 client session character set.
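For example, if the generated layout contains (illustrative sketch):
 
   .FIELD MERCH_NM_TXT * VARCHAR(100);
 
change it to:
 
   .FIELD MERCH_NM_TXT * VARCHAR(300);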
 

--SteveF

akd2k6 54 posts Joined 12/14
13 Jul 2015

Hi Steve, I have hit a new issue with the date format in tbuild. This is the output of a simple bteq query on that table:
select D_CLOSE_DATE from schema.table;
D_CLOSE_DATE
------------
    01/01/01
    15/07/07
    01/01/01
    15/07/07
    15/07/03
           ?
           ?
    15/07/07
    01/01/01
    15/07/07
    15/07/03
           ?
           ?
    15/07/07
    15/07/03
           ?
    01/01/01
    15/07/07
    15/07/03
           ?
           ?
    15/07/07
           ?
    01/01/01
           ?
    15/07/03
--------------------

Now I am unloading the data with tbuild and loading with tdload, but tdload is sending the five 01/01/01 records above to the error table.
When I checked the unload file, those values were not unloaded into the file correctly. They were unloaded as below, and they error out during the load. Can you please advise the reason and a solution.

 

awk -F'^^' '{print $9}' $unload_file
1/01/
2015/07/07
1/01/
2015/07/07
2015/07/03
 
 
2015/07/07
1/01/
2015/07/07
2015/07/03
 
 
2015/07/07
2015/07/03
 
1/01/
2015/07/07
2015/07/03
 
 
2015/07/07
 
1/01/
 
2015/07/03
---------------------
(The blank lines in the awk output correspond to the NULL rows.)

File definition:
 
  D_CLOSE_DATE DATE FORMAT 'YY/MM/DD',

feinholz 1234 posts Joined 05/08
13 Jul 2015

In the future, please start a new thread with a subject that describes the problem. This makes it easier to track the issues, and for others to peruse and look at in case they have the same issues. Putting all questions into a generic "Teradata PT" thread is not ideal.
 
You will have to provide the script and the value in the database, for the EXPORT only.
 

--SteveF

akd2k6 54 posts Joined 12/14
13 Jul 2015

Hi Steve, new thread has been created.
http://forums.teradata.com/forum/tools/unloaded-date-value-issue-by-tpt-script

ashish089 3 posts Joined 06/15
28 Jul 2015

I am facing an issue while loading data to Teradata using SSIS.
I have an XML file to load, and as the destination object I am using "Teradata Destination", so with the XML parser as source there are 21 tags to be loaded.
If I try to load all 21 in parallel it returns an error, but when I try to load only 13 of the 21 it works fine.
 
SSIS internally uses TPT load to load the data.
 
Error message:
 
"Error 5584: TPT Import error encountered during Initiate phase. CLI2: SESSOVER(301): Exceeded max number of sessions allowed."
 
Please suggest!

feinholz 1234 posts Joined 05/08
28 Jul 2015

Check the number of sessions you are trying to connect versus the max number of sessions allowed in the CLI config file "clispb.dat".
You may need to increase the value for max_num_sess (I think that is the name) in the clispb.dat file.
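For example, the relevant line in clispb.dat would look something like this (verify the exact keyword in your own file):
 
   max_num_sess=200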
 

--SteveF

Rodolfo So 2 posts Joined 06/15
03 Aug 2015

Same issue for me. Please suggest! Thanks!
--Rod


venkata_k01 24 posts Joined 07/16
14 Jul 2016

Hi,
I am putting the below SELECT statement in the job variable file, but even though there are around 2K records in the source table, when I do the data copy from source to target table using tdload it does not fetch any records.
 
SelectStmt='SELECT * FROM SOURCEDB.SOURCETABLE WHERE ROW_EFF_DT >= ''2016-07-10'' AND ROW_EFF_DT <=''2016-07-12'''
 
I am using doubled single quotes before and after the date values, but it is still not working.
I am using TPT version 15.10.00.03.
 
Thanks
 

Fred 1096 posts Joined 08/04
14 Jul 2016

What is the data type / format of ROW_EFF_DT (for implicit data type conversion)? If it is a DATE column, it is better to explicitly use a DATE constant: the keyword DATE in front of the opening (doubled) single quote.
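For example, applied to the SelectStmt above:
 
SelectStmt='SELECT * FROM SOURCEDB.SOURCETABLE WHERE ROW_EFF_DT >= DATE ''2016-07-10'' AND ROW_EFF_DT <= DATE ''2016-07-12'''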

venkata_k01 24 posts Joined 07/16
14 Jul 2016

$EXPORT: TPT10508: RDBMS error 3706: Syntax error: expected something between the 'DATE' keyword and the integer '2016'.
 
Hi Fred,
The format is YYYY-MM-DD.
I tried putting the DATE keyword before the doubled single quotes, but I am getting the above error.
 
Thanks,
Hanu
