varshajoshi 5 posts Joined 05/12
21 May 2012
Loading using Teradata Parallel Transporter Load utility.

Hi,

I am working on parameterizing the TPTLOAD job script that loads a target table in Teradata. I have a schema file where the target table schema has been defined. I now want to pass the schema file name to the TPTLOAD job script dynamically, i.e., as a parameter as part of the tbuild command.

For instance we have the TPTLOAD job script as

USING CHARACTER SET UTF16
DEFINE JOB FILE_LOAD
DESCRIPTION 'Load a Teradata table from a file'
(
INCLUDE 'filename.schema';

DEFINE OPERATOR DDL_OPERATOR
TYPE DDL
ATTRIBUTES (
VARCHAR TdpId = @tdpid,
VARCHAR UserName = @userid,
VARCHAR UserPassword = @password,

.....................

In the above example we have the INCLUDE 'filename.schema'; statement, where we need to pass the filename as a parameter, just as we pass VARCHAR UserName = @userid through the tbuild command using the -u option.

Please help me with this......

Thanks in advance :)

feinholz 1234 posts Joined 05/08
22 May 2012

Have you experimented?

Job variables can be used anywhere. So, just like how you can use it for UserName, you can use it for the INCLUDE statement.

     INCLUDE @my_file_name;

and:

tbuild -f <script> -u "my_file_name='some-file.txt'"

Try it and see if it works.
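If there are several such variables, a job variables file passed with the -v option keeps the command line short. A sketch, with hypothetical variable names and values:

```
/* jobvars.txt (hypothetical file) */
my_file_name = 'some-file.schema',
tdpid        = 'mysystem',
userid       = 'myuser',
password     = 'mypassword'
```

and then:

    tbuild -f <script> -v jobvars.txt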

 

--SteveF

varshajoshi 5 posts Joined 05/12
01 Jun 2012

Hi,

I have the following TPT export script that uses the field report mode feature to export data from the Teradata Database and transform it into delimited VARTEXT format using the DataConnector operator:

    DEFINE JOB EMPLOYEE_SOURCE_LOAD
    DESCRIPTION 'DATA EXPORT JOB'
    (
      DEFINE OPERATOR FILE_WRITER
      DESCRIPTION 'DataConnector WRITER'
      TYPE DATACONNECTOR CONSUMER
      SCHEMA *
      ATTRIBUTES
      (
         VARCHAR PrivateLogName = 'dataconlog',
         VARCHAR DirectoryPath  = '.',
         VARCHAR FileName       = 'dataf.txt',
         VARCHAR IndicatorMode  = 'N',
         VARCHAR OpenMode       = 'Write',
         VARCHAR Format         = 'Delimited'
      );

      DEFINE SCHEMA EMPLOYEE_SOURCE_SCHEMA
      DESCRIPTION 'EMPLOYEE INFORMATION SCHEMA'
      (
         EmpNo    VARCHAR(11),
         EmpName  VARCHAR(12),
         DeptNo   VARCHAR(3),
         JobTitle VARCHAR(12),
         Salary   VARCHAR(10),
         YrsExp   VARCHAR(2),
         DOB      VARCHAR(11),
         Sex      VARCHAR(1),
         Race     VARCHAR(1),
         MStat    VARCHAR(1),
         EdLev    VARCHAR(2),
         HCap     VARCHAR(2)
      );

      DEFINE OPERATOR SQL_SELECTOR
      DESCRIPTION 'SQL OPERATOR'
      TYPE PRODUCER
      SCHEMA EMPLOYEE_SOURCE_SCHEMA
      ATTRIBUTES
      (
         VARCHAR PrivateLogName,
         VARCHAR UserName     = 'MyUser',
         VARCHAR UserPassword = 'MyPassword',
         VARCHAR TdpId        = 'MyDatabase',
         VARCHAR ReportModeOn = 'Y',
         VARCHAR SelectStmt   = 'select * from personnel.employee;'
      );

      LOAD INTO OPERATOR ( FILE_WRITER[1] )
      SELECT * FROM OPERATOR ( SQL_SELECTOR[1] );
    );

 

But upon running the tbuild command, I encounter this error:

 

Teradata Parallel Transporter Version 13.10.00.05
TPT_INFRA: Syntax error at or near line 43 of Job Script File 'rrs_city_readableoutfile.ctl':
TPT_INFRA: At "ATTRIBUTES" missing EXTERNAL_ in Rule: DEFINE OPERATOR
Compilation failed due to errors. Execution Plan was not generated.
Job script compilation failed.
Job terminated with status 8.
 

Please suggest why this is happening, or what a possible solution might be.

 

Varsha Joshi

 

feinholz 1234 posts Joined 05/08
01 Jun 2012

If you look in the TPT 13.10 User Guide, page 446, there is an example script for what you want to do.

When using the SQL Selector operator, you need to specify:

   TYPE SELECTOR

instead of:

   TYPE PRODUCER

When you use "TYPE PRODUCER", TPT is expecting you to provide the "external name" for the operator library file. We do not use that anymore.

By specifying "TYPE SELECTOR", TPT knows which operator file to load dynamically.
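Applied to the script above, the corrected definition would look like this (a sketch using the same attribute values as in the original post):

```
DEFINE OPERATOR SQL_SELECTOR
DESCRIPTION 'SQL OPERATOR'
TYPE SELECTOR
SCHEMA EMPLOYEE_SOURCE_SCHEMA
ATTRIBUTES
(
   VARCHAR PrivateLogName,
   VARCHAR UserName     = 'MyUser',
   VARCHAR UserPassword = 'MyPassword',
   VARCHAR TdpId        = 'MyDatabase',
   VARCHAR ReportModeOn = 'Y',
   VARCHAR SelectStmt   = 'select * from personnel.employee;'
);
```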

 

--SteveF

varshajoshi 5 posts Joined 05/12
07 Jun 2012

Thanks, it worked! :)

varshajoshi 5 posts Joined 05/12
15 Jun 2012

Hi ,

I need to know whether WildcardInsert = 'Y' can be used in the Stream operator.

The TPT User Guide describes WildcardInsert as follows:

VARCHAR WildcardInsert /* 'Y' builds the INSERT statement from the target table definition, similar to a HELP TABLE statement in a FastLoad script. Use when the input file schema exactly matches the table schema. Permits the INSERT portion of the APPLY statement to simply specify INSERT INTO tablename, which reduces the amount of user coding. */

I have a TPT Stream job script where I want to set WildcardInsert to 'Y', but the load fails every time with:

STREAM_OPERATOR: TPT10508: RDBMS error 3707: Syntax error, expected something like a 'SELECT' keyword or a 'VALIDTIME' keyword or a 'NONTEMPORAL' keyword or 'AS' keyword be

I now doubt whether WildcardInsert can be used in the Stream operator at all! If not, what could be an alternate solution?

Please help....

feinholz 1234 posts Joined 05/08
15 Jun 2012

No, the WildcardInsert feature is only for the Load operator (which only uses a single DML statement).

The documentation (13.10 version I just looked at) seemed pretty clear to me that it is not supported for the Stream operator or Update operator but is supported for the Load operator.
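The alternative for the Stream operator is simply to code the full INSERT statement in the APPLY. A sketch, with hypothetical table, column, and operator names:

```
APPLY
(
   /* spell out the column list instead of relying on WildcardInsert */
   'INSERT INTO mydb.mytable (col1, col2, col3)
    VALUES (:col1, :col2, :col3);'
)
TO OPERATOR (STREAM_OPERATOR[1])
SELECT * FROM OPERATOR (FILE_READER[1]);
```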

--SteveF

rolidev 4 posts Joined 10/12
27 Oct 2012

 

 

 

How to use TPT to load all the fields of a file (one line per row) into a single column of a table

I have multiple files to be loaded into a table using tptload. The scenario is: my table has two columns (one VARCHAR(100) and another VARCHAR(50000)).

In the first column, I want the name of the source file inserted for every row loaded from that file.

In the second column, I want to insert the data from all the fields of a single row of that file. So basically, I want the TextDelimiter in the tptload script to be the newline character.

I am not sure what TextDelimiter to set here.

 

Please refer to the example below:

 

First File data (filename abcd.txt)

 

a#b#c#d

e#f

j#k#l

h

 

Second file data (filename efgh.txt)

j#k#l#m

n#d

 

In table

First column Second column

abcd.txt      a#b#c#d

abcd.txt      e#f

abcd.txt      j#k#l

abcd.txt      h

 

efgh.txt      j#k#l#m

efgh.txt      n#d

 

Can anyone provide any solution for this?

feinholz 1234 posts Joined 05/08
31 Oct 2012

Did you try a record format of "unformatted" instead of "binary"?

--SteveF

feinholz 1234 posts Joined 05/08
31 Oct 2012

Re: the post about the 2 columns and loading all of the data into a single column:
TPT has a feature whereby you indicate in the schema a column that is a metadata column specifically for the file name, and TPT will insert the file name into that column for the user.
Check the documentation for the "metadata" keyword in the schema object. I think it is something like:

    DEFINE SCHEMA
    (
        Column_name   VARCHAR(100) METADATA(FILENAME),
        Column_data      VARCHAR(50000)
    )

TPT does not support a newline character as a field delimiter.
But pick a delimiter character (or a sequence of characters) that you know will not be part of the data, and that should be sufficient.
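Putting the two suggestions together, the reader side for the two-column scenario might look like this. This is a sketch: the '~|~' delimiter is an arbitrary example of a sequence assumed never to occur in the data, so each whole input line lands in the single data column:

```
DEFINE SCHEMA FILE_SCHEMA
(
   Column_name VARCHAR(100) METADATA(FILENAME),  /* TPT fills in the source file name */
   Column_data VARCHAR(50000)
);

DEFINE OPERATOR FILE_READER
TYPE DATACONNECTOR PRODUCER
SCHEMA FILE_SCHEMA
ATTRIBUTES
(
   VARCHAR DirectoryPath = '.',
   VARCHAR FileName      = '*.txt',     /* picks up abcd.txt, efgh.txt, ... */
   VARCHAR Format        = 'Delimited',
   VARCHAR TextDelimiter = '~|~'        /* assumed never to appear in the data */
);
```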
 

--SteveF

13 Nov 2012

This is the same post I made before the above two posts.
+++++++++++++++++++++++++++++++++++++++++++++++++
Hi,
I am exporting data from Teradata in binary format using the TPT Export operator. The exported data will be input for the mainframe.
But while loading the data on the mainframe, the mainframe team says the exported data is not coming through in the correct format. These are the issues they raised:

  1. Two bytes of record information are being appended at the lead position of each record.
  2. The generated file isn't consistent.

Please find my TPT script and the layout of the binary file below; please let me know how we can resolve this issue.
 
--------------TPT EXPORT SCRIPT---------------------------------
DEFINE JOB EXPORT_TO_FILE_SALESBYTYPE
DESCRIPTION 'Export to tgt_file EFD.SLS.BY.TYPE.OCT11.dat for SALES BY TYPE'
(
DEFINE SCHEMA PRODUCT_SOURCE_SCHEMA
(
      DST_PROC_WK_END_D CHAR(10),
      DST_DEPT_I SMALLINT,
      DST_CLASS_I SMALLINT,
      DST_LOC_I SMALLINT,
      DST_INV_TYPE_C CHAR(3),
      DST_SLS_Q BIGINT,
      DST_SLS_A DECIMAL(18,2)
);
DEFINE OPERATOR EXPORT_OPERATOR
TYPE EXPORT
SCHEMA PRODUCT_SOURCE_SCHEMA
ATTRIBUTES
(
VARCHAR PrivateLogName = 'export_log',
VARCHAR TdpId = 'TDDEVA',
VARCHAR UserName = 'S**********',
VARCHAR UserPassword = '***********',
VARCHAR SelectStmt = 'SELECT CAST(X.SLS_D AS CHAR(10)),
                             Z.MDSE_DEPT_REF_I,
                             Z.MDSE_CLAS_REF_I,
                             Y.CO_LOC_REF_I,
                             X.SLS_TYPE_C,
                             CAST (X.SLS_UNIT_Q AS BIGINT),
                             CAST ( X.EXT_SLS_PRC_A AS DECIMAL(18,2))
                      FROM   TEST7_V.MDSE_SLSTR_ITEM_LINE X,
                             TEST20_V.CO_LOC Y,
                             TEST20RPT_V.MDSE_ITEM_DIM_V Z
                      WHERE  X.MDSE_ITEM_I=Z.MDSE_ITEM_I
                      AND   X.CO_LOC_I=Y.CO_LOC_I
                      AND   Y.CO_LOC_REF_I=3
                      AND   X.SLS_D =1120923 ;'
);
DEFINE OPERATOR FILE_WRITER
TYPE DATACONNECTOR CONSUMER
SCHEMA *
ATTRIBUTES
(
VARCHAR PrivateLogName = 'dataconnector_log',
VARCHAR DirectoryPath = 'C:\scripts\Other-TPT',
VARCHAR FileName = 'EFD.SLS.BY.TYPE.OCT11.dat',
VARCHAR Format = 'Binary',
VARCHAR OpenMode = 'Write',
VARCHAR IndicatorMode = 'N'
);
APPLY TO OPERATOR (FILE_WRITER)
SELECT * FROM OPERATOR (EXPORT_OPERATOR[2]);
);
-------------------------
Below is the binary file layout:

    Layout:
    01 INSTOCKS-FILE.
        05 DEPT-I        PIC S9(04) USAGE COMP.
        05 CLASS-I       PIC S9(04) USAGE COMP.
        05 ITEM-I        PIC S9(04) USAGE COMP.
        05 STORE_I       PIC S9(04) USAGE COMP.
        05 OUT-OF-STK-F  PIC X(01).
        05 BEL-MIN-F     PIC X(01).
        05 WCI-F         PIC X(01).
        05 GOAL-PRCNT    PIC S9(03)V9(02) COMP-3.

 

Regards,
Sam

feinholz 1234 posts Joined 05/08
14 Nov 2012

Instead of binary, try exporting the data in "unformatted" format.
That record format will not add the 2-byte row length to the front of every row.
It will just write out the raw data.
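In the posted script, that is just a change to the writer's Format attribute (a sketch, keeping the other attributes from the original post):

```
DEFINE OPERATOR FILE_WRITER
TYPE DATACONNECTOR CONSUMER
SCHEMA *
ATTRIBUTES
(
   VARCHAR PrivateLogName = 'dataconnector_log',
   VARCHAR DirectoryPath  = 'C:\scripts\Other-TPT',
   VARCHAR FileName       = 'EFD.SLS.BY.TYPE.OCT11.dat',
   VARCHAR Format         = 'Unformatted',  /* raw data only; no 2-byte length prefix per record */
   VARCHAR OpenMode       = 'Write',
   VARCHAR IndicatorMode  = 'N'
);
```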

--SteveF

feinholz 1234 posts Joined 05/08
14 Nov 2012

By the way, why are you exporting the data to a file on a non-mainframe platform, and then loading the data from a mainframe?
If I am misunderstanding your test scenario, please correct me.

--SteveF

rolidev 4 posts Joined 10/12
15 Nov 2012

Hi,
Is there a mechanism to load files using a TPT operator and have a row number, in order, for each row of a file? Say I have a file abc.txt with 10 rows; then in the DB I would have a row number for each row, in order from 1 to 10.
Eg.
File: abc.txt
name id salary
abc 123 10000
bcd 234 3000
rfg 356 2000
total salary 15000
In the database:
locking row for access select * from emp.emp_sal ;
rownum  emp_name            emp_id  emp_salary
1       name                id      sal
2       abc                 123     10000
3       bcd                 234     3000
4       rfg                 356     2000
5       total salary:15000
 

Rahul Sharma 1 post Joined 12/12
10 Dec 2012

Hi Steven Feinholz,
We have upgraded the Teradata client to V14.00, but there was an issue: it tried writing into our SAS config directory.
 
Now i want to check if TPT has been installed properly using tbuild and bteq. As i am new to teradata, i fail to understand the script and the file that is to be used in these commands.
 
It would be really nice of you if you could help me here.
 
Regards
Rahul Sharma

feinholz 1234 posts Joined 05/08
21 Dec 2012

This forum is not used to teach the client software. In order to assist, you will have to be more specific in your questions. I am not sure what you mean when you say "it" tried writing into your SAS config directory.
If the TTU software was installed without errors, I would think you can safely assume TPT and BTEQ were installed correctly, but I cannot say for sure because I am unaware of how you installed the software.
I am also unsure what you mean by, "i fail to understand the script and the file that is to be used in these commands". What commands?
If you want to learn how to use TPT, you will have to look up the information in the manuals. TPT does come with a quick start guide.
 

--SteveF

cianabarrueco 1 post Joined 02/12
09 Jan 2013

Hi Steven,
I attended a few of your sessions during the last Teradata Partners conference; very informative! Thanks.
I have a question for you related to the use of TPT load operator when inserting unicode characters.
In a TPT job I'm trying to insert rows into an empty Teradata table that has a column defined as
VARCHAR(4000) CHARACTER SET UNICODE NOT CASESPECIFIC
My source query in the APPLY specifies 1000 Unicode characters selected from the SQL table; the variable in the DEFINE SCHEMA section is VARCHAR(3000) (triple the length in bytes). I include "USING CHARACTER SET UTF8" as the first statement in the job, and also use UTF8 from the command prompt:
> tbuild  -f Myfile.txt -v Vars.txt -e UTF8 > MyLog.txt
The job completes without errors, but the data is not inserted into the table, which reports possible duplicates (to provide an example, I specify just one source row, containing Chinese characters):
W_1_o_Test: Total Rows Sent To RDBMS:      1
W_1_o_Test: Total Rows Applied:            0
W_1_o_Test: Total Possible Duplicate Rows: 1
The error table shows that the Unicode column is the cause of the "duplicate", which as I have seen before is really a mismatch in the data type.
Any suggestions??
Thanks in advance

feinholz 1234 posts Joined 05/08
11 Jan 2013

As always, you MUST tell me what version of TPT you are using. It is very difficult to know whether you are using a version that had a bug that has since been fixed.
Do not be confused by the "duplicate" message. That is a simple calculation: take the number of rows sent to the DBS, subtract the number of rows "applied", and subtract the number of rows in the error tables. If anything is left over, then we assume the rows must have been duplicates.
However, the version of TPT you are using does not output the number of rows in the error tables to the console.
Go look at the logs from the operators and see if your row ended up in the error table.
 
 
 

--SteveF

cyberbond 1 post Joined 07/12
03 Feb 2013

 

Hi,

I am using TPT to load a '|'-delimited file. However, the blank space between two delimiters should be loaded as NULL in the target CHAR field. I tried to achieve this using CASE WHEN as mentioned in the manual, but ended up with the following syntax error. Please help!

 

Teradata Parallel Transporter Version 14.00.00.04

TPT_INFRA: Syntax error at or near line 58 of Job Script File 'tpt_load1.tpt'

TPT_INFRA: At "SELECT" missing SEMICOL_ in Rule: STEP

Compilation failed due to errors. Execution Plan was not generated.

Job script compilation failed.

Job terminated with status 8.

 

The sample file looks like this,

 

1362549|  |0000000013625493|1993-03-22|

1362486|SV|0000000013624863|1989-07-13|

1362489|  |4561143213624894|1990-08-16|1995-09-18

 

Here is the script:

 

DEFINE JOB Load_accts
  DESCRIPTION 'Load a Teradata table from a file' (
      DEFINE SCHEMA Schema_accts (
          cust_id  VARCHAR
          ,acct_type  VARCHAR
          ,acct_nbr  VARCHAR
          ,acct_start_date  VARCHAR
          ,acct_end_date  VARCHAR
      );

      DEFINE OPERATOR DataConnector_accts
      TYPE DATACONNECTOR PRODUCER
      SCHEMA Schema_accts
      ATTRIBUTES (
          VARCHAR PrivateLogName = 'tduser.accts',
          VARCHAR FileName       = 'accts1.txt',
          VARCHAR TraceLevel     = 'All',
          VARCHAR Format         = 'Delimited',
          VARCHAR TextDelimiter  = '|',
          VARCHAR OpenMode       = 'read'
      );

      DEFINE OPERATOR Insert_accts
      TYPE INSERTER
      SCHEMA *
      ATTRIBUTES (
          VARCHAR PrivateLogName = 'tduser.accts',
          VARCHAR TdpId          = @TargetTdpId,
          VARCHAR UserName       = @TargetUserName,
          VARCHAR UserPassword   = @TargetUserPassword,
          VARCHAR TargetTable    = 'tduser.accts',
          VARCHAR LogTable       = 'tduser.accts_L',
          VARCHAR ErrorTable1    = 'tduser.accts_E1',
          VARCHAR ErrorTable2    = 'tduser.accts_E2',
          VARCHAR WorkTable      = 'tduser.accts_WT'
      );

      STEP Load_accts (
          APPLY (
              'INSERT INTO tduser.accts (
                     cust_id
                     ,acct_type
                     ,acct_nbr
                     ,acct_start_date
                     ,acct_end_date
              )
              VALUES (
                     :cust_id
                     ,:acct_type
                     ,:acct_nbr
                     ,:acct_start_date
                     ,:acct_end_date
              );'
          )
          TO OPERATOR (
              Insert_accts[1]
          )
          SELECT
          cust_id
          ,CASE WHEN TRIM(acct_type)='' THEN NULL
           ELSE acct_type
           END AS acct_type
          ,acct_nbr
          ,acct_start_date
          ,acct_end_date
          FROM OPERATOR (
              DataConnector_accts[1]
          );
      );
   );

feinholz 1234 posts Joined 05/08
07 Feb 2013

The SELECT statement is not a Teradata SELECT statement. It is a TPT SELECT.
Thus, not all SQL is supported (please check the documentation).
We do not support the TRIM function in TPT.
Any "space" between delimiters is considered data.
If you want the space character to imply the column should be NULLed, then do this:
   CASE WHEN acct_type = ' ' THEN NULL
   ELSE . . . . .
(put a space character in between the single-quotes)
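Applied to the script in question, the derived column in the SELECT would become something like this (a sketch; the literal must match the data exactly, and the sample file shows two spaces between the delimiters):

```
SELECT
cust_id
,CASE WHEN acct_type = '  ' THEN NULL   /* two literal spaces, matching the sample data */
 ELSE acct_type
 END AS acct_type
,acct_nbr
,acct_start_date
,acct_end_date
FROM OPERATOR (
    DataConnector_accts[1]
);
```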

--SteveF

bilal.farooq 28 posts Joined 08/10
11 Nov 2013

In the TPT script's schema section, can we use string functions on this field: METADATA(FileName)?
 
DEFINE SCHEMA T2_SOURCE_SCHEMA
DESCRIPTION 'T2 SCHEMA'
(
METACOL1 VARCHAR(40) METADATA(FileName),
A_IN_ID1 VARCHAR(4),
A_IN_C1 VARCHAR(10),
A_IN_C2 VARCHAR(10),
A_IN_C3 VARCHAR(10)
);
 
For example:
 
METACOL1 VARCHAR(40) SUBSTR(METADATA(FileName), 1, INDEX(METADATA(FileName), '|')-1 )
 
Regards,
Bilal

feinholz 1234 posts Joined 05/08
11 Nov 2013

No.

--SteveF

manharrishi 11 posts Joined 09/13
27 Aug 2015

Hi Steve,
Can you share any input on the question below, which was asked many days ago? The TTU version is 14.10.00.11.
Thanks
Manjeeth
 
Hi,
Is there a mechanism to load files using a TPT operator and have a row number, in order, for each row of a file? Say I have a file abc.txt with 10 rows; then in the DB I would have a row number for each row, in order from 1 to 10.
Eg.
File: abc.txt
name id salary
abc 123 10000
bcd 234 3000
rfg 356 2000
total salary 15000
In the database:
locking row for access select * from emp.emp_sal ;
rownum  emp_name            emp_id  emp_salary
1       name                id      sal
2       abc                 123     10000
3       bcd                 234     3000
4       rfg                 356     2000
5       total salary:15000
