akd2k6 54 posts Joined 12/14
13 Jul 2015
Unloaded DATE value issue with TPT script

Hi Steve, I am facing a new issue with the date format in a tbuild unload. This is the output of a simple BTEQ query on that table:

select D_CLOSE_DATE from schema.table;

D_CLOSE_DATE
------------
    01/01/01
    15/07/07
    01/01/01
    15/07/07
    15/07/03
           ?
           ?
    15/07/07
    01/01/01
    15/07/07
    15/07/03
           ?
           ?
    15/07/07
    15/07/03
           ?
    01/01/01
    15/07/07
    15/07/03
           ?
           ?
    15/07/07
           ?
    01/01/01
           ?
    15/07/03
--------------------

 

Now I am unloading the data with tbuild and loading it with tdload, but tdload is sending the five 01/01/01 records shown above to the error table.

When I checked the unload file, those values were not unloaded correctly. They appear as shown below, and they error out during the load. Can you please advise on the reason and a solution?

 

 

awk -F'^^' '{print $9}' $unload_file
1/01/
2015/07/07
1/01/
2015/07/07
2015/07/03


2015/07/07
1/01/
2015/07/07
2015/07/03


2015/07/07
2015/07/03

1/01/
2015/07/07
2015/07/03


2015/07/07

1/01/

2015/07/03
---------------------

Column definition-

 

D_CLOSE_DATE DATE FORMAT 'YY/MM/DD',

 

tpt job variable file content

-------------------------------

UsrId = 'xxx'
SOURCE_TAB_NAME = 'DRI_ADS_D'
SOURCE_DB = 'schema'
Tdp = 'xxx'
PrivateLog1 = 'file_writer_privatelog1_36045428'
PrivateLog2 = 'file_writer_privatelog2_36045428'
PrivateLog3 = 'file_writer_privatelog3_36045428'
Database_Table = 'schema.DRI_ADS_D'
OP_FILE = 'DRI_ADS_D_UNLOAD_201507030849.dat'
DIR_PATH = '/tmp'
SQL = 'select * FROM schema.DRI_ADS_D;'

 

unload execution

------------------

tbuild -e 'UTF-8' -f  ${WORKING_DIR}/tpt_unload.ksh -u "Pwd='${Pwd}'"  -v $tpt_job_variable_file

 

tpt_unload.ksh

--------------

DEFINE JOB LOAD_EMPLOYEE_TABLE_FROM_FILE
DESCRIPTION 'EXPORT SAMPLE WZ1D02_BITPROC TABLE TO A FILE'
(
   DEFINE SCHEMA WZ1D02_BITPROC_SCHEMA FROM TABLE @Database_Table;

   DEFINE OPERATOR DDL_OPERATOR()
   DESCRIPTION 'TERADATA PARALLEL TRANSPORTER DDL OPERATOR'
   TYPE DDL
   ATTRIBUTES
   (
      VARCHAR PrivateLogName = @PrivateLog1,
      VARCHAR TdpId          = @Tdp,
      VARCHAR UserName       = @UsrId,
      VARCHAR UserPassword   = @Pwd,
      VARCHAR AccountID,
      VARCHAR ErrorList      = '3807'
   );

   DEFINE OPERATOR FILE_WRITER()
   DESCRIPTION 'TERADATA PARALLEL TRANSPORTER DATA CONNECTOR OPERATOR'
   TYPE DATACONNECTOR CONSUMER
   SCHEMA WZ1D02_BITPROC_SCHEMA
   ATTRIBUTES
   (
      VARCHAR PrivateLogName    = @PrivateLog2,
      VARCHAR DirectoryPath     = @DIR_PATH,
      VARCHAR FileName          = @OP_FILE,
      VARCHAR IndicatorMode     = 'N',
      VARCHAR OpenMode          = 'Write',
      VARCHAR Format            = 'Delimited',
      VARCHAR TextDelimiter     = 'ÇÇ'
   );

   DEFINE OPERATOR SQL_SELECTOR
   TYPE SELECTOR
   SCHEMA WZ1D02_BITPROC_SCHEMA
   ATTRIBUTES
   (
      VARCHAR PrivateLogName   = @PrivateLog3,
      VARCHAR TdpId            = @Tdp,
      VARCHAR UserName         = @UsrId,
      VARCHAR UserPassword     = @Pwd,
      VARCHAR SelectStmt       = @SQL,
      VARCHAR ReportMode       = 'Y',
      INTEGER MaxDecimalDigits = 38
   );

   STEP setup_export_to_file
   (
      APPLY TO OPERATOR (FILE_WRITER() [1] )
      SELECT * FROM OPERATOR (SQL_SELECTOR);
   );
);

 

 

 

 

feinholz 1234 posts Joined 05/08
13 Jul 2015

I am surprised your job ran at all.
You used the DEFINE SCHEMA to pull the table definition from a table.
This would have created a schema with various datatypes.
However, the SQL Selector is defined in the script to work in ReportMode, which requires a schema of all VARCHAR fields.
There should have been a schema mismatch error.
The best thing to do is to leave the DEFINE SCHEMA the way it is, but switch the exporting operator from the SQL Selector to the Export operator. That operator will run faster, all of the data will be exported in binary and the DC operator (as a file writer) will still convert the binary data to text when writing out the delimited data.
 

--SteveF

akd2k6 54 posts Joined 12/14
13 Jul 2015

Hi Steve, thanks a lot. I changed the script as you suggested; the updated script is below.

But I still face the same issue: records with those dates come out in that format. Also, after unloading,

I have to load the file into another table with tdload. Is there anything I need to change? I have also given my tdload command below for the load.

 

 

awk -F'^^' '{print $9}' $unload_file
2015/07/07
2015/07/07
2015/07/07
2015/07/07
2015/07/07
2015/07/07
2015/07/07
1/01/
1/01/
2015/07/03
1/01/
2015/07/03
2015/07/03
1/01/
2015/07/03
1/01/
2015/07/03

 

New unload script

DEFINE JOB LOAD_EMPLOYEE_TABLE_FROM_FILE
DESCRIPTION 'EXPORT SAMPLE WZ1D02_BITPROC TABLE TO A FILE'
(
   DEFINE SCHEMA WZ1D02_BITPROC_SCHEMA FROM TABLE @Database_Table;

   DEFINE OPERATOR DDL_OPERATOR()
   DESCRIPTION 'TERADATA PARALLEL TRANSPORTER DDL OPERATOR'
   TYPE DDL
   ATTRIBUTES
   (
      VARCHAR PrivateLogName = @PrivateLog1,
      VARCHAR TdpId          = @Tdp,
      VARCHAR UserName       = @UsrId,
      VARCHAR UserPassword   = @Pwd,
      VARCHAR AccountID,
      VARCHAR ErrorList      = '3807'
   );

   DEFINE OPERATOR FILE_WRITER()
   DESCRIPTION 'TERADATA PARALLEL TRANSPORTER DATA CONNECTOR OPERATOR'
   TYPE DATACONNECTOR CONSUMER
   SCHEMA WZ1D02_BITPROC_SCHEMA
   ATTRIBUTES
   (
      VARCHAR PrivateLogName    = @PrivateLog2,
      VARCHAR DirectoryPath     = @DIR_PATH,
      VARCHAR FileName          = @OP_FILE,
      VARCHAR IndicatorMode     = 'N',
      VARCHAR OpenMode          = 'Write',
      VARCHAR Format            = 'Delimited',
      VARCHAR TextDelimiter     = 'ÇÇ'
   );

   DEFINE OPERATOR EXPORT_OPERATOR()
   DESCRIPTION 'TERADATA PARALLEL TRANSPORTER EXPORT OPERATOR'
   TYPE EXPORT
   SCHEMA WZ1D02_BITPROC_SCHEMA
   ATTRIBUTES
   (
      VARCHAR PrivateLogName    = @PrivateLog3,
      INTEGER MaxSessions       = 32,
      INTEGER MinSessions       = 4,
      VARCHAR TdpId             = @Tdp,
      VARCHAR UserName          = @UsrId,
      VARCHAR UserPassword      = @Pwd,
      VARCHAR AccountId,
      INTEGER MaxDecimalDigits  = 38,
      VARCHAR SelectStmt        = @SQL
   );

   STEP setup_export_to_file
   (
      APPLY TO OPERATOR (FILE_WRITER() [1] )
      SELECT * FROM OPERATOR (EXPORT_OPERATOR() [1]);
   );
);

 

 

 

Tdload script

 

tdload -c UTF8 -f $unload_file -t schema.DRI_ADS_D -u xxx -p xxx -h xxx -L /tmp --TargetWorkingDatabase util_schema \
-d 'ÇÇ' --ErrorTable1 util_schema.tabl1 --ErrorTable2 util_schema.tabl2 \
--LogTable util_schema.tabl3 --TargetMaxSessions 8 --TargetMinSessions 1

 

 

Now tdload is also failing to load that unload file.

LOAD: connecting sessions

$LOAD: preparing target table

$LOAD: TPT10508: RDBMS error 3621: Cannot load table DRI_ADS_D unless secondary indexes and join indexes are removed.

$LOAD: disconnecting sessions

feinholz 1234 posts Joined 05/08
14 Jul 2015

Just for the sake of trying, can you please pick a single byte delimiter that is part of the normal 7-bit ASCII code page?
I would like to see if that enables the date data to be written correctly.
 

--SteveF

akd2k6 54 posts Joined 12/14
14 Jul 2015

Hi Steve, I tried with a normal '|' delimiter as well, but the unload result is the same. Previously, with the SQL Selector, the load was working and only those invalid records went to the error table; now the load with tdload is also failing, as mentioned above.
 
Changed unload script-

   DEFINE OPERATOR FILE_WRITER()
   DESCRIPTION 'TERADATA PARALLEL TRANSPORTER DATA CONNECTOR OPERATOR'
   TYPE DATACONNECTOR CONSUMER
   SCHEMA WZ1D02_BITPROC_SCHEMA
   ATTRIBUTES
   (
      VARCHAR PrivateLogName    = @PrivateLog2,
      VARCHAR DirectoryPath     = @DIR_PATH,
      VARCHAR FileName          = @OP_FILE,
      VARCHAR IndicatorMode     = 'N',
      VARCHAR OpenMode          = 'Write',
      VARCHAR Format            = 'Delimited',
      VARCHAR TextDelimiter     = '|'
   );

 

feinholz 1234 posts Joined 05/08
14 Jul 2015

The DBS 3621 issue is a separate issue and not related to which operator is used to export the data.
I would like to address them separately.
For the DATE data issue, is it true that only the dates with values 01/01/01 are the ones that are not written out properly?
BTW, it does not pay to try to load the data if it was exported incorrectly.
We need to just figure out why certain date values are not being written properly.
 
The DBS 3621 error should never be seen. It means that Easy Loader was picking the wrong operator to do the load. And yet Easy Loader is supposed to look at the target table and notice it has secondary indexes and not pick the Load operator.
Can you please run the tdload command with the -x and -S command line options?
The -S will keep the script.
I need you to send that to me.
The -x command line option will turn on trace.
I will need you to send me that information.
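For reference, applied to the tdload command shown earlier in this thread, that would look roughly like the following (a sketch only; the other options are simply the ones you already used, shown here with the '|' delimiter, and you should check whether -x or -S takes a value on your TPT version):

tdload -x -S -c UTF8 -f $unload_file -t schema.DRI_ADS_D -u xxx -p xxx -h xxx -L /tmp --TargetWorkingDatabase util_schema \
-d '|' --ErrorTable1 util_schema.tabl1 --ErrorTable2 util_schema.tabl2 \
--LogTable util_schema.tabl3 --TargetMaxSessions 8 --TargetMinSessions 1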
Please send all of this via email to me.
Thanks!

--SteveF

feinholz 1234 posts Joined 05/08
15 Jul 2015

Also, please provide me with the SELECT statement you are passing in as a job variable.
(When I try it on my system, I get 2001/01/01 in my data file.)
 

--SteveF

akd2k6 54 posts Joined 12/14
16 Jul 2015

Hi Steve, my select statement is 'select * from table;'.
I tried with some other values like '1001-01-01', '2001-01-01', '0009-01-01', etc.
Whenever the year is less than 1000, the problem occurs.

feinholz 1234 posts Joined 05/08
16 Jul 2015

When your BTEQ output shows a date of 01/01/01, what is the real date?
2001?
1901?
When I enter a date as:
INSERT INTO TABLE abc ('0001/01/01');
I get this on output from TPT:
2901/01/01
 
In the DataConnector Consumer operator, add this attribute:
VARCHAR TraceLevel = 'all'
 
and send me the entire log file (the binary .out file).
I would like to take a look. The trace will show me what the DBS is passing to the operator.
I need to know if the weird data is coming from the DBS, or whether the data is correct, but the DC operator is not converting it properly.
So, if you could just extract the data for the one column/row that has a value of 01/01/01, that is preferable, as it cuts down on the amount of trace information.
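In the FILE_WRITER operator from the script earlier in this thread, that would mean something like the following (just a sketch showing placement; the other attributes stay exactly as they are):

      VARCHAR Format            = 'Delimited',
      VARCHAR TextDelimiter     = '|',
      VARCHAR TraceLevel        = 'all'
   );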
 
Also, I never asked what version of TPT and what version of Teradata are you using?
 

--SteveF

akd2k6 54 posts Joined 12/14
21 Jul 2015

Steve, I tried that with TPT as well, unloading to a delimited file and loading it into another table. But there I am facing the issues below:
1. When the date is earlier than 1000-01-01, e.g. "0001-01-01", it is not unloaded correctly and the load process fails.
2. When a field value is '' (blank), it unloads, but during loading it is treated as NULL; the load then tries to put NULL into a NOT NULL field and fails.
3. Unicode fields contain Unicode characters. They unload, but while loading they get rejected. It looks like either the CONSUMER or the PRODUCER operator is not handling all Unicode characters correctly.
 
I am using the EXPORT and DATACONNECTOR_CONSUMER operators for the unload, and the DATACONNECTOR_PRODUCER and UPDATE operators for the load.
I even tried to unload with the SELECTOR and DATACONNECTOR_CONSUMER operators, but that unload itself failed with the error:
"Column #1 in schema is too small to hold 8 bytes of data"
My requirement is to create the unload file so that later I can read it without Teradata for archival, and also load that file into another table.

feinholz 1234 posts Joined 05/08
22 Jul 2015

Ok, I was finally able to reproduce the problem (#1) on my PC!
I will have to talk to the DBS folks to figure out what is going on, but here is what I know.
When you use the DEFINE SCHEMA <name> FROM TABLE <tname>;
we look at the table and see a DATE column and, by default, create a schema with the column defined as INTDATE.
(By default, Teradata treats dates as integers and even stores the values as integer values.)
When we export the data in that manner, for years that are less than 1000, for some reason Teradata gives us incorrect information.
I do not know why and I will have to investigate this.
However, if you actually explicitly define the schema in the script, and you define the DATE field as "ANSIDATE", and then you set the Export operator attribute DateForm to "ansidate", then the data will be exported correctly.
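As a rough sketch of that approach (the column list below is only a placeholder; you would spell out every column of schema.DRI_ADS_D exactly as in its DDL, with the DATE column declared as ANSIDATE):

 DEFINE SCHEMA WZ1D02_BITPROC_SCHEMA
 (
    /* ...the other columns of the table, exactly as in the DDL... */
    D_CLOSE_DATE   ANSIDATE
 );

and, in the Export operator's ATTRIBUTES:

      VARCHAR DateForm          = 'ansidate'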
 

--SteveF

feinholz 1234 posts Joined 05/08
23 Jul 2015

Ok, a little more information.
When the default method of exporting DATE data is used, Teradata exports the data as integer data.
This is what is happening in this case.
And any year prior to 1900 is technically an invalid year, but it looks like Teradata accepts it anyway.
However, a DATE that is externally represented (in character format) as "0001/01/01" is really stored internally as an integer value, and the leading 0's are lost.
When the DC operator tries to convert that integer value to a string, there is not enough information to create a string in the form of "YYYY/MM/DD" because of all of the missing leading zeros.
The only way to correctly export the information is to export the data in ANSIDATE format.
 
However, our schema auto-generation code does not currently take DateForm into consideration when generating the schema and so the columns are always defined as INTDATE.
We will work on a fix for that.
In the meantime, the only workaround is to define the schema explicitly in the script, and provide the DateForm attribute (or ExportDateForm job variable) to the job.
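For instance (a sketch; this assumes you are using the $EXPORT template, which is where the job variable Steve mentions applies), the job variable file would carry a line like:

ExportDateForm = 'ansidate'

alongside the explicitly defined schema described above.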
 

--SteveF

feinholz 1234 posts Joined 05/08
23 Jul 2015

Even a little more information.
 
This is the formula Teradata uses:
(year-1900) * 10000 + (month*100) + day
 
Thus, "2015/01/01" would yield:
(2015-1900) * 10000 + (1 * 100) + 1 = 1150101
 
and "0001/01/01" would yield:
(1-1900) * 10000 + (1 * 100) + 1 = -18990000 + 100 + 1 = -18989899
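A quick way to see this encoding for yourself from BTEQ (a sketch, assuming a Teradata-mode session where a DATE can be CAST to INTEGER):

SELECT CAST(DATE '2015-01-01' AS INTEGER);   /* 1150101 per the formula above */
SELECT CAST(DATE '0001-01-01' AS INTEGER);   /* -18989899 per the formula above */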
 
And we see that the DC operator is really not converting from the decimal value correctly.
It should have taken the result (-18989899) and added 19000000, which would result in 10101, and then convert that to 1/01/01 (and possibly go one step further to add the leading 0's, but the DC operator would not know the exact format you are looking for).
We will fix the DC for this.
 
 

--SteveF

akd2k6 54 posts Joined 12/14
23 Jul 2015

Thanks Steve, 
1. I tried with the option dateform=ansiDate in the job variable file, but I still get the same wrong result.
2. I have another NULL vs. blank issue:
Table fields

  MIS_DT DATE FORMAT 'YYYY-MM-DD',
  NM_TXT VARCHAR(100) CHARACTER SET UNICODE NOT CASESPECIFIC NOT NULL,
  nm_surname VARCHAR(100) CHARACTER SET UNICODE NOT CASESPECIFIC)

 

table values

2015-01-21|abc|def
2015-01-31|BLANK|lmn
2015-01-01|ghi|ijk
2015-02-01|BLANK|NULL

 

I want to unload and then load the file to another same structure table.

 

While using the parameters below:

,DCCCloseQuoteMark='"'
,DCCOpenQuoteMark = '"'
,DCCQuotedData='Yes'

 

The data is unloaded correctly, as below:

"2015/01/21"|"abc"|"def"
"2015/01/31"|""|"lmn"
"2015/01/01"|"ghi"|"ijk"
"2015/02/01"|""|

 

But while loading this file, all the records are rejected due to a date format issue.

 

Next I tried with the options below:

,DCCCloseQuoteMark='"'
,DCCOpenQuoteMark = '"'
,DCCQuotedData='Optional'

 

It unloads the data as:

2015/01/21|abc|def
2015/01/31|""|lmn
2015/01/01|ghi|ijk
2015/02/01|""|

 

But when loading this file, all the data loads, except that blank values are loaded as "". So in the table, instead of blank values, it loads the double quotes as the value.

 

Can you please advise which settings I should use to unload and load data from that table?

 

 

 
 
 

feinholz 1234 posts Joined 05/08
23 Jul 2015

#1. If you read all of my posts, you will see that the DateForm solution originally proposed will not work. You will have to also explicitly provide the schema and define the DATE fields as ANSIDATE.
#2. Did you set DCPQuotedData to 'Yes' so that the file reader will process the data correctly and strip the quotes?
(do not confuse "DCC" and "DCP")
 

--SteveF

akd2k6 54 posts Joined 12/14
24 Jul 2015

Hi Steve,
I used the settings below in the unload job:

,DCCCloseQuoteMark='"'
,DCCOpenQuoteMark = '"'
,DCCQuotedData='Yes'

 

and the load job variables are:

,DCPCloseQuoteMark='"'
,DCPOpenQuoteMark = '"'
,DCPQuotedData='Yes'

 

The unload file content is:

"2015/01/21"|"palash"|"dhara"
"2015/02/01"|""|
"2015/01/01"|"abhishek"|"dhara"
"2015/01/31"|""|"arora"

 

But while loading, the 2 records below were rejected because of the field NM_TXT (that field name was found in the error table; the reason appears to be that the middle field was treated as NULL while the table DDL defines it as NOT NULL):

"2015/02/01"|""|
"2015/01/31"|""|"arora"

 

After that I tried with the option ,DCPNullColumns='No'.

Here all records are loaded, but the last field value (nm_surname) for the record below was loaded as blank instead of NULL. In my source it is NULL, not blank.

"2015/02/01"|""|

 

feinholz 1234 posts Joined 05/08
24 Jul 2015

When you say this:
 
unload file content is -
"2015/01/21"|"palash"|"dhara"
"2015/02/01"|""|
"2015/01/01"|"abhishek"|"dhara"
"2015/01/31"|""|"arora"
 
I am assuming for right now that these are the only 4 records that we are talking about.
Also, I just want to confirm with you that none of those fields are NULL.
When QuotedData is 'Yes', a field that is NULL is still supposed to be represented by 2 consecutive delimiters; the existence of "" means the field was a 0-length string (a field being NULL and a field having a 0-length string are 2 different things).
 
Once you verify that, I will check with engineering.
Please provide me with the exact TPT version and on which platform this job is running.
 

--SteveF

akd2k6 54 posts Joined 12/14
24 Jul 2015

The table has four records, and one field in one row is NULL. Below are the values in the table:
2015-01-21|Palash|Dhara
2015-01-31|BLANK|Arora
2015-01-01|Abhishek|Dhara
2015-02-01|BLANK|NULL
 
When I unload, the data comes out in the file as below. The NULL value does not come out as "" in the last field of the second line.

"2015/01/21"|"palash"|"dhara"
"2015/02/01"|""|
"2015/01/01"|"abhishek"|"dhara"
"2015/01/31"|""|"arora"

feinholz 1234 posts Joined 05/08
24 Jul 2015

The NULL is not supposed to be represented by "".
That is an empty string.
It is valid to not write anything in the 3rd field. The existence of a delimiter and then an end-of-line marker will indicate the column at that position is supposed to be NULL.
However, when the file is then read in, the file reader must note the absence of a value for that column position and denote that column as NULL.
We are looking into our code to see what is going on.
However, I still have not heard back on the version of TPT and platform.
 

--SteveF

akd2k6 54 posts Joined 12/14
24 Jul 2015

platform is AIX 6.1
Teradata Parallel Transporter Version 14.10.00.05

akd2k6 54 posts Joined 12/14
24 Jul 2015

For your information, the load script:

USING CHARACTER SET UTF8
DEFINE JOB dc_load
DESCRIPTION 'Load a Teradata table from a file'
(
  APPLY $INSERT TO OPERATOR( $UPDATE()[@LoadInstances] )
  SELECT * FROM OPERATOR( $DATACONNECTOR_PRODUCER()[@DCPInstances] );
);

 

 

Unload script:

USING CHARACTER SET UTF8
DEFINE JOB delimited_file_unload
DESCRIPTION 'Export rows from a Teradata table to a delimited file'
(
  APPLY TO OPERATOR ($DATACONNECTOR_CONSUMER())
  SELECT * FROM OPERATOR ($EXPORT());
);

 

 

feinholz 1234 posts Joined 05/08
24 Jul 2015

Thank you! I will let you know when we find something out.
 

--SteveF

feinholz 1234 posts Joined 05/08
27 Jul 2015

Possible workaround:
Add another attribute to the DC operator:
AppendDelimiter='Y'
(check to see if it is in the DC operator template file for your version of TPT)
This will force a delimiter to be placed at the end of every record.
And it will hopefully get around the bug in which a delimiter followed by end-of-record is not being treated as a NULL field.
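If you are using an explicit DEFINE OPERATOR for the file writer, that attribute would sit with the others, roughly like this (a sketch; with the job templates, the corresponding job-variable name is whatever the template file mentioned above uses):

      VARCHAR Format            = 'Delimited',
      VARCHAR TextDelimiter     = '|',
      VARCHAR AppendDelimiter   = 'Y'
   );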
 

--SteveF

leolee1234 1 post Joined 01/14
27 Jul 2015

Hi Steve,
May I know whether there is any update on the fix for exporting the '0001-01-01' date when we are using DEFINE SCHEMA <name> FROM TABLE <tname>;?
Since we are trying to export all the tables in an entire database, transfer the exported data files to another region's server, and then load them into the DB, the DEFINE SCHEMA <name> FROM TABLE <tname>; method sounds much more convenient to us.
We are using TTU 14.10. So when will it be fixed?
 
Thanks,
Leo

feinholz 1234 posts Joined 05/08
27 Jul 2015

I will check the development schedule tomorrow if I have a chance (if not Tuesday, then Wednesday). I imagine it will be a few months. We do not release an efix for every release in each month (we currently support releases 13.10, 14.0, 14.10, 15.0, 15.10 and are currently working on release 16.0).
 
The only other option (when it becomes available) is to offer you a D2D (a developer-to-developer) copy of the software, which is an engineering copy (not an official release). However, we do not allow D2Ds to be used in production environments. And I cannot commit to when a D2D would become available, but it would (obviously) be sooner than the official release.
 

--SteveF

ashish089 3 posts Joined 06/15
28 Jul 2015

I am facing an issue while loading data using SSIS to Teradata.
I have an XML file to load, and as the destination object I am using "Teradata Destination", so the XML parser, being the source, returns 21 tags to be loaded.
So, if I try to load all 21 in parallel it returns an error, but when I try to load only 13 of the 21 it works fine.
 
SSIS is internally using TPT load to load the data.
 
Error Message:
 
"Error 5584: TPT Import error encountered during Initiate phase. CLI2: SESSOVER(301): Exceeded max number of sessions allowed."
 
Please Suggest!!!!

akd2k6 54 posts Joined 12/14
29 Jul 2015

Hi Steve, I have hit a new issue. I have a field of type TIMESTAMP(6) whose value in the source table is NULL. But after unloading and then loading, records are getting rejected with error 6706.
I tried binary and formatted formats, with quotes and without quotes, but I get the same error.

feinholz 1234 posts Joined 05/08
29 Jul 2015

What does the output file for that row look like?
Also, have you tried Export-to-Load without writing the data to a file to see the results?
 

--SteveF

akd2k6 54 posts Joined 12/14
17 Sep 2015

Hi Steve, I am loading table to table with TPT, but I have run into a new issue.
A column name in my table contains a space. The column is defined as:

"Daily Total" INTEGER,

Teradata allows creating a column name with a space, but while loading table to table it fails with a schema issue.

feinholz 1234 posts Joined 05/08
17 Sep 2015

I will need more information than this.
I will need the script, version of TPT, exact error message, etc.
 

--SteveF

akd2k6 54 posts Joined 12/14
17 Sep 2015

Error

$UPDATE: TPT10508: RDBMS error 3707: Syntax error, expected something like ',' or ')' between the word 'Daily' and the word 'Total'.

$UPDATE: disconnecting sessions

 

Version

Teradata Parallel Transporter Version 14.10.00.05

 

Script

USING CHARACTER SET UTF8
DEFINE JOB tpt_table_to_table_load
(
  APPLY $INSERT TO OPERATOR($UPDATE()[1] )
  SELECT * FROM OPERATOR($EXPORT() [1]);
);

 

 

feinholz 1234 posts Joined 05/08
18 Sep 2015

Can you please run the tbuild command with the -P command line option?
In the current working directory you should find some trace files and the expanded script that is generated under the covers.
Please send all of those files to me:
steven.feinholz@teradata.com
 

--SteveF

reorz 2 posts Joined 04/14
30 Dec 2015

How do I save current_date in a TPT job variable?
I need to save a filename with the current date.

feinholz 1234 posts Joined 05/08
04 Jan 2016

We do not currently support that feature.
It is being considered for a future release.
 

--SteveF
