kusumayella 16 posts Joined 08/10
25 Mar 2011
TPT script loading issue

When I try to load the 'sample table' table, the job does not throw any error, but it does not load the table either.

I got the following output:
Operator_LOAD_sampletable: Total Rows Sent To RDBMS: 4
Operator_LOAD_sampletable: Total Rows Applied: 0
Operator_LOAD_sampletable: Total Possible Duplicate Rows: 4

My data file contains data in other (non-English) languages.

Has anybody faced this problem, or does anyone have an idea what causes it?
If so, please share it with me; I need it.

Thanks

narang.mohit 13 posts Joined 07/09
26 Mar 2011

Check the error table.

jwh_ws 15 posts Joined 04/11
05 Jun 2011

I am seeing the exact same issue. The table does not have a UPI or USI defined on it and the UV error table is empty after the load.

How can there be *Possible* Duplicate Rows on a table without a uniqueness constraint?

VandeBergB 182 posts Joined 09/06
05 Jun 2011

If you're using the Load operator, it is analogous to FastLoad, which discards any rows that are duplicates. Check the data set you're trying to load...

Some drink from the fountain of knowledge, others just gargle.

ProudCat 13 posts Joined 09/11
05 Dec 2011

I have exactly the same problem now.

We use a CREATE MULTISET TABLE statement to create the table, and it definitely doesn't contain duplicates. However, ALL the rows get discarded.

$LOAD: Statistics for Target Table:  'pt_fragment'
$LOAD: Total Rows Sent To RDBMS:      38080
$LOAD: Total Rows Applied:            0
$LOAD: Total Possible Duplicate Rows: 38080

Any help?


feinholz 1234 posts Joined 05/08
05 Dec 2011

You will have to look inside the log (not just the messages on the console) to see if there is any additional information.

--SteveF

ProudCat 13 posts Joined 09/11
05 Dec 2011

Thanks. I was able to find an error code in the _err table. Apparently I have a problem with date formatting that has nothing to do with duplicate records. Very inconsistent error reporting...

feinholz 1234 posts Joined 05/08
05 Dec 2011

Yeah, we are working to correct that.

The console is supposed to have the row counts from the error tables, and thus you would not have a value in the "duplicate row" message.

We actually do not know how many rows are dups. We just perform a basic calculation: from the number of rows sent to Teradata, subtract the number of rows applied and the number of rows in the error tables. Anything left over is suspected to be missing because it was thrown away by Teradata as a dup.
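In other words, the console figure is inferred rather than counted. A minimal sketch of that arithmetic (the function name is made up for illustration, not part of TPT):

```python
def possible_duplicates(rows_sent, rows_applied, error_rows):
    """Estimate rows silently discarded by Teradata as duplicates.

    The count is an inference, not a measurement: whatever was sent
    but neither applied nor captured in the error tables is presumed
    to have been discarded as a duplicate.
    """
    return rows_sent - rows_applied - error_rows

# Numbers from the console outputs in this thread:
print(possible_duplicates(38080, 0, 0))        # prints 38080
print(possible_duplicates(21135243, 114917, 0))  # prints 21020326
```

This is why a row rejected for, say, a date-format error can still show up in the "possible duplicate" count if the console misses the error-table totals.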


--SteveF

j355ga 100 posts Joined 12/05
14 Feb 2012

I am getting a similar error, but it is not related to invalid data.

When using the Load operator with a MULTISET NOPI table, all rows load.

When the Load operator is used with a MULTISET NUPI table, "duplicates" are rejected. The ET and UV tables are empty after the load, even when specifying that the tables be retained.

Any ideas?

Using a NUPI multiset table:

 $LOAD: Total Rows Sent To RDBMS:      21135243
 $LOAD: Total Rows Applied:            114917
 $LOAD: Total Possible Duplicate Rows: 21020326

Using a NOPI multiset table

 $LOAD: Total Rows Sent To RDBMS:      21135243
 $LOAD: Total Rows Applied:            21135243 


Jeff

feinholz 1234 posts Joined 05/08
14 Feb 2012

The Load operator uses the FastLoad protocol.

With the FastLoad protocol, you cannot insert duplicate rows into a MULTISET table. They will be discarded by the DBS during the Application Phase.

This is a DBS restriction, not a client utility restriction.

Since the loading of a NOPI table by the Load operator does not have an Application Phase, the duplicates cannot be discarded by the DBS, and thus duplicates can be stored in a NOPI table.
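The difference can be illustrated with a toy model (plain Python, purely illustrative; the function names are invented and this only mimics full-row duplicate discarding, not the actual DBS internals):

```python
# Toy model of the FastLoad Application Phase (illustrative only).
# With a PI table, the DBS silently drops full-row duplicates;
# a NOPI load has no Application Phase, so every row is kept.

def apply_phase_pi(rows):
    """PI MULTISET table via the FastLoad protocol: dups discarded."""
    applied, seen = [], set()
    for row in rows:
        if row not in seen:
            seen.add(row)
            applied.append(row)
    return applied

def apply_phase_nopi(rows):
    """NOPI table: no Application Phase, all rows are stored."""
    return list(rows)

sent = [(1, 'a'), (1, 'a'), (2, 'b')]
print(len(apply_phase_pi(sent)))    # prints 2 (1 "possible duplicate")
print(len(apply_phase_nopi(sent)))  # prints 3
```

This matches the NUPI vs NOPI row counts reported above: same rows sent, very different rows applied.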


--SteveF

Samya 21 posts Joined 11/11
27 Feb 2012


Hi Feinholz,

Can you please visit the link below, where I have posted a few of my queries? I was not sure where to post them.

http://forums.teradata.com/forum/tools/teradata-pt-12-0-error-output-schema-does-not-match#comment-19456

Thanks in advance,


Regards,

Sam

Samya 21 posts Joined 11/11
26 Mar 2012


Hi Feinholz,

Please provide your valuable input.

I am getting the output below when trying to load an empty Teradata table:


LOAD_OPERATOR: preparing target table
LOAD_OPERATOR: entering Acquisition Phase
LOAD_OPERATOR: entering Application Phase
LOAD_OPERATOR: Statistics for Target Table:  'abc.load_table_name'
LOAD_OPERATOR: Total Rows Sent To RDBMS:      1
LOAD_OPERATOR: Total Rows Applied:            0
LOAD_OPERATOR: Total Possible Duplicate Rows: 1
LOAD_OPERATOR: disconnecting sessions
FILE_READER: TPT19221 Total files processed: 1.
LOAD_OPERATOR: Total processor time used = '5.01691 Second(s)'
LOAD_OPERATOR: Start : Mon Mar 26 12:15:16 2012
LOAD_OPERATOR: End   : Mon Mar 26 12:35:10 2012
Job step load_data_from_file completed successfully
Job c1513506 completed successfully


The error tables are deleted automatically because the job completes successfully.

Can you please point me to a link or document listing the common errors and their solutions?

Regards,

Sam

feinholz 1234 posts Joined 05/08
26 Mar 2012

I do not see an error message from that output.

You will probably have to look at the entire log (with the tlogview utility) to see more about why the row may have been rejected.


--SteveF

Barathe 3 posts Joined 11/12
19 Dec 2012

I am using the Export and Update operators in a TPT script. Only a Primary Index is defined on my target table (not a UPI). I don't get any error while executing the TPT script, but the exported rows are not inserted into the target table.
I want to insert all the exported rows, even if they are duplicates.
Please find below the output of the TPT script.
Teradata Parallel Transporter Update Operator Version 13.10.00.04
UPDATE_OPERATOR: private log specified: Loadoper_privatelog
Teradata Parallel Transporter Export Operator Version 13.10.00.05
EXPORT_OPERATOR: private log specified: exportoper_privatelog
EXPORT_OPERATOR: connecting sessions
UPDATE_OPERATOR: connecting sessions
UPDATE_OPERATOR: preparing target table(s)
UPDATE_OPERATOR: entering DML Phase
UPDATE_OPERATOR: entering Acquisition Phase
EXPORT_OPERATOR: sending SELECT request
EXPORT_OPERATOR: entering End Export Phase
EXPORT_OPERATOR: Total Rows Exported:  4
UPDATE_OPERATOR: entering Application Phase
UPDATE_OPERATOR: Statistics for Target Table:  'student_2'
UPDATE_OPERATOR: Rows Inserted: 0
UPDATE_OPERATOR: Rows Updated:  0
UPDATE_OPERATOR: Rows Deleted:  0
UPDATE_OPERATOR: entering Cleanup Phase
UPDATE_OPERATOR: disconnecting sessions
EXPORT_OPERATOR: disconnecting sessions
EXPORT_OPERATOR: Total processor time used = '0.36 Second(s)'
 
Kindly help in resolving this issue.
Regards,
Barathe G.

feinholz 1234 posts Joined 05/08
20 Dec 2012

Is your target table a SET table, or a MULTISET table? It must be a MULTISET Table in order to load duplicates.
You will need to look at the rest of the log to see where the rows are. They might be in one of the error tables.

--SteveF

zammohan 6 posts Joined 04/14
30 Apr 2014

Hi Feinholz,
I have a MULTISET target table with only a primary index defined. Still, duplicate records are not loaded when I use the following script. I would be grateful if you could throw some light on this.

USING CHARACTER SET UTF8
DEFINE JOB MOVE_DATA_WITHOUT_LANDING_TO_DISK
DESCRIPTION 'MOVE DATA WITHOUT LANDING THE DATA TO DISK'
(
  DEFINE SCHEMA SCHEMA_NAME
  (
    COLUMN1 INTEGER,
    COLUMN2 VARCHAR(384),
    COLUMN3 VARCHAR(1500),
    COLUMN4 VARCHAR(240),
    COLUMN5 VARCHAR(240)
  );

  /*** Export Operator Definition ***/
  DEFINE OPERATOR EXPORT_OPERATOR
  DESCRIPTION 'TERADATA PARALLEL TRANSPORTER EXPORT OPERATOR'
  TYPE EXPORT
  SCHEMA PLT_IWH_BASE
  ATTRIBUTES
  (
    VARCHAR PrivateLogName = 'exportoper_privatelog',
    INTEGER MaxSessions    = 8,
    INTEGER MinSessions,
    VARCHAR TdpId          = 'XXXX',
    VARCHAR UserName       = 'XXXX',
    VARCHAR UserPassword   = 'XXXX',
    VARCHAR SelectStmt     = 'SELECT COLUMN1,
                                     COLUMN2,
                                     COLUMN3,
                                     COLUMN4
                              FROM SRC_DB_NAME.TABLE_NAME;'
  );

  /*** Load Operator Definition ***/
  DEFINE OPERATOR LOAD_OPERATOR
  DESCRIPTION 'TERADATA PARALLEL TRANSPORTER LOAD OPERATOR'
  TYPE LOAD
  SCHEMA SCHEMA_NAME
  ATTRIBUTES
  (
    VARCHAR PrivateLogName = 'loadoper_privatelog',
    INTEGER MaxSessions    = 16,
    INTEGER MinSessions,
    VARCHAR TargetTable    = 'TGT_DB_NAME.TABLE_NAME',
    VARCHAR TdpId          = 'XXXXX',
    VARCHAR UserName       = 'XXX',
    VARCHAR UserPassword   = 'XXX',
    VARCHAR ErrorTable1    = 'ERRTABLE1',
    VARCHAR ErrorTable2    = 'ERRTABLE2',
    VARCHAR LogTable       = 'LOGTABLE'
  );

  /*** Apply Statement ***/
  APPLY
  ('INSERT INTO TGT_DB_NAME.TABLE_NAME
    VALUES
    (
      :COLUMN1,
      :COLUMN2,
      :COLUMN3,
      :COLUMN4
    );')
  TO OPERATOR (LOAD_OPERATOR [1])

  SELECT * FROM OPERATOR (EXPORT_OPERATOR [2]);
);


Best,
Shyam

CarlosAL 512 posts Joined 04/08
30 Apr 2014

FastLoad (or the TPT Load operator) doesn't load duplicate rows, no matter whether the table is SET or MULTISET (NoPI tables are an exception to this).
If you need to load duplicates, you must use MultiLoad (or the TPT Update operator).
Cheers.
Carlos.

zammohan 6 posts Joined 04/14
01 May 2014

Many Thanks CarlosAL.
Using the UPDATE operator, the duplicates got loaded.
I am loading a set of tables, and only a few of them contain duplicate data. I am now trying to analyze the performance difference between the LOAD and UPDATE operators.

Best,
Shyam
