joemm 2 posts Joined 12/09
04 Dec 2009
TPT Stream Operator Bug?


We're using the TPT Stream operator through Informatica 8.6. The operation I am trying to do is quite simple. Here is an outline:

1) I have a table with an identity column defined as the UPI.
2) We plan to do inserts and updates to the table based on the identity column.
3) In Informatica, I used an Update Strategy to flag each record as an insert or an update.
4) Two streams are used: one for inserts and one for updates.

On the first run, reading from the source was no issue, but the Informatica session hung for quite some time without processing any records. I suspect there might be a deadlock.

I changed the Stream operator for the update to "Is Staged" and reran the Informatica session; all records were inserted correctly.
Then I changed one attribute in the source and reran the job, hoping the record would be updated correctly in the target table. The Informatica session said the records were updated successfully, but no loader log was created for the update operation. In the staged file I can see all the records, yet no record in the database has been updated.

Is this a bug? BTW, we're using TD 12.0.

Thx in advance,

confondue 1 post Joined 12/09
08 Dec 2009

Hey, any luck with this issue? We are facing a similar one.

Vador 36 posts Joined 08/07
11 Dec 2009

Watch out for deadlocks: the table maintenance is done by SQL, and locks are at the row-hash level. If you're inserting and updating the same row through two different streams, you may be in trouble.

Have you tried (for validation) using the Stream operator independently of Informatica?
Running the job with the tbuild command will show whether the problem is on the Stream operator side or on the Informatica side.
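For example, a minimal standalone job script along these lines could be run with `tbuild -f stream_test.tpt`. This is only a sketch: the database, table, and column names are placeholders, and it assumes the generic `$FILE_READER` and `$STREAM` operator templates that ship with TPT (check your TPT version's documentation for the exact template names and job-variable setup):

```
DEFINE JOB stream_test
DESCRIPTION 'Validate the Stream operator outside Informatica'
(
  APPLY
    /* same UPDATE the Informatica session is supposed to issue */
    'UPDATE mydb.TableA SET attr1 = :attr1 WHERE id_col = :id_col;'
  TO OPERATOR ($STREAM)
  SELECT * FROM OPERATOR ($FILE_READER);
);
```

If the updates go through here but not from Informatica, the problem is on the Informatica side; if they fail here too, it is the Stream operator or the table definition.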

TonyL 20 posts Joined 12/09
11 Dec 2009

What is the version of the TPT Stream operator that you are using?

joemm 2 posts Joined 12/09
15 Dec 2009

Hi Guys,

Sorry, I haven't checked this forum for a week now. Still no luck with the issue. What I did instead of sourcing from one instance was create two instances, one for inserts and one for updates, and mark the update as the priority in the target load plan. Then I used TPT for the inserts and the Relational Writer for the updates.

raogaru 1 post Joined 01/10
15 Jan 2010

This works, but with a huge set of records it may take a long time, so it is only a temporary solution. Can anyone give a proper solution? I too am facing the same issue while using TPT.

SuperJetJoe 2 posts Joined 02/10
28 Feb 2010

Why use two targets? You can do updates and inserts in one TPT Stream target.
It's probably locking the table with the two targets if you aren't accessing it via a lock-for-access view.
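If the insert/update split is done in TPT itself rather than in two Informatica targets, a single Stream target can carry both DML statements in one APPLY. A sketch along these lines (table and column names are made up, and the exact placement and spelling of the upsert option should be verified against the TPT reference for your release):

```
APPLY
  'UPDATE mydb.TableA SET attr1 = :attr1 WHERE id_col = :id_col;',
  'INSERT INTO mydb.TableA (id_col, attr1) VALUES (:id_col, :attr1);'
  INSERT FOR MISSING UPDATE ROWS   /* upsert: try the UPDATE, INSERT on miss */
TO OPERATOR ($STREAM)
SELECT * FROM OPERATOR ($FILE_READER);
```

With one target there is only one session holding row-hash locks, which sidesteps the two-streams-deadlocking-each-other problem described above.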

AbeK 24 posts Joined 08/09
14 Apr 2010

We've run into a similar issue, and the COE for Informatica/TPT at Teradata should be able to provide guidance. As of the last time I checked, the TPT API Stream operator builds its own update strategy, which negates some updates.

zgadiwan 2 posts Joined 08/09
03 May 2010

Did you configure the Informatica session property "Treat Source Rows As" to 'Data Driven'?
You need to set this accordingly, as the default is 'Insert'. The default will not work with an Update Strategy.

Please post your feedback when you try this option out.

RAGA 1 post Joined 11/10
17 Nov 2010

I am also having similar kind of problem here:

Please bear with me, and sorry for such a long explanation...

Here is the issue I am encountering:

I am on Informatica 8.6 with Teradata 12, and I am using the Teradata Stream operator to insert/update data into my target tables.

My target table, TableA (on Teradata), has 10 fields, and both INSERTs and UPDATEs will happen on this table.
On the database side, Field2 is marked as the NUPI. Field1 currently has no index and is populated by an Informatica Sequence Generator when a record is inserted.

As of now, I guess I don't have issues with INSERTS.

My problem is with UPDATES...

Field1 and Field2 combined will uniquely identify a record that needs to be updated.

Right now, per the database definition, Field2 is marked as the NUPI on the database side. Since the update needs to match on both Field1 and Field2, I have marked both Field1 and Field2 as PK in the Informatica target definition. And since both fields are defined as PK on the Informatica side, Teradata tries to SERIALIZE on both fields when it establishes the connection to the target.

With the kind of data I have, SERIALIZING on Field1 and Field2 will not work for me: most of the time I will be updating the target table, possibly the same row many times, and I keep hitting deadlocks, which makes my process run much longer. So I need to SERIALIZE on Field2 only, but update the table by Field1 and Field2.

So, what I am trying to do here is:

Mark only Field2 as PK on the Informatica side (as it is the NUPI on the DB side), use an Update Override on the target, and generate an UPDATE statement that matches on both Field1 and Field2.
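For reference, an Informatica target Update Override for this case would normally look something like the following. Field1 and Field2 are from the post; Field3/Field4 stand in for the non-key attributes (I'm assuming their names), and `:TU.` is Informatica's standard binding prefix for target update columns:

```sql
UPDATE TableA
SET    Field3 = :TU.Field3,
       Field4 = :TU.Field4
WHERE  Field1 = :TU.Field1
AND    Field2 = :TU.Field2
```

With only Field2 marked as PK in the target definition, serialization should happen on Field2 alone, while the override still qualifies the row by both fields.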

Unfortunately, my problem is that it is not using the UPDATE OVERRIDE SQL I have provided. It uses the same default UPDATE statement that Informatica generates and ignores my override statement...

Can someone please help me with what to do or how to get past this issue?

I have the following options in my session:
I have set "Treat Source Rows As" to "Update",
I have checked the Serialize option in the Mappings tab for the target, and
I have checked the "Update Else Insert" option for the target.

Any help is greatly appreciated.

feinholz 1234 posts Joined 05/08
18 Nov 2010

The Stream operator uses the DML statements sent to it from the application (Informatica, in this case). If the DML statements being executed are not the ones you are expecting ("expecting" due to some setting you are providing to Informatica), then you need to talk to Informatica about this.


emilwu 72 posts Joined 12/07
10 Feb 2011

The Stream operator is no different from TPump in this respect. If you try to update a PI (the primary index, i.e. the physical PI, not the primary key), I do not think it is possible.

If you look closely at what happens under the hood, TPump uses macros to do AMP-local operations. Updating a PI requires moving the row from one AMP to another and is therefore not supported.

As for the deadlock/locking problem: it is quite common in TPump jobs when the target table's PI is not unique and/or not integer/numeric based. Serialization helps, to a certain degree, but you can still hit it. So try changing your session settings and play with the pack size and serialization settings to get satisfactory throughput.
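To illustrate the AMP-local point with made-up names: a row's AMP is determined by hashing its PI value, so a TPump/Stream macro can only touch rows whose PI value appears in the WHERE clause, and a statement that changes the PI value itself would re-hash the row to a different AMP:

```sql
-- AMP-local: the PI value in the WHERE clause locates the row's AMP,
-- and the row stays there; this is what Stream/TPump macros can do
UPDATE TableA SET Field3 = :Field3 WHERE Field2 = :Field2;

-- Not AMP-local: changing the PI value re-hashes the row to a
-- different AMP, so the Stream operator cannot perform it
UPDATE TableA SET Field2 = :NewField2 WHERE Field2 = :OldField2;
```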
