JerryLusa 23 posts Joined 08/13
18 Jun 2014
Running multiple concurrent TPT INSERTER jobs on one server - seeing long wind-down

We're trying to find the upper limit on how many concurrent INSERTER operators we can run on our application server.  Past a certain number, we see a long delay after the INSERT takes place.  If we knew which resource(s) were blocking the finalization of these jobs, we might be able to tune the environment to get more running at once.
Here's the end of the log file for a job that ran for 11 seconds to the point where it completed an INSERT/SELECT of 300 rows read from text files.  Then it waits 9 minutes before ending.  What is causing the delay at the end of this job?
There were 11 other INSERTER jobs and 12 LOAD jobs running concurrently.  CPU was hovering at 99-100%; disk I/O and network I/O were negligible.
The log (no sensitive data):
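To find that upper limit empirically, one approach is a small launcher that caps how many tbuild processes run at once and records each job's wall time, then raise the cap until wind-down time starts to grow. A rough Python sketch (the cap, job names, and the stubbed-out tbuild call are all hypothetical, not our actual launcher):

```python
import time
from concurrent.futures import ThreadPoolExecutor

MAX_CONCURRENT = 12  # hypothetical cap to experiment with

def run_job(job_name):
    # In a real launcher this would be something like:
    #   subprocess.run(["tbuild", "-f", script, "-j", job_name, "-n", "-s", "1"])
    start = time.monotonic()
    time.sleep(0.01)  # stand-in for the actual load work
    return job_name, time.monotonic() - start

jobs = [f"TargetTable_{i}_INSERTER" for i in range(1, 25)]
with ThreadPoolExecutor(max_workers=MAX_CONCURRENT) as pool:
    results = dict(pool.map(run_job, jobs))

# The job whose wall time grew the most is the one hitting the bottleneck
slowest = max(results, key=results.get)
print(f"{len(results)} jobs finished; slowest: {slowest} ({results[slowest]:.3f}s)")
```

Comparing each job's wall time against its reported "Load Phase" elapsed time would isolate how much of the total is the wind-down delay.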
 **** 16:42:56 Checkpoint complete. Rows inserted: 300
Task(APPLY_1[0001]): checkpoint completed, status = Success
Task(SELECT_2[0001]): checkpoint completed, status = Success
**** 16:42:58 Finished sending rows to the RDBMS
ICEtoTD_MIGRATOR: finished sending data
TPT_INFRA: TPT02255: Message Buffers Sent/Received = 1, Total Rows Received = 300, Total Rows Sent = 0
              Instance    Rows Sent             Rows Inserted
              ========  ====================  ====================
                  1                      300                   300
              ========  ====================  ====================
                Total                    300                   300
**** 16:42:58 Number of bytes sent to the RDBMS for this job: 25300
**** 16:42:58 Load Phase statistics:
Elapsed time: 00:00:00:05 (dd:hh:mm:ss)
CPU time:     0.124801 Second(s)
MB/sec:       0.005
MB/cpusec:    0.193
ICEtoTD_MIGRATOR: Total Rows Sent To RDBMS:      300
TPT_INFRA: TPT02255: Message Buffers Sent/Received = 1, Total Rows Received = 0, Total Rows Sent = 300
ICEtoTD_MIGRATOR: Total Rows Applied:            300
     ================================================================
     =                                                              =
     =                       Logoff/Disconnect                      =
     =                                                              =
     ================================================================
ICEtoTD_MIGRATOR: disconnecting sessions
**** 16:42:58 Logging off all sessions
              Instance      Cpu Time
              ========  ================
                   1        0.66 Seconds
**** 16:42:58 Total processor time used = '0.655204 Second(s)'
.        Start : Wed Jun 18 16:42:49 2014
.        End   : Wed Jun 18 16:42:58 2014
.        Highest return code encountered = '0'.
ICEtoTD_MIGRATOR: Total processor time used = '0.655204 Second(s)'
ICEtoTD_MIGRATOR: Start : Wed Jun 18 16:42:49 2014
ICEtoTD_MIGRATOR: End   : Wed Jun 18 16:42:58 2014
**** 16:42:58 This job terminated
ICEtoTD_FILEREADER: TPT19221 Total files processed: 100.
Job step ImportRecords completed successfully
Job MissionSto_1_INSERTER completed successfully, but with warning(s).
Job start: Wed Jun 18 16:42:47 2014
Job end:   Wed Jun 18 17:01:40 2014
Total available memory:          10000000
Largest allocable area:          10000000
Memory use high water mark:         72616
Free map size:                       1024
Free map use high water mark:          18
Free list use high water mark:          0

Jerry L.
feinholz 1234 posts Joined 05/08
19 Jun 2014

I would need to see the script(s), and the command lines.
Are you running 12 different TPT scripts?
Did you give each job its own unique name?
Are the jobs loading different target tables?
What version of TPT are you using?
 
 

--SteveF

JerryLusa 23 posts Joined 08/13
19 Jun 2014

-->  I would need to see the script(s), and the command lines.
listings at bottom

-->  Are you running 12 different TPT scripts?
16 scripts, one per target table.  Some scripts use the INSERTER operator and some the LOAD operator.  The scripts take a 'LoadSet' variable to distinguish unique job names, temp tables, etc. when running multiple jobs per target table.

-->  Did you give each job its own unique name?
Yes: table name, operator, and LoadSet.

-->  Are the jobs loading different target tables?
Each job loads from unique files; multiple jobs write to the same target tables.  There are from 1 to 10 jobs per target table.  LOAD scripts can run up to 4 jobs concurrently, INSERTER up to 10, based on demand.
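Those per-operator caps can be modeled with a pair of semaphores, which is roughly what our scheduling amounts to. A toy Python sketch (not our actual launcher; the caps match the numbers above, and the tbuild run is a stub):

```python
import threading
import time

# Hypothetical caps matching the policy above: 4 LOAD, 10 INSERTER
caps = {"LOAD": threading.Semaphore(4), "INSERTER": threading.Semaphore(10)}
active = {"LOAD": 0, "INSERTER": 0}   # jobs currently running, per operator
peak = {"LOAD": 0, "INSERTER": 0}     # highest concurrency observed
lock = threading.Lock()

def run_job(op):
    with caps[op]:                    # blocks once the cap is reached
        with lock:
            active[op] += 1
            peak[op] = max(peak[op], active[op])
        time.sleep(0.01)              # stand-in for the tbuild run
        with lock:
            active[op] -= 1

threads = [threading.Thread(target=run_job, args=(op,))
           for op in ["LOAD"] * 8 + ["INSERTER"] * 20]
for t in threads:
    t.start()
for t in threads:
    t.join()
print("peak concurrency:", peak)
```

The semaphores guarantee peak concurrency never exceeds the caps, regardless of how many jobs are queued per operator type.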

-->  What version of TPT are you using?
 
14.10
Command Line:

start "%TABLE_NAME%" /B tbuild -f %_FILE_ROOT%\Scripts\XXXtoTD_%_OPERATOR%_%TABLE_NAME%.tpt -j %TABLE_NAME%_%_LOADSET%_%_OPERATOR% -n -s 1 -u "myLoadSet='%_LOADSET%' "
which expands to...
start "TargetTable" /B tbuild -f D:\Scripts\XXXtoTD_INSERTER_TargetTable.tpt -j TargetTable_1_INSERTER -n -s 1 -u "myLoadSet='1' "

TPT INSERTER script:
DEFINE JOB XXXtoTD_INSERTER_TargetTable(
DEFINE SCHEMA XXXtoTD_TargetTable (
Col1 VARCHAR(100),
Col2 VARCHAR(100),
Col3 VARCHAR(100),
Col4 VARCHAR(100),
Col5 VARCHAR(100),
Col6 VARCHAR(100),
Col7 VARCHAR(100),
Col8 VARCHAR(100),
Col9 VARCHAR(100),
Col10 VARCHAR(100),
Col11 VARCHAR(100) );
DEFINE OPERATOR DDL_OPERATOR
TYPE DDL
ATTRIBUTES
(
VARCHAR UserName = '*****************',
VARCHAR UserPassword = '*****************',
VARCHAR TdpID = '***************',
VARCHAR WorkingDatabase = '***************',
VARCHAR ErrorList = '3807'
);
DEFINE OPERATOR XXXtoTD_FILEREADER
TYPE DATACONNECTOR PRODUCER
SCHEMA XXXtoTD_TargetTable
ATTRIBUTES
(
VARCHAR FileName = 'TargetTable_' || @myLoadSet || '@*.txt',
VARCHAR Format = 'DELIMITED',
VARCHAR OpenMode = 'Read',
VARCHAR DirectoryPath = 'D:\XXX_Files',
VARCHAR ArchiveDirectoryPath = 'D:\XXX_Archive\',
VARCHAR ArchiveFatal = 'Y',
VARCHAR IndicatorMode = 'N',
VARCHAR TextDelimiter = '|',
VARCHAR MultipleReaders = 'N',
VARCHAR AcceptMissingColumns = 'Y'
);
DEFINE OPERATOR XXXtoTD_MIGRATOR
TYPE INSERTER
SCHEMA *
ATTRIBUTES
(
VARCHAR UserName = '*****************',
VARCHAR UserPassword = '*****************',
VARCHAR TdpID = '***************',
VARCHAR WorkingDatabase = '***************',
VARCHAR LogSQL = 'No'
);
STEP ImportRecords(
APPLY
('INSERT INTO XXX_.TargetTable VALUES (:Col1,:Col2,:Col3,:Col4,:Col5,:Col6,:Col7,:Col8,:Col9,:Col10,:Col11);')
TO OPERATOR (XXXtoTD_MIGRATOR[1])
SELECT * FROM OPERATOR (XXXtoTD_FILEREADER[1]);
);
);
 
 

Jerry L.
