arfsoft 1 post Joined 04/14
24 Apr 2014
TPT - BUFFERMAX error

Hi all, I'm pretty new to using TPT and am working with Version 14.00.00.07.  In the last week or so, almost all of my jobs involving a simple load from one csv file into one table have been failing.
The error I get is: TPT19003 BUFFERMAXSIZE: 64260.  However, I find that once this error occurs, the TPT job will only attempt to process the file that caused the error in the first place, even after the job variables file and every other place where the file name is given have been changed.
This is confusing; I can only conclude that either the log is somehow erroneous or some code that I don't have access to still contains the old file name.
I'm especially perplexed by this error because the same files load with no issues in another TPT job when I pass them as a list of files processed in parallel rather than as a single file.  Due to the sensitive nature of the data I'm handling, I can't post verbatim code here, but here is part of the log with specifics censored.
TPT Load failed, running again to clear any locks
Teradata Parallel Transporter Version 14.00.00.07
Job log: <XXXXXXX>
Job id is <XXXXXXX>, running on blx20be01
Found CheckPoint file: <XXXXXXXX>
This is a restart job; it restarts at step Load_data.
Teradata Parallel Transporter DataConnector_data: TPT19006 Version 14.00.00.07
DataConnector_data Instance 1 restarting.
DataConnector_data: TPT19008 DataConnector Producer operator Instances: 1
DataConnector_data: TPT19003 ECI operator ID: DataConnector_data-1555
Teradata Parallel Transporter Load Operator Version 14.00.00.07
Insert_data: private log specified: <XXXXXXX>
Insert_data: connecting sessions
Insert_data: preparing target table
Insert_data: entering Acquisition Phase
DataConnector_data: TPT19222 Operator instance 1 processing file <XXXXXXXX>.   <--- This filename is unrelated to the job and is not mentioned in any script or variable file; why is it appearing here?
DataConnector_data: TPT19003 BUFFERMAXSIZE: 64260
DataConnector_data: TPT19221 Total files processed: 0.
Insert_data: disconnecting sessions
Insert_data: Total processor time used = '1.33 Second(s)'
Insert_data: Start : Thu Apr 24 11:44:13 2014
Insert_data: End   : Thu Apr 24 11:44:19 2014
Job step Load_data terminated (status 12)
Job <XXXXXX> terminated (status 12)
 
Thanks in advance
 

feinholz 1234 posts Joined 05/08
24 Apr 2014

I think the clue is here:
 
Found CheckPoint file: <XXXXXXXX>
This is a restart job; it restarts at step Load_data.
 
Even if/when you think you are running a new job, TPT thinks you are restarting/resuming a previous job and thus it uses a lot of information stored away in its checkpoint files.
 
You need to delete all of the checkpoint files (you can use the "twbrmcp" tool and give it your job name) and see if that helps clean everything up.
Use job "name", not job "id".
The "job id" is <jobname>-<unique number>.
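If you want to see what the cleanup does, here is a minimal Python sketch of the manual equivalent: deleting leftover checkpoint files whose names start with the job name. The checkpoint directory location and job name here are assumptions for illustration; "twbrmcp <jobname>" remains the supported way to do this.

```python
# Sketch of manually clearing TPT checkpoint files for a given job name.
# The checkpoint directory path is an assumption; on a real install, use
# the directory your TPT configuration points at (or just run twbrmcp).
import glob
import os

def remove_checkpoint_files(checkpoint_dir, job_name):
    """Delete leftover checkpoint files whose names start with the job name.

    Returns the list of file names that were removed.
    """
    removed = []
    for path in glob.glob(os.path.join(checkpoint_dir, job_name + "*")):
        os.remove(path)
        removed.append(os.path.basename(path))
    return removed
```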

--SteveF

mullasci 9 posts Joined 06/09
23 Sep 2014

I am getting exactly the same error (TPT19003) with a slightly more recent version of TPT.
In my case it is not a restart job.
How can I get TPT to continue loading the good data instead of stopping?

Teradata Parallel Transporter Version 14.00.00.08
Job log: /opt/teradata/client/14.00/tbuild/logs/DW2.ADK_TXN_Q2012_4_tpt_log-2877.out
Job id is DW2.ADK_TXN_Q2012_4_tpt_log-2877, running on XXXX-1-22
Teradata Parallel Transporter Load Operator Version 14.00.00.08
LoadOperator: private log not specified
Teradata Parallel Transporter FileReader: TPT19006 Version 14.00.00.08
FileReader Instance 4 directing private log report to 'tpt_file_reader_log-4'.
FileReader Instance 7 directing private log report to 'tpt_file_reader_log-7'.
FileReader Instance 8 directing private log report to 'tpt_file_reader_log-8'.
FileReader Instance 3 directing private log report to 'tpt_file_reader_log-3'.
FileReader Instance 5 directing private log report to 'tpt_file_reader_log-5'.
FileReader Instance 6 directing private log report to 'tpt_file_reader_log-6'.
FileReader Instance 1 directing private log report to 'tpt_file_reader_log-1'.
FileReader Instance 2 directing private log report to 'tpt_file_reader_log-2'.
FileReader: TPT19008 DataConnector Producer operator Instances: 8
FileReader: TPT19003 ECI operator ID: FileReader-8261
LoadOperator: connecting sessions
FileReader: TPT19222 Operator instance 5 processing file '/data/XXXDataLocal/DW2.ADK_TXN_Q2012_4.dat'.
FileReader: TPT19222 Operator instance 6 processing file '/data/XXXDataLocal/DW2.ADK_TXN_Q2012_4.dat'.
FileReader: TPT19222 Operator instance 1 processing file '/data/XXXDataLocal/DW2.ADK_TXN_Q2012_4.dat'.
FileReader: TPT19222 Operator instance 2 processing file '/data/XXXDataLocal/DW2.ADK_TXN_Q2012_4.dat'.
FileReader: TPT19222 Operator instance 7 processing file '/data/XXXDataLocal/DW2.ADK_TXN_Q2012_4.dat'.
FileReader: TPT19222 Operator instance 8 processing file '/data/XXXDataLocal/DW2.ADK_TXN_Q2012_4.dat'.
FileReader: TPT19222 Operator instance 4 processing file '/data/XXXDataLocal/DW2.ADK_TXN_Q2012_4.dat'.
FileReader: TPT19222 Operator instance 3 processing file '/data/XXXDataLocal/DW2.ADK_TXN_Q2012_4.dat'.
LoadOperator: preparing target table
LoadOperator: entering Acquisition Phase
FileReader: TPT19003 !ERROR! Buffer Mode expected row length, 123, does not match actual row length of 138
FileReader: TPT19003 !ERROR! Buffer Mode expected row length, 123, does not match actual row length of 138
FileReader: TPT19003 !ERROR! Buffer Mode expected row length, 123, does not match actual row length of 138
LoadOperator: disconnecting sessions
FileReader: TPT19221 Total files processed: 5.
LoadOperator: Total processor time used = '150.52 Second(s)'
LoadOperator: Start : Tue Sep 23 19:56:10 2014
LoadOperator: End   : Tue Sep 23 21:54:19 2014
Job step MAIN_STEP terminated (status 12)
Job DW2.ADK_TXN_Q2012_4_tpt_log terminated (status 12)

 

Thanks in advance.

feinholz 1234 posts Joined 05/08
24 Sep 2014

Is the data delimited format?
The DC operator can continue processing under certain conditions when processing delimited data.
If some records have more or fewer columns than are defined in the schema, then please read the documentation on AcceptExcessColumns and AcceptMissingColumns.
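As a quick pre-flight check before handing a delimited file to TPT, you can count the fields in each record against the schema's column count; records flagged here are the ones AcceptExcessColumns/AcceptMissingColumns would otherwise have to deal with. A minimal sketch (delimiter and expected column count are whatever your schema defines):

```python
def find_bad_records(lines, expected_cols, delimiter="|"):
    """Return (line_number, actual_field_count) for each record whose
    field count differs from the schema's column count."""
    bad = []
    for lineno, line in enumerate(lines, start=1):
        fields = line.rstrip("\r\n").split(delimiter)
        if len(fields) != expected_cols:
            bad.append((lineno, len(fields)))
    return bad
```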
 

--SteveF

Naveen_K 22 posts Joined 08/09
05 Jan 2015

Hi,
 
I get the same error when I try to populate the last field of a table with the file name. I am loading the files via a file list.
Please find my TPT script below.
 
DEFINE JOB FILE_LOAD
DESCRIPTION 'Load a Teradata table from a file'
(
  DEFINE SCHEMA LOADME_SCHEMA
  (
    COL1  VARCHAR(30),
    COL2  VARCHAR(30),
    COL3  VARCHAR(250) METADATA(FileName)
  );

  DEFINE OPERATOR DDL_OPERATOR
  TYPE DDL
  ATTRIBUTES
  (
    VARCHAR TdpId = @tdpid,
    VARCHAR UserName = @userid,
    VARCHAR UserPassword = @password,
    VARCHAR ErrorList = '3807'
  );

  DEFINE OPERATOR FILE_READER
  TYPE DATACONNECTOR PRODUCER
  SCHEMA LOADME_SCHEMA
  ATTRIBUTES
  (
    VARCHAR PrivateLogName = '/ngs/app/etld/work/naveen/loadmeprivatelog',
    VARCHAR Format = 'Delimited',
    VARCHAR TextDelimiter = '|',
    VARCHAR OpenMode = 'Read',
    VARCHAR DateForm = 'ANSIDATE',
    VARCHAR FileList = 'Y',
    VARCHAR FileName = @filename_var
  );

  DEFINE OPERATOR LOAD_OPERATOR
  TYPE LOAD
  SCHEMA LOADME_SCHEMA
  ATTRIBUTES
  (
    VARCHAR PrivateLogName = 'load_log',
    VARCHAR TdpId = @tdpid,
    VARCHAR UserName = @userid,
    VARCHAR UserPassword = @password,
    INTEGER TenacityHours = 4,
    INTEGER TenacitySleep = 6,
    INTEGER MaxSessions = 8,
    INTEGER MinSessions = 2,
    INTEGER ErrorLimit = 1,
    VARCHAR TargetTable = 'xbi_stg.LOADME',
    VARCHAR LogTable = 'xbi_stg.LG_LOADME',
    VARCHAR ErrorTable = 'xbi_stg.ET_LOADME',
    VARCHAR DateForm = 'ANSIDATE'
  );

  STEP drop_table
  (
    APPLY
      ('drop table xbi_stg.LG_LOADME;'),
      ('drop table xbi_stg.ET_LOADME;'),
      ('drop table xbi_stg.UV_LOADME;')
    TO OPERATOR (DDL_OPERATOR);
  );

  STEP TBL_CITY
  (
    APPLY
      ('INSERT INTO xbi_stg.LOADME (COL1, COL2, COL3)
        VALUES (:COL1, :COL2, :COL3);')
    TO OPERATOR (LOAD_OPERATOR[1])
    SELECT COL1, COL2, COL3 FROM OPERATOR (FILE_READER());
  );
);

 

When I run the following command, 

 

tbuild -r /ngs/app/etld/work/naveen -f /ngs/app/etld/work/naveen/loadme.ctl -R 80 -u " tdpid='****',userid='****',password='****',v_utf='UTF8',v_sessions=2 ,v_pack=600,v_dbname='****',v_tablename='****',v_outfile='loadme.out',v_lg_tablename='****',v_et_tablename='****',v_directorypath='****',v_delimiter_value='|',v_schemaName=**** ,filename_var='/ngs/app/etld/work/naveen/loadme.out'" -L /ngs/app/etld/LOADER/F2C/tgtfiles loadme_69041370_EDWDEV_4363

 

I get the following error

 

FILE_READER: TPT19003 Buffer Mode expected row length, 51, does not match actual row length of 49

 

The file list contains only one file.

 

Could you please provide some pointers for resolving this issue?

 

Regards,

Naveen K

 

feinholz 1234 posts Joined 05/08
05 Jan 2015

First of all, when looking at your script, I see this:
 
VARCHAR PrivateLogName='/ngs/app/etld/work/naveen/loadmeprivatelog',
 
The private log name is just the name of a virtual log inside a physical file.
It is not a full path specification of a physical disk file.
 
Next, on the command line I see a lot of settings of this nature:
 
v_utf='UTF8',v_sessions=2 ,v_pack=600,v_dbname='****',v_tablename='****',v_outfile='loadme.out',v_lg_tablename='****',v_et_tablename='****',v_directorypath='****',v_delimiter_value='|',v_schemaName=****
 
but I do not see them used anywhere in the script.
 
As for the error, I will look into it.
 
 

--SteveF

feinholz 1234 posts Joined 05/08
05 Jan 2015

Does the file name in the file list have an end-of-line marker?
If not, please add one and then try again.
 

--SteveF

Naveen_K 22 posts Joined 08/09
05 Jan 2015

Yes it does. I tried without the file list and even that throws the same error.
Thanks,
Naveen K

feinholz 1234 posts Joined 05/08
05 Jan 2015

Can you provide a sample of the data?
Each record of the file has to have an end-of-record marker.
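To verify the point above (every record, including the last, must be terminated by an end-of-record marker), a small sketch that checks whether a file's final byte is a newline:

```python
def ends_with_newline(path):
    """True if the file is non-empty and its last byte is a newline
    (i.e. the final record has an end-of-record marker)."""
    with open(path, "rb") as f:
        f.seek(0, 2)           # seek to end of file
        if f.tell() == 0:
            return False       # empty file: no records at all
        f.seek(-1, 2)          # back up one byte from the end
        return f.read(1) == b"\n"
```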
 

--SteveF

Naveen_K 22 posts Joined 08/09
05 Jan 2015

I am trying to load only one line. The file contains only this line, with a newline character at the end:
1234|1234

 

Naveen_K 22 posts Joined 08/09
05 Jan 2015

The TPT version is:
tbuild -V
Teradata Parallel Transporter Version 14.10.00.02

 

Naveen_K 22 posts Joined 08/09
06 Jan 2015

I tested the same script on another version of tbuild, and it works as expected there:
Teradata Parallel Transporter Version 13.10.00.05

But it's not working on the higher 14.10 version.

feinholz 1234 posts Joined 05/08
06 Jan 2015

I just researched this and found that it is fixed in 14.10.00.03.
Please download the latest 14.10 efix patches and try again.
Thanks!

--SteveF

Naveen_K 22 posts Joined 08/09
06 Jan 2015

Ok, thanks. It worked on TTU 14.10.00.07.

feinholz 1234 posts Joined 05/08
06 Jan 2015

Good to hear. Thanks for your patience.

--SteveF

Naveen_K 22 posts Joined 08/09
29 Jan 2015

Hi,
In continuation of this, I want to make use of a file list that contains a list of compressed files (.gz files). Would I be able to load .gz files using the file list?
 
Regards,
Naveen K

feinholz 1234 posts Joined 05/08
29 Jan 2015

That should be supported.
 

--SteveF

Naveen_K 22 posts Joined 08/09
29 Jan 2015

Yes, thank you.
