SebastianE 4 posts Joined 07/12
12 Jul 2012
TPT: AcceptExcessColumns with strange behaviour in FILE_READER

Hi everybody,

I'm currently evaluating the use of TPT for load jobs in a DWH environment. One of the basic problems is that we often have a varying number of trailing blanks at the end of each data row in the input file. This can't easily be changed because the files are delivered by several different systems.

With TPT 13.10 there was no way to load these files; in the newer TPT release the attribute AcceptExcessColumns has been introduced.

I'm now facing three problems with that attribute:

  1. It seems to work only with delimited input files, while we usually have fixed-length input files. The only workaround I see is to define a one-column delimited schema and use SUBSTR to separate the individual columns during the load. Is there any way to make it work with SourceFormat='Text'?
  2. With AcceptExcessColumns='Yes' the data is loaded correctly, but the log contains "TPT19350 I/O error on file <input_file>" for every row that is longer than the schema definition. Why is there an error message when the row is processed correctly?
  3. The documented attribute value AcceptExcessColumns='YesWithoutLog', which I hoped would solve issue 2, leads to "FILE_READER: TPT19202 Attribute 'AcceptExcessColumns' value length is incorrect. Length is 13, maximum is 8". Is there an undocumented value for the attribute that fits the length restriction?
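For reference, a minimal sketch of the workaround described in point 1 might look like the following TPT job fragment. All object names, paths, lengths, and column positions are illustrative assumptions, not a tested script:

```sql
/* Sketch only: read each fixed-length record as a single delimited
   column, then split it with SUBSTR in the APPLY step.
   Paths, names, and lengths below are hypothetical. */
DEFINE SCHEMA one_col_schema
(
    full_row VARCHAR(200)   /* the whole record as one column */
);

DEFINE OPERATOR file_reader
TYPE DATACONNECTOR PRODUCER
SCHEMA one_col_schema
ATTRIBUTES
(
    VARCHAR DirectoryPath       = '/data/in',        /* assumption */
    VARCHAR FileName            = 'input_file.txt',  /* assumption */
    VARCHAR Format              = 'Delimited',
    VARCHAR TextDelimiter       = '|',  /* any char not present in the data */
    VARCHAR AcceptExcessColumns = 'Yes' /* tolerate trailing blanks */
);

APPLY
(
    'INSERT INTO target_table (col1, col2) VALUES (
        SUBSTR(:full_row, 1, 10),
        SUBSTR(:full_row, 11, 20));'
)
TO OPERATOR ( load_operator )       /* e.g. a LOAD or STREAM operator */
SELECT * FROM OPERATOR ( file_reader );
```

Since the file has no real delimiter, the chosen TextDelimiter never matches, so each record arrives as one column and the SUBSTR calls carve out the fixed-length fields.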

Any help would be great!

Kind regards

feinholz 1234 posts Joined 05/08
12 Jul 2012

#2 and #3 are bugs we are currently fixing.

For #1, AcceptExcessColumns was originally intended only for "delimited" data.

We will not be supporting that feature for the other record formats in 14.10, but will revisit the issue in the next release.



SebastianE 4 posts Joined 07/12
16 Jul 2012

Thank you very much for the quick reply!
