jklee 40 posts Joined 07/06
30 Oct 2006
Viewing Fastload Errors

I've tried using BTEQ to export the data from the _e1 error table that FastLoad creates, but the data comes out in what I believe is FastLoad format, which doesn't make it very easy for me to analyze. Is there any way to convert this back to the format of the original file? The original file is too large to pull the record out of (not on Unix, so grep is not an option).

Here's what I did in BTEQ:

BTEQ -- Enter your DBC/SQL request or BTEQ command:
.set recordmode on
BTEQ -- Enter your DBC/SQL request or BTEQ command:
.export data file=weblog_0911_2683.txt
 *** To reset export, type .EXPORT RESET
BTEQ -- Enter your DBC/SQL request or BTEQ command:
select * from staging.weblog_e1 where errorcode=2683;
 *** Success, Stmt# 1 ActivityCount = 19
 *** Query completed. 19 rows found. 3 columns returned.
 *** Total elapsed time was 1 second.

Here's what the file looks like:

[Y9Gÿÿÿÿê? ÿÿÿÿÿÿÿÿÿÿÿÿý 1158001696 ss zh-tw 2006-09-11 15:08:16 4851137037954209665 726780682063314945 N 140.129.59.16 0 * http://www ...

The original file was tab-delimited.

Thanks,
James

Fred 1096 posts Joined 08/04
30 Oct 2006

Export only the DataParcel column (don't use SELECT *), then remove (or ignore) the first two bytes from each record in the export file.

jklee 40 posts Joined 07/06
30 Oct 2006

using FastExport?

Fred 1096 posts Joined 08/04
31 Oct 2006

I would use BTEQ (RECORDMODE, DATA options as in your example) because the table should be small. But FastExport (FORMAT UNFORMAT, MODE RECORD) would work too. Either way, since DataParcel is VARBYTE, the exported file will have a two-byte unsigned binary length field added to the beginning of each record.
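For what it's worth, stripping those two-byte length fields is easy to script. Here's a minimal Python sketch (an illustration, not part of either tool — it assumes a little-endian client such as BTEQ on Windows/Intel, and that the export contains only the single VARBYTE column):

```python
import struct

def split_varbyte_records(data, little_endian=True):
    """Split the raw bytes of a BTEQ DATA-mode export of a single
    VARBYTE column into records.  Each record is a two-byte unsigned
    length followed by that many bytes of payload (the DataParcel)."""
    fmt = "<H" if little_endian else ">H"
    records, pos = [], 0
    while pos + 2 <= len(data):
        (length,) = struct.unpack_from(fmt, data, pos)
        pos += 2
        records.append(data[pos:pos + length])
        pos += length
    return records

def read_varbyte_export(path):
    # Reading the whole file into memory is fine here: the error
    # table is small even when the original input file is huge.
    with open(path, "rb") as f:
        return split_varbyte_records(f.read())
```

Each element of the returned list is one DataParcel with the leading length field already removed.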

j355ga 100 posts Joined 12/05
31 Oct 2006

If you use SQL Assistant you can browse the records. SQL Assistant will place the data-parcel in a text file and open it when you click the data-parcel button. At least it does with the version we are using: 7.1.0

Jeff

jklee 40 posts Joined 07/06
31 Oct 2006

I've tried both those approaches, but the value in the data parcel itself looks to be encoded; it all comes out in hex. I converted it back to ASCII, but I still get a bunch of strange characters.

j355ga 100 posts Joined 12/05
31 Oct 2006

That's as close as you can get - unless you want to write a program that will convert binary values to ascii. I've done it before but it's an ugly time-consuming process.

Jeff

jklee 40 posts Joined 07/06
31 Oct 2006

Thanks everyone. I appreciate the responses. Think I'll scour the web for a binary-to-ASCII converter...

Fred 1096 posts Joined 08/04
31 Oct 2006

Oh, I didn't pick up on the fact that your input was FORMAT VARTEXT with TAB delimiters. The DataParcel field is the record sent to Teradata, but that is after the FastLoad client converted the original text to VARCHAR fields, more or less in "FastLoad format" as you guessed: a two-byte length for the record, a two-byte length of the first text value, the first text field, a two-byte length of the second text value, the second text field, etc.

I suppose you could EXPORT the DataParcel field to a file using BTEQ or FastExport, then FastLoad it back to a table with all fields defined as VARCHAR, then EXPORT from that table as tab-delimited using SQL Assistant.
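If writing a small converter sounds more appealing than the export/reload round trip, the parcel layout described above is simple enough to decode directly. A rough Python sketch (assumptions: little-endian length fields, a single-byte text encoding, and that the outer record length added by the export has already been stripped, leaving one DataParcel per record):

```python
import struct

def parcel_to_line(parcel, delimiter="\t", encoding="latin-1"):
    """Rebuild the original delimited text line from a FastLoad
    DataParcel produced by a FORMAT VARTEXT load: the parcel is a
    sequence of VARCHAR fields, each stored as a two-byte unsigned
    length followed by that many bytes of text."""
    fields, pos = [], 0
    while pos + 2 <= len(parcel):
        (length,) = struct.unpack_from("<H", parcel, pos)
        pos += 2
        fields.append(parcel[pos:pos + length].decode(encoding))
        pos += length
    return delimiter.join(fields)
```

Running each exported DataParcel through this and writing the results out should reproduce the original tab-delimited lines for the rejected rows.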

jklee 40 posts Joined 07/06
31 Oct 2006

I was hoping there was a more streamlined way of doing that - sort of an "unformat" function. Thanks again.

satish123 3 posts Joined 11/06
09 Nov 2006

I am new to this forum. Could someone help me with the following question.

Col1 Col2 Col3 Col4
a    ab   abc  a4
a    ac   bcd  b4
...

There are 5 different values in Col1, 25 different values in Col2, 47 different values in Col3, and 188 different values in Col4. The question is: how would you give a unique sequence number to all entries in each of Col1 - Col4, which are linked by a relationship in a table?

Thanks
Satish

jklee 40 posts Joined 07/06
08 Dec 2006

Ok, I decided to go this route. I have exported the records (just the DataParcel column) using BTEQ as follows:

.export data file="c:\bteq_temp\err.dat"
select dataparcel from staging.global_omniture_hit_e1;
.export reset

All that works fine. Where I have a problem is in bringing the data back in. I took the column list from the original FastLoad job and added a dummy field to the start, as follows:

.import data file="c:\bteq_temp\err.dat"
.repeat 9
using
(dummy varchar(5),
 hit_time_gmt varchar(11),
 service_txt varchar(2),
 accept_language varchar(20),
 omniture_date_time varchar(23),
 visid_high varchar(20),
 ... )
select :visid_high, :visid_low, :visit_num, :visit_page_num;

I get the following response:

 *** Starting Row 0 at Fri Dec 08 11:00:34 2006
 *** Failure 2673 The source parcel length does not match data that was defined. Statement# 1, Info =1
 *** Total elapsed time was 1 second.
 *** Growing Buffer to 1169
 *** Failure 2673 The source parcel length does not match data that was defined. Statement# 1, Info =1
 *** Total elapsed time was 1 second.
...

The original FastLoad job specified that the file is VARTEXT, so I assume all the fields within the data parcel are VARTEXT fields, as opposed to the datatypes in the target table for the FastLoad job. I've also tried this with the dummy variable defined as a SMALLINT, but I get the same results.

Fred 1096 posts Joined 08/04
09 Dec 2006

There should not be a "dummy field" in the USING, just the VARCHAR fields you had in your FastLoad DEFINEs. (Anything BTEQ adds to the records in EXPORT will be handled automatically on IMPORT.)
