All Forums > Database
aneelkumar04 22 posts Joined 07/11
19 Jul 2016
FastLoad Duplicate Records Issue

Hi All,
 
I am loading data from a file into the database using FastLoad. My job aborted due to duplicate records, but when I checked the file there were no duplicate records in it. I am confused. How is this possible?
 
Log:
Total Records Read     = 2987
Total Error Table 1    = 0  Table has been dropped
Total Error Table 2    = 0  Table has been dropped
Total Inserts Applied  = 2982
Total Duplicate Rows   = 5
 
 
 

AtardecerR0j0 71 posts Joined 09/12
20 Jul 2016

You can load the data again into a NOPI table, so that the duplicate rows get loaded as well. Then you can use SQL to find the duplicate rows.

Be More!!
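A minimal sketch of that NOPI approach, assuming a scratch database named STAGE_DB and invented column names (COL1, COL2, COL3) standing in for the real file layout; the actual DDL should mirror the production table, minus the primary index:

```sql
-- NOPI staging table: FastLoad into it keeps every row, duplicates included,
-- because there is no primary index for the duplicate check to run against.
CREATE TABLE STAGE_DB.MYTABLE_STG (
    COL1 INTEGER,
    COL2 VARCHAR(50),
    COL3 DATE
) NO PRIMARY INDEX;

-- After loading the same file here, list the rows that occur more than once.
SELECT COL1, COL2, COL3, COUNT(*) AS DUP_CNT
FROM STAGE_DB.MYTABLE_STG
GROUP BY COL1, COL2, COL3
HAVING COUNT(*) > 1;
```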

aneelkumar04 22 posts Joined 07/11
20 Jul 2016

Hi,
 
Thanks for the reply, but it is a production database.

M.Saeed Khurram 544 posts Joined 09/12
20 Jul 2016

Hi,
FastLoad does not load full-row duplicate records; rather, it puts them in a separate error table by default.
As per the log, there are 5 duplicates but no rows in the error tables, so one possible reason is duplicate records on the UPI.
Can you please copy the DDL of this table?
 

Khurram

CarlosAL 512 posts Joined 04/08
20 Jul 2016

Hi.
Fastload DOES NOT put full duplicate rows in an ERROR TABLE. Fastload puts UNIQUE PRIMARY INDEX violations in ERROR TABLE 2.
To the OP:
You can check for duplicates in the file using something like:
$ sort -f <your file> | uniq -icd
Note the '-i' flag, which makes the comparison case-insensitive: case differences can be the cause of your "duplicates". The '-f' flag on sort folds case the same way, so case variants land on adjacent lines where uniq can see them.
HTH.
Cheers.
Carlos.
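To illustrate the command above, here is a throwaway demonstration (file name and contents invented) of how two records that differ only in case slip past an exact-match check but are caught by the case-insensitive one:

```shell
# Build a small sample file where two records differ only in case.
printf 'ALPHA,1\nbeta,2\nAlpha,1\ngamma,3\n' > sample.txt

# Exact-match check: prints nothing, because no two lines are byte-identical.
sort sample.txt | uniq -d

# Case-insensitive check: sort -f folds case so the variants become adjacent,
# and uniq -icd reports the group with a count of 2.
sort -f sample.txt | uniq -icd
```

This is exactly the scenario where the file "has no duplicates" to the eye, yet the loader treats the rows as duplicates if the target columns are compared case-insensitively.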
 
 

aneelkumar04 22 posts Joined 07/11
20 Jul 2016

Hi CarlosAL,
Thank you for the reply. I tried your command, but I didn't find any duplicate records.
Thanks in advance.

M.Saeed Khurram 544 posts Joined 09/12
20 Jul 2016

As you said earlier, the FastLoad script got aborted. FastLoad only loads into an empty table.
Can you check whether any data was left in the target table?
Otherwise, the best option is to use a NOPI table in a temporary database, load the file there, and check for issues.

Khurram

aneelkumar04 22 posts Joined 07/11
26 Jul 2016

I got the solution. Thanks for your valuable time.
