09 Mar 2006
Did you figure it out? Your post helped me work out how to extract with FORMAT FASTLOAD MODE INDICATOR and then use the file to load a table with a FastLoad script, thank you very much. Now, regarding your question: did you try converting the Unicode characters into something else in the FastExport script? I am not sure, but I suspect the Unicode values are interfering with the indicator bytes used by the file. I will try to work on this too. And once again, thanks for your post.
I have a FastExport script and a FastLoad script to sync a table between prod and dev. They run fine in an ASCII environment, but they fail when the table contains Unicode columns.

FastExport script:

.logtable user_work.qa_accounts_log ;
.logon prod_host/user,password;
.begin export sessions 4;
lock table dbname.tablename for access
select *
from dbname.tablename condition;
.export outfile tablename.dat format fastload mode indicator;
.end export;
.logoff;

FastLoad script:

sessions 16;
logon dev_host/user,passwd;
DROP TABLE working_env_id.UV_tablename ;
DROP TABLE working_env_id.ET_tablename ;
DELETE FROM dbname_env_id.tablename ;
set record formatted;
define file=tablename.dat;
begin loading dbname_env_id.tablename
errorfiles working_env_id.ET_tablename , working_env_id.UV_tablename
indicators
checkpoint 10000;
insert into dbname_env_id.tablename.*;
end loading;
logoff;

For UTF-8 I run fexp -c 'utf8' and fastload -c 'utf8', but FastLoad always fails with the message below:

**** 18:19:14 Number of recs/msg: 4
**** 18:19:14 Starting to send to RDBMS with record 1
**** 18:19:14 Bad file or data definition.
**** 18:19:14 The length of: TAX_EXPLANATION in row: 1 was greater than defined.
              Defined: 100, Received: 8224
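A short sketch of what may be going on, illustrated in Python rather than Teradata utilities (the numbers below are illustrative assumptions, not taken from your table): under a UTF8 session character set a character column can occupy more bytes on the wire than its declared character length, and the suspiciously exact "Received: 8224" happens to be 0x2020, i.e. two ASCII space bytes, which is the sort of value you see when a two-byte VARCHAR length field is read at the wrong offset because the record layout (indicator bytes plus field widths) no longer matches the table definition.

```python
import struct

# 1) A 100-character string can exceed 100 bytes once encoded as UTF-8,
#    so a data file laid out for the ASCII charset can overflow a
#    column defined with a 100-byte/100-char limit.
text = "caf\u00e9" * 25              # 100 characters, some non-ASCII
print(len(text))                     # character count: 100
print(len(text.encode("utf-8")))     # byte count is larger: 125

# 2) 8224 decimal is 0x2020 -- two ASCII space bytes. Interpreting two
#    space characters as a little-endian 2-byte length field yields
#    exactly the value FastLoad reported, which suggests it is reading
#    a length field from the wrong position in a misaligned record.
print(hex(8224))                           # 0x2020
print(struct.unpack("<H", b"  ")[0])       # 8224
```

If that is the cause, the usual remedies are to use the same character set on both the export and the load (as you already do with -c 'utf8') and to make sure the target table's column definitions are wide enough, in bytes, for the UTF-8-expanded data.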