kishu 26 posts Joined 03/08
06 Nov 2008
Handling timestamp column during fast export

Hi, I am getting an error while running a FastExport script on a table with a TIMESTAMP column:

**** 08:33:59 UTY8713 RDBMS failure, 5407: Invalid operation on an ANSI Datetime or Interval value.

Please let me know how to get past this. Thanks.

gopinadh 15 posts Joined 10/08
08 Nov 2008

select current_timestamp from tablename;

joedsilva 505 posts Joined 07/05
09 Nov 2008

It doesn't help in giving a solution if we don't know what you are doing with that timestamp column in the FastExport script. Does the same SQL run in Queryman? Are you doing some conversion, addition, etc. on the timestamp column in the SQL? (Most likely you are trying to add something to it but not using the right INTERVAL data type.) If that doesn't help, please post the relevant SQL and describe what it is intended to do.
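For illustration, a sketch of the usual cause (the column and table names here are hypothetical): adding a bare number to a TIMESTAMP typically raises error 5407, whereas adding a typed INTERVAL works.

```sql
-- Typically fails with 5407: a bare numeric literal is not a valid DateTime operand
SELECT ts_col + 1 FROM mytable;

-- Works: state the unit explicitly with an INTERVAL literal
SELECT ts_col + INTERVAL '1' DAY    FROM mytable;
SELECT ts_col + INTERVAL '30' MINUTE FROM mytable;
```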

22 Jun 2014

TOP 10
ACCT_NBR ||'|'||
acct_type_cd ||'|'||
acct_sub_type_cd ||'|'||
ent_curr_acct_sts_cd ||'|'||
acct_subsrptn_eff_dt ||'|'||
curr_srv_accs_nbr ||'|'||
srv_accs_id ||'|'||
orgnl_srv_dt ||'|'||
src_sys_orgnl_srv_dt ||'|'||
curr_subsrptn_sts_cd ||'|'||
CURR_IMSI ||'|'||
CURR_IMEI ||'|'||
PRD_CD ||'|'||
PRD_DESC ||'|'||
PRD_CAT_CD ||'|'||
blng_eff_dt ||'|'||
blng_end_dt ||'|'||
entry_dt_tm (title '')
 *** Failure 5407 Invalid operation for DateTime or Interval.
                Statement# 1, Info =0
 *** Total elapsed time was 1 second.
After running the above in BTEQ, I'm getting the error shown. The data type of the column entry_dt_tm is TIMESTAMP, and it is this column that triggers the error. Please help me with the format.

dnoeth 4628 posts Joined 11/04
22 Jun 2014

There's no automatic typecast for TIME and TIMESTAMP, you must do it explicitly:



CAST(entry_dt_tm AS VARCHAR(20)) (title '')
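If a specific output layout is needed rather than the default, a FORMAT phrase can be combined with the cast. A sketch, assuming the common 'YYYY-MM-DDBHH:MI:SS' layout is wanted (B stands for a blank); the format string is just an example:

```sql
SELECT CAST(CAST(entry_dt_tm AS FORMAT 'YYYY-MM-DDBHH:MI:SS') AS CHAR(19)) (TITLE '')
FROM mytable;
```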


22 Jun 2014

Thank you so much. It did work.

22 Jun 2014

 *** Query completed. 22526460 rows found. One column returned.
 *** Total elapsed time was 3 seconds.
Using BTEQ, I'm trying to export about 22M records from the table to a flat file. It has been running for more than 4 hours and the export to the flat file is still in progress; the logs are rolling.
Does the export normally take this much time?

Raja_KT 1246 posts Joined 07/09
22 Jun 2014

Just an initial thought: how about your sessions, how many? Is it peak time? You can consider FastExport too.

Raja K Thaw

dnoeth 4628 posts Joined 11/04
23 Jun 2014

22M records is not that much, but it mainly depends on the overall size: 22M * 100 bytes vs. 22M * 10KB makes a big difference.
You should try to find the bottleneck:
BTEQ is single-threaded, so check whether one CPU is running at 100%; if it's less, it might be the network.
Also check the setting for RESPBUFLEN using .SHOW CONTROLS; if your network is reliable you can switch it to the maximum.
There's another setting, but it already defaults to ON.
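A minimal BTEQ sketch of the buffer check described above. The value 1048500 is an assumption; the permitted maximum depends on your BTEQ/TTU version, so verify it in the .SHOW CONTROLS output on your system:

```
.SHOW CONTROLS
.SET RESPBUFLEN 1048500
.SHOW CONTROLS
```

A larger response buffer means fewer request/response round trips per exported block, which mainly helps when the network is the bottleneck.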


23 Jun 2014

Thanks a lot once again. After updating the suggested values, the export to the flat file completed in about 4 hours for all 22,527,844 rows. Just wondering if there are additional changes that can be made so that the export completes in less than 4 hours.
Before making these changes yesterday, exporting just 0.6M records took nine hours.

30 Jun 2014

Exporting 16M records to the flat file took 3 hrs 17 minutes. Below is the file size:
-rw-r--r--   1 sbptest5 sbptest  3606302553 Jun 30 16:25 SBPSubsFromTLGdtls.txt
To implement this in production, we would need much better performance. Can someone please advise if there is a way to improve it? I would prefer to use BTEQ rather than FastExport.

dnoeth 4628 posts Joined 11/04
30 Jun 2014

Did you check what the bottleneck is, BTEQ itself?
Why do you prefer BTEQ over FExp?
To export delimited data the best choice is a TPT Export in delimited format; there's no need to manually concatenate all the columns.
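A sketch of such a TPT job using the standard $EXPORT and $FILE_WRITER operator templates. The job name, file name, and SELECT statement are placeholders, and connection details (TdpId, UserName, etc.) are assumed to come from a job-variables file; check attribute names against your TPT version:

```
DEFINE JOB export_delimited
(
  APPLY TO OPERATOR ( $FILE_WRITER()
    ATTRIBUTES ( FileName      = 'SBPSubsFromTLGdtls.txt',
                 Format        = 'Delimited',
                 TextDelimiter = '|' ) )
  SELECT * FROM OPERATOR ( $EXPORT()
    ATTRIBUTES ( SelectStmt = 'SELECT ACCT_NBR, entry_dt_tm FROM mytable;' ) );
);
```

Run with something like `tbuild -f export.tpt -v jobvars.txt`. The Delimited format writes each column pipe-separated, so the manual `||'|'||` concatenation and the TIMESTAMP cast issue both go away.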

