# | Date | Forum | Type | Thread | Post
22 | 16 Mar 2006 @ 01:59 PST | UDA | Reply | Default spaces
It depends on how you define the field. If it is defined as field1 CHAR(25), then the field length is 25 no matter what data you put in it. If you define the field as VARCHAR(25), then the field length...
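A minimal sketch of that difference; the table name char_vs_varchar_demo and the sample value are made up for illustration:

    -- CHAR(25) is always stored and returned blank-padded to 25 characters,
    -- while VARCHAR(25) keeps only the characters actually supplied.
    CREATE TABLE char_vs_varchar_demo
    (
      id        INTEGER NOT NULL,
      fixed_fld CHAR(25),
      var_fld   VARCHAR(25)
    )
    PRIMARY INDEX (id);

    INSERT INTO char_vs_varchar_demo VALUES (1, 'abc', 'abc');

    SELECT CHARACTER_LENGTH(fixed_fld) AS fixed_len,  -- 25 (padded with trailing blanks)
           CHARACTER_LENGTH(var_fld)   AS var_len     -- 3
    FROM char_vs_varchar_demo;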
21 | 13 Mar 2006 @ 12:51 PST | Tools | Reply | import data from text file
Could you please post the complete SQL that you are using? Also, are you using BTEQ to do this, or FastExport and FastLoad? Try to cast the fields in your SELECT while writing to a f...
20 | 10 Mar 2006 @ 10:21 PST | Database | Reply | Spool Space Error
Could you send the query that you are using? There could be a lot of reasons why you get a spool error. Also check the EXPLAIN to see how the rows are getting distributed.
19 | 09 Mar 2006 @ 11:30 PST | Tools | Reply | fastexport and fastload for utf8 table
Did you figure it out? Your post helped me figure out how to extract with FORMAT FASTLOAD and MODE INDICATORS and then use the output to load a table with a FastLoad script. Thank you very much. Now regardin...
18 | 09 Mar 2006 @ 11:20 PST | Tools | Reply | Fast Export
This is very strange; I am not sure why we get that extra character after casting each field in the SELECT statement. The last post, which has cast( cast(......) as (sum of all the field lengths)), should wor...
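A sketch of the pattern that reply refers to; the database, table, columns, and lengths are invented, and the outer CHAR(30) is simply the sum of the inner field lengths (10 + 15 + 5):

    -- Cast every field to a fixed CHAR, concatenate, then cast the whole record
    -- to one fixed CHAR so the export file carries no variable-length prefix.
    SELECT CAST(
             CAST(col1 AS CHAR(10)) ||
             CAST(col2 AS CHAR(15)) ||
             CAST(col3 AS CHAR(5))
           AS CHAR(30)) AS export_rec
    FROM mydb.mytab;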
17 | 06 Mar 2006 @ 12:28 PST | Database | Topic | fastexport to fastload
Gurus and all, I am using FastExport to write data to a file with the default MODE = INDICATORS and the default FORMAT = FASTLOAD. Now I am not sure how to use this output file in the FastLoad script ...
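A rough sketch of the two scripts for this situation; table, column, file, and logon values are placeholders, and the exact options should be checked against the FastExport and FastLoad manuals for the release in use:

    /* FastExport side: the defaults give INDICATORS mode and FASTLOAD format. */
    .LOGTABLE mydb.fexp_log;
    .LOGON tdpid/user,password;
    .BEGIN EXPORT SESSIONS 4;
    .EXPORT OUTFILE exported.dat;
    SELECT col1, col2
    FROM   mydb.src_tab;
    .END EXPORT;
    .LOGOFF;

    /* FastLoad side: mydb.tgt_tab must already exist and be empty, and the
       DEFINE list must match the exported SELECT column by column. */
    SESSIONS 4;
    .LOGON tdpid/user,password;
    SET RECORD FORMATTED;              /* FastLoad-format records, the default */
    DEFINE col1 (INTEGER),
           col2 (CHAR(10))
    FILE = exported.dat;
    BEGIN LOADING mydb.tgt_tab
          ERRORFILES mydb.tgt_err1, mydb.tgt_err2;
    INSERT INTO mydb.tgt_tab (col1, col2)
    VALUES (:col1, :col2);
    END LOADING;
    .LOGOFF;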
16 | 01 Mar 2006 @ 08:05 PST | Database | Reply | Secondary Index
Use the PI to distribute rows evenly, and use a field that is accessed very often. If you have additional fields in the table that are used very often in queries, create a secondary index. Or if yo...
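A small DDL sketch of that advice; all names are hypothetical:

    -- PI on a column that both distributes evenly and is used for access;
    -- a named NUSI on another column that appears often in query predicates.
    CREATE TABLE mydb.personnel
    (
      emp_no    INTEGER NOT NULL,
      dept_no   INTEGER,
      last_name VARCHAR(30)
    )
    PRIMARY INDEX (emp_no);

    CREATE INDEX dept_nusi (dept_no) ON mydb.personnel;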
15 | 01 Mar 2006 @ 07:57 PST | Database | Topic | Loading 50 million rows.
We are planning to load 50 million rows (assuming an average row size) into a Teradata table. The data is currently available on DVDs (roughly 15 of them). Do you guys know of any precautions...
14 | 13 Feb 2006 @ 10:08 PST | Database | Topic | Import from MS XLS (Excel) file to Teradata table
I am trying to import data into a Teradata table from an Excel file; is this possible? I can save the Excel file as a tab-delimited text file and then import it from Queryman (SQL Assistant); that works....
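One scripted alternative, sketched with made-up names: save the worksheet as a delimited text file (a pipe delimiter is shown because it is easier to read in a script than a tab) and load it through BTEQ's .IMPORT; with VARTEXT, every field in the USING clause must be VARCHAR:

    .LOGON tdpid/user,password;

    .IMPORT VARTEXT '|' FILE = exported_from_excel.txt;
    .REPEAT *
    USING (c_id VARCHAR(10), c_name VARCHAR(50))
    INSERT INTO mydb.tgt_tab (id_col, name_col)
    VALUES (:c_id, :c_name);

    .QUIT;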
13 | 09 Feb 2006 @ 05:46 PST | Database | Reply | Update using the result of a SELECT
I am not sure about the table names and the field names that you have, but I guess this is how you have to write this query: update cpu_stat_matrix from (select sum(TotalCpuTime) from DBC.QryLog WHERE C...
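The truncated query above appears to be Teradata's UPDATE ... FROM form with a derived table; a sketch of that shape, with the join column, the grouping column (UserName), and the target column names guessed for illustration:

    -- Join the target table to a one-row-per-user aggregate of the query log
    -- and copy the summed CPU time across.
    UPDATE tgt
    FROM cpu_stat_matrix tgt,
         ( SELECT UserName,
                  SUM(TotalCpuTime) AS total_cpu
           FROM DBC.QryLog
           GROUP BY UserName
         ) src
    SET cpu_time = src.total_cpu
    WHERE tgt.user_name = src.UserName;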
12 | 09 Feb 2006 @ 11:28 PST | Database | Reply | explain plan
This needs a long discussion, but anyway, if you run the EXPLAIN, check for phrases like "by way of the primary index" (rows are read using the primary index column(s)) and "by way of index number" (rows are read u...
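A quick illustration of where those phrases show up; the table and predicate are hypothetical:

    -- Prefix any query with EXPLAIN and read the returned plan text for the
    -- access phrases listed above.
    EXPLAIN
    SELECT last_name
    FROM   mydb.personnel
    WHERE  emp_no = 1001;
    -- If emp_no is the table's primary index, the plan describes a single-AMP
    -- retrieve "by way of the primary index"; otherwise expect an index step
    -- or an all-rows scan.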
11 | 09 Feb 2006 @ 11:20 PST | Database | Reply | influence of record length on performance
Spool is nothing but blocks of space that can be used. It does not matter if it is a big record or a small record; as long as the data can fit in the available spool, it should work. Once it exceeds ...
10 | 09 Feb 2006 @ 11:13 PST | Database | Reply | can insert the derived table in Between clause?
Manoj, I am not sure why a SELECT is not allowed in the BETWEEN clause, but it is not. The option that you had, "where date_val >= (select .....) and date_val...
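A common workaround, sketched with invented names: compute both bounds in a one-row derived table and join to it, since a SELECT cannot sit directly inside BETWEEN ... AND ...:

    SELECT t.*
    FROM mydb.fact_tab t,
         ( SELECT MIN(date_val) AS lo_dt,   -- lower bound
                  MAX(date_val) AS hi_dt    -- upper bound
           FROM mydb.date_ctl
         ) d
    WHERE t.date_val BETWEEN d.lo_dt AND d.hi_dt;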
9 | 08 Feb 2006 @ 01:36 PST | Database | Reply | How to specify a date range (Date + Hour) in the query
I am not sure if I understood your question correctly, but if you want the timestamps to be dates instead, you can cast them to DATE: select sum(TotalCpuTime) from DBC.QryLog WHERE cast(LogonDateTime as ...
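A sketch of how that cast is typically used; the date range is made up, and the column names are the ones quoted in the reply:

    SELECT SUM(TotalCpuTime)
    FROM   DBC.QryLog
    WHERE  CAST(LogonDateTime AS DATE)     -- compare on the date part only
           BETWEEN DATE '2006-02-01' AND DATE '2006-02-07';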
8 | 08 Feb 2006 @ 01:24 PST | Database | Reply | Month difference
date '2006-01-01' - date '2006-03-01' is not 2, it is -2. Try this: select date '2006-03-01' - date '2006-01-01' month as ActualMonths. The result will be 2, but the problem with this operation is select...
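A sketch of the two forms of date subtraction; MONTH(4) widens the default leading precision so larger spans do not overflow:

    SELECT (DATE '2006-03-01' - DATE '2006-01-01') MONTH(4) AS ActualMonths;  -- interval of 2 months
    SELECT  DATE '2006-03-01' - DATE '2006-01-01'           AS DayCount;      -- plain DATE subtraction: 59 days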
7 | 07 Feb 2006 @ 12:25 PST | Database | Reply | influence of record length on performance
Yes, there is a difference, because the data for a long record might run into multiple blocks, whereas the data for a small record might be in just one block. This way, when you do a select...
6 | 06 Feb 2006 @ 01:14 PST | UDA | Reply | UNION limit
We had a similar issue in our warehouse, and this is what we did to solve it: modify the 'MaxParseTreeSegs' setting from 1000 to 3000. We had this...
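That value lives in DBS Control, so the change itself is a dbscontrol session along these lines; the group shown here (performance) and the field number vary by release, so check the DISPLAY output first and treat <n> as a placeholder:

    display performance              (locate MaxParseTreeSegs and note its field number)
    modify performance <n> = 3000    (set the new value)
    write                            (commit the change to the DBS Control record)
    quit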
5 | 03 Feb 2006 @ 10:45 PST | Database | Reply | Please help me with BTEQ scripts
Could you post the query and some sample data? That will help.
4 | 24 Jan 2006 @ 12:14 PST | Database | Reply | Tablename can be defined in where clause
This is something we should be very careful about while writing queries. What happened in the query is that personnel1 is considered an alias for personnel, and it works without error. Similarly, if we have...
3 | 23 Jan 2006 @ 06:43 PST | Database | Reply | avg on time field not working?
Try this: select CHAINE, avg(cast(trim(duree) || ':00' as interval hour(4) to second(0))) from dwhtrf.PO_SUIVI_DICZ group by CHAINE
2 | 12 Jan 2006 @ 10:51 PST | UDA | Reply | Grouping sets
Apukad (if you already got the results, let me know if there is a good way), I am not sure if this will work for you or not. I tried this example and could get results by using derived tables. In my e...
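A sketch of the general idea on a release without GROUPING SETS, using an invented sales table (string columns assumed VARCHAR(20)): produce each grouping in its own SELECT and glue them together with UNION ALL; the derived-table variant mentioned in the reply is the same idea:

    SELECT region,
           CAST(NULL AS VARCHAR(20)) AS product,   -- placeholder for the missing grouping column
           SUM(amount)               AS total_amt
    FROM   mydb.sales
    GROUP BY region
    UNION ALL
    SELECT CAST(NULL AS VARCHAR(20)),
           product,
           SUM(amount)
    FROM   mydb.sales
    GROUP BY product;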
1 | 11 Jan 2006 @ 01:44 PST | UDA | Topic | compress Timestamp(6)
Is there a way to compress a TIMESTAMP(6) field? We have an exp_ts field that gets a default value of 9999-12-31 00:00:00.000000. I want to compress this field; is this possible? If it is possible, how to ...
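Whether this works depends on the Teradata release: older releases reject COMPRESS on TIMESTAMP columns, while newer ones accept multi-value compression on them. A sketch of what the column definition would look like where it is supported; the table name and the rest of the DDL are invented, and the exp_ts default is the value from the post:

    CREATE TABLE mydb.some_tab
    (
      id     INTEGER NOT NULL,
      -- compress the single dominant default value of exp_ts
      exp_ts TIMESTAMP(6)
             DEFAULT TIMESTAMP '9999-12-31 00:00:00.000000'
             COMPRESS (TIMESTAMP '9999-12-31 00:00:00.000000')
    )
    PRIMARY INDEX (id);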
