54 | 25 May 2015 @ 06:36 PDT | Database | Reply | Teradata valid date check without udf | Hi Dieter,
There is no fixed format. Currently that column has values such as 0, 50, AC, etc. We might get date values in the future; I am assuming those would be in yyyy-mm-dd format.
|
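A minimal sketch of one non-UDF check for the yyyy-mm-dd case mentioned above, assuming REGEXP_SIMILAR (Teradata 14.0+) is available; my_table and my_col are placeholder names:

-- First-pass pattern check before casting. Note that a value such as
-- '2015-02-30' matches the pattern but would still fail the CAST, so
-- this is a filter, not a complete validation.
SELECT my_col,
       CASE
           WHEN REGEXP_SIMILAR(TRIM(my_col), '[0-9]{4}-[0-9]{2}-[0-9]{2}', 'c') = 1
               THEN CAST(TRIM(my_col) AS DATE FORMAT 'YYYY-MM-DD')
           ELSE NULL
       END AS parsed_dt
FROM my_table;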
53 | 24 May 2015 @ 02:50 PDT | Database | Reply | Teradata valid date check without udf | I don't want to join with sys_calendar. Is there any other way?
|
52 | 24 May 2015 @ 06:15 PDT | Database | Topic | Teradata valid date check without udf | Hi All,
I have a column with data type CHAR(25) CHARACTER SET LATIN NOT CASESPECIFIC NOT NULL, but there is a chance the same column will hold date values as well. We need to check if t... |
51 | 24 May 2015 @ 06:08 PDT | Database | Reply | DROP COLUMN | Hi John,
The DROP COLUMN command will delete just that column and its data, not the rest of the table's data.
|
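For illustration, a minimal sketch; the database, table, and column names are placeholders:

-- Removes only col_b and its data; the remaining columns keep their rows.
ALTER TABLE my_db.my_table
DROP col_b;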
50 | 27 Mar 2015 @ 02:03 PDT | Database | Reply | Performance with pertitioned table | It won't increase the performance.
|
49 | 26 Mar 2015 @ 10:35 PDT | Database | Reply | Performance with pertitioned table |
Hi Moutusi,
You can try the same approach Dieter mentioned above. Try the below method:
EXTRACT(YEAR FROM T1.STRT_DT) >= CAST(T2.STRT_YR AS INTEGER)
|
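For context, a minimal sketch of that comparison with matching data types; tbl1, tbl2, and the column roles are assumptions since the original query isn't shown:

-- Year-to-year comparison with consistent types.
SELECT T1.*
FROM   tbl1 T1
JOIN   tbl2 T2
  ON   EXTRACT(YEAR FROM T1.STRT_DT) >= CAST(T2.STRT_YR AS INTEGER);

If STRT_DT is the partitioning column, comparing it directly against a DATE built from STRT_YR (instead of wrapping it in EXTRACT) may give the optimizer a better chance at partition elimination.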
48 | 26 Mar 2015 @ 09:57 PDT | Database | Reply | Order by Statement for Volatile Tables and Efficiency |
Hi Bob,
I am not clear on the problem. What is the issue in the above code, and what error message are you getting while creating the volatile table?
The only issue I can see in the above code is ... |
47 | 25 Mar 2015 @ 02:08 PDT | Database | Reply | Need help in generating sequence numbers in BTEQ script |
Hi Lalitha,
I am not clear on the requirement. If this sequence ID is for a table column, then you can try something like this:
SEQ_ID_COL BIGINT NOT NULL GENERATED... |
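Spelled out as a full definition (the line above is cut off), an identity column might look like this; all object names here are placeholders:

CREATE TABLE my_db.my_table (
    SEQ_ID_COL BIGINT GENERATED ALWAYS AS IDENTITY
        (START WITH 1 INCREMENT BY 1 MINVALUE 1 NO CYCLE) NOT NULL,
    some_col   VARCHAR(50)
)
PRIMARY INDEX (SEQ_ID_COL);

Keep in mind that Teradata identity values are guaranteed unique but not gap-free or strictly sequential.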
46 | 20 Mar 2015 @ 06:45 PDT | Database | Reply | Failure 3932 Only an ET or null statement is legal after a DDL Statement. |
KVB,
I tried it the same way and was able to run it successfully. Let me know how you executed it.
*** Logon successfully completed.
*** Teradata Database... |
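For reference, a minimal sketch of the pattern that avoids failure 3932 in Teradata (BTET) session mode: inside an explicit transaction, a DDL statement must be immediately followed by ET. Object names here are made up.

BT;
CREATE TABLE my_db.stage_tbl (col1 INTEGER);
ET;                                        -- the DDL must be followed by ET

INSERT INTO my_db.stage_tbl VALUES (1);    -- subsequent work runs normally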
45 | 20 Mar 2015 @ 05:40 PDT | Database | Reply | How SYSDATE will work if load is going more than one day | You will get a different date, since the system date on the next day will be that day's current date. You can also try CURRENT_DATE or DATE.
|
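A one-line illustration of the builtins mentioned above:

-- Both return the session's current date (DATE is the Teradata-style
-- builtin, CURRENT_DATE the ANSI form); a load running past midnight
-- will see the value roll over to the next day.
SELECT DATE, CURRENT_DATE;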
44 | 18 Mar 2015 @ 02:21 PDT | Database | Reply | Need suggestions to optimize the complex query (below) involving several Joins and Aggregation.. | Hi Sumit,
1) Can you try collecting stats individually on the below columns? I mean single-column stats. If multi-column stats are already collected, leave them as they are.
T9.MATERIAL_ID
T9.PLANT_ID
T9.END_DA... |
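As a sketch, the single-column collections could look like this; the real database/table name behind the T9 alias isn't shown, so my_db.t9_table is a placeholder:

-- One statement per column gives single-column stats; existing
-- multi-column stats can stay as they are.
COLLECT STATISTICS COLUMN (MATERIAL_ID) ON my_db.t9_table;
COLLECT STATISTICS COLUMN (PLANT_ID)    ON my_db.t9_table;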
43 | 18 Mar 2015 @ 01:18 PDT | Database | Reply | Non duplicate row selection query | Can you try including the DIRECT_ID column in the PARTITION BY as well, then rerun and check the performance?
|
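A sketch of what that could look like using Teradata's QUALIFY; my_table, key_col, and load_ts are stand-ins for the table and columns in the original query:

-- Keep one row per (key_col, DIRECT_ID) combination.
SELECT *
FROM   my_table
QUALIFY ROW_NUMBER() OVER (PARTITION BY key_col, DIRECT_ID
                           ORDER BY load_ts DESC) = 1;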
42 | 17 Mar 2015 @ 11:19 PDT | Database | Reply | Substitute Values from Predecing or following columns | Hi,
For the same question, Dnoeth has replied in the below link. Did you try using that?
http://forums.teradata.com/forum/database/ordered-analytical-functioniterative-loop-1
|
41 | 17 Mar 2015 @ 10:32 PDT | Database | Reply | Non duplicate row selection query | Hi John,
Apart from DIRECT_ID, are there any other columns you can use to differentiate between two records? For example, a date or timestamp column, etc.
When you are inserting to ... |
40 | 24 Feb 2015 @ 02:15 PST | Database | Reply | Remove the first 4 characters in a string | Hi Dvya,
I am not able to understand your question. Are you saying that if there is a matching offerid in tables a and b, it should return only those records?
|
39 | 24 Feb 2015 @ 01:31 PST | Database | Reply | Volatile query not working or a way to select a subset of the main query | Hi,
This should help to get a subset of the main query:
sel col1, amt from
(
sel col1, (col2 * col3) as amt from tbname
) aa;
|
38 | 24 Feb 2015 @ 12:29 PST | Database | Reply | Optimization of Insert....Select statement | Apoorv,
Can you give some more details?
-- What is the difference between the 2 inserts?
-- Are the 2 inserts selecting data from different base tables?
-- Usually, which insert will have more volum... |
37 | 24 Feb 2015 @ 12:14 PST | Database | Reply | Multiload Error Table Interpretation | Hi Abhi,
Can you share the error message you are getting? Are you loading it through Informatica?
|
36 | 03 Feb 2015 @ 11:47 PST | Database | Reply | Partition Row number usage | Saranya,
1) I see the below piece used twice. Maybe you are double-checking, but using it in only one place is advisable.
AND a.start_date BETWEEN '2015-01-01' AND '2015-01-02... |
35 | 03 Feb 2015 @ 11:04 PST | Database | Reply | Performance | Hi John,
For the above query where you are facing the spool space issue, I have a few questions:
1. Any idea why there is no alias used for TIME_PERIOD - EXTRACT(DAY FROM TIME_PERIOD)? Even for this ... |
34 | 09 Jan 2015 @ 01:18 PST | Database | Reply | ALTER TO CURRENT | Hi,
Instead of getting the entire result set, an interval is used to extract only a particular 'n' days of data, which is why your query plan changes. Regarding the time zones, I am not clear... |
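For illustration, a small hedged example of the kind of interval restriction described above; the table and column names are made up:

-- Only the last 7 days instead of the whole history; the narrower
-- predicate is what changes the plan.
SELECT *
FROM   my_db.my_table
WHERE  event_dt >= CURRENT_DATE - INTERVAL '7' DAY;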
33 | 09 Jan 2015 @ 11:58 PST | Database | Reply | Parameter Query in Excel ODBC | Hi Vinay,
I am not understanding this correctly. What are you trying to achieve by giving the parameter query string? The above syntax is not working correctly for me.
|
32 | 09 Jan 2015 @ 09:55 PST | Database | Reply | required to get the records for last 6 months | Hi Saranya,
Let me know if this works for you.
sel order_date FROM Database.tablename tb1,
(sel max(order_date) as mdate FROM Database.tablename) tb2
where order_date between add_month... |
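Since the post above is cut off, here is a hedged version of the full idea using ADD_MONTHS; my_db.orders is a placeholder table name:

-- Rows from the 6 months leading up to the latest order_date.
SELECT tb1.order_date
FROM   my_db.orders tb1,
       (SELECT MAX(order_date) AS mdate
        FROM   my_db.orders) tb2
WHERE  tb1.order_date BETWEEN ADD_MONTHS(tb2.mdate, -6) AND tb2.mdate;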
31 | 09 Jan 2015 @ 09:28 PST | Database | Reply | Difference between accessing single Partition and Multiple partitions | Hi KVB,
Ideally it should take less time if you query for a single month. What data volume do you have for each of the two months?
|
30 | 06 Jan 2015 @ 12:08 PST | Database | Reply | How to identify the unused stats in a query. | Hi Sreedhar,
You can collect stats on the index columns & join columns. It's not necessary to collect stats on all the columns in a table.
|