3203 | 15 Oct 2014 @ 10:21 PDT | Tools | Reply | View Joining two fields of a table | yearmonth is usually calculated using yearcol * 100 + monthcol as yearmonth
But why can't you simply use two conditions? WHERE yearcol = 2014 and monthcol = 10 will allow the... |
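A minimal sketch of both variants mentioned above (yearcol, monthcol and the table name sales_tbl are placeholders, not from the original thread):
-- computed yearmonth column
SELECT t.*, yearcol * 100 + monthcol AS yearmonth
FROM sales_tbl t;
-- or simply filter on the two columns directly
SELECT *
FROM sales_tbl
WHERE yearcol = 2014
  AND monthcol = 10;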
3202 | 15 Oct 2014 @ 10:05 PDT | Database | Reply | Transpose data from wide table to long table | Ulrich's code avoids the UNION and is the most efficient solution before TD14.10, which added the TD_UNPIVOT table function:
SELECT customerid, id
FROM TD_UNPIVOT
(
ON (SELECT * FROM vT... |
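A hedged sketch of a full TD_UNPIVOT call in TD14.10; the view vT and all column names below are assumptions for illustration only:
SELECT customerid, id, attr_name, attr_value
FROM TD_UNPIVOT
(
  ON (SELECT * FROM vT)
  USING
  VALUE_COLUMNS('attr_value')
  UNPIVOT_COLUMN('attr_name')
  COLUMN_LIST('col1', 'col2', 'col3')  -- the wide columns to be turned into rows
) AS dt;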
3201 | 14 Oct 2014 @ 11:24 PDT | Database | Reply | Remove Chars and get only numeric values | There's an old trick utilizing nested oTranslate: remove all the characters you want to keep, then use the result as the list of characters to be removed:
oTranslate(x, oTranslate(x, '0123456789','... |
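Spelled out, the nested call might look like this (your_table and column x are placeholders); the inner oTranslate strips the digits from x, and whatever is left is exactly the set of characters the outer call then removes:
SELECT oTranslate(x, oTranslate(x, '0123456789', ''), '') AS digits_only
FROM your_table;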
3200 | 14 Oct 2014 @ 11:20 PDT | Database | Reply | Separation of columns into rows | Before TD14 it's much more complicated; see a solution I posted at:
https://forums.teradata.com/forum/database/stored-procedures-need-help-with-massaging-data-within-an-input-parameter
|
3199 | 11 Oct 2014 @ 03:36 PDT | Database | Reply | Separation of columns into rows | What's your Teradata release?
In TD14 there's STRTOK_SPLIT_TO_TABLE, which does exactly what you need:
SELECT *
FROM TABLE (STRTOK_SPLIT_TO_TABLE(your_table.id,your_table.demog,';... |
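A possible complete form of that call; the RETURNS column names and data types below are assumptions and must match your actual data:
SELECT dt.outkey, dt.tokennum, dt.token
FROM TABLE
     (STRTOK_SPLIT_TO_TABLE(your_table.id, your_table.demog, ';')
      RETURNS (outkey INTEGER, tokennum INTEGER, token VARCHAR(100) CHARACTER SET UNICODE)
     ) AS dt
ORDER BY dt.outkey, dt.tokennum;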
3198 | 11 Oct 2014 @ 02:47 PDT | Database | Reply | Analytic function | I assume you meant range between interval '1' month preceding and '1' month preceding instead of rows. The RANGE keyword is not implemented in Teradata (probably due to possible perf... |
3197 | 10 Oct 2014 @ 02:45 PDT | Database | Reply | Performance of Correlated Subquery!!! | Hi Anish,
the TD optimizer can't do anything other than rewrite a CS to a join (in the worst case a cross/product join); a Hash Join would be the closest to lookup processing.
Of course there mig... |
3196 | 10 Oct 2014 @ 12:50 PDT | Database | Reply | Performance of Correlated Subquery!!! | Hi Anish,
some remarks:
There's no special technique called "shared spool" for correlated subqueries (CS); the query is simply rewritten as a join.
Joe D'Silva didn't claim t... |
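To illustrate the rewrite, a made-up correlated subquery and the join the optimizer effectively turns it into (table and column names are hypothetical):
-- correlated subquery
SELECT t.*
FROM trans t
WHERE t.amount = (SELECT MAX(t2.amount)
                  FROM trans t2
                  WHERE t2.acct_id = t.acct_id);
-- roughly how it is processed after the rewrite
SELECT t.*
FROM trans t
JOIN (SELECT acct_id, MAX(amount) AS max_amount
      FROM trans
      GROUP BY acct_id) m
  ON  m.acct_id = t.acct_id
  AND m.max_amount = t.amount;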
3195 | 09 Oct 2014 @ 11:05 PDT | Database | Reply | Row level Duplicate Record elimination | Hi Ambuj,
SET tables should perform best unless the PI is very non-unique.
|
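For context, a SET table silently discards exact duplicate rows during an INSERT/SELECT, so no DISTINCT or ROW_NUMBER step is needed; the duplicate check is done per PI value, which is why a very non-unique PI hurts. A minimal sketch with made-up names:
CREATE SET TABLE target_tbl (c1 INTEGER, c2 VARCHAR(20))
PRIMARY INDEX (c1);
-- duplicate rows from source_tbl are dropped automatically
INSERT INTO target_tbl
SELECT c1, c2 FROM source_tbl;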
3194 | 09 Oct 2014 @ 12:49 PDT | Analytics | Reply | Bteq script error handling not working | And your problem is?
Guessing: GOTO can only skip forward, but not back.
Why are there two bteq <<EOF?
|
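A minimal sketch of forward-only branching in BTEQ (logon, label and table names are made up): .GOTO only skips ahead to a .LABEL that appears later in the script, it never jumps back.
.LOGON tdpid/user,pwd
INSERT INTO target_tbl SELECT * FROM stage_tbl;
.IF ERRORCODE <> 0 THEN .GOTO handle_error
.GOTO done
.LABEL handle_error
.QUIT 8
.LABEL done
.LOGOFF
.QUIT 0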
3193 | 09 Oct 2014 @ 08:55 PDT | Database | Reply | Where does the CHECKSUM metadata value for a table get stored in DBC Teradata catalog tables | CHECKSUM can't be found in any system table; it's only stored in the table header (similar to BLOCKSIZE).
The only way to get this info using SQL is a SHOW TABLE; otherwise only a low-level comma... |
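For example (database and table names are placeholders):
SHOW TABLE mydb.mytable;  -- the returned DDL text includes table-level options such as the CHECKSUM setting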
3192 | 09 Oct 2014 @ 05:23 PDT | Database | Reply | How to avoid using QUALIFY OVER PARTITION BY | Hi Sarang,
depending on your actual data you might do the ROW_NUMBER in a Derived Table instead:
LEFT JOIN (SELECT * FROM TableF
QUALIFY ROW_NUMBER() OVER (PARTITION BY key_field OR... |
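A fuller version of the truncated fragment, with made-up names (main_tbl, attr_col, load_ts) around the original TableF/key_field:
SELECT m.*, f.attr_col
FROM main_tbl m
LEFT JOIN (SELECT *
           FROM TableF
           QUALIFY ROW_NUMBER()
                   OVER (PARTITION BY key_field
                         ORDER BY load_ts DESC) = 1
          ) f
  ON f.key_field = m.key_field;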
3191 | 08 Oct 2014 @ 09:24 PDT | Database | Reply | Need help in Optimizing Row_Number/RANK() operations | Can you show your current process?
And what's your TD release?
|
3190 | 08 Oct 2014 @ 07:12 PDT | Database | Reply | Insert ASCII 222 value | This must be related to something else:
BTEQ -- Enter your SQL request or BTEQ command:
CREATE MULTISET VOLATILE TABLE vt (x VARCHAR(10)) ON COMMIT PRESERVE ROWS;
*** Table has been cre... |
3189 | 08 Oct 2014 @ 06:08 PDT | Database | Reply | Failure 2620 The format or data contains a bad character | What's the datatype of TYP_CHK_CD?
|
3188 | 08 Oct 2014 @ 06:06 PDT | Database | Reply | Insert ASCII 222 value | Only hex '1A' will result in an error, but 222 should work
What did you try?
SELECT CHR(222), 'DE'xc
|
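Spelled out, both ways of producing the character, e.g. inserted into the volatile table vt from post 3190 above:
SELECT CHR(222), 'DE'XC;            -- both return Þ (Latin-1 code point 222)
INSERT INTO vt VALUES (CHR(222));
INSERT INTO vt VALUES ('DE'XC);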
3187 | 08 Oct 2014 @ 02:13 PDT | Database | Reply | An owner referenced by user does not have REFERENCES access | Please read the topic on SQL SECURITY in the manuals; you might want to switch to CREATOR:
http://www.info.teradata.com/HTMLPubs/DB_TTU_14_10/index.html#page/SQL_Reference/B035_1144_112A/Create_Pr... |
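A hedged sketch of the clause in question; procedure and table names are placeholders:
CREATE PROCEDURE mydb.my_proc ()
SQL SECURITY CREATOR   -- privilege checks use the creator's rights
BEGIN
  INSERT INTO mydb.log_tbl VALUES (CURRENT_TIMESTAMP);
END;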
3186 | 08 Oct 2014 @ 01:52 PDT | Database | Reply | Error Handling in Stored Procedure | Update/Insert as part of the handler?
BEGIN Update/Insert END
Outside of the handler?
Use nested handlers.
There are lots of examples in the manuals:
http://www.info.teradata.com/HT... |
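A minimal sketch combining both variants (a logging insert inside the handler, plus a nested handler protecting that insert); all names are made up:
REPLACE PROCEDURE mydb.load_proc ()
BEGIN
  DECLARE v_failed INTEGER DEFAULT 0;
  DECLARE CONTINUE HANDLER FOR SQLEXCEPTION
  BEGIN
    -- nested handler: if even the logging insert fails, just continue
    DECLARE CONTINUE HANDLER FOR SQLEXCEPTION
      SET v_failed = 2;
    SET v_failed = 1;
    INSERT INTO mydb.err_log VALUES (CURRENT_TIMESTAMP, 'update failed');
  END;
  UPDATE mydb.target_tbl SET col1 = col1 + 1;
END;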
3185 | 08 Oct 2014 @ 01:44 PDT | Database | Reply | Teradata SQL Assistant 15.00 DROP STATISTICS not working | Hi Peter,
as a TD customer you don't need to download the TTU from DevEx; ask your DBA to get it from T@YS, which always has the latest version.
|
3184 | 07 Oct 2014 @ 11:34 PDT | General | Reply | AMP FATAL | Did you contact Teradata support yet?
Why are those AMPs fatal?
If it's due to disk/RAID failures the process is called "restore from backup" :-)
|
3183 | 07 Oct 2014 @ 04:21 PDT | General | Reply | HIGH - SesAmpCpu | Depending on the actual queries, 15,000 CPU seconds for a few hundred queries might be low.
I've seen single Selects using way more than 100,000 seconds.
|
3182 | 07 Oct 2014 @ 03:38 PDT | Database | Reply | Teradata SQL Assistant 15.00 DROP STATISTICS not working | This was fixed in SQLA 15.0.0.3 :-)
|
3181 | 06 Oct 2014 @ 11:46 PDT | Tools | Reply | Using the Append statement to write the results from 2 queries to one txt file. | This should work; what's the error message?
|
3180 | 06 Oct 2014 @ 10:22 PDT | Database | Reply | Creating Header,Detail and Trailer records from one SQL statement | What tool do you use for exporting?
- both BTEQ and FastExport append the data of multiple selects by default
- SQL Assistant has an option "write all answer sets to a single file"
|
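A rough BTEQ sketch of the single-file approach (file and table names are made up; report-mode title and width settings are omitted here and usually need a few extra .SET commands):
.EXPORT REPORT FILE = extract.txt
SELECT 'HDR' || CAST(CURRENT_DATE AS CHAR(10));               -- header record
SELECT detail_line FROM detail_tbl ORDER BY 1;                -- detail records
SELECT 'TRL' || CAST(COUNT(*) AS CHAR(10)) FROM detail_tbl;   -- trailer record
.EXPORT RESET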
3179 | 06 Oct 2014 @ 09:57 PDT | General | Reply | recursive query spool space error | This is not due to recursion :-)
Teradata was implemented before there was Standard SQL; the initial query language was called TEQUEL (TEradata QUEry Language), whose syntax didn't require to ... |