# | Date | Forum | Type | Thread | Post
#13 | 05 Jan 2016 @ 09:30 PST | Database | Reply | FASTLOAD - Record too long by n bytes
Nulls in a VARCHAR column in the extract file were the culprit. Found a workaround by treating it as a CHAR (ignoring the length byte) and then using NULLIF='00000....'XC. Another option...
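A minimal FastLoad sketch of the workaround described in #13, with invented object names, field width, and hex string (the real values depend on the extract layout): the VARCHAR field is declared as a fixed-width CHAR so its length prefix is not interpreted, and a field of binary zeros is mapped to NULL with a hex character literal.

    /* Hypothetical names and widths; only the NULLIF-on-CHAR idea matters */
    SESSIONS 4;
    LOGON tdpid/loaduser,password;
    SET RECORD UNFORMATTED;

    DEFINE
        promo_cd (CHAR(10), NULLIF = '00000000000000000000'XC)  /* 10 bytes of X'00' -> NULL */
    FILE = extract.dat;

    BEGIN LOADING stage_db.promo_stg
        ERRORFILES stage_db.promo_stg_e1, stage_db.promo_stg_e2;
    INSERT INTO stage_db.promo_stg (promo_cd) VALUES (:promo_cd);
    END LOADING;
    LOGOFF;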
#12 | 05 Jan 2016 @ 08:33 PST | Database | Topic | FASTLOAD - Record too long by n bytes
Facing the following issue with FASTLOAD: "RECORD is too long by n byte(s)". My process creates an extract from DB2 by running SQL through a DB2 utility, and loads the extract to a Teradata ...
#11 | 07 Feb 2014 @ 12:04 PST | Database | Reply | Temporal concept
Temporal tables will maintain the history for you when you update a row, just as you mentioned in your example. For example, in this case: UPDATE EMPLOYEE SET SALARY = 200 WHERE EMPLOYEE_NBR =...
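A rough sketch of the behaviour described in #11, assuming a transaction-time temporal table: EMPLOYEE, SALARY, and the UPDATE come from the thread, while the period column name, the employee number, and the AS OF timestamp are made up.

    CREATE MULTISET TABLE EMPLOYEE (
        EMPLOYEE_NBR  INTEGER NOT NULL,
        SALARY        DECIMAL(10,2),
        tx_duration   PERIOD(TIMESTAMP(6) WITH TIME ZONE) NOT NULL AS TRANSACTIONTIME
    ) PRIMARY INDEX (EMPLOYEE_NBR);

    /* A plain update: the old row's transaction-time period is closed and a
       new open-ended row is inserted, so history is kept automatically.     */
    UPDATE EMPLOYEE SET SALARY = 200 WHERE EMPLOYEE_NBR = 1001;

    /* Read the table as of an earlier point in time to see the old salary. */
    TRANSACTIONTIME AS OF TIMESTAMP '2014-01-01 00:00:00'
    SELECT EMPLOYEE_NBR, SALARY
    FROM EMPLOYEE;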
#10 | 28 Mar 2013 @ 11:46 PDT | Database | Topic | Questions regarding Aggregate Join Index
Dear All, we are on TD 13 and want to create an AJI; however, the number of columns we need to include exceeds the 64-column limit. Can the gurus help me with some suggestions on the ways aro...
#9 | 17 Feb 2012 @ 09:25 PST | Database | Reply | INTERVAL SUBTRACTION FROM DATE FIELD IN WHERE CLAUSE
Have to agree with that :-) -PT
#8 | 16 Feb 2012 @ 11:47 PST | Database | Reply | INTERVAL SUBTRACTION FROM DATE FIELD IN WHERE CLAUSE
Your table could have an invalid date in the PROMO_DATE column for one or more rows. The query you mentioned in your first post should work just fine. Can you please validate the values...
#7 | 16 Feb 2012 @ 07:01 PST | Database | Reply | Significance of MaxLength column
As you would know, MAX(column_name) gives you the maximum value present in that column of the table. MaxLength is different from the MAX function: it indicates the number of b...
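For contrast, a small illustration with invented names (mydb.orders, order_amt): MAX() looks at the data, while the MaxLength reported by HELP COLUMN (ColumnLength in DBC.ColumnsV) is dictionary metadata about the column's declared size in bytes.

    /* Largest value actually stored in the column */
    SELECT MAX(order_amt) FROM mydb.orders;

    /* Declared column sizes in bytes, regardless of the data */
    HELP COLUMN mydb.orders.*;

    SELECT ColumnName, ColumnType, ColumnLength
    FROM   DBC.ColumnsV
    WHERE  DatabaseName = 'mydb'
      AND  TableName    = 'orders';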
#6 | 08 Feb 2012 @ 03:30 PST | Database | Reply | Fastest way to update a table joining to a huge table
Thanks Rob. Will definitely look at the suggestion you made.
#5 | 08 Feb 2012 @ 02:55 PST | Database | Reply | Integer(7) to Date
Hi Marc, the date format in TD is CYYMMDD, where C stands for the century. For the year 1987, C = '0'; if C = 1, TD will assume it is the 21st century. If in the data that you have C = 1 st...
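A hedged sketch of that conversion with made-up staging names: CYYMMDD matches Teradata's internal DATE encoding, (year - 1900)*10000 + month*100 + day, so the integer can be cast straight to DATE, or shifted to YYYYMMDD first.

    /* Teradata-style cast: 0870115 -> 1987-01-15, 1120115 -> 2012-01-15 */
    SELECT int_dt (DATE)
    FROM   stage_db.stage_tbl;

    /* Equivalent route via an 8-digit YYYYMMDD string */
    SELECT CAST(TRIM(int_dt + 19000000) AS DATE FORMAT 'YYYYMMDD')
    FROM   stage_db.stage_tbl;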
#4 | 02 Feb 2012 @ 08:56 PST | Database | Reply | Fastest way to update a table joining to a huge table
First of all, thanks so much for such a prompt response. And thanks again for the simple and effective suggestion of carrying out the second update first. That actually might work out :)))...
#3 | 02 Feb 2012 @ 08:22 PST | Database | Topic | Fastest way to update a table joining to a huge table
I was wondering what the fastest method is to carry out an update under the following scenario. What we have now is working fine, but with production volumes it could create a problem. ...
#2 | 10 Jan 2012 @ 09:18 PST | Database | Reply | Cumulative calculation problem
Thanks a ton for taking out time for this. I will test this out. -PT
#1 | 09 Jan 2012 @ 10:07 PST | Database | Topic | Cumulative calculation problem
I have this situation: the base table here stores counts (number of units) flowing in for a product on a particular date, and the cumulative unit count. Bas...
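For a base table of per-date unit counts by product, as described in #1, a running total is typically derived with an ordered analytic SUM rather than stored; a sketch with assumed table and column names:

    SELECT product_id,
           load_dt,
           unit_cnt,
           SUM(unit_cnt) OVER (PARTITION BY product_id
                               ORDER BY load_dt
                               ROWS UNBOUNDED PRECEDING) AS cum_unit_cnt
    FROM   base_db.product_units;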