# | Date | Forum | Type | Thread | Post
896 | 09 Sep 2014 @ 03:55 PDT | Database | Reply | How to resolve blocking of sessions | You can also write a small unix script checking the status $? after logging in to the database with the db script nowait (see the shell sketch after this table)..... If the status is not successful, i.e. not 0, then ....echo &quo...
895 | 09 Sep 2014 @ 03:39 PDT | General | Reply | DATA TYPE FOR EMAIL ADDRESS | You can make it VARCHAR. But make sure the length is high enough to accommodate long email addresses :).
894 | 08 Sep 2014 @ 10:16 PDT | Database | Reply | Days between two dates (different years) | What values do you get from cast(cast(min(sale_date) as date format 'yyyy-mm-dd'))? Do a select of your cast date and see its value. So you can think of bringing both to the same format like...
893 | 08 Sep 2014 @ 09:42 PDT | Database | Reply | Days between two dates (different years) | Your dates are in different formats. Try to bring both into the same format, say the example below (see also the sketch after this table): SELECT DATE '2014-11-01' - DATE '2013-11-01';
892 | 08 Sep 2014 @ 08:09 PDT | Database | Reply | Linkserver from Teradata to SQL Server | For a one-off job, download to a file. Then tpt...Fastload... into a Teradata staging table. Then compare the staged table with the Teradata target to avoid duplicates as per your requirement. As far as...
891 | 08 Sep 2014 @ 07:50 PDT | General | Reply | General Question | To load duplicate rows into a MULTISET table, use MultiLoad (see the DDL sketch after this table). I suggest you read the material and implement it too, using the Teradata Express image for VMware. It is free.
890 | 08 Sep 2014 @ 10:24 PDT | Database | Reply | Linkserver from Teradata to SQL Server | You can export the data from SQL Server to a landing zone in file format and then load the files using tpt or fastload.... You may need to do a data quality check after landing, like formatting e...
889 | 08 Sep 2014 @ 07:22 PDT | Database | Reply | Database backup failing due to untranslatable characters in a table | Oops! Eusha, I should have suggested that you consult before deleting too. I have gone in your direction. The very basic idea of a backup is that we can restore later. Data is very important, that is wh...
888 | 08 Sep 2014 @ 04:40 PDT | Database | Reply | Database backup failing due to untranslatable characters in a table | Did you also verify that you deleted successfully? In this link on translate_chk you can check them one by one (see also the sketch after this table): http://www.info.teradata.com/HTMLPubs/DB_TTU_14_00/index.html#page/SQL_Reference/B035_11...
887 | 08 Sep 2014 @ 03:29 PDT | Database | Reply | Linkserver from Teradata to SQL Server | I am not sure about your target. Even in the TPT wizard, I did not see SQL Server as a source. Maybe I am dealing with an old version? What I see is that, for sanity purposes, it is good to land the data ...
886 | 08 Sep 2014 @ 01:23 PDT | General | Reply | Training Teradata | Just my suggestion: you can provide more details of the requirement, the audience, your contact, etc. That way you can get a response. I feel this should go to the jobs forum.
885 | 07 Sep 2014 @ 11:39 PDT | Extensibility | Reply | REGEXP_SUBSTR in Teradata 13.10 | From GitHub we can get a few ready-made things, like 3d.js code, etc. I have not tried this. Maybe you can have a look at it to see if it is similar. https://github.com/hholzgra/mysql-udf-regexp/blob/mas...
884 | 07 Sep 2014 @ 10:15 PDT | Extensibility | Reply | REGEXP_SUBSTR in Teradata 13.10 | I am not able to download the zip file. Did you try looking here or simulating the same? https://downloads.teradata.com/download/extensibility/teradata-udfs-for-popular-oracle-functions I can s...
883 | 07 Sep 2014 @ 08:22 PDT | Database | Reply | Convert string into date | Using dbc, you can use for example (see the sketch after this table): select columnname as col1, columnlength as cl ....... from dbc.columns where databasename='DEV_RETAIL' and tablename='tb1' union/minus ( your ch...
882 | 07 Sep 2014 @ 02:24 PDT | Database | Reply | Complex SQL query | It can be done this way too: select k.ref_id, k.seq_nbr, k.depth, sum(k.nr) over(order by k.nr rows unbounded preceding) sm from (select ref_id, seq_nbr, depth, case when depth-coalesce(min(depth)over(...
881 | 06 Sep 2014 @ 02:14 PDT | Database | Reply | Optimization of a Join Query | It's difficult to say without the explain plan. Is this just a query, or does a JI exist.... stats collected.... partitioning... PI info.... and many more things to look into.
880 | 05 Sep 2014 @ 11:26 PDT | Database | Reply | transaction log of a user | Hope DBQL logging is in place. If you look at this link, you can see lots of DBQL tables and views. Check all the fields in DBQLogTbl and you can get what you want (see the sketch after this table). Also I hope you main...
879 | 05 Sep 2014 @ 09:04 PDT | General | Reply | BTEQ export to flat file | Put select cast(max(your_date) as date format 'YYYY-MM-DD') from your_table; in your BTEQ script after .export report file=<your file> (see the sketch after this table).
878 | 05 Sep 2014 @ 10:45 PDT | Database | Reply | Help with macro | If you are using unix, then you can combine a database script and a unix script, making different calls with the parameters you want based on your conditions (see the shell sketch after this table). For example, in one of my previous projec...
877 | 05 Sep 2014 @ 07:23 PDT | Database | Reply | Importing Data from .XLS | I think it will also be difficult to load from Excel even if it works. For example, if the data for a few columns is frozen and saved, or data is taken from tabs which are computed, and some more fe...
876 | 05 Sep 2014 @ 05:40 PDT | Teradata Applications | Reply | FASTEXPORT - Sorted order two further distribution between amps. But Why? | It is my bad. All the while I had been reading FASTLOAD instead of FASTEXPORT, as in the other post (posted next to yours). I was wondering how it could be? :) You can use the nospool option too (see the sketch after this table): The NoS...
875 | 05 Sep 2014 @ 05:09 PDT | Training | Reply | Need Help Fro Traning Purpose | Please don't mind my suggestions. Reading the materials and practicing, for example in Express as mentioned above, will help. Practicing and doing real-time implementation gives good confidence...
874 | 05 Sep 2014 @ 12:04 PDT | Teradata Applications | Reply | FASTEXPORT - Sorted order two further distribution between amps. But Why? | The question is not clear. Can you elaborate more?
873 | 04 Sep 2014 @ 10:02 PDT | Database | Reply | Importing Data from .XLS | Even in SQL Assistant, if you import, the format is CSV. I am not sure if DIF will work with xls. But you can save your file as CSV, right?
872 | 04 Sep 2014 @ 10:16 PDT | Database | Reply | Selecting only non duplicate values from a set of columns | Try this: select c1,c2,c3,c4,c5,c6 from your_table qualify row_number() over(partition by c1,c2,c3 order by c6 desc) = 1
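The sketches below expand on a few of the replies above; they are minimal, hypothetical examples, not the posters' actual scripts. First, for posts 896 and 878: a small unix wrapper that runs a BTEQ step with a parameter and then branches on the shell status $?. The logon string, paths, table and column names are all placeholders.

```sh
#!/bin/sh
# Hypothetical wrapper in the spirit of posts 896/878: run a BTEQ step with a
# parameter, then branch on the shell status $?. All names are placeholders.
LOGON="tdpid/your_user,your_password"   # assumed BTEQ logon string
LOAD_DATE="$1"                          # parameter passed in by the caller

bteq <<EOF > bteq_run.log 2>&1
.LOGON ${LOGON}
SELECT COUNT(*) FROM your_db.your_table WHERE load_dt = DATE '${LOAD_DATE}';
.LOGOFF
.QUIT
EOF

# BTEQ exits with 0 on success and a non-zero return code on errors.
if [ $? -ne 0 ]; then
    echo "BTEQ step failed for ${LOAD_DATE} - check bteq_run.log"
    exit 1
fi
echo "BTEQ step completed for ${LOAD_DATE}"
```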
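For posts 894/893: a sketch of bringing two differently formatted date strings to DATE before subtracting them; in Teradata the difference of two DATE values is an integer number of days. The literals and the sales table are made up for illustration.

```sql
-- Cast each string with its own FORMAT, then subtract: the result is days.
SELECT CAST('2014-11-01' AS DATE FORMAT 'YYYY-MM-DD')
     - CAST('01/11/2013' AS DATE FORMAT 'DD/MM/YYYY') AS days_between;

-- Same idea against a (hypothetical) table storing sale_date as a string.
SELECT CAST(MIN(sale_date) AS DATE FORMAT 'YYYY-MM-DD') AS first_sale
FROM your_db.sales;
```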
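For post 891: a MULTISET table accepts duplicate rows, and MultiLoad will keep them, whereas FastLoad discards duplicate rows even for MULTISET tables. A minimal DDL sketch with made-up names:

```sql
-- Hypothetical MULTISET table: duplicate rows are allowed at the table level.
CREATE MULTISET TABLE sandbox_db.duplicate_demo
(
    txn_id   INTEGER,
    txn_note VARCHAR(100)
)
PRIMARY INDEX (txn_id);
```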
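For post 888: a sketch using TRANSLATE_CHK to locate rows with untranslatable characters before the backup; it returns 0 when the value translates cleanly and otherwise the position of the first offending character. Table and column names are placeholders, and LATIN_TO_UNICODE is an assumption about the column's character set.

```sql
-- Find rows whose value cannot be translated cleanly, and where it breaks.
SELECT pk_col,
       bad_col,
       TRANSLATE_CHK(bad_col USING LATIN_TO_UNICODE) AS first_bad_pos
FROM your_db.your_table
WHERE TRANSLATE_CHK(bad_col USING LATIN_TO_UNICODE) <> 0;
```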
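For post 883: the dictionary query behind that snippet, spelled out. The DatabaseName/TableName values are just the examples from the post, and the output of two such queries can be compared with UNION/MINUS as the reply suggests.

```sql
-- Column metadata for one table from the DBC dictionary.
SELECT ColumnName, ColumnType, ColumnLength
FROM DBC.Columns
WHERE DatabaseName = 'DEV_RETAIL'
  AND TableName    = 'tb1'
ORDER BY ColumnId;
```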
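For post 880: assuming DBQL logging is enabled, a hypothetical lookup of one user's logged requests for a day from DBC.DBQLogTbl; adjust the columns and predicates to whatever you actually need.

```sql
-- One row per logged request for the chosen user and day.
SELECT UserName,
       StartTime,
       FirstRespTime,
       QueryText
FROM DBC.DBQLogTbl
WHERE UserName = 'SOME_USER'                      -- placeholder user
  AND CAST(StartTime AS DATE) = DATE '2014-09-05' -- placeholder day
ORDER BY StartTime;
```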
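For post 879: a minimal BTEQ script sketch around that SELECT. The logon string, file path and table are placeholders, and (TITLE '') just suppresses the column heading in REPORT output.

```
.LOGON tdpid/your_user,your_password
.EXPORT REPORT FILE = /tmp/max_date_report.txt
SELECT CAST(MAX(your_date) AS DATE FORMAT 'YYYY-MM-DD') (TITLE '')
FROM your_db.your_table;
.EXPORT RESET
.LOGOFF
.QUIT
```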
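For post 876: a FastExport sketch with the NoSpool option. I am assuming the SPOOLMODE NOSPOOL clause of .BEGIN EXPORT here, so check the FastExport reference for your release, and treat every name and path below as a placeholder.

```
.LOGTABLE your_db.fexp_restart_log;
.LOGON tdpid/your_user,your_password;
.BEGIN EXPORT SESSIONS 4
   SPOOLMODE NOSPOOL;
.EXPORT OUTFILE /tmp/export_out.dat
   MODE RECORD FORMAT TEXT;
SELECT CAST(c1 AS CHAR(10)), CAST(c2 AS CHAR(30))
FROM your_db.your_table;
.END EXPORT;
.LOGOFF;
```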
