All forum activity

# | Date | User | Forum | Type | Thread (post excerpt on the following line)

#58592 | 20 Jul 2016 @ 01:20 PDT | aneelkumar04 | Database | Reply | Fastload Duplicate Records issues
Hi CarlosAL, thank you for the reply. I checked your command, but I didn't get duplicate records. Thanks in advance.

#58591 | 20 Jul 2016 @ 01:04 PDT | Ghalia | Analytics | Reply | MAX() and MAX() OVER (PARTITION BY ....) in the same query produces error 3504
Maybe you just need to also group by course_code (the field in the PARTITION BY clause)...

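A hedged illustration of the fix Ghalia describes, using a hypothetical enrollments table: when a query mixes a GROUP BY aggregate with an OLAP function, every column the OLAP function references (here course_code) must itself appear in the GROUP BY, otherwise Teradata raises error 3504 ("Selected non-aggregate values must be part of the associated group").

    -- Hypothetical table and columns; the point is that course_code
    -- appears both in the PARTITION BY and in the GROUP BY.
    SELECT course_code, semester,
           MAX(score) AS semester_max,
           MAX(MAX(score)) OVER (PARTITION BY course_code) AS course_max
    FROM enrollments
    GROUP BY course_code, semester;
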
#58590 | 20 Jul 2016 @ 12:59 PDT | CarlosAL | Database | Reply | Fastload Duplicate Records issues
Hi. Fastload DOES NOT put full duplicate rows in an ERROR TABLE. Fastload puts UNIQUE PRIMARY INDEX violations in ERROR TABLE 2. To the OP: you can check duplicates in the file using something l...

#58589 | 20 Jul 2016 @ 12:35 PDT | M.Saeed Khurram | Database | Reply | Fastload Duplicate Records issues
Hi, Fastload does not load full-row duplicate records; rather, it puts them in a separate error table by default. As per the log, there are 5 duplicates but no dups in the error table, so one reason c...

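To make the two replies above concrete, a minimal sketch of inspecting the two FastLoad error tables after a load, assuming they were declared as mytab_e1 and mytab_e2 in the script's ERRORFILES clause (hypothetical names). Error table 1 holds constraint/conversion rejects; error table 2 holds rows rejected for a duplicate UNIQUE PRIMARY INDEX value; exact full-row duplicates land in neither table and are silently discarded, which is CarlosAL's point.

    -- Error table 1: rows rejected by constraint or conversion errors
    SELECT ErrorCode, ErrorFieldName, COUNT(*)
    FROM mytab_e1
    GROUP BY 1, 2;

    -- Error table 2: rows with a duplicate UPI value
    -- (same index value, different remaining columns)
    SELECT COUNT(*) FROM mytab_e2;
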
#58588 | 20 Jul 2016 @ 12:34 PDT | Karthikeyan_dsk | Analytics | Reply | Replacing duplicates while transpose in Teradata
Formatting the tables properly: TBL_RULES (TBL_NM, VLD_ID, COL_NM): TABLE1 1000 COL_1; TABLE1 1001 COL_2; TABLE1 1002 COL_2; TABLE1 1003 COL_3. TABLE_LOG (TBL_NM, VLD_ID, REC_NBR, SUCCESS_FLAG): TABLE1 1...

#58587 | 20 Jul 2016 @ 12:27 PDT | Karthikeyan_dsk | Analytics | Topic | Replacing duplicates while transpose in Teradata
Hi, I have two tables as below. TBL_RULES (TBL_NM, VLD_ID, COL_NM): TABLE1 1000 COL_1 T...

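The topic is cut off above, but judging from the title, a generic rows-to-columns transpose over TBL_RULES might look like the sketch below. This is only a guess at the intent, and the CASE branches would need to match the real VLD_IDs; where two VLD_IDs map to the same column (1001 and 1002 both carry COL_2 in the data shown), MAX() collapses the duplicates to a single value.

    -- One output row per table, one column per validation id.
    SELECT TBL_NM,
           MAX(CASE WHEN VLD_ID = 1000 THEN COL_NM END) AS COL_FOR_1000,
           MAX(CASE WHEN VLD_ID = 1001 THEN COL_NM END) AS COL_FOR_1001,
           MAX(CASE WHEN VLD_ID = 1003 THEN COL_NM END) AS COL_FOR_1003
    FROM TBL_RULES
    GROUP BY TBL_NM;
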
#58586 | 20 Jul 2016 @ 12:19 PDT | Karthikeyan_dsk | Jobs | Reply | Teradata developer opportunity in Toronto, Canada
Hi, to which ID should I send it? Do you expect profiles from within Canada?

#58585 | 20 Jul 2016 @ 12:14 PDT | aneelkumar04 | Database | Reply | Fastload Duplicate Records issues
Hi, thanks for the reply, but it's a production database.

#58584 | 20 Jul 2016 @ 12:08 PDT | AtardecerR0j0 | Database | Reply | Fastload Duplicate Records issues
You can load the data again using a NOPI table, so that you'll load the duplicate rows. Then you can use SQL to check for duplicate rows.

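A sketch of AtardecerR0j0's suggestion, with hypothetical names: FastLoad silently discards full-row duplicates on a PI table, but a NO PRIMARY INDEX staging table keeps them, so a simple GROUP BY exposes the offending rows.

    -- Empty NOPI copy of the target (FastLoad requires an empty table)
    CREATE TABLE stg_mytab AS mytab WITH NO DATA NO PRIMARY INDEX;

    -- After FastLoading the same file into stg_mytab:
    SELECT col1, col2, col3, COUNT(*) AS dup_cnt
    FROM stg_mytab
    GROUP BY col1, col2, col3
    HAVING COUNT(*) > 1;
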
#58583 | 19 Jul 2016 @ 11:58 PDT | aneelkumar04 | Database | Topic | Fastload Duplicate Records issues
Hi all, I am loading data from a file to the DB using FastLoad. My load got aborted due to duplicate records, but when I checked the file there are no duplicate records in it. I am confused...

#58582 | 19 Jul 2016 @ 11:16 PDT | KN | Tools | Reply | TPT - Scripting
Thank you all for the pointers... We are currently on DBS version 14.10. I have set up a Teradata sandpit environment on VM infrastructure, and that is on TD 15.0. Basically we wanted to off...

#58581 | 19 Jul 2016 @ 10:42 PDT | Johannes Vink | Viewpoint | Reply | Moving Data from One Viewpoint Server to a new Viewpoint Server
If you want to move everything: move everything in the /backup folder to the new server's /backup folder and restore the data/configuration. The restore process is described in chapter 6...

#58580 | 19 Jul 2016 @ 10:15 PDT | AVOT | Viewpoint | Topic | Moving Data from One Viewpoint Server to a new Viewpoint Server
Hi techies, we have three VP 15.00 servers in the DEV environment; assume S1, S2, and S3. S1 is currently used as a standalone VP server and is collecting the data. I have set up S2 (primary) and S3...

#58579 | 19 Jul 2016 @ 09:30 PDT | tisenhart | Teradata Database on AWS | Topic | AWS instance unreachable for Teradata Database Developer (Single Node)
I've launched the AWS Teradata Database Developer (Single Node) and can't connect with ssh to start the DB; not sure why I can't ssh. The AWS instance state shows the instance ...

#58578 | 19 Jul 2016 @ 06:14 PDT | zhenwuyang | Database | Reply | Function quantile throws a numeric overflow error in a large table
Thank you Fred and Dnoeth. I tried the OLAP option, but it takes too long. I worked with my business user and reduced the amount of data we are pulling. That works for now. For Dnoeth's sug...

#58577 | 19 Jul 2016 @ 04:54 PDT | sudarmo | Teradata Studio | Reply | Kerberos Authentication
Hi Francine, where can I find the "Studio Help Content" that explains Kerberos authentication as you described above? I tried the Teradata Studio Help Content but couldn't locate any s...

#58576 | 19 Jul 2016 @ 02:44 PDT | g_rodrig | Aster | Reply | Aster 6.0 to HDP 2.1 select does not work
Hello, I am receiving a similar error when running a query from Aster 6.20 to Cloudera 5.4. ERROR: SQL-MR function LOAD_FROM_HCATALOG failed: Failed to initiate data reading from hcatalog t...

#58575 | 19 Jul 2016 @ 02:16 PDT | HF | Database | Reply | Consolidate rows
I have version 15.1, but your query worked perfectly. Thank you.

#58574 | 19 Jul 2016 @ 02:06 PDT | dnoeth | Database | Reply | Consolidate rows
What's your Teradata version? 14.10 has a quite unknown syntax using NORMALIZE over PERIODs:
    SELECT CATEGORY, TYPE,
           -- split the period into start and end again
           BEGIN(pd), LAST(pd)
    F...

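The query above is truncated; what follows is a hedged reconstruction of the usual NORMALIZE pattern, with guessed table and column names. The PERIOD is built with one extra second so that rows meeting end-to-start (per HF's description below, END_TS is the last second before the next START_TS) actually touch and get merged; LAST(pd) then hands back the original END_TS.

    SELECT CATEGORY, TYPE,
           -- split the period into start and end again
           BEGIN(pd) AS START_TS,
           LAST(pd)  AS END_TS
    FROM
     ( SELECT NORMALIZE
              CATEGORY, TYPE,
              PERIOD(START_TS, END_TS + INTERVAL '1' SECOND) AS pd
       FROM mytab
     ) AS dt;
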
#58573 | 19 Jul 2016 @ 01:42 PDT | HF | Database | Reply | Consolidate rows
2016-02-01 is February 1st. There are no gaps in time. My example END_TS should really have been the last second prior to the START_TS of the following row. For instance, the 1st row in...

#58572 | 19 Jul 2016 @ 12:00 PDT | dnoeth | Database | Reply | Consolidate rows
Is "2016-02-01" Feb. 1st or Jan. 2nd, i.e. do you want to combine only periods without gaps?

#58571 | 19 Jul 2016 @ 11:23 PDT | feinholz | Tools | Reply | Zero records are fetching when using where statement in TPT
I have enough information. I needed to know if the single quotes were being stripped.

#58570 | 19 Jul 2016 @ 11:18 PDT | venkata_k01 | Tools | Reply | Zero records are fetching when using where statement in TPT
Hi Steve, I am facing an issue while uploading the complete .out file. Please let me know if the above information is enough for further analysis before my attempt to upload the file is successf...

#58569 | 19 Jul 2016 @ 10:38 PDT | venkata_k01 | Tools | Reply | Zero records are fetching when using where statement in TPT
Hi Steve, PFB the SELECT statement from the .out file. The single quote before and after the date value is missing, and that is why it is fetching 0 records. k ==========...

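For reference, the usual way to keep those quotes intact, sketched with hypothetical names: inside a TPT quoted string (in a job-variable file or a script attribute), an embedded single quote is doubled, so the generated SELECT retains the literal quotes around the date.

    /* jobvars.txt (hypothetical): note the doubled quotes around the date */
    SelectStmt = 'SELECT * FROM mydb.mytab WHERE load_dt = ''2016-07-19'';'
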
#58568 | 19 Jul 2016 @ 10:30 PDT | HF | Database | Topic | Consolidate rows
I'm looking to consolidate rows in my table by combining similar consecutive rows together. In the example I have category "A" where the type "X" is the 1st...
