# | Date | Forum | Type | Thread | Post
#1234 | 07 Sep 2016 @ 03:24 PDT | Tools | Reply | FastLoad - Data Conversions | FastExport itself cannot export the data and write it out in delimited format. So, I guess you must be CASTing your SELECT statement. Do you have to use SQLAssistant? Have you tried TPT? Can you...
#1233 | 07 Sep 2016 @ 01:48 PDT | Tools | Reply | FastLoad - Data Conversions | To which type of data conversions are you referring? The FastLoad script is just loading data. Teradata expects the Date/Time/Timestamp data to be in a very specific format. If the incoming data...
#1232 | 06 Sep 2016 @ 12:09 PDT | Tools | Reply | TPT Multiple JobVar files | Yes, TPT does support multiple job variable files. You just add additional -v <filename> options to the command line. This feature was implemented recently (targeted for 16.0),...
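A minimal sketch of what that command line might look like, assuming a TPT version that includes the feature (the script, variable file, and job names are placeholders):

    # two job variable files supplied through repeated -v options
    tbuild -f load_job.tpt -v site_jobvars.txt -v project_jobvars.txt my_load_job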
#1231 | 02 Sep 2016 @ 05:15 PDT | Tools | Reply | File Writer operator writing out 0 for decimal columns with precision greater than 18. | Ok, if the "issue" that we fixed is what I think, these are the releases in which the issue was resolved: 14.10.00.014 and 15.00.00.001.
#1230 | 02 Sep 2016 @ 12:05 PDT | Tools | Reply | File Writer operator writing out 0 for decimal columns with precision greater than 18. | I am trying to look up when this was fixed. Since it looks like several people are having issues, I need to know what version of TPT everyone on this thread is running.
#1229 | 01 Sep 2016 @ 11:25 PDT | Tools | Reply | TPT attributes SkipRows and SkipRowsEveryFile | In your data file, is the EOF on the same line as the last header record?
#1228 | 01 Sep 2016 @ 11:17 PDT | Tools | Reply | fastload cannot start with error "The request exceeds the length limit, The maximum length is 1048500." | The delimiter must be a printable character. FastLoad does not support hex values.
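For instance, the record layout in a FastLoad script would name a printable delimiter directly rather than a hex value (the pipe character here is only an illustration):

    SET RECORD VARTEXT "|";   /* a printable delimiter character, not a hex code */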
#1227 | 01 Sep 2016 @ 09:05 PDT | Tools | Reply | TPT and Windows PowerShell | From the PowerShell documentation, it looks like that would be the best solution to running TPT under PowerShell. Even typing just tbuild.exe indicates the environment (and environment variable...
#1226 | 01 Sep 2016 @ 08:41 PDT | Tools | Reply | TPT Script - Import from TD into HDFS. | TPT provides samples in a "samples" directory where TPT is installed. Look in the directory called "userguide" inside "samples". PTS00029 shows an example of r...
#1225 | 30 Aug 2016 @ 08:47 PDT | Tools | Reply | Automatic table creation based on the header in raw file during data loading data into Teradata using any load utility | TPT does not automatically create tables. The commands to create a table must be supplied by the user through the DDL operator.
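A sketch of how the table DDL might be supplied through the DDL operator in a job step, using the $DDL operator template; the table and column names are placeholders, and the attribute syntax is recalled from the TPT documentation rather than taken from this thread:

    DEFINE JOB create_and_load
    (
      STEP create_target_table
      (
        APPLY
          ('DROP TABLE my_db.my_table;'),
          ('CREATE TABLE my_db.my_table (col1 VARCHAR(20), col2 VARCHAR(50));')
        TO OPERATOR ($DDL() ATTR (ErrorList = '3807'));  /* tolerate "object does not exist" on the DROP */
      );
      /* load step(s) would follow here */
    );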
#1224 | 25 Aug 2016 @ 12:49 PDT | Tools | Reply | TPT Script - Import from TD into HDFS. | As with any other TD-to-flat-file TPT job, you can use the Export-operator-to-DC-operator scenario. This will export data from Teradata and write to HDFS. Just provide the information for th...
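A rough sketch of that Export-to-DataConnector scenario using the operator templates; the HDFS-related attribute names and values here are assumptions, and the userguide samples mentioned in #1226 above show the supported settings:

    DEFINE JOB export_to_hdfs
    (
      APPLY TO OPERATOR ($FILE_WRITER()
        ATTR (FileName = 'exported_data.txt',   /* target file in HDFS */
              HadoopHost = 'default',           /* assumed attribute for the HDFS interface */
              Format = 'Delimited',
              TextDelimiter = '|'))
      SELECT * FROM OPERATOR ($EXPORT()
        ATTR (SelectStmt = 'SELECT * FROM my_db.my_table;'));
    );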
#1223 | 24 Aug 2016 @ 11:53 PDT | Tools | Reply | Easyloader Schema Error 15.10 Only | Ok, another update. Yes, we have a regression. We were doing an EasyLoader overhaul and introduced a bug (that is currently being fixed). If you add --SourceFormat delimited (I do not remember whet...
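A sketch of the suggested workaround on the tdload command line (connection values, file names, and the job name are placeholders):

    tdload -f source_data.txt -t my_db.my_table -h my_tdpid -u my_user -p my_password --SourceFormat delimited my_easyload_job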
#1222 | 24 Aug 2016 @ 11:39 PDT | Tools | Reply | fastload cannot start with error "The request exceeds the length limit, The maximum length is 1048500." | I believe the issue *might* be related to the delimiter. The use of "\t" does not mean use the TAB character. We will look for the characters "\" followed by "t", and...
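If a TAB delimiter is really what is needed, TPT (as opposed to FastLoad) can take it as a hex value in the DataConnector operator's attribute list; a sketch of that attribute, with the name recalled from the TPT reference rather than from this thread:

    VARCHAR TextDelimiterHex = '09'   /* TAB character given as its hex value */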
#1221 | 24 Aug 2016 @ 11:34 PDT | Tools | Reply | Easyloader Schema Error 15.10 Only | Well, I could have edited the previous comment but did not in case someone read the old one. I checked with the developer to make sure no regressions have occurred and I was assured no regressions...
#1220 | 24 Aug 2016 @ 11:25 PDT | Tools | Reply | Easyloader Schema Error 15.10 Only | If you are loading data from a flat file to the Teradata Database, the schema will be taken from the target table. However, it looks like we may have a bug/regression from when EasyLoader was intr...
#1219 | 23 Aug 2016 @ 10:22 PDT | Tools | Reply | TPT, how to output columns enclosed in quotes? | Set QuotedData to 'yes'.
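In a DataConnector (file writer) operator definition, that attribute might sit alongside the other delimited-output settings; a sketch with placeholder file name and delimiter values:

    DEFINE OPERATOR FILE_WRITER
    TYPE DATACONNECTOR CONSUMER
    SCHEMA *
    ATTRIBUTES
    (
      VARCHAR FileName = 'export_out.txt',
      VARCHAR Format = 'Delimited',
      VARCHAR TextDelimiter = '|',
      VARCHAR QuotedData = 'Yes',       /* enclose every output column in quotes */
      VARCHAR OpenQuoteMark = '"'       /* the quote character to use */
    );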
#1218 | 23 Aug 2016 @ 09:20 PDT | Tools | Reply | Easyloader Schema Error 15.10 Only | As you can see in the column/field definition, you have a FLOAT field. The error is to be expected. In order to process delimited data, the schema must be all VARCHAR. In the job output above fr...
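A sketch of the kind of schema the delimited-file reader expects; SUBSCRIBER_NO comes from this thread, while the other column names and lengths are placeholders:

    DEFINE SCHEMA Delimited_Source_Schema
    (
      SUBSCRIBER_NO VARCHAR(20),   /* VARCHAR even though the target column is numeric */
      ACCOUNT_BAL   VARCHAR(30),
      LOAD_DATE     VARCHAR(10)
    );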
#1217 | 23 Aug 2016 @ 07:34 PDT | Tools | Reply | Zero records are fetching when using where statement in TPT | As I indicated, you have to write a script for this, not use Easy Loader. We currently have a bug when using Easy Loader and the quotes are being stripped. Thus, if you want to use a SELECT state...
#1216 | 22 Aug 2016 @ 04:23 PDT | Tools | Reply | Error Running Teradata to hadoop script "PTS00031" | Did you provide the Hadoop_Home and/or Hadoop_Prefix job variables?
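If those were missing, the job variable file would need entries along these lines; the variable names are the ones the post refers to, and the paths are placeholders for wherever Hadoop is installed:

    Hadoop_Home   = '/usr/lib/hadoop',
    Hadoop_Prefix = '/usr/lib/hadoop'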
#1215 | 22 Aug 2016 @ 03:32 PDT | Tools | Reply | Easyloader Schema Error 15.10 Only | Your 15.10 job is not anything like the 15.00 job. The 15.00 job uses the DataConnector operator and the Load operator. The 15.10 job is using the ODBC operator and the Stream operator.
#1214 | 22 Aug 2016 @ 03:24 PDT | Tools | Reply | TPT Script, Only Pull Entries Within Last 6 Months? | SQL statements assigned to attributes are enclosed in single-quotes. Because of this, all single-quotes inside that string must be escaped (doubled) in order to be preserved. So, your Sel...
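As an illustration, a SelectStmt attribute containing a date literal would double the quotes around that literal (the table and column names are placeholders):

    /* the inner quotes around the date literal are doubled so they survive inside the attribute string */
    VARCHAR SelectStmt = 'SELECT * FROM my_db.my_table WHERE order_date >= DATE ''2016-02-22'';'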
#1213 | 22 Aug 2016 @ 03:13 PDT | Tools | Reply | Easy Loader Error - 2 Teradata Systems | Please change the job variable "SelectStmt" to "SourceSelectStmt" and let me know if the job is successful.
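With that change, the job variable file for an Easy Loader job moving data between two systems might look roughly like this (system names, credentials, and table names are placeholders):

    SourceTdpId        = 'source_system',
    SourceUserName     = 'src_user',
    SourceUserPassword = 'src_password',
    SourceSelectStmt   = 'SELECT * FROM src_db.src_table;',
    TargetTdpId        = 'target_system',
    TargetUserName     = 'tgt_user',
    TargetUserPassword = 'tgt_password',
    TargetTable        = 'tgt_db.tgt_table'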
#1212 | 22 Aug 2016 @ 03:02 PDT | Tools | Reply | Zero records are fetching when using where statement in TPT | You need to escape (double) the single quote characters around the date.
#1211 | 22 Aug 2016 @ 11:29 PDT | Tools | Reply | Easyloader Schema Error 15.10 Only | You will have to show me a job run with TPT 15.00 and a schema with FLOAT. It should not have worked. We have been pretty strict about this since Day 1.
#1210 | 22 Aug 2016 @ 08:58 PDT | Tools | Reply | Easyloader Schema Error 15.10 Only | SUBSCRIBER_NO is defined as FLOAT. It needs to be VARCHAR.
