# · Date · Forum · Type · Thread · Post
#609 · 04 Mar 2014 @ 09:42 PST · Tools · Reply
Thread: Teradata Parallel Transporter - Session Character Set
It is the other way around. :) The question should be, "Does TPT support Teradata 13?" The answer is "yes", but it depends on which versions of TPT. Our client products...

#608 · 03 Mar 2014 @ 10:04 PST · Tools · Reply
Thread: Teradata Parallel Transporter - Session Character Set
Please send your entire JCL. What dataset is PTYIN pointed to?

#607 · 03 Mar 2014 @ 06:02 PST · Tools · Reply
Thread: Teradata Parallel Transporter - Session Character Set
I think we need to see the entire set of output. We only have a partial story here. The job terminated with an error (Job step LOAD_TABLES terminated (status 8)). But the part of the output you...

#606 · 03 Mar 2014 @ 09:59 PST · Tools · Reply
Thread: Teradata Parallel Transporter - Session Character Set
@Santanu84: please do NOT use the attributes ValidUTF8 and ReplaceUTF8Char. We are not happy with the results we are getting from that feature and we are redesigning it. @arun_tim...

#605 · 27 Feb 2014 @ 11:33 PST · Tools · Reply
Thread: Fast export and mload in vartext mode - Data item too large for field issue
You should consider switching to TPT. In 14.10, TPT supports the ability to write out data (retrieved from Teradata or any other ODBC-compliant data source) in delimited format without the need to...

#604 · 27 Feb 2014 @ 11:27 PST · Tools · Reply
Thread: tdload having issues loading data where the delimiter is part of a field
thomspsonhab: TPT does not yet support embedded CR/LF. manharrishi: tdload (aka EasyLoader) does not yet support quoted data (but we should have, so I will make sure we get this efixed).

#603 · 26 Feb 2014 @ 03:57 PST · Tools · Reply
Thread: tdload having issues loading data where the delimiter is part of a field
Support for quoted fields (and thus, embedded delimiters) did go into TPT in 14.0. However, TPT 14.10 is probably a better release (performance improvements) if you can use that one. We still do n...
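
The quoted-field support mentioned in post #603 is enabled through attributes on the DataConnector operator. A minimal sketch of a producer definition follows; the operator and schema names are hypothetical, and the attribute names (QuotedData, OpenQuoteMark) are the TPT 14.x names as best I recall them, so verify against your release's documentation:

```
/* Hypothetical DataConnector producer reading delimited, optionally
   quoted data. Verify attribute names against your TPT 14.x docs. */
DEFINE OPERATOR FILE_READER
TYPE DATACONNECTOR PRODUCER
SCHEMA INPUT_SCHEMA
ATTRIBUTES
(
  VARCHAR FileName      = 'input.csv',
  VARCHAR Format        = 'Delimited',
  VARCHAR TextDelimiter = ',',
  VARCHAR OpenMode      = 'Read',
  VARCHAR QuotedData    = 'Optional',  /* fields may be quoted         */
  VARCHAR OpenQuoteMark = '"'          /* so embedded commas survive   */
);
```

With QuotedData set to 'Optional', a row like `1,"Smith, John",NY` parses as three fields despite the embedded comma.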
#602 · 26 Feb 2014 @ 12:27 PST · Tools · Reply
Thread: tdload having issues loading data where the delimiter is part of a field
TPT does not support embedded delimiters in release 13.

#601 · 26 Feb 2014 @ 11:30 PST · Tools · Reply
Thread: Teradata Parallel Transporter - Session Character Set
The USING CHARACTER SET UTF8 is only for the data, not the script. The script does not have to be encoded in UTF8 in order for you to load UTF8 data into Teradata. If there is nothing special in ...
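
The point in post #601 can be illustrated with a minimal script fragment (a sketch; the job name is hypothetical):

```
/* USING CHARACTER SET governs the data sessions only; the script
   file itself can stay in the platform's default encoding. */
USING CHARACTER SET UTF8
DEFINE JOB load_utf8_data
(
  /* ... operator definitions and APPLY statement as usual ... */
);
```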
#600 · 25 Feb 2014 @ 12:14 PST · Tools · Reply
Thread: fast export and mload in fastload format issue - error code 2793
Thanks Dieter! That is where I was going next, right after looking at a hex dump of the data.

#599 · 25 Feb 2014 @ 11:28 PST · Tools · Reply
Thread: fast export and mload in fastload format issue - error code 2793
Try it with just one row of data and post the hex output of the data file from the FastExport job.

#598 · 21 Feb 2014 @ 06:07 PST · Tools · Reply
Thread: Backlog issue occuring when UNION ALLing multiple TPT DataConnector operators into Stream
I will look into the checkpoint issue, to see if we have a synchronization issue regarding the use of UNION ALL. I know that when using multiple instances of a single DC operator, the checkpoint do...

#597 · 20 Feb 2014 @ 07:16 PST · Tools · Reply
Thread: Backlog issue occuring when UNION ALLing multiple TPT DataConnector operators into Stream
A few thoughts. If you are using the Stream operator, I doubt the file reading will ever be the bottleneck. The Stream operator will probably always be running slower than the file reading. Next...

#596 · 19 Feb 2014 @ 12:04 PST · Tools · Reply
Thread: Need Header Row in FastExport / TPT Export.?
TPT cannot load data into any database other than Teradata. (You can use TPT to move data from a non-Teradata database to Teradata without landing the data to disk.) However, TPT is a bulk data l...

#595 · 19 Feb 2014 @ 11:55 PST · Tools · Reply
Thread: TPT 14.10 output to named pipe and then gzip to final files
Thanks for the validation (and your patience).

#594 · 19 Feb 2014 @ 12:02 PST · Tools · Reply
Thread: TPT 14.10 output to named pipe and then gzip to final files
Ok. You cannot "checkpoint" when using pipes. Remove the "-z 60" part of the command and your job should be able to run.
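
The fix in post #594 amounts to dropping the checkpoint-interval option from the tbuild invocation (script and job names here are hypothetical):

```
# With -z 60, tbuild takes a checkpoint every 60 seconds; a named pipe
# cannot be repositioned, so checkpointing fails mid-job. Dropping the
# -z option lets the pipe-writing job run to completion:
tbuild -f export_to_pipe.txt my_export_job
```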
#593 · 18 Feb 2014 @ 07:35 PST · Tools · Reply
Thread: TPT 14.10 output to named pipe and then gzip to final files
What is the command line you are using to run the TPT job?

#592 · 18 Feb 2014 @ 11:59 PST · Tools · Reply
Thread: TPT 14.10 output to named pipe and then gzip to final files
Pipe issues resulting in "getpos" errors were fixed in 14.10.00.003 and 14.00.00.011.

#591 · 18 Feb 2014 @ 10:30 PST · Tools · Reply
Thread: TPT 14.10 output to named pipe and then gzip to final files
Not sure on the exact naming convention yet. We have to be careful with backwards-compatibility issues. What specifically do you want to know about named pipes? Are you using the named pipe acce...

#590 · 18 Feb 2014 @ 09:33 PST · Tools · Reply
Thread: TPT 14.10 output to named pipe and then gzip to final files
Yup. Found a bug. The DC operator is looking at the .gz-x extension and not the .gz extension (prior to appending the instance number). Plus, putting the instance number on the fi...

#589 · 17 Feb 2014 @ 10:24 PST · Tools · Reply
Thread: Zip/GZip Support in TTU 14?
TPT's implementation with TDCH will most likely not support .gz files. The TPT implementation of the HDFS API will support .gz files.

#588 · 17 Feb 2014 @ 10:22 PST · Tools · Reply
Thread: TPT 14.10 output to named pipe and then gzip to final files
You can specify multiple instances of the DataConnector operator (as the file writer) and use the -C option on the command line. TPT will round-robin the data to each instance and each instance wil...
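
The setup in post #588 can be sketched as a command line plus the corresponding script idea; the file and job names are hypothetical, and the round-robin behavior of -C is as described in the post above:

```
# Script defines multiple file-writer instances, e.g.:
#   APPLY TO OPERATOR (FILE_WRITER[2]) ...
# -C asks TPT to distribute data buffers round-robin across the
# consumer instances, so each instance writes its own output file:
tbuild -f export_two_files.txt -C my_export_job
```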
#587 · 17 Feb 2014 @ 05:49 PST · Tools · Reply
Thread: TPT 14.10 output to named pipe and then gzip to final files
What are you trying to accomplish here? TPT can both read and write gzip files. It might be easier to do that than to use pipes.
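
The pipe-plus-gzip flow discussed in this thread can be simulated with plain shell, independent of TPT (a sketch; the printf stands in for a TPT file-writer instance pointed at the pipe, and the file names are arbitrary):

```shell
# Create the named pipe the writer and compressor will share.
mkfifo export.pipe

# Start the consumer first; it blocks until a writer opens the pipe.
gzip -c < export.pipe > export.gz &

# Stand-in for the TPT writer: delimited rows into the pipe.
printf '1|alice\n2|bob\n' > export.pipe

wait                    # let gzip finish draining the pipe
gunzip -c export.gz     # the original rows, recovered from the .gz file
```

Opening a FIFO for writing blocks until a reader has it open (and vice versa), which is why the consumer is started in the background before the writer runs.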
#586 · 17 Feb 2014 @ 09:48 PST · Tools · Reply
Thread: Need Header Row in FastExport / TPT Export.?
BTEQ is often known (or referred to) as a report-writing tool. It has those capabilities. FastExport and TPT are high-speed bulk loading/unloading tools. It is best to use BTEQ for the types of ...

#585 · 12 Feb 2014 @ 04:32 PST · Tools · Reply
Thread: DEFINE SCHEMA target_schema FROM TABLE
Ok, well, you found a bug in our code and we will get it fixed. The workaround is to use that obsolete syntax. We never deprecate the syntax, even when we make it obsolete. So, it will work for yo...
