# | Date | Forum | Type | Thread | Post
1059 | 21 Mar 2016 @ 02:21 PDT | Tools | Reply | Very large table loading fails with Fastload, what are other possible solutions? | TPT can actually support having multiple processes read from a single, large file, and the performance would probably be better than what you can accomplish via FastLoad.
1058 | 21 Mar 2016 @ 11:35 PDT | Tools | Reply | TPT load failure - Please suggest | A correction to what Fred said above. DataDirect does not usually provide drivers for the mainframe. When discussing the DB2 drivers above, no mention was made of the platform. TPT does work wit...
1057 | 21 Mar 2016 @ 10:54 PDT | Tools | Reply | TPT - Handling Delimited File with Embedded CR\LF in field | Re: Thompson: you did not indicate which version of TPT you are running. TPT versions prior to 15.10 do not support embedded CR/LF, even in quoted strings.
1056 | 15 Mar 2016 @ 12:26 PDT | Tools | Reply | TPT : multiple sources in an operator ? | For Job Variables, are you asking about the size of the name? Or the assigned value? I believe the assigned value can hold at least 1MB. We have to because some operator attributes (and therefore jo...
1055 | 14 Mar 2016 @ 11:05 PDT | Tools | Reply | TPT15.10. The need for TWB_ROOT ? | TWB_ROOT does not need to be set by the user. Please provide the following: 1. the command you are using to run TPT; 2. the contents of your PATH environment variable. You say you are running TP...
1054 | 14 Mar 2016 @ 10:22 PDT | Tools | Reply | TPT script: error while executing TPT script | You ALWAYS need to tell us what version of TPT you are using. That way we will know if there is a fix for a particular version that you might be able to use. This error sometimes comes up when yo...
1053 | 11 Mar 2016 @ 11:08 PST | Tools | Reply | Receive an error message while trying to use fast load | The error explains the problem. The logon credentials are not correct. Can you connect to the DBS using BTEQ and those same credentials?
1052 | 07 Mar 2016 @ 11:09 PST | Tools | Reply | how to load different files using multiload into different tables. | Those 5 IMPORTs are processed sequentially.
1051 | 03 Mar 2016 @ 10:58 PST | Tools | Reply | How to extract the count for the odbc source records in TPT | Take a look in the TPT binary job log at the TWB_STATUS private log: command: tlogview -j <job-id> -f TWB_STATUS. We provide that data in a fixed type format. We also publish in the manu...
1050 | 03 Mar 2016 @ 10:57 PST | Tools | Reply | How to extract the count for source & target records in TPT | Take a look in the TPT binary job log at the TWB_STATUS private log: command: tlogview -j <job-id> -f TWB_STATUS. We provide that data in a fixed type format. We also publish in the manual ...
1049 | 25 Feb 2016 @ 03:06 PST | Tools | Reply | TPT Statement Not Working: DEFINE SCHEMA <SchemaName> FROM TABLE '<TableName>'; | Actually, sorry, I can further simplify the script: DEFINE JOB . . . . ( APPLY $INSERT TO OPERATOR ($UPDATE) SELECT * FROM OPERATOR ($EXPORT) ; ); With this syntax...
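The simplified template-based syntax in the post above can be sketched as a complete script. This is a minimal sketch, not a verified job: the job name is hypothetical, and it assumes the operator templates ($EXPORT, $UPDATE, $INSERT) are used with connection attributes (e.g. SourceTdpId, TargetTable) supplied in a job variable file.

```
DEFINE JOB simplified_template_job
(
  /* Hypothetical sketch: with operator templates, no DEFINE SCHEMA or
     DEFINE OPERATOR statements are needed -- TPT derives the schema and
     operator definitions, and attributes such as SourceTdpId and
     TargetTable come from a job variable file. */
  APPLY $INSERT TO OPERATOR ($UPDATE)
  SELECT * FROM OPERATOR ($EXPORT);
);
```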
1048 | 25 Feb 2016 @ 02:51 PST | Tools | Reply | TPT Statement Not Working: DEFINE SCHEMA <SchemaName> FROM TABLE '<TableName>'; | Please try removing the parentheses from before and after the $INSERT. Also, if you use templates, you can shorten your script quite a lot. All you would need is this: DEFINE JOB . . . . ( DEF...
1047 | 23 Feb 2016 @ 02:41 PST | Tools | Reply | fast export and mload with unicode column type and value | UTF8 characters can be 1-, 2-, or 3-bytes in length. Thus, even with a character set of UTF8, a 7-bit ASCII character is still a single-byte UTF8 character. However, our load/unload produ...
1046 | 23 Feb 2016 @ 02:14 PST | Tools | Reply | fast export and mload with unicode column type and value | When using a client session character set of ASCII, the unicode field will only be loaded properly if the data for that column consists of single-byte characters. If the data contains m...
1045 | 22 Feb 2016 @ 10:10 PST | Tools | Reply | TPT Stream Operator | The idea of mini-batch is to load into a staging table, and then perform an INS-SEL or MERGE-INTO into the target table. For this type of mini-batch, the Stream operator is actually the wr...
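The mini-batch pattern described in this post can be sketched in Teradata-style SQL. The table and column names (staging_table, target_table, id, col1) are hypothetical:

```sql
-- Step 1: bulk-load the staging table (via a TPT Load or Update job), then:
INSERT INTO target_table
SELECT * FROM staging_table;

-- Or, if existing rows must be updated as well, use MERGE-INTO
-- (hypothetical join key "id"):
MERGE INTO target_table AS t
USING staging_table AS s
  ON t.id = s.id
WHEN MATCHED THEN
  UPDATE SET col1 = s.col1
WHEN NOT MATCHED THEN
  INSERT (id, col1) VALUES (s.id, s.col1);
```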
1044 | 19 Feb 2016 @ 06:10 PST | Tools | Reply | Zip/GZip Support in TTU 14? | If TDCH (Teradata Connector For Hadoop) supports that compressed file format, then TPT can be used (TPT integrates with TDCH).
1043 | 11 Feb 2016 @ 06:40 PST | Tools | Reply | fast export and mload with unicode column type and value | The number of bytes actually exported depends on the "export width" setting in the DBS. In order to better investigate the different errors, is it possible to send me some sample...
1042 | 10 Feb 2016 @ 02:37 PST | Tools | Reply | fast export and mload with unicode column type and value | What version of TPT are you using? Can you send me a sample of the data? Since you indicated the errors were database errors, I am assuming the job ran to completion and the rows in question were...
1041 | 08 Feb 2016 @ 08:31 PST | Tools | Reply | TPT Wizard driver error | Does this error occur when using the Wizard to set up a job? Or does this error occur when you are actually running the TPT job?
1040 | 08 Feb 2016 @ 08:28 PST | Tools | Reply | TDLOADER | What version of TPT are you using?
1039 | 01 Feb 2016 @ 03:39 PST | Tools | Reply | TPT - Delimited Data Parsing error: Invalid multi-byte character | Please provide the script and first few rows of data.
1038 | 28 Jan 2016 @ 08:46 PST | Tools | Reply | Using name of Flat File in TPT Load | Please provide your entire script and job variable file so I can take a look.
1037 | 25 Jan 2016 @ 02:49 PST | Tools | Reply | TPT_INFRA: TPT02019: Error: opening file '$SCHEMA_GEN_D_TBL001.txt': "No such file or directory" (error code 2). | This error is not due to the job name. It is due to a temp file that is created for a brief moment; when 2 jobs are started simultaneously, there is a slight chance of a collision. We fixed th...
1036 | 22 Jan 2016 @ 03:03 PST | Tools | Reply | TPT Date format error | If your destination date format is 'yyyy-mm-dd' then that is the format in which the data must be sent to Teradata. Your SQL is casting as mm.dd/yyyy (unless you meant mm/dd/yyyy, but even...
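A minimal sketch of the export-side fix this reply suggests, using the common Teradata nested-cast idiom; the table and column names are hypothetical:

```sql
-- Render a DATE column as 'yyyy-mm-dd' text so it matches the
-- destination date format (hypothetical column/table names):
SELECT CAST(CAST(order_date AS FORMAT 'YYYY-MM-DD') AS CHAR(10))
FROM source_table;
```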
1035 | 13 Jan 2016 @ 12:19 PST | Tools | Reply | TDLOAD Utility - Input DATE and TIME STAMP format issue | tdload does not support that functionality (yet) from a global standpoint. Until we add that feature, you will have to write a TPT script and use the FORMATIN/FORMATOUT syntax in the schema to hav...