1059 | 21 Mar 2016 @ 02:21 PDT | Tools | Reply | Very large table loading fails with Fastload, what are other possible solutions? | TPT can actually support having multiple processes read from a single, large file, and the performance would probably be better than what you can accomplish via FastLoad.
|
1058 | 21 Mar 2016 @ 11:35 PDT | Tools | Reply | TPT load failure - Please suggest | A correction to what Fred said above.
DataDirect does not usually provide drivers for the mainframe.
When discussing the DB2 drivers above, no mention was made of the platform.
TPT does work wit... |
1057 | 21 Mar 2016 @ 10:54 PDT | Tools | Reply | TPT - Handling Delimited File with Embedded CR\LF in field | Re: Thompson: you did not indicate which version of TPT you are running.
TPT versions prior to 15.10 do not support embedded CR/LF, even in quoted strings.
|
1056 | 15 Mar 2016 @ 12:26 PDT | Tools | Reply | TPT : multiple sources in an operator ? | For Job Variables, are you asking about the size of the name? Or the assigned value?
I believe the assigned value can hold at least 1MB.
We have to because some operator attributes (and therefore jo... |
1055 | 14 Mar 2016 @ 11:05 PDT | Tools | Reply | TPT15.10. The need for TWB_ROOT ? | TWB_ROOT does not need to be set by the user.
Please provide the following:
1. the command you are using to run TPT
2. the contents of your PATH environment variable.
You say you are running TP... |
1054 | 14 Mar 2016 @ 10:22 PDT | Tools | Reply | TPT script: error while executing TPT script | You ALWAYS need to tell us what version of TPT you are using.
That way we will know if there is a fix for a particular version that you might be able to use.
This error sometimes comes up when yo... |
1053 | 11 Mar 2016 @ 11:08 PST | Tools | Reply | Receive an error message while trying to use fast load | The error explains the problem.
The logon credentials are not correct.
Can you connect to the DBS using BTEQ and those same credentials?
|
1052 | 07 Mar 2016 @ 11:09 PST | Tools | Reply | how to load different files using multiload into different tables. | Those 5 IMPORTs are processed sequentially.
|
1051 | 03 Mar 2016 @ 10:58 PST | Tools | Reply | How to extract the count for the odbc source records in TPT | Take a look in the TPT binary job log at the TWB_STATUS private log:
command: tlogview -j <job-id> -f TWB_STATUS
We provide that data in a fixed type format.
We also publish in the manu... |
1050 | 03 Mar 2016 @ 10:57 PST | Tools | Reply | How to extract the count for source & target records in TPT | Take a look in the TPT binary job log at the TWB_STATUS private log:
command: tlogview -j <job-id> -f TWB_STATUS
We provide that data in a fixed type format.
We also publish in the manual ... |
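The two replies above can be combined into a small command-line sketch; the job id shown is hypothetical, and only the `tlogview -j <job-id> -f TWB_STATUS` invocation itself comes from the replies:

```shell
# Dump the TWB_STATUS private log for a finished TPT job
# (replace MyLoadJob-123 with your actual job id):
tlogview -j MyLoadJob-123 -f TWB_STATUS

# The output is fixed-format; a simple grep can narrow it to one
# operator's rows (operator name here is hypothetical):
tlogview -j MyLoadJob-123 -f TWB_STATUS | grep 'EXPORT'
```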
1049 | 25 Feb 2016 @ 03:06 PST | Tools | Reply | TPT Statement Not Working: DEFINE SCHEMA <SchemaName> FROM TABLE '<TableName>'; | Actually, sorry, I can further simplify the script:
DEFINE JOB . . . .
(
  APPLY $INSERT
  TO OPERATOR ($UPDATE)
  SELECT * FROM OPERATOR ($EXPORT);
);
With this syntax... |
1048 | 25 Feb 2016 @ 02:51 PST | Tools | Reply | TPT Statement Not Working: DEFINE SCHEMA <SchemaName> FROM TABLE '<TableName>'; | Please try removing the parentheses from before and after the $INSERT.
Also, if you use templates, you can shorten your script quite a lot.
All you would need is this:
DEFINE JOB . . . .
(
DEF... |
1047 | 23 Feb 2016 @ 02:41 PST | Tools | Reply | fast export and mload with unicode column type and value | UTF8 characters can be 1-, 2-, or 3-bytes in length.
Thus, even with a character set of UTF8, a 7-bit ASCII character is still a single-byte UTF8 character.
However, our load/unload produ... |
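The byte-length behavior described above can be verified outside the load tools; a minimal Python check (the sample characters are illustrative, not from the thread):

```python
# UTF-8 encodes a 7-bit ASCII character in 1 byte, while other
# characters take 2 or 3 bytes (within the BMP).
for ch in ("A", "é", "€"):
    print(repr(ch), "->", len(ch.encode("utf-8")), "byte(s)")
# 'A' -> 1 byte(s), 'é' -> 2 byte(s), '€' -> 3 byte(s)
```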
1046 | 23 Feb 2016 @ 02:14 PST | Tools | Reply | fast export and mload with unicode column type and value | When using a client session character set of ASCII, the unicode field will only be loaded properly if the data for that column consists of single-byte characters.
If the data contains m... |
1045 | 22 Feb 2016 @ 10:10 PST | Tools | Reply | TPT Stream Operator | The idea of mini-batch is to load into a staging table, and then perform an INS-SEL or MERGE-INTO into the target table.
For this type of mini-batch, the Stream operator is actually the wr... |
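The second step of the mini-batch pattern described above might look like the following; the table and column names (stg_sales, sales, sale_id, amount) are hypothetical, and the reply only names the INS-SEL and MERGE-INTO techniques:

```sql
-- Move rows from the staging table (loaded by TPT) into the target:
INSERT INTO sales (sale_id, amount)
SELECT sale_id, amount
FROM stg_sales;

-- Or, when the target may already contain some of the keys:
MERGE INTO sales t
USING stg_sales s
  ON t.sale_id = s.sale_id
WHEN MATCHED THEN UPDATE SET amount = s.amount
WHEN NOT MATCHED THEN INSERT (sale_id, amount) VALUES (s.sale_id, s.amount);
```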
1044 | 19 Feb 2016 @ 06:10 PST | Tools | Reply | Zip/GZip Support in TTU 14? | If TDCH (Teradata Connector For Hadoop) supports that compressed file format, then TPT can be used (TPT integrates with TDCH).
|
1043 | 11 Feb 2016 @ 06:40 PST | Tools | Reply | fast export and mload with unicode column type and value | The number of bytes actually exported depends on the "export width" setting in the DBS.
In order to better investigate the different errors, is it possible to send me some sample... |
1042 | 10 Feb 2016 @ 02:37 PST | Tools | Reply | fast export and mload with unicode column type and value | What version of TPT are you using?
Can you send me a sample of the data?
Since you indicated the errors were database errors, I am assuming the job ran to completion and the rows in question were... |
1041 | 08 Feb 2016 @ 08:31 PST | Tools | Reply | TPT Wizard driver error | Does this error occur when using the Wizard to set up a job?
Or does this error occur when you are actually running the TPT job?
|
1040 | 08 Feb 2016 @ 08:28 PST | Tools | Reply | TDLOADER | What version of TPT are you using?
|
1039 | 01 Feb 2016 @ 03:39 PST | Tools | Reply | TPT - Delimited Data Parsing error: Invalid multi-byte character | Please provide the script and first few rows of data.
|
1038 | 28 Jan 2016 @ 08:46 PST | Tools | Reply | Using name of Flat File in TPT Load | Please provide your entire script and job variable file so I can take a look.
|
1037 | 25 Jan 2016 @ 02:49 PST | Tools | Reply | TPT_INFRA: TPT02019: Error: opening file '$SCHEMA_GEN_D_TBL001.txt': "No such file or directory" (error code 2). | This error is not due to the job name.
It is due to a temp file that is created for a brief moment, and when 2 jobs are started simultaneously, there is a slight chance of a collision. We fixed th... |
1036 | 22 Jan 2016 @ 03:03 PST | Tools | Reply | TPT Date format error | If your destination date format is 'yyyy-mm-dd' then that is the format in which the data must be sent to Teradata. Your SQL is casting to mm.dd/yyyy (unless you meant mm/dd/yyyy, but even... |
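A sketch of the kind of cast the reply above is describing, using Teradata's FORMAT phrase; the column and table names (order_date, src_orders) are hypothetical:

```sql
-- Parse a character value that arrives as mm/dd/yyyy into a DATE:
SELECT CAST('01/25/2016' AS DATE FORMAT 'MM/DD/YYYY');

-- Emit a DATE column as the yyyy-mm-dd text the target expects:
SELECT CAST(CAST(order_date AS FORMAT 'YYYY-MM-DD') AS CHAR(10))
FROM src_orders;
```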
1035 | 13 Jan 2016 @ 12:19 PST | Tools | Reply | TDLOAD Utility - Input DATE and TIME STAMP format issue | tdload does not support that functionality (yet) from a global standpoint.
Until we add that feature, you will have to write a TPT script and use the FORMATIN/FORMATOUT syntax in the schema to hav... |