50 - 100 of 216 tags for tpt

The application team experienced a deadlock issue.
There were three different jobs running:  Job A was blocked by Job B.  Job B was blocked by Job A.  Job C was blocked by both Job A and Job B.

Hello good people
I have a table of ~77 million rows that I need to export to fixed-width text files; however, I need to impose a limit of 5 million rows per file.
I am currently achieving this with "SELECT * FROM MYTABLE QUALIFY ROW_NUMBER() OVER(ORDER BY COL1 ASC) BETWEEN 1 AND 5000000"
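For the later files, the same QUALIFY window can simply be shifted by 5,000,000 rows; a sketch using the table and column from the query above:

/* file 1 */
SELECT * FROM MYTABLE
QUALIFY ROW_NUMBER() OVER (ORDER BY COL1 ASC) BETWEEN 1 AND 5000000;

/* file 2 -- shift both bounds by 5,000,000 for each subsequent export (about 16 files for ~77 million rows) */
SELECT * FROM MYTABLE
QUALIFY ROW_NUMBER() OVER (ORDER BY COL1 ASC) BETWEEN 5000001 AND 10000000;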

Hello,
We are trying to load a pipe-delimited flat file with " as the text qualifier, utilizing a TPT script, but our loads seem to be failing since there is an embedded " in certain fields. It's just sending the error record to an error file and failing after that. Is there some option or add-on in the TPT script to load this data as is?
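For reference, the DataConnector producer has quote-handling attributes; a minimal sketch of the relevant part (operator, schema, and file names are placeholders, and how embedded quotes are treated depends on the TPT release, so treat this only as a starting point):

DEFINE OPERATOR FILE_READER
TYPE DATACONNECTOR PRODUCER
SCHEMA INPUT_SCHEMA
ATTRIBUTES
(
  VARCHAR FileName       = 'input.txt',     /* placeholder file name       */
  VARCHAR Format         = 'Delimited',
  VARCHAR TextDelimiter  = '|',
  VARCHAR QuotedData     = 'Optional',      /* treat " as a text qualifier */
  VARCHAR OpenQuoteMark  = '"',
  VARCHAR CloseQuoteMark = '"'
);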

Hi,
 

Hello:
I posted a few weeks ago here just for some background.
http://forums.teradata.com/forum/tools/easy-loader-issue 
I'm finding I can't use Easy Loader to load a Target table unless that table is created under my default logon directory and that table is qualified in the command line with my default database.  
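For comparison, this is roughly how I would expect an Easy Loader run to look with the target table fully qualified rather than relying on the default database (system, credentials, file, table, and job names are all placeholders):

tdload -f /data/input.txt -h mytdpid -u myuser -p mypassword -t mydb.mytable my_tdload_job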

Hello guys, I'm new to TPT and I run into 

 

Here is a snippet of my TPT code 

I want to create and test some user templates, which I'm going to base on the Teradata provided templates $LOAD and $EXPORT.
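As a starting point, a job built purely on the shipped templates can be as small as this; the job variable names it relies on (SourceTdpId, TargetTable, and so on) should be checked against the template files in your TPT installation's templates directory:

DEFINE JOB copy_table_job
(
  APPLY $INSERT TO OPERATOR ($LOAD)
  SELECT * FROM OPERATOR ($EXPORT);
);

It would then be run with something like: tbuild -f copy_table_job.tpt -v my_jobvars.txt my_job_name (file and job names are placeholders).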

On August 27, between approximately 8:30 and 11:30, something caused our mainframe (MF) TPT jobs to go awry. The MF team says nothing changed; the DB team says nothing changed. The error we are now getting is:
TPT_INFRA: TPT04013: Error: opening temporary job script output file: "EDC5000I No error occurred." (0).

Dear colleagues,
I want to extract information from a Teradata environment and load it into SQL Server (I know this is not done :-P). For now my process writes to a file, and then the file is loaded into SQL Server. Important to note: I do not want to use SQL Server SSIS, but the TPT command-line version!
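If it helps, exporting to a flat file purely from the command line can be done with a template-only job along these lines; the job variables the templates expect (SelectStmt, FileName, Format, and the Source* logon variables) are assumptions to verify against your TTU templates:

DEFINE JOB export_to_file
(
  APPLY TO OPERATOR ($FILE_WRITER)
  SELECT * FROM OPERATOR ($EXPORT);
);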

Hi,
I was using TTU 13.10 until now, when I upgraded my production machine to TTU 15.00.
But now I'm getting this error:
TPT19435 pmRead failed. EOF encountered before end of record (35)
The process that loads data to the named pipes doesn't have time to connect to the pipes.
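One thing that has helped with pipe timing in the past is reading the pipes through the Named Pipes Access Module instead of as plain files, with a longer open timeout; a sketch of the relevant DataConnector attributes (the module name shown is the Linux one, and the open_timeout keyword is from memory, so verify both against the Access Module documentation):

VARCHAR AccessModuleName    = 'np_axsmod.so',      /* Named Pipes Access Module, Linux library name */
VARCHAR AccessModuleInitStr = 'open_timeout=900',  /* assumed option: wait up to 900 seconds for the writer to connect */
VARCHAR FileName            = '/var/tmp/myload.fifo'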

Hi,
I am using TPT to load an extract. The job runs fine, but it prints a lot of statements to stdout on Unix.
Example :

I have a file to load which doesn't seem to be in too unusual a format, but I'm not sure if it is possible to load it using TPT.  Can anyone let me know whether this is possible, or if I have to write some unix script to edit the file before I can load it.
 
The data looks like this:

Our enterprise Teradata version is 13.10.07.12. I have installed TTU 15 on our 64-bit Windows server. While trying to use the TPT Wizard to connect to SQL Server using an ODBC DSN, I am getting Driver Error: There was an error initializing the driver for usage.

Hi All
I want to know one thing. I submit one TPT Load job. If the job fails, the target table will be locked (similar to FastLoad).
Both remedies are valid, but which is the better one?
use a standalone TPT Load job to apply and release the lock
or
use DDL to drop and recreate the table every time.
 
Thanks
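For what it's worth, the drop-and-recreate remedy is just plain DDL, while the other route is rerunning the same tbuild job so the Load operator can complete and release the lock. A sketch with placeholder names:

DROP TABLE mydb.target_table;        /* releases the load lock on the target        */
DROP TABLE mydb.target_table_log;    /* also clean up the restart log table ...     */
DROP TABLE mydb.target_table_e1;     /* ... and the two error tables, if they exist */
DROP TABLE mydb.target_table_e2;
CREATE TABLE mydb.target_table (col1 INTEGER);   /* recreate from your saved DDL; one column shown as a placeholder */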

We sometimes see the LOAD operator leave a table in a locked state if the TPT job fails at certain places in the script.  The only solution that works for us is to drop and recreate the table.  But will the INSERTER operator ever leave a table locked?
 
Jerry

Hi,
The subject might be old, but I was unable to figure out a solution, so please take a look.
I am trying to run TPT using the script below:

Does anyone have an example of a TPT script with format = text?
I'm trying to load a file with variable length records into a table with a single varchar(1000) column.  I tried basing the script on the qstart1 example and just changing 'delimited' to 'text', but I get an error:
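For what it's worth, with Format = 'Text' the schema is expected to describe the whole record as a single character column; a sketch along the lines of qstart1 (names and lengths are placeholders):

DEFINE SCHEMA TEXT_SCHEMA
(
  Record_Text VARCHAR(1000)
);

DEFINE OPERATOR FILE_READER
TYPE DATACONNECTOR PRODUCER
SCHEMA TEXT_SCHEMA
ATTRIBUTES
(
  VARCHAR FileName = 'input.txt',   /* placeholder file name */
  VARCHAR Format   = 'Text',
  VARCHAR OpenMode = 'Read'
);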

If we have a file which contains multiple record types, each with a different number of fields and record length but identified by the value in the first field, is it possible to load this using TPT?
A simple example would be a file with a header and trailer, e.g.:

01|20140523
10|ID01|JOHN
10|ID02|PETER

Hi All,
Simple (basic?!) TPT question.  I know IBM MQ can be a Producer DataConnector (i.e., an input) for a TPT process, but can it also be a Consumer DataConnector (i.e., an output)?
I can only find references to it as an Input source in the TPT 14.00 documentation, not an output source.
All assistance greatly received.

Hi,
 
We are using Teradata Parallel Transporter (TPT) loaders as both the source and the target connection.
 
     Source     - TPT Export
     Target      - TPT Load
 
Using them, performance has improved significantly compared to a relational connection.
 

Hi all, I'm pretty new to using TPT and am working with Version 14.00.00.07.  In the last week or so I have been finding that almost all of my jobs involving a simple load from one csv into one table are failing.

Hi guys,
I would like to understand people's views on utilising DBC for dynamically generating scripts for data processing.
Considering DBC is Teradata's own metadata store, and its development is owned and managed by Teradata, do you feel it is appropriate to develop frameworks for processing data based upon it?

Hi to all,
I want to load data using the TPT utilities into a table which has special characters in the table name and column names. In my code I build the DML statement:
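One approach is to double-quote the non-standard identifiers inside the DML string; a hedged sketch (database, table, and column names are invented, and the session character set still determines which characters are legal):

APPLY
  ('INSERT INTO mydb."my#table" ("col 1", "col-2") VALUES (:col_1, :col_2);')
TO OPERATOR ($LOAD)
SELECT * FROM OPERATOR ($FILE_READER);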

Hi All,
Please help me figure out where I am going wrong. The error I'm getting relates to the FileName given.
This is the first time I am using TPT.
Below is the TPT Script:
 
DEFINE JOB FlatJOBS
DESCRIPTION 'FLAT FILE FORMAT'
(
DEFINE SCHEMA Schemas
DESCRIPTION 'Schemas'
(
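The script above is cut off here, but for comparison this is the minimal shape I would expect for the file-reader side; the DirectoryPath and FileName values are placeholders:

DEFINE OPERATOR FILE_READER
TYPE DATACONNECTOR PRODUCER
SCHEMA Schemas
ATTRIBUTES
(
  VARCHAR DirectoryPath = '/data/incoming/',   /* placeholder path      */
  VARCHAR FileName      = 'flatfile.txt',      /* placeholder file name */
  VARCHAR Format        = 'Delimited',
  VARCHAR TextDelimiter = '|',
  VARCHAR OpenMode      = 'Read'
);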

I have a CSV file that I'm attempting to load via TPT.  I have created the CSV file in Excel.  When I try to load the file with the appropriate number of delimiters, I am getting a BUFFERMAXSIZE error.  When I add another delimiter to the end of each record, the file loads just fine.
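If the failure happens because rows without the trailing delimiter look like short records, the DataConnector's AcceptMissingColumns attribute may be the cleaner fix (value spelling from memory, so double-check the Data Connector documentation):

VARCHAR AcceptMissingColumns = 'Yes'   /* pad missing trailing columns with NULL instead of rejecting the row */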

Hi everyone!!
I have several TPT scripts (whose input is a FIFO file) that used to run properly in TPT 13.0; now the system I'm working on is upgrading to 14.10 and the scripts fail...
the error is:
        "TPT19120 !ERROR! multi-instance read feature requires REGULAR file names."

Hello, 

Hi all,
 
I am encountering an unexpected situation when using multiple DataConnector operators UNION ALLed together, and would appreciate your input/suggestions.
 

Hi,
 
I am trying to export data from a SQL query to a file using the TPT Export operator.
My SQL query runs fine in SQL Assistant, but when it is run through TPT it reports syntax errors for problems that don't exist.
I have used two single quotes wherever I have one single quote in my query.
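For reference, this is the shape I would expect for a quoted literal inside SelectStmt, with the single quotes doubled because the whole attribute value is itself a quoted string (names and the literal are made up):

DEFINE OPERATOR EXPORT_OP
TYPE EXPORT
SCHEMA EXPORT_SCHEMA
ATTRIBUTES
(
  VARCHAR TdpId        = 'mytdp',
  VARCHAR UserName     = 'myuser',
  VARCHAR UserPassword = 'mypwd',
  VARCHAR SelectStmt   = 'SELECT col1, col2 FROM mydb.mytable WHERE col2 = ''ABC'';'
);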
My requirements:

I tried to write TPT output to a named pipe, and then use gzip to compress the stream into the final files.
gzip < /var/tmp/fact_abc_segment.fifo-1 > ~/data/fact_abc_segment.1.fastload.gz
gzip < /var/tmp/fact_abc_segment.fifo-8 > ~/data/fact_abc_segment.8.fastload.gz
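For completeness, the pattern that has worked is to create the FIFOs and start the gzip readers before kicking off tbuild, roughly like this (the TPT script, job variables file, and job name are placeholders):

mkfifo /var/tmp/fact_abc_segment.fifo-1                                              # one FIFO per writer instance
gzip < /var/tmp/fact_abc_segment.fifo-1 > ~/data/fact_abc_segment.1.fastload.gz &    # readers must be running first
tbuild -f export_fact_abc.tpt -v jobvars.txt fact_abc_export                         # then start the TPT export job
wait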
 

Hi All
We have a requirement where we need to load one single table in parallel from different source tables.
Say, TableX needs to be loaded from TableA, TableB and TableC. They have different source data (Rows).
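One way to do this in a single job is to UNION ALL several Export operators (or template instances), all returning the identical schema, into one Load operator; a sketch with placeholder operator names, each selecting from one of the source tables:

APPLY $INSERT TO OPERATOR ($LOAD)
SELECT * FROM OPERATOR (EXPORT_A)
UNION ALL
SELECT * FROM OPERATOR (EXPORT_B)
UNION ALL
SELECT * FROM OPERATOR (EXPORT_C);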

Since I am new to Teradata, my DBA suggested using the TPT Wizard to export data from a Teradata table to a flat file in | delimited format.
However, I noticed that all columns needed to be VARCHAR for this export using TPT. Even though I changed the TPT type to VARCHAR, I am getting the error below:
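The usual workaround is to do the conversion in the SELECT itself, so every exported column is already character data; column and table names below are placeholders:

SELECT CAST(order_id   AS VARCHAR(11)) AS order_id,
       CAST(order_date AS VARCHAR(10)) AS order_date,
       CAST(amount     AS VARCHAR(20)) AS amount
FROM   mydb.mytable;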

Hi, I am trying to use DEFINE SCHEMA target_schemaz FROM TABLE in a tbuild script and it is failing with:
"Teradata Parallel Transporter Version 14.10.00.01

TPT_INFRA: TPT05004: CLI error 224 in connecting session (function ConnectSession).

MTDP: EM_NOHOST(224): name not in HOSTS file or names database.

 

Hi,
We have some jobs loading data via TPT which create ET tables for duplicate records.
Does anyone know how to convert HostData values back into the actual record from the ET table? I have limitations: I cannot use FastExport or the Varbyte2Varchar UDF.

Hi TPT Users,
I see there's some issue with the datafile that I have been trying to load.
I keep getting exit status 8 and exit status 12 whenever I try to load the full file.

Hi All,
I am having one problem.
I know a BTET transaction is "all or none". If I submit a multi-statement transaction and any one statement fails, then the entire transaction will roll back in BTET mode. However, in ANSI mode, if I use COMMIT then only the failed statements will roll back while the others will commit.
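As a quick illustration of the Teradata (BTET) mode behaviour described above (table name is a placeholder):

BT;                              /* Teradata-mode explicit transaction                 */
INSERT INTO mydb.t1 VALUES (1);
INSERT INTO mydb.t1 VALUES (2);  /* if either insert fails, both are rolled back       */
ET;                              /* in ANSI mode, only the failing request rolls back  */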

I have a file with fixed-length records, but they need to be interpreted in various ways, depending on the first field (let's call it RECORD_TYPE). We currently do this easily with MultiLoad:

I have coded a simple TPT script using operator templates ($EXPORT, $INSERT and $LOAD) to copy data of Table A from TDSERVER-A to TDSERVER-B. TDSERVER-A and TDSERVER-B use LDAP-based authentication. TPT (tbuild) raises errors when I run the script either by initializing the LogonMech, UserName, UserPassword variables for the Source/Target operators in the JobVars file
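In case a comparison helps, a job variables file for a template-based copy over LDAP might look like this; all credentials are placeholders, and the exact variable names the templates expect vary by TTU release, so treat them as illustrative and check the shipped template files:

SourceTdpId        = 'TDSERVER-A'
,SourceLogonMech    = 'LDAP'
,SourceUserName     = 'src_user'
,SourceUserPassword = 'src_password'
,TargetTdpId        = 'TDSERVER-B'
,TargetLogonMech    = 'LDAP'
,TargetUserName     = 'tgt_user'
,TargetUserPassword = 'tgt_password'
,TargetTable        = 'mydb.TableA'
,SelectStmt         = 'SELECT * FROM mydb.TableA;'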

Hi all,
I am building an SSIS package in SQL Server BIDS 2008 version to pull data from Teradata and dump to SQL Server.
1. I am able to connect to Teradata successfully as Source.
2. Connection to SQL Server as Destination is also good.
3. When I am running the package I am getting the error
*********  ERROR *******

Insert into emp values
(:FNAME ,.......

The sample code above from TPT works fine. I want to convert null values in the flat file to blanks while loading.

insert into emp values ( COALESCE(:Fname,' '),.... -- Throws ERROR

Lately, when running TPT jobs, we've experienced job failures. When looking at the logs for the TPT jobs, sometimes (sporadically) the log will start off normally referencing the correct job, but will switch over a few lines in and reference a completely separate, unrelated job.

Hi All,
We are trying to extract data from DB2 tables and load it into Teradata tables, and we are using the tbuild utility in TPT.
 
While connecting to DB2 we are getting the following error.
 
W_0_o_Test5: TPT10551: CLI '215' occurred while connecting to the RDBMS

How do I find the value of the TPT parallel connection limit?  Is this a field that is displayed in DBS Control?  If so, which field is it?  MaxLoadTasks?

Can somebody tell me if there is a way to use a COBOL INMOD function with TPT? I have a COBOL program whose output needs to be loaded into a TD table using TPT after checking each field for nulls. I was using MLOAD earlier and the INMOD function could handle this check.

Hi,
I'm using TPT 13.10 and I need to write a script for loading a flat file into a Teradata table.
I would like to store the name of the flat file in the table, but the file is unformatted and I have read that the metadata option requires a delimited file.
Can anyone provide any solution for this?
 Thanks !
 

I have a file I am loading to a table in my database via TPT. Everything is working properly; however, I am dropping records where the date format is m/d/yyyy (as opposed to mm/dd/yyyy). Is there a way, in the INSERT statement in APPLY, to convert the field on the fly via CASE, CAST, or some string manipulation?

Hi All,
Can someone please explain what this option means? NoSpool or Spool: which is advised for extractions involving huge tables with more than 100 million rows?
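For context, this is an attribute of the Export operator; a sketch of where it sits (other values are placeholders):

DEFINE OPERATOR EXPORT_OP
TYPE EXPORT
SCHEMA EXPORT_SCHEMA
ATTRIBUTES
(
  VARCHAR TdpId        = 'mytdp',
  VARCHAR UserName     = 'myuser',
  VARCHAR UserPassword = 'mypwd',
  VARCHAR SpoolMode    = 'NoSpool',   /* 'Spool' (default), 'NoSpool', or 'NoSpoolOnly' */
  VARCHAR SelectStmt   = 'SELECT * FROM mydb.bigtable;'
);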
 

The query below is throwing
TPT12108: Output Schema does not match data from SELECT statement
Can somebody help me with this?
DEFINE JOB DBtoFile
/* 4 */ (
/* 5 */ DEFINE OPERATOR W_1_o_DBtoFile
/* 6 */ TYPE DATACONNECTOR CONSUMER
/* 7 */ SCHEMA *
/* 8 */ ATTRIBUTES
/* 9 */ (

When there are several TPT Load jobs running at the same time, TPT will randomly fail with the following error(s), but once we rerun the job, it succeeds.
We try to let TPT figure out the file layout based on the target table's structure. It seems that the error happens before the Load operator even kicks off.