150 - 200 of 216 tags for tpt

If you’re waiting for an easy way to load data from one or more Teradata tables into a Teradata table without writing a Teradata PT script, wait no longer: Teradata PT Easy Loader can do it easily. In the 14.0 release, the tool can load data from a Teradata table or from SELECT statements.

Hello,

Teradata Parallel Transporter (Teradata PT) has fourteen different operators, and each behaves differently. This article provides a table to help you select the right operator for your Teradata PT job. You can view the table as an Excel (.xls) or PDF file.

Hi all,

I would appreciate it if someone could give some insight into this problem.

We have TPT available on our Teradata box; however, it is not installed on one of the other UNIX servers that we use to load files.

My question is: do we need to purchase a TPT license to install it on the UNIX server, or do we just need to install the drivers on that server?

Hi all,

I have to load some data using Teradata Parallel Transporter...

 

I have a record like:

1234ANA MARIA       20111209

 

I need to load something like:

PID: 1234
NAME: ANA MARIA

DATE: 2011-12-09

 

I expected to do something like:
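One way to approach this (a minimal sketch, untested; field widths, names, and the target table are hypothetical) is to read the file as fixed-width text, declare the schema as fixed-length CHAR fields, and cast in the APPLY:

DEFINE SCHEMA FIXED_REC_SCHEMA
(
    PID_TXT   CHAR(4),    /* positions 1-4: the PID           */
    NAME_TXT  CHAR(16),   /* next 16 bytes: the name, padded  */
    DATE_TXT  CHAR(8)     /* last 8 bytes: the date, YYYYMMDD */
);

APPLY ('INSERT INTO target_table (PID, CUST_NAME, LOAD_DATE)
        VALUES (:PID_TXT,
                TRIM(:NAME_TXT),
                :DATE_TXT (DATE, FORMAT ''YYYYMMDD''));')
TO OPERATOR (LOAD_OPERATOR);

The DataConnector producer would use Format = 'TEXT' so that each record is split purely by position; the (DATE, FORMAT '...') cast then converts 20111209 into DATE 2011-12-09 on insert.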

Hi all,

I got a "EOF encountered before end of record" error when running a TPT script with filename = '*' (file by file runs fine).

Do you want to have your Teradata Parallel Transporter (Teradata PT) Stream operator jobs run faster? Are you having difficulty determining the optimal pack factor for your Stream operator jobs? Knowing how to use the Stream operator’s PackMaximum attribute enables you to determine the optimal pack factor and thus improve the performance of your Stream operator job.
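For reference, turning the probing feature on is a one-attribute change. Here is a minimal sketch of a Stream operator definition (untested; connection attributes abbreviated, names hypothetical):

DEFINE OPERATOR STREAM_OPERATOR
DESCRIPTION 'Stream operator with pack-factor probing'
TYPE STREAM
SCHEMA *
ATTRIBUTES
(
    VARCHAR TdpId        = @TdpId,
    VARCHAR UserName     = @UserName,
    VARCHAR UserPassword = @UserPassword,
    VARCHAR PackMaximum  = 'Yes'   /* the job log then reports the highest workable pack factor */
);

Once you know the computed maximum, the usual practice is to drop PackMaximum and set an explicit INTEGER Pack = n for production runs, since the probing itself costs time.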

Hi,

I wanted to know of any good study guides or reference material, with sample scripts, for gaining knowledge of Teradata Parallel Transporter.

 

Thanks,

Kishore

What data format is used by TPT/FastLoad for the initial transfer to a Teradata instance?

We use fixed-width source files for FastLoad; the data is quite bulky (highly compressible).

Does FastLoad (or TPT) use a custom transmission format with a lower redundancy coefficient?

Can TPT compress data before sending it over the network, to save bandwidth on the ETL side?

Can FastLoad do likewise?

Hi, I want to use TPT in one of my projects and I am facing the following issue.

(I have read http://developer.teradata.com/tools/articles/teradata-parallel-transporter-unicode-usage#comment-17552 and am doing the things mentioned in that article, but something is missing; here are the details.)

We have made great strides in improving our handling of delimited data (i.e. CSV data) in Teradata Parallel Transporter for the TTU14.00 release. This article will describe the background of the data format, the original support, and the enhancements we have made.

Teradata Parallel Transporter (TPT) is the premier client load/unload tool, providing parallel extraction and loading of critical data. It is a flexible, high-performance data warehouse loading tool specifically optimized for the Teradata Database that enables high speed data extraction and loading under a single framework. TPT employs a single scripting language, utilizing the FastExport, MultiLoad, FastLoad, TPump and ODBC protocols. We are making great strides in each release to improve its ease-of-use and provide features that add value to the customer, their loading environment and their data.

I'm using the OleLoad utility to generate scripts that pull data directly from a SQL Server database and load a Teradata table. Everything runs fine when I run the load from the OleLoad GUI.

I used the TPT Wizard to create an export from Oracle into a 'formatted' file. When running this job, I get the error below, which abends the job. Any clue what this is indicative of?

Hi, I have a TPT job that connects to SQL Server 2005 to select rows and load a Teradata target table via the Update operator. I'm using the Microsoft SQL Server ODBC driver to connect (yes, I know it's not certified).
 

Reviewing the TPT manuals, I do not see a set of coding examples using Java. This is a show-stopper for anyone using Java programs to drive TPT. I would like to see such an example in this forum and in the manuals, because Java is arguably the most important enterprise language.

Teradata Parallel Transporter (TPT) is a flexible, high-performance Data Warehouse loading tool, specifically optimized for the Teradata Database, which enables data extraction, transformation and loading. TPT incorporates an infrastructure that provides a parallel execution environment for product components called “operators.”  These integrate with the infrastructure in a "plug-in" fashion and are thus interoperable.

TPT operators provide access to such external resources as files, DBMS tables, and Messaging Middleware products, and perform various filtering and transformation functions. The TPT infrastructure includes a high performance data transfer mechanism called the data stream, used for interchanging data between the operators.
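As a concrete illustration (operator and column names here are hypothetical), the data stream is what connects the producer named in the SELECT to the consumer named in the APPLY; no intermediate file is involved:

APPLY ('INSERT INTO target_table (col1, col2) VALUES (:col1, :col2);')
TO OPERATOR (LOAD_CONSUMER)                /* consumer: writes rows to the DBS */
SELECT * FROM OPERATOR (FILE_PRODUCER);    /* producer: reads the flat file    */

Rows read by FILE_PRODUCER are placed on the data stream and consumed by LOAD_CONSUMER as they arrive.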

Hi all,
I am doing TPT from the mainframe. I have a DataConnector producer reading from a file and a Load operator loading the data. The script runs fine without multiple instances. If I add [n] for multiple instances to the script, I get the syntax error below and am unable to resolve it. Any help is appreciated.

The script below works.

/* JGB. JM */

DEFINE JOB LOAD_TABLE
DESCRIPTION 'LOAD PTY TABLE'
(

Hi,

I've been trying to implement a TPT job using the RowErrFileName feature, but it doesn't seem to work. When I load an input file containing some errors, such as incomplete columns, it doesn't send the bad rows to the RowErrFileName file; instead the TPT job hits a fatal error and crashes.

Any help?

This is how I am writing the TPT DataConnector operator:

DEFINE OPERATOR CONECTOR_PRODUCER
DESCRIPTION 'DataConnector'
TYPE DATACONNECTOR PRODUCER
SCHEMA PnrItineraries_SCHEMA
ATTRIBUTES
(
VARCHAR PrivateLogName = 'dataconector_log' ,
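For comparison, here is a minimal complete sketch of such a definition with RowErrFileName set (file names and the delimiter are placeholders). Note that, as far as I know, row-level error capture applies to delimited data that the operator can at least parse, so a Format other than 'Delimited', or a TPT version that predates the feature, may still produce a fatal error instead:

DEFINE OPERATOR CONECTOR_PRODUCER
DESCRIPTION 'DataConnector'
TYPE DATACONNECTOR PRODUCER
SCHEMA PnrItineraries_SCHEMA
ATTRIBUTES
(
    VARCHAR PrivateLogName = 'dataconector_log',
    VARCHAR FileName       = 'input.txt',       /* placeholder */
    VARCHAR Format         = 'Delimited',
    VARCHAR TextDelimiter  = '|',               /* placeholder */
    VARCHAR RowErrFileName = 'bad_rows.txt'     /* placeholder */
);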

Hi,
I am new to TPT and am calling TPT 13.10 from the mainframe. I get an error: TPT_INFRA: TPT02040: Error: Insufficient main storage to allocate 973078581 bytes for a buffer for the name of an INCLUDE file.

Job script preprocessing failed.

I am calling a simple DDL Operator to create a table.

As my handle suggests, I am a mainframer, relatively new to Teradata. I'm trying to help my Teradata team members set up some mainframe-based ETL processes. I've done enough reading to figure out that I should be using TPT (TBUILD) with the FastLoad INMOD option, as I need to cleanse the data prior to handing it over to the producer operator. I've written a COBOL program that reads the file and performs the cleansing; in addition, I have a JCL that runs the TBUILD program, which runs the TPT script. I am currently running into a couple of issues and would like your suggestions and advice on how to proceed.
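If it helps, the TPT producer for this case is the FastLoad INMOD adapter operator; a minimal sketch is below (untested; the schema name and load-module name are placeholders, and the attribute spelling should be verified against the TPT reference for your release):

DEFINE OPERATOR INMOD_READER
DESCRIPTION 'FastLoad INMOD adapter'
TYPE FASTLOAD INMOD
SCHEMA INPUT_SCHEMA
ATTRIBUTES
(
    VARCHAR InmodName = 'CLEANSE'   /* placeholder: your INMOD load module */
);

The adapter calls the INMOD routine for each record, so the cleansing logic stays in your program rather than in the TPT script.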

Hi

I want to load multiple files into one target table. Every file has the same structure, so I want to use only one file reader.

I have defined:

VARCHAR FileList = 'Y',
VARCHAR ARRAY = ['file1.txt','file2.txt']

and I get the following error:
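For what it's worth, the documented FileList mechanism (as I understand it) does not take an inline array. FileList = 'Yes' instead tells the operator that FileName names a plain text file which itself lists the input files, one per line:

ATTRIBUTES
(
    VARCHAR FileList = 'Yes',
    VARCHAR FileName = 'filelist.txt'   /* filelist.txt contains file1.txt and file2.txt, one per line */
);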

Hello,

I am trying to use the Oracle UDFs provided by Teradata in my TPT script. The problem I am facing is with the use of single quotes (') inside the INSERT part of my APPLY.

Since the INSERT statement within the APPLY has to be contained within single quotes, I cannot use the nvl UDF, which requires the replacement string to be in single quotes.

Example: 'Insert into table1 values ( nvl(:column, 'replacement string'));'

If anyone knows the escape character that can be used to escape the single quotes enclosing the replacement string, please help me.
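In TPT job scripts, as in standard SQL string literals, a single quote inside a quoted string is escaped by doubling it, so the statement would look like this (the operator name is a placeholder):

APPLY ('INSERT INTO table1 VALUES ( nvl(:column, ''replacement string''));')
TO OPERATOR (UPDATE_OPERATOR);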

Hello,

I have created TPT scripts and they run fine at the prompt with the tbuild -d command.

I would like to call these scripts from a shell script, as with the other Teradata utilities, which I can invoke like this:

Ex: FastLoad

fload << EOF
..code
..code
EOF
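tbuild does not read the script from stdin the way fload does; it takes the script as a file. So the usual pattern is to write (or generate) the script file and point tbuild at it; a minimal sketch with hypothetical file and job names:

#!/bin/sh
# -f names the TPT script file; the trailing argument is the job name
tbuild -f /path/to/load_job.tpt my_load_job
rc=$?
if [ $rc -ne 0 ]; then
    echo "TPT job failed with return code $rc"
    exit $rc
fi

If the script uses job variables (@var references), they can be supplied from a file with the -v option.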

I am trying to load data into a table that contains some date fields. The format of the data in my source file is out of my control, and the date fields have to be converted when loaded into DATE fields in the Teradata table. The source file is large (80+ GB) and my batch window is not big enough to transform the file before loading it into Teradata.

Data in the input file could be something like:
4 30 2011 23:59:59;655609;12 31 2099 00:00:00;IP;NOK;0.0000000000000000
4 30 2011 23:59:59;655609;12 31 2099 00:00:00;MD;DKK;45000.0000000000000000

Does anybody have any info related to this?
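Assuming the file is read as delimited VARCHAR fields (Format = 'Delimited', TextDelimiter = ';'), one common pattern is to cast inside the APPLY with a FORMAT phrase, so no separate transformation pass over the 80+ GB file is needed. A sketch with hypothetical names (untested; note that FORMAT phrases match by position, so a single-digit month like the '4' in '4 30 2011' may need additional handling):

APPLY ('INSERT INTO target_table (start_ts, id, end_ts, c1, ccy, amt)
        VALUES (:f1 (TIMESTAMP(0), FORMAT ''MMBDDBYYYYBHH:MI:SS''),
                :f2,
                :f3 (TIMESTAMP(0), FORMAT ''MMBDDBYYYYBHH:MI:SS''),
                :f4, :f5, :f6);')
TO OPERATOR (LOAD_OPERATOR);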

TPT10517: At least 1 instance could not connect special sessions

I have checked the log; it has the following content:

TPT01514: Error: Unable to Open File. Error code 2, Reason code 530, Class code 501

No such file or directory

My data:
15-MAR-11 07.54.08.000000 AM
15-MAR-11 07.53.40.000000 AM
15-MAR-11 07.53.04.000000 AM
15-MAR-11 07.52.55.000000 AM
15-MAR-11 07.52.16.000000 AM

My current DDL:
CREATE multiset TABLE dbname.dbtable
(
CREATE_DATE timestamp(6)
);

My current TPT Insert statement:
'INSERT INTO '||@TargetDatabase||'.dbtable
(
CREATE_DATE = :CREATE_DATE (TIMESTAMP, FORMAT ''DD-MMM-YYBHH.MI.SS.S(6)BT'')
);'

The data is being stored in the database table with a century of '19' instead of '20'. My data will always be in century '20':

CREATE_DATE
1911-03-15 07:54:08
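The two-digit 'YY' is evidently being resolved into the 1900s, consistent with the 1911 result shown. If the data is guaranteed to be in the 2000s, one workaround (a sketch, untested) is to add a century in the same cast expression:

'INSERT INTO '||@TargetDatabase||'.dbtable (CREATE_DATE)
 VALUES (
   :CREATE_DATE (TIMESTAMP(6), FORMAT ''DD-MMM-YYBHH.MI.SS.S(6)BT'')
     + INTERVAL ''100'' YEAR(3)
 );'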

Hi All,

I have installed TTU v13 (64-bit) on AIX and have TPT. I am trying to invoke tbuild, but it throws 2001/2991 and other errors; the underlying issue is a message catalog error, meaning tbuild is not able to translate the error numbers into error messages.

I am loading a column from an Oracle table defined as DECIMAL(30,3). I am using the ODBC interface to extract the data. I have the column defined as DECIMAL(30,3) in the Teradata table. The load completes without errors, but the values are all divided by 1000; for example, Oracle value = 1000, Teradata value = 1. Maybe it's an ODBC setting.

In the TPT APPLY clause, you can specify the number of instances used for each operator. For example:

APPLY ('INSERT INTO TARGET TABLE (COL1, COL2) VALUES (:COL1, :COL2);') IGNORE DUPLICATE ROWS TO OPERATOR ( LOAD_OPERATOR[2])
SELECT * FROM OPERATOR(READ_OPERATOR[3]);

In this sample, my LOAD_OPERATOR is an Update operator, while READ_OPERATOR is a DATACONNECTOR PRODUCER hooked up to a single file.

Specifying the instances on the loader and reader makes me wonder what will happen in reality:

Teradata Parallel Transporter is a really nice tool which offers new capabilities in comparison to the existing "old" TD load utilities. One of its many advantages is the ability to move data without landing it to disk (see this article). There are also other posts and blogs which give good introductions to TPT.

But as TPT needs a different scripting syntax, it is sometimes not easy to get started with this great tool. Therefore we developed and attached (zip download) a wrapper-like Java tool which should help you get a first impression of the new capabilities without the need to care about TPT syntax.

The main functionality is to copy a table from one system to a second system in a FastExport-to-pipe-to-FastLoad style. The tool checks the table DDL, generates the TPT script, and executes the needed scripts.

TPT constructs a unique identifier for each TPT job submitted for execution. Even though it is not generated until your job executes, you can reference its unique job identifier in your job script via the new script keyword $JOBID. The job identifier consists of the job name and a TPT-generated job sequence number, joined by the hyphen character ('-'):

    <job name> - <job sequence number>

One of the biggest benefits of Parallel Transporter is the ability to scale the load process on a client load server to circumvent performance bottlenecks.  This can sometimes offer huge performance gains as compared to the legacy Teradata Stand-alone load tools such as FastLoad and MultiLoad.

On z/OS, the ODBC operator can be used to extract data from DB2.  The purpose of this article is to demystify the use of the ODBC operator in conjunction with DB2 on IBM’s z/OS operating system.  

Hi,

I am trying to load data into my Teradata staging table.
I defined the ACTIONTIMESTAMP column as TIMESTAMP, but my input file has data like '20101203151515' for the ACTIONTIMESTAMP column, whereas a value like '2010 12 03 15 15 15' loads fine.
To handle that, in the TPT script I used the SUBSTR function just to add spaces while inserting into the staging table.
I got an error with code 3760: String not terminated before end of text.
Do I need to add anything else, or is there another way to achieve the same thing?
If anyone has faced this issue, please share the solution with me.
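Error 3760 usually means the quoting went wrong inside the APPLY string: every single quote inside it has to be doubled. Also, the SUBSTR rebuild may not be needed at all; here is a sketch of a direct cast, assuming the field arrives as a VARCHAR(14) and using a placeholder table name:

APPLY ('INSERT INTO staging_table (ACTIONTIMESTAMP)
        VALUES (:ACTIONTIMESTAMP (TIMESTAMP(0), FORMAT ''YYYYMMDDHHMISS''));')
TO OPERATOR (LOAD_OPERATOR);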

Teradata Parallel Transporter (Teradata PT) supports moving data from a table on a Teradata Database system, or from an ODBC-compliant RDBMS, to a Teradata table on a different Teradata Database system without landing the data to disk.

There are performance benefits, cost savings, and ease of script maintenance associated with using Teradata PT to move data without landing it to disk.
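The core of such a job is an Export operator producer attached to the source system feeding a Load (or Update) operator consumer attached to the target system, connected only by the data stream. A minimal sketch with hypothetical names:

APPLY ('INSERT INTO target_db.target_table (col1, col2) VALUES (:col1, :col2);')
TO OPERATOR (LOAD_OPERATOR)                          /* sessions on the target system */
SELECT col1, col2 FROM OPERATOR (EXPORT_OPERATOR);   /* sessions on the source system */

No intermediate file is written; rows flow from the export sessions to the load sessions in memory.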

I am trying to use TPT to move a table from Oracle 10g to Teradata 12 using the ODBC operator with Oracle ODBC driver 10.2.0.3. When I execute the script with tbuild, I get:

EMAIL_SRC_CD_LK_ODBC: connecting sessions
**** 12:38:34 Fatal error received from ODBC driver:
STATE=S1C00, CODE=0,
MSG='[Oracle][ODBC]Driver not capable.'
=================================================================
=                                                               =
=                    ODBC Driver Information                    =

This article will provide you with accurate information regarding the Teradata Parallel Transporter product. Hopefully it will educate readers and clear up any misunderstandings about the basics of the product.

Hi Guys,

I'm loading flat files into a staging table using the Load operator (no indexes). Once the load is done, I do a SELECT * on that table and insert into an already-populated table with a unique PPI using a DDL operator.
My problem is that if the insert encounters a duplicate record, the entire batch insert is skipped, so no records are inserted. Is there a way for me to just ignore duplicate records, as I can when using the Load operator?

I have changed my script to use an EXPORT and then an IMPORT instead of the INSERT ... SELECT, but that results in slower performance.
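One common workaround, sketched here with placeholder table and key names, is to filter the duplicates out in the INSERT ... SELECT itself, so the statement never attempts a row whose key already exists:

INSERT INTO target_table
SELECT s.*
FROM   staging_table s
WHERE  NOT EXISTS (
    SELECT 1
    FROM   target_table t
    WHERE  t.key_col = s.key_col    /* the unique PPI column(s) */
);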

I have SQL that needs to run some preprocessing, creating a volatile table to be exported to a file. If I use a setup step, TPT uses a different session for each step and the volatile table goes away. Does anyone have a workaround?

Any idea why TPT uses more spool than the same query in other tools such as SQL Assistant?

Can someone help me with the parameters and options available in the tlogview command for reading TPT logs?

Experts,
I am looking for a solution that can synchronize/replicate data between two Teradata production boxes. I know there are a couple of solutions, like Teradata Data Mover, NPARC, and GoldenGate.

I would like to know if Teradata Parallel Transporter can achieve data replication/synchronization.
The objective is to move full/partial data between two Teradata boxes.
Are there any limitations on TPT, i.e. can we move journals, indexes (JOIN/HASH), and STATs using TPT?
And what is the performance of TPT like, i.e. how do you achieve an SLA of an hour?

I had been doing Teradata exports and imports through named pipes until I started using TPT. Now that I have learned how to write a TPT script, I import directly from the Export operator in TPT.

Teradata Parallel Transporter is the best-performing and recommended load/unload utility for the Teradata Database. After watching this presentation, you will learn...

Teradata Parallel Transporter (TPT) is an object-oriented client application that provides scalable, high-speed, parallel data Extraction, Loading and Updating. These capabilities can be extended with customizations or with third-party products. Teradata PT uses and expands on the functionality of the traditional Teradata extract and load utilities (FastLoad, MultiLoad, FastExport, and TPump). This presentation will focus on batch and active loading techniques and real-world examples of how TPT has applied its parallelization and scalability features to address loading challenges. Gain knowledge about the latest features from the latest releases. Learn about the best practices for implementing loading and unloading processes and tips and techniques to build reusable scripts that can be deployed across different system environments. In addition, we will explore how the Logger Services can help capture and monitor the status and performance information of jobs running in your environment.

The TPT DataConnector Operator is the mechanism by which TPT obtains data from or sends data to external native operating system flat (sequential) files or attached access modules.

This article will address the data formats usable with the TPT DataConnector operator. For each format, we will first deal with the concept of a record, which roughly maps to a DBS table row. Second, we will address how these data are resolved into row columns using the TPT data schema. When these two requirements are validated, we will have converted the external data record into a Teradata DBS row. If not, we have a fatal data error*.
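To make those two steps concrete, consider a hypothetical delimited record A|B|C. Step one recognizes the record (here, a line ended by a newline); step two resolves it against the schema:

DEFINE SCHEMA THREE_COLUMNS
(
    col1 VARCHAR(10),
    col2 VARCHAR(10),
    col3 VARCHAR(10)
);

/* With Format = 'Delimited' and TextDelimiter = '|', the record A|B|C
   resolves to col1 = 'A', col2 = 'B', col3 = 'C', a valid row.
   A record A|B, with too few fields, fails schema resolution. */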