While transactional processing through the use of “message queues” is a common approach in the active data warehouse (ADW) today, the file-oriented approach is also beginning to find its way into the ADW because of its inherent simplicity and ease of control. Today, many companies monitor and store thousands, or even hundreds of thousands, of transactions per day across their branches and stores. Transactional data is usually collected and stored as files in directories before being merged into the enterprise-wide data warehouse. In fact, there have been Teradata sites that extract transactional data from message queues, pre-process it, and store it in different directories based on transaction type, in an “active” manner. By “active”, we mean the files are created as the transactions are collected.

The Teradata Parallel Transporter (TPT) External Command Interface is a command-based interface that allows users to issue commands to TPT jobs. The term “external command” carries two important implications. First, users can issue commands to TPT jobs from outside the TPT address space. Second, commands are processed by TPT while it is in the middle of performing ETL operations. In addition, internal TPT components such as operators (which run as separate processes) can communicate with each other within a job through the same interface by using commands. As a result, ETL and system “events” are shared not only between TPT and its users, but also among TPT components within a job at runtime.

This book provides information on how to use Teradata Parallel Transporter (Teradata PT), an object-oriented client application that provides scalable, high-speed, parallel data extraction, loading, and updating. These capabilities can be extended with customizations or third-party products.

This book provides information on how to use the Teradata Parallel Transporter (Teradata PT) Application Programming Interface. It includes instructions on how to set up the interface, add checkpoint and restart support, and report errors, along with code examples.

This book provides reference information about the components of Teradata Parallel Transporter (Teradata PT), an object-oriented client application that provides scalable, high-speed, parallel data extraction, loading, and updating. These capabilities can be extended with customizations or with third-party products.

Hi All,
I'm trying to use TPT to load data directly from an Oracle table to Teradata. BUT... I'm having a little problem with numeric data types.
In Oracle, I have a lot of different NUMBER(10,0) and other columns.
Let's take one as an example: a column called DURATION. The actual data in this column fits in an INTEGER type easily, so in the SCHEMA in the TPT script file and in Teradata, I defined it as INTEGER.
Now, when running the TPT script, I get this error:

Error: Row 1: data from source column too large for target column
Column: DURATION, Data: '3'
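
For reference, the schema declaration described above would look roughly like the following fragment of a TPT script. This is only a sketch, not the poster's actual script: the schema name is hypothetical and the remaining columns are omitted. Note that in TPT the script schema describes the rows as the producer operator presents them on the data stream, which is not necessarily identical to the source column type in Oracle.

DEFINE SCHEMA ora_schema            /* hypothetical schema name */
(
  DURATION INTEGER                  /* declared INTEGER, as described above */
  /* ... remaining columns of the Oracle table ... */
);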

Hello ...

I've downloaded and configured the 40GB version of TDE13.

I've not used Teradata since V2R5, and the database, all the tools, and the documentation were pre-configured for me ...

I have several questions:
- are the PDF files included in the VM, or are they a separate download?
- I'm able to start the instance and login with BTEQ, though logging in takes 45 seconds (repeatably). I've made the change to /etc/hosts for associating my IP address with dbccop1, but am at a loss as to how to speed up the login within the VM.

Provides a detailed demonstration of the features and functions of the Teradata Parallel Transporter (TPT) product. It explores the product's capabilities and demonstrates how the product fits into the Teradata product environment as a load/unload tool, with substantial implications for the throughput and performance expectations of an Active Data Warehouse. In addition to the many special-purpose ‘operators’ provided by TPT, we look at the peripheral features that make it a seamless fit into an enterprise environment. The benefits and flexibility of a single scripting language to support all loading and unloading functions are also demonstrated, along with many practical applications of the product to real customer requirements.

The Teradata Parallel Transporter (Teradata PT) operators play a vital role in high-speed data extraction and loading geared towards the Teradata Database. Besides interfacing with the Teradata Database, some of the Teradata PT operators provide access to external sources such as files, ODBC-compliant DBMS, and message-oriented middleware.

This article describes usage tips on how to load/unload Unicode data with the UTF8 and UTF16 Teradata client session character sets using Teradata Parallel Transporter (TPT).

As of this writing, Teradata Parallel Transporter supports Unicode only on network-attached platforms. 
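
As a minimal illustration, the client session character set for a TPT job can be specified at the top of the job script with a USING CHARACTER SET clause. This is only a sketch under that assumption: the job name is a placeholder, the job body is omitted, and the exact syntax should be checked against the TPT documentation for your release.

USING CHARACTER SET UTF8            /* or UTF16 */
DEFINE JOB unicode_load
DESCRIPTION 'Load Unicode data using the UTF8 client session character set'
(
  /* schema, operator definitions, and APPLY statement go here */
);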

A high-availability system must have the ability to identify and correct errors, exceptions, and failures in a timely and reliable manner to meet challenging service level objectives. The Teradata database and the utilities and components used to both load and access data provide capabilities for implementing reliable error and exception handling. These capabilities, combined with a well-designed high-availability architecture, allow a Teradata Active Enterprise Intelligence (AEI) system to meet the service level objectives required to support mission-critical business processes.

With traditional Teradata utilities such as FastLoad, MultiLoad, and TPump, multiple data files are usually processed in a serial manner. For example, if the data to be loaded into the data warehouse reside in several files, those files must either be concatenated into a single file before data loading or processed sequentially, on a file-by-file basis, during data loading.

In contrast, Teradata Parallel Transporter (TPT) provides a feature called “directory scan” that allows the data files in a directory to be processed in a parallel and scalable manner as part of the loading process. In addition, if multiple directories are stored across multiple disks, a special TPT feature called “UNION ALL” can be used to process these directories of files in parallel, achieving greater throughput through scalability and parallelism across disks.
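
As a rough illustration of these two features, a job script along the following lines reads every matching file in two directories in parallel and loads the rows into a single target table. This is a sketch only: the directory paths, logon values, table names, delimiter, and instance counts are placeholders, and the attribute names should be verified against the TPT reference for your release.

DEFINE JOB load_transaction_files
DESCRIPTION 'Directory scan combined with UNION ALL across two directories'
(
  DEFINE SCHEMA trans_schema        /* delimited input, so all columns are VARCHAR */
  (
    TRANS_ID VARCHAR(20),
    TRANS_TS VARCHAR(26),
    AMOUNT   VARCHAR(18)
  );

  DEFINE OPERATOR dir1_reader       /* scans every *.txt file in /data/dir1 */
  TYPE DATACONNECTOR PRODUCER
  SCHEMA trans_schema
  ATTRIBUTES
  (
    VARCHAR DirectoryPath = '/data/dir1',
    VARCHAR FileName      = '*.txt',       /* wildcard enables directory scan */
    VARCHAR Format        = 'Delimited',
    VARCHAR TextDelimiter = '|',
    VARCHAR OpenMode      = 'Read'
  );

  DEFINE OPERATOR dir2_reader       /* same definition, pointed at a second disk */
  TYPE DATACONNECTOR PRODUCER
  SCHEMA trans_schema
  ATTRIBUTES
  (
    VARCHAR DirectoryPath = '/data/dir2',
    VARCHAR FileName      = '*.txt',
    VARCHAR Format        = 'Delimited',
    VARCHAR TextDelimiter = '|',
    VARCHAR OpenMode      = 'Read'
  );

  DEFINE OPERATOR load_op           /* loads the combined stream into one table */
  TYPE LOAD
  SCHEMA *
  ATTRIBUTES
  (
    VARCHAR TdpId        = 'tdsystem',     /* placeholder logon values */
    VARCHAR UserName     = 'td_user',
    VARCHAR UserPassword = 'td_pass',
    VARCHAR TargetTable  = 'daily_trans',
    VARCHAR LogTable     = 'daily_trans_log'
  );

  APPLY ('INSERT INTO daily_trans (:TRANS_ID, :TRANS_TS, :AMOUNT);')
  TO OPERATOR (load_op[2])
  SELECT * FROM OPERATOR (dir1_reader[2])
  UNION ALL
  SELECT * FROM OPERATOR (dir2_reader[2]);
);

Each producer can also be run with multiple instances (the [2] above), so the files within a single directory are split across instances as well as across directories.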

I'm running into this error while writing to a file in TPT:

STR_EMI_DET_SQL: sending SELECT request
STR_EMI_DET_SQL: retrieving data
PXTBS_PutRow: Invalid input row length, status = Length Error
STR_EMI_DET_SQL: Error 29 in putting a record onto the data stream
Operator(libselectop.so) instance(1): EXECUTE method failed with status = Length Error
STR_EMI_DET_SQL: disconnecting sessions
STR_EMI_DET_FILE_APPEND: Total files processed: 0.
Job step S2 terminated (status 8)
Job dcole terminated (status 8)

Teradata Parallel Transporter (TPT) is a flexible, high-performance data warehouse loading tool, specifically optimized for the Teradata Database, that enables data extraction, transformation, and loading. Its infrastructure provides a parallel execution environment for product components called “operators”, which integrate with the infrastructure in a "plug-in" fashion and are thus interoperable.

There is now a new command line interface to Teradata Parallel Transporter!

TPT users have been asking for this for a while now, and we have delivered. Here is a brief summary of what it is and how to use it.