
Hi,
Can anyone explain whether the TPT Stream operator (TPump) can be used to load data from a file to a table and from a table to a file? If it is possible, could you kindly provide a script for the same.
In my view, the TPT Stream operator is best suited for mini-batch loads. Could you confirm this as well, please?
 
Thank you,
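For reference, here is a rough, untested sketch of the file-to-table direction with the Stream operator; every name, path, credential and attribute value below is a placeholder. The Stream operator itself only writes to tables, so the table-to-file direction would normally pair the Export or Selector operator with a DataConnector consumer instead.

    DEFINE JOB stream_file_to_table
    DESCRIPTION 'Sketch: load a delimited file with the Stream operator'
    (
        DEFINE SCHEMA cust_schema
        (
            cust_id    VARCHAR(10),
            cust_name  VARCHAR(30)
        );

        DEFINE OPERATOR file_reader
        TYPE DATACONNECTOR PRODUCER
        SCHEMA cust_schema
        ATTRIBUTES
        (
            VARCHAR DirectoryPath = '/data/in',    /* placeholder path */
            VARCHAR FileName      = 'cust.txt',    /* placeholder file */
            VARCHAR Format        = 'Delimited',
            VARCHAR TextDelimiter = '|',
            VARCHAR OpenMode      = 'Read'
        );

        DEFINE OPERATOR stream_writer
        TYPE STREAM
        SCHEMA *
        ATTRIBUTES
        (
            VARCHAR TdpId        = 'mytdp',        /* placeholder system/logon */
            VARCHAR UserName     = 'myuser',
            VARCHAR UserPassword = 'mypassword',
            VARCHAR LogTable     = 'mydb.cust_lg',
            VARCHAR ErrorTable   = 'mydb.cust_et',
            INTEGER Pack         = 40
        );

        APPLY ('INSERT INTO mydb.cust (cust_id, cust_name) VALUES (:cust_id, :cust_name);')
        TO OPERATOR (stream_writer[1])
        SELECT * FROM OPERATOR (file_reader[1]);
    );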

Hi All,
I am new to Teradata. I am facing an issue while trying to load to the same TD target table from multiple staging tables simultaneously. I searched the net but got confusing inputs. Please help me clarify what my options are regarding the following. We are using TD 15.

Need to load from Table 1 to Table 2. Which utility works well here, and why?
Please explain the limitations and advantages of each utility.
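Not a verdict on which utility is best, but for orientation: when both tables live on the same Teradata system, the simplest option is usually plain SQL run from BTEQ, since the standalone load utilities (FastLoad, MultiLoad, TPump) read from files or access modules rather than from tables. A minimal sketch with made-up names:

    .LOGON mytdp/myuser,mypassword;

    /* table-to-table copy done entirely inside the database */
    INSERT INTO mydb.table2 (col1, col2)
    SELECT col1, col2
    FROM   mydb.table1;

    .IF ERRORCODE <> 0 THEN .QUIT 8;

    .LOGOFF;
    .QUIT 0;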

Hello,
In my mload script I have the following:
.IMPORT INFILE xxxx
        FROM 4
        FORMAT  VARTEXT '^\' NOSTOP
        LAYOUT &load_table.
        APPLY Inserts;
MultiLoad skips the errors and inserts the records that satisfy the conditions.
 

How does TPump handle the input file?

If the source system keeps appending data to the same file and TPump keeps loading it into Teradata, at some point the file becomes huge and needs to be deleted.
Do we then need to handle that process manually?
 
Thanks.
 

Hi guys,
just want to ask the purpose of SERIALIZE ON in TPump. How is it related to the type of blocking or locking if it is not specified in a TPump job? Are there any circumstances that would affect the blocking or locking?
 
Thanks! :)
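For context: SERIALIZE is an option on .BEGIN LOAD and works together with the KEY option on the .FIELD commands of the layout, which tells TPump which column(s) rows should be grouped by so that rows with the same key go through the same session in order. A rough fragment (table, field names and option values are placeholders):

    .BEGIN LOAD
            ERRORTABLE mydb.tgt_et
            SESSIONS 4
            PACK 40
            SERIALIZE ON;               /* rows sharing the same KEY value are routed
                                           to the same session, in input order       */

    .LAYOUT inrec;
    .FIELD cust_id  * VARCHAR(10) KEY;  /* KEY marks the serialization column(s) */
    .FIELD cust_nm  * VARCHAR(30);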
 

Hi All
We have a requirement where we need to load a single table in parallel from different source tables.
Say TableX needs to be loaded from TableA, TableB and TableC. They have different source data (rows).

The complete error message from the TPump log was:
UTY8001 RDBMS failure in Macro Build: 2673, The source parcel length does not match data that was defined.
My data source is (first several records):
731727336!1!2100034078!1!2.4!1.3!1!2005-07-01 09:12:13.000000!1!2005-07-01 09:12:24.000000!2005-07-01!20!11:46:54.170000
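Not a diagnosis of this particular job, but with '!'-delimited VARTEXT input a 2673 "source parcel length" failure is commonly a layout that does not line up with the record: in VARTEXT format every field in the .LAYOUT has to be declared as VARCHAR, and the number of fields has to match the number of delimited values. Against the sample record above that would be 13 fields; a rough sketch (all field names are invented):

    .LAYOUT inrec;
    .FIELD f01_id        * VARCHAR(20);   /* 731727336                  */
    .FIELD f02_flag1     * VARCHAR(5);    /* 1                          */
    .FIELD f03_ref_id    * VARCHAR(20);   /* 2100034078                 */
    .FIELD f04_flag2     * VARCHAR(5);    /* 1                          */
    .FIELD f05_amt1      * VARCHAR(10);   /* 2.4                        */
    .FIELD f06_amt2      * VARCHAR(10);   /* 1.3                        */
    .FIELD f07_flag3     * VARCHAR(5);    /* 1                          */
    .FIELD f08_start_ts  * VARCHAR(26);   /* 2005-07-01 09:12:13.000000 */
    .FIELD f09_flag4     * VARCHAR(5);    /* 1                          */
    .FIELD f10_end_ts    * VARCHAR(26);   /* 2005-07-01 09:12:24.000000 */
    .FIELD f11_dt        * VARCHAR(10);   /* 2005-07-01                 */
    .FIELD f12_code      * VARCHAR(5);    /* 20                         */
    .FIELD f13_tm        * VARCHAR(15);   /* 11:46:54.170000            */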

Hi Experts,
There is a TPump job which is taking 10 hours to process 7000 records, which earlier took 1 hour. Not sure why it is taking so long now. The script contains the normal insert-select statement.
Just want to know whether there are any optimal settings for the various TPump parameters, like
PACK
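For reference, the parameters people usually look at first are options on .BEGIN LOAD; a hedged sketch (the values below are placeholders, not recommendations):

    .BEGIN LOAD
            ERRORTABLE mydb.tgt_et
            SESSIONS 8        /* number of TPump sessions logged on          */
            PACK 100          /* data records packed into one request        */
            RATE 2400         /* cap on statements sent per minute           */
            CHECKPOINT 15     /* checkpoint interval, in minutes             */
            ROBUST ON         /* per-request restart logging (extra log I/O) */
            SERIALIZE OFF;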

How do I change the session mode inside a TPump script?

Hi,
I am calling a TPump script through a shell script which loads data from a flat file to a Teradata table.
While trying to load the data from the flat file to the table, the script exits with the below error.
Kindly help me understand why and when this error occurs, and also how to get past such issues. Thank you.

Hello,

How does TPump behave if we have 2 rows with the same UPI key in a flat file?

Is there any method by which it chooses the order of the data while executing TPump?

I have a flat file with data as:

ID | Name
1  | A
1  | B

Hi,

we have TD 12.0 and use a generic load ID that does TPump, MultiLoad, FastLoad, FastExport, .....

We have logging OFF for this ID by default because logging for TPump is a big overhead. As a result we can't log for non-TPump jobs either.

Teradata has completed a major compiler conversion effort that should be completely transparent to customers, yet they are the main beneficiaries.  This article:

  • Provides some historical background and context,
  • Discusses the reasons we switched compilers,
  • Identifies certain behavioral changes that were unavoidable,
  • And, finally, answers a few technical questions related to the overall process.

The venerable IBM mainframe was the original client platform developed for the Teradata RDBMS via channel-attached connections way back in the 1980s. Although we now support a wide range of client platforms using network-attached connections, a significant segment of our customer base continues to use mainframe clients.

Hi. I want to export data from one place and import it into another.

Here is the source data declaration:

TPump has been enhanced to dynamically determine the PACK factor and fill up the data buffer when there is variable-length data. This feature is available in Teradata TPump 13.00.00.009, 13.10.00.007, 14.00.00.000 and higher releases.

Teradata FastLoad has a feature named Tenacity that allows the user to specify the number of hours that FastLoad continues trying to log on when the maximum number of load operations is already running on the Teradata Database.

By default the Tenacity feature is not turned on. The feature is turned on by the script command:
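For illustration, a minimal FastLoad fragment; the TENACITY (and optional SLEEP) command goes before LOGON, and the values and logon string below are only placeholders:

    SESSIONS 4;
    TENACITY 4;                      /* keep retrying the logon for up to 4 hours */
    SLEEP 10;                        /* wait 10 minutes between logon attempts    */
    LOGON mytdp/myuser,mypassword;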

TPump macrocharset support

TPump now forces CHARSET internally when building its macros! This feature is new starting with the TPump 13.10.00.03 release.

Hi,

What do the following statements in the DML LABEL section of a TPump script do exactly:
IGNORE MISSING ROWS
IGNORE EXTRA ROWS;

The input is a flat file whose records are inserted into a database table.
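Roughly speaking (and worth confirming against the TPump reference), these are error-treatment options on the .DML LABEL command: MISSING covers an UPDATE or DELETE that finds no matching row, EXTRA covers one that touches more than one row, and IGNORE means the condition is not recorded in the error table. A small sketch with invented names:

    .DML LABEL upd_cust
            IGNORE MISSING ROWS      /* no-match UPDATE/DELETE is not logged as an error  */
            IGNORE EXTRA ROWS;       /* multi-row UPDATE/DELETE is not logged as an error */
    UPDATE mydb.cust
    SET    cust_nm = :cust_nm
    WHERE  cust_id = :cust_id;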

Hi all,

I wanted to ask a question regarding TPump loading into a SET table.
If the load fails, can it pick up again and continue from where it failed? Or does there have to be some form of cleanup and a restart of the job?

Also, is there any difference between using TPump to load data into a SET table versus a MULTISET table?

Hi,

Can someone provide me the link to the download location for the Teradata load utilities for Linux? I have searched the download center extensively with no result. I would like to install the BTEQ, FastLoad, MultiLoad and TPump utilities on a Linux box.

Thanks

This presentation describes, in detail, the various load utilities supported by Teradata.

In TPump 13.00.00.02 and higher releases the maximum pack factor has been increased from 600 to 2430.

TPump users use "PACK <statements>" in the "BEGIN LOAD" command to specify the number of data records to be packed into one request, where PACK is a TPump keyword and “statements” actually refers to the number of data records to be packed.

Packing improves network/channel efficiency by reducing the number of sends and receives between the application and the Teradata Database.
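As a small illustration of that syntax (table name and other values are placeholders):

    .BEGIN LOAD
            ERRORTABLE mydb.tgt_et
            SESSIONS 4
            PACK 2430;     /* up to 2430 data records per request in TPump 13.00.00.02 and later */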

This book provides information on how to use Teradata Parallel Transporter (Teradata PT), an object-oriented client application that provides scalable, high-speed, parallel data extraction, loading, and updating. These capabilities can be extended with customizations or third-party products.

This book provides information about Teradata Parallel Data Pump (TPump), a data loading utility that helps you maintain (update, delete, insert, and atomic upsert) the data in your Teradata Database. TPump uses standard Teradata SQL to achieve moderate to high data-loading rates.

This book provides information on how to use the Teradata Parallel Transporter (Teradata PT) Application Programming Interface. There are instructions on how to set up the interface, adding checkpoint and restart, error reporting, and code examples.

This book provides reference information about the components of Teradata Parallel Transporter (Teradata PT), an object-oriented client application that provides scalable, high-speed, parallel data extraction, loading, and updating. These capabilities can be extended with customizations or with third-party products.

Suppose you have a table with several non-unique secondary indexes (NUSI). When TPump loads the table, should you expect that each row’s INSERT will cause a table level lock on each of the secondary index sub-tables? And if so, couldn’t this create a lot of blocking across sessions?

A high-availability system must have the ability to identify and correct errors, exceptions and failures in a timely and reliable manner to meet challenging service level objectives. The Teradata database and the utilities and components (used to both load and access data) provide capabilities to implement reliable error and exception handling functionality. These capabilities combined with a well designed high availability architecture allow a Teradata Active Enterprise Intelligence (AEI) system to meet the service level objectives required to support mission critical business processes.