
Friends, I have the following problem: I have a final table in which some values are rounded incorrectly. There is no transformation of these values. The tables have the same data type, DECIMAL(9,2), and FastExport converts to the same data type. One example I found was a value of 46.23.

While exporting data from a Teradata table with the MLSCRIPT option to also generate a MultiLoad script, I am getting the strings p1<96> or p1 below, instead of the DATE/TIME/TIMESTAMP fields, in the layout section.
p1<96>p1<96>.FIELD COLUMN2 * CHAR(1);

While executing FastExport, we have the option to create a MultiLoad script by providing MLSCRIPT. Do we have a similar option for FastLoad? Can someone help me find out if there is one, or let me know why we don't have one for FastLoad?

Hello All,
Teradata utilities are specialised tools for loading or exporting huge volumes of data compared to conventional SQL tools like SQL Assistant and BTEQ. I wish to understand the fundamental reason for the performance difference between them. I have provided my understanding below; I would appreciate any additions or corrections.

Hello All,
In the script below I exported the data using FastExport and tried to import it using FastLoad. While doing so, I am getting an error in FastLoad. Please help me understand where I am going wrong.
FastExport Script
.LOGTABLE DB.errors;
.LOGON jugal/jbhatt,bhatt;
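The excerpt stops after the logon. For comparison, a minimal complete export would add the BEGIN/END EXPORT bracketing and a SELECT; this is only a sketch with placeholder object and file names, and FORMAT FASTLOAD is used because that record format is directly loadable by FastLoad:

```
.LOGTABLE DB.errors;
.LOGON jugal/jbhatt,bhatt;
.BEGIN EXPORT SESSIONS 4;
.EXPORT OUTFILE export.dat MODE RECORD FORMAT FASTLOAD;
SELECT * FROM DB.source_table;   /* placeholder source table */
.END EXPORT;
.LOGOFF;
```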

Greetings experts,
I am facing the following issue in the TD Demo version 13.0 on Windows 7.
Scenario: fast exporting from a table samples.itemppi_bkp in MODE RECORD FORMAT TEXT (vartext separated by ",") and then using this file as the source for mloading into the target table samples.itemppi_wodate.

I am trying to export data from a SQL query to a file using the TPT Export operator.
My SQL query runs fine in SQL Assistant, but when it is run through TPT it reports non-existent syntax errors.
I have used two single quotes wherever I have one single quote in my query.
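For illustration, doubling embedded quotes inside the SelectStmt attribute of an Export operator normally looks like the hypothetical fragment below (the operator name, schema, credentials, and query are all invented for the sketch):

```
DEFINE OPERATOR EXPORT_OP
TYPE EXPORT
SCHEMA ITEM_SCHEMA
ATTRIBUTES
(
  VARCHAR TdpId        = 'mytdpid',
  VARCHAR UserName     = 'myuser',
  VARCHAR UserPassword = 'mypassword',
  /* every single quote inside the SQL text is doubled */
  VARCHAR SelectStmt   = 'SELECT item_id, item_desc
                          FROM mydb.items
                          WHERE item_desc LIKE ''%red%'';'
);
```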
My requirements:

Hello All,
I am trying to run the below script in fxp:
.LOGON jugalDB/JBhatt,jugal
.EXPORT OUTFILE samples111
Sel top 10 * from db.table1
In interactive mode I invoked it as below:
fxp < fxpExample.txt
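For reference, the script above is missing several control statements FastExport requires (a logtable, the BEGIN/END EXPORT bracketing, and terminating semicolons); a minimal complete version, with a placeholder logtable name, would be:

```
.LOGTABLE samples.fxp_log;
.LOGON jugalDB/JBhatt,jugal;
.BEGIN EXPORT;
.EXPORT OUTFILE samples111;
SELECT TOP 10 * FROM db.table1;
.END EXPORT;
.LOGOFF;
```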

We've had a long-outstanding issue with the data transfer throughput of our FastExports, which essentially run in the low single-digit MB/sec range (all of them, not just one or another). It takes 10+ hours to extract a few hundred GB of data. The issue is not with the client or the network; it's in the DBMS.

The following shows how to select rows from a database table using JDBC FastExport, which only works with JDBC PreparedStatement.

I have a requirement where I need to send multiple files based on the data. Our environment is Unix and Teradata.
E.g., below is a sample table.
C_Name   ID
xxxxx     1
yyyy      1
aaaaa     2
bbbbb     2
ccccc     1
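Since each FastExport .EXPORT writes a single OUTFILE, one hedged approach is to run the script once per distinct key value, varying the WHERE clause and the output file name; all names in this sketch are placeholders:

```
.LOGTABLE mydb.fexp_split_log;
.LOGON tdpid/user,password;
.BEGIN EXPORT;
.EXPORT OUTFILE id_1.out MODE RECORD FORMAT TEXT;
SELECT C_Name FROM mydb.sample_table WHERE ID = 1;   /* one file per ID value */
.END EXPORT;
.LOGOFF;
```

Repeat with ID = 2 and id_2.out, or generate one script per value from a SELECT DISTINCT ID.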

I have a requirement to export data from one table using FastExport and, in parallel, load it into another table using MLOAD.
I can write separate FastExport and MLOAD jobs, but the table is huge and I don't want to waste Unix space, as FastExport creates a big file.
Can we load it like a queue?
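One way to avoid landing the intermediate file, assuming the Named Pipes Access Module is installed, is to have FastExport write into a named pipe that a concurrently running MultiLoad job reads. The fragments below are a sketch with placeholder names; the MultiLoad side also needs its usual .LAYOUT and APPLY clauses, omitted here:

```
/* FastExport side: write into the pipe */
.EXPORT OUTFILE /tmp/xfer_pipe
        AXSMOD np_axsmod.so
        MODE RECORD FORMAT FASTLOAD;

/* MultiLoad side: read from the same pipe (LAYOUT/APPLY omitted) */
.IMPORT INFILE /tmp/xfer_pipe
        AXSMOD np_axsmod.so
        FORMAT FASTLOAD;
```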

Hi, guys,
I am using FastExport to export my DB result, but some lines in the logs show:
**** 18:42:33 UTY8722 234664 total records written to output file.
which means the records were written to the output file, but other lines show:

Is there any way to measure the total AMPCPUTime consumed by any MLOAD/FLOAD/FEXP job after completion? The values from DBQL (the sum of AMPCPUTime for a particular LSN) do not seem right.
I also tried dbc.acctg. There, too, the values seem low.
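One hedged starting point, assuming DBQL query logging is enabled for the utility's sessions, is to sum AMPCPUTime across every session that shares the job's LSN; the LSN value and names below are placeholders, and this can still undercount because some of a utility's AMP work is not attributed to the logged sessions:

```sql
/* Sketch: total AMP CPU for all sessions sharing one load job's LSN */
SELECT q.LSN,
       COUNT(DISTINCT q.SessionID) AS NumSessions,
       SUM(q.AMPCPUTime)           AS TotalAMPCPUTime
FROM   dbc.dbqlogtbl q
WHERE  q.LSN = 12345   /* replace with the job's logon sequence number */
GROUP  BY q.LSN;
```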

Hello. I need to export large numbers of tables to CSV files for customer deliverables. I am working on Teradata 13.10. For VARCHAR fields, I need to include a character-string delimiter (double quote) in the CSV. I have created a tbuild/TPT export script as described in Example 10 of the TPT User Guide, but I can't find a way to get the string delimiter included.
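For reference, newer TPT releases expose quoting attributes on the DataConnector consumer operator. The sketch below is hypothetical (operator, schema, and file names invented), and the exact attribute names and supported values should be verified against your TTU version:

```
DEFINE OPERATOR FILE_WRITER
TYPE DATACONNECTOR CONSUMER
SCHEMA EXPORT_SCHEMA
ATTRIBUTES
(
  VARCHAR FileName       = 'output.csv',
  VARCHAR Format         = 'Delimited',
  VARCHAR TextDelimiter  = ',',
  VARCHAR QuotedData     = 'Yes',   /* wrap fields in quote marks */
  VARCHAR OpenQuoteMark  = '"',
  VARCHAR CloseQuoteMark = '"'
);
```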

I gave a presentation on AMP worker tasks at the Teradata User Group conference last week in Washington DC.   A question came from someone in the audience concerning FastExport jobs in response mode, and whether or not they were holding AMP worker tasks.  This post addresses that question.

I'm trying to load a table dump exported with FastExport, but I get the following error:

The length of: PREFIX in row: 1 was greater than defined.
              Defined: 3, Received: 3072

Here are my fastexport and fastload scripts:


Hello everyone!

I need to export about 5-6 files from a channel-attached instance, and these files will be consumed by a Unix application. Can I avoid FTP or something similar by "replacing" my DDNAME with a Unix path, if possible?

Thanks in advance,

Hi everyone,
I would like to know if it's possible, using some system table (such as dbc.dbqlogtbl or similar), to get the number of sessions opened by a MultiLoad/FastLoad/FastExport/TPT job.
In dbqlogtbl it seems that Teradata records only the parent session, which creates the child sessions...

Hello all:

Does anyone know why FastExport would not be able to find the named pipe below?  Any thoughts?  Thank you for your help!

The Named Pipes Access Module is installed (version

**** 09:56:52 UTY4019 Access module error '4' received during 'File open'
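For comparison, a FastExport .EXPORT statement that writes through the Named Pipes Access Module typically looks like the hypothetical sketch below (the pipe path and module file name vary by platform); the module's shared library must be resolvable by the loader, and a reader process must open the pipe:

```
.EXPORT OUTFILE /tmp/export_pipe
        AXSMOD np_axsmod.so
        MODE RECORD FORMAT FASTLOAD;
```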



I am trying to export data from the database using FastExport. This is the code:

.begin export sessions 12;
.export outfile
Teradata has completed a major compiler conversion effort that should be completely transparent to customers, yet they are the main beneficiaries.  This article:

  • Provides some historical background and context,
  • Discusses the reasons we switched compilers,
  • Identifies certain behavioral changes that were unavoidable,
  • And, finally, answers a few technical questions related to the overall process.

The venerable IBM mainframe was the original client platform developed for the Teradata RDBMS via channel-attached connections way back in the 1980s. Although we now support a wide range of client platforms using network-attached connections, a significant segment of our customer base continues to use mainframe clients.

Teradata FastLoad has a feature named Tenacity that allows users to specify the number of hours that FastLoad continues trying to log on when the maximum number of load operations is already running on the Teradata Database.

By default, the Tenacity feature is not turned on. The feature is turned on by the script command:
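A hedged sketch of how the command is usually placed near the top of a FastLoad script (the tdpid and credentials are placeholders, and FastLoad commands take no leading dot):

```
SESSIONS 4;
TENACITY 4;   /* keep retrying the logon for up to 4 hours */
SLEEP 6;      /* wait 6 minutes between logon attempts */
LOGON tdpid/user,password;
```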

Help with FastExport


I am using FastExport in an AIX environment. I have written the query as follows; please provide assistance with it. The database has about 1 million records.



We are installing Teradata Tools and Utilities 12.0 on the AIX 6.1 platform, and we need to validate the installation for FEXP.

We have written a FEXP script for a simple SELECT query as follows -
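As one possible smoke test, a minimal FastExport script along these lines (the logtable, tdpid, credentials, and file name are all placeholders) exercises logon, export, and logoff in one pass:

```
.LOGTABLE mydb.fexp_validate_log;
.LOGON tdpid/user,password;
.BEGIN EXPORT SESSIONS 2;
.EXPORT OUTFILE validate.out MODE RECORD FORMAT TEXT;
SELECT 'FEXP installation OK';   /* any small character result will do */
.END EXPORT;
.LOGOFF;
```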

The FastExport utility is usually seen as a batch-oriented job to use when you need to return large numbers of rows from the database to a user.   It is the ideal tool for efficiently and quickly returning large answer sets, sorted or otherwise, back to the client from the database.   That’s why it was named the way it was:  “FastExport”.  

I am trying to select a "MicroFocus COBOL format" dataset from Teradata 13.0, so that COBOL can read it.
I use FastExport to output the dataset.

select * from test;

I want to select data like:

old_xxx   new_xxx
--------  --------
-123.456  12345v   (for a negative number the last digit converts to a letter, as in Oracle's "select chr(ascii('6')+64) from dual")
123.450   123450
-123.450  12345p
123.456   123456


Can someone please send a sample script to export the column names as the header row using FastExport / TPT Export?
I am able to do this using BTEQ Export, but I am facing difficulty with FastExport/TPT Export.
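One common workaround, sketched here with hypothetical table and column names, is to UNION a literal header row onto the data, casting every column to VARCHAR so the types match, and ordering on a sort key so the header comes out first:

```sql
/* Sketch: header row first, then data, all columns cast to VARCHAR */
SELECT c1, c2
FROM (
    SELECT 'C_Name' (VARCHAR(30)) AS c1,
           'ID'     (VARCHAR(10)) AS c2,
           0 AS seq
    UNION ALL
    SELECT C_Name,
           CAST(ID AS VARCHAR(10)),
           1 AS seq
    FROM mydb.sample_table
) t
ORDER BY seq;
```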


This presentation describes, in detail, the various load utilities supported by Teradata.

I have been asked by the server admins to provide a space requirement to install the following Teradata 12 utilities on a SUN Solaris SPARC server:

(1) CLI and related security libraries
(2) BTEQ
(3) FastLoad
(4) MultiLoad
(5) FastExport

This book provides information on how to use Teradata Parallel Transporter (Teradata PT), an object-oriented client application that provides scalable, high-speed, parallel data extraction, loading, and updating. These capabilities can be extended with customizations or third-party products.

This book provides information on how to use the Teradata Parallel Transporter (Teradata PT) Application Programming Interface. There are instructions on how to set up the interface, adding checkpoint and restart, error reporting, and code examples.

This book provides reference information about the components of Teradata Parallel Transporter (Teradata PT), an object-oriented client application that provides scalable, high-speed, parallel data extraction, loading, and updating. These capabilities can be extended with customizations or with third-party products.

How can I specify in the FastExport script that the first row in the exported file should be the names of the exported columns?


I left this post on the main Teradata website forum, but I thought this might be more appropriate and perhaps get a response. Thanks for taking a look. I just found this little nook inside of Teradata and I'm glad it exists.

I've created a Windows named pipe in a VB.NET application using the System.IO.Pipes namespace. I'm creating a pipe; my process appears to connect to the pipe, block, and wait to read, and now I would just like to feed the pipe from Teradata.
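Assuming the Named Pipes Access Module is installed on the Windows client, the FastExport side of the hookup might look like the hypothetical fragment below; the pipe name must match the one the VB.NET server created:

```
.EXPORT OUTFILE \\.\pipe\mypipe
        AXSMOD np_axsmod.dll
        MODE RECORD FORMAT TEXT;
```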