sheridany 12 posts Joined 04/06
21 Nov 2013
Export to delimited file

I have created a TPT export job from a sample script. If I limit the records to a small number, the job runs fine. When I take out the record-number constraint, the job doesn't run. The timestamp format in the table is mm/dd/yyyybhh.mi.ssT.
Here is the script. 
DEFINE JOB EXPORT_DELIMITED_FILE
DESCRIPTION 'Export rows from a Teradata table to a delimited file'
(
  DEFINE SCHEMA PRODUCT_SOURCE_SCHEMA
  (
    xxxxxx_key_fnl        VARCHAR(100),
    xxxxxx_ident_nr       VARCHAR(10),
    xxxxxx_function_cd    VARCHAR(6),
    xxxxx_xxxx_nr         VARCHAR(7),
    mo_id                 VARCHAR(10),
    event_dt              VARCHAR(10),
    xxxx_min_dttm         VARCHAR(50),
    xxxxx_max_dttm        VARCHAR(50),
    xxxxx                 VARCHAR(20),
    had_overlap           VARCHAR(1),
    relationship_found    VARCHAR(1),
    xxxxx_min_dttm        VARCHAR(50),
    xxxxxx_max_dttm       VARCHAR(50),
    cntct_mthd            VARCHAR(50),
    in_person_in          VARCHAR(1),
    prtcp_xxx_nr          VARCHAR(20)
  );
  DEFINE OPERATOR SQL_SELECTOR
  TYPE SELECTOR
  SCHEMA PRODUCT_SOURCE_SCHEMA
  ATTRIBUTES
  (
    VARCHAR PrivateLogName = 'selector_log',
    VARCHAR TdpId = 'xxx.xxxxxxxxx.com',
    VARCHAR UserName = 'xxxxxx',
    VARCHAR UserPassword = 'xxxxxxx',
    VARCHAR SelectStmt = 'SELECT
      CAST(TRIM(xxxxxx_key_fnl) AS VARCHAR(100)),
      CAST(CAST(xxxxxx_ident_nr AS INTEGER) AS VARCHAR(10)),
      CAST(xxxxxx_function_cd AS VARCHAR(6)),
      CAST(xxxxx_xx_nr AS VARCHAR(7)),
      CAST(mo_id AS VARCHAR(10)),
      CAST(CAST(event_dt AS DATE) AS VARCHAR(10)),
      CAST(CAST(xxxxxx_min_dttm AS TIMESTAMP(6)) AS VARCHAR(50)),
      CAST(CAST(xxxxx_max_dttm AS TIMESTAMP(6)) AS VARCHAR(50)),
      CAST(CAST(xxxx AS BIGINT) AS VARCHAR(20)),
      TRIM(CAST(had_overlap AS VARCHAR(1))),
      TRIM(CAST(relationship_found AS VARCHAR(1))),
      CAST(CAST(xxxxxxx_min_dttm AS TIMESTAMP(6)) AS VARCHAR(50)),
      CAST(CAST(xxxx_max_dttm AS TIMESTAMP(6)) AS VARCHAR(50)),
      CAST(cntct_mthd AS VARCHAR(50)),
      CAST(in_person_in AS VARCHAR(1)),
      CAST(CAST(prtcp_xxxx_nr AS BIGINT) AS VARCHAR(20))
    FROM xxxxxx.xxxxxhist_dataset1x',
    VARCHAR ReportModeOn = 'Yes'
  );
  DEFINE OPERATOR FILE_WRITER
  TYPE DATACONNECTOR CONSUMER
  SCHEMA *
  ATTRIBUTES
  (
    VARCHAR PrivateLogName = 'dataconnector_log',
    VARCHAR DirectoryPath = '/xxxxxx/xxxxxx/xxxxxx/files/history',
    VARCHAR FileName = 'dataset1_201209.csv',
    VARCHAR Format = 'DELIMITED',
    VARCHAR OpenMode = 'Write',
    VARCHAR TextDelimiter = '%',
    VARCHAR TRACELEVEL = 'ALL'
  );
  APPLY TO OPERATOR (FILE_WRITER)
  SELECT * FROM OPERATOR (SQL_SELECTOR);
);
First Log
Teradata Parallel Transporter Version 14.00.00.03
Job log: /opt/teradata/client/14.00/tbuild/logs/mytest-175.out
Job id is mytest-175, running on xxxxxx
Found CheckPoint file: /opt/teradata/client/14.00/tbuild/checkpoint/mytestLVCP
This is a restart job; it restarts at step MAIN_STEP.
Teradata Parallel Transporter SQL Selector Operator Version 14.00.00.03
SQL_SELECTOR: private log specified: selector_log
Teradata Parallel Transporter DataConnector Version 14.00.00.03
FILE_WRITER Instance 1 directing private log report to 'dataconnector_log-1'.
FILE_WRITER Instance 1 restarting.
FILE_WRITER: TPT19007 DataConnector Consumer operator Instances: 1
FILE_WRITER: TPT19003 ECI operator ID: FILE_WRITER-23162
SQL_SELECTOR: connecting sessions
SQL_SELECTOR: restarting the job
FILE_WRITER: TPT19222 Operator instance 1 processing file '/xxxx/xxxx/xxxxx/files/history/dataset1_201209.csv'.
SQL_SELECTOR: sending SELECT request
SQL_SELECTOR: retrieving data
SQL_SELECTOR: Total Rows Exported:  20
SQL_SELECTOR: finished retrieving data
SQL_SELECTOR: disconnecting sessions
FILE_WRITER: TPT19221 Total files processed: 1.
SQL_SELECTOR: Total processor time used = '0.11 Second(s)'
SQL_SELECTOR: Start : Thu Nov 21 01:05:20 2013
SQL_SELECTOR: End   : Thu Nov 21 01:05:26 2013
Job step MAIN_STEP completed successfully
Job mytest completed successfully
 
Second Log
Teradata Parallel Transporter Version 14.00.00.03
Job log: /opt/teradata/client/14.00/tbuild/logs/mytest-177.out
Job id is mytest-177, running on lucas
Found CheckPoint file: /opt/teradata/client/14.00/tbuild/checkpoint/mytestLVCP
This is a restart job; it restarts at step MAIN_STEP.
Teradata Parallel Transporter SQL Selector Operator Version 14.00.00.03
SQL_SELECTOR: private log specified: selector_log
Teradata Parallel Transporter DataConnector Version 14.00.00.03
FILE_WRITER Instance 1 directing private log report to 'dataconnector_log-1'.
FILE_WRITER: TPT19007 DataConnector Consumer operator Instances: 1
FILE_WRITER: TPT19003 ECI operator ID: FILE_WRITER-23381
FILE_WRITER: TPT19222 Operator instance 1 processing file '/xxxxx/xxxxxx/files/history/dataset1_201209.csv'.
SQL_SELECTOR: connecting sessions
SQL_SELECTOR: sending SELECT request
SQL_SELECTOR: retrieving data
FILE_WRITER: TPT19003 Number of characters in column 12 (65537) exceeds maximum allowed (65536)
FILE_WRITER: TPT19221 Total files processed: 0.
TPT_INFRA: TPT02268: Error: Cannot write message to Data Stream, status = DataStream Error
TPT_INFRA: TPT02269: Error: Data Stream status = 406
SQL_SELECTOR: disconnecting sessions
SQL_SELECTOR: Total processor time used = '0.37 Second(s)'
SQL_SELECTOR: Start : Thu Nov 21 01:09:15 2013
SQL_SELECTOR: End   : Thu Nov 21 01:09:19 2013
Job step MAIN_STEP terminated (status 8)
Job mytest terminated (status 8)
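
Since the writer aborts on an oversized character count, it may help to measure in SQL Assistant or BTEQ how wide each casted column actually gets on the full table. A sketch, using the placeholder column and table names from the script above (substitute the real names):

```sql
-- Sketch: report the widest value a few of the casted columns produce.
-- Names are the placeholders from the script above; adjust to the real ones.
SELECT MAX(CHAR_LENGTH(CAST(TRIM(xxxxxx_key_fnl) AS VARCHAR(100))))              AS key_len,
       MAX(CHAR_LENGTH(CAST(CAST(xxxxxx_min_dttm AS TIMESTAMP(6)) AS VARCHAR(50)))) AS min_dttm_len,
       MAX(CHAR_LENGTH(CAST(CAST(xxxxx_max_dttm  AS TIMESTAMP(6)) AS VARCHAR(50)))) AS max_dttm_len
FROM   xxxxxx.xxxxxhist_dataset1x;
```

Any result wider than the length declared in the TPT schema would point at the offending column; if every maximum is well under the declared widths, the problem is more likely in how the job is restarting than in the data itself.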
 

ulrich 816 posts Joined 09/09
21 Nov 2013

Your second job is a restart. What happened at the first start? Did you change the script?
Can you run the SQL itself in SQL Assistant, BTEQ, or TD Studio?


sheridany 12 posts Joined 04/06
21 Nov 2013

I only took out the TOP 20 after the SELECT statement and then ran it again, copying from the command line. The SQL code runs fine in SQL Assistant. I keep running into the "number of characters exceeds the maximum allowed" error; after I tweak a column in the SQL code, it shows up somewhere else. It's maddening. Here is the detailed log, for what it's worth, starting after clearing all the checkpoints.
 
TPT_INFRA: TPT04101: Warning: TMSM failed to initialize
Teradata Parallel Transporter Executor Version 14.00.00.03
Teradata Parallel Transporter Coordinator Version 14.00.00.03
Teradata Parallel Transporter Executor Version 14.00.00.03
Teradata Parallel Transporter SQL Selector Operator Version 14.00.00.03
SQL_SELECTOR: private log specified: selector_log
Teradata Parallel Transporter DataConnector Version 14.00.00.03
FILE_WRITER Instance 1 directing private log report to 'dataconnector_log-1'.
FILE_WRITER: TPT19007 DataConnector Consumer operator Instances: 1
FILE_WRITER: TPT19003 ECI operator ID: FILE_WRITER-27247
FILE_WRITER: TPT19222 Operator instance 1 processing file '/xxxxxxxxx/files/history/dataset1_201209.csv'.
SQL_SELECTOR: connecting sessions
Job is running in Buffer Mode
Task(SELECT_2[0001]): checkpoint completed, status = Success
Task(INSERT_1[0001]): checkpoint completed, status = Success
SQL_SELECTOR: sending SELECT request
SQL_SELECTOR: retrieving data
Data Block size: 1572896  Buffers/Block: 3  Data Buffer size: 524296
FILE_WRITER: TPT19003 Number of characters in column 3 (65537) exceeds maximum allowed (65536)
TPT_INFRA: TPT02255: Message Buffers Sent/Received = 0, Total Rows Received = 0, Total Rows Sent = 0
FILE_WRITER: TPT19221 Total files processed: 0.
TPT_INFRA: TPT02255: Message Buffers Sent/Received = 2, Total Rows Received = 0, Total Rows Sent = 0
SQL_SELECTOR: disconnecting sessions
SQL_SELECTOR: Total processor time used = '0.38 Second(s)'
SQL_SELECTOR: Start : Thu Nov 21 02:22:07 2013
SQL_SELECTOR: End   : Thu Nov 21 02:22:11 2013
Job step MAIN_STEP terminated (status 8)
Job mytest terminated (status 8)
Total available memory:          10002340
Largest allocable area:          10002340
Memory use high water mark:       4726308
Free map size:                       1024
Free map use high water mark:          17
Free list use high water mark:          0
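
Because the TOP 20 run succeeds while the full extract fails, one way to narrow things down is to rank rows by the width of a suspect casted column and look for outliers. A sketch with the same placeholder names as the original script (pick whichever column the latest error message points at):

```sql
-- Sketch: surface the rows whose casted value is widest, to see whether
-- any specific data trips the writer. Placeholder names as in the script.
SELECT TOP 10
       xxxxxx_key_fnl,
       CHAR_LENGTH(CAST(CAST(xxxxxxx_min_dttm AS TIMESTAMP(6)) AS VARCHAR(50))) AS dttm_len
FROM   xxxxxx.xxxxxhist_dataset1x
ORDER BY dttm_len DESC;
```

If the widest values are all normal, that supports the restart/checkpoint theory raised above rather than bad data.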
 

ulrich 816 posts Joined 09/09
21 Nov 2013

You can also run with the tbuild -s 1 option to start from the first job step.

