mcorsi 13 posts Joined 07/16
07 Jul 2016
tpt conversion from varchar to timestamp(6)

I am having difficulty using tpt and bulk loading dates into a table. My define schema looks like:

 DEFINE SCHEMA FILE_SOURCE_SCHEMA
                (
                    Main_Id VARCHAR(25),
                    Parent_Id VARCHAR(25),
                    Channel_Cd VARCHAR(25),
                    Split_Pct VARCHAR(25),
                    Update_Ts VARCHAR(25),
                    Create_Ts VARCHAR(25)
                    );

after setting up the ddl operator, file reader and load operator and table my insert statement looks like:

                STEP LOAD_DATA
                (
                        APPLY (
                                'INSERT INTO XBI_RADBI_BIZ_APP.AMR_SCPTv3_f_Channels (
                                :Main_Id,
                                :Parent_Id,
                                :Channel_Cd,
                                :Split_Pct,
                                :Update_Ts,
                                :Create_Ts
                                );'
                        )
                        TO OPERATOR (LOAD_OPERATOR[1])
                        SELECT * FROM OPERATOR (FILE_READER[1]);
                );

The table itself looks like:

 CREATE MULTISET TABLE XBI_RADBI_BIZ_APP.AMR_SCPTv3_f_Channels ,NO FALLBACK ,
     NO BEFORE JOURNAL,
     NO AFTER JOURNAL,
     CHECKSUM = DEFAULT,
     DEFAULT MERGEBLOCKRATIO
     (
      Main_Id NUMERIC NOT NULL,
      Parent_Id NUMERIC NOT NULL,
      Channel_Cd VARCHAR(25) CHARACTER SET LATIN NOT CASESPECIFIC NOT NULL,
      Split_Pct DECIMAL(15,3) NOT NULL,
      Update_Ts VARCHAR(25) NOT NULL,
      Create_Ts VARCHAR(25) NOT NULL)
UNIQUE PRIMARY INDEX ( Parent_Id ,Channel_Cd );

 
The problem is the Update_Ts and Create_Ts fields should be TIMESTAMP(6) fields. However, if I try to declare them as such, every record fails when it tries to load and I end up with an empty table. The numeric fields convert successfully, as does the decimal, but the time fields will only load as strings.
I am very new to Teradata and this bulk loading process. Any help will be greatly appreciated.

feinholz 1234 posts Joined 05/08
07 Jul 2016

Please provide the whole script.
In what format is your data?
 

--SteveF

mcorsi 13 posts Joined 07/16
07 Jul 2016

Format of the data? It is in the format of the table I am trying to build. I am pulling data from one database, writing it to a flat file, sending it to Teradata, and then trying to load it in.

This is the whole script for tpt:
DEFINE JOB AMR_SCPTv3_f_Channels DESCRIPTION 'Loads f_channels_pub data into a XBI_RADBI_BIZ_APP AMR_SCPTv3_f_Channels Teradata table' ( DEFINE SCHEMA FILE_SOURCE_SCHEMA ( Main_Id VARCHAR(25), Parent_Id VARCHAR(25), Channel_Cd VARCHAR(25), Split_Pct VARCHAR(25), Update_Ts VARCHAR(25), Create_Ts VARCHAR(25) ); DEFINE OPERATOR DDL_OPERATOR TYPE DDL ATTRIBUTES ( VARCHAR PrivateLogName = 'AMR_SCPTv3_f_Channels_log', VARCHAR TdpId = @jobvar_tdpid_tgt, VARCHAR UserName = @jobvar_username_tgt, VARCHAR UserPassword = @jobvar_password_tgt, VARCHAR AccountID, VARCHAR DataEncryption, VARCHAR LogonMech, VARCHAR LogonMechData, VARCHAR QueryBandSessInfo, VARCHAR Errorlist = '3807' ); DEFINE OPERATOR FILE_READER TYPE DATACONNECTOR PRODUCER SCHEMA FILE_SOURCE_SCHEMA ATTRIBUTES ( VARCHAR PrivateLogName = 'AMR_SCPTv3_f_Channels_File_Log', VARCHAR DirectoryPath = '/home/devo/e0111518/scpt3/data/', VARCHAR FileName = 'AMR_SCPTv3_f_Channels.csv', VARCHAR FORMAT = 'Delimited', VARCHAR OpenMode = 'Read', VARCHAR TextDelimiter = 'TAB', INTEGER SkipRows = 0 ); DEFINE OPERATOR LOAD_OPERATOR TYPE LOAD SCHEMA * ATTRIBUTES ( VARCHAR PrivateLogName = 'AMR_SCPTv3_f_Channels_Load_Log', INTEGER MaxSessions = 32, INTEGER MinSessions = 1, VARCHAR TdpId = @jobvar_tdpid_tgt, VARCHAR UserName = @jobvar_username_tgt, VARCHAR UserPassword = @jobvar_password_tgt, VARCHAR TargetTable = @jobvar_tgt_dbname || '.AMR_SCPTv3_f_Channels', VARCHAR LogTable = @jobvar_wrk_dbname || '.AMR_SCPTv3_f_Channels_LG', VARCHAR ErrorTable1 = @jobvar_wrk_dbname || '.AMR_SCPTv3_f_Channels_ET', VARCHAR ErrorTable2 = @jobvar_wrk_dbname || '.AMR_SCPTv3_f_Channels_UV', VARCHAR WorkTable = @jobvar_wrk_dbname || '.AMR_SCPTv3_f_Channels_WT' ); STEP SETUP_TABLE ( APPLY( 'DELETE FROM XBI_RADBI_BIZ_APP.AMR_SCPTv3_f_Channels;' ) TO OPERATOR (DDL_OPERATOR); ); STEP LOAD_DATA ( APPLY ( 'INSERT INTO XBI_RADBI_BIZ_APP.AMR_SCPTv3_f_Channels ( :Main_Id, :Parent_Id, :Channel_Cd, :Split_Pct, :Update_Ts, :Create_Ts );' ) TO OPERATOR 
(LOAD_OPERATOR[1]) SELECT * FROM OPERATOR (FILE_READER[1]); ); );

mcorsi 13 posts Joined 07/16
07 Jul 2016

That was ugly - let's do it in a code version:
 

DEFINE JOB AMR_SCPTv3_f_Channels
DESCRIPTION 'Loads f_channels_pub data  into a XBI_RADBI_BIZ_APP AMR_SCPTv3_f_Channels Teradata table'
(
                DEFINE SCHEMA FILE_SOURCE_SCHEMA
                (
                    Main_Id VARCHAR(25),
                    Parent_Id VARCHAR(25),
                    Channel_Cd VARCHAR(25),
                    Split_Pct VARCHAR(25),
                    Update_Ts VARCHAR(25),
                    Create_Ts VARCHAR(25)
                    );

                DEFINE OPERATOR DDL_OPERATOR
                TYPE DDL
                ATTRIBUTES
                (
                        VARCHAR PrivateLogName = 'AMR_SCPTv3_f_Channels_log',
                        VARCHAR TdpId = @jobvar_tdpid_tgt,
                        VARCHAR UserName = @jobvar_username_tgt,
                        VARCHAR UserPassword = @jobvar_password_tgt,
                        VARCHAR AccountID,
                        VARCHAR DataEncryption,
                        VARCHAR LogonMech,
                        VARCHAR LogonMechData,
                        VARCHAR QueryBandSessInfo,
                        VARCHAR Errorlist = '3807'
                );
                
		DEFINE OPERATOR FILE_READER
                TYPE DATACONNECTOR PRODUCER
                SCHEMA FILE_SOURCE_SCHEMA
                ATTRIBUTES
                (
                        VARCHAR PrivateLogName = 'AMR_SCPTv3_f_Channels_File_Log',
                        VARCHAR DirectoryPath = '/home/devo/e0111518/scpt3/data/',
                        VARCHAR FileName = 'AMR_SCPTv3_f_Channels.csv',
                        VARCHAR FORMAT = 'Delimited',
                        VARCHAR OpenMode = 'Read',
                        VARCHAR TextDelimiter = 'TAB',
                        INTEGER SkipRows = 0
                );

                DEFINE OPERATOR LOAD_OPERATOR
                TYPE LOAD
                SCHEMA *
                ATTRIBUTES
                (
                        VARCHAR PrivateLogName = 'AMR_SCPTv3_f_Channels_Load_Log',
                        INTEGER MaxSessions = 32,
                        INTEGER MinSessions = 1,
                        VARCHAR TdpId = @jobvar_tdpid_tgt,
                        VARCHAR UserName = @jobvar_username_tgt,
                        VARCHAR UserPassword = @jobvar_password_tgt,
                        VARCHAR TargetTable = @jobvar_tgt_dbname  || '.AMR_SCPTv3_f_Channels',
                        VARCHAR LogTable    = @jobvar_wrk_dbname  || '.AMR_SCPTv3_f_Channels_LG',
                        VARCHAR ErrorTable1 = @jobvar_wrk_dbname  || '.AMR_SCPTv3_f_Channels_ET',
                        VARCHAR ErrorTable2 = @jobvar_wrk_dbname  || '.AMR_SCPTv3_f_Channels_UV',
                        VARCHAR WorkTable   = @jobvar_wrk_dbname  || '.AMR_SCPTv3_f_Channels_WT'
                );

                STEP SETUP_TABLE
                (
                        APPLY( 'DELETE FROM XBI_RADBI_BIZ_APP.AMR_SCPTv3_f_Channels;' )
                        TO OPERATOR (DDL_OPERATOR);
                );

                STEP LOAD_DATA
                (
                        APPLY (
                                'INSERT INTO XBI_RADBI_BIZ_APP.AMR_SCPTv3_f_Channels (
                                :Main_Id,
                                :Parent_Id,
                                :Channel_Cd,
                                :Split_Pct,
                                :Update_Ts,
                                :Create_Ts
                                );'
                        )
                        TO OPERATOR (LOAD_OPERATOR[1])
                        SELECT * FROM OPERATOR (FILE_READER[1]);
                );
);

 

mcorsi 13 posts Joined 07/16
07 Jul 2016

This whole thread has gotten a little convoluted. I need the last part of that last post (the LOAD_DATA step) to send Update_Ts and Create_Ts as timestamp objects (datetime, timestamp(0), timestamp(6)... I don't care). Teradata will not convert them no matter what I do with the database (up till now).

mcorsi 13 posts Joined 07/16
07 Jul 2016

BTW - it got convoluted because I did not describe the problem correctly.
My bad.
My issue is that I need to upload a file with a string representing a datetime or timestamp field and import it into Teradata using some form of bulk load (see the TPT file above). Is this possible?
 
 

feinholz 1234 posts Joined 05/08
07 Jul 2016

Ok, so you have a TAB-delimited text file.
Can you provide a sample row so that I can see the format of the data that is destined for the timestamp columns?
Teradata is very picky about the format.
Also, if the text in the data file does not adhere (strictly) to the format expected by Teradata, TPT can do text conversions to get it right (we can discuss that later, if needed).
Also, what version of TPT are you running?
 

--SteveF

mcorsi 13 posts Joined 07/16
07 Jul 2016

example row:
 
121234 21653 Retail 40.15 "2016-05-27 13:37:08" "2016-05-27 13:37:08"
 
Tab delimited (not that it looks like it after a cut and paste); the last two fields are timestamp(6).
 
How do I check the TPT version? (As I said, I am new to this company, these systems, and Teradata.)
 
Mark

mcorsi 13 posts Joined 07/16
07 Jul 2016

Also Steve - Don't get tied up on the quotes wrapping the datetime fields. That was just my last pull/attempt. No matter what I wrap them in (singles, nothing, etc.) they will not convert.
 
I think there must be some form of a direct conversion or cast that can be done in the final INSERT statement for the data load? I have tried demanding the fields be a timestamp... (e.g.)

 STEP LOAD_DATA
                (
                        APPLY (
                                'INSERT INTO XBI_RADBI_BIZ_APP.AMR_SCPTv3_f_Channels (
                                :Main_Id,
                                :Parent_Id,
                                :Channel_Cd,
                                :Split_Pct,
                                :Update_Ts,
                                :Create_Ts  (timestamp(6) format...)
                                );'
                        )
                        TO OPERATOR (LOAD_OPERATOR[1])
                        SELECT * FROM OPERATOR (FILE_READER[1]);
                );

But that only crashed everything.
 
Shutting down for the night. But if you have any ideas please send them for an AM wakeup call.
 
Thanks for your efforts.
 
Mark

mcorsi 13 posts Joined 07/16
08 Jul 2016

Steven -
Any further thoughts on this issue? 
Regards,
Mark

feinholz 1234 posts Joined 05/08
08 Jul 2016

The version of TPT is displayed when you run TPT.
As for the timestamp, whether there are quotes or not is very important.
Unless you tell us otherwise, every character is considered part of the data.
Even the quotes.
And that is most likely why the rows ended up in the error table.
If your fields will be quoted, you need to set QuotedData='yes'.
But that means that every field must be quoted.
If only some fields will be quoted, you need to set QuotedData='optional'.
Then we will process non-quoted fields as not quoted, and for the fields that are quoted, we will strip the quotes before sending the data to Teradata.
 

--SteveF

mcorsi 13 posts Joined 07/16
08 Jul 2016

Where do I set QuotedData='optional'? In FILE_SOURCE_SCHEMA? Like:
 

		DEFINE OPERATOR FILE_READER
                TYPE DATACONNECTOR PRODUCER
                SCHEMA FILE_SOURCE_SCHEMA
                ATTRIBUTES
                (
                        VARCHAR PrivateLogName = 'AMR_SCPTv3_f_origins_File_Log',
                        VARCHAR DirectoryPath = '/home/devo/e0111518/scpt3/data/',
                        VARCHAR FileName = 'AMR_SCPTv3_f_origins.csv',
                        VARCHAR FORMAT = 'Delimited',
                        VARCHAR OpenMode = 'Read',
                        VARCHAR TextDelimiter = 'TAB',
                        INTEGER SkipRows = 0,
                        VARCHAR QuotedData='optional'

                );

??

feinholz 1234 posts Joined 05/08
08 Jul 2016

Yes, that is correct.

--SteveF

mcorsi 13 posts Joined 07/16
08 Jul 2016

OK - I tried changing the two fields to TIMESTAMP(6) and setting QuotedData='optional', but all data fails to insert, and in my error table every row has error 6760. Any additional thoughts?
 
Table structure is now:

CREATE MULTISET TABLE XBI_RADBI_BIZ_APP.AMR_SCPTv3_f_Channels ,NO FALLBACK ,
     NO BEFORE JOURNAL,
     NO AFTER JOURNAL,
     CHECKSUM = DEFAULT,
     DEFAULT MERGEBLOCKRATIO
     (
      Main_Id NUMERIC NOT NULL,
      Parent_Id NUMERIC NOT NULL,
      Channel_Cd VARCHAR(25) CHARACTER SET LATIN NOT CASESPECIFIC NOT NULL,
      Split_Pct DECIMAL(15,3) NOT NULL,
      Update_Ts TIMESTAMP(6),
      Create_Ts TIMESTAMP(6))
UNIQUE PRIMARY INDEX ( Parent_Id ,Channel_Cd );
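For reference, error 6760 is Teradata's "invalid timestamp" error, which usually means the incoming string does not match the format the target expects. A pattern often suggested for this situation is to leave the schema fields as VARCHAR and cast them explicitly in the APPLY INSERT, naming both the type and a FORMAT that matches the file exactly. This is only a sketch (not verified against TPT 13.10), and the FORMAT string is an assumption based on the sample row posted earlier:

```sql
/* Sketch: keep Update_Ts/Create_Ts as VARCHAR in the schema and cast in the
   INSERT. The FORMAT string is an assumption matching '2016-05-27 13:37:08';
   'B' denotes a blank (space) separator. If the file carries fractional
   seconds, append '.S(6)' to the format. Quotes are doubled because the
   INSERT lives inside a TPT string literal. */
STEP LOAD_DATA
(
        APPLY (
                'INSERT INTO XBI_RADBI_BIZ_APP.AMR_SCPTv3_f_Channels (
                :Main_Id,
                :Parent_Id,
                :Channel_Cd,
                :Split_Pct,
                :Update_Ts (TIMESTAMP(0), FORMAT ''YYYY-MM-DDBHH:MI:SS''),
                :Create_Ts (TIMESTAMP(0), FORMAT ''YYYY-MM-DDBHH:MI:SS'')
                );'
        )
        TO OPERATOR (LOAD_OPERATOR[1])
        SELECT * FROM OPERATOR (FILE_READER[1]);
);
```

A TIMESTAMP(0) value implicitly widens to a TIMESTAMP(6) column, so the cast does not need to carry six fractional digits the file does not have.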

 

mcorsi 13 posts Joined 07/16
08 Jul 2016

BTW - This is our tpt version:
Teradata Parallel Transporter Version 13.10.00.05

mcorsi 13 posts Joined 07/16
08 Jul 2016

Update:

VARCHAR QuotedData='optional'

did not remove the quotes before the insert on my TPT version.
I then removed the quotes manually and dictated the exact format of the timestamp field. Still no luck.
Any chance version 13.10 is simply not capable of importing timestamps?

feinholz 1234 posts Joined 05/08
08 Jul 2016

TPT 13.10 does not support quoted data.
That went into TPT 14.0.
And TPT 13.10 is not a supported release anymore.
You will have to upgrade your version of TPT, or pre-process the data to remove the quotes.
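A minimal way to do that pre-processing step is a one-line stream edit over the extract before the load. This is just a sketch: the file name comes from the script above, and it assumes no field value legitimately contains a double quote.

```shell
# Strip all double quotes from the TAB-delimited extract so that
# TPT 13.10 (which has no QuotedData support) sees bare timestamp strings.
# Assumes no field legitimately contains a double quote.
sed 's/"//g' AMR_SCPTv3_f_Channels.csv > AMR_SCPTv3_f_Channels.clean.csv
```

Pointing the FILE_READER's FileName attribute at the cleaned file then keeps the rest of the job unchanged.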
 

--SteveF

mcorsi 13 posts Joined 07/16
08 Jul 2016

Thanks Steven. I removed the quotes, but it still wouldn't accept them as timestamps.
I have talked to the business unit that is going to use this software and they are going to be fine with the dates as strings. 
Thanks for all your help! Have a good weekend.
