blackrooster 14 posts Joined 02/14
09 Jan 2015
TPT Loading DateTime (SQL Server) with UTF-16 char set

Hi all,
I am having difficulties loading the DateTime data type with TPT from SQL Server into Teradata using the UTF-16 character set. I keep getting error 2673, 'SOURCE PARCEL LENGTH DOES NOT MATCH DATA THAT WAS DEFINED'. When I leave out the DateTime columns my script loads fine; it is only when I add a DateTime field that things go sour. Does anyone have an idea how to correctly load the DATETIME data type into TIMESTAMP(3)? Exporting the data to a file with the bcp utility and doing a FastLoad works fine.
This is my script:

USING CHAR SET UTF16

DEFINE JOB TPT_LOAD_Dim_Billing_Document_Type
DESCRIPTION 'ODBC LOAD Dim_Billing_Document_Type TABLE'
(
  DEFINE SCHEMA ODBC_Dim_Billing_Document_Type
  (
    SK_Billing_Document_Type     INTEGER,
    Billing_Document_Type_Code   VARCHAR(100),
    Billing_Document_Type_Name   VARCHAR(200),
    Analytic_Relevence_Indicator VARCHAR(20),
    Billing_Document_Group_Code  VARCHAR(20),
    Record_Source_Timestamp      CHAR(46),
    Record_Checksum_SCDType1     INTEGER,
    Record_Checksum_SCDType2     INTEGER
  );

  DEFINE OPERATOR DDLOperator()
  TYPE DDL
  ATTRIBUTES
  (
    VARCHAR PrivateLogName = 'ddl_log',
    VARCHAR TdpId = @MyTdpId,
    VARCHAR UserName = @MyUserName,
    VARCHAR UserPassword = @MyPassword,
    VARCHAR WorkingDatabase = @MyDatabase,
    VARCHAR ARRAY ErrorList = ['3807','3803']
  );

  DEFINE OPERATOR ODBC_Operator
  DESCRIPTION 'Teradata Parallel Transporter ODBC Operator'
  TYPE ODBC
  SCHEMA ODBC_Dim_Billing_Document_Type
  ATTRIBUTES
  (
    VARCHAR PrivateLogName = 'odbc_log',
    VARCHAR DSNName = @jobvar_SQL_SERVER_DNS,
    VARCHAR UserName = @MyUserName,
    VARCHAR UserPassword = @MyPassword,
    VARCHAR SelectStmt = 'Select SK_Billing_Document_Type,
      Billing_Document_Type_Code,
      Billing_Document_Type_Name,
      Analytic_Relevence_Indicator,
      Billing_Document_Group_Code,
      convert(char(23), Record_Source_Timestamp, 121),
      Record_Checksum_SCDType1,
      Record_Checksum_SCDType2
      FROM [WILD_DWH].[DWH].[Dim_Billing_Document_Type];'
  );

  DEFINE OPERATOR Load_Operator
  TYPE LOAD
  SCHEMA *
  ATTRIBUTES
  (
    VARCHAR ErrorTable1 = 'Dim_Billing_Document_Type_errors1',
    VARCHAR ErrorTable2 = 'Dim_Billing_Document_Type_errors2',
    VARCHAR LogTable = '"D0_EU_STG_T"."Dim_Billing_Document_Type_Log"',
    VARCHAR PrivateLogName = 'load_log',
    VARCHAR TargetTable = '"Dim_Billing_Document_Type"',
    VARCHAR TdpId = @MyTdpId,
    VARCHAR UserName = @MyUserName,
    VARCHAR UserPassword = @MyPassword,
    VARCHAR WorkingDatabase = @MyDatabase
  );

  STEP drop_and_create_the_table
  (
    APPLY
    ('DROP TABLE "Dim_Billing_Document_Type_errors1";'),
    ('DROP TABLE "Dim_Billing_Document_Type_errors2";'),
    ('DROP TABLE "Dim_Billing_Document_Type";'),
    ('CREATE MULTISET TABLE "Dim_Billing_Document_Type" (
        SK_Billing_Document_Type     INTEGER NOT NULL,
        Billing_Document_Type_Code   VARCHAR(50) CHARACTER SET UNICODE NOT NULL CASESPECIFIC,
        Billing_Document_Type_Name   VARCHAR(100) CHARACTER SET UNICODE CASESPECIFIC,
        Analytic_Relevence_Indicator VARCHAR(10) CHARACTER SET UNICODE CASESPECIFIC,
        Billing_Document_Group_Code  VARCHAR(10) CHARACTER SET UNICODE CASESPECIFIC,
        Record_Source_Timestamp      TIMESTAMP(3) NOT NULL,
        Record_Checksum_SCDType1     INTEGER NOT NULL,
        Record_Checksum_SCDType2     INTEGER NOT NULL
      );')
    TO OPERATOR (DDLOperator);
  );

  STEP load_the_data
  (
    APPLY
    ('INSERT INTO "Dim_Billing_Document_Type" (
        :SK_Billing_Document_Type,
        :Billing_Document_Type_Code,
        :Billing_Document_Type_Name,
        :Analytic_Relevence_Indicator,
        :Billing_Document_Group_Code,
        :Record_Source_Timestamp,
        :Record_Checksum_SCDType1,
        :Record_Checksum_SCDType2
      );')
    TO OPERATOR (Load_Operator)
    SELECT
      SK_Billing_Document_Type,
      Billing_Document_Type_Code,
      Billing_Document_Type_Name,
      Analytic_Relevence_Indicator,
      Billing_Document_Group_Code,
      Record_Source_Timestamp,
      Record_Checksum_SCDType1,
      Record_Checksum_SCDType2
    FROM OPERATOR (ODBC_Operator);
  );
);
 
S-
 

blackrooster 14 posts Joined 02/14
14 Jan 2015

Update for other people battling with the same issue.
 
I got this to work using the following script:
 

USING CHAR SET UTF16

DEFINE JOB TPT_LOAD_Dim_Billing_Document_Type
DESCRIPTION 'ODBC LOAD Dim_Billing_Document_Type TABLE'
(
  DEFINE SCHEMA ODBC_Dim_Billing_Document_Type
  (
    SK_Billing_Document_Type     INTEGER,
    Billing_Document_Type_Code   VARCHAR(100),
    Billing_Document_Type_Name   VARCHAR(200),
    Analytic_Relevence_Indicator VARCHAR(20),
    Billing_Document_Group_Code  VARCHAR(20),
    Record_Source_Timestamp      VARCHAR(48),
    Record_Checksum_SCDType1     INTEGER,
    Record_Checksum_SCDType2     INTEGER
  );

  DEFINE OPERATOR DDLOperator()
  TYPE DDL
  ATTRIBUTES
  (
    VARCHAR PrivateLogName = 'ddl_log',
    VARCHAR TdpId = @MyTdpId,
    VARCHAR UserName = @MyUserName,
    VARCHAR UserPassword = @MyPassword,
    VARCHAR WorkingDatabase = @MyDatabase,
    VARCHAR ARRAY ErrorList = ['3807','3803']
  );

  DEFINE OPERATOR ODBC_Operator
  DESCRIPTION 'Teradata Parallel Transporter ODBC Operator'
  TYPE ODBC
  SCHEMA ODBC_Dim_Billing_Document_Type
  ATTRIBUTES
  (
    VARCHAR PrivateLogName = 'odbc_log',
    VARCHAR DSNName = @jobvar_SQL_SERVER_DNS,
    VARCHAR UserName = @MyUserName,
    VARCHAR UserPassword = @MyPassword,
    VARCHAR SelectStmt = 'Select SK_Billing_Document_Type,
      Billing_Document_Type_Code,
      Billing_Document_Type_Name,
      Analytic_Relevence_Indicator,
      Billing_Document_Group_Code,
      convert(nvarchar(24), convert(varchar, Record_Source_Timestamp, 121)),
      Record_Checksum_SCDType1,
      Record_Checksum_SCDType2
      FROM [WILD_DWH].[DWH].[Dim_Billing_Document_Type];'
  );

  DEFINE OPERATOR Load_Operator
  TYPE LOAD
  SCHEMA *
  ATTRIBUTES
  (
    VARCHAR ErrorTable1 = 'D0_EU_AUX_T.Dim_Billing_Document_Type_errors1',
    VARCHAR ErrorTable2 = 'D0_EU_AUX_T.Dim_Billing_Document_Type_errors2',
    VARCHAR LogTable = '"D0_EU_STG_T"."Dim_Billing_Document_Type_Log"',
    VARCHAR PrivateLogName = 'load_log',
    VARCHAR TargetTable = '"Dim_Billing_Document_Type"',
    VARCHAR TdpId = @MyTdpId,
    VARCHAR UserName = @MyUserName,
    VARCHAR UserPassword = @MyPassword,
    VARCHAR WorkingDatabase = @MyDatabase
  );

  STEP drop_and_create_the_table
  (
    APPLY
    ('DROP TABLE "D0_EU_AUX_T"."Dim_Billing_Document_Type_errors1";'),
    ('DROP TABLE "D0_EU_AUX_T"."Dim_Billing_Document_Type_errors2";'),
    ('DROP TABLE "Dim_Billing_Document_Type";'),
    ('CREATE MULTISET TABLE "Dim_Billing_Document_Type" (
        SK_Billing_Document_Type     INTEGER NOT NULL,
        Billing_Document_Type_Code   VARCHAR(50) CHARACTER SET UNICODE NOT NULL CASESPECIFIC,
        Billing_Document_Type_Name   VARCHAR(100) CHARACTER SET UNICODE CASESPECIFIC,
        Analytic_Relevence_Indicator VARCHAR(10) CHARACTER SET UNICODE CASESPECIFIC,
        Billing_Document_Group_Code  VARCHAR(10) CHARACTER SET UNICODE CASESPECIFIC,
        Record_Source_Timestamp      TIMESTAMP(3) NOT NULL,
        Record_Checksum_SCDType1     INTEGER NOT NULL,
        Record_Checksum_SCDType2     INTEGER NOT NULL
      );')
    TO OPERATOR (DDLOperator);
  );

  STEP load_the_data
  (
    APPLY
    ('INSERT INTO "Dim_Billing_Document_Type" (
        :SK_Billing_Document_Type,
        :Billing_Document_Type_Code,
        :Billing_Document_Type_Name,
        :Analytic_Relevence_Indicator,
        :Billing_Document_Group_Code,
        :Record_Source_Timestamp,
        :Record_Checksum_SCDType1,
        :Record_Checksum_SCDType2
      );')
    TO OPERATOR (Load_Operator)
    SELECT
      SK_Billing_Document_Type,
      Billing_Document_Type_Code,
      Billing_Document_Type_Name,
      Analytic_Relevence_Indicator,
      Billing_Document_Group_Code,
      Record_Source_Timestamp,
      Record_Checksum_SCDType1,
      Record_Checksum_SCDType2
    FROM OPERATOR (ODBC_Operator);
  );
);

feinholz 1234 posts Joined 05/08
14 Jan 2015

Why does your schema have CHAR(48) instead of TIMESTAMP(3)?

--SteveF

blackrooster 14 posts Joined 02/14
15 Jan 2015

Hi Steve,
Because then I get the same 2673 error described in my original post.
S-

blackrooster 14 posts Joined 02/14
15 Jan 2015

This happens even when I try to format my datetime as convert(datetime, Record_Source_Timestamp, 121) in the SelectStmt.
 

feinholz 1234 posts Joined 05/08
15 Jan 2015

Granted, I do not have SQL Server on my PC, but when I do an export from Teradata and then a load into Teradata with TIMESTAMP(3) in the schema, and the value of the timestamp column in the source table is something like '2014-03-06 12:13:14.123', it loads just fine. I tried with and without USING CHARACTER SET UTF16.
 
What is the value in SQL Server that you are trying to export and load?
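For reference, this is roughly what the schema looks like when the column is declared as a timestamp directly; it is a sketch of the approach I tested, not your exact script:

  DEFINE SCHEMA ODBC_Dim_Billing_Document_Type
  (
    SK_Billing_Document_Type     INTEGER,
    Billing_Document_Type_Code   VARCHAR(100),
    Billing_Document_Type_Name   VARCHAR(200),
    Analytic_Relevence_Indicator VARCHAR(20),
    Billing_Document_Group_Code  VARCHAR(20),
    Record_Source_Timestamp      TIMESTAMP(3),  /* declared as a timestamp instead of CHAR(46)/VARCHAR(48) */
    Record_Checksum_SCDType1     INTEGER,
    Record_Checksum_SCDType2     INTEGER
  );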
 

--SteveF

feinholz 1234 posts Joined 05/08
15 Jan 2015

I will continue to look into this.
I just installed SQL Server onto my PC, created a table with a DateTime column, and inserted a single row with the value:
2014-03-04 12:13:14.123
If I do a SELECT * FROM <tablename>; then I get the proper value.
If I do a SELECT convert(char(23), COL1, 121) FROM <tablename>; then I get the proper value.
(These are queries in SQL Server Management Studio.)
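In case you want to reproduce that check, this is essentially what I ran (the table name here is made up; COL1 is the column from my test):

  CREATE TABLE ts_test (COL1 datetime);
  INSERT INTO ts_test (COL1) VALUES ('2014-03-04 12:13:14.123');
  SELECT * FROM ts_test;                             -- returns 2014-03-04 12:13:14.123
  SELECT convert(char(23), COL1, 121) FROM ts_test;  -- returns 2014-03-04 12:13:14.123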
Thus, it is possible that ODBC is not returning the value properly to TPT and I will have to look into that.
I would think that converting to nvarchar would not have worked because you are providing a VARCHAR to TPT and TPT is expecting CHAR.
Can you do me a favor?
Can you re-run the original test (the one that failed), but with TRACELEVEL='all' set in the ODBC operator's script, and send me the entire job log (the .out file)? I would appreciate it.
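In the ODBC operator's ATTRIBUTES section that is one extra line, something like:

    VARCHAR PrivateLogName = 'odbc_log',
    VARCHAR TraceLevel = 'all',   /* added just for this diagnostic run */
    VARCHAR DSNName = @jobvar_SQL_SERVER_DNS,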
Send to: steven.feinholz@teradata.com
Since it is a binary file, you may have to zip it or rename it for it to get through to my inbox.
Thanks!

--SteveF

blackrooster 14 posts Joined 02/14
16 Jan 2015

Hi Steve,
In the script that worked I defined the datetime as VARCHAR(48), not as CHAR(48), in the SCHEMA definition, so I don't understand what you mean by your statement "I would think that converting to nvarchar would not have worked because you are providing a VARCHAR to TPT and TPT is expecting CHAR."
In SQL Server Management Studio I also get the proper values with and without converting to char(23).
I am using the SQL Server ODBC drivers, not the DataDirect drivers.
Please see your mail for the requested log file.
Thanks,
Sven
 

blackrooster 14 posts Joined 02/14
16 Jan 2015

Hi,
I have an additional problem. When I have an nvarchar field in SQL Server that contains digits with leading zeros, I get a fatal error from the ODBC operator: ODBC_Operator: TPT17107: Retrieval error for row ...
I'll try to find a way to code around it.
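One thing I may try (an untested idea) is converting the column explicitly in the SelectStmt, along the lines of:

      convert(varchar(20), My_Nvarchar_Column),   /* hypothetical column name; pick a length that fits the data */

with a matching VARCHAR entry of sufficient length in the SCHEMA definition, following the same doubled-length pattern the other character columns use above.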
 

feinholz 1234 posts Joined 05/08
16 Jan 2015

The original script had a schema with CHAR(46) and a 'convert' in the SELECT to CHAR(23).
That should have worked.
Timestamps are fixed-length CHAR fields, not VARCHAR.
The fact that the original job did not run concerns me, because it should have run fine.
The conversion of a DateTime with style 121 produces a 23-byte string.
With a client session character set of UTF16, the resultant string should have been 46 bytes.
That is what I see in my SQL Server test (not run with the ODBC operator yet).
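The quick check I did in SSMS looks roughly like this (the literal is just my test value):

  SELECT convert(char(23), CAST('2014-03-04 12:13:14.123' AS datetime), 121) AS ts_121,
         LEN(convert(char(23), CAST('2014-03-04 12:13:14.123' AS datetime), 121)) AS char_len;
  -- char_len = 23; with a UTF16 client session each character occupies 2 bytes,
  -- so the matching CHAR field in the TPT schema is 23 * 2 = 46 bytes.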
Thus, I am wondering if the ODBC operator has a bug somewhere.
The fact that converting to VARCHAR did not issue an error is probably due to the operator binding the column as CHAR (since the schema indicated CHAR) and the ODBC driver converting from VARCHAR to CHAR.
 

--SteveF

blackrooster 14 posts Joined 02/14
18 Jan 2015

Hi Steve,
Thanks for thinking this through with me, but again, I don't understand your statement: "The fact that converting to VARCHAR did not issue an error is probably due to the operator binding the column as CHAR (since the schema indicated CHAR) and the ODBC driver converting from VARCHAR to CHAR."
Assuming that sentence refers to the script that worked, I fail to see what you mean by "since the schema indicated CHAR", as the column Record_Source_Timestamp is defined there as VARCHAR(48). Am I missing something?
Sven
 

feinholz 1234 posts Joined 05/08
19 Jan 2015

I just brought it up because we enforce that the data must match the schema.
 
If the data is coming back as VARCHAR but the schema is defined as CHAR, that can sometimes cause issues because of the difference in the data layouts.
 
Thus, if you specify CHAR in the schema, then your SELECT statement must provide the data as CHAR and not VARCHAR.
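Side by side, using the timestamp column from your two scripts, the consistent pairings look like this:

  /* CHAR pairing (what the original script intended) */
  schema:      Record_Source_Timestamp  CHAR(46)
  SelectStmt:  convert(char(23), Record_Source_Timestamp, 121)

  /* VARCHAR pairing (what your working script does) */
  schema:      Record_Source_Timestamp  VARCHAR(48)
  SelectStmt:  convert(nvarchar(24), convert(varchar, Record_Source_Timestamp, 121))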
 

--SteveF
