jonesj2 16 posts Joined 02/09
18 Feb 2011
TPT Extracting Data from Oracle ODBC

I am loading a column from an Oracle table defined as DECIMAL(30,3). I am using the ODBC interface to extract the data. I have the column defined as DECIMAL(30,3) in the Teradata table. The job completes without errors, but the values are all divided by 1000. For example, Oracle value = 1000, Teradata = 1. Maybe it's an ODBC setting. Thanks in advance for any help.

feinholz 1234 posts Joined 05/08
23 Feb 2011

I would need to see the script, and the TPT output log showing me the detailed output from the ODBC and loading (Load, Update or Stream) operators (the entire .out file would be most helpful).

Please also provide the platform and version of TPT and ODBC operator you are using.
What ODBC drivers are you using?

--SteveF

jonesj2 16 posts Joined 02/09
22 Mar 2011

Thanks for the response.
I am using:
Teradata Parallel Transporter Version 13.00.00.02
Windows XP
Oracle ODBC driver, Release 10.1.0.2.0

TPT Script:
DEFINE JOB ODBC_LOAD
DESCRIPTION 'ODBC LOAD SUPL DEFINITION TABLE'
(
DEFINE SCHEMA Sanity_Test_ODBC_Schema
(
ITM_ROW_ID decimal(27,0),
ENG_RLT_CMPNT_QTY decimal (30,3)

);

DEFINE OPERATOR DDL_Operator
TYPE DDL
ATTRIBUTES
(
VARCHAR PrivateLogName = 'ddl_log',
VARCHAR TdpId = 'tdev',
VARCHAR UserName = '*******1',
VARCHAR UserPassword = '*******',
VARCHAR ARRAY ErrorList = ['3807','3803']
);

DEFINE OPERATOR ODBC_Operator
DESCRIPTION 'Teradata Parallel Transporter ODBC Operator'
TYPE ODBC
SCHEMA Sanity_Test_ODBC_Schema
ATTRIBUTES
(
VARCHAR PrivateLogName = 'odbc_log',
VARCHAR DSNName = 'cbto1',
VARCHAR UserName = '*******',
VARCHAR UserPassword = '*****',
VARCHAR SelectStmt = 'Select ITM_ROW_ID,ENG_RLT_CMPNT_QTY from EMF_A02.EMFA140_BOM_FACT where ENG_RLT_CMPNT_QTY is not null and rownum < 6;'
);

DEFINE OPERATOR Load_Operator
TYPE LOAD
SCHEMA *
ATTRIBUTES
(
VARCHAR PrivateLogName = 'load_log',
VARCHAR TdpId = 'tdev',
VARCHAR UserName = '******',
VARCHAR UserPassword = '********',
VARCHAR TargetTable = 'DLTB101_ENG.testj',
VARCHAR LogTable = 'DLTB101_ENG.testj_log',
VARCHAR ErrorTable1 = 'DLTB101_ENG.testj_er1',
VARCHAR ErrorTable2 = 'DLTB101_ENG.testj_er2'
);

Step Setup_Into_Tables
(
APPLY
('drop table DLTB101_ENG.testj_er1;' ),
('drop table DLTB101_ENG.testj_er2;' ),
('drop table DLTB101_ENG.testj ;' ),
('create multiset table DLTB101_ENG.testj (
ITM_ROW_ID decimal(27,0),
ENG_RLT_CMPNT_QTY decimal(30,3)

)
PRIMARY INDEX ( ITM_ROW_ID );'

)
TO OPERATOR (DDL_Operator);
);

Step Insert_Into_Tables
(
APPLY
'INSERT INTO DLTB101_ENG.testj
(
:ITM_ROW_ID,
:ENG_RLT_CMPNT_QTY
);'

TO OPERATOR (Load_Operator)
Select *
FROM OPERATOR (ODBC_Operator);
);
);

Output from TPT:

Teradata Parallel Transporter Version 13.00.00.02
Job log: C:\Program Files\Teradata\Client\13.0\Teradata Parallel Transporter\\/logs/engine-195.out
Job id is engine-195, running on WHQPC-002547
Teradata Parallel Transporter SQL DDL Operator Version 13.00.00.02
DDL_Operator: private log specified: ddl_log
DDL_Operator: connecting sessions
DDL_Operator: sending SQL requests
DDL_Operator: TPT10508: RDBMS error 3807: Object 'DLTB101_ENG.testj_er1' does not exist.
DDL_Operator: TPT18046: Warning: error is ignored as requested in ErrorList
DDL_Operator: TPT10508: RDBMS error 3807: Object 'DLTB101_ENG.testj_er2' does not exist.
DDL_Operator: TPT18046: Warning: error is ignored as requested in ErrorList
DDL_Operator: disconnecting sessions
DDL_Operator: Total processor time used = '0.15625 Second(s)'
DDL_Operator: Start : Tue Mar 22 09:48:54 2011
DDL_Operator: End : Tue Mar 22 09:48:57 2011
Job step Setup_Into_Tables completed successfully
Teradata Parallel Transporter Load Operator Version 13.00.00.02
Load_Operator: private log specified: load_log
Teradata Parallel Transporter ODBC Operator Version 13.00.00.02
ODBC_Operator: private log specified: odbc_log
ODBC_Operator: connecting sessions
ODBC_Operator: TPT17175: Warning: Message received from ODBC driver:
STATE=S1C00, CODE=0,
MSG='[Oracle][ODBC]Driver not capable.'
Load_Operator: connecting sessions
Load_Operator: preparing target table
Load_Operator: entering Acquisition Phase
ODBC_Operator: sending SELECT request
ODBC_Operator: data retrieval complete
Load_Operator: entering Application Phase
Load_Operator: Statistics for Target Table: 'DLTB101_ENG.testj'
Load_Operator: Total Rows Sent To RDBMS: 5
Load_Operator: Total Rows Applied: 5
Load_Operator: disconnecting sessions
ODBC_Operator: disconnecting sessions
ODBC_Operator: Total processor time used = '0.0625 Second(s)'
ODBC_Operator: Start : Tue Mar 22 09:48:58 2011
ODBC_Operator: End : Tue Mar 22 09:49:10 2011
Load_Operator: Total processor time used = '0.203125 Second(s)'
Load_Operator: Start : Tue Mar 22 09:48:58 2011
Load_Operator: End : Tue Mar 22 09:49:11 2011
Job step Insert_Into_Tables terminated (status 4)
Job engine completed successfully, but with warning(s).

jonesj2 16 posts Joined 02/09
22 Mar 2011

How can I get the .out file to you?

jonesj2 16 posts Joined 02/09
23 Mar 2011

Here's the contents of the out log file:

U:\>tlogview -j testj-224
Found CheckPoint file: C:\Program Files\Teradata\Client\13.0\Teradata Parallel Transporter\\/checkpoint\testjLVCP
This is a restart job; it restarts at step Insert_Into_Tables.
Teradata Parallel Transporter Executor Version 13.00.00.02
Teradata Parallel Transporter Coordinator Version 13.00.00.02
Teradata Parallel Transporter Executor Version 13.00.00.02
Teradata Parallel Transporter ODBC Operator Version 13.00.00.02
ODBC_Operator: private log specified: odbc_log
Teradata Parallel Transporter Load Operator Version 13.00.00.02
Load_Operator: private log specified: load_log
ODBC_Operator: connecting sessions
ODBC_Operator: TPT17175: Warning: Message received from ODBC driver:
STATE=S1C00, CODE=0,
MSG='[Oracle][ODBC]Driver not capable.'
Load_Operator: connecting sessions
Load_Operator: preparing target table
Load_Operator: entering Acquisition Phase
Job is running in Buffer Mode
Task(APPLY_1[0001]): checkpoint completed, status = Success
Task(SELECT_2[0001]): checkpoint completed, status = Success
ODBC_Operator: sending SELECT request
ODBC_Operator: data retrieval complete
Task(SELECT_2[0001]) ready to checkpoint

Task(SELECT_2[0001]): checkpoint completed, status = Success
Task(APPLY_1[0001]): checkpoint completed, status = Success
Task(SELECT_2[0001]) ready to take the EOD checkpoint

Task(SELECT_2[0001]): checkpoint completed, status = Success
Task(APPLY_1[0001]): checkpoint completed, status = Success
Load_Operator: entering Application Phase
Load_Operator: Statistics for Target Table: 'DLTB101_ENG.testj'
Load_Operator: Total Rows Sent To RDBMS: 5
Load_Operator: Total Rows Applied: 5
PXTB_Terminate: Message Buffers Sent/Received = 1, Total Rows Received = 0, Total Rows Sent = 0
PXTB_Terminate: Message Buffers Sent/Received = 1, Total Rows Received = 0, Total Rows Sent = 0
ODBC_Operator: disconnecting sessions
Load_Operator: disconnecting sessions
ODBC_Operator: Total processor time used = '0.09375 Second(s)'
ODBC_Operator: Start : Wed Mar 23 09:12:11 2011
ODBC_Operator: End : Wed Mar 23 09:12:30 2011
Load_Operator: Total processor time used = '0.328125 Second(s)'
Load_Operator: Start : Wed Mar 23 09:12:11 2011
Load_Operator: End : Wed Mar 23 09:12:33 2011
Job step Insert_Into_Tables terminated (status 4)
Job testj completed successfully, but with warning(s).
Total available memory: 10000000
Largest allocable area: 10000000
Memory use high water mark: 71252
Free map size: 1024
Free map use high water mark: 20
Free list use high water mark: 0

feinholz 1234 posts Joined 05/08
23 Mar 2011

Since your row set is so small right now, I will ask you to do something.
In the script, in the ODBC operator definition, add this:
VARCHAR TraceLevel = 'row'
And re-run the job.
You will then need to look at the diagnostic information.
To extract everything out, do this from the command line:
$ tlogview -j <job name> -f "*" -g
where <job name> is the name of the job (but I see you knew that already, that is good).
Redirect the tlogview command to a text file.
Look in the text file for all output prefixed by CONVERTDECIMAL.
Provide only those messages (for now).
You should see something like:
CONVERTDECIMAL: entering
CONVERTDECIMAL: column data: '%s'
where "%s" is the decimal data string from the ODBC driver.
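If you'd rather not eyeball the whole dump, a quick throwaway filter (plain Python, not part of TPT; the sample text is just an illustration) can pull out those messages:

```python
# Filter a tlogview text dump down to the CONVERTDECIMAL diagnostics.
def convertdecimal_lines(trace_text):
    return [line for line in trace_text.splitlines()
            if line.strip().startswith("CONVERTDECIMAL")]

sample = """ODBC_Operator: sending SELECT request
CONVERTDECIMAL: entering
CONVERTDECIMAL: column data: '3'
CONVERTDECIMAL: leaving"""

for line in convertdecimal_lines(sample):
    print(line)
```

Redirect the tlogview output to a file first, then read that file in place of the sample string.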

Here is what might be happening.
If we get a string of "1000" we convert that to an integer and send to Teradata.
If Teradata receives a value of 1000 and the column has a scale of 3, then Teradata will view the value as 1.
For a decimal value on a DECIMAL(30,3) column to be considered equal to 1000.000 we need to receive a string of "1000.000" from the ODBC driver.
I am wondering if the driver is not returning to us the value we need.
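To illustrate the arithmetic (this is only a sketch of the interpretation, not TPT's actual conversion code): a digit string that arrives with no decimal point is taken as already-scaled units of the column's scale, which reproduces the divide-by-1000 symptom.

```python
from decimal import Decimal

def as_teradata_sees_it(raw, scale):
    # A digit string with no decimal point is treated as already-scaled
    # units, so "1000" on a scale-3 column comes out as 1.
    if "." not in raw:
        return Decimal(raw) / (Decimal(10) ** scale)
    return Decimal(raw)

print(as_teradata_sees_it("1000", 3))      # the reported symptom: 1
print(as_teradata_sees_it("1000.000", 3))  # what the driver should send
```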

I also need to point out 2 things:

1. we only support the ODBC drivers from DataDirect (you are using the drivers from Oracle)
2. the TraceLevel of "row" is only to be used when instructed; it produces a lot of detailed information, and if you try it while loading a lot of data, the trace file can get very large and you could easily run out of disk space

--SteveF

jonesj2 16 posts Joined 02/09
24 Mar 2011

When I use the same Oracle ODBC driver in SQL Assistant, I get the correct results. The info you requested follows. I think it looks as you expected. Any suggestions on how to make the Oracle ODBC driver work would be appreciated. I will try to obtain the ODBC drivers from DataDirect. Thanks for your help.

CONVERTDECIMAL: entering
CONVERTDECIMAL: column data: '733485'
CONVERTDECIMAL: column length: 6'
CONVERTDECIMAL: Source precision: 27
CONVERTDECIMAL: Source scale: 0
CONVERTDECIMAL: size: 29
CONVERTDECIMAL: Target Precision: 27
CONVERTDECIMAL: Target Scale: 0
CONVERTDECIMAL: leaving

CONVERTDECIMAL: entering
CONVERTDECIMAL: column data: '3'
CONVERTDECIMAL: column length: 1'
CONVERTDECIMAL: Source precision: 30
CONVERTDECIMAL: Source scale: 3
CONVERTDECIMAL: size: 32
CONVERTDECIMAL: Target Precision: 30
CONVERTDECIMAL: Target Scale: 3
CONVERTDECIMAL: leaving
FETCHROW: leaving

CONVERTDECIMAL: entering
CONVERTDECIMAL: column data: '733695'
CONVERTDECIMAL: column length: 6'
CONVERTDECIMAL: Source precision: 27
CONVERTDECIMAL: Source scale: 0
CONVERTDECIMAL: size: 29
CONVERTDECIMAL: Target Precision: 27
CONVERTDECIMAL: Target Scale: 0
CONVERTDECIMAL: leaving

CONVERTDECIMAL: entering
CONVERTDECIMAL: column data: '1'
CONVERTDECIMAL: column length: 1'
CONVERTDECIMAL: Source precision: 30
CONVERTDECIMAL: Source scale: 3
CONVERTDECIMAL: size: 32
CONVERTDECIMAL: Target Precision: 30
CONVERTDECIMAL: Target Scale: 3
CONVERTDECIMAL: leaving

CONVERTDECIMAL: entering
CONVERTDECIMAL: column data: '733696'
CONVERTDECIMAL: column length: 6'
CONVERTDECIMAL: Source precision: 27
CONVERTDECIMAL: Source scale: 0
CONVERTDECIMAL: size: 29
CONVERTDECIMAL: Target Precision: 27
CONVERTDECIMAL: Target Scale: 0
CONVERTDECIMAL: leaving

CONVERTDECIMAL: entering
CONVERTDECIMAL: column data: '2'
CONVERTDECIMAL: column length: 1'
CONVERTDECIMAL: Source precision: 30
CONVERTDECIMAL: Source scale: 3
CONVERTDECIMAL: size: 32
CONVERTDECIMAL: Target Precision: 30
CONVERTDECIMAL: Target Scale: 3
CONVERTDECIMAL: leaving

CONVERTDECIMAL: entering
CONVERTDECIMAL: column data: '733420'
CONVERTDECIMAL: column length: 6'
CONVERTDECIMAL: Source precision: 27
CONVERTDECIMAL: Source scale: 0
CONVERTDECIMAL: size: 29
CONVERTDECIMAL: Target Precision: 27
CONVERTDECIMAL: Target Scale: 0
CONVERTDECIMAL: leaving

CONVERTDECIMAL: entering
CONVERTDECIMAL: column data: '1'
CONVERTDECIMAL: column length: 1'
CONVERTDECIMAL: Source precision: 30
CONVERTDECIMAL: Source scale: 3
CONVERTDECIMAL: size: 32
CONVERTDECIMAL: Target Precision: 30
CONVERTDECIMAL: Target Scale: 3
CONVERTDECIMAL: leaving

CONVERTDECIMAL: entering
CONVERTDECIMAL: column data: '733175'
CONVERTDECIMAL: column length: 6'
CONVERTDECIMAL: Source precision: 27
CONVERTDECIMAL: Source scale: 0
CONVERTDECIMAL: size: 29
CONVERTDECIMAL: Target Precision: 27
CONVERTDECIMAL: Target Scale: 0
CONVERTDECIMAL: leaving

CONVERTDECIMAL: entering
CONVERTDECIMAL: column data: '1'
CONVERTDECIMAL: column length: 1'
CONVERTDECIMAL: Source precision: 30
CONVERTDECIMAL: Source scale: 3
CONVERTDECIMAL: size: 32
CONVERTDECIMAL: Target Precision: 30
CONVERTDECIMAL: Target Scale: 3
CONVERTDECIMAL: leaving

jonesj2 16 posts Joined 02/09
24 Mar 2011

FYI: I used 1000 as an example in my original post. The values loaded into Teradata in the test run posted above are:
0.003
0.002
0.001
0.001
0.001

jonesj2 16 posts Joined 02/09
24 Mar 2011

FYI: Another interesting fact. Another column in the same table, defined as DECIMAL(20,10), loads correctly, but it has values with decimal digits, such as:
0.0400000000
0.4000000000
0.1430000000

feinholz 1234 posts Joined 05/08
31 Mar 2011

Can you try something for me?

Even though your source table has DECIMAL(30,3), if you know your data will never be larger than would fit into DECIMAL(18,3), can you change the TPT script so that the field has DECIMAL(18,3) and re-run the job?

You may have uncovered a bug.

--SteveF

feinholz 1234 posts Joined 05/08
13 Apr 2011

Just a followup. This thread did actually uncover a bug in the ODBC operator when the large decimal specification is used. The bug is also due to the ODBC driver you are using (remember, we only support the DataDirect drivers, not the Oracle or Windows drivers). When there is a NUMBER (or DECIMAL) field and the scale is not 0, there should be a decimal point. So a decimal value of 3 should be returned to us as at least "3." or "3.000" if the scale is 3.

But the Oracle driver is not doing that, and provides us with "3". When that is sent to Teradata, Teradata assumes the "3" (with a scale of 3) means ".003".

To make up for the ODBC driver's shortcoming, we will enhance our code to react appropriately if there is no decimal point.
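A rough sketch of that kind of defensive fix (hypothetical illustration, not the actual TPT patch): if the driver's string has no decimal point and the column has a nonzero scale, pad the fractional part before conversion.

```python
def pad_decimal_string(raw, scale):
    # If the ODBC driver omitted the decimal point, treat the digits as
    # the integer part and pad zeros out to the column's scale.
    if scale > 0 and "." not in raw:
        return raw + "." + "0" * scale
    return raw

print(pad_decimal_string("3", 3))    # "3.000"
print(pad_decimal_string("3.5", 3))  # already has a point; unchanged
```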

--SteveF

jonesj2 16 posts Joined 02/09
18 Apr 2011

Sorry for the late response. For some reason, I'm not getting emails when this post gets a reply. I ran the suggested test and got the same results. Acquiring the DataDirect drivers will take some time. Can you provide a time estimate on when the enhancement to support the Oracle driver will be available? Thanks for your help.

feinholz 1234 posts Joined 05/08
20 Apr 2011

You can get evaluation copies of the DD drivers fairly quickly if that is allowed in your environment.
As for the timeframe, I believe it will be a few more weeks before the patch becomes available.
It is still undergoing formal efix testing.

--SteveF

