drsaxman 2 posts Joined 10/06
30 Oct 2006
ETL CLOB from Oracle to Teradata

Is anyone ETL'ing a CLOB from Oracle to Teradata, or does anyone have thoughts on how one would? What strategies are you using? We usually leverage Informatica to ETL into our data warehouse, and Informatica does support CLOBs; however, I'm not sure whether the way Informatica handles CLOBs is compatible with Teradata. I've tried using MultiLoad as well as an INSERT INTO with no success so far, so any help would be greatly appreciated.

Thanks,
Dan

Jim Chapman 449 posts Joined 09/04
30 Oct 2006

The Teradata load utilities (BTEQ, Fastload, Multiload) currently do not have good support for LOBs. The best-supported interfaces for loading LOB data are the open APIs (ODBC, JDBC). If Informatica's (or any other vendor's) ETL product supports loading LOBs using ODBC or JDBC, it should work with Teradata.
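As a rough illustration of the ODBC/JDBC route, here is a minimal Python sketch against a generic DB-API cursor. The table and column names (`doc_stage`, `doc_id`, `doc_text`) and the `?` parameter style are assumptions for the example, not anything from a specific driver; adapt them to whatever driver your ETL tool uses.

```python
def build_insert(table, key_col, clob_col):
    """Parameterized INSERT so the bound value, not the SQL text, carries the CLOB."""
    return f"INSERT INTO {table} ({key_col}, {clob_col}) VALUES (?, ?)"

def load_clob(cursor, doc_id, path):
    """Read a document from disk and insert it through an open DB-API cursor."""
    with open(path, "r", encoding="utf-8") as f:
        text = f.read()
    cursor.execute(build_insert("doc_stage", "doc_id", "doc_text"), (doc_id, text))
```

The point of the parameterized form is that the driver streams the bound value, so the statement text stays small no matter how large the CLOB is.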

drsaxman 2 posts Joined 10/06
09 Nov 2006

I understand what you are saying, but I'm still not having much luck. Has anyone accomplished this via Informatica? Also, I haven't had any luck getting the UDFs from the Teradata site to compile; it seems they are missing a header file or something.

Thanks,

Cherry_Sy 4 posts Joined 04/10
20 Apr 2010

Hello drsaxman! Since the problem you posted goes way back to 2006, do you have any solutions now?

I'm encountering the same problem: we are using Informatica to load into Teradata. I tried using the string data type, but the maximum length cannot be greater than 104857600.

We also tried loading the CLOB using BTEQ, but it is still failing. I'm not sure whether BTEQ 08.02.04.00 supports CLOBs.

CarlosAL 512 posts Joined 04/08
21 Apr 2010

Cherry:

You can load CLOBs with BTEQ, but the version must be 12.00.00.02 or later.

There is an example here:

http://carlosal.wordpress.com/2010/01/25/insertar-ficheros-en-teradata-como-blobs-con-bteq/

But I warn you: It's slooowww.

Cheers.

Carlos.

Cherry_Sy 4 posts Joined 04/10
21 Apr 2010

Thanks Carlos!

Is BTEQ version 12.00.00.02 compatible with V2R6? Is it feasible for us to upgrade our BTEQ version without upgrading our database version?

Is there any other tool besides BTEQ that can help us load CLOBs? The CLOBs we are loading into Teradata are just simple strings of characters.

Thanks,
Cherry

CarlosAL 512 posts Joined 04/08
21 Apr 2010

Cherry:

You can load LOBs in many different ways ( http://carlosal.wordpress.com/2008/11/23/insertar-ficheros-de-imagen-en-teradata-usando-windows-scripting-host-y-ado/ ), BUT the Teradata load utilities don't deal very well with LOBs.

If the CLOBs are shorter than 64,000 characters, I'd try loading them as VARCHAR(64000) with FastLoad/MultiLoad.

Otherwise, you can always split the CLOBs into chunks of 64,000 characters (or 32,000 if Oracle, or...) in the export process, load them into staging tables, and convert them to CLOB if needed.
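The chunking step can be sketched in Python (purely illustrative; the chunk size matches the VARCHAR(64000) staging column suggested above):

```python
def split_clob(text, chunk_size=64000):
    """Split a CLOB's character data into fixed-size chunks.

    Each chunk fits in a VARCHAR(64000) staging column; the sequence
    number preserves the order needed to reassemble the CLOB later.
    """
    return [(seq, text[i:i + chunk_size])
            for seq, i in enumerate(range(0, len(text), chunk_size))]

# Example: a 150,000-character document becomes three staged rows.
chunks = split_clob("x" * 150000)
assert [len(c) for _, c in chunks] == [64000, 64000, 22000]
assert "".join(c for _, c in chunks) == "x" * 150000
```

Reassembly on the Teradata side is then just a concatenation of the staged rows ordered by the sequence number.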

HTH.

Cheers.

Carlos.

Jim Chapman 449 posts Joined 09/04
21 Apr 2010

Since I last posted on this thread back in 2006, the utility support for large objects has improved somewhat. In addition to the BTEQ support, the TPT (Teradata Parallel Transporter) utility now supports large objects.

Cherry_Sy 4 posts Joined 04/10
23 Apr 2010

Thanks again carlos,

We just agreed to load 64,000 bytes of data into Teradata, but I encountered a problem when I used string(64000) in Informatica to load into Teradata via the MultiLoad utility.

**** 13:58:17 UTY1425 The constructed data row exceeds the maximum allowable size of 64000 bytes.

Instead of 64000, I used 63500 and was able to load the data successfully. Is it because the total number of bytes, including the other columns, must not exceed 64,000 bytes?

Thanks,
Cherry

CarlosAL 512 posts Joined 04/08
26 Apr 2010

Multiload reference:

"The maximum row size for a MultiLoad job, data plus indicators, is approximately 64,000 bytes. This limit is a function of:
• The row size limit of the Teradata Database
• The MultiLoad work tables and error tables
MultiLoad cannot accommodate a row size that is near the row size limit of the Teradata Database because of the operational overhead associated with MultiLoad work tables and error tables. If the MultiLoad job exceeds this limit, the Teradata Database returns Error Message 3577 (row size or sort key size overflow), and terminates the job."
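In other words, the text column's budget is roughly 64,000 bytes minus everything else in the row. A back-of-the-envelope check (the 500-byte figure for the other columns plus indicator overhead is purely an assumption, chosen to illustrate why 63500 worked while 64000 did not):

```python
MLOAD_MAX_ROW = 64000  # approximate MultiLoad row limit, data plus indicators

def max_text_bytes(other_bytes):
    """Largest text column that still fits in a MultiLoad row, given the
    combined size of all other columns, indicators, and overhead."""
    return MLOAD_MAX_ROW - other_bytes

# If the rest of the row takes 500 bytes, 63500 remains for the text
# column, while a 64000-byte value pushes the row over the limit.
assert max_text_bytes(500) == 63500
assert 64000 + 500 > MLOAD_MAX_ROW
```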

HTH.

Cheers.

Carlos.

Cherry_Sy 4 posts Joined 04/10
28 Apr 2010

Thanks, Carlos, for the information =)

Cherry

12 Dec 2012

Hi,

While loading data from a CSV file into a Teradata table, I am losing some data. Please help me out.

Example: transformation_desc is the column in the target table, with length VARCHAR(4000). The corresponding file column, transformation_desc, is string(4000) and sends records of only 3,400 characters for loading into the table field transformation_desc, but only 1,770 bytes of data are loaded into the column. The 3,400 bytes of data are enclosed in double quotes ("...").

Thank you,
Murali.

vetriselvan 9 posts Joined 04/14
18 Apr 2014

Can anyone tell me how to load CLOB/BLOB data from Oracle into a Teradata table using TPT? I am able to load CLOB data when I declare it as VARCHAR in Teradata, but it fails if I declare it as CLOB or BLOB. Please help me.

Adeel Chaudhry 773 posts Joined 04/08
20 Apr 2014

Please have a look at following link:
 
http://developer.teradata.com/tools/articles/large-object-loading-with-teradata-parallel-transporter

-- If you are stuck at something .... consider it an opportunity to think anew.
