0 - 6 of 6 tags for #CLOB

We want to offload LOB objects from Teradata tables where the size is greater than 64 KB. I am trying to evaluate all the options at my disposal here.

I have a simple stored procedure which returns a CLOB value. I am calling the procedure from BTEQ and then exporting that value to a file.
The problem is that BTEQ is truncating the value at the end. I have also used SET WIDTH, but no luck. Following is my BTEQ code:
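The original script is not included in the post; for context, here is a minimal sketch of the kind of BTEQ export being described. The procedure name, logon string, and file path are placeholders, not from the post. Note that BTEQ's report width is capped at 65531 characters, so CLOB values longer than that will be truncated no matter what SET WIDTH is given:

```sql
.LOGON tdpid/user,password

.SET WIDTH 65531                     /* BTEQ's maximum report width */
.EXPORT REPORT FILE = clob_out.txt

/* hypothetical stored procedure returning a CLOB via an output parameter */
CALL mydb.get_clob_value(ret_clob);

.EXPORT RESET
.LOGOFF
```

Because of that hard limit, LOBs larger than roughly 64 KB are usually exported with TPT (DataConnector operator in deferred mode) rather than BTEQ.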

TPT DataConnector producer error : "Delimited Data Parsing error: Input record exceeds allocated storage"
I am using the TPT DataConnector producer and SQL Inserter consumer operator to load CLOB data from a tab-delimited file. The maximum length of the text I am loading into the CLOB column is approximately 90,000 characters.
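This error typically means a field in the delimited input record is longer than the length declared for it in the TPT schema. A sketch of the relevant schema definition, with made-up names, sized so the VARCHAR covers the ~90,000-character values (delimited input requires an all-VARCHAR schema):

```sql
DEFINE SCHEMA clob_load_schema
(
    row_id   VARCHAR(20),
    big_text VARCHAR(100000)   /* must be >= the longest field; ~90K here */
);
```

If any field can exceed its declared length, the DataConnector producer raises exactly this "Input record exceeds allocated storage" parsing error.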

Hi All,
I am loading varchar(max) / nvarchar(max) data from SQL Server to Teradata using TPT and DataDirect drivers. The trimmed length of the varchar(max) column is 5,356,651 characters. I am unable to load this data because the DataDirect driver reports varchar(-1) instead of varchar(max).
Any idea how I can load this data into Teradata?
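Before choosing a workaround, it can help to confirm the real maximum data length on the SQL Server side; the table and column names below are placeholders:

```sql
-- maximum byte length actually stored in the varchar(max) column
SELECT MAX(DATALENGTH(big_col)) AS max_len
FROM dbo.src_table;
```

A value of 5,356,651 characters far exceeds Teradata's VARCHAR limit of 64,000 bytes, so the target column must be a CLOB; casting the source column down to a bounded VARCHAR is not an option for data this large.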


Has anyone gotten this to work? Apparently Oracle Transparent Gateway is trying to convert a character column defined as varchar(10000) -- only the first 999 characters are populated, but trim doesn't help -- OTG converts it to LONG. The select / join then fails with an "illegal use of LONG datatype" error.

Hello, I am trying to run multiple TPT jobs in parallel (using the Selector operator to select data from a Teradata table that has BLOB and CLOB columns, and the DataConnector operator to write the data to files in deferred mode), and I am getting the following error: