0 - 9 of 9 tags for #Fastload

Hello everyone. This is my first post. I have a question about deleting and rebuilding tables. I have inherited some code that is structured like this:
1.  create an empty table in the sandbox  (I'll call it table A)
2.  insert all data from an existing table (I'll call it table B) to table A
3.  drop table B
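In Teradata SQL, the three steps above might look like the sketch below (the database and table names `sandbox.table_A` and `prod.table_B` are placeholders for illustration):

```sql
-- Step 1: create an empty table in the sandbox with the same definition as B
CREATE TABLE sandbox.table_A AS prod.table_B WITH NO DATA;

-- Step 2: copy all rows from the existing table B into A
INSERT INTO sandbox.table_A
SELECT * FROM prod.table_B;

-- Step 3: drop the original table
DROP TABLE prod.table_B;
```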

Hi,
Does the TPT SQL Inserter operator support an error limit attribute? Looking at the manual, it seems it doesn't, but the Stream operator does appear to support it. We are currently looking for alternatives to the Load operator for low-volume imports, but our alternative solution must support the error-limit and skip-row features.
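For reference, a Stream operator definition with an error limit might look like the sketch below. This is a minimal sketch assuming the attribute names documented for the TPT Stream operator; the schema, credentials, and table names are placeholders:

```sql
DEFINE OPERATOR STREAM_OP
TYPE STREAM
SCHEMA input_schema                       /* placeholder schema name */
ATTRIBUTES
(
    VARCHAR TdpId        = 'mytdp',       /* placeholder system name */
    VARCHAR UserName     = 'myuser',
    VARCHAR UserPassword = 'mypassword',
    VARCHAR LogTable     = 'sandbox.log_table',
    VARCHAR ErrorTable   = 'sandbox.err_table',
    INTEGER ErrorLimit   = 100            /* stop the job after 100 rejected rows */
);
```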
 
Thanks,

Hi All,
I am working on code to export data from a file to Teradata. For this I am using the FastLoad Java API.
I am facing an issue when I try to use tab as the delimiter.
The URL which I am using to create a FastLoad JDBC connection is as below:

Hi ,
 
I am getting the below error while loading a text file into Teradata via FastLoad:
 
Error on Piom  GET ROW :17 , text Buffer overflow !ERROR!
NOT ENOUGH room in buffer for another record offset = 68434
buffersize=65536 bufferPosition=0

Hi, I am a new learner of Teradata. When I try the demo script provided in the "Teradata Fastload Reference" (2911.pdf), p. 28, I get the following error:
                                                                         
**** 10:14:13 Number of recs/msg: 698                                      

Dear All,
I am trying to FastLoad a flat file; the file contains 44 columns (44 'Û' delimiters).
Below are sample header and data rows (the header is removed from the actual file being loaded):

HEADER:
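A non-default delimiter like this can be declared in the FastLoad script with SET RECORD VARTEXT. A minimal sketch, assuming placeholder field names and a hypothetical file path:

```sql
SET RECORD VARTEXT "Û";           /* declare 'Û' as the field delimiter */
DEFINE fld1 (VARCHAR(50)),        /* placeholder field definitions;     */
       fld2 (VARCHAR(50))         /* the real file has 44 columns       */
FILE = /path/to/flatfile.txt;     /* placeholder path to the flat file  */
```

Note that VARTEXT input requires all fields to be defined as VARCHAR in the script.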

Hello there,
I'm running a FastLoad job, and at the end of the job, before logging off, I would like to check whether FastLoad executed correctly.
Is there a way to check the return code or error code of FastLoad?
Thanks

Hello, I am trying to run multiple TPT jobs in parallel (using the Selector operator to select data from a Teradata table that has BLOB and CLOB columns, and the DataConnector operator to write the data to files in deferred mode), and I am getting the following error:

While using utilities like MultiLoad, when a job ends abnormally, Teradata still holds locks and we get the error 'Table is being MLoaded'.
In the case of a FastLoad/TPT abort, we get the error 'Table is being Loaded'. Even after dropping the error/log/work tables, we still get the same error. What I do now is simply drop these tables and recreate them.
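For the MultiLoad case, the lock on the target table can usually be cleared without dropping it by using RELEASE MLOAD (the table name below is a placeholder):

```sql
-- Release the MultiLoad lock left by an aborted job
RELEASE MLOAD sandbox.target_table;

-- If the job aborted during the apply phase, the IN APPLY form is needed instead
RELEASE MLOAD sandbox.target_table IN APPLY;
```

For a FastLoad lock there is no equivalent release statement; the usual options are to rerun the job to completion (an empty BEGIN LOADING / END LOADING pair against the same table) or to drop and recreate the target table.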