0 - 50 of 57 tags for mload

I have a very simple working MLOAD script and a new requirement to block certain data from being INSERTED.  I see many examples where the WHERE of an APPLY statement is specified as equality (=), and I am sure I've done that successfully in the past, but my new requirement is to skip rows based on 'not equal to'.
 
 
Layout of file:
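The file layout itself is not shown above, so as a rough sketch (the table, label and field names below are made up), the APPLY condition accepts <> in the same way it accepts =:

.DML LABEL ins_rows;
INSERT INTO tgt_table ( col_a, col_b )
VALUES ( :in_col_a, :in_col_b );

.IMPORT INFILE datafile
        FORMAT VARTEXT '|'
        LAYOUT file_layout
        APPLY ins_rows WHERE in_status <> 'D';   /* records with in_status = 'D' are skipped, everything else is inserted */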

Do I need to define a column in the file layout section of the MLOAD script if I plan to load it as a hard-coded value coming from a parameter file? In the attached code, report_suit_id should be passed from the parameter file as &val4, but I am getting a syntax error: "Syntax error: expected something between '(' and the end of the request.".
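One hedged sketch of this (the .ACCEPT command, file name and object names below are illustrative, not taken from the attached code): the hard-coded column does not need its own .FIELD in the layout, and the substituted value can go straight into the INSERT, quoted if it is a character literal. An unquoted or empty substitution of &val4 is a common cause of exactly this syntax error.

.ACCEPT val4 FROM FILE 'param.txt';          /* param.txt supplies the value for &val4 */

.DML LABEL ins_rows;
INSERT INTO tgt_table ( report_suit_id, col_a )
VALUES ( '&val4', :in_col_a );               /* literal from the parameter file; no .FIELD needed */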

Greetings Experts,
Can you please confirm that my assumptions regarding MLoad are correct. Tablea has around 1000 rows before the MLoad; the file to be MLoaded has 500 rows to be upserted, with a checkpoint of 100 rows.
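For reference, a minimal upsert skeleton with a checkpoint (all names below are hypothetical; if I remember the rule correctly, CHECKPOINT values of 60 or more are treated as a record count, smaller values as minutes):

.BEGIN IMPORT MLOAD TABLES tablea CHECKPOINT 100;

.LAYOUT file_lay;
.FIELD in_key * VARCHAR(10);
.FIELD in_val * VARCHAR(20);

.DML LABEL upsert_rows DO INSERT FOR MISSING UPDATE ROWS;
UPDATE tablea SET val_col = :in_val WHERE key_col = :in_key;
INSERT INTO tablea ( key_col, val_col ) VALUES ( :in_key, :in_val );

.IMPORT INFILE datafile
        FORMAT VARTEXT '|'
        LAYOUT file_lay
        APPLY upsert_rows;
.END MLOAD;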

I am using MLoad to load a large data set. However, I can't get any of the data to load because of error 6760. I know it relates to the timestamp.
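Error 6760 is an invalid-timestamp conversion. A sketch of the usual workaround (the field, table and FORMAT string below are placeholders; the FORMAT must match whatever is actually in the file) is to read the value as text and convert it explicitly in the INSERT:

.FIELD in_ts * VARCHAR(26);                          /* timestamp read as plain text */

INSERT INTO tgt_table ( ts_col )
VALUES ( :in_ts (TIMESTAMP(6), FORMAT 'YYYY-MM-DDBHH:MI:SS.S(6)') );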
 

Hello Everyone,
I have a question on Mload & FastLoad.
Let's assume that I have an empty table and I am trying to load a file into this table using MLoad or FastLoad. Now, based on the knowledge below, I want to determine which of these utilities will perform better in the application phase.

Hi,
I am using Teradata 13 on Windows... I am using the MLOAD client on Windows to load a data file which is in UTF16-LE format. All the records are going into the error table. I tried running with the -c (character set) option set to UTF16, and also left blank (ASCII), but it doesn't help...

I have no idea why MLoad failed in the initial phase. Could anyone help?
 
2015-08-19 05:03:03.675000 :Starting job  jira_issue_test

2015-08-19 05:03:03.675000 :Loading conf file  C:\workspace\DIS\DISMetrics\conf\jira\sql1.mld using  mload

     ========================================================================

 

Hi ,
 
I am deleting 80 million records from a table. TPT is taking 12 hours to complete, whereas BTEQ deletes the data in just 2 hours. TPT should actually be faster than BTEQ. Please suggest how I can avoid this long run through TPT.

Hi,
 
I am getting the message below in the log. I cannot see that a successful connect to the database happened at all. Usually "UTY6211 A successful connect was made to the RDBMS." would be there; it is missing here, along with the steps thereafter, and the job directly returns an error code. Please help with this.
 

Hello All,
I am running an MLoad job. In the source file, the fields have different data types such as DECIMAL, CHAR and VARCHAR. But when I run the job, a large volume of records goes to the error tables with error code 2797 (MLoad MARK MISSING UPDATE). That means it is not finding target records matching the given WHERE conditions.
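If the non-matching rows are expected, the DML options control what happens to them. Two alternative sketches (the table, label and column names are made up): either convert the update into an upsert, or silently ignore the misses.

/* Alternative 1: insert the row when the update finds no match (upsert) */
.DML LABEL upsert_rows DO INSERT FOR MISSING UPDATE ROWS;
UPDATE tgt_table SET col_b = :in_col_b WHERE key_col = :in_key;
INSERT INTO tgt_table ( key_col, col_b ) VALUES ( :in_key, :in_col_b );

/* Alternative 2: keep it an update, but do not log misses to the ET table */
.DML LABEL upd_rows IGNORE MISSING UPDATE ROWS;
UPDATE tgt_table SET col_b = :in_col_b WHERE key_col = :in_key;

If the rows should be matching, the usual culprit is a data-type or format mismatch between the layout fields and the WHERE columns, which the options above would only hide.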

Hello Everyone.

Friends, I have the following problem: I have a final table where some values are rounded incorrectly. There is no transformation of these values. The tables have the same data type, DECIMAL(9,2), and FastExport converts to the same data type. An example I got was a value of 46.23.

I need to load from Table 1 to Table 2. Which utility works well here, and why?
Please explain the limitations and advantages of each utility.

I have a case where I need to use multiple delimiters in an MLOAD script. I have data in the format below:

Hi All,
Is it possible to capture a FLOAD lock and an MLOAD lock using an error handler in a Teradata stored procedure? If yes, please let me know the steps or share a sample. If not, please suggest a better way to capture FLOAD and MLOAD locks.
My requirement :-

Hello,
In my MLoad script I have the following:
.IMPORT INFILE xxxx
        FROM 4
        FORMAT  VARTEXT '^\' NOSTOP
        LAYOUT &load_table.
        APPLY Inserts;
MultiLoad skips the errors and inserts the records that respect the conditions.
 

I have an Informatica mapping trying to load data from a flat file to a table. The target uses MLOAD to load the table in one of the databases on the Teradata development system.

Hi All,
I am new to Teradata and have done some research on FLOAD, MLOAD and manually creating a job to load data from one table to another.
I am able to use FLOAD and MLOAD for loading data from one table to another.
The thing is

I have a scenario at hand:-
Source: 9 Binary Flat Files (From Mainframe Source Systems)
Target: 1 Teradata Table
ETL Operations: Insert / Update / Delete using Informatica Workflows – Teradata MLOAD INSERT / UPDATE Connection String & Teradata MLOAD DELETE Connection String

While executing FastExport, we have an option to create an MLOAD script by providing the MLSCRIPT keyword. Do we have a similar option for FastLoad? Can someone let me know if there is one, or why we don't have one for FastLoad?
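For reference, the FastExport option being described looks roughly like this (file names are placeholders):

.EXPORT OUTFILE export.dat
        MLSCRIPT mload_gen.txt;   /* FastExport writes a matching MultiLoad script to this file */

As far as I know, FastExport has no equivalent keyword that generates a FastLoad script.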

Hello All,
This is the first time I am running an MLOAD script. Please help me find the error.
Script:
.LOGTABLE DB.logs2;
.LOGON Jugal/jbhatt,jugal;
CREATE MULTISET TABLE DB.Mload_Input ,NO FALLBACK ,
     NO BEFORE JOURNAL,
     NO AFTER JOURNAL,
     CHECKSUM = DEFAULT,

Hi,
I need to delete, in an MLoad script, the rows from the table that do not have a match in the file.
I tried a couple of ways but couldn't succeed.

DELETE FROM Employee WHERE EmpNo <> :EmpNo and EmpName <> :Empname;

UTY0805 RDBMS failure, 3537: A MultiLoad DELETE Statement is Invalid.
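The MultiLoad DELETE task only accepts a very restricted DELETE statement driven by at most one input data record, which is presumably why 3537 comes back here. A common workaround, sketched below with a hypothetical staging table stg_employee that is loaded from the file first, is to issue the delete as ordinary SQL (for example in BTEQ):

DELETE FROM Employee
WHERE  EmpNo NOT IN ( SELECT EmpNo FROM stg_employee );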

Greetings Experts,
I am using Windows 7 and the TD 13.0 Express demo version on my laptop. I have created the sample MLoad input file at C:\mload_in.txt, which is shown below.

Hi All,
I have a table which is being loaded through the Informatica ETL tool. One column in the table is defined as PERIOD(TIMESTAMP(6) WITH TIME ZONE). I found no such data type in the ETL tool. As the field is a straight move from the source (flat file), I have made the data type CHAR(72) in the ETL tool.

Greetings Experts,

 

Hi Experts,
 

I'm using this MLOAD script:
(...)

CREATE MULTISET TABLE "DB"."TABLE" (FIELD1 TIMESTAMP(0) NOT NULL) ;

.BEGIN IMPORT MLOAD TABLES "DB"."TABLE" ;
.LAYOUT Layout1 INDICATORS ;
.FIELD  FIELD1 * CHAR(19) ;

.DML LABEL LabelA ;
INSERT INTO "DB"."TABLE" (FIELD1)
    VALUES ( :FIELD1 ) ;

Hi Experts,
The excerpt from the script

Greetings Experts,

Does the ERRORLIMIT option consider the records in the ET table only during the acquisition phase, or also during the application phase of an MLoad job?
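For reference, ERRORLIMIT is specified on .BEGIN MLOAD (names below are made up); my understanding is that the limit is checked against acquisition-phase errors logged in the ET table rather than application-phase errors in the UV table, but that is worth confirming in the MultiLoad manual.

.BEGIN IMPORT MLOAD TABLES tgt_table
       ERRORLIMIT 1000 ;        /* abort the job once this many errors have been logged */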

Greetings Experts,
The DELETE task in MLoad doesn't have an acquisition phase. Now, I specify an input file with only one record (as supported by the delete task) as follows.
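For context, a file-driven delete task looks roughly like this (all names are hypothetical); the single input record just supplies the value for the WHERE clause:

.BEGIN DELETE MLOAD TABLES tgt_table;

.LAYOUT del_lay;
.FIELD in_region * VARCHAR(4);

.DML LABEL del_rows;
DELETE FROM tgt_table WHERE region_cd = :in_region;

.IMPORT INFILE delete_value.txt      /* the one-record input file */
        FORMAT VARTEXT '|'
        LAYOUT del_lay
        APPLY del_rows;
.END MLOAD;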

Greetings Experts,
"When there are no expressions involved and at most one conversion required before
loading the data into the target table, the data conversion takes place in the acquisition

Hi Experts,
Say the data in the source file consists of 100 records with empno ranging from 100 to 200. If I hard-code the update operation with a WHERE clause of empno = 100 in the DML label (which should update only one row in the target), will the work table be loaded with one row, or with the rows for empno 100-200?
 

Hi,
I am working on a project intended to migrate our SQL Server databases to Teradata. We are using TD 13.10.
1. I have tried OleLoad, but with this we can only export one table at a time.
2. I have used EFS Tools, but the data conversion is wrong.

Hi,
I have a requirement to export data from one table using FastExport and, in parallel, load it into another table using MLOAD.
I can write separate FastExport and MLoad jobs, but the table is huge and I don't want to waste UNIX space, as FastExport creates a big file.
Can we load it like a queue?
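One way this is often done without landing a large file is to connect the two utilities through a UNIX named pipe (created beforehand with mkfifo, e.g. /tmp/exp_pipe) or the Named Pipes Access Module. A rough sketch with made-up object names, with both jobs running at the same time:

/* FastExport script: write to the pipe instead of a flat file */
.BEGIN EXPORT SESSIONS 4;
.EXPORT OUTFILE /tmp/exp_pipe
        MODE RECORD FORMAT FASTLOAD;
SELECT col_a, col_b FROM src_table;
.END EXPORT;

/* MultiLoad script, started concurrently: read from the same pipe */
.IMPORT INFILE /tmp/exp_pipe
        FORMAT FASTLOAD
        LAYOUT pipe_lay
        APPLY ins_rows;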

Greetings Experts,
We are using some jobs that use an MLoad connection in Informatica. If the number of rows in the ET table is >= the specified ERRORLIMIT, will the session be forced to fail?

hello guys,
I have 80 to 90 tables, and each has one file to be loaded (file1 for T1, file2 for T2, etc.).
Can I create one single script (FLOAD, BTEQ or MLOAD) for loading all these tables?
My input parameters would be the file name and the target table -->  run script <file1> <targettable1>

Every day I receive a flat file which I MLoad into production. I have to check the header in the file and make sure it matches the old one.
File Delimited '|'

Eg : S_NO | LNAME | FNAME  (Given header)

Eg : S_NO | LNAME | FNAME (File header)

Hello,
Is there any way to measure the total AMPCPUTime consumed by an MLOAD/FLOAD/FEXP job after completion? The values from DBQL (sum of AMPCPUTime for a particular LSN) do not seem right.
I also tried DBC.Acctg; there too, the values seem low.

We are trying to get some metrics on run times for MLoads. Does anyone have a script which can calculate MLoad duration?

Hi
I need to load a file using MLoad where the load column has data like this: 03/29/2013 12:00:00 PM.
select cast(cast('03/29/2013 12:00:00 PM' as timestamp(0) format 'MM/DD/YYYYBHH:MI:SSBT') as varchar(22))
I need it to load as TIMESTAMP rather than VARCHAR.
 
Thanks
Balu
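Applying the same FORMAT inside the MultiLoad INSERT should do it; a minimal sketch with made-up table and column names, reading the value as text first:

.FIELD in_load_dt * VARCHAR(22);     /* 03/29/2013 12:00:00 PM read as text */

INSERT INTO tgt_table ( load_ts )
VALUES ( :in_load_dt (TIMESTAMP(0), FORMAT 'MM/DD/YYYYBHH:MI:SSBT') );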
 

Hi,
If we have three .IMPORT statements for three different tables in a single MLOAD job, will they run sequentially or in parallel?
 
Thank you

Does MLoad support loading duplicate rows? If yes, how does it handle loading the duplicate rows after a failure (while restarting the same MLoad job)?

 

Hello,

 

We notice in Viewpoint that our MLoad jobs are assigned session IDs for each thread, but only one of the sessions has a workload associated with it. Does that mean the entire MLoad (multi-threaded) is only using one AWT?

Thanks for the insight.

 

Hi everyone! I am currently studying Teradata and working on this problem of mine. I have a set of data in a flat file that needs to be loaded into a table based on a condition.

This is what my flat file looks like:

##Flatfile
SOR|03/17/2012|2 -- This is a header
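A sketch of one way to handle this (the layout, label and field names are invented): either skip the header record with FROM, or filter on a layout field in the APPLY condition.

.IMPORT INFILE flatfile.txt
        FROM 2                        /* start at record 2, i.e. skip the header row */
        FORMAT VARTEXT '|'
        LAYOUT detail_lay
        APPLY ins_rows WHERE rec_type <> 'SOR';   /* or filter on a field instead of, or as well as, FROM */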

Hi Masters,

I have a file which I am loading into two tables, one with MLOAD and the other with FASTLOAD. Let's say the first column (a non-PI column) comes in as CHAR(20) and the second column (also non-PI) as a 12-digit number from the file. In both tables the first column is defined as CHAR(15) and the second as INTEGER. Below are my questions.