Just remove END LOADING from your script and the FastLoad job will be paused. Then re-run the script with the second file name, this time with END LOADING. I think this will do it. Please post your result here so that it will be helpful for others.
The only way to accomplish it is to have multiple scripts, where each script omits the END LOADING statement, except for the last script.
So, an example could be:
Script 1
=====
.logon tdpid/username,password
.define
F1 . . .
F2 . . .
. . .
FILE=file1.dat;
BEGIN LOADING table_a ERRORFILES table_a_et, table_a_uv;
INSERT INTO table_a VALUES ( . . . . );
.QUIT
Script 2
=====
.logon tdpid/username,password
.define
F1 . . .
F2 . . .
. . .
FILE=file2.dat;
BEGIN LOADING table_a ERRORFILES table_a_et, table_a_uv;
INSERT INTO table_a VALUES ( . . . . );
.QUIT
Script 3
=====
.logon tdpid/username,password
.define
F1 . . .
F2 . . .
. . .
FILE=file3.dat;
BEGIN LOADING table_a ERRORFILES table_a_et, table_a_uv;
INSERT INTO table_a VALUES ( . . . . );
END LOADING;
.QUIT
--SteveF
Done. Thanks.
Hi bmrout007,
Can you please explain the process you used? When I omit END LOADING in the first script and add it in the second script, I get the error below. Thanks in advance.
RDBMS error 2635: Error tables are invalid OR
ngu414_fload_test is in an invalid state for Load
With TPT you can have multiple readers applying to the same writer; try that out.
jana.teradata.
Please send me your scripts so that I can take a look at what you are trying to do.
--SteveF
Hi, feinholz has already written the structure; follow that, and ask in case of any questions.
From the error message it seems you have dropped one of the error tables.
Please drop all the error tables (i.e., both ET and UV),
then drop the target table and recreate it,
then run your scripts.
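As a rough sketch, the cleanup could be run from BTEQ or SQL Assistant as below; the database, table, and column names follow the sample scripts further down and are only assumptions (the actual DDL of the target table is not shown in this thread), so replace them with your own:
/* drop both FastLoad error tables */
DROP TABLE dbname.ET_floademp;
DROP TABLE dbname.UV_floademp;
/* drop and recreate the (empty) target table, then rerun the load scripts */
DROP TABLE dbname.emp_fload;
CREATE TABLE dbname.emp_fload
( empno  VARCHAR(10)
, ename  VARCHAR(30)
, deptno VARCHAR(10)
, salary VARCHAR(10)
, doj    VARCHAR(10)
) PRIMARY INDEX (empno);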
sample loading scripts
==============
script1
------
TENACITY 5;
SLEEP 5;
ERRLIMIT 50;
LOGON tdenv/tduser,tdpwd;
SHOW VERSIONS;
SET RECORD VARTEXT ",";
DEFINE
v_empno (VARCHAR(10))
,v_ename (VARCHAR(30))
,v_deptno (VARCHAR(10))
,v_salary (VARCHAR(10))
,v_doj (VARCHAR(10))
FILE=/tdice/file1.txt ;
BEGIN LOADING dbname.emp_fload
ERRORFILES dbname.ET_floademp,dbname.UV_floademp
CHECKPOINT 1000;
INSERT INTO dbname.emp_fload VALUES(:v_empno,:v_ename,:v_deptno,:v_salary,:v_doj);
LOGOFF;
script2
------
LOGON tdenv/tduser,tdpwd;
SHOW VERSIONS;
SET RECORD VARTEXT ",";
DEFINE
v_empno (VARCHAR(10))
,v_ename (VARCHAR(30))
,v_deptno (VARCHAR(10))
,v_salary (VARCHAR(10))
,v_doj (VARCHAR(10))
FILE=/tdice/file2.txt ;
BEGIN LOADING dbname.emp_fload
ERRORFILES dbname.ET_floademp,dbname.UV_floademp
CHECKPOINT 1000;
INSERT INTO dbname.emp_fload VALUES(:v_empno,:v_ename,:v_deptno,:v_salary,:v_doj);
END LOADING;
LOGOFF;
fastload < script1
Once it completes, then:
fastload < script2
You can do this easily with pipes and a single FL script (assuming you're using Linux, but I believe you can also do this in Windows). I'm cat'ing data over SSH from one Linux machine to a fastload script on the TD nodes.
From the remote box where the data exists:
cat myzipfile[0-9].txt | gzip -2 | ssh tdnode "/scriptdir/fastload.sh" &
([0-9] tells it to cat the files named myzipfile0 through myzipfile9.)
At the top of the FL script:
mkfifo myfifo
Then the file part of the script:
FILE=myfifo;
After the fastload script:
gzip -d > myfifo
This allows you to load multiple files into one table via one script, and it also zips the data before it's sent over the network and unzips it on the node. We use this method to load a lot of data very quickly.
Alternatively, you could try using a single script on the machine running fastload (using stdin or a pipe as the source) and cat the data to the script from multiple files, i.e.:
cat mydata[0-9].txt | your_fastload_script
FL script:
file=stdin;
Hope this helps.
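To make the first (fifo) approach concrete, here is a minimal sketch of what the /scriptdir/fastload.sh wrapper might look like; the FastLoad script name, log file, and cleanup steps are assumptions for illustration, not the actual script described above:
#!/bin/sh
# receives the gzip'ed stream on stdin (piped in over ssh)
cd /scriptdir
# named pipe that the FastLoad script reads via FILE=myfifo;
rm -f myfifo
mkfifo myfifo
# start FastLoad in the background; it waits for data to arrive on the fifo
fastload < load_from_fifo.fl > fastload.log 2>&1 &
# decompress stdin into the fifo, feeding FastLoad
gzip -d > myfifo
# wait for FastLoad to finish, then clean up
wait
rm -f myfifo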
This is my script; it does not work as you described above.
SHOW VERSION;
.LOGON 127.0.0.1/izhar,12345;
DATABASE csit;
CREATE TABLE KHR_Reg_BS(serial INTEGER GENERATED ALWAYS AS IDENTITY
(START WITH 1
INCREMENT BY 1
),studentID VARCHAR(15) NOT NULL,course VARCHAR(10),marks VARCHAR(10),semester VARCHAR(20),discipline VARCHAR(10) );
.SET RECORD VARTEXT "," ;
DEFINE studentID(VARCHAR(15)),course(VARCHAR(10)),marks(VARCHAR(10)),semester(VARCHAR(20)),discipline(VARCHAR(10))
FILE=D:\project\karachi\Reg_BS_KHR1.csv;
SHOW;
.BEGIN LOADING KHR_Reg_BS ERRORFILES error1, error2;
INSERT INTO KHR_Reg_BS(studentID,course,marks,semester,discipline) VALUES(:studentID,:course,:marks,:semester,:discipline);
FILE=D:\project\karachi\Reg_BS_KHR2.csv;
SHOW;
.BEGIN LOADING KHR_Reg_BS ERRORFILES error1, error2;
INSERT INTO KHR_Reg_BS(studentID,course,marks,semester,discipline) VALUES(:studentID,:course,:marks,:semester,:discipline);
FILE=D:\project\karachi\Reg_BS_KHR3.csv;
SHOW;
.BEGIN LOADING KHR_Reg_BS ERRORFILES error1, error2;
INSERT INTO KHR_Reg_BS(studentID,course,marks,semester,discipline) VALUES(:studentID,:course,:marks,:semester,:discipline);
.END LOADING;
.LOGOFF;
Please review the FastLoad Reference manual for the correct use of the script language.
(There can only be one DEFINE statement, and that DEFINE statement can only have 1 FILE definition. Each FastLoad script can only load data from one file.)
You should be using TPT. TPT can easily load data from multiple files.
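As a rough illustration only (not an official sample), a TPT job that reads several delimited files into one table could use the DataConnector producer with a wildcard FileName; the schema, directory, file pattern, and table names below are assumptions based on the earlier FastLoad sample and would need to be adapted:
DEFINE JOB load_emp_from_multiple_files
DESCRIPTION 'Load one table from several flat files'
(
  DEFINE SCHEMA emp_schema
  (
    v_empno  VARCHAR(10),
    v_ename  VARCHAR(30),
    v_deptno VARCHAR(10),
    v_salary VARCHAR(10),
    v_doj    VARCHAR(10)
  );
  DEFINE OPERATOR file_reader
  TYPE DATACONNECTOR PRODUCER
  SCHEMA emp_schema
  ATTRIBUTES
  (
    VARCHAR DirectoryPath = '/tdice/',
    VARCHAR FileName      = 'file*.txt',  /* wildcard picks up file1.txt, file2.txt, ... */
    VARCHAR Format        = 'Delimited',
    VARCHAR TextDelimiter = ',',
    VARCHAR OpenMode      = 'Read'
  );
  DEFINE OPERATOR load_writer
  TYPE LOAD
  SCHEMA *
  ATTRIBUTES
  (
    VARCHAR TdpId        = 'tdenv',
    VARCHAR UserName     = 'tduser',
    VARCHAR UserPassword = 'tdpwd',
    VARCHAR LogTable     = 'dbname.emp_fload_log',
    VARCHAR TargetTable  = 'dbname.emp_fload',
    VARCHAR ErrorTable1  = 'dbname.ET_floademp',
    VARCHAR ErrorTable2  = 'dbname.UV_floademp'
  );
  APPLY
  ('INSERT INTO dbname.emp_fload VALUES (:v_empno, :v_ename, :v_deptno, :v_salary, :v_doj);')
  TO OPERATOR (load_writer[1])
  SELECT * FROM OPERATOR (file_reader[1]);
);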
--SteveF
Thanks @feinholz sir.
Can you provide the TPT script?
Please refer to the TPT documentation (same location as the FastLoad documentation).
Also, when you install TPT, we provide sample scripts for a variety of scenarios.
Those scenarios are also described in the User Guide.
--SteveF
There are multiple ways to create a TPT script:
i) Manually write the scripts by following the documentation or the sample scripts.
ii) Use the Teradata Parallel Transporter Wizard to create scripts dynamically; it supports a wide variety of scenarios based on
a) the source system (file, tables, etc.),
b) the Load, Update, or Stream operator, and
c) whether to pass values such as UserID and Password as parameters or to embed them directly in the script.
iii) Use an ETL tool such as Informatica that internally creates an equivalent TPT script based on the developed mappings and the connections being used.
A sample TPT script appears below; it extracts data from a SQL Server table (SrcDB.Table1) and uses the Load operator to load the data into a Teradata table (TgtDB.Table1).
USING CHARACTER SET ASCII
DEFINE JOB Table1_NEW
DESCRIPTION 'To load Table1_DATA'
(
DEFINE OPERATOR W_1_o_Table1_NEW
TYPE LOAD
SCHEMA *
ATTRIBUTES
(
VARCHAR UserName,
VARCHAR UserPassword,
VARCHAR LogTable,
VARCHAR TargetTable,
INTEGER BufferSize,
INTEGER ErrorLimit,
INTEGER MaxSessions,
INTEGER MinSessions,
INTEGER TenacityHours,
INTEGER TenacitySleep,
VARCHAR AccountID,
VARCHAR DateForm,
VARCHAR ErrorTable1,
VARCHAR ErrorTable2,
VARCHAR NotifyExit,
VARCHAR NotifyExitIsDLL,
VARCHAR NotifyLevel,
VARCHAR NotifyMethod,
VARCHAR NotifyString,
VARCHAR PauseAcq,
VARCHAR PrivateLogName,
VARCHAR TdpId,
VARCHAR TraceLevel,
VARCHAR WorkingDatabase
);
DEFINE SCHEMA W_0_s_Table1_NEW
(
COMMENT_DATA_ID INTEGER,
STRING_COMMENT VARCHAR(4000)
);
DEFINE OPERATOR W_0_o_Table1_NEW
TYPE ODBC
SCHEMA W_0_s_Table1_NEW
ATTRIBUTES
(
VARCHAR UserName,
VARCHAR UserPassword,
VARCHAR SelectStmt,
VARCHAR PrivateLogName,
VARCHAR DSNName,
VARCHAR ConnectString,
VARCHAR TruncateData
);
APPLY
(
'INSERT INTO TgtDB.Table1_DATA1 (COMMENT_DATA_ID,STRING_COMMENT) VALUES (:COMMENT_DATA_ID,:STRING_COMMENT);'
)
TO OPERATOR
(
W_1_o_Table1_NEW[1]
ATTRIBUTES
(
UserName = 'TDUser',
UserPassword = 'TDPsswrd',
LogTable = 'TgtDB.Table1_DATA1_log',
TargetTable = 'TgtDB.Table1_DATA1',
TdpId = 'TDPROD/LOGMECH=LDAP'
)
)
SELECT * FROM OPERATOR
(
W_0_o_Table1_NEW[1]
ATTRIBUTES
(
UserName = 'SQLSrcUsr',
UserPassword = 'SQLSrcPwd',
SelectStmt = 'SELECT COMMENT_DATA_ID,STRING_COMMENT FROM SrcDB.Table1_DATA;',
DSNName = 'SQL Server'
)
);
);
I need a FastLoad script example to load an empty table from multiple files.
Thanks