abhi_pn 14 posts Joined 02/14
20 Nov 2014
Connecting Multiple Teradata Servers

Hello All..
Greetings of the day..
I have a question about connecting to multiple TD servers.
Until now I have done all my work on a single server, but now I need to work with two. Below is the scenario; I would appreciate a little guidance here.
Server-1 IP: A.A.A.50   Server-2 IP: A.A.A.100
I need to select data from a table on Server-1 and load all of the records into another table on Server-2 (let's say using the BTEQ utility); simply dumping all the records onto the new server.
Is there any way to log on to multiple servers in a single BTEQ job and do the operations, or is there any other option to simplify the scenario?
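For reference, BTEQ holds only one logon session at a time, but a single script can log off one system and log on to the other, so one export-then-import job is possible. A rough, untested sketch (all IPs, credentials, database/table names and the column list are placeholders):

```
.LOGON A.A.A.50/src_user,src_pwd;
.EXPORT DATA FILE = /tmp/src_table.dat;
SELECT * FROM src_db.src_table;
.EXPORT RESET;
.LOGOFF;

.LOGON A.A.A.100/tgt_user,tgt_pwd;
.IMPORT DATA FILE = /tmp/src_table.dat;
.REPEAT *
USING (c1 INTEGER, c2 VARCHAR(100))
INSERT INTO tgt_db.tgt_table (col1, col2) VALUES (:c1, :c2);
.LOGOFF;
.QUIT;
```

Note that BTEQ imports are row-at-a-time, so this only makes sense for small tables; the bulk utilities discussed below are better for volume.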

sgarlapa 88 posts Joined 03/13
21 Nov 2014

I think you can use TPT. TPT can connect to both servers simultaneously and stream data from source to target in parallel. Otherwise you would need to use an ETL GUI tool such as Informatica or DataStage, etc.
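As an illustration, a TPT job can pair an Export operator on the source system with a Load operator on the target, so no intermediate file is needed. A hedged sketch (schema, credentials and object names are placeholders; run with `tbuild -f script.tpt`):

```
DEFINE JOB COPY_TABLE
DESCRIPTION 'Copy a table from Server-1 to Server-2'
(
  DEFINE SCHEMA SRC_SCHEMA
  (
    col1 INTEGER,
    col2 VARCHAR(100)
  );

  DEFINE OPERATOR EXPORT_OP
  TYPE EXPORT
  SCHEMA SRC_SCHEMA
  ATTRIBUTES
  (
    VARCHAR TdpId        = 'A.A.A.50',
    VARCHAR UserName     = 'src_user',
    VARCHAR UserPassword = 'src_pwd',
    VARCHAR SelectStmt   = 'SELECT col1, col2 FROM src_db.src_table;'
  );

  DEFINE OPERATOR LOAD_OP
  TYPE LOAD
  SCHEMA *
  ATTRIBUTES
  (
    VARCHAR TdpId        = 'A.A.A.100',
    VARCHAR UserName     = 'tgt_user',
    VARCHAR UserPassword = 'tgt_pwd',
    VARCHAR TargetTable  = 'tgt_db.tgt_table',
    VARCHAR LogTable     = 'tgt_db.tgt_table_log'
  );

  APPLY ('INSERT INTO tgt_db.tgt_table (col1, col2) VALUES (:col1, :col2);')
  TO OPERATOR (LOAD_OP)
  SELECT * FROM OPERATOR (EXPORT_OP);
);
```

The Load operator requires an empty target table; a Stream or Update operator could be substituted if the target already holds data.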

Raja_KT 1246 posts Joined 07/09
21 Nov 2014

Think of Unity loader and Unity mover.

Raja K Thaw
My wiki:
Street Children suffer not by their fault. We can help them if we want.

andydoorey 35 posts Joined 05/09
21 Nov 2014

I've been doing something similar myself recently. I tried using TPT to copy tables in a generic manner. However, this didn't work with geospatial data, would be difficult with XML or other LOB data, and I'm not sure whether it works with PERIOD data.
I ended up transferring the data using arcmain through a Unix pipe. This should work for any data type (though the Teradata systems will probably need to be on the same or similar versions).
I have written Unix scripts that generate SQL to copy the data from the tables I want to move into tables on a 'transfer' database on the source, arcmain those tables from the transfer database on the source to an 'apply' database on the target, and then refresh/append/upsert the records into the target table.
I think this is similar to the functionality that Unity provides, but this way we don't have to pay for Unity! It can also run as part of our cron-scheduled Unix scripts.
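The arcmain-through-a-pipe step above can be sketched roughly like this (untested; hosts, credentials and database/table names are placeholders, and the exact ARCHIVE/COPY option syntax should be checked against your ARC version):

```sh
#!/bin/sh
# Sketch: stream an archive from the source system straight into a
# COPY on the target system via a named pipe, so no dump file
# ever lands on disk.
PIPE=/tmp/arc_pipe
mkfifo "$PIPE"

# Archive from the source system into the pipe (runs in background).
arcmain <<EOF &
LOGON A.A.A.50/src_user,src_pwd;
ARCHIVE DATA TABLES (transfer_db.my_table), RELEASE LOCK, FILE=$PIPE;
LOGOFF;
EOF

# Copy on the target system, reading from the same pipe.
arcmain <<EOF
LOGON A.A.A.100/tgt_user,tgt_pwd;
COPY DATA TABLES (apply_db.my_table) (FROM (my_table)), RELEASE LOCK, FILE=$PIPE;
LOGOFF;
EOF

wait
rm -f "$PIPE"
```

Because the pipe blocks until both ends are open, the two arcmain invocations effectively run in lock-step, which is what makes the single-pass transfer work.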

ToddAWalter 316 posts Joined 10/11
21 Nov 2014

If the tables are moderate size, the latest version of TDStudio has a drag and drop capability. Not appropriate for large table movement but very easy to use for small/medium on demand movement.
If you need to move big tables, groups of tables and/or schedule the process, then Data Mover is the Teradata utility best suited for the function. As noted above, with some scripting effort, other tools such as TPT and ARC can be used to perform the function as well.
And of course, if your environment uses one of the major ETL tools, they also have the capability of performing the movement between systems. Make sure to configure them to use the bulk utilities on both sides: export and load.

Langaliya.Nirav 15 posts Joined 04/11
24 Nov 2014

It is very simple: you just need to write a TPT script that fetches the data from a table on Server-A and then loads it onto the other server.
Or you can write a FastExport script that exports the data in FASTLOAD format (with indicators) from one server, move the file to the other server via SSH/SCP, and load it into the other server via FastLoad.
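The FastExport/FastLoad route above might look roughly like the following pair of scripts (untested sketches; log tables, credentials, object names and the column definitions are placeholders that must match your table):

```
/* export.fexp : run with  fexp < export.fexp  on a host that reaches Server-1 */
.LOGTABLE src_db.fexp_log;
.LOGON A.A.A.50/src_user,src_pwd;
.BEGIN EXPORT;
.EXPORT OUTFILE my_table.dat MODE RECORD FORMAT FASTLOAD;
SELECT col1, col2 FROM src_db.my_table;
.END EXPORT;
.LOGOFF;

/* After  scp my_table.dat target_host:  ... */

/* load.fld : run with  fastload < load.fld  against Server-2 */
SESSIONS 4;
LOGON A.A.A.100/tgt_user,tgt_pwd;
SET RECORD FORMATTED;
DEFINE col1 (INTEGER), col2 (VARCHAR(100))
FILE = my_table.dat;
BEGIN LOADING tgt_db.my_table ERRORFILES tgt_db.my_table_err1, tgt_db.my_table_err2;
INSERT INTO tgt_db.my_table (col1, col2) VALUES (:col1, :col2);
END LOADING;
LOGOFF;
```

Keep in mind FastLoad needs an empty target table and silently discards duplicate rows on a SET table.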

--Nirav Langaliya

ulrich 816 posts Joined 09/09
24 Nov 2014

If you have 15.0 installed, also check the Easy Loader from TPT.
The Teradata PT Easy Loader command, tdload, loads data into a Teradata Database table from a comma-delimited flat file or another Teradata Database table, without requiring a Teradata PT job script to be written.
So a single and "easy" job can move the data for one table.
arcmain seems to be the fastest way to move the data; see also "Update on copy / move data between systems", where a simple wrapper was provided for demo purposes.
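For a table-to-table move, the tdload invocation could look something like this (a sketch; the IPs, credentials, table names and job name are placeholders, and the option names should be verified against your TTU 15.0 documentation):

```sh
tdload --SourceTdpId A.A.A.50 \
       --SourceUserName src_user --SourceUserPassword src_pwd \
       --SourceTable src_db.my_table \
       --TargetTdpId A.A.A.100 \
       --TargetUserName tgt_user --TargetUserPassword tgt_pwd \
       --TargetTable tgt_db.my_table \
       my_copy_job
```

tdload generates and runs the TPT job for you, so this one command replaces the script in the TPT suggestion above.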

feel free to donate bitcoin:12kgAUHFUqvG2sQgaRBXFhCwyf9HXdkGud

abhi_pn 14 posts Joined 02/14
04 Dec 2014

Thanks all for the suggestions. We are currently using TD 14, but the TPT utility is not configured for development purposes. However, OleLoad is available, and I tried it; everything seems to work fine. Is there anything important to be cautious about with this utility (which I might have missed or which went unnoticed), like duplicate rows, data loss, index problems, etc.?

mail2aks2 5 posts Joined 01/14
25 Dec 2014

Hi abhijit.p,
If you want to copy an all-AMP/cluster/specific-AMP table or database from one machine to another, you can give NPARC a try without creating a dump file. There is no need to drop indexes, journals, etc. However, you need to create two scripts, one for archive (source machine/DBs) and another for copy (target machine/DBs), and run them in parallel.
If you are only projecting a few columns of a table and storing them on another machine, then the above is not a straightforward way of doing it.

Ashok Sagar
