08 Jul 2006
DataStage Enterprise Stage Executing Teradata Macros

We are beginning to work with DataStage and want to move a lot of processing from the mainframe to DataStage. We use many parameterized macros in our ETL environment, some of them very complex, with many SQL statements and layers of macros embedded in them. Our expectation was that we could execute the macros exactly as they were developed and run in production on the mainframe. Instead, we are experiencing abends with macros containing more than one SQL statement. It appears the Ascential Teradata Enterprise stage can execute something simple like a stored procedure, but it is not robust. As we see it, the options are to leave macro execution on the mainframe or rewrite all the macro processes for Ascential. What are the experiences of other Teradata shops? Is there some secret to getting macros to work?
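For readers unfamiliar with the pattern being discussed, a parameterized multi-statement macro of the kind described might look like the following sketch (the database, table, and macro names are hypothetical, not from the original poster's environment):

```sql
-- Hypothetical multi-statement ETL macro: three SQL statements
-- that execute as a single implicit transaction.
REPLACE MACRO etl_db.load_orders (load_dt DATE) AS (
  DELETE FROM etl_db.orders
    WHERE load_date = :load_dt;
  INSERT INTO etl_db.orders
    SELECT * FROM etl_db.stg_orders
    WHERE load_date = :load_dt;
  UPDATE etl_db.load_audit
    SET last_load = :load_dt
    WHERE table_name = 'orders';
);
```

It is macros with more than one statement in the body, like this one, that the original poster reports abending when executed from the Enterprise stage.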

18 Feb 2007

There are two things to consider with this issue.

First, DataStage 8 on the Information Server significantly improves the integration between DataStage and Teradata. DataStage 8 was released for Windows last December and will soon be available on Unix/Linux/USS. See "What is new for DataStage 8 on the Information Server" for details. In summary, DataStage 8 will support Teradata V2R5.1, V2R6.0 and V2R6.1 (DB server) / 8.1 (TTU), plus Teradata Parallel Transporter (TPT), stored procedure and macro support, reject links for bulk loads, and restart capability for parallel bulk loads.

The other thing to consider is whether you want to do transformation processing via a macro or via a DataStage job. It's the classic ETL versus ELT debate. Once the data is loaded onto Teradata, the Teradata engine is extremely fast and efficient at ELT (Extract, Load, Transform) processing, letting you aggregate and resave data. However, DataStage offers the better GUI design tool, process execution logging, error reporting and exception handling; it also has the better data lineage reporting. So you need to use the strengths of each tool: do as much transformation as possible during the DataStage preparation and load steps, and leverage Teradata for the simpler post-processing, such as aggregation tables.

trojancjs 17 posts Joined 03/07
12 Mar 2007

Hi Teradata Ace,

Vincent got to the heart of the matter: DataStage EE probably isn't going to be a good choice for executing macros that perform set operations on Teradata (multi-row DELETEs, INSERT ... SELECTs, UPDATE ... FROM statements). I'm at a site that has tried to use DataStage as its one-stop shop for loading Teradata, and it just hasn't worked very well. If you want the most pain-free experience using DataStage as your ETL tool, either keep all the macros outside of DataStage or rewrite the macros as DataStage jobs.

But enough advice; that's probably not what you were looking for. The Teradata Enterprise stage uses the FastLoad protocol for loading data and the FastExport protocol for extracting data (as far as I have seen), so it isn't a good choice for integrating macro execution. The Teradata API stage opens normal DBC/SQL sessions in ANSI mode. It submits whatever statements you define, exactly as you define them, once per incoming row, with a COMMIT sent after each "Transaction Size" number of rows. If your macros were not intended to be executed on a per-row basis, you can design jobs that read a single row from some source stage, so the custom SQL executes exactly once, with the "Transaction Size" parameter set to 1. If they are per-row macros, execute away, but keep in mind you are connected in ANSI mode.
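The single-row pattern described above amounts to putting something like this in the API stage's user-defined SQL (the macro name is hypothetical); an upstream stage supplies exactly one dummy row, so the statement fires once and, with "Transaction Size" set to 1, a COMMIT follows immediately:

```sql
-- User-defined SQL in the Teradata API stage, driven by a
-- single-row source so the macro executes exactly once.
EXEC etl_db.load_orders (DATE '2007-03-01');
```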

Rbar 7 posts Joined 10/04
21 Mar 2007

We use DataStage on Windows and routinely execute Teradata macros launched from DataStage. You must be careful about the ANSI transaction semantics or you will get surprises.
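One such surprise, sketched here under assumed names: in an ANSI-mode session nothing is committed until an explicit COMMIT, so a macro that behaves fine in a Teradata-mode BTEQ session can appear to do nothing if the DataStage session ends without committing, because logoff rolls back the open transaction.

```sql
-- In ANSI session mode the macro's changes stay uncommitted
-- until COMMIT WORK is issued; logging off without it rolls
-- the work back.
EXEC etl_db.load_orders (DATE '2007-03-21');
COMMIT WORK;
```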

06 Dec 2007

I have an update to the question I originally posted. We now have DataStage 8.0.1 installed and running in all our Linux environments. This version is able to execute some of our native Teradata DML macros, though I have not tested the very complex ones. It successfully returns the ABORT messages, which helps reduce problem analysis and gets things running again sooner. We are in the process of installing a patch that should allow us to run COLLECT STATISTICS (DDL) statements from DataStage.

Our environment is shifting slightly away from pure ELT because we now perform a small amount of transformation inside DataStage, prior to the MultiLoad. We are still doing the heavy lifting in Teradata using macros. The macros are nice because, if they run into an issue and abort, changes are rolled back to a known state. Our macros display remarks as Teradata works its way through the SQL statements, so we know exactly where an abend occurs and how many rows are impacted by each step.

OK, that may be too much information. The bottom line is that we are reusing our existing macros and not having to reinvent the wheel. Every little bit that helps us get off the mainframe is a good thing.
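The rollback-to-a-known-state behavior mentioned above relies on a macro executing as a single transaction: an ABORT anywhere in the body undoes every prior step. A minimal sketch, with all object names assumed for illustration:

```sql
-- If any stale row matches the ABORT condition, the whole macro
-- (including any statements already executed) is rolled back.
REPLACE MACRO etl_db.check_and_load (load_dt DATE) AS (
  ABORT 'Stale rows left in staging'
    FROM etl_db.stg_orders
    WHERE load_date <> :load_dt;
  INSERT INTO etl_db.orders
    SELECT * FROM etl_db.stg_orders
    WHERE load_date = :load_dt;
);
```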


06 Dec 2007

That's good feedback. The Fix Pack for DataStage 8.0.1 has a brand new Teradata Connector: are you using that to run your macros, or the Teradata Enterprise stage? DataStage 8.1, due out in the next six months, will have an optimizer that pushes parts of a job down to the source or target database as custom SQL, letting you run some steps inside Teradata and some on the DataStage server.

nhari 8 posts Joined 01/11
02 Apr 2013

I am trying to call a Teradata macro within a DataStage (v8.5) job using the Teradata Connector stage. The job hangs when it starts executing the macro, and it is also causing other jobs to hang.
It would be a great help if someone could suggest how to avoid this scenario.
Thanks in Advance,
