Attachment: simpleautomation.zip (1.56 KB)

If you are like me, you tend to have several sandbox databases that you use day in and day out, some on different servers, all with different names, and each one needing the latest DDL changes applied to keep things moving. In development, one of the requirements is that all of the database objects created are qualified by the correct database name. In this article I will show you how to quickly automate transforming and running the DDL scripts using simple Ant and BTEQ scripts. The requirements for these examples, or at least what I am using, are Ant 1.7 and any recent version of BTEQ. If you don't know what Ant is, stop reading right now, hit the back button, and move on, nothing to see here. If you don't know what BTEQ is, see the last sentence.

I will assume that you have Ant and BTEQ working from a command prompt. If not, I will wait….

So let's take the DDL script that we want to use.

CREATE TABLE $CoreDb.MYGREATTABLE ( The_Best_Column VARCHAR(20), The_Second_Best_Column INTEGER );

Let's save this in a file called tables.ddl. Notice the token $CoreDb in the file; that is what I will look for and replace with the actual target database name when it comes time to run this script. If we loaded this script up and tried to run it right now, Teradata would tell you that it can't find the object $CoreDb, unless you happen to have one on your server; I don't. I tend to break the problem down into two phases: the first phase transforms the scripts into something that can be run in the target database, and the second phase actually calls BTEQ to run the scripts. The main benefit is that what we get after the transformation phase is a working set of scripts that can be rerun if I need to. I can also perform the transformation by itself and review the scripts before I actually run them. For the transformation we can use the plain jane vanilla Ant copy task with a filterset to perform the actual token replacement. So here is a sample copy.replace.all Ant target.

<target name="copy.replace.all">
    <copy file="${input}" tofile="${output}" overwrite="true">
        <filterset begintoken="$" endtoken=".">
            <filter token="CoreDb" value="${core.db}."/>
        </filterset>
        <filterset begintoken="$" endtoken=" ">
            <filter token="CoreDb" value="${core.db}"/>
            <filter token="server" value="${server}"/>
            <filter token="username" value="${username}"/>
            <filter token="password" value="${password}"/>
        </filterset>
    </copy>
</target>

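To make the replacement concrete: assuming core.db is set to CoreDatabase, the first filterset matches the $CoreDb. token in tables.ddl, and the copied file would come out as:

```sql
CREATE TABLE CoreDatabase.MYGREATTABLE ( The_Best_Column VARCHAR(20), The_Second_Best_Column INTEGER );
```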
If you notice, there are a couple of things going on in this target. Since this is the same target used to copy all of the files, all of the possible tokens are applied every time. Also, since the endtoken can differ between files, the CoreDb token has to be duplicated in the copy task, once per endtoken. But since this is called simple automation, we will keep it simple for right now. This target is put into a script by itself called copy.replace.xml, and the copy.replace.all target is called from the main build.xml script file. The main build.xml script file is fairly simple as well, having just two targets. I set the default target to copy.replace.all because it doesn't require anything other than Ant to run, and it speeds up testing.
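For completeness, the standalone copy.replace.xml just needs a project element wrapped around that target; a sketch (the property-file name build.properties is my assumption):

```xml
<!-- copy.replace.xml: holds the copy.replace.all target in its own antfile -->
<project name="copy.replace" default="copy.replace.all">
    <!-- token values come from a properties file unless overridden with -D -->
    <property file="build.properties"/>

    <!-- the copy.replace.all target shown earlier goes here -->
</project>
```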

<target name="copy.replace.all">
    <delete dir="${core.db}" failonerror="false"/>
    <mkdir dir="${core.db}"/>
    <ant antfile="copy.replace.xml">
        <property name="input" value="tables.ddl"/>
        <property name="output" value="${core.db}/tables.ddl"/>
    </ant>
    <ant antfile="copy.replace.xml">
        <property name="input" value="install.bteq"/>
        <property name="output" value="${core.db}/install.bteq"/>
    </ant>
    <ant antfile="copy.replace.xml">
        <property name="input" value="logon.bteq"/>
        <property name="output" value="${core.db}/logon.bteq"/>
    </ant>
</target>

<target name="install" depends="copy.replace.all">
    <exec executable="bteq" dir="${core.db}/" input="${core.db}/install.bteq" output="${core.db}/install.log" append="true"/>
    <delete file="${core.db}/logon.bteq"/>
</target>


So all I am doing here is making a copy of my script files using the copy.replace.xml antfile, which performs the token replacement using values from a properties file; the property values can also be passed in on the command line. The good thing about Ant is that whoever sets the value of a property first wins, so this allows you to override anything from the command line using the -Dpropertyname=value option. The easiest way I use this is to create a simple batch file that takes the main parameters. I will also show you in a follow-up article how you can use this when developing for multiple target environments. If you look at the install target, I use Ant's exec task to call bteq, passing in the install.bteq file and routing the output to an install.log file. As if you didn't know, the install.bteq script is the main driver for actually running the SQL on the Teradata database. And no, I don't think that ".bteq" is an actual file type association with the BTEQ scripting engine; it is just the convention I use to name my BTEQ scripts.
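Such a wrapper can be sketched like this; the file name install.sh, the parameter handling, and the compose_install helper are my assumptions, not from the article's attachment:

```shell
#!/bin/sh
# install.sh (hypothetical wrapper). Ant's "whoever sets a property first
# wins" rule means the -D value composed below overrides core.db from the
# properties file.
compose_install() {
    # default to CoreDatabase when no database name is passed
    echo "ant -Dcore.db=${1:-CoreDatabase} install"
}

# a real wrapper would execute the composed command, e.g.:
#   eval "$(compose_install "$1")"
compose_install "${1:-}"
```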

So let's take a quick look at the BTEQ script.

.MESSAGEOUT FILE=install.log;

.RUN FILE=logon.bteq;
.IF ERRORLEVEL > 0 THEN .QUIT 8;

.RUN FILE=tables.ddl;
.IF ERRORLEVEL > 0 THEN .QUIT 8;
Not much going on here; the major driver is the .RUN command, which tells BTEQ to run the script file called tables.ddl. But there are a couple of subtle things also happening. First, we tell BTEQ to route all of the output of the scripts to a file called install.log; this is very helpful in determining when something went wrong. The first thing I do is search for the string '* fail' in the logs, which will usually tell you very quickly if BTEQ reported any problems when running the scripts.

Also, after each call to .RUN we check if the errorlevel != 0. This can get a little tricky because BTEQ has this great feature that if there is an error in one statement but the following statement succeeds, then BTEQ treats the whole script as a success, which from Ant will tell you that the build was a success even though there was a failure in the script somewhere. So I have to make sure that all of the statements in my scripts check for errors afterwards, or the build may report success even if there was a failure. A quick check of the log tells me exactly what happened, so usually it doesn't take long to correct the offending script.

So we have our build scripts, our DDL, and our BTEQ scripts; all we need now is a database to run them on. But wait, where are all of those values coming from, you ask? Good question. There is one other file that I almost forgot to mention: it contains all of the replacement values for all of the tokens that we have defined.


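The file itself is not reproduced here; a sketch of what it might contain (the file name build.properties and the placeholder values are my assumptions, apart from core.db=CoreDatabase, which matches the override example below):

```properties
# build.properties (hypothetical name): one line per replacement token
core.db=CoreDatabase
server=myserver
username=myuser
password=mypassword
```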
Ant will read the contents of this file and use the values as the replacements for the ${} parameters in the Ant scripts. This also allows us to override any of the parameters from the command line; for example, to use Lanyard as the core.db instead of CoreDatabase, all we would have to do is:

> ant -Dcore.db=Lanyard

But there is still one more file that I haven't explained yet. The file logon.bteq is used to hold the username and password for the BTEQ scripts. Sometimes you end up having multiple BTEQ scripts, and having the password spread out in each script is, in my opinion, messy and a little insecure, so my convention is to have the password replacement variable in only one file and to also delete that file after we run the BTEQ script. In development this really isn't an issue, but hopefully we can reuse some of this functionality if we need to deliver the source to a client or customer, because there are some useful installation tools that allow you to use Ant scripts as part of your installation process.
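The contents of logon.bteq are not shown in the article; after transformation it would boil down to a single BTEQ .LOGON command, something like this (the server and credential values are placeholders, not from the article):

```
.LOGON myserver/myuser,mypassword;
```

Before transformation, the server, username, and password parts would instead be the $server, $username, and $password tokens that the filterset replaces.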

So hopefully you can see some benefit to modeling your script process in this fashion. Some of the benefits that I have seen are that this can be automated for multiple environments as well as extended to support more varied token replacement. It could also be used to help deploy your solution to a customer environment. Hopefully I will get to show you how this technique can be extended to support agile development as well. One of the features of Ruby on Rails that I like is ActiveRecord migrations, which allow developers to move the data model and the code in an incremental fashion. This is something that I have found to be very useful when you have a large team of developers making changes all of the time, and allowing the development environments to move incrementally speeds up development. Automating database installations and migrations when new code is checked in, as well as running tests against those environments, is also a very powerful tool in agile software development. So get on with the simple automation, and I will get on with some more advanced examples and utilities to help you. Be sure to download and try out the code in this article from the attachment.

ayman 1 comment Joined 11/07
22 May 2009

useful code.. samples for a custom task would be cool..

Hal 3 comments Joined 03/09
22 May 2009

No problem. I actually use a custom task for the regular expression replacement instead of the built-in regexp because I need to do some expanded replacements. I will try to extract some of that and provide some examples. There are good tutorials online; google is indeed your friend.

nneill 1 comment Joined 04/10
28 Apr 2010

I went looking for any additional articles you might have written in the past year, especially regarding:
"Also automating database installations and migrations when new code is checked in as well as running test against those environments is a very powerful tool in agile software development."

You still out there?

Hal 3 comments Joined 03/09
05 Oct 2010

Still here and actually I have modified the process a little for a simple migration strategy. If I find some time I will write it up and post it. Sorry for the long delay but somehow comments don't get sent to the correct email address, looking into that.
I am using a combination of ant and a groovy script for my database migrations.
