Darin 13 posts Joined 09/05
31 Aug 2010
How to use Volatile tables in Parallel Transporter

I have SQL that needs to run some preprocessing: it creates a volatile table whose contents are then exported to a file. If I put the preprocessing in a setup step, TPT uses a different session for each step, so the volatile table goes away before the export runs. Does anyone have a workaround?
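Here is a stripped-down sketch of the kind of job I mean (the system name, logon, schema, and SQL are just placeholders):

  DEFINE JOB export_with_volatile
  (
    DEFINE OPERATOR ddl_op
    TYPE DDL
    ATTRIBUTES
    (
      VARCHAR TdpId = 'mysystem',
      VARCHAR UserName = 'myuser',
      VARCHAR UserPassword = 'mypass'
    );

    DEFINE SCHEMA out_schema
    (
      region_id INTEGER,
      sales_amt DECIMAL(18,2)
    );

    DEFINE OPERATOR export_op
    TYPE EXPORT
    SCHEMA out_schema
    ATTRIBUTES
    (
      VARCHAR TdpId = 'mysystem',
      VARCHAR UserName = 'myuser',
      VARCHAR UserPassword = 'mypass',
      VARCHAR SelectStmt = 'SELECT region_id, sales_amt FROM vt_work;'
    );

    DEFINE OPERATOR file_writer
    TYPE DATACONNECTOR CONSUMER
    SCHEMA *
    ATTRIBUTES
    (
      VARCHAR FileName = 'export.out',
      VARCHAR Format = 'Delimited',
      VARCHAR OpenMode = 'Write'
    );

    /* Step 1: preprocessing runs in its own sessions */
    STEP create_vt
    (
      APPLY
        ('CREATE VOLATILE TABLE vt_work AS (SELECT region_id, sales_amt FROM prod_db.big_table) WITH DATA ON COMMIT PRESERVE ROWS;')
      TO OPERATOR (ddl_op);
    );

    /* Step 2: logs on new sessions, so vt_work no longer exists here */
    STEP export_vt
    (
      APPLY TO OPERATOR (file_writer)
      SELECT * FROM OPERATOR (export_op);
    );
  );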

ratchetandclank 49 posts Joined 01/08
31 Aug 2010

This is a limitation of TPT: it creates new sessions for every step, so I don't think there is a way around it unless other experts can comment on how to achieve your requirement.

Darin 13 posts Joined 09/05
01 Sep 2010

That is what I was thinking too. A bummer, but what I expected. Thank you for your response. Maybe one of the other folks here will have a workaround.

These are very big queries, in some cases with 200+ GB of output. The volatile tables are helping with spool issues. The path I am currently looking at is TPT's combined data source capability: I am thinking of breaking the data up into regions and running multiple queries. If I can use a different export operator for each, then I can use a different logon for each, and each logon will have its own spool space allocation.
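Roughly what I have in mind: one export operator per region, each with its own logon so each gets its own spool allocation (the logons, names, and region split are made up for illustration; the schema and file-writer definitions follow the same pattern as in my first post and are omitted here):

  DEFINE JOB export_by_region
  (
    /* out_schema, file_writer_east, file_writer_west defined as usual (omitted) */

    DEFINE OPERATOR export_east
    TYPE EXPORT
    SCHEMA out_schema
    ATTRIBUTES
    (
      VARCHAR TdpId = 'mysystem',
      VARCHAR UserName = 'export_user1',
      VARCHAR UserPassword = 'pass1',
      VARCHAR SelectStmt = 'SELECT region_id, sales_amt FROM prod_db.big_table WHERE region_id BETWEEN 1 AND 10;'
    );

    DEFINE OPERATOR export_west
    TYPE EXPORT
    SCHEMA out_schema
    ATTRIBUTES
    (
      VARCHAR TdpId = 'mysystem',
      VARCHAR UserName = 'export_user2',
      VARCHAR UserPassword = 'pass2',
      VARCHAR SelectStmt = 'SELECT region_id, sales_amt FROM prod_db.big_table WHERE region_id BETWEEN 11 AND 20;'
    );

    STEP export_east_step
    (
      APPLY TO OPERATOR (file_writer_east)
      SELECT * FROM OPERATOR (export_east);
    );

    STEP export_west_step
    (
      APPLY TO OPERATOR (file_writer_west)
      SELECT * FROM OPERATOR (export_west);
    );
  );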

ratchetandclank 49 posts Joined 01/08
02 Sep 2010

If you are planning to have combined data sources, then you can use the UNION ALL option in TPT to combine the data from the different sources and pump it into one target table (or even multiple target tables). That is an option as long as you have the same schema across the different source tables.

If you are going to break the data up into regions and submit multiple queries, and they all have the same schema, then you can define one export operator, use UNION ALL, and give each SELECT in the APPLY clause its own SQL, with just the SelectStmt attribute changing to supply a new statement for each one.
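Something along these lines, sketched from memory (export_op and file_writer would be defined as in the earlier posts; the SQL and the region split are placeholders, and I am assuming the usual ATTRIBUTES override syntax in the APPLY clause):

  STEP export_all_regions
  (
    APPLY TO OPERATOR (file_writer)
    SELECT * FROM OPERATOR (export_op
      ATTRIBUTES (SelectStmt = 'SELECT region_id, sales_amt FROM prod_db.big_table WHERE region_id BETWEEN 1 AND 10;'))
    UNION ALL
    SELECT * FROM OPERATOR (export_op
      ATTRIBUTES (SelectStmt = 'SELECT region_id, sales_amt FROM prod_db.big_table WHERE region_id BETWEEN 11 AND 20;'));
  );

Each SELECT FROM OPERATOR picks up the same export operator definition with only the SelectStmt attribute overridden, and TPT merges the rows into the single file writer.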

An expert's opinion would still be nice to get, though.
