16 | 10 Jul 2016 @ 11:13 PDT | Tools | Reply | TPT02639 - Conflicting data type error for column | Hi Steve
Version used is 14.00.00.09. By using a CAST conversion we were able to resolve the issue, but the Chinese characters are not being exported correctly. Is there anything to do with the e... |
15 | 08 Jul 2016 @ 05:37 PDT | Tools | Topic | TPT02639 - Conflicting data type error for column | Hi
I am trying to export table data to a file using a TPT script with UTF8 encoding. The column structure is as follows and contains a Chinese title, e.g.: zone_num char(6) CHARACTER SET LATIN CASESPE... |
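The reply above (entry 16) says the TPT02639 error was resolved with a CAST conversion. A minimal sketch of what that conversion might look like, written here as the SELECT-statement string an export job (or the TD_SELECT_STMT attribute in the TPT API) would use; the title column name, target length, and table name are placeholders, not details taken from the thread:

    /* Sketch only: convert the LATIN column to UNICODE in the export SELECT so
       Chinese characters survive a UTF8 session character set. Under UTF8,
       size the receiving schema for up to 3 bytes per character. */
    const char *selectStmt =
        "SELECT zone_num, "
        "       CAST(title_col AS VARCHAR(200) CHARACTER SET UNICODE) AS title_col "
        "FROM   sample_db.title_tbl;";

An equivalent TRANSLATE(title_col USING LATIN_TO_UNICODE) expression is another commonly used form of the same conversion.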
14 | 21 Jun 2016 @ 02:10 PDT | Tools | Reply | TPT API C program usage | Hi Steve,
As you mentioned in the comments above, using GetBuffer we can retrieve a 64K buffer of rows. What is the maximum buffer of rows that can be retrieved?
Thanks.
|
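For reference, a hedged sketch of the GetBuffer fetch loop being discussed, assuming an already-initiated export Connection named conn and an open FILE* named out (both placeholders), with status-code names as they appear in the Teradata PT API samples; requires <cstdio> plus the TPT API headers shown in entry 6:

    /* Drain the export one buffer at a time; per the discussion above, each
       call is expected to return a block of rows up to roughly 64K in size. */
    char      *buffer = NULL;
    TD_Length  length = 0;
    int        status = conn->GetBuffer(&buffer, &length);
    while (status == TD_Success) {
        fwrite(buffer, 1, length, out);              /* raw rows, schema layout */
        status = conn->GetBuffer(&buffer, &length);
    }
    /* a status of TD_END_Method is expected once all rows have been returned */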
13 | 20 Jun 2016 @ 04:37 PDT | Tools | Reply | Usage of GetRow method in TPT API for exporting table data into csv | Thanks Steve for your valuable information!
|
12 | 17 Jun 2016 @ 06:36 PDT | Tools | Topic | Usage of GetRow method in TPT API for exporting table data into csv | Hi
Using the GetRow method to export data into a file resulted in no data in the CSV file even though there is data in the table.
Is there any issue with the format that has to be specified while ex... |
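One possible explanation, offered here only as an assumption since the reply is not shown: GetRow returns each row in the binary layout described by the schema, not as ready-made CSV text, so writing it straight to a .csv file will not produce readable delimited data. A sketch of one workaround, having the SELECT build a single delimited VARCHAR per row (indicator mode off; conn and csvFile are placeholders; requires <cstdio>, <cstring>, and the TPT API headers from entry 6):

    /* Each row is assumed to be one VARCHAR field: a 2-byte length prefix
       followed by the delimited text built by the SELECT statement. */
    char      *row    = NULL;
    TD_Length  length = 0;
    int        status = conn->GetRow(&row, &length);
    while (status == TD_Success) {
        unsigned short textLen;
        memcpy(&textLen, row, sizeof(textLen));      /* VARCHAR length prefix */
        fwrite(row + sizeof(textLen), 1, textLen, csvFile);
        fputc('\n', csvFile);
        status = conn->GetRow(&row, &length);
    }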
11 | 08 Jun 2016 @ 03:33 PDT | Database | Topic | Teradata demo version 13.1 | Hi
Can anyone help us find the free Teradata demo version 13.1 for Windows?
Thanks
|
10 | 08 Jun 2016 @ 12:08 PDT | Tools | Reply | TPT API C program usage | Thanks Steve :)
As mentioned by Todd, is TPT EXPORT the tool to use for better performance?
|
9 | 07 Jun 2016 @ 12:26 PDT | Tools | Reply | TPT API C program usage | @Steve - can I use * or something like that to indicate all column names? (Since there are different tables with different columns, updating column names in the TPT API code will be a tedious process :( )
|
8 | 07 Jun 2016 @ 12:19 PDT | Tools | Reply | TPT API C program usage | Thanks Steve and Todd.
We are currently exporting using script-based TPT, which performs file I/O operations, so performance is reduced.
We are therefore trying to establish a TPT API connection and send ... |
7 | 03 Jun 2016 @ 01:39 PDT | Tools | Reply | TPT API C program usage | Thanks Steve.
1. Is there any alternative so that we can skip the column names? (It is a tedious process to add all column names, and we are trying to export 5000 tables, each having 50 or ... |
6 | 02 Jun 2016 @ 12:37 PDT | Tools | Topic | TPT API C program usage | Hi,
I found this piece of code:
#include "connection.h"
#include "schema.h"
#include "DMLGroup.h"
using namespace teradata::client::API;
int returnValue = 0;
... |
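The snippet above is cut off by the forum preview. For orientation only, a hedged sketch of how a TPT API export program of this shape typically continues, with attribute and type names as they appear in the Teradata PT API samples; the tdpid, credentials, table, and column are placeholders, not the poster's actual values:

    Connection *conn = new Connection();
    conn->AddAttribute(TD_SYSTEM_OPERATOR, TD_EXPORT);        /* export driver   */
    conn->AddAttribute(TD_TDP_ID,        "mytdpid");          /* placeholders    */
    conn->AddAttribute(TD_USER_NAME,     "myuser");
    conn->AddAttribute(TD_USER_PASSWORD, "mypassword");
    conn->AddAttribute(TD_SELECT_STMT,   "SELECT zone_num FROM mydb.mytable;");

    Schema *schema = new Schema("input");
    schema->AddColumn("zone_num", TD_CHAR, 6);                /* matches CHAR(6) */
    conn->AddSchema(schema);

    conn->Initiate();
    /* ... GetRow or GetBuffer loop, then conn->Terminate() ... */
    delete conn;
    delete schema;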
5 | 30 May 2016 @ 05:09 PDT | Tools | Reply | Executing multiple TPT scripts simultaneously will yield better performance? | Thanks Steve for the response.
I need clarity on the multithreading concept in TPT.
I am aware that we can export multiple tables by creating more instances of the producers and th... |
4 | 18 May 2016 @ 02:07 PDT | Tools | Reply | Executing multiple TPT scripts simultaneously will yield better performance? | Thanks Steve for the information.
Currently we are executing as follows: tbuild -f <filename>
Should the command line be changed as follows for the round-robin fashion?
tbuild -C <... |
3 | 17 May 2016 @ 12:18 PDT | Tools | Reply | Executing multiple TPT scripts simultaneously will yield better performance? | Hi Steve
I have the following TPT script written to export table data to one file. I would like to know if it is possible to export into multiple files to get better through... |
2 | 16 May 2016 @ 02:20 PDT | Tools | Topic | Executing multiple TPT scripts simultaneously will yield better performance? | Hi
Can I execute multiple scripts with different producers (source tables) and different consumers (target files) simultaneously to get better performance?
Thanks in advance
|
1 | 16 May 2016 @ 02:03 PDT | Tools | Topic | Better performance than FastExport | Hi
I would like to know if TPT is faster than FastExport.
We want to export 180 TB of data into a file system; currently with FastExport we are achieving 1 TB per day. Can I expedite the proce... |