
Can anyone provide information as to the encryption strength of BTEQ/Fastload/Multiload etc. when DATAENCRYPTION is ON?

I've searched the internet and this site and can't find it.  Any assistance would be greatly appreciated.

(What is the secret handshake?)

 

mike

I am trying to create a BTEQ job that writes the SQL results to an output file with a separator between each set of results.

I need help with creating the separator in the output file. Is it something like "print" or "echo"?

 

I am not seeing how to do it based on the info in here:

The Teradata RDBMS can return a variety of errors. Some of these errors are retryable (that is, the request can be resubmitted); the simplest example of this is a 2631 (Transaction aborted due to %VSTR) caused by a deadlock condition. Other errors are not retryable; data-related errors (constraint violations, etc.) are an example.
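
For context, here is a minimal sketch of how that distinction typically plays out in a script. BTEQ can resubmit retryable requests itself (see .SET RETRY), while anything still failing afterwards is treated as fatal; the table names below are made up.

.SET RETRY ON

INSERT INTO mydb.target_tbl SELECT * FROM mydb.staging_tbl;

/* A 2631 would normally have been resubmitted by BTEQ itself; anything
   still non-zero here is either non-retryable or exhausted its retries. */
.IF ERRORCODE <> 0 THEN .GOTO FATAL

.QUIT 0

.LABEL FATAL
.QUIT ERRORCODE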

Hi All,

I have text file delimited with '|'.

emp_id|DOB(mm-dd-yyyy)|DOJ(dd/mm/yyyy)   *DOJ - Date of joining

101|12-25-1986|24/10/2008

102|01-23-1982|28/11/2006
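
One hedged way to handle the two different date formats is to bring every field in as VARCHAR with a VARTEXT import and convert during the INSERT; the file name, table and column names below are assumptions.

.IMPORT VARTEXT '|' FILE = emp.txt
.QUIET ON
.REPEAT *
USING in_emp_id (VARCHAR(10)), in_dob (VARCHAR(10)), in_doj (VARCHAR(10))
INSERT INTO mydb.emp (emp_id, dob, doj)
VALUES (
  CAST(:in_emp_id AS INTEGER),
  CAST(:in_dob AS DATE FORMAT 'MM-DD-YYYY'),
  CAST(:in_doj AS DATE FORMAT 'DD/MM/YYYY')
);
.QUIT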

What is the best way to extract the output of 'Help Statistics DbName.TblName Column Partition;' for the purpose of pulling out the value of the 'Number of Rows' column? If you have attempted this through SQL, please share your thoughts. What I already have in place is the following methodology...
Run 'Collect Statistics on DbName.TblName Column (Partition);'
Run 'Help Statistics DbName.TblName Column (Partition);' through BTEQ with SideTitles on
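
For what it's worth, a minimal BTEQ sketch of that methodology, capturing the HELP STATISTICS output to a file so the 'Number of Rows' line can be pulled out afterwards (with grep/awk or similar); the output file name is made up.

.SET SIDETITLES ON
.SET TITLEDASHES OFF
.SET WIDTH 254

.EXPORT REPORT FILE = partition_stats.out
HELP STATISTICS DbName.TblName COLUMN PARTITION;
.EXPORT RESET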

Hello,

I would like to know if there is a way to define and use parameters in BTEQ scripts. The only such feature I am aware of is the "using" feature when importing from files. However, what I would like to do is something like the following:

 

------ define params

param1 = value1

param2 = value2
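
As far as I know, BTEQ itself has no parameter substitution beyond the USING mechanism mentioned above, so one common workaround is to let the calling shell substitute the values, for example through a here-document. A minimal sketch, assuming a Unix shell and made-up object names:

#!/bin/sh
PARAM1="2013-01-31"
PARAM2="WRK_DB"

bteq <<EOF
.LOGON tdpid/userid,password

SELECT COUNT(*)
FROM   ${PARAM2}.sales
WHERE  sale_date = DATE '${PARAM1}';

.LOGOFF
.QUIT
EOF

A similar effect can be had on Windows by having a batch file expand its environment variables into the script before invoking bteq.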

 

Does BTEQ not allow nested .run file commands? I have a BTEQ script that contains a .run file command, and that file also has a .run file command in it. When the script reaches the end of the nested .run file command, it steps all the way back out to the BTEQ script, but I want it to continue with the rest of the steps from the file referred to in the first .run file command.

BTEQ can read the output data directly from another BTEQ job,
using named pipes, without the data ever touching a storage device.

Could anyone please explain this with an example.

A sample code snippet will be helpful.
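
Here is a minimal sketch of the idea on a Unix-style system, with made-up logons, pipe name, and tables: one BTEQ session exports DATA into a named pipe while a second session imports from it, so the rows never land in a regular file.

mkfifo /tmp/bteq_pipe

# writer: export rows into the pipe (run in the background)
bteq <<EOF &
.LOGON tdpid/user1,password1
.EXPORT DATA FILE = /tmp/bteq_pipe
SELECT item_id FROM db.tab1;
.EXPORT RESET
.LOGOFF
.QUIT
EOF

# reader: import rows from the same pipe
bteq <<EOF
.LOGON tdpid/user2,password2
.IMPORT DATA FILE = /tmp/bteq_pipe
.QUIET ON
.REPEAT *
USING item_id (INTEGER)
INSERT INTO db.tab2 (item_id) VALUES (:item_id);
.LOGOFF
.QUIT
EOF

rm /tmp/bteq_pipe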

I was trying to export DDLs for the existing tables. I wrote a BTEQ script as follows. Although the width is set to 20000, it exports partial rows. I tried increasing the width to 60000, but still had no luck; it exported the same partial rows.

Can somebody point out what is wrong in this BTEQ script?

-----------------------------

.logon
.set width 20000
.set titledashes off

.export report file=cm_crt_table.sql

select requesttext(title '') from DBC.TABLES where databasename='WRK_DB' and tablename like 'CM%';

.export reset

.logoff

.quit

Hi,

I have searched, and cannot find out whether the bteq command is provided/supported on Solaris 10 running on an Intel-based machine.

bteq currently runs on an old Sun SPARC machine, and I want to migrate this to a virtualized environment using a Sun/Oracle X4270 M2 running Oracle VM with a Solaris 10 virtual machine.

When it comes to establishing Teradata Database sessions, you may find that using BTEQ’s LOGON command by itself is not sufficient to pass along your credentials for user authentication. This article will explain what other commands you might need to use.

I am trying to understand why the BTEQ .PACK command creates deadlocks on tables with a NUPI.

I had a scenario where I needed to load nearly 2 million rows into a multiset table with a NUPI. I had a lot of duplicates on the NUPI columns.

With SESSIONS set to 4 and PACK set to 2000, my BTEQ script's run-time output displayed the following:

*** Growing Buffer to 399
*** Failure 2631 Transaction ABORTed due to deadlock.
Statement# 1, Info =0

*** Warning: Attempting to resubmit last request.
*** Failure 2631 Transaction ABORTed due to deadlock.

Using these settings, the results of my SQL run via BTEQ are written to a file with no column headers. The only thing I can't figure out is how to get rid of the 10 spaces that are written to the export file before the results of the SQL.

Chapter 2: Starting and Exiting BTEQ
Logging on to the Teradata Database
Page 37

Submit the LOGON command in an input file, including
the password, as follows:
.LOGON tdpid/userid, password

Is there any way I can choose the database to connect to at the same time as the logon command, all in one line?
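
As far as I know there is no one-line form of LOGON that also selects the database; the usual pattern is simply to issue a DATABASE statement immediately after logging on (or have the DBA set a permanent default with MODIFY USER ... AS DEFAULT DATABASE). A sketch with made-up names:

.LOGON tdpid/userid,password
DATABASE mydb;

SELECT COUNT(*) FROM mytable;   /* unqualified names now resolve to mydb */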

Earlier last year, there was a post on this forum asking about a loop function in BTEQ; this post will also show a trick for how to "mimic" a loop using BTEQ.

Recently, I encountered a particular issue in our environment. A process uses parallel BTEQ scripts to process data and load the same destination. Generally, there are multiple insert/update/delete statements in each script, all targeting one main target table. At first glance, there should be no issue, since the RDBMS manages locks, leaves waiting sessions in a "blocked" state, and serves requests FIFO.

I'm trying to set up Teradata with LDAP (Sun ONE) authentication (not authorization).

I get this error when trying to log on using BTEQ:

.logmech ldap
Teradata BTEQ 08.02.03.03 for WIN32. Enter your logon or BTEQ command:
.logon demotdat/auto101

.logon demotdat/auto101
Password:
*** CLI error: CLI2: BADLOGMECH(507): Requested logon mechanism is not available.

Using tdsbind works :

C:\Program Files\NCR\Teradata GSS\nt-i386\12.00.00.00\bin>tdsbind.exe -u auto101

Enter LDAP password:
LdapGroupBaseFQDN: dc=XXXXXX,dc=com
LdapUserBaseFQDN: dc=XXXXXX,dc=com

Can we call a BTEQ script from another BTEQ script? I am working on MS-DOS.

Also I have another question on "activitycount".

Is it possible to get an activitycount on any INSERT|UPDATE|DELETE operation on a table? If yes, how?
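
On the first question, .RUN FILE is the mechanism BTEQ provides for executing one script from inside another. On the second, ACTIVITYCOUNT is set by any SQL statement, and for INSERT/UPDATE/DELETE it holds the number of rows affected, so it can be tested right after the statement. A sketch with made-up names and paths:

.RUN FILE = C:\scripts\child_script.bteq

UPDATE mydb.orders SET order_status = 'X' WHERE order_dt < DATE - 30;

.IF ACTIVITYCOUNT = 0 THEN .GOTO NOROWS
.QUIT 0

.LABEL NOROWS
.REMARK "No rows qualified for the update."
.QUIT 0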

In BTEQ
I did .SET LOGONPROMPT OFF
and then checked whether it really bypasses the logon username and password.
I tried to log on with the following command:
.LOGON TDPID/;
but it didn't allow me to log on; instead it prompted for a username and password.

Hi,

Can someone provide me with a link to the Teradata load utilities download location for Linux? I have searched the download center extensively with no result. I would like to install the BTEQ, FASTLOAD, MULTILOAD, and TPUMP utilities on a Linux box.

Thanks

Hi,

I am using the BTEQ import/export options. I have followed these steps, in order:

Step 1
------
export data from table to file

.set width 900
.export data file=myoutput.txt
Select item_id from
db.tab1
sample 10;

.export reset

Step 2
------
creating a new table

create table db.tab2
(item_id integer)
;

Step 3
-------
import data from file to table

.import data file=myoutput.txt,skip=3
.quiet on
.repeat *
using item_id(integer)
insert into db.tab2
(item_id
)
values
(
:item_id
);
.quit;

This presentation describes, in detail, the various load utilities supported by Teradata.

Hello Everyone,

I have a brilliant opportunity for a Teradata specialist within a global company.

• Need to have several years of experience in DWH with strong expertise in Teradata database and Teradata Utility.
• Extensive experience in coding Teradata SQL, BTEQ, MLOAD, Fast Load and Fast Export.
• Should have expertise in requirement gathering/analysis, data modeling/design, performance tuning and development for DWH projects.
• Must have relevant experience working in a banking domain.

I have 2 BTEQs containing the following types of statements. Both are submitted at the same time and both are trying to insert into the same table.

BTEQ1
---------------------------------
BT;
INSERT INTO T1 SEL * FROM GTT;
.IF ERRORCODE <> 0 THEN .GOTO LABELS
ET;
.IF ERRORCODE <> 0 THEN .GOTO LABELS
.QUIT
.LABEL LABELS
.QUIT ERRORCODE

BTEQ2
---------------------------------
BT;
INSERT INTO T1 SEL * FROM GTT;
.IF ERRORCODE <> 0 THEN .GOTO LABELS
ET;
.IF ERRORCODE <> 0 THEN .GOTO LABELS
.QUIT
.LABEL LABELS
.QUIT ERRORCODE

-----------------------------

Problem statement -
We need a process to tie the query log back to the BTEQ script. Is there an option in Teradata to capture the BTEQ name from the query log? Can we use PROCID, or is there some other option?
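
One option worth considering, if the release supports it, is query banding: have each BTEQ script stamp its session with its own name, which DBQL then records for every query from that session. The band name/value below is just a convention, not a requirement.

.LOGON tdpid/userid,password

SET QUERY_BAND = 'ScriptName=load_customer.bteq;' FOR SESSION;

INSERT INTO mydb.customer SELECT * FROM mydb.customer_stg;

.LOGOFF
.QUIT

The band value then shows up in the QueryBand column of the DBQL views (DBC.QryLog / DBC.QryLogV), which is usually easier to match on than ProcID and session number.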

When first installing Teradata Express for your VMware or Amazon EC2 environment, there are some basic configuration steps that we need to make based on the IP addresses that are given to the instance when it is started.
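
As a rough sketch of the kind of change involved: the client-side hosts file usually needs an entry that maps the instance's IP address to the COP alias the client tools resolve (the tdpid suffixed with cop1). The IP address and alias below are placeholders.

# /etc/hosts (or %SystemRoot%\system32\drivers\etc\hosts on a Windows client)
192.168.56.101   tdexpress   tdexpresscop1

After that, .LOGON tdexpress/dbc (or the equivalent in other client tools) should resolve to the VM.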

Hi, I am new to SQL and to Teradata so I am sorry if this is a stupid question. I studied real-time programming in school using C. Recently I saw a SQL statement in a BTEQ that didn't sit right with me. Unfortunately I don't have access to Teradata at work. I will have to wait until I get home to play with it but I can't find any fault in my logic. So here goes.

I have installed Teradata 13 on VMware and am trying to play around with it, but have a few issues with bteq (being slow, hanging after a missing semicolon, and so on).

Hi Gurus

I am trying to import a data file into a table using BTEQ import and am getting the following error.
Any help appreciated.
Thanks

*** Starting Row 2 at Wed Mar 24 15:40:18 2010

*** Failure 3706 Syntax error: Column name list longer than value list.
Statement# 1, Info =442

*** Warning: Repeat is cancelled.
*** Finished at input row 4 at Wed Mar 24 15:40:19 2010
*** Total number of statements: 1, Accepted : 0, Rejected : 1

*** Total elapsed time was 1 second.

+---------+---------+---------+---------+---------+---------+---------+----
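
That 3706 usually just means the INSERT's column name list has more entries than its value list (for example a field declared in USING but dropped from VALUES). A hedged sketch of the shape BTEQ import expects, with made-up names:

.IMPORT DATA FILE = mydata.dat
.QUIET ON
.REPEAT *
USING emp_id (INTEGER), emp_name (CHAR(30)), dept_no (INTEGER)
INSERT INTO mydb.emp (emp_id, emp_name, dept_no)
VALUES (:emp_id, :emp_name, :dept_no);   /* 3 columns, 3 values */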

Problem Statement - We implement a lot of ELT solutions at our client side. Because the majority of our code is in BTEQ, it is becoming more and more difficult to identify which BTEQ upserts/inserts into which table. Is there some log in Teradata which will tell us which BTEQ script ran a particular query?

I have been asked by the server admins to provide a space requirement to install the following Teradata 12 utilities on a SUN Solaris SPARC server:

(1) CLI and related security libraries
(2) BTEQ
(3) FastLoad
(4) MultiLoad
(5) FastExport

I am new to stored procedures. We might need to capture the error code from a stored procedure in BTEQ to make a decision based on the success/failure of the stored procedure. How can I return an ERRORCODE from a stored procedure? I would like to use the BTEQ syntax .IF ERRORCODE <> 0 to make a decision based on stored procedure success/failure.
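
In BTEQ, ERRORCODE after a CALL reflects whether the procedure itself failed (for example, an unhandled exception that aborted it), so the usual pattern is to test it right after the CALL, the same way .IF ERRORCODE is tested for plain SQL. A minimal sketch with made-up names; how the procedure signals failure internally (output parameter, unhandled SQL exception, etc.) is a separate design choice.

CALL mydb.load_customer_sp('2013-01-31');

.IF ERRORCODE <> 0 THEN .GOTO SP_FAILED

.REMARK "Stored procedure completed successfully."
.QUIT 0

.LABEL SP_FAILED
.QUIT ERRORCODE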

My connectivity to TD is almost entirely through a third-party ETL tool. The ETL tool can create an encrypted password to log on to Teradata.

This book provides reference information about BTEQ (Basic Teradata Query), a general-purpose, command-based report and load utility tool. BTEQ provides the ability to submit SQL queries to a Teradata Database in interactive and batch user modes, then produce formatted results.

Hello ...

I've downloaded and configured the 40Gb version of TDE13.

I've not used teradata since V2R5, and the database, all the tools and documentation were pre-configured for me ...

I have several questions:
- are the PDF files included in the VM, or are they a separate download?
- I'm able to start the instance and login with BTEQ, though logging in takes 45 seconds (repeatably). I've made the change to /etc/hosts for associating my IP address with dbccop1, but am at a loss as to how to speed up the login within the VM.

Workstation BTEQ recently added Unicode Support to its list of capabilities.  This article will explain to you how to start a Unicode BTEQ session in both interactive and batch mode. Command line options have been provided to give you control and flexibility to execute BTEQ in various Unicode environments, while preserving BTEQ’s legacy behavior.

In part 1, we will look at getting those Large Objects of yours into Teradata. One way or another we will get them into Teradata, kicking and screaming if need be (however, it will be relatively painless, I promise), and we will do this despite any "Objections" "Large" may have.

Later in part 2, we will drag those Large Objects back out and use them in a web application.

If you are like me, you tend to have several sandbox databases that you use day in and day out. Some are on different servers, all with different names, and each one needs to have the latest DDL changes applied to keep things moving.