0 - 20 of 20 tags for tuning

There is a "poetic query" that takes forever to run.
It looks something like this:

select
  (case <condition involving tb1 and tb2> ...) as "Dcol1",
  sum(tb3.col1),
  sum(tb3.col2),
  sum(tb3.col3)
from tb1
left outer join tb2 on <condition>
left outer join tb3 on <conditions>

The DELETE below is consuming 11K CPU. I need assistance rewriting it to consume less CPU.
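The original DELETE is not shown, but one classic Teradata rewrite is worth sketching: when a DELETE removes most of a table's rows, copying the surviving rows into a fresh table is often far cheaper in CPU than deleting row by row. All names below are hypothetical and the condition is a placeholder:

```sql
-- Hypothetical names; <delete condition> stands in for the real predicate.
-- Keep only the rows you would have retained:
CREATE MULTISET TABLE tgt_copy AS tgt_tbl WITH NO DATA;

INSERT INTO tgt_copy
SELECT * FROM tgt_tbl
WHERE NOT (<delete condition>);

-- then swap the tables, e.g.
-- DROP TABLE tgt_tbl; RENAME TABLE tgt_copy TO tgt_tbl;
```

Whether this beats the original depends on the fraction of rows deleted and on the EXPLAIN, so treat it as one option to test, not a definitive fix.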

This session will discuss Application Query Tuning on the Teradata platform. 

We are doing an INSERT INTO ... SELECT from a source table to a backup table. The SELECT part runs fine, whereas the INSERT is taking a long time and using high CPU. Based on the parameters and EXPLAIN below, please provide suggestions for tuning.
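One thing always worth checking for this pattern: if the backup table is empty, MULTISET, has the same primary index as the source, and carries no secondary indexes or triggers, Teradata can use the fast-path INSERT ... SELECT, which avoids per-row transient journaling and usually cuts CPU sharply. A hedged sketch with hypothetical table names:

```sql
-- Hypothetical names. Cloning the source definition keeps the PI identical,
-- and MULTISET avoids the duplicate-row check on insert.
CREATE MULTISET TABLE backup_tbl AS src_tbl WITH NO DATA;

-- Into an empty, index-free target this qualifies for the fast path.
INSERT INTO backup_tbl
SELECT * FROM src_tbl;
```

Confirm in the EXPLAIN that the insert is described as a block-level/fast-path operation rather than a row-at-a-time merge.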

Have you ever wished for a magic wand that could quickly point out the missing, stale, and unused statistics for you?
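On recent Teradata releases (14.10 and later), something close to that wand already exists in the DBC.StatsV view. A hedged sketch of a staleness check — the 90-day threshold is an arbitrary assumption, not a recommendation:

```sql
-- List statistics that have not been refreshed in the last 90 days.
SELECT DatabaseName, TableName, ColumnName, LastCollectTimeStamp
FROM DBC.StatsV
WHERE LastCollectTimeStamp < CURRENT_TIMESTAMP - INTERVAL '90' DAY
ORDER BY LastCollectTimeStamp;
```

Missing and unused statistics need other inputs (optimizer recommendations and DBQL usage data respectively), so treat this as only one leg of the analysis.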

Hello All !
Wondering how the number of nodes and AMPs affects the run of a query?
For example: the non-prod environment has 10 nodes but the prod environment has 40 nodes.
How will that affect the same query in both environments?
In that case, is it okay to tune the query in the non-prod environment?

staging_table STG
target_table TGT
AND TGT.SRC_SYS='sourcename'
In the above query, both STG and TGT have SRC_TRAN_ID as their PI.
TGT has stats collected on ROW_STAT_CD and SRC_SYS.

SQL performance is vital for driving value from a data warehouse. With today's growing query complexity, optimizing SQL can be a daunting task ...

Hello All !
I am a bit new to Teradata and started working on it a couple of months ago.
I am working on performance tuning suggestions for some new queries.
Wondering if anybody can help with:
How can we decide which stats to collect for new queries, i.e. queries that are not in production yet?
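For queries that are not yet in production, one common starting point is to ask the optimizer itself which statistics it wants, via the HELPSTATS session diagnostic. A sketch (the query and column name are hypothetical placeholders):

```sql
-- Ask the optimizer to append its statistics wish-list to EXPLAIN output.
DIAGNOSTIC HELPSTATS ON FOR SESSION;

EXPLAIN SELECT ... ;  -- run EXPLAIN on the new query here

-- The EXPLAIN text now ends with a list of recommended COLLECT STATISTICS
-- statements. Collect the ones that matter for joins and selections, e.g.:
COLLECT STATISTICS COLUMN (SRC_TRAN_ID) ON my_table;
```

Not every recommendation is worth the collection cost; prioritize join columns, PI columns, and frequently filtered columns, then re-check the EXPLAIN.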

Performance Diagnostic Method and Tools is targeted to uncover and resolve performance related issues in three primary areas...

Folks - if anyone out there wants to join me in getting access to a TPN+ membership, reply or IM me (preferably IM in the forum). I have been in the Teradata world for a long time - the technology has changed drastically from what it was in the V2R3/5/6 days.

I have a table with a column that contains special characters like &Altide; &Abc; etc. There are 110 special characters in total; I identified them using the rule that anything between "&" and ";" is a special character, and then cross-checked manually. Now I need to find and replace them in my table with a space (" ").
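Assuming Teradata 14.0 or later (where REGEXP_REPLACE is available), the "anything between & and ;" rule can be applied directly as a pattern instead of replacing 110 literals one at a time. A hedged sketch with hypothetical table and column names:

```sql
-- Hypothetical names. '&[^;]+;' matches an ampersand, one or more
-- non-semicolon characters, and the closing semicolon; occurrence 0
-- replaces every match, 'c' makes the match case-sensitive.
UPDATE my_table
SET my_col = REGEXP_REPLACE(my_col, '&[^;]+;', ' ', 1, 0, 'c')
WHERE my_col LIKE '%&%;%';
```

Note this replaces anything matching the pattern, including legitimate text that happens to sit between & and ;, so spot-check the result against the verified list of 110 entities.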

Have you ever struggled over whether, or how, to use a new database feature?  In this presentation, we’ll demonstrate how database query log (DBQL) data can help when it comes to determining candidates for MLPPI (Multi-Level Partitioned Primary Index) as well as the new Columnar Partitioning feature of Teradata 14.

One of the more difficult challenges in database management and administration is determining where and how to implement a new RDBMS feature or function. In this presentation we’ll look at the DBQL data available for evaluation of tables and columns used within a workload and how this data can be leveraged for determining candidates for MLPPI (Multi-Level Partitioned Primary Index).
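For readers who have not yet used the feature being evaluated, the shape of an MLPPI definition is worth showing; this is a minimal sketch with hypothetical table and column names, not a recommendation from the presentation:

```sql
-- Two partitioning levels: month of transaction date, then region code.
CREATE TABLE sales (
    tran_id   INTEGER NOT NULL,
    tran_dt   DATE NOT NULL,
    region_cd BYTEINT NOT NULL,
    amount    DECIMAL(12,2)
)
PRIMARY INDEX (tran_id)
PARTITION BY (
    RANGE_N(tran_dt BETWEEN DATE '2010-01-01' AND DATE '2014-12-31'
            EACH INTERVAL '1' MONTH),
    RANGE_N(region_cd BETWEEN 1 AND 10 EACH 1)
);
```

The DBQL analysis described above is what tells you which columns (here tran_dt and region_cd) are actually filtered often enough to justify each level.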

Have you ever been bewildered trying to tune complex Teradata queries that go on for pages and pages of EXPLAIN text?  This presentation goes beyond "Explain the Explain", to teach strategies and techniques for quickly deciphering the most expensive operations in your queries.  Taught by a 20+ year Teradata veteran, examples of real queries gathered from some of the largest production environments in the world are used.  This is advanced material covered at a quick pace.

When defining a PPI on a date column, would it be better to define your partitions so that fewer (and larger) partitions are accessed by the query or would it be better to define your partitions so that more (but smaller) partitions are accessed?
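Concretely, the trade-off is between granularities on the same date column; both extremes are sketched below with hypothetical names:

```sql
-- Coarser: a one-month query touches 1-2 large partitions.
PARTITION BY RANGE_N(order_dt BETWEEN DATE '2012-01-01' AND DATE '2014-12-31'
                     EACH INTERVAL '1' MONTH)

-- Finer: the same one-month query touches ~30 small partitions,
-- but each partition scanned contains only that day's rows.
PARTITION BY RANGE_N(order_dt BETWEEN DATE '2012-01-01' AND DATE '2014-12-31'
                     EACH INTERVAL '1' DAY)
```

Which wins depends on the workload: finer partitions eliminate more rows for narrow date predicates, while coarser ones reduce per-partition overhead for queries spanning long ranges.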

This presentation will provide recommendations and guidelines for using leading practices to enable system and performance tuning.  You will learn the benefits of using Teradata's most recent set of recommendations to manage workloads on a Teradata system. For example, by using Teradata's data collection guidelines, recommendations for Account Management and Logging, a user will be able to identify, analyze and tune problem queries for better performance. This presentation will also focus on how to use the tools provided in the Teradata Analyst Pak. 

The primary goals in a successful backup strategy are to ensure backup integrity, minimize backup & restore time, & reduce impact on the Teradata system. This presentation covers the connection, configuration, & tuning of the BAR hardware and software components, protecting & validating backups, & how different backup strategies can be applied to reduce system impact.


I have a simple question: does the order of the joined tables matter? Which is better?

select ...
from large_Table lt
inner join medium_table mt
inner join small_table st

select ...
from small_Table st
inner join medium_table mt