0 - 14 of 14 tags for dbc

I would like to ask about the DBC.ResUsageSpma table and the performance of our Teradata system.
We have a Teradata 2800 with 2 nodes and we are having some performance issues. When I look at DBC.ResUsageSpma, the results are as attached.
Is something wrong with Node ID 105?
Thanks in advance.
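For a per-node comparison from the same table, a query along these lines is one reasonable starting point (the date filter and the choice of CPU columns are illustrative, not a prescribed diagnostic):

```sql
-- Compare average CPU state per node from the SPMA resource-usage table
SELECT  NodeID,
        AVG(CPUUServ + CPUUExec) AS AvgCpuBusy,
        AVG(CPUIdle)             AS AvgCpuIdle,
        AVG(CPUIoWait)           AS AvgCpuIoWait
FROM    DBC.ResUsageSpma
WHERE   TheDate = CURRENT_DATE - 1
GROUP BY NodeID
ORDER BY NodeID;
```

If one node consistently shows much higher busy/IO-wait figures than its peer, that points to skew or a hardware issue on that node rather than a system-wide problem.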

I have a TARA job which backs up everything under DBC, excluding the DBC database itself. The following is part of the ARC script, which works:
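Since the script itself did not come through above, here is a minimal sketch of the common pattern (the tdpid, password, and the single EXCLUDE are placeholders, not the poster's original script):

```
LOGON tdpid/dbc,password;
ARCHIVE DATA TABLES (DBC) ALL,
  EXCLUDE (DBC),
  RELEASE LOCK,
  FILE = ARCHIVE;
LOGOFF;
```

Archiving `(DBC) ALL` walks every database under the DBC root, and the EXCLUDE clause then skips the DBC database itself, which is what "everything under dbc excluding dbc" describes.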

If you are user DBC or user TDWM and you log on to a Teradata Database, you might be treated somewhat differently than other users. This posting describes what user DBC and user TDWM do, some of the special things about them, and when you can expect them to be treated differently. We'll also look at the implications for setting up workload management for these two special users.

Hi all,
I'm writing here because I didn't find any previous entry that solves my strange problem.

1) I define a table that has a column with COMPRESS clause
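As a minimal illustration of the kind of definition meant here (the database name, table name, and value list are placeholders), a column with a multi-value COMPRESS clause looks like:

```sql
CREATE TABLE demo_db.compress_demo (
    id     INTEGER NOT NULL,
    status VARCHAR(10) COMPRESS ('NEW', 'OPEN', 'CLOSED')  -- compressed values
)
PRIMARY INDEX (id);
```

Rows whose `status` matches one of the listed values store only a small presence bit instead of the full string.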

Hello all,
I have the following two questions regarding TD DBAs:

Hi all, I am looking for a way to see which database/table(s) a Teradata MACRO inserts into, updates, or deletes from. We are setting query banding and are able to see all of the tables the MACRO touches, but we are simply unable to determine which one(s) it is inserting into, updating, or deleting from.

Hi All


If a table is defined as MLPPI, the partitioning constraint can be checked from either DBC.PartitioningConstraintsV or DBC.IndexConstraintsV.

There, the ConstraintText field holds the partitioning expression in a certain format, such as:


/* nn bb cc */ partition_expression_i /* i  d+a */
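To pull that text for a specific table, a query such as the following can be used (the database and table names are placeholders):

```sql
-- Retrieve the stored partitioning expression for an MLPPI table
SELECT  DatabaseName,
        TableName,
        ConstraintText
FROM    DBC.PartitioningConstraintsV
WHERE   DatabaseName = 'demo_db'
AND     TableName    = 'my_mlppi_table';
```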



Hi everyone,
I'm trying to build a set of queries with the objective of getting the DDL of tables purely from the info inside the dbc tables (columns, tvm, dbase, tableconstraints...).
I started from the table dbc.columns to get all the info about every field of a table.
Below you can find the work in progress query.
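As a starting-point sketch for that approach (database and table names are placeholders), the per-column metadata can be pulled like this; note that ColumnType comes back as a two-character code (e.g. 'CV' for VARCHAR, 'I' for INTEGER) that the final DDL generator has to translate:

```sql
-- Per-column metadata needed to reconstruct a column list
SELECT  ColumnName,
        ColumnType,
        ColumnLength,
        DecimalTotalDigits,
        DecimalFractionalDigits,
        Nullable
FROM    DBC.ColumnsV
WHERE   DatabaseName = 'demo_db'
AND     TableName    = 'my_table'
ORDER BY ColumnId;
```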

Hi guys,
I would like to understand people's views on utilising DBC for dynamically generating scripts for data processing.
Considering DBC is Teradata's own metadata store, and its development is owned and managed by Teradata, do you feel it is appropriate to develop frameworks for processing data based upon it?

Is there any way to measure the total AMPCPUTime consumed by an MLOAD/FLOAD/FEXP job after completion? The values from DBQL (sum of AMPCPUTime for a particular LSN) do not seem right.
I also tried dbc.acctg; there, too, the values seem low.
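For reference, the DBQL approach described would look roughly like this (the LSN value is a placeholder):

```sql
-- Sum logged AMP CPU across all DBQL rows sharing a utility job's LSN
SELECT  LSN,
        SUM(AMPCPUTime)   AS TotalAmpCpu,
        SUM(TotalIOCount) AS TotalIO
FROM    DBC.DBQLogTbl
WHERE   LSN = 12345
GROUP BY LSN;
```

One caveat: this only captures what DBQL logs for the session's SQL phases, not everything the utility's data phases consume, which may explain why the totals look low. On newer releases (TD 14.0+) there is also a utility log (dbc.DBQLUtilityTbl / QryLogUtilityV) that records one summary row per utility job when utility logging is enabled, which may be worth checking.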

Hi all,
My project uses a TD 13.10 system.
We are planning to develop a Java-based application to dynamically view the queries run by our ETL jobs,
specifically queries that are blocked by other queries. I know this can be accomplished with the PMon tool,
but we would like to view this information in a custom-built application.

Hi,
According to the manual (TD 14), the TypeOfUse field of the table dbc.DBQLObjTbl has the following values:

• 1 = Found in the resolver

• 2 = Accessed during query processing

• 4 = Found in a conditional context

• 8 = Found in inner join condition

• 16 = Found in outer join condition
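Since those values are powers of two, they act as bit flags that can be combined in a single TypeOfUse value, so individual flags are best tested with BITAND (available on TD 14+); for example, to find objects referenced in an outer join condition:

```sql
-- Objects whose TypeOfUse has the outer-join-condition bit (16) set
SELECT  ObjectDatabaseName,
        ObjectTableName,
        TypeOfUse
FROM    DBC.DBQLObjTbl
WHERE   BITAND(TypeOfUse, 16) = 16;
```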

Is there a table or view that contains the NumberOfRows, NumberOfColumns, DatabaseName, and TableName?

In Oracle it's called all_tables, and in Netezza it's called _v_table_only_storage_stat.

Here is where I looked:
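There is no single DBC view with all four fields: the column count per table can be derived from DBC.ColumnsV, but row counts are not stored in the dictionary and must come from a COUNT(*) or from collected statistics. A sketch for the column-count part (the database name is a placeholder):

```sql
-- Number of columns per table, derived from the dictionary
SELECT  DatabaseName,
        TableName,
        COUNT(*) AS NumberOfColumns
FROM    DBC.ColumnsV
WHERE   DatabaseName = 'demo_db'
GROUP BY DatabaseName, TableName
ORDER BY TableName;
```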

From Teradata documentation, the definitions of these ResUsageSpma columns are:

FilePres: Total number of times a cylinder is loaded.

FilePreReads: Number of times a cylinder is loaded.

These look similar, but on our lab's Teradata system I observed quite different values, approximately:
FilePres = 2 x FilePreReads

I also looked into KA but couldn't find any explanation. Please respond if you know the meaning of these fields.
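For anyone who wants to reproduce the observation, the ratio can be computed directly from the table (no filters assumed; add a date predicate as needed):

```sql
-- Compare the two preload counters per node and day
SELECT  TheDate,
        NodeID,
        SUM(FilePres)     AS TotFilePres,
        SUM(FilePreReads) AS TotFilePreReads,
        CAST(SUM(FilePres) AS FLOAT)
          / NULLIFZERO(SUM(FilePreReads)) AS Ratio
FROM    DBC.ResUsageSpma
GROUP BY TheDate, NodeID
ORDER BY TheDate, NodeID;
```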