
Team,
 
We have an INSERT statement that inserts around 92 million records into a table based on a SELECT from another table. The SELECT piece runs fine, but the following step takes a long time.
We do a MERGE into table HUB_ENCOUNTER_MESSAGE from Spool 18767.
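For reference, the statement being described has roughly the shape below; the staging table and column names are assumptions, not the actual object names.

INSERT INTO HUB_ENCOUNTER_MESSAGE (encounter_id, message_txt)
SELECT src.encounter_id, src.message_txt
FROM stg_encounter_message src;   -- hypothetical source table and columns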

Hello!
Teradata allows fields defined in the SELECT list to be used in other clauses of the query.
I have two tables, T1 and T2, and there is a field "a" in both T1 and T2. I made a new field in the SELECT list:
SELECT 
case when T1.a > T2.a then T1.a else T2.a end AS a
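As a minimal sketch of reusing that named expression elsewhere in the same query, Teradata does let a SELECT-list alias be referenced in clauses such as WHERE. The join condition below is an assumption, and the alias is renamed to max_a so it does not collide with the underlying columns:

SELECT CASE WHEN T1.a > T2.a THEN T1.a ELSE T2.a END AS max_a
FROM T1
JOIN T2 ON T1.id = T2.id        -- hypothetical join key
WHERE max_a > 10;               -- Teradata allows the alias to be reused here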
 

Hi,
 
I'm trying to do something like the statement below, but it's giving me a syntax error.
Can anyone please tell me what the issue is and how to fix it?
 
with cte0(ID,int_col1) as(
    select row_number() over (order by (select 1)) AS ID,int_col1 from shuffletest
)
update cte0 set int_col1=ID;
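Teradata does not support UPDATE against a common table expression (that form works in SQL Server), and ORDER BY (SELECT 1) is not accepted either. One possible rewrite, sketched under the assumption that shuffletest has a unique key column (called pk_col here, which is an assumption; without some key there is no deterministic way to tie a generated number back to a row), uses Teradata's joined UPDATE ... FROM syntax:

UPDATE tgt
FROM shuffletest tgt,
     ( SELECT pk_col,
              ROW_NUMBER() OVER (ORDER BY pk_col) AS new_id
       FROM shuffletest
     ) src
SET int_col1 = src.new_id
WHERE tgt.pk_col = src.pk_col;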
 

Hi,
While searching for a way to get the data types of the columns of a view, I came
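For anyone looking for the same thing: the data types of a view's columns can be listed with HELP COLUMN against the view. A one-line sketch, with a hypothetical database and view name:

HELP COLUMN MyDatabase.MyView.*;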

Hello,
 
I am facing a problem when writing a CASE statement.
Here is the sample query:

Hi,
I am exporting a large amount of data into a file. The specification is that each column should be separated by a pipe, and if there is no data there should be a space (| |).
I need the file in this format:
If there is data, it should look like this:
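One common way to build such a record in SQL, sketched with assumed table and column names, is to cast each column to VARCHAR and replace NULLs with a single space before concatenating with pipes; the resulting single column can then be written out with BTEQ or FastExport:

SELECT COALESCE(CAST(col1 AS VARCHAR(50)), ' ') || '|' ||
       COALESCE(CAST(col2 AS VARCHAR(50)), ' ') || '|' ||
       COALESCE(CAST(col3 AS VARCHAR(50)), ' ') AS out_rec
FROM export_tbl;   -- hypothetical table and columns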

The following shows how to select rows from a database table using JDBC FastExport, which only works with JDBC PreparedStatement.

Dear All,
I have a table with two fields: date_col and time_col.
DATE_COL - DATE FORMAT 'YY/MM/DD'
TIME_COL - TIME(6)
I am trying to select the min(time_col) for a particular date / current_date...
When I select all the records from the table for a particular date, I can see the min of time_col is '00:01:33'
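As a sketch of the query being described (the table name is an assumption):

SELECT MIN(time_col)
FROM my_table          -- hypothetical table name
WHERE date_col = CURRENT_DATE;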

Hi All,
I have the following SQL inside a stored procedure file and need some help.  I don't have access to the database so I can't test it out.

Good morning TD community,

Good Afternoon,

My question is this: given a date, I want to get the last day of that date's month, which of course depends on the month.

For example I get:
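One way to compute this in Teradata, as a sketch: LAST_DAY works on releases that include it, and on older releases the same result comes from date arithmetic (first day of the next month minus one day):

SELECT LAST_DAY(CURRENT_DATE);                                                  -- where LAST_DAY is available
SELECT ADD_MONTHS(CURRENT_DATE - EXTRACT(DAY FROM CURRENT_DATE) + 1, 1) - 1;   -- first of next month minus one day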

Hi,

I got to know that I can use the TPT ODBC operator to select data from another ODBC database and then load that data into a Teradata database using the Load operator.

I'm still new to Teradata, so most of my issues and inquiries are probably simple. I've done a few searches and can't seem to find much...

If I have 5 columns, itm_cd1 thru itm_cd5, is there an easy way to select the highest value of the 5?
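On releases that support it (Teradata 16.20 onward), GREATEST is the easiest route; on older releases the same result can be built with nested CASE expressions comparing the columns pairwise. A sketch, with an assumed table name:

SELECT GREATEST(itm_cd1, itm_cd2, itm_cd3, itm_cd4, itm_cd5) AS max_itm_cd
FROM item_tbl;   -- hypothetical table name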

Hi,

I have a very large table with a primary index on (x, y). I am trying to perform a search over multiple areas; e.g. area 1 is located between x=1, y=1 and x=100, y=100. My aim is to retrieve the first available row for each search area as fast as possible.
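One pattern worth sketching, assuming the table is called points_tbl and the areas are held in a small driver table search_areas(area_id, x_min, x_max, y_min, y_max) (all of these names are assumptions), is to join on the bounding box and keep one row per area with QUALIFY:

SELECT a.area_id, p.x, p.y
FROM points_tbl p
JOIN search_areas a
  ON  p.x BETWEEN a.x_min AND a.x_max
  AND p.y BETWEEN a.y_min AND a.y_max
QUALIFY ROW_NUMBER() OVER (PARTITION BY a.area_id ORDER BY p.x, p.y) = 1;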

How do I extract the data from a volatile table that I have just created?
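A volatile table exists only in the session that created it, so the SELECT simply has to run in that same session. A minimal sketch with hypothetical names:

CREATE VOLATILE TABLE my_vt (
    id  INTEGER,
    val VARCHAR(20)
) ON COMMIT PRESERVE ROWS;

INSERT INTO my_vt VALUES (1, 'example');

SELECT * FROM my_vt;   -- works as long as this runs in the same session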

I'm adding up totals within a database and would like to see how the totals change over different time periods.

I'm using this code:

Select Sum(CASE WHEN Offer_1 = 'Y' THEN 1 ELSE 0 END) as Offer1,
       Sum(CASE WHEN Offer_2 = 'Y' THEN 1 ELSE 0 END) as Offer2,
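To see how those totals move over different time periods, one option is to group by a period derived from a date column. A sketch, where the date column txn_date and the table offers_tbl are assumptions:

SELECT EXTRACT(YEAR FROM txn_date)  AS yr,
       EXTRACT(MONTH FROM txn_date) AS mth,
       SUM(CASE WHEN Offer_1 = 'Y' THEN 1 ELSE 0 END) AS Offer1,
       SUM(CASE WHEN Offer_2 = 'Y' THEN 1 ELSE 0 END) AS Offer2
FROM offers_tbl
GROUP BY 1, 2
ORDER BY 1, 2;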

Hello folks,

I am reviewing some documentation and have done some research on the ability to protect my data using the Archive utility. My source for the documentation is here... http://www.info.teradata.com/edownload.cfm?itemid=101680007

For my situation, I need to have a few datasets (7 at most) to recover from. So given the following Archive script...

-----
LOGON DBCID/UID,PW;
ARCHIVE DATA TABLES (MYONLINEDATA) ALL,
RELEASE LOCK,
FILEDEF=(tddumps,/var/tddumps/dump.%UEN%.out);
LOGOFF;
-----

Hi,

I am facing a weird problem and can't really figure out what the reason for it could be. Consider the following SELECT statement:

SELECT
Col1, Col2, Col3
FROM
View1
QUALIFY ROW_NUMBER() OVER(PARTITION BY Col4 ORDER BY Col5) = 1

We get 644,539 rows in return.

If we take a COUNT(*) as below:

SELECT COUNT(*) FROM
(
SELECT
Col1, Col2, Col3
FROM
View1
QUALIFY ROW_NUMBER() OVER(PARTITION BY Col4 ORDER BY Col5) = 1
) A

It gives us a count of 158,446.

If we do SELECT * as below:

SELECT * FROM
(
SELECT
Col1, Col2, Col3
FROM
View1

Hi Guys,

Our portlet has to store a considerably large amount of data. Is there a way to create a custom table in the PostgreSQL database for storing this data, and afterwards be able to query this table and also do inserts and updates?
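If direct access to the portlet's PostgreSQL schema is available, a plain table can be created and used like any other. A minimal sketch; all object names here are hypothetical:

CREATE TABLE portlet_data (
    id         BIGSERIAL PRIMARY KEY,
    payload    TEXT      NOT NULL,
    created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);

INSERT INTO portlet_data (payload) VALUES ('example row');
SELECT * FROM portlet_data WHERE id = 1;
UPDATE portlet_data SET payload = 'updated row' WHERE id = 1;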