vinuodh 3 posts Joined 02/12
15 Aug 2012
Getting the Max values and Grouping.

Hi,

I have a requirement: for the same field1, field2, and field6, get the earliest field4 and the latest field5 for each stretch of rows over which field3 has not changed.

Below are my sample records:

Field1      Field2  Field3  Field4      Field5      Field6
123456789   120230  2.750   2012-07-26  2012-07-27  2012007
123456789   120230  2.300   2012-07-30  2012-07-31  2012007
123456789   120230  2.300   2012-07-17  2012-07-26  2012007
123456789   120230  2.750   2012-07-01  2012-07-16  2012007
123456789   120230  2.300   2012-07-31  2012-08-01  2012007
123456789   120230  2.300   2012-07-26  2012-07-27  2012007
123456789   120230  2.300   2012-07-01  2012-07-16  2012007
123456789   120230  2.750   2012-07-17  2012-07-26  2012007
123456789   120230  2.750   2012-07-16  2012-07-17  2012007
123456789   120230  2.300   2012-07-27  2012-07-30  2012007
123456789   120230  2.750   2012-07-27  2012-07-30  2012007
123456789   120230  2.300   2012-07-16  2012-07-17  2012007

And below is the expected output:

Field1      Field2  Field3  Field4      Field5      Field6
123456789   120230  2.750   2012-07-01  2012-07-16  2012007
123456789   120230  2.300   2012-07-01  2012-07-17  2012007
123456789   120230  2.750   2012-07-16  2012-07-17  2012007
123456789   120230  2.300   2012-07-17  2012-07-26  2012007
123456789   120230  2.750   2012-07-17  2012-07-26  2012007
123456789   120230  2.300   2012-07-26  2012-07-27  2012007
123456789   120230  2.750   2012-07-26  2012-07-27  2012007
123456789   120230  2.300   2012-07-27  2012-07-30  2012007
123456789   120230  2.750   2012-07-27  2012-07-30  2012007
123456789   120230  2.300   2012-07-30  2012-08-01  2012007

ulrich 816 posts Joined 09/09
16 Aug 2012

This can be done with OLAP functions.

Can you provide us with the DDL and INSERT statements for the test data?

I'm losing the willingness to set these up on my own...

Check also http://forums.teradata.com/forum/general/need-help-merging-consecutive-and-overlapping-date-spans; it might give you some ideas.
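
As a rough sketch of the first step (assuming, until you post the DDL, a table named test with the columns from your example):

-- The 1 PRECEDING / 1 PRECEDING window reads field3 from the previous
-- row of the same field1/field2/field6 group; the comparison is NULL on
-- the first row of a group, so the else branch flags it as a new run.
select t.*,
       case when field3 = min(field3) over (partition by field1, field2, field6
                                            order by field4, field5
                                            rows between 1 preceding and 1 preceding)
            then 0
            else 1
       end as change_flag
from test t;

A cumulative sum over change_flag then numbers the runs, and a GROUP BY on that number gives you the min(field4) / max(field5).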


vinuodh 3 posts Joined 02/12
19 Aug 2012

Below are the CREATE and INSERT scripts:

CREATE MULTISET TABLE test
(
      Field1 VARCHAR(32),
      Field2 DECIMAL(6,0),
      Field3 DECIMAL(17,3),
      Field4 DATE FORMAT 'YYYYMMDD'  ,
      Field5 DATE FORMAT 'YYYYMMDD'  ,
      Field6 DECIMAL(7,0)
);

INSERT INTO test VALUES ('852369745','120010','4.5','20120601','20120628','2012003');
INSERT INTO test VALUES ('852369745','120010','3','20120628','20120815','2012003');
INSERT INTO test VALUES ('852369745','120010','3','20120601','20120628','2012003');
INSERT INTO test VALUES ('852369745','120010','4.5','20120628','20120815','2012003');

INSERT INTO test VALUES ('8547123','120010','2.3','20120701','20120705','2012003');
INSERT INTO test VALUES ('8547123','120010','2.3','20120705','20120709','2012003');
INSERT INTO test VALUES ('8547123','120010','2.75','20120709','20120710','2012003');
INSERT INTO test VALUES ('8547123','120010','2.3','20120710','20120717','2012003');
INSERT INTO test VALUES ('8547123','120010','2.3','20120717','20120719','2012003');
INSERT INTO test VALUES ('8547123','120010','2.3','20120719','20120723','2012003');

INSERT INTO test VALUES ('96578412','120010','2.3','20120401','20120405','9012012');
INSERT INTO test VALUES ('96578412','120010','2.3','20120405','20120409','9012012');
INSERT INTO test VALUES ('96578412','120010','2.75','20120409','20120410','9012012');
INSERT INTO test VALUES ('96578412','120010','2.3','20120410','20120417','9012012');
INSERT INTO test VALUES ('96578412','120010','2.3','20120417','20120419','9012012');
INSERT INTO test VALUES ('96578412','120010','2.3','20120419','20120423','9012012');
INSERT INTO test VALUES ('96578412','120010','2.3','20120423','20120501','9012012');
INSERT INTO test VALUES ('96578412','120010','3.3','20120501','20120505','9012012');
INSERT INTO test VALUES ('96578412','120010','2.3','20120505','20120509','9012012');
INSERT INTO test VALUES ('96578412','120010','2.3','20120509','20120510','9012012');
INSERT INTO test VALUES ('96578412','120010','2.75','20120510','20120517','9012012');
INSERT INTO test VALUES ('96578412','120010','2.3','20120517','20120519','9012012');
INSERT INTO test VALUES ('96578412','120010','2.3','20120519','20120523','9012012');
INSERT INTO test VALUES ('96578412','120010','2.75','20120523','20120527','9012012');
INSERT INTO test VALUES ('96578412','120010','2.3','20120527','20120529','9012012');
INSERT INTO test VALUES ('96578412','120010','2.3','20120529','20120601','9012012');
INSERT INTO test VALUES ('96578412','120010','2.3','20120601','20120609','9012012');
INSERT INTO test VALUES ('96578412','120010','3','20120609','20120611','9012012');
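
A quick sanity check after loading (plain aggregates only, using the table above):

-- count the loaded rows per field1/field2/field6 group
select field1, field2, field6, count(*) as loaded_rows
from test
group by field1, field2, field6
order by 1, 2, 3;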

ulrich 816 posts Joined 09/09
21 Aug 2012

This might give you what you are looking for.

I have to mention, though, that I don't understand how you deal with this data, as there are still many overlaps, at least in the first group of the sample data.

select Field1,
       field2,
       field3,
       min(field4) as field4,  -- earliest start date in the run
       max(field5) as field5,  -- latest end date in the run
       field6
from
(
select Field1,
       field2,
       field3,
       field4,
       field5,
       field6,
       -- running sum of the change flags: every unbroken run of the same
       -- field3 value gets its own period_id
       sum(change_flag) over (partition by field1, field2, field6 order by field4, field5, field3 rows between unbounded preceding and current row) as period_id
from (
SELECT Field1,
       field2,
       field3,
       field4,
       field5,
       field6,
       -- the 1 PRECEDING / 1 PRECEDING window reads field3 from the
       -- previous row; the comparison is NULL on the first row of each
       -- partition, so the else branch starts a new run there
       case when field3 <> max(field3) over (partition by field1, field2, field6 order by field4, field5, field3 rows between 1 preceding and 1 preceding)
                then 1
            when field3 = max(field3) over (partition by field1, field2, field6 order by field4, field5, field3 rows between 1 preceding and 1 preceding)
                then 0
            else 1
       end as change_flag
from test
) as t1
) as t2
group by Field1,
       field2,
       field3,
       field6,
       period_id
order by 1,2,6,4,5,3;
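
In short: the innermost select flags every row whose field3 differs from the previous row of its field1/field2/field6 group (the rows between 1 preceding and 1 preceding window is the usual way to read the prior row's value here), the running sum turns those flags into a period_id for each unbroken run of the same field3, and the outer group by collapses each run to min(field4) and max(field5).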

