HF 3 posts Joined 07/16
19 Jul 2016
Consolidate rows

I'm looking to consolidate rows in my table by combining similar consecutive rows.
In the example, category "A" has type "X" as the first two rows in order of START_TS. I need those two rows combined into one, taking the START_TS from the first row and the END_TS from the second. The other two rows would remain unchanged.
In category "B", all four rows are type "X", so they would all combine into one row with the earliest START_TS and the latest END_TS.
In category "C", there are no consecutive rows with the same type, so all four records would remain unchanged.
I've tried a recursive query but haven't been able to make it work.

CURRENT TABLE
CATEGORY  TYPE  START_TS             END_TS
A         X     2016-01-01 00:00:00  2016-01-01 23:59:59
A         X     2016-02-01 00:00:00  2016-02-01 23:59:59
A         Q     2016-03-01 00:00:00  2016-03-01 23:59:59
A         X     2016-04-01 00:00:00  2016-04-01 23:59:59
B         X     2016-01-01 00:00:00  2016-01-01 23:59:59
B         X     2016-02-01 00:00:00  2016-02-01 23:59:59
B         X     2016-03-01 00:00:00  2016-03-01 23:59:59
B         X     2016-04-01 00:00:00  2016-04-01 23:59:59
C         Q     2016-01-01 00:00:00  2016-01-01 23:59:59
C         X     2016-02-01 00:00:00  2016-02-01 23:59:59
C         Q     2016-03-01 00:00:00  2016-03-01 23:59:59
C         X     2016-04-01 00:00:00  2016-04-01 23:59:59
   
UPDATED (or new) TABLE
CATEGORY  TYPE  START_TS             END_TS
A         X     2016-01-01 00:00:00  2016-02-01 23:59:59
A         Q     2016-03-01 00:00:00  2016-03-01 23:59:59
A         X     2016-04-01 00:00:00  2016-04-01 23:59:59
B         X     2016-01-01 00:00:00  2016-04-01 23:59:59
C         Q     2016-01-01 00:00:00  2016-01-01 23:59:59
C         X     2016-02-01 00:00:00  2016-02-01 23:59:59
C         Q     2016-03-01 00:00:00  2016-03-01 23:59:59
C         X     2016-04-01 00:00:00  2016-04-01 23:59:59
 

dnoeth 4628 posts Joined 11/04
19 Jul 2016

Is "2016-02-01" Feb. 1st or Jan. 2nd, i.e. do you want to combine only periods without gaps?

Dieter

HF 3 posts Joined 07/16
19 Jul 2016

2016-02-01 is February 1st. There are no gaps in time. The END_TS in my example should really have been the last second before the START_TS of the following row; for instance, the first row in the example should have had an END_TS of 2016-01-31 23:59:59.
The data I'm actually working with could span a few seconds to several months, but the START_TS will always be one second after the END_TS of the previous record.
 

dnoeth 4628 posts Joined 11/04
19 Jul 2016

What's your Teradata version?
Release 14.10 added a little-known syntax using NORMALIZE over PERIODs:

SELECT CATEGORY, TYPE,
   -- split the period back into start and end
   BEGIN(pd), LAST(pd)
FROM
 (
   SELECT NORMALIZE 
      CATEGORY, TYPE, 
      -- create a period on-the-fly to be able to use NORMALIZE
      PERIOD(START_TS, END_TS + INTERVAL '1' SECOND) AS pd 
   FROM tab
 ) AS dt
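
Since the original post asked for an "updated (or new) table", the same query can also feed an INSERT. A minimal sketch, assuming a hypothetical target table NEW_TAB with the same four columns; it relies on LAST(pd) returning the period's ending bound minus one granule, which undoes the one-second adjustment and restores the original END_TS convention:

```sql
-- Sketch only: NEW_TAB is a hypothetical target with the same columns as tab.
INSERT INTO NEW_TAB (CATEGORY, TYPE, START_TS, END_TS)
SELECT CATEGORY, TYPE,
   BEGIN(pd),  -- START_TS of the first row in the consolidated run
   LAST(pd)    -- end bound minus one second, i.e. the original END_TS style
FROM
 (
   SELECT NORMALIZE
      CATEGORY, TYPE,
      PERIOD(START_TS, END_TS + INTERVAL '1' SECOND) AS pd
   FROM tab
 ) AS dt;
```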

 

Dieter

HF 3 posts Joined 07/16
19 Jul 2016

I have version 15.1, but your query worked perfectly. Thank you.

DPKKUMARARORA 2 posts Joined 11/15
06 Aug 2016

Please use the query below for TD 14 or earlier releases.

 

 

 

CREATE TABLE TEST.TEST_SCENARIO1
(
  CATEGORY VARCHAR(2),
  TYPE1    VARCHAR(2),
  START_TS TIMESTAMP(6),
  END_TS   TIMESTAMP(6)
)
PRIMARY INDEX (CATEGORY);

INSERT INTO TEST.TEST_SCENARIO1 VALUES ('A', 'X', '2016-01-01 00:00:00', '2016-01-01 23:59:59');
INSERT INTO TEST.TEST_SCENARIO1 VALUES ('A', 'X', '2016-02-01 00:00:00', '2016-02-01 23:59:59');
INSERT INTO TEST.TEST_SCENARIO1 VALUES ('A', 'Q', '2016-03-01 00:00:00', '2016-03-01 23:59:59');
INSERT INTO TEST.TEST_SCENARIO1 VALUES ('A', 'X', '2016-04-01 00:00:00', '2016-04-01 23:59:59');
INSERT INTO TEST.TEST_SCENARIO1 VALUES ('B', 'X', '2016-01-01 00:00:00', '2016-01-01 23:59:59');
INSERT INTO TEST.TEST_SCENARIO1 VALUES ('B', 'X', '2016-02-01 00:00:00', '2016-02-01 23:59:59');
INSERT INTO TEST.TEST_SCENARIO1 VALUES ('B', 'X', '2016-03-01 00:00:00', '2016-03-01 23:59:59');
INSERT INTO TEST.TEST_SCENARIO1 VALUES ('B', 'X', '2016-04-01 00:00:00', '2016-04-01 23:59:59');
INSERT INTO TEST.TEST_SCENARIO1 VALUES ('C', 'Q', '2016-01-01 00:00:00', '2016-01-01 23:59:59');
INSERT INTO TEST.TEST_SCENARIO1 VALUES ('C', 'X', '2016-02-01 00:00:00', '2016-02-01 23:59:59');
INSERT INTO TEST.TEST_SCENARIO1 VALUES ('C', 'Q', '2016-03-01 00:00:00', '2016-03-01 23:59:59');
INSERT INTO TEST.TEST_SCENARIO1 VALUES ('C', 'X', '2016-04-01 00:00:00', '2016-04-01 23:59:59');

SELECT * FROM TEST.TEST_SCENARIO1 ORDER BY 1, 3;

SELECT
   CATEGORY
  ,TYPE1
  ,START_TS
  ,CASE WHEN START_TS = UPDATE_DT THEN END_TS ELSE UPDATE_DT END AS NEW_END_DATE
  ,ROW_NUMBER() OVER (PARTITION BY CATEGORY ORDER BY CATEGORY, START_TS) AS "ROW"
FROM
 (
   SELECT
      CATEGORY
     ,TYPE1
     ,START_TS
     ,END_TS
     ,PREVIOUS
     ,MAX(START_TS) OVER (PARTITION BY CATEGORY, TYPE1
                          ORDER BY CATEGORY, START_TS
                          ROWS BETWEEN CURRENT ROW AND UNBOUNDED FOLLOWING) AS UPDATE_DT
   FROM
    (
      SELECT CATEGORY, TYPE1, START_TS, END_TS,
         COALESCE(MAX(TYPE1) OVER (PARTITION BY CATEGORY, TYPE1
                                   ORDER BY CATEGORY, START_TS
                                   ROWS BETWEEN 1 PRECEDING AND 1 PRECEDING), '~') AS PREVIOUS
      FROM TEST.TEST_SCENARIO1
    ) A
   WHERE COALESCE(TYPE1, '~') = COALESCE(PREVIOUS, '~')
 ) B
QUALIFY "ROW" = 1
ORDER BY CATEGORY, START_TS;
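
For readers on releases without NORMALIZE, the consolidation can also be done with the classic "gaps and islands" pattern: flag each row that starts a new run of the same TYPE1, turn the flags into run numbers with a running sum, then collapse each run with GROUP BY. A sketch against the TEST.TEST_SCENARIO1 table above (the MIN ... ROWS BETWEEN 1 PRECEDING AND 1 PRECEDING window stands in for LAG, which older Teradata releases lack):

```sql
SELECT
   CATEGORY, TYPE1,
   MIN(START_TS) AS START_TS,  -- start of the run
   MAX(END_TS)   AS END_TS     -- end of the run
FROM
 (
   -- number the runs: each new_run flag bumps the running sum
   SELECT
      CATEGORY, TYPE1, START_TS, END_TS,
      SUM(new_run) OVER (PARTITION BY CATEGORY
                         ORDER BY START_TS
                         ROWS UNBOUNDED PRECEDING) AS island
   FROM
    (
      -- flag rows whose TYPE1 differs from the previous row's TYPE1
      SELECT
         CATEGORY, TYPE1, START_TS, END_TS,
         CASE WHEN TYPE1 = MIN(TYPE1) OVER (PARTITION BY CATEGORY
                                            ORDER BY START_TS
                                            ROWS BETWEEN 1 PRECEDING
                                                     AND 1 PRECEDING)
              THEN 0 ELSE 1 END AS new_run
      FROM TEST.TEST_SCENARIO1
    ) AS flagged
 ) AS numbered
GROUP BY CATEGORY, TYPE1, island
ORDER BY CATEGORY, START_TS;
```

With the sample data this yields the three rows for "A", one row for "B", and four unchanged rows for "C" from the expected output above.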
