【Question Title】: Oracle: Optimizing twice self-join query
【Posted】: 2016-04-28 06:14:28
【Question Description】:

For the past two days I have been trying to make this query perform efficiently. I have learned a lot about Oracle index behaviour along the way, and at this point I am genuinely confused about what should work and what shouldn't.

Essentially, the query aggregates values and compares them against yesterday's and last week's values.

I have tried decomposing it, I have experimented with analytic queries, and I have changed the order of the index columns, but nothing seems to help. All my tests were done on a table with 500K rows; once I run the query against a table with 20 million rows it takes forever.

Any help is greatly appreciated.

I have edited the original post to make it easier for you to help me. :)

CREATE TABLE TABLE_1
(ORDER_LINE_ID NUMBER, OFFSET NUMBER, BREAK_ID NUMBER, ZONE NUMBER, NETWORK NUMBER, HOUR_OF_DAY NUMBER, START_TIME DATE, END_TIME DATE, SUCCESS NUMBER,
  CONSTRAINT "TABLE_1_PK" PRIMARY KEY (ORDER_LINE_ID, OFFSET, BREAK_ID, ZONE, HOUR_OF_DAY));

-- SUCCESS is already aggregated during the insert
-- These are last week's records
INSERT INTO TABLE_1 (ORDER_LINE_ID, OFFSET, BREAK_ID, ZONE, NETWORK, HOUR_OF_DAY, START_TIME, END_TIME, SUCCESS)
VALUES (1, 0, 1, 1, 1, 2016042001, TO_DATE('04/20/2016 00:00:00', 'MM/DD/YYYY HH24:MI:SS'), TO_DATE('04/20/2016 02:00:00', 'MM/DD/YYYY HH24:MI:SS'), 1);
INSERT INTO TABLE_1 (ORDER_LINE_ID, OFFSET, BREAK_ID, ZONE, NETWORK, HOUR_OF_DAY, START_TIME, END_TIME, SUCCESS)
VALUES (1, 30, 1, 1, 1, 2016042001, TO_DATE('04/20/2016 00:00:00', 'MM/DD/YYYY HH24:MI:SS'), TO_DATE('04/20/2016 02:00:00', 'MM/DD/YYYY HH24:MI:SS'), 2);
INSERT INTO TABLE_1 (ORDER_LINE_ID, OFFSET, BREAK_ID, ZONE, NETWORK, HOUR_OF_DAY, START_TIME, END_TIME, SUCCESS)
VALUES (2, 0, 1, 1, 1, 2016042001, TO_DATE('04/20/2016 00:00:00', 'MM/DD/YYYY HH24:MI:SS'), TO_DATE('04/20/2016 02:00:00', 'MM/DD/YYYY HH24:MI:SS'), 1);
INSERT INTO TABLE_1 (ORDER_LINE_ID, OFFSET, BREAK_ID, ZONE, NETWORK, HOUR_OF_DAY, START_TIME, END_TIME, SUCCESS)
VALUES (2, 30, 1, 1, 1, 2016042001, TO_DATE('04/20/2016 00:00:00', 'MM/DD/YYYY HH24:MI:SS'), TO_DATE('04/20/2016 02:00:00', 'MM/DD/YYYY HH24:MI:SS'), 1);

-- These are yesterday's records
INSERT INTO TABLE_1 (ORDER_LINE_ID, OFFSET, BREAK_ID, ZONE, NETWORK, HOUR_OF_DAY, START_TIME, END_TIME, SUCCESS)
VALUES (3, 0, 1, 1, 1, 2016042601, TO_DATE('04/26/2016 00:00:00', 'MM/DD/YYYY HH24:MI:SS'), TO_DATE('04/26/2016 02:00:00', 'MM/DD/YYYY HH24:MI:SS'), 1);
INSERT INTO TABLE_1 (ORDER_LINE_ID, OFFSET, BREAK_ID, ZONE, NETWORK, HOUR_OF_DAY, START_TIME, END_TIME, SUCCESS)
VALUES (3, 30, 1, 1, 1, 2016042601, TO_DATE('04/26/2016 00:00:00', 'MM/DD/YYYY HH24:MI:SS'), TO_DATE('04/26/2016 02:00:00', 'MM/DD/YYYY HH24:MI:SS'), 2);
INSERT INTO TABLE_1 (ORDER_LINE_ID, OFFSET, BREAK_ID, ZONE, NETWORK, HOUR_OF_DAY, START_TIME, END_TIME, SUCCESS)
VALUES (4, 0, 1, 1, 1, 2016042601, TO_DATE('04/26/2016 00:00:00', 'MM/DD/YYYY HH24:MI:SS'), TO_DATE('04/26/2016 02:00:00', 'MM/DD/YYYY HH24:MI:SS'), 1);
INSERT INTO TABLE_1 (ORDER_LINE_ID, OFFSET, BREAK_ID, ZONE, NETWORK, HOUR_OF_DAY, START_TIME, END_TIME, SUCCESS)
VALUES (4, 30, 1, 1, 1, 2016042601, TO_DATE('04/26/2016 00:00:00', 'MM/DD/YYYY HH24:MI:SS'), TO_DATE('04/26/2016 02:00:00', 'MM/DD/YYYY HH24:MI:SS'), 1);

-- These are today's records
INSERT INTO TABLE_1 (ORDER_LINE_ID, OFFSET, BREAK_ID, ZONE, NETWORK, HOUR_OF_DAY, START_TIME, END_TIME, SUCCESS)
VALUES (5, 0, 1, 1, 1, 2016042701, TO_DATE('04/27/2016 00:00:00', 'MM/DD/YYYY HH24:MI:SS'), TO_DATE('04/27/2016 02:00:00', 'MM/DD/YYYY HH24:MI:SS'), 1);
INSERT INTO TABLE_1 (ORDER_LINE_ID, OFFSET, BREAK_ID, ZONE, NETWORK, HOUR_OF_DAY, START_TIME, END_TIME, SUCCESS)
VALUES (5, 30, 1, 1, 1, 2016042701, TO_DATE('04/27/2016 00:00:00', 'MM/DD/YYYY HH24:MI:SS'), TO_DATE('04/27/2016 02:00:00', 'MM/DD/YYYY HH24:MI:SS'), 1);

-- Original twice join query
SELECT CURRENT_DAY.BREAK_ID, CURRENT_DAY.ORDER_LINE_ID, CURRENT_DAY.HOUR_OF_DAY, CURRENT_DAY.OFFSET, CURRENT_DAY.ZONE, CURRENT_DAY.NETWORK, CURRENT_DAY.START_TIME, CURRENT_DAY.END_TIME, SUM(SUCCESS), SUM(YESTERDAY_SUCCESS), SUM(LAST_WEEK_SUCCESS)
FROM TABLE_1 CURRENT_DAY
LEFT OUTER JOIN (
  SELECT SUM(SUCCESS) YESTERDAY_SUCCESS, ZONE, NETWORK, HOUR_OF_DAY, START_TIME, END_TIME FROM TABLE_1
  GROUP BY ZONE, NETWORK, HOUR_OF_DAY, START_TIME, END_TIME
) YESTERDAY
  ON YESTERDAY.START_TIME + 1 = CURRENT_DAY.START_TIME
  AND YESTERDAY.END_TIME + 1 = CURRENT_DAY.END_TIME
  AND YESTERDAY.HOUR_OF_DAY = CURRENT_DAY.HOUR_OF_DAY
  AND YESTERDAY.NETWORK = CURRENT_DAY.NETWORK
  AND YESTERDAY.ZONE = CURRENT_DAY.ZONE
LEFT OUTER JOIN (
  SELECT SUM(SUCCESS) LAST_WEEK_SUCCESS, ZONE, NETWORK, HOUR_OF_DAY, START_TIME, END_TIME FROM TABLE_1
  GROUP BY ZONE, NETWORK, HOUR_OF_DAY, START_TIME, END_TIME
  ) LAST_WEEK
  ON LAST_WEEK.START_TIME + 7 = CURRENT_DAY.START_TIME
  AND LAST_WEEK.END_TIME + 7 = CURRENT_DAY.END_TIME
  AND LAST_WEEK.HOUR_OF_DAY = CURRENT_DAY.HOUR_OF_DAY
  AND LAST_WEEK.NETWORK = CURRENT_DAY.NETWORK
  AND LAST_WEEK.ZONE = CURRENT_DAY.ZONE
GROUP BY CURRENT_DAY.BREAK_ID, CURRENT_DAY.ORDER_LINE_ID, CURRENT_DAY.HOUR_OF_DAY, CURRENT_DAY.OFFSET, CURRENT_DAY.ZONE, CURRENT_DAY.NETWORK, CURRENT_DAY.START_TIME, CURRENT_DAY.END_TIME;
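As a side note on what the two outer joins compute, here is a minimal Python sketch of the same lookup, with hypothetical in-memory rows rather than the actual schema. The sketch simplifies HOUR_OF_DAY to a plain hour, since a value like 2016042601 that embeds the date could never satisfy an equality join across different days:

```python
from datetime import datetime, timedelta

# Hypothetical in-memory rows standing in for TABLE_1:
# (zone, network, hour_of_day, start_time, success).
rows = [
    (1, 1, 1, datetime(2016, 4, 20), 1),  # last week
    (1, 1, 1, datetime(2016, 4, 20), 2),
    (1, 1, 1, datetime(2016, 4, 26), 1),  # yesterday
    (1, 1, 1, datetime(2016, 4, 27), 1),  # today
]

# Pre-aggregate SUCCESS per join key, as each inline view does.
sums = {}
for zone, network, hour, start, success in rows:
    key = (zone, network, hour, start)
    sums[key] = sums.get(key, 0) + success

def compare(zone, network, hour, start):
    """Look the key up as-is, shifted back 1 day, and shifted back 7 days."""
    return (
        sums.get((zone, network, hour, start), 0),
        sums.get((zone, network, hour, start - timedelta(days=1)), 0),
        sums.get((zone, network, hour, start - timedelta(days=7)), 0),
    )

print(compare(1, 1, 1, datetime(2016, 4, 27)))  # -> (1, 1, 3)
```

The cost problem is visible even in this toy form: the SQL builds the full `sums` dictionary twice (once per inline view) before the join can prune anything.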

-- Using Analytic Query (thank you to MT0)
SELECT BREAK_ID, ORDER_LINE_ID, HOUR_OF_DAY, OFFSET, ZONE, NETWORK, START_TIME, END_TIME, SUM(SUCCESS), SUM(YESTERDAY_SUCCESS), SUM(LAST_WEEK_SUCCESS)
FROM (
  SELECT T.*,
         SUM( SUCCESS )
           OVER ( PARTITION BY ZONE, NETWORK, HOUR_OF_DAY, TO_CHAR(START_TIME, 'HH24:MI:SS'), TO_CHAR(END_TIME, 'HH24:MI:SS')
                  ORDER BY START_TIME
                  RANGE BETWEEN INTERVAL '1' DAY PRECEDING AND INTERVAL '1' DAY PRECEDING
                ) AS YESTERDAY_SUCCESS,
         SUM( SUCCESS )
           OVER ( PARTITION BY ZONE, NETWORK, HOUR_OF_DAY, TO_CHAR(START_TIME, 'HH24:MI:SS'), TO_CHAR(END_TIME, 'HH24:MI:SS')
                  ORDER BY START_TIME
                  RANGE BETWEEN INTERVAL '7' DAY PRECEDING AND INTERVAL '7' DAY PRECEDING
                ) AS LAST_WEEK_SUCCESS
  FROM TABLE_1 T
) T1
WHERE SYSDATE - INTERVAL '12' HOUR <= START_TIME
AND START_TIME < SYSDATE - INTERVAL '1' HOUR
GROUP BY BREAK_ID, ORDER_LINE_ID, HOUR_OF_DAY, OFFSET, ZONE, NETWORK, START_TIME, END_TIME;

I have to say thank you for helping me get this question into shape; I hope it is easier to understand now. Everything works as expected, but the performance could use some tuning.

The table with 500K rows takes 1.8 seconds.

The table with 20 million rows takes 400 seconds.

I am also adding the execution plans Oracle produced, since this is where I am stuck tuning the performance.

-- using twice self join
--------------------------------------------------------------------------------------------------------------------------------------------------------------------
| Id  | Operation                       | Name                      | Starts | E-Rows | A-Rows |   A-Time   | Buffers | Reads  | Writes |  OMem |  1Mem |  O/1/M   |
--------------------------------------------------------------------------------------------------------------------------------------------------------------------
|   0 | SELECT STATEMENT                |                           |      1 |        |     50 |00:00:00.84 |   99875 |    217 |   1705 |       |       |          |
|   1 |  HASH GROUP BY                  |                           |      1 |   6711 |     50 |00:00:00.84 |   99875 |    217 |   1705 |  1616K|   995K|          |
|*  2 |   FILTER                        |                           |      1 |        |    119K|00:00:00.65 |   99875 |      0 |      0 |       |       |          |
|   3 |    NESTED LOOPS OUTER           |                           |      1 |     54M|    119K|00:00:00.64 |   99875 |      0 |      0 |       |       |          |
|*  4 |     HASH JOIN OUTER             |                           |      1 |    109 |    119K|00:00:00.52 |   99875 |      0 |      0 |    13M|  2093K|     1/0/0|
|   5 |      TABLE ACCESS BY INDEX ROWID| TABLE_1_IDX               |      1 |    109 |    119K|00:00:00.14 |   85908 |      0 |      0 |       |       |          |
|*  6 |       INDEX RANGE SCAN          | START_TIME_IDX            |      1 |    109 |    119K|00:00:00.02 |     320 |      0 |      0 |       |       |          |
|   7 |      VIEW                       |                           |      1 |   1250 |  29311 |00:00:00.23 |   13967 |      0 |      0 |       |       |          |
|   8 |       HASH GROUP BY             |                           |      1 |   1250 |  29311 |00:00:00.22 |   13967 |      0 |      0 |  3008K|  1094K|     1/0/0|
|*  9 |        FILTER                   |                           |      1 |        |  88627 |00:00:00.20 |   13967 |      0 |      0 |       |       |          |
|* 10 |         TABLE ACCESS FULL       | TABLE_1                   |      1 |   1250 |  88627 |00:00:00.19 |   13967 |      0 |      0 |       |       |          |
|  11 |     VIEW                        |                           |    119K|    499K|      0 |00:00:00.10 |       0 |      0 |      0 |       |       |          |
|  12 |      SORT GROUP BY              |                           |    119K|    499K|      0 |00:00:00.08 |       0 |      0 |      0 |  1024 |  1024 |     1/0/0|
|* 13 |       FILTER                    |                           |    119K|        |      0 |00:00:00.02 |       0 |      0 |      0 |       |       |          |
|  14 |        TABLE ACCESS FULL        | TABLE_1                   |      0 |    499K|      0 |00:00:00.01 |       0 |      0 |      0 |       |       |          |
--------------------------------------------------------------------------------------------------------------------------------------------------------------------

Predicate Information (identified by operation id):
---------------------------------------------------

   2 - filter(SYSDATE@!-17<SYSDATE@!-16)
   4 - access("YESTERDAY"."ZONE"="T1"."ZONE" AND "YESTERDAY"."NETWORK"="T1"."NETWORK" AND "YESTERDAY"."HOUR_OF_DAY"="T1"."HOUR_OF_DAY" 
              AND "T1"."END_TIME"=INTERNAL_FUNCTION("YESTERDAY"."END_TIME")+1 AND 
              "T1"."START_TIME"=INTERNAL_FUNCTION("YESTERDAY"."START_TIME")+1)
   6 - access("T1"."START_TIME">=SYSDATE@!-17 AND "T1"."START_TIME"<SYSDATE@!-16)
   9 - filter(SYSDATE@!-17<SYSDATE@!-16)
  10 - filter((INTERNAL_FUNCTION("START_TIME")+1>=SYSDATE@!-17 AND INTERNAL_FUNCTION("START_TIME")+1<SYSDATE@!-16))
  13 - filter(("YESTERDAY"."ZONE"="T1"."ZONE" AND "YESTERDAY"."NETWORK"="T1"."NETWORK" AND "YESTERDAY"."HOUR_OF_DAY"="T1"."HOUR_OF_DAY" 
              AND "T1"."END_TIME"=INTERNAL_FUNCTION("YESTERDAY"."END_TIME")+7 AND 
              "T1"."START_TIME"=INTERNAL_FUNCTION("YESTERDAY"."START_TIME")+7))

Here is the other execution plan, using the analytic query (thanks again to MT0):

-- using analytic query
-------------------------------------------------------------------------------------------------------------------------------
| Id  | Operation             | Name             | Starts | E-Rows | A-Rows |   A-Time   | Buffers |  OMem |  1Mem |  O/1/M   |
-------------------------------------------------------------------------------------------------------------------------------
|   0 | SELECT STATEMENT      |                  |      1 |        |     50 |00:00:01.51 |   13967 |       |       |          |
|   1 |  HASH GROUP BY        |                  |      1 |    499K|     50 |00:00:01.51 |   13967 |    98M|  7788K|          |
|*  2 |   VIEW                |                  |      1 |    499K|    119K|00:00:01.15 |   13967 |       |       |          |
|   3 |    WINDOW SORT        |                  |      1 |    499K|    499K|00:00:01.43 |   13967 |    66M|  2823K|     1/0/0|
|*  4 |     FILTER            |                  |      1 |        |    499K|00:00:00.16 |   13967 |       |       |          |
|   5 |      TABLE ACCESS FULL| TABLE_1          |      1 |    499K|    499K|00:00:00.12 |   13967 |       |       |          |
-------------------------------------------------------------------------------------------------------------------------------

Predicate Information (identified by operation id):
---------------------------------------------------

   2 - filter(("T1"."START_TIME">=SYSDATE@!-INTERVAL'+17 00:00:00' DAY(2) TO SECOND(0) AND 
              "T1"."START_TIME"<SYSDATE@!-INTERVAL'+16 00:00:00' DAY(2) TO SECOND(0)))
   4 - filter(SYSDATE@!-INTERVAL'+17 00:00:00' DAY(2) TO SECOND(0)<SYSDATE@!-INTERVAL'+16 00:00:00' DAY(2) TO 
              SECOND(0))

As you can see, I added an index on START_TIME. The self-join query benefits from it, but the estimated rows do not match the actuals. The analytic query simply decides the index is irrelevant. Any ideas, points of reference, or help would be greatly appreciated. Thanks in advance.

【Question Comments】:

  • Could you post some sample data to help explain what you are trying to do? Are field_6 and field_7 dates without a time component, or do they carry a time component with multiple values in the same group?
  • You are outer-joining on field_6 and field_7 - how do you know that yesterday's and last week's rows have the same values for fields 1, 2 and 3 (which you are grouping by)?
  • MT0, thanks for the reply. I have revised the query to show more realistic field names, and I have specified the types. Thanks for your help.
  • INSERT INTO table_1 VALUES ( 1, 1, 1, 1, 0, TIMESTAMP '2016-04-26 01:23:45', TIMESTAMP '2016-04-26 12:34:56' ) and INSERT INTO table_1 VALUES ( 2, 2, 2, 1, 0, TIMESTAMP '2016-04-25 01:23:45', TIMESTAMP '2016-04-25 12:34:56' ) - these two rows would be joined (since their start and end times are exactly 1 day apart), yet they have different order_line, zone and network values. Are you sure that is the behaviour you are after?
  • Are you sure order_line, zone, network is the primary key of this table? It does not appear to be unique - look at your insert statements.

Tags: oracle query-optimization self-join


【Solution 1】:

It is not clear why you only want to join rows where today's and yesterday's (or last week's) times are exactly the same, but if you just want rows between specific times then you can eliminate all the self-joins and do:

SELECT order_line,
       zone,
       network,
       sum(
        CASE WHEN SYSDATE - INTERVAL '12' HOUR <= start_time
             AND  start_time < SYSDATE - INTERVAL '1' HOUR
             THEN success
             END
       ) AS total_successes_today,
       sum(
        CASE WHEN SYSDATE - INTERVAL '12' HOUR <= start_time
             AND  start_time < SYSDATE - INTERVAL '1' HOUR
             THEN error
             END
       ) AS total_errors_today,
       sum(
        CASE WHEN SYSDATE - INTERVAL '36' HOUR <= start_time
             AND  start_time < SYSDATE - INTERVAL '25' HOUR
             THEN success
             END
       ) AS total_successes_yesterday,
       sum(
        CASE WHEN SYSDATE - INTERVAL '180' HOUR <= start_time
             AND  start_time < SYSDATE - INTERVAL '169' HOUR
             THEN success
             END
       ) AS total_successes_last_week
FROM   table_1
WHERE  (    SYSDATE - INTERVAL '12' HOUR <= start_time
        AND start_time < SYSDATE - INTERVAL '1' HOUR ) -- today
OR     (    SYSDATE - INTERVAL '36' HOUR <= start_time
        AND start_time < SYSDATE - INTERVAL '25' HOUR ) -- yesterday = today + 24 hours
OR     (    SYSDATE - INTERVAL '180' HOUR <= start_time
        AND start_time < SYSDATE - INTERVAL '169' HOUR ) -- last week = today + 7*24 hours
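Because the three time windows never overlap, the CASE-based query above is a single pass over the table. A minimal Python sketch of that conditional-aggregation idea, with hypothetical rows and a fixed stand-in for SYSDATE:

```python
from datetime import datetime, timedelta

now = datetime(2016, 4, 27, 12, 0, 0)  # stand-in for SYSDATE

# Hypothetical (start_time, success) rows.
rows = [
    (now - timedelta(hours=2), 5),    # falls in today's window
    (now - timedelta(hours=26), 3),   # falls in yesterday's window
    (now - timedelta(hours=170), 7),  # falls in last week's window
    (now - timedelta(hours=50), 9),   # matches no window, ignored
]

# Each window is (lower, upper) offsets in hours back from "now",
# mirroring the 12/1, 36/25 and 180/169 hour bounds in the WHERE clause.
windows = {"today": (12, 1), "yesterday": (36, 25), "last_week": (180, 169)}

totals = {name: 0 for name in windows}
for start_time, success in rows:   # one pass over the table
    for name, (lo, hi) in windows.items():
        if now - timedelta(hours=lo) <= start_time < now - timedelta(hours=hi):
            totals[name] += success  # SUM(CASE WHEN ... THEN success END)

print(totals)  # -> {'today': 5, 'yesterday': 3, 'last_week': 7}
```

This is why the rewrite scales: each row is classified into at most one bucket as it streams by, instead of being re-aggregated and re-joined twice.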

However, if you do want to keep the join on the start and end times, then you can use an analytic query:

SELECT order_line,
       zone,
       network,
       SUM( success ),
       SUM( error ),
       SUM( yesterday_success ),
       SUM( last_week_success )
FROM   (
  SELECT t.*,
         SUM( success )
           OVER ( PARTITION BY id,
                               TO_CHAR( start_time, 'HH24:MI:SS' ),
                               TO_CHAR( end_time, 'HH24:MI:SS' )
                  ORDER BY start_time
                  RANGE BETWEEN INTERVAL '1' DAY PRECEDING AND INTERVAL '1' DAY PRECEDING
                ) AS yesterday_success,
         SUM( success )
           OVER ( PARTITION BY id,
                               TO_CHAR( start_time, 'HH24:MI:SS' ),
                               TO_CHAR( end_time, 'HH24:MI:SS' )
                  ORDER BY start_time
                  RANGE BETWEEN INTERVAL '7' DAY PRECEDING AND INTERVAL '7' DAY PRECEDING
                ) AS last_week_success
  FROM   TABLE_1 t
)
WHERE  SYSDATE - INTERVAL '12' HOUR <= start_time 
AND    start_time < SYSDATE - INTERVAL '1' HOUR
GROUP BY
       order_line,
       zone,
       network
ORDER BY
       order_line,
       zone,
       network

You can test whether function-based indexes on TO_CHAR( start_time, 'HH24:MI:SS' ) and TO_CHAR( end_time, 'HH24:MI:SS' ) improve the speed.
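To make the RANGE BETWEEN ... PRECEDING AND ... PRECEDING frame concrete, here is a minimal Python sketch with hypothetical rows; the partition key is reduced to a single id standing in for zone/network/time-of-day. For each row it sums only the partition rows whose start_time lies exactly N days earlier, and yields NULL (None) when that frame is empty:

```python
from datetime import datetime, timedelta

# Hypothetical rows: (partition_id, start_time, success); all one partition.
rows = [
    (1, datetime(2016, 4, 20), 3),  # last week
    (1, datetime(2016, 4, 26), 2),  # yesterday
    (1, datetime(2016, 4, 27), 1),  # today
]

def window_sum(row, days):
    """RANGE BETWEEN INTERVAL 'n' DAY PRECEDING AND INTERVAL 'n' DAY PRECEDING:
    sum SUCCESS over partition rows exactly `days` days before this row."""
    pid, start, _ = row
    target = start - timedelta(days=days)
    matches = [s for i, t, s in rows if i == pid and t == target]
    return sum(matches) if matches else None  # empty frame -> NULL

today = rows[2]
print(window_sum(today, 1), window_sum(today, 7))  # -> 2 3
```

Partitioning on the TO_CHAR time-of-day strings in the SQL plays the role of `pid` here: it keeps rows with the same clock time together so the 1-day and 7-day offsets land on the matching slot.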

【Discussion】:

  • MT0, you are a lifesaver once again. I think this is what I have been looking for. Let me play with your sample and see if I can make it work.
  • MT0, you're the man!!! The query returns exactly what I need, and even though it isn't blazingly fast, the speed is much more consistent. On the table with 500K rows it executes in a consistent 1.8 seconds. The self-join took 10 seconds on the initial query and, once buffered, bounced between 0.5 and 1.25 seconds. I am now testing against the table with 20 million rows. I will still revise the example to show the different industry segments, and hopefully we can tune this query further. I would like to mark this as the answer, but I guess I don't have enough 'brownie points' in this forum yet. :)