Odd behaviour -- Index scan vs. seq. scan

  • Carlos Moreno

    Odd behaviour -- Index scan vs. seq. scan


    I can't find a reasonable explanation for this.

    I have a table game, with primary key gameid (an int).

    If I use a where involving gameid and <, or >, or <=,
    or >=, then I get a sequential scan. If I use =, then
    of course I get an Index scan.

    More surprising to me is the fact that using BETWEEN,
    I get an Index scan!!

    The following is a copy-n-paste of the EXPLAIN outputs
    (with the useless lines removed):

    explain delete from game where gameid = 1000;
    Index Scan using game_pkey on game (cost=0.00..3.14 rows=1 width=6)

    explain delete from game where gameid < 1000;
    Seq Scan on game (cost=0.00..4779.50 rows=200420 width=6)

    explain delete from game where gameid between 1000 and 2000;
    Index Scan using game_pkey on game (cost=0.00..3.15 rows=1 width=6)


    How's that possible? Is it purposely done like this, or
    is it a bug? (BTW, Postgres version is 7.2.3)

    Thanks,

    Carlos
    --





  • Adam Kavan

    #2
    Re: Odd behaviour -- Index scan vs. seq. scan

    >
    > explain delete from game where gameid = 1000;
    > Index Scan using game_pkey on game (cost=0.00..3.14 rows=1 width=6)
    >
    > explain delete from game where gameid < 1000;
    > Seq Scan on game (cost=0.00..4779.50 rows=200420 width=6)
    >
    > explain delete from game where gameid between 1000 and 2000;
    > Index Scan using game_pkey on game (cost=0.00..3.15 rows=1 width=6)
    >
    >
    >How's that possible? Is it purposely done like this, or
    >is it a bug? (BTW, Postgres version is 7.2.3)


    Postgres thinks that for the = line there will only be 1 row, so it uses an
    index scan. Same thing for the BETWEEN. However, it thinks that there are
    200420 rows below 1000 and decides a seq scan would be faster. You can run
    EXPLAIN ANALYZE to see if its guesses are correct. You can also try SET
    enable_seqscan = FALSE; to see if it is faster doing an index scan. If it
    is faster to do an index scan, edit your postgresql.conf file and lower the
    cost for a random tuple, etc.
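
    Spelled out as a rough sketch (the postgresql.conf knob in question is
    random_page_cost; the value 2 below is only an illustrative guess, and
    since EXPLAIN ANALYZE really executes the statement, the DELETE is wrapped
    in a transaction and rolled back):

    -- Compare the planner's estimates against the real row counts:
    BEGIN;
    EXPLAIN ANALYZE DELETE FROM game WHERE gameid < 1000;
    ROLLBACK;

    -- Forbid sequential scans for this session and repeat the test:
    SET enable_seqscan = FALSE;
    BEGIN;
    EXPLAIN ANALYZE DELETE FROM game WHERE gameid < 1000;
    ROLLBACK;
    SET enable_seqscan = TRUE;

    -- If the index scan turns out to be faster, a lower random-access cost
    -- (the stock default is 4) makes the planner favour indexes more often:
    SET random_page_cost = 2;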

    --- Adam Kavan
    --- akavan@cox.net





    • Christopher Browne

      #3
      Re: Odd behaviour -- Index scan vs. seq. scan

      A long time ago, in a galaxy far, far away, moreno@mochima.com (Carlos Moreno) wrote:
      > I can't find a reasonable explanation for this.
      >
      > I have a table game, with primary key gameid (an int).
      >
      > If I use a where involving gameid and <, or >, or <=,
      > or >=, then I get a sequential scan. If I use =, then
      > of course I get an Index scan.
      >
      > More surprising to me is the fact that using BETWEEN,
      > I get an Index scan!!
      >
      > The following is a copy-n-paste of the EXPLAIN outputs
      > (with the useless lines removed):
      >
      > explain delete from game where gameid = 1000;
      > Index Scan using game_pkey on game (cost=0.00..3.14 rows=1 width=6)
      >
      > explain delete from game where gameid < 1000;
      > Seq Scan on game (cost=0.00..4779.50 rows=200420 width=6)
      >
      > explain delete from game where gameid between 1000 and 2000;
      > Index Scan using game_pkey on game (cost=0.00..3.15 rows=1 width=6)
      >
      >
      > How's that possible? Is it purposely done like this, or
      > is it a bug? (BTW, Postgres version is 7.2.3)

      It would appear that the statistics may be out of date, such that the
      second query is being estimated wrongly.

      #1 and #3 are pretty clear...

      - In #1, it's using the index, correctly estimating that there are
      only a few rows with "gameid = 1000"

      - In #3, it's using the index, correctly estimating that there are
      few rows with gameid between 1000 and 2000.

      It seems surprising that the optimizer is estimating that there are
      200420 rows with gameid < 1000.

      Is it possible that you did an ANALYZE a long while back, back when
      you had an enormous number of rows with gameid < 1000?

      Try running ANALYZE again on the table, and see if the estimates
      change.
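
      A minimal sketch of that re-check (the EXPLAIN simply repeats the
      earlier query so the estimates can be compared):

      -- Refresh the planner's statistics for the table:
      ANALYZE game;

      -- Then see whether the row estimate for the range query changes:
      EXPLAIN DELETE FROM game WHERE gameid < 1000;
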
      --
      select 'aa454' || '@' || 'freenet.carleton.ca';

      Why isn't phonetic spelled the way it sounds?


      • scott.marlowe

        #4
        Re: Odd behaviour -- Index scan vs. seq. scan

        On Mon, 15 Sep 2003, Adam Kavan wrote:
        >
        > >
        > > explain delete from game where gameid = 1000;
        > > Index Scan using game_pkey on game (cost=0.00..3.14 rows=1 width=6)
        > >
        > > explain delete from game where gameid < 1000;
        > > Seq Scan on game (cost=0.00..4779.50 rows=200420 width=6)
        > >
        > > explain delete from game where gameid between 1000 and 2000;
        > > Index Scan using game_pkey on game (cost=0.00..3.15 rows=1 width=6)
        > >
        > >
        > >How's that possible? Is it purposely done like this, or
        > >is it a bug? (BTW, Postgres version is 7.2.3)
        >
        >
        > Postgres thinks that for the = line there will only be 1 row, so it uses an
        > index scan. Same thing for the BETWEEN. However, it thinks that there are
        > 200420 rows below 1000 and decides a seq scan would be faster. You can run
        > EXPLAIN ANALYZE to see if its guesses are correct. You can also try SET
        > enable_seqscan = FALSE; to see if it is faster doing an index scan. If it
        > is faster to do an index scan, edit your postgresql.conf file and lower the
        > cost for a random tuple, etc.

        Before you do that, you might wanna issue these commands:

        alter table game alter column gameid set statistics 100;
        analyze game;

        and see what you get.
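
        To see what that changed, something along these lines could be run
        afterwards (just a sketch; pg_stats is the standard view exposing the
        statistics that ANALYZE gathers):

        -- Re-check the planner's row estimate with the finer statistics:
        EXPLAIN DELETE FROM game WHERE gameid < 1000;

        -- Inspect the histogram gathered for the column:
        SELECT histogram_bounds
          FROM pg_stats
         WHERE tablename = 'game' AND attname = 'gameid';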


