IIF vs WHERE in a GROUP BY query: which is faster?

  • hvsummer
    New Member
    • Aug 2015
    • 215

    IIF vs WHERE in a GROUP BY query: which is faster?

    I want to know what will be faster in this situation:

    I have Table1 = {ID, Type, BDate, Quantity}

    There are 2 types: {A, B}

    If I want to know the sum for one type, say A, I can use this:

    Code:
    SELECT ID, Type, Month(BDate) AS M, Sum(Quantity) AS SumQty
    FROM Table1
    WHERE Type Like "A"
    GROUP BY ID, Type, Month(BDate);
    But I can also use IIf to do that:
    Code:
    SELECT ID, Month(BDate) AS M, Sum(IIf(Type Like "A", Quantity, 0)) AS SumQty
    FROM Table1
    GROUP BY ID, Month(BDate);
    My question is: which method will perform better, IIf or WHERE?
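
    A side note on why the IIf form is sometimes kept anyway (a minimal sketch, assuming the same Table1 layout as above): it can return the totals for both types in a single pass over the table, where the WHERE version needs one query per type or a UNION:

    Code:
    SELECT ID, Month(BDate) AS M,
           Sum(IIf(Type = "A", Quantity, 0)) AS SumA,
           Sum(IIf(Type = "B", Quantity, 0)) AS SumB
    FROM Table1
    GROUP BY ID, Month(BDate);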
  • Rabbit
    Recognized Expert MVP
    • Jan 2007
    • 12517

    #2
    The true test is to run both multiple times and compare the average run times. But you will most likely find that WHERE is faster.

    Comment

    • hvsummer
      New Member
      • Aug 2015
      • 215

      #3
      Omg, I used a lot of Sum(IIf()) expressions in my queries... ==
      Do I have to change all of them over to the WHERE method? :(

      Comment

      • zmbd
        Recognized Expert Moderator Expert
        • Mar 2012
        • 5501

        #4
        Depends... is the data returned within an acceptable timeframe?
        If so, then leave it alone.
        If not, then change things around.

        One will find that WHERE clauses tend to work faster (though not noticeably on small data sets).
        IIf(), [field] IS NULL, and other SQL-level expressions also tend to work faster than the VBA Nz() or any domain function such as DCount() or DSum(); once again, on small datasets the performance hit isn't likely to be noticed by the user. In fact, I have an early DB that uses domain functions on over 5000 records and the return is under 5 seconds... it used to be less than 2; however, the back-end is on the server now.

        In this case, I think this will have to be a judgment call on your part, unless Rabbit, NeoPa, or one of the other more experienced DBA/DDs has better insight.
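
        For example (a minimal sketch, assuming the Table1 layout from post #1): a per-row domain call such as DSum("Quantity", "Table1", "Type='A'") re-queries the table every time it is evaluated, whereas a single aggregate query lets the engine do the same work in one pass:

        Code:
        SELECT ID, Sum(Quantity) AS SumQty
        FROM Table1
        WHERE Type = "A"
        GROUP BY ID;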

        Comment

        • hvsummer
          New Member
          • Aug 2015
          • 215

          #5
          Well, my DB hit 1.2 GB a couple of days ago: 1.5 million rows in the main table and somewhere around 200k rows across the rest of the tables.

          Queries return data in 30-60 s. I never use domain functions (except inside VBA functions); queries will be coded in pure SQL.

          Comment

          • zmbd
            Recognized Expert Moderator Expert
            • Mar 2012
            • 5501

            #6
            At that size, you need to be absolutely sure that your data is properly normalized.

            Comment

            • hvsummer
              New Member
              • Aug 2015
              • 215

              #7
              If I can't be sure, what can I do now that it has already reached that size? Hahaha.

              Btw, my data will not be normalized, for sure. I get raw data from another online database and work on it ("raw data" meaning it contains everything in detail, including information that can be duplicated across other tables).
              Any table connected to this main table has a *-to-many relationship (1-to-M or M-to-M).

              Comment

              • zmbd
                Recognized Expert Moderator Expert
                • Mar 2012
                • 5501

                #8
                I would suspect, therefore, that your "raw data" is the root source of the majority of your lag and database size; thus, I'd be looking to index the data, filter using the WHERE or other criteria clauses, and avoid any domain/VBA functions as much as possible.
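
                A minimal sketch of the indexing side (the index name idxType is made up, assuming the Table1 from post #1): Jet/ACE SQL can create the index directly, after which an equality or Like filter on Type can be resolved with an index lookup rather than a full table scan:

                Code:
                CREATE INDEX idxType ON Table1 (Type);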

                Maybe one of the other Experts here will have better solutions.

                As for normalization, I would be looking at some VBA coding to pull that data apart. I've done that in the past; it's a real pain, but worth the effort.
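
                Some of that pulling-apart can also be sketched in plain SQL (the Supplier field and tblSupplier name here are hypothetical): a make-table query extracts the distinct duplicated values into a lookup table, after which you would add a primary key and repoint the main table at it:

                Code:
                SELECT DISTINCT Supplier INTO tblSupplier
                FROM Table1;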

                Comment

                • MikeTheBike
                  Recognized Expert Contributor
                  • Jun 2007
                  • 640

                  #9
                  Hi

                  Just a small question on the topic of query speed with large datasets:
                  I figured (assumed) WHERE would be quicker than IIf(),
                  but would this
                  Code:
                  Where Type = "A"
                  be quicker than
                  Code:
                  Where Type like "A"
                  ???

                  As it would seem that in this case the only possible options are 'A' or 'B'.



                  MTB

                  Comment

                  • zmbd
                    Recognized Expert Moderator Expert
                    • Mar 2012
                    • 5501

                    #10
                    Agreed that the equality would more than likely be faster; however, you lose wildcards such as:
                    Code:
                    WHERE LastName Like "[A-D]*";

                    Comment

                    • hvsummer
                      New Member
                      • Aug 2015
                      • 215

                      #11
                      Ok 0.0 I think I know how to pull that data apart with VBA... but if I do that, I'll need something bigger than the 2 GB limit; already at 1.2 GB, I don't have enough space to do that with VBA.

                      I'll consider this later; right now I'm working on improving some new complex queries that I haven't done before.

                      Comment

                      • zmbd
                        Recognized Expert Moderator Expert
                        • Mar 2012
                        • 5501

                        #12
                        If your data is duplicated across tables/records, normalization should reduce the overall size.

                        Comment

                        • hvsummer
                          New Member
                          • Aug 2015
                          • 215

                          #13
                          But the memory buffer to transfer data between the old and new tables can't handle 1.3 GB. The only way to do this is to split the database, then manipulate it within the VBA of the back-end DB...

                          Comment

                          • zmbd
                            Recognized Expert Moderator Expert
                            • Mar 2012
                            • 5501

                            #14
                            Split databases are a very common occurrence.
                            You can search here on Bytes and find hundreds of threads covering various aspects of how to connect and parse records. :)

                            Comment

                            • strive4peace
                              Recognized Expert New Member
                              • Jun 2014
                              • 39

                              #15
                              For speed, another thing to realize is that inequality comparisons are FASTER than equality comparisons.

                              Also, on IIf... even if the condition is true, the false argument is still evaluated.
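
                              A common place this bites (a hedged sketch; Total and Qty are assumed field names): a division guarded by IIf, such as IIf(Qty = 0, 0, Total / Qty), can still error on zero rows when both branches get evaluated, as VBA's IIf always does. The defensive pattern is to move the IIf into the divisor, since dividing by Null yields Null rather than an error, regardless of evaluation order:

                              Code:
                              SELECT Total / IIf(Qty = 0, Null, Qty) AS UnitPrice
                              FROM Table1;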
                              Last edited by zmbd; Nov 15 '15, 12:25 AM. Reason: [z{merged related posts and edited text per S4P}]

                              Comment
