For vs. For Each

  • Guest

    #31
    Re: For vs. For Each

    "mikeb" wrote:[color=blue]
    > I think you're confusing the terminology with an enumeration, such as
    > declared by VB.NET's Enum statement. A completely different animal.[/color]

    That's exactly what I was doing. Thank you for the clarification, Mike &
    Terry.

    Eric



    • JohnLiu

      #32
      Re: For vs. For Each

      I disagree that the foreach construct being read-only is a bad thing.
      Not to completely disregard Alvin's gripe, but here's my point of view.

      Typically, you use foreach to iterate through the collection. Adding
      or removing items from the collection during this time puts the
      collection into a funny state that others may not be ready to deal
      with; what if you have multiple enumerators? (This is very common in a
      nested foreach scenario, and yes, that'd be O(n^2).) I believe in the
      C++ STL you can remove the item currently being iterated, but that
      opens a can of worms: you always have to worry whether your current
      item has been deleted by another thread.

      Also, I strongly disagree that the for construct is faster than the
      foreach construct. That is only true when you are talking about
      array-backed collection types. For a linked-list implementation, the
      foreach construct (O(1) per element) would be faster than the for
      construct (O(n) per indexed access) for iterating through a
      collection. The fact that the .NET Framework collections are almost
      solely based on array types may make the statement correct 90%+ of the
      time, but it is not a correct statement to make in general. And
      besides, I'm waiting for generics!
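      To put numbers on that, here is a sketch; `list` stands in for a
      hypothetical linked-list collection (there is no built-in one in .NET
      1.1) and `Process` for whatever work the loop body does:

      ```vbnet
      ' Indexed access on a linked list walks from the head each time:
      ' Item(i) costs O(i), so the whole loop is O(n^2).
      Dim i As Integer
      For i = 0 To list.Count - 1
          Process(list.Item(i))
      Next

      ' The enumerator just follows each node's pointer to the next node:
      ' O(1) per step, O(n) for the whole loop.
      Dim o As Object
      For Each o In list
          Process(o)
      Next
      ```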

      Typically, hashtables are iterated over for the entire set of
      key/value pairs. Given that accessing a value in a hashtable is O(1),
      if you need to iterate through a hashtable it's easiest to iterate
      through the keys (O(n)) and grab each value as you go (O(1) apiece).
      But I can't think of why anyone would iterate through a hashtable
      except maybe as a debug step to inspect its contents.
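      In VB.NET terms, that walk looks like this (a small self-contained
      sketch with made-up contents):

      ```vbnet
      Dim table As New Hashtable()
      table("one") = 1
      table("two") = 2

      ' O(n) walk over the keys; each value lookup is O(1).
      Dim key As Object
      For Each key In table.Keys
          Console.WriteLine("{0} = {1}", key, table(key))
      Next
      ```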

      jliu - www.ssw.com.au - johnliu.net


      • Jonathan Bailey

        #33
        Re: For vs. For Each

        "JohnLiu" <johnnliu@gmail .com> wrote in message
        news:37c542a6.0 408120003.14e4e 17f@posting.goo gle.com...[color=blue]
        > I disagree that foreach-construct being readonly is a bad thing. Not
        > to completely disregard Alvin's gripe, but here's my point of view.
        >
        > Typically, you use foreach to iterate throught the collection.
        > Adding/Removing items from the collection during this time puts the
        > collection into a funny mode that others may not be ready to deal
        > with, what if you have multiple emunerators? (this is very common for
        > a nested foreach scenario, yes, that'd be O(n^2) ). I believe in C++
        > STL libraries you can remove current iterated item, but that opens a
        > can of worms, you always have to worry whether your current item has
        > been deleted by another thread.
        >
        > Also, I highly disagree that the for-construct is faster than
        > foreach-construct. That is only true when you are talking about ARRAY
        > collection types. In a linked-list implementation, foreach-construct
        > O(1) would be faster than for-construct O(n) for iterating through a
        > collection. The fact that the .NET framework collections are almost
        > solely based on array types may make the statement correct in 90%+ of
        > the time, but it is not a correct statement to make generally. And,
        > besides, I wait for generics!
        >
        > Typically, hashtables are iterated for the entire key/value pairs,
        > given that accessing the value of a hashtable is O(1), if you need to
        > iterate through a hastable, it's easier to iterate through the keys
        > O(n), and grabbing the value as you go O(1). But I can't think of why
        > anyone would be iterating through a hashtable except may be as a debug
        > step to see the contents of the hashtable.
        >
        > jliu - www.ssw.com.au - johnliu.net[/color]

        With a For n = start To end Step loop you would still have to worry
        whether the current element has been deleted by this or another
        thread, since start, end, and step are only evaluated once, at loop
        entry.
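        A sketch of what can go wrong with the index loop (the list contents
        are made up):

        ```vbnet
        Dim items As New ArrayList()
        items.Add("a")
        items.Add("b")
        items.Add("c")

        ' The end value is evaluated once, so it goes stale after a removal.
        Dim i As Integer
        For i = 0 To items.Count - 1
            If "b".Equals(items(i)) Then
                items.RemoveAt(i)   ' count is now 2, but the loop still runs to 2
            End If
        Next                        ' items(2) throws ArgumentOutOfRangeException
        ```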

        --
        Jonathan Bailey.



        • David

          #34
          Re: For vs. For Each

          On 2004-08-11, Alvin Bruney [MVP] <> wrote:
          > I'll chime in here with my longtime gripe.
          >
          > The foreach implementation is flawed because the container is
          > marked as readonly during the iteration. This is a crime in my
          > opinion because it is *normal* to effect a change on the container
          > while iterating, especially from a VB point of view.

          I see your point, but look at it from the implementers' point of view.

          Making the container read-only allows for very efficient implementations
          of the enumerator, and also makes writing new enumerators fairly simple.
          Also, it eliminates a real ambiguity in the For Each statement: does
          foreach iterate over the original collection, or over the entire
          collection as it changes over time?

          Dim i As Integer
          For Each o As Object In MyCollection
              i += 1
              If i = 3 Then
                  MyCollection.Insert(0, New Object())
                  MyCollection.Add(New Object())
              End If
          Next

          What would the iteration be in this case? Should both new objects be
          iterated, or neither? Or just one? You could think of some reasonable
          rules to apply to arrays, but what about things like hashes where
          position doesn't have a fixed meaning? And how is the Enumerator
          supposed to keep track of what's happening to the collection? Do we add
          some kind of event to the IEnumerable interface? If so, that could turn
          into a lot of overhead since the enumerator has to check for changes on
          each iteration.

          For efficiency's sake, maybe we could have two different enumeration
          types, one for mutable containers and one for read-only, but then not
          only are you complicating the class library tremendously, but calling
          conventions can get strange (since only one of them can use For Each).
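          There is also a middle ground that needs no library change:
          enumerate a snapshot and mutate the live collection. A sketch,
          assuming MyCollection is an ArrayList:

          ```vbnet
          Dim MyCollection As New ArrayList()
          MyCollection.Add(New Object())
          MyCollection.Add(New Object())

          ' ToArray() copies the references, so the live list is free to change.
          Dim o As Object
          For Each o In MyCollection.ToArray()
              MyCollection.Remove(o)
          Next
          ```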

          David


          • Cor Ligthert

            #35
            Re: For vs. For Each

            Can you give some sample applications where this statement of
            yours is true?

            > Although due to the nature of loops, they oftentimes fall into
            > the 20 percent of code that consumes 80 percent of the time.

            In my opinion it is definitely not true for applications that do,
            for instance, screen painting and/or data processing.

            It is in my opinion definitely true for image-processing
            applications that do not use the GDI+ encoding.

            However, that is surely not the majority of applications.

            So I am curious: in what other types of applications can
            standalone loops consume 80% of the time?

            Just my thought,

            Cor



            • Alvin Bruney [MVP]

              #36
              Re: For vs. For Each

              > Making the container read-only allows for very efficient
              > implementations of the enumerator, and also makes writing new
              > enumerators fairly simple. Also, it eliminates a real ambiguity
              > in the For Each statement: does foreach iterate over the
              > original collection, or over the entire collection as it
              > changes over time?

              I don't disagree with that. Very good point indeed. But the
              current approach makes it impossible to perform simple tasks
              inherent in UI programming (like removing multiselects in a
              listbox, for instance). Where such simple tasks are overly
              complicated, I believe the design should be reviewed.
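              For the record, the usual workaround for the listbox case is a
              plain For run backwards, so each removal never invalidates the
              indices still to be visited (ListBox1 is assumed to be a
              Windows Forms ListBox on the form):

              ```vbnet
              Dim i As Integer
              For i = ListBox1.Items.Count - 1 To 0 Step -1
                  If ListBox1.GetSelected(i) Then
                      ListBox1.Items.RemoveAt(i)
                  End If
              Next
              ```

              It works, but it is exactly the kind of ceremony the complaint
              above is about: the obvious For Each over SelectedItems throws.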
              > For efficiency's sake, maybe we could have two different
              > enumeration types, one for mutable containers and one for
              > read-only, but then not only are you complicating the class
              > library tremendously,

              I think that is a reasonable approach. It would just be another
              way to iterate a container, and it shouldn't complicate matters,
              since it could be made to appear as an overload.

              > but calling conventions can get strange (since only one of them
              > can use For Each).

              That's really a design issue which needs to be hashed out in a way to make
              this approach feasible.

              --
              Regards,
              Alvin Bruney
              [ASP.NET MVP http://mvp.support.microsoft.com/default.aspx]
              Got tidbits? Get it here... http://tinyurl.com/27cok
              "David" <dfoster@woofix .local.dom> wrote in message
              news:slrnchmj96 .j7j.dfoster@wo ofix.local.dom. ..[color=blue]
              > On 2004-08-11, Alvin Bruney [MVP] <> wrote:[color=green]
              >> I'll chime in here with my longtime gripe.
              >>
              >> The foreach implementation is flawed because the container is marked as
              >> readonly during the iteration. This is a crime in my opinion because it
              >> is
              >> *normal to effect a change on the container while iterating especially
              >> from
              >> a vb point of view.[/color]
              >
              > I see your point, but look at it from the implementers' point of view.
              >
              > Making the container read-only allows for very efficient implementations
              > of the enumerator, and also makes writing new enumerators fairly simple.
              > Also, it eliminates a real ambiguity to the For Each statement, does
              > foreach iterate over the original collection, or over the entire
              > collection as it changes over time?
              >
              > Dim i As Integer
              > For Each o as Object in MyCollection
              > i += 1
              > If i = 3 Then
              > MyCollection.In sert(0, new Object())
              > MyCollection.Ad d(New Object())
              > End If
              > Next
              >
              > What would the iteration be in this case? Should both new objects be
              > iterated, or neither? Or just one? You could think of some reasonable
              > rules to apply to arrays, but what about things like hashes where
              > position doesn't have a fixed meaning? And how is the Enumerator
              > supposed to keep track of what's happening to the collection? Do we add
              > some kind of event to the IEnumerable interface? If so, that could turn
              > into a lot of overhead since the enumerator has to check for changes on
              > each iteration.
              >
              > For efficiency's sake, maybe we could have two different enumeration
              > types, one for mutable containers and one for read-only, but then not
              > only are you complicating the class library tremendously, but calling
              > conventions can get strange (since only one of them can use For Each).
              >
              > David[/color]



              • Alvin Bruney [MVP]

                #37
                Re: For vs. For Each

                > Typically, you use foreach to iterate through the collection.
                > Adding/Removing items from the collection during this time
                > puts the collection into a funny mode that others may not be
                > ready to deal with, what if you have multiple enumerators?

                That is a design and implementation issue, not a programming
                issue. Iterating a container which can change during iteration
                is rightly handled internally by the construct itself, not by
                the iterating code, so there should be no funny mode. For
                instance, what's to stop the internal code from re-adjusting
                its contents when an item is removed or added on the fly? This
                is very basic functionality available in VB, if memory serves
                me right.

                Multiple enumerators can be handled internally through
                synchronization, and this can all be hidden from the programmer
                so that she is not aware of how the iterating construct is
                implemented (good design). I think the choice to implement this
                construct as read-only must have come down to efficiency over
                functionality. That's the only reason I can think of.

                --
                Regards,
                Alvin Bruney
                [ASP.NET MVP http://mvp.support.microsoft.com/default.aspx]
                Got tidbits? Get it here... http://tinyurl.com/27cok
                "JohnLiu" <johnnliu@gmail .com> wrote in message
                news:37c542a6.0 408120003.14e4e 17f@posting.goo gle.com...[color=blue]
                >I disagree that foreach-construct being readonly is a bad thing. Not
                > to completely disregard Alvin's gripe, but here's my point of view.
                >
                > Typically, you use foreach to iterate throught the collection.
                > Adding/Removing items from the collection during this time puts the
                > collection into a funny mode that others may not be ready to deal
                > with, what if you have multiple emunerators? (this is very common for
                > a nested foreach scenario, yes, that'd be O(n^2) ). I believe in C++
                > STL libraries you can remove current iterated item, but that opens a
                > can of worms, you always have to worry whether your current item has
                > been deleted by another thread.
                >
                > Also, I highly disagree that the for-construct is faster than
                > foreach-construct. That is only true when you are talking about ARRAY
                > collection types. In a linked-list implementation, foreach-construct
                > O(1) would be faster than for-construct O(n) for iterating through a
                > collection. The fact that the .NET framework collections are almost
                > solely based on array types may make the statement correct in 90%+ of
                > the time, but it is not a correct statement to make generally. And,
                > besides, I wait for generics!
                >
                > Typically, hashtables are iterated for the entire key/value pairs,
                > given that accessing the value of a hashtable is O(1), if you need to
                > iterate through a hastable, it's easier to iterate through the keys
                > O(n), and grabbing the value as you go O(1). But I can't think of why
                > anyone would be iterating through a hashtable except may be as a debug
                > step to see the contents of the hashtable.
                >
                > jliu - www.ssw.com.au - johnliu.net[/color]



                • Michal Dabrowski

                  #38
                  Re: For vs. For Each

                  On Wed, 11 Aug 2004 08:21:47 -0500,
                  anonymous@discussions.microsoft.com wrote:
                  > Is there a performance difference between this:
                  >
                  > \\\
                  > Dim i As Integer
                  > For i = 0 To myObject.Controls.Count - 1
                  >     myObject.Controls(i) = ...
                  > Next
                  > ///
                  >
                  > and this:
                  >
                  > \\\
                  > Dim ctl As Control
                  > For Each ctl In myObject.Controls
                  >     ctl = ...
                  > Next
                  > ///
                  >
                  > Or is For Each just "prettier"?

                  They are almost identical when your collection is some sort
                  of array. But if the collection is, say, a linked list, then
                  executing .Controls(n) will cause your app to traverse n
                  elements: the bigger n is, the longer it takes to find the
                  n-th element. Using enumerators (For Each) is considerably
                  faster here.

                  Best regards,
                  Michal Dabrowski


                  • Nick Malik

                    #39
                    Re: For vs. For Each

                    > I don't disagree with that. very good point indeed. but,
                    > the current approach makes it impossible to perform simple
                    > tasks inherent in UI programming (like removing
                    > multiselects in a listbox for instance). Where such simple
                    > tasks are overly complicated, i believe the design should
                    > be reviewed.

                    Good point. I like the idea of a collection object that
                    isn't read-only, separate from other types of collections.
                    Didn't another thread mention a bit of code that Ericgu put
                    out that does exactly this?



                    • Cor Ligthert

                      #40
                      Re: For vs. For Each

                      From this document:

                      The performance difference between For and For Each loops
                      does not appear to be significant.

                      I hope this helps?

                      Cor



                      • Jay B. Harlow [MVP - Outlook]

                        #41
                        Re: For vs. For Each

                        Alvin,
                        VB, for as long as I can remember (VB1, VB2, VB3, VB5,
                        VB6, VBA), has had trouble modifying the collection
                        itself when you use For Each. There may have been one or
                        two specific collections that worked, or more than
                        likely one thought they worked but they really didn't.

                        The problem is that the delete/insert code would need
                        some method of notifying (an event, possibly) one or
                        more enumerators that the collection itself was
                        modified. IMHO this notification is for the most part
                        too expensive to justify adding in all cases.

                        Although I do agree it would be nice if collections had
                        optional enumerators: the "fire hose" version of today,
                        which is normally used, plus a "safe" version that
                        allows modifying the collection itself while you're
                        iterating... For example: using For Each on
                        DataTable.Rows is not modifiable, while using For Each
                        on DataTable.Select is modifiable! By modifiable I mean
                        you can call DataRow.Delete or Rows.Add...
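                        A sketch of that difference (the table variable and the
                        filter expression are made up):

                        ```vbnet
                        ' For Each over myTable.Rows throws once a row is
                        ' deleted, but Select() hands back a snapshot array of
                        ' DataRow references, so deleting through it is fine.
                        Dim row As DataRow
                        For Each row In myTable.Select("Qty = 0")
                            row.Delete()
                        Next
                        ```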

                        Hope this helps
                        Jay

                        "Alvin Bruney [MVP]" <vapor at steaming post office> wrote in message
                        news:uZfTmTHgEH A.3292@TK2MSFTN GP10.phx.gbl...[color=blue][color=green]
                        > > Typically, you use foreach to iterate throught the collection.
                        > > Adding/Removing items from the collection during this time puts the
                        > > collection into a funny mode that others may not be ready to deal
                        > > with, what if you have multiple emunerators?[/color]
                        >
                        > That is a design and implementation issue, not a programming issue.
                        > Iterating a container which can change during iteration is rightly handled
                        > internally by the construct itself and not by the iterating code so there
                        > should be no funny mode. For instance, what's to stop the internal code[/color]
                        from[color=blue]
                        > re-adjusting its contents based on the removal or addition of an item on[/color]
                        the[color=blue]
                        > fly? This is very basic functionality available in vb if memory serves me
                        > right.
                        >
                        > Multiple enumerators can be handled internally thru synchronization means
                        > and this can all be hidden from the programmer so that she is not aware[/color]
                        how[color=blue]
                        > the iterating construct is implemented (good design). I think the choice[/color]
                        to[color=blue]
                        > implement this construct as readonly must have come down to efficiency[/color]
                        over[color=blue]
                        > functionality. That's the only reason I can think of.
                        >
                        > --
                        > Regards,
                        > Alvin Bruney
                        > [ASP.NET MVP http://mvp.support.microsoft.com/default.aspx]
                        > Got tidbits? Get it here... http://tinyurl.com/27cok
                        > "JohnLiu" <johnnliu@gmail .com> wrote in message
                        > news:37c542a6.0 408120003.14e4e 17f@posting.goo gle.com...[color=green]
                        > >I disagree that foreach-construct being readonly is a bad thing. Not
                        > > to completely disregard Alvin's gripe, but here's my point of view.
                        > >
                        > > Typically, you use foreach to iterate throught the collection.
                        > > Adding/Removing items from the collection during this time puts the
                        > > collection into a funny mode that others may not be ready to deal
                        > > with, what if you have multiple emunerators? (this is very common for
                        > > a nested foreach scenario, yes, that'd be O(n^2) ). I believe in C++
                        > > STL libraries you can remove current iterated item, but that opens a
                        > > can of worms, you always have to worry whether your current item has
                        > > been deleted by another thread.
                        > >
                        > > Also, I highly disagree that the for-construct is faster than
                        > > foreach-construct. That is only true when you are talking about ARRAY
                        > > collection types. In a linked-list implementation, foreach-construct
                        > > O(1) would be faster than for-construct O(n) for iterating through a
                        > > collection. The fact that the .NET framework collections are almost
                        > > solely based on array types may make the statement correct in 90%+ of
                        > > the time, but it is not a correct statement to make generally. And,
                        > > besides, I wait for generics!
                        > >
                        > > Typically, hashtables are iterated for the entire key/value pairs,
                        > > given that accessing the value of a hashtable is O(1), if you need to
                        > > iterate through a hastable, it's easier to iterate through the keys
                        > > O(n), and grabbing the value as you go O(1). But I can't think of why
                        > > anyone would be iterating through a hashtable except may be as a debug
                        > > step to see the contents of the hashtable.
                        > >
                        > > jliu - www.ssw.com.au - johnliu.net[/color]
                        >
                        >[/color]



                        • Cablewizard

                          #42
                          Re: For vs. For Each

                          I gave this a little bit of thought. I realized that
                          the sort of coding I do is quite different from what
                          "most" people are doing. I perform mostly engineering
                          and geospatial analysis. For me, this involves many
                          loops, and loops within loops, along with iterating
                          over recordsets countless times. In a literal sense,
                          this is data processing in the extreme, but certainly
                          not like manual data entry.

                          However, loops are used to perform some sort of search
                          and/or work on a block of items. By nature, they can
                          consume a decent portion of the overall processing
                          time, as they are oftentimes the place where much of
                          the actual work happens. Any number of smaller
                          functions may be performed, but each is potentially
                          performed many, if not thousands of, times. In this
                          particular thread's example, the operator is iterating
                          through all the controls in a collection, presumably
                          to do something with them. I would hazard a guess that
                          if you compared the overall processor time spent
                          within the scope of the loop, it would be significant
                          relative to other non-loop functions.

                          So while what I do may in fact be much different from
                          what most others do, I still stand by my statement.
                          Just look at the number of times people want to know
                          how to keep their GUI responsive while some sort of
                          iterative process is occurring. Forget for a moment
                          about the design considerations of what is really
                          happening; the bottom line is that the iterative
                          processing consumes enough time to be noticeable to
                          the operator.

                          Since you asked, and to exemplify Alvin's comments, here is a common occurrence
                          for me.
                          (For those who don't want to read a confusing and
                          long-winded example, stop reading here.)
                          I have a geospatial dataset that contains some number of polygons/regions.
                          I need to find any overlapping/intersecting regions and degenerate those
                          intersections into separate regions.
                          This requires iterating through every element in the dataset and compare it to
                          every other element.
                          Additionally, for every potentially intersecting element combination, you must
                          iterate through every combination of vertices/segments to determine
                          intersection.
                          Each combination of intersections could result in the creation of a new region.
                          Each new region could also intersect with subsequent existing and/or new
                          regions, which could also generate new regions...
                          Now, if, whenever existing regions were degenerated
                          into sub-regions, I could remove those regions from
                          the collection and add the new regions to the end of
                          it, then theoretically I could resolve all possible
                          tests within the scope of one top-level For Each loop.
                          Instead, I must get creative and do something like
                          this: mark the existing regions for deletion within
                          the master collection, and add the newly created
                          regions to a separate collection. Then I perform the
                          same iteration over the new collection, potentially
                          creating an additional collection, and so on. Once all
                          combinations are resolved, I must go back and iterate
                          through all of the resulting collections to recreate
                          the master collection. In practice the implementation
                          isn't exactly like that, but logically it is similar.
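                          In outline it looks like this; the region type and the
                          helpers (Intersects, MarkForDeletion, Degenerate) are
                          hypothetical stand-ins for my real ones:

                          ```vbnet
                          Dim newRegions As New ArrayList()
                          Dim r As Object, s As Object

                          ' Pass 1: mark intersecting regions instead of
                          ' removing them mid-loop, and collect the degenerated
                          ' pieces in a side collection.
                          For Each r In master
                              For Each s In master
                                  If Not r Is s AndAlso Intersects(r, s) Then
                                      MarkForDeletion(r)
                                      MarkForDeletion(s)
                                      newRegions.AddRange(Degenerate(r, s))
                                  End If
                              Next
                          Next

                          ' After the loops: sweep the marked regions out of
                          ' master, append newRegions, then repeat the pass over
                          ' newRegions until no new regions are produced.
                          ```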

                          So for me, loop performance and implementation is extremely important.

                          Gerald

                          "Cor Ligthert" <notfirstname@p lanet.nl> wrote in message
                          news:eijj0FGgEH A.4092@TK2MSFTN GP10.phx.gbl...[color=blue]
                          > Can you give some sample applications where this statement of you is true?
                          >[color=green]
                          > > Although due to the nature of loops, they oftentimes fall into
                          > > the 20 percent of code that consumes 80 percent of the time.[/color]
                          >
                          > It is in my opinion definitly not with applications where is by instance
                          > screen painting or/and dataprocessing.
                          >
                          > It is in my opinion definitly true for applications where is image
                          > processing where not the GDI+ encoding is used.
                          >
                          > However that is in my opinion surely not the majority of the applications.
                          >
                          > So I am curious in what type of other applications stand alone loops can
                          > consume 80% of the time?
                          >
                          > Just my thought,
                          >
                          > Cor
                          >
                          >[/color]



                          • Howard Kaikow

                            #43
                            Re: For vs. For Each

                            My recollection is that MSFT claimed that .NET, for practical purposes,
                            eliminated the difference in speed between For and For Each, but I've not
                            recently tested that assertion.

                            --
                            http://www.standards.com/; See Howard Kaikow's web site.
                            "Nick Malik" <nickmalik@hotm ail.nospam.com> wrote in message
                            news:BgqSc.1307 47$eM2.70902@at tbi_s51...[color=blue]
                            > Would you have been happier if Eric has written the question in C#? This[/color]
                            is[color=blue]
                            > very much as important a question in C# as it is in VB.NET.
                            >
                            > foreach (Control ctl in myObject.Contro ls)
                            > {
                            > // do something useful with 'ctl'
                            >
                            > }
                            >
                            > I've had folks tell me that 'for' is more efficient than 'foreach' because
                            > of enumerator overhead. For most of my code, however, this is a moot[/color]
                            point.[color=blue]
                            > Unless the code is in a critical loop, the difference in processing so[/color]
                            tiny[color=blue]
                            > that the improvement in code readability greatly outweighs the overhead of
                            > allowing .NET to manipulate the enumerator.
                            >
                            > --- Nick
                            >
                            > "Ignacio Machin ( .NET/ C# MVP )" <ignacio.mach in AT dot.state.fl.us >[/color]
                            wrote[color=blue]
                            > in message news:%235r%23ef 6fEHA.2896@TK2M SFTNGP11.phx.gb l...
                            > <<clipped>>
                            >[color=green]
                            > > and finally this is a VB.net question, not a C# one, there is no need to
                            > > post it on microsoft.publi c.dotnet.langua ges.csharp
                            > >[/color]
                            > <<clipped>>[color=green]
                            > >
                            > > <anonymous@disc ussions.microso ft.com> wrote in message
                            > > news:OYvXoY6fEH A.3320@TK2MSFTN GP11.phx.gbl...[color=darkred]
                            > > > Is there a performance difference between this:
                            > > >
                            > > > \\\
                            > > > Dim i As Integer
                            > > > For i = 0 to myObject.Contro ls.Count - 1
                            > > > myObject.Contro ls(i) = ...
                            > > > Next
                            > > > ///
                            > > >
                            > > > and this:
                            > > >
                            > > > \\\
                            > > > Dim ctl As Control
                            > > > For Each ctl In myObject.Contro ls
                            > > > ctl = ...
                            > > > Next
                            > > > ///
                            > > >
                            > > > Or is For Each just "prettier"?
                            > > >
                            > > > Thanks,
                            > > >
                            > > > Eric
                            > > >
                            > > >[/color]
                            > >
                            > >[/color]
                            >
                            >[/color]


                            Comment

                            • Cor Ligthert

                              #44
                              Re: For vs. For Each

                               Gerald,

                               I read it completely. However, in my opinion, everything that happens
                               between processor and memory is nowadays extremely fast, and that is
                               often forgotten. (I am not writing this about your situation.)

                               There is a lot of looping in every program, even when you try to avoid
                               it. I think that the compiled code generated from the IL will itself
                               contain a lot of loops.

                               Looping is, in my opinion, the basis of good programming, and people who
                               think they can avoid it mostly end up writing even more code to process
                               or stop the loop (for instance, adding a test inside the loop, which of
                               course costs more than a simple change of a byte).

                               The performance difference between the methods can be neglected; see the
                               MSDN article I pointed to in the main thread.

                               I find the amount of attention people pay to a loop mostly overdone,
                               since the total throughput time will mostly not change.

                               I wrote "mostly". I think a loop always needs to be done well, and for
                               you especially, and that in a lot of situations it needs extra attention.
                               However, when it comes to optimizing throughput, I would in most cases
                               first look at other parts of the program.

                               Just my thought,

                               Cor

                               > I gave this a little bit of thought. I realized that the sort of coding
                               > I do is quite different from what "most" people are doing. I perform
                               > mostly engineering and geospatial analysis. For me, this involves many
                               > loops, and loops within loops, along with iterating over recordsets
                               > countless times. In a literal sense, this is data processing in the
                               > extreme, but certainly not like manual data entry.
                               >
                               > However, loops are used to perform some sort of search and/or work on a
                               > block of items. By nature, they can consume a decent portion of the
                               > overall processing time, as they are oftentimes the place where much of
                               > the actual work is taking place. Any number of smaller functions may be
                               > performed, but potentially each is performed many if not thousands of
                               > times. In this particular thread's example, the operator is iterating
                               > through all the controls in a collection, presumably to do something
                               > with them. I would hazard a guess that if you compared the overall
                               > processor time spent within the scope of the loop, it would be
                               > significant relative to other non-loop functions.
                               >
                               > So while what I do may in fact be much different than most others, I
                               > still stand by my statement. Just look at the number of times people
                               > want to know how to keep their GUI responsive while some sort of
                               > iterative process is occurring. Forget for a moment about the design
                               > considerations of what is really happening. Bottom line is that the
                               > iterative processing is consuming an amount of time significant enough
                               > to be noticeable to the operator.
                               >
                               > Since you asked, and to exemplify Alvin's comments, here is a common
                               > occurrence for me.
                               > (For those who don't want to read a confusing and long-winded example,
                               > stop reading here.)
                               > I have a geospatial dataset that contains some number of
                               > polygons/regions.
                               > I need to find any overlapping/intersecting regions and degenerate
                               > those intersections into separate regions.
                               > This requires iterating through every element in the dataset and
                               > comparing it to every other element.
                               > Additionally, for every potentially intersecting element combination,
                               > you must iterate through every combination of vertices/segments to
                               > determine intersection.
                               > Each combination of intersections could result in the creation of a new
                               > region. Each new region could also intersect with subsequent existing
                               > and/or new regions, which could also generate new regions...
                               > Now, if existing regions could be degenerated into sub-regions, I could
                               > remove the existing regions from the collection and add the new regions
                               > to the end of the collection; then theoretically I could determine all
                               > possible tests within the scope of one top-level For Each loop. But
                               > instead, I must be creative and do something like mark the existing
                               > regions for deletion within the master collection and add the newly
                               > created regions to a separate collection. Then perform the same
                               > iteration over the new collection, and potentially create an additional
                               > collection, and so on. Once all combinations are resolved, I must go
                               > back and iterate through all of the resulting collections to recreate
                               > the master collection. Now in practice, the resulting implementation
                               > isn't exactly like that, but logically it is similar.
                               >
                               > So for me, loop performance and implementation is extremely important.
                               >
                               > Gerald
                               >
                               > "Cor Ligthert" <notfirstname@planet.nl> wrote in message
                               > news:eijj0FGgEHA.4092@TK2MSFTNGP10.phx.gbl...
                               > > Can you give some sample applications where this statement of yours
                               > > is true?
                               > > <<clipped>>
                               >


                              Comment

                              • Cablewizard

                                #45
                                Re: For vs. For Each

                                Cor,

                                If I understand your comments, then I completely agree.
                                1. Use loops appropriately.
                                2. Don't loop when it is not necessary.
                                3. Do loop when appropriate.
                                 4. When you do use a loop, in the end it makes little difference whether you
                                 use Do, While, For Index, or For Each. My own testing has shown this to be true.
                                5. What you do while in the loop is much more important than the loop itself.
                                Make it as efficient as practical.
                                6. Just make sure that your overall code design and implementation is done
                                well/correctly.
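                                 Point 5 is usually where the real time is won. As a hypothetical sketch
                                 (the names array and prefix stand in for whatever the loop really
                                 processes), hoisting invariant work out of the loop body typically dwarfs
                                 any For vs. For Each difference:

```csharp
// Sketch: the invariant Trim/ToUpper work is the same on every pass,
// so computing it once outside the loop saves N-1 recomputations.
using System;

class HoistInvariant
{
    static void Main()
    {
        string[] names = { "btnOk", "btnCancel", "lblTitle" };
        string prefix = " btn ";

        // Wasteful: Trim/ToUpper run on every iteration even though
        // their result never changes inside the loop.
        int slowHits = 0;
        foreach (string name in names)
            if (name.ToUpper().StartsWith(prefix.Trim().ToUpper()))
                slowHits++;

        // Better: hoist the invariant work out of the loop once.
        string key = prefix.Trim().ToUpper();
        int fastHits = 0;
        foreach (string name in names)
            if (name.ToUpper().StartsWith(key))
                fastHits++;

        Console.WriteLine("{0} {1}", slowHits, fastHits); // prints "2 2"
    }
}
```

                                 Both versions give the same answer; only the amount of work per
                                 iteration changes, which is exactly Gerald's point 5.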

                                In the end, follow Jay's advice. Try to do it right in the first place, and if
                                you find out it is a problem, then worry about the extra code to try to make it
                                faster.

                                Gerald

                                "Cor Ligthert" <notfirstname@p lanet.nl> wrote in message
                                news:%23jCQ$tIg EHA.3148@TK2MSF TNGP10.phx.gbl. ..[color=blue]
                                > Gerald,.
                                >
                                > I readed it completly, however in my opinion is everything what happens
                                > between processor and memory nowadays extremely fast and that is often
                                > forgotten. (I am not writing this about your situation)
                                >
                                > There is a lot of looping in every program even when you try to avoid it. I
                                > think that the code which is created by the ILS will make a lot of loops.
                                >
                                > Looping is in my opinion the basic of good programming, and people who think
                                > they can avoid it are mostly making even more code to process or stop the
                                > loop. (By instance by making a test in the loop which cost of course more
                                > than a simple change of a byte).
                                >
                                > The performance difference of the methods can be neglected, see for that the
                                > MSDN article I point on in the mainthread of this thread.
                                >
                                > I find it mostly overdone how many attention people take to a loop, while
                                > the total througput time will mostly not change.
                                >
                                > I wrote mostly. I think that it needs forever and for you specialy to be
                                > done well and that it needs in a lot of situations extra attention. However
                                > when it comes to optimizing the througput, I would first look in most cases
                                > to other parts of the program.
                                >
                                > Just my thougth
                                >
                                > Cor
                                >[color=green]
                                > > I gave this a little bit of thought. I realized that the sort of coding I[/color]
                                > do is[color=green]
                                > > quite different than what "most" people are doing. I perform mostly[/color]
                                > engineering[color=green]
                                > > and geospatial analysis. For me, this involves many loops, and loops[/color]
                                > within[color=green]
                                > > loops. Along with iterating over recordsets countless times. In a literal[/color]
                                > since,[color=green]
                                > > this is data processing in the extreme, but certainly not like manual data
                                > > entry.
                                > >
                                > > However, loops are used to perform some sort of search and/or work on a[/color]
                                > block of[color=green]
                                > > items. By nature, they can consume a decent portion of the overall[/color]
                                > processing[color=green]
                                > > time as they are oftentimes the place where much of the actual work is[/color]
                                > taking[color=green]
                                > > place. Any number of smaller functions may be performed, but potentially[/color]
                                > it is[color=green]
                                > > performed many if not thousands of times. In this particular thread's[/color]
                                > example,[color=green]
                                > > the operator is iterating through all the controls in a collection,[/color]
                                > presumably[color=green]
                                > > to do something with them. I would hazard a guess that if you compared the
                                > > overall processor time spent within the scope of the loop, it would be
                                > > significant relative to other non-loop functions.
                                > >
                                > > So while what I do may in fact be much different than most others, I still[/color]
                                > stand[color=green]
                                > > by my statement. Just look at the number of times people want to know how[/color]
                                > to[color=green]
                                > > keep their GUI responsive while some sort of iterative process is[/color]
                                > occurring.[color=green]
                                > > Forget for a moment about the design considerations of what is really[/color]
                                > happening.[color=green]
                                > > Bottom line is that the iterative processing is consuming an amount of[/color]
                                > time[color=green]
                                > > significant enough to be noticeable to the operator.
                                > >
                                > > Since you asked, and to exemplify Alvin's comments, here is a common[/color]
                                > occurrence[color=green]
                                > > for me.
                                > > (For those who don't what to read a confusing and long-winded example,[/color]
                                > stop[color=green]
                                > > reading here)
                                > > I have a geospatial dataset that contains some number of polygons/regions.
                                > > I need to find any overlapping/intersecting regions and degenerate those
                                > > intersections into separate regions.
                                > > This requires iterating through every element in the dataset and compare[/color]
                                > it to[color=green]
                                > > every other element.
                                > > Additionally, for every potentially intersecting element combination, you[/color]
                                > must[color=green]
                                > > iterate through every combination of vertices/segments to determine
                                > > intersection.
                                > > Each combination of intersections could result in the creation of a new[/color]
                                > region.[color=green]
                                > > Each new region could also intersect with subsequent existing and/or new
                                > > regions, which could also generate new regions...
                                > > Now, if when existing regions could be degenerated into sub-regions, I[/color]
                                > could[color=green]
                                > > remove the existing regions from the collection and add the new regions to[/color]
                                > the[color=green]
                                > > end of the collection, then theoretically I could determine all possible[/color]
                                > tests[color=green]
                                > > within the scope of 1 top-level For Each loop. But instead, I must be[/color]
                                > creative[color=green]
                                > > and do something like mark the existing regions for deletion within the[/color]
                                > master[color=green]
                                > > collection, add the newly created regions to a separate collection. Then[/color]
                                > perform[color=green]
                                > > the same iteration over the new collection, and potentially create an[/color]
                                > additional[color=green]
                                > > collection, and so on. Once all combinations are resolved, then I must go[/color]
                                > back[color=green]
                                > > and iterate through all of the resulting collections to recreate the[/color]
                                > master[color=green]
                                > > collection. Now in practice, the resulting implementation isn't exactly[/color]
                                > like[color=green]
                                > > that, but logically it is similar.
                                > >
                                > > So for me, loop performance and implementation is extremely important.
                                > >
                                > > Gerald
                                > >
                                > > "Cor Ligthert" <notfirstname@p lanet.nl> wrote in message
                                > > news:eijj0FGgEH A.4092@TK2MSFTN GP10.phx.gbl...[color=darkred]
                                > > > Can you give some sample applications where this statement of you is[/color][/color]
                                > true?[color=green][color=darkred]
                                > > >
                                > > > > Although due to the nature of loops, they oftentimes fall into
                                > > > > the 20 percent of code that consumes 80 percent of the time.
                                > > >
                                > > > It is in my opinion definitly not with applications where is by instance
                                > > > screen painting or/and dataprocessing.
                                > > >
                                > > > It is in my opinion definitly true for applications where is image
                                > > > processing where not the GDI+ encoding is used.
                                > > >
                                > > > However that is in my opinion surely not the majority of the[/color][/color]
                                > applications.[color=green][color=darkred]
                                > > >
                                > > > So I am curious in what type of other applications stand alone loops can
                                > > > consume 80% of the time?
                                > > >
                                > > > Just my thought,
                                > > >
                                > > > Cor
                                > > >
                                > > >[/color]
                                > >
                                > >[/color]
                                >
                                >[/color]


                                Comment
