creating really big lists

  • Dr Mephesto

    creating really big lists

    Hi!

    I would like to create a pretty big list of lists: a list 3,000,000
    long, each entry containing 5 empty lists. My application will append
    data to each of the 5 sublists, so they will be of varying lengths (so no
    arrays!).

    Does anyone know the most efficient way to do this? I have tried:

    list = [[[],[],[],[],[]] for _ in xrange(3000000)]

    but it's not so fast. Is there a way to do this without looping?

    David.

  • Paul Rudin

    #2
    Re: creating really big lists

    Dr Mephesto <dnhkng@googlemail.com> writes:
    > Hi!
    >
    > I would like to create a pretty big list of lists: a list 3,000,000
    > long, each entry containing 5 empty lists. My application will append
    > data to each of the 5 sublists, so they will be of varying lengths (so no
    > arrays!).
    >
    > Does anyone know the most efficient way to do this? I have tried:
    >
    > list = [[[],[],[],[],[]] for _ in xrange(3000000)]
    >
    > but it's not so fast. Is there a way to do this without looping?

    You can do:

    [[[],[],[],[],[]]] * 3000000

    although I don't know if it performs any better than what you already
    have.


    • Diez B. Roggisch

      #3
      Re: creating really big lists

      Paul Rudin wrote:
      > Dr Mephesto <dnhkng@googlemail.com> writes:
      >
      >> Hi!
      >>
      >> I would like to create a pretty big list of lists: a list 3,000,000
      >> long, each entry containing 5 empty lists. My application will append
      >> data to each of the 5 sublists, so they will be of varying lengths (so no
      >> arrays!).
      >>
      >> Does anyone know the most efficient way to do this? I have tried:
      >>
      >> list = [[[],[],[],[],[]] for _ in xrange(3000000)]
      >>
      >> but it's not so fast. Is there a way to do this without looping?
      >
      > You can do:
      >
      > [[[],[],[],[],[]]] * 3000000
      >
      > although I don't know if it performs any better than what you already
      > have.

      You are aware that this is hugely different, because the nested lists are
      references, not new instances? Thus the outcome is most probably (given the
      gazillion times people have stumbled over this) not the desired one...

      Diez


      • Bryan Olson

        #4
        Re: creating really big lists

        Paul Rudin wrote:
        > Dr Mephesto writes:
        >
        >> I would like to create a pretty big list of lists: a list 3,000,000
        >> long, each entry containing 5 empty lists. My application will append
        >> data to each of the 5 sublists, so they will be of varying lengths (so no
        >> arrays!).
        >>
        >> Does anyone know the most efficient way to do this? I have tried:
        >>
        >> list = [[[],[],[],[],[]] for _ in xrange(3000000)]
        >>
        >> but it's not so fast. Is there a way to do this without looping?
        >
        > You can do:
        >
        > [[[],[],[],[],[]]] * 3000000
        >
        > although I don't know if it performs any better than what you already
        > have.

        Actually, that produces a list of 3,000,000 references to the same
        5-element list. A reduced example:

        >>> lst = [[[],[],[],[],[]]] * 3
        >>> lst[1][1].append(42)
        >>> print lst
        [[[], [42], [], [], []], [[], [42], [], [], []], [[], [42], [], [], []]]
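
        [Editor's note: a minimal sketch contrasting the two constructions, in
        Python 3 syntax rather than the Python 2 of the thread. `*` replicates a
        reference to one inner list; the comprehension builds a fresh inner list
        each time.]

        ```python
        # '*' copies a reference: all three entries are the SAME inner list.
        shared = [[[], [], [], [], []]] * 3

        # The comprehension evaluates the literal anew: three DISTINCT inner lists.
        distinct = [[[], [], [], [], []] for _ in range(3)]

        shared[1][1].append(42)
        distinct[1][1].append(42)

        print(shared)    # the 42 shows up in every entry
        print(distinct)  # the 42 shows up only in entry 1
        ```
        
        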


        --
        --Bryan


        • Paul Rudin

          #5
          Re: creating really big lists

          "Diez B. Roggisch" <deets@nospam.web.de> writes:
          > Paul Rudin wrote:
          >
          >> Dr Mephesto <dnhkng@googlemail.com> writes:
          >>
          >>> Hi!
          >>>
          >>> I would like to create a pretty big list of lists: a list 3,000,000
          >>> long, each entry containing 5 empty lists. My application will append
          >>> data to each of the 5 sublists, so they will be of varying lengths (so no
          >>> arrays!).
          >>>
          >>> Does anyone know the most efficient way to do this? I have tried:
          >>>
          >>> list = [[[],[],[],[],[]] for _ in xrange(3000000)]
          >>>
          >>> but it's not so fast. Is there a way to do this without looping?
          >>
          >> You can do:
          >>
          >> [[[],[],[],[],[]]] * 3000000
          >>
          >> although I don't know if it performs any better than what you already
          >> have.
          >
          > You are aware that this is hugely different, because the nested lists are
          > references, not new instances? Thus the outcome is most probably (given the
          > gazillion times people have stumbled over this) not the desired one...
          Err, yes sorry. I should try to avoid posting before having coffee in
          the mornings.


          • Dr Mephesto

            #6
            Re: creating really big lists

            yep, that's why I'm asking :)

            On Sep 5, 12:22 pm, "Diez B. Roggisch" <de...@nospam.web.de> wrote:
            > Paul Rudin wrote:
            >> Dr Mephesto <dnh...@googlemail.com> writes:
            >>
            >>> Hi!
            >>>
            >>> I would like to create a pretty big list of lists: a list 3,000,000
            >>> long, each entry containing 5 empty lists. My application will append
            >>> data to each of the 5 sublists, so they will be of varying lengths (so no
            >>> arrays!).
            >>>
            >>> Does anyone know the most efficient way to do this? I have tried:
            >>>
            >>> list = [[[],[],[],[],[]] for _ in xrange(3000000)]
            >>>
            >>> but it's not so fast. Is there a way to do this without looping?
            >>
            >> You can do:
            >>
            >> [[[],[],[],[],[]]] * 3000000
            >>
            >> although I don't know if it performs any better than what you already
            >> have.
            >
            > You are aware that this is hugely different, because the nested lists are
            > references, not new instances? Thus the outcome is most probably (given the
            > gazillion times people have stumbled over this) not the desired one...
            >
            > Diez


            • Aahz

              #7
              Re: creating really big lists

              In article <1188985838.661821.41530@k79g2000hse.googlegroups.com>,
              Dr Mephesto <dnhkng@googlemail.com> wrote:
              >
              > I would like to create a pretty big list of lists: a list 3,000,000
              > long, each entry containing 5 empty lists. My application will append
              > data to each of the 5 sublists, so they will be of varying lengths (so no
              > arrays!).
              Why do you want to pre-create this? Why not just create the big list and
              sublists as you append data to the sublists?
              --
              Aahz (aahz@pythoncraft.com) <*> http://www.pythoncraft.com/

              "Many customs in this life persist because they ease friction and promote
              productivity as a result of universal agreement, and whether they are
              precisely the optimal choices is much less important." --Henry Spencer


              • John Machin

                #8
                Re: creating really big lists

                On Sep 5, 7:50 pm, Dr Mephesto <dnh...@googlemail.com> wrote:
                > Hi!
                >
                > I would like to create a pretty big list of lists: a list 3,000,000
                > long, each entry containing 5 empty lists. My application will append
                > data to each of the 5 sublists, so they will be of varying lengths (so no
                > arrays!).
                Will each and every one of the 3,000,000 slots be used? If not, you may be
                much better off storage-wise if you use a dictionary instead of a
                list, at the cost of slower access.
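
                [Editor's note: a minimal sketch of the sparse-dictionary idea, in
                Python 3 syntax; the `append_to` helper is a hypothetical name, not
                from the thread. Only slots that are actually touched get their 5
                sublists allocated.]

                ```python
                # Sparse storage: a plain dict keyed by slot index.
                data = {}

                def append_to(slot, sublist_index, value):
                    # setdefault creates the 5 sublists the first time a slot is used.
                    data.setdefault(slot, [[], [], [], [], []])[sublist_index].append(value)

                append_to(1001, 3, 'x')
                append_to(42000, 2, 'y')

                print(len(data))  # only two slots exist, not 3,000,000
                ```
                
                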

                Cheers,
                John


                • Hrvoje Niksic

                  #9
                  Re: creating really big lists

                  Dr Mephesto <dnhkng@googlemail.com> writes:
                  > I would like to create a pretty big list of lists: a list 3,000,000
                  > long, each entry containing 5 empty lists. My application will
                  > append data to each of the 5 sublists, so they will be of varying
                  > lengths (so no arrays!).
                  >
                  > Does anyone know the most efficient way to do this? I have tried:
                  >
                  > list = [[[],[],[],[],[]] for _ in xrange(3000000)]
                  You might want to use a tuple as the container for the lower-level
                  lists -- it's more compact and costs less allocation-wise.

                  But the real problem is not list allocation vs tuple allocation, nor
                  is it looping in Python; surprisingly, it's the GC. Notice this:

                  $ python
                  Python 2.5.1 (r251:54863, May 2 2007, 16:56:35)
                  [GCC 4.1.2 (Ubuntu 4.1.2-0ubuntu4)] on linux2
                  Type "help", "copyright", "credits" or "license" for more information.
                  >>> import time
                  >>> t0 = time.time(); l = [([],[],[],[],[]) for _ in xrange(3000000)];
                  >>> t1 = time.time()
                  >>> t1 - t0
                  143.89971613883972

                  Now, with the GC disabled:

                  $ python
                  Python 2.5.1 (r251:54863, May 2 2007, 16:56:35)
                  [GCC 4.1.2 (Ubuntu 4.1.2-0ubuntu4)] on linux2
                  Type "help", "copyright", "credits" or "license" for more information.
                  >>> import gc
                  >>> gc.disable()
                  >>> import time
                  >>> t0 = time.time(); l = [([],[],[],[],[]) for _ in xrange(3000000)];
                  >>> t1 = time.time()
                  >>> t1 - t0
                  2.9048631191253662

                  The speed difference is staggering, almost 50-fold. I suspect GC
                  degrades the (amortized) linear-time list building into quadratic
                  time. Since you allocate all the small lists, the GC gets invoked
                  every 700 or so allocations, and has to visit more and more objects in
                  each pass. I'm not sure if this can be fixed (shouldn't the
                  generational GC only have to visit the freshly created objects rather
                  than all of them?), but it has been noticed on this group before.

                  If you're building large data structures and don't need to reclaim
                  cyclical references, I suggest turning GC off, at least during
                  construction.
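
                  [Editor's note: a sketch of the suggested pattern in Python 3 syntax;
                  `build_big_list` is an illustrative name, not from the thread. Using
                  try/finally guarantees the collector is re-enabled even if
                  construction raises, e.g. with MemoryError.]

                  ```python
                  import gc

                  def build_big_list(n):
                      # Suspend cyclic GC while allocating many small containers,
                      # then restore it no matter how construction exits.
                      gc.disable()
                      try:
                          return [([], [], [], [], []) for _ in range(n)]
                      finally:
                          gc.enable()

                  big = build_big_list(100000)
                  ```
                  
                  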


                  • Dr Mephesto

                    #10
                    Re: creating really big lists

                    On 6 Sep., 01:34, "Delaney, Timothy (Tim)" <tdela...@avaya.com> wrote:
                    > Hrvoje Niksic wrote:
                    >> Dr Mephesto <dnh...@googlemail.com> writes:
                    >>
                    >>> I would like to create a pretty big list of lists: a list 3,000,000
                    >>> long, each entry containing 5 empty lists. My application will
                    >>> append data to each of the 5 sublists, so they will be of varying
                    >>> lengths (so no arrays!).
                    >>>
                    >>> Does anyone know the most efficient way to do this? I have tried:
                    >>>
                    >>> list = [[[],[],[],[],[]] for _ in xrange(3000000)]
                    >>
                    >> If you're building large data structures and don't need to reclaim
                    >> cyclical references, I suggest turning GC off, at least during
                    >> construction.
                    >
                    > This is good advice, but another question is whether you really want
                    > such a list. You may well be better off with a database of some kind -
                    > they're designed for manipulating large amounts of data.
                    >
                    > Tim Delaney
                    I need some real speed! A database is waaay too slow for the algorithm
                    I'm using. And because the sublists are of varying size, I don't think I
                    can use an array...


                    • Paul McGuire

                      #11
                      Re: creating really big lists

                      On Sep 6, 12:47 am, Dr Mephesto <dnh...@googlemail.com> wrote:
                      >
                      > I need some real speed! A database is waaay too slow for the algorithm
                      > I'm using. And because the sublists are of varying size, I don't think I
                      > can use an array...
                      How about a defaultdict approach?

                      from collections import defaultdict

                      dataArray = defaultdict(lambda: [[],[],[],[],[]])
                      dataArray[1001][3].append('x')
                      dataArray[42000][2].append('y')

                      for k in sorted(dataArray.keys()):
                          print "%6d : %s" % (k, dataArray[k])

                      prints:
                        1001 : [[], [], [], ['x'], []]
                       42000 : [[], [], ['y'], [], []]

                      -- Paul



                      • Hrvoje Niksic

                        #12
                        Re: creating really big lists

                        Dr Mephesto <dnhkng@googlemail.com> writes:
                        > I need some real speed!
                        Is the speed with the GC turned off sufficient for your usage?


                        • Dr Mephesto

                          #13
                          Re: creating really big lists

                          On 6 Sep., 09:30, Paul McGuire <pt...@austin.rr.com> wrote:
                          > On Sep 6, 12:47 am, Dr Mephesto <dnh...@googlemail.com> wrote:
                          >>
                          >> I need some real speed! A database is waaay too slow for the algorithm
                          >> I'm using. And because the sublists are of varying size, I don't think I
                          >> can use an array...
                          >
                          > How about a defaultdict approach?
                          >
                          > from collections import defaultdict
                          >
                          > dataArray = defaultdict(lambda: [[],[],[],[],[]])
                          > dataArray[1001][3].append('x')
                          > dataArray[42000][2].append('y')
                          >
                          > for k in sorted(dataArray.keys()):
                          >     print "%6d : %s" % (k, dataArray[k])
                          >
                          > prints:
                          >   1001 : [[], [], [], ['x'], []]
                          >  42000 : [[], [], ['y'], [], []]
                          >
                          > -- Paul
                          hey, that defaultdict thing looks pretty cool...

                          what's the overhead like for using a dictionary in python?

                          dave


                          • Gabriel Genellina

                            #14
                            Re: creating really big lists

                            On Fri, 07 Sep 2007 16:16:46 -0300, Dr Mephesto <dnhkng@googlemail.com>
                            wrote:
                            > hey, that defaultdict thing looks pretty cool...
                            >
                            > what's the overhead like for using a dictionary in python?
                            Dictionaries are heavily optimized in Python. Access time is O(1),
                            adding/removing elements is amortized O(1) (that is, constant time unless
                            it has to grow/shrink some internal structures.)
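
                            [Editor's note: the constant-time claim is easy to spot-check; a
                            rough sketch in Python 3, with absolute numbers varying by machine.]

                            ```python
                            import timeit

                            n = 100000

                            # Time building a dict with 100,000 keys (5 runs total).
                            build = timeit.timeit(
                                'd = {i: None for i in range(%d)}' % n, number=5)

                            # Time a single-key lookup, repeated 100,000 times.
                            lookup = timeit.timeit(
                                'd[%d]' % (n - 1),
                                setup='d = {i: None for i in range(%d)}' % n,
                                number=100000)

                            print('build %d keys x5:   %.3fs' % (n, build))
                            print('lookup x%d: %.3fs' % (100000, lookup))
                            ```
                            
                            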

                            --
                            Gabriel Genellina


                            • Dr Mephesto

                              #15
                              Re: creating really big lists

                              On Sep 8, 3:33 am, "Gabriel Genellina" <gagsl-...@yahoo.com.ar> wrote:
                              > On Fri, 07 Sep 2007 16:16:46 -0300, Dr Mephesto <dnh...@googlemail.com>
                              > wrote:
                              >
                              >> hey, that defaultdict thing looks pretty cool...
                              >>
                              >> what's the overhead like for using a dictionary in python?
                              >
                              > Dictionaries are heavily optimized in Python. Access time is O(1),
                              > adding/removing elements is amortized O(1) (that is, constant time unless
                              > it has to grow/shrink some internal structures.)
                              >
                              > --
                              > Gabriel Genellina
                              well, I want to (maybe) have a dictionary where the value is a list of
                              5 lists. And I want to add a LOT of data to these lists: tens of
                              millions of pieces of data. Will this be a big problem? I can just try
                              it out in practice on Monday too :)

                              thanks

                              Comment
