Re: very large dictionary
> Have you considered that the operating system imposes per-process limits
> on memory usage? You say that your server has 128 GB of memory, but that
> doesn't mean the OS will make anything like that available.

According to our system administrator, I can use all of the 128G.
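For what it's worth, this is easy to check directly. A minimal sketch (assuming a POSIX system, where Python's `resource` module is available) of inspecting the per-process address-space limit the OS actually enforces, independent of how much RAM is installed:

```python
import resource

# Query the soft and hard address-space limits for this process.
# RLIM_INFINITY means no limit is configured for the resource.
soft, hard = resource.getrlimit(resource.RLIMIT_AS)

def fmt(limit):
    return "unlimited" if limit == resource.RLIM_INFINITY else f"{limit} bytes"

print("soft limit:", fmt(soft))
print("hard limit:", fmt(hard))
```

If the soft limit is well below 128 GB, the process will get a MemoryError long before physical memory is exhausted, whatever the administrator says.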
> Probably a good example of premature optimization.
Well, as I was using Python, I did not expect to have to care about
the language's internal affairs that much. I thought I could simply
always do the same thing no matter how large my files get. In other
words, I thought Python was really scalable.
> Out of curiosity, how long does it take to create it
> from a text file?

I do not remember this exactly, but I think it was not much more than
an hour.
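For context, building such a dictionary from a text file is a plain linear pass. A minimal sketch (the tab-separated key/value layout and the file name are assumptions, not the poster's actual format):

```python
# Build a dictionary from a text file with one "key<TAB>value" pair per line.
def load_dict(path):
    d = {}
    with open(path, encoding="utf-8") as f:
        for line in f:
            key, _, value = line.rstrip("\n").partition("\t")
            d[key] = value
    return d
```

An hour for a multi-gigabyte file suggests the time is dominated by I/O and per-line parsing, which is why caching the built structure is tempting.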
I thought it would be practical not to create the
dictionary from a text file each time I needed it, i.e. I thought
loading the .pyc file should be faster. Yet, Python failed to create a
.pyc file.
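A .pyc is really a cache for compiled module code, not a data store, which is why it breaks down at this size. A minimal sketch of a more conventional alternative (pickle to a binary file once, reload on later runs; the function and file names here are illustrative, not from the thread):

```python
import pickle

# Serialize the dictionary once to a binary file.
def dump_table(d, path):
    with open(path, "wb") as f:
        pickle.dump(d, f, protocol=pickle.HIGHEST_PROTOCOL)

# Reload it on subsequent runs instead of re-parsing the text file.
def load_table(path):
    with open(path, "rb") as f:
        return pickle.load(f)
```

Loading a pickle skips parsing entirely, though for data that dwarfs RAM a disk-backed mapping (e.g. the dbm or shelve modules) may be the more scalable choice.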