
Today I wrote a program using an array/list with 64,000,000 entries. Writing sigma=[1]*64000000 in Python runs fine, but later on, as the program computes, my Ubuntu freezes completely - no reaction to input, not even mouse movement. I tried twice and got the same result.

When implemented in C++, long long sigma[64000000] holds up fine and runs very fast.

Is there any reason my program would freeze in the middle of running, rather than crashing at the beginning?

EDIT: To reply to Chris below, my code did not freeze until a couple of loops later.

Thank you all!

For those interested in seeing the code, this is the program, a brute-force Project Euler 211:

from math import sqrt

def e211():
    ans = 0
    sigma = [1] * 64000000
    for i in range(2, 64000000):
        j = i
        if (j % 1000 == 0) or (j < 100):
            print(j)
        q = i * i
        while j < 64000000:
            sigma[j] += q
            j += i
    for i in range(1, 64000000):
        j = int(sqrt(sigma[i]))
        if j * j == sigma[i]:
            ans += i
    return ans

if __name__ == '__main__':
    print(e211())
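To see why this freezes the machine, it helps to estimate what the list actually costs. The figures below are a rough sketch and are CPython-on-64-bit specific; exact sizes vary by build:

```python
import sys

# Rough memory estimate for sigma = [1] * 64_000_000 on 64-bit CPython.
n = 64_000_000

# The list itself stores one pointer per entry (8 bytes each), so the
# pointer array alone is roughly 0.5 GB.
pointer_bytes = n * 8

# Initially every slot points at the same cached small-int object, so
# that is nearly the whole cost.  But sigma[j] += q replaces entries
# with distinct int objects of ~28-32 bytes each, multiplying the
# footprint several times over as the sieve runs.
int_obj_bytes = sys.getsizeof(n * n)  # size of a typical sigma value

print(pointer_bytes // 2**20, "MB of pointers")
print(int_obj_bytes, "bytes per distinct int object")
```

That growth during the sieve loops, not the initial allocation, is consistent with the machine freezing "a couple of loops later" as it starts swapping.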
asked May 30, 2011 at 4:44
  • You allocated a 500MB array on the stack in C++? Commented May 30, 2011 at 4:52
  • Which version of python are you using ? Commented May 30, 2011 at 4:54
  • @Nicholas Well, that's a daily meal for contest programmers :) Commented May 30, 2011 at 4:56
  • @Ziyao Wei, on a side note, How many have you solved so far ? Commented May 30, 2011 at 5:07
  • @Pavan Up until now, 164. Working towards the next level:) How about you? Commented May 30, 2011 at 5:28

2 Answers


Python lists are lists of objects. A number in Python is an object in itself, and takes up somewhat more storage than the 64 bits needed to represent a long long in C++. In particular, Python transparently handles numbers larger than 32 bits, which end up taking a lot more space than a simple integer.
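The per-element overhead is easy to measure. A minimal sketch (sizes are CPython-specific and vary by build):

```python
import sys

# A C++ long long is 8 bytes; every Python int is a full heap object
# carrying a refcount and type pointer on top of the value itself.
print(sys.getsizeof(1))        # typically 28 bytes on 64-bit CPython
print(sys.getsizeof(1 << 40))  # larger values need extra digit storage
```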

You may find the standard Python array module useful. It provides Python-compatible access to uniform arrays of integers of a specified size. (However, I note that it doesn't offer a 64-bit integer type.)
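A minimal sketch of the array module applied to this problem. Note that 'l' is a C signed long, whose width is platform-dependent (8 bytes on most 64-bit Unix builds); Python 3.3 later added 'q' for a guaranteed 64-bit signed integer:

```python
from array import array

# array stores raw machine values with no per-object overhead,
# so 64 million entries cost n * itemsize bytes, as in C++.
sigma = array('l', [1] * 10)   # 'l' = C signed long
sigma[3] += 2 * 2              # elements support in-place arithmetic
print(sigma.itemsize)          # bytes per element on this platform
print(sigma[3])
```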

answered May 30, 2011 at 4:49

2 Comments

So this happens maybe because Python changes the sizes of the int variables during the run, so the memory dried up? Thanks :)
I don't want to sound like an idiot, but "standard Python array module"? I thought numeric arrays were something unique to numpy. But then again I googled it myself and found docs.python.org/library/array.html. Only to find the entry points to be ridiculous :D Somehow array([1,2,3],'l') seems easier than array('l',[1,2,3])...
range(1,64000000):

This line creates a full list of size 64000000, so now you have two lists of that size in memory. Use xrange instead.
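That advice applies to Python 2; in Python 3 (which, as the comments below reveal, the asker is actually using), range is already a lazy object and never materializes the full list. A quick sketch of the difference in Python 3:

```python
import sys

# Python 3's range is a lazy sequence: constant memory regardless of n.
r = range(1, 64_000_000)
print(sys.getsizeof(r))             # a few dozen bytes, independent of n

# Only explicitly materializing it pays the full list cost.
lst = list(range(1, 1000))
print(sys.getsizeof(lst) > sys.getsizeof(r))
```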

answered May 30, 2011 at 4:47

5 Comments

Thanks! This is something new to a Python newbie like me. However, my program did not freeze until the 49th and 97th loop, respectively:)
The behavior of range is different from python 2.7 to python 3.2. range in python 3.2 is an extended version of xrange from 2.7.
Ah but there's no way it can be python 3 if I put my fingers in my ears and shout "my old code will always work! Lalalalala!"
My code indeed is coded in Python 3. Is there any idea how to improve this using the new version's feature?
@Ziyao: just stay the course. Pavan is right, Python 3's range doesn't create a full list in memory, that was a python 2 thing. I'd listen to Greg if I were you.
