3

In my unit test I am deliberately trying to raise an OutOfMemoryError. I use a simple statement like the following:

byte[] block = new byte[128 * 1024 * 1024 * 1024];

The code works on Win7 64-bit with jdk6u21 64-bit. But when I run this on CentOS 5 64-bit with jdk6u21, no OutOfMemoryError is thrown, even when I make the array bigger.

Any idea?

Pascal Thivent
asked Jul 16, 2010 at 2:28

7 Answers

5

Linux doesn't always allocate all the memory you ask for immediately, since many real applications ask for more than they need. This is called overcommit (it also means the kernel sometimes guesses wrong, and the dreaded OOM killer strikes).

For your unit test, I would just throw OutOfMemoryError manually.
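For instance, if the test only needs to exercise the error-handling path, a manually thrown OutOfMemoryError behaves exactly like a real one. A minimal sketch of that idea (class and method names are made up for illustration, not from this answer):

```java
public class OomSimulationTest {
    // Runs the task and reports whether it blew up with OutOfMemoryError.
    static String handle(Runnable task) {
        try {
            task.run();
            return "ok";
        } catch (OutOfMemoryError e) {
            return "oom";
        }
    }

    public static void main(String[] args) {
        Runnable boom = new Runnable() {
            public void run() { throw new OutOfMemoryError("simulated"); }
        };
        System.out.println(handle(boom)); // prints "oom"
    }
}
```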

answered Jul 16, 2010 at 2:31

2 Comments

I really wanted to post an answer consisting solely of throw new OutOfMemoryError(); but there'd be no point now... +1
Sorry, maybe I wasn't clear. I don't want to simply throw the OOM, but rather to exhaust the memory until it is used up, because I am testing soft memory references.
4

If you just want to consume all the memory, do the following:

 try {
     List<Object> tempList = new ArrayList<Object>();
     while (true) {
         tempList.add(new byte[128 * 1024 * 1024 * 1024]);
     }
 } catch (OutOfMemoryError OME) {
     // OK, Garbage Collector will have run now...
 }
answered Jul 16, 2010 at 7:32

1 Comment

Thanks! This works. I just changed the new byte[128 * 1024 * 1024 * 1024] to new byte[Integer.MAX_VALUE].
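Since the stated goal is to test soft references, a hedged variation on this loop allocates smaller chunks so the collector gets a chance to react before the error: the SoftReference javadoc guarantees that all softly reachable objects are cleared before the VM throws OutOfMemoryError. Run it with a small heap (e.g. -Xmx64m) so it finishes quickly; the class name is illustrative:

```java
import java.lang.ref.SoftReference;
import java.util.ArrayList;
import java.util.List;

public class SoftRefPressure {
    public static void main(String[] args) {
        SoftReference<byte[]> soft = new SoftReference<byte[]>(new byte[1024 * 1024]);
        List<byte[]> hog = new ArrayList<byte[]>();
        try {
            while (true) {
                hog.add(new byte[1024 * 1024]); // exhaust the heap 1 MB at a time
            }
        } catch (OutOfMemoryError e) {
            hog.clear(); // drop the strong references so the test can keep running
        }
        // Per the SoftReference guarantee, the referent was cleared before the OOME.
        System.out.println(soft.get()); // prints "null"
    }
}
```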
3

128 * 1024 * 1024 * 1024 = 0 because the multiplication is done in 32-bit int arithmetic and overflows. Java array lengths are ints, so an array cannot have more than Integer.MAX_VALUE elements (about 2 GB for a byte array).
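The overflow is easy to demonstrate; making one operand a long changes the arithmetic (the class name is illustrative):

```java
public class OverflowDemo {
    public static void main(String[] args) {
        // All operands are int, so the product wraps: 2^37 mod 2^32 == 0.
        System.out.println(128 * 1024 * 1024 * 1024);   // prints 0
        // With a long operand the product is computed in 64-bit arithmetic.
        System.out.println(128L * 1024 * 1024 * 1024);  // prints 137438953472
    }
}
```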

answered Jul 16, 2010 at 5:05

Comments

1
ulimit -v 102400 
ulimit -d 102400
unitTest.sh

The above should limit your unit test to 100 MB of virtual memory and a 100 MB data segment size (ulimit takes sizes in kilobytes, so 102400 KB = 100 MB). When you reach either limit, your process should get ENOMEM. Be careful: these restrictions persist until the process/shell where you set them exits, so you might want to run them in a subshell.

See man 2 setrlimit for details on how this works under the hood, and help ulimit for the ulimit builtin.
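A sketch of the subshell approach (the launcher script name is a placeholder):

```shell
# Run the limits and the test inside ( ... ) so the restrictions die
# with the subshell instead of sticking to your login shell.
(
  ulimit -v 102400   # virtual memory cap, in KB (100 MB)
  ulimit -d 102400   # data segment cap, in KB (100 MB)
  ./unitTest.sh      # placeholder for your actual test launcher
)
```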

answered Jul 16, 2010 at 3:34

Comments

1

You could deliberately set the maximum heap size of your JVM to a small amount by using the -Xmx flag.

Launch the following program:


public final class Test {
    public static void main(final String[] args) {
        final byte[] block = new byte[Integer.MAX_VALUE];
    }
}

with the following JVM argument: -Xmx8m

That will do the trick:


Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
 at Test.main(Test.java:4)
answered Jul 16, 2010 at 8:16

Comments

1

Minor point, but allocating new long[Integer.MAX_VALUE] will use up memory 8x faster: about 16 GB per array, versus about 2 GB for a byte array of the same length.

answered Jul 17, 2010 at 7:41

Comments

0

The reason no OutOfMemoryError is thrown is that the memory is being allocated in an uncommitted state, with no pages actually backing it.

If you write a non-zero byte into each 4 KB page of the array, that will force the memory to actually be committed.
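A sketch of that page-touching approach (4096 is an assumed page size, and the JVM may already commit pages when it zero-initializes the array, depending on platform and flags):

```java
public class TouchPages {
    public static void main(String[] args) {
        byte[] block = new byte[64 * 1024 * 1024]; // 64 MB, small enough to demo
        // Write one non-zero byte into each (assumed) 4 KB page so the OS
        // must back it with real memory.
        for (int i = 0; i < block.length; i += 4096) {
            block[i] = 1;
        }
        System.out.println((int) block[0] + " " + (int) block[4096]); // prints "1 1"
    }
}
```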

answered Jul 16, 2010 at 2:37

2 Comments

Why each 4K of the array? If I fill the entire array with a byte value, as in the following, would it be the same? byte b = 10; Arrays.fill(block, b);
The poster assumes a 4 KB page size (which is reasonable, AFAIK). Writing one byte per 4 KB is faster than filling the entire array, and either will commit the pages.
