Creating Java Memory Leaks with WeakHashMap
For many people Java is not only a programming language, it's more: it's a religion!
- They have a strong and unbreakable belief in the optimization capabilities of Java and neglect to pick proper algorithms.
- And with similar strength they trust in garbage collection.
So many developers have discovered the class WeakHashMap and use it in the belief that it solves all memory issues. But is this correct?
Simple WeakHashMap test
Let's look at a simple program which uses a single WeakHashMap:
import java.nio.charset.StandardCharsets;
import java.util.Random;
import java.util.WeakHashMap;

public class W1 {

    private final static Random RAND = new Random();
    private final static WeakHashMap<String, String> WH = new WeakHashMap<String, String>();

    public static void main(String[] args) {
        for (;;) {
            String s = newString();
            WH.put(s, s);
        }
    }

    private static String newString() {
        byte[] b = new byte[64];
        RAND.nextBytes(b);
        // equivalent to the deprecated new String(b, 0): decode each byte as Latin-1
        return new String(b, StandardCharsets.ISO_8859_1);
    }
}
You can run it and it runs endlessly, so the belief seems justified.
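To see the weak-reference mechanics in isolation, here is a minimal sketch (the class name WeakDemo and the gc/sleep timing are my own assumptions) in which the value does not reference the key, so the entry can be reclaimed once the key loses its last strong reference:

```java
import java.util.WeakHashMap;

public class WeakDemo {
    public static void main(String[] args) throws InterruptedException {
        WeakHashMap<Object, String> map = new WeakHashMap<Object, String>();
        Object key = new Object();
        map.put(key, "value");
        System.out.println("before GC: " + map.size()); // 1
        key = null;        // drop the only strong reference to the key
        System.gc();       // request a collection; not guaranteed, but usually honored
        Thread.sleep(100); // give the reference queue time to be processed
        System.out.println("after GC: " + map.size()); // typically 0
    }
}
```

Note that size() internally expunges entries whose weak keys have been cleared, which is why the map shrinks without any explicit remove() call.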
More complex WeakHashMap test
But wait: what happens if you use two WeakHashMap instances holding objects which refer to each other?
import java.util.WeakHashMap;

public class W2 {

    private final static WeakHashMap<Ref, Ref> RH1 = new WeakHashMap<Ref, Ref>();
    private final static WeakHashMap<Ref, Ref> RH2 = new WeakHashMap<Ref, Ref>();

    public static void main(String[] args) {
        for (;;) {
            Ref r1 = new Ref();
            Ref r2 = new Ref();
            r1.ref = r2;
            r2.ref = r1;
            RH1.put(r1, r2);
            RH2.put(r2, r1);
        }
    }

    static class Ref {
        public Ref ref;
    }
}
The result is (with -Xmx4096m):
Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
at java.util.WeakHashMap.newTable(WeakHashMap.java:189)
at java.util.WeakHashMap.resize(WeakHashMap.java:470)
at java.util.WeakHashMap.put(WeakHashMap.java:444)
at W2.main(W2.java:14)
Why is that?
Explanation of the WeakHashMap behaviour
The 2nd test results in an OOM because
- the table inside the WeakHashMap exceeds the available memory.
If the garbage collector could clear enough weak references, there would be enough free memory for the table. But it cannot: a WeakHashMap references its keys weakly, yet it references its values strongly. In W2 the value stored in RH1 is the key of RH2 and vice versa, so each map keeps the keys of the other map strongly reachable, and the weak references can never be cleared.
In addition, the garbage collector frees reference chains only after a while. For soft references this delay depends on the -XX:SoftRefLRUPolicyMSPerMB=<value> setting, which defaults to 1000: even with only 1 MB of free memory left, a softly reachable object survives for one second. This is why even unreferenced entries can live quite long while the memory fills up.
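The cross-reference problem can be reproduced without any object cycle at all, simply because each map stores the other map's key as a strongly referenced value. A minimal sketch (the class name CrossMapDemo and the gc/sleep timing are my own assumptions):

```java
import java.util.WeakHashMap;

public class CrossMapDemo {
    public static void main(String[] args) throws InterruptedException {
        WeakHashMap<Object, Object> m1 = new WeakHashMap<Object, Object>();
        WeakHashMap<Object, Object> m2 = new WeakHashMap<Object, Object>();
        Object a = new Object();
        Object b = new Object();
        m1.put(a, b); // m1 holds b strongly as a value
        m2.put(b, a); // m2 holds a strongly as a value
        a = null;
        b = null;
        System.gc();
        Thread.sleep(100);
        // stays 1/1: each map keeps the other map's key strongly reachable,
        // so neither weak key can ever be cleared
        System.out.println(m1.size() + "/" + m2.size());
    }
}
```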
- Does -XX:SoftRefLRUPolicyMSPerMB=1 help?
- It only affects the lifetime of soft references. If your application creates many references very fast, nothing changes. On the other hand, if you use soft references to cache something, you will encounter more cache misses, since their lifetime is shortened.
But it does not really help. Let's update the test program:
import java.util.WeakHashMap;

public class W3 {

    private final static WeakHashMap<Ref, Ref> RH1 = new WeakHashMap<Ref, Ref>();
    private final static WeakHashMap<Ref, Ref> RH2 = new WeakHashMap<Ref, Ref>();

    public static void main(String[] args) {
        for (;;) {
            Ref r1 = new Ref();
            Ref r2 = new Ref();
            RefData rdata = new RefData();
            r1.ref = rdata;
            r2.ref = rdata;
            rdata.ref = r1;
            rdata.ref2 = r2;
            RH1.put(r1, r2);
            RH2.put(r2, r1);
        }
    }

    static class Ref {
        public Ref ref;
    }

    static class RefData extends Ref {
        public Ref ref2;
        public byte data[] = new byte[0x1000];
    }
}
Here I added an additional object referencing the weakly referenced Ref instances, so the garbage collector needs more effort to determine whether a weakly referenced object still has a real hard reference. Each RefData also carries a 4 KB payload, so the memory fills up more quickly.
And: program W3 fails much faster.
Conclusion for the usage of WeakHashMaps
Java is not magic!
Even if you use a WeakHashMap, the memory may fill up and lead to an OutOfMemoryError.
Epic fail.
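One way to break such a cycle, suggested by the WeakHashMap javadoc itself, is to wrap the values in WeakReference objects before inserting them, so that a value no longer keeps a key strongly reachable. A sketch of the W2 scenario with wrapped values (the class name WrappedValueDemo and the gc/sleep timing are my own assumptions):

```java
import java.lang.ref.WeakReference;
import java.util.WeakHashMap;

public class WrappedValueDemo {

    static class Ref {
        public Ref ref;
    }

    public static void main(String[] args) throws InterruptedException {
        WeakHashMap<Ref, WeakReference<Ref>> rh1 = new WeakHashMap<Ref, WeakReference<Ref>>();
        WeakHashMap<Ref, WeakReference<Ref>> rh2 = new WeakHashMap<Ref, WeakReference<Ref>>();
        Ref r1 = new Ref();
        Ref r2 = new Ref();
        r1.ref = r2;
        r2.ref = r1;
        // wrapped values: the maps no longer hold strong references to each other's keys
        rh1.put(r1, new WeakReference<Ref>(r2));
        rh2.put(r2, new WeakReference<Ref>(r1));
        r1 = null;
        r2 = null;
        System.gc();
        Thread.sleep(100);
        System.out.println(rh1.size() + "/" + rh2.size()); // typically 0/0: the entries can be reclaimed
    }
}
```

The trade-off is that a value may now be garbage collected while its key is still alive, so every lookup has to check whether WeakReference.get() still returns a non-null value.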
Bug Submitted
You can view the bug at http://bugs.sun.com/bugdatabase/view_bug.do?bug_id=7145759 or even vote for it at http://bugs.sun.com/bugdatabase/addVote.do?bug_id=7145759
(send your comment to stefan@franke.ms)
rev: 1.5