Resolving OutOfMemoryError: unable to create new native thread
(This was tested on Red Hat EL6 JDK 1.6.31 64-bit)
If you've ever come across this error, it can be very misleading.
Caused by: java.lang.OutOfMemoryError: unable to create new native thread
        at java.lang.Thread.start0(Native Method)
        at java.lang.Thread.start(Thread.java:640)
The common suggestion (as the error message implies) is a memory-related problem. These pages offer some tips, but they were ultimately unhelpful:
http://candrews.integralblue.com/2009/01/preventing-outofmemoryerror-native-thread/
http://www.caucho.com/resin-3.0/performance/jvm-tuning.xtp
http://www.oracle.com/technetwork/java/javase/tech/vmoptions-jsp-140102.html
The common theme amongst all of them is the stack size.
We came across this post, which includes test code for reproducing the error:
http://www.odi.ch/weblog/posting.php?posting=411
However, the tests were inconsistent, so we modified the code to go only 50 calls deep. Modified code here.
We discovered that changing the options made very little difference to the maximum number of threads.
Originally we thought it was related to a change we had made the day before to the Linux hard/soft limits on the maximum number of open files:
# vi /etc/security/limits.conf
testuser soft nofile 4096
testuser hard nofile 10240

# ulimit -Hn
10240
However that didn't seem to be it.
Then we finally came across the maximum number of processes per user:
ulimit -a
core file size          (blocks, -c) 0
data seg size           (kbytes, -d) unlimited
scheduling priority             (-e) 0
file size               (blocks, -f) unlimited
pending signals                 (-i) 515005
max locked memory       (kbytes, -l) 64
max memory size         (kbytes, -m) unlimited
open files                      (-n) 4096
pipe size            (512 bytes, -p) 8
POSIX message queues     (bytes, -q) 819200
real-time priority              (-r) 0
stack size              (kbytes, -s) 10240
cpu time               (seconds, -t) unlimited
max user processes              (-u) 1024
virtual memory          (kbytes, -v) unlimited
file locks                      (-x) unlimited
This is set to 1024 by default. So adding this line to ~/.profile fixed it for us:
ulimit -u 4096
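As a quick sanity check (assuming bash), a fresh login shell should report the raised value after the ~/.profile change:

```shell
# Print the per-user process limit as seen by a fresh login shell.
# After adding "ulimit -u 4096" to ~/.profile this should report 4096
# (or "unlimited" on systems with no cap).
bash -lc 'ulimit -u'
```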
Update 2013/08/15:
To see the number of processes for a user, run this (replacing [user] with the username):
ps -eLF -u[user] | wc -l
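To put that count next to the limit, here is a minimal sketch (assuming GNU ps from procps on Linux; the user name is taken from whoami rather than typed in):

```shell
# Count the current user's threads (-L lists one entry per thread) and
# print the count alongside the per-user process limit for comparison.
user="$(whoami)"
threads=$(ps -LF -u "$user" --no-headers | wc -l)
limit=$(ulimit -u)
echo "user=$user threads=$threads limit=$limit"
```

If the thread count is close to the limit, raising the limit (or finding the thread leak) is the fix.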
Saved my day! I ran into exactly the same problem and spent hours searching and tuning Java memory-related options without success.
This post hit the root cause of our application's OOME!
Alternatively, you can put the per-user limit here:
cd /etc/security/limits.d
Create a file similar to 90-nproc.conf with a smaller number prefix, e.g. 80-nproc.conf, then put the per-user process limit there.
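For example, a drop-in file along these lines (the filename, user, and numbers are illustrative, not from the post; nproc is the per-user process/thread limit):

```
# /etc/security/limits.d/80-nproc.conf (hypothetical example)
testuser  soft  nproc  4096
testuser  hard  nproc  8192
```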
XF
Excellent!!!
We had the same problem with the JADE Framework.
Thanks for your help, this is the solution.
Hi Chris, very good post on this common problem.
You may also be interested in other common root causes of this problem, such as native heap depletion of the Java process, especially for a 32-bit JVM. Thread dump analysis is also very useful when your application / Java EE container is firing too many threads.
Looking forward to more articles from you.
Thanks.
P-H
Thanks - a very big help! I was also trying to understand threads in Java, and found this page to be very helpful:
How to Create a thread in Java
I could not understand where this ~/.profile file resides or what is to be modified in that file. Please help me.
I agree with Hagasharath: "I could not understand where this ~/.profile file resides or what is to be modified in that file. Please help me." Could you clarify the steps taken to perform this process?
Thanks,
It's the environment init script for the current user, in their home directory.
See http://linux.die.net/Bash-Beginners-Guide/sect_03_01.html
Helped me too... Really good post.
It helped me with "unable to create new native thread".
Thanks.
Shyam
Hi Chris,
First, sorry for posting a question in the Red Hat box, but I don't have a solution to my problem :(
How do I solve the same problem on Windows with 32-bit Java? I am facing that error.
I tried increasing the -Xmx and decreasing the -Xss parameters, but it's no better :(.
I set -Xmx1024M, but usage grows beyond that value, and the Java application stops working when it reaches the maximum memory (about 1.8 GB).
Thanks,
Thai Huynh
Hi Thai Huynh,
On a 32-bit machine you have to decrease the heap size to make more memory available for native allocations. You can google it for more details. It is relatively simple to fix this problem on a 32-bit machine.
Thank you Christopher! Your post led me in the right direction. I found a file in /etc/security/limits.d that was limiting the number of processes for all non-root users to 1024, which is not enough for Jenkins. By the way that is the "correct" place to set the limits in Linux (at least in RHEL/CentOS).
In addition to that, you have "ps -eLF -u[user] | wc -l", which should actually just be "ps -LFu [user] | wc -l" ... the "-e" shows all processes (everyone) and overrides the -u :)
~tommy
Hi Chris,
I have been facing the same issue of OutOfMemory: unable to create new native thread.
The issue comes up when my SOAP request tries to insert/update records in the database. The update fails after inserting approximately 600 records at a time, throwing the native thread error and shutting down the server.
Although the suggestions and guidelines you provided are for Linux, my application setup, EAServer6 (a Sybase product), is configured to run on JDK 1.7, so I found the post helpful for my purposes as well.
EAServer6 (a 32-bit server application) is running on a Windows Server 2008 R2 Standard 64-bit machine.
To run this server, script picks the below parameters:
REM set DJC_ARCH_64=true
REM set DJC_JAVA_HOME_17=C:\Program Files (x86)\Sybase\EAServer6\jdk\jdk1.7.0_09
set DJC_RT_DEFAULT=17
set DJC_JDK_DEFAULT=17
set DJC_JVM_MINHEAP=128M
set DJC_JVM_MAXHEAP=512M
set DJC_JVM_USER_ARGS=%DJC_JVM_USER_ARGS% -XX:PermSize=128M
set DJC_JVM_USER_ARGS=%DJC_JVM_USER_ARGS% -XX:MaxPermSize=512M
set DJC_JVM_USER_ARGS=%DJC_JVM_USER_ARGS% -XX:StackShadowPages=15
set DJC_JVM_USER_ARGS=%DJC_JVM_USER_ARGS% -Xss256k
REM set CLASSPATH=C:\Program Files (x86)\Sybase\Shared\PowerBuilder\pbjdbc12125.jar;C:\Program Files (x86)\Sybase\Shared\PowerBuilder\pbejbclient125.jar;%CLASSPATH%
set CLASSPATH=%CLASSPATH%;C:\Program Files (x86)\Sybase\EAServer6\genfiles\java\classes;
I tried tweaking the stack size and heap parameters to get out of the issue, but it has been of no use.
I could see that you also advised including the max processes per user, but I have no idea how or where I need to add this parameter value.
Do you have any suggestions or info on how I can proceed to fix this issue in EAServer6?
Thanks,
Kunal
Hi Kunal,
This post is specific to Linux/Unix, as Windows has a different threading model. I don't recall seeing this error before on Windows, but it could just be a standard out-of-heap-space error. Yours is only set to 512M; for win32 you can typically go as high as 1200M.
Also, for windows specific options see:
http://docs.oracle.com/javase/7/docs/technotes/tools/windows/java.html
But you'd likely find the solution on Sybase's website.