I have developed a Fusion Web Application in Oracle ADF.
I deployed this application on a WebLogic server (Linux, 64-bit) and load tested it with JMeter.
I initially tested with 50 users, then 100, then 500 users.
The problem is that the server's memory and swap space are never freed.
Usage keeps increasing even when I rerun the 10-user test multiple times.
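Before assuming a leak, it is worth separating JVM heap growth from OS-level numbers, since Linux keeps freed memory in the page cache and `free` will look "used" even when nothing is wrong. A minimal diagnostic sketch with standard JDK and Linux tools (`<pid>` is a placeholder for the WebLogic server's JVM process id, found with `jps -lv`):

```shell
jstat -gcutil <pid> 5000   # heap occupancy and GC activity, sampled every 5 s
ps -o rss= -p <pid>        # resident set size of the JVM process, in KB
free -m                    # note: "buff/cache" is reclaimable page cache, not a leak
vmstat 5                   # si/so columns show real swap-in/swap-out activity
```

If `jstat` shows the old generation dropping back to a stable baseline after each full GC while the process RSS keeps climbing, the growth is outside the Java heap (e.g. direct buffers or native memory) rather than a heap leak.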
We have two WildFly 16 servers running on Linux: the first with JDK 11.0.2, the second with JDK 8.
WildFly 1 has a remote outbound connection to WildFly 2 that is used for HTTP remoting. This is necessary because WildFly 2 has to run on 32-bit Java 8.
When we run a load test, after about 100,000 requests from WildFly 1 to WildFly 2 the response time increases steadily.
A heap dump analysis of WildFly 2 using MAT gives us some information about the problem: the dump shows a large number of io.netty.buffer.PoolChunk instances that use about 73% of the memory.
It seems the inbound buffers are not being released properly.
WildFly 2 does not recover when the load stops.
Is there any workaround or setting to avoid this?
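I cannot point to a single documented fix, but two standard Netty 4.x system properties are worth trying on WildFly 2 as a diagnostic/workaround; this assumes the remoting layer honours them (they are Netty properties, not WildFly-specific switches, so verify them against the Netty version bundled with your WildFly):

```shell
# Appended to standalone.conf on WildFly 2 (assumption: Netty 4.x properties apply here)
JAVA_OPTS="$JAVA_OPTS -Dio.netty.allocator.type=unpooled"       # bypass the pooled allocator (the PoolChunk source)
JAVA_OPTS="$JAVA_OPTS -Dio.netty.leakDetection.level=paranoid"  # log ByteBufs that are never released
```

The unpooled allocator trades some throughput for memory that the GC can actually reclaim; paranoid leak detection is expensive and is meant for diagnosis only. If the leak-detection log names the buffer owner, that pinpoints which component is not releasing the inbound buffers.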
We are facing the issue below in our production environment, but it is not reproducible in the QA environment. Please help to resolve this issue.
We have a Java application deployed on a JBoss EAP 6.2 cluster (domain-controller setup), using HornetQ as the messaging system with four queues and one topic, consumed via MDB and MDP listeners. The application works fine, with good server health, for about a week; after a week the server crashes with an out-of-memory error. I took a heap dump and analysed it with MAT: there are many instances of ClientSessionFactoryImpl (around 173,329 instances occupying 707 MB of heap memory) being created and never garbage collected.
Problem Suspect 1
173,329 instances of "org.hornetq.core.client.impl.ClientSessionFactoryImpl", loaded by "org.jboss.modules.ModuleClassLoader @ 0x760004438", occupy 741,790,528 (83.13%) bytes. These instances are referenced from one instance of "java.util.HashMap$Entry[]", loaded by "<system class loader>".
Keywords
org.jboss.modules.ModuleClassLoader @ 0x760004438
org.hornetq.core.client.impl.ClientSessionFactoryImpl
java.util.HashMap$Entry[]
OS: RHEL 6.8
Server: JBoss EAP 6.2
Java: JDK 1.6_24
MAT Screenshot
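A count of 173,329 factory instances held alive through a HashMap$Entry[] usually means session factories are being created per operation and never closed, so HornetQ's internal registry keeps them forever. A sketch of the lifecycle fix, using the HornetQ core client API in JDK 6 syntax (illustrative only; it is not runnable without a HornetQ server and the client jars, and your code may be going through JMS/MDB wrappers instead):

```java
// Sketch: every createSessionFactory() must be paired with close(),
// otherwise factories accumulate exactly as in the MAT report above.
import org.hornetq.api.core.TransportConfiguration;
import org.hornetq.api.core.client.ClientSession;
import org.hornetq.api.core.client.ClientSessionFactory;
import org.hornetq.api.core.client.HornetQClient;
import org.hornetq.api.core.client.ServerLocator;
import org.hornetq.core.remoting.impl.netty.NettyConnectorFactory;

public class SessionUsage {
    public void sendOnce() throws Exception {
        ServerLocator locator = HornetQClient.createServerLocatorWithoutHA(
                new TransportConfiguration(NettyConnectorFactory.class.getName()));
        ClientSessionFactory factory = null;
        ClientSession session = null;
        try {
            factory = locator.createSessionFactory();
            session = factory.createSession();
            // ... produce/consume messages ...
        } finally {
            if (session != null) session.close();
            if (factory != null) factory.close(); // the missing close is the usual cause of this leak
            locator.close();
        }
    }
}
```

In practice the better fix is to create the ServerLocator and ClientSessionFactory once at application startup and reuse them, opening only short-lived sessions per operation; either way, whatever is created must be closed, or the factories stay referenced from HornetQ's internal collections as seen in the heap dump.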
I am running a web application on Tomcat 8 on an Ubuntu web server. Recently I have had a problem with CPU usage hitting 100%. When I restart Tomcat it runs fine, but after a day, or even a few hours, the same problem (100% CPU) comes back and makes my site slow. When I look at the process list with htop I see many processes like:
/opt/java8/jre/bin/java -Djava.util.logging.config.file=/opt/tomcat8/conf/logging.properties -Djava.util.logging.manager=org.apache.juli.ClassLoader
The website is a live eCommerce site with around 100 concurrent users at any moment.
The server is an AWS EC2 instance running Tomcat with the Java application inside it; the MySQL database is on a separate AWS RDS instance.
What can I do in this situation? Please help.
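A standard first step is to find out which thread is burning the CPU and what code it is running, using only JDK and Linux tools. A sketch (`<pid>` and `<tid>` are placeholders; htop shows Java threads as separate "processes", which is why the list looks long):

```shell
top -H -p <pid>        # per-thread view of the Tomcat JVM; note the TID of the hottest thread
printf '%x\n' <tid>    # convert that TID to hex
jstack <pid> > td.txt  # take a thread dump, then search td.txt for nid=0x<hex-tid>
```

The stack trace under that `nid` is the code consuming the CPU; taking two or three dumps a few seconds apart and seeing the same stack confirms it. Common culprits in this setup are GC thrashing (check with `jstat -gcutil <pid> 5000`) and slow MySQL queries causing request pile-up on the RDS side.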
I'm using JBoss (WildFly 9) to host my Solr application on Windows Server 2012, but JBoss is very slow to start, taking at least 15 minutes. There is only one app deployed (solr.war).
The same JBoss on my local machine, with the same WAR, takes less than a minute to start.
The JVM is set to 4 GB min and 16 GB max on the server.
What could be the reason it is so slow?
Try starting the server without the WAR. If boot time is then normal, look into the WAR's code; otherwise check the JBoss version and the logs.
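The suggestion above can be done without moving files, assuming the default standalone deployment-scanner layout: a `.skipdeploy` marker file is a standard scanner mechanism that keeps the WAR from being deployed.

```shell
REM Run from the WildFly home directory (Windows cmd syntax).
REM .skipdeploy is a standard deployment-scanner marker file.
type nul > standalone\deployments\solr.war.skipdeploy
bin\standalone.bat
REM Fast boot now?  The delay is inside solr.war.
REM Still slow?    Compare timestamps in standalone\log\server.log to see
REM                which subsystem eats the 15 minutes.
```

Delete the marker file afterwards to re-enable deployment.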
I am currently load testing my web application (Spring + Hibernate based) on a standalone Tomcat server (v7.0.27) on a Windows Server 2008 machine. I need to know how Tomcat behaves as bulk requests come in, e.g.
300 requests received: current heap size, whether the server is hung up or unable to process, sizes and numbers of objects on the heap, and so forth.
Is there a way to see this already? (The info from the manager app, current active threads and memory occupied, is insufficient for my requirement.)
P.S. The maxThreads property of the Connector element is 350.
Update: Another issue I faced while load testing: Tomcat hangs up when I send 300 requests, in some cases.
Any help would be greatly appreciated.
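One option besides an external monitor: the application itself can log these numbers during the load test through the standard java.lang.management API (the same source JConsole reads). A minimal standalone sketch, which in practice you would call from a servlet or a scheduled logger inside the webapp:

```java
import java.lang.management.ManagementFactory;
import java.lang.management.MemoryUsage;

public class HeapProbe {
    public static void main(String[] args) {
        // Heap occupancy, as JConsole would report it
        MemoryUsage heap = ManagementFactory.getMemoryMXBean().getHeapMemoryUsage();
        System.out.println("heap used MB: " + heap.getUsed() / (1024 * 1024));
        System.out.println("heap max MB: " + heap.getMax() / (1024 * 1024));
        // Live thread count -- useful to watch against maxThreads=350
        System.out.println("live threads: " + ManagementFactory.getThreadMXBean().getThreadCount());
    }
}
```

Per-object counts and sizes are not available from this API; for those, take a histogram with `jmap -histo <pid>` while the test runs.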
You can use JConsole, which ships with the JDK:
http://docs.oracle.com/javase/6/docs/technotes/guides/management/jconsole.html
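To let JConsole attach to a Tomcat running as a service or on another machine, JMX must be enabled with the standard com.sun.management.jmxremote properties. A sketch (port 9010 is an arbitrary pick, and authentication/SSL are disabled here, so use this only on a trusted test network):

```shell
rem bin\setenv.bat -- picked up by catalina.bat at startup
set "CATALINA_OPTS=-Dcom.sun.management.jmxremote -Dcom.sun.management.jmxremote.port=9010 -Dcom.sun.management.jmxremote.ssl=false -Dcom.sun.management.jmxremote.authenticate=false"
```

Then connect JConsole to `<host>:9010`; the Memory and Threads tabs show exactly the heap-size and thread numbers asked about above.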
If the server hangs, there might be a deadlock.
You can try attaching with JProfiler; the monitoring section will show you the current locking situation and any deadlock.
Disclaimer: My company develops JProfiler.
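The JDK can also detect monitor deadlocks on its own: `jstack <pid>` appends a "Found one Java-level deadlock" section when one exists, and the same check is available programmatically via ThreadMXBean. A minimal sketch:

```java
import java.lang.management.ManagementFactory;
import java.lang.management.ThreadInfo;
import java.lang.management.ThreadMXBean;

public class DeadlockCheck {
    public static void main(String[] args) {
        ThreadMXBean tmx = ManagementFactory.getThreadMXBean();
        // Returns null when no threads are deadlocked on monitors or ownable synchronizers
        long[] ids = tmx.findDeadlockedThreads();
        if (ids == null) {
            System.out.println("no deadlock detected");
        } else {
            for (ThreadInfo ti : tmx.getThreadInfo(ids)) {
                System.out.println("deadlocked: " + ti.getThreadName());
            }
        }
    }
}
```

Running this snippet standalone prints "no deadlock detected"; wired into the hung Tomcat (e.g. via a monitoring servlet) it would name the threads stuck when the 300-request test hangs.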