OK, this is driving me insane! It used to be infrequent, and now practically every character I type sends Eclipse into 'Not Responding' while the CPU rockets towards 100% and stays there for a minute. Sometimes this is accompanied by node.exe taking half the CPU and a LOT of memory. I kill node.exe and sometimes it stays off, but mostly it comes back.
I've looked up node.exe and can't figure out what it has to do with my application. I'm writing a webapp using Tomcat, Struts, Java, JSP, and jQuery. I disabled every plugin from Preferences -> Startup and Shutdown with no effect.
Help! I can't develop when every keypress takes a minute!
Take a look at https://bugs.eclipse.org/bugs/show_bug.cgi?id=442049 or https://github.com/Nodeclipse/nodeclipse-1/issues/159
I got past it by removing <nature>org.eclipse.wst.jsdt.core.jsNature</nature> from the project's .project file.
But you may have other JSDT-related issues.
In any case, you need to know exactly which process is consuming CPU, and at what rate.
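For reference, the nature entry sits in the .project file at the root of the project; a minimal sketch of what to delete (the project name here is made up and other entries are elided):

    <projectDescription>
      <name>mywebapp</name>
      ...
      <natures>
        <nature>org.eclipse.jdt.core.javanature</nature>
        <!-- removing the following line disables JSDT for the project -->
        <nature>org.eclipse.wst.jsdt.core.jsNature</nature>
      </natures>
    </projectDescription>

You may need to refresh or reopen the project for Eclipse to pick up the change.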
I would really suggest not using Eclipse for Node. Try Notepad++ on Windows, or Sublime, or Atom, or something...
Related
I'm using Guidewire Development Studio (an IntelliJ-based IDE), and it is very slow when handling big text files (~1500 lines and above). I tried with an out-of-the-box community IntelliJ as well, but hit the same problem.
When I open those files, it takes 1 second to type a character, even when I can see clearly that there is still plenty of free memory (1441 MB / 3959 MB). It also quickly sucks up all the memory if I open multiple files (I allocate 4 GB just for IntelliJ). IntelliSense and other automatic features are painfully slow as well.
I love IntelliJ, but working in those conditions is just so hard. Is there any way to work around this problem? I have thought of some alternatives, like:
Edit big files in another editor (e.g. Notepad++), then reload them in IntelliJ
Open another small file, copy your bit of code there, edit it, then copy it back. This would help because IntelliSense and code highlighting are maintained; however, it is troublesome
I did turn off all unnecessary plugins, leaving only the essential ones, but nothing improved much.
I am also wondering if I can "embed" an outside editor (Notepad++ or Notepad2, for example) in IntelliJ. I did my homework and googled around, but found no plugin or configuration that allows that.
Can anyone with experience give me some advice on how to work with big files in IntelliJ (without going mad)?
UPDATE: Through my research I learned that IntelliJ can break down on very large files (20 MB or so). But my files aren't that big; they are only about 100 KB - 1 MB, just very long text.
UPDATE 2: After trying to increase the heap memory as Sajidkhan advised (I changed both idea64.vmoptions and idea.vmoptions), I realized that somehow IntelliJ doesn't pick up the change. The heap is stuck at a maximum of 3 GB.
On another note, the slowness can be perceived even when the system is using only around 1 GB of heap, so I don't think the problem is memory-related.
After beating around the bush for a while, I found a workaround, kind of.
When I checked other answers to similar questions, I found that people start running into trouble when the file size is at least several MB. That didn't make sense, since I get the trouble when the files are only several KB. After more careful checking, I found that the Gosu plugin is the culprit: after I mark my Gosu file as "text only", the speed becomes normal.
So I guess the problem has something to do with the code highlighter and syntax inspections. For now, the best workaround I have is:
Right-click the file and mark it as plain text.
Close the file and open it again, then edit.
Note: Since this applies to every file type in the Guidewire development suite, you may want to permanently mark some long files as plain text, especially the *.properties (i18n/internationalization) files. The benefit of code auto-completion just isn't worth the trouble.
Try editing idea64.vmoptions in the bin folder. You can set the max heap and max PermGen to higher values.
Don't forget to restart!
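For example, the relevant lines might look like this (the values are just a guess; tune them to your machine, and note that idea.vmoptions is the file that applies if you use the 32-bit launcher):

    -Xms512m
    -Xmx2048m
    -XX:MaxPermSize=512m
    -XX:ReservedCodeCacheSize=256m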
Tested on different PCs. Even on fast processors the editor is painfully slow when working with large files (2000+ lines of code).
Eclipse and NetBeans are absolutely OK. Tuning .vmoptions will not help.
This bug is still not fixed: https://intellij-support.jetbrains.com/hc/en-us/community/posts/206999515-PhpStorm-extremely-slow-on-large-source-files
UPDATE: Try the 32-bit version with default settings. The 32-bit IDEA usually works faster and eats less memory.
I have tried three IDEs, all of which I'm fairly sure require Java to run, and all of them start up very, very slowly (30 seconds to 1 minute) on the first launch of the day. After that, they all start up lightning fast.
The three programs are: Aptana Studio 3, Eclipse, and PHP Webstorm.
Based on my web searches, I have modified AptanaStudio3.ini using some of the suggestions on how to speed it up, and they all work ... for every startup after the first launch, that is. The first launch of the day remains painfully and inexplicably slow.
I have searched SO and did not see any questions addressing this issue. If anyone can find an answer here that I missed, thank you very much, but I could not.
My only conclusion is that this issue is related to how Java runs on Windows 8, since all three programs are adversely affected. Is this a known bug in Java on Windows 8? I have no idea what to think, but I would greatly appreciate it if someone could offer help.
OBSERVATION: From my testing, it seems that if I start up my laptop and then launch Eclipse or Aptana within, say, the first 10 minutes of booting, it launches quicker (still slow, but not as bad) than if I wait about an hour and then launch my IDE. Not sure what this indicates.
Thanks
Though you can tune the Eclipse (or Aptana) .ini file and do things like disable class verification and boot using the JVM DLL, this has more to do with OS and hardware disk caching than with the JVM. Boot each of the IDEs from a RAM disk and you'll see that they boot just as quickly from RAM the first time as they do from 'disk' the second time.
Source: I've spent a lot of time trying to solve this problem already. :)
It might be worth checking your antivirus scanner behaviour - I have precisely this problem.
In spite of an SSD and a reasonably quick i5 on Win8 Ultimate, the first start time for Eclipse is measured in many minutes (it can be over 10), with subsequent restarts done in a matter of tens of seconds. The whole PC can do a full restart in about half a minute, so it's unlikely to be a raw I/O issue.
From looking at the CPU hogs and digging from there, it appears that the a/v (McAfee) is doing an on-access scan of all the Eclipse components and plugins after every boot, and I suspect this is where much of the time is going.
I'll post an update when I've persuaded someone to exclude Eclipse and the JVM from the on-access scan...
Since Aptana Studio is based on Eclipse, no big difference is to be expected.
This is not a bug specific to Java on Windows 8; I experienced it on Windows 7 as well. AFAIK it has to do with starting the JVM for the first time.
Of course you could throw a lot of memory at it or tweak the IDE's .ini file, but the JVM startup process wouldn't really be affected and it would still be slow. What is negligible for a server is a problem on the desktop. For details take a look at http://en.wikipedia.org/wiki/Java_performance#Startup%5Ftime
I'm making an application in Java using Eclipse Indigo. When I run it from Eclipse, Task Manager shows javaw.exe using 50 MB of memory. When I export the application as a runnable .jar and execute it, Task Manager shows javaw.exe using 500 MB.
Why is this? How could I fix this?
Edit: I'm using 64-bit Windows 7, and my system says I have Java 1.7 installed. Apparently the memory problem is caused by a while loop. I'll study what's inside the while loop causing the problem.
Edit: Problem found. At one point in the while loop, new BufferedImage instances were created instead of reusing the same BufferedImage.
Without any additional details about your code, I would suggest using a profiler to analyze the problem. I know YourKit and the one that is available for NetBeans are very good.
Once you run your app from the profiler, you should initially look at the objects and listeners created by your application's packages. If the issue is not there, you can expand your search to other packages until you identify things that are growing out of control, and then look at the code that handles those entities.
When you run certain parts of the code multiple times and still see memory utilization after that code stopped running, then you might have a leak and may consider nulling or emptying variables/listeners on exit.
It should be a good starting point, but please report your results back, so we know how it goes. By the way, what operating system are you using and what version of Java?
--Luiz
You need to profile your code to get the exact answer, but from my experience, when I see similar things I often attribute it to garbage collection. For example, I ran the same job twice, giving one run 10 GB and the other 2 GB. Both ran and completed, but the 10 GB one used more memory (and finished faster), while the 2 GB one, I believe, garbage collected, so it still completed but took a bit more time with less memory. I'm a bit new to Java, so I may be wrong in assuming it's garbage collection, but I have seen what you are talking about.
You need to profile your code (check out jconsole, which is included with Java, or VisualVM).
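If you just want a rough first impression before reaching for a profiler, a minimal sketch along these lines can help (purely illustrative; a real profiler gives far better data):

    // Crude heap check: compare used heap before and after a suspect block.
    public class HeapCheck {
        static long usedHeap() {
            Runtime rt = Runtime.getRuntime();
            return rt.totalMemory() - rt.freeMemory();
        }

        public static void main(String[] args) {
            long before = usedHeap();
            // ... run the suspect loop here ...
            long after = usedHeap();
            System.out.printf("Used heap: %d MB -> %d MB%n",
                    before / (1024 * 1024), after / (1024 * 1024));
        }
    }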
That sounds most peculiar.
I can think of two possible explanations:
You looked at the wrong javaw.exe instance. Perhaps you looked at the instance that is running Eclipse ... which is likely to be that big, or bigger.
You have (somehow) managed to configure Java to run with a large heap by default. On Linux you could do this with a wrapper script, a shell function or a shell alias. You can do at least the first of those on Windows.
I don't think it is the JAR file itself. AFAIK, you can't set JVM parameters in a JAR file. It is possible that you've somehow included a different version of something in the JAR file, but that's a bit of a stretch ...
If none of these ideas help, try profiling.
Problem found. At one point in the while loop, new BufferedImage instances were created instead of reusing the same BufferedImage.
Ah yes. BufferedImage uses large amounts of out-of-heap memory and that needs to be managed carefully.
But this doesn't explain why your application used more memory when run from the JAR than when launched from Eclipse ... unless you were telling the application to do different things.
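For anyone hitting the same thing, a minimal sketch of the difference (image size and drawing code are made up):

    import java.awt.Graphics2D;
    import java.awt.image.BufferedImage;

    public class FrameLoop {
        public static void main(String[] args) {
            // Leaky pattern: a new BufferedImage per iteration keeps
            // allocating large pixel buffers faster than the GC reclaims them:
            //   BufferedImage frame = new BufferedImage(1920, 1080,
            //           BufferedImage.TYPE_INT_ARGB);  // inside the loop
            //
            // Fix: allocate once outside the loop and reuse it.
            BufferedImage frame = new BufferedImage(1920, 1080,
                    BufferedImage.TYPE_INT_ARGB);
            for (int i = 0; i < 100; i++) {
                Graphics2D g = frame.createGraphics();
                g.clearRect(0, 0, frame.getWidth(), frame.getHeight());
                // ... draw the current frame here ...
                g.dispose();
            }
        }
    }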
We have a Java ERP-type application. Communication between server and client is via RMI. In peak hours there can be up to 250 users logged in, and about 20 of them are working at the same time. This means that about 20 threads are live at any given time in peak hours.
The server can run for hours without any problems, but all of a sudden response times get higher and higher. Response times can be in minutes.
We are running on Windows 2008 R2 with Sun's JDK 1.6.0_16. We have been using perfmon and Process Explorer to see what is going on. The only thing we find odd is that when the server starts to slow down, the number of handles the java.exe process has open is around 3500. I'm not saying that this is the actual problem.
I'm just curious if there are some guidelines I should follow to be able to pinpoint the problem. What tools should I use? ....
Can you access the log configuration of this application?
If you can, you should change the log level to "DEBUG". Tracing the DEBUG logs of a request could give you useful information about the contention point.
If you can't, profiling tools can help you:
VisualVM (Free, and good product)
Eclipse TPTP (Free, but more complicated than VisualVM)
JProbe (not Free but very powerful. It is my favorite Java profiler, but it is expensive)
If the application has been developed with JMX control points, you can plug in a JMX viewer to get information; a minimal sketch of such a control point follows below.
If you want to stress the application to trigger the problem (to verify whether it is a load problem), you can use a stress tool like JMeter.
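To illustrate what a JMX control point can look like, here is a minimal sketch (the RequestStats bean and its ObjectName are invented for the example); once the app is running, the attribute shows up in JConsole or VisualVM under the com.example domain:

    import java.lang.management.ManagementFactory;
    import javax.management.MBeanServer;
    import javax.management.ObjectName;
    import javax.management.StandardMBean;

    public class JmxExample {

        // The management interface exposed to JMX viewers.
        public interface RequestStatsMBean {
            long getRequestCount();
        }

        public static class RequestStats implements RequestStatsMBean {
            private volatile long requestCount;
            public void increment() { requestCount++; }
            public long getRequestCount() { return requestCount; }
        }

        public static void main(String[] args) throws Exception {
            RequestStats stats = new RequestStats();
            MBeanServer server = ManagementFactory.getPlatformMBeanServer();
            server.registerMBean(new StandardMBean(stats, RequestStatsMBean.class),
                    new ObjectName("com.example:type=RequestStats"));

            // Simulate work so there is something to watch.
            while (true) {
                stats.increment();
                Thread.sleep(1000);
            }
        }
    }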
Sounds like the garbage collector cannot keep up and starts "stop-the-world" collecting for some reason.
Attach with jvisualvm in the JDK when starting and have a look at the collected data when the performance drops.
The problem you're describing is quite typical, but also quite general. Causes can range from memory leaks and resource contention to bad GC policies and heap/PermGen-space allocation. To pinpoint the exact problems in your application, you need to profile it (I am aware of tools like YourKit and JProfiler). If you profile your application wisely, a few application cycles should reveal the problems; otherwise profiling isn't very easy in itself.
In a similar situation, I wrote a simple profiling utility myself. Basically I used a ThreadLocal holding a "StopWatch" (based on a LinkedHashMap), and then inserted calls like this at various points of the application: watch.time("OperationX");
Then, after the thread finishes a task, I'd call watch.logTime(), and the class would write a log line that looks like this: [DEBUG] StopWatch time:Stuff=0, AnotherEvent=102, OperationX=150
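The original code isn't at hand, but a minimal sketch of that idea might look like this (the names and log format are reconstructed from the description above):

    import java.util.LinkedHashMap;
    import java.util.Map;

    public final class StopWatch {
        // One StopWatch per thread, created lazily on first use.
        private static final ThreadLocal<StopWatch> CURRENT =
                new ThreadLocal<StopWatch>() {
                    @Override protected StopWatch initialValue() {
                        return new StopWatch();
                    }
                };

        private final Map<String, Long> events = new LinkedHashMap<String, Long>();
        private long last = System.currentTimeMillis();

        public static StopWatch get() { return CURRENT.get(); }

        // Record the millis elapsed since the previous event under a label.
        public void time(String label) {
            long now = System.currentTimeMillis();
            events.put(label, now - last);
            last = now;
        }

        // Emit one log line per finished task, then reset for the next one.
        public void logTime() {
            if (events.isEmpty()) return;
            StringBuilder sb = new StringBuilder("[DEBUG] StopWatch time:");
            for (Map.Entry<String, Long> e : events.entrySet()) {
                sb.append(e.getKey()).append('=').append(e.getValue()).append(", ");
            }
            System.out.println(sb.substring(0, sb.length() - 2));
            events.clear();
            last = System.currentTimeMillis();
        }
    }

Call sites then read StopWatch.get().time("OperationX") and, at the end of the task, StopWatch.get().logTime().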
After this I wrote a simple parser that generates CSV from this log (per code path). The best thing you can do is create a histogram (easily done using Excel). Averages, medians and even modes can fool you; I highly recommend creating a histogram.
Together with this histogram, you can create line graphs using the average/median/mode (whichever represents the data best; you can determine this from the histogram).
This way, you can be 100% sure exactly which operation is taking the time. If you can't determine the culprit, binary search is your friend (make the events finer-grained).
It might sound really primitive, but it works. Also, if you make a library out of it, you can use it in any project. It's also handy because you can easily turn it on in production as well.
Aside from the GC that others have mentioned, try taking thread dumps every 5-10 seconds for about 30 seconds during the slowdown. There could be a case where DB calls, a web service, or some other dependency becomes slow. If you take a look at the thread dumps you will be able to see threads which don't appear to move, and you can narrow down your culprit that way.
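The usual way is jstack <pid> from a script, but you can also sample from inside the JVM; a rough sketch (the interval and count are arbitrary):

    import java.lang.management.ManagementFactory;
    import java.lang.management.ThreadInfo;
    import java.lang.management.ThreadMXBean;

    public class ThreadDumper {
        public static void main(String[] args) throws InterruptedException {
            ThreadMXBean threads = ManagementFactory.getThreadMXBean();
            for (int i = 0; i < 6; i++) {            // ~30 seconds of samples
                System.out.println("=== dump " + i + " ===");
                for (ThreadInfo info : threads.dumpAllThreads(false, false)) {
                    System.out.print(info);          // name, state, top stack frames
                }
                Thread.sleep(5000);                  // every 5 seconds
            }
        }
    }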
From the GC standpoint, do you monitor your CPU usage during these times? If the GC is running frequently, you will see a jump in your overall CPU usage.
If only this was a Solaris box, prstat would be your friend.
For acute issues like this a quick jstack <pid> should quickly point out the problem area. Probably no need to get all fancy on it.
If I had to guess, I'd say HotSpot jumped in and tightly optimised some badly written code. NetBeans grinds to a halt where it uses a WeakHashMap with newly created objects to cache file data. Once optimised, the entries can be removed from the map straight after being added. Obviously, if the cache is being relied upon, a lot of file activity follows. You probably won't see the drive light up, because it'll all be cached by the OS.
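To make that anti-pattern concrete, a small sketch (the cache contents and the "expensive read" are invented): WeakHashMap drops an entry as soon as its key is no longer strongly referenced elsewhere, so caching against freshly created key objects means the GC can evict entries almost immediately and every lookup turns into a miss.

    import java.util.Map;
    import java.util.WeakHashMap;

    public class CacheDemo {
        private static final Map<String, byte[]> CACHE =
                new WeakHashMap<String, byte[]>();

        static byte[] load(String path) {
            // new String(...) makes a key that only the map references,
            // so the GC may evict the entry right after we put it.
            String key = new String(path);
            byte[] data = CACHE.get(key);
            if (data == null) {
                data = expensiveRead(path);   // stand-in for real file I/O
                CACHE.put(key, data);
            }
            return data;
        }

        private static byte[] expensiveRead(String path) {
            return new byte[64 * 1024];       // pretend this hit the disk
        }
    }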
I assume the latest update version of Java would provide better performance.
I am looking for a way to isolate software components from endless loops and memory leaks. Android isolates each app in its own process, and Google Chrome isolates each tab in its own process.
My primary drawback is that Java takes so long to start, and I would also like to reduce memory consumption.
Is there any alternate build or more controlled startup that will accomplish this?
If quick startup is your goal, Java on a PC may not be your best bet. It's going to take a few seconds because that's how long it takes to load the VM from disk.
If you want your app to start more quickly, it's easy to get a splash screen up: create a module that only loads your splash screen, waits for it to fully display, then uses reflection to link to your "real" main module.
(Use reflection because otherwise it will pull in your entire program through references before it starts the main one; at least that's how it used to work.)
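A minimal sketch of that trick (com.example.RealMain is a made-up class name; the point is that the only reference to it is a string, so the classloader doesn't drag in the rest of the app up front):

    import javax.swing.JFrame;
    import javax.swing.JLabel;

    public class Splash {
        public static void main(String[] args) throws Exception {
            JFrame splash = new JFrame("Loading...");
            splash.add(new JLabel("Starting up, please wait..."));
            splash.setSize(300, 100);
            splash.setVisible(true);

            // No compile-time reference to the real main class:
            Class<?> realMain = Class.forName("com.example.RealMain");
            realMain.getMethod("main", String[].class)
                    .invoke(null, (Object) args);

            splash.dispose();
        }
    }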
If you're talking about run-time performance, you won't get quicker by changing languages; Java's about as fast as you can get. You MIGHT be able to get a boost by converting to C/C++ and rewriting the code to suit those platforms (less OO, stack allocations instead of heap, etc.), but otherwise none of the other languages in general usage come close to Java in speed.
If you really need the quick startup, depending on what you are doing there may be some tricks. I've seen projects that try to keep a Java VM running in your toolbar and allow you to make requests of it (tell it to start an app). This was faster, but made additional demands of the user (loading this additional tool).
Another possibility: if you are constantly starting up and shutting down small tasks, and that's why the startup bothers you, you can definitely speed things up by keeping the app running invisibly. Just have your Java app open a socket and listen for commands, then create a little .EXE or shell script that starts your program if it's not running, or sends commands to that socket if it is. This would completely eliminate startups after the first run.
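The listening side can be as simple as this sketch (the port and the one-line command protocol are invented; the launcher script would try to connect first and only start the JVM on a refused connection):

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.net.ServerSocket;
    import java.net.Socket;

    public class CommandServer {
        public static void main(String[] args) throws Exception {
            ServerSocket server = new ServerSocket(52000);   // arbitrary local port
            while (true) {
                Socket client = server.accept();
                BufferedReader in = new BufferedReader(
                        new InputStreamReader(client.getInputStream()));
                String command = in.readLine();              // e.g. "run mytask"
                System.out.println("Received: " + command);
                // ... dispatch the command to the right task here ...
                client.close();
            }
        }
    }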
In general, Java has a much longer startup time than other languages. If you are sticking with Java on a desktop app, a lot of stuff like startup time is determined by the JRE installed on the client's computer, which you can't control.
As to "endless memory leaks"... Java doesn't leak memory. If your program does, fix it.
This is a second answer because it's completely different and my other got too long :)
Try compiling it to native code; I think GCC can do it via its GCJ front end (something like gcj --main=MyMain -o myapp MyMain.java, with your real main class). This could almost completely eliminate your startup. I believe Jikes used to be a Windows Java compiler by IBM, but I don't know if it's still maintained.
Note that compiled code will probably run slower than JVM code for long-running apps.