Something that really irritates me while developing apps is Eclipse suddenly throwing random errors. Below are some of them.
Not being able to recognize Java libraries
Error: Import error, cannot resolve java.**
Solution: Clean projects and then restart eclipse.
Not being able to recognize reference libraries
Error: Multiple related errors
This is probably the one that bothers me the most. I have libraries such as Google Play Services, appcompat v7, etc. that I use together in various apps. These libraries are saved locally on my computer.
Solution:
Small fix: clean projects -> restart eclipse
Major fix: To fix the resolution error, I have to copy the library, rename it, and then add that new library as a reference/support library to my project again.
R can't be resolved
Solution: Multiple solutions
Way 1: The package name suddenly stops being recognized. Go to the manifest file and fix the name.
Way 2: Check if R file even exists. Clean projects.
Way 3: Check if you are missing any necessary imports. Might cause the R file to be hidden. Might be related to the previous problems.
What I want:
Is there any way to practically solve these errors once and for all? My computer is really crappy and takes forever to restart Eclipse and launch the emulator.
It is really embarrassing, especially when I want to present to my colleagues and have to wait for Eclipse to restart.
Thanks in advance.
Many of your problems could actually be caused by your "crappy computer":
If your Eclipse doesn't have a large enough heap, then it is likely to be sluggish (in general).
If Eclipse runs with a heap that is too small, you could get OOMEs that Eclipse is unable to recover from properly. (They will probably show up in the eclipse log file.) This kind of thing could manifest as "random errors" due to breakage to Eclipse data structures.
So, a couple of practical things you could do are to give Eclipse a larger heap, and to get a machine with more RAM.
Increasing the max it can use to about 2 GB helps initially, but it slows down again as well.
This is a sign that you actually need more physical RAM and / or a 64-bit operating system.
You are probably putting your machine into a state where the memory page "working set" of the stuff you are running is larger than the physical memory available to hold it. The virtual memory system tries to address this by "paging"; i.e. swapping virtual pages between disc and RAM. But the result is that your machine becomes increasingly sluggish.
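As a concrete illustration, a larger heap is set in eclipse.ini via the lines after -vmargs. The values below are assumptions to tune for your machine, not values taken from the question:

```ini
; eclipse.ini - everything after -vmargs is passed straight to the JVM,
; so these lines must come last in the file
-vmargs
-Xms512m
-Xmx2048m
-XX:MaxPermSize=256m
```

The same flags can be given on the command line as `eclipse -vmargs -Xmx2048m`; MaxPermSize only applies to the Java 7-era JVMs this question is about.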
I would try uninstalling Eclipse (be sure to keep all of your project files backed up), then downloading and installing the latest version. Also, Eclipse is very resource-intensive; if you have a slower machine, it might just not run well on your computer at all.
Just update ADT to the matching SDK version, or check that both versions are compatible. I faced the same problem last week.
When Eclipse is building my workspace, it gets held up at about 19% while updating an unmodified (after project creation) JavaFX application. Why is it doing this, and how can I fix it?
OS: Windows 8
Eclipse Version: 4.7
JDK 9.0.1
JRE 1.8
I've seen other threads about similar problems, and they suggested allocating more memory. I've allocated 1 GB to each and nothing has changed; I can't allocate more without instability.
--EDIT--
After deleting the JavaFX application from my disk, it no longer hangs.
It turns out the answer to my question was as simple as it could get: all I had to do was delete the application and then recreate it. I think that when I first tried to generate it, I interrupted it by ending the task because I thought it had crashed.
This is a bit of a late reply, but it might help people who are still looking for an answer. I had similar problems and spent a few hours on them. This is what worked for me, and I no longer have any problems.
Delete your workspace and create a new one if this is the first time you have started Eclipse, or create a new workspace and import all your projects into it.
Assigning more RAM to Eclipse also further increases speed; instructions can be found here:
How can you speed up Eclipse?
An application I have uses Java agents that need large jar libraries (the biggest one is PDFBox; 11 MB all in all). They ran for 3 years without any issue with the jars in jvm/lib/ext.
During an upgrade to Domino 9.0.1FP6, the administrator forgot to reinstall the jars in jvm/lib/ext, with obvious repercussions. (It is such an annoyance that IBM sometimes just replaces the whole JVM without being gentle to the jars.)
Upon request, I changed the code by including the jars directly into the Java Agents. Things worked well for 2-3 days, and now we're getting OutOfMemory errors.
As far as I understand it, the jars get loaded onto the Java Heap when the agents get started, but the garbage collection is working slower than the continuous loading of the jars into the heap. I couldn't find any precise documentation by IBM on this matter.
We've increased JavaMaxHeapSize in the notes.ini of the servers but that didn't bring the expected results.
I'm dismissing the possibility that I have forgotten a recycle() in my code, because it ran with no memory leaks for three years beforehand.
I have thought about running a separate agent that checks total memory usage and then runs System.gc(), but I'm not convinced, since I have no guarantee that the garbage collector will actually fire.
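For what it's worth, the watchdog idea above can be sketched in plain Java. The class name and the threshold are my assumptions, not code from the original agent, and as noted, System.gc() is only a hint the JVM is free to ignore:

```java
// Hypothetical sketch of a "memory watchdog" agent. It measures how full
// the heap is and, past an illustrative threshold, asks for a collection.
public class MemoryWatchdog {

    // Fraction of the maximum heap currently occupied (0.0 .. 1.0).
    public static double heapUsageRatio() {
        Runtime rt = Runtime.getRuntime();
        long used = rt.totalMemory() - rt.freeMemory();
        return (double) used / rt.maxMemory();
    }

    public static void main(String[] args) {
        double usage = heapUsageRatio();
        System.out.printf("heap usage: %.0f%%%n", usage * 100);
        if (usage > 0.8) {   // illustrative threshold, not a recommendation
            System.gc();     // a hint only; the JVM may ignore it entirely
        }
    }
}
```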
Apart from the obvious move of putting back the jars in jvm/lib/ext, is there an alternative that I haven't considered?
And is there anywhere some sort of documentation about how these classes get loaded into the Heap, and whether there's a possibility that the jars erroneously are not recognized as being garbage-collectible?
It's a memory leak bug - see http://www-01.ibm.com/support/docview.wss?uid=swg1LO49880 for details.
You need to go back to placing the jar files in jvm/lib/ext.
I'm using Guidewire Development Studio (an IntelliJ-based IDE), and it is very slow when handling big text files (~1500 lines and above). I tried with an out-of-the-box community IntelliJ as well, but met the same problem.
When I open those files, it takes a second to type a character, even when I can see clearly that there is still plenty of memory free (1441 MB of 3959 MB used). It also quickly sucks up all the memory if I open multiple files (I allocated 4 GB just for IntelliJ). Code completion and other automatic features are painfully slow as well.
I love IntelliJ, but working in those conditions is just so hard. Is there any way to work around this problem? I have thought of some alternatives, such as:
Edit big files on another editor (eg: Notepad++), then reload it on IntelliJ
Open another small file, copy your bit of code there, edit it, then copy it back. It would help because intellisense and code highlight is maintained, however it is troublesome
I did turn off all unnecessary plugins, only leaving those necessary, but nothing improved much.
I am also wondering if I can "embed" an outside editor in IntelliJ, like Notepad++ or Notepad2. I did my homework and googled around, but found no plugin/configuration that allows that.
Can anyone with experience give me some advice on how to work with big files in IntelliJ (without going mad)?
UPDATE: Through my research I learned that IntelliJ can break on very large files (20 MB or so). But my files aren't that big: they are only about 100 KB to 1 MB, just very long text.
UPDATE 2: After trying to increase the heap memory as Sajidkhan advised (I changed both idea64.vmoptions and idea.vmoptions), I realized that somehow IntelliJ doesn't pick up the change. The heap is stuck at a maximum of 3 GB.
On another note, the slow performance can be perceived when the system uses only around ~1GB of heap memory, so I think the problem doesn't relate to memory issue.
After beating around the bush for a while, I found a workaround, kind of.
When I checked other answers to similar questions, I found that they only began to have trouble when the file size reached at least several MB. That didn't make sense, since I got the trouble when the files were only several KB. After more careful checking, I found that the Gosu plugin is the culprit: after I marked my Gosu file as "text only", the speed became normal.
So I guess the problem has something to do with code highlighter & syntax reminder. For now, the best way I work-around this is:
Right-click the file and mark it as plain text.
Close the file and open it again, then edit.
Note: Since this applies to all the file types in the Guidewire development suite, you may want to permanently mark some long files as plain text, especially the *.properties (i.e. i18n/internationalization) files. The benefit of the code auto-completer just isn't worth the trouble.
Try editing idea64.vmoptions in the bin folder. You could set the max heap and max PermGen to higher values.
Don't forget to restart!
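For reference, a sketch of what higher values might look like in idea64.vmoptions. The numbers are illustrative assumptions, not recommendations from this answer, and MaxPermSize only matters on the Java 7-era runtimes current at the time:

```ini
# idea64.vmoptions - one JVM flag per line
-Xms512m
-Xmx2048m
-XX:MaxPermSize=512m
-XX:ReservedCodeCacheSize=240m
```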
Tested on different PCs. Even on fast processors the editor is painfully slow when working with large files (2000+ lines of code).
Eclipse, Netbeans are absolutely OK. Tuning .vmoptions will not help.
This bug is still not fixed: https://intellij-support.jetbrains.com/hc/en-us/community/posts/206999515-PhpStorm-extremely-slow-on-large-source-files
UPDATE: Try the 32-bit version with default settings. The 32-bit IDEA usually works faster and eats less memory.
I'm making an application in Java using Eclipse Indigo. When I run it using Eclipse the Task Manager shows javaw.exe is using 50mb of memory. When I export the application as a runnable .jar and execute the .jar the Task Manager shows javaw.exe is using 500mb.
Why is this? How could I fix this?
Edit: I'm using Windows 7 64-bit, and my system says I have Java 1.7 installed. Apparently the memory problem is caused by a while loop; I'll study what's inside it.
Edit: Problem found. At one point in the while loop, new BufferedImage instances were created instead of reusing the same BufferedImage.
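A hypothetical reconstruction of that leak (the class name, image size, and drawing are my assumptions, not the original code): allocating a new BufferedImage on every iteration churns through the heap very quickly, while drawing into a single reused instance keeps the footprint flat.

```java
import java.awt.Graphics2D;
import java.awt.image.BufferedImage;

public class FrameRenderer {

    // Leaky pattern: a fresh 1920x1080 ARGB image on every call.
    static BufferedImage renderLeaky(int frame) {
        BufferedImage img = new BufferedImage(1920, 1080, BufferedImage.TYPE_INT_ARGB);
        Graphics2D g = img.createGraphics();
        g.fillRect(0, 0, frame % 1920, 100); // stand-in for the real drawing
        g.dispose();
        return img;
    }

    // Fixed pattern: the caller allocates one image and we draw into it.
    static void renderInto(BufferedImage img, int frame) {
        Graphics2D g = img.createGraphics();
        g.clearRect(0, 0, img.getWidth(), img.getHeight());
        g.fillRect(0, 0, frame % img.getWidth(), 100);
        g.dispose();
    }

    public static void main(String[] args) {
        BufferedImage reused = new BufferedImage(1920, 1080, BufferedImage.TYPE_INT_ARGB);
        for (int frame = 0; frame < 100; frame++) {
            renderInto(reused, frame); // one allocation serves all iterations
        }
        System.out.println("rendered 100 frames into a single image");
    }
}
```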
Without any additional details about your code, I would suggest using a profiler to analyze the problem. I know YourKit and the one that is available for NetBeans are very good.
Once you run your app from the profiler, you should initially look at the objects and listeners created by your application's packages. If the issue is not there, you can expand your search to other packages until you identify things that are growing out of control, and then look at the code that handles those entities.
When you run certain parts of the code multiple times and still see memory utilization after that code stopped running, then you might have a leak and may consider nulling or emptying variables/listeners on exit.
It should be a good starting point, but please report your results back, so we know how it goes. By the way, what operating system are you using and what version of java?
--Luiz
You need to profile your code to get the exact answer, but from my experience, when I see similar things I often attribute them to garbage collection. For example, I ran the same job twice, giving one run 10 GB and the other 2 GB. Both ran and completed, but the 10 GB run used more memory (and finished faster), while the 2 GB run, I believe, garbage-collected, so it still completed but took a bit more time with less memory. I'm a bit new to Java, so I may be wrong in assuming garbage collection, but I have seen what you are talking about.
You need to profile your code (check out jconsole, which is included with Java, or VisualVM).
That sounds most peculiar.
I can think of two possible explanations:
You looked at the wrong javaw.exe instance. Perhaps you looked at the instance that is running Eclipse ... which is likely to be that big, or bigger.
You have (somehow) managed to configure Java to run with a large heap by default. On Linux you could do this with a wrapper script, a shell function or a shell alias. You can do at least the first of those on Windows.
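To see which explanation applies, a quick diagnostic (my suggestion, not part of the original answer) is to print the heap ceiling the JVM actually received, then compare the value from the run inside Eclipse against the run from the exported JAR:

```java
// Prints the maximum heap this JVM was launched with, so two launch
// configurations (IDE vs. exported JAR) can be compared directly.
public class HeapCheck {
    public static void main(String[] args) {
        long maxMb = Runtime.getRuntime().maxMemory() / (1024 * 1024);
        System.out.println("max heap: " + maxMb + " MB");
    }
}
```

If the two runs print different numbers, something (a wrapper, an alias, or the IDE launch configuration) is passing different -Xmx values.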
I don't think it is the JAR file itself. AFAIK, you can't set JVM parameters in a JAR file. It is possible that you've somehow included a different version of something in the JAR file, but that's a bit of a stretch ...
If none of these ideas help, try profiling.
Problem found. At one point in the while loop, new BufferedImage instances were created instead of reusing the same BufferedImage.
Ah yes. BufferedImage uses large amounts of out-of-heap memory and that needs to be managed carefully.
But this doesn't explain why your application used more memory when run from the JAR than when launched from Eclipse ... unless you were telling the application to do different things.
I have had a problem with Eclipse for some time. When I moved to Windows 7 x64 on my notebook, Eclipse started freezing, for example when using Content Assist (the code helper) or any other option in Eclipse. I use quite a bunch of plugins, so I tried deleting them all and checking a clean IDE, but that didn't help. I downloaded a fresh Eclipse Helios for Windows x64; that didn't help either. I even formatted the disk, reinstalled Windows, and installed only the JDK and Eclipse, but it still always occurs. What can I do?
Edit:
Memory: I did not change the memory settings at first and the IDE froze; changing the memory to 512, 1024, or 2048 MB (via VM parameters) made no difference - it keeps freezing.
Anti-virus: I am using ESET Smart Security, but with or without it, Eclipse keeps freezing.
After much frustration, I disabled AVG and it worked fine.
Several leads.
Check whether the freeze is linked to huge CPU or disk usage. Unlikely.
If not then this is probably a network issue. Then disable the firewall for a while and try again. Eclipse now reports your plugin usage at the beginning of a session and it might be busy looking for a connection.
Close all editors from the previous session. In the past, Eclipse tried to access XML DTDs from the network instead of the local catalog, and that would fail if you were offline, of course.
Finally, let me tell you that if this is for running Eclipse, you've selected the worst OS. OSX and Linux are much better options. I used to run Windows as well, but for the last two years I've only run it inside VirtualBox when I couldn't avoid it (TOAD, Macromedia Fireworks), and I wish I had migrated earlier.
The crucial point is how much memory you have for Eclipse and if you have any anti-virus software installed that needs to preparse all the class files Eclipse wants to look in.
Does it settle after some usage?