Comparing a Java app with a C app in kilobytes

I have made two rather small applications, one in Java and one in C. The application is a financial calculator (Black-Scholes) where one can calculate the call and put price given the parameters asset price, strike price, volatility, time and interest rate.
I somehow expected the apps to be approximately the same size in kilobytes, but I was surprised to discover that the C file is much heavier:
C program (exe file): 450 KB
Java program (jar file): 11.7 KB
That is, while the C program is almost 0.5 MB, the jar file is as little as 11.7 KB.
How could this be explained?
(the first image shows the C app and the second the Java app)

You're trying to compare apples and oranges. Java is compiled to bytecode and uses libraries that are already available within the JVM. C compiles into machine code and statically links its libraries, rather than relying on dynamic shared libraries made available by the OS.
So this depends on many factors and on the specific project.
The size of a C app also depends on the compiler used and its settings. E.g. gcc will affect size differently than MS Visual Studio, and a debug-mode build will be far larger than a release build. Optimizations and obfuscators also play a role.
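To make this concrete, here are a couple of gcc settings that typically shrink a C executable considerably (illustrative only; the original poster's toolchain and flags are unknown, and blackscholes.c is a placeholder name):

gcc -Os -s blackscholes.c -o blackscholes    (optimize for size and strip symbols at link time)
strip blackscholes                           (remove any remaining symbol information)

A debug build (gcc -g) of the same source, or one that statically links the C runtime, can easily come out several times larger.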

Without knowing all the details, I would argue that the C program is self-contained, i.e. you can run it on virtually any Windows machine.
To run Java applications you need the JVM installed, which already brings a lot of functionality with it (like the String type or the Math library), so these do not need to be included in your deployed jar file.
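As an illustration of how little has to go into the jar: a Black-Scholes calculator needs nothing beyond java.lang.Math, which ships with every JRE. The sketch below is not the original poster's code, just a minimal version of the same calculation (using the Abramowitz-Stegun approximation for the normal CDF):

public class BlackScholes {
    // standard normal CDF via the Abramowitz-Stegun polynomial approximation
    static double cdf(double x) {
        if (x < 0) return 1 - cdf(-x);
        double t = 1 / (1 + 0.2316419 * x);
        double poly = t * (0.319381530 + t * (-0.356563782
                    + t * (1.781477937 + t * (-1.821255978 + t * 1.330274429))));
        return 1 - Math.exp(-x * x / 2) / Math.sqrt(2 * Math.PI) * poly;
    }

    // call price for spot s, strike k, volatility v, time t (in years), rate r
    static double call(double s, double k, double v, double t, double r) {
        double d1 = (Math.log(s / k) + (r + v * v / 2) * t) / (v * Math.sqrt(t));
        double d2 = d1 - v * Math.sqrt(t);
        return s * cdf(d1) - k * Math.exp(-r * t) * cdf(d2);
    }

    // put price via put-call parity
    static double put(double s, double k, double v, double t, double r) {
        return call(s, k, v, t, r) - s + k * Math.exp(-r * t);
    }

    public static void main(String[] args) {
        System.out.println(call(100, 100, 0.2, 1, 0.05)); // ~10.45
    }
}

Compiled, that is a single small .class file in the jar; everything else lives in the JRE, which is exactly why the jar stays at a few KB.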

The C file has a lot of runtime and other overhead compiled in, while Java is essentially interpreted. In short, C is standalone and Java needs the runtime.


Isabelle: performance issues with version Isabelle2013-2

I have performance issues in Isabelle (i.e., the recent version Isabelle2013-2).
I use Isabelle/jEdit, based on the new interface.
I had some trouble with performance before, but now it is worse, as I sometimes have to wait up to 10 seconds before I can continue typing. The performance issues get worse over time, to the point where I have to restart Isabelle after an hour or so.
My suspicion is that I can configure Isabelle better or apply some tricks that improve the performance.
Hardware:
a recent CPU, an Intel i7 quad-core (mobile laptop chip), 16 GB RAM, a fast SSD hard disk.
Software:
64-bit Arch Linux (kernel 3.12.5-1-ARCH)
no 32-bit compatibility libraries
my Java version is:
java version "1.7.0_45"
OpenJDK Runtime Environment (IcedTea 2.4.3) (ArchLinux build 7.u45_2.4.3-1-x86_64)
My theory file is 125 KB in size; the whole theory I am working on is in one file, and at the moment I would really like to keep it in just one file.
Symptoms:
Isabelle displays only about 900 MB in the lower right corner of the UI. I have 16 GB RAM; should I configure Java to use more? Sometimes a single process consumes 600% of the CPU, i.e., 6 of the cores that the Linux kernel sees.
Tricks I use:
One trick is that I insert *) on a line below the code I am working on. This leads to a syntax error, and the code below it is not checked. The second trick is that I went to the timing panel, commented out all proofs that took longer than 0.2 seconds, and replaced them with sorry.
The recent two Isabelle versions have been really great improvements!
Any suggestions or tricks for how I can improve the performance of Isabelle?
A few general hints on performance tuning:
One needs to distinguish between Isabelle/ML (i.e. the underlying Poly/ML runtime) and Isabelle/Scala (i.e. the underlying JVM).
Isabelle/ML: Intel CPUs like the i7 have hyperthreading, which virtually doubles the number of cores. On smaller mobile machines it is usually better to restrict the nominal number of cores to half of that. See the "threads" option in Isabelle/jEdit / Plugin Options / Isabelle / General. When running on battery you might even go lower.
Isabelle/ML: Using x86 (32-bit) Poly/ML generally improves performance. This is only relevant on Linux, because that platform often lacks the x86 libraries that other platforms provide routinely. There is rarely any benefit in falling back on bulky x86_64; Poly/ML 5.5.x is very good at working within the constant space of 32-bit mode.
Isabelle/Scala: JVM performance can be improved by using native x86_64 (which is the default) and providing generous stack and heap parameters.
The main Isabelle application bundle bootstraps the JVM with some options that are hard-wired in a certain place, which can be edited nonetheless:
Linux: Isabelle2013-2/Isabelle2013-2.run
Windows: Isabelle2013-2/Isabelle2013-2.ini
Mac OS X: Isabelle2013-2.app/Contents/Info.plist
For example, the maximum heap size can be changed from -Xmx1024m to -Xmx4096m.
The isabelle jedit command-line tool is configured via the Isabelle settings environment. See also $ISABELLE_HOME/src/Tools/etc/settings for some examples of JEDIT_JAVA_OPTIONS, which can be copied to $ISABELLE_HOME_USER/etc/settings and adapted accordingly. It is also possible to monitor JVM performance via jconsole to get an idea if that is actually a source of problems.
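For example, a sketch of such an entry in $ISABELLE_HOME_USER/etc/settings (the exact values are machine-dependent guesses, not recommendations from the Isabelle documentation):

JEDIT_JAVA_OPTIONS="-Xms512m -Xmx4096m -Xss8m"

This raises the JVM's initial heap, maximum heap, and thread stack size, respectively.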
Isabelle/Scala: Isabelle bundles a certain JVM, which is assumed here by default. Eliminating the Java version as a variable is important to regain some sanity; otherwise you never know what you get. Are you sure that your OpenJDK is actually used here? It is unlikely, unless you have edited some Isabelle settings.
A further source of performance problems on Linux is graphics. Java/AWT is known to be much slower on X11 than on Windows and Mac OS X. Using the quasi-native GTK look-and-feel on Linux degrades graphics performance even further.

Appengine uploads are limited to 10000 files

Attempts to deploy my application to App Engine have failed because of the hard limit on uploaded files, i.e. 10,000.
My application uses external libraries and constants in 2 other languages.
GWT.runAsync blocks have been placed at the necessary positions in the project.
The following compile-time options are used:
-localWorkers 3 -XfragmentCount 10
But the problem is that when I upload the project to App Engine I get the following exception:
java.io.IOException: Applications are limited to 10000 files, you have 34731
I am aware that I can cut down the file count by reducing cross-browser compatibility or by reducing locales, but that is not a practical approach when deploying.
So please suggest some alternatives.
Another thing I wish to mention is that the project extensively uses VerticalPanel/HorizontalPanel/FlexTable/DialogBox in most of its screens. I am not sure if this has something to do with the problem.
I'm afraid this will happen to me too. I had that problem in the middle of the project, so I limited browsers to Chrome and FF, but when I really have to deploy, this could be an issue.
An application is limited to 10,000 uploaded files per version. Each file is limited to a maximum size of 32 megabytes. Additionally, if the total size of all files for all versions exceeds the initial free 1 gigabyte, then there will be a $0.13 per GB per month charge.
https://developers.google.com/appengine/docs/quotas#Deployments
The solution could be to deploy each language as a separate application, if your data is not related between the languages.
Sounds like you might also be deploying all of your GWT classes along with your application.
When I was a heavy App Engine user, I made sure to jar up all my uploaded classes (and not include any non-shared GWT code). You might want to run find . -name "*.class" | wc -l to count how many class files you are sending.
Jarring up your classes beforehand turns 15000 class files into 1 jar file.
It just sucks to make huge jars, since you'll need to redeploy the whole jar on every change. Better to have lots of small jars. ;)
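For reference, packing a compiled classes directory into a jar can be done with the standard jar tool (the paths below are hypothetical; adjust them to your war layout):

jar cf war/WEB-INF/lib/myclasses.jar -C war/WEB-INF/classes .

After that, the tree of individual .class files under war/WEB-INF/classes can be dropped from the upload, so thousands of files collapse into one.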
What I did was to put all the GWT-generated files into a ZIP and serve them with a servlet.
To optimize things a bit, I put every file into memcache after unzipping it.
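A rough sketch of that approach (my own illustration, not the answerer's code; the servlet mapping, the zip location WEB-INF/gwt-output.zip and the class name are assumptions), using java.util.zip and the App Engine memcache API:

import java.io.IOException;
import java.io.InputStream;
import java.util.zip.ZipEntry;
import java.util.zip.ZipFile;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import com.google.appengine.api.memcache.MemcacheService;
import com.google.appengine.api.memcache.MemcacheServiceFactory;

// Serves files out of a bundled ZIP, caching unzipped entries in memcache.
// Assumes a mapping like /gwt/* so getPathInfo() yields the file path.
public class ZipResourceServlet extends HttpServlet {
    private final MemcacheService cache = MemcacheServiceFactory.getMemcacheService();

    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp) throws IOException {
        String path = req.getPathInfo().substring(1);   // e.g. "myapp/myapp.nocache.js"
        byte[] data = (byte[]) cache.get(path);         // try memcache first
        if (data == null) {
            data = readFromZip(path);                   // fall back to the ZIP on disk
            if (data == null) { resp.sendError(HttpServletResponse.SC_NOT_FOUND); return; }
            cache.put(path, data);                      // note: memcache values are capped (~1 MB)
        }
        resp.setContentType(getServletContext().getMimeType(path));
        resp.getOutputStream().write(data);
    }

    private byte[] readFromZip(String path) throws IOException {
        ZipFile zip = new ZipFile(getServletContext().getRealPath("/WEB-INF/gwt-output.zip"));
        try {
            ZipEntry entry = zip.getEntry(path);
            if (entry == null) return null;
            InputStream in = zip.getInputStream(entry);
            byte[] buf = new byte[(int) entry.getSize()];
            int off = 0, n;
            while (off < buf.length && (n = in.read(buf, off, buf.length - off)) > 0) off += n;
            return buf;
        } finally {
            zip.close();
        }
    }
}

This keeps each deployed version down to one ZIP plus a few jars instead of tens of thousands of files.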

High Java memory usage even for small programs

I have a couple of simple applications written in Java, one of them written to act as a widget. What surprised me is how much RAM even small applications use.
I wrote the following to see if it is a bug in my programs, or a general Java issue:
public class ram {
    public static void main(String[] args) {
        // infinite loop to give me time to check RAM usage
        while (true) System.out.print("Hello World");
    }
}
Then I compiled and ran it with java ram, and it gave me the following RAM usage:
The process java (with pid 4489) is using approximately 43.3 MB of memory.
34460 KB [heap]
7088 KB /usr/lib/jvm/java-7-openjdk/jre/lib/amd64/server/libjvm.so
1712 KB /usr/lib/jvm/java-7-openjdk/jre/lib/rt.jar
136 KB [stack:4495]
120 KB /usr/lib/jvm/java-7-openjdk/jre/lib/amd64/libjava.so
Isn't this too high? Especially a heap of 34 MB. My system is Arch Linux x86_64 with OpenJDK 7.
Is there any way to minimise the amount of RAM used by the JVM?
Edit: I tried using the -Xmx flag, and this is what I got (1281k was the smallest value it would let me start with):
java -Xmx1281k ram
The process java (with pid 4987) is using approximately 27.6 MB of memory.
18388 KB [heap]
For comparison, Python 2 uses 4.4 MB and Mono uses 4.3 MB.
I see similar questions to this asked frequently and I have yet to see a satisfactory answer.
The answers I see most frequently are angry "why do you care?" answers. Some people put a lot of thought and effort into making the question and whoever asks it seem stupid.
Some people will answer this question with a long diatribe about the different kinds of memory Java allocates, why it does so, and what command line parameters you should look into. (rarely does anyone get very specific)
The fact is that for small utilities Java is a memory hog on most operating systems. The smaller the utility, the more obvious it really is. Java developers have long been conditioned to deal with or ignore this fact.
The reasons for the seemingly abnormal memory usage are plentiful. Each thread gets a certain amount of memory for its stack. There are several threads that get started regardless of how simple the program is, for things like garbage collection, RMI, etc. On 64-bit Windows that's 1 MB per thread. A bunch of classes get loaded by default, in addition to all of your own classes. I'm sure a lot of other things are happening behind the scenes as well.
Java has made tradeoff choices that other languages haven't. The load time is slower than most other languages. The initial memory requirement is higher. Strings as they are most commonly used eat up a lot more memory than most people realize. There are countless others. The benefit for many situations really does pay off. Something like Hello World shows off the cost of those choices more than anything else. Benefits like easy multi-threading and near-native performance really don't do you any good with Hello World.
Unfortunately, there's not a lot that you can do to really minimize the memory used by a simple application. There are the aforementioned command line switches and trial and error could shrink your reported usage down to the ~10-15mb level I'm sure, but those choices and time you spend figuring them out aren't going to be applicable to a wide variety of future applications. You'll have to go through the same process again for the next application. Throw in a GUI interface and some logging and another common library or two and your base number will shoot up to 60mb pretty darn quick.
You aren't doing anything wrong. It's just the way things work in Java. You'll get used to it. You'll choose another language on occasion because of it, and that's OK too.
There are several reasons for this:
The Java runtime is a rather complex program in itself. It needs to take the bytecode of your Java program (the output from javac) and translate it into machine code for the system it is running on; there is an optimizer for that, there are interfaces for debugging, etc.
Although your program is rather small, Java will still load many classes from its standard library. You can observe this by starting it with 'java -verbose:class ram'.
Java allocates a large amount of memory for your program in advance, since it cannot know how much memory it will actually need. This is governed, among others, by the -Xmx option. There are several types of such memory. To find out more about them, you can use the JConsole tool, which is included in the JDK's bin folder. You can read more about it at java.sun.com. A summary of the memory areas used by Java is also given in this Stack Overflow question.
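If you want a quick look at those numbers without JConsole, a tiny sketch (my own illustration, not from the answer) can print them from inside the JVM:

public class HeapStats {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        long used = rt.totalMemory() - rt.freeMemory();                       // heap actually in use
        System.out.println("max heap:  " + rt.maxMemory() / 1024 + " KB");    // the -Xmx limit
        System.out.println("committed: " + rt.totalMemory() / 1024 + " KB");  // reserved from the OS
        System.out.println("used:      " + used / 1024 + " KB");
    }
}

Note that the [heap] reported by OS tools also includes the JVM's own internal allocations, which is part of why it is larger than what the JVM reports here.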

Why are some java libraries compiled without debugging information

I've noticed recently that there are a few Java libs (the JDK, Joda-Time, iText) that are compiled without some or all of the debugging information. Either the local variable info is missing, or both the local variable info and the line numbers are missing.
Is there any reason for this? I realise it makes the compiled code larger, but I don't believe that's a particularly large consideration. Or are they just building with the default compile options?
Thanks.
The default compile options don't include full debugging information; javac emits line numbers and the source file name by default, but you must specifically tell the compiler to include the local variable info (see the example after the list below). There are several reasons why most people omit it:
Some libraries are used in embedded systems (like mobile phones). Until recently, every bit counted. Today, most mobiles come with more memory than all computers in 1985 had together ;)
When compiled with debugging active, the code runs 5% slower. Not much but again, in some cases every cycle counts.
Today's senior developers were born in a time when 64 KB of RAM was enormous. Yesterday I added another 2 TB drive to my server in the cellar. That's 7 orders of magnitude in 25 years. Humans need more time to adjust.
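For reference (Foo.java is a placeholder; the flag behaviour is as documented for javac):

javac Foo.java            (default: line numbers and source file name, no local variables)
javac -g Foo.java         (all debug info, including local variable tables)
javac -g:none Foo.java    (no debug info at all, giving the smallest class files)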
[EDIT] As John pointed out, Java bytecode isn't optimized (much) anymore today. So the output in the class files will be the same in both cases (only the class file with debug information will be bigger). The code is optimized by the JIT at runtime, which allows the runtime to tailor the code to the CPU, the memory (amount and layout), etc.
The mentioned 5% penalty applies when you run the code with the command-line options that allow a remote debugger to attach to the process. If you don't enable remote debugging, there is no penalty (except for class loading, but that happens only once).
Probably the size of the installation. Debug information adds overhead to the jar files, which Sun probably didn't like.
I had to investigate a Java Web Start issue recently, with no debug information available, so adding full tracing to the Java console and downloading the source code helped some; but the code is rather tangled, so I'd just like a debug build.
The JDK should be compiled with FULL debug information everywhere!

What is a good IDE for Java programming on a low end laptop? [closed]

I have to work away from my desktop computer from time to time (for instance on trips), on a low-end laptop. I can use Eclipse, but it is awfully slow.
Is there a better choice? If possible, not something like vi or emacs.
Laptop:
512 MB DDR RAM
Intel Pentium M 760 2.0 GHz
Windows XP SP3
There is no possibility of adding more RAM
How low-end is it? I used to use IntelliJ IDEA and loved it; it also ran faster than Eclipse for me. DrJava is also very small and lightweight. But personally I like vim + javac best. :)
NetBeans is a little less sluggish than Eclipse, but it's a huge memory hog.
Emacs is always a fine choice too.
I actually don't consider that a "low end" machine.
I've used Eclipse and NetBeans on a P3 1.2 GHz, 512 MB RAM laptop, and they both run. They are a bit sluggish, but usable. Between the two I'd say NetBeans was less sluggish, probably because there aren't as many UI elements and frames all over.
My primary home laptop is a Toshiba with 512 MB and a Pentium M 2 GHz, and Eclipse runs fine on it (so does Visual Studio 2008).
It seems that with these big IDEs, RAM matters more than CPU for speed.
Edit: it may be worth noting that my P3 1.2 GHz laptop is running Ubuntu and my Pentium M 2 GHz is running Win XP.
Eclipse is noticeably faster in Linux. I once tested large project build times in:
WinXP running Eclipse
-vs-
WinXP running VMware Workstation running Ubuntu running Eclipse
Surprisingly, Ubuntu in VMware was consistently much faster, about 30 seconds faster on what was a 7-minute build process on Windows.
You could try jEdit. While it is not a true IDE, it does support a ton of Java-centric functions like source formatting, syntax highlighting and a Java debugger, plus a bunch of other functions, all of which can be added or removed via a plugin system. I've used it in the past when I wanted something with more power than Notepad but less bulk than Eclipse.
It's all open source and free, and portable to most systems since it is written in Java.
A nice lightweight editor is Notepad++. Based on the powerful editing component Scintilla, Notepad++ is written in C++ and uses the pure Win32 API and STL, which ensures a higher execution speed and smaller program size. By optimizing as many routines as possible without losing user friendliness, Notepad++ is trying to reduce the world's carbon dioxide emissions: when using less CPU power, the PC can throttle down and reduce power consumption, resulting in a greener environment.
I guess it is JCreator Pro. The free version, JCreator Lite, is OK but has limited capabilities.
You might have a look at BlueJ
The older versions of IntelliJ IDEA, like 3, 4 and 5, can run easily in that amount of memory, provided you don't have a huge project and are willing to miss out on some features of the newer versions.
I haven't tried it yet, but I recently stumbled upon JCoder, a Java IDE written in C++. The stated minimum memory requirement is 512 MB.
Also, you could consider running an older version of Eclipse, and/or trying to tune Eclipse to run better on your hardware. A Google search for "Eclipse performance tuning" turns up a bunch of pages with suggestions that may be applicable.
A text editor plus the Java command-line tools are your best tools if you are on a low-end computer and don't need debugging and such.
It really depends on your project more than on the actual piece of hardware, so you need to weigh the pros and cons.
Good luck.
I was always partial to JCreator back in the day.
You can use NetBeans with only the modules you're using (same thing with Eclipse), or Geany (using Linux?), which is not an IDE but a really nice text editor with IDE functionality.
Another option is using older versions of NetBeans/Eclipse, which are way more efficient.
Get more memory if you can.
SciTE, JUnit, Ant and jvisualvm used to run fine on my notebook, which had 768 MB, and on the 2 GB/1 GHz netbook I now use. On the rare occasions you must use a debugger, there's always jdb. The problems I've had with IDEs on notebooks have more to do with screen estate than with performance. OTOH, I gave up on NetBeans, as its text editor was too slow on a 'standard built business desktop' machine the last time I was contracting.
gvim + ctags + ant
You will run out of memory if using almost ANY modern app server anyway.
I hope you're not.
I have been using E Text Editor, a port of TextMate, and am loving it. It comes with built-in syntax highlighting and snippets, can download TextMate bundles, and is fully customizable/extensible in Ruby.
I occasionally use TextPad for simple Java programs. It's very lightweight, free (well, nagware, but inexpensive to buy) and has a simple to use compile and run option. Also syntax highlighting, though I've never used it.
The important question is which features you think should be in a good IDE.
Code completion? JavaDoc on mouse-over? "Go to definition"? A built-in debugger? Syntax highlighting? Incremental compilation?
A good place to start would be to get the code to build with Ant, as that moves the build out of the IDE, which then hopefully needs less space to run.
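A minimal build.xml for that (the project, directory and jar names are hypothetical; adjust them to your layout):

<project name="myapp" default="compile">
    <!-- compile everything under src/ into build/classes -->
    <target name="compile">
        <mkdir dir="build/classes"/>
        <javac srcdir="src" destdir="build/classes"/>
    </target>
    <!-- package the compiled classes into a jar -->
    <target name="jar" depends="compile">
        <jar destfile="build/myapp.jar" basedir="build/classes"/>
    </target>
</project>

Running ant jar from the project directory then builds everything with no IDE involved.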
I believe the requirements of older versions of JBuilder were quite low. You might want to buy a used copy for this purpose.
Is there a chance of upgrading the laptop's memory? The CPU doesn't matter much, but IDEs are nearly always huge memory hogs (even Emacs was considered one in its time).
I'd say that you can run Eclipse quite well in 1 GB (maybe even 512 MB) using Windows XP, if you don't do huge projects and don't run any other massive apps at the same time.
As long as I already have a project set up, I use vim/gvim for most maintenance development or fooling around.
First of all, memory is the problem.
Linux performs fairly well with low memory, but a PC isn't great and a Mac is abysmal! (If you have 512 MB and less than 4 GB of hard disk free, it will barely work at all. This is because the Mac allocates its swap from the free space on your hard drive.)
Macs are easy to upgrade, though. I got 4 GB for my laptop at Fry's for less than $100, and the slots are inside the battery compartment. After the upgrade, my bottom-of-the-line Mac has never once given me reason to be concerned about its performance.
PCs are more difficult to upgrade than Macs, but it varies by model.
Okay, so let's say you don't want to upgrade.
The most important thing to do then is to make sure you have a local copy of the Javadocs. You'll miss them VERY QUICKLY if you don't have Eclipse/NetBeans.
After that, who cares what editor you use. Personally I'd use the built-in editor because I'm not actually that impressed with coloring and auto-formatting.
If you need syntax coloring, I guess vim would be the most lightweight editor with a Java mode (at least I believe it has one). jEdit is fairly lightweight, and so is Emacs, and I know they both have Java modes.
For builds, just use Ant or maybe Maven; building in the IDE is nice but overrated.
The biggest thing, as I said, is to always have the Javadocs in a browser bookmark.
