Possible Duplicate:
Why isn't more Java software compiled natively?
I know that Java is compiled to bytecode, but when using the JIT, it will compile the 'hotspots' to native code. Why is there no option to compile a program straight to native code?
There is an option to compile Java source code files to native binaries: it is called GCJ and comes from the Free Software Foundation.
It is not officially supported by Sun/Oracle, but I've used the compiler and it does a great job ;)
Java, as a language, can be implemented, like any language, in many ways. This includes full interpretation, bytecode compilation, and native compilation. See the Java Language Specification.
The VM specification defines how bytecode should be loaded and executed. It defines the compiled class format and the execution semantics, e.g. threading issues and what is called the "memory model". See the Java VM Specification.
While originally meant to go together, the language and the VM are distinct specifications. You can compile another language to Java bytecode and run it on top of the JVM. And you can implement the Java language in different ways, notably without a VM, as long as you follow the expected semantics.
Of course, both are still related, and certain aspects of Java might be hard to support without bytecode interpretation at all, e.g. custom ClassLoaders that return bytecode (see ClassLoader.defineClass). I guess the bytecode would need to be JIT'ed immediately into native code, or maybe such features would not be supported at all.
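For instance, here is a minimal sketch (the class and file names are invented for illustration) of a custom class loader handing raw bytecode to ClassLoader.defineClass; a purely ahead-of-time compiled runtime would have nothing to do with those bytes unless it could compile them on the fly:

    import java.nio.file.Files;
    import java.nio.file.Paths;

    // Hypothetical loader: turns a byte[] of bytecode obtained at runtime into a Class.
    class BytesClassLoader extends ClassLoader {
        Class<?> defineFromBytes(String name, byte[] bytecode) {
            // ClassLoader.defineClass is the protected hook mentioned above
            return defineClass(name, bytecode, 0, bytecode.length);
        }
    }

    public class DefineClassDemo {
        public static void main(String[] args) throws Exception {
            // Assume Greeter.class was produced at runtime (e.g. by a bytecode generator)
            byte[] bytecode = Files.readAllBytes(Paths.get("Greeter.class"));
            Class<?> cls = new BytesClassLoader().defineFromBytes("Greeter", bytecode);
            System.out.println("Loaded at runtime: " + cls.getName());
        }
    }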
Which platform native code should it compile to?
Windows, Mac, Linux?
What if the developer works on a different platform than the application is going to run on?
What if the application platform changes, either in the server room or on the desktop?
I don't see the benefit; JVMs nowadays seem to be fast enough for very general-purpose needs.
There are several products out there to compile java programs to native code, however they are imperfect, and not at all like the JIT compiler. Some differences include:
Write Once, Run Anywhere is lost - the binary will only work on the target you compile it for.
Dynamic code - you cannot load jars or other Java code at runtime, which is often a feature of application servers, GUI builders and the like.
Runtime profiling - a lot of JIT compiler action involves understanding what the code is doing at runtime, not what it could potentially do under a static analysis, meaning that JIT can outperform a natively compiled application in the right circumstances.
Cannot support all Java features. Things like reflection and runtime class loading are hard to make work in a statically compiled program (see the sketch after this answer).
Large footprint - when it is compiled to native code, all of the libraries the JVM gives you have to be bundled into the package, causing a very large footprint. It is a tricky problem to figure out what can be left out.
So it is possible, for a certain subset of applications, to compile to native code, but as VMs have gotten faster and faster, and the large-footprint issue above has not really been improved (although Project Jigsaw should help with that), it is not a very compelling option for real-world applications.
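To make the reflection and dynamic-code points above concrete, here is a small sketch (the fallback class name is arbitrary) in which the class to load is only known at runtime, so a purely static native compiler cannot know up front which code has to be included:

    import java.lang.reflect.Method;

    public class ReflectiveCall {
        public static void main(String[] args) throws Exception {
            // The class name arrives at runtime, e.g. from the command line or a config file
            String className = args.length > 0 ? args[0] : "java.util.ArrayList";
            Class<?> cls = Class.forName(className);            // resolved only now
            Object instance = cls.getDeclaredConstructor().newInstance();
            Method toString = cls.getMethod("toString");
            System.out.println(toString.invoke(instance));      // prints "[]" for ArrayList
        }
    }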
Because it is enough to have byte-code compiled.
If you compiled your own code to native code, you would also have to compile all the libraries.
And that is a real problem from two points of view:
1. licensing - most of that code is not yours to change
2. you would have to 'recompile' megatons of code :-)
This was a decision made by Sun not to allow this, because they wanted to position Java as inherently multi-platform. As such, they wanted to ensure that any compiled Java application would run on any platform with a JVM. This prevents Java binaries from turning up online that don't run on certain hardware or operating systems.
Related
I had this question in software course:
True/False: The Java interpreter converts files from a byte-code format to executable files.
I think the statement is false. In class, they said the interpreter "executes" the byte-code files on the system using the JVM (I didn't listen too closely, but I think I got it fairly right), but as I understood it, it doesn't actually convert them to executable files (which presumably means .exe files); it just runs them on the system directly.
"True/False: The Java interpreter converts files from a byte-code format to executable files".
The answer is false¹.
The Java interpreter is one of the two components of the JVM that is responsible for executing Java code. It does it by "emulating" the execution of the Java Virtual Machine instructions (bytecodes); i.e. by pretending to be a "real" instance of the virtual machine.
The other JVM component that is involved is the Just In Time (JIT) compiler. This identifies Java methods that have been interpreted for a significant amount of time, and does an on-the-fly compilation to native code. This native code is then executed instead of interpreting the bytecodes.
But the JIT compiler does not write the compiled native code to the file system. Instead it writes it directly into a memory segment ready to be executed.
Java's interpret / JIT-compile approach is more complicated, but it has a couple of advantages:
It means that it is not necessary to compile bytecodes to native code before the application can be run, which removes a significant impediment to portability.
It allows the JVM to gather runtime statistics on how the application is functioning, which can give hints as to the best way to optimize the native code. The result is faster execution for long-running applications.
The downside is that JIT compilation is one of the factors that tends to make Java applications slow to start (compared with C / C++ for example).
1 - ... for mainstream Java (tm) implementations. Android isn't Java (tm)². Note that the first version of Java was interpreter only. I have also seen Java (not tm) implementations where the native code compilers were either ahead-of-time or eager ... or a combination of both.
2 - You are only permitted by Oracle to describe your "java-like" implementation as Java(tm) if it passes the Java compliance tests. Android wouldn't.
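If you want to observe the behaviour described above, a simple sketch (the class name is invented for the example) is to run a small program with HotSpot's -XX:+PrintCompilation flag, which logs each method as it is JIT-compiled; note that no files are written, the native code lives only in memory:

    // Run as: java -XX:+PrintCompilation Hot
    // Each printed line is a method the JIT has just compiled to native code in memory.
    public class Hot {
        static long square(long n) {
            return n * n;
        }

        public static void main(String[] args) {
            long sum = 0;
            for (long i = 0; i < 1_000_000; i++) {
                sum += square(i);   // repeated calls make square() hot enough to be JIT-compiled
            }
            System.out.println(sum);
        }
    }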
The Java compiler converts the source code to bytecode. This bytecode is then interpreted (or just-in-time compiled and then executed) by the JVM. This bytecode is a kind of intermediate language that has no platform dependence. The virtual machine is then the layer that provides system-specific functionality.
It is also possible to compile Java code to native code; one project aiming at this is GCJ, for example.
To answer your question: no, a normal Java compiler does not emit an executable binary, but a set of classes that can be executed using a JVM. You can read more about this on Wikipedia.
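As a concrete illustration, compiling the file below with javac HelloWorld.java produces HelloWorld.class (platform-independent bytecode, not a native executable); java HelloWorld then asks the JVM to execute that bytecode, and javap -c HelloWorld disassembles it so you can look at the bytecode instructions yourself:

    // HelloWorld.java
    // javac HelloWorld.java  -> HelloWorld.class (bytecode)
    // java HelloWorld        -> the JVM loads and executes the bytecode
    // javap -c HelloWorld    -> shows the bytecode instructions
    public class HelloWorld {
        public static void main(String[] args) {
            System.out.println("Hello, bytecode!");
        }
    }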
False for regular JVMs. No executable files are created. The conversion from bytecode to native code for that platform takes place on the fly during execution. If the program is stopped, the compiled code is gone (it existed in memory only).
The new Android runtime ART does compile the bytecode into executable form ahead of time, to get better startup and runtime behavior. So ART does create files.
ART straddles an interesting mid-ground between compiled and interpreted code, called ahead-of-time (AOT) compilation. Currently with Android apps, they are interpreted at runtime (using the JIT), every time you open them up. This is slow. (iOS apps, by comparison, are compiled native code, which is much faster.) With ART enabled, each Android app is compiled to native code when you install it. Then, when it’s time to run the app, it performs with all the alacrity of a native app. http://www.extremetech.com/computing/170677-android-art-google-finally-moves-to-replace-dalvik-to-boost-performance-and-battery-life
The answer is false
reason:
The JIT (just-in-time) compiler and the Java interpreter do the same job in different ways, but in terms of performance the JIT wins. The main task of both is to turn the given bytecode into machine-dependent instructions: the interpreter does this instruction by instruction while the program runs, whereas the JIT compiles whole hot methods down to native machine code that the CPU then executes directly.
I'm just starting to learn Java (as my second language, after Python), but I can't understand the very first point about it. From my understanding, it says:
"Making a separate compiler (e.g. for C/C++) for each kind of CPU is too much of a hassle. Java, on the other hand, works universally once a JVM is installed, because its intermediate code is interpreted by the JVM rather than being compiled into CPU-specific native code."
...but don't you need to implement a JVM for each kind of CPU? Is this really an advantage of Java over C/C++?
I think there's a duplicate about this on SO or elsewhere on the internet, but sorry, I couldn't think of any good search terms.
A JVM for each kind of CPU is implemented by folks from Oracle and other JVM vendors; in the case of C/C++, you have to compile YOUR application code for each CPU/OS.
Yes, someone has to make a JVM to be able to run on different platforms, but that someone isn't you.
If you go to the download link for Java https://java.com/en/download/manual.jsp you can see there are various JVM builds for Windows, Mac, Linux and Solaris etc
As a programmer, you just have to write your own code and compile it into .class files. Then it's someone else's problem to provide a JVM to run those class files on a specific machine.
Yes, someone needs to implement a JVM for each kind of CPU. And if that JVM is going to be used, it will also contain a JIT compiler, so there is not much benefit compared to just writing a compiler.
But it might be talking about using the JVM as a target for your own compiled language.
Imagine this: you want to make your own language. Let's call it MyLanguage. Normally you would have to write a compiler for each CPU, plus lots of support code for each operating system you want to support.
But if you just write one compiler which compiles MyLanguage to Java bytecode, then the user can run the Java bytecode on a JVM.
Your language can then be used on any processor/operating system currently supported by a JVM. And you only had to write one compiler.
This is for example what the developers of Scala did.
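As a rough sketch of that idea (assuming the ASM bytecode library, org.objectweb.asm, is on the classpath; all names here are made up for the example), the snippet below emits a Hello.class file directly as bytecode without any Java source, and the generated class then runs on any JVM with java Hello:

    import java.nio.file.Files;
    import java.nio.file.Paths;
    import org.objectweb.asm.ClassWriter;
    import org.objectweb.asm.MethodVisitor;
    import static org.objectweb.asm.Opcodes.*;

    // Generates Hello.class: a class whose static main prints a message.
    public class TinyCompiler {
        public static void main(String[] args) throws Exception {
            ClassWriter cw = new ClassWriter(ClassWriter.COMPUTE_FRAMES);
            cw.visit(V1_8, ACC_PUBLIC, "Hello", null, "java/lang/Object", null);

            MethodVisitor mv = cw.visitMethod(ACC_PUBLIC | ACC_STATIC, "main",
                    "([Ljava/lang/String;)V", null, null);
            mv.visitCode();
            mv.visitFieldInsn(GETSTATIC, "java/lang/System", "out", "Ljava/io/PrintStream;");
            mv.visitLdcInsn("Hello from generated bytecode");
            mv.visitMethodInsn(INVOKEVIRTUAL, "java/io/PrintStream", "println",
                    "(Ljava/lang/String;)V", false);
            mv.visitInsn(RETURN);
            mv.visitMaxs(0, 0);   // actual sizes are recomputed due to COMPUTE_FRAMES
            mv.visitEnd();
            cw.visitEnd();

            Files.write(Paths.get("Hello.class"), cw.toByteArray());
        }
    }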
I know Java generates bytecode, but the JVM needs to interpret it every time at runtime.
Does a compiler exist that generates machine-independent code, let's say like for C?
Then at a target machine this would be permanently converted to its local machine code once, rather than being converted on each run?
Does this explain why many developers develop for Windows but not Linux?
Not really, but some stuff comes close.
C is regarded by some as being as low-level as possible while still being portable. (This, of course, excludes all APIs.) The GHC Haskell compiler internally uses a very C-like language in that regard, C--, which might be very close to the machine-independent code you are looking for.
Most modern compilers do have such intermediate code, for example LLVM. There is even an assembler-like form of it (so even lower level than C). But note that LLVM intermediate code is not portable, as, for example, the pointer size has to be known at compile time (all the sizeofs in C are fixed at that point).
But there is, IMO, a simpler solution: compile the code for one platform, and if you are on a different platform, use a dynamic recompiler like QEMU. That still has a negative impact on performance.
It's certainly possible, and interpreters exist for C and C++. However, projects using these languages will often use platform-specific code (like the Windows APIs) which stops them from being portable. Interpreted languages generally supply platform-independent core libraries.
Modern compilers – like Clang, LLVM and GCC – all compile your source code to an intermediate language. This means that the same code-level optimizations can be applied to any language that the compiler can convert, and it also enables tools like Emscripten which can effectively compile C to JavaScript! I believe it was used for the recent JavaScript Unreal Engine demo.
A Java example: Android 4.4 introduced a new experimental runtime virtual machine, ART (Android Runtime).
ART straddles an interesting mid-ground between compiled and interpreted code, called ahead-of-time (AOT) compilation. Currently with Android apps, they are interpreted at runtime (using the JIT), every time you open them up. This is slow. (iOS apps, by comparison, are compiled native code, which is much faster.) With ART enabled, each Android app is compiled to native code when you install it. Then, when it’s time to run the app, it performs with all the alacrity of a native app.
Source
On the EN Wiki I read that both C# and Java are interpreted languages, however at least for C# I think it is not true.
Many interpreted languages are first compiled to some form of virtual machine code, which is then either interpreted or compiled at runtime to native code.
From my understanding, it is compiled into CIL and, when run, the JIT compiles it for the target platform. I have also read that the JIT is an interpreter; is that really so?
Or are they called interpreted as they are using intermediate code? I do not understand it.
Thanks
JIT is a form of compilation to native (machine) code. Typically (but not as a necessity), implementations of both the CLI and the JVM are compiled in two steps:
the language compiler compiles code to something intermediate (IL/bytecode)
the JIT compiles that to native/machine code at runtime
However, interpreters for both do exist. The .NET Micro Framework operates as an IL interpreter, for example. Equally, tools like (looking at .NET here) NGEN and "AOT" (Mono) allow compilation to native/machine code up front.
They are considered JIT-compiled languages, which is different from being interpreted. JIT simply compiles to native code when needed during execution. The common strategy is to compile into an intermediate representation (bytecode) beforehand, which makes the JIT faster.
However, there is nothing that prevents them from being interpreted, or even statically compiled. Languages are simply languages - how they are executed is irrelevant from a language perspective.
On the EN Wiki I read that both C# and Java are interpreted languages
Can you please provide the link?
Maybe the word interpreted means something different here. It perhaps means that these languages are first translated from source code into platform-independent (VM-specific) code.
are they called interpreted as they are using intermediate code
I too think so.
I have also read that JIT is an interpreter
JIT is a compiler. See this
Is something "interpreter" or not depends on context of discussion.
From purely abstract view interpreter can be defined as any intermediate program present in runtime which dynamically translates program code written in one language to a target code of hardware/software of other language. Think about runing java bytecode on x86 hardware, or running Python on CLR VM what exactly IronPython is. In this view every virtual machine is an interpreter of some kind. As it is program present in runtime it clearly differs from static compilers or hardware implemented VM-s.
Now there are many different ways to achieve this functionality where accent is on "dynamically" and "present in runtime".
In discussions where implementation of VM matters, people make clear distinction between "classical" interpreter and JIT-ed one. Classical interpreter is something which for every instruction of hosted program emits routine of target code. This design is simple to build, but hard to optimize. JIT-ed design reads bunch of instruction of original code, and then translates all those instructions to a one native compiled routine. So it "interprets" faster. It is like micro static compiler within VM. There are many different ways to accomplish behavior labeled as JIT, and then there are other approaches like tracing compilers.
Modern VM's like CLR, HotSpot and J9 JVM's are even more complex than to be tagged with simple labels as JIT or Interpreter. They can be at a same time static compilers (AOT execution), classical interpreters and JIT-ed VMs.
For example CLR can compile code Ahead-Of-Time (static compiler), and store native code as bunch of more or less excutable files on disk to be used for faster future startups of hosted program. I believe "ngen" is AOT process used in windows for this functionality. If AOT is not used CLR behaves as JIT VM.
J9 and HotSpot are able to switch in runtime between purely interpreted execution or JIT-ed on depending of code analysis and current load. So it's is quite gray area. J9 even has AOT functionality similar to CLR.
Some other VMs like Maxine JVM or PyPy are socalled "metacircular" VM. This means they are (mostly) implemented in a same language they host (Maxine is JVM written in Java). In order to provided good code they usually have some JIT like behavior implemented in host language which is than bootstrapped and optimized by a very low, close to machine, interpreter.
So actual definition of interpreter varies on context of discussion. When labels like JIT are used then there is clear accent of discussion to an implementation details of VM being discussed.
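If you want to feel the difference between purely interpreted and JIT-ed execution on HotSpot, a quick sketch (the class name is invented) is to time a tight loop under the standard -Xint flag, which forces interpreter-only mode, and compare it with the default mixed mode:

    // Run "java Bench" (default: interpreter + JIT) and "java -Xint Bench" (interpreter only);
    // the -Xint run is typically much slower for a hot loop like this.
    public class Bench {
        public static void main(String[] args) {
            long start = System.nanoTime();
            long sum = 0;
            for (long i = 0; i < 50_000_000L; i++) {
                sum += i;
            }
            long ms = (System.nanoTime() - start) / 1_000_000;
            System.out.println("sum=" + sum + " took " + ms + " ms");
        }
    }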
I have been hearing a lot lately about Scala, Clojure, etc., which are supposed to run on the JVM.
Does this mean that those languages implement the Java API underneath?
What does it mean for a language to run on the JVM?
Thanks.
It means that these languages can be compiled into Java bytecode, which the JVM executes.
It means that the language compiles down to JVM byte code at some point. The language doesn't need to implement the Java API; the Java API is already there (more or less all the time).
It just means if you have a JVM you should be able to run the language without another VM (although you'll need whatever class files the language compiler and libraries need, obviously).
There is a virtual machine that Java runs on (the JVM), which abstracts away many machine-level worries. These languages just use its bytecode as an intermediate language, as opposed to emitting architecture-specific instructions.
Usually, it just means that you have to install a JRE to make sure they can execute.
And usually they don't require the JDK, which is used to compile .java code into .class bytecode files. Instead, they provide their own compiler, which runs on the JRE you have installed.
So in summary, you just need a runtime supporting Java (some specific version).