I'd like to get some clarification about a problem I am trying to solve. So far I have followed this documentation to upgrade Java on my Cloudera Quickstart VM from Java SE 1.7 to Java SE 1.8. My goal in upgrading is to run Spark 2 on my machine.
After I followed the directions, I checked which Java version is in the system by typing $ java -version in the terminal. The output, though, was still the default 1.7 version and not the 1.8 version that Cloudera Manager points to. Why is this happening? Will this discrepancy affect performance at all?
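One quick way to see which java binary the shell is actually resolving, and what the system points it at (a sketch assuming the CentOS-based Quickstart VM; the paths reported will vary per install):
# Show which java the shell finds and the real file it resolves to
which java
readlink -f $(which java)
# On CentOS-based systems the alternatives mechanism usually controls this
sudo alternatives --config java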
I am looking at someone else's code and the pom.xml says java version 1.8.
I need to know the version of the JDK. I know the two can be the same, but some sites say they are different?
Sorry for the newb question and thanks
I tried googling this and couldn't find an answer
EDIT: To be clear I know java 1.8 == Java 8. I am asking about the JDK.
For example, if an exploit only works on JDK 9+ and my pom says Java version 1.8, am I safe? My research says not necessarily, but I'm not sure, so I'm asking here.
The version of the JDK installed on your computer may or may not be Java 1.8 (a.k.a. Java 8). The setting in the pom.xml only means that the code is written to Java 8 standards and should produce output files that will work with Java 8. But you could easily use Java 17 to do this: newer versions of the JDK can produce class files that run on older versions of Java.
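For instance, a newer JDK can be told to emit Java 8-compatible output (a minimal sketch; Hello.java is just a placeholder source file):
# Compile with a JDK 9+ toolchain but target Java 8 bytecode and APIs
javac --release 8 Hello.java
# The resulting Hello.class will run on a Java 8 JVM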
To find out what your JDK is, on a command line, run java -version. For me, I get:
openjdk version "17.0.2" 2022-01-18
OpenJDK Runtime Environment (build 17.0.2+8-Ubuntu-120.04)
OpenJDK 64-Bit Server VM (build 17.0.2+8-Ubuntu-120.04, mixed mode, sharing)
In your pom.xml you may have some lines that look like:
<maven.compiler.source>11</maven.compiler.source>
<maven.compiler.target>11</maven.compiler.target>
This tells Maven (and by extension the compiler) that your source code is targeted at Java 11 and the output from the compiler should be code that works with Java 11 and higher.
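Note that the JDK Maven itself runs on is a separate question from those source/target levels; it can be checked from the command line:
# Prints Maven's version plus the Java version and JAVA_HOME it is using
mvn -version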
Yes, if the pom.xml has properties -> java.version set to 1.8, or the compiler plugin's source set to 1.8, it means the project is compiled at the Java 8 language level and the resulting classes target Java 8.
I'm running a simple PySpark Python script in PyCharm, within an Anaconda env with Python 3.7 (PySpark version 2.4.4), and I got this error:
pyspark.sql.utils.IllegalArgumentException: 'Unsupported class file major version 55'.
I've followed the potential solutions I found on Stack Overflow, but none of them worked. I followed this one:
https://stackoverflow.com/a/54704928/12375559: I've added java 1.8.0_221 to my system environment variables:
but when I type java -version in the PyCharm terminal it's still using Java 11.0.6:
>java -version
openjdk version "11.0.6" 2020-01-14
OpenJDK Runtime Environment (build 11.0.6+8-b765.1)
OpenJDK 64-Bit Server VM (build 11.0.6+8-b765.1, mixed mode)
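For reference, the session-level way to point a Unix-style terminal at JDK 8 looks like this (a sketch; the path below is a placeholder for wherever 1.8.0_221 actually lives, and on Windows the equivalent would use set and %JAVA_HOME%):
# Point this shell session at JDK 8 (placeholder path)
export JAVA_HOME=/usr/lib/jvm/java-8-openjdk
export PATH="$JAVA_HOME/bin:$PATH"
java -version   # should now report 1.8.x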
Then I found this:
Pyspark throws IllegalArgumentException: 'Unsupported class file major version 55' when trying to use udf: switch the boot JDK of PyCharm (press Shift twice -> jdk -> select JDK 1.8)
So I navigated to PyCharm's Switch IDE Boot JDK:
I can see the second option is 1.8, but without the Anaconda plugin, so I'm not sure what to do now; I don't want to mess up my settings. Might someone be able to help me, please? Many thanks!
The 'major version 55' error is caused by an older Java version trying to execute code compiled for Java 11 (in a .jar or .class file). So it seems an older Java version (for example Java 8, i.e. a JVM 8) is being used, but it encounters a part compiled for Java 11 (class file major version 55 corresponds to Java 11).
Since Java 8 is less well supported, you could try to use Java 11 or newer (see Java version history). You could try to use PySpark 3.0.1 with Python 3.8 and Java 11. That way, you have recent parts that should be able to work together.
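One way to confirm which release a suspect class file was compiled for (SomeClass.class is a placeholder; javap ships with the JDK):
# Prints the class file version: major version 55 = Java 11, 52 = Java 8
javap -verbose SomeClass.class | grep major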
These links might also be helpful:
Specify Java version in a Conda environment
Spark Error - Unsupported class file major version
When I run flutter create x or flutter doctor I get the following message:
"Unable to find any JVMs matching version 1.8."
I also tried following another thread on Stack Overflow for this issue, but it did not help because it did not specify what to do if you already have Java installed. At the moment, when I run java -version I get the following message:
java version "13.0.1" 2019-10-15
Java(TM) SE Runtime Environment (build 13.0.1+9)
Java HotSpot(TM) 64-Bit Server VM (build 13.0.1+9, mixed mode, sharing)
Thank you in advance for the help!
EDIT: I found a solution!
echo export "JAVA_HOME=\$(/usr/libexec/java_home -v 1.7)" >> ~/.bash_profile
All I did was replace 1.7 with a JDK I already had installed and now it works. The thread can be found below:
How to set JAVA_HOME environment variable on Mac OS X 10.9?
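For anyone else on macOS: the installed JDKs can be listed first, so you know which version string to substitute for 1.7 (java_home is the real macOS helper; 13 below is just an example):
# List every JDK that java_home knows about
/usr/libexec/java_home -V
# Then pin JAVA_HOME to one of them, e.g. Java 13
echo 'export JAVA_HOME=$(/usr/libexec/java_home -v 13)' >> ~/.bash_profile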
You've installed Java 13, while your application is looking for Java 8. So install Java 8 and set it as your default JVM/JRE (set your JAVA_HOME).
Hope it helps!
Just a guess, as I do not use Flutter…
Java changed its version numbering scheme
Java changed the way it reports its own version number. In earlier versions, the number was always 1.x.y, where x is what we colloquially considered to be the version. Eventually Sun/Oracle decided to make that common usage official. So later versions, such as the one you are using, dropped the leading 1.: rather than 1.13.y, Java now identifies itself as 13.y.
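Side by side, the two schemes look like this (illustrative version strings, not from any particular machine):
# Old scheme, Java 8 and earlier: 1.x.y
java version "1.8.0_201"
# New scheme, Java 9 and later: x.y.z
java version "13.0.1" 2019-10-15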
Update Flutter
This change in version numbering scheme can confuse old software that expected the version number to always report 1.x.y. Such software needs to be updated to understand the new number scheme.
I suggest you update your Flutter library to a more recent version, if one exists.
Downgrade Java
Most Java 8 apps would run without a problem on Java 13 were it not for this version-number parsing issue. So you should not have to downgrade from Java 13 to Java 8 to run your app. But you might need to downgrade to resolve this issue if Flutter was never updated properly.
If you do need to downgrade, here is a flowchart I made to help locate a vendor of a Java implementation. This chart is aimed at Java 11, but most of these vendors provide Java 8 implementations as well.
I am about to install Apache Spark 2.1.0 on Ubuntu 16.04 LTS. My goal is a standalone cluster, using Hadoop, with Scala and Python (2.7 is active)
Whilst downloading I get the choice: Prebuilt for Hadoop 2.7 and later (File is spark-2.1.0-bin-hadoop2.7.tgz)
Does this package actually include Hadoop 2.7, or does it need to be installed separately (first, I assume)?
I have Java JRE 8 installed (needed for other tasks). As JDK 8 also seems to be a prerequisite, I did a sudo apt install default-jdk, which indeed shows as installed:
default-jdk/xenial,now 2:1.8-56ubuntu2 amd64 [installed]
Checking java -version, however, doesn't show the JDK:
java version "1.8.0_121"
Java(TM) SE Runtime Environment (build 1.8.0_121-b13)
Java HotSpot(TM) 64-Bit Server VM (build 25.121-b13, mixed mode)
Is this sufficient for the installation? Why doesn't it also show the JDK?
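As an aside, java -version only reports the runtime; whether the JDK's compiler is present can be checked separately:
# If this prints a version, the JDK (not just the JRE) is installed
javac -version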
I want to use Scala 2.12.1. Does this version work well with the Spark 2.1/Hadoop 2.7 combination, or is another version more suitable?
Is the Scala SBT package also needed?
Been going back and forth trying to get everything working, but am stuck at this point.
Hope somebody can shed some light :)
You need to install Hadoop 2.7 separately, in addition to the Spark package: the "prebuilt for Hadoop 2.7" download bundles the client libraries for talking to Hadoop 2.7, not a Hadoop installation itself.
Your Java version is fine.
Note that the Spark 2.1 prebuilt binaries are compiled against Scala 2.11, so Scala 2.11.x is the safer match; Scala 2.12.1 may run into binary incompatibilities.
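Once the Spark tarball is unpacked, the bundled Scala version can be verified directly (assuming you run this from the unpacked spark-2.1.0-bin-hadoop2.7 directory):
# Prints the Spark version and the Scala version it was built with
./bin/spark-submit --version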
We are testing Neo4j 2.2.2 with Java 8u45, but we are seeing an error when we start Neo4j.
ERROR! Neo4j cannot be started using java version 1.8.0_45.
* Please use Oracle(R) Java(TM) 7 to run Neo4j Server. Download "Java Platform (JDK) 7" from:
http://www.oracle.com/technetwork/java/javase/downloads/index.html
* Please see http://docs.neo4j.org/ for Neo4j Server installation instructions.
Still, the database starts, so the question is: is this error message a bug, or should we revert from Java 8 to Java 7?
The error is triggered in the /bin/utils script, in the function checkjvmcompatibility().
Thank you !!!
It clearly says that you should use Java 7. This is a system requirement, enforced by the library.
But the release notes on the http://docs.neo4j.org/ say:
Neo4j 2.2.2 is a maintenance release, with critical improvements. Notably, this release: Provides full support for Oracle and OpenJDK Java 8.
So either there's a mistake in the documentation, or (more likely) they forgot to update the compatibility check utility. Send them a message about this problem.
http://neo4j.com/contact-us/
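For illustration only, a version gate of this shape (a hypothetical sketch, not Neo4j's actual bin/utils code) would reject 1.8.0_45 simply because it only accepts 1.7:
# Hypothetical sketch of a strict JVM version check (not Neo4j's real script)
version=$(java -version 2>&1 | awk -F '"' '/version/ {print $2}')
case "$version" in
  1.7.*) ;;  # accepted
  *) echo "ERROR! Neo4j cannot be started using java version $version." ;;
esac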
According to the Neo4j team, it is a bug that will be fixed in an upcoming maintenance release (thank you, Viksit Puri).
So the release notes are correct, and Java 8 works well with Neo4j 2.2.2.