JMeter non-GUI: not able to generate report folder using the JMeter command line - java

I am unable to generate the HTML report folder with JMeter from the command line.
I previously upgraded to the latest Java, and somehow it did not work.
I have since downloaded JDK 8 but encountered the message below:
jmeter: line 128: [: : integer expression expected jmeter: line 199:
/Library/Internet Plug-Ins/JavaAppletPlugin.plugin/Contents/Home -v
1.8.331.09/bin/java: No such file or directory

You're using the wrong Java: you need a JDK (or at least a JRE), and you seem to be using the Java browser plugin.
Follow the instructions from the Installation of the JDK on macOS article to get the required version of Java (not earlier than JDK 8), and make sure it's in your PATH before the one provided by the browser plugin.
You can also consider using Homebrew for installing JMeter.
More information: Get Started With JMeter: Installation & Tests
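To see which Java actually resolves first on your PATH (what the jmeter wrapper script will pick up), a minimal sketch:

```python
import shutil
import subprocess

# Show which `java` is first on PATH -- it should be a JDK/JRE install,
# not the browser plugin under /Library/Internet Plug-Ins.
java = shutil.which("java")
print("java resolved to:", java)

if java:
    # `java -version` writes its output to stderr
    result = subprocess.run([java, "-version"], capture_output=True, text=True)
    lines = (result.stderr or result.stdout).splitlines()
    print(lines[0] if lines else "no version output")
```

Once the right JDK resolves first, the HTML report folder is generated with `jmeter -n -t plan.jmx -l results.jtl -e -o report` (plan.jmx is a placeholder for your own test plan; the -o folder must be new or empty).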

Related

Error when initializing SparkContext in jupyterlab

Hi, I'm trying to learn how to use PySpark, but when I run these first lines:
import pyspark
sc = pyspark.SparkContext('local[*]')
I get this error:
Py4JJavaError: An error occurred while calling None.org.apache.spark.api.java.JavaSparkContext.
: java.lang.IllegalAccessError: class org.apache.spark.storage.StorageUtils$ (in unnamed module #0x724b93a8) cannot access class sun.nio.ch.DirectBuffer (in module java.base) because module java.base does not export sun.nio.ch to unnamed module #0x724b93a8
I can't seem to find what's causing it :/
Spark runs on Java 8/11, Scala 2.12, Python 3.6+ and R 3.5+. Python 3.6 support is deprecated as of Spark 3.2.0.
Java 8 prior to version 8u201 support is deprecated as of Spark 3.2.0
For the Scala API, Spark 3.2.0 uses Scala 2.12. You will need to use a compatible Scala version (2.12.x).
For Python 3.9, Arrow optimization and pandas UDFs might not work due to the supported Python versions in Apache Arrow. Please refer to the latest Python Compatibility page.
For Java 11, -Dio.netty.tryReflectionSetAccessible=true is required additionally for Apache Arrow library. This prevents java.lang.UnsupportedOperationException: sun.misc.Unsafe or java.nio.DirectByteBuffer.(long, int) not available when Apache Arrow uses Netty internally.
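If you are on Java 11, one place to put that Netty flag from the quoted docs is conf/spark-defaults.conf, so it applies to both driver and executors. A sketch, assuming the default config file location:

```properties
spark.driver.extraJavaOptions    -Dio.netty.tryReflectionSetAccessible=true
spark.executor.extraJavaOptions  -Dio.netty.tryReflectionSetAccessible=true
```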
What worked for me:
brew install openjdk@8
sudo ln -sfn /usr/local/opt/openjdk@8/libexec/openjdk.jdk /Library/Java/JavaVirtualMachines/openjdk-8.jdk
If you need to have openjdk@8 first in your PATH, run:
echo 'export PATH="/usr/local/opt/openjdk@8/bin:$PATH"' >> ~/.zshrc
source ~/.zshrc
Spark runs on Java 8/11; see the Java SE 8 Archive Downloads (JDK 8u202 and earlier).
I successfully installed Python 3.9.7 version of Anaconda distribution.
I've provided a Spark installation link: How to Install and Run PySpark in Jupyter Notebook on Windows
I've also provided a Spark installation video link: YouTube video on how to run PySpark in Jupyter Notebook on Windows
This works for me.
Source: Eden Canlilar
How to Install and Run PySpark in Jupyter Notebook on Windows
When I write PySpark code, I use Jupyter notebook to test my code before submitting a job on the cluster. In this post, I will show you how to install and run PySpark locally in Jupyter Notebook on Windows. I’ve tested this guide on a dozen Windows 7 and 10 PCs in different languages.
A. Items needed
Spark distribution from spark.apache.org
Python and Jupyter Notebook. You can get both by installing the Python 3.x version of the Anaconda distribution.
winutils.exe — a Hadoop binary for Windows — from Steve Loughran’s GitHub repo. Go to the corresponding Hadoop version in the Spark distribution and find winutils.exe under /bin. For example,
https://github.com/steveloughran/winutils/blob/master/hadoop-2.7.1/bin/winutils.exe
The findspark Python module, which can be installed by running python -m pip install findspark in either the Windows command prompt or Git Bash, if Python was installed in item 2. You can find the command prompt by searching for cmd in the search box.
If you don’t have Java or your Java version is 7.x or less, download and install Java from Oracle. I recommend getting the latest JDK (current version 9.0.1).
If you don’t know how to unpack a .tgz file on Windows, you can download and install 7-zip on Windows to unpack the .tgz file from Spark distribution in item 1 by right-clicking on the file icon and select 7-zip > Extract Here.
B. Installing PySpark
After getting all the items in section A, let’s set up PySpark.
Unpack the .tgz file. For example, I unpacked with 7zip from step A6 and put mine under D:\spark\spark-2.2.1-bin-hadoop2.7
Move the winutils.exe downloaded from step A3 to the \bin folder of Spark distribution. For example, D:\spark\spark-2.2.1-bin-hadoop2.7\bin\winutils.exe
Add environment variables: the environment variables let Windows find where the files are when we start the PySpark kernel. You can find the environment variable settings by putting “environ…” in the search box.
The variables to add are, in my example:

SPARK_HOME = D:\spark\spark-2.2.1-bin-hadoop2.7
HADOOP_HOME = D:\spark\spark-2.2.1-bin-hadoop2.7
PYSPARK_DRIVER_PYTHON = jupyter
PYSPARK_DRIVER_PYTHON_OPTS = notebook
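If you'd rather not touch the Windows settings dialog while experimenting, the same variables can also be set for the current process only, before findspark runs. A sketch using the example paths above (point SPARK_HOME at your own unpacked folder):

```python
import os

# Same variables as the list above, visible to this Python process only.
spark_home = r"D:\spark\spark-2.2.1-bin-hadoop2.7"  # example path
os.environ["SPARK_HOME"] = spark_home
os.environ["HADOOP_HOME"] = spark_home
os.environ["PYSPARK_DRIVER_PYTHON"] = "jupyter"
os.environ["PYSPARK_DRIVER_PYTHON_OPTS"] = "notebook"

print(os.environ["SPARK_HOME"])
```

findspark.init() reads SPARK_HOME, so this has to run before the findspark.init() call in step C.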
In the same environment variable settings window, look for the Path or PATH variable, click edit, and add D:\spark\spark-2.2.1-bin-hadoop2.7\bin to it. In Windows 7 you need to separate the values in Path with a semicolon (;).
(Optional, if you see a Java-related error in step C) Find the installed Java JDK folder from step A5, for example D:\Program Files\Java\jdk1.8.0_121, and add the following environment variable:

JAVA_HOME = D:\Progra~1\Java\jdk1.8.0_121
If the JDK is installed under \Program Files (x86), replace the Progra~1 part with Progra~2 instead. In my experience, this error only occurs on Windows 7, and I think it's because Spark couldn't parse the space in the folder name.
Edit (1/23/19): You might also find Gerard’s comment helpful: How to Install and Run PySpark in Jupyter Notebook on Windows
C. Running PySpark in Jupyter Notebook
To run Jupyter Notebook, open the Windows command prompt or Git Bash and run jupyter notebook. If you use Anaconda Navigator to open Jupyter Notebook instead, you might see a "Java gateway process exited before sending the driver its port number" error from PySpark in step C. Fall back to the Windows cmd if that happens.
Once inside Jupyter Notebook, open a Python 3 notebook.
In the notebook, run the following code:
import findspark
findspark.init()
import pyspark # only run after findspark.init()
from pyspark.sql import SparkSession
spark = SparkSession.builder.getOrCreate()
df = spark.sql('''select 'spark' as hello ''')
df.show()
When you press run, it might trigger a Windows firewall pop-up. I pressed cancel on the pop-up as blocking the connection doesn’t affect PySpark.
If you see a small table with a single hello column containing the value spark, then you have successfully installed PySpark on your Windows system!

How to change the used java version of SoapUI?

I have multiple Java versions installed and switch versions when needed.
If I use Java 8 as the default, SoapUI will use it, and a popup with this error will show:
java.lang.UnsupportedClassVersionError: com/eviware/soapui/SoapUI has been compiled by a more recent version of the Java Runtime (class file version 53.0), this version of the Java Runtime only recognizes class file versions up to 52.0
Since 5.6.0, SoapUI needs a Java version ≥ 9, so I want to specify the Java version used by SoapUI while my default version is Java 8.
Note: by "default" I mean the version printed when java -version is executed in the Terminal.
In the file SoapUI-5.6.0/bin/SoapUI-5.6.0 :
Uncomment the variable INSTALL4J_JAVA_HOME_OVERRIDE at the beginning of the file
Assign the Java home (JRE or JDK) to INSTALL4J_JAVA_HOME_OVERRIDE
Example :
INSTALL4J_JAVA_HOME_OVERRIDE=/usr/lib/jvm/jre-11/
I'll leave my solution here, since I lost 4 hours just to understand that if you have Java 8 and you use Ubuntu, you can save the world, but SoapUI 5.6.0 won't work.
So the solution is to download an older version, like 5.4.0
Don't bother looking for one if you're an Ubuntu lady/guy, just use this command:
$ wget https://s3.amazonaws.com/downloads.eviware/soapuios/5.4.0/SoapUI-5.4.0-linux-bin.tar.gz
Then unzip and untar the downloaded archive:
$ gunzip SoapUI-5.4.0-linux-bin.tar.gz
$ tar xvf SoapUI-5.4.0-linux-bin.tar
Finally, start SoapUI by entering the folder where it is installed and running:
./bin/soapui.sh
In case it helps anyone: I have Java 11 alongside Java 8 (with the latter being the default) and SoapUI 5.6.0.
Solution: I've set the INSTALL4J_JAVA_HOME_OVERRIDE variable in SoapUI-5.6.0 to:
INSTALL4J_JAVA_HOME_OVERRIDE="/usr/lib/jvm/java-11-openjdk-amd64/"
Just use nano, gedit, etc. to modify it (don't forget to use elevated permissions, if necessary).
Best regards

Failed to start Jenkins on macOs - Java 10

I'm trying to start Jenkins using:
java -jar jenkins.war
I got this error:
java.lang.UnsupportedClassVersionError: 54.0
at Main.main(Main.java:128)
This problem came after an update of my development environment; I'd switched to:
Java 10.0.1+10
Jenkins 2.107.2
MacOS 10.13.4
Based on the error message that you are getting:
The JAR / WAR file being loaded was compiled for Java 10 (or later), because the message says that the class file version is 54.
The JRE that is actually being used is Java 9 or earlier. If you were using Java 10, it would be happy with version 54.
In other words, despite upgrading your Java to Java 10, you must be using an older version to start Jenkins.
Check the launch script for Jenkins and make sure that it uses the correct JRE installation.
If you are launching Jenkins using java -jar jenkins.war, check what java -version tells you ... at the same command prompt.
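The class-file numbers in these errors map directly onto Java releases (major version minus 44), which is how the reasoning above gets from "54" to "compiled for Java 10". A quick sketch of that arithmetic:

```python
def java_release(classfile_major: int) -> int:
    """Map a class-file major version to the Java release it requires.

    Java 8 -> 52, Java 9 -> 53, Java 10 -> 54, Java 11 -> 55, ...
    i.e. release = major - 44.
    """
    return classfile_major - 44

# The Jenkins error above: the WAR was compiled as class file version 54 ...
print(java_release(54))  # 10
# ... while a JRE that only recognizes up to 52 is Java 8.
print(java_release(52))  # 8
```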
An easier approach could be to download the Jenkins WAR for 2.127 (weekly release) or above. Then one can run the WAR with the following command:
${JAVA10_HOME}/bin/java --add-modules java.xml.bind -jar jenkins.war \
--enable-future-java --httpPort=8080 --prefix=/jenkins
Though note that there are a few known issues registered on their tracker:
Pipeline crashes immediately on Java 10 and 11 (JENKINS-46602)
There are many warnings about Illegal reflective access during execution (JENKINS-40689)
Configuration-as-Code plugin fails to export configurations on Java 10 (JENKINS-51991)
Here are the individual trackers for Java 10 compatibility and one for Java 11.
Source - Jenkins with Java10-11

Elasticsearch installation

As per the install wiki [https://www.elastic.co/guide/en/elasticsearch/reference/current/zip-targz.html#install-targz] I have the elasticsearch-5.4.3 installed.
./bin/elasticsearch gave the following error :
Elasticsearch requires at least Java 8 but your Java version
from /usr/bin/java does not meet this requirement
So I downloaded the latest version of Java as well, into the same directory as Elasticsearch.
But running the following command still errors out
./jre1.8.0_151/bin/java ./elasticsearch-5.4.3/bin/elasticsearch
with error message :
Error: Could not find or load main class ..elasticsearch-5.4.3.bin.elasticsearch
What could be the remedy for this?
First install Java (version 8+) and set the $JAVA_HOME variable, or download Java. Use update-alternatives to set the default Java.
update-alternatives: warning: /etc/alternatives/java is dangling
Then install Elasticsearch from the RPM file and configure /etc/elasticsearch/elasticsearch.yml.
https://www.elastic.co/guide/en/elasticsearch/reference/current/rpm.html
Now you can start the Elasticsearch service by executing:
service elasticsearch start
Possibility 1: don't start it with java; run elasticsearch directly
step 1: set $JAVA_HOME to your latest 1.8 Java installation
step 2: run the command ./bin/elasticsearch
Possibility 2: permission issue
step 1: set SELinux to permissive mode
step 2: run the installation command as the same user who downloaded and extracted the Elasticsearch packages.

Error during Maven execution

I'm trying to run Maven and it returns this error:
/opt/apache-maven-3.3.3/bin/mvn: 227: exec: /opt/jdk1.7.0_79/bin/java: not found
What can it be?
Run the command echo $JAVA_HOME to see where the Java home is set and verify that the path is correct. Then check that the $JAVA_HOME/bin/ folder is there and that the java command is in it.
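That check can be sketched in a few lines. The helper below mirrors the failing line from the error (the mvn script execs $JAVA_HOME/bin/java); find_mvn_java is an illustrative name, and the stale JDK path is the one from the question:

```python
import os
from pathlib import Path

def find_mvn_java(env):
    """Return the java binary mvn would exec via JAVA_HOME, or None.

    The mvn wrapper script execs $JAVA_HOME/bin/java when JAVA_HOME
    is set, so a stale JAVA_HOME produces 'java: not found'.
    """
    java_home = env.get("JAVA_HOME")
    if not java_home:
        return None  # mvn would fall back to `java` on PATH instead
    candidate = Path(java_home) / "bin" / "java"
    return candidate if candidate.exists() else None

# A JAVA_HOME left over from a removed JDK (the question's situation)
# resolves to nothing:
print(find_mvn_java({"JAVA_HOME": "/opt/jdk1.7.0_79"}))
print(find_mvn_java(os.environ))  # your current environment
```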
I was just experiencing this same problem, and about 4 hours later I figured it out. I had installed the wrong version of the Java JDK. I originally installed the Linux x86 version, but my system required the x64 version. Make sure you have the version of the Java JDK that is compatible with your system. Here's the link to those JDK downloads: http://www.oracle.com/technetwork/java/javase/downloads/jdk8-downloads-2133151.html
