Running Hive 0.12 with SLF4J error - java

Logging initialized using configuration in jar:file:/usr/local/hive/lib/hive-common-0.12.0.jar!/hive-log4j.properties
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/local/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/local/hive/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
hive>

You need to delete these jar files, which create the conflicting SLF4J binding between Hadoop and Hive:
rm lib/hive-jdbc-2.0.0-standalone.jar
rm lib/log4j-slf4j-impl-2.4.1.jar

You have to delete /usr/local/hive/lib/slf4j-log4j12-1.6.1.jar, because Hive will automatically use the slf4j-log4j12 jar already present in Hadoop.
See also https://issues.apache.org/jira/browse/HIVE-6162

Of the two SLF4J bindings listed in the warning, you need to exclude one from the classpath.
Even though this is only a warning, SLF4J will still pick one logging framework/implementation and bind to it; which binding wins is determined by the JVM's classloading order and is effectively random.
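If you are not sure which jars actually contain a binding, you can scan for the StaticLoggerBinder class directly. A minimal sketch, assuming the unzip utility is available and your jars live under the usual Hive and Hadoop lib directories:
# list every jar that contains an SLF4J binding, so you know what to remove
for jar in /usr/local/hive/lib/*.jar /usr/local/hadoop/share/hadoop/common/lib/*.jar; do
  if unzip -l "$jar" 2>/dev/null | grep -q 'org/slf4j/impl/StaticLoggerBinder.class'; then
    echo "$jar"
  fi
done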

You are getting this warning because of a conflict between the slf4j jars pulled in from the Hive and Hadoop paths.
To get rid of it, just delete hive-jdbc-1.1.0-standalone.jar from /usr/local/hive/lib.
Then you should be good to go ... :)

To resolve this, add the following line to the /usr/iop/4.1.0.0/hive/bin/hive.distro file on all Hive nodes:
CLASSPATH=`echo $CLASSPATH| sed 's/\/usr\/local\/hadoop\/lib\/slf4j\-log4j12\-1\.7\.10\.jar//g'`
The line should be inserted after this block:
if $cygwin; then
  CLASSPATH=`cygpath -p -w "$CLASSPATH"`
  CLASSPATH=${CLASSPATH};${AUX_CLASSPATH}
else
  CLASSPATH=${CLASSPATH}:${AUX_CLASSPATH}
fi
The warnings will no longer appear.
http://www-01.ibm.com/support/docview.wss?uid=swg21971864
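As a quick check after applying any of the fixes above, you can confirm that the Hive CLI no longer prints the binding warnings; with a single binding on the classpath, SLF4J 1.x stays silent. A sketch, assuming hive is on your PATH:
hive -e 'show databases;' 2>&1 | grep SLF4J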

Related

How to make IntelliJ IDEA load org.slf4j.impl.StaticLoggerBinder in order to run Kafka?

I want to run Kafka from IDEA and I am getting the following error:
> Task :core:Kafka.main()
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
SLF4J: Failed to load class "org.slf4j.impl.StaticMDCBinder".
SLF4J: Defaulting to no-operation MDCAdapter implementation.
SLF4J: See http://www.slf4j.org/codes.html#no_static_mdc_binder for further details.
> Task :core:Kafka.main() FAILED
Execution failed for task ':core:Kafka.main()'.
> Process 'command '/Library/Java/JavaVirtualMachines/adoptopenjdk-8.jdk/Contents/Home/bin/java'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
I am able to run Kafka from the terminal.
I run Zookeeper from the terminal and then run Kafka from IDEA.
Steps that I follow to run Kafka
As usual, I run the command ./gradlew jar to build Kafka from the source, using the terminal.
I open the project in IDEA using idea . from the root directory of the cloned repository.
I open the file core/src/main/scala/kafka/Kafka.scala.
I then navigate to the main() function and click on the green triangle.
This generates a Run configuration, which then fails. I add config/server.properties to the Program Arguments so the Run configuration passes that file to Kafka.
Upon running with the above configurations, I get the aforementioned error.
I searched a bit and found that the same issue was resolved by adding dependencies, as mentioned here and here, but I could not understand how to add a dependency, since I do not use Maven and cannot find a pom.xml file as described there.
Update 1
I tried to add the exact dependency, as stated in the terminal output, to the Run configuration; I am unsure whether it was actually added, because the result is still the same.
This is what I find when I run Kafka from the terminal:
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/Users/aviralsrivastava/dev/kafka/core/build/dependant-libs-2.13.5/slf4j-log4j12-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/Users/aviralsrivastava/dev/kafka/tools/build/dependant-libs-2.13.5/slf4j-log4j12-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/Users/aviralsrivastava/dev/kafka/connect/api/build/dependant-libs/slf4j-log4j12-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/Users/aviralsrivastava/dev/kafka/connect/transforms/build/dependant-libs/slf4j-log4j12-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/Users/aviralsrivastava/dev/kafka/connect/runtime/build/dependant-libs/slf4j-log4j12-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/Users/aviralsrivastava/dev/kafka/connect/file/build/dependant-libs/slf4j-log4j12-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/Users/aviralsrivastava/dev/kafka/connect/mirror/build/dependant-libs/slf4j-log4j12-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/Users/aviralsrivastava/dev/kafka/connect/mirror-client/build/dependant-libs/slf4j-log4j12-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/Users/aviralsrivastava/dev/kafka/connect/json/build/dependant-libs/slf4j-log4j12-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/Users/aviralsrivastava/dev/kafka/connect/basic-auth-extension/build/dependant-libs/slf4j-log4j12-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
[2021-03-23 10:54:59,524] INFO Registered kafka:type=kafka.Log4jController MBean (kafka.utils.Log4jControllerRegistration$)
From the second-to-last line, I deduced the dependency name and added it to the Run configuration.
Generally speaking, the classpath needs to be configured correctly to isolate a single SLF4J implementation (specifically, the one in the core module); avoid messing with the build scripts just to fix the runtime dependencies of one module.
To fix your logger, you need to pass in -Dlog4j.configuration=config/log4j.properties as a VM option (I think you'll have to toggle the drop-down where it says -cp kafka.core.main to get that input to show).
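For reference, the VM options field would then contain something like the line below. Note that log4j 1.x treats a value without a URL scheme as a classpath resource, so if config/log4j.properties is a plain file relative to the working directory, the file: form may be needed (an assumption to verify in your setup):
-Dlog4j.configuration=file:config/log4j.properties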
If you want to emulate the actual runtime behavior of the server and attach a debugger, set up your breakpoints, then open a terminal (assuming you are using Zookeeper and it is already running somewhere else; otherwise you need another terminal for it):
export KAFKA_DEBUG=y
export DEBUG_SUSPEND_FLAG=y
bin/kafka-server-start.sh config/server.properties
Then add a Run configuration for a remote application and attach it to port 5005.
Once it attaches, your breakpoint should take focus, and you can step through the code.
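Under the hood, those environment variables make bin/kafka-run-class.sh add a JDWP debug agent to the JVM. If you prefer to set the flags explicitly, a roughly equivalent sketch (KAFKA_OPTS is passed straight through to the JVM; 5005 is the default debug port):
# equivalent of KAFKA_DEBUG=y plus DEBUG_SUSPEND_FLAG=y: start a JDWP agent
# that suspends the JVM on port 5005 until a debugger attaches
export KAFKA_OPTS="-agentlib:jdwp=transport=dt_socket,server=y,suspend=y,address=5005"
bin/kafka-server-start.sh config/server.properties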
I was also facing the same issue while trying to set up the Apache Kafka source code in IntelliJ IDEA; here is what I did to solve the SLF4J issue.
Set the JVM arguments in the Run configuration.
Also, in the build.gradle file, add the following dependency in the project(':core') section, after the line implementation libs.commonsCli (see the sketch below for context):
implementation libs.slf4jlog4j
Tested on Java 17, Apache Kafka 3.1.0, and IntelliJ IDEA 2021.3.1.
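For context, the dependency lands in the dependencies block of the project(':core') section of build.gradle, roughly like this (a sketch against the Kafka build, which already defines the libs version catalog; the surrounding entries are illustrative):
project(':core') {
  dependencies {
    // ...existing dependencies stay as they are...
    implementation libs.commonsCli
    // route SLF4J to log4j at runtime for this module
    implementation libs.slf4jlog4j
  }
}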

How to find specific org/slf4j/Logger jar file out of multiple bindings from the apache zip?

I am using Apache PDFBox and POIFSFileSystem to extract text from PDFs and export it to Excel, but my application throws the error below even after adding the apache-log4j-2.8.2-bin jar files. I tried adding the org/slf4j/Logger jar files, but it still throws an error; maybe I did not find the exact files. Please suggest.
Exception in thread "main" java.lang.NoClassDefFoundError: org/slf4j/Logger
at org.apache.logging.slf4j.SLF4JLoggerContext.getLogger(SLF4JLoggerContext.java:39)
at org.apache.logging.log4j.jcl.LogAdapter.newLogger(LogAdapter.java:34)
at org.apache.logging.log4j.jcl.LogAdapter.newLogger(LogAdapter.java:30)
at org.apache.logging.log4j.spi.AbstractLoggerAdapter.getLogger(AbstractLoggerAdapter.java:52)
at org.apache.logging.log4j.jcl.LogFactoryImpl.getInstance(LogFactoryImpl.java:40)
at org.apache.logging.log4j.jcl.LogFactoryImpl.getInstance(LogFactoryImpl.java:55)
at org.apache.commons.logging.LogFactory.getLog(LogFactory.java:655)
at org.apache.pdfbox.pdmodel.PDDocument.<clinit>(PDDocument.java:80)
After adding SLF4J 1.7.25, I got the error below; the full output is more than 9k lines.
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/D:/apache-log4j-2.8.2-bin/log4j-slf4j-impl-2.8.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/D:/slf4j-1.7.25/slf4j-android-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/D:/slf4j-1.7.25/slf4j-jcl-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/D:/slf4j-1.7.25/slf4j-jdk14-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/D:/slf4j-1.7.25/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/D:/slf4j-1.7.25/slf4j-nop-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/D:/slf4j-1.7.25/slf4j-simple-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Exception in thread "main" java.lang.StackOverflowError
at org.apache.logging.log4j.LogManager.getContext(LogManager.java:194)
at org.apache.logging.log4j.spi.AbstractLoggerAdapter.getContext(AbstractLoggerAdapter.java:122)
at org.apache.logging.slf4j.Log4jLoggerFactory.getContext(Log4jLoggerFactory.java:43)
at org.apache.logging.log4j.spi.AbstractLoggerAdapter.getLogger(AbstractLoggerAdapter.java:46)
at org.apache.logging.slf4j.Log4jLoggerFactory.getLogger(Log4jLoggerFactory.java:29)
java.lang.NoClassDefFoundError generally means that a required jar is not on the classpath. Download the slf4j jar from slf4j.org/download and try again, but note that the SLF4J distribution ships several alternative bindings: put slf4j-api plus exactly one binding on the classpath, not all of them. The StackOverflowError above is the classic symptom of routing SLF4J to Log4j 2 while Log4j 2 is simultaneously routed back to SLF4J, which creates an infinite delegation loop.
Also, it is always better to use something like Maven to resolve all your dependencies so that these problems can be avoided.
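If you do switch to Maven, the key is to declare slf4j-api plus exactly one binding, for example (a sketch; the version is illustrative and should match your other dependencies):
<dependencies>
  <dependency>
    <groupId>org.slf4j</groupId>
    <artifactId>slf4j-api</artifactId>
    <version>1.7.25</version>
  </dependency>
  <!-- exactly one binding; here SLF4J is routed to log4j 1.2 -->
  <dependency>
    <groupId>org.slf4j</groupId>
    <artifactId>slf4j-log4j12</artifactId>
    <version>1.7.25</version>
  </dependency>
</dependencies>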

Error Livy Spark Server hue 3.9

I installed hue 3.9 on a cluster of 5 hosts with HDP 2.3. My Ambari version is 2.1.2.
The problem is that the Hue initial setup screen displays:
Spark The app will not work without a running Livy Spark Server
Several problems appeared earlier but I got to solve them.
Following this thread (Error in running livy spark server in hue) and this page, http://gethue.com/new-notebook-application-for-spark, I tried several things, but when I start livy-spark as the root user I get the following error:
[root@m1 bin]# /usr/local/hue/build/env/bin/hue livy_server
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/local/hue/apps/spark/java-lib/livy-assembly.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/2.3.2.0-2950/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
Failed to run spark-submit executable: java.io.IOException: Unable to determing spark-submit version [1]:
If I execute spark-submit from /usr/local/hue/build/env/bin/, it seems to work; it shows me the command's options. And the spark-submit --version command displays Spark's version correctly (1.4.1).
Could someone help me?
Thank you and regards
Have you tried updating your PATH to point to /usr/local/hue/build/env/bin/? Such as
# export PATH="$PATH:/usr/local/hue/build/env/bin/"
# /usr/local/hue/build/env/bin/hue livy_server

Unable to run Mahout 20newsgroups example under Cygwin

I was able to verify that the input directory (under /tmp) exists with the newsgroup data. Not sure why I am getting a file not found exception.
$ sh classify-20newsgroups.sh
Please select a number to choose the corresponding task to run
1. naivebayes
2. sgd
3. clean -- cleans up the work area in /tmp/mahout-work-rsrinivasan
Enter your choice : 1
ok. You chose 1 and we'll use naivebayes
creating work directory at /tmp/mahout-work-rsrinivasan
Preparing Training Data
MAHOUT_LOCAL is not set; adding HADOOP_CONF_DIR to classpath.
no HADOOP_HOME set, running locally
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/C:/cygwin/usr/local/mahout/examples/target/mahout-examples-0.6-job.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/C:/cygwin/usr/local/mahout/examples/target/dependency/slf4j-jcl-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/C:/cygwin/usr/local/mahout/examples/target/dependency/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
12/05/14 09:13:44 WARN driver.MahoutDriver: No org.apache.mahout.classifier.bayes.PrepareTwentyNewsgroups.props found on classpath, will use command-line arguments only
Exception in thread "main" java.io.FileNotFoundException: Can't find input directory \tmp\mahout-work-rsrinivasan\20news-bydate\20news-bydate-train
at org.apache.mahout.classifier.bayes.PrepareTwentyNewsgroups.main(PrepareTwentyNewsgroups.java:92)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:68)
at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:139)
at org.apache.mahout.driver.MahoutDriver.main(MahoutDriver.java:188)
You probably have to edit that script before it works on Windows. I imagine the paths are wrong for Cygwin/Windows.
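One quick way to see whether the paths are the problem is to ask Cygwin what the JVM will actually receive; a sketch (cygpath ships with Cygwin, and /tmp normally maps under the Cygwin root):
# show the Windows form of the work directory the script passes to Java
cygpath -w /tmp/mahout-work-$USER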
It is probably best to run the example under a Unix environment. When I was trying the oscon2011 Reuters example I ran into similar issues, although I was using the Git Bash console for the work. It seems that the classification and clustering examples need HDFS or a local Unix filesystem to run properly.
I managed to get a VirtualBox VM up and running using Vagrant, and the process was relatively straightforward. It does add to the learning curve, but after some initial investment I was able to complete the Reuters example in a couple of hours.
thanks
anand
