Google Maps Services Java Client Freezing WebApp - java

I would like to get the latitude and longitude of a location name entered by a customer from the Google Places API. The execution freezes at the second line.
try {
    GeoApiContext context = new GeoApiContext.Builder().apiKey("MY_API_KEY").build();
    PlaceDetails placeDetails = PlacesApi.placeDetails(context, "Nairobi").await();
    double lat = placeDetails.geometry.location.lat;
    double lng = placeDetails.geometry.location.lng;
} catch (ApiException | InterruptedException | IOException apiException) {
    apiException.printStackTrace(System.out);
}
I get the following stack trace, but I can't quite pinpoint the root cause, as the example I am running is the simplest in the tests provided.
Libraries included in my application are:
google-maps-services-0.2.1.jar
gson-2.8.1.jar
okhttp-3.8.1.jar
okio-1.13.0.jar
slf4j-api-1.7.25.jar
There is also a similar issue on GitHub.
This is the stack trace I get:
Info: apartments was successfully deployed in 2,737 milliseconds.
Severe: SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
Severe: SLF4J: Defaulting to no-operation (NOP) logger implementation
Severe: SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
Severe: Exception in thread "RateLimitExecutorDelayThread"
Severe: java.lang.NoSuchMethodError: com.google.common.util.concurrent.RateLimiter.acquire()D
at com.google.maps.internal.RateLimitExecutorService.run(RateLimitExecutorService.java:75)
at java.lang.Thread.run(Thread.java:745)

The java.lang.NoSuchMethodError for RateLimiter.acquire()D means that, at runtime, the Guava on your classpath provides a RateLimiter.acquire() that does not return double, i.e. a Guava version conflict with google-maps-services 0.2.1. Change the google-maps-services version to 0.2.0 in pom.xml as below:
<dependency>
    <groupId>com.google.maps</groupId>
    <artifactId>google-maps-services</artifactId>
    <version>0.2.0</version>
</dependency>
This will fix your problem.
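Alternatively, if you need to stay on 0.2.1, pinning a newer Guava may also work: the `acquire()D` descriptor in the error suggests the classpath carries an older Guava whose RateLimiter.acquire() does not return double (the return type should have changed to double in Guava 16.0). This is a sketch, not verified against your setup:

```xml
<dependency>
    <groupId>com.google.guava</groupId>
    <artifactId>guava</artifactId>
    <version>20.0</version>
</dependency>
```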

Related

arquillian test does not start

The arquillian test fails to start with the following message:
"Error: The LogManager accessed before the java.util.logging.manager
system property was set to org.jboss.logmanager.LogManager. Results
may be unexpected SLF4J: Failed to load class
org.slf4j.impl.StaticLoggerBinder SLF4J: Defaulting to no-operation
(NOP) logger implementation SLF4J: See
http://www.slf4j.org/codes.html#StaticLoggerBinder for further
details"
Unfortunately I don't understand these messages. I've researched but haven't found a solution.
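The quoted message itself points at the usual fix: the java.util.logging.manager system property must be set to org.jboss.logmanager.LogManager before anything touches the LogManager. A commonly suggested approach is to pass it as a JVM argument to the test run; the Surefire configuration below is an assumption about a Maven setup, not taken from the question:

```xml
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-surefire-plugin</artifactId>
    <configuration>
        <argLine>-Djava.util.logging.manager=org.jboss.logmanager.LogManager</argLine>
    </configuration>
</plugin>
```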

Hbase error while executing ./bin/start-hbase.sh (windows)

I installed Hadoop with this tutorial https://www.youtube.com/watch?v=g7Qpnmi0Q-s and it's working. I installed it in C:/hadoop.
I installed Hadoop only because I read that it is a prerequisite for running HBase in non-standalone mode, and the error messages refer to some Hadoop configuration. But it didn't help.
I tried to install HBase with this tutorial https://ics.upjs.sk/~novotnyr/blog/334/setting-up-hbase-on-windows. But I'm getting this error while executing ./bin/start-hbase.sh
Output in cygwin terminal:
$ ./bin/start-hbase.sh
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
: Name or service not knownstname laptop-l6543teb
running master, logging to /cygdrive/c/java/hbase-2.2.4-bin/hbase-2.2.4//logs/hbase-maiwa-master-LAPTOP-L6543TEB.out
: running regionserver, logging to /cygdrive/c/java/hbase-2.2.4-bin/hbase-2.2.4//logs/hbase-maiwa-regionserver-LAPTOP-L6543TEB.out
hbase-site.xml
<configuration>
    <property>
        <name>hbase.rootdir</name>
        <value>file:///C:/cygwin/root/tmp/hbase/data</value>
    </property>
    <property>
        <name>hbase.zookeeper.property.dataDir</name>
        <value>C:\Java\hbase-2.2.4-bin\hbase-2.2.4\logs</value>
    </property>
    <property>
        <name>hbase.cluster.distributed</name>
        <value>false</value>
    </property>
    <property>
        <name>hbase.rootdir</name>
        <value>hdfs://localhost:9000/hbase</value>
    </property>
</configuration>
Environment variables and Path variables: (screenshots omitted)
The error output produced by start-hbase.sh has three different errors.
1. Issue with HADOOP_HOME variable
WARNING: DEFAULT_LIBEXEC_DIR ignored. It has been replaced by HADOOP_DEFAULT_LIBEXEC_DIR.
WARNING: log4j.properties is not found. HADOOP_CONF_DIR may be incomplete.
ERROR: Invalid HADOOP_COMMON_HOME
Update the Environment variables with HADOOP_HOME pointing to the Hadoop installation folder (not the bin folder within the installation folder).
As per your setting,
HADOOP_HOME=C:\hadoop\
Additionally, set the location of the configuration files
HADOOP_CONF_DIR=C:\hadoop\etc\hadoop\
2. Issue with interpreting Linux style path or Invalid path
cygpath: can't convert empty path
In hbase-env.sh (under C:\Java\hbase-2.2.4-bin\hbase-2.2.4\conf\), update the values for HBASE_HOME and HBASE_CLASSPATH
As per your installation,
export HBASE_HOME=/cygdrive/c/Java/hbase-2.2.4-bin/hbase-2.2.4/
export HBASE_CLASSPATH=/cygdrive/c/Java/hbase-2.2.4-bin/hbase-2.2.4/lib/
And in your environment variables, make sure HBASE_HOME is configured similar to HADOOP_HOME.
3. Unable to resolve hostname
: Name or service not knownstname laptop-l6543teb
Update your hosts file with correct IP - Hostname mapping.
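For reference, an entry of the shape below in the hosts file resolves the hostname from the error output; the IP is a placeholder, so use your machine's actual address:

```
127.0.0.1    laptop-l6543teb
```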

Azure Java SDK: How to disable logging to console?

I'm developing an application using Azure's Java SDK and Maven. This application sends data to an IoT Hub, among other functionality that is not important for the scope of the question.
I implemented my own logging inside the application by using log4j2 and I'm fine with that since I can modify and change it however I want.
The problem arose when I checked this warning that was coming up in my application's console output:
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
Thanks to this SO question I was able to do the correct move and add the dependency inside my pom.xml file like so:
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>com.project.myProject</groupId>
<artifactId>myProject</artifactId>
<packaging>jar</packaging>
<version>1.0.0</version>
...
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-jdk14</artifactId>
<version>1.7.25</version>
</dependency>
...
After this addition, Azure's SDK started printing a lot of information to the console that I don't really want to see. This should be the class that originates the logging.
Below is some of the output that gets written to the console on its own.
...
Jun 07, 2018 8:09:18 PM com.microsoft.azure.sdk.iot.device.CustomLogger LogInfo
INFO: IotHubConnectionString object is created successfully for iotHub.azure-devices.net, method name is <init>
Jun 07, 2018 8:09:19 PM com.microsoft.azure.sdk.iot.device.CustomLogger LogInfo
INFO: DeviceClientConfig object is created successfully with IotHubName=iotHub.azure-devices.net, deviceID=device01 , method name is <init>
Jun 07, 2018 8:09:20 PM com.microsoft.azure.sdk.iot.device.CustomLogger LogInfo
INFO: DeviceIO object is created successfully, method name is <init>
Jun 07, 2018 8:09:20 PM com.microsoft.azure.sdk.iot.device.CustomLogger LogInfo
INFO: Setting SASTokenExpiryTime as 2400 seconds, method name is setOption_SetSASTokenExpiryTime
...
I've already tried to disable the Logger but with no success (followed this SO question).
I would like to know if someone has ever had this problem and if so how can I disable the logging features or else suppress the warning?
Thanks a lot in advance!
There is a blog, How to Configure SLF4J with Different Logger Implementations, which you can refer to in order to configure your slf4j-jdk14 logger implementation, as below.
Using slf4j with JDK logger
The JDK comes with its own logging package, and you can add this logger implementation to your pom.xml:
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-jdk14</artifactId>
<version>1.7.5</version>
</dependency>
Now, the configuration for JDK logging is a bit difficult to work with. Not only do you need a config file, such as src/main/resources/logging.properties, but you also need to add a system property -Djava.util.logging.config.file=logging.properties in order to have it picked up. Here is an example to get you started:
level=INFO
handlers=java.util.logging.ConsoleHandler
java.util.logging.ConsoleHandler.level=FINEST
deng.level=FINEST
There are two ways to avoid outputting these INFO logs to the console.
Raise the logging level from FINEST or INFO to WARNING or SEVERE so that low-level logs are not output; you can refer to the Oracle Javadoc for the Level class, as below.
The levels in descending order are:
SEVERE (highest value)
WARNING
INFO
CONFIG
FINE
FINER
FINEST (lowest value)
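Applied to the Azure SDK output above, you can raise the level for just the SDK's package, which keeps your own INFO logs intact. The package name below is taken from the console output in the question; adjust it if yours differs. For example, in logging.properties:

```
handlers=java.util.logging.ConsoleHandler
java.util.logging.ConsoleHandler.level=ALL
com.microsoft.azure.sdk.iot.level=WARNING
```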
Change the handler value in logging.properties. Besides ConsoleHandler, there are four other handlers you can use, as below; please see the java.util.logging package summary.
ConsoleHandler: This Handler publishes log records to System.err.
FileHandler: Simple file logging Handler.
MemoryHandler: Handler that buffers requests in a circular buffer in memory.
SocketHandler: Simple network logging Handler.
StreamHandler: Stream based logging Handler.
For example, to output logs to a file
handlers=java.util.logging.FileHandler
java.util.logging.FileHandler.level=INFO
java.util.logging.FileHandler.formatter=java.util.logging.SimpleFormatter
java.util.logging.FileHandler.limit=1024000
java.util.logging.FileHandler.count=10
java.util.logging.FileHandler.pattern=logs/mylog.log
java.util.logging.FileHandler.append=true
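If you prefer not to maintain a properties file, the same effect can be sketched programmatically with java.util.logging. The package name is taken from the console output above and may need adjusting; note that a strong reference to the Logger must be kept, otherwise the level setting can be lost to garbage collection:

```java
import java.util.logging.Level;
import java.util.logging.Logger;

public class QuietAzureSdk {
    // Keep a static reference so the logger (and its configured level)
    // is not garbage-collected.
    private static final Logger AZURE_SDK_LOGGER =
            Logger.getLogger("com.microsoft.azure.sdk.iot");

    public static void main(String[] args) {
        // Suppress everything below WARNING from the SDK's package.
        AZURE_SDK_LOGGER.setLevel(Level.WARNING);

        // INFO records from that package are no longer loggable.
        System.out.println(AZURE_SDK_LOGGER.isLoggable(Level.INFO)); // prints "false"
    }
}
```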

com.datastax.driver.core.exceptions.NoHostAvailableException: All host(s) tried for query failed

I'm facing the issue below when trying to connect to a Cassandra cluster and display table contents:
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
com.datastax.driver.core.exceptions.NoHostAvailableException: All host(s) tried for query failed (tried: /175.14.3.164:9042 (com.datastax.driver.core.TransportException: [/172.16.3.163:9042] Cannot connect))
at com.datastax.driver.core.ControlConnection.reconnectInternal(ControlConnection.java:223)
at com.datastax.driver.core.ControlConnection.connect(ControlConnection.java:78)
at com.datastax.driver.core.Cluster$Manager.init(Cluster.java:1272)
at com.datastax.driver.core.Cluster.init(Cluster.java:158)
at com.datastax.driver.core.Cluster.connect(Cluster.java:248)
at com.datastax.driver.core.Cluster.connect(Cluster.java:281)
at com.X.Y.App.main(App.java:27)
In my cassandra.yaml file
native_transport_port: 9042
My listen_address : 172.14.3.164
seeds: 172.14.3.164
rpc_address: 172.14.3.164
And code :
cluster=Cluster.builder().addContactPoint("172.14.3.164").build();
I have seen other links related to this and followed them, but still couldn't fix it. Kindly help.
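One quick way to isolate a "Cannot connect" TransportException is to verify that the node is actually reachable on the native transport port before involving the driver at all. The host and port below are taken from the cassandra.yaml values in the question; this is a diagnostic sketch, not a fix:

```java
import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Socket;

public class CheckCassandraPort {
    public static void main(String[] args) {
        // Values taken from cassandra.yaml above (native_transport_port / rpc_address).
        String host = "172.14.3.164";
        int port = 9042;
        try (Socket socket = new Socket()) {
            // Fail fast with a 3-second connect timeout.
            socket.connect(new InetSocketAddress(host, port), 3000);
            System.out.println("reachable");
        } catch (IOException e) {
            System.out.println("not reachable: " + e.getMessage());
        }
    }
}
```

If this prints "not reachable", the problem is networking or Cassandra's bind configuration rather than the driver; note also that the stack trace mentions different IPs (175.14.3.164 / 172.16.3.163) than the configured 172.14.3.164, which is worth double-checking.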

Hadoop CDH4 Error:SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder"

I am using Ubuntu 14.04 and CDH4.7.
I am installing as per the procedure given in the link below
http://www.cloudera.com/content/cloudera-content/cloudera-docs/CDH4/latest/CDH4-Quick-Start/cdh4qs_topic_3_2.html
The problem is that I am not able to start the datanode. I am getting the error as:
naveensrikanthd#ubuntu:/$ for x in `cd /etc/init.d ; ls hadoop-hdfs-*` ; do sudo service $x start ; done
[sudo] password for naveensrikanthd:
* Starting Hadoop datanode:
starting datanode, logging to /var/log/hadoop-hdfs/hadoop-hdfs-datanode-ubuntu.out
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
* Starting Hadoop namenode:
namenode running as process 15437. Stop it first.
* Starting Hadoop secondarynamenode:
secondarynamenode running as process 3061. Stop it first.
naveensrikanthd#ubuntu:/$ jps
7467 RunJar
8048 RunJar
18363 Jps
No Hadoop process is running, and the three SLF4J statements given above appear for both the namenode and the datanode.
Below is the log file at the path:
/var/log/hadoop-hdfs/hadoop-hdfs-datanode-ubuntu.out
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
ulimit -a for user hdfs
What should I do to get rid of this error? Please help.
The output shows that in fact the namenodes are already running. You should double-check where you think they are supposed to run and what your config says, because it's saying you already succeeded.
The warning from SLF4J has nothing to do with Hadoop functionality.
