hibernate not creating table but no error messages - java

I am working on a Spring Boot project and trying to create a table with Hibernate. I get no errors when I run the app and the server starts normally, but the table does not get created.
StatusUpdate.java
package model;
import java.util.Date;
import javax.persistence.Column;
import javax.persistence.Entity;
import javax.persistence.GeneratedValue;
import javax.persistence.GenerationType;
import javax.persistence.Id;
import javax.persistence.Table;
import javax.persistence.Temporal;
import javax.persistence.TemporalType;
import javax.persistence.PrePersist;
@Entity
@Table(name="status_update")
public class StatusUpdate {
@Id
@Column(name="id")
@GeneratedValue(strategy=GenerationType.AUTO)
private Long id;
@Column(name="text")
private String text;
@Column(name="added")
@Temporal(TemporalType.TIMESTAMP)
private Date added;
@PrePersist
protected void onCreate() {
if (added == null) {
added = new Date();
}
}
public StatusUpdate(String text) {
this.text = text;
}
public StatusUpdate(String text, Date added) {
this.text = text;
this.added = added;
}
public Long getId() {
return id;
}
public void setId(Long id) {
this.id = id;
}
public String getText() {
return text;
}
public void setText(String text) {
this.text = text;
}
public Date getAdded() {
return added;
}
public void setAdded(Date added) {
this.added = added;
}
@Override
public int hashCode() {
final int prime = 31;
int result = 1;
result = prime * result + ((added == null) ? 0 : added.hashCode());
result = prime * result + ((id == null) ? 0 : id.hashCode());
result = prime * result + ((text == null) ? 0 : text.hashCode());
return result;
}
@Override
public boolean equals(Object obj) {
if (this == obj)
return true;
if (obj == null)
return false;
if (getClass() != obj.getClass())
return false;
StatusUpdate other = (StatusUpdate) obj;
if (added == null) {
if (other.added != null)
return false;
} else if (!added.equals(other.added))
return false;
if (id == null) {
if (other.id != null)
return false;
} else if (!id.equals(other.id))
return false;
if (text == null) {
if (other.text != null)
return false;
} else if (!text.equals(other.text))
return false;
return true;
}
}
pom.xml
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-parent</artifactId>
<version>1.4.0.RELEASE</version>
</parent>
<properties>
<java.version>1.8</java.version>
<tiles.version>3.0.7</tiles.version>
</properties>
<groupId>com.voja</groupId>
<artifactId>spring-boot-tutorial</artifactId>
<version>0.0.1-SNAPSHOT</version>
<dependencies>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-web</artifactId>
</dependency>
<dependency>
<groupId>org.apache.tomcat.embed</groupId>
<artifactId>tomcat-embed-jasper</artifactId>
</dependency>
<dependency>
<groupId>org.apache.tiles</groupId>
<artifactId>tiles-core</artifactId>
<version>${tiles.version}</version>
</dependency>
<dependency>
<groupId>org.apache.tiles</groupId>
<artifactId>tiles-jsp</artifactId>
<version>${tiles.version}</version>
</dependency>
<dependency>
<groupId>javax.servlet</groupId>
<artifactId>jstl</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-devtools</artifactId>
</dependency>
<dependency>
<groupId>mysql</groupId>
<artifactId>mysql-connector-java</artifactId>
<version>5.1.38</version>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-data-jpa</artifactId>
<version>1.3.5.RELEASE</version>
</dependency>
</dependencies>
<packaging>war</packaging>
<build>
<plugins>
<plugin>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-maven-plugin</artifactId>
<configuration>
<executable>true</executable>
</configuration>
</plugin>
</plugins>
</build>
</project>
application.properties
debug=true
spring.datasource.url=jdbc:mysql://localhost:3306/springboottutorial
spring.datasource.username=springboot
spring.datasource.password=hello
spring.datasource.driverClassName=com.mysql.jdbc.Driver
spring.jpa.hibernate.dialect=org.hibernate.dialect.MySQLInnoDBDialect
spring.jpa.generate-ddl=true
spring.jpa.show-sql=true
logging.level.org.hibernate.SQL=DEBUG
I also get a yellow line under "dialect" in the line spring.jpa.hibernate.dialect=org.hibernate.dialect.MySQLInnoDBDialect, which says "'spring.jpa.hibernate.dialect' is an unknown property. Did you mean 'spring.jpa.hibernate.ddl-auto'", in case that might be a problem.
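(A side note on that warning: spring.jpa.hibernate.dialect is indeed not a recognized Spring Boot property, so that setting is silently ignored. The supported keys are spring.jpa.database-platform, or the pass-through form spring.jpa.properties.hibernate.dialect, for example:
spring.jpa.database-platform=org.hibernate.dialect.MySQLInnoDBDialect
This by itself would not stop table creation, but it explains the yellow line.)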

Sometimes the problem lies in the property name you choose, like text, value, etc., which are reserved keywords for the database. In that case you can see the table-creation statement in the log, but the table will not be created. To resolve this, just change the property name in your bean or provide an explicit column name.
Example: a property or column named "value" is not supported in MySQL. Hence my tables were not created; I had to rename it to valueString.
Hope this helps.
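For instance, a minimal sketch of the explicit-column-name variant (the field name value is the answer's example; the column name value_string is purely illustrative):

@Column(name = "value_string") // explicit, non-reserved column name
private String value;

Renaming the field itself, as described above, works just as well.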

Found the problem by reading another post: it was actually a problem with packages and the classes within them; they couldn't find one another, and so the table wasn't created.
I made a new project and put all the classes inside the same package and it worked, so I will fix my existing project based on that.
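For reference, here is a minimal sketch of the other common fix, assuming the Spring Boot main class lives in a package that does not contain the entity (the class name Application is hypothetical; @EntityScan is Spring Boot's standard way to point entity scanning at additional packages):

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.boot.autoconfigure.domain.EntityScan;

@SpringBootApplication
@EntityScan(basePackages = "model") // the package that holds StatusUpdate
public class Application {
    public static void main(String[] args) {
        SpringApplication.run(Application.class, args);
    }
}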

It's not working for you because you are not using it; I mean, JPA creates the table when it needs to use it. You can try to write a test that uses it, or add a REST repository to try it. Just add this to your application.properties:
spring.jpa.hibernate.ddl-auto=create
and then create this interface:
@RepositoryRestResource
public interface IStatusRepository extends CrudRepository<StatusUpdate, Long> {
}
You will need also this dependency;
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-data-rest</artifactId>
</dependency>
It's easier to just create a test, but I guess you are trying to build a REST web service, so try this to see if it works =)

For your application.properties file, follow the link below. It is like this:
spring.jpa.generate-ddl=true
and then add
spring.jpa.hibernate.ddl-auto=create-drop
http://docs.spring.io/spring-boot/docs/1.2.3.RELEASE/reference/htmlsingle/#common-application-properties

If you are using JPA and Hibernate, add the following property to your application.properties file:
spring.jpa.hibernate.ddl-auto=update

Try adding this to your application.properties:
spring.jpa.hibernate.ddl-auto=create
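(For context on the values: create drops and recreates the schema at startup, create-drop additionally drops it at shutdown, update only adds missing tables and columns, and validate checks the schema without changing anything. If none of them is in effect, Hibernate emits no DDL at all, even though the application starts cleanly.)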

Related

java.lang.NoClassDefFoundError: org/apache/commons/codec/binary/Base64: from a library jar imported with Maven

I've made a simple library jar that consists of this single class:
import org.apache.commons.codec.binary.Base64;
public class NiceEncoder {
public static String encode(String first, String second) {
String encoded = new String(Base64.encodeBase64((first + ":" + second).getBytes()));
return encoded;
}
}
It has this pom.xml:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>base64tester</groupId>
<artifactId>base64tester</artifactId>
<version>1.0-SNAPSHOT</version>
<packaging>jar</packaging>
<properties>
<maven.compiler.target>11</maven.compiler.target>
<maven.compiler.source>11</maven.compiler.source>
</properties>
<dependencies>
<dependency>
<groupId>commons-codec</groupId>
<artifactId>commons-codec</artifactId>
<version>1.6</version>
</dependency>
</dependencies>
</project>
Then I package it into a jar and install it in the local Maven repository:
mvn install:install-file -Dfile=/home/base64tester-1.0-SNAPSHOT.jar -DgroupId=base64tester -DartifactId=base64tester -Dversion=1.0-SNAPSHOT -Dpackaging=jar
Then I import it in another project like this:
<dependencies>
<dependency>
<groupId>base64tester</groupId>
<artifactId>base64tester</artifactId>
<version>1.0-SNAPSHOT</version>
</dependency>
</dependencies>
and try running this code:
public class TestNiceEncoder {
public static void main(String[] args) {
String first = "some word";
String second = "other word";
System.out.println(NiceEncoder.encode(first, second));
}
}
I get the java.lang.NoClassDefFoundError: org/apache/commons/codec/binary/Base64 error.
Why? How do I fix it other than importing the commons-codec inside the project with the TestNiceEncoder class? How do I make it so everyone who has the base64tester.jar can run it with no errors or additional actions?
EDIT: It's the same with other dependencies in the base64tester project. I made it use the jsoup library, and when the method using it gets called from the outside project, there's an exception:
In project B:
public class TestNiceEncoder {
public static void main(String[] args) {
String first = "some word";
String second = "other word";
// System.out.println(NiceEncoder.encode(first, second));
System.out.println(NiceEncoder.getRedditTitle()); //jsoup error here
}
}
In project A (base64tester):
public class NiceEncoder {
public static String encode(String first, String second) {
String encoded = new String(Base64.encodeBase64((first + ":" + second).getBytes()));
return encoded;
}
public static String getRedditTitle() {
try {
Document doc = Jsoup.connect("http://reddit.com/").get();
String title = doc.title();
System.out.println("title: " + title);
return title;
}
catch (Exception e) {
System.out.println("Couldn't open URL");
return "";
}
}
}
<dependencies>
<dependency>
<groupId>commons-codec</groupId>
<artifactId>commons-codec</artifactId>
<version>1.6</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.jsoup/jsoup -->
<dependency>
<groupId>org.jsoup</groupId>
<artifactId>jsoup</artifactId>
<version>1.13.1</version>
</dependency>
</dependencies>
error is this one:
Exception in thread "main" java.lang.NoClassDefFoundError: org/jsoup/Jsoup
This step:
Then I package it into a jar and install it in local maven repository
requires you to specify the pom.xml file; otherwise one is generated, but without the dependency info.
Something like this (of course you need to adjust the path to pom.xml):
mvn install:install-file -Dfile=/home/base64tester-1.0-SNAPSHOT.jar -DpomFile=/home/pom.xml
Read more in the official docs.
Alternatively, you can declare an explicit dependency on commons-codec in your downstream project (where you're consuming your jar).
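That is, a sketch of what the consuming project's pom.xml would declare (coordinates taken from the question's own pom):

<dependency>
    <groupId>commons-codec</groupId>
    <artifactId>commons-codec</artifactId>
    <version>1.6</version>
</dependency>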
Side note: JDK 8 already includes support for a Base64 codec (see here), so maybe you can use it directly without any extra jars on the classpath.
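As a hedged sketch of that alternative, keeping the same NiceEncoder signature but using only the JDK:

import java.util.Base64;

public class NiceEncoder {
    public static String encode(String first, String second) {
        // java.util.Base64 has shipped with the JDK since Java 8, so no external jar is needed
        return Base64.getEncoder()
                     .encodeToString((first + ":" + second).getBytes());
    }
}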
The dependencies of base64tester are not included in the jar file. You should run mvn install from the base64tester project.

JUnit Null Pointer Exception When Declaring in setUpClass or setUp

Currently I am setting up basic JUnit tests in the following two manners, both of which give me null pointer exceptions when running the tests at the line where I call c.add():
public class CalculatorTest {
Calculator c;
@BeforeEach
public void setUp() {
c = new Calculator();
}
@Test
public void testAdd() {
System.out.println("add");
int a = 0;
int b = 0;
int expResult = 0;
int result = c.add(a, b);
assertEquals(expResult, result);
}
....
The second way:
public class CalculatorTest {
static Calculator c;
public CalculatorTest() {
}
@BeforeAll
public static void setUpClass() {
c = new Calculator();
}
@Test
public void testAdd() {
System.out.println("add");
int a = 0;
int b = 0;
int expResult = 0;
int result = c.add(a, b);
assertEquals(expResult, result);
}
...
This link: JUNIT Null Pointer Exception
suggests to me I'm doing this right but I'm still getting the null pointer exception.
My pom.xml:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>com.mycompany</groupId>
<artifactId>mavenproject1</artifactId>
<version>1.0-SNAPSHOT</version>
<packaging>jar</packaging>
<dependencies>
<dependency>
<groupId>org.junit.jupiter</groupId>
<artifactId>junit-jupiter-api</artifactId>
<version>5.6.0</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.junit.jupiter</groupId>
<artifactId>junit-jupiter-params</artifactId>
<version>5.6.0</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.junit.jupiter</groupId>
<artifactId>junit-jupiter-engine</artifactId>
<version>5.6.0</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.junit.platform</groupId>
<artifactId>junit-platform-engine</artifactId>
<version>1.6.0</version>
<scope>test</scope>
</dependency>
</dependencies>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<maven.compiler.source>14</maven.compiler.source>
<maven.compiler.target>14</maven.compiler.target>
</properties>
</project>
Calculator is the class under test; therefore, you don't need to do:
@BeforeEach
public void setUp() {
c = new Calculator();
}
You can declare c like this:
Calculator c = new Calculator();
and add @Test methods as usual.
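Putting that together, a minimal sketch of the suggested test class (assuming Calculator is on the test classpath):

import static org.junit.jupiter.api.Assertions.assertEquals;

import org.junit.jupiter.api.Test;

public class CalculatorTest {
    // plain field initialization runs for each test instance,
    // so no @BeforeEach setup method is needed
    Calculator c = new Calculator();

    @Test
    public void testAdd() {
        assertEquals(0, c.add(0, 0));
    }
}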

The problem of connecting Apache Flink to Elasticsearch

I used a piece of code from the Flink site to connect Apache Flink to Elasticsearch. I want to run this code from NetBeans through a Maven project.
public class FlinkElasticCon {
public static void main(String[] args) throws Exception {
final int port;
try {
final ParameterTool params = ParameterTool.fromArgs(args);
port = params.getInt("port");
} catch (Exception e) {
System.err.println("No port specified. Please run 'SocketWindowWordCount --port <port>'");
return;
}
// get the execution environment
final StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
// get input data by connecting to the socket
DataStream<String> text = env.socketTextStream("localhost", port, "\n");
// parse the data, group it, window it, and aggregate the counts
DataStream<WordWithCount> windowCounts = text
.flatMap((String value, Collector<WordWithCount> out) -> {
for (String word : value.split("\\s")) {
out.collect(new WordWithCount(word, 1L));
}
})
.keyBy("word")
.timeWindow(Time.seconds(5))
.reduce(new ReduceFunction<WordWithCount>() {
@Override
public WordWithCount reduce(WordWithCount a, WordWithCount b) {
return new WordWithCount(a.word, a.count + b.count);
}
});
// print the results with a single thread, rather than in parallel
//windowCounts.print().setParallelism(1);
env.execute("Socket Window WordCount");
List<HttpHost> httpHosts = new ArrayList<>();
httpHosts.add(new HttpHost("127.0.0.1", 9200, "http"));
httpHosts.add(new HttpHost("10.2.3.1", 9200, "http"));
ElasticsearchSink.Builder<String> esSinkBuilder = new ElasticsearchSink.Builder<>(
httpHosts,
new ElasticsearchSinkFunction<String>() {
public IndexRequest createIndexRequest(String element) {
Map<String, String> json = new HashMap<>();
json.put("data", element);
return Requests
.indexRequest()
.index("my-index")
.type("my-type")
.source(json);
}
@Override
public void process(String element, RuntimeContext ctx, RequestIndexer indexer) {
indexer.add(createIndexRequest(element));
}
}
);
windowCounts.addSink((SinkFunction<WordWithCount>) esSinkBuilder);
}
public static class WordWithCount {
public String word;
public long count;
public WordWithCount() {}
public WordWithCount(String word, long count) {
this.word = word;
this.count = count;
}
@Override
public String toString() {
return word + " : " + count;
}
}
}
When adding the dependency, it does not recognize the ElasticsearchSink class. I have added various dependencies for it, but the problem is still not resolved. When importing:
import org.apache.flink.streaming.connectors.elasticsearch6.ElasticsearchSink
the import is underlined in red as unknown in the code.
my pom:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>org.apache.flink</groupId>
<artifactId>mavenproject1</artifactId>
<version>1.0-SNAPSHOT</version>
<packaging>jar</packaging>
<dependencies>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-connector-elasticsearch6_2.11</artifactId>
<version>1.8.1</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-streaming-java_2.11</artifactId>
<version>1.8.1</version>
<scope>provided</scope>
<type>jar</type>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-connector-elasticsearch-base_2.11</artifactId>
<version>1.8.1</version>
</dependency>
<dependency>
<groupId>org.elasticsearch</groupId>
<artifactId>elasticsearch</artifactId>
<version>6.0.0-alpha1</version>
<!--<version>6.0.0-alpha1</version>-->
<type>jar</type>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-java</artifactId>
<version>1.8.1</version>
<type>jar</type>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-clients_2.11</artifactId>
<version>1.8.1</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-streaming-core</artifactId>
<version>0.8.1</version>
</dependency>
</dependencies>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<maven.compiler.source>1.8</maven.compiler.source>
<maven.compiler.target>1.8</maven.compiler.target>
</properties>
apache flink version : 1.8.1
elasticsearch version : 7.4.2
netbeans version : 8.2
java version : 8
please help me.
Flink Elasticsearch Connector 7
Please find a working and detailed answer which I have provided here.
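For reference, a hedged sketch of the matching dependency (the dedicated Elasticsearch 7 connector first shipped with Flink 1.10, so the other Flink artifacts would need to move to a matching version as well):

<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-connector-elasticsearch7_2.11</artifactId>
    <version>1.10.0</version>
</dependency>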

Groovy Spock test case failing which includes embedded Hazelcast in my test case

I am new to the Groovy Spock test framework and am trying to write my first test with it. I have a few questions about the test below, which keeps failing; I am not sure what to change to make it work.
What I am trying:
One thing I am not understanding is when I should use a mock and when I should use a spy.
Under the then: section I am trying to call classUnderTest.loadFromFile() in order to test it.
In the CommonDataCache class, the instance variable hazelcastCache is always null while I debug the test, but at the same time I see that cache gets a non-null object. Because of this I always get a NullPointerException, as shown in the error log below.
Can anybody suggest what I am missing to make this work?
Error log:
Members [1] {
Member [127.0.0.1]:5001 - a43cbef2-ddf7-431b-9300-2b53c3ea9294 this
}
Dec 13, 2017 3:53:53 PM com.hazelcast.core.LifecycleService
INFO: [127.0.0.1]:5001 [dev] [3.7.3] [127.0.0.1]:5001 is STARTED
Dec 13, 2017 3:53:54 PM com.hazelcast.internal.partition.impl.PartitionStateManager
INFO: [127.0.0.1]:5001 [dev] [3.7.3] Initializing cluster partition table arrangement...
Condition not satisfied:
commonDataCache.getFromCache("28ef4a8f-bfbc-4ad5-bc8a-88fd96ad82a8") != null
| | |
| null false
com.realdoc.symphony.common.CommonDataCache@144ab54
at com.realdoc.symphony.common.store.MemoryStoreManagerTest.populate hazlecast cache from symphony dat file(MemoryStoreManagerTest.groovy:65)
Dec 13, 2017 3:53:54 PM com.hazelcast.instance.Node
INFO: [127.0.0.1]:5001 [dev] [3.7.3] Running shutdown hook... Current state: ACTIVE
This is my Groovy test class:
import org.springframework.core.io.ClassPathResource
import spock.lang.Specification
import spock.lang.Subject
import static io.dropwizard.testing.FixtureHelpers.fixture
class MemoryStoreManagerTest extends Specification {
/**
* Mock any config DTOs that carry static configuration data
**/
def dw = Mock(SymphonyConfig)
def cacheConfig = Mock(CacheConfig)
/**
* This has to be spied because there is an actual call happening in the target method, which converts a
* JSON string into a MemoryStoreFileData DTO object
*/
def jsonUtils = Spy(JsonUtils)
def hazelcastInstance = TestHazelcastInstanceFactory.newInstance().newHazelcastInstance()
/**
* This class is under test
**/
@Subject
def commonDataCache = new CommonDataCache(hazelcastInstance: hazelcastInstance,hazelcastCache: hazelcastInstance.getMap("default"), config: dw)
/**
* This class is under test
**/
@Subject
def classUnderTest = new MemoryStoreManager(dw:dw, jsonUtils: jsonUtils, commonDataCache: commonDataCache)
/**
* Test whether populating symphony.dat file into hazelcast cache is working
*/
def "populate hazlecast cache from symphony dat file"() {
setup:
def datFile = fixture("symphony.dat")
def resource = new ClassPathResource("symphony.dat")
def file = resource.getFile()
when:
cacheConfig.getStoreLocation() >> ""
cacheConfig.getStoreFileName() >> "symphony.dat"
dw.getUseHazelcastCache() >> true
dw.getCacheConfig() >> cacheConfig
cacheConfig.getFile() >> file
commonDataCache.postConstruct()
then:
classUnderTest.loadFromFile()
expect:
commonDataCache.getFromCache("28ef4a8f-bfbc-4ad5-bc8a-88fd96ad82a8") != null
}
}
This is my target class, in which I am trying to test the loadFromFile() method:
@Component
public class MemoryStoreManager {
private static final Logger LOG = LoggerFactory.getLogger(MemoryStoreManager.class);
@Autowired
SymphonyConfig dw;
@Autowired
JsonUtils jsonUtils;
@Autowired
CommonDataCache commonDataCache;
private final Properties properties = new Properties();
@PostConstruct
public void loadFromFile() {
File file = dw.getCacheConfig().getFile();
LOG.info("Loading Data from file-{}", file.getAbsolutePath());
FileInputStream inStream = null;
try {
if (!file.exists()) {
Files.createFile(file.toPath());
}
inStream = new FileInputStream(file);
properties.load(inStream);
String property = properties.getProperty("data");
MemoryStoreFileData fileData;
if (StringUtils.isNotEmpty(property)) {
fileData = jsonUtils.jsonToObject(property, MemoryStoreFileData.class);
} else {
fileData = new MemoryStoreFileData(Collections.emptyMap(), Collections.emptyMap());
}
Long lastUpdatedTimeInFile = fileData.getLastUpdatedTime();
LOG.info("Last updated time in File-{}", lastUpdatedTimeInFile);
Long lastUpdatedTimeInCache = (Long) commonDataCache.getFromCache("lastUpdatedTime");
LOG.info("Last updated time in Cache-{}", lastUpdatedTimeInCache);
Map<String, DocData> loadedMap = fileData.getDocDataMap();
if (MapUtils.isEmpty(loadedMap)) {
loadedMap = new HashMap<>();
}
Map<String, ProcessStatusDto> processStatusMap = fileData.getProcessStatusMap();
if (MapUtils.isEmpty(processStatusMap)) {
processStatusMap = new HashMap<>();
}
if (lastUpdatedTimeInFile != null && (lastUpdatedTimeInCache == null || lastUpdatedTimeInCache < lastUpdatedTimeInFile)) {
LOG.info("Overwriting data from File");
commonDataCache.addAllToCache(loadedMap, processStatusMap);
} else {
String requestId;
DocData fileDocData;
DocData cacheDocData;
Map<String, String> filePageStatusMap;
Map<String, String> cachePageStatusMap;
String pageId;
String fileStatus;
String cacheStatus;
for (Entry<String, DocData> entry : loadedMap.entrySet()) {
requestId = entry.getKey();
fileDocData = entry.getValue();
cacheDocData = (DocData) commonDataCache.getFromCache(requestId);
filePageStatusMap = fileDocData.getPageStatusMap();
cachePageStatusMap = cacheDocData.getPageStatusMap();
for (Entry<String, String> pageStatus : filePageStatusMap.entrySet()) {
pageId = pageStatus.getKey();
fileStatus = pageStatus.getValue();
cacheStatus = cachePageStatusMap.get(pageId);
if (StringUtils.equals("IN_PROCESS", cacheStatus) && !StringUtils.equals("IN_PROCESS", fileStatus)) {
cachePageStatusMap.put(pageId, fileStatus);
LOG.info("PageId: {} status: {} updated", pageId, fileStatus);
}
}
commonDataCache.addToCache(requestId, cacheDocData);
}
}
} catch (Exception e) {
LOG.error("ErrorCode-{}, Component-{}, Message-{}. Error Loading cache data from file-{}. Exiting system", "OR-51010", "ORCHESTRATION", "Symphony cache loading exception", file.getAbsoluteFile(), e);
System.exit(0);
}
}
}
This is my cache utility class, where the store and retrieve methods are defined.
@Component
public class CommonDataCache {
private static final Logger LOG = LoggerFactory.getLogger(CommonDataCache.class);
@Autowired
HazelcastInstance hazelcastInstance;
@Autowired
SymphonyConfig config;
public static String LAST_UPDATED_TIME = "lastUpdatedTime";
private IMap<String, Object> hazelcastCache = null;
private boolean useHazelcast = false;
private final Map<String, Object> cache = new ConcurrentHashMap<>();
@PostConstruct
public void postConstruct() {
hazelcastCache = hazelcastInstance.getMap("default");
// Enable only if logging level is DEBUG
if (LOG.isDebugEnabled()) {
hazelcastCache.addEntryListener(new HazelcastMapListener(), true);
}
useHazelcast = config.getUseHazelcastCache();
}
public Map<String, Object> getAllDataFromCache() {
return hazelcastCache;
}
public void addToCache(String key, Object value) {
if (useHazelcast) {
hazelcastCache.put(key, value);
hazelcastCache.put(LAST_UPDATED_TIME, System.currentTimeMillis());
} else {
cache.put(key, value);
cache.put(LAST_UPDATED_TIME, System.currentTimeMillis());
}
}
public Object getAndRemoveFromCache(String key) {
if (useHazelcast) {
return hazelcastCache.remove(key);
} else {
return cache.remove(key);
}
}
public Object getFromCache(String key) {
if (useHazelcast) {
return hazelcastCache.get(key);
} else {
return cache.get(key);
}
}
/**
*
@param cacheDataMap
*/
public void addAllToCache(Map<String, DocData> cacheDataMap, Map<String, ProcessStatusDto> processStatusMap) {
hazelcastCache.putAll(cacheDataMap);
hazelcastCache.putAll(processStatusMap);
hazelcastCache.put(LAST_UPDATED_TIME, System.currentTimeMillis());
}
public void lockKey(String key) {
if (useHazelcast) {
hazelcastCache.lock(key);
}
}
public void unlockKey(String key) {
if (useHazelcast) {
hazelcastCache.unlock(key);
}
}
public Map<String, Object> getByKeyContains(String keyString) {
Map<String, Object> values;
if (useHazelcast) {
Set<String> foundKeys = hazelcastCache.keySet(entry -> ((String)entry.getKey()).contains(keyString));
values = hazelcastCache.getAll(foundKeys);
} else {
values = Maps.filterEntries(cache, entry -> entry.getKey().contains(keyString));
}
return values;
}
}
Here are the Maven dependencies for the Groovy tests.
<!-- Dependencies for GROOVY TEST -->
<dependency>
<groupId>com.hazelcast</groupId>
<artifactId>hazelcast</artifactId>
<version>${hazelcast.version}</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>com.hazelcast</groupId>
<artifactId>hazelcast</artifactId>
<version>${hazelcast.version}</version>
<classifier>tests</classifier>
<scope>test</scope>
</dependency>
<!-- GROOVY TEST FRAMEWORK -->
<dependency>
<groupId>org.aspectj</groupId>
<artifactId>aspectjweaver</artifactId>
<version>${aspectj.version}</version>
</dependency>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>${junit.version}</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-test</artifactId>
<version>${org.springframework.version}</version>
</dependency>
<dependency>
<groupId>io.dropwizard</groupId>
<artifactId>dropwizard-core</artifactId>
</dependency>
<dependency>
<groupId>io.dropwizard</groupId>
<artifactId>dropwizard-assets</artifactId>
</dependency>
<dependency>
<groupId>io.dropwizard</groupId>
<artifactId>dropwizard-testing</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>realdoc</groupId>
<artifactId>dropwizard-spring</artifactId>
<version>${realdoc.dropwizard-spring.version}</version>
</dependency>
<dependency>
<groupId>org.codehaus.groovy</groupId>
<artifactId>groovy-all</artifactId>
<!-- any version of Groovy >= 1.5.0 should work here -->
<version>${groovy-all.version}</version>
<!--<scope>test</scope>-->
</dependency>
<dependency>
<groupId>org.spockframework</groupId>
<artifactId>spock-core</artifactId>
<version>${spock.version}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.spockframework</groupId>
<artifactId>spock-spring</artifactId>
<version>${spock.version}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>cglib</groupId>
<artifactId>cglib-nodep</artifactId>
<version>${cglib-nodep.version}</version>
<scope>test</scope>
</dependency>
<!-- GROOVY TEST FRAMEWORK -->
<build>
<testSourceDirectory>src/test/groovy</testSourceDirectory>
<testResources>
<testResource>
<directory>src/test/resources</directory>
<filtering>true</filtering>
</testResource>
</testResources>
...
...
</build>

NoNodeAvailableException in one IntelliJ project but not in another with the same programmatic and elasticsearch.yml configuration and code

I know questions related to NoNodeAvailableException have already been asked multiple times on this platform. I have followed all the solutions provided by experts previously, yet none of them work for me. The problem with the exception I get is unusual.
I am facing NoNodeAvailableException in my Java client. The usual solution is to ensure consistency between the configuration set programmatically and the one present in elasticsearch.yml. I have already ensured this.
Just to be on the safe side, I created another IntelliJ project with the same Maven dependencies, the same Elasticsearch client code, and the same configuration, but I did not get NoNodeAvailableException there.
I could work with this new IntelliJ project by shipping code over from my previous IntelliJ project, but I would lose all the commits I have and the other services written in my old project.
Old client code:
public class ElasticSearchTest {
private static TransportClient client;
public static void main(String[] args) {
try {
Settings elasticsearchSettings = Settings.builder().put("cluster.name", "elasticsearch_cse").put("node.name","elasticsearch_cse_data").build();
client = new PreBuiltTransportClient(elasticsearchSettings)
.addTransportAddress(new InetSocketTransportAddress(InetAddress.getByName("localhost"), 9300))
.addTransportAddress(new InetSocketTransportAddress(InetAddress.getByName("localhost"), 9300));
String sampleWebsite = "www.samplewebsite.com";
List<PageModel> pages = new ArrayList<PageModel>();
pages.add(new PageModel("www.samplewebsite.com/page1", "darkknight wins a lot", "First page"));
pages.add(new PageModel("www.samplewebsite.com/page1/page2", "superman survives", "Second page"));
pages.add(new PageModel("www.samplewebsite.com/page3", "wonderwoman releases", "third page"));
DomainModel domainModel = new DomainModel(sampleWebsite, pages);
insertIndicesFromDomain(domainModel);
boolean isResultFound = false;
List<PageModel> pageModels = getSearchResults(domainModel, "superman");
for(PageModel pageModel: pageModels){
if(pageModel.getPageContent().equals("superman survives")){
isResultFound = true;
System.out.println(pageModel);
}
}
// closeService();
} catch (UnknownHostException e) {
e.printStackTrace();
}
}
public static void insertIndicesFromDomain(DomainModel domainModel){
int i = 1;
for(PageModel pageModel: domainModel.getPages()){
client.prepareIndex("websiteindex", domainModel.getDomainName(), i + "").setSource(pageModel.getPageModelInMapFormat()).execute().actionGet();
i += 1;
System.out.println(i);
}
}
public static List<PageModel> getSearchResults(DomainModel domainModel, String searchKeyWords){
SearchResponse searchResponse = client.prepareSearch("websiteindex").setTypes(domainModel.getDomainName()).setSearchType(SearchType.QUERY_AND_FETCH)
.setQuery(QueryBuilders.termQuery("content", searchKeyWords)).execute().actionGet();
SearchHit[] results = searchResponse.getHits().getHits();
List<PageModel> pageModels = new ArrayList<PageModel>();
for(SearchHit searchHit: results){
pageModels.add(new PageModel(searchHit.getSource()));
}
return pageModels;
}
public static void getIndices(){
ImmutableOpenMap<String, IndexMetaData> indicies = client.admin().cluster().prepareState().get().getState().getMetaData().getIndices();
for(ObjectObjectCursor<String, IndexMetaData> entry: indicies){
System.out.printf(entry.key + " " + entry.value);
}
}
public static void closeService(){
client.close();
}
}
StackTrace of old client code:
Exception in thread "main" NoNodeAvailableException[None of the configured nodes are available: [{#transport#-1}{H90yHKj8R_aa0MeTciqjvA}{localhost}{127.0.0.1:9300}]]
at org.elasticsearch.client.transport.TransportClientNodesService.ensureNodesAreAvailable(TransportClientNodesService.java:347)
at org.elasticsearch.client.transport.TransportClientNodesService.execute(TransportClientNodesService.java:245)
at org.elasticsearch.client.transport.TransportProxyClient.execute(TransportProxyClient.java:59)
at org.elasticsearch.client.transport.TransportClient.doExecute(TransportClient.java:363)
at org.elasticsearch.client.support.AbstractClient.execute(AbstractClient.java:408)
at org.elasticsearch.action.ActionRequestBuilder.execute(ActionRequestBuilder.java:80)
at org.elasticsearch.action.ActionRequestBuilder.execute(ActionRequestBuilder.java:54)
at net.media.sitesearch.models.ElasticSearchTest.insertIndicesFromDomain(ElasticSearchTest.java:66)
at net.media.sitesearch.models.ElasticSearchTest.main(ElasticSearchTest.java:42)
New client Code:
public class Main {
private static TransportClient client;
public static void main(String[] args) {
try {
Settings elasticsearchSettings = Settings.builder().put("cluster.name", "elasticsearch_cse").put("node.name","elasticsearch_cse_data").build();
client = new PreBuiltTransportClient(elasticsearchSettings)
.addTransportAddress(new InetSocketTransportAddress(InetAddress.getByName("localhost"), 9300))
.addTransportAddress(new InetSocketTransportAddress(InetAddress.getByName("localhost"), 9300));
String sampleWebsite = "www.samplewebsite.com";
List<PageModel> pages = new ArrayList<PageModel>();
pages.add(new PageModel("www.samplewebsite.com/page1", "darkknight wins a lot", "First page"));
pages.add(new PageModel("www.samplewebsite.com/page1/page2", "superman survives", "Second page"));
pages.add(new PageModel("www.samplewebsite.com/page3", "wonderwoman releases", "third page"));
DomainModel domainModel = new DomainModel(sampleWebsite, pages);
insertIndicesFromDomain(domainModel);
boolean isResultFound = false;
List<PageModel> pageModels = getSearchResults(domainModel, "superman");
for(PageModel pageModel: pageModels){
if(pageModel.getPageContent().equals("superman survives")){
isResultFound = true;
System.out.println(pageModel);
}
}
// closeService();
} catch (UnknownHostException e) {
e.printStackTrace();
}
}
public static void insertIndicesFromDomain(DomainModel domainModel){
int i = 1;
for(PageModel pageModel: domainModel.getPages()){
client.prepareIndex("websiteindex", domainModel.getDomainName(), i + "").setSource(pageModel.getPageModelInMapFormat()).execute().actionGet();
i += 1;
System.out.println(i);
}
}
public static List<PageModel> getSearchResults(DomainModel domainModel, String searchKeyWords){
SearchResponse searchResponse = client.prepareSearch("websiteindex").setTypes(domainModel.getDomainName()).setSearchType(SearchType.QUERY_AND_FETCH)
.setQuery(QueryBuilders.termQuery("content", searchKeyWords)).execute().actionGet();
SearchHit[] results = searchResponse.getHits().getHits();
List<PageModel> pageModels = new ArrayList<PageModel>();
for(SearchHit searchHit: results){
pageModels.add(new PageModel(searchHit.getSource()));
}
return pageModels;
}
public static void getIndices(){
ImmutableOpenMap<String, IndexMetaData> indicies = client.admin().cluster().prepareState().get().getState().getMetaData().getIndices();
for(ObjectObjectCursor<String, IndexMetaData> entry: indicies){
System.out.printf(entry.key + " " + entry.value);
}
}
public static void closeService(){
client.close();
}
}
Output of new client code:
2
3
4
PageModel{pageUrl='www.samplewebsite.com/page1/page2', pageContent='superman survives', pageTitle='Second page', map={title=Second page, url=www.samplewebsite.com/page1/page2, content=superman survives}}
The only difference between my old code and new code is that the old code is under a Git working tree and the new one is not.
New edit:
I found another important difference between my two projects. In my new IntelliJ project I have not shipped all the other services from the old one and have kept only the necessary Elasticsearch client code. Hence, the new IntelliJ project does not contain all the dependencies of the old project, only the Elasticsearch dependency.
In my new client code, which contains no Maven dependencies except Elasticsearch, I don't get NoNodeAvailableException. Could interactions between the Elasticsearch dependency and the other Maven artifacts be an indirect reason for that exception?
New pom.xml
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>com.sampleelasticsearch</groupId>
<artifactId>SampleES</artifactId>
<version>1.0-SNAPSHOT</version>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<configuration>
<source>1.8</source>
<target>1.8</target>
</configuration>
</plugin>
</plugins>
</build>
<dependencies>
<dependency>
<groupId>org.elasticsearch.client</groupId>
<artifactId>transport</artifactId>
<version>5.5.1</version>
</dependency>
</dependencies>
</project>
Old pom.xml:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>net.media.siteresearch</groupId>
<artifactId>SiteSearch</artifactId>
<version>1.0-SNAPSHOT</version>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<configuration>
<source>1.8</source>
<target>1.8</target>
</configuration>
</plugin>
</plugins>
</build>
<dependencies>
<dependency>
<groupId>com.sparkjava</groupId>
<artifactId>spark-core</artifactId>
<version>2.5</version>
</dependency>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>4.12</version>
</dependency>
<dependency>
<groupId>net.media.crawler</groupId>
<artifactId>MnetCrawler</artifactId>
<version>0.995</version>
</dependency>
<dependency>
<groupId>org.elasticsearch.client</groupId>
<artifactId>transport</artifactId>
<version>5.5.1</version>
</dependency>
<dependency>
<groupId>org.apache.logging.log4j</groupId>
<artifactId>log4j-core</artifactId>
<version>2.8.2</version>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-api</artifactId>
<version>1.7.24</version>
</dependency>
<dependency>
<!-- jsoup HTML parser library # https://jsoup.org/ -->
<groupId>org.jsoup</groupId>
<artifactId>jsoup</artifactId>
<version>1.10.3</version>
</dependency>
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-core</artifactId>
<version>2.8.9</version>
</dependency>
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-databind</artifactId>
<version>2.5.3</version>
</dependency>
</dependencies>
</project>
If readers want to see logs from the Elasticsearch server or anything else, please ask (I have not added them because the question is already filled with a lot of other relevant snippets, and more would clutter it).
