OpenCMIS local binding - JcrServiceFactory with Jackrabbit implementation - Java

Hey, there is something wrong with the third alternative: the loop in JcrServiceFactory only picks up properties starting with jcr.* (others are not passed along), but right after that, RepositoryFactoryImpl (the Jackrabbit implementation) looks for "org.apache.jackrabbit.repository.home" in the collection of properties that was passed along. That doesn't make sense: even if org.apache.jackrabbit.repository.home is among the parameters, it doesn't start with PREFIX_JCR_CONFIG, so it is never put into the jcrConfig collection that goes to RepositoryFactoryImpl.getRepository().
It would make sense if Map<String, String> map were null, because there is an if (parameters == null) condition in RepositoryFactoryImpl, but an empty map does not hit that branch.
It happens in the init method.
JcrServiceFactory.java
private TypeManager typeManager;
private Map<String, String> jcrConfig;
private String mountPath;
private JcrRepository jcrRepository;
@Override
public void init(Map<String, String> parameters) {
typeManager = new TypeManager();
readConfiguration(parameters);
jcrRepository = new JcrRepository(acquireJcrRepository(jcrConfig), mountPath, typeManager);
}
Caused by: org.apache.chemistry.opencmis.commons.exceptions.CmisConnectionException: No JCR repository factory for configured parameters
at org.apache.chemistry.opencmis.jcr.JcrServiceFactory.acquireJcrRepository(JcrServiceFactory.java:95)
at org.apache.chemistry.opencmis.jcr.JcrServiceFactory.init(JcrServiceFactory.java:61)
at org.apache.chemistry.opencmis.client.bindings.spi.local.CmisLocalSpi.getSpiInstance(CmisLocalSpi.java:94)
... 34 more
private void readConfiguration(Map<String, String> parameters) {
Map<String, String> map = new HashMap<String, String>();
List<String> keys = new ArrayList<String>(parameters.keySet());
Collections.sort(keys);
/* the loop is searching for properties starting with jcr.* */
for (String key : keys) {
if (key.startsWith(PREFIX_JCR_CONFIG)) {
String jcrKey = key.substring(PREFIX_JCR_CONFIG.length());
String jcrValue = replaceSystemProperties(parameters.get(key));
map.put(jcrKey, jcrValue);
}
else if (MOUNT_PATH_CONFIG.equals(key)) {
mountPath = parameters.get(key);
log.debug("Configuration: " + MOUNT_PATH_CONFIG + '=' + mountPath);
}
else {
log.warn("Configuration: unrecognized key: " + key);
}
}
jcrConfig = Collections.unmodifiableMap(map);
log.debug("Configuration: jcr=" + jcrConfig);
}
But here the parameter Map is empty ({}) and the method returns null, because it is looking for RepositoryFactoryImpl.REPOSITORY_HOME, which is org.apache.jackrabbit.repository.home.
RepositoryFactoryImpl.java
/* parameters = jcrConfig */
public Repository getRepository(Map parameters) throws RepositoryException {
if (parameters == null) {
return getRepository(null, Collections.emptyMap());
} else if (parameters.containsKey(REPOSITORY_HOME)) {
String home = parameters.get(REPOSITORY_HOME).toString();
return getRepository(home, parameters);
} else if (parameters.containsKey(JcrUtils.REPOSITORY_URI)) {
Object parameter = parameters.get(JcrUtils.REPOSITORY_URI);
try {
URI uri = new URI(parameter.toString().trim());
String scheme = uri.getScheme();
if (("file".equalsIgnoreCase(scheme)
|| "jcr-jackrabbit".equalsIgnoreCase(scheme))
&& uri.getAuthority() == null) {
File file = new File(uri.getPath());
if (file.isFile()) {
return null; // Not a (possibly missing) directory
} else {
return getRepository(file.getPath(), parameters);
}
} else {
return null; // not a file: or jcr-jackrabbit: URI
}
} catch (URISyntaxException e) {
return null; // not a valid URI
}
} else {
return null; // unknown or insufficient parameters
}
}
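In other words, for the REPOSITORY_HOME branch above to match, the map that reaches getRepository() must already contain the un-prefixed key. A small sketch (the path is a placeholder, and imports of javax.jcr.Repository and the Jackrabbit factory are assumed):
Map<String, String> jcrConfig = new HashMap<String, String>();
// the key must arrive without the "jcr." prefix for the REPOSITORY_HOME branch to match
jcrConfig.put("org.apache.jackrabbit.repository.home", "/path/to/jcr-repository");
Repository repository = new RepositoryFactoryImpl().getRepository(jcrConfig);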
<dependencies>
<dependency>
<groupId>javax.jcr</groupId>
<artifactId>jcr</artifactId>
<version>2.0</version>
<scope>compile</scope>
</dependency>
<dependency>
<groupId>org.apache.jackrabbit</groupId>
<artifactId>jackrabbit-core</artifactId>
<version>2.2.4</version>
<scope>compile</scope>
</dependency>
<dependency>
<groupId>org.apache.jackrabbit</groupId>
<artifactId>jackrabbit-api</artifactId>
<version>2.2.4</version>
<scope>compile</scope>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-log4j12</artifactId>
<version>1.5.11</version>
<scope>compile</scope>
</dependency>
<dependency>
<groupId>org.testng</groupId>
<artifactId>testng</artifactId>
<version>5.14</version>
<type>jar</type>
<scope>compile</scope>
</dependency>
<dependency>
<groupId>org.apache.chemistry.opencmis</groupId>
<artifactId>chemistry-opencmis-server-jcr</artifactId>
<version>0.3.0-incubating-SNAPSHOT</version>
<classifier>classes</classifier>
</dependency>
<dependency>
<groupId>org.apache.chemistry.opencmis</groupId>
<artifactId>chemistry-opencmis-client-bindings</artifactId>
<version>0.3.0-incubating-SNAPSHOT</version>
</dependency>
<dependency>
<groupId>org.apache.chemistry.opencmis</groupId>
<artifactId>chemistry-opencmis-client-api</artifactId>
<version>0.3.0-incubating-SNAPSHOT</version>
</dependency>
<dependency>
<groupId>org.apache.chemistry.opencmis</groupId>
<artifactId>chemistry-opencmis-client-impl</artifactId>
<version>0.3.0-incubating-SNAPSHOT</version>
</dependency>

The answer is right in the loop I complained about :-)
String jcrKey = key.substring(PREFIX_JCR_CONFIG.length());
It's a substring, so it cuts the jcr. prefix off and the rest goes on...
parameters.put("jcr.org.apache.jackrabbit.repository.home", repositoryHome);
It's tricky, and one kind of needs to figure all this out by debugging.

You need to configure your 'repository.properties' at 'WEB-INF/classes' with the entry below:
jcr.org.apache.jackrabbit.repository.home=${user.home}\jcr-repository (your repository location)
Cheers.
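For completeness, a minimal sketch of the client-side parameters for the local binding (the SessionParameter keys are the standard OpenCMIS ones; the repository path is a placeholder):
Map<String, String> parameters = new HashMap<String, String>();
parameters.put(SessionParameter.BINDING_TYPE, BindingType.LOCAL.value());
parameters.put(SessionParameter.LOCAL_FACTORY, "org.apache.chemistry.opencmis.jcr.JcrServiceFactory");
// the "jcr." prefix survives only until readConfiguration() strips it, so Jackrabbit's
// RepositoryFactoryImpl ends up seeing org.apache.jackrabbit.repository.home
parameters.put("jcr.org.apache.jackrabbit.repository.home", System.getProperty("user.home") + "/jcr-repository");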

Related

The problem of connecting Apache Flink to Elasticsearch

I used a piece of code from the Flink site to connect Apache Flink to Elasticsearch. I want to run this code from NetBeans as a Maven project.
public class FlinkElasticCon {
public static void main(String[] args) throws Exception {
final int port;
try {
final ParameterTool params = ParameterTool.fromArgs(args);
port = params.getInt("port");
} catch (Exception e) {
System.err.println("No port specified. Please run 'SocketWindowWordCount --port <port>'");
return;
}
// get the execution environment
final StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
// get input data by connecting to the socket
DataStream<String> text = env.socketTextStream("localhost", port, "\n");
// parse the data, group it, window it, and aggregate the counts
DataStream<WordWithCount> windowCounts = text
.flatMap((String value, Collector<WordWithCount> out) -> {
for (String word : value.split("\\s")) {
out.collect(new WordWithCount(word, 1L));
}
})
.keyBy("word")
.timeWindow(Time.seconds(5))
.reduce(new ReduceFunction<WordWithCount>() {
@Override
public WordWithCount reduce(WordWithCount a, WordWithCount b) {
return new WordWithCount(a.word, a.count + b.count);
}
});
// print the results with a single thread, rather than in parallel
//windowCounts.print().setParallelism(1);
env.execute("Socket Window WordCount");
List<HttpHost> httpHosts = new ArrayList<>();
httpHosts.add(new HttpHost("127.0.0.1", 9200, "http"));
httpHosts.add(new HttpHost("10.2.3.1", 9200, "http"));
ElasticsearchSink.Builder<String> esSinkBuilder = new ElasticsearchSink.Builder<>(
httpHosts,
new ElasticsearchSinkFunction<String>() {
public IndexRequest createIndexRequest(String element) {
Map<String, String> json = new HashMap<>();
json.put("data", element);
return Requests
.indexRequest()
.index("my-index")
.type("my-type")
.source(json);
}
@Override
public void process(String element, RuntimeContext ctx, RequestIndexer indexer) {
indexer.add(createIndexRequest(element));
}
}
);
windowCounts.addSink((SinkFunction<WordWithCount>) esSinkBuilder);
}
public static class WordWithCount {
public String word;
public long count;
public WordWithCount() {}
public WordWithCount(String word, long count) {
this.word = word;
this.count = count;
}
@Override
public String toString() {
return word + " : " + count;
}
}
}
When adding the dependencies, it does not identify the ElasticsearchSink class. I have added various dependencies, but the problem is still not resolved. When importing:
import org.apache.flink.streaming.connectors.elasticsearch6.ElasticsearchSink
the import is underlined in red as unknown in the code.
my pom:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>org.apache.flink</groupId>
<artifactId>mavenproject1</artifactId>
<version>1.0-SNAPSHOT</version>
<packaging>jar</packaging>
<dependencies>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-connector-elasticsearch6_2.11</artifactId>
<version>1.8.1</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-streaming-java_2.11</artifactId>
<version>1.8.1</version>
<scope>provided</scope>
<type>jar</type>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-connector-elasticsearch-base_2.11</artifactId>
<version>1.8.1</version>
</dependency>
<dependency>
<groupId>org.elasticsearch</groupId>
<artifactId>elasticsearch</artifactId>
<version>6.0.0-alpha1</version>
<!--<version>6.0.0-alpha1</version>-->
<type>jar</type>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-java</artifactId>
<version>1.8.1</version>
<type>jar</type>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-clients_2.11</artifactId>
<version>1.8.1</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-streaming-core</artifactId>
<version>0.8.1</version>
</dependency>
</dependencies>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<maven.compiler.source>1.8</maven.compiler.source>
<maven.compiler.target>1.8</maven.compiler.target>
</properties>
Apache Flink version: 1.8.1
Elasticsearch version: 7.4.2
NetBeans version: 8.2
Java version: 8
Please help me.
Flink Elasticsearch Connector 7
Please find a working and detailed answer which I have provided here.
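Separate from the missing-class issue, two things in the posted code will bite once it compiles: the builder is cast to SinkFunction instead of calling build(), and env.execute() runs before the sink is attached. A hedged sketch of the corrected wiring, assuming the elasticsearch6 connector plus a matching 6.x org.elasticsearch client (mixing it with a 6.0.0-alpha1 client or a 7.4.2 cluster is part of the problem):
// type the builder to the stream's element type, not String
ElasticsearchSink.Builder<WordWithCount> esSinkBuilder = new ElasticsearchSink.Builder<>(
        httpHosts,
        (WordWithCount element, RuntimeContext ctx, RequestIndexer indexer) -> {
            Map<String, String> json = new HashMap<>();
            json.put("data", element.toString());
            indexer.add(Requests.indexRequest()
                    .index("my-index")
                    .type("my-type")
                    .source(json));
        });
windowCounts.addSink(esSinkBuilder.build()); // build() produces the actual SinkFunction
env.execute("Socket Window WordCount");      // execute last, once the whole pipeline is wired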

AWS SQS and SES dependency libraries affect each other

I'm trying to use imports from SES and SQS at the same time, but the combination causes an error to be thrown by the .withBody method. I'm guessing it has to do with the dependencies, but they are at the latest version.
Error:(116,54) java: incompatible types: com.amazonaws.services.simpleemail.model.Body cannot be converted to java.lang.String
import com.amazonaws.services.sqs.AmazonSQS;
import com.amazonaws.services.sqs.AmazonSQSClientBuilder;
import com.amazonaws.services.sqs.model.Message;
import com.amazonaws.services.sqs.model.ReceiveMessageRequest;
import com.amazonaws.services.simpleemail.AmazonSimpleEmailService;
import com.amazonaws.services.simpleemail.AmazonSimpleEmailServiceClientBuilder;
import com.amazonaws.services.simpleemail.model.*;
public void email(S3Event event, Person person, Boolean error) {
ObjectMapper mapper = new ObjectMapper();
String emailText = null;
if (error) {
emailText = "Error! No image in file!";
} else {
try {
emailText = mapper.writeValueAsString(person);
} catch (JsonProcessingException e) {
e.printStackTrace();
}
}
String key = event.getRecords().get(0).getS3().getObject().getKey();
AmazonSimpleEmailService client =
AmazonSimpleEmailServiceClientBuilder.standard().withRegion(Regions.EU_WEST_1).build();
Body body = new Body().withText(new Content().withData(emailText));
SendEmailRequest request = new SendEmailRequest().withDestination(
new Destination().withToAddresses(person.getEmail()))
.withMessage(new Message()
.withBody(new Body().withHtml(new
Content().withCharset("UTF8").withData(emailText)))
.withSubject(new Content()
.withCharset("UTF-8").withData("Message from passport service.")))
.withSource(person.getEmail());
client.sendEmail(request);
}
public void getBaseCodeFromSQS() {
AmazonSQS sqs = AmazonSQSClientBuilder.defaultClient();
try {
ReceiveMessageRequest receiveMessageRequest = new ReceiveMessageRequest("https://sqs.eu-west-1.amazonaws.com/416031944655/TISFEXP-PSS-2-QUEUE");
List<Message> messages = sqs.receiveMessage(receiveMessageRequest).getMessages();
for (Message message : messages) {
LOGGER.info("MessageId: " + message.getMessageId());
LOGGER.info("ReceiptHandle: " + message.getReceiptHandle());
LOGGER.info("MD5OfBody: " + message.getMD5OfBody());
LOGGER.info("Body: " + message.getBody());
for (final Map.Entry<String, String> entry : message.getAttributes().entrySet())
{
LOGGER.info("Attribute - Name: " + entry.getKey());
LOGGER.info("Attribute - Value: " + entry.getValue());
}
}
} catch (Exception e) {
LOGGER.error(e);
}
}
<dependency>
<groupId>com.amazonaws</groupId>
<artifactId>aws-java-sdk-sqs</artifactId>
<version>1.11.634</version>
</dependency>
<dependency>
<groupId>com.amazonaws</groupId>
<artifactId>amazon-sqs-java-messaging-lib</artifactId>
<version>1.0.8</version>
<type>jar</type>
</dependency>
<dependency>
<groupId>com.amazonaws</groupId>
<artifactId>aws-java-sdk-ses</artifactId>
<version>1.11.634</version>
</dependency>
</dependencies>
<dependencyManagement>
<dependencies>
<dependency>
<groupId>com.amazonaws</groupId>
<artifactId>aws-java-sdk-bom</artifactId>
<version>1.11.634</version>
<type>pom</type>
<scope>import</scope>
</dependency>
</dependencies>
</dependencyManagement>
There is a Message class defined in both the SES and SQS packages. You are using the Message class defined in the SQS package; you should use the one defined in the SES package instead.
https://docs.aws.amazon.com/AWSJavaSDK/latest/javadoc/com/amazonaws/services/sqs/model/Message.html
https://docs.aws.amazon.com/AWSJavaSDK/latest/javadoc/com/amazonaws/services/simpleemail/model/Message.html
SendEmailRequest request = new SendEmailRequest().withDestination(
new Destination().withToAddresses(person.getEmail()))
.withMessage(new com.amazonaws.services.simpleemail.model.Message()
.withBody(new Body().withHtml(new
Content().withCharset("UTF8").withData(emailText)))
.withSubject(new Content()
.withCharset("UTF-8").withData("Message from passport service.")))
.withSource(person.getEmail());
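Since Java has no import aliasing, the mirror-image fix also works: import the SES Message and fully qualify the SQS one. A minimal sketch (the class and method names are illustrative, not from the original code):
import java.util.List;
import com.amazonaws.services.sqs.AmazonSQS;
import com.amazonaws.services.simpleemail.model.Message; // the SES Message now owns the short name
public class SqsReader {
    // fully qualify the SQS Message wherever the queue is read
    static void logBodies(AmazonSQS sqs, String queueUrl) {
        List<com.amazonaws.services.sqs.model.Message> messages = sqs.receiveMessage(queueUrl).getMessages();
        for (com.amazonaws.services.sqs.model.Message message : messages) {
            System.out.println(message.getBody());
        }
    }
}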

Unable to consume Kafka messages using Apache Storm

I have developed an application to consume Kafka messages using Apache Storm. When I run the topology in a LocalCluster in Eclipse it works fine and messages are consumed properly, but when I run it using the storm command (bin\storm jar ..\kafka-storm-0.0.1-SNAPSHOT.jar com.kafka_storm.util.Topology storm-kafka-topology), the topology starts but does not consume any messages. Is there something wrong I am doing, or can you guide me on how to find the problem?
Topology Code
public class Topology {
public Properties configs;
public BoltBuilder boltBuilder;
public SpoutBuilder spoutBuilder;
public Topology(String configFile) throws Exception {
configs = new Properties();
InputStream is = null;
try {
is = this.getClass().getResourceAsStream("/application.properties");
configs.load(is);
//configs.load(Topology.class.getResourceAsStream("/application.properties"));
boltBuilder = new BoltBuilder(configs);
spoutBuilder = new SpoutBuilder(configs);
} catch (Exception ex) {
ex.printStackTrace();
System.exit(0);
}
}
private void submitTopology() throws Exception {
System.out.println("Entered in submitTopology");
TopologyBuilder builder = new TopologyBuilder();
KafkaSpout<?, ?> kafkaSpout = spoutBuilder.buildKafkaSpout();
SinkTypeBolt sinkTypeBolt = boltBuilder.buildSinkTypeBolt();
MongoDBBolt mongoBolt = boltBuilder.buildMongoDBBolt();
//set the kafkaSpout to topology
//parallelism-hint for kafkaSpout - defines number of executors/threads to be spawn per container
int kafkaSpoutCount = Integer.parseInt(configs.getProperty(Keys.KAFKA_SPOUT_COUNT));
builder.setSpout(configs.getProperty(Keys.KAFKA_SPOUT_ID), kafkaSpout, kafkaSpoutCount);
//set the sinktype bolt
int sinkBoltCount = Integer.parseInt(configs.getProperty(Keys.SINK_BOLT_COUNT));
builder.setBolt(configs.getProperty(Keys.SINK_TYPE_BOLT_ID),sinkTypeBolt,sinkBoltCount).shuffleGrouping(configs.getProperty(Keys.KAFKA_SPOUT_ID));
//set the mongodb bolt
int mongoBoltCount = Integer.parseInt(configs.getProperty(Keys.MONGO_BOLT_COUNT));
builder.setBolt(configs.getProperty(Keys.MONGO_BOLT_ID),mongoBolt,mongoBoltCount).shuffleGrouping(configs.getProperty(Keys.SINK_TYPE_BOLT_ID),Keys.MONGODB_STREAM);
String topologyName = configs.getProperty(Keys.TOPOLOGY_NAME);
Config conf = new Config();
//Defines how many worker processes have to be created for the topology in the cluster.
conf.setNumWorkers(1);
System.out.println("Submitting Topology");
//StormSubmitter.submitTopology(topologyName, conf, builder.createTopology());
System.out.println("Topology submitted");
LocalCluster cluster = new LocalCluster();
cluster.submitTopology(topologyName, conf, builder.createTopology());
}
public static void main(String[] args) throws Exception {
String configFile;
if (args.length == 0) {
System.out.println("Missing input : config file location, using default");
configFile = "application.properties";
} else{
configFile = args[0];
}
Topology ingestionTopology = new Topology(configFile);
ingestionTopology.submitTopology();
}
}
Spout Code
public class SpoutBuilder {
public Properties configs = null;
public SpoutBuilder(Properties configs) {
this.configs = configs;
}
public KafkaSpout<?, ?> buildKafkaSpout() {
String servers = configs.getProperty(Keys.KAFKA_BROKER);
String topic = configs.getProperty(Keys.KAFKA_TOPIC);
String group = configs.getProperty(Keys.KAFKA_CONSUMERGROUP);
return new KafkaSpout<>(getKafkaSpoutConfig(servers,topic,group));
}
protected KafkaSpoutConfig<String, String> getKafkaSpoutConfig(String bootstrapServers, String topic, String group) {
return KafkaSpoutConfig.builder(bootstrapServers, new String[]{topic})
.setProp(ConsumerConfig.GROUP_ID_CONFIG, group)
.setRetry(getRetryService())
.setOffsetCommitPeriodMs(10_000)
.setFirstPollOffsetStrategy(FirstPollOffsetStrategy.UNCOMMITTED_LATEST)
.setMaxUncommittedOffsets(250)
.setProcessingGuarantee(ProcessingGuarantee.AT_LEAST_ONCE)
.setTupleTrackingEnforced(true)
.setEmitNullTuples(false)
.setRecordTranslator(new DefaultRecordTranslator<String, String>())
.build();
}
protected KafkaSpoutRetryService getRetryService() {
return new KafkaSpoutRetryExponentialBackoff(TimeInterval.microSeconds(500),
TimeInterval.milliSeconds(2), Integer.MAX_VALUE, TimeInterval.seconds(10));
}
}
Bolt Builder
public class BoltBuilder {
public Properties configs = null;
public BoltBuilder(Properties configs) {
this.configs = configs;
}
public SinkTypeBolt buildSinkTypeBolt() {
return new SinkTypeBolt();
}
public MongoDBBolt buildMongoDBBolt() {
String host = configs.getProperty(Keys.MONGO_HOST);
int port = Integer.parseInt(configs.getProperty(Keys.MONGO_PORT));
String db = configs.getProperty(Keys.MONGO_DATABASE);
String collection = configs.getProperty(Keys.MONGO_COLLECTION);
return new MongoDBBolt(host, port, db, collection);
}
}
SinkTypeBolt Code
public class SinkTypeBolt extends BaseRichBolt {
private static final long serialVersionUID = 1L;
private OutputCollector collector;
public void execute(Tuple tuple) {
// index 4 is "value" in the DefaultRecordTranslator's output fields (topic, partition, offset, key, value)
String value = tuple.getString(4);
System.out.println("Received in SinkType bolt : "+value);
if (value != null && !value.isEmpty()){
collector.emit(Keys.MONGODB_STREAM,new Values(value));
System.out.println("Emitted : "+value);
}
collector.ack(tuple);
}
public void prepare(Map conf, TopologyContext context, OutputCollector collector) {
this.collector = collector;
}
public void declareOutputFields(OutputFieldsDeclarer declarer) {
declarer.declareStream(Keys.MONGODB_STREAM, new Fields("content"));
}
}
MongoDB Bolt
public class MongoDBBolt extends BaseRichBolt {
private static final long serialVersionUID = 1L;
private OutputCollector collector;
private MongoDatabase mongoDB;
private MongoClient mongoClient;
private String collection;
public String host;
public int port ;
public String db;
protected MongoDBBolt(String host, int port, String db,String collection) {
this.host = host;
this.port = port;
this.db = db;
this.collection = collection;
}
public void prepare(Map stormConf, TopologyContext context, OutputCollector collector) {
this.collector = collector;
this.mongoClient = new MongoClient(host,port);
this.mongoDB = mongoClient.getDatabase(db);
}
public void execute(Tuple input) {
Document mongoDoc = getMongoDocForInput(input);
try{
mongoDB.getCollection(collection).insertOne(mongoDoc);
collector.ack(input);
}catch(Exception e) {
e.printStackTrace();
collector.fail(input);
}
}
@Override
public void cleanup() {
this.mongoClient.close();
}
public void declareOutputFields(OutputFieldsDeclarer declarer) {
// TODO Auto-generated method stub
}
public Document getMongoDocForInput(Tuple input) {
Document doc = new Document();
String content = (String) input.getValueByField("content");
String[] parts = content.trim().split(" ");
System.out.println("Received in MongoDB bolt "+content);
try {
for(String part : parts) {
String[] subParts = part.split(":");
String fieldName = subParts[0];
String value = subParts[1];
doc.append(fieldName, value);
}
} catch(Exception e) {
// malformed "field:value" parts are skipped silently
}
return doc;
}
}
pom.xml code
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<maven.compiler.source>1.7</maven.compiler.source>
<maven.compiler.target>1.7</maven.compiler.target>
</properties>
<dependencies>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>3.8.1</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.apache.storm</groupId>
<artifactId>storm-core</artifactId>
<version>1.2.2</version>
<scope>provided</scope>
<exclusions>
<exclusion>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-log4j12</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>org.apache.kafka</groupId>
<artifactId>kafka-clients</artifactId>
<version>1.1.0</version>
<exclusions>
<exclusion>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-log4j12</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>org.apache.storm</groupId>
<artifactId>storm-kafka-client</artifactId>
<version>1.2.2</version>
</dependency>
<dependency>
<groupId>org.mongodb</groupId>
<artifactId>mongo-java-driver</artifactId>
<version>3.0.4</version>
</dependency>
<dependency>
<groupId>com.googlecode.json-simple</groupId>
<artifactId>json-simple</artifactId>
<version>1.1</version>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-shade-plugin</artifactId>
<version>1.4</version>
<configuration>
<createDependencyReducedPom>true</createDependencyReducedPom>
</configuration>
<executions>
<execution>
<phase>package</phase>
<goals>
<goal>shade</goal>
</goals>
<configuration>
<transformers>
<transformer
implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer" />
<transformer
implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
<mainClass>com.kafka_storm.util.Topology</mainClass>
</transformer>
</transformers>
</configuration>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-resources-plugin</artifactId>
<version>2.4</version>
</plugin>
</plugins>
<resources>
<resource>
<directory>src/main/java</directory>
<includes>
<include> **/*.properties</include>
</includes>
</resource>
</resources>
</build>
Storm UI
Just to be sure: you are remembering to use the StormSubmitter line in Topology, rather than the LocalCluster, when you submit the topology with storm jar, right?
Also, please check that you've started all the right daemons, i.e. storm nimbus and storm supervisor should be running as a minimum (plus your ZooKeeper install).
The next places to look would be in your log files. In the Storm directory, you'll have a logs directory. Look in the logs/worker-artifacts/<your-topology-id>/<your-worker-port>/worker.log files. Those will hopefully get you on the right track to figuring out what's going on. I'd open Storm UI, find your spout and look up which worker ports it's running on, so you can look in the right log files.
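Concretely, the fix in submitTopology() is to swap those lines back, as sketched below; LocalCluster runs the topology inside the launching JVM only, which is why it works in Eclipse but consumes nothing when deployed with storm jar:
// inside submitTopology(): submit to the real cluster instead of spinning up a LocalCluster
StormSubmitter.submitTopology(topologyName, conf, builder.createTopology());
One more hedged thing to check: the spout is built with FirstPollOffsetStrategy.UNCOMMITTED_LATEST, so a consumer group with no committed offsets starts at the latest offset and skips everything produced before the topology came up; produce fresh messages after deployment, or try UNCOMMITTED_EARLIEST while debugging.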

Groovy Spock test case failing which includes embedded Hazelcast in my test case

I am new to the Groovy Spock test framework and am trying to write my first test using it. I have a few questions about the test below, which keeps failing; I am not getting what to change in the code to make it work.
What I am trying to do:
One thing I am not understanding is when I should use a mock and when I should use a spy.
Under the then: section I am trying to call classUnderTest.loadFromFile() in order to test it.
In the CommonDataCache class, the instance variable hazelcastCache always shows null while I debug the test, but at the same time I see cache getting a non-null object. Because of this I always get a NullPointerException, as shown in the error log below.
Can anybody please suggest what I am missing to make it work?
Error log:
Members [1] {
Member [127.0.0.1]:5001 - a43cbef2-ddf7-431b-9300-2b53c3ea9294 this
}
Dec 13, 2017 3:53:53 PM com.hazelcast.core.LifecycleService
INFO: [127.0.0.1]:5001 [dev] [3.7.3] [127.0.0.1]:5001 is STARTED
Dec 13, 2017 3:53:54 PM com.hazelcast.internal.partition.impl.PartitionStateManager
INFO: [127.0.0.1]:5001 [dev] [3.7.3] Initializing cluster partition table arrangement...
Condition not satisfied:
commonDataCache.getFromCache("28ef4a8f-bfbc-4ad5-bc8a-88fd96ad82a8") != null
| | |
| null false
com.realdoc.symphony.common.CommonDataCache@144ab54
at com.realdoc.symphony.common.store.MemoryStoreManagerTest.populate hazlecast cache from symphony dat file(MemoryStoreManagerTest.groovy:65)
Dec 13, 2017 3:53:54 PM com.hazelcast.instance.Node
INFO: [127.0.0.1]:5001 [dev] [3.7.3] Running shutdown hook... Current state: ACTIVE
This is my Groovy test class:
import org.springframework.core.io.ClassPathResource
import spock.lang.Specification
import spock.lang.Subject
import static io.dropwizard.testing.FixtureHelpers.fixture
class MemoryStoreManagerTest extends Specification {
/**
* Mock any config DTOs that carry static configuration data
**/
def dw = Mock(SymphonyConfig)
def cacheConfig = Mock(CacheConfig)
/**
* This has to be spied because there is an actual call in the target method that converts a
* JSON string to a MemoryStoreFileData DTO object
*/
def jsonUtils = Spy(JsonUtils)
def hazelcastInstance = TestHazelcastInstanceFactory.newInstance().newHazelcastInstance()
/**
* This class is under test
**/
@Subject
def commonDataCache = new CommonDataCache(hazelcastInstance: hazelcastInstance,hazelcastCache: hazelcastInstance.getMap("default"), config: dw)
/**
* This class is under test
**/
@Subject
def classUnderTest = new MemoryStoreManager(dw:dw, jsonUtils: jsonUtils, commonDataCache: commonDataCache)
/**
* Test whether populating symphony.dat file into hazelcast cache is working
*/
def "populate hazlecast cache from symphony dat file"() {
setup:
def datFile = fixture("symphony.dat")
def resource = new ClassPathResource("symphony.dat")
def file = resource.getFile()
when:
cacheConfig.getStoreLocation() >> ""
cacheConfig.getStoreFileName() >> "symphony.dat"
dw.getUseHazelcastCache() >> true
dw.getCacheConfig() >> cacheConfig
cacheConfig.getFile() >> file
commonDataCache.postConstruct()
then:
classUnderTest.loadFromFile()
expect:
commonDataCache.getFromCache("28ef4a8f-bfbc-4ad5-bc8a-88fd96ad82a8") != null
}
}
This is my target class on which I am trying to test loadFromFile() method
@Component
public class MemoryStoreManager {
private static final Logger LOG = LoggerFactory.getLogger(MemoryStoreManager.class);
@Autowired
SymphonyConfig dw;
@Autowired
JsonUtils jsonUtils;
@Autowired
CommonDataCache commonDataCache;
private final Properties properties = new Properties();
@PostConstruct
public void loadFromFile() {
File file = dw.getCacheConfig().getFile();
LOG.info("Loading Data from file-{}", file.getAbsolutePath());
FileInputStream inStream = null;
try {
if (!file.exists()) {
Files.createFile(file.toPath());
}
inStream = new FileInputStream(file);
properties.load(inStream);
String property = properties.getProperty("data");
MemoryStoreFileData fileData;
if (StringUtils.isNotEmpty(property)) {
fileData = jsonUtils.jsonToObject(property, MemoryStoreFileData.class);
} else {
fileData = new MemoryStoreFileData(Collections.emptyMap(), Collections.emptyMap());
}
Long lastUpdatedTimeInFile = fileData.getLastUpdatedTime();
LOG.info("Last updated time in File-{}", lastUpdatedTimeInFile);
Long lastUpdatedTimeInCache = (Long) commonDataCache.getFromCache("lastUpdatedTime");
LOG.info("Last updated time in Cache-{}", lastUpdatedTimeInCache);
Map<String, DocData> loadedMap = fileData.getDocDataMap();
if (MapUtils.isEmpty(loadedMap)) {
loadedMap = new HashMap<>();
}
Map<String, ProcessStatusDto> processStatusMap = fileData.getProcessStatusMap();
if (MapUtils.isEmpty(processStatusMap)) {
processStatusMap = new HashMap<>();
}
if (lastUpdatedTimeInFile != null && (lastUpdatedTimeInCache == null || lastUpdatedTimeInCache < lastUpdatedTimeInFile)) {
LOG.info("Overwriting data from File");
commonDataCache.addAllToCache(loadedMap, processStatusMap);
} else {
String requestId;
DocData fileDocData;
DocData cacheDocData;
Map<String, String> filePageStatusMap;
Map<String, String> cachePageStatusMap;
String pageId;
String fileStatus;
String cacheStatus;
for (Entry<String, DocData> entry : loadedMap.entrySet()) {
requestId = entry.getKey();
fileDocData = entry.getValue();
cacheDocData = (DocData) commonDataCache.getFromCache(requestId);
filePageStatusMap = fileDocData.getPageStatusMap();
cachePageStatusMap = cacheDocData.getPageStatusMap();
for (Entry<String, String> pageStatus : filePageStatusMap.entrySet()) {
pageId = pageStatus.getKey();
fileStatus = pageStatus.getValue();
cacheStatus = cachePageStatusMap.get(pageId);
if (StringUtils.equals("IN_PROCESS", cacheStatus) && !StringUtils.equals("IN_PROCESS", fileStatus)) {
cachePageStatusMap.put(pageId, fileStatus);
LOG.info("PageId: {} status: {} updated", pageId, fileStatus);
}
}
commonDataCache.addToCache(requestId, cacheDocData);
}
}
} catch (Exception e) {
LOG.error("ErrorCode-{}, Component-{}, Message-{}. Error Loading cache data from file-{}. Exiting system", "OR-51010", "ORCHESTRATION", "Symphony cache loading exception", file.getAbsoluteFile(), e);
System.exit(0);
}
}
}
This is my cache utility class where store and retrieve methods are defines.
@Component
public class CommonDataCache {
private static final Logger LOG = LoggerFactory.getLogger(CommonDataCache.class);
@Autowired
HazelcastInstance hazelcastInstance;
@Autowired
SymphonyConfig config;
public static String LAST_UPDATED_TIME = "lastUpdatedTime";
private IMap<String, Object> hazelcastCache = null;
private boolean useHazelcast = false;
private final Map<String, Object> cache = new ConcurrentHashMap<>();
@PostConstruct
public void postConstruct() {
hazelcastCache = hazelcastInstance.getMap("default");
// Enable only if logging level is DEBUG
if (LOG.isDebugEnabled()) {
hazelcastCache.addEntryListener(new HazelcastMapListener(), true);
}
useHazelcast = config.getUseHazelcastCache();
}
public Map<String, Object> getAllDataFromCache() {
return hazelcastCache;
}
public void addToCache(String key, Object value) {
if (useHazelcast) {
hazelcastCache.put(key, value);
hazelcastCache.put(LAST_UPDATED_TIME, System.currentTimeMillis());
} else {
cache.put(key, value);
cache.put(LAST_UPDATED_TIME, System.currentTimeMillis());
}
}
public Object getAndRemoveFromCache(String key) {
if (useHazelcast) {
return hazelcastCache.remove(key);
} else {
return cache.remove(key);
}
}
public Object getFromCache(String key) {
if (useHazelcast) {
return hazelcastCache.get(key);
} else {
return cache.get(key);
}
}
/**
* @param cacheDataMap map of doc data entries to add to the cache
* @param processStatusMap map of process status entries to add to the cache
*/
public void addAllToCache(Map<String, DocData> cacheDataMap, Map<String, ProcessStatusDto> processStatusMap) {
hazelcastCache.putAll(cacheDataMap);
hazelcastCache.putAll(processStatusMap);
hazelcastCache.put(LAST_UPDATED_TIME, System.currentTimeMillis());
}
public void lockKey(String key) {
if (useHazelcast) {
hazelcastCache.lock(key);
}
}
public void unlockKey(String key) {
if (useHazelcast) {
hazelcastCache.unlock(key);
}
}
public Map<String, Object> getByKeyContains(String keyString) {
Map<String, Object> values;
if (useHazelcast) {
Set<String> foundKeys = hazelcastCache.keySet(entry -> ((String)entry.getKey()).contains(keyString));
values = hazelcastCache.getAll(foundKeys);
} else {
values = Maps.filterEntries(cache, entry -> entry.getKey().contains(keyString));
}
return values;
}
}
Here are the Maven dependencies for the Groovy tests.
<!-- Dependencies for GROOVY TEST -->
<dependency>
<groupId>com.hazelcast</groupId>
<artifactId>hazelcast</artifactId>
<version>${hazelcast.version}</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>com.hazelcast</groupId>
<artifactId>hazelcast</artifactId>
<version>${hazelcast.version}</version>
<classifier>tests</classifier>
<scope>test</scope>
</dependency>
<!-- GROOVY TEST FRAMEWORK -->
<dependency>
<groupId>org.aspectj</groupId>
<artifactId>aspectjweaver</artifactId>
<version>${aspectj.version}</version>
</dependency>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>${junit.version}</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-test</artifactId>
<version>${org.springframework.version}</version>
</dependency>
<dependency>
<groupId>io.dropwizard</groupId>
<artifactId>dropwizard-core</artifactId>
</dependency>
<dependency>
<groupId>io.dropwizard</groupId>
<artifactId>dropwizard-assets</artifactId>
</dependency>
<dependency>
<groupId>io.dropwizard</groupId>
<artifactId>dropwizard-testing</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>realdoc</groupId>
<artifactId>dropwizard-spring</artifactId>
<version>${realdoc.dropwizard-spring.version}</version>
</dependency>
<dependency>
<groupId>org.codehaus.groovy</groupId>
<artifactId>groovy-all</artifactId>
<!-- any version of Groovy >= 1.5.0 should work here -->
<version>${groovy-all.version}</version>
<!--<scope>test</scope>-->
</dependency>
<dependency>
<groupId>org.spockframework</groupId>
<artifactId>spock-core</artifactId>
<version>${spock.version}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.spockframework</groupId>
<artifactId>spock-spring</artifactId>
<version>${spock.version}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>cglib</groupId>
<artifactId>cglib-nodep</artifactId>
<version>${cglib-nodep.version}</version>
<scope>test</scope>
</dependency>
<!-- GROOVY TEST FRAMEWORK -->
<build>
<testSourceDirectory>src/test/groovy</testSourceDirectory>
<testResources>
<testResource>
<directory>src/test/resources</directory>
<filtering>true</filtering>
</testResource>
</testResources>
...
...
</build>

Hibernate not creating table but no error messages

I am doing a Spring Boot project and trying to create a table with Hibernate. I get no errors when I run the app and the server starts normally, but the table does not get created.
StatusUpdate.java
package model;
import java.util.Date;
import javax.persistence.Column;
import javax.persistence.Entity;
import javax.persistence.GeneratedValue;
import javax.persistence.GenerationType;
import javax.persistence.Id;
import javax.persistence.Table;
import javax.persistence.Temporal;
import javax.persistence.TemporalType;
import javax.persistence.PrePersist;
@Entity
@Table(name="status_update")
public class StatusUpdate {
@Id
@Column(name="id")
@GeneratedValue(strategy=GenerationType.AUTO)
private Long id;
@Column(name="text")
private String text;
@Column(name="added")
@Temporal(TemporalType.TIMESTAMP)
private Date added;
@PrePersist
protected void onCreate() {
if (added == null) {
added = new Date();
}
}
public StatusUpdate(String text) {
this.text = text;
}
public StatusUpdate(String text, Date added) {
this.text = text;
this.added = added;
}
public Long getId() {
return id;
}
public void setId(Long id) {
this.id = id;
}
public String getText() {
return text;
}
public void setText(String text) {
this.text = text;
}
public Date getAdded() {
return added;
}
public void setAdded(Date added) {
this.added = added;
}
@Override
public int hashCode() {
final int prime = 31;
int result = 1;
result = prime * result + ((added == null) ? 0 : added.hashCode());
result = prime * result + ((id == null) ? 0 : id.hashCode());
result = prime * result + ((text == null) ? 0 : text.hashCode());
return result;
}
@Override
public boolean equals(Object obj) {
if (this == obj)
return true;
if (obj == null)
return false;
if (getClass() != obj.getClass())
return false;
StatusUpdate other = (StatusUpdate) obj;
if (added == null) {
if (other.added != null)
return false;
} else if (!added.equals(other.added))
return false;
if (id == null) {
if (other.id != null)
return false;
} else if (!id.equals(other.id))
return false;
if (text == null) {
if (other.text != null)
return false;
} else if (!text.equals(other.text))
return false;
return true;
}
}
pom.xml
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-parent</artifactId>
<version>1.4.0.RELEASE</version>
</parent>
<properties>
<java.version>1.8</java.version>
<tiles.version>3.0.7</tiles.version>
</properties>
<groupId>com.voja</groupId>
<artifactId>spring-boot-tutorial</artifactId>
<version>0.0.1-SNAPSHOT</version>
<dependencies>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-web</artifactId>
</dependency>
<dependency>
<groupId>org.apache.tomcat.embed</groupId>
<artifactId>tomcat-embed-jasper</artifactId>
</dependency>
<dependency>
<groupId>org.apache.tiles</groupId>
<artifactId>tiles-core</artifactId>
<version>${tiles.version}</version>
</dependency>
<dependency>
<groupId>org.apache.tiles</groupId>
<artifactId>tiles-jsp</artifactId>
<version>${tiles.version}</version>
</dependency>
<dependency>
<groupId>javax.servlet</groupId>
<artifactId>jstl</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-devtools</artifactId>
</dependency>
<dependency>
<groupId>mysql</groupId>
<artifactId>mysql-connector-java</artifactId>
<version>5.1.38</version>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-data-jpa</artifactId>
<version>1.3.5.RELEASE</version>
</dependency>
</dependencies>
<packaging>war</packaging>
<build>
<plugins>
<plugin>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-maven-plugin</artifactId>
<configuration>
<executable>true</executable>
</configuration>
</plugin>
</plugins>
</build>
</project>
application.properties
debug=true
spring.datasource.url=jdbc:mysql://localhost:3306/springboottutorial
spring.datasource.username=springboot
spring.datasource.password=hello
spring.datasource.driverClassName=com.mysql.jdbc.Driver
spring.jpa.hibernate.dialect=org.hibernate.dialect.MySQLInnoDBDialect
spring.jpa.generate-ddl=true
spring.jpa.show-sql=true
logging.level.org.hibernate.SQL=DEBUG
I also get a yellow line under dialect on the line spring.jpa.hibernate.dialect=org.hibernate.dialect.MySQLInnoDBDialect, which says 'spring.jpa.hibernate.dialect' is an unknown property. Did you mean 'spring.jpa.hibernate.ddl-auto'? I mention it in case that might be the problem.
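As the warning hints, spring.jpa.hibernate.dialect is not a recognized Spring Boot key; the usual ways to set the dialect are spring.jpa.database-platform or the pass-through spring.jpa.properties.hibernate.dialect, for example:
spring.jpa.database-platform=org.hibernate.dialect.MySQLInnoDBDialect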
Sometimes the problem is the property name you choose, like text, value, etc., which are reserved keywords for the database. In that case you can see the create-table statement in the log, but the table will not be created. To resolve this, just change the property name in your bean or provide an explicit column name.
Example: a property or column name "value" is not supported in MySQL, so my tables were not created; I had to rename it to 'valueString'.
Hope this helps.
Found the problem by reading another post: there was a problem with the packages and the classes within them; they couldn't find one another, and so the table wasn't created.
I made a new project and put all the classes inside the same package and it worked, so I will fix my existing project based on that.
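A hedged alternative to moving everything into one package: keep StatusUpdate where it is and tell Spring Boot where to scan. The application class below is illustrative (the entity package is assumed to be model, as in the question):
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.boot.autoconfigure.domain.EntityScan;
@SpringBootApplication
@EntityScan(basePackages = "model") // the package that holds StatusUpdate
public class Application {
    public static void main(String[] args) {
        SpringApplication.run(Application.class, args);
    }
}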
It's not working for you because you are not using it; I mean, JPA creates the tables when it needs to use them. You can try writing a test that uses the entity, or add a REST repository to try it. Just add this to your application.properties:
spring.jpa.hibernate.ddl-auto=create
and then create this interface:
@RepositoryRestResource
public interface IStatusRepository extends CrudRepository<StatusUpdate, Long> {
}
You will also need this dependency:
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-data-rest</artifactId>
</dependency>
It's easier to just create a test, but I guess you are trying to build a REST web service, so try this to see that it works =)
For your application.properties file, follow the link below.
It is like:
spring.jpa.generate-ddl=true
and then add
spring.jpa.hibernate.ddl-auto=create-drop
http://docs.spring.io/spring-boot/docs/1.2.3.RELEASE/reference/htmlsingle/#common-application-properties
If you are using jpa and hibernate then add the following property in your application.properties file
spring.jpa.hibernate.ddl-auto=update
Try adding this to your application.properties :
spring.jpa.hibernate.ddl-auto=create
