AWS SQS and SES dependency libraries affect each other - java

I'm trying to use imports from SES and SQS at the same time, but the combination causes a compile error on the .withBody method. I'm guessing it's related to the dependencies, but they are at the latest version.
Error:(116,54) java: incompatible types: com.amazonaws.services.simpleemail.model.Body cannot be converted to java.lang.String
import com.amazonaws.regions.Regions;
import com.amazonaws.services.sqs.AmazonSQS;
import com.amazonaws.services.sqs.AmazonSQSClientBuilder;
import com.amazonaws.services.sqs.model.Message;
import com.amazonaws.services.sqs.model.ReceiveMessageRequest;
import com.amazonaws.services.simpleemail.AmazonSimpleEmailService;
import com.amazonaws.services.simpleemail.AmazonSimpleEmailServiceClientBuilder;
import com.amazonaws.services.simpleemail.model.*;
public void email(S3Event event, Person person, Boolean error) {
    ObjectMapper mapper = new ObjectMapper();
    String emailText = null;
    if (error) {
        emailText = "Error! No image in file!";
    } else {
        try {
            emailText = mapper.writeValueAsString(person);
        } catch (JsonProcessingException e) {
            e.printStackTrace();
        }
    }
    String key = event.getRecords().get(0).getS3().getObject().getKey();
    AmazonSimpleEmailService client =
            AmazonSimpleEmailServiceClientBuilder.standard().withRegion(Regions.EU_WEST_1).build();
    Body body = new Body().withText(new Content().withData(emailText));
    SendEmailRequest request = new SendEmailRequest()
            .withDestination(new Destination().withToAddresses(person.getEmail()))
            .withMessage(new Message()
                    .withBody(new Body().withHtml(new Content().withCharset("UTF-8").withData(emailText)))
                    .withSubject(new Content().withCharset("UTF-8").withData("Message from passport service.")))
            .withSource(person.getEmail());
    client.sendEmail(request);
}
public void getBaseCodeFromSQS() {
    AmazonSQS sqs = AmazonSQSClientBuilder.defaultClient();
    try {
        ReceiveMessageRequest receiveMessageRequest =
                new ReceiveMessageRequest("https://sqs.eu-west-1.amazonaws.com/416031944655/TISFEXP-PSS-2-QUEUE");
        List<Message> messages = sqs.receiveMessage(receiveMessageRequest).getMessages();
        for (Message message : messages) {
            LOGGER.info("MessageId: " + message.getMessageId());
            LOGGER.info("ReceiptHandle: " + message.getReceiptHandle());
            LOGGER.info("MD5OfBody: " + message.getMD5OfBody());
            LOGGER.info("Body: " + message.getBody());
            for (final Map.Entry<String, String> entry : message.getAttributes().entrySet()) {
                LOGGER.info("Attribute - Name: " + entry.getKey());
                LOGGER.info("Attribute - Value: " + entry.getValue());
            }
        }
    } catch (Exception e) {
        LOGGER.error(e);
    }
}
<dependencies>
    <dependency>
        <groupId>com.amazonaws</groupId>
        <artifactId>aws-java-sdk-sqs</artifactId>
        <version>1.11.634</version>
    </dependency>
    <dependency>
        <groupId>com.amazonaws</groupId>
        <artifactId>amazon-sqs-java-messaging-lib</artifactId>
        <version>1.0.8</version>
        <type>jar</type>
    </dependency>
    <dependency>
        <groupId>com.amazonaws</groupId>
        <artifactId>aws-java-sdk-ses</artifactId>
        <version>1.11.634</version>
    </dependency>
</dependencies>
<dependencyManagement>
    <dependencies>
        <dependency>
            <groupId>com.amazonaws</groupId>
            <artifactId>aws-java-sdk-bom</artifactId>
            <version>1.11.634</version>
            <type>pom</type>
            <scope>import</scope>
        </dependency>
    </dependencies>
</dependencyManagement>

There is a Message class defined in both the SES and SQS packages. Your imports pull in com.amazonaws.services.sqs.model.Message explicitly, and an explicit import always wins over the SES wildcard import, so the compiler resolves the bare name Message to the SQS type. You should use the one defined in the SES package instead:
https://docs.aws.amazon.com/AWSJavaSDK/latest/javadoc/com/amazonaws/services/sqs/model/Message.html
https://docs.aws.amazon.com/AWSJavaSDK/latest/javadoc/com/amazonaws/services/simpleemail/model/Message.html
SendEmailRequest request = new SendEmailRequest()
        .withDestination(new Destination().withToAddresses(person.getEmail()))
        .withMessage(new com.amazonaws.services.simpleemail.model.Message()
                .withBody(new Body().withHtml(new Content().withCharset("UTF-8").withData(emailText)))
                .withSubject(new Content().withCharset("UTF-8").withData("Message from passport service.")))
        .withSource(person.getEmail());
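The same kind of simple-name collision exists in the JDK itself (java.util.Date vs java.sql.Date), and the resolution pattern is identical: import the class you use most and fully qualify the other. A minimal, self-contained sketch of the pattern (class and method names here are mine, purely for illustration):

```java
import java.util.Date; // the bare name "Date" now means java.util.Date

public class NameCollisionDemo {
    // The explicitly imported type wins for the bare name.
    static String describe(Date d) {
        return "util: " + d.getTime();
    }

    // The second type with the same simple name must be fully qualified.
    static String describe(java.sql.Date d) {
        return "sql: " + d.getTime();
    }

    public static void main(String[] args) {
        System.out.println(describe(new Date(0L)));          // util: 0
        System.out.println(describe(new java.sql.Date(0L))); // sql: 0
    }
}
```

Java has no import aliases, so fully qualifying one of the two colliding classes, as the answer above does for the SES Message, is the only option.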

Related

Creating an Amazon S3 bucket Using the AWS SDK for Java : Exception in thread "main" java.lang.NoClassDefFoundError

Getting the below error while trying to create an S3 bucket with the AWS Java API:
Exception in thread "main" java.lang.NoClassDefFoundError: software/amazon/awssdk/protocols/query/internal/unmarshall/AwsXmlErrorUnmarshaller
    at software.amazon.awssdk.protocols.xml.internal.unmarshall.AwsXmlErrorTransformer.<init>(AwsXmlErrorTransformer.java:40)
    at software.amazon.awssdk.protocols.xml.internal.unmarshall.AwsXmlErrorTransformer.<init>(AwsXmlErrorTransformer.java:34)
    at software.amazon.awssdk.protocols.xml.internal.unmarshall.AwsXmlErrorTransformer$Builder.build(AwsXmlErrorTransformer.java:113)
    at software.amazon.awssdk.protocols.xml.AwsXmlProtocolFactory.createErrorTransformer(AwsXmlProtocolFactory.java:135)
    at software.amazon.awssdk.protocols.xml.AwsS3ProtocolFactory.createErrorCouldBeInBodyResponseHandler(AwsS3ProtocolFactory.java:80)
    at software.amazon.awssdk.services.s3.DefaultS3Client.createBucket(DefaultS3Client.java:1144)
    at com.act.niti.main(niti.java:33)
Caused by: java.lang.ClassNotFoundException: software.amazon.awssdk.protocols.query.internal.unmarshall.AwsXmlErrorUnmarshaller
    at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:355)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
    ...
Code:
try {
    Region region = Region.US_EAST_2;
    S3Client s3 = S3Client.builder()
            .region(region)
            .build();
    S3Waiter s3Waiter = s3.waiter();
    CreateBucketRequest bucketRequest = CreateBucketRequest.builder()
            .bucket("abc")
            .build();
    s3.createBucket(bucketRequest); // creating the S3 bucket
    System.out.println("bucket........abc");
    HeadBucketRequest bucketRequestWait = HeadBucketRequest.builder()
            .bucket("abc")
            .build();
    // Wait until the bucket is created and print out the response
    WaiterResponse<HeadBucketResponse> waiterResponse =
            s3Waiter.waitUntilBucketExists(bucketRequestWait);
    waiterResponse.matched().response().ifPresent(System.out::println);
    System.out.println("abc" + " is ready");
} catch (S3Exception e) {
    System.err.println(e.awsErrorDetails().errorMessage());
    System.exit(1);
}
Note: using Java 8
POM XML:
<dependencies>
    <dependency>
        <groupId>com.amazonaws</groupId>
        <artifactId>aws-java-sdk</artifactId>
        <version>1.11.570</version>
    </dependency>
    <dependency>
        <groupId>software.amazon.awssdk</groupId>
        <artifactId>s3</artifactId>
        <version>2.17.269</version>
    </dependency>
    <dependency>
        <groupId>software.amazon.awssdk</groupId>
        <artifactId>ec2</artifactId>
        <version>2.5.10</version>
    </dependency>
    <!-- https://mvnrepository.com/artifact/com.fasterxml.jackson.core/jackson-databind -->
    <dependency>
        <groupId>com.fasterxml.jackson.core</groupId>
        <artifactId>jackson-databind</artifactId>
        <version>2.9.10</version>
    </dependency>
    <!-- https://mvnrepository.com/artifact/com.amazonaws/aws-java-sdk-gamelift -->
    <dependency>
        <groupId>com.amazonaws</groupId>
        <artifactId>aws-java-sdk-gamelift</artifactId>
        <version>1.11.647</version>
    </dependency>
    <dependency>
        <groupId>org.junit.jupiter</groupId>
        <artifactId>junit-jupiter-api</artifactId>
        <version>5.8.2</version>
        <scope>test</scope>
    </dependency>
    <dependency>
        <groupId>org.junit.jupiter</groupId>
        <artifactId>junit-jupiter-engine</artifactId>
        <version>5.8.2</version>
        <scope>test</scope>
    </dependency>
    <dependency>
        <groupId>org.junit.platform</groupId>
        <artifactId>junit-platform-commons</artifactId>
        <version>1.8.2</version>
    </dependency>
    <dependency>
        <groupId>org.junit.platform</groupId>
        <artifactId>junit-platform-launcher</artifactId>
        <version>1.8.2</version>
        <scope>test</scope>
    </dependency>
    <dependency>
        <groupId>software.amazon.awssdk</groupId>
        <artifactId>s3-transfer-manager</artifactId>
        <version>2.17.103-PREVIEW</version>
    </dependency>
</dependencies>
Looks like you are mixing up V1 and V2 in your POM file. You have
<dependency>
    <groupId>com.amazonaws</groupId>
    <artifactId>aws-java-sdk</artifactId>
    <version>1.11.570</version>
</dependency>
There is no need for this dependency when using the AWS SDK for Java V2; S3Client is a V2 service client. In fact, your errors are most likely caused by mixing SDK versions in your POM file.
The POM file that you should use can be found in the AWS Github repo here:
https://github.com/awsdocs/aws-doc-sdk-examples/tree/main/javav2/example_code/s3
If you are not familiar with AWS SDK for Java v2, refer to the DEV Guide:
https://docs.aws.amazon.com/sdk-for-java/latest/developer-guide/home.html
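As a rough illustration of the point (the `software.amazon.awssdk` coordinates are real; the version number below is only a placeholder, so prefer the versions pinned in the linked repo), a V2-only dependency section drops the com.amazonaws V1 artifact entirely and manages versions through the V2 BOM:

```xml
<dependencyManagement>
    <dependencies>
        <dependency>
            <groupId>software.amazon.awssdk</groupId>
            <artifactId>bom</artifactId>
            <version>2.17.269</version>
            <type>pom</type>
            <scope>import</scope>
        </dependency>
    </dependencies>
</dependencyManagement>
<dependencies>
    <!-- No com.amazonaws:aws-java-sdk here: that is the V1 SDK. -->
    <dependency>
        <groupId>software.amazon.awssdk</groupId>
        <artifactId>s3</artifactId>
        <!-- version inherited from the BOM above -->
    </dependency>
</dependencies>
```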
The V2 Java code that works when you use the proper POM file is here:
package com.example.s3;
// snippet-start:[s3.java2.create_bucket_waiters.import]
import software.amazon.awssdk.auth.credentials.ProfileCredentialsProvider;
import software.amazon.awssdk.core.waiters.WaiterResponse;
import software.amazon.awssdk.regions.Region;
import software.amazon.awssdk.services.s3.S3Client;
import software.amazon.awssdk.services.s3.model.CreateBucketRequest;
import software.amazon.awssdk.services.s3.model.HeadBucketRequest;
import software.amazon.awssdk.services.s3.model.HeadBucketResponse;
import software.amazon.awssdk.services.s3.model.S3Exception;
import software.amazon.awssdk.services.s3.waiters.S3Waiter;
import java.net.URISyntaxException;
// snippet-end:[s3.java2.create_bucket_waiters.import]
/**
* Before running this Java V2 code example, set up your development environment, including your credentials.
*
* For more information, see the following documentation topic:
*
* https://docs.aws.amazon.com/sdk-for-java/latest/developer-guide/get-started.html
*/
public class CreateBucket {
    public static void main(String[] args) throws URISyntaxException {
        final String usage = "\n" +
                "Usage:\n" +
                "    <bucketName> \n\n" +
                "Where:\n" +
                "    bucketName - The name of the bucket to create. The bucket name must be unique, or an error occurs.\n\n";
        if (args.length != 1) {
            System.out.println(usage);
            System.exit(1);
        }
        String bucketName = args[0];
        System.out.format("Creating a bucket named %s\n", bucketName);
        ProfileCredentialsProvider credentialsProvider = ProfileCredentialsProvider.create();
        Region region = Region.US_EAST_1;
        S3Client s3 = S3Client.builder()
                .region(region)
                .credentialsProvider(credentialsProvider)
                .build();
        createBucket(s3, bucketName);
        s3.close();
    }

    // snippet-start:[s3.java2.create_bucket_waiters.main]
    public static void createBucket(S3Client s3Client, String bucketName) {
        try {
            S3Waiter s3Waiter = s3Client.waiter();
            CreateBucketRequest bucketRequest = CreateBucketRequest.builder()
                    .bucket(bucketName)
                    .build();
            s3Client.createBucket(bucketRequest);
            HeadBucketRequest bucketRequestWait = HeadBucketRequest.builder()
                    .bucket(bucketName)
                    .build();
            // Wait until the bucket is created and print out the response.
            WaiterResponse<HeadBucketResponse> waiterResponse = s3Waiter.waitUntilBucketExists(bucketRequestWait);
            waiterResponse.matched().response().ifPresent(System.out::println);
            System.out.println(bucketName + " is ready");
        } catch (S3Exception e) {
            System.err.println(e.awsErrorDetails().errorMessage());
            System.exit(1);
        }
    }
    // snippet-end:[s3.java2.create_bucket_waiters.main]
}

Requesting a token from Azure with claims, but the claims are not set

I have a spring boot application and I am trying to request token from azure using the following code:
public String getTokenFromAzure() {
    String token = null;
    ConfidentialClientApplication application = getApplication();
    final String claims = JsonSerializer.convertToJson(new Employee("public"));
    final com.microsoft.aad.msal4j.ClaimsRequest claims1 = CustomClaimRequest.formatAsClaimsRequest(claims);
    ClaimsRequest claims2 = new ClaimsRequest();
    claims2.requestClaimInIdToken(claims, null);
    MyClaims claims3 = new MyClaims();
    claims3.requestClaimInAccessToken(claims, new RequestedClaimAdditionalInfo(true, "value", Arrays.asList("employeeid", "dummy")));
    if (application == null) {
        log.error("application is not instantiated");
    } else {
        ClientCredentialParameters parameters = ClientCredentialParameters.builder(Collections.singleton(clientId + "/.default")).claims(claims3).build();
        IAuthenticationResult auth = application.acquireToken(parameters).join();
        if (auth == null) {
            log.info("auth still == null");
        } else {
            log.info("idToken: " + auth.idToken());
            log.info("accessToken: " + auth.accessToken());
            token = isEmpty(auth.idToken()) ? auth.accessToken() : auth.idToken();
        }
    }
    return token;
}

private ConfidentialClientApplication getApplication() {
    if (application == null) {
        try {
            application = ConfidentialClientApplication.builder(clientId, ClientCredentialFactory.createFromSecret(clientSecret)).authority("https://login.microsoftonline.com/" + tenantId + "/").build();
        } catch (MalformedURLException e) {
            log.error("unable to instantiate application for tenant " + tenantId + " with client " + clientId + " with configuration", e);
        }
    }
    return application;
}

static class MyClaims extends ClaimsRequest {
    @Override
    protected void requestClaimInAccessToken(String claim, RequestedClaimAdditionalInfo requestedClaimAdditionalInfo) {
        super.requestClaimInAccessToken(claim, requestedClaimAdditionalInfo);
    }
}
I have tried with claims1, claims2 and with claims3. I am getting a functional access token but the claims are not set.
These are the dependencies that I am using:
<dependency>
    <groupId>com.microsoft.azure</groupId>
    <artifactId>applicationinsights-spring-boot-starter</artifactId>
    <version>2.6.1</version>
</dependency>
<dependency>
    <groupId>com.microsoft.azure</groupId>
    <artifactId>applicationinsights-logging-logback</artifactId>
    <version>2.6.1</version>
</dependency>
<dependency>
    <groupId>com.microsoft.azure</groupId>
    <artifactId>azure-active-directory-spring-boot-starter</artifactId>
    <version>2.3.2</version>
</dependency>
<dependency>
    <groupId>com.microsoft.graph</groupId>
    <artifactId>microsoft-graph</artifactId>
    <version>2.5.0</version>
</dependency>
<dependency>
    <groupId>com.microsoft.graph</groupId>
    <artifactId>microsoft-graph-auth</artifactId>
    <version>0.2.0</version>
</dependency>
<dependency>
    <groupId>com.microsoft.azure</groupId>
    <artifactId>azure-storage</artifactId>
    <version>4.2.0</version>
</dependency>
<dependency>
    <groupId>com.azure</groupId>
    <artifactId>azure-identity</artifactId>
    <version>1.3.5</version>
    <scope>test</scope>
</dependency>
<!-- https://mvnrepository.com/artifact/com.microsoft.azure/msal4j -->
<dependency>
    <groupId>com.microsoft.azure</groupId>
    <artifactId>msal4j</artifactId>
    <version>1.11.0</version>
</dependency>
Does anyone know the correct way to add the claims to the JWT token?
You can add custom key-value pairs to the JWT's body as custom claims. These may be a user's department at work, a user's role or privilege, or whatever else you need to carry in the JWT. For instance, the code sample below includes two custom claims for the user's role and department:
String token = Jwts.builder()
        .setSubject(subject)
        .setExpiration(expDate)
        .claim("Role", "Admin")
        .claim("Department", "Product development")
        .signWith(SignatureAlgorithm.HS512, secret)
        .compact();
In the code example above, Role and Department are the two custom claims I have added. You can extend the JWT's body of claims as necessary; just keep in mind not to include sensitive data such as a user password or token secret, since anyone holding the token can decode and examine its claims.
Use the following bit of code to read the custom Claims from the JWT token's body:
Claims claims = Jwts.parser()
        .setSigningKey(tokenSecret)
        .parseClaimsJws(jwt).getBody();

// Reading reserved claims
System.out.println("Subject: " + claims.getSubject());
System.out.println("Expiration: " + claims.getExpiration());

// Reading custom claims
System.out.println("Role: " + claims.get("Role"));
System.out.println("Department: " + claims.get("Department"));
Remember that a JWT is a Base64-encoded string and can be easily decoded. Therefore, you should not put any sensitive user details into the claims. Even though the information in the claims cannot be altered without invalidating the signature, it can be viewed by anyone who Base64-decodes the token.
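To make the "easily decoded" point concrete, here is a minimal, dependency-free sketch that extracts a JWT's payload with nothing but java.util.Base64 (the class name and the toy token are mine, purely for illustration; no signature verification is performed):

```java
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class JwtPeek {
    // Decode the middle (payload) segment of a JWT without verifying it.
    static String decodePayload(String jwt) {
        String payload = jwt.split("\\.")[1]; // JWT shape: header.payload.signature
        return new String(Base64.getUrlDecoder().decode(payload), StandardCharsets.UTF_8);
    }

    public static void main(String[] args) {
        Base64.Encoder enc = Base64.getUrlEncoder().withoutPadding();
        // Build a toy unsigned token just to demonstrate decoding.
        String token = enc.encodeToString("{\"alg\":\"none\"}".getBytes(StandardCharsets.UTF_8))
                + "." + enc.encodeToString("{\"Role\":\"Admin\"}".getBytes(StandardCharsets.UTF_8))
                + ".";
        System.out.println(decodePayload(token)); // prints {"Role":"Admin"}
    }
}
```

This is exactly why claims must never carry secrets: the payload is only encoded, not encrypted.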

@StreamListener not receiving message from kafka topic

I am able to send and receive the message using code:
@EnableBinding(Processor.class)
public class KafkaStreamsConfiguration {
    @StreamListener(Processor.INPUT)
    @SendTo(Processor.OUTPUT)
    public String processMessage(String message) {
        System.out.println("message = " + message);
        return message.replaceAll("my", "your");
    }
}
@RunWith(SpringRunner.class)
@SpringBootTest
@DirtiesContext
public class StreamApplicationIT {
    private static String topicToPublish = "eventUpdateFromEventModel";

    @BeforeClass
    public static void setup() {
        System.setProperty("spring.kafka.bootstrap-servers", embeddedKafka.getBrokersAsString());
    }

    @Autowired
    private KafkaMessageSender<String> kafkaMessageSenderToTestErrors;

    @Autowired
    private KafkaMessageSender<EventNotificationDto> kafkaMessageSender;

    @ClassRule
    public static KafkaEmbedded embeddedKafka = new KafkaEmbedded(1, true, topicToPublish);

    @Autowired
    private Processor pipe;

    @Autowired
    private MessageCollector messageCollector;

    @Rule
    public OutputCapture outputCapture = new OutputCapture();

    @Test
    public void working() {
        pipe.input()
                .send(MessageBuilder.withPayload("This is my message")
                        .build());
        Object payload = messageCollector.forChannel(pipe.output())
                .poll()
                .getPayload();
        assertEquals("This is your message", payload.toString());
    }

    @Test
    public void non_working() {
        kafkaMessageSenderToTestErrors.send(topicToPublish, "This was my message");
        assertTrue(isMessageReceived("This was your message", 50));
    }

    private boolean isMessageReceived(final String msg, final int maxAttempt) {
        return IntStream.rangeClosed(0, maxAttempt)
                .peek(a -> {
                    try {
                        TimeUnit.MILLISECONDS.sleep(100);
                    } catch (InterruptedException e) {
                        fail();
                    }
                }).anyMatch(i -> outputCapture.toString().contains(msg));
    }
}
@Service
@Slf4j
public class KafkaMessageSender<T> {
    private final KafkaTemplate<String, byte[]> kafkaTemplate;
    private final ObjectWriter objectWriter;

    public KafkaMessageSender(KafkaTemplate<String, byte[]> kafkaTemplate, ObjectMapper objectMapper) {
        this.kafkaTemplate = kafkaTemplate;
        this.objectWriter = objectMapper.writer();
    }

    public void send(String topicName, T payload) {
        try {
            kafkaTemplate.send(topicName, objectWriter.writeValueAsString(payload).getBytes());
        } catch (JsonProcessingException e) {
            log.info("error converting object into byte array {}", payload.toString().substring(0, 50));
        }
        log.info("sent payload to topic='{}'", topicName);
    }
}
But when I send the message using KafkaTemplate to any topic, the @StreamListener doesn't receive it.
spring.cloud.stream.bindings.input.group=test
spring.cloud.stream.bindings.input.destination=eventUpdateFromEventModel
my pom.xml:
<dependency>
    <groupId>org.springframework.cloud</groupId>
    <artifactId>spring-cloud-stream</artifactId>
</dependency>
<dependency>
    <groupId>org.springframework.cloud</groupId>
    <artifactId>spring-cloud-starter-stream-kafka</artifactId>
</dependency>
<dependency>
    <groupId>org.springframework.cloud</groupId>
    <artifactId>spring-cloud-stream-test-support</artifactId>
    <scope>test</scope>
</dependency>

<!-- Spring boot version -->
<spring.boot.version>1.5.7.RELEASE</spring.boot.version>
<spring-cloud.version>Edgware.SR3</spring-cloud.version>

<dependencyManagement>
    <dependencies>
        <dependency>
            <!-- Import dependency management from Spring Boot -->
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-dependencies</artifactId>
            <version>${spring.boot.version}</version>
            <type>pom</type>
            <scope>import</scope>
        </dependency>
        <dependency>
            <groupId>org.springframework.cloud</groupId>
            <artifactId>spring-cloud-dependencies</artifactId>
            <version>${spring-cloud.version}</version>
            <type>pom</type>
            <scope>import</scope>
        </dependency>
    </dependencies>
</dependencyManagement>
The working test reads from the test binder's channel:

Object payload = messageCollector.forChannel(pipe.output())
        .poll()
        .getPayload();

The non-working test publishes with a real KafkaTemplate.
This is because you are using the TestBinder in your test, not a real Kafka broker and the Kafka binder.
The message collector simply fetches the message from the test binder's channel. If you want to test with a real Kafka broker, see the test-embedded-kafka sample app.
EDIT
I just tested the Ditmars (boot 1.5.x) version of the sample and it works fine...

Groovy Spock test case failing which includes embedded Hazelcast

I am new to the Groovy Spock test framework and am trying to write my first test with it. I have a few questions about the test below, which keeps failing; I don't see what I need to change to make it work.
What I am trying to do:
One thing I am not understanding is when I should use a mock and when I should use a spy.
Under the then: section I am calling classUnderTest.loadFromFile() in order to test it.
While debugging I always see the instance variable hazelcastCache in the CommonDataCache class as null, while at the same time cache gets a non-null object. Because of this I always get the failed null check shown in the error log below.
Can anybody suggest what I am missing to make this work?
Error log:
Members [1] {
Member [127.0.0.1]:5001 - a43cbef2-ddf7-431b-9300-2b53c3ea9294 this
}
Dec 13, 2017 3:53:53 PM com.hazelcast.core.LifecycleService
INFO: [127.0.0.1]:5001 [dev] [3.7.3] [127.0.0.1]:5001 is STARTED
Dec 13, 2017 3:53:54 PM com.hazelcast.internal.partition.impl.PartitionStateManager
INFO: [127.0.0.1]:5001 [dev] [3.7.3] Initializing cluster partition table arrangement...
Condition not satisfied:
commonDataCache.getFromCache("28ef4a8f-bfbc-4ad5-bc8a-88fd96ad82a8") != null
| | |
| null false
com.realdoc.symphony.common.CommonDataCache#144ab54
at com.realdoc.symphony.common.store.MemoryStoreManagerTest.populate hazlecast cache from symphony dat file(MemoryStoreManagerTest.groovy:65)
Dec 13, 2017 3:53:54 PM com.hazelcast.instance.Node
INFO: [127.0.0.1]:5001 [dev] [3.7.3] Running shutdown hook... Current state: ACTIVE
This is my Groovy test class:
import org.springframework.core.io.ClassPathResource
import spock.lang.Specification
import spock.lang.Subject

import static io.dropwizard.testing.FixtureHelpers.fixture

class MemoryStoreManagerTest extends Specification {

    /**
     * Mock any config DTOs that carry static configuration data
     **/
    def dw = Mock(SymphonyConfig)
    def cacheConfig = Mock(CacheConfig)

    /**
     * This has to be spied because there is an actual call happening in the target method which converts
     * a json string to a MemoryStoreFileData DTO object
     */
    def jsonUtils = Spy(JsonUtils)

    def hazelcastInstance = TestHazelcastInstanceFactory.newInstance().newHazelcastInstance()

    /**
     * This class is under test
     **/
    @Subject
    def commonDataCache = new CommonDataCache(hazelcastInstance: hazelcastInstance, hazelcastCache: hazelcastInstance.getMap("default"), config: dw)

    /**
     * This class is under test
     **/
    @Subject
    def classUnderTest = new MemoryStoreManager(dw: dw, jsonUtils: jsonUtils, commonDataCache: commonDataCache)

    /**
     * Test whether populating the symphony.dat file into the hazelcast cache works
     */
    def "populate hazlecast cache from symphony dat file"() {
        setup:
        def datFile = fixture("symphony.dat")
        def resource = new ClassPathResource("symphony.dat")
        def file = resource.getFile()

        when:
        cacheConfig.getStoreLocation() >> ""
        cacheConfig.getStoreFileName() >> "symphony.dat"
        dw.getUseHazelcastCache() >> true
        dw.getCacheConfig() >> cacheConfig
        cacheConfig.getFile() >> file
        commonDataCache.postConstruct()

        then:
        classUnderTest.loadFromFile()

        expect:
        commonDataCache.getFromCache("28ef4a8f-bfbc-4ad5-bc8a-88fd96ad82a8") != null
    }
}
This is my target class, on which I am trying to test the loadFromFile() method:
@Component
public class MemoryStoreManager {
    private static final Logger LOG = LoggerFactory.getLogger(MemoryStoreManager.class);

    @Autowired
    SymphonyConfig dw;

    @Autowired
    JsonUtils jsonUtils;

    @Autowired
    CommonDataCache commonDataCache;

    private final Properties properties = new Properties();

    @PostConstruct
    public void loadFromFile() {
        File file = dw.getCacheConfig().getFile();
        LOG.info("Loading Data from file-{}", file.getAbsolutePath());
        FileInputStream inStream = null;
        try {
            if (!file.exists()) {
                Files.createFile(file.toPath());
            }
            inStream = new FileInputStream(file);
            properties.load(inStream);
            String property = properties.getProperty("data");
            MemoryStoreFileData fileData;
            if (StringUtils.isNotEmpty(property)) {
                fileData = jsonUtils.jsonToObject(property, MemoryStoreFileData.class);
            } else {
                fileData = new MemoryStoreFileData(Collections.emptyMap(), Collections.emptyMap());
            }
            Long lastUpdatedTimeInFile = fileData.getLastUpdatedTime();
            LOG.info("Last updated time in File-{}", lastUpdatedTimeInFile);
            Long lastUpdatedTimeInCache = (Long) commonDataCache.getFromCache("lastUpdatedTime");
            LOG.info("Last updated time in Cache-{}", lastUpdatedTimeInCache);
            Map<String, DocData> loadedMap = fileData.getDocDataMap();
            if (MapUtils.isEmpty(loadedMap)) {
                loadedMap = new HashMap<>();
            }
            Map<String, ProcessStatusDto> processStatusMap = fileData.getProcessStatusMap();
            if (MapUtils.isEmpty(processStatusMap)) {
                processStatusMap = new HashMap<>();
            }
            if (lastUpdatedTimeInFile != null && (lastUpdatedTimeInCache == null || lastUpdatedTimeInCache < lastUpdatedTimeInFile)) {
                LOG.info("Overwriting data from File");
                commonDataCache.addAllToCache(loadedMap, processStatusMap);
            } else {
                String requestId;
                DocData fileDocData;
                DocData cacheDocData;
                Map<String, String> filePageStatusMap;
                Map<String, String> cachePageStatusMap;
                String pageId;
                String fileStatus;
                String cacheStatus;
                for (Entry<String, DocData> entry : loadedMap.entrySet()) {
                    requestId = entry.getKey();
                    fileDocData = entry.getValue();
                    cacheDocData = (DocData) commonDataCache.getFromCache(requestId);
                    filePageStatusMap = fileDocData.getPageStatusMap();
                    cachePageStatusMap = cacheDocData.getPageStatusMap();
                    for (Entry<String, String> pageStatus : filePageStatusMap.entrySet()) {
                        pageId = pageStatus.getKey();
                        fileStatus = pageStatus.getValue();
                        cacheStatus = cachePageStatusMap.get(pageId);
                        if (StringUtils.equals("IN_PROCESS", cacheStatus) && !StringUtils.equals("IN_PROCESS", fileStatus)) {
                            cachePageStatusMap.put(pageId, fileStatus);
                            LOG.info("PageId: {} status: {} updated", pageId, fileStatus);
                        }
                    }
                    commonDataCache.addToCache(requestId, cacheDocData);
                }
            }
        } catch (Exception e) {
            LOG.error("ErrorCode-{}, Component-{}, Message-{}. Error Loading cache data from file-{}. Exiting system", "OR-51010", "ORCHESTRATION", "Symphony cache loading exception", file.getAbsoluteFile(), e);
            System.exit(0);
        }
    }
}
This is my cache utility class, where the store and retrieve methods are defined:
@Component
public class CommonDataCache {
    private static final Logger LOG = LoggerFactory.getLogger(CommonDataCache.class);

    @Autowired
    HazelcastInstance hazelcastInstance;

    @Autowired
    SymphonyConfig config;

    public static String LAST_UPDATED_TIME = "lastUpdatedTime";
    private IMap<String, Object> hazelcastCache = null;
    private boolean useHazelcast = false;
    private final Map<String, Object> cache = new ConcurrentHashMap<>();

    @PostConstruct
    public void postConstruct() {
        hazelcastCache = hazelcastInstance.getMap("default");
        // Enable only if logging level is DEBUG
        if (LOG.isDebugEnabled()) {
            hazelcastCache.addEntryListener(new HazelcastMapListener(), true);
        }
        useHazelcast = config.getUseHazelcastCache();
    }

    public Map<String, Object> getAllDataFromCache() {
        return hazelcastCache;
    }

    public void addToCache(String key, Object value) {
        if (useHazelcast) {
            hazelcastCache.put(key, value);
            hazelcastCache.put(LAST_UPDATED_TIME, System.currentTimeMillis());
        } else {
            cache.put(key, value);
            cache.put(LAST_UPDATED_TIME, System.currentTimeMillis());
        }
    }

    public Object getAndRemoveFromCache(String key) {
        if (useHazelcast) {
            return hazelcastCache.remove(key);
        } else {
            return cache.remove(key);
        }
    }

    public Object getFromCache(String key) {
        if (useHazelcast) {
            return hazelcastCache.get(key);
        } else {
            return cache.get(key);
        }
    }

    /**
     *
     * @param cacheDataMap
     */
    public void addAllToCache(Map<String, DocData> cacheDataMap, Map<String, ProcessStatusDto> processStatusMap) {
        hazelcastCache.putAll(cacheDataMap);
        hazelcastCache.putAll(processStatusMap);
        hazelcastCache.put(LAST_UPDATED_TIME, System.currentTimeMillis());
    }

    public void lockKey(String key) {
        if (useHazelcast) {
            hazelcastCache.lock(key);
        }
    }

    public void unlockKey(String key) {
        if (useHazelcast) {
            hazelcastCache.unlock(key);
        }
    }

    public Map<String, Object> getByKeyContains(String keyString) {
        Map<String, Object> values;
        if (useHazelcast) {
            Set<String> foundKeys = hazelcastCache.keySet(entry -> ((String) entry.getKey()).contains(keyString));
            values = hazelcastCache.getAll(foundKeys);
        } else {
            values = Maps.filterEntries(cache, entry -> entry.getKey().contains(keyString));
        }
        return values;
    }
}
Here are the maven dependencies for the groovy tests:
<!-- Dependencies for GROOVY TEST -->
<dependency>
    <groupId>com.hazelcast</groupId>
    <artifactId>hazelcast</artifactId>
    <version>${hazelcast.version}</version>
    <scope>provided</scope>
</dependency>
<dependency>
    <groupId>com.hazelcast</groupId>
    <artifactId>hazelcast</artifactId>
    <version>${hazelcast.version}</version>
    <classifier>tests</classifier>
    <scope>test</scope>
</dependency>
<!-- GROOVY TEST FRAMEWORK -->
<dependency>
    <groupId>org.aspectj</groupId>
    <artifactId>aspectjweaver</artifactId>
    <version>${aspectj.version}</version>
</dependency>
<dependency>
    <groupId>junit</groupId>
    <artifactId>junit</artifactId>
    <version>${junit.version}</version>
</dependency>
<dependency>
    <groupId>org.springframework</groupId>
    <artifactId>spring-test</artifactId>
    <version>${org.springframework.version}</version>
</dependency>
<dependency>
    <groupId>io.dropwizard</groupId>
    <artifactId>dropwizard-core</artifactId>
</dependency>
<dependency>
    <groupId>io.dropwizard</groupId>
    <artifactId>dropwizard-assets</artifactId>
</dependency>
<dependency>
    <groupId>io.dropwizard</groupId>
    <artifactId>dropwizard-testing</artifactId>
    <scope>test</scope>
</dependency>
<dependency>
    <groupId>realdoc</groupId>
    <artifactId>dropwizard-spring</artifactId>
    <version>${realdoc.dropwizard-spring.version}</version>
</dependency>
<dependency>
    <groupId>org.codehaus.groovy</groupId>
    <artifactId>groovy-all</artifactId>
    <!-- any version of Groovy >= 1.5.0 should work here -->
    <version>${groovy-all.version}</version>
    <!--<scope>test</scope>-->
</dependency>
<dependency>
    <groupId>org.spockframework</groupId>
    <artifactId>spock-core</artifactId>
    <version>${spock.version}</version>
    <scope>test</scope>
</dependency>
<dependency>
    <groupId>org.spockframework</groupId>
    <artifactId>spock-spring</artifactId>
    <version>${spock.version}</version>
    <scope>test</scope>
</dependency>
<dependency>
    <groupId>cglib</groupId>
    <artifactId>cglib-nodep</artifactId>
    <version>${cglib-nodep.version}</version>
    <scope>test</scope>
</dependency>
<!-- GROOVY TEST FRAMEWORK -->
<build>
    <testSourceDirectory>src/test/groovy</testSourceDirectory>
    <testResources>
        <testResource>
            <directory>src/test/resources</directory>
            <filtering>true</filtering>
        </testResource>
    </testResources>
    ...
    ...
</build>

openCMIS Local binding - JcrServiceFactory with jackRabbit implementation

Hey, there is something wrong with the third alternative: the loop in JcrServiceFactory only keeps properties starting with jcr. (others are not passed along), but right after that, RepositoryFactoryImpl (the Jackrabbit implementation) looks for "org.apache.jackrabbit.repository.home" in the map of properties that was passed along. That doesn't make sense: even if org.apache.jackrabbit.repository.home is present in the original parameters, it doesn't start with PREFIX_JCR_CONFIG, so it is not put into the jcrConfig map that goes to RepositoryFactoryImpl.getRepository().
It would make sense if Map<String, String> map were null, because there is an if (parameters == null) branch in RepositoryFactoryImpl, but an empty map does not take that branch.
It happens in the init method.
JcrServiceFactory.java
private TypeManager typeManager;
private Map<String, String> jcrConfig;
private String mountPath;
private JcrRepository jcrRepository;

@Override
public void init(Map<String, String> parameters) {
    typeManager = new TypeManager();
    readConfiguration(parameters);
    jcrRepository = new JcrRepository(acquireJcrRepository(jcrConfig), mountPath, typeManager);
}
The resulting exception:
Caused by: org.apache.chemistry.opencmis.commons.exceptions.CmisConnectionException: No JCR repository factory for configured parameters
at org.apache.chemistry.opencmis.jcr.JcrServiceFactory.acquireJcrRepository(JcrServiceFactory.java:95)
at org.apache.chemistry.opencmis.jcr.JcrServiceFactory.init(JcrServiceFactory.java:61)
at org.apache.chemistry.opencmis.client.bindings.spi.local.CmisLocalSpi.getSpiInstance(CmisLocalSpi.java:94)
... 34 more
private void readConfiguration(Map<String, String> parameters) {
    Map<String, String> map = new HashMap<String, String>();
    List<String> keys = new ArrayList<String>(parameters.keySet());
    Collections.sort(keys);

    /* the loop is searching for properties starting with jcr.* */
    for (String key : keys) {
        if (key.startsWith(PREFIX_JCR_CONFIG)) {
            String jcrKey = key.substring(PREFIX_JCR_CONFIG.length());
            String jcrValue = replaceSystemProperties(parameters.get(key));
            map.put(jcrKey, jcrValue);
        }
        else if (MOUNT_PATH_CONFIG.equals(key)) {
            mountPath = parameters.get(key);
            log.debug("Configuration: " + MOUNT_PATH_CONFIG + '=' + mountPath);
        }
        else {
            log.warn("Configuration: unrecognized key: " + key);
        }
    }

    jcrConfig = Collections.unmodifiableMap(map);
    log.debug("Configuration: jcr=" + jcrConfig);
}
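A minimal sketch of what that loop does to a prefixed key (my own re-implementation for illustration; the PREFIX_JCR_CONFIG value "jcr." is assumed from the question, and the repository path is made up):

```java
import java.util.HashMap;
import java.util.Map;

public class PrefixStripDemo {
    // Assumed to match the PREFIX_JCR_CONFIG constant in JcrServiceFactory.
    static final String PREFIX_JCR_CONFIG = "jcr.";

    // Replicates the filtering in JcrServiceFactory.readConfiguration:
    // only jcr.* keys survive, and the prefix is cut off.
    static Map<String, String> stripPrefix(Map<String, String> parameters) {
        Map<String, String> jcrConfig = new HashMap<>();
        for (Map.Entry<String, String> e : parameters.entrySet()) {
            if (e.getKey().startsWith(PREFIX_JCR_CONFIG)) {
                jcrConfig.put(e.getKey().substring(PREFIX_JCR_CONFIG.length()), e.getValue());
            }
        }
        return jcrConfig;
    }

    public static void main(String[] args) {
        Map<String, String> parameters = new HashMap<>();
        parameters.put("jcr.org.apache.jackrabbit.repository.home", "/var/jcr-repository");
        // The stripped map now contains exactly the key Jackrabbit looks for.
        System.out.println(stripPrefix(parameters));
        // prints {org.apache.jackrabbit.repository.home=/var/jcr-repository}
    }
}
```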
But here the parameters map is empty ({}), so getRepository returns null: it is looking for RepositoryFactoryImpl.REPOSITORY_HOME, which is org.apache.jackrabbit.repository.home, and finds nothing.
RepositoryFactoryImpl.java
/* parameters = jcrConfig */
public Repository getRepository(Map parameters) throws RepositoryException {
    if (parameters == null) {
        return getRepository(null, Collections.emptyMap());
    } else if (parameters.containsKey(REPOSITORY_HOME)) {
        String home = parameters.get(REPOSITORY_HOME).toString();
        return getRepository(home, parameters);
    } else if (parameters.containsKey(JcrUtils.REPOSITORY_URI)) {
        Object parameter = parameters.get(JcrUtils.REPOSITORY_URI);
        try {
            URI uri = new URI(parameter.toString().trim());
            String scheme = uri.getScheme();
            if (("file".equalsIgnoreCase(scheme)
                    || "jcr-jackrabbit".equalsIgnoreCase(scheme))
                    && uri.getAuthority() == null) {
                File file = new File(uri.getPath());
                if (file.isFile()) {
                    return null; // Not a (possibly missing) directory
                } else {
                    return getRepository(file.getPath(), parameters);
                }
            } else {
                return null; // not a file: or jcr-jackrabbit: URI
            }
        } catch (URISyntaxException e) {
            return null; // not a valid URI
        }
    } else {
        return null; // unknown or insufficient parameters
    }
}
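To see why an empty jcrConfig ends in the CmisConnectionException, a toy version of the containsKey dispatch above can be exercised directly (my own sketch, not the Jackrabbit source; only the REPOSITORY_HOME value is taken from the question):

```java
import java.util.Collections;
import java.util.HashMap;
import java.util.Map;

public class DispatchDemo {
    // Same value as RepositoryFactoryImpl.REPOSITORY_HOME in the question.
    static final String REPOSITORY_HOME = "org.apache.jackrabbit.repository.home";

    // Toy version of the check: returns the configured home directory,
    // or null (mirroring the factory's "return null" branches).
    static String resolveHome(Map<String, String> parameters) {
        if (parameters != null && parameters.containsKey(REPOSITORY_HOME)) {
            return parameters.get(REPOSITORY_HOME);
        }
        return null; // unknown or insufficient parameters -> no factory match
    }

    public static void main(String[] args) {
        // Empty jcrConfig, as in the failing case: null, and the caller
        // then throws "No JCR repository factory for configured parameters".
        System.out.println(resolveHome(Collections.<String, String>emptyMap())); // null

        Map<String, String> ok = new HashMap<>();
        ok.put(REPOSITORY_HOME, "/var/jcr-repository");
        System.out.println(resolveHome(ok)); // /var/jcr-repository
    }
}
```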
<dependencies>
<dependency>
<groupId>javax.jcr</groupId>
<artifactId>jcr</artifactId>
<version>2.0</version>
<scope>compile</scope>
</dependency>
<dependency>
<groupId>org.apache.jackrabbit</groupId>
<artifactId>jackrabbit-core</artifactId>
<version>2.2.4</version>
<scope>compile</scope>
</dependency>
<dependency>
<groupId>org.apache.jackrabbit</groupId>
<artifactId>jackrabbit-api</artifactId>
<version>2.2.4</version>
<scope>compile</scope>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-log4j12</artifactId>
<version>1.5.11</version>
<scope>compile</scope>
</dependency>
<dependency>
<groupId>org.testng</groupId>
<artifactId>testng</artifactId>
<version>5.14</version>
<type>jar</type>
<scope>compile</scope>
</dependency>
<dependency>
<groupId>org.apache.chemistry.opencmis</groupId>
<artifactId>chemistry-opencmis-server-jcr</artifactId>
<version>0.3.0-incubating-SNAPSHOT</version>
<classifier>classes</classifier>
</dependency>
<dependency>
<groupId>org.apache.chemistry.opencmis</groupId>
<artifactId>chemistry-opencmis-client-bindings</artifactId>
<version>0.3.0-incubating-SNAPSHOT</version>
</dependency>
<dependency>
<groupId>org.apache.chemistry.opencmis</groupId>
<artifactId>chemistry-opencmis-client-api</artifactId>
<version>0.3.0-incubating-SNAPSHOT</version>
</dependency>
<dependency>
<groupId>org.apache.chemistry.opencmis</groupId>
<artifactId>chemistry-opencmis-client-impl</artifactId>
<version>0.3.0-incubating-SNAPSHOT</version>
</dependency>
The answer is right in the loop I complained about :-)
String jcrKey = key.substring(PREFIX_JCR_CONFIG.length());
It's a substring, so it cuts the jcr. prefix off and the rest goes on...
parameters.put("jcr.org.apache.jackrabbit.repository.home", repositoryHome);
It's tricky, and one kind of needs to figure all this out from debugging.
You need to configure your 'repository.properties' at 'WEB-INF/classes' with the entry below:
jcr.org.apache.jackrabbit.repository.home=${user.home}/jcr-repository (your repository location).
Cheers.
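The ${user.home} placeholder works because readConfiguration runs each value through replaceSystemProperties before storing it. A rough sketch of that substitution (my own re-implementation for illustration, not the OpenCMIS source):

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class SystemPropertyDemo {
    // Hypothetical re-implementation: every ${name} token in the value
    // is replaced with the matching Java system property, if one exists.
    static String replaceSystemProperties(String value) {
        Matcher m = Pattern.compile("\\$\\{([^}]+)\\}").matcher(value);
        StringBuffer sb = new StringBuffer();
        while (m.find()) {
            String prop = System.getProperty(m.group(1));
            // Leave the token untouched when the property is not defined.
            m.appendReplacement(sb, Matcher.quoteReplacement(prop == null ? m.group(0) : prop));
        }
        m.appendTail(sb);
        return sb.toString();
    }

    public static void main(String[] args) {
        // Prints something like /home/you/jcr-repository, depending on the JVM.
        System.out.println(replaceSystemProperties("${user.home}/jcr-repository"));
    }
}
```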
