Ehcache Map<String, Entry> not working in Spring Boot - java

I tried to cache a Map<String, Entry>, but getEntries() hits the database every time without caching. I also made the Entry object serializable. Any support is appreciated.
@Cacheable("stocks")
public Map<String, Entry> getEntries() {
    // get the entries from the database, then convert them to a map
    return map;
}

This works for me:
@Service
public class OrderService {

    public static int counter = 0;

    @Cacheable("stocks")
    public Map<String, Entry> getEntries() {
        counter++;
        final Map<String, Entry> map = new HashMap<>();
        map.put("key", new Entry(123L, "interesting entry"));
        return map;
    }
}
Here's a test to prove that the second call is served from the cache, so the method body (and with it the counter) runs only once.
@Test
public void entry() throws Exception {
    OrderService.counter = 0;

    orderService.getEntries();
    assertEquals(1, OrderService.counter);

    orderService.getEntries();
    assertEquals(1, OrderService.counter);
}
I've added it all to my GitHub example.
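If the cache still isn't hit with this setup, two things are worth checking (assumptions, since the original configuration isn't shown): Spring's caching support has to be enabled explicitly, and @Cacheable only takes effect when the method is called through the Spring proxy, i.e. from another bean rather than via this.getEntries(). A minimal sketch of the enabling step:

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.cache.annotation.EnableCaching;

// Without @EnableCaching, @Cacheable annotations are silently ignored.
@SpringBootApplication
@EnableCaching
public class Application {
    public static void main(String[] args) {
        SpringApplication.run(Application.class, args);
    }
}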

Related

Apache Flink Process xml and write them to database

I have the following use case.
XML files are written to a Kafka topic, which I want to consume and process via Flink.
The XML attributes have to be renamed to match the database table columns. These renames have to be flexible and maintainable from outside the Flink job.
At the end, the attributes have to be written to the database.
Each XML document represents a database record.
As a second step, some attributes of all XML documents from the last x minutes have to be aggregated.
As far as I know, Flink is capable of all the mentioned steps, but I am lacking an idea of how to implement it correctly.
Currently I have implemented the Kafka source, retrieve the XML document, and parse it via a custom MapFunction. There I create a POJO and store each attribute name and value in a HashMap.
public class Data {
    private Map<String, String> attributes = new HashMap<>();
}
The HashMap contains entries like:
Key: path.to.attribute.one, Value: value of attribute one
Now I would like to use broadcast state to change the original attribute names to the database column names.
This is where I am stuck: I have my POJO data with the attributes inside the HashMap, but I don't know how to connect it with the mapping via broadcasting.
Another way would be to flatMap the XML document attributes into single records. That leaves me with two problems:
How to ensure that attributes from one document don't get mixed up with those from another document within the stream
How to merge all the attributes of one document back together, so they can be inserted as one record into the database
For the second stage I am aware of the window functions, even if I haven't understood them in every detail, but I guess they would fit my requirement. The question at this stage is whether I can use more than one sink in one job, where one would be a stream of the raw data and one a stream of the aggregated data.
Can someone help with a hint?
Cheers
UPDATE
Here is what I got so far. I simplified the code; the XmlData POJO represents my parsed XML document.
public class StreamingJob {

    static Logger LOG = LoggerFactory.getLogger(StreamingJob.class);

    public static void main(String[] args) throws Exception {
        // set up the streaming execution environment
        final StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        XmlData xmlData1 = new XmlData();
        xmlData1.addAttribute("path.to.attribute.eventName", "Start");
        xmlData1.addAttribute("second.path.to.attribute.eventTimestamp", "2020-11-18T18:00:00.000");
        xmlData1.addAttribute("third.path.to.attribute.eventSource", "Source1");
        xmlData1.addAttribute("path.to.attribute.additionalAttribute", "Lorem");

        XmlData xmlData2 = new XmlData();
        xmlData2.addAttribute("path.to.attribute.eventName", "Start");
        xmlData2.addAttribute("second.path.to.attribute.eventTimestamp", "2020-11-18T18:00:01.000");
        xmlData2.addAttribute("third.path.to.attribute.eventSource", "Source2");
        xmlData2.addAttribute("path.to.attribute.additionalAttribute", "First");

        XmlData xmlData3 = new XmlData();
        xmlData3.addAttribute("path.to.attribute.eventName", "Start");
        xmlData3.addAttribute("second.path.to.attribute.eventTimestamp", "2020-11-18T18:00:01.000");
        xmlData3.addAttribute("third.path.to.attribute.eventSource", "Source1");
        xmlData3.addAttribute("path.to.attribute.additionalAttribute", "Day");

        Mapping mapping1 = new Mapping();
        mapping1.addMapping("path.to.attribute.eventName", "EVENT_NAME");
        mapping1.addMapping("second.path.to.attribute.eventTimestamp", "EVENT_TIMESTAMP");

        DataStream<Mapping> mappingDataStream = env.fromElements(mapping1);

        MapStateDescriptor<String, Mapping> mappingStateDescriptor = new MapStateDescriptor<String, Mapping>(
                "MappingBroadcastState",
                BasicTypeInfo.STRING_TYPE_INFO,
                TypeInformation.of(new TypeHint<Mapping>() {}));

        BroadcastStream<Mapping> mappingBroadcastStream = mappingDataStream.broadcast(mappingStateDescriptor);

        DataStream<XmlData> dataDataStream = env.fromElements(xmlData1, xmlData2, xmlData3);

        //Convert the xml with all attributes to a stream of attribute names and values
        DataStream<Tuple2<String, String>> recordDataStream = dataDataStream
                .flatMap(new CustomFlatMapFunction());

        //Map the attributes with the mapping information
        DataStream<Tuple2<String, String>> outputDataStream = recordDataStream
                .connect(mappingBroadcastStream)
                .process(); // this is where I am stuck: which function do I pass here?

        env.execute("Process xml data and write it to database");
    }
    static class XmlData {

        private Map<String, String> attributes = new HashMap<>();

        public XmlData() {
        }

        public String toString() {
            return this.attributes.toString();
        }

        public Map<String, String> getColumns() {
            return this.attributes;
        }

        public void addAttribute(String key, String value) {
            this.attributes.put(key, value);
        }

        public String getAttributeValue(String attributeName) {
            return attributes.get(attributeName);
        }
    }

    static class Mapping {

        //First string is the attribute path and name
        //Second string is the database column name
        Map<String, String> mappingTuple = new HashMap<>();

        public Mapping() {}

        public void addMapping(String attributeNameWithPath, String databaseColumnName) {
            this.mappingTuple.put(attributeNameWithPath, databaseColumnName);
        }

        public Map<String, String> getMappingTuple() {
            return mappingTuple;
        }

        public void setMappingTuple(Map<String, String> mappingTuple) {
            this.mappingTuple = mappingTuple;
        }
    }

    static class CustomFlatMapFunction implements FlatMapFunction<XmlData, Tuple2<String, String>> {

        @Override
        public void flatMap(XmlData xmlData, Collector<Tuple2<String, String>> collector) throws Exception {
            for (Map.Entry<String, String> entrySet : xmlData.getColumns().entrySet()) {
                collector.collect(new Tuple2<>(entrySet.getKey(), entrySet.getValue()));
            }
        }
    }

    static class CustomBroadcastingFunction extends BroadcastProcessFunction {

        @Override
        public void processElement(Object o, ReadOnlyContext readOnlyContext, Collector collector) throws Exception {
        }

        @Override
        public void processBroadcastElement(Object o, Context context, Collector collector) throws Exception {
        }
    }
}
Here's some example code of how to do this using a BroadcastStream. There's a subtle issue where the attribute remapping data might show up after one of the records. Normally you'd use a timer with state to hold onto any records that are missing remapping data, but in your case it's unclear whether a missing remapping is a "need to wait longer" or "no mapping exists". In any case, this should get you started...
private static MapStateDescriptor<String, String> REMAPPING_STATE = new MapStateDescriptor<>("remappings", String.class, String.class);

@Test
public void testUnkeyedStreamWithBroadcastStream() throws Exception {
    StreamExecutionEnvironment env = StreamExecutionEnvironment.createLocalEnvironment(2);

    List<Tuple2<String, String>> attributeRemapping = new ArrayList<>();
    attributeRemapping.add(new Tuple2<>("one", "1"));
    attributeRemapping.add(new Tuple2<>("two", "2"));
    attributeRemapping.add(new Tuple2<>("three", "3"));
    attributeRemapping.add(new Tuple2<>("four", "4"));
    attributeRemapping.add(new Tuple2<>("five", "5"));
    attributeRemapping.add(new Tuple2<>("six", "6"));

    BroadcastStream<Tuple2<String, String>> attributes = env.fromCollection(attributeRemapping)
            .broadcast(REMAPPING_STATE);

    List<Map<String, Integer>> xmlData = new ArrayList<>();
    xmlData.add(makePOJO("one", 10));
    xmlData.add(makePOJO("two", 20));
    xmlData.add(makePOJO("three", 30));
    xmlData.add(makePOJO("four", 40));
    xmlData.add(makePOJO("five", 50));

    DataStream<Map<String, Integer>> records = env.fromCollection(xmlData);

    records.connect(attributes)
            .process(new MyRemappingFunction())
            .print();

    env.execute();
}

private Map<String, Integer> makePOJO(String key, int value) {
    Map<String, Integer> result = new HashMap<>();
    result.put(key, value);
    return result;
}

@SuppressWarnings("serial")
private static class MyRemappingFunction extends BroadcastProcessFunction<Map<String, Integer>, Tuple2<String, String>, Map<String, Integer>> {

    @Override
    public void processBroadcastElement(Tuple2<String, String> in, Context ctx, Collector<Map<String, Integer>> out) throws Exception {
        ctx.getBroadcastState(REMAPPING_STATE).put(in.f0, in.f1);
    }

    @Override
    public void processElement(Map<String, Integer> in, ReadOnlyContext ctx, Collector<Map<String, Integer>> out) throws Exception {
        final ReadOnlyBroadcastState<String, String> state = ctx.getBroadcastState(REMAPPING_STATE);

        Map<String, Integer> result = new HashMap<>();
        for (String key : in.keySet()) {
            if (state.contains(key)) {
                result.put(state.get(key), in.get(key));
            } else {
                result.put(key, in.get(key));
            }
        }

        out.collect(result);
    }
}
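As for the second part of the question: yes, a single Flink job can have multiple sinks, and the same stream can feed both a raw sink and a windowed aggregation. A rough sketch (the sink instances, the key selector, and MyAggregateFunction are placeholders, not part of the code above):

// Continuing from the remapping step above.
DataStream<Map<String, Integer>> remapped = records
        .connect(attributes)
        .process(new MyRemappingFunction());

// Sink 1: the raw remapped records (hypothetical sink implementation).
remapped.addSink(rawDatabaseSink);

// Sink 2: aggregates over the last x minutes (placeholder key selector,
// window size, and AggregateFunction).
remapped
        .keyBy(record -> "all")
        .window(TumblingProcessingTimeWindows.of(Time.minutes(5)))
        .aggregate(new MyAggregateFunction())
        .addSink(aggregateSink);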

Unit Testing a simple in-memory cache

I have a simple in-memory cache that looks like (oversimplified) below:
class Cache {
    private Map<String, Integer> m = new HashMap<>();

    public Integer get(String key) {
        if (!m.containsKey(key))
            return null;
        return m.get(key);
    }

    public void set(String key, Integer value) {
        m.put(key, value);
    }

    public Set<String> getKeys() {
        return m.keySet();
    }
}
Now I want to write Unit Tests for this class. I was thinking I can do something along the lines of:
@Test
public void get_Success() {
    Cache c = new Cache();
    c.set("Apple", 1);
    c.set("Grape", 2);
    c.set("Banana", 3);

    assertEquals(Integer.valueOf(1), c.get("Apple"));
    assertEquals(Integer.valueOf(2), c.get("Grape"));
    assertEquals(Integer.valueOf(3), c.get("Banana"));
}
But the problem is, it tests both Set and Get methods. So if the test breaks, I will not know whether the bug is in Set or Get.
What is the recommended way to test the Get and Set methods individually?
One way I thought was, to modify the Cache class to accept a Map<String, Integer> instance like:
class Cache {
private Map<String, Integer> m = new HashMap<>();
public Cache(Map<String, Integer> m) {
this.m = m;
}
...
...
}
Then from the test, I can create a Map instance and pass it to the Cache constructor. All the set operations will then happen in the map instance that I passed, and I can validate whether set worked as expected.
But I am not very sure whether it's the right approach.
Similarly, I want the ability to test Get as well without depending on Set.
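For illustration, a sketch of how tests could then target each method in isolation (JUnit 4 assumed, together with the two-argument constructor proposed above):

@Test
public void set_writesToBackingMap() {
    Map<String, Integer> backing = new HashMap<>();
    Cache c = new Cache(backing);

    c.set("Apple", 1);

    // Verified against the injected map directly, so get() is never involved.
    assertEquals(Integer.valueOf(1), backing.get("Apple"));
}

@Test
public void get_readsFromBackingMap() {
    Map<String, Integer> backing = new HashMap<>();
    backing.put("Apple", 1);
    Cache c = new Cache(backing);

    // The entry was seeded directly into the map, so set() is never involved.
    assertEquals(Integer.valueOf(1), c.get("Apple"));
}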

DynamoDBMapper unconvert attribute error

I am getting JSON from DynamoDB that looks like this:
{
    "id": "1234",
    "payment": {
        "payment_id": "2345",
        "user_defined": {
            "some_id": "3456"
        }
    }
}
My aim is to get the user_defined field into a Java HashMap<String, Object>, since user_defined can contain arbitrary user-defined fields that are unknown until the data arrives. Everything works fine except that my DynamoDBMapper cannot convert the user_defined field to a Java HashMap. It is throwing this error:
Exception occured Response[payment]; could not unconvert attribute
This is what the classes look like:
@DynamoDBTable(tableName = "PaymentDetails")
public class Response {

    private String id;
    private Payment payment = new Payment();

    public Response() {
    }

    @DynamoDBHashKey(attributeName = "id")
    public String getId() { return id; }
    public void setId(String id) { this.id = id; }

    public Payment getPayment() {
        return payment;
    }
    public void setPayment(Payment payment) {
        this.payment = payment;
    }
}
The payment field mapper -
@DynamoDBDocument
public class Payment {

    private String payment_id;
    private HashMap<String, Object> user_defined;

    public Payment() {}

    public String getPayment_id() {
        return payment_id;
    }
    public void setPayment_id(String payment_id) {
        this.payment_id = payment_id;
    }

    @DynamoDBTypeConverted(converter = HashMapMarshaller.class)
    public HashMap<String, Object> getUser_defined() {
        return user_defined;
    }
    public void setUser_defined(HashMap<String, Object> user_defined) {
        this.user_defined = user_defined;
    }
}
The HashMapMarshaller (just to check whether the Gson call was the problem, I simply build a HashMap, put in a value, and return it, but it still doesn't work):
public class HashMapMarshaller implements DynamoDBTypeConverter<String, HashMap<String, Object>> {

    @Override
    public String convert(HashMap<String, Object> hashMap) {
        return new Gson().toJson(hashMap);
    }

    @Override
    public HashMap<String, Object> unconvert(String jsonString) {
        System.out.println("jsonString received for unconverting is " + jsonString);
        System.out.println("Unconverting attribute");
        HashMap<String, Object> hashMap = new HashMap<>();
        hashMap.put("key", "value");
        return hashMap;
        //return new Gson().fromJson(jsonString, new TypeToken<HashMap<String, Object>>(){}.getType());
    }
}
The marshaller approach is so far not working for me. It is also not printing any of the printlns I've put in there. I've also tried using @DynamoDBTyped(DynamoDBMapperFieldModel.DynamoDBAttributeType.M) on the user_defined getter, and using Map instead of HashMap, to no avail.
I want to find out how to convert the user_defined field to Java HashMap or Map. Any help is appreciated. Thank you!
Change Map<String, Object> to Map<String, String>. It should work without any custom converters. Otherwise, be specific about the map's value type: for example, Map<String, SimplePojo> should work. Don't forget to annotate the SimplePojo class with @DynamoDBDocument.
With Object as the type of the map's values, DynamoDB is not able to decide which object it has to create when reading an entry. It needs to know a specific type, like String, Integer, or SimplePojo.
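For illustration, a minimal sketch of that suggestion (assuming all user-defined values are strings):

@DynamoDBDocument
public class Payment {

    private String payment_id;
    // A specific value type: the mapper can unconvert this without a custom converter.
    private Map<String, String> user_defined;

    public String getPayment_id() { return payment_id; }
    public void setPayment_id(String payment_id) { this.payment_id = payment_id; }

    public Map<String, String> getUser_defined() { return user_defined; }
    public void setUser_defined(Map<String, String> user_defined) { this.user_defined = user_defined; }
}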

sorting a List<Map<String, String>>

I have a list of a map of strings:
List<Map<String, String>> list = new ArrayList<Map<String, String>>();
This gets populated with the following:
Map<String, String> action1 = new LinkedHashMap<>();
action1.put("name", "CreateFirstName");
action1.put("nextAction", "CreateLastName");

Map<String, String> action2 = new LinkedHashMap<>();
action2.put("name", "CreateAddress");
action2.put("nextAction", "CreateEmail");

Map<String, String> action3 = new LinkedHashMap<>();
action3.put("name", "CreateLastName");
action3.put("nextAction", "CreateAddress");

Map<String, String> action4 = new LinkedHashMap<>();
action4.put("name", "CreateEmail");

list.add(action1);
list.add(action2);
list.add(action3);
list.add(action4);
action4 doesn't have a nextAction because it is the last action, but it might be easier to just give it a nextAction that is a placeholder for no next action?
Question: How can I sort my list so that the actions are in order?
I.e., the nextAction of an action is the same as the name of the next action in the list.
Although this seems to be a case of the XY problem, and this list of maps is certainly not a "nicely designed data model" (there is likely a representation that is "better" in many ways, although nobody can recommend the "best" model as long as the overall goal is not known), this is the task at hand, and here is how it can be solved:
First of all, you have to determine the first element of the sorted list. This is exactly the map that has a "name" entry that does not appear as the "nextAction" entry of any other map.
After you have this first map, you can add it to the (sorted) list. Then, determining the next element boils down to finding the map whose "name" is the same as the "nextAction" of the previous map. To quickly find these successors, you can build a map that maps each "name" entry to the map itself.
Here is a basic implementation of this sorting approach:
import java.util.ArrayList;
import java.util.Collections;
import java.util.LinkedHashMap;
import java.util.LinkedHashSet;
import java.util.List;
import java.util.Map;
import java.util.Set;

public class SortListWithMaps
{
    public static void main(String[] args)
    {
        List<Map<String, String>> list = new ArrayList<Map<String, String>>();

        Map<String, String> action1 = new LinkedHashMap<>();
        action1.put("name", "CreateFirstName");
        action1.put("nextAction", "CreateLastName");

        Map<String, String> action2 = new LinkedHashMap<>();
        action2.put("name", "CreateAddress");
        action2.put("nextAction", "CreateEmail");

        Map<String, String> action3 = new LinkedHashMap<>();
        action3.put("name", "CreateLastName");
        action3.put("nextAction", "CreateAddress");

        Map<String, String> action4 = new LinkedHashMap<>();
        action4.put("name", "CreateEmail");

        list.add(action1);
        list.add(action2);
        list.add(action3);
        list.add(action4);

        // Make it a bit more interesting...
        Collections.shuffle(list);

        System.out.println("Before sorting");
        for (Map<String, String> map : list)
        {
            System.out.println(map);
        }

        List<Map<String, String>> sortedList = sort(list);

        System.out.println("After sorting");
        for (Map<String, String> map : sortedList)
        {
            System.out.println(map);
        }
    }

    private static List<Map<String, String>> sort(
        List<Map<String, String>> list)
    {
        // Compute a map from "name" to the actual map
        Map<String, Map<String, String>> nameToMap =
            new LinkedHashMap<String, Map<String, String>>();
        for (Map<String, String> map : list)
        {
            String name = map.get("name");
            nameToMap.put(name, map);
        }

        // Determine the first element for the sorted list. For that,
        // create the set of all names, and remove all of them that
        // appear as the "nextAction" of another entry
        Set<String> names =
            new LinkedHashSet<String>(nameToMap.keySet());
        for (Map<String, String> map : list)
        {
            String nextAction = map.get("nextAction");
            names.remove(nextAction);
        }
        if (names.size() != 1)
        {
            System.out.println("Multiple possible first elements: " + names);
            return null;
        }

        // Insert the elements, in sorted order, into the result list
        List<Map<String, String>> result =
            new ArrayList<Map<String, String>>();
        String currentName = names.iterator().next();
        while (currentName != null)
        {
            Map<String, String> element = nameToMap.get(currentName);
            result.add(element);
            currentName = element.get("nextAction");
        }

        return result;
    }
}
Instead of using a Map to store the properties of an action (the name and the nextAction), create your own type that's composed of those properties:
class Action {
    private String name;
    //nextAction

    public void perform() {
        //do current action
        //use nextAction to perform the next action
    }
}
The nextAction can now be a reference to the next action:
abstract class Action {
    private String name;
    private Action nextAction;

    public Action(String name, Action nextAction) {
        this.name = name;
        this.nextAction = nextAction;
    }

    public final void perform() {
        perform(name);
        nextAction.perform();
    }

    protected abstract void perform(String name);
}
You can now create your actions by subtyping the Action class:
class CreateFirstName extends Action {
    public CreateFirstName(Action nextAction) {
        super("CreateFirstName", nextAction);
    }

    protected final void perform(String name) {
        System.out.println("Performing " + name);
    }
}
And chain them together:
Action action = new CreateFirstName(new CreateLastName(new CreateEmail(...)));
The nested expressions can get pretty messy, but we'll get to that later. There's a bigger problem here.
action4 doesn't have a nextAction because it is the last action, but might be easier to just give it a nextAction that is a placeholder for no next action
The same problem applies to the code above.
Right now, every action must have a next action, due to the constructor Action(String, Action). We could take the easy route and pass in a placeholder for no next action (null being the easiest route):
class End extends Action {
    public End() {
        super("", null);
    }

    protected void perform(String name) {
        //do nothing
    }
}
And do a null check:
//class Action
public void perform() {
    perform(name);
    if (nextAction != null) {
        nextAction.perform(); //performs the next action
    }
}
But this would be a code smell. You can stop reading here and use the simple fix, or continue below for the more involved (and educational) route.
There's a good chance that when you do use null, you're falling victim to a code smell. Although it doesn't apply to all cases (due to Java's poor null safety), you should try to avoid null if possible. Instead, rethink your design as in this example. If all else fails, use Optional.
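For completeness, a sketch of the Optional route (the class and method names here are illustrative, not part of the final design below):

import java.util.Optional;

abstract class ChainedAction {
    private final String name;
    private final Optional<ChainedAction> next;

    protected ChainedAction(String name, Optional<ChainedAction> next) {
        this.name = name;
        this.next = next;
    }

    public final void perform() {
        perform(name);
        // "No next action" is explicit in the type, so no null check is needed.
        next.ifPresent(ChainedAction::perform);
    }

    protected abstract void perform(String name);
}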
The last action is not the same as the other actions. It can still perform like the other, but it has different property requirements.
This means they could both share the same behavior abstraction, but must differ when it comes to defining properties:
interface Action {
    void perform();
}

abstract class ContinuousAction implements Action {
    private String name;
    private Action nextAction;

    public ContinuousAction(String name, Action nextAction) {
        this.name = name;
        this.nextAction = nextAction;
    }

    public final void perform() {
        perform(name);
        nextAction.perform();
    }

    protected abstract void perform(String name);
}

abstract class PlainAction implements Action {
    private String name;

    public PlainAction(String name) {
        this.name = name;
    }

    public final void perform() {
        perform(name);
    }

    protected abstract void perform(String name);
}
The last action would extend PlainAction, while the others would extend ContinuousAction.
Lastly, to prevent long chains:
new First(new Second(new Third(new Fourth(new Fifth(new Sixth(new Seventh(new Eighth(new Ninth(new Tenth())))))))))
You could specify the next action within each concrete action:
class CreateFirstName extends ContinuousAction {
    public CreateFirstName() {
        super("CreateFirstName", new CreateLastName());
    }
    //...
}

class CreateLastName extends ContinuousAction {
    public CreateLastName() {
        super("CreateLastName", new CreateEmail());
    }
    //...
}

class CreateEmail extends PlainAction {
    public CreateEmail() {
        super("CreateEmail");
    }
    //...
}
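Wiring up and running the chain then collapses to a single call (a sketch, assuming each perform(String) prints its name the way CreateFirstName does above):

Action flow = new CreateFirstName(); // CreateLastName and CreateEmail are created internally
flow.perform();
// Performing CreateFirstName
// Performing CreateLastName
// Performing CreateEmail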
The ContinuousAction and PlainAction can be abstracted further. They are both named actions (they have names), and that property affects their contract in the same way (passing the name to the template method perform(String)):
abstract class NamedAction implements Action {
    private String name;

    public NamedAction(String name) {
        this.name = name;
    }

    public final void perform() {
        perform(name);
    }

    protected abstract void perform(String name);
}
//class ContinuousAction extends NamedAction
//class PlainAction extends NamedAction

CacheBuilder.newBuilder().build is ambiguous

I'm trying to use a snippet of code for a Stash plugin, but the compiler keeps giving me an error that I can't seem to solve. It's using com.google.common.cache.Cache (Guava)
static final RepositorySettings DEFAULT_SETTINGS = new RepositorySettings(0);

private final PluginSettings pluginSettings;

private final Cache<Integer, RepositorySettings> cache = CacheBuilder.newBuilder().build(
    new CacheLoader<Integer, RepositorySettings>()
    {
        @Override
        public RepositorySettings load(@Nonnull Integer repositoryId)
        {
            @SuppressWarnings("unchecked")
            Map<String, String> data = (Map) pluginSettings.get(repositoryId.toString());
            return data == null ? DEFAULT_SETTINGS : deserialize(data);
        }
    });
The .build is giving me the following error
The method build(CacheLoader<? super Integer,RepositorySettings>) is ambiguous for the type CacheBuilder<Object,Object>
CacheBuilder has a build() method that takes no parameters and returns a plain Cache; the build(CacheLoader) overload returns a LoadingCache instead. Since you pass a CacheLoader, declare the field as a LoadingCache:
private final LoadingCache<Integer, RepositorySettings> cache = CacheBuilder.newBuilder().build(
    new CacheLoader<Integer, RepositorySettings>() {
        @Override
        public RepositorySettings load(@Nonnull Integer repositoryId) {
            @SuppressWarnings("unchecked")
            Map<String, String> data = (Map) pluginSettings.get(repositoryId.toString());
            return data == null ? DEFAULT_SETTINGS : deserialize(data);
        }
    });
This should work.
As reference:
http://docs.guava-libraries.googlecode.com/git/javadoc/com/google/common/cache/CacheBuilder.html
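Once the field is a LoadingCache, lookups go through the loader automatically; a short sketch:

// getUnchecked() invokes the CacheLoader on a miss and caches the result.
// Use get(key) instead if the loader can throw a checked exception.
RepositorySettings settings = cache.getUnchecked(repositoryId);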
