Best design pattern for a scenario - java

We have a class called Variable which represents a single value or a compound value. For example, it can hold an integer, a boolean, or a String (single valued), or some compound value which can be a list of Strings, integers or other Variables.
We serialize these objects, and in the stream all these values are represented as strings, so a type conversion happens whenever we serialize or deserialize.
There are also some optional features, or ways you can fill values into these variables. For example, you can define a Variable to be populated from a webpage: for a given Variable we query a cache to find out whether it should be populated from a webpage, and whenever someone calls getValue() on the Variable we populate the value.
We also want to track changes to some variables. For example, I can choose to record or perform some action whenever the value of a variable is read or changed.
As you can see, this is a hierarchical structure, because a Variable can contain other Variables. We wanted to find the best way to solve this.
Currently we have only one class called Variable, which has many if/else conditions, and the code is very complex.
For example, getValue() code does the following:
if(query the cache to see if it needs population from webpage)
do something
else(---)
do something
else(if read should be recorded-find from cache)
do something etc...
Is there any pattern to design my classes in such a way that all my populate-from-webpage logic can go into one class, tracking logic into another class, type conversion logic into yet another class, etc., to make it more readable?

Chain of Responsibility: each chained element in the Composite gets to do its bit, but you have to spend some time configuring the runtime structure just so.
Possibly just a Composite or Observer for the getValue() scenario (but it sounds more like Composite to me).
EDIT:
One could argue that the implementation below is in fact a case of "Chain of Responsibility", as a composite variable will delegate the responsibility of setting values to its children.
END EDIT
Here's a simple example using Observer and Composite. NOT TESTED, just to give you a general feel for the solution...
I have not implemented stuff like serializing/deserializing.
In this solution you have compound values and atomic values, and you can add some observer to be executed before value is set.
package dk.asj.variables;
public abstract class VariableBase {
    public interface Observer {
        void onSet(final Value val, final VariableBase var);
    }

    private Observer obs = null;

    public void setObserver(final Observer obs) {
        this.obs = obs;
    }

    public void setValue(final Value val) {
        if (obs != null) {
            obs.onSet(val, this);
        }
        internalSetValue(val);
    }

    protected abstract void internalSetValue(final Value val);

    public abstract Value getValue();
}
package dk.asj.variables;
import java.util.List;
public interface Value {
    int getIntValue();
    String getStringValue();
    List<Value> getCompositeValue();
}
package dk.asj.variables;
public class SimpleVariable extends VariableBase {
    private Value val = null;

    @Override
    protected void internalSetValue(final Value val) {
        this.val = val;
    }

    @Override
    public Value getValue() {
        return val;
    }
}
package dk.asj.variables;
import java.util.ArrayList;
import java.util.LinkedList;
import java.util.List;
public class CompoundVariable extends VariableBase {
    final List<VariableBase> children = new LinkedList<VariableBase>();

    public void addChild(final VariableBase c) {
        children.add(c);
    }

    @Override
    protected void internalSetValue(final Value val) {
        for (int i = 0; i < val.getCompositeValue().size(); ++i) {
            children.get(i).setValue(val.getCompositeValue().get(i));
        }
    }

    @Override
    public Value getValue() {
        final List<Value> res = new ArrayList<Value>(children.size());
        for (final VariableBase var : children) {
            res.add(var.getValue());
        }
        return new Value() {
            @Override
            public int getIntValue() {
                throw new RuntimeException("This is a composite value");
            }

            @Override
            public String getStringValue() {
                throw new RuntimeException("This is a composite value");
            }

            @Override
            public List<Value> getCompositeValue() {
                return res;
            }
        };
    }
}
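To make the Chain of Responsibility suggestion above a bit more concrete: each getValue() concern (webpage population, read tracking, and so on) could live in its own handler, and the Variable would just run its value through the chain. The sketch below is only illustrative and untested; names like ValueHandler, WebPopulationHandler and TrackingHandler, and the cache/webpage calls, are made up for the example.

public interface ValueHandler {
    // Each handler does its bit and then delegates to the next element in the chain.
    Value handle(VariableBase variable, Value current);
}

public class WebPopulationHandler implements ValueHandler {
    private final ValueHandler next;

    public WebPopulationHandler(final ValueHandler next) {
        this.next = next;
    }

    @Override
    public Value handle(final VariableBase variable, final Value current) {
        // Hypothetical: ask the cache whether this variable is webpage-backed, fetch if so.
        final Value value = shouldPopulateFromWebpage(variable) ? fetchFromWebpage(variable) : current;
        return next == null ? value : next.handle(variable, value);
    }

    private boolean shouldPopulateFromWebpage(final VariableBase variable) {
        return false; // placeholder for the cache lookup
    }

    private Value fetchFromWebpage(final VariableBase variable) {
        return null; // placeholder for the webpage call
    }
}

public class TrackingHandler implements ValueHandler {
    private final ValueHandler next;

    public TrackingHandler(final ValueHandler next) {
        this.next = next;
    }

    @Override
    public Value handle(final VariableBase variable, final Value current) {
        // Hypothetical: record that the value was read before passing it along.
        System.out.println("read: " + current);
        return next == null ? current : next.handle(variable, current);
    }
}

getValue() would then only do something like chain.handle(this, val), so each concern stays in its own class.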

I'm not sure if this answers your question; however, it could lead to some new ideas. Here is what I came up with in a similar situation:
I named these DynamicVariables. A dynamic variable may have a default value or be evaluated by a lambda (Java 8) / anonymous inner class (pre-Java 8).
Each variable has an evaluation context and can be evaluated only in a context, i.e. a Session context or a Global context. Contexts fall back to each other and form a hierarchy, i.e. the Session context falls back to the Global context. So the default variable constant or lambda value can be shadowed by a lambda or a constant defined in a context. For instance, session-scoped variables shadow global vars when they are accessed inside a session.
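A minimal sketch of the fallback idea, with made-up names (VariableContext here is not the real API):

import java.util.HashMap;
import java.util.Map;
import java.util.function.Supplier;

public class VariableContext {
    private final VariableContext parent; // e.g. the Session context falls back to the Global context
    private final Map<String, Supplier<Object>> vars = new HashMap<>();

    public VariableContext(VariableContext parent) {
        this.parent = parent;
    }

    public void define(String name, Supplier<Object> supplier) {
        vars.put(name, supplier);
    }

    public Object evaluate(String name) {
        Supplier<Object> s = vars.get(name);
        if (s != null) {
            return s.get(); // a definition in this context shadows the parent's
        }
        return parent != null ? parent.evaluate(name) : null; // fall back to the enclosing context
    }
}

// Usage: a session-scoped definition shadows the global one
VariableContext global = new VariableContext(null);
global.define("buildDir", () -> "/var/builds");
VariableContext session = new VariableContext(global);
session.define("buildDir", () -> "/var/builds/session-42");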
And this appeared to be quite a flexible approach - I even implemented a trivial dependency injection by introducing an InjectionContext, which is a thread-safe context holding an object being wired.
You might want to have a look at an example of how this is used in a deployment tool I'm currently developing. Configuration management and shared application logic there are built upon these variables. The code is under the bear.context package, but it's rather raw at the moment.

Java casting an object passed to method to its original type

I have a list called itemsData of objects of class EtcStruct, but the class can differ depending on the file I want to use (the class is full of variables with setters and getters):
ObservableList<EtcStruct> itemsData = FXCollections.observableArrayList();
I'm passing it to a method that's supposed to work for any object type I choose and run the invoked method from the file.
public static void parseToFile(ObservableList itemsData){
    EtcStruct itemObject = (EtcStruct) itemsData.get(0);
    System.out.print((int) reflectedmethod.invoke(itemObject));
}
The code above works, but what I want to achieve is to make the method work without hard-coding its object type, to make it more flexible for whatever struct class I plan to use.
I tried passing the struct class name, and .getClass() returns the original type, but I don't know what to do with it to make a new object of itemsData's original type and cast the itemsData object.
public static void parseToFile(ObservableList itemsData, Class c){
    Object itemObject = c.newInstance();
    Object newobject = curClass.newInstance();
    newobject = c.cast(itemsList.get(0));
}
The above seemed dumb to me and obviously didn't work.
After reading your comment I understand better why one would use reflection in your case. A GUI builder/editor is an example where reflection is used to provide an interface to set/get the values of components. Still, IMHO, reflection isn't a tool you would design for when you own the classes and are the primary designer. If possible you should strive for something more like this:
interface Parsable {
default int parse() {
System.out.println("Here I do something basic");
return 0;
}
}
class BasicStruct implements Parsable { }
class EtcStruct implements Parsable {
@Override
public int parse() {
System.out.println("Here I do something specific to an EtcStruct");
return 1;
}
}
// If some structs have a parent-child relationship
// you can alternatively `extend EtcStruct` for example.
class OtherStruct extends EtcStruct {
@Override
public int parse() {
super.parse();
System.out.println("Here I do something specific to an OtherStruct");
return 2;
}
}
void parseToFile(Parsable parsable) {
System.out.println(parsable.parse());
}
// If you use a generic with a specific class you don't
// have to guess or care which kind it is!
void parseToFile(ObservableList<Parsable> parsables) {
for (Parsable p : parsables) {
parseToFile(p);
}
}
public static void main(String[] args) {
ObservableList<Parsable> parsables = FXCollections.observableArrayList();
parsables.add(new BasicStruct());
parsables.add(new EtcStruct());
parsables.add(new OtherStruct());
parseToFile(parsables);
}
Output:
Here I do something basic
0
Here I do something specific to an EtcStruct
1
Here I do something specific to an EtcStruct
Here I do something specific to an OtherStruct
2
Of course, this is just an example that needs to be altered to meet your needs.
But what I still don't get is why, if you're able to parse from a file, you can't parse to one. Nonetheless, I slapped some code together to show you how I might parse an object to a file, manually, when dealing with Objects only.
The idea is to satisfy a bean-like contract. That is, each structure should provide a parameter-less constructor, all fields you want managed by reflection will follow Java naming convention and will have both a public setter and getter.
Don't get caught up in the file writing; that will be determined by your needs. Just notice that by following this convention I can treat any Object as a parsable structure. A less refined version here for reference:
public void parseToFile(Object object) throws IOException, InvocationTargetException, IllegalAccessException {
    FileOutputStream fos = new FileOutputStream("example" + object.getClass().getSimpleName());
    List<Method> getters = Arrays.stream(object.getClass().getMethods())
            .filter(method -> method.getName().startsWith("get") && !method.getName().endsWith("Class"))
            .collect(Collectors.toList());
    for (Method getter : getters) {
        String methodName = getter.getName();
        String key = String.valueOf(Character.toLowerCase(methodName.charAt(3))) +
                methodName.substring(4);
        fos.write((key + " : " + String.valueOf(getter.invoke(object)) + "\n").getBytes());
    }
    fos.close();
}
I think you can still just use generics to keep static object typing. Try to parameterize your parseToFile function. Here is an example:
public static void parseToFile(ObservableList<EtcStruct> itemsData){
    EtcStruct itemObject = itemsData.get(0);
    System.out.print((int) reflectedmethod.invoke(itemObject));
}

How to add null values to ConcurrentHashMap

I have a ConcurrentHashMap which is called from different threads to put values in it. I have to insert null values, but ConcurrentHashMap doesn't allow null values. Is there a way to do that, or an alternative way to handle this in Java?
Rather than using null, which has no semantic meaning and is generally considered an antipattern, represent this "absent" notion as a concrete type that reflects your intent and forces callers to account for it properly.
A common solution is to use Optional (for pre-Java 8, use Guava's Optional) to represent the absence of a value.
So your map would have a type ConcurrentHashMap<Key, Optional<Value>>.
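For example (a small sketch with String keys and Integer values):

ConcurrentHashMap<String, Optional<Integer>> map = new ConcurrentHashMap<>();
map.put("present", Optional.of(42));
map.put("absent", Optional.empty());          // the "null-like" entry, without actually storing null
int value = map.get("absent").orElse(-1);     // callers are forced to handle the empty case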
Another option is to represent whatever you intend to mean by null more directly in the type you're storing in the map, e.g. if null is supposed to mean "this used to exist, but no longer" you might create a class structure like so:
public abstract class Resource {
public abstract void doSomething();
public abstract ClosedResource close();
}
public class ActiveResource extends Resource {
public void doSomething() { ... }
public ClosedResource close() { ... }
}
public class ClosedResource extends Resource {
public void doSomething() { /* nothing to do */ }
public ClosedResource close() { return this; }
}
And then simply have a ConcurrentHashMap<Key, Resource>. There are pros and cons to both approaches depending on your exact needs, but both are objectively better than putting null values in your map.
You might also simply be able to avoid adding nulls at all - if you can't create a clear semantic meaning for null that's different from simply being absent (as suggested above), just use absence in the map to convey what you care about, rather than distinguishing between the absent and present-but-null cases.
One simple answer is "If the value is null, don't try to add it".
Another answer is to take a page from .NET and introduce a Nullable. Your values would be an instance of Nullable.
public class Nullable<T> {
    private final T obj;

    public Nullable(T t) {
        obj = t;
    }

    public T get() {
        return obj;
    }
}
You can successfully make instances of Nullable with a null value. You can get the value out with get().
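For example (a quick sketch of how it might be used):

ConcurrentHashMap<String, Nullable<String>> map = new ConcurrentHashMap<>();
map.put("key", new Nullable<String>(null)); // the entry exists, its wrapped value is null
String value = map.get("key").get();        // returns null, but no null was stored in the map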
--- This is the old version of the answer ---
First, why would you need to put a null key? That would seem like an issue in your design.
That being said, you could use a stand-in object for null. Say your key is a KeyLikeThing. Normally a KeyLikeThing has attributes which affect its hash and equals properties. You can have one instance of KeyLikeThing that represents null in your world. Whenever your client wants to put a null, it uses that instance of KeyLikeThing.
public class KeyLikeThing {
    public static final KeyLikeThing NULL = new KeyLikeThing();
    // Other KeyLikeThing related stuff.
}
public class Storage<T> {
    private final ConcurrentHashMap<KeyLikeThing, T> m = new ConcurrentHashMap<KeyLikeThing, T>();

    public T put(KeyLikeThing k, T val) {
        KeyLikeThing kl = k == null ? KeyLikeThing.NULL : k;
        return m.put(kl, val);
    }
}

Why java.util.Optional is not Serializable, how to serialize the object with such fields

The Enum class is Serializable, so there is no problem serializing objects with enums. The other case is when a class has fields of the java.util.Optional class. In this case the following exception is thrown: java.io.NotSerializableException: java.util.Optional
How to deal with such classes, how to serialize them? Is it possible to send such objects to Remote EJB or through RMI?
This is the example:
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectOutputStream;
import java.io.Serializable;
import java.util.Optional;
import org.junit.Test;
public class SerializationTest {
static class My implements Serializable {
private static final long serialVersionUID = 1L;
Optional<Integer> value = Optional.empty();
public void setValue(Integer i) {
this.value = Optional.of(i);
}
public Optional<Integer> getValue() {
return value;
}
}
//java.io.NotSerializableException is thrown
@Test
public void serialize() {
My my = new My();
byte[] bytes = toBytes(my);
}
public static <T extends Serializable> byte[] toBytes(T reportInfo) {
try (ByteArrayOutputStream bstream = new ByteArrayOutputStream()) {
try (ObjectOutputStream ostream = new ObjectOutputStream(bstream)) {
ostream.writeObject(reportInfo);
}
return bstream.toByteArray();
} catch (IOException e) {
throw new RuntimeException(e);
}
}
}
This answer is in response to the question in the title, "Shouldn't Optional be Serializable?" The short answer is that the Java Lambda (JSR-335) expert group considered and rejected it. That note, and this one and this one indicate that the primary design goal for Optional is to be used as the return value of functions when a return value might be absent. The intent is that the caller immediately check the Optional and extract the actual value if it's present. If the value is absent, the caller can substitute a default value, throw an exception, or apply some other policy. This is typically done by chaining fluent method calls off the end of a stream pipeline (or other methods) that return Optional values.
It was never intended for Optional to be used other ways, such as for optional method arguments or to be stored as a field in an object. And by extension, making Optional serializable would enable it to be stored persistently or transmitted across a network, both of which encourage uses far beyond its original design goal.
Usually there are better ways to organize the data than to store an Optional in a field. If a getter (such as the getValue method in the question) returns the actual Optional from the field, it forces every caller to implement some policy for dealing with an empty value. This will likely lead to inconsistent behavior across callers. It's often better to have whatever code sets that field apply some policy at the time it's set.
Sometimes people want to put Optional into collections, like List<Optional<X>> or Map<Key,Optional<Value>>. This too is usually a bad idea. It's often better to replace these usages of Optional with Null-Object values (not actual null references), or simply to omit these entries from the collection entirely.
A lot of Serialization related problems can be solved by decoupling the persistent serialized form from the actual runtime implementation you operate on.
/** The class you work with in your runtime */
public class My implements Serializable {
private static final long serialVersionUID = 1L;
Optional<Integer> value = Optional.empty();
public void setValue(Integer i) {
this.value = Optional.ofNullable(i);
}
public Optional<Integer> getValue() {
return value;
}
private Object writeReplace() throws ObjectStreamException
{
return new MySerialized(this);
}
}
/** The persistent representation which exists in bytestreams only */
final class MySerialized implements Serializable {
private final Integer value;
MySerialized(My my) {
value=my.getValue().orElse(null);
}
private Object readResolve() throws ObjectStreamException {
My my=new My();
my.setValue(value);
return my;
}
}
The class Optional implements behavior which allows you to write good code when dealing with possibly absent values (compared to the use of null). But it does not add any benefit to a persistent representation of your data. It would just make your serialized data bigger…
The sketch above might look complicated, but that's because it demonstrates the pattern with only one property. The more properties your class has, the more its simplicity should be revealed.
And not to forget: you gain the possibility to change the implementation of My completely without any need to adapt the persistent form…
If you would like a serializable Optional, consider instead using Guava's Optional, which is serializable.
The Vavr.io library (formerly Javaslang) also has an Option class which is serializable:
public interface Option<T> extends Value<T>, Serializable { ... }
It's a curious omission.
You would have to mark the field as transient and provide your own custom writeObject() method that wrote the get() result itself, and a readObject() method that restored the Optional by reading that result from the stream. Not forgetting to call defaultWriteObject() and defaultReadObject() respectively.
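A rough, untested sketch of that approach for the single-field example from the question:

public class My implements Serializable {
    private static final long serialVersionUID = 1L;
    private transient Optional<Integer> value = Optional.empty();

    public void setValue(Integer i) {
        this.value = Optional.ofNullable(i);
    }

    public Optional<Integer> getValue() {
        return value;
    }

    private void writeObject(ObjectOutputStream out) throws IOException {
        out.defaultWriteObject();            // writes the remaining non-transient state
        out.writeObject(value.orElse(null)); // store the unwrapped value instead of the Optional
    }

    private void readObject(ObjectInputStream in) throws IOException, ClassNotFoundException {
        in.defaultReadObject();
        value = Optional.ofNullable((Integer) in.readObject()); // re-wrap on deserialization
    }
}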
If you want to maintain a more consistent type list and avoid using null there's one kooky alternative.
You can store the value using an intersection of types. Coupled with a lambda, this allows something like:
private final Supplier<Optional<Integer>> suppValue;
....
List<Integer> temp = value
.map(v -> v.map(Arrays::asList).orElseGet(ArrayList::new))
.orElse(null);
this.suppValue = (Supplier<Optional<Integer>> & Serializable)() -> temp==null ? Optional.empty() : temp.stream().findFirst();
Having the temp variable separate avoids closing over the owner of the value member and thus serialising too much.
Just copy the Optional class into your project and create your own custom Optional that implements Serializable. I am doing it because I just realized this sh*t too late.
The problem is that you have used fields typed as Optional. The basic solution to avoid this: declare the field without Optional and wrap it in an Optional only when the getter is called, like below. That is, change Optional<Integer> value = Optional.empty(); to Integer value = null;
public class My implements Serializable {
private static final long serialVersionUID = 1L;
//Optional<Integer> value = Optional.empty(); //old code
Integer value = null; //solution code without optional.
public void setValue(Integer value ) {
//this.value = Optional.of(value); //old code with Optional
this.value = value ; //solution code without optional.
}
public Optional<Integer> getValue() {
//solution code - return the value by using Optional.
return Optional.ofNullable(value);
}
}

Associating a generic type with Enum in Java

I am creating a store for user preferences, and there are a fixed number of preferences that users can set values for. The names of the preferences (settings) are stored as an Enum:
public enum UserSettingName {
FOO,
BAR,
ETC
}
What I would like to be able to do is store a value type with the name so that the service will store the user's value with the correct Java type. For example, FOO might be a Long, and BAR might be a String. Up until now, we were storing all values as Strings, and then manually casting the values into the appropriate Java type. This has led to try/catch blocks everywhere, when it makes more sense to have only one try/catch in the service. I understand that Enums cannot have generic types, so I have been playing around with:
public enum UserSettingName {
FOO(Long.class),
BAR(String.class),
ETC(Baz.class);
private Class type;
private UserSettingName(Class type) {
this.type = type;
}
public Class getType() {
return this.type;
}
}
I have a generic UserSetting object that has public T getSettingValue() and public void setSettingValue(T value) methods that should return and set the value with the correct type. My problem comes from trying to specify that generic type T when I create or retrieve a setting because I can't do something like:
new UserSetting<UserSettingName.FOO.getType()>(UserSettingName.FOO, 123L)
Sorry if this isn't exactly clear, I can try to clarify if it's not understood.
Thanks!
UPDATE
Both the setting name and value are coming in from a Spring MVC REST call:
public ResponseEntity<String> save(@PathVariable Long userId, @PathVariable UserSettingName settingName, @RequestBody String settingValue)
So I used the Enum because Spring casts the incoming data automatically.
Firstly you have to step back and think about what you're trying to achieve, and use a standard pattern or language construct to achieve it.
It's not entirely clear what you're going after here but from your approach it almost certainly looks like you're reinventing something which could be done in a much more straightforward manner in Java. For example, if you really need to know and work with the runtime classes of objects, consider using the reflection API.
On a more practical level - what you're trying to do here isn't possible with generics. Generics are a compile-time language feature - they are useful for avoiding casting everything explicitly from Object and give you type-checking at compilation time. You simply cannot use generics in this way, i.e. setting T as some value UserSettingName.Foo.getType() which is only known at runtime.
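That said, if you do keep the Class token on the enum, you can at least centralize the runtime check instead of scattering try/catch blocks. A hypothetical helper (not from your code) might look like this:

// Hypothetical helper: one place that does the runtime-checked conversion
public static <T> T settingValue(UserSettingName name, Object rawValue, Class<T> expectedType) {
    if (!expectedType.equals(name.getType())) {
        throw new IllegalArgumentException(name + " holds " + name.getType().getSimpleName()
                + ", not " + expectedType.getSimpleName());
    }
    return expectedType.cast(rawValue); // throws ClassCastException if the value doesn't match
}

// Usage:
Long foo = settingValue(UserSettingName.FOO, 123L, Long.class);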
Look at how it's done by Netty:
http://netty.io/wiki/new-and-noteworthy.html#type-safe-channeloption
They did it by using typed constants:
http://grepcode.com/file/repo1.maven.org/maven2/io.netty/netty-all/4.0.0.Beta1/io/netty/channel/ChannelOption.java#ChannelOption
EDIT:
public interface ChannelConfig {
...
<T> boolean setOption(ChannelOption<T> option, T value);
...
}
public class ChannelOption<T> ...
public static final ChannelOption<Integer> SO_TIMEOUT =
new ChannelOption<Integer>("SO_TIMEOUT");
...
}
EDIT2: you can transform it like:
class Baz {}
class UserSettingName<T> {
public static final UserSettingName<Baz> ETC = new UserSettingName<Baz>();
}
class UserSetting {
public <T> UserSetting(UserSettingName<T> name, T param) {
}
}
public class Test {
public static void main(String[] args) {
new UserSetting(UserSettingName.ETC, new Baz());
}
}
Enums are not the answer here. If you find yourself repeating code everywhere you could just create a utility class and encapsulate all the try/catch logic there. That would cut down on your code redundancy without majorly impacting your current code.
public class Util
{
    public static MyObject getObjectFrom(Object o)
    {
        try
        {
            return (MyObject) o;
        }
        catch (ClassCastException e)
        {
            return null;
        }
    }
}
Then use it as follows:
MyObject myObj = Util.getObjectFrom(someObject);

Different ways to implement the Memento Pattern in Java

I am doing some research into the Memento Pattern, and most of the examples I have come across seem relatively similar (saving a String into an array and restoring it when needed). Correct me if I am wrong, but I believe the method I just described is "object cloning" - so what are the other ways of implementing the Memento Pattern?
From what I have also picked up, serialization can be used, but there seems to be a grey area, with people saying that it violates the encapsulation of the object and therefore isn't a way to implement the Memento Pattern.
So would anybody be able to shed some light on the ways to implement the pattern? My research has come up with a mixture of all different things and has just made everything confusing.
Thanks
The Java Collections framework defines Deque, which can help.
Candidate code:
public final class Memento<T>
{
// List of saved values
private final Deque<T> queue = new ArrayDeque<T>();
// Last entered value, whether it has been saved or not
private T currentValue;
// No initial state, ie currentValue will be null on construction, hence
// no constructor
// Set a value, don't save it
public void set(final T value)
{
currentValue = value;
}
// Persist the currently saved value
public void persist()
{
queue.add(currentValue);
}
// Return the last saved value
public T lastSaved()
{
return queue.peekLast();
}
// Return the last entered value
public T lastEntered()
{
return currentValue;
}
}
Notably missing from this code are many things, but they are easily implementable:
revert to the last saved value (a sketch of this follows the sample code below);
checks for nulls;
a requirement that T implement Serializable;
convenience methods (like adding a value and making it the last saved state);
thread safety!
Etc.
Sample code:
public static void main(final String... args)
{
final Memento<String> memento = new Memento<String>();
memento.set("state1");
System.out.println(memento.lastEntered()); // "state1"
memento.persist();
memento.set("state2");
System.out.println(memento.lastEntered()); // "state2"
System.out.println(memento.lastSaved()); // "state1"
}
In effect: this is a braindead implementation which can be improved, but which can be used as a basis -- extending it depends on your needs ;)
A usual problem that comes with memento implementations is that they often need a lot of classes to represent the internal state of different kinds of objects, or the memento implementation must serialise object state to some other form (e.g. serialised Java objects).
Here is a sketch for a memento implementation that doesn't rely on a specific memento class per class, whose state is to be captured for undo/redo support.
There's a basic concept to be introduced first:
public interface Reference<T> {
T get();
void set(T value);
}
This is an abstraction similar to java.lang.ref.Reference, but separate from it, because that class is for garbage collection purposes while we need one for business logic. Basically a reference encapsulates a field, so they are intended to be used like this:
public class Person {
private final Reference<String> lastName;
private final Reference<Date> dateOfBirth;
// constructor ...
public String getLastName() {
return lastName.get();
}
public void setLastName(String lastName) {
this.lastName.set(lastName);
}
public Date getDateOfBirth() {
return dateOfBirth.get();
}
public void setDateOfBirth(Date dateOfBirth) {
this.dateOfBirth.set(dateOfBirth);
}
}
Note that object instantiation with those references might not be that trivial, but we leave that out here.
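For illustration, a trivial field-backed Reference and one possible way of wiring a Person could look like this (the Person constructor shown here is assumed, since the original leaves instantiation out):

public class SimpleReference<T> implements Reference<T> {
    private T value;

    @Override
    public T get() {
        return value;
    }

    @Override
    public void set(T value) {
        this.value = value;
    }
}

// Assumed constructor: Person(Reference<String> lastName, Reference<Date> dateOfBirth)
Person person = new Person(new SimpleReference<String>(), new SimpleReference<Date>());
person.setLastName("Smith");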
Now here are the details for the memento implementation:
public interface Caretaker {
void addChange(Change change);
void undo();
void redo();
void checkpoint();
}
public interface Change {
Change createReversal();
void revert();
}
Basically a Change represents a single identifiable change to the state of an identifiable object. A Change can be reverted by invoking the revert method, and the reversal of that change can itself be reverted by reverting the Change created by the createReversal method. The Caretaker accumulates changes to object states via the addChange method. By invoking the undo and redo methods, the Caretaker reverts or redoes (i.e. reverts the reversal of changes) all changes until the next checkpoint is reached. A checkpoint represents a point at which all observed changes accumulate to a change that transforms the states of all changed objects from one valid configuration to another valid configuration. Checkpoints are usually created after or before actions, via the checkpoint method.
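One way to implement the Caretaker (kept deliberately simple, not thread-safe, and grouping the changes recorded between two checkpoints into one undo step) could be sketched as:

import java.util.ArrayDeque;
import java.util.ArrayList;
import java.util.Deque;
import java.util.List;

public class SimpleCaretaker implements Caretaker {
    private final Deque<List<Change>> undoStack = new ArrayDeque<>();
    private final Deque<List<Change>> redoStack = new ArrayDeque<>();
    private List<Change> pending = new ArrayList<>();

    @Override
    public void addChange(Change change) {
        pending.add(change);
    }

    @Override
    public void checkpoint() {
        if (!pending.isEmpty()) {
            undoStack.push(pending);
            pending = new ArrayList<>();
            redoStack.clear(); // a new action invalidates the redo history
        }
    }

    @Override
    public void undo() {
        if (undoStack.isEmpty()) {
            return;
        }
        List<Change> changes = undoStack.pop();
        List<Change> reversals = new ArrayList<>();
        for (int i = changes.size() - 1; i >= 0; i--) { // revert in reverse order
            Change change = changes.get(i);
            reversals.add(change.createReversal());
            change.revert();
        }
        redoStack.push(reversals);
    }

    @Override
    public void redo() {
        if (redoStack.isEmpty()) {
            return;
        }
        List<Change> reversals = redoStack.pop();
        List<Change> changes = new ArrayList<>();
        for (int i = reversals.size() - 1; i >= 0; i--) {
            Change reversal = reversals.get(i);
            changes.add(reversal.createReversal());
            reversal.revert();
        }
        undoStack.push(changes);
    }
}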
And now here is how to make use of the Caretaker with Reference:
public class ReferenceChange<T> implements Change {
private final Reference<T> reference;
private final T oldValue;
private final T currentReferenceValue;
public ReferenceChange(Reference<T> reference, T oldValue,
T currentReferenceValue) {
super();
this.reference = reference;
this.oldValue = oldValue;
this.currentReferenceValue = currentReferenceValue;
}
@Override
public void revert() {
reference.set(oldValue);
}
@Override
public Change createReversal() {
return new ReferenceChange<T>(reference, currentReferenceValue,
oldValue);
}
}
public class CaretakingReference<T> implements Reference<T> {
private final Reference<T> delegate;
private final Caretaker caretaker;
public CaretakingReference(Reference<T> delegate, Caretaker caretaker) {
super();
this.delegate = delegate;
this.caretaker = caretaker;
}
@Override
public T get() {
return delegate.get();
}
@Override
public void set(T value) {
T oldValue = delegate.get();
delegate.set(value);
caretaker.addChange(new ReferenceChange<T>(delegate, oldValue, value));
}
}
There exists a Change that represents how the value of a Reference has changed. This Change is created when the CaretakingReference is set. In this implementation there is a need for a nested Reference within the CaretakingReference implementation, because a revert of the ReferenceChange shouldn't trigger a new addChange via the CaretakingReference.
Collection properties needn't use the Reference. A custom implementation triggering the caretaking should be used in that case. Primitives can be used with autoboxing.
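As an example of such a custom implementation (again just a sketch), an add to a list could record a Change whose reversal removes the element again:

public class ListAddChange<E> implements Change {
    private final List<E> list;
    private final E element;
    private final boolean added; // true: this change added the element, false: it removed it

    public ListAddChange(List<E> list, E element, boolean added) {
        this.list = list;
        this.element = element;
        this.added = added;
    }

    @Override
    public void revert() {
        if (added) {
            list.remove(element);
        } else {
            list.add(element);
        }
    }

    @Override
    public Change createReversal() {
        return new ListAddChange<E>(list, element, !added);
    }
}

// A caretaking collection wrapper would then report its own changes:
public class CaretakingList<E> {
    private final List<E> delegate = new ArrayList<E>();
    private final Caretaker caretaker;

    public CaretakingList(Caretaker caretaker) {
        this.caretaker = caretaker;
    }

    public void add(E element) {
        delegate.add(element);
        caretaker.addChange(new ListAddChange<E>(delegate, element, true));
    }
}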
This implementation infers an additional runtime and memory cost by always using the reference instead of fields directly.
