I've been at this since yesterday looking for a way to do this. I have hundreds of POJOs from a third party and need to apply properties to them based on business rules. I want to avoid altering the POJOs, because the third party could regenerate them at any time, which would create a maintenance nightmare down the road.
What I'm attempting to do is have a class dynamically extend another class.
For example:
POJO: Foo.java
package abc.service;

public class Foo {
    private String greeting = "";

    public Foo() {
        greeting = "Good morning";
    }

    public String getGreeting() {
        return greeting;
    }
}
// end file
Mine: Bar.java
package abc.service;

public class Bar {
    private String claim = "";

    public Bar() {
        claim = "You're correct";
    }

    public String getClaim() {
        return claim;
    }
}
// end file
Mine: TestMe.java
Here, in a class separate from the POJOs, I'm trying to have a POJO extend another of my classes. Is this beyond the abilities of Java?
package abc;

import abc.service.Bar;
import abc.service.Foo;

public class TestMe {
    public static void main(String[] args) {
        Foo f = new Foo();
        Class<?> c1 = f.getClass();
        Bar b = new Bar();
        Class<?> c2 = b.getClass();
        // Fails: asSubclass() only casts the Class object (and throws
        // ClassCastException here); it does not merge the two classes.
        Class<?> merged = c2.asSubclass(c1);
        // Trying to call the Foo method (does not compile; merged is a Class)
        System.out.println(merged.getGreeting());
        // Trying to call the Bar method (does not compile; merged is a Class)
        System.out.println(merged.getClaim());
    }
}
Additionally, JSON schemas are being generated from the POJOs that are provided, but those POJOs only cover an UPDATE record scenario. I'm looking for the best way to have the POJOs extend another class for CREATE record scenarios, which is why I want their POJOs to dynamically extend my code when required.
Need to generate a JSON schema from the POJOs for UPDATE.
Need to verify their JSON meets the POJOs' requirements for UPDATE.
Need to convert their JSON to the POJOs for UPDATE.
Also:
Need to generate a JSON schema from the POJOs for CREATE.
Need to verify their JSON meets the POJOs' requirements for CREATE.
Need to convert their JSON to the POJOs for CREATE.
Using Jackson mixins and the ObjectMapper, I'm able to dynamically apply my code to the classes when creating the schemas. The issue is having the POJOs actually extend another class, which a mixin is not going to solve.
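For reference, the mixin part that already works looks roughly like this. This is a hedged sketch: the mixin class and the annotation chosen are illustrative, only ObjectMapper.addMixIn is the actual Jackson API.

import com.fasterxml.jackson.annotation.JsonProperty;
import com.fasterxml.jackson.databind.ObjectMapper;

import abc.service.Foo;

public class MixinSetup {

    // Hypothetical mixin: annotations declared here are applied to Foo
    // during (de)serialization without modifying the third-party POJO.
    abstract static class FooCreateMixin {
        @JsonProperty(required = true)
        abstract String getGreeting();
    }

    public static void main(String[] args) {
        ObjectMapper mapper = new ObjectMapper();
        mapper.addMixIn(Foo.class, FooCreateMixin.class);
        // schema generation / validation would use this configured mapper
    }
}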
With plain Java: no, it can't be done.
You can change byte code, either in the build process or at runtime. But it's hard, and there's not a lot of documentation.
AspectJ's declare parents expression is probably the easiest way to do it at build time.
If you want to do it at runtime, look at frameworks like ASM, cglib or Byte Buddy. But you will have to run the code from inside a custom ClassLoader or an agent.
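For a taste of the runtime approach, here is a minimal, hedged sketch with Byte Buddy (assuming the byte-buddy dependency is on the classpath and the Foo and Bar classes from the question are importable). It builds a new subclass of Bar at runtime and adds a getGreeting() method that delegates to a Foo instance:

import net.bytebuddy.ByteBuddy;
import net.bytebuddy.description.modifier.Visibility;
import net.bytebuddy.implementation.MethodDelegation;

import abc.service.Bar;
import abc.service.Foo;

public class RuntimeSubclassDemo {
    public static void main(String[] args) throws Exception {
        // Define a new class at runtime: it extends Bar and gains a public
        // getGreeting() method whose calls are delegated to a Foo instance.
        Class<? extends Bar> merged = new ByteBuddy()
                .subclass(Bar.class)
                .defineMethod("getGreeting", String.class, Visibility.PUBLIC)
                .intercept(MethodDelegation.to(new Foo()))
                .make()
                .load(RuntimeSubclassDemo.class.getClassLoader())
                .getLoaded();

        Bar b = merged.getDeclaredConstructor().newInstance();
        System.out.println(b.getClaim());                              // Bar behavior
        System.out.println(merged.getMethod("getGreeting").invoke(b)); // delegated Foo behavior
    }
}

Note that the new method is only reachable reflectively (or through an interface), since the static type Bar knows nothing about it; that is exactly the limitation the composition answer below works around.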
You can use composition instead of inheritance.
public class Foo {
    private String greeting = "";

    public Foo() {
        greeting = "Good morning";
    }

    public String getGreeting() {
        return greeting;
    }
}
Your class
public class Bar {
    private String claim = "";
    private Foo foo;

    public Bar() {
        claim = "You're correct";
        foo = new Foo();
    }

    public String getClaim() {
        return claim;
    }

    public Foo getFoo() {
        return foo;
    }
}
And the test
public class TestMe {
    public static void main(String[] args) {
        Bar bar = new Bar();
        // Calling the Foo method through the composed instance
        System.out.println(bar.getFoo().getGreeting());
        // Calling the Bar method
        System.out.println(bar.getClaim());
    }
}
Or you can write your class a little differently.
public class Bar {
    private String claim = "";
    private Foo foo;

    public Bar() {
        claim = "You're correct";
        foo = new Foo();
    }

    public String getClaim() {
        return claim;
    }

    public String getGreeting() {
        return foo.getGreeting();
    }
}
And the test
public class TestMe {
    public static void main(String[] args) {
        Bar bar = new Bar();
        // Calling the Foo method, now forwarded by Bar
        System.out.println(bar.getGreeting());
        // Calling the Bar method
        System.out.println(bar.getClaim());
    }
}
It is not possible.
Simply put, Java (as of the latest version) has no provision for dynamically extending a class at runtime and loading it into the JVM.
Instead of extending, you should use a design pattern, for example the Strategy pattern. This allows you to change your strategy dynamically.
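For illustration, a minimal Strategy sketch (all names here are hypothetical, not tied to the question's POJOs): behavior is injected and swapped at runtime instead of being inherited, so the host class never has to be subclassed.

interface GreetingStrategy {
    String greet();
}

class MorningGreeting implements GreetingStrategy {
    public String greet() { return "Good morning"; }
}

class EveningGreeting implements GreetingStrategy {
    public String greet() { return "Good evening"; }
}

class Greeter {
    private GreetingStrategy strategy;

    Greeter(GreetingStrategy strategy) { this.strategy = strategy; }

    // The strategy can be swapped at any time at runtime.
    void setStrategy(GreetingStrategy strategy) { this.strategy = strategy; }

    String greet() { return strategy.greet(); }
}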
Related
I am trying to implement a simple Java event-handler lambda for AWS. It receives SQS events and should make appropriate updates to a DynamoDB table.
One of the attributes in this table is a status field that has 4 defined states; therefore I wanted to use a Java enum and map it to this attribute.
Under AWS SDK v1 I could use the @DynamoDBTypeConvertedEnum annotation, but it no longer exists in v2. Instead, there is @DynamoDbConvertedBy(), which receives a converter class reference. There is also an EnumAttributeConverter class which should work nicely with it.
But for some reason, it does not work. The following is a snippet from my current code:
@Data
@DynamoDbBean
@NoArgsConstructor
public class Task {

    @Getter(onMethod_ = {@DynamoDbPartitionKey})
    String id;

    ...

    @Getter(onMethod_ = {@DynamoDbConvertedBy(EnumAttributeConverter.class)})
    ExportTaskStatus status;
}
The enum looks as follows:
@RequiredArgsConstructor
public enum TaskStatus {
    @JsonProperty("running") PROCESSING(1),
    @JsonProperty("succeeded") COMPLETED(2),
    @JsonProperty("cancelled") CANCELED(3),
    @JsonProperty("failed") FAILED(4);

    private final int order;
}
With this, I get the following exception when launching the application:
Class 'class software.amazon.awssdk.enhanced.dynamodb.internal.converter.attribute.EnumAttributeConverter' appears to have no default constructor thus cannot be used with the BeanTableSchema
For anyone else coming here: it looks to me like just dropping the annotation from the enum altogether works fine, i.e., the SDK applies the provided attribute converters implicitly. This is also mentioned in this GitHub issue. My own class looks like this (Brand is an enum here), and the enum is converted without any issues when fetching items.
@Value
@Builder(toBuilder = true)
@DynamoDbImmutable(builder = User.UserBuilder.class)
public class User {

    @Getter(onMethod = @__({@DynamoDbPartitionKey}))
    String id;

    Brand brand;

    ...
}
How can I use Java Enums with Amazon DynamoDB and AWS SDK v2?
Although the documentation doesn't state it, the DynamoDbConvertedBy annotation requires any AttributeConverter you supply to have a parameterless default constructor.
Unfortunately for you and me, whoever wrote many of the built-in AttributeConverter classes decided to instantiate them via static create() methods instead of constructors (maybe they're singletons under the covers? I don't know). This means anyone who wants to use these helpful constructor-less classes like InstantAsStringAttributeConverter and EnumAttributeConverter needs to wrap them in custom classes that simply parrot the converter instantiated with create(). For a non-generic typed class like InstantAsStringAttributeConverter, this is easy. Just create a wrapper class that delegates to the instance you create with create() and refer to that instead:
import java.time.Instant;

import software.amazon.awssdk.enhanced.dynamodb.AttributeConverter;
import software.amazon.awssdk.enhanced.dynamodb.AttributeValueType;
import software.amazon.awssdk.enhanced.dynamodb.EnhancedType;
import software.amazon.awssdk.enhanced.dynamodb.internal.converter.attribute.InstantAsStringAttributeConverter;
import software.amazon.awssdk.services.dynamodb.model.AttributeValue;

public class InstantAsStringAttributeConverterWithConstructor implements AttributeConverter<Instant> {

    private static final InstantAsStringAttributeConverter CONVERTER = InstantAsStringAttributeConverter.create();

    @Override
    public AttributeValue transformFrom(Instant instant) {
        return CONVERTER.transformFrom(instant);
    }

    @Override
    public Instant transformTo(AttributeValue attributeValue) {
        return CONVERTER.transformTo(attributeValue);
    }

    @Override
    public EnhancedType<Instant> type() {
        return CONVERTER.type();
    }

    @Override
    public AttributeValueType attributeValueType() {
        return CONVERTER.attributeValueType();
    }
}
Then you update your annotation to point to that class instead of the actual underlying library class.
But wait, EnumAttributeConverter is a generic typed class, which means you need to go one step further. First, create a version of the converter that wraps the official one but relies on a constructor taking in the type instead of static instantiation:
import software.amazon.awssdk.enhanced.dynamodb.AttributeConverter;
import software.amazon.awssdk.enhanced.dynamodb.AttributeValueType;
import software.amazon.awssdk.enhanced.dynamodb.EnhancedType;
import software.amazon.awssdk.enhanced.dynamodb.internal.converter.attribute.EnumAttributeConverter;
import software.amazon.awssdk.services.dynamodb.model.AttributeValue;

public class EnumAttributeConverterWithConstructor<T extends Enum<T>> implements AttributeConverter<T> {

    private final EnumAttributeConverter<T> converter;

    public EnumAttributeConverterWithConstructor(final Class<T> enumClass) {
        this.converter = EnumAttributeConverter.create(enumClass);
    }

    @Override
    public AttributeValue transformFrom(T t) {
        return this.converter.transformFrom(t);
    }

    @Override
    public T transformTo(AttributeValue attributeValue) {
        return this.converter.transformTo(attributeValue);
    }

    @Override
    public EnhancedType<T> type() {
        return this.converter.type();
    }

    @Override
    public AttributeValueType attributeValueType() {
        return this.converter.attributeValueType();
    }
}
But that only gets us halfway there: now we need to generate a version for each enum type we want to convert that subclasses our custom class:
public class ExportTaskStatusAttributeConverter extends EnumAttributeConverterWithConstructor<ExportTaskStatus> {
    public ExportTaskStatusAttributeConverter() {
        super(ExportTaskStatus.class);
    }
}

@DynamoDbConvertedBy(ExportTaskStatusAttributeConverter.class)
public ExportTaskStatus getStatus() { return this.status; }
Or the Lombok-y way:
@Getter(onMethod_ = {@DynamoDbConvertedBy(ExportTaskStatusAttributeConverter.class)})
ExportTaskStatus status;
It's a pain. It's a pain that could be solved with a little bit of tweaking and a tiny bit of reflection in the AWS SDK, but it's where we're at right now.
I am thinking that your annotations might actually be the problem here. I would remove all annotations that mention a constructor and instead write out your own constructor(s), for both Task and TaskStatus.
The dynamodb-enhanced SDK does this out of the box.
When you declare a @DynamoDbBean, the DefaultAttributeConverterProvider supplies a long list of converters between Java types, including an EnumAttributeConverter which is used if type.rawClass().isEnum() is true. So you don't need to worry about it.
If you ever want to extend the set of converters, add the converterProviders annotation parameter and declare the default provider (or omit it), as well as any other providers you want.
Example:
@DynamoDbBean(converterProviders = { DefaultAttributeConverterProvider.class, MyCustomAttributeConverterProvider.class })
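To make the out-of-the-box behavior concrete, here is a hedged sketch of a bean that stores an enum with no converter annotation at all (the class name SimpleTask is illustrative; TaskStatus is the enum from the question):

import software.amazon.awssdk.enhanced.dynamodb.mapper.annotations.DynamoDbBean;
import software.amazon.awssdk.enhanced.dynamodb.mapper.annotations.DynamoDbPartitionKey;

// The enum-typed property needs no @DynamoDbConvertedBy, because the
// default provider falls back to EnumAttributeConverter for enum types.
@DynamoDbBean
public class SimpleTask {
    private String id;
    private TaskStatus status;

    @DynamoDbPartitionKey
    public String getId() { return id; }
    public void setId(String id) { this.id = id; }

    public TaskStatus getStatus() { return status; }
    public void setStatus(TaskStatus status) { this.status = status; }
}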
Solution based on watkinsmatthewp's answer:
public class TaskStatusConverter implements AttributeConverter<TaskStatus> {

    // Lombok's @Delegate generates the AttributeConverter methods,
    // forwarding each of them to the wrapped converter.
    @Delegate
    private final EnumAttributeConverter<TaskStatus> converter;

    public TaskStatusConverter() {
        converter = EnumAttributeConverter.create(TaskStatus.class);
    }
}
The task status attribute looks like this:
@Getter(onMethod_ = {@DynamoDbConvertedBy(TaskStatusConverter.class)})
TaskStatus status;
I have the following POJO using Immutables+Jackson under the hood:
@JsonInclude(JsonInclude.Include.NON_NULL)
abstract class AbstractQueryRequest {

    @JsonProperty("reqid")
    public abstract String reqid();

    @JsonProperty("rawquery")
    public abstract String rawquery();
}
At some point I need to build another object based on the fields of the POJO, something along this line:
final HttpUrl.Builder urlBuilder = HttpUrl.parse(cfg.baseUrl()).newBuilder();
urlBuilder.addQueryParameter("reqid", request.reqid())
          .addQueryParameter("rawquery", request.rawquery());
It's quite annoying to keep the POJO and this call aligned upon changes, so I was wondering if it is possible to access each JsonProperty value programmatically instead of typing the strings manually.
Note that it is fine to write the getters by hand, since I can easily refactor those and the compiler double-checks them; but for the strings I'm worried about people down the line, and I would like to "read" them from the POJO class somehow.
You can do it via reflection: read the annotation values from the methods annotated with JsonProperty. But I recommend you put JsonProperty on fields, not methods.
Here is an example for your current requirement:
import java.lang.reflect.Method;

import com.fasterxml.jackson.annotation.JsonProperty;

public class Main {
    public static void main(String[] args) {
        AbstractQueryRequest someType = new SomeType();
        for (Method method : someType.getClass().getSuperclass().getDeclaredMethods()) {
            if (method.isAnnotationPresent(JsonProperty.class)) {
                JsonProperty annotation = method.getAnnotation(JsonProperty.class);
                System.out.println(annotation.value());
            }
        }
    }
}
class SomeType extends AbstractQueryRequest {
    @Override
    public String reqid() {
        return null;
    }

    @Override
    public String rawquery() {
        return null;
    }
}
The output is:
rawquery
reqid
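Alternatively, instead of hand-rolling the reflection you can ask Jackson itself which property names it will use. A hedged sketch using jackson-databind's bean introspection API, which should also pick up renames coming from mixins or other configuration:

import com.fasterxml.jackson.databind.BeanDescription;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.introspect.BeanPropertyDefinition;

public class PropertyNames {
    public static void main(String[] args) {
        ObjectMapper mapper = new ObjectMapper();
        BeanDescription description = mapper.getSerializationConfig()
                .introspect(mapper.constructType(SomeType.class));
        for (BeanPropertyDefinition property : description.findProperties()) {
            // Prints the JSON name, honoring @JsonProperty overrides.
            System.out.println(property.getName());
        }
    }
}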
I have the following test where I need to verify that all getters of the Person class are called. So far I have used Mockito's verify() to make sure that each getter is called. Is there a way to do that by reflection? Otherwise, if a new getter is added to the Person class, the test will miss it.
public class GetterTest {

    class Person {
        private String firstname;
        private String lastname;

        public String getFirstname() {
            return firstname;
        }

        public String getLastname() {
            return lastname;
        }
    }

    @Test
    public void testAllGettersCalled() throws IntrospectionException {
        Person personMock = mock(Person.class);
        personMock.getFirstname();
        personMock.getLastname();
        for (PropertyDescriptor property : Introspector.getBeanInfo(Person.class).getPropertyDescriptors()) {
            verify(personMock, atLeast(1)).getFirstname();
            // **How to verify against any getter method and not just getFirstname()???**
        }
    }
}
Generally, don't mock the class under test. If your test is for a Person, you shouldn't ever see Mockito.mock(Person.class) in it; that's a pretty clear sign you're testing the mocking framework instead of the system under test.
Instead, you may want to create a spy(new Person()), which will create a real Person using its real constructor and then copy its data to a Mockito-generated proxy. You can use MockingDetails.getInvocations() to reflectively check that every getter was called.
// This code is untested, but should get the point across. Edits welcome.
// 2016-01-20: Integrated feedback from Georgios Stathis. Thanks Georgios!
@Test
public void callAllGetters() throws Exception {
    Person personSpy = spy(new Person());
    personSpy.getFirstname();
    personSpy.getLastname();
    assertAllGettersCalled(personSpy, Person.class);
}

private static void assertAllGettersCalled(Object spy, Class<?> clazz) throws Exception {
    BeanInfo beanInfo = Introspector.getBeanInfo(clazz);
    Set<Method> setOfDescriptors = Arrays.stream(beanInfo.getPropertyDescriptors())
            .map(PropertyDescriptor::getReadMethod)
            .filter(p -> p != null && !p.getName().contains("getClass"))
            .collect(Collectors.toSet());
    MockingDetails details = Mockito.mockingDetails(spy);
    Set<Method> setOfTestedMethods = details.getInvocations()
            .stream()
            .map(InvocationOnMock::getMethod)
            .collect(Collectors.toSet());
    setOfDescriptors.removeAll(setOfTestedMethods);
    // The only remaining descriptors are untested.
    assertThat(setOfDescriptors).isEmpty();
}
There might be a way to call verify and invoke on the Mockito-generated spy, but that seems very fragile, and very dependent on Mockito internals.
As an aside, testing bean-style getters seems like an odd use of time/effort. In general focus on testing implementations that are likely to change or break.
I can think of two solutions for your problem:
Generate the builder code programmatically, so you don't need to run tests: the Java code is generated by a program and never edited by a user, so you test the generator instead. Use a text template and build the definitions from a serialized domain model or directly from the compiled bean classes (you'll need a separate module depending on the beans' module).
Write your tests against a proxy library. The problem is that regular JDK proxies can only implement interfaces, not classes, and it's cumbersome to maintain interfaces for JavaBeans. If you choose this route, I'd go with Javassist. I coded a runnable sample and put it on GitHub; the test cases use a proxy factory to instantiate beans (instead of using new), as shown below.
public class CountingCallsProxyFactory {

    public <T> T proxy(Class<T> classToProxy) throws ReflectiveOperationException {
        ProxyFactory factory = new ProxyFactory();
        factory.setSuperclass(classToProxy);
        Class<?> clazz = factory.createClass();
        @SuppressWarnings("unchecked")
        T instance = (T) clazz.getDeclaredConstructor().newInstance();
        ProxyObject proxy = (ProxyObject) instance;
        MethodCallCounter handler = new MethodCallCounter();
        proxy.setHandler(handler);
        return instance;
    }

    public void verifyAllGettersCalled(Object bean) {
        // Query the counter against the properties in the bean
    }
}
The counter is kept inside the class MethodCallCounter.
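The linked sample has the real implementation; as a rough, hedged sketch, such a counter can be a Javassist MethodHandler along these lines (the callCount accessor is hypothetical):

import java.lang.reflect.Method;
import java.util.HashMap;
import java.util.Map;

import javassist.util.proxy.MethodHandler;

// Counts invocations per method name, then lets the real method run.
public class MethodCallCounter implements MethodHandler {

    private final Map<String, Integer> calls = new HashMap<>();

    @Override
    public Object invoke(Object self, Method thisMethod, Method proceed, Object[] args) throws Throwable {
        calls.merge(thisMethod.getName(), 1, Integer::sum);
        return proceed.invoke(self, args); // invoke the actual implementation
    }

    public int callCount(String methodName) {
        return calls.getOrDefault(methodName, 0);
    }
}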
I'm a beginner in Java, so I need your professional suggestions.
Short description:
I have some classes A1, A2, A3, ..., B1, ...
I need one magic generic class or method that creates CompositeA, CompositeB, ...
These composite classes have defined, generalized behavior: what to do with which kind of method.
The kind of each method could be marked by annotations.
The methods themselves are different for A, B, ...
I want to avoid writing a CompositeX class every time I have a new class X.
The question: Is it possible? If yes, how?
Long description:
Our problem now:
There are two or more data sources, like a DB, files, and libs.
From these data sources we get similar information, for example IDs.
The information is in objects (like Hibernate entities) that extend one interface/abstract class.
For example, if the content we need is in a Foo class, we have FooDB, FooFile, FooLib, all implementing/extending Foo.
And we create some CombinedFoo to work with the combined data of all these sources:
to update the CombinedFoo data (insert or change some information), only one source is really used;
to retrieve the data from CombinedFoo, all the sources are used
(see the CompositeFoo constructor later).
For example, I have a Foo object that has data (in this case IDs) from the DB, another Foo from files, and another Foo from some lib.
All these Foos implement the same Foo interface.
Now I need all the sources in one combined object (that implements the Foo interface too).
To initialize the combined foo, I do
CompositeFoo compFoo = new CompositeFoo(fooDB, fooFILE, fooLIB);
compFoo.addId("value") calls only fooDB.addId("value").
compFoo.getIds() gets all the IDs from all the data sources
(fooDB.getIds() + fooFILE.getIds() + fooLIB.getIds()).
If I want not only Foo but, for example, Bar as well, I have to implement a new CompositeBar with its own methods (this is what I want to avoid). In the end, all I need is a way to mark which methods are only for retrieving information (get methods) and which are for changing information (set methods).
Now the question:
Is it possible to create such a magic generic class, written once, that can be applied to every class I want to combine?
Here is an example for better understanding:
import java.util.Collections;
import java.util.HashSet;
import java.util.Set;

// base type used for the composite class and the data classes
public abstract class Foo {
    abstract Set<String> getIds();
    abstract void addId(String id);
}

// data source from file
class FooFromFile extends Foo {
    @Override
    Set<String> getIds() {
        // loads data from file
    }

    @Override
    void addId(String id) {
        // changes the file, adds one id there
    }
}

// data source from DB
class FooFromDB extends Foo {
    @Override
    Set<String> getIds() {
        // loads ids from the database
    }

    @Override
    void addId(String id) {
        // adds the id to the database
    }
}

// the class I use now; combines all Foo's,
// gets info from all, updates only the main data source
// (this is the one I want to make generic)
class CompositeFoo extends Foo {
    private Foo mainFoo;
    private Set<Foo> others;

    public CompositeFoo(Foo mainSource, Foo... otherSources) {
        mainFoo = mainSource;
        others = new HashSet<>();
        Collections.addAll(others, otherSources);
    }

    @Override
    Set<String> getIds() {
        HashSet<String> result = new HashSet<>();
        // load from the main source
        result.addAll(mainFoo.getIds());
        // and from the other sources
        for (Foo otherFoo : others) {
            result.addAll(otherFoo.getIds());
        }
        // ids from all the sources
        return result;
    }

    @Override
    void addId(String id) {
        // uses only the main source; we don't want to add the ID to all the sources
        mainFoo.addId(id);
    }
}
Now it works like this:
public static void main(String[] args) {
    FooFromDB dbFooMain = new FooFromDB();
    FooFromFile fileFoo = new FooFromFile();
    CompositeFoo compFoo = new CompositeFoo(dbFooMain, fileFoo);
    compFoo.addId("new id"); // adds to dbFooMain only
    compFoo.getIds();        // gets from all sources (dbFooMain + fileFoo)
}
This is the magic I want to realize but don't know how:
It should work somehow like in this example:
public static void main(String[] args) {
    FooFromDB dbFooMain = new FooFromDB();
    FooFromFile fileFoo = new FooFromFile();

    // something like this:
    MagicComposite<Foo> compFoo = new MagicComposite<>(dbFooMain, fileFoo);
    // or something like this:
    Foo compFoo = AnyCompositeFactory.createComposite(Foo.class, dbFooMain, fileFoo);

    // then this should work:
    // adds to dbFooMain only, i.e. dbFooMain.addId("new id") is called
    compFoo.addId("new id");
    // gets ids from all sources, so dbFooMain.getIds() + fileFoo.getIds() are called and combined
    compFoo.getIds();

    // -----------------------------------------
    // for Bar the same magic class should do
    MagicComposite<Bar> compBar = new MagicComposite<>(dbBarMain, fileBar);
    // or
    Bar compBar = AnyCompositeFactory.createComposite(Bar.class, dbBarMain, fileBar);
}
So I want to avoid creating a Composite%ClassName% class for every %ClassName% data class.
I need some dynamic, generic CompositeMagic<Class> or whatever.
What I imagine:
If I have a Bar class, then I somehow annotate which methods are to be combined to get all information and which methods are just for changing information.
abstract class Bar {
    @Magic(Type = retrieveInformationOnly)
    abstract Set<String> getSomething();

    @Magic(Type = addInformation)
    abstract void addSomething(String id);
}
In this case, if I initialize the CombinedBar with some other Bars and call addSomething() on this composite object, then only the main data source (the first constructor argument) should add the information; if I call getSomething(), all the registered instances are called and the combined result is returned (see the CompositeFoo implementation).
The implementation could look entirely different; the code examples here are just to clarify what I need.
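One way to get close to this with plain JDK reflection is a dynamic proxy. A minimal, hedged sketch follows: it assumes the combined type is an interface rather than an abstract class (JDK proxies cannot subclass classes; for that you would need Javassist or Byte Buddy), it assumes the fanned-out methods return Sets, and the @CombineResults annotation and all names are hypothetical.

import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import java.lang.reflect.Proxy;
import java.util.HashSet;
import java.util.Set;

// Hypothetical marker: methods tagged with this are fanned out to every
// source and their Set results merged; all other methods hit the main source.
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.METHOD)
@interface CombineResults {}

public final class AnyCompositeFactory {

    @SafeVarargs
    @SuppressWarnings("unchecked")
    public static <T> T createComposite(Class<T> type, T mainSource, T... otherSources) {
        return (T) Proxy.newProxyInstance(
                type.getClassLoader(),
                new Class<?>[] { type },
                (proxy, method, args) -> {
                    if (method.isAnnotationPresent(CombineResults.class)) {
                        // retrieval: call every source and merge the returned sets
                        Set<Object> combined = new HashSet<>((Set<Object>) method.invoke(mainSource, args));
                        for (T other : otherSources) {
                            combined.addAll((Set<Object>) method.invoke(other, args));
                        }
                        return combined;
                    }
                    // mutation (or anything unannotated): main source only
                    return method.invoke(mainSource, args);
                });
    }
}

With Foo rewritten as an interface whose getIds() is annotated @CombineResults, Foo compFoo = AnyCompositeFactory.createComposite(Foo.class, dbFooMain, fileFoo); would then behave like the hand-written CompositeFoo above.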
I can't find a simple way to add a custom field during serialization in Gson, and I was hoping someone may be able to help.
Here is a sample class to show my issue:
public class A {
    String id;
    String name;
    ...
}
When I serialize class A I would like to return something like:
{ "id":"123", "name":"John Doe", "url_to_user":"http://www.example.com/123" }
where url_to_user is not stored in my instance of class A, but can be generated with data in the instance of class A.
Is there a simple way of doing this? I would prefer to avoid writing an entire serializer just to add one field.
Use Gson.toJsonTree to get a JsonElement, with which you can interact dynamically.
A a = getYourAInstanceHere();
Gson gson = new Gson();
JsonElement jsonElement = gson.toJsonTree(a);
jsonElement.getAsJsonObject().addProperty("url_to_user", url);
return gson.toJson(jsonElement);
Well, the top-rated answer is a quick fix and not essentially bad when you are short on time, but here is the problem: there is no proper separation of concerns.
You are modifying the serialized JSON in the same place where you write your business logic. All serialization should happen inside a TypeAdapter or a JsonSerializer.
How can we maintain a proper separation of concerns?
The answer wraps around a bit of additional complexity, but the architecture demands it. Here we go (taken from my other answer):
First, we use a custom serializer for the type. Second, we have to create a copy constructor inside the base class and a wrapper subclass as follows.
Note: the custom serializer might seem like overkill but, trust me, it pays off in the long run for maintainability.
// Let's say the base class is named Cat
public class Cat {
    public String name;

    public Cat(String name) {
        super();
        this.name = name;
    }

    // COPY CONSTRUCTOR
    public Cat(Cat cat) {
        this.name = cat.name;
    }

    public String sound() {
        return name + " : \"meaow\"";
    }
}
// The wrapper subclass for serialization
public class CatWrapper extends Cat {
    public CatWrapper(String name) {
        super(name);
    }

    public CatWrapper(Cat cat) {
        super(cat);
    }
}
And the serializer for the type Cat:
public class CatSerializer implements JsonSerializer<Cat> {

    @Override
    public JsonElement serialize(Cat src, Type typeOfSrc, JsonSerializationContext context) {
        // Essentially the same as the type Cat
        JsonObject catWrapped = context.serialize(new CatWrapper(src)).getAsJsonObject();

        // Here we can customize the generated JSON from the wrapper as we want.
        // We can add a field, remove a field, etc.
        // The main logic from the top-rated answer now lives here instead of
        // "spilling" around (kindly ignore the cat having a URL for the sake
        // of the example; how 'url' is derived from src is up to you).
        catWrapped.addProperty("url_to_user", url);
        return catWrapped;
    }
}
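Finally, the serializer is registered through the standard GsonBuilder API, for example:

Gson gson = new GsonBuilder()
        .registerTypeAdapter(Cat.class, new CatSerializer())
        .create();

String json = gson.toJson(new Cat("Whiskers")); // runs through CatSerializer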
So, why a copy constructor?
Well, once you define the copy constructor, no matter how much the base class changes, your wrapper will continue to fill the same role. Secondly, if we don't define a copy constructor and simply subclass the base class, then we would have to "talk" in terms of the extended class, i.e., CatWrapper. It is quite possible that your components talk in terms of the base class and not the wrapper type.