Easily create an instance of a Java DTO object from Scala code

I am converting the server side of my GWT project to use Scala instead of Java. I have a number of RPC servlets that do DB lookups and then map the results to an ArrayList of DTOs such as SomeDTO. In Scala I now write:
override def listTrips(): util.ArrayList[TripRoleDTO] = {
  val trd = new TripRoleDTO
  trd.setRoleType(RoleType.TripAdmin)
  trd.setTripName(sessionDataProvider.get().getSessionUser.getEmail)
  val res: util.ArrayList[TripRoleDTO] = new util.ArrayList[TripRoleDTO]()
  res.add(trd)
  res
}
instead of
@Override
public ArrayList<TripRoleDTO> listTrips() {
    final SessionData sessionData = sessionDataProvider.get();
    final List<TripRole> tripsForUser = tripAdminProvider.get().listTripRolesForUser(sessionData.getSessionUser().getId());
    return newArrayList(transform(tripsForUser, DTOConverter.convertTripRole));
}
Note that the Java implementation actually makes the DB call (something I'm still figuring out in Scala), but it does its DTO transformation via Google Guava's Iterables.transform method.
Since the DTO objects need to be .java files that the client side of GWT can use, what is an elegant way to transform my Scala domain objects to DTOs?

Use the GWT RequestFactory to automate the creation of DTOs. The DTO can be defined simply with an interface and a @ProxyFor annotation; see the example in the link provided.
If using RequestFactory is for some reason not an option, then consider using Dozer to map domain objects to DTOs; it is frequently used with GWT.
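For illustration, here is a minimal sketch of what such a proxy could look like for the TripRole entity used above (the proxy name is hypothetical; for a value object without an identity, ValueProxy could be used instead of EntityProxy):
import com.google.web.bindery.requestfactory.shared.EntityProxy;
import com.google.web.bindery.requestfactory.shared.ProxyFor;

// RequestFactory generates the transport representation from this interface,
// so no hand-written TripRoleDTO .java class needs to be maintained.
@ProxyFor(TripRole.class)
public interface TripRoleProxy extends EntityProxy {
    RoleType getRoleType();
    String getTripName();
}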

Related

Syntax for msearch in new Elastic Java Api

I am migrating my existing code from the old Elasticsearch Java API to the new Java API client (so, replacing RestHighLevelClient with ElasticsearchClient).
The new API (I'm writing in Scala) basically works like:
val resp = client.msearch(searchRequest, classOf[Product])
val products: List[Product] = resp.hits().hits()
But the whole point of msearch is to submit many queries to the ES server in a single HTTP request, right? And those queries don't necessarily have the same return schema.
What is the correct way to write this so that you have some base class Product, some subclass Product1 that is the return type of query #1, and some other subclass Product2 that is the return type of query #2? Does the new API not support this? The docs do not give clear guidance, and neither does the javadoc.
You can fetch the documents untyped, e.g. with classOf[Object] (Object.class in Java), when the queries hit different indices, as below.
val resp = client.msearch(searchRequest, classOf[Object])
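If the queries really do return different schemas, one possible follow-up (shown in Java; the calls are the same from Scala) is to fetch the hits untyped as JsonData and convert each hit to the subclass expected for that query position. This is only a sketch: accessor names such as responses(), isResult() and to() reflect my reading of the elasticsearch-java client and should be verified against the version you use, and Product1 is the hypothetical subclass for query #1 from the question.
import co.elastic.clients.elasticsearch.core.MsearchResponse;
import co.elastic.clients.elasticsearch.core.msearch.MultiSearchResponseItem;
import co.elastic.clients.elasticsearch.core.search.Hit;
import co.elastic.clients.json.JsonData;

// Run the multi-search untyped, then map each sub-response to its own class.
MsearchResponse<JsonData> resp = client.msearch(searchRequest, JsonData.class);

// responses() come back in the order the queries were submitted
MultiSearchResponseItem<JsonData> first = resp.responses().get(0);
if (first.isResult()) {
    for (Hit<JsonData> hit : first.result().hits().hits()) {
        Product1 p1 = hit.source().to(Product1.class); // query #1's schema
    }
}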

How to consume a Spring HAL/HATEOAS API in Java using purely Jackson, not Spring

We are trying to create a Java client for an API created with Spring Data.
Some endpoints return hal+json responses containing _embedded and _links attributes.
Our main problem at the moment is trying to wrap our heads around the following structure:
{
  "_embedded": {
    "plans": [
      {
        ...
      }
    ]
  },
  ...
}
When you hit the plans endpoint you get a paginated response, the content of which is inside the _embedded object. So the logic is that you call plans and get back a response containing an _embedded object, which contains a plans attribute holding an array of plan objects.
The content of the _embedded object can vary as well, and trying a solution using generics, like the example below, ended up returning a List of LinkedHashMap objects instead of the expected type.
class PaginatedResponse<T> {
    @JsonProperty("_embedded")
    Embedded<T> embedded;
    ....
}

class Embedded<T> {
    @JsonAlias({"plans", "projects"})
    List<T> content; // this, instead of type T, ends up deserialising as a List of LinkedHashMap objects
    ....
}
I am not sure if the above issue is relevant to this Jackson bug report dating from 2015.
The only solution we have so far is either to create a paginated response class for each type of content, with explicitly defined types, or to include a List<type_here> field for each type of object we expect to receive and make sure we only read from the populated list and not the null ones.
So our main question in this quite spread-out issue is: how is one supposed to navigate such an API without the use of Spring?
We do not consider using Spring in any form to be an acceptable solution. At the same time, and I may be quite wrong here, it looks like in the Java world Spring is the only framework actively supporting/promoting HAL/HATEOAS?
I'm sorry if there are wrongly expressed concepts, assumptions, or terminology in this question, but we are trying to wrap our heads around the philosophy of such an implementation and how to deal with it from a Java point of view.
You can try consuming a HATEOAS API using super type tokens: a fairly generic way to handle any kind of HATEOAS response.
For example, below is a generic class to handle the response:
public class Resource<T> {
    private T content;
    private List<Link> links;
    protected Resource() {
        this.content = null;
    }
    public Resource(T content, Link... links) {
        this(content, Arrays.asList(links));
    }
    public Resource(T content, List<Link> links) {
        this.content = content;
        this.links = links;
    }
}
And below is the code to read the response into various object types:
ObjectMapper objectMapper = new ObjectMapper();
Resource<ObjectA> objectA = objectMapper.readValue(response, new TypeReference<Resource<ObjectA>>() {});
Resource<ObjectB> objectB = objectMapper.readValue(response, new TypeReference<Resource<ObjectB>>() {});
You can refer to the posts below:
http://www.java-allandsundry.com/2012/12/json-deserialization-with-jackson-and.html
http://www.java-allandsundry.com/2014/01/consuming-spring-hateoas-rest-service.html
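Applied to the classes from the question, a minimal sketch could look like the following (Plan is a hypothetical contract class, and the fields are accessed directly as in the question's snippet). Because the TypeReference captures the full generic type, the List<T> inside Embedded binds to Plan instead of falling back to LinkedHashMap:
import java.util.List;
import com.fasterxml.jackson.core.type.TypeReference;
import com.fasterxml.jackson.databind.DeserializationFeature;
import com.fasterxml.jackson.databind.ObjectMapper;

ObjectMapper mapper = new ObjectMapper()
        // HAL responses carry extra attributes such as _links and page
        .configure(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES, false);

// "response" is the raw hal+json string returned by the plans endpoint
PaginatedResponse<Plan> page = mapper.readValue(
        response, new TypeReference<PaginatedResponse<Plan>>() {});

List<Plan> plans = page.embedded.content; // List<Plan>, not a List of LinkedHashMaps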

Preserve Generics when generating JSON schema

I'm using jackson-module-jsonSchema and jsonschema2pojo API.
Brief explanation: I'm trying to json-schemify my server's Spring controller contract objects (objects that the controllers return and objects that they accept as parameters) and package them up to use with a packaged retrofit client in order to break the binary dependency between the client and server. The overall solution uses an annotation processor to read the Spring annotations on the controller and generate a retrofit client.
I've got it mostly working, but realized today I've got a problem where generic objects are part of the contract, e.g.
public class SomeContractObject<T> {
    ...
}
Of course, when I generate the schema for said object, the generic types aren't directly supported. So when I send it through the jsonschema2pojo API I end up with a class like so:
public class SomeContractObject {
}
So my question is simple but may have a non-trivial answer: Is there any way to pass that information through via the json schema to jsonschema2pojo?

Google App Engine class on client side

I am developing an Android app using GAE on Eclipse.
On one of the EndPoint classes I have a method which returns a "Bla"-type object:
public Bla foo()
{
    return new Bla();
}
This "Bla" object holds a "Bla2"-type object:
public class Bla {
    private Bla2 bla = new Bla2();

    public Bla2 getBla() {
        return bla;
    }

    public void setBla(Bla2 bla) {
        this.bla = bla;
    }
}
Now, my problem is I can't access the "Bla2" class from the client side (even the method "getBla()" doesn't exist).
I managed to trick it by creating a second method on the EndPoint class which returns a "Bla2" object:
public Bla2 foo2()
{
    return new Bla2();
}
Now I can use the "Bla2" class on the client side, but the "Bla.getBla()" method still doesn't exist. Is there a right way to do it?
This isn't the 'right' way, but keep in mind that just because you are using endpoints, you don't have to stick to the endpoints way of doing things for all of your entities.
Like you, I'm using GAE/J and Cloud Endpoints and have an Android client. It's great running Java on both the client and the server because I can share code between all my projects.
Some of my entities are communicated and shared the normal 'endpoints way', as you are doing. But for other entities I still use JSON: I just stick them in a string, send them through a generic endpoint, and deserialize them on the other side, which is easy because the entity class is in the shared code.
This allows me to send 50 different entity types through a single endpoint, and it makes it easy for me to customize the JSON serializing/deserializing for those entities.
Of course, this solution gets you in trouble if you decide to add an iOS or Web client (unless you use GWT), but maybe that isn't important to you.
(edit - added some impl. detail)
Serializing your Java objects (or entities) to/from JSON is very easy, but the details depend on the JSON library you use. Endpoints can use either Jackson or GSON on the client. But for my own JSON'ing I used json.org, which is built into Android and was easy to download and add to my GAE project.
Here's a tutorial that someone just published:
http://www.survivingwithandroid.com/2013/10/android-json-tutorial-create-and-parse.html
Then I added an endpoint like this:
@ApiMethod(name = "sendData")
public void sendData( @Named("clientId") String clientId, String jsonObject )
(Or use a class that includes a List of Strings so you can send multiple entities in one request.)
And put an element into your JSON which tells the server which entity the JSON should be deserialized into.
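As a sketch of that envelope (hypothetical entity name and fields; org.json is the library mentioned above), the client could build the string that gets passed as the jsonObject argument of sendData like this:
import org.json.JSONException;
import org.json.JSONObject;

// Builds the string handed to the generic sendData endpoint.
// The "type" element tells the server which entity class to deserialize into.
String buildEnvelope() throws JSONException {
    JSONObject payload = new JSONObject();
    payload.put("name", "Summer trip");            // hypothetical entity fields
    payload.put("ownerEmail", "user@example.com");

    JSONObject envelope = new JSONObject();
    envelope.put("type", "Trip");                  // hypothetical entity type name
    envelope.put("payload", payload);
    return envelope.toString();
}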
Try using @ApiResourceProperty on the field.

GWT manually serialize domain object on server

The first thing my GWT app does when it loads is request the current logged-in user from the server via RequestFactory. This blocks because I need properties of the User to know how to proceed. It only takes < 500ms, but it really annoys me that the app is blocked during this time. I already have the User on the server when the JSP is generated, so why not just add the serialized User to the JSP and eliminate this request altogether?
I have two problems keeping me from doing this:
I need to transform User to UserProxy
I need to serialize UserProxy in a way that is easy for GWT to deserialize.
I have not figured out a good way to do #1. The logic appears to be buried in ServiceLayerDecorator without an easy way to isolate it, though I may be wrong here.
The second one seems easier via ProxySerializer. But how do I get my hands on the RequestFactory when I am on the server? You cannot call GWT.create on the server.
I have been looking into AutoBeans, but this does not handle #1 above. My UserProxy has references to collections of other EntityProxy objects that I would like to maintain.
It is possible using AutoBeans if you create an AutoBeanFactory for your proxies:
To transform User to UserProxy:
Create a server-side RequestFactory and invoke the same request as usual. The response will contain a UserProxy (but on the server).
To serialize UserProxy:
AutoBean<UserProxy> bean = AutoBeanUtils.getAutoBean(receivedUserProxy);
String json = AutoBeanCodex.encode(bean).getPayload();
To deserialize UserProxy on client:
AutoBean<UserProxy> bean = AutoBeanCodex.decode(userAutoBeanFactory, UserProxy.class, json);
Creating an in-process RequestFactory on the server (tutorial):
public static <T extends RequestFactory> T create( Class<T> requestFactoryClass ) {
    ServiceLayer serviceLayer = ServiceLayer.create();
    SimpleRequestProcessor processor = new SimpleRequestProcessor( serviceLayer );
    T factory = RequestFactorySource.create( requestFactoryClass );
    factory.initialize( new SimpleEventBus(), new InProcessRequestTransport( processor ) );
    return factory;
}
You could use AutoBeans for this as well if you are able to make User implement UserProxy. It works because proxies are interfaces with getters/setters:
interface UserFactory extends AutoBeanFactory
{
    AutoBean<UserProxy> user(UserProxy toWrap); // wrap an existing instance in an AutoBean
}
Then on the server you can create the AutoBean and serialize it to JSON:
UserFactory factory = AutoBeanFactorySource.create(UserFactory.class);
AutoBean<UserProxy> userProxyBean = factory.user( existingUserPojo );
// convert the AutoBean to JSON
String json = AutoBeanCodex.encode(userProxyBean).getPayload();
On the client you can just use AutoBeanCodex.decode to deserialize the JSON back into a bean.
You cannot call GWT.create on the server (or from any real JVM), but in many cases you can call a JVM-compatible method designed for server use instead. In this case, take a look at RequestFactorySource.create.
It can be a little messy to get the server to read from itself and print out data using RequestFactory. Here is a demo example of how this can work (using GWT 2.4; the main branch has the same thing for 2.3 or so): https://github.com/niloc132/tvguide-sample-parent/blob/gwt-2.4.0/tvguide-client/src/main/java/com/acme/gwt/server/TvViewerJsonBootstrap.java - not quite the same thing you are after, but it may be possible to use the same idea to populate a string in a proxy store that can be read in the client (seen here: https://github.com/niloc132/tvguide-sample-parent/blob/gwt-2.4.0/tvguide-client/src/main/java/com/acme/gwt/client/TvGuide.java).
The basic idea is to create a request (including ids, invocations, and with() arguments so the proxy builder makes all the right pieces in a consistent way), and pass it into a SimpleRequestProcessor instance, which will then run it through the server pieces it normally would. (Any entity management system should probably still have the entities cached to avoid an additional lookup; otherwise you need to model some of the work SimpleRequestProcessor does internally.) The ProxySerializer, which wraps a ProxyStore, expects to have full RequestFactory messages as sent from the server, so a fair bit of message bookkeeping needs to be done correctly.
I found the answer on the GWT Google Group. All credit goes to Nisha Sowdri NM.
Server side encoding:
DefaultProxyStore store = new DefaultProxyStore();
ProxySerializer ser = requests.getSerializer(store);
final String key = ser.serialize(userProxy);
String message = key + ":" + store.encode();
Client side decoding:
String[] parts = message.split(":", 2);
ProxyStore store = new DefaultProxyStore(parts[1]);
ProxySerializer ser = requests.getSerializer(store);
UserProxy user = ser.deserialize(UserProxy.class, parts[0]);
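To finish the original plan of bootstrapping the user from the JSP, one option (a sketch, assuming the JSP writes the encoded message into a global JavaScript object and escapes it appropriately for a script block) is to read it on the client with GWT's Dictionary before running the decoding above:
import com.google.gwt.i18n.client.Dictionary;

// The host page / JSP emits something like:
//   <script>var bootstrap = { "user": "<escaped key:payload message>" };</script>
Dictionary bootstrap = Dictionary.getDictionary("bootstrap");
String message = bootstrap.get("user");
// then split and deserialize exactly as in the client-side decoding above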
