Replace @QueryResult while switching from SDN+OGM to SDN/RX

Up to Spring Boot 2.3.4, I had been using the @QueryResult annotation to map the responses of custom Cypher queries to POJOs. I'm now testing the first Spring Boot 2.4 release candidate and trying to follow the instructions on dropping OGM, since its support has been removed. I successfully replaced the other annotations with the ones provided here:
https://neo4j.github.io/sdn-rx/current/#migrating
but I'm now left with my @QueryResult annotations, for which nothing is specified. When I delete them I get mapping errors:
org.springframework.data.mapping.MappingException: Could not find mappable nodes or relationships inside Record
I've looked at some of the mapping explanations, but here's the thing: my custom POJOs don't represent any entity from the database, nor do they represent part(s) of an entity. They are rather relevant bits from different nodes.
Let me give an example:
I want to get all b nodes that are targets of the MY_REL relationship from a:
(a:Node {label:"my label"})-[:MY_REL]->(b:Node)
For my purposes, I don't need to get the nodes in the response, so my POJO only has 2 attributes:
a "source" String which is the beginning node's label
a "targets" Set of String which is the list of end nodes' labels
and I return this:
RETURN a.label AS source, COLLECT(b.label) AS targets
My POJO was simply annotated with @QueryResult in order to get the mapping done.
Does anyone know how to reproduce this behaviour with the SB 2.4 release candidate? As I said, removing the now-faulty annotation leaves me with a mapping error, but I don't know what I should replace it with.

Spring Data Neo4j 6 now supports projections (the replacement for @QueryResult) in line with the other Spring Data modules.
Having said this, the simplest thing you could do, assuming that this @Query is written in a Neo4jRepository<Node, ...>, would be to also return the a node.
I know this sounds ridiculous at first, but by choosing the repository abstraction you are saying that everything processed during the mapping phase is a Node and that you want to project its properties (or a subset of them) into the POJO (a DTO projection). SDN cannot ensure that you are really working with the right type when it starts the mapping, so it throws the exception you are facing. Neo4j-OGM was more relaxed behind the scenes when mapping @QueryResults, but unfortunately also wrong in this regard.
If your use case is as simple as you describe, I would strongly suggest using the Neo4jClient (docs), which gives you direct access to the mapping.
It has a fluent API for querying and manual mapping, and it participates in the ongoing Spring transactions your repositories are running within.
There is a lot more to say about projections, so I would also suggest reading that section of the documentation.
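For the query in the question, a minimal sketch with the Neo4jClient could look like this (MyResult is a hypothetical POJO with a matching constructor; neo4jClient is an injected Neo4jClient bean):

```java
// Sketch only: MyResult(String source, Set<String> targets) is a hypothetical POJO.
Collection<MyResult> results = neo4jClient
        .query("MATCH (a:Node {label: $label})-[:MY_REL]->(b:Node) "
             + "RETURN a.label AS source, collect(b.label) AS targets")
        .bind("my label").to("label")
        .fetchAs(MyResult.class)
        .mappedBy((typeSystem, record) -> new MyResult(
                record.get("source").asString(),
                new HashSet<>(record.get("targets").asList(Value::asString))))
        .all();
```

The mappedBy lambda receives the raw driver Record, so you decide yourself how the returned columns become your POJO, and no node mapping is attempted at all.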


@Field annotation attributes not working for dynamic mapping feature

We are running spring-data 4.1.0 and Spring 5.2.10.
This may sound weird, but here is the scenario (initial state of ES: no index/mapping):
Fire up the container, and the first thing spring-data-es does is create the index with all the mappings. Yea! But if that process fails for some reason, the mapping does not get created. OK, understandable.
After that (mapping creation failed), you save an entity. It appears Spring/ES will dynamically start generating the mapping for that entity as it is saved. Cool, yea! But... some of the @Field attributes are not getting into the mapping, e.g. the copy_to attribute.
I don't know how the dynamic mapping works, whether it happens on the Java side or the ES side. I guess if the dynamic mapping happens on the ES side, then this behavior makes sense. But I think I noticed other @Field attributes making their way into the mapping, like the field type and data conversion settings.
Is this the expected behavior? I am thinking that @Field annotation attributes should make their way into the mapping regardless of how the mapping gets created.
Your assumptions are correct.
The @Field annotations are only considered when the mapping is written by Spring Data Elasticsearch - on repository initialization, or when one of the corresponding methods from the IndexOperations interface is called.
When writing the mapping on index creation fails, it is not automatically retried afterwards. And it will not be done on the next application start either, because by then the index already exists.
When an entity is stored in an index that does not have a mapping defined, Elasticsearch creates the mapping automatically, and since Elasticsearch does not know anything about Spring Data Elasticsearch annotations, they are ignored.
Did you get an error in the application when the mapping could not be stored?
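If the initial mapping creation failed, one option is to trigger it manually through the IndexOperations interface mentioned above. A minimal sketch, assuming Spring Data Elasticsearch 4.x and a hypothetical @Document-annotated MyEntity class:

```java
// elasticsearchOperations is an injected ElasticsearchOperations bean;
// MyEntity is a placeholder for your own annotated entity class.
IndexOperations indexOps = elasticsearchOperations.indexOps(MyEntity.class);
if (!indexOps.exists()) {
    indexOps.create();              // create the index itself
}
// (Re)write the mapping that Spring Data Elasticsearch derives from
// the @Field annotations on MyEntity.
indexOps.putMapping(indexOps.createMapping(MyEntity.class));
```

Running something like this on startup (after checking the index state) lets you recover from a failed initial mapping without deleting the index.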

Don't roll your own ORM implementation, but...

The title already states one of the key rules regarding ORM:
Don't roll your own ORM implementation
But I have a situation here where I'm not sure how to implement our requirements properly.
To give you a bit of background, currently we are using Spring Data JPA with Hibernate as JPA Implementation and all is fine so far.
But we have separate fields which we want to "manage" automatically, somewhat similar to auditing annotations (@CreatedBy, @ModifiedBy, ...).
In our case this is e.g. a specific "instance" the entity belongs to.
Our Application is rather a Framework than an App, so other Developers frequently add Entities and we want to keep it simple and intuitive.
But we do not only want to set it automatically on save; we also want to add it as a condition to most "simple and frequent" queries (see my related question: Inject further query conditions / inject automatic entity field values when using Spring Data JPA Repositories).
Thus, I thought about building a simple layer on top of the EntityManager and its Criteria API to support at least simple queries like
findById(xx)
findByStringAttribute(String attribute, String value)
findByIntegerAttribute(String attribute, int value)
...
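As a rough illustration of that idea, such a layer could wrap the Criteria API and inject the managed condition into every query. This is only a sketch; the entity type is assumed to have an "instance" attribute, and all names are made up:

```java
import java.util.List;
import javax.persistence.EntityManager;
import javax.persistence.criteria.CriteriaBuilder;
import javax.persistence.criteria.CriteriaQuery;
import javax.persistence.criteria.Root;

// Thin query layer that always adds the automatically managed
// "instance" condition; all names are illustrative.
public class InstanceScopedDao<T> {

    private final EntityManager em;
    private final Class<T> type;
    private final String currentInstance;

    public InstanceScopedDao(EntityManager em, Class<T> type, String currentInstance) {
        this.em = em;
        this.type = type;
        this.currentInstance = currentInstance;
    }

    public List<T> findByStringAttribute(String attribute, String value) {
        CriteriaBuilder cb = em.getCriteriaBuilder();
        CriteriaQuery<T> query = cb.createQuery(type);
        Root<T> root = query.from(type);
        query.where(
                cb.equal(root.get(attribute), value),
                // the injected, automatically managed condition:
                cb.equal(root.get("instance"), currentInstance));
        return em.createQuery(query).getResultList();
    }
}
```

Other finder variants (findById, findByIntegerAttribute, ...) would follow the same pattern, each adding the instance predicate before executing.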
I'm not sure if this question is too broad, but what are your thoughts on that? Is this a reasonable idea, or should I skip it?

SDN4 : How to add multiple custom labels to a NodeEntity

Prior to using SDN 4, I used custom REST client code to implement my own DAO layer between the client & the Neo4j DB. I was able to add a number of labels to the nodes I created. This also appears to have been possible in SDN 3, from what I can deduce from the docs & other questions, using the @Labels annotation.
However, @Labels does not appear to be supported in SDN 4, and the SDN 4 documentation implies that only the class name of an entity class (and of any superclasses) will be added as a label to a node entity on creation.
Is there a way to add additional labels to a node? I need the values of such labels to be supplied by the user, not hard coded in an annotation.
An unmanaged extension would be the way to do it currently. When a node is created, the extension can assign the additional labels as required. You could either write an unmanaged extension directly or build a GraphAware Runtime Module.

What is a good strategy for converting jpa entities into restful resources

RESTful resources do not always have a one-to-one mapping with your JPA entities. As I see it, there are a few problems that I am trying to figure out how to handle:
When a resource has information that is populated and saved by more than one entity.
When an entity has more information in it than you want to send down as a resource. I could just use Jackson's @JsonIgnore, but I would still have issues 1, 3, and 4.
When an entity (like an aggregate root) has nested entities and you want to include part of those nested entities in your resource, but only up to a certain level of nesting.
When you want to exclude one piece of an entity when it's part of one parent entity, but exclude a different piece when it's part of a different parent entity.
Blasted circular references (I got this mostly working with JSOG using Jackson's @JsonIdentityInfo).
Possible solutions:
The only way I could think of that would handle all of these issues would be to create a whole bunch of "resource" classes that would have constructors that took the needed entities to construct the resource and put necessary getters and setters for that resource on it. Is that overkill?
To solve 2, 3, 4, and 5, I could do some pre- and post-processing on the actual entity before handing it to Jackson to serialize or deserialize my POJO to JSON, but that doesn't address issue 1.
These are all problems I would think others have come across, and I am curious what solutions other people have come up with. (I am currently using JPA 2, Spring MVC, Jackson, and Spring Data, but I am open to other technologies.)
With a combination of JAX-RS 1.1 and Jackson/Gson you can expose JPA entities directly as REST resources, but you will run into a myriad of problems.
DTOs i.e. projections onto the JPA entities are the way to go. It would allow you to separate the resource representation concerns of REST from the transactional concerns of JPA. You get to explicitly define the nature of the representations. You can control the amount of data that appears in the representation, including the depth of the object graph to be traversed, if you design your DTOs/projections carefully. You may need to create multiple DTOs/projections for the same JPA entity for the different resources in which the entity may need to be represented differently.
Besides, in my experience, using annotations like @JsonIgnore and @JsonIdentityInfo on JPA entities doesn't exactly lend itself to more usable resource representations. You may eventually run into trouble when merging the objects back into the persistence context (because of the ignored properties), or your clients may be unable to consume the resource representations, since object references as a scheme may not be understood. Most JavaScript clients will usually have trouble consuming the object references produced by the @JsonIdentityInfo annotation, due to the lack of standardization here.
There are additional benefits that DTOs/projections make possible. JPA @EmbeddedIds do not fit naturally into REST resource representations. Some advocate using the JAX-RS @MatrixParam annotation to identify the resource uniquely in resource URIs, but this does not work out of the box for most clients. Matrix parameters are, after all, only a design note and not a standard (yet). With a DTO/projection, you can serve out the resource representation against a computed Id (which could be a combination of the constituent keys).
Note: I currently work on the JBoss Forge plugin for REST, where some or all of these issues exist; they would be fixed in some future release via the generation of DTOs.
I agree with the other answers that DTOs are the way to go. They solve many problems:
Separation of layers and clean code. One day you may need to expose the data model using a different format (e.g. XML) or interface (e.g. non-web-service based). Keeping all configuration (such as @JsonIgnore, @JsonIdentityInfo) for each interface/format in the domain model would make it really messy. DTOs separate the concerns. They can contain all the configuration required by your external interface (web service) without requiring changes in the domain model, which can stay web-service and format agnostic.
Security - you easily control what is exposed to the client and what the client is allowed to modify.
Performance - you easily control what is sent to the client.
Issues such as (circular) entity references, lazily-loaded collections are also resolved explicitly and knowingly by you on converting to DTO.
Given your constraints, there looks to be no other solution than Data Transfer Objects - yes, it's occurring frequently enough that people named this pattern...
If your application is completely CRUDish, then the way to go is definitely Spring Data REST, in which you absolutely do not need DTOs. If it's more complicated than that, you will be safer with DTOs securing the application layer. But do not attempt to encapsulate DTOs inside the controller layer. They belong to the service layer, because the mapping is also part of the logic (what you let into the application and what you let out of it). This way the application layer stays hermetic. Of course, in most cases it can be a mix of those two.
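To illustrate the DTO approach the answers describe, here is a plain-Java sketch (JPA annotations omitted for brevity; all class and field names are made up). The DTO constructor decides exactly which fields are exposed and how deep the object graph is traversed:

```java
import java.util.List;
import java.util.stream.Collectors;

// Stand-in for a JPA entity; names are illustrative.
class Customer {
    private final Long id;
    private final String name;
    private final String passwordHash;     // internal, must never be exposed
    private final List<Order> orders;

    Customer(Long id, String name, String passwordHash, List<Order> orders) {
        this.id = id; this.name = name;
        this.passwordHash = passwordHash; this.orders = orders;
    }
    Long getId() { return id; }
    String getName() { return name; }
    List<Order> getOrders() { return orders; }
}

class Order {
    private final String description;
    Order(String description) { this.description = description; }
    String getDescription() { return description; }
}

// The DTO controls the representation: no password hash, and the
// object graph is deliberately cut off after one level of nesting.
class CustomerDto {
    final Long id;
    final String name;
    final List<String> orderDescriptions;

    CustomerDto(Customer customer) {
        this.id = customer.getId();
        this.name = customer.getName();
        // Converting inside the service layer (i.e. inside the transaction)
        // also means lazy collections can still be initialized here.
        this.orderDescriptions = customer.getOrders().stream()
                .map(Order::getDescription)
                .collect(Collectors.toList());
    }
}
```

The same entity can back several different DTOs, one per resource representation in which it appears.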

Best workaround for Spring MVC Json parsing limitations

I have a project which uses Spring and Hibernate and has controllers which return JSON. Naturally, my models contain lists and such, and use JPA annotations to define the Hibernate relationships; so, for example, I have Users, which contain a set of Challenges they own, and likewise a Challenge contains the User who owns it.
Unfortunately I seem to be having a lot of issues with collections embedded in my JSONs.
For example, with that setup (a User owns Challenges and a Challenge has an owner) I can return a Challenge just fine. I can return a User just fine. But when I try to return a list of Challenges, everything blows up! I receive the following error from my JMeter test:
Error 500 Server Error
I believe this means that the Jackson JSON parser had an issue serializing the JSON. I believe this because if I use @JsonIgnoreProperties({"challengesOwned"}) then I can return the list of Challenges just fine, since each individual Challenge object no longer has a list embedded inside it.
This seems very strange to me. Can Jackson really not map simple embedded lists in JSON? I've also got a huge problem because I have a Map which uses a User as its key... and it seems it's not even possible to define a JSON map key as an embedded object at all!
Does anyone have a suggestion for my issue? Do I have to manually define some Json mappings? Is there a simple solution I just don't know about?
EDIT:
While what j0ntech says does seem to have been true, it turns out that was not the whole story. It seems that when Spring used Jackson to serialize one of my Hibernate entities into its JSON version, Hibernate tried to lazily load one of that entity's properties; but since the entity was outside of its transaction at that point (being "in" the controller), this caused an exception, which just got swallowed.
So there were actually TWO issues. I figured this out by manually using Jackson to serialize the object I was returning before actually returning it. That way I got the stack trace for the other issue.
You probably have a recursive loop (as per DwB's comment): a User contains a list of Challenges, which each contain a User, which contains a list of Challenges, and so on and so forth. The parser (or your server at large) doesn't like that. You should use the annotations @JsonManagedReference and @JsonBackReference.
You can read about how to use these annotations here and here. I've used them in some of my own projects and they work very well if correctly implemented.
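For the User/Challenge example, the annotations would be placed roughly like this (a sketch; field names follow the question, getters/setters and other fields omitted):

```java
// Jackson breaks the cycle by serializing the "managed" side normally
// and omitting the "back" side, restoring it again on deserialization.
public class User {
    private Long id;
    private String name;

    @JsonManagedReference          // the forward side: serialized as usual
    private List<Challenge> challengesOwned;
}

public class Challenge {
    private Long id;
    private String title;

    @JsonBackReference             // omitted from JSON output, re-linked on read
    private User owner;
}
```

With this in place, a serialized Challenge no longer embeds its owner (avoiding the loop), while a serialized User still embeds its full list of Challenges.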
You could also try Flexjson (used by Spring Roo) or Gson (developed by Google).
Parsing lists of JSON objects with Gson seems to be pretty straightforward:
http://rishabhsays.wordpress.com/2011/02/24/parsing-list-of-json-objects-with-gson/
