Objectify + JSP: displaying 1:N relationships

My bean looks like this:
@Entity
public class Fattura {
    @Id
    Long id;
    @NotEmpty
    String numero;
    @Min(value = 0)
    Double importo;
    Key<User> utente;
    // gets & sets....
}
The "utente" property is the key of another bean I created: a "Fattura" can have only one "User", one "User" can have many "Fattura"s
My Spring MVC controller will manage a request for a list of Fattura and display them in a simple jsp:
@RequestMapping(value = "/fatture", method = RequestMethod.GET)
public ModelAndView leFatture() {
    ModelAndView mav = new ModelAndView("fatture");
    mav.addObject("fatture", fatturaService.listFatture());
    return mav;
}
The code of the JSP is really simple: just a forEach loop in a table.
My question is:
how can I display the "utente"?
The only thing I have is its key, but I'd like to write something like ${fattura.utente.firstName} in my JSP. How can I do that?

Unfortunately, you have to manually fetch "utente" in your DAO class; there is no automatic fetching in Objectify like there is in Twig. In my POJOs I have the following fields:
@Transient private Organization sender; // Pickup location (for client RPC)
transient private Key<Organization> senderKey; // Pickup location (for Datastore)
I load the entity from the Datastore and then manually load the Organization using senderKey.
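A minimal sketch of that two-step load, assuming Objectify 3.x's get() methods (the Delivery/loadDelivery names are illustrative, not from the original code):

public Delivery loadDelivery(Long id) {
    Objectify ofy = ObjectifyService.begin();
    Delivery delivery = ofy.get(Delivery.class, id);        // first datastore call
    delivery.setSender(ofy.get(delivery.getSenderKey()));   // second call fills the @Transient field
    return delivery;
}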
In the new Objectify4 you'll be able to do what you want, like this:
class Beastie {
    @Parent
    @Load
    ParentThing parent;

    @Id Long id;

    @Load({"bigGroup", "smallGroup"})
    SomeThing some;

    @Load("bigGroup")
    List<OtherThing> others;

    @Load
    Ref<OtherThing> refToOtherThing;

    Ref<OtherThing> anotherRef; // this one is never fetched automatically
}
Here is the evolving design document of the new version.
Update, Nov 17, 2011: Big news: Twig author John Patterson joined the Objectify project today.

I know it sounds annoying that you have to manually fetch the two objects, but it's actually very useful to know that you're doubling your work and time by doing this - each "get" call takes a while, and the second won't start until the first is complete. In a typical NoSQL environment, you shouldn't often need two separate entities - is there a reason that you do?
There are only two reasons I can easily think of:
The class references another object of the same type - this is the example in the Objectify documentation, where a person has a reference to their spouse, who is also a person.
The class that you're embedding the other into ("Fattura" in your case) has masses of data in it that you don't want fetched at the same time as the "User" - and you need the user on its own more often than you need the "Fattura" and the "User" together. It would need to be quite a lot of data to be worth the extra datastore call when you DO want the "Fattura".
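If the User fields needed for display are small, one way to act on this advice (a hedged sketch; the denormalized field is an assumption, not part of the original model) is to copy them onto Fattura:

@Entity
public class Fattura {
    @Id Long id;
    String numero;
    Double importo;
    Key<User> utente;        // kept for lookups and updates
    String utenteFirstName;  // denormalized copy; must be refreshed whenever the User changes
}

The JSP can then render ${fattura.utenteFirstName} with no second datastore call.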

You don't necessarily have to use a temporary field just to get the object.
This works:
public User getUtente() {
    Objectify ofy = ObjectifyService.begin();
    return ofy.get(utenteKey);
}
This will of course do a datastore get() each time the getter is called. You can improve this by using @Cached on your User entity, so they turn into memcache calls after the first call. Memcache is good, but we can do a little better using the session cache:
public User getUtente() {
    Objectify ofy = myOfyProvider.get();
    return ofy.get(utenteKey);
}
The key thing here is that you need to provide (through myOfyProvider) an instance of Objectify that is bound to the current request/thread and that has the session cache enabled (i.e., for any given request, myOfyProvider.get() should return the same instance of Objectify).
In this setup, the exact same instance of User will be returned from the session cache each time the getter is called, and no requests to the datastore/memcache will be made after the initial load of this entity.
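A hedged sketch of such a provider; the ThreadLocal wiring is illustrative, and ObjectifyOpts.setSessionCache is assumed from the Objectify 3.x API (check your version):

public class MyOfyProvider {
    // One Objectify instance per thread/request; the session cache makes
    // repeated gets of the same key return the same object without extra I/O.
    private static final ThreadLocal<Objectify> OFY = new ThreadLocal<Objectify>() {
        @Override protected Objectify initialValue() {
            return ObjectifyService.begin(new ObjectifyOpts().setSessionCache(true)); // assumed API
        }
    };

    public Objectify get() {
        return OFY.get();
    }

    // Call at the end of the request (e.g. from a servlet filter) to avoid leaks.
    public void clear() {
        OFY.remove();
    }
}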

Related

Why is an entity being automatically saved without calling persist when using foreign generation strategy in bidirectional one-to-one mapping?

I have been practicing one-to-one mapping in Hibernate and don't understand this particular case. I have to say, the program is working fine and as I intended, but apparently I can omit a persist() call and it still works smoothly. The fact that it's working is good, but I want to know exactly why the call is optional. Let me write some details:
This is the user class, which is supposed to be the owning side of the mapping:
@Data
@Entity
public class User {
    @Id
    @GeneratedValue(strategy = GenerationType.SEQUENCE)
    private Long id;
    private String name;
    @OneToOne
    private Ticket ticket;

    public User() {}

    public User(String name) {
        this.name = name;
    }
}
And this is the ticket class that's supposed to be the dependent one:
@Data
@Entity
public class Ticket {
    @Id
    @GeneratedValue(generator = "foreignGenerator")
    @GenericGenerator(name = "foreignGenerator", strategy = "foreign",
        parameters = @org.hibernate.annotations.Parameter(name = "property", value = "user"))
    private Long id;

    @OneToOne(optional = false, mappedBy = "ticket")
    @PrimaryKeyJoinColumn
    private User user;

    public Ticket() {
    }

    public Ticket(User user) {
        this.user = user;
    }
}
I am trying to test the "shared primary key" strategy in one-to-one mapping. As you can see, I have set up the generator with the foreign strategy, which is supposed to make Ticket's id the same as its corresponding User's id.
@Bean
CommandLineRunner loadData() {
    return args -> {
        EntityManager em = emf.createEntityManager();
        em.getTransaction().begin();
        User user = new User("Test User");
        Ticket ticket = new Ticket(user);
        //em.persist(user);
        user.setTicket(ticket);
        em.persist(ticket);
        em.getTransaction().commit();
        em.close();
        //We don't have to call persist on user
    };
}
This program runs perfectly. Uncommenting the line that calls persist on user makes no difference. I am assuming that persisting ticket, which has its user property set, automatically saves the user as well. Therefore, the reason it makes no difference is that whether or not user is explicitly saved, it will get persisted when we persist ticket.
I want to know if my assumption is correct; any additional links to articles/documentation would be greatly appreciated. Especially I am wondering about the part I said above: "I am assuming that persisting ticket, which has its user property set, automatically saves the user as well." I couldn't find anything that would confirm or deny this. I know that the "shared primary key" approach in one-to-one mapping is the only use case of the "foreign" generation strategy, so there are not a lot of posts about it, and whatever posts exist get overshadowed by "foreign key" during the search.
Any help regarding this, or any other issue that might be wrong with the code provided above, would be appreciated. Thanks for taking the time to read this.
The JPA specification says this behavior is actually wrong:
Looking at the 3.0 release:
section "3.2.2. Persisting an Entity Instance" implies user is unmanaged after your persist (you can check with the em.contains method).
Section "3.2.4. Synchronization to the Database" covers the flush/commit which states:
• If X is a managed entity, it is synchronized to the database.
..
◦ For any entity Y referenced by a relationship from X, where the relationship to Y has not been annotated with the cascade element value cascade=PERSIST or cascade=ALL:
▪ If Y is new or removed, an IllegalStateException will be thrown by the flush operation (and the transaction marked for rollback) or the transaction commit will fail.
User is new, so this should result in an exception. That it works might be a glitch in how Hibernate handles the @PrimaryKeyJoinColumn annotation (speculation on my part) and the custom "foreignGenerator".
This is not a pattern I'd suggest you rely on; you should instead call persist explicitly to avoid inconsistencies with the behavior of other mapping setups.
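A minimal sketch of the portable alternative, keeping the shared-primary-key mapping: declare the cascade on Ticket's side so the single persist(ticket) call is legal under the spec:

@OneToOne(optional = false, mappedBy = "ticket", cascade = CascadeType.PERSIST)
@PrimaryKeyJoinColumn
private User user;

With cascade=PERSIST (or ALL), persisting Ticket also persists the new User, which is documented behavior rather than a Hibernate quirk.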

DDD implementation with Spring Data and JPA + Hibernate problem with identities

So I'm trying for the first time in a not so complex project to implement Domain Driven Design by separating all my code into application, domain, infrastructure and interfaces packages.
I also went with the full separation of JPA entities from the domain models that hold my business logic as rich models, using the Builder pattern for instantiation. This approach has created a headache for me, and I can't figure out if I'm doing it all wrong when using JPA + ORM and Spring Data with DDD.
Process explanation
The application is a REST API consumer (without any user interaction) that processes a fairly big amount of data resources daily through scheduled tasks and stores or updates them in MySQL. I'm using RestTemplate to fetch and convert the JSON responses into domain objects, and from there I'm applying any business logic within the domain itself, e.g. validation, events, etc.
From what I have read, the aggregate root object should have an identity for its whole lifecycle and should be unique. I have used the id of the REST API object because it is already something I use to identify and track objects in my business domain. I have also created a property for the technical id, so when I convert entities to domain objects it can hold a reference for the update process.
When I need to persist the domain objects to the data source (MySQL) for the first time, I convert them into entity objects and persist them using the save() method. So far so good.
Now, when I need to update those records in the data source, I first fetch them from the data source as a List of Employees and convert the entity objects to domain objects, then I fetch the list of Employees from the REST API as domain models. At that point I have two lists of the same domain object type, List<Employee>. I iterate over them using streams, checking whether objects are not equal() between them; if so, a third list is built with the Employee objects that need to be updated. I've already passed the technical id to the domain objects in this third list, so Hibernate can identify them and use it to update the records that already exist.
These are all fairly simple steps, until I use the saveAll() method to update the records.
Questions
I always see Hibernate using INSERT instead of updating the list of records. If I'm correct, the Hibernate session is not recognising the objects I'm throwing at it because I detached them when I converted them to domain objects?
Does anyone have a better idea how I can implement this differently, or fix this problem?
Or should I stop using this approach of two different objects and continue using them as rich entity models?
Simple classes to explain it with code
EmployeeDO.java
@Entity
@Table(name = "employees")
public class EmployeeDO implements Serializable {
    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY)
    private Long id;
    private String name;

    public EmployeeDO() {}
    ...omitted getters/setters
}
Employee.java
public class Employee {
    private Long persistId;
    private Long employeeId;
    private String name;

    private Employee() {}
    ...omitted getters and Builder
}
EmployeeConverter.java
public class EmployeeConverter {
    public static EmployeeDO serialize(Employee employee) {
        EmployeeDO target = new EmployeeDO();
        if (employee.getPersistId() != null) {
            target.setId(employee.getPersistId());
        }
        target.setName(employee.getName());
        return target;
    }

    public static Employee deserialize(EmployeeDO employee) {
        return new Employee.Builder(employee.getEmployeeId())
                .withPersistId(employee.getId()) // <-- technical ID setter
                .withName(employee.getName())
                .build();
    }
}
EmployeeRepository.java
@Component
public class EmployeeRepositoryImpl implements EmployeeRepository {
    @Autowired
    EmployeeJpaRepository db;

    @Override
    public List<Employee> findAll() {
        return db.findAll().stream()
                .map(employee -> EmployeeConverter.deserialize(employee))
                .collect(Collectors.toList());
    }

    @Override
    public void saveAll(List<Employee> employees) {
        db.saveAll(employees.stream()
                .map(employee -> EmployeeConverter.serialize(employee))
                .collect(Collectors.toList()));
    }
}
EmployeeJpaRepository.java
@Repository
public interface EmployeeJpaRepository extends JpaRepository<EmployeeDO, Long> {
}
I use the same approach in my project: two different models, one for the domain and one for persistence.
First, I would suggest that you don't use the converter approach but use the Memento pattern instead: your domain entity exports a memento object and can be restored from that same object. Yes, the domain gains two functions that aren't related to the domain itself (they exist just to satisfy a non-functional requirement), but on the other side you avoid exposing functions, getters and constructors that the domain business logic never uses.
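A minimal sketch of that memento idea, using the Employee example from the question (names are illustrative):

public class Employee {
    private Long employeeId;
    private String name;

    private Employee() {}

    // Export state for the infrastructure layer.
    public EmployeeMemento toMemento() {
        return new EmployeeMemento(employeeId, name);
    }

    // Restore from persisted state; no public setters needed.
    public static Employee fromMemento(EmployeeMemento m) {
        Employee e = new Employee();
        e.employeeId = m.employeeId;
        e.name = m.name;
        return e;
    }
}

// Plain state holder used by the persistence code for insert/update queries.
public class EmployeeMemento {
    public final Long employeeId;
    public final String name;

    public EmployeeMemento(Long employeeId, String name) {
        this.employeeId = employeeId;
        this.name = name;
    }
}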
For the persistence part, I don't use JPA, exactly for this reason: you have to write a lot of code to reload, update and persist the entities correctly. I write SQL code directly: I can write and test it quickly, and once it works I'm sure it does what I want. With the memento object I have exactly what I will use in the insert/update query, and I save myself a lot of the headaches JPA brings when handling complex table structures.
Anyway, if you want to use JPA, the only solution is to (sketched below):
load the persistence entities and transform them into domain entities
update the domain entities according to the changes that you have to make in your domain
save the domain entities, which means:
reload the persistence entities
change them (or create new ones) with the changes that you get from the updated domain entities
save the persistence entities
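A hedged sketch of that reload-then-copy flow against the question's repository (assuming Spring Data 2.x's findById; the copy logic is illustrative):

@Override
public void saveAll(List<Employee> employees) {
    for (Employee domain : employees) {
        // Reload the managed persistence entity, or create a new one.
        EmployeeDO entity = domain.getPersistId() != null
                ? db.findById(domain.getPersistId()).orElse(new EmployeeDO())
                : new EmployeeDO();
        // Copy the changes from the domain object onto the managed entity.
        entity.setName(domain.getName());
        db.save(entity); // an entity with an existing id becomes an UPDATE, not an INSERT
    }
}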
I've tried a mixed solution, where the domain entities are extended by the persistence ones (a bit complex to do). A lot of care has to be taken to prevent the domain model from adapting itself to the restrictions JPA imposes through the persistence model.
Here is an interesting read about splitting the two models.
Finally, my suggestion is to consider how complex the domain is and use the simplest solution for the problem:
Is it big, with a lot of complex behaviours? Is it expected to grow into a big one? Use two models, domain and persistence, and manage the persistence directly with SQL. It avoids a lot of chaos in the read/update/save phases.
Is it simple? Then, first, should you use the DDD approach at all? If really yes, I would let the JPA annotations slip inside the domain. Yes, it's not pure DDD, but we live in the real world, and the time to do something simple the pure way should not be orders of magnitude bigger than the time needed to do it with some compromises. And, on the other side, you can put all this mapping in an XML file in the infrastructure layer, avoiding cluttering the domain with it, as is done in the Spring DDD sample here.
When you want to update an existing object, you first have to load it through entityManager.find() and apply the changes to that object, or use entityManager.merge(), since you are working with detached entities.
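A minimal sketch of the merge() variant (the surrounding service method is illustrative):

@Transactional
public void update(EmployeeDO detached) {
    // merge() copies the detached state onto a managed instance
    // (loading it first if necessary) and returns the managed copy;
    // the UPDATE is issued when the transaction commits.
    EmployeeDO managed = entityManager.merge(detached);
}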
Anyway, modelling rich domain models based on JPA is the perfect use case for Blaze-Persistence Entity Views.
Blaze-Persistence is a query builder on top of JPA which supports many advanced DBMS features on top of the JPA model. I created Entity Views on top of it to allow easy mapping between JPA models and custom interface-defined models, something like Spring Data Projections on steroids. The idea is that you define your target structure the way you like and map attributes (getters) via JPQL expressions to the entity model. Since the attribute name is used as the default mapping, you mostly don't need explicit mappings, as 80% of the use cases are DTOs that are a subset of the entity model.
The interesting point here is that entity views can also be updatable and support automatic translation back to the entity/DB model.
A mapping for your model could look as simple as the following:
@EntityView(EmployeeDO.class)
@UpdatableEntityView
interface Employee {
    @IdMapping("persistId")
    Long getId();
    Long getEmployeeId();
    String getName();
    void setName(String name);
}
Querying is a matter of applying the entity view to a query, the simplest being just a query by id.
Employee dto = entityViewManager.find(entityManager, Employee.class, id);
The Spring Data integration allows you to use it almost like Spring Data Projections: https://persistence.blazebit.com/documentation/entity-view/manual/en_US/index.html#spring-data-features and it can also be saved back. Here is a sample repository:
@Repository
interface EmployeeRepository {
    Employee findOne(Long id);
    void save(Employee e);
}
It will only fetch the mappings that you tell it to fetch and also only update the state that you make updatable through setters.
With the Jackson integration you can deserialize your payload onto a loaded entity view, or you can avoid loading altogether and use the Spring MVC integration to capture just the state that was transferred and flush that. That could look like the following:
@RequestMapping(path = "/employee/{id}", method = RequestMethod.PUT, consumes = MediaType.APPLICATION_JSON_VALUE)
public ResponseEntity<String> updateEmp(@EntityViewId("id") @RequestBody Employee emp) {
    employeeRepository.save(emp);
    return ResponseEntity.ok(emp.getId().toString());
}
Here you can see an example project: https://github.com/Blazebit/blaze-persistence/tree/master/examples/spring-data-webmvc

Spring And Hibernate - generic entity updates

I have a very simple task.
I have a "User" entity.
This user has tons of fields, for example:
firstName
age
country
.....
My goal is to expose a simple controller for update:
@RequestMapping(value = "/mywebapp/updateUser")
public void updateUser(data)
I would like clients to call my controller with updates that might include one or more fields to be updated.
What are the best practices for implementing such a method?
One naive solution would be to send the whole entity from the client, and on the server just override all fields, but that seems very inefficient.
Another naive and bad solution might be the following:
@Transactional
@RequestMapping(value = "/mywebapp/updateUser")
public void updateUser(int userId, String[] fieldNames, String[] values) {
    User user = this.userDao.findById(userId);
    for (int i = 0; i < fieldNames.length; i++) {
        switch (fieldNames[i]) {
            case "age":
                user.setAge(values[i]);
                break;
            case "firstName":
                user.setFirstName(values[i]);
                break;
            // ....
        }
    }
}
Obviously these solutions aren't serious; there must be a more robust/generic way of doing this (reflection, maybe).
Any ideas?
I once did this generically using Jackson. It has a very convenient ObjectMapper.readerForUpdating(Object) method that can read values from a JsonNode/tree onto an existing object.
The controller/service
@PATCH
@Transactional
public DomainObject partialUpdate(Long id, JsonNode data) throws IOException {
    DomainObject o = repository.get(id);
    return objectMapper.readerForUpdating(o).readValue(data);
}
That was it. We used Jersey to expose the services as REST web services, hence the @PATCH annotation.
As to whether this is a controller or a service: it handles raw transfer data (the JsonNode), but to work efficiently it needs to be transactional. (Changes made by the reader are flushed to the database when the transaction commits; reading the object in the same transaction allows Hibernate to dynamically update only the changed fields.)
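A hypothetical usage sketch (the repository and field names are assumptions): only the fields present in the JSON get overwritten; everything else on the loaded entity stays untouched:

User patchUser(Long userId, String json) throws IOException {
    ObjectMapper mapper = new ObjectMapper();
    JsonNode patch = mapper.readTree(json);       // e.g. {"firstName": "Anna"}
    User user = repository.get(userId);           // managed entity inside the transaction
    return mapper.readerForUpdating(user).readValue(patch); // sets only firstName
}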
If your User entity doesn't contain any security-sensitive fields like login or password, you can simply use it as the model attribute. In that case all fields will be updated automatically from the form inputs; fields that are not supposed to be updated, like id, should be hidden fields on the form.
If you don't want to expose all your entity properties to the presentation layer, you can use a POJO (aka command) to map all the needed fields from the user entity.
BTW, it is really bad practice to make your controller methods transactional. You should separate your application layers. You need a service layer; that is where the @Transactional annotation belongs. You do all the logic there before CRUD operations.
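A minimal sketch of that layering (names are illustrative):

@Service
public class UserService {
    @Autowired
    private UserDao userDao;

    @Transactional // the transaction lives in the service, not the controller
    public void updateUser(int userId, Map<String, String> changes) {
        User user = this.userDao.findById(userId);
        // ...apply the changes to the managed entity here;
        // they are flushed to the database when the transaction commits
    }
}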

Objectify loads object behind Ref<?> even when @Load is not specified

I have an account object which references a user object.
@Cache
@Entity
public final class Account {
    @Id Long id;
    @Index private Ref<User> user;

    public Long getId() {
        return id;
    }

    public void setId(Long id) {
        this.id = id;
    }

    public User getUser() {
        return user.get();
    }

    public void setUser(User user) {
        this.user = Ref.create(user);
    }
}
I have hidden the Ref as recommended here: http://code.google.com/p/objectify-appengine/wiki/Entities - please note that the Ref does not have the @Load annotation.
When I call my Google Cloud Endpoint from an Android client, it looks like Objectify delivers the account object with the embedded user, even though @Load is not specified.
@ApiMethod(name = "account.get")
public Account getAccount(
        @Named("id") final Long id
) {
    return ofy().load().type(Account.class).id(id).now();
}
When I query the account directly using the APIs Explorer, I also get both: the account with the user embedded:
200 OK
{
  "id": "5079604133888000",
  "user": {
    "id": "5723348596162560",
    "version": "1402003195251",
    "firstName": "Karl"
  },
  "kind": "api#accountItem",
  "etag": "\"30khohwUBSGWr00rYOZuF9f4BTE/Q31EvnQCQ6E9c5YXKEZHNsD_mlQ\""
}
This raises three questions:
Does App Engine always return embedded Refs natively, and does Objectify always pass on objects it already knows?
What exactly is @Load for, and is there a way to control this behavior? Load groups?
Have I missed something? Why isn't @Load obeyed?
In your example code you are not specifying @Load, which means that loading the account will not fetch the User. However, your @ApiMethod serializes the account back to the client, so the user property is being accessed, and thus a separate fetch is issued to load the user object. That's why you are getting the user's information when calling the method.
Not specifying @Load doesn't mean that you won't get a User back. It means that you won't retrieve the User unless you specifically ask for it later.
Ref works like this:
I'm a reference, so by default I won't fetch the data.
If you ask for me, then I will first load the data, then answer you.
Oh, if you tell me to @Load myself, then I will fetch the data initially and have it ready for you.
So this is working fine in your code... but then your @ApiMethod serializes your Account object back to the client. The serialization process goes through every property in your Account object, including the user property. At this point the Ref<User> is accessed, so the data is fetched from the Datastore and then returned to the client.
This is making your code very inefficient, since the Account objects are loaded without the User information, but then you always access the User info later (during serialization), issuing a separate fetch. Batching gets from the Datastore is way more efficient than issuing separate gets.
In your case, you can do either of two things:
Add @Load to the user property, so the Account object is fetched efficiently.
Make your @ApiMethod return a different Account object without the user property (thus avoiding fetching the user if you don't need it).
Option 2 above is quite useful, since you can abstract your internal Datastore structure from what the client sees. You'll find yourself using this pattern quite often.
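A minimal sketch of option 2 (the DTO name and mapping are illustrative):

public class AccountDto {
    public Long id;

    public static AccountDto from(Account account) {
        AccountDto dto = new AccountDto();
        dto.id = account.getId();
        return dto; // account.getUser() is never touched, so no extra datastore fetch
    }
}

The @ApiMethod would then return AccountDto instead of Account, keeping the wire format independent of the Datastore entities.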

Passing complex JPA Entities to the controller with POJO

My team is coding an application that involves editing wikipedia-like pages.
It is similar to the problem we have with registration:
A straightforward implementation gives something like
public static void doRegistration(User user) {
    //...
}
The user parameter is a JPA entity. The User model looks something like this:
@Entity
public class User extends Model {
    //some other irrelevant fields
    @OneToMany(cascade = CascadeType.ALL)
    public Collection<Query> queries;
    @OneToMany(cascade = CascadeType.ALL)
    public Collection<Activity> activities;
    //...
I have read here and there that this fails. Now, in Play!, what is the best course of action we can take? There must be some way to put all the data that has to go to the server in one object that can easily be saved into the database.
EDIT: The reason this fails is a validation failure. It somehow says "incorrect value" when validating collection objects. I was wondering if this can be avoided.
SOLUTION: Changing the Collection to List resolved the issue. This is a bug that will be fixed in Play 1.2 :)
Thanks in advance.
This works. To be clearer, you can define a controller method like the one you wrote:
public static void doRegistration(User user) {
    //...
}
That's completely fine. Just map it to a POST route and use a #{form} to submit the object, like:
#{form id:'myId', action:@Application.doRegistration()}
    #{field user.id}
    [...]
#{/form}
This works. You may have problems if you don't include all the fields of the entity in the form (if some fields are not editable, either use hidden inputs or a NoBinding annotation as described here).
EDIT: on the OneToMany subject, the relation will be managed by the "Many" side. That side has to keep the id of the related entity as a hidden field (with a value of object.user.id). This will solve all related issues.
It doesn't fail. If you have a running transaction, the whole thing will be persisted. Just note that transactions usually run within services, not controllers, so you should pass the object from the controller to a service.
