I'd like to store 2 new entities in a batch put. However, one entity is the parent of the other. If I have a field in the child object that looks like:
@Parent
private Key parent;
How do I fill in a value for this field in the child object if the parent hasn't been stored yet (and thus has no key yet)?
Allocate the id of the parent in advance. Then you can save the parent and the child (with a parent key reference) in a single batch put.
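For illustration, a minimal sketch with the low-level Java datastore API rather than the mapper from the question (the kind names "Parent" and "Child" are assumptions):
import java.util.Arrays;
import com.google.appengine.api.datastore.DatastoreService;
import com.google.appengine.api.datastore.DatastoreServiceFactory;
import com.google.appengine.api.datastore.Entity;
import com.google.appengine.api.datastore.Key;

DatastoreService ds = DatastoreServiceFactory.getDatastoreService();

// Reserve an id for the parent before anything is written
Key parentKey = ds.allocateIds("Parent", 1).getStart();

Entity parent = new Entity(parentKey);
Entity child = new Entity("Child", parentKey); // the child's key now has the parent as its ancestor

// Store both entities in a single batch put
ds.put(Arrays.asList(parent, child));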
You cannot do it that way (as one batch).
If your question is more concerned with data integrity, then you may use transactions.
Example:
from google.appengine.ext import db
from app.models import ParentModel, ChildModel

class ARequestHandler(BaseHandler):
    def get(self):
        def create_parent_and_child():
            parent_entity = ParentModel(...)
            parent_entity.put()
            # pass the stored parent as the entity-group parent of the child
            child_entity = ChildModel(parent=parent_entity, ...)
            child_entity.put()

        db.run_in_transaction(create_parent_and_child)
Related
Spring JPA Question:
We load quite a bit of data from a pretty slow service. Each DTO we load from the service has a foreign key to a table containing data. We don't need that data right away, just the key, and we don't want to look up the item for the key each time, because we would need to put a repository into our mapper, which would make everything even slower. Is it possible to tell Hibernate to just reference the table without needing an object/@ManyToOne? Then we could save the entity from our mapper right away without needing a custom mapping for the key.
So maybe something like:
@ForeignKey(tableName = "itemTemperature", referencedColumnName = "id")
var itemTemperatureId: Int = 0
Or does this work too:
var item = Item().apply {
itemTemperature = ItemTemperature().apply { id = keyFormService }
}
itemRepository.save(item)
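For reference, plain JPA can also hand back a lazy proxy for the referenced row without loading it, via EntityManager.getReference. A minimal sketch in Java, assuming an injected EntityManager, a temperatureId obtained from the service, and a setter on Item corresponding to the Kotlin property in the question:
// getReference typically returns an uninitialized proxy; no select is issued until it is accessed
ItemTemperature tempRef = entityManager.getReference(ItemTemperature.class, temperatureId);
item.setItemTemperature(tempRef);
itemRepository.save(item);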
I have the following hierarchy in my classes
@Entity
@Table(name = "Parent")
@Inheritance(strategy = InheritanceType.SINGLE_TABLE)
public class Parent {
}

@Entity
@DiscriminatorValue("FirstChild")
public class FirstChild extends Parent {
}

@Entity
@DiscriminatorValue("SecondChild")
public class SecondChild extends Parent {
}
This creates a table Parent as expected.
Some business logic in my app:
As the request is accepted, it is persisted as the "Parent" type and creates the following entry in the table:
DTYPE        id    value
Parent       1     "some value"
As the request is processed, it could either be of type FirstChild or of type SecondChild, so somewhere in my code I have:
if (some condition is met) {
    // change the Parent to its FirstChild type
} else {
    // change it to the SecondChild type
}
Is my understanding and usage of inheritance correct? I am essentially downcasting objects in the if block, which throws runtime exceptions. But does changing the type of an object also change the DTYPE value in the database? I am essentially using inheritance to keep things organized by their types. Is there some magic I can do in the constructors to achieve this? But the real question is: is the DTYPE modifiable on an update?
To work with this further, I created constructors in the child classes as:
public FirstChild(Parent parent) {
    id = parent.id;
}
but this creates a new row for the subtype:
DTYPE        id    value
Parent       1     "some value"
FirstChild   2     "some value"
I am making sure that the FirstChild record is created with the parent id (1), but I am not sure why a second row is created.
There is no real inheritance in relational databases; Hibernate only simulates it. Inheritance is a programming concept that lets one class build on another, but you cannot use another table as an owning table in that way. If a child table has a row, that row's "parent" is itself just another element, another row of data, so the relationship has to be simulated in the database.
Hibernate offers several inheritance strategies. SINGLE_TABLE is one of them: it collects the parent and all child types in one table. To differentiate which row belongs to which class, it uses a discriminator column in the database. This column cannot be changed at runtime; it is static and determined by your Java class. You use @DiscriminatorValue to choose the value that identifies a given Java class in the discriminator column.
So your understanding of inheritance differs from how it works in a relational database: every child's data has to be stored as rows of the parent table.
But the real question is: is the DTYPE modifiable on an update?
No, you can't: source.
From your example, your plain Java code would look like this:
Parent parent = new Parent(); // object received from the request, stored in the database as-is
if (some condition is met) {
    // FirstChild firstChild = (FirstChild) parent; // this would throw a ClassCastException
    FirstChild firstChild = new FirstChild(parent); // create a new object; the gc will collect parent
}
So Hibernate acts the same way as Java: you cannot update the Parent in place; you have to create a new record for the FirstChild and delete the Parent.
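A rough sketch of that delete-and-recreate step with a plain EntityManager (the injected em and the value accessors are assumptions):
if (some condition is met) {
    FirstChild firstChild = new FirstChild();
    firstChild.setValue(parent.getValue()); // copy whatever state must survive (hypothetical accessors)
    em.remove(parent);                      // delete the old Parent row
    em.persist(firstChild);                 // insert a new row with DTYPE = 'FirstChild' and a new id
}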
Wouldn't it be helpful for you to use @DiscriminatorFormula instead of checking whether some condition is met when processing the request?
@DiscriminatorFormula("case when value = 'some value' then 'FirstChild' else 'SecondChild' end")
Then you wouldn't have to save a Parent first and then process its records.
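A sketch of where that formula would sit, reusing the classes from the question (the value column comes from the example table above):
@Entity
@Table(name = "Parent")
@Inheritance(strategy = InheritanceType.SINGLE_TABLE)
@DiscriminatorFormula("case when value = 'some value' then 'FirstChild' else 'SecondChild' end")
public class Parent {
}

@Entity
@DiscriminatorValue("FirstChild")
public class FirstChild extends Parent {
}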
These are my findings:
1. There is no objective reason why a Child should extend Parent. In terms of modeling I would instead have one abstract class, let's call it FamilyMember, and then Parent, FirstBorn and SecondBorn would extend FamilyMember.
2. The downcasting problem would be solved by (1).
3. Hibernate does not support updating the discriminator value. This means that if a child comes of age and is no longer a child but a parent, the child record needs to be deleted and a new record corresponding to a Parent created. That is how it works: a child is a child until it is no longer a child, and then the child record gets deleted and a new non-child record gets created.
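A sketch of the modeling suggested in (1), with an assumed id mapping:
@Entity
@Inheritance(strategy = InheritanceType.SINGLE_TABLE)
public abstract class FamilyMember {

    @Id
    @GeneratedValue
    private Long id;
}

@Entity
@DiscriminatorValue("Parent")
public class Parent extends FamilyMember {
}

@Entity
@DiscriminatorValue("FirstBorn")
public class FirstBorn extends FamilyMember {
}

@Entity
@DiscriminatorValue("SecondBorn")
public class SecondBorn extends FamilyMember {
}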
I built a tree structure which is stored in the database. The relation is built on the id and parent_id columns in the database table. I'm using Spring Data and Hibernate.
For accessing the tree structure I built an entity class "Node" and a "NodeRepository". The entity class has an attribute "children" which has a @OneToMany relation to itself.
Fetching the nodes is no problem. But fetching the children is kind of a problem, because of lazy fetching outside of a transactional environment ("failed to lazily initialize a collection of role").
I changed the fetch mode to eager, but that is also a problem: because of the relation to itself, it ended up fetching the whole tree structure.
So, what is the best practice in this case to keep the retrieval of the children easy on the one hand while not fetching the whole structure on the other?
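For context, the self-referencing entity described above would look roughly like this (the parent field is an assumption; only Node, children, id and parent_id come from the question):
@Entity
public class Node {

    @Id
    @GeneratedValue
    private Long id;

    @ManyToOne(fetch = FetchType.LAZY)
    @JoinColumn(name = "parent_id")
    private Node parent;

    // lazy by default for @OneToMany
    @OneToMany(mappedBy = "parent")
    private List<Node> children = new ArrayList<>();
}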
I think I got a solution which fits my needs.
First I defined a custom interface for my NodeRepository, gave it an additional method findOneWithChildrenInit, and implemented it:
public interface NodeRepositoryCustom {
    public Node findOneWithChildrenInit(Long id);
}

public class NodeRepositoryImpl implements NodeRepositoryCustom {

    @Autowired
    NodeRepository repo;

    @Override
    @Transactional
    public Node findOneWithChildrenInit(Long id) {
        Node node = repo.findOne(id);
        // touch the lazy collection inside the transaction to force its initialization
        node.getChildren().size();
        return node;
    }
}
So I can decide: when I don't need the children, I simply call findOne(); when I do need them, I call findOneWithChildrenInit().
It probably depends on the provider implementation, but usually the children collection is initialized as soon as you use it in any way. So a common approach is to call children.size() before you leave the transaction and throw the result away.
You can still use the lazy fetch type and make sure the session has not been closed by the time all the children have been fetched. You can simply do:
Session session = sessionFactory.openSession();
// fetch all required Nodes & all their children by using the same session object
session.close();
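Filled in for a Node and its children, that could look like this (nodeId and the getter are assumptions):
Session session = sessionFactory.openSession();
try {
    Node node = session.get(Node.class, nodeId);
    // the children are loaded while the session is still open
    Hibernate.initialize(node.getChildren());
} finally {
    session.close();
}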
When you have more than one eager relationship in your entity, you need to include the annotation @Fetch(FetchMode.SELECT) on the other eager ones.
Check the official documentation for each of the types available in FetchMode.
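For instance, on an eager children collection this would look like the following sketch (using Hibernate's org.hibernate.annotations.Fetch and FetchMode):
@OneToMany(mappedBy = "parent", fetch = FetchType.EAGER)
@Fetch(FetchMode.SELECT) // load this collection with a separate select instead of a join
private List<Node> children;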
I have a parent object with a version locking policy defined as follows:
VersionLockingPolicy lockingPolicy = new VersionLockingPolicy();
lockingPolicy.setIsCascaded(true);
lockingPolicy.setWriteLockFieldName("CacheId");
descriptor.setOptimisticLockingPolicy(lockingPolicy);
and with a child mapped as follows:
OneToManyMapping childMapping = new OneToManyMapping();
childMapping.setAttributeName("children");
childMapping.setReferenceClass(Child.class);
childMapping.dontUseIndirection();
childMapping.privateOwnedRelationship();
childMapping.useBatchReading();
childMapping.useCollectionClass(ArrayList.class);
childMapping.addTargetForeignKeyFieldName("Child.ParentId", "Parent.Id");
descriptor.addMapping(childMapping);
When I change a field on the child and update the child's cacheId directly in the database, EclipseLink queries do not pick up the change. When I then update the cacheId of the parent object, EclipseLink queries do return the change to the child field.
I thought the cascaded version locking policy was supposed to cause the parent to update when any of its privately owned child objects were updated (as defined by their version fields). Was I wrong about that, or is there likely something wrong somewhere else in my code?
Just use the following on the parent entity class:
@OptimisticLocking(cascade = true)
and mark the @OneToMany with @PrivateOwned.
This works only if you use a version column. Please check:
http://wiki.eclipse.org/Using_EclipseLink_JPA_Extensions_(ELUG)#Using_EclipseLink_JPA_Extensions_for_Optimistic_Locking
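Put together on the annotation side, that would look roughly like this sketch (the version field and mapping names are assumptions):
@Entity
@OptimisticLocking(cascade = true)
public class Parent {

    @Version
    private long version;

    @OneToMany(mappedBy = "parent")
    @PrivateOwned
    private List<Child> children;
}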
I was wrong. There is nothing in the EclipseLink code that will do what I wanted.
I think I will simply add a trigger to the child objects to update the parent cacheId.
Consider this scenario:
- I have loaded a Parent entity through Hibernate.
- The Parent contains a collection of Children which is large and lazily loaded.
- The Hibernate session is closed after this initial load while the user views the Parent data.
- The user may choose to view the contents of the lazy Children collection.
- I now wish to load that collection.
What are the ways / best way of loading this collection?
Assume session-in-view is not an option as the fetching of the Children collection would only happen after the user has viewed the Parent and decided to view the Children.
This is a service which will be accessed remotely by web and desktop-based clients.
Thanks.
The lazy collection can be loaded by using Hibernate.initialize(parent.getCollection()) except that the parent object needs to be attached to an active session.
This solution takes the parent Entity and the name of the lazy-loaded field and returns the Entity with the collection fully loaded.
Unfortunately, as the parent needs to be reattached to the newly opened session, I can't use a reference to the lazy collection as this would reference the detached version of the Entity; hence the fieldName and the reflection. For the same reason, this has to return the attached parent Entity.
So in the OP scenario, this call can be made when the user chooses to view the lazy collection:
Parent parentWithChildren = dao.initialize(parent,"lazyCollectionName");
The Method:
public Entity initialize(Entity detachedParent, String fieldName)
        throws NoSuchFieldException, IllegalAccessException {
    // ...open a hibernate session...

    // reattach the parent to the session
    Entity reattachedParent = (Entity) session.merge(detachedParent);

    // get the field from the entity and initialize it
    Field fieldToInitialize = detachedParent.getClass().getDeclaredField(fieldName);
    fieldToInitialize.setAccessible(true);
    Object objectToInitialize = fieldToInitialize.get(reattachedParent);
    Hibernate.initialize(objectToInitialize);

    return reattachedParent;
}
I'm making some assumptions about what the user is looking at, but it seems like you only want to retrieve the children if the user has already viewed the parent and really wants to see the children.
Why not try opening a new session and fetching the children by their parent? Something along the lines of ...
criteria = session.createCriteria(Child.class);
criteria.add(Restrictions.eq("parent", parent));
List<Child> children = criteria.list();
Hibernate handles collections in a different way than normal fields.
At my work we get around this by initialising the fields we need in the initial load, on a case-by-case basis. For example, in a facade load method that is surrounded by a transaction you might have:
public Parent loadParentWithIntent1(Long parentId)
{
    Parent parent = loadParentFromDAO(parentId);
    for (Child c : parent.getChildren())
    {
        // touching the field forces Hibernate to initialize it while the session is open
        c.getField1();
    }
    return parent;
}
and we have a different facade call for each intent. This essentially achieves what you need, because you'd be loading these specific fields when you need them anyway, and this just puts them in the session at load time.