I'm using ORMLite in my Android app. I need to persist this class, which has a HashMap. What is a good way of persisting it? It's my first time trying to persist a HashMap, and also my first time with ORMLite, so any advice would be greatly appreciated!
*Edit*
If that makes any difference, the Exercise class is simply a String (which also works as the id in the database), and the Set class has an int id (which is also the id in the database), an int weight, and an int reps.
@DatabaseTable
public class Workout {

    @DatabaseField(generatedId = true)
    int id;

    @DatabaseField(canBeNull = false)
    Date created;

    /*
     * The HashMap needs to be persisted somehow.
     */
    HashMap<Exercise, ArrayList<Set>> workoutMap;

    public Workout() {
    }

    public Workout(HashMap<Exercise, ArrayList<Set>> workoutMap, Date created) {
        this.workoutMap = workoutMap;
        this.created = created;
    }

    public void addExercise(Exercise e, ArrayList<Set> setList) {
        workoutMap.put(e, setList);
    }

    ...
}
Wow. Persisting a HashMap whose value is a List of Sets. Impressive.
So in ORMLite you can persist any Serializable field. Here's the documentation about the type and how you have to configure it:
http://ormlite.com/docs/serializable
So your field would look something like:
@DatabaseField(dataType = DataType.SERIALIZABLE)
Map<Exercise, List<Set>> workoutMap;
Please note that if the map is at all large then this will most likely not be very performant. Also, your Exercise class (and the List and Set classes) need to implement Serializable.
If you need to search this map, you might consider storing the values in the Set in another table in which case you might want to take a look at how ORMLite persists "foreign objects".
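For reference, here is a minimal sketch of that alternative layout. The WorkoutEntry class and its field names are my own, not part of the original code; the idea is simply to flatten the map into rows that point back at Workout, Exercise, and Set via ORMLite's @DatabaseField(foreign = true):

// Hedged sketch (not from the original post): one row per (workout, exercise, set)
// so the data can be queried instead of being stored as one opaque serialized blob.
@DatabaseTable
public class WorkoutEntry {

    @DatabaseField(generatedId = true)
    int id;

    // foreign = true stores only the id of the referenced object;
    // foreignAutoRefresh = true makes ORMLite load it when the entry is queried.
    @DatabaseField(foreign = true, foreignAutoRefresh = true, canBeNull = false)
    Workout workout;

    @DatabaseField(foreign = true, foreignAutoRefresh = true, canBeNull = false)
    Exercise exercise;

    @DatabaseField(foreign = true, foreignAutoRefresh = true, canBeNull = false)
    Set set;

    WorkoutEntry() {
        // no-arg constructor required by ORMLite
    }
}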
Here are the relevant pieces of the code I inherited. The object "process" is the old process that is passed to the method. The object "newProcess" is what I am replacing it with, using different fields of the user's choosing.
try
{
    final EntityManager em = getEntityManager();
    em.getTransaction().begin();

    JpaProcessDAO pDao = new JpaProcessDAO(em);
    Process newProcess = pDao.findById(processId);

    newProcess.setName(process.getName());
    newProcess.setDataBaseVersion(process.getDataBaseVersion());
    newProcess.setNotes(process.getNotes());
    newProcess.setReadyForUse(process.getReadyForUse());
    newProcess.setSteps(process.getSteps());

    em.merge(newProcess); // <---- WHERE PROBLEM OCCURS
    em.persist(newProcess);
    em.getTransaction().commit();
}
RESULT: Every field that I change is changed in newProcess EXCEPT "Steps". During the merge step in the code, that list goes back to whatever the steps were in the original object "process".
Now this could be because "Step" is an object itself, not a primitive like all of the other fields I set in "newProcess":
Mapping in Process.java:

@OneToMany(mappedBy = "process")
private List<Step> steps;
// getter, setter
In Step.java there is a collection of fields, some of which are lists of non-primitive objects themselves.
Step.java

public class Step implements Serializable {

    @Id
    @Column(name = "step_id")
    @GeneratedValue(strategy = GenerationType.IDENTITY)
    private int stepId;

    private String duration;
    private String name;
    private String notes;
    private Integer sort;

    @OneToMany(mappedBy = "step", cascade = CascadeType.REMOVE)
    private List<Constituent> constituents;

    @OneToMany(mappedBy = "step")
    private List<Reference> references;

    @ManyToOne
    @JoinColumn(name = "process_id")
    private Process process;

    @OneToMany(mappedBy = "step", cascade = CascadeType.REMOVE)
    private List<StepEquipment> stepEquipments;

    public Step() {
    }

    // getters/setters
}
Does anybody know what this inherited code could possibly be doing wrong?
ADDITIONS TO CODE ON 11/29:
public T findById(final Integer id) throws CPDPersistenceException {
    return findByPrimaryKey(id, templateClass);
}

public T findByPrimaryKey(Object key, Class<T> clazz) {
    T t = getEntityManager().find(clazz, key);
    getEntityManager().merge(t);
    getEntityManager().refresh(t);
    return t; // <-------------- newProcess is returned by this statement.
}
newProcess does not have the steps that were in the original process, nor does it have the ProcessCategories that were in process. The Hibernate logs show that, during the merge and refresh statements, the select only covers process_id, database_version, process_name, process_notes, and process_ready_to_use.
You need to synchronize both sides of the association. In your code you only call newProcess.setSteps(...), but you never set the Process on each Step. From here:
However, we still need to have both sides in sync as otherwise, we break the Domain Model relationship consistency, and the entity state transitions are not guaranteed to work unless both sides are properly synchronized.
So in other words, you would need to do something along the lines of:
newProcess.setSteps(process.getSteps());
process.getSteps().forEach(s -> s.setProcess(newProcess));
As the answer from dyslexit says, you need to set the Process on each Step.
But in addition you need the new Steps persisted and the old ones removed. You can do this manually per Step, but an easier way is to alter your code a bit.
Modify the mapping annotation on the steps collection like this:
@OneToMany(mappedBy = "process", cascade = CascadeType.PERSIST, orphanRemoval = true)
private List<Step> steps;
This tells persist to cascade to the Steps as well, and removes any Steps that become detached from the Process.
Modify the update logic:
// newProcess.setSteps(process.getSteps());
// em.merge(newProcess); <---- WHERE PROBLEM OCCURS
// em.persist(newProcess);
newProcess.getSteps().clear(); // remove old steps
newProcess.getSteps().addAll(process.getSteps()); // add new steps
// You need to set the other side of association also as below
newProcess.getSteps().forEach(s -> s.setProcess(newProcess));
// em.persist(newProcess); // not sure if needed
SO: do not REPLACE the list but instead MODIFY the original list.
ALSO: there might not be a need for any merge/persist operation at all (and calling both in series is certainly not something that should ever be done). But because you use the mysterious JpaProcessDAO, I cannot be sure, so check that.
Also see what those operations are really used for; there is a great explanation here.
I am guessing that the entity manager might handle everything just fine without the persist/merge calls, because I think you already got a managed entity when you called pDao.findById(processId); that is why I have commented it out.
The mappings you have in your Step class are another story; those might also need changes to their persistence and cascade settings.
As a side note, also have a look at this question for how the update might be done more easily with ModelMapper.
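Putting the two answers together, the update block might end up looking roughly like the sketch below. It assumes the cascade/orphanRemoval mapping change above and that findById really does return a managed entity; it is meant to show the shape of the fix, not a tested drop-in replacement:

final EntityManager em = getEntityManager();
em.getTransaction().begin();

JpaProcessDAO pDao = new JpaProcessDAO(em);
Process newProcess = pDao.findById(processId); // presumably already a managed entity

newProcess.setName(process.getName());
newProcess.setDataBaseVersion(process.getDataBaseVersion());
newProcess.setNotes(process.getNotes());
newProcess.setReadyForUse(process.getReadyForUse());

// Modify the managed collection instead of replacing it...
newProcess.getSteps().clear();
newProcess.getSteps().addAll(process.getSteps());
// ...and keep the other side of the association in sync.
newProcess.getSteps().forEach(s -> s.setProcess(newProcess));

em.getTransaction().commit(); // the flush on commit writes the changes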
I'm developing an app with a backend, and I decided to try using Google App Engine for it. Since I'm really new to Google App Engine, I'm a little bit confused by the logic.
Basically, I have a couple of model classes to represent my object types. Let's say one of them is User and another is Item. Users have items, and an item can belong to more than one user. So User X can have 25 items including Item A, and User Y can have 20 totally different items plus Item A as well.
Right now my User class looks like this:
@Entity
public class User {

    @Id private Long id;
    private String name;
    private String emailAddress;
    private String photoURL;

    // All getters and setters...
}
And my Item class is approximately the same. One of my questions is: where should I add some kind of list, like a list of Items, into User? And which annotation should I use? What will that annotation give me as a result (a reference, an id, or a complete object)?
Another question related to this is: in my endpoint class, how can I get the list of Items that a specific User has (or the list of Users that own a specific Item)?
One last, totally unrelated question: should I do anything to make the id auto-increment, or will it be automatic if I don't provide any id while inserting an item?
You can search in the datastore for 2 things: keys and indexed properties.
class Thing {
    @Id Long id;
    @Index String property;
}
At some point you save some entities
Thing thing1 = new Thing();
thing1.property = "yes";
Thing thing2 = new Thing();
thing2.property = "no";
ofy().save().entities(thing1, thing2).now();
Now you can search for all entities based on their indexed properties. E.g. for all things with property == "yes".
List<Thing> things = ofy().load().type(Thing.class).filter("property", "yes").list();
Would return exactly thing1.
The same works with lists of properties. And it works with lists of references/keys to other entities.
class User {
    @Id Long id;
    @Index List<Key<Item>> items;
}

class Item {
    @Id Long id;
}
List<User> searchUsersWithItem(long itemId) {
Key<Item> itemKey = Key.create(Item.class, itemId);
return ofy().load().type(User.class).filter("items", itemKey).list();
}
List<User> searchUsersWithItem(Item item) {
return ofy().load().type(User.class).filter("items", item).list();
}
// just loads all the referenced items in the owner
List<Item> searchItemsWithOwner(User owner) {
return new ArrayList<Item>(ofy().load().<Item>values(owner.items).values());
}
filter works with refs, keys, and entity instances.
To be found, things must be indexed: https://cloud.google.com/datastore/docs/concepts/indexes / https://github.com/objectify/objectify/wiki/Queries
What's left for you to decide is how you model your relation. There are multiple ways. A user that owns a set of items, which can in turn be owned by a set of users, is actually a many-to-many relation. You could represent it like:
class User { List<Key<Item>> items; }
class Item { }
or
class User { }
class Item { List<Key<User>> owners; }
or
class User { List<Key<Item>> items; }
class Item { List<Key<User>> owners; }
or even
class User { }
class Item { }
class Ownership { Key<Item> item; Key<User> user; }
Each approach has its ups and downs with respect to data consistency and searchability / performance. In the initial example it's trivial to search for all items of a user, since all you have to do is load that one user and you have the list of items. The other direction requires the query approach.
So with respect to search performance, you benefit from having the list of owners in the items as well as the list of items in the user, because that way you don't need queries at all. The big downside is data consistency. If you fail to update both user and item at the same time, you can end up with items that believe they are owned by a user while the user thinks otherwise.
The last approach, using an explicit "Ownership" entity, is essentially the traditional pivot / junction table https://en.wikipedia.org/wiki/Many-to-many_%28data_model%29 that results from transforming a many-to-many relation into two one-to-many relations. Using that gives you easy consistency, but the worst query performance.
Parent relations can sometimes be useful, but only if there is an actual one-to-many relation where the parent needs to exist.
Also note that keys are not foreign keys like in traditional SQL databases, as they can exist without a corresponding entity. So you'll have to take care of consistency regardless of what you do.
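For illustration, here is a minimal Objectify sketch of the Ownership approach. The class and method names are my own, not from the original answer; the point is that each ownership is a small entity carrying one indexed key per side, so both directions are answered with a query:

// Hedged sketch of the junction-entity approach (names are illustrative).
@Entity
class Ownership {
    @Id Long id;
    @Index Key<User> user;
    @Index Key<Item> item;
}

List<Key<Item>> itemKeysOfUser(Key<User> userKey) {
    List<Key<Item>> result = new ArrayList<>();
    for (Ownership o : ofy().load().type(Ownership.class).filter("user", userKey).list()) {
        result.add(o.item);
    }
    return result;
}

List<Key<User>> ownersOfItem(Key<Item> itemKey) {
    List<Key<User>> result = new ArrayList<>();
    for (Ownership o : ofy().load().type(Ownership.class).filter("item", itemKey).list()) {
        result.add(o.user);
    }
    return result;
}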
Assume I have a model like the following:
class Chest {
public Id id;
public List<Drawer> drawers;
public Price price;
}
class Drawer {
public Id id;
public Price price;
}
And a jOOQ query to fetch a Chest object with its Drawers:
dsl.selectFrom(CHEST.join(DRAWERS).onKey()).where(CHEST.ID.eq(1)).fetch()
What is the best way to construct the Chest object from the result of the query above?
Thanks.
In general, using JOIN to materialise object graphs won't really work well, as you're denormalising your database entities into a table (with duplicates) before you try to normalise the data again in a mapping algorithm. JPA hides these things from you by offering an alternative query language that doesn't expose so many SQL features.
In your particular case, however, you can get this to run via the jOOQ API by using the Result.intoGroups() methods. Thus:
Map<Record, Result<Record>> result =
dsl.selectFrom(...).fetch().intoGroups(CHEST.fields());
List<Chest> list = new ArrayList<>();
for (Entry<Record, Result<Record>> entry : result.entrySet()) {
Record chest = entry.getKey();
Result<Record> drawers = entry.getValue();
list.add(new Chest(
chest.into(Id.class), // These into(Class<?>) methods assume that you
drawers.into(Drawer.class) // want to use jOOQ's DefaultRecordMapper
));
}
The above algorithm is probably incomplete, or not exactly what you need. But it'll give you a general idea of what's possible out-of-the-box via jOOQ API.
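For the loop above to compile, Chest would also need a constructor matching the two mapped pieces; something along these lines is assumed (it is not shown in the original question):

// Assumed constructor for the mapping sketch above (not part of the original model).
class Chest {
    public Id id;
    public List<Drawer> drawers;
    public Price price;

    Chest(Id id, List<Drawer> drawers) {
        this.id = id;
        this.drawers = drawers;
    }
}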
I am working on an Android app that loads a list of students to display in a list-based activity. There are two components to the app: a server, which responds via XML with the list of currently active students, and a database on the app end, which stores these students with some details (name, age, etc.). I would like a way to sync these two data sources. When the app starts, I would like to check against the XML to see if students on the server were added/deleted and update the db accordingly.
I would be parsing the XML list into a student object at login. Is there any way to store/retrieve an entire object into an Android-supported db so I can do a direct comparison to see what to update/delete? It would end up being something like
if (serverStudent[0].name.equals(dbStudent[0].name))
    // overwrite dbStudent object with serverStudent fields
What is the most efficient/lightweight way to achieve object persistence and then comparison in Android?
Here's a method I have used in the past:
Anytime an object in the database is changed, use a timestamp column to store that time. When the app connects on startup, simply check each timestamp in the app db against the timestamp in the server db for each object. If the timestamps match, do nothing. If the timestamps don't match, retrieve the updated record from the server. Make sure you're using a detailed enough timestamp (usually down to milli- or microseconds).
The nice thing about timestamps is that if you don't want the server data to override the app data, you could look at which is newer and keep that object if they've both been edited. Just adding some additional thoughts!
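A minimal sketch of that comparison, assuming both sides expose a lastModified epoch-millisecond value on a hypothetical Student class (none of these names are from the original question):

// Hedged sketch: decide per student whether the local copy needs refreshing.
// "Student" and "lastModified" are assumed names, not from the original post.
boolean needsUpdate(Student serverStudent, Student dbStudent) {
    // Timestamps stored as epoch milliseconds; equal means nothing changed.
    if (serverStudent.lastModified == dbStudent.lastModified) {
        return false;
    }
    // Only take the server copy if it is actually newer,
    // so local edits are not silently overwritten.
    return serverStudent.lastModified > dbStudent.lastModified;
}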
You can do something like this -
public class StudentRecord {

    Vector<StudentData> studentDatas;

    public StudentRecord() {
        studentDatas = new Vector<StudentData>();
    }

    public Vector<StudentData> getRecords() {
        return studentDatas;
    }

    public void setRecords(Vector<StudentData> records) {
        this.studentDatas = records;
    }

    public static class StudentData {

        String name, Rollno;

        public String getRollno() {
            return Rollno;
        }

        public void setRollno(String rollno) {
            Rollno = rollno;
        }

        public String getName() {
            return name;
        }

        public void setName(String name) {
            this.name = name;
        }
    }
}
When you get the vector object studentDatas, you can do something like this:

for (StudentData data : record.getRecords()) {
    data.getRollno();
    data.getName();
}
Check out these libraries:
http://www.datadroidlib.com/
https://github.com/octo-online/robospice
I believe both offer solutions for your situation.
Or you can roll your own solution... Basically you will want to create a service or AsyncTask to do the syncing. In your student object you can create a constructor that you can pass an id to and have it pull the appropriate record from your local db, then add a comparison method that will update if newer information is available.
I'm not sure I understood your question correctly, but as far as I understand I would do something like this:
On the server side, send a JSON array which holds JSON student objects.
On the Android side, create a similar Student class and override the equals method as you want (a sketch follows below).
Then, for each student, check with the equals method whether they are equal or not and take action accordingly.
If you want faster searching in the student object array, then use a hash map instead of arrays.
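A minimal sketch of that equals override, assuming a Student class keyed by a roll number field (the field names are illustrative, not from the original question):

// Hedged sketch: equality based on the fields you actually want to compare.
// Assumes non-null fields for brevity.
public class Student {
    String rollNo;
    String name;
    int age;

    @Override
    public boolean equals(Object o) {
        if (this == o) return true;
        if (!(o instanceof Student)) return false;
        Student other = (Student) o;
        return rollNo.equals(other.rollNo)
                && name.equals(other.name)
                && age == other.age;
    }

    @Override
    public int hashCode() {
        // Keep hashCode consistent with equals, especially if you
        // later put Students into a HashMap as suggested above.
        return rollNo.hashCode();
    }
}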
I'm trying to merge these three objects into a single complex object:
public class Person {
private String name;
private List<Event> events;
// getters and setters
}
public class Event {
private String name;
private List<Gift> gifts;
// getters and setters
}
public class Gift {
private String name;
private String recipient;// the name of the person
private String eventName;
// getters and setters
}
My goal is to save the Person object in MongoDB using Morphia, and this is how I want my document laid out. I've created a document builder, of sorts, that combines lists of each object. Each Person gets a list of all Events, but should only receive their specific Gifts. While my document builder does create a document that Morphia can persist, only the Gifts of the last recipient (in sort order) are inserted into the Events for all Persons, though they do land under the correct Events.
public void merge() {
    for (Person person : listOfPersons) {
        for (Event event : listOfEvents) {
            // somePersonsGifts: a sublist of gifts based on Event and Person.
            List<Gift> somePersonsGifts = new ArrayList<Gift>();
            for (Gift gift : listOfGifts) {
                if (person.getName().equals(gift.getRecipient()) && gift.getEventName().equals(event.getName())) {
                    somePersonsGifts.add(gift);
                }
            }
            event.setGifts(somePersonsGifts);
        }
        person.setEvents(listOfEvents);
    }
}
If I modify the code slightly to process one person at a time by removing the outer loop and having the method take an argument for a specific index of the Persons list:
public void merge(int p) {
Person person = listOfPersons.get(p);
//...and so on
I get one complete Person object with the correct gifts. If I try to feed this modified version into a loop, the problem comes back. I've tried using regular for-loops and synchronized collections. I've tried using Google Guava's ImmutableArrayList and still no luck. I know the problem is that I'm changing the lists while accessing them, but I can't find any way around it. I wrote a DAO that uses the MongoDB driver directly and it works properly, but it's a lot more code and quite ugly. I really want this approach to work; the answer is in front of me but I just can't see it. Any help would be greatly appreciated.
Here is your problem:
List<Gift> somePersonsGifts = new ArrayList<Gift>();
....
event.setGifts(somePersonsGifts);
You add the gifts only for one person; if you want to aggregate all the gifts into the event, re-use the existing list.
I don't know anything about MongoDB or Morphia but I suspect the problem is your use of the setters event.setGifts(somePersonsGifts) and person.setEvents(events). Your code does not seem to merge the existing gift and event lists with the ones you are calculating further in the loop, which is how you would want it to behave (if I understand the question correctly).
You should retrieve the already existing gift list (and the event list too) instead of overwriting them with empty new ones.
I don't know if the method merge() is inside the class that holds the lists, but I assume so, since you are using the list events here:
person.setEvents(events);
Maybe you meant
person.setEvents(listOfEvents);
Notice that you are adding all the events to each person. If all the persons went to all the events, it is unnecessary to have the events inside the person.
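If the goal really is for every Person to carry only their own gifts under each Event, one way around the shared-Event-instance problem is sketched below. This is not from either answer; it assumes Event has a usable no-arg constructor and the setters shown in the question:

// Hedged sketch: give every person their own Event instances so that
// setGifts(...) for one person cannot overwrite another person's gifts.
public void merge() {
    for (Person person : listOfPersons) {
        List<Event> personEvents = new ArrayList<Event>();
        for (Event event : listOfEvents) {
            Event personEvent = new Event();          // fresh copy per person
            personEvent.setName(event.getName());
            List<Gift> somePersonsGifts = new ArrayList<Gift>();
            for (Gift gift : listOfGifts) {
                if (person.getName().equals(gift.getRecipient())
                        && gift.getEventName().equals(event.getName())) {
                    somePersonsGifts.add(gift);
                }
            }
            personEvent.setGifts(somePersonsGifts);
            personEvents.add(personEvent);
        }
        person.setEvents(personEvents);
    }
}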