I am using the Google App Engine datastore to store 4 String values. The String values are added to the datastore in a servlet:
DatastoreService datastore = DatastoreServiceFactory.getDatastoreService();
Entity balances;
Key primaryKey;
String table = "MainTable";
String name = "Values";
primaryKey = KeyFactory.createKey(table, name);
Transaction t = datastore.beginTransaction();
// If the 'table' exists - delete it
datastore.delete(primaryKey);
// Really make sure it's deleted.
t.commit();
t = datastore.beginTransaction();
balances = new Entity("Balances", primaryKey);
updateBalances(balances);
datastore.put(balances);
// Save the new data
t.commit();
resp.sendRedirect("/balance.jsp");
I want to be able to update the four String values each time the servlet is run - which is why I look for the key first and delete it. I even use a separate transaction to ensure this really happens.
The key is found and deleted, and then the values are added. But when I load a .jsp file which retrieves the values, the number of 'records' for the entity grows by 1 each time. I do not understand why the record is not being deleted.
Here is the .jsp code:
<%
DatastoreService datastore = DatastoreServiceFactory.getDatastoreService();
Key guestbookKey = KeyFactory.createKey("MainTable", "Values");
Query query = new Query("Balances", guestbookKey);
List<Entity> greetings = datastore.prepare(query).asList(FetchOptions.Builder.withLimit(5));
%>
<!-- This should always be 1, but it grows each time the servlet is hit. -->
<%= greetings.size() %>
SOLUTION
I don't know what the problem was with the code in the original question. However, I achieved my objective of persisting String values across sessions on Google App Engine (GAE) by using a library called Objectify (http://code.google.com/p/objectify-appengine/), which aims to simplify the use of the Datastore on GAE.
The library itself is just a .jar file and can easily be added to a Java project in Eclipse. I did not find the library all that easy to use, though... the main problem is registering the class which models the data you wish to save. Registration can only be done once!
To register the class only once I added a listener to my web app which registered the class with the Objectify framework and also created 4 random numbers and saved them:
public class MyListener implements ServletContextListener {

    public void contextInitialized(ServletContextEvent event) {
        // Register the Account class, only once!
        ObjectifyService.register(Account.class);
        Objectify ofy = ObjectifyService.begin();
        Account balances = null;
        // Create the values we wish to persist.
        balances = new Account(randomNum(), randomNum(), randomNum(), randomNum());
        // Actually save the values.
        ofy.put(balances);
        assert balances.id != null; // id was autogenerated
    }

    public void contextDestroyed(ServletContextEvent event) {
        // App Engine does not currently invoke this method.
    }

    private String randomNum() {
        // Returns random number as a String
    }
}
This code is run only once, when the server starts. For this to happen I also needed to modify web.xml to add:
<listener>
<listener-class>.MyListener</listener-class>
</listener>
Then I just had a .jsp page which read the saved values:
<%
Objectify ofy = ObjectifyService.begin();
boolean data = false;
// The value "mykey" was hard-coded into my Account class
// since I only wanted access to the same data every time.
Account a = ofy.get(Account.class, "mykey");
data = (null!=a);
%>
Here is my Account class:
import javax.persistence.*;

public class Account
{
    @Id String id = "mykey";

    public String balance1, balance2, balance3, balance4;

    private Account() {}

    public Account(String balance1, String balance2, String balance3, String balance4)
    {
        this.balance1 = balance1;
        this.balance2 = balance2;
        this.balance3 = balance3;
        this.balance4 = balance4;
    }
}
One last thing... I found the Objectify documentation very helpful in understanding the GAE Datastore, irrespective of the Objectify framework.
For future reference, I think your original example failed because of this line:
balances = new Entity("Balances", primaryKey);
This doesn't actually create an entity whose key is primaryKey; it creates an entity with primaryKey as the ancestor (parent) key. The entity will get an automatically generated id every time you store it.
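For reference, here is a minimal sketch of a fix, assuming the low-level Datastore API and the kind/name from the question: build the complete key yourself and construct the entity with that key, so every put() overwrites the same single record and the delete-first dance becomes unnecessary.
import com.google.appengine.api.datastore.DatastoreService;
import com.google.appengine.api.datastore.DatastoreServiceFactory;
import com.google.appengine.api.datastore.Entity;
import com.google.appengine.api.datastore.Key;
import com.google.appengine.api.datastore.KeyFactory;

DatastoreService datastore = DatastoreServiceFactory.getDatastoreService();
// A complete key: kind "Balances", name "Values".
Key key = KeyFactory.createKey("Balances", "Values");
// Construct the entity WITH that key (not with it as an ancestor).
Entity balances = new Entity(key);
updateBalances(balances); // helper from the question
datastore.put(balances);  // overwrites the same record on every run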
Related
I have encountered a curious bug or feature while writing code. Here's the situation:
We are using a PostgreSQL database, EclipseLink in a JavaEE project.
What I am doing in this part of the project is fetching an entity from the database i.e.:
User user = userController.get(userId);
Which then goes to our controller and fetches the user via a TypedQuery:
@Stateless
@LocalBean
public class UserController {

    private EntityManager em;

    public User get(Integer userId) {
        User retval = null;
        TypedQuery<User> q = em.createNamedQuery("User.findByUserId", User.class);
        q.setParameter("userId", userId);
        retval = q.getSingleResult();
        return retval;
    }

    public User update(final User modified) {...}
}
And in my User class I have:
@NamedQuery(name = "User.findByUserId", query = "SELECT u FROM User u WHERE u.id = :userId"),
So the call goes, I get my user object with its respective data from the database.
In the class where I called the userController.get method I continue to modify the data on this object, and call our controller again to update this data on the database
user.setServiceId(1); //any id (integer) pointing to an existing service, this is a ManyToOne relationship
userController.update(user);
And here is where it gets funny. In our update method inside the controller class I have my modified User object and using this object I get the primary key userId and fetch the data again from the database to get the original:
@Stateless
@LocalBean
public class UserController {

    private EntityManager em;

    public User get(Integer userId) {...}

    public User update(final User modified) {
        User retval = null;
        if (modified != null) {
            try {
                User original = get(modified.getId()); // Here I fetch the current state of the DB
                if (original != null) {
                    Set<Modifications> modifications = apply(original, modified); // Method to apply modifications
                    retval = em.merge(original); // Merge changes into database
                    em.flush(); // Force data to be persisted
                }
            } catch (Exception e) {
            }
        }
        return retval;
    }
}
However, the fields in the original object do not reflect the state of the database but instead contain the same data as the modified object. In this case, the serviceId in the database is null, and in the modified object I set it to an ID. The original has its serviceId set to the same value as the modified object, even though it should contain the data fetched from the database, in this case null.
My current solution is to construct a new User object, after fetching the user from the database, and modify the data on that new object:
User user = userController.get(userId);
User newUser = new User(user);
newUser.setService(service);
userController.update(newUser);
Now when I do the update method, the original reflects the state of the database.
Or maybe it reflects the state of the user object that already exists in the persistence context?
But why does this happen, given that I make a new get call with a SELECT statement to the database in my update method?
You are using the same EntityManager for everything, both the read and the 'merge', which in this case is then a no-op. Everything read in through an EM is managed, so that if you read it back again, you get the same instance back. As long as the User isn't being serialized, it is 'managed' by the EntityManager it was read from, and so that same instance, and its changes, are visible on any get calls on that ID.
You didn't show how you are getting EntityManagers, but I would guess it isn't container managed, as the container would inject a new one for these calls and close it for you when done. You haven't shown any transaction logic for how the update and the EM context it uses are hooked up, but I would suggest you create a new EntityManager for these calls. The flush also seems unnecessary: if update is wrapped in a transaction, it should handle flushing the update statement to the database without this extra call.
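If you want to keep a single EntityManager, one option (just a sketch, not necessarily how your beans are wired) is to detach the fetched entity before modifying it, so that the later get() inside update() is answered from the database rather than from the same managed instance:
// Fetch the user, then detach it from the persistence context before editing it,
// so later reads through the same EntityManager load a fresh copy from the database.
User user = userController.get(userId);
em.detach(user); // or expose a method on the controller that detaches/copies the entity

user.setServiceId(1);          // only affects the detached copy
userController.update(user);   // update()'s get() now reflects the database state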
If user.setServiceId(1); is called when the "user" entity is managed, the call is going to update the database row.
You can check the managed entity lifecycle for details.
You need to refresh the data after saving it to the database to get the latest state of the object, e.g. with em.refresh(retval).
You can find the code added below.
@Stateless
@LocalBean
public class UserController {

    private EntityManager em;

    public User get(Integer userId) {...}

    public User update(final User modified) {
        User retval = null;
        if (modified != null) {
            try {
                User original = get(modified.getId()); // Here I fetch the current state of the DB
                if (original != null) {
                    Set<Modifications> modifications = apply(original, modified); // Method to apply modifications
                    retval = em.merge(original); // Merge changes into database
                    em.flush(); // Force data to be persisted
                    em.refresh(retval); // This will fetch the updated data from the database
                }
            } catch (Exception e) {
            }
        }
        return retval;
    }
}
I'm using Spring Data MongoDB and Spring Data Rest to create a REST API which allows GET, POST, PUT and DELETE operations on my MongoDB database and it's all working fine except for the update operations (PUT). It only works if I send the full object in the request body.
For example I have the following entity:
@Document
public class User {

    @Id
    private String id;
    private String email;
    private String lastName;
    private String firstName;
    private String password;
    ...
}
To update the lastName field, I have to send the whole user object, including the password! Which is obviously very wrong.
If I only send the field to update, all the others are set to null in my database. I even tried to add a @NotNull constraint on those fields, and now the update won't even happen unless I send all of the user object's fields.
I tried searching for a solution here but only found the following post, which has no solution: How to update particular field in mongo db by using MongoRepository Interface?
Is there a way to implement this?
Spring Data Rest uses Spring Data repositories to automatically retrieve and manipulate persistent data using Rest calls (check out https://docs.spring.io/spring-data/rest/docs/current/reference/html/#reference).
When using Spring Data MongoDB, you have the MongoOperations interface which is used as a repository for your Rest endpoints.
However, MongoOperations currently does not support updates of specific fields!
PS: It would be awesome if they added this feature, like @DynamicUpdate in Spring Data JPA.
But this doesn't mean it can't be done; here's the workaround I used when I had this issue.
Firstly let me explain what we're going to do:
We will create a controller which will override all the PUT operations so that we can implement our own update method.
Inside that update method, we will use MongoTemplate, which does have the ability to update specific fields.
N.B. We don't want to redo these steps for each model in our application, so we will retrieve the model to update dynamically. In order to do that, we will create a utility class. [This is optional]
Let's start by adding the org.reflections API to our project dependencies, which allows us to get all the classes that have a specific annotation (@Document in our case):
<dependency>
    <groupId>org.reflections</groupId>
    <artifactId>reflections</artifactId>
    <version>0.9.12</version>
</dependency>
Then create a new class called UpdateUtility, add the following methods, and replace the MODEL_PACKAGE attribute with your own package containing your entities:
public class UpdateUtility {

    private static final String MODEL_PACKAGE = "com.mycompany.myproject.models";
    private static boolean initialized = false;
    private static HashMap<String, Class> classContext = new HashMap<>();

    private static void init() {
        if (!initialized) {
            Reflections reflections = new Reflections(MODEL_PACKAGE);
            // Get all the classes annotated with @Document in the specified package
            Set<Class<?>> classes = reflections.getTypesAnnotatedWith(Document.class);
            for (Class<?> model : classes) {
                classContext.put(model.getSimpleName().toLowerCase(), model);
            }
            initialized = true;
        }
    }

    public static Class getClassFromType(String type) throws Exception {
        init();
        if (classContext.containsKey(type)) {
            return classContext.get(type);
        } else {
            throw new Exception("Type " + type + " does not exist!");
        }
    }
}
Using this utility class we can retrieve the model class to update from its type.
E.g.: UpdateUtility.getClassFromType("user") will return User.class.
Now let's create our controller:
@RestController
public class UpdateController {

    @Autowired
    private MongoTemplate mongoTemplate;

    @PutMapping("/{type}/{id}")
    public Object update(@RequestBody HashMap<String, Object> fields,
                         @PathVariable(name = "type") String type,
                         @PathVariable(name = "id") String id) {
        try {
            Class classType = UpdateUtility.getClassFromType(type); // Get the domain class from the type in the request
            Query query = new Query(Criteria.where("id").is(id)); // Update the document with the given ID
            Update update = new Update();

            // Iterate over the sent fields and add them to the update object
            Iterator iterator = fields.entrySet().iterator();
            while (iterator.hasNext()) {
                HashMap.Entry entry = (HashMap.Entry) iterator.next();
                String key = (String) entry.getKey();
                Object value = entry.getValue();
                update.set(key, value);
            }

            mongoTemplate.updateFirst(query, update, classType); // Do the update
            return mongoTemplate.findById(id, classType); // Return the updated document
        } catch (Exception e) {
            // Handle your exception
            return null;
        }
    }
}
Now we're able to update the specified fields without changing the calls.
So in your case, the call would be:
PUT http://MY-DOMAIN/user/MY-USER-ID { lastName: "My new last name" }
PS: You can improve it by adding the possibility to update specific fields in nested objects...
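For what it's worth, here is one possible way to handle nested objects (just a sketch; it simply relies on MongoDB's dot notation, so a body like { "address": { "city": "Paris" } } becomes an update on "address.city"):
// Hypothetical helper: recursively flatten nested maps from the request body
// into dot-notation keys before adding them to the Update object.
@SuppressWarnings("unchecked")
private void addFields(String prefix, Map<String, Object> fields, Update update) {
    for (Map.Entry<String, Object> entry : fields.entrySet()) {
        String key = prefix.isEmpty() ? entry.getKey() : prefix + "." + entry.getKey();
        Object value = entry.getValue();
        if (value instanceof Map) {
            addFields(key, (Map<String, Object>) value, update);
        } else {
            update.set(key, value);
        }
    }
}
In the controller above you would then call addFields("", fields, update) instead of the while loop.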
I have implemented my project using Spring-Data-Rest. I am trying to update an existing record in a table, but when I send only a few fields instead of all the fields (present in the Entity class) in my request, Spring-Data-Rest thinks I am sending null/empty values. When I then look at the database, the fields which I did not send in my request have been overridden with null/empty values. So my understanding is that even though I am not sending these values, Spring Data Rest sees them in the Entity class and sends them as null/empty. My question here is: is there a way, when doing an UPDATE, to ignore the fields that I am not sending in the request? I appreciate any help.
Update: I was using the PUT method. After reading the comments, I changed it to PATCH and it's working perfectly now. I appreciate all the help.
Before the update, load the object from the database using the JPA method findById and call the returned object target.
Then copy all the fields that are not null/empty from the object you want to update onto target, and finally save the target object.
This is code example:
public void update(Object objectWantToUpdate) {
    Object target = repository.findById(objectWantToUpdate.getId());
    copyNonNullProperties(objectWantToUpdate, target);
    repository.save(target);
}

public void copyNonNullProperties(Object source, Object target) {
    BeanUtils.copyProperties(source, target, getNullPropertyNames(source));
}

public String[] getNullPropertyNames(Object source) {
    final BeanWrapper src = new BeanWrapperImpl(source);
    PropertyDescriptor[] propDesList = src.getPropertyDescriptors();
    Set<String> emptyNames = new HashSet<String>();
    for (PropertyDescriptor propDesc : propDesList) {
        Object srcValue = src.getPropertyValue(propDesc.getName());
        if (srcValue == null) {
            emptyNames.add(propDesc.getName());
        }
    }
    String[] result = new String[emptyNames.size()];
    return emptyNames.toArray(result);
}
You can write custom update query which updates only particular fields:
@Override
public void saveManager(Manager manager) {
    Query query = sessionFactory.getCurrentSession().createQuery(
            "update Manager set username = :username, password = :password where id = :id");
    query.setParameter("username", manager.getUsername());
    query.setParameter("password", manager.getPassword());
    query.setParameter("id", manager.getId());
    query.executeUpdate();
}
As some of the comments pointed out, using PATCH instead of PUT resolved the issue. I appreciate all the inputs. The following is from the Spring Data REST documentation:
"The PUT method replaces the state of the target resource with the supplied request body.
The PATCH method is similar to the PUT method but partially updates the resources state."
https://docs.spring.io/spring-data/rest/docs/current/reference/html/#customizing-sdr.hiding-repository-crud-methods
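For example, assuming the User entity above is exposed through a repository at /users (the exact path depends on your configuration), a partial update would look like:
PATCH http://MY-DOMAIN/users/MY-USER-ID { "lastName": "My new last name" }
Only lastName is touched; the other fields keep their current values.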
Also, I like @Tran Quoc Vu's answer but am not implementing it for now since I don't have to use a custom controller. If there is some logic (e.g. validation) involved when updating the entity, I am in favor of using the custom controller.
OK, so I have an interesting problem. I am using java/maven/spring-boot/cassandra... and I am trying to create a dynamic instantiation of the Mapper setup they use.
I.E.
//Users.java
import com.datastax.driver.mapping.annotations.Table;
@Table(keyspace = "mykeyspace", name = "users")
public class Users {
    @PartitionKey
    public UUID id;
    //...
}
Now, in order to use this I would have to explicitly say ...
Users user = (DB).mapper(Users.class);
obviously replacing (DB) with my db class.
Which is a great model, but I am running into the problem of code repetition. My Cassandra database has 2 keyspaces, and both keyspaces have the exact same tables with the exact same columns (this is not my choice, it is an absolute must-have according to my company). So when I need to access one or the other based on a form submission, it becomes a mess of duplicated code, for example:
//myWebController.java
import ...;
@RestController
public class MyRestController {

    @RequestMapping(value = "/orders", method = RequestMethod.POST)
    public String getOrders(...) {
        if (Objects.equals(client, "first_client_name")) {
            //do all the things to get first keyspace objects like....
            FirstClientUsers users = (db).Mapper(FirstClientUsers.class);
            //...
        } else if (Objects.equals(client, "second_client_name")) {
            SecondClientUsers users = (db).Mapper(SecondClientUsers.class);
            //....
        }
        return "";
    }
}
I have been trying to use methods like...
Class cls = Class.forName(STRING_INPUT_VARIABLE_HERE);
and that works ok for base classes but when trying to use the Accessor stuff it no longer works because Accessors have to be interfaces, so when you do Class cls, it is no longer an interface.
I am trying to find another solution for how to make this work dynamically so that I don't have to duplicate code for every possible client. Each client will have its own keyspace in Cassandra, with the exact same tables as all the others.
I cannot change the database model, this is a must according to the company.
With PHP this is extremely simple since it doesn't care about typecasting as much, I can easily do...
function getData($name) {
$className = $name . 'Accessor';
$class = new $className();
}
and poof I have a dynamic class, but the problem I am running into is the Type specification where I have to explicitly say...
FirstClientUsers users = new FirstClientUsers();
//or even
FirstClientUsers users = Class.forName("FirstClientUsers");
I hope this is making sense; I can't imagine that I am the first person to have this problem, but I can't find any solutions online. So I am really hoping that someone knows how I can get this accomplished without duplicating the exact same logic for every single keyspace we have. It makes the code unmaintainable and unnecessarily long.
Thank you in advance for any help you can offer.
Do not specify the keyspace in your model classes, and instead, use the so-called "session per keyspace" pattern.
Your model class would look like this (note that the keyspace is left undefined):
@Table(name = "users")
public class Users {
    @PartitionKey
    public UUID id;
    //...
}
Your initialization code would have something like this:
Map<String, Mapper<Users>> mappers = new ConcurrentHashMap<String, Mapper<Users>>();
Cluster cluster = ...;
Session firstClientSession = cluster.connect("keyspace_first_client");
Session secondClientSession = cluster.connect("keyspace_second_client");
MappingManager firstClientManager = new MappingManager(firstClientSession);
MappingManager secondClientManager = new MappingManager(secondClientSession);
mappers.put("first_client", firstClientManager.mapper(Users.class));
mappers.put("second_client", secondClientManager.mapper(Users.class));
// etc. for all clients
You would then store the mappers object and make it available through dependency injection to other components in your application.
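For example, with Spring the map could be exposed as a bean (a sketch only; the contact point is a placeholder and the keyspace names are taken from the snippet above; depending on your setup you may need to inject this bean by name, e.g. with @Resource, rather than by type):
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

import com.datastax.driver.core.Cluster;
import com.datastax.driver.mapping.Mapper;
import com.datastax.driver.mapping.MappingManager;

@Configuration
public class MapperConfig {

    @Bean
    public Map<String, Mapper<Users>> mappers() {
        Cluster cluster = Cluster.builder().addContactPoint("127.0.0.1").build(); // placeholder contact point
        Map<String, Mapper<Users>> mappers = new ConcurrentHashMap<>();
        mappers.put("first_client",
                new MappingManager(cluster.connect("keyspace_first_client")).mapper(Users.class));
        mappers.put("second_client",
                new MappingManager(cluster.connect("keyspace_second_client")).mapper(Users.class));
        return mappers;
    }
}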
Finally, your REST service would look like this:
import ...

@RestController
public class MyRestController {

    @javax.inject.Inject
    private Map<String, Mapper<Users>> mappers;

    @RequestMapping(value = "/orders", method = RequestMethod.POST)
    public String getOrders(...) {
        Mapper<Users> usersMapper = getUsersMapperForClient(client);
        // process the request with the right client's mapper
    }

    private Mapper<Users> getUsersMapperForClient(String client) {
        if (mappers.containsKey(client))
            return mappers.get(client);
        throw new RuntimeException("Unknown client: " + client);
    }
}
Note how the mappers object is injected.
Small nit: I would name your class User in the singular instead of Users (in the plural).
I have an issue I've been struggling with for some time now. I'm trying to implement a news feed feature in my app using GAE Cloud Endpoints and Java. The common concept is of followers and followees, where an action of a followee can be seen by his followers. A new follower should also see his followees' past actions, not only those from the time he started following.
I made a few attempts with the following components. Each attempt worked great but was lacking something:
On each user action I added a 'log' entity to the datastore with the user id included. When a user was displaying his news feed I just queried for all those entities by their user ids according to the user's followee list. Everything was fine until I realized that an 'IN' query cannot be cursored. So this option was gone.
On this attempt, which is also the current state of the application, I'm using the Search API. Upon every user action I'm no longer storing a 'log' entity in the datastore but a document in a search index. Complex queries can be cursored here and the world is smiling again. But... I'm not too sure that, billing-wise, this is a smart decision. It seems that the costs of searching/adding/deleting documents, alongside the documented daily limitations, make the whole thing a bit too sketchy.
The next attempt should be the Prospective Search API. From what I'm reading in the documents it seems the right component to pick for this purpose. Unfortunately, the documentation is really poor and gives very few examples. Also, the billing information is unclear.
So I'm asking for the advice of the Stack Overflow community. Can you please advise me on this matter? And if Prospective Search is the right option to choose, can you please provide some clear sample Java code that uses Cloud Endpoints?
EDIT: Just to emphasize the main design requirement here - the news feed feature needs to be able to fetch sorted followee actions using a cursor (in order to avoid querying the whole batch).
Use a pull-aggregate-per-follower model: periodically (or on demand) query all followees' actions once and then cache them inside a dedicated per-follower entity. Remember the time of the last query, so next time you only query from that point on (assuming actions cannot be added or changed retroactively).
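A minimal sketch of the incremental query plus cache step, assuming the low-level Datastore API and made-up kind/property names ("Action", "FeedCache", "date"), and assuming each action is stored with its author's key as ancestor:
import java.util.ArrayList;
import java.util.Date;
import java.util.List;

import com.google.appengine.api.datastore.DatastoreService;
import com.google.appengine.api.datastore.DatastoreServiceFactory;
import com.google.appengine.api.datastore.Entity;
import com.google.appengine.api.datastore.FetchOptions;
import com.google.appengine.api.datastore.Key;
import com.google.appengine.api.datastore.Query;
import com.google.appengine.api.datastore.Query.FilterOperator;
import com.google.appengine.api.datastore.Query.FilterPredicate;

// Pull new actions for one follower and cache them in a per-follower entity.
void refreshFeed(String followerId, List<Key> followeeKeys, Date lastSync) {
    DatastoreService ds = DatastoreServiceFactory.getDatastoreService();
    List<Key> newActionKeys = new ArrayList<>();

    // One "new-only" query per followee; could be parallelized with async queries/threads.
    for (Key followee : followeeKeys) {
        Query q = new Query("Action")
                .setAncestor(followee)
                .setFilter(new FilterPredicate("date", FilterOperator.GREATER_THAN, lastSync));
        for (Entity action : ds.prepare(q).asList(FetchOptions.Builder.withLimit(200))) {
            newActionKeys.add(action.getKey());
        }
    }

    // One cache entity per follower; a real implementation would merge with previously
    // cached keys and mind the 1MB entity size limit mentioned below.
    Entity cache = new Entity("FeedCache", followerId);
    cache.setUnindexedProperty("actionKeys", newActionKeys);
    cache.setProperty("lastSync", new Date());
    ds.put(cache);
}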
This will give you the following features (and limitations):
If the query is on-demand, then you will not need to query for users that are inactive.
Since the query is "new-only" (looks for new actions only), it would cost you nothing if it returned zero results.
You will only query each followee's actions once per follower. After that, all recent actions will be cached inside one entity and loaded into memory with a single get. This should be a substantial cost and time saving.
You could sort/filter actions in memory any way you wish.
Limitations:
Entities have a 1MB limit, so there is a maximum number of actions you can cache in one entity. You will either need to limit how many recent actions are cached per user or spread the action cache over multiple entities.
You will need to use an IN query over the followees (max 30) and also use parallel threads to achieve decent performance. This could easily take 3-5 seconds when querying over 1000-2000 followees. Also, you could easily hit the RPC limit (aka max concurrent API calls) per instance when serving multiple users at the same time.
I hope I understand the question correctly - you want to implement a news feed in your application and allow users to follow each other, and new followers need to be able to see the followed users' actions. I am sure there are multiple other ways of solving this problem, but I will attempt to help you out by providing a solution that makes use of Java JDO to access the datastore.
I would first design the entity relationships in JDO as follows:
1 User to many actions.
1 User to many followers (User).
1 User to many following (User).
Here are simple JDO classes:
User Class:
@PersistenceCapable(identityType = IdentityType.APPLICATION)
public class User {

    @PrimaryKey
    @Persistent(valueStrategy = IdGeneratorStrategy.IDENTITY)
    private Key key;

    @Persistent
    private String userId; // Google unique user ID, could also store user email.

    @Persistent
    private Set<Key> actions;

    @Persistent
    private Set<Key> followers;

    @Persistent
    private Set<Key> following;

    public User(Key key, String userId) {
        this.key = key;
        this.userId = userId;
        this.actions = new HashSet<Key>();
        this.followers = new HashSet<Key>();
        this.following = new HashSet<Key>();
    }

    public Key getKey() {
        return this.key;
    }

    public void addAction(Key actionKey) {
        this.actions.add(actionKey);
    }

    public void addActions(Set<Key> actionKeys) {
        this.actions.addAll(actionKeys);
    }

    public Set<Key> getActions() {
        return this.actions;
    }

    public void addFollower(Key followerKey) {
        this.followers.add(followerKey);
    }

    public void addFollowers(Set<Key> followerKeys) {
        this.followers.addAll(followerKeys);
    }

    public Set<Key> getFollowers() {
        return this.followers;
    }

    public void addFollowing(Key followingKey) {
        this.following.add(followingKey);
    }

    public void addAllFollowing(Set<Key> followingKeys) {
        this.following.addAll(followingKeys);
    }

    public Set<Key> getFollowing() {
        return this.following;
    }
}
Action Class:
@PersistenceCapable(identityType = IdentityType.APPLICATION)
public class Action {

    @PrimaryKey
    @Persistent(valueStrategy = IdGeneratorStrategy.IDENTITY)
    private Key key;

    @Persistent
    Date date;

    @Persistent
    private String title;

    public Action(Key key, String title) {
        this.key = key;
        this.title = title;
        this.date = new Date(); // date of creation (now).
    }

    public Key getKey() {
        return this.key;
    }

    public void setTitle(String title) {
        this.title = title;
    }

    public String getTitle() {
        return this.title;
    }
}
The Action class makes use of a Date property; you can refer to the documentation for applicable data types in the datastore. When an action is created, a Date object is allocated and initialized so that it represents the time at which it was created, measured to the nearest millisecond.
In my example above I linked the entities by their Keys, you could instead link them by their classes as follows:
List<Action> actions;
The relationship in my example is an unowned one-to-many relationship; perhaps it should be an owned one-to-many. More information here for you to take a look and perhaps decide which would be best for your solution.
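For illustration, the owned variant would look roughly like this (a sketch; the mappedBy value assumes the Action class gets a back-reference field named user):
// In User: owned one-to-many, the Action children live in the parent's entity group
// and are persisted/deleted together with it.
@Persistent(mappedBy = "user")
private List<Action> actions;

// In Action: back-reference to the owning User (field name matches mappedBy above).
@Persistent
private User user;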
Once the relationships have been defined, you can create your endpoint classes around the JDO model classes. This will create basic API methods. You might want to change the endpoint class methods to suit your needs, for example to change the way an action is created. A basic example would be to create the key from the action's title as follows (ActionEndpoint.java):
...
@ApiMethod(name = "insertAction")
public Action insertAction(@Named("title") String title) {
    PersistenceManager pm = getPersistenceManager();
    Key key = KeyFactory.createKey(Action.class.getSimpleName(), title);
    Action action = null;
    try {
        action = new Action(key, title);
        pm.makePersistent(action);
    } finally {
        pm.close();
    }
    return action;
}
...
If you want to, you can add a method to your UserEndpoint class to query the datastore and return all actions belonging to a user, ordered by date, using the datastore query objects.
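For example, here is a sketch of such a method (hypothetical; it batch-gets the user's action keys and sorts them in memory, and it assumes the endpoint can read Action's date field, e.g. by living in the same package or after adding a getDate() accessor):
@ApiMethod(name = "listActionsForUser")
public List<Action> listActionsForUser(@Named("userId") String userId) {
    PersistenceManager pm = getPersistenceManager();
    try {
        Key userKey = KeyFactory.createKey(User.class.getSimpleName(), userId);
        User user = (User) pm.getObjectById(User.class, userKey);

        // Batch-get all Action entities whose keys are stored on the user.
        javax.jdo.Query q = pm.newQuery(Action.class, ":keys.contains(key)");
        @SuppressWarnings("unchecked")
        List<Action> actions = new ArrayList<Action>((List<Action>) q.execute(user.getActions()));

        // Sort newest first.
        Collections.sort(actions, new Comparator<Action>() {
            public int compare(Action a, Action b) {
                return b.date.compareTo(a.date);
            }
        });
        return actions;
    } finally {
        pm.close();
    }
}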
You need to add a method to your UserEndpoint class that allows you to add an action to that user, here is a simple example:
...
@ApiMethod(name = "addActionToUser")
public User addActionToUser(
        @Named("userId") String userId,
        @Named("actionTitle") String actionTitle) {
    PersistenceManager pm = getPersistenceManager();
    Key userKey = KeyFactory.createKey(User.class.getSimpleName(), userId);
    Key actionKey = KeyFactory.createKey(Action.class.getSimpleName(), actionTitle);
    User user = null;
    try {
        user = (User) pm.getObjectById(User.class, userKey);
        user.addAction(actionKey);
        pm.makePersistent(user);
    } catch (Exception e) {
    }
    return user;
}
Once all of the above is complete you can easily get the list of actions per user by calling the getUser method in your UserEndpoint class, which returns a User object. You can then call [ReturnedUserObject].getActions(). A new follower can now view all of the followee's actions by just calling the API method to get that followee's object and its actions. You can then just sort the actions by date or however you envision it.
I hope I understood your question correctly. I was unsure about the first component you mentioned, but it seemed as though you got your relationships mixed up. I hope this solution points you in the right direction at least :).
If you need any additional help or clarification, or my answer was completely off point to what you were looking for then please let me know.
Kind regards,
Miki