Can you please help me solve this problem? I'm storing some data to the Datastore through the JDO interface using an AJAX call, and retrieving it immediately afterwards. Sometimes the retrieval returns NULL as a response (not always, only occasionally). Can you please help me fix this? The code below is used to store and retrieve the data.
This is the code for storing the data:
public void saveSchedule(String listName, String email, String date, String time, String details, String name)
{
Date hiredate = new Date();
String gmtdate = hiredate.toGMTString();
Schedule schedule = new Schedule();
schedule.setName(name);
schedule.setListName(listName);
schedule.setEmail(email);
schedule.setDate(date);
schedule.setDateGMT(gmtdate);
schedule.setDetails(details);
schedule.setTime(time);
p = PMF.get().getPersistenceManager();
try
{
p.makePersistent(schedule);
}
catch(Exception e)
{
System.out.println(e);
}
finally
{
p.close();
}
}
This is the code for retrieving the data:
public String savedDataRetrive(String details, String email) {
p = PMF.get().getPersistenceManager();
Query q = p.newQuery(Schedule.class);
q.setFilter("details == '"+details+"' && email == '"+email+"'");
List<Schedule> sch = (List<Schedule>) q.execute();
String data = null;
ObjectMapper n=new ObjectMapper();
try {
data = n.writeValueAsString(sch);
} catch (JsonGenerationException e) {
// TODO Auto-generated catch block
e.printStackTrace();
} catch (JsonMappingException e) {
// TODO Auto-generated catch block
e.printStackTrace();
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}finally{
p.close();
}
return data;
}
The Datastore replicates data across multiple data centers. This provides a high level of availability for reads and writes; however, most queries are eventually consistent.
Eventual consistency is a consistency model used in distributed computing to achieve high availability that informally guarantees that, if no new updates are made to a given data item, eventually all accesses to that item will return the last updated value.
This is most likely the reason why your query sometimes returns nothing.
I would recommend you go through the Structuring Data for Strong Consistency article.
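For example, a lookup by key is strongly consistent, while your query on the details and email properties is not. A rough sketch against your Schedule class (it assumes you keep the datastore Key of the entity you just persisted, e.g. as its primary key, which is not shown in your code):
// Sketch only: pm.getObjectById() does a key lookup, which is strongly consistent,
// unlike the property query in savedDataRetrive().
// Assumed imports: javax.jdo.PersistenceManager, com.google.appengine.api.datastore.Key
public String savedDataRetriveByKey(Key key) {
    PersistenceManager pm = PMF.get().getPersistenceManager();
    try {
        Schedule schedule = pm.getObjectById(Schedule.class, key);
        return new ObjectMapper().writeValueAsString(schedule);
    } catch (Exception e) {
        e.printStackTrace();
        return null;
    } finally {
        pm.close();
    }
}
Alternatively, ancestor queries (queries scoped to a single entity group) are strongly consistent as well; that is what the article above describes.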
Here is a useful example:
https://github.com/mattburns/OddPrints/blob/master/op-gae/src/com/oddprints/servlets/Edit.java#L89
@GET
@Path("/basic/sample")
@Produces(MediaType.TEXT_HTML)
public Viewable loadBasicSample(@Context HttpServletRequest req)
throws FileUploadException, IOException, URISyntaxException {
return viewSampleImage(req, Settings.SAMPLE_PHOTO_BLOB_KEY,
Settings.SAMPLE_PHOTO_BLOB_SIZE, new URL(
"http://www.oddprints.com/images/sample.jpg"));
}
Viewable viewSampleImage(HttpServletRequest req, Settings blobKeySetting,
Settings blobSizeSetting, URL image) throws MalformedURLException,
IOException {
String blobKeyString = ApplicationSetting.getSetting(blobKeySetting);
if (blobKeyString == null) {
InputStream imgStream = image.openStream();
byte[] bytes = IOUtils.toByteArray(imgStream);
BlobKey blobKey = ImageBlobStore.INSTANCE.writeImageData(bytes);
blobKeyString = blobKey.getKeyString();
ApplicationSetting.putSetting(blobKeySetting, blobKeyString);
ApplicationSetting.putSetting(blobSizeSetting, "" + bytes.length);
}
String blobSize = ApplicationSetting.getSetting(blobSizeSetting);
req.getSession().setAttribute("blobKeyString", blobKeyString);
req.getSession().setAttribute("blobSize", blobSize);
req.getSession().setAttribute("basicMode", Boolean.TRUE);
return viewBasic(req);
}
I would recommend using memcache; that way the fetch will be faster, and you will get fewer null objects in return, IMO.
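A rough sketch of the memcache idea (the cache key scheme here is just an example, and the code is untested): write the JSON into memcache right after persisting, and check the cache before falling back to the datastore query.
// Sketch: check memcache first, fall back to the eventually consistent query.
// Assumed imports: com.google.appengine.api.memcache.MemcacheService,
// com.google.appengine.api.memcache.MemcacheServiceFactory
private static final MemcacheService CACHE = MemcacheServiceFactory.getMemcacheService();

public String savedDataRetriveCached(String details, String email) {
    String cacheKey = "schedule:" + email + ":" + details; // example key scheme
    String cached = (String) CACHE.get(cacheKey);
    if (cached != null) {
        return cached; // served from memcache, no datastore query needed
    }
    String data = savedDataRetrive(details, email); // falls back to the original query
    if (data != null) {
        CACHE.put(cacheKey, data);
    }
    return data;
}
The same put() would go into saveSchedule() right after makePersistent(), so the freshly written data is available even while the query index is still catching up.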
I have a problem in a class I wrote. The purpose of the class is to add/remove/update entries in the ApplicationResources.properties files, which <spring:message code="key" /> uses to provide bilingual support on the website. Manually editing the properties files works fine, but I had a greater need, so I built a way to apply changes from the database. This has given me a very dynamic and flexible system to work from.
However, there is a problem. At some point, after even a single change made this way, the French characters end up getting changed, such as Déconnexion becoming DÃ©connexion. When looked at in Notepad++ it is first Déconnexion and is then corrupted to D\u00C3\u00A9connexion. This example was part of the original properties file.
The original (not temp) properties files have the text file encoding set to other: UTF-8. The Project properties text file encoding is set to inherited from container (Cp1252). I tried changing to Other: UTF-8 with no change.
So my question is: what is causing the corruption of my French characters, and how can I fix it? I have provided the complete class below.
Update: After the assistance from StephaneM in her answer I was able to track down exactly what is causing the corruption, but have not fixed it yet: the loadProperties() function in the AR class. As soon as the temp AP files are loaded, the French characters are corrupted. This makes me suspect the original process which creates the temp AP files is using a different encoding, so I will have to track that down.
package pojo;
import java.io.FileNotFoundException;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.util.Objects;
import java.util.Properties;
import java.util.Set;
import org.springframework.beans.factory.annotation.Autowired;
/*
* Purpose of this class is to handle all the ApplicationResource(_fr).properties interactions
* so that there is one unified location handling this, instead of code duplication.
*/
public class AR{
public final String en_path = "/ApplicationResources.properties";
public final String fr_path = "/ApplicationResources_fr.properties";
private Properties en_prop = null;
private Properties fr_prop = null;
public AR()
{
loadProperties();
}
private void loadProperties()
{
InputStream en_is = null;
InputStream fr_is = null;
try {
this.en_prop = new Properties();
this.fr_prop = new Properties();
en_is = this.getClass().getResourceAsStream(en_path);
fr_is = this.getClass().getResourceAsStream(fr_path);
en_prop.load(en_is);
fr_prop.load(fr_is);
} catch (FileNotFoundException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
}
}
private boolean keyExist(String mykey, String mypath) //deprecated due to better code/method
{
Properties test_prop = null;
InputStream is = null;
try {
test_prop = new Properties();
is = this.getClass().getResourceAsStream(mypath);
test_prop.load(is);
Set<Object> keys = test_prop.keySet();
for(Object k:keys) {
String key = (String)k;
//System.out.print(key + " ");
if(key.equals(mykey))
{
return true;
}
}
//System.out.println(" ");
} catch (FileNotFoundException e) {
e.printStackTrace();
} catch (NullPointerException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
}
return false;
}
public boolean en_keyExist(String mykey)
{
//searches english file
loadProperties();
return en_prop.containsKey(mykey);
//return keyExist(mykey, en_path); //original method
}
public boolean fr_keyExist(String mykey)
{
//searches french file
loadProperties();
return fr_prop.containsKey(mykey);
//return keyExist(mykey, fr_path); //original method
}
public boolean en_fr_keyExist(String mykey)
{
//searches both english and french files
loadProperties();
return (en_prop.containsKey(mykey) && fr_prop.containsKey(mykey));
//return (keyExist(mykey, en_path) && keyExist(mykey, fr_path)); //original method
}
public String en_returnProperty(String mykey)
{
//returns null if key does not exist
loadProperties();
return this.en_prop.getProperty(mykey);
}
public String fr_returnProperty(String mykey)
{
//returns null if key does not exist
loadProperties();
return this.fr_prop.getProperty(mykey);
}
public void appendProperty(Properties new_en_prop,Properties new_fr_prop)
{
//note: during a test, setProperty (used in populating the properties) does not allow duplicates, it overwrites.
//So, load the existing properties, and for each new property add it
loadProperties();
for(Object key : new_en_prop.keySet())
{
en_prop.setProperty((String)key, new_en_prop.getProperty((String)key));
}
try (OutputStream en_os = new FileOutputStream(getClass().getResource(en_path).getFile(),false);)
{
en_prop.store(en_os, null);
} catch (IOException e) {
e.printStackTrace();
}
for(Object key : new_fr_prop.keySet())
{
fr_prop.setProperty((String)key, new_fr_prop.getProperty((String)key));
}
try (OutputStream fr_os = new FileOutputStream(getClass().getResource(fr_path).getFile(),false);)
{
fr_prop.store(fr_os, null);
} catch (IOException e) {
e.printStackTrace();
}
}
public boolean appendProperty(String mykey, String en_val, String fr_val) //appears to have timing error due to only saving last value
//due to timing error this function is only suitable for single additions
//due to the timing error, tried returning boolean to have it finished but was not successful
//setting the class variables to static did not solve the timing issue
{
loadProperties();
en_prop.setProperty(mykey, en_val);
try (OutputStream en_os = new FileOutputStream(getClass().getResource(en_path).getFile(),false);)
{
en_prop.store(en_os, null);
} catch (IOException e) {
e.printStackTrace();
}
fr_prop.setProperty(mykey, fr_val);
try (OutputStream fr_os = new FileOutputStream(getClass().getResource(fr_path).getFile(),false);)
{
fr_prop.store(fr_os, null);
} catch (IOException e) {
e.printStackTrace();
}
return true;
}
public void en_setProperty(String mykey, String en_val)
//suspected timing issue, use only for singular changes
{
loadProperties();
en_prop.setProperty(mykey, en_val);
try (OutputStream en_os = new FileOutputStream(getClass().getResource(en_path).getFile(),false);)
{
en_prop.store(en_os, null);
} catch (IOException e) {
e.printStackTrace();
}
}
public void fr_setProperty(String mykey, String fr_val)
//suspected timing issue, use only for singular changes
{
loadProperties();
fr_prop.setProperty(mykey, fr_val);
try (OutputStream fr_os = new FileOutputStream(getClass().getResource(fr_path).getFile(),false);)
{
fr_prop.store(fr_os, null);
} catch (IOException e) {
e.printStackTrace();
}
}
public void compareResources()
{
Properties new_en = new Properties();
Properties new_fr = new Properties();
for(Object key : en_prop.keySet())
{
new_en.setProperty((String)key, en_prop.getProperty((String)key));
}
for(Object key : fr_prop.keySet())
{
new_fr.setProperty((String)key, fr_prop.getProperty((String)key));
}
Properties temp = (Properties) new_en.clone();
for(Object key : temp.keySet())
{
if(new_fr.containsKey((String) key))
{
new_fr.remove(key);
new_en.remove(key);
}
}
for(Object key : new_en.keySet())
{
System.out.println("English only key: " + ((String)key));
}
for(Object key : new_fr.keySet())
{
System.out.println("French only key: " + ((String)key));
}
}
}
Here is a sample use case for the class, taken directly from the application, with some editing so that only the relevant parts are shown:
AR testing = new AR();
Properties en_prop = new Properties();
Properties fr_prop = new Properties();
final String test_prod_cur = "{call BILINGUAL_VALUES(?)}";
ResultSet rs = null;
try (
Connection connection = jdbcTemplate.getDataSource().getConnection();
CallableStatement callableStatement = connection.prepareCall(test_prod_cur);
)
{
callableStatement.registerOutParameter(1, OracleTypes.CURSOR);
callableStatement.executeUpdate();
rs = (ResultSet) callableStatement.getObject(1);
while (rs.next())
{
String thead = rs.getString(1);
en_prop.setProperty(keyheader+thead, rs.getString(2));
fr_prop.setProperty(keyheader+thead, rs.getString(3));
//testing.appendProperty(keyheader+thead, rs.getString(2), rs.getString(3)); //has a timing issue, ends up only appending final value
}
}
catch (SQLException e)
{
System.out.println("SQLException - bilingual values");
System.out.println(e.getMessage());
}
testing.appendProperty(en_prop, fr_prop);
Regarding this question: "what is causing the corruption of my French characters and how can I fix it?", the answer is in the documentation for Properties.store():
public void store(OutputStream out,
String comments)
throws IOException
Writes this property list (key and element pairs) in this Properties
table to the output stream in a format suitable for loading into a
Properties table using the load(InputStream) method.
Properties from the defaults table of this Properties table (if any)
are not written out by this method.
This method outputs the comments, properties keys and values in the
same format as specified in store(Writer), with the following
differences:
The stream is written using the ISO 8859-1 character encoding.
Characters not in Latin-1 in the comments are written as \uxxxx for their appropriate unicode hexadecimal value xxxx.
Characters less than \u0020 and characters greater than \u007E in property keys or values are written as \uxxxx for the appropriate hexadecimal value xxxx.
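That also explains the exact characters you are seeing: the UTF-8 encoding of é is the two bytes 0xC3 0xA9, and when those bytes are read back as ISO 8859-1 they decode to Ã and ©, which store() then escapes as \u00C3\u00A9. A quick demonstration snippet:
// Demonstration only: UTF-8 bytes of "é" misread as ISO 8859-1 become "Ã©".
// Assumed import: java.nio.charset.StandardCharsets
byte[] utf8Bytes = "é".getBytes(StandardCharsets.UTF_8);
String misread = new String(utf8Bytes, StandardCharsets.ISO_8859_1);
System.out.println(misread); // prints Ã©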
I am not the only person who has faced this issue; I managed to find another question, and one of its answers led me to my solution. I also have to thank another site for letting me know what to include.
There are only a few added or changed lines; I will list them and then give the complete function.
import java.io.Reader;
import java.io.InputStreamReader;
Reader reader = new InputStreamReader(fr_is, "UTF-8");
fr_prop.load(reader); //instead of fr_prop.load(fr_is);
reader.close();
The complete function
import java.io.Reader;
import java.io.InputStreamReader;
private void loadProperties()
{
InputStream en_is = null;
InputStream fr_is = null;
try {
this.en_prop = new Properties();
this.fr_prop = new Properties();
en_is = this.getClass().getResourceAsStream(en_path);
fr_is = this.getClass().getResourceAsStream(fr_path);
Reader reader = new InputStreamReader(fr_is, "UTF-8");
en_prop.load(en_is);
fr_prop.load(reader);
reader.close();
} catch (FileNotFoundException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
}
}
Introducing the reader and using it has cleared up the French character corruption.
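If you also want the files themselves to stay in UTF-8 on disk (rather than ISO 8859-1 with \uXXXX escapes), Properties.store() has a Writer overload, so the same trick works on the write side. A sketch (untested; it would need to be applied to every method that calls store(), and loading must then always go through the UTF-8 reader):
// Sketch: write the French properties back out as UTF-8 instead of ISO 8859-1.
// Assumed imports: java.io.Writer, java.io.OutputStreamWriter
try (Writer fr_writer = new OutputStreamWriter(
        new FileOutputStream(getClass().getResource(fr_path).getFile(), false), "UTF-8")) {
    fr_prop.store(fr_writer, null);
} catch (IOException e) {
    e.printStackTrace();
}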
I should mention that I changed every file encoding setting I could find over to UTF-8 before I made the above changes and got it working. This site shows the changes I made, and this page helped me confirm the encodings.
I have been trying to implement a DAO method for the delete operation on Azure Table Storage entities. Deleting with a TableOperation was fine:
TableOperation deleteEntity = TableOperation.delete(entity);
But when I tried it using a batch operation, it was not supported.
Any suggestions to overcome this issue are highly appreciated.
But when I tried it using a batch operation, it was not supported.
I assume that you could group the items to delete by partition key and then execute a TableBatchOperation for each group.
Here is a helper class I wrote in C# for this purpose; you could refer to it:
public class TableBatchHelper<T> where T : ITableEntity
{
const int batchMaxSize = 100;
public static IEnumerable<TableBatchOperation> GetBatchesForDelete(IEnumerable<T> items)
{
var list = new List<TableBatchOperation>();
var partitionGroups = items.GroupBy(arg => arg.PartitionKey).ToArray();
foreach (var group in partitionGroups)
{
T[] groupList = group.ToArray();
int offSet = batchMaxSize;
T[] entities = groupList.Take(offSet).ToArray();
while (entities.Any())
{
var tableBatchOperation = new TableBatchOperation();
foreach (var entity in entities)
{
tableBatchOperation.Add(TableOperation.Delete(entity));
}
list.Add(tableBatchOperation);
entities = groupList.Skip(offSet).Take(batchMaxSize).ToArray();
offSet += batchMaxSize;
}
}
return list;
}
public static async Task BatchDeleteAsync(CloudTable table, IEnumerable<T> items)
{
var batches = GetBatchesForDelete(items);
await Task.WhenAll(batches.Select(table.ExecuteBatchAsync));
}
}
Then you could execute the batch delete as follows:
await TableBatchHelper<ClassName>.BatchDeleteAsync(cloudTable,items);
Or
var batches = TableBatchHelper<ClassName>.GetBatchesForDelete(entities);
Parallel.ForEach(batches, new ParallelOptions()
{
MaxDegreeOfParallelism = 5
}, (batchOperation) =>
{
try
{
table.ExecuteBatch(batchOperation);
Console.WriteLine("Writing {0} records", batchOperation.Count);
}
catch (Exception ex)
{
Console.WriteLine("ExecuteBatch throw a exception:" + ex.Message);
}
});
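Since your code is in Java, the same idea in Java would look roughly like this (an untested sketch using the types already present in your code: CloudTable, TableBatchOperation, StorageException and AzureLocationData). The two constraints that matter are that a batch must target a single partition key and may contain at most 100 operations.
// Sketch: group entities by partition key, then delete them in batches of at most 100.
// Assumed imports: java.util.*, com.microsoft.azure.storage.StorageException,
// com.microsoft.azure.storage.table.TableBatchOperation
Map<String, List<AzureLocationData>> byPartition = new HashMap<String, List<AzureLocationData>>();
for (AzureLocationData entity : cloudTable.execute(partitionQuery)) {
    List<AzureLocationData> group = byPartition.get(entity.getPartitionKey());
    if (group == null) {
        group = new ArrayList<AzureLocationData>();
        byPartition.put(entity.getPartitionKey(), group);
    }
    group.add(entity);
}
for (List<AzureLocationData> group : byPartition.values()) {
    for (int i = 0; i < group.size(); i += 100) {
        TableBatchOperation batch = new TableBatchOperation();
        for (AzureLocationData entity : group.subList(i, Math.min(i + 100, group.size()))) {
            batch.delete(entity);
        }
        try {
            cloudTable.execute(batch); // one batch per partition, up to 100 deletes each
        } catch (StorageException e) {
            e.printStackTrace();
        }
    }
}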
No, that was the code without the batch operation. Below is the code that includes the batch operation; sorry for not mentioning that.
TableBatchOperation batchOperation = new TableBatchOperation();
List<TableBatchOperation> list = new ArrayList<>();
if (partitionQuery != null) {
for (AzureLocationData entity : cloudTable.execute(partitionQuery)) {
batchOperation.add(TableOperation.delete(entity));
list.add(batchOperation); //exception thrown line
}
try {
cloudTable.execute((TableOperation) batchOperation);
} catch (StorageException e) {
e.printStackTrace();
}
}
public void deleteLocationsForDevice(String id) {
logger.info("Going to delete location data for Device [{}]", id);
// Create a filter condition where the partition key is deviceId.
String partitionFilter = TableQuery.generateFilterCondition(
PARTITION_KEY,
TableQuery.QueryComparisons.EQUAL,
id);
// Specify a partition query, using partition key filter.
TableQuery<AzureLocationData> partitionQuery =
TableQuery.from(AzureLocationData.class)
.where(partitionFilter);
if (partitionQuery != null) {
for (AzureLocationData entity : cloudTable.execute(partitionQuery)) {
TableOperation deleteEntity = TableOperation.delete(entity);
try {
cloudTable.execute(deleteEntity);
logger.info("Successfully deleted location records with : " + entity.getPartitionKey());
} catch (StorageException e) {
e.printStackTrace();
}
}
} else {
logger.debug("No records to delete!");
}
// throw new UnsupportedOperationException("AzureIotLocationDataDao Delete Operation not supported");
}
I wrote a very simple Java web application that includes some basic functions like register, sign in, change password, and a few others.
I don't use a database. I just create a file in the app to record the users' information and do the database work there.
I used JMeter to stress-test the web application, especially the register interface.
JMeter shows that the result of the 1000-thread run is correct,
but when I look into information.txt, which stores the users' information, it is wrong: it holds only 700+ records
when it should contain 1000, so something must be going wrong somewhere.
I use a singleton class to do the write/read work, and I added the synchronized keyword to it. The insert() function, which register uses to record the registration information, is shown below (part of the class):
public class Database {
private static Database database = null;
private static File file = null;
public synchronized static Database getInstance() {
if (database == null) {
database = new Database();
}
return database;
}
private Database() {
String path = this.getClass().getClassLoader().getResource("/")
.getPath() + "information.txt";
file = new File(path);
if (!file.exists()) {
try {
file.createNewFile();
} catch (IOException ex) {
ex.printStackTrace();
}
}
}
public void insert(String account, String password, String username) {
RandomAccessFile infoFile = null;
try {
infoFile = new RandomAccessFile(file, "rw");
String record;
long offset = 0;
while ((record = infoFile.readLine()) != null ) {
offset += record.getBytes().length+2;
}
infoFile.seek(offset);
record = account+"|"+password+"|"+username+"\r\n";
infoFile.write(record.getBytes());
infoFile.close();
} catch (IOException e) {
e.printStackTrace();
} finally {
if (infoFile != null) {
try {
infoFile.close();
} catch (IOException ex) {
ex.printStackTrace();
}
}
}
}
}
The question is: why does this happen? synchronized is supposed to make it thread-safe, so why did I lose so much data, and why were blank lines inserted? What can I do to correct it?
You are synchronizing the getInstance() method, but not the insert() method. This makes the retrieval of the instance of Database thread-safe, but not the write operation.
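A minimal sketch of one possible fix: synchronize insert() as well, so only one thread at a time can write a record. (Opening the file in append mode, as done here, is an extra simplification on my part; it also removes the manual seek-to-end logic that computes the offset.)
// Sketch: a synchronized, append-mode insert so concurrent registrations cannot interleave.
// Assumed import: java.io.FileWriter
public synchronized void insert(String account, String password, String username) {
    try (FileWriter writer = new FileWriter(file, true)) { // true = append to the end
        writer.write(account + "|" + password + "|" + username + "\r\n");
    } catch (IOException e) {
        e.printStackTrace();
    }
}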
I'm trying to create save states for my game: not so much for where the game was left off, but something simple like a scoreboard. The format would be something like this:
Wins: 5
Losses: 10
GamesPlayed: 15
I need to be able to access the file and, depending on whether the player won or lost, add 1 to the corresponding value in the file.
What would be the best way to go about this? I've heard of a bunch of different ways to save data, for example XML, but aren't those overkill for the size of my data?
Also, I want to keep this file safe from users going into it and changing the data. Would I have to use some sort of encryption? And if a user removes the file and replaces it with an empty one, can't they technically reset their values?
You can use plain serialization/deserialization for this. In order to serialize/deserialize a class, it must implement the Serializable interface. Here's an example to start with:
public class Score implements Serializable {
private int wins;
private int loses;
private int gamesPlayed;
//constructor, getter and setters...
}
public class ScoreDataHandler {
private static final String fileName = "score.dat";
public void saveScore(Score score) {
ObjectOutputStream out = null;
try {
out = new ObjectOutputStream(new FileOutputStream(fileName));
out.writeObject(score);
} catch (Exception e) {
//handle your exceptions...
} finally {
if (out != null) {
try {
out.close();
} catch (IOException ioe) {
}
}
}
}
public Score loadScore() {
ObjectInputStream in = null;
Score score = null;
try {
in = new ObjectInputStream(new FileInputStream(fileName));
score = (Score)in.readObject();
} catch (Exception e) {
//handle your exceptions...
} finally {
if (in != null) {
try {
in.close();
} catch (IOException ioe) {
}
}
}
return score;
}
}
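A usage sketch (assuming the usual getters/setters and a no-argument constructor on Score, which the skeleton above only hints at):
// Sketch: update the scoreboard after a win and persist it again.
ScoreDataHandler handler = new ScoreDataHandler();
Score score = handler.loadScore();
if (score == null) {
    score = new Score(); // assumes a no-arg constructor; adjust to your own constructor
}
score.setWins(score.getWins() + 1);
score.setGamesPlayed(score.getGamesPlayed() + 1);
handler.saveScore(score);
Serialized files are not human-editable the way a plain text file is, which helps a little with your tampering concern, but a determined user can still delete or replace the file, so treat any local save as untrusted.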
I'm working on an Android application that uses Google App Engine to store and manage data for the application. Unfortunately I've run into a problem which I don't seem able to solve.
When a user creates a new account, a new "Project" is created for them. This project contains tasks, and these tasks are stored in an ArrayList in the Project class. In the constructor of the Project class everything is instantiated, and the Tasks are created from a text file with data in JSON format using Gson 2.2.2.
All of this works fine, and if I look in the datastore viewer in the App Engine admin console everything looks good. Immediately after creating the account the user is logged in, and when the user logs in the Tasks need to be sent to the Android client. This is where it gets weird. When serializing the Tasks back to JSON, they seem to be uninitialized: all the String fields are empty and the integers are all set to 0, yet the correct number of Tasks is serialized, so the list is populated. This problem persists until I manually shut down the instance in GAE. When it is restarted by a new request, the data is serialized correctly to JSON and everything works fine. Obviously this is not good; I can't have the server shut down its instance every time a new user creates an account just to be able to serve them the correct data. So please help me solve this; I've been struggling with it for quite a while now. Below is code that accurately reproduces the problem.
public class CreateData extends HttpServlet{
public void doGet(HttpServletRequest req, HttpServletResponse resp){
if(req.getParameter("name").length() > 1){
PersistenceManager pm = PMF.get().getPersistenceManager();
User u = new User(req.getParameter("name"));
pm.makePersistent(u);
pm.close();
try {
resp.getWriter().print("User created with name "+req.getParameter("name"));
} catch (IOException e) {
e.printStackTrace();
}
}else if(req.getParameter("name").length() <= 1){
try {
resp.getWriter().print("Please supply a name with at least 2 characters");
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
}else{
try {
resp.sendError(400);
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
}
}
}
The User class
public class User {
public @interface Skip {
// Field tag only annotation
}
@PrimaryKey
@Persistent
private String name;
@Persistent
private ArrayList<DataType> data;
public User(String name) {
this.name = name;
data = new ArrayList<DataType>();
createDataFromJSON();
}
public String getDatasAsJSON(){
Gson gson = new GsonBuilder().setExclusionStrategies(new MyExclusionStrategy(Key.class)).create();
Type taskType = new TypeToken<List<DataType>>(){}.getType();
String json = gson.toJson(this.data, taskType);
return json;
}
public void createDataFromJSON() {
FileReader fr = null;
try {
fr = new FileReader(new File("WEB-INF/defaults.json"));
} catch (FileNotFoundException e) {
e.printStackTrace();
}
if (fr != null) {
Type taskType = new TypeToken<List<DataType>>(){}.getType();
data = new Gson().fromJson(fr, taskType);
}
}
public class MyExclusionStrategy implements ExclusionStrategy {
private final Class<?> typeToSkip;
private MyExclusionStrategy(Class<?> typeToSkip) {
this.typeToSkip = typeToSkip;
}
public boolean shouldSkipClass(Class<?> clazz) {
return (clazz == typeToSkip);
}
public boolean shouldSkipField(FieldAttributes f) {
return f.getAnnotation(Skip.class) != null;
}
}
}
The DataType class
public class DataType {
@PrimaryKey
@Persistent(valueStrategy = IdGeneratorStrategy.IDENTITY)
private Key key;
@Persistent
private String name;
@Persistent
private int points;
@Persistent
private int unique;
public DataType() {
}
public DataType(String name, int points, int unique){
this.name = name;
this.points = points;
this.unique = unique;
}
public String getName() {
return name;
}
public void setName(String name) {
this.name = name;
}
public int getPoints() {
return points;
}
public void setPoints(int points) {
this.points = points;
}
public int getUnique() {
return unique;
}
public void setUnique(int unique) {
this.unique = unique;
}
}
Servlet for getting the data
public class GetData extends HttpServlet{
public void doGet(HttpServletRequest req, HttpServletResponse resp){
String name = req.getParameter("name");
PersistenceManager pm = PMF.get().getPersistenceManager();
User u = null;
try{
u = pm.getObjectById(User.class, name);
}catch(JDOObjectNotFoundException e){
try {
resp.sendError(404);
} catch (IOException e1) {
e1.printStackTrace();
}
}
if(u != null){
String response = u.getDatasAsJSON();
try {
resp.getWriter().print(response);
} catch (IOException e) {
e.printStackTrace();
}
}
}
}
And the JSON data
[
{
"name": "Hug",
"unique": 1,
"points": 20
},
{
"name": "Tug",
"unique": 2,
"points": 40
},
{
"name": "Rug",
"unique": 3,
"points": 50
},
{
"name": "Jug",
"unique": 4,
"points": 100
},
{
"name": "Smug",
"unique": 5,
"points": 20
}
]
So, creating a new User with the name "Arne" works fine and the objects are created in the HRD. Asking for the objects back from the datastore as JSON yields this response:
[{"points":0,"unique":0},{"points":0,"unique":0},{"points":0,"unique":0},{"points":0,"unique":0},{"points":0,"unique":0}]
Upon restarting the server instance, the same request gives this response:
[{"name":"Hug","points":20,"unique":1},{"name":"Tug","points":40,"unique":2},{"name":"Rug","points":50,"unique":3},{"name":"Jug","points":100,"unique":4},{"name":"Smug","points":20,"unique":5}]
Sorry for the long post, but hopefully somebody is able to point out to me what I'm doing wrong. Many thanks in advance!
Best regards,
Ivar
Too bad nobody seems to be able to answer this, though I've found a workaround. I still think this needs a proper solution, but at least I've got it working now by looping through all the objects and assigning a temporary variable the value from a field in each object. This seems to force them to initialize, and the returned JSON is then populated with the correct fields.
public class GetData extends HttpServlet{
public void doGet(HttpServletRequest req, HttpServletResponse resp){
String name = req.getParameter("name");
PersistenceManager pm = PMF.get().getPersistenceManager();
User u = null;
try{
u = pm.getObjectById(User.class, name);
}catch(JDOObjectNotFoundException e){
try {
resp.sendError(404);
} catch (IOException e1) {
e1.printStackTrace();
}
}
if(u != null){
//By adding this seemingly pointless loop
//the objects will actually return populated fields
//when converted back to JSON
for(DataType dt : u.getData()){
String temp = dt.getName();
}
String response = u.getDatasAsJSON();
try {
resp.getWriter().print(response);
} catch (IOException e) {
e.printStackTrace();
}
}
}
}
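My best guess at what is going on (I have not confirmed this): JDO loads persistent fields lazily, and Gson reads the fields by reflection, so any field that has not been touched before the PersistenceManager is closed still holds its default value. If that is right, a less fragile variant of the workaround is to tell JDO to load everything up front via the fetch plan instead of relying on the dummy loop. An untested sketch:
// Untested sketch: ask JDO to load all fields, to unlimited depth, before serializing.
// Assumed imports: javax.jdo.FetchPlan, javax.jdo.JDOObjectNotFoundException, java.io.IOException
public void doGet(HttpServletRequest req, HttpServletResponse resp) throws IOException {
    String name = req.getParameter("name");
    PersistenceManager pm = PMF.get().getPersistenceManager();
    pm.getFetchPlan().setGroup(FetchPlan.ALL);
    pm.getFetchPlan().setMaxFetchDepth(-1); // -1 = unlimited fetch depth
    try {
        User u = pm.getObjectById(User.class, name);
        resp.getWriter().print(u.getDatasAsJSON()); // serialize while the object is still managed
    } catch (JDOObjectNotFoundException e) {
        resp.sendError(404);
    } finally {
        pm.close();
    }
}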