I am making a J2ME application and I am having a problem when I try to save. I think it saves properly, but I am not sure; when I retrieve the data, it gives null.
This is how I am storing the values:
PAR par = new PAR(oldMonPay, newMonPay, oldInterest);
par.setOldMPay(oldMonPay);
par.setNewMPay(newMonPay);
par.setOldInt(oldInterest);
And this is how I am saving and retrieving:
public static byte[] parseObjPAR(PAR p) {
    ByteArrayOutputStream baos = new ByteArrayOutputStream();
    DataOutputStream out;
    try {
        out = new DataOutputStream(baos);
        out.writeUTF(p.getNewMPay());
        out.writeUTF(p.getOldInt());
        out.writeUTF(p.getOldMPay());
    } catch (IOException e) {
    }
    return baos.toByteArray();
}
public static PAR parseByteArrPAR(byte[] b) {
    PAR p = null;
    ByteArrayInputStream bais;
    DataInputStream in;
    if (b != null) {
        try {
            bais = new ByteArrayInputStream(b);
            in = new DataInputStream(bais);
            p = new PAR(
                    in.readUTF(),
                    in.readUTF(),
                    in.readUTF());
        } catch (IOException e) {
        }
    }
    return p;
}
This is how I am displaying the retrieved information. There is another problem: it is not showing all the data, only 3 records (the first 3, I think).
public void populatePAResult(PAR[] p) {
    try {
        for (int i = 0; i < p.length; i++) {
            String oldMP = p[i].getOldMPay();
            String newMP = p[i].getNewMPay();
            String oldI = p[i].getOldInt();
            result1.append("Day : " + oldMP, null);
            result1.append("Time : " + oldI, null);
            result1.append("Technology : " + newMP, null);
        }
    } catch (Exception e) {
    }
}
In the parseObjPAR method that writes the data, the order is:
out.writeUTF(p.getNewMPay());
out.writeUTF(p.getOldInt());
out.writeUTF(p.getOldMPay());
whereas when you read it back, the values are passed straight to the constructor, which expects a different order:
PAR par = new PAR(oldMonPay, newMonPay, oldInterest);
so even if the result weren't null, the loaded data would be scrambled. Also note that both methods swallow IOException silently; parseByteArrPAR returns null whenever the byte array is null or an exception is thrown, so log the exception to find the real cause of the null.
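One way to fix it is to make the write order match the constructor. A minimal sketch, assuming the constructor really is PAR(oldMPay, newMPay, oldInt) as shown above and all three fields are Strings:

// Write in the constructor's order: old payment, new payment, old interest.
public static byte[] parseObjPAR(PAR p) throws IOException {
    ByteArrayOutputStream baos = new ByteArrayOutputStream();
    DataOutputStream out = new DataOutputStream(baos);
    out.writeUTF(p.getOldMPay());
    out.writeUTF(p.getNewMPay());
    out.writeUTF(p.getOldInt());
    return baos.toByteArray();
}

// Read back in exactly the same order.
public static PAR parseByteArrPAR(byte[] b) throws IOException {
    DataInputStream in = new DataInputStream(new ByteArrayInputStream(b));
    return new PAR(in.readUTF(), in.readUTF(), in.readUTF());
}

Declaring throws IOException instead of using empty catch blocks also lets the caller see the real error rather than silently getting null.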
I'm in a little bit of a pickle here. I am reading JSON from a Zip file and I want to fill a table in Vaadin with the contents of that JSON.
Here's my function (Java) that reads the entry and fills the table.
private void getJsonContent() {
    try {
        FileInputStream fis = new FileInputStream(backupFile);
        ZipInputStream zin = new ZipInputStream(new BufferedInputStream(fis));
        ZipEntry entry;
        byte[] buffer = new byte[1024];
        while ((entry = zin.getNextEntry()) != null) {
            if (entry.getName().equalsIgnoreCase("content.json")) {
                int n;
                while ((n = zin.read(buffer, 0, 1024)) > -1) {
                    String JSON = new String(buffer, Charset.forName("UTF-8"));
                    JSONObject obj = new JSONObject(JSON);
                    logger.info(JSON);
                    // Assign "global" Values to Variables
                    this.createdAt = obj.getString("created_at");
                    this.version = obj.getString("version");
                    // Fill table if applicable
                    for (int i = 0; i < obj.getJSONArray("content").length(); i++) {
                        JSONObject sub = obj.getJSONArray("content").getJSONObject(i);
                        logger.info(sub);
                        infoTable.addItem(new Object[] {
                                sub.get("imported_identities").toString(),
                                sub.get("project_versions").toString(),
                                sub.get("last_import").toString(),
                                sub.get("client").toString(),
                                sub.get("project").toString()
                        }, i + 1);
                    }
                }
            }
        }
        zin.close();
        fis.close();
    } catch (FileNotFoundException e) {
        // Can't happen here
    } catch (IOException e) {
        logger.info("Can't read File.");
    } catch (JSONException jse) {
        logger.info("JSON Content could not be read: " + jse.getMessage());
    }
}
You will notice the logger.info(sub) call; it is there to make sure that what I get is another valid JSON object (the file I am reading contains nested structures).
Output:
{"imported_identities":0,"project_versions":0,"last_import":null,"client":"Client1","project":"Project2"}
{"imported_identities":0,"project_versions":0,"last_import":null,"client":"Client2","project":"Project1"}
{"imported_identities":0,"project_versions":1,"last_import":"2016-09-14T09:28:24.520Z","client":"Client1","project":"Project1"}
I made sure all the values were correct (and the table is built with null as the default). Here are the table properties:
infoTable.addContainerProperty(impIds, String.class, null);
infoTable.addContainerProperty(projVe, String.class, null);
infoTable.addContainerProperty(lstImp, String.class, null);
infoTable.addContainerProperty(client, String.class, null);
infoTable.addContainerProperty(projct, String.class, null);
infoTable.setColumnCollapsingAllowed(true);
infoTable.setColumnCollapsed(impIds, true);
infoTable.setColumnCollapsed(projVe, true);
infoTable.setColumnCollapsed(lstImp, true);
Finally, refreshRowCache is called on the table. Does anyone see the problem? There are no errors at all; the table just doesn't add the item (infoTable.getItemIds().size() is 0 right after the call).
EDIT:
I tried the following to verify.
infoTable.addItem(i + 1);
infoTable.getItem(i + 1).getItemProperty(impIds).setValue(sub.get("imported_identities").toString());
infoTable.getItem(i + 1).getItemProperty(projVe).setValue(sub.get("project_versions").toString());
This caused a NullPointerException; as far as I can see, the stack trace does not contain any of my classes.
The following is wrong:
The String constructor needs the read size (n).
while ((n = zin.read(buffer, 0, 1024)) > -1) {
String JSON = new String(buffer, 0, n, StandardCharsets.UTF_8);
Also, the loop builds a JSON string from each block of at most 1024 bytes, instead of one JSON string for the whole entry.
UTF-8 bytes cannot be split at an arbitrary position (say 1024) and be expected to form a complete, valid multi-byte sequence at the end of one block and the start of the next.
In addition, the whole entry can be read in one go with readFully (via a DataInputStream), and closeEntry was missing.
In short:
private void getJsonContent() {
    try (ZipInputStream zin = new ZipInputStream(new BufferedInputStream(
            new FileInputStream(backupFile)))) {
        ZipEntry entry;
        while ((entry = zin.getNextEntry()) != null) {
            if (entry.getName().equalsIgnoreCase("content.json")) {
                long size = entry.getSize();
                // getSize() may return -1 when the size is not stored in the entry.
                if (size < 0 || size > 100_000) {
                    throw new IllegalArgumentException("Entry size unknown or too large");
                }
                // We could use an InputStreamReader and read the text piecewise.
                // However, JSON parsing is easiest on the entire text.
                byte[] buffer = new byte[(int) size];
                // ZipInputStream has no readFully of its own; wrap it in a
                // DataInputStream (do not close the wrapper, that would close zin too).
                new DataInputStream(zin).readFully(buffer);
                String json = new String(buffer, StandardCharsets.UTF_8);
                JSONObject obj = new JSONObject(json);
                logger.info(json);
                // Assign "global" Values to Variables
                this.createdAt = obj.getString("created_at");
                this.version = obj.getString("version");
                // Fill table if applicable
                for (int i = 0; i < obj.getJSONArray("content").length(); i++) {
                    JSONObject sub = obj.getJSONArray("content").getJSONObject(i);
                    logger.info(sub);
                    infoTable.addItem(new Object[] {
                            sub.get("imported_identities").toString(),
                            sub.get("project_versions").toString(),
                            sub.get("last_import").toString(),
                            sub.get("client").toString(),
                            sub.get("project").toString()
                    }, i + 1);
                }
            } // if
            zin.closeEntry(); // Do not forget to prepare for the next entry
        }
    } catch (IOException e) {
        logger.info("Can't read File.");
    } catch (JSONException jse) {
        logger.info("JSON Content could not be read: " + jse.getMessage());
    }
}
The try-with-resources closes automatically even on exception or return.
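To illustrate the UTF-8 point: decoding a multi-byte character whose bytes straddle a block boundary produces replacement characters. A tiny stand-alone sketch (not part of the answer's code):

import java.nio.charset.StandardCharsets;

public class Utf8SplitDemo {
    public static void main(String[] args) {
        byte[] bytes = "é".getBytes(StandardCharsets.UTF_8); // two bytes: 0xC3 0xA9
        // Decoding each half on its own, as the original loop effectively did at
        // block boundaries, yields U+FFFD replacement characters.
        String first = new String(bytes, 0, 1, StandardCharsets.UTF_8);
        String second = new String(bytes, 1, 1, StandardCharsets.UTF_8);
        System.out.println(first + " | " + second); // prints "� | �"
    }
}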
Below is my code. I am converting images into byte array values.
Here finalPathNames.size() == 4.
I want to save each byteArray value produced inside the for loop separately, i.e. as byteArray1, byteArray2, byteArray3, byteArray4.
Set<String> finalPathNames = sharedpre.getStringSet("prePathNames", null);
InputStream is = null;
for (String temp : finalPathNames) {
    try {
        is = new FileInputStream(temp);
        try {
            byteArray = streamToBytes(is);
        } finally {
            is.close();
        }
    } catch (Exception e) {
    }
}
Is there an optimized way to collect the result values?
Send the bytes to the server as you retrieve them, or keep them in a list (in case you need them more than once).
// As mentioned in the comments, the user specifically wants 4 arrays.
byte[][] byteArrays = new byte[4][];
Set<String> finalPathNames = sharedpre.getStringSet("prePathNames", null);
InputStream is = null;
int index = 0;
for (String temp : finalPathNames) {
    byteArrays[index] = new byte[0]; // in case of an exception, leave an empty array (or set to null)
    try {
        is = new FileInputStream(temp);
        try {
            byte[] byteArray = streamToBytes(is);
            byteArrays[index] = byteArray;
        } finally {
            is.close();
        }
    } catch (Exception e) {
    } finally {
        index++;
    }
}
Then the resulting byte arrays are available as:
byteArrays[0], byteArrays[1], byteArrays[2], byteArrays[3]
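streamToBytes is referenced but not shown in the question. A minimal sketch of what such a helper usually looks like (this is an assumption, not the asker's actual implementation):

// Hypothetical version of the streamToBytes(is) helper used above:
// it reads the whole InputStream into a byte array.
private static byte[] streamToBytes(InputStream is) throws IOException {
    ByteArrayOutputStream bos = new ByteArrayOutputStream();
    byte[] chunk = new byte[4096];
    int n;
    while ((n = is.read(chunk)) != -1) {
        bos.write(chunk, 0, n);
    }
    return bos.toByteArray();
}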
I want to save some of my ArrayList to a file in Android (the list will be deleted after that). I already have two methods to write to and read from an Android file, but I need the two methods to do the following:
The first method must save an element of the ArrayList; if I call it again, it must not write the new element on the same line but on a new line.
The second must read a line (for example, I pass the method a line number and it returns what that line contains).
The file looks like this:
firstelem
secondelem
thridelem
anotherelem
another ..
Is this possible to do in Android Java?
PS: I don't need a database.
Update
These are my methods:
private void writeToFile(String data) {
    try {
        OutputStreamWriter outputStreamWriter = new OutputStreamWriter(openFileOutput("config.txt", Context.MODE_PRIVATE));
        outputStreamWriter.write(data);
        outputStreamWriter.close();
    } catch (IOException e) {
        Log.e("Exception", "File write failed: " + e.toString());
    }
}
private String readFromFile() {
    String ret = "";
    try {
        InputStream inputStream = openFileInput("config.txt");
        if (inputStream != null) {
            InputStreamReader inputStreamReader = new InputStreamReader(inputStream);
            BufferedReader bufferedReader = new BufferedReader(inputStreamReader);
            String receiveString = "";
            StringBuilder stringBuilder = new StringBuilder();
            while ((receiveString = bufferedReader.readLine()) != null) {
                stringBuilder.append(receiveString);
                // stringBuilder.append("\\n");
            }
            inputStream.close();
            ret = stringBuilder.toString();
        }
    } catch (FileNotFoundException e) {
        Log.e("login activity", "File not found: " + e.toString());
    } catch (IOException e) {
        Log.e("login activity", "Can not read file: " + e.toString());
    }
    return ret;
}
Using the save method you linked to, you can create the text to save with a StringBuilder:
public String makeArrayListFlatfileString(List<List<String>> listOfLists)
{
    StringBuilder sb = new StringBuilder();
    if (!listOfLists.isEmpty()) {
        // this assumes all lists are the same length
        int listLengths = listOfLists.get(0).size();
        for (int i = 0; i < listLengths; i++)
        {
            for (List<String> list : listOfLists)
            {
                sb.append(list.get(i)).append("\n");
            }
            sb.append("\n"); // blank line after column grouping
        }
    }
    return sb.toString();
}
To parse the contents from that same file (again assuming equal length lists and a String input):
public List<List<String>> getListOfListsFromFlatfile(String data)
{
    // split into lines
    String[] lines = data.split("\\n");
    // first find out how many Lists we'll need
    int numberOfLists = 0;
    for (String line : lines) {
        if (line.trim().equals(""))
        {
            // blank line means new column grouping so stop counting
            break;
        }
        else
        {
            numberOfLists++;
        }
    }
    // make enough empty lists to hold the info:
    List<List<String>> listOfLists = new ArrayList<List<String>>();
    for (int i = 0; i < numberOfLists; i++)
    {
        listOfLists.add(new ArrayList<String>());
    }
    // keep track of which list we should be adding to, and populate the lists
    int listTracker = 0;
    for (String line : lines)
    {
        if (line.trim().equals(""))
        {
            // new block so add next item to the first list again
            listTracker = 0;
            continue;
        }
        else
        {
            listOfLists.get(listTracker).add(line);
            listTracker++;
        }
    }
    return listOfLists;
}
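A quick round-trip sketch of the two helpers above, with made-up list contents (needs java.util.Arrays in addition to the List/ArrayList imports):

List<List<String>> lists = new ArrayList<List<String>>();
lists.add(new ArrayList<String>(Arrays.asList("firstelem", "anotherelem")));
lists.add(new ArrayList<String>(Arrays.asList("secondelem", "another ..")));
lists.add(new ArrayList<String>(Arrays.asList("thridelem", "lastelem")));

String flat = makeArrayListFlatfileString(lists);               // text you can pass to writeToFile(...)
List<List<String>> restored = getListOfListsFromFlatfile(flat); // the same three lists again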
For writing, just as Illegal Argument states, append '\n':
void writeToFileWithNewLine(String data) {
    try {
        // MODE_APPEND so a second call adds a new line instead of overwriting the file
        OutputStreamWriter outputStreamWriter =
                new OutputStreamWriter(openFileOutput("config.txt", Context.MODE_APPEND));
        outputStreamWriter.write(data + "\n");
        outputStreamWriter.close();
    } catch (IOException e) { /* handle exception */ }
}
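For example, to put each element of an ArrayList on its own line, call it once per element (a sketch; it relies on the append-mode version above):

// Each element ends up on its own line in config.txt.
ArrayList<String> items = new ArrayList<String>(
        Arrays.asList("firstelem", "secondelem", "thridelem"));
for (String item : items) {
    writeToFileWithNewLine(item);
}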
For reading (just the idea, in practice you should read the file only once):
String readLine(final int lineNo) {
    ArrayList<String> lines = new ArrayList<String>();
    try {
        // Read the same private file the write method above uses
        InputStream in = openFileInput("config.txt");
        try {
            BufferedReader reader = new BufferedReader(new InputStreamReader(in));
            String line;
            while ((line = reader.readLine()) != null) {
                lines.add(line);
            }
        } finally {
            in.close();
        }
    } catch (Exception e) { /* handle exceptions */ }
    if (lineNo >= 0 && lineNo < lines.size()) {
        return lines.get(lineNo);
    } else {
        throw new IndexOutOfBoundsException();
    }
}
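Usage sketch, assuming the file holds the example lines shown earlier in the question:

String third = readLine(2); // zero-based index, returns "thridelem"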
I have a problem with Java IO: in the code below, the variable count is -1 right away.
public void putFile(String name, InputStream is) {
    try {
        OutputStream output = new FileOutputStream("D:\\TEMP\\" + name);
        byte[] buf = new byte[1024];
        int count = is.read(buf);
        while (count > 0) {
            output.write(buf, 0, count);
            count = is.read(buf);
        }
    } catch (FileNotFoundException e) {
    } catch (IOException e) {
    }
}
But if I comment out the OutputStream, like this:
public void putFile(String name, InputStream is) {
    try {
        //OutputStream output = new FileOutputStream("D:\\TEMP\\" + name);
        byte[] buf = new byte[1024];
        int count = is.read(buf);
        while (count > 0) {
            //output.write(buf, 0, count);
            count = is.read(buf);
        }
    } catch (FileNotFoundException e) {
    } catch (IOException e) {
    }
}
then count returns the expected value (> -1).
How is this possible? Is it a bug?
I'm using Jetty in Eclipse with the Google plugins and Java 6.21 on Windows 7.
PS: I changed the original code, but it doesn't affect the question.
public class GenericWorldLoader implements WorldLoader {

    @Override
    public LoginResult checkLogin(PlayerDetails pd) {
        Player player = null;
        int code = 2;
        File f = new File("data/savedGames/" + NameUtils.formatNameForProtocol(pd.getName()) + ".dat.gz");
        if (f.exists()) {
            try {
                InputStream is = new GZIPInputStream(new FileInputStream(f));
                String name = Streams.readRS2String(is);
                String pass = Streams.readRS2String(is);
                if (!name.equals(NameUtils.formatName(pd.getName()))) {
                    code = 3;
                }
                if (!pass.equals(pd.getPassword())) {
                    code = 3;
                }
            } catch (IOException ex) {
                code = 11;
            }
        }
        if (code == 2) {
            player = new Player(pd);
        }
        return new LoginResult(code, player);
    }

    @Override
    public boolean savePlayer(Player player) {
        try {
            OutputStream os = new GZIPOutputStream(new FileOutputStream("data/savedGames/" + NameUtils.formatNameForProtocol(player.getName()) + ".dat.gz"));
            IoBuffer buf = IoBuffer.allocate(1024);
            buf.setAutoExpand(true);
            player.serialize(buf);
            buf.flip();
            byte[] data = new byte[buf.limit()];
            buf.get(data);
            os.write(data);
            os.flush();
            os.close();
            return true;
        } catch (IOException ex) {
            return false;
        }
    }

    @Override
    public boolean loadPlayer(Player player) {
        try {
            File f = new File("data/savedGames/" + NameUtils.formatNameForProtocol(player.getName()) + ".dat.gz");
            InputStream is = new GZIPInputStream(new FileInputStream(f));
            IoBuffer buf = IoBuffer.allocate(1024);
            buf.setAutoExpand(true);
            while (true) {
                byte[] temp = new byte[1024];
                int read = is.read(temp, 0, temp.length);
                if (read == -1) {
                    break;
                } else {
                    buf.put(temp, 0, read);
                }
            }
            buf.flip();
            player.deserialize(buf);
            return true;
        } catch (IOException ex) {
            return false;
        }
    }
}
Yeah so... my problem is that this seems to save 'something' in a really complex and hard-to-read (binary) way, and I'd rather have it as a .txt in an easily readable format. How do I convert it?
EDIT: I'm not using Apache Mina, so what should I replace
IoBuffer buf = IoBuffer.allocate(1024);
buf.setAutoExpand(true);
with?
checkLogin() obviously checks whether the specified login has matching saved data and whether the password is correct.
savePlayer() saves the player.
loadPlayer() loads it again.
The data format used is gzip (wiki), and the content is written as a stream of serialized data. If you want to make it more readable, you might want to override (or just use, if it is already suitable) the toString() method of the Player class and write player.toString() into a new text file using e.g. a BufferedWriter wrapped around a FileWriter:
String playerName = NameUtils.formatNameForProtocol(player.getName());
BufferedWriter writer = new BufferedWriter(new FileWriter(playerName + ".txt"));
writer.write(player.toString());
writer.close();
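If Player does not already have a readable toString(), a minimal sketch of such an override (the rights field here is purely hypothetical; use whatever state Player actually holds):

// Inside the Player class.
@Override
public String toString() {
    // getName() exists per the code above; "rights" is only an example field.
    return "Player{name=" + getName() + ", rights=" + rights + "}";
}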