File changes permission after writing a block of bytes - Java

I am running into an issue when trying to read and write to the same file using RandomAccessFile.
I am reading a block of 16 bytes from a file and writing it back to the same file at a given position (e.g. the 256th).
The problem is on the ra.write(b) line. When that line executes, the text editor Kate (I am using Manjaro Linux) shows the message:
The file /home/mite/IdeaProjects/IspitJuni2015/dat.txt was opened with UTF-8 encoding but contained invalid characters.
It is set to read-only mode, as saving might destroy its content.
Either reopen the file with the correct encoding chosen or enable the read-write mode again in the tools menu to be able to edit it.
and it switches the file to read-only mode.
I also tried to manually uncheck the read-only option in Kate, but that does not work either. What seems to be the problem?
public static byte[] read(long i) throws IOException {
    File in = new File("./dat.txt");
    RandomAccessFile ra = new RandomAccessFile(in, "rw");
    byte[] readObj = new byte[16];
    if (i > in.length() / 16) {
        return null;
    }
    ra.seek(i * 16);
    ra.read(readObj);
    ra.close();
    return readObj;
}
public static void write(long i, byte[] obj) throws IOException {
    File out = new File("./dat.txt");
    RandomAccessFile ra = new RandomAccessFile(out, "rw");
    if (!out.exists()) {
        out.createNewFile();
    }
    long size = out.length();
    if (i * 16 > size) {
        ra.seek(out.length());
        for (long j = size; j < i * 16; j += 16) {
            byte[] b = new byte[16];
            ra.write(b);
        }
    }
    ra.seek(i * 16);
    System.out.println(new String(obj));
    ra.write(obj);
    ra.close();
}
public static void main(String[] args) throws IOException {
    write(35, read(4));
}

I think you misunderstand what your editor is telling you.
First of all, not every possible sequence of bytes is a valid UTF-8 string; see for example "UTF-8 decoder capability and stress test". So when you copy 16 bytes from one place in a UTF-8 file to another, you may end up with a file that no longer contains valid UTF-8 text.
I suspect that you have the same file open in Kate to watch the results of your editing. What the editor is telling you is that it noticed the file is no longer valid UTF-8, so it does not know how to handle it correctly. To prevent accidental damage to your potentially precious data, which now looks like binary (not text) to the editor, it refuses to save anything from its UI back to that file. This does not change any permissions at the file-system level, and other (less careful) editors will probably not warn you about such possible corruption.
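By way of illustration (this is not from the original post, and the class and method names are made up), here is a minimal sketch of how you could check whether a block of bytes is valid UTF-8 on its own; a 16-byte slice that cuts a multi-byte character in half will fail the check:

import java.nio.ByteBuffer;
import java.nio.charset.CharacterCodingException;
import java.nio.charset.StandardCharsets;

public class Utf8Check {
    // Returns true only if the bytes decode as a complete, valid UTF-8 sequence.
    static boolean isValidUtf8(byte[] bytes) {
        try {
            StandardCharsets.UTF_8.newDecoder().decode(ByteBuffer.wrap(bytes));
            return true;
        } catch (CharacterCodingException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // First two bytes of the euro sign (E2 82 AC) with the third byte missing: invalid on its own.
        byte[] cut = {(byte) 0xE2, (byte) 0x82};
        System.out.println(isValidUtf8(cut)); // prints false
    }
}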

Thank you for your replies. I figured out the problem.
Sometimes text editors add one extra byte at the end of the file, which Java does not handle as an ordinary byte. Usually this is an EOF byte treated as UTF-8, while Java here only accepts reading/writing ASCII bytes, except when manipulating the file through the writeUTF() method.
This byte is also invisible in text editors, which is why I wrote this post.
It took me two days to find out what the issue was, so if someone gets stuck here, keep the EOF byte in mind.

Related

Cannot save UTF-8 file on Windows server with Java

I have a simple Java application that saves some Strings with UTF-8 encoding.
But when I open that file in Notepad and choose Save As, it shows the encoding as ANSI. I don't know where the problem is.
The code that saves the file is:
File fileDir = new File("c:\\Sample.txt");
Writer out = new BufferedWriter(new OutputStreamWriter(
        new FileOutputStream(fileDir), "UTF8"));
out.append("kodehelp UTF-8").append("\r\n");
out.append("??? UTF-8").append("\r\n");
out.append("???? UTF-8").append("\r\n");
out.flush();
out.close();
The characters you are writing to the file, as they appear in the code snippet, are in the basic ASCII subset of UTF-8. Notepad is likely auto-detecting the format and, seeing nothing outside the ASCII range, decides the file is ANSI.
If you want to force a different decision, include characters such as 字 or õ, which are well outside the ASCII range.
It is possible that the ??? strings in your example were intended to be UTF-8. If so, make sure your IDE and/or build tool recognizes the files as UTF-8 and that the files are indeed UTF-8 encoded. If you provide more information about your build system, we can help further.
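As a rough illustration of that advice (the file name is taken from the question, the strings are placeholders), one way to make the encoding unambiguous is to write at least one character outside the ASCII range, optionally preceded by a UTF-8 BOM so that simple editors such as Notepad detect it:

import java.io.*;
import java.nio.charset.StandardCharsets;

public class WriteUtf8Sample {
    public static void main(String[] args) throws IOException {
        File fileDir = new File("c:\\Sample.txt");
        try (Writer out = new BufferedWriter(new OutputStreamWriter(
                new FileOutputStream(fileDir), StandardCharsets.UTF_8))) {
            out.write('\uFEFF');                          // optional BOM, helps Notepad's detection
            out.append("kodehelp UTF-8").append("\r\n");
            out.append("字 õ UTF-8").append("\r\n");       // characters well outside ASCII
        }
    }
}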

How do I create an external .exe file using Java

I am storing all of the bytes of an external .exe file, and then re-writing them to another .exe file that I am currently creating with FileOutputStream/BufferedOutputStream.
The bytes are written fine, and the second program is created in the location of my choice, but when I come to run the file, it says it's not a valid .exe file or not a valid 32/64bit application.
I'm guessing because it's not packed and generated properly.
How would I make it so it's an executable file and works the same as the first one?
P.S. I can't just copy the file, because eventually I'm going to be encrypting the bytes and writing them to the file, but I still want it to be usable.
If all the bytes are identical it will run. If the original file runs and the copy doesn't then some of the bytes have to be different. It has nothing to do with packing.
Maybe you are using a byte variable to store the read data. Don't do that. Just use int. If you don't use byte variables correctly you can run into problems due to automatic sign extension.
This works fine for me
import java.io.*;

public class Test {
    public static void main(String[] args) throws Exception {
        BufferedInputStream in = new BufferedInputStream(new FileInputStream("a.exe"));
        BufferedOutputStream out = new BufferedOutputStream(new FileOutputStream("b.exe"));
        int data = in.read();          // read() returns an int: 0-255, or -1 at end of stream
        while (data >= 0) {
            out.write(data);
            data = in.read();
        }
        in.close();
        out.close();
    }
}

Error in sending image from Android to Java app serially - javax.imageio.IIOException: Bogus Huffman table definition

I need to send an image from an Android app to a Java app. Basically, I need a byte array from the image to send to an RF module, which transmits it. Another RF module receives it and passes the byte array to the Java app, which must reconstruct the image.
Android code:
FileInputStream fis = new FileInputStream(myFile);
byte[] b = new byte[(int) myFile.length()];
fis.read(b);
server.send(b);
Java code:
FileOutputStream fwrite = new FileOutputStream(new File("my_xml"), true);
fwrite.write(bb); // bb is one byte received from the RF link via an input stream; each byte is written to the file as soon as it arrives. This is necessary for other reasons.
fwrite.flush();
fwrite.close();
After getting full file:
FileInputStream fir = new FileInputStream("my_xml");
final BufferedImage bufferedImage = ImageIO.read(fir);
ImageIO.write(bufferedImage, "bmp", new File("image.bmp"));
fir.close();
I am getting error javax.imageio.IIOException: Bogus Huffman table definition
The RF link is working fine, because a text file is transferred perfectly. Please help. Even without ImageIO, the code does not produce a viewable image, even after changing the extension to .jpeg.
The error means that the image file can't be read because the format is wrong, i.e. some bytes are missing, corrupted, or out of position, so the file can't be decoded. My RF transfer does not have protocols like TCP/IP, so some bytes are lost to errors in the communication channel, hence the error.
You don't need to use ImageIO just to copy a file. Just read and write the bytes.
Your code has other problems:
You are assuming that read(byte[]) fills the buffer. It doesn't. Check the Javadoc.
You are also assuming that the file length fits into an int. If it does, fine. If it doesn't, you are hosed.
You appear to be opening and closing the FileOutputStream on every byte received. This could not be more inefficient. Open it once, write everything, close it.
flush() before close() is redundant.
You are storing the image in a file called 'my_xml'. This is only going to cause confusion, if it hasn't already.
You don't even need the file. Just load the image directly from the input stream (see the sketch below).
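A minimal sketch of both suggestions, assuming the received bytes arrive on an InputStream (the names rfIn and target are made up for the example):

import java.awt.image.BufferedImage;
import java.io.*;
import javax.imageio.ImageIO;

public class ReceiveImage {
    // Copy every byte from the RF input stream to a file, opening the output only once
    // and never assuming that a single read() fills the whole buffer.
    static void copyToFile(InputStream rfIn, File target) throws IOException {
        try (OutputStream out = new BufferedOutputStream(new FileOutputStream(target))) {
            byte[] buffer = new byte[8192];
            int n;
            while ((n = rfIn.read(buffer)) != -1) {
                out.write(buffer, 0, n);
            }
        }
    }

    // Or skip the intermediate file entirely and decode the image straight from the stream.
    static BufferedImage decodeDirectly(InputStream rfIn) throws IOException {
        return ImageIO.read(rfIn);
    }
}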

How can I read a Russian file in Java?

I tried specifying UTF-8 for this, but it didn't work. What should I do to read a Russian file in Java?
FileInputStream fstream1 = new FileInputStream("russian.txt");
DataInputStream in = new DataInputStream(fstream1);
BufferedReader br = new BufferedReader(new InputStreamReader(in,"UTF-8"));
If the file is from Windows PC, try either "windows-1251" or "Cp1251" for the charset name.
If the file is somehow in the MS-DOS encoding, try using "Cp866".
Both of these are single-byte encodings and changing the file type to UTF-8 (which is multibyte) does nothing.
If all else fails, use a hex editor and add a few hex lines of the file to your question. Then we can identify the encoding.
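A minimal sketch along those lines, assuming the file really is windows-1251 (the file name is the one from the question; swap in "Cp866" if the text still looks wrong):

import java.io.*;
import java.nio.charset.Charset;

public class ReadRussian {
    public static void main(String[] args) throws IOException {
        try (BufferedReader br = new BufferedReader(new InputStreamReader(
                new FileInputStream("russian.txt"), Charset.forName("windows-1251")))) {
            String line;
            while ((line = br.readLine()) != null) {
                System.out.println(line);
            }
        }
    }
}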
As others have mentioned, you need to know how the file is encoded. A simple check is to (ab)use Firefox as an encoding detector: answer to similar question
If this is a display problem, it depends what you mean by "reads": in the console, in some window? See also How can I make a String with cyrillic characters display correctly?

Carriage Return\Line feed in Java

I have created a text file in a Unix environment using Java code.
To write the text file I am using java.io.FileWriter and BufferedWriter, and for the newline after each row I am using the bw.newLine() method (where bw is a BufferedWriter object).
I'm sending that text file as a mail attachment from the Unix environment itself (automated using Unix commands).
My issue is that after I download the text file from the mail on a Windows system and open it, the data is not properly aligned; the newLine() character does not seem to take effect.
I want the text file to have the same alignment when opened in a Windows environment as it has in the Unix environment.
How do I resolve the problem?
Java code below for your reference (running in Unix environment):
File f = new File(strFileGenLoc);
BufferedWriter bw = new BufferedWriter(new FileWriter(f, false));
rs = stmt.executeQuery("select * from jpdata");
while (rs.next()) {
    bw.write(rs.getString(1) == null ? "" : rs.getString(1));
    bw.newLine();
}
Java only knows about the platform it is currently running on, so bw.newLine() can only give you the line ending of that platform. The fact that you open the file on a Windows system means that you either have to convert the file before using it (with something you have written, or with a program such as unix2dos), or you have to write Windows-style carriage returns into the file from your Java program in the first place. So if you know the file will always be opened on a Windows machine, you will have to output
bw.write(rs.getString(1)==null? "":rs.getString(1));
bw.write("\r\n");
It's worth noting that you aren't going to be able to output a plain-text file that looks correct on both platforms. You may want to consider using HTML if it is an email, or XML if it is data. Alternatively, you may need some kind of client that reads the data and then formats it for the platform the viewer is using.
The method newLine() ensures a platform-compatible new line is added (0Dh 0Ah for DOS, 0Dh for older Macs, 0Ah for Unix/Linux). Java has no way of knowing on which platform you are going to send the text. This conversion should be taken care of by the mail sending entities.
I don't know who looks at your file, but if you open it in WordPad instead of Notepad, the line breaks will show correctly. If you're using a special file extension, associate it with WordPad and you're done. Or use any other more advanced text editor.
bw.newLine() cannot ensure compatibility with all systems.
If you are sure it is going to be opened on Windows, you can format it with Windows newlines.
If you are already using native Unix commands, try unix2dos to convert the already generated file to Windows format and then send the mail.
If you are not using Unix commands and prefer to do it in Java, use bw.write("\r\n"), and if it does not complicate your program, have a method that finds out the operating system and writes the appropriate newline (see the sketch below).
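A minimal sketch of that last suggestion (the helper name and the file name are made up; it simply picks the separator for the system the file will be read on, not the one the program runs on):

import java.io.*;

public class LineEndings {
    // Hypothetical helper: choose the line separator for the target system.
    static String separatorFor(boolean targetIsWindows) {
        return targetIsWindows ? "\r\n" : "\n";
    }

    public static void main(String[] args) throws IOException {
        String sep = separatorFor(true); // the attachment will be opened on Windows
        try (BufferedWriter bw = new BufferedWriter(new FileWriter("report.txt"))) {
            bw.write("first row");
            bw.write(sep);
            bw.write("second row");
            bw.write(sep);
        }
    }
}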
If I understand you right, we are talking about a text file attachment.
That's unfortunate, because if it were the email's message body, you could always use "\r\n", referring to http://www.faqs.org/rfcs/rfc822.html
But as it's an attachment, you must live with system differences. If I were in your shoes, I would choose one of these options:
a) Only support Windows clients by using "\r\n" as the line end.
b) Provide two attachment files, one in Unix format and one in Windows format.
c) I don't know whether the attachment is to be read by people or machines, but if it is people, I would consider attaching an HTML file instead of plain text. More portable and much prettier, too :)
Encapsulate your writer to provide char replacement, like this:
import java.io.*;

public class WindowsFileWriter extends Writer {

    private Writer writer;

    public WindowsFileWriter(File file) throws IOException {
        try {
            writer = new OutputStreamWriter(new FileOutputStream(file), "ISO-8859-15");
        } catch (UnsupportedEncodingException e) {
            writer = new FileWriter(file);
        }
    }

    @Override
    public void write(char[] cbuf, int off, int len) throws IOException {
        // Replace every "\n" with the Windows line ending before passing it on.
        writer.write(new String(cbuf, off, len).replace("\n", "\r\n"));
    }

    @Override
    public void flush() throws IOException {
        writer.flush();
    }

    @Override
    public void close() throws IOException {
        writer.close();
    }
}
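A short usage sketch, assuming the class above (the file name is made up); every "\n" written through it ends up as "\r\n" in the file:

// (inside a method that declares throws IOException)
try (Writer w = new WindowsFileWriter(new File("report.txt"))) {
    w.write("first line\n");
    w.write("second line\n");
}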
