Writing to a file with a UTF-8 path in Java

I'm trying to encrypt some files in my program. It works fine, but when I try it on a file whose path contains non-ASCII (UTF-8) letters it fails with a FileNotFoundException. I then tried java.nio, hoping it would solve my issue, but it throws the same error :(
Code: fil is the file I want to encrypt, and I want my buffer size to be 50 MB (52428800 bytes). y is the number of times the buffer must be filled completely, and x is the size of the final, partial fill. For example, for a 101 MB file, y is 2 and x is 1 MB. The program reads data from a channel into a buffer, encrypts it with a function cr() that I haven't included here (that's not important; my issue is the file path), puts the result into a temp array, and writes it back to the file. When the path is ASCII it works; when it isn't, it doesn't.

If verifying the code is too much trouble, forget it; just tell me how to encrypt a file with a UTF-8 name like فایل.docx.
RandomAccessFile fil = new RandomAccessFile(sfil, "rw");
FileChannel inChannel = fil.getChannel();
ByteBuffer buf;
int x = (int) (fil.length() % (52428800));
int y = (int) (fil.length() / (52428800));
// -----------------------------------------
for (int q = 0; q <= y; q++) {
    if (q == y)
        buf = ByteBuffer.allocate(x);
    else
        buf = ByteBuffer.allocate(52428800);
    inChannel.read(buf);
    buf.flip();
    byte[] temp = new byte[buf.limit()];
    for (int i = 0; i < buf.capacity(); i++) {
        temp[i] = cr(buf.get(), key);
    }
    buf.flip();
    buf.put(temp);
    while (buf.hasRemaining()) {
        inChannel.write(buf);
    }
    buf.clear();
}
inChannel.close();
fil.close();
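Since both java.io and java.nio report the file as missing, a reasonable first check is whether the path String really contains the characters it appears to; it may have been decoded with the wrong charset somewhere upstream. A small diagnostic sketch, assuming sfil is the path String used above:

File target = new File(sfil);
System.out.println("exists() says: " + target.exists());
for (char c : target.getName().toCharArray()) {
    // Print each character and its code point; mojibake shows up here immediately.
    System.out.printf("%c U+%04X%n", c, (int) c);
}
File parent = target.getParentFile();
if (parent != null && parent.listFiles() != null) {
    for (File f : parent.listFiles()) {
        // Compare against what the file system actually reports.
        System.out.println(f.getName() + " -> equals: " + f.getName().equals(target.getName()));
    }
}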

Related

Retrieving data from byte Array

I'm trying to implement the tail program and want to print the last n bytes of a file. I've used a RandomAccessFile variable to store the data from the text file. When I try to retrieve the data and print it to the console, I'm getting something like this:
-n1
65109710979710979710810979710810510510979710910659711010510979711410011897114107109797114100119111108102106597114111110
How does one properly retrieve the data from the byte array?
This is my code:
RandomAccessFile raf = new RandomAccessFile(file, "r");
byte[] b = new byte[n];
raf.readFully(b, 0, n);
for (int i = 0; i < n; i++) {
    System.out.print(b[i]);
}
You are printing the raw byte values. To convert an array of bytes to a String that you can print with System.out.println, try the following:
System.out.println(new String(b));
If you wish to convert each byte (as in your loop) to a printable char, you can do the following conversion:
for (int i = 0; i < n; i++) {
    char c = (char) (b[i] & 0xFF);
    System.out.print(c);
}
A byte is simply one byte in size (8 bits) whereas a char in Java is 16 bits (2 bytes). Therefore the printed bytes do not make sense on their own; a single byte is not necessarily an entire character.
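One caveat: new String(b) uses the platform default charset. If the file's encoding is known, it is safer to name it explicitly, for example:

// Assuming the file's text is UTF-8; adjust the charset to match the actual file.
System.out.println(new String(b, java.nio.charset.StandardCharsets.UTF_8));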

When using RandomAccessFile, does the file pointer update on its own after a read?

I am trying to read and parse a few pieces of information from a file, stored in sequential order: a char[] of size 8, an int, an int[] of size 8 and, finally, an int.
So I am reading 56 bytes of information. I am using RandomAccessFile and was wondering if I need to seek() after performing each readChar() and readInt() operation, or if I can just call these methods one after the other. I guess this is more of a question about whether the file pointer resets after each operation completes, or whether it is safe to assume that the fp simply continues from its last location until the file is closed.
Here is what I've written:
int currentOffset = 128;
for (int i = 0; i < 16; i++) {
    // Initialize nodes.
    readDisk.seek(currentOffset);
    char[] name = new char[8];
    for (int j = 0; j < 8; j++) {
        name[j] = readDisk.readChar();
    }
    int size = readDisk.readInt();
    int[] blockPointers = new int[8];
    for (int j = 0; j < 8; j++) {
        blockPointers[j] = readDisk.readInt();
    }
    int used = readDisk.readInt();
}
Will the fp be at 184 (128 + 56) after these operations? Thank you! Sorry if this is a silly question.
Yes, each read advances the file pointer by the number of bytes read. Try this:
RandomAccessFile f = new RandomAccessFile("1.txt", "r");
f.readChar();
System.out.println(f.getFilePointer());
output
2
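So for the record layout in the question, a quick check along these lines (hypothetical file name, reusing the question's field sizes and starting offset) should show the pointer ending up exactly 56 bytes past where the record started, with no extra seek() calls needed:

RandomAccessFile readDisk = new RandomAccessFile("disk.bin", "r");  // hypothetical file
long start = 128;
readDisk.seek(start);
for (int j = 0; j < 8; j++) readDisk.readChar();   // 8 chars  = 16 bytes
readDisk.readInt();                                 // size     =  4 bytes
for (int j = 0; j < 8; j++) readDisk.readInt();     // pointers = 32 bytes
readDisk.readInt();                                 // used     =  4 bytes
System.out.println(readDisk.getFilePointer() - start);  // prints 56
readDisk.close();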

Read 32-bit binary numbers in Java

I am trying to read five 32-bit binary numbers and print them as int. Here is my code:
FileInputStream fin = new FileInputStream(file);
int count = 5;
for (int i = 0; i < count; i++) {
    byte[] input = file.getBytes();
    String bin = Integer.toBinaryString(0xFF & input[i] | 0x100).substring(1);
    System.out.println(bin);
}
I am getting this:
01010011
01101110
00110011
01011111
01010010
What am I doing wrong? thanks
You're not actually reading from the file, but printing the binary representation of the first five characters of the name of the file. Use fin.read() to read bytes from the file.
You can also use DataInputStream to read 32-bit big-endian integers directly, instead of reading them as 4 individual bytes.
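A minimal sketch of that first suggestion, with the same file variable as the question and still printing one byte per line; fin.read() returns one byte as an int in 0-255, or -1 at end of file:

try (FileInputStream fin = new FileInputStream(file)) {
    int count = 5;
    for (int i = 0; i < count; i++) {
        int b = fin.read();                 // a byte actually read from the file
        if (b == -1) break;                 // end of file reached early
        String bin = Integer.toBinaryString(0xFF & b | 0x100).substring(1);
        System.out.println(bin);
    }
}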
If you need to read five big-endian 32-bit integers, then I suggest that you use DataInputStream, e.g.
final int count = 5;
try (DataInputStream dis = new DataInputStream(new FileInputStream(file))) {
    for (int i = 0; i < count; i++) {
        int value = dis.readInt();
        System.out.println(Integer.toBinaryString(value));
    }
}

How to convert image to Boolean Array in Java (Android)?

I'm trying to store a 32 x 32 Boolean array in a 32 x 32 black and white image (either bitmap or PNG), to then be mapped to a Boolean[32][32] array with black pixels being true and white being false.
This is to store frames of animation to display on a virtual 32 x 32 display. Here's what I have so far below.
Bitmap bmp = BitmapFactory.decodeResource(context.getResources(), R.raw.f1);
ByteArrayOutputStream o_stream = new ByteArrayOutputStream();
bmp.compress(Bitmap.CompressFormat.PNG, 100, o_stream);
byte[] byteArray = o_stream.toByteArray();
What do I do with byteArray to make it a Boolean[32][32] array or am I going about this all wrong in the first place?
I've never done anything with images (so I don't know if this is anything close to what one should do to get a black-and-white version of a picture), but I suppose you need a rule to decide whether a pixel is closer to black or closer to white. I'm also curious: how can a single byte represent a color? Even if it's RGB, you need at least three bytes, don't you?
if (src != null) {
    ByteArrayOutputStream os = new ByteArrayOutputStream();
    src.compress(android.graphics.Bitmap.CompressFormat.PNG, 100, (OutputStream) os);
    byte[] byteArray = os.toByteArray();
    //Log.d("byte=", "" + byteArray.length); // returns length
    //str = Base64.encodeToString(byteArray, Base64.DEFAULT); // returns string
}
where src is the bitmap.
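A sketch of such a black-or-white rule using Bitmap.getPixel, assuming bmp is the decoded 32 x 32 bitmap from the question, using a primitive boolean[][] and treating anything darker than mid-gray as true:

boolean[][] pixels = new boolean[32][32];
for (int y = 0; y < 32; y++) {
    for (int x = 0; x < 32; x++) {
        int argb = bmp.getPixel(x, y);            // ARGB color of one pixel
        int r = android.graphics.Color.red(argb);
        int g = android.graphics.Color.green(argb);
        int b = android.graphics.Color.blue(argb);
        int brightness = (r + g + b) / 3;         // 0 = black, 255 = white
        pixels[y][x] = brightness < 128;          // darker than mid-gray -> true
    }
}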
If you just want to encode an array of Booleans compactly to save storage space, why use an image at all? That's a lot of extra overhead. Why not just pack the bits yourself, like this:
Boolean[][] booleanArray = ... // this is your Boolean array
int[] bits = new int[32]; // One int holds 32 bits
for (int i = 0; i < 32; i++) {
    for (int j = 0; j < 32; j++) {
        if (booleanArray[i][j]) {
            // Set the bit at the corresponding position in the bits array
            bits[i] |= 1 << j;
        }
    }
}
// Now you have the data in an int array which you can write to a file
// using DataOutputStream. The file would contain 128 bytes.

// To recreate the Boolean[32][32] from the int array, do this:
Boolean[][] booleanArray = new Boolean[32][32];
int[] bits = ... // This is the data you read from the file using DataInputStream
for (int i = 0; i < 32; i++) {
    for (int j = 0; j < 32; j++) {
        // Read the corresponding bit; false entries are set explicitly too,
        // so no element is left null
        booleanArray[i][j] = (bits[i] & (1 << j)) != 0;
    }
}
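The write/read step the comments refer to could look roughly like this (hypothetical file name, reusing the int[] bits from above):

// Writing the 32 ints (128 bytes) out:
try (DataOutputStream out = new DataOutputStream(new FileOutputStream("frame1.bin"))) {
    for (int word : bits) {
        out.writeInt(word);
    }
}

// Reading them back:
int[] bits = new int[32];
try (DataInputStream in = new DataInputStream(new FileInputStream("frame1.bin"))) {
    for (int i = 0; i < 32; i++) {
        bits[i] = in.readInt();
    }
}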

Fast reading of little endian integers from file

I need to read a binary file consisting of 4 byte integers (little endian) into a 2D array for my Android application. My current solution is the following:
DataInputStream inp = null;
try {
    inp = new DataInputStream(new BufferedInputStream(new FileInputStream(procData), 32768));
} catch (FileNotFoundException e) {
    Log.e(TAG, "File not found");
}
int[][] test_data = new int[SIZE_X][SIZE_Y];
byte[] buffer = new byte[4];
ByteBuffer byteBuffer = ByteBuffer.allocate(4);
for (int i = 0; i < SIZE_Y; i++) {
    for (int j = 0; j < SIZE_X; j++) {
        inp.read(buffer);
        byteBuffer = ByteBuffer.wrap(buffer);
        test_data[j][SIZE_Y - i - 1] = byteBuffer.order(ByteOrder.LITTLE_ENDIAN).getInt();
    }
}
This is pretty slow for a 2k*2k array; it takes about 25 seconds. I can see in the DDMS that the garbage collector is working overtime, so that is probably one reason for the slowness.
There has to be a more efficient way of using the ByteBuffer to read that file into the array, but I'm not seeing it at the moment. Any idea on how to speed this up?
Why not read into a 4-byte buffer and then rearrange the bytes manually? It will look like this:
for (int i = 0; i < SIZE_Y; i++) {
    for (int j = 0; j < SIZE_X; j++) {
        inp.read(buffer);
        int nextInt = (buffer[0] & 0xFF) | (buffer[1] & 0xFF) << 8 | (buffer[2] & 0xFF) << 16 | (buffer[3] & 0xFF) << 24;
        test_data[j][SIZE_Y - i - 1] = nextInt;
    }
}
Of course, this assumes that read fills all four bytes; you should check for the case where it does not. This way you won't create any objects during reading (so no strain on the garbage collector) and you don't call any conversion methods, just bitwise operations.
If you are on a platform that supports memory-mapped files, consider the MappedByteBuffer and friends from java.nio:
FileChannel channel = new RandomAccessFile(procData, "r").getChannel();
MappedByteBuffer map = channel.map(FileChannel.MapMode.READ_ONLY, 0, 4 * SIZE_X * SIZE_Y);
map.order(ByteOrder.LITTLE_ENDIAN);
IntBuffer buffer = map.asIntBuffer();
int[][] test_data = new int[SIZE_X][SIZE_Y];
for (int i = 0; i < SIZE_Y; i++) {
    for (int j = 0; j < SIZE_X; j++) {
        test_data[j][SIZE_Y - i - 1] = buffer.get();
    }
}
If you need cross-platform support or your platform lacks memory-mapped buffers, you may still want to avoid performing the conversions yourself by using an IntBuffer. Consider dropping the BufferedInputStream, allocating a larger ByteBuffer yourself, and obtaining a little-endian IntBuffer view on the data. Then, in a loop, reset the buffer positions to 0, use DataInputStream.readFully to read a large region at once into the ByteBuffer, and pull int values out of the IntBuffer, as sketched below.
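A rough sketch of that approach, reusing SIZE_X, SIZE_Y and procData from the question; the chunk size of one row per readFully is illustrative only:

// Read the file one row at a time and view each chunk as little-endian ints.
DataInputStream in = new DataInputStream(new FileInputStream(procData));
int[][] test_data = new int[SIZE_X][SIZE_Y];
ByteBuffer byteBuffer = ByteBuffer.allocate(4 * SIZE_X);   // one row per chunk
byteBuffer.order(ByteOrder.LITTLE_ENDIAN);
IntBuffer intBuffer = byteBuffer.asIntBuffer();            // int view over the same bytes
for (int i = 0; i < SIZE_Y; i++) {
    in.readFully(byteBuffer.array());  // fill the whole chunk or throw EOFException
    intBuffer.rewind();                // start pulling ints from the chunk's beginning
    for (int j = 0; j < SIZE_X; j++) {
        test_data[j][SIZE_Y - i - 1] = intBuffer.get();
    }
}
in.close();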
First of all, your 'inp.read(buffer)' is unsafe, as the read contract does not guarantee that it will read all 4 bytes.
That aside, for a quick transformation use the algorithm from DataInputStream.readInt.
I've adapted it for your case of a 4-byte array:
int little2big(byte[] b) {
    return ((b[3] & 0xff) << 24) + ((b[2] & 0xff) << 16) + ((b[1] & 0xff) << 8) + (b[0] & 0xff);
}
I don't think it is necessary to reinvent the wheel and perform the byte reordering for endianness again. This is error prone and there is a reason a class like ByteBuffer exists.
Your code can be optimized in the sense that it wastes objects. When a byte[] is wrapped by a ByteBuffer, the buffer adds a view, but the original array remains the same. It does not matter whether the original array is modified/read from directly or the ByteBuffer instance is used.
Therefore, you only need to initialize one instance of ByteBuffer and also have to set the ByteOrder once.
To start over, just use rewind() to set the position back to the beginning of the buffer.
I have taken your code and modified it as described. Be aware that it does not check for errors if there are not enough bytes left in the input. I would suggest using inp.readFully, as this will throw an EOFException if not enough bytes are found to fill the buffer.
int[][] test_data = new int[SIZE_X][SIZE_Y];
ByteBuffer byteBuffer = ByteBuffer.wrap(new byte[4]).order(ByteOrder.LITTLE_ENDIAN);
for (int i = 0; i < SIZE_Y; i++) {
    for (int j = 0; j < SIZE_X; j++) {
        inp.read(byteBuffer.array());
        byteBuffer.rewind();
        test_data[j][SIZE_Y - i - 1] = byteBuffer.getInt();
    }
}
