There's a method that encodes a file:
public static void cipher(File input, int[] key, File output) throws IOException {
    ByteBuffer buffer = ByteBuffer.allocate(10);
    FileInputStream fin = new FileInputStream(input);
    FileChannel fcin = fin.getChannel();
    FileOutputStream fout = new FileOutputStream(output);
    FileChannel fcout = fout.getChannel();
    ByteBuffer temp;
    while (true) {
        buffer.clear();
        int r = fcin.read(buffer);
        if (r == -1) {
            break;
        }
        //cb = StandardCharsets.UTF_8.decode(cipherBuffer(buffer, key));
        //System.out.println(cb.toString());
        temp = cipherBuffer(buffer, key);
        for (int i = 0; i < 10; i++) {
            System.out.print((char) temp.get(i));
        }
        temp.flip();
        fcout.write(temp);
    }
}
The cipherBuffer() method changes the initial buffer according to a given key.
I tried using the commented-out code, but it didn't help. For now it doesn't even write to the output file.
public static ByteBuffer cipherBuffer(ByteBuffer initialBuffer, int[] key) {
    ByteBuffer result = ByteBuffer.allocate(10);
    for (int i = 0; i < 10; i++) {
        result.put(i, initialBuffer.get(key[i]));
        //System.out.println(result.get(i));
    }
    return result;
}
This is how the output looks at this stage:
ᄚᄎᄋᄏᄚᄇタ ᄒᄈムᄒᄡ ᄉテᄒᄏ
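The empty output comes from the absolute put(i, ...) calls in cipherBuffer(): they never advance the buffer's position, so temp.flip() sets the limit to 0 and fcout.write(temp) has nothing to write. A minimal sketch of the write loop with the bounds set explicitly (assuming the same 10-byte block size and the existing cipherBuffer()) could look like this:

public static void cipher(File input, int[] key, File output) throws IOException {
    ByteBuffer buffer = ByteBuffer.allocate(10);
    try (FileChannel fcin = new FileInputStream(input).getChannel();
         FileChannel fcout = new FileOutputStream(output).getChannel()) {
        while (true) {
            buffer.clear();
            int r = fcin.read(buffer);
            if (r == -1) {
                break;
            }
            ByteBuffer temp = cipherBuffer(buffer, key);
            // The absolute puts leave temp's position at 0, so set the
            // bounds by hand instead of relying on flip().
            temp.position(0);
            temp.limit(r);      // write only as many bytes as were read
            fcout.write(temp);
        }
    }
}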
This is the function causing the problem: the program won't suspend at any breakpoint, so I'm not clear about what happens inside this function.
public static short[] readHgtFile2short(File file) throws IOException {
    BufferedInputStream fis = new BufferedInputStream(new FileInputStream(file));
    byte[] bytes = new byte[1024];
    byte[] bytes_all = new byte[SAMPLES * SAMPLES * 2];
    int length = 0;
    int i = 0;
    while ((length = fis.read(bytes)) != -1) {
        for (int j = 0; j < length; j++)
            bytes_all[i + j] = bytes[j];
        i += length;
    }
    fis.close();
    short[] st = new short[SAMPLES * SAMPLES];
    for (int k = 0; k < SAMPLES * SAMPLES; k++) {
        byte[] t = new byte[2];
        t[0] = bytes_all[2 * k];
        t[0] = bytes_all[2 * k + 1];
        st[k] = bytes2Short(t);
    }
    return st;
}
I tried to resolve this problem by deleting some code. I found that if the first line, "BufferedInputStream fis = new BufferedInputStream(new FileInputStream(file));", is kept, the program still gets stuck; it only works if I keep just the last two lines:
public static short[] readHgtFile2short(File file) throws IOException {
    short[] st = new short[SAMPLES * SAMPLES];
    return st;
}
I think the problem is caused by BufferedInputStream, but I cannot find similar questions on Stack Overflow or other websites.
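As an aside, the conversion loop assigns bytes_all[2*k] and bytes_all[2*k+1] to the same t[0], so one byte of every sample is lost. A minimal sketch that reads the samples directly with DataInputStream (assuming SAMPLES is defined and the file holds big-endian 16-bit values, the usual .hgt layout) avoids both the manual copy and that slip:

public static short[] readHgtFile2short(File file) throws IOException {
    short[] st = new short[SAMPLES * SAMPLES];
    try (DataInputStream in = new DataInputStream(
            new BufferedInputStream(new FileInputStream(file)))) {
        for (int k = 0; k < st.length; k++) {
            st[k] = in.readShort();   // two bytes per sample, big-endian
        }
    }
    return st;
}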
I want to use Deflater and Inflater (NOT DeflaterOutputStream and InflaterInputStream) to compress files. The problem is that the deflater stops working after the mentioned buffer size, which in this case is 1024. I am using the following code:
public class CompressionUtils {
    static String deflateInput = "pic.jpg";
    static String deflateOutput = "picDeflate.raw";
    static String inflateOutput = "picInflate.jpg";

    public static void compress() throws IOException {
        Deflater deflater = new Deflater();
        byte[] data = new byte[1024];
        FileInputStream in = new FileInputStream(new File(deflateInput));
        FileOutputStream out = new FileOutputStream(new File(deflateOutput));
        long readBytes = 0;
        while ((readBytes = in.read(data, 0, 1024)) != -1) {
            deflater.setInput(data);
            deflater.finish();
            byte[] buffer = new byte[1024];
            while (!deflater.finished()) {
                int count = deflater.deflate(buffer); // returns the generated code... index
                out.write(buffer, 0, count);
            }
        }
    }

    public static void decompress() throws IOException, DataFormatException {
        Inflater inflater = new Inflater();
        byte[] data = new byte[1024];
        FileInputStream in = new FileInputStream(new File(deflateOutput));
        FileOutputStream out = new FileOutputStream(new File(inflateOutput));
        long readBytesCount = 0;
        long readCompressedBytesCount = 0;
        long readBytes = 0;
        while ((readBytes = in.read(data, 0, 1024)) != -1) {
            readBytesCount = readBytesCount + readBytes;
            inflater.setInput(data);
            byte[] buffer = new byte[1024];
            while (!inflater.finished()) {
                int count = inflater.inflate(buffer);
                System.out.println("Remaining: " + inflater.getRemaining());
                out.write(buffer, 0, count);
            }
        }
        System.out.println("readBytesCount: " + readBytesCount);
    }

    public static void main(String[] args) {
        System.out.println("Operation started");
        try {
            compress();
        } catch (Exception e) {
            e.printStackTrace();
        }
        System.out.println("Operation ended");
    }
}
And this is the output of dir (on Windows):
01-04-2018 16:52 220,173 pic.jpg
28-04-2018 20:50 943 picDeflate.raw
28-04-2018 20:28 1,024 picInflate.jpg
Why does the compress code stop after reading 1024 bytes?
finish() is only for when you're finished. It should be the last thing called, after you have provided all of the input data to the Deflater.
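Calling finish() inside the read loop tells the Deflater that the first 1024 bytes are the entire input, so every later setInput() is ignored. A minimal sketch of compress() with finish() moved after the loop (and setInput() limited to the bytes actually read) might look like this:

public static void compress() throws IOException {
    Deflater deflater = new Deflater();
    byte[] data = new byte[1024];
    byte[] buffer = new byte[1024];
    try (FileInputStream in = new FileInputStream(deflateInput);
         FileOutputStream out = new FileOutputStream(deflateOutput)) {
        int readBytes;
        while ((readBytes = in.read(data, 0, 1024)) != -1) {
            deflater.setInput(data, 0, readBytes);    // only the bytes actually read
            while (!deflater.needsInput()) {          // drain whatever is ready so far
                int count = deflater.deflate(buffer);
                out.write(buffer, 0, count);
            }
        }
        deflater.finish();                            // signal end of input exactly once
        while (!deflater.finished()) {
            int count = deflater.deflate(buffer);
            out.write(buffer, 0, count);
        }
        deflater.end();
    }
}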
I have saved binary data with a FileOutputStream, but when I check the length of the data before and after, I find that it changes from 72 to 106.
This is my method:
inputStream = new FileInputStream(certificate_file);
/* certificate_file is the path to a binary file */
pubkey = readFromStream(inputStream, 0, 71);
System.out.println("length of pubkey: " + pubkey.length());
/* This prints: length of pubkey: 72 */
writeToStream(path + "pubkey.bin", pubkey);
inputStream = new FileInputStream(path + "pubkey.bin");
pubkey = readFromStream(inputStream);
System.out.println("length of pubkey: " + pubkey.length());
/* This prints: length of pubkey: 106 */
The writeToStream method writes data to an output stream:
public void writeToStream(String path, String data)
        throws FileNotFoundException {
    OutputStream os = new FileOutputStream(path);
    PrintStream printStream = new PrintStream(os);
    printStream.print(data);
}
The readFromStream methods read data from a stream:
public static String readFromStream(InputStream inputStream, int begin, int end) throws Exception {
    int i = 0;
    int data = inputStream.read();
    String out = "";
    while (data != -1) {
        if (i >= begin && i <= end) {
            out += (char) data;
        }
        data = inputStream.read();
        i++;
    }
    return out;
}

public static String readFromStream(InputStream inputStream) throws Exception {
    int i = 0;
    int data = inputStream.read();
    String out = "";
    while (data != -1) {
        out += (char) data;
        data = inputStream.read();
        i++;
    }
    return out;
}
Why do I have this problem?
I have solved the problem: I changed the data from String to byte[] and changed the read in readFromStream to readAllBytes.
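That matches the symptom: readFromStream turns every raw byte into a char, and PrintStream then re-encodes those chars in the platform charset, so each byte above 0x7F can grow to two or more bytes on the way back out (72 bytes in, 106 bytes out). A minimal sketch of the byte-based version (assuming Java 9+ for InputStream.readAllBytes and java.util.Arrays for the slice) could look like this:

public static byte[] readFromStream(InputStream inputStream, int begin, int end) throws IOException {
    byte[] all = inputStream.readAllBytes();         // Java 9+
    return Arrays.copyOfRange(all, begin, end + 1);  // end is inclusive, as in the original
}

public void writeToStream(String path, byte[] data) throws IOException {
    try (OutputStream os = new FileOutputStream(path)) {
        os.write(data);                              // raw bytes, no charset conversion
    }
}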
I'm writing a Vigenere cipher. It works with normal characters, but when I use extra characters on a Mac keyboard (typing Option+C, for example) it breaks. Is it because they're outside the char range?
Output using read(), byte by byte:
hello testing long output!##)!(!*!(#()asdfasdfljkasdfjË©âå¬ÃËââÃ¥ËÃâçËâËøÅËèâÏåøÃ
Output using read(byte[]):
hello testing long output!##)!(!*!(#()asdfasdfljkasdfjᅨルᅡᄅ¬ネニᅢᆬᅡᆲᅢ゚ᅨレ¬ネツ¬ネニᅢᆬᅨルᅢ゚¬ネニᅢ뎨ニ¬ネムᅨニᅢ쟤モᅨニᅢ゚ᅡᄄ¬ネツᅬタᅢᆬᅢ재゚
Code:
import java.io.*;

class VigenereFilterInputStream extends FilterInputStream {
    private final byte[] key;
    private int index = 0;

    VigenereFilterInputStream(InputStream in, byte[] k) {
        super(in);
        key = k.clone();
    }

    public int read() throws IOException {
        int c = super.read();
        if (c == -1)
            return -1;
        int out = c ^ key[index];
        index++;
        index %= key.length;
        return out;
    }

    public int read(byte[] b) throws IOException {
        int result = in.read(b);
        for (int i = 0; i < b.length; i++) {
            b[i] = (byte) (b[i] ^ key[i % key.length]);
        }
        return result;
    }
}

class VigenereFilterOutputStream extends FilterOutputStream {
    private final byte[] key;

    VigenereFilterOutputStream(OutputStream out, byte[] k) {
        super(out);
        key = k.clone();
    }

    public void write(byte[] b) throws IOException {
        byte[] out = new byte[b.length];
        for (int i = 0; i < b.length; i++) {
            out[i] = (byte) (b[i] ^ key[i % key.length]);
        }
        super.write(out);
    }
}

class Vigenere {
    public static void main(String[] args) throws Exception {
        if (args.length != 1) {
            throw new Exception("Missing filename");
        }
        File f = new File(args[0]);
        byte[] text = "hello testing long output!##)!(!*!(#()asdfasdfljkasdfj˙©∆å¬ß˚∂∆å˙ß∆çˆ∑ˆøœˆß¨∂πåøß".getBytes();
        byte[] key = "hello".getBytes();
        FileOutputStream os = new FileOutputStream(f);
        VigenereFilterOutputStream encrypt = new VigenereFilterOutputStream(os, key);
        encrypt.write(text);
        FileInputStream is = new FileInputStream(f);
        BufferedInputStream bis = new BufferedInputStream(is);
        VigenereFilterInputStream decrypt = new VigenereFilterInputStream(bis, key);
        bis.mark(text.length);
        int c;
        while ((c = decrypt.read()) != -1) {
            System.out.print((char) c);
        }
        System.out.println();
        bis.reset();
        byte[] b = new byte[text.length];
        decrypt.read(b);
        for (byte d : b) {
            System.out.print((char) d);
        }
        System.out.println();
    }
}
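The XOR itself round-trips the bytes correctly; what breaks is the printing. The Mac characters are encoded as multi-byte UTF-8 sequences by getBytes(), and casting each decrypted byte to char prints every byte of those sequences as a separate character. A minimal sketch that collects the decrypted bytes and decodes them once (assuming UTF-8, ideally with getBytes(StandardCharsets.UTF_8) on the way in) could look like this:

// Replacement for the byte-by-byte printing loop in main().
ByteArrayOutputStream decoded = new ByteArrayOutputStream();
int c;
while ((c = decrypt.read()) != -1) {
    decoded.write(c);
}
System.out.println(new String(decoded.toByteArray(), java.nio.charset.StandardCharsets.UTF_8));

The same applies to the read(byte[]) branch; note also that a single decrypt.read(b) call is not guaranteed to fill the whole array.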
Here, I am reading an 18 MB file and storing it in a two-dimensional array, but this program takes almost 15 minutes to run. Is there any way to optimize the running time? The file contains only binary values. Thanks in advance…
public class test {
    public static void main(String[] args) throws FileNotFoundException, IOException {
        BufferedReader br;
        FileReader fr = null;
        int m = 2160;
        int n = 4320;
        int[][] lof = new int[n][m];
        String filename = "D:/New Folder/ETOPOCHAR";
        try {
            Scanner input = new Scanner(new File("D:/New Folder/ETOPOCHAR"));
            double range_km = 1.0;
            double alonn = -57.07; // 180 to 180
            double alat = 38.53;
            while (input.hasNextLine()) {
                for (int i = 0; i < m; i++) {
                    for (int j = 0; j < n; j++) {
                        try {
                            lof[j][i] = input.nextInt();
                            System.out.println("value[" + j + "][" + i + "] = " + lof[j][i]);
                        } catch (java.util.NoSuchElementException e) {
                            // e.printStackTrace();
                        }
                    }
                } // print the input matrix
            }
I have also tried with a byte array, but I cannot save it into a 2D array...
public class FileToArrayOfBytes {
    public static void main(String[] args) {
        FileInputStream fileInputStream = null;
        File file = new File("name of file");
        byte[] bFile = new byte[(int) file.length()];
        try {
            // convert file into array of bytes
            fileInputStream = new FileInputStream(file);
            fileInputStream.read(bFile);
            fileInputStream.close();
            for (int i = 0; i < bFile.length; i++) {
                System.out.print((char) bFile[i]);
            }
            System.out.println("Done");
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
You can read the file into a byte array first, then deserialize those bytes. Start with a 2048-byte input buffer, then experiment by increasing or decreasing its size, but the buffer sizes you try should be powers of two (512, 1024, 2048, etc.).
As far as I remember, there is a good chance that the best performance is achieved with a 2048-byte buffer, but this is OS-dependent and should be verified.
Code sample (here you can try different values of the BUFFER_SIZE variable; in my case I read a 7.5 MB test file in less than one second):
public static void main(String... args) throws IOException {
    File f = new File(args[0]);
    byte[] buffer = new byte[BUFFER_SIZE];
    ByteBuffer result = ByteBuffer.allocateDirect((int) f.length());
    try (FileInputStream fis = new FileInputStream(f)) {
        int bytesRead;
        int totalBytesRead = 0;
        while ((bytesRead = fis.read(buffer, 0, BUFFER_SIZE)) != -1) {
            result.put(buffer, 0, bytesRead);
            totalBytesRead += bytesRead;
        }
        // debug info
        System.out.printf("Read %d bytes\n", totalBytesRead);
        // Here you can do whatever you want with the result, including creation of a 2D array...
        int pos = result.position();
        result.rewind();
        for (int i = 0; i < pos / 4; i++) {
            System.out.println(result.getInt());
        }
    }
}
Take your time and read the docs for the java.io and java.nio packages, as well as the Scanner class, to improve your understanding.
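If the goal is the 2160x4320 grid from the question, note that 18 MB spread over 2160*4320 cells works out to roughly two bytes per sample, not four. Under that assumption (big-endian 16-bit samples in row-major order, which should be verified against the actual ETOPO file layout), the 2D array could be filled from result like this:

// Sketch only: assumes 2-byte big-endian samples in row-major order.
int m = 2160, n = 4320;
int[][] lof = new int[n][m];
result.rewind();
for (int i = 0; i < m; i++) {
    for (int j = 0; j < n; j++) {
        lof[j][i] = result.getShort();
    }
}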