I'm writing an application that signs and envelopes data using BouncyCastle.
I need to sign large files, so instead of using CMSSignedDataGenerator (which works just fine for small files) I chose CMSSignedDataStreamGenerator. The signed files are generated, but the SHA-1 hash does not match that of the original file. Could you help me?
Here's the code:
try {
int buff = 16384;
byte[] buffer = new byte[buff];
int unitsize = 0;
long read = 0;
long offset = file.length();
FileInputStream is = new FileInputStream(file);
FileOutputStream bOut = new FileOutputStream("teste.p7s");
Certificate cert = keyStore.getCertificate(alias);
PrivateKey key = (PrivateKey) keyStore.getKey(alias, null);
Certificate[] chain = keyStore.getCertificateChain(alias);
CertStore certStore = CertStore.getInstance("Collection",new CollectionCertStoreParameters(Arrays.asList(chain)));
CMSSignedDataStreamGenerator gen = new CMSSignedDataStreamGenerator();
gen.addSigner(key, (X509Certificate) cert, CMSSignedDataGenerator.DIGEST_SHA1, "SunPKCS11-iKey2032");
gen.addCertificatesAndCRLs(certStore);
OutputStream sigOut = gen.open(bOut,true);
while (read < offset) {
unitsize = (int) (((offset - read) >= buff) ? buff : (offset - read));
is.read(buffer, 0, unitsize);
sigOut.write(buffer);
read += unitsize;
}
sigOut.close();
bOut.close();
is.close();
I don't know what I'm doing wrong.
I agree with Rasmus Faber: the read/write loop is dodgy.
Replace this:
while (read < offset) {
unitsize = (int) (((offset - read) >= buff) ? buff : (offset - read));
is.read(buffer, 0, unitsize);
sigOut.write(buffer);
read += unitsize;
}
with:
org.bouncycastle.util.io.Streams.pipeAll(is, sigOut);
One possible problem is this line:
is.read(buffer, 0, unitsize);
FileInputStream.read is only guaranteed to read between 1 and unitsize bytes, and your code then writes the whole buffer regardless of how many bytes were actually read.
Try writing
int actuallyRead = is.read(buffer, 0, unitsize);
sigOut.write(buffer, 0, actuallyRead);
read += actuallyRead;
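If you prefer to keep a manual loop, here is a minimal sketch of a corrected version (using the same is, sigOut, bOut, and file variables as in the question; the buffer size is arbitrary):

byte[] buffer = new byte[16384];
int actuallyRead;
// read() may return fewer bytes than requested; only write what was actually read
while ((actuallyRead = is.read(buffer)) != -1) {
    sigOut.write(buffer, 0, actuallyRead);
}
sigOut.close();   // closing the signing stream finalizes the CMS structure
bOut.close();
is.close();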
Related
My application requires users to upload a digitally signed PDF, which is then encrypted. The encrypted file is uploaded to a server, where it is decrypted. The decrypted file is then verified by matching the hash of the file against the digital signature. The file is then encrypted with AES. Once encryption is complete, the file is stored on a file server. The size of a file can go up to 80 MB.
The challenge I am facing is that when the encrypted file is stored on the local drive of the machine it is saved instantly, but when the file server is on another machine it takes up to 30 minutes to save a single file. I am not able to figure out the reason.
Following is the code I am using. I have deployed and tested it on Tomcat 6 and IBM WAS; the file transfer takes the same time in both when transferring to the file server. The file server is connected to the application server via a SAN network.
Following is my decryption code:
strSymAlg = rb.getString("SYM_KEY_ALG"); //SYM_KEY_ALG=AES
cipher = Cipher.getInstance(strSymAlg);
SecKey = new SecretKeySpec(hex2Byte(sSymmetricKey), strSymAlg);
cipher.init(Cipher.DECRYPT_MODE, SecKey);
baos = recoverFile(new FileInputStream(fileEnv), cipher);
if (baos != null && isRecoveredFileValid((InputStream) new ByteArrayInputStream(baos.toByteArray()))) {
fileRecovered = (InputStream) new ByteArrayInputStream(baos.toByteArray());
}
}
private ByteArrayOutputStream recoverFile(FileInputStream in, Cipher cipher) {
int blockSize = cipher.getBlockSize();
int outputSize = cipher.getOutputSize(blockSize);
byte[] inBytes = new byte[blockSize];
byte[] outBytes = new byte[outputSize];
int inLength = 0;
int outLength = 0;
boolean more = true;
ByteArrayOutputStream baos = new ByteArrayOutputStream();
try {
while (more) {
inLength = in.read(inBytes);
if (inLength == blockSize) {
outLength = cipher.update(inBytes, 0, blockSize, outBytes);
baos.write(outBytes, 0, outLength);
} else {
more = false;
}
}
if (inLength > 0) {
outBytes = cipher.doFinal(inBytes, 0, inLength);
} else {
outBytes = cipher.doFinal();
}
baos.write(outBytes);
} catch (Exception e) {
System.out.println("recoverFile1: " + e.getMessage());
// e.printStackTrace();
baos = null;
}
return baos;
}
My encryption code is:
String strSymKey = "";
File fileToCreate = null;
KeyGenerator keygen = KeyGenerator.getInstance(strSymAlg);
random = new SecureRandom();
keygen.init(random);
SecretKey secKey = keygen.generateKey();
Key publicKey = getPublicKeyFromString(sPubKey.trim());
//encrypt Symmetric key with public key
Cipher cipher = Cipher.getInstance(ALGORITHM);
cipher.init(Cipher.WRAP_MODE, publicKey);
byte[] wrappedKey = cipher.wrap(secKey);
strSymKey = byte2hex(wrappedKey);
fileToCreate = new File(strFile);
if (fileToCreate.exists()) {
fileToCreate.delete();
}
//Encrypt Bidder file with symmetric key
DataOutputStream out = new DataOutputStream(new FileOutputStream(strFile));
cipher = Cipher.getInstance(strSymAlg);
cipher.init(Cipher.ENCRYPT_MODE, secKey);
crypt(fis, out, cipher);
fis.close();
out.close();
//blnDone=true;
// System.out.println("STRING SYMMETRIC KEY:"+ strSymKey);
return strSymKey;
public String byte2hex(byte[] b) {
// String Buffer can be used instead
String hs = "";
String stmp = "";
for (int n = 0; n < b.length; n++) {
stmp = (java.lang.Integer.toHexString(b[n] & 0XFF));
if (stmp.length() == 1) {
hs = hs + "0" + stmp;
} else {
hs = hs + stmp;
}
if (n < b.length - 1) {
hs = hs + "";
}
}
return hs;
}
New function added:
public void crypt(InputStream in, OutputStream out, Cipher cipher) throws IOException, GeneralSecurityException {
System.out.println("crypt start time :"+ new Date());
int blockSize = cipher.getBlockSize();
int outputSize = cipher.getOutputSize(blockSize);
byte[] inBytes = new byte[blockSize];
byte[] outBytes = new byte[outputSize];
int inLength = 0;
boolean more = true;
while (more) {
inLength = in.read(inBytes);
if (inLength == blockSize) {
int outLength = cipher.update(inBytes, 0, blockSize, outBytes);
out.write(outBytes, 0, outLength);
} else {
more = false;
}
}
if (inLength > 0) {
outBytes = cipher.doFinal(inBytes, 0, inLength);
} else {
outBytes = cipher.doFinal();
}
System.out.println("crypt end time :"+ new Date());
out.write(outBytes);
}
Thanks in advance
You are making the classic mistake of assuming that every read fills the buffer, and another: that the read at end of stream won't. Neither assumption is correct.
while ((count = in.read(buffer)) >= 0)
{
out.write(cipher.update(buffer, 0, count));
}
out.write(cipher.doFinal());
You don't need a DataOutputStream for this.
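Applied to the crypt() method from the question, a minimal sketch (the 8 KB buffer size is an arbitrary choice, and the null check covers the case where update() only buffers input without producing output) might look like this:

public void crypt(InputStream in, OutputStream out, Cipher cipher) throws IOException, GeneralSecurityException {
    byte[] buffer = new byte[8192];  // not tied to the cipher block size
    int count;
    while ((count = in.read(buffer)) != -1) {
        byte[] encrypted = cipher.update(buffer, 0, count);
        if (encrypted != null) {  // update() may return null while it buffers input
            out.write(encrypted);
        }
    }
    out.write(cipher.doFinal());  // writes the final, padded block
}

Wrapping the FileOutputStream in a BufferedOutputStream before passing it in is also worth trying when the target is a remote file server, since it cuts down the number of small writes going over the network.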
I am using the following code to decrypt files encrypted on an Android device.
private void mDecrypt_File(FileInputStream fin, String outFile) throws Exception {
FileOutputStream fout = new FileOutputStream(outFile);
byte[] iv = new byte[16];
byte[] salt = new byte[16];
byte[] len = new byte[8];
byte[] FC_TAGBuffer = new byte[8];
Cipher cipher = Cipher.getInstance(CIPHER_INSTANCE);
DataInputStream dis = new DataInputStream(fin);
dis.read(iv, 0, 16);
dis.read(salt, 0, 16);
Rfc2898DeriveBytes rfc = new Rfc2898DeriveBytes(DEFAULT_PASSWORD, salt, F_ITERATIONS);
SecretKey key = new SecretKeySpec(rfc.getBytes(32), "AES");
//decryption code
cipher.init(Cipher.DECRYPT_MODE, key, new IvParameterSpec(iv));
CipherInputStream cIn = new CipherInputStream(dis, cipher);
cIn.read(len, 0, 8);
long lSize = getLong(len, 0);
cIn.read(FC_TAGBuffer, 0, 8);
byte[] tempFC_TAGBuffer = changeByteArray(FC_TAGBuffer, 0);//new byte[8];
BigInteger ulong = new BigInteger(1, tempFC_TAGBuffer);
if (!ulong.equals(FC_TAG)) {
Exception ex = new Exception("Tags are not equal");
throw ex;
}
byte[] bytes = new byte[BUFFER_SIZE];
//determine number of reads to process on the file
long numReads = lSize / BUFFER_SIZE;
// determine what is left of the file, after numReads
long slack = (long) lSize % BUFFER_SIZE;
int read = -1;
int value = 0;
int outValue = 0;
MessageDigest md = MessageDigest.getInstance("SHA-256");
md.reset();
// read the buffer_sized chunks
for (int i = 0; i < numReads; ++i) {
read = cIn.read(bytes, 0, bytes.length);
fout.write(bytes, 0, read);
md.update(bytes, 0, read);
value += read;
outValue += read;
}
// now read the slack
if (slack > 0) {
read = cIn.read(bytes, 0, (int) slack);
fout.write(bytes, 0, read);
md.update(bytes, 0, read);
value += read;
outValue += read;
}
fout.flush();
fout.close();
byte[] curHash = md.digest();
byte[] oldHash = new byte[md.getDigestLength()];
read = cIn.read(oldHash, 0, oldHash.length);
if (oldHash.length != read || (!CheckByteArrays(oldHash, curHash))) {
Exception ex = new Exception("File Corrupted!");
throw ex;
}
if (outValue != lSize) {
Exception ex = new Exception("File Sizes don't match!");
throw ex;
}
}
This code works fine on Android but behaves strangely in a Java desktop application.
What I have observed is that reading the old hash from the CipherInputStream cIn returns the correct hash value only if the size of the data to be decrypted is a multiple of 32. For example, if I encrypt a text file containing text of length 32 characters (or 64/128/...), then the following code
byte[] oldHash = new byte[md.getDigestLength()];
read = cIn.read(oldHash, 0, oldHash.length);
if (oldHash.length != read || (!CheckByteArrays(oldHash, curHash))) {
Exception ex = new Exception("File Corrupted!");
throw ex;
}
reads oldHash correctly, but if I change the text to any other length (not a multiple of 32) then the last few bytes of oldHash become zeros.
My observations:
Text Size 6 char - Trailing zeros in oldHash - 6
Text Size 13 char - Trailing zeros in oldHash - 13
Text Size 20 char - Trailing zeros in oldHash - 4
Text Size 32 char - Trailing zeros in oldHash - 0 // Correct Result
Text Size 31 char - Trailing zeros in oldHash - 1
Text Size 64 char - Trailing zeros in oldHash - 0 // Correct Result
Please help me understand this behavior.
I agree with DuncanJones: your loop is a mess. Although you capture the return value of read(), your loop iterations assume that every read() returns BUFFER_SIZE bytes, or 'slack' bytes for the last read.
Your code would be hugely better if you made proper use of DataInputStream. For example, you wrap the FileInputStream fin in a DataInputStream but then use the wrong methods in these two lines:
dis.read(iv, 0, 16);
dis.read(salt, 0, 16);
Instead, you should use the readFully method, as in:
dis.readFully(iv);
dis.readFully(salt);
Similarly, you would benefit from wrapping your CipherInputStream cIn with another DataInputStream, something like:
CipherInputStream cIn = new CipherInputStream(dis, cipher);
DataInputStream dcIn = new DataInputStream(cIn);
DataInputStream already has a readLong method, so you could just replace these lines:
cIn.read(len, 0, 8);
long lSize = getLong(len, 0);
cIn.read(FC_TAGBuffer, 0, 8);
with
long lSize = dcIn.readLong();
dcIn.readFully(FC_TAGBuffer);
and you get to throw out your homegrown getLong method (note that readLong assumes big-endian byte order, so if the writer produced little-endian data you will still need to swap the bytes). Now you can go on and read the next lSize bytes in exactly BUFFER_SIZE chunks using dcIn.readFully(bytes), and make your code cleaner, shorter, easier to read, and correct.
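A minimal sketch of how the read path could then look (variable names taken from the question; reading the length with readLong, i.e. as a big-endian value, is an assumption about the file format):

DataInputStream dcIn = new DataInputStream(new CipherInputStream(dis, cipher));
long lSize = dcIn.readLong();  // swap bytes here if the writer was little-endian
dcIn.readFully(FC_TAGBuffer);
byte[] bytes = new byte[BUFFER_SIZE];
long remaining = lSize;
while (remaining >= BUFFER_SIZE) {
    dcIn.readFully(bytes);  // fills the whole buffer or throws EOFException
    fout.write(bytes);
    md.update(bytes);
    remaining -= BUFFER_SIZE;
}
if (remaining > 0) {
    dcIn.readFully(bytes, 0, (int) remaining);
    fout.write(bytes, 0, (int) remaining);
    md.update(bytes, 0, (int) remaining);
}
byte[] oldHash = new byte[md.getDigestLength()];
dcIn.readFully(oldHash);  // this is the read that previously came back short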
I have to encrypt a file (JPG) using the Vigenère cipher. I wrote some code, but after encryption and decryption my file is corrupted. The first quarter of the image displays fine, but the rest of it is corrupted. Here is my code:
@Override
public byte[] encryptFile(byte[] file, String key) {
char[] keyChars = key.toCharArray();
byte[] bytes = file;
for (int i = 0; i < file.length; i++) {
int keyNR = keyChars[i % keyChars.length] - 32;
int c = bytes[i] & 255;
if ((c >= 32) && (c <= 127)) {
int x = c - 32;
x = (x + keyNR) % 96;
bytes[i] = (byte) (x + 32);
}
}
return bytes;
}
@Override
public byte[] decryptFile(byte[] file, String key) {
char[] keyChars = key.toCharArray();
byte[] bytes = file;
for (int i = 0; i < file.length; i++) {
int keyNR = keyChars[i % keyChars.length] - 32;
int c = bytes[i] & 255;
if ((c >= 32) && (c <= 127)) {
int x = c - 32;
x = (x - keyNR + 96) % 96;
bytes[i] = (byte) (x + 32);
}
}
return bytes;
}
What did I do wrong?
EDIT:
reading and writing to file:
public void sendFile(String selectedFile, ICipher cipher, String key) {
try {
DataOutputStream outStream = new DataOutputStream(client
.getOutputStream());
outStream.flush();
File file = new File(selectedFile);
FileInputStream fileStream = new FileInputStream(file);
long fileSize = file.length();
long completed = 0;
long bytesLeft = fileSize - completed;
String msg = "SENDING_FILE:" + file.getName() + ":" + fileSize;
outStream.writeUTF(cipher.encryptMsg(msg, key));
while (completed < fileSize) {
int step = (int) (bytesLeft > 150000 ? 150000 : bytesLeft);
byte[] buffer = new byte[step];
fileStream.read(buffer);
buffer = cipher.encryptFile(buffer, key);
outStream.write(buffer);
completed += step;
bytesLeft = fileSize - completed;
}
outStream.writeUTF(cipher.encryptMsg("SEND_COMPLETE", key));
fileStream.close();
} catch (IOException e) {
e.printStackTrace();
}
}
private void downloadFile(String fileName, int fileSize,DataInputStream input,ICipher cipher, String key) {
try {
FileOutputStream outStream = new FileOutputStream("C:\\" + fileName);
int bytesRead = 0, counter = 0;
while (counter < fileSize) {
int step = (int) (fileSize > 150000 ? 150000 : fileSize);
byte[] buffer = new byte[step];
bytesRead = input.read(buffer);
if (bytesRead >= 0) {
buffer = cipher.decryptFile(buffer, key);
outStream.write(buffer, 0, bytesRead);
counter += bytesRead;
}
if (bytesRead < 1024) {
outStream.flush();
break;
}
}
Display.getDefault().syncExec(new Runnable() {
@Override
public void run() {
window.handleMessage("Download sucessfully");
}
});
outStream.close();
} catch (Exception e) {
Display.getDefault().syncExec(new Runnable() {
@Override
public void run() {
window.handleMessage("Error on downloading file!");
}
});
}
}
You encode the file in whatever chunks come from the disk I/O:
int step = (int) (bytesLeft > 150000 ? 150000 : bytesLeft);
byte[] buffer = new byte[step];
fileStream.read(buffer);
buffer = cipher.encryptFile(buffer, key);
But you decode the file in whatever chunks come from the network I/O:
bytesRead = input.read(buffer);
if (bytesRead >= 0) {
buffer = cipher.decryptFile(buffer, key);
outStream.write(buffer, 0, bytesRead);
counter += bytesRead;
}
These chunks are likely to disagree. The disk I/O may always give you full chunks (lucky for you), but the network I/O will likely give you packet-sized chunks (1500 bytes minus header).
The cipher should get an offset into the already encoded/decoded data (or encode/decode everything at once), and use that to shift the key appropriately, or this may happen:
original: ...LOREM IPSUM...
key : ...abCde abCde...
encoded : ...MQUIR JRVYR...
key : ...abCde Cdeab... <<note the key got shifted
decoded : ...LOREM GNQXP... <<output wrong after the first chunk.
Since the packet data size is (for Ethernet-sized TCP/IP packets) aligned at four bytes, a key of length four is likely to be always aligned.
Another issue is that you are ignoring the number of bytes actually read from disk when uploading the file. While disk I/O is likely to give you full-sized chunks (the file is probably memory-mapped, or the underlying native API happens to behave that way), nothing guarantees it. Always use the number of bytes actually read: bytesRead = fileStream.read(buffer);
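One way to do that is to pass both the chunk's actual length and a running byte offset into the cipher, so the key position stays aligned across chunks. This is only a sketch; the length and offset parameters do not exist in the question's ICipher interface:

public byte[] encryptFile(byte[] data, int length, long offset, String key) {
    char[] keyChars = key.toCharArray();
    for (int i = 0; i < length; i++) {
        // shift the key by the number of bytes already processed in earlier chunks
        int keyNR = keyChars[(int) ((offset + i) % keyChars.length)] - 32;
        int c = data[i] & 0xFF;
        if (c >= 32 && c <= 127) {
            data[i] = (byte) ((c - 32 + keyNR) % 96 + 32);
        }
    }
    return data;
}

The sender would then call something like cipher.encryptFile(buffer, bytesRead, completed, key), and the receiver would keep its own running counter and do the same in decryptFile.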
I am trying to download/resume a file. Resume seems to work, but a full download shows the problem: after executing this code, testfile is 5242845 bytes, but it should be 5242880! I opened the two files in a hex editor and found that testfile is missing some bytes at the end (the beginning is okay). This is the code:
String url = "http://download.thinkbroadband.com/5MB.zip";
String DESTINATION_PATH = "/sdcard/testfile";
URLConnection connection;
connection = (HttpURLConnection) url.openConnection();
File file = new File(DESTINATION_PATH);
if (file.exists()) {
downloaded = (int) file.length();
connection.setRequestProperty("Range", "bytes=" + (file.length()) + "-");
}
connection.setDoInput(true);
connection.setDoOutput(true);
BufferedInputStream in = new BufferedInputStream(connection.getInputStream());
FileOutputStream fos = (downloaded == 0) ? new FileOutputStream(DESTINATION_PATH) : new FileOutputStream(DESTINATION_PATH, true);
BufferedOutputStream bout = new BufferedOutputStream(fos, 1024);
byte[] data = new byte[1024];
int x = 0;
int i = 0;
int lenghtOfFile = connection.getContentLength();
while ((x = in.read(data, 0, 1024)) != -1) {
i++;
bout.write(data, 0, x);
downloaded += x;
}
I think the problem is here: while ((x = in.read(data, 0, 1024)) != -1) {.
For example, say we have a file 1030 bytes long. The first write is fine, bout.write(data, 0, 1024);, but the next time while ((x = in.read(data, 0, 1024)) != -1) { gets -1, because only 1030 - 1024 = 6 bytes are left, and we are trying to write 1024 bytes! I know it should not work like that, but that is how it seems to behave. How can I fix this? Thanks.
bout.flush();
and/or
bout.close();
You need to close (or at least flush) your BufferedOutputStream to ensure that everything still held in its buffer is written out to the underlying OutputStream.
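For example, a minimal sketch using try-with-resources (Java 7+; variable names mirror the question), which flushes the buffer and closes both streams even if an exception is thrown:

try (BufferedInputStream in = new BufferedInputStream(connection.getInputStream());
     BufferedOutputStream bout = new BufferedOutputStream(
             new FileOutputStream(DESTINATION_PATH, downloaded > 0), 1024)) {
    byte[] data = new byte[1024];
    int x;
    while ((x = in.read(data)) != -1) {
        bout.write(data, 0, x);
        downloaded += x;
    }
}  // close() writes the last partially filled buffer to the file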
Google told me there is an available() method on BufferedInputStream, so you could write it like this
(I'm not a Java guru):
while (in.available() > 0)
{
x = in.read(data, 0, 1024);
bout.write(data, 0, x);
}
I am decrypting an XML file from the file system using Bouncy Castle. I output the decrypted text and get a fatal SAXParseException on the very last byte of data. Below are my decryption method and the setup of the cipher object.
I was initially using cipher streams and everything worked perfectly (the commented-out code was my stream version). Because of the JCE policy files, and end users not having the 256-bit unlimited-strength versions installed, I need to use Bouncy Castle.
Any ideas why the final byte is not coming through?
From Constructor:
keyParam = new KeyParameter(key);
engine = new AESEngine();
paddedBufferedBlockCipher =
new PaddedBufferedBlockCipher(new CBCBlockCipher(engine));
Decrypt Method:
public void decrypt(InputStream in, OutputStream out) {
try
{
paddedBufferedBlockCipher.init(false,
new ParametersWithIV(keyParam, _defaultIv));
// cipher.init(Cipher.DECRYPT_MODE, secretKey, ivs);
// CipherInputStream cipherInputStream
// = new CipherInputStream(in, cipher);
byte[] buffer = new byte[4096];
byte[] outBuffer = new byte[4096];
for (int count = 0; (count = in.read(buffer)) != -1;) {
paddedBufferedBlockCipher.processBytes(buffer, 0,
count, outBuffer, 0);
out.write(outBuffer, 0, count);
}
}
catch(Exception e) {
e.printStackTrace();
}
}
[Fatal Error] :40:23: Element type "Publi" must be followed by either attribute specifications, ">" or "/>".
org.xml.sax.SAXParseException: Element type "Publi" must be followed by either attribute specifications, ">" or "/>".
at com.sun.org.apache.xerces.internal.parsers.DOMParser.parse(DOMParser.java:264)
at com.sun.org.apache.xerces.internal.jaxp.DocumentBuilderImpl.parse(DocumentBuilderImpl.java:292)
Do you call doFinal() with the final chunk of data?
public void decrypt(InputStream in, OutputStream out) {
try
{
paddedBufferedBlockCipher.init(false,
new ParametersWithIV(keyParam, _defaultIv));
byte[] buffer = new byte[4096];
byte[] outBuffer = new byte[4096];
for (int count = 0; (count = in.read(buffer)) != -1;) {
int c2 = paddedBufferedBlockCipher.processBytes(buffer, 0,
count, outBuffer, 0);
out.write(outBuffer, 0, c2);
}
int finalCount = paddedBufferedBlockCipher.doFinal(outBuffer, 0);
out.write(outBuffer, 0, finalCount);
}
catch(Exception e) {
e.printStackTrace();
}
}
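If you want to be sure outBuffer is large enough, the cipher can tell you how much output space an update plus doFinal may need for a given input length; a small sketch (getOutputSize is a standard BufferedBlockCipher method):

byte[] outBuffer = new byte[paddedBufferedBlockCipher.getOutputSize(buffer.length)];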