Encryption - Wrong data when decrypting - java

I'm working on communicating with a server and I've reached the final stage, where after negotiating keys and setting the session key, I send an encrypted message and the server answers back.
Currently, we're working with AES-256 in CBC mode, with a random IV that I send to the server and store locally. The problem I'm currently facing is when I decrypt the data I get from the server:
decryptCipher.init(Cipher.DECRYPT_MODE, key, new IvParameterSpec(cipher.getIV(), 0, 16));
// Skip the 7-byte header and keep the rest of the array
byte[] encrypted = Arrays.copyOfRange(sbResponse.toString().getBytes(), 7, sbResponse.toString().length());
When I try to decrypt that parsed array, one of the following happens, even though the response from the server does not vary in length or content at all:
I can't decrypt it, because of the following error:
javax.crypto.BadPaddingException: Given final block not properly padded
at com.sun.crypto.provider.SunJCE_f.b(DashoA13*..)
at com.sun.crypto.provider.SunJCE_f.b(DashoA13*..)
at com.sun.crypto.provider.AESCipher.engineDoFinal(DashoA13*..)
at javax.crypto.Cipher.doFinal(DashoA13*..)
I can't decrypt it, this error comes up:
javax.crypto.IllegalBlockSizeException: Input length must be multiple of 16 when decrypting with padded cipher
at com.sun.crypto.provider.SunJCE_f.b(DashoA13*..)
at com.sun.crypto.provider.SunJCE_f.b(DashoA13*..)
at com.sun.crypto.provider.AESCipher.engineDoFinal(DashoA13*..)
at javax.crypto.Cipher.doFinal(DashoA13*..)
I can decrypt it; certain blocks come out fine, but some of them have weird characters mixed in with the clear text:
k¤kC­²O©æÖ—Õw½QøX»ÌEXøÀWHÌËùtiÓaÚo at?the application
Everything comes up just fine.
So, I have to make a bunch of calls until I get a clean response from the server.
What I've noticed is that the server does change the IV on its end, however, on my end, the IV always remains the same when I ask the Cipher for it, so I really don't know where else to look.
Here's an excerpt of the code that gets the response from the server:
StringBuilder sb = new StringBuilder();
while (ConnectionStatus.LISTENING.equals(status)) {
    if (in.ready()) {
        sb.append((char) in.read());
    } else {
        if (sb.length() > 0) {
            status = ConnectionStatus.OPEN;
        }
    }
}
if (ConnectionStatus.TIMEOUT.equals(status)) {
    status = ConnectionStatus.OPEN;
    throw new TimeoutException();
}
Does anyone have any idea on what might be happening?
Let me know if you need further details, code or anything.

The problem is with storing binary data into a String.
If the InputStreamReader expects UTF-8, it will most likely encounter invalid data, since most binary streams are not valid UTF-8. Data is lost whenever the reader encounters a byte sequence that is not a valid character.
There are at least three solutions:
Switch to the underlying InputStream for binary data. This is problematic because an InputStreamReader may buffer ahead, even if this might happen with some charsets only; per the Javadoc: "To enable the efficient conversion of bytes to characters, more bytes may be read ahead from the underlying stream than are necessary to satisfy the current read operation."
Always treat data as binary, and only if you expect textual data, convert the data to String.
Encode the encrypted message to text before transmission, and decode it after receiving. There are several standard encoding schemes or you may roll your own. Here are some:
Hexadecimal - not exactly efficient (4 bits per character) but easier to implement manually.
Base64 - the de-facto standard for binary data encoding (6 bits per character). Since Java 8 it is built into the JDK as java.util.Base64; before that, third-party libraries filled the gap.
Ascii85 - the densest common encoding to printable text (6.4 bits per character), if you can find a library for it. It's not widely used.
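As a sketch of the third option, using the JDK's built-in java.util.Base64 (Java 8+): encode the ciphertext bytes to ASCII text before transmission, and decode back to the identical bytes after receiving. The byte values below are just a stand-in for real ciphertext.

```java
import java.util.Arrays;
import java.util.Base64;

public class Base64Transport {
    public static void main(String[] args) {
        // Stand-in for real ciphertext; any byte value can occur here
        byte[] ciphertext = {(byte) 0x9F, 0x00, (byte) 0xE2, 0x41, (byte) 0xFF};

        // Encode before transmission: the result is safe to store in a String
        String wire = Base64.getEncoder().encodeToString(ciphertext);
        System.out.println(wire);

        // Decode after receiving: byte-for-byte identical to the input
        byte[] roundTripped = Base64.getDecoder().decode(wire);
        System.out.println(Arrays.equals(ciphertext, roundTripped));
    }
}
```

Because every Base64 character is plain ASCII, the encoded form survives any Reader/Writer or String conversion unchanged.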

Related

Converting byte array with ASCII encoding to String produces weird result

I'm making a socket application in Java that receives some HTML data from the server in ASCII and then parses the data accordingly.
byte[] receivedContent = new byte[12500];
receivedSize = inputStream.read(receivedContent);
receivedContent = Arrays.copyOf(receivedContent, receivedSize + 1);
if (receivedSize == -1) {
    System.out.println("ERROR! NO DATA RECEIVED");
    System.exit(-1);
}
lastReceived = new String(receivedContent, StandardCharsets.US_ASCII);
This should really be quite straightforward, but it's not. I printed out some debug messages and found that despite receiving some bytes of data (for example, printing receivedSize tells me it received 784 bytes), the resulting string from those bytes is only a few chars long, like this:
Ard</a></li><li><a
I'm expecting a full HTML document, so this is clearly wrong. There's also no obvious pattern as to when this might happen; it seems totally random. Since I'm allocating new memory for the buffer, there really shouldn't be any old data in it that messes with the new data from the socket. Can someone shed some light on this strange behavior? Also, this seems to happen less frequently on my Windows machine running OracleJDK than on my remote Ubuntu machine running OpenJDK; could that be the reason, and how would I fix it?
UPDATE:
In the end I manually inspected the byte array's ASCII encoding against an ASCII table and found that the server is intentionally sending garbled data. Mystery solved.
Instead of using:
inputStream.read(receivedContent);
You need to read all data from the stream. Using something like (from apache commons io):
IOUtils.readFully(inputStream, receivedContent)
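If you don't want the Apache Commons IO dependency, a minimal read loop (a sketch, assuming the server signals the end of the response by closing the connection) accumulates everything until end-of-stream:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;

public class ReadAll {
    // Reads until end-of-stream, accumulating everything into one array.
    // A single read() only returns whatever is currently buffered, which
    // may be just a fragment of the response.
    static byte[] readAll(InputStream in) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        byte[] buf = new byte[4096];
        int n;
        while ((n = in.read(buf)) != -1) {
            out.write(buf, 0, n);
        }
        return out.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        // Demo with an in-memory stream standing in for the socket
        byte[] data = readAll(new ByteArrayInputStream("hello world".getBytes()));
        System.out.println(data.length);
    }
}
```

If the server instead keeps the connection open, you would loop until you have read a known content length or a protocol-defined terminator rather than until end-of-stream.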

Java is giving me a different buffer of the same packet from the one in C

Hello StackOverflow community,
I've been around this community for a while, but I've never had a problem like this with no solution online, or at least I couldn't find one.
I'm using Java to make a client. As soon as the client connects to the server, it receives a packet containing sensitive and essential information, which is of course encrypted. I successfully reverse engineered the cryptography behind the whole process a long time ago, implemented it in C++ without any problem, and fully tested it with positive results.
Now I'm trying to rewrite the client in Java for science and better coding speed, but the only problem is that the packet is different from what it should look like.
For example, by sniffing the packet with a C native application I get one buffer, but the same packet in my Java client comes out different.
What do I mean? I mean there are several spurious 0xFD/0xBF bytes scattered around which are not valid, resulting in a corrupted buffer and then a decryption failure.
These are the screenshots to let you understand better:
[Screenshot: the original, correct packet]
[Screenshot: the packet dumped by Java, which is corrupted]
I'm using a Reader as the reading class for the socket's input stream.
Do you have any idea about the cause of the problem?
private Reader _br = new InputStreamReader(socket.getInputStream());
char[] _data = new char[92];
this._br.read(_data);
_dump(toBytes(_data));
I just put the code related to the issue.
You are decoding random bytes (probably your ciphertext) to characters, using the platform default encoding, which appears to be UTF-8.
This generally doesn't work, of course, so the replacement character, "�" or U+FFFD, is substituted in the character stream wherever invalid byte sequences are encountered.
Then you print the characters, encoding the (now-corrupted) text into UTF-8. The UTF-8 encoding of U+FFFD is 0xEF 0xBF 0xBD.
The cause of the problem is that you are treating non-textual data as text.
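This substitution is easy to reproduce in isolation (a minimal sketch; the 0x9C byte below is just an arbitrary invalid UTF-8 sequence): decoding it to text and re-encoding yields the three-byte sequence 0xEF 0xBF 0xBD instead of the original byte.

```java
import java.nio.charset.StandardCharsets;

public class ReplacementCharDemo {
    public static void main(String[] args) {
        byte[] original = {(byte) 0x9C};          // not valid UTF-8 on its own
        String decoded = new String(original, StandardCharsets.UTF_8);
        // The invalid byte was replaced by U+FFFD during decoding;
        // re-encoding that character produces EF BF BD
        byte[] reencoded = decoded.getBytes(StandardCharsets.UTF_8);
        for (byte b : reencoded) {
            System.out.printf("%02X ", b & 0xFF);
        }
        System.out.println();
    }
}
```

One such byte in, three different bytes out: exactly the kind of corruption visible in the dumped packet.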
Update:
The problem is that you are creating an InputStreamReader. Don't do that. That would only be useful if the input stream contains encoded text. Read your input into a byte array instead:
InputStream is = socket.getInputStream();
byte[] data = new byte[92];
for (int pos = 0; pos < data.length; ) {
    int n = is.read(data, pos, data.length - pos);
    if (n < 0)
        throw new EOFException();
    pos += n;
}

/* Print what you read for debugging... */
for (byte b : data)
    System.out.printf("%02X", b & 0xFF);
System.out.println();
Now data contains your packet. You can parse it and decrypt the ciphertext. Perhaps the resulting plain text is actually text, and at that point, you can decode it into characters.

How to ensure the encrypted length is the same as the given string

Hi, is it possible to encrypt a string to a certain length that I want? For example: I want to encrypt the string BI000001 to something like the hex value A3D5F2ARD3 (random), fixed at a length of 10. Then, when the user enters the value A3D5F2ARD3, the system will use it to decrypt and get back the value BI000001.
Is it possible to do this in Java?
I tried a lot of methods, but all the encrypted lengths are way too long.
I am not aware of any JDK built-in Java encryption method which provides this feature. Then again, I am not an encryption expert, but I guess such a custom feature won't be built into the JDK.
Maybe this discussion is also useful: https://crypto.stackexchange.com/questions/6098/is-there-a-length-preserving-encryption-scheme
Why do you want to preserve the size of the string? Maybe there is another solution to your problem.
Typically you would use a block cipher such as AES to encrypt data. Block ciphers (as their name suggests) work on blocks of data of a fixed size; for example, AES works in blocks of 128 bits. If a block cipher encounters input smaller than the block size, it pads it, which is likely why you are seeing ciphertext larger than the plaintext.
If you want to preserve the length then consider Format Preserving Encryption as mentioned in this question.
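The padding expansion can be seen directly (a minimal sketch; the key is freshly generated just for the demonstration, and ECB mode is used only to keep it short, not as a recommendation for real data): the 8-byte input BI000001 comes back as a full 16-byte AES block.

```java
import javax.crypto.Cipher;
import javax.crypto.KeyGenerator;
import javax.crypto.SecretKey;
import java.nio.charset.StandardCharsets;

public class PaddingDemo {
    public static void main(String[] args) throws Exception {
        // Throwaway key, only for demonstrating the length behavior
        KeyGenerator kg = KeyGenerator.getInstance("AES");
        kg.init(128);
        SecretKey key = kg.generateKey();

        Cipher c = Cipher.getInstance("AES/ECB/PKCS5Padding");
        c.init(Cipher.ENCRYPT_MODE, key);

        // 8 bytes of plaintext are padded up to the 16-byte AES block size
        byte[] ct = c.doFinal("BI000001".getBytes(StandardCharsets.UTF_8));
        System.out.println(ct.length);
    }
}
```

Note that even a 16-byte input would produce 32 bytes of ciphertext here, since PKCS5 padding always adds at least one byte.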

CRC check redundant when using encryption?

I am using AES for encryption and CRC to check data integrity and I have the impression that the CRC check is redundant in my case. I am doing the following:
Encryption:
Take the payload data and calculate CRC from it
Encrypt payload data plus CRC
Decryption:
Decrypt data
Calculate new CRC of payload data and compare it with the old CRC
I wanted to provoke a CRC check failure in my unit test but when I manipulate the payload data the decryption always throws a BadPaddingException.
My question is: if the decryption always throws this exception when data is corrupted or manipulated (will it?), isn't the CRC check redundant the way I am using it?
Assuming the incorrectly decrypted data is uniformly distributed, it will appear to be correctly PKCS5/PKCS7 padded about 1 time in every 255 attempts. This means there is still roughly a 1/255 chance that a manipulated message decrypts without any exception, just into garbage. Therefore your check is not a waste.
If you actually want the behavior you expected, you can use "AES/CTR/NoPadding", which will not require an exact block size and will always return a decrypted byte[], whether or not the keys match.
However if an attacker can repeatedly alter the ciphertext and get you to decrypt it, (an example might be encrypted data stored in a cookie) and if they can distinguish between your behavior when the decrypted data throws an exception for bad padding and when it is simply garbage, then they can determine the plaintext via a "padding oracle attack".
You may also want to consider whether a more robust fingerprint than CRC, such as SHA-256, would be appropriate for ensuring message integrity.
A lot of this is repeated from: AES BadPaddingException
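As a sketch of that stronger check (an assumption about the design, not the original poster's code): compute an HMAC-SHA256 tag over the payload, encrypt payload plus tag, and after decrypting recompute the tag and compare. Unlike a bare CRC or hash, an HMAC is keyed, so an attacker who can alter the ciphertext cannot also forge a matching tag.

```java
import javax.crypto.Mac;
import javax.crypto.spec.SecretKeySpec;
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;

public class HmacCheck {
    // Computes a 32-byte HMAC-SHA256 tag over the payload (replaces the CRC)
    static byte[] tag(byte[] key, byte[] payload) throws Exception {
        Mac mac = Mac.getInstance("HmacSHA256");
        mac.init(new SecretKeySpec(key, "HmacSHA256"));
        return mac.doFinal(payload);
    }

    public static void main(String[] args) throws Exception {
        byte[] key = "demo-integrity-key".getBytes(StandardCharsets.UTF_8);
        byte[] payload = "payload data".getBytes(StandardCharsets.UTF_8);

        byte[] sent = tag(key, payload);       // computed before encryption
        byte[] recomputed = tag(key, payload); // recomputed after decryption

        System.out.println(sent.length);
        // Constant-time comparison avoids leaking where a mismatch occurs
        System.out.println(MessageDigest.isEqual(sent, recomputed));
    }
}
```

An authenticated mode such as AES/GCM builds this integrity check into the cipher itself and is usually the simpler choice when available.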

Java String to Byte Conversion [duplicate]

Possible Duplicate:
Java AES Encrypt Entire String
I'm having problems with the conversions back and forth between strings and byte arrays.
Basically I've made a small program to encrypt and decrypt messages using AES.
After encrypting the message this happens:
byte[] result = cipher.doFinal(message.getBytes());
String stringResult = new String(result);
Which converts the encrypted message to a string.
Now my decryptor changes the string back to bytes using:
byte[] result = stringResult.getBytes();
but when it decrypts the message (depending on the message), it may not be able to. There appears to be a padding problem, and the error that I get is:
Exception in thread "main" javax.crypto.BadPaddingException: Given final block not properly padded
Any ideas why this occurs?
One example when this occurs for sure is when the encryption key is "1231231231231231" and the message encrypted is "read".
You're using the platform default encoding - once to convert message to bytes, and once to then convert the arbitrary binary output of encryption back into a string. Both steps are problematic.
Firstly, it's best to use a fixed encoding which is known to cover the whole of Unicode when converting the string to bytes. UTF-8 is usually a good bet.
Then there's the matter of representing arbitrary binary data as text. This isn't text data represented in an encoding - it's binary data. Interpreting it as if it were text data will almost certainly lose information. You need something more robust, capable of round-tripping arbitrary binary data. Base64 is usually the way to go.
There's a public domain Base64 encoder which works pretty well as far as I know (and since Java 8, java.util.Base64 is built into the JDK). So your encryption code becomes:
byte[] result = cipher.doFinal(message.getBytes("UTF-8"));
String stringResult = Base64.encodeBytes(result);
The decrypting code would then be:
byte[] encrypted = Base64.decode(encryptedBase64);
byte[] decrypted = /* do the decryption here */
String message = new String(decrypted, "UTF-8");
Your encrypted bytes are binary data which are unlikely to survive conversion to a string and back. If it needs to be stored in a string then Base64 encode it.
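Putting the pieces together with the JDK's built-in java.util.Base64 (Java 8+), a minimal round trip looks like this. The hard-coded key matches the 16-character key from the question and is for demonstration only; likewise, the ECB mode used here mirrors the default of Cipher.getInstance("AES") and should not be used for real data.

```java
import javax.crypto.Cipher;
import javax.crypto.spec.SecretKeySpec;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class AesBase64RoundTrip {
    public static void main(String[] args) throws Exception {
        SecretKeySpec key = new SecretKeySpec(
                "1231231231231231".getBytes(StandardCharsets.UTF_8), "AES");

        // Encrypt, then Base64-encode so the ciphertext survives as a String
        Cipher enc = Cipher.getInstance("AES/ECB/PKCS5Padding");
        enc.init(Cipher.ENCRYPT_MODE, key);
        String wire = Base64.getEncoder().encodeToString(
                enc.doFinal("read".getBytes(StandardCharsets.UTF_8)));

        // Base64-decode, then decrypt; no bytes are lost along the way
        Cipher dec = Cipher.getInstance("AES/ECB/PKCS5Padding");
        dec.init(Cipher.DECRYPT_MODE, key);
        byte[] plain = dec.doFinal(Base64.getDecoder().decode(wire));

        System.out.println(new String(plain, StandardCharsets.UTF_8));
    }
}
```

With `new String(result)` in place of the Base64 step, the same key and message would intermittently throw the BadPaddingException from the question, because the platform-default decode/encode cycle mangles the ciphertext bytes.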
