Padding issue while decrypting using AES-256 with ECB mode in Java [duplicate] - java

I am encrypting a string in Objective-C and also encrypting the same string in Java using AES, and I am seeing some strange issues. The first part of the result matches up to a certain point but then it is different, hence when I go to decrypt the result from Java on the iPhone it can't be decrypted.
I am using a source string of "Now then and what is this nonsense all about. Do you know?"
Using a key of "1234567890123456"
The Objective-C code to encrypt is the following. NOTE: it is an NSData category, so assume that the method is called on an NSData object and that 'self' contains the byte data to encrypt.
- (NSData *)AESEncryptWithKey:(NSString *)key {
char keyPtr[kCCKeySizeAES128+1]; // room for terminator (unused)
bzero(keyPtr, sizeof(keyPtr)); // fill with zeroes (for padding)
// fetch key data
[key getCString:keyPtr maxLength:sizeof(keyPtr) encoding:NSUTF8StringEncoding];
NSUInteger dataLength = [self length];
//See the doc: For block ciphers, the output size will always be less than or
//equal to the input size plus the size of one block.
//That's why we need to add the size of one block here
size_t bufferSize = dataLength + kCCBlockSizeAES128;
void *buffer = malloc(bufferSize);
size_t numBytesEncrypted = 0;
CCCryptorStatus cryptStatus = CCCrypt(kCCEncrypt, kCCAlgorithmAES128, kCCOptionPKCS7Padding,
keyPtr, kCCKeySizeAES128,
NULL /* initialization vector (optional) */,
[self bytes], dataLength, /* input */
buffer, bufferSize, /* output */
&numBytesEncrypted);
if (cryptStatus == kCCSuccess) {
//the returned NSData takes ownership of the buffer and will free it on deallocation
return [NSData dataWithBytesNoCopy:buffer length:numBytesEncrypted];
}
free(buffer); //free the buffer;
return nil;
}
And the java encryption code is...
public byte[] encryptData(byte[] data, String key) {
byte[] encrypted = null;
Security.addProvider(new org.bouncycastle.jce.provider.BouncyCastleProvider());
byte[] keyBytes = key.getBytes();
SecretKeySpec keySpec = new SecretKeySpec(keyBytes, "AES");
try {
Cipher cipher = Cipher.getInstance("AES/ECB/PKCS7Padding", "BC");
cipher.init(Cipher.ENCRYPT_MODE, keySpec);
encrypted = new byte[cipher.getOutputSize(data.length)];
int ctLength = cipher.update(data, 0, data.length, encrypted, 0);
ctLength += cipher.doFinal(encrypted, ctLength);
} catch (Exception e) {
logger.log(Level.SEVERE, e.getMessage());
} finally {
return encrypted;
}
}
The hex output of the objective-c code is -
7a68ea36 8288c73d f7c45d8d 22432577 9693920a 4fae38b2 2e4bdcef 9aeb8afe 69394f3e 1eb62fa7 74da2b5c 8d7b3c89 a295d306 f1f90349 6899ac34 63a6efa0
and the java output is -
7a68ea36 8288c73d f7c45d8d 22432577 e66b32f9 772b6679 d7c0cb69 037b8740 883f8211 748229f4 723984be b50b5aea 1f17594c 9fad2d05 ee092680 5572156d
As you can see everything is fine up to -
7a68ea36 8288c73d f7c45d8d 22432577
I am guessing I have some of the settings different but can't work out which; I tried changing between ECB and CBC on the Java side and it had no effect.
Can anyone help? Please...

Since CCCrypt takes an IV, doesn't it use a chaining block cipher mode (such as CBC)? This would be consistent with what you see: the first block is identical, but in the second block the Java version applies the original key to encrypt, while the OS X version seems to use something else.
EDIT:
From here I saw an example. It seems like you need to pass kCCOptionECBMode to CCCrypt:
ccStatus = CCCrypt(encryptOrDecrypt,
kCCAlgorithm3DES,
kCCOptionECBMode, // <-- this could help
vkey, //"123456789012345678901234", //key
kCCKeySize3DES,
nil, //"init Vec", //iv,
vplainText, //"Your Name", //plainText,
plainTextBufferSize,
(void *)bufferPtr,
bufferPtrSize,
&movedBytes);
EDIT 2:
I played around with the command line a bit to see which one was right, and thought I could contribute it. (Note that echo appends a trailing newline to the plaintext, which is why the final block below differs from the Java output.)
$ echo "Now then and what is this nonsense all about. Do you know?" | openssl enc -aes-128-ecb -K $(echo 1234567890123456 | xxd -p) -iv 0 | xxd
0000000: 7a68 ea36 8288 c73d f7c4 5d8d 2243 2577 zh.6...=..]."C%w
0000010: e66b 32f9 772b 6679 d7c0 cb69 037b 8740 .k2.w+fy...i.{.#
0000020: 883f 8211 7482 29f4 7239 84be b50b 5aea .?..t.).r9....Z.
0000030: eaa7 519b 65e8 fa26 a1bb de52 083b 478f ..Q.e..&...R.;G.

I spent a few weeks decrypting a Base64-encoded, AES-256-encrypted string. Encryption was done by CCCrypt (Objective-C) on an iPad. The decryption was to be done in Java (using Bouncy Castle).
I finally succeeded and learnt quite a lot in the process. The encryption code was exactly the same as above (I guess it's taken from the Objective-C sample in the iPhone developer documentation).
What the CCCrypt() documentation does NOT mention is that it uses CBC mode by default (if you don't specify an option like kCCOptionECBMode). It does mention that the IV, if not specified, defaults to all zeros (so the IV will be a byte array of 16 0x00 bytes).
Using these two pieces of information, you can create a functionally identical encryption module using CBC (and avoid ECB, which is less secure) on both Java and OS X/iPhone/iPad (CCCrypt).
The Cipher init function takes the IV (wrapped in an IvParameterSpec) as a third argument:
cipher.init(Cipher.ENCRYPT_MODE, keySpec, new IvParameterSpec(iv));
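For completeness, here is a minimal Java sketch of that setup (my own illustration, not the original poster's code): AES-128 in CBC mode with an all-zero IV and PKCS#7 padding, which should line up with CCCrypt's defaults when no options and no IV are passed. The standard provider's "PKCS5Padding" is byte-for-byte identical to PKCS#7 for AES.
import java.nio.charset.StandardCharsets;
import javax.crypto.Cipher;
import javax.crypto.spec.IvParameterSpec;
import javax.crypto.spec.SecretKeySpec;

public class CbcZeroIvSketch {
    public static byte[] encrypt(byte[] plaintext, byte[] keyBytes) throws Exception {
        byte[] iv = new byte[16]; // all zeros, matching CCCrypt's default when the IV is NULL
        Cipher cipher = Cipher.getInstance("AES/CBC/PKCS5Padding"); // PKCS5Padding == PKCS#7 for AES
        cipher.init(Cipher.ENCRYPT_MODE, new SecretKeySpec(keyBytes, "AES"), new IvParameterSpec(iv));
        return cipher.doFinal(plaintext);
    }

    public static void main(String[] args) throws Exception {
        byte[] key = "1234567890123456".getBytes(StandardCharsets.UTF_8);
        byte[] ct = encrypt("Now then and what is this nonsense all about. Do you know?"
                .getBytes(StandardCharsets.UTF_8), key);
        System.out.println(java.util.Base64.getEncoder().encodeToString(ct));
    }
}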

For anyone else who needs this, disown was absolutely spot on... the revised call to create the crypt in objective-c is as follows (note you need the ECB mode AND the padding)...
CCCryptorStatus cryptStatus = CCCrypt(kCCEncrypt, kCCAlgorithmAES128, kCCOptionECBMode + kCCOptionPKCS7Padding,
keyPtr, kCCKeySizeAES128,
NULL /* initialization vector (optional) */,
[self bytes], dataLength, /* input */
buffer, bufferSize, /* output */
&numBytesEncrypted);

Just to add to the first post: in your Objective-C/Cocoa code you used CBC mode and in your Java code you used ECB, and an IV (initialization vector) wasn't explicitly set in either.
ECB encrypts block by block, while CBC chains each block onto the preceding one, so if your text fits in a single block (16 bytes in your example) and the CBC IV is all zeros, the ciphertext produced by the two modes is the same and each is decryptable by the other (see the sketch below).
If you are looking for a way to standardize your use of the ciphers, NIST Special Publication 800-38A, 2001 Edition has test vectors. I can post code for the AES CBC and ECB vectors if it's helpful to anyone.
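To illustrate the single-block point above, here is a small sketch using the standard JCE (a zero IV is assumed, as in the question; the class and variable names are mine):
import java.nio.charset.StandardCharsets;
import java.util.Arrays;
import javax.crypto.Cipher;
import javax.crypto.spec.IvParameterSpec;
import javax.crypto.spec.SecretKeySpec;

public class EcbVsCbcFirstBlock {
    public static void main(String[] args) throws Exception {
        SecretKeySpec key = new SecretKeySpec("1234567890123456".getBytes(StandardCharsets.UTF_8), "AES");
        byte[] block = "exactly16bytes!!".getBytes(StandardCharsets.UTF_8); // one full AES block

        Cipher ecb = Cipher.getInstance("AES/ECB/NoPadding");
        ecb.init(Cipher.ENCRYPT_MODE, key);

        Cipher cbc = Cipher.getInstance("AES/CBC/NoPadding");
        cbc.init(Cipher.ENCRYPT_MODE, key, new IvParameterSpec(new byte[16])); // all-zero IV

        // CBC XORs the first plaintext block with the IV before encrypting;
        // with a zero IV that XOR is a no-op, so the first block matches ECB.
        System.out.println(Arrays.equals(ecb.doFinal(block), cbc.doFinal(block))); // true
    }
}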

Related

Data signed on iOS can't be verified in Java

I have some data that I'm signing on iOS with SecKeyRawSign using an Elliptic Curve private key. However, verifying that data in Java using Signature.verify() returns false.
The data is a random 64 bit integer, split into bytes like so
uint64_t nonce = (some 64 bit integer)
NSData *nonceData = [NSData dataWithBytes: &nonce length: sizeof(nonce)];
From that data I'm creating a SHA256 digest
int digestLength = CC_SHA256_DIGEST_LENGTH;
uint8_t *digest = malloc(digestLength);
CC_SHA256(nonceData.bytes, (CC_LONG)nonceData.length, digest);
NSData *digestData = [NSData dataWithBytes:digest length:digestLength];
and then signing it with private key
size_t signedBufferSize = kMaxCipherBufferSize;
uint8_t *signedBuffer = malloc(kMaxCipherBufferSize);
OSStatus status = SecKeyRawSign(privateKeyRef,
kSecPaddingPKCS1SHA256,
(const uint8_t *)digestData.bytes,
digestData.length,
&signedBuffer[0],
&signedBufferSize);
NSData *signedData = nil;
if (status == errSecSuccess) {
signedData = [NSData dataWithBytes:signedBuffer length:signedBufferSize];
}
Everything appears to work fine.
Then, in Java server, I'm trying to verify that signed data
PublicKey publicKey = (a public key sent from iOS, X509 encoded)
Long nonce = (64 bit integer sent from iOS)
String signedNonce = (base64 encoded signed data)
ByteBuffer buffer = ByteBuffer.allocate(Long.BYTES);
buffer.putLong(nonce);
byte[] nonceBytes = buffer.array();
byte[] signedNonceBytes = Base64.getDecoder().decode(signedNonce.getBytes());
Signature signer = Signature.getInstance( "SHA256withECDSA" );
signer.initVerify( publicKey );
signer.update( nonceBytes );
Boolean isVerified = signer.verify( signedNonceBytes );
At this point, signer.verify() returns false
I also tried to sign plain data, instead of SHA256 digest, but that doesn't work either.
What am I missing? Am I signing the data correctly? Am I using correct padding? Is there something else to be done with data to be able to verify it with SHA256withECDSA algorithm?
The byte ordering does not match:
iOS is little endian. The way you create nonceData, this order is retained.
On the Java side, ByteBuffer defaults to big endian, independent of the underlying operating system / hardware.
So you need to change the byte order:
ByteBuffer buffer = ByteBuffer.allocate(Long.BYTES);
buffer.order(ByteOrder.LITTLE_ENDIAN);
buffer.putLong(nonce);
I'm a Java guy, so I can't say anything about the iOS side, but a quick check of the Java side can be done using the assumptions noted in the comments:
// Generate a new random EC keypair for testing
KeyPair keys = KeyPairGenerator.getInstance("EC").generateKeyPair();
PrivateKey privateKey = keys.getPrivate();
PublicKey publicKey = keys.getPublic();
// Generate a Random nonce to test with
byte[] nonceBytes = new byte[8]; // (some 64 bit integer)
new Random(System.nanoTime()).nextBytes(nonceBytes);
// sign
Signature sign = Signature.getInstance("SHA256withECDSA");
sign.initSign(privateKey);
sign.update(nonceBytes);
byte[] signature = sign.sign();
//verify
Signature verify = Signature.getInstance("SHA256withECDSA");
verify.initVerify(publicKey);
verify.update(nonceBytes);
Boolean isVerified = verify.verify(signature);
// print results
System.out.println("nonce used ::" + Base64.getEncoder().encodeToString(nonceBytes));
System.out.println("Signed nonce ::" + Base64.getEncoder().encodeToString(signature));
System.out.println("nonce used ::" + isVerified);
As you'd expect, the code above will always report that the signature is verified. Check that your assumptions are accurate and validate that the keys being used are correct on both sides.
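One quick way to check the key assumption, continuing the snippet above (a sketch; compare the printed value with whatever the iOS client actually exports):
// Print the X.509 / SubjectPublicKeyInfo encoding of the Java-side public key
// so it can be compared byte-for-byte with the key material sent from iOS.
System.out.println("public key (X.509, Base64): "
        + Base64.getEncoder().encodeToString(publicKey.getEncoded()));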
I can advise you to use a crypto library that is available for both the iOS and Java sides (e.g. https://github.com/VirgilSecurity/virgil-crypto). This ensures that the algorithm, block types, etc. are the same in both cases and you won't need to worry about it anymore. I believe you will find many crypto libraries on GitHub.
In getBytes() you could specify the character encoding explicitly using java.nio.charset.StandardCharsets, and do the same when decoding.
https://docs.oracle.com/javase/7/docs/api/java/nio/charset/StandardCharsets.html
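For example (a sketch of that suggestion, continuing the question's variables; note that Base64.Decoder also has a decode(String) overload, which avoids the charset question entirely):
byte[] signedNonceBytes = Base64.getDecoder()
        .decode(signedNonce.getBytes(StandardCharsets.US_ASCII)); // explicit charset
// or, avoiding getBytes() altogether:
byte[] signedNonceBytes2 = Base64.getDecoder().decode(signedNonce);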

'javax.crypto.BadPaddingException' while using CipherInputStream

I'm writing a program to encrypt and decrypt data.
for encrypting,
I created a symmetric key using keyGenerator.
I transferred the key to the cipher, and created a string version of the key:
String keyString = Base64.getEncoder().encodeToString(symmetricKey.getEncoded());
in order to store it in a configuration file (so I can retrieve the key in the decrypt function).
Now, in the decrypt function I need to get that string back into key format, so I can pass it as a parameter to the cipher in decrypt mode.
I convert it back to key this way:
byte[] keyBytes = key.getBytes(Charset.forName("UTF-8"));
Key newkey = new SecretKeySpec(keyBytes,0,keyBytes.length, "AES");
And I pass it to the cipher and write the output (the decrypted data) using CipherInputStream:
Cipher cipher = Cipher.getInstance("AES/CBC/PKCS5Padding");
cipher.init(Cipher.DECRYPT_MODE, newkey, newiv, SecureRandom.getInstance("SHA1PRNG"));
CipherInputStream cipherInputStream = new CipherInputStream(
new ByteArrayInputStream(encryptedBytes), cipher);
ArrayList<Byte> decryptedVal = new ArrayList<>();
int nextByte;
while ((nextByte = cipherInputStream.read()) != -1) {
decryptedVal.add((byte) nextByte);
}
byte[] bytes = new byte[decryptedVal.size()];
for (int i = 0; i < bytes.length; i++) {
bytes[i] = decryptedVal.get(i);
}
String decryptedData = new String(bytes);
cipherInputStream.close();
System.out.println("decryptedData: " + decryptedData);
I get this error:
Exception in thread "main" java.io.IOException: javax.crypto.BadPaddingException: Given final block not properly padded. Such issues can arise if a bad key is used during decryption.
So I suspect that there might be a problem with the way I treat the key.
Any suggestions? help would be appreciated!
I think you have not passed the IV to the decryption function. For decryption in CBC mode, you must provide the same IV that was used in the encryption process.
Update:
The IV affects only the first block in CBC decryption mode, so it will only break the unpadding if your data fits within a single block. Otherwise it just corrupts the decrypted plaintext of the first block.
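A common way to handle this (a sketch under the assumption that you control the ciphertext format; the class and method names are mine) is to prepend a random IV to the ciphertext on encryption and split it off again before decrypting:
import java.security.SecureRandom;
import javax.crypto.Cipher;
import javax.crypto.spec.IvParameterSpec;
import javax.crypto.spec.SecretKeySpec;

public class IvPrefixExample {
    public static byte[] encrypt(byte[] plaintext, SecretKeySpec key) throws Exception {
        byte[] iv = new byte[16];
        new SecureRandom().nextBytes(iv);                 // fresh random IV per message
        Cipher cipher = Cipher.getInstance("AES/CBC/PKCS5Padding");
        cipher.init(Cipher.ENCRYPT_MODE, key, new IvParameterSpec(iv));
        byte[] ct = cipher.doFinal(plaintext);
        byte[] out = new byte[iv.length + ct.length];     // layout: IV || ciphertext
        System.arraycopy(iv, 0, out, 0, iv.length);
        System.arraycopy(ct, 0, out, iv.length, ct.length);
        return out;
    }

    public static byte[] decrypt(byte[] ivAndCiphertext, SecretKeySpec key) throws Exception {
        IvParameterSpec iv = new IvParameterSpec(ivAndCiphertext, 0, 16); // recover the IV prefix
        Cipher cipher = Cipher.getInstance("AES/CBC/PKCS5Padding");
        cipher.init(Cipher.DECRYPT_MODE, key, iv);
        return cipher.doFinal(ivAndCiphertext, 16, ivAndCiphertext.length - 16);
    }
}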
Of course you get this error: first you apply base 64 encoding:
String keyString = Base64.getEncoder().encodeToString(symmetricKey.getEncoded());
and then you use character-encoding to turn it back into bytes:
byte[] keyBytes = key.getBytes(Charset.forName("UTF-8"));
which just keeps the Base64 encoding in place, expanding the key from 16 bytes to 24 bytes, which corresponds to a 192-bit key instead of a 128-bit key (or a 24-byte key to a 32-byte key, of course; both sizes happen to be accepted).
To solve this you need to use Base64.getDecoder() and decode the key.
Currently you get a key with a different size and value. That means that each block of plaintext, including the last one containing the padding, will decrypt to random plaintext. As random plaintext is unlikely to contain valid padding, you will be greeted with a BadPaddingException.
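In other words, the round trip should look roughly like this (a sketch with illustrative names, mirroring the poster's keyString):
import java.util.Arrays;
import java.util.Base64;
import javax.crypto.KeyGenerator;
import javax.crypto.SecretKey;
import javax.crypto.spec.SecretKeySpec;

public class KeyRoundTrip {
    public static void main(String[] args) throws Exception {
        SecretKey symmetricKey = KeyGenerator.getInstance("AES").generateKey();

        // Store: raw key bytes -> Base64 text (safe to keep in a config file)
        String keyString = Base64.getEncoder().encodeToString(symmetricKey.getEncoded());

        // Load: Base64 text -> the SAME raw key bytes (not keyString.getBytes()!)
        byte[] keyBytes = Base64.getDecoder().decode(keyString);
        SecretKeySpec restored = new SecretKeySpec(keyBytes, "AES");

        System.out.println(Arrays.equals(restored.getEncoded(), symmetricKey.getEncoded())); // true
    }
}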
Reminder:
encoding, e.g. base 64 or hex: encoding bytes to a text string
character-encoding, e.g. UTF-8 or ASCII: encoding a text string into bytes
They are not opposites, that would be decoding and character-decoding respectively.
Remarks:
yes, listen to Ashfin; you need to use a random IV during encryption and then use it during decryption, for instance by prefixing it to the ciphertext (unencrypted);
don't use ArrayList<Byte>; that stores a reference to each separate byte (!) - use ByteArrayOutputStream or any other OutputStream instead;
it's better to use a byte buffer to read from / write to the streams (note that read() may not fill the buffer, even at the start or in the middle of the stream); reading a single byte at a time is not performant (see the sketch after these remarks);
lookup try-with-resources for Java;
using a KeyStore may be better than storing in a config file;
GCM mode (AES/GCM/NoPadding) also authenticates data and should be preferred over CBC mode.
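Pulling a few of those remarks together, here is a minimal sketch (the 12-byte IV prefix and 128-bit tag length are my assumptions, not the poster's format) of streaming decryption with a buffer, a ByteArrayOutputStream, try-with-resources, and GCM:
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import javax.crypto.Cipher;
import javax.crypto.CipherInputStream;
import javax.crypto.SecretKey;
import javax.crypto.spec.GCMParameterSpec;

public class GcmStreamDecrypt {
    // Assumes the ciphertext layout is: 12-byte IV || GCM ciphertext + tag.
    public static byte[] decrypt(byte[] ivAndCiphertext, SecretKey key) throws Exception {
        Cipher cipher = Cipher.getInstance("AES/GCM/NoPadding");
        cipher.init(Cipher.DECRYPT_MODE, key,
                new GCMParameterSpec(128, ivAndCiphertext, 0, 12)); // 128-bit tag, IV from prefix
        try (CipherInputStream in = new CipherInputStream(
                new ByteArrayInputStream(ivAndCiphertext, 12, ivAndCiphertext.length - 12), cipher);
             ByteArrayOutputStream out = new ByteArrayOutputStream()) {
            byte[] buffer = new byte[4096];
            int read;
            while ((read = in.read(buffer)) != -1) { // read() may return fewer bytes than requested
                out.write(buffer, 0, read);
            }
            return out.toByteArray(); // a bad GCM tag surfaces as an IOException while reading
        }
    }
}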

RSA transfer iOS to Java

I want to encrypt a String in Objective-C (iOS) and decrypt it after HTTP Transfer in Java.
The public key was successfully transmitted from Java to iOS (at least it is recognized in the keychain).
In ObjC I try to send the text "Hallo" using this:
NSData* myData = [@"Hallo" dataUsingEncoding:NSUTF8StringEncoding];
OSStatus status = noErr;
size_t cipherBufferSize;
uint8_t *cipherBuffer;
cipherBufferSize = SecKeyGetBlockSize(publicKey);
cipherBuffer = malloc(cipherBufferSize);
size_t dataLength = [self length];
uint8_t* intDataToEncrypt = (uint8_t*)[self bytes];
if (cipherBufferSize < dataLength) {
printf("Could not encrypt. Packet too large.\n");
return NULL;
}
status = SecKeyEncrypt( publicKey,
kSecPaddingNone,
intDataToEncrypt,
dataLength,
cipherBuffer,
&cipherBufferSize
);
NSData *encryptedData = [NSData dataWithBytes:cipherBuffer length:cipherBufferSize];
"encryptedData" is then send base64 encoded via HTTP Post to a Java Servlet which uses this (BouncyCastle is used and the data converted back from Base64 first, the key is read from a file):
PKCS8EncodedKeySpec keySpec = new PKCS8EncodedKeySpec(b64.decodeBuffer(key));
PrivateKey privateKey = kf.generatePrivate(keySpec);
Cipher cipher = Cipher.getInstance("RSA/NONE/NoPadding");
cipher.init(Cipher.DECRYPT_MODE, privateKey);
byte[] myData = cipher.doFinal(byteData, 0, 256);
The output is completely messed up when I try to read it as new String(myData, "UTF-8");
I tried sending only "Hallo" without encryption and that works, so it seems the Base64 is OK.
When I encrypt the same phrase in Java directly, the output differs by 2 bytes from the Objective-C encryption.
In the end I want to use OAEP padding instead of none...
I hope someone can help with this; the iOS Security framework really seems to be very bad.
The problem was not in the RSA part but in the Base64 encoding and transfer. All "+" characters in the string were received as " " due to the HTTP(S) transfer (of course, I should have thought of that).
Replacing them with "-", as the URL-safe Base64 variant (Base64URL) proposes, solves the problem (and likewise all "/" with "_").
This also explains why only 2 bytes were different: those were the "+" symbol bytes.
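On the Java side, the built-in URL-safe Base64 codec performs exactly this substitution (a sketch; the sample bytes are arbitrary):
import java.util.Arrays;
import java.util.Base64;

public class UrlSafeBase64 {
    public static void main(String[] args) {
        byte[] encryptedData = {(byte) 0xFB, (byte) 0xEF, 0x3E, 0x01}; // stand-in for RSA output

        // '+' -> '-' and '/' -> '_', so the value survives HTTP transfer unescaped
        String wire = Base64.getUrlEncoder().withoutPadding().encodeToString(encryptedData);

        byte[] back = Base64.getUrlDecoder().decode(wire);
        System.out.println(wire + " -> round trip ok: " + Arrays.equals(back, encryptedData));
    }
}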

iOS symmetric key encryption / decryption equivalent to Java

I am trying to exchange encrypted data between iOS and Java, but data encrypted in Java is not properly decrypted in iOS, and data encrypted in iOS is not properly decrypted in Java.
- (NSData *) encrypt:(NSData *) dataToEncrypt symmetricKey:(NSData *)symmetricKey context:(CCOperation)encryptOrDecrypt{
NSUInteger data_length= [dataToEncrypt length];
uint8_t input_raw_data[data_length];
//The [dataToEncrypt length] gives the number of chars present in the string.So say there are 10 chars.
//Now,the getBytes needs to get the raw bytes from this i.e. binary NSData.But suppose the encoding was
//full 16 bit encoding then the number of bytes needed wd have been double- 20.But as we are using the
//NSUTF8StringEncoding,the number of byes needed is 1 per char as the chars(even if originally unicode are
//compressed into an 8 bit UTF8 encoding.)
[dataToEncrypt getBytes:&input_raw_data length:data_length];
// [dataToEncrypt getBytes:&input_raw_data maxLength:data_length usedLength:NULL encoding:NSUTF8StringEncoding options:0 range:NSMakeRange(0,data_length) remainingRange:NULL];
//According to the doc: For block ciphers, the output size will always be less than or
//equal to the input size plus the size of one block.
//That's why we need to add the size of one block here
size_t buffer_size = data_length + kCCBlockSizeAES128;
void* buffer = malloc(buffer_size);
size_t num_bytes_encrypted = 0;
CCCryptorStatus crypt_status = CCCrypt(encryptOrDecrypt, kCCAlgorithmAES128, 0x0000,
[symmetricKey bytes], kCCKeySizeAES128,
NULL,
input_raw_data, data_length,
buffer, buffer_size,
&num_bytes_encrypted);
// NSLog(#"~~num bytes encrypted: %d",num_bytes_encrypted);
if (crypt_status == kCCSuccess){
NSLog(#"~~Data encoded successfully...");
return [NSData dataWithBytesNoCopy:buffer length:num_bytes_encrypted];
}
free(buffer); //free the buffer;
return nil;
}
I have used this
Java Code -
Cipher cipher = Cipher.getInstance("AES/CBC/NoPadding");
String keyString = "keykeykeykeykeykeykeykey";
byte[] keyBytes = keyString.getBytes("UTF-8");
cipher.init(Cipher.ENCRYPT_MODE, new SecretKeySpec(keyBytes, "AES"), new IvParameterSpec(new byte[16]));
byte[] resultBytes = cipher.doFinal("Hallo Welt!".getBytes("UTF8"));
FileOutputStream out = new FileOutputStream(new File("encryptedFileJava"));
out.write(resultBytes); out.close();
and this is encrypted text - “Se áJbë|8”R ,
key - BW3dKDf2bkDC4Bq9xTdr1g==
Please help me or suggest me any solution.
Thank you.
You have at least two problems:
The Objective-C code is using ECB mode, while the Java code is using CBC mode. Use a byte array of zeroes instead of NULL in the CCCrypt invocation to use CBC mode with a zero IV like the Java code.
Since keyBytes is 24 bytes long, Java will use AES-192. CCCrypt will just ignore the extra bytes. Either specify AES-192 to CCCrypt or use a 128-bit key ("keykeykeykeykeyk" should work).
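To make the key-size mismatch concrete from the Java side, a quick sketch using the key string from the question (the class name is mine; the truncation comment restates the answer above):
import javax.crypto.spec.SecretKeySpec;

public class KeySizeCheck {
    public static void main(String[] args) throws Exception {
        byte[] keyBytes = "keykeykeykeykeykeykeykey".getBytes("UTF-8");
        System.out.println("Java sees a " + keyBytes.length * 8 + "-bit key"); // 192-bit -> AES-192

        // Per the answer above, CCCrypt called with kCCKeySizeAES128 only uses
        // the first 16 bytes, i.e. the equivalent AES-128 key would be:
        SecretKeySpec aes128 = new SecretKeySpec(keyBytes, 0, 16, "AES");
        System.out.println(new String(aes128.getEncoded(), "UTF-8")); // keykeykeykeykeyk
    }
}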
For secured communication between iOS and Java devices, symmetric key encryption can be used.
In cases where the platforms are different, it is advisable that the generated key is a plain-text key rather than an API-generated key.
AES 128-bit encryption can be used in such cases; iOS devices are capable of generating a symmetric key and encrypting the text using AES.
The link below provides the Java code to encrypt and decrypt using a plain-text symmetric key:
http://www.java-redefined.com/2015/06/symmetric-key-encryption-ios-java.html

Unable to exchange data encrypted with AES-256 between Java and PHP

My problem is: what I encrypt in Java I can decrypt perfectly in Java, but PHP's mcrypt can't decrypt it. What I encrypt with mcrypt I can decrypt with mcrypt, but not in Java.
I want to send and receive encrypted data from a Java application to a PHP page, so I need it to be compatible.
Here's what I have...
JAVA...
public static String crypt(String input, String key){
byte[] crypted = null;
try{
SecretKeySpec skey = new SecretKeySpec(Base64.decodeBase64(key), "AES");
Cipher cipher = Cipher.getInstance("AES/ECB/PKCS5Padding");
cipher.init(Cipher.ENCRYPT_MODE, skey);
crypted = cipher.doFinal(input.getBytes());
}catch(Exception e){
}
return Base64.encodeBase64String(crypted);
}
public static String decrypt(String input, String key){
byte[] output = null;
try{
SecretKeySpec skey = new SecretKeySpec(Base64.decodeBase64(key), "AES");
Cipher cipher = Cipher.getInstance("AES/ECB/PKCS5Padding");
cipher.init(Cipher.DECRYPT_MODE, skey);
output = cipher.doFinal(Base64.decodeBase64(input));
}catch(Exception e){
}
return new String(output);
}
Running:
public static void main(String[] args) {
String key = "Zvzpv8/PXbezPCZpxzQKzL/FeoPw68jIb+NONX/LIi8=";
String data = "example";
System.out.println(Cpt.decrypt(Cpt.crypt(data, key), key));
}
Output:
example
PHP...
function getEncrypt($sStr, $sKey) {
return base64_encode(
mcrypt_encrypt(
MCRYPT_RIJNDAEL_256,
$sKey,
$sStr,
MCRYPT_MODE_ECB
)
);
}
function getDecrypt($sStr, $sKey) {
return mcrypt_decrypt(
MCRYPT_RIJNDAEL_256,
$sKey,
base64_decode($sStr),
MCRYPT_MODE_ECB
);
}
Running:
$crypt = getDecrypt(getEncrypt($str, $key), $key);
echo "<p>Crypt: $crypt</p>";
Output:
Crypt: example�������������������������
Using PHP to crypt "example" with key "Zvzpv8/PXbezPCZpxzQKzL/FeoPw68jIb+NONX/LIi8=" I get "YTYhgp4zC+w5IsViTR5PUkHMX4i7JzvA6NJT1FqhoGY=".
Using Java to crypt the same thing with the same key I get "+tdAZqTE7WAVPXhB3Tp5+g==".
I'm encoding and decoding to base64 in the right order and I tested base64 encode and decode compatibility between Java and PHP and it's working.
BUG#1
MCRYPT_RIJNDAEL_256 is not AES. The 256 in that constant refers to the blocksize, not the keysize. Use MCRYPT_RIJNDAEL_128 to get the same algorithm as AES. The keysize is set just by the number of bytes in the key argument you supply. So supply 32 bytes and you get AES with a 256-bit key.
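Seen from the Java side (a small sketch using the Base64 key from the question): the key length is what selects AES-256, while the block size is always 128 bits for AES.
import java.util.Base64;
import javax.crypto.Cipher;

public class BlockVsKeySize {
    public static void main(String[] args) throws Exception {
        byte[] keyBytes = Base64.getDecoder()
                .decode("Zvzpv8/PXbezPCZpxzQKzL/FeoPw68jIb+NONX/LIi8=");
        System.out.println("key size:   " + keyBytes.length * 8 + " bits"); // 256 -> AES-256

        // The block size is a property of the algorithm, not of the key:
        Cipher aes = Cipher.getInstance("AES/ECB/PKCS5Padding");
        System.out.println("block size: " + aes.getBlockSize() * 8 + " bits"); // always 128 for AES
    }
}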
BUG#2
These two lines are never correct in Java and indicate a fundamental misunderstanding of the nature of the arbitrary binary data produced by cryptographic transforms:
output = cipher.doFinal(Base64.decodeBase64(input));
return new String(output);
There is nothing wrong with transmitting and storing byte[] directly, but if you must use only printable strings then you should base64 encode/decode to do so. As you are already using base64 extensively that would seem like the way to go. I would guess that the correct two lines would be:
output = cipher.doFinal(Base64.decodeBase64(input));
return new String(Base64.encodeBase64(output), "UTF-8");
EDIT:
Just kidding about bug #2. Really, I was wrong, I didn't notice it was the decrypt direction. Of course, if you know the decrypted byte[] is a valid string then it is perfectly correct to do what your code does.
I know this is an old topic, but I will add my working solution.
You have to rewrite PHP side of the script:
function getEncrypt($sStr, $sKey) {
return base64_encode(
mcrypt_encrypt(
MCRYPT_RIJNDAEL_128,
base64_decode($sKey),
$sStr,
MCRYPT_MODE_ECB
)
);
}
function getDecrypt($sStr, $sKey) {
return mcrypt_decrypt(
MCRYPT_RIJNDAEL_128,
base64_decode($sKey),
base64_decode($sStr),
MCRYPT_MODE_ECB
);
}
You should base64_decode($sKey) because your key is base64 encoded.
$key = "Zvzpv8/PXbezPCZpxzQKzL/FeoPw68jIb+NONX/LIi8=";
Then, you need to create this function (credit goes to beltrachi from http://www.php.net/manual/en/function.mcrypt-decrypt.php):
function pkcs5_pad ($text, $blocksize) {
$pad = $blocksize - (strlen($text) % $blocksize);
return $text . str_repeat(chr($pad), $pad);
}
Use this code do encode/decode:
$decrypt = getDecrypt("6XremNEs1jv/Nnf/fRlQob6oG1jkge+5Ut3PL489oIo=", $key);
echo $decrypt;
echo "\n\n";
echo getEncrypt(pkcs5_pad("My very secret text:)", 16), $key);
I hope this will be useful for someone! :)
Please see here:
Difference in PHP encryption from iOS and .NET
AES Encrypt in C#, decrypt in PHP
DES Encryption in PHP and C#
The problem you're encountering is a padding issue. I don't know Java, but AES/ECB/PKCS5Padding looks like you're using PKCS#5 padding (essentially the same as PKCS#7), while PHP's mcrypt natively only supports NULL padding. This is what PKCS#5/7 does:
Pad the input with a padding string of between 1 and 8 bytes to make the total length an exact multiple of 8 bytes. The value of each byte of the padding string is set to the number of bytes added - i.e. 8 bytes of value 0x08, 7 bytes of value 0x07, ..., 2 bytes of 0x02, or one byte of value 0x01.
So the PHP code to do the padding right is trivial:
$blockSize = mcrypt_get_block_size(MCRYPT_RIJNDAEL_256, MCRYPT_MODE_ECB);
$padding = $blockSize - (strlen($data) % $blockSize);
$data .= str_repeat(chr($padding), $padding);
Keep in mind that both sides must use the same character encoding for the strings. Try converting the strings in both languages to UTF-8, for example, and then encrypt the resulting binary data:
PHP (see the utf8_encode() function):
$strAndBlob = utf8_encode("My string");
Java:
String str = "My string";
byte[] blob = str.getBytes("utf-8");
PHP, for example, does not necessarily use UTF-8 by default.
