I want to encrypt a String in Objective-C (iOS) and decrypt it after HTTP transfer in Java.
The public key was successfully transmitted from Java to iOS (at least it is recognized in the keychain).
In Objective-C I try to send the text "Hallo" using this:
NSData* myData = [@"Hallo" dataUsingEncoding:NSUTF8StringEncoding];
OSStatus status = noErr;
size_t cipherBufferSize;
uint8_t *cipherBuffer;
cipherBufferSize = SecKeyGetBlockSize(publicKey);
cipherBuffer = malloc(cipherBufferSize);
size_t dataLength = [myData length];
uint8_t* intDataToEncrypt = (uint8_t*)[myData bytes];
if (cipherBufferSize < dataLength) {
    printf("Could not encrypt. Packet too large.\n");
    free(cipherBuffer);
    return NULL;
}
status = SecKeyEncrypt(publicKey,
                       kSecPaddingNone,
                       intDataToEncrypt,
                       dataLength,
                       cipherBuffer,
                       &cipherBufferSize);
NSData *encryptedData = [NSData dataWithBytes:cipherBuffer length:cipherBufferSize];
free(cipherBuffer);
"encryptedData" is then send base64 encoded via HTTP Post to a Java Servlet which uses this (BouncyCastle is used and the data converted back from Base64 first, the key is read from a file):
PKCS8EncodedKeySpec keySpec = new PKCS8EncodedKeySpec(b64.decodeBuffer(key));
PrivateKey privateKey = kf.generatePrivate(keySpec);
Cipher cipher = Cipher.getInstance("RSA/NONE/NoPadding");
cipher.init(Cipher.DECRYPT_MODE, privateKey);
byte[] myData = cipher.doFinal(byteData, 0, 256);
The output is completely garbled when I try to read it with new String(myData, "UTF-8");
I tried sending just "Hallo" without encryption and that works, so the Base64 handling seems to be OK.
When I encrypt the same phrase directly in Java, the output differs from the Objective-C encryption in 2 bytes.
In the end I want to use OAEP padding instead of none...
I hope someone can help with this; the iOS Security framework really seems to be quite bad.
The problem was not in the RSA part but in the Base64 encoding and transfer. All "+" characters in the string were received as " " due to HTTP(S) transfer (of course, I should have thought of that).
Replacing them with "-" as proposed by the unofficial Base64URL variant solves the problem (likewise all "/" with "_").
This also explains why only 2 bytes were different - those were the "+" symbol bytes.
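For reference, java.util.Base64 (Java 8+) ships a URL-safe codec that performs exactly this substitution, so neither side has to hand-replace characters. A minimal sketch, where cipherBytes stands in for the encrypted data from above:

import java.util.Base64;

public class Base64UrlDemo {
    public static void main(String[] args) {
        // Sample bytes whose standard Base64 encoding contains '+' and '/'
        // (and would therefore be mangled by HTTP form decoding)
        byte[] cipherBytes = {(byte) 0xFB, (byte) 0xEF, 0x3E, (byte) 0xFF};

        // URL-safe alphabet: '+' becomes '-' and '/' becomes '_'
        String wireSafe = Base64.getUrlEncoder().encodeToString(cipherBytes);
        byte[] roundTripped = Base64.getUrlDecoder().decode(wireSafe);

        System.out.println(wireSafe);
        System.out.println(roundTripped.length == cipherBytes.length); // true
    }
}

Alternatively, URL-encoding the standard Base64 string before the POST avoids the problem as well.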
Related
I have some data that I'm signing on iOS with SecKeyRawSign using an Elliptic Curve private key. However, verifying that data in Java using Signature.verify() returns false.
The data is a random 64-bit integer, split into bytes like so:
uint64_t nonce = (some 64 bit integer)
NSData *nonceData = [NSData dataWithBytes: &nonce length: sizeof(nonce)];
From that data I'm creating a SHA256 digest
int digestLength = CC_SHA256_DIGEST_LENGTH;
uint8_t *digest = malloc(digestLength);
CC_SHA256(nonceData.bytes, (CC_LONG)nonceData.length, digest);
NSData *digestData = [NSData dataWithBytes:digest length:digestLength];
free(digest); // dataWithBytes copies, so the scratch buffer can be released
and then signing it with the private key:
size_t signedBufferSize = kMaxCipherBufferSize;
uint8_t *signedBuffer = malloc(kMaxCipherBufferSize);
OSStatus status = SecKeyRawSign(privateKeyRef,
kSecPaddingPKCS1SHA256,
(const uint8_t *)digestData.bytes,
digestData.length,
&signedBuffer[0],
&signedBufferSize);
NSData *signedData = nil;
if (status == errSecSuccess) {
    signedData = [NSData dataWithBytes:signedBuffer length:signedBufferSize];
}
free(signedBuffer); // the signature bytes were copied into signedData
Everything appears to work fine.
Then, on the Java server, I'm trying to verify the signed data:
PublicKey publicKey = (a public key sent from iOS, X509 encoded)
Long nonce = (64 bit integer sent from iOS)
String signedNonce = (base64 encoded signed data)
ByteBuffer buffer = ByteBuffer.allocate(Long.BYTES);
buffer.putLong(nonce);
byte[] nonceBytes = buffer.array();
byte[] signedNonceBytes = Base64.getDecoder().decode(signedNonce.getBytes());
Signature signer = Signature.getInstance( "SHA256withECDSA" );
signer.initVerify( publicKey );
signer.update( nonceBytes );
Boolean isVerified = signer.verify( signedNonceBytes );
At this point, signer.verify() returns false
I also tried to sign plain data, instead of SHA256 digest, but that doesn't work either.
What am I missing? Am I signing the data correctly? Am I using correct padding? Is there something else to be done with data to be able to verify it with SHA256withECDSA algorithm?
The byte ordering does not match:
iOS is little endian. The way you create nonceData, this order is retained.
On the Java side, ByteBuffer defaults to big endian, independent of the underlying operating system / hardware.
So you need to change the byte order:
ByteBuffer buffer = ByteBuffer.allocate(Long.BYTES);
buffer.order(ByteOrder.LITTLE_ENDIAN);
buffer.putLong(nonce);
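To make the mismatch concrete, here is a small self-contained Java check (the nonce value is just an illustrative constant):

import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.util.Arrays;

public class EndiannessDemo {
    public static void main(String[] args) {
        long nonce = 0x0102030405060708L; // illustrative 64-bit value

        byte[] big = ByteBuffer.allocate(Long.BYTES).putLong(nonce).array();
        byte[] little = ByteBuffer.allocate(Long.BYTES)
                .order(ByteOrder.LITTLE_ENDIAN)
                .putLong(nonce)
                .array();

        System.out.println(Arrays.toString(big));    // [1, 2, 3, 4, 5, 6, 7, 8]
        System.out.println(Arrays.toString(little)); // [8, 7, 6, 5, 4, 3, 2, 1]
    }
}

The verifier was hashing the first sequence while the signature was made over the second, so verify() could only fail.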
I'm a Java guy, so I can't say anything about the iOS side, but the Java side can be checked quickly using the assumptions noted in the comments:
// Generate a new random EC keypair for testing
KeyPair keys = KeyPairGenerator.getInstance("EC").generateKeyPair();
PrivateKey privateKey = keys.getPrivate();
PublicKey publicKey = keys.getPublic();
// Generate a Random nonce to test with
byte[] nonceBytes = new byte[8]; // (some 64 bit integer)
new Random(System.nanoTime()).nextBytes(nonceBytes);
// sign
Signature sign = Signature.getInstance("SHA256withECDSA");
sign.initSign(privateKey);
sign.update(nonceBytes);
byte[] signature = sign.sign();
//verify
Signature verify = Signature.getInstance("SHA256withECDSA");
verify.initVerify(publicKey);
verify.update(nonceBytes);
Boolean isVerified = verify.verify(signature);
// print results
System.out.println("nonce used ::" + Base64.getEncoder().encodeToString(nonceBytes));
System.out.println("Signed nonce ::" + Base64.getEncoder().encodeToString(signature));
System.out.println("nonce used ::" + isVerified);
As you'd expect, the code above will always report that the signature is verified. Check that your assumptions are accurate and validate that the keys being used are correct on both sides.
I can advise you to use a crypto library that is available on both the iOS and Java sides (e.g. https://github.com/VirgilSecurity/virgil-crypto). This ensures that the algorithm, block types, etc. are the same in both cases, so you won't need to worry about it anymore. You will find many crypto libraries on GitHub.
In getBytes() you could specify the encoding explicitly using java.nio.charset.StandardCharsets, and do the same with the decoder.
https://docs.oracle.com/javase/7/docs/api/java/nio/charset/StandardCharsets.html
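For instance, a minimal sketch of that suggestion (the signedNonce value is a hypothetical placeholder):

import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class CharsetDemo {
    public static void main(String[] args) {
        String signedNonce = "c2lnbmVkLWJ5dGVz"; // hypothetical placeholder value

        // An explicit charset avoids depending on the JVM's platform-default encoding
        byte[] signedNonceBytes = Base64.getDecoder()
                .decode(signedNonce.getBytes(StandardCharsets.US_ASCII));

        System.out.println(signedNonceBytes.length);
    }
}

That said, Base64 text is plain ASCII, so a platform-default charset is unlikely to be the root cause here; the byte-order answer above addresses the actual failure.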
Some background of what I'm trying to accomplish.
Part 1.
PHP server communicates with a Java-based device. PHP uses OpenSSL to generate a public/private keypair, then sends the public key to the device which in turn gives back an encrypted macKey (generated using the public key), encoded in base64. PHP now needs to base64-decode and decrypt the macKey using the private key.
What is the equivalent of the below Java code snippet in PHP?
String base64EncodedMacKey = "LkvTT9LFj5lcxRRB8KrwwN906fSIDDcJvQK3E7a5PbR+Ox9WnslOs32jSCC9FkE8ouvr2MfWwtppuZmoPjaxwg3yAQI4UN3T1loISuF2VwKWfJ45fywbK9bNnD5Cw7336mjoGctv77Tg3JXPrsRwgMGIlBsNwdt1B0wgT4MMMAjl32TnBI3iwQ94VTMHffrK+QToddTahRHHoVsr3FVrETdiqKXdkiX1jES53im5lrXYIsY89UFkGzPo+3u4ijKIQWSLvYnA5wXI128gFHKxKYS82MbJDUn9i1RVFsGaP6T3nQRSX5SZNpSe5yGFWwMgYOx0KXMgET82FeaL2hfWuw==";
byte[] base64DecodedMacKey = DatatypeConverter.parseBase64Binary(base64EncodedMacKey);
Cipher cipher = Cipher.getInstance("RSA/ECB/PKCS1Padding");
cipher.init(Cipher.DECRYPT_MODE, keypair.getPrivate());
byte[] macKey = cipher.doFinal(base64DecodedMacKey);
Here's what I attempted in PHP; however, I'm confused about using a byte array versus a string when decrypting the macKey:
$macKey = 'LkvTT9LFj5lcxRRB8KrwwN906fSIDDcJvQK3E7a5PbR+Ox9WnslOs32jSCC9FkE8ouvr2MfWwtppuZmoPjaxwg3yAQI4UN3T1loISuF2VwKWfJ45fywbK9bNnD5Cw7336mjoGctv77Tg3JXPrsRwgMGIlBsNwdt1B0wgT4MMMAjl32TnBI3iwQ94VTMHffrK+QToddTahRHHoVsr3FVrETdiqKXdkiX1jES53im5lrXYIsY89UFkGzPo+3u4ijKIQWSLvYnA5wXI128gFHKxKYS82MbJDUn9i1RVFsGaP6T3nQRSX5SZNpSe5yGFWwMgYOx0KXMgET82FeaL2hfWuw==';
$base64DecodedMacKey = base64_decode($macKey);
openssl_private_decrypt($base64DecodedMacKey, $decrypted, $privateKey);
The $decrypted above appears to hold binary data, so I'm unsure whether I need to convert it into a byte array or treat it as a string...
Part 2.
Each request has a counter. The macKey in Java code above is used to create a MAC value out of the counter.
What is the equivalent of the below Java code snippet in PHP?
int counter = 0;
String nextCounter = String.valueOf(++counter);
SecretKeySpec signingKey = new SecretKeySpec(macKey, "AES");
Mac mac = Mac.getInstance("HmacSHA256");
mac.init(signingKey);
byte[] counterMac = mac.doFinal(nextCounter.getBytes("UTF-8"));
String base64EncodedMac = DatatypeConverter.printBase64Binary(counterMac);
The base64EncodedMac above is finally sent to the device to validate communication.
I've tried googling different solutions; however, I've not been successful in generating a valid base64EncodedMac string in PHP that the device will approve.
Found the solution myself. For Part 1, I chose to use phpseclib to generate the public/private keys and to specify the encryption algorithm. Decrypting macKey:
$rsa = new Crypt_RSA();
$keys = $rsa->createKey(2048);
// [...]
$macKey = base64_decode($base64EncodedMacKey);
$rsa->setEncryptionMode(CRYPT_RSA_ENCRYPTION_PKCS1);
$rsa->loadKey($keys['privatekey']);
$decryptedMac = $rsa->decrypt($macKey);
Followed by Part 2:
$counter = 0;
$hmac = hash_hmac('sha256', ++$counter, $decryptedMac, true);
$counterMac = base64_encode($hmac);
The main confusing part was that in Java the HMAC is computed over a byte array, while PHP's hash_hmac function expects a string as its 2nd parameter, so using unpack() was not sufficient. However, passing $counter directly seems to have worked (a PHP string is already a raw byte sequence, so the string "1" and its bytes are interchangeable). It was also important to pass TRUE as the 4th parameter so that raw binary data is returned.
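To cross-check the two sides, here is a self-contained Java version of the question's MAC step; the key bytes are a hypothetical stand-in for the decrypted macKey, and the printed value should match PHP's base64_encode(hash_hmac('sha256', '1', $macKey, true)) for the same key:

import java.util.Base64;
import javax.crypto.Mac;
import javax.crypto.spec.SecretKeySpec;

public class CounterMacDemo {
    public static void main(String[] args) throws Exception {
        byte[] macKey = "0123456789abcdef".getBytes("UTF-8"); // hypothetical stand-in key

        // HMAC-SHA256 over the counter's decimal string, as in the question's Java snippet
        Mac mac = Mac.getInstance("HmacSHA256");
        mac.init(new SecretKeySpec(macKey, "AES"));
        byte[] counterMac = mac.doFinal("1".getBytes("UTF-8"));

        System.out.println(Base64.getEncoder().encodeToString(counterMac));
    }
}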
I implemented data encryption/decryption with RSA. It works if I just encrypt/decrypt locally; however, if I send my encrypted data I get BadPaddingException: Data must start with zero.
In order to send my data over the network I need to change it from a byte array to a String (I'm sending it in a header) on the client side, then retrieve it and change it back to a byte array on the server side.
Here's my code for local encryption/decryption (I'm using the private key to encrypt and the public key to decrypt):
// Encryption:
String message = "HELLO";
Cipher rsa = Cipher.getInstance("RSA");
rsa.init(Cipher.ENCRYPT_MODE, privateKey); // privateKey has type java.security.PrivateKey
byte [] encryptedBytes = rsa.doFinal(message.getBytes());
// Decryption:
rsa.init(Cipher.DECRYPT_MODE, publicKey); // type of publicKey: java.security.PublicKey
byte [] ciphertext = rsa.doFinal(encryptedBytes);
String decryptedString = new String(ciphertext, "UTF-8");
decryptedString and message are the same, and everything works fine.
Then I use the same code on the client side just for encryption, plus I change the ciphertext to a String using:
String encryptedString = new String(ciphertext, "UTF-8");
And on the server side I do:
String message = request.getHeader("Message");
byte [] msgBytes = message.getBytes("UTF-8");
Cipher rsa = Cipher.getInstance("RSA");
rsa.init(Cipher.DECRYPT_MODE, publicKey);
byte [] decryptedMsg = rsa.doFinal(msgBytes);
String decryptedString = new String(decryptedMsg, "UTF-8");
This doesn't work and I get BadPaddingException.
I have tried using different cipher instances, e.g. "RSA/ECB/PKCS1Padding" or "RSA/ECB/NoPadding", but this didn't help. I also tried converting the Strings using Base64, but then I get a different exception: IllegalBlockSizeException.
I know I'm probably doing something wrong when converting Strings into byte arrays and vice versa, but I just can't figure out the correct way of doing it. Please help!
You can't just convert arbitrary binary data (the encrypted text) into a String. If you want to send the data as text, you need to use some sort of binary-to-text encoding, such as Base64.
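A minimal sketch of that advice, with a freshly generated key pair standing in for the asker's keys (encrypting with the private key mirrors the question's setup, even though it is an unusual use of RSA):

import java.security.KeyPair;
import java.security.KeyPairGenerator;
import java.util.Base64;
import javax.crypto.Cipher;

public class RsaHeaderDemo {
    public static void main(String[] args) throws Exception {
        KeyPair pair = KeyPairGenerator.getInstance("RSA").generateKeyPair();

        // Client side: encrypt, then Base64-encode the ciphertext before putting it in a header
        Cipher rsa = Cipher.getInstance("RSA/ECB/PKCS1Padding");
        rsa.init(Cipher.ENCRYPT_MODE, pair.getPrivate());
        byte[] cipherBytes = rsa.doFinal("HELLO".getBytes("UTF-8"));
        String headerValue = Base64.getEncoder().encodeToString(cipherBytes);

        // Server side: Base64-decode first, then decrypt
        rsa.init(Cipher.DECRYPT_MODE, pair.getPublic());
        byte[] plain = rsa.doFinal(Base64.getDecoder().decode(headerValue));
        System.out.println(new String(plain, "UTF-8")); // HELLO
    }
}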
I am trying to encrypt/decrypt data between iOS and Java (and Java and iOS), but data encrypted in Java is not decrypted properly on iOS, and data encrypted on iOS is not decrypted properly in Java.
- (NSData *) encrypt:(NSData *) dataToEncrypt symmetricKey:(NSData *)symmetricKey context:(CCOperation)encryptOrDecrypt{
NSUInteger data_length= [dataToEncrypt length];
uint8_t input_raw_data[data_length];
//[dataToEncrypt length] gives the number of chars present in the string. So say there are 10 chars.
//getBytes needs to copy the raw bytes out of this binary NSData. If the encoding were a
//full 16-bit encoding, the number of bytes needed would have been double (20). But as we are using
//NSUTF8StringEncoding, the number of bytes needed is 1 per char, as the chars (even if originally
//Unicode) are compressed into an 8-bit UTF-8 encoding.
[dataToEncrypt getBytes:input_raw_data length:data_length];
// [dataToEncrypt getBytes:&input_raw_data maxLength:data_length usedLength:NULL encoding:NSUTF8StringEncoding options:0 range:NSMakeRange(0,data_length) remainingRange:NULL];
//According to the doc: For block ciphers, the output size will always be less than or
//equal to the input size plus the size of one block.
//That's why we need to add the size of one block here
size_t buffer_size = data_length + kCCBlockSizeAES128;
void* buffer = malloc(buffer_size);
size_t num_bytes_encrypted = 0;
CCCryptorStatus crypt_status = CCCrypt(encryptOrDecrypt, kCCAlgorithmAES128, 0x0000,
[symmetricKey bytes], kCCKeySizeAES128,
NULL,
input_raw_data, data_length,
buffer, buffer_size,
&num_bytes_encrypted);
// NSLog(#"~~num bytes encrypted: %d",num_bytes_encrypted);
if (crypt_status == kCCSuccess){
NSLog(#"~~Data encoded successfully...");
return [NSData dataWithBytesNoCopy:buffer length:num_bytes_encrypted];
}
free(buffer); //free the buffer;
return nil;
}
I have used this
Java Code -
Cipher cipher = Cipher.getInstance("AES/CBC/NoPadding");
String keyString = "keykeykeykeykeykeykeykey";
byte[] keyBytes = keyString.getBytes("UTF-8");
cipher.init(Cipher.ENCRYPT_MODE, new SecretKeySpec(keyBytes, "AES"), new IvParameterSpec(new byte[16]));
byte[] resultBytes = cipher.doFinal("Hallo Welt!".getBytes("UTF8"));
FileOutputStream out = new FileOutputStream(new File("encryptedFileJava"));
out.write(resultBytes); out.close();
and this is the encrypted text: “Se áJbë|8”R ,
key - BW3dKDf2bkDC4Bq9xTdr1g==
Please help me or suggest any solution.
Thank you.
You have at least two problems:
The Objective-C code is using ECB mode, while the Java code is using CBC mode. Use a byte array of zeroes instead of NULL for the IV in the CCCrypt invocation to get CBC mode with a zero IV, like the Java code.
Since keyBytes is 24 bytes long, Java will use AES-192. CCCrypt will just ignore the extra bytes. Either specify AES-192 to CCCrypt or use a 128-bit key ("keykeykeykeykeyk" should work).
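Putting both fixes together on the Java side, a sketch under the assumption that you settle on AES-128 in CBC mode with a zero IV (the 16-byte key is the truncated example key from above):

import java.io.FileOutputStream;
import javax.crypto.Cipher;
import javax.crypto.spec.IvParameterSpec;
import javax.crypto.spec.SecretKeySpec;

public class Aes128Demo {
    public static void main(String[] args) throws Exception {
        // 16-byte key => AES-128, matching kCCKeySizeAES128 on the iOS side
        byte[] keyBytes = "keykeykeykeykeyk".getBytes("UTF-8");

        // Zero IV in CBC mode, matching CCCrypt called with a zeroed IV buffer
        Cipher cipher = Cipher.getInstance("AES/CBC/NoPadding");
        cipher.init(Cipher.ENCRYPT_MODE,
                new SecretKeySpec(keyBytes, "AES"),
                new IvParameterSpec(new byte[16]));

        // NoPadding requires the plaintext length to be a multiple of 16 bytes
        byte[] resultBytes = cipher.doFinal("Hallo Welt!12345".getBytes("UTF-8"));

        try (FileOutputStream out = new FileOutputStream("encryptedFileJava")) {
            out.write(resultBytes);
        }
    }
}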
For secured communication between iOS and Java devices, symmetric-key encryption can be used.
In such cases, where the platforms are different, it is advisable that the key be a plain-text key rather than an API-generated key.
AES 128-bit encryption can be used in such cases; iOS devices are capable of generating a symmetric key and encrypting the text using AES.
The link below provides the Java code to encrypt and decrypt using a plain-text symmetric key:
http://www.java-redefined.com/2015/06/symmetric-key-encryption-ios-java.html
My problem is: what I encrypt in Java I can decrypt perfectly in Java, but PHP's mcrypt can't decrypt it. What I encrypt with mcrypt I can decrypt with mcrypt, but not in Java.
I want to send and receive encrypted data from a Java application to a PHP page, so I need it to be compatible.
Here's what I have...
JAVA...
public static String crypt(String input, String key){
byte[] crypted = null;
try{
SecretKeySpec skey = new SecretKeySpec(Base64.decodeBase64(key), "AES");
Cipher cipher = Cipher.getInstance("AES/ECB/PKCS5Padding");
cipher.init(Cipher.ENCRYPT_MODE, skey);
crypted = cipher.doFinal(input.getBytes());
}catch(Exception e){
e.printStackTrace(); // at minimum, log the failure instead of swallowing it
}
return Base64.encodeBase64String(crypted);
}
public static String decrypt(String input, String key){
byte[] output = null;
try{
SecretKeySpec skey = new SecretKeySpec(Base64.decodeBase64(key), "AES");
Cipher cipher = Cipher.getInstance("AES/ECB/PKCS5Padding");
cipher.init(Cipher.DECRYPT_MODE, skey);
output = cipher.doFinal(Base64.decodeBase64(input));
}catch(Exception e){
e.printStackTrace(); // at minimum, log the failure instead of swallowing it
}
return new String(output);
}
Running:
public static void main(String[] args) {
String key = "Zvzpv8/PXbezPCZpxzQKzL/FeoPw68jIb+NONX/LIi8=";
String data = "example";
System.out.println(Cpt.decrypt(Cpt.crypt(data, key), key));
}
Output:
example
PHP...
function getEncrypt($sStr, $sKey) {
return base64_encode(
mcrypt_encrypt(
MCRYPT_RIJNDAEL_256,
$sKey,
$sStr,
MCRYPT_MODE_ECB
)
);
}
function getDecrypt($sStr, $sKey) {
return mcrypt_decrypt(
MCRYPT_RIJNDAEL_256,
$sKey,
base64_decode($sStr),
MCRYPT_MODE_ECB
);
}
Running:
$crypt = getDecrypt(getEncrypt($str, $key), $key);
echo "<p>Crypt: $crypt</p>";
Output:
Crypt: example�������������������������
Using PHP to crypt "example" with key "Zvzpv8/PXbezPCZpxzQKzL/FeoPw68jIb+NONX/LIi8=" I get "YTYhgp4zC+w5IsViTR5PUkHMX4i7JzvA6NJT1FqhoGY=".
Using Java to crypt the same thing with the same key I get "+tdAZqTE7WAVPXhB3Tp5+g==".
I'm encoding and decoding to base64 in the right order and I tested base64 encode and decode compatibility between Java and PHP and it's working.
BUG#1
MCRYPT_RIJNDAEL_256 is not AES. The 256 in that constant refers to the blocksize, not the keysize. Use MCRYPT_RIJNDAEL_128 to get the same algorithm as AES. The keysize is set just by the number of bytes in the key argument you supply. So supply 32 bytes and you get AES with a 256-bit key.
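The same rule can be observed from Java: the AES variant is selected purely by the supplied key length, while the block size is always 16 bytes. A small sketch (assumes a modern JDK where 256-bit keys are allowed by default):

import javax.crypto.Cipher;
import javax.crypto.spec.SecretKeySpec;

public class AesKeySizeDemo {
    public static void main(String[] args) throws Exception {
        Cipher aes = Cipher.getInstance("AES/ECB/PKCS5Padding");
        System.out.println(aes.getBlockSize()); // 16 - the AES block size never changes

        for (int keyLen : new int[] {16, 24, 32}) { // AES-128, AES-192, AES-256
            aes.init(Cipher.ENCRYPT_MODE, new SecretKeySpec(new byte[keyLen], "AES"));
            System.out.println((keyLen * 8) + "-bit key accepted");
        }
    }
}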
BUG#2
These two lines are never correct in Java and indicate a fundamental misunderstanding of the nature of the arbitrary binary data produced by cryptographic transforms:
output = cipher.doFinal(Base64.decodeBase64(input));
return new String(output);
There is nothing wrong with transmitting and storing byte[] directly, but if you must use only printable strings then you should base64 encode/decode to do so. As you are already using base64 extensively that would seem like the way to go. I would guess that the correct two lines would be:
output = cipher.doFinal(Base64.decodeBase64(input));
return new String(Base64.encodeBase64(output), "UTF-8");
EDIT:
Just kidding about bug #2. Really, I was wrong, I didn't notice it was the decrypt direction. Of course, if you know the decrypted byte[] is a valid string then it is perfectly correct to do what your code does.
I know this is an old topic, but I will add my working solution.
You have to rewrite the PHP side of the script:
function getEncrypt($sStr, $sKey) {
return base64_encode(
mcrypt_encrypt(
MCRYPT_RIJNDAEL_128,
base64_decode($sKey),
$sStr,
MCRYPT_MODE_ECB
)
);
}
function getDecrypt($sStr, $sKey) {
return mcrypt_decrypt(
MCRYPT_RIJNDAEL_128,
base64_decode($sKey),
base64_decode($sStr),
MCRYPT_MODE_ECB
);
}
You should base64_decode($sKey) because your key is base64 encoded.
$key = "Zvzpv8/PXbezPCZpxzQKzL/FeoPw68jIb+NONX/LIi8=";
Then, you need to create this function (credit goes to beltrachi from http://www.php.net/manual/en/function.mcrypt-decrypt.php):
function pkcs5_pad ($text, $blocksize) {
$pad = $blocksize - (strlen($text) % $blocksize);
return $text . str_repeat(chr($pad), $pad);
}
Use this code to encrypt/decrypt:
$decrypt = getDecrypt("6XremNEs1jv/Nnf/fRlQob6oG1jkge+5Ut3PL489oIo=", $key);
echo $decrypt;
echo "\n\n";
echo getEncrypt(pkcs5_pad("My very secret text:)", 16), $key);
I hope this will be useful for someone! :)
Please see here:
Difference in PHP encryption from iOS and .NET
AES Encrypt in C#, decrypt in PHP
DES Encryption in PHP and C#
The problem you're encountering is a padding issue. I don't know Java, but AES/ECB/PKCS5Padding looks like you're using PKCS#5 padding (that's essentially the same as PKCS#7), while PHP's mcrypt natively supports only NULL padding. This is what PKCS#5/7 does:
Pad the input with a padding string of between 1 and 8 bytes to make the total length an exact multiple of 8 bytes. The value of each byte of the padding string is set to the number of bytes added - i.e. 8 bytes of value 0x08, 7 bytes of value 0x07, ..., 2 bytes of 0x02, or one byte of value 0x01.
So the PHP code to do the padding right is trivial:
$blockSize = mcrypt_get_block_size(MCRYPT_RIJNDAEL_256, MCRYPT_MODE_ECB);
$padding = $blockSize - (strlen($data) % $blockSize);
$data .= str_repeat(chr($padding), $padding);
Keep in mind to use the same encoding for the strings. Try converting the strings in both languages to UTF-8, for example, and then convert them to the binary data that gets encrypted:
PHP (see the utf8_encode() function):
$strAndBlob = utf8_encode("My string");
Java:
String str = "My string";
byte[] blob = str.getBytes("utf-8");
PHP, for example, may not use UTF-8 by default.