We are trying to duplicate the SHA-1 hashing done on our Java 1.6 server with the iOS/iPhone CommonCrypto libraries.
A basic question I have: why does Java produce a fixed output of 40 bytes while iOS produces a fixed output of 20 bytes from the SHA-1 algorithm?
I have found this link, which shows how to generate the hash in both environments, but the outputs would be of different lengths, correct?
How to SHA1 hash a string in Android?
The SHA-1 algorithm always returns 160 bits (or 20 bytes).
I suspect your Java code is turning the byte array into a hexadecimal string, i.e. each byte shows up as two characters.
To compare this with CommonCrypto you can either:
convert the Java output to a byte array; or
convert the CommonCrypto byte array to a hexadecimal string (this is what the link in your question does)
before comparing the values.
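For instance, a minimal Java-side sketch of the 20-byte/40-character relationship (the helper name is mine; any hex encoder works):

import java.nio.charset.Charset;
import java.security.MessageDigest;

public class DigestCompare {
    // Renders a raw digest as the hex string Java code typically prints,
    // so it can be compared byte-for-byte with the CommonCrypto output.
    static String toHex(byte[] digest) {
        StringBuilder sb = new StringBuilder(digest.length * 2);
        for (byte b : digest) {
            sb.append(String.format("%02x", b));
        }
        return sb.toString();
    }

    public static void main(String[] args) throws Exception {
        byte[] digest = MessageDigest.getInstance("SHA-1")
                .digest("test".getBytes(Charset.forName("UTF-8")));
        System.out.println(digest.length); // 20 bytes, same as CC_SHA1_DIGEST_LENGTH
        System.out.println(toHex(digest)); // 40 characters
    }
}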
Yesterday I was facing exactly this problem: the SHA-1 implementation I was using was not compatible with the Android one. After more or less an hour comparing the Android and iOS implementations, I realized it was only a problem of string formatting (change X to x).
I'm sharing the snippet of what we are using for an iOS/Android-compatible SHA-1 implementation... Hope this helps as a concrete version of @poupou's answer :-).
public static String sha1(String s) {
    MessageDigest digest = null;
    try {
        digest = MessageDigest.getInstance("SHA-1");
    } catch (NoSuchAlgorithmException e) {
        e.printStackTrace();
    }
    digest.reset();
    // Hash the UTF-8 bytes explicitly so both platforms see the same input.
    byte[] data = digest.digest(s.getBytes(Charset.forName("UTF-8")));
    return String.format("%0" + (data.length * 2) + "X", new BigInteger(1, data));
}
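A quick sanity check (assuming the method above is in scope; the expected value is the well-known SHA-1 of "test"):

// Both platforms should print the same 40-character string.
System.out.println(sha1("test")); // A94A8FE5CCB19BA61C4C0873D391E987982FBBD3

And the Objective-C counterpart: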
- (NSString *)sha1:(NSString *)input
{
    // Use the UTF-8 byte count; input.length counts UTF-16 code units,
    // which is wrong for non-ASCII strings.
    NSData *data = [input dataUsingEncoding:NSUTF8StringEncoding];
    uint8_t digest[CC_SHA1_DIGEST_LENGTH];
    CC_SHA1(data.bytes, (CC_LONG)data.length, digest);
    NSMutableString *output = [NSMutableString stringWithCapacity:CC_SHA1_DIGEST_LENGTH * 2];
    for (int i = 0; i < CC_SHA1_DIGEST_LENGTH; i++) {
        [output appendFormat:@"%02X", digest[i]];
    }
    return output;
}
Android or iOS, a SHA-1 digest always has a length of 20 bytes.
The difference is in how the result is handled: the digest is raw binary, not a C string, so CommonCrypto does not (and cannot meaningfully) null-terminate it.
The point is therefore not to derive the output length from the digest data itself, but to use the CC_SHA1_DIGEST_LENGTH constant, which is 20:
uint8_t digest[CC_SHA1_DIGEST_LENGTH];
NSData *data = [stringToHash dataUsingEncoding:NSUTF8StringEncoding];
CC_SHA1(data.bytes, (CC_LONG)data.length, digest);
NSData *hashedData = [NSData dataWithBytes:digest length:CC_SHA1_DIGEST_LENGTH];
Null-terminating the buffer yourself and measuring it with strlen() may appear to work:

uint8_t digest[CC_SHA1_DIGEST_LENGTH + 1];
memset(digest, 0, CC_SHA1_DIGEST_LENGTH + 1);
NSData *data = [stringToHash dataUsingEncoding:NSUTF8StringEncoding];
unsigned char *sha = CC_SHA1(data.bytes, (CC_LONG)data.length, digest);
NSData *hashedData = [NSData dataWithBytes:sha length:strlen((char *)sha)];

but it is unreliable: a digest can legitimately contain 0x00 bytes, which would silently truncate the result. Prefer the fixed-length version above.
Hope it helps, cheers :)
Related
I have some data that I'm signing on iOS with SecKeyRawSign, using an Elliptic Curve private key. However, verifying that data in Java with Signature.verify() returns false.
The data is a random 64-bit integer, split into bytes like so:
uint64_t nonce = (some 64 bit integer)
NSData *nonceData = [NSData dataWithBytes: &nonce length: sizeof(nonce)];
From that data I'm creating a SHA256 digest
int digestLength = CC_SHA256_DIGEST_LENGTH;
uint8_t *digest = malloc(digestLength);
CC_SHA256(nonceData.bytes, (CC_LONG)nonceData.length, digest);
NSData *digestData = [NSData dataWithBytes:digest length:digestLength];
free(digest); // dataWithBytes copies, so the malloc'd buffer can be released
and then signing it with private key
size_t signedBufferSize = kMaxCipherBufferSize;
uint8_t *signedBuffer = malloc(kMaxCipherBufferSize);
OSStatus status = SecKeyRawSign(privateKeyRef,
kSecPaddingPKCS1SHA256,
(const uint8_t *)digestData.bytes,
digestData.length,
&signedBuffer[0],
&signedBufferSize);
NSData *signedData = nil;
if (status == errSecSuccess) {
signedData = [NSData dataWithBytes:signedBuffer length:signedBufferSize];
}
Everything appears to work fine.
Then, in Java server, I'm trying to verify that signed data
PublicKey publicKey = (a public key sent from iOS, X509 encoded)
Long nonce = (64 bit integer sent from iOS)
String signedNonce = (base64 encoded signed data)
ByteBuffer buffer = ByteBuffer.allocate(Long.BYTES);
buffer.putLong(nonce);
byte[] nonceBytes = buffer.array();
byte[] signedNonceBytes = Base64.getDecoder().decode(signedNonce.getBytes());
Signature signer = Signature.getInstance( "SHA256withECDSA" );
signer.initVerify( publicKey );
signer.update( nonceBytes );
Boolean isVerified = signer.verify( signedNonceBytes );
At this point, signer.verify() returns false
I also tried to sign plain data, instead of SHA256 digest, but that doesn't work either.
What am I missing? Am I signing the data correctly? Am I using correct padding? Is there something else to be done with data to be able to verify it with SHA256withECDSA algorithm?
The byte ordering does not match:
iOS is little endian. The way you create nonceData, this order is retained.
On the Java side, ByteBuffer defaults to big endian, independent of the underlying operating system / hardware.
So you need to change the byte order:
ByteBuffer buffer = ByteBuffer.allocate(Long.BYTES);
buffer.order(ByteOrder.LITTLE_ENDIAN);
buffer.putLong(nonce);
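An equivalent alternative (a sketch, assuming the nonce is available as a Java long) keeps the buffer's default big-endian order and flips the value itself:

// Reversing the long's bytes matches the little-endian layout signed on iOS.
ByteBuffer buffer = ByteBuffer.allocate(Long.BYTES);
buffer.putLong(Long.reverseBytes(nonce));
byte[] nonceBytes = buffer.array();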
I'm a Java guy, so I can't say anything about the iOS side, but a quick check of the Java side can be done using the commented assumptions:
// Generate a new random EC keypair for testing
KeyPair keys = KeyPairGenerator.getInstance("EC").generateKeyPair();
PrivateKey privateKey = keys.getPrivate();
PublicKey publicKey = keys.getPublic();
// Generate a Random nonce to test with
byte[] nonceBytes = new byte[8]; // (some 64 bit integer)
new Random(System.nanoTime()).nextBytes(nonceBytes);
// sign
Signature sign = Signature.getInstance("SHA256withECDSA");
sign.initSign(privateKey);
sign.update(nonceBytes);
byte[] signature = sign.sign();
//verify
Signature verify = Signature.getInstance("SHA256withECDSA");
verify.initVerify(publicKey);
verify.update(nonceBytes);
Boolean isVerified = verify.verify(signature);
// print results
System.out.println("nonce used ::" + Base64.getEncoder().encodeToString(nonceBytes));
System.out.println("Signed nonce ::" + Base64.getEncoder().encodeToString(signature));
System.out.println("nonce used ::" + isVerified);
As you'd expect, the code above always reports that the signature verifies. Check that your assumptions hold and validate that the keys being used are correct on both sides.
I can advise you to use a crypto library that is available for both the iOS and Java sides (e.g. https://github.com/VirgilSecurity/virgil-crypto). This ensures that the algorithm, block types, etc. are the same in both cases, so you won't need to worry about it anymore. I believe you will find many crypto libraries on GitHub.
In getBytes() you could specify the encoding explicitly using java.nio.charset.StandardCharsets, and do the same on the decoding side.
https://docs.oracle.com/javase/7/docs/api/java/nio/charset/StandardCharsets.html
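For instance (a small sketch; signedNonce stands in for the Base64 text from the question):

import java.nio.charset.StandardCharsets;
import java.util.Base64;

// Explicit charsets in both directions avoid platform-default surprises.
byte[] base64Text = signedNonce.getBytes(StandardCharsets.UTF_8);
byte[] signedNonceBytes = Base64.getDecoder().decode(base64Text);
String roundTripped = new String(base64Text, StandardCharsets.UTF_8);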
I am trying to encrypt/decrypt data between iOS and Java, but data encrypted in Java is not properly decrypted in iOS, and data encrypted in iOS is not properly decrypted in Java.
- (NSData *)encrypt:(NSData *)dataToEncrypt symmetricKey:(NSData *)symmetricKey context:(CCOperation)encryptOrDecrypt {
    // dataToEncrypt is already raw NSData, so [dataToEncrypt length] is the
    // byte count (with NSUTF8StringEncoding, multi-byte characters were
    // accounted for when the NSData was created).
    NSUInteger data_length = [dataToEncrypt length];
    uint8_t input_raw_data[data_length];
    [dataToEncrypt getBytes:&input_raw_data length:data_length];

    // According to the docs: for block ciphers, the output size will always be
    // less than or equal to the input size plus the size of one block.
    // That's why we add the size of one block here.
    size_t buffer_size = data_length + kCCBlockSizeAES128;
    void *buffer = malloc(buffer_size);
    size_t num_bytes_encrypted = 0;
    CCCryptorStatus crypt_status = CCCrypt(encryptOrDecrypt, kCCAlgorithmAES128, 0x0000,
                                           [symmetricKey bytes], kCCKeySizeAES128,
                                           NULL,
                                           input_raw_data, data_length,
                                           buffer, buffer_size,
                                           &num_bytes_encrypted);
    if (crypt_status == kCCSuccess) {
        NSLog(@"~~Data encoded successfully...");
        return [NSData dataWithBytesNoCopy:buffer length:num_bytes_encrypted];
    }
    free(buffer);
    return nil;
}
I have used this
Java Code -
Cipher cipher = Cipher.getInstance("AES/CBC/NoPadding");
String keyString = "keykeykeykeykeykeykeykey";
byte[] keyBytes = keyString.getBytes("UTF-8");
cipher.init(Cipher.ENCRYPT_MODE, new SecretKeySpec(keyBytes, "AES"), new IvParameterSpec(new byte[16]));
byte[] resultBytes = cipher.doFinal("Hallo Welt!".getBytes("UTF8"));
FileOutputStream out = new FileOutputStream(new File("encryptedFileJava"));
out.write(resultBytes); out.close();
and this is encrypted text - “Se áJbë|8”R ,
key - BW3dKDf2bkDC4Bq9xTdr1g==
Please help me or suggest me any solution.
Thank you.
You have at least two problems:
The Objective-C code passes NULL for the IV. In CBC mode (selected by the absence of the ECB option bit) CommonCrypto treats a NULL IV as all zeroes, which happens to match the Java code's new byte[16]; still, pass an explicit zero-filled byte array in the CCCrypt invocation so the intent is unmistakable.
Since keyBytes is 24 bytes long, Java will use AES-192. CCCrypt will just ignore the extra bytes. Either specify AES-192 to CCCrypt or use a 128-bit key ("keykeykeykeykeyk" should work).
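On the Java side the key length alone selects the AES variant, so the second fix is just a 16-byte key; a sketch (same zero IV as the question):

import java.nio.charset.StandardCharsets;
import javax.crypto.Cipher;
import javax.crypto.spec.IvParameterSpec;
import javax.crypto.spec.SecretKeySpec;

// 16 key bytes -> AES-128 in Java, matching kCCKeySizeAES128 on the iOS side.
byte[] keyBytes = "keykeykeykeykeyk".getBytes(StandardCharsets.UTF_8);
Cipher cipher = Cipher.getInstance("AES/CBC/NoPadding");
cipher.init(Cipher.ENCRYPT_MODE,
        new SecretKeySpec(keyBytes, "AES"),
        new IvParameterSpec(new byte[16])); // zero IV, as in the question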
For secure communication between iOS and Java devices, symmetric-key encryption can be used.
In cases where the platforms differ, it is advisable that the key be exchanged as plain text rather than as a platform-specific API-generated key object, so both sides derive identical key bytes.
AES 128-bit encryption can be used here. iOS devices are capable of generating a symmetric key and encrypting the text with AES.
The link below provides the Java code to encrypt and decrypt using a plain-text symmetric key:
http://www.java-redefined.com/2015/06/symmetric-key-encryption-ios-java.html
In my Android app I have a SHA256 hash which I must further hash with the RIPEMD160 message digest algorithm.
I can output the correct sha256 and ripemd160 hash of any string, but when I try to hash the sha256 hash with ripemd160 I get a hash which is incorrect.
According to online hash calculators, the SHA256 value of the string 'test' (all lowercase) is:
9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08
And the RIPEMD160 value of the string 'test' is:
5e52fee47e6b070565f74372468cdc699de89107
The value from hashing the resulting sha256 hash with ripemd160 according to online calcs is:
4efc1c36d3349189fb3486d2914f56e05d3e66f8
And the one my app gives me is:
cebaa98c19807134434d107b0d3e5692a516ea66
which is obviously wrong.
Here is my code:
public static String toRIPEMD160(String in)
{
byte[] addr = in.getBytes();
byte[] out = new byte[20];
RIPEMD160Digest digest = new RIPEMD160Digest();
byte[] sha256 = sha256(addr);
digest.update(sha256,0,sha256.length);
digest.doFinal(out,0);
return getHexString(out);
}
public static byte[] sha256(byte[] data)
{
byte[] sha256 = new byte[32];
try
{
sha256 = MessageDigest.getInstance("SHA-256").digest(data);
}
catch(NoSuchAlgorithmException e)
{}
return sha256;
}
For the RIPEMD-160 algorithm you need Bouncy Castle; SHA-256 is available via java.security.MessageDigest.
Your "online calculator" result is the result of hashing the bytes of the string "test" with SHA-256, converting the result of that hash to a hex string, then taking the bytes corresponding to the ASCII characters of that hex string and hashing those a second time. This is very different from your Java code, which passes the bytes that come out of the first hash directly to the second one, without printing them as hex and turning those characters back into bytes in between. The single byte with value 254 (decimal) becomes "fe" in hex, which becomes the two-byte sequence [0x66, 0x65] when converted back to bytes.
Your hash is working fine. The problem is that the online calculators that you're using are treating your input:
9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08
as a string instead of an array of bytes. In other words, it's treating each character as a byte instead of parsing character pairs as bytes in hexadecimal. If I give this as a string to online calculators, I indeed get exactly what you got:
4efc1c36d3349189fb3486d2914f56e05d3e66f8
However, you're treating the output as an array of bytes instead of a String and that's giving you different results. You should encode your raw SHA256 hash as a string, then pass the encoded string to the hash function. I see you have a getHexString method, so we'll just use that.
public static String toRIPEMD160(String in) {
try {
byte[] addr = in.getBytes();
byte[] out = new byte[20];
RIPEMD160Digest digest = new RIPEMD160Digest();
// These are the lines that changed
byte[] rawSha256 = sha256(addr);
String encodedSha256 = getHexString(rawSha256);
byte[] strBytes = encodedSha256.getBytes("UTF-8");
digest.update(strBytes, 0, strBytes.length);
digest.doFinal(out, 0);
return getHexString(out);
} catch (UnsupportedEncodingException ex) {
// Never happens, everything supports UTF-8
return null;
}
}
If you want to know it's working, take the value of encodedSha256 and put that into an online hash calculator. As long as the calculator uses UTF-8 encoding to turn the string into a byte array, it will match your output.
To get a printable version of the byte[] digest, use this code:
StringBuffer hexString = new StringBuffer();
for (int i=0;i<out.length;i++) {
hexString.append( String.format("%02x", 0xFF & out[i]) );
}
and then call hexString.toString();
I'm trying to make a simple String to SHA1 converter in Java and this is what I've got...
public static String toSHA1(byte[] convertme) {
MessageDigest md = null;
try {
md = MessageDigest.getInstance("SHA-1");
}
catch(NoSuchAlgorithmException e) {
e.printStackTrace();
}
return new String(md.digest(convertme));
}
When I pass it toSHA1("password".getBytes()), I get [�a�ɹ??�%l�3~��. I know it's probably a simple encoding fix like UTF-8, but could someone tell me what I should do to get what I want which is 5baa61e4c9b93f3f0682250b6cf8331b7ee68fd8? Or am I doing this completely wrong?
UPDATE
You can use Apache Commons Codec (version 1.7+) to do this job for you.
DigestUtils.sha1Hex(stringToConvertToSHexRepresentation)
Thanks to @Jon Onstott for this suggestion.
Old Answer
Convert your byte array to a hex string. Real's How To tells you how.
return byteArrayToHexString(md.digest(convertme));
and (copied from Real's How To)
public static String byteArrayToHexString(byte[] b) {
String result = "";
for (int i=0; i < b.length; i++) {
result +=
Integer.toString( ( b[i] & 0xff ) + 0x100, 16).substring( 1 );
}
return result;
}
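As an aside, on Java 17+ the JDK ships a hex formatter that replaces hand-rolled loops like the one above (note it emits lowercase):

import java.util.HexFormat;

// Java 17+: built-in hex encoding of a byte array, lowercase by default.
String result = HexFormat.of().formatHex(md.digest(convertme));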
BTW, you may get a more compact representation using Base64. Apache Commons Codec API 1.4 has a nice utility to take away all the pain. Refer here.
This is my solution for converting a string to SHA-1. It works well in my Android app:
private static String encryptPassword(String password)
{
String sha1 = "";
try
{
MessageDigest crypt = MessageDigest.getInstance("SHA-1");
crypt.reset();
crypt.update(password.getBytes("UTF-8"));
sha1 = byteToHex(crypt.digest());
}
catch(NoSuchAlgorithmException e)
{
e.printStackTrace();
}
catch(UnsupportedEncodingException e)
{
e.printStackTrace();
}
return sha1;
}
private static String byteToHex(final byte[] hash)
{
Formatter formatter = new Formatter();
for (byte b : hash)
{
formatter.format("%02x", b);
}
String result = formatter.toString();
formatter.close();
return result;
}
Using Guava Hashing class:
Hashing.sha1().hashString( "password", Charsets.UTF_8 ).toString()
SHA-1 (and all other hashing algorithms) return binary data. That means that (in Java) they produce a byte[]. That byte array does not represent any specific characters, which means you can't simply turn it into a String like you did.
If you need a String, then you have to format that byte[] in a way that can be represented as a String (otherwise, just keep the byte[] around).
Two common ways of representing arbitrary byte[] as printable characters are BASE64 or simple hex-Strings (i.e. representing each byte by two hexadecimal digits). It looks like you're trying to produce a hex-String.
There's also another pitfall: if you want to get the SHA-1 of a Java String, then you need to convert that String to a byte[] first (as the input of SHA-1 is a byte[] as well). If you simply use myString.getBytes() as you showed, then it will use the platform default encoding and as such will be dependent on the environment you run it in (for example it could return different data based on the language/locale setting of your OS).
A better solution is to specify the encoding to use for the String-to-byte[] conversion like this: myString.getBytes("UTF-8"). Choosing UTF-8 (or another encoding that can represent every Unicode character) is the safest choice here.
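A small illustration of that pitfall (a sketch; the platform default depends on the JVM's file.encoding / locale):

import java.nio.charset.Charset;
import java.nio.charset.StandardCharsets;

public class CharsetPitfall {
    public static void main(String[] args) {
        String s = "pässword"; // contains a non-ASCII character
        byte[] platformBytes = s.getBytes();                   // environment-dependent
        byte[] utf8Bytes = s.getBytes(StandardCharsets.UTF_8); // always the same bytes
        System.out.println(Charset.defaultCharset() + ": "
                + platformBytes.length + " vs " + utf8Bytes.length);
    }
}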
This is a simple solution that can be used when converting a string to a hex format:
private static String encryptPassword(String password) throws NoSuchAlgorithmException, UnsupportedEncodingException {
MessageDigest crypt = MessageDigest.getInstance("SHA-1");
crypt.reset();
crypt.update(password.getBytes("UTF-8"));
return new BigInteger(1, crypt.digest()).toString(16);
}
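One caveat: toString(16) drops leading zeros, so a digest whose first bytes are zero yields fewer than 40 characters. If you need a fixed width, left-pad (same BigInteger approach):

// Pads to 40 hex chars so digests starting with 0x00 keep their full length.
return String.format("%040x", new BigInteger(1, crypt.digest()));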
Just use the Apache Commons Codec library. It has a utility class called DigestUtils.
No need to get into details.
As mentioned before, use Apache Commons Codec. It's recommended by the Spring folks as well (see DigestUtils in the Spring docs). E.g.:
DigestUtils.sha1Hex(b);
I definitely wouldn't use the top-rated answer here. It is not printing correctly because the raw digest needs a binary-to-text encoding; with Java 8 you can use the built-in Base64 encoder class.
public static String toSHA1(byte[] convertme) throws NoSuchAlgorithmException {
MessageDigest md = MessageDigest.getInstance("SHA-1");
return Base64.getEncoder().encodeToString(md.digest(convertme));
}
Result
This gives the compact Base64 form, W6ph5Mm5Pz8GgiULbPgzG37mj9g= for "password". Note that it is not the 40-character hex string 5baa61e4c9b93f3f0682250b6cf8331b7ee68fd8; use a hex encoder if you need that form.
Message Digest (hash) is byte[] in byte[] out
A message digest is defined as a function that takes a raw byte array and returns a raw byte array (aka byte[]). For example, SHA-1 (Secure Hash Algorithm 1) has a digest size of 160 bits, or 20 bytes. Raw byte arrays cannot usually be interpreted as character encodings like UTF-8, because not every byte sequence is legal in that encoding. So converting them to a String with:
new String(md.digest(subject), StandardCharsets.UTF_8)
might create illegal sequences or code points with undefined Unicode mappings:
[�a�ɹ??�%l�3~��.
Binary-to-text Encoding
For that, a binary-to-text encoding is used. With hashes, the one used most often is hex (Base16). A byte can have a value from 0 to 255 (or -128 to 127 signed), which is equivalent to the hex representation 0x00-0xFF. Hex therefore doubles the required output length: a 20-byte digest becomes a 40-character hex string, e.g.:
2fd4e1c67a2d28fced849ee1bb76e7391b93eb12
Note that hex encoding is not required; you could also use something like Base64. Hex is often preferred because it is easier for humans to read and has a defined output length without needing padding.
You can convert a byte array to hex with JDK functionality alone:
new BigInteger(1, token).toString(16)
Note however that BigInteger will interpret the given byte array as a number, not as a byte string. That means leading zeros are not output, and the resulting string may be shorter than 40 chars.
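A minimal illustration of that caveat, with one way to compensate:

import java.math.BigInteger;

public class LeadingZeros {
    public static void main(String[] args) {
        byte[] digest = new byte[20]; // pretend SHA-1 output starting with 0x00
        digest[1] = 0x2a;

        String unpadded = new BigInteger(1, digest).toString(16);          // 38 chars
        String padded = String.format("%040x", new BigInteger(1, digest)); // 40 chars
        System.out.println(unpadded.length() + " vs " + padded.length());
    }
}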
Using Libraries to Encode to HEX
You could now copy and paste an untested byte-to-hex method from Stack Overflow or use massive dependencies like Guava.
To have a go-to solution for most byte-related issues, I implemented a utility to handle these cases: bytes-java (GitHub).
To convert your message digest byte array you could just do
String hex = Bytes.wrap(md.digest(subject)).encodeHex();
or you could just use the built-in hash feature
String hex = Bytes.from(subject).hashSha1().encodeHex();
Base 64 Representation of SHA1 Hash:
String hashedVal = Base64.getEncoder().encodeToString(DigestUtils.sha1(stringValue.getBytes(Charset.forName("UTF-8"))));
Convert byte array to hex string.
public static String toSHA1(byte[] convertme) {
final char[] HEX_CHARS = "0123456789ABCDEF".toCharArray();
MessageDigest md = null;
try {
md = MessageDigest.getInstance("SHA-1");
}
catch(NoSuchAlgorithmException e) {
e.printStackTrace();
}
byte[] buf = md.digest(convertme);
char[] chars = new char[2 * buf.length];
for (int i = 0; i < buf.length; ++i) {
chars[2 * i] = HEX_CHARS[(buf[i] & 0xF0) >>> 4];
chars[2 * i + 1] = HEX_CHARS[buf[i] & 0x0F];
}
return new String(chars);
}
The reason this doesn't work is that when you call new String(md.digest(convertme)), you are telling Java to interpret a sequence of raw digest bytes as a String in the platform encoding. What you want is to convert those bytes into hexadecimal characters.
Maybe this helps (works on Java 17):
import org.apache.tomcat.util.codec.binary.Base64;
return new String(Base64.encodeBase64(md.digest(convertme)));
My problem is: what I encrypt in Java I can decrypt perfectly in Java, but PHP's mcrypt can't decrypt it. What I encrypt with mcrypt I can decrypt with mcrypt, but I can't in Java.
I want to send and receive encrypted data from a Java application to a PHP page, so I need it to be compatible.
Here's what I have...
JAVA...
public static String crypt(String input, String key){
byte[] crypted = null;
try{
SecretKeySpec skey = new SecretKeySpec(Base64.decodeBase64(key), "AES");
Cipher cipher = Cipher.getInstance("AES/ECB/PKCS5Padding");
cipher.init(Cipher.ENCRYPT_MODE, skey);
crypted = cipher.doFinal(input.getBytes());
}catch(Exception e){
}
return Base64.encodeBase64String(crypted);
}
public static String decrypt(String input, String key){
byte[] output = null;
try{
SecretKeySpec skey = new SecretKeySpec(Base64.decodeBase64(key), "AES");
Cipher cipher = Cipher.getInstance("AES/ECB/PKCS5Padding");
cipher.init(Cipher.DECRYPT_MODE, skey);
output = cipher.doFinal(Base64.decodeBase64(input));
}catch(Exception e){
}
return new String(output);
}
Running:
public static void main(String[] args) {
String key = "Zvzpv8/PXbezPCZpxzQKzL/FeoPw68jIb+NONX/LIi8=";
String data = "example";
System.out.println(Cpt.decrypt(Cpt.crypt(data, key), key));
}
Output:
example
PHP...
function getEncrypt($sStr, $sKey) {
return base64_encode(
mcrypt_encrypt(
MCRYPT_RIJNDAEL_256,
$sKey,
$sStr,
MCRYPT_MODE_ECB
)
);
}
function getDecrypt($sStr, $sKey) {
return mcrypt_decrypt(
MCRYPT_RIJNDAEL_256,
$sKey,
base64_decode($sStr),
MCRYPT_MODE_ECB
);
}
Running:
$crypt = getDecrypt(getEncrypt($str, $key), $key);
echo "<p>Crypt: $crypt</p>";
Output:
Crypt: example�������������������������
Using PHP to encrypt "example" with the key "Zvzpv8/PXbezPCZpxzQKzL/FeoPw68jIb+NONX/LIi8=" I get "YTYhgp4zC+w5IsViTR5PUkHMX4i7JzvA6NJT1FqhoGY=".
Using Java to encrypt the same thing with the same key I get "+tdAZqTE7WAVPXhB3Tp5+g==".
I'm encoding and decoding to base64 in the right order and I tested base64 encode and decode compatibility between Java and PHP and it's working.
BUG#1
MCRYPT_RIJNDAEL_256 is not AES. The 256 in that constant refers to the blocksize, not the keysize. Use MCRYPT_RIJNDAEL_128 to get the same algorithm as AES. The keysize is set just by the number of bytes in the key argument you supply. So supply 32 bytes and you get AES with a 256-bit key.
BUG#2
These two lines are never correct in Java and indicate a fundamental misunderstanding of the nature of the arbitrary binary data produced by cryptographic transforms:
output = cipher.doFinal(Base64.decodeBase64(input));
return new String(output);
There is nothing wrong with transmitting and storing byte[] directly, but if you must use only printable strings then you should base64 encode/decode to do so. As you are already using base64 extensively that would seem like the way to go. I would guess that the correct two lines would be:
output = cipher.doFinal(Base64.decodeBase64(input));
return new String(Base64.encodeBase64(output), "UTF-8");
EDIT:
Just kidding about bug #2. Really, I was wrong, I didn't notice it was the decrypt direction. Of course, if you know the decrypted byte[] is a valid string then it is perfectly correct to do what your code does.
I know this is an old topic, but I will add my working solution.
You have to rewrite the PHP side of the script:
function getEncrypt($sStr, $sKey) {
return base64_encode(
mcrypt_encrypt(
MCRYPT_RIJNDAEL_128,
base64_decode($sKey),
$sStr,
MCRYPT_MODE_ECB
)
);
}
function getDecrypt($sStr, $sKey) {
return mcrypt_decrypt(
MCRYPT_RIJNDAEL_128,
base64_decode($sKey),
base64_decode($sStr),
MCRYPT_MODE_ECB
);
}
You should base64_decode($sKey) because your key is base64 encoded.
$key = "Zvzpv8/PXbezPCZpxzQKzL/FeoPw68jIb+NONX/LIi8=";
Then, you need to create this function (credit goes to beltrachi from http://www.php.net/manual/en/function.mcrypt-decrypt.php):
function pkcs5_pad ($text, $blocksize) {
$pad = $blocksize - (strlen($text) % $blocksize);
return $text . str_repeat(chr($pad), $pad);
}
Use this code to encode/decode:
$decrypt = getDecrypt("6XremNEs1jv/Nnf/fRlQob6oG1jkge+5Ut3PL489oIo=", $key);
echo $decrypt;
echo "\n\n";
echo getEncrypt(pkcs5_pad("My very secret text:)", 16), $key);
I hope this will be useful for someone! :)
Please see here:
Difference in PHP encryption from iOS and .NET
AES Encrypt in C#, decrypt in PHP
DES Encryption in PHP and C#
The problem you're encountering is a padding issue. I don't know Java, but AES/ECB/PKCS5Padding looks like you're using PKCS#5 padding (essentially the same as PKCS#7), while PHP's mcrypt natively only supports zero-padding. This is what PKCS#5/7 does:
Pad the input with a padding string of between 1 and 8 bytes to make the total length an exact multiple of 8 bytes. The value of each byte of the padding string is set to the number of bytes added - i.e. 8 bytes of value 0x08, 7 bytes of value 0x07, ..., 2 bytes of 0x02, or one byte of value 0x01.
So the PHP code to do the padding right is trivial:
$blockSize = mcrypt_get_block_size(MCRYPT_RIJNDAEL_256, MCRYPT_MODE_ECB);
$padding = $blockSize - (strlen($data) % $blockSize);
$data .= str_repeat(chr($padding), $padding);
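For comparison, Java applies this padding automatically with PKCS5Padding; a sketch that makes the padding bytes visible (all-zero key purely for illustration):

import javax.crypto.Cipher;
import javax.crypto.spec.SecretKeySpec;

public class PaddingDemo {
    public static void main(String[] args) throws Exception {
        SecretKeySpec key = new SecretKeySpec(new byte[16], "AES"); // demo key only
        Cipher enc = Cipher.getInstance("AES/ECB/PKCS5Padding");
        enc.init(Cipher.ENCRYPT_MODE, key);
        byte[] ct = enc.doFinal("example".getBytes("UTF-8")); // 7 input bytes

        // Decrypt WITHOUT stripping padding to inspect the padded plaintext.
        Cipher dec = Cipher.getInstance("AES/ECB/NoPadding");
        dec.init(Cipher.DECRYPT_MODE, key);
        byte[] padded = dec.doFinal(ct);
        // 7 input bytes + 9 padding bytes, each of value 0x09.
        System.out.println(padded.length + " bytes, last = " + padded[padded.length - 1]);
    }
}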
Keep in mind that the strings must have the same encoding on both sides. Try converting the strings in both languages to UTF-8, for example, and then derive the binary data that gets encrypted:
PHP (see the utf8_encode() function):
$strAndBlob = utf8_encode("My string");
Java:
String str = "My string";
byte[] blob = str.getBytes("utf-8");
PHP, for example, may not use UTF-8 by default.