Comparing SHA256 Output in Java and PHP

This question may well be a duplicate, but so far I haven't seen any answer that solves my issue.
I have this piece of Java code hashing with SHA-256:
public static String hashIt(String msg, String key) {
    MessageDigest m = null;
    String hashText = null;
    byte[] actualKeyBytes = TripleDES.hexStringToBytes(key);
    try {
        m = MessageDigest.getInstance("SHA-256");
        m.update(actualKeyBytes, 0, actualKeyBytes.length);
        try {
            m.update(msg.getBytes("UTF-8"), 0, msg.length());
        } catch (UnsupportedEncodingException ex) {
        }
        hashText = TripleDES.bytesToHexString(m.digest()); // new BigInteger(1, m.digest()).toString(16);
    } catch (NoSuchAlgorithmException ex) {
    }
    return hashText;
}
Using d38a5cd5 as the key and "ewo10kalavanda" as the string to hash:
Utils.hashIt("ewo10kalavanda", "d38a5cd5");
I have the following output:
fc87c73012e11de3a57faabe4d852ce89ec3337504531c16
Using the same SHA256 in PHP
hash_hmac('SHA256', "ewo10kalavanda", "d38a5cd5", $raw=false)
The output is 1839412f79b9e33c2f810650f79f23f46173792f885dd8d8c9633675e28e792f which does not match that of Java.
Is there anything done wrong here? Been on this for some hours now.

In your PHP code you used HMAC, which is more than just hashing the string obtained by concatenating the key and the message body. The HMAC article on Wikipedia has a diagram explaining how HMAC-SHA1 works.
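To make the difference concrete, here is a minimal sketch of what the HMAC construction computes (per RFC 2104); it is for illustration only, and real code should use javax.crypto.Mac as in the working version below:

import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;
import java.util.Arrays;

// illustration of HMAC-SHA256: two nested hashes over padded, XORed copies of the key
public static byte[] hmacSha256(byte[] key, byte[] message) throws NoSuchAlgorithmException {
    final int blockSize = 64; // SHA-256 block size in bytes
    MessageDigest sha256 = MessageDigest.getInstance("SHA-256");
    if (key.length > blockSize) {
        key = sha256.digest(key); // keys longer than a block are hashed first; digest() also resets
    }
    byte[] k = Arrays.copyOf(key, blockSize); // zero-padded to the block size
    byte[] kIpad = new byte[blockSize];
    byte[] kOpad = new byte[blockSize];
    for (int i = 0; i < blockSize; i++) {
        kIpad[i] = (byte) (k[i] ^ 0x36);
        kOpad[i] = (byte) (k[i] ^ 0x5c);
    }
    sha256.update(kIpad);
    byte[] inner = sha256.digest(message); // H((K ^ ipad) || message)
    sha256.update(kOpad);
    return sha256.digest(inner);           // H((K ^ opad) || inner)
}

So simply hashing key || message, as your Java code does, cannot match PHP's hash_hmac output.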
I did manage to get a working version in Java:
import java.io.UnsupportedEncodingException;
import java.security.InvalidKeyException;
import java.security.NoSuchAlgorithmException;
import javax.crypto.Mac;
import javax.crypto.spec.SecretKeySpec;

private static final String HMAC_SHA256 = "HmacSHA256"; // standard JCA algorithm name

public static String hashIt(String msg, String key) {
    try {
        byte[] keyBytes = key.getBytes("UTF-8");
        SecretKeySpec spec = new SecretKeySpec(keyBytes, HMAC_SHA256);
        Mac mac = Mac.getInstance(HMAC_SHA256);
        mac.init(spec);
        return TripleDES.bytesToHexString(mac.doFinal(msg.getBytes("UTF-8")));
    } catch (UnsupportedEncodingException | NoSuchAlgorithmException | InvalidKeyException e) {
        throw new RuntimeException(e);
    }
}
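Note that PHP's hash_hmac uses the bytes of the key string as-is (here the eight ASCII characters of "d38a5cd5"), not its hex decoding, which is why this version uses key.getBytes("UTF-8") rather than TripleDES.hexStringToBytes. Usage mirrors the PHP call:

// should match hash_hmac('SHA256', "ewo10kalavanda", "d38a5cd5", $raw=false)
String mac = hashIt("ewo10kalavanda", "d38a5cd5");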
I still think there is something wrong with msg.length(): it returns the number of UTF-16 chars in the string, not the number of bytes in the UTF-8 encoding, so for any character that needs more than one byte the update() call truncates the data. I tested it and the outputs differ. For example, try your previous code to hash 111111錒 (message) with 1111 (key) and then try my suggested code on the same string. Your code outputs 81e385eb2bf89f7494a4b0927a4f5d4105450eb4a21152d53d52ddb9c08ed0e1 and my code outputs ef7f82833c865ef4d6089ba7dfbec8ad4f05b58e3fd77ca242c5fd7e7757d8b4.
The Chinese character is intentional: it demonstrates how the OP's code fails with multi-byte characters.

The other answer from glee8e should have got you a long way. But just to be sure, here is how to generate the output:
$k = hex2bin("d38a5cd5");
$m = "ewo10kalavanda";
$in = $k.$m;
$h = hash("SHA256", $in);
print $h;
It would be a bit better to first encode to UTF-8, but I haven't got the right module installed:
$m = mb_convert_encoding("ewo10kalavanda", "UTF-8");
For the test string this of course doesn't matter, as long as the platform encoding is compatible with UTF-8 for the input characters.
That's only half of the answer, though: there is a reason why HMAC was defined, and the major one is that hash functions on their own are not that secure as a keyed hash or Message Authentication Code (MAC). So the HMAC approach of the PHP function should be preferred.

Related

C# DESede Symmetric ECB Encryption (I am so close - slightly differing characters)!

My output that I have to match is from Java DESede using a BouncyCastle jar from 2005 ... I am very close...
Here is my output in Java (which is correct) followed by my output in C# ... if you view them in an editor, you will see they ALMOST match, except that where C# has a forward slash "/" Java has "%2F", where C# has "+" Java has "%2B", and at the end where C# has "=", Java has "%3D". Any ideas? (I added spaces to show they match up - but you will only see them in an editor.)
F3e8sdZ%2F951IRiguIAVqfDLyWptqlbWik5tvFzItcxJCEmupzD9wXp%2BDzIbrf2J2dPpXyEXL2QU%3D (Java - Correct)
F3e8sdZ/ 951IRiguIAVqfDLyWptqlbWik5tvFzItcxJCEmupzD9wXp+ DzIbrf2J2dPpXyEXL2QU= (C# - Close?)
Here is my C# Code:
public static string DoubleTrippleDESede(string strToEncode, ref string symKey, ref ICryptoTransform cipher)
{
    try
    {
        //byte[] input = Encoding.UTF8.GetBytes("DESede (3DES) Encryption in RAILO CFML");
        byte[] input = Encoding.UTF8.GetBytes(strToEncode);
        //byte[] key = Convert.FromBase64String("ru8femXhTm9jwdGdhb/4Sw==");
        byte[] key = Convert.FromBase64String(symKey);
        TripleDESCryptoServiceProvider algorithm = new TripleDESCryptoServiceProvider();
        algorithm.Mode = CipherMode.ECB;
        algorithm.BlockSize = 64;
        algorithm.KeySize = 192; // 24 byte key
        algorithm.Key = key; // Original
        //algorithm.Key = key.CopyTo(algorithm.Key,)
        cipher = algorithm.CreateEncryptor();
        byte[] encrypted = cipher.TransformFinalBlock(input, 0, input.Length);
        Debug.WriteLine("encrypted (.NET): {0}", Convert.ToBase64String(encrypted));
        return Convert.ToBase64String(encrypted);
    }
    catch (Exception ex)
    {
        return ex.Message;
    }
}
Any guidance would be greatly appreciated!!!! I've been at this for 2 weeks and finally can taste victory (I think!?)
Your Java output appears to have additionally been URL-encoded. On the C# side you should be able to call System.Uri.EscapeDataString() on the Base64 string to match your present output.
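For illustration, a small Java sketch (my own, not the 2005 code) showing that URL-encoding the Base64 string produces exactly those substitutions:

import java.net.URLEncoder;

// the C#-style Base64 output from above, with the alignment spaces removed
String b64 = "F3e8sdZ/951IRiguIAVqfDLyWptqlbWik5tvFzItcxJCEmupzD9wXp+DzIbrf2J2dPpXyEXL2QU=";
String escaped = URLEncoder.encode(b64, "UTF-8"); // throws UnsupportedEncodingException
// '/' becomes %2F, '+' becomes %2B, '=' becomes %3D, matching the "correct" Java output
System.out.println(escaped);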

Using PBKDF2 in Java and PHP

I'm having some difficulty producing the same encrypted password using the PBKDF2 algorithm on both Java and PHP.
I'm using the following Java implementation to generate the hash, with a random 16-byte salt. I'm storing the hash and salt separately in a MySQL database. However, when I do the same operation in PHP using the salt retrieved from the database, I get almost exactly the same hash, except that the PHP hash has a leading 0, and I cannot for the life of me figure out why.
Java:
public String hashPassword(String password, byte[] salt) {
    // ITERATIONS is 1000 and KEY_LENGTH is 160, as described below
    char[] passwordChars = password.toCharArray();
    PBEKeySpec spec = new PBEKeySpec(passwordChars, salt, ITERATIONS, KEY_LENGTH);
    SecretKeyFactory key = null;
    try {
        key = SecretKeyFactory.getInstance("PBKDF2WithHmacSHA1");
    } catch (NoSuchAlgorithmException e) {
        e.printStackTrace();
    }
    byte[] hashedPassword = null;
    try {
        hashedPassword = key.generateSecret(spec).getEncoded();
    } catch (InvalidKeySpecException e) {
        e.printStackTrace();
    }
    return String.format("%x", new BigInteger(hashedPassword));
}
I found the above code at https://adambard.com/blog/3-wrong-ways-to-store-a-password/
PHP:
$query = $database->query('SELECT * FROM USERS');
$password = 'hello';
$iterations = 1000;
foreach ($query as $user) {
    $hash = hash_pbkdf2("sha1", $password, $user['salt'], $iterations, 40, false);
}
echo $hash;
Note: There is only one user stored in the database, I know the above code isn't great, I created it quickly for testing purposes.
For both implementations I'm using an iteration count of 1000, with a key length of 160 bits in Java and 40 hex characters in PHP (to compensate for setting raw_output to false).
Java Output - 971f0dddc1bc2e899f2bca178f16ea79bfbbb13
PHP Output - 0971f0dddc1bc2e899f2bca178f16ea79bfbbb13
Any help is much appreciated, thank you.
It is the BigInteger conversion that is killing the leading 0.
Hashes are not integers; they are arrays of 8-bit bytes. Do not convert them to a BigInteger.
Either use the hash as a byte[] or encode it as a hexadecimal or Base64 string. To match PHP, hex-encode hashedPassword byte by byte.
PHP is returning a hexadecimal string encoded hash because raw_output is set to FALSE.
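As a minimal sketch (the helper name is my own), replacing the BigInteger line in the Java method with a per-byte hex encoding preserves the leading zero:

// encodes each byte as exactly two hex digits, so leading zeros survive
static String toHex(byte[] bytes) {
    StringBuilder sb = new StringBuilder(bytes.length * 2);
    for (byte b : bytes) {
        sb.append(String.format("%02x", b & 0xff));
    }
    return sb.toString();
}

// in hashPassword: return toHex(hashedPassword);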

port sha-1 hash from C# to Android/java

I need to port the following code from C#
private string hashcode(string code)
{
    byte[] bytes = Encoding.Unicode.GetBytes(code);
    byte[] inArray = HashAlgorithm.Create("SHA1").ComputeHash(bytes);
    return Convert.ToBase64String(inArray);
}
to an Android App. I have this in Java:
private static String hashCode(String userCode) {
    MessageDigest digest = null;
    try {
        digest = MessageDigest.getInstance("SHA-1");
    } catch (NoSuchAlgorithmException e) {
        e.printStackTrace();
    }
    digest.reset();
    byte[] data = digest.digest(userCode.getBytes());
    return Base64.encodeToString(data, Base64.DEFAULT);
}
Alas, this code does not produce the same results. Finding out why is a pretty wild goose chase.
What can I do in Android to get the same hashes?
This is the C# code you've got for converting the userCode string to bytes:
byte[] bytes = Encoding.Unicode.GetBytes(code);
And the Java code is just:
userCode.getBytes()
That means in Java you're using the platform-default encoding, which is UTF-8 on Android. You should specify the equivalent of Encoding.Unicode, which is StandardCharsets.UTF_16LE.
You should pretty much never call either new String(byte[]) or String.getBytes() without arguments, as both use the platform-default encoding. Always specify the encoding to use.
On a more general point, you said that "finding out why is a pretty wild goose chase" - but in situations like this, the solution is almost always to log in detail one step in the transformation at a time. You have three transformations here:
String to byte[] (encoding userCode as binary data)
SHA-1 hashing (byte[] to byte[])
Encoding the hash as base64
If you had taken each of those steps individually, and logged the results, you'd have spotted that the problem was in the first step.
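A hedged sketch of that logging approach (TAG and the method name are my own inventions):

// log each transformation separately to see where the outputs diverge
static String hashCodeLogged(String userCode) throws NoSuchAlgorithmException {
    byte[] encoded = userCode.getBytes(StandardCharsets.UTF_16LE);      // step 1: encode
    Log.d(TAG, "encoded: " + Arrays.toString(encoded));
    byte[] hashed = MessageDigest.getInstance("SHA-1").digest(encoded); // step 2: SHA-1
    Log.d(TAG, "digest:  " + Arrays.toString(hashed));
    String b64 = Base64.encodeToString(hashed, Base64.NO_WRAP);         // step 3: Base64
    Log.d(TAG, "base64:  " + b64);
    return b64;
}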
Android has UTF-8 as its default character set. Try String.getBytes(Charset charset) with StandardCharsets.UTF_16LE instead.
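Putting the pieces together, a sketch of the corrected method (StandardCharsets requires API level 19+; on older APIs use Charset.forName("UTF-16LE")):

private static String hashCode(String userCode) {
    try {
        MessageDigest digest = MessageDigest.getInstance("SHA-1");
        // Encoding.Unicode in C# is little-endian UTF-16, so match it here
        byte[] data = digest.digest(userCode.getBytes(StandardCharsets.UTF_16LE));
        // NO_WRAP avoids the trailing newline that Base64.DEFAULT appends
        return Base64.encodeToString(data, Base64.NO_WRAP);
    } catch (NoSuchAlgorithmException e) {
        throw new RuntimeException(e); // SHA-1 is available on every Android release
    }
}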

RSA result with negative symbol in JAVA

I am not an expert in cryptography, and I am getting some interesting results when I use the encryption methods below.
The server is .NET C# and the client runs Java. Basically, we encrypt credit card information, and for the 12 credit cards I have, 11 work perfectly with the methods below.
However, for one of the cards (a real VISA credit card) the result returned by encrypt() and converted to hex has a negative sign at the start of the string, like this:
-6d9830a52b2c3add7a78fd9897bca19d..... It fails when the server tries to decrypt it, and I think it should be positive, not negative, based on this explanation: RSA - Encryption with negative exponent
private static byte[] encrypt(String text, PublicKey pubRSA) throws Exception {
    Cipher cipher = Cipher.getInstance(RSA);
    cipher.init(Cipher.ENCRYPT_MODE, pubRSA);
    return cipher.doFinal(text.getBytes());
}
// Using this encryption method, one card could not be decrypted by vPAY
// due to the negative sign; it may have the same effect with other cards.
public final static byte[] encrypt(String text) {
    try {
        KeyFactory keyFactory = KeyFactory.getInstance("RSA");
        X509EncodedKeySpec x509Spec = new X509EncodedKeySpec(Base64.decode(pkBase64));
        PublicKey pk = keyFactory.generatePublic(x509Spec);
        return encrypt(text, pk);
    } catch (Exception e) {
        e.printStackTrace();
    }
    return null;
}
Has anyone faced something like that and found a workaround?
I have tried three other algorithms with different KeySpec and the same publicKey (the source is a string in base64 format) but none of them could be decrypted by the server even with the cards the were working before...
UPDATE 1
This is how I convert the encrypted result bytes to hex:
public static String byteToHex(byte[] string) {
    try {
        return String.format("%04x", new BigInteger(string));
    } catch (Exception e) {
        return null;
    }
}
You should print out the hexadecimal string directly from byte[]. This can be done using the following code:
StringBuilder sb = new StringBuilder(data.length * 2);
for (int i = 0; i < data.length; i++) {
    sb.append(String.format("%02X", data[i] & 0xFF));
}
return sb.toString();
There is no need to use BigInteger. In fact, it is dangerous to use BigInteger. One reason is the one you've already encountered: BigInteger conversion to/from byte[] uses signed big-endian encoding by default. The other is that the RSA output, interpreted as an integer, may be smaller than the modulus size in hexadecimal digits, so leading zeros get dropped. This is why EJP's solution will fail now and then.
RSA output is defined in bytes, as an unsigned big-endian number encoded in the same number of bits as the key size (using the integer-to-octet-string encoding, I2OSP, in the standards documents).
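As a hedged sketch of that fixed-length encoding (the helper name is my own choosing):

// left-pads (or trims a sign byte from) a non-negative BigInteger
// so the result is exactly `length` bytes, as RSA output requires
static byte[] toFixedLength(BigInteger n, int length) {
    byte[] raw = n.toByteArray(); // signed big-endian: may carry an extra leading 0x00, or be short
    byte[] out = new byte[length];
    int copy = Math.min(raw.length, length);
    System.arraycopy(raw, raw.length - copy, out, length - copy, copy);
    return out;
}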
public static String byteToHex(byte[] string)
A byte[] is not a string. It's a byte array. Don't confuse yourself with inappropriate variable names. String is not a container for binary data.
return String.format("%04x", new BigInteger(string));
Try return new BigInteger(1,string).toString(16), and have a look at the Javadoc to see why this works where new BigInteger(string) didn't.
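A quick snippet showing the difference, and the remaining caveat about leading zeros:

byte[] data = { (byte) 0x80, 0x01 };
System.out.println(new BigInteger(data).toString(16));    // -7fff: high bit read as a sign bit
System.out.println(new BigInteger(1, data).toString(16)); // 8001: signum 1 forces non-negative
System.out.println(new BigInteger(1, new byte[] { 0x00, 0x12 }).toString(16)); // 12: leading zeros still dropped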

AES-128 Encryption not working on Java < 1.7

I've been chipping away at a school assignment for 3 days, and finally finished it today, error-free and working fine! Except, I was testing it on Java 1.7, and the school servers (where the professor will compile it) run 1.6. So, I tested my code on 1.6, wanting to cover all my bases, and I get a BadPaddingException upon decryption.
[EDIT] Warning: this code does not follow common security practices and should not be used in production code.
Originally, I had this, which works fine on 1.7 (sorry, lots of code.. all relevant..):
public static String aes128(String key, String data, final int direction) {
    SecureRandom rand = new SecureRandom(key.getBytes());
    byte[] randBytes = new byte[16];
    rand.nextBytes(randBytes);
    SecretKey encKey = new SecretKeySpec(randBytes, "AES");
    Cipher cipher = null;
    try {
        cipher = Cipher.getInstance("AES");
        cipher.init((direction == ENCRYPT ? Cipher.ENCRYPT_MODE : Cipher.DECRYPT_MODE), encKey);
    } catch (InvalidKeyException e) {
        return null;
    } catch (NoSuchPaddingException e) {
        return null;
    } catch (NoSuchAlgorithmException e) {
        return null;
    }
    try {
        if (direction == ENCRYPT) {
            byte[] encVal = cipher.doFinal(data.getBytes());
            String encryptedValue = Base64.encode(encVal);
            return encryptedValue;
        } else {
            byte[] dataBytes = Base64.decode(data);
            byte[] encVal = cipher.doFinal(dataBytes);
            return new String(encVal);
        }
    } catch (NullPointerException e) {
        return null;
    } catch (BadPaddingException e) {
        return null;
    } catch (IllegalBlockSizeException e) {
        return null;
    }
}
However, my BadPaddingException catch block executes upon decryption:
javax.crypto.BadPaddingException: Given final block not properly padded
at com.sun.crypto.provider.SunJCE_f.b(DashoA13*..)
at com.sun.crypto.provider.SunJCE_f.b(DashoA13*..)
at com.sun.crypto.provider.AESCipher.engineDoFinal(DashoA13*..)
at javax.crypto.Cipher.doFinal(DashoA13*..)
at CipherUtils.aes128(CipherUtils.java:112)
at CipherUtils.decryptFile(CipherUtils.java:44)
at decryptFile.main(decryptFile.java:21)
This is what I tried to fix it (basically, I added all the padding/unpadding myself, and used NoPadding):
public static String aes128(String key, String data, final int direction) {
    // PADCHAR = (char) 0x10 as a String
    while (key.length() % 16 > 0)
        key = key + PADCHAR; // Added this loop
    SecureRandom rand = new SecureRandom(key.getBytes());
    byte[] randBytes = new byte[16];
    rand.nextBytes(randBytes);
    SecretKey encKey = new SecretKeySpec(randBytes, "AES");
    AlgorithmParameterSpec paramSpec = new IvParameterSpec(key.getBytes()); // Created this
    Cipher cipher = null;
    try {
        cipher = Cipher.getInstance("AES/CBC/NoPadding"); // Added CBC/NoPadding
        cipher.init((direction == ENCRYPT ? Cipher.ENCRYPT_MODE : Cipher.DECRYPT_MODE), encKey, paramSpec); // Added paramSpec
    } catch (InvalidKeyException e) {
        return null;
    } catch (NoSuchPaddingException e) {
        return null;
    } catch (NoSuchAlgorithmException e) {
        return null;
    } catch (InvalidAlgorithmParameterException e) {
        return null; // Added this catch block
    }
    try {
        if (direction == ENCRYPT) {
            while (data.length() % 16 > 0)
                data = data + PADCHAR; // Added this loop
            byte[] encVal = cipher.doFinal(data.getBytes());
            String encryptedValue = Base64.encode(encVal);
            return encryptedValue;
        } else {
            byte[] dataBytes = Base64.decode(data);
            byte[] encVal = cipher.doFinal(dataBytes);
            return new String(encVal);
        }
    } catch (NullPointerException e) {
        return null;
    } catch (BadPaddingException e) {
        return null;
    } catch (IllegalBlockSizeException e) {
        return null;
    }
}
When using this, I just get gibberish in and out:
Out: u¢;èÉ÷JRLòB±J°N°[9cRÐ{ªv=]I¯¿©:
´RLA©êí;R([¶Ü9¸ßv&%®µ^#û|Bá (80)
Unpadded: u¢;èÉ÷JRLòB±J°N°[9cRÐ{ªv=]I¯¿©:
´RLA©êí;R([¶Ü9¸ßv&%®µ^#û|Bá (79)
It is also worth noting that 1.6 and 1.7 produce different encrypted strings.
For example, on 1.7, encrypting xy (including a SHA-1 hash) with key hi produces:
XLUVZBIJv1n/FV2MzaBK3FLPQRCQF2FY+ghyajdqCGsggAN4aac8bfwscrLaQT7BMHJgfnjJLn+/rwGv0UEW+dbRIMQkNAwkGeSjda3aEpk=
On 1.6, the same thing produces:
nqeahRnA0IuRn7HXUD1JnkhWB5uq/Ng+srUBYE3ycGHDC1QB6Xo7cPU6aEJxH7NKqe3kRN3rT/Ctl/OrhqVkyDDThbkY8LLP39ocC3oP/JE=
I didn't expect the assignment to take so long, so my time has run out and it does need to be done tonight. If there is no answer by then, I'll just leave a note for my teacher about this. It appears to be some issue that was fixed in 1.7... though hopefully it can be remedied through the correct addition/fix in my code.
Thanks a ton for everyone's time!
First off:
For almost all systems, encrypting the same plaintext twice should always (i.e. with very very high probability) produce different ciphertext.
The traditional example is that it allows a CPA adversary to distinguish E("attack at dawn") from E("attack at dusk") with only two queries. (There are a handful of systems where you want deterministic encryption, but the right way to do this is "synthetic IV" or cipher modes like CMC and EME.)
Ultimately, the problem is that SecureRandom is not intended for key derivation: the mapping from seed to output stream is an implementation detail that providers are free to change between versions.
If the input "key" is a passphrase, you should be using something like PBKDF2 (or scrypt() or bcrypt()).
Additionally, you should be using an explicit charset, e.g. String.getBytes("UTF-8").
If the input "key" is a key, the most common string representation is a hexdump. Java doesn't include an unhexing function, but there are several here.
If the input is a "master key" and you want to derive a subkey, then you should be hashing it with other data. There's not much point if the subkey is always the same.
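A minimal sketch of the passphrase case, with iteration count and key size chosen arbitrarily for illustration:

import java.security.SecureRandom;
import javax.crypto.SecretKeyFactory;
import javax.crypto.spec.PBEKeySpec;
import javax.crypto.spec.SecretKeySpec;

// derive a 128-bit AES key from the passphrase instead of seeding SecureRandom with it
static SecretKeySpec deriveKey(String passphrase, byte[] salt) throws Exception {
    // the salt must be random per password and stored alongside the ciphertext
    PBEKeySpec spec = new PBEKeySpec(passphrase.toCharArray(), salt, 10000, 128);
    SecretKeyFactory factory = SecretKeyFactory.getInstance("PBKDF2WithHmacSHA1");
    return new SecretKeySpec(factory.generateSecret(spec).getEncoded(), "AES");
}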
Additional nitpicks:
Your code is vulnerable to a padding oracle attack; you really should be verifying a MAC before doing anything with the data (or better, using an authenticated encryption mode).
In your second listing, you derive the IV from the key, so you explicitly reuse the same IV for every message. Bad! Assuming CBC mode, the IV should be unpredictable, and SecureRandom is the right tool here; a sketch follows.
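A minimal sketch of the random-IV pattern (the method is mine, not the assignment's code):

import java.security.SecureRandom;
import javax.crypto.Cipher;
import javax.crypto.SecretKey;
import javax.crypto.spec.IvParameterSpec;

// encrypt with a fresh random IV and prepend it to the ciphertext;
// the IV is not secret, so the decryptor can simply read it back off the front
static byte[] encryptCbc(SecretKey key, byte[] plaintext) throws Exception {
    byte[] iv = new byte[16];
    new SecureRandom().nextBytes(iv);
    Cipher cipher = Cipher.getInstance("AES/CBC/PKCS5Padding");
    cipher.init(Cipher.ENCRYPT_MODE, key, new IvParameterSpec(iv));
    byte[] ct = cipher.doFinal(plaintext);
    byte[] out = new byte[iv.length + ct.length];
    System.arraycopy(iv, 0, out, 0, iv.length);
    System.arraycopy(ct, 0, out, iv.length, ct.length);
    return out;
}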
I've been looking over and over, and I have to agree with NullUserException: the problem is the use of SecureRandom. You never really know what key you actually get, and it is not guaranteed to be the same key from one implementation to the next.
encKey comes from SecureRandom, which is seeded by the key provided. Therefore, if the key is the same, the seed is the same, so the random should be the same...
...unless of course Oracle (or another provider) changes the implementation between versions.
Okay, adding more information that I researched. I think this answer was most helpful.
Get password and cleartext from the user, and convert them to byte arrays.
Generate a secure random salt.
Append the salt to the password and compute its cryptographic hash. Repeat this many times.
Encrypt the cleartext using the resulting hash as the initialization vector and/or secret key.
Save the salt and the resulting ciphertext.
To me, it sounds like SecureRandom is used once to generate a salt, but the salt must then be saved with the ciphertext in order to undo the ciphering process. Additional resistance to brute force comes from repeating the hashing step many times (key stretching).
Note: I couldn't find any consensus that these steps are best practices.
