I am not an expert in cryptography and I am getting some interesting results when I use the encryption method below.
The server is .NET C# and the client runs Java. Basically, we encrypt credit card information, and of the 12 credit cards I have, 11 work perfectly with the methods below.
However, for one of the cards (a real VISA credit card) the result returned by encrypt() and converted to hex has a negative sign at the start of the string, like this:
-6d9830a52b2c3add7a78fd9897bca19d....., and it fails when the server tries to decrypt it. I think it should be positive, not negative, based on this explanation: RSA - Encryption with negative exponent
private static byte[] encrypt(String text, PublicKey pubRSA) throws Exception
{
    Cipher cipher = Cipher.getInstance(RSA);
    cipher.init(Cipher.ENCRYPT_MODE, pubRSA);
    return cipher.doFinal(text.getBytes());
}
//Using this encryption method, one card could not be decrypted by vPAY due to the negative sign.
//It may have the same effect with other cards
public final static byte[] encrypt(String text)
{
    try {
        KeyFactory keyFactory = KeyFactory.getInstance("RSA");
        X509EncodedKeySpec x509Spec = new X509EncodedKeySpec(Base64.decode(pkBase64));
        PublicKey pk = keyFactory.generatePublic(x509Spec);
        return encrypt(text, pk);
    }
    catch (Exception e)
    {
        e.printStackTrace();
    }
    return null;
}
Has anyone faced something like that and found a workaround?
I have tried three other algorithms with different KeySpecs and the same public key (the source is a string in base64 format), but none of them could be decrypted by the server, even with the cards that were working before...
UPDATE 1
This is how I convert the encrypted result in bytes to hex:
public static String byteToHex(byte[] string)
{
    try {
        return String.format("%04x", new BigInteger(string));
    } catch (Exception e) {
        // TODO Auto-generated catch block
        return null;
    }
}
You should print out the hexadecimal string directly from byte[]. This can be done using the following code:
public static String byteToHex(byte[] data) {
    StringBuilder sb = new StringBuilder(data.length * 2);
    for (int i = 0; i < data.length; i++) {
        sb.append(String.format("%02X", data[i] & 0xFF));
    }
    return sb.toString();
}
There is no need to use BigInteger. In fact, it is dangerous to use BigInteger. One reason is the one you've already encountered: BigInteger conversion to/from byte[] uses signed big-endian encoding by default. The other is that the output of the RSA operation, viewed as an integer, may be smaller than the modulus, so its hexadecimal representation can be shorter than the modulus size. This is why EJP's solution will fail now and then.
RSA output is defined in bytes: an unsigned big-endian number encoded in exactly as many bytes as the modulus (the integer-to-octet-string encoding in the standard documents).
public static String byteToHex(byte[] string)
A byte[] is not a string. It's a byte array. Don't confuse yourself with inappropriate variable names. String is not a container for binary data.
return String.format("%04x", new BigInteger(string));
Try return new BigInteger(1,string).toString(16), and have a look at the Javadoc to see why this works where new BigInteger(string) didn't.
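Note that even new BigInteger(1, string).toString(16) drops leading zero bytes and may return an odd-length string, so if you go this route you also have to left-pad the hex string to the full modulus length. A minimal sketch, where encrypted and keyLengthBytes are assumed names (e.g. keyLengthBytes = 256 for a 2048-bit RSA key):

StringBuilder hex = new StringBuilder(new BigInteger(1, encrypted).toString(16));
while (hex.length() < keyLengthBytes * 2) {
    hex.insert(0, '0');   // restore the leading zeros dropped by BigInteger
}
return hex.toString();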
Related
I am trying to create a program in Java, part of which uses AES encryption to encrypt data, for my final project in a coding class. Here is the code that I am using for my encryption:
static String symmetric(String info, String key, String mode) {
    try {
        Cipher c = Cipher.getInstance("AES/ECB/PKCS5Padding");
        byte[] bytes = Base64.getDecoder().decode(Crypto.sha256(key));
        byte[] information = Base64.getDecoder().decode(info);
        Key k = new SecretKeySpec(bytes, "AES");
        if (mode.equals("ENCRYPT")) {
            c.init(Cipher.ENCRYPT_MODE, k);
        } else if (mode.equals("DECRYPT")) {
            c.init(Cipher.DECRYPT_MODE, k);
        }
        return (Base64.getEncoder().encodeToString(c.doFinal(information)).trim());
    } catch (Exception e) {
        JOptionPane.showMessageDialog(null, e.getMessage());
    }
    return (null);
}
When I encrypt my data using String cipherText = symmetric("message", "key", "ENCRYPT") and decrypt the ciphertext using symmetric(cipherText, "key", "DECRYPT"), the string it returns is "messagc=". I'm worried that the padding is weird, but I don't know how to fix it.
FYI: Crypto.sha256(String input) is a method I created that returns the SHA-256 hash of its input as a base64 string. Here is the code for it if it helps:
public static String sha256(String input) {
    try {
        MessageDigest digest = MessageDigest.getInstance("SHA-256");
        byte[] tempHash = digest.digest(input.getBytes(StandardCharsets.UTF_8));
        return (Base64.getEncoder().encodeToString(tempHash));
    } catch (NoSuchAlgorithmException e) {
        JOptionPane.showMessageDialog(null, e.getMessage());
    }
    return (null);
}
Also, I know ECB is not secure compared to modes that use initialization vectors, but this is a small project and I don't have enough time for that, which is also why I'm not salting my hashes. Is there anything I can do to fix it?
This is a problem with the way you are using base-64 encoding.
When you encrypt, you are treating "message" as base-64 encoded bytes. The last block is "age". A strict decoder would reject that input, because it is missing padding, and has some extra bits that spill over into the third byte. But a permissive decoder ignores that, and decodes the array as { 0x99, 0xeb, 0x2c, 0x6a, 0x07 }
The correct base-64 encoding of { 0x99, 0xeb, 0x2c, 0x6a, 0x07 } is "messagc=".
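You can check this with the JDK decoder itself (a small illustration; Base64 and Arrays here are java.util.Base64 and java.util.Arrays):

byte[] decoded = Base64.getDecoder().decode("message");          // permissive about the missing padding
System.out.println(Arrays.toString(decoded));                    // [-103, -21, 44, 106, 7] = 0x99 0xEB 0x2C 0x6A 0x07
System.out.println(Base64.getEncoder().encodeToString(decoded)); // prints "messagc="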
To make this work correctly, how the input and output are treated has to differ depending on the mode flag: when encrypting, the input is plain text and the output should be base-64 encoded; when decrypting, the input is base-64 and the output is plain text. It would be clearer and cleaner to separate the encrypt and decrypt methods, as in the sketch below.
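A minimal sketch of what that separation could look like, reusing the Crypto.sha256 helper from the question (the encrypt/decrypt method names are mine, and javax.crypto.Cipher, javax.crypto.spec.SecretKeySpec, java.util.Base64 and java.nio.charset.StandardCharsets are assumed to be imported):

static String encrypt(String plainText, String key) throws Exception {
    Cipher c = Cipher.getInstance("AES/ECB/PKCS5Padding");
    SecretKeySpec k = new SecretKeySpec(Base64.getDecoder().decode(Crypto.sha256(key)), "AES");
    c.init(Cipher.ENCRYPT_MODE, k);
    // plain text in, base-64 ciphertext out
    byte[] cipherBytes = c.doFinal(plainText.getBytes(StandardCharsets.UTF_8));
    return Base64.getEncoder().encodeToString(cipherBytes);
}

static String decrypt(String cipherText, String key) throws Exception {
    Cipher c = Cipher.getInstance("AES/ECB/PKCS5Padding");
    SecretKeySpec k = new SecretKeySpec(Base64.getDecoder().decode(Crypto.sha256(key)), "AES");
    c.init(Cipher.DECRYPT_MODE, k);
    // base-64 ciphertext in, plain text out
    byte[] plainBytes = c.doFinal(Base64.getDecoder().decode(cipherText));
    return new String(plainBytes, StandardCharsets.UTF_8);
}

With this split, decrypt(encrypt("message", "key"), "key") returns "message" instead of "messagc=".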
The padding problem
1) I'm encrypting a message in Ruby using a public key with PKCS1_PADDING.
2) Then I convert the output (which is ASCII-8BIT encoded) to hex and send it to the Android device.
3) On Android I convert the hex to a byte array and decrypt it using the private key, but I get a lot of additional characters. (On the Android side it defaults to RSA/NONE/PKCS1Padding.)
Example:
Expected string: hello how are you doing ?
Actual string: V')f�rBA�;\�:�D��.a�~�A#�.P�(� �l��-�ך��\�0}�nj.F�#Ƨ�Wr[��k��Ez��o��偣�r�����K����1D�涮���U!�t�.UI?�gA��|X��o#v�K��Ə����'��n�F������
P܆�0��9m9*u�٘S�1�������<>�L�?��;3�_���~�-)�$�����Ũ
*"���%/Oѡ�k#��hello how are you doing?
JAVA CODE:
public String Decrypt(String result, String privKey) throws NoSuchAlgorithmException, NoSuchPaddingException, InvalidKeyException, IllegalBlockSizeException, BadPaddingException
{
    PrivateKey privateKey = getPrivateKeyFromString(privKey);
    Cipher cipher1 = Cipher.getInstance("RSA");
    cipher1.init(Cipher.DECRYPT_MODE, privateKey);
    String decrypted = "";
    try {
        byte[] bytes = hexStringToByteArray(result);
        byte[] decryptedBytes = cipher1.doFinal(bytes);
        decrypted = new String(decryptedBytes);
    } catch (Exception e)
    {
        e.printStackTrace();
    }
    return decrypted;
}
public static byte[] hexStringToByteArray(String s) {
    int len = s.length();
    byte[] data = new byte[len / 2];
    for (int i = 0; i < len; i += 2) {
        data[i / 2] = (byte) ((Character.digit(s.charAt(i), 16) << 4) + Character.digit(s.charAt(i + 1), 16));
    }
    return data;
}
RUBY CODE:
require 'openssl'
require 'base64'
public_key = "MIIBIjANBgkqhkiG9w0BAQEFAAOCAQ8AMIIBCgKCAQEAn6fT8ScFrW2FR5bxTeFzsD77nN1W+gL5XUB1yQVNL699y6WISopbQ6lls76XvKfyhJHn7ca8i5rDRXrNnaY1BVvX9n/jKWLw13AQcVG4SjMewMQbW1KXOWFe2cltGxB7dX+4xlnxRtXz26xtOpEoBdMN2LBB39WdMghaLIrzcNu9uj363KK8szs9x9rO9E5BNfaqePFwajJoOXjkc5PUwRHeW2DodQnKfxJhaBwotoBbD6zrx+XPqpEzXD7XLjq2i/MGEuw6XGLCGQ+/zaytiYCDe8gboQ5WkWQtfa0FALve9zguqjpoNouWaK4SBq1kyeFKsdsbmZLC8NdJlSruUQIDAQAB"
rsa_public_key = OpenSSL::PKey::RSA.new(Base64.decode64(public_key))
encrypted_string = rsa_public_key.public_encrypt('hello how are you doing ?', OpenSSL::PKey::RSA::PKCS1_PADDING)
encrypted_string.unpack("H*")
For maximum portability you should use "RSA/ECB/PKCS1Padding" as the initialization string for the cipher.
This string has actually been defined in the Java Standard Algorithm Names as required for any Java implementation. Of course Android isn't officially Java, but you can be sure Google tries to keep it as close to Java as it gets. So this should be compatible with any Java(-ish) implementation.
What is not required by Sun is that the mode of operation ("ECB" in the above string) and padding scheme ("PKCS1Padding") are the defaults for "RSA". That's why you have to specify them explicitly. Never rely on provider-specific defaults - except when specifying the random number generator.
What you currently get is the "RSA/ECB/NoPadding" scheme, which leaves all the padding intact. So when you look at the plaintext size in bytes it will be identical to the size in bytes of the modulus, and the contents will be the PKCS#1 padding, which is (mostly) randomized for each encryption. Random bytes cannot easily be converted to text, so what you get back mostly looks like garbage.
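Applied to the Decrypt method from the question, the change is just the transformation string (and, per the notes below, an explicit character set); a sketch reusing the question's getPrivateKeyFromString and hexStringToByteArray helpers:

Cipher cipher1 = Cipher.getInstance("RSA/ECB/PKCS1Padding");            // explicit mode and padding
cipher1.init(Cipher.DECRYPT_MODE, getPrivateKeyFromString(privKey));
byte[] decryptedBytes = cipher1.doFinal(hexStringToByteArray(result));
String decrypted = new String(decryptedBytes, StandardCharsets.UTF_8);  // explicit charset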
Notes:
"ECB" is a bit of a misnomer by Sun, it should have been "None" as only one block of plaintext can be encrypted (in general);
you should also make the character set explicit when converting bytes to a string, even though Android uses UTF-8 by default (Java on Windows uses the Windows-1252 encoding!);
the best random number generation is generally pretty platform specific, so using a specific algorithm may actually lower the security of your implementation, doubly so for the ill-defined "SHA1PRNG".
This is the code used in C#:
public static string Encode_SHA512(string input) {
    try {
        using (SHA512 sha = SHA512.Create()) {
            byte[] hash = sha.ComputeHash(Encoding.Unicode.GetBytes(input));
            return Convert.ToBase64String(hash);
        }
    } catch (Exception ex) {
        throw new Exception("Error al generar hash SHA512", ex);
    }
}
And this is the code used in Java (one of many attempts):
public static String Encode_SHA512(String input) {
    MessageDigest md = MessageDigest.getInstance("SHA-512");
    byte[] digest = md.digest(input.getBytes("UTF-16LE"));
    return String.format("%0128x", new BigInteger(1, digest));
}
But the result is always different. How can I get the same result as the C# code in Java?
The problem is that in your Java version you are not converting the hash to a base64-encoded string (you are formatting it as hex). If you update your Java code to the following, both produce the same hash:
public static String Encode_SHA512(String input) throws NoSuchAlgorithmException, UnsupportedEncodingException {
    MessageDigest md = MessageDigest.getInstance("SHA-512");
    byte[] inputBytes = input.getBytes("UTF-16LE");
    byte[] digest = md.digest(inputBytes);
    return Base64.getEncoder().encodeToString(digest);
}
The line return String.format("%0128x", new BigInteger(1, digest)); is replaced with return Base64.getEncoder().encodeToString(digest);, which base64-encodes the resulting hash. Perhaps what made the code confusing is that the byte[] produced by the digest method looked different in the two debuggers: in Java the byte type is two's complement (signed, -128 to 127), whereas in C# it is unsigned (0 to 255), so any byte whose leading bit is 1 appears negative in Java but positive in C#.
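A quick illustration of that signed/unsigned difference (just for clarity, not part of the fix):

byte b = (byte) 0x9F;             // bit pattern 1001 1111
System.out.println(b);            // prints -97: Java bytes are signed two's complement
System.out.println(b & 0xFF);     // prints 159: the unsigned value a C# byte would show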
In my Android app I have a SHA256 hash which I must further hash with the RIPEMD160 message digest algorithm.
I can output the correct sha256 and ripemd160 hash of any string, but when I try to hash the sha256 hash with ripemd160 I get a hash which is incorrect.
According to online hash calculators, the SHA256 value of the string 'test' (all lowercase) is:
9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08
And the RIPEMD160 value of the string 'test' is:
5e52fee47e6b070565f74372468cdc699de89107
The value from hashing the resulting sha256 hash with ripemd160 according to online calcs is:
4efc1c36d3349189fb3486d2914f56e05d3e66f8
And the one my app gives me is:
cebaa98c19807134434d107b0d3e5692a516ea66
which is obviously wrong.
Here is my code:
public static String toRIPEMD160(String in)
{
    byte[] addr = in.getBytes();
    byte[] out = new byte[20];
    RIPEMD160Digest digest = new RIPEMD160Digest();
    byte[] sha256 = sha256(addr);
    digest.update(sha256, 0, sha256.length);
    digest.doFinal(out, 0);
    return getHexString(out);
}
public static byte[] sha256(byte[] data)
{
    byte[] sha256 = new byte[32];
    try
    {
        sha256 = MessageDigest.getInstance("SHA-256").digest(data);
    }
    catch (NoSuchAlgorithmException e)
    {}
    return sha256;
}
For the RIPEMD-160 algorithm you need Bouncy Castle (RIPEMD160Digest); for SHA-256, java.security.MessageDigest is enough.
Your "online calculator" result is the result of hashing the bytes of the string "test" with SHA-256, converting the result of that hash to a hex string, then taking the bytes corresponding to the ASCII characters of that hex string and hashing those a second time. This is very different from your Java code, which passes the bytes that come out of the first hash directly to the second one, without printing them as hex and turning those characters back into bytes in between. The single byte with value 254 (decimal) becomes "fe" in hex, which becomes the two-byte sequence [0x66, 0x65] when converted back to bytes.
Your hash is working fine. The problem is that the online calculators that you're using are treating your input:
9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08
as a string instead of an array of bytes. In other words, it's treating each character as a byte instead of parsing character pairs as bytes in hexadecimal. If I give this as a string to online calculators, I indeed get exactly what you got:
4efc1c36d3349189fb3486d2914f56e05d3e66f8
However, you're treating the output as an array of bytes instead of a String and that's giving you different results. You should encode your raw SHA256 hash as a string, then pass the encoded string to the hash function. I see you have a getHexString method, so we'll just use that.
public static String toRIPEMD160(String in) {
    try {
        byte[] addr = in.getBytes();
        byte[] out = new byte[20];
        RIPEMD160Digest digest = new RIPEMD160Digest();
        // These are the lines that changed
        byte[] rawSha256 = sha256(addr);
        String encodedSha256 = getHexString(rawSha256);
        byte[] strBytes = encodedSha256.getBytes("UTF-8");
        digest.update(strBytes, 0, strBytes.length);
        digest.doFinal(out, 0);
        return getHexString(out);
    } catch (UnsupportedEncodingException ex) {
        // Never happens, everything supports UTF-8
        return null;
    }
}
If you want to know it's working, take the value of encodedSha256 and put that into an online hash calculator. As long as the calculator uses UTF-8 encoding to turn the string into a byte array, it will match your output.
To get a printable version of the byte[] digest, use this code:
StringBuffer hexString = new StringBuffer();
for (int i = 0; i < out.length; i++) {
    hexString.append(String.format("%02x", 0xFF & out[i]));
}
and then call hexString.toString();
I am trying to encrypt and decrypt a message as shown in the code below. Basically, I want to encrypt a message with a public key, convert that encrypted message from a byte array to a String, and then decrypt the String back into the original text. Here are both methods. Encryption works fine, but decryption fails (the error is "Data must start with zero"). I think this is caused by converting the encrypted byte array into a String.
How do I solve this? (I want to have the encrypted byte array as a String and use it for decryption.) Is there any other approach (with public and private keys)?
public static String getEncryptedMessage(String publicKeyFilePath, String plainMessage) {
    byte[] encryptedBytes;
    try {
        Cipher cipher = Cipher.getInstance("RSA");
        byte[] publicKeyContentsAsByteArray = getBytesFromFile(publicKeyFilePath);
        PublicKey publicKey = getPublicKey(publicKeyContentsAsByteArray);
        cipher.init(Cipher.ENCRYPT_MODE, publicKey);
        encryptedBytes = cipher.doFinal(plainMessage.getBytes());
        return new String(encryptedBytes);
    } catch (Throwable t) {
        // exception swallowed; a return is needed here so the method compiles
        return null;
    }
}
public static String getDecryptedMessage(String privateKeyFilePath, String encryptedMessage) {
    byte[] decryptedMessage;
    try {
        Cipher cipher = Cipher.getInstance("RSA");
        byte[] privateKeyContentsAsByteArray = getBytesFromFile(privateKeyFilePath);
        PrivateKey privateKey = getPrivateKey(privateKeyContentsAsByteArray);
        cipher.init(Cipher.DECRYPT_MODE, privateKey);
        decryptedMessage = cipher.doFinal(encryptedMessage.getBytes());
        return new String(decryptedMessage);
    } catch (Throwable t) {
        // exception swallowed; a return is needed here so the method compiles
        return null;
    }
}
If you look at this page (http://www.wikijava.org/wiki/Secret_Key_Cryptography_Tutorial), you will see that you need to do base-64 encoding to turn the bytes into a string; to decrypt, you just decode the string back to bytes and then decrypt.
Base-64 encoding represents binary data using an alphabet of 64 printable characters (6 bits per character), to make something that is printable or emailable, for example.
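A minimal sketch of that change inside the two methods above, assuming java.util.Base64 and java.nio.charset.StandardCharsets (everything else stays as in the question):

// in getEncryptedMessage: turn the cipher bytes into a printable string
encryptedBytes = cipher.doFinal(plainMessage.getBytes(StandardCharsets.UTF_8));
return Base64.getEncoder().encodeToString(encryptedBytes);

// in getDecryptedMessage: decode the string back to the exact cipher bytes
decryptedMessage = cipher.doFinal(Base64.getDecoder().decode(encryptedMessage));
return new String(decryptedMessage, StandardCharsets.UTF_8);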
Why don't you treat the message as a byte array all the way from encryption to decryption? Why change it to a String in the middle? (I know it seems like a question, but it's actually an answer...)
Using RSA directly on unformatted data may leave your application vulnerable to an adaptive chosen ciphertext attack. For details please see Chapter 8, pages 288-289, of the Handbook of Applied Cryptography, a freely-available book from CRC Press. (It's well worth buying the bound edition, if you're really interested in cryptography -- you'll be stunned at the quality for the price.)
Because of this attack, most protocols that integrate RSA use RSA for encrypting randomly-generated session keys or signing hash functions with outputs that ought to be indistinguishable from random, OR using very carefully formatted messages that will fail to be correctly interpreted. (See Note 8.63 in HAC for details.)
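As a rough illustration of the "encrypt a randomly-generated session key" pattern mentioned above (a sketch only, not a complete or reviewed protocol; publicKey and plainMessage are assumed to exist):

// Generate a random AES session key and encrypt the bulk data with it.
KeyGenerator kg = KeyGenerator.getInstance("AES");
kg.init(256);
SecretKey sessionKey = kg.generateKey();

Cipher aes = Cipher.getInstance("AES/CBC/PKCS5Padding");
aes.init(Cipher.ENCRYPT_MODE, sessionKey);
byte[] iv = aes.getIV();
byte[] encryptedData = aes.doFinal(plainMessage.getBytes(StandardCharsets.UTF_8));

// RSA only ever sees the random session key, not the structured plaintext.
Cipher rsa = Cipher.getInstance("RSA/ECB/PKCS1Padding");
rsa.init(Cipher.ENCRYPT_MODE, publicKey);
byte[] encryptedSessionKey = rsa.doFinal(sessionKey.getEncoded());
// Transmit encryptedSessionKey + iv + encryptedData.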