AES-256 decryption with part of a key - java

This is homework. Our professor encrypted a message with AES-256 in CBC mode (the key is 256 bits). He then gave us the IV he used (randomized in my code), the encrypted message and the second half of the key (keySuffix); both the key and the IV are in hexadecimal. We have to brute-force the missing half and decrypt the message. The ciphertext (cryptogram) is given as groups of eight ones and zeroes separated by spaces. What I do is strip the spaces and convert the cryptogram to a byte array (you can see in the code how I do that). On the key I just call getBytes(); on the IV I use DatatypeConverter.parseHexBinary(), otherwise it throws an error about a wrong IV length. When I get a candidate message I turn it into a String with new String(myByteArray) and then check whether it contains only normal characters (not garbage).
The point is that I get no results. I have no idea what may be wrong here; my guess is that some conversions are not done the way they were supposed to be. I've been trying a few things with Unicode etc., but the brute force takes a long time to complete, so testing like this is troublesome. Can someone point me in the right direction, e.g. how those conversions should be done? Generating all possible strings for the other half of the key works fine.
By the way, PKCS5Padding gives an error.
import java.math.BigInteger;
import javax.crypto.Cipher;
import javax.crypto.spec.IvParameterSpec;
import javax.crypto.spec.SecretKeySpec;
import javax.xml.bind.DatatypeConverter;

public class Main {

    private String randomized = "21232d0960a7b522d3e25141e54ecee6";
    private String keySuffix = "1dad418a";
    private String cryptogram = "00110001 01111000 01111101 01111100 01100001 11011110 10010010 01011011";
    private byte[] cryptogramBytes;
    private String pattern = "[a-zA-Z1-9\\s]*";
    private IvParameterSpec ivSpec = null;
    private Cipher cipher = null;

    public static void main(String... args) throws Exception {
        char[] elements = { 'a', 'b', 'c', 'd', 'e', 'f', '1', '2', '3', '4', '5', '6', '7', '8', '9', '0' };
        char[] buff = new char[8];
        Main main = new Main();

        // the IV is given in hex, so parse it into 16 raw bytes
        byte[] convertedRandomized = DatatypeConverter.parseHexBinary(main.randomized);
        main.ivSpec = new IvParameterSpec(convertedRandomized);

        // strip the spaces and turn the binary string into bytes
        main.cryptogram = main.cryptogram.replaceAll("\\s", "");
        BigInteger bigint = new BigInteger(main.cryptogram, 2);
        main.cryptogramBytes = bigint.toByteArray();

        main.cipher = Cipher.getInstance("AES/CBC/NoPadding");
        main.permGen(elements, 0, 8, buff);
    }

    // generates every 8-character combination of the hex alphabet for the missing key half
    public void permGen(char[] s, int i, int k, char[] buff) throws Exception {
        if (i < k) {
            for (int j = 0; j < s.length; j++) {
                buff[i] = s[j];
                permGen(s, i + 1, k, buff);
            }
        } else {
            String result = decrypt(String.valueOf(buff) + keySuffix);
            if (result.matches(pattern))
                System.out.println("Key is: " + String.valueOf(buff) + keySuffix);
        }
    }

    public String decrypt(String key) throws Exception {
        SecretKeySpec skeySpec = new SecretKeySpec(key.getBytes(), "AES");
        cipher.init(Cipher.DECRYPT_MODE, skeySpec, ivSpec);
        return new String(cipher.doFinal(cryptogramBytes));
    }
}
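Since the question hinges on the conversions, here is a minimal sketch (my own illustration, not part of the assignment) contrasting the two ways a hex key string can be turned into bytes; the hex value is a placeholder and only the lengths matter:

import javax.xml.bind.DatatypeConverter;

public class KeyConversionSketch {
    public static void main(String[] args) {
        // placeholder hex text; only the lengths matter here
        String hexKey = "1dad418a1dad418a1dad418a1dad418a";

        // getBytes() keeps the characters: 32 chars -> 32 bytes (the ASCII codes of the digits)
        byte[] asciiBytes = hexKey.getBytes();
        System.out.println("getBytes():       " + asciiBytes.length + " bytes");

        // parseHexBinary() decodes the digits: 32 chars -> 16 bytes (the values the hex encodes)
        byte[] hexBytes = DatatypeConverter.parseHexBinary(hexKey);
        System.out.println("parseHexBinary(): " + hexBytes.length + " bytes");

        // both lengths are legal AES key lengths, but they are completely different keys,
        // so decrypting with the wrong interpretation can never match
    }
}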

Related

Is there a way to cipher in java/kotlin and decipher in nodejs with AES/CBC?

I'm trying to encrypt text in Java/Kotlin and decrypt it in Node.js (and vice versa).
I can encrypt and decrypt within the same language, but I can't across the two...
Here is my code in Kotlin:
@Throws(Exception::class)
fun encrypt(text: String, password: String?): String? {
    if (password == null)
        return null
    val hash = toHash(password).copyOf(16)
    val keySpec = SecretKeySpec(hash, "AES")
    val ivSpec = IvParameterSpec(hash)
    val cipher = Cipher.getInstance("AES/CBC/PKCS5Padding")
    cipher.init(Cipher.ENCRYPT_MODE, keySpec, ivSpec)
    val results = cipher.doFinal(text.toByteArray())
    return Base64.encodeToString(results, Base64.NO_WRAP or Base64.DEFAULT)
}

@Throws(Exception::class)
fun decrypt(text: String, password: String?): String? {
    if (password == null)
        return null
    val hash = toHash(password).copyOf(16)
    val keySpec = SecretKeySpec(hash, "AES")
    val ivSpec = IvParameterSpec(hash)
    val cipher = Cipher.getInstance("AES/CBC/PKCS5Padding")
    cipher.init(Cipher.DECRYPT_MODE, keySpec, ivSpec)
    return String(cipher.doFinal(Base64.decode(text, Base64.DEFAULT)))
}
And here is my code in JS:

function decrypt(data, password) {
    var hash = sha256(password).substring(0, 16)
    var decipher = crypto.createDecipheriv('aes-128-cbc', hash, hash);
    var dec = decipher.update(data, 'hex', 'utf8');
    dec += decipher.final('utf8');
    return dec;
}

function encrypt(data, password) {
    var hash = sha256(password).substring(0, 16)
    var cipher = crypto.createCipheriv('aes-128-cbc', hash, hash);
    var crypted = cipher.update(data, 'utf8', 'hex');
    crypted += cipher.final('hex');
    return crypted;
}
I have tried playing with the different key sizes in Java and Node.js (128, 192 and 256) but it's not working.
I don't want to encrypt in ECB; I want to achieve this in CBC or CTR.
Does someone know how to do this, please? Thank you in advance!
I tried using CBC with NoPadding and applied the same manual padding algorithm in both JS and Java, and it worked fine: it generated the same encrypted string in JS and in Java. Please check the link:
JS link:
https://plnkr.co/edit/aihF54rkxxw3Jjcly9Uo?p=preview
Java code:
import java.security.Key;
import javax.crypto.Cipher;
import javax.crypto.spec.IvParameterSpec;
import javax.crypto.spec.SecretKeySpec;
import sun.misc.*;

public class CipherConversion {

    private static final String algorithm = "AES/CBC/NoPadding";
    private static final byte[] keyValue = new byte[] { '0', '1', '2', '3', '4', '5', '6', '7', '8', '9', 'a', 'b', 'c', 'd', 'e', 'f' };
    private static final byte[] ivValue = new byte[] { 'f', 'e', 'd', 'c', 'b', 'a', '9', '8', '7', '6', '5', '4', '3', '2', '1', '0' };
    private static final IvParameterSpec ivspec = new IvParameterSpec(ivValue);
    private static final SecretKeySpec keyspec = new SecretKeySpec(keyValue, "AES");
    // final protected static char[] hexArray = "0123456789ABCDEF".toCharArray();

    public static String encrypt(String Data) throws Exception {
        Cipher c = Cipher.getInstance(algorithm);
        c.init(Cipher.ENCRYPT_MODE, keyspec, ivspec);
        byte[] encVal = c.doFinal(Data.getBytes());
        String encryptedValue = new BASE64Encoder().encode(encVal);
        return encryptedValue;
    }

    public static String decrypt(String encryptedData) throws Exception {
        Cipher c = Cipher.getInstance(algorithm);
        c.init(Cipher.DECRYPT_MODE, keyspec, ivspec);
        byte[] decodedValue = new BASE64Decoder().decodeBuffer(encryptedData);
        byte[] decValue = c.doFinal(decodedValue);
        String decryptedValue = new String(decValue);
        return decryptedValue;
    }

    private static String padString(String source) {
        char paddingChar = ' ';
        int size = 16;
        int x = source.length() % size;
        int padLength = size - x;
        for (int i = 0; i < padLength; i++) {
            source += paddingChar;
        }
        return source;
    }

    public static void main(String[] args) throws Exception {
        System.out.println("keyValue" + keyValue);
        System.out.println("ivValue" + ivValue);
        String password = "ChangeMe1";
        String passwordEnc = CipherConversion.encrypt(padString(password));
        String passwordDec = CipherConversion.decrypt(passwordEnc);
        System.out.println("Plain Text : " + password);
        System.out.println("Encrypted Text : " + passwordEnc);
        System.out.println("Decrypted Text : " + passwordDec);
    }
}
I have faced a similar situation before, where AES encryption was not working between the application and the server side. I finally got it working for both Android and the server. I am providing the class I used for AES encryption; it is a Java implementation, but I thought it would help.
import android.util.Base64;
import javax.crypto.Cipher;
import javax.crypto.spec.SecretKeySpec;

public class AESProvider {

    private static final String ALGORITHM = "AES";
    private static final String ENCRYPTION_KEY = "YourEncryptionKey";

    public static String encrypt(String stringToEncrypt) {
        try {
            SecretKeySpec secretKey = new SecretKeySpec(ENCRYPTION_KEY.getBytes("UTF-8"), ALGORITHM);
            Cipher cipher = Cipher.getInstance(ALGORITHM);
            cipher.init(Cipher.ENCRYPT_MODE, secretKey);
            byte[] data = cipher.doFinal(stringToEncrypt.getBytes("UTF-8"));
            return Base64.encodeToString(data, Base64.DEFAULT);
        } catch (Exception e) {
            e.printStackTrace();
        }
        return "";
    }

    public static String decrypt(String stringToDecrypt) throws Exception {
        SecretKeySpec secretKey = new SecretKeySpec(ENCRYPTION_KEY.getBytes("UTF-8"), ALGORITHM);
        Cipher cipher = Cipher.getInstance(ALGORITHM);
        cipher.init(Cipher.DECRYPT_MODE, secretKey);
        return new String(cipher.doFinal(Base64.decode(stringToDecrypt, Base64.DEFAULT)));
    }
}
In my case I was missing the Base64 encode and decode around the AES output. Hope that helps!
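For completeness, a quick usage sketch of the class above; note that the ENCRYPTION_KEY placeholder must first be replaced with a key of a valid AES length (16, 24 or 32 bytes), or cipher.init will fail:

public class AESProviderDemo {
    public static void main(String[] args) throws Exception {
        // assumes AESProvider.ENCRYPTION_KEY has been set to a 16/24/32-byte value
        String encrypted = AESProvider.encrypt("hello from android");
        String decrypted = AESProvider.decrypt(encrypted);
        System.out.println(encrypted);
        System.out.println(decrypted); // hello from android
    }
}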

Different output encryption both CryptoJS and Java Code

I need to encrypt a certain string on the client side (JavaScript) and decrypt it on the server side (Java), so I found CryptoJS and wrote the code with the same params/configuration as my Java code, but the output is always different. Do you have any idea what is happening?
I'm using CBC with NoPadding.
CryptoJS
http://jsfiddle.net/Soldier/gCHAG/
<script src="http://crypto-js.googlecode.com/svn/tags/3.1.2/build/rollups/aes.js">
</script>
<script src="http://crypto-js.googlecode.com/svn/tags/3.1.2/build/components/pad-nopadding-min.js"></script>
<script>
    function padString(source) {
        var paddingChar = ' ';
        var size = 16;
        var x = source.length % size;
        var padLength = size - x;
        for (var i = 0; i < padLength; i++) source += paddingChar;
        return source;
    }

    var key = CryptoJS.enc.Hex.parse('0123456789abcdef');
    var iv = CryptoJS.enc.Hex.parse('fedcba9876543210');

    var message = "soldier";
    var padMsg = padString(message);

    var encrypted = CryptoJS.AES.encrypt(padMsg, key, { iv: iv, padding: CryptoJS.pad.NoPadding, mode: CryptoJS.mode.CBC });

    console.log("Encrypted: " + encrypted);
    console.log("Encrypted text: " + encrypted.ciphertext);
</script>
Java Code
import java.security.Key;
import javax.crypto.Cipher;
import javax.crypto.spec.IvParameterSpec;
import javax.crypto.spec.SecretKeySpec;
import sun.misc.*;

public class AesCipher {

    private static final String algorithm = "AES/CBC/NoPadding";
    private static final byte[] keyValue = new byte[] { '0', '1', '2', '3', '4', '5', '6', '7', '8', '9', 'a', 'b', 'c', 'd', 'e', 'f' };
    private static final byte[] ivValue = new byte[] { 'f', 'e', 'd', 'c', 'b', 'a', '9', '8', '7', '6', '5', '4', '3', '2', '1', '0' };
    private static final IvParameterSpec ivspec = new IvParameterSpec(ivValue);
    private static final SecretKeySpec keyspec = new SecretKeySpec(keyValue, "AES");

    final protected static char[] hexArray = "0123456789ABCDEF".toCharArray();

    public static String encrypt(String Data) throws Exception {
        Cipher c = Cipher.getInstance(algorithm);
        c.init(Cipher.ENCRYPT_MODE, keyspec, ivspec);
        byte[] encVal = c.doFinal(Data.getBytes());
        String encryptedValue = new BASE64Encoder().encode(encVal);
        return encryptedValue;
    }

    public static String decrypt(String encryptedData) throws Exception {
        Cipher c = Cipher.getInstance(algorithm);
        c.init(Cipher.DECRYPT_MODE, keyspec, ivspec);
        byte[] decodedValue = new BASE64Decoder().decodeBuffer(encryptedData);
        byte[] decValue = c.doFinal(decodedValue);
        String decryptedValue = new String(decValue);
        return decryptedValue;
    }

    public static String bytesToHex(byte[] bytes) {
        char[] hexChars = new char[bytes.length * 2];
        int v;
        for (int j = 0; j < bytes.length; j++) {
            v = bytes[j] & 0xFF;
            hexChars[j * 2] = hexArray[v >>> 4];
            hexChars[j * 2 + 1] = hexArray[v & 0x0F];
        }
        return new String(hexChars);
    }

    private static String padString(String source) {
        char paddingChar = ' ';
        int size = 16;
        int x = source.length() % size;
        int padLength = size - x;
        for (int i = 0; i < padLength; i++) {
            source += paddingChar;
        }
        return source;
    }

    public static void main(String[] args) throws Exception {
        String password = "soldier";
        String passwordEnc = AesCipher.encrypt(padString(password));
        String passwordDec = AesCipher.decrypt(passwordEnc);
        System.out.println("Plain Text : " + password);
        System.out.println("Encrypted Text : " + passwordEnc);
        System.out.println("Decrypted Text : " + passwordDec);
    }
}
Original string:
soldier
Output from CryptoJS:
Encrypted: VNzZNKJTqfRbM7zO/M4cDQ==
Encrypted Hex: 54dcd934a253a9f45b33bccefcce1c0d
Output from Java Code:
Encrypted: j6dSmg2lfjY2RpN91GNgNw==
Encrypted Hex: 6a3664536d67326c666a593252704e3931474e674e773d3d
The encrypted Base64 strings have the same length, but the hex strings do not.
If I put the output result of CryptoJS in Java Code, the decryption is incorrect.
Regards,
One problem here is that you're using 64-bit keys and IVs.
CryptoJS supports AES-128, AES-192, and AES-256, and Java's AES likewise expects 128-, 192- or 256-bit keys, so you should be specifying 128-bit keys and IVs. That might be the whole problem - I'm sure using the wrong key size is undefined behavior.
As for the difference in output lengths, 22 base64 characters is 132 bits of information, so it's a 128-bit answer (base64 can't represent exactly 128 bits in a whole number of characters; 21 characters would have been too few). CryptoJS is outputting 32 hex characters, which is 128 bits of information. This seems correct.
The Java code is outputting 48 hex characters, which is 192 bits of information. So it's the Java code that's wrong. I'm not sure why it's outputting more, though.
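For what it's worth, the 48-character hex value above appears to be the hex encoding of the 24-character Base64 text rather than of the raw ciphertext bytes, which would account for the extra length. A minimal sketch of the arithmetic (bytesToHex here is a local stand-in for the method in the Java class above):

import java.util.Base64;

public class HexLengthCheck {
    // local stand-in for the bytesToHex method in the Java class above
    static String bytesToHex(byte[] bytes) {
        StringBuilder sb = new StringBuilder();
        for (byte b : bytes) sb.append(String.format("%02x", b));
        return sb.toString();
    }

    public static void main(String[] args) {
        byte[] ciphertext = new byte[16]; // one AES block of raw ciphertext (contents irrelevant)
        String base64 = Base64.getEncoder().encodeToString(ciphertext); // 24 characters

        System.out.println(bytesToHex(ciphertext).length());        // 32 hex chars = 128 bits
        System.out.println(bytesToHex(base64.getBytes()).length()); // 48 hex chars: hex of the Base64 text
    }
}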

Java AES-128 encryption of 1 block (16 byte) returns 2 blocks(32 byte) as output

I'm using the following code for AES-128 encryption to encrypt a single 16-byte block, but the encrypted value comes out as 2 blocks (32 bytes). Am I missing something?
plainEnc = AES.encrypt("thisisapassword!");
import java.security.*;
import java.security.spec.InvalidKeySpecException;
import javax.crypto.*;
import javax.crypto.spec.SecretKeySpec;
import sun.misc.*;

public class AES {

    private static final String ALGO = "AES";
    private static final byte[] keyValue =
            new byte[] { 'T', 'h', 'e', 'B', 'e', 's', 't',
                         'S', 'e', 'c', 'r', 'e', 't', 'K', 'e', 'y' };

    public static String encrypt(String Data) throws Exception {
        System.out.println("string length: " + (Data.getBytes()).length); // length = 16
        Key key = generateKey();
        Cipher cipher = Cipher.getInstance(ALGO);
        cipher.init(Cipher.ENCRYPT_MODE, key);
        byte[] encVal = cipher.doFinal(Data.getBytes());
        System.out.println("output length: " + encVal.length); // length = 32
        String encryptedValue = new BASE64Encoder().encode(encVal);
        return encryptedValue;
    }

    public static String decrypt(String encryptedData) throws Exception {
        Key key = generateKey();
        Cipher cipher = Cipher.getInstance(ALGO);
        cipher.init(Cipher.DECRYPT_MODE, key);
        byte[] decodedValue = new BASE64Decoder().decodeBuffer(encryptedData);
        byte[] decValue = cipher.doFinal(decodedValue);
        String decryptedValue = new String(decValue);
        return decryptedValue;
    }

    private static Key generateKey() throws Exception {
        Key key = new SecretKeySpec(keyValue, ALGO);
        return key;
    }
}
Cipher.getInstance("AES") returns a cipher that uses PKCS #5 padding. This padding is added in all cases – when the plaintext is already a multiple of the block size, a whole block of padding is added.
Specify your intentions explicitly in the Cipher.getInstance() call to avoid relying on defaults and potentially causing confusion:
Cipher.getInstance("AES/ECB/NoPadding");
You will also see that you are using ECB mode, which is a bad choice in almost any situation.
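A small demonstration of the difference, reusing the key and the 16-byte password from the question:

import java.nio.charset.StandardCharsets;
import javax.crypto.Cipher;
import javax.crypto.spec.SecretKeySpec;

public class PaddingSize {
    public static void main(String[] args) throws Exception {
        SecretKeySpec key = new SecretKeySpec("TheBestSecretKey".getBytes(StandardCharsets.US_ASCII), "AES");
        byte[] block = "thisisapassword!".getBytes(StandardCharsets.US_ASCII); // exactly 16 bytes

        Cipher padded = Cipher.getInstance("AES/ECB/PKCS5Padding"); // what plain "AES" resolves to
        padded.init(Cipher.ENCRYPT_MODE, key);
        System.out.println(padded.doFinal(block).length);   // 32: a full block of padding is appended

        Cipher unpadded = Cipher.getInstance("AES/ECB/NoPadding");
        unpadded.init(Cipher.ENCRYPT_MODE, key);
        System.out.println(unpadded.doFinal(block).length); // 16
    }
}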

why am i seeing inconsistencies between two blowfish implementations?

I am encrypting "06.93308" using the key "rubicon", first in Java with javax.crypto.spec.SecretKeySpec and then in C++ using OpenSSL. However, the two give me different encrypted values. The Java version gives me hex A834BDD6C3478B8C whilst the OpenSSL one gives me D06D7CB756744903, which is considerably different. The aim is to get the same result as the Java equivalent. Any ideas on what I am doing wrong?
The Java code is as follows:
char[] password = new char[] { 'r', 'u', 'b', 'i', 'c', 'o', 'n' };
byte[] raw = encrypt(password, "06.93308");

private static byte[] encrypt(char[] password, String plaintext) throws Exception {
    byte[] bytes = new byte[password.length];
    for (int i = 0; i < password.length; ++i) {
        bytes[i] = (byte) password[i];
    }
    SecretKeySpec skeySpec = new SecretKeySpec(bytes, "Blowfish");
    Cipher cipher = Cipher.getInstance("Blowfish/ECB/NoPadding");
    cipher.init(Cipher.ENCRYPT_MODE, skeySpec);
    byte[] encrypted = cipher.doFinal(plaintext.getBytes());
    return encrypted;
}
The C++ side is as follows:
CBlowFish oBlowFish((byte *)"rubicon", 8);
char encryptedPrice[17] = "\0\0\0\0\0\0\0\0";
char myBidPrice[] = "06.93308";

encrypt(myBidPrice, encryptedPrice);

void encrypt(char bidPrice[], char encryptedPrice[])
{
    oBlowFish.Encrypt((unsigned char*)bidPrice, (unsigned char*)encryptedPrice, 8);
}
This is the OpenSSL code, which gives me the same result as the above C++ code:
#define SIZE 16
unsigned char *out = (unsigned char *)calloc(SIZE+1, sizeof(char));
BF_KEY *key = (BF_KEY *)calloc(1, sizeof(BF_KEY));
BF_set_key(key, SIZE, (const unsigned char*)"rubicon" );
BF_ecb_encrypt(in, out, key, BF_ENCRYPT);
printf("%s\n",out);
"rubicon" is not 16 bytes long. You'll have to adjust SIZE accordingly.

AES Encryption I am trying to encrypt using AES encryption

I just want to migrate this Ruby code to Java.
Here is my Ruby code:
require 'openssl'
require 'base64'
key = '7c54367a45b37a192abc2cd7f45203042350406f8'
cipher = OpenSSL::Cipher::Cipher.new('aes-128-ecb')
cipher.encrypt()
cipher = OpenSSL::Cipher::Cipher.new('aes-256-ecb')
cipher.encrypt()
cipher.key = key
crypt = cipher.update('Rahul')
crypt << cipher.final()
puts (Base64.encode64(crypt))
Here is what I am trying in Java:
String getDecodedString(String key, String encodedValue, SupportedEncryptionAlgorithm algoInfo)
{
    Cipher cipher = getCipherInstancenew(algoInfo, key, Cipher.DECRYPT_MODE);
    try
    {
        byte[] dec = new sun.misc.BASE64Decoder().decodeBuffer(encodedValue);
        int ctLength = cipher.getOutputSize(dec.length);
        byte[] plainText = new byte[cipher.getOutputSize(ctLength)];
        int ptLength = cipher.update(dec, 0, ctLength, plainText, 0);
        ptLength += cipher.doFinal(plainText, ptLength);
        return null;
    }
    catch (IllegalBlockSizeException e)
    {
        LoggerFactory.getLogger(EncryptionHelper.class).error("Security Alert", e);
    }
    catch (BadPaddingException e)
    {
        LoggerFactory.getLogger(EncryptionHelper.class).error("Security Alert", e);
    }
    return null;
}

public static byte[] stringToBytes(String s) {
    byte[] b2 = new BigInteger(s, 36).toByteArray();
    return Arrays.copyOfRange(b2, 1, b2.length);
}

public static Cipher getCipherInstancenew(SupportedEncryptionAlgorithm algoInfo, String keyString, int mode) throws IOException
{
    byte[] decodedBytes;
    Cipher cipher = null;
    try
    {
        decodedBytes = getBase64FromHEX(keyString).getBytes();
        SecretKeySpec skeySpec = new SecretKeySpec(decodedBytes, "AES");
        Security.addProvider(new BouncyCastleProvider());
        cipher = Cipher.getInstance("AES/ECB/PKCS5Padding", "BC");
        cipher.init(mode, skeySpec);
    }
    catch (java.security.GeneralSecurityException e)
    {
        /* Strictly no logging as this is a security class.
         * There seems to be some issue with the keys, so alert it. */
        //LoggerFactory.getLogger(EncryptionHelper.class).error("Security Alert",e);
        throw new IOException("GetCipherInstance does not exist");
    }
    return cipher;
}

public static String getBase64FromHEX(String input) {
    byte[] barr = new byte[16];
    int bcnt = 0;
    for (int i = 0; i < 32; i += 2) {
        char c1 = input.charAt(i);
        char c2 = input.charAt(i + 1);
        int i1 = intFromChar(c1);
        int i2 = intFromChar(c2);
        barr[bcnt] = 0;
        barr[bcnt] |= (byte) ((i1 & 0x0F) << 4);
        barr[bcnt] |= (byte) (i2 & 0x0F);
        bcnt++;
    }
    BASE64Encoder encoder = new BASE64Encoder();
    return encoder.encode(barr);
}

private static int intFromChar(char c) {
    char[] carr = { '0', '1', '2', '3', '4', '5', '6', '7', '8', '9', 'a', 'b', 'c', 'd', 'e', 'f' };
    char clower = Character.toLowerCase(c);
    for (int i = 0; i < carr.length; i++) {
        if (clower == carr[i]) {
            return i;
        }
    }
    return 0;
}
It works with the 32-byte key string but not with the 41-byte one in Java, yet in Ruby it works for any length greater than 32 bytes. Strange. Please help.
The below Java code outputs the exact same base 64 encoded result from encrypting as the Ruby code and successfully decrypts it:
final Cipher encryptCipher = Cipher.getInstance("AES/ECB/PKCS5Padding", "BC");
encryptCipher.init(Cipher.ENCRYPT_MODE, new SecretKeySpec("7c54367a45b37a192abc2cd7f4520304".getBytes(), "AES"));
final byte[] encrypt = encryptCipher.doFinal("This is my text".getBytes());
System.out.println(new String(Base64.encode(encrypt)));
final Cipher decryptCipher = Cipher.getInstance("AES/ECB/PKCS5Padding", "BC");
decryptCipher.init(Cipher.DECRYPT_MODE, new SecretKeySpec("7c54367a45b37a192abc2cd7f4520304".getBytes(), "AES"));
final byte[] decrypt = decryptCipher.doFinal(encrypt);
System.out.println(new String(decrypt));
The Ruby OpenSSL API is apparently only using the first 32 bytes of the key, since the following value for key returns the same value as the 41 byte version:
key = '7c54367a45b37a192abc2cd7f4520304'
Also, I'm not sure why cipher is initialized twice in the Ruby code as it isn't necessary as far as I can tell.
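If the goal is simply to mirror that behaviour on the Java side, one option (an assumption based on the observation above, using the default provider rather than BC for brevity) is to truncate the long key to its first 32 bytes, which yields the same key bytes as the 32-character string:

import java.nio.charset.StandardCharsets;
import java.util.Arrays;
import java.util.Base64;
import javax.crypto.Cipher;
import javax.crypto.spec.SecretKeySpec;

public class TruncatedKeyDemo {
    public static void main(String[] args) throws Exception {
        String longKey = "7c54367a45b37a192abc2cd7f45203042350406f8"; // the 41-character key from the question

        // keep only the first 32 bytes, mirroring what Ruby appears to do with the over-long key
        byte[] keyBytes = Arrays.copyOf(longKey.getBytes(StandardCharsets.US_ASCII), 32);

        Cipher cipher = Cipher.getInstance("AES/ECB/PKCS5Padding");
        cipher.init(Cipher.ENCRYPT_MODE, new SecretKeySpec(keyBytes, "AES")); // 32 bytes -> AES-256
        byte[] ct = cipher.doFinal("Rahul".getBytes(StandardCharsets.UTF_8));

        // same ciphertext as when the 32-character key string is used directly
        System.out.println(Base64.getEncoder().encodeToString(ct));
    }
}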
I doubt that your calls to String.getBytes() are doing what you need them to do.
The getBytes() method uses the platform's default character encoding to convert the characters of a String to a sequence of bytes. The default platform character encoding is something like UTF-8, US-ASCII, or ISO-8859-1. It's not base-64 or hexadecimal.
Most character encodings can't handle the random 8-bit values that are used in cryptographic operations. So, for example, you generally can't create a new String from the bytes that result from encryption. Many of the values will be replaced with � or ?, depending on your encoding. And, even if it happens to work on your machine, the machine on the next desk could be configured differently, and will fail when trying to decode that character string.
If you need to convert between binary data and text, use an encoding like Base-64.
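As a small illustration of that advice (the class name and throwaway 16-byte key are placeholders, not taken from the question), keep the ciphertext as raw bytes or Base64 text rather than pushing it through new String():

import java.nio.charset.StandardCharsets;
import java.util.Base64;
import javax.crypto.Cipher;
import javax.crypto.spec.SecretKeySpec;

public class Base64RoundTrip {
    public static void main(String[] args) throws Exception {
        // throwaway 16-byte key, purely for the demonstration
        SecretKeySpec key = new SecretKeySpec("0123456789abcdef".getBytes(StandardCharsets.US_ASCII), "AES");

        Cipher cipher = Cipher.getInstance("AES/ECB/PKCS5Padding");
        cipher.init(Cipher.ENCRYPT_MODE, key);
        byte[] cipherBytes = cipher.doFinal("Rahul".getBytes(StandardCharsets.UTF_8));

        // Base64 is a charset-safe text form of the raw ciphertext
        String wire = Base64.getEncoder().encodeToString(cipherBytes);

        // decode back to the exact same bytes before decrypting
        cipher.init(Cipher.DECRYPT_MODE, key);
        String plain = new String(cipher.doFinal(Base64.getDecoder().decode(wire)), StandardCharsets.UTF_8);
        System.out.println(plain); // Rahul
    }
}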
