I'm developing an HTTP API that requires encryption. I have tried to use AES to get compatibility between Java, PHP and JavaScript, but so far I have managed to get Java<->PHP and Java<->JavaScript working, but not both PHP and JavaScript at the same time.
Has anyone had any experience with achieving interoperability between these languages and more?
Any advice would be much appreciated.
Thanks
To get AES to work across different systems, you have to make sure that everything is the same on all of them. That means not relying on system defaults for anything, since defaults can differ between systems. You need to explicitly specify everything; a Java sketch follows the list below.
Specify the mode; use CBC or CTR.
Specify the IV. You can prepend it to the ciphertext.
Specify the padding; for AES use PKCS7.
If your key is a text string, then specify the character encoding used to convert it to bytes.
If your plaintext is a text string, then specify the character encoding used to convert it to bytes.
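As a minimal Java sketch of that checklist (assuming a raw 16/24/32-byte key and Java 7+ for StandardCharsets; note that the JCE's "PKCS5Padding" is what other libraries call PKCS7 when used with AES):

import java.nio.charset.StandardCharsets;
import java.security.SecureRandom;
import javax.crypto.Cipher;
import javax.crypto.spec.IvParameterSpec;
import javax.crypto.spec.SecretKeySpec;

public class ExplicitAes {
    // Everything is explicit: algorithm, mode, padding, IV and character encoding.
    public static byte[] encrypt(byte[] rawKey, String plaintext) throws Exception {
        byte[] iv = new byte[16];                        // AES block size
        new SecureRandom().nextBytes(iv);                // fresh random IV per message
        Cipher cipher = Cipher.getInstance("AES/CBC/PKCS5Padding"); // JCE's PKCS5Padding == PKCS7 for AES
        cipher.init(Cipher.ENCRYPT_MODE, new SecretKeySpec(rawKey, "AES"), new IvParameterSpec(iv));
        byte[] ct = cipher.doFinal(plaintext.getBytes(StandardCharsets.UTF_8)); // explicit encoding
        byte[] out = new byte[iv.length + ct.length];    // prepend the IV to the ciphertext
        System.arraycopy(iv, 0, out, 0, iv.length);
        System.arraycopy(ct, 0, out, iv.length, ct.length);
        return out;
    }
}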
AES is a standard (defined here). No matter which programming language you use, the result has to be the same.
Check some test vectors either from the official definition or - if you've already implemented a block mode of operation - from here.
If your implementation produces a different result, it might work, but it won't be AES...
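For instance, a minimal Java check of the FIPS-197 Appendix C.1 test vector (AES-128, a single block, no padding; the expected ciphertext is copied from the standard):

import javax.crypto.Cipher;
import javax.crypto.spec.SecretKeySpec;

public class Fips197Check {
    public static void main(String[] args) throws Exception {
        // FIPS-197 Appendix C.1 test vector
        byte[] key = hex("000102030405060708090a0b0c0d0e0f");
        byte[] plain = hex("00112233445566778899aabbccddeeff");
        Cipher c = Cipher.getInstance("AES/ECB/NoPadding"); // one raw block, no mode or padding involved
        c.init(Cipher.ENCRYPT_MODE, new SecretKeySpec(key, "AES"));
        String out = toHex(c.doFinal(plain));
        System.out.println(out.equals("69c4e0d86a7b0430d8cdb78070b4c55a") ? "OK" : "MISMATCH: " + out);
    }
    static byte[] hex(String s) {
        byte[] b = new byte[s.length() / 2];
        for (int i = 0; i < b.length; i++)
            b[i] = (byte) Integer.parseInt(s.substring(2 * i, 2 * i + 2), 16);
        return b;
    }
    static String toHex(byte[] b) {
        StringBuilder sb = new StringBuilder();
        for (byte x : b) sb.append(String.format("%02x", x));
        return sb.toString();
    }
}

If this prints MISMATCH, the AES primitive itself is at fault; if it prints OK, the incompatibility lies in the mode, padding, IV or text encoding instead.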
This is a fundamental question about how Java works, so I don't have any code to support it.
I am new to Java development and want to know how the different number systems and character sets like UTF-8 and Unicode come together in Java.
Let's say a user creates a new string and an int with the same value.
int i = 100;
String s = "100";
The hardware of a computer understands zeros and ones, so everything has to be converted to binary (correct me if I'm wrong). This conversion should be done by the JVM (correct me if I'm wrong)? And to represent characters of different languages beyond those that can be typed on an (English) keyboard, encodings like UTF-8 are used (correction needed)?
Now how does this whole flow fit into the bigger picture of running a Java web application?
How does a string/int get converted to binary for the machine's hardware to understand?
How does it get converted to UTF-8 for a browser to understand?
And what are the default number format and character set in Java? If I'm reading the contents of a file, will they be read as binary or as UTF-8?
All computers run in binary. The conversion is done by the JVM and the computer that you have. You shouldn't worry about converting the code into the corresponding 1's and 0's. The browser has its own hard-coded conversion to change the universal 1's and 0's (used by all programs and computer software) into however it decides to display the given information. All languages are just a translation guide for the user to "speak" with the computer, and vice versa. Hope this helps, though I don't think I really answered anything.
How Java represents any data type in memory is the choice of the actual JVM. In practice, the JVM will choose the format native to the processor (e.g. choose between little/big endian for int), simply because it offers the best performance on that platform.
Basically, the JLS makes certain guarantees (like that a byte has 8 bits and the values range from -128 to 127) - the VM just maps that to the platform as it deems suitable (the JLS was specified to match common computing technology closely, so there is usually no magic needed to guess how primitive types map to the platform).
You should never care how the VM represents data in memory; Java does not offer any legal way to access the data in a manner where you would need to know (bypassing most of the VM's logic by using sun.misc.Unsafe is not considered legal).
If you care for educational purposes, learn what binary representations the underlying platform (e.g. x86) uses and take a look at the VM. It has little to do with Java really; it's all VM- and platform-specific.
For java.lang.String, it's the implementation of the class that defines how the String is stored internally - it went through quite some changes over major Java versions - but what that String exposes is quite narrowly defined (see the JDK javadoc for String.length(), String.charAt()).
As for how user input is translated to Java standard types, that's actually platform-specific. The JVM selects the default encoding (e.g. String.getBytes() can return quite different results for the same string, depending on the platform - that's why it's recommended to explicitly specify the desired encoding). The same goes for many other things (time zone, number format etc.).
CharSets and Formats are the building blocks a program wires up to translate data from the outside world (file, HTTP or user input) into Java's representation of data (or vice versa). For example, a web application will use the encoding from an HTTP header to determine what CharSet to use when interpreting the contents (the HTTP headers' own encoding is defined to be US-ASCII by the spec).
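To make the file question from above concrete, here is a small hedged sketch: the bytes on disk are just binary, and you choose the character set when decoding them (the file name is hypothetical):

import java.io.BufferedReader;
import java.io.FileInputStream;
import java.io.InputStreamReader;

public class ReadFileExplicit {
    public static void main(String[] args) throws Exception {
        // The file name is hypothetical; the point is naming the charset
        // explicitly instead of relying on the platform default.
        BufferedReader in = new BufferedReader(
                new InputStreamReader(new FileInputStream("data.txt"), "UTF-8"));
        String line;
        while ((line = in.readLine()) != null) {
            System.out.println(line); // bytes from disk decoded as UTF-8 characters
        }
        in.close();
    }
}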
So, for a specific project I need to be able to encrypt and decrypt Strings in the same way they are encrypted/decrypted in another PHP application. My application is a Grails app, so I will be writing code in Java. The way the Strings will be encrypted/decrypted on the PHP side is (example code, not necessarily functional):
<?php
$input = "textToBeEncrypted";

// DES in CBC mode via mcrypt; @ suppresses the missing-IV warning
function encrypt($data, $key)
{
    $cipher_alg = MCRYPT_DES;
    $mode = MCRYPT_MODE_CBC;
    return @mcrypt_encrypt($cipher_alg, $key, $data, $mode);
}

function decrypt($encrypted, $key)
{
    $cipher_alg = MCRYPT_DES;
    $mode = MCRYPT_MODE_CBC;
    return @mcrypt_decrypt($cipher_alg, $key, $encrypted, $mode);
}

$key = "testKey";
$encrypted = encrypt($input, $key);
$result = decrypt($encrypted, $key);
echo ">>" . $result . "<br>\n";
?>
So, I would like to be able to apply the same encryption/decryption in Java (or Groovy). I have found this example code, https://github.com/stevenholder/PHP-Java-AES-Encrypt/blob/master/security.java and I understand that if I manage to find the names of the algorithm and mode in Java, it should work. Unless I am missing something... I navigated to the Java Standard Names page for encryption algorithms, http://docs.oracle.com/javase/7/docs/technotes/guides/security/StandardNames.html, but I can't find the exact equivalent of what I have in the PHP code. Any ideas? Have any of you guys ever needed to do something similar?
Thanks,
Iraklis
The three main things to consider with encryption are the algorithm, the mode, and the padding. Those must be compatible between the encryption and decryption software for everything to work.
To start with, AES (Rijndael) is definitely recommended in favor of DES as the encryption algorithm. DES is not considered secure enough any longer. The Java code you posted a link to is using AES, so it definitely won't be compatible with the PHP code you are showing.
Plus, that Java code is using ECB for the mode which is also not recommended. ECB is easier since it doesn't require any handling of initialization vectors but that is also its downfall. The PHP code is using CBC which is recommended, although I don't see any explicit IV handling. Mcrypt will use an IV of all zeros in that instance, which isn't ideal at all.
Finally, the Java code is using PKCS5 as the padding method, whereas the PHP code uses zero-padding. Those aren't compatible. The default provider that comes with Oracle's JDK doesn't support zero padding, but Bouncy Castle does (see section 5.0). That would require either using Bouncy Castle's API directly or using it as a JCE provider via one of the methods detailed at that link. The transformation string "DES/CBC/ZeroBytePadding" should do the trick.
Of course, zero padding can be problematic if the data you are encrypting can end with a null byte.
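As a hedged sketch of that approach (assuming Bouncy Castle is on the classpath, an 8-byte DES key, and the all-zero IV that mcrypt falls back to when no IV is given; mcrypt also zero-pads a short key such as "testKey" to the 8-byte DES key size, so the Java side would have to apply the same key padding):

import java.security.Security;
import javax.crypto.Cipher;
import javax.crypto.spec.IvParameterSpec;
import javax.crypto.spec.SecretKeySpec;
import org.bouncycastle.jce.provider.BouncyCastleProvider;

public class DesCbcZeroPad {
    public static byte[] decrypt(byte[] key8, byte[] ciphertext) throws Exception {
        Security.addProvider(new BouncyCastleProvider()); // register BC as a JCE provider
        byte[] iv = new byte[8]; // all-zero IV, matching mcrypt's behaviour when none is supplied
        Cipher cipher = Cipher.getInstance("DES/CBC/ZeroBytePadding", "BC");
        cipher.init(Cipher.DECRYPT_MODE, new SecretKeySpec(key8, "DES"), new IvParameterSpec(iv));
        return cipher.doFinal(ciphertext);
    }
}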
I've answered a similar question before here, and provided code for encrypting in Java and decrypting in PHP in that answer. Hopefully that helps you.
I am using a 3rd party platform to create a landing page, it is a business requirement that I use this particular platform.
On their page I can encrypt data and send it to my server through a request parameter when calling a resource on my site. This is done through AES symmetric encryption.
I need to specify a password, a salt (which must be a hex value) and an initialization vector (which must be 16 characters).
Their backend is a .NET platform. I know this because if I specify an IV longer than it expects, the underlying exception is:
System.Security.Cryptography.CryptographicException: Specified initialization vector (IV) does not match the block size for this algorithm.
Source: mscorlib
So for example, on their end I specify:
EncryptSymmetric("Hello World","AES","P4ssw0rD","00010203040506070809", "000102030405060708090A0B0C0D0E0F")
Where the inputs are: plain text, algorithm, pass phrase, salt, and IV respectively.
I get the value: eg/t9NIMnxmh412jTGCCeQ==
If I try to decrypt this on my end using the JCE or the BouncyCastle provider I get (same algo, pass phrase, salt & IV, with 1000 iterations): 2rrRdHwpKGRenw8HKG1dsA== which is completely different.
I have looked at many different Java examples online on how to decrypt AES. One such demo is the following: http://blogs.msdn.com/b/dotnetinterop/archive/2005/01/24/java-and-net-aes-crypto-interop.aspx
How can I decrypt an AES symmetric encryption that uses a pass phrase, salt and IV, which was generated by the .NET framework, on a Java platform?
I don't necessarily need to be able to decrypt the contents of the encrypted string if I can generate the same signature on the Java side and compare (if it turns out what is really being generated here is a hash).
I'm using JDK 1.5 in production so I need to use 1.5 to do this.
As a side note, a lot of the examples in Java need to specify a repetition count on the Java side, but not on the .NET side. Is there a standard number of iterations I need to specify on the Java side which matches the default .NET output?
It all depends on how the different parts/arguments of the encryption are used.
AES is used to encrypt bytes, so you need to convert the string to a byte array, and you need to know the encoding used to convert the string (UTF-7, UTF-8, ...).
The key in AES has some fixed sizes, so you need to know how to get from a passphrase to an AES key with the correct bit size.
Since you provide both a salt and an IV, I suppose the salt is not the IV. There is no standard way to handle the salt in .NET. As far as I remember, a salt is mainly used to protect hashes against rainbow tables; the need for a salt in AES itself is unknown to me.
Maybe the passphrase is hashed (you did not provide the method for that) with the salt to get an AES key.
The IV is no secret. The easiest method is to prepend the encrypted data with the IV. Given the length of the encrypted data, that is not the case here.
I don't think your unfamiliarity with .NET is the problem here. You need to know what decisions the implementer of the encryption made to get from your parameters to the encrypted string.
As far as I can see, it is the iteration count which is causing the issue. With all other things the same (salt, IV, iterations), the .NET implementation generates the same output as the Java implementation. I think you may need to ask the 3rd party what iteration count they are using.
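For reference, a hedged sketch of the Java side of such a key derivation - assuming the .NET side uses Rfc2898DeriveBytes (PBKDF2 with HMAC-SHA1), which is only a guess; the salt, iteration count and key size must match whatever the 3rd party actually uses. "PBKDF2WithHmacSHA1" ships with the SunJCE from Java 6 on, so on JDK 1.5 you would need Bouncy Castle instead:

import javax.crypto.SecretKeyFactory;
import javax.crypto.spec.PBEKeySpec;
import javax.crypto.spec.SecretKeySpec;

public class PbkdfKey {
    // Derives an AES key from a passphrase; all parameters must mirror the .NET side.
    public static SecretKeySpec deriveAesKey(char[] passphrase, byte[] salt, int iterations)
            throws Exception {
        SecretKeyFactory factory = SecretKeyFactory.getInstance("PBKDF2WithHmacSHA1");
        PBEKeySpec spec = new PBEKeySpec(passphrase, salt, iterations, 256); // 256-bit AES key
        byte[] keyBytes = factory.generateSecret(spec).getEncoded();
        return new SecretKeySpec(keyBytes, "AES");
    }
}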
I've tried but failed to encrypt a string in JavaScript so it can be decrypted on a Java server. We'd like to use the Bouncy Castle algorithm PBEWITHSHA256AND256BITAES-CBC-BC to decrypt server-side.
I've tried using crypto.js to do the encryption with the following code:
var encrypted = Crypto.AES.encrypt("it was Professor Plum in the library with the candlestick",
                                   key,
                                   { mode: new Crypto.mode.CBC });
var encryptedString = Crypto.util.bytesToHex(Crypto.charenc.Binary.stringToBytes(encrypted));
However this doesn't decrypt correctly on the server; my guess is it's something to do with the SHA-256, but I can't work out what it would be digesting and can't find any documentation. Does anyone know how to perform the encryption in JavaScript?
You need to do everything the same at both ends. You need the same key, the same mode (CBC), the same padding (use PKCS7) and the same IV.
Check that the actual key you are using is the same at both ends by displaying the hex after you have run the passphrase through SHA-256. Check the hex for the IVs as well. Don't use any defaults; explicitly pick the mode and padding to use.
If you think that it is the PBE/SHA-256 that is going wrong then you might want to look at how your text passphrase is being turned into bytes. Again, check the hex at both sides before it is passed to SHA-256. Converting text to bytes is a common source of errors. You need to be very sure what stringToBytes() is doing and that whatever you are using on the Java side is doing exactly the same.
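A minimal sketch of that check on the Java side (the passphrase is hypothetical; the JavaScript side should print the same hex before the two ends can be expected to agree):

import java.security.MessageDigest;

public class KeyCheck {
    public static void main(String[] args) throws Exception {
        String passphrase = "correct horse battery staple"; // hypothetical passphrase
        byte[] key = MessageDigest.getInstance("SHA-256").digest(passphrase.getBytes("UTF-8"));
        StringBuilder hex = new StringBuilder();
        for (byte b : key) {
            hex.append(String.format("%02x", b)); // print the derived key as hex
        }
        System.out.println(hex); // compare against the hex printed on the JavaScript side
    }
}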
I have a Jersey OAuth provider that uses HmacSHA1 for signing/verifying requests. This works on my development & test platforms, where client & server are different physical systems. However, when I move to a production platform the HmacSHA1 algorithm (provider-side) returns a different value than the HmacSHA1 algorithm (client-side) using the same params & secret, and my OAuth validation fails.
The JDK (1.6.x) is the same exact version on both the provider and client for all platforms.
When I switched my oauth provider & client to use the PLAINTEXT signature method (bad for security, I know), it works on all platforms.
When I dug into the Jersey OAuthSignature.verify() method, I found it calls the signature method's (HmacSHA1 or PLAINTEXT) verify function, which simply signs the OAuth elements with the secret and compares the value against the signature passed in.
For HmacSHA1, the method calls the Base64.encode() method to generate the signature, but for PLAINTEXT no encoding is done (as expected).
What could be causing the Base64.encode() method using an HmacSHA1 signature algorithm to have different results using the same params & secret on both systems?
Thanks in advance!
--TK
One educated guess: if platform encodings differ (quite common; some platforms use ISO-8859-1, others UTF-8, Windows maybe CP-1250 or whatever), AND the OAuth library in question has newbie bugs where the encoding is not specified when converting between byte[] and String, AND there are characters that encode differently in different encodings (usually anything outside the 7-bit ASCII range, characters 0 - 127), then you will end up with different signatures.
So -- check what the platform default encoding is, and force it to be the same on both sides first. If this solves the issue, I would consider reporting it as a bug to the OAuth lib (or the framework that bundles it) author(s), or at least ask on the mailing lists.
I have seen such bugs ("test".getBytes() with no charset specified) VERY often -- it is one of the most common Java anti-patterns in existence. The worst part is that it is a bug that only causes issues under specific circumstances, so people are not bitten badly enough to fix it.
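A small sketch of both checks - printing the platform default and showing how the anti-pattern diverges (the secret string is hypothetical; getBytes("UTF-8") keeps it JDK 1.6-compatible):

import java.nio.charset.Charset;

public class EncodingCheck {
    public static void main(String[] args) throws Exception {
        // Run this on both systems and compare: if the defaults differ, so can the signatures.
        System.out.println("default charset: " + Charset.defaultCharset());

        String secret = "clé";                        // hypothetical secret with a non-ASCII character
        byte[] platformBytes = secret.getBytes();     // anti-pattern: uses the platform default
        byte[] utf8Bytes = secret.getBytes("UTF-8");  // explicit and portable
        System.out.println(platformBytes.length + " vs " + utf8Bytes.length);
    }
}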
Another potential issue is URL encoding -- handling of certain characters (space, %, +) can differ between implementations, due to subtle bugs in encoding/decoding. So check whether the content you are passing has 'special' characters; try to see if eliminating them (for testing) makes a difference, and zero in on what triggers the difference.