Why does the following SecureRandom NativePRNG fail on windows? - java

import java.security.NoSuchAlgorithmException;
import java.security.SecureRandom;

public class Main {
    public static void main(String[] args) throws NoSuchAlgorithmException {
        SecureRandom srand = SecureRandom.getInstance("NativePRNG");
        System.out.println(srand.nextInt());
    }
}
How to run with NativePRNG on windows?
UPD:
By default, in Java 10+, the <JDK>\conf\security\java.security file already sets
crypto.policy=unlimited, so nothing needs to be changed in java.security for Java 10+.
And instead of the NativePRNG algorithm, we can use Windows-PRNG:
SecureRandom srand = SecureRandom.getInstance("Windows-PRNG");
System.out.println(srand.nextInt());
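If the same code has to run on both Unix-like systems and Windows, one portable approach is to try each native algorithm name and fall back to the platform default. This is a minimal sketch; the class name PortablePrng and the fallback order are my own choices, not from the question:

```java
import java.security.NoSuchAlgorithmException;
import java.security.SecureRandom;

public class PortablePrng {
    public static SecureRandom create() {
        // Try the Unix-only NativePRNG first, then the Windows native source,
        // and finally fall back to the platform default.
        for (String alg : new String[] {"NativePRNG", "Windows-PRNG"}) {
            try {
                return SecureRandom.getInstance(alg);
            } catch (NoSuchAlgorithmException e) {
                // Not available on this platform; try the next candidate.
            }
        }
        return new SecureRandom();
    }

    public static void main(String[] args) {
        SecureRandom srand = create();
        System.out.println(srand.getAlgorithm() + ": " + srand.nextInt());
    }
}
```

Note that plain `new SecureRandom()` already picks the highest-priority installed algorithm, so the explicit loop only matters if you specifically prefer the native sources.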

The code fails on Windows because the "NativePRNG" algorithm is simply not available there. "NativePRNG" is registered only by the SUN provider on Unix-like platforms, where it draws entropy from /dev/random and /dev/urandom; Windows has no such devices, so the provider does not register the algorithm on that platform.
Installing the JCE Unlimited Strength Jurisdiction Policy Files does not help: the policy files only lift key-length restrictions, they do not change which algorithms a provider offers. To get a native source of randomness on Windows, request the "Windows-PRNG" algorithm (supplied by the SunMSCAPI provider), or simply use new SecureRandom() or SecureRandom.getInstanceStrong(), which select a suitable platform default:
import java.security.NoSuchAlgorithmException;
import java.security.NoSuchProviderException;
import java.security.SecureRandom;

public class Main {
    public static void main(String[] args)
            throws NoSuchAlgorithmException, NoSuchProviderException {
        SecureRandom srand = SecureRandom.getInstance("Windows-PRNG", "SunMSCAPI");
        System.out.println(srand.nextInt());
    }
}
Note that the provider argument is optional; if you omit it, the first installed provider that supports the requested algorithm is used. Different providers register different algorithms on different platforms.
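To see which SecureRandom algorithms actually exist on your platform, you can enumerate the services of the installed providers. This diagnostic sketch (the class name ListPrngs is mine, not from the question) prints each provider's SecureRandom offerings:

```java
import java.security.Provider;
import java.security.Security;

public class ListPrngs {
    public static void main(String[] args) {
        // Walk every installed provider and print the SecureRandom
        // algorithms it registers on this platform.
        for (Provider p : Security.getProviders()) {
            for (Provider.Service s : p.getServices()) {
                if ("SecureRandom".equals(s.getType())) {
                    System.out.println(p.getName() + ": " + s.getAlgorithm());
                }
            }
        }
    }
}
```

On Windows you should see Windows-PRNG listed under SunMSCAPI; on Linux, NativePRNG and its variants under SUN.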

Related

Using the JarSigner with RSASSA-PSS

I'm trying to RSA-PSS-sign a JAR file with the JarSigner using PKCS#11. To specify the signature algorithm, the JarSigner uses the sigalg flag. The JDK 14 Docs of the JarSigner do not specify which sigalgs are explicitly supported. I have tested that the JarSigner accepts "RSASSA-PSS" as a valid algorithm. The JarSigner does not accept "SHA256withRSASSA-PSS" or similar RSASSA-PSS variants that Java Crypto Service Providers, such as the SunPKCS11 Crypto Service Provider, often support.
When trying to sign with the sigalg "RSASSA-PSS" the JarSigner returns
jarsigner: unable to sign jar: java.security.SignatureException: Parameters required for RSASSA-PSS signature
This exception means that the PSS parameters are not set. I have traced the problem down to the JarSigner:
- not having any way to pass PSS parameters through the command line (see the JDK 14 Docs of the JarSigner)
- never setting PSS parameters: JarSigner.java never directly (see lines 831 to 843) or indirectly (see Signature.java and P11PSSSignature.java) calls setParameter -> engineSetParameter -> setSigParams, which is responsible for setting the required PSS params.
Am I missing something? If yes, how can I RSA-PSS-sign a JAR file? If no, is this a bug? After all, the JarSigner clearly accepts RSASSA-PSS as a valid sigalg. Or is this rather an incompatibility between the JarSigner and the SunPKCS11 implementation? After all, SunPKCS11 could just be using hardcoded PSS param values in such a case.
It looks like this is not supported yet. I can reproduce this behaviour both with the jarsigner command line tool and with Java code like this:
import jdk.security.jarsigner.JarSigner;
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.security.*;
import java.security.cert.CertPath;
import java.security.cert.CertificateException;
import java.security.cert.CertificateFactory;
import java.util.Arrays;
import java.util.zip.ZipFile;
class JarSignerDemo {
    public static void main(String[] args) throws IOException, CertificateException,
            NoSuchAlgorithmException, KeyStoreException, UnrecoverableKeyException {
        char[] password = "changeit".toCharArray();
        KeyStore keyStore = KeyStore.getInstance(new File("keystore.jks"), password);
        PrivateKey privateKey = (PrivateKey) keyStore.getKey("mykey", password);
        CertPath certPath = CertificateFactory.getInstance("X.509")
                .generateCertPath(Arrays.asList(keyStore.getCertificateChain("mykey")));
        JarSigner jarSigner = new JarSigner.Builder(privateKey, certPath)
                .digestAlgorithm("SHA-256")
                .signatureAlgorithm("RSASSA-PSS")
                .build();
        try (ZipFile jarFile = new ZipFile("my.jar");
             FileOutputStream signedJarFile = new FileOutputStream("my-signed.jar")) {
            jarSigner.sign(jarFile, signedJarFile);
        }
    }
}
Exception in thread "main" jdk.security.jarsigner.JarSignerException: Error creating signature
at jdk.jartool/jdk.security.jarsigner.JarSigner.sign(JarSigner.java:573)
at JarSignerDemo.main(scratch_3.java:28)
Caused by: java.security.SignatureException: Parameters required for RSASSA-PSS signatures
at java.base/sun.security.rsa.RSAPSSSignature.ensureInit(RSAPSSSignature.java:295)
at java.base/sun.security.rsa.RSAPSSSignature.engineUpdate(RSAPSSSignature.java:346)
at java.base/java.security.Signature$Delegate.engineUpdate(Signature.java:1393)
at java.base/java.security.Signature.update(Signature.java:902)
at java.base/java.security.Signature.update(Signature.java:871)
at jdk.jartool/jdk.security.jarsigner.JarSigner.sign0(JarSigner.java:841)
at jdk.jartool/jdk.security.jarsigner.JarSigner.sign(JarSigner.java:562)
... 1 more
It looks like JDK-8245274 is meant to add this feature to Java 16. I am not 100% sure, but it looks like your issue. You may want to watch it.
Update: Slightly off-topic, but it looks like you can sign with RSASSA-PSS using BouncyCastle. I am not sure if that is an alternative for you, though. Maybe you just want to switch to another key type.

FIPS encryption policy API [duplicate]

How can I check, in Java code, if the current JVM have unlimited strength cryptography available?
In the same spirit as the answer of Dan Cruz, but with a single line of code and without going through exceptions:
boolean limit = Cipher.getMaxAllowedKeyLength("RC5") < 256;
So a complete program might be:
import javax.crypto.Cipher;
public class TestUCE {
    public static void main(String[] args) throws Exception {
        boolean unlimited = Cipher.getMaxAllowedKeyLength("RC5") >= 256;
        System.out.println("Unlimited cryptography enabled: " + unlimited);
    }
}
If you are on Linux and you have installed the JDK (but BeanShell is not available), you can check with the jrunscript command provided with the JDK.
jrunscript -e 'exit (javax.crypto.Cipher.getMaxAllowedKeyLength("RC5") >= 256 ? 0 : 1);'; echo $?
This returns a 0 status code if the Unlimited Cryptography is available, or 1 if not available. Zero is the correct 'success' return value for shell functions, and non-zero indicates a failure.
I think you could probably use Cipher.getMaxAllowedKeyLength(), while also comparing the cipher you're using to known lists of "good", secure ciphers, such as AES.
Here's a reference article that lists maximum key size jurisdiction limitations that were current as of Java 1.4 (these likely haven't changed, unless the law has also changed - see below).
If you are operating in a nation that has cryptographic export/import restrictions, you'd have to consult the law in your nation, but it's probably safe to assume in these situations that you don't have unlimited strength cryptography available (by default) in your JVM. Putting it another way, assuming you're using the official JVM from Oracle, and you happen to live in a nation against which the U.S. has leveled export restrictions for cryptography (and since Oracle is a United States company, it would be subject to these restrictions), then you could also assume in this case that you don't have unlimited strength available.
Of course, that doesn't stop you from building your own, and thereby granting yourself unlimited strength, but depending on your local laws, that might be illegal.
This article outlines the restrictions on export to other nations from the United States.
How to check whether restrictions apply is documented in the method Cipher.getMaxAllowedKeyLength:
If JCE unlimited strength jurisdiction policy files are installed, Integer.MAX_VALUE will be returned.
This means that if any value lower than Integer.MAX_VALUE is returned, restrictions do apply.
Even more information is in the JavaDoc of the method below:
/**
 * Determines if cryptography restrictions apply.
 * Restrictions apply if the value of {@link Cipher#getMaxAllowedKeyLength(String)}
 * returns a value smaller than {@link Integer#MAX_VALUE}.
 * This method uses the transformation <code>"AES/CBC/PKCS5Padding"</code>, an often
 * used algorithm that is an implementation requirement for Java SE.
 *
 * @return <code>true</code> if restrictions apply, <code>false</code> otherwise
 */
public static boolean restrictedCryptography() {
    try {
        return Cipher.getMaxAllowedKeyLength("AES/CBC/PKCS5Padding") < Integer.MAX_VALUE;
    } catch (final NoSuchAlgorithmException e) {
        throw new IllegalStateException("The transform \"AES/CBC/PKCS5Padding\" is not available"
                + " (the availability of this algorithm is mandatory for Java SE implementations)", e);
    }
}
Note that since Java 9 the unlimited crypto policies are installed by default (with those affected by import / export regulations having to install the limited crypto policies instead). So this code would mainly be required for backwards compatibility and/or other runtimes.
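On Java 9+ you can also read the active policy directly, since crypto.policy is exposed as a Security property. A small sketch, assuming the property is set in the effective java.security file (it can be null on some runtimes, hence the key-length cross-check):

```java
import java.security.Security;
import javax.crypto.Cipher;

public class PolicyCheck {
    public static void main(String[] args) throws Exception {
        // "unlimited" on a default Java 9+ install; may be null if the
        // java.security file does not set it.
        String policy = Security.getProperty("crypto.policy");
        int maxAes = Cipher.getMaxAllowedKeyLength("AES");
        System.out.println("crypto.policy=" + policy + ", max AES key length=" + maxAes);
    }
}
```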
This is a complete copy-paste version to allow for testing:
import javax.crypto.Cipher;
import java.security.NoSuchAlgorithmException;
class Test {
    public static void main(String[] args) {
        int allowedKeyLength = 0;
        try {
            allowedKeyLength = Cipher.getMaxAllowedKeyLength("AES");
        } catch (NoSuchAlgorithmException e) {
            e.printStackTrace();
        }
        System.out.println("The allowed key length for AES is: " + allowedKeyLength);
    }
}
To run
javac Test.java
java Test
If the unlimited JCE policy is not active, the output is 128.
If it is active, the output is something like 2147483647.
If you are using Linux, you can check it easily with this command
java -version ; \
echo 'System.err.println(javax.crypto.Cipher.getInstance("AES/CBC/PKCS5Padding").getMaxAllowedKeyLength("AES"));' \
| java -cp /usr/share/java/bsh-*.jar bsh.Interpreter >/dev/null
If the output is something like the following, unlimited strength cryptography is not available:
java version "1.7.0_76"
Java(TM) SE Runtime Environment (build 1.7.0_76-b13)
Java HotSpot(TM) 64-Bit Server VM (build 24.76-b04, mixed mode)
128
You can check it in one step from the command line by using Groovy:
groovysh -e 'javax.crypto.Cipher.getMaxAllowedKeyLength("AES")'
If the result is 2147483647, you have unlimited cryptography.
On older versions of Groovy, you have to remove the -e:
groovysh 'javax.crypto.Cipher.getMaxAllowedKeyLength("AES")'
NOTE: Please use jefflunt's answer or KonstantinSpirov's answer. This answer is not a valid answer since it will always return true. I am leaving this answer here only because it is referenced elsewhere in answers and comments and is useful as a reference only.
You could use the following to initialize a static final boolean somewhere that you can then use for testing unlimited crypto support (since AES 256-bit is only supported if the unrestricted policy is installed).
boolean isUnlimitedSupported = false;
try {
    KeyGenerator kgen = KeyGenerator.getInstance("AES", "SunJCE");
    kgen.init(256);
    isUnlimitedSupported = true;
} catch (NoSuchAlgorithmException e) {
    isUnlimitedSupported = false;
} catch (NoSuchProviderException e) {
    isUnlimitedSupported = false;
}
System.out.println("isUnlimitedSupported=" + isUnlimitedSupported);
// set static final variable = isUnlimitedSupported;
I recently had to add a JCE check, and my solution evolved to the following snippet. This was a Groovy script, but it should be easy to convert to a standard Java method with a try/catch. This has been tested with Java 7 and Java 8.
import javax.crypto.Cipher;
import javax.crypto.SecretKey;
import javax.crypto.spec.SecretKeySpec;

// Make a blank 256-bit AES key
final SecretKey secretKey = new SecretKeySpec(new byte[32], "AES");
final Cipher encryptCipher = Cipher.getInstance("AES/CBC/PKCS5Padding");
// This line will throw an invalid key length exception if you don't have
// JCE Unlimited Strength installed
encryptCipher.init(Cipher.ENCRYPT_MODE, secretKey);
// If it makes it here, you have JCE installed

Java - getOpenFileDescriptorCount for Windows

How to get the number of open File descriptors under Windows?
On unix there is this:
UnixOperatingSystemMXBean.getOpenFileDescriptorCount()
But there doesn't seem to be an equivalent for windows?
This was going to be a comment but got a little long winded.
Conflicting answers as to why there may be a lack of equivalence here on ServerFault: Windows Server 2008 R2 max open files limit. TLDR: Windows is only limited by available hardware vs Windows is limited by 32 vs 64 bit implementation (MS Technet Blog Post - Pushing the Limits of Windows: Handles). Granted, this is old information.
But! If you check the JavaDocs for the com.sun.management package, you will note the conspicuous absence of a Windows version of the UnixOperatingSystemMXBean that would extend OperatingSystemMXBean to provide the functionality. Even UnixOperatingSystemMXBean exists mainly to provide getMaxFileDescriptorCount() and getOpenFileDescriptorCount(), so it seems unlikely that Windows has the same concept.
Edit:
I did find a nice little program that sort of shows this off, which I tweaked.
Descriptors.java
import java.lang.management.ManagementFactory;
import java.lang.management.OperatingSystemMXBean;
import java.lang.reflect.Method;
class Descriptors {
    public static void main(String[] args) {
        OperatingSystemMXBean osMxBean = ManagementFactory.getOperatingSystemMXBean();
        System.out.println(osMxBean.getClass().getName());
        try {
            Method getMaxFileDescriptorCountField =
                    osMxBean.getClass().getDeclaredMethod("getMaxFileDescriptorCount");
            Method getOpenFileDescriptorCountField =
                    osMxBean.getClass().getDeclaredMethod("getOpenFileDescriptorCount");
            getMaxFileDescriptorCountField.setAccessible(true);
            getOpenFileDescriptorCountField.setAccessible(true);
            System.out.println(getOpenFileDescriptorCountField.invoke(osMxBean)
                    + "/" + getMaxFileDescriptorCountField.invoke(osMxBean));
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
On Linux:
com.sun.management.UnixOperatingSystem
11/2048
On Windows:
sun.management.OperatingSystemImpl
java.lang.NoSuchMethodException:
sun.management.OperatingSystemImpl.getMaxFileDescriptorCount()
at java.lang.Class.getDeclaredMethod(Unknown Source)
at Descriptors.main(Descriptors.java:10)
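Reflection can be avoided entirely, since the com.sun.management.UnixOperatingSystemMXBean interface is present on every platform; only the implementation behind it differs. A sketch of an instanceof-based variant (the class name FdCount is mine):

```java
import java.lang.management.ManagementFactory;
import java.lang.management.OperatingSystemMXBean;
import com.sun.management.UnixOperatingSystemMXBean;

public class FdCount {
    public static void main(String[] args) {
        OperatingSystemMXBean os = ManagementFactory.getOperatingSystemMXBean();
        // On Windows the returned bean does not implement the Unix-specific
        // subinterface, so the instanceof check fails cleanly instead of
        // throwing NoSuchMethodException as the reflection version does.
        if (os instanceof UnixOperatingSystemMXBean) {
            UnixOperatingSystemMXBean unix = (UnixOperatingSystemMXBean) os;
            System.out.println(unix.getOpenFileDescriptorCount()
                    + "/" + unix.getMaxFileDescriptorCount());
        } else {
            System.out.println("No file descriptor counts available on " + os.getName());
        }
    }
}
```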

What prevents Java from verifying signed jars with multiple signature algorithms

Quick background: We release a webstart application, which includes our own application jars and numerous third-party jars. Webstart requires that all distributed jars referred to by the jnlp file be signed by a single certificate. We therefore sign all jars (our jars and the third-party jars) using a self-signed certificate. Some third-party jars are already signed by the party which produced them, but we just sign them again, and this works fine. Until now.
Problem: We recently moved from Java 6 to Java 7, and suddenly webstart is refusing to load some jars, complaining: "Invalid SHA1 signature file digest". This only happens for some jars and not others, and the common thread among the jars that fail appears to be that they have multiple signatures.
After searching around on S.O. and the internet, it appears that the default signature algorithm for Java's jarsigner has changed between Java 6 and Java 7, from SHA1 to SHA256, and various people are recommending using "jarsigner -digestalg SHA1" to work around verification issues. I tried that, and sure enough our multiply-signed jars now verify. So this appears to be a workaround for our issue.
From what I can gather, it appears that the third-party signature is a SHA1 signature, and we were signing with the default -- SHA256 -- resulting in a mixing of signatures. When I force SHA1 using the '-digestalg' switch, we have two signatures of the same type, and verification now works. So it seems the problem is caused by having multiple signatures with different algorithms? Or is there some other factor I'm missing.
Questions:
Why does it fail to verify with SHA1 + SHA256, but verifies with SHA1 + SHA1? Is there a technical reason? A security policy reason? Why can't it verify that both signatures are correct?
Is there any drawback to us using (continuing to use) SHA1 instead of the now-default SHA256?
Rather than re-signing the third party jars yourself, you can create a separate JNLP file for each third-party signer that refers to the relevant jar files, then have your main JNLP depend on these using the <extension> element. The restriction that all JAR files must be signed by the same signer only applies within one JNLP, each extension can have a different signer.
Failing that, you could strip out the third party signatures before adding your own (by repacking them without META-INF/*.{SF,DSA,RSA})
I know this is a bit late, but we are going through this now. Our problem was the "MD2withRSA" signing issue. I resolved the problem in a couple of steps:
1) Worked with Verisign to remove the 'old' algorithm from our certificate, so the MD2withRSA algorithm was no longer used to sign our jars.
2) We also have a pile of 3rd-party jars, and we just re-sign them with our certificate. We encountered the 'not all jars signed with the same certificate' error when both the SHA1 and SHA-256 algorithms were listed in MANIFEST.MF. This was just a small subset of the jars, so for those we removed the bottom half of the MANIFEST.MF file: the part with the Name: entries and the algorithm spec. That data is re-generated in the last part of our process. We unzip, exclude the old signing info, and re-jar. The last step is to re-sign the jars. We found that in some cases, if the old Name: entry with the SHA1 digest was in MANIFEST.MF, the signing did not replace it with SHA-256, so we handle those jars manually (for now). Working on updating our Ant tasks to handle this.
Sorry - can't speak to why web start doesn't handle/allow it - just figured out how to make it work!
Good Luck!
Seems like a bug in the JRE. Personally I'm assuming the old default signing algorithm (DSA with SHA1 digest) is less secure than the new one (RSA with SHA256 digest), so it's best not to use the "-digestalg SHA1" option.
I solved this problem by using a custom Ant task in my build script to 'unsign' my jars before signing them. That way there is only one signature for each jar.
Here's my Ant task:
import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;
import java.util.zip.ZipEntry;
import java.util.zip.ZipInputStream;
import java.util.zip.ZipOutputStream;
import org.apache.tools.ant.BuildException;
import org.apache.tools.ant.Task;
import org.apache.tools.ant.types.FileSet;
import org.apache.tools.ant.types.Path;
import org.apache.tools.ant.types.Resource;
import org.apache.tools.ant.types.resources.FileProvider;
import org.apache.tools.ant.types.resources.FileResource;
import org.apache.tools.ant.util.FileUtils;
import org.apache.tools.ant.util.ResourceUtils;
public class UnsignJar extends Task {
    protected List<FileSet> filesets = new ArrayList<FileSet>();
    protected File todir;

    public void addFileset(final FileSet set) {
        filesets.add(set);
    }

    public void setTodir(File todir) {
        this.todir = todir;
    }

    @Override
    public void execute() throws BuildException {
        if (todir == null) {
            throw new BuildException("todir attribute not specified");
        }
        if (filesets.isEmpty()) {
            throw new BuildException("no fileset specified");
        }
        Path path = new Path(getProject());
        for (FileSet fset : filesets) {
            path.addFileset(fset);
        }
        for (Resource r : path) {
            FileResource from = ResourceUtils.asFileResource(r.as(FileProvider.class));
            File destFile = new File(todir, from.getName());
            File fromFile = from.getFile();
            if (!isUpToDate(destFile, fromFile)) {
                unsign(destFile, fromFile);
            }
        }
    }

    private void unsign(File destFile, File fromFile) {
        log("Unsigning " + fromFile);
        try {
            ZipInputStream zin = new ZipInputStream(new FileInputStream(fromFile));
            ZipOutputStream zout = new ZipOutputStream(new FileOutputStream(destFile));
            ZipEntry entry = zin.getNextEntry();
            while (entry != null) {
                // Drop everything under META-INF (signature files, manifest digests)
                if (!entry.getName().startsWith("META-INF")) {
                    copyEntry(zin, zout, entry);
                }
                zin.closeEntry();
                entry = zin.getNextEntry();
            }
            zin.close();
            zout.close();
        } catch (IOException e) {
            throw new BuildException(e);
        }
    }

    private void copyEntry(ZipInputStream zin, ZipOutputStream zout, ZipEntry entry)
            throws IOException {
        zout.putNextEntry(entry);
        byte[] buffer = new byte[1024 * 16];
        int byteCount = zin.read(buffer);
        while (byteCount != -1) {
            zout.write(buffer, 0, byteCount);
            byteCount = zin.read(buffer);
        }
        zout.closeEntry();
    }

    private boolean isUpToDate(File destFile, File fromFile) {
        return FileUtils.getFileUtils().isUpToDate(fromFile, destFile);
    }
}

Java Security NoSuchAlgorithmException in only One of Three Eclipse Instances

Recently had a problem with
java.security.NoSuchAlgorithmException: Algorithm HmacSHA1 not available
Tried to isolate the problem, using this simple code:
import java.security.NoSuchAlgorithmException;
import javax.crypto.Mac;
public class Main {
    public static void main(String... args) {
        final String HMAC_SHA1_ALGORITHM = "HmacSHA1";
        Mac instance;
        try {
            instance = Mac.getInstance(HMAC_SHA1_ALGORITHM);
        } catch (NoSuchAlgorithmException e) {
            final String errmsg = "NoSuchAlgorithmException: "
                    + HMAC_SHA1_ALGORITHM + " " + e;
            // ...
        }
    }
}
The thing is that this works in one of my Eclipse instances, and not on the other. All tested using New Java Project, pasting the above, and running, so this should be due to some settings difference between the Eclipse instances.
I've tried to look through all the seemingly relevant settings (classpath, JRE, java compiler), but nothing that looks different or makes it work if changed. (If someone knows how to 'diff' the settings of two Eclipses, do tell!)
I'm resorting to simply using a third Eclipse (where it works (so far)), but it would still be interesting to learn what this potentially infuriating problem is really down to.
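One way to "diff" what matters here is to run the same diagnostic in each Eclipse instance and compare the output: it shows which JRE each project actually resolved and which providers supply HmacSHA1. A small sketch (the class name MacDiagnostics is mine):

```java
import java.security.Provider;
import java.security.Security;

public class MacDiagnostics {
    public static void main(String[] args) {
        // Which JRE is this project really running on?
        System.out.println("java.home=" + System.getProperty("java.home"));
        // Which providers register an HmacSHA1 Mac implementation?
        for (Provider p : Security.getProviders()) {
            Provider.Service s = p.getService("Mac", "HmacSHA1");
            if (s != null) {
                System.out.println(p.getName() + " provides HmacSHA1");
            }
        }
    }
}
```

If one instance prints a different java.home or an empty provider list, that points at a misconfigured JRE (e.g. a JRE with a missing or broken SunJCE) rather than an Eclipse setting per se.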
I had that same exception using Eclipse and Tomcat. What I did was reload the JDK in the Tomcat 8 configuration, and it started working.
