Can you use BigInteger.isProbablePrime() to generate cryptographically secure primes? What certainty is necessary for them to be "secure"?
I do not hold a degree in crypto, so take this with a grain of salt.
You have two major areas of concern here:
Your primes need to be unpredictably random. This means that you need a source such as SecureRandom to generate them. No matter how sure you are of their primality, if they are predictable, the entire cryptosystem fails to meet its goal. If you are using the BigInteger(int bitLength, int certainty, Random rnd) constructor, you can pass in your SecureRandom, since it subclasses Random.
Your potential primes need to be reasonably certain of actually being prime (I'm assuming that you are using an algorithm that relies on the hardness of factoring). If you get a probable prime, but an attacker can, with good probability, factor it within 5 minutes because it had a factor that the primality test never noticed, you are somewhat out of luck with your algorithm. Miller-Rabin is generally used, and this answer states that a certainty of 15 is sufficient for 32-bit integers. A value up to 40 is recommended, and anything beyond that is meaningless.
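For illustration, here is a minimal sketch combining both points via the BigInteger(int bitLength, int certainty, Random rnd) constructor (the 2048-bit size and certainty of 40 are assumptions, not requirements):

import java.math.BigInteger;
import java.security.SecureRandom;

public class SecurePrimeDemo {
    public static void main(String[] args) {
        SecureRandom rnd = new SecureRandom(); // unpredictable source (point 1)
        // Probability that a composite slips through is below 2^-40 (point 2).
        BigInteger p = new BigInteger(2048, 40, rnd);
        System.out.println(p.bitLength() + "-bit probable prime: " + p);
    }
}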
This is the way I generated a secure BigInteger for my cryptographic application.
Here's my code:
BigInteger b = new BigInteger(25, new SecureRandom());
Since you also need it for a crypto application, in my opinion getting a BigInteger this way is right.
Note: Remember that SecureRandom objects are costly performance-wise, so you should not initialize them many times.
After reading the comments, I worked it out further.
Here's a way which gives you more certainty of getting a prime number.
BigInteger b = BigInteger.probablePrime(25, new SecureRandom());
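Putting the two notes together - one shared, self-seeded SecureRandom plus probablePrime() - a minimal sketch might look like this (the class and method names are made up for illustration):

import java.math.BigInteger;
import java.security.SecureRandom;

public class PrimeSource {
    // One shared instance: constructing SecureRandom repeatedly is costly.
    private static final SecureRandom RND = new SecureRandom();

    static BigInteger next25BitPrime() {
        return BigInteger.probablePrime(25, RND);
    }
}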
As @hexafraction says, you need to use SecureRandom to generate a cryptographic-quality random number. The Javadoc says that the generated prime is secure to 2^-100, i.e. the probability that it is composite does not exceed 2^-100. If you want more security (say 2^-128, for AES-equivalent security), then run more iterations of the Miller-Rabin test on it. Each iteration gives you an extra 2^-2 of security, so fourteen more iterations would get you to 2^-128.
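If you want to see what those extra iterations look like in code, a minimal sketch (the 2048-bit length is an arbitrary choice) is:

import java.math.BigInteger;
import java.security.SecureRandom;

public class StrongerPrime {
    public static void main(String[] args) {
        SecureRandom rnd = new SecureRandom();
        BigInteger p;
        do {
            p = BigInteger.probablePrime(2048, rnd); // composite with probability <= 2^-100
        } while (!p.isProbablePrime(128));           // push the bound down to 2^-128
        System.out.println(p);
    }
}

isProbablePrime(128) asks the library for enough rounds that the composite probability does not exceed 2^-128; in practice the loop body almost never repeats.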
What is the best and fastest algorithm for generating cryptographically secure pseudo-random numbers? My requirements are as follows.
In each run, I need to generate a few million numbers of length 10.
It should not allow one to predict future iterations by observing some number of iterations.
Numbers should be unique. No duplicates allowed.
Speed is also important since it will generate millions of numbers in each iteration.
I checked the Mersenne Twister algorithm, but it seems it is not cryptographically secure: its future output can be predicted after observing 624 of its outputs.
P.S. It would be better if there is a Java Implementation.
Your uniqueness requirement means that you cannot use any sort of RNG (my earlier comment was wrong), since random numbers will include repeats. Instead, use a cipher and encrypt the numbers 0, 1, 2, 3, ..., which gives guaranteed unique results. Since ciphers are reversible, each ciphertext decrypts back to its original plaintext, so the ciphertexts are a permutation of the plaintexts.
You also want numbers of "length ten". I assume this means ten decimal digits, [1,000,000,000 .. 9,999,999,999]. That means you need a cipher which works in the range [0 .. 8,999,999,999], and just add 1e9 to the output.
That is more complex. Either use the Hasty Pudding cipher, which can be set to any range of numbers you want, or roll your own Feistel cipher with its block size set to the next higher power of 2. If a number comes out of range, re-encrypt it until it lands in range (this is known as cycle walking). The Feistel option will be faster but less secure; you can make it more secure by increasing the number of rounds, at the cost of making it slower.
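A minimal sketch of the Feistel option with cycle walking follows; the class name, round count, and SHA-256-based round function are illustrative assumptions, not a vetted construction.

import java.nio.ByteBuffer;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

// Encrypts the counters 0, 1, 2, ... into unique values in [0, 9e9);
// add 1,000,000,000 to the output to get ten-digit numbers.
public class FeistelPermutation {
    private static final long RANGE = 9_000_000_000L; // size of the target range
    private static final int HALF_BITS = 17;          // 2*17 = 34 bits: next power of 2 above RANGE
    private static final long HALF_MASK = (1L << HALF_BITS) - 1;
    private static final int ROUNDS = 7;              // more rounds = slower but stronger

    private final byte[] key;
    private final MessageDigest sha;

    public FeistelPermutation(byte[] key) throws NoSuchAlgorithmException {
        this.key = key.clone();
        this.sha = MessageDigest.getInstance("SHA-256");
    }

    // One pass of a balanced Feistel network over a 34-bit block.
    private long encryptBlock(long x) {
        long left = (x >>> HALF_BITS) & HALF_MASK;
        long right = x & HALF_MASK;
        for (int r = 0; r < ROUNDS; r++) {
            long f = roundFunction(r, right);
            long next = (left ^ f) & HALF_MASK;
            left = right;
            right = next;
        }
        return (left << HALF_BITS) | right;
    }

    // Round function: truncated SHA-256 over (key, round, half-block).
    private long roundFunction(int round, long half) {
        sha.reset();
        sha.update(key);
        byte[] digest = sha.digest(ByteBuffer.allocate(12).putInt(round).putLong(half).array());
        return ByteBuffer.wrap(digest).getLong() & HALF_MASK;
    }

    // Cycle walking: re-encrypt until the value lands inside the range.
    public long permute(long counter) {
        long x = encryptBlock(counter);
        while (x >= RANGE) x = encryptBlock(x);
        return x;
    }
}

Feeding in the counters 0 through 8,999,999,999 yields every value in the range exactly once, so uniqueness is guaranteed by construction.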
Use a linear feedback shift register (LFSR). This approach is fast, and it is the building block of several stream ciphers.
Build the register from a primitive characteristic polynomial of degree n. This gives a sequence of 2^n - 1 unique numbers, after which the sequence repeats. Do not seed with zero. Some well-known stream cipher algorithms are based on LFSRs, so you can extract and study an existing implementation - for example LILI-128 (though its reference implementation is in C).
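Here is a minimal Fibonacci LFSR sketch, using the well-known maximal-length taps 16, 14, 13, 11 for a 16-bit register (the degree and taps are just for illustration):

// 16-bit Fibonacci LFSR for the primitive polynomial x^16 + x^14 + x^13 + x^11 + 1;
// the nonzero state cycles through all 2^16 - 1 values before repeating.
public class Lfsr16 {
    private int state; // must never be zero

    public Lfsr16(int seed) {
        if ((seed & 0xFFFF) == 0) throw new IllegalArgumentException("do not seed with zero");
        state = seed & 0xFFFF;
    }

    // Shift once and return the output bit.
    public int nextBit() {
        int out = state & 1;
        // Feedback is the XOR of the tap positions.
        int feedback = ((state >> 0) ^ (state >> 2) ^ (state >> 3) ^ (state >> 5)) & 1;
        state = (state >>> 1) | (feedback << 15);
        return out;
    }
}

Keep in mind that a bare LFSR is linear, so its state can be recovered from a short stretch of output; the stream ciphers built on LFSRs add nonlinear filtering or combining on top.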
public static void main(String[] args)
{
    Random ranGen = new SecureRandom();
    ranGen.setSeed(0);
    int randomNumber = ranGen.nextInt(2);
    System.out.print(randomNumber);
}
Is the above code a good way to either produce a truly random and secure/unbiased 0 or 1 ?
No, it isn't (a good way to either produce a truly random and secure/unbiased 0 or 1).
new SecureRandom() is OK, but setting the seed directly after isn't. Oracle's implementation of "SHA1PRNG" - which is normally the default - will replace the initial seed by the one given if setSeed() is called before any entropy is retrieved from the random number generator.
If it is not seeded, the SecureRandom implementation will seed itself from an entropy source of the operating system. This should be relatively safe: often the operating system is the only component with access to real hardware entropy, so it is better positioned than the JVM to retrieve it. You can still call setSeed(), but only after a call to nextBytes() requesting at least one byte; after that, the given seed is added to the pool instead of replacing it.
Finally, calling nextInt(2) is OK but not very efficient: it requests 32 bits and then tosses away the 31 bits it doesn't require. Requesting random bytes and extracting the bits from the array would be more efficient (if efficiency is even required - don't over-optimize). Calling nextBoolean() will probably just toss away 7 bits, so it is at least 4 times more efficient and much more concise.
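To make the ordering concrete, here is a minimal sketch of the safer pattern described above (the extra seed material shown is just a placeholder):

import java.security.SecureRandom;

public class CoinFlip {
    public static void main(String[] args) {
        SecureRandom rng = new SecureRandom();

        // Force self-seeding from the OS entropy source first.
        byte[] dummy = new byte[1];
        rng.nextBytes(dummy);

        // Now setSeed() supplements the pool instead of replacing the seed.
        rng.setSeed(42L); // placeholder extra material, not the only entropy

        // One unbiased random bit, without tossing away 31 of 32 bits.
        System.out.println(rng.nextBoolean() ? 1 : 0);
    }
}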
Let's assume I have a reliable source of truly random numbers, but it is very slow: it only gives me a few hundred numbers every couple of hours.
Since I need way more than that, I was thinking of using those few precious true random numbers as seeds for java.util.Random (or scala.util.Random), always picking a fresh seed to generate the next random number.
So I guess my questions are:
Can the numbers I generate from those Random instances in Java be considered truly random, since the seed is truly random?
Is there still a condition that is not met for true randomness?
If I keep adding levels, at what point will the randomness be lost?
Or (as I thought when I came up with this) is it truly random as long as the stream of seeds is?
I am assuming that nobody has intercepted the stream of seeds, but I do not plan to use those numbers for security purposes.
For a pseudo-random generator like java.util.Random, the next number in the sequence becomes predictable given only a few numbers from the sequence, so you will lose your "true randomness" very fast. Better to use one of the generators provided by java.security.SecureRandom - these are all strong generators with a VERY long sequence length, which should be pretty hard to predict.
Java's Random gives uniformly spread random numbers. That is not true randomness, which may well yield the same number five times in a row.
Furthermore, for every specific seed the same sequence is generated (intentionally). With 2^64 possible seeds that is in general irrelevant. (Note that hackers could store the first ten numbers of every sequence, thereby rapidly catching up.)
So if you use a truly random number as seed at large intervals, you will get a uniform distribution within each interval. In effect this is not very different from not using the true randomizer at all.
Now, combining random sequences might reduce the randomness. But maybe translating the true random number to bytes and XOR-ing every new pseudo-random number with one of those bytes would give a wilder variance.
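Here is a rough sketch of that XOR-mixing idea; the trueRandomBytes array stands in for output of the slow true-random source, and this illustrates the idea rather than a vetted construction:

import java.util.Random;

public class MixDemo {
    public static void main(String[] args) {
        // Placeholder: in reality these bytes come from the slow true-random source.
        byte[] trueRandomBytes = {(byte) 0x5D, (byte) 0xEE, (byte) 0xCE, 0x66};

        Random prng = new Random(12345L); // the seed would also be a true random number

        for (int i = 0; i < 16; i++) {
            int pseudo = prng.nextInt(256);
            // XOR each pseudo-random byte with a true-random byte.
            int mixed = pseudo ^ (trueRandomBytes[i % trueRandomBytes.length] & 0xFF);
            System.out.println(mixed);
        }
    }
}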
Please do not take my word for it - I cannot guarantee the mathematical correctness of the above. A math/algorithms forum might give more info.
Once you take out more bits than you have put in, they are certainly no longer truly random. The break point may occur even earlier if the random number generator is bad. This can be seen by considering the entropy of the sequences: the seed value determines the sequence completely, so there are at most as many sequences as seed values. If they are all distinct, the entropy of the sequences is the same as that of the seeds (which is essentially the number of seed bits, assuming the seed is truly random).
However, if different seeds lead to the same pseudo random sequence the entropy of the sequences will be lower than that of the seeds. If we cut off the sequences after n bits, the entropy may be even lower.
But why care if you don't use it for security purposes? Are you sure the pseudo random numbers are not good enough for your application?
I need/want to get random (well, not entirely) numbers to use for password generation.
What I do: Currently I am generating them with SecureRandom.
I am obtaining the object with
SecureRandom sec = SecureRandom.getInstance("SHA1PRNG", "SUN");
and then seeding it like this
sec.setSeed(seed);
Target: A (preferably fast) way to create random numbers, which are cryptographically at least a safe as the SHA1PRNG SecureRandom implementation. These need to be the same on different versions of the JRE and Android.
EDIT: The seed is generated from user input.
Problem: With SecureRandom.getInstance("SHA1PRNG", "SUN"); it fails like this:
java.security.NoSuchProviderException: SUN. Omitting the , "SUN" argument produces random numbers, but those differ from the default (JRE 7) numbers.
Question: How can I achieve my Target?
You don't want it to be predictable: I do want that, because I need the predictability so that the same preconditions result in the same output. If they are not the same, it is impossible to do what the user expects from the application.
EDIT: By predictable I mean that, knowing a single byte (or a hundred), you should not be able to predict the next, but knowing the seed, you should be able to predict the first (and all the others). Maybe a better word is reproducible.
If anyone knows of a more intuitive way, please tell me!
I ended up isolating the SHA1PRNG from the Sun sources, which guarantees reproducibility on all versions of Java and Android. I needed to drop some important methods to ensure compatibility with Android, as Android does not have access to the nio classes...
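If isolating the Sun sources feels too heavy, a simpler route to the same reproducibility is a hash-counter construction built only on MessageDigest, which behaves identically on every JRE and on Android. This is a sketch of an alternative, not the Sun SHA1PRNG itself:

import java.nio.ByteBuffer;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

// Deterministic byte stream: block i = SHA-256(seed || i).
// The same seed gives the same stream on any platform with standard SHA-256.
public class ReproducibleRandom {
    private final byte[] seed;
    private final MessageDigest md;
    private long counter = 0;
    private byte[] block = new byte[0];
    private int pos = 0;

    public ReproducibleRandom(byte[] seed) throws NoSuchAlgorithmException {
        this.seed = seed.clone();
        this.md = MessageDigest.getInstance("SHA-256");
    }

    public byte nextByte() {
        if (pos == block.length) {
            md.reset();
            md.update(seed);
            block = md.digest(ByteBuffer.allocate(8).putLong(counter++).array());
            pos = 0;
        }
        return block[pos++];
    }
}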
I recommend using UUID.randomUUID(), then splitting it into longs using getLeastSignificantBits() and getMostSignificantBits()
If you want predictable, they aren't random. That breaks your "Target" requirement of being "safe" and devolves into a simple shared secret between two servers.
You can get something that looks sort of random but is predictable by using the characteristics of prime integers: build a set of integers by starting with I (some specific integer), adding the first prime number, and taking the result modulo the second prime number. (In truth the first and second numbers only have to be relatively prime - meaning they have no common prime factors, not counting 1, in case you call that a factor.)
If you repeat the process of adding and taking the modulo, you will get a set of numbers that you can reproduce repeatably. They are ordered in the sense that, taking any member of the set, adding the first prime and taking the modulo by the second prime, you will always get the same result.
Finally, if the 1st prime number is large relative to the second one, the sequence is not easily predictable by humans and seems sort of random.
For example, 1st prime = 7, 2nd prime = 5 (Note that this shows how it works but is not useful in real life)
Start with 2. Add 7 to get 9. Modulo 5 to get 4.
4 plus 7 = 11. Modulo 5 = 1.
Sequence is 2, 4, 1, 3, 0 and then it repeats.
Now for real life generation of numbers that seem random. The relatively prime numbers are 91193 and 65536. (I chose the 2nd one because it is a power of 2 so all modulo-ed values can fit in 16 bits.)
int first = 91193;
int modByLogicalAnd = 0xFFFF;
int nonRandomNumber = 2345; // Use something else
for (int i = 0; i < 1000; ++i) {
    nonRandomNumber += first;
    nonRandomNumber &= modByLogicalAnd;
    // print it here
}
Each iteration generates 2 bytes of sort of random numbers. You can pack several of them into a buffer if you need larger random "strings".
And they are repeatable. Your user can pick the starting point and you can use any prime you want (or, in fact, any number without 2 as a factor).
BTW - Using a power of 2 as the 2nd number makes it more predictable.
Ignoring RNGs that use some physical input (random clock bits, electrical noise, etc.), all software RNGs are predictable, given the same starting conditions. They are, after all, (hopefully) deterministic computer programs.
There are some algorithms that intentionally include physical input (by, e.g., sampling the computer clock occasionally) in an attempt to prevent predictability, but those are (to my knowledge) the exception.
So any "conventional" RNG, given the same seed and implemented to the same specification, should produce the same sequence of "random" numbers. (This is why a computer RNG is more properly called a "pseudo-random number generator".)
The fact that an RNG can be seeded with a previously-used seed and reproduce a "known" sequence of numbers does not make the RNG any less secure than one where you are somehow prevented from seeding it (though it may be less secure than the fancy algorithms that reseed themselves at intervals). And this ability to reproduce the same sequence again and again is not only extraordinarily useful in testing, it also has some "real life" applications in encryption and other areas of security. (In fact, an encryption algorithm is, in essence, simply a reproducible random number generator.)
I'm aware of the method BigInteger.probablePrime(int bitLength, Random rnd) that outputs a probable prime of any bit length. I want a REAL prime number in Java. Is there any FOSS library to do so with acceptable performance? Thanks in advance!
EDIT:
I'm looking at 1024 & 2048 bit primes.
use probablePrime() to generate a candidate
use a deterministic test such as the AKS primality test to check whether the candidate is indeed prime (AKS runs in polynomial time, though it is far from fast in practice).
edit: Or, if you don't trust isProbablePrime to give you enough certainty, use the BigInteger constructor BigInteger(int bitLength, int certainty, Random rnd), which lets you tune your certainty threshold:
certainty - a measure of the uncertainty that the caller is willing to tolerate. The probability that the new BigInteger represents a prime number will exceed (1 - 1/2^certainty). The execution time of this constructor is proportional to the value of this parameter.
Probabilistic tests used for cryptographic purposes are guaranteed to bound the probability of false positives - it's not as if there are some gotcha numbers that will sneak through; it's just a matter of how low you want the probability to be. If you don't trust the Java BigInteger class to use such a test (it would be nice if the documentation said which test is used), use the Miller-Rabin test.
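For completeness, here is a minimal Miller-Rabin sketch over BigInteger (the witness selection is kept simple for clarity, and rounds controls the 4^-rounds error bound):

import java.math.BigInteger;
import java.security.SecureRandom;

public class MillerRabin {
    private static final SecureRandom RND = new SecureRandom();
    private static final BigInteger TWO = BigInteger.valueOf(2);

    static boolean isProbablePrime(BigInteger n, int rounds) {
        if (n.compareTo(TWO) < 0) return false;
        if (!n.testBit(0)) return n.equals(TWO);                 // even: only 2 is prime
        if (n.compareTo(BigInteger.valueOf(4)) < 0) return true; // 3 is prime

        // Write n - 1 = d * 2^s with d odd.
        BigInteger nMinusOne = n.subtract(BigInteger.ONE);
        int s = nMinusOne.getLowestSetBit();
        BigInteger d = nMinusOne.shiftRight(s);

        witness:
        for (int i = 0; i < rounds; i++) {
            // Pick a random witness a in [2, n - 2].
            BigInteger a;
            do {
                a = new BigInteger(n.bitLength(), RND);
            } while (a.compareTo(TWO) < 0 || a.compareTo(nMinusOne) >= 0);

            BigInteger x = a.modPow(d, n);
            if (x.equals(BigInteger.ONE) || x.equals(nMinusOne)) continue;
            for (int r = 1; r < s; r++) {
                x = x.multiply(x).mod(n);
                if (x.equals(nMinusOne)) continue witness;
            }
            return false; // definitely composite
        }
        return true; // composite with probability at most 4^-rounds
    }
}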
There are some methods to generate very large primes with acceptable performance, but not with sufficient density for most purposes other than getting into the Guinness Book of Records.
Look at it like this: the likelihood that a number returned by probablePrime() is not prime is lower than the likelihood of you and everyone you know getting hit by lightning. Twice. On a single day.
Just don't worry about it.
You could also use the constructor of BigInteger to generate a real prime:
BigInteger(int bitLength, int certainty, Random rnd)
The time to execute is proportional to the certainty, but on my Core i7 it isn't a problem.
Make a method and wrap it.
// isPrime is assumed to be a deterministic primality test (e.g. AKS)
// provided elsewhere; probablePrime() only proposes candidates.
BigInteger definitePrime(int bits, Random rnd) {
    BigInteger prime = new BigInteger("4"); // composite sentinel so the loop runs at least once
    while (!isPrime(prime)) prime = BigInteger.probablePrime(bits, rnd);
    return prime;
}
Random rnd = new SecureRandom();
System.out.println(BigInteger.probablePrime(bitLength, rnd));
The probability that a BigInteger returned by method probablePrime() is composite does not exceed 2^-100.