Is there a SignatureOutputStream (or equivalent) in Java?

I note that there is a CipherOutputStream in Java, but apparently no SignatureOutputStream. Is this true? Where might I find a SignatureOutputStream?

Here is an example implementation, just for you. It comes with a test function (see below).
package de.fencing_game.paul.examples;
import java.io.*;
import java.security.*;
/**
 * This class provides an OutputStream which writes everything
 * to a Signature as well as to an underlying stream.
 */
public class SignatureOutputStream extends OutputStream {

    private OutputStream target;
    private Signature sig;

    /**
     * Creates a new SignatureOutputStream which writes to
     * a target OutputStream and updates the Signature object.
     */
    public SignatureOutputStream(OutputStream target, Signature sig) {
        this.target = target;
        this.sig = sig;
    }

    public void write(int b) throws IOException {
        write(new byte[] { (byte) b });
    }

    public void write(byte[] b) throws IOException {
        write(b, 0, b.length);
    }

    public void write(byte[] b, int offset, int len) throws IOException {
        target.write(b, offset, len);
        try {
            sig.update(b, offset, len);
        } catch (SignatureException ex) {
            throw new IOException(ex);
        }
    }

    public void flush() throws IOException {
        target.flush();
    }

    public void close() throws IOException {
        target.close();
    }
}
Here are some test methods to show how to use this:
private static byte[] signData(OutputStream target, PrivateKey key, String[] data)
    throws IOException, GeneralSecurityException
{
    Signature sig = Signature.getInstance("SHA1withRSA");
    sig.initSign(key);
    DataOutputStream dOut =
        new DataOutputStream(new SignatureOutputStream(target, sig));
    for (String s : data) {
        dOut.writeUTF(s);
    }
    byte[] signature = sig.sign();
    return signature;
}

private static void verify(PublicKey key, byte[] signature, byte[] data)
    throws IOException, GeneralSecurityException
{
    Signature sig = Signature.getInstance("SHA1withRSA");
    sig.initVerify(key);
    ByteArrayOutputStream collector = new ByteArrayOutputStream(data.length);
    OutputStream checker = new SignatureOutputStream(collector, sig);
    checker.write(data);
    if (sig.verify(signature)) {
        System.err.println("Signature okay");
    } else {
        System.err.println("Signature invalid!");
    }
}

/**
 * A test method.
 */
public static void main(String[] params)
    throws IOException, GeneralSecurityException
{
    if (params.length < 1) {
        params = new String[] { "Hello", "World!" };
    }
    KeyPairGenerator gen = KeyPairGenerator.getInstance("RSA");
    KeyPair pair = gen.generateKeyPair();
    ByteArrayOutputStream arrayStream = new ByteArrayOutputStream();
    byte[] signature = signData(arrayStream, pair.getPrivate(), params);
    byte[] data = arrayStream.toByteArray();
    verify(pair.getPublic(), signature, data);

    // change one byte of the data
    data[3]++;
    verify(pair.getPublic(), signature, data);
    data = arrayStream.toByteArray();
    verify(pair.getPublic(), signature, data);

    // change one byte of the signature
    signature[4]++;
    verify(pair.getPublic(), signature, data);
}
The main method signs its command-line parameters with a newly generated (random) private key and then verifies the result with the corresponding public key. It then repeats the verification with slightly altered data and with an altered signature to show that both are rejected.
Of course, for checking a signature a SignatureInputStream would be more useful - you can create it in precisely the same way.
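For illustration, here is a minimal sketch of such a SignatureInputStream (my addition, not part of the original answer); it mirrors the output variant by feeding every byte that is read into the Signature:
import java.io.FilterInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.security.Signature;
import java.security.SignatureException;

/**
 * Hypothetical counterpart to SignatureOutputStream: updates a Signature
 * with every byte read from the underlying stream.
 */
public class SignatureInputStream extends FilterInputStream {

    private final Signature sig;

    public SignatureInputStream(InputStream in, Signature sig) {
        super(in);
        this.sig = sig;
    }

    @Override
    public int read() throws IOException {
        int b = super.read();
        if (b != -1) {
            update(new byte[] { (byte) b }, 0, 1);
        }
        return b;
    }

    // FilterInputStream.read(byte[]) delegates here, so this covers both array variants.
    @Override
    public int read(byte[] b, int off, int len) throws IOException {
        int n = super.read(b, off, len);
        if (n > 0) {
            update(b, off, n);
        }
        return n;
    }

    private void update(byte[] b, int off, int len) throws IOException {
        try {
            sig.update(b, off, len);
        } catch (SignatureException ex) {
            throw new IOException(ex);
        }
    }
}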

Related

How to get the intermediate compressed size using GZipOutputStream? [duplicate]

I have a BufferedWriter as shown below:
BufferedWriter writer = new BufferedWriter(new OutputStreamWriter(
new GZIPOutputStream( hdfs.create(filepath, true ))));
String line = "text";
writer.write(line);
I want to find out the number of bytes written to the file without querying the file like this:
hdfs = FileSystem.get( new URI( "hdfs://localhost:8020" ), configuration );
filepath = new Path("path");
hdfs.getFileStatus(filepath).getLen();
as it will add overhead, and I don't want that.
Also, I can't do this:
line.getBytes().length;
as it gives the size before compression.
You can use the CountingOutputStream from the Apache Commons IO library.
Place it between the GZIPOutputStream and the file OutputStream (hdfs.create(..)).
After writing the content to the file, you can read the number of written bytes from the CountingOutputStream instance, as sketched below.
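A minimal sketch of that wiring (hdfs and filepath are the objects from the question; CountingOutputStream is org.apache.commons.io.output.CountingOutputStream):
// Wrap the raw HDFS stream in the counter, then put the GZIPOutputStream on top,
// so the counter sees the compressed bytes that actually reach the file.
CountingOutputStream counter = new CountingOutputStream(hdfs.create(filepath, true));
BufferedWriter writer = new BufferedWriter(
        new OutputStreamWriter(new GZIPOutputStream(counter)));
writer.write("text");
writer.close();                                  // close() finishes the gzip stream; flush() alone does not
long compressedBytes = counter.getByteCount();   // bytes actually written to the file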
If this isn't too late, and you are using Java 7+ and don't want to pull in an entire library like Guava or Commons IO, you can just extend GZIPOutputStream and obtain the data from the associated Deflater like so:
public class MyGZIPOutputStream extends GZIPOutputStream {

    public MyGZIPOutputStream(OutputStream out) throws IOException {
        super(out);
    }

    public long getBytesRead() {
        return def.getBytesRead();
    }

    public long getBytesWritten() {
        return def.getBytesWritten();
    }

    public void setLevel(int level) {
        def.setLevel(level);
    }
}
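A hypothetical usage example (the file name is made up). getBytesWritten() reports the compressed bytes the Deflater has produced so far, so it is most meaningful after finish() or close(), and it does not include the few bytes of gzip header and trailer:
MyGZIPOutputStream gzip = new MyGZIPOutputStream(new FileOutputStream("data.gz"));
gzip.write("text".getBytes());
gzip.finish();                       // force the deflater to emit all pending compressed output
System.out.println("uncompressed bytes: " + gzip.getBytesRead());
System.out.println("compressed bytes:   " + gzip.getBytesWritten());
gzip.close();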
You can make your own descendant of OutputStream and count the bytes passed to its write methods (see the sketch below).
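For illustration, a minimal sketch of such a descendant (a hypothetical ByteCountingOutputStream; it counts bytes rather than invocations, since the three-argument write tells you how many bytes it received):
import java.io.FilterOutputStream;
import java.io.IOException;
import java.io.OutputStream;

/** Hypothetical byte-counting decorator; place it directly above the file stream. */
public class ByteCountingOutputStream extends FilterOutputStream {

    private long count;

    public ByteCountingOutputStream(OutputStream out) {
        super(out);
    }

    @Override
    public void write(int b) throws IOException {
        out.write(b);
        count++;
    }

    // FilterOutputStream.write(byte[]) routes through this method, so bytes are counted once.
    @Override
    public void write(byte[] b, int off, int len) throws IOException {
        out.write(b, off, len);   // bypass FilterOutputStream's byte-by-byte loop
        count += len;
    }

    public long getCount() {
        return count;
    }
}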
This is similar to the response by Olaseni, but I moved the counting into the BufferedOutputStream rather than the GZIPOutputStream, and this is more robust, since def.getBytesRead() in Olaseni's answer is not available after the stream has been closed.
With the implementation below, you can supply your own AtomicLong to the constructor so that you can assign the CountingBufferedOutputStream in a try-with-resources block, but still retrieve the count after the block has exited (i.e. after the file is closed).
public static class CountingBufferedOutputStream extends BufferedOutputStream {

    private final AtomicLong bytesWritten;

    public CountingBufferedOutputStream(OutputStream out) throws IOException {
        super(out);
        this.bytesWritten = new AtomicLong();
    }

    public CountingBufferedOutputStream(OutputStream out, int bufSize) throws IOException {
        super(out, bufSize);
        this.bytesWritten = new AtomicLong();
    }

    public CountingBufferedOutputStream(OutputStream out, int bufSize, AtomicLong bytesWritten)
            throws IOException {
        super(out, bufSize);
        this.bytesWritten = bytesWritten;
    }

    @Override
    public void write(byte[] b) throws IOException {
        // Delegate to the three-argument overload so the bytes are only counted once there
        // (FilterOutputStream.write(byte[]) would otherwise also route through it and double-count).
        write(b, 0, b.length);
    }

    @Override
    public void write(byte[] b, int off, int len) throws IOException {
        super.write(b, off, len);
        bytesWritten.addAndGet(len);
    }

    @Override
    public synchronized void write(int b) throws IOException {
        super.write(b);
        bytesWritten.incrementAndGet();
    }

    public long getBytesWritten() {
        return bytesWritten.get();
    }
}
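A hypothetical usage sketch of the third constructor, following the try-with-resources idea described above (hdfs and filepath are carried over from the question; the 8192 buffer size is arbitrary):
AtomicLong compressedBytes = new AtomicLong();
try (Writer writer = new OutputStreamWriter(
        new GZIPOutputStream(
            new CountingBufferedOutputStream(hdfs.create(filepath, true), 8192, compressedBytes)))) {
    writer.write("text");
}
// The stream chain is closed here, but the externally supplied counter is still readable.
System.out.println("compressed bytes written to the file: " + compressedBytes.get());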

Jar differs but they should not

I have one method to create a jar.
public class Test {
public static void main(String[] args) throws Exception {
aha();
aha();
aha();
aha();
Thread.sleep(5000);
aha();
}
private static void aha() throws IOException, NoSuchAlgorithmException {
ByteArrayOutputStream baos = new ByteArrayOutputStream();
JarOutputStream jos = new JarOutputStream(baos);
jos.putNextEntry(new ZipEntry("sd"));
jos.write("sdf".getBytes());
jos.close();
MessageDigest md = MessageDigest.getInstance("sha1");
byte[] digest = md.digest(baos.toByteArray());
for (byte b : digest) {
System.out.print("," + b);
}
System.out.println();
}
}
The output is:
,-57,-44,59,113,-126,-15,71,62,-90,-120,27,36,-3,69,26,-55,63,107,-93,102
,-57,-44,59,113,-126,-15,71,62,-90,-120,27,36,-3,69,26,-55,63,107,-93,102
,-57,-44,59,113,-126,-15,71,62,-90,-120,27,36,-3,69,26,-55,63,107,-93,102
,-57,-44,59,113,-126,-15,71,62,-90,-120,27,36,-3,69,26,-55,63,107,-93,102
,-124,-26,-79,-28,-34,77,-72,83,92,53,30,-13,95,21,-92,55,70,24,-72,39
I need the digests to be identical, but the last one differs. How can I get reproducible hashes?
Although it is almost invisible, if you write a ZipEntry to a JarOutputStream, the underlying ZipOutputStream will initialize the last modification time for you:
if (e.xdostime == -1) {
// by default, do NOT use extended timestamps in extra
// data, for now.
e.setTime(System.currentTimeMillis());
}
You would have to initialize the time manually with setTime to get a constant result.
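A minimal sketch of that change inside the aha() method from the question (the constant 0L is arbitrary; any fixed timestamp makes the output reproducible):
ZipEntry entry = new ZipEntry("sd");
entry.setTime(0L);              // fixed timestamp, so ZipOutputStream will not substitute "now"
jos.putNextEntry(entry);
jos.write("sdf".getBytes());
jos.closeEntry();
jos.close();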

Log response body in case of exception

I am using Retrofit for HTTP calls, with Gson as the converter.
In some cases an exception is thrown when Gson tries to convert the response to an object, and I would like to know what the actual response was in such a case.
For example:
This is the exception message I get:
Expected a string but was BEGIN_OBJECT at line 1 column 26 path $[0].date
The code that executes the call looks like this:
Gson gson = gsonBuilder.create();
Retrofit retrofit = (new retrofit2.Retrofit.Builder()).baseUrl(baseUrl).addConverterFactory(GsonConverterFactory.create(gson)).client(httpClient).build();
MyService service = retrofit.create(clazz);
...
Response<T> response = service.call().execute();
When this code throws an exception, I would like to log the raw response body somehow. How can I do that?
I don't think it can be accomplished easily. Retrofit does not seem to provide an easy way of tracking input streams (the most natural place I was thinking of was CallAdapter.Factory but it does not allow invalid response tracking).
Basically, an illegal response conversion should be detected in a dedicated converter whose only responsibility is logging invalid payloads. That sounds pretty much like the Decorator design pattern. Since Java (unlike Kotlin?) does not support decorators as first-class citizens, forwarding implementations can be written in the style of Google Guava's Forwarding* classes:
ForwardingInputStream.java
@SuppressWarnings("resource")
abstract class ForwardingInputStream
extends InputStream {
protected abstract InputStream inputStream();
// @formatter:off
@Override public int read() throws IOException { return inputStream().read(); }
// @formatter:on
// @formatter:off
@Override public int read(final byte[] b) throws IOException { return inputStream().read(b); }
@Override public int read(final byte[] b, final int off, final int len) throws IOException { return inputStream().read(b, off, len); }
@Override public long skip(final long n) throws IOException { return inputStream().skip(n); }
@Override public int available() throws IOException { return inputStream().available(); }
@Override public void close() throws IOException { inputStream().close(); }
@Override public void mark(final int readlimit) { inputStream().mark(readlimit); }
@Override public void reset() throws IOException { inputStream().reset(); }
@Override public boolean markSupported() { return inputStream().markSupported(); }
// @formatter:on
}
ForwardingResponseBody.java
@SuppressWarnings("resource")
abstract class ForwardingResponseBody
extends ResponseBody {
protected abstract ResponseBody responseBody();
// @formatter:off
@Override public MediaType contentType() { return responseBody().contentType(); }
@Override public long contentLength() { return responseBody().contentLength(); }
@Override public BufferedSource source() { return responseBody().source(); }
// @formatter:on
// @formatter:off
@Override public void close() { super.close(); }
// @formatter:on
}
ForwardingBufferedSource.java
abstract class ForwardingBufferedSource
implements BufferedSource {
protected abstract BufferedSource bufferedSource();
// @formatter:off
@Override public Buffer buffer() { return bufferedSource().buffer(); }
@Override public boolean exhausted() throws IOException { return bufferedSource().exhausted(); }
@Override public void require(final long byteCount) throws IOException { bufferedSource().require(byteCount); }
@Override public boolean request(final long byteCount) throws IOException { return bufferedSource().request(byteCount); }
@Override public byte readByte() throws IOException { return bufferedSource().readByte(); }
@Override public short readShort() throws IOException { return bufferedSource().readShort(); }
@Override public short readShortLe() throws IOException { return bufferedSource().readShortLe(); }
@Override public int readInt() throws IOException { return bufferedSource().readInt(); }
@Override public int readIntLe() throws IOException { return bufferedSource().readIntLe(); }
@Override public long readLong() throws IOException { return bufferedSource().readLong(); }
@Override public long readLongLe() throws IOException { return bufferedSource().readLongLe(); }
@Override public long readDecimalLong() throws IOException { return bufferedSource().readDecimalLong(); }
@Override public long readHexadecimalUnsignedLong() throws IOException { return bufferedSource().readHexadecimalUnsignedLong(); }
@Override public void skip(final long byteCount) throws IOException { bufferedSource().skip(byteCount); }
@Override public ByteString readByteString() throws IOException { return bufferedSource().readByteString(); }
@Override public ByteString readByteString(final long byteCount) throws IOException { return bufferedSource().readByteString(byteCount); }
@Override public int select(final Options options) throws IOException { return bufferedSource().select(options); }
@Override public byte[] readByteArray() throws IOException { return bufferedSource().readByteArray(); }
@Override public byte[] readByteArray(final long byteCount) throws IOException { return bufferedSource().readByteArray(byteCount); }
@Override public int read(final byte[] sink) throws IOException { return bufferedSource().read(sink); }
@Override public void readFully(final byte[] sink) throws IOException { bufferedSource().readFully(sink); }
@Override public int read(final byte[] sink, final int offset, final int byteCount) throws IOException { return bufferedSource().read(sink, offset, byteCount); }
@Override public void readFully(final Buffer sink, final long byteCount) throws IOException { bufferedSource().readFully(sink, byteCount); }
@Override public long readAll(final Sink sink) throws IOException { return bufferedSource().readAll(sink); }
@Override public String readUtf8() throws IOException { return bufferedSource().readUtf8(); }
@Override public String readUtf8(final long byteCount) throws IOException { return bufferedSource().readUtf8(byteCount); }
@Override public String readUtf8Line() throws IOException { return bufferedSource().readUtf8Line(); }
@Override public String readUtf8LineStrict() throws IOException { return bufferedSource().readUtf8LineStrict(); }
@Override public int readUtf8CodePoint() throws IOException { return bufferedSource().readUtf8CodePoint(); }
@Override public String readString(final Charset charset) throws IOException { return bufferedSource().readString(charset); }
@Override public String readString(final long byteCount, final Charset charset) throws IOException { return bufferedSource().readString(byteCount, charset); }
@Override public long indexOf(final byte b) throws IOException { return bufferedSource().indexOf(b); }
@Override public long indexOf(final byte b, final long fromIndex) throws IOException { return bufferedSource().indexOf(b, fromIndex); }
@Override public long indexOf(final ByteString bytes) throws IOException { return bufferedSource().indexOf(bytes); }
@Override public long indexOf(final ByteString bytes, final long fromIndex) throws IOException { return bufferedSource().indexOf(bytes, fromIndex); }
@Override public long indexOfElement(final ByteString targetBytes) throws IOException { return bufferedSource().indexOfElement(targetBytes); }
@Override public long indexOfElement(final ByteString targetBytes, final long fromIndex) throws IOException { return bufferedSource().indexOfElement(targetBytes, fromIndex); }
@Override public InputStream inputStream() { return bufferedSource().inputStream(); }
@Override public long read(final Buffer sink, final long byteCount) throws IOException { return bufferedSource().read(sink, byteCount); }
@Override public Timeout timeout() { return bufferedSource().timeout(); }
@Override public void close() throws IOException { bufferedSource().close(); }
// @formatter:on
}
Trivial forwarding implementations just override all methods of their parent classes and delegate the job to a delegated object. Once a forwarding class is extended, some of the parent methods can be overridden again.
IConversionThrowableConsumer.java
This is just a listener used below.
interface IConversionThrowableConsumer {
/**
* Instantiating {@link okhttp3.ResponseBody} is not easy, due to the way {@link okio.BufferedSource} is designed -- too heavy.
* Deconstructing its components into "atoms", with some loss of functionality, may be acceptable.
* However, this consumer may need some improvements on demand.
*/
void accept(MediaType contentType, long contentLength, InputStream inputStream, Throwable ex)
throws IOException;
}
ErrorReportingConverterFactory.java
The next step is implementing the error-reporting converter factory that can be injected into Retrofit.Builder and listen for any errors occurring in downstream converters. Note how it works:
For every response converter an intermediate converter is injected, which makes it possible to listen for any error in the downstream converters.
Downstream converters obtain non-closeable resources so that the underlying I/O resources are not closed prematurely.
While the downstream converters convert, the intermediate converter collects the real input stream content into a buffer so that it can reproduce the input that made the GsonConverter fail. This should be considered a bottleneck: the buffer may grow large (although it could be limited), its internal array is copied when requested by the converter, and so on.
If an IOException or RuntimeException occurs, the intermediate converter concatenates the buffered content with the rest of the real input stream, so that a consumer can read the input from the very beginning.
The intermediate converter takes care of closing resources itself.
final class ErrorReportingConverterFactory
extends Factory {
private final IConversionThrowableConsumer consumer;
private ErrorReportingConverterFactory(final IConversionThrowableConsumer consumer) {
this.consumer = consumer;
}
static Factory getErrorReportingConverterFactory(final IConversionThrowableConsumer listener) {
return new ErrorReportingConverterFactory(listener);
}
@Override
public Converter<ResponseBody, ?> responseBodyConverter(final Type type, final Annotation[] annotations, final Retrofit retrofit) {
return (Converter<ResponseBody, Object>) responseBody -> {
final ByteArrayOutputStream byteArrayOutputStream = new ByteArrayOutputStream();
final InputStream realInputStream = responseBody.byteStream();
try {
final ForwardingResponseBody bufferingResponseBody = new BufferingNoCloseResponseBody(responseBody, byteArrayOutputStream);
final Converter<ResponseBody, Object> converter = retrofit.nextResponseBodyConverter(this, type, annotations);
return converter.convert(bufferingResponseBody);
} catch ( final RuntimeException | IOException ex ) {
final InputStream inputStream = concatInputStreams(new ByteArrayInputStream(byteArrayOutputStream.toByteArray()), realInputStream);
consumer.accept(responseBody.contentType(), responseBody.contentLength(), inputStream, ex);
throw ex;
} finally {
responseBody.close();
}
};
}
private static class BufferingInputStream
extends ForwardingInputStream {
private final InputStream inputStream;
private final ByteArrayOutputStream byteArrayOutputStream;
private BufferingInputStream(final InputStream inputStream, final ByteArrayOutputStream byteArrayOutputStream) {
this.inputStream = inputStream;
this.byteArrayOutputStream = byteArrayOutputStream;
}
@Override
protected InputStream inputStream() {
return inputStream;
}
@Override
public int read()
throws IOException {
final int read = super.read();
if ( read != -1 ) {
byteArrayOutputStream.write(read);
}
return read;
}
@Override
public int read(final byte[] b)
throws IOException {
final int read = super.read(b);
if ( read != -1 ) {
byteArrayOutputStream.write(b, 0, read);
}
return read;
}
@Override
public int read(final byte[] b, final int off, final int len)
throws IOException {
final int read = super.read(b, off, len);
if ( read != -1 ) {
byteArrayOutputStream.write(b, off, read);
}
return read;
}
}
private static class BufferingNoCloseResponseBody
extends ForwardingResponseBody {
private final ResponseBody responseBody;
private final ByteArrayOutputStream byteArrayOutputStream;
private BufferingNoCloseResponseBody(final ResponseBody responseBody, final ByteArrayOutputStream byteArrayOutputStream) {
this.responseBody = responseBody;
this.byteArrayOutputStream = byteArrayOutputStream;
}
@Override
protected ResponseBody responseBody() {
return responseBody;
}
@Override
@SuppressWarnings("resource")
public BufferedSource source() {
final BufferedSource source = super.source();
return new ForwardingBufferedSource() {
@Override
protected BufferedSource bufferedSource() {
return source;
}
@Override
public InputStream inputStream() {
return new BufferingInputStream(super.inputStream(), byteArrayOutputStream);
}
};
}
/**
* Suppressing close due to the automatic close in {@link ErrorReportingConverterFactory#responseBodyConverter(Type, Annotation[], Retrofit)}
*/
@Override
public void close() {
// do nothing
}
}
}
Note that this implementation uses forwarding classes heavily and only overrides what's necessary.
There are also a few utilities for concatenating input streams and adapting iterators to enumerations.
IteratorEnumeration.java
final class IteratorEnumeration<T>
implements Enumeration<T> {
private final Iterator<? extends T> iterator;
private IteratorEnumeration(final Iterator<? extends T> iterator) {
this.iterator = iterator;
}
static <T> Enumeration<T> iteratorEnumeration(final Iterator<? extends T> iterator) {
return new IteratorEnumeration<>(iterator);
}
@Override
public boolean hasMoreElements() {
return iterator.hasNext();
}
@Override
public T nextElement() {
return iterator.next();
}
}
InputStreams.java
final class InputStreams {
private InputStreams() {
}
static InputStream concatInputStreams(final InputStream... inputStreams) {
return inputStreams.length == 2
? new SequenceInputStream(inputStreams[0], inputStreams[1])
: new SequenceInputStream(iteratorEnumeration((Iterator<? extends InputStream>) asList(inputStreams).iterator()));
}
}
OutputStreamConversionThrowableConsumer.java
Trivial logging implementation.
final class OutputStreamConversionThrowableConsumer
implements IConversionThrowableConsumer {
private static final int BUFFER_SIZE = 512;
private final PrintStream printStream;
private OutputStreamConversionThrowableConsumer(final PrintStream printStream) {
this.printStream = printStream;
}
static IConversionThrowableConsumer getOutputStreamConversionThrowableConsumer(final OutputStream outputStream) {
return new OutputStreamConversionThrowableConsumer(new PrintStream(outputStream));
}
static IConversionThrowableConsumer getSystemOutConversionThrowableConsumer() {
return getOutputStreamConversionThrowableConsumer(System.out);
}
static IConversionThrowableConsumer getSystemErrConversionThrowableConsumer() {
return getOutputStreamConversionThrowableConsumer(System.err);
}
@Override
public void accept(final MediaType contentType, final long contentLength, final InputStream inputStream, final Throwable ex)
throws IOException {
printStream.print("Content type: ");
printStream.println(contentType);
printStream.print("Content length: ");
printStream.println(contentLength);
printStream.print("Content: ");
final byte[] buffer = new byte[BUFFER_SIZE];
int read;
while ( (read = inputStream.read(buffer)) != -1 ) {
printStream.write(buffer, 0, read);
}
printStream.println();
}
}
Putting it all together:
final Gson gson = new Gson();
final Retrofit retrofit = new Retrofit.Builder()
.baseUrl(...)
.addConverterFactory(getErrorReportingConverterFactory(getSystemOutConversionThrowableConsumer()))
.addConverterFactory(GsonConverterFactory.create(gson))
.build();
final IWhateverService service = retrofit.create(IWhateverService.class);
final Call<...> call = service.getWhatever("test.json");
call.enqueue(new Callback<...>() {
@Override
public void onResponse(final Call<...> call, final Response<...> response) {
System.out.println(response.body());
}
@Override
public void onFailure(final Call<...> call, final Throwable throwable) {
throwable.printStackTrace(System.err);
}
});
Note that the ErrorReportingConverterFactory must be registered before the GsonConverterFactory. Let's assume the service requests a JSON document that turns out to be illegal:
{"foo":1,###"bar":2}
In such a case, the error-reporting converter will produce the following dump on stdout:
Content type: application/json
Content length: -1
Content: {"foo":1,###"bar":2}
I'm not an expert in Log4j and could not find an efficient way to obtain an output stream to redirect the input stream to. Here is the closest thing I've found:
final class Log4jConversionThrowableConsumer
implements IConversionThrowableConsumer {
private static final int BUFFER_SIZE = 512;
private final Logger logger;
private Log4jConversionThrowableConsumer(final Logger logger) {
this.logger = logger;
}
static IConversionThrowableConsumer getLog4jConversionThrowableConsumer(final Logger logger) {
return new Log4jConversionThrowableConsumer(logger);
}
@Override
public void accept(final MediaType contentType, final long contentLength, final InputStream inputStream, final Throwable ex) {
try {
final StringBuilder builder = new StringBuilder(BUFFER_SIZE)
.append("Content type=")
.append(contentType)
.append("; Content length=")
.append(contentLength)
.append("; Input stream content=");
readInputStreamFirstChunk(builder, inputStream);
logger.error(builder.toString(), ex);
} catch ( final IOException ioex ) {
throw new RuntimeException(ioex);
}
}
private static void readInputStreamFirstChunk(final StringBuilder builder, final InputStream inputStream)
throws IOException {
final Reader reader = new InputStreamReader(inputStream);
final char[] buffer = new char[512];
final int read = reader.read(buffer);
if ( read >= 0 ) {
builder.append(buffer, 0, read);
}
}
}
Unfortunately, collecting the whole string may be expensive, so it only takes the very first 512 characters. This may require calibrating the joined streams in the intermediate converter in order to "shift" the content "to the left" a bit.

How to calculate message digests in custom output stream?

I would like to implement an OutputStream that can produce MessageDigests. I already have a corresponding InputStream implementation here, which works fine and extends FilterInputStream.
The problem is this: if I extend FilterOutputStream, the checksums don't match. If I use FileOutputStream it works fine (although that is not the stream I'd like to be using, as I'd like it to be a bit more generic than that).
public class MultipleDigestOutputStream extends FilterOutputStream
{
public static final String[] DEFAULT_ALGORITHMS = { EncryptionConstants.ALGORITHM_MD5,
EncryptionConstants.ALGORITHM_SHA1 };
private Map<String, MessageDigest> digests = new LinkedHashMap<>();
private File file;
public MultipleDigestOutputStream(File file, OutputStream os)
throws NoSuchAlgorithmException, FileNotFoundException
{
this(file, os, DEFAULT_ALGORITHMS);
}
public MultipleDigestOutputStream(File file, OutputStream os, String[] algorithms)
throws NoSuchAlgorithmException, FileNotFoundException
{
// super(file); // If extending FileOutputStream
super(os);
this.file = file;
for (String algorithm : algorithms)
{
addAlgorithm(algorithm);
}
}
public void addAlgorithm(String algorithm)
throws NoSuchAlgorithmException
{
MessageDigest digest = MessageDigest.getInstance(algorithm);
digests.put(algorithm, digest);
}
public MessageDigest getMessageDigest(String algorithm)
{
return digests.get(algorithm);
}
public Map<String, MessageDigest> getDigests()
{
return digests;
}
public String getMessageDigestAsHexadecimalString(String algorithm)
{
return MessageDigestUtils.convertToHexadecimalString(getMessageDigest(algorithm));
}
public void setDigests(Map<String, MessageDigest> digests)
{
this.digests = digests;
}
@Override
public void write(int b)
throws IOException
{
super.write(b);
System.out.println("write(int b)");
for (Map.Entry entry : digests.entrySet())
{
int p = b & 0xFF;
byte b1 = (byte) p;
MessageDigest digest = (MessageDigest) entry.getValue();
digest.update(b1);
}
}
@Override
public void write(byte[] b)
throws IOException
{
super.write(b);
for (Map.Entry entry : digests.entrySet())
{
MessageDigest digest = (MessageDigest) entry.getValue();
digest.update(b);
}
}
@Override
public void write(byte[] b, int off, int len)
throws IOException
{
super.write(b, off, len);
for (Map.Entry entry : digests.entrySet())
{
MessageDigest digest = (MessageDigest) entry.getValue();
digest.update(b, off, len);
}
}
@Override
public void close()
throws IOException
{
super.close();
}
}
My test case (the asserted checksums have been checked with md5sum and sha1sum):
public class MultipleDigestOutputStreamTest
{
@Before
public void setUp()
throws Exception
{
File dir = new File("target/test-resources");
if (!dir.exists())
{
//noinspection ResultOfMethodCallIgnored
dir.mkdirs();
}
}
@Test
public void testWrite()
throws IOException,
NoSuchAlgorithmException
{
String s = "This is a test.";
File file = new File("target/test-resources/metadata.xml");
ByteArrayOutputStream baos = new ByteArrayOutputStream();
MultipleDigestOutputStream mdos = new MultipleDigestOutputStream(file, baos);
mdos.write(s.getBytes());
mdos.flush();
final String md5 = mdos.getMessageDigestAsHexadecimalString("MD5");
final String sha1 = mdos.getMessageDigestAsHexadecimalString("SHA-1");
Assert.assertEquals("Incorrect MD5 sum!", "120ea8a25e5d487bf68b5f7096440019", md5);
Assert.assertEquals("Incorrect SHA-1 sum!", "afa6c8b3a2fae95785dc7d9685a57835d703ac88", sha1);
System.out.println("MD5: " + md5);
System.out.println("SHA1: " + sha1);
}
}
Could you please advise as to what could be the problem and how to fix it? Many thanks in advance!
If you are using Java 7 or above, you can just use DigestOutputStream.
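A minimal sketch of how DigestOutputStream could be used here (the file name and algorithm are only illustrative):
MessageDigest sha1 = MessageDigest.getInstance("SHA-1");
try (OutputStream os = new DigestOutputStream(new FileOutputStream("metadata.xml"), sha1)) {
    os.write("This is a test.".getBytes());
}
byte[] digest = sha1.digest();   // digest of everything that passed through the stream
// Several digests can be collected by nesting streams:
// new DigestOutputStream(new DigestOutputStream(target, sha1), md5)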
UPDATE
You can extend the abstract MessageDigest class to wrap multiple MessageDigest instances.
SOME CODE
import java.security.DigestException;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;
public class DigestWrapper extends MessageDigest
{
private final MessageDigest md5;
private final MessageDigest sha1;
// some methods missing.
// I just implemented them to throw a RuntimeException.
public DigestWrapper() throws NoSuchAlgorithmException
{
super(null);
sha1 = MessageDigest.getInstance("sha-1");
md5 = MessageDigest.getInstance("md5");
}
public byte[] getMD5Digest()
{
return md5.digest();
}
public byte[] getSHA1Digest()
{
return sha1.digest();
}
@Override
public int digest(byte[] buf, int offset, int len) throws DigestException
{
md5.digest(buf, offset, len);
sha1.digest(buf, offset, len);
return 0;
}
@Override
public byte[] digest(byte[] input)
{
md5.digest(input);
sha1.digest(input);
return input;
}
@Override
public void reset()
{
md5.reset();
sha1.reset();
}
@Override
public void update(byte input)
{
md5.update(input);
sha1.update(input);
}
@Override
public void update(byte[] input, int offset, int len)
{
md5.update(input, offset, len);
sha1.update(input, offset, len);
}
@Override
public void update(byte[] input)
{
md5.update(input);
sha1.update(input);
}
}
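A hypothetical usage sketch combining this wrapper with DigestOutputStream (note that the remaining abstract engine* methods of MessageDigestSpi still have to be implemented, as the comment in the code says):
DigestWrapper wrapper = new DigestWrapper();
try (OutputStream os = new DigestOutputStream(new FileOutputStream("metadata.xml"), wrapper)) {
    os.write("This is a test.".getBytes());
}
byte[] md5 = wrapper.getMD5Digest();
byte[] sha1 = wrapper.getSHA1Digest();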
I have created a project on Github which contains my implementation of the MultipleDigestInputStream and MultipleDigestOutputStream here.
To check how the code can be used, have a look at the following tests:
MultipleDigestInputStreamTest
MultipleDigestOutputStreamTest
Let me know if there is enough interest, and I can release it and publish it to Maven Central.

Find file with certain extension and calculate its hash in Java

I want to calculate the MD5 hash of a file that ends with a certain extension in Java. I used two classes for this:
FileSearch.java
public class FileSearch
{
public static File findfile(File file) throws IOException
{
String drive = (new DetectDrive()).USBDetect();
Path start = FileSystems.getDefault().getPath(drive);
Files.walkFileTree(start, new SimpleFileVisitor<Path>() {
#Override
public FileVisitResult visitFile(Path file, BasicFileAttributes attrs)
{
if (file.toString().endsWith(".raw"))
{
System.out.println(file);
}
return FileVisitResult.CONTINUE;
}
});
return file;
}
public static void main(String[] args) throws IOException
{
Hash hasher = new Hash();
try
{
if (file.toString().endsWith("raw"))
{
hasher.hash(file);
}
} catch (IOException e)
{
e.printStackTrace();
}
}
}
Hash.java
public class Hash
{
public void hash(File file) throws Exception
{
MessageDigest md = MessageDigest.getInstance("MD5");
FileInputStream fis = new FileInputStream(file);
byte[] dataBytes = new byte[1024];
int nread = 0;
while ((nread = fis.read(dataBytes)) != -1)
{
md.update(dataBytes, 0, nread);
};
byte[] mdbytes = md.digest();
StringBuffer sb = new StringBuffer();
for (int i = 0; i < mdbytes.length; i++)
{
sb.append(Integer.toString((mdbytes[i] & 0xff) + 0x100, 16).substring(1));
}
System.out.println("Digest(in hex format):: " + sb.toString());
}
}
The first class searches for the file that ends with .raw, while the second (not completed yet) is supposed to take the raw file and then calculate its hash. However, I do not know how to pass the result of the first class to the second to get that raw file. I believe I have to put a string inside the new FileInputStream(...), but I need to pass the raw file instead.
Is it possible to do so, since both of them contain a main method? Or do I need to change FileSearch.java to drop the main method and have a public String search() instead, and then call it from the second class? I would appreciate it if you could show me how to do it the right way.
So the logic consists of these steps:
for each file with the .raw extension
hash the file
You should thus have a method void hash(File file), and call it from your first class.
So, in Hash.java, rename your main method to
public void hash(File file)
And open the file using
FileInputStream fis = new FileInputStream(file);
Then call this hash() method from your first class:
public static void main(String[] args) throws IOException {
Hash hasher = new Hash();
...
if (file.toString().endsWith(".raw")) {
hasher.hash(file);
}
...
}
You'll also have to make sure that every FileInputStream you create is properly closed, otherwise you'll quickly run out of file descriptors. The best way to do that is to use the try-with-resources construct: http://docs.oracle.com/javase/tutorial/essential/exceptions/tryResourceClose.html
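For illustration, here is the hash method from the question rewritten with try-with-resources (otherwise unchanged, apart from the hex formatting via String.format):
public void hash(File file) throws Exception {
    MessageDigest md = MessageDigest.getInstance("MD5");
    // The stream is closed automatically when the try block exits, even on exceptions.
    try (FileInputStream fis = new FileInputStream(file)) {
        byte[] dataBytes = new byte[1024];
        int nread;
        while ((nread = fis.read(dataBytes)) != -1) {
            md.update(dataBytes, 0, nread);
        }
    }
    byte[] mdbytes = md.digest();
    StringBuilder sb = new StringBuilder();
    for (byte b : mdbytes) {
        sb.append(String.format("%02x", b));
    }
    System.out.println("Digest (in hex format): " + sb);
}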
