Getting errors in Java program to write to Kinesis Firehose stream

I'm trying to write some data from an API (the Google stocks/finance API) to my AWS Firehose stream. I already downloaded and installed the AWS plugin for Eclipse, set up my Firehose stream on AWS, and everything seems to be configured correctly. I'm encountering some problems, though. The following line seems to be deprecated; I tried different variations from Amazon's SDK, but I can't seem to get the correct code.
AmazonKinesisFirehoseClient firehoseClient = new
AmazonKinesisFirehoseClient(credentials);
Next, I'm getting some errors with the following. The specific error is, "The method setRecord(Record) is undefined for the type PutRecordRequest," even though I took it directly from Amazon's API reference.
request.setRecord(record);
firehoseClient.putRecord(request);
Also getting an error on the second line above: "The method putRecord(com.amazonaws.services.kinesisfirehose.model.PutRecordRequest) in the type AmazonKinesisFirehoseClient is not applicable for the arguments (com.amazonaws.services.kinesis.model.PutRecordRequest)"
package com.amazonaws.samples;
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.ByteBuffer;
import org.apache.http.client.CredentialsProvider;
import com.amazonaws.*;
import com.amazonaws.AmazonClientException;
import com.amazonaws.auth.AWSCredentials;
import com.amazonaws.auth.AWSCredentialsProvider;
import com.amazonaws.auth.profile.ProfileCredentialsProvider;
import com.amazonaws.client.builder.AwsClientBuilder;
import com.amazonaws.services.dynamodbv2.AmazonDynamoDBClient;
import com.amazonaws.services.kinesis.AmazonKinesis;
import com.amazonaws.services.kinesis.AmazonKinesisClient;
import com.amazonaws.services.kinesis.AmazonKinesisClientBuilder;
import com.amazonaws.services.kinesis.clientlibrary.interfaces.IRecordProcessorFactory;
import com.amazonaws.services.kinesis.clientlibrary.lib.worker.InitialPositionInStream;
import com.amazonaws.services.kinesis.clientlibrary.lib.worker.KinesisClientLibConfiguration;
import com.amazonaws.services.kinesis.clientlibrary.lib.worker.Worker;
import com.amazonaws.services.kinesis.model.PutRecordRequest;
import com.amazonaws.services.kinesis.model.ResourceNotFoundException;
import com.amazonaws.services.kinesisfirehose.AmazonKinesisFirehoseClient;
import com.amazonaws.services.kinesisfirehose.model.PutRecordBatchRequest;
import com.amazonaws.services.kinesisfirehose.model.Record;
public class FirehoseExample {
public static void main(String[] args) {
AWSCredentials credentials = null;
try {
credentials = new ProfileCredentialsProvider().getCredentials();
}
catch (Exception e) {
throw new AmazonClientException("Cannot load the credentials from the credential profiles file. "
+ "Please make sure that your credentials file is at the correct "
+ "location (/Users/elybenari/.aws/credentials), and is in valid format.", e);
}
AmazonKinesisFirehoseClient firehoseClient = new AmazonKinesisFirehoseClient(credentials);
PutRecordRequest request = new PutRecordRequest();
request.setStreamName("project-stream");
Record record = new Record();
for (int i = 0; i < 20*60; i++){
try {
URL url = new URL("https://www.google.com/finance/info?q=NASDAQ:AMZN");
HttpURLConnection conn = (HttpURLConnection) url.openConnection();
BufferedReader reader = new BufferedReader(new InputStreamReader(conn.getInputStream()));
StringBuilder response = new StringBuilder();
String line;
while ((line = reader.readLine()) != null) {
response.append(line);
}
reader.close();
System.out.println(response.toString().replace("\n", "").replaceAll(" ", ""));
System.out.println("****\n");
ByteBuffer buffer = ByteBuffer.wrap(response.toString().replace("\n", "").replaceAll(" ", "").getBytes());
record.setData(buffer);
request.setRecord(record);
firehoseClient.putRecord(request);
Thread.sleep(2000);
}
catch(Exception e){
e.printStackTrace();
}
}
}
}

The problem is that you've imported some classes from the Kinesis Java package rather than the Kinesis Firehose one. For example, you've used:
import com.amazonaws.services.kinesis.model.PutRecordRequest;
Whereas you should have used:
import com.amazonaws.services.kinesisfirehose.model.PutRecordRequest;
Kinesis, Kinesis Firehose and Kinesis Analytics are different services, even though they fall under one umbrella of streaming services on AWS. Consequently, they have different package namespaces in the Java SDK. If you start from the official documentation here, you'll reach the correct Java SDK reference here.
EDIT: To answer your other question: yes, the following is deprecated:
AmazonKinesisFirehoseClient firehoseClient = new AmazonKinesisFirehoseClient(credentials);
You should instead use the following:
AmazonKinesisFirehoseClient firehoseClient = AmazonKinesisFirehoseClientBuilder.standard().withCredentials(new AWSStaticCredentialsProvider(awsCredentials)).build();
Refer to the official documentation here on how to correctly initialize AmazonKinesisFirehoseClient.
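Putting the two fixes together, here is a minimal sketch (mine, not verbatim from the AWS docs) that uses only the kinesisfirehose classes. The payload string stands in for the API response you build in your loop, and note that the Firehose request takes a delivery stream name rather than a stream name:
import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;
import com.amazonaws.auth.AWSStaticCredentialsProvider;
import com.amazonaws.auth.profile.ProfileCredentialsProvider;
import com.amazonaws.services.kinesisfirehose.AmazonKinesisFirehose;
import com.amazonaws.services.kinesisfirehose.AmazonKinesisFirehoseClientBuilder;
import com.amazonaws.services.kinesisfirehose.model.PutRecordRequest;
import com.amazonaws.services.kinesisfirehose.model.Record;
public class FirehosePutExample {
    public static void main(String[] args) {
        // Build the client through the builder instead of the deprecated constructor.
        AmazonKinesisFirehose firehoseClient = AmazonKinesisFirehoseClientBuilder.standard()
                .withCredentials(new AWSStaticCredentialsProvider(
                        new ProfileCredentialsProvider().getCredentials()))
                .build();
        String payload = "{\"example\":\"data\"}"; // stands in for the API response string
        // Firehose's Record and PutRecordRequest come from the kinesisfirehose.model package.
        Record record = new Record()
                .withData(ByteBuffer.wrap(payload.getBytes(StandardCharsets.UTF_8)));
        PutRecordRequest request = new PutRecordRequest()
                .withDeliveryStreamName("project-stream") // Firehose uses a *delivery* stream name
                .withRecord(record);
        firehoseClient.putRecord(request);
    }
}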

Related

Java Google API application default credentials

How do I declare the application credentials? I have my .json key file.
package shyam;
// Imports the Google Cloud client library
import com.google.cloud.vision.v1.AnnotateImageRequest;
import com.google.cloud.vision.v1.AnnotateImageResponse;
import com.google.cloud.vision.v1.BatchAnnotateImagesResponse;
import com.google.cloud.vision.v1.EntityAnnotation;
import com.google.cloud.vision.v1.Feature;
import com.google.cloud.vision.v1.Feature.Type;
import com.google.cloud.vision.v1.Image;
import com.google.cloud.vision.v1.ImageAnnotatorClient;
import com.google.protobuf.ByteString;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.ArrayList;
import java.util.List;
public class App {
public static void main(String[] args) throws Exception {
// Initialize client that will be used to send requests. This client only needs to be created
// once, and can be reused for multiple requests. After completing all of your requests, call
// the "close" method on the client to safely clean up any remaining background resources.
try (ImageAnnotatorClient vision = ImageAnnotatorClient.create()) {
// The path to the image file to annotate
String fileName = "./resources/wakeupcat.jpg";
// Reads the image file into memory
Path path = Paths.get(fileName);
byte[] data = Files.readAllBytes(path);
ByteString imgBytes = ByteString.copyFrom(data);
// Builds the image annotation request
List<AnnotateImageRequest> requests = new ArrayList<>();
Image img = Image.newBuilder().setContent(imgBytes).build();
Feature feat = Feature.newBuilder().setType(Type.LABEL_DETECTION).build();
AnnotateImageRequest request =
AnnotateImageRequest.newBuilder().addFeatures(feat).setImage(img).build();
requests.add(request);
// Performs label detection on the image file
BatchAnnotateImagesResponse response = vision.batchAnnotateImages(requests);
List<AnnotateImageResponse> responses = response.getResponsesList();
for (AnnotateImageResponse res : responses) {
if (res.hasError()) {
System.out.format("Error: %s%n", res.getError().getMessage());
return;
}
// for (EntityAnnotation annotation : res.getLabelAnnotationsList()) {
// annotation
// .getAllFields()
// .forEach((k, v) -> System.out.format("%s : %s%n", k, v.toString()));
// }
}
}
}
}
I'm getting the error
Application default credentials are not available
I have already set it in my cmd using set GOOGLE_APPLICATION_CREDENTIALS='key_path'. I have also initialized my Google Cloud account in the CLI. Hope someone can help me. Thank you.
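One thing to check: in Windows cmd, set keeps the quotes as part of the value, so the variable may not point at the actual file. As an alternative, you can load the key file explicitly instead of relying on application default credentials. A minimal sketch (my own, not from the original post; the key path and class name are placeholders):
import com.google.api.gax.core.FixedCredentialsProvider;
import com.google.auth.oauth2.GoogleCredentials;
import com.google.cloud.vision.v1.ImageAnnotatorClient;
import com.google.cloud.vision.v1.ImageAnnotatorSettings;
import java.io.FileInputStream;
public class ExplicitCredentialsExample {
    public static void main(String[] args) throws Exception {
        // Point this at the .json key you downloaded (placeholder path).
        GoogleCredentials credentials =
                GoogleCredentials.fromStream(new FileInputStream("C:/path/to/key.json"));
        ImageAnnotatorSettings settings = ImageAnnotatorSettings.newBuilder()
                .setCredentialsProvider(FixedCredentialsProvider.create(credentials))
                .build();
        try (ImageAnnotatorClient vision = ImageAnnotatorClient.create(settings)) {
            // Build and send AnnotateImageRequests exactly as in the code above.
        }
    }
}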

Azure Functions using Java - How to create a @BlobTrigger

I need to create an Azure Function with a @BlobTrigger using Java to monitor my storage container for new and updated blobs.
I tried the code below:
import java.util.*;
import com.microsoft.azure.serverless.functions.annotation.*;
import com.microsoft.azure.serverless.functions.*;
import java.nio.file.*;
import java.io.*;
import java.net.URL;
import java.net.URLConnection;
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;
import com.microsoft.azure.storage.*;
import com.microsoft.azure.storage.blob.*;
@FunctionName("testblobtrigger")
public String testblobtrigger(@BlobTrigger(name = "test", path = "testcontainer/{name}") String content) {
try {
return String.format("Blob content : %s!", content);
} catch (Exception e) {
// Output the stack trace.
e.printStackTrace();
return "Access Error!";
}
}
When executed, it shows the error:
Storage binding (blob/queue/table) must have non-empty connection. Invalid storage binding found on method:
It works when I add a connection string:
public String kafkablobtrigger(@BlobTrigger(name = "test", path = "testjavablobstorage/{name}", connection = storageConnectionString) String content) {
Why do I need to add a connection string when using @BlobTrigger?
In C# it works without a connection string:
public static void ProcessBlobContainer1([BlobTrigger("container1/{blobName}")] CloudBlockBlob blob, string blobName)
{
ProcessBlob("container1", blobName, blob);
}
I didn't see any Java sample for Azure Functions using @BlobTrigger.
After all, a connection is necessary for the trigger to identify where the container is located.
After testing, I find @Mikhail is right.
For C#, the default value (in local.settings.json or in the portal's application settings) is used if connection is omitted. But unfortunately there is no equivalent setting for Java.
You can add @StorageAccount("YourStorageConnection") below your @FunctionName, as that is another valid option. The value of YourStorageConnection in local.settings.json or in the portal's application settings is up to you.
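For illustration, a sketch of how that looks (my own, based on the answer above; "MyStorageConnection" is a placeholder app setting name, and the imports follow the preview-era package used in the question — newer releases put the annotations under com.microsoft.azure.functions.annotation):
import com.microsoft.azure.serverless.functions.annotation.BlobTrigger;
import com.microsoft.azure.serverless.functions.annotation.FunctionName;
import com.microsoft.azure.serverless.functions.annotation.StorageAccount;
public class BlobTriggerFunction {
    // The annotation value names an app setting that holds the storage connection string,
    // defined in local.settings.json locally or in the portal's application settings.
    @FunctionName("testblobtrigger")
    @StorageAccount("MyStorageConnection")
    public String testblobtrigger(
            @BlobTrigger(name = "test", path = "testcontainer/{name}") String content) {
        return String.format("Blob content : %s!", content);
    }
}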
You can follow this tutorial and use mvn azure-functions:add to find the four (Http/Blob/Queue/Timer trigger) templates for Java.

Webmaster Tools API, get more than 1000 crawling errors

I'm using the new Webmaster Tools API to get all my site's crawling errors (plus details). Unfortunately it only gives me 1,000, but I have about 10,000. Is there a way to get all of them?
This is the code I use:
package main;
import com.google.api.client.googleapis.auth.oauth2.GoogleAuthorizationCodeFlow;
import com.google.api.client.googleapis.auth.oauth2.GoogleCredential;
import com.google.api.client.googleapis.auth.oauth2.GoogleTokenResponse;
import com.google.api.client.http.HttpTransport;
import com.google.api.client.http.javanet.NetHttpTransport;
import com.google.api.client.json.JsonFactory;
import com.google.api.client.json.jackson2.JacksonFactory;
import com.google.api.services.webmasters.Webmasters;
import com.google.api.services.webmasters.Webmasters.Urlcrawlerrorssamples;
import com.google.api.services.webmasters.model.SitesListResponse;
import com.google.api.services.webmasters.model.UrlCrawlErrorsSample;
import com.google.api.services.webmasters.model.UrlCrawlErrorsSamplesListResponse;
import com.google.api.services.webmasters.model.WmxSite;
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;
import java.util.Arrays;
import java.util.ArrayList;
import java.util.Collection;
import java.util.List;
public class WebmastersCommandLine {
private static String CLIENT_ID = "...";
private static String CLIENT_SECRET = "...";
private static String REDIRECT_URI = "urn:ietf:wg:oauth:2.0:oob";
private static String OAUTH_SCOPE = "https://www.googleapis.com/auth/webmasters.readonly";
private static String PAGE_URL = "...";
public static void main(String[] args) throws IOException {
HttpTransport httpTransport = new NetHttpTransport();
JsonFactory jsonFactory = new JacksonFactory();
GoogleAuthorizationCodeFlow flow = new GoogleAuthorizationCodeFlow.Builder(
httpTransport, jsonFactory, CLIENT_ID, CLIENT_SECRET, Arrays.asList(OAUTH_SCOPE))
.setAccessType("online")
.setApprovalPrompt("auto").build();
String url = flow.newAuthorizationUrl().setRedirectUri(REDIRECT_URI).build();
System.out.println("open URL:");
System.out.println(" " + url);
System.out.println("code:");
BufferedReader br = new BufferedReader(new InputStreamReader(System.in));
String code = br.readLine();
GoogleTokenResponse response = flow.newTokenRequest(code).setRedirectUri(REDIRECT_URI).execute();
GoogleCredential credential = new GoogleCredential().setFromTokenResponse(response);
// Create a new authorized API client
Webmasters service = new Webmasters.Builder(httpTransport, jsonFactory, credential)
.setApplicationName("WebmastersCommandLine")
.build();
Webmasters.Urlcrawlerrorssamples.List req2 = service.urlcrawlerrorssamples().list(PAGE_URL, "notFound", "web");
try
{
UrlCrawlErrorsSamplesListResponse urlList = req2.execute();
System.out.println("start");
for(UrlCrawlErrorsSample sample : urlList.getUrlCrawlErrorSample())
{
Webmasters.Urlcrawlerrorssamples.Get req3 = service.urlcrawlerrorssamples().get(PAGE_URL, sample.getPageUrl(), "notFound", "web");
UrlCrawlErrorsSample details = req3.execute();
System.out.println(sample.getPageUrl() + "," + details.getUrlDetails().getLinkedFromUrls());
}
}
catch(IOException e)
{
System.out.println("An error occurred: " + e);
}
System.out.println("done");
}
}
This, however, only gives me a list of 1,000 errors, but I need all 10,000 of them. Does anybody know a way to do that?
The Webmaster Tools API URL Crawl Errors Sample method returns a sample of 1000 crawl errors. It's not meant to return a complete list (you could compile that from your server logs). If you want more samples through the API, one thing you can do is to mark these errors as fixed and check back in a day. It will then generate a set of samples from the remaining crawl errors.
The order of the samples is the same as in the UI, so the more important ones will be the first ones you see. This means that there are diminishing returns as you move on, with later crawl errors being either similar to the previous ones, or at least seen as being less critical. The original blog post has more on the prioritization:
We determine this based on a multitude of factors, including whether or not you included the URL in a Sitemap, how many places it's linked from (and if any of those are also on your site), and whether the URL has gotten any traffic recently from search.
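To illustrate the "mark these errors as fixed" suggestion, here is a hedged sketch that reuses the service object, PAGE_URL, and urlList from the question's code; markAsFixed is, to the best of my knowledge, the method the URL Crawl Errors Samples resource exposes for this:
// After processing the current batch, mark each sampled URL as fixed so that a
// later run can surface a fresh set of samples.
for (UrlCrawlErrorsSample sample : urlList.getUrlCrawlErrorSample()) {
    service.urlcrawlerrorssamples()
            .markAsFixed(PAGE_URL, sample.getPageUrl(), "notFound", "web")
            .execute();
}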

Cannot access Azure blobs through REST API

I was able to create a container in a storage account and upload a blob to it through the client-side code.
I was able to make the blob available for public access as well, such that when I hit the following URL from my browser, I am able to see the image which I uploaded.
https://MYACCOUNT.blob.core.windows.net/MYCONTAINER/MYBLOB
I now have a requirement to use the REST service to retrieve the contents of the blob. I wrote the following Java code.
package main;
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.TimeZone;
public class GetBlob {
public static void main(String[] args) {
String url="https://MYACCOUNT.blob.core.windows.net/MYCONTAINER/MYBLOB";
try {
System.out.println("RUNNIGN");
HttpURLConnection connection = (HttpURLConnection) new URL(url).openConnection();
connection.setRequestProperty("Authorization", createQuery());
connection.setRequestProperty("x-ms-version", "2009-09-19");
InputStream response = connection.getInputStream();
System.out.println("SUCCESSS");
String line;
BufferedReader reader = new BufferedReader(new InputStreamReader(response));
while ((line = reader.readLine()) != null) {
System.out.println(line);
}
} catch (IOException e) {
e.printStackTrace();
}
}
public static String createQuery()
{
String dateFormat="EEE, dd MMM yyyy hh:mm:ss zzz";
SimpleDateFormat dateFormatGmt = new SimpleDateFormat(dateFormat);
dateFormatGmt.setTimeZone(TimeZone.getTimeZone("UTC"));
String date=dateFormatGmt.format(new Date());
String Signature="GET\n\n\n\n\n\n\n\n\n\n\n\n" +
"x-ms-date:" +date+
"\nx-ms-version:2009-09-19" ;
// I do not know CANOCALIZED RESOURCE
//WHAT ARE THEY??
// +"\n/myaccount/myaccount/mycontainer\ncomp:metadata\nrestype:container\ntimeout:20";
String SharedKey="SharedKey";
String AccountName="MYACCOUNT";
String encryptedSignature=(encrypt(Signature));
String auth=""+SharedKey+" "+AccountName+":"+encryptedSignature;
return auth;
}
public static String encrypt(String clearTextPassword) {
try {
MessageDigest md = MessageDigest.getInstance("SHA-256");
md.update(clearTextPassword.getBytes());
return new sun.misc.BASE64Encoder().encode(md.digest());
} catch (NoSuchAlgorithmException e) {
}
return "";
}
}
However, I get the following error when I run this main class:
RUNNIGN
java.io.IOException: Server returned HTTP response code: 403 for URL: https://klabs.blob.core.windows.net/delete/Blob_1
at sun.net.www.protocol.http.HttpURLConnection.getInputStream(Unknown Source)
at sun.net.www.protocol.https.HttpsURLConnectionImpl.getInputStream(Unknown Source)
at main.MainClass.main(MainClass.java:61)
Question1: Why this error, did I miss any header/parameter?
Question2: Do I need to add headers in the first place, because I am able to hit the request from the browser without any issues.
Question3: Can it be an SSL issue? What is the concept of certificates, and how and where to add them? Do I really need them? Will I need them later, when I do bigger operations on my blob storage (I want to manage a thousand blobs)?
Will be thankful for any reference as well, within Azure and otherwise that could help me understand better.
:D
AFTER A FEW DAYS
Below is my new code for PutBlob in Azure. I believe I have fully resolved all header and parameter issues and my request is correct. However, I am still getting the same 403. I do not know what the issue is. Azure is proving to be pretty difficult.
A thing to note is that the container's name is delete, and I want to create a blob inside it, say newBlob. I tried initializing the urlPath in the code below with both "delete" and "delete/newBlob".
It does not work.
package main;
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.File;
import java.io.IOException;
import java.io.UnsupportedEncodingException;
import java.net.HttpURLConnection;
import java.net.URISyntaxException;
import java.net.URL;
import java.security.InvalidKeyException;
import java.security.NoSuchAlgorithmException;
import java.text.SimpleDateFormat;
import java.util.Calendar;
import java.util.TimeZone;
import javax.crypto.Mac;
import javax.crypto.spec.SecretKeySpec;
import com.sun.org.apache.xml.internal.security.exceptions.Base64DecodingException;
import com.sun.org.apache.xml.internal.security.utils.Base64;
public class Internet {
static String key="password";
static String account="klabs";
private static Base64 base64 ;
private static String createAuthorizationHeader(String canonicalizedString) throws InvalidKeyException, Base64DecodingException, NoSuchAlgorithmException, IllegalStateException, UnsupportedEncodingException {
Mac mac = Mac.getInstance("HmacSHA256");
mac.init(new SecretKeySpec(base64.decode(key), "HmacSHA256"));
String authKey = new String(base64.encode(mac.doFinal(canonicalizedString.getBytes("UTF-8"))));
String authStr = "SharedKey " + account + ":" + authKey;
return authStr;
}
public static void main(String[] args) {
System.out.println("INTERNET");
String key="password";
String account="klabs";
long blobLength="Dipanshu Verma wrote this".getBytes().length;
File f = new File("C:\\Users\\Dipanshu\\Desktop\\abc.txt");
String requestMethod = "PUT";
String urlPath = "delete";
String storageServiceVersion = "2009-09-19";
SimpleDateFormat fmt = new SimpleDateFormat("EEE, dd MMM yyyy HH:mm:sss");
fmt.setTimeZone(TimeZone.getTimeZone("UTC"));
String date = fmt.format(Calendar.getInstance().getTime()) + " UTC";
String blobType = "BlockBlob";
String canonicalizedHeaders = "x-ms-blob-type:"+blobType+"\nx-ms-date:"+date+"\nx-ms-version:"+storageServiceVersion;
String canonicalizedResource = "/"+account+"/"+urlPath;
String stringToSign = requestMethod+"\n\n\n"+blobLength+"\n\n\n\n\n\n\n\n\n"+canonicalizedHeaders+"\n"+canonicalizedResource;
try {
String authorizationHeader = createAuthorizationHeader(stringToSign);
URL myUrl = new URL("https://klabs.blob.core.windows.net/" + urlPath);
HttpURLConnection connection=(HttpURLConnection)myUrl.openConnection();
connection.setRequestProperty("x-ms-blob-type", blobType);
connection.setRequestProperty("Content-Length", String.valueOf(blobLength));
connection.setRequestProperty("x-ms-date", date);
connection.setRequestProperty("x-ms-version", storageServiceVersion);
connection.setRequestProperty("Authorization", authorizationHeader);
connection.setDoOutput(true);
connection.setRequestMethod("POST");
System.out.println(String.valueOf(blobLength));
System.out.println(date);
System.out.println(storageServiceVersion);
System.out.println(stringToSign);
System.out.println(authorizationHeader);
System.out.println(connection.getDoOutput());
DataOutputStream outStream = new DataOutputStream(connection.getOutputStream());
// Send request
outStream.writeBytes("Dipanshu Verma wrote this");
outStream.flush();
outStream.close();
DataInputStream inStream = new DataInputStream(connection.getInputStream());
System.out.println("BULLA");
String buffer;
while((buffer = inStream.readLine()) != null) {
System.out.println(buffer);
}
// Close I/O streams
inStream.close();
outStream.close();
} catch (InvalidKeyException | Base64DecodingException | NoSuchAlgorithmException | IllegalStateException | UnsupportedEncodingException e) {
// TODO Auto-generated catch block
e.printStackTrace();
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
}
}
I know only a proper code review might be able to help me; please do it if you can.
Thanks
Question1: Why this error, did I miss any header/parameter?
Most likely you're getting this error because of an incorrect signature. Please refer to the MSDN documentation for creating the correct signature: http://msdn.microsoft.com/en-us/library/azure/dd179428.aspx. Unless your signature is correct, you'll not be able to perform operations using the REST API.
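In particular, the signature has to be an HMAC-SHA256 over the string-to-sign, keyed with the Base64-decoded storage account access key; the encrypt() method in your GetBlob code computes a plain SHA-256 digest instead. A minimal sketch of just the signing step (my own, using only standard JDK classes; the string-to-sign itself must still be built exactly per the MSDN page linked above):
import javax.crypto.Mac;
import javax.crypto.spec.SecretKeySpec;
import java.util.Base64;
public class SharedKeySigner {
    // base64AccountKey is the access key copied from the Azure portal, not an arbitrary password.
    static String sign(String stringToSign, String base64AccountKey) throws Exception {
        Mac mac = Mac.getInstance("HmacSHA256");
        mac.init(new SecretKeySpec(Base64.getDecoder().decode(base64AccountKey), "HmacSHA256"));
        return Base64.getEncoder().encodeToString(mac.doFinal(stringToSign.getBytes("UTF-8")));
    }
}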
Question2: Do I need to add headers in the first place, because I am able to hit the request from the browser without any issues.
In your current scenario, because you can access the blob directly (which in turn means the container in which the blob exists has a Public or Blob ACL), you don't really need to use the REST API. You can simply make an HTTP request using Java and read the response stream, which will have the blob contents. You would need to go down this route if the container ACL were Private, because in that case your requests need to be authenticated, and the code above creates an authenticated request.
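As a concrete illustration of that plain-HTTP approach, a minimal sketch (my own; the URL is the public blob address from the question, and no Authorization header is needed because the container's ACL is Public/Blob):
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;
public class GetPublicBlob {
    public static void main(String[] args) throws Exception {
        // Plain GET against the public blob URL; no signed Authorization header required.
        URL url = new URL("https://MYACCOUNT.blob.core.windows.net/MYCONTAINER/MYBLOB");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        try (BufferedReader reader =
                new BufferedReader(new InputStreamReader(conn.getInputStream()))) {
            String line;
            while ((line = reader.readLine()) != null) {
                System.out.println(line);
            }
        }
    }
}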
Question3: Can it be an SSL issue? What is the concept of certificates, and how and where to add them? Do I really need them? Will I need them later, when I do bigger operations on my blob storage (I want to manage a thousand blobs)?
No, it is not an SSL issue. It's an issue with an incorrect signature.
Finally found the mistake!!
In the code above, I was using the string "password" as the key for my HMAC-SHA256:
base64.decode(key)
It should have been the key associated with my Azure storage account.
Silly one!! Took me 2 weeks to find.

How to set up the zxing library on a Windows 8 machine?

I have images of codes that I want to decode. How can I use zxing so that I specify the image location and get the decoded text back, and, in case the decoding fails (it will for some images; that's the project), it gives me an error?
How can I set up zxing on my Windows machine? I downloaded the jar file, but I don't know where to start. I understand I'll have to write code to read the image and supply it to the library's reader method, but a guide on how to do that would be very helpful.
I was able to do it. I downloaded the source and added the following code. A bit rustic, but it gets the work done.
import com.google.zxing.NotFoundException;
import com.google.zxing.ChecksumException;
import com.google.zxing.FormatException;
import com.google.zxing.BarcodeFormat;
import com.google.zxing.DecodeHintType;
import com.google.zxing.Reader;
import com.google.zxing.BinaryBitmap;
import com.google.zxing.Result;
import com.google.zxing.LuminanceSource;
import com.google.zxing.client.j2se.BufferedImageLuminanceSource;
import com.google.zxing.common.HybridBinarizer;
import java.awt.image.BufferedImage;
import javax.imageio.ImageIO;
import java.io.File;
import java.io.IOException;
import java.util.*;
import com.google.zxing.qrcode.QRCodeReader;
class qr
{
public static void main(String args[])
{
Reader xReader = new QRCodeReader();
BufferedImage dest = null;
try
{
dest = ImageIO.read(new File(args[0]));
}
catch(IOException e)
{
System.out.println("Cannot load input image");
}
LuminanceSource source = new BufferedImageLuminanceSource(dest);
BinaryBitmap bitmap = new BinaryBitmap(new HybridBinarizer(source));
Vector<BarcodeFormat> barcodeFormats = new Vector<BarcodeFormat>();
barcodeFormats.add(BarcodeFormat.QR_CODE);
HashMap<DecodeHintType, Object> decodeHints = new HashMap<DecodeHintType, Object>(3);
decodeHints.put(DecodeHintType.POSSIBLE_FORMATS, barcodeFormats);
decodeHints.put(DecodeHintType.TRY_HARDER, Boolean.TRUE);
Result result = null;
try
{
result = xReader.decode(bitmap, decodeHints);
System.out.println("Code Decoded");
String text = result.getText();
System.out.println(text);
}
catch(NotFoundException e)
{
System.out.println("Decoding Failed");
}
catch(ChecksumException e)
{
System.out.println("Checksum error");
}
catch(FormatException e)
{
System.out.println("Wrong format");
}
}
}
The project includes a class called CommandLineRunner which you can simply call from the command line. You can also look at its source to see how it works and reuse it.
There is nothing to install or set up. It's a library. Typically you don't download the jar but declare it as a dependency in your Maven-based project.
If you just want to send an image to decode, use http://zxing.org/w/decode.jspx
