Star Micronics receipt printer: printing an image from a database - Java

I have an SM-T300i and I'm trying to figure out how to print an image from a database. I have the image data, but I'm not sure how to plug it in. I have successfully added an image from the assets, but not from raw image data; the code below loads from assets. Also, for some reason the image in the code below will not center. Is there something else I need to do to center the image? Thank you.
AssetManager assetManager = mContext.getAssets();
InputStream istr = null;
try {
    istr = assetManager.open("www/img/logo.bmp");
} catch (IOException e) {
    e.printStackTrace();
}
Bitmap bm = BitmapFactory.decodeStream(istr);
StarBitmap starbitmap = new StarBitmap(bm, false, 200);
commands.add(new byte[] { 0x1b, 0x61, 0x01 }); // ESC a 1: align center
commands.add(starbitmap.getImageEscPosDataForPrinting(false, false));

It looks like you can just decode the Base64 string and turn it into a Bitmap:
String imagex = "iVBORw0KGgoAAAANS etc";
Bitmap bm = StringToBitMap(imagex);
StarBitmap starbitmap = new StarBitmap(bm, true, 600);
commands.add(starbitmap.getImageEscPosDataForPrinting(false, true));

public Bitmap StringToBitMap(String encodedString) {
    try {
        byte[] encodeByte = Base64.decode(encodedString, Base64.DEFAULT);
        return BitmapFactory.decodeByteArray(encodeByte, 0, encodeByte.length);
    } catch (Exception e) {
        e.printStackTrace(); // decoding failed, e.g. the string is not valid Base64
        return null;
    }
}
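If the database hands back the image as a raw BLOB (a byte[]) rather than a Base64 string, the decode step can be skipped and the bytes fed straight to BitmapFactory. A minimal sketch, where readLogoFromDatabase() is a hypothetical helper standing in for however you query the image column:
byte[] imageBytes = readLogoFromDatabase(); // hypothetical helper returning the raw image file bytes (BLOB)
Bitmap bm = BitmapFactory.decodeByteArray(imageBytes, 0, imageBytes.length);
if (bm != null) {
    StarBitmap starbitmap = new StarBitmap(bm, false, 200);
    commands.add(new byte[] { 0x1b, 0x61, 0x01 }); // ESC a 1: align center
    commands.add(starbitmap.getImageEscPosDataForPrinting(false, false));
}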

Related

How to convert blob into .png and send it via firebase notification?

I have a problem when I try to send a .png image via the Firebase service. The notification works fine and arrives on my phone, but without the image. I've also tried putting a direct link to an image from the internet, but when I try to convert the blob from the database into a .png and send it that way, the picture is not sent. I guess I am not sending the image the right way?
Here is my controller code below:
Company company = companyService.findCompanyByName(systemUserService.findByUsername(principal.getName()).getCompany().getName());
Notification notify = notificationService.findByName(name);
System.out.println("Title: " + notify.getName());
System.out.println("Message: " + notify.getText());

JSONObject body = new JSONObject();
body.put("to", "/topics/" + TOPIC);
body.put("priority", "high");

JSONObject notification = new JSONObject();
notification.put("title", notify.getName());
notification.put("body", notify.getText());
notification.put("sound", "default");

try {
    byte[] aByteArray = company.getLogo();
    int width = 1;
    int height = 2;
    DataBuffer buffer = new DataBufferByte(aByteArray, aByteArray.length);
    WritableRaster raster = Raster.createInterleavedRaster(buffer, width, height, 3 * width, 3, new int[] {0, 1, 2}, (Point) null);
    ColorModel cm = new ComponentColorModel(ColorModel.getRGBdefault().getColorSpace(), false, true, Transparency.OPAQUE, DataBuffer.TYPE_BYTE);
    BufferedImage image = new BufferedImage(cm, raster, true, null);
    ImageIO.write(image, "png", new File("image.png"));
    notification.put("image", image);
} catch (IOException e) {
    e.printStackTrace();
}

JSONObject data = new JSONObject();
data.put("Key-1", "JSA Data 1");
data.put("Key-2", "JSA Data 2");

body.put("notification", notification);
body.put("data", data);

HttpEntity<String> request = new HttpEntity<>(body.toString());
CompletableFuture<String> pushNotification = androidPushNotificationsService.send(request);
CompletableFuture.allOf(pushNotification).join();

try {
    String firebaseResponse = pushNotification.get();
    return new ResponseEntity<>(firebaseResponse, HttpStatus.OK);
} catch (InterruptedException e) {
    e.printStackTrace();
} catch (ExecutionException e) {
    e.printStackTrace();
}
return new ResponseEntity<>("Push Notification ERROR!", HttpStatus.BAD_REQUEST);
Try something like this: download the image from a URL and convert it to a Base64-encoded String that can be put into the JSON.
public String jsonifyImage(String imageUrl) {
    try {
        URL url = new URL(imageUrl);
        HttpURLConnection connection = (HttpURLConnection) url.openConnection();
        connection.setDoInput(true);
        connection.connect();
        Bitmap bMap = BitmapFactory.decodeStream(connection.getInputStream());
        ByteArrayOutputStream oStream = new ByteArrayOutputStream();
        bMap.compress(Bitmap.CompressFormat.PNG, 100, oStream);
        byte[] byteArr = oStream.toByteArray();
        return Base64.getEncoder().encodeToString(byteArr);
    } catch (Exception e) {
        // handle the error (bad URL, connection failure, undecodable image, ...)
        return null;
    }
}
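A hedged usage sketch (logoUrl is a placeholder for wherever the logo is hosted, not a value from the question):
String logoUrl = "https://example.com/logo.png"; // placeholder URL, adjust to where the logo actually lives
String encodedLogo = jsonifyImage(logoUrl);
if (encodedLogo != null) {
    notification.put("image", encodedLogo);
}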
The result should be a field like image: "...base64 of the image file...". If the logo is already stored as a blob, you can encode it directly:
byte[] aByteArray = company.getLogo();
String image = Base64.getEncoder().encodeToString(aByteArray);
notification.put("image", image);
For the JSON, of course, no image needs to be decoded or written to disk; the blob should already contain the bytes of the image file.
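On the receiving Android client, that Base64 string can be decoded back into a Bitmap. A minimal sketch, assuming the string arrives in the message payload under a key of your choosing (the variable name imageBase64 is an assumption):
// imageBase64 is assumed to be the Base64 string taken out of the received message payload
byte[] imageBytes = android.util.Base64.decode(imageBase64, android.util.Base64.DEFAULT);
Bitmap logo = BitmapFactory.decodeByteArray(imageBytes, 0, imageBytes.length);
// logo can now be shown, e.g. set as the large icon on a locally built notification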

Save zoom level to image file

Let's say that I want to load a .shp file, do my stuff on it, and save the map as an image.
In order to save an image I am using:
public void saveImage(final MapContent map, final String file, final int imageWidth) {
    GTRenderer renderer = new StreamingRenderer();
    renderer.setMapContent(map);

    Rectangle imageBounds = null;
    ReferencedEnvelope mapBounds = null;
    try {
        mapBounds = map.getMaxBounds();
        double heightToWidth = mapBounds.getSpan(1) / mapBounds.getSpan(0);
        imageBounds = new Rectangle(0, 0, imageWidth, (int) Math.round(imageWidth * heightToWidth));
    } catch (Exception e) {
        // Failed to access map layers
        throw new RuntimeException(e);
    }

    BufferedImage image = new BufferedImage(imageBounds.width, imageBounds.height, BufferedImage.TYPE_INT_RGB);
    Graphics2D gr = image.createGraphics();
    gr.setPaint(Color.WHITE);
    gr.fill(imageBounds);

    try {
        renderer.paint(gr, imageBounds, mapBounds);
        File fileToSave = new File(file);
        ImageIO.write(image, "png", fileToSave);
    } catch (IOException e) {
        throw new RuntimeException(e);
    }
}
But, let's say I am doing something like this:
...
MapContent map = new MapContent();
map.setTitle("TEST");
map.addLayer(layer);
map.addLayer(shpLayer);
// zoom into the line
MapViewport viewport = new MapViewport(featureCollection.getBounds());
map.setViewport(viewport);
saveImage(map, "/tmp/img.png", 800);
1) The problem is that the zoom level isn't reflected in the saved image file. Is there a way to save it?
2) When I am doing MapViewport(featureCollection.getBounds()); is there a way to extend the boundaries a little bit, in order to have a better visual representation?
...
The reason that you aren't saving the map at the current zoom level is that in your saveImage method you have the line:
mapBounds = map.getMaxBounds();
which always uses the full extent of the map. You can change this to:
mapBounds = map.getViewport().getBounds();
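With that change, the start of saveImage's try block would look like this (a minimal sketch of the one-line swap; everything else stays the same):
try {
    // use the current viewport instead of the full map extent
    mapBounds = map.getViewport().getBounds();
    double heightToWidth = mapBounds.getSpan(1) / mapBounds.getSpan(0);
    imageBounds = new Rectangle(0, 0, imageWidth, (int) Math.round(imageWidth * heightToWidth));
} catch (Exception e) {
    throw new RuntimeException(e);
}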
You can expand a bounding box by something like:
ReferencedEnvelope bounds = featureCollection.getBounds();
double delta = bounds.getWidth() / 20.0; // 5% on each side
bounds.expandBy(delta);
MapViewport viewport = new MapViewport(bounds);
map.setViewport(viewport);
A quicker (and easier) way to save a map from the GUI is to use a method like this which just saves exactly what is on the screen:
public void drawMapToImage(File outputFile, String outputType, JMapPane mapPane) {
    ImageOutputStream outputImageFile = null;
    FileOutputStream fileOutputStream = null;
    try {
        fileOutputStream = new FileOutputStream(outputFile);
        outputImageFile = ImageIO.createImageOutputStream(fileOutputStream);
        RenderedImage bufferedImage = mapPane.getBaseImage();
        ImageIO.write(bufferedImage, outputType, outputImageFile);
    } catch (IOException ex) {
        ex.printStackTrace();
    } finally {
        try {
            if (outputImageFile != null) {
                outputImageFile.flush();
                outputImageFile.close();
            }
            if (fileOutputStream != null) {
                fileOutputStream.flush();
                fileOutputStream.close();
            }
        } catch (IOException e) {
            // don't care now
        }
    }
}

How to draw text on an image view in Android

I have an app that calculates phone usage time, and you can share the result on social networks. Instead of a plain getText(), I want to put the results on an image and save it to the phone. How can I create an image, with custom text on it, that I can save to the phone?
What you want to do is paint to a Canvas that's backed by a Bitmap, and then save the bitmap. Here are a few bits of code that you should be able to string together to make it work. Note that you'll still have to add the painting to the canvas, in the getBitmap() function; a sketch of that step follows the code.
private Bitmap getBitmap() {
    Bitmap bitmap = Bitmap.createBitmap(mPieToss.getWidth(),
            mPieToss.getHeight(), Bitmap.Config.ARGB_8888);
    Canvas canvas = new Canvas(bitmap);
    // Draw things to your canvas here. They will be included in the Bitmap, which is saved later on.
    return bitmap;
}

public void save() {
    Bitmap bitmap = getBitmap();
    File path = Environment
            .getExternalStoragePublicDirectory(Environment.DIRECTORY_PICTURES);
    String timestamp = new SimpleDateFormat("yyyyMMdd_HHmmss")
            .format(new Date());
    String filename = "Imagen_" + timestamp + ".png"; // .png to match the PNG compression below
    File file = new File(path, filename);
    FileOutputStream stream;
    // This can fail if the external storage is mounted via USB
    try {
        stream = new FileOutputStream(file);
        bitmap.compress(CompressFormat.PNG, 100, stream);
        stream.close();
    } catch (FileNotFoundException e) {
        e.printStackTrace();
    } catch (IOException e) {
        e.printStackTrace();
    }
    mUri = Uri.fromFile(file);
    bitmap.recycle();
}
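As a sketch of the drawing step that the getBitmap() comment refers to (the text, colors, and coordinates here are illustrative assumptions, not values from the question):
// Inside getBitmap(), after creating the Canvas:
Paint paint = new Paint(Paint.ANTI_ALIAS_FLAG);
paint.setColor(Color.WHITE);
paint.setTextSize(48f);
canvas.drawColor(Color.DKGRAY);                            // illustrative background fill
canvas.drawText("Usage today: 2h 13m", 40f, 80f, paint);  // illustrative text and position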
First of all, convert your layout to a bitmap:
View contentLayout; // your layout with the background image and TextView
Bitmap bitmap = Bitmap.createBitmap(screenWidth, screenHeight, Bitmap.Config.ARGB_8888);
Canvas canvas = new Canvas(bitmap);
contentLayout.draw(canvas);
Now you can save it to the file:
FileOutputStream fos = new FileOutputStream(fileName, false);
bitmap.compress(Bitmap.CompressFormat.PNG, 75, fos);
fos.flush();
fos.close();

Taking a photo upon QR code scan

I have an application that has zxing integrated. I've been looking at trying to store a photo when a QR code is scanned. Sean Owen recommended the following:
"The app is getting a continuous stream of frames from the camera to analyze. You can store off any of them by intercepting them in the preview callback."
As far as I am aware, the only use of the preview callback is within the CameraManager.java class (https://code.google.com/p/zxing/source/browse/trunk/android/src/com/google/zxing/client/android/camera/CameraManager.java).
In particular:
public synchronized void requestPreviewFrame(Handler handler, int message) {
    Camera theCamera = camera;
    if (theCamera != null && previewing) {
        previewCallback.setHandler(handler, message);
        theCamera.setOneShotPreviewCallback(previewCallback);
    }
}
Since this runs every frame, I don't have a way of saving (preferably as byte data) any particular frame. I would have assumed there to be a point where something is passed back to the CaptureActivity.java class (link given at the bottom), but I haven't found anything myself.
Anyone who has used ZXing will know that after a scan a ghostly image of the scanned frame is shown on screen; if it is possible to hijack this part of the code and convert and/or save that data as bytes, that may also be useful.
Any help or other ideas would be much appreciated. Requests for further information will be responded to quickly. Thank you.
Full code available within this folder: https://code.google.com/p/zxing/source/browse/trunk#trunk%2Fandroid%2Fsrc%2Fcom%2Fgoogle%2Fzxing%2Fclient%2Fandroid
Update:
So far, the following sections of code appear to be possible places to save byte data; both are within the DecodeHandler.java class.
private void decode(byte[] data, int width, int height) {
    long start = System.currentTimeMillis();
    Result rawResult = null;
    PlanarYUVLuminanceSource source = activity.getCameraManager().buildLuminanceSource(data, width, height);
    if (source != null) {
        BinaryBitmap bitmap = new BinaryBitmap(new HybridBinarizer(source));
        // here?
        try {
            rawResult = multiFormatReader.decodeWithState(bitmap);
        } catch (ReaderException re) {
            // continue
        } finally {
            multiFormatReader.reset();
        }
    }
    Handler handler = activity.getHandler();
    if (rawResult != null) {
        // Don't log the barcode contents for security.
        long end = System.currentTimeMillis();
        Log.d(TAG, "Found barcode in " + (end - start) + " ms");
        if (handler != null) {
            Message message = Message.obtain(handler, R.id.decode_succeeded, rawResult);
            Bundle bundle = new Bundle();
            Bitmap grayscaleBitmap = toBitmap(source, source.renderCroppedGreyscaleBitmap());
            // I believe this bitmap is the one shown on screen after a scan has been performed
            bundle.putParcelable(DecodeThread.BARCODE_BITMAP, grayscaleBitmap);
            message.setData(bundle);
            message.sendToTarget();
        }
    } else {
        if (handler != null) {
            Message message = Message.obtain(handler, R.id.decode_failed);
            message.sendToTarget();
        }
    }
}

private static Bitmap toBitmap(LuminanceSource source, int[] pixels) {
    int width = source.getWidth();
    int height = source.getHeight();
    Bitmap bitmap = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888);
    bitmap.setPixels(pixels, 0, width, 0, 0, width, height);
    // saving the bitmap at this point, or slightly sooner before grey-scaling, could work
    return bitmap;
}
Update: Requested code found within PreviewCallback.java
public void onPreviewFrame(byte[] data, Camera camera) {
    Point cameraResolution = configManager.getCameraResolution();
    Handler thePreviewHandler = previewHandler;
    if (cameraResolution != null && thePreviewHandler != null) {
        Message message = thePreviewHandler.obtainMessage(previewMessage, cameraResolution.x,
                cameraResolution.y, data);
        message.sendToTarget();
        previewHandler = null;
    } else {
        Log.d(TAG, "Got preview callback, but no handler or resolution available");
    }
}
The data from the preview callback is in NV21 format, so if you want to save it you can use code like this:
YuvImage im = new YuvImage(byteArray, ImageFormat.NV21, width, height, null);
Rect r = new Rect(0, 0, width, height);
ByteArrayOutputStream baos = new ByteArrayOutputStream();
im.compressToJpeg(r, 50, baos);
try {
    FileOutputStream output = new FileOutputStream("/sdcard/test_jpg.jpg");
    output.write(baos.toByteArray());
    output.flush();
    output.close();
} catch (FileNotFoundException e) {
    // handle missing/unwritable path
} catch (IOException e) {
    // handle write failure
}
The point to save is when ZXing has successfully decoded the byte[] and returned the content String; a sketch of wiring that up follows.
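A minimal sketch of that hook, placed inside DecodeHandler.decode() in the rawResult != null branch. The saveFrame helper name and the output path are assumptions; data, width, and height are the parameters already passed to decode():
// Inside decode(byte[] data, int width, int height), in the rawResult != null branch:
if (rawResult != null) {
    saveFrame(data, width, height); // hypothetical helper wrapping the NV21 snippet above
    // ... existing decode_succeeded handling ...
}

// Hypothetical helper using the NV21 -> JPEG snippet shown earlier
private void saveFrame(byte[] data, int width, int height) {
    YuvImage im = new YuvImage(data, ImageFormat.NV21, width, height, null);
    ByteArrayOutputStream baos = new ByteArrayOutputStream();
    im.compressToJpeg(new Rect(0, 0, width, height), 50, baos);
    try (FileOutputStream output = new FileOutputStream("/sdcard/scanned_frame.jpg")) { // assumed output path
        output.write(baos.toByteArray());
    } catch (IOException e) {
        Log.w(TAG, "Could not save scanned frame", e);
    }
}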

Android taking a screenshot - partial black Google map

Sometimes it works, sometimes it doesn't. I am sharing this picture on Facebook, and sometimes it is OK, but sometimes it is partially black (only the Google map part!).
I have a separate thread doing this screenshot:
new Thread() {
    @Override
    public void run() {
        // Get root view
        View view = mapView.getRootView();
        view.setDrawingCacheEnabled(true);
        // Create the bitmap to use to draw the screenshot
        final Bitmap bitmap = Bitmap.createBitmap(
                getWindowManager().getDefaultDisplay().getWidth(),
                getWindowManager().getDefaultDisplay().getHeight(),
                Bitmap.Config.ARGB_4444);
        final Canvas canvas = new Canvas(bitmap);
        // Get current theme to know which background to use
        final Theme theme = activity.getTheme();
        final TypedArray ta = theme
                .obtainStyledAttributes(new int[] { android.R.attr.windowBackground });
        final int res = ta.getResourceId(0, 0);
        final Drawable background = activity.getResources().getDrawable(res);
        // Draw background
        background.draw(canvas);
        // Draw views
        view.draw(canvas);
        ByteArrayOutputStream baos = new ByteArrayOutputStream();
        bitmap.compress(Bitmap.CompressFormat.PNG, 90, baos); // compress the screenshot bitmap into PNG bytes
        byte[] b = baos.toByteArray();
        Bundle params = new Bundle();
        params.putByteArray("source", b);
        params.putString("message", "genie in a bottle");
        try {
            //String resp =
            facebook.request("me/photos", params, "POST");
        } catch (FileNotFoundException e) {
            throw new RuntimeException(e);
        } catch (MalformedURLException e) {
            throw new RuntimeException(e);
        } catch (IOException e) {
            throw new RuntimeException(e);
        }
    }
}.start();
You can create the Bitmap object directly from the screen using:
bitmap = Bitmap.createBitmap(view.getDrawingCache());
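For that one-liner to return something non-null, the drawing cache has to be enabled and built first. A minimal sketch of the whole sequence, assuming view is the same root view obtained in the thread above:
View view = mapView.getRootView();            // same root view as in the thread above
view.setDrawingCacheEnabled(true);
view.buildDrawingCache();                     // force the cache to be built before copying it
Bitmap bitmap = Bitmap.createBitmap(view.getDrawingCache());
view.setDrawingCacheEnabled(false);           // release the cache once the copy is made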
Regards.
