Can't scan QR image with Cyrillic charset (ZXing, Android, Java)

Sorry for my English. When I scan my generated QR image I get "???????" instead of my text. I use the ZXing library and have tried without success. The string contains Russian (Cyrillic) characters, and I don't know why the scanner turns every symbol into "?". Below is my code.
final QRCodeWriter writer = new QRCodeWriter();
ImageView tnsd_iv_qr = (ImageView) findViewById(R.id.qrImage);
Charset charset = Charset.forName("UTF-8");
CharsetEncoder encoder = charset.newEncoder();
byte[] b = null;
try {
    // Convert a string to UTF-8 bytes in a ByteBuffer
    ByteBuffer bbuf = encoder.encode(CharBuffer.wrap(summ.getText().toString() + "/"
            + getNumber.substring(1) + "/"
            + getName + "/"
            + getIdDepartament + "/"
            + getIdUser + "/"
            + getSpinnerItem));
    b = bbuf.array();
} catch (CharacterCodingException e) {
    //
}
String data;
try {
    data = new String(b, "UTF-8");
    Hashtable<EncodeHintType, String> hints = new Hashtable<EncodeHintType, String>(2);
    hints.put(EncodeHintType.CHARACTER_SET, "UTF-8");
    ByteMatrix bitMatrix = writer.encode(data, BarcodeFormat.QR_CODE, 512, 512, hints);
    int width = 512;
    int height = 512;
    Bitmap bmp = Bitmap.createBitmap(width, height, Bitmap.Config.RGB_565);
    for (int x = 0; x < width; x++) {
        for (int y = 0; y < height; y++) {
            if (bitMatrix.get(x, y) == 0)
                bmp.setPixel(x, y, Color.BLACK);
            else
                bmp.setPixel(x, y, Color.WHITE);
        }
    }
    tnsd_iv_qr.setImageBitmap(bmp);
} catch (WriterException | UnsupportedEncodingException e) {
    e.printStackTrace();
}

I fixed it by changing these lines:
data = new String(b, "ISO-8859-1");
Hashtable<EncodeHintType, String> hints = new Hashtable<EncodeHintType, String>(2);
hints.put(EncodeHintType.CHARACTER_SET, "ISO-8859-1");
and it works!
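Why this works can be demonstrated in plain Java, independent of ZXing and Android. Decoding UTF-8 bytes as ISO-8859-1 maps every byte to exactly one char, so when the QR encoder later turns that string back into ISO-8859-1 bytes, the original UTF-8 byte sequence is embedded unchanged, and a scanner that auto-detects UTF-8 shows the Cyrillic text correctly. A minimal sketch (the class and method names here are mine, not part of any library):

```java
import java.nio.charset.StandardCharsets;
import java.util.Arrays;

public class CharsetRoundTrip {

    // Encode to UTF-8, reinterpret as ISO-8859-1, encode back to ISO-8859-1:
    // the resulting bytes are identical to the original UTF-8 bytes.
    static byte[] roundTrip(String s) {
        byte[] utf8 = s.getBytes(StandardCharsets.UTF_8);
        String latin1 = new String(utf8, StandardCharsets.ISO_8859_1); // 1 char per byte
        return latin1.getBytes(StandardCharsets.ISO_8859_1);           // bytes restored
    }

    public static void main(String[] args) {
        String payload = "Привет/123/отдел";
        byte[] original = payload.getBytes(StandardCharsets.UTF_8);
        byte[] restored = roundTrip(payload);
        System.out.println(Arrays.equals(original, restored));            // true
        System.out.println(new String(restored, StandardCharsets.UTF_8)); // Привет/123/отдел
    }
}
```

This appears to be why the hint change above helps: with CHARACTER_SET set to ISO-8859-1, the encoder writes one byte per char, i.e. the raw UTF-8 bytes.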


Save image file with BitmapFactory

I'm encountering an issue with BitmapFactory.
I have a method that shrinks and rotates an image to show a preview in an ImageView, but I would also like to save the image at its new size.
I keep going in circles with input and output file streams and never manage to save it.
Does anybody know a clean way to write my bitmap to a FileOutputStream?
Thanks a lot. Here's my code:
@Override
protected void onResume() {
    super.onResume();
    File chemin = Environment.getExternalStorageDirectory();
    String filepath = chemin + "/SmartCollecte/PARC/OUT/" + fichano + "_" + conteneur_s + "_" + cpt + ".jpg";
    try {
        decodeFile(filepath);
    } catch (FileNotFoundException e) {
        e.printStackTrace();
    } catch (IOException e) {
        e.printStackTrace();
    }
}

public void decodeFile(String filePath) throws IOException {
    // Decode image size
    BitmapFactory.Options o = new BitmapFactory.Options();
    o.inJustDecodeBounds = true;
    BitmapFactory.decodeFile(filePath, o);
    // The new size we want to scale to
    final int REQUIRED_SIZE = 1024;
    // Find the correct scale value. It should be a power of 2.
    int width_tmp = o.outWidth, height_tmp = o.outHeight;
    int scale = 1;
    while (true) {
        if (width_tmp < REQUIRED_SIZE && height_tmp < REQUIRED_SIZE)
            break;
        width_tmp /= 2;
        height_tmp /= 2;
        scale *= 2;
    }
    // Decode with inSampleSize
    BitmapFactory.Options o2 = new BitmapFactory.Options();
    o2.inSampleSize = scale;
    Bitmap b1 = BitmapFactory.decodeFile(filePath, o2);
    Bitmap b = ExifUtils.rotateBitmap(filePath, b1);
    FileOutputStream fos = new FileOutputStream(filePath);
    b.compress(Bitmap.CompressFormat.PNG, 100, fos);
    fos.close();
    showImg.setImageBitmap(b);
}
Have you tried doing it like this?
Assuming bitmap is bitmap you want to save.
Also, take a look at some existing system directories.
final FileOutputStream fos = new FileOutputStream(new File(filepath + "_scaled.jpg"));
try {
    bitmap.compress(Bitmap.CompressFormat.JPEG, 90, fos);
} finally {
    fos.close();
}
The first parameter of Bitmap.compress() is your desired output format (see CompressFormat) and the second parameter is the compression quality.
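As an aside, the power-of-two scale loop from the question can be isolated into a small pure-Java helper and checked without any Android classes (`computeInSampleSize` is a name chosen here, not an Android API):

```java
public class ImageSampling {

    // Halve both dimensions until each drops below the required size;
    // the accumulated factor is the power-of-two inSampleSize.
    static int computeInSampleSize(int width, int height, int requiredSize) {
        int scale = 1;
        while (width >= requiredSize || height >= requiredSize) {
            width /= 2;
            height /= 2;
            scale *= 2;
        }
        return scale;
    }

    public static void main(String[] args) {
        System.out.println(computeInSampleSize(3264, 2448, 1024)); // 4
        System.out.println(computeInSampleSize(800, 600, 1024));   // 1
    }
}
```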
OK, I found out what was missing.
I had to route my bitmap through a byte array to write it to a file:
String filepathcomp = Environment.getExternalStorageDirectory()+"/SmartCollecte/PARC/OUT/"+ fichano + "_" + conteneur_s+"_"+cpt+".jpg";
File f = new File(filepathcomp);
Bitmap newbitmap = b;
ByteArrayOutputStream bos = new ByteArrayOutputStream();
newbitmap.compress(Bitmap.CompressFormat.JPEG,80,bos);
byte[] bitmapdata = bos.toByteArray();
FileOutputStream fos = new FileOutputStream(f);
fos.write(bitmapdata);
fos.flush();
fos.close();

Android Camera2 API YUV_420_888 to JPEG

I'm getting preview frames using OnImageAvailableListener:
@Override
public void onImageAvailable(ImageReader reader) {
    Image image = null;
    try {
        image = reader.acquireLatestImage();
        Image.Plane[] planes = image.getPlanes();
        ByteBuffer buffer = planes[0].getBuffer();
        byte[] data = new byte[buffer.capacity()];
        buffer.get(data);
        // data.length=332803; width=3264; height=2448
        Log.e(TAG, "data.length=" + data.length + "; width=" + image.getWidth() + "; height=" + image.getHeight());
        // TODO data processing
    } catch (Exception e) {
        e.printStackTrace();
    }
    if (image != null) {
        image.close();
    }
}
Each time the length of data is different, but the image width and height are the same.
Main problem: data.length is far too small for a resolution such as 3264x2448.
The data array should hold 3264*2448 = 7,990,272 bytes, not 300,000-600,000.
What is wrong? My ImageReader is configured as:
imageReader = ImageReader.newInstance(3264, 2448, ImageFormat.JPEG, 5);
I solved this problem by using YUV_420_888 image format and converting it to JPEG image format manually.
imageReader = ImageReader.newInstance(MAX_PREVIEW_WIDTH, MAX_PREVIEW_HEIGHT,
        ImageFormat.YUV_420_888, 5);
imageReader.setOnImageAvailableListener(this, null);
Surface imageSurface = imageReader.getSurface();
List<Surface> surfaceList = new ArrayList<>();
// ...add other surfaces
previewRequestBuilder = cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
previewRequestBuilder.addTarget(imageSurface);
surfaceList.add(imageSurface);
cameraDevice.createCaptureSession(surfaceList,
        new CameraCaptureSession.StateCallback() {
            // ...implement onConfigured, onConfigureFailed for StateCallback
        }, null);
@Override
public void onImageAvailable(ImageReader reader) {
    Image image = reader.acquireLatestImage();
    if (image != null) {
        // converting to JPEG
        byte[] jpegData = ImageUtil.imageToByteArray(image);
        // write to file (for example ..some_path/frame.jpg)
        FileManager.writeFrame(FILE_NAME, jpegData);
        image.close();
    }
}
public final class ImageUtil {
    public static byte[] imageToByteArray(Image image) {
        byte[] data = null;
        if (image.getFormat() == ImageFormat.JPEG) {
            Image.Plane[] planes = image.getPlanes();
            ByteBuffer buffer = planes[0].getBuffer();
            data = new byte[buffer.capacity()];
            buffer.get(data);
            return data;
        } else if (image.getFormat() == ImageFormat.YUV_420_888) {
            data = NV21toJPEG(
                    YUV_420_888toNV21(image),
                    image.getWidth(), image.getHeight());
        }
        return data;
    }

    private static byte[] YUV_420_888toNV21(Image image) {
        byte[] nv21;
        ByteBuffer yBuffer = image.getPlanes()[0].getBuffer();
        ByteBuffer vuBuffer = image.getPlanes()[2].getBuffer();
        int ySize = yBuffer.remaining();
        int vuSize = vuBuffer.remaining();
        nv21 = new byte[ySize + vuSize];
        yBuffer.get(nv21, 0, ySize);
        vuBuffer.get(nv21, ySize, vuSize);
        return nv21;
    }

    private static byte[] NV21toJPEG(byte[] nv21, int width, int height) {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        YuvImage yuv = new YuvImage(nv21, ImageFormat.NV21, width, height, null);
        yuv.compressToJpeg(new Rect(0, 0, width, height), 100, out);
        return out.toByteArray();
    }
}
public final class FileManager {
    public static void writeFrame(String fileName, byte[] data) {
        try {
            BufferedOutputStream bos = new BufferedOutputStream(new FileOutputStream(fileName));
            bos.write(data);
            bos.flush();
            bos.close();
            // Log.e(TAG, "" + data.length + " bytes have been written to " + filesDir + fileName + ".jpg");
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
I am not sure, but I think you are taking only one of the planes of the YUV_420_888 format (the luminance part).
In my case, I usually transform my image to byte[] in this way.
Image m_img;
Log.v(LOG_TAG,"Format -> "+m_img.getFormat());
Image.Plane Y = m_img.getPlanes()[0];
Image.Plane U = m_img.getPlanes()[1];
Image.Plane V = m_img.getPlanes()[2];
int Yb = Y.getBuffer().remaining();
int Ub = U.getBuffer().remaining();
int Vb = V.getBuffer().remaining();
data = new byte[Yb + Ub + Vb];
//your data length should be this byte array length.
Y.getBuffer().get(data, 0, Yb);
U.getBuffer().get(data, Yb, Ub);
V.getBuffer().get(data, Yb+ Ub, Vb);
final int width = m_img.getWidth();
final int height = m_img.getHeight();
And I use this byte array to transform to RGB.
Hope this helps.
Cheers.
Unai.
Your code is requesting JPEG-format images, which are compressed. They'll change in size for every frame, and they'll be much smaller than the uncompressed image. If you want to do nothing besides save JPEG images, you can just save what you have in the byte[] data to disk and you're done.
If you want to actually do something with the JPEG, you can use BitmapFactory.decodeByteArray() to convert it to a Bitmap, for example, though that's pretty inefficient.
Or you can switch to YUV, which is more efficient, but you need to do more work to get a Bitmap out of it.
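The numbers in the question also line up with plain YUV arithmetic: plane 0 of a YUV_420_888 image holds one luminance byte per pixel, while a full packed NV21 frame holds 1.5 bytes per pixel because the chroma planes are subsampled 2x2. A tiny bookkeeping sketch (plain Java, no Android APIs; names are mine):

```java
public class YuvSizes {

    // One byte per pixel for the Y (luminance) plane.
    static int yPlaneBytes(int width, int height) {
        return width * height;
    }

    // NV21 = full Y plane + interleaved VU plane at quarter resolution each,
    // i.e. width*height + 2*(width/2)*(height/2) = width*height*3/2.
    static int nv21Bytes(int width, int height) {
        return width * height * 3 / 2;
    }

    public static void main(String[] args) {
        System.out.println(yPlaneBytes(3264, 2448)); // 7990272, the size the asker expected
        System.out.println(nv21Bytes(3264, 2448));   // 11985408, the full YUV frame
    }
}
```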

Why can't I show my byte[] image after converting it from a bitmap?

I'm trying to write a method that accepts an image (Bitmap) and returns a byte[] array. Finally, I write this byte[] array to a folder so I can see the difference, but my byte[] array cannot be displayed, and in addition, it is not scaled down! This is my method:
private byte[] changeSize(Bitmap image) {
    byte[] picture;
    int width = image.getWidth();
    int height = image.getHeight();
    int newHeight = 0, newWidth = 0;
    if (width > 250 || height > 250) {
        if (width > height) { // landscape mode
            newHeight = 200;
            newWidth = (newHeight * width) / height;
        } else { // portrait mode
            newWidth = 200;
            newHeight = (newWidth * height) / width;
        }
    } else {
        Toast.makeText(this, "Something wrong!", Toast.LENGTH_LONG).show();
    }
    Bitmap sizeChanged = Bitmap.createScaledBitmap(image, newWidth, newHeight, true);
    // Convert bitmap to a byte array
    int bytes = sizeChanged.getByteCount();
    ByteBuffer bb = ByteBuffer.allocate(bytes);
    sizeChanged.copyPixelsToBuffer(bb);
    picture = bb.array();
    // Write to disk
    picturePath = Environment.getExternalStoragePublicDirectory(Environment.DIRECTORY_PICTURES);
    String fileName = edFile.getText().toString() + "_downscaled" + ".jpg";
    File file = new File(picturePath, fileName);
    FileOutputStream fos = null;
    try {
        fos = new FileOutputStream(file);
        fos.write(picture);
        fos.close();
    } catch (Exception e) {
        e.printStackTrace();
    }
    return picture;
}
I tried for several hours to make my byte[] array viewable, but simply could not. Any help or hints showing me where I go wrong are very appreciated.
This works for me:
public static Bitmap byteArraytoBitmap(byte[] bytes) {
    return BitmapFactory.decodeByteArray(bytes, 0, bytes.length);
}

public static byte[] bitmaptoByteArray(Bitmap bmp) {
    ByteArrayOutputStream stream = new ByteArrayOutputStream();
    bmp.compress(Bitmap.CompressFormat.PNG, 100, stream); // PNG format is lossless and will ignore the quality setting!
    byte[] byteArray = stream.toByteArray();
    return byteArray;
}

public static Bitmap bitmapFromFile(File file) {
    // returns null if it could not decode
    return BitmapFactory.decodeFile(file.getPath());
}

public static boolean saveImage(Bitmap image, String filePath) {
    LogInfo(TAG, "Saving image to: " + filePath);
    File file = new File(filePath);
    File fileDirectory = new File(file.getParent());
    LogInfo(TAG, fileDirectory.getPath());
    if (!fileDirectory.exists()) {
        if (!fileDirectory.mkdirs()) {
            Log.e(TAG, "ERROR CREATING DIRECTORIES");
            return false;
        }
    }
    try {
        file.createNewFile();
        FileOutputStream fo = new FileOutputStream(file);
        fo.write(bitmaptoByteArray(image));
        fo.flush();
        fo.close();
        return true;
    } catch (Exception e) {
        e.printStackTrace();
        return false;
    }
}
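The underlying point of the answer (a viewable image file must go through an encoder such as Bitmap.compress(), not a raw pixel dump) can be reproduced off-device with javax.imageio and BufferedImage, used here only as desktop stand-ins for the Android classes:

```java
import javax.imageio.ImageIO;
import java.awt.image.BufferedImage;
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;

public class PngRoundTrip {

    // Encode through a real image codec; the bytes form a valid PNG file.
    static byte[] toPng(BufferedImage img) throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        ImageIO.write(img, "png", bos);
        return bos.toByteArray();
    }

    // Decoding proves the bytes are displayable, unlike a raw pixel dump.
    static BufferedImage fromPng(byte[] data) throws IOException {
        return ImageIO.read(new ByteArrayInputStream(data));
    }

    public static void main(String[] args) throws IOException {
        BufferedImage img = new BufferedImage(4, 3, BufferedImage.TYPE_INT_RGB);
        img.setRGB(1, 1, 0xFF00FF00); // one green pixel
        BufferedImage back = fromPng(toPng(img));
        System.out.println(back.getWidth() + "x" + back.getHeight()); // 4x3
    }
}
```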

Using readObject to get byte[][] giving OptionalDataException

I have a slight problem reading some files I have made. I am making a game and have decided to create my own file type for the maps, with a small companion application for producing these map files. Once I instantiate a Map I can call readFile(String path) to load a saved map. I know that I have to read and write the stream in the same order, and everything went well until I added the statements that read and write the byte[][]. I cannot figure out why I am getting this exception, or how to still read a byte[][]. Here is my class.
public class Map implements Serializable {
    String savePath;
    int boxWidth;
    int boxHeight;
    int mapWidth;
    int mapHeight;
    BufferedImage map;
    byte[][] encoded;
    LinkedList<BufferedImage> tileSet = new LinkedList<BufferedImage>();

    Map(int boxWidth, int boxHeight, int mapWidth, int mapHeight) {
        map = new BufferedImage(boxWidth * mapWidth, boxHeight * mapHeight, BufferedImage.TYPE_INT_RGB);
        Graphics g = map.createGraphics();
        g.setColor(Color.WHITE);
        g.fillRect(0, 0, map.getWidth(), map.getHeight());
        g.dispose();
        this.boxHeight = boxHeight;
        this.boxWidth = boxWidth;
        this.mapHeight = mapHeight;
        this.mapWidth = mapWidth;
        initEncode();
    }

    Map() {
        map = new BufferedImage(1, 1, BufferedImage.TYPE_INT_ARGB);
        this.boxHeight = 0;
        this.boxWidth = 0;
        this.mapHeight = 0;
        this.mapWidth = 0;
        initEncode();
    }

    void initEncode() {
        int width = 2 * mapWidth + 1;
        int height = 2 * mapHeight + 1;
        encoded = new byte[width][height];
        for (int i = 0; i < width; i++) {
            for (int j = 0; j < height; j++) {
                encoded[i][j] = 0;
            }
        }
    }

    void setMapTile(int i, int j, byte index) {
        encoded[2 * i + 1][2 * j + 1] = index;
    }

    void setMapWall(int i, int j, byte index) {
        encoded[2 * i][2 * j] = index;
    }

    void addToTileset(Tile tile) {
        tileSet.add(tile.tile);
        writeFile(savePath);
    }

    // writing to file with path - boolean is for whether it went successfully or not
    boolean writeFile(String path) {
        savePath = path;
        try {
            OutputStream file = new FileOutputStream(path);
            OutputStream buffer = new BufferedOutputStream(file);
            ObjectOutputStream output = new ObjectOutputStream(buffer);
            writeObject(output);
            output.close();
            buffer.close();
            file.close();
        } catch (IOException ex) {
            System.err.println("Could not write to file: " + path + "\nError caused by: " + ex);
            return false;
        }
        return true;
    }

    // reading from file with path - boolean is for whether it went successfully or not
    boolean readFile(String path) {
        savePath = path;
        try {
            InputStream file = new FileInputStream(path);
            InputStream buffer = new BufferedInputStream(file);
            ObjectInputStream in = new ObjectInputStream(buffer);
            readObject(in);
            initEncode();
            in.close();
            buffer.close();
            file.close();
        } catch (IOException ex) {
            System.err.println("Could not read from file: " + path + "\nError caused by: " + ex + "\n");
            ex.printStackTrace();
            return false;
        } catch (ClassNotFoundException e) {
            System.err.println("Could not read from file: " + path + "\nError caused by: " + e + "\n");
            e.printStackTrace();
        }
        return true;
    }

    private void writeObject(ObjectOutputStream out) throws IOException {
        out.writeInt(boxHeight);
        out.writeInt(boxWidth);
        out.writeInt(mapHeight);
        out.writeInt(mapWidth);
        ImageIO.write(map, "png", out);
        out.writeObject(encoded);
        out.writeInt(tileSet.size());
        for (BufferedImage b : tileSet) {
            ImageIO.write(b, "png", out);
        }
    }

    public void readObject(ObjectInputStream in) throws IOException, ClassNotFoundException {
        boxHeight = in.readInt();
        boxWidth = in.readInt();
        mapHeight = in.readInt();
        mapWidth = in.readInt();
        map = ImageIO.read(in);
        encoded = (byte[][]) in.readObject();
        int tileSetSize = in.readInt();
        for (int i = 0; i < tileSetSize; i++) {
            tileSet.add(ImageIO.read(in));
        }
    }
}
Is there some reason that my (byte[][]) readObject() line is throwing OptionalDataException, and how do I still read/write my byte[][]?
EDIT: Thank you for your answer, Abhinav Kumar. I overlooked that, but when I fixed the code it still gave me the same error on the same line. (The class above has been fixed now.)
You have to read the InputStream in the same order and the same format in which you wrote to the stream; otherwise you get an OptionalDataException.
You have written the data to the OutputStream in this order:
ImageIO.write(map, "png", out);
out.writeInt(2 * mapWidth + 1);
out.writeObject(encoded);
And you are reading the stream in this order:
map = ImageIO.read(in);
encoded = (byte[][]) in.readObject();
Just read the int after you read map. The correct code would be:
public void readObject(ObjectInputStream in) throws IOException, ClassNotFoundException {
    boxHeight = in.readInt();
    boxWidth = in.readInt();
    mapHeight = in.readInt();
    mapWidth = in.readInt();
    map = ImageIO.read(in);
    in.readInt(); // you read this int and assign it to the object as you wish
    encoded = (byte[][]) in.readObject();
    int tileSetSize = in.readInt();
    for (int i = 0; i < tileSetSize; i++) {
        tileSet.add(ImageIO.read(in));
    }
}
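The failure mode is easy to reproduce in isolation: write an int followed by an object, then call readObject() first. ObjectInputStream finds primitive block data where it expects an object and throws OptionalDataException. A self-contained sketch (class and method names are mine):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.OptionalDataException;

public class ReadOrderDemo {

    static byte[] serialize() throws Exception {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        ObjectOutputStream out = new ObjectOutputStream(bos);
        out.writeInt(42);                               // primitive first...
        out.writeObject(new byte[][] {{1, 2}, {3, 4}}); // ...then the array
        out.close();
        return bos.toByteArray();
    }

    // Reading the object before the int fails: primitive data is next.
    static boolean wrongOrderFails() throws Exception {
        ObjectInputStream in = new ObjectInputStream(new ByteArrayInputStream(serialize()));
        try {
            in.readObject();
            return false;
        } catch (OptionalDataException e) {
            return true; // the exact exception from the question
        }
    }

    // Reading in the order written succeeds.
    static byte readsBackCorner() throws Exception {
        ObjectInputStream in = new ObjectInputStream(new ByteArrayInputStream(serialize()));
        int n = in.readInt();                     // 42, consumed in order
        byte[][] encoded = (byte[][]) in.readObject();
        return encoded[1][1];                     // 4
    }

    public static void main(String[] args) throws Exception {
        System.out.println(wrongOrderFails()); // true
        System.out.println(readsBackCorner()); // 4
    }
}
```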

Java BufferedImage from online webcam

I have to write a servlet which captures a few images from an online webcam. The parameters (URL, interval, start number, and count) are sent by POST. In my servlet I have something like this:
protected void doPost(HttpServletRequest request, HttpServletResponse response) throws ServletException, IOException {
    String url = request.getParameter("url").toString();
    int interwal = Integer.parseInt(request.getParameter("interwal").toString());
    int nrSt = Integer.parseInt(request.getParameter("nr").toString());
    int il = Integer.parseInt(request.getParameter("il").toString());
    PrintWriter out = response.getWriter();
    BufferedImage img;
    URL imgURL;
    File imgFile;
    for (int i = 0; i < il; i++) {
        try {
            imgURL = new URL(url);
            img = ImageIO.read(imgURL);
            imgFile = new File("E:\\image" + (nrSt + i) + ".jpg");
            ImageIO.write(img, "jpg", imgFile);
            out.print("Saved image" + (nrSt + i) + ".jpg<br>");
        } catch (IOException e) {
            out.print("Error reading Image!");
            e.printStackTrace();
        }
        try {
            Thread.sleep(interwal * 1000);
        } catch (InterruptedException e) {
            e.printStackTrace();
        }
    }
}
What must I change to capture an image from, for example, this webcam: cam
Using a webcam page can be much more difficult. The one you chose as an example uses a dynamic image URL which is calculated on the page by JavaScript, but it is simple enough to extract what you need. The next problem is that the image URL points to a service which answers with multi-part responses, which ImageIO.read() does not seem to understand.
The following standalone code seems to successfully acquire the webcam image:
public static void main(String[] args) {
    try {
        // get webcam page
        URL url = new URL("...your example URL.../webcam/campob.html");
        InputStreamReader isr = new InputStreamReader(url.openStream(), "UTF-8");
        Scanner scanner = new Scanner(isr);
        final Pattern nullDelimiter = Pattern.compile("<<<>>>");
        String html = scanner.useDelimiter(nullDelimiter).next();
        scanner.close();
        // extract image URL from HTML
        final Pattern extractPattern = Pattern.compile(
                "^var BaseURL = \"([^\"]+)\".*"
                + "^var ImageResolution = \"([^\"]+)\".*"
                + "^var File = \"([^\"]+)\"",
                Pattern.MULTILINE | Pattern.DOTALL);
        Matcher m = extractPattern.matcher(html);
        if (!m.find()) throw new RuntimeException();
        URL imgURL = new URL(m.group(1) + m.group(3) + m.group(2));
        System.out.println("imgURL=" + imgURL);
        // read headers into buffer
        InputStream is = imgURL.openStream();
        byte[] buffer = new byte[131072];
        int bytes = 0;
        Pattern headersPattern = Pattern.compile(
                "^Content-Length:\\s*(\\d+)\\s*$.*?^$\\r?\\n",
                Pattern.MULTILINE | Pattern.DOTALL);
        while (bytes < buffer.length) {
            int count = is.read(buffer, bytes, buffer.length - bytes);
            if (count < 0) break;
            bytes += count;
            m = headersPattern.matcher(new String(buffer, "ASCII"));
            if (m.find()) break;
        }
        // read rest of image bytes into buffer
        int offset = m.end();
        int contentLength = Integer.parseInt(m.group(1));
        int limit = Math.min(buffer.length, offset + contentLength);
        while (bytes < limit) {
            int count = is.read(buffer, bytes, limit - bytes);
            if (count < 0) break;
            bytes += count;
        }
        is.close();
        System.out.println("bytes=" + bytes + " offset=" + offset);
        // read image from buffer (start after header)
        is = new ByteArrayInputStream(buffer);
        is.skip(offset);
        Image img = ImageIO.read(is);
        System.out.println(img);
    } catch (Exception ex) {
        ex.printStackTrace(System.err);
    }
}
Note that the code lacks real error handling, the buffer has a fixed size, etc.
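The header-parsing step can be exercised on its own with a fabricated multi-part response (the sample headers below are made up; the pattern is the one from the code above):

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class HeaderScan {

    // Same pattern as above: capture Content-Length, then match through the
    // blank line ending the headers, so end() is the body's start offset.
    static final Pattern HEADERS = Pattern.compile(
            "^Content-Length:\\s*(\\d+)\\s*$.*?^$\\r?\\n",
            Pattern.MULTILINE | Pattern.DOTALL);

    // Returns {contentLength, bodyOffset}, or null if headers are incomplete.
    static int[] parse(String raw) {
        Matcher m = HEADERS.matcher(raw);
        if (!m.find()) return null;
        return new int[] { Integer.parseInt(m.group(1)), m.end() };
    }

    public static void main(String[] args) {
        String part = "HTTP/1.0 200 OK\r\nContent-Length: 12345\r\n\r\nBODY";
        int[] r = parse(part);
        System.out.println("length=" + r[0] + " bodyOffset=" + r[1]);
        System.out.println(part.substring(r[1])); // BODY
    }
}
```

Returning null for incomplete headers mirrors the read loop above, which keeps appending bytes until the pattern finally matches.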
