Today I'm presented with the challenge of writing an XML document that will carry image data. I'm not sure how to go about this, but I need to take an image in Java, use SimpleXML's write method, and send it from one Socket to a Socket in Android.
Is this possible?
I've tried looking around, but I haven't found anything that works.
If this is possible, is it also possible to use other multimedia?
After extensive research I solved this by declaring a byte array field in my SimpleXML class, putting the image bytes into that array in Java, and then recreating the image as a Bitmap in Android using BitmapFactory. Hope this helps anyone in their future endeavors.
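The core of the answer above is carrying image bytes inside XML. A minimal sketch of that idea using only the JDK (SimpleXML handles the byte-array field for you in the real solution; here the Base64 encoding and the `<image>` envelope are stand-ins to show the roundtrip):

```java
import java.util.Base64;

public class ImageXmlRoundtrip {
    // Wrap raw image bytes in a tiny XML envelope, encoding them as Base64
    // so the binary data survives as XML text content.
    static String toXml(byte[] imageBytes) {
        String encoded = Base64.getEncoder().encodeToString(imageBytes);
        return "<image>" + encoded + "</image>";
    }

    // Extract and decode the bytes on the receiving side. On Android you
    // would then call BitmapFactory.decodeByteArray(bytes, 0, bytes.length)
    // to recreate the Bitmap.
    static byte[] fromXml(String xml) {
        String encoded = xml.substring("<image>".length(),
                xml.length() - "</image>".length());
        return Base64.getDecoder().decode(encoded);
    }

    public static void main(String[] args) {
        byte[] fake = {(byte) 0x89, 'P', 'N', 'G'}; // stand-in for real image bytes
        byte[] back = fromXml(toXml(fake));
        System.out.println(java.util.Arrays.equals(fake, back)); // prints true
    }
}
```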
I need to create an app (using Android Studio) that generates CNC code to operate a 3D printer. It takes a String as input.
I've found a couple of libraries in Python and JavaScript that do this, but as I don't have time to translate whole libraries to Java, can you recommend any libraries that do this for me? If there are no open-source options, can you recommend a guide to help me develop this converter?
What we ended up doing:
App asks for a String as input;
String is converted to a bitmap and then saved as a .png;
.png is loaded and converted to a .svg file. We used this repo: https://github.com/jankovicsandras/imagetracerandroid
We developed a parser to convert a .svg to g-code.
It worked, but it's not the best solution; we're looking to implement something that runs Python on Android, as there are many Python libraries that do all the work already. But that's how we've done it and it's working for now.
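The last step above (the .svg-to-g-code parser) can be sketched minimally. This is not the authors' actual parser: it is a hypothetical illustration that handles only absolute `M` (move) and `L` (line) SVG path commands, mapping them to `G0` (travel) and `G1` (draw) moves:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Locale;

public class SvgPathToGcode {
    // Convert a simple SVG path string with absolute "M x y" and "L x y"
    // commands into G-code lines: M becomes G0 (travel), L becomes G1 (draw).
    static List<String> convert(String path) {
        List<String> gcode = new ArrayList<>();
        String[] tokens = path.trim().split("[\\s,]+");
        int i = 0;
        while (i < tokens.length) {
            String cmd = tokens[i++];
            double x = Double.parseDouble(tokens[i++]);
            double y = Double.parseDouble(tokens[i++]);
            String move = cmd.equals("M") ? "G0" : "G1";
            gcode.add(String.format(Locale.ROOT, "%s X%.2f Y%.2f", move, x, y));
        }
        return gcode;
    }

    public static void main(String[] args) {
        for (String line : convert("M 0 0 L 10 0 L 10 10")) {
            System.out.println(line);
        }
        // G0 X0.00 Y0.00
        // G1 X10.00 Y0.00
        // G1 X10.00 Y10.00
    }
}
```

A real parser also has to handle curves (`C`, `Q`), relative commands, and scaling to machine units, which is where most of the work went.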
In my project I need to save a JPG image to the Android gallery. Unfortunately, Unity does not provide such a function out of the box, so I must implement it myself. As far as I know, I need to write a small plugin to do that, and I've already figured out how to make a simple plugin, but I have questions about my specific goal.
So, the main part is: I have a texture in Unity, and I encode it to JPG format, resulting in a byte array. Then I convert this byte array into a string so I can pass it over the Android bridge into a method in a Java class. Everything that happens after that is a black box to me.
I know the basic theory: I need to convert the string back into an image somehow, specify the folder for my app's pictures, save the image into this folder, and somehow refresh the device's media index so the image is visible in the gallery. But the implementation is a big mystery to me. I know it can be done after a few days of digging, but I suspect it can be done in one or two simple methods.
Can anybody give me some hints? Are there any pitfalls in passing data from C# code in Unity to Java code in, for example, Eclipse?
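Assuming the string crossing the bridge is Base64-encoded JPG bytes (the usual way to pass binary data as a string), the Java side can be sketched with plain JDK I/O. The class and method names here are hypothetical, and the final gallery-refresh step is Android-specific (e.g. `MediaScannerConnection.scanFile`), so it appears only as a comment:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Base64;

public class GallerySaver {
    // Decode the Base64 string passed over the Unity -> Java bridge back into
    // JPG bytes and write them to the target path. On Android you would then
    // trigger a media scan (MediaScannerConnection.scanFile) so the gallery
    // picks the file up without a reboot.
    static Path saveJpg(String base64Jpg, Path target) throws IOException {
        byte[] jpgBytes = Base64.getDecoder().decode(base64Jpg);
        Files.createDirectories(target.getParent());
        return Files.write(target, jpgBytes);
    }

    public static void main(String[] args) throws IOException {
        // Stand-in for the string Unity would send (here: the 2-byte JPEG SOI marker).
        String fromUnity = Base64.getEncoder()
                .encodeToString(new byte[]{(byte) 0xFF, (byte) 0xD8});
        Path saved = saveJpg(fromUnity,
                Path.of(System.getProperty("java.io.tmpdir"), "demo", "shot.jpg"));
        System.out.println(Files.size(saved)); // prints 2
    }
}
```

Note that passing large images as strings works but doubles the payload size; some plugins pass the raw `byte[]` across the bridge directly instead.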
So I have a webpage. I want to capture webcam data and process the images I get from it. I figured I could get the image from the webcam with WebRTC in JavaScript. The thing is, the library I found most suitable for the image processing I need is in Java. It reads an image and does the processing.
How do I combine these? Get the image from the webcam using WebRTC in JavaScript and process it using Java? Or is there another way to do this?
If my question is wrong, please tell me why. If you think I'm asking this on the wrong site, please tell me where I should ask. Thank you.
Note
As I'm new to web programming, after much searching I found that the usual practice is to execute JSP on the server. But what if I have to process the image continuously? Won't that be slow, since I would have to send the data continuously? In this case I only need to run it locally, not on a real server. But is there any way to execute Java in the browser?
You can capture individual images over a period of time using a WebRTC recording JavaScript library. You can then send the images, or the whole blob, over a WebSocket to your Java WebSocket server and process them there.
Check this out; it should give you pretty much all the tools you need to do exactly what you want.
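On the Java side, once a binary frame arrives over the WebSocket, turning it into something an image-processing library can use is a one-liner with the JDK's built-in `ImageIO`. The WebSocket wiring itself (e.g. a Jakarta WebSocket `@OnMessage(byte[])` handler) is omitted here; this sketch only shows the decode step, simulating the browser's payload:

```java
import java.awt.image.BufferedImage;
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import javax.imageio.ImageIO;

public class FrameProcessor {
    // Decode one binary frame received over the WebSocket (PNG or JPEG bytes)
    // into a BufferedImage that a Java image-processing library can work on.
    static BufferedImage decodeFrame(byte[] frameBytes) throws IOException {
        return ImageIO.read(new ByteArrayInputStream(frameBytes));
    }

    public static void main(String[] args) throws IOException {
        // Simulate a frame: encode a tiny image to PNG bytes,
        // as the browser-side capture code would send.
        BufferedImage src = new BufferedImage(4, 3, BufferedImage.TYPE_INT_RGB);
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        ImageIO.write(src, "png", out);

        BufferedImage decoded = decodeFrame(out.toByteArray());
        System.out.println(decoded.getWidth() + "x" + decoded.getHeight()); // prints 4x3
    }
}
```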
I found a problem when loading a bitmap file in my company's software: the software crashes when I drop the bitmap file in. I'm trying to create a Java app that reads the bitmap file header and displays the header information, so I can find out what might be causing the problem.
Can anyone suggest how I can read the bitmap file header information, or which class I should use to achieve this?
Yes, it's possible; I wrote code to do something similar with JPEG headers a few months ago.
Basically, you need to learn a bit about the bitmap file format.
Then you need to open the file (for reading bytes).
Finally, you read enough bytes to get to the right field in the header, and decompose them into the Java data type you want.
There may be a class that already does this, in which case I would suggest searching for it on Google.
The Apache Sanselan project provides a BmpImageParser class for parsing BMP files. You can take a look at the source here.
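If you want to do it by hand as the first answer describes, the key detail is that BMP header fields are little-endian, so `ByteBuffer` with `ByteOrder.LITTLE_ENDIAN` is the natural JDK tool. A minimal sketch reading the width and height fields from the `BITMAPINFOHEADER` (offsets 18 and 22 in the file):

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class BmpHeaderReader {
    // Read width and height out of a BMP file's header. The first two bytes
    // must be the "BM" magic; the BITMAPINFOHEADER's biWidth and biHeight are
    // 4-byte little-endian integers at file offsets 18 and 22.
    static int[] readWidthHeight(byte[] file) {
        if (file[0] != 'B' || file[1] != 'M') {
            throw new IllegalArgumentException("Not a BMP file");
        }
        ByteBuffer buf = ByteBuffer.wrap(file).order(ByteOrder.LITTLE_ENDIAN);
        int width  = buf.getInt(18); // BITMAPINFOHEADER biWidth
        int height = buf.getInt(22); // BITMAPINFOHEADER biHeight
        return new int[]{width, height};
    }

    public static void main(String[] args) {
        // Build a minimal 54-byte header for a 640x480 image to demonstrate.
        ByteBuffer buf = ByteBuffer.allocate(54).order(ByteOrder.LITTLE_ENDIAN);
        buf.put((byte) 'B').put((byte) 'M');
        buf.putInt(18, 640);
        buf.putInt(22, 480);
        int[] wh = readWidthHeight(buf.array());
        System.out.println(wh[0] + "x" + wh[1]); // prints 640x480
    }
}
```

Dumping the other fields (bit depth at offset 28, compression at 30, and so on) follows the same pattern, and comparing them against the spec is a good way to spot what is making the company software crash.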
I am creating a simple video editing application using Java, JNI, C, and FFmpeg. The UI is written in Java, and I talk to FFmpeg from a C file via JNI. Does anyone have suggestions for the best way to save part of a video file using FFmpeg? I let users choose which parts of the video to save. My current plan is to loop through all of the packets, decode each frame (if I need to encode to a different format), and save the frames to a file, seeking to different parts of the video based on the user's start and stop points. If this doesn't make sense I'd be glad to clear it up. Any ideas are much appreciated; I'm just looking for the most efficient and correct way to go about this. Thanks!
Use Xuggler? It will do it all for you without your having to figure out the JNI bindings yourself.
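Another option, if shelling out is acceptable for your use case: the ffmpeg CLI can cut a segment without decoding at all using `-ss` (start), `-t` (duration), and `-c copy` (stream copy, no re-encode), which sidesteps the whole decode-each-frame loop. A sketch that just assembles the command (file names and times are placeholders); running it requires ffmpeg on the PATH:

```java
import java.util.List;

public class FfmpegTrim {
    // Build an ffmpeg command that copies the segment starting at `start`
    // and lasting `duration` out of the input without re-encoding (-c copy),
    // which is fast and lossless but can only cut at keyframe boundaries.
    static List<String> trimCommand(String input, String output,
                                    String start, String duration) {
        return List.of("ffmpeg", "-ss", start, "-i", input,
                       "-t", duration, "-c", "copy", output);
    }

    public static void main(String[] args) {
        List<String> cmd = trimCommand("in.mp4", "clip.mp4", "00:01:00", "00:00:30");
        System.out.println(String.join(" ", cmd));
        // ffmpeg -ss 00:01:00 -i in.mp4 -t 00:00:30 -c copy clip.mp4
        // To actually run it:
        //   new ProcessBuilder(cmd).inheritIO().start().waitFor();
    }
}
```

If you need frame-accurate cuts or a format conversion, drop `-c copy` and let ffmpeg re-encode, which is essentially the decode/encode loop described in the question, just handled inside ffmpeg.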