I am currently using '/dev/graphics/fb0' to read the frame buffer, and I am saving it to the SD card with 'cat /dev/graphics/fb0 > /sdcard/screen.raw'.
Now I want to convert that raw data into a PNG image and display it on the screen. Is it possible to do this conversion in Android code (NDK/SDK)?
Code to do this can be found in the Android source tree. The old screenshot tool reads from the framebuffer device and saves it to PNG.
Note this does not work on newer devices, which may use hardware composer overlays to avoid writing to a single framebuffer. On such devices you need to use the screencap tool instead.
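As a rough sketch of the conversion step, assuming the dump is 32-bit RGBA8888 (the actual pixel format and line stride depend on the device; query them before converting), plain Java can turn the raw bytes into a PNG like this. On Android you would use Bitmap plus Bitmap.compress(PNG, ...) instead of BufferedImage/ImageIO, but the byte layout logic is the same:

```java
import java.awt.image.BufferedImage;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import javax.imageio.ImageIO;

public class RawToPng {
    // Convert raw RGBA8888 pixels into PNG bytes.
    static byte[] rawRgbaToPng(byte[] raw, int width, int height) throws IOException {
        BufferedImage img = new BufferedImage(width, height, BufferedImage.TYPE_INT_ARGB);
        for (int y = 0; y < height; y++) {
            for (int x = 0; x < width; x++) {
                int i = (y * width + x) * 4;
                int r = raw[i] & 0xFF, g = raw[i + 1] & 0xFF,
                    b = raw[i + 2] & 0xFF, a = raw[i + 3] & 0xFF;
                img.setRGB(x, y, (a << 24) | (r << 16) | (g << 8) | b);
            }
        }
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        ImageIO.write(img, "png", out);
        return out.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        // Tiny synthetic 2x2 "framebuffer" dump: red, green, blue, white, fully opaque.
        byte[] raw = {
            (byte) 255, 0, 0, (byte) 255,
            0, (byte) 255, 0, (byte) 255,
            0, 0, (byte) 255, (byte) 255,
            (byte) 255, (byte) 255, (byte) 255, (byte) 255
        };
        byte[] png = rawRgbaToPng(raw, 2, 2);
        // A valid PNG file starts with the signature 0x89 'P' 'N' 'G'.
        System.out.println((png[0] & 0xFF) == 0x89 && png[1] == 'P'
                && png[2] == 'N' && png[3] == 'G');
    }
}
```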
In JavaFX we can load images easily from an external server:
imageView.setImage(new Image("http://...File.png"));
But is there a way to load the thumbnails of videos?
Something like: imageView.setImage(new Image("http://...File.mp4"));
I'm developing something like a "gallery" and would like to load the thumbnails of videos coming straight from my server. Is this possible?
An mp3 or mp4 file can have an embedded thumbnail image as part of its metadata. There are third-party libraries for reading mp3/mp4 metadata. For example:
How to retrieve thumbnail picture of an mp3 file using java
App Engine Java - Extract thumbnail from mp4 video
If the file doesn't have a thumbnail, then you could conceivably pick a frame of the video and use it as a thumbnail, but the chances of picking an appropriate frame (i.e. one that is representative of the movie) without the assistance of a human being are not great. Here is an example:
Extract Thumbnail for specific second from MP4 file in Android
But how do you do this for a video that is not local, without downloading it entirely?
Approach #1: pick a third-party metadata extraction library that can operate in stream mode. The metadata should be at or near the start of the stream.
Approach #2: get the server to do the extraction, and present you with the thumbnail separately from the main video.
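A minimal client-side sketch of approach #2, under the assumption (hypothetical here) that the server publishes a thumbnail alongside each video under the same name with a .png extension:

```java
public class ThumbUrl {
    // Map a video URL to its (assumed) server-generated thumbnail URL,
    // e.g. "http://host/File.mp4" -> "http://host/File.png".
    static String thumbnailUrlFor(String videoUrl) {
        int dot = videoUrl.lastIndexOf('.');
        if (dot < 0) {
            return videoUrl + ".png"; // no extension: just append one
        }
        return videoUrl.substring(0, dot) + ".png";
    }

    public static void main(String[] args) {
        System.out.println(thumbnailUrlFor("http://example.com/videos/File.mp4"));
        // → http://example.com/videos/File.png
    }
}
```

The gallery then loads the image the usual way: `imageView.setImage(new Image(ThumbUrl.thumbnailUrlFor(videoUrl)));`. How the server generates the thumbnail (e.g. with FFmpeg at upload time) is a separate concern.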
I want to add a PNG image while recording video using the Camera API; the image must act as a watermark on the video when the video is saved.
One approach I can suggest is to use FFmpeg for Android (check this). You will need to compile the library yourself if you want the latest version; a readily available FFmpeg build for Android is this.
After setting up FFmpeg, refer to this question for adding a watermark.
Hope this helps.
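As a sketch of what the watermarking step looks like, this builds the FFmpeg argument list that overlays a PNG at the top-left corner (the 10-pixel inset, the paths, and how you hand the arguments to your chosen FFmpeg-for-Android wrapper are all up to you):

```java
import java.util.Arrays;
import java.util.List;

public class WatermarkCommand {
    // Build the FFmpeg invocation that overlays a PNG watermark on a
    // recorded video. The overlay filter positions the second input
    // (the image) on top of the first (the video).
    static List<String> watermarkCommand(String video, String png, String out) {
        return Arrays.asList(
            "ffmpeg",
            "-i", video,                        // recorded video
            "-i", png,                          // watermark image
            "-filter_complex", "overlay=10:10", // watermark position (x:y)
            "-codec:a", "copy",                 // keep the audio stream as-is
            out
        );
    }

    public static void main(String[] args) {
        System.out.println(String.join(" ",
            watermarkCommand("/sdcard/in.mp4", "/sdcard/mark.png", "/sdcard/out.mp4")));
    }
}
```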
I am currently using MediaMetadataRetriever.getFrameAtTime on Android to get thumbnails of videos, and it is working great except for live streams (m3u8). Using the m3u8 directly doesn't seem to work; I have managed to get a bitmap back for a .ts segment file using that same method, but it is always just a black bitmap. So how can I get a thumbnail of a live stream?
Thanks.
I am working on a big project with Codename One (I can't attach my code because it's really big). I built the Android app and it works on Android devices, but recently I made an iOS build of this project and it doesn't work on an iOS device (it just shows a white page instead of the map).
My project is a map framework that renders tiles (among other things) onto graphics (I used the Graphics class for drawing, transforming, writing text, and more).
I used InputStream for working with files because File is not supported.
I need a way to debug the iOS build and find the problem (why the tiles aren't shown).
In fact, I don't know anything about iOS or Objective-C.
Thanks in advance.
Most of the logging functionality that allows inspecting issues is for pro developers (you can try the trial); it's discussed in this video (mostly focused on crashes): http://www.codenameone.com/how-do-i---use-crash-protection-get-device-logs.html
From your description I would guess you created a really large mutable image (larger than screen bounds) and are drawing onto that. This would be both slow on iOS (and on newer Android devices) and might actually produce that result if the image exceeds the maximum texture size of the device.
If that is not the case you would need to explain what you are doing more precisely.
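If the large-mutable-image guess is right, the usual fix is to draw only the tiles that intersect the viewport directly in the paint method, rather than compositing everything into one screen-exceeding image first. A minimal, framework-agnostic sketch of the tile-range math (the 256-pixel tile size is an assumption; use whatever your tile source produces):

```java
public class TileRange {
    // First/last tile column and row (inclusive) covering the viewport,
    // so only visible tiles are fetched and drawn each frame.
    static int[] visibleTiles(int scrollX, int scrollY,
                              int viewW, int viewH, int tileSize) {
        int firstCol = Math.floorDiv(scrollX, tileSize);
        int firstRow = Math.floorDiv(scrollY, tileSize);
        int lastCol = Math.floorDiv(scrollX + viewW - 1, tileSize);
        int lastRow = Math.floorDiv(scrollY + viewH - 1, tileSize);
        return new int[]{firstCol, firstRow, lastCol, lastRow};
    }

    public static void main(String[] args) {
        // A 1080x1920 viewport scrolled to (300, 520) with 256 px tiles.
        int[] r = visibleTiles(300, 520, 1080, 1920, 256);
        System.out.println(r[0] + "," + r[1] + "," + r[2] + "," + r[3]); // → 1,2,5,9
    }
}
```

Each visible tile is then drawn at `(col * tileSize - scrollX, row * tileSize - scrollY)`, which keeps every individual image well under any device's texture-size limit.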
I want to capture the frame currently displayed on the screen of a Windows PC, regardless of what is being shown, as a bitmap or image file or similar.
Until now I have used the Java Robot API and some C++ APIs; however, they can capture only the desktop, windows, etc., but not anything drawn via an overlay, such as videos or DirectX games.
Capturing while a video is playing returns an image with the video region black, and in a game it returns an image of the desktop as if the game weren't running!
Is there any standard method for capturing everything on the screen (like Fraps etc.), or do I need to write a custom driver for each graphics card (impossible)?
The following is the code for capturing screenshots from Java:
http://www.codinguide.com/2010/04/capture-screen-shot-from-java.html
Hope that helps.
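For reference, the linked approach boils down to java.awt.Robot, which inherits exactly the limitation you describe: it captures the composited GDI desktop, so overlay/DirectX content can come out black. A minimal sketch (the output filename is just an example):

```java
import java.awt.AWTException;
import java.awt.GraphicsEnvironment;
import java.awt.Rectangle;
import java.awt.Robot;
import java.awt.Toolkit;
import java.awt.image.BufferedImage;
import java.io.File;
import java.io.IOException;
import javax.imageio.ImageIO;

public class Screenshot {
    // Capture the whole primary screen to a PNG file.
    // Returns "headless" when no display is attached (e.g. a build server).
    static String capture(File out) throws AWTException, IOException {
        if (GraphicsEnvironment.isHeadless()) {
            return "headless";
        }
        Rectangle screen = new Rectangle(Toolkit.getDefaultToolkit().getScreenSize());
        BufferedImage shot = new Robot().createScreenCapture(screen);
        ImageIO.write(shot, "png", out);
        return "saved " + shot.getWidth() + "x" + shot.getHeight();
    }

    public static void main(String[] args) throws AWTException, IOException {
        System.out.println(capture(new File("screenshot.png")));
    }
}
```

Capturing overlays and DirectX games reliably needs a different mechanism (e.g. hooking the game's present call, the way Fraps-style tools do), not a per-card driver.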