Hi guys, I am experimenting with JavaCV because I want to learn how it works so that I can include its functionality in a project I have in mind. I have downloaded and set up OpenCV just as the instructions said, and I have also downloaded from bytedeco the JavaCV 1.0 jars that I need to include in my project.
I have started with an example program I found online that basically grabs and saves images from a webcam. The code I have written is the following:
import org.bytedeco.javacpp.opencv_core.IplImage;
import org.bytedeco.javacv.CanvasFrame;
import org.bytedeco.javacv.Frame;
import org.bytedeco.javacv.FrameGrabber;
import org.bytedeco.javacv.VideoInputFrameGrabber;
import org.bytedeco.javacv.*;
import org.bytedeco.javacpp.*;
import static org.bytedeco.javacpp.opencv_core.*;
import static org.bytedeco.javacpp.opencv_imgproc.*;
import static org.bytedeco.javacpp.opencv_imgcodecs.*;
public class GrabberShow implements Runnable{
IplImage image;
CanvasFrame canvas = new CanvasFrame("Web Cam");
public GrabberShow(){
canvas.setDefaultCloseOperation(javax.swing.JFrame.EXIT_ON_CLOSE);
}
public void run(){
FrameGrabber grabber = new VideoInputFrameGrabber(0);
int i = 0;
try{
grabber.start();
IplImage img;
while(true){
img = grabber.grab();
if(img != null){
cvFlip(img, img, 1);
cvSaveImage((i++) + "-aa.img", img);
canvas.showImage(img);
}
}
}catch(Exception e){
e.printStackTrace();
}
}
public static void main(String[] args){
GrabberShow gs = new GrabberShow();
Thread th = new Thread(gs);
th.start();
}
}
This is a very straightforward and easy example. The problem I am experiencing can be found on the following lines:
img = grabber.grab();
and
canvas.showImage(img);
The problem I face is a type mismatch: "Cannot convert from Frame to opencv_core.IplImage".
I have tried searching for this online but was unable to locate a good answer; all I found was the same example. Does anyone have any ideas about this?
I need to stress that this is the first time I am using OpenCV with Java. I have used it in the past to make an object tracking program, but that was done with native OpenCV and Python.
The code you have looks like it would probably work for JavaCV version 0.10. In 1.0, the FrameGrabber's grab() returns Frame objects, which then need to be converted to IplImage objects using an OpenCVFrameConverter.ToIplImage.
import org.bytedeco.javacpp.opencv_core.IplImage;
import static org.bytedeco.javacpp.opencv_core.cvFlip;
import static org.bytedeco.javacpp.opencv_imgcodecs.cvSaveImage;
import org.bytedeco.javacv.CanvasFrame;
import org.bytedeco.javacv.Frame;
import org.bytedeco.javacv.FrameGrabber;
import org.bytedeco.javacv.OpenCVFrameConverter;
import org.bytedeco.javacv.VideoInputFrameGrabber;
...
public class GrabberShow implements Runnable {
CanvasFrame canvas = new CanvasFrame("Web Cam");
public GrabberShow() {
canvas.setDefaultCloseOperation(javax.swing.JFrame.EXIT_ON_CLOSE);
}
public void run() {
FrameGrabber grabber = new VideoInputFrameGrabber(0);
OpenCVFrameConverter.ToIplImage converter = new OpenCVFrameConverter.ToIplImage();
int i = 0;
try {
grabber.start();
IplImage img;
while (true) {
Frame frame = grabber.grab();
img = converter.convert(frame);
if (img != null) {
cvFlip(img, img, 1);
cvSaveImage((i++) + "-aa.img", img);
canvas.showImage(frame);
}
}
} catch (Exception e) {
e.printStackTrace();
}
}
public static void main(String[] args) {
GrabberShow gs = new GrabberShow();
Thread th = new Thread(gs);
th.start();
}
}
I got my code to draw my image in an applet, but it is an animated gif and it is stopped on the first frame as if it were a single image.
It is supposed to be the spooky scary skeleton dancing, but he's just standing still.
Here is my code:
import java.util.*;
import java.awt.*;
import java.applet.*;
import java.net.*;
import java.awt.image.BufferedImage;
import javax.imageio.ImageIO;
import java.io.*;
public class Spooky extends Applet
{
Image scary, trumpet, walking;
MediaTracker mt;
AudioClip spoopy;
Graphics buffer;
Image offscreen;
Dimension dim;
public void init()
{
setLayout(null);
mt = new MediaTracker(this);
mt.addImage(scary,1);
mt.addImage(trumpet,1);
mt.addImage(walking,1);
spoopy = getAudioClip(getDocumentBase(),"spoopy.wav");
spoopy.loop();
}
public void paint(Graphics g)
{
try
{
URL url = this.getClass().getResource("spooky.gif");
BufferedImage img;
img = ImageIO.read(url);
mt.addImage(img,1);
g.drawImage(img,0,0,300,300,this);
}
catch(IOException e)
{
}
}
}
The problem is the ImageIO.read(url) call. ImageIO.read returns a single static BufferedImage, so the animation is lost and only the first frame of the GIF is drawn. Instead, construct an ImageIcon from the URL and use the ImageIcon's getImage() to get an Image.
As an aside, don't load the image in the paint method. Load it in the init method.
import java.applet.Applet;
import java.awt.Graphics;
import java.awt.Image;
import java.net.URL;
import javax.swing.ImageIcon;

public class Spooky extends Applet {

    Image image;

    public void init() {
        URL url = this.getClass().getResource("spooky.gif");
        image = new ImageIcon(url).getImage();
    }

    public void paint(Graphics g) {
        super.paint(g);
        g.drawImage(image, 0, 0, 300, 300, this);
    }
}
I don't know if it helps, but usually it is a problem of needing a separate Thread/Runnable for the animation and a separate one for the rest of the code. At least that was a problem I had when I was making a small game. Try this one and let me know if it helps :)
Check this too: Displaying Gif animation in java
Update: An example I found (http://examples.oreilly.com/jswing2/code/) uses JApplet, which supports GIFs (Applet is the older class).
// AnimationApplet.java
// The classic animation applet rewritten to use an animated GIF.
//
import javax.swing.*;
public class AnimationApplet extends JApplet {
public void init() {
ImageIcon icon = new ImageIcon("images/rolling.gif"); // animated gif
getContentPane().add(new JLabel(icon));
}
}
Did you try leaving out the sound to see if the animation works on its own? It could be that the sound needs a separate Runnable/Thread.
I had a similar problem. Michail Michailidis's answer is one solution. However, if you are trying to load the GIF from an InputStream (which was the situation in my case) you will need a different solution. In this case, I would use the Toolkit class to load your image, more specifically, Toolkit#createImage(byte[]):
Image image;
try(InputStream stream = this.getClass().getResourceAsStream("/someImage.gif")) {
ByteArrayOutputStream buffer = new ByteArrayOutputStream();
int nRead;
byte[] data = new byte[16384];
while ((nRead = stream.read(data, 0, data.length)) != -1) {
buffer.write(data, 0, nRead);
}
buffer.flush();
image = Toolkit.getDefaultToolkit().createImage(buffer.toByteArray());
} catch (IOException e) {
e.printStackTrace();
}
I would like to read an MP4 file in Java 8 (64-bit) frame by frame and write each frame as a JPG to my hard disk. My first attempt was to use the JavaFX 2.2 media player to play the file
on a MediaView component. I thought maybe there would be an option to register an observer to get an event each time a new frame is loaded and ready to be painted on the component surface, but there seems to be no such method.
It would be enough to grab just those frames/pixels that get painted on the component.
Can this be done using the media player? The reason I use the media player is because it was the simplest solution I got working. I tried vlcj (32-bit only) and GStreamer, but without luck :(
What I have got so far:
public class VideoGrabber extends JFrame {

    // code for scene setup omitted
    final MediaView view = createMediaView(...)
    // some other stuff happens here

    // now start the video
    view.getMediaPlayer().seek(Duration.ZERO);
    view.getMediaPlayer().play();
    view.getMediaPlayer().setOnEndOfMedia(new Runnable() {
        @Override
        public void run() {
            // save image when done
            BufferedImage img = new BufferedImage(getWidth(), getHeight(), BufferedImage.TYPE_INT_BGR);
            view.paint(img.getGraphics());
            try {
                ImageIO.write(img, "JPEG", new File("pic-" + System.currentTimeMillis() + ".jpg"));
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
    });

    // somewhere else:
    private MediaView createMediaView(String url) {
        final Media clip = new Media(url);
        final MediaPlayer player = new MediaPlayer(clip);
        final MediaView view = new MediaView(player);
        view.setFitWidth(VID_WIDTH);
        view.setFitHeight(VID_HEIGHT);
        return view;
    }
}
Is there some way to do something like the following?
player.setOnNextFrameReady(final Event evt) { writeImage(evt.getFrame()) };
Thanks!
You can use the snapshot() method of MediaView.
First connect a MediaPlayer to a MediaView component, then use mediaPlayer.seek() to seek to the desired video position. Then you can use the following code to extract the image frame:
int width = mediaPlayer.getMedia().getWidth();
int height = mediaPlayer.getMedia().getHeight();
WritableImage wim = new WritableImage(width, height);
MediaView mv = new MediaView();
mv.setFitWidth(width);
mv.setFitHeight(height);
mv.setMediaPlayer(mediaPlayer);
mv.snapshot(null, wim);
try {
ImageIO.write(SwingFXUtils.fromFXImage(wim, null), "png", new File("/test.png"));
} catch (Exception s) {
System.out.println(s);
}
Marvin Framework provides methods to process media files frame by frame. Below is a simple video processing example that highlights the path of a snooker ball.
On the left, the video is played frame by frame with an interval of 30 ms between frames. On the right, the video is played frame by frame while keeping all positions of the white ball, found through a simple image processing approach.
The source code can be checked here and the video here.
Below is the essential source code for requesting media file frames using Marvin:
public class MediaFileExample implements Runnable{
private MarvinVideoInterface videoAdapter;
private MarvinImage videoFrame;
public MediaFileExample(){
try{
// Create the VideoAdapter used to load the video file
videoAdapter = new MarvinJavaCVAdapter();
videoAdapter.loadResource("./res/snooker.wmv");
// Start the thread for requesting the video frames
new Thread(this).start();
}
catch(MarvinVideoInterfaceException e){e.printStackTrace();}
}
@Override
public void run() {
try{
while(true){
// Request a video frame
videoFrame = videoAdapter.getFrame();
}
}catch(MarvinVideoInterfaceException e){e.printStackTrace();}
}
public static void main(String[] args) {
MediaFileExample m = new MediaFileExample();
}
}
Frameworks like ffmpeg come with command line tools to split a video into individual frames (i.e. images in a folder).
See this question: ffmpeg split avi into frames with known frame rate
The answers contain links to Java frameworks that wrap ffmpeg.
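For example, a minimal sketch of calling the ffmpeg command line tool from Java (this assumes the ffmpeg binary is on the PATH; the input file and output folder names are just placeholders):

import java.io.File;
import java.io.IOException;

public class FfmpegFrameDump {
    public static void main(String[] args) throws IOException, InterruptedException {
        new File("frames").mkdirs();
        // Dump every frame of input.mp4 as a numbered JPEG into the frames folder
        ProcessBuilder pb = new ProcessBuilder(
                "ffmpeg", "-i", "input.mp4",
                "frames" + File.separator + "frame_%05d.jpg");
        pb.inheritIO(); // show ffmpeg's console output
        int exitCode = pb.start().waitFor();
        System.out.println("ffmpeg exited with code " + exitCode);
    }
}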
Project: HERE link to the project
I am trying to use this open source code but I get the following error:
error: bad operand type for binary operator '!='
In this context:
if (img != null) {
cvFlip(img, img, 1);// l-r = 90_degrees_steps_anti_clockwise
cvSaveImage((i++) + "-capture.jpg", img);
// show image on window
canvas.showImage(img);
}
Here is the entire class:
package pdlwebcam;
import com.googlecode.javacv.CanvasFrame;
import com.googlecode.javacv.FrameGrabber;
import com.googlecode.javacv.VideoInputFrameGrabber;
import com.googlecode.javacv.cpp.opencv_core.IplImage;
public class PDLWebcam implements Runnable {
//final int INTERVAL=1000;///you may use interval
IplImage image;
CanvasFrame canvas = new CanvasFrame("Web Cam");
public PDLWebcam() {
canvas.setDefaultCloseOperation(javax.swing.JFrame.EXIT_ON_CLOSE);
}
@Override
public void run() {
FrameGrabber grabber = new VideoInputFrameGrabber(0);
int i = 0;
try {
grabber.start();
IplImage img;
while (true) {
img = grabber.grab();
if (img != null) {
cvFlip(img, img, 1);// l-r = 90_degrees_steps_anti_clockwise
cvSaveImage((i++) + "-capture.jpg", img);
// show image on window
canvas.showImage(img);
}
//Thread.sleep(INTERVAL);
}
} catch (Exception e) {
}
}
}
I've created stub implementations for the cvFlip and cvSaveImage methods and the project compiled without any errors. NetBeans displays the bad operand error message anyway, though. It looks like a bug in the IDE itself.
Workaround: I noticed that the IplImage class derives from com.googlecode.javacpp.Pointer, which is not visible to NetBeans. Adding javacpp to the project libraries removed the error message.
How can I continuously capture images from a webcam?
I want to experiment with object recognition (by maybe using java media framework).
I was thinking of creating two threads
one thread:
Node 1: capture live image
Node 2: save image as "1.jpg"
Node 3: wait 5 seconds
Node 4: repeat...
other thread:
Node 1: wait until image is captured
Node 2: using the "1.jpg" get colors
from every pixle
Node 3: save data in arrays
Node 4: repeat... (a rough sketch of this second thread is shown below)
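Something like this minimal sketch for the second thread (the class name and the 5-second polling are just placeholders; the capture thread depends on whichever webcam API I end up using, which is the part I don't know how to do):

import java.awt.image.BufferedImage;
import java.io.File;
import java.io.IOException;
import javax.imageio.ImageIO;

public class PixelReaderThread implements Runnable {
    public void run() {
        File file = new File("1.jpg");
        while (true) {
            try {
                if (file.exists()) {
                    BufferedImage img = ImageIO.read(file);
                    // copy the color of every pixel into a 2D array
                    int[][] colors = new int[img.getHeight()][img.getWidth()];
                    for (int y = 0; y < img.getHeight(); y++) {
                        for (int x = 0; x < img.getWidth(); x++) {
                            colors[y][x] = img.getRGB(x, y); // packed ARGB value
                        }
                    }
                    // ... hand the array off to the recognition code here ...
                }
                Thread.sleep(5000); // match the 5 second capture interval
            } catch (IOException | InterruptedException e) {
                e.printStackTrace();
            }
        }
    }
}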
This JavaCV implementation works fine.
Code:
import org.bytedeco.javacv.*;
import org.bytedeco.opencv.opencv_core.IplImage;
import java.io.File;
import static org.bytedeco.opencv.global.opencv_core.cvFlip;
import static org.bytedeco.opencv.helper.opencv_imgcodecs.cvSaveImage;
public class Test implements Runnable {
final int INTERVAL = 100;///you may use interval
CanvasFrame canvas = new CanvasFrame("Web Cam");
public Test() {
canvas.setDefaultCloseOperation(javax.swing.JFrame.EXIT_ON_CLOSE);
}
public void run() {
new File("images").mkdir();
FrameGrabber grabber = new OpenCVFrameGrabber(0); // 1 for next camera
OpenCVFrameConverter.ToIplImage converter = new OpenCVFrameConverter.ToIplImage();
IplImage img;
int i = 0;
try {
grabber.start();
while (true) {
Frame frame = grabber.grab();
img = converter.convert(frame);
//the grabbed frame will be flipped, re-flip to make it right
cvFlip(img, img, 1);// l-r = 90_degrees_steps_anti_clockwise
//save
cvSaveImage("images" + File.separator + (i++) + "-aa.jpg", img);
canvas.showImage(converter.convert(img));
Thread.sleep(INTERVAL);
}
} catch (Exception e) {
e.printStackTrace();
}
}
public static void main(String[] args) {
Test gs = new Test();
Thread th = new Thread(gs);
th.start();
}
}
There is also a post on configuration for JavaCV.
You can modify the code to save the images at a regular interval and do whatever other processing you want.
Some time ago I created a generic Java library which can be used to take pictures with a PC webcam. The API is very simple and not over-featured; it can work standalone, but it also supports additional webcam drivers like OpenIMAJ, JMF, FMJ, LTI-CIVIL, etc., and some IP cameras.
Link to the project is https://github.com/sarxos/webcam-capture
Example code (take picture and save in test.jpg):
Webcam webcam = Webcam.getDefault();
webcam.open();
BufferedImage image = webcam.getImage();
ImageIO.write(image, "JPG", new File("test.jpg"));
It is also available in Maven Central Repository or as a separate ZIP which includes all required dependencies and 3rd party JARs.
JMyron is very simple to use.
http://webcamxtra.sourceforge.net/
myron = new JMyron();
myron.start(imgw, imgh);
myron.update();
int[] img = myron.image();
Here is a similar question with some (as yet unaccepted) answers. One of them mentions FMJ as a Java alternative to JMF.
This builds on gt_ebuddy's answer using JavaCV, but my video output is at a much higher quality than his. I've also added some other miscellaneous improvements (such as closing the program when ESC or CTRL+C is pressed, and making sure to properly close the resources the program uses).
import java.awt.event.ActionEvent;
import java.awt.event.KeyEvent;
import java.awt.event.WindowAdapter;
import java.awt.event.WindowEvent;
import java.awt.image.BufferedImage;
import javax.swing.AbstractAction;
import javax.swing.ActionMap;
import javax.swing.InputMap;
import javax.swing.JComponent;
import javax.swing.JFrame;
import javax.swing.KeyStroke;
import com.googlecode.javacv.CanvasFrame;
import com.googlecode.javacv.OpenCVFrameGrabber;
import com.googlecode.javacv.cpp.opencv_core.IplImage;
public class HighRes extends JComponent implements Runnable {
private static final long serialVersionUID = 1L;
private static CanvasFrame frame = new CanvasFrame("Web Cam");
private static boolean running = false;
private static int frameWidth = 800;
private static int frameHeight = 600;
private static OpenCVFrameGrabber grabber = new OpenCVFrameGrabber(0);
private static BufferedImage bufImg;
public HighRes()
{
// setup key bindings
ActionMap actionMap = frame.getRootPane().getActionMap();
InputMap inputMap = frame.getRootPane().getInputMap(JComponent.WHEN_IN_FOCUSED_WINDOW);
for (Keys direction : Keys.values())
{
actionMap.put(direction.getText(), new KeyBinding(direction.getText()));
inputMap.put(direction.getKeyStroke(), direction.getText());
}
frame.getRootPane().setActionMap(actionMap);
frame.getRootPane().setInputMap(JComponent.WHEN_IN_FOCUSED_WINDOW, inputMap);
// setup window listener for close action
frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
frame.addWindowListener(new WindowAdapter()
{
public void windowClosing(WindowEvent e)
{
stop();
}
});
}
public static void main(String... args)
{
HighRes webcam = new HighRes();
webcam.start();
}
@Override
public void run()
{
try
{
grabber.setImageWidth(frameWidth);
grabber.setImageHeight(frameHeight);
grabber.start();
while (running)
{
final IplImage cvimg = grabber.grab();
if (cvimg != null)
{
// cvFlip(cvimg, cvimg, 1); // mirror
// show image on window
bufImg = cvimg.getBufferedImage();
frame.showImage(bufImg);
}
}
grabber.stop();
grabber.release();
frame.dispose();
}
catch (Exception e)
{
e.printStackTrace();
}
}
public void start()
{
// set the flag before starting the thread so the grab loop does not exit immediately
running = true;
new Thread(this).start();
}
public void stop()
{
running = false;
}
private class KeyBinding extends AbstractAction {
private static final long serialVersionUID = 1L;
public KeyBinding(String text)
{
super(text);
putValue(ACTION_COMMAND_KEY, text);
}
@Override
public void actionPerformed(ActionEvent e)
{
String action = e.getActionCommand();
if (action.equals(Keys.ESCAPE.toString()) || action.equals(Keys.CTRLC.toString())) stop();
else System.out.println("Key Binding: " + action);
}
}
}
enum Keys
{
ESCAPE("Escape", KeyStroke.getKeyStroke(KeyEvent.VK_ESCAPE, 0)),
CTRLC("Control-C", KeyStroke.getKeyStroke(KeyEvent.VK_C, KeyEvent.CTRL_DOWN_MASK)),
UP("Up", KeyStroke.getKeyStroke(KeyEvent.VK_UP, 0)),
DOWN("Down", KeyStroke.getKeyStroke(KeyEvent.VK_DOWN, 0)),
LEFT("Left", KeyStroke.getKeyStroke(KeyEvent.VK_LEFT, 0)),
RIGHT("Right", KeyStroke.getKeyStroke(KeyEvent.VK_RIGHT, 0));
private String text;
private KeyStroke keyStroke;
Keys(String text, KeyStroke keyStroke)
{
this.text = text;
this.keyStroke = keyStroke;
}
public String getText()
{
return text;
}
public KeyStroke getKeyStroke()
{
return keyStroke;
}
@Override
public String toString()
{
return text;
}
}
You can also try the Java Webcam SDK library.
An SDK demo applet is available at the link.
I have used JMF for a videoconference application and it worked well on two laptops: one with an integrated webcam and another with an old USB webcam. It requires JMF to be installed and configured beforehand, but once that's done you can access the hardware from Java code fairly easily.
You can try Marvin Framework. It provides an interface to work with cameras. Moreover, it also provides a set of real-time video processing features, like object tracking and filtering.
Take a look!
Real-time Video Processing Demo:
http://www.youtube.com/watch?v=D5mBt0kRYvk
You can use the source below. Just save a frame using MarvinImageIO.saveImage() every 5 seconds (a minimal sketch of that is shown after the two examples below).
Webcam video demo:
public class SimpleVideoTest extends JFrame implements Runnable{
private MarvinVideoInterface videoAdapter;
private MarvinImage image;
private MarvinImagePanel videoPanel;
public SimpleVideoTest(){
super("Simple Video Test");
videoAdapter = new MarvinJavaCVAdapter();
videoAdapter.connect(0);
videoPanel = new MarvinImagePanel();
add(videoPanel);
new Thread(this).start();
setSize(800,600);
setVisible(true);
}
@Override
public void run() {
while(true){
// Request a video frame and set into the VideoPanel
image = videoAdapter.getFrame();
videoPanel.setImage(image);
}
}
public static void main(String[] args) {
SimpleVideoTest t = new SimpleVideoTest();
t.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
}
}
For those who just want to take a single picture:
WebcamPicture.java
public class WebcamPicture {
public static void main(String[] args) {
try{
MarvinVideoInterface videoAdapter = new MarvinJavaCVAdapter();
videoAdapter.connect(0);
MarvinImage image = videoAdapter.getFrame();
MarvinImageIO.saveImage(image, "./res/webcam_picture.jpg");
} catch(MarvinVideoInterfaceException e){
e.printStackTrace();
}
}
}
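Combining the two, a minimal sketch of the "save a frame every 5 seconds" idea mentioned above (the output path and interval are just placeholders):

public class IntervalCapture {
    public static void main(String[] args) {
        try {
            // Connect to the default camera, as in the examples above
            MarvinVideoInterface videoAdapter = new MarvinJavaCVAdapter();
            videoAdapter.connect(0);
            int i = 0;
            while (true) {
                // Grab the current frame and save it under a numbered file name
                MarvinImage image = videoAdapter.getFrame();
                MarvinImageIO.saveImage(image, "./res/frame_" + (i++) + ".jpg");
                Thread.sleep(5000); // wait 5 seconds before the next capture
            }
        } catch (MarvinVideoInterfaceException | InterruptedException e) {
            e.printStackTrace();
        }
    }
}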
I used the Webcam Capture API. You can download it from here.
webcam = Webcam.getDefault();
webcam.open();
if (webcam.isOpen()) { //if web cam open
BufferedImage image = webcam.getImage();
JLabel imageLbl = new JLabel();
imageLbl.setSize(640, 480); //show captured image
imageLbl.setIcon(new ImageIcon(image));
int showConfirmDialog = JOptionPane.showConfirmDialog(null, imageLbl, "Image Viewer", JOptionPane.YES_NO_OPTION, JOptionPane.QUESTION_MESSAGE, new ImageIcon(""));
if (showConfirmDialog == JOptionPane.YES_OPTION) {
JFileChooser chooser = new JFileChooser();
chooser.setDialogTitle("Save Image");
chooser.setFileFilter(new FileNameExtensionFilter("IMAGES ONLY", "png", "jpeg", "jpg")); // only these file extensions are shown
int showSaveDialog = chooser.showSaveDialog(this);
if (showSaveDialog == 0) { //if pressed 'Save' button
String filePath = chooser.getCurrentDirectory().toString().replace("\\", "/");
String fileName = chooser.getSelectedFile().getName(); //get user entered file name to save
ImageIO.write(image, "PNG", new File(filePath + "/" + fileName + ".png"));
}
}
}
http://grack.com/downloads/school/enel619.10/report/java_media_framework.html
Using the Player with Swing
The Player can be easily used in a Swing application as well. The following code creates a Swing-based TV capture program with the video output displayed in the entire window:
import javax.media.*;
import javax.swing.*;
import java.awt.*;
import java.net.*;
import java.awt.event.*;
import javax.swing.event.*;
public class JMFTest extends JFrame {
Player _player;
JMFTest() {
addWindowListener( new WindowAdapter() {
public void windowClosing( WindowEvent e ) {
_player.stop();
_player.deallocate();
_player.close();
System.exit( 0 );
}
});
setBounds( 0, 0, 320, 260 ); // JFrame has no setExtent(); setBounds() gives the intended window size
JPanel panel = (JPanel)getContentPane();
panel.setLayout( new BorderLayout() );
String mediaFile = "vfw://1";
try {
MediaLocator mlr = new MediaLocator( mediaFile );
_player = Manager.createRealizedPlayer( mlr );
if (_player.getVisualComponent() != null)
panel.add("Center", _player.getVisualComponent());
if (_player.getControlPanelComponent() != null)
panel.add("South", _player.getControlPanelComponent());
}
catch (Exception e) {
System.err.println( "Got exception " + e );
}
}
public static void main(String[] args) {
JMFTest jmfTest = new JMFTest();
jmfTest.show();
}
}
Java usually doesn't like accessing hardware, so you will need a driver program of some sort, as goldenmean said. I've done this on my laptop by finding a command line program that snaps a picture. Then it's the same as goldenmean explained: you run the command line program from your Java program in the takepicture() routine, and the rest of your code runs the same.
For the part about reading pixel values into an array, you might be better served by saving the file as a BMP, which is nearly that format already, and then using the standard Java image libraries on it.
Using a command line program adds a dependency to your program and makes it less portable, but so was the webcam, right?
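A minimal sketch of that idea, assuming a hypothetical command line tool called snapshot.exe that writes capture.bmp (substitute whatever capture program and file name you actually use):

import java.awt.image.BufferedImage;
import java.io.File;
import java.io.IOException;
import javax.imageio.ImageIO;

public class CommandLineCapture {

    // "snapshot.exe" and "capture.bmp" are placeholders for whatever CLI tool you find.
    static BufferedImage takePicture() throws IOException, InterruptedException {
        Process p = new ProcessBuilder("snapshot.exe", "capture.bmp").start();
        p.waitFor(); // wait for the tool to finish writing the file
        return ImageIO.read(new File("capture.bmp")); // load the BMP with the standard libraries
    }

    public static void main(String[] args) throws Exception {
        BufferedImage img = takePicture();
        int firstPixel = img.getRGB(0, 0); // packed ARGB of the top-left pixel
        System.out.println("Captured " + img.getWidth() + "x" + img.getHeight()
                + ", first pixel = 0x" + Integer.toHexString(firstPixel));
    }
}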
I believe the webcam application software that comes along with the webcam, or your native Windows webcam software, can be run from a batch script (Windows/DOS script) after turning the webcam on (i.e. if it needs an external power supply). In the batch script, you can add an appropriate delay to capture after a certain time period, and keep executing the capture command in a loop.
I guess this should be possible.
-AD
There's a pretty nice interface for this in Processing, which is a kind of pidgin Java designed for graphics. It gets used in some image recognition work, such as at that link.
Depending on what you need out of it, you might be able to load the video library that's used there from Java, or if you're just playing around with it you might be able to get by using Processing itself.
FMJ can do this, as can the supporting library it uses, LTI-CIVIL. Both are on sourceforge.
I recommend using FMJ for multimedia-related Java apps.
Try using JMyron (How To Use Webcam Using Java). I think JMyron is the easiest way to access a webcam from Java. I tried to use it with a 64-bit processor, but it gave me an error. It worked just fine on a 32-bit processor, though.