I'm writing some code to play media. I noticed that the VLC media player correctly honours the --avcodec-hw parameter, but vlcj does not, even when I pass the same parameter to the library.
This is how I build the media player component:
//This uses JavaFX 17
MediaPlayerFactory factory;
EmbeddedMediaPlayer mediaPlayer;
var args = new ArrayList<String>();
args.add("-vv");                          // verbose libvlc logging
args.add("--network-caching=100");
args.add("--avcodec-hw=vdpau_avcodec");   // request VDPAU hardware decoding
factory = new MediaPlayerFactory(args);
mediaPlayer = factory.mediaPlayers().newEmbeddedMediaPlayer();
var vs = new ImageViewVideoSurface(this.videoImageView);
mediaPlayer.videoSurface().set(vs);
videoImageView.setPreserveRatio(true);
String murl = "/tmp/mytestvideo.mp4";
mediaPlayer.media().play(murl);
mediaPlayer.controls().start();
In VLC I set VDPAU through the GUI menu (Tools -> Preferences -> Input / Codecs).
The command nvidia-smi dmon shows a few percent of GPU usage when using VLC (and top shows very low CPU usage), but 0 when using the JavaFX application (and top shows about 30% CPU usage).
How can I force the GPU video decoding?
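For reference, here is a small standalone sketch of how I assemble the option list. Values other than vdpau_avcodec, such as any, are guesses I could try, not confirmed fixes:

```java
import java.util.ArrayList;
import java.util.List;

public class VlcArgs {
    // Builds the libvlc argument list. "any" (an assumed value to try) would
    // let libvlc pick the first working hardware decoder instead of pinning VDPAU.
    static List<String> buildArgs(String hwBackend) {
        List<String> args = new ArrayList<>();
        args.add("-vv");                        // verbose libvlc logging
        args.add("--network-caching=100");
        args.add("--avcodec-hw=" + hwBackend);  // e.g. "any", "vdpau_avcodec"
        return args;
    }

    public static void main(String[] argv) {
        System.out.println(buildArgs("any"));
        // then: new MediaPlayerFactory(buildArgs("any"))
    }
}
```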
EDIT:
Here are the versions of the software I'm using:
Ubuntu 20.04
vlc media player: 3.0.9.2
vlcj: 4.7.3
vlcj-natives: 4.7.0
vlcj-javafx: 1.1.0
EDIT:
Edited according to @caprica's suggestion; no effect.
Also, at runtime I pass the JVM option -Dprism.forceGPU=true.
PROBLEM:
I've put together some code based on solutions I've found on the net, but when I run it, I get:
NullPointerException - Attempt to invoke virtual method
'java.lang.Object.android.content.Context.getSystemService(java.lang.Class)'
on a null object reference
I get this exception when I run the final line, mMediaPlayer.play();
Checking whether these relevant objects (mLibVLC, mMediaPlayer, mMedia) are null when I'm using them suggests they are not.
When I put the RTP address directly into the VLC app on the tablet, the video plays just fine.
BACKGROUND:
I'm trying to write a plugin for Android Team Awareness Kit (ATAK) that will allow me to stream and control a PTZ camera from an Android tablet, plugged in through an Ethernet adapter.
ATAK Plugin SDK - https://github.com/deptofdefense/AndroidTacticalAssaultKit-CIV/releases
Because I'm working with an existing SDK, I'm limited to what I can use. I believe I have to use a minSdkVersion = 21, and a compileSdkVersion = 26 in my build.gradle. I have tried changing them to newer versions (e.g. compileSdkVersion = 28) but quickly fall into a vortex of other errors.
AIM:
I'm trying to play a multicast UDP video stream live in my plugin. To do this, I'm using libvlc and trying to tie it into the existing plugin structure.
CODE:
Relevant Imports
import org.videolan.libvlc.LibVLC;
import org.videolan.libvlc.Media;
import org.videolan.libvlc.MediaPlayer;
import org.videolan.libvlc.media.VideoView;
import org.videolan.libvlc.util.VLCVideoLayout;
Definitions
private VLCVideoLayout VideoFrame;
private LibVLC mLibVLC = null;
private org.videolan.libvlc.MediaPlayer mMediaPlayer;
Relevant Video Playing Code
VideoFrame = myFirstFragment.findViewById(R.id.videoViewFrame);
String Video_URL = "rtsp://*****************";
mLibVLC = new LibVLC(this.pluginContext);
mMediaPlayer = new MediaPlayer(mLibVLC);
mMediaPlayer.attachViews(VideoFrame, null, false, false);
Media media = new Media(mLibVLC, Video_URL);
media.setHWDecoderEnabled(true, false);
mMediaPlayer.setMedia(media);
media.release();
mMediaPlayer.play();
In my frame constructor .XML
<org.videolan.libvlc.util.VLCVideoLayout
android:id="@+id/videoViewFrame"
android:layout_width="155dp"
android:layout_height="94dp"
android:fitsSystemWindows="true"
android:visibility="visible"
tools:visibility="visible" />
Added to my build.gradle dependencies
implementation files('C:\\******\\libvlc-all-4.0.0-eap1.aar')
OTHER THINGS I HAVE TRIED:
I've also tried implementing an alternative that I found online, but ATAK refuses to load the plugin with the setVideoURI line, which is obviously a pretty important line. I get a popup saying my plugin is incompatible.
Uri uri = Uri.parse("rtsp://***********");
org.videolan.libvlc.media.VideoView mVideoView = (VideoView) myFirstFragment.findViewById(R.id.videoViewFrame);
mVideoView.setMediaController(new MediaController(this.pluginContext));
mVideoView.setVideoURI(uri);
mVideoView.requestFocus();
mVideoView.start();
In this case, I also changed the frame type in my frame constructor .XML from VLCVideoLayout to org.videolan.libvlc.media.VideoView
I've also tried putting the entire code into an AppCompatActivity (which is how the original example was built) however had great trouble figuring out how to get the VideoView/VLCVideoLayout frame into the activity to bind the video to, and also determining how to get the activity to actually run. Also, this solution just seemed to be so much more complicated than I figure it should be.
When I get this running through whatever method, I'll bind it to a start and stop button.
I'm working on a Java app that will process the stream from an IP camera (Milesight MS-C2682-P) located on the local network. It will detect objects and trigger actions depending on what's in the image (let's say it will start an alarm when a person is detected) - for that I need minimal delay.
I have an RTSP link "rtsp://username:password@ip_addr:rtsp_port/main" to access the stream from my IP camera, but in my Java app there is a 12-second delay (and it's increasing). This happens when images are not handled fast enough, so they are buffered. There are "hacks" and "workarounds" (OpenCV VideoCapture lag due to the capture buffer), but I believe there has to be a prettier solution.
The other link I was able to get is an HTTP one that also uses the H.264 codec (it can be used with MJPEG and MPEG4, if there is a way to use them effectively). "http://username:password@ip_addr:http_port/ipcam/mjpeg.cgi" works like a charm in Python and in the browser. However, it doesn't work in Java; an error is thrown:
OpenCV(4.2.0) C:\build\master_winpack-bindings-win64-vc14-static\opencv\modules\videoio\src\cap_images.cpp:253: error: (-5:Bad argument) CAP_IMAGES: can't find starting number (in the name of file): HTTP_URL in function 'cv::icvExtractPattern'
Both links work smoothly in VLC.
So the network is not a problem (because VLC handles the stream with minimal delay) and Python using OpenCV is also doing a good job. It all comes down to the Java implementation of OpenCV, I guess.
Here is the Java code:
VideoPlayer videoPlayer = new VideoPlayer(); // my class; just creates and updates a JFrame, works like a charm with the laptop's webcam, so certainly no issues here
Mat image = new Mat();
VideoCapture ipCamera = new VideoCapture(RTSP_URL);
// or the HTTP link
// VideoCapture ipCamera = new VideoCapture(HTTP_URL);

// verify we have access to the camera
if (!ipCamera.isOpened()) {
    System.out.println("ERROR: Camera isn't working!");
    return;
}
System.out.println("OK: Connected to camera.");

while (true) {
    ipCamera.read(image);
    videoPlayer.updateVideo_MatImage(image);
}
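One pattern I can think of to avoid any buffering (a sketch with a plain generic type standing in for OpenCV's Mat; all names here are illustrative, not from my project): a capture thread reads frames in a tight loop and keeps overwriting a single slot, so the processing side only ever sees the newest frame.

```java
import java.util.concurrent.atomic.AtomicReference;

// "Latest frame wins" holder: the capture loop publishes every frame,
// the consumer takes a snapshot on demand and never reads stale, buffered frames.
public class LatestFrameHolder<T> {
    private final AtomicReference<T> latest = new AtomicReference<>();

    // called by the capture loop for every frame (e.g. after ipCamera.read(image))
    public void publish(T frame) {
        latest.set(frame);
    }

    // called on demand; always returns the newest published frame
    public T snapshot() {
        return latest.get();
    }

    public static void main(String[] args) {
        LatestFrameHolder<String> holder = new LatestFrameHolder<>();
        holder.publish("frame-1");
        holder.publish("frame-2");
        System.out.println(holder.snapshot()); // frame-2
    }
}
```

In the real app the capture thread would run the read loop continuously, while the detection code calls snapshot() only when it is ready for a new image.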
And this is the Python code I'm using:
import cv2

cap = cv2.VideoCapture(RTSP_URL)
# or the HTTP link
# cap = cv2.VideoCapture(HTTP_URL)

while True:
    ret, image = cap.read()
    cv2.imshow("Test", image)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break

cv2.destroyAllWindows()
I just need to get the latest image when a request is made, so I need to avoid any kind of buffering. It has to be implemented in Java since that's a requirement for this project.
So is there a way to get only the latest image from the camera?
What could cause the error mentioned above?
Thank you guys for any advice.
I am trying to get vlcj to show a visualizer for the MP3 files it is playing from an HTTP stream. Video shows up when I stream one, but when audio plays, nothing happens.
This is my code on the client side:
EmbeddedMediaPlayerComponent empc = new EmbeddedMediaPlayerComponent();
String[] op = {"audio-visual=visual", "effect-list=spectrum", "effect-width=800", "effect-height=80"};
empc.mediaPlayer().media().play("http://127.0.0.1:" + port, op);
There's a lot more code, but nothing directly related to VLCJ.
I can post the server code if you think it's necessary, but I think it's not needed since the media reaches the client perfectly.
So, audio and video work fine, but the visualizer simply doesn't show up.
Any help would be appreciated.
First, check if you have the visualisation plugins installed on your OS distribution.
I am using Linux Mint and those plugins are NOT installed by default when you install VLC.
Do this:
sudo apt install vlc-plugin-visualization
Second, it seems you have to set the visualisation options on the MediaPlayerFactory rather than passing them when you invoke play() on the media player.
For example:
String[] options = new String[] {"--audio-visual=visual", "--effect-list=scope,vuMeter,spectrometer,spectrum"};
factory = new MediaPlayerFactory(options);
mediaPlayer = factory.mediaPlayers().newEmbeddedMediaPlayer();
This example configures the factory before creating a media player from it, you can use any of the media player factory creation methods.
The visualisations scale with the size of the window, I could not get the width and height parameters to do anything.
This is fine for audio.
If you play video, then the video will go to the video surface embedded in your application and VLC will open a new separate window to show the visualisations (probably you don't want that).
The image quality and the framerate I get when using the camera2 API does not match the one I get when I manually record a video using the camera app to a file.
I am trying to do real-time image processing using OpenCV on Android. I have manually recorded a video using the built-in camera application and everything worked perfectly: the image quality was good, the framerate was a stable 30 FPS.
My min SDK version is 22, so I am using the camera2 API's repeating requests. I have set it up, together with an ImageReader and the YUV_420_888 format. I have tried both the PREVIEW and the RECORD capture request templates, tried manually setting 18 capture request parameters in the builder (eg. disabling auto-white-balance, setting the color correction mode to fast), but the FPS was still around 8-9 and the image quality was poor as well. Another phone yielded the same results, despite its max. FPS being 16.67 (instead of 30).
The culprit is not my image processing (which happens in another thread, except for reading the image's buffer): I checked the FPS when I don't do anything with the frame (I didn't even display the image), it was still around 8-9.
You can see the relevant code for that here:
//constructor:
HandlerThread thread = new HandlerThread("MyApp:CameraCallbacks", Process.THREAD_PRIORITY_MORE_FAVORABLE);
thread.start();
captureCallbackHandler = new Handler(thread.getLooper());
//some UI event:
cameraManager.openCamera(cameraId, new CameraStateCallback(), null);
//CameraStateCallback#onOpened:
//size is 1280x720, same as the manually captured video's
imageReader = ImageReader.newInstance(size.getWidth(), size.getHeight(), ImageFormat.YUV_420_888, 1);
imageReader.setOnImageAvailableListener(new ImageAvailableListener(), captureCallbackHandler);
camera.createCaptureSession(Collections.singletonList(imageReader.getSurface()), new CaptureStateCallback(), captureCallbackHandler);
//CaptureStateCallback#onConfigured:
CaptureRequest.Builder builder = activeCamera.createCaptureRequest(CameraDevice.TEMPLATE_RECORD);
builder.addTarget(imageReader.getSurface());
//setting the FPS range has no effect: this phone only has one option
session.setRepeatingRequest(builder.build(), null, captureCallbackHandler);
//ImageAvailableListener#onImageAvailable:
long current = System.nanoTime();
deltaTime += (current - last - deltaTime) * 0.1;
Log.d("MyApp", "onImageAvailable FPS: " + (1000000000 / deltaTime));
//prints around 8.7
last = current;
try (Image image = reader.acquireLatestImage()) { }
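For what it's worth, the deltaTime line above is an exponential moving average of the inter-frame interval, so the printed value is 1e9 / deltaTime. Here is a standalone sketch of the same estimate (plain Java, no camera; the class and method names are invented for illustration, and I added a first-frame guard that the listener above does not have):

```java
// Exponential-moving-average FPS estimator, mirroring the onImageAvailable math.
public class FpsEstimator {
    private double deltaTime;    // EMA of the inter-frame interval, in nanoseconds
    private long last;
    private boolean first = true;

    public void onFrame(long nowNanos) {
        if (!first) {
            // same smoothing formula as in the listener: new EMA moves 10%
            // of the way toward the latest observed interval
            deltaTime += (nowNanos - last - deltaTime) * 0.1;
        }
        first = false;
        last = nowNanos;
    }

    public double fps() {
        return 1_000_000_000.0 / deltaTime;
    }

    public static void main(String[] args) {
        FpsEstimator est = new FpsEstimator();
        long t = 0;
        for (int i = 0; i < 200; i++) {
            est.onFrame(t);
            t += 33_333_333L;   // simulate a steady 30 fps feed
        }
        System.out.println(Math.round(est.fps())); // 30
    }
}
```

Feeding it steady 33.3 ms intervals converges to 30, so a reading of ~8.7 from the real listener means the frames genuinely arrive that slowly, not that the measurement is off.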
On Samsung Galaxy J3 (2016), doing Camera.Parameters#setRecordingHint(true) (while using the deprecated camera API) achieves exactly what I wanted: the video quality and the framerate becomes the same as the built-in video recorder's. Unfortunately, it also means that I was unable to modify the resolution, and setting that hint did not achieve this same effect on a Doogee X5 MAX.
I am trying to make my Mac dock icon bounce as suggested in How to make my app icon bounce in the Mac dock.
This works fine with pure Java and Swing applications, but it doesn't work with e4 SWT applications. How can I make it bounce in this type of application?
Ref: see the sample code attached to the following link:
https://bugs.eclipse.org/bugs/show_bug.cgi?id=321949
Application.requestUserAttention works for me in an e4 application (Eclipse 4.3.2 on Mac 10.9.3 with Java 1.8 update 5).
Note: it only does something if the application is not the focused app. With the false parameter there is only one bounce; specify true to make it bounce until the app has focus.
Update:
You can also do this using the SWT Mac specific classes, like this:
private static final long sel_requestUserAttention_ = OS.sel_registerName("requestUserAttention:");
private static final int NSCriticalRequest = 0;
private static final int NSInformationalRequest = 10;
...
NSApplication app = NSApplication.sharedApplication();
OS.objc_msgSend(app.id, sel_requestUserAttention_, NSInformationalRequest);
Use NSInformationalRequest for a single bounce, NSCriticalRequest to bounce until the app receives focus.
Since this is Mac only SWT code you will have to put it in a plugin or fragment with a platform filter in the MANIFEST.MF such as:
Eclipse-PlatformFilter: (& (osgi.ws=cocoa) (osgi.os=macosx) (osgi.arch=x86_64) )
Update:
The above code is for 64 bit SWT on Mac OSX, for 32 bit SWT use
private static final int sel_requestUserAttention_ = OS.sel_registerName("requestUserAttention:");