VLCJ - Visualizer has no output - java

I am trying to get vlcj to show a visualizer for the MP3 files it's playing from an HTTP stream. Video shows up when I stream a video file, but when audio plays, nothing happens.
This is my code on the client side:
EmbeddedMediaPlayerComponent empc = new EmbeddedMediaPlayerComponent();
String[] op = {"audio-visual=visual", "effect-list=spectrum", "effect-width=800", "effect-height=80"};
empc.mediaPlayer().media().play("http://127.0.0.1:" + port, op);
There's a lot more code, but nothing directly related to VLCJ.
I can post the server code if you think it's necessary, but I don't think it's needed since the media reaches the client perfectly.
So, audio and video work fine, but the visualizer simply doesn't show up.
Any help would be appreciated.

First, check if you have the visualisation plugins installed on your OS distribution.
I am using Linux Mint and those plugins are NOT installed by default when you install VLC.
Do this:
sudo apt install vlc-plugin-visualization
Second, it seems you have to set the visualisation options on the MediaPlayerFactory rather than passing them when you invoke play() on the media player.
For example:
String[] options = new String[] {"--audio-visual=visual", "--effect-list=scope,vuMeter,spectrometer,spectrum"};
factory = new MediaPlayerFactory(options);
mediaPlayer = factory.mediaPlayers().newEmbeddedMediaPlayer();
This example configures the factory before creating a media player from it; you can use any of the media player factory's creation methods.
The visualisations scale with the size of the window; I could not get the width and height parameters to do anything.
This is fine for audio.
If you play video, then the video will go to the video surface embedded in your application and VLC will open up a new separate window to show the visualisations (which you probably don't want).
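For reference, here is a minimal, untested sketch that puts the two points above together. It assumes the vlcj 4 API names (MediaPlayerFactory, factory.videoSurfaces().newVideoSurface(...)); the stream URL, window size and effect list are placeholders you would replace with your own:
import java.awt.Canvas;
import java.awt.Color;
import javax.swing.JFrame;

import uk.co.caprica.vlcj.factory.MediaPlayerFactory;
import uk.co.caprica.vlcj.player.embedded.EmbeddedMediaPlayer;

public class AudioVisualizerDemo {

    public static void main(String[] args) {
        // Visualisation options must go on the factory, not on play()
        MediaPlayerFactory factory = new MediaPlayerFactory(
            "--audio-visual=visual",
            "--effect-list=spectrum"
        );

        EmbeddedMediaPlayer mediaPlayer = factory.mediaPlayers().newEmbeddedMediaPlayer();

        // A plain AWT Canvas is used as the rendering surface; the
        // visualisation scales with this component's size
        Canvas canvas = new Canvas();
        canvas.setBackground(Color.BLACK);
        mediaPlayer.videoSurface().set(factory.videoSurfaces().newVideoSurface(canvas));

        JFrame frame = new JFrame("Visualizer");
        frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
        frame.add(canvas);
        frame.setSize(800, 400);
        frame.setVisible(true);

        // Placeholder stream URL
        mediaPlayer.media().play("http://127.0.0.1:8080");
    }
}
Since the visualisation is rendered into the Canvas, resizing the frame resizes the visualisation.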

Related

Hardware acceleration honoured by Vlc player but ignored by vlcj

I'm writing some code to play media. I noticed that the VLC media player correctly honours the --avcodec-hw parameter, but vlcj seems to ignore the same parameter when it is passed to the library.
This is how I build the media player component:
//This uses jfx17
MediaPlayerFactory factory;
EmbeddedMediaPlayer mediaPlayer;
var args = new ArrayList<String>();
args.add("-vv");
args.add("--network-caching=100");
args.add("--avcodec-hw=vdpau_avcodec");
factory = new MediaPlayerFactory(args);
mediaPlayer = factory.mediaPlayers().newEmbeddedMediaPlayer();
var vs = new ImageViewVideoSurface(this.videoImageView);
mediaPlayer.videoSurface().set(vs);
videoImageView.setPreserveRatio(true);
String murl = "/tmp/mytestvideo.mp4";
mediaPlayer.media().play(murl);
mediaPlayer.controls().start();
In VLC I set VDPAU through the GUI menu (Tools -> Preferences -> Input / Codecs).
The command nvidia-smi dmon shows a few percent of GPU usage when using VLC (and top reports very low CPU usage), but 0 when using the JavaFX application (and top reports about 30% CPU usage).
How can I force the GPU video decoding?
EDIT:
Here are the versions of the software I'm using:
Ubuntu 20.04
vlc media player: 3.0.9.2
vlcj: 4.7.3
vlcj-natives: 4.7.0
vlcj-javafx: 1.1.0
EDIT 2:
Edited according to @caprica's suggestion; no effect.
Also, at runtime I force the parameter -Dprism.forceGPU=true
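One variation that might be worth ruling out, though I have not verified that it changes anything for hardware decoding: vlcj also accepts per-media options (colon-prefixed, like VLC's per-item options) when you start playback, so the codec hint can be passed at play() time in addition to the factory arguments:
// Same setup as above; the only change is the per-media option passed to play()
String murl = "/tmp/mytestvideo.mp4";
mediaPlayer.media().play(murl, ":avcodec-hw=vdpau_avcodec");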

Simple Multicast Video Player in Java

PROBLEM:
I've put together some code based on solutions I've found on the net, but when I run it, I get:
NullPointerException - Attempt to invoke virtual method
'java.lang.Object.android.content.Context.getSystemService(java.lang.Class)'
on a null object reference
I get this exception when I run the final line, mMediaPlayer.play();
Checking whether these relevant objects (mLibVLC, mMediaPlayer, mMedia) are null when I'm using them suggests they are not.
When I put the RTP address directly into the VLC app on the tablet, the video plays just fine.
BACKGROUND:
I'm trying to write a plugin for Android Team Awareness Kit (ATAK) that will allow me to stream and control a PTZ camera from an Android tablet, plugged in through an Ethernet adapter.
ATAK Plugin SDK - https://github.com/deptofdefense/AndroidTacticalAssaultKit-CIV/releases
Because I'm working with an existing SDK, I'm limited to what I can use. I believe I have to use a minSdkVersion = 21, and a compileSdkVersion = 26 in my build.gradle. I have tried changing them to newer versions (e.g. compileSdkVersion = 28) but quickly fall into a vortex of other errors.
AIM:
I'm trying to play a multicast UDP video stream live in my plugin. To do this, I'm using libvlc and trying to tie it into the existing plugin structure.
CODE:
Relevant Imports
import org.videolan.libvlc.LibVLC;
import org.videolan.libvlc.Media;
import org.videolan.libvlc.MediaPlayer;
import org.videolan.libvlc.media.VideoView;
import org.videolan.libvlc.util.VLCVideoLayout;
Definitions
private VLCVideoLayout VideoFrame;
private LibVLC mLibVLC = null;
private org.videolan.libvlc.MediaPlayer mMediaPlayer;
Relevant Video Playing Code
VideoFrame = myFirstFragment.findViewById(R.id.videoViewFrame);
String Video_URL = "rtsp://*****************";
mLibVLC = new LibVLC(this.pluginContext);
mMediaPlayer = new MediaPlayer(mLibVLC);
mMediaPlayer.attachViews(VideoFrame, null, false, false);
Media media = new Media(mLibVLC, Video_URL);
media.setHWDecoderEnabled(true, false);
mMediaPlayer.setMedia(media);
media.release();
mMediaPlayer.play();
In my frame's layout .XML:
<org.videolan.libvlc.util.VLCVideoLayout
android:id="#+id/videoViewFrame"
android:layout_width="155dp"
android:layout_height="94dp"
android:fitsSystemWindows="true"
android:visibility="visible"
tools:visibility="visible" />
Added to my build.gradle dependencies
implementation files('C:\\******\\libvlc-all-4.0.0-eap1.aar')
OTHER THINGS I HAVE TRIED:
I've also tried implementing an alternative approach that I found online, but ATAK refuses to load the plugin with the setVideoURI line, which is obviously a pretty important line. I get a popup saying my plugin is incompatible.
Uri uri = Uri.parse("rtsp://***********");
org.videolan.libvlc.media.VideoView mVideoView = (VideoView) myFirstFragment.findViewById(R.id.videoViewFrame);
VideoFrame.setMediaController(new MediaController(this.pluginContext));
VideoFrame.setVideoURI(uri);
VideoFrame.requestFocus();
VideoFrame.start();
In this case, I also changed the frame type in my frame constructor .XML from VLCVideoLayout to org.videolan.libvlc.media.VideoView
I've also tried putting the entire code into an AppCompatActivity (which is how the original example was built), but I had great trouble figuring out how to get the VideoView/VLCVideoLayout frame into the activity to bind the video to, and how to get the activity to actually run. This solution also seemed much more complicated than I figure it should be.
When I get this running through whatever method, I'll bind it to a start and stop button.
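Not a confirmed fix, but one thing the stack trace suggests is worth testing: getSystemService() is being called on a null Context somewhere inside libvlc, so it may help to construct LibVLC with the host activity context that backs the layout instead of the plugin context, and to build the Media from a parsed Uri for a network URL. A rough sketch, using the same imports as above plus java.util.ArrayList, android.content.Context and android.net.Uri:
VLCVideoLayout videoFrame = myFirstFragment.findViewById(R.id.videoViewFrame);

// Use the context that backs the layout (the host activity context) instead
// of the plugin context
Context hostContext = videoFrame.getContext();

ArrayList<String> options = new ArrayList<>();
options.add("-vvv"); // verbose libvlc logging while debugging

LibVLC libVlc = new LibVLC(hostContext, options);
MediaPlayer mediaPlayer = new MediaPlayer(libVlc);
mediaPlayer.attachViews(videoFrame, null, false, false);

// Build the media from a Uri rather than a plain string for a network URL
Media media = new Media(libVlc, Uri.parse("rtsp://*****************"));
media.setHWDecoderEnabled(true, false);
mediaPlayer.setMedia(media);
media.release();
mediaPlayer.play();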

Get the latest image from IP Camera in Java

I'm working on a Java app that will process the stream from an IP camera (Milesight MS-C2682-P) located on the local network. It will detect objects and trigger actions depending on what's in the image (let's say it will start an alarm when a person is detected), so I need it to work with minimal delay.
I have an RTSP link "rtsp://username:password@ip_addr:rtsp_port/main" to access the stream from my IP camera, but in my Java app there is a 12-second delay (and it keeps increasing). This happens when images are not handled fast enough, so they get buffered. There are "hacks" and "workarounds" (OpenCV VideoCapture lag due to the capture buffer), but I believe there has to be a prettier solution.
The other link I was able to get is an HTTP one that also uses the H.264 codec (it can also serve MJPEG and MPEG4, if there is a way to use them effectively): "http://username:password@ip_addr:http_port/ipcam/mjpeg.cgi". It works like a charm in Python and in the browser. However, it doesn't work in Java; an error is thrown:
OpenCV(4.2.0) C:\build\master_winpack-bindings-win64-vc14-static\opencv\modules\videoio\src\cap_images.cpp:253: error: (-5:Bad argument) CAP_IMAGES: can't find starting number (in the name of file): HTTP_URL in function 'cv::icvExtractPattern'
Both links work smoothly in VLC.
So the network is not a problem (since VLC handles the stream with minimal delay), and Python with OpenCV is also doing a good job. It all comes down to the Java implementation of OpenCV, I guess.
Here is the Java code:
VideoPlayer videoPlayer = new VideoPlayer(); // My Class, just creates and updates JFrame, works like a charm with laptop's webcam, so certainly no issues here
Mat image = new Mat();
VideoCapture ipCamera = new VideoCapture(RTSP_URL);
// or the HTTP link
// VideoCapture ipCamera = new VideoCapture(HTTP_URL);
// verify if u got access to camera
if (!ipCamera.isOpened()) {
    System.out.println("ERROR: Camera isn't working !!! ");
    return;
}
System.out.println("OK: Connected to camera.");
while (true) {
    ipCamera.read(image);
    videoPlayer.updateVideo_MatImage(image);
}
And this is the Python code I'm using:
import cv2
cap = cv2.VideoCapture(RTSP_URL)
# or the HTTP link
# cap = cv2.VideoCapture(HTTP_URL)
while True:
    ret, image = cap.read()
    cv2.imshow("Test", image)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break
cv2.destroyAllWindows()
I just need to get the latest image when a request is made, so I need to avoid any kind of buffering. It has to be implemented in Java, since that's a requirement for this project.
So, is there a way to get only the latest image from the camera?
What could cause the error mentioned above?
Thank you guys for any advice.
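Since the buffer is already identified as the culprit, here is a rough sketch of the usual "keep only the newest frame" workaround, written against the Java OpenCV bindings used above (the class name and everything outside the OpenCV calls are made up for this example): a background thread drains the stream continuously, and the detection code copies the most recent frame only when it needs one.
import org.opencv.core.Mat;
import org.opencv.videoio.VideoCapture;

// Keeps only the most recent frame from the camera. A background thread reads
// frames as fast as they arrive so the capture buffer never fills up; callers
// ask for the latest frame on demand.
public class LatestFrameGrabber implements AutoCloseable {

    private final VideoCapture capture;
    private final Mat latest = new Mat();
    private final Object lock = new Object();
    private volatile boolean running = true;

    public LatestFrameGrabber(String url) {
        capture = new VideoCapture(url);
        if (!capture.isOpened()) {
            throw new IllegalStateException("Cannot open stream: " + url);
        }
        Thread grabber = new Thread(() -> {
            Mat frame = new Mat();
            while (running) {
                if (capture.read(frame)) {       // drain frames continuously
                    synchronized (lock) {
                        frame.copyTo(latest);    // overwrite, never queue
                    }
                }
            }
        }, "frame-grabber");
        grabber.setDaemon(true);
        grabber.start();
    }

    // Returns a copy of the newest frame seen so far (may be empty at start-up).
    public Mat latestFrame() {
        synchronized (lock) {
            return latest.clone();
        }
    }

    @Override
    public void close() {
        running = false;
        capture.release();
    }
}
The detection loop then calls latestFrame() at its own pace and never falls behind the camera.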

How can I play Alert.startAudio() through the phone speaker?

I need to play Alert.startAudio() through the loudspeaker of a BlackBerry device instead of the headset.
I know that I can change the audio path through the AudioPathControl interface, but I don't know how to get an instance of AudioPathControl.
I found a LINK on how to do it in the BlackBerry Knowledge Base, but it only tells me how to do it using the Player class, which I don't want to do. Is there any way to get an instance of AudioPathControl for the current application?
I would prefer to play a tone programmatically instead of including my own sound file. I found the following code snippet for that.
Player p = javax.microedition.media.Manager.createPlayer(javax.microedition.media.Manager.TONE_DEVICE_LOCATOR);
p.realize();
ToneControl tc = (ToneControl) p.getControl("ToneControl");
AudioPathControl apc = (AudioPathControl) p
.getControl("AudioPathControl");
apc.setAudioPath(AudioPathControl.AUDIO_PATH_HANDSFREE);
tc.setSequence(mySequence);
p.start();
But the problem is that apc is null, which causes an exception. Any solution?
Check the section "Where Does the Sound Go?" (preview available on Google Books) in Advanced BlackBerry 6 Development by Chris King.

Take Screenshot of Android screen and save to SD card

There are a few questions here on SO about capturing screenshots of an Android application. However, I haven't found a solid solution for taking a screenshot programmatically using the Android SDK or any other method.
So I thought I would ask this question again in the hope that I can find a good solution, hopefully one that allows capturing full-length images that I can save to the SD card or somewhere similar.
I appreciate any help.
This is not possible directly on the device/emulator, unless it is rooted.
To be honest, all I need it for is the emulator, as this is for a testing application on a PC.
This sounds like a job for monkeyrunner.
The monkeyrunner tool can do the job for you, with a bit of adb on the side. [Python script]
from com.android.monkeyrunner import MonkeyRunner, MonkeyDevice
import subprocess

# wait for a device connection
device = MonkeyRunner.waitForConnection()

# take the current snapshot (returns a MonkeyImage)
snapshot = device.takeSnapshot()

# store the snapshot as a PNG in the current dir on the PC
snapshot.writeToFile('current.png', 'png')

# copy it to the SD card of the device
subprocess.call('adb push current.png /sdcard/android/com.test.myapp/current.png', shell=True)
Note: run this Jython script with:
monkeyrunner.bat <file name>
You will most likely not be happy with this answer, but the only ones that I have seen involve using native code, or executing native commands.
Edit:
I hadn't seen this one before. Have you tried it?:
http://code.google.com/p/android-screenshot-library/
Edit2: I checked that library, and it is also a poor solution: it requires that you start the service from a PC. So my initial answer still holds :)
Edit3: You should be able to save a view as an image by doing something similar to this. You might need to tweak it a bit to get the width/height of the view. (I'm inflating layouts and specifying the width/height when I lay out the views.)
View content = getView();
Bitmap bitmap = Bitmap.createBitmap(width, height, Config.ARGB_8888);
Canvas canvas = new Canvas(bitmap);
content.draw(canvas); // render the view into the bitmap
File file = new File(pathAndFilename);
file.createNewFile();
FileOutputStream ostream = new FileOutputStream(file);
bitmap.compress(CompressFormat.PNG, 100, ostream);
ostream.close();
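For completeness, the same idea wrapped into a small self-contained helper (standard Android APIs only; the view must already be measured and laid out, and the target file is whatever path you choose):
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;

import android.graphics.Bitmap;
import android.graphics.Canvas;
import android.view.View;

public final class ViewCapture {

    // Draws the given view into a bitmap and writes it out as a PNG.
    public static void saveViewAsPng(View view, File target) throws IOException {
        Bitmap bitmap = Bitmap.createBitmap(view.getWidth(), view.getHeight(),
                Bitmap.Config.ARGB_8888);
        Canvas canvas = new Canvas(bitmap);
        view.draw(canvas); // render the view into the bitmap

        FileOutputStream out = new FileOutputStream(target);
        try {
            bitmap.compress(Bitmap.CompressFormat.PNG, 100, out);
        } finally {
            out.close();
        }
    }
}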
You can look at http://codaset.com/jens-riboe/droidatscreen/wiki (with a write-up at http://blog.ribomation.com/2010/01/droidscreen/): this is a Java library that uses adb to capture screenshots. I've been able to (with a lot of elbow grease) modify the source to let me automatically capture a timed series of screenshots (which I use for demo videos).
You can see the class structure at http://pastebin.com/hX5rQsSR
EDIT: You'd invoke it (after bundling all the requirements) like this:
java -cp DroidScreen.jar --adb "" --device "" --prefix "" --interval
