vlcj webcam stream in java

I am trying to make a simple program that will live stream from my webcam.
public static void main(String[] args) throws Exception {
    int port = nextAvailable();
    //String media = "/root/Desktop/525600.mp4";
    String media = "/dev/video0";
    String[] options = {":sout=#duplicate{dst=rtp{sdp=rtsp://:" + port + "/stream},dst=display}", ":sout-all", ":sout-keep"};
    MediaPlayerFactory mediaPlayerFactory = new MediaPlayerFactory(args);
    HeadlessMediaPlayer mediaPlayer = mediaPlayerFactory.newHeadlessMediaPlayer();
    mediaPlayer.playMedia(media, options);
    System.out.println("Using port: " + port);
    Thread.currentThread().join();
}
If I use the commented-out media (/root/Desktop/525600.mp4), the stream works without any issues. However, I do not know how to stream from the webcam. I tried /dev/video0 but it gives the following errors:
[00007fae70008f78] core access error: read error: Invalid argument
[00007fae70008f78] filesystem access error: read error: Invalid argument
[00007fae7000d3d8] core stream error: cannot pre fill buffer
What am I doing wrong?

You can also refer to this example:
https://github.com/sarxos/webcam-capture/tree/master/webcam-capture-examples/webcam-capture-live-streaming

I simply replaced
String media = "/dev/video0";
with
String media = "v4l2:///dev/video0";
and it works now.
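For reference, here is the full corrected program as a minimal sketch; it assumes vlcj 3.x (import paths differ in other versions), uses a fixed port in place of the nextAvailable() helper from the question, and keeps the same :sout options.

import uk.co.caprica.vlcj.player.MediaPlayerFactory;
import uk.co.caprica.vlcj.player.headless.HeadlessMediaPlayer;

public class WebcamRtspStream {

    public static void main(String[] args) throws Exception {
        int port = 5555; // or nextAvailable(), as in the original code
        // v4l2:// tells libVLC to use the Video4Linux2 capture module
        // instead of treating /dev/video0 as a plain file.
        String media = "v4l2:///dev/video0";
        String[] options = {
            ":sout=#duplicate{dst=rtp{sdp=rtsp://:" + port + "/stream},dst=display}",
            ":sout-all",
            ":sout-keep"
        };
        MediaPlayerFactory factory = new MediaPlayerFactory();
        HeadlessMediaPlayer player = factory.newHeadlessMediaPlayer();
        player.playMedia(media, options);
        System.out.println("Streaming webcam on rtsp://<host>:" + port + "/stream");
        Thread.currentThread().join(); // keep the JVM alive while streaming
    }
}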

Related

GCP Speech to text - Java API not working

I have a sample .webm file recorded using MediaRecorder in the Chrome browser. When I use the Google Speech Java client to get a transcription for the video, it returns an empty transcription. Here is what my code looks like:
SpeechSettings settings = null;
Path path = Paths.get("D:\\scrap\\gcp_test.webm");
byte[] content = null;
try {
    content = Files.readAllBytes(path);
    settings = SpeechSettings.newBuilder().setCredentialsProvider(credentialsProvider).build();
} catch (IOException e1) {
    throw new IllegalStateException(e1);
}
try (SpeechClient speech = SpeechClient.create(settings)) {
    // Builds the request for the audio content
    RecognitionConfig config = RecognitionConfig.newBuilder()
        .setEncoding(AudioEncoding.LINEAR16)
        .setLanguageCode("en-US")
        .setUseEnhanced(true)
        .setModel("video")
        .setEnableAutomaticPunctuation(true)
        .setSampleRateHertz(48000)
        .build();
    RecognitionAudio audio = RecognitionAudio.newBuilder().setContent(ByteString.copyFrom(content)).build();
    // RecognitionAudio audio = RecognitionAudio.newBuilder().setUri("gs://xxxx/gcp_test.webm").build();
    // Use blocking call for getting audio transcript
    RecognizeResponse response = speech.recognize(config, audio);
    List<SpeechRecognitionResult> results = response.getResultsList();
    for (SpeechRecognitionResult result : results) {
        SpeechRecognitionAlternative alternative = result.getAlternativesList().get(0);
        System.out.printf("Transcription: %s%n", alternative.getTranscript());
    }
} catch (Exception e) {
    e.printStackTrace();
    System.err.println(e.getMessage());
}
If I use the same file and upload it in the demo section at https://cloud.google.com/speech-to-text/, it seems to work fine and shows the transcription. I am clueless about what is going wrong here.
I verified the request sent by the demo and I am sending the exact same set of parameters, but that didn't work. I also tried uploading the file to Cloud Storage, but that too gave the same result (no transcription).
After some trial and error (and looking at the JavaScript samples), I was able to solve the issue. The serialized audio should be in FLAC format. I was sending the video file (webm) as-is to Google Cloud. The demo on the site extracts the audio stream using the JavaScript Audio API and then sends the data in base64 format, which is why it works.
Here are the steps that I executed to get the output:
1. Use FFmpeg to extract the audio stream from the webm file into FLAC format:
ffmpeg -i sample.webm -vn -acodec flac sample.flac
2. Make the extracted file available either via Cloud Storage or send it as a ByteString.
3. Set the appropriate model when calling the Speech API (for English the "video" model works, while for French it is "command_and_search"). I don't have any logical reason for this; I realised it after trial and error with the demo on the Google Cloud site.
With the FLAC-encoded file I got results.
Sample code that returns words with timestamps:
public class SpeechToTextSample {
    public static void main(String... args) throws Exception {
        try (SpeechClient speechClient = SpeechClient.create()) {
            String gcsUriFlac = "gs://yourfile.flac";
            RecognitionConfig config =
                RecognitionConfig.newBuilder()
                    .setEncoding(AudioEncoding.FLAC)
                    .setEnableWordTimeOffsets(true)
                    .setLanguageCode("en-US")
                    .build();
            RecognitionAudio audio = RecognitionAudio.newBuilder().setUri(gcsUriFlac).build(); // for large files
            OperationFuture<LongRunningRecognizeResponse, LongRunningRecognizeMetadata> response =
                speechClient.longRunningRecognizeAsync(config, audio);
            while (!response.isDone()) {
                System.out.println("Waiting for response...");
                Thread.sleep(1000);
            }
            // Performs speech recognition on the audio file
            List<SpeechRecognitionResult> results = response.get().getResultsList();
            for (SpeechRecognitionResult result : results) {
                SpeechRecognitionAlternative alternative = result.getAlternativesList().get(0);
                System.out.printf("Transcription: %s%n", alternative.getTranscript());
                for (WordInfo wordInfo : alternative.getWordsList()) {
                    System.out.println(wordInfo.getWord());
                    System.out.printf(
                        "\t%s.%s sec - %s.%s sec\n",
                        wordInfo.getStartTime().getSeconds(),
                        wordInfo.getStartTime().getNanos() / 100000000,
                        wordInfo.getEndTime().getSeconds(),
                        wordInfo.getEndTime().getNanos() / 100000000);
                }
            }
        }
    }
}
GCP supports many languages; I have used "en-US" for my example.
Refer to the Speech-to-Text language support documentation for the full list.
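To illustrate point 3, here is a rough sketch of the two configurations described above; the fr-FR language code and the class wrapper are illustrative assumptions, and the model-per-language pairing is only what trial and error suggested, not an official mapping.

import com.google.cloud.speech.v1.RecognitionConfig;
import com.google.cloud.speech.v1.RecognitionConfig.AudioEncoding;

public class RecognitionConfigExamples {

    // English video: the "video" model produced results in the tests above.
    static RecognitionConfig englishVideoConfig() {
        return RecognitionConfig.newBuilder()
                .setEncoding(AudioEncoding.FLAC)
                .setLanguageCode("en-US")
                .setModel("video")
                .build();
    }

    // French audio: only "command_and_search" produced results in the tests above.
    static RecognitionConfig frenchConfig() {
        return RecognitionConfig.newBuilder()
                .setEncoding(AudioEncoding.FLAC)
                .setLanguageCode("fr-FR")
                .setModel("command_and_search")
                .build();
    }
}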

How to pass video from IP camera to LiveStream on Youtube?

I am trying to send video captured from an IP camera (a stream from the IP Webcam app) through vlcj. My stream can be grabbed from http://<phoneIP>:8080/video
How can I send the video through Java to YouTube using the YouTube Live Streaming API?
I read the documentation for the YouTube Live Streaming API and the YouTube Data API v3, and so far I've managed to upload a video to my channel by using the code they provide.
public static void main(String[] args) throws GeneralSecurityException, IOException, GoogleJsonResponseException {
    YouTube youtubeService = getService();
    // Define the Video object, which will be uploaded as the request body.
    Video video = new Video();
    // Add the snippet object property to the Video object.
    VideoSnippet snippet = new VideoSnippet();
    Random rand = new Random();
    snippet.setCategoryId("22");
    snippet.setDescription("Description of uploaded video.");
    snippet.setTitle("Test video upload. " + rand.nextInt());
    video.setSnippet(snippet);
    // Add the status object property to the Video object.
    VideoStatus status = new VideoStatus();
    status.setPrivacyStatus("unlisted");
    video.setStatus(status);
    File mediaFile = new File(FILE_PATH);
    InputStreamContent mediaContent = new InputStreamContent("video/*",
            new BufferedInputStream(new FileInputStream(mediaFile)));
    mediaContent.setLength(mediaFile.length());
    // Define and execute the API request
    YouTube.Videos.Insert request = youtubeService.videos().insert("snippet,status",
            video, mediaContent);
    Video response = request.execute();
    System.out.println(response);
}
But the code they present for creating a live stream does not show the part where you actually stream content.
Thanks!
EDIT 1 25.06.2019/17:00
I found the field named ingestion address and completed it like this:
cdn.setIngestionInfo(new IngestionInfo().setIngestionAddress("http://192.168.0.100:8080/video"));, but nothing shows up in YouTube Studio when I run the app.
After some digging, I found out that LiveBroadcast is a larger object than LiveStream and can embed a LiveStream. So far, I took the code from the LiveBroadcast insert docs, presented below.
public static void main(String[] args)
        throws GeneralSecurityException, IOException, GoogleJsonResponseException {
    YouTube youtubeService = getService();
    // Define the LiveBroadcast object, which will be uploaded as the request body.
    LiveBroadcast liveBroadcast = new LiveBroadcast();
    LiveStream liveStream = new LiveStream();
    // Add the contentDetails object property to the LiveBroadcast object.
    LiveBroadcastContentDetails contentDetails = new LiveBroadcastContentDetails();
    contentDetails.setEnableClosedCaptions(true);
    contentDetails.setEnableContentEncryption(true);
    contentDetails.setEnableDvr(true);
    contentDetails.setEnableEmbed(true);
    contentDetails.setRecordFromStart(true);
    liveBroadcast.setContentDetails(contentDetails);
    // Add the snippet object property to the LiveBroadcast object.
    LiveBroadcastSnippet snippet = new LiveBroadcastSnippet();
    snippet.setScheduledStartTime(new DateTime("2019-06-25T17:00:00+03:00"));
    snippet.setScheduledEndTime(new DateTime("2019-06-25T17:05:00+03:00"));
    snippet.setTitle("Test broadcast");
    liveBroadcast.setSnippet(snippet);
    // Add the status object property to the LiveBroadcast object.
    LiveBroadcastStatus status = new LiveBroadcastStatus();
    status.setPrivacyStatus("unlisted");
    liveBroadcast.setStatus(status);
    // Define and execute the API request
    YouTube.LiveBroadcasts.Insert request = youtubeService.liveBroadcasts()
            .insert("snippet,contentDetails,status", liveBroadcast);
    LiveBroadcast response = request.execute();
    System.out.println(response);
}
After running the code above, I got a result in YouTube Studio.
Now I don't know how to combine the two, or how to integrate the LiveStream into the LiveBroadcast so I can stream content from my phone.
Thanks again!
EDIT 2 25.06.2019/17:25
I found a function that can bind a stream to a broadcast, but when I open the Live Control Room it still does not look right.
I still haven't managed to bind them, but I think I am getting closer. Can someone push me in the right direction here?
The LiveStream is essentially a collection of metadata that the YouTube API uses to be aware of your stream and to hold information about it.
Part of that information is the CDN URL that you must send your actual video stream from your camera to (see https://developers.google.com/youtube/v3/live/docs/liveStreams).
You can see an answer with an example of using this here: https://stackoverflow.com/a/29653174/334402
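For the binding step specifically, a minimal sketch along these lines (using the Data API's liveStreams.insert and liveBroadcasts.bind methods) might look like the following; youtubeService and the broadcast returned by the insert call in the question are assumptions taken from the question's code, and the CDN settings are only examples. Note that the ingestion address YouTube returns is where your encoder must push the actual video; YouTube will not pull from the phone's http://...:8080/video URL.

import com.google.api.services.youtube.YouTube;
import com.google.api.services.youtube.model.CdnSettings;
import com.google.api.services.youtube.model.LiveBroadcast;
import com.google.api.services.youtube.model.LiveStream;
import com.google.api.services.youtube.model.LiveStreamSnippet;

import java.io.IOException;

public class BindStreamToBroadcast {

    // 'youtubeService' and 'broadcastFromInsert' are assumed to come from the
    // question's code: getService() and the LiveBroadcast returned by insert().
    static void bindStream(YouTube youtubeService, LiveBroadcast broadcastFromInsert) throws IOException {
        // Create the LiveStream resource that holds the CDN/ingestion settings.
        LiveStream liveStream = new LiveStream();

        LiveStreamSnippet streamSnippet = new LiveStreamSnippet();
        streamSnippet.setTitle("Test stream");
        liveStream.setSnippet(streamSnippet);

        CdnSettings cdn = new CdnSettings();
        cdn.setFormat("720p");
        cdn.setIngestionType("rtmp");
        liveStream.setCdn(cdn);

        LiveStream returnedStream = youtubeService.liveStreams()
                .insert("snippet,cdn", liveStream)
                .execute();

        // Bind the new stream to the broadcast created earlier.
        YouTube.LiveBroadcasts.Bind bindRequest = youtubeService.liveBroadcasts()
                .bind(broadcastFromInsert.getId(), "id,contentDetails");
        bindRequest.setStreamId(returnedStream.getId());
        LiveBroadcast boundBroadcast = bindRequest.execute();
        System.out.println("Bound broadcast: " + boundBroadcast.getId());

        // YouTube gives back the RTMP ingestion address; your encoder must push
        // the actual video here, it will not pull from the phone's HTTP URL.
        String ingestionAddress = returnedStream.getCdn().getIngestionInfo().getIngestionAddress();
        String streamName = returnedStream.getCdn().getIngestionInfo().getStreamName();
        System.out.println("Push your video to: " + ingestionAddress + "/" + streamName);
    }
}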

Jetty Pi4J I2C error opening /dev/i2c-1

I'm trying to send data from a Raspberry Pi to an Arduino via I2C.
When I execute the code as a standalone Java application, I'm able to send and receive data with no problem (the code below gives the expected result).
public static void main(String[] args) throws Exception {
    // get I2C bus instance
    final I2CBus bus = I2CFactory.getInstance(I2CBus.BUS_1);
    I2CDevice arduino = bus.getDevice(0x04);
    byte[] buffer = new byte[1];
    buffer[0] = 1;
    arduino.write(buffer, 0, buffer.length);
    Thread.sleep(100);
    buffer[0] = 0;
    int number = arduino.read(buffer, 0, 1);
}
Then I try the same code, but this time inside a servlet running under Jetty on the Raspberry Pi, and I get the following error:
java.io.IOException: Cannot open file handle for /dev/i2c-1 got -1 back.
at com.pi4j.io.i2c.impl.I2CBusImpl.<init>(I2CBusImpl.java:96)
at com.pi4j.io.i2c.impl.I2CBusImpl.getBus(I2CBusImpl.java:70)
at com.pi4j.io.i2c.I2CFactory.getInstance(I2CFactory.java:56)..
Does anyone know what may be happening?
Regards,
Could it be that in one case your process has sudo rights and it doesn't in the other case?
The answer is
I2CFactory.getInstance(I2CBus.BUS_0);
In some cases the bus is inverted, so try this; I hope it helps :)
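If you are not sure which bus number your board exposes, a minimal sketch like the one below (same Pi4J classes as in the question) can probe BUS_1 first and fall back to BUS_0. It does not rule out the permissions difference mentioned above, since Jetty may simply run as a different user than your standalone test.

import com.pi4j.io.i2c.I2CBus;
import com.pi4j.io.i2c.I2CDevice;
import com.pi4j.io.i2c.I2CFactory;

public class I2CBusProbe {

    public static void main(String[] args) throws Exception {
        I2CBus bus;
        try {
            // Newer Raspberry Pi revisions expose the header I2C pins on bus 1.
            bus = I2CFactory.getInstance(I2CBus.BUS_1);
        } catch (Exception e) {
            // Older boards (and some setups) use bus 0 instead.
            System.out.println("Bus 1 not available, falling back to bus 0: " + e.getMessage());
            bus = I2CFactory.getInstance(I2CBus.BUS_0);
        }
        I2CDevice arduino = bus.getDevice(0x04); // same Arduino address as in the question
        arduino.write((byte) 1);
        System.out.println("Write succeeded.");
        bus.close();
    }
}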

JNetPcap in Eclipse does not print error... Ubuntu 12.04

I have some problems with JNetPcap.
I use Ubuntu 12.04 and am trying to make a packet sniffer based on Java.
What I did is below.
I downloaded JNetPcap 1.3.0 and, as the tutorial said, built a Java project.
http://jnetpcap.com/examples/dumper <- this is the link.
I typed the code just like that link and got my first problem: the PcapHandler class is deprecated, so I checked the documentation and replaced it with ByteBufferHandler.
When I then compiled the project I got an UnsatisfiedLinkError. I tried loading the library in a static block, and after some attempts I copied "libjnetpcap.so" to /usr/lib/, which removed the UnsatisfiedLinkError.
But somehow it now stops at the 1st error check: it prints "1st error check : " and then exits automatically.
public static void main(String[] args) {
    List<PcapIf> alldevs = new ArrayList<PcapIf>();
    StringBuilder errbuff = new StringBuilder();
    int r = Pcap.findAllDevs(alldevs, errbuff);
    //============1st check
    if (r == Pcap.NOT_OK || alldevs.isEmpty()) {
        System.err.printf("1st error check : %s\n", errbuff.toString());
        return;
    }
    PcapIf device = alldevs.get(1);
    //===================== END
    int snaplen = 64 * 1024;
    int flags = Pcap.MODE_PROMISCUOUS;
    int timeout = 10 * 1000;
    Pcap pcap = Pcap.openLive(device.getName(), snaplen, flags, timeout, errbuff);
    //============2nd check
    if (pcap == null) {
        System.err.printf("2nd error check : %s\n", errbuff.toString());
        return;
    }
    //===================== END
    String ofile = "/home/juneyoungoh/tmp_capture_file.cap";
    final PcapDumper dumper = pcap.dumpOpen(ofile);
    ByteBufferHandler<PcapDumper> handler = new ByteBufferHandler<PcapDumper>() {
        @Override
        public void nextPacket(PcapHeader arg0, ByteBuffer arg1, PcapDumper arg2) {
            dumper.dump(arg0, arg1);
        }
    };
    pcap.loop(10, handler, dumper);
    File file = new File(ofile);
    System.out.printf("%s file has %d bytes in it!\n", ofile, file.length());
    dumper.close();
    pcap.close();
    if (file.exists()) {
        file.delete();
    }
}
If there is any good reference or helpful idea, please share.
Thanks.
On Linux, a program will probably have to run as root, or with sufficient privileges granted in some other fashion, in order to be able to open any devices, and, currently, pcap_findalldevs(), which is presumably what the Pcap.findAllDevs method uses, tries to open each of the devices it finds, and only returns the devices it can open.
So you'll have to run your Java program as root, or will somehow have to arrange that it have sufficient privileges (CAP_NET_RAW and CAP_NET_ADMIN) to get a list of network adapters and open those adapters.
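As a quick way to check whether privileges are the problem, a small sketch like this (same jnetpcap calls as in the question) can be run once as a normal user and once as root; if devices only show up in the root run, privileges are the cause.

import java.util.ArrayList;
import java.util.List;

import org.jnetpcap.Pcap;
import org.jnetpcap.PcapIf;

public class ListDevices {

    public static void main(String[] args) {
        List<PcapIf> alldevs = new ArrayList<PcapIf>();
        StringBuilder errbuff = new StringBuilder();

        // Same call as in the question; without sufficient privileges this
        // typically returns an empty list (and may leave errbuff empty too).
        int r = Pcap.findAllDevs(alldevs, errbuff);
        if (r == Pcap.NOT_OK || alldevs.isEmpty()) {
            System.err.printf("No devices found: %s%n", errbuff);
            System.err.println("Try running again as root (sudo java ...).");
            return;
        }
        for (PcapIf device : alldevs) {
            System.out.printf("%s - %s%n", device.getName(), device.getDescription());
        }
    }
}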

I can not start the player in my code

I am trying to launch my code and start the player, but I cannot do that.
import javax.media.*;
import java.io.*;

public class MP3Player {
    public static void main(String[] args) throws Exception {
        File file = new File("c://player/trigger.mpg");
        MediaLocator mrl = new MediaLocator(file.toURL());
        Player player = Manager.createPlayer(mrl);
        player.start();
    }
}
[Edit by Philipp]
According to a comment by the original author, Netbeans prints the following error message:
Unable to handle format: MPEG, 160x120, FrameRate=30.0, Length=28800
Failed to realize: com.sun.media.PlaybackEngine@131f71a
Error: Unable to realize com.sun.media.PlaybackEngine@131f71a
BUILD SUCCESSFUL (total time: 1 second)
[/Edit by Philipp]
I don't know JMF player at all, but I assume the problem is that the code exits immediately after issuing the command, terminating any other threads...
I'd try inserting a Thread.sleep(1000); after player.start(); :
public class MP3Player {
    public static void main(String[] args) throws Exception {
        File file = new File("c:/player/trigger.mpg");
        MediaLocator mrl = new MediaLocator(file.toURL());
        Player player = Manager.createPlayer(mrl);
        player.start();
        Thread.sleep(1000);
    }
}
If now the first second of the MP3 is heard, this was the issue.
EDIT Also, someone pointed out problems with the slashes; the path should be correct too, but the slash is not missing, rather there is one too many of them...
EDIT2 OK, I misread mpg as mp3, and now that the poster has posted the error he got: the format of the video is not supported by JMF; you need a codec.
This might be of help: Tek-tips: Play MPEG-4 movie with JMF?
Unable to handle format: MPEG, 160x120, FrameRate=30.0
It is unable to play a video stream it finds. From the description and the name of your code, the file is expected to contain only audio streams in the MP3 compression format (MPEG-1 Audio Layer III). An .mpg extension may contain a lot of different MPEG formats.
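Putting both answers together, here is a minimal sketch (standard JMF API only, example file path assumed) that plays an audio file JMF can handle and keeps the JVM alive until the media ends instead of sleeping for a fixed time.

import java.io.File;
import java.util.concurrent.CountDownLatch;

import javax.media.ControllerEvent;
import javax.media.ControllerListener;
import javax.media.EndOfMediaEvent;
import javax.media.Manager;
import javax.media.MediaLocator;
import javax.media.Player;

public class Mp3PlayerSketch {

    public static void main(String[] args) throws Exception {
        // Example path only: use an audio file JMF can handle,
        // not an MPEG video that needs a codec JMF does not ship with.
        File file = new File("c:/player/trigger.mp3");
        MediaLocator mrl = new MediaLocator(file.toURI().toURL());

        // createRealizedPlayer blocks until the player is realized, which avoids
        // the "Unable to realize" surprise of starting an unrealized player.
        final Player player = Manager.createRealizedPlayer(mrl);

        final CountDownLatch finished = new CountDownLatch(1);
        player.addControllerListener(new ControllerListener() {
            public void controllerUpdate(ControllerEvent event) {
                if (event instanceof EndOfMediaEvent) {
                    finished.countDown();
                }
            }
        });

        player.start();
        finished.await();   // keep main() alive until the media ends
        player.close();
    }
}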
