Convert a class file to an applet - Java

I have a Java class that runs perfectly, but now I need to run it as a web application, so I need to convert this class to an applet. How can I do that?
I know a little bit about applets, such as the life cycle:
init()
start()
paint()
stop()
destroy()
and that an applet is run from an HTML page with a tag like
<applet code="LifeTest.class">
Can anyone help me convert this class to an applet? If that is not possible, any suggestion for a substitute would be welcome.
import java.io.ByteArrayInputStream;
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import javax.sound.sampled.AudioFormat;
import javax.sound.sampled.AudioInputStream;
import javax.sound.sampled.AudioSystem;
import javax.sound.sampled.DataLine;
import javax.sound.sampled.FloatControl;
import javax.sound.sampled.SourceDataLine;
class Server {
AudioInputStream audioInputStream;
static AudioInputStream ais;
static AudioFormat format;
static boolean status = true;
static int port = 50005;
static int sampleRate = 8000;
public static void main(String args[]) throws Exception {
DatagramSocket serverSocket = new DatagramSocket(50005);
/**
* Formula for lag = (byte_size/sample_rate)*2
* Byte size 9728 will produce ~ 0.45 seconds of lag. Voice slightly broken.
* Byte size 1400 will produce ~ 0.06 seconds of lag. Voice extremely broken.
* Byte size 4000 will produce ~ 0.18 seconds of lag. Voice slightly more broken then 9728.
*/
byte[] receiveData = new byte[5000];
format = new AudioFormat(sampleRate, 16, 1, true, false);
while (status == true) {
DatagramPacket receivePacket = new DatagramPacket(receiveData,
receiveData.length);
serverSocket.receive(receivePacket);
ByteArrayInputStream baiss = new ByteArrayInputStream(
receivePacket.getData());
ais = new AudioInputStream(baiss, format, receivePacket.getLength());
toSpeaker(receivePacket.getData());
}
}
public static void toSpeaker(byte soundbytes[]) {
try {
DataLine.Info dataLineInfo = new DataLine.Info(SourceDataLine.class, format);
SourceDataLine sourceDataLine = (SourceDataLine) AudioSystem.getLine(dataLineInfo);
sourceDataLine.open(format);
FloatControl volumeControl = (FloatControl) sourceDataLine.getControl(FloatControl.Type.MASTER_GAIN);
volumeControl.setValue(6.0206f);
sourceDataLine.start();
System.out.println("format? :" + sourceDataLine.getFormat());
sourceDataLine.write(soundbytes, 0, soundbytes.length);
System.out.println(soundbytes.toString());
sourceDataLine.drain();
sourceDataLine.close();
} catch (Exception e) {
System.out.println("Not working in speakers...");
e.printStackTrace();
}
}
}

You need to extend Applet. Let me give you sample code for it.
import java.applet.Applet;
import java.awt.Graphics;
public class HelloWorld extends Applet {
public void paint(Graphics g) {
g.drawString("Hello world!", 50, 25);
}
}
Create a MANIFEST.MF file using a text editor and place it in the same directory as your .java file. Its content should look like this:
Manifest-Version: 1.0
Permissions: all-permissions
Application-Name: Name of your application
Now compile your code and package the classes into a JAR together with the MANIFEST.MF file:
javac HelloWorld.java
jar cvfm MANIFEST.MF HelloWorld.jar *.class
Now create an .html file and place an <applet> tag in it:
<applet name="HelloWorld" code="HelloWorld.class"
archive="HelloWorld.jar" width="100" height="100">
</applet>
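For the Server class in the question, a rough, untested sketch of that conversion (assuming a signed applet, since an unsigned applet may not be allowed to open sockets or access the sound hardware) would move the receive loop onto a background thread that is started in start() and stopped in stop():
import java.applet.Applet;
import java.awt.Graphics;
import java.net.DatagramPacket;
import java.net.DatagramSocket;
public class LifeTest extends Applet implements Runnable {
    private volatile boolean running;
    private DatagramSocket socket;

    public void start() {
        running = true;
        new Thread(this).start(); // begin receiving when the page shows the applet
    }

    public void stop() {
        running = false;
        if (socket != null) {
            socket.close(); // unblocks the blocking receive() call below
        }
    }

    public void paint(Graphics g) {
        g.drawString("Receiving audio...", 20, 20);
    }

    public void run() {
        try {
            socket = new DatagramSocket(50005);
            byte[] receiveData = new byte[5000];
            while (running) {
                DatagramPacket packet = new DatagramPacket(receiveData, receiveData.length);
                socket.receive(packet);
                // hand packet.getData() / packet.getLength() to the existing toSpeaker(...) logic
            }
        } catch (Exception e) {
            if (running) { // closing the socket in stop() ends receive() with an exception; ignore it then
                e.printStackTrace();
            }
        }
    }
}
The HTML page would then reference code="LifeTest.class" exactly as in the question.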

Related

Audio - Streaming Audio from Java is Choppy

My main objective is to create live streaming of encrypted voice chat from a mic.
The encrypted audio is then transmitted over the network from one client to another.
The problem is that the audio always stutters and is choppy while the program is running (streaming).
I have tried:
- different types of hardware (PC, laptop, Raspberry Pi),
- different OSes,
- sending only unencrypted audio, to eliminate any issue caused by the encryption algorithm,
- changing the audio sample rate.
Unfortunately, everything failed.
To keep it simple, I have only included the code needed to transmit the audio over the network, without the encryption.
MAIN CLASS - both sender and receiver
package com.emaraic.securevoice;
import com.emaraic.securevoice.utils.AES;
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;
import javax.sound.sampled.*;
public class SecureVoice
{
public static void main(String[] args)
{
Receiver rx = new Receiver();
rx.start();
Transmitter tx = new Transmitter();
tx.start();
}
public static AudioFormat getAudioFormat()
{ //you may change these parameters to fit your mic
float sampleRate = 8000.0f; //8000,11025,16000,22050,44100
int sampleSizeInBits = 16; //8,16
int channels = 1; //1,2
boolean signed = true; //true,false
boolean bigEndian = false; //true,false
return new AudioFormat(sampleRate, sampleSizeInBits, channels, signed, bigEndian);
}
public static final String ANSI_BOLD = "\033[0;1m"; //not working in NetBeans
public static final String ANSI_RESET = "\033[0m";
public static final String ANSI_BLACK = "\033[30m";
public static final String ANSI_RED = "\033[31m";
public static final String ANSI_GREEN = "\033[32;4m";
public static final String ANSI_YELLOW = "\033[33m";
public static final String ANSI_BLUE = "\033[34m";
public static final String ANSI_PURPLE = "\033[35m";
public static final String ANSI_CYAN = "\033[36m";
public static final String ANSI_WHITE = "\033[37m";
}
SENDER
package com.emaraic.securevoice;
import com.emaraic.securevoice.utils.AES;
import java.io.*;
import java.io.File;
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;
import java.text.SimpleDateFormat;
import java.util.Date;
import javax.sound.sampled.AudioFileFormat;
import javax.sound.sampled.AudioFormat;
import javax.sound.sampled.AudioInputStream;
import javax.sound.sampled.AudioSystem;
import javax.sound.sampled.DataLine;
import javax.sound.sampled.Mixer;
import javax.sound.sampled.Port;
import javax.sound.sampled.TargetDataLine;
public class Transmitter extends Thread
{
// these parameters must be copied and used in the Receiver class of the other client
private static final String TX_IP = "10.101.114.179"; //ip to send to
private static final int TX_PORT = 1034;
@Override
public void run()
{
SecureVoice color = new SecureVoice();
Mixer.Info minfo[] = AudioSystem.getMixerInfo();
System.out.println(color.ANSI_BLUE + "Detecting sound card drivers...");
for (Mixer.Info minfo1 : minfo)
{
System.out.println(" " + minfo1);
}
if (AudioSystem.isLineSupported(Port.Info.MICROPHONE))
{
try
{
DataLine.Info dataLineInfo = new DataLine.Info(TargetDataLine.class, SecureVoice.getAudioFormat());
final TargetDataLine line = (TargetDataLine) AudioSystem.getLine(dataLineInfo); //recording from mic
line.open(SecureVoice.getAudioFormat());
line.start(); //start recording
System.out.println(color.ANSI_GREEN + "Recording...");
byte tempBuffer[] = new byte[line.getBufferSize()];
System.out.println(color.ANSI_BLUE + "Buffer size = " + tempBuffer.length + " bytes");
//AudioCapture audio = new AudioCapture(line); //capture the audio into .wav file
//audio.start();
while (true) //AES encryption
{
int read = line.read(tempBuffer, 0, tempBuffer.length);
byte[] encrypt = AES.encrypt(tempBuffer, 0, read);
// sendToUDP(encrypt);
sendToUDP(tempBuffer);
}
}
catch (Exception e)
{
System.out.println(e.getMessage());
System.exit(0);
}
}
}
public static void sendToUDP(byte soundpacket[])
{
try
{
// EncryptedAudio encrypt = new EncryptedAudio(soundpacket);
// encrypt.start();
DatagramSocket sock = new DatagramSocket();
sock.send(new DatagramPacket(soundpacket, soundpacket.length, InetAddress.getByName(TX_IP), TX_PORT));
sock.close();
}
catch (Exception e)
{
System.out.println(e.getMessage());
}
}
}
RECEIVER
package com.emaraic.securevoice;
import com.emaraic.securevoice.utils.AES;
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;
import java.util.logging.Level;
import java.util.logging.Logger;
import javax.sound.sampled.AudioFormat;
import javax.sound.sampled.AudioSystem;
import javax.sound.sampled.DataLine;
import javax.sound.sampled.SourceDataLine;
public class Receiver extends Thread {
// these parameters must be used in the Transmitter class of the other client
private static final String RX_IP = "localhost";
private static final int RX_PORT = 1034;
@Override
public void run() {
byte b[] = null;
while (true) {
b = rxFromUDP();
speak(b);
}
}
public static byte[] rxFromUDP() {
try {
DatagramSocket sock = new DatagramSocket(RX_PORT);
byte soundpacket[] = new byte[8192];
DatagramPacket datagram = new DatagramPacket(soundpacket, soundpacket.length, InetAddress.getByName(RX_IP), RX_PORT);
sock.receive(datagram);
sock.close();
// return AES.decrypt(datagram.getData(),0,soundpacket.length); // soundpacket ;
return soundpacket; // if you want to hear encrypted form
} catch (Exception e) {
System.out.println(e.getMessage());
return null;
}
}
public static void speak(byte soundbytes[]) {
try {
DataLine.Info dataLineInfo = new DataLine.Info(SourceDataLine.class, SecureVoice.getAudioFormat());
try (SourceDataLine sourceDataLine = (SourceDataLine) AudioSystem.getLine(dataLineInfo)) {
sourceDataLine.open(SecureVoice.getAudioFormat());
sourceDataLine.start();
sourceDataLine.write(soundbytes, 0, soundbytes.length);
sourceDataLine.drain();
}
} catch (Exception e) {
System.out.println(e.getMessage());
}
}
}
EXTRA LINK
http://emaraic.com/blog/secure-voice-chat
IDE Used
- Netbeans 11.1
Java JDK version
- Java 13 (Windows)
- OpenJDK11 (Linux)
There are two problems. Network-streamed data will have jitter in its arrival time, and starting and stopping audio playback causes delay gaps and jitter due to OS and hardware driver overhead. There is also the smaller problem of sample-rate clock synchronization between the recording and playback systems. All of these can disrupt a continuous stream of audio samples at a fixed rate.
To avoid the audio start-up latency problem, don't stop your audio playback or recording system between network packets; always have audio data ready to play continuously at the current sample rate. To help cover network jitter, buffer some amount of audio data before starting playback, so there is always some audio ready to play even if the next network packet is slightly delayed.
You may have to gather some statistics on the audio startup and network latency and latency variation to determine a suitable amount to buffer. The alternative is an audio dropout concealment algorithm, which is far more complicated to implement.
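As a rough illustration of the pre-buffering idea (a sketch only; the class name, field names and the pre-buffer size are invented and would need tuning against measured jitter), the receiver could keep one SourceDataLine open for the whole session and only call start() once enough audio has been queued:
import javax.sound.sampled.AudioFormat;
import javax.sound.sampled.AudioSystem;
import javax.sound.sampled.SourceDataLine;
public class BufferedPlayback {
    private static final int PREBUFFER_BYTES = 16000; // ~1 s at 8 kHz, 16-bit mono; tune from measurements

    private final SourceDataLine line;
    private int buffered;
    private boolean started;

    public BufferedPlayback(AudioFormat format) throws Exception {
        line = AudioSystem.getSourceDataLine(format);
        line.open(format, PREBUFFER_BYTES * 2); // open once; never close between packets
    }

    // Called for every audio packet received from the network.
    public void onPacket(byte[] data, int length) {
        line.write(data, 0, length); // data queues in the line's buffer even before start()
        if (!started) {
            buffered += length;
            if (buffered >= PREBUFFER_BYTES) {
                line.start(); // begin playback only once enough audio is queued to ride out jitter
                started = true;
            }
        }
    }
}
Applied to the Receiver.speak() method above, this means creating and opening the SourceDataLine once, outside the while (true) loop, instead of opening, draining and closing it for every packet.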

NullPointerException when reading audio from JAR

I tried to make a runnable JAR, but for some reason I couldn't get my game to play. I did some research and ran it through my command prompt to find the error, and I got the output below. So I know where the issue is, I just need to fix it. I am new to programming, so I'm not quite sure what this is telling me.
Exception in thread "main" java.lang.NullPointerException
at java.base/java.util.Objects.requireNonNull(Unknown Source)
at java.desktop/javax.sound.sampled.AudioSystem.getAudioInputStream(Unknown Source)
at builder.AudioPlayer.playMenuSound(AudioPlayer.java:20)
at builder.Game.<init>(Game.java:56)
at builder.Game.main(Game.java:61)
package builder;
import java.io.File;
import java.io.IOException;
import javax.sound.sampled.AudioInputStream;
import javax.sound.sampled.AudioSystem;
import javax.sound.sampled.Clip;
import javax.sound.sampled.FloatControl;
import javax.sound.sampled.LineUnavailableException;
import javax.sound.sampled.UnsupportedAudioFileException;
public class AudioPlayer {
private static Clip play;
public static void playMenuSound()
{
try {
//AudioInputStream menuSound = AudioSystem.getAudioInputStream(new File("src/res/introSong.wav")); //Take in audio from res folder
AudioInputStream menuSound = AudioSystem.getAudioInputStream(AudioPlayer.class.getClassLoader().getResourceAsStream("res/introSong.wav"));
play = AudioSystem.getClip(); //
play.open(menuSound); //Play the sound
FloatControl volume = (FloatControl) play.getControl(FloatControl.Type.MASTER_GAIN); //Get control of volume
volume.setValue(1.0f); //0.0 - 1.0 volume
play.loop(Clip.LOOP_CONTINUOUSLY); //Loop once clip is over
}catch (LineUnavailableException | IOException | UnsupportedAudioFileException e){
e.printStackTrace();
}
}
public static void playGameSound()
{
try {
//AudioInputStream gameSound = AudioSystem.getAudioInputStream(new File("src/res/inGame.wav")); //Take in audio from res folder
AudioInputStream gameSound = AudioSystem.getAudioInputStream(AudioPlayer.class.getClassLoader().getResourceAsStream("res/inGame.wav"));
play = AudioSystem.getClip(); //
play.open(gameSound); //Play the sound
FloatControl volume = (FloatControl) play.getControl(FloatControl.Type.MASTER_GAIN); //Get control of volume
volume.setValue(0.5f); //0.0 - 1.0 volume
play.loop(Clip.LOOP_CONTINUOUSLY); //Loop once clip is over
}catch (LineUnavailableException | IOException | UnsupportedAudioFileException e){
e.printStackTrace();
}
}
public static void stopMusic()
{
play.close(); //Stop music
}
}
Your sound file inGame.wav is located in a directory res inside your source directory.
When you export it, the sources will be compiled and copied to the JAR.
This results in the sound file being in a subdirectory res inside the JAR.
You try to read the file inGame.wav but you have to read res/inGame.wav instead.
The second problem is that the system cannot set marks on the InputStream returned by getResourceAsStream(). This can be solved by changing getResourceAsStream() to getResource():
AudioInputStream menuSound = AudioSystem.getAudioInputStream(AudioPlayer.class.getClassLoader().getResource("res/inGame.wav"));
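For completeness, a sketch of playMenuSound() with that change applied (the same edit works for playGameSound(); this drops into the existing AudioPlayer class and uses its imports):
public static void playMenuSound() {
    try {
        // getResource() returns a URL, which AudioSystem can open (and rewind) itself
        AudioInputStream menuSound = AudioSystem.getAudioInputStream(
                AudioPlayer.class.getClassLoader().getResource("res/introSong.wav"));
        play = AudioSystem.getClip();
        play.open(menuSound);
        FloatControl volume = (FloatControl) play.getControl(FloatControl.Type.MASTER_GAIN);
        volume.setValue(1.0f);
        play.loop(Clip.LOOP_CONTINUOUSLY);
    } catch (LineUnavailableException | IOException | UnsupportedAudioFileException e) {
        e.printStackTrace();
    }
}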

Xuggler Video To Audio conversion

I am converting video to audio using the Xuggler library in Java. No errors or exceptions arise in the program, but the generated audio file is 0 KB. Could someone help me fix the problem?
Environment: Eclipse Helios, OS: Windows 7
External JAR libraries added to project:
(1)slf4j-api-1.7.7.jar
(2)slf4j-simple-1.7.7.jar
(3)xuggle-xuggler-5.4.jar
Code snippet for video to audio conversion.
import com.xuggle.mediatool.IMediaReader;
import com.xuggle.mediatool.IMediaWriter;
import com.xuggle.mediatool.ToolFactory;
import com.xuggle.xuggler.ICodec;
public class VideoToAudio{
public void convertVideoToAudio(){
IMediaReader reader = ToolFactory.makeReader("D://vid.mp4");
IMediaWriter writer = ToolFactory.makeWriter("D://a.mp3",reader);
int sampleRate = 44100;
int channels = 1;
writer.addAudioStream(0, 0, ICodec.ID.CODEC_ID_MP3, channels, sampleRate);
while (reader.readPacket() == null);
}
public static void main(String [] args){
VideoToAudio vta = new VideoToAudio();
try{
vta.convertVideoToAudio();
}
catch(Exception e){
System.out.println("Could not open video file");
}
}
}
Your program looks fine, you just forgot a line :)
public void convertVideoToAudio(){
IMediaReader reader = ToolFactory.makeReader("D://vid.mp4");
IMediaWriter writer = ToolFactory.makeWriter("D://a.mp3",reader);
int sampleRate = 44100;
int channels = 1;
writer.addAudioStream(1, 0, ICodec.ID.CODEC_ID_MP3, channels, sampleRate);
reader.addListener(writer); // <-- the line that was missing
while (reader.readPacket() == null);
}

Audio recorder problem in java

I have a problem while recording audio. I created a servlet and modified the Java Sound API demo code to some extent, and I can finally record audio. The problem is that when I play the audio back, the total time of the stored audio shows as 645.45 or something like that, even though I recorded for only a couple of minutes. Another problem is that the audio gets saved in the Eclipse directory instead of the project directory.
This is the servlet code.
package com;
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.File;
import java.io.IOException;
import java.io.InputStream;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import javax.sound.sampled.AudioFileFormat;
import javax.sound.sampled.AudioFormat;
import javax.sound.sampled.AudioInputStream;
import javax.sound.sampled.AudioSystem;
import javax.sound.sampled.DataLine;
import javax.sound.sampled.Clip;
import javax.sound.sampled.LineUnavailableException;
import javax.sound.sampled.SourceDataLine;
import javax.sound.sampled.TargetDataLine;
public class SoundRecorder extends HttpServlet {
private static final long serialVersionUID = 1L;
static protected boolean running;
static ByteArrayOutputStream out;
double fileName = Math.random();
//strFilename = nowLong.toString();
public SoundRecorder() {
System.out.println("Filename will be..." + fileName + ".wav");
}
public void init() {
}
public void destroy() {
}
public void doGet(HttpServletRequest request, HttpServletResponse response)
throws ServletException, IOException {
System.out.println("call received..");
String method = request.getParameter("method");
System.out.println(method);
if("record".equalsIgnoreCase(method)) {
captureAudio(true);
}
else if("stop".equalsIgnoreCase(method)) {
captureAudio(false);
}
else if("play".equalsIgnoreCase(method)) {
System.out.println("yet to write");
playAudio();
}
}
public void doPost(HttpServletRequest request, HttpServletResponse response)
throws ServletException, IOException {
System.out.println("call received..");
String method = request.getParameter("method");
System.out.println(method);
doGet(request, response);
}
private void captureAudio(boolean capturing) {
File outputFile = new File(fileName + ".wav");
AudioFormat audioFormat = new AudioFormat(AudioFormat.Encoding.PCM_SIGNED,44100.0F, 16, 2, 4, 44100.0F, false);
DataLine.Info info = new DataLine.Info(TargetDataLine.class, audioFormat);
TargetDataLine targetDataLine = null;
try
{
targetDataLine = (TargetDataLine) AudioSystem.getLine(info);
targetDataLine.open(audioFormat);
}
catch (LineUnavailableException e)
{
System.out.println("unable to get a recording line");
e.printStackTrace();
System.exit(1);
}
AudioFileFormat.Type targetType = AudioFileFormat.Type.WAVE;
final Recorder recorder = new Recorder(targetDataLine,targetType,outputFile);
System.out.println("Recording...");
if(capturing){
recorder.start();
}
else {
recorder.stopRecording();
}
}
private void playAudio() {
try {
File file = new File(fileName + ".wav");
AudioInputStream stream = AudioSystem.getAudioInputStream(file);
AudioFormat format = stream.getFormat();
DataLine.Info info = new DataLine.Info(Clip.class, stream.getFormat());
Clip clip = (Clip) AudioSystem.getLine(info);
clip.open(stream);
clip.start();
} catch (Exception e) {
System.err.println("Line unavailable: " + e);
System.exit(-4);
}
}
}
And this is the recorder class
public class Recorder extends Thread {
private TargetDataLine m_line;
private AudioFileFormat.Type m_targetType;
private AudioInputStream m_audioInputStream;
private File m_outputFile;
public Recorder(TargetDataLine line,
AudioFileFormat.Type targetType,
File file)
{
m_line = line;
m_audioInputStream = new AudioInputStream(line);
m_targetType = targetType;
m_outputFile = file;
}
/** Starts the recording.
To accomplish this, (i) the line is started and (ii) the
thread is started.
*/
public void start()
{
m_line.start();
super.start();
}
/** Stops the recording.
*/
public void stopRecording()
{
m_line.stop();
m_line.close();
}
/** Main working method.
*/
public void run()
{
try
{
AudioSystem.write(
m_audioInputStream,
m_targetType,
m_outputFile);
}
catch (IOException e)
{
e.printStackTrace();
}
}
private static void closeProgram()
{
System.out.println("Program closing.....");
System.exit(1);
}
private static void out(String strMessage)
{
System.out.println(strMessage);
}
}
When developing with servlets, you need to realize that there's only one servlet instance throughout the whole webapp's lifetime, from startup until shutdown. So, the HTTP requests from all visitors, all sessions, all browser windows/tabs, etc will all share the same servlet instance. Also, when you make a variable static, it will be shared among all instances of the same class (which is not really relevant here since there's only one servlet instance anyway).
In other words, those variables which you've declared in the servlet are not threadsafe:
static protected boolean running;
static ByteArrayOutputStream out;
double fileName = Math.random();
There's only one of each and they are used by all visitors simultaneously. For the first two variables, which are continuously modified, this will lead to major thread-safety problems, and for the third variable it means that all visitors record to the very same file. You need to declare them inside the doGet() method. You'd like to store the recording in the session under a unique request-based token as key and then pass that key along with the subsequent requests.
As to the problem of the file being saved at an unexpected location: when you use relative paths in java.io.File in a servlet, they will be relative to the directory from which the webserver was started. If you start it from inside Eclipse, the file is saved in the Eclipse directory. You'd like to use an absolute path in java.io.File instead. If your intent is to save it in the public web content (where your JSPs and the /WEB-INF folder are located), then you need ServletContext#getRealPath() to convert a web path to an absolute disk path.
String relativeWebPath = "filename.ext";
String absoluteDiskPath = getServletContext().getRealPath(relativeWebPath);
File file = new File(absoluteDiskPath);
There's however another problem with this: all files will get erased whenever you redeploy the webapp. If you want a bit more permanent storage, then you'd like to store it outside the web project. E.g. C:/path/to/recordings.
File file = new File("C:/path/to/recordings/filename.ext");

How to play a wav file in Java 1.4

As the title says: how can I play a sound file repeatedly in Java 1.4?
If you just want to play the wav file, then org.life.java's answer is correct. For other format types you can use JMF (http://www.oracle.com/technetwork/java/javase/tech/index-jsp-140239.html).
Note: JMF is obsolete now, but it will work with JDK 1.4.
import java.net.URL;
import javax.sound.sampled.*;
public class LoopSound {
public static void main(String[] args) throws Exception {
URL url = new URL(
"http://pscode.org/media/leftright.wav");
Clip clip = AudioSystem.getClip();
AudioInputStream ais = AudioSystem.
getAudioInputStream( url );
clip.open(ais);
clip.loop(Clip.LOOP_CONTINUOUSLY); // loop(0) would play the clip only once
javax.swing.JOptionPane.
showMessageDialog(null, "Close to exit!");
}
}
This will work in JDK 1.4 (tested in Windows XP and JDK 1.4.2_06).
The other answer fails because as correctly stated in the comments, AudioSystem.getClip() does not exist on JDK 1.4. Below is a complete source (in the form of a main function, but it's adaptable to anything else) that uses DataLine and plays in a separate Thread for better overall performance as well:
import java.io.File;
import javax.sound.sampled.AudioFormat;
import javax.sound.sampled.AudioInputStream;
import javax.sound.sampled.AudioSystem;
import javax.sound.sampled.DataLine;
import javax.sound.sampled.SourceDataLine;
public class AudioTest {
public static void main(String[] args) throws Exception {
AudioInputStream ais = AudioSystem.getAudioInputStream(new File("C:/sound1.wav"));
AudioFormat format = ais.getFormat();
DataLine.Info dataLineInfo = new DataLine.Info(SourceDataLine.class, format);
SourceDataLine sourceDataLine = (SourceDataLine) AudioSystem.getLine(dataLineInfo);
class PlayThread extends Thread {
private AudioInputStream ais;
private AudioFormat format;
private SourceDataLine sourceDataLine;
byte tempBuffer[] = new byte[10000];
public PlayThread(AudioInputStream ais, SourceDataLine sourceDataLine, AudioFormat format) {
this.ais = ais;
this.sourceDataLine = sourceDataLine;
this.format = format;
}
public void run() {
try {
sourceDataLine.open(this.format);
sourceDataLine.start();
int cnt;
while ((cnt = this.ais.read(tempBuffer, 0, tempBuffer.length)) != -1) {
if (cnt > 0) {
sourceDataLine.write(tempBuffer, 0, cnt);
}
}
sourceDataLine.drain();
sourceDataLine.close();
} catch (Exception e) {
throw new RuntimeException(e);
}
}
}
new PlayThread(ais, sourceDataLine, format).start();
}
}
Both the question and the answers are really old, but I just had to make this work on a fanless mini PC that only runs Windows XP, so... ¯\_(ツ)_/¯
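If the "repeatedly" part of the question matters, one option (again only a sketch, written against the same pre-generics API but not tested on 1.4) is to keep the SourceDataLine open and reopen the stream from the file each time it runs dry:
import java.io.File;
import javax.sound.sampled.AudioFormat;
import javax.sound.sampled.AudioInputStream;
import javax.sound.sampled.AudioSystem;
import javax.sound.sampled.DataLine;
import javax.sound.sampled.SourceDataLine;
public class LoopingAudioTest {
    public static void main(String[] args) throws Exception {
        File file = new File("C:/sound1.wav");
        AudioInputStream ais = AudioSystem.getAudioInputStream(file);
        AudioFormat format = ais.getFormat();
        DataLine.Info info = new DataLine.Info(SourceDataLine.class, format);
        SourceDataLine line = (SourceDataLine) AudioSystem.getLine(info);
        line.open(format);
        line.start();
        byte[] buffer = new byte[10000];
        while (true) { // play the file, then rewind by reopening it
            int cnt;
            while ((cnt = ais.read(buffer, 0, buffer.length)) != -1) {
                line.write(buffer, 0, cnt);
            }
            ais.close();
            ais = AudioSystem.getAudioInputStream(file);
        }
    }
}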
