Is there any way to make a telnet app clear the output client-side (using a Java Socket connection + buffers)? For example, the program queries the connected user for a login and password, and once they've logged in successfully, I want to do the equivalent of cls on Windows or clear on Linux.
The telnet application is a terminal emulator. In the early days, the only way to communicate with a computer was through a terminal with a purely text-based screen and a keyboard. The terminal sent everything you typed to the computer, and the computer sent back characters that were printed on the screen. Just like telnet.
DEC created a series of terminals called the VT52, VT100, etc. They were able to interpret special control sequences, so the computer could give more elaborate instructions to the terminal. These control sequences were standardized by ANSI and are now called ANSI escape codes. Terminal emulators that understand the VT100 escape codes are called VT100 terminal emulators.
You can look up the ANSI escape codes on Wikipedia and elsewhere. They all start with the character codes for escape (ESC) and [, followed by the control characters. The control characters for clearing the screen are "2J".
So, what you need to do is send this string from your server to the telnet client (assuming myOutputStream is a PrintStream or PrintWriter):
myOutputStream.print("\u001B[2J");
myOutputStream.flush();
You may send other control characters as well. Try "\u001B[7m" to reverse the screen.
On the Linux side, clear simply issues some terminal control characters telling the terminal to clear the screen. For VT terminals, that's ESC [ 2 J (i.e. "\u001B[2J"). Not sure whether Windows supports something similar.
Scrolling blank lines is the only way I can think of in Java if you need to support Windows.
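As a rough sketch of both approaches, assuming the client is either an ANSI-capable terminal or a plain telnet client that ignores escape codes (class and method names here are hypothetical):

```java
import java.io.PrintWriter;

public class ClearScreen {
    // ESC [ 2 J clears the screen; ESC [ H moves the cursor to the top-left corner.
    static final String ANSI_CLEAR = "\u001B[2J\u001B[H";

    // Send an ANSI clear if the client understands it; otherwise
    // "clear" by scrolling a screenful of blank lines.
    static void clear(PrintWriter out, boolean ansiCapable) {
        if (ansiCapable) {
            out.print(ANSI_CLEAR);
        } else {
            for (int i = 0; i < 25; i++) {
                out.println();
            }
        }
        out.flush();
    }

    public static void main(String[] args) {
        PrintWriter out = new PrintWriter(System.out);
        clear(out, true); // clears an ANSI terminal; harmless noise elsewhere
    }
}
```

Whether the client is ANSI-capable cannot be detected reliably over a raw socket, so in practice this would be a configuration flag or a guess based on the platform.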
Related
So I was trying to write something like tic-tac-toe in Java. Instead of printing a new board every time a player makes a move, I want to just overwrite the previous board.
Example board:
x|x|x
-+-+-
x|x|x
-+-+-
x|x|x
I tried printing \r at the end of the last line, but it seems to only go back to the start of the last line, whereas I want to return to the start of the first line.
In old-school C, this was done with a Unix library called curses. So I googled for a Java curses library and got this link:
Stack Overflow Question
The other choice might be to use Swing to make it a GUI instead of a terminal program. But if you just want to work within the terminal window, this question points out several libraries that could help you.
Basically you need to send escape codes to the terminal window that tell the terminal what to do. Google VT100 escape codes as an example of what I mean, but your terminal window may not emulate a VT100 (an old, old text terminal from Digital Equipment Corp). So you can't just use VT100 escape sequences, as they might not be right. Instead, those libraries should look at your TERM environment variable and send the right escape sequences.
I have absolutely no idea if they'll work on Windows (if that's your environment). Good luck.
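As a minimal sketch, assuming an ANSI/VT100-capable terminal and no library at all, you can move the cursor back up over the board with the "cursor up" sequence ESC [ n A and simply reprint it:

```java
public class BoardRedraw {
    // ANSI "cursor up n lines" sequence: ESC [ n A
    static String cursorUp(int n) {
        return "\u001B[" + n + "A";
    }

    // Print a 3x3 board (5 lines total) from a 9-element cell array.
    static void printBoard(char[] cells) {
        for (int row = 0; row < 3; row++) {
            System.out.println(cells[3 * row] + "|" + cells[3 * row + 1] + "|" + cells[3 * row + 2]);
            if (row < 2) {
                System.out.println("-+-+-");
            }
        }
    }

    public static void main(String[] args) throws InterruptedException {
        char[] board = "         ".toCharArray();
        printBoard(board);              // initial empty board
        Thread.sleep(1000);
        board[4] = 'x';                 // a player moves in the center
        System.out.print(cursorUp(5));  // jump back to the board's first line
        printBoard(board);              // overwrite the old board in place
    }
}
```

This works in most Linux/macOS terminals, but, as noted above, a plain Windows console may not interpret the sequence, which is where those libraries come in.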
In my application, I'm reading a properties file (UTF-8 encoded) containing Chinese characters and printing them on the Windows command line. But, for some reason, the messages are not displayed correctly (some extra characters appear). However, the same messages are displayed correctly in the Eclipse console and in Cygwin. I've set the command-line code page to UTF-8 (65001) and used the "Lucida" font as well.
If you look at the image above, on Windows an extra 0 was printed on the second line, which is not expected; on Cygwin the message was printed correctly.
Please let me know if I'm missing something. From this post, I can see that there are some issues with Windows' UTF-8 code page implementation. If so, is there any other way to get around this problem?
I can see that there are some issues with Windows UTF-8 code page implementation
Oh, most surely yes.
is there any other way to get over this problem ?
The typical solution is to accept that the Windows command prompt is a broken disaster and move on. But if you really have to, you can use JNA to call the Win32 function WriteConsole directly, bypassing the broken byte-encoding layer, when you can determine you are writing to the console (rather than, e.g., a pipe).
I'm trying to write a simple ssh client using java library 'sshj'.
try {
    session = ssh.startSession();
    try {
        final Command cmd = session.exec("ls");
        System.out.println(IOUtils.readFully(cmd.getInputStream()).toString());
    } finally {
        session.close();
    }
} finally {
    ssh.disconnect();
}
But my problem is that I can't figure out how SSH clients (like PuTTY or iTerm) determine the colors of text (for example, colors in vi or colors in the output of 'ls').
I searched Google a bit with queries like 'ssh protocol text color', but I couldn't find a satisfying result.
I also found some SSH Java libraries like JSch, sshj, and SSHTools; I'm using sshj because its code is concise. But if you have any comments about these libraries, feel free to share them with me :)
ssh simply forwards (and encrypts/decrypts) bytes between the server and the client. In most cases the server is running a unix shell and "the bytes" are simply stdin/stdout/stderr from that shell.
If you run vim in that shell, then the bytes are produced by vim. To do the special stuff like colors (syntax highlighting), positioning the cursor and so on, vim will send escape sequences, which are series of bytes starting with an "ESCAPE" byte (decimal 27, hex 1B).
Escape sequences stem from the days of "green terminals" and are interpreted as "instructions" by the physical terminal, or the terminal emulator e.g. putty.
Vim and other complex terminal programs will look at the $TERM environment variable to determine what terminal you are using client-side. Depending on that value, the escape sequences (think terminal-dependent instructions) will be different.
You can find many lists of terminal escape sequences on the internet, e.g. for ansi: https://en.wikipedia.org/wiki/ANSI_escape_code
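For example, the colors come from the SGR ("Select Graphic Rendition") sequences listed on that page. A terminal-agnostic program would look them up via terminfo, but as a quick illustration on an ANSI-capable terminal:

```java
public class AnsiColors {
    // SGR sequences have the form ESC [ <code> m
    static final String RED = "\u001B[31m";   // foreground red
    static final String GREEN = "\u001B[32m"; // foreground green
    static final String RESET = "\u001B[0m";  // reset all attributes

    public static void main(String[] args) {
        // On an ANSI terminal, "error" renders in red and "ok" in green;
        // the bytes travel over SSH like any other program output.
        System.out.println(RED + "error" + RESET + " and " + GREEN + "ok" + RESET);
    }
}
```

This is exactly what 'ls --color' does: it mixes these byte sequences into its stdout, ssh forwards them unchanged, and the client-side terminal emulator interprets them.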
I have a simple program that makes a request to a remote server running a service which I believe is written in Delphi, but is definitely running on Windows.
I'm told the service will be using whatever the default encoding is for Windows.
When I get a response and use println to output it I'm getting some strange symbols in the output, which make me think it is a character encoding issue.
How can I tell Java that the input from the remote system is in the Windows encoding?
I have tried the following:
_receive = new BufferedReader(new InputStreamReader(_socket.getInputStream(), "ISO-8859-1"));
System.out.println(_receive.readLine());
The extra characters appear as squares in the output, each with four digits inside.
Unless you KNOW what the "default encoding" is, you can't decode the data reliably. The "default encoding" is generally the system-wide code page, which can differ from system to system.
You should really try to make people use an encoding that both sides agree on; nowadays, this should almost always be UTF-16 or UTF-8.
Btw, if you are sending one character on the Windows box, and you receive multiple "strange symbols" on the Java box, there's a good chance that the Windows box is already sending UTF-8.
Use Cp1252 (windows-1252) instead of ISO-8859-1, as it is the default on Windows.
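A small demonstration of the difference: bytes 0x93/0x94 are Windows "smart quotes" in Cp1252, but unprintable C1 control characters in ISO-8859-1, which is exactly the kind of thing that shows up as empty squares:

```java
import java.nio.charset.Charset;

public class Cp1252Demo {
    public static void main(String[] args) {
        byte[] data = { (byte) 0x93, (byte) 0x94 }; // typical Windows "smart quote" bytes

        String cp1252 = new String(data, Charset.forName("windows-1252"));
        String latin1 = new String(data, Charset.forName("ISO-8859-1"));

        System.out.println(cp1252);                 // “” (U+201C, U+201D)
        System.out.println((int) latin1.charAt(0)); // 147: C1 control character U+0093
    }
}
```

The two encodings agree on bytes 0x00-0x7F and 0xA0-0xFF; only the 0x80-0x9F range differs, which is why the problem shows up only for "special" characters.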
Ultimately I would like to use a Java program to send and receive messages from a phone plugged in via USB. I can do this with a C# program, but the program that needs to send and receive the messages is written in Java. To do this I am using the RXTX library (the Windows x64 build from Cloudhopper). But whenever I try to send any commands to the phone via USB, my computer completely locks up and I have to hard-restart it.
The code I am running is here: Two way communication with the serial port. I think it successfully establishes a link, since it gets to the stage where it accepts input from the console; but when I press Enter and the input is sent, the computer locks up.
I am running Windows 7 x64, using Eclipse. Thank you for any help.
A little hard to tell from the code, but here are some debugging tips:
Try stepping through the code with the debugger line by line, and step in to the library itself to see if you can find the problem.
Instead of reading/writing from the console, try sending character codes programmatically. The console behaves very differently from direct access; i.e., instead of System.in.read(), just try passing in a known-good String.
Keep in mind that Java works with UTF-16 internally, but consoles typically work with different character encodings (e.g. Cp1252 on Windows). So your "Enter" may produce a completely different character from what the device is expecting. If your device is expecting ASCII 13 (carriage return) and your console is generating ASCII 10 (line feed), that could be enough to confuse things.
The crash makes it seem very likely that there is something going on with the native library. If you find that the Java debugging keeps dropping you into the JNI boundary, you may need to debug with a C/C++ toolset.
Good luck!
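Following the second and third tips, here is a sketch of sending a fixed command with an explicit carriage return instead of reading from the console. The AT command and the stream are placeholders, assuming a GSM-style modem and an already opened serial OutputStream:

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.OutputStream;
import java.nio.charset.StandardCharsets;

public class SendCommand {
    // Write a command terminated by an explicit CR (ASCII 13),
    // bypassing whatever line ending the console would produce.
    static void send(OutputStream serialOut, String command) throws IOException {
        byte[] bytes = (command + "\r").getBytes(StandardCharsets.US_ASCII);
        serialOut.write(bytes);
        serialOut.flush();
    }

    public static void main(String[] args) throws IOException {
        // Stand-in for the real serial port stream, just to show the bytes sent.
        ByteArrayOutputStream fake = new ByteArrayOutputStream();
        send(fake, "AT");
        byte[] sent = fake.toByteArray();
        System.out.println(sent.length + " bytes, last = " + sent[sent.length - 1]);
    }
}
```

With the real RXTX stream substituted for the ByteArrayOutputStream, this removes the console (and its encoding and line-ending quirks) from the equation, narrowing the lockup down to the native write itself.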