Disabling the parity bit for serial communication does not work - Java

I have a problem with sending bytes through my COM port. It sends a parity bit, although parity is explicitly turned off (I need the byte without parity to communicate with some hardware). My code is really simple:
Process p = Runtime.getRuntime().exec("cmd.exe /c mode com1: baud=115200 parity=n data=8 stop=1 to=off xon=off rts=off dtr=off");
p.waitFor();
fp = new RandomAccessFile("COM1","rw");
fp.write((byte)0x21);
I have my oscilloscope connected to the port, and whatever I do there is one extra bit, which appears to be a parity bit. But as you can see, I disabled parity in the code, and I also disabled it in the device manager. What I see on the oscilloscope is: 0 0010 0001 11 (start and stop bits included). I can't figure out where this parity or extra bit comes from... Does anybody have an idea?

While that mode command was once intended to be used the way you are using it, and did work, I would be very skeptical about how much effort Microsoft has put into maintaining that kind of legacy support. My first step would be to open a command prompt and run
C:\>mode
C:\>rem The above command will display values to all configurable settings
C:\>mode com1: baud=115200 parity=n data=8 stop=1 to=off xon=off rts=off dtr=off
C:\>mode
C:\>rem Any visible changes compared to the first mode command?
C:\>echo U >> COM1
C:\>rem Check bits on oscilloscope
If this does not work as expected, then I think you should just give up on the mode command. If all of that works, verify that the mode settings are not just properties within the shell running the mode command, e.g. after changing some parameters, run mode in a different shell to check that the parameters are changed there as well.
Additionally, according to documentation from Microsoft, the syntax for the baud=... parameter is not the numerical baud value but a two-digit number mapped to a given baud rate (e.g. baud=96 -> 9600, see the table in the documentation). This site mentions an alternative syntax, MODE COM1:9600,N,8,1, which is more in line with what I remember being used; you could try that as well.
All of that failing, you could try using a Java serial library. RXTX is a commonly used one, although not everybody likes it. This post recommends
http://code.google.com/p/java-simple-serial-connector/ over rxtx. This post mentions
http://code.google.com/p/jperipheral.
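If you go the library route, here is a minimal sketch of the same write using jSSC (the java-simple-serial-connector library linked above). This is only an illustration: the port name and the 0x21 byte are taken from the question, everything else is just the library's standard usage.
import jssc.SerialPort;
import jssc.SerialPortException;

public class ParityOffExample {
    public static void main(String[] args) throws SerialPortException {
        SerialPort port = new SerialPort("COM1");
        port.openPort();
        // 115200 baud, 8 data bits, 1 stop bit, no parity
        port.setParams(SerialPort.BAUDRATE_115200, SerialPort.DATABITS_8,
                SerialPort.STOPBITS_1, SerialPort.PARITY_NONE);
        port.writeByte((byte) 0x21);
        port.closePort();
    }
}
A library like this configures the port through the Win32 serial API directly, so you are no longer depending on the mode command or on device manager defaults.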

OK, I found a solution, although I still don't know why the single-byte write does not work (and I don't intend to find out ;) ). But since I want to send an array of bytes anyway, it is the better solution in any case.
If I use fp.write like this:
byte[] ba = {(byte)0xaa, (byte)0xaa};
fp.write(ba, 0, 2);
The parity bit does not get attached... whatsoever. Maybe the library itself attaches a parity bit in the overloaded single-byte function, or something else weird is happening :/

Related

Binary writing from Android-Java reading wrong from UE4-C++

I am writing a binary reader that needs to read in a very specific set of binary data written by an Android tablet. I have run into several issues reading this data, the first and foremost being that it does not resemble the data I wrote in the first place. I have read a little bit about endianness, words, and how they are handled on different systems, and I am curious whether this could be the root of the problem.
Any information would be good at this point, but the specific thing I would like to know is: considering the lines below, why is the value not being read in as the same value that was written out? How can I fix this?
Say numPoints = 5000.
(OUT FUNCTION - Android-java)
out.writeInt(numPoints);
(IN FUNCTION - UE4-c++)
reader << numPoints;
numPoints now equals some really really large number that I can't explain.
I am using Windows 8.1 x64 and a Google-Project-Tango Tablet.
Your C++ code is probably reading numbers in little-endian. Java (at least DataOutputStream) writes binary numbers as big-endian.
It's probably simpler to fix this on the Java side since there's a built-in function for it:
out.writeInt(Integer.reverseBytes(numPoints));
You should also check whether UE4 guarantees that binary files are little-endian, or whether it's the machine's native endianness - otherwise there's a chance that you might want to port your game to a big-endian platform and run into problems later.
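Alternatively, if you would rather make the byte order explicit than reverse each value, something along these lines should also work on the Java side. This is just a sketch; the helper name is made up, and out is assumed to be the same DataOutputStream as in the question.
import java.io.DataOutputStream;
import java.io.IOException;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

class LittleEndianOut {
    // Hypothetical helper: writes an int in little-endian byte order so the
    // C++ side can read it with x86's native ordering.
    static void writeIntLE(DataOutputStream out, int value) throws IOException {
        ByteBuffer buf = ByteBuffer.allocate(4).order(ByteOrder.LITTLE_ENDIAN);
        buf.putInt(value);
        out.write(buf.array());
    }
}
Making the byte order explicit in one place like this also documents the file format, which helps if you later need readers on other platforms.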

Settings for RS232 serial port communication

I am using this library for communicating with serial ports from Java. I am also using a USB-to-Serial converter to connect to the device.
Here is the documentation related to the device:
2.1 Physical Interface
The required physical interface between the host and the VGM is the EIA-232 interface.
2.2 Logical Interface
The serial data link shall operate at the speed of 19,200 bits per second (BPS), with one
start bit, eight data bits, a wake-up bit and one stop bit. The wake-up bit should be set in
the first byte of the message; the wake-up bit should be cleared for the remainder of the
message. The VGM shall clear the wake-up bit when responding to the host.
I am a little puzzled about how to set up the RS232 library settings when connecting to the serial port. There are baud settings, data length in bits (5, 6, 7, 8), stop bits (1, 2) and a parity setting. When I mess with these settings I of course get different output (most of the time looking like trash). Can you help me work out the settings with regard to the quoted documentation?
From what I understand, your protocol requires an additional 9th data bit, which is used in some exotic applications like Multidrop bus (see also Stackoverflow 14212660). In your case this 9th bit is called the "wake-up bit", but you will not find such a thing or name in your Java library or in any standard RS232 application.
There is a workaround using standard USB-to-Serial converters. It is exactly the approach that Stackoverflow 14212660 rules out with
and no fudging by using the parity bit as a 9th data bit
So, unless you want to buy specialized hardware, I suggest the 'fudging':
Using the parity settings MARK and SPACE should correspond to your desired states "wake-up bit set" and "wake-up bit cleared", respectively. Our software Docklight Scripting already allows you to do this kind of temporary parity switching in the free evaluation, but I assume there are also other tools or code examples around. MDB / multidrop bus should be good Google keywords for this.
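The question does not say which Java serial library is in use, so purely as an illustration, here is roughly what the parity-switching trick could look like with jSSC (the constants and calls are jSSC's; the message contents, port name and timing are placeholders):
import java.util.Arrays;
import jssc.SerialPort;
import jssc.SerialPortException;

public class WakeUpBitExample {
    public static void main(String[] args) throws SerialPortException {
        byte[] message = {0x01, 0x02, 0x03}; // placeholder message
        SerialPort port = new SerialPort("COM1");
        port.openPort();
        // First byte: MARK parity forces the 9th bit to 1 (wake-up bit set)
        port.setParams(SerialPort.BAUDRATE_19200, SerialPort.DATABITS_8,
                SerialPort.STOPBITS_1, SerialPort.PARITY_MARK);
        port.writeByte(message[0]);
        // Remaining bytes: SPACE parity forces the 9th bit to 0 (wake-up bit cleared)
        port.setParams(SerialPort.BAUDRATE_19200, SerialPort.DATABITS_8,
                SerialPort.STOPBITS_1, SerialPort.PARITY_SPACE);
        port.writeBytes(Arrays.copyOfRange(message, 1, message.length));
        port.closePort();
    }
}
In practice you would also have to make sure the first byte has actually left the UART before switching the parity setting, otherwise the new setting can apply to still-buffered data as well.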

VUgen: Recording trivial RMI interaction records invalid script?

After recording just the appearance of the logon window of our Java app in LR/VUgen 9.51 using the RMI protocol, the resulting script replays with a java.lang.ArrayIndexOutOfBoundsException. The code fragment looks like this:
_hashtable2 = new Hashtable();
_object_array3 = ((java.util.Collection)_hashtable2.values()).toArray();
_hashtable2.put("sessionId",(java.lang.String)_object_array3[0]); //yields exception!
_boolean1 = _mopsconstantserverif1.psi_requiresHostCommunication((java.util.Hashtable)_hashtable2, (java.util.Vector)null);
Of course, generating an empty hashtable, converting its (empty) value collection to an array, and referencing the first array element must yield an ArrayIndexOutOfBoundsException, right? But why does LoadRunner generate this kind of code at all? Is this a bug, or am I doing something wrong? I have never seen problems like this when using RMI and LoadRunner.
Since the cause of the playback error is quite obvious and independent of the remainder of the recorded code (i.e. limited to the four statements shown), I try to ask without showing the whole script...
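For reference, here is a hand-edited version of the fragment that at least avoids the exception during replay; the session id value is only a placeholder and would have to be correlated or parameterized properly:
_hashtable2 = new Hashtable();
// Placeholder: supply the session id directly instead of reading it from
// the (empty) value collection of the freshly created hashtable.
_hashtable2.put("sessionId", "<sessionId>");
_boolean1 = _mopsconstantserverif1.psi_requiresHostCommunication((java.util.Hashtable)_hashtable2, (java.util.Vector)null);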
Ahh, RMI, the bane of my existence. I dislike the RMI/Java combination in LoadRunner so much that I do as much RMI work in Winsock as I can. You might consider the use of Winsock as a plan B option to avoid the Java issues you are experiencing today, as Winsock is a straight C virtual user type. The use of a Windows sockets virtual user avoids the complications of the dark magic of Java and LoadRunner; plus it's lighter weight on the resource front and faster as a result. And I am just a glutton for punishment on the Winsock front, plus it keeps the C skills razor sharp!

Parsing IBM 3270 data in java

I was wondering if anyone had experience retrieving data with the 3270 protocol. My understanding so far is:
Connection
I need to connect to an SNA server using telnet, issue a command, and then some data will be returned. I'm not sure how this connection is made, since I've read that a standard telnet connection won't work. I've also read that IBM has a library to help, but I have not got as far as finding out any more about it.
Parsing
I had assumed that the data being returned would be a string of 1920 characters, since the 3278 screen was 80x24 characters, and that I would simply need to parse these characters into the appropriate fields. The more I read about the 3270 protocol, the less this seems to be the case. I read in the documentation provided with a trial of the Jagacy 3270 Java library that attributes are marked in the protocol with the character 'A' before the attribute, and my understanding is that there are more characters denoting other factors, such as whether fields are editable.
I'm reasonably sure my thinking has been too simplistic. Take an example like a screen containing a list of items - pressing a special key on one of the 24 visible rows drills down into more detailed information regarding that row.
Also it's been suggested to me that print commands can be issued. This has some positive implications: if the returned data is not a plain 1920-character string because it contains characters such as 'A' that describe how users interact with the terminal, printing would eliminate those. It would also avoid having to page through lots of data. The flip side is that I wouldn't know how to retrieve the data from the print command back into Java.
So..
I currently don't have access to the SNA server, but I have some screenshots of what the terminal will look like once I get a connection, and was therefore going to start work on parsing. With so many assumptions and not much idea of what the data will look like, I feel really stumped. Does anyone have any knowledge of these systems that might help me get back on track?
You've picked a ripper of a problem there. 3270 is a very complex protocol indeed. I wouldn't bother about trying to implement it, it's a fool's errand, and I'm speaking from painful personal experience. Try to find a TN3270 (Telnet 3270) client API.
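For example, rather than speaking TN3270 yourself, you can drive an existing emulator's scripting interface from Java. The following is only a rough sketch, assuming the open-source x3270 suite's s3270 binary is installed and on the PATH; s3270 reads actions on stdin and echoes screen contents back on stdout prefixed with "data: ", and the hostname here is a placeholder.
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.io.PrintWriter;

public class S3270Example {
    public static void main(String[] args) throws IOException {
        // Start the scripted 3270 emulator.
        Process p = new ProcessBuilder("s3270").start();
        PrintWriter in = new PrintWriter(p.getOutputStream(), true);
        BufferedReader out = new BufferedReader(new InputStreamReader(p.getInputStream()));

        in.println("Connect(myhost.example.com)"); // placeholder hostname
        in.println("Wait(InputField)");            // wait until the host has drawn a screen
        in.println("Ascii()");                     // dump the formatted screen as plain text
        in.println("Quit()");                      // let s3270 exit so the read loop ends

        // Screen rows come back as lines starting with "data: ",
        // followed by a status line and "ok"/"error".
        String line;
        while ((line = out.readLine()) != null) {
            if (line.startsWith("data: ")) {
                System.out.println(line.substring(6));
            }
        }
    }
}
That leaves you parsing plain 80-column text rows instead of the raw 3270 data stream, which is a much smaller problem.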
This might not specifically answer your question, but...
If you are using Rational Developer for z/OS, your Java code should be able to use the integrated HATS product to deal with the 3270 stream. It might not fit your project, but I thought I would mention it; if all you are trying to do is some simple screen scraping, it makes things very easy.

Monitoring Windows directory size

I'm looking for something that will monitor Windows directories for size and file count over time. I'm talking about a handful of servers and a few thousand folders (millions of files).
Requirements:
Notification on X increase in size over Y time
Notification on X increase in file count over Y time
Historical graphing (or at least saving snapshot data over time) of size and file count
All of this on a set of directories and their child directories
I'd prefer a free solution, but would also appreciate getting pointed in the right direction. If we were to write our own, how would we go about doing that? Available languages are Ruby, Groovy, Java, Perl, or PowerShell (since I'd be writing it).
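If you do end up writing your own, the measuring part is straightforward in Java; here is a minimal sketch of the size and file-count collection (the path is a placeholder, and the notification and history pieces are left out):
import java.io.IOException;
import java.nio.file.FileVisitResult;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.SimpleFileVisitor;
import java.nio.file.attribute.BasicFileAttributes;

public class DirStats {
    public static void main(String[] args) throws IOException {
        Path root = Paths.get("\\\\server\\share\\folder"); // placeholder path
        final long[] size = {0};
        final long[] count = {0};
        Files.walkFileTree(root, new SimpleFileVisitor<Path>() {
            @Override
            public FileVisitResult visitFile(Path file, BasicFileAttributes attrs) {
                size[0] += attrs.size();
                count[0]++;
                return FileVisitResult.CONTINUE;
            }
            @Override
            public FileVisitResult visitFileFailed(Path file, IOException exc) {
                // Skip files we cannot read instead of aborting the whole walk.
                return FileVisitResult.CONTINUE;
            }
        });
        System.out.println("Files: " + count[0] + ", bytes: " + size[0]);
    }
}
Scheduling this periodically, saving each snapshot, and comparing against previous runs would then give you the notification and graphing pieces.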
There are several solutions out there, including some free ones. Some that I have worked with include Nagios and Big Brother. A quick Google search can probably find more.
You might want to take a look at PolyMon, which is an open-source systems monitoring solution. It allows you to write custom monitors in any .NET language, including custom PowerShell monitors.
It stores data in a SQL Server back end and provides graphing. For your purpose, you would just need a script that gets the directory size and file count.
Something like:
$size = 0
$count = 0
$path = '\\unc\path\to\directory\to\monitor'
get-childitem -path $path -recurse | Where-Object {$_ -is [System.IO.FileInfo]} | ForEach-Object {$size += $_.length; $count += 1}
In reply to Scott's comment:
Sure, you could wrap it in a while loop:
$ESCkey = 27
Write-Host "Press the ESC key to stop sniffing" -ForegroundColor "CYAN"
$Running = $true
While ($Running)
{
    # Check for a key press without blocking the loop
    if ($host.UI.RawUI.KeyAvailable) {
        $key = $host.UI.RawUI.ReadKey("NoEcho,IncludeKeyUp,IncludeKeyDown")
        if ($key.VirtualKeyCode -eq $ESCkey) {
            $Running = $false
        }
    }
    # rest of function here (directory size/count collection, logging, etc.)
}
I would not do that for a PowerShell monitor, which you can schedule to run periodically, but for a script running in the background the above would work. You could even add some database access code to log the results to a database, or log them to a file... whatever you want.
You can certainly accomplish this with PowerShell and WMI. You would need some sort of DB back end like SQL Express. But I agree that a tool like PolyMon is a better approach. The one thing that might make a difference is the issue of scale. Do you need to monitor one folder on one server, or hundreds?
http://sourceforge.net/projects/dirviewer/ -- DirViewer is a light, pure Java application for directory tree viewing and recursive disk usage statistics, using the JGoodies-Looks look and feel, similar to Windows XP.