Script that's run from Processing is not executing properly - java

I have a shell script that creates a text file that contains the current settings from my camera:
#!/bin/sh
file="test.txt"
[[ -f "$file" ]] && rm -f "$file"
var=$(gphoto2 --summary)
echo "$var" >> "test.txt"
if [ $? -eq 0 ]
then
    echo "Successfully created file"
    exit 0
else
    echo "Could not create file" >&2
    exit 1
fi
The script works as I expect when I run it from the terminal, but when I run the following Processing sketch, the text file is created but does not contain any of the information from the camera:
import java.util.*;
import java.io.*;

void setup() {
  size(480, 120);
  camSummary();
}

void draw() {
}

void camSummary() {
  String commandToRun = "./ex2.sh";
  File workingDir = new File("/Users/loren/Documents/RC/CamSoft/");
  String returnedValues; // holds each line of the script's output
  try {
    println("in try");
    Process p = Runtime.getRuntime().exec(commandToRun, null, workingDir);
    int i = p.waitFor();
    if (i == 0) {
      BufferedReader stdInput = new BufferedReader(new InputStreamReader(p.getInputStream()));
      while ((returnedValues = stdInput.readLine()) != null) {
        println(returnedValues);
      }
    } else {
      println("i is: " + i);
    }
  }
  catch (Throwable t) {
    println(t);
  }
}
Eventually I would like to read some of the data from the script directly into variables and then use those variables in Processing.
Could someone help me sort this out?
Thank you,
Loren
Alternate script:
#!/bin/sh
set -x
exec 2>&1
file="test.txt"
[ -f "$file" ] && rm -f "$file"
# you want to store the output of gphoto2 in a variable
# var=$(gphoto2 --summary)
# problem 1: what if the PATH environment variable is wrong (i.e. gphoto2 not accessible)?
# problem 2: what if gphoto2 outputs to stderr?
# it's better first to:
echo first if
if ! type gphoto2 > /dev/null 2>&1; then
    echo "gphoto2 not found!" >&2
    exit 1
fi
echo second if
# why use var at all?...
gphoto2 --summary > "$file" 2>&1
# if you insert any echo here, you will alter $?
if [ $? -eq 0 ]; then
    echo "Successfully created file"
    exit 0
else
    echo "Could not create file" >&2
    exit 1
fi

There are several issues in your shell script. Let's correct and improve it together.
#!/bin/sh
file="test.txt"
[ -f "$file" ] && rm -f "$file"
# you want to store the output of gphoto2 in a variable
# var=$(gphoto2 --summary)
# problem 1: what if the PATH environment variable is wrong (i.e. gphoto2 not accessible)?
# problem 2: what if gphoto2 outputs to stderr?
# it's better first to:
if ! type gphoto2 > /dev/null 2>&1; then
    echo "gphoto2 not found!" >&2
    exit 1
fi
# why use var at all?...
gphoto2 --summary > "$file" 2>&1
# if you insert any echo here, you will alter $?
if [ $? -eq 0 ]; then
    echo "Successfully created file"
    exit 0
else
    echo "Could not create file" >&2
    exit 1
fi
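On the Java/Processing side there are two things worth checking as well, regardless of the shell fixes above. A sketch launched from the Processing IDE may not inherit the PATH your terminal has (so gphoto2 can be invisible to the script), and reading stdout only after waitFor() returns can deadlock if the script ever prints more than the pipe buffer holds. Below is a hedged sketch of a more defensive camSummary(), assuming the same script name and working directory as in the question; it merges stderr into stdout so error messages are not silently lost:

import java.io.*;

void camSummary() {
  try {
    // ProcessBuilder lets us set the working directory and merge
    // stderr into stdout, so the script's error messages show up too
    ProcessBuilder pb = new ProcessBuilder("./ex2.sh");
    pb.directory(new File("/Users/loren/Documents/RC/CamSoft/"));
    pb.redirectErrorStream(true);
    Process p = pb.start();

    // drain the output while the process runs, not after waitFor(),
    // so a full pipe buffer can never block the script
    BufferedReader r = new BufferedReader(new InputStreamReader(p.getInputStream()));
    String line;
    while ((line = r.readLine()) != null) {
      println(line);
    }
    println("script exited with: " + p.waitFor());
  } catch (Exception e) {
    println(e);
  }
}

Once the script reliably writes test.txt, the eventual goal of getting values into variables is straightforward: Processing's built-in loadStrings("/Users/loren/Documents/RC/CamSoft/test.txt") returns the file's lines as a String array that the sketch can parse.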

Related

mongo java driver slow compared to shell (15 times)

Trying to dump the _id column only.
With mongo shell it finishes in around 2 minutes:
time mongoexport -h localhost -d db1 -c collec1 -f _id -o u.text --csv
connected to: localhost
exported 68675826 records
real 2m20.970s
With java it takes about 30 minutes:
java -cp mongo-test-assembly-0.1.jar com.poshmark.Test
class Test {
    public static void main(String[] args) {
        MongoClient mongoClient = new MongoClient("localhost");
        MongoDatabase database = mongoClient.getDatabase("db1");
        MongoCollection<Document> collection = database.getCollection("collec1");
        MongoCursor<Document> iterator = collection.find().projection(new Document("_id", 1)).iterator();
        while (iterator.hasNext()) {
            System.out.println(iterator.next().toString());
        }
    }
}
CPU usage on the box is low, and I don't see any network latency issues, since both tests are running from the same box.
Update:
Used Files.newBufferedWriter instead of System.out.println but ended up with the same performance.
Looking at db.currentOp() makes me think that Mongo is hitting disk, since it shows too many numYields:
{
    "inprog" : [
        {
            "desc" : "conn8636699",
            "threadId" : "0x79a70c0",
            "connectionId" : 8636699,
            "opid" : 1625079940,
            "active" : true,
            "secs_running" : 12,
            "microsecs_running" : NumberLong(12008522),
            "op" : "getmore",
            "ns" : "users.users",
            "query" : {
                "_id" : {
                    "$exists" : true
                }
            },
            "client" : "10.1.166.219:60324",
            "numYields" : 10848,
            "locks" : {
            },
            "waitingForLock" : false,
            "lockStats" : {
                "Global" : {
                    "acquireCount" : {
                        "r" : NumberLong(21696)
                    },
                    "acquireWaitCount" : {
                        "r" : NumberLong(26)
                    },
                    "timeAcquiringMicros" : {
                        "r" : NumberLong(28783)
                    }
                },
                "MMAPV1Journal" : {
                    "acquireCount" : {
                        "r" : NumberLong(10848)
                    },
                    "acquireWaitCount" : {
                        "r" : NumberLong(5)
                    },
                    "timeAcquiringMicros" : {
                        "r" : NumberLong(40870)
                    }
                },
                "Database" : {
                    "acquireCount" : {
                        "r" : NumberLong(10848)
                    }
                },
                "Collection" : {
                    "acquireCount" : {
                        "R" : NumberLong(10848)
                    }
                }
            }
        }
    ]
}
The problem resides in STDOUT.
Printing to stdout is not inherently slow. It is the terminal you work with that is slow.
https://stackoverflow.com/a/3860319/3710490
The disk appears to be faster, because it is highly buffered.
The terminal, on the other hand, does little or no buffering: each individual print / write(line) waits for the full write (i.e. display to output device) to complete.
https://stackoverflow.com/a/3857543/3710490
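A cheap way to confirm this in the Java program is to keep printing to stdout but buffer it yourself: on typical JDKs System.out is an auto-flushing PrintStream, so every println can become a separate small write. A minimal sketch of the idea (the 1 MB buffer size is an arbitrary choice, and iterator is assumed to be the cursor from the question):

import java.io.BufferedWriter;
import java.io.OutputStreamWriter;
import java.io.PrintWriter;

// wrap stdout in a large buffer with per-line auto-flush disabled,
// so output goes out in big chunks instead of line by line
PrintWriter out = new PrintWriter(
        new BufferedWriter(new OutputStreamWriter(System.out), 1 << 20), false);
while (iterator.hasNext()) {
    out.println(iterator.next().get("_id"));
}
out.flush(); // push the final partial buffer before exiting

If per-write latency really is the bottleneck, this alone should close most of the gap.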
I've reproduced your use case with a sufficiently similar dataset.
mongoexport to FILE
$ time "C:\Program Files\MongoDB\Server\4.2\bin\mongoexport.exe" -h localhost -d test -c collec1 -f _id -o u.text --csv
2020-03-28T13:03:01.550+0100 csv flag is deprecated; please use --type=csv instead
2020-03-28T13:03:02.433+0100 connected to: mongodb://localhost/
2020-03-28T13:03:03.479+0100 [........................] test.collec1 0/21028330 (0.0%)
2020-03-28T13:05:02.934+0100 [########################] test.collec1 21028330/21028330 (100.0%)
2020-03-28T13:05:02.934+0100 exported 21028330 records
real 2m1,936s
user 0m0,000s
sys 0m0,000s
mongoexport to STDOUT
$ time "C:\Program Files\MongoDB\Server\4.2\bin\mongoexport.exe" -h localhost -d test -c collec1 -f _id --csv
2020-03-28T14:43:16.479+0100 connected to: mongodb://localhost/
2020-03-28T14:43:16.545+0100 [........................] test.collec1 0/21028330 (0.0%)
2020-03-28T14:53:02.361+0100 [########################] test.collec1 21028330/21028330 (100.0%)
2020-03-28T14:53:02.361+0100 exported 21028330 records
real 9m45,962s
user 0m0,015s
sys 0m0,000s
JAVA to FILE
$ time "C:\Program Files\Java\jdk1.8.0_211\bin\java.exe" -jar mongo-test-assembly-0.1.jar FILE
Wasted time for [FILE] - 271,57 sec
real 4m32,174s
user 0m0,015s
sys 0m0,000s
JAVA to STDOUT to FILE
$ time "C:\Program Files\Java\jdk1.8.0_211\bin\java.exe" -jar mongo-test-assembly-0.1.jar SYSOUT > u.text
real 6m50,962s
user 0m0,015s
sys 0m0,000s
JAVA to STDOUT
$ time "C:\Program Files\Java\jdk1.8.0_211\bin\java.exe" -jar mongo-test-assembly-0.1.jar SYSOUT > u.text
Wasted time for [SYSOUT] - 709,33 sec
real 11m51,276s
user 0m0,000s
sys 0m0,015s
Java code
long init = System.currentTimeMillis();
try (MongoClient mongoClient = new MongoClient("localhost");
     BufferedWriter writer = Files.newBufferedWriter(Files.createTempFile("benchmarking", ".tmp"))) {
    MongoDatabase database = mongoClient.getDatabase("test");
    MongoCollection<Document> collection = database.getCollection("collec1");
    MongoCursor<Document> iterator = collection.find().projection(new Document("_id", 1)).iterator();
    while (iterator.hasNext()) {
        if ("SYSOUT".equals(args[0])) {
            System.out.println(iterator.next().get("_id"));
        } else {
            writer.write(iterator.next().get("_id") + "\n");
        }
    }
} catch (Exception e) {
    e.printStackTrace();
}
long end = System.currentTimeMillis();
System.out.println(String.format("Wasted time for [%s] - %.2f sec", args[0], (end - init) / 1_000.));

No output of Perl script called from Java

I am using the Ganymed SSH library in Java to connect to a Linux machine, execute some Unix scripts, and display their output.
I am running a parent shell script which in turn runs a few other sub-scripts and finally a Perl script. All works well for the shell scripts, but when it reaches the Perl script, I stop getting any output.
If I run the parent script manually on the Linux server I see the output from Perl without issues.
Here's the relevant java code, connecting to the machine and calling the shell script, and returning a BufferedReader from where the output could be read line by line :
try {
    conn = new Connection(server);
    conn.connect();
    boolean isAuthenticated = conn.authenticateWithPublicKey(user, keyfile, keyfilePass);
    if (isAuthenticated == false) {
        throw new IOException("Authentication failed.");
    }
    sess = conn.openSession();
    if (param == null) {
        sess.execCommand(". ./.bash_profile; cd $APP_HOME; ./parent_script.sh");
    }
    else {...}
    InputStream stdout = new StreamGobbler(sess.getStdout());
    reader = new BufferedReader(new InputStreamReader(stdout));
} catch (IOException e) {
    e.printStackTrace();
}
My parent shell script looks like this:
./start1 #script1 output OK
./start2 #script2 output OK
./start3 #script3 output OK
/u01/app/perl_script.pl # NO OUTPUT HERE :(
Would anyone have any idea why this happens ?
EDIT: Adding the Perl script:
#!/u01/app/repo/code/sh/perl.sh
use FindBin qw/ $Bin /;
use File::Basename qw/ dirname /;
use lib (dirname($Bin). "/pm" );
use Capture::Tiny qw/:all/;
use Data::Dumper;
use Archive::Zip;
use XML::Simple;
use MXA;

my $mx = new MXA;
chdir $mx->config->{$APP_HOME};
warn Dumper { targets => $mx->config->{RTBS} };
foreach my $target (keys %{ $mx->config->{RTBS}->{Targets} }) {
    my $cfg = $mx->config->{RTBS}->{Targets}->{$target};
    my @commands = (
        [
            ...
        ],
        [
            'unzip',
            '-o',
            "$cfg->{ConfigName}.zip",
            'Internal/AdapterConfig/Driver.xml'
        ],
        [
            'zip',
            "$cfg->{ConfigName}.zip",
            'Internal/AdapterConfig/Driver.xml'
        ],
        [
            'rm -rf Internal'
        ],
        [
            "rm -f $cfg->{ConfigName}.zip"
        ],
    );
    foreach my $cmnd (@commands) {
        warn Dumper { command => $cmnd };
        my ($stdout, $stderr, $status) = capture {
            system(@{ $cmnd });
        };
        warn Dumper { stdout => $stdout,
                      stderr => $stderr,
                      status => $status };
    }
=pod
    warn "running -scriptant /ANT_FILE:mxrt.RT_${target}argets_std.xml /ANT_TARGET:exportConfig -jopt:-DconfigName=Fixing -jopt:-DfileName=Fixing.zip');
    ($stdout, $stderr, $status) = capture {
        system('./command.app', '-scriptant -parameters');
    }
    ($stdout, $stderr, $status) = capture {
        system('unzip Real-time.zip Internal/AdapterConfig/Driver.xml');
    };
    my $xml = XMLin('Internal/AdapterConfig/MDPDriver.xml');
    print Dumper { xml => $xml };
    [[ ${AREAS} == "pr" ]] && {
        ${PREFIX}/substitute_mdp_driver_parts Internal/AdapterConfig/Driver.xml 123 controller@mdp-a-n1,controller@mdp-a-n2
    } || {
        ${PREFIX}/substitute_mdp_driver_parts Internal/AdapterConfig/Driver.xml z8pnOYpulGnWrR47y5UH0e96IU0OLadFdW/Bm controller@md-uat-n1,controller@md-uat-n2
    }
    zip Real-time.zip Internal/AdapterConfig/Driver.xml
    rm -rf Internal
    rm -f Real-time.zip
    print $mx->Dump( stdout => $stdout,
                     stderr => $stderr,
                     status => $status );
=cut
}
The part of your Perl code that produces the output is:
warn Dumper { stdout => $stdout,
stderr => $stderr,
status => $status };
Looking at the documentation for warn() we see:
Emits a warning, usually by printing it to STDERR
But your Java program is reading from STDOUT.
InputStream stdout = new StreamGobbler(sess.getStdout());
You have a couple of options.
Change your Perl code to send the output to STDOUT instead of STDERR. This could be as simple as changing warn() to print().
When calling the Perl program in the shell script, redirect STDERR to STDOUT.
/u01/app/perl_script.pl 2>&1
I guess you could also set up your Java program to read from STDERR as well. But I'm not a Java programmer, so I wouldn't be able to advise you on the best way to do that.
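For completeness, here is a rough sketch of that third option with Ganymed, reusing the sess variable from the question. Session exposes getStderr() alongside getStdout(), and wrapping both streams in StreamGobbler lets them buffer in the background so neither pipe can fill up and stall the remote script:

import java.io.BufferedReader;
import java.io.InputStreamReader;
import ch.ethz.ssh2.StreamGobbler;

BufferedReader stdout = new BufferedReader(
        new InputStreamReader(new StreamGobbler(sess.getStdout())));
BufferedReader stderr = new BufferedReader(
        new InputStreamReader(new StreamGobbler(sess.getStderr())));

String line;
while ((line = stdout.readLine()) != null) {
    System.out.println("OUT: " + line);
}
// the warn() output from the Perl script arrives here
while ((line = stderr.readLine()) != null) {
    System.out.println("ERR: " + line);
}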

Internet Explorer DOM behaves differently in different environments

I have a PowerShell script like this:
param(
    [string]$u,
    [string]$p
)
$username = $u
$password = $p
cd HKCU:\"Software\Microsoft\Windows\CurrentVersion\Internet Settings"
set-itemproperty . ProxyEnable 1
$url = "https://ameriprisestage.service-now.com/"
$ie = New-Object -com internetexplorer.application;
$ie.visible = $true;
$ie.navigate($url);
while ($ie.ReadyState -ne 4 -or $ie.Busy)
{
    start-sleep -s 5;
}
$ieFrame = $ie.Document.getElementById("gsft_main")
if (($ieFrame -eq $null)) {
    "ieframe is null" | Out-File 'D:\\file.txt' -Append
    exit
}
$usrCtrl = $ie.Document.getElementById("user_name")
if ($usrCtrl -eq $null) {
    " usrCtrl is null at 1" | Out-File 'D:\\file.txt' -Append
}
$usrCtrl = $ieFrame.document.getElementById("user_name")
if ($usrCtrl -eq $null) {
    " usrCtrl is null at 2" | Out-File 'D:\\file.txt' -Append
}
$usrCtrl = $ieFrame.contentWindow.document.getElementById("user_name")
if ($usrCtrl -eq $null) {
    " usrCtrl is null at 3" | Out-File 'D:\\file.txt' -Append
}
$usrCtrl = $ieFrame.contentDocument.getElementById("user_name")
if ($usrCtrl -eq $null) {
    " usrCtrl is null at 4" | Out-File 'D:\\file.txt' -Append
}
$usrCtrl = $ieFrame.getElementById("user_name")
if ($usrCtrl -eq $null) {
    " usrCtrl is null at 5" | Out-File 'D:\\file.txt' -Append
}
$usrCtrl.value = $username
$pass = $ieFrame.contentWindow.Document.getElementById("user_password").value = $password
$buttn = $ieFrame.contentWindow.document.getElementById("sysverb_login").click()
When I run this code from the PowerShell ISE, usrCtrl is not null at 3 and 5, but when I invoke the same code from a Java program, usrCtrl is null at 1, 2, 3, 4 and 5.
I can't figure out what is happening. Can someone help me?
Thanks,
Sujith
Change this
cd HKCU:\"Software\Microsoft\Windows\CurrentVersion\Internet Settings"
to
cd "HKCU:\Software\Microsoft\Windows\CurrentVersion\Internet Settings"
AND this:
D:\\file.txt
to
D:\file.txt
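Beyond those two fixes, since the symptom only shows up when the script is launched from Java, it is worth looking at how it is invoked there. A hedged sketch of one way to run it (the script path is a hypothetical placeholder):

import java.io.BufferedReader;
import java.io.InputStreamReader;

public class RunLoginScript {
    public static void main(String[] args) throws Exception {
        // -ExecutionPolicy Bypass avoids policy differences between
        // the ISE session and a freshly spawned powershell.exe
        ProcessBuilder pb = new ProcessBuilder(
                "powershell.exe", "-ExecutionPolicy", "Bypass",
                "-File", "C:\\scripts\\login.ps1",   // hypothetical path
                "-u", "someUser", "-p", "somePass");
        pb.redirectErrorStream(true);                 // surface PS errors
        Process p = pb.start();
        BufferedReader r = new BufferedReader(new InputStreamReader(p.getInputStream()));
        String line;
        while ((line = r.readLine()) != null) {
            System.out.println(line);
        }
        System.out.println("exit: " + p.waitFor());
    }
}

One other difference to be aware of: the ISE runs in STA mode while older console hosts default to MTA, which can change how COM automation such as internetexplorer.application behaves; passing -Sta to powershell.exe forces the ISE behavior.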

Shuffle multiple files in same order

Setup:
I have 50 files, each with 25000 lines.
To-do:
I need to shuffle all of them "in the same order".
E.g.:
If before shuffle:
File 1 File 2 File 3
A A A
B B B
C C C
then after shuffle I should get:
File 1 File 2 File 3
B B B
C C C
A A A
i.e. corresponding rows in files should be shuffled in same order.
Also, the shuffle should be deterministic, i.e. if I give File A as input, it should always produce same shuffled output.
I can write a Java program to do it, and probably a script too. Something like: shuffle the numbers between 1 and 25000 and store them in a file, say shuffle_order. Then simply process one file at a time and reorder its existing rows according to shuffle_order. But is there a better/quicker way to do this?
Please let me know if more info needed.
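Since the question mentions that a Java program is acceptable, the shuffle_order idea can also be done without an intermediate file: build one permutation from a fixed seed and apply it to every file. A sketch along those lines (the seed and the .rand output suffix are arbitrary choices, and every file is assumed to have exactly 25000 lines):

import java.nio.file.*;
import java.util.*;

public class ShuffleTogether {
    public static void main(String[] args) throws Exception {
        long seed = 123L; // same seed => same permutation on every run

        // one shared permutation of the row indices 0..24999
        List<Integer> order = new ArrayList<>();
        for (int i = 0; i < 25000; i++) order.add(i);
        Collections.shuffle(order, new Random(seed));

        for (String name : args) { // pass all 50 file names
            List<String> lines = Files.readAllLines(Paths.get(name));
            List<String> shuffled = new ArrayList<>(lines.size());
            for (int idx : order) shuffled.add(lines.get(idx));
            Files.write(Paths.get(name + ".rand"), shuffled);
        }
    }
}

Because java.util.Random and the Collections.shuffle algorithm are fully specified, the output is deterministic for a given seed, which satisfies the "same input, same shuffled output" requirement.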
The following uses only basic bash commands. The principle is:
generate a random order (numbers)
reorder every file in that order
The code:
#!/bin/bash
case "$#" in
    0) echo "Usage: $0 files....." ; exit 1;;
esac

ORDER="./.rand.$$"
trap "rm -f $ORDER;exit" 1 2

count=$(grep -c '^' "$1")
let odcount=$(($count * 4))
paste -d" " <(od -A n -N $odcount -t u4 /dev/urandom | grep -o '[0-9]*') <(seq -w $count) |\
    sort -k1n | cut -d " " -f2 > $ORDER
# if your system has the "shuf" command you can replace the above 3 lines with a simple
#seq -w $count | shuf > $ORDER

for file in "$@"
do
    paste -d' ' $ORDER $file | sort -k1n | cut -d' ' -f2- > "$file.rand"
done

echo "the order is in the file $ORDER" # remove this line
#rm -f $ORDER # and uncomment this
# if you don't need to preserve the order

paste -d " " *.rand # remove this line - it is only for showing the test result
from the input files:
A B C
--------
a1 a2 a3
b1 b2 b3
c1 c2 c3
d1 d2 d3
e1 e2 e3
f1 f2 f3
g1 g2 g3
h1 h2 h3
i1 i2 i3
j1 j2 j3
will make A.rand B.rand C.rand with the next example content
g1 g2 g3
e1 e2 e3
b1 b2 b3
c1 c2 c3
f1 f2 f3
j1 j2 j3
d1 d2 d3
h1 h2 h3
i1 i2 i3
a1 a2 a3
real testing - generating 50 files with 25k lines
line="Consequatur qui et qui. Mollitia expedita aut excepturi modi. Enim nihil et laboriosam sit a tenetur."
for n in $(seq -w 50)
do
    seq -f "$line %g" 25000 > file.$n
done
running the script
bash sorter.sh file.??
result on my notebook
real 1m13.404s
user 0m56.127s
sys 0m5.143s
Probably very inefficient, but try the following:
#!/bin/bash
arr=( $(for i in {1..25000}; do
    echo "$i"
done | shuf) )

for file in files*; do
    index=0
    new=$(while read line; do
        echo "${arr[$index]} $line"
        (( index++ ))
    done < "$file" | sort -h | sed 's/^[0-9]\+ //')
    echo "$new" > "$file"
done
I propose shuffling them with a Python script. By setting the same seed for every shuffle, you will obtain the same final data order.
import argparse
import logging
import os
import random

from tqdm import tqdm

logging.getLogger().setLevel(logging.INFO)


def main(args):
    assert os.path.isfile(args.input_file), (
        f"filename {args.input_file} does not exist"
    )
    logging.info("Reading input file...")
    with open(args.input_file) as fi:
        data = fi.readlines()
    logging.info("Generating indexes")
    indexes = list(range(len(data)))
    logging.info("Shuffling...")
    random.seed(args.seed)
    random.shuffle(indexes)
    logging.info(f"Writing results, in place? {args.in_place}")
    if not args.in_place:
        name, ext = os.path.splitext(args.input_file)
        new_filename = name + "_shuffled" + ext
        args.input_file = new_filename
    with open(args.input_file, "w") as fo:
        for index in tqdm(indexes, desc="Writing to output file..."):
            fo.write(data[index])
        fo.flush()
        os.fsync(fo.fileno())
    logging.info("Done!")


if __name__ == '__main__':
    parser = argparse.ArgumentParser("Shuffle file by lines.")
    parser.add_argument('--input_file', type=str, required=True, help="Input file to be shuffled")
    parser.add_argument('--in_place', action="store_true", help="Whether to shuffle file in-place.")
    parser.add_argument('--seed', type=int, required=True, help="Seed with which the file will be shuffled.")
    args = parser.parse_args()
    main(args)
You can run this script on each file with the same seed:
python shuffle.py --input_file File1 --seed 123
python shuffle.py --input_file File2 --seed 123
python shuffle.py --input_file File3 --seed 123
And all the files will be shuffled in the same way.

Name of applications running on port in Perl or Java

Xampp comes with a neat executable called xampp-portcheck.exe. It reports whether the required ports are free and, if not, which applications are running on them.
I can check if something is running on a port by reading the netstat details, but how do I find out which application is running on a port within Windows?
The CPAN module Win32::IPHelper provides access to GetExtendedTcpTable which provides the ProcessID for each connection.
Win32::Process::Info gives information about all running processes.
Combining the two, we get:
#!/usr/bin/perl
use strict;
use warnings;

use Win32;
use Win32::API;
use Win32::IPHelper;
use Win32::Process::Info qw( NT );
use Data::Dumper;

my @tcptable;
Win32::IPHelper::GetExtendedTcpTable(\@tcptable, 1);

my $pi = Win32::Process::Info->new;
my %pinfo = map {$_->{ProcessId} => $_ } $pi->GetProcInfo;

for my $conn ( @tcptable ) {
    my $pid = $conn->{ProcessId};
    $conn->{ProcessName} = $pinfo{$pid}->{Name};
    $conn->{ProcessExecutablePath} = $pinfo{$pid}->{ExecutablePath};
}

@tcptable =
    sort { $a->[0] cmp $b->[0] }
    map {[ sprintf("%s:%s", $_->{LocalAddr}, $_->{LocalPort}) => $_ ]}
    @tcptable;

print Dumper \@tcptable;
Output:
[
    '0.0.0.0:135',
    {
        'RemotePort' => 0,
        'LocalPort' => 135,
        'LocalAddr' => '0.0.0.0',
        'State' => 'LISTENING',
        'ProcessId' => 1836,
        'ProcessName' => 'svchost.exe',
        'ProcessExecutablePath' => 'C:\\WINDOWS\\system32\\svchost.exe',
        'RemoteAddr' => '0.0.0.0'
    }
],
...
[
    '192.168.169.150:1841',
    {
        'RemotePort' => 80,
        'LocalPort' => 1841,
        'LocalAddr' => '192.168.169.150',
        'State' => 'ESTABLISHED',
        'ProcessId' => 1868,
        'ProcessName' => 'firefox.exe',
        'ProcessExecutablePath' => 'C:\\Program Files\\Mozilla Firefox\\firefox.exe',
        'RemoteAddr' => '69.59.196.211'
    }
],
Phewwww it was exhausting connecting all these dots.
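For the Java half of the question there is no equivalent in the standard library, but on Windows you can shell out to netstat -ano, which prints the owning PID for each connection, and then resolve the PID to an image name with tasklist. A rough sketch of that approach:

import java.io.BufferedReader;
import java.io.InputStreamReader;

public class PortOwners {
    public static void main(String[] args) throws Exception {
        // -a all connections, -n numeric addresses, -o owning PID
        Process p = new ProcessBuilder("netstat", "-ano").start();
        BufferedReader r = new BufferedReader(new InputStreamReader(p.getInputStream()));
        String line;
        while ((line = r.readLine()) != null) {
            String[] f = line.trim().split("\\s+");
            // TCP rows look like: proto, local, remote, state, pid
            if (f.length == 5 && f[0].equals("TCP")) {
                System.out.println(f[1] + " -> PID " + f[4]);
                // the image name can then be looked up with, e.g.:
                //   tasklist /FI "PID eq <pid>" /FO CSV /NH
            }
        }
        p.waitFor();
    }
}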
