How to get Wildfly server status via CLI? - java

I've been trying to validate whether our server has started in WildFly using jboss-cli.bat.
This is the command I'm using:
/host=slave-1/server-config=REST-server-one:read-resource(include-runtime=true)
and this is what I'm getting from the command:
{
    "outcome" => "success",
    "result" => {
        "auto-start" => true,
        "cpu-affinity" => undefined,
        "group" => "wildfly-server-group",
        "name" => "wildfly-server",
        "priority" => undefined,
        "socket-binding-default-interface" => undefined,
        "socket-binding-group" => undefined,
        "socket-binding-port-offset" => 0,
        "status" => "STARTED",
        "update-auto-start-with-server-status" => false,
        "interface" => undefined,
        "jvm" => undefined,
        "path" => undefined,
        "ssl" => undefined,
        "system-property" => undefined
    }
}
Is there a command that will return the value of the status in that response?

You should be able to use the read-attribute operation.
/host=slave-1/server-config=REST-server-one:read-attribute(name=status)

I ended up using this:
// "p" is the Process obtained from launching jboss-cli.bat with the read-resource command
BufferedReader reader = new BufferedReader(new InputStreamReader(p.getInputStream()));
String line = reader.readLine();
while (line != null) {
    String[] value = line.split("=>");
    if (value.length > 1) {
        if (value[0].contains("\"status\"")) {
            System.out.println(value[1]);
        }
    }
    line = reader.readLine();
}
If anyone can suggest a better method, it would be greatly appreciated.
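A lighter alternative (a minimal sketch; it assumes jboss-cli.bat is on the PATH and reuses the host/server-config names from above) is to launch the CLI with the read-attribute operation from the answer, so that only the status value comes back, and then pick out the "result" line:

import java.io.BufferedReader;
import java.io.InputStreamReader;

public class ServerStatusCheck {
    public static void main(String[] args) throws Exception {
        ProcessBuilder pb = new ProcessBuilder(
                "jboss-cli.bat",
                "--connect",
                "--command=/host=slave-1/server-config=REST-server-one:read-attribute(name=status)");
        pb.redirectErrorStream(true);
        Process p = pb.start();
        try (BufferedReader reader =
                     new BufferedReader(new InputStreamReader(p.getInputStream()))) {
            String line;
            while ((line = reader.readLine()) != null) {
                // read-attribute returns a small response such as: "result" => "STARTED"
                if (line.contains("\"result\"")) {
                    System.out.println(line.split("=>")[1].trim());
                }
            }
        }
        p.waitFor();
    }
}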

Related

XML data display in grid using kibana and logstash

I want to display XML data in a grid format using Logstash and Kibana. Using the conf file below I am able to get data into the grid, but I am not able to split the row data.
Example:
Output
logstash.conf file:
input {
  file {
    path => "C:/ELK Stack/logstash-8.2.0-windows-x86_64/logstash-8.2.0/Test.xml"
    start_position => "beginning"
    sincedb_path => "NUL"
    codec => multiline {
      pattern => "^<?stations.*>"
      negate => "true"
      what => "previous"
      auto_flush_interval => 1
      max_lines => 3000
    }
  }
}
filter {
  xml {
    source => "message"
    target => "parsed"
    store_xml => "false"
    xpath => [
      "/stations/station/id/text()", "station_id",
      "/stations/station/name/text()", "station_name"
    ]
  }
  mutate {
    remove_field => [ "message" ]
  }
}
output {
  elasticsearch {
    action => "index"
    hosts => "localhost:9200"
    index => "logstash_index123xml"
    workers => 1
  }
  stdout {
    codec => rubydebug
  }
}
xpath will always return arrays. To associate the members of the two arrays you are going to need a ruby filter, and to get multiple events you can use a split filter on an array which you build in the ruby filter. If you start with
<stations>
    <station>
        <id>1</id>
        <name>a</name>
        <id>2</id>
        <name>b</name>
    </station>
</stations>
then if you use
xml {
  source => "message"
  store_xml => "false"
  xpath => {
    "/stations/station/id/text()" => "[@metadata][station_id]"
    "/stations/station/name/text()" => "[@metadata][station_name]"
  }
  remove_field => [ "message" ]
}
ruby {
  code => '
    ids = event.get("[@metadata][station_id]")
    names = event.get("[@metadata][station_name]")
    if ids.is_a? Array and names.is_a? Array and ids.length == names.length
      a = []
      ids.each_index { |x|
        a << { "station_name" => names[x], "station_id" => ids[x] }
      }
      event.set("[@metadata][theData]", a)
    end
  '
}
if [@metadata][theData] {
  split {
    field => "[@metadata][theData]"
    add_field => {
      "station_name" => "%{[@metadata][theData][station_name]}"
      "station_id" => "%{[@metadata][theData][station_id]}"
    }
  }
}
You will get two events
{
"station_name" => "a",
"station_id" => "1",
...
}
{
"station_name" => "b",
"station_id" => "2",
...
}

Failed in loading csv data to Elasticsearch, translation issue

I am trying to import a CSV file into Elasticsearch, but it failed and threw an error:
Pipeline aborted due to error {:pipeline_id=>"main",
:exception=>#,
:backtrace=>["/usr/local/Cellar/logstash/7.6.1/libexec/vendor/bundle/jruby/2.5.0/gems/logstash-filter-mutate-3.5.0/lib/logstash/filters/mutate.rb:222:in `block in register'",
"org/jruby/RubyHash.java:1428:in `each'",
"/usr/local/Cellar/logstash/7.6.1/libexec/vendor/bundle/jruby/2.5.0/gems/logstash-filter-mutate-3.5.0/lib/logstash/filters/mutate.rb:220:in `register'",
"org/logstash/config/ir/compiler/AbstractFilterDelegatorExt.java:56:in `register'",
"/usr/local/Cellar/logstash/7.6.1/libexec/logstash-core/lib/logstash/java_pipeline.rb:200:in `block in register_plugins'",
"org/jruby/RubyArray.java:1814:in `each'",
"/usr/local/Cellar/logstash/7.6.1/libexec/logstash-core/lib/logstash/java_pipeline.rb:199:in `register_plugins'",
"/usr/local/Cellar/logstash/7.6.1/libexec/logstash-core/lib/logstash/java_pipeline.rb:502:in `maybe_setup_out_plugins'",
"/usr/local/Cellar/logstash/7.6.1/libexec/logstash-core/lib/logstash/java_pipeline.rb:212:in `start_workers'",
"/usr/local/Cellar/logstash/7.6.1/libexec/logstash-core/lib/logstash/java_pipeline.rb:154:in `run'",
"/usr/local/Cellar/logstash/7.6.1/libexec/logstash-core/lib/logstash/java_pipeline.rb:109:in `block in start'"],
"pipeline.sources"=>["/Users/user/Document/Esk-Data/xudaxia.conf"],
:thread=>"#"}
Below is the conf file
input {
  file {
    path => ["/test.csv"]
    start_position => "beginning"
  }
}
filter {
  csv {
    separator => ","
    columns => ["comment_time", "comment", "id", "video_time"]
  }
  mutate {
    convert => {
      "comment_time" => "date_time"
      "comment" => "string"
      "id" => "integer"
      "video_time" => "float"
    }
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "test"
  }
}
test.csv
comment_time comment id video_time
2020/03/22 15:59:41 バイ a 123.100
2020/03/22 15:59:45 บาย b 100.100
2020/04/22 15:59:50 ByeBye c 80.210
Can anyone help?
According to the documentation, the option date_time doesn't exist for the convert action of the mutate plugin (doc here).
Moreover, this plugin is used to cast one type into another, which isn't your use case. If comment_time is not recognized as a date field, you should transform it with the date plugin (doc here).
So you should remove this block:
mutate{
convert => {
"comment_time" => "date_time"
"comment" => "string"
"id" => "integer"
"video_time" => "float"
}
}
and replace it with this one:
date {
  match => [ "comment_time", "yyyy/MM/dd HH:mm:ss" ]
}

Wildfly Logs are not saved

After restarting my Ubuntu VM, Wildfly 18 automatically starts.
From ps aux
wildfly 1031 0.0 0.0 20048 3508 ? Ss Dez18 0:00 /bin/bash /opt/wildfly/bin/launch.sh standalone standalone.xml 0.0.0.0
wildfly 1067 0.0 0.0 4628 1756 ? S Dez18 0:00 /bin/sh /opt/wildfly/bin/standalone.sh -c standalone.xml -b 0.0.0.0
wildfly 1482 35.2 7.0 1658040 572176 ? Sl Dez18 0:36 /opt/jdk-13.0.1/bin/java -D[Standalone] -server -Xms64m -Xmx512m -XX:MetaspaceSize=96M -XX:MaxMetaspaceSize=256m -Djava.net.preferIPv4Stack=true -Djboss.modules.system.pkgs
But my server.log is empty (cat /opt/wildfly/standalone/log/server.log gives no messages about the startup process of WildFly, etc.).
When I run "service wildfly restart", no additional entries are written to server.log. How can I get access to my log?
I think I've changed nothing compared to the standard config.
/standalone/configuration/logging.properties
loggers=sun.rmi,io.jaegertracing.Configuration,org.jboss.as.config,com.arjuna
logger.level=INFO
logger.handlers=FILE,CONSOLE
logger.sun.rmi.level=WARN
logger.sun.rmi.useParentHandlers=true
logger.io.jaegertracing.Configuration.level=WARN
logger.io.jaegertracing.Configuration.useParentHandlers=true
logger.org.jboss.as.config.level=DEBUG
logger.org.jboss.as.config.useParentHandlers=true
logger.com.arjuna.level=WARN
logger.com.arjuna.useParentHandlers=true
handler.CONSOLE=org.jboss.logmanager.handlers.ConsoleHandler
handler.CONSOLE.level=INFO
handler.CONSOLE.formatter=COLOR-PATTERN
handler.CONSOLE.properties=enabled,autoFlush,target
handler.CONSOLE.enabled=true
handler.CONSOLE.autoFlush=true
handler.CONSOLE.target=SYSTEM_OUT
handler.FILE=org.jboss.logmanager.handlers.PeriodicRotatingFileHandler
handler.FILE.level=ALL
handler.FILE.formatter=PATTERN
handler.FILE.properties=append,autoFlush,enabled,suffix,fileName
handler.FILE.append=true
handler.FILE.autoFlush=true
handler.FILE.enabled=true
handler.FILE.suffix=.yyyy-MM-dd
handler.FILE.fileName=/opt/wildfly/standalone/log/server.log
formatter.PATTERN=org.jboss.logmanager.formatters.PatternFormatter
formatter.PATTERN.properties=pattern
formatter.PATTERN.pattern=%d{yyyy-MM-dd HH\:mm\:ss,SSS} %-5p [%c] (%t) %s%e%n
formatter.COLOR-PATTERN=org.jboss.logmanager.formatters.PatternFormatter
formatter.COLOR-PATTERN.properties=pattern
formatter.COLOR-PATTERN.pattern=%K{level}%d{HH\:mm\:ss,SSS} %-5p [%c] (%t) %s%e%n
Output from the standalone CLI:
[standalone#localhost:9990 /] /subsystem=logging:read-resource(recursive=true)
{
"outcome" => "success",
"result" => {
"add-logging-api-dependencies" => true,
"use-deployment-logging-config" => true,
"async-handler" => undefined,
"console-handler" => {"CONSOLE" => {
"autoflush" => true,
"enabled" => true,
"encoding" => undefined,
"filter" => undefined,
"filter-spec" => undefined,
"formatter" => "%d{HH:mm:ss,SSS} %-5p [%c] (%t) %s%e%n",
"level" => "INFO",
"name" => "CONSOLE",
"named-formatter" => "COLOR-PATTERN",
"target" => "System.out"
}},
"custom-formatter" => undefined,
"custom-handler" => undefined,
"file-handler" => undefined,
"filter" => undefined,
"json-formatter" => undefined,
"log-file" => undefined,
"logger" => {
"com.arjuna" => {
"category" => "com.arjuna",
"filter" => undefined,
"filter-spec" => undefined,
"handlers" => undefined,
"level" => "WARN",
"use-parent-handlers" => true
},
"io.jaegertracing.Configuration" => {
"category" => "io.jaegertracing.Configuration",
"filter" => undefined,
"filter-spec" => undefined,
"handlers" => undefined,
"level" => "WARN",
"use-parent-handlers" => true
},
"org.jboss.as.config" => {
"category" => "org.jboss.as.config",
"filter" => undefined,
"filter-spec" => undefined,
"handlers" => undefined,
"level" => "DEBUG",
"use-parent-handlers" => true
},
"sun.rmi" => {
"category" => "sun.rmi",
"filter" => undefined,
"filter-spec" => undefined,
"handlers" => undefined,
"level" => "WARN",
"use-parent-handlers" => true
}
},
"logging-profile" => undefined,
"pattern-formatter" => {
"PATTERN" => {
"color-map" => undefined,
"pattern" => "%d{yyyy-MM-dd HH:mm:ss,SSS} %-5p [%c] (%t) %s%e%n"
},
"COLOR-PATTERN" => {
"color-map" => undefined,
"pattern" => "%K{level}%d{HH:mm:ss,SSS} %-5p [%c] (%t) %s%e%n"
}
},
"periodic-rotating-file-handler" => {"FILE" => {
"append" => true,
"autoflush" => true,
"enabled" => true,
"encoding" => undefined,
"file" => {
"relative-to" => "jboss.server.log.dir",
"path" => "server.log"
},
"filter" => undefined,
"filter-spec" => undefined,
"formatter" => "%d{HH:mm:ss,SSS} %-5p [%c] (%t) %s%e%n",
"level" => "ALL",
"name" => "FILE",
"named-formatter" => "PATTERN",
"suffix" => ".yyyy-MM-dd"
}},
"periodic-size-rotating-file-handler" => undefined,
"root-logger" => {"ROOT" => {
"filter" => undefined,
"filter-spec" => undefined,
"handlers" => [
"CONSOLE",
"FILE"
],
"level" => "INFO"
}},
"size-rotating-file-handler" => undefined,
"socket-handler" => undefined,
"syslog-handler" => undefined,
"xml-formatter" => undefined
}
}
My problem was that server.log had owner root because of prior testing ... I just deleted it and now it's re-created with owner wildfly.

Log analysis to query logs based on a log message

I have a Java application that outputs logs in the format:
timestamp UUID1 some information
timestamp UUID1 some more information
timestamp UUID1 x = 1
timestamp UUID2 some information
timestamp UUID2 some more information
timestamp UUID2 x = 2
timestamp UUID3 some information
timestamp UUID3 some more information
timestamp UUID3 x = 1
I want to implement a log analysis framework using Elasticsearch, Logstash and Kibana. Is it possible to get the logs only according to the X value?
For example:
If I query X = 1, I should get only the following logs.
timestamp UUID1 some information
timestamp UUID1 some more information
timestamp UUID1 x = 1
timestamp UUID3 some information
timestamp UUID3 some more information
timestamp UUID3 x = 1
If I query X = 2, I should get only the following logs.
timestamp UUID2 some information
timestamp UUID2 some more information
timestamp UUID2 x = 2
I am in control of the log message format. If it is not directly possible to do this query, I can change the message format as well.
UPDATE 1:
I will be a little more specific.
The following are my log statements.
MDC.put("uuid", UUID.randomUUID().toString());
logger.info("Assigning value to the variable : {}", name);
this.setVal(value.getVal());
logger.info("{} = {}", name, value.getVal());
logger.info("Assigned value {} to the variable : {}", value.getVal(), name);
MDC.clear();
I receive the log statements in Logstash over UDP, and I am getting messages like:
{
"#timestamp" => "2015-04-01T10:23:37.846+05:30",
"#version" => 1,
"message" => "Assigning value to the variable : X",
"logger_name" => "com.example.logstash.Variable",
"thread_name" => "pool-1-thread-1",
"level" => "INFO",
"level_value" => 20000,
"HOSTNAME" => "pnibinkj-W7-1",
"uuid" => "ab17b842-8348-4474-98e4-8bc2b8dd6781",
"host" => "127.0.0.1"
}
{
"#timestamp" => "2015-04-01T10:23:37.846+05:30",
"#version" => 1,
"message" => "Assigning value to the variable : Y",
"logger_name" => "com.example.logstash.Variable",
"thread_name" => "pool-1-thread-2",
"level" => "INFO",
"level_value" => 20000,
"HOSTNAME" => "pnibinkj-W7-1",
"uuid" => "d5513e4c-de3b-4144-87e4-87b077ac8056",
"host" => "127.0.0.1"
}
{
"#timestamp" => "2015-04-01T10:23:37.862+05:30",
"#version" => 1,
"message" => "Y = 1",
"logger_name" => "com.example.logstash.Variable",
"thread_name" => "pool-1-thread-2",
"level" => "INFO",
"level_value" => 20000,
"HOSTNAME" => "pnibinkj-W7-1",
"uuid" => "d5513e4c-de3b-4144-87e4-87b077ac8056",
"host" => "127.0.0.1"
}
{
"#timestamp" => "2015-04-01T10:23:37.863+05:30",
"#version" => 1,
"message" => "X = 1",
"logger_name" => "com.example.logstash.Variable",
"thread_name" => "pool-1-thread-1",
"level" => "INFO",
"level_value" => 20000,
"HOSTNAME" => "pnibinkj-W7-1",
"uuid" => "ab17b842-8348-4474-98e4-8bc2b8dd6781",
"host" => "127.0.0.1"
}
{
"#timestamp" => "2015-04-01T10:23:37.863+05:30",
"#version" => 1,
"message" => "Assigned value 1 to the variable : X",
"logger_name" => "com.example.logstash.Variable",
"thread_name" => "pool-1-thread-1",
"level" => "INFO",
"level_value" => 20000,
"HOSTNAME" => "pnibinkj-W7-1",
"uuid" => "ab17b842-8348-4474-98e4-8bc2b8dd6781",
"host" => "127.0.0.1"
}
{
"#timestamp" => "2015-04-01T10:23:37.863+05:30",
"#version" => 1,
"message" => "Assigned value 1 to the variable : Y",
"logger_name" => "com.example.logstash.Variable",
"thread_name" => "pool-1-thread-2",
"level" => "INFO",
"level_value" => 20000,
"HOSTNAME" => "pnibinkj-W7-1",
"uuid" => "d5513e4c-de3b-4144-87e4-87b077ac8056",
"host" => "127.0.0.1"
}
There are 2 UUIDs
"d5513e4c-de3b-4144-87e4-87b077ac8056" for "Y = 1"
"ab17b842-8348-4474-98e4-8bc2b8dd6781" for "X = 1"
There are two other messages for each UUID. I want to combine them into a single event.
I am not sure how to write the multiline filter for this case.
filter {
  multiline {
    pattern => "."
    what => "previous"
    stream_identity => "%{uuid}"
  }
}
"pattern" and "what" are required fields, it seems. What should I provide for these fields. How do I use Stream Identity?
Please point me in right direction.
Thanks,
Paul
You would need to combine your messages (see multiline{} filter, which supports stream_identity), and then a regular query would return the appropriate message.
This should be possible using the Kibana filters if X is some unique value, but with the logs in the format shown you'd need to use the multiline filter to join the entries together.
With that in place, you could probably use a query something like
message: "X=1"
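Since you are already putting the UUID into the MDC and say you can change the log format, another option (a minimal sketch, not part of the answers above) is to put the x value into the MDC as well. Assuming your UDP appender forwards MDC properties the same way it already forwards "uuid", every event for that unit of work then carries an "x" field, and you can filter directly in Kibana with something like x:1 without joining lines:

import java.util.UUID;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.slf4j.MDC;

public class Variable {
    private static final Logger logger = LoggerFactory.getLogger(Variable.class);

    public void assign(String name, int val) {
        MDC.put("uuid", UUID.randomUUID().toString());
        // Hypothetical extra MDC key: it should appear on every event of this
        // unit of work, just as "uuid" already does in the events shown above.
        MDC.put("x", String.valueOf(val));
        try {
            logger.info("Assigning value to the variable : {}", name);
            logger.info("{} = {}", name, val);
            logger.info("Assigned value {} to the variable : {}", val, name);
        } finally {
            MDC.clear();
        }
    }
}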

Name of applications running on port in Perl or Java

XAMPP comes with a neat executable called xampp-portcheck.exe. It reports whether the required ports are free and, if not, which applications are running on those ports.
I can check if something is running on a port by looking at the netstat details, but how do I find out which application is running on the port within Windows?
The CPAN module Win32::IPHelper provides access to GetExtendedTcpTable which provides the ProcessID for each connection.
Win32::Process::Info gives information about all running processes.
Combining the two, we get:
#!/usr/bin/perl
use strict;
use warnings;

use Win32;
use Win32::API;
use Win32::IPHelper;
use Win32::Process::Info qw( NT );
use Data::Dumper;

my @tcptable;
Win32::IPHelper::GetExtendedTcpTable(\@tcptable, 1);

my $pi = Win32::Process::Info->new;
my %pinfo = map { $_->{ProcessId} => $_ } $pi->GetProcInfo;

for my $conn ( @tcptable ) {
    my $pid = $conn->{ProcessId};
    $conn->{ProcessName} = $pinfo{$pid}->{Name};
    $conn->{ProcessExecutablePath} = $pinfo{$pid}->{ExecutablePath};
}

@tcptable =
    sort { $a->[0] cmp $b->[0] }
    map { [ sprintf("%s:%s", $_->{LocalAddr}, $_->{LocalPort}) => $_ ] }
    @tcptable;

print Dumper \@tcptable;
Output:
[
'0.0.0.0:135',
{
'RemotePort' => 0,
'LocalPort' => 135,
'LocalAddr' => '0.0.0.0',
'State' => 'LISTENING',
'ProcessId' => 1836,
'ProcessName' => 'svchost.exe',
'ProcessExecutablePath' => 'C:\\WINDOWS\\system32\\svchost.exe',
'RemoteAddr' => '0.0.0.0'
}
],
...
[
'192.168.169.150:1841',
{
'RemotePort' => 80,
'LocalPort' => 1841,
'LocalAddr' => '192.168.169.150',
'State' => 'ESTABLISHED',
'ProcessId' => 1868,
'ProcessName' => 'firefox.exe',
'ProcessExecutablePath' => 'C:\\Program Files\\Mozilla Firefox\\firefox.exe',
'RemoteAddr' => '69.59.196.211'
}
],
Phewwww it was exhausting connecting all these dots.
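Since the question also asks about Java: here is a rough sketch (assuming a Windows machine where netstat and tasklist are on the PATH) that shells out to netstat -ano to find the PID listening on each port, then to tasklist to resolve that PID to an image name:

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.util.LinkedHashMap;
import java.util.Map;

public class PortOwners {
    public static void main(String[] args) throws Exception {
        // Map "local address:port" -> PID, parsed from "netstat -ano" output.
        Map<String, String> portToPid = new LinkedHashMap<>();
        Process netstat = new ProcessBuilder("netstat", "-ano").start();
        try (BufferedReader r = new BufferedReader(new InputStreamReader(netstat.getInputStream()))) {
            String line;
            while ((line = r.readLine()) != null) {
                String[] f = line.trim().split("\\s+");
                // Listening TCP lines look like: TCP  0.0.0.0:135  0.0.0.0:0  LISTENING  1836
                if (f.length >= 5 && f[0].equals("TCP") && f[3].equals("LISTENING")) {
                    portToPid.put(f[1], f[4]);
                }
            }
        }
        for (Map.Entry<String, String> e : portToPid.entrySet()) {
            System.out.println(e.getKey() + " -> PID " + e.getValue()
                    + " (" + imageName(e.getValue()) + ")");
        }
    }

    // Resolve a PID to its executable name via: tasklist /FI "PID eq <pid>" /FO CSV /NH
    private static String imageName(String pid) throws Exception {
        Process tasklist = new ProcessBuilder(
                "tasklist", "/FI", "PID eq " + pid, "/FO", "CSV", "/NH").start();
        try (BufferedReader r = new BufferedReader(new InputStreamReader(tasklist.getInputStream()))) {
            String line = r.readLine();
            if (line != null && line.startsWith("\"")) {
                return line.split(",")[0].replace("\"", ""); // first CSV column is the image name
            }
        }
        return "unknown";
    }
}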
