Permission error occurred while executing yardstick-ignite framework - java

Why is this console statement printed multiple times while running the yardstick-ignite framework example?
If I've made a mistake, can you please show me how to run the yardstick-ignite framework example?
<16:56:46><yardstick> Starting driver config '...-cn query -sn IgniteNode -ds Ignite-sql-query-put-1-backup' on localhost
Permission denied (publickey,password).
Permission denied (publickey,password).
Permission denied (publickey,password).
Run steps:
1) Clone the GitHub repository: git clone https://github.com/yardstick-benchmarks/yardstick-ignite
2) Run mvn package to compile the project and unpack the scripts
3) Change the IPs of the driver and server in benchmark.properties (see the sample fragment after these steps)
4) Run ./benchmark-run-all.sh
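For step 3, the relevant lines in benchmark.properties look something like this (SERVER_HOSTS and DRIVER_HOSTS are the property names used in yardstick's sample config; the addresses below are placeholders):

# Comma-separated IPs of the hosts that will run Ignite server nodes.
SERVER_HOSTS=192.168.1.10,192.168.1.11
# Comma-separated IPs of the hosts that will run the benchmark drivers.
DRIVER_HOSTS=192.168.1.12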

You need to configure SSH access to localhost without a password. Refer to the instructions here: How to ssh to localhost without password?
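In short, a minimal sketch (assuming an empty passphrase is acceptable for a local benchmark setup):

$ ssh-keygen -t rsa     # accept the default path, leave the passphrase empty
$ ssh-copy-id localhost # appends the public key to ~/.ssh/authorized_keys
$ ssh localhost         # should now log in without a password prompt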
BTW, this repository contains benchmarks for old Apache Ignite and yardstick versions. The new version of yardstick doesn't have this limitation, and newer versions of Apache Ignite provide better performance. Recent versions of Apache Ignite are distributed with benchmarks: you can download them at https://ignite.apache.org/download.cgi#binaries and find the benchmarks and instructions in the /benchmarks folder.

Related

Java Azure Functions deployment error: Failed to execute goal com.microsoft.azure:azure-functions-maven-plugin:1.19.0:deploy

I have created an Azure function in Java using the command line, following the Microsoft guide at https://learn.microsoft.com/en-us/azure/azure-functions/create-first-function-cli-java
I have followed all the steps, and locally the function is deployed and working fine, but while trying to deploy it into Azure using the following commands:
I am getting the following error on the command line:
Can anyone please help me resolve the above error so I can deploy the function into Azure?
Thanks & Regards,
Preethi H R
I created a user in my Azure subscription, granted it the Reader role, and then deployed the function app from local to Azure using the Maven command mvn azure-functions:deploy:
So, instead of the Reader role on the subscription, I granted the Contributor role this time and checked the deployment, which succeeded:
In Azure Portal:
Besides Contributor access, two more roles are available for accessing the Function App with specific privileges, as mentioned in one of the Microsoft Q&A answers by @MughundhanRaveendran.
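For example, the Contributor role can be granted with the Azure CLI (a sketch; the assignee and subscription ID are placeholders):

$ az role assignment create \
    --assignee "user@example.com" \
    --role "Contributor" \
    --scope "/subscriptions/<subscription-id>"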

What version of sqljdbc do I require?

I am running into an issue with my application where I get the following error:
The driver could not establish a secure connection to SQL Server by using Secure Sockets Layer (SSL) encryption. Error: "SQL Server returned an incomplete response."
I have looked, and the only difference on my new server is the version of SQL Server, which leads me to think I have the wrong version of sqljdbc4.jar.
I have been looking for a version that works with Java 1.6.0_27 and MS SQL Server 2017.
Does anyone have any suggestions of what version of this JAR I require?
Please try adding the sqljdbc42.jar file to your build path.
Use the following link if it is a Maven project; otherwise, download the JAR from a trusted URL.
https://mvnrepository.com/artifact/com.microsoft.sqlserver/sqljdbc42/6.0.8112
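If it is a Maven project, the dependency from that link looks like this (coordinates taken from the link above; verify the version fits your environment):

<dependency>
    <groupId>com.microsoft.sqlserver</groupId>
    <artifactId>sqljdbc42</artifactId>
    <version>6.0.8112</version>
</dependency>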

What is the Kerberos method?

I'm trying to connect to Hive with JDBC. I keep getting this error. I tried looking it up but could not find anything useful.
This is my connection string:
jdbc:hive2://hostname.xxx.com:10000/default;principal=hive/hostname.xxx.com@HADOOP_ENV.COM
What is this error: java.lang.NoSuchMethodError: org.apache.hadoop.security.authentication.util.KerberosUtil.hasKerberosTicket(Ljavax/security/auth/Subject;)Z
That method exists in Hadoop 2.8 but not in Hadoop 2.7 -- so my guess is that your project dependencies are not aligned with whatever version of Hadoop you have in Production.
Code in trunk:
https://github.com/apache/hadoop/blob/trunk/hadoop-common-project/hadoop-auth/src/main/java/org/apache/hadoop/security/authentication/util/KerberosUtil.java
Code in branch-2.8.0:
https://github.com/apache/hadoop/blob/branch-2.8.0/hadoop-common-project/hadoop-auth/src/main/java/org/apache/hadoop/security/authentication/util/KerberosUtil.java
Code in branch-2.7.4:
https://github.com/apache/hadoop/blob/branch-2.7.4/hadoop-common-project/hadoop-auth/src/main/java/org/apache/hadoop/security/authentication/util/KerberosUtil.java
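If that is the case, one fix is to pin the Hadoop client artifacts in your pom.xml to the version actually deployed in Production (a sketch; 2.7.4 is only an example version, substitute your cluster's):

<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-auth</artifactId>
    <version>2.7.4</version> <!-- example: match your production Hadoop version -->
</dependency>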
Kerberos is an authentication protocol used by the Hive server (https://en.wikipedia.org/wiki/Kerberos_(protocol)).
The problem you are describing is more likely a missing library in your pom.xml. Have you included <artifactId>hive-jdbc</artifactId>?
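If not, a typical declaration looks like this (a sketch; the version placeholder should match your Hive server):

<dependency>
    <groupId>org.apache.hive</groupId>
    <artifactId>hive-jdbc</artifactId>
    <version>${hive.version}</version> <!-- placeholder: use your Hive version -->
</dependency>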
I think your Kerberos ticket is not generated properly.
Can you try running these two commands, in order, as the user you are trying to connect with:
kdestroy (deletes any Kerberos ticket generated before)
kinit (generates a new ticket)
Then try to connect again.
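You can verify the result with klist before reconnecting:

$ kdestroy
$ kinit
$ klist    # should list a valid ticket-granting ticket for your principal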

ERROR Could not register mbeans java.security.AccessControlException: access denied ("javax.management.MBeanTrustPermission" "register")

When starting, for example, Elasticsearch 5.5:
main ERROR Could not register mbeans java.security.AccessControlException: access denied ("javax.management.MBeanTrustPermission" "register")
A workaround with Oracle Java 1.8.0_131 is to open the file <jre>/lib/security/java.policy and add this line to the grant section (i.e., between the curly brackets):
permission javax.management.MBeanTrustPermission "register";
Why a workaround? The proper solution would be to specify an extra grant section stating exactly which code should get this permission.
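Such a scoped grant looks like this (a sketch; the codeBase URL is a placeholder for wherever the registering code actually lives):

grant codeBase "file:/path/to/elasticsearch/lib/-" {
    permission javax.management.MBeanTrustPermission "register";
};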
I got the same error, and the answer is here: java.security.AccessControlException when using Ant, but runs ok when invoking java from console.
Append the grant section in the java.policy file with:
permission javax.management.MBeanTrustPermission "register";
I had this same issue when moving from a single instance to two instances locally.
I tried what Alice suggested above. I even re-installed Elasticsearch (5.5.0).
I also updated my Java to the latest one for Linux provided by Oracle.
Nothing was working. Then I discovered, that I couldn't just take the elasticsearch-5.5.0/config directory and rename it to elasticsearch-5.5.0/node1.
So... I had to leave that config directory in place and clone it to node1/node2.
EVEN if I configure path.config in the runtime args, ES still needs that baseline config directory.
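In other words, something like this (a sketch for the layout described above):

$ cp -r elasticsearch-5.5.0/config elasticsearch-5.5.0/node1
$ cp -r elasticsearch-5.5.0/config elasticsearch-5.5.0/node2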
Hope this helps.
I faced the same issue on an Ubuntu 16.04 system.
Solution:
The Elasticsearch service is not allowed to run as the "root" user, so change the ownership of the Elasticsearch folder with the commands below:
Go to the directory containing the Elasticsearch installation, then run:
$ sudo chown -R user_name:user_group elasticsearch-5.5.0
$ elasticsearch-5.5.0/bin/elasticsearch
This will start the Elasticsearch service. It is working for me perfectly.

Confluence configuration Spring Application context has not been set

I tried to install Confluence on my own Ubuntu server, but it always failed. The error is:
com.atlassian.util.concurrent.LazyReference$InitializationException: java.lang.IllegalStateException: Spring Application context has not been set
at com.atlassian.util.concurrent.LazyReference.getInterruptibly(LazyReference.java:149)
Caused by: java.lang.IllegalStateException: Spring Application context has not been set
at com.atlassian.spring.container.SpringContainerContext.getComponent(SpringContainerContext.java:48)
I saw some solutions in the Jira/Confluence forum saying to fix the permissions of the installation directory and home directory. I tried that, but it failed again. How can I fix the problem?
In my case the issue was a corrupted confluence.cfg.xml file (it contains DB connection strings and other settings). The file size was 0 bytes.
I would suggest using a VM to create a new installation and borrowing confluence.cfg.xml from that installation.
It's embarrassing that this behavior has been allowed to exist for nearly 7 years in a commercial product. This is basic stuff...
I wish this were in the instructions somewhere:
Make a single backup copy of confluence.cfg.xml immediately before any write to it by the application. The application should be able to restore from the backup copy if the file gets corrupted.
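Until then, a manual safeguard is simple (a sketch; the path assumes the default Linux Confluence home, adjust to yours):

$ cp /var/atlassian/application-data/confluence/confluence.cfg.xml \
     /var/atlassian/application-data/confluence/confluence.cfg.xml.bak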
Atlassian documentation lists the following causes of this problem:
The user running Confluence does not have write permissions to the home folder defined in <install>/confluence/WEB-INF/classes/confluence-init.properties or the install directory.
You are running Confluence as the root user, or you have an application firewall enabled (SELinux or AppArmor).
The database driver is not located in the <install>/confluence/WEB-INF/lib folder or you are using a database version that is incompatible with the bundled driver.
The hostname of the server cannot be resolved.
In my case, I was running it as the root user inside a Docker container.
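For the home-directory permissions cause above, a typical fix on Linux looks like this (a sketch; the confluence user and home path are common defaults, substitute your own):

$ sudo chown -R confluence:confluence /var/atlassian/application-data/confluence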
