I've tried loading data into Hive from the command line, and it works fine that way.
Now I want to load the data through Java. I've written code and I'm able to create tables and databases and insert values, but the LOAD command is not working.
private static String driverName = "org.apache.hive.jdbc.HiveDriver";
private static String databaseURL = "jdbc:hive2://server_name:10001/test";
private static String userName = "<hadoop_user>";
private static String password = "<password>";
private static Connection con = null;
private final static Logger log = Logger.getLogger("HiveJdbcClient");
private static String dbName = "db_name",
        tableName = "table_name",
        path = "";
private void loadData(String path, String tableName) {
    Statement stmt;
    try {
        stmt = con.createStatement();
        // build the statement once so the logged SQL matches what is executed
        String sql = "LOAD DATA LOCAL INPATH 'file:/" + path + "' OVERWRITE INTO TABLE " + tableName;
        System.out.println("Executing: " + sql);
        stmt.execute(sql);
        con.close();
    } catch (SQLException e) {
        e.printStackTrace();
    }
}
This gives the following error:
Caused by: org.apache.hadoop.hive.ql.security.authorization.plugin.HiveAccessControlException: Permission denied: Principal [name=hadoop, type=USER] does not have following privileges for operation LOAD [[SELECT, INSERT, DELETE, OBJECT OWNERSHIP] on Object [type=LOCAL_URI, name=file:/D:/DTCC/Pig/Dummy_data_Main.tsv]]
at org.apache.hadoop.hive.ql.security.authorization.plugin.sqlstd.SQLAuthorizationUtils.assertNoDeniedPermissions(SQLAuthorizationUtils.java:414)
at org.apache.hadoop.hive.ql.security.authorization.plugin.sqlstd.SQLStdHiveAuthorizationValidator.checkPrivileges(SQLStdHiveAuthorizationValidator.java:96)
at org.apache.hadoop.hive.ql.security.authorization.plugin.HiveAuthorizerImpl.checkPrivileges(HiveAuthorizerImpl.java:85)
at org.apache.hadoop.hive.ql.Driver.doAuthorizationV2(Driver.java:725)
at org.apache.hadoop.hive.ql.Driver.doAuthorization(Driver.java:518)
at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:455)
at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:303)
at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1067)
at org.apache.hadoop.hive.ql.Driver.compileAndRespond(Driver.java:1061)
at org.apache.hive.service.cli.operation.SQLOperation.prepare(SQLOperation.java:100)
... 15 more
What I tried:
1) I gave all permissions to the hadoop user on the HDFS path of the table.
2) I granted all privileges on the table, such as SELECT, INSERT, DELETE.
Please help me resolve this issue.
Make sure of the following:
If you have a Kerberos security setup, don't forget to run kinit.
The user "hadoop" should have write access to the folder (the Hive table location). For any HDFS path whose permissions you are changing, a simple "chmod" may not be enough; you may need to run "hdfs dfs -setfacl -R -m user::rwx ".
Also, make sure the table location has the same parent directory as the other tables that you are able to create successfully. (Sometimes an admin can restrict creating tables in other locations.)
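The denied object in the stack trace is the local Windows path, so it is also worth double-checking the URI that actually reaches HiveServer2, since the statement is built by string concatenation. A small hedged sketch (table name and path are placeholders, not from the original code) that normalizes a Windows path before embedding it in the LOAD statement:

```java
public class LoadStatementBuilder {
    // Build the LOAD DATA statement from a local Windows path.
    // Backslashes are normalized to forward slashes so the URI is well-formed.
    static String buildLoadSql(String localPath, String tableName) {
        String normalized = localPath.replace('\\', '/');
        return "LOAD DATA LOCAL INPATH 'file:/" + normalized
                + "' OVERWRITE INTO TABLE " + tableName;
    }

    public static void main(String[] args) {
        System.out.println(buildLoadSql("D:\\DTCC\\Pig\\Dummy_data_Main.tsv", "table_name"));
    }
}
```

Printing the exact statement before executing it makes it easy to compare against what works from the Hive CLI.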
I need a unique ID from a Message object to save in my database.
Afterwards I'm able to search for this UID in my database and can add other properties, like "emailTrackingActive" etc.
a) Is there a way to get a UID that is unique across a whole email inbox, or is it always only unique per folder? Currently I'm getting it per folder, as you can see in the code.
After I've sent the message, I copy it into my "Sent" folder, and then I want to get the UID and save it in the database.
With "EmailHelperClass" I'm getting the Store etc. I think it should be clear, so I won't post that code.
private void copyIntoSentAndSaveInDatabase(EmailHelperClass email, final Message msg) throws MessagingException {
    final Store store = email.getMailConfiguration().getWriteStore();
    final Folder folder = store.getFolder("Sent");
    if (!folder.exists()) {
        folder.create(Folder.HOLDS_MESSAGES);
    }
    folder.open(Folder.READ_WRITE);
    folder.appendMessages(new Message[] { msg });
    // get the unique ID
    UIDFolder uf = (UIDFolder) folder;
    long messageId = uf.getUID(msg);
    // TODO: update in DB etc.
}
But now I'm getting the following error message:
java.util.NoSuchElementException: Message does not belong to this folder
What is wrong here?
Hi, you need to change the folder to an IMAPFolder:
final Store store = email.getMailConfiguration().getWriteStore();
final IMAPFolder folder = (IMAPFolder) store.getFolder("Sent");
folder.open(Folder.READ_WRITE);
With the help of this folder, fetch all your messages. I have written the method below to get the UID of a message:
private long getMessagesUID(UIDFolder folder, javax.mail.Message message) {
    try {
        return folder.getUID(message);
    } catch (MessagingException ex) {
        ex.printStackTrace();
        return -1;
    }
}
Let me know in case of any query.
I am new to Spring. I am trying a simple dynamic web application that gets data from a database via Impala and shows it on the front end.
This is the connector class:
private static final String IMPALAD_HOST = "host";
private static final String IMPALAD_JDBC_PORT = "port";
private static final String CONNECTION_URL = "jdbc:hive2://" + IMPALAD_HOST + ':' + IMPALAD_JDBC_PORT + "/;auth=noSasl";
private static final String JDBC_DRIVER_NAME = "org.apache.hive.jdbc.HiveDriver";
public Connection getConnection() throws ClassNotFoundException {
    Connection con = null;
    try {
        Class.forName(JDBC_DRIVER_NAME);
        con = DriverManager.getConnection(CONNECTION_URL, "", "");
    } catch (SQLException e) {
        e.printStackTrace();
    }
    return con;
}
The Hive connector jar is added to the Java build path in Eclipse. getConnection() works if I call it from the main method of a Java class, but it throws a "Hive driver not found" exception if I call it from a JSP page:
java.lang.ClassNotFoundException: org.apache.hive.jdbc.HiveDriver
You don't have hive-jdbc.jar in your web application archive (the .war file); it is being missed while packaging the application. You should place it in the WEB-INF/lib directory. Also ensure that you add it to the deployment assembly of the Eclipse project.
It works when you run the main class because hive-jdbc.jar is configured in the build path. The web application's runtime classpath is a different matter.
Note: ClassNotFoundException shouldn't be thrown unless you are going to handle it. You should have all the required jars on the classpath in your application package at runtime.
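To see which classes are actually visible at runtime inside the servlet container (as opposed to the Eclipse build path), a small hedged sketch that probes for a class by name; the class names here are just examples:

```java
public class DriverCheck {
    // Returns true if the named class can be loaded from the current classpath.
    static boolean isOnClasspath(String className) {
        try {
            Class.forName(className);
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // java.sql.DriverManager ships with the JDK, so this should print true
        System.out.println(isOnClasspath("java.sql.DriverManager"));
        // the Hive driver will only be found if hive-jdbc.jar is packaged
        System.out.println(isOnClasspath("org.apache.hive.jdbc.HiveDriver"));
    }
}
```

Dropping a call like this into the JSP quickly confirms whether the jar made it into WEB-INF/lib.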
You are using the wrong driver class.
Use org.apache.hadoop.hive.jdbc.HiveDriver instead.
I use the following code to create a DB connection:
public final static String driver = "org.apache.derby.jdbc.ClientDriver";
public final static String connectionURL = "jdbc:derby:projectDB;create=true;user=user1;password=psssword";

public CreateConnectionDOA(String driver, String connectionURL) throws ClassNotFoundException, SQLException {
    Class.forName(driver);
    conn = DriverManager.getConnection(connectionURL);
    conn.setAutoCommit(false);
}
The project was created as a NetBeans Platform Application module.
When I run the project through NetBeans Platform 7.4, it works properly.
But when I create an installer using NetBeans and run it, the project opens but it also gives an exception:
"ERROR 42Y07: Schema 'projectDB' does not exist"
Try fully qualifying the path to your DB in the URL:
public final static String connectionURL =
    "jdbc:derby:d:/myproject/projectDB;create=true;user=user1;password=psssword";
The full path works because your relative path was probably wrong; with a correct relative path, it should also work.
Keep in mind that the current directory is your project directory; write the relative path accordingly (../dataBase works as expected, if necessary) and it will work.
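When the app runs from an installer, the working directory usually differs from the IDE's, so a relative jdbc:derby: path resolves to a different location. A hedged sketch (the base directory choice is an assumption for illustration) that anchors the database path to a known directory before building the URL:

```java
import java.nio.file.Path;
import java.nio.file.Paths;

public class DerbyUrlBuilder {
    // Resolve a relative database name against a fixed base directory,
    // so the URL no longer depends on the process's working directory.
    static String buildUrl(String baseDir, String dbName) {
        Path dbPath = Paths.get(baseDir).resolve(dbName).toAbsolutePath().normalize();
        // Derby accepts forward slashes on all platforms
        String normalized = dbPath.toString().replace('\\', '/');
        return "jdbc:derby:" + normalized + ";create=true";
    }

    public static void main(String[] args) {
        // e.g. anchor to the user's home directory (a hypothetical choice)
        System.out.println(buildUrl(System.getProperty("user.home"), "projectDB"));
    }
}
```

This way the installed application and the IDE both open the same database, wherever they are launched from.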
I am using cucumber-jvm to test the behaviour of the legacy system I'm working on at work. I have to use Java 1.5 and Hibernate 3.3; upgrading is not an option. Since my tests store some objects in the database, I have created a new development database.
What bothers me is that I have to manually drop the records (using a SQL script) every time I rerun my tests, or they'll fail. And anyone else wanting to run them would have to do the same. I want to quickly and automatically clean my test database, by either:
Creating an empty database and populating it with what I need, or
Using an already existing database, dropping the records before starting the tests.
What I have so far: I'm using the cucumber-junit plugin, and the RunTests class redirects to my test database:
@RunWith(Cucumber.class)
@Cucumber.Options(
    features = "test/resources/cucumber",
    format = "html:target/cucumber"
)
public class RunTests {
    private static Configuration configuration;

    @BeforeClass
    public static void preparaBase() {
        // gets the mapped configuration to the db
        configuration = HibernateUtil.getConfiguration();
        configuration.setProperty("hibernate.connection.url", "test-db-url");
        configuration.setProperty("hibernate.connection.username", "user");
        configuration.setProperty("hibernate.connection.password", "pass");
        // configuration.setProperty("hibernate.hbm2ddl.auto", "create-drop");
        // rebuilds the configuration using my test database
        HibernateUtil.rebuildSessionFactory(configuration);
    }
}
I have tried using the hibernate.hbm2ddl.auto property with the create-drop value and an import.sql file to prepare the database, but it takes ages to start the tests and my import.sql file does not seem to be detected.
Sadly, using Maven and its excellent maven-sql-plugin is not an option (I have suggested a switch to Maven, to no avail). Is there an alternative?
I did it!
I used this ScriptRunner class as such:
@RunWith(Cucumber.class)
@Cucumber.Options(
    features = "test/resources/cucumber",
    format = "html:target/cucumber"
)
public class RunTests {
    private static Configuration configuration;
    private static String url = "test-db-url";
    private static String user = "user";
    private static String pass = "pass";

    @BeforeClass
    public static void preparaBase() {
        // gets the mapped configuration to the db
        configuration = HibernateUtil.getConfiguration();
        configuration.setProperty("hibernate.connection.url", url);
        configuration.setProperty("hibernate.connection.username", user);
        configuration.setProperty("hibernate.connection.password", pass);
        // rebuilds the configuration using my test database
        HibernateUtil.rebuildSessionFactory(configuration);
        // executes a script stored in test/resources/cucumber
        try {
            Class.forName("com.mysql.jdbc.Driver");
            Connection conn = DriverManager.getConnection(url, user, pass);
            ScriptRunner runner = new ScriptRunner(conn, false, true);
            runner.runScript(new BufferedReader(new FileReader("test/resources/cucumber/db.sql")));
        } catch (Exception e) {
            throw new RuntimeException(e.getMessage(), e);
        }
    }
}
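If pulling in a third-party ScriptRunner class is ever undesirable, its core (splitting a script into statements and running each one) is small enough to sketch by hand. A hedged sketch using only the JDK and Java 1.5 APIs; it assumes statements are separated by semicolons and does not handle semicolons inside string literals:

```java
import java.util.ArrayList;
import java.util.List;

public class SimpleScriptSplitter {
    // Split a SQL script into individual statements on ';',
    // dropping blank fragments and line comments starting with "--".
    static List<String> splitStatements(String script) {
        List<String> statements = new ArrayList<String>();
        for (String raw : script.split(";")) {
            StringBuilder sb = new StringBuilder();
            for (String line : raw.split("\n")) {
                String trimmed = line.trim();
                if (trimmed.length() > 0 && !trimmed.startsWith("--")) {
                    sb.append(trimmed).append(' ');
                }
            }
            String stmt = sb.toString().trim();
            if (stmt.length() > 0) {
                statements.add(stmt);
            }
        }
        return statements;
    }

    public static void main(String[] args) {
        String script = "-- reset tables\nDELETE FROM users;\nDELETE FROM orders;";
        // each resulting statement could then be run with java.sql.Statement#execute
        System.out.println(splitStatements(script));
    }
}
```

For a simple cleanup script of DELETE statements this is usually all the machinery that's needed.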
I have a common library project shared between a couple of different Android apps, and I need both development and production values to be available. Using Ant, I was hoping to swap an environment.properties file for either a dev.properties or a prod.properties file.
I am getting a NullPointerException when trying to load the "/com/iis/..." properties file.
Any ideas as to what is being done wrong?
** common is the project
static {
    Properties p = new Properties();
    InputStream input = BaseWebApi.class.getClass().getResourceAsStream("/com/iis/common/environment.properties");
    try {
        p.load(input);
        input.close();
    } catch (IOException e) {
        Log.d(TAG, "Unable to load environment.properties file");
        e.printStackTrace();
    }
    sApiUserName = p.getProperty(KEY_API_USER_NAME);
    sApiPassword = p.getProperty(KEY_API_PASSWORD);
    sServerAddress = p.getProperty(KEY_SERVER_ADDRESS);
    sServerProtocol = p.getProperty(KEY_SERVER_PROTOCOL);
    sServerRootPath = p.getProperty(KEY_SERVER_ROOT_PATH);
    sServerPort = Integer.parseInt(p.getProperty(KEY_SERVER_PORT));
}
** not too much to see in the stack trace
09-09 12:35:43.935: ERROR/AndroidRuntime(853): Caused by: java.lang.NullPointerException
09-09 12:35:43.935: ERROR/AndroidRuntime(853): at java.util.Properties.load(Properties.java:290)
09-09 12:35:43.935: ERROR/AndroidRuntime(853): at com.iis.common.webapis.BaseWebApi.<clinit>(BaseWebApi.java:60)
09-09 12:35:43.935: ERROR/AndroidRuntime(853): ... 8 more
* I have tried every possible name combination I can think of, using the namespaces, folders and class names:
// namespaces, folder
"/com/iis/android_common/environment.properties"
"com/iis/android_common/environment.properties"
// namespaces, project name
"/com/iis/common/environment.properties"
"com/iis/common/environment.properties"
// folder
"/android_common/environment.properties"
// project name
"/common/environment.properties"
// direct path
"/environment.properties"
"environment.properties"
Are you certain your resources are being packaged in the dex file? See http://code.google.com/p/android/issues/detail?id=10076.
Just an additional note regarding the path required to access the file, plus fixes for some other issues.
The code below works for accessing 'your_project_path/assets/api.properties':
public class WebApi {
    private static final String TAG = "WebApi";

    // configuration keys
    private static final String API_URL = "api_url";
    private static final String API_PROTOCOL = "api_protocol";
    private static final String API_PORT = "api_port";

    // server settings
    protected static String URL;
    protected static String PROTOCOL;
    protected static int PORT;

    // load server dev/prod settings via api.properties file
    static {
        InputStream input = WebApi.class.getResourceAsStream("/assets/api.properties");
        Properties p = new Properties();
        try {
            Log.d(TAG, "Setting api properties");
            p.load(input);
            URL = p.getProperty(API_URL);
            PROTOCOL = p.getProperty(API_PROTOCOL);
            PORT = Integer.parseInt(p.getProperty(API_PORT));
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    // ...
}
One other thing to note: the keys used to access the properties must be lower case (they were not in my initial posting, although I didn't post the properties file contents).
// api.properties (don't use upper case)
api_url=10.0.1.7
api_protocol=http
api_port=3000
Hopefully that will clear things up a little more for anyone else that has this problem.
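The NullPointerException in the original post ultimately came from passing a null stream to Properties.load when the resource wasn't found. A hedged sketch (the key names mirror the ones above) that fails with a clear message instead; it reads from any InputStream, so it can be exercised without an Android project:

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.util.Properties;

public class SafeProperties {
    // Load properties from a stream, throwing a descriptive error if the
    // stream is null (i.e. the resource was not found on the classpath).
    static Properties load(InputStream input, String name) throws IOException {
        if (input == null) {
            throw new IOException("Resource not found: " + name);
        }
        Properties p = new Properties();
        try {
            p.load(input);
        } finally {
            input.close();
        }
        return p;
    }

    public static void main(String[] args) throws IOException {
        String content = "api_url=10.0.1.7\napi_port=3000\n";
        Properties p = load(new ByteArrayInputStream(content.getBytes()), "api.properties");
        System.out.println(p.getProperty("api_url"));                   // 10.0.1.7
        System.out.println(Integer.parseInt(p.getProperty("api_port"))); // 3000
    }
}
```

With this, a wrong resource path produces "Resource not found: ..." immediately instead of an opaque NPE inside Properties.load.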