I am getting the following error from the Hive metastore database:
Caused by: ERROR XSDB6: Another instance of Derby may have already booted the database /home/usr/metastore_db
I've heard I can solve it by removing the lock file. How safe is this? There's a db.lck file inside the metastore_db folder; it contains one line with some ID string.
Yes, you can delete that .lck file; it just holds an ID for that Derby instance.
Also note that if you change your local directory and then start Hive, you will see another metastore_db directory created with its own lock file, but all your previous data will still be there in your first metastore_db, not the new one.
Yes, you can delete that file. It won't affect your Hive setup. It is just a lock that was created for that particular instance.
Inside the "metastore_db" folder, there will be 2 ".lck" files. Just delete them. "rm -r *.lck" - You can also start the derby in server mode, to avoid this, or mysql is a better option.
Please check for the metastore_db folder, probably in your home folder, and delete the dbex.lck file.
If it's not there, check derby.log to see where the metastore_db location is mentioned.
It will be recreated the next time you run the spark-shell.
I have a directory M:\SOURCE from which I have listed and moved its contents until it is empty.
After that, I want to go ahead and delete it. I have tried the following (and yes, I also made sure it was empty):
sourceFile being "M:\SOURCE"
sourceFile.delete()
Files.delete(sourceFile.toPath());
FileUtils.deleteQuietly(sourceFile);
FileUtils.deleteDirectory(sourceFile);
FileUtils.forceDelete(sourceFile)
No exceptions are thrown by any of the other methods, and .delete() returns true.
HOWEVER, the directory still exists, and when trying to access the folder I get the following message from Windows:
When running Process Explorer I can see that java is using that resource. This only happens when I try to delete the source directory, and bear in mind that deleting the source directory is the last thing my program does.
And just to make me freak out even more, once I stop my Java virtual machine, THEN the folder magically disappears. So Java did get the instruction right; it's just not willing to delete it until it terminates.
Running System.gc() before deleting the directory also didn't help, and my working directory is not the one I'm trying to delete.
You can get this problem when using java.nio.file.Files calls that list or return a Stream of the directory contents before deleting the directory: the returned stream keeps a handle on the directory open until it is closed.
Using try-with-resources on any Stream<Path> returned by Files can help prevent this issue:
try (Stream<Path> stream = Files.list(directory)) {
    // do any work on the contents - move / delete
}
// delete the directory after the stream above has been closed
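For example, a complete sketch of the move-then-delete flow (the M:\TARGET destination and the class name are just placeholders; adjust them to your own paths):

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class MoveAndDeleteSource {
    public static void main(String[] args) throws IOException {
        Path source = Paths.get("M:\\SOURCE");   // directory to empty and delete
        Path target = Paths.get("M:\\TARGET");   // placeholder destination
        Files.createDirectories(target);

        // Files.list keeps a handle on the directory open until the stream is closed,
        // so do all the moving inside try-with-resources
        try (Stream<Path> entries = Files.list(source)) {
            for (Path entry : entries.collect(Collectors.toList())) {
                Files.move(entry, target.resolve(entry.getFileName()));
            }
        }

        // the handle is released here, so Windows lets the now-empty directory go
        Files.delete(source);
    }
}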
I'm trying to call Liquibase programmatically.
For that I use the following code:
val database = DatabaseFactory.getInstance()
.findCorrectDatabaseImplementation(JdbcConnection(connection))
Liquibase(pathToChangelog, ClassLoaderResourceAccessor(), database)
.update(Contexts(), LabelExpression())
Liquibase managed to connect to the database and to acquire the lock, but it fails when resolving pathToChangelog with a "path/to/changelog does not exist" error.
Here is my WAR file structure:
WEB-INF
\ changelog
\ db.changelog-master.xml
I tried
"WEB-INF/changelog/db.changelog-master.xml"
"/WEB-INF/changelog/db.changelog-master.xml"
"changelog/db.changelog-master.xml"
System.getProperty("user.dir") + "/WEB-INF/changelog/db.changelog-master.xml"
and certainly some other variations, to no avail. It keeps telling me that the file does not exist.
Am I doing something wrong?
Found the answer by looking at the source code. The files have to be placed inside the WEB-INF/classes directory, and the provided path has to be a relative path starting from there.
For instance, if you put your master file here :
WEB-INF/classes/changelog/db.changelog-master.xml
the pathToChangelog parameter should simply be changelog/db.changelog-master.xml.
In fact, this seems to be the default target location for the resources copied by the maven-war-plugin.
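For reference, here is the same call as in the question, written as a plain Java sketch with the corrected relative path; the connection is assumed to be an already-open java.sql.Connection:

import java.sql.Connection;
import liquibase.Contexts;
import liquibase.LabelExpression;
import liquibase.Liquibase;
import liquibase.database.Database;
import liquibase.database.DatabaseFactory;
import liquibase.database.jvm.JdbcConnection;
import liquibase.resource.ClassLoaderResourceAccessor;

public class RunChangelog {
    public static void update(Connection connection) throws Exception {
        Database database = DatabaseFactory.getInstance()
                .findCorrectDatabaseImplementation(new JdbcConnection(connection));

        // resolved by ClassLoaderResourceAccessor, i.e. relative to WEB-INF/classes
        Liquibase liquibase = new Liquibase(
                "changelog/db.changelog-master.xml",
                new ClassLoaderResourceAccessor(),
                database);
        liquibase.update(new Contexts(), new LabelExpression());
    }
}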
I'm trying to use a DB for reading and writing that is contained in a JAR.
I can read from it, but writing throws an exception:
java.sql.SQLException: path to '/database/scddata.db': 'LocationOfJar/database' does not exist
Is there any way I can bundle the database file inside a JAR?
Thanks in advance.
JAR files cannot be written to.
So:
define a working path (in properties, for example); let's call it workingPath/file.db
on init of your program, before opening your DB, check whether the DB already exists in the working path
if it does not exist, copy the file.db resource from your JAR to workingPath/file.db
Then your program will use the DB from workingPath/file.db for execution.
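A minimal sketch of that flow, assuming the DB is bundled at /database/scddata.db as in the error message, and using a hypothetical .myapp folder in the user's home directory as the working path:

import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.sql.Connection;
import java.sql.DriverManager;

public class DbBootstrap {
    public static Connection openDatabase() throws Exception {
        // hypothetical working path outside the jar
        Path workingDb = Paths.get(System.getProperty("user.home"), ".myapp", "scddata.db");

        if (Files.notExists(workingDb)) {
            Files.createDirectories(workingDb.getParent());
            // copy the read-only database bundled inside the jar to the writable working path
            try (InputStream in = DbBootstrap.class.getResourceAsStream("/database/scddata.db")) {
                Files.copy(in, workingDb);
            }
        }

        // read and write against the copy outside the jar from now on
        return DriverManager.getConnection("jdbc:sqlite:" + workingDb);
    }
}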
I have created a class to perform commits to a Subversion repository using SVNKit. When I've completed the commits I want to destroy the temporary directory I was using as a working copy.
The way I'm creating the working copy directory is as follows:
Path svnWorkingCopyDirectory = Files
.createTempDirectory("svn-working-copy-directory-"
+ String.valueOf(System.currentTimeMillis()));
LOGGER.debug("Created SVN working copy directory: " + svnWorkingCopyDirectory);
svnWorkingCopyDirectory.toFile().deleteOnExit();
However, the directory does not get deleted on exit. Is there a way in SVNKit to mark the directory as no longer a working copy? Or another method to remove the directory?
File.deleteOnExit() works like File.delete() and Files.delete() in that, if the path denotes a directory, the directory must be empty for it to be removed, which in your case it probably is not.
So if you want to delete this directory on exit, you must register a shutdown hook, and in that hook iterate recursively through the directory, depth-first, removing the files first and then the empty directories.
Apache commons has a delete method you might use in a shutdown hook or whenever it is appropriate for you: https://commons.apache.org/proper/commons-io/javadocs/api-release/org/apache/commons/io/FileUtils.html#deleteDirectory(java.io.File)
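A minimal sketch of that approach, using FileUtils.deleteDirectory from commons-io inside a shutdown hook (class and method names here are just placeholders):

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import org.apache.commons.io.FileUtils;

public class WorkingCopyCleanup {
    public static Path createTempWorkingCopy() throws IOException {
        Path svnWorkingCopyDirectory =
                Files.createTempDirectory("svn-working-copy-directory-");

        // delete the whole tree (files first, then the directories) when the JVM exits
        Runtime.getRuntime().addShutdownHook(new Thread(() -> {
            try {
                FileUtils.deleteDirectory(svnWorkingCopyDirectory.toFile());
            } catch (IOException e) {
                e.printStackTrace(); // too late to recover here, just report it
            }
        }));

        return svnWorkingCopyDirectory;
    }
}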
I created a piece of software using NetBeans and an SQLite database. When I clean and build, the executable JAR file and the database work fine: it can read and write data from the database.
Then I created a .exe file using Install Creator, and after installing the software, the same dist folder is created under Program Files on my Windows PC. When I run the executable JAR from that dist folder, it can only read the database, but it can't write. I get this message:
java.sql.SQLException:attempt to write a readonly database
Can anyone please help me solve this problem? Thanks in advance.
Check this:
The problem, as it turns out, is that the PDO SQLite driver requires that if you are going to do a write operation (INSERT, UPDATE, DELETE, DROP, etc.), then the folder the database resides in must have write permissions, as well as the actual database file.
I found this information in a comment at the very bottom of the PDO SQLite driver manual page.
The same applies here: the dist folder under Program Files is not writable for a normal user, so the database is opened read-only.
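You can check this from Java before opening the connection; the path below is just an example of where the installer put the database:

import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

public class CheckDbWritable {
    public static void main(String[] args) {
        // example path - replace with where your installer actually put the database
        Path dbFile = Paths.get("C:\\Program Files\\MyApp\\dist\\data.db");

        // SQLite needs write access to both the file and the folder that contains it
        System.out.println("file writable:   " + Files.isWritable(dbFile));
        System.out.println("folder writable: " + Files.isWritable(dbFile.getParent()));
    }
}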
You should write user-specific data to the Application Data folder of the current user.
On Windows, you can get the ROAMING directory with:
String path = System.getenv("APPDATA");
Or, if you want to make it platform independent, you can use getProperty, which gives you the user's home directory, and then write to a specific subdirectory there:
String path = System.getProperty("user.home");
You can then form the SQLite JDBC URL from that path, for example:
String sqliteUrl = "jdbc:sqlite:" + path + File.separator + "sample.db";
Use this code line:
Environment.GetFolderPath(Environment.SpecialFolder.CommonApplicationData) + "\\databasefile.db"
I can say this is the proper way of creating the DB in the common application data folder on a drive like C:\ without permission problems.