I need to execute a sequence of commands on a remote server via SSH, using the sshj library.
I do:
Session session = ssh.startSession();
Session.Command cmd = session.exec("ls -l");
System.out.println(IOUtils.readFully(cmd.getInputStream()).toString());
cmd.join(10, TimeUnit.SECONDS);
Session.Command cmd2 = session.exec("ls -a");
System.out.println(IOUtils.readFully(cmd2.getInputStream()).toString());
and it throws:
net.schmizz.sshj.common.SSHRuntimeException: This session channel is all used up
But I can't recreate the session for every single command, because state would be lost between commands: in that case the example would show the home directory listing rather than the listing of /some/dir.
As odd as it is, a session can only be used once, so you have to start a new session for every command.
Session session = ssh.startSession();
Session.Command cmd = session.exec("ls -l");
System.out.println(IOUtils.readFully(cmd.getInputStream()).toString());
cmd.join(10, TimeUnit.SECONDS);
session = ssh.startSession();
Session.Command cmd2 = session.exec("ls -a");
System.out.println(IOUtils.readFully(cmd2.getInputStream()).toString());
Or if the shell you are connecting to supports delimited commands (and the situation allows it), you can do this (bash example):
session.exec("ls -l; <command 2>; <command 3>");
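For the /some/dir case from the question, the commands can be chained in a single exec so that state such as the working directory is preserved. A minimal sketch reusing the sshj calls shown above (the path is just the one from the question):
try (Session session = ssh.startSession()) {
    // One exec call, so the cd affects the listings that follow it
    Session.Command cmd = session.exec("cd /some/dir; ls -l; ls -a");
    System.out.println(IOUtils.readFully(cmd.getInputStream()).toString());
    cmd.join(10, TimeUnit.SECONDS);
}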
You can consider using an Expect-like third party library which simplifies working with remote services and capturing output. Those libraries are designed to execute a sequence of commands. Here is a good set of options you can try:
Expect4J
ExpectJ
Expect-for-Java
However, when I was about to solve a similar problem I found that these libraries are rather old and introduce a lot of unwanted dependencies. So I created my own and made it available to others. It is called ExpectIt. The advantages of my library are stated on the project home page. You can give it a try.
Here is an example of interacting with a public remote SSH service using sshj:
SSHClient ssh = new SSHClient();
...
ssh.connect("sdf.org");
ssh.authPassword("new", "");
Session session = ssh.startSession();
session.allocateDefaultPTY();
Shell shell = session.startShell();
Expect expect = new ExpectBuilder()
        .withOutput(shell.getOutputStream())
        .withInputs(shell.getInputStream(), shell.getErrorStream())
        .build();
try {
    expect.expect(contains("[RETURN]"));
    expect.sendLine();
    String ipAddress = expect.expect(regexp("Trying (.*)\\.\\.\\.")).group(1);
    System.out.println("Captured IP: " + ipAddress);
    expect.expect(contains("login:"));
    expect.sendLine("new");
    expect.expect(contains("(Y/N)"));
    expect.send("N");
    expect.expect(regexp(": $"));
    expect.send("\b");
    expect.expect(regexp("\\(y\\/n\\)"));
    expect.sendLine("y");
    expect.expect(contains("Would you like to sign the guestbook?"));
    expect.send("n");
    expect.expect(contains("[RETURN]"));
    expect.sendLine();
} finally {
    session.close();
    ssh.close();
    expect.close();
}
Here is the link to the complete workable example.
This question is old, but just to clarify, quoting from the sshj wiki https://github.com/hierynomus/sshj/wiki:
Session objects are not reusable, so you can only have one command/shell/subsystem via exec(), startShell() or startSubsystem() respectively. But you can start multiple sessions over a single connection.
In our case we put it in a function:
public String runCmd(SSHClient sshClient, String command) throws IOException {
    String response = "";
    try (Session session = sshClient.startSession()) {
        final Command cmd = session.exec(command);
        response = IOUtils.readFully(cmd.getInputStream()).toString();
        cmd.join(5, TimeUnit.SECONDS);
        // System.out.println("\n** exit status: " + cmd.getExitStatus());
    }
    return response;
}
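As the wiki says, multiple such sessions can then be run over a single connection. A rough usage sketch (the host, the credentials and the PromiscuousVerifier host-key handling are placeholders for illustration, and it assumes the runCmd helper above is in scope):
SSHClient ssh = new SSHClient();
ssh.addHostKeyVerifier(new PromiscuousVerifier()); // accepts any host key -- demo only
ssh.connect("example.com");                        // placeholder host
try {
    ssh.authPassword("user", "password");          // placeholder credentials
    // Two commands, two sessions, one connection
    System.out.println(runCmd(ssh, "ls -l"));
    System.out.println(runCmd(ssh, "ls -a"));
} finally {
    ssh.disconnect();
}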
I am adding the Neo4j Bolt driver to my application, just following http://neo4j.com/developer/java/:
import org.neo4j.driver.v1.*;
Driver driver = GraphDatabase.driver( "bolt://localhost", AuthTokens.basic( "neo4j", "neo4j" ) );
Session session = driver.session();
session.run( "CREATE (a:Person {name:'Arthur', title:'King'})" );
StatementResult result = session.run( "MATCH (a:Person) WHERE a.name = 'Arthur' RETURN a.name AS name, a.title AS title" );
while ( result.hasNext() )
{
    Record record = result.next();
    System.out.println( record.get( "title" ).asString() + " " + record.get("name").asString() );
}
session.close();
driver.close();
However, still following the official documentation, unit testing is done using:
GraphDatabaseService db = new TestGraphDatabaseFactory()
        .newImpermanentDatabaseBuilder()
        .newGraphDatabase();
So if I want to test in some way the code above, I have to replace the GraphDatabase.driver( "bolt://localhost",...) with the GraphDatabaseService from the test. How can I do that? I cannot extract any sort of in-memory driver from there as far as I can see.
Neo4j JDBC has a class called Neo4jBoltRule for unit testing. It is a JUnit rule starting/stopping an impermanent database together with some configuration to start Bolt.
The rule class uses dynamic port assignment to prevent test failure due to running multiple tests in parallel (think of your CI infrastructure).
An example of a unit test using that rule class is available at https://github.com/neo4j-contrib/neo4j-jdbc/blob/master/neo4j-jdbc-bolt/src/test/java/org/neo4j/jdbc/bolt/SampleIT.java
An easy way now is to pull neo4j-harness, and use their built-in Neo4jRule as follows:
import static org.neo4j.graphdb.factory.GraphDatabaseSettings.boltConnector;
// [...]
@Rule
public Neo4jRule graphDb = new Neo4jRule()
        .withConfig(boltConnector("0").address, "localhost:" + findFreePort());
Where the findFreePort implementation can be as simple as:
private static int findFreePort() {
    try (ServerSocket socket = new ServerSocket(0)) {
        return socket.getLocalPort();
    } catch (IOException e) {
        throw new RuntimeException(e.getMessage(), e);
    }
}
As the Javadoc of ServerSocket explains:
A port number of 0 means that the port number is automatically allocated, typically from an ephemeral port range. This port number can then be retrieved by calling getLocalPort.
Moreover, the socket is closed before the port value is returned, so there is a good chance the returned port will still be available when it is used (the window of opportunity for the port to be allocated again in between is small; computing the size of that window is left as an exercise to the reader).
Et voilà!
I'm using this code example to execute commands in a Tcl shell.
If you look at the main function down the page, the way of executing commands is:
SSHClient ssh = new SSHClient("linux_host", "root", "password");
List<String> cmdsToExecute = new ArrayList<String>();
cmdsToExecute.add("ls");
cmdsToExecute.add("pwd");
cmdsToExecute.add("mkdir testdir");
String outputLog = ssh.execute(cmdsToExecute);
In my program I'm doing:
SSHClient ssh = new SSHClient("linux_host", "root", "password");
List<String> cmdsToExecute = new ArrayList<String>();
cmdsToExecute.add("bpsh"); // open Tcl Shell
cmdsToExecute.add("set bps [bps::connect ... ]"); // Tcl shell commands
String outputLog = ssh.execute(cmdsToExecute);
Now the problem is that I can't execute commands from the ArrayList without exiting the Tcl shell.
Meaning, if I run this code:
SSHClient ssh = new SSHClient("linux_host", "root", "password");
List<String> cmdsToExecute = new ArrayList<String>();
cmdsToExecute.add("bpsh"); // open Tcl Shell
cmdsToExecute.add("set bps [bps::connect ... ]"); // Tcl shell commands
String outputLog = ssh.execute(cmdsToExecute);
cmdsToExecute.clear();
cmdsToExecute.add("set sf [$bps createSuperflow ... ]");
outputLog = ssh.execute(cmdsToExecute);
I find that after the first execute, the remote machine exits the Tcl shell and goes back to the original shell, and in the second execute it tries to run
"set sf [$bps createSuperflow ... " in the original shell.
I assume this is because the line
cmdsToExecute.add("bpsh");
is no longer present in the second call.
The code of the expect4j that I'm using is in the link above. Can someone tell me what I need to modify so that I can execute many commands using ssh.execute() without it exiting the Tcl shell?
You can try to build a file with the list of commands and source it.
Something like...
cmdsToExecute.add("echo \"\" > tmpcmd.txt");
cmdsToExecute.add("echo \"set bps [bps::connect ... ]\" >> tmpcmd.txt");
cmdsToExecute.add("echo \"set sf [$bps createSuperflow ... ]\" >> tmpcmd.txt");
cmdsToExecute.add("bpsh");
cmdsToExecute.add("source tmpcmd.txt");
Here is my code for a MySQL database restore. When I try this code the app runs without exceptions, but the application hangs and the database is not restored. Please help me.
String databaseName = "sample"; //database name
String userName = "root"; // MySQL username
String password = ""; // MySQL password
int processComplete; // this variable for verify the process
String[] executeCmd = new String[]{"C:\\wamp\\bin\\mysql\\mysql5.5.24\\bin\\mysql",
databaseName, "-u" + userName, "-p" + password, "-e", " source D:/data.sql"};
System.out.println(executeCmd);
Process runtimeProcess = Runtime.getRuntime().exec(executeCmd);// execute the command
processComplete = runtimeProcess.waitFor();
System.out.println(processComplete);
if (processComplete == 1) { // if the return value equals 1 then the process failed
    JOptionPane.showMessageDialog(null, "Restore Failed");
} else if (processComplete == 0) { // if the return value equals 0 then the process succeeded
    JOptionPane.showMessageDialog(null, "Restore Completed");
}
I suspect that the last parameter is being mishandled
String[] executeCmd = new String[]{
"C:\\wamp\\bin\\mysql\\mysql5.5.24\\bin\\mysql",
databaseName,
"-u" + userName,
"-p" + password,
"-e",
" source D:/data.sql" }
It should probably look more like...
String[] executeCmd = new String[]{
"C:\\wamp\\bin\\mysql\\mysql5.5.24\\bin\\mysql",
databaseName,
"-u" + userName,
"-p" + password,
"-e",
"source",
"D:/data.sql" }
Each element in the array will be passed to the command as a separate argument; this gives you the flexibility of passing arguments that contain spaces without needing to escape them with quotes.
You should consider using ProcessBuilder instead of trying to build the Process yourself; apart from allowing you to redirect the error stream to the input stream, it also lets you specify the starting context (such as the working directory) for the process.
You should also be reading the output of the process (via its InputStream), which would possibly highlight issues and may also allow the process to exit (some processes won't exit until their stdout is read completely).
For example: How do I execute Windows commands in Java?
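Here is a rough sketch combining both suggestions (ProcessBuilder plus draining the output), reusing the values from the question rather than being a drop-in fix; the SQL passed to -e is kept as one element without the leading space, and the only extra imports needed are java.io.BufferedReader and java.io.InputStreamReader:
ProcessBuilder pb = new ProcessBuilder(
        "C:\\wamp\\bin\\mysql\\mysql5.5.24\\bin\\mysql",
        databaseName,
        "-u" + userName,
        "-p" + password,        // note: with an empty password this becomes a bare -p, which makes mysql prompt interactively
        "-e",
        "source D:/data.sql");  // a single argument, without the leading space
pb.redirectErrorStream(true);   // merge stderr into stdout so nothing is left unread

Process process = pb.start();
try (BufferedReader reader = new BufferedReader(new InputStreamReader(process.getInputStream()))) {
    String line;
    while ((line = reader.readLine()) != null) {
        System.out.println(line); // drain the output so the process can finish
    }
}
int processComplete = process.waitFor();
System.out.println("mysql exited with code " + processComplete);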
I'd like to generate alarms in my Java desktop application:
alarms set with a specific date/time which can be in 5 minutes or 5 months
I need to be able to create an SWT application when the alarm is triggered
I need this to be able to work on any OS. The software users will likely have Windows (90% of them), and the rest Mac OS (including me)
the software license must allow me to use it in a commercial program, without requiring to open source it (hence, no GPL)
I cannot require the users to install Cygwin, so the implementation needs to be native to Windows and Unix
I am developing using Java, Eclipse, SWT and my application is deployed from my server using Java Web Start. I'm using Mac OS X.6 for developing.
I think I have a few options:
Run my application at startup, and handle everything myself;
Use a system service.
Use the cron table on Unix, and Scheduled Tasks on Windows
Run at startup
I don't really like this solution, I'm hoping for something more elegant.
Refs: I would like to run my Java program on System Startup on Mac OS/Windows. How can I do this?
System service
If I run it as a system service, I can benefit from this, because the OS will ensure that my software:
is always running
doesn't have/need a GUI
restarts on failure
I've researched some resources that I can use:
run4j — CPL — runs on Windows only, seems like a valid candidate
jsvc — Apache 2.0 — Unix only, seems like a valid candidate
Java Service Wrapper — Various — I cannot afford paid licenses, and the free one is GPL. Hence, I don't want to/can't use this
My questions in the system service options are:
Are there other options?
Is my planned implementation correct:
at the application startup, check for existence of the service
if it is not installed:
escalate the user to install the service (root on Unix, UAC on Windows)
if the host OS is Windows, use run4j to register the service
if the host OS is Unix, use jsvc to register the service
if it is not running, start it
Thus, at the first run, the application will install the service and start it. When the application closes, the service is still running and won't need the application ever again, unless it is unregistered.
However, I think I still miss the "run on startup" feature.
Am I right? Am I missing something?
cron / Task Scheduler
On Unix, I can easily use the cron table without needing the application to escalate the user to root. I don't need to handle restarts, system date changes, etc. Seems nice.
On Windows, I can use the Task Scheduler, even in command-line using At or SchTasks. This seems nice, but I need this to be compatible from XP up to 7, and I can't easily test this.
So what would you do? Did I miss something? Do you have any advice that could help me pick the best and most elegant solution?
Bicou: Great that you shared your solution!
Note that schtasks.exe has some localization issues: if you want to create a daily trigger with it, on an English Windows you'd have to use "daily", while on a German one (for example) you'd have to use "täglich" instead.
To resolve this issue I've implemented the call to schtasks.exe with the /XML option, providing a temporary XML file which I create from a template.
The easiest way to create such a template is to create a task by hand and use the export function in the Task Scheduler GUI tool.
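A minimal sketch of such a call (the task name and XML path are made up for illustration; the XML content would be a filled-in copy of the template exported from the Task Scheduler GUI):
Process p = new ProcessBuilder(
        "schtasks", "/Create",
        "/TN", "MyAppAlarm",                 // hypothetical task name
        "/XML", "C:\\Temp\\myapp-alarm.xml", // hypothetical path to the filled-in template
        "/F")
        .inheritIO()                         // print schtasks output to the console
        .start();
System.out.println("schtasks exited with " + p.waitFor());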
Of the available options you have listed, IMHO Option 3 is better.
As you are looking only for an external trigger to execute the application, cron or Scheduled Tasks are better solutions than the other options you have listed. This way, you remove complexity from your application, and your application does not need to be running all the time. It is triggered externally, and when the execution is over, your application stops, so unnecessary resource consumption is avoided.
Here's what I ended up implementing:
public class AlarmManager {
public static final String ALARM_CLI_FORMAT = "startalarm:";
public static SupportedOS currentOS = SupportedOS.UNSUPPORTED_OS;
public enum SupportedOS {
UNSUPPORTED_OS,
MAC_OS,
WINDOWS,
}
public AlarmManager() {
final String osName = System.getProperty("os.name");
if (osName == null) {
L.e("Unable to retrieve OS!");
} else if ("Mac OS X".equals(osName)) {
currentOS = SupportedOS.MAC_OS;
} else if (osName.contains("Windows")) {
currentOS = SupportedOS.WINDOWS;
} else {
L.e("Unsupported OS: "+osName);
}
}
/**
* Windows only: name of the scheduled task
*/
private String getAlarmName(final long alarmId) {
return new StringBuilder("My_Alarm_").append(alarmId).toString();
}
/**
 * Gets the command line to trigger an alarm
 * @param alarmId
 * @return
 */
private String getAlarmCommandLine(final long alarmId) {
return new StringBuilder("javaws -open ").append(ALARM_CLI_FORMAT).append(alarmId).append(" ").append(G.JNLP_URL).toString();
}
/**
 * Adds an alarm to the system list of scheduled tasks
 * @param when
 */
public void createAlarm(final Calendar when) {
// Create alarm
// ... stuff here
final long alarmId = 42;
// Schedule alarm
String[] commandLine;
Process child;
final String alarmCL = getAlarmCommandLine(alarmId);
try {
switch (currentOS) {
case MAC_OS:
final String cron = new SimpleDateFormat("mm HH d M '*' ").format(when.getTime()) + alarmCL;
commandLine = new String[] {
"/bin/sh", "-c",
"crontab -l | (cat; echo \"" + cron + "\") | crontab"
};
child = Runtime.getRuntime().exec(commandLine);
break;
case WINDOWS:
commandLine = new String[] {
"schtasks",
"/Create",
"/ST "+when.get(Calendar.HOUR_OF_DAY) + ":" + when.get(Calendar.MINUTE),
"/SC ONCE",
"/SD "+new SimpleDateFormat("dd/MM/yyyy").format(when.getTime()), // careful with locale here! dd/MM/yyyy or MM/dd/yyyy? I'm French! :)
"/TR \""+alarmCL+"\"",
"/TN \""+getAlarmName(alarmId)+"\"",
"/F",
};
L.d("create command: "+Util.join(commandLine, " "));
child = Runtime.getRuntime().exec(commandLine);
break;
}
} catch (final IOException e) {
L.e("Unable to schedule alarm #"+alarmId, e);
return;
}
L.i("Created alarm #"+alarmId);
}
/**
 * Removes an alarm from the system list of scheduled tasks
 * @param alarmId
 */
public void removeAlarm(final long alarmId) {
L.i("Removing alarm #"+alarmId);
String[] commandLine;
Process child;
try {
switch (currentOS) {
case MAC_OS:
commandLine = new String[] {
"/bin/sh", "-c",
"crontab -l | (grep -v \""+ALARM_CLI_FORMAT+"\") | crontab"
};
child = Runtime.getRuntime().exec(commandLine);
break;
case WINDOWS:
commandLine = new String[] {
"schtasks",
"/Delete",
"/TN \""+getAlarmName(alarmId)+"\"",
"/F",
};
child = Runtime.getRuntime().exec(commandLine);
break;
}
} catch (final IOException e) {
L.e("Unable to remove alarm #"+alarmId, e);
}
}
public void triggerAlarm(final long alarmId) {
// Do stuff
//...
L.i("Hi! I'm alarm #"+alarmId);
// Remove alarm
removeAlarm(alarmId);
}
}
Usage is simple. Schedule a new alarm using:
final AlarmManager m = new AlarmManager();
final Calendar cal = new GregorianCalendar();
cal.add(Calendar.MINUTE, 1);
m.createAlarm(cal);
Trigger an alarm like this:
public static void main(final String[] args) {
if (args.length >= 2 && args[1] != null && args[1].contains(AlarmManager.ALARM_CLI_FORMAT)) {
try {
final long alarmId = Long.parseLong(args[1].replace(AlarmManager.ALARM_CLI_FORMAT, ""));
final AlarmManager m = new AlarmManager();
m.triggerAlarm(alarmId);
} catch (final NumberFormatException e) {
L.e("Unable to parse alarm !", e);
}
}
}
Tested on Mac OS X.6 and Windows Vista. The class L is a helper around System.out.println and G holds my global constants (here, my JNLP file on my server used to launch my application).
You can also try using Quartz http://quartz-scheduler.org/. It has a cron-like syntax for scheduling jobs.
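A minimal sketch with the Quartz 2.x API (the job class and identifiers are made up for illustration; note that Quartz schedules jobs inside a running JVM rather than registering anything with the OS):
import org.quartz.*;
import org.quartz.impl.StdSchedulerFactory;

public class QuartzAlarmSketch {

    // Quartz instantiates this job each time the trigger fires
    public static class AlarmJob implements Job {
        @Override
        public void execute(JobExecutionContext context) {
            System.out.println("Alarm fired: " + context.getJobDetail().getKey());
        }
    }

    public static void main(String[] args) throws SchedulerException {
        Scheduler scheduler = StdSchedulerFactory.getDefaultScheduler();
        scheduler.start();

        JobDetail job = JobBuilder.newJob(AlarmJob.class)
                .withIdentity("alarm-42", "alarms")
                .build();

        // Cron-like expression: every day at 08:30
        Trigger trigger = TriggerBuilder.newTrigger()
                .withIdentity("alarm-42-trigger", "alarms")
                .withSchedule(CronScheduleBuilder.cronSchedule("0 30 8 * * ?"))
                .build();

        scheduler.scheduleJob(job, trigger);
    }
}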
I believe your scenario is correct. Since services are system-specific things, IMHO you should not use a generic package to cover them all, but have a specific mechanism for every system.
I want to execute an SQL script file in Java without reading the entire file content into a big query and executing it.
Is there any other standard way?
There is a great way of executing SQL scripts from Java without reading them yourself, as long as you don't mind having a dependency on Ant. In my opinion such a dependency is very well justified in your case. Here is sample code, where the SQLExec class lives in ant.jar:
private void executeSql(String sqlFilePath) {
    final class SqlExecuter extends SQLExec {
        public SqlExecuter() {
            Project project = new Project();
            project.init();
            setProject(project);
            setTaskType("sql");
            setTaskName("sql");
        }
    }

    SqlExecuter executer = new SqlExecuter();
    executer.setSrc(new File(sqlFilePath));
    executer.setDriver(args.getDriver());
    executer.setPassword(args.getPwd());
    executer.setUserid(args.getUser());
    executer.setUrl(args.getUrl());
    executer.execute();
}
There is no portable way of doing that. You can execute a native client as an external program to do that though:
import java.io.*;
public class CmdExec {
    public static void main(String argv[]) {
        try {
            String line;
            Process p = Runtime.getRuntime().exec(
                    "psql -U username -d dbname -h serverhost -f scriptfile.sql");
            BufferedReader input =
                    new BufferedReader(new InputStreamReader(p.getInputStream()));
            while ((line = input.readLine()) != null) {
                System.out.println(line);
            }
            input.close();
        } catch (Exception err) {
            err.printStackTrace();
        }
    }
}
Code sample was extracted from here and modified to answer question assuming that the user wants to execute a PostgreSQL script file.
The Flyway library is really good for this:
Flyway flyway = new Flyway();
flyway.setDataSource(dbConfig.getUrl(), dbConfig.getUsername(), dbConfig.getPassword());
flyway.setLocations("classpath:db/scripts");
flyway.clean();
flyway.migrate();
This scans the locations for scripts and runs them in order. Scripts can be versioned with V01__name.sql, so if just migrate is called, only those that have not already been run will be run. It uses a table called 'schema_version' to keep track of things. But it can do other things too; see the docs: flyway.
The clean call isn't required, but useful to start from a clean DB.
Also, be aware of the location (the default is "classpath:db/migration"): there is no space after the ':', which is what caught me out.
No, you must read the file, split it into separate queries and then execute them individually (or using the batch API of JDBC).
One of the reasons is that every database defines its own way to separate SQL statements (some use ;, others /, some allow both or even let you define your own separator).
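For the simple case (statements separated by ';' and no procedural blocks or string literals containing semicolons), a naive split-and-batch sketch could look like the following; the file name and the in-memory H2 connection string are placeholders:
// Imports: java.nio.charset.StandardCharsets, java.nio.file.*, java.sql.*
String script = new String(Files.readAllBytes(Paths.get("schema.sql")), StandardCharsets.UTF_8); // placeholder file

try (Connection conn = DriverManager.getConnection("jdbc:h2:mem:test", "sa", ""); // placeholder connection
     Statement stmt = conn.createStatement()) {
    for (String sql : script.split(";")) {     // naive split; breaks on ';' inside literals or procedural blocks
        if (!sql.trim().isEmpty()) {
            stmt.addBatch(sql);
        }
    }
    stmt.executeBatch();
}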
You cannot do this using JDBC, as it does not support running script files directly. A workaround would be to include iBATIS (a persistence framework) and call the ScriptRunner constructor as shown in the iBATIS documentation.
It's not good to include a heavyweight persistence framework like iBATIS just to run a simple SQL script, which you can anyway do from the command line:
$ mysql -u root -p db_name < test.sql
Since JDBC doesn't support this option, the best way to solve this is to execute the command line from the Java program. Below is an example for PostgreSQL:
private void executeSqlFile() {
    try {
        Runtime rt = Runtime.getRuntime();
        String executeSqlCommand = "psql -U (user) -h (domain) -f (script_name) (dbName)";
        Process pr = rt.exec(executeSqlCommand);
        int exitVal = pr.waitFor();
        System.out.println("Exited with error code " + exitVal);
    } catch (Exception e) {
        System.out.println(e.toString());
    }
}
The Apache iBatis solution worked like a charm.
The script example I used was exactly the script I was running from MySql workbench.
There is an article with examples here:
https://www.tutorialspoint.com/how-to-run-sql-script-using-jdbc
This is what I did:
pom.xml dependency
<!-- IBATIS SQL Script runner from Apache (https://mvnrepository.com/artifact/org.apache.ibatis/ibatis-core) -->
<dependency>
<groupId>org.apache.ibatis</groupId>
<artifactId>ibatis-core</artifactId>
<version>3.0</version>
</dependency>
Code to execute script:
import java.io.BufferedReader;
import java.io.File;
import java.io.FileReader;
import java.io.Reader;
import java.sql.Connection;
import org.apache.ibatis.jdbc.ScriptRunner;
import lombok.extern.slf4j.Slf4j;
@Slf4j
public class SqlScriptExecutor {

    public static void executeSqlScript(File file, Connection conn) throws Exception {
        Reader reader = new BufferedReader(new FileReader(file));
        log.info("Running script from file: " + file.getCanonicalPath());
        ScriptRunner sr = new ScriptRunner(conn);
        sr.setAutoCommit(true);
        sr.setStopOnError(true);
        sr.runScript(reader);
        log.info("Done.");
    }
}
For my simple project the user should be able to select SQL files which get executed.
As I was not happy with the other answers, and since I am using Flyway anyway, I took a closer look at the Flyway code. DefaultSqlScriptExecutor is doing the actual execution, so I tried to figure out how to create an instance of DefaultSqlScriptExecutor.
Basically, the following snippet loads a String, splits it into single statements and executes them one by one.
Flyway also provides other LoadableResources than StringResource e.g. FileSystemResource. But I have not taken a closer look at them.
As DefaultSqlScriptExecutor and the other classes are not officially documented by Flyway use the code-snippet with care.
public static void execSqlQueries(String sqlQueries, Configuration flyWayConf) throws SQLException {
// create dependencies FlyWay needs to execute the SQL queries
JdbcConnectionFactory jdbcConnectionFactory = new JdbcConnectionFactory(flyWayConf.getDataSource(),
flyWayConf.getConnectRetries(),
null);
DatabaseType databaseType = jdbcConnectionFactory.getDatabaseType();
ParsingContext parsingContext = new ParsingContext();
SqlScriptFactory sqlScriptFactory = databaseType.createSqlScriptFactory(flyWayConf, parsingContext);
Connection conn = flyWayConf.getDataSource().getConnection();
JdbcTemplate jdbcTemp = new JdbcTemplate(conn);
ResourceProvider resProv = flyWayConf.getResourceProvider();
DefaultSqlScriptExecutor scriptExec = new DefaultSqlScriptExecutor(jdbcTemp, null, false, false, false, null);
// Prepare and execute the actual queries
StringResource sqlRes = new StringResource(sqlQueries);
SqlScript sqlScript = sqlScriptFactory.createSqlScript(sqlRes, true, resProv);
scriptExec.execute(sqlScript);
}
The simplest external tool that I found that is also portable is jisql - https://www.xigole.com/software/jisql/jisql.jsp .
You would run it as:
java -classpath lib/jisql.jar:\
lib/jopt-simple-3.2.jar:\
lib/javacsv.jar:\
/home/scott/postgresql/postgresql-8.4-701.jdbc4.jar
com.xigole.util.sql.Jisql -user scott -password blah \
-driver postgresql \
-cstring jdbc:postgresql://localhost:5432/scott -c \; \
-query "select * from test;"
JDBC does not support this option (although a specific DB driver may offer this).
Anyway, there should not be a problem with loading all file contents into memory.
Try this code:
String strProc =
"DECLARE \n" +
" sys_date DATE;"+
"" +
"BEGIN\n" +
"" +
" SELECT SYSDATE INTO sys_date FROM dual;\n" +
"" +
"END;\n";
try {
    DriverManager.registerDriver(new oracle.jdbc.driver.OracleDriver());
    Connection connection = DriverManager.getConnection("jdbc:oracle:thin:@your_db_IP:1521:your_db_SID", "user", "password");
    PreparedStatement psProcToexecute = connection.prepareStatement(strProc);
    psProcToexecute.execute();
} catch (Exception e) {
    System.out.println(e.toString());
}
If you use Spring you can use DataSourceInitializer:
@Bean
public DataSourceInitializer dataSourceInitializer(@Qualifier("dataSource") final DataSource dataSource) {
    ResourceDatabasePopulator resourceDatabasePopulator = new ResourceDatabasePopulator();
    resourceDatabasePopulator.addScript(new ClassPathResource("/data.sql"));
    DataSourceInitializer dataSourceInitializer = new DataSourceInitializer();
    dataSourceInitializer.setDataSource(dataSource);
    dataSourceInitializer.setDatabasePopulator(resourceDatabasePopulator);
    return dataSourceInitializer;
}
Used to set up a database during initialization and clean up a
database during destruction.
https://docs.spring.io/spring-framework/docs/current/javadoc-api/org/springframework/jdbc/datasource/init/DataSourceInitializer.html