Running integration tests for a spring-boot REST service using gradle - java

I am currently trying to set up an integration test framework for a REST service which is built on:
Spring-boot
Gradle
Jetty
I was able to use spring-boot integration test framework along with spring-boot junit runner to bring up the app context and run the tests successfully.
The next thing that I was trying to do was to have a gradle task which will do the following:
Build the jar (not war)
Start jetty and deploy the jar
Run a set of test-cases against this jar.
Stop jetty
=> I tried using the 'jetty' plugin, but it does not seem to support jar files.
=> I then tried using a JavaExec task to run the jar and then run the tests, but I couldn't find a straightforward way to stop the jar process after the tests are done.
=> I hit the same issue with an Exec type task.
So, I have two questions regarding this:
Is there a way to achieve the above form of integration testing using Gradle?
Is this way of integration testing recommended, or is there a better way of doing it?
Any thoughts and insights are much appreciated.
Thanks,

There are different ways to achieve what you want. The approach I helped with at a client relied on the /shutdown URL provided by Spring Boot Actuator. Important: if you use this approach, be sure to either disable or secure the /shutdown endpoint in production.
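For reference, the /shutdown endpoint is disabled by default, so the profile used for the test run has to enable it. A minimal sketch of the relevant application properties, assuming Spring Boot Actuator is on the classpath (the property names changed between Boot 1.x and 2.x):
# Spring Boot 1.x
endpoints.shutdown.enabled=true
endpoints.shutdown.sensitive=false
# Spring Boot 2.x equivalent
management.endpoint.shutdown.enabled=true
management.endpoints.web.exposure.include=health,shutdown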
Within the build file you have two tasks:
task startWebApp(type: StartApp) {
dependsOn 'assemble'
jarFile = jar.archivePath
port = 8080
appContext = "MyApp"
}
task stopWebApp(type: StopApp) {
urlPath = "${startWebApp.baseUrl}/shutdown"
}
You should make sure that your integration tests depend on the startWebApp task and are finalised by the stop task. So something like this:
integTest.dependsOn "startWebApp"
integTest.finalizedBy "stopWebApp"
Of course, you need to create the custom task implementations too:
// JsonSlurper is used by checkServerStatus() below; the import goes at the top of build.gradle
import groovy.json.JsonSlurper

class StartApp extends DefaultTask {
static enum Status { UP, DOWN, TIMED_OUT }
@InputFile
File jarFile
@Input
int port = 8080
@Input
String appContext = ""
String getBaseUrl() {
return "http://localhost:${port}" + (appContext ? '/' + appContext : '')
}
@TaskAction
def startApp() {
logger.info "Starting server"
logger.debug "Application jar file: " + jarFile
def args = ["java",
"-Dspring.profiles.active=dev",
"-jar",
jarFile.path]
def pb = new ProcessBuilder(args)
pb.redirectErrorStream(true)
final process = pb.start()
final output = new StringBuffer()
process.consumeProcessOutputStream(output)
def status = Status.TIMED_OUT
for (i in 0..20) {
Thread.sleep(3000)
if (hasServerExited(process)) {
status = Status.DOWN
break
}
try {
status = checkServerStatus()
break
}
catch (ex) {
logger.debug "Error accessing app health URL: " + ex.message
}
}
if (status == Status.TIMED_OUT) process.destroy()
if (status != Status.UP) {
logger.info "Server output"
logger.info "-------------"
logger.info output.toString()
throw new RuntimeException("Server failed to start up. Status: ${status}")
}
}
protected Status checkServerStatus() {
URL url = new URL("$baseUrl/health")
logger.info("Health Check --> ${url}")
HttpURLConnection connection = url.openConnection()
connection.readTimeout = 300
def obj = new JsonSlurper().parse(
connection.inputStream,
connection.contentEncoding ?: "UTF-8")
connection.inputStream.close()
return obj.status == "UP" ? Status.UP : Status.DOWN
}
protected boolean hasServerExited(Process process) {
try {
process.exitValue()
return true
} catch (IllegalThreadStateException ex) {
return false
}
}
}
Note that it's important to start the server asynchronously, as a separate process whose output is consumed on a background thread (which is what consumeProcessOutputStream() does), otherwise the task never ends. The task to stop the server is more straightforward:
class StopApp extends DefaultTask {
@Input
String urlPath
@TaskAction
def stopApp(){
def url = new URL(urlPath)
def connection = url.openConnection()
connection.requestMethod = "POST"
connection.doOutput = true
connection.outputStream.close()
connection.inputStream.close()
}
}
It basically sends an empty POST to the /shutdown URL to stop the running server.

Related

Create a Java API that will manually trigger already-created Kubernetes jobs

I have a job already running in Kubernetes which is scheduled for 4 hours. But I need to write a Java API so that whenever I want to run the job I just need to call this API and it runs the job.
Please help to solve this requirement.
There are two ways: either you run your application in a Pod which creates the Job for you, or you write a Java API so that when you hit the endpoint, it creates the Job at that time.
For creation, you can use the Java Kubernetes client library.
package io.fabric8.kubernetes.examples;
import io.fabric8.kubernetes.api.model.PodList;
import io.fabric8.kubernetes.api.model.batch.v1.Job;
import io.fabric8.kubernetes.api.model.batch.v1.JobBuilder;
import io.fabric8.kubernetes.client.ConfigBuilder;
import io.fabric8.kubernetes.client.DefaultKubernetesClient;
import io.fabric8.kubernetes.client.KubernetesClient;
import io.fabric8.kubernetes.client.KubernetesClientException;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import java.util.Collections;
import java.util.concurrent.TimeUnit;
/*
* Creates a simple run to complete job that computes π to 2000 places and prints it out.
*/
public class JobExample {
private static final Logger logger = LoggerFactory.getLogger(JobExample.class);
public static void main(String[] args) {
final ConfigBuilder configBuilder = new ConfigBuilder();
if (args.length > 0) {
configBuilder.withMasterUrl(args[0]);
}
try (KubernetesClient client = new DefaultKubernetesClient(configBuilder.build())) {
final String namespace = "default";
final Job job = new JobBuilder()
.withApiVersion("batch/v1")
.withNewMetadata()
.withName("pi")
.withLabels(Collections.singletonMap("label1", "maximum-length-of-63-characters"))
.withAnnotations(Collections.singletonMap("annotation1", "some-very-long-annotation"))
.endMetadata()
.withNewSpec()
.withNewTemplate()
.withNewSpec()
.addNewContainer()
.withName("pi")
.withImage("perl")
.withArgs("perl", "-Mbignum=bpi", "-wle", "print bpi(2000)")
.endContainer()
.withRestartPolicy("Never")
.endSpec()
.endTemplate()
.endSpec()
.build();
logger.info("Creating job pi.");
client.batch().v1().jobs().inNamespace(namespace).createOrReplace(job);
// Get All pods created by the job
PodList podList = client.pods().inNamespace(namespace).withLabel("job-name", job.getMetadata().getName()).list();
// Wait for pod to complete
client.pods().inNamespace(namespace).withName(podList.getItems().get(0).getMetadata().getName())
.waitUntilCondition(pod -> pod.getStatus().getPhase().equals("Succeeded"), 1, TimeUnit.MINUTES);
// Print Job's log
String joblog = client.batch().v1().jobs().inNamespace(namespace).withName("pi").getLog();
logger.info(joblog);
} catch (KubernetesClientException e) {
logger.error("Unable to create job", e);
}
}
}
Option 2: you can also apply a YAML file using the official Kubernetes Java client:
ApiClient client = ClientBuilder.cluster().build(); //create in-cluster client
Configuration.setDefaultApiClient(client);
BatchV1Api api = new BatchV1Api(client);
V1Job job = new V1Job();
job = (V1Job) Yaml.load(new File("<YAML file path>.yaml")); //apply static yaml file
ApiResponse<V1Job> response = api.createNamespacedJobWithHttpInfo("default", job, "true", null, null);
I had the same question, since my team and I needed to develop a web application that makes it possible for any user to start a new execution of our jobs.
I have a job already running in Kubernetes which is scheduled for 4 hours.
If I'm not mistaken, it's not possible to schedule a Job itself on Kubernetes; you need to create the Job from a CronJob, and that's our case.
We have several CronJobs scheduled to start throughout the day, but we also need to be able to start one again after an error or for some other reason.
After some research, I decided to use the Kubernetes-client library.
When I needed to trigger a job manually, I used the kubectl CLI (kubectl create job batch-demo-job --from=cronjob/batch-demo-cronjob -n ns-batch-demo), so I was also looking for a way to make that possible programmatically.
According to an issue opened on the Kubernetes-client GitHub, it is not possible to do that directly; you need to look up your CronJob and then use its spec to create your Job.
So I made a POC and it works as expected; it follows the same logic, but in a more friendly way.
In this example, I just need the cronJob spec to get the volume tag.
fun createJobFromACronJob(namespace: String) {
val client = Config.defaultClient()
Configuration.setDefaultApiClient(client)
try {
val api = BatchV1Api(client)
val cronJob = api.readNamespacedCronJob("$namespace-cronjob", namespace, "true")
val job = api.createNamespacedJob(namespace, createJobSpec(cronJob), "true", null, null, null)
} catch (ex: ApiException) {
System.err.println("Exception when calling BatchV1Api#createNamespacedJob")
System.err.println("Status Code: ${ex.code}")
System.err.println("Reason: ${ex.responseBody}")
System.err.println("Response Header: ${ex.responseHeaders}")
ex.printStackTrace()
}
}
private fun createJobSpec(cronJob: V1CronJob): V1Job {
val namespace = cronJob.metadata!!.namespace!!
return V1Job()
.apiVersion("batch/v1")
.kind("Job")
.metadata(
V1ObjectMeta()
.name("$namespace-job")
.namespace(namespace)
.putLabelsItem("app.kubernetes.io/team", "Jonas-pangare")
.putLabelsItem("app.kubernetes.io/name", namespace.uppercase())
.putLabelsItem("app.kubernetes.io/part-of", "SINC")
.putLabelsItem("app.kubernetes.io/tier", "batch")
.putLabelsItem("app.kubernetes.io/managed-by", "kubectl")
.putLabelsItem("app.kubernetes.io/built-by", "sinc-monitoracao")
)
.spec(
V1JobSpec()
.template(
podTemplate(cronJob, namespace)
)
.backoffLimit(0)
)
}
private fun podTemplate(cronJob: V1CronJob, namespace: String): V1PodTemplateSpec {
return V1PodTemplateSpec()
.spec(
V1PodSpec()
.restartPolicy("Never")
.addContainersItem(
V1Container()
.name(namespace)
.image(namespace)
.imagePullPolicy("Never")
.addEnvItem(V1EnvVar().name("TZ").value("America/Sao_Paulo"))
.addEnvItem(V1EnvVar().name("JOB_NAME").value("helloWorldJob"))
)
.volumes(cronJob.spec!!.jobTemplate.spec!!.template.spec!!.volumes)
)
}
You can also use the Kubernetes client from Fabric8; it's great too, and easier to use.
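For comparison, here is a minimal sketch of the same create-a-Job-from-a-CronJob idea with the Fabric8 client. This is an illustration only: the namespace, CronJob name and label are made-up placeholders, and it assumes a Fabric8 version that exposes CronJobs under client.batch().v1().cronjobs():
import io.fabric8.kubernetes.api.model.batch.v1.CronJob;
import io.fabric8.kubernetes.api.model.batch.v1.Job;
import io.fabric8.kubernetes.api.model.batch.v1.JobBuilder;
import io.fabric8.kubernetes.client.DefaultKubernetesClient;
import io.fabric8.kubernetes.client.KubernetesClient;

public class TriggerJobFromCronJob {
    public static void main(String[] args) {
        try (KubernetesClient client = new DefaultKubernetesClient()) {
            String namespace = "ns-batch-demo";              // placeholder namespace
            CronJob cronJob = client.batch().v1().cronjobs()
                    .inNamespace(namespace)
                    .withName("batch-demo-cronjob")          // placeholder CronJob name
                    .get();

            // Reuse the CronJob's jobTemplate spec, the same idea as the Kotlin POC above
            Job job = new JobBuilder()
                    .withNewMetadata()
                        .withName("batch-demo-job-manual")
                        .withNamespace(namespace)
                        .addToLabels("triggered-by", "manual-api")
                    .endMetadata()
                    .withSpec(cronJob.getSpec().getJobTemplate().getSpec())
                    .build();

            client.batch().v1().jobs().inNamespace(namespace).createOrReplace(job);
        }
    }
}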

Heroku throwing R14 (memory) errors when running Selenium tests

I've been trying to get Selenium tests set up on Heroku. However, I've observed that even when a small test runs, Heroku throws R14 errors due to memory consumption.
I started out running these tests on free dynos and have upgraded to Standard 2X dynos, and I still see this issue. I can't imagine that these tests require more than 1GB of RAM to run. I suppose it's possible, but I was interested in hearing other people's experiences with this and seeing if anyone has run into similar issues and had solutions.
Locally the tests are super speedy and run fine. I know that on the Heroku side the buildpacks need to dynamically download Chrome and the driver each time the tests are run, so a little slowness there is to be expected.
A couple notes about my process:
I'm using the Gradle wrapper to run the test script
I'm running the Selenium test via the jUnit framework
I'm using the heroku/google-chrome and heroku/chromedriver buildpacks
Here's my setup so you can see if I'm doing anything strange...
SeleniumConfig.kt
class SeleniumConfig {
val driver: WebDriver
init {
val options = ChromeOptions()
val driverFile = System.getenv("CHROME_DRIVER_PATH")
val binary = System.getenv("GOOGLE_CHROME_BIN")
options.apply {
addArguments("--enable-javascript")
addArguments("--start-maximized")
addArguments("--incognito")
addArguments("--headless")
addArguments("--disable-gpu")
addArguments("--no-sandbox")
}
val service = ChromeDriverService.Builder()
.usingDriverExecutable(File(driverFile))
.build()
binary?.let { options.setBinary(it) }
driver = ChromeDriver(service, options)
driver.manage().timeouts().implicitlyWait(60, TimeUnit.SECONDS)
}
}
BaseSeleniumTest.kt
open class BaseSeleniumTest(private val path: String) {
companion object {
const val DEFAULT_WAIT = 5000L
}
protected val config = SeleniumConfig()
protected val driver: WebDriver
get() = config.driver
@BeforeEach
fun setUp() {
driver.get("https://<mysite>$path")
implicitWait()
}
@AfterEach
fun tearDown() {
driver.close()
driver.quit()
}
fun implicitWait() {
driver.manage().timeouts().implicitlyWait(DEFAULT_WAIT, TimeUnit.MILLISECONDS)
}
}
SigninTest.kt
class SigninTest : BaseSeleniumTest("/signin") {
@Test
fun `Succeed signing in`() {
val username = driver.findElement(By.name("email"))
val password = driver.findElement(By.name("password"))
username.sendKeys("test@test.com")
password.sendKeys("password")
username.submit()
implicitWait()
}
@Test
fun `Fail with invalid username`() {
val username = driver.findElement(By.name("email"))
val password = driver.findElement(By.name("password"))
username.sendKeys("wrong")
password.sendKeys("password")
username.submit()
implicitWait()
assertEquals("Invalid email or password",
driver.findElement(By.cssSelector(".form-error")).text)
}
@Test
fun `Fail with invalid password`() {
val username = driver.findElement(By.name("email"))
val password = driver.findElement(By.name("password"))
username.sendKeys("test@test.com")
password.sendKeys("wrong")
username.submit()
implicitWait()
assertEquals("Invalid email or password",
driver.findElement(By.cssSelector(".form-error")).text)
}
}
Any help that anyone could give me would be appreciated!

Spring Boot rest controller not working when packaged in a jar

I am developing a REST API using a Spring Boot rest controller. Something strange is happening: when I test my controller from Eclipse it works just fine, but when I deploy the app, packaged in a jar and started with the java command line in a Docker container, it doesn't work.
What confuses me is that there is no log. When I put a sysout at the very beginning of my controller I realized that the controller is not even executed!
Here is the controller with the endpoint concerned, but I am not sure it will help:
@RestController
@RequestMapping("/pdf")
@EnableSwagger2
public class PDFGeneratorResources {
@Autowired
PDFGenerator pdfService;
@Autowired
ResourceLoader resourceLoader;
@PostMapping("/generate-recipies-shoppinglist")
public ResponseEntity<String> generateRecipiesAndShoppingListPDF(@RequestBody List<Day> daysList) {
System.out.println("TRACE");
ResponseEntity<String> responseEntity = null;
String generatedPDFFileURL = "";
try {
generatedPDFFileURL = pdfService.generatePDFFromHTML(PDFTemplates.RecipiesAndShoppingList,
new RecipiesShoppinglistContextBuilder(new ArrayList<Day>(daysList)));
responseEntity = new ResponseEntity<String>(generatedPDFFileURL, HttpStatus.OK);
} catch (Exception e) {
e.printStackTrace();
responseEntity = new ResponseEntity<>(HttpStatus.INTERNAL_SERVER_ERROR);
}
return responseEntity;
}
}
Question: Is there any way to make Spring Boot log everything that's happening between Tomcat and my controller? Kind of a --verbose option for Spring Boot?
PS:
Here is the DockerFile I am using to deploy the app
FROM registry.gitlab.com/softreaver/meals-ready-backend/runners:centos7jdk11
MAINTAINER MILAZZO_christopher
COPY ./target/*.jar /app.jar
RUN echo -e "/usr/bin/java -Xms128m -Xmx128m -jar /app.jar\n" > /start-app.sh
RUN chmod u+x /start-app.sh
EXPOSE 8080
ENTRYPOINT ["/bin/bash", "/start-app.sh"]
I finally found the problem thanks to logging.level.root=debug: I am using the Spring ResourceLoader to load the template for my PDF service, but it seems that it is not able to resolve the resources folder to a file path inside a jar.
It says: cannot be resolved to absolute file path because it does not reside in the file system: jar:file:/app.jar!/BOOT-INF/classes!/templates/......
I found a solution on the internet and made it work by using an InputStream:
@Service
public class ResourceLoaderService {
private final Logger logger = LoggerFactory.getLogger(this.getClass());
@Autowired
ResourceLoader resourceLoader;
public String getResourceAbsolutePathString(String location) throws Exception {
Resource resource = resourceLoader.getResource(location);
String absolutePathString = "/";
try {
if (resource.getURL().getProtocol().equals("jar")) {
logger.debug("Jar file system activated");
File tempFile = Files.createTempFile("Mealsready_backend_", null).toFile();
// copy the classpath resource to a temp file; try-with-resources closes both streams so the handles are not leaked
try (InputStream in = resource.getInputStream(); FileOutputStream out = new FileOutputStream(tempFile)) {
in.transferTo(out);
}
absolutePathString = tempFile.getAbsolutePath();
} else {
absolutePathString = resource.getFile().getAbsolutePath();
}
} catch (IOException e) {
logger.error("Error while trying to retrieve a resource : " + e.getMessage());
// TODO: replace this with a ServiceException
throw new Exception();
}
return absolutePathString;
}
}
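A related note (a sketch of an alternative, not part of the original solution): if the resource only ever needs to be read, the temp-file copy can be skipped entirely by handing the InputStream straight to the consumer, since classpath resources inside a jar are readable as streams even though they have no real file path. The template path below is a made-up example:
import org.springframework.core.io.ClassPathResource;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;

public class TemplateReader {
    // reads a bundled template directly from the classpath; works both in the IDE and inside the jar
    public String readTemplate() throws Exception {
        ClassPathResource resource = new ClassPathResource("templates/example-template.html"); // hypothetical path
        try (InputStream in = resource.getInputStream()) {
            return new String(in.readAllBytes(), StandardCharsets.UTF_8);
        }
    }
}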

How to execute JavaFX Tasks and Services in a sequential manner

With my Controller class I have to execute several IO commands (e.g. SSH, RCP commands with some parameter values) in a sequential manner. Each of these commands takes some amount of time to execute.
I have to update the UI when each command starts to execute.
Then, depending on the execution result (success or failure), I have to update the UI again.
Then I have to execute the next command with the same steps.
Execution of each command depends on the result of the previous command. As an example,
for (IOCommand command : commandsList) {
// Update the UI before start the command execution
messageTextArea.append("Command " + command.getType() + " Stated");
boolean result = commandExecutor(command);
if(result) {
// Update the UI after successful execution
messageTextArea.append("Command " + command.getType() + " Successfully Executed");
// Then go to next command execution
} else {
// Update the UI after failure execution
messageTextArea.append("Command " + command.getType() + " Failed");
// Fix the issue and do re execution
commandReExecutor(command);
}
}
To accomplish this gradual UI update I have to use JavaFX Task or Service features (otherwise the application will hang until all commands have finished, and the UI will only update at the end, all at once). But due to the nature of concurrency I cannot work out how to execute these commands with the help of Task or Service in a sequential manner (not all at once, but one after another). How can I address this problem? Thanks in advance.
I had the exact same requirement in a project, and it can be done with Task and Service. You just need a correct implementation.
A few notes:
1. Always start a background task using a Service, or by submitting a Task to a background thread/executor.
2. If you want to update the UI from that task, it must be done through the Task or Service machinery (updateMessage/updateProgress) or with Platform.runLater.
3. Bind the progress property of the task to that of a progress bar for smooth updates.
4. Similarly, bind the text property of a Label to the message property of the task for smooth updates of the status or anything else (see the small binding sketch below).
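For illustration, a minimal sketch of the bindings from notes 3 and 4, assuming a ProgressBar and Label already exist in the controller and using the ProcessExecutor task shown further down:
// keep the UI in sync with a running background task via property bindings
Task<Integer> task = new ProcessExecutor("installTomcat.bat", "tomcat7");
progressBar.progressProperty().bind(task.progressProperty());
statusLabel.textProperty().bind(task.messageProperty());
new Thread(task).start(); // or submit it to the single-thread executor shown at the end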
To execute external commands (shell scripts, etc.) I've written the following class:
package utils;
import controller.ProgressController;
import java.io.BufferedReader;
import java.io.File;
import java.io.InputStreamReader;
import java.util.Map;
import java.util.logging.Level;
import java.util.logging.Logger;
import javafx.concurrent.Task;
import main.Installer;
public class ProcessExecutor extends Task<Integer>
{
Logger logger =Logger.getLogger("ProcessExecutor");
File dir;
String []cmd;
String cmds;
int exitCode=-1;
boolean NextStepExists=false;
Task nextStep;
public ProcessExecutor(String...cmd )
{
this.cmd=cmd;
this.dir=new File(System.getProperty("user.dir"));
this.nextStep=null;
NextStepExists=false;
}
public ProcessExecutor(Task nextStep,String...cmd )
{
this.cmd=cmd;
this.dir=new File(System.getProperty("user.dir"));
this.nextStep=nextStep;
NextStepExists=true;
}
public ProcessExecutor(Task nextStep,File dir,String...cmd)
{
this.cmd=cmd;
this.dir=dir;
this.nextStep=nextStep;
NextStepExists=true;
}
@Override
protected final Integer call()
{
cmds=new String();
for(String i:cmd)
cmds+=i+" "; // just to log cmd array
try
{
logger.info("Starting new process with cmd > "+cmds);
ProcessBuilder processBuilder=new ProcessBuilder(cmd);
processBuilder.directory(dir);
processBuilder.redirectErrorStream(true);
Map<String, String> env = processBuilder.environment();
// create custom environment
env.put("JAVA_HOME", "/opt/jdk1.7.0_45/");
Process pr=processBuilder.start();
BufferedReader in = new BufferedReader(new InputStreamReader(pr.getInputStream()));
String line = in.readLine();
while (line != null) {
logger.log(Level.FINE,line);
ProgressController.instance.printToConsole(line);
line = in.readLine();
}
BufferedReader er = new BufferedReader(new InputStreamReader(pr.getErrorStream()));
// note: with redirectErrorStream(true) above, stderr is already merged into stdout,
// but if you keep this block it must read from er, not in
String erLine = er.readLine();
while (erLine != null) {
logger.log(Level.FINE,erLine);
ProgressController.instance.printToConsole(erLine);
erLine = er.readLine();
}
exitCode=pr.waitFor();
exitCode=pr.exitValue();
logger.info("Exit Value="+exitCode);
updateMessage("Completed Process");
if(exitCode!=0 && exitCode!=1)
{
logger.info("Failed to execute process commands >"+cmds+" with exit code="+exitCode);
failed();
}
else
{
logger.info("PE succeeded()");
if(NextStepExists)
Installer.pool.submit(nextStep);
succeeded();
}
}
catch(Exception e)
{
logger.log(Level.SEVERE,"Exception: Failed to execute process commands >"+cmds,e);
updateMessage(e.getMessage());
}
return new Integer(exitCode);
}
@Override
public void failed()
{
super.failed();
logger.log(Level.SEVERE,"Failed to execute process commands >"+cmds+"; ExitCode="+exitCode);
}
}
This class uses ProcessBuilder to create the required environment for the new process.
It waits for the process to finish using process.waitFor(), and the working directory of the process can be set using processBuilder.directory(dir). In order to execute a single Task<> at any time, use a java.util.concurrent.ExecutorService:
public ExecutorService pool=Executors.newSingleThreadExecutor();
pool.submit(new ProcessExecutor("installTomcat.bat","tomcat7"));
pool.submit(new ProcessExecutor("installPostgres.bat","postgresql","5432"));
In this way you can execute batch files one after another. Executors.newSingleThreadExecutor() takes care of executing a single task at any time and queueing the newly submitted tasks. I've written a generalized working example of sequential execution on
GitHub. It is a NetBeans JavaFX project and a generalized, stripped-down version of a real project. Hope this helps.

Self-destructing application

Along the lines of "This tape will self-destruct in five seconds. Good luck, Jim"...
Would it be possible for an application to delete itself (or its executable wrapper form) once a preset time of use or other condition has been reached?
Alternatively, what other approaches could be used to make the application useless?
The aim here is to have a beta expire, inviting users to get a more up-to-date version.
It is possible. To get around the lock on the JAR file, your application may need to spawn a background process that waits until the JVM has exited before deleting stuff.
However, this isn't bomb-proof. Someone could install the application and then make the installed files and directories read-only so that your application can't delete itself. The user (or their administrator), via the OS's access control system, has the final say on what files are created and deleted.
If you control where testers download your application, you could use an automated build system (e.g. Jenkins) to create a new beta version every night that has a hard-coded expiry date:
private static final Date EXPIRY_DATE = <90 days in the future from build date>; // inserted automatically by the build process
if (EXPIRY_DATE.before(new Date())) {
System.out.println("Get a new beta version, please");
System.exit(1);
}
Mix that with signed and sealed jars to put obstacles in the way of decompiling the bytecode and providing an alternative implementation that doesn't include that check, and you can hand out a time-expiring beta of the code.
The automated build system could be configured to automatically upload the beta version to the server hosting the download version.
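One way a build could inject that expiry without rewriting source files is to drop it into a small properties resource inside the jar and read it at startup. This is only a sketch under assumed names; build-info.properties and its expiry key are made up for illustration:
import java.io.InputStream;
import java.util.Date;
import java.util.Properties;

public final class BetaExpiryCheck {
    // hypothetical: the CI build writes "expiry=<epoch millis ~90 days ahead>" into build-info.properties
    public static void enforceExpiry() throws Exception {
        Properties props = new Properties();
        try (InputStream in = BetaExpiryCheck.class.getResourceAsStream("/build-info.properties")) {
            props.load(in);
        }
        Date expiry = new Date(Long.parseLong(props.getProperty("expiry")));
        if (expiry.before(new Date())) {
            System.out.println("Get a new beta version, please");
            System.exit(1);
        }
    }
}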
Since Windows locks the JAR file while it is running, you cannot delete it from your own Java code; hence you need a batch file:
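// note: FilenameUtils comes from Apache Commons IO, and SystemUtils (used further down) from Apache Commons Lang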
private static void selfDestructWindowsJARFile() throws Exception
{
String resourceName = "self-destruct.bat";
File scriptFile = File.createTempFile(FilenameUtils.getBaseName(resourceName), "." + FilenameUtils.getExtension(resourceName));
try (FileWriter fileWriter = new FileWriter(scriptFile);
PrintWriter printWriter = new PrintWriter(fileWriter))
{
printWriter.println("taskkill /F /IM \"java.exe\"");
printWriter.println("DEL /F \"" + ProgramDirectoryUtilities.getCurrentJARFilePath() + "\"");
printWriter.println("start /b \"\" cmd /c del \"%~f0\"&exit /b");
}
Desktop.getDesktop().open(scriptFile);
}
public static void selfDestructJARFile() throws Exception
{
if (SystemUtils.IS_OS_WINDOWS)
{
selfDestructWindowsJARFile();
} else
{
// Unix does not lock the JAR file so we can just delete it
File directoryFilePath = ProgramDirectoryUtilities.getCurrentJARFilePath();
Files.delete(directoryFilePath.toPath());
}
System.exit(0);
}
ProgramDirectoryUtilities class:
public class ProgramDirectoryUtilities
{
private static String getJarName()
{
return new File(ProgramDirectoryUtilities.class.getProtectionDomain()
.getCodeSource()
.getLocation()
.getPath())
.getName();
}
public static boolean isRunningFromJAR()
{
String jarName = getJarName();
return jarName.contains(".jar");
}
public static String getProgramDirectory()
{
if (isRunningFromJAR())
{
return getCurrentJARDirectory();
} else
{
return getCurrentProjectDirectory();
}
}
private static String getCurrentProjectDirectory()
{
return new File("").getAbsolutePath();
}
public static String getCurrentJARDirectory()
{
try
{
return getCurrentJARFilePath().getParent();
} catch (URISyntaxException exception)
{
exception.printStackTrace();
}
throw new IllegalStateException("Unexpected null JAR path");
}
public static File getCurrentJARFilePath() throws URISyntaxException
{
return new File(ProgramDirectoryUtilities.class.getProtectionDomain().getCodeSource().getLocation().toURI().getPath());
}
}
Solution inspired by this question.
Here is a better method for Windows:
private static void selfDestructWindowsJARFile() throws Exception
{
String currentJARFilePath = ProgramDirectoryUtilities.getCurrentJARFilePath().toString();
Runtime runtime = Runtime.getRuntime();
runtime.exec("cmd /c ping localhost -n 2 > nul && del \"" + currentJARFilePath + "\"");
}
Here is the original answer.
It is quite possible, I guess. Maybe you can delete the jar like this and make sure the application vanishes, given that you have the rights:
File jar = new File(".\\app.jar");
jar.deleteOnExit();
System.exit(0);
Also, using something like the Nullsoft Scriptable Install System, which enables you to write your own installer/uninstaller, should help.
