I have a Gherkin executor where I execute my feature files. What I would like to do is add a StepDefinition file from another jar. The user would be able to use my project with the step definitions that I have already written, but he would also be able to add custom definitions from his own jar file.
Currently I have a JavaClassLoader where I load my class from my jar, and I use it in my main:
import java.io.File;
import java.lang.reflect.Constructor;
import java.lang.reflect.InvocationTargetException;
import java.net.MalformedURLException;
import java.net.URL;
import java.net.URLClassLoader;

public class JavaClassLoader<C> extends ClassLoader {

    public C LoadClass(String directory, String classpath, Class<C> parentClass) throws ClassNotFoundException {
        File pluginsDir = new File(System.getProperty("user.dir") + directory);
        for (File jar : pluginsDir.listFiles()) {
            try {
                ClassLoader loader = URLClassLoader.newInstance(
                        new URL[] { jar.toURL() },
                        getClass().getClassLoader()
                );
                Class<?> clazz = Class.forName(classpath, true, loader);
                Class<? extends C> newClass = clazz.asSubclass(parentClass);
                // Apparently it's bad to use Class.newInstance, so we use
                // newClass.getConstructor() instead
                Constructor<? extends C> constructor = newClass.getConstructor();
                return constructor.newInstance();
            } catch (ClassNotFoundException e) {
                // There might be multiple JARs in the directory,
                // so keep looking
                continue;
            } catch (MalformedURLException e) {
                e.printStackTrace();
            } catch (NoSuchMethodException e) {
                e.printStackTrace();
            } catch (InvocationTargetException e) {
                e.printStackTrace();
            } catch (IllegalAccessException e) {
                e.printStackTrace();
            } catch (InstantiationException e) {
                e.printStackTrace();
            }
        }
        throw new ClassNotFoundException("Class " + classpath
                + " wasn't found in directory " + System.getProperty("user.dir") + directory);
    }
}
JavaClassLoader<AbstractStepDefs> loader = new JavaClassLoader<AbstractStepDefs>();
loader.LoadClass("/", "stepDef.dynamicClass", AbstractStepDefs.class);
The problem is that Cucumber isn't able to read the methods that I wrote in my other jar. Is there a way to use a step def file that isn't in the project?
Is there a way to use a step def file that isn't in the project?
Yes, and no. The yes part is closer to "sort of". Git supports a couple of ways of referencing subprojects within other projects. The common "subprojects" are maintained in their own repositories and then pulled into the projects that use them. Look here for a discussion. I looked into doing submodules once. I even had it working. But I couldn't convince the TeamCity owners to support it. It works, but you have to be careful about how you use it. There are haters.
What I ended up doing was to create a shared "global" project that contained page files for the common login and navigation pages. This global project also contained all the startup and shutdown support for different browsers and for remote execution on SauceLabs.
The step definitions had to be repeated (yuck; I prefer DRY) but these are small as they mostly just call the page file methods. All of these web pages are defined in the global project in their own class files. The common housekeeping code is defined in class WebDriverManager.
@Given("^I navigate to the public ACME WebPage and select Login$")
public void iNavigateToTheAcmePublicWebPage() {
    pageFactory.AcmePublicWebPage().navigateTo(
            WebDriverManager.instance().getAcmeUrl());
    pageFactory.AcmePublicWebPage().closeNotificationPopUp(); // If there is one
    pageFactory.AcmePublicWebPage().selectLoginLink();
}

@When("^I close the browser$")
public void iCloseTheBrowser() {
    WebDriverManager.instance().closeBrowser();
}
I have reduced most, but not all duplication. Most junior test automation engineers don't have to worry about the heavy lifting as long as I maintain the global git project and notify them when they need to download a new global jar from TeamCity.
I have a Gradle project with Spring Boot and AspectJ.
I want to load the aspectjweaver and spring-instrument javaagents dynamically, directly from WEB-INF/libs (where Spring Boot locates all dependencies).
Gradle dependencies:
AgentLoader:
public class AgentLoader {
    private static final Logger LOGGER = LoggerFactory.getLogger(AgentLoader.class);

    public static void loadJavaAgent() {
        if (!isAspectJAgentLoaded()) {
            LOGGER.warn("Aspect agent was not loaded!");
        }
    }

    public static boolean isAspectJAgentLoaded() {
        try {
            Agent.getInstrumentation();
        } catch (NoClassDefFoundError e) {
            return false;
        } catch (UnsupportedOperationException e) {
            LOGGER.info("Dynamically load AspectJAgent");
            return dynamicallyLoadAspectJAgent();
        }
        return true;
    }

    public static boolean dynamicallyLoadAspectJAgent() {
        String nameOfRunningVM = ManagementFactory.getRuntimeMXBean().getName();
        int p = nameOfRunningVM.indexOf('@');
        String pid = nameOfRunningVM.substring(0, p);
        try {
            VirtualMachine vm = VirtualMachine.attach(pid);
            String jarFilePath = AgentLoader.class.getClassLoader().getResource("WEB-INF/libs/aspectjweaver-1.9.6.jar").toString();
            vm.loadAgent(jarFilePath);
            jarFilePath = AgentLoader.class.getClassLoader().getResource("WEB-INF/libs/spring-instrument-5.3.2.jar").toString();
            vm.loadAgent(jarFilePath);
            vm.detach();
        } catch (Exception e) {
            LOGGER.error("Exception while attaching agent", e);
            return false;
        }
        return true;
    }
}
But I found out that the return value of getResource() is null.
What is the best solution to handle this issue?
Nikita, today is your lucky day. I just had a moment and was curious how to make my code snippet from https://www.eclipse.org/aspectj/doc/released/README-187.html, which obviously you found before, work in the context of Spring Boot. I just used my Maven Spring Boot playground project. Depending on which Java version you are using, you either need to make sure that tools.jar from JDK 8 is defined as a system-scoped dependency and also copied into the executable Spring uber JAR, or you need to make sure that the Java attach API is activated in Java 9+. Here is what I did for Java 8:
Maven:
<dependency>
    <groupId>com.sun</groupId>
    <artifactId>tools</artifactId>
    <version>1.8</version>
    <scope>system</scope>
    <systemPath>${java.home}/../lib/tools.jar</systemPath>
</dependency>

<!-- (...) -->

<plugin>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-maven-plugin</artifactId>
    <configuration>
        <mainClass>spring.aop.DemoApplication</mainClass>
        <!-- Important for tools.jar on Java 8 -->
        <includeSystemScope>true</includeSystemScope>
    </configuration>
</plugin>
The <includeSystemScope> option is necessary because otherwise Boot does not know how to find the attach API classes. Just do something equivalent in Gradle and you should be fine.
Java:
You need to know that in order to attach an agent, it must be a file on the file system, not just any resource or input stream. This is how the attach API works. So unfortunately, you have to copy it from the uber JAR to the file system first. Here is how you do it:
public static boolean dynamicallyLoadAspectJAgent() {
    String nameOfRunningVM = ManagementFactory.getRuntimeMXBean().getName();
    int p = nameOfRunningVM.indexOf('@');
    String pid = nameOfRunningVM.substring(0, p);
    try {
        VirtualMachine vm = VirtualMachine.attach(pid);
        ClassLoader classLoader = AgentLoader.class.getClassLoader();
        try (InputStream nestedJar = Objects.requireNonNull(classLoader.getResourceAsStream("BOOT-INF/lib/aspectjweaver-1.9.4.jar"))) {
            File targetFile = new File("aspectjweaver.jar");
            java.nio.file.Files.copy(nestedJar, targetFile.toPath(), StandardCopyOption.REPLACE_EXISTING);
            vm.loadAgent(targetFile.getAbsolutePath());
        }
        try (InputStream nestedJar = Objects.requireNonNull(classLoader.getResourceAsStream("BOOT-INF/lib/spring-instrument-5.1.9.RELEASE.jar"))) {
            File targetFile = new File("spring-instrument.jar");
            java.nio.file.Files.copy(nestedJar, targetFile.toPath(), StandardCopyOption.REPLACE_EXISTING);
            vm.loadAgent(targetFile.getAbsolutePath());
        }
        vm.detach();
    }
    catch (Exception e) {
        LOGGER.error("Exception while attaching agent", e);
        return false;
    }
    return true;
}
Besides, in my case the files were under BOOT-INF/lib, not WEB-INF/lib.
Update: You said you have this follow-up problem somewhere along the line (reformatted for readability):
failed to access class
org.aspectj.weaver.loadtime.Aj$WeaverContainer
from class
org.aspectj.weaver.loadtime.Aj
(
org.aspectj.weaver.loadtime.Aj$WeaverContainer is in
unnamed module of
loader 'app';
org.aspectj.weaver.loadtime.Aj is in
unnamed module of
loader org.springframework.boot.loader.LaunchedURLClassLoader @3e9b1010
)
at org.aspectj.weaver.loadtime.Aj.preProcess(Aj.java:108)
This means that Aj is unable to find its own inner class Aj.WeaverContainer. This indicates that they are loaded at different points in time and in different classloaders. When remote-debugging into my sample Boot application starting from an executable JAR, I see that the application classloader is actually the LaunchedURLClassLoader's parent, i.e. the class loaded in the parent is trying to access another class only available to its child classloader, which is impossible in Java. It only works the other way around.
Maybe it helps not to import and reference AspectJ weaver classes from inside the agent loader. Try commenting out the loadJavaAgent() and isAspectJAgentLoaded() methods and also remove import org.aspectj.weaver.loadtime.Agent;. Then in your application just directly call AgentLoader.dynamicallyLoadAspectJAgent() and see if this helps. I have some more aces up my sleeves with regard to agent loading, but let's keep it as simple as possible first.
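For illustration, a stripped-down loader along those lines might look roughly like this (a minimal sketch derived from the code above, with no AspectJ types referenced at all; the nested JAR paths are the ones used in this answer and are assumptions about your build layout):

import java.io.File;
import java.io.InputStream;
import java.lang.management.ManagementFactory;
import java.nio.file.Files;
import java.nio.file.StandardCopyOption;
import java.util.Objects;

import com.sun.tools.attach.VirtualMachine;

// Deliberately does not import org.aspectj.weaver.loadtime.Agent, so loading this
// class never pulls AspectJ types into the application classloader.
public class AgentLoader {

    public static boolean dynamicallyLoadAspectJAgent() {
        String nameOfRunningVM = ManagementFactory.getRuntimeMXBean().getName();
        String pid = nameOfRunningVM.substring(0, nameOfRunningVM.indexOf('@'));
        try {
            VirtualMachine vm = VirtualMachine.attach(pid);
            attachNestedJar(vm, "BOOT-INF/lib/aspectjweaver-1.9.4.jar", "aspectjweaver.jar");
            attachNestedJar(vm, "BOOT-INF/lib/spring-instrument-5.1.9.RELEASE.jar", "spring-instrument.jar");
            vm.detach();
            return true;
        } catch (Exception e) {
            e.printStackTrace();
            return false;
        }
    }

    // Copies a JAR nested inside the Boot uber JAR to the file system, because the
    // attach API can only load agents from plain files, then attaches it.
    private static void attachNestedJar(VirtualMachine vm, String resource, String targetName) throws Exception {
        ClassLoader classLoader = AgentLoader.class.getClassLoader();
        try (InputStream nestedJar = Objects.requireNonNull(classLoader.getResourceAsStream(resource))) {
            File targetFile = new File(targetName);
            Files.copy(nestedJar, targetFile.toPath(), StandardCopyOption.REPLACE_EXISTING);
            vm.loadAgent(targetFile.getAbsolutePath());
        }
    }
}

Your application would then call AgentLoader.dynamicallyLoadAspectJAgent() directly, as early as possible during startup.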
Java fails to launch when the classpath is too long. The length limit is particularly short on Windows.
Gradle seems uninterested in fixing the issue on their side (even though it's sort of their responsibility, since they're the ones launching Java), so we ended up replacing the JavaExec task with our own alternative.
The alternative works like this:
public class WorkingJavaExec extends JavaExec {
    private static final String MATCH_CHUNKS_OF_70_CHARACTERS =
        "(?<=\\G.{70})";

    private final Logger logger = LoggerFactory.getLogger(getClass());

    @Override
    public void exec() {
        FileCollection oldClasspath = getClasspath();
        File jarFile = null;
        try {
            if (!oldClasspath.isEmpty()) {
                try {
                    jarFile =
                        toJarWithClasspath(oldClasspath.getFiles());
                    setClasspath(getProject().files(jarFile));
                } catch (IOException e) {
                    throw new UncheckedIOException(e);
                }
            }
            super.exec();
        } finally {
            setClasspath(oldClasspath);
            if (jarFile != null) {
                try {
                    Files.delete(jarFile.toPath());
                } catch (Exception e) {
                    logger.warn("Couldn't delete: " + jarFile, e);
                }
            }
        }
    }

    public static File toJarWithClasspath(Set<File> files)
            throws IOException {
        File jarFile = File.createTempFile("long-classpath", ".jar");
        try (ZipOutputStream zip =
                 new ZipOutputStream(new FileOutputStream(jarFile))) {
            zip.putNextEntry(new ZipEntry("META-INF/MANIFEST.MF"));
            try (PrintWriter writer =
                     new PrintWriter(
                         new OutputStreamWriter(
                             zip, StandardCharsets.UTF_8))) {
                writer.println("Manifest-Version: 1.0");
                String classPath = files.stream().map(
                        file -> file.toURI().toString())
                    .collect(Collectors.joining(" "));
                String classPathEntry = "Class-Path: " + classPath;
                writer.println(Arrays.stream(
                        classPathEntry.split(MATCH_CHUNKS_OF_70_CHARACTERS))
                    .collect(Collectors.joining("\n ")));
            }
        }
        return jarFile;
    }
}
Using this is cumbersome, though, because everywhere someone might run JavaExec, I have to replace it with WorkingJavaExec. New developers also don't know that there is this pitfall in Gradle in the first place, so they don't even know it's something they have to work around.
In reading the internals of Gradle, I saw that JavaExec internally uses a JavaExecAction to do the actual exec.
I thought that maybe by replacing this, we could fix the problem as if Gradle had fixed it themselves, and maybe it would then also apply to other tasks, such as Test. But I haven't been able to find any examples anywhere. (Even in other large projects, which you would expect to have hit the same issue!)
Is it possible to substitute JavaExecAction, and if so, how?
I'm not sure you can "substitute" JavaExecAction because it is set during JavaExec task instantiation, but I think you can solve this problem in a nicer way, using a custom Plugin as follows:
class FixClasspathLimitPlugin implements Plugin<Project> {
    @Override
    void apply(Project project) {
        // after project has been evaluated, hack into all tasks of type JavaExec declared.
        project.afterEvaluate {
            project.tasks.stream().filter { task -> task instanceof JavaExec }.forEach {
                println "Reconfiguring classpath for : $it"
                JavaExec javaExec = (JavaExec) it;
                FileCollection oldClasspath = javaExec.getClasspath()
                // insert an Action at first position, that will change classpath
                javaExec.doFirst { task ->
                    ((JavaExec) task).setClasspath(getProject().files(toJarWithClasspath(oldClasspath.getFiles())));
                }
                // optional - reset old classpath
                javaExec.doLast { task ->
                    ((JavaExec) task).setClasspath(oldClasspath)
                }
            }
        }
    }

    public static File toJarWithClasspath(Set<File> files)
            throws Exception {
        // same method implementation as given in your question
    }
}
This way, you won't have to replace JavaExec in all build scripts written by your team; you will only have to ensure that these scripts apply your plugin.
And if you use a custom distribution of Gradle and use the wrapper in your enterprise, you can even include this plugin in that distribution as an init script, as explained here: https://docs.gradle.org/current/userguide/init_scripts.html#sec:using_an_init_script
Put a file that ends with .gradle in the GRADLE_HOME/init.d/ directory, in the Gradle distribution. This allows you to package up a custom Gradle distribution containing some custom build logic and plugins. You can combine this with the Gradle wrapper as a way to make custom logic available to all builds in your enterprise.
This way, the plugin will be applied in a "transparent" way.
Concerning the Test task: it does not use JavaExecAction, I think, but a similar solution could be applied, using a similar plugin, as sketched below.
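For instance, a plugin handling Test tasks the same way could look roughly like this (a sketch only, written in Java rather than Groovy and reusing the toJarWithClasspath() helper from the question; treat it as an untested starting point):

import java.io.File;
import java.io.IOException;
import java.io.UncheckedIOException;

import org.gradle.api.Plugin;
import org.gradle.api.Project;
import org.gradle.api.file.FileCollection;
import org.gradle.api.tasks.testing.Test;

public class FixTestClasspathLimitPlugin implements Plugin<Project> {
    @Override
    public void apply(Project project) {
        project.afterEvaluate(p ->
            p.getTasks().withType(Test.class).forEach(test ->
                // Swap the long classpath for a single "pathing JAR" just before the task runs.
                test.doFirst(task -> {
                    FileCollection oldClasspath = test.getClasspath();
                    try {
                        File pathingJar = WorkingJavaExec.toJarWithClasspath(oldClasspath.getFiles());
                        test.setClasspath(p.files(pathingJar));
                    } catch (IOException e) {
                        throw new UncheckedIOException(e);
                    }
                })));
    }
}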
You can use the jar task to add the class path to the manifest for you:
jar {
    baseName = "my-app"
    version = "1.0.0"
    manifest {
        attributes("Class-Path": configurations.compile.collect { it.getName() }.join(' '))
    }
}
And then you can reference that jar when launching:
task run(type: JavaExec) {
    classpath = jar.outputs.files
    main = "myapp.MainClass"
}
That works around the command line path limit. You might also want to copy the dependency JARs to the output folder, so they will be available at runtime.
task copyDependencies(type: Copy, dependsOn: [ "build" ]) {
    from configurations.runtime
    into "./build/libs"
}

build.finalizedBy(copyDependencies)
I just downloaded the source code of the simplexml library:
http://simple.sourceforge.net/download.php
I want to make a few modifications to the source code and use it in Android. Unfortunately there are two classes, StreamProvider and StreamReader, that need an external reference. The original project provides these libraries, but when you try to use them in an Android project you get compilation errors.
How do I modify the source code of this library and use it in Android? I want to use the source of this library directly and be able to modify it, not just consume it through Gradle or as a jar file.
You can delete the StreamProvider.java and StreamReader.java classes; they are not compatible with Android.
Then you have to choose an XML provider: in ProviderFactory.java use PullProvider or DocumentProvider.
public static Provider getInstance() {
    /*
    try {
        try {
            return new StreamProvider();
        } catch(Throwable e) {
            return new PullProvider();
        }
    } catch(Throwable e) {
        return new DocumentProvider();
    }
    */
    try {
        return new PullProvider();
    } catch (Exception e) {
        e.printStackTrace();
    }
    return new DocumentProvider();
}
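After that change, the library is used just like the standard Simple XML API; for example (a minimal usage sketch, where the annotated Example class is hypothetical and not part of the library):

import org.simpleframework.xml.Element;
import org.simpleframework.xml.Root;
import org.simpleframework.xml.Serializer;
import org.simpleframework.xml.core.Persister;

@Root
public class Example {
    @Element
    private String text;

    public String getText() {
        return text;
    }
}

// Somewhere in your Android code:
// Serializer serializer = new Persister();
// Example example = serializer.read(Example.class, "<example><text>hello</text></example>");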
I have two classes:
MainClaz
MyTest2
In MainClaz:
public class MainClaz {
    public static void main(String[] args) throws InterruptedException {
        while (true) {
            try {
                Class aClass = Class.forName("com.test.MyTest2");
                Object t = aClass.newInstance();
            } catch (Exception e) {
                System.out.println("Exception For MyTest2 ");
            }
            Thread.sleep(10000);
            try {
                Class aClass = Class.forName("com.test.MyTest3");
                Object t = aClass.newInstance();
            } catch (Exception e) {
                System.out.println("Exception For MyTest3 ");
                e.printStackTrace();
            }
        }
    }
}
I have packaged both classes in a jar (Jar 1) and put it on the classpath.
Since MyTest3 does not exist in this jar, it will keep throwing ClassNotFoundException.
Now let's say I create a new jar (Jar 2) containing class MyTest3 and copy this jar to the classpath folder.
Since I have now put the MyTest3 class in a new jar on the classpath, it should find MyTest3 from then on, but it keeps throwing ClassNotFoundException.
How can I make this work?
Adding more information to the requirement:
As of now the class names are hard-coded, but they would be read from an external source (let's say some database). What I want is to add a new class in a new jar on the classpath and add the fully qualified name of the class to the database, so that in the next iteration of the loop the program can dynamically load the class.
Change the package name of the new jar to test2.
Add an import for MyTest3:
import com.test2.MyTest3;
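If the real requirement is to pick up classes from JARs that only appear after the JVM has started (as described above, with the class names coming from a database), the application classpath alone will not help, because it is fixed when the JVM launches. One option is to scan a directory for JARs on each loop iteration and load the class through a URLClassLoader. This is only a hedged sketch; the plugins directory name and the use of a no-argument constructor are assumptions:

import java.io.File;
import java.net.URL;
import java.net.URLClassLoader;
import java.util.ArrayList;
import java.util.List;

public class PluginLoader {

    // Loads a class by fully qualified name from all JARs found in the given directory.
    // Assumes the directory exists and the class has a public no-argument constructor.
    public static Object loadFromJars(File pluginsDir, String className) throws Exception {
        List<URL> urls = new ArrayList<>();
        for (File jar : pluginsDir.listFiles((dir, name) -> name.endsWith(".jar"))) {
            urls.add(jar.toURI().toURL());
        }
        // The application classloader is the parent, so classes from the original
        // classpath (e.g. MyTest2) remain visible.
        URLClassLoader loader = new URLClassLoader(urls.toArray(new URL[0]),
                PluginLoader.class.getClassLoader());
        Class<?> clazz = Class.forName(className, true, loader);
        return clazz.getConstructor().newInstance();
    }
}

Inside the loop you would then call something like PluginLoader.loadFromJars(new File("plugins"), "com.test.MyTest3"), so a JAR dropped into the plugins directory between iterations is picked up on the next pass.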
I am currently working on a method that will create files and directories. Below are the use case and the problem explained.
1) When a user specifies a path, e.g. "/parent/sub folder/file.txt", the system should be able to create the directory along with the file.txt. (This one works.)
2) When a user specifies a path, e.g. "/parent/sub-folder/" or "/parent/sub-folder", the system should be able to create all directories. (Does not work.) Instead of creating "/sub-folder/" or "/sub-folder" as a folder, it creates a file named "sub-folder".
Here is the code I have:
Path path = Paths.get(rootDir + "test/hello/");
try {
    Files.createDirectories(path.getParent());
    if (!Files.isDirectory(path)) {
        Files.createFile(path);
    } else {
        Files.createDirectory(path);
    }
} catch (IOException e) {
    System.out.println(e.getMessage());
}
You need to use createDirectories(Path) instead of createDirectory(Path). As explained in the tutorial:
To create a directory several levels deep when one or more of the parent directories might not yet exist, you can use the convenience method, createDirectories(Path, FileAttribute). As with the createDirectory(Path, FileAttribute) method, you can specify an optional set of initial file attributes. The following code snippet uses default attributes:
Files.createDirectories(Paths.get("foo/bar/test"));
The directories are created, as needed, from the top down. In the foo/bar/test example, if the foo directory does not exist, it is created. Next, the bar directory is created, if needed, and, finally, the test directory is created.
It is possible for this method to fail after creating some, but not all, of the parent directories.
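Applied to the code in the question, it might look something like this (a sketch only; it treats a trailing slash in the input as "this is a directory", which is an assumption about the intended behaviour):

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

public class CreatePathExample {

    public static void create(String rootDir, String userInput) throws IOException {
        Path path = Paths.get(rootDir, userInput);
        if (userInput.endsWith("/")) {
            // "/parent/sub-folder/" -> create the whole chain as directories
            Files.createDirectories(path);
        } else {
            // "/parent/sub-folder/file.txt" -> create the parent directories, then the file
            Files.createDirectories(path.getParent());
            if (!Files.exists(path)) {
                Files.createFile(path);
            }
        }
    }
}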
Not sure which File API you are using, but below is the simplest code to create a file along with its folders using the java.io package.
import java.io.File;
import java.io.IOException;

public class FileTest {

    public static void main(String[] args) {
        FileTest fileTest = new FileTest();
        fileTest.createFile("C:" + File.separator + "folder" + File.separator + "file.txt");
    }

    public void createFile(String rootDir) {
        String filePath = rootDir;
        try {
            if (rootDir.contains(File.separator)) {
                filePath = rootDir.substring(0, rootDir.lastIndexOf(File.separator));
            }
            // Create the parent folder(s) first, if they don't exist yet.
            File dir = new File(filePath);
            if (!dir.exists()) {
                System.out.println(dir.mkdirs());
            }
            // Then create the file itself, if it doesn't exist yet.
            File file = new File(rootDir);
            if (!file.exists()) {
                System.out.println(file.createNewFile());
            }
        } catch (IOException e) {
            System.out.println(e.getMessage());
        }
    }
}