I am using Google Reflections 0.9.10 to scan an external jar file (via a URLClassLoader) and the main class of my application, which is called Volts of Doom (via ClassName.getClassLoader()).
I am searching for a custom annotation, @Mod, so I do not want Reflections to search for it in files such as:
could not scan file META-INF/MANIFEST.MF in url file:/C:/Users/admin/.m2/repository/com/google/code/gson/gson/2.3.1/gson-2.3.1.jar
because doing so slows down the loading cycle a lot.
I assume that this is being scanned because it is on my Maven classpath, as it should be, since I use it as a dependency.
In this case they are not Java class files but resources, so they do not impact the speed themselves, as they are not actually scanned (as the message says). To me, though, this means Reflections is still visiting those jars, and if it does find class files there, those will impact the loading speed.
How do I prevent Reflections from scanning these external jar files?
Thus far, I have tried using:
new FilterBuilder().excludePackage("java")
.excludePackage("org.reflections")
.excludePackage("com.google")
.excludePackage("com.sun")
...however this does not solve the issue.
A sample of my logs:
11:58:09.005 [main] DEBUG org.reflections.Reflections - going to scan these urls:
file:/C:/Users/admin/AppData/Roaming/zapbyte/voltsofdoom/resources/mods/test3.jar
file:/C:/Users/admin/.m2/repository/org/lwjgl/lwjgl/3.2.3/lwjgl-3.2.3-natives-windows.jar
file:/C:/Program%20Files/Java/jdk1.8.0_241/jre/lib/ext/zipfs.jar
file:/C:/Users/admin/.m2/repository/ch/qos/logback/logback-classic/1.2.3/logback-classic-1.2.3.jar
file:/C:/Program%20Files/Java/jdk1.8.0_241/jre/lib/ext/jfxrt.jar
file:/C:/Users/admin/OneDrive/Desktop/Java/VoltsOfDoom/voltsofdoomparent/voltsofdoom/target/classes/
file:/C:/Program%20Files/Java/jdk1.8.0_241/jre/lib/ext/nashorn.jar
file:/C:/Users/admin/.m2/repository/org/lwjgl/lwjgl-openal/3.2.3/lwjgl-openal-3.2.3.jar
file:/C:/Users/admin/.m2/repository/org/lwjgl/lwjgl-glfw/3.2.3/lwjgl-glfw-3.2.3-natives-windows.jar
... etc for more files
11:58:09.065 [main] DEBUG org.reflections.Reflections - could not scan file .classpath in url file:/C:/Users/admin/AppData/Roaming/zapbyte/voltsofdoom/resources/mods/test3.jar with scanner SubTypesScanner
11:58:09.065 [main] DEBUG org.reflections.Reflections - could not scan file .classpath in url file:/C:/Users/admin/AppData/Roaming/zapbyte/voltsofdoom/resources/mods/test3.jar with scanner MethodAnnotationsScanner
^^ This is good: they are in the jar I want to scan
11:58:09.080 [main] DEBUG org.reflections.Reflections - could not scan file META-INF/MANIFEST.MF in url file:/C:/Users/admin/.m2/repository/org/lwjgl/lwjgl/3.2.3/lwjgl-3.2.3-natives-windows.jar with scanner TypeAnnotationsScanner
11:58:09.080 [main] DEBUG org.reflections.Reflections - could not scan file META-INF/MANIFEST.MF in url file:/C:/Users/admin/.m2/repository/org/lwjgl/lwjgl/3.2.3/lwjgl-3.2.3-natives-windows.jar with scanner SubTypesScanner
^^ This is not, because they are extraneous.
I have patched the issue with this FilterBuilder:
public static FilterBuilder defaultFilterBuilder() {
  return new FilterBuilder()//
      // Exclude sources
      .excludePackage("java")//
      .excludePackage("lib/").exclude("lib/")//
      .excludePackage("lib.").exclude("lib.")//
      .excludePackage("resources/").exclude("resources/")//
      .excludePackage("resources.").exclude("resources.")//
      .excludePackage("META-INF/").exclude("META-INF/")//
      .excludePackage("META-INF.").exclude("META-INF.")//
      .excludePackage("org.reflections").excludePackage("org/reflections")//
      .excludePackage("com.google").excludePackage("com/google")//
      .excludePackage("org.lwjgl").excludePackage("org/lwjgl")//
      .excludePackage("ch.qos").excludePackage("ch/qos")//
      .excludePackage("edu.umd").excludePackage("edu/umd")//
      .excludePackage("jdk.nashorn").excludePackage("jdk/nashorn")//
      .excludePackage("jdk.internal").excludePackage("jdk/internal")//
      .excludePackage("windows.x64").excludePackage("windows/x64")//
      .excludePackage("net.jcip").excludePackage("net/jcip")//
      .excludePackage("com.sun").excludePackage("com/sun")//
      .excludePackage("sun.text").excludePackage("sun/text")//
      // Exclude resources
      .excludePackage("image/").exclude("image/")//
      .excludePackage("image.").exclude("image.")//
      .excludePackage("font/").exclude("font/")//
      .excludePackage("font.").exclude("font.")//
      ;
}
This excludes every package that was logged as slowing down the process, and has cut the average log size during loading from 2.5 MB to 91 KB.
According to this comment on the org.reflections GitHub repository, you need to exclude both the dot and the slash form (e.g. “org.google” and “org/google”) to catch everything.
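For completeness, here is a rough sketch of how such a filter can be applied, together with the more drastic option of only handing Reflections the URLs that actually need scanning. The class name, jar path and scanner choice below are assumptions for illustration, not the project's real code:
import java.io.File;
import java.net.URL;
import java.net.URLClassLoader;

import org.reflections.Reflections;
import org.reflections.scanners.MethodAnnotationsScanner;
import org.reflections.scanners.SubTypesScanner;
import org.reflections.scanners.TypeAnnotationsScanner;
import org.reflections.util.ClasspathHelper;
import org.reflections.util.ConfigurationBuilder;
import org.reflections.util.FilterBuilder;

public class ModScanner {

    // Placeholder path; the real jar lives somewhere under .../voltsofdoom/resources/mods/
    private static final File MOD_JAR = new File("mods/test3.jar");

    public static Reflections buildReflections(FilterBuilder filter) throws Exception {
        URL modJarUrl = MOD_JAR.toURI().toURL();
        URLClassLoader modLoader =
                new URLClassLoader(new URL[] { modJarUrl }, ModScanner.class.getClassLoader());

        return new Reflections(new ConfigurationBuilder()
                // Hand Reflections only the mod jar and this application's own classes,
                // so the Maven dependency jars are never visited in the first place.
                .setUrls(modJarUrl)
                .addUrls(ClasspathHelper.forClass(ModScanner.class))
                .addClassLoaders(modLoader, ModScanner.class.getClassLoader())
                // e.g. the defaultFilterBuilder() shown above, to prune META-INF,
                // resources and the other noisy inputs
                .filterInputsBy(filter)
                .setScanners(new TypeAnnotationsScanner(), new SubTypesScanner(),
                        new MethodAnnotationsScanner()));
    }
}
The scan itself is then unchanged, e.g. buildReflections(defaultFilterBuilder()).getTypesAnnotatedWith(Mod.class).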
I've got a simple setup with Cucumber feature files and Java step definition files.
feature.feature -> StepDefinition.java -> PageObject.java
Specifically:
Step -> Step definition file:
Given I am logged in as .... -> LoginSteps.java
And I am at the workspace -> WorkspaceSteps.java
And I start a new application -> WorkspaceSteps.java
And I accept -> AcceptPage.java - does not work.
- BUT:
And I accept -> WorkspaceSteps.java - DOES work
As shown above, I'm using three step definition files here, and Cucumber recognizes the step definitions in them. But it doesn't even attempt to run the "And I accept" step when it's defined in the AcceptPage.java file. If I move it to the WorkspaceSteps.java file, it runs fine.
There are no complaints about missing step definitions. In fact, there are no error messages, just orange coloring of the test results.
IntelliJ Run window:
Test results 4s462ms
- Feature: Complete application 4s462ms
- - Scenario: Budget 4s462ms
- - - And I accept 0ms
The .feature file:
Scenario: Budget
# Login/workspace
Given I am logged in as "12048746711"
And I am on the Workspace screen
And I start a new application
# Accept
And I accept
LoginSteps.java:
#Given("^I am logged in as \"([^\"]*)\"$")
public void iLogInWithId(String id) {
login(createPersonInfo(id));
}
WorkspaceSteps.java
#Given("^And I am on the Workspace screen$")
public void iAmOnScreenWorkspace() {
iAmOnWorkspace();
}
#Given("^I start a new application$")
public void startNewApplication() {
workSpacePage.applyForLoan.click();
}
AcceptPage.java
#Given"^I accept$")
public void iAccept() {
System.out.print("");
acceptPage.iAccept.click();
}
The step pointing to the iAccept() method in AcceptPage.java doesn't run. Even when I put a breakpoint on the System.out.print() line and debug, it doesn't stop or even get there.
But if I move the entire iAccept() method into the WorkspaceSteps.java file, everything works.
Any ideas?
Of course, I've invalidated IntelliJ's caches and restarted. I've even tried creating a brand new Steps file containing a new step with a new name. It's the same: the Cucumber test refuses to go anywhere except the WorkspaceSteps.java file. This is starting to look completely idiotic, hilarious and absurd.
It turns out the solution was really silly. For some reason, IntelliJ had removed the top-level package from the Glue setting in the run configuration and instead added specific packages. So when a step definition file outside those packages was used, it was simply ignored without further explanation. Deleting the specific packages and entering the top-level package solved the problem.
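The same thing can happen when running through a JUnit runner rather than IntelliJ's Cucumber run configuration: if the glue option lists specific step packages, definitions outside them are silently ignored. A minimal sketch, with made-up package and path names, pointing glue at the common top-level package:
import org.junit.runner.RunWith;

import cucumber.api.CucumberOptions;
import cucumber.api.junit.Cucumber;

// Package and feature path are hypothetical; newer Cucumber versions use
// io.cucumber.junit.Cucumber / io.cucumber.junit.CucumberOptions instead.
@RunWith(Cucumber.class)
@CucumberOptions(
        features = "src/test/resources/features",
        glue = "com.example.steps" // the top-level package containing ALL step definition classes
)
public class RunCucumberTest {
}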
I am trying to run a Jason implementation that uses some internal actions. The interpreter is showing that it was not possible to find the "java" code of the internal actions, as shown:
Server running on http://191.36.8.42:3272
[aslparser] [peleus.asl:29] warning: The internal action class for 'org.soton.peleus.act.plan(Goals)' was not loaded! Error:
java.lang.ClassNotFoundException: org.soton.peleus.act.plan
[aslparser] [peleus.asl:42] warning: The internal action class for 'org.soton.peleus.act.isTrue(H)' was not loaded! Error:
java.lang.ClassNotFoundException: org.soton.peleus.act.isTrue
[peleus] Could not finish intention: intention 1: +des([on(b3,table),on(b2,b3),on(b1,b2)])[source(self)] <- ... org.soton.peleus.act.plan(Goals); !checkGoals(Goals); .print("Goals ",Goals," were satisfied") /
{Goals=[on(b3,table),on(b2,b3),on(b1,b2)]}Trigger: +des([on(b3,table),on(b2,b3),on(b1,b2)])[noenv,code(org.soton.peleus.act.plan([on(b3,table),on(b2,b3),on(b1,b2)])),code_line(29),code_src("peleus.asl"),error(action_failed),error_msg("no environment configured!"),source(self)]
[peleus] Adding belief clear(table)
The mas2j file is as follows:
MAS peleus {
    infrastructure: Centralised
    agents:
        peleus;
}
Part of the agent code (written by Felipe Meneguzzi) is shown below:
//The next line is line 28
+des(Goals) : true
<- org.soton.peleus.act.plan(Goals);
!checkGoals(Goals);
.print("Goals ",Goals," were satisfied").
+!checkGoals([]) : true <- true.
//The next line is line 40
+!checkGoals([H|T]) : true
<- .print("Checking ", H);
org.soton.peleus.act.isTrue(H);
!checkGoals(T).
I guess it is about the folder structure; how do I set up Jason to search for Java files in specific locations?
The folder structure is like this:
Peleus\src\org\soton\peleus for java files
Peleus\examples for mas2j and asl tested project
It all depends on how you are executing the application.
If you are running it with java directly, the CLASSPATH should be defined to include the missing classes.
If you are using the jason script (which uses Ant), the .mas2j file should include the class path as well.
More on that in the FAQ. Notice that CLASSPATH is where .class files are found, not .java source files. The error is about a missing class, not missing source code.
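If the compiled Peleus classes sit outside the project directory, the class path entry in the .mas2j file could look roughly like this (the bin path below is a guess at where the .class files end up; adjust it to the actual build output):
MAS peleus {
    infrastructure: Centralised
    agents:
        peleus;
    // assumed location of the compiled org/soton/peleus/act/*.class files
    classpath: "../bin/classes";
}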
rawGSODData = LOAD '/usr/local/Cellar/pig/0.12.0/gsod_2016/999999-93816-2016.op.gz' USING org.apache.pig.piggybank.storage.FixedWidthLoader('
1-6,
8-12,
15-18,
19-22,
25-30,
32-33,
36-41,
43-44,
47-52,
54-55,
58-63,
65-66,
69-73,
75-76,
79-83,
85-86,
89-93,
96-100,
103-108,
109-109,
111-116,
117-117,
119-123,
124-124,
126-130,
133-138',
'SKIP_HEADER');
When I try to run this code I get an error saying
ERROR 1070: Could not resolve org.apache.pig.piggybank.storage.FixedWidthLoader using imports: [, java.lang., org.apache.pig.builtin., org.apache.pig.impl.builtin.]
I have the FixedWidthLoader.java file in the directory
/usr/local/Cellar/pig/0.12.0/build/classes/org/apache/pig/piggybank/storage
Please help me with this error.
Where is piggybank.jar located? Ensure you have registered piggybank.jar in your Pig script. If not, add this to the top of your Pig script, and make sure the path to piggybank.jar is correct. The statement below registers the jar file, i.e. piggybank.jar located in /usr/local/:
REGISTER '/usr/local/piggybank.jar';
Let's say there is a jar main.jar which depends on two other jars, dep1.jar and dep2.jar. Both dependencies are on the classpath in the MANIFEST.MF of main.jar. Each dependency jar has a directory foo inside, with a file bar.txt within:
dep1.jar
|
\--foo/
|
\--bar.txt
dep2.jar
|
\--foo/
|
\--bar.txt
Here is a main class of main.jar:
import org.springframework.context.ApplicationContext;
import org.springframework.context.support.StaticApplicationContext;
import org.springframework.core.io.Resource;

public class App
{
    public static void main( String[] args ) {
        ApplicationContext ctx = new StaticApplicationContext();
        Resource barResource = ctx.getResource("classpath:foo/bar.txt");
    }
}
Which of two bar.txt files will be loaded? Is there a way to specify in a resource URL a jar the file should be loaded from?
Which one you get is undefined. However, you can use
Resource[] barResource = ctx.getResources("classpath*:foo/bar.txt");
to get them both (all). The URL in the Resource will tell you which jar they are in (though I don't recommend you start programming based on that information).
Flip a quarter; that's the one you'll get. Most likely, it will be the one highest alphabetically, so in your case the one inside dep1.jar. Both files have the identical classpath location (foo/bar.txt), and while it might look as though this should throw a compile-time exception, it will not, because both jars are simply packaged up and a .txt file is never compiled or inspected.
You wouldn't expect a compile-time exception, as resource loading is a run-time process.
You can't specify which jar the resource will come from in code, and this is a common issue, particularly when someone bundles something like log4j.properties into a jar file.
What you can do is specify the order of jars in your classpath, and it will pick up the resource from the first one in the list. This is tricky in itself, as when you are using something like Ivy or Maven for classpath dependencies you are not in control of the ordering of the classpath (in the Eclipse plugins at any rate).
The only reliable solution is to call the resources something different, or put them in separate packages.
The specification says that the first class/resource on the class path is taken (AFAIK).
However I would try:
Dep1Class.class.getResource("/foo/bar.txt");
Dep2Class.class.getResource("/foo/bar.txt");
Because of the way Class.getResource works, it may not take the resource from another jar, as opposed to the system class loader.
With a bit of luck, you will not need to play with ClassLoaders and have a different class loader load dep2.jar.
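If you really do need to pin the resource to one particular jar, a jar: URL written by hand is one way to do it; a minimal sketch, assuming dep2.jar sits at a known absolute path (class name and path are made up for illustration):
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.URL;
import java.nio.charset.StandardCharsets;

public class JarResource {
    public static void main(String[] args) throws Exception {
        // "jar:<file-url>!/<entry>" addresses an entry inside one specific jar.
        URL url = new URL("jar:file:/path/to/dep2.jar!/foo/bar.txt");
        try (BufferedReader reader = new BufferedReader(
                new InputStreamReader(url.openStream(), StandardCharsets.UTF_8))) {
            String line;
            while ((line = reader.readLine()) != null) {
                System.out.println(line);
            }
        }
    }
}
This bypasses the classpath entirely, so it only helps when the jar's location on disk is known.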
As @Sotirios said, you can get all resources with the same name using ctx.getResources(...), with code such as:
import java.io.InputStream;
import java.util.Scanner;

import org.springframework.context.ApplicationContext;
import org.springframework.context.support.StaticApplicationContext;
import org.springframework.core.io.Resource;

ApplicationContext ctx = new StaticApplicationContext();
Resource[] resources = ctx.getResources("classpath*:/foo/bar.txt");
for (Resource resource : resources) {
    System.out.println("resource file: " + resource.getURL());
    // getInputStream() works for entries inside jars, unlike resource.getFile()
    InputStream is = resource.getInputStream();
    Scanner scanner = new Scanner(is);
    while (scanner.hasNextLine()) {
        System.out.println(scanner.nextLine());
    }
    scanner.close();
}
I have an issue with a project I am trying to deliver using the One-JAR packager to simplify the deployment process.
Without the packaging everything works fine and the logging configuration is loaded correctly, but with the packaging only part of the configuration is applied.
So, here is the logging.properties I use:
handlers= java.util.logging.ConsoleHandler, java.util.logging.FileHandler
.level= INFO
java.util.logging.FileHandler.pattern = C:\\MyPath\\logging.csv
java.util.logging.FileHandler.limit = 50000
java.util.logging.FileHandler.count = 1
java.util.logging.FileHandler.formatter = my.package.logging.Formatter
java.util.logging.ConsoleHandler.level = INFO
java.util.logging.ConsoleHandler.formatter = my.package.logging.Formatter
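The custom formatter itself is not shown in the question; for reference, a java.util.logging formatter along these lines would produce one CSV row per record (a sketch only, with a hypothetical class name; the real my.package.logging.Formatter may well differ):
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.logging.LogRecord;

// Hypothetical stand-in for the question's my.package.logging.Formatter.
public class CsvFormatter extends java.util.logging.Formatter {

    private final SimpleDateFormat timestamp = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");

    @Override
    public String format(LogRecord record) {
        // one CSV row per log record: time;level;logger;message
        return timestamp.format(new Date(record.getMillis())) + ";"
                + record.getLevel() + ";"
                + record.getLoggerName() + ";"
                + formatMessage(record) + "\n";
    }
}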
And in my main class, here is how I load it:
import java.util.logging.LogManager;

public class MainClass {
    public static void main(final String[] args) {
        try {
            LogManager.getLogManager().readConfiguration(
                    new MainClass().getClass().getResourceAsStream("logging.properties"));
            // main process goes here.
        } catch (Exception e) {
            // Exception handling
        }
    }
}
The log level as well as the FileHandler pattern are clearly applied, because the logging ends up in the correct file, but as raw XML output, which makes me think that the formatter is not loaded, as it normally produces CSV output.
Could it be related to a classpath issue? Does anyone know how to handle this?
It could be that in your jars you have more than one logging.properties file, with similar but slightly different settings. When you combine them with One-JAR the order changes and one of them gets hidden. Do a "jar -tf *.jar | grep logging.properties" and see what you find.
If that doesn't work, can you try unjarring the One-JAR result into a directory structure, and then running with the directory on the classpath instead of the jar? That will let you see whether it is something to do with the jar, and also lets you inspect the logging.properties you actually have in the One-JAR output and check that it matches what you expect.
Use LogManager.getLogManager().readConfiguration(LogManager.class.getResourceAsStream("/logging.debug.properties"));
(note the extra slash).
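Applied to the question's main class, that would look roughly like this (keeping the question's logging.properties name rather than the logging.debug.properties from the line above):
import java.util.logging.LogManager;

public class MainClass {
    public static void main(final String[] args) {
        try {
            // The leading slash resolves the file from the classpath root
            // rather than relative to the class's package.
            LogManager.getLogManager().readConfiguration(
                    MainClass.class.getResourceAsStream("/logging.properties"));
            // main process goes here.
        } catch (Exception e) {
            // Exception handling
        }
    }
}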