In Eclipse 3.5, say I have a package structure like this:
tom.package1
tom.package1.packageA
tom.package1.packageB
If I right-click on the tom.package1 package and go to Refactor -> Rename, an option "Rename subpackages" appears as a checkbox. If I select it and then rename tom.package1 to tom.red, my package structure ends up like this:
tom.red
tom.red.packageA
tom.red.packageB
Yet I hear that Java's packages are not hierarchical. The Java Tutorials back that up (see the section on Apparent Hierarchies of Packages). It certainly seems like Eclipse is treating packages as hierarchical in this case.
In a previous question I was curious why access specifiers couldn't allow or restrict access to "sub-packages", because I knew I had seen "sub-packages" referenced somewhere before.
So are Eclipse's refactoring tools intentionally misleading impressionable young minds by furthering the "sub-package" myth? Or am I misinterpreting something here?
Eclipse can't possibly violate the JLS in this case, because it has nothing to do with compiling or running Java source or bytecode.
The refactoring tools behave as they do because that behaviour is useful to developers. The behaviour is useful to developers because, for many intents and purposes, we do treat packages as hierarchical (a.b.c has some kind of relationship with a.b, even if that relationship is not consistent from project to project). That doesn't mean Java treats them as hierarchical intrinsically.
One example where people treat packages as very hierarchical is in configuring a logging framework such as log4j. Again, it's not intrinsic to log4j, but that's how people use it in practice.
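To illustrate, here is a small sketch using the log4j 1.x API (the logger names are made up); the child logger picks up its level from the "parent" logger purely because of the dotted name:

import org.apache.log4j.BasicConfigurator;
import org.apache.log4j.Level;
import org.apache.log4j.Logger;

public class LoggerHierarchyDemo {
    public static void main(String[] args) {
        BasicConfigurator.configure(); // simple console appender

        // Set a level on "com.example" only...
        Logger.getLogger("com.example").setLevel(Level.WARN);

        // ...and "com.example.service" inherits it, because log4j treats
        // dotted logger names as a hierarchy even though Java itself does not.
        Logger child = Logger.getLogger("com.example.service");
        System.out.println(child.getEffectiveLevel()); // prints WARN
    }
}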
Java packages are not hierarchical in the sense that importing everything from package A does not import everything from package A.B.
However, Java packages do correspond directly to the directory structure on the file system, and directories are hierarchical. So Eclipse is doing the correct thing - it is renaming the directory, which automatically changes the name of the parent directory of the renamed directory's children (to state the very obvious).
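A quick sketch of what "not hierarchical" means for imports, using made-up packages a and a.b (each class in its own source file):

// File a/Foo.java
package a;
public class Foo { }

// File a/b/Baz.java
package a.b;
public class Baz { }

// File client/Client.java
package client;

import a.*; // brings in a.Foo, but NOT a.b.Baz

public class Client {
    Foo foo = new Foo();          // OK, covered by "import a.*"
    a.b.Baz baz = new a.b.Baz();  // a.b.Baz must be named (or imported) explicitly
}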
Even Java itself has the concept of a subpackage:
http://java.sun.com/j2se/1.5.0/docs/tooldocs/windows/java.html
java -ea[:<package name>"..." | :<class name> ]
Enable assertions. Assertions are disabled by default.
With no arguments, enableassertions or -ea enables assertions. With one argument ending in "...", the switch enables assertions in the specified package and any subpackages. If the argument is simply "...", the switch enables assertions in the unnamed package in the current working directory. With one argument not ending in "...", the switch enables assertions in the specified class.
If a single command line contains multiple instances of these switches, they are processed in order before loading any classes. So, for example, to run a program with assertions enabled only in package com.wombat.fruitbat (and any subpackages), the following command could be used:
java -ea:com.wombat.fruitbat... <Main Class>
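To make that concrete, here is a small made-up class in that package; the assertion only fires when the JVM is started with the -ea switch shown above:

package com.wombat.fruitbat;

public class Main {
    public static void main(String[] args) {
        int count = -1;
        // Run with "java -ea:com.wombat.fruitbat... com.wombat.fruitbat.Main"
        // and this assertion throws an AssertionError; without -ea it is
        // skipped entirely, because assertions are disabled by default.
        assert count >= 0 : "count must be non-negative, was " + count;
        System.out.println("count = " + count);
    }
}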
Java's packages are not hierarchical, but Eclipse stores packages on your system's file structure.
tom.package1.packageA is represented on a Windows file system as tom/package1/packageA.
When you ask Eclipse to refactor a package name, you're asking Eclipse to change the name of the file system directory structure.
You can have packages in Eclipse like:
tom.package1.packageA
tom.package2.packageB
tom.package3.packageC
You'll just have different 2nd level file system directories.
It seems that in Java 9 it is not allowed to have so-called split packages, i.e. the same package being defined in two different modules. This causes a problem with my migration process: the (Gradle) project contains a jar file called bootstrap.jar, with a structure like this:
bootstrap.jar
- com
- example
- Foo.class
- Bar.class
- Baz.class
The src directory contains a class com.example.Bar that depends on Foo, as well as a module definition for com.example. The bootstrap.jar file does not contain a module-info, as it was compiled before Java 9, so it is treated as an automatic module called bootstrap. The problem is that the package com.example is now defined in both modules, com.example and bootstrap.
The reason this bootstrap.jar file exists in the first place is as follows:
The src/com/example folder actually contains Bar.java, Baz.java and another file, Foo.dyvil. The latter is written in a JVM-based programming language. So the dependency chain looks like this:
Bar.java -> Foo.dyvil -> Baz.java
During the build process, Foo.dyvil gets compiled to Foo.class, which is placed in a new jar file that later replaces bootstrap.jar. The reason for this setup is that neither the Java nor the Dyvil compiler can process the other language's source files, so each needs access to the compiled classes from the previous build. That is why bootstrap.jar exists.
Now for the actual problem: Since split packages are disallowed in Java 9, is there any way to achieve "split builds" using "bootstrap" jar files as described and used in my project? Or is there any other approach?
The long-term solution is to rearrange the code so that each package lives in a single module, and then modularise it.
As a temporary solution, you can make use of the option:
--patch-module <module>=<file>(<pathsep><file>)*
as in your case
--patch-module com.example=bootstrap.jar
Do keep in mind though, the --patch-module option is intended only for testing and debugging. Its use in production settings is strongly discouraged.
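A rough sketch of what a full invocation could look like; the module path, main class and file locations below are assumptions based on the project description, not something prescribed by the option itself:

# Hypothetical invocation: patch the com.example module with the classes
# from bootstrap.jar so the split package is tolerated at run time.
java --patch-module com.example=bootstrap.jar \
    --module-path build/modules \
    -m com.example/com.example.Bar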
Is IntelliJ compiling all the time, since it tells me with red squiggly lines when there is an error (in addition to the autocomplete features)? Or is it doing some sort of pseudo-compiling?
If it is doing legit compiling, where does it put these compiled classes? I'd like to point my JRebel to that directory instead of the individual module target folders.
Meo is right. From what I learned when I developed plugins for custom languages, IntelliJ does not compile anything until you explicitly make your project. While you are typing, its lexer/parser detects any invalid token or code construct. In the meantime, it maintains an index of every class and method in your project and its dependencies, along with their signatures, etc.
After you stop typing, you'll see a little colored eye in the top part of the right gutter. It indicates that the IDE is running "annotators" and "code inspections". They are able to tell whether or not classes, methods and variables are valid based on the current index and the current state of your file (imports, declarations, etc.). The same goes for unused variables, invalid parameters in method calls, etc.
Pros:
annotators work directly on what they call a PSI tree, which is basically an enhanced AST representing your current file
it may be faster than compiling every time (it uses an index and does not need to recompile every dependent class)
annotators can detect things javac doesn't care about, such as potential bugs (e.g. using = instead of == in a while condition; see the sketch after this answer)
Cons:
that's a lot of work; basically they need to rewrite the logic to find every error that javac can produce (which is why you can find many issues on their bugtracker labelled "good code is red" or "bad code is green", meaning there is a difference between what they detect and what the compiler would output)
TL;DR: it does not produce any .class until you make your project, everything is done "by hand"
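As an illustration of the "= instead of ==" case mentioned above, here is a small made-up snippet that javac accepts without complaint but an IDE inspection will flag:

public class InspectionDemo {
    public static void main(String[] args) {
        boolean done = false;
        int iterations = 0;
        // javac compiles this because "done = false" is a valid boolean
        // expression, but an inspection warns that an assignment is used as
        // a condition (the author almost certainly meant "done == false").
        while (done = false) {
            iterations++;
        }
        System.out.println(iterations);
    }
}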
For every module, the compiler output path can be found from Paths tab in Module Settings.
JRebel plugin generates rebel.xml automatically and derives the directory path from Module Settings, so you do not need to point to the locations manually - just generate rebel.xml using the IDE plugin: right click on module in the project view -> JRebel -> generate rebel.xml
Just to add, after compilation the classes are stored in the target directory if it's a Maven project - otherwise, the directory is specified in IntelliJ's Project Structure, under "Project compiler output".
IntelliJ understands the code, it does not need to compile the code to know what is wrong.
I found my .class files by going to the out/production/main folders from the home directory of the project.
Today I started learning Java.
I saw that a package declaration automatically gets included in the .java file.
I was wondering if it always needs to be included?
Consider specifying a common package for all the types within the same project.
In Java it is common to start a project with a specific package. A package creates a namespace that disambiguates the types it contains, so that they play nicely with other projects that may or may not be on the same classpath. Normally, the package name is derived from a domain or URL associated with the project.
Think of Java packages like C++ namespaces.
A huge project/product written in Java can depend on lots and lots of projects, each described in a different package.
Organizations like Apache have lots of projects, organized under a common package pattern: org.apache.<<name_of_the_project>>.
Consider starting your project with a package named com.user3552670, or something based on your personal site, so people who consume your project can identify its creator.
Yes and no.
It's used to specify the package of the class, read more here.
You could create a class without a package, but your code will look bad.
Packages exist to avoid conflicts, for example between your code and the default Java packages.
If packages didn't exist, you couldn't create a class named ArrayList, because one already exists in Java.
Some IDEs enforce that if your .java file is in the com/a/b/c folder, its package should be com.a.b.c (if I remember correctly, IntelliJ IDEA does that).
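A minimal sketch of that idea, using a made-up package name; this class compiles alongside java.util.ArrayList because the fully qualified names differ:

package com.user3552670.collections; // hypothetical package name

// This ArrayList does not clash with java.util.ArrayList, because its
// fully qualified name is com.user3552670.collections.ArrayList.
public class ArrayList<E> {

    public static void main(String[] args) {
        // When both types are needed in one file, refer to at least one of
        // them by its fully qualified name.
        java.util.ArrayList<String> standard = new java.util.ArrayList<>();
        standard.add("hello");
        System.out.println(standard);
    }
}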
Yes and no.
It must be there, but the IDE takes care of it (I don't use Netbeans, but I'd bet that it can do it, too). When moving files between packages, it has to be updated, but again, the IDE does it all.
I want to decompile a Java program and recompile the derived (obfuscated) source. I unpacked the .jar archive and got a directory structure like this:
com/
com/foo/A/
com/foo/A/A.class
com/foo/A/B.Class
com/foo/B/A.class
...
com/foo/A.class
com/foo/B.class
org/foo/Bar.class
...
The problem is that there are name collisions between packages and classes, which makes it impossible to recompile the decompiled class files.
A decompiled class will look like this:
package org.foo;
import com.foo.A; // <-- name collision error
class Bar {
...
}
Is there any way to resolve those naming issues, without renaming the class files?
EDIT:
This is not a decompiler problem; the question is how it is possible to have a working .jar file with classes that violate these naming conventions.
EDIT2:
Okay, I guess such naming is possible at the bytecode level, so with a smarter decompiler (one that automatically renames the classes and fixes their references) this problem could be solved.
Do you really need to unpack the entire jar and recompile everything? Instead of recompiling the entire decompiled source by itself, use the original jar as the classpath, and extract and recompile only those classes that you need to modify. Then, when you need to package up your recompiled code, just copy the original jar and use jar -uf to replace the modified class files in place:
jar -uf ./lib/copy_of_original_jar_file.jar -C ./bin com/foo/A.class com/foo/B.class [...]
...and ./lib/copy_of_original_jar_file.jar becomes your new library.
One thing is for sure, and that is that the original jar must work properly with a Java classloader in order for the program to run. It should work just as well for compiling your one-off .class files.
You should experience far fewer naming collision issues by using the original jar, because you keep the same classpath scanning order that the running application would use. Not only that, but Java decompilers aren't perfect. By eliminating the majority of the decompiled code from recompilation, you avoid the majority of issues that decompilers have with things like exception handler overlaps, special characters in obfuscated symbols, variable scoping issues, etc.
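A rough sketch of the compile step under those assumptions (the source and output paths here are made up):

# Compile only the decompiled classes you actually changed, resolving
# everything else against the original jar instead of decompiled source.
javac -cp ./lib/copy_of_original_jar_file.jar \
    -d ./bin \
    ./decompiled/com/foo/A.java ./decompiled/com/foo/B.java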
Java's import mechanism provides a shorthand for naming things, but you obviously cannot use it when there are collisions. You can always use the fully qualified name in your code, e.g.
package org.foo;
class Bar {
private com.foo.Bar aDifferentBar;
...
}
EDIT:
I suppose there could be class files that comply with the JVM spec but which cannot be produced by a Java program that complies with the JLS spec. If so then you'll definitely need a smarter decompiler.
You cannot import packages in Java, so why should this be a name collision? Which error message do you get from the compiler?
If there were a name collision in the obfuscated code, the code would not run. So the decompiled code should be collision-free.
I am looking for a replacement for javadeps, which I used to use to generate sections of a Makefile to specify which classes depended on which source files.
Unfortunately javadeps itself has not been updated in a while, and cannot parse generic types or static imports.
The closest thing I've found so far is Dependency Finder. It almost does what I need but does not match non-public classes to their source files (as the source filename does not match the class name.) My current project has an interface whose only client is an inner class of a package-private class, so this is a significant problem.
Alternatively if you are not aware of a tool that does this, how do you do incremental compilation in large Java projects using command-line tools? Do you compile a whole package at a time instead?
Notes:
javadeps is not to be confused with jdepend, which is for a very different purpose.
This question is a rewrite of "Tool to infer dependencies for a java project" which seemed to be misunderstood by 2 out of 3 responders.
I use the <depend> task in ant, which is ok, but not 100% trustworthy. Supposedly JavaMake can do this dependency analysis, but it seems to be rarely updated and the download page is only sometimes available.