Changing dalvik/libcore causes rebuilding the whole framework - java

I'm adding some interception routines to Dalvik libcore methods (e.g. the file open method in libcore/luni/src/main/java/org/apache/harmony/luni/platform/OSFileSystem.java), which I think only changes the basic shared libraries. But to my surprise, every time I run make after a modification, it rebuilds nearly the whole framework, including the Calculator application, the W3C DOM parser, etc. It really takes time to build the framework after a small change. I'm wondering if it is possible to reduce the number of components that get rebuilt after modifying the Dalvik libcore? Thanks.

It actually isn't too surprising that changing core.jar causes many things to be rebuilt. core.jar contains the core Java classes, like Object, String, etc., so every other jar/apk that gets built depends on core.jar.
From a makefile perspective, make has no clue what you changed in core.jar, or whether it is safe not to rebuild all the other things that depend on it. It simply sees that the last-modified time on core.jar is newer than on all of the jars/apks that depend on it, so it rebuilds them all.
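As a toy illustration (hypothetical names and recipe, nothing like the real AOSP rules), this is all make sees:

    # Hypothetical sketch: make only compares timestamps.
    # If core.jar's mtime is newer than Calculator.apk's, the recipe runs,
    # no matter how small the change inside core.jar was.
    Calculator.apk: core.jar
        build-apk Calculator   # 'build-apk' is a made-up command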
The trick, however, is to tell make specifically what you want to build, instead of telling it to build everything.
Assuming that you have already done a full build previously, you can simply do
make core snod
The core target will specifically build a new core.jar with your changes, without rebuilding anything that depends on core.jar.
And the snod target (short for systemimage-nodeps) will cause it to repackage everything from out/target/product/<product>/system into a new system.img. This is a "special" target that is declared in build/core/Makefile.
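Assuming a standard AOSP checkout and that you have already run a full build for the same target, the incremental loop looks roughly like this (the product name is a placeholder):

    # Sketch of the incremental workflow; <product> is whatever you lunch'ed before.
    cd <aosp-root>
    source build/envsetup.sh   # provides the lunch command
    lunch <product>-eng        # same target as the full build
    make core snod             # rebuild core.jar, then repackage system.img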
In general, the target for a particular jar/apk is simply the name of that jar/apk, without the extension. Alternatively, you can look at the Android.mk file for that module, and find the module name, which is typically something like LOCAL_PACKAGE_NAME or LOCAL_MODULE, depending on the type of module.
For core.jar (in Gingerbread at least), the module name is in libcore/JavaLibrary.mk (which is actually included by libcore/Android.mk). This file contains definitions for a number of different modules, but the first one, with LOCAL_MODULE := core, is the one responsible for building core.jar. The rest seem to be mostly test-related modules.
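For reference, a Java library module definition in an Android.mk looks roughly like this (a simplified sketch, not the actual contents of JavaLibrary.mk):

    # Simplified sketch of a Java library module (not the real JavaLibrary.mk).
    LOCAL_PATH := $(call my-dir)
    include $(CLEAR_VARS)
    LOCAL_SRC_FILES := $(call all-java-files-under, luni/src/main/java)
    LOCAL_MODULE := core        # this module name is what you pass to make
    include $(BUILD_JAVA_LIBRARY)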

Related

Altering open source code

Assume that I am using an open source jar file in my project which is 11 MB in size, but I am not utilizing this jar fully (and never will). I know that I just need a couple of classes from this jar to do my job. In such a case, can I just delete the other classes in the jar file and use it?
I'll make sure that whatever classes remain in the jar are complete by themselves, meaning these classes do not depend on any other classes in the jar. So can I just remove the unwanted classes so that the jar file gets smaller? If I do this, is it legal? Am I allowed to do such a thing and use the result in my project?
SmartGWT appears to use the LGPL licence. This means you can link to it even in a proprietary closed-source application without the need to release your source code if you distribute it.
However, this freedom may not apply if you modify the library.
A program that contains no derivative of any portion of the Library, but is designed to work with the Library by being compiled or linked with it, is called a "work that uses the Library". Such a work, in isolation, is not a derivative work of the Library, and therefore falls outside the scope of this License.
It could be argued that chopping out bits of the library creates a derivative work even though you've not altered the source code itself, but IANAL.
Of course, if you are not distributing your project (for example, it's an internal business application for your company) then I don't believe the requirement to release your source code applies even with a derivative work.

Can I force GWT compilation without an entry point? (To validate compatibility with GWT)

This question is related to, but not a duplicate of, this question.
My issue is slightly different; I have a "utility module", shared between the client and server code, and it contains no GWT-specific code.
I understand that normally all the sources are pulled into one specific project, where everything is compiled together. But there is one issue with that: I only get to know whether my utility project is "GWT compatible" when I compile the main project. This is way too late; I haven't even gotten around to starting on the main project, but I want to know, before I commit to my SCM, that my utility project is "GWT compatible".
In other words, I want to validate the utility project for GWT compatibility, independently of its use in a separate project (module).
There's a large part of the JRE that is not covered by GWT, and in a utility module it is particularly likely that non-GWT-compatible classes or methods get used. That is what I want to validate against.
EDIT: I could add a "dummy entry point", I suppose, but that makes the project depend on GWT, which I don't want, since it is "general" code that will also be used by people who don't use GWT. If it matters, I use Maven as the build system.
EDIT2: No matter what I do, I only get real compilation/validation with an entry point (it does NOT need to reference any of the classes). Neither <force>true</force> nor <failOnError>true</failOnError> will do. Is there a way I can define that entry point for the shared project such that only the gwt-maven-plugin sees it, but not javac (so as not to add an unneeded dependency to the Java code)?
The compiler actually always visits all code on the source path (note: not quite the same as the classpath), starting at the requested module with any <source> tags and then checking each <inherits> along the way. If it finds something that isn't compatible or isn't compilable, it will mark it as broken and move on - as long as nothing actually depends on it (i.e. an EntryPoint, or something an EntryPoint depends on), you'll just see this message:
Validating newly compiled units
Ignored 1 unit with compilation errors in first pass.
Compile with -strict or with -logLevel set to TRACE or DEBUG to see all errors.
If you include that -strict flag, the compile will actually fail when it hits something that can't be included correctly.
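For example, invoking the compiler directly (jar locations and module name are placeholders):

    # Hypothetical invocation; substitute your real classpath and module name.
    java -cp src:gwt-user.jar:gwt-dev.jar \
        com.google.gwt.dev.Compiler -strict com.example.util.UtilModule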
This work is done in the very early stages of the compile, while constructing the TypeOracle, which is used for Generators, long before any JS is built. That type oracle is passed to generators, which need to be able to ask questions like 'what interfaces on the source path have a JSO implementation' and 'what are all possible subclasses of List'. Generators can do a huge number of things, including emitting even more types, which then need to be parsed and compiled, and the process continues until a full JProgram of all possible types is created, based on the current set of modules.
That JProgram then gets compiled down based on what can be reached from the roots - the entry point, as well as a few other details such as how to emulate Java features like casts, arrays, longs, exceptions, etc.
If -strict was not specified, and the compiler ends up needing to reach something that is unavailable due to earlier compilation problems, that is when you find out. Using -strict to stop earlier helps ensure that you catch those issues sooner.
One more fun fact: by default, with com.google.gwt.user.User in your module (or any other <inherits> that depends on it), you already have an entry point, or several! These do some quick checks that your page is working correctly, such as using a strict doctype, or the browser actually matching the expected user.agent setting. This means it is usually possible to compile a module even without your own entry point (except with gwt-maven-plugin:compile, which will not consider a module for compilation based on those built-in entry points alone).
EDIT: Okay, even one more: From http://www.gwtproject.org/doc/latest/DevGuideCompilingAndDebugging.html, combined with -strict, it looks like you can force the validation to run without actually compiling to JS:
-validateOnly Validate all source code, but do not compile
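So a validation-only run for the shared module might look like this (again, jar locations and module name are placeholders):

    # Validate GWT compatibility without emitting any JS (hypothetical module name).
    java -cp src:gwt-user.jar:gwt-dev.jar \
        com.google.gwt.dev.Compiler -validateOnly -strict com.example.util.UtilModule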
I don't think it's possible because the GWT compiler does not compile any unused code.
This means that your shared utility "module" may have code in it that is not compatible with GWT, but it will not cause any problems as long as GWT code never calls such incompatible classes or methods. Without an entry point, the GWT compiler won't know which code is used and which is not - it will assume that all of it is unused.

Best ways to manage generated artifacts for web service/xml bindings in a java webapp/client?

I'm working on a couple of web services that use JAXB bindings for the messages (in JAX-WS or Spring-WS). When using these bindings there's always some code that is automatically generated from the WSDL to bind the message objects. I'm struggling to figure out the best way to make this work so that it's easy to work with, hard to break, and integrates nicely with IDEs (we mostly use Eclipse).
I think there are a couple of ways to go about this. The three main options I see right now are:
Generate code, keep the source artifacts and check them into the repository. Pros: integrates easily with IDEs (source highlighting etc.), works within the build system. Cons: generated code changes each time you regenerate it, possibly creating noisy commits. It's also redundant, since the WSDL file is usually checked in already.
Generate code as part of the build process. Don't keep source artifacts, or only keep them in output directories. Pros: fixes all the cons of the previous one. Cons: harder to integrate with the IDE, though maybe this build step can be run automatically? I currently use this on one of my projects, but the first time I check out the project it appears broken, which is a minor nuisance.
Keep generated bindings in separate libraries (jars), included with Maven or as manually updated jars, depending on your build process. I got the idea from a thread on java.net. This seems more stable and uses explicit versioning, but seems a bit heavyweight.
Which one of these options would you implement and how? We're currently using maven and eclipse, so any ideas in that regard would be great. I think this problem generalises to most other build systems and IDE combinations though, even other languages perhaps.
I went for option 3. If you already host your own repository (and optionally CI), it's not that heavyweight. All it takes is a simple POM. It's even possible to include some utility/wrapper/builder classes (that often make life easier with generated classes) and use them in several projects.
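A minimal POM for such a bindings artifact could look like this (all coordinates are made-up examples):

    <!-- Hypothetical coordinates for a plain jar module holding the generated
         bindings plus any hand-written wrapper/builder classes. -->
    <project xmlns="http://maven.apache.org/POM/4.0.0">
      <modelVersion>4.0.0</modelVersion>
      <groupId>com.example.ws</groupId>
      <artifactId>orders-ws-bindings</artifactId>
      <version>1.2.0</version>
      <packaging>jar</packaging>
    </project>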
I'd go for option 2 and generate code in the "standard" ${project.build.directory}/generated-sources/<toolname> location as part of the build process. Using generated sources is well supported by m2eclipse (use Maven > Update Project Configuration once sources have been generated) and, if I remember correctly, by the maven-eclipse-plugin as well (i.e. the folder will be added to the Java Build Path). Actually, I think NetBeans also handles this fine. Not sure about IDEA.
For the generation itself, you may need the maven-jaxb2-plugin if I understood correctly.
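A typical configuration would be along these lines (the plugin version and schema location are assumptions):

    <!-- Sketch: generate JAXB classes into target/generated-sources during the build. -->
    <plugin>
      <groupId>org.jvnet.jaxb2.maven2</groupId>
      <artifactId>maven-jaxb2-plugin</artifactId>
      <version>0.13.1</version>
      <executions>
        <execution>
          <goals><goal>generate</goal></goals>
        </execution>
      </executions>
      <configuration>
        <schemaDirectory>src/main/xsd</schemaDirectory>
      </configuration>
    </plugin>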

How to automate a build of a Java class and all the classes it depends on?

I guess this is kind of a follow-on to question 1522329.
That question talked about getting a list of all classes used at runtime via the java -verbose:class option.
What I'm interested in is automating the build of a JAR file which contains my class(es), and all other classes they rely on. Typically, this would be where I am using code from some third party open source product's "client logic" but they haven't provided a clean set of client API objects. Their complete set of code goes server-side, but I only need the necessary client bits.
This would seem a common issue but I haven't seen anything (e.g. in Eclipse) which helps with this. Am I missing something?
Of course I can still do it manually: bite the bullet and include all the third-party code in a massive JAR (offending my purist sensibilities), walk through the source, use trial and error, or use -verbose:class type stuff (but the latter wouldn't work where, say, my code runs as part of a J2EE servlet, and thus I'd only want to see this for a given Tomcat webapp and, ideally, only for classes related to my classes therein).
I would recommend using a build system such as Ant or Maven. Maven is designed with Java in mind, and is what I use pretty much exclusively. You can even have Maven assemble (using the assembly plugin) all of the dependent classes into one large jar file, so you don't have to worry about dependencies.
http://maven.apache.org/
Edit:
Regarding the servlet: you can also define which dependencies you want packaged up with your jar, and if you are making a stand-alone application you can have the jar tool make an executable jar.
Note: yes, I am a bit of a Maven advocate, as it has made the project I work on much easier. No, I do not work on the project personally. :)
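For the "one large jar" part, the assembly plugin's jar-with-dependencies descriptor is the usual route; a minimal sketch (the main class is a placeholder):

    <!-- Minimal sketch: bundle the app and all its dependencies into one jar. -->
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-assembly-plugin</artifactId>
      <configuration>
        <descriptorRefs>
          <descriptorRef>jar-with-dependencies</descriptorRef>
        </descriptorRefs>
        <archive>
          <manifest>
            <mainClass>com.example.Main</mainClass>  <!-- placeholder -->
          </manifest>
        </archive>
      </configuration>
      <executions>
        <execution>
          <phase>package</phase>
          <goals><goal>single</goal></goals>
        </execution>
      </executions>
    </plugin>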
Take a look at ProGuard.
ProGuard is a free Java class file shrinker, optimizer, obfuscator, and preverifier. It detects and removes unused classes, fields, methods, and attributes. It optimizes bytecode and removes unused instructions. It renames the remaining classes, fields, and methods using short meaningless names. Finally, it preverifies the processed code for Java 6 or for Java Micro Edition.
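A minimal configuration that keeps only what is reachable from your own entry point might look like this (jar names and the keep rule are examples):

    # Sketch of a ProGuard config; jar names and the keep rule are made up.
    -injars      myapp-with-thirdparty.jar
    -outjars     myapp-trimmed.jar
    -libraryjars <java.home>/lib/rt.jar
    -dontobfuscate     # only shrink, don't rename anything
    -keep public class com.example.Main {
        public static void main(java.lang.String[]);
    }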
What you want is not only to include the classes you rely on, but also the classes those classes rely on, and so on, and so forth.
So it's not really a build problem, but more a dependency one. To answer your question, you can solve this either with Maven (apparently) or with Ant + Ivy.
I work with Ivy, and I sometimes build an "uber-jar" using the zipgroupfileset functionality of the Ant Jar task. Not very elegant, some would say, but it's done in 10 seconds :-)
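For reference, the zipgroupfileset trick looks roughly like this (directory names are examples):

    <!-- Sketch of an "uber-jar" target; paths are examples. -->
    <jar destfile="dist/myapp-all.jar">
      <fileset dir="build/classes"/>                  <!-- your own classes -->
      <zipgroupfileset dir="lib" includes="*.jar"/>   <!-- unpack each dependency jar -->
    </jar>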

What does combineaccessrules mean in Eclipse classpaths?

This has been bugging me for years now, and I thought one of you fine people would know - in Eclipse's .classpath files, what is the combineaccessrules attribute of the classpathentry element actually used for?
I can see in the Java Build Path config dialog that it can be manipulated, but I can't think of a good use case for it. If I muck about with the settings, or modify the .classpath file manually, it doesn't seem to have any effect.
I'm hoping someone else has put it to good use, and I can steal their ideas. Basically, it's an itch I'm trying to scratch.
With proper use of access rules you can prevent the use of "internal" and/or "non-api" classes and methods. When you mark a class or package as Forbidden or Discouraged, the compiler shows an error or warning when you use that class, or a class from the specified package. For a longer introduction to access rules you should read this short article.
To see how combining access rules is useful, imagine the following situation:
You have 2 projects, A and B.
On the classpath of project A there is a jar file that is exported. The jar contains some "stable api", "unstable api" and "non-api" public classes.
Project B depends on project A.
You do not allow the use of "non-api" classes in project A, so you set Forbidden access rules on those classes/packages.
In project B you do not allow using "non-api" either, but you do want a warning when using "unstable api". In this case, in project B you only have to set the additional Discouraged access rules, provided you check Combine rules with the access rules of the exported project entries.
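In .classpath terms, project B's entry for project A would look something like this (the pattern is an example):

    <!-- Project B's .classpath entry for project A; the pattern is an example.
         A's exported Forbidden rules get merged in because of combineaccessrules. -->
    <classpathentry kind="src" path="/ProjectA" combineaccessrules="true">
      <accessrules>
        <accessrule kind="discouraged" pattern="com/example/unstable/**"/>
      </accessrules>
    </classpathentry>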
Access rules are handy little things, but dangerous. They exclude a source file from the project compiler but leave the file intact in the filesystem.
The project I work on has a bootstrap class in one of our source folders, but if we include the entire folder in the project classpath it won't compile (it's a long story, and the build process handles this).
So we use an eclipse access rule to exclude it and it never bothers us during development. This means we can't easily change the code, but it's one of those classes that literally hasn't been touched in years.
Combine Access Rules, judging by the JavaDoc, is a real edge use case. To use it you would have to have:
an access rule in an exported source entry of one project
a link to that project from a parent project
a need to combine the access rules of the sub project with the parent
I really can't say how it would be useful, but I hope that at least answers your "what is it" question :)
Although I have never used it myself, a little bit of info can be found here.
whether the access rules of the project's exported entries should be combined with this entry's access rules
the access rules would be something like including "com/tests/**"
