I'm trying to get a better picture of what happens behind the scenes in Android Studio when building an Android application. I've been reading up on Gradle, but one thing I cannot figure out is how to see the respective CLI commands and arguments that Gradle invokes. They seem to be abstracted away and are not logged to the Gradle Console or Event Log.
The closest I've gotten to seeing what's going on inside Gradle is the AOSP code.
Source (gradle_2.2.2 branch):
https://android.googlesource.com/platform/tools/base/+/gradle_2.2.2/build-system/gradle-core/src/main/java/com/android/build/gradle/tasks
Goals
I want to be able to see the respective CLI commands that are generated by the Gradle tasks inside Android Studio.
Use Case Example
I want to view the Legacy Android Build Process in depth. This includes going through the following:
Source Code / Library Code -> javac -> Java bytecode (.class) -> proguard -> minimized bytecode (.class) -> dex -> DEX bytecode (.dex)
For example, I would want to see the respective javac command invoked by AndroidJavaCompile. https://android.googlesource.com/platform/tools/base/+/gradle_2.2.2/build-system/gradle-core/src/main/java/com/android/build/gradle/tasks/factory/AndroidJavaCompile.java
I fear that the only way to do this is to look directly through source code or even build directly from source.
Due Diligence
I've done quite a bit of searching on Google, Android blogs, Google I/O talks, Android books, and much more. I haven't been able to find a straightforward answer.
That's not possible, simply because most Gradle tasks do not invoke CLI commands.
Every Gradle build file is a piece of Groovy code that gets executed in a JVM along with the Gradle API (written in Java). You can therefore implement any task or configuration functionality directly in any JVM language, which is what most plugins do instead of executing command-line tools. Nevertheless, running a CLI command is possible by using or extending the Exec task, as in the sketch below.
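To illustrate that escape hatch, here is a minimal sketch of a custom task class (placed under buildSrc; the task name and the javac -version invocation are hypothetical placeholders) that extends Gradle's Exec task type, so running it really does spawn an external process with a visible command line:

// buildSrc/src/main/java/JavacVersionTask.java
import org.gradle.api.tasks.Exec;

public class JavacVersionTask extends Exec {
    public JavacVersionTask() {
        // Unlike the in-JVM work most built-in tasks do, this command line
        // is explicit: Gradle forks a real OS process to run it.
        commandLine("javac", "-version");
    }
}

A build script can then register it with task javacVersion(type: JavacVersionTask) and run it via ./gradlew javacVersion.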
The compilation step is handled by an AndroidJavaCompile task, which extends the common JavaCompile Gradle task with some version checks and the Instant Run feature. However, you don't know how Gradle actually compiles the .java files. In the internal source files for the JavaCompile task of the Gradle API, there seem to be various implementations (DaemonJavaCompiler, JdkJavaCompiler, and even CommandLineJavaCompiler). Since you can specify CompilerOptions with your task, Gradle seems to choose the real compiler based on these options. Please note that even though a CommandLineJavaCompiler exists, it is also possible (and highly likely) that Gradle prefers the javax.tools package and its JavaCompiler implementation to compile the source files in-process instead of invoking a command-line tool (see the sketch below).
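To make the in-JVM path concrete, here is a minimal sketch (file paths are placeholders) of compiling through the javax.tools API, which is the kind of call a JdkJavaCompiler-style implementation boils down to; note that no javac process is spawned, so there is no CLI command to observe:

import javax.tools.JavaCompiler;
import javax.tools.ToolProvider;

public class InProcessCompile {
    public static void main(String[] args) {
        // The compiler that ships with the JDK, running inside this JVM.
        JavaCompiler javac = ToolProvider.getSystemJavaCompiler();
        // The varargs mirror javac's command-line options, but no external
        // process is started.
        int exitCode = javac.run(null, null, null,
                "-d", "build/classes", "src/main/java/Hello.java");
        System.exit(exitCode);
    }
}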
I also took a look at the ProGuard step in your example build process: ProGuard can be used as a command-line tool, where you specify arguments to define how it works. But ProGuard also provides a Gradle task (ProGuardTask) that executes without invoking ProGuard from the command line; the ProGuard Java code runs inside the Gradle JVM.
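For the curious, this is roughly what such an in-process run looks like; a hedged sketch using ProGuard's embedding API (class names as found in the ProGuard 5.x sources, worth verifying against your version):

import java.util.Properties;
import proguard.Configuration;
import proguard.ConfigurationParser;
import proguard.ProGuard;

public class RunProGuardInProcess {
    public static void main(String[] args) throws Exception {
        // args holds the same options the ProGuard CLI would accept,
        // e.g. "@proguard-rules.pro".
        Configuration configuration = new Configuration();
        ConfigurationParser parser = new ConfigurationParser(args, new Properties());
        parser.parse(configuration);
        // Runs entirely inside this JVM; no proguard.sh/.bat is invoked.
        new ProGuard(configuration).execute();
    }
}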
As you can see, even though each Gradle task could be replaced by one (or multiple) CLI commands, Gradle does not execute these commands; the functionality is called directly in the Gradle JVM. If you want better insight, you can increase the Gradle log level (for example, run the build with --info or --debug). Good implementations of Gradle tasks should provide all necessary information in their logs.
Related
The Build Automatically feature in Eclipse is much faster than ./gradlew build.
My finding after some research is that it compiles and builds only the changed files and replaces them in the build folder.
So why can't the ./gradlew build command also compile and build only the files that have changed, replace them in the build folder, and make the whole building process faster?
I have recently started using the Build Automatically feature with HotswapAgent + DCEVM.
Why can't the gradlew build command compile and build only the things that have changed and make the process faster?
There's no dependable way to determine what needs to be recompiled. For example, compile-time constants get inlined, and the class files keep no trace of where they came from (the information can be found in the source files, but that implies parsing them and losing time; it can also be stored in some auxiliary files, and some tools do that). See the sketch below for an illustration.
See the "Limitations" section of this for details.
The reason may be that they don't go through Gradle's configuration step.
Sure, but the configuration step doesn't usually take that long.
Eclipse knows which files have changed
Good point (in a comment by holwgler).
Some time ago I spent a while trying to make my Gradle compilation faster, and I gave up. Eclipse is damn fast for many reasons:
incremental compilation
multithreading using all cores
knowing all changed files
having the whole compiler code optimized by the JIT
probably caching file dependencies
ugly highly optimized code
My "solution" is ignoring the problem. I do everything in Eclipse, except for integration tests (which take way longer than the compilation) and production builds (which are rare enough so I don't care).
You may want to read these performance tips.
To find out where the time gets spent, use
./gradlew clean; ./gradlew --profile jar
For me, 90% of the time is just :compileJava.
I have always used ADT to develop Android applications, but I have moved to a new machine which does not have ADT installed. Google does not seem to support ADT any more anyway, saying "you should migrate your app development projects to Android Studio as soon as possible". But Android Studio apparently uses Gradle, which requires an Internet connection to compile. That won't work for me when I'm on the road: I can download things to install when needed, but I don't have an Internet connection when I'm actually working.
So as a work-around I am exploring the idea of not using an IDE at all and just compiling everything manually. (This also seems more future-proof against the next time Google decides everyone needs to switch to a whole new set of tools.) I assume that to do this I need a few command-line tools: something like javac to compile Java files into class files, then something to create the dex file(s), and finally something to package everything together into an APK and sign it. However, when I search for instructions on manual builds, I still find constant references to a build system such as Ant or Gradle. I don't want to use any build system!
From scratch, what is the minimum set of tools I need to download, and which command-line invocations do I need to figure out, to turn a simple (let's say "Hello World" simple) Java file (and a few support files like a layout XML file and a manifest) into a working APK? (Note that I need to build an APK that will work on APIs as old as API level 10.)
Update: OK, so far I have installed the standalone SDK tools and used the SDK Manager to install the SDKs that I need. But now I'm unsure which commands to run. I'm familiar with javac, but I know there are other commands too. Again, I have seen several SO questions asking how to build, and they refer to things like the ant or android command, which are NOT in the standalone SDK tools and which shouldn't be necessary if I just knew which commands to invoke manually.
I've written a plugin that uses the org.eclipse.jdt.core.compilationParticipant extension point to gather some compile information to be used elsewhere. I've tested it in multiple versions of the Eclipse IDE and it works like a charm. My ultimate goal is to be able to use it in a headless production PDE build. I've added some logging to the bundle, so I know when it starts up, when it shuts down, and when source compilation occurs. The problem is that these events never get caught by the participant in my headless build. The headless PDE build is kicked off by starting the Equinox launcher from an Ant script that runs the antrunner, which executes the PDE build script. There are so many scopes of execution involved that I'm unsure where to start looking. My first question is: is what I'm trying to do even possible? It didn't seem like the CompilationParticipant would only work in the UI, but I want to make sure before I go down the road of debugging this. Has anyone ever done this?
I tried to add a comment, but I'm too wordy, so I will try to clarify here a bit. Unfortunately I can't do much to change the build system except to apply hooks like I am attempting. I did spend some time running through the Ant scripts that PDE generates and saw that it is calling the JDT compiler adapter, which made me curious whether the JDT compiler adapter could reference the compilation participant, since it is running Ant from the plugin and should have access to the framework. It also seemed to be the intent of the participant API to allow hooking the JDT compiler to do things like implementing the APT processor and other DSL implementations. That was my read on the intent of participants, and I assumed they would be available in a headless build since the APT processor works. But since I can't find a really good tutorial, I'm kind of putting things together piecemeal, and I'm guessing I'm missing something, or at least I hope so.
It is true that PDE generates Ant scripts and calls the javac task, but it also sets the build.compiler property to use the JDT compiler, and therefore I assumed it has access to the OSGi framework. Here is a snippet from one of the generated build files to show what I am talking about:
<compilerarg line="-log '${temp.folder}/pde.example3.jar.bin${logExtension}'" compiler="org.eclipse.jdt.core.JDTCompilerAdapter"/>
Debugging org.eclipse.jdt.internal.core.JavaModelManager reveals that the JDT compiler is in fact being used, but getRegisteredParticipants is not being called for some reason. startup() is being called, however, so the question is why it does not try to register participants.
After spending hours in the debugger attached to the various VMs that spawn during my build process, I was able to determine the flow through a PDE build. I don't believe CompilationParticipants ever come into play; in fact, I don't even think the JavaBuilder is called. The execution path looks something like the following:
Ant spawns my VM, which starts the Equinox launcher, which starts the OSGi framework and instantiates the AntRunner application. This in turn starts Ant from the Eclipse Ant plugin, which runs the build.xml file from the PDE plugin. That build.xml generates all the Ant scripting used to build the Eclipse plugins, including setting build.compiler to the JDTCompilerAdapter, which wraps the Eclipse Java Compiler (originally based on Visual Age for Java). The JDTCompilerAdapter does some setup and instantiates the org.eclipse.jdt.internal.compiler.batch.Main class, which does the real compilation, and also instantiates the org.eclipse.jdt.internal.compiler.apt.dispatch.BatchAnnotationProcessorManager class to handle annotation processing. Nowhere in this path of execution are participants notified, and the JDTCompilerAdapter seems to be specifically designed so it can be used outside the OSGi environment in Ant. So it looks like CompilationParticipants will not give me what I need in a headless PDE build using the antrunner.
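For comparison, here is a hedged sketch of driving ECJ the same way outside OSGi, through its public batch entry point; since this path never goes through the JavaBuilder, no CompilationParticipant is consulted (file paths are placeholders; requires ecj on the classpath):

import java.io.PrintWriter;
import org.eclipse.jdt.core.compiler.batch.BatchCompiler;

public class HeadlessEcjCompile {
    public static void main(String[] args) {
        // A javac-style command line, compiled in-process by the batch compiler.
        boolean success = BatchCompiler.compile(
                "-1.6 -d bin src/Hello.java",
                new PrintWriter(System.out),
                new PrintWriter(System.err),
                null); // no CompilationProgress
        System.exit(success ? 0 : 1);
    }
}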
AFAIK, PDE build is "just" a fancy way of generating a lot of Ant scripts, and I believe it just uses the javac task to compile files. You can check that after the PDE build has run by going into your source folders and finding the generated Ant script.
If what you do is important for the build, I would recommend that you check out Buckminster. It is a build tool designed for OSGi applications. It is special in the sense that it actually builds in an Eclipse workspace, so it uses the same builders and machinery, like CompilationParticipants, as you do during development, assuming you have installed the plugins in the headless build application.
Well, after tons of debugging, reading docs, and stepping through the PDE sources, it seems like this can NOT be done. In a headless build, the JDTCompilerAdapter is designed to work outside of OSGi and does not have access to the framework; it is simply called from the javac task, does NOT involve the JavaBuilder, and therefore does not call any participants.
Currently I'm looking at integrating some build processes into my source control (Git hooks specifically). I'm trying to write a pre-commit hook that checks for build errors in my Java project (a medium-large test development project) and fails to allow commits that contain errors in the build. This is turning out to be rather challenging.
The approach here uses a command-line Eclipse tool to build and output warnings and errors. This does technically work, but it's slow and may cause problems with the Eclipse IDE (I've already had heap allocation errors). I've also looked at solutions using Ant, but those don't seem to offer a simple one-line solution and may still be slow.
My main question: what's the fastest way (in run-time compilation speed) to build and validate a Java project from the command line? I'd like a solution that returns 0 when there are no errors and something else when errors are present, but I'm willing to look at other things.
Let's start with some basics:
pre-commit hooks run on the client, not the server; to enforce this for everyone you need a server-side hook (such as pre-receive), where there is no working directory by default. Either way, you have to make sure that javac is available, and is the correct version.
Your pre-commit hook will freeze up the user's terminal until it completes.
Now, how long will it take to check out a fresh copy of your Java project, run Ant, wait for it to compile, and then process the output of the compile? A minute or two? 20 seconds? 10 seconds? Even 10 seconds will feel like forever as you wait for the Git push to complete. And if other users want to commit code, they have to wait as well.
A better and easier approach is to use a continuous build server like Jenkins. Jenkins is easy to set up (it comes with its own application server built in) and has hundreds of plugins that you can use to help report the health of your project. If a compile fails, Jenkins will email the culprit and whomever else you specify.
We have our Jenkins set up to do Ant builds and Maven builds, and to use either Git or Subversion as the repository (depending upon the project). Jenkins builds the project, keeps the console log, and fails the build if build.xml fails. At our place, this means I start pestering the developer to fix the problem or to undo their changes. At my last workplace, developers were given 10 minutes to fix the build, or I would undo their changes.
Not only can Jenkins let you know when a build fails, it also has plugins that can report on the Java compiler warnings and Javadoc warnings, run FindBugs and PMD, find duplicate lines of code (via CPD, which comes with PMD), and then report everything in a series of graphs. You can also mark builds as unstable (the build completes, but is problematic) or simply fail the build based upon the number of issues found by these tools.
Jenkins can also run unit tests, again graph the results, and then run coverage analysis with JaCoCo, Cobertura, or Emma.
So, take a look at Jenkins. It's easy to set up and will do exactly what you want and more.
Ant. There isn't going to be a "one-line solution". Write an Ant script that compiles the code and fails if there are any errors. It's not easy, but it's the best option.
Out of the choices you mention, Ant is the best. But let's face it, writing XML sucks. My guess is that any build tool will fail and return an error code when compilation fails. My favorite is sbt, but there's a bit of a learning curve if you aren't into Scala (and even Scala users like to complain about sbt). Another great option IMO is Gradle: you write your scripts in Groovy, a dynamically typed language whose syntax is close to a superset of Java's.
Jenkins may be something you could look at.
After several late-night debugging nightmares, I've somehow fallen into the paranoid habit of clicking 'Project -> Clean...' in Eclipse every time I'm about to export a signed/unsigned .APK for upload to an App Store. I can only assume that, some time in the distant past, it seemed to be a necessary safeguard when debugging some external JAR or otherwise.
This adds several tedious minutes to the overall export process, particularly with multiple builds and/or apps.
Is this ever a necessary step, or just paranoia?
Cleaning the project recompiles all of your classes. It may fix problems caused by the R.java class when resource IDs have changed but already-compiled classes still reference the old IDs.
I would consider a build system using Maven, Ant, or the upcoming Gradle. This avoids problems with corrupt workspaces and lets you integrate automated testing easily, e.g. unit tests or the simple question "Does it run on Android version X.Y?"
I prefer Jenkins/Hudson as build server.
Especially when you have several apps for different targets, it can be cumbersome to run all the exports manually.
Ant-based build system using Jenkins: this and this
New Gradle-based build system: this