I tried to run a sample Preon application on Android 2.1 without luck. I wonder if running a Preon application on Android is even possible. How hard would it be to make the Preon framework Dalvik friendly?
Preon is a Java library for building codecs for bitstream-compressed data in a declarative way. Think JAXB or Hibernate, but for binary-encoded data. It was written by Wilfred Springer.
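For readers who haven't seen Preon, the declarative style looks roughly like the sketch below; the field names and sizes are made up, and the org.codehaus.preon API shown may differ between versions:

import org.codehaus.preon.Codec;
import org.codehaus.preon.Codecs;
import org.codehaus.preon.DecodingException;
import org.codehaus.preon.annotation.BoundNumber;

public class PreonExample {

    // A hypothetical two-field header: one byte of version, two bytes of length.
    public static class Header {
        @BoundNumber(size = "8")
        public int version;

        @BoundNumber(size = "16")
        public int payloadLength;
    }

    public static void main(String[] args) throws DecodingException {
        // Build a codec from the annotations and decode a raw byte array.
        Codec<Header> codec = Codecs.create(Header.class);
        Header header = Codecs.decode(codec, new byte[] { 0x01, 0x00, 0x2A });
        System.out.println(header.version + " / " + header.payloadLength);
    }
}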
Below are my findings from trying to run a simple application that uses Preon on Android:
Preon has a dependency on Pecia. Pecia indirectly depends on stax-api, which is not supported out of the box on Android. Is the stax-api used in Preon's core processing? Can I exclude the stax-api from the Preon dependencies?
After excluding pecia from the dependencies (without knowing the consequences), I found out that Preon brings in multiple copies of the log4j.properties file. I suggest moving the log4j.properties files to the /src/test/resources directory in the preon and pecia projects to avoid packaging them with the classes.
Because of the duplicated log4j.properties files, the android-maven-plugin fails at the package goal with the following message:
[INFO] java.util.zip.ZipException: duplicate entry: log4j.properties
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 19.717s
[INFO] Finished at: Wed Mar 23 14:30:55 PST 2011
[INFO] Final Memory: 7M/62M
Well, I will answer my own question. It is POSSIBLE to use the Preon framework on Android. However, Preon does not work out of the box. I managed to run a sample application after making the following changes:
1. I moved all log4j.properties files in the preon projects to their corresponding /src/test/resources directory.
2. Removed the dependency on pecia.
3. Embedded the following interfaces from pecia in preon-binding:
DocumentElement.java
Documenter.java
Para.java
ParaContents.java
4. In org.codehaus.preon.Codecs, I commented out the following imports and all their related code (the code that no longer compiles after this change):
import javax.xml.stream.XMLStreamException;
import nl.flotsam.pecia.builder.ArticleDocument;
import nl.flotsam.pecia.builder.base.DefaultArticleDocument;
import nl.flotsam.pecia.builder.base.DefaultDocumentBuilder;
import nl.flotsam.pecia.builder.html.HtmlDocumentBuilder;
import nl.flotsam.pecia.builder.xml.StreamingXmlWriter;
import nl.flotsam.pecia.builder.xml.XmlWriter;
5. In org.codehaus.preon.codec.ObjectCodecFactory, I commented out:
/* target.document(codec.getCodecDescriptor().reference(CodecDescriptor.Adjective.THE, false));
*/
Suggestions:
I suggest refactoring the Preon code so that the documentation code is separated from the runtime dependencies.
Wilfred, if you want, I could contribute to your project.
Oscar.
You could also take a look at Java Binary Block Parser; the library is compatible with Android 2.1+.
Related
I have an AnyLogic model that uses JAXB functions for parsing an XML file. The model used to work, but it doesn't now, since apparently the newer versions of Java don't include JAXB. The online examples of how to handle this in Java programs don't fit the AnyLogic environment.
Based on online searches, I have downloaded and included the jaxb-api-2.4.0-b180830.0359.jar file in the AnyLogic model. That by itself doesn't work and leads to the following error:
SEVERE: null
javax.xml.bind.JAXBException: Implementation of JAXB-API has not been found on module path or classpath.
I then added the following in the import section:
import java.xml.bind;
import com.sun.xml.bind;
Also tried:
import java.xml.bind.*;
import com.sun.xml.bind.*;
Both resulted in the same error:
The import com.sun.xml.bind cannot be resolved.
The import java.xml cannot be resolved.
Online guidance, for example https://www.dariawan.com/tutorials/java/using-jaxb-java-11/, recommends adding the dependencies to Java programs using code like the following:
<dependency>
<groupId>javax.xml.bind</groupId>
<artifactId>jaxb-api</artifactId>
<version>${jaxb.api.version}</version>
</dependency>
How do we specify such dependencies via AnyLogic interface?
Once the JAXB functions work, they should parse the data in the XML file and create and populate the corresponding objects in AnyLogic, as was the case before.
Sounds like you didn't add the .jar file properly in AnyLogic...
Click on your model in the projects view (the top-most entry above all agents and Main):
In the properties, you can add your .jar file under the "Dependencies" tab:
Do follow the advice to "keep a copy in your model folder" so it doesn't get lost.
Now you can import what you need and it will work (unless the .jar file is broken itself).
PS: Might be better to rephrase your question to "How to load an external jar file to my model dependencies"
The problem
As already stated in the question, the problem is this:
JAXB, Java's tool for XML handling, was removed from the Java standard library beginning with Java 9. The reason for this decision was to make the standard Java library more lightweight.
Dependencies in Java
This can be solved by manually including the removed package(s) in your project. As a lot of people have already had this problem, quite a number of SO questions about it exist, like this and this.
In these answers it is suggested to 'load dependencies', e.g. like this:
<!-- https://mvnrepository.com/artifact/packagename -->
<dependency>
<groupId>packagename</groupId>
<artifactId>modulename</artifactId>
<version>1.0.0</version>
</dependency>
These dependency statements are interpreted by the Java IDE, such as Eclipse (with a package manager such as Maven), and the stated packages are then automatically downloaded from an online package repository and included in the project.
In AnyLogic, the procedure is slightly different!
Dependencies in AnyLogic
In AnyLogic, the exact same thing happens when you click on your project in the AnyLogic editor and add a JAR file under the Dependencies tab; see Benjamin's answer for this. In order to do that, you will first have to manually find the JAR file in a package repository and download it.
Needed Packages
This is where I am not completely sure. I got an example to run when I included the following packages, but there is probably some redundancy, so you might want to experiment with variations of them:
javax.xml.bind / jaxb-api / 2.3.0-b170201.1204
javax.activation / activation / 1.1
org.glassfish.jaxb / jaxb-runtime / 2.3.0-b170127.1453
com.sun.xml.bind / jaxb-impl / 2.2.11
com.sun.xml.bind / jaxb-core / 2.2.11
Example
I created a simple example model in AnyLogic that is based on this blog post. You can run and download it (including the dependency Java packages) here.
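For reference, once the dependencies are loaded, the unmarshalling code itself is plain JAXB. A minimal sketch (the Customer class and the XML file are placeholders of mine, not taken from the model above):

import java.io.File;
import javax.xml.bind.JAXBContext;
import javax.xml.bind.JAXBException;
import javax.xml.bind.Unmarshaller;
import javax.xml.bind.annotation.XmlRootElement;

public class JaxbExample {

    // Hypothetical data class mapped to an XML root element.
    @XmlRootElement
    public static class Customer {
        public String name;
        public int age;
    }

    public static Customer load(File xmlFile) throws JAXBException {
        // Build a context for the bound class and unmarshal the file into an object.
        JAXBContext context = JAXBContext.newInstance(Customer.class);
        Unmarshaller unmarshaller = context.createUnmarshaller();
        return (Customer) unmarshaller.unmarshal(xmlFile);
    }
}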
According to this question, the standard way to determine the memory size of an object in Java is by using java.lang.instrument. After some research, it looks like there is no Scala-specific way to achieve this, so the Java approach should also apply here.
Unfortunately, for a Scala programmer without Java background it is not fully straightforward to adapt this technique in Scala. My questions are:
Question 1
What exactly is happening here? I guess the reason why we have to put a class like ObjectSizeFetcher in a separate JAR is to ensure that it is somehow loaded before the actual program in which we want to use it. I assume it is not possible to use instrumentation without the Premain-Class entry and the -javaagent:TheJarContainingObjectFetcher.jar parameter?
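For context, the agent class I am referring to follows the usual pattern below (my own reconstruction, not necessarily the exact code from the linked question); the Premain-Class manifest entry points at this class:

import java.lang.instrument.Instrumentation;

public class ObjectSizeFetcher {

    private static volatile Instrumentation instrumentation;

    // Invoked by the JVM before main() because of -javaagent:TheJarContainingObjectFetcher.jar.
    public static void premain(String agentArgs, Instrumentation inst) {
        instrumentation = inst;
    }

    public static long getObjectSize(Object o) {
        if (instrumentation == null) {
            throw new IllegalStateException("Agent was not loaded via -javaagent");
        }
        return instrumentation.getObjectSize(o);
    }
}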
Question 2
Is there a simple way to implement the complete workflow in SBT? Currently I only see a somewhat cumbersome solution: I first have to set up a secondary SBT project where I define ObjectSizeFetcher and package it into a JAR. So far I have not figured out how to automatically add the Premain-Class entry to the JAR during packaging, so I would have to solve that manually. Then I can add the resulting JAR to the local libraries of the project where I want to use getObjectSize. For this project I now have to enable fork in run and use javaOptions in run += "-javaagent:TheJarContainingObjectFetcher.jar". Is there a simpler (and less intrusive) workflow to quickly use instrumentation within an existing SBT project? Maybe I can tell SBT directly about a Premain-Class to make this secondary JAR unnecessary?
Question 3
Would you recommend a completely different way to evaluate the memory usage of an object in Scala?
Answer 1: Yes, if you want instrumentation, you need to get an instance of it. You probably can't get one without Premain-Class and -javaagent.
Answer 2: You can (and may need to) use classloaders and create a very simple bootstrap project (in Java, or in Scala with ProGuard). There are two reasons:
The first reason: convenience. You can use java.net.URLClassLoader to include the standard Scala library and the classes directory of your project. You will no longer need to repackage it into a JAR when testing.
The second reason: preventing JAR hell. You probably know that Scala is not binary compatible across versions. You should also know that the Java agent is loaded in the same classloader as the application. If that classloader includes a Scala library, the application can't simply use another Scala version.
However, if the Java agent doesn't use the Scala library directly (e.g. it is a bootstrap application that loads the real agent and its libraries in another classloader), the instrumented application is free to use any Scala library.
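A rough sketch of such a bootstrap agent (the JAR name, class name, and init method are placeholders of mine):

import java.io.File;
import java.lang.instrument.Instrumentation;
import java.net.URL;
import java.net.URLClassLoader;

public final class BootstrapAgent {

    public static void premain(String args, Instrumentation inst) throws Exception {
        // Load the real agent and its own Scala library in an isolated classloader
        // whose parent skips the application classpath, so the instrumented
        // application remains free to use a different Scala version.
        URLClassLoader isolated = new URLClassLoader(
                new URL[] { new File("real-agent.jar").toURI().toURL() },
                ClassLoader.getSystemClassLoader().getParent());
        Class<?> realAgent = Class.forName("com.example.RealAgent", true, isolated);
        realAgent.getMethod("init", Instrumentation.class).invoke(null, inst);
    }
}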
Answer 3: I would probably use instrumentation, too.
Answer 3: you can have a look at ktoso/sbt-jol, which displays the JOL (Java Object Layout), i.e. the analysis of object layout schemes in JVMs.
// project/plugins.sbt
addSbtPlugin("pl.project13.sbt" % "sbt-jol" % pluginVersionHere)
It does include the size:
> jol:internals example.Entry
...
[info] example.Entry object internals:
[info] OFFSET SIZE TYPE DESCRIPTION VALUE
[info] 0 12 (object header) N/A
[info] 12 4 int Entry.value N/A
[info] 16 4 String Entry.key N/A
[info] 20 4 (loss due to the next object alignment)
[info] Instance size: 24 bytes
[info] Space losses: 0 bytes internal + 4 bytes external = 4 bytes total
Having a very basic issue with running tests in a Java Play 2.0 app. It won't pick up my tests.
I've tried to put a test folder in the root of the project and use JUnit, as the docs seem to suggest. This results in:
$ play test
[info] Loading project definition from /Users/.../play20TestTest/project
[info] Set current project to play20TestTest (in build file:/Users/.../testapps/play20TestTest/)
[info] No tests to run for test:test
[success] Total time: 1 s, completed May 22, 2012 11:16:52 AM
I've also tried putting test under app. This at least picks up my code, but the test packages aren't available on my classpath so I assume it's incorrect.
I made a simple example and copied SimpleTest from the docs exactly: https://github.com/jsimone/Play2UnitTest
You placed your test in test/test/SimpleTest.java; move it to test/SimpleTest.java and it will work with the play test command.
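For completeness, a test of the kind the docs describe, placed at test/SimpleTest.java; this is my own trivial example assuming plain JUnit 4, not the exact SimpleTest from the docs:

import org.junit.Test;
import static org.junit.Assert.assertEquals;

public class SimpleTest {

    // Picked up by "play test" once the file sits directly under test/.
    @Test
    public void sumIsComputed() {
        assertEquals(2, 1 + 1);
    }
}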
Has anyone tried to use MessagePack with an Android app?
Is it possible? I have tried to use the JAR from msgpack-java and received the following exception:
Caused by: java.lang.ExceptionInInitializerError
at org.msgpack.Packer.pack(Packer.java:532)
at org.msgpack.MessagePack.pack(MessagePack.java:31)
... 15 more
Caused by: java.lang.ExceptionInInitializerError
at org.msgpack.template.TemplateRegistry.<clinit>(TemplateRegistry.java:38)
... 17 more
Caused by: java.lang.VerifyError: org.msgpack.template.BeansFieldEntryReader
at org.msgpack.template.builder.BeansTemplateBuilder.<init>(BeansTemplateBuilder.java:42)
at org.msgpack.template.builder.BuilderSelectorRegistry.initForJava(BuilderSelectorRegistry.java:73)
at org.msgpack.template.builder.BuilderSelectorRegistry.<clinit>(BuilderSelectorRegistry.java:38)
... 18 more
The code that I use is very simple:
PrintWriter out = new PrintWriter(socket.getOutputStream());
Message msg = new Message();
msg.body = "asdasdasd";
msg.from = "qwe";
msg.to = "ttt";
byte[] bytes = MessagePack.pack(msg);
out.print(bytes);
out.flush();
I have javassist.jar, msgpack-0.5.2.jar, slf4j-api-1.6.2.jar and slf4j-jdk14-1.6.2.jar in my lib directory.
In my server application this code works fine with the same libraries.
(Hopefully) FINAL UPDATE
msgpack : 0.6.8 works on Android without any problems
msgpack-rpc : 0.7.0 works on Android with one caveat.
Specifically, you need to add the following to onCreate for API Level 8 (Android 2.2.1), and possibly lower:
java.lang.System.setProperty("java.net.preferIPv4Stack", "true");
java.lang.System.setProperty("java.net.preferIPv6Addresses", "false");
due to this bug.
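In context that looks roughly like this (the Activity name is a placeholder of mine):

import android.app.Activity;
import android.os.Bundle;

public class MainActivity extends Activity {

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // Force IPv4 before any msgpack-rpc networking happens on API level 8 and below.
        java.lang.System.setProperty("java.net.preferIPv4Stack", "true");
        java.lang.System.setProperty("java.net.preferIPv6Addresses", "false");
    }
}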
If you want to see a simple example, here's a pair of projects set up for this purpose:
https://github.com/mikkoz/msgpack-android-test-server/tree/master/msgpack-android-test-server
https://github.com/mikkoz/msgpack-android-test-client/tree/master/msgpack-android-test-client
Previous Versions
UPDATE: as of 0.6.7, msgpack should be compatible with Android (there is a small dependency exclusion issue). Check the text below for msgpack-rpc (which might also be adapted in the future).
NOTE: If you're also using msgpack-rpc, you need to do the following steps:
Download the msgpack-rpc source from git://github.com/msgpack/msgpack-rpc.git (specifically, the "java" folder).
Change the main msgpack artifact version to the one you've built.
In org.msgpack.rpc.loop.netty.NettyEventLoop, change the NioClientSocketChannelFactory to OioClientSocketChannelFactory(getWorkerExecutor()).
Build the MessagePack-RPC in the same way as in the case of the main MessagePack JAR (see Step 11 above).
The NettyEventLoop replacement is due to this issue:
http://markmail.org/message/ypa3nrr64kzsyfsa.
Important: I've only tested synchronous communication. Asynchronous might not work.
And here's the reason for msgpack not working with Android prior to 0.6.7:
The reason for the error is that MessagePack uses several java.beans classes that are not included in the Android SDK. You're probably using the MessagePackBeans annotation.
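For illustration, a getter/setter-style class of the kind that annotation targets might look like this (a hypothetical example of mine, assuming the annotation lives in org.msgpack.annotation); serializing such a class is what exercises the beans template machinery:

import org.msgpack.annotation.MessagePackBeans;

@MessagePackBeans
public class Person {

    private String name;

    // Bean-style accessors are what the beans template builder inspects via java.beans.
    public String getName() { return name; }
    public void setName(String name) { this.name = name; }
}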
This is a similar problem to the one described here, for which the general solution is outlined here. Unfortunately, in our case it requires a rebuild of msgpack. Here's what I did (you can almost certainly skip Steps 5 and 8, but I haven't tried it that way):
Download the MessagePack source from https://github.com/msgpack/msgpack-java.git.
Import the MessagePack source as a project in your IDE.
Download the Apache Harmony source for the relevant packages from http://svn.apache.org/repos/asf/harmony/enhanced/java/trunk/classlib/modules/beans/src/main/java .
Copy these packages into your MessagePack project's src/main/java folder:
java.beans
java.beans.beancontext
org.apache.harmony.beans
org.apache.harmony.beans.internal.nls
In your MessagePack project, remove the following classes:
PropertyChangeListener
IndexedPropertyChangeEvent
PropertyChangeEvent
PropertyChangeListenerProxy
PropertyChangeSupport
Rename the java.beans packages to something different, e.g. custom.beans.
Change all java.beans references to the renamed ID, so again e.g. custom.beans. This applies especially to BeansFieldEntryReader (this class is the reason for the original error).
Change the custom.beans references for the five classes you removed in Step 5 back to java.beans.
In the org.apache.harmony.beans.internal.nls.Messages class, comment out the method setLocale, and remove the imports associated with it.
Remove all classes that still have errors, except Encoder. In that class, comment out all references to the classes you've removed. You should now have an error-free project.
Build the MessagePack JAR:
If you're using Maven, change the version in the pom.xml to something unique, run Maven build with the install goal, then add the dependency in your Android project with that version.
If you're not using Maven, you have to run the jar goal for Ant with the included build.xml. Replace the msgpack JAR in your Android project with this one.
If you're publishing your app, remember to include the relevant legal notice for Apache Harmony. It's an Apache License, just like MessagePack.
That should do it. Using your example code, and my own data class, I was successfully able to pack and unpack data.
The entire renaming ritual is due to the fact that the DEX compiler complains about java.* package naming.
There is a critical msgpack bug saying data packed with msgpack will get corrupted on the Dalvik VM. http://jira.msgpack.org/browse/MSGPACK-51
There is an ongoing effort by @TheTerribleSwiftTomato and the MessagePack core team to get MessagePack working on Android; please see the related GitHub issue. The fix mentioned in @TheTerribleSwiftTomato's answer is to be found here.
Update
I've managed to get it at least running on Android by (painstakingly) adding all the necessary javassist classes currently required for the build to succeed. That is an extra 600 KB in size, but at least it seems to work. All in all, it appears to work to some extent on Android; also check out the lesser-known resources about MessagePack, such as its User Group and its Wiki, for more information.
On a side note, be sure to use an HTTP request library (such as LoopJ's Android Async HTTP or Apache's HttpClient) that can handle binary data.
Last but not least, you can ping me if there is interest in this JAR which makes MessagePack seemingly work on Android – credit goes, of course, to @TheTerribleSwiftTomato, who supplied the fix above!
I suggest you write this in the main proguard-rules file:
-dontwarn org.msgpack.**
-keep class org.msgpack.** { *; }
I maintain the build process for a large (> 500,000 LOC) Java project. I've just added a Sonar analysis step to the end of the nightly builds. But it takes over three hours to execute ... This isn't a severe problem (it happens overnight), but I'd like to know if I can speed it up (so that I could run it manually during work hours if desired).
Any Sonar, Hudson, Maven or JDK options I can tweak that might improve the situation?
[INFO] ------------- Analyzing Monolith
[INFO] Selected quality profile : Sonar way, language=java
[INFO] Configure maven plugins...
[INFO] Sensor SquidSensor...
[INFO] Java AST scan...
[INFO] Java AST scan done: 103189 ms
[INFO] Java bytecode scan...
... (snip)
[INFO] Java bytecode scan done: 19159 ms
[INFO] Squid extraction...
[INFO] Package design analysis...
... (over three hour wait here)
[INFO] Package design analysis done: 12000771 ms
[INFO] Squid extraction done: 12277075 ms
[INFO] Sensor SquidSensor done: 12404793 ms
12 million milliseconds = 200 minutes. That's a long time! By comparison, the compile and test steps before the sonar step take less than 10 minutes. From what I can tell, the process is CPU-bound; a larger heap has no effect. Maybe it has to be this way because of the tangle / duplication analysis, I don't know. Of course, I know that splitting up the project is the best option! But that will take a fair amount of work; if I can tweak some configuration in the meantime, that would be nice.
Any ideas?
I have walked in your shoes: on a 2-million+ LOC project (that indeed should have been split into sub-projects years ago), I never saw the package design analysis complete within 4 days of computation...
As of SONAR-2164 (Add an option to skip the quadratic "Package design analysis" phase), I have submitted a patch that would allow users to set a property to true in their Maven project file so that the package design analysis is skipped.
This patch is pending approval and is currently scheduled for inclusion in v2.7.
From Freddy Mallet on the list:
"... the problem doesn't come from the DB but come from the algorithm to identify all the package dependencies to cut. ... If you manage to cut this project in several modules, then your problem will vanish."
I tested this theory by excluding a relatively large package, and sure enough the analysis time dropped dramatically. In theory the number of connections can grow quadratically with the number of packages, so this approach is probably as good as it gets with such a large codebase.