How can I use BitmapRegionDecoder code in android 2.2.2 (Froyo)?

I was reading an answer to a different question on SO, in which @RomainGuy commented that one could (please correct me if I'm paraphrasing incorrectly) back-port code from later versions of Android to earlier versions. Specifically, I am interested in back-porting code for BitmapRegionDecoder from Android version 2.3.3 (Gingerbread) to version 2.2.2 (Froyo).
I would rather have asked the question more generally, as in: what are the best practices, and what should be avoided, when back-porting code from newer versions of Android to older ones? But Stack Overflow hinted that my question might be closed as being too subjective.
Maybe, if there is enough interest in the topic, this question could be "morphed" into a more general one, possibly a community wiki?
In any case, I would appreciate any insight into how this is done, whether specific to my use case or more general advice. Do calls to native methods from within the Java class complicate the matter (necessarily involving the NDK)?
If it is indeed possible (and reasonable) to cherry-pick and back-port code in this way, I think many would find it very useful to know how.

As @hackbod mentioned, BitmapRegionDecoder is based on the external Skia library. Yet that may even be a benefit.
Let's examine original source:
BitmapRegionDecoder.java. It mostly defines wrappers around native methods:
private static native Bitmap nativeDecodeRegion(int lbm,
        int start_x, int start_y, int width, int height,
        BitmapFactory.Options options);
private static native int nativeGetWidth(int lbm);
private static native int nativeGetHeight(int lbm);
private static native void nativeClean(int lbm);
// ...multiple nativeNewInstance overloads follow
The class doesn't use any new Java APIs we would need to back-port.
BitmapRegionDecoder.cpp. All the header files it includes are present in Froyo, except these two:
AutoDecodeCancel.h. The only line where it's used:
AutoDecoderCancel adc(options, decoder);
This class manages the lifecycle of SkDecoder instances. It's a small piece of code and can easily be back-ported.
SkBitmapRegionDecoder.h
As the filename states, this is a core component. In fact, everything above is a kind of wrapper around it. The good news is that we may not need to back-port it: it should be possible to take the whole Skia library from Gingerbread and compile it under Froyo, as it is external and doesn't contain any new dependencies.
P.S. I didn't actually dive deep into the code, so please correct me if there's anything I overlooked.
Update:
The source code we need is located in the following repositories, on the branches froyo-release and gingerbread-mr4-release:
External skia library repository
Header files are in include/core and include/images
Android framework base
Java code: graphics/java/android/graphics/BitmapRegionDecoder.java
Native code: core/jni/android/graphics/...

You can back-port some code, if it can exist on top of the SDK you are porting it to.
You can't back-port just anything, though. For example, you couldn't back-port a kernel feature. :)
In this case, there is no easy solution for back-porting it. The implementation sits on top of Skia and the JPEG decoder, which are both native code. You would need to do your own implementation of that code. You could try copy/pasting the code from the platform and gluing it in with your own code via JNI, but this would be a significant amount of work and leave you with native code you need to continue to maintain.
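For a sense of what that JNI glue might look like, here is a minimal hypothetical sketch of the Java side; the library name brdbackport and the signatures are illustrative only, not the platform's actual interface:
// Hypothetical Java-side glue for a back-ported native region decoder.
// Assumes the copied Skia/decoder sources are compiled with the NDK into
// libbrdbackport.so; all names and signatures are made up for illustration.
public class BitmapRegionDecoderBackport {
    static {
        System.loadLibrary("brdbackport"); // loads libbrdbackport.so at class load
    }

    // Native decoder handle (a pointer stored as an int, as platform code
    // of that era did).
    private final int mNativeHandle;

    private BitmapRegionDecoderBackport(int nativeHandle) {
        mNativeHandle = nativeHandle;
    }

    private static native int nativeNewInstance(byte[] data, int offset, int length);
    private static native android.graphics.Bitmap nativeDecodeRegion(
            int nativeHandle, int x, int y, int width, int height);
    private static native void nativeClean(int nativeHandle);
}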
Sorry there is no easy solution for this.

You should consider BitmapRegionDecoderCompat, an API 8+ version of the standard BitmapRegionDecoder (API 10+).
Features
It operates in "compat" mode on devices running API < 10, using a basic Java/Android fallback (which means it won't be as efficient/fast as the native JNI implementation of API 10+, but it avoids ugly boilerplate and manual fallbacks).
It uses the native JNI implementation when running on API 10+.
It adds extra useful methods like decodeBestRegion(), which extracts the "best" image subregion given your parameters (gravity, size). This method also works on API < 10.
Download
In order to use it in your project, you can manually download and add the library as an AAR file, or you can add the dependency in your build.gradle (requires the jCenter repository):
dependencies {
    //...your dependencies
    compile 'org.bonnyfone:brdcompat:0.1'
}
Usage
As stated in the docs, in order to migrate to BRDCompat you just need to change the base class name from BitmapRegionDecoder to BitmapRegionDecoderCompat:
//BitmapRegionDecoder brd = BitmapRegionDecoder.newInstance(...);
BitmapRegionDecoderCompat brd = BitmapRegionDecoderCompat.newInstance(...);
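A slightly fuller usage sketch, assuming BRDCompat mirrors the framework's BitmapRegionDecoder API (newInstance/decodeRegion/recycle); the file path here is illustrative only:
InputStream is = new FileInputStream("/sdcard/large.jpg"); // illustrative path
BitmapRegionDecoderCompat brd = BitmapRegionDecoderCompat.newInstance(is, false);
// Decode just the top-left 256x256 tile instead of the whole bitmap
Bitmap tile = brd.decodeRegion(new Rect(0, 0, 256, 256), new BitmapFactory.Options());
brd.recycle(); // free the decoder when done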

Related

Workarounds to import java lib for mingw / ios / linux / other source sets?

I am aware that it's quite a weird use case to depend on having a JVM installed for some OS source sets; allow me to go through my use case.
I'm writing a simple utility to wrap calls to SteamCMD (https://developer.valvesoftware.com/wiki/SteamCMD), which has platform-dependent installation procedures. So, naturally, I should have:
// commonMain / steamCmdGetter.kt
expect interface SteamCmdGetter {
    fun installClient()
}

// [OS] / steamCmdGetter.kt
actual interface SteamCmdGetter { /* ... */ }
On the other hand, my utility also needs to work with file storage (for example, downloading the client and checking for its existence in storage), so I could also use a file class.
// commonMain / File.kt
expect interface File
I am aware that the JetBrains team has an explicit recommendation in its tutorials:
We recommend that you use expected and actual declarations only for Kotlin declarations that have platform-specific dependencies. It is better to implement as much functionality as possible in the shared module even if doing so takes more time.
Yet, despite the warnings, I would rather not write a MyFile implementation, to save myself from reinventing the wheel for such a common task; but java.io.File has been so dominant in the scene that I could not find any Kotlin alternatives on Gradle / Maven.
Does this mean I am forced to write MyFile in the end? Or is there a workaround for importing Java libraries into Kotlin MPP platform source sets?
First of all, one can use Java libraries only for the jvm and android targets, not the others provided by Kotlin/Multiplatform. In fact, this is exactly the subset of targets that uses Kotlin/JVM. Neither Kotlin/JS nor Kotlin/Native provides interoperability with Java; each has its own interop capabilities. See this page for details on the difference. As for working with files in particular: most probably the answer is yes, and you'll have to implement it per target. This kind of work is usually platform-specific, as it relies heavily on the OS implementation. However, part of the functionality you are looking for can definitely be found in the platform.posix.* platform library, even if it looks more C-stylish.
P.S. A quick search across the Web led me to this community libraries list; maybe it will help. Also, the kotlinlang Slack community (find the link here) may have some interesting solutions to share.

How to take a screenshot with JNA in Windows?

I found a few code examples, but I don't know which methods I can use with which JNA versions. I only found snippets where classes were missing, and I wasn't able to import them.
I would like to know which JNA version I should use and how to get a screenshot as BufferedImage.
A list of required imports would also be great.
It looks like there are several examples at this link. I'll discuss one of them (#3) below, but you may find one of the other examples more applicable to your situation, and hopefully this answer will help you understand the process.
Before the example, I will answer your question "which JNA versions"... you should use the latest version in almost all cases. JNA is a user-supported library, and the core JNA code doesn't change much but each new version adds more user-contributed mappings to native functions. Note their FAQ question, "JNA is missing function XXX in its platform library mappings" and the answer, "No, it's not, it's just waiting for you to add it :)". If the mapping you need is not in JNA, you can simply add it using the example provided, for your immediate needs. Better yet, contribute your mapping to the JNA project so that the next person in your situation will benefit from the work you've done!
Now, example #3 from the link takes a screenshot of the entire screen and returns it as a BufferedImage object. The full source code for that example shows all the imports you will need, most from JNA's WinGDI class.
If you scroll to the bottom of the class, you may also see that the authors have extended two JNA platform interfaces with contributed mappings that aren't in JNA (or at least weren't in 2010, when that code was written). You will have to do similar mappings (and perhaps contribute them to the respective JNA classes when you're done).
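As a shortcut, recent versions of the jna-platform artifact ship a helper that does this GDI work for you. A minimal sketch, assuming jna-platform 4.2+ is on the classpath (where GDI32Util.getScreenshot is available):
import java.awt.image.BufferedImage;
import java.io.File;
import javax.imageio.ImageIO;
import com.sun.jna.platform.win32.GDI32Util;
import com.sun.jna.platform.win32.User32;
import com.sun.jna.platform.win32.WinDef.HWND;

public class ScreenshotDemo {
    public static void main(String[] args) throws Exception {
        // The desktop window handle covers the whole primary screen
        HWND desktop = User32.INSTANCE.GetDesktopWindow();
        BufferedImage screen = GDI32Util.getScreenshot(desktop);
        ImageIO.write(screen, "png", new File("screen.png"));
    }
}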

Transliteration with Android

I want to transliterate (not translate!) text from arbitrary (as far as possible) languages to English in an Android app. Is there a built-in way?
I've found https://android.googlesource.com/platform/libcore/+/master/luni/src/main/java/libcore/icu/Transliterator.java but it doesn't seem to be available by default (at least the IDE doesn't find it). Do I simply need to add this code, as suggested by the comment in Where can I get a JAR to import libcore.io??
Alternately, I could add ICU4J to dependencies and follow icu4j cyrillic to latin. But this is a very large dependency (though Proguard should help).
Finally, I could easily add transliteration from Cyrillic myself and wait until/if someone actually needs other languages (with obvious drawbacks).
Under the hood, Android has all of ICU4J available under android.icu, but only a subset is exposed as public API. If you want to use a class that isn't exposed, you can write code that uses the class and it should work fine. However, doing so is not technically supported, so there could be some version of Android somewhere that doesn't have the class for whatever reason, which would cause your code to break. (This is unlikely to happen in practice, but possible nonetheless.)
NOTE: The namespace android.icu was added in Android 7 Nougat, so it may or may not be usable yet depending on the version of Android you are targeting.
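Whichever route you take (the standalone ICU4J artifact or android.icu on Nougat+), the transliteration call itself is short. A sketch using the com.ibm.icu:icu4j dependency; the compound ID "Any-Latin; Latin-ASCII" first transliterates any script to Latin, then strips accents down to ASCII:
import com.ibm.icu.text.Transliterator;

public class TranslitDemo {
    public static void main(String[] args) {
        Transliterator toLatin = Transliterator.getInstance("Any-Latin; Latin-ASCII");
        System.out.println(toLatin.transliterate("Привет, мир")); // -> "Privet, mir"
    }
}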
Try compile "org.robovm:robovm-rt:+".
robovm-rt has libcore inside. Works for me.

Handling changes in dependent 3rd party libraries

I have a project which depends on several 3rd party libraries, the project itself is packaged as a jar and distributed to other developers as a library.
Those developers add the dependencies to their classpath and use my library in their code.
Recently I had an issue with one of the 3rd party dependencies, the Apache Commons Codec library.
The problem is this:
byte[] arr = "hi".getBytes();
// Codec version 1.4 (chunked output with a trailing CRLF):
Base64.encodeBase64String(arr).equals("aGk=\r\n"); // true
// Codec version 1.6 (unchunked output):
Base64.encodeBase64String(arr).equals("aGk="); // true
As you can see the output of the method has changed with the minor version bump.
My question is: I don't want to force the users of my library onto a specific minor version of a 3rd party library. Assuming I know about the change to the dependent library, is there any way I can recognize which library version is on the classpath and behave accordingly? Or, alternatively, what is considered best practice for this kind of scenario?
P.S. - I know that for the above example I can just use new String(Base64.encodeBase64(data, false)), which is backwards compatible; this is a more general question.
You ask what is the "best practice" for this problem. I'm going to assume that by "this problem" you mean the problem of 3rd party library upgrades, and specifically, these two questions:
When should you upgrade?
What should you do to protect yourself against bad upgrades (like the commons-codec bug mentioned in your example)?
To answer the first question, "when should you upgrade?," many strategies exist in industry. In the majority of the commercial Java world I believe the current dominant practice is "you should upgrade when you are ready to." In other words, as the developer, you first need to realize that a new version of a library is available (for each of your libraries!), you then need to integrate it into your project, and you are the one who makes the final go/no-go decision based on your own test bed --- junit, regression, manual testing, etc... whatever it is you do to ensure quality. Maven facilitates this approach (I call it version "pinning") by making multiple versions of most popular libraries available for automatic download into your build system, and by tacitly fostering this "pinning" tradition.
But other practices do exist; for example, within the Debian Linux distribution it is theoretically possible to delegate a lot of this work to the Debian package maintainers. You would simply dial in your comfort level according to the 4 levels Debian makes available, choosing newness over risk, or vice versa. The 4 levels are: OLDSTABLE, STABLE, TESTING, UNSTABLE. UNSTABLE is remarkably stable, despite its name, and OLDSTABLE offers libraries that may be as much as 3 years out of date compared to the latest-and-greatest versions available on their original "upstream" project websites.
As for the 2nd question, how to protect yourself, I think the current 'best practice' in industry is twofold: choose your libraries based on reputation (Apache's is generally pretty good), and wait a little while before upgrading, e.g., don't always rush to be on the latest-and-greatest. Maybe choose a public release of the library that has already been available 3 to 6 months, in the hope that any critical bugs have been flushed out and patched since the initial release.
You could go further by writing JUnit tests that specifically protect the behaviours you rely on in your dependencies. That way, when you bring in the newer version of a library, your JUnit tests would fail right away, warning you of the problem. But I don't see a lot of people doing that, in my experience. And it's often difficult to be aware of the precise behaviour you are relying on.
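As a sketch, such a guard test can be a single assertion (JUnit 4 assumed here):
import static org.junit.Assert.assertEquals;

import org.apache.commons.codec.binary.Base64;
import org.junit.Test;

public class CodecBehaviourTest {
    // Pins the exact encoding behaviour this project relies on; fails loudly
    // if a library upgrade changes it (as the 1.4 -> 1.5 change did).
    @Test
    public void encodeBase64StringHasNoTrailingCrlf() {
        assertEquals("aGk=", Base64.encodeBase64String("hi".getBytes()));
    }
}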
And, by the way, I'm Julius, the guy responsible for this bug! Please accept my apologies for this problem. Here's why I think it happened. I will speak only for myself. To find out what others on the apache commons-codec team think, you'll have to ask them yourself (e.g., ggregory, sebb).
When I was working on Base64 in versions 1.4 and 1.5, I was very much focused on the main problem of Base64, that is, encoding binary data into the lower-127 ASCII range, and decoding it back to binary.
So in my mind (and here's where I went wrong) the difference between "aGk=\r\n" and "aGk=" is immaterial. They both decode to the same binary result!
But thinking about it in a broader sense after reading your Stack Overflow posting here, I realize there is probably a very popular use case that I never considered: password checking against a table of encrypted passwords in a database. In that use case you probably do the following:
// a. store user's password in the database
// using encryption and salt, and finally,
// commons-codec-1.4.jar (with "\r\n").
//
// b. every time the user logs in, encrypt their
// password using appropriate encryption alg., plus salt,
// finally base64 encode using latest version of commons-codec.jar,
// and then check against encrypted password in the database
// to see if it matches.
So of course this use case fails if commons-codec.jar changes its encoding behaviour, even in ways that are immaterial according to the Base64 spec. I'm very sorry!
I think even with all of the "best practices" I spelled out at the beginning of this post, there's still a high probability of getting screwed on this one. Debian Testing already contains commons-codec-1.5, the version with the bug, and to fix this bug essentially means screwing the people who used version 1.5 instead of version 1.4 as you did. But I will try to put some documentation on the Apache website to warn people. Thanks for mentioning it here on Stack Overflow (am I right about the use case?).
ps. I thought Paul Grime's solution was pretty neat, but I suspect it relies on projects putting version info in the Jar's META-INF/MANIFEST.MF file. I think all Apache Java libraries do this, but other projects might not. The approach is a nice way to pin yourself to versions at build time, though: instead of realizing that you depend on the "\r\n" and writing the JUnit that protects against that, you can write a much easier JUnit: assertTrue(desiredLibVersion.equals(actualLibVersion)).
(This assumes run-time libs don't change compared to build-time libs!)
package stackoverflow;

import org.apache.commons.codec.binary.Base64;

public class CodecTest {
    public static void main(String[] args) {
        byte[] arr = "hi".getBytes();
        String s = Base64.encodeBase64String(arr);
        System.out.println("'" + s + "'");

        // Read the version info the jar publishes in META-INF/MANIFEST.MF
        Package package_ = Package.getPackage("org.apache.commons.codec.binary");
        System.out.println(package_);
        System.out.println("specificationVersion: " + package_.getSpecificationVersion());
        System.out.println("implementationVersion: " + package_.getImplementationVersion());
    }
}
Produces (for v1.6):
'aGk='
package org.apache.commons.codec.binary, Commons Codec, version 1.6
specificationVersion: 1.6
implementationVersion: 1.6
Produces (for v1.4):
'aGk=
'
package org.apache.commons.codec.binary, Commons Codec, version 1.4
specificationVersion: 1.4
implementationVersion: 1.4
So you could use the package object to test.
But I would say that it's a bit naughty for the API to have changed the way it did.
EDIT Here is the reason for the change - https://issues.apache.org/jira/browse/CODEC-99.
You could calculate an MD5 sum of the actual class file and compare it to the expected one. It could work like this:
import java.io.InputStream;
import java.math.BigInteger;
import java.security.MessageDigest;

String classname = "java.util.Random"; // fill in your class
MessageDigest digest = MessageDigest.getInstance("MD5");
Class<?> test = Class.forName(classname);
// Hash the .class file bytes exactly as they are loaded from the classpath
try (InputStream in = test.getResourceAsStream("/" + classname.replace(".", "/") + ".class")) {
    byte[] buffer = new byte[8192];
    int read;
    while ((read = in.read(buffer)) > 0) {
        digest.update(buffer, 0, read);
    }
}
byte[] md5sum = digest.digest();
// Render the 128-bit digest as a hex string
String output = new BigInteger(1, md5sum).toString(16);
System.out.println(output);
Or maybe you could iterate over the file names in the classpath. Of course this only works if the devs use the original file names.
import java.io.File;
import java.util.regex.Pattern;

String classpath = System.getProperty("java.class.path");
// File.pathSeparator is ";" on Windows and ":" elsewhere
for (String path : classpath.split(Pattern.quote(File.pathSeparator))) {
    File o = new File(path);
    if (o.isDirectory()) {
        // ...
    }
}
Asaf, I solved this problem by using Maven. Maven has nice versioning support for all the artifacts you use in your project. On top of that, I use the excellent Maven Shade Plugin, which gives you the ability to package all 3rd party libraries (Maven artifacts) in a single JAR file, ready for deployment. All other solutions are just inferior - I am talking from personal experience - I've been there, done that... I even wrote my own plugin manager, etc. Use Maven, that is my friendly advice.
Could replacing the newline with an empty string be a solution?
String normalized = Base64.encodeBase64String(arr).replace("\r\n", "");
I would create two or more different versions of my library, each matching the appropriate third-party library version, and provide a manual stating which one to use. Probably also write a correct POM for each.
To resolve your problem, I think the best way is to use an OSGi container: you can choose your version of the 3rd party dependency while other libraries safely use the other version, without any conflict.
If you cannot rely on an OSGi container, you can use the implementation version in the MANIFEST.MF.
Maven is a great tool, but it cannot resolve your problem on its own.

Google App Engine "repackaged" package

What is the purpose of the classes in this package?
I want to use Base64 encoding in my app. As I'm typing away in Eclipse, I am prompted to import a class called "com.google.appengine.repackaged.com.google.common.util.Base64".
I can't find any documentation about what this class does: no Javadoc, and no mention in the Google App Engine manual (that I can see). Is this some kind of "hidden" API that I'm not supposed to have access to?
Is this some kind of "hidden" API that I'm not supposed to have access to?
Yes.
The purpose of repackaging Java classes is to have a private copy of a library that might otherwise conflict with another version of the same library (one that the application developer adds to his project as a jar file).
It is one possible answer to JAR-hell.
Even the JDK makes use of this mechanism, e.g. with com.sun.org.apache.xerces, an XML parsing library developed by the Apache project that Sun chose to include (repackaged).
Do not call these classes directly. (You could, and they would probably work okay, but as they are not part of the official API, they could disappear in the next version).
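The safe route is to depend on a public Base64 API instead. For example, on Java 8+ the JDK itself ships one (a generic sketch, not App Engine specific):
import java.util.Base64;

public class Base64Demo {
    public static void main(String[] args) {
        // Encode and decode with the public JDK API instead of a repackaged class
        String encoded = Base64.getEncoder().encodeToString("hi".getBytes());
        System.out.println(encoded); // aGk=
        System.out.println(new String(Base64.getDecoder().decode(encoded))); // hi
    }
}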
