Usually when I create something I have a number of different ideas on how to do it. Anytime I'm not sure which is better, rather than posting here each time, I'd like to test things myself. What methods can I use to test speed/performance/memory usage etc in Android?
While I personally have little experience with them, the Performance Tips page in the Android documentation mentions two potentially useful tools for benchmarking.
From a cursory glance, it seems Traceview is useful for testing individual methods as well as broader sections of an application.
Based on the documentation, creating a trace, retrieving it, and processing it can be done as follows:
First, find the locations where you want to test performance. Here's a quick sample:
public void doSomething() {
    // maybe a long process that needs optimizing
}

@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    // set up stuff

    // Writes the trace to myfilename.trace on external storage
    Debug.startMethodTracing("myfilename");
    doSomething();
    Debug.stopMethodTracing();
}
Second, retrieve the file from your Android device (note: you need permission to write to external storage, i.e. WRITE_EXTERNAL_STORAGE, to create this file, so be sure to add that to your manifest). This can be done using the Android Debug Bridge (located in <Android SDK Home>/platform-tools). Open a command shell and run the following command:
<Android SDK home>/platform-tools/adb pull /sdcard/myfilename.trace <Desired Location>
Lastly, you can process the output using Traceview with another command call. Traceview is located in the <Android SDK Home>/tools directory, so the call would look like this:
<Android SDK Home>/tools/traceview <Desired Location>/myfilename
Note that depending on your operating system, you may need to swap the forward slashes for backslashes. This is a rough outline of the process described in the documentation (cited above), so I would recommend reading that before trying to implement it.
I have been developing an Android application that needs to retrieve a JSON file from dweet.io, an IoT machine-to-machine communication site. With some quick Google searches I found a Java wrapper for interfacing with dweet.io, but although it seems fine in theory, the application throws some errors involving 'StrictMode' when it tries to reach the site. This appears to be some kind of optimization standard put in place by Google that prevents me from doing disk or network I/O in my main activity. If I can't place it in my main activity, where should I put it, and how should I do that?
edit: it appears the solution involves something called 'AsyncTask', but I'm not sure how to use it or what that would entail.
Create a MyTask class which extends AsyncTask.
Override doInBackground(), and optionally other callbacks such as onPostExecute(), as needed (they are not required for now).
Take a look at this tutorial.
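For illustration, here is a minimal sketch of that shape. The class name and the dweet.io URL in the usage comment are placeholders I've made up, not something from the question or the wrapper library:

import android.os.AsyncTask;
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;

// Runs the network I/O off the main thread, so StrictMode has nothing to complain about.
public class FetchDweetTask extends AsyncTask<String, Void, String> {

    @Override
    protected String doInBackground(String... urls) {
        HttpURLConnection connection = null;
        try {
            connection = (HttpURLConnection) new URL(urls[0]).openConnection();
            BufferedReader reader = new BufferedReader(
                    new InputStreamReader(connection.getInputStream()));
            StringBuilder json = new StringBuilder();
            String line;
            while ((line = reader.readLine()) != null) {
                json.append(line);
            }
            reader.close();
            return json.toString();
        } catch (IOException e) {
            return null; // surface the failure in onPostExecute if needed
        } finally {
            if (connection != null) {
                connection.disconnect();
            }
        }
    }

    @Override
    protected void onPostExecute(String json) {
        // Back on the main thread here; safe to update the UI with the result.
    }
}

// In the Activity (the thing name is again a placeholder):
// new FetchDweetTask().execute("https://dweet.io/get/latest/dweet/for/my-thing");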
I'm not sure this question belongs on SO since it is maybe too broad, but I don't know where else to ask it (I did not find a better Stack Exchange site).
Context
I'm using UiAutomator to write some UI tests on Android. I created some functions to simplify writing the tests, like the one in the docs:
public void startMainActivityFromHomeScreen() {
    /* Start the app from the home screen */
}
As a developer, this works fine, but non-technical people (the contracting owner) can't easily use these functions to write tests.
Needs
I'm looking for a way for non-technical users to write scripts using the functions I have already defined. Here is a dummy example (both the script format and the actions are made up):
Suite: Launch the app twice from the home screen
  Case: Launch the app for the first time
    Do startMainActivityFromHomeScreen
    Expect ...
  Case: Launch the app for the second time
    Do startMainActivityFromHomeScreen
    Expect ...
The important point here is to interact with Java functions. I know of other tools like Calabash, but they do not provide Java interfaces.
Current approach
Here is an idea (nothing is implemented yet):
1. Put all the functions in a lib.
2. Write a Groovy DSL (because Groovy interacts well with Java) which allows non-technical users to easily write scripts.
3. Create a Java program which evaluates the Groovy script and generates the associated Android source code (with the lib from (1) as a Gradle dependency).
4. Run gradle androidTestCompile.
Since the functions are in a lib, developers can easily include and use it in their projects, so the same lib can be used by everyone.
I hate this idea since I have to generate source code from my own code, but it is the only one I have.
Questions
Is this approach as horrible as I think it is?
Do you know of another way to do it?
I'd check out https://cucumber.io/docs/reference/jvm#java; it's a library that accomplishes pretty much exactly what you're looking for by letting you associate regexes with Java test methods.
Your Java code would look something like:
// Cucumber step definition: the regex ties a plain-English step to this method.
@When("^I open the app from the (main|home) screen$")
public void openApp(String launchScreen) {
    ...
}
And the testing file would look like:
Feature: Launching app
  Scenario: Launching from first screen
    When I open the app from the main screen
    Then I see a blue icon...
  Scenario: Launching from second screen
    When I open the app from the home screen
    Then I see a green icon...
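To wire the feature files to the step definitions you also need a runner class. Below is a sketch assuming the plain cucumber-junit runner and a made-up features path; for on-device UiAutomator tests you would use the cucumber-android equivalent instead, whose setup differs slightly:

// Package names as in Cucumber-JVM 1.x/2.x; newer releases use io.cucumber.* instead.
import cucumber.api.CucumberOptions;
import cucumber.api.junit.Cucumber;
import org.junit.runner.RunWith;

// Empty class: JUnit picks it up, Cucumber then parses the .feature files under the
// given path and matches every step against annotated methods like openApp above.
@RunWith(Cucumber.class)
@CucumberOptions(features = "src/test/resources/features")
public class RunCucumberTest {
}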
I have copy-pasted together a MainActivity class for a simple Android application, which uses BluetoothAdapter, TextToSpeech, and also a Runnable status checker that runs periodically. For the latter, I have "implemented TextToSpeech.OnInitListener".
The code is probably too large to post here, so I will try to formulate the question in general form and then explain why.
How is one supposed to understand where "thread forks" happen in Java code where everything looks like a method invocation? Are there any conventions (for Android libraries specifically, or maybe Java-wide) to distinguish a simple call from a call that forks its own thread? Or is RTFM the only way to find out where forking happens?
And the actual problem is that Bluetooth LE scanning (?) sometimes duplicates, producing double (or triple, sometimes more) log entries one after another. I guess I have a mistake somewhere in the onStop, onResume, onDestroy, etc. hooks (though I was trying to follow the state transitions shown in the diagram in the docs), but it is hard to find out why some threads/tasks survive and duplicate.
I am only "programming" in Java occasionally, so I have very little idea of how concurrency can be debugged. But maybe folks with experience can share general hints/advice/guidelines that are useful in many more cases than mine.
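For reference, here is the kind of lifecycle balancing I think I'm supposed to be doing (a sketch only; the field names and interval are placeholders, not my actual code): every start of a periodic task or scan in one lifecycle callback should have a matching stop in the mirror callback, otherwise each pass through onResume() adds another live copy.

import android.os.Handler;
import android.os.Looper;

// Sketch of Activity fields/callbacks: one Handler-driven status checker, started in
// onResume() and stopped in onPause(), so re-entering the Activity can never leave
// two copies running. statusChecker and CHECK_INTERVAL_MS are illustrative names.
private static final long CHECK_INTERVAL_MS = 5000;
private final Handler handler = new Handler(Looper.getMainLooper());

private final Runnable statusChecker = new Runnable() {
    @Override
    public void run() {
        // ... periodic status check / BLE scan kick-off goes here ...
        handler.postDelayed(this, CHECK_INTERVAL_MS); // reschedules itself
    }
};

@Override
protected void onResume() {
    super.onResume();
    handler.post(statusChecker);            // exactly one chain is started
}

@Override
protected void onPause() {
    super.onPause();
    handler.removeCallbacks(statusChecker); // and it is stopped here, every time
}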
Update:
Since the media side of JFX has been open sourced, I've looked into this myself and it is indeed possible, but requires changing and rebuilding the JFX source (both Java and C parts.) The process is described here for anyone that wants to have a go - I add MKV support in that example, but it should be very similar for other plugins.
The remainder of the question is thus mainly historical, but I'll leave it here for reference.
Background
I've been using VLCJ thus far for playing video in my application. It works, but if possible I'd like to see whether I can achieve a similar level of support for common codecs by migrating to JavaFX, saving myself a lot of the hassle with multiple VMs and suchlike that VLCJ needs in order to play multiple videos reliably. I won't go into it here, but see my answer to this question if you're interested in the details. There's also the issue of cross-platform compatibility: it works OK on Linux, but I haven't worked out how to get it to show on Mac yet (I believe there's some security in place to prevent one process gaining access to another's native components, but again that's beyond the scope of this question).
It boils down to the fact that, while it works, it's a lot of maintenance and hassle working with multiple VMs and bridging them stably, and I'd rather avoid that if there's another solution that would be easier. VLC has a pretty legendary level of support for playing pretty much anything, which is why I've gone with it so far, and I'd be interested to see if I can get a similar result in JavaFX - or at least whether it can provide the means for doing so in a cross-platform manner.
Research
JavaFX 2.0 supports video - great! But at the moment the official line is that it supports "FLV containing VP6 video and MP3 audio". Is there a way to extend this to add support for more codecs? There's no particular codec that I need to support; it's more a case of as many as I can, so I'm looking for an extensible way to go about the above.
I wondered if it would play video for codecs installed natively on the machine and just doesn't advertise itself as such (because that functionality is obviously machine-dependent and not cross-platform). But no dice: I've tried a number of common formats and it really does refuse to play anything other than what it states.
Looking at JavaFX 1.3, it also supported other platform-dependent codecs depending on where it was installed. Is there a way to get this behaviour with JavaFX 2? Or is it planned at all for a subsequent release? I haven't been able to find any information on the roadmap or any comment from Oracle about it.
The only thing I could find from searching extensively is here, which implies that it may be possible, but no one seems to know how. I'd also be interested to know, if it's based on GStreamer, why all the formats supported by GStreamer aren't included by default.
In terms of playing DVDs with JavaFX I've got absolutely nowhere, so I'm assuming that's just a no-go at the moment. If anyone does have any ideas or information though, I'm all ears.
Other approaches
One approach which I was half wondering may be possible is crowbarring the JMC jar out of the old JavaFX as described here and trying to get that working alongside JavaFX 2. I don't suppose anyone has had any luck with that approach or something similar?
All things failing, if anyone has any information or links on if/when support for additional codecs will be available out of the box, I'd be interested to hear that also. Or if anyone has contact details for someone at Oracle I could ask, that would also be appreciated! I've been longing for decent video support in Java for some time, and I guess what this boils down to is trying to figure out whether JavaFX is the answer, or just another half-hearted attempt that will never play more than it does at the moment. I'm hoping it's not the latter, but I've yet to see much to show that's the case.
Believe me, I feel and know your frustration. I have pondered this for a while, but I've had to use roundabout means of solving my issues.
There are many ways around this, each with limitations, but it depends on what works for you:
The docs say WebView works with HTML5, which plays videos supported on the platform (though sadly not Flash). If using a WebView to play video works for you, you can try this out. You can even draw over it with other nodes (a minimal sketch follows these options).
Portable VLC player! If you're developing some sort of projector/director app and you want fullscreen video, you can have portable VLC player play the video fullscreen on one screen with its controls on the other. I used this solution and it works quite well on Mac and Windows. :)
The only thing is you can't draw nodes on the video, as it's an external app with just the illusion of being your app's fullscreen video.
If you ever need to utilize the power of Flash within your JavaFX 2.0 application, use an SWT-based browser (or something like the DJ Project if you're a Swinger), as they support all the features of your native browser.
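Here is the sketch for the WebView route mentioned above (my own illustration; the URL is just a placeholder for a page hosting an HTML5 video element):

import javafx.application.Application;
import javafx.scene.Scene;
import javafx.scene.web.WebView;
import javafx.stage.Stage;

// Minimal JavaFX app that delegates video playback to WebView/HTML5.
// The page URL is a placeholder, not a real endpoint.
public class WebViewVideoDemo extends Application {

    @Override
    public void start(Stage stage) {
        WebView webView = new WebView();
        webView.getEngine().load("https://example.com/player.html");
        stage.setScene(new Scene(webView, 800, 600));
        stage.show();
    }

    public static void main(String[] args) {
        launch(args);
    }
}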
I've now managed to compile MKV support into JavaFX successfully, and it does take some, but not a great deal of effort on the native layer also. See here for the discussion surrounding it, and here for the result submitted as a patch / JIRA ticket.
I've written a much more comprehensive guide on the process here which may be of interest to anyone else looking to go down this route.
What follows is my brief investigation before I actually seriously looked at compiling other media support in, though I'll leave it here for reference.
Now that JFX8 has been released and is completely open source, I've spent a bit of time looking at how this could be done, and whether it could be done without patching the JFX source. Unfortunately the answer to that latter point is an almost definite no, at least not without horrible bytecode manipulation hacks. I may look into this more practically at a later date, but I'll document what I've worked out so far from the source available.
The magic starts from the Media constructor, which is ultimately where the MediaException pops out from (with the MEDIA_UNSUPPORTED flag if you try to play an unsupported format). From there it creates the Locator, whose constructor ensures that the URL is one that's supported. Its init() method is then called in a separate thread, which performs some sanity checking on the URL string, reads the file, and then tries to work out what the format is.
The relevant code for this part of the method is thus:
if (scheme.equals("file") || scheme.equals("jar")) {
    InputStream stream = getInputStream(uri);
    stream.close();
    isConnected = true;
    contentType = MediaUtils.filenameToContentType(uriString); // We need to provide at least something
}

if (isConnected) {
    // Check whether content may be played.
    // For WAV use file signature, since it can detect audio format
    // and we can fail sooner, then doing it at runtime.
    // This is important for AudioClip.
    if (MediaUtils.CONTENT_TYPE_WAV.equals(contentType)) {
        contentType = getContentTypeFromFileSignature(uri);
        if (!MediaManager.canPlayContentType(contentType)) {
            isMediaSupported = false;
        }
    } else {
        if (contentType == null || !MediaManager.canPlayContentType(contentType)) {
            // Try content based on file name.
            contentType = MediaUtils.filenameToContentType(uriString);
            if (Locator.DEFAULT_CONTENT_TYPE.equals(contentType)) {
                // Try content based on file signature.
                contentType = getContentTypeFromFileSignature(uri);
            }
            if (!MediaManager.canPlayContentType(contentType)) {
                isMediaSupported = false;
            }
        }
    }

    // Break as connection has been made and media type checked.
    break;
}
From this we can see that a first "dumb" attempt is made to determine the content type based on the file name (this is what MediaUtils.filenameToContentType() does). There are then some special cases for checking different types of WAV file, but if that fails we fall back on a cleverer check which looks at the actual file signature. Both these checks are in MediaUtils. The latter check is much more extensive, and looks at the first few bytes of the file to see if it can work out the format that way. If it can't, it bails out and throws the exception that then pops out as our dreaded MEDIA_UNSUPPORTED flag.
If the type is identified correctly, though, there's still another hurdle to get through - it has to be supported by the current platform. Some platforms are loaded dynamically depending on the environment; however, the GSTPlatform always exists, so we would need to put any additional (universal) formats there. This is relatively simple: a CONTENT_TYPES array exists which just holds the list of supported formats.
Unfortunately, cloning the JavaFX repo seems to be failing for me at the moment, otherwise I'd attempt to put some of this into practice. But in lieu of the above, what actually needs to happen to add support for further formats? It doesn't seem hugely difficult:
In MediaUtils, support needs to be added to the filenameToContentType() method to handle the new file extension. This is trivial (a rough sketch of this and the GSTPlatform change follows this list).
In the same class, support needs to be added to the fileSignatureToContentType() method to work out the file type based on its signature. This is a tad more complex, but still not too bad. This may even be optional, since the current code only seems to use this as a fallback if the format isn't identified correctly (or at all) from the file extension. A comprehensive list of file signatures for different formats can be found here which should help with this task.
In GSTPlatform, the new content type needs to be added to the list of supported content types.
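As a rough illustration of the MediaUtils and GSTPlatform changes just listed (a sketch only - the method shape, the existing constants and the "video/x-matroska" MIME string are my guesses, not copied from the real JavaFX source):

// Hypothetical extension of MediaUtils.filenameToContentType() for ".mkv".
public static String filenameToContentType(String filename) {
    String extension = filename.substring(filename.lastIndexOf('.') + 1).toLowerCase();
    if ("flv".equals(extension)) {
        return CONTENT_TYPE_FLV;
    } else if ("mp4".equals(extension)) {
        return CONTENT_TYPE_MP4;
    } else if ("mkv".equals(extension)) {
        return CONTENT_TYPE_MKV;         // new mapping for the extra container
    }
    return Locator.DEFAULT_CONTENT_TYPE; // unknown: the signature check runs later
}

// Hypothetical shape of GSTPlatform's supported-type list with the new entry added.
private static final String[] CONTENT_TYPES = {
    "video/x-flv",
    "video/mp4",
    "video/x-matroska"                   // newly accepted by the GStreamer platform
};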
On the Java side of things, this appears to be all that's necessary to get it to accept the content type and at least attempt to pass it down to the native Gstreamer layer.
However, I'm no expert in GStreamer, so while I'm aware there are many more formats it can handle and play that JavaFX currently refuses, I'm unsure exactly how they've removed this capability. They've definitely done it in the Java layer above, but they may also have done it at the native GStreamer level - at this point I'm unsure.
I assume they've made some changes to GStreamer for JFX8 - but at the present time they're not listed on the relevant project page, so it's quite hard to work out exactly what they've changed for this version.
The next step would be to grab the JFX8 source, build with the above proposed changes for a new content type, and then see what errors (if any) occur on the native level, then take it from there.
The API design does not appear to have support for rolling your own codecs. Pretty much all of the classes are final (e.g. VideoTrack, Media, MediaPlayer etc). I assume that the actual video decoding is done with internal classes at present, meaning there is no way to override them.
There is a plan to open source JavaFX 2.0, I suspect as we approach the release of JDK 8. Hopefully when they do this we can see how they resolve their codecs from the Media(String source) constructor and see if we can hook into it somehow.
And now, JavaFX 2.1 finally supports H.264 in MP4, so you should be good to go without the stunts posted above. :)
Current open feature requests for this in the JavaFX bug tracking system:
JDK-8091656 Wishlist for more media format support
JDK-8091755 Media should support InputStream
Read the linked feature requests and the associated comments on them to understand their current status (or lack thereof ;-) for the JavaFX distribution version that you are using.
Note, for the InputStream based Media API, one of the later comments by a JavaFX developer is "I propose we consider this for JDK 10", so I guess it may be a possibility in the future...
Also note, if you are not sure if JavaFX currently has in-built support for a given encoding type or not, a comprehensive overview of supported media encodings and media container types is provided in the javadoc for the javafx.media package (just ensure that you review the version of the javadoc which matches your version of JavaFX).
Those who may be interested in other solutions to at least get a video to play from JavaFX, even if it is a media type not natively supported by JavaFX and you don't want to hack the native JavaFX media support just to get your video to play, can also see my answer to the related question:
Playing h265 HEVC in a JavaFX client
I'm working with a legacy Java app that is new to me. One way to figure out how it works and find things more easily would be to get a full stack trace after I perform an action, so that I can see which classes are used by a particular UI action. I thought this was possible in the debugger, but it seems that only works if I insert a breakpoint first, and part of the purpose here is to avoid having to know what's being called before I can insert the breakpoint (the trace is what would tell me that).
I apologize if this is a basic question, I have searched on this but I'm not finding the correct answer.
This doesn't directly answer your question, but maybe it will solve your problem better. Take a look at BTrace. It lets you instrument a running Java app and insert some basic code of your own. You could, for instance, have it write out entire method call chains to help you find your way through the app. It's somewhat similar to AspectJ, but with an entirely different purpose and requiring no change in the project source:
"BTrace is a safe, dynamic tracing tool for Java. BTrace works by dynamically (bytecode) instrumenting classes of a running Java program. BTrace inserts tracing actions into the classes of a running Java program and hotswaps the traced program classes."
A few suggestions:
Some profilers will allow you to walk from any particular method up (and sometimes down) to see what's calling it and what it's calling. I've found this surprisingly informative about flow, even in apps I thought I knew well.
For understanding the mainline flow, I don't think there's a better substitute for working interactively with a debugger. It will lead you into learning other important things. Not what you wanted to hear, I know. This presumes that you can rapidly restart the app when you miss a key off-ramp.
Reverse-designing large legacy apps is the one place where I use UML fairly regularly. There's too much to keep in my head to form a good big picture. If you have a UML tool that will do reverse-engineering, load it up with the app, then probably prune down hard on the classes you don't care about, because they are trivial or obvious. Arrange the diagrams in a way that helps you understand. I've used Together, Magic Draw, and Visual Paradigm in this way. Together worked the best - but it was a decade ago.
When you are in the debugger perspective, you will see a view showing the launched processes. In that view you can tell it to pause all threads of a process. Once stopped, you will be able to browse through threads to see what they are all doing. To try to catch what a particular action is doing, you would have to start the action and then quickly pause all threads.
You could always run the application with the VM argument -verbose:class. You can then watch the console output and see which classes the VM loads when you perform a particular action. This could give you a starting place for where to set breakpoints. It won't always work, depending on the scenario, but it may be helpful.
Another trick you can use is to figure out which classes you know have to be involved in the code path you are trying to trap. For instance, you mentioned that it's a Java EE web app, and therefore the action is likely some kind of servlet interaction (at some level). I don't have the API in front of me, but you can place a breakpoint on the method in the response object where the output stream is retrieved. Once that breaks, you will know the code that's trying to service the request.
You can always see where a method is called from by clicking "Open Call Hierarchy" in Eclipse (right-click the selected method, or Ctrl+Alt+H). Also, you can always jump to where a method/class is defined by clicking "Open Declaration" (right-click the selected method/class, or F3).