I am looking for tools to help with testing my Android app. It contains a hardware component, and so far our only testing method is manual; obviously this isn't optimal.
The app uses the phone's camera and a hardware attachment. Normally I would look into Espresso for UI testing and JUnit for unit tests, but I don't want to start down that path and get nowhere.
An answer here would suggest testing tools and explain how they get around the hardware and camera constraints.
How about Calabash? (GitHub) You can basically automate the touches you would make manually. We use it here to simulate barcodes being entered from an external USB scanner.
I am currently working on an Android app that talks to custom hardware. We ended up just mocking the interactions with the device for our Espresso tests.
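The mocking boiled down to hiding the device behind an interface and injecting a fake in tests. A minimal sketch of the idea, where HardwareScanner and readBarcode are hypothetical names for illustration, not anything from a real codebase:

```java
// Hypothetical abstraction over the physical attachment. The production
// implementation talks to the real hardware; Espresso tests inject this fake.
public interface HardwareScanner {
    String readBarcode();
}

public class FakeHardwareScanner implements HardwareScanner {
    private final String cannedBarcode;

    public FakeHardwareScanner(String cannedBarcode) {
        this.cannedBarcode = cannedBarcode;
    }

    @Override
    public String readBarcode() {
        // Deterministic result so Espresso assertions are stable.
        return cannedBarcode;
    }
}
```

The activity under test receives the fake via whatever dependency-injection mechanism you use, so Espresso only ever exercises the UI, never the attachment.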
I'm testing an app that supports NFC payments; however, I was unable to find any solutions for automating these tests.
I'm using Appium as my open-source automation tool.
The goal: fully automate NFC payments with the desired variables.
It's not supported by Appium, nor by any other framework. However, I have a workaround for you: have your app's developers add a menu with several choices, where each choice represents a payment scenario from your design. Clicking one of them mocks the real NFC action. Make sure that this menu exists only in debug builds and not in release builds.
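A rough sketch of what that menu could look like inside an Activity, gated on BuildConfig.DEBUG so it never ships in release builds. PaymentProcessor, simulateNfcTap, and PaymentResult are illustrative names, not a real API:

```java
@Override
public boolean onCreateOptionsMenu(Menu menu) {
    if (BuildConfig.DEBUG) {
        // One entry per payment scenario from the test design.
        menu.add("NFC: approved").setOnMenuItemClickListener(item -> {
            paymentProcessor.simulateNfcTap(PaymentResult.APPROVED); // hypothetical hook
            return true;
        });
        menu.add("NFC: declined").setOnMenuItemClickListener(item -> {
            paymentProcessor.simulateNfcTap(PaymentResult.DECLINED); // hypothetical hook
            return true;
        });
    }
    return super.onCreateOptionsMenu(menu);
}
```

Appium can then drive this menu like any other UI element, which gets you the automation without touching real NFC hardware.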
I designed a pattern view using this library:

```
implementation 'com.andrognito.patternlockview:patternlockview:1.0.0'
```

But how do I apply the pattern lock to all apps on my Android device? How do I show my pattern screen whenever a locked app is opened on the device? Thanks in advance.
As I already wrote in my comment, what you want to do is not possible. You can modify the behavior of your own application, but not that of others.
The main reason is security. Imagine an app that mimics the log-in screen of BankXY and displays this screen right before the actual BankXY app's log-in screen is shown. This is called phishing, and it is one of the most popular attack scenarios out there. What you would like to implement is not itself a security risk, but the tools required to make it possible certainly are.
EDIT:
This was possible before Android 8.0, which significantly limited the ability to use Services for background execution. You could create a Service that checked the package names of running applications and, when a name was on a black-list, requested a PIN code, for example (see this question). However, you can no longer run background Services for an unlimited amount of time.
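For historical context only, the pre-Oreo approach looked roughly like this sketch: a polling Service asks UsageStatsManager for the foreground package, which needs the PACKAGE_USAGE_STATS special permission granted via system settings. The black-list entry and showLockScreen are hypothetical placeholders, and on Android 8.0+ the system will kill such a service:

```java
import android.app.Service;
import android.app.usage.UsageStats;
import android.app.usage.UsageStatsManager;
import android.content.Context;
import android.content.Intent;
import android.os.Handler;
import android.os.IBinder;
import java.util.*;

public class AppLockService extends Service {
    private static final long POLL_MS = 1000;
    private final Handler handler = new Handler();
    // Hypothetical black-list of packages to lock.
    private final Set<String> locked =
            new HashSet<>(Collections.singletonList("com.example.bank"));

    private final Runnable check = new Runnable() {
        @Override
        public void run() {
            String top = topPackage();
            if (top != null && locked.contains(top)) {
                showLockScreen(top); // hypothetical: launch the pattern screen
            }
            handler.postDelayed(this, POLL_MS);
        }
    };

    @Override
    public int onStartCommand(Intent intent, int flags, int startId) {
        handler.post(check);
        return START_STICKY;
    }

    // Most recently used package according to UsageStatsManager.
    private String topPackage() {
        UsageStatsManager usm =
                (UsageStatsManager) getSystemService(Context.USAGE_STATS_SERVICE);
        long now = System.currentTimeMillis();
        List<UsageStats> stats =
                usm.queryUsageStats(UsageStatsManager.INTERVAL_DAILY, now - 10_000, now);
        if (stats == null) return null;
        UsageStats recent = null;
        for (UsageStats s : stats) {
            if (recent == null || s.getLastTimeUsed() > recent.getLastTimeUsed()) {
                recent = s;
            }
        }
        return recent == null ? null : recent.getPackageName();
    }

    private void showLockScreen(String pkg) { /* hypothetical hook */ }

    @Override
    public IBinder onBind(Intent intent) {
        return null;
    }
}
```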
I am trying to build a monitoring app using Lua in Corona Labs. I was wondering how I would go about doing that.
Background: the app should notify the user when another app or an outside third party is accessing the camera/microphone.
I'm thinking of getting access to the device's resource manager and having it notify the app when the camera or microphone is being used.
I haven't come across anything like this before. Am I on the right path? Or am I way off? How would you go about doing this?
I'm not expecting a definite answer, just a nudge in the right direction :)
In general, this kind of behavior is not what Corona is designed for. You would likely have to use native tools, or Corona native if you want to use Corona for your UI, to access this kind of data.
There is an interesting discussion about how this can be done on Android here: How to check whether microphone is used by any background app. Hope it helps!
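The camera side of that discussion, for example, can be handled with the platform's CameraManager availability callbacks (API 21+); this would have to live in native Android code or Corona native, not plain Lua. A minimal sketch:

```java
import android.content.Context;
import android.hardware.camera2.CameraManager;
import android.util.Log;

public final class CameraWatcher {
    // Registers for system-wide camera open/close events. onCameraUnavailable
    // fires when any app (including ours) opens that camera.
    public static void watch(Context context) {
        CameraManager cm =
                (CameraManager) context.getSystemService(Context.CAMERA_SERVICE);
        cm.registerAvailabilityCallback(new CameraManager.AvailabilityCallback() {
            @Override
            public void onCameraUnavailable(String cameraId) {
                Log.i("CameraWatcher", "Camera in use: " + cameraId);
            }

            @Override
            public void onCameraAvailable(String cameraId) {
                Log.i("CameraWatcher", "Camera released: " + cameraId);
            }
        }, null); // null handler: deliver callbacks on the caller's looper
    }
}
```

The microphone side has a similar hook in AudioManager.registerAudioRecordingCallback (API 24+), as covered in the linked discussion.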
The main reason most people would make their Android project in JavaFX is to have the same codebase across different platforms (such as iOS, desktop, Android, maybe even the web using Bck2Brwsr/TeaVM/Doppio).
But my question is: is there any advantage in the JavaFX UI framework itself compared to the Android UI framework?
I have never written even a hello-world application for Android, but I intend to do so now. So I am wondering whether having the code in JavaFX is worth the effort, apart from the benefit of portability, when I could develop directly for Android.
This type of question might result in a subjective/opinionated answer but I think it is a good question so I will provide my assessment.
Having the same codebase across all those platforms is huge; do not dismiss this. I'm using Gluon Mobile to port aspects of the Deep Space Trajectory Explorer (DSTE) to Android and iOS. As you can see from the video, it's an extremely complex application. There's no way I would rewrite that natively for Android... it would be a no-go from a cost perspective.
Starting development from JavaFX makes it easier to build complex visuals, and I don't just mean traditional 2D GUI forms. Again, looking at the DSTE, you will see we use Canvas to do dense renderings, and JavaFX 3D along with the FXyz library to do 3D renders. These things are easy in JavaFX and, using Gluon, simply "just work" on Android/iOS. In fact, it only took about a day to get those aspects of the DSTE code base working on a Pixel C tablet, most of which was spent getting the Gradle build set up properly. Now imagine having to port 3D code from JavaFX to a native framework. I'm a 3D guy and I still wouldn't try it.
Testing is so much easier on the desktop than on a mobile device. This doesn't mean the testing is 100% done on desktop; sometimes something that works on desktop "doesn't work" on the mobile platform, and you have to tweak accordingly. However, you can save a LOT of time standing up the application in JavaFX knowing that 90% of it will work the same on your mobile device.
A word of advice though... remember that a desktop application is NOT a mobile application. You will be tempted to just "port" your desktop app to your device; I was my first time. You can run into issues where the interfaces and layouts you designed for a desktop "work" on the mobile device but are not appropriate there, so usability goes down. Start slow when you port. Think about which aspects of your desktop workflow should be mobilized, and only port the things that absolutely belong in a mobile workflow. Save yourself some headaches.
I am trying to develop a Java app that will run on a Raspberry Pi. The Raspberry Pi will be mounted on a vehicle, and I will know my position through a GPS device. To solve this, I've been thinking of a solution like this:
Use a WebView in my JavaFX app and use your JavaScript API to build a real-time turn-by-turn navigation app. However, I've seen that your web API is not as complete as the mobile platform APIs. My question is: is what I am trying to do feasible using your APIs? If so, could you please give me a brief description of how to do it?
Thanks!
The JavaScript API is not a turn-by-turn API; that is currently a bit too heavy for JavaScript to handle (it could be feasible, but it's not commercially attractive right now).
In theory you could integrate directly with the C++ code of the SDK, as that should be able to run on Linux (it depends on the gcc version used and the OpenGL support offered; send an email to dev#telenav.com with your scenario and they will advise you).
Or, if you can run Android on the device, you can use the Android SDK directly.
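For reference, the WebView half of what the question describes is straightforward on the JavaFX side. A minimal sketch, with a placeholder URL standing in for the map page (not a real endpoint), and updatePosition a hypothetical function the map page would define:

```java
import javafx.application.Application;
import javafx.scene.Scene;
import javafx.scene.web.WebView;
import javafx.stage.Stage;

public class NavigationShell extends Application {
    @Override
    public void start(Stage stage) {
        WebView webView = new WebView();
        // Placeholder URL: your JavaScript map page would go here.
        webView.getEngine().load("https://example.com/map.html");

        // GPS fixes from the device would be pushed into the page, e.g.:
        // webView.getEngine().executeScript("updatePosition(45.0, 25.0)");

        stage.setScene(new Scene(webView, 800, 480));
        stage.show();
    }

    public static void main(String[] args) {
        launch(args);
    }
}
```

The hard part, as the answer notes, is not the embedding but the turn-by-turn logic itself, which the JavaScript API does not provide.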