Using CCTV SDK for Windows on Android? - java

A DVR I bought includes an SDK that exposes all of the DVR's programming interfaces.
The example I found in the SDK is for Windows only. Is there a way I can implement these methods on Android using the NDK?
The SDK consists of .cpp files and some .h files. It also links against an external .dll file whose source I cannot read, so I cannot port that part to Android.
EDIT
I'm trying to implement these methods for a live-view application. I managed to get the stream over RTSP with low latency from a different source, but I also need the playback function that is available in the SDK.
EDIT 2
By "playback" I mean playing the recorded videos saved on the DVR's hard disk.
Some people I asked confused "playback" with "livestream", so this might also confuse some readers of this question.
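For reference, the kind of NDK setup I have in mind is compiling the SDK's .cpp/.h sources into a shared library and calling it from Java through JNI. A rough sketch of the Java side is below; the library name and method signatures are made up, and the real blocker remains that the vendor only ships a Windows .dll, so the native side would need an Android/ARM build of the SDK:

    // Hypothetical JNI bridge; "dvrsdk" and the native method names are placeholders.
    public class DvrSdkBridge {

        static {
            // Would load libdvrsdk.so built with the NDK from the SDK's .cpp/.h sources.
            System.loadLibrary("dvrsdk");
        }

        // Each native method would be implemented in C++ and forward to the SDK call.
        public native long nativeLogin(String host, int port, String user, String password);

        public native boolean nativeStartPlayback(long sessionHandle, int channel,
                                                  long startTimeMillis, long endTimeMillis);

        public native void nativeStopPlayback(long sessionHandle);

        public native void nativeLogout(long sessionHandle);
    }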

Related

How can I implement USB camera functionality in my application?

I am creating an Android tablet application in which I want to use a USB camera as the default camera when the application starts. I want to capture pictures and save them in JPEG or PNG format. I could not find any helpful resources on the web. How can I implement such functionality? Any help would be appreciated.
The solutions are all complex, each with advantages and disadvantages. Information on USB and UVC is freely available from the USB-IF web site, although it runs to many hundreds of pages of detail. The main issue is that isochronous transfers are missing from the Android framework, although descriptor retrieval, control, interrupt and bulk transfers are implemented. Hence most of the USB access can be done in Java or Kotlin, but the streaming side is a real pain. Using method one below and a Logitech C615 webcam, I obtained 640x480 at 30 fps on a Lenovo Tab10 running Android 6, and 1920x1080 at 30 fps on a Lenovo IdeaPad 520 Miix running Android 9 x86. The Tab10 appears to run at USB v1 speeds although it has a USB v2 micro-B socket. The Miix has a type A USB socket and does not need an OTG converter. I know of three methods:
1. Use Libusb. This requires a separate compilation and build in Android Studio into a shared library. Write C++ code to set up the webcam, transfer packets and tear it down. Write Java code for the user interface, MJPEG decompression, the preview display and writing JPEGs to storage. Connect the two via JNI (a sketch of the Java-side device setup follows this list). The C++ and Java run in separate threads but are non-blocking. The inefficiency is in memcpy-ing from a native image frame array to a JNI array and then freeing it for garbage collection after each frame. Android apps run in a container, which means a special startup for Libusb, and some functions fail, probably because of SELinux violations.
2. Use JNA to wrap the libc library. JNA is obtained from a repository and does not require a separate build. libc is already available in Android Studio. All the code is Java, but IOCTL is used to control the USB file streams and the small stuff is really sweated. Libusb does much of this low-level work in design choice 1.
3. Use the external camera option with the Camera2 API. None of my devices support an external camera, not even my Samsung Android 13 phone. I suspect this is meant for systems integrators building custom versions of Android with the appropriate settings, as documented by Google, implemented on SBCs for point-of-sale etc.
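As a rough sketch of the Java-side device setup for method 1 (the UsbCameraNative class, its library name and the native entry point are hypothetical; the corresponding C++ would hand the file descriptor to Libusb, for example via libusb_wrap_sys_device, instead of letting Libusb enumerate devices itself):

    // Sketch only: Java-side device discovery and handoff for method 1.
    import android.app.PendingIntent;
    import android.content.Context;
    import android.content.Intent;
    import android.hardware.usb.UsbConstants;
    import android.hardware.usb.UsbDevice;
    import android.hardware.usb.UsbDeviceConnection;
    import android.hardware.usb.UsbManager;

    public class UvcAttachHelper {
        private static final String ACTION_USB_PERMISSION = "com.example.USB_PERMISSION";

        public void attachFirstUvcCamera(Context context) {
            UsbManager usbManager = (UsbManager) context.getSystemService(Context.USB_SERVICE);
            for (UsbDevice device : usbManager.getDeviceList().values()) {
                if (!isVideoDevice(device)) continue;
                if (!usbManager.hasPermission(device)) {
                    // The grant result arrives later via a broadcast; the intent must be explicit
                    // and the PendingIntent mutable on recent Android versions.
                    Intent intent = new Intent(ACTION_USB_PERMISSION).setPackage(context.getPackageName());
                    PendingIntent pi = PendingIntent.getBroadcast(context, 0, intent, PendingIntent.FLAG_MUTABLE);
                    usbManager.requestPermission(device, pi);
                    return;
                }
                UsbDeviceConnection connection = usbManager.openDevice(device);
                if (connection != null) {
                    // Hand the already-open descriptor to native code; with Libusb this avoids
                    // device enumeration, which fails inside the app container.
                    UsbCameraNative.nativeAttach(connection.getFileDescriptor());
                }
                return;
            }
        }

        private boolean isVideoDevice(UsbDevice device) {
            if (device.getDeviceClass() == UsbConstants.USB_CLASS_VIDEO) return true;
            for (int i = 0; i < device.getInterfaceCount(); i++) {
                if (device.getInterface(i).getInterfaceClass() == UsbConstants.USB_CLASS_VIDEO) return true;
            }
            return false;
        }
    }

    // Hypothetical JNI entry point; the C++ side would wrap the descriptor with Libusb.
    class UsbCameraNative {
        static { System.loadLibrary("uvccamera"); }
        static native void nativeAttach(int fileDescriptor);
    }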

Is there a way for an Android APK to download an executable Java module and run it?

I've got an app in Google Play, as well as on Amazon Appstore, Huawei Marketplace, Samsung Apps and one more e-shop in China.
It's a puzzle game. I regularly implement new types of puzzles. Each time, I then have to make them available to the users by creating a new version of the app and publishing it to those 5 shops.
Implementation of a single puzzle consists of 2 Java classes, a binary file and a few pngs (icons).
I am thinking that maybe it would be possible to write my app in such a way that it could instead download such 'puzzle modules' from my server. The APK would then need to be able to download executable Java code from the Internet and somehow 'adjoin' it as a module to itself. Is that possible?
It is possible, but it is also restricted in some stores, Google Play for sure (it is also possible on iOS and likewise restricted in the App Store).
Letting developers add executable code without store review and without informing users is simply too unsafe, so the store policies forbid it.
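For completeness, the mechanism those policies restrict is dynamic code loading, usually via DexClassLoader. A minimal sketch is below; it assumes the puzzle classes are already packaged into a .dex or .jar file and downloaded into app-private storage, and the file, directory and class names are placeholders:

    // Sketch of dynamic class loading with DexClassLoader.
    import dalvik.system.DexClassLoader;
    import java.io.File;

    public class PuzzleModuleLoader {

        public Object loadPuzzle(File downloadedDex, File codeCacheDir, ClassLoader parent)
                throws ReflectiveOperationException {
            // optimizedDirectory is ignored on API 26+, but must be app-private on older versions.
            DexClassLoader loader = new DexClassLoader(
                    downloadedDex.getAbsolutePath(),
                    codeCacheDir.getAbsolutePath(),
                    null,                 // no native library path
                    parent);

            // Instantiate a puzzle class shipped inside the downloaded module.
            Class<?> puzzleClass = loader.loadClass("com.example.puzzles.NewPuzzle");
            return puzzleClass.getDeclaredConstructor().newInstance();
        }
    }

Technically this works, but Google Play's policy on code downloaded outside of Play is the blocking issue rather than the API.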

Image Recognition using OpenCV and Android

I want to work on making an Android app by integrating OpenCV with Android Studio. I have a set of 2D hardcopy card images that I want to save as templates within the app. Then, using the app, when I point my camera at any of the cards, the app should search the directory which contains the templates, look for a match and provide feedback if a match is found. If anyone can guide me on how to achieve this, it will be highly appreciated.
Also, if not OpenCV, then which SDK or tool should be preferred ?
The question is a general one, so the answer will be general as well, and will make assumptions about what you'd like to accomplish with your application.
Android Studio with OpenCV is probably a reasonable stack to use.
Presuming the library has more than a trivial number of images, you'll probably want to extract matching information for each image in an offline process (on your development machine). For instance, you would configure your development machine with a SQLite database driver and OpenCV, and build a program that extracts feature points and saves them to a SQLite database. That database file can then be bundled in the Android APK's assets, so it is ready on the application's first run. The application would take frames from the camera and compare them with each item in the database, looking for matches.
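As a rough sketch of that extraction-and-matching idea with OpenCV's Java bindings (ORB is just one possible feature detector, and the paths and distance threshold below are placeholders), the same detectAndCompute call would run offline for the templates and at runtime on camera frames:

    // Sketch: extract ORB keypoints/descriptors from a template image and match
    // them against a query image.
    import org.opencv.core.Core;
    import org.opencv.core.DMatch;
    import org.opencv.core.Mat;
    import org.opencv.core.MatOfDMatch;
    import org.opencv.core.MatOfKeyPoint;
    import org.opencv.features2d.DescriptorMatcher;
    import org.opencv.features2d.ORB;
    import org.opencv.imgcodecs.Imgcodecs;

    public class TemplateMatcherSketch {

        static { System.loadLibrary(Core.NATIVE_LIBRARY_NAME); }

        public static void main(String[] args) {
            ORB orb = ORB.create();

            // Offline step: compute descriptors for one template card image.
            Mat template = Imgcodecs.imread("cards/card01.png", Imgcodecs.IMREAD_GRAYSCALE);
            MatOfKeyPoint templateKeypoints = new MatOfKeyPoint();
            Mat templateDescriptors = new Mat();
            orb.detectAndCompute(template, new Mat(), templateKeypoints, templateDescriptors);

            // Runtime step: the same call would run on a camera frame.
            Mat query = Imgcodecs.imread("frames/frame.png", Imgcodecs.IMREAD_GRAYSCALE);
            MatOfKeyPoint queryKeypoints = new MatOfKeyPoint();
            Mat queryDescriptors = new Mat();
            orb.detectAndCompute(query, new Mat(), queryKeypoints, queryDescriptors);

            // Hamming distance suits ORB's binary descriptors.
            DescriptorMatcher matcher = DescriptorMatcher.create(DescriptorMatcher.BRUTEFORCE_HAMMING);
            MatOfDMatch matches = new MatOfDMatch();
            matcher.match(queryDescriptors, templateDescriptors, matches);

            // Naive decision rule: count matches below a distance threshold.
            int good = 0;
            for (DMatch m : matches.toList()) {
                if (m.distance < 40) good++;
            }
            System.out.println("good matches: " + good);
        }
    }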

How to get access to computer files for development through Android Emulator

My problem is that I'm using parse.com to upload files, and I can't simply create a java project to do so, because it requires an Android context:
Parse.initialize(this, "", "");
I was wondering if there was a workaround to access my computer's files through an Android emulator. I know the ACTUAL app technically wouldn't have any idea of my files, but the emulator is still running on my computer. Or is the emulator completely independent, so that it near-perfectly imitates the real thing and this would basically be impossible?
If the answer is no, what can I do aside from getting a phone and putting the files onto the phone?
You can use the Parse.com REST API to upload from anywhere.
The Parse API for Android is just a wrapper for this REST API.
There are also some existing pure-Java wrappers for the REST API by third parties. See the Java section of the Parse API Library page. Almonds in particular looks like it gives you what you want.
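For illustration, a plain-Java upload against the Parse REST file endpoint, as documented at the time, would look roughly like the sketch below; the application ID, REST API key and file path are placeholders:

    // Sketch: uploading a file from plain Java via the Parse REST API,
    // no Android Context required.
    import java.io.OutputStream;
    import java.net.HttpURLConnection;
    import java.net.URL;
    import java.nio.file.Files;
    import java.nio.file.Paths;

    public class ParseRestUpload {

        public static void main(String[] args) throws Exception {
            byte[] data = Files.readAllBytes(Paths.get("/home/me/picture.jpg"));

            URL url = new URL("https://api.parse.com/1/files/picture.jpg");
            HttpURLConnection conn = (HttpURLConnection) url.openConnection();
            conn.setRequestMethod("POST");
            conn.setRequestProperty("X-Parse-Application-Id", "YOUR_APP_ID");
            conn.setRequestProperty("X-Parse-REST-API-Key", "YOUR_REST_API_KEY");
            conn.setRequestProperty("Content-Type", "image/jpeg");
            conn.setDoOutput(true);

            try (OutputStream out = conn.getOutputStream()) {
                out.write(data);
            }

            // A successful upload returns 201 Created with the stored file's name and URL.
            System.out.println("HTTP " + conn.getResponseCode());
        }
    }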

Google Drive SDK and PowerPoint

I am currently developing a small Android app using the Google Drive SDK and Google Docs' embedded player, which plays through PowerPoint files in a slide-show manner. Since there's no direct way to tell via the SDK when one PowerPoint ends and another begins, I was wondering if there was any way for me to retrieve the number of slides from a PowerPoint file. With the number of slides, I'll attempt to use the interval between slide changes to calculate the time each PowerPoint takes to play, and then use that to switch between files. I know .pptx files carry the number of slides in their metadata (not so sure about .ppt), but I'm not sure how to go about reading it. I've looked at the Google Drive SDK's support for reading metadata, but it seems rather limited in what you can actually read. I've looked at Apache POI, but it doesn't seem to be Android compatible. Could anyone point me in the right direction with this?
Thanks :).
Sorry, this is not possible using the Google Drive SDK. You could easily download the pptx, unzip it (it is just zipped XML) and extract the required metadata.
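To make that concrete, a minimal Java sketch of the unzip-and-read step is below. It assumes the .pptx has already been downloaded to local storage and reads the Slides element from the docProps/app.xml part; this works only for .pptx, since the older binary .ppt is not a zip archive:

    // Sketch: read the slide count from a downloaded .pptx by opening the
    // docProps/app.xml part (extended properties) and reading its <Slides> element.
    import java.io.InputStream;
    import java.util.zip.ZipEntry;
    import java.util.zip.ZipFile;
    import javax.xml.parsers.DocumentBuilderFactory;
    import org.w3c.dom.Document;
    import org.w3c.dom.NodeList;

    public class PptxSlideCount {

        public static int countSlides(String pptxPath) throws Exception {
            try (ZipFile zip = new ZipFile(pptxPath)) {
                ZipEntry appXml = zip.getEntry("docProps/app.xml");
                if (appXml == null) {
                    return -1; // Part not present; older .ppt files are not zip-based at all.
                }
                try (InputStream in = zip.getInputStream(appXml)) {
                    DocumentBuilderFactory dbf = DocumentBuilderFactory.newInstance();
                    dbf.setNamespaceAware(true);
                    Document doc = dbf.newDocumentBuilder().parse(in);
                    NodeList slides = doc.getElementsByTagNameNS("*", "Slides");
                    if (slides.getLength() > 0) {
                        return Integer.parseInt(slides.item(0).getTextContent().trim());
                    }
                }
            }
            return -1;
        }
    }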
