We wanted to use the Amazon Kinesis service to stream video from a producer to a consumer. The docs mention several producer libraries: Java, Android, C++, and C. Since we want to stream from an Android device to a Kinesis stream, we looked at the Android producer library, but from the sample code we learned that it requires another Amazon service, Cognito, to create a user pool, identity pool, and so on.
What we actually want is to stream from the Android device using only Kinesis and no other Amazon service. Since Android supports Java, can the Java producer library be used on Android, given that it doesn't depend on any other Amazon service? Or could the C++ producer library be used on Android instead?
We searched the web but found no posts related to this, so any help or reference would be really appreciated.
We are in the process of integrating .NET applications, deployed on VMs in on-premises data centers, with a Pub/Sub topic on Google Cloud Platform. I have a scenario that I am currently unable to decide on and need help finding the right direction. Below is a brief description of the use case; please have a look and share your thoughts.
Currently a .NET application deployed on a Windows VM in the client's legacy on-prem data center publishes XML messages to a Tibco EMS topic on an EMS server deployed in the same data center. A few Java applications deployed on different VMs subscribe to this Tibco topic, pull the messages, and process them. This is the legacy flow.
As part of modernization, GCP is coming into the mix. The XML messages that the on-prem .NET application publishes to the Tibco topic should now also be pushed to a Pub/Sub topic on GCP. A Java microservice deployed on GCP infrastructure would subscribe to this topic and consume the messages.
The problem I am facing is how to go about this integration between the on-prem and cloud applications. I have thought about two options:
Option 1: Copy the messages directly from the legacy Tibco topic (to which the .NET app publishes) to the Pub/Sub topic in GCP. I am not a Tibco expert and am not sure whether this is supported. I found the link below but am not sure it suits my use case. The client also wants to move away from Tibco, and I don't know whether the legacy Tibco EMS in the data centers supports the connector feature linked below.
https://www.tibco.com/connected/google-cloud-pub/sub
Option 2: Change the .NET code base so that, at the point where it publishes a message to the Tibco topic, additional code also publishes it directly to the Pub/Sub topic in GCP. I am not sure if this is OK, since the .NET application is on a legacy on-prem VM and Pub/Sub is in the cloud. I am not familiar with .NET either, but I found that there is a .NET Google client library that can be added to the .NET code to achieve this flow. Also, is Google Pub/Sub the right integration tool here, or should something else be used to connect these two systems together?
This is as far as I have been able to get. Could you let me know whether the two approaches above are sound, which one is the right approach, or whether there is an issue with them? Any other solution apart from the above would also help me move forward.
Thanks, Vikeng21
For option 1, the mentioned connector is in fact a TIBCO BusinessWorks plugin. The approach would therefore be to build a kind of GCP Cloud Messaging / TIBCO EMS gateway using TIBCO BusinessWorks. It would then be possible to run this solution on premises or in the cloud (using the TIBCO TCI offering).
The advantage of this approach is that it would be transparent to your applications: local .NET applications and cloud applications would receive exactly the same messages.
I think EmmanuelM's answer covers option 1; it would probably be the easiest and most transparent approach.
Regarding option 2: I think this is a valid approach as well, although it requires modifying your application code to publish messages to Pub/Sub alongside Tibco. I'm no Tibco expert either, but when it comes to Pub/Sub, as you've mentioned, there is a .NET client library you can use in your application to easily publish and consume Pub/Sub messages. I see that you've mentioned:
I am not sure if this is OK, since the .NET application is on a legacy on-prem VM and Pub/Sub is in the cloud
It is completely OK: the Cloud Pub/Sub service is used through its API, and you can consume that API regardless of whether you call it from on-prem or from the cloud.
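Since the consumer in this flow is a Java microservice, here is a minimal, hedged sketch of the subscribing side with the google-cloud-pubsub Java client (the project and subscription names are hypothetical placeholders); the .NET client library exposes equivalent publish calls for the on-prem side:

    import com.google.cloud.pubsub.v1.AckReplyConsumer;
    import com.google.cloud.pubsub.v1.MessageReceiver;
    import com.google.cloud.pubsub.v1.Subscriber;
    import com.google.pubsub.v1.ProjectSubscriptionName;
    import com.google.pubsub.v1.PubsubMessage;

    public class XmlMessageSubscriber {
        public static void main(String[] args) {
            // Hypothetical project and subscription names.
            ProjectSubscriptionName subscription =
                    ProjectSubscriptionName.of("my-gcp-project", "xml-messages-sub");
            // Called on a background thread for each delivered message.
            MessageReceiver receiver = (PubsubMessage message, AckReplyConsumer consumer) -> {
                String xml = message.getData().toStringUtf8();
                System.out.println("Received: " + xml);
                consumer.ack(); // acknowledge so Pub/Sub does not redeliver
            };
            Subscriber subscriber = Subscriber.newBuilder(subscription, receiver).build();
            subscriber.startAsync().awaitRunning();
            subscriber.awaitTerminated(); // block while messages keep arriving
        }
    }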
The caveat with this approach is that I'm not sure how consistency could be kept between Tibco and Pub/Sub. I assume this is why the first approach would be easier and more transparent, as this is probably what the integration plugin takes care of. Without it, some custom application logic would probably be required to guarantee that messages are successfully published both to Tibco and to Pub/Sub.
Having said that, I would really recommend getting in contact with Google Cloud Sales to describe your use case and business requirements in detail and to get personalized assistance with your migration plan.
Could anybody give me a hint on how to add an item to a Google Cloud Tasks queue in Java?
https://cloud.google.com/tasks/docs/
Google has a very good description of their v1 queues at https://cloud.google.com/appengine/docs/standard/java/taskqueue/pull/creating-tasks but nothing similar for the new beta Google Cloud Tasks.
I simply need to add an item to a queue with a specific tag and later pull it from the queue using the oldest_tag() function.
Does anybody have experience with Google Cloud Tasks?
The v1 documentation page you reference is specific to pull queues, which aren't (yet, at least) supported by Cloud Tasks.
From The App Engine SDK versus the Cloud Tasks API (emphasis mine):
Some features are not yet available via the Cloud Tasks API:
Pull queues
Adding tasks asynchronously
Transactional tasks
Deferred tasks
Namespacing
Local emulator
There are two separate ways to access the Task service:
Using the App Engine SDK (App Engine standard first generation runtimes)
Using the Cloud Tasks API, which is in beta (everybody else, particularly second generation runtimes like Python 3.7 or App Engine flex). There are REST- and gRPC-based APIs available. Currently the Cloud Tasks API supports only push-type queues; the Cloud Pub/Sub API can be used for many pull-queue use cases.
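So the tag/oldest_tag() pull pattern from the question has no direct Cloud Tasks equivalent. As a rough, hedged sketch, creating a push task from Java with the google-cloud-tasks client library could look like this (shown against the v2 surface; the beta surfaces differ slightly, and the project, location, queue, and handler path are all hypothetical placeholders):

    import com.google.cloud.tasks.v2.AppEngineHttpRequest;
    import com.google.cloud.tasks.v2.CloudTasksClient;
    import com.google.cloud.tasks.v2.HttpMethod;
    import com.google.cloud.tasks.v2.QueueName;
    import com.google.cloud.tasks.v2.Task;
    import com.google.protobuf.ByteString;

    public class CreatePushTask {
        public static void main(String[] args) throws Exception {
            // Hypothetical project, location, and queue names.
            String queue = QueueName.of("my-project", "us-central1", "my-queue").toString();
            try (CloudTasksClient client = CloudTasksClient.create()) {
                Task task = Task.newBuilder()
                        .setAppEngineHttpRequest(AppEngineHttpRequest.newBuilder()
                                .setRelativeUri("/worker")       // handler that receives the push
                                .setHttpMethod(HttpMethod.POST)
                                .setBody(ByteString.copyFromUtf8("payload"))
                                .build())
                        .build();
                Task created = client.createTask(queue, task);
                System.out.println("Created task " + created.getName());
            }
        }
    }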
I am new to Amazon Kinesis, so this might be a very basic question, but I need help with it.
I have a use case where I need to pull data from Amazon Kinesis into my web application, which is written in Java. I need to establish a connection between Kinesis and Java so that I can take data from Kinesis, run some analytics on it, and, if the data is modified, put it back into Kinesis from the Java application.
My Java application is not on the Amazon cloud; it is on my private cloud. How do I accomplish the above? Any help is appreciated.
First of all, the Amazon Kinesis API endpoints are on public IP addresses, so you don't need EC2 instances inside the AWS environment to access Kinesis.
To read data from Kinesis, you can use Amazon's own Kinesis Client Library (KCL).
https://github.com/awslabs/amazon-kinesis-client
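With the KCL you implement a record processor that receives batches of records per shard. A minimal, hedged sketch using the KCL 1.x interfaces (analytics logic elided) might look like this:

    import com.amazonaws.services.kinesis.clientlibrary.interfaces.IRecordProcessor;
    import com.amazonaws.services.kinesis.clientlibrary.interfaces.IRecordProcessorCheckpointer;
    import com.amazonaws.services.kinesis.clientlibrary.lib.worker.ShutdownReason;
    import com.amazonaws.services.kinesis.model.Record;
    import java.nio.charset.StandardCharsets;
    import java.util.List;

    public class AnalyticsRecordProcessor implements IRecordProcessor {
        @Override
        public void initialize(String shardId) {
            // One processor instance is created per shard.
        }

        @Override
        public void processRecords(List<Record> records,
                                   IRecordProcessorCheckpointer checkpointer) {
            for (Record record : records) {
                String data = StandardCharsets.UTF_8.decode(record.getData()).toString();
                // ... run your analytics on `data` here ...
            }
            try {
                checkpointer.checkpoint(); // mark this batch as processed
            } catch (Exception e) {
                // Real code should handle shutdown/throttling exceptions here.
            }
        }

        @Override
        public void shutdown(IRecordProcessorCheckpointer checkpointer, ShutdownReason reason) {
            // Checkpoint on ShutdownReason.TERMINATE if desired.
        }
    }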
On the awslabs GitHub there are also sample applications written in Java.
https://github.com/awslabs/amazon-kinesis-connectors/tree/master/src/main/samples
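The KCL covers the consuming side; for putting modified data back into the stream, the plain AWS SDK for Java is enough. A minimal sketch, assuming a hypothetical stream named my-stream in region us-east-1:

    import com.amazonaws.services.kinesis.AmazonKinesis;
    import com.amazonaws.services.kinesis.AmazonKinesisClientBuilder;
    import com.amazonaws.services.kinesis.model.PutRecordRequest;
    import com.amazonaws.services.kinesis.model.PutRecordResult;
    import java.nio.ByteBuffer;

    public class KinesisWriteBack {
        public static void main(String[] args) {
            // Credentials come from the default provider chain (env vars,
            // ~/.aws/credentials, ...), which also works from a private cloud.
            AmazonKinesis kinesis = AmazonKinesisClientBuilder.standard()
                    .withRegion("us-east-1") // hypothetical region
                    .build();
            PutRecordRequest request = new PutRecordRequest()
                    .withStreamName("my-stream")           // hypothetical stream name
                    .withPartitionKey("analytics-result")  // determines the target shard
                    .withData(ByteBuffer.wrap("modified payload".getBytes()));
            PutRecordResult result = kinesis.putRecord(request);
            System.out.println("Wrote to shard " + result.getShardId());
        }
    }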
About your architecture: if you want to process raw data into meaningful extracts, I recommend doing some ETL tasks (i.e. post-processing) and writing your results somewhere else (e.g. an RDBMS). On the view layer (your web app) you can then display the resulting output in any format you like by reading from your database.
I'm currently using Splunk to index generated log files from a Java application. I have a Splunk Enterprise instance running against development data (on a local server), and currently the log data is pushed to Splunk via their REST API (using their Java SDK).
However, this Java app will eventually run against production data, live on AWS EC2 instances. I'm wondering whether there are any advantages to ditching the REST API and deploying Splunk Universal Forwarders on these EC2 instances.
Would there be any advantages? When is it appropriate to use forwarders instead of the REST API?
From what I can gather, forwarders do well at scale, so perhaps that is an advantage? I've searched around but haven't found any clear-cut comparison between the two, so I was hoping someone here would have a better idea.
The Splunk REST API is really meant for integration with external apps and for requests that manage already-indexed data.
Any serious incoming volume should be handled by the Universal/Heavy Forwarders, as they were purpose-built for that function (and are thus orders of magnitude more efficient).
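For comparison, the REST API path you are using now looks roughly like this with the Splunk Java SDK (host, port, credentials, and index name are placeholders). Each submit() call is a separate HTTP request to the management port, which is exactly the per-event overhead a forwarder pipeline avoids at scale:

    import com.splunk.Index;
    import com.splunk.Service;
    import com.splunk.ServiceArgs;

    public class SplunkRestSubmit {
        public static void main(String[] args) {
            // Placeholder connection details for a local Splunk Enterprise instance.
            ServiceArgs loginArgs = new ServiceArgs();
            loginArgs.setUsername("admin");
            loginArgs.setPassword("changeme");
            loginArgs.setHost("localhost");
            loginArgs.setPort(8089); // Splunk management (REST) port
            Service service = Service.connect(loginArgs);

            // Each submit() call is one HTTP request to the REST endpoint.
            Index index = service.getIndexes().get("main");
            index.submit("2019-01-01 12:00:00 level=INFO msg=\"app event\"");
        }
    }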
I want to store my SD card contents in Google Docs (or any Google cloud service) and retrieve them from there on an Android device.
Can anyone tell me how to do that?
I guess the best place to go is http://code.google.com/apis/documents/
You will find examples there on how to programmatically access Google Docs.
You can use the HTTP library available in Android to accomplish cloud communication from an Android app. Android ships the Apache HTTP library in its runtime, so you can rely on that. I have written a cloud-based app; I hope this little blog can help you.
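For example, a bare-bones HTTP fetch using the Apache HttpClient classes bundled with Android at the time could look like the sketch below (the URL is whatever cloud endpoint you are calling; note that recent Android releases have since removed these classes):

    import org.apache.http.HttpResponse;
    import org.apache.http.client.HttpClient;
    import org.apache.http.client.methods.HttpGet;
    import org.apache.http.impl.client.DefaultHttpClient;
    import org.apache.http.util.EntityUtils;

    public class CloudHttpFetch {
        // Fetches the body of the given URL as a String.
        public static String fetch(String url) throws Exception {
            HttpClient client = new DefaultHttpClient();
            HttpResponse response = client.execute(new HttpGet(url));
            return EntityUtils.toString(response.getEntity());
        }
    }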
This might be more than you need, but real cloud storage at Google Storage for Developers can be easily and securely accessed using jetS3t (pronounced "jetset"). The icing on the cake is that jetS3t is also compatible with Amazon S3 cloud storage!