Could anybody give me a hint on how to add an item to a Google Cloud Tasks queue in Java?
https://cloud.google.com/tasks/docs/
Google has a very smooth description of their v1 queues at https://cloud.google.com/appengine/docs/standard/java/taskqueue/pull/creating-tasks, but nothing similar for the new beta Google Cloud Tasks.
I simply need to add an item to a queue with a specific tag and later pull it from the queue using the oldest_tag() function.
Does anybody have experience with Google Cloud Tasks?
The v1 documentation page you reference is specific to pull queues, which aren't (yet, at least) supported by Cloud Tasks.
From "The App Engine SDK versus the Cloud Tasks API" (emphasis mine):
Some features are not yet available via the Cloud Tasks API:
Pull queues
Adding tasks asynchronously
Transactional tasks
Deferred tasks
Namespacing
Local emulator
There are two separate ways to access the Task service:
Using the App Engine SDK (App Engine standard first generation runtimes)
Using the Cloud Tasks API, which is in beta (everybody else, particularly second generation runtimes like Python 3.7 or App Engine flex). REST- and gRPC-based APIs are available. Currently the Cloud Tasks API supports only push-type queues; the Cloud Pub/Sub API can be used for many pull queue use cases. A sketch of adding a task this way follows below.
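Since the question is specifically about adding an item to a queue in Java, here is a minimal sketch using the google-cloud-tasks client library (v2beta3 at the time of writing), assuming a push queue and hypothetical project, location, and queue IDs. Note there is no pull/oldest_tag() equivalent:

    import com.google.cloud.tasks.v2beta3.AppEngineHttpRequest;
    import com.google.cloud.tasks.v2beta3.CloudTasksClient;
    import com.google.cloud.tasks.v2beta3.HttpMethod;
    import com.google.cloud.tasks.v2beta3.QueueName;
    import com.google.cloud.tasks.v2beta3.Task;
    import com.google.protobuf.ByteString;

    public class CreateTaskSample {
      public static void main(String[] args) throws Exception {
        String projectId = "my-project";    // assumed project id
        String locationId = "us-central1";  // assumed queue location
        String queueId = "my-queue";        // assumed queue name

        try (CloudTasksClient client = CloudTasksClient.create()) {
          // Build a push task that POSTs its payload to an App Engine handler.
          Task task = Task.newBuilder()
              .setAppEngineHttpRequest(
                  AppEngineHttpRequest.newBuilder()
                      .setRelativeUri("/worker")  // hypothetical handler path
                      .setHttpMethod(HttpMethod.POST)
                      .setBody(ByteString.copyFromUtf8("task payload"))
                      .build())
              .build();
          Task created = client.createTask(QueueName.of(projectId, locationId, queueId), task);
          System.out.println("Created task: " + created.getName());
        }
      }
    }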
My use case requires publishing metering data to AWS hourly. I am a seller on AWS Marketplace, where buyers can subscribe to my SaaS application.
Currently we handle this with a TaskTimer class that calls the meter-usage command on AWS to publish data hourly. Now I want to convert this timer class to a REST API where I can leverage BatchMeterUsage to publish data to AWS, since BatchMeterUsage allows me to send 25 usage records in a single request instead of having to call MeterUsage 25 times.
Any suggestions are welcome. A Java-based solution would be great. The complete REST flow is not needed; the controller and services are in place, I just need to integrate a BatchMeterUsage utility into them. Can anyone help with that?
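Not a full answer, but a minimal sketch of the BatchMeterUsage call with the AWS SDK for Java; the product code, customer identifier, and dimension name are hypothetical placeholders:

    import com.amazonaws.services.marketplacemetering.AWSMarketplaceMetering;
    import com.amazonaws.services.marketplacemetering.AWSMarketplaceMeteringClientBuilder;
    import com.amazonaws.services.marketplacemetering.model.BatchMeterUsageRequest;
    import com.amazonaws.services.marketplacemetering.model.BatchMeterUsageResult;
    import com.amazonaws.services.marketplacemetering.model.UsageRecord;
    import java.util.Collections;
    import java.util.Date;
    import java.util.List;

    public class BatchMeterUsageSample {
      // Sends up to 25 usage records to AWS Marketplace in a single call.
      public static void publish(List<UsageRecord> records) {
        AWSMarketplaceMetering client = AWSMarketplaceMeteringClientBuilder.defaultClient();
        BatchMeterUsageRequest request = new BatchMeterUsageRequest()
            .withProductCode("my-product-code")  // hypothetical product code
            .withUsageRecords(records);          // at most 25 records per request
        BatchMeterUsageResult result = client.batchMeterUsage(request);
        // Records the service could not process should be retried later.
        System.out.println("Unprocessed: " + result.getUnprocessedRecords());
      }

      public static void main(String[] args) {
        UsageRecord record = new UsageRecord()
            .withCustomerIdentifier("customer-123")  // hypothetical subscriber id
            .withDimension("hourly_usage")           // hypothetical metering dimension
            .withQuantity(1)
            .withTimestamp(new Date());
        publish(Collections.singletonList(record));
      }
    }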
We wanted to use the Amazon Kinesis service to stream video from a producer to a consumer. The docs mention the different producer libraries available: there are producer libraries for Java, Android, C++, and C. We wanted to stream from an Android device to a Kinesis stream, so we had a look at the Android producer library, but from the sample code we learned that it requires another Amazon service, Cognito, to create a user pool, identity pool, and so on.
What we wanted was to stream from the Android device using only Kinesis and no other Amazon service. Since Android supports Java, we wanted to know whether the Java producer library can be used on Android, given that it doesn't use any other Amazon service, or whether we could even use the C++ producer library on Android.
We searched the web for this but found no related posts. Any help or references would be appreciated.
We have a web-based Java application which we are planning to migrate to the cloud, with the intention that multiple clients will use it in a SaaS environment. The current architecture of the application is quite asynchronous in nature. There are 4 different modules, each having a database of its own. When there is a need for data exchange between the modules, we push the data using Pentaho and make use of a directory structure to store the interim data file, which is then picked up by the other module to populate its database. Given the nature of our application, this asynchronous communication is very important for us.
Now we are facing a couple of challenges while migrating this application to the cloud:
1. We are planning to use multi-tenancy on our database server, but how do we ensure that the flat files we use for transferring data between the modules are also channeled to their respective tenants in the DB?
2. Since we are planning to host this in the cloud, we would like your views on whether keeping a text file on a cloud server is safe from a data security perspective.
File storage in the cloud is safe, and you can use IAM roles to control the permissions of a file. Cloud providers like Google (Cloud Storage), Amazon (AWS S3), etc. provide a secure and scalable infrastructure for maintaining files in the cloud.
In a general setup, cloud storage provides you with buckets that carry a globally unique name. For a multi-tenant setup you can create separate buckets for individual tenants and store the necessary data feeds in them. Next, you can have batch or streaming jobs using Kettle (Pentaho) push the data to the right database based on the unique bucket name, as in the sketch below.
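As an illustration of the per-tenant bucket idea, here is a minimal sketch using the google-cloud-storage Java client; the bucket naming scheme and file paths are assumptions:

    import com.google.cloud.storage.BlobId;
    import com.google.cloud.storage.BlobInfo;
    import com.google.cloud.storage.Storage;
    import com.google.cloud.storage.StorageOptions;
    import java.nio.file.Files;
    import java.nio.file.Paths;

    public class TenantFeedUploader {
      public static void upload(String tenantId) throws Exception {
        Storage storage = StorageOptions.getDefaultInstance().getService();
        // Hypothetical per-tenant bucket naming scheme.
        String bucket = "feeds-" + tenantId;
        BlobId blobId = BlobId.of(bucket, "interim/orders.csv");
        BlobInfo blobInfo = BlobInfo.newBuilder(blobId).setContentType("text/csv").build();
        // Upload the interim flat file that the downstream module will pick up.
        storage.create(blobInfo, Files.readAllBytes(Paths.get("/data/interim/orders.csv")));
      }
    }

A downstream Kettle job can then read from the tenant's bucket and load that tenant's database, keeping the flat files channeled per tenant.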
Alternatively, as other answers suggest, you can push to a streaming setup (ActiveMQ, Kafka, etc.) with tenant-specific topics and have a streaming service (using Java or Pentaho) ingest the data into the respective database based on the topic.
Hope this helps :)
I cannot realistically give any specific advice without knowing more about your system. However, based on my experience, I would recommend switching to message queues; something like Kafka would work nicely.

Yes, cloud providers offer enough security for static file storage. You can limit access however you see fit, for example using AWS S3.
1. Multi-tenancy may create some issues while transferring the files, but from the information you have given, the process of moving flat files across the application will not be impacted. Still, you can think about moving to an MQ-based mode for passing the data across.
2. From a data security view, AWS provides a lot of features at the access level, MFA, etc. If it needs to be highly secure, I would recommend getting an AWS private cloud, where nothing is shared with anyone at any level.
I want to back up (and later restore) data from the GAE Datastore using the export facilities that went live this year. I want to use cron and Java. I have found this post, which points at this page, but it's just for Python.
Originally I wanted to do this automatically every day using the Google Cloud Platform console, but I can't find a way of doing it. Now I am resorting to incorporating it into Java and a cron job. I need restore instructions as well as backup instructions.
I'm not interested in using the Datastore Admin backup, as this will no longer be available next year.
According to the docs, the way to do it is, indeed, through Cron for GAE and having a GAE module call the API to export.
The point is not the code itself, but understanding why this is so.
Currently, the easiest way to schedule tasks in GCP is through Cron jobs in GAE, but those can only call GAE modules. Following the docs that you pointed out, the Cron will be quite similar to the one described there.
Regarding the handler itself, you only need to call the Datastore Admin API, authenticated with an account that has the proper permissions.
Since the Cloud Client Library does not have admin capabilities for Datastore, you'll have to either construct the call manually or use the Datastore API Client Library.
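For the "construct the call manually" route, here is a minimal sketch of a GAE standard handler, assuming a hypothetical project ID and destination bucket; it obtains a token from the App Engine identity service and POSTs to the Datastore Admin v1 export endpoint:

    import com.google.appengine.api.appidentity.AppIdentityService;
    import com.google.appengine.api.appidentity.AppIdentityServiceFactory;
    import java.io.IOException;
    import java.io.OutputStream;
    import java.net.HttpURLConnection;
    import java.net.URL;
    import java.util.Collections;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    public class DatastoreExportServlet extends HttpServlet {
      @Override
      protected void doGet(HttpServletRequest req, HttpServletResponse resp) throws IOException {
        String projectId = "my-project";             // assumed project id
        String outputUrlPrefix = "gs://my-backups";  // assumed destination bucket

        // Token for the app's service account; it needs Datastore import/export permission.
        AppIdentityService identity = AppIdentityServiceFactory.getAppIdentityService();
        String token = identity
            .getAccessToken(Collections.singletonList("https://www.googleapis.com/auth/datastore"))
            .getAccessToken();

        URL url = new URL("https://datastore.googleapis.com/v1/projects/" + projectId + ":export");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setDoOutput(true);
        conn.setRequestMethod("POST");
        conn.setRequestProperty("Authorization", "Bearer " + token);
        conn.setRequestProperty("Content-Type", "application/json");
        try (OutputStream out = conn.getOutputStream()) {
          out.write(("{\"outputUrlPrefix\":\"" + outputUrlPrefix + "\"}").getBytes("UTF-8"));
        }
        // The export runs as a long-running operation; relay the API's status code.
        resp.setStatus(conn.getResponseCode());
      }
    }

Scheduling it daily is then a matter of pointing a GAE cron entry at this handler's URL.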
Notice that, for GCP APIs, there are usually two client libraries available: the Cloud Client Library and the API Client Library. The first one is hand-crafted while the second one is auto-generated from the discovery document of each API.
If one specific functionality is not available through the Cloud Client Library (the recommended way of interacting with GCP APIs), you can always check the API Client Library for that same functionality.
I am currently working on an order management system for an ecommerce portal.
The backend is REST web services in Java, while the front end is AngularJS.
The REST service in Java performs many tasks when an order is placed/updated:
store/update the order and the items in the order in the DB
notify the 3rd-party logistics provider regarding this order
send an email notification to the customer
send an SMS notification to the customer
etc.
We already have an async queue, implemented using a blocking queue, for another feature. The options I see are:
1. use the same queue (current size is 200 and it is in-memory) and post to it
2. create a new queue inside the REST web service application
3. integrate with 3rd-party queues.
Can someone give insights on #3? Or is it wise to go for #1 or #2?
Order management for an e-commerce portal is not a simple problem. It will most likely have scalability requirements, and using a simple BlockingQueue for async processing will not be a good idea.
A JVM-based BlockingQueue is an in-memory queue and requires producers and consumers to be running in the same JVM process.
For sending emails whenever a customer places a new order, you need to ensure that the email was really sent and that an application restart does not result in the loss of data present in the BlockingQueue.
Hence, you should most likely use an out-of-process queue system such as Apache Kafka, Apache ActiveMQ, or an equivalent.
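As a rough illustration of option #3, here is a minimal sketch of publishing an order event to Kafka from the REST service; the broker address and topic name are assumptions:

    import java.util.Properties;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.Producer;
    import org.apache.kafka.clients.producer.ProducerRecord;

    public class OrderEventPublisher {
      public static void publish(String orderId, String orderJson) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");  // assumed broker address
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

        try (Producer<String, String> producer = new KafkaProducer<>(props)) {
          // Email, SMS, and logistics consumers each read this topic independently,
          // so a restart of the web service does not lose queued work.
          producer.send(new ProducerRecord<>("order-events", orderId, orderJson));
        }
      }
    }

Each downstream task (email, SMS, logistics notification) can then run in its own consumer group, so a slow or failing consumer does not block the others.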