Java Azure Function with IoT Hub trigger is not starting

I am trying to trigger a Java function each time my IoT Hub receives a batch of 64 messages (the exact number is not important). I followed this guide to create the basic code, then I edited it into this function:
public class Function {
    @FunctionName("ProcessIotMessages")
    public void processIotMessages(
            @EventHubTrigger(name = "message",
                eventHubName = "samples-workitems",
                connection = "HUB-1544-DEV_events_IOTHUB") List<String> messages,
            final ExecutionContext context) { ...Function Logic... }
}
The connection parameter is the IoT Hub connection string formatted as an Event Hub-compatible endpoint (e.g. Endpoint=sb://iothub-hostname-blablabla).
I package and deploy this code with the Maven plugins specified in the guide linked above. The deploy works fine: I can see my function up and running with no errors in the portal, and the HUB-1544-DEV_events_IOTHUB app setting is correctly created with the right connection string.
The only strange thing I notice in the portal is in the trigger blade. As you can see, cardinality is One, while it should be Many, since I did not specify the cardinality parameter in the function and the default is Many according to this guide. This makes me think I am not passing the correct trigger syntax.
Anyway, the problem is that the function does not start, either from my local machine or from the portal. Any suggestions? Thanks.

As @evilSnobu posted in the comments, the problem was the event hub name. Go to Portal -> your IoT Hub -> Built-in endpoints and you will find all the information needed to configure the trigger there.
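For reference, a minimal sketch of the corrected trigger, assuming the Event Hub-compatible name from the Built-in endpoints blade is the hub name (the eventHubName value below is a placeholder) and with cardinality made explicit:

import com.microsoft.azure.functions.ExecutionContext;
import com.microsoft.azure.functions.annotation.Cardinality;
import com.microsoft.azure.functions.annotation.EventHubTrigger;
import com.microsoft.azure.functions.annotation.FunctionName;
import java.util.List;

public class Function {
    @FunctionName("ProcessIotMessages")
    public void processIotMessages(
            @EventHubTrigger(name = "message",
                // Event Hub-compatible name from Portal -> IoT Hub -> Built-in endpoints
                // (placeholder value; it is usually the IoT Hub name, not "samples-workitems")
                eventHubName = "HUB-1544-DEV",
                // App setting holding the Event Hub-compatible endpoint connection string
                connection = "HUB-1544-DEV_events_IOTHUB",
                cardinality = Cardinality.MANY) List<String> messages,
            final ExecutionContext context) {
        context.getLogger().info("Received " + messages.size() + " messages");
    }
}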

Related

Azure Functions in Java - using value from Service Bus trigger in Cosmos DB input binding

I have an Azure Function (in Java) that receives messages via Service Bus, and I'd like to get a handle on a Cosmos DB record based on one of the properties in the incoming message. For example, I receive a JSON message like { "id": "foo" }, and I want the Cosmos input binding to give me the corresponding DB record.
There are plenty of examples on how to do this with an http trigger, but I can't figure out how to use a Service Bus trigger. I've tried variants like this:
#FunctionName("ServiceBusListener")
public void serviceBusListener(
#ServiceBusTopicTrigger(
name = "message",
topicName = "mytopic",
subscriptionName = "mysubscription",
connection = "AzureWebJobsServiceBus") String message,
#CosmosDBInput(name = "name",
databaseName = "MyDatabase",
collectionName = "MyCollection",
connectionStringSetting = "CosmosDbConnectionString",
id = "{message.id}") String item,
final ExecutionContext context) {
// do something with item
}
Is there a way to do this?
[Edit] Forgot to mention that C# supports doing this. My guess is Azure just doesn't support it with Java, as there are similar things supported in C# but not Java, like getting a handle on a DocumentClient or CosmosClient via an input binding.
I spoke with an Azure product manager, who confirmed that Java is lagging C# in Functions, and that the Cosmos input/output bindings aren't yet supported. Performance is lagging, too, e.g. HTTP triggers are all blocking. For now a static singleton instance of the Cosmos client is the recommended approach.
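For reference, here is a minimal sketch of that static singleton approach with the Azure Cosmos DB Java SDK v4; the endpoint/key app setting names are my own assumptions, not anything from the docs:

import com.azure.cosmos.CosmosClient;
import com.azure.cosmos.CosmosClientBuilder;

public final class CosmosClientHolder {
    // Built once per worker process and reused across function invocations
    private static final CosmosClient CLIENT = new CosmosClientBuilder()
        .endpoint(System.getenv("CosmosDbEndpoint"))   // hypothetical app setting
        .key(System.getenv("CosmosDbKey"))             // hypothetical app setting
        .buildClient();

    private CosmosClientHolder() {
    }

    public static CosmosClient get() {
        return CLIENT;
    }
}

Inside the Service Bus-triggered function you would then parse the id out of the message yourself and call CosmosClientHolder.get().getDatabase("MyDatabase").getContainer("MyCollection").readItem(...), instead of relying on the @CosmosDBInput binding.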
I'm going to stick with Java for my app because this is a simple application, we use Java for everything else, and I expect Java support in Functions will eventually catch up. However, if I didn't have all of those other considerations and this were a stand-alone project, I'd probably go with C# at this point in time.

How to check if my device is connected (IoT Core - GCP) [duplicate]

Does anybody know of an easy way to trigger an event when a device on Google Cloud IoT Core goes offline? Before I switched to Google's IoT implementation, this was easily handled by triggering an event when MQTT disconnected, but it seems Google has no easy way of doing this.
Does anybody know if there is something planned for this?
Whose back do I need to scratch to get them to see that something like this is a basic requirement for IoT device management?!
Other platforms like AWS and Microsoft already have this implemented (or some way to handle it easily):
https://docs.aws.amazon.com/iot/latest/developerguide/life-cycle-events.html
Device connectivity (online/offline) status with the Azure IoT Hub
I wish I had known this before writing all my code and implementing my setup on Google's IoT platform. I guess that's my fault for assuming that something so simple, and that should be standard for IoT devices, would be available.
How are you going to compete with other IoT providers if you can't even provide basic offline/online events?!
My reply in this SO question shows how I had to write 100+ lines of code just to create a Firebase function that checks whether a device is online (and that still doesn't handle offline events; it's just a hack for something that should be native to ANY IoT service provider!):
https://stackoverflow.com/a/54609628/378506
I'm hoping someone else has figured out a way to do this, as I've spent numerous days searching SO, Google, and the Cloud IoT Core documentation and still have not found anything.
Even if MQTT Last Will was supported we could make that work, but even that IS NOT SUPPORTED by Google (https://cloud.google.com/iot/docs/requirements) ... come on guys!
Your cloud project does have access to the individual MQTT connect/disconnect events, but currently they only show up in the Stackdriver logs. Within the cloud console, you can create an exporter that will publish these events to a Pub/Sub topic:
1. Visit the Stackdriver Logs in the Cloud Console.
2. Enter the following advanced filter:
   resource.type="cloudiot_device"
   jsonPayload.eventType="DISCONNECT" OR "CONNECT"
3. Click CREATE EXPORT.
4. Enter a value for Sink Name.
5. Select Cloud Pub/Sub for Sink Service.
6. Create a new Cloud Pub/Sub topic as the Sink Destination.
The exporter publishes the full LogEntry, which you can then consume from a cloud function subscribed to the same Pub/Sub topic:
import * as functions from 'firebase-functions';

export const checkDeviceOnline = functions.pubsub.topic('online-state').onPublish(async (message) => {
    const logEntry = JSON.parse(Buffer.from(message.data, 'base64').toString());
    const deviceId = logEntry.labels.device_id;
    let online;
    switch (logEntry.jsonPayload.eventType) {
        case 'CONNECT':
            online = true;
            break;
        case 'DISCONNECT':
            online = false;
            break;
        default:
            throw new Error('Invalid message type');
    }
    // ...write updated state to Firebase...
});
Note that in cases of connectivity loss, the time lag between the device becoming unreachable and an actual DISCONNECT event can be as long as the MQTT keep-alive interval. If you need an immediate check on whether a device is reachable, you can send a command to that device.
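If you go the command route, here is a rough sketch using the generated Cloud IoT Core v1 Java client (google-api-services-cloudiot); treat the auth/builder setup as an assumption about your environment, and the project/region/registry/device values as placeholders:

import com.google.api.client.googleapis.javanet.GoogleNetHttpTransport;
import com.google.api.client.json.jackson2.JacksonFactory;
import com.google.api.services.cloudiot.v1.CloudIot;
import com.google.api.services.cloudiot.v1.CloudIotScopes;
import com.google.api.services.cloudiot.v1.model.SendCommandToDeviceRequest;
import com.google.auth.http.HttpCredentialsAdapter;
import com.google.auth.oauth2.GoogleCredentials;
import java.util.Base64;

public class DevicePing {
    public static void sendPing(String project, String region, String registry, String device)
            throws Exception {
        CloudIot service = new CloudIot.Builder(
                GoogleNetHttpTransport.newTrustedTransport(),
                JacksonFactory.getDefaultInstance(),
                new HttpCredentialsAdapter(
                        GoogleCredentials.getApplicationDefault().createScoped(CloudIotScopes.all())))
                .setApplicationName("device-online-check")
                .build();

        String devicePath = String.format(
                "projects/%s/locations/%s/registries/%s/devices/%s",
                project, region, registry, device);

        // If the device is not connected and subscribed to its commands topic,
        // this call fails, which is what makes it usable as a reachability probe.
        SendCommandToDeviceRequest command = new SendCommandToDeviceRequest()
                .setBinaryData(Base64.getEncoder().encodeToString("ping".getBytes()));

        service.projects().locations().registries().devices()
                .sendCommandToDevice(devicePath, command)
                .execute();
    }
}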
I think the best solution is this. We need three things: Cloud Scheduler and two Cloud Functions.
The first function is @devunwired's answer above, but instead of
// ...write updated state to Firebase...
it schedules the second function to trigger in 2-3 minutes (to let the device reconnect).
The second function sends a command to the device (see the sketch below).
If the device responds to the command:
- if the stored status is connected, do nothing;
- if the stored status is disconnected, update the status to connected and do whatever you want, maybe send an email.
If the device does not respond:
- if the stored status is disconnected, do nothing;
- if the stored status is connected, change the status and alert by email or something.
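To make that flow concrete, here is a rough Java sketch of the second function's check. isDeviceResponding, getStoredStatus, setStoredStatus and alert are hypothetical hooks standing in for the command send shown earlier, your own state store (e.g. Firebase) and your alerting:

public abstract class DeviceStatusReconciler {

    // Hypothetical hooks: wire these to the sendCommandToDevice call shown earlier,
    // your state store (e.g. Firebase), and your alerting (e.g. email).
    abstract boolean isDeviceResponding(String deviceId);
    abstract String getStoredStatus(String deviceId);
    abstract void setStoredStatus(String deviceId, String status);
    abstract void alert(String deviceId, String message);

    void reconcileDeviceStatus(String deviceId) {
        boolean reachable = isDeviceResponding(deviceId);
        String stored = getStoredStatus(deviceId);

        if (reachable && !"connected".equals(stored)) {
            // Device answered but we had it marked offline: flip the state and alert.
            setStoredStatus(deviceId, "connected");
            alert(deviceId, "device back online");
        } else if (!reachable && "connected".equals(stored)) {
            // Device did not answer but we had it marked online: flip the state and alert.
            setStoredStatus(deviceId, "disconnected");
            alert(deviceId, "device went offline");
        }
        // If the stored status already matches reality, do nothing.
    }
}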

How to start CloudFoundry app using ReactorCloudFoundryClient?

I used StartApplicationRequest to create a sample request to start the application as given below:
StartApplicationRequest request = StartApplicationRequest.builder()
    .applicationId("test-app-name")
    .build();
Then, I used the ReactorCloudFoundryClient to start the application as shown below:
cloudFoundryClient.applicationsV3().start(request);
But my test application test-app-name is not getting started. I'm using the latest Java CF client version (v4.5.0.RELEASE), but I'm not seeing a way to start the application.
Quite surprisingly, the outdated version seems to work with the code below:
cfstatus = cfClient.startApplication("test-app-name"); // start app
cfstatus = cfClient.stopApplication("test-app-name"); // stop app
cfstatus = cfClient.restartApplication("test-app-name"); // restart app
I want to do the same with the latest CF client library, but I don't see any useful reference. I referred to the test cases in the official CloudFoundry GitHub repo and arrived at the code below after checking out a lot of docs:
StartApplicationRequest request = StartApplicationRequest.builder()
    .applicationId("test-app-name")
    .build();
cloudFoundryClient.applicationsV3().start(request);
Note that cloudFoundryClient is a ReactorCloudFoundryClient instance, as the latest library doesn't support the client class used in the outdated code. I would like to do all operations (start/stop/restart) with the latest library. The above code isn't working.
A couple of things here...
Using the reactor-based client, your call to cloudFoundryClient.applicationsV3().start(request) returns a Mono<StartApplicationResponse>. That's not the actual response, it's the possibility of one; nothing happens until you subscribe to (or block on) it. See here for more details.
If you would like behavior similar to the original cf-java-client, you can call .block() on the Mono<StartApplicationResponse> and it will wait and turn it into a response.
Ex:
client.applicationsV3()
    .start(StartApplicationRequest.builder()
        .applicationId("test-app-name")
        .build())
    .block();
The second thing is that it's applicationId, not applicationName. You need to pass in the application GUID, not the name. As it is, you're going to get a 404 saying the application doesn't exist. You can use the client to fetch the GUID, or you can use CloudFoundryOperations instead (see the next point).
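For completeness, a sketch of looking up the GUID by name with the v3 client (types from org.cloudfoundry.client.v3.applications) before starting; the exact builder method names on the list request may differ slightly between cf-java-client versions, so treat this as a sketch rather than verified code:

String applicationId = cloudFoundryClient.applicationsV3()
    .list(ListApplicationsRequest.builder()
        .name("test-app-name")
        .build())
    .flatMapIterable(ListApplicationsResponse::getResources)
    .next()
    .map(ApplicationResource::getId)
    .block();

cloudFoundryClient.applicationsV3()
    .start(StartApplicationRequest.builder()
        .applicationId(applicationId)
        .build())
    .block();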
The CloudFoundryOperations interface is a higher-level API. It's easier to use, in general, and supports things like starting an app by name instead of by GUID.
Ex:
ops.applications()
    .start(StartApplicationRequest.builder()
        .name("test-app-name")
        .build())
    .block();

Java Google datastore async calls

I do not want to block threads in my application, so I am wondering: are calls to Google Datastore async? For example, the docs show something like this to retrieve an entity:
// Key employeeKey = ...;
LookupRequest request = LookupRequest.newBuilder().addKey(employeeKey).build();
LookupResponse response = datastore.lookup(request);
if (response.getMissingCount() == 1) {
    throw new RuntimeException("entity not found");
}
Entity employee = response.getFound(0).getEntity();
This does not look like an async call to me, so is it possible to make async calls to the database in Java? I noticed App Engine has some libraries for async calls in its Java API, but I am not using App Engine; I will be calling the Datastore from my own instances. Also, if there is an async library, can I test it against my local server? (For example, I could not find a way to point App Engine's async library at my local server; it can't pick up my environment variables.)
In your shoes, I'd give a try to Spotify's open-source Asynchronous Google Datastore Client -- I have not personally tried it, but it appears to meet all of your requirements, including being able to test on your local server. Please give it a try and let us all know how well it meets your needs, so we can all benefit and learn -- thanks!
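If you end up staying on the synchronous client for now, a different stopgap (plain java.util.concurrent, not the Spotify client) is to push the blocking lookup onto a dedicated executor so your application threads aren't blocked; a minimal sketch reusing the datastore, employeeKey and LookupRequest/Entity types from the question:

import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

// Dedicated pool for blocking Datastore calls so application threads stay free.
ExecutorService datastorePool = Executors.newFixedThreadPool(8);

CompletableFuture<Entity> employeeFuture = CompletableFuture.supplyAsync(() -> {
    LookupRequest request = LookupRequest.newBuilder().addKey(employeeKey).build();
    LookupResponse response = datastore.lookup(request);
    if (response.getMissingCount() == 1) {
        throw new RuntimeException("entity not found");
    }
    return response.getFound(0).getEntity();
}, datastorePool);

// employeeFuture.thenAccept(...) lets you handle the Entity without blocking the caller.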

How to capture "send mail" in plugin for IBM Lotus notes

Here is what I am trying to do:
Add a special button to attach files to the Notes "New message" window. If files were attached using this button, then when the email is sent they should be uploaded to the server and a link to them added to the email.
My question: is it possible (and how) to capture the "send mail" event in a plugin for Lotus Notes?
I don't know how an Eclipse plugin would do this. Furthermore, since Notes can be used off-line -- when it would be impossible to upload files to a server -- it would be better to have code running on the Domino server intercept the mail messages and perform the upload.
Most products that hook mail operations on the server use the Lotus Notes C API's Extension Manager functions to hook the EM_BEFORE notification for the EM_NSFNOTEUPDATE event, check whether the NSFNoteUpdate operation occurred within the server's mail.box files, and then check whether the message requires special processing (in your case, by looking for a special NotesItem that your button code has inserted into the message). The usual coding method is to immediately put the message on hold, preventing the Domino router from attempting to send it while your code is still working on it. Many products actually have two components: the EM hook DLL and a separate server task that receives a signal from the hook DLL, processes the message, and then releases it from on-hold status. This approach keeps your code from tying up router threads while processing large files.
(Note: Newer versions of the Domino server can use OSGi plugins written in Java instead of the Notes C API for operations like this. I've not looked into the details of how this might work for operations that process mail messages.)
I sort of figured it out. There is a very nice extension point provided in 8.5, "com.ibm.notes.mailsend.MailSendAttachmentsDialog", which exists specifically for custom handling of attachments. You can see it in plugin.xml, in IBM\Lotus\Notes\framework\shared\eclipse\plugins\com.ibm.notes.mailsend_8.5.*.jar.
The only problem is that it handles just the attachments and does not have access to anything else. So if somebody has figured out how to get the subject line and the message text from there, please reply.
Update: got it.
NotesUIElement elem = (new NotesUIWorkspace()).getCurrentElement();
if (elem instanceof NotesUIDocument) {
    NotesUIDocument doc = (NotesUIDocument) elem;
    // Read the standard memo fields from the currently open mail document
    String to = doc.getField("EnterSendTo").getText();
    String cc = doc.getField("EnterCopyTo").getText();
    String bcc = doc.getField("EnterBlindCopyTo").getText();
    String subject = doc.getField("Subject").getText();
    String body = doc.getField("Body").getText();
    ....
}
