I am creating an IoT device for which the user can set a particular time to trigger an action. For example: at 01:00 PM, the air conditioner starts automatically. In this case, the user sets 01:00 PM in the mobile app, and that value is then stored in a database table.
Now I need to read these database values and trigger an Azure Function that will switch on the A/C, but this process is complex since I would need to write my own scheduler.
Is there an alternative way, perhaps using Azure timer-triggered Functions?
Thanks
Let's say I have two bounded contexts, Project and Budgets. When Project adds a new member, it publishes a MemberAdded event. Budgets listens to these events and creates a local version of the Member entity that holds just what it needs. This works fine if both services are deployed at the same time. Let's say I use Kafka: the MemberAdded events are retained for, say, the next 7 days, and then Kafka deletes them. What if I later need to add a new service, for example Orders, that also needs a replicated version of the Member entity (say, Customer)? I could start listening for MemberAdded events, but I still would not have the old data. How can you handle this case? The only solution I can think of is to use a persistent event store that I can query for all events and from which I could restore all the data the new service requires.
You have two options:
Use a persistent event store and store all events ever generated. This can be thought of as a pull system, where new services (or even old services) can fetch data as and when they need it.
Regenerate events on the fly from the source system (Project) and target them at specific services (using a particular pub-sub or message channel, for example). This can be considered a push system, where you must trigger republishing of old events whenever another part of the system needs to access data from another service.
Needless to say, the former is much more efficient: it is a one-time effort to gather all events in a central location, whereas the latter requires work every time a new service (or an old one) needs to access or process data from another service.
Extending the thought process, the central event store can also double up as a source of truth and help in audits, reporting, and debugging if made immutable (like in the event sourcing pattern).
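As a rough illustration of the first option, a new service such as Orders could hydrate its local read model from the persistent event store before subscribing to live events. This is only a sketch; eventStore, liveBus, and the event shape are hypothetical stand-ins for whatever store and broker you use.

```javascript
// Sketch: bootstrapping the new Orders service from a persistent event store.
const customers = new Map(); // local replicated read model: memberId -> Customer

function apply(event) {
  // Keep only what Orders needs from the Member entity.
  customers.set(event.memberId, { id: event.memberId, name: event.name });
}

async function bootstrapOrdersService(eventStore, liveBus) {
  // 1. Replay every historical event, oldest first.
  for await (const event of eventStore.readAll({ type: 'MemberAdded' })) {
    apply(event);
  }
  // 2. Only then consume live events (in practice, track the replay position
  //    so no event falls into the gap between replay and subscription).
  liveBus.subscribe('MemberAdded', apply);
}
```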
I want to create a Logic App that processes items from a Service Bus queue during a specific time window (say 7 AM to 8 PM EST) and on specific days (Monday to Friday). The queue message contains JSON that needs to be converted to a file and sent to a file location. The downstream system only expects files between 7 AM and 8 PM EST. Is there a way to create a Service Bus queue trigger and configure it to fire only during defined days and times? Can this be achieved using a Recurrence trigger?
You need to rethink the way you set up the Logic App. Instead of using a Service Bus queue trigger, use a Recurrence trigger. You can configure it to run at the times and on the days you specified.
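For example, a Recurrence trigger defined along these lines fires on the hour from 7 AM through 7 PM Eastern on weekdays (a sketch of the trigger JSON; adjust the hours to your window):

```json
{
  "triggers": {
    "Recurrence": {
      "type": "Recurrence",
      "recurrence": {
        "frequency": "Week",
        "interval": 1,
        "timeZone": "Eastern Standard Time",
        "schedule": {
          "weekDays": ["Monday", "Tuesday", "Wednesday", "Thursday", "Friday"],
          "hours": [7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19],
          "minutes": [0]
        }
      }
    }
  }
}
```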
Then, when it is triggered, you can read all unprocessed messages using the Service Bus connector. It contains an operation to read messages from a queue and an operation to complete a message.
There is no out-of-the-box way to stop a Logic App run automatically at a given time; you will need to do that yourself. The way I see it, you will probably have a loop in which you pull messages from the queue. Inside that loop you could check the time and stop if it is close to the end of the downstream system's window, as sketched below.
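One way to do that check is to make the loop's exit condition (for example in an Until loop or a Condition action) an expression like the following, which stays true until 8 PM Eastern; this is a sketch assuming the window from the question:

```
@less(int(convertFromUtc(utcNow(), 'Eastern Standard Time', 'HH')), 20)
```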
My Logic App is triggered by a Recurrence, which sends a message to a Service Bus queue.
What I am trying to do is send a different property (or body) to my Service Bus queue based on the time of day at which the Recurrence is triggered. Right now I am triggering the Recurrence 8 times a day.
I could do this by creating 8 logic apps; however, this seems impractical because the logic apps would be pretty much identical except for the recurrence time and the property being sent to the Service Bus. And if I have to scale this up to more than 8 times a day, it will be annoying to create a new logic app each time.
Any ideas on how to accomplish this?
You can use utcNow() with a custom format string that returns, say, only the hour component, which you can use to determine where in the day you are.
If the exact local time is important, you can also use convertFromUtc() to get the zone-local time first.
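For instance, either of these expressions yields the current hour as a two-digit string you can branch on in a Switch action ('Eastern Standard Time' is just an example zone):

```
utcNow('HH')
convertFromUtc(utcNow(), 'Eastern Standard Time', 'HH')
```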
I have made a real-time web application that connects to a Node.js server through a WebSocket. From my website I can turn on/off an LED connected to an Arduino Uno.
What I want is for my website to be able to turn the LED on/off at a certain date and time, dynamically. By 'dynamically' I mean that I can add new scheduled tasks or remove current ones.
I have been trying node-schedule and cron, but those give me only static scheduled tasks; I can't change them or add new ones.
Use a DB or a file. You can store the dates in a JSON file and then edit it at your convenience. Use node-cron to create tasks from that data for whatever you want to do. Create a function that removes an entry from the JSON when you want to, and also removes it from the upcoming tasks via the task.destroy() method of node-cron. A sketch of this idea follows.
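A minimal sketch of that approach, assuming a schedules.json file and a hypothetical setLed() that talks to the Arduino:

```javascript
// Dynamic LED schedules backed by a JSON file, using node-cron.
const cron = require('node-cron');
const fs = require('fs');

const FILE = './schedules.json'; // e.g. [{ "id": "ac-on", "expr": "0 13 * * *", "on": true }]
const tasks = new Map();         // id -> running node-cron task

function setLed(on) { /* send on/off to the Arduino over your WebSocket link */ }

function load() { return JSON.parse(fs.readFileSync(FILE, 'utf8')); }
function save(all) { fs.writeFileSync(FILE, JSON.stringify(all, null, 2)); }

function addSchedule(entry) {
  save([...load(), entry]);
  tasks.set(entry.id, cron.schedule(entry.expr, () => setLed(entry.on)));
}

function removeSchedule(id) {
  save(load().filter(e => e.id !== id));
  const task = tasks.get(id);
  if (task) task.stop(); // destroy() in older node-cron versions
  tasks.delete(id);
}

// On startup, re-create tasks from the file:
load().forEach(e => tasks.set(e.id, cron.schedule(e.expr, () => setLed(e.on))));
```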
https://github.com/kdichev/Green-Systems/blob/development/PumpController.js
Check what I have done with my pump: on line 19 I have an array of times that will run the pump according to the entries given.
Here's the scenario. I'm not working with real-time data. Instead, I get data from my electric company about the past day's electricity usage. Specifically, each day I can get the number of kWh used during each hour of the previous day.
So, I'd like to load this past information into an Event Hub on each following day. Is this doable? Does Event Hub support loading past information, or is it only and forever about real-time streaming data, with no ability to load past data in?
I'm afraid the latter is the case, as I've not seen any way to specify a date in what limited API documentation I could find. I'd like to confirm, though...
Thanks,
John
An Azure Event Hub is really meant for short-term storage. By default you may only retain data for up to 7 days, after which the data is deleted based on an append timestamp created when the message first entered the Event Hub. It is therefore not practical to use an Azure Event Hub for data that's older than 7 days.
An Azure Event Hub is meant for message/event management, not long-term storage. A possible solution would be to write the Event Hub data to an Azure SQL database or to blob storage for long-term storage, and then use Azure Stream Analytics (an event processor) to join the active stream with the legacy data that has accumulated in the SQL database. Also note that you can reference the appended timestamp attribute: it's called "EventEnqueuedUtcTime". Keep in mind that it reflects server time, whose clock may differ from the date/time of the actual measurement.
As for appending a date/time: if you are sending the message as JSON, simply add it as a key/value pair. Example message with a time: { "Time": "My UTC time here" }
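For the hourly-usage scenario, the payload could carry the measurement time explicitly; the field names here are purely illustrative:

```json
{
  "meterId": "meter-42",
  "kwh": 1.7,
  "measurementHourUtc": "2019-06-03T13:00:00Z"
}
```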
A streaming system of this type doesn't care about times a particular application may wish to apply to the items. There simply isn't any processing that happens based on a time field unless your code does it.
Each message sent is an EventData, which carries an arbitrary set of bytes. You can easily include a date/time in that serialized data structure, but Event Hubs won't care about it. No sorting is performed, and there is no fixed ordering other than insertion order within a partition, which is defined by the sequence number. While the enqueued time is available, it's mostly useful for monitoring how far behind in processing you are.
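A minimal sketch of loading a past day's readings, assuming the @azure/event-hubs v5 SDK; the connection values are placeholders and the readings shape is illustrative:

```javascript
// Push yesterday's 24 hourly readings into an Event Hub.
const { EventHubProducerClient } = require('@azure/event-hubs');

async function loadPastDay(readings) { // readings: [{ hourUtc, kwh }, ...]
  const producer = new EventHubProducerClient('<connection-string>', '<event-hub-name>');
  const batch = await producer.createBatch();
  for (const r of readings) {
    // The measurement time travels inside the body; Event Hubs ignores it.
    batch.tryAdd({ body: { time: r.hourUtc, kwh: r.kwh } });
  }
  await producer.sendBatch(batch);
  await producer.close();
}
```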
As to the rest of your problem, I'd agree with the comment that Event Hubs may not really be the best choice. You can certainly load data into it once per day, but if it's really only 24 data points/day, it's not the appropriate technology choice unless it's a prototype/tech demo for a system that's eventually supposed to have a whole load of smart meters reporting to it with fair frequency. (Note also that Event Hubs cost $11/month minimum, a Service Bus queue $10/month minimum, and AWS SQS $0 minimum.)