How to upload an RSS file and update it when needed in Azure

I want to use a Logic App with my RSS connections.
I created the RSS feed by following this tutorial.
I am looking for a suitable Azure service I can upload the file to, so that I can set the RSS URL in the Logic App accordingly.

First thing you should know: the Logic Apps RSS connector cannot update a feed. Check the connector reference; apart from the trigger, it only has a List all RSS feed items action.
If you still want to do this with Azure services alongside the Logic App, you could use an Azure Function to implement the update, and Azure Blob Storage to host the uploaded file. For the function type you could use an HTTP trigger function. Azure Functions also supports Java; for more details you could refer to this tutorial: Azure Functions Java developer guide.
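As a rough illustration of that pattern, here is a minimal Python sketch of an HTTP-triggered function that overwrites an RSS blob (the container name feeds, the blob name feed.xml, and the STORAGE_CONNECTION_STRING app setting are assumptions; the same approach works in Java via the guide above):

```python
import os
import azure.functions as func
from azure.storage.blob import BlobClient, ContentSettings

app = func.FunctionApp()

# HTTP-triggered function: send the new RSS XML in the request body and the
# function overwrites the feed blob in Blob Storage.
@app.route(route="update-rss", auth_level=func.AuthLevel.FUNCTION)
def update_rss(req: func.HttpRequest) -> func.HttpResponse:
    rss_xml = req.get_body()  # raw RSS XML sent by the caller
    blob = BlobClient.from_connection_string(
        conn_str=os.environ["STORAGE_CONNECTION_STRING"],  # assumed app setting
        container_name="feeds",   # assumed container
        blob_name="feed.xml",     # assumed blob name
    )
    blob.upload_blob(
        rss_xml,
        overwrite=True,  # replace the existing feed so its URL stays stable
        content_settings=ContentSettings(content_type="application/rss+xml"),
    )
    return func.HttpResponse("RSS feed updated", status_code=200)
```

Because the blob is overwritten in place, its URL (or a SAS URL for a private container) stays the same, so you can point the Logic App's RSS connection at that URL.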

Related

How can I upload a file from blob storage using Logic Apps vs Azure Functions for large files?

I need to use a Logic App to load some CSV files from an Azure file storage to a blob storage. Which trigger should I use in the Logic App to access the file storage in Azure?
The files are quite large, up to 1 GB, and I'd like to be able to send them to an FTP server or to a RESTful endpoint for upload (using, for example, the PUT verb).
Is Logic Apps able to do this, or would it be better to use Azure Functions? Any resources or help pointing me in the right direction would be useful.
For your question about which trigger to use in the Logic App, it depends on your requirements. If you want the Logic App to be triggered periodically, you can add a "Recurrence" schedule trigger. If you want to trigger it manually, you can add a Request trigger and then trigger the Logic App by calling the request URL.
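For instance, after you save a Logic App with a Request trigger, the designer shows an HTTP POST callback URL; a minimal sketch for invoking it from a script (the URL and payload here are placeholders) could be:

```python
import requests

# Placeholder: copy the HTTP POST URL shown on the Request trigger after saving the Logic App.
callback_url = (
    "https://prod-00.westus.logic.azure.com/workflows/<workflow-id>"
    "/triggers/manual/paths/invoke?api-version=2016-10-01&sp=...&sig=..."
)

# Optional JSON payload that the Logic App can read from the trigger body.
resp = requests.post(callback_url, json={"sourceFolder": "csv-incoming"})
print(resp.status_code)  # typically 200 or 202 when the run is accepted
```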
As for whether a Logic App can do this, I'm a little confused about what you want the Logic App to do: load CSV files from Azure File Storage to Blob Storage, or load CSV files from Blob Storage to FTP? Both can be implemented with a Logic App as long as your files don't exceed its limits.
The "Azure File Storage" connector has general limits below:
The "Azure Blob Storage" connector also has some general limits, shown as below:
Ftp connector's limits are shown as below:
According to the two screenshots above, if your 1 GB files are lots of small files(the number of list blobs can't exceed 5000), your requirements can be implemented in logic app.
If you want to load files from Azure File Storage to Blob Storage (and your files don't exceed the limits above), you can build a Logic App that lists the files in the file share, gets each file's content, and creates a blob from it.
If you want to load files from Azure Blob Storage to FTP (again within the limits above), you can build a Logic App that lists the blobs, gets each blob's content, and creates the corresponding file on the FTP server.
By the way, I think it is necessary to mention the pricing of Logic Apps. They are billed by the number of action executions; you can find more information about Logic Apps pricing at this link. So if you have many files, leading to many action executions in your Logic App, you need to compare the cost of the Logic App against that of an Azure Function. A Function may turn out to be cheaper.

Can Azure Event Hub ingest JSON events from Azure Blob Storage without writing any code?

Is it possible to use some ready-made construct in the Azure cloud environment to ingest the events (in JSON format) that are currently stored in Azure Blob Storage and have it submit those events directly to Azure Event Hub without writing any (however small) custom code? In other words, I would like to use a configuration-driven approach only.
Sure. You can try to use Azure Logic Apps to realize your needs without any code, or with just some function expressions; please refer to the official documentation of Azure Logic Apps for more details.
The logic flow has two steps: an Event Hubs trigger that fires when an event arrives, followed by a Blob Storage action that creates a new blob from the event data. You can refer to my sample below to make yours work.
Here is my sample to receive an event from my Event Hub and transfer it to Azure Blob Storage, creating a new blob for storing the event data.
Create an Azure Logic App instance in the Azure portal; it should be easy for you.
Move to the Logic app designer tab to configure the logic flow.
Click the Save and Run buttons. Then use ServiceBus Explorer (downloaded from https://github.com/paolosalvatori/ServiceBusExplorer/releases) to send an event message, and check with Azure Storage Explorer whether a new blob was created. It worked fine after a few minutes.
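If you prefer to send the test event from a script instead of ServiceBus Explorer, a minimal sketch with the azure-eventhub Python package (the connection string and hub name below are placeholders) could look like this:

```python
from azure.eventhub import EventHubProducerClient, EventData

# Placeholders: your Event Hubs namespace connection string and hub name.
CONNECTION_STR = "Endpoint=sb://<namespace>.servicebus.windows.net/;SharedAccessKeyName=...;SharedAccessKey=..."
EVENTHUB_NAME = "myhub"

producer = EventHubProducerClient.from_connection_string(
    CONNECTION_STR, eventhub_name=EVENTHUB_NAME
)
with producer:
    batch = producer.create_batch()
    # The JSON payload the Logic App will pick up and write to Blob Storage.
    batch.add(EventData('{"id": 1, "message": "hello from python"}'))
    producer.send_batch(batch)
```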

How to load a file from SharePoint Online to Azure Blob using Azure Data Factory

Can anyone help me with how to load a CSV file from SharePoint Online to Azure Blob Storage using Azure Data Factory?
I tried with Logic Apps and succeeded; however, the Logic App will not upload all the files unless a file is changed or a new one is uploaded.
I need to load all the files even when there are no changes.
ADF v2 now supports loading from SharePoint Online via the OData connector with AAD service principal authentication: https://learn.microsoft.com/en-us/azure/data-factory/connector-odata
You can probably use a Logic App by changing to a Recurrence trigger.
On that interval, you list the files in the library and then take any action on them you want.

How to make code on an Azure VM trigger from storage blob change (like Functions do)

I've got some image processing code that I need to run in Azure. It's perfect for an Azure Function, but unfortunately it requires a component with a complex installation procedure and therefore needs to run on a VM.
However, I'd like to make it behave much like an Azure Function, and trigger whenever new items arrive in blob storage.
My question is: Does Azure provide me with any handy way of doing this, or do I have to write code that polls the blob storage looking for new items?
Have a look at the Azure WebJobs SDK. It shares its API model with Functions, but you can host it in any .NET application. See the Blob Trigger documentation.
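If the WebJobs SDK does not fit your setup and you end up with the polling approach mentioned in the question, a rough Python sketch (the container name and poll interval are assumptions) could look like this:

```python
import time
from azure.storage.blob import ContainerClient

# Assumed container holding the incoming images; poll every 30 seconds.
container = ContainerClient.from_connection_string(
    "DefaultEndpointsProtocol=https;AccountName=...;AccountKey=...;EndpointSuffix=core.windows.net",
    container_name="images",
)

seen = set()
while True:
    for blob in container.list_blobs():
        if blob.name not in seen:
            seen.add(blob.name)
            print(f"New blob detected: {blob.name}")
            # run the image processing code against this blob here
    time.sleep(30)
```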

How to programmatically do what Microsoft Azure Storage Explorer is doing?

There is a tutorial in the Microsoft docs that you can see here:
Tutorial: Build your first pipeline to transform data using Hadoop cluster
In the "Prerequisites" section, in step 6, they wrote "Use tools such as Microsoft Azure Storage Explorer".
The question is: can I use some other tools? In particular, is it possible to use scripting languages like Python directly?
I need to do all these 7 steps dynamically, using something like Azure Function Apps. Do you know whether this is possible and, if it is, where I should start?
Short answer is YES. But again, you have not shared details on what functionality you are looking for specifically.
What you can do is call the REST API endpoints for the corresponding service.
Depending on whether you are using Blobs, Tables, or Queues, there are specific APIs that you can call.
Azure Storage Services REST API Reference
Blob Service REST API
Queue Service REST API
Table Service REST API
File Service REST API
Taking blobs as an example, there are APIs to upload content using the PUT method. See this for more details: Put Blob
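As a rough illustration of calling the Put Blob REST API from Python (the account, container, blob name, and SAS token are placeholders you would supply yourself):

```python
import requests

# Placeholder values: storage account, container, blob name, and a SAS token
# with write permission on the container or blob.
account = "mystorageaccount"
container = "mycontainer"
blob_name = "sample.csv"
sas_token = "sv=...&sig=..."  # generate one in the portal, CLI, or SDK

url = f"https://{account}.blob.core.windows.net/{container}/{blob_name}?{sas_token}"

with open("sample.csv", "rb") as f:
    resp = requests.put(
        url,
        data=f,
        headers={"x-ms-blob-type": "BlockBlob"},  # required header for Put Blob
    )
resp.raise_for_status()  # Put Blob returns 201 Created on success
```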
Similarly, there are APIs for reading containers, listing containers, etc.
There are also some samples on working with Azure Storage in Python on GitHub. See this: Azure-Samples/storage-python-getting-started
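For example, a short sketch with the azure-storage-blob package (the connection string and container name are assumptions) that lists blobs and uploads a file without touching the raw REST API:

```python
from azure.storage.blob import BlobServiceClient

# Assumed connection string, copied from the storage account's "Access keys" blade.
service = BlobServiceClient.from_connection_string(
    "DefaultEndpointsProtocol=https;AccountName=...;AccountKey=...;EndpointSuffix=core.windows.net"
)
container = service.get_container_client("mycontainer")  # assumed container name

# List the existing blobs in the container.
for blob in container.list_blobs():
    print(blob.name)

# Upload (or overwrite) a local file as a block blob.
with open("input.csv", "rb") as data:
    container.upload_blob(name="input.csv", data=data, overwrite=True)
```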
HTH
