I have a file service (WCF) writing file info to an on-prem DB and saving the file to Azure Blob storage. We are thinking of updating this process to publish an event to Azure Event Hubs. Some team members say that everything should be rolled back if the event can't be published, but I think we should just retry if something goes wrong while publishing the event, so the user doesn't have to re-upload the file.
What would you do?
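To make the retry option concrete, here is a minimal sketch of what I have in mind, assuming the Azure.Messaging.EventHubs SDK; the class name, connection settings and the "save for later" hook are placeholders, not part of our existing service:

using System;
using System.Text;
using System.Text.Json;
using System.Threading.Tasks;
using Azure.Messaging.EventHubs;
using Azure.Messaging.EventHubs.Producer;

public class FileEventPublisher
{
    private readonly EventHubProducerClient _producer;

    public FileEventPublisher(string connectionString, string hubName)
    {
        // The SDK already retries transient failures; tune it via RetryOptions.
        _producer = new EventHubProducerClient(connectionString, hubName,
            new EventHubProducerClientOptions
            {
                RetryOptions = new EventHubsRetryOptions { MaximumRetries = 5, Delay = TimeSpan.FromSeconds(1) }
            });
    }

    // Called after the DB write and blob upload have already succeeded.
    public async Task PublishAsync(object fileInfo, Action<object> saveForLaterRetry)
    {
        try
        {
            using EventDataBatch batch = await _producer.CreateBatchAsync();
            batch.TryAdd(new EventData(Encoding.UTF8.GetBytes(JsonSerializer.Serialize(fileInfo))));
            await _producer.SendAsync(batch);
        }
        catch (Exception)
        {
            // Even the built-in retries failed: park the event (outbox table, local queue, ...)
            // and replay it later instead of reversing the DB write and blob upload.
            saveForLaterRetry(fileInfo);
        }
    }
}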
What are possible ways to implement such a scenario?
I can think of an Azure Function that periodically checks the share for new files. Are there any other possibilities?
I have also been thinking about duplicating the files to Blob storage and generating the notifications from there.
A storage content trigger is available by default for blobs, so if you are looking at migrating to Blob storage you can use a BlobTrigger Azure Function. For triggering on a File Share, these are my suggestions as requested:
A TimerTrigger Azure Function that polls for files added since the previous trigger occurred (see the sketch after this list).
A Recurrence trigger in a Logic App to poll and check for new content.
A continuous WebJob that keeps polling the File Share for new content.
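To make the first option concrete, here is a minimal sketch of such a TimerTrigger function, assuming the Azure.Storage.Files.Shares SDK and an in-process Azure Functions project; the share name, connection setting and the "last run" bookkeeping are placeholders:

using System;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;
using Azure.Storage.Files.Shares;
using Azure.Storage.Files.Shares.Models;

public static class FileSharePoller
{
    [FunctionName("FileSharePoller")]
    public static async Task Run([TimerTrigger("0 */5 * * * *")] TimerInfo timer, ILogger log)
    {
        var share = new ShareClient(Environment.GetEnvironmentVariable("StorageConnection"), "myshare");
        ShareDirectoryClient root = share.GetRootDirectoryClient();

        // Placeholder: in a real function, persist the last run time (e.g. in a blob or table).
        DateTimeOffset lastRun = DateTimeOffset.UtcNow.AddMinutes(-5);

        await foreach (ShareFileItem item in root.GetFilesAndDirectoriesAsync())
        {
            if (item.IsDirectory) continue;
            ShareFileProperties props = (await root.GetFileClient(item.Name).GetPropertiesAsync()).Value;
            if (props.LastModified > lastRun)
                log.LogInformation("New or changed file: {name}", item.Name); // raise your notification here
        }
    }
}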
In my opinion, duplicating the files to Blob storage just to drive your notifications may not be a great option, because the copy operation itself requires a polling mechanism like the ones mentioned above, so it adds an unnecessary step.
At the moment, I routinely have to upload a file to a Teams channel manually.
I have managed to create a pipeline to upload the file into my Azure Data Lake. I would now like to push the file from my Azure environment to my Teams Channel. I have found that webhooks cannot work with files and that bots can send files in the chat but not "Upload" them into a channel.
Is there a way to upload files from Azure to MS Teams using Data Factory or other alternatives?
Thank you.
The most appropriate way to partially achieve this would be to use a Teams Adaptive Card connector, but I couldn't find any way to easily set it up and it seems quite complex. A second option would be to use the Teams Post Message connector from an Azure Logic App; unfortunately it might not yet support attachments, but you can send links to files stored elsewhere (SharePoint, Blob, etc.). Here's what you can try in the meantime.
In ADF:
Create a Copy Activity that will copy the files to a Storage account.
Create a Web Activity that will send a notification to a Logic App when the pipeline runs. You only need to use this as a trigger for the Logic App.
In Logic Apps:
Create an HTTP connector that will receive the pipeline run information; the files can be stored from here using the Azure Storage Blob Create blob action.
After that, create a Microsoft Teams Post a message (V3) connector and choose your Team and Channel.
Use the Get blob content using path action to get a URL to the file; you might need to construct this URL yourself (see the sketch after this list).
Create your message using variables from the above connector and pass a link to the file so it can be downloaded.
Depending on the content of the file, you could also try to retrieve partial content and display it directly in the message body (if that is an acceptable alternative for you).
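If the connector output doesn't give you a directly shareable URL, one way to construct the link yourself is a read-only SAS generated in code. This is a minimal sketch, assuming the Azure.Storage.Blobs SDK and a shared-key credential; the account, container and blob names are placeholders:

using System;
using Azure.Storage;
using Azure.Storage.Blobs;
using Azure.Storage.Sas;

// Build a read-only, time-limited link to the blob so the Teams message can point at it.
var credential = new StorageSharedKeyCredential("myaccount", "<account-key>");
var blob = new BlobClient(
    new Uri("https://myaccount.blob.core.windows.net/reports/daily.csv"), credential);

Uri sasUrl = blob.GenerateSasUri(BlobSasPermissions.Read, DateTimeOffset.UtcNow.AddDays(7));
Console.WriteLine(sasUrl); // pass this URL into the Teams message body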
I've not tried using the Adaptive Card Connector but I know it does give you a far richer dynamic experience. You would need to spend some time to design a custom card solution. Use this playground to see if it's something you can explore in the future.
The flow is as follows, and the two branches run roughly in parallel:
ADF Pipeline runs > ADF Copy Activity saves file in Blob storage
ADF Pipeline runs > ADF Web Activity triggers Logic App HTTP Connector > Logic App retrieves file from Blob storage > Sends a message to a Teams channel with a link to the file.
Here are all the supported Teams Actions.
I have a network location where a CSV file gets dumped every hour. I need to copy that file to an Azure blob. How do I know that a file has been uploaded to that network drive? Is there something like a file watcher in Azure which monitors this network location? Also, is it possible to copy a file from the network location to an Azure blob through code?
I'm using .net core APIs deployed to an Azure App Service.
Please suggest a possible solution.
You could use Azure Event Grid, but as of today Event Grid does not support Azure File Shares.
As your file share is on-prem, the only way I see is to write a custom publisher that runs on-prem and sends a custom event to Azure Event Grid, with a subscriber (which can be an Azure Function) doing the work you want it to do.
https://learn.microsoft.com/en-us/azure/event-grid/custom-event-quickstart-portal
But that will only deliver an event, not the file itself that was added or changed; to process the file you will have to upload it into Azure as well. Since that approach requires you to do two things, I would recommend running custom code on-prem as a cron-like job that looks for new or edited files, uploads them to Azure Blob Storage, and then lets an Azure Function do your processing task.
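A minimal sketch of that on-prem uploader, assuming the Azure.Storage.Blobs SDK; the share path, container name and the "already handled" check are placeholders, and the scheduling (Task Scheduler, cron, or a FileSystemWatcher) is up to you:

using System;
using System.IO;
using System.Threading.Tasks;
using Azure.Storage.Blobs;

class UploadNewCsvFiles
{
    static async Task Main()
    {
        var container = new BlobContainerClient(
            Environment.GetEnvironmentVariable("STORAGE_CONNECTION"), "incoming-csv");
        await container.CreateIfNotExistsAsync();

        // Scheduled (e.g. hourly) scan of the network share for recently written CSV files.
        foreach (string path in Directory.EnumerateFiles(@"\\fileserver\drop", "*.csv"))
        {
            if (File.GetLastWriteTimeUtc(path) < DateTime.UtcNow.AddHours(-1)) continue; // already handled

            BlobClient blob = container.GetBlobClient(Path.GetFileName(path));
            using FileStream stream = File.OpenRead(path);
            await blob.UploadAsync(stream, overwrite: true); // a BlobTrigger function can take over from here
            Console.WriteLine($"Uploaded {path}");
        }
    }
}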
Since the files are on-prem, you can use PowerShell to monitor a folder for new files and then fire an event to upload the file to an Azure blob.
There is a video showing how to do this here: https://www.youtube.com/watch?v=Usih7UywZYA
The changes you need to make are:
Replace the action with an upload to Azure: https://argonsys.com/microsoft-cloud/library/how-to-upload-files-to-azure-blob-storage-using-powershell-and-azcopy/
Run PowerShell in the context of a user that can upload files.
I've been asked to change an old Azure Cloud Service worker's logging to the System.Diagnostics.Trace style of logging. I've done that and now I'm about ready to deploy it to Azure.
The client requirement is that these logs should appear in blob storage, similar to how the more modern App Service logs can be configured to write their diagnostics to blob storage. There is an expectation that logs can be batched up and uploaded periodically (perhaps based on time or number of lines).
Is there a NuGet package or other library or config I should enable to connect the application to blob storage? I've spent about 20 minutes searching here and online for a solution, but the information seems to mainly talk about writing logs to Table Storage.
Edit: More detail:
This is an existing app (C# .Net Framework 4.5) that used to use an external logging service.
I assumed (incorrectly, I think) that the logging to blob storage was something I could configure in the Azure Portal.
As things are right now, NO log file of any kind is generated, but when I run the code in Visual Studio, I can see some output from the logging statements.
I have updated the code to use a standard (custom) logging system that eventually boils down to statements like the one below:
Trace.TraceInformation($"DEBUG: {message}");
Here are some links I found with related information:
Streaming from command line
Trace listener question
Adding Trace to existing website
Performance Impact of Logging
Smarx Library
The logging is configured by the diagnostics.wadcfgx file which you can see in your solution.
This holds all of the diagnostic information that you want to collect. This can be controlled via the "Properties" of the Web\Worker role (right-click -> Properties).
From there, there is also the option to specify the Storage Account.
This isn't always ideal if you are deploying to multiple environments, so you should be able to alter the configuration from the Azure Portal, by downloading and uploading new configuration, following these instructions.
Think of logging to blob storage as uploading existing files to blob storage. If your current app creates log files, then you should use the Put Blob or Append Blob operations to add these files to blob storage, so you must interact with the Storage SDK to make these transactions. You could also leverage Logic Apps, which have connectors to blob storage and can perform certain actions based on specific triggers (timestamps and other conditions).
If you would like to see generated logs in Azure Storage, you'll have to enable Azure diagnostics, but those logs pertain to the storage account itself, not your app.
Since you mentioned that you can see the output, you have to capture that output as an object (e.g. a text file) and then upload it to the storage account. You can find SDK information for C# here. I hope this helps.
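To give an idea of what the Append Blob route looks like, here is a minimal sketch assuming the Azure.Storage.Blobs SDK; a custom TraceListener (or whatever your custom logging system buffers) would call FlushAsync periodically with the batched lines, and the container/blob names are placeholders:

using System;
using System.Collections.Generic;
using System.IO;
using System.Text;
using System.Threading.Tasks;
using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Specialized;

public class BlobLogWriter
{
    private readonly AppendBlobClient _blob;

    public BlobLogWriter(string connectionString)
    {
        var container = new BlobContainerClient(connectionString, "worker-logs");
        container.CreateIfNotExists();
        // One append blob per day; each flushed batch of trace lines becomes one appended block.
        _blob = container.GetAppendBlobClient($"trace-{DateTime.UtcNow:yyyy-MM-dd}.log");
        _blob.CreateIfNotExists();
    }

    public async Task FlushAsync(IEnumerable<string> bufferedLines)
    {
        byte[] bytes = Encoding.UTF8.GetBytes(string.Join(Environment.NewLine, bufferedLines) + Environment.NewLine);
        using var stream = new MemoryStream(bytes);
        await _blob.AppendBlockAsync(stream);
    }
}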
I'm new to Azure and I need to create a service that, given an Office 365 email address (a subscriber), will automatically move files attached to the subscriber's new mails to a VM on Azure and then run some tests on them there (inside the VM).
The only way I have found to implement this so far is creating a Logic App for each subscriber, which is done manually.
Any help would be appreciated!
A few things if you want to get started:
Create a Logic App that stores attachments in the database when a new email is received for a specific user.
Add some parameters to your Logic App so the user email/credentials/tenant are not hard-coded.
https://blog.mexia.com.au/preparing-azure-logic-apps-for-cicd
Create an ARM Template to deploy this logic app.
Create another logic app that will deploy the previous logic app.
Whenever a new user is created, call the second Logic App.
Also, do you really need to store your files in a database? As an alternative, you can use Azure Blob Storage to store all these files.
EDIT
If you need to move the files to a VM, I would suggest you to do this:
When you receive an email
Store the attachments in Blob storage.
Generate a SAS Token (with Read permission)
Put the URL + SAS token of your file into an Azure Service Bus queue.
On the VM, have a service that reads messages from the queue and downloads the files (a minimal sketch is below).
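A minimal sketch of that consumer, assuming the Azure.Messaging.ServiceBus and Azure.Storage.Blobs SDKs and that each message body is simply the blob URL with its SAS token (the queue name and local path are placeholders):

using System;
using System.IO;
using System.Threading;
using System.Threading.Tasks;
using Azure.Messaging.ServiceBus;
using Azure.Storage.Blobs;

class AttachmentWorker
{
    static async Task Main()
    {
        await using var client = new ServiceBusClient(Environment.GetEnvironmentVariable("SERVICEBUS_CONNECTION"));
        ServiceBusProcessor processor = client.CreateProcessor("attachments");

        processor.ProcessMessageAsync += async args =>
        {
            string sasUrl = args.Message.Body.ToString(); // URL + SAS token placed on the queue by the Logic App
            var blob = new BlobClient(new Uri(sasUrl));   // the SAS in the URL is the only credential needed
            await blob.DownloadToAsync(Path.Combine(@"C:\incoming", blob.Name));
            // run your tests on the downloaded file here
            await args.CompleteMessageAsync(args.Message);
        };
        processor.ProcessErrorAsync += args => Task.CompletedTask; // log args.Exception in a real service

        await processor.StartProcessingAsync();
        await Task.Delay(Timeout.Infinite); // keep the worker alive
    }
}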