Notify email when Azure Storage table gets new entry

Is there a built-in way to send an email notification when a new entry is added to the Table?
I am not asking for anything programmatic, just something within Azure's own UI.

Not currently, but you could put the entry on an Azure Storage Queue, process it into Table Storage, and send an email with Azure Functions; a sketch of such a function follows.
Check out this page to see what is possible: https://learn.microsoft.com/en-us/azure/azure-functions/functions-triggers-bindings
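A minimal sketch of that pattern for the Python Functions runtime, assuming the queue trigger is declared in function.json; the table name, SMTP host, and email addresses are placeholders:

```python
import json
import logging
import smtplib
from email.message import EmailMessage

import azure.functions as func
from azure.data.tables import TableClient

# Placeholder settings; in a real Function app these come from app settings.
STORAGE_CONNECTION_STRING = "<storage-connection-string>"
TABLE_NAME = "<table-name>"

def main(msg: func.QueueMessage) -> None:
    # Queue-triggered entry point: persist the message as a table entity,
    # then send a notification email.
    entity = json.loads(msg.get_body().decode("utf-8"))

    table = TableClient.from_connection_string(STORAGE_CONNECTION_STRING, TABLE_NAME)
    table.create_entity(entity)  # entity must carry PartitionKey and RowKey

    mail = EmailMessage()
    mail["Subject"] = f"New table entry: {entity['RowKey']}"
    mail["From"] = "alerts@example.com"   # placeholder addresses
    mail["To"] = "you@example.com"
    mail.set_content(json.dumps(entity, indent=2))

    with smtplib.SMTP("smtp.example.com", 587) as smtp:  # placeholder SMTP host
        smtp.starttls()
        smtp.send_message(mail)

    logging.info("Stored and notified about entity %s", entity["RowKey"])
```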

Okay, so the easiest way to see the data is to use their desktop app, Azure Storage Explorer:
https://azure.microsoft.com/en-us/features/storage-explorer/

The latest Azure product updates covering event-driven applications suggest adopting these application patterns to react to events published by Azure Storage. These docs/resources might help you explore the pattern and the current platform/framework support.
Azure Storage
Reacting to Blob storage events (preview)
I did not find anything similar for Azure Tables - what about suggesting this on UserVoice?
CosmosDB
Working with the change feed support in Azure Cosmos DB (a short sketch follows)
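For the Cosmos DB route, here is a minimal sketch of reading the change feed with the azure-cosmos Python SDK; the account endpoint, key, and database/container names are placeholders, and the exact keyword arguments vary between SDK versions:

```python
from azure.cosmos import CosmosClient

# Placeholder endpoint, key, and names; substitute your own.
client = CosmosClient("https://<account>.documents.azure.com:443/", credential="<key>")
container = client.get_database_client("<db>").get_container_client("<container>")

# Iterate over every change recorded since the beginning of the container's
# change feed (newer SDK versions express this via a start_time argument).
for item in container.query_items_change_feed(is_start_from_beginning=True):
    print(item["id"])
```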

Related

How to choose only a specific table as follower instead of the entire DB in Azure Data Explorer using Azure Data Share

I am working on something where I need to replicate only a few tables instead of the entire database from the leader cluster. How should I do it in the Azure Portal using Azure Data Share? I can see from the Azure documentation that they are using C# or some other language for it; can we do it directly via the Azure Portal?
As of this writing, table-level sharing isn't yet available through Azure Data Share, but should become available in the next few weeks (follow this doc for updates: https://learn.microsoft.com/en-us/azure/data-explorer/data-share)
As you correctly mentioned, it is already available programmatically using the management API (documented here: https://learn.microsoft.com/en-us/azure/data-explorer/follower#table-level-sharing); a sketch of that call follows.
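A minimal sketch of that management-API call using the azure-mgmt-kusto Python package, assuming the follower cluster already exists; every subscription, resource group, cluster, and table name below is a placeholder:

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.kusto import KustoManagementClient
from azure.mgmt.kusto.models import (
    AttachedDatabaseConfiguration,
    TableLevelSharingProperties,
)

# Placeholder subscription and resource names; substitute your own.
client = KustoManagementClient(DefaultAzureCredential(), "<subscription-id>")

config = AttachedDatabaseConfiguration(
    location="West Europe",
    cluster_resource_id=(
        "/subscriptions/<sub>/resourceGroups/<leader-rg>/providers/"
        "Microsoft.Kusto/Clusters/<leaderCluster>"
    ),
    database_name="<database>",
    default_principals_modification_kind="Union",
    # Follow only the tables you need instead of the entire database.
    table_level_sharing_properties=TableLevelSharingProperties(
        tables_to_include=["Sales", "Customers"]  # hypothetical table names
    ),
)

client.attached_database_configurations.begin_create_or_update(
    resource_group_name="<follower-rg>",
    cluster_name="<followerCluster>",
    attached_database_configuration_name="<config-name>",
    parameters=config,
).result()
```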

Can azure event hub ingest json events from azure blob storage without writing any code?

Is it possible to use some ready-made construct in the Azure cloud environment to ingest the events (in JSON format) that are currently stored in Azure Blob Storage, and have it submit those events directly to Azure Event Hub without writing any (however small) custom code? In other words, I would like to use a configuration-driven approach only.
Sure. You can use Azure Logic Apps to meet this need without any code, or with just a few function expressions; please refer to the official Azure Logic Apps documentation for more details.
[Figure: the Logic App flow, an Event Hub trigger followed by a Blob Storage action]
You can refer to my sample below to make it work.
Here is my sample, which receives an event from my Event Hub and transfers it to Azure Blob Storage, creating a new blob to store the event data.
1) Create an Azure Logic App instance in the Azure portal; it should be easy for you.
2) Move to the Logic app designer tab to configure the logic flow.
3) Click the Save and Run buttons. Then use ServiceBusExplorer (downloaded from https://github.com/paolosalvatori/ServiceBusExplorer/releases) to send an event message, and check with Azure Storage Explorer whether a new blob was created. It worked fine after a few minutes.
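If you would rather send the test event from a script than from ServiceBusExplorer, here is a minimal sketch using the azure-eventhub Python package; the connection string and hub name are placeholders:

```python
from azure.eventhub import EventHubProducerClient, EventData

# Placeholder connection string and hub name; substitute your own.
producer = EventHubProducerClient.from_connection_string(
    "<event-hubs-namespace-connection-string>", eventhub_name="<hub-name>"
)

with producer:
    batch = producer.create_batch()
    batch.add(EventData('{"id": 1, "msg": "test event"}'))  # the JSON test event
    producer.send_batch(batch)
```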

Azure Unzip automation

I am looking to do the following in Azure; however, I should point out that on my local machine I have no Visual Studio, no admin rights, no IT support, and no tools (except SSMS), but I have a VERY strong drive to complete this work if it's possible.
I have created an Azure blob which receives a file each day (zipped) from a 3rd party. I am looking to do the following:
1) Unzip the data in an automated fashion
2) Get the data into an Azure SQL database (already created) in an automated fashion
What I want to know is whether this is possible using Azure alone, or am I going to need admin rights / Visual Studio? If it is possible, any directions you could point me in would be gratefully received!
Thanks
Dave
Based on your description, one approach would be to create a blob-triggered Azure Function through the Azure Portal (Visual Studio is not required), unzip/process the file, and save the desired data into Azure SQL; a sketch of such a function is below. Moreover, considering that there is only one new file per day, prefer the Consumption Plan to optimize cost.
Find more details about the Azure Functions blob binding at https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-storage-blob.
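A minimal sketch of such a blob-triggered function in Python, assuming the daily archive contains CSV files and a target table dbo.DailyData with two columns; the connection string, table, and column names are all placeholders:

```python
import csv
import io
import logging
import zipfile

import azure.functions as func
import pyodbc

# Placeholder connection string; keep it in app settings in practice.
SQL_CONN = (
    "Driver={ODBC Driver 18 for SQL Server};Server=tcp:<server>.database.windows.net;"
    "Database=<db>;Uid=<user>;Pwd=<password>;Encrypt=yes;"
)

def main(blob: func.InputStream) -> None:
    # Blob-triggered entry point (binding declared in function.json):
    # unzip the daily file and bulk-insert its rows into Azure SQL.
    logging.info("Processing %s", blob.name)

    with zipfile.ZipFile(io.BytesIO(blob.read())) as archive:
        with pyodbc.connect(SQL_CONN) as conn:
            cursor = conn.cursor()
            for name in archive.namelist():
                # Assumes each archived file is a two-column CSV.
                rows = csv.reader(
                    io.TextIOWrapper(archive.open(name), encoding="utf-8")
                )
                cursor.executemany(
                    "INSERT INTO dbo.DailyData (Col1, Col2) VALUES (?, ?)",
                    list(rows),
                )
            conn.commit()
```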
Alternatively, spin up a Data Factory on Azure; an unzip capability (decompressing zipped source files) is available in ADF.

Unable to create an Import/Export Job on the new Azure portal

I have been trying to set up an import job as described here; the problem is that we do not have "Classic" storage; rather, we are trying to set it up with "New" storage. Using the new portal, I cannot find the place where one is meant to create a new job. The linked article shows how to do this for Classic storage on the old portal only.
I have tried using the second approach they mention, which is to use the API, but that is turning out to be more of a pain than I thought.
Does anyone know where I can add an import/export job in the new portal? Is this possible with "new" storage? If I manage to get the API way to work, can it be applied to "new" storage or is it only for "classic"?
Unfortunately, Import/Export is not available in the Preview Portal, and does not work at this time with v2 storage accounts. Can you use a Classic storage account instead?
We may be able to provide sample code to unblock your scenario. Can you please send an email with your detailed requirements to waimportexport@microsoft.com so that we can set up a call to discuss further.
Thanks,
Aung

Where does Azure keep non-VM logs? Can they be downloaded programmatically?

Azure keeps a bunch of VM (and cloud service) related logs in WAD* tables. This question is about actions which do not necessarily affect VMs. Say someone deleted a Table Storage table. Does Azure keep a log record of that? If yes, where? How can those records be fetched using a program/script?
The Service Management REST API can be used to retrieve the operation logs programmatically; a sketch is below.
List Subscription Operations
https://msdn.microsoft.com/library/azure/gg715318.aspx
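A minimal sketch of calling List Subscription Operations from Python with requests, assuming a management certificate has already been uploaded to the subscription; the subscription ID, certificate path, and time window are placeholders:

```python
import requests

SUBSCRIPTION_ID = "<subscription-id>"  # placeholder
CERT = "management_cert.pem"           # cert+key PEM uploaded to the subscription

# List Subscription Operations over a given time window (Service Management API).
url = (
    f"https://management.core.windows.net/{SUBSCRIPTION_ID}/operations"
    "?StartTime=2015-01-01T00:00:00Z&EndTime=2015-01-31T00:00:00Z"
)
resp = requests.get(url, cert=CERT, headers={"x-ms-version": "2014-06-01"})
resp.raise_for_status()
print(resp.text)  # XML list of operations, including deletes of storage resources
```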
