My Requirement:
As soon as a file is uploaded to a Blob Container, an Azure Function should be alerted, and within that Azure Function I want to call a WebJob that uses the uploaded file and performs some task.
What I learnt:
I learnt that an Azure Function can be triggered when a file is uploaded to a blob container. I followed the tutorials and was able to configure an Azure Function that reacts to any change in the blob container. I did this via the Azure Portal, not Visual Studio.
Now I want to call a WebJob from within the Azure Function. Please help me with this.
Assuming you have written your function in C#, the code below might help. Essentially, the idea is to send a POST request that triggers your job:
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;

// Trigger the WebJob via the Kudu (SCM) endpoint, authenticating with the site's deployment credentials.
HttpClient client = new HttpClient();
client.BaseAddress = new Uri("https://your_web_site.scm.azurewebsites.net/api/");
var byteArray = Encoding.ASCII.GetBytes("your_username:your_password");
client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Basic", Convert.ToBase64String(byteArray));
var response = await client.PostAsync("triggeredwebjobs/your_web_job_name/run", null);
You can find the username and password in the Azure portal, in the properties of your WebJob.
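For context, here is a minimal sketch (mine, not from the original answer) of how that call could sit inside a blob-triggered C# function, v1 run.csx style. The site name, WebJob name, and credentials are placeholders, and passing the blob name through Kudu's "arguments" query parameter is one way to tell the WebJob which file to process:

using System.IO;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;

public static async Task Run(Stream myBlob, string name, TraceWriter log)
{
    log.Info($"New blob uploaded: {name}; triggering WebJob...");
    using (var client = new HttpClient())
    {
        client.BaseAddress = new Uri("https://your_web_site.scm.azurewebsites.net/api/");
        var byteArray = Encoding.ASCII.GetBytes("your_username:your_password");
        client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Basic", Convert.ToBase64String(byteArray));

        // Pass the uploaded blob's name so the WebJob knows which file to work on.
        var response = await client.PostAsync("triggeredwebjobs/your_web_job_name/run?arguments=" + Uri.EscapeDataString(name), null);
        log.Info($"WebJob trigger returned {response.StatusCode}");
    }
}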
I have an Azure WebJob hosted inside an Azure Web App which runs every hour, and I need to store the WebJob's run time as a key-value pair. The next time the WebJob runs, it will pick up the last run time and do its operations. I was thinking of adding a key-value pair to the App Settings of the Azure App Service, but I am not able to find any code to update a value in the App Settings.
Can anyone please let me know the code? Also, is this a good approach, or should I use an Azure Storage container to store the last batch run time?
but I am not able to find any code to update a value in the App Settings.
You could use the Microsoft.WindowsAzure.Management.WebSites package to achieve this:
// webSpaceName, webSiteName, mySetting and newValue are assumed to be defined elsewhere.
var credentials = GetCredentials(/* using certificate */);
using (var client = new WebSiteManagementClient(credentials))
{
    var currentConfig = await client.WebSites.GetConfigurationAsync(webSpaceName, webSiteName);

    // Null members are left unchanged; only AppSettings is updated.
    var newConfig = new WebSiteUpdateConfigurationParameters
    {
        ConnectionStrings = null,
        DefaultDocuments = null,
        HandlerMappings = null,
        Metadata = null,
        AppSettings = currentConfig.AppSettings
    };
    newConfig.AppSettings[mySetting] = newValue;

    await client.WebSites.UpdateConfigurationAsync(webSpaceName, webSiteName, newConfig);
}
Alternatively, you could use the Azure Fluent API; refer to this SO thread.
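As to whether this is a good approach: note that updating App Settings restarts the app. If you prefer the storage-container option you asked about, here is a minimal sketch (my own, not from the linked thread), assuming the WindowsAzure.Storage client library; the container and blob names are placeholders I picked:

using System;
using System.Threading.Tasks;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

public static async Task<DateTimeOffset?> GetLastRunTimeAsync(string connectionString)
{
    var container = CloudStorageAccount.Parse(connectionString)
        .CreateCloudBlobClient()
        .GetContainerReference("state");
    await container.CreateIfNotExistsAsync();

    var blob = container.GetBlockBlobReference("last-run-time.txt");
    if (!await blob.ExistsAsync())
        return null; // first run: no previous run time stored yet

    return DateTimeOffset.Parse(await blob.DownloadTextAsync());
}

public static async Task SetLastRunTimeAsync(string connectionString, DateTimeOffset runTime)
{
    var blob = CloudStorageAccount.Parse(connectionString)
        .CreateCloudBlobClient()
        .GetContainerReference("state")
        .GetBlockBlobReference("last-run-time.txt");

    // Round-trip ("o") format keeps the timestamp lossless and parseable.
    await blob.UploadTextAsync(runTime.ToString("o"));
}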
I am trying to create an Azure Function that reads from a .mmdb file (the GeoLite2 Country DB).
I have added the GeoLite2 file next to my function, but I cannot find a way to reference the file path programmatically such that it stays the same on my local machine as well as when deployed/published.
string GeoLocationDbPath = @"D:<path_to_project>\Functions\GeoLocation-Country.mmdb";
var reader = new DatabaseReader(GeoLocationDbPath);
I came across this article: How to add assembly references to an Azure Function App.
I was hoping there was a better way to reference a file both locally and when deployed.
Any ideas?
Other links I've looked at:
How to add and reference a external file in Azure Function
How to add a reference to an Azure Function C# project?
Retrieving information about the currently running function
Azure functions – Read file and use SendGrid to send an email
You can get the path to the folder by injecting ExecutionContext into your function:
public static HttpResponseMessage Run(HttpRequestMessage req, ExecutionContext context)
{
var funcPath = context.FunctionDirectory; // e.g. d:\home\site\wwwroot\HttpTrigger1
var appPath = context.FunctionAppDirectory; // e.g. d:\home\site\wwwroot
// ...
}
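Building on that, here is a minimal sketch (my own, following the question's file layout) that resolves the .mmdb relative to the function app root with Path.Combine, so the same code works locally and deployed; DatabaseReader comes from the MaxMind.GeoIP2 package:

using System.IO;
using System.Net;
using System.Net.Http;
using MaxMind.GeoIP2;

public static HttpResponseMessage Run(HttpRequestMessage req, ExecutionContext context)
{
    // FunctionAppDirectory is the app root both locally and in Azure.
    var dbPath = Path.Combine(context.FunctionAppDirectory, "Functions", "GeoLocation-Country.mmdb");

    using (var reader = new DatabaseReader(dbPath))
    {
        var country = reader.Country("8.8.8.8"); // example lookup
        return req.CreateResponse(HttpStatusCode.OK, country.Country.IsoCode);
    }
}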
I've come to the conclusion that referencing files local to the Azure Function was not a good approach. I read the Azure Functions best practices: functions are meant to be stateless, and the folder structure changes when you deploy.
If I were to continue, I would upload the .mmdb file to an Azure blob container and use CloudStorageAccount to access the file.
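For completeness, a rough sketch of that blob-based approach (the container and blob names are placeholders I picked; DatabaseReader accepts a Stream):

using System.IO;
using System.Threading.Tasks;
using MaxMind.GeoIP2;
using Microsoft.WindowsAzure.Storage;

public static async Task<DatabaseReader> LoadGeoDbAsync(string connectionString)
{
    var blob = CloudStorageAccount.Parse(connectionString)
        .CreateCloudBlobClient()
        .GetContainerReference("geodata")
        .GetBlockBlobReference("GeoLocation-Country.mmdb");

    // Download the database into memory and build the reader from the stream.
    var stream = new MemoryStream();
    await blob.DownloadToStreamAsync(stream);
    stream.Position = 0;
    return new DatabaseReader(stream);
}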
I am having a problem updating a CSV file in my blob storage. I have an existing CSV file inside my blob container, and when I press the download button it automatically downloads the file to my machine.
I have now created a Logic App that updates the CSV file. When I run the app's trigger, it updates the file, but when I press download it opens a new tab where the CSV content is displayed instead.
I want it to work the way it originally did: pressing download should download the file to my machine.
Any help will do, or confirmation of whether this is possible.
I have already tried "Compose" and "Create CSV table", but that way the result is not stored in a blob.
From my tests, when you create a .csv file with the "Create blob" action in a Logic App, you will always hit this problem, because the blob's content type is "text/plain", which the browser displays in another tab.
So I suggest you use an Azure Function to create the blob. In the function you can set: blob.Properties.ContentType = "application/octet-stream";
Here is the create-blob code for the Azure Function:
using System.IO;
using System.Text;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

CloudStorageAccount storageAccount = CloudStorageAccount.Parse(connectionString);
CloudBlobClient client = storageAccount.CreateCloudBlobClient();
CloudBlobContainer container = client.GetContainerReference("data");
await container.CreateIfNotExistsAsync();

CloudBlockBlob blob = container.GetBlockBlobReference(name);
// "application/octet-stream" makes the browser download the blob instead of displaying it.
blob.Properties.ContentType = "application/octet-stream";
using (Stream stream = new MemoryStream(Encoding.UTF8.GetBytes(data)))
{
    await blob.UploadFromStreamAsync(stream);
}
For more detailed code, you could refer to this article. After following it, you will be able to download your blob to your local machine.
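For reference, here is a minimal sketch (mine, not from the linked article) of wrapping that code in an HTTP-triggered function that the Logic App can call, with the blob name passed as a query parameter and the CSV content in the request body:

using System;
using System.IO;
using System.Linq;
using System.Net;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;
using Microsoft.WindowsAzure.Storage;

public static async Task<HttpResponseMessage> Run(HttpRequestMessage req, TraceWriter log)
{
    string name = req.GetQueryNameValuePairs().FirstOrDefault(q => q.Key == "name").Value;
    string data = await req.Content.ReadAsStringAsync();

    var container = CloudStorageAccount
        .Parse(Environment.GetEnvironmentVariable("AzureWebJobsStorage"))
        .CreateCloudBlobClient()
        .GetContainerReference("data");
    await container.CreateIfNotExistsAsync();

    var blob = container.GetBlockBlobReference(name);
    blob.Properties.ContentType = "application/octet-stream"; // force download
    using (Stream stream = new MemoryStream(Encoding.UTF8.GetBytes(data)))
    {
        await blob.UploadFromStreamAsync(stream);
    }
    return req.CreateResponse(HttpStatusCode.OK);
}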
Note: when you create the Azure Function action in the Logic App, it may show an error. Just delete the "AzureWebJobsSecretStorageType" app setting (whose value is "blob").
In Azure, I have a Storage Account that I use to upload files from an IoT device. The files are sent when the IoT device detects certain conditions. All the files are uploaded to the same blob container and into the same folder (inside the blob container).
What I would like to do is send an email automatically (as an alert) when a new file is uploaded to the blob container. I have checked the different options Azure provides for setting alerts on Storage Accounts (in the Azure Portal) but have not found anything useful.
How could I create this kind of alert?
As far as I know, Azure provides Azure Functions and WebJobs, both of which can be triggered when new files are uploaded to a specific container.
I suggest you use an Azure Function with a blob trigger to achieve your requirement.
For more details, you could refer to this article.
In the method fired by the blob trigger, you can also bind SendGrid to send the email.
For more details, refer to the steps below:
Notice: I used a C# Azure Function as the example; you could also use another language.
1. Create a blob-trigger Azure Function.
2. Create a SendGrid account (link) and create an API key.
3. Configure the SendGrid output binding on the created Azure Function.
4. Add the code below to the function's run.csx.
#r "SendGrid"
using System;
using SendGrid;
using SendGrid.Helpers.Mail;
public static Mail Run(Stream myBlob, string name, TraceWriter log)
{
var message = new Mail
{
Subject = "Azure news"
};
var personalization = new Personalization();
personalization.AddTo(new Email("sendto email address"));
Content content = new Content
{
Type = "text/plain",
Value = name
};
message.AddContent(content);
message.AddPersonalization(personalization);
return message;
}
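To verify the whole chain end to end, you can drop a test file into the watched container from a small console app; a minimal sketch, assuming the container is named "samples-workitems" (the blob-trigger template default) and using the WindowsAzure.Storage library:

using System.Threading.Tasks;
using Microsoft.WindowsAzure.Storage;

static async Task UploadTestFileAsync()
{
    var account = CloudStorageAccount.Parse("your_storage_connection_string");
    var container = account.CreateCloudBlobClient().GetContainerReference("samples-workitems");
    await container.CreateIfNotExistsAsync();

    // The blob trigger should fire shortly after this upload, and the email should arrive.
    var blob = container.GetBlockBlobReference("test.txt");
    await blob.UploadTextAsync("hello from the IoT device");
}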
I am a beginner in Azure and need some help. We are facing a bit of a problem with the Azure Storage services and are unable to proceed.
The issue is described here:
http://blogs.msdn.com/b/windowsazurestorage/archive/2014/08/05/microsoft-azure-storage-service-version-removal.aspx
To summarize:
We have to inspect the version used by all of our blob, table, and queue requests, in case any of them use a version set for planned removal. I have enabled logging for the web application on the Azure portal site. I am able to see the three service endpoints as follows:
https://<accountname>.blob.core.windows.net
https://<accountname>.table.core.windows.net
https://<accountname>.queue.core.windows.net
Now, the articles below show that the log entries have a version included, but they do NOT specify where to locate the logs or how to gather them. I have tried different things, including https://<accountname>.blob.core.windows.net/$logs, but it makes no difference.
The required logs should be in this format (sample):
Here is a sample log entry, with the version used highlighted; in this case the request was an anonymous GetBlob request which implicitly used the 2009-09-19 version:
1.0;2011-08-09T18:52:40.9241789Z;GetBlob;AnonymousSuccess;200;18;10;anonymous;;myaccount;blob;"https://myaccount.blob.core.windows.net/thumbnails/lake.jpg?timeout=30000";"/myaccount/thumbnails/lake.jpg";a84aa705-8a85-48c5-b064-b43bd22979c3;0;123.100.2.10;2009-09-19;252;0;265;100;0;;;"0x8CE1B6EA95033D5";Friday, 09-Aug-11 18:52:40 GMT;;;;"8/9/2011 6:52:40 PM ba98eb12-700b-4d53-9230-33a3330571fc"
Can you please show me a way to view these logs? Is there a tool I can use?
Since these logs are stored in a blob container called $logs, any storage explorer which supports viewing data from this container can be used to view its contents. To the best of my knowledge, the following tools support it: Azure Storage Explorer, Cerebrata Azure Management Studio, Cloud Portam (disclosure: I am the developer working on this tool).
However, before you can view the data you need to enable logging on your storage account; only when logging is enabled will this container show up in your account. To enable logging, you can again use Azure Management Studio or Cloud Portam, or you can use the code below (which assumes you have the latest version of the Storage Client Library):
static void SetLoggingProperties()
{
    CloudStorageAccount account = new CloudStorageAccount(new StorageCredentials(StorageAccount, StorageAccountKey), true);

    // Log all operations and keep them for a year; "1.0" is the log format version.
    LoggingProperties properties = new LoggingProperties()
    {
        LoggingOperations = LoggingOperations.All,
        RetentionDays = 365,
        Version = "1.0",
    };

    // Members left null are omitted from the request, so their current settings are kept.
    ServiceProperties serviceProperties = new ServiceProperties()
    {
        Cors = null,
        HourMetrics = null,
        MinuteMetrics = null,
        Logging = properties,
    };

    // Enable logging for all three services.
    var blobClient = account.CreateCloudBlobClient();
    blobClient.SetServiceProperties(serviceProperties);
    var tableClient = account.CreateCloudTableClient();
    tableClient.SetServiceProperties(serviceProperties);
    var queueClient = account.CreateCloudQueueClient();
    queueClient.SetServiceProperties(serviceProperties);
}
Once logging properties are set, give it some time for logs to show up.
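Once entries appear, you can also read them programmatically instead of using a storage explorer; a minimal sketch of my own, using the same Storage Client Library as above (StorageAccount and StorageAccountKey as in the previous snippet):

using System;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Auth;
using Microsoft.WindowsAzure.Storage.Blob;

static void DumpLogs()
{
    var account = new CloudStorageAccount(new StorageCredentials(StorageAccount, StorageAccountKey), true);
    var container = account.CreateCloudBlobClient().GetContainerReference("$logs");

    // Each blob in $logs holds semicolon-delimited entries; the request version
    // is one of the fields (2009-09-19 in the sample entry above).
    foreach (var item in container.ListBlobs(useFlatBlobListing: true))
    {
        var blob = (CloudBlockBlob)item;
        Console.WriteLine($"--- {blob.Name} ---");
        Console.WriteLine(blob.DownloadText());
    }
}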