I have created two storage accounts: one hosts my Azure Function, while the other is a file storage account. What I want is to create a file from the Azure Function and store it in my Azure File storage. I went through the official documentation for both File storage and Azure Functions, but I am not finding any connecting link between the two.
Is it possible to create a file from an Azure Function and store it in a file storage account? If yes, please assist accordingly.
There is a preview of Azure Functions External File bindings for uploading files to external storage, but it doesn't work with Azure File storage. There is a GitHub issue to create a new type of binding for Files.
Meanwhile, you can upload the file by using the Azure Storage SDK directly. Something like this:
#r "Microsoft.WindowsAzure.Storage"

using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.File;

public static void Run(string input)
{
    // Parse the connection string of the file storage account
    var storageAccount = CloudStorageAccount.Parse("...");
    var fileClient = storageAccount.CreateCloudFileClient();

    // Get references to the file share and the target directory
    var share = fileClient.GetShareReference("...");
    var rootDir = share.GetRootDirectoryReference();
    var sampleDir = rootDir.GetDirectoryReference("MyFolder");

    // Create the file and upload its content
    var fileToCreate = sampleDir.GetFileReference("output.txt");
    fileToCreate.UploadText("Hello " + input);
}
Is there any difference between Azure storage in a local environment and online Azure storage?
We have created local Azure storage using the storage emulator. Refer to the links below.
https://learn.microsoft.com/en-us/azure/storage/common/storage-use-emulator
https://medium.com/oneforall-undergrad-software-engineering/setting-up-the-azure-storage-emulator-environment-on-windows-5f20d07d3a04
However, we are unable to read files from the local Azure storage. Refer to the code below.
const string accountName = "devstoreaccount1"; // Provide the account name
const string key = "Eby8vdM02xNOcqFlqUwJPLlmEtlCDXJ1OUzFT50uSRZ6IFsuFq2UVErCz4I6tq/K1SZFPTOtr/KBHBeksoGMGw=="; // Provide the account key

var storageCredentials = new StorageCredentials(accountName, key);
var cloudStorageAccount = new CloudStorageAccount(storageCredentials, true);

// Connect to the blob storage
CloudBlobClient serviceClient = cloudStorageAccount.CreateCloudBlobClient();

// Connect to the blob container
CloudBlobContainer container = serviceClient.GetContainerReference("container name");
container.SetPermissionsAsync(new BlobContainerPermissions
{
    PublicAccess = BlobContainerPublicAccessType.Blob
}).Wait();

// Connect to the blob file
CloudBlockBlob blob = container.GetBlockBlobReference("sample.txt");
blob.DownloadToFileAsync("sample.txt", System.IO.FileMode.Create).Wait();

// Get the blob file as text
string contents = blob.DownloadTextAsync().Result;
The above code works correctly for reading files in online Azure storage. Can anyone suggest how to resolve the issue of reading files from local Azure storage?
The documentation explains clearly the differences between the Storage Emulator and Azure Storage.
If you would like to access the local storage, you can call the emulator's REST endpoint directly. For more details about the URI format, see here.
GET http://<local-machine-address>:<port>/<account-name>/<resource-path>
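As a concrete illustration, the emulator's defaults are well known (account name devstoreaccount1, blob endpoint on local port 10000), so the URI above can be assembled with a small helper. This is just a sketch; the class and method names are my own:

```java
// Sketch: build the local storage emulator's resource URI from its parts.
// Defaults shown in the usage below are the emulator's documented blob
// endpoint (127.0.0.1:10000) and built-in account name (devstoreaccount1).
public class EmulatorUri {
    public static String resourceUri(String host, int port, String account, String resourcePath) {
        return String.format("http://%s:%d/%s/%s", host, port, account, resourcePath);
    }
}
```

For example, `EmulatorUri.resourceUri("127.0.0.1", 10000, "devstoreaccount1", "mycontainer/sample.txt")` yields `http://127.0.0.1:10000/devstoreaccount1/mycontainer/sample.txt`.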
I have an Azure SAS URI and need to upload a file to it. Can I access it using Java code? I can only find examples that use the account key rather than the SAS (shared access signature) URI.
You can use the Azure Storage SDK for Java. To upload a file, you just need the CloudBlockBlob class, which should look like this:
import java.io.File;
import java.io.FileInputStream;
import java.net.URL;
import com.microsoft.azure.storage.blob.CloudBlockBlob;

URL url = new URL("yourSASUrl");
try
{
    CloudBlockBlob blob = new CloudBlockBlob(url);
    File source = new File(filePath);
    blob.upload(new FileInputStream(source), source.length());
}
catch (Exception e)
{
    e.printStackTrace();
}
I have saved PDF files in an Azure Blob Storage container and I want to show these files on my website. However, once a file is rendered in the HTML, its link should be deactivated, meaning no one can use that link to download the file again. Is this possible in Azure Blob Storage?
You can use a shared access policy with a short expiry time to get close to this:
CloudStorageAccount account = CloudStorageAccount.Parse("yourStringConnection");
CloudBlobClient serviceClient = account.CreateCloudBlobClient();
var container = serviceClient.GetContainerReference("yourContainerName");
container
.CreateIfNotExistsAsync()
.Wait();
CloudBlockBlob blob = container.GetBlockBlobReference("test/helloworld.txt");
blob.UploadTextAsync("Hello, World!").Wait();
SharedAccessBlobPolicy policy = new SharedAccessBlobPolicy();
// define the expiration time
policy.SharedAccessExpiryTime = DateTime.UtcNow.AddMinutes(1);
// define the permission
policy.Permissions = SharedAccessBlobPermissions.Read;
// create signature
string signature = blob.GetSharedAccessSignature(policy);
// get full temporary uri
Console.WriteLine(blob.Uri + signature);
If I understand correctly, you're looking for single-use links to Azure blobs. Natively this feature is not available in Azure Storage. You would need to write code to implement something like this yourself, keeping track of the number of times a link has been used and refusing to serve the link once the limit is exceeded.
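The tracking logic described above can be sketched as follows. This is a minimal in-memory illustration (class and method names are hypothetical); a real deployment would persist the counts somewhere durable and sit in front of the blob, issuing the short-lived SAS URL only when `tryConsume` succeeds:

```java
import java.util.HashMap;
import java.util.Map;

// Sketch: count how many times each link has been served and refuse
// further use once the configured limit is reached.
public class SingleUseLinkTracker {
    private final Map<String, Integer> uses = new HashMap<>();
    private final int maxUses;

    public SingleUseLinkTracker(int maxUses) {
        this.maxUses = maxUses;
    }

    // Returns true (and records the use) if the link may still be served;
    // returns false once the limit has been reached.
    public synchronized boolean tryConsume(String blobUri) {
        int count = uses.getOrDefault(blobUri, 0);
        if (count >= maxUses) {
            return false;
        }
        uses.put(blobUri, count + 1);
        return true;
    }
}
```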
In Azure, I have a Storage Account that I use to upload files from an IoT device. The files are sent when the IoT device detects certain conditions. All the files are uploaded to the same Blob Container and in the same folder (inside the Blob container).
What I would like to do is to send an email automatically (as an alert) when a new file is uploaded to the Blob Container. I have checked the different options that are provided by Azure to set alerts in Storage accounts (in Azure Portal) but I have not found anything useful.
How could I create this kind of alert?
As far as I know, Azure provides Azure Functions and WebJobs, which can be triggered when new files are uploaded to a specific container.
I suggest you use an Azure Function blob trigger to achieve your requirement.
For more details, you can refer to this article.
In the method fired by the blob trigger, you can also bind SendGrid to send the email.
For more details, refer to the steps below:
Notice: I used a C# Azure Function as the example; you can also use another language.
1. Create a blob trigger Azure Function.
2. Create a SendGrid (link) account and create an API key.
3. Configure the SendGrid output binding on the created Azure Function.
4. Add the code below to the function's run.csx.
#r "SendGrid"

using System;
using SendGrid;
using SendGrid.Helpers.Mail;

public static Mail Run(Stream myBlob, string name, TraceWriter log)
{
    // Build the message returned through the SendGrid output binding
    var message = new Mail
    {
        Subject = "Azure news"
    };

    // Set the recipient
    var personalization = new Personalization();
    personalization.AddTo(new Email("sendto email address"));

    // Use the uploaded blob's name as the mail body
    Content content = new Content
    {
        Type = "text/plain",
        Value = name
    };
    message.AddContent(content);
    message.AddPersonalization(personalization);

    return message;
}
I am uploading zip files to Azure Blob Storage, and they are relatively large.
Now I need to go to those containers, get the blob reference, and store that zip file as multiple zips on a cloud file share. I am not sure how to proceed with that.
var storageAccount = AzureUtility.CreateStorageAccountFromConnectionString();
var container = AzureUtility.GetAzureCloudBlobContainerReference("fabcd");

CloudFileClient fileClient = storageAccount.CreateCloudFileClient();
CloudFileShare share = fileClient.GetShareReference("sample-share");
share.CreateIfNotExists();

CloudBlockBlob sourceBlob = container.GetBlockBlobReference("Test.zip");
bool fileExists = share.GetRootDirectoryReference().GetFileReference("Test.zip").Exists();
if (fileExists)
{
    // split and share
}
Any suggestions?
My understanding is that you're trying to download a blob and then divide and upload it into multiple files.
There are two options here, both of which are exposed through the .NET API. The blob API exposes both DownloadRangeTo* methods and DownloadTo* methods, and the file API exposes UploadFrom* methods. If you know in advance the divisions you want to make, you can download each range and then upload it to a file. Otherwise you can download the entire blob, divide it client-side, and then upload the divisions.
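For the first option, the main work is deciding the (offset, length) pairs to pass to the range-download calls. A small sketch of that planning step (class and method names are my own, independent of any SDK):

```java
import java.util.ArrayList;
import java.util.List;

// Sketch: compute the (offset, length) byte ranges needed to split a blob
// of totalSize bytes into chunks of at most chunkSize bytes. Each range
// could then be fed to a DownloadRangeTo*-style call and uploaded as its
// own file on the share.
public class RangePlanner {
    public static List<long[]> plan(long totalSize, long chunkSize) {
        List<long[]> ranges = new ArrayList<>();
        for (long offset = 0; offset < totalSize; offset += chunkSize) {
            long length = Math.min(chunkSize, totalSize - offset);
            ranges.add(new long[]{offset, length});
        }
        return ranges;
    }
}
```

For example, a 10-byte blob split into 4-byte chunks yields the ranges (0, 4), (4, 4), and (8, 2). Note that naively splitting a zip by byte ranges produces pieces that are not themselves valid zip archives; if each piece must be an openable zip, you would instead extract and repackage the entries client-side.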