Accessing Azure File Storage from an Azure Function

I'm attempting to retrieve a file from Azure File Storage for use by an .exe that is executed within an Azure Function, and I can't seem to get past the UNC credentials.
My app reads the UNC file path from an Azure SQL database and then navigates to that UNC path (in Azure File Storage) to import the contents of the file. I can browse to the file location from my PC in Windows Explorer, but I am prompted for credentials.
I've tried running a "net use" command before executing the app, but it doesn't seem to authenticate:
net use \\<storage account>.file.core.windows.net\<directory>\ /u:<username> <access key>
MyApp.exe
Azure Function Log error:
Unhandled Exception: System.UnauthorizedAccessException: Access to the path '<file path>' is denied.
If possible, I'd rather not modify my C# app and instead do the authentication in the Azure Function (it's a Batch function at the moment and will be timer-based).

I believe it is not possible to mount an Azure File Service share in an Azure Function, as you don't get access to the underlying infrastructure (same deal as Web Apps).
What you could do is make use of the Azure Storage SDK, which is a wrapper over the Azure Storage REST API, and use that in your application to interact with files in your File Service share.

You cannot use SMB (445/TCP). Functions run inside the App Service sandbox.
From https://github.com/projectkudu/kudu/wiki/Azure-Web-App-sandbox#restricted-outgoing-ports:
Restricted Outgoing Ports
Regardless of address, applications cannot connect to anywhere using ports 445, 137, 138, and 139. In other words, even if connecting to a non-private IP address or the address of a virtual network, connections to ports 445, 137, 138, and 139 are not permitted.
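If you want to confirm the restriction from inside a Function, here is a quick diagnostic sketch (the hostname placeholder is the question's own; the connect attempt should fail in the sandbox):
using System.Net.Sockets;
// Attempting an outbound SMB connection from the sandbox should fail
using (var client = new TcpClient())
{
    try
    {
        client.Connect("<storage account>.file.core.windows.net", 445);
        Console.WriteLine("Connected (not expected inside the sandbox)");
    }
    catch (SocketException ex)
    {
        Console.WriteLine("Blocked as expected: " + ex.Message);
    }
}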
Use the Azure Storage SDK to talk to your Azure File endpoint:
using Microsoft.Azure; // Namespace for CloudConfigurationManager
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.File;

// Parse the connection string and return a reference to the storage account.
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(
    CloudConfigurationManager.GetSetting("StorageConnectionString"));

// Create a CloudFileClient object for credentialed access to File storage.
CloudFileClient fileClient = storageAccount.CreateCloudFileClient();

// Get a reference to the file share we created previously.
CloudFileShare share = fileClient.GetShareReference("logs");

// Ensure that the share exists.
if (share.Exists())
{
    // Get a reference to the root directory for the share.
    CloudFileDirectory rootDir = share.GetRootDirectoryReference();

    // Get a reference to the directory we created previously.
    CloudFileDirectory sampleDir = rootDir.GetDirectoryReference("CustomLogs");

    // Ensure that the directory exists.
    if (sampleDir.Exists())
    {
        // Get a reference to the file we created previously.
        CloudFile file = sampleDir.GetFileReference("Log1.txt");

        // Ensure that the file exists.
        if (file.Exists())
        {
            // Write the contents of the file to the console window.
            Console.WriteLine(file.DownloadTextAsync().Result);
        }
    }
}
The sample uses CloudConfigurationManager - I think that's a bit much for such a simple scenario. I would do this instead:
using System.Configuration;
// "StorConnStr" is the Storage account Connection String
// defined for your Function in the Azure Portal
string connstr = ConfigurationManager.ConnectionStrings["StorConnStr"].ConnectionString;
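You can then feed that connection string straight into the File storage client, mirroring the sample above (a minimal sketch):
// Wire the connection string into the same CloudFileClient flow as above
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(connstr);
CloudFileClient fileClient = storageAccount.CreateCloudFileClient();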

Related

How does Azure BlobStorage connection data have to be stored to support all available addressing modes?

I am using libraries Microsoft.Azure.Storage.Blob 11.2.3.0 and Microsoft.Azure.Storage.Common 11.2.3.0 to connect to an Azure BlobStorage from a .NET Core 3.1 application.
Users of my application are supposed to supply connection information to an Azure BlobStorage to/from where the application will deposit/retrieve data.
Initially, I had assumed that allowing users to specify a connection string and a custom blob container name (as an optional override of the default) would be sufficient. I could simply pass that connection string to the CloudStorageAccount.Parse method and get back a storage account instance to call CreateCloudBlobClient on.
Now that I'm trying to use this method to connect using a container-specific SAS (also see my other question about that), it appears that the connection string might not be the most universal way to go.
Instead, it now seems a blob container URL, plus a SAS token or an account key (and possibly an account name, though that seems to be included in the blob container URL already), are more versatile. However, I am concerned that the next way of pointing to a blob storage that I need to support (whichever that may be) might require yet another kind of information - hence my question:
What set of "fields" do I need to support in the configuration files of my application to make sure my users can point to their BlobStorage whichever way they want, as long as they have a BlobStorage?
(Is there maybe even a standard solution or best practice recommendation by Microsoft?)
Please note that I am exclusively concerned with what to store. An arbitrarily long string? A complex object of sorts? If so, with what fields?
I am not asking how to store that configuration once I know what it must comprise. For example, this is not about securely encrypting credentials etc.
As a workaround, to access the storage account using a SAS token, you need to pass the account name along with the SAS token (and the blob name, if you are uploading), and the SAS token needs to be granted the appropriate permissions.
Microsoft recommends using Azure Active Directory (Azure AD) to authorize requests against blob and queue data where possible, instead of Shared Key. Azure AD provides superior security and ease of use over Shared Key. For more information about authorizing access to data with Azure AD, see Authorize access to Azure blobs and queues using Azure Active Directory.
Note: Based on my tests, you need to pass the storage account name and SAS token, plus the container name and blob name.
Example: I tried uploading a file to a container using a container-level SAS token and was able to upload the file successfully:
const string sasToken = "SAS Token";
StorageCredentials storageCredentials = new StorageCredentials(sasToken);
const string accountName = "teststorage65";//Account Name
const string blobContainerName = "test";
const string blobName = "test.txt";
const string myFileLocation = @"Local Path";
var storageAccount = new CloudStorageAccount(storageCredentials, accountName, null, true);
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
CloudBlobContainer blobContainer = blobClient.GetContainerReference(blobContainerName);
//blobContainer.CreateIfNotExists();
CloudBlockBlob cloudBlob = blobContainer.GetBlockBlobReference(blobName);
cloudBlob.UploadFromFile(myFileLocation);
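For the upload to succeed, the container-level SAS token has to carry write permissions. If you hold the account key, one way to mint such a token with the same SDK looks roughly like this (a sketch; the one-hour expiry is an arbitrary choice):
// Sketch: generate a container-level SAS with read/write/create permissions
SharedAccessBlobPolicy policy = new SharedAccessBlobPolicy
{
    Permissions = SharedAccessBlobPermissions.Read
                | SharedAccessBlobPermissions.Write
                | SharedAccessBlobPermissions.Create,
    SharedAccessExpiryTime = DateTimeOffset.UtcNow.AddHours(1) // arbitrary expiry
};
string containerSas = blobContainer.GetSharedAccessSignature(policy);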
As you already know, you can use the storage connection string to connect to Storage:
CloudStorageAccount storageAccount = CloudStorageAccount.Parse("Connection string");
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
CloudBlobContainer container = blobClient.GetContainerReference("test");
Your application needs to access the connection string at runtime to authorize requests made to Azure Storage.
You have several options for storing your connection string or SAS token:
1) You can store the connection string in an environment variable (see the sketch after this list).
2) An application running on the desktop or on a device can store the connection string in an app.config or web.config file. Add the connection string to the AppSettings section in these files.
3) An application running in an Azure cloud service can store the connection string in the Azure service configuration schema (.cscfg) file. Add the connection string to the ConfigurationSettings section of the service configuration file.
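For option 1, a minimal sketch (the variable name AZURE_STORAGE_CONNECTION_STRING is my own choice, not something the SDK requires):
// Hypothetical variable name; set it in the environment before the app runs
string connstr = Environment.GetEnvironmentVariable("AZURE_STORAGE_CONNECTION_STRING");
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(connstr);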
Reference: https://learn.microsoft.com/en-us/azure/storage/common/storage-configure-connection-string

Where in the file system is a Worker Role's Local Storage?

I've created a log file in a Worker Role, which is being stored in Local Storage.
Where in the file system of the instance can I find that log now? In other words, where is the root of Local Storage for a worker role?
To find the path of the local storage, you can use LocalResource.RootPath. This will give you the full directory path of the local storage resource.
For example:
try
{
    LocalResource myConfigStorage = RoleEnvironment.GetLocalResource("local-resource-setting-name");
    string s = System.IO.File.ReadAllText(myConfigStorage.RootPath + "myFile.txt");
    // … do your work with s
}
catch (Exception myException)
{
    // …
}
Please look under the C:\Resources directory in the Azure VM running your Worker Role and you should be able to find the log file you created. Please note that the Azure SDK creates folders inside this directory for your worker role's instance ID and deployment ID.

How to use azure file storage with Azure Function?

I have created two storage accounts: one hosts my Azure Function, while the other is a file storage account. What I want is to create a file from the Azure Function and store it in my Azure File storage. I went through the official documentation for File storage as well as for Azure Functions, but I am not finding any connecting link between the two.
Is it possible to create a file from an Azure Function and store it in a file storage account? If yes, please assist accordingly.
There is a preview of Azure Functions External File bindings to upload files to external storage, but it doesn't work with Azure File Storage. There is a GitHub issue to create a new type of binding for Files.
Meanwhile, you can upload the file just by using the Azure Storage SDK directly. Something like:
#r "Microsoft.WindowsAzure.Storage"
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.File;
public static void Run(string input)
{
var storageAccount = CloudStorageAccount.Parse("...");
var fileClient = storageAccount.CreateCloudFileClient();
var share = fileClient.GetShareReference("...");
var rootDir = share.GetRootDirectoryReference();
var sampleDir = rootDir.GetDirectoryReference("MyFolder");
var fileToCreate = sampleDir.GetFileReference("output.txt");
fileToCreate.UploadText("Hello " + input);
}
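Note that the snippet assumes the share and the MyFolder directory already exist; if they might not, you can create them first with the same SDK (a small addition to the sketch above):
// Create the share and directory if they are not already there
share.CreateIfNotExists();
sampleDir.CreateIfNotExists();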

Saving an X509 certificate from an Azure Blob and using it in an Azure website

I have an Azure Website and an Azure Blob that I'm using to store a .cer X509 certificate file.
The goal is to get the .cer file from the blob and use it to perform an operation (the code for that is in the Controller for my Azure website and it works).
When I run the code locally (without publishing my site) it works, because I save it in D:\
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(ConfigurationManager.AppSettings["StorageConnectionString"]);

// Create the blob client.
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();

// Retrieve reference to a previously created container.
CloudBlobContainer container = blobClient.GetContainerReference("myContainer");

// Retrieve reference to a blob named "testcert.cer".
CloudBlockBlob blockBlob = container.GetBlockBlobReference("testcert.cer");

// Save blob contents to a file.
using (var fileStream = System.IO.File.OpenWrite("D:/testcert.cer"))
{
    blockBlob.DownloadToStream(fileStream);
}

string certLocation = "D:/testcert.cer";
X509Certificate2 myCert = new X509Certificate2();
myCert.Import(certLocation);
I am unable to figure out how or where I can save it. If I try to use the Import method but pass a URL (that of the Azure blob where the certificate is stored), I get an error because Import can't handle URLs.
Any idea what I can use as temp storage on the Azure website or in the blob, so I can create an X509Certificate from it?
Edit: I'm trying to add more detail about the problem I'm trying to solve.
1) Get a cert file from an Azure blob and write it to an Azure website.
2) Use the .Import(string pathToCert) method on an X509Certificate object to create the cert, which will be used to make a call in a method I've written in my controller.
I've been able to work around step 1 by manually adding the .cer file to the wwwroot folder of my site via FTP. But now when I use Server.MapPath("~/testcert.cer"); to get the path for my certificate, I get this: D:\home\site\wwwroot\testcert.cer
Obviously, once it's deployed to my Azure website, the string above is not a valid path for the Import method, so my cert creation fails.
Any ideas? Thanks!
Saving the certificate locally is generally a no-no for Azure; you've got Blob Storage for that.
Use the Import(byte[]) overload to keep and load the certificate in memory. Here's a quick hand-coded attempt...
// Used to store the certificate data
byte[] certData;

// Save blob contents to a memory stream.
using (var stream = new MemoryStream())
{
    blockBlob.DownloadToStream(stream);
    certData = stream.ToArray();
}

X509Certificate2 myCert = new X509Certificate2();

// Import from the byte array
myCert.Import(certData);
Very simple. And the answer covers any web host, not just Azure.
First of all, I would highly recommend that you never hard-code a path to a folder in your web projects! Then what you can do is:
1) Use the Server.MapPath("~/certs") method to obtain the physical path of a certs folder within your web site's root folder.
2) Make sure that no one can access this folder from the outside world.
By adding an additional location section to your web.config (sketched below), you block any external access to this folder. Please note that the location element has to be a direct descendant of the root configuration element in your web.config file.
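A minimal sketch of such a section (a standard ASP.NET authorization block; it assumes the folder is named certs):
<location path="certs">
  <system.web>
    <authorization>
      <deny users="*" />
    </authorization>
  </system.web>
</location>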
UPDATE, with the non-Azure-specific part on how to write a file to the local file system of an ASP.NET web project:
var path = Server.MapPath("~/certs");
using (var fileStream = System.IO.File.OpenWrite(path + "\\testcert.cer"))
{
    // here use the fileStream to write
}
And a complete sample of how to use the Blob storage client to write the contents of a blob to a local file:
var path = Server.MapPath("~/certs");
using (var fileStream = System.IO.File.OpenWrite(path + "\\testcert.cer"))
{
    blockBlob.DownloadToStream(fileStream);
}
But @SeanCocteau has a good point and a much simpler approach - just use a MemoryStream instead!
You can now upload your certificates via the Portal, add an app setting to your site, and have the certificate show up in your site's Certificate Store.
See this blog post for more details:
http://azure.microsoft.com/blog/2014/10/27/using-certificates-in-azure-websites-applications/

Azure Drive addressing using local emulated blob store

I am unable to get a simple tech demo working for Azure Drive using a locally hosted service running the storage/compute emulator. This is not my first Azure project, only my first use of the Azure Drive feature.
The code:
var localCache = RoleEnvironment.GetLocalResource("MyAzureDriveCache");
CloudDrive.InitializeCache(localCache.RootPath, localCache.MaximumSizeInMegabytes);
var creds = new StorageCredentialsAccountAndKey("devstoreaccount1", "Eby8vdM02xNOcqFlqUwJPLlmEtlCDXJ1OUzFT50uSRZ6IFsuFq2UVErCz4I6tq/K1SZFPTOtr/KBHBeksoGMGw==");
drive = new CloudDrive(new Uri("http://127.0.0.1:10000/devstoreaccount1/drive"), creds);
drive.CreateIfNotExist(16);
drive.Mount(0, DriveMountOptions.None);
With local resource configuration:
<LocalStorage name="MyAzureDriveCache" cleanOnRoleRecycle="false" sizeInMB="220000" />
The exception:
Uri http://127.0.0.1:10000/devstoreaccount1/drive is Invalid
Information on how to address local storage can be found here: https://azure.microsoft.com/en-us/documentation/articles/storage-use-emulator/
I have used the storage emulator UI to create the C:\Users...\AppData\Local\dftmp\wadd\devstoreaccount1 folder which I would expect to act as the container in this case.
However, I am following those guidelines (as far as I can tell) and yet I still receive the exception. Is anyone able to identify what I am doing wrong in this case? I had hoped to resolve this easily with a working sample where someone else uses CloudDrive with 127.0.0.1 or localhost, but I was unable to find one on Google.
I think you have skipped several required steps before mounting.
You have to initialize the local cache for the drive, and define the URI of the page blob containing the cloud drive, before mounting it.
Initializing the cache:
// Initialize the local cache for the Azure drive
LocalResource cache = RoleEnvironment.GetLocalResource("LocalDriveCache");
CloudDrive.InitializeCache(cache.RootPath + "cache", cache.MaximumSizeInMegabytes);
Defining the URI of the page blob, usually done in the configuration file:
// Retrieve the URI for the page blob that contains the cloud drive from configuration settings
string imageStoreBlobUri = RoleEnvironment.GetConfigurationSettingValue("<configuration name>");
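With the cache initialized and the blob URI defined, the create-and-mount calls from the question should then work (a sketch reusing the question's creds variable):
// Create the drive's page blob if needed, then mount it and get its drive letter path
CloudDrive drive = new CloudDrive(new Uri(imageStoreBlobUri), creds);
drive.CreateIfNotExist(16);
string drivePath = drive.Mount(0, DriveMountOptions.None);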
