I am thinking about implementing the IFileProvider interface on top of Azure File Storage.
What I am trying to find in the docs is whether there is a way to send the whole path to the file to the Azure API, like rootDirectory/sub1/sub2/example.file, or whether that should be mapped to some recursive function that takes the path and traverses the directory structure on the file storage.
I just want to make sure I am not missing something and reinventing the wheel for something that already exists.
[UPDATE]
I'm using the Azure Storage Client for .NET. I would not like to mount anything.
My intention is to have several IFileProviders that I can switch between based on environment and other conditions.
So, for example, if my environment is Cloud, I would use an IFileProvider implementation that uses Azure File Storage through the Azure Storage Client. If my environment is MyServer, I would use the server's local file system. A third option would be an environment someOther with its own particular implementation.
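For instance, the switching I have in mind could look roughly like this (a hypothetical sketch; AzureFileStorageProvider and SomeOtherProvider are the custom implementations I'd write, while PhysicalFileProvider is the built-in one from Microsoft.Extensions.FileProviders):

// Hypothetical wiring; only PhysicalFileProvider is an existing type.
IFileProvider provider;
switch (environmentName)
{
    case "Cloud":
        provider = new AzureFileStorageProvider(storageAccount, "myshare");
        break;
    case "MyServer":
        provider = new PhysicalFileProvider(@"D:\content");
        break;
    default:
        provider = new SomeOtherProvider();
        break;
}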
Now, for all of them, IFileProvider operates with a path like root/sub1/sub2/sub3. For Azure File Storage, is there a way to send the whole path at once to get sub3 info/content, or should the path be broken into individual directories, getting a reference/content for each step?
I hope that clears up the question.
Now, for all of them, IFileProvider operates with a path like root/sub1/sub2/sub3. For Azure File Storage, is there a way to send the whole path at once to get sub3 info/content, or should the path be broken into individual directories, getting a reference/content for each step?
To access a specific subdirectory nested across multiple subdirectories, you can pass the whole relative path to the GetDirectoryReference method to construct the CloudFileDirectory, as follows:
// Get a reference to the share, then jump straight to the nested directory in one call.
var fileshare = storageAccount.CreateCloudFileClient().GetShareReference("myshare");
var rootDir = fileshare.GetRootDirectoryReference();
var dir = rootDir.GetDirectoryReference("2017-10-24/15/52");
var items = dir.ListFilesAndDirectories();
To access a specific file under a subdirectory, you can use the GetFileReference method, which returns a CloudFile instance, as follows:
var file = rootDir.GetFileReference("2017-10-24/15/52/2017-10-13-2.png");
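So for the IFileProvider scenario above, the whole relative path can go to the service in one call. A minimal sketch (the share name and path are illustrative; Exists and DownloadText are the synchronous CloudFile methods in this SDK):

// The full path is resolved server-side; no per-directory traversal is needed.
var share = storageAccount.CreateCloudFileClient().GetShareReference("myshare");
var file = share.GetRootDirectoryReference()
                .GetFileReference("rootDirectory/sub1/sub2/example.file");
if (file.Exists())                      // single round-trip existence check
{
    string content = file.DownloadText();
}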
Related
In my case, I need to read file/icon.png from a cloud storage bucket, where access goes through a token-based URL/path; the token resides in the request header.
I tried to use fs.readFile('serverpath'), but it gave back the error 'ENOENT', i.e. 'No such file or directory', even though the file exists at that path. So are these methods able to make calls and read files from a server, or do they work only with static paths? If that is so, then in my case how do I read a file from a cloud bucket/server?
I then need to pass that file path to the UI, to show this icon.
Use this library to handle GCS operations:
https://www.npmjs.com/package/@google-cloud/storage
If you do need to use fs, install gcsfuse (https://cloud.google.com/storage/docs/gcs-fuse), mount the bucket to your local filesystem, and then use fs as you normally would.
I would like to complement Cloud Ace's answer by saying that if you have Storage Object Admin permission you can make the URL of the image public and use it like any other public URL.
If you don't want to make the URL public you can get temporary access to the file by creating a signed URL.
Otherwise, you'll have to download the file using the GCS Node.js Client.
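For illustration, creating a signed URL with the GCS client for .NET looks roughly like this (a sketch only; the question's stack is Node, and the bucket name and key path here are placeholders):

using System;
using System.Net.Http;
using Google.Cloud.Storage.V1;  // NuGet: Google.Cloud.Storage.V1

// Signs a time-limited GET URL for file/icon.png without making the object public.
UrlSigner signer = UrlSigner.FromServiceAccountPath("service-account.json");
string url = signer.Sign(
    "my-bucket",            // bucket name (placeholder)
    "file/icon.png",        // object name from the question
    TimeSpan.FromHours(1),  // validity window
    HttpMethod.Get);
// Hand `url` to the UI; it grants read access until it expires.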
I posted this as an answer as it is too long for a comment.
Problem:
I am trying to get an image out of an Azure file share for manipulation. I need to read the file as a System.Drawing.Image. I cannot create a valid FileInfo object or Image using the UNC path (which I need to do in order to use it over IIS).
Current Setup:
Attach a virtual directory called Photos to the IIS website, pointing to the UNC path of the Azure file share (e.g. \\myshare.file.core.windows.net\sharename\pathtoimages).
This works as http://example.com/photos/img.jpg so I know it is not a permissions or authentication issue.
For some reason, though, I cannot get a reference to the file.
var imgpath = Path.Combine(Server.MapPath("~/Photos"), "img.jpg");
// resolves as \\myshare.file.core.windows.net\sharename\pathtoimages\img.jpg
var fi = new FileInfo(imgpath);
if (fi.Exists) // this returns false 100% of the time
{
    var img = System.Drawing.Image.FromFile(fi.FullName);
}
The problem is that the file is never found to exist, even though I can take that path, put it in an Explorer window, and open img.jpg 100% of the time.
Does anyone have any idea why this would not be working?
Do I need to be using a CloudFileShare object just to read a file I know is there?
It turns out the issue is that I needed to wrap my code in an impersonation of the Azure file share user ID, since the virtual directory is not really in play at all at this point.
using (new impersonation("UserName","azure","azure pass"))
{
//my IO.File code
}
I used the impersonation script from this question: Can you explain why DirectoryInfo.GetFiles produces this IOException?
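In case that link goes stale, here is a condensed sketch of what such an impersonation helper typically looks like (my reconstruction, not the linked script; .NET Framework only, and the class name casing differs from the snippet above):

using System;
using System.ComponentModel;
using System.Runtime.InteropServices;
using System.Security.Principal;

public sealed class Impersonation : IDisposable
{
    [DllImport("advapi32.dll", SetLastError = true)]
    private static extern bool LogonUser(string user, string domain, string password,
        int logonType, int logonProvider, out IntPtr token);

    [DllImport("kernel32.dll", SetLastError = true)]
    private static extern bool CloseHandle(IntPtr handle);

    // NEW_CREDENTIALS is what makes remote UNC access to
    // \\<account>.file.core.windows.net work with the storage credentials.
    private const int LOGON32_LOGON_NEW_CREDENTIALS = 9;
    private const int LOGON32_PROVIDER_DEFAULT = 0;

    private readonly IntPtr _token;
    private readonly WindowsImpersonationContext _context;

    public Impersonation(string user, string domain, string password)
    {
        if (!LogonUser(user, domain, password,
                LOGON32_LOGON_NEW_CREDENTIALS, LOGON32_PROVIDER_DEFAULT, out _token))
            throw new Win32Exception();                  // surfaces the logon failure
        _context = WindowsIdentity.Impersonate(_token);  // IO now runs as that user
    }

    public void Dispose()
    {
        _context.Undo();
        CloseHandle(_token);
    }
}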
I've created a file system abstraction, where I store files with a relative path, e.g. /uploads/images/img1.jpg.
These can then be saved both on the local file system (relative to a folder) and on Azure. I can then also ask a method to give me the URL to access that relative path.
In Azure, this is currently done similarly to the code below:
public string GetWebPathForRelativePathOnUserContentStorage(string relativeFileFullPath)
{
var container = getCloudBlobContainer();
CloudBlockBlob blob = container.GetBlockBlobReference(relativeFileFullPath);
return blob.Uri.ToString();
}
On a normal website, there might be, say, 40 images on one page, so this gets called like 40 times. Is this, first of all, slow? I've noticed there is a particular pattern in the generated URL:
https://[storageAccountName].blob.core.windows.net/[container_name]/[relative_path]
Can I safely generate that URL without using the Azure storage API?
On a normal website, there might be, say, 40 images on one page, so this gets called like 40 times. Is this, first of all, slow?
Not at all. The code you wrote above does not make any calls to storage; it just creates an instance of the CloudBlockBlob class. If you were using the GetBlockBlobReferenceFromServer method, it would have been a different story, because that method makes a call to storage.
I've noticed there is a particular pattern in the generated URL:
https://[storageAccountName].blob.core.windows.net/[container_name]/[relative_path]
Can I safely generate that URL without using the Azure storage API?
Absolutely, yes. Assuming you're using just the standard setup, that would be perfectly fine. Non-standard setups would include things like using a custom domain for your blob storage or connecting to the geo-secondary location of your storage account.
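As a sketch, generating the URL by hand could look like this (assuming the standard endpoint; the helper name is mine):

using System;
using System.Linq;

public static string GetBlobUrl(string accountName, string containerName, string relativePath)
{
    // Escape each segment but keep the "/" separators intact.
    string encodedPath = string.Join("/",
        relativePath.Split('/').Select(Uri.EscapeDataString));
    return string.Format("https://{0}.blob.core.windows.net/{1}/{2}",
        accountName, containerName, encodedPath);
}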
I am trying to run a C executable on Azure. I have many worker roles, and they continuously check a job queue. If there is a job in the queue, a worker role runs an instance of the C executable as a process, according to the command-line arguments stored in a job class. The C executable normally creates some log files. I do not know how to access those created files. What is the logic behind it? Where are the created files stored? Can anyone explain? I am new to Azure and C#.
One other problem is that all of the working instances of the C executable need to read a data file. How can I distribute that required file?
First, realize that in Windows Azure, your worker role is simply running inside a Windows 2008 Server environment (either SP2 or R2). When you deploy your app, you would deploy your C executable as well (or grab it from blob storage, but that's a bit more advanced). To find out where your app lives on disk, call Environment.GetEnvironmentVariable("RoleRoot") - that returns a path. You'd typically have your app sitting in a folder called AppRoot under the role root. You'd find your C executable there.
Next, you'll want your app to write its files to an output directory you specify on the command line. You can set up local storage in your VM through your role's properties: look at the Local Storage tab and configure a named local storage area.
Now you can get the path to that storage area, in code, and pass it as a command line argument:
var outputStorage = RoleEnvironment.GetLocalResource("MyLocalStorage");
var outputFile = Path.Combine(outputStorage.RootPath, "myoutput.txt");
var cmdline = String.Format("--output {0}", outputFile);
Here's an example of launching your myapp.exe process, with command line arguments:
var appRoot = Path.Combine(Environment.GetEnvironmentVariable("RoleRoot")
    + @"\", @"approot");
var myProcess = new Process()
{
    StartInfo = new ProcessStartInfo(Path.Combine(appRoot, @"myapp.exe"), cmdline)
    {
        CreateNoWindow = false,
        UseShellExecute = false,
        WorkingDirectory = appRoot
    }
};
myProcess.Start();
myProcess.WaitForExit();
Normally you'd set CreateNoWindow to true, but it's easier to debug if you can see the command shell window.
Last thing: Once your app is done creating the file, you'll want to either:
Process it and delete it (it's not in a durable place so eventually it'll disappear)
Change your storage to use a Cloud Drive (durable storage)
Copy your file to a blob (durable storage)
In production, you'll want to add exception-handling, and you can re-route stdout and stderr to be captured. But this sample code should be enough to get you started.
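A sketch of that capture, using the same Process object as above (RedirectStandardOutput/Error require UseShellExecute = false, which is already set):

myProcess.StartInfo.RedirectStandardOutput = true;
myProcess.StartInfo.RedirectStandardError = true;
myProcess.OutputDataReceived += (s, e) => { if (e.Data != null) Trace.TraceInformation(e.Data); };
myProcess.ErrorDataReceived += (s, e) => { if (e.Data != null) Trace.TraceError(e.Data); };
myProcess.Start();
myProcess.BeginOutputReadLine();   // start async reads after the process starts
myProcess.BeginErrorReadLine();
myProcess.WaitForExit();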
OOPS - one more 'one more thing': When adding your 'myapp.exe' to your project, be SURE to go to its Properties, and set 'Copy to Output Directory' to 'Copy Always' - otherwise your myapp.exe file won't end up in Windows Azure and you'll wonder why things don't work.
EDIT: Pushing results to a blob - a quick example
First, set up a storage account and add it to your role's Settings. Say you called it 'AzureStorage'. Now set it up in code, get a reference to a blob container, get a reference to a blob within that container, and then perform a file upload to the blob:
CloudStorageAccount storageAccount = CloudStorageAccount.FromConfigurationSetting("AzureStorage");
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
CloudBlobContainer outputfiles = blobClient.GetContainerReference("outputfiles");
outputfiles.CreateIfNotExist();
var blobname = "myoutput.txt";
var blob = outputfiles.GetBlobReference(blobname);
blob.UploadFile(outputFile);
In Azure land you shouldn't write to the file system. You should write to SQL Azure, Table Storage, or, most likely in this case, blob storage (basically, think of blob storage as the old file system).
This is because:
You could have multiple instances running and you will end up having different files on different instances (which are just virtual machines)
Your instance could potentially be moved at any moment and you would lose the info on the file system as it's not part of your deployment package.
Using one of the three storage options will provide a central repository for all of your instances to access and it will be persisted over a redeployment.
How can I create a sub container in the Azure storage location?
Windows Azure doesn't provide the concept of hierarchical containers, but it does provide a mechanism to traverse hierarchy by convention and API. All containers are stored at the same level. You can gain similar functionality by using naming conventions for your blob names.
For instance, you may create a container named "content" and create blobs with the following names in that container:
content/blue/images/logo.jpg
content/blue/images/icon-start.jpg
content/blue/images/icon-stop.jpg
content/red/images/logo.jpg
content/red/images/icon-start.jpg
content/red/images/icon-stop.jpg
Note that these blobs are a flat list against your "content" container. That said, using the "/" as a conventional delimiter provides you with the functionality to traverse these in a hierarchical fashion.
protected IEnumerable<IListBlobItem> GetDirectoryList(string directoryName, string subDirectoryName)
{
    CloudStorageAccount account =
        CloudStorageAccount.FromConfigurationSetting("DataConnectionString");
    CloudBlobClient client = account.CreateCloudBlobClient();
    CloudBlobDirectory directory = client.GetBlobDirectoryReference(directoryName);
    CloudBlobDirectory subDirectory = directory.GetSubdirectory(subDirectoryName);
    return subDirectory.ListBlobs();
}
You can then call this as follows:
GetDirectoryList("content/blue", "images")
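And, as a quick sketch of consuming the result (IListBlobItem directly exposes little more than the Uri):

foreach (IListBlobItem item in GetDirectoryList("content/blue", "images"))
{
    Console.WriteLine(item.Uri);  // e.g. .../content/blue/images/logo.jpg
}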
Note the use of the GetBlobDirectoryReference and GetSubdirectory methods and the CloudBlobDirectory type instead of CloudBlobContainer. These provide the traversal functionality you are likely looking for.
This should help you get started. Let me know if this doesn't answer your question.
[ Thanks to Neil Mackenzie for inspiration ]
Are you referring to blob storage? If so, the hierarchy is simply StorageAccount/Container/BlobName. There are no nested containers.
Having said that, you can use slashes in your blob name to simulate nested containers in the URI. See this article on MSDN for naming details.
I agree with tobint's answer, and I want to add something to this situation, because I also needed to upload my game's HTML to Azure Storage, creating directories like these:
Games\Beautyshop\index.html
Games\Beautyshop\assets\apple.png
Games\Beautyshop\assets\aromas.png
Games\Beautyshop\customfont.css
Games\Beautyshop\jquery.js
So, after these recommendations, I tried to upload my content with the Azure Storage Explorer tool. You can download the tool and its source code from this URL: Azure Storage Explorer.
First of all, I tried to upload via the tool, but it doesn't allow hierarchical directory uploads (see: How to create sub directory in a blob container).
Finally, I debugged the Azure Storage Explorer source code and edited the Background_UploadBlobs method and the UploadFileList field in the StorageAccountViewModel.cs file. You can edit it however you want. I may have made spelling errors; I am sorry, but that's only my recommendation.
If you are trying to upload files from the Azure portal:
To create a subfolder in a container while uploading a file, go to Advanced options and select 'Upload to a folder', which will create a new folder in the container and upload the file into it.
Kotlin Code
val blobClient = blobContainerClient.getBlobClient("$subDirNameTimeStamp/$fileName$extension");
This will create a directory named with the timestamp, and inside it will be your blob file. Notice the use of the slash (/) in the code above, which nests the blob file by creating a folder named after the string before the slash.
On the portal, it will appear as a folder containing the blob file.
Sample code
string myfolder = "<folderName>";
string myfilename = "<fileName>";
string fileName = String.Format("{0}/{1}.csv", myfolder, myfilename);
CloudBlockBlob blob = container.GetBlockBlobReference(fileName);
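Note that nothing appears in storage until something is written to that reference; a hedged one-liner to complete the sketch (UploadFromFile is the CloudBlockBlob upload method in recent WindowsAzure.Storage versions; the local path is a placeholder):

// Uploading creates the "<folderName>/" virtual folder implicitly.
blob.UploadFromFile(@"C:\data\myfile.csv");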