I use local storage on Windows Azure to store temporary files. From there I call an .exe file to convert several other files in the same local storage folder. The problem is that I always get the exception "Access to the path XYZ.exe is denied.".
I should mention the following:
- I am using a worker role
- the local storage is set up in the service definition file
I also tried to add permissions to the folder I am accessing:
public static void AddPermission(string absoluteFolderPath)
{
    // Requires System.IO and System.Security.AccessControl.
    DirectoryInfo myDirectoryInfo = new DirectoryInfo(absoluteFolderPath);
    DirectorySecurity myDirectorySecurity = myDirectoryInfo.GetAccessControl();

    // Grant NETWORK SERVICE full control of the folder.
    myDirectorySecurity.AddAccessRule(new FileSystemAccessRule(
        "NETWORK SERVICE",
        FileSystemRights.FullControl,
        AccessControlType.Allow));

    myDirectoryInfo.SetAccessControl(myDirectorySecurity);
}
UPDATE:
I have now tried this code:
public static void FixPermissions()
{
    // Grant access on the local storage root and each of its subdirectories.
    var tempDirectory = RoleEnvironment.GetLocalResource("localStorage").RootPath;
    Helper.addPermission(tempDirectory);

    var dir = new DirectoryInfo(tempDirectory);
    foreach (var d in dir.GetDirectories())
        Helper.addPermission(d.FullName);
}
private static void addPermission(string path)
{
    // Allow Everyone full control, inherited by child folders and files.
    FileSystemAccessRule everyoneFileSystemAccessRule = new FileSystemAccessRule(
        "Everyone",
        FileSystemRights.FullControl,
        InheritanceFlags.ContainerInherit | InheritanceFlags.ObjectInherit,
        PropagationFlags.None,
        AccessControlType.Allow);

    DirectoryInfo directoryInfo = new DirectoryInfo(path);
    DirectorySecurity directorySecurity = directoryInfo.GetAccessControl();
    directorySecurity.AddAccessRule(everyoneFileSystemAccessRule);
    directoryInfo.SetAccessControl(directorySecurity);
}
The page now behaves really strangely: I still get the errors, but sometimes some of the files do get converted by ffmpeg.exe.
Can someone help me out here?
Thanks a lot.
SOLUTION:
It seems the problem was that I ran the .exe file from within local storage, which caused the security issues. Putting the .exe into the application package and referencing it directly solved my issue.
Thanks for your help.
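For illustration, here is a minimal sketch of that approach, assuming ffmpeg.exe is added to the role project with "Copy to Output Directory" so it is deployed next to the role assembly; the method and file-name arguments are illustrative:
using System;
using System.Diagnostics;
using System.IO;
using Microsoft.WindowsAzure.ServiceRuntime;

public static void ConvertFile(string inputName, string outputName)
{
    // The .exe lives in the (read/execute) application folder, while the
    // files it reads and writes stay in the writable local storage folder.
    string exePath = Path.Combine(AppDomain.CurrentDomain.BaseDirectory, "ffmpeg.exe");
    string storageRoot = RoleEnvironment.GetLocalResource("localStorage").RootPath;

    var psi = new ProcessStartInfo
    {
        FileName = exePath,
        Arguments = string.Format("-i \"{0}\" \"{1}\"",
            Path.Combine(storageRoot, inputName),
            Path.Combine(storageRoot, outputName)),
        UseShellExecute = false,
        CreateNoWindow = true
    };

    using (var process = Process.Start(psi))
    {
        process.WaitForExit();
    }
}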
By default your worker role will most likely not be running with sufficient privilege to allow changes to the access control lists on Azure folders.
There are two possible options:
Best: run a script at startup to set the permissions. Details are on MSDN here: http://msdn.microsoft.com/en-us/library/gg456327.aspx. You'll want to set executionContext="elevated".
The best way to write the script itself is through PowerShell. An example is here: http://weblogs.thinktecture.com/cweyer/2011/01/fixing-windows-azure-sdk-13-full-iis-diagnostics-and-tracing-bug-with-a-startup-task-a-grain-of-salt.html. Alternatively, write a console application to do the same thing. A sketch of the startup task declaration follows these options.
Easiest, but much less secure: set the security in your OnStart method, and run your whole worker role elevated: in your service definition file include
<WebRole name="WebApplication2">
  <Runtime executionContext="elevated" />
  <Sites>
However, I'd really recommend against that, as it's a terrible security hole for something that's running in the public cloud.
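For the first option, the startup task declaration in ServiceDefinition.csdef looks roughly like this; the script name is illustrative, and the icacls command in the comment is just one way such a script could grant access:
<WorkerRole name="MyWorkerRole">
  <Startup>
    <!-- FixPermissions.cmd is an illustrative name; it could, for example, run:
         icacls "%RoleRoot%\approot" /grant "NETWORK SERVICE":(OI)(CI)F -->
    <Task commandLine="FixPermissions.cmd" executionContext="elevated" taskType="simple" />
  </Startup>
</WorkerRole>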
Related
I have a simple .Net Core MVC web application that I deploy to Azure. In the application, I am creating a little text file called "test.txt" using File.CreateText(). This works fine on my local PC, but when I deploy it to Azure, I get a strange message: "Could not find file 'D:\home\site\wwwroot\wwwroot\test.txt'."
Indeed, the file does not exist--that's why I'm creating it. Also, it appears someone else on SO is having a similar problem: FileNotFoundException when using System.IO.Directory.CreateDirectory()
Do I not have write permissions? Why is Azure not letting me create the file?
Code (in Startup.cs):
public void Configure(IApplicationBuilder app, IHostingEnvironment env)
{
    using (var sw = File.CreateText(env.WebRootPath + "/test.txt")) { }
}
I found the reason for this error shortly after posting this question, but forgot to post the answer:
It turns out that when you deploy to Azure, your site runs from a temporary directory, which gets wiped and re-created every time you deploy. Because of this, Azure disables the creation and editing of files in the app's directory, since they're just going to get wiped upon your next deployment (it would be nice if the error message Azure displayed was a little more informative).
Therefore, the way to create and edit files on your Azure app and have them persist across deployments is to use one of the following:
- Database BLOB storage: simply store your files as byte arrays in a database. This doesn't cost anything extra if you are already using a database.
- Azure Blob Storage: store your files in Azure's dedicated object storage service. This costs money.
I read this on some site shortly after posting this question but I can't remember where, so I apologize for not having the source.
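For the Azure Blob Storage option, a minimal upload sketch using the WindowsAzure.Storage client could look like this; the container and blob names are illustrative, and some method names vary slightly between SDK versions:
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

public static void SaveTextToBlob(string connectionString, string text)
{
    var account = CloudStorageAccount.Parse(connectionString);
    var client = account.CreateCloudBlobClient();

    // "uploads" is an illustrative container name.
    var container = client.GetContainerReference("uploads");
    container.CreateIfNotExists(); // CreateIfNotExist() in older SDK versions

    // Unlike files under wwwroot, the blob persists across deployments.
    var blob = container.GetBlockBlobReference("test.txt");
    blob.UploadText(text);
}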
The best way is to use Azure Blob Storage for this. But you can try saving a file like this:
static void Main(string[] args)
{
    // Create a string with a line of text.
    string text = "First line" + Environment.NewLine;

    // Set a variable to the Documents path.
    string docPath = Environment.GetFolderPath(Environment.SpecialFolder.MyDocuments);

    // Write the text to a new file named "WriteFile.txt".
    File.WriteAllText(Path.Combine(docPath, "WriteFile.txt"), text);

    // Create a string array with the additional lines of text.
    string[] lines = { "New line 1", "New line 2" };

    // Append the new lines of text to the file.
    File.AppendAllLines(Path.Combine(docPath, "WriteFile.txt"), lines);
}
Reference: https://learn.microsoft.com/pt-br/dotnet/standard/io/how-to-write-text-to-a-file
I have a webjob that I have set up to be triggered when a file is added to a directory:
[FileTrigger(#"<DIR>\<dir>\{name}", "*", WatcherChangeTypes.Created, autoDelete: true)] Stream file,
I have it configured:
var config = new JobHostConfiguration
{
    JobActivator = new NinjectActivator(kernel)
};

var filesConfig = new FilesConfiguration();
#if DEBUG
filesConfig.RootPath = @"C:\Temp\";
#endif

config.UseFiles(filesConfig);
config.UseCore();
The path is for working locally. I was expecting that commenting out the FilesConfiguration object and leaving it at the default would let it pick up the connection string I have set up and trigger when files are added. This does not happen: it turns out that by default the RootPath is set to "D:\Home", which produces an InvalidOperationException:
System.InvalidOperationException : Path 'D:\home\data\<DIR>\<dir>' does not exist.
How do I get the trigger to point at the File storage area of the storage account I have set up for it? I have tried removing the FilesConfiguration completely from Program.cs in the hope that it would work against the settings, but it only produces the same exception:
System.InvalidOperationException : Path 'D:\home\data\\' does not exist.
When you publish to Azure, the default root directory is D:\home\data, so when you run the WebJob it cannot find the path and you get the error message.
How do I get the trigger to point at the File storage area of the storage account I have set up for it.
The connection strings you have set serve two purposes: one is used for dashboard logging and the other for application functionality (queues, tables, blobs).
It seems that you cannot get FileTrigger working against Azure File storage.
So if you want your FileTrigger to fire when you create a new file, you can go to D:\home\data\ in Kudu, create the <DIR>\<dir> folder structure, and then create a new .txt file in it.
By the way, it seems you'd better not use autoDelete when watching for created files; if you do, you will get an error like:
NotSupportedException: Use of AutoDelete is not supported when using change type 'Changed'.
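For reference, a trigger that avoids autoDelete might look roughly like this (a sketch assuming the Microsoft.Azure.WebJobs.Extensions package; the folder name is illustrative):
using System.IO;
using Microsoft.Azure.WebJobs;

public class Functions
{
    // Fires for new *.txt files under <RootPath>\import; no autoDelete here.
    public static void OnFileCreated(
        [FileTrigger(@"import\{name}", "*.txt", WatcherChangeTypes.Created)] Stream file,
        string name,
        TextWriter log)
    {
        log.WriteLine("Processing {0}", name);
    }
}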
Problem:
I am trying to get an image out of an Azure file share for manipulation. I need to read the file as a System.Drawing.Image. I cannot create a valid FileInfo object or Image using the UNC path (which I need to do in order to use it over IIS).
Current Setup:
Attach a virtual directory called Photos in the IIS website, pointing to the UNC path of the Azure file share (e.g. \\myshare.file.core.windows.net\sharename\pathtoimages).
This works as http://example.com/photos/img.jpg, so I know it is not a permissions or authentication issue.
For some reason, though, I cannot get a reference to the file.
var imgpath = Path.Combine(Server.MapPath("~/Photos"), "img.jpg");
// resolves as \\myshare.file.core.windows.net\sharename\pathtoimages\img.jpg

var fi = new FileInfo(imgpath);
if (fi.Exists) // this returns false 100% of the time
{
    var img = System.Drawing.Image.FromFile(fi.FullName);
}
The problem is that the file is never found to exist, even though I can take that path, put it in an Explorer window, and get back img.jpg 100% of the time.
Does anyone have any idea why this would not be working?
Do I need to be using a CloudFileShare object just to read a file I know is there?
It turns out the issue is that I needed to wrap my code in an impersonation of the Azure file share user, since the virtual directory is not really in play at all at this point.
using (new impersonation("UserName", "azure", "azure pass"))
{
    // my IO.File code
}
I used this guy's impersonation script, found here.
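In case that link goes stale, the core of such an impersonation wrapper is roughly the following sketch (this is not the linked script itself); it assumes .NET Framework's WindowsIdentity.Impersonate and the advapi32 LogonUser API, with the NEW_CREDENTIALS logon type so the supplied credentials apply only to network access such as the file share:
using System;
using System.ComponentModel;
using System.Runtime.InteropServices;
using System.Security.Principal;

public class impersonation : IDisposable
{
    private const int LOGON32_LOGON_NEW_CREDENTIALS = 9;
    private const int LOGON32_PROVIDER_WINNT50 = 3; // required for NEW_CREDENTIALS

    [DllImport("advapi32.dll", SetLastError = true)]
    private static extern bool LogonUser(string user, string domain, string password,
        int logonType, int logonProvider, out IntPtr token);

    [DllImport("kernel32.dll", SetLastError = true)]
    private static extern bool CloseHandle(IntPtr handle);

    private readonly IntPtr _token;
    private readonly WindowsImpersonationContext _context;

    public impersonation(string user, string domain, string password)
    {
        // The current identity stays in effect locally; the supplied
        // credentials are applied to outbound network access.
        if (!LogonUser(user, domain, password,
                LOGON32_LOGON_NEW_CREDENTIALS, LOGON32_PROVIDER_WINNT50, out _token))
        {
            throw new Win32Exception(Marshal.GetLastWin32Error());
        }
        _context = WindowsIdentity.Impersonate(_token);
    }

    public void Dispose()
    {
        _context.Undo();
        CloseHandle(_token);
    }
}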
I am developing an Azure web application.
I have created drive and drivePath static members in WebRole as follows:
public static CloudDrive drive = null;
public static string drivePath = "";
I have created a development storage drive in WebRole.OnStart as follows:
LocalResource azureDriveCache = RoleEnvironment.GetLocalResource("cache");
CloudDrive.InitializeCache(azureDriveCache.RootPath, azureDriveCache.MaximumSizeInMegabytes);

CloudStorageAccount.SetConfigurationSettingPublisher((configName, configSetter) =>
{
    // For a console app, read from App.config:
    //configSetter(ConfigurationManager.AppSettings[configName]);

    // OR, if running in the Windows Azure environment:
    configSetter(RoleEnvironment.GetConfigurationSettingValue(configName));
});

CloudStorageAccount account = CloudStorageAccount.DevelopmentStorageAccount;
CloudBlobClient blobClient = account.CreateCloudBlobClient();
blobClient.GetContainerReference("drives").CreateIfNotExist();

drive = account.CreateCloudDrive(
    blobClient
        .GetContainerReference("drives")
        .GetPageBlobReference("mysupercooldrive.vhd")
        .Uri.ToString()
);

try
{
    drive.Create(64);
}
catch (CloudDriveException ex)
{
    // Handle the exception here.
    // The exception is also thrown if all is well but the drive already exists.
}

string path = drive.Mount(azureDriveCache.MaximumSizeInMegabytes, DriveMountOptions.None);
IDictionary<String, Uri> listDrives = Microsoft.WindowsAzure.StorageClient.CloudDrive.GetMountedDrives();
drivePath = path;
The drive remains visible and accessible while execution stays inside WebRole.OnStart. As soon as execution leaves WebRole.OnStart, the drive becomes unavailable to the application and the static members are reset (for example, drivePath is set back to "").
Am I missing some configuration, or is there some other error?
Where's the other code where you're expecting to use drivePath? Is it in a web application?
If so, are you using SDK 1.3? In SDK 1.3, the default mode for a web application is to run under full IIS, which means running in a separate app domain from your RoleEntryPoint code (like OnStart), so you can't share static variables across the two. If this is the problem, you might consider moving this initialization code to Application_Start in Global.asax.cs instead (which runs in the web application's app domain).
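If it helps, a sketch of that move (DriveHelper here is a hypothetical wrapper around the cache-initialize/create/mount code from the question):
// Global.asax.cs -- runs in the web application's app domain under full IIS,
// so statics set here are visible to the rest of the web application.
protected void Application_Start(object sender, EventArgs e)
{
    WebRole.drivePath = DriveHelper.MountDrive();
}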
I found the solution:
On the development machine, requests originate from localhost, which was making the system crash.
Commenting out the "Sites" tag in ServiceDefinition.csdef resolves the issue.
I have a class library that is shared between multiple Role projects in my solution. Two of these projects are a Web Role and a Worker Role.
They each have the same Configuration Setting:
<Setting name="QueueConnectionString" value="UseDevelopmentStorage=true" />
Each of them makes a call to this function:
public static void AddMessage(string Message)
{
    var account = CloudStorageAccount.DevelopmentStorageAccount;

    // Disable Nagle's algorithm for lower-latency small queue requests.
    ServicePoint queueServicePoint = ServicePointManager.FindServicePoint(account.QueueEndpoint);
    queueServicePoint.UseNagleAlgorithm = false;

    var client = account.CreateCloudQueueClient();
    var queue = client.GetQueueReference(DefaultRoleInstanceQueueName);
    queue.CreateIfNotExist();
    queue.AddMessage(new CloudQueueMessage(Message));
}
When this executes in the worker role, it works without any issues; I've confirmed proper reads and writes of queue messages. When it executes in the web role, the call to queue.CreateIfNotExist() crashes with the error "Response is not available in this context". I've tried searching for information on what could be causing this, but so far my search efforts have been fruitless. Please let me know if there's any additional information I can include.
OK, so after a lot more work, I determined that it was because I was calling it from within Global.asax's Application_Start.
I moved this to my WebRole.cs OnStart() method and it works properly.
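For illustration, the working placement looks roughly like this; QueueHelper stands in for the shared class library type holding AddMessage:
using Microsoft.WindowsAzure.ServiceRuntime;

public class WebRole : RoleEntryPoint
{
    public override bool OnStart()
    {
        // Works here: OnStart runs in the role host process, outside the
        // IIS request pipeline, so no HttpResponse is involved.
        QueueHelper.AddMessage("role started");
        return base.OnStart();
    }
}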