I am developing an Azure web application.
I have created drive and drivePath static members in WebRole as follows:
public static CloudDrive drive = null;
public static string drivePath = "";
I have created a development storage drive in WebRole.OnStart as follows:
LocalResource azureDriveCache = RoleEnvironment.GetLocalResource("cache");
CloudDrive.InitializeCache(azureDriveCache.RootPath, azureDriveCache.MaximumSizeInMegabytes);
CloudStorageAccount.SetConfigurationSettingPublisher((configName, configSetter) =>
{
// for a console app, reading from App.config
//configSetter(ConfigurationManager.AppSettings[configName]);
// OR, if running in the Windows Azure environment
configSetter(RoleEnvironment.GetConfigurationSettingValue(configName));
});
CloudStorageAccount account = CloudStorageAccount.DevelopmentStorageAccount;
CloudBlobClient blobClient = account.CreateCloudBlobClient();
blobClient.GetContainerReference("drives").CreateIfNotExist();
drive = account.CreateCloudDrive(
blobClient
.GetContainerReference("drives")
.GetPageBlobReference("mysupercooldrive.vhd")
.Uri.ToString()
);
try
{
drive.Create(64);
}
catch (CloudDriveException ex)
{
// handle exception here
// exception is also thrown if all is well but the drive already exists
}
string path = drive.Mount(azureDriveCache.MaximumSizeInMegabytes, DriveMountOptions.None);
IDictionary<String, Uri> listDrives = Microsoft.WindowsAzure.StorageClient.CloudDrive.GetMountedDrives();
drivePath = path;
The drive remains visible and accessible as long as execution stays inside WebRole.OnStart. As soon as execution leaves WebRole.OnStart, the drive becomes unavailable to the application and the static members are reset (for example, drivePath is set back to "").
Am I missing some configuration, or is there some other error?
Where's the other code where you're expecting to use drivePath? Is it in a web application?
If so, are you using SDK 1.3? In SDK 1.3, the default mode for a web application is to run under full IIS, which means running in a separate app domain from your RoleEntryPoint code (like OnStart), so you can't share static variables across the two. If this is the problem, you might consider moving this initialization code to Application_Start in Global.asax.cs instead (which runs in the web application's app domain).
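As a rough illustration of that suggestion, here is a minimal sketch that reuses the same "cache" local resource, "drives" container, and "mysupercooldrive.vhd" blob from the question, with the statics moved into the web application's app domain (treat it as an outline, not a drop-in implementation):
public class Global : System.Web.HttpApplication
{
    // These statics live in the web application's app domain (full IIS, SDK 1.3).
    public static CloudDrive Drive;
    public static string DrivePath = "";

    protected void Application_Start(object sender, EventArgs e)
    {
        // Same initialization as the OnStart code above, just relocated.
        LocalResource cache = RoleEnvironment.GetLocalResource("cache");
        CloudDrive.InitializeCache(cache.RootPath, cache.MaximumSizeInMegabytes);

        CloudStorageAccount account = CloudStorageAccount.DevelopmentStorageAccount;
        CloudBlobClient blobClient = account.CreateCloudBlobClient();
        blobClient.GetContainerReference("drives").CreateIfNotExist();

        Drive = account.CreateCloudDrive(
            blobClient.GetContainerReference("drives")
                      .GetPageBlobReference("mysupercooldrive.vhd")
                      .Uri.ToString());
        try
        {
            Drive.Create(64);
        }
        catch (CloudDriveException)
        {
            // Also thrown when the drive already exists; safe to ignore in that case.
        }

        DrivePath = Drive.Mount(cache.MaximumSizeInMegabytes, DriveMountOptions.None);
    }
}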
I found the solution:
On the development machine, requests originate from localhost, which was causing the system to crash.
Commenting out the "Sites" tag in ServiceDefinition.csdef resolves the issue.
I am currently developing a UWP application for my school project and one of the pages allows the user to take a picture of themselves. I created the feature by following this tutorial: CameraStarterKit
For now I am storing the pictures taken in my desktop's Pictures folder, but the requirement of my project is to store them in a folder called "Photos" under inetpub\wwwroot.
I don't really understand what wwwroot or IIS is, so I have no idea how I should modify my code to store the pictures in that folder.
Here is my code for saving to my local desktop:
private async Task TakePhotoAsync()
{
idleTimer.Stop();
idleTimer.Start();
var stream = new InMemoryRandomAccessStream();
//MediaPlayer mediaPlayer = new MediaPlayer();
//mediaPlayer.Source = MediaSource.CreateFromUri(new Uri("ms-appx:///Assets/camera-shutter-click-03.mp3"));
//mediaPlayer.Play();
Debug.WriteLine("Taking photo...");
await _mediaCapture.CapturePhotoToStreamAsync(ImageEncodingProperties.CreateJpeg(), stream);
try
{
var file = await _captureFolder.CreateFileAsync("NYPVisitPhoto.jpg", CreationCollisionOption.GenerateUniqueName);
Debug.WriteLine("Photo taken! Saving to " + file.Path);
var photoOrientation = CameraRotationHelper.ConvertSimpleOrientationToPhotoOrientation(_rotationHelper.GetCameraCaptureOrientation());
await ReencodeAndSavePhotoAsync(stream, file, photoOrientation);
Debug.WriteLine("Photo saved!");
}
catch (Exception ex)
{
// File I/O errors are reported as exceptions
Debug.WriteLine("Exception when taking a photo: " + ex.ToString());
}
}
And for re-encoding and saving the photo:
private static async Task ReencodeAndSavePhotoAsync(IRandomAccessStream stream, StorageFile file, PhotoOrientation photoOrientation)
{
using (var inputStream = stream)
{
var decoder = await BitmapDecoder.CreateAsync(inputStream);
using (var outputStream = await file.OpenAsync(FileAccessMode.ReadWrite))
{
var encoder = await BitmapEncoder.CreateForTranscodingAsync(outputStream, decoder);
var properties = new BitmapPropertySet { { "System.Photo.Orientation", new BitmapTypedValue(photoOrientation, PropertyType.UInt16) } };
await encoder.BitmapProperties.SetPropertiesAsync(properties);
await encoder.FlushAsync();
}
}
}
I would add an answer since there are some tricky things about this requirement.
The first is that a UWP app can only access a few folders, and inetpub is not one of them.
Using a brokered Windows Runtime component (I would suggest FullTrustProcessLauncher, which is much simpler to develop and deploy) can enable a UWP app to access folders the same way traditional desktop applications do.
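For reference, the UWP side of the FullTrustProcessLauncher approach is only a few lines. This sketch assumes you have referenced the Windows Desktop Extensions SDK and declared the full-trust executable plus the runFullTrust restricted capability in Package.appxmanifest (setup steps not shown here):
// Namespaces assumed: Windows.ApplicationModel and Windows.Foundation.Metadata.
private async Task LaunchDesktopHelperAsync()
{
    // The full-trust launcher is only available where the desktop extension contract exists.
    if (ApiInformation.IsApiContractPresent("Windows.ApplicationModel.FullTrustAppContract", 1, 0))
    {
        // Starts the desktop process declared in the manifest; that process runs as a
        // normal desktop app and can do the actual file copying on the UWP app's behalf.
        await FullTrustProcessLauncher.LaunchFullTrustProcessForCurrentAppAsync();
    }
}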
While this works for an ordinary folder, the inetpub folder is different in that it requires administrator privileges to write to, unless you turn UAC off.
The desktop component launched by the app does not have adequate privileges to write to that folder, either.
So I think an alternative would be to set up a virtual directory in IIS Manager that maps to a folder in the public Pictures library, and have the app save pictures to that folder.
From the website's perspective, a virtual directory is the same as a real folder under inetpub; what differs is the access permissions.
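If you go that route, the UWP side only needs to write into the Pictures library. A minimal sketch, assuming the Pictures Library capability is declared in Package.appxmanifest and that the IIS virtual directory points at a "Photos" subfolder (both are assumptions, not part of the original code):
private static async Task<StorageFile> CreatePhotoFileInSharedFolderAsync()
{
    // Create (or reuse) a "Photos" folder in the Pictures library; IIS can map a
    // virtual directory to this same folder so the website sees the saved photos.
    StorageFolder photosFolder = await KnownFolders.PicturesLibrary
        .CreateFolderAsync("Photos", CreationCollisionOption.OpenIfExists);

    // Same naming scheme as the question's code.
    return await photosFolder.CreateFileAsync(
        "NYPVisitPhoto.jpg", CreationCollisionOption.GenerateUniqueName);
}
The returned StorageFile could then be passed to the existing ReencodeAndSavePhotoAsync method in place of the file created in _captureFolder.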
Kennyzx is right here that you cannot access the inetpub folder from your UWP application due to permissions.
But if your application fulfills the following criteria, then you can use a Brokered Windows Component (a component within your app) to copy your files to any location on the system:
- Your application is a LOB application
- You are only targeting desktop devices (I assume this is true given your requirement)
- You are using side-loading for your app installation and distribution
If all three are yes, then use a Brokered Windows Component for UWP. It's not a small thing that can be shown here on SO with an example, so it's worth reading up on and implementing it.
Is it possible to stream MP3 or WAV files with Microsoft Azure?
I want to have a file which can be played with any JS player, including starting playback from any point the user wants (like the SoundCloud player).
I tried to use Blob Storage for that, but that didn't work because it doesn't support streaming, so the file has to be downloaded completely before you can jump to a certain point in the song.
Is there a way to make this possible with Blob Storage? Or do I have to use Azure Media Services (I tried that but only found support for video)?
The problem here is the default service version of the storage account: older default versions don't honor HTTP range requests for clients that don't specify a version, so the player cannot seek. I had to set it manually to a higher version (2014-02-14) via a Java program.
Here is the code. You need the Azure Storage SDK for Java (https://github.com/Azure/azure-sdk-for-java) and the slf4j, lang3, and fasterxml libraries.
import com.microsoft.azure.storage.CloudStorageAccount;
import com.microsoft.azure.storage.ServiceProperties;
import com.microsoft.azure.storage.blob.CloudBlobClient;

public class storage {
    public static void main(String[] args) {
        try {
            CloudStorageAccount storageAccount = CloudStorageAccount.parse(
                    "AccountName=<storageName>;AccountKey=<storageAccessKey>;DefaultEndpointsProtocol=http");
            CloudBlobClient blobClient = storageAccount.createCloudBlobClient();

            // Get the current service properties
            ServiceProperties serviceProperties = blobClient.downloadServiceProperties();

            // Set the default service version to 2014-02-14
            // (any version from 2011-08-18 onwards supports range requests)
            serviceProperties.setDefaultServiceVersion("2014-02-14");

            // Save the updated service properties
            blobClient.uploadServiceProperties(serviceProperties);
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
I'm trying to work with Azure WebJobs. I understand the way it works, but I don't understand why I need to use two connection strings: one is for the queue holding the messages, but
why is there another one called "AzureWebJobsDashboard"?
What is its purpose?
And where do I get this connection string from?
At the moment I have one Web App and one WebJob in the same solution. I'm experimenting only locally (without publishing anything); the one thing I have up in the cloud is the storage account that holds the queue.
I even tried to put the same connection string in both places (AzureWebJobsDashboard, AzureWebJobsStorage), but it throws an exception:
"Cannot bind parameter 'log' when using this trigger."
Thank you.
There are two connection strings because the WebJobs SDK writes some logs in the storage account. It gives you the possibility of having one storage account just for data (AzureWebJobsStorage) and another one for logs (AzureWebJobsDashboard). They can be the same. Also, you need two of them because you can have multiple job hosts using different data accounts but sending logs to the same dashboard.
The error you are getting is not related to the connection strings but to one of the functions in your code. One of them has a log parameter that is not of the right type. Can you share the code?
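To illustrate the kind of signature the runtime can bind, here is a minimal sketch of a WebJobs function, assuming the classic (1.x/2.x) WebJobs SDK and a hypothetical queue named "myqueue" (both assumptions, not taken from the question):
// Requires Microsoft.Azure.WebJobs and System.IO.
public class Functions
{
    // The 'log' parameter only binds to supported types; in the classic WebJobs SDK
    // that is TextWriter (newer SDK versions use ILogger instead).
    public static void ProcessQueueMessage(
        [QueueTrigger("myqueue")] string message,
        TextWriter log)
    {
        log.WriteLine("Processed message: " + message);
    }
}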
Okay, anyone coming here looking for an actual answer to "where do I get the ConnectionString from"... here you go.
On the new Azure portal, you should have a Storage Account resource; mine starts with "portalvhds" followed by a bunch of alphanumerics. Click that to see a resource Dashboard on the right, followed immediately by a Settings window. Look for the Keys submenu under General and click that. The whole connection string is there (actually there are two, Primary and Secondary; I don't currently understand the difference, but let's go with Primary, shall we?).
Copy and paste that into your App.config file as the connectionString attribute of both the AzureWebJobsDashboard and AzureWebJobsStorage items. This presumes your environment has only one Storage Account, and so you want that same storage to be used for data and logs.
I tried this, and at least the WebJob ran without throwing an error.
#RayHAz - Expanding upon your above answer (thanks)...
I tried this https://learn.microsoft.com/en-us/azure/app-service/webjobs-sdk-get-started
but in .NET Core 2.1 I was getting exceptions about how it couldn't find the connection string.
Long story short, I ended up with the following, which worked for me:
appsettings.json, in a .Net Core 2.1 Console app:
{
"ConnectionStrings": {
"AzureWebJobsStorage": "---your Azure storage connection string here---",
"AzureWebJobsDashboard":"---the same connectionstring---"
}
}
... and my Program.cs file...
using System;
using System.IO;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.Logging;
namespace YourWebJobConsoleAppProjectNamespaceHere
{
public class Program
{
public static IConfiguration Configuration;
static void Main(string[] args)
{
var builder = new ConfigurationBuilder()
.SetBasePath(Path.Combine(AppContext.BaseDirectory))
.AddJsonFile("appsettings.json", true);
Configuration = builder.Build();
var azureWebJobsStorageConnectionString = Configuration.GetConnectionString("AzureWebJobsStorage");
var azureWebJobsDashboardConnectionString = Configuration.GetConnectionString("AzureWebJobsDashboard");
var config = new JobHostConfiguration
{
DashboardConnectionString = azureWebJobsDashboardConnectionString,
StorageConnectionString = azureWebJobsStorageConnectionString
};
var loggerFactory = new LoggerFactory();
config.LoggerFactory = loggerFactory.AddConsole();
var host = new JobHost(config);
host.RunAndBlock();
}
}
}
I use local storage on Windows Azure to store temporary files. From there I call an .exe file to convert several other files in the same local storage folder. The problem is I always get the exception "Access to the path XYZ.exe is denied.".
I should mention the following:
- I am using a worker role
- local storage is set in the service definition file
I have also tried to add permissions to the folder I am accessing:
public static void AddPermission(string absoluteFolderPath)
{
DirectoryInfo myDirectoryInfo = new DirectoryInfo(absoluteFolderPath);
DirectorySecurity myDirectorySecurity = myDirectoryInfo.GetAccessControl();
myDirectorySecurity.AddAccessRule(new FileSystemAccessRule(
"NETWORK SERVICE",
FileSystemRights.FullControl,
AccessControlType.Allow));
myDirectoryInfo.SetAccessControl(myDirectorySecurity);
}
UPDATE:
I have now tried this code:
public static void FixPermissions()
{
var tempDirectory = RoleEnvironment.GetLocalResource("localStorage").RootPath;
Helper.addPermission(tempDirectory);
var dir = new DirectoryInfo(tempDirectory);
foreach (var d in dir.GetDirectories())
Helper.addPermission(d.FullName);
}
private static void addPermission(string path)
{
FileSystemAccessRule everyoneFileSystemAccessRule = new FileSystemAccessRule("Everyone",
FileSystemRights.FullControl,
InheritanceFlags.ContainerInherit | InheritanceFlags.ObjectInherit,
PropagationFlags.None, AccessControlType.Allow);
DirectoryInfo directoryInfo = new DirectoryInfo(path);
DirectorySecurity directorySecurity = directoryInfo.GetAccessControl();
directorySecurity.AddAccessRule(everyoneFileSystemAccessRule);
directoryInfo.SetAccessControl(directorySecurity);
}
I get really strange behaviour: I still get the errors, but sometimes some files do get converted by ffmpeg.exe.
Can someone help me out here?
Thanks a lot.
SOLUTION:
It seems the problem was that I ran the .exe file from within local storage and therefore hit the security issues described. Putting the .exe into the application package and referencing it directly solved my issue.
Thanks for your help.
By default your worker role will most likely not be running with sufficient privilege to allow changes to the access control lists on Azure folders.
There are two possible options:
Best: run a script at startup to set the permissions. Details are on MSDN here: http://msdn.microsoft.com/en-us/library/gg456327.aspx. You'll want to set executionContext="elevated".
The best way to write the script itself is through PowerShell. An example is here: http://weblogs.thinktecture.com/cweyer/2011/01/fixing-windows-azure-sdk-13-full-iis-diagnostics-and-tracing-bug-with-a-startup-task-a-grain-of-salt.html. Alternatively, write a console application to do the same thing (a sketch of that approach is shown after these options).
Easiest, but much less secure: set the security in your OnStart method, and run your whole worker role elevated: in your service definition file include
<WebRole name="WebApplication2">
<Runtime executionContext="elevated" />
<Sites>
However, I'd really not recommend that as it's a terrible security hole for something that's running in the public cloud.
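As mentioned for the first option, the console application run as an elevated startup task can be very small. Here is a minimal sketch; it assumes the local resource is named "localStorage" as in the question, that the task is declared with executionContext="elevated", and it mirrors the "Everyone" rule from the question's update (adjust the account and rights to something tighter for production):
using System.IO;
using System.Security.AccessControl;
using Microsoft.WindowsAzure.ServiceRuntime;

class FixLocalStorageAcl
{
    static void Main()
    {
        // Root path of the local storage resource defined in ServiceDefinition.csdef.
        string root = RoleEnvironment.GetLocalResource("localStorage").RootPath;

        // Grant full control, inherited by all sub-directories and files created later.
        var directoryInfo = new DirectoryInfo(root);
        DirectorySecurity security = directoryInfo.GetAccessControl();
        security.AddAccessRule(new FileSystemAccessRule(
            "Everyone",
            FileSystemRights.FullControl,
            InheritanceFlags.ContainerInherit | InheritanceFlags.ObjectInherit,
            PropagationFlags.None,
            AccessControlType.Allow));
        directoryInfo.SetAccessControl(security);
    }
}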
I have created local storage in my web role called "MyTestCache" in my
ServiceDefinition.csdef file. But whenever I call the System.IO.File.WriteAllBytes method I get an UnauthorizedAccessException. Does anyone know what could be causing this? I don't get this when creating the directory in the code below, only when writing. I am using SDK 1.3.
private void SaveFileToLocalStorage(byte[] remoteFile, string filePath)
{
try
{
LocalResource myIO = RoleEnvironment.GetLocalResource("MyTestCache");
// Creates directory if it doesn't exist (ie the first time)
if (!Directory.Exists(myIO.RootPath + "/thumbnails"))
{
Directory.CreateDirectory(myIO.RootPath + "/thumbnails");
}
string PathToFile = Path.Combine(myIO.RootPath + "/thumbnails", filePath);
var path = filePath.Split(Char.Parse("/"));
// Creates the directory for the content item (GUID)
if (!Directory.Exists(Path.Combine(myIO.RootPath + "/thumbnails", path[0])))
{
Directory.CreateDirectory(Path.Combine(myIO.RootPath + "/thumbnails", path[0]));
}
// Writes the file to local storage.
File.WriteAllBytes(PathToFile, remoteFile);
}
catch (Exception ex)
{
// do some exception handling
return;
}
}
Check the ACLs. In SDK 1.3, web roles by default run in the full IIS worker process, using Network Service as the identity of the application pool. Make sure the Network Service account has permissions to execute the operations you expect. In your case you are trying to create a sub-directory, so most probably you need at least Write permission. If your role also modifies ACLs on this directory, you need to grant Full control on this directory.
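If you want to fix that from code rather than a startup script, a minimal sketch of doing it in OnStart could look like this; it assumes the role's Runtime element is set to executionContext="elevated" (otherwise changing the ACL will itself fail) and follows the same pattern as the permission code shown earlier in this section:
public override bool OnStart()
{
    // Root of the "MyTestCache" local resource from ServiceDefinition.csdef.
    string root = RoleEnvironment.GetLocalResource("MyTestCache").RootPath;

    // Give the IIS application pool identity (Network Service) modify rights,
    // inherited by sub-directories and files, so File.WriteAllBytes succeeds.
    var directoryInfo = new DirectoryInfo(root);
    DirectorySecurity security = directoryInfo.GetAccessControl();
    security.AddAccessRule(new FileSystemAccessRule(
        "NETWORK SERVICE",
        FileSystemRights.Modify,
        InheritanceFlags.ContainerInherit | InheritanceFlags.ObjectInherit,
        PropagationFlags.None,
        AccessControlType.Allow));
    directoryInfo.SetAccessControl(security);

    return base.OnStart();
}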