Azure App Service: access file storage to copy content using a WebJob - azure

I have an Azure App Service that hosts a WordPress site. I want to write a console application that will copy files from the website (file storage) and paste them into a deployment slot. All of the online resources talk about "access keys" for connecting to the file storage, but I do not see anything like this in the App Service portal. Can I use deployment credentials or Web Deploy credentials to access these files?

According to your description, I suggest you use a WebJob's file trigger to achieve your requirement.
Link: webjob-extension
You can use the file trigger to watch for file changes under a path in the file system, find the deployment slot's FTP credentials, and then use them to upload the file from the production folder to the deployment slot with the WebJobs extensions package.
For more details, refer to the following image and code:
1. Find the FTP credentials and set the password
Set the username and password
2. Install Microsoft.Azure.WebJobs.Extensions from the NuGet package manager and write the WebJob method.
Code like below:
Note: the default watched root path is D:\home\data; if your files are inside your website folder, you need to change the path as below.
static void Main()
{
    var config = new JobHostConfiguration();
    FilesConfiguration filesConfig = new FilesConfiguration();

    // On App Service, the HOME environment variable points at D:\home
    string home = Environment.GetEnvironmentVariable("HOME");
    if (!string.IsNullOrEmpty(home))
    {
        filesConfig.RootPath = Path.Combine(home, "site");
    }
    config.UseFiles(filesConfig);

    var host = new JobHost(config);
    // The following code ensures that the WebJob will be running continuously
    host.RunAndBlock();
}
Function:
public static void ImportFile(
    [FileTrigger(@"wwwroot\doc\{name}", "*.*", WatcherChangeTypes.Created | WatcherChangeTypes.Changed)] Stream file,
    FileSystemEventArgs fileTrigger,
    TextWriter log)
{
    log.WriteLine(string.Format("Processed input file '{0}'!", fileTrigger.Name));

    // Upload the changed file to the deployment slot over FTP
    FtpWebRequest request = (FtpWebRequest)WebRequest.Create(string.Format("ftp://yourftpurl.ftp.azurewebsites.windows.net/site/wwwroot/doc/{0}", fileTrigger.Name));
    request.Method = WebRequestMethods.Ftp.UploadFile;
    request.Credentials = new NetworkCredential("username", "password");

    using (Stream requestStream = request.GetRequestStream())
    {
        file.CopyTo(requestStream);
    }

    FtpWebResponse response = (FtpWebResponse)request.GetResponse();
    log.WriteLine("Upload File Complete, status {0}", response.StatusDescription);
    response.Close();
}
Result:
If you add a file to the production slot's doc folder, the WebJob will copy it to the deployment slot's doc folder.

You could use the "Azure Site Replicator" extension. A slot is essentially another Azure App Service, so it should replicate between slots just fine.
In the deployment slot that you want everything copied to, download the publish settings from the Overview tab by clicking "Get publish profile".
In your App Service's production slot, go to Extensions and add the Site Replicator extension. After it is installed, click it and then click 'Browse'. That opens a new window with the configuration options.
In the configuration window, upload the publish settings file.

Related

How to use Azure App Settings in a Blazor WebAssembly Client side application at runtime as appsettings.json configuration?

I'm working on a Blazor WebAssembly Client/Server project (directory structure as above).
There are some application settings in both the client and server projects.
The projects are hosted in Azure.
The problem is on the client side with appsettings.json.
On the client side, appsettings.json lives inside the wwwroot directory. The app can access the file just fine; however, its settings cannot be overridden by the App Service's Application Settings in the Azure Portal.
That means that after the app is deployed to an Azure Web App Service, my configuration no longer picks up the Application Settings' values.
This is the code in Program.cs, which works fine and reads the configuration from the file, but ignores the Web App Service's configuration settings on Azure.
public static async Task Main(string[] args)
{
    var builder = WebAssemblyHostBuilder.CreateDefault(args);
    builder.RootComponents.Add<App>("app");

    // Add a named HttpClient and set its base address and default request headers.
    // "sbformsapi" is defined in appsettings.json or in the Azure App Service configuration (Application Settings)
    builder.Services.AddHttpClient("SOME_WEB_URL", client =>
    {
        client.BaseAddress = new Uri(builder.Configuration["sbformsapi"]);
    });

    // Add a named HttpClient and set its base address and default request headers.
    // "sbwebappapi" is defined in the same way
    builder.Services.AddHttpClient("WEB_APP_API", client =>
    {
        client.BaseAddress = new Uri(builder.Configuration["sbwebappapi"]);
    });

    builder.Services.AddAuthorizationCore();
    // ...
    await builder.Build().RunAsync();
}
Could someone please guide me on how I can either:
set the appsettings.json file outside wwwroot and read it from there,
OR
inject/use the values from the Azure App Service configuration's Application Settings at runtime?
I am talking about the Application Settings here (as in the pic)...
Currently, application settings are only available to the backend API associated with your Blazor app (assuming you are using a Static Web App?).
https://learn.microsoft.com/en-gb/azure/static-web-apps/application-settings
Looking at the Blazor docs, I don't think it is possible to load Azure App Settings directly in a WebAssembly app. You can check for yourself:
https://learn.microsoft.com/en-us/aspnet/core/blazor/fundamentals/configuration?view=aspnetcore-6.0
I suggest instead putting the backend URL in appsettings.json and then using a backend service to load the configuration information from there.
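The backend-service workaround can be sketched like this: the server project (where Azure Application Settings do apply) exposes the values as JSON, and the client loads them into its configuration at startup with AddJsonStream. This is only a sketch under assumptions; the endpoint name "api/clientconfig" is hypothetical, and the named client mirrors the one in the question:

```csharp
using System;
using System.Net.Http;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Components.WebAssembly.Hosting;
using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.DependencyInjection;

public class Program
{
    public static async Task Main(string[] args)
    {
        var builder = WebAssemblyHostBuilder.CreateDefault(args);
        builder.RootComponents.Add<App>("app");

        // Fetch configuration produced by the *server* project, which can read
        // the App Service Application Settings. "api/clientconfig" is a
        // hypothetical endpoint you would add to the server yourself.
        using var http = new HttpClient { BaseAddress = new Uri(builder.HostEnvironment.BaseAddress) };
        using var response = await http.GetAsync("api/clientconfig");
        using var stream = await response.Content.ReadAsStreamAsync();

        // Values in this stream layer over (and override) wwwroot/appsettings.json
        builder.Configuration.AddJsonStream(stream);

        // "sbformsapi" now resolves from the server-supplied JSON when present
        builder.Services.AddHttpClient("SOME_WEB_URL", client =>
        {
            client.BaseAddress = new Uri(builder.Configuration["sbformsapi"]);
        });

        await builder.Build().RunAsync();
    }
}
```

The server endpoint simply returns a JSON object of whichever keys the client should see, read from its own IConfiguration, so changing an Application Setting in the Portal takes effect on the next client load without redeploying.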

Error while running after publishing an ASP.NET Core project to an Azure free account

I have a free account in Azure. I created a simple project without a database, published it to Azure, and it runs fine there.
But after that I created another project with a database and published it the following way:
select project -> Publish -> App Service -> Create new.
Select Create profile, then give the App Service name.
Create a SQL database: I gave the database server name, administrator user name and admin password, then clicked OK.
Finally I clicked the Create button to create the Azure SQL database, SQL Server, and App Service; it created the App Service successfully.
After creating the App Service, I ticked the database default connection ("use this connection string at runtime"), then clicked Save.
After that I clicked Publish, and when it tries to run the published site, the following error message is shown:
HTTP Error 500.30 - ANCM In-Process Start Failure
Common solutions to this issue:
The application failed to start
The application started but then stopped
The application started but threw an exception during startup
Troubleshooting steps:
Check the system event log for error messages
Enable logging the application process' stdout messages
Attach a debugger to the application process and inspect
I was calling a SeedData function from Program.cs to create the master data before starting the application. After commenting it out, the site works fine.
public static void Main(string[] args)
{
    var host = CreateHostBuilder(args).Build();
    using (var scope = host.Services.CreateScope())
    {
        var services = scope.ServiceProvider;
        var context = services.GetRequiredService<MvcProj3_1Context>();
        SeedData(context);
    }
    host.Run();
}
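Since the site starts once SeedData is commented out, the seeding itself is almost certainly throwing during startup on Azure (for example, the connection string is not applied there or the database is unreachable). A more defensive variant, sketched under the assumption that the project uses EF Core migrations and the same MvcProj3_1Context and SeedData as above, logs the failure instead of taking the whole site down with a 500.30:

```csharp
using System;
using Microsoft.EntityFrameworkCore;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Logging;

public static void Main(string[] args)
{
    var host = CreateHostBuilder(args).Build();

    using (var scope = host.Services.CreateScope())
    {
        var services = scope.ServiceProvider;
        try
        {
            var context = services.GetRequiredService<MvcProj3_1Context>();
            context.Database.Migrate(); // only if you use EF Core migrations; ensures the schema exists before seeding
            SeedData(context);
        }
        catch (Exception ex)
        {
            // Log instead of crashing startup with "ANCM In-Process Start Failure"
            var logger = services.GetRequiredService<ILogger<Program>>();
            logger.LogError(ex, "An error occurred while seeding the database.");
        }
    }

    host.Run();
}
```

With this in place, enabling stdout logging or the App Service log stream (as the error page's troubleshooting steps suggest) will show the actual exception rather than the generic 500.30.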

Azure Logic App: Create CSV File in Blob

I am having a problem updating a CSV file in my blob. I have an existing CSV file inside my blob container, and when I press the download button it automatically downloads the file to my machine.
I have already created a Logic App that updates the CSV file. When I run the app's trigger, it updates the file, but now when I press download it opens a new tab where the CSV file is displayed.
I want it to behave like the original: pressing download should download the file to my machine.
Any help, or confirmation of whether this is possible, is appreciated.
I already tried "Compose" and "Create CSV table", but that way it will not be stored in a blob.
From my testing, when you create a .csv file with the "Create blob" action in a Logic App, you will always hit the same problem. This is because the blob's content type is "text/plain", which makes the browser display it in another tab.
So I suggest you use an Azure Function to create the blob. In the function you can set: blob.Properties.ContentType = "application/octet-stream";
Here is the create-blob method in the Azure Function:
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(connectionString);
CloudBlobClient client = storageAccount.CreateCloudBlobClient();
CloudBlobContainer container = client.GetContainerReference("data");
await container.CreateIfNotExistsAsync();

CloudBlockBlob blob = container.GetBlockBlobReference(name);
// Force a download instead of an inline display in the browser
blob.Properties.ContentType = "application/octet-stream";

using (Stream stream = new MemoryStream(Encoding.UTF8.GetBytes(data)))
{
    await blob.UploadFromStreamAsync(stream);
}
For more detailed code, you can refer to this article. After following it, you will be able to download your blob to your local machine.
Note: when you create the Azure Function action in the Logic App, it may show an error. Just change the "AzureWebJobsSecretStorageType" app setting to "blob" (or delete it).

Accessing Azure File Storage from Azure Function

I'm attempting to retrieve a file from Azure File Storage to be used by an .exe that is executed within an Azure Function, and I can't seem to get past the UNC credentials.
My app gets the UNC file path out of an Azure SQL database and then attempts to navigate to that UNC path (in Azure File Storage) to import the contents of the file. I can navigate to the file location from my PC in Windows Explorer, but am prompted for credentials.
I've tried using a "net use" command prior to executing the app but it doesn't seem to authenticate.
net use \\<storage account>.file.core.windows.net\<directory>\ /u:<username> <access key>
MyApp.exe
Azure Function Log error:
Unhandled Exception: System.UnauthorizedAccessException: Access to the path '<file path>' is denied.
If possible, I'd rather not modify my C# app and instead do the authentication in the Azure Function (it's a batch function at the moment and will be timer-based).
I believe it is not possible to mount an Azure File Service share in an Azure Function, as you don't get access to the underlying infrastructure (same deal as Web Apps).
What you can do is make use of the Azure Storage SDK, which is a wrapper over the Azure Storage REST API, and use it in your application to interact with the files in your File Service share.
You cannot use SMB (445/TCP). Functions run inside the App Service sandbox.
From https://github.com/projectkudu/kudu/wiki/Azure-Web-App-sandbox#restricted-outgoing-ports:
Restricted Outgoing Ports
Regardless of address, applications cannot connect to anywhere using ports 445, 137, 138, and 139. In other words, even if connecting to a non-private IP address or the address of a virtual network, connections to ports 445, 137, 138, and 139 are not permitted.
Use the Azure Storage SDK to talk to your Azure File endpoint:
using Microsoft.Azure; // Namespace for CloudConfigurationManager
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.File;

// Parse the connection string and return a reference to the storage account.
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(
    CloudConfigurationManager.GetSetting("StorageConnectionString"));

// Create a CloudFileClient object for credentialed access to File storage.
CloudFileClient fileClient = storageAccount.CreateCloudFileClient();

// Get a reference to the file share we created previously.
CloudFileShare share = fileClient.GetShareReference("logs");

// Ensure that the share exists.
if (share.Exists())
{
    // Get a reference to the root directory for the share.
    CloudFileDirectory rootDir = share.GetRootDirectoryReference();

    // Get a reference to the directory we created previously.
    CloudFileDirectory sampleDir = rootDir.GetDirectoryReference("CustomLogs");

    // Ensure that the directory exists.
    if (sampleDir.Exists())
    {
        // Get a reference to the file we created previously.
        CloudFile file = sampleDir.GetFileReference("Log1.txt");

        // Ensure that the file exists.
        if (file.Exists())
        {
            // Write the contents of the file to the console window.
            Console.WriteLine(file.DownloadTextAsync().Result);
        }
    }
}
The sample uses CloudConfigurationManager - I think that's a bit too much for such a simple scenario. I would do this instead:
using System.Configuration;
// "StorConnStr" is the Storage account Connection String
// defined for your Function in the Azure Portal
string connstr = ConfigurationManager.ConnectionStrings["StorConnStr"].ConnectionString;

Saving an X509 certificate from an Azure Blob and using it in an Azure website

I have an Azure Website and an Azure Blob that I'm using to store a .cer X509 certificate file.
The goal is to get the .cer file from the blob and use it to perform an operation (the code for that is in the Controller for my Azure website and it works).
When I run the code locally (without publishing my site) it works, because I save it in D:\
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(ConfigurationManager.AppSettings["StorageConnectionString"]);

// Create the blob client.
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();

// Retrieve reference to a previously created container.
CloudBlobContainer container = blobClient.GetContainerReference("myContainer");

// Retrieve reference to a blob named "testcert.cer".
CloudBlockBlob blockBlob = container.GetBlockBlobReference("testcert.cer");

// Save blob contents to a file.
using (var fileStream = System.IO.File.OpenWrite("D:/testcert.cer"))
{
    blockBlob.DownloadToStream(fileStream);
}

string certLocation = "D:/testcert.cer";
X509Certificate2 myCert = new X509Certificate2();
myCert.Import(certLocation);
I am unable to figure out how or where I can save it. If I try to use the Import method with a URL (that of the Azure blob where the certificate is stored), I get an error because Import can't handle URLs.
Any idea what I can use as temporary storage on the Azure website or in the blob, so I can create an X509Certificate from it?
Edit: I'm trying to add more detail about the problem I'm trying to solve.
Get a cert file from an Azure blob and write it to the Azure website.
Use the .Import(string pathToCert) method on an X509Certificate object to create the cert, which is then used to make a call in a method I've written in my controller.
I've been able to work around the first step by manually adding the .cer file to my site's wwwroot folder via FTP. But now, when I use Server.MapPath("~/testcert.cer") to get the path to my certificate, I get this: D:\home\site\wwwroot\testcert.cer
Obviously, when the Import method uses the string above as a path once it's deployed to my Azure website, it's not a valid path, so my cert creation fails.
Any ideas? Thanks!
Saving the certificate locally is generally a no-no for Azure; you've got Blob Storage for that.
Use the Import(byte[]) overload to keep and load the certificate in memory. Here's a quick hand-coded attempt...
// Used to store the certificate data
byte[] certData;

// Save blob contents to a memory stream.
using (var stream = new MemoryStream())
{
    blockBlob.DownloadToStream(stream);
    certData = stream.ToArray();
}

X509Certificate2 myCert = new X509Certificate2();

// Import from the byte array
myCert.Import(certData);
Very simple, and this answer covers any web hoster, not just Azure.
First of all, I would highly recommend that you never hard-code a path to a folder in your web projects. What you can do instead is:
Use the Server.MapPath("~/certs") method to obtain the physical path of a certs folder within your website's root folder.
Make sure that no one can access this folder from the outside world:
By adding an additional location section to your web.config, you block any external access to this folder. Please note that the location element has to be a direct descendant of the root configuration element in your web.config file.
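The location section itself is not shown in the original answer; a typical shape for it, assuming the folder is named certs as in the example (a sketch, not the author's exact config), would be:

```xml
<configuration>
  <!-- ... the rest of your web.config ... -->

  <!-- Deny all external requests to the ~/certs folder -->
  <location path="certs">
    <system.web>
      <authorization>
        <deny users="*" />
      </authorization>
    </system.web>
  </location>
</configuration>
```

Depending on the IIS pipeline mode, static-file requests may bypass ASP.NET authorization, so after deploying, verify that a direct browser request to /certs/testcert.cer is actually rejected.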
UPDATE with the non-Azure-specific part on how to write a file to the local file system of an ASP.NET web project:
var path = Server.MapPath("~/certs");
using (var fileStream = System.IO.File.OpenWrite(Path.Combine(path, "testcert.cer")))
{
    // here use the fileStream to write
}
And complete sample code on how to use the Blob Storage client to write the content of a blob to a local file:
var path = Server.MapPath("~/certs");
using (var fileStream = System.IO.File.OpenWrite(Path.Combine(path, "testcert.cer")))
{
    blockBlob.DownloadToStream(fileStream);
}
But @SeanCocteau has a good point and a much simpler approach - just use a MemoryStream instead!
You can now upload your certificates via the Portal, add an app setting to your site, and have the certificate show up in your site's Certificate Store.
See this blog post for more details:
http://azure.microsoft.com/blog/2014/10/27/using-certificates-in-azure-websites-applications/
