In Azure Front Door, how can we bulk import/export rule sets?
I am looking for a way to create a CSV/Excel file with all the redirection URLs that I want to configure on my site, and then import them all at once into Front Door.
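There is no built-in CSV import for rule sets, but the Azure CLI can be scripted around one. As a minimal sketch (assuming a CSV with columns `source_path`, `dest_host` and `dest_path`, and assuming the `az afd rule create` command and flags from the Front Door Standard/Premium CLI; verify the exact flag names against your CLI version), something like this could emit one redirect rule per row:

```python
# Hypothetical sketch: turn a CSV of redirects into `az afd rule create`
# commands for a Front Door Standard/Premium rule set. The CSV column
# names and the CLI flags are assumptions - check them before running.
import csv
import io
import shlex

def redirect_commands(csv_text, group, profile, ruleset):
    """Return one az CLI command string per CSV row."""
    cmds = []
    for i, row in enumerate(csv.DictReader(io.StringIO(csv_text)), start=1):
        cmds.append(
            "az afd rule create "
            f"--resource-group {group} --profile-name {profile} "
            f"--rule-set-name {ruleset} --rule-name redirect{i} --order {i} "
            "--match-variable UrlPath --operator Equal "
            f"--match-values {shlex.quote(row['source_path'])} "
            "--action-name UrlRedirect --redirect-type Moved "
            f"--destination-host {row['dest_host']} "
            f"--destination-path {shlex.quote(row['dest_path'])}"
        )
    return cmds

# Example: one redirect row; print the generated command instead of running it.
sample = "source_path,dest_host,dest_path\n/old,example.com,/new\n"
for cmd in redirect_commands(sample, "my-rg", "my-frontdoor", "redirects"):
    print(cmd)
```

You could then pipe the output into a shell, or call the CLI from the script once the generated commands look right.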
Currently, we host this file on AWS using S3:
https://s3.amazonaws.com/apretaste/mirrors
We would like to move gigabytes of files to Azure, where we have all our servers, but since we are an anti-censorship tool, having the name of the project, "apretaste" (or any other identifying string), as a subdomain makes us an easy target.
On Azure, I can only host the file as:
https://apretaste.blob.core.windows.net/mirrors
As you can see, the subdomain "apretaste" is fully exposed on Azure, while on AWS it is hidden in the URL path, which is encrypted as part of the HTTPS request.
Is there a way to hide the name in Azure? I could not find one. Any help is appreciated.
Is there a way to hide the name encrypted as part of the https request in Azure?
AFAIK, there is no direct way to hide or encrypt the storage account name in Azure.
As a workaround, you can put an Azure CDN (Content Delivery Network) endpoint hostname (https://name.azureedge.net) in front of the origin hostname (https://storageaccountname.blob.core.windows.net).
You can create the Azure CDN endpoint through the portal:
Portal -> Storage account -> Azure CDN -> create endpoints.
I tried uploading a file to Azure Blob Storage through the CDN endpoint with Postman, and it uploaded successfully.
URL:
https://name.azureedge.net/<containername>/<filename> + sas token
You can refer to this document for more detail on Azure CDN.
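The same upload can be done from code with Python's standard library. A sketch of the Put Blob request through the CDN hostname; the hostname, container, blob name and SAS token below are placeholders, not real values:

```python
# Sketch: build a Put Blob request that goes through the CDN endpoint
# rather than the storage account hostname. The SAS token here is a
# placeholder - generate a real one with write permission on the container.
import urllib.request

def put_blob_request(cdn_host, container, blob_name, sas_token, data):
    """Build (but do not send) a PUT request for a block blob upload."""
    url = f"https://{cdn_host}/{container}/{blob_name}?{sas_token}"
    req = urllib.request.Request(url, data=data, method="PUT")
    # Required header: tells the Blob service what kind of blob to create.
    req.add_header("x-ms-blob-type", "BlockBlob")
    req.add_header("Content-Length", str(len(data)))
    return req  # send with urllib.request.urlopen(req) when ready

req = put_blob_request("name.azureedge.net", "mirrors", "index.html",
                       "sv=2022-11-02&sig=...", b"<html></html>")
print(req.full_url)
```

Sending the request is a separate `urlopen` call, so you can inspect the URL and headers first, exactly as you would in Postman.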
Another method is to configure a custom domain and map it to the Azure Blob Storage account. Once the custom domain has been linked to your blob service endpoint, requests show your organization's domain name instead, effectively hiding the storage account name. For more detail, kindly refer to the document below.
Map a custom domain to an Azure Blob Storage endpoint - Azure Storage | Microsoft Learn
Can anyone help me load a CSV file from SharePoint Online to Azure Blob Storage using Azure Data Factory?
I tried with Logic Apps and succeeded; however, the Logic App will not upload a file unless the file has changed or a new file is uploaded.
I need to load all the files even when there are no changes.
ADF v2 now supports loading from SharePoint Online via the OData connector with AAD service principal authentication: https://learn.microsoft.com/en-us/azure/data-factory/connector-odata
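A linked-service definition for that connector might look roughly like this (a sketch based on the OData connector documentation; the property names should be checked against the current docs, and every angle-bracketed value is a placeholder):

```json
{
  "name": "SharePointOnlineLinkedService",
  "properties": {
    "type": "OData",
    "typeProperties": {
      "url": "https://<tenant>.sharepoint.com/sites/<site>/_api/web/lists",
      "authenticationType": "AadServicePrincipal",
      "servicePrincipalId": "<app-registration-client-id>",
      "aadServicePrincipalCredentialType": "ServicePrincipalKey",
      "servicePrincipalKey": {
        "type": "SecureString",
        "value": "<client-secret>"
      },
      "tenant": "<tenant-id>"
    }
  }
}
```

The service principal needs to be granted access to the SharePoint site; the connector docs linked above describe the required permissions.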
You can probably use a Logic App by changing to a Recurrence trigger.
On that interval, you list the files in the library and then take whatever action you want on them.
I have a website where I would like to cache the few images/stylesheets/javascript-files I have. Is it possible to have Azure CDN point directly on the files on my server, and then cache them, instead of having to upload them to an Azure storage?
It's not possible. Azure will not let you configure an arbitrary domain as the origin domain for origin content pull. The only available origins are an existing Azure website, cloud service, or storage account.
Let us discuss your desired end goal.
If you want to improve your caching with CDN-related functionality on the same domain name, take a look at Cloudflare.
However, if you are going to separate your content into a CDN domain and an application domain, you could look at extending the following MSDN sample. The idea behind this sample is that, as a deployment step, you upload all the CDN-related content to the Azure Storage account.
https://code.msdn.microsoft.com/windowsazure/Synchronizing-Files-to-a14ecf57
I have been exploring the features available in Azure and AWS; many features are either unavailable or not clearly documented. For the CDN, one of my comparison criteria is whether I can push/upload content to the CDN servers, as with Akamai.
I have seen the feedback program and found that custom origins are not available (link: http://feedback.azure.com/forums/169397-cdn/status/191761), but for content push I could not find any link. Does anyone have any idea?
No. Azure CDN currently does not support direct interaction (i.e., direct content upload, explicit or on-demand content expiration, etc.). It works as advertised, serving files from an Azure Storage account or an Azure Cloud Service.
We are migrating our PHP website to Azure Cloud Web Service (Web Role).
Currently the website saves user submitted image files to the filesystem via drive letter access. These images are then served via a url e.g. content.example.com.
What options have I got if I want persistent file storage on an Azure Cloud Web Service?
I am currently assessing BLOB Storage for this.
Thanks
Blob storage is the right answer. Although you could also convert your images to Base64 and save them in Azure SQL, that is really not recommended.
Check: Azure, best way to store and deploy static content (e.g. images/css)? or Where to store things like user pictures using Azure? Blob Storage?
One option that reduces rewriting of your application is to mount blob storage as a network drive. Here is some information on how to do it: http://blogs.msdn.com/b/windowsazurestorage/archive/2014/05/12/introducing-microsoft-azure-file-service.aspx
Mounting the drives can be done in a Web Role start-up task and can be scripted.