How to disable public access for App Configuration in Terraform (Azure)

I can create an azurerm_app_configuration resource for Azure App Configuration.
And I can create an azurerm_private_endpoint using Terraform.
But I did not find out which Terraform argument can be used to disable public access, as shown in the portal image below.
Can anyone help?

Just by creating the private endpoint, public access will be denied. From the docs:
By default, when a private endpoint is added to your App Configuration store, all requests for your App Configuration data over the public network are denied. You can enable public network access by using the following Azure CLI command.
However, if you want to control this explicitly, such a control is not supported in Terraform. There is already a GitHub issue about that:
Support for azurerm_app_configuration public_network_access_enabled
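For context, here is a minimal Terraform sketch of the private-endpoint approach described above; the resource names, resource group and subnet are placeholder assumptions, not taken from the question:

# Placeholder App Configuration store; private endpoints require the standard SKU.
resource "azurerm_app_configuration" "example" {
  name                = "example-appconf"
  resource_group_name = azurerm_resource_group.example.name
  location            = azurerm_resource_group.example.location
  sku                 = "standard"
}

# Private endpoint for the store; once it exists, public requests are denied by default.
resource "azurerm_private_endpoint" "example" {
  name                = "example-appconf-pe"
  resource_group_name = azurerm_resource_group.example.name
  location            = azurerm_resource_group.example.location
  subnet_id           = azurerm_subnet.example.id

  private_service_connection {
    name                           = "example-appconf-connection"
    private_connection_resource_id = azurerm_app_configuration.example.id
    is_manual_connection           = false
    subresource_names              = ["configurationStores"]
  }
}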

Related

Azure Static Web App - Function App API - How to load IOptions?

Everything is working locally while still using storage in Azure. The local settings used to load the IOptions are:
"StorageOptions": {
"ConnectionString": "...xxx..."
}
The static web app is hitting the API and getting a 500 error due to not being able to load the connection string settings from the application settings. Other API calls that do not use Azure storage are working as expected.
I am unable to save the static web app settings in the normal manner of StorageOptions:ConnectionString with the specified value.
Can API settings for Azure static web apps use the IOptions pattern? If yes, how should the application settings be added in Azure to load the IOptions properly?
The static web app is hitting the API and getting a 500 error due to not being able to load the connection string settings from the application settings.
Application settings for the static web app do not allow ":" in the setting name. So, instead of using "StorageOptions:ConnectionString", it would be "StorageOptions__ConnectionString" for the hierarchical data binding.
Noted here in step 4 of "Configure app settings": https://learn.microsoft.com/en-us/azure/app-service/configure-common?tabs=portal
If yes, how should the application settings be added in Azure to load the IOptions properly?
I found in SO 70461295 that users @HariKrishna and @GaryChan explained that the application settings are available only for the backend APIs associated with the Azure Static Web App.
If you use dependency injection to configure the application settings in the Azure Static Web Apps (Azure Functions) context, the Options pattern is available, and the bound values are returned when the functionality requires them.
Your given format of Application Settings:
"StorageOptions": {
"ConnectionString": "...xxx..."
}
Then you have to configure it inside the Startup.Configure method, such as:
builder.Services.AddOptions<StorageOptions>()
    .Configure<IConfiguration>((settings, configuration) =>
    {
        // Bind the "StorageOptions" section onto the options instance.
        configuration.GetSection("StorageOptions").Bind(settings);
    });
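For completeness, here is a small sketch of how those bound options could then be consumed; the StorageOptions class shape and the consuming class are assumptions for illustration, not from the original post:

using Microsoft.Extensions.Options;

// Assumed options class matching the "StorageOptions" section shown above.
public class StorageOptions
{
    public string ConnectionString { get; set; }
}

// Example consumer: the bound options are injected via IOptions<T>.
public class StorageFunction
{
    private readonly StorageOptions _options;

    public StorageFunction(IOptions<StorageOptions> options)
    {
        _options = options.Value;
    }

    // Use _options.ConnectionString wherever the storage client is created.
}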
Updated Answer:
As @BretOoten mentioned, hierarchical data binding in Azure Static Web Apps configuration is possible with a double underscore (__); even in Azure Functions, nested objects/configuration from the local.settings.json file are referenced with the double underscore (__), as mentioned in this MS doc.
For example:
"WebApp1": {
  "Storage1": {
    "ConnString": "<value>"
  }
}
the configuration setting will be:
WebApp1__Storage1__ConnString
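As a small illustrative sketch (reusing the example key names above and assuming configuration is the IConfiguration instance from the earlier snippet), the double-underscore setting surfaces under the usual colon-separated path:

// App setting / environment variable:  WebApp1__Storage1__ConnString = <value>
// Both reads below return the same value once the configuration is built:
string viaPath    = configuration["WebApp1:Storage1:ConnString"];
string viaSection = configuration.GetSection("WebApp1").GetSection("Storage1")["ConnString"];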

Model deployment to managed online endpoints inside VNet in Azure Machine Learning

I am trying to deploy a model to a managed online endpoint in Azure Machine Learning.
(Along the lines of https://learn.microsoft.com/en-us/azure/machine-learning/how-to-deploy-managed-online-endpoints).
This works fine with a publicly accessible AML workspace, but not with our workspace inside our private VNet.
With an AML workspace/storage account in a private VNet, the deployment fails.
I found that this is directly related to the networking setting of the storage account; this is the setting that fails. Note that allowing Azure services does not mitigate the problem.
Is this a blind spot of managed endpoints that is simply not yet supported, or is this a bug?
The problem can be reproduced with the sample code at
https://github.com/Azure/azureml-examples/tree/main/cli/endpoints/online/managed/sample
i.e.
az ml online-deployment create --name blue -f endpoints/online/managed/sample/blue-deployment.yml
The troubleshooting guide linked in the error message refers to the importance of accessibility of the storage account (and Azure Container Registry), but does not consider the use case where AML is inside a private VNet:
https://learn.microsoft.com/en-us/azure/machine-learning/how-to-troubleshoot-online-endpoints?tabs=cli#authorization-error
If the workspace and storage are private, you need to set the egress_public_network_access flag to disabled. This flag is required to establish private endpoint connections from the managed online deployment to your private resources. Do not forget to approve the private endpoint connections.
Doc for Managed Online Endpoint network isolation
https://learn.microsoft.com/en-us/azure/machine-learning/how-to-secure-online-endpoint?tabs=model
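For reference, a rough sketch of where that flag lives in the deployment YAML; everything except egress_public_network_access is a placeholder loosely modeled on the sample blue-deployment.yml, not the exact file:

$schema: https://azuremlschemas.azureedge.net/latest/managedOnlineDeployment.schema.json
name: blue
endpoint_name: my-endpoint
model:
  path: ./model
code_configuration:
  code: ./onlinescoring
  scoring_script: score.py
environment: azureml:my-env:1
instance_type: Standard_DS3_v2
instance_count: 1
# Route deployment egress over private endpoints instead of the public network;
# the private endpoint connections created this way still need to be approved.
egress_public_network_access: disabled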

Azure Private Endpoint - The resource type 'Microsoft.Storage/storageAccounts' is not a supported resource type

When trying to configure a private endpoint on a storage account, I get an empty list of resource types.
And if I try to do it with the resource ID instead, I get an error.
Any idea?
I tried to reproduce the same in my environment and it works fine.
This issue may occur if your regions are mismatched; in that case your resource type will not be shown. To resolve the issue, check whether you are able to create the private endpoint while creating the storage account.
That way, my private endpoint was created successfully.
If you have already created a storage account and want to configure a private endpoint, use the following workaround.
In your storage account, click Networking under Security + networking, then Private endpoint connections, then Add private endpoint.
While adding a new private endpoint from within the storage account, the resource type and resource are automatically recognized and displayed. This helps prevent the unsupported resource type issue.

How do I allow ContinuousIntegration.exe to use a connection string in Azure Key Vault

I've got a Kentico Xperience (v13) instance in Azure and I want to run ContinuousIntegration.exe to populate my database up there with content from my CI xml files. The catch is that we're injecting the CMSConnectionString setting into the web app from Azure Key Vault (AKV) and the CI.exe isn't seeing it. Instead I get this error message:
CMS.DataEngine.ApplicationInitException: Cannot access the database specified by the 'CMSConnectionString' connection string. Please install the database externally and set a correct connection string.
Or maybe this error message:
Failed to execute the command.
Here's the relevant section from our web.config (that works for the website!):
<connectionStrings>
<!--Should be provided by Azure Key Vault-->
</connectionStrings>
How do I ensure that the executable gets access to the secrets in AKV?
It is possible to let ContinuousIntegration.exe know about a secure connection string with a small custom module that sets the connection string at startup. Here is the basic code of the module:
using System;
using CMS;
using CMS.Base;
using CMS.DataEngine;

[assembly: AssemblyDiscoverable]
[assembly: RegisterModule(typeof(AzureConnectionStringModule))]

public class AzureConnectionStringModule : Module
{
    public AzureConnectionStringModule()
        : base(nameof(AzureConnectionStringModule))
    {
    }

    protected override void OnPreInit()
    {
        base.OnPreInit();

        // App Service exposes SQL connection strings as environment variables
        // prefixed with SQLAZURECONNSTR_; fall back to a plain variable.
        var azureConnectionString = Environment.GetEnvironmentVariable("SQLAZURECONNSTR_CMSConnectionString");
        if (string.IsNullOrWhiteSpace(azureConnectionString))
        {
            azureConnectionString = Environment.GetEnvironmentVariable("CMSConnectionString");
        }

        if (!string.IsNullOrWhiteSpace(azureConnectionString))
        {
            // Register the connection string so ContinuousIntegration.exe can use it.
            SettingsHelper.ConnectionStrings.SetConnectionString("CMSConnectionString", azureConnectionString);
        }
    }
}
From a fresh installation of Kentico Xperience 13, here are steps to configure this:
Follow the steps here to add Key Vault support to the admin app locally: https://learn.microsoft.com/en-us/azure/key-vault/general/vs-key-vault-add-connected-service.
Add the module above to the solution in a class library. Make sure the main project references the class library so that it is included during building.
Ensure that the ~\web.config does not have a connection string, or an app setting, named CMSConnectionString.
Deploy the app to an App Service.
In Azure, create an App Service configuration setting with name CMSConnectionString and value @Microsoft.KeyVault(VaultName=your-keyvault;SecretName=CMSConnectionString).
In the Key Vault, create a secret with name CMSConnectionString and a connection string to an Azure SQL database as its value. You may also need to follow https://learn.microsoft.com/en-us/azure/app-service/app-service-key-vault-references to create an access policy for your App Service.
At this point, the Kentico Xperience 13 admin should load with access to the database.
In the App Service portal, under Development Tools select Console.
In the console, run cd bin and then ContinuousIntegration.exe -r. This should produce a message about the repository not being configured, or output on the restore action.

How can I connect my Azure Function to my Azure SQL database

I developed a cron-triggered Azure Function that needs to search for some data in my database.
Locally I can connect with SQL Server, so I changed the connection string in local.settings.json to connect to Azure SQL and published the function, but the function can't connect to the database.
Do I need to do something more than configure local.settings.json?
The local.settings.json file is only used for local testing. It's not even exported to Azure.
You need to create a connection string in your application settings.
In Azure Functions - click Platform features and then Configuration.
Set the connection string
A function app hosts the execution of your functions in Azure. As a best security practice, store connection strings and other secrets in your function app settings. Using application settings prevents accidental disclosure of the connection string with your code. You can access app settings for your function app right from Visual Studio.
You must have previously published your app to Azure. If you haven't already done so, Publish your function app to Azure.
In Solution Explorer, right-click the function app project and choose Publish > Manage application settings.... Select Add setting, in New app setting name, type sqldb_connection, and select OK.
Application settings for the function app.
In the new sqldb_connection setting, paste the connection string you copied in the previous section into the Local field and replace {your_username} and {your_password} placeholders with real values. Select Insert value from local to copy the updated value into the Remote field, and then select OK.
Add SQL connection string setting.
The connection strings are stored encrypted in Azure (Remote). To prevent leaking secrets, the local.settings.json project file (Local) should be excluded from source control, such as by using a .gitignore file.
https://learn.microsoft.com/en-us/azure/azure-functions/functions-scenario-database-table-cleanup
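As a minimal sketch of reading that setting from function code (the setting name sqldb_connection follows the linked tutorial; the helper class, table and query here are placeholders):

using System;
using System.Data.SqlClient;
using System.Threading.Tasks;

public static class DatabaseHelper
{
    public static async Task<int> CountRowsAsync()
    {
        // In Azure this comes from the function app's application settings;
        // locally it comes from local.settings.json.
        var connectionString = Environment.GetEnvironmentVariable("sqldb_connection");

        using (var connection = new SqlConnection(connectionString))
        {
            await connection.OpenAsync();
            using (var command = new SqlCommand("SELECT COUNT(*) FROM dbo.MyTable", connection))
            {
                return (int)await command.ExecuteScalarAsync();
            }
        }
    }
}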
If you are using Entity Framework Core to make the connection, another way of connecting to SQL is by using dependency injection from the .NET Core library.
You can keep the connection string in Azure Key Vault or in the config file, and read it from there in the Azure Functions startup class, which needs the below code setup in your function app.
using System.Diagnostics.Contracts;
using System.IO;
using Microsoft.Azure.Functions.Extensions.DependencyInjection;
using Microsoft.EntityFrameworkCore;
using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.DependencyInjection;

[assembly: FunctionsStartup(typeof(TEST.Startup))]
namespace TEST
{
    internal class Startup : FunctionsStartup
    {
        public override void Configure(IFunctionsHostBuilder builder)
        {
            Contract.Requires(builder != null);
            builder.Services.AddHttpClient();

            // Build configuration from local.settings.json (local runs) and Key Vault.
            var configBuilder = new ConfigurationBuilder()
                .SetBasePath(Directory.GetCurrentDirectory())
                .AddJsonFile("local.settings.json", optional: true, reloadOnChange: true)
                .AddAzureKeyVault($"https://XYZkv.vault.azure.net/");
            var configuration = configBuilder.Build();

            // Register the DbContext with the connection string read from configuration.
            var conn = configuration["connectionString"];
            builder.Services.AddDbContext<yourDBContext>(
                options => options.UseSqlServer(conn));
        }
    }
}
After that, wherever you inject this DbContext, you can perform all CRUD operations with the context object by following Microsoft's Entity Framework Core documentation, for example:
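(A rough sketch only; yourDBContext is the context registered above, and MyEntities is a placeholder DbSet, not from the original answer.)

using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.EntityFrameworkCore;
using Microsoft.Extensions.Logging;

public class CleanupFunction
{
    private readonly yourDBContext _context;

    // The DbContext registered in Startup.Configure is injected here.
    public CleanupFunction(yourDBContext context)
    {
        _context = context;
    }

    [FunctionName("CleanupFunction")]
    public async Task Run([TimerTrigger("0 0 * * * *")] TimerInfo timer, ILogger log)
    {
        // Placeholder query; replace MyEntities with a DbSet defined on yourDBContext.
        var count = await _context.MyEntities.CountAsync();
        log.LogInformation($"Found {count} records.");
    }
}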
Having just dealt with this beast (using a custom handler with Linux), I believe the simple way is to upgrade your app to a Premium plan, allowing you to access the "Networking" page from "App Service plans". This should allow you to put both the SQL server and the app in the same virtual network, which probably makes it easier (but what do I know?).
Instead, if you don't have the extra cash lying around, you can try what I did: set up a private endpoint and use the proxy connection setting for your database:
Create a virtual network
I used Address space: 10.1.0.0/16 (default I think)
Add subnet 10.1.0.0/24 with any name (adding a subnet is required)
Go to "Private link center" and create a private endpoint.
any name, resource-group you fancy
use resource type "Microsoft.Sql/servers" and you should be able to select your SQL server (which I assume you have created already); also set target sub-resource to "sqlServer" (the only option)
In the next step your virtual network and subnet should be auto-selected
set Private DNS integration to yes (or suffer later).
Update your firewall by going to SQL databases, selecting your database and clicking "Set server firewall" on the Overview tab.
Set Connection Policy to proxy. (You either do this, or upgrade to premium!)
Add existing virtual network (rule with any name)
Whitelist IPs
There probably is some other way, but the Azure CLI makes it easy to get all possible IPs your app might use: az functionapp show --resource-group <group_name> --name <app_name> --query possibleOutboundIpAddresses
https://learn.microsoft.com/en-us/azure/app-service/overview-inbound-outbound-ips
whitelist them all! (copy paste exercise)
Find your FQDN from Private link center > Private Endpoints > DNS Configuration. It's probably something like yourdb.privatelink.database.windows.net
Update your app to use this URL. You just update your SQL Server connection string and replace the domain, for example as an ADO.NET string: Server=tcp:yourdb.privatelink.database.windows.net,1433;Initial Catalog=somedbname;Persist Security Info=False;User ID=someuser;Password=abc123;MultipleActiveResultSets=False;Encrypt=True;TrustServerCertificate=True;Connection Timeout=30;
Also note that at some point during all of this I switched to TrustServerCertificate=True, and now I can't be bothered to figure out whether it makes a difference or not, so I'll leave that as an exercise for the reader.
So what we have done here...?
We have forced your function app to go outside the "Azure sphere" by connecting to the private endpoint. I think that if you bounce between Azure services directly, then you'll need some sort of authentication (like logging in to your DB using AD), and in my case, using a custom handler and a Linux base for my app, I think that means you need some trust negotiation (Kerberos perhaps?). I couldn't figure that out, so I came up with this instead.
