Azure Function RSA decrypt code not working

I am trying to decrypt a string in an Azure Function using a private key. The code works fine on my local PC, but when I execute it in the deployed Azure Function it returns a 500 error.
I am new to Azure, so I am seeking some advice. After disabling a few methods inside the code, I found the one that is actually breaking it. Below is the code in question: when it is commented out, the function returns 200, but without the actual output, because this is the code that decrypts the key.
using System.Security.Cryptography; // needed for RSA and RSAEncryptionPadding

using (var rsa = RSA.Create())
{
    // Import the PKCS#8 private key (base64-encoded, no PEM headers)
    rsa.ImportPkcs8PrivateKey(Convert.FromBase64String(privkey), out _);

    // Decode the ciphertext and decrypt it with PKCS#1 v1.5 padding
    byte[] data = Convert.FromBase64String(input);
    byte[] decrypted = rsa.Decrypt(data, RSAEncryptionPadding.Pkcs1);

    output = Convert.ToBase64String(decrypted);
}
return output;

Please check the steps below to see if they help.
I am currently passing the private key hardcoded in a string in the Azure Function. Locally it works, but not via the Azure Function. Are there any security settings in Azure Functions required to consume a private key?
AFAIK, you can secure the RSA private keys using Azure Key Vault instead of hardcoding them in the Azure Functions project.
Azure Key Vault is the place where you can store, import, and maintain the keys and secrets essential for your cloud applications, without exposing them directly in your application.
Here is an article where you can find a practical walkthrough of retrieving Azure Key Vault secrets from Azure Functions using a Managed Service Identity.
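As a rough sketch (not the exact code from that article), reading the private key as a secret from Key Vault with the Function's managed identity could look like the following, using the Azure.Security.KeyVault.Secrets SDK; the vault URL and secret name here are placeholders:
using System;
using Azure.Identity;
using Azure.Security.KeyVault.Secrets;

// Placeholder vault URL and secret name; the Function's managed identity needs "get" permission on secrets
var secretClient = new SecretClient(new Uri("https://myvault.vault.azure.net/"), new DefaultAzureCredential());
KeyVaultSecret secret = secretClient.GetSecret("rsa-private-key");

// Use the secret value instead of a hardcoded privkey string
string privkey = secret.Value;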
Here are a few references covering issues with storing RSA private keys/certificates in Azure Key Vault, along with their resolution discussions:
Is it possible to put RSA Private Keys in Azure Key Vault Certificates?
https://github.com/MicrosoftDocs/azure-docs/issues/50164

Related

How to create SFTP connection in Azure Data Factory using Public SSH key

I am trying to create an SFTP linked service (using keys) in Azure Data Factory.
The source (SFTP) team has shared a public key.
But in ADF, it is asking for private key content and a passphrase.
Please help me understand whether this is something the source team has to share (the passphrase and private key content), or whether I need to generate these keys from the public key shared by the source.
Regards,
Srinivas.
Convert your public key file into a base64 string (on macOS, run base64 -i yourkey.pub in the terminal); then you can use that value for privateKeyContent.
In the ADF connector, change authenticationType to SshPublicKey.
passPhrase is required only if your key is protected with a password.
Also, I would suggest storing this sensitive data in Key Vault.

How can I renew a Host Key in an Azure Function App without losing the link to the Key Vault?

I have a Function App hosted in Azure. I access the functions via a key in the Host Keys that I created, MyKey. This is linked to a secret in the Key Vault via the following format:
@Microsoft.KeyVault(SecretUri=secret_uri_with_version)
Now, if the key inside the Function App is renewed, I lose the edited value above and it is replaced with a random key value.
How can I make it so that if someone renews the key in the Function App, the link to the Key Vault is not lost?

How to get the certificate password from a self-signed certificate

I want to acquire a token from an Azure app registration with a certificate.
I followed the instructions here and generated a self-signed certificate with PowerShell. I also imported the public key into the portal.
But if I want to access the app via .NET, I need to provide the following MSAL configuration:
The CertificateFileContents is just the public key I exported from certmgr. But what should I put as the CertificatePass? Is this a hash? Or a private key? I could not find anything in the docs, and the link above does not give me any advice...
Also, I do not really understand why the private key is not imported into the portal.
In my experience, CertificatePass should be required when you export a private key.
This document used to contain the following:
Export the private key, specify a password for the cert file, and
export to a file.
But now it only tells you to export a public key. You can see details from this issue.
So based on the SharePoint document, if you are reading a PFX file from your local machine, I think you should use a private key with a password.
Okay, the CertificatePass was the password for the certificate itself.
The Azure Portal itself only holds the public key.
The client application needs to provide the whole certificate with private and public key.
If you export a private/public key from the certificate manager in Windows 10, you will not be able to export it directly as Base64, but you can create a .pfx file.
Those files can later be encoded to Base64 with a tool of your choice. For example this.
The password for your certificate has to be the CertificatePass, and the CertificateFileContents is the Base64-encoded public and private key (the .pfx), which is then unlocked with that password.
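For reference, a minimal sketch of how those two values might be consumed with MSAL (assuming a client-credentials flow; clientId, tenantId, certificateFileContents and certificatePass are placeholder variables holding your own values, and the scope is just an example):
using System;
using System.Security.Cryptography.X509Certificates;
using Microsoft.Identity.Client;

// Decode the Base64 .pfx (public + private key) and unlock it with CertificatePass
var pfxBytes = Convert.FromBase64String(certificateFileContents);
var certificate = new X509Certificate2(pfxBytes, certificatePass);

var app = ConfidentialClientApplicationBuilder
    .Create(clientId)
    .WithCertificate(certificate)
    .WithAuthority($"https://login.microsoftonline.com/{tenantId}")
    .Build();

// Client-credentials flow: acquire a token for the app registration itself
var result = await app.AcquireTokenForClient(new[] { "https://graph.microsoft.com/.default" })
    .ExecuteAsync();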
This is of course only an approach for testing purposes. In a production environment you would rather use Key Vault or something similar, so that you do not have any secrets in your appsettings.json.

No XML encryptor configured - When using Key Vault

I have a netcoreapp2.2 containerized application that uses Azure Key Vault to store keys, and it also uses:
app.UseAuthentication();
And
services.AddAuthentication(JwtBearerDefaults.AuthenticationScheme)
I am building/running a Docker image in a hosted Linux environment under App Services. I am using Azure Container Registry and an Azure DevOps pipeline to maintain my app. Azure controls the deployment process and the "docker run" command.
My app works great, however in the container logs I see:
2019-12-13T17:18:12.207394900Z warn: Microsoft.AspNetCore.DataProtection.KeyManagement.XmlKeyManager[35]
2019-12-13T17:18:12.207436700Z No XML encryptor configured. Key {...} may be persisted to storage in unencrypted form.
...
2019-12-13T17:18:14.540484659Z Application started. Press Ctrl+C to shut down.
I realize there are many other posts on this that allude to using other storage mechanisms; however, I am using Key Vault to store my sensitive data. JWT is all handled by Key Vault. I have a few application settings that control static variables for DEV/QA/PROD, but they are not sensitive data at all.
I am also not sure which key is being stored, as all of my sensitive keys are completely outside of the application and are retrieved by:
var azureServiceTokenProvider = new AzureServiceTokenProvider();
var keyVaultClient = new KeyVaultClient(
    new KeyVaultClient.AuthenticationCallback(
        azureServiceTokenProvider.KeyVaultTokenCallback));
config.AddAzureKeyVault(
    $"https://{builtConfig["MY_KEY_VAULT_ID"]}.vault.azure.net/",
    keyVaultClient,
    new DefaultKeyVaultSecretManager());
I am having a difficult time understanding why this warning is being thrown and also if I should take additional steps to mitigate the issue. I have not personally seen side effects, and app restarts do not seem to have any effect as I am using bearer tokens and other concerns such as token expiration, password resets and the like are not applicable.
So I am left asking: are there any additional steps I can take to avoid this warning? Do I need a better at-rest protection mechanism for any configuration settings that may live in my Linux environment? Can I safely ignore this warning?
It took me a while to find an approach that suited my application's needs, but I want to lend some clarity to a number of other Stack Overflow answers that did not make sense to me at first, and explain how I finally understood the problem.
TL;DR: since I was already using Key Vault, I was confused about how .NET Core works. I didn't realize that config.AddAzureKeyVault() has nothing to do with how .NET Core decides to store Data Protection keys at rest on your App Service.
When you see this warning:
No XML encryptor configured. Key {GUID} may be persisted to storage in unencrypted form.
it really doesn't matter which GUID is shown: that key material is not being stored encrypted at rest.
For my risk analysis, any information that is not encrypted at rest is a bad idea, as it could mean that at any time in the future some sort of sensitive data could leak and be exposed to an attacker. In the end, I chose to classify my data at rest as sensitive and err on the side of caution with a potential attack surface.
I have been struggling to try and explain this in a clear and concise way and it is difficult to sum up in a few words. This is what I learned.
Access control (IAM) is your friend in this situation, as you can declare a system-assigned identity for your application and use role-based access control. In my case I used my application's identity to control access to both Key Vault and Azure Storage with RBAC. This makes it much easier to get access without SAS tokens or access keys.
Azure Storage will be the final destination for the file being created, but it is Key Vault that controls the encryption key. I created an RSA key in Key Vault, and that key is what encrypts the XML file mentioned in the original warning.
One of the mistakes I was making in my head was that I wanted to write the encrypted XML to Key Vault. However, that is not really the use case Microsoft describes. There are two mechanisms: PersistKeysTo and ProtectKeysWith. As soon as I got that through my thick head, it all made sense.
I used the following to remove the warning and create encrypted data at rest:
services.AddDataProtection()
    // Create a CloudBlockBlob with AzureServiceTokenProvider
    .PersistKeysToAzureBlobStorage(...)
    // Create a KeyVaultClient with AzureServiceTokenProvider
    // and point to the RSA key by id
    .ProtectKeysWithAzureKeyVault(...);
I had already used RBAC for my application with Key Vault (with wrap/unwrap permissions), but I also added the Storage Blob Data Contributor role to the storage account.
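For reference, granting that role to the app's system-assigned identity can be done with something along the lines of the following Azure CLI command (the assignee and scope values are placeholders):
az role assignment create --assignee <app-principal-object-id> --role "Storage Blob Data Contributor" --scope <storage-account-resource-id>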
How you create your blob is up to you, but one gotcha is that the access token has to be obtained synchronously:
// GetStorageAccessToken()
var tokenProvider = new AzureServiceTokenProvider();
return tokenProvider.GetAccessTokenAsync("https://storage.azure.com/")
    .GetAwaiter()
    .GetResult();
Then I called it from a method:
var uri = new Uri($"https://{storageAccount}.blob.core.windows.net/{containerName}/{blobName}");

// Credentials built from the managed identity access token
var tokenCredential = new TokenCredential(GetStorageAccessToken());
var storageCredentials = new StorageCredentials(tokenCredential);

return new CloudBlockBlob(uri, storageCredentials);
After this hurdle was overcome, putting the encryption in was straightforward. The Key Vault ID is the location of the encryption key you are using:
https://mykeyvaultname.vault.azure.net/keys/my-key-name/{VersionGuid}
And creating the client looks like this:
var tokenProvider = new AzureServiceTokenProvider();
var client = new KeyVaultClient(new KeyVaultClient.AuthenticationCallback(tokenProvider.KeyVaultTokenCallback));

services.AddDataProtection()
    .ProtectKeysWithAzureKeyVault(client, keyVaultId);
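Putting the pieces together, a minimal sketch of the whole wiring (using the same Microsoft.AspNetCore.DataProtection.AzureStorage / AzureKeyVault packages from this netcoreapp2.2 setup; storageAccount, containerName, blobName and keyVaultId are placeholders) could look like this:
var tokenProvider = new AzureServiceTokenProvider();

// Key Vault client that will wrap/unwrap the Data Protection key material
var keyVaultClient = new KeyVaultClient(
    new KeyVaultClient.AuthenticationCallback(tokenProvider.KeyVaultTokenCallback));

// Blob that will hold the Data Protection key ring XML
var blobUri = new Uri($"https://{storageAccount}.blob.core.windows.net/{containerName}/{blobName}");
var storageToken = tokenProvider.GetAccessTokenAsync("https://storage.azure.com/").GetAwaiter().GetResult();
var keyRingBlob = new CloudBlockBlob(blobUri, new StorageCredentials(new TokenCredential(storageToken)));

services.AddDataProtection()
    .PersistKeysToAzureBlobStorage(keyRingBlob)
    .ProtectKeysWithAzureKeyVault(keyVaultClient, keyVaultId);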
I also have to give credit to this blog: https://joonasw.net/view/using-azure-key-vault-and-azure-storage-for-asp-net-core-data-protection-keys as this pointed me in the right direction.
https://learn.microsoft.com/en-us/aspnet/core/security/data-protection/configuration/default-settings?view=aspnetcore-2.2 this also pointed out why keys are not encrypted
https://learn.microsoft.com/en-us/azure/role-based-access-control/built-in-roles - RBAC for apps
https://learn.microsoft.com/en-us/aspnet/core/security/data-protection/configuration/overview?view=aspnetcore-3.1 this was confusing at first but has a good warning about how to grant access and limit access in production.
It might be that you have to configure your Data Protection policy to use cryptographic algorithms, as follows:
.UseCryptographicAlgorithms(new AuthenticatedEncryptorConfiguration()
{
    EncryptionAlgorithm = EncryptionAlgorithm.AES_256_CBC,
    ValidationAlgorithm = ValidationAlgorithm.HMACSHA256
});
Also, the following are a few warnings you may run into around the Data Protection policy.
ASP.NET Core Data Protection stores keys in the HOME directory (/root/.aspnet/DataProtection-Keys), so when the container restarts the keys are lost, and this may break the service.
This can be resolved by either:
persisting the keys to a persistent location (volume) and mounting that volume into the Docker container, or
persisting the keys to an external key store such as Azure or Redis.
More details about ASP.NET DataProtection:
https://learn.microsoft.com/en-us/aspnet/core/security/data-protection/configuration/overview?view=aspnetcore-3.1
https://learn.microsoft.com/en-us/aspnet/core/security/data-protection/introduction?view=aspnetcore-3.1
To mount an external volume (C:/temp-keys) to the Docker container volume (/root/.aspnet/DataProtection-Keys), use the following command:
docker run -d -v /c/temp-keys:/root/.aspnet/DataProtection-Keys container-name
Also, you need to update ConfigureServices in your Startup.cs to configure the Data Protection policy:
services.AddDataProtection()
    .PersistKeysToFileSystem(new DirectoryInfo(@"C:\temp-keys\"))
    .UseCryptographicAlgorithms(new AuthenticatedEncryptorConfiguration()
    {
        EncryptionAlgorithm = EncryptionAlgorithm.AES_256_CBC,
        ValidationAlgorithm = ValidationAlgorithm.HMACSHA256
    });

Create RSA key pair and retrieve public key in Azure key vault

We have a requirement to create an RSA key pair using Azure Key Vault and copy the RSA public key to an external system. The external system will encrypt the data using the public key, and the internal system will talk to Azure Key Vault to decrypt the data. I don't have access to Azure Key Vault yet, so I am going through the documentation. I have two basic questions:
Is there a way to export the RSA public key in a text format using the Azure portal, without using the API (https://learn.microsoft.com/en-us/rest/api/keyvault/getkey/getkey)?
If I don't select 'set activation' or 'set expiration' date while creating the keys, will the key expire? Do they have a default expiration value?
Thanks in advance.
Is there a way to export the RSA public key in a text format using Azure portal without using API
The only way to export the key in the portal is Download Backup; you will get a file like xxxxvault1-testkey-20181227.keybackup, but the key will be encrypted and cannot be used outside the Azure Key Vault system.
If you want to export the key in a form that is not encrypted, you could use the Azure CLI:
az keyvault key show --vault-name 'keyvaultname' --name 'testkey' --version 'e8dfb0f7b7a045b5a1e80442af833270' > C:\Users\joyw\Desktop\output.txt
It will export the public part of the key (as JSON) to the file output.txt.
If I don't select 'set activation' or 'set expiration' date while creating the keys, will the key expire? Do they have a default expiration value?
AFAIK, if you don't set an expiration date, the key will never expire.
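For the decryption side of the requirement, a rough sketch with the newer Azure SDK (Azure.Security.KeyVault.Keys) is below; the vault URL, key name, ciphertext variable, and the RSA-OAEP algorithm are assumptions/placeholders, and the private key never leaves Key Vault:
using System;
using Azure.Identity;
using Azure.Security.KeyVault.Keys;
using Azure.Security.KeyVault.Keys.Cryptography;

// Placeholder vault URL and key name
var keyClient = new KeyClient(new Uri("https://mykeyvault.vault.azure.net/"), new DefaultAzureCredential());
KeyVaultKey key = keyClient.GetKey("my-rsa-key");

// The public key can be exported and handed to the external system
using var rsaPublic = key.Key.ToRSA();

// Decryption happens inside Key Vault; only the ciphertext (byte[]) crosses the wire
var cryptoClient = new CryptographyClient(key.Id, new DefaultAzureCredential());
DecryptResult result = cryptoClient.Decrypt(EncryptionAlgorithm.RsaOaep, ciphertext);
byte[] plaintext = result.Plaintext;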
