Currently we have the front-end sending files to an Azure storage account, into specific blob containers. The SAS tokens are handled manually: a person generates a SAS from the storage account and pastes it into the front-end code so it can read and write to the storage account.
We want the front-end to send a request with a file to APIM instead. We then want to hash that file, use the hash as its name, and store it in Azure Blob Storage. I'm new to Azure API Management; is this even possible? It seems like I can't get at the uploaded file.
In my APIM policies I currently have authorization to the storage account working, but I can't figure out how to get at the uploaded file the way I would with Request.Files in an MVC app.
I've been looking all over https://learn.microsoft.com/ as well as https://techcommunity.microsoft.com/ and SO and I've even started looking on the second page of Google search results. I can't find anything that points to this being possible or not.
Here is my current policy. It works in the sense that the front-end can hit it, pass through a file, and that file is saved. But we want to hash the file and use that hash as the name, to avoid name collisions in the blob container.
<policies>
    <inbound>
        <base />
        <set-variable name="UTCNow" value="@(DateTime.UtcNow.ToString("R"))" />
        <set-variable name="Verb" value="@(context.Request.Method)" />
        <set-variable name="documentstorage" value="{{documentstorage}}" />
        <set-variable name="documentstoragekey" value="{{documentstorageaccesskey}}" />
        <set-variable name="version" value="2019-12-12" />
        <set-variable name="bodySize" value="@(context.Request.Headers["Content-Length"][0])" />
        <set-variable name="contentType" value="@(context.Request.Headers["Content-Type"][0])" />
        <set-header name="x-ms-version" exists-action="override">
            <value>@((string)context.Variables["version"])</value>
        </set-header>
        <set-header name="x-ms-blob-type" exists-action="override">
            <value>BlockBlob</value>
        </set-header>
        <set-header name="date" exists-action="override">
            <value>@((string)context.Variables["UTCNow"])</value>
        </set-header>
        <set-header name="Authorization" exists-action="override">
            <value>@{
                var account = (string)context.Variables["documentstorage"];
                var key = (string)context.Variables["documentstoragekey"];
                var verb = (string)context.Variables["Verb"];
                var container = context.Request.MatchedParameters["container"];
                var fileName = context.Request.MatchedParameters["fileName"];
                var dateNow = (string)context.Variables["UTCNow"];
                var contentType = (string)context.Variables["contentType"]; // e.g. "application/pdf"
                var contentLength = (string)context.Variables["bodySize"];
                // Build the Blob service SharedKey string-to-sign.
                var stringToSign = string.Format("{0}\n\n\n{1}\n\n{2}\n{3}\n\n\n\n\n\nx-ms-blob-type:BlockBlob\nx-ms-version:{4}\n/{5}/{6}/{7}",
                    verb,
                    contentLength,
                    contentType,
                    dateNow,
                    (string)context.Variables["version"],
                    account,
                    container,
                    fileName);
                string signature = "";
                var unicodeKey = Convert.FromBase64String(key);
                using (var hmacSha256 = new HMACSHA256(unicodeKey))
                {
                    var dataToHmac = Encoding.UTF8.GetBytes(stringToSign);
                    signature = Convert.ToBase64String(hmacSha256.ComputeHash(dataToHmac));
                }
                return string.Format("{0} {1}:{2}", "SharedKey", account, signature);
            }</value>
        </set-header>
    </inbound>
    <backend>
        <base />
    </backend>
    <outbound>
        <base />
    </outbound>
    <on-error>
        <base />
    </on-error>
</policies>
I haven't tried this but it sounds like you can get the request body (that's where I assume your file is):
var inBody = context.Request.Body.As<byte[]>(preserveContent: true);
Based on this: https://learn.microsoft.com/en-us/azure/api-management/api-management-policy-expressions#ref-imessagebody and this https://learn.microsoft.com/en-us/azure/api-management/api-management-transformation-policies#SetBody
However, if you just want unique file names, why not simply generate a GUID? Or do you mean you want to make sure that every file only gets uploaded once? (Then hashing probably makes sense.)
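Building on that, here is an untested sketch of what the hashing could look like in policy. It assumes the body fits in memory, that your operation keeps a {container} template parameter, and that the URI rewrite matches your backend setup; none of that is verified:

```xml
<!-- Sketch: hash the uploaded body and use the hex digest as the blob name. -->
<set-variable name="bodyHash" value="@{
    var body = context.Request.Body.As<byte[]>(preserveContent: true);
    using (var sha256 = System.Security.Cryptography.SHA256.Create())
    {
        return BitConverter.ToString(sha256.ComputeHash(body)).Replace("-", "").ToLowerInvariant();
    }
}" />
<!-- Send the request to /{container}/{hash} instead of the client-supplied file name. -->
<rewrite-uri template="@("/" + context.Request.MatchedParameters["container"] + "/" + (string)context.Variables["bodyHash"])" />
```

If you go this route, remember the SharedKey string-to-sign in your Authorization header must then be computed over the hashed name rather than the original fileName parameter.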
Hope you are doing well!
I have been trying to write a GraphQL mutation resolver for a REST POST request in Azure APIM, but nothing has worked so far.
The REST call takes an object with firstName, lastName, username, and password, and returns an object with the same fields.
Below is my current code. Any help would be appreciated.
<set-graphql-resolver parent-type="Mutation" field="createUser">
    <http-data-source>
        <http-request>
            <set-method>POST</set-method>
            <set-url>[URL]</set-url>
            <set-header name="Content-Type" exists-action="override">
                <value>application/json</value>
            </set-header>
            <set-body>@{
                var args = context.Request.Body.As<JObject>(true)["arguments"];
                JObject jsonObject = new JObject();
                jsonObject.Add("firstName", args["firstName"]);
                jsonObject.Add("lastName", args["lastName"]);
                jsonObject.Add("username", args["username"]);
                jsonObject.Add("password", args["password"]);
                return jsonObject.ToString();
            }</set-body>
        </http-request>
    </http-data-source>
</set-graphql-resolver>
UPDATE:
This is the schema I am using:
And this is how I am testing the mutation with the arguments, and the original error I am getting:
In Application Insights, I am getting this error log:
NOTE: This is the original response that I'm getting from a normal REST request:
Thank you!
I contacted Microsoft about this issue, and they told me that it is a bug on the service side.
The product team is working on a fix, which will be released in v.33 (4 to 6 weeks).
Having an Azure API Management gateway with an external IdP (Okta), we set up a simple, working configuration.
The gateway is able to authenticate and authorize the JWT token and call the backend service (a Logic App or an Azure Function).
Besides invoking the backend services, the gateway can pass claims from the JWT token:
<validate-jwt header-name="Authorization" failed-validation-httpcode="401" failed-validation-error-message="No JWT token" require-expiration-time="true" require-scheme="Bearer" require-signed-tokens="true" clock-skew="10" output-token-variable-name="jwttoken" />
<set-header name="jwt-token" exists-action="append">
    <value><![CDATA[@(context.Variables.ContainsKey("jwttoken") ? ((Jwt)context.Variables["jwttoken"]).Subject : "")]]></value>
</set-header>
The backend services (Logic Apps) are authorized using their SAS tokens, so we have to remove the Authorization header. I'm considering sending the JWT token to the backend anyway, but so far I have found no way to do so (to serialize the jwttoken variable; the gateway doesn't allow calling its .ToString() method).
Q: Is there a way to send the original JWT token to the backend (in another header), and if so, how?
TryParseJwt can be used to break up the JWT token and pass it on in another header in clear text:
<policies>
    <inbound>
        <base />
        <set-header name="parsed-token" exists-action="override">
            <value>@{
                string parsedToken = "error";
                string tokenHeader = context.Request.Headers.GetValueOrDefault("jwt-token", "");
                if (tokenHeader?.Length > 0)
                {
                    Jwt jwt;
                    if (tokenHeader.TryParseJwt(out jwt))
                    {
                        foreach (var claim in jwt.Claims)
                        {
                            parsedToken += claim.Key + ":" + string.Join("-", claim.Value) + ";";
                        }
                    }
                }
                return parsedToken;
            }</value>
        </set-header>
    </inbound>
    <backend>
        <base />
    </backend>
    <outbound>
        <base />
    </outbound>
    <on-error>
        <base />
    </on-error>
</policies>
For a sample token from https://jwt.io/#debugger-io this would give you headers like:
"jwt-token": "eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJzdWIiOiIxMjM0NTY3ODkwIiwibmFtZSI6IkpvaG4gRG9lIiwiaWF0IjoxNTE2MjM5MDIyfQ.SflKxwRJSMeKKF2QT4fwpMeJf36POk6yJV_adQssw5c",
"parsed-token": "errorsub:1234567890;name:John Doe;iat:1516239022;",
I still cannot find proper documentation for the Jwt class used in Azure API Management policy expressions; this class does not seem to have a ToString().
Since your backend expects a SAS token, at some point in your policy you have a set-header policy that sets it, correct? Well, then just copy the value from the Authorization header before you overwrite it:
<set-header name="jwt" exists-action="override">
    <value>@(context.Request.Headers.GetValueOrDefault("Authorization"))</value>
</set-header>
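Putting the two headers together, a sketch of the ordering (the {{backend-sas-token}} named value is an assumption standing in for wherever your SAS actually comes from):

```xml
<inbound>
    <base />
    <!-- Copy the incoming bearer token before it gets overwritten. -->
    <set-header name="jwt" exists-action="override">
        <value>@(context.Request.Headers.GetValueOrDefault("Authorization"))</value>
    </set-header>
    <!-- Now replace Authorization with the SAS the backend expects. -->
    <set-header name="Authorization" exists-action="override">
        <value>{{backend-sas-token}}</value>
    </set-header>
</inbound>
```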
I'm trying to log the requests and responses from the API Management gateway to Azure Event Hubs. I'm using the log-to-eventhub policy for that. I want to send a single event to the event hub containing both the request and the response together.
I tried including the log-to-eventhub policy in the inbound section with both the request and response together, but I'm only getting the request and not the response. Similarly, I tried including it in the outbound section but got only the response. As I'm sending the Event Hub logs to Azure Log Analytics, I want to get the complete request and response together. I know that keeping the log-to-eventhub policy in both the inbound and outbound sections will give me two different log events.
<inbound>
    <set-variable name="message-id" value="@(Guid.NewGuid())" />
    <log-to-eventhub logger-id="all-logs" partition-id="0">@{
        var requestLine = string.Format("{0} {1} HTTP/1.1\r\n",
            context.Request.Method,
            context.Request.Url.Path + context.Request.Url.QueryString);
        var body = "Request " + context.Request.Body?.As<string>(true) + "Response " + context.Response.Body?.As<string>(true);
        var headers = context.Request.Headers
            .Where(h => h.Key != "Authorization" && h.Key != "Ocp-Apim-Subscription-Key")
            .Select(h => string.Format("{0}: {1}", h.Key, String.Join(", ", h.Value)))
            .ToArray<string>();
        var headerString = (headers.Any()) ? string.Join("\r\n", headers) + "\r\n" : string.Empty;
        return "staging: " + context.Response.StatusCode + " " + context.Variables["message-id"] + "\n"
            + requestLine + headerString + "\r\n" + body;
    }</log-to-eventhub>
</inbound>
Is it possible to have both in the same event, so that only one event is logged?
I would capture the required values from the request in variables, combine them with the response values in the outbound policy, and log everything there:
<policies>
    <inbound>
        <base />
        <set-variable name="requestHeaders" value="@(JsonConvert.SerializeObject(context.Request.Headers))" />
        <set-variable name="requestBody" value="@(context.Request.Body.As<string>(true))" />
    </inbound>
    <backend>
        <base />
    </backend>
    <outbound>
        <base />
        <log-to-eventhub logger-id="testlogger">@{
            var content = new JObject();
            content["reqHeaders"] = context.Variables.GetValueOrDefault<string>("requestHeaders");
            content["reqBody"] = context.Variables.GetValueOrDefault<string>("requestBody");
            content["resStatus"] = JsonConvert.SerializeObject(context.Response.StatusCode);
            content["resBody"] = context.Response.Body.As<string>(true);
            return content.ToString();
        }</log-to-eventhub>
    </outbound>
    <on-error>
        <base />
    </on-error>
</policies>
I need to programmatically set permissions on Azure Key Vault, and the closest I have got is the Set-AzureRmKeyVaultAccessPolicy PowerShell command.
Is there an equivalent in the .NET SDK, or perhaps in the REST API?
Here you go; you could probably find something similar for the .NET SDK.
Also, if you run Set-AzureRmKeyVaultAccessPolicy -Debug you will find the information needed:
DEBUG: ============================ HTTP REQUEST ============================
HTTP Method:
PUT
Absolute Uri:
https://management.azure.com/subscriptions/xxx/resourceGroups/xxx/providers/Microsoft.KeyVault/vaults/xxx?api-version=2015-06-01
Body {Omitted}
Edit: For future reference, PowerShell uses the REST APIs. If there is a PowerShell command for it, there is definitely a REST endpoint.
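For reference, the omitted body follows the vaults create-or-update contract; here's a rough sketch of its shape (the field names are my recollection of that API version and worth double-checking against the Key Vault management REST reference):

```json
{
  "location": "eastus",
  "properties": {
    "tenantId": "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx",
    "sku": { "family": "A", "name": "standard" },
    "accessPolicies": [
      {
        "tenantId": "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx",
        "objectId": "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx",
        "permissions": { "keys": [ "get", "list" ], "secrets": [ "all" ] }
      }
    ]
  }
}
```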
We can use the Microsoft Azure Key Vault Management SDK to do that. It is a preview version. We can create or update a Key Vault using the keyVaultManagementClient.Vaults.CreateOrUpdateAsync() method.
I made a demo for it. My detailed steps are as follows:
Prerequisites:
Register an app in Azure AD and create a service principal for it. For more detailed steps, please refer to the documentation.
Steps:
1. Create a C# console application
2. Add the demo code to the project
using System;
using System.Collections.Generic;
using Microsoft.Azure.Management.KeyVault;
using Microsoft.Azure.Management.KeyVault.Models;
using Microsoft.IdentityModel.Clients.ActiveDirectory;
using Microsoft.Rest;

var subscriptionId = "Your Subscription Id";
var clientId = "Your Registered Application Id";
var tenantId = "Your Tenant Id";
var secretKey = "Application Secret Key";
var objectId = "Registered Application Object Id";

// Acquire a management-plane token for the service principal.
var clientCredential = new ClientCredential(clientId, secretKey);
var context = new AuthenticationContext("https://login.windows.net/" + tenantId);
var token = context.AcquireTokenAsync("https://management.azure.com/", clientCredential).Result.AccessToken;

const string resourceGroupName = "tom";
// The name of the vault to create.
const string vaultName = "TomNewKeyVaultForTest";

var accessPolicy = new AccessPolicyEntry
{
    ApplicationId = Guid.Parse(clientId),
    TenantId = Guid.Parse(tenantId),
    Permissions = new Permissions
    {
        Keys = new List<string> { "List", "Get" },
        Secrets = new List<string> { "All" }
    },
    ObjectId = Guid.Parse(objectId)
};

VaultProperties vaultProps = new VaultProperties
{
    EnabledForTemplateDeployment = true,
    TenantId = Guid.Parse(tenantId),
    AccessPolicies = new List<AccessPolicyEntry> { accessPolicy }
};

ServiceClientCredentials credentials = new TokenCredentials(token);
VaultCreateOrUpdateParameters vaultParams = new VaultCreateOrUpdateParameters("eastasia", vaultProps);
KeyVaultManagementClient keyVaultManagementClient = new KeyVaultManagementClient(credentials)
{
    SubscriptionId = subscriptionId
};
var result = keyVaultManagementClient.Vaults.CreateOrUpdateAsync(resourceGroupName, vaultName, vaultParams).Result;
3. Debug the demo
4. Check the created or updated Key Vault in the Azure portal
For more SDK information, refer to the packages.config file:
<?xml version="1.0" encoding="utf-8"?>
<packages>
<package id="Hyak.Common" version="1.0.2" targetFramework="net452" />
<package id="Microsoft.Azure.Common" version="2.1.0" targetFramework="net452" />
<package id="Microsoft.Azure.Common.Dependencies" version="1.0.0" targetFramework="net452" />
<package id="Microsoft.Azure.Management.KeyVault" version="2.0.0-preview" targetFramework="net452" />
<package id="Microsoft.Bcl" version="1.1.9" targetFramework="net452" />
<package id="Microsoft.Bcl.Async" version="1.0.168" targetFramework="net452" />
<package id="Microsoft.Bcl.Build" version="1.0.14" targetFramework="net452" />
<package id="Microsoft.IdentityModel.Clients.ActiveDirectory" version="2.28.3" targetFramework="net452" />
<package id="Microsoft.Net.Http" version="2.2.22" targetFramework="net452" />
<package id="Microsoft.Rest.ClientRuntime" version="2.3.1" targetFramework="net452" />
<package id="Microsoft.Rest.ClientRuntime.Azure" version="3.3.1" targetFramework="net452" />
<package id="Newtonsoft.Json" version="6.0.8" targetFramework="net452" />
</packages>
I would like to perform a scheduled task of exporting an Azure SQL database as a BACPAC to Blob Storage. How can I do this? A WebJob? A PowerShell script?
We can also do this with a WebJob. I created a demo with the Microsoft.Azure.Management.Sql (prerelease) .NET SDK, and it works for me.
For more information about how to deploy a WebJob and create a scheduled job, please refer to the following documents:
creating-and-deploying-microsoft-azure-webjobs
create-a-scheduled-webjob-using-a-cron-expression
The following are my detailed steps and sample code:
Prerequisites:
Register an app in Azure AD and create a service principal for it. For more detailed steps about how to register an app and get an access token, please refer to the documentation.
Steps:
1. Create a C# console application
2. Get an access token using the app registered in Azure AD
public static string GetAccessToken(string tenantId, string clientId, string secretKey)
{
var clientCredential = new ClientCredential(clientId, secretKey);
var context = new AuthenticationContext("https://login.windows.net/" + tenantId);
var accessToken = context.AcquireTokenAsync("https://management.azure.com/", clientCredential).Result;
return accessToken.AccessToken;
}
3. Create an Azure SqlManagementClient object
SqlManagementClient sqlManagementClient = new SqlManagementClient(new TokenCloudCredentials(subscriptionId, GetAccessToken(tenantId, clientId, secretKey)));
4. Use sqlManagementClient.ImportExport.Export to export the .bacpac file to Azure Storage
var export = sqlManagementClient.ImportExport.Export(resourceGroup, azureSqlServer, azureSqlDatabase,
    exportRequestParameters);
5. Go to the Bin/Debug path of the application and add all the contents to a .zip file
6. Add the WebJob from the Azure portal
7. Check the WebJob log from the Kudu tool
8. Check the backup file in Azure Storage
For SDK info, refer to the packages.config file:
<?xml version="1.0" encoding="utf-8"?>
<packages>
<package id="Hyak.Common" version="1.0.2" targetFramework="net452" />
<package id="Microsoft.Azure.Common" version="2.1.0" targetFramework="net452" />
<package id="Microsoft.Azure.Common.Dependencies" version="1.0.0" targetFramework="net452" />
<package id="Microsoft.Azure.Management.Sql" version="0.51.0-prerelease" targetFramework="net452" />
<package id="Microsoft.Bcl" version="1.1.9" targetFramework="net452" />
<package id="Microsoft.Bcl.Async" version="1.0.168" targetFramework="net452" />
<package id="Microsoft.Bcl.Build" version="1.0.14" targetFramework="net452" />
<package id="Microsoft.IdentityModel.Clients.ActiveDirectory" version="2.28.3" targetFramework="net452" />
<package id="Microsoft.Net.Http" version="2.2.22" targetFramework="net452" />
<package id="Microsoft.Web.WebJobs.Publish" version="1.0.12" targetFramework="net452" />
<package id="Newtonsoft.Json" version="6.0.4" targetFramework="net452" />
</packages>
Demo code:
static void Main(string[] args)
{
    var subscriptionId = "Your Subscription Id";
    var clientId = "Your Application Id";
    var tenantId = "Tenant Id";
    var secretKey = "Secret Key";
    var azureSqlDatabase = "Azure SQL Database Name";
    var resourceGroup = "Resource Group of Azure SQL";
    var azureSqlServer = "Azure SQL Server";
    var adminLogin = "Azure SQL admin login";
    var adminPassword = "Azure SQL admin password";
    var storageKey = "Azure Storage Account Key";
    var baseStorageUri = "Azure Storage URI"; // with container name, ending with "/"
    var backName = azureSqlDatabase + "-" + $"{DateTime.UtcNow:yyyyMMddHHmm}" + ".bacpac"; // backup file name
    var backupUrl = baseStorageUri + backName;
    ImportExportOperationStatusResponse exportStatus = new ImportExportOperationStatusResponse();
    try
    {
        ExportRequestParameters exportRequestParameters = new ExportRequestParameters
        {
            AdministratorLogin = adminLogin,
            AdministratorLoginPassword = adminPassword,
            StorageKey = storageKey,
            StorageKeyType = "StorageAccessKey",
            StorageUri = new Uri(backupUrl)
        };
        SqlManagementClient sqlManagementClient = new SqlManagementClient(new TokenCloudCredentials(subscriptionId, GetAccessToken(tenantId, clientId, secretKey)));
        var export = sqlManagementClient.ImportExport.Export(resourceGroup, azureSqlServer, azureSqlDatabase,
            exportRequestParameters); // start the export operation
        while (exportStatus.Status != OperationStatus.Succeeded) // poll until the operation succeeds
        {
            Thread.Sleep(1000 * 60);
            exportStatus = sqlManagementClient.ImportExport.GetImportExportOperationStatus(export.OperationStatusLink);
        }
        Console.WriteLine($"Export of database {azureSqlDatabase} to storage succeeded");
    }
    catch (Exception)
    {
        //todo
    }
}
Hi, have you had a look at the following documentation? It includes a PowerShell script and an Azure Automation reference with a sample script:
https://learn.microsoft.com/en-us/azure/sql-database/sql-database-export-powershell
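For what it's worth, a sketch of the same export with the newer Az.Sql cmdlets; the resource names here are placeholders, and the cmdlet parameters are worth verifying against your module version:

```powershell
# Sketch: start a BACPAC export of an Azure SQL database to Blob Storage.
# Assumes the Az.Sql module is installed and Connect-AzAccount has been run.
$export = New-AzSqlDatabaseExport `
    -ResourceGroupName "myResourceGroup" `
    -ServerName "myserver" `
    -DatabaseName "mydb" `
    -AdministratorLogin "sqladmin" `
    -AdministratorLoginPassword (ConvertTo-SecureString "<password>" -AsPlainText -Force) `
    -StorageKeyType "StorageAccessKey" `
    -StorageKey "<storage-account-key>" `
    -StorageUri "https://myaccount.blob.core.windows.net/backups/mydb.bacpac"

# Poll the export until it finishes.
Get-AzSqlDatabaseImportExportStatus -OperationStatusLink $export.OperationStatusLink
```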