How do I configure the Azure Caching Service in code? Note that I'm talking specifically about the new Caching Service, currently in preview, which uses v2+ of the caching libraries. I am not talking about the older Azure Shared Caching service, which is very similar.
In my particular case I want to configure caching parameters in the service configuration (.cscfg) file, rather than in web.config. But there are probably other reasons as well.
This seems to do the trick.
//Requires using System.Security; plus the Azure Caching client namespace (typically Microsoft.ApplicationServer.Caching) for the DataCache types
//Utility function to wrap the raw auth token in a SecureString
private static SecureString MakeSecureString(string s)
{
SecureString secureString = new SecureString();
foreach (char c in s)
{
secureString.AppendChar(c);
}
secureString.MakeReadOnly();
return secureString;
}
//Populate these values from wherever you want, perhaps read from config
string host = "foo.cache.windows.net";
string rawAuthToken = "abcdefgblahblahblahfoobar==";
//Create the SecureString that we need for the security config
SecureString authToken = MakeSecureString(rawAuthToken);
DataCacheFactoryConfiguration cacheConfig = new DataCacheFactoryConfiguration();
cacheConfig.AutoDiscoverProperty = new DataCacheAutoDiscoverProperty(true, host);
cacheConfig.SecurityProperties = new DataCacheSecurity(authToken);
var cacheFactory = new DataCacheFactory(cacheConfig);
//Get a cache as normal
DataCache cache = cacheFactory.GetCache("default");
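Since the goal here is to drive these values from the .cscfg rather than web.config, here is a minimal sketch of reading them from the role environment, assuming hypothetical setting names "CacheHost" and "CacheAuthToken" defined in the service configuration:
//Requires using Microsoft.WindowsAzure.ServiceRuntime;
//"CacheHost" and "CacheAuthToken" are hypothetical setting names defined in the .csdef/.cscfg
string host = RoleEnvironment.GetConfigurationSettingValue("CacheHost");
string rawAuthToken = RoleEnvironment.GetConfigurationSettingValue("CacheAuthToken");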
When I inject IConfiguration in a function, it does not find any keys that only live in my "Azure App Configuration".
I have a function app (v3) that accesses App Configuration using DefaultAzureCredential. I am running this locally in debug, hence the need for a default credential. I also have multiple tenants, so I had to set VisualStudioTenantId and SharedTokenCacheTenantId on DefaultAzureCredentialOptions. My Visual Studio user was also given the "App Configuration Data Reader" role to be able to debug.
When connecting to App configuration I get no errors.
Edited to add: I have set up App Configuration to authenticate with Azure AD.
See code below:
public override async void ConfigureAppConfiguration(IFunctionsConfigurationBuilder builder)
{
var credOptions = new DefaultAzureCredentialOptions();
var tenantId = Environment.GetEnvironmentVariable("Tenant_Id");
credOptions.VisualStudioTenantId = tenantId;
credOptions.SharedTokenCacheTenantId = tenantId;
var cred = new DefaultAzureCredential(credOptions);
/*Works but requires SharedTokenCacheTenantId*/
var secretClient = new SecretClient(new Uri(vaultURI), cred);
var secret = await secretClient.GetSecretAsync("<secret name>");
/*Works but where are my keys when I try to access them?*/
builder.ConfigurationBuilder.AddAzureAppConfiguration(options =>
{
options.Connect(new Uri(appConfigURI), cred);
}).Build(); //Should I be building this??
}
In my function
public FunctName(IConfiguration configuration)
{
_configuration = configuration;
}
And when I access the property
var prop = _configuration["PropertyName"];
There is an example function app that uses IFunctionsConfigurationBuilder here https://github.com/Azure/AppConfiguration/blob/main/examples/DotNetCore/AzureFunction/FunctionApp/Startup.cs . I would recommend taking a look and seeing if there are any missing pieces.
The title mentions "using DefaultAzureCredential on local". Does that mean that this works as expected if you use a connection string?
Notice the async void on ConfigureAppConfiguration. It caused the method not to run synchronously, so the host built the configuration before my App Configuration provider had been added and populated.
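For reference, a minimal sketch of the corrected override, assuming the same appConfigURI field and Tenant_Id variable as in the question. The key changes are dropping async void and not calling Build(), so the provider is added before the host builds the configuration; the Key Vault lookup would have to move elsewhere (or be made synchronous), since the method is no longer async.
public override void ConfigureAppConfiguration(IFunctionsConfigurationBuilder builder)
{
    var credOptions = new DefaultAzureCredentialOptions();
    var tenantId = Environment.GetEnvironmentVariable("Tenant_Id");
    credOptions.VisualStudioTenantId = tenantId;
    credOptions.SharedTokenCacheTenantId = tenantId;
    var cred = new DefaultAzureCredential(credOptions);
    //No Build() here: the Functions host builds the configuration after this method returns
    builder.ConfigurationBuilder.AddAzureAppConfiguration(options =>
    {
        options.Connect(new Uri(appConfigURI), cred);
    });
}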
I am trying to figure out how to generate a time-based token for users to access data that is stored in an Azure Storage Account blob container. Users upload various data (PDFs, images), but I don't want links to this data to be public. The recommended strategy was to use a SAS token, which I was able to get working under .NET using the following function, which I found on the MS site about a year ago:
//Function for getting a temporary Azure SAS
public static string GetAzureSASToken(string userhashid, int minutes)
{
//Get Azure SAS Token so we can allow them to temporarily view the photos
UploadedFileInfo uploadedfileinfo = new UploadedFileInfo();
//Azure user container name must be all lowercase!
uploadedfileinfo.usercontainer = "user-" + userhashid;
uploadedfileinfo.azureurl = ConfigurationManager.AppSettings["AzureURL"].ToString() + uploadedfileinfo.usercontainer + "/";
// Retrieve storage account from connection string.
string azureconnection = CloudConfigurationManager.GetSetting("StorageConnectionString");
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(azureconnection);
CloudBlobClient client = storageAccount.CreateCloudBlobClient();
CloudBlobContainer blobContainer = client.GetContainerReference(uploadedfileinfo.usercontainer);
//Set the expiry time and permissions for the container.
//In this case no start time is specified, so the shared access signature becomes valid immediately.
SharedAccessBlobPolicy sasConstraints = new SharedAccessBlobPolicy();
sasConstraints.SharedAccessExpiryTime = DateTimeOffset.UtcNow.AddMinutes(minutes);
sasConstraints.Permissions = SharedAccessBlobPermissions.Read;
//Generate the shared access signature on the container, setting the constraints directly on the signature.
string sasContainerToken = blobContainer.GetSharedAccessSignature(sasConstraints);
return sasContainerToken;
}
The problem is that I now need to access these files from an ASP.NET Core 2.2 app, and I can't seem to figure out how to replicate the code to get the token (the Core libraries are different).
Any suggestions on how to accomplish this in .NET Core 2.2?
Thanks!
Just install the latest NuGet package Microsoft.Azure.Storage.Blob -Version 11.1.0.
Then your .NET Core code (the same as the .NET Framework code in your post) will work just as it does on .NET Framework.
Here is a sample for .NET Core 2.2. I didn't read the settings from a configuration file, so it's a little different:
using Microsoft.Azure.Storage;
using Microsoft.Azure.Storage.Auth;
using Microsoft.Azure.Storage.Blob;
using System;
namespace ConsoleApp5
{
class Program
{
static void Main(string[] args)
{
string sas = GetAzureSASToken();
Console.WriteLine(sas);
Console.ReadLine();
}
public static string GetAzureSASToken()
{
string accountName = "xxx";
string accountKey = "xxx";
CloudStorageAccount storageAccount = new CloudStorageAccount(new StorageCredentials(accountName, accountKey), true);
CloudBlobClient client = storageAccount.CreateCloudBlobClient();
CloudBlobContainer blobContainer = client.GetContainerReference("test1");
//Set the expiry time and permissions for the container.
//In this case no start time is specified, so the shared access signature becomes valid immediately.
SharedAccessBlobPolicy sasConstraints = new SharedAccessBlobPolicy();
sasConstraints.SharedAccessExpiryTime = DateTimeOffset.UtcNow.AddMinutes(5);
sasConstraints.Permissions = SharedAccessBlobPermissions.Read;
//Generate the shared access signature on the container, setting the constraints directly on the signature.
string sasContainerToken = blobContainer.GetSharedAccessSignature(sasConstraints);
return sasContainerToken;
}
}
}
I am using the Change Password functionality that Visual Studio generated for the AccountController. I am able to change the password without errors, but when I go to log in using the new password, I get a login error; if I use the old password, it works.
If I restart the app then the newly changed password takes effect. I am also using Autofac, so maybe I am not configuring the container correctly.
var builder = new ContainerBuilder();
builder.Register(c => new ApplicationDataContext(connectionString)).InstancePerLifetimeScope();
builder.RegisterType<ApplicationUserManager>().AsSelf();
builder.RegisterType<ApplicationRoleManager>().AsSelf();
builder.Register(c => new UserStore<ApplicationUser>(c.Resolve<ApplicationDataContext>())).AsImplementedInterfaces();
builder.Register(c => new RoleStore<IdentityRole>(c.Resolve<ApplicationDataContext>())).AsImplementedInterfaces();
builder.Register(c => HttpContext.Current.GetOwinContext().Authentication).As<IAuthenticationManager>();
builder.Register(c => new IdentityFactoryOptions<ApplicationUserManager>
{
DataProtectionProvider = new DpapiDataProtectionProvider("Application")
});
builder.Register(c => new ApplicationOAuthProvider(publicClientId, c.Resolve<ApplicationUserManager>())).As<IOAuthAuthorizationServerProvider>();
Any help will be much appreciated.
Thanks
--------UPDATED----------
ContainerConfig.cs
public static void Configure(HttpConfiguration config)
{
// Configure the application for OAuth based flow
const string publicClientId = "self";
// ContainerConfig Config
var connectionString = ConfigurationManager.ConnectionStrings["DefaultConnection"].ConnectionString;
var elasticsearchUrl = ConfigurationManager.AppSettings["ElasticSearchUrl"];
var elasticSearchName = ConfigurationManager.AppSettings["ElasticSearchName"];
var builder = new ContainerBuilder();
builder.Register(c => new BimDataContext(connectionString)).InstancePerRequest();
builder.RegisterType<ApplicationUserManager>().AsSelf().InstancePerRequest();
builder.RegisterType<ApplicationRoleManager>().AsSelf().InstancePerRequest();
builder.Register(c => new UserStore<ApplicationUser>(c.Resolve<BimDataContext>())).AsImplementedInterfaces().InstancePerRequest();
builder.Register(c => new RoleStore<IdentityRole>(c.Resolve<BimDataContext>())).AsImplementedInterfaces().InstancePerRequest();
builder.Register(c => HttpContext.Current.GetOwinContext().Authentication).As<IAuthenticationManager>().InstancePerRequest();
builder.Register(c => new IdentityFactoryOptions<ApplicationUserManager>
{
DataProtectionProvider = new DpapiDataProtectionProvider("Application")
}).InstancePerRequest();
builder.RegisterType<SimpleRefreshTokenProvider>().As<IAuthenticationTokenProvider>().InstancePerRequest();
builder.RegisterType<AuthRepository>().As<IAuthRepository>().InstancePerRequest();
builder.Register(c => new ApplicationOAuthProvider(
publicClientId,
c.Resolve<ApplicationUserManager>(),
c.Resolve<IAuthRepository>()))
.As<IOAuthAuthorizationServerProvider>().InstancePerRequest();
// Register your Web API controllers.
builder.RegisterApiControllers(Assembly.GetExecutingAssembly()).InstancePerRequest();
// UoW registration: being explicit
builder.RegisterType<UnitOfWork>().As<IUnitOfWork>().InstancePerRequest();
// Repositories registration
builder.RegisterAssemblyTypes(typeof(ClientRepository).Assembly)
.AsImplementedInterfaces()
.InstancePerRequest();
// Services registration
builder.RegisterAssemblyTypes(typeof(ClientService).Assembly)
.AsImplementedInterfaces()
.InstancePerRequest();
builder.RegisterAssemblyTypes(typeof(ClientSearchService).Assembly)
.AsImplementedInterfaces()
.InstancePerRequest();
builder.RegisterType<IfcFileImportTask>().As<IIfcFileImportTask>().InstancePerRequest();
builder.RegisterType<COBieFileImportTask>().As<ICOBieFileImportTask>().InstancePerRequest();
// Hangfire registration
builder.RegisterType<BackgroundJobClient>().As<IBackgroundJobClient>().InstancePerRequest();
// OPTIONAL: Register the Autofac filter provider.
builder.RegisterWebApiFilterProvider(config);
// Set the dependency resolver to be Autofac.
var container = builder.Build();
JobActivator.Current = new AutofacJobActivator(container);
config.DependencyResolver = new AutofacWebApiDependencyResolver(container);
}
Startup.Auth.cs
public partial class Startup
{
public static OAuthAuthorizationServerOptions OAuthOptions { get; private set; }
public static string PublicClientId { get; private set; }
// For more information on configuring authentication, please visit http://go.microsoft.com/fwlink/?LinkId=301864
public void ConfigureAuth(IAppBuilder app)
{
app.UseCors(Microsoft.Owin.Cors.CorsOptions.AllowAll);
// Configure the db context and user manager to use a single instance per request
//app.CreatePerOwinContext(BimDataContext.Create);
//app.CreatePerOwinContext<ApplicationUserManager>(ApplicationUserManager.Create);
// Enable the application to use a cookie to store information for the signed in user
// and to use a cookie to temporarily store information about a user logging in with a third party login provider
app.UseCookieAuthentication(new CookieAuthenticationOptions());
app.UseExternalSignInCookie(DefaultAuthenticationTypes.ExternalCookie);
// Configure the application for OAuth based flow
PublicClientId = "self";
var oAuthAuthorizationServerProvider = GlobalConfiguration.Configuration.DependencyResolver.GetRequestLifetimeScope().Resolve<IOAuthAuthorizationServerProvider>();
var authenticationTokenProvider = GlobalConfiguration.Configuration.DependencyResolver.GetRequestLifetimeScope().Resolve<IAuthenticationTokenProvider>();
OAuthOptions = new OAuthAuthorizationServerOptions
{
TokenEndpointPath = new PathString("/Token"),
Provider = oAuthAuthorizationServerProvider,
AuthorizeEndpointPath = new PathString("/api/Account/ExternalLogin"),
AccessTokenExpireTimeSpan = TimeSpan.FromMinutes(5),
RefreshTokenProvider = authenticationTokenProvider,
// In production mode set AllowInsecureHttp = false,
AllowInsecureHttp = true
};
// Enable the application to use bearer tokens to authenticate users
app.UseOAuthBearerTokens(OAuthOptions);
}
}
I am getting the error "Value cannot be null. Parameter name: context" from Autofac on this line:
var oAuthAuthorizationServerProvider = GlobalConfiguration.Configuration.DependencyResolver.GetRequestLifetimeScope().Resolve<IOAuthAuthorizationServerProvider>();
I was missing a key component of OAuth2; the solution to this problem is refresh tokens. On change password, invalidate the refresh token and force the user to log out.
http://bitoftech.net/2014/07/16/enable-oauth-refresh-tokens-angularjs-app-using-asp-net-web-api-2-owin/
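To make that concrete, here is a hedged sketch (not the actual code) of a change-password action based on the Visual Studio Web API template (UserManager, Authentication and GetErrorResult come from the generated AccountController). _authRepository is assumed to be the injected IAuthRepository from the container config above, and RemoveRefreshTokensForUserAsync is a hypothetical method on it:
[HttpPost]
[Route("ChangePassword")]
public async Task<IHttpActionResult> ChangePassword(ChangePasswordBindingModel model)
{
    var userId = User.Identity.GetUserId();
    var result = await UserManager.ChangePasswordAsync(userId, model.OldPassword, model.NewPassword);
    if (!result.Succeeded)
    {
        return GetErrorResult(result);
    }
    //Hypothetical call: revoke any refresh tokens issued to this user so the old
    //credentials can no longer be exchanged for new access tokens
    await _authRepository.RemoveRefreshTokensForUserAsync(userId);
    //Force the client to sign in again with the new password
    Authentication.SignOut(DefaultAuthenticationTypes.ApplicationCookie);
    return Ok();
}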
If you are using ASP.NET (this includes MVC, Web API, Web Forms, etc.) and Autofac, you should register all your components using the extension method .InstancePerRequest(). The only exception is components that are thread safe and where you do not have to worry about errors or unexpected results from one request accessing the same (stale) data as another; an example might be a factory or a singleton.
Example of use on a line of code:
builder.Register(c => new UserStore<ApplicationUser>(c.Resolve<ApplicationDataContext>())).AsImplementedInterfaces().InstancePerRequest();
This ensures that every incoming HTTP request gets its own copy of that implementation (resolved and injected, ideally via an interface). Autofac will also clean up disposable instances at the end of each request.
This is the behavior you need. It ensures there is no cross-request interference (such as one request manipulating data on a DbContext shared with another request), and it ensures data is not stale, because instances are cleaned up when each request ends.
See the Autofac documentation for more details (here is an excerpt).
Instance Per Request
Some application types naturally lend themselves to “request” type semantics, for example ASP.NET web forms and MVC applications. In these application types, it’s helpful to have the ability to have a sort of “singleton per request.”
Instance per request builds on top of instance per matching lifetime scope by providing a well-known lifetime scope tag, a registration convenience method, and integration for common application types. Behind the scenes, though, it’s still just instance per matching lifetime scope.
Changing your DI definitions above to include this should resolve your issues (I think, based on what you have provided). If not, then it might be a problem with your Identity registration, in which case you should post that code so it can be scrutinized.
Is it possible to change the app settings for a website from the app itself?
This is not meant to be an everyday operation, but a self-service reconfiguration option. A non-developer can change a specific setting, which should cause a restart, just like I can do manually on the website configuration page (app setting section)
You can also use the Azure Fluent API.
using Microsoft.Azure.Management.Fluent;
using Microsoft.Azure.Management.ResourceManager.Fluent;
using Microsoft.Azure.Management.ResourceManager.Fluent.Authentication;
using Microsoft.Azure.Management.ResourceManager.Fluent.Core;
...
public void UpdateSetting(string key, string value)
{
string tenantId = "a5fd91ad-....-....-....-............";
string clientSecret = "8a9mSPas....................................=";
string clientId = "3030efa6-....-....-....-............";
string subscriptionId = "a4a5aff6-....-....-....-............";
var azureCredentials = new AzureCredentials(new
ServicePrincipalLoginInformation
{
ClientId = clientId,
ClientSecret = clientSecret
}, tenantId, AzureEnvironment.AzureGlobalCloud);
var _azure = Azure
.Configure()
.WithLogLevel(HttpLoggingDelegatingHandler.Level.Basic)
.Authenticate(azureCredentials)
.WithSubscription(subscriptionId);
var appResourceId = "/subscriptions/xxx/resourceGroups/xxx/providers/Microsoft.Web/sites/xxx"; //Get From WebApp -> Properties -> Resource ID
var webapp = _azure.WebApps.GetById(appResourceId);
//Set App Setting Key and Value
webapp.Update()
.WithAppSetting(key, value)
.Apply();
}
It wasn't that hard once I found the right library for it: the Microsoft Azure Web Sites Management Library.
var credentials = GetCredentials(/*using certificate*/);
using (var client = new WebSiteManagementClient(credentials))
{
var currentConfig = await client.WebSites.GetConfigurationAsync(webSpaceName,
webSiteName);
var newConfig = new WebSiteUpdateConfigurationParameters
{
ConnectionStrings = null,
DefaultDocuments = null,
HandlerMappings = null,
Metadata = null,
AppSettings = currentConfig.AppSettings
};
newConfig.AppSettings[mySetting] = newValue;
await client.WebSites.UpdateConfigurationAsync(webSpaceName, webSiteName,
newConfig);
}
Have you read into the Service Management REST API? The documentation mentions that it allows you to perform, programmatically, most of the actions that are available via the Management Portal.
In addition to Diego's answer: to use the Azure Management Libraries within a Web App (Web Sites and/or WebJobs), you need to configure SSL, which is a little bit tricky:
Using Azure Management Libraries from Azure Web Jobs
Is there anywhere in the service runtime that would tell me if I'm currently running on 'Staging' or 'Production'? Manually modifying the config to and from production seems a bit cumbersome.
You should really not change your configuration based on whether you're in Production or Staging. The staging area is not designed to be a "QA" environment, only a holding area before production is deployed.
When you upload a new deployment, the deployment slot you upload your package to is destroyed and is down for 10-15 minutes while the upload happens and the VMs start. If you upload straight into production, that's 15 minutes of production downtime. This is why the staging area was invented: you upload to staging, test the stuff, click the "Swap" button, and your staging environment magically becomes production (a virtual IP swap).
Thus, your staging should really be 100% the same as your production.
What I think you're looking for is a QA/testing environment? You should open up a new service for the testing environment, with its own Production/Staging. In that case, you will want to maintain multiple configuration file sets, one set per deployment environment (Production, Testing, etc.).
There are many ways to manage the configuration hell that occurs, especially with Azure, which has its own *.cscfg files on top of the .config files. The way I prefer to do it in an Azure project is as follows:
Set up a small Config project and create folders there that match the deployment types. Inside each folder, set up the *.config and *.cscfg files for that particular deployment environment: Debug, Test, Release... these are set up in Visual Studio as well, as build target types. I have a small xcopy command that runs on every compile of the Config project and copies all the files from the build target's folder into the root folder of the Config project.
Then every other project in the solution links to the .config or .cscfg file from the root folder of the Config project.
Voila, my configs magically adapt to every build configuration automatically. I also use .config transformations to manage debugging information for Release vs. non-Release build targets.
If you've read all this and still want to get at the Production vs. Staging status at runtime, then:
Get deploymentId from RoleEnvironment.DeploymentId
Then use the Management API with a proper X509 certificate to get at the Azure structure of your service and call the GetDeployments method (it's a REST API, but there is an abstraction library).
Hope this helps
Edit: blog post, as requested, about the setup of configuration strings and switching between environments: http://blog.paraleap.com/blog/post/Managing-environments-in-a-distributed-Azure-or-other-cloud-based-NET-solution
Sometimes I wish people would just answer the question... not explain ethics or best practices...
Microsoft has posted a code sample doing exactly this here: https://code.msdn.microsoft.com/windowsazure/CSAzureDeploymentSlot-1ce0e3b5
protected void Page_Load(object sender, EventArgs e)
{
// Your basic information about the deployment of the Azure application.
string deploymentId = RoleEnvironment.DeploymentId;
string subscriptionID = "<Your subscription ID>";
string thumbprint = "<Your certificate thumbprint>";
string hostedServiceName = "<Your hosted service name>";
string productionString = string.Format(
"https://management.core.windows.net/{0}/services/hostedservices/{1}/deploymentslots/{2}",
subscriptionID, hostedServiceName, "Production");
Uri requestUri = new Uri(productionString);
// Add client certificate.
X509Store store = new X509Store(StoreName.My, StoreLocation.LocalMachine);
store.Open(OpenFlags.OpenExistingOnly);
X509Certificate2Collection collection = store.Certificates.Find(
X509FindType.FindByThumbprint, thumbprint, false);
store.Close();
if (collection.Count != 0)
{
X509Certificate2 certificate = collection[0];
HttpWebRequest httpRequest = (HttpWebRequest)HttpWebRequest.Create(requestUri);
httpRequest.ClientCertificates.Add(certificate);
httpRequest.Headers.Add("x-ms-version", "2011-10-01");
httpRequest.KeepAlive = false;
HttpWebResponse httpResponse = httpRequest.GetResponse() as HttpWebResponse;
// Get response stream from Management API.
Stream stream = httpResponse.GetResponseStream();
string result = string.Empty;
using (StreamReader reader = new StreamReader(stream))
{
result = reader.ReadToEnd();
}
if (result == null || result.Trim() == string.Empty)
{
return;
}
XDocument document = XDocument.Parse(result);
string serverID = string.Empty;
var list = from item
in document.Descendants(XName.Get("PrivateID",
"http://schemas.microsoft.com/windowsazure"))
select item;
serverID = list.First().Value;
Response.Write("Check Production: ");
Response.Write("DeploymentID : " + deploymentId
+ " ServerID :" + serverID);
if (deploymentId.Equals(serverID))
lbStatus.Text = "Production";
else
{
// If the application not in Production slot, try to check Staging slot.
string stagingString = string.Format(
"https://management.core.windows.net/{0}/services/hostedservices/{1}/deploymentslots/{2}",
subscriptionID, hostedServiceName, "Staging");
Uri stagingUri = new Uri(stagingString);
httpRequest = (HttpWebRequest)HttpWebRequest.Create(stagingUri);
httpRequest.ClientCertificates.Add(certificate);
httpRequest.Headers.Add("x-ms-version", "2011-10-01");
httpRequest.KeepAlive = false;
httpResponse = httpRequest.GetResponse() as HttpWebResponse;
stream = httpResponse.GetResponseStream();
result = string.Empty;
using (StreamReader reader = new StreamReader(stream))
{
result = reader.ReadToEnd();
}
if (result == null || result.Trim() == string.Empty)
{
return;
}
document = XDocument.Parse(result);
serverID = string.Empty;
list = from item
in document.Descendants(XName.Get("PrivateID",
"http://schemas.microsoft.com/windowsazure"))
select item;
serverID = list.First().Value;
Response.Write(" Check Staging:");
Response.Write(" DeploymentID : " + deploymentId
+ " ServerID :" + serverID);
if (deploymentId.Equals(serverID))
{
lbStatus.Text = "Staging";
}
else
{
lbStatus.Text = "Do not find this id";
}
}
httpResponse.Close();
stream.Close();
}
}
Staging is a temporary deployment slot used mainly for no-downtime upgrades and ability to roll back an upgrade.
It is advised not to couple your system (either in code or in config) with such Azure specifics.
Since the release of the Windows Azure Management Libraries, and thanks to @GuaravMantri's answer to another question, you can do it like this:
using System;
using System.Linq;
using System.Security.Cryptography.X509Certificates;
using Microsoft.Azure;
using Microsoft.WindowsAzure.Management.Compute;
using Microsoft.WindowsAzure.Management.Compute.Models;
namespace Configuration
{
public class DeploymentSlotTypeHelper
{
static string subscriptionId = "<subscription-id>";
static string managementCertContents = "<Base64 Encoded Management Certificate String from Publish Setting File>";// copy-paste it
static string cloudServiceName = "<your cloud service name>"; // lowercase
static string ns = "http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration";
public DeploymentSlot GetSlotType()
{
var managementCertificate = new X509Certificate2(Convert.FromBase64String(managementCertContents));
var credentials = new CertificateCloudCredentials(subscriptionId, managementCertificate);
var computeManagementClient = new ComputeManagementClient(credentials);
var response = computeManagementClient.HostedServices.GetDetailed(cloudServiceName);
return response.Deployments.FirstOrDefault(d => d.DeploymentSlot == DeploymentSlot.Production) == null ? DeploymentSlot.Staging : DeploymentSlot.Production;
}
}
}
An easy way to solve this problem is to set a key on your instances that identifies which environment they are running in.
1) On your production slot:
Set it under Settings >> Application settings >> App settings
and create a key named SLOT_NAME with the value "production". IMPORTANT: check "Slot setting".
2) On your staging slot:
Set it under Settings >> Application settings >> App settings
and create a key named SLOT_NAME with the value "staging". IMPORTANT: check "Slot setting".
From your application, read the variable to identify which environment it is running in. In Java you can access it like this:
String slotName = System.getenv("APPSETTING_SLOT_NAME");
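In a .NET app, the same value can be read from the environment (App Service on Windows surfaces app settings as environment variables with the APPSETTING_ prefix, as the Java example shows); a minimal sketch:
//Returns "production", "staging", or null if the setting is absent
string slotName = Environment.GetEnvironmentVariable("APPSETTING_SLOT_NAME");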
Here are 4 points to consider:
VIP swap only makes sense when your service faces the outside world, i.e. when it exposes an API and reacts to requests.
If all your service does is pull messages from a queue and process them, then your service is proactive and VIP swap is not a good solution for you.
If your service is both reactive and proactive, you may want to reconsider your design. Perhaps split it into two different services.
Eric's suggestion of modifying the .cscfg files pre- and post-VIP swap is good if the proactive part of your service can take a short downtime (you first configure both Staging and Production to not pull messages, then perform the VIP swap, and then update Production's configuration to start pulling messages again); see the sketch below.
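A minimal sketch of the idea in point 4 for a worker role, assuming a hypothetical .cscfg setting named "ProcessingEnabled" that is flipped off on both slots before the swap and back on for Production afterwards:
using System;
using System.Threading;
using Microsoft.WindowsAzure.ServiceRuntime;

public class WorkerRole : RoleEntryPoint
{
    public override void Run()
    {
        while (true)
        {
            //Hypothetical setting; it can be changed on a live deployment without redeploying
            bool enabled = bool.Parse(RoleEnvironment.GetConfigurationSettingValue("ProcessingEnabled"));
            if (enabled)
            {
                //pull messages from the queue and process them here
            }
            Thread.Sleep(TimeSpan.FromSeconds(10));
        }
    }
}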