Azure Cache Preview

I tried the new Azure Cache Preview that came with the new SDK on my machine.
I set up a worker role with the Cache Preview and configured it as a co-located role with a 30% cache size.
On my controller I put this code:
[OutputCache(Duration = int.MaxValue, VaryByParam = "none")]
public ActionResult Index()
{
    ViewBag.Message = "Welcome to ASP.NET MVC!";
    ViewBag.Id = Microsoft.WindowsAzure.ServiceRuntime.RoleEnvironment.CurrentRoleInstance.Id;
    return View();
}
I then ran the roles in the emulator with 4 instances. The result was that I saw a different id every time, which means the output cache never worked across the 4 instances (to be clear, I configured the output cache to use the Cache Preview).
Only when I added an extra cache worker role as a dedicated role did everything start to work as it should.
My questions are:
Do I need the extra worker role to actually make the Cache Preview work correctly? That would mean the trade-off for not using Azure AppFabric Cache is paying for an extra machine.
Or did I do something wrong, and should it work with the web roles as co-located roles?
Thanks
Edit:
This is another section of my web.config:
<dataCacheClients>
  <tracing sinkType="DiagnosticSink" traceLevel="Error" />
  <dataCacheClient name="default">
    <autoDiscover isEnabled="true" identifier="NugetTest" />
    <!--<localCache isEnabled="true" sync="TimeoutBased" objectCount="100000" ttlValue="300" />-->
  </dataCacheClient>
</dataCacheClients>
If my identifier is NugetTest (which is my web role, of which I have 4 instances), every time I switch machines I get a different cache. If I change the identifier to my worker role, I get the expected result.
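To sanity-check whether the co-located cache is actually shared, a minimal C# sketch along these lines could help (assuming the In-Role Cache client assemblies and the "default" dataCacheClient above; the class name, method name, and key are made up for illustration):

using System;
using Microsoft.ApplicationServer.Caching;           // In-Role / AppFabric cache client
using Microsoft.WindowsAzure.ServiceRuntime;

public static class CacheSharingCheck
{
    public static string WhoCachedThis()
    {
        // Uses the "default" dataCacheClient section (autoDiscover + identifier) from web.config.
        var factory = new DataCacheFactory();
        DataCache cache = factory.GetDefaultCache();

        // The first instance to run this wins; every other instance should then read
        // the same value if the cache really is shared across the co-located instances.
        var owner = (string)cache.Get("cached-by");    // "cached-by" is an arbitrary demo key
        if (owner == null)
        {
            owner = RoleEnvironment.CurrentRoleInstance.Id;
            cache.Put("cached-by", owner, TimeSpan.FromMinutes(5));
        }
        return owner;
    }
}

If every instance returns the same id from this helper but the OutputCache still varies per instance, the problem is in the output cache provider configuration rather than the cache cluster itself.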

Can you add the applicationName attribute in the provider configuration in the web.config of your app? If this is not added, instances will not share the cache. Please note the applicationName attribute below.
This should be added to the web.config of the web role in both the dedicated and co-located cache scenarios.
Please reply if this solves your issue.
<caching>
  <outputCache defaultProvider="DistributedCache">
    <providers>
      <add name="DistributedCache"
           type="Microsoft.Web.DistributedCache.DistributedCacheOutputCacheProvider, Microsoft.Web.DistributedCache"
           cacheName="<cacheName>"
           applicationName="<anyName>"
           dataCacheClientName="<dataCacheClientName>" />
    </providers>
  </outputCache>
</caching>

I'm unable to reproduce this issue. I always see the same instance, and I'm using Ctrl+F5 in the browser (thus ruling out the browser cache). Please make sure you've configured the output cache provider as described at http://www.windowsazure.com/en-us/develop/net/how-to-guides/cache/.
<!-- If output cache content needs to be saved in a Windows Azure
cache, add the following to web.config inside system.web. -->
<caching>
<outputCache defaultProvider="DistributedCache">
<providers>
<add name="DistributedCache"
type="Microsoft.Web.DistributedCache.DistributedCacheOutputCacheProvider, Microsoft.Web.DistributedCache"
cacheName="default"
dataCacheClientName="default" />
</providers>
</outputCache>
</caching>
Best Regards,
Ming Xu.

Related

IIS 10 application pool slow initialization times after recycle

We have an application pool that has a slower initialization time after an app pool recycle in IIS 10 (around 5-7 seconds after a recycle, then 30-50 ms after the first request).
I have done some research and found that the "Application Initialization" module should do the trick.
I installed it onto the server, set the application pool to "AlwaysRunning", and set the corresponding site to "PreloadEnabled == True". After making those changes we tested by recycling, and the response times seemed a bit better: down to 3-4 seconds after a recycle. I then tried disabling the overlapped recycle to see if that helped, and again it did a bit better: 1.5-2 seconds after a recycle and then 20-30 ms after the first request.
My question is: is that the best we can expect? I was hoping there would be a way to fully pre-warm the app pool so that even the first request takes only a few ms. The issue is that the test messages we are sending to the API are small and the ones in Prod would be much larger, so an initialization of 3-4 seconds could be much, much longer in Prod.
Following are the steps you can perform to auto-initialize an application hosted on IIS:
• Install the Application Initialization feature (IIS 8.0 Application Initialization):
https://learn.microsoft.com/en-us/iis/get-started/whats-new-in-iis-8/iis-80-application-initialization
• Make sure warmup.dll (which should load from C:\Windows\SysWOW64\inetsrv\warmup.dll or from C:\Windows\system32\inetsrv\warmup.dll, depending on the bitness of your process) is present.
• Configure the app pool to be always running (from the advanced properties).
E.g. in the applicationHost.config file (%WINDIR%\system32\inetsrv\config\applicationHost.config), the application pool setting looks like this:
<add name="PreLoadApp" autoStart="true" managedRuntimeVersion="" startMode="AlwaysRunning">
<processModel idleTimeout="00:00:00" />
</add>
• Scroll down a little further in applicationHost.config to the sites section. Within that section there will be an entry for your application; modify it as below:
<site name="PreLoadApp" id="5">
  <application path="/" applicationPool="PreLoadApp" preloadEnabled="true">
    <virtualDirectory path="/" physicalPath="C:\inetpub\wwwroot\PreLoadApp" />
  </application>
</site>
• Then select the site in the IIS Manager tree view on the left-hand side and go to the Configuration Editor, this time under the <system.webServer/applicationInitialization> section, and look at the list of requests. Set a request to target the page you want warmed up (the host header is optional), and you can also provide a query string, e.g. q=abhi, to identify that a request is coming from the preload only (a server-side sketch of that check follows after these steps). Set the doAppInitAfterRestart parameter to true and apply the settings.
And you should be good by now: try recycling your application pool, and it should initialize and warm up automatically.
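Here is a hedged server-side sketch of that q=abhi check for a classic ASP.NET MVC app (adapt to your framework); the /warmup route, controller name, and the q=abhi convention are illustrative choices, not anything IIS requires:

using System.Diagnostics;
using System.Web.Mvc;

public class WarmupController : Controller
{
    // Point initializationPage at /warmup?q=abhi in <applicationInitialization>.
    public ActionResult Index(string q)
    {
        if (q == "abhi")
        {
            // The hit came from IIS Application Initialization rather than a real user:
            // touch whatever is expensive here (EF model, caches, JIT-heavy code paths).
            Trace.TraceInformation("Warm-up request received at {0:u}", System.DateTime.UtcNow);
        }
        return new HttpStatusCodeResult(200);
    }
}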
You can refer to these MS docs to learn more about Application Initialization and the configuration steps:
https://learn.microsoft.com/en-us/iis/configuration/system.webServer/applicationInitialization/
https://learn.microsoft.com/en-us/iis/get-started/whats-new-in-iis-8/iis-80-application-initialization
IIS saves doAppInitAfterRestart in the web.config of the application, which might be overwritten by future deployments. Therefore I'd put the web.config under source control and make it part of the deployed artifact.
<?xml version="1.0" encoding="utf-8"?>
<configuration>
<system.webServer>
<handlers>
<add name="aspNetCore" path="*" verb="*" modules="AspNetCoreModuleV2" resourceType="Unspecified" />
</handlers>
<aspNetCore processPath="%LAUNCHER_PATH%" arguments="%LAUNCHER_ARGS%" stdoutLogEnabled="false" stdoutLogFile=".\logs\stdout" hostingModel="inprocess" />
<!-- Add this node to web.config -->
<applicationInitialization doAppInitAfterRestart="true">
<!-- Only needed when website contains multiple child apps -->
<add initializationPage="/hangfire" hostName="" />
</applicationInitialization>
</system.webServer>
</configuration>

application Initialization webapi always requesting default.aspx

I've been trying to configure IIS to request a custom URL to warm up my web API.
My config is like this:
<applicationInitialization doAppInitAfterRestart="true" skipManagedModules="false">
<add initializationPage="/api/transaction/5" />
</applicationInitialization>
This is working, but IIS also calls the root web app (/default.aspx) to warm it up as well, and I'm wondering how to remove this call as I don't need it.
Thanks for your help!
Patrick
It looks like this config section works with a collection of initialization elements. Try clearing this collection before adding your page:
<applicationInitialization doAppInitAfterRestart="true" skipManagedModules="false">
<clear/>
<add initializationPage="/api/transaction/5" />
</applicationInitialization>

outputCacheProfiles with Redis cache

I have an MVC web project and I am trying to set up OutputCache for some of the pages using a Redis cache running locally, eventually to be hosted in Azure.
I have decorated my ActionResult with
[OutputCache(CacheProfile = "cacheprofile1")]
and have the following in my web.config under system.web / caching:
<outputCacheSettings>
<outputCacheProfiles>
<add name="cacheprofile1" duration="1800" varyByParam="none"/>
</outputCacheProfiles>
</outputCacheSettings>
My cache provider is set accordingly:
<outputCache defaultProvider="localRedisOutputCache">
<providers>
<add name="localRedisOutputCache" type="Microsoft.Web.Redis.RedisOutputCacheProvider" host="127.0.0.1" accessKey="" ssl="false" />
</providers>
</outputCache>
No entries are being made into my cache. If I change my ActionResult decoration to
[OutputCache(Duration=1800)]
it works, but I'd rather not have to set this manually against each method.
Any ideas on why the cache profile is being ignored, and how to resolve it, would be appreciated.

Cache Host config in Azure In-Role caching

According to this MSDN article (on AppFabric Caching, which is what the Azure cache runs on), I should be able to find a DistributedCacheService.exe.config file located at \Windows\System32\AppFabric, but it doesn't exist on any of the instances.
When remoting into one of the instances and searching for configs, I find several cache-related config files in E:\plugins\Caching.
The CacheService.config.exe file looks very promising (similar to DistributedCacheService.exe.config), except that the dataCacheConfig is not initialized:
<dataCacheConfig cacheHostName="">
<!-- Comment/uncomment below line to disable/enable file sink logging.
Also location attribute is not honored. It is just specified since its mandatory. -->
<!--<log logLevel="3" location="" />-->
<clusterConfig connectionString="" />
</dataCacheConfig>
I need to confirm that certain data cache settings are being configured properly on the server side in order to solve a previous post of mine.
My client-side web.config looks something like this:
<dataCacheClients>
<dataCacheClient name="DataCache1">
<autoDiscover isEnabled="true" identifier="MyRoleName" />
<transportProperties maxBufferPoolSize="6400000" maxBufferSize="256" />
</dataCacheClient>
<dataCacheClient name="DataCache2">
<autoDiscover isEnabled="true" identifier="MyRoleName" />
<transportProperties maxBufferPoolSize="0" maxBufferSize="10485760" />
</dataCacheClient>
<dataCacheClient name="DataCache3">
<autoDiscover isEnabled="true" identifier="MyRoleName" />
<transportProperties maxBufferPoolSize="3276800" maxBufferSize="32768" />
</dataCacheClient>
</dataCacheClients>
Where do I find the cache host configuration file in Azure In-Role caching (colocated)?
The host settings that you configure in on-premises AppFabric Cache are initialized dynamically in In-Role Cache. You can check Caching.csplugin at Program Files\Microsoft SDKs\Windows Azure\.NET SDK\v2.2\bin\plugins\Caching to see the endpoints for the cache server.
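If you mainly want to see where the cache hosts ended up at runtime rather than hunting for a static config file, a small sketch using the service runtime API might be enough; "MyRoleName" below is the role hosting the co-located cache (the same name as the autoDiscover identifier above), and the class and method names are made up:

using System;
using Microsoft.WindowsAzure.ServiceRuntime;

public static class CacheEndpointDump
{
    public static void Dump()
    {
        // Enumerate every instance of the role hosting the co-located cache and
        // print the internal endpoints it exposes (the cache ports are assigned dynamically).
        foreach (RoleInstance instance in RoleEnvironment.Roles["MyRoleName"].Instances)
        {
            foreach (var endpoint in instance.InstanceEndpoints)
            {
                Console.WriteLine("{0} {1} -> {2}:{3}",
                    instance.Id, endpoint.Key,
                    endpoint.Value.IPEndpoint.Address, endpoint.Value.IPEndpoint.Port);
            }
        }
    }
}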

MaxBufferPoolSize and MaxBufferSize?

This is my first post to Stack Overflow, so please accept my apologies for any non-conformity in it.
Question
I have developed a Windows Azure based site (similar to eBay) and hosted it on the Azure platform. I have deployed multiple instances of the web role with Azure caching enabled. Until last week everything was going fine, but suddenly the product search page started freezing while loading data from the db. It hangs only for specific categories which return a huge amount of data.
I read somewhere that we should enable localCache and transportProperties if we are expecting large messages. Hence I modified the dataCacheClient items in my web.config as below, but no luck. The page still hangs for those categories!
Could somebody please tell me what is wrong in the following and give me some pointers?
<dataCacheClient name="default" channelOpenTimeout="20000" maxConnectionsToServer="4" requestTimeout="30000">
<localCache isEnabled="true" sync="TimeoutBased" ttlValue="300" objectCount="10000"/>
<clientNotification pollInterval="300" maxQueueLength="10000"/>
<transportProperties connectionBufferSize="64000" maxBufferPoolSize="5242880"
maxBufferSize="1242880" maxOutputDelay="2" channelInitializationTimeout="60000"
receiveTimeout="600000"/>
<hosts>
<host name="<<AZURE CACHE URL>>" cachePort="22233" />
</hosts>
<securityProperties mode="Message">
<messageSecurity
authorizationInfo="<<KEY>>">
</messageSecurity>
</securityProperties>
</dataCacheClient>
<dataCacheClient name="SslEndpoint" channelOpenTimeout="20000" maxConnectionsToServer="4" requestTimeout="30000">
<localCache isEnabled="true" sync="TimeoutBased" ttlValue="300" objectCount="10000"/>
<clientNotification pollInterval="300" maxQueueLength="10000"/>
<transportProperties connectionBufferSize="64000" maxBufferPoolSize="15242880"
maxBufferSize="5242880" maxOutputDelay="2" channelInitializationTimeout="60000"
receiveTimeout="600000"/>
<hosts>
<host name="<<AZURE CACHE URL>>" cachePort="22243" />
</hosts>
<securityProperties mode="Message" sslEnabled="true">
<messageSecurity
authorizationInfo="<<KEY>>">
</messageSecurity>
</securityProperties>
</dataCacheClient>
My dev env:
Azure SDK 1.8 (Oct 12), SQL Server 2008 R2, ASP.NET MVC 3
UPDATE
Today I deployed a build with customErrors off to see if it throws any exception, and this is what I got.
Thanks in advance
ND
I would advise first finding out which component is truly causing your intermittent slowdowns. Is it the cache or is it SQL Azure?
If it is indeed the cache, and since you're using Azure Shared Cache (previously known as Azure AppFabric Cache), I would suggest looking at a dedicated cache as a solution instead of the shared cache. Performance of the Shared Cache can sometimes be unpredictable, since it is a multi-tenant service and data travels over a network.
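A rough way to tell the two apart is to time each dependency in isolation right around the calls you already make; a sketch along these lines (the cache key, connection string, and query are placeholders):

using System.Data.SqlClient;
using System.Diagnostics;
using Microsoft.ApplicationServer.Caching;

public static class LatencyProbe
{
    public static void Measure(DataCache cache, string connectionString)
    {
        // Time a single cache read.
        var sw = Stopwatch.StartNew();
        cache.Get("some-known-key");                               // placeholder key
        Trace.TraceInformation("Cache GET took {0} ms", sw.ElapsedMilliseconds);

        // Time a single SQL round trip with a trivial query.
        sw.Restart();
        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand("SELECT 1", conn))         // placeholder query
        {
            conn.Open();
            cmd.ExecuteScalar();
        }
        Trace.TraceInformation("SQL round trip took {0} ms", sw.ElapsedMilliseconds);
    }
}

If the cache call is the one that blows up for the large categories, the maxBufferSize/localCache tuning is the right place to keep digging; if it is the SQL call, no amount of cache client tuning will help.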
